diff --git a/Algorithm-Writer-with-Lists/.DS_Store b/Algorithm-Writer-with-Lists/.DS_Store new file mode 100644 index 0000000000000000000000000000000000000000..fa86ddcb4b7354ec285ae550a09bc16e08f050fb Binary files /dev/null and b/Algorithm-Writer-with-Lists/.DS_Store differ diff --git a/Algorithm-Writer-with-Lists/LICENSE b/Algorithm-Writer-with-Lists/LICENSE new file mode 100644 index 0000000000000000000000000000000000000000..3a27c90f1c2986ac0fb73b4ae956c6205343dfdd --- /dev/null +++ b/Algorithm-Writer-with-Lists/LICENSE @@ -0,0 +1,29 @@ +BSD 3-Clause License + +Copyright (c) 2020, Lucian Green +All rights reserved. + +Redistribution and use in source and binary forms, with or without +modification, are permitted provided that the following conditions are met: + +1. Redistributions of source code must retain the above copyright notice, this + list of conditions and the following disclaimer. + +2. Redistributions in binary form must reproduce the above copyright notice, + this list of conditions and the following disclaimer in the documentation + and/or other materials provided with the distribution. + +3. Neither the name of the copyright holder nor the names of its + contributors may be used to endorse or promote products derived from + this software without specific prior written permission. + +THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" +AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE +IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE +DISCLAIMED. 
IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE +FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL +DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR +SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER +CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, +OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE +OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. diff --git a/Algorithm-Writer-with-Lists/README.md b/Algorithm-Writer-with-Lists/README.md new file mode 100644 index 0000000000000000000000000000000000000000..cf5b7813d0f2a9b32ff34f83349f8fe901bb1891 --- /dev/null +++ b/Algorithm-Writer-with-Lists/README.md @@ -0,0 +1,195 @@ +Essay Helper | Text-to-Breasonings (Helps Earn High Distinctions) | Grammar Logic (Helps Mind Map Details) + +# Algorithm Writer with Lists + +Generates random algorithms in List Prolog, which can be converted to a simple form similar to Prolog. + +* algwriter-lists-mr.pl - generates algorithms with combinations of other algorithms with vague mind reading +* algwriter-lists-random.pl - generates algorithms with combinations of other algorithms at random +* grammar_logic_to_alg.pl - takes text, randomly generates details and appropriate algorithms for A (agreement), B (disagreement) and solution to B based on approximate parts of speech and the Text to Breasonings dictionary. +* grammar_logic_to_alg_random.pl - like the previous algorithm but randomly selects up to 10 sentences from the file to process. +* random_dependencies.pl - generates random (meaningless) algorithm specifications. +* combophil.pl - finds combinations of lines to write philosophies about. +* combophil_grammar_logic_to_alg.pl - finds all combinations in lines to write philosophies and does what grammar_logic_to_alg.pl (above) does. +* four_crosses.pl - finds 4 crosses to transition to PhD refinement. 
+* combophil2qadb.pl - generates algorithms from the dictionary. +* mr_alg.pl and ya_phil_to_alg.pl - Yet Another Philosophy to Algorithm and Mind Read Algorithm (randomly produce five-line algorithms with type checking, possibly about Lucian Academy philosophies). + +Please contact Lucian Green at luciangreen@lucianacademy.com with questions, comments and feedback about Algorithm Writer with Lists. + +# Getting Started + +Please read the following instructions on how to install the project on your computer for generating algorithms. + +# Prerequisites + +* Please download and install SWI-Prolog for your machine at `https://www.swi-prolog.org/build/`. + +# 1. Install manually + +Download this repository, the List Prolog Interpreter Repository and the Text to Breasonings Repository. + +# 2. Or Install from List Prolog Package Manager (LPPM) + +* Download the LPPM Repository: + +``` +mkdir GitHub +cd GitHub/ +git clone https://github.com/luciangreen/List-Prolog-Package-Manager.git +cd List-Prolog-Package-Manager +swipl +['lppm']. +lppm_install("luciangreen","Algorithm-Writer-with-Lists"). +halt. +``` + +# Caution: + +Follow the instructions in Instructions for Using texttobr(2) when using texttobr, texttobr2 or mind reader to avoid medical problems. + +# Running Algorithm Writer with Lists with Mind Reader + +* In Shell: +`cd Algorithm-Writer-with-Lists` +`swipl` +`['algwriter-lists-mr.pl'].` + +* Enter `algwriter(Algorithm),writeln1(Algorithm).` + +# Installing and Running Algorithm Writer with Lists at Random + +* Download this repository. +* In SWI-Prolog, in the `"Algorithm-Writer-with-Lists"` folder, enter: +``` +['algwriter-lists-random.pl']. +``` +* Enter `algwriter(Algorithm),writeln1(Algorithm).` + +# Installing and Running Grammar and Logic to Algorithm + +* Download this repository. +* In SWI-Prolog, in the `"Algorithm-Writer-with-Lists"` folder, enter: +``` +['grammar_logic_to_alg.pl']. 
``` +* Enter the sentences and lines to randomly process (to give details and algorithms to) into `"file.txt"` in the `"Text-to-Breasonings"` folder. Depending on whether you run with (1) or (2) below, the output or the `gla.txt` file will contain the sentences and algorithms, with instructions on how to run them. + +* Enter `grammar_logic_to_alg1.` (1) or paste the contents of `grammar_logic_to_alg_sh.txt` into the Terminal window on Mac (2). + +# Installing and Running Grammar and Logic to Algorithm at Random + +* Download this repository. +* In SWI-Prolog, in the `"Algorithm-Writer-with-Lists"` folder, enter: +``` +['grammar_logic_to_alg_random.pl']. +``` +* As stated above, rather than processing the whole file, this algorithm processes up to 10 random sentences. +* Enter many sentences and lines to randomly process (to give details and algorithms to) into `"file.txt"` in the `"Algorithm-Writer-with-Lists"` folder. Depending on whether you run with (1) or (2) below, the output or the `gla.txt` file will contain the sentences and algorithms, with instructions on how to run them. + +* Enter `grammar_logic_to_alg1.` (1) or paste the contents of `grammar_logic_to_alg_sh.txt` into the Terminal window on Mac (2). + +# Installing and Running Random Dependencies + +* Download this repository. +* In SWI-Prolog, in the `"Algorithm-Writer-with-Lists"` folder, enter: +``` +['random_dependencies.pl']. +``` +* As stated above, the value of this algorithm's input is found by mentally inserting not and transformations. 
+ +* Enter `random_dependencies(A),writeln1(A).` + +``` +[substring,[[duplicates,[[]]],[findall,[[add_number_to_list,[[split_into_sentences,[[sort,[[list_head,[[reverse,[[findall,[[string_to_list,[[or,[[and,[[delete,[[map,[[]]]]],[]]]]]]]]]]]]],[map,[[length,[[maximum,[[length,[[reverse,[[split_into_sentences,[[member,[[map,[[]]]]]]],[]]]]]]],[delete,[[get_item_n,[[],[and,[[]]]]],[reverse,[[]]]]]]],[]]]]]]]]]]]]] +``` + +# Installing and Running Combinations of Lines of Philosophy + +* Download this repository. +* In SWI-Prolog, in the `"Algorithm-Writer-with-Lists"` folder, enter: +``` +['combophil.pl']. +``` +* Finds combinations of lines to write philosophies about, e.g. combinations of 2 lines: + +* Enter `combophil(2).` + +# Installing and Running Combinations of Lines of Philosophy with Grammar and Logic to Algorithm + +* Download this repository. +* In SWI-Prolog, in the `"Algorithm-Writer-with-Lists"` folder, enter: +``` +['combophil_grammar_logic_to_alg.pl']. +``` +* Enter the sentences and lines to intermix (warning: takes all combinations of verbs and non-verbs in the file) and randomly process (to give details and algorithms to) into `"file.txt"` in the `"Text-to-Breasonings"` folder. The output will contain the sentences and algorithms, with instructions on how to run them. + +* Enter `combophil_grammar_logic_to_alg1.` + +# Installing and Running Four Crosses +* Download this repository. +* In SWI-Prolog, in the `"Algorithm-Writer-with-Lists"` folder, enter: +``` +['four_crosses.pl']. +``` +* Enter four levels of crosses of ideas to make a PhD-level connection. + +* Enter `four_crosses.` + +# grammar_logic_to_alg2.pl +* Converts files from e.g. grammar_logic_to_alg.pl to simplified algorithms with more specific command names. Breason out file.txt using Text-to-Breasonings, run grammar_logic_to_alg.pl, run +grammar_logic_to_alg2.pl, then run the Text-to-Breasonings Reading Algorithm on the grammar_logic_to_alg2.pl output. 
+* Download this repository. +* In SWI-Prolog, in the `"Algorithm-Writer-with-Lists"` folder, enter: +``` +['grammar_logic_to_alg2.pl']. +``` +* Enter the number of words in file.txt to replace with computational language (it learns as it goes). + +* Enter `grammar_logic_to_alg2(200).` + +# combophil2qadb.pl +* Generates algorithms in qadb format from the word dictionary. +* Download this repository. +* In SWI-Prolog, in the `"Algorithm-Writer-with-Lists"` folder, enter: +``` +['combophil2qadb.pl']. +``` +* Enter `.`- or `\n`-delimited algorithm ideas in `Text-to-Breasonings/file.txt` and enter: +`combophil2qadb.` +* Produces algorithms in `file-o.txt`. + +# Installing and Running Yet Another Philosophy to Algorithm and Mind Read Algorithm + +* Download this repository. +* In SWI-Prolog, in the `"Algorithm-Writer-with-Lists"` folder, enter: +``` +['ya_phil_to_alg.pl']. +``` +* As stated above, Yet Another Philosophy to Algorithm randomly produces five-line algorithms with type checking about all sentences in `Lucian-Academy/`. + +* Enter `ya_phil_to_alg.` + +* Produces the file `yet_more_phil_algs.txt`, containing, for example: +``` +...n,length,list,number,n,round,number,number,n,sqrt,number,number,n,sqrt,number,number,n,round,number,number,n,sqrt,number,number... +``` + +* To produce one five-line algorithm, enter `mr_alg(Commands).`, producing: + +* `Commands = [[[n,member],[[list],[_3386]]],[[n,member],[[list],[list]]],[[n,intersection],[[list],[list]]],[[n,member],[[list],[_5928]]],[[n,atom_string],[[atom],[string]]],[[n,string_length],[[string],[number]]]]`, + +* where this algorithm means, "Take a member (the second item, the first output of member) of a list, take a member list of this list, etc.". + +* Note: member(List,Item) is now member(Item,List) in List Prolog, which is different from the above. 
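As a worked illustration of the command-list format shown above, a small helper like the following (a hypothetical sketch, not part of this repository; it assumes each command has the shape `[[n,Name],[Inputs,Outputs]]`, as in the `mr_alg` output) prints each generated command on its own line:

```prolog
% Hypothetical helper: print a generated command list readably.
% Assumes each element has the form [[n,Name],[Inputs,Outputs]].
describe_commands([]).
describe_commands([[[n,Name],[Inputs,Outputs]]|Rest]) :-
    format("~w: inputs ~w, outputs ~w~n",[Name,Inputs,Outputs]),
    describe_commands(Rest).
```

For example, `describe_commands([[[n,member],[[list],[Out]]]]).` would print one line naming `member` with its input type `[list]` and its (unbound) output.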
+ +# Authors + +Lucian Green - Initial programmer - Lucian Academy + +# License + +This project is licensed under the BSD 3-Clause License - see the LICENSE file for details. + diff --git a/Algorithm-Writer-with-Lists/algwriter-lists-mr.pl b/Algorithm-Writer-with-Lists/algwriter-lists-mr.pl new file mode 100644 index 0000000000000000000000000000000000000000..3110cf98861ef5ed281ee48e28e1eca5a7b6c431 --- /dev/null +++ b/Algorithm-Writer-with-Lists/algwriter-lists-mr.pl @@ -0,0 +1,283 @@ +%% algwriter(A),write1(A). +%% write data,fns in exec format + +:-include('../Text-to-Breasonings/text_to_breasonings.pl'). + +multiply(A,B,C) :- +C is A*B. + +subtract(A,B,C) :- +C is A-B. + +head(A,B) :- A=[B|_]. + +tail(A,B) :- A=[_|B]. + +wrap(A,B) :- B=[A]. + +algwriter(Na) :- +notrace, + phrase_from_file_s(string(BrDict0), "../Text-to-Breasonings/brdict1.txt"), + %%SepandPad="&#@~%`$?-+*^,()|.:;=_/[]<>{}\n\r\s\t\\\"!'0123456789", + splitfurther(BrDict0,BrDict01), + %%trace, + sort(BrDict01,BrDict02), +makerandomlist(BrDict02,5,[],R0),makerandomlist(R0,3,[],R1),makerandomlist(R0,3,[],R2),wrap(R1,Nb1),wrap(R2,Nb2),append(Nb1,Nb2,Nb3),trialy2A([3,4,5,6,7,8],NFns),randomfns(NFns,Nb3,Na),!. +makerandomlist(_,0,A,A). +makerandomlist(List,A,C1,C) :- +not((=(A,0))),trialy2A(List,N1),%%random(R),multiply(R,L,R1),ceiling(R1,N1), +wrap(N1,N2),append(C1,N2,Nb3),subtract(A,1,D),makerandomlist(List,D,Nb3,C). +randomfns(1,B,Nb3):-randomlist(B,Na1),randomlist(B,Na2),randomfn(Na1,Na2,Nb),wrap(Nb,Nb2),append(B,Nb2,Nb3).%%write1("."). +randomfns(A,B,C) :- not((=(A,1))),randomlist(B,Na1),randomlist(B,Na2),randomfn(Na1,Na2,Nb),wrap(Nb,Nb2),append(B,Nb2,Nb3),%%tail(B,T), + subtract(A,1,D),write(','), + randomfns(D,Nb3,C). +randomlist(B,Na) :- %%List1=[1,2,3,4,5,6,7,9,10], +length(B,L),length(C,L),append(C,_,B),trialy2A(C,Na).%%,random(R),length(B,Bl),multiply(R,Bl,N),ceiling(N,N1),getitemn(N1,B,Na). +getitemn(0,_A,[]). +getitemn(1,B,C) :- head(B,C). 
+getitemn(A,B,C) :- not((=(A,1))),tail(B,T),subtract(A,1,D),getitemn(D,T,C). +randomfn(A1,A2,B) :- repeat,trialy2A([1,3,4,5,7],N1),%%random(R),multiply(R,8,N),ceiling(N,N1), +fna(N1,A1,A2,B). +fna(1,A1,_A2,B) :- reverse(A1,[],B),write1([[n,reverse],[A1,[],B]]). +%%fna(2,A1,_A2,B) :- sort0(A1,B),write1(sort0(A1,B)). +fna(3,A1,A2,B) :- append1(A1,A2,B),write1([[n,append1],[A1,A2,B]]). +fna(4,A1,A2,B) :- minus1(A1,A2,B),write1([[n,minus1],[A1,A2,B]]). +fna(5,A1,A2,B) :- intersection1(A1,A2,[],B),write1([[n,intersection1],[A1,A2,[],B]]). +%%fna(6,A1,A2,A2) :- mutually_exclusive(A1,A2),write1([[n,mutually_exclusive],[A1,A2]]). +fna(7,A1,A2,B) :- duplicates(A1,A2,[],B),write1([[n,duplicates],[A1,A2,[],B]]). +%%fna(8,A1,A2,A1) :- substring(A1,A2),write1([[n,substring],[A1,A2]]). +reverse([],L,L). +reverse(L,M,N) :- head(L,H),tail(L,T),wrap(H,H1),append(H1,M,O),reverse(T,O,N). +sort0(L,N) :- sort1(L,[],N). +sort1([],L,L). +sort1(L,M1,N) :- not((=(L,[]))),head(L,H),tail(L,T),maximum(T,H,M2,[],R),wrap(M2,M3),append(M1,M3,M4),sort1(R,M4,N). +maximum([],L,L,R,R). +maximum(L,M1,N,R1,R2) :- not((=(L,[]))),head(L,H),tail(L,T),(>=(M1,H)->(=(M2,M1),wrap(H,H2),append(R1,H2,R3));(=(M2,H),wrap(M1,M12),append(R1,M12,R3))),maximum(T,M2,N,R3,R2). +map(_F,[],L,L). +map(F,L,M1,N) :- not((=(L,[]))),head(L,H),tail(L,T),functor(A,F,3),arg(1,A,M1),arg(2,A,H),arg(3,A,M2),A,map(F,T,M2,N). +findall(_F,[],L,L). +findall(F,L,M1,N) :- not((=(L,[]))),head(L,H),tail(L,T),functor(A,F,2),arg(1,A,H),arg(2,A,M2),(A->((wrap(M2,M3),append(M1,M3,M4)));(=(M1,M4))),findall(F,T,M4,N). +intersection1([],_A,L,L). +intersection1(L1,L2,L3a,L3) :- head(L1,I1),tail(L1,L4),intersection2(I1,L2,[],L5),append(L3a,L5,L6),intersection1(L4,L2,L6,L3). +intersection2(_A,[],L,L). +intersection2(I1,L1,L2,L3) :- head(L1,I1),tail(L1,L4),wrap(I1,I11),append(L2,I11,L5),intersection2(I1,L4,L5,L3). +intersection2(I1,L1,L2,L3) :- head(L1,I2),tail(L1,L4),not((=(I1,I2))),intersection2(I1,L4,L2,L3). +append1(B,C,A) :- append(B,C,A). 
+minus1(L,[],L). +minus1(L1,L2,L3) :- head(L2,I1),tail(L2,L5),delete2(L1,I1,[],L6),minus1(L6,L5,L3). +delete2([],_A,L,L). +delete2(L1,I1,L2,L3) :- head(L1,I1),tail(L1,L5),delete2(L5,I1,L2,L3). +delete2(L1,I1,L2,L3) :- head(L1,I2),tail(L1,L5),not((=(I1,I2))),wrap(I2,I21),append(L2,I21,L6),delete2(L5,I1,L6,L3). + +mutually_exclusive([],_L):-!. +mutually_exclusive(L,M):-head(L,H),tail(L,T),membera3(M,H),mutually_exclusive(T,M),!. +membera3([],_L):-!. +membera3(L,M):-head(L,H),tail(L,T),not((=(M,H))),membera3(T,M),!. + +duplicates([],_L,S,S). +duplicates(L,M,S1,S2):-head(L,H),tail(L,T),member(H,M),(deletea2(M,H,M1)->(true);(=(M,M1))),wrap(H,H1),append(S1,H1,S3),duplicates(T,M1,S3,S2),!. +duplicates(L,M,S1,S2):-head(L,H),tail(L,T),not((membera4(M,H))),duplicates(T,M,S1,S2). +deletea2([],_L,_M1):-fail. +deletea2(L,M,T):-head(L,H),tail(L,T),=(M,H). +deletea2(L,M,M1):-head(L,H),tail(L,T),not((=(M,H))),deletea2(T,M,M1). +membera4([],_L):-fail. +membera4(L,H):-head(L,H). +membera4(L,M):-head(L,H),tail(L,T),not(M=H),membera4(T,M). + +substring([],[]). +substring([],B):-not((=(B,[]))),fail. +substring(A,B):-tail(A,At),(listhead(A,B)->(true);(substring(At,B))). +listhead(_L,[]). +listhead(A,B):-head(A,Ah),tail(A,At),head(B,Ah),tail(B,Bt),listhead(At,Bt). + + +%%%%%%%%%%%% + +%%findbest(R,R) :-!. +findbest2(R,Item):- + sort(R,RA), + reverse(RA,RB), + RB=[[_,Item]|_Rest]. + +trialy2A([],R) :- + R=[[_,'A']]. +trialy2A(List,R) :- + notrace,trialy2B(List,R).%%,trace. 
+trialy2B(List,R) :- + length(List,Length), + ((Length=<9-> + findr4(R4), + number_string(R4,R4A), + formr5([R4A],9,Length,R5), + findr(R5,List,R)); + (Length=<99-> + findr4(R41), + findr4(R42), + formr5([R41,R42],99,Length,R5), + findr(R5,List,R)); + (Length=<999-> + findr4(R41), + findr4(R42), + findr4(R43), + formr5([R41,R42,R43],999,Length,R5), + findr(R5,List,R)); + (Length=<9999-> + findr4(R41), + findr4(R42), + findr4(R43), + findr4(R44), + formr5([R41,R42,R43,R44],9999,Length,R5), + findr(R5,List,R)); + (Length=<99999-> + findr4(R41), + findr4(R42), + findr4(R43), + findr4(R44), + findr4(R45), + formr5([R41,R42,R43,R44,R45],99999,Length,R5), + findr(R5,List,R)); + fail), + %%write1([r,R]),trace. + true. + +findr4(R4) :- + List1=[0,1,2,3,4,5,6,7,8,9], + Trials is 30, + trialy22(List1,Trials,[],R1), + findbest2(R1,R4). + %%number_string(R3,R2), +formr5(RList,Upper,Length,R5) :- + %%findall(D,(member(C,RList),floor(C,D)),RList2), + concat_list2A(RList,R5A), + number_string(R5B,R5A), + R51 is floor((R5B/Upper)*Length), + (R5B=Upper->R5 is R51-1;R5=R51). +findr(R4,List,R) :- + %%floor(R4,R4A), + length(A,R4), + append(A,[R|_],List). + + %%random_member(A,List), + %%R=[[_,A]]. + + /** + length(List,L), + Trials is L*3, + trialy22(List,Trials,[],R).**/ + +trialy22([],_,R,R) :- !. +trialy22(List,Trials,RA,RB) :- + List=[Item|Items], + trialy21(Item,Trials,R1), + append(RA,[R1],RC), + trialy22(Items,Trials,RC,RB),!. + +trialy21(Label,Trials,RA) :- + trialy3(Trials,[],R), + aggregate_all(count, member(true,R), Count), + RA=[Count,Label]. + +trialy3(0,R,R) :-!. +trialy3(Trials1,RA,RB) :- + trialy1(R1), + append(RA,[R1],RC), + Trials2 is Trials1-1, + trialy3(Trials2,RC,RB),!. + +%% try other nouns +trialy1(R1) :- + %%control11(A1), + %%repeat, + trial0(A22), %% Control + sum(A22,0,S22), + mean(S22,A1), + %%repeat, + trial0(A21), %% Test 1 + sum(A21,0,S02), + mean(S02,A2), + (A1>A2->R1=true;R1=fail). + +trial0(S3) :- N is 10, trial1(N,[],S),trial01(S,S3). 
+trial01(S1,S3) :- + sort(S1,S), + %%midpoint(S,MP), + halves(S,H1,H2), + midpoint(H1,Q1), + midpoint(H2,Q3), + IQR is Q3-Q1, + sum(S,0,S02), + mean(S02,Mean), + furthestfrommean(S,Mean,V), + D1 is 1.5*IQR, + D2 is V-Mean, + (D2>D1->(delete(S,V,S2),trial01(S2,S3));S=S3). + +%%trial1(0,[],_A) :- fail,!. +trial1(0,A,A) :- !. +trial1(N,A,B) :- mindreadtest(S), append(A,[S],A2), + N1 is N-1,trial1(N1,A2,B). + +%%midpoint([],0) :- !. + +midpoint(S,MP) :- +%% not(S=[]), + length(S,L), + A is mod(L,2), + (A is 0-> + (M1 is L/2, M2 is M1+1,N1 is M1-1,N2 is M2-1,length(N11,N1),length(N21,N2),append(N11,[N12|_Rest1],S),append(N21,[N22|_Rest2],S),MP is (N12+N22)/2) + ; + (L2 is L+1, M1 is L2/2, N1 is M1-1,length(N11,N1),append(N11,[MP|_Rest],S))). + +halves(S,H1,H2) :- + length(S,L), + A is mod(L,2), + (A is 0-> + (M1 is L/2,length(H1,M1),append(H1,H2,S)) + ; + (L2 is L-1,M1 is L2/2,length(H1,M1),append(H1,[_|H2],S))). + +sum([],S,S):-!. +sum(S0,S1,S2) :- + S0=[S3|S4], + S5 is S1+S3, + sum(S4,S5,S2). + +mean(Sum,Mean) :- + Mean is Sum/2. + +furthestfrommean(S,Mean,V) :- + absdiffmean(S,Mean,[],D), + sort(D,D1), + reverse(D1,[[_,V]|_Rest]). + +absdiffmean([],_M,D,D) :- !. +absdiffmean(S,M,D1,D2) :- + S=[S1|S2], + S3 is abs(S1-M), + append(D1,[[S3,S1]],D3), + absdiffmean(S2,M,D3,D2). 
+ +mindreadtest(Sec) :- + %% 250 br for characters to be br out with 10 br each from person to me - do when initial 250 br test done and doing 10 br test + %%comment(fiftyastest), + %%random(X),X1 is 10*X, X2 is floor(X1), (X2=<2 -> ( + %%texttobr,write1(['true test']), %%); %% use breasonings breasoned out by computer for not by me, for job medicine for "me", at last time point + %%true), %% leave last time point blank + %%**texttobr2(640);true),%% make an A to detect reaction to gracious giving or blame of in following + get_time(TimeStamp1), + %%phrase_from_file(string(_String), 'file.txt'), + texttobr2(2), %% 100 As for answer (must be br before this on same day) + %% is gracious giving or blame + get_time(TimeStamp2), + %%comment(turnoffas), + Sec is TimeStamp2 - TimeStamp1. + +concat_list2A(A1,B):- + A1=[A|List], + concat_list2A(A,List,B),!. + +concat_list2A(A,[],A):-!. +concat_list2A(A,List,B) :- + List=[Item|Items], + string_concat(A,Item,C), + concat_list2A(C,Items,B). diff --git a/Algorithm-Writer-with-Lists/algwriter-lists-random.pl b/Algorithm-Writer-with-Lists/algwriter-lists-random.pl new file mode 100644 index 0000000000000000000000000000000000000000..db66bf13b70561c09f58271525287a8c96699520 --- /dev/null +++ b/Algorithm-Writer-with-Lists/algwriter-lists-random.pl @@ -0,0 +1,295 @@ +%% algwriter(A),write1(A). +%% write data,fns in exec format + +:-include('../Text-to-Breasonings/text_to_breasonings.pl'). + +multiply(A,B,C) :- +C is A*B. + +subtract(A,B,C) :- +C is A-B. + +head(A,B) :- A=[B|_]. + +tail(A,B) :- A=[_|B]. + +wrap(A,B) :- B=[A]. 
+ +algwriter(Na) :- +%%notrace, + phrase_from_file_s(string(BrDict0), "../Text-to-Breasonings/brdict1.txt"), + %%SepandPad="&#@~%`$?-+*^,()|.:;=_/[]<>{}\n\r\s\t\\\"!'0123456789", + splitfurther(BrDict0,BrDict01), + %%trace, + sort(BrDict01,BrDict02), +makerandomlist(BrDict02,5,[],R0),makerandomlist(R0,3,[],R1),makerandomlist(R0,3,[],R2),wrap(R1,Nb1),wrap(R2,Nb2),append(Nb1,Nb2,Nb3),trialy2A([3,4,5,6,7,8],NFns),randomfns(NFns,Nb3,Na),!. +makerandomlist(_,0,A,A). +makerandomlist(List,A,C1,C) :- +not((=(A,0))),trialy2A(List,N1),%%random(R),multiply(R,L,R1),ceiling(R1,N1), +wrap(N1,N2),append(C1,N2,Nb3),subtract(A,1,D),makerandomlist(List,D,Nb3,C). +randomfns(1,B,Nb3):-randomlist(B,Na1),randomlist(B,Na2),randomfn(Na1,Na2,Nb),wrap(Nb,Nb2),append(B,Nb2,Nb3).%%write1("."). +randomfns(A,B,C) :- not((=(A,1))),randomlist(B,Na1),randomlist(B,Na2),randomfn(Na1,Na2,Nb),wrap(Nb,Nb2),append(B,Nb2,Nb3),%%tail(B,T), + subtract(A,1,D),write(','), + randomfns(D,Nb3,C). +randomlist(B,Na) :- %%List1=[1,2,3,4,5,6,7,9,10], +length(B,L),length(C,L),append(C,_,B),trialy2A(C,Na).%%,random(R),length(B,Bl),multiply(R,Bl,N),ceiling(N,N1),getitemn(N1,B,Na). +getitemn(0,_A,[]). +getitemn(1,B,C) :- head(B,C). +getitemn(A,B,C) :- not((=(A,1))),tail(B,T),subtract(A,1,D),getitemn(D,T,C). +randomfn(A1,A2,B) :- %%repeat, +trialy2A([1,3,4,5,7],N1),%%random(R),multiply(R,8,N),ceiling(N,N1), +fna(N1,A1,A2,B). +fna(1,A1,_A2,B) :- reverse(A1,[],B),writeln1([[n,reverse],[A1,[],B]]),term_to_atom(A1,A11),term_to_atom(B,B1),concat_list(["I reversed ",A11,", giving ",B1,"."],C),writeln1(C). +%%fna(2,A1,_A2,B) :- sort0(A1,B),write1(sort0(A1,B)). +fna(3,A1,A2,B) :- append1(A1,A2,B),writeln1([[n,append1],[A1,A2,B]]),term_to_atom(A1,A11),term_to_atom(A2,A21),term_to_atom(B,B1),concat_list(["I appended ",A11," to ",A21,", giving ",B1,"."],C),writeln1(C). 
+fna(4,A1,A2,B) :- minus1(A1,A2,B),writeln1([[n,minus1],[A1,A2,B]]),term_to_atom(A1,A11),term_to_atom(A2,A21),term_to_atom(B,B1),concat_list(["I subtracted ",A21," from ",A11,", giving ",B1,"."],C),writeln1(C). +fna(5,A1,A2,B) :- intersection1(A1,A2,[],B),writeln1([[n,intersection1],[A1,A2,[],B]]),term_to_atom(A1,A11),term_to_atom(A2,A21),term_to_atom(B,B1),concat_list(["I found the intersection of ",A11," and ",A21,", ",B1,"."],C),writeln1(C). +%%fna(6,A1,A2,A2) :- mutually_exclusive(A1,A2),write1([[n,mutually_exclusive],[A1,A2]]). +fna(7,A1,A2,B) :- duplicates(A1,A2,[],B),writeln1([[n,duplicates],[A1,A2,[],B]]),term_to_atom(A1,A11),term_to_atom(A2,A21),term_to_atom(B,B1),concat_list(["I found the duplicates in ",A11," and ",A21,", ",B1,"."],C),writeln1(C). +%%fna(8,A1,A2,A1) :- substring(A1,A2),write1([[n,substring],[A1,A2]]). +reverse([],L,L). +reverse(L,M,N) :- head(L,H),tail(L,T),wrap(H,H1),append(H1,M,O),reverse(T,O,N). +sort0(L,N) :- sort1(L,[],N). +sort1([],L,L). +sort1(L,M1,N) :- not((=(L,[]))),head(L,H),tail(L,T),maximum(T,H,M2,[],R),wrap(M2,M3),append(M1,M3,M4),sort1(R,M4,N). +maximum([],L,L,R,R). +maximum(L,M1,N,R1,R2) :- not((=(L,[]))),head(L,H),tail(L,T),(>=(M1,H)->(=(M2,M1),wrap(H,H2),append(R1,H2,R3));(=(M2,H),wrap(M1,M12),append(R1,M12,R3))),maximum(T,M2,N,R3,R2). +map(_F,[],L,L). +map(F,L,M1,N) :- not((=(L,[]))),head(L,H),tail(L,T),functor(A,F,3),arg(1,A,M1),arg(2,A,H),arg(3,A,M2),A,map(F,T,M2,N). +findall(_F,[],L,L). +findall(F,L,M1,N) :- not((=(L,[]))),head(L,H),tail(L,T),functor(A,F,2),arg(1,A,H),arg(2,A,M2),(A->((wrap(M2,M3),append(M1,M3,M4)));(=(M1,M4))),findall(F,T,M4,N). +intersection1([],_A,L,L). +intersection1(L1,L2,L3a,L3) :- head(L1,I1),tail(L1,L4),intersection2(I1,L2,[],L5),append(L3a,L5,L6),intersection1(L4,L2,L6,L3). +intersection2(_A,[],L,L). +intersection2(I1,L1,L2,L3) :- head(L1,I1),tail(L1,L4),wrap(I1,I11),append(L2,I11,L5),intersection2(I1,L4,L5,L3). 
+intersection2(I1,L1,L2,L3) :- head(L1,I2),tail(L1,L4),not((=(I1,I2))),intersection2(I1,L4,L2,L3). +append1(B,C,A) :- append(B,C,A). +minus1(L,[],L). +minus1(L1,L2,L3) :- head(L2,I1),tail(L2,L5),delete2(L1,I1,[],L6),minus1(L6,L5,L3). +delete2([],_A,L,L). +delete2(L1,I1,L2,L3) :- head(L1,I1),tail(L1,L5),delete2(L5,I1,L2,L3). +delete2(L1,I1,L2,L3) :- head(L1,I2),tail(L1,L5),not((=(I1,I2))),wrap(I2,I21),append(L2,I21,L6),delete2(L5,I1,L6,L3). + +mutually_exclusive([],_L):-!. +mutually_exclusive(L,M):-head(L,H),tail(L,T),membera3(M,H),mutually_exclusive(T,M),!. +membera3([],_L):-!. +membera3(L,M):-head(L,H),tail(L,T),not((=(M,H))),membera3(T,M),!. + +duplicates([],_L,S,S). +duplicates(L,M,S1,S2):-head(L,H),tail(L,T),member(H,M),(deletea2(M,H,M1)->(true);(=(M,M1))),wrap(H,H1),append(S1,H1,S3),duplicates(T,M1,S3,S2),!. +duplicates(L,M,S1,S2):-head(L,H),tail(L,T),not((membera4(M,H))),duplicates(T,M,S1,S2). +deletea2([],_L,_M1):-fail. +deletea2(L,M,T):-head(L,H),tail(L,T),=(M,H). +deletea2(L,M,M1):-head(L,H),tail(L,T),not((=(M,H))),deletea2(T,M,M1). +membera4([],_L):-fail. +membera4(L,H):-head(L,H). +membera4(L,M):-head(L,H),tail(L,T),not(M=H),membera4(T,M). + +substring([],[]). +substring([],B):-not((=(B,[]))),fail. +substring(A,B):-tail(A,At),(listhead(A,B)->(true);(substring(At,B))). +listhead(_L,[]). +listhead(A,B):-head(A,Ah),tail(A,At),head(B,Ah),tail(B,Bt),listhead(At,Bt). + + +%%%%%%%%%%%% + +%%findbest(R,R) :-!. +findbest2(R,Item):- + sort(R,RA), + reverse(RA,RB), + RB=[[_,Item]|_Rest]. + + +%%trialy2A([],R) :- +%% R=[[_,'A']]. +%%trialy2A(List,R) :- +%% random_member(A,List), +%% R=[[_,A]]. + +trialy2A([],R) :- + R=[]. +trialy2A(List,R) :- + random_member(R,List). + %%R=[[_,A]]. + +%%trialy2A(List,R) :- +%% notrace,trialy2B(List,R).%%,trace. 
+trialy2B(List,R) :- + length(List,Length), + ((Length=<9-> + findr4(R4), + number_string(R4,R4A), + formr5([R4A],9,Length,R5), + findr(R5,List,R)); + (Length=<99-> + findr4(R41), + findr4(R42), + formr5([R41,R42],99,Length,R5), + findr(R5,List,R)); + (Length=<999-> + findr4(R41), + findr4(R42), + findr4(R43), + formr5([R41,R42,R43],999,Length,R5), + findr(R5,List,R)); + (Length=<9999-> + findr4(R41), + findr4(R42), + findr4(R43), + findr4(R44), + formr5([R41,R42,R43,R44],9999,Length,R5), + findr(R5,List,R)); + (Length=<99999-> + findr4(R41), + findr4(R42), + findr4(R43), + findr4(R44), + findr4(R45), + formr5([R41,R42,R43,R44,R45],99999,Length,R5), + findr(R5,List,R)); + fail), + %%write1([r,R]),trace. + true. + +findr4(R4) :- + List1=[0,1,2,3,4,5,6,7,8,9], + Trials is 30, + trialy22(List1,Trials,[],R1), + findbest2(R1,R4). + %%number_string(R3,R2), +formr5(RList,Upper,Length,R5) :- + %%findall(D,(member(C,RList),floor(C,D)),RList2), + concat_list2A(RList,R5A), + number_string(R5B,R5A), + R51 is floor((R5B/Upper)*Length), + (R5B=Upper->R5 is R51-1;R5=R51). +findr(R4,List,R) :- + %%floor(R4,R4A), + length(A,R4), + append(A,[R|_],List). + + %%random_member(A,List), + %%R=[[_,A]]. + + /** + length(List,L), + Trials is L*3, + trialy22(List,Trials,[],R).**/ + +trialy22([],_,R,R) :- !. +trialy22(List,Trials,RA,RB) :- + List=[Item|Items], + trialy21(Item,Trials,R1), + append(RA,[R1],RC), + trialy22(Items,Trials,RC,RB),!. + +trialy21(Label,Trials,RA) :- + trialy3(Trials,[],R), + aggregate_all(count, member(true,R), Count), + RA=[Count,Label]. + +trialy3(0,R,R) :-!. +trialy3(Trials1,RA,RB) :- + trialy1(R1), + append(RA,[R1],RC), + Trials2 is Trials1-1, + trialy3(Trials2,RC,RB),!. + +%% try other nouns +trialy1(R1) :- + %%control11(A1), + %%repeat, + trial0(A22), %% Control + sum(A22,0,S22), + mean(S22,A1), + %%repeat, + trial0(A21), %% Test 1 + sum(A21,0,S02), + mean(S02,A2), + (A1>A2->R1=true;R1=fail). + +trial0(S3) :- N is 10, trial1(N,[],S),trial01(S,S3). 
+trial01(S1,S3) :- + sort(S1,S), + %%midpoint(S,MP), + halves(S,H1,H2), + midpoint(H1,Q1), + midpoint(H2,Q3), + IQR is Q3-Q1, + sum(S,0,S02), + mean(S02,Mean), + furthestfrommean(S,Mean,V), + D1 is 1.5*IQR, + D2 is V-Mean, + (D2>D1->(delete(S,V,S2),trial01(S2,S3));S=S3). + +%%trial1(0,[],_A) :- fail,!. +trial1(0,A,A) :- !. +trial1(N,A,B) :- mindreadtest(S), append(A,[S],A2), + N1 is N-1,trial1(N1,A2,B). + +%%midpoint([],0) :- !. + +midpoint(S,MP) :- +%% not(S=[]), + length(S,L), + A is mod(L,2), + (A is 0-> + (M1 is L/2, M2 is M1+1,N1 is M1-1,N2 is M2-1,length(N11,N1),length(N21,N2),append(N11,[N12|_Rest1],S),append(N21,[N22|_Rest2],S),MP is (N12+N22)/2) + ; + (L2 is L+1, M1 is L2/2, N1 is M1-1,length(N11,N1),append(N11,[MP|_Rest],S))). + +halves(S,H1,H2) :- + length(S,L), + A is mod(L,2), + (A is 0-> + (M1 is L/2,length(H1,M1),append(H1,H2,S)) + ; + (L2 is L-1,M1 is L2/2,length(H1,M1),append(H1,[_|H2],S))). + +sum([],S,S):-!. +sum(S0,S1,S2) :- + S0=[S3|S4], + S5 is S1+S3, + sum(S4,S5,S2). + +mean(Sum,Mean) :- + Mean is Sum/2. + +furthestfrommean(S,Mean,V) :- + absdiffmean(S,Mean,[],D), + sort(D,D1), + reverse(D1,[[_,V]|_Rest]). + +absdiffmean([],_M,D,D) :- !. +absdiffmean(S,M,D1,D2) :- + S=[S1|S2], + S3 is abs(S1-M), + append(D1,[[S3,S1]],D3), + absdiffmean(S2,M,D3,D2). 
+ +mindreadtest(Sec) :- + %% 250 br for characters to be br out with 10 br each from person to me - do when initial 250 br test done and doing 10 br test + %%comment(fiftyastest), + %%random(X),X1 is 10*X, X2 is floor(X1), (X2=<2 -> ( + %%texttobr,write1(['true test']), %%); %% use breasonings breasoned out by computer for not by me, for job medicine for "me", at last time point + %%true), %% leave last time point blank + %%**texttobr2(640);true),%% make an A to detect reaction to gracious giving or blame of in following + get_time(TimeStamp1), + %%phrase_from_file(string(_String), 'file.txt'), + texttobr2(2), %% 100 As for answer (must be br before this on same day) + %% is gracious giving or blame + get_time(TimeStamp2), + %%comment(turnoffas), + Sec is TimeStamp2 - TimeStamp1. + +concat_list2A(A1,B):- + A1=[A|List], + concat_list2A(A,List,B),!. + +concat_list2A(A,[],A):-!. +concat_list2A(A,List,B) :- + List=[Item|Items], + string_concat(A,Item,C), + concat_list2A(C,Items,B). diff --git a/Algorithm-Writer-with-Lists/brdict3.txt b/Algorithm-Writer-with-Lists/brdict3.txt new file mode 100644 index 0000000000000000000000000000000000000000..3ecd4ab12dd924fbab747b912705e8e63311560a --- /dev/null +++ b/Algorithm-Writer-with-Lists/brdict3.txt @@ -0,0 +1 @@ 
+[["a","a"],["person","subject"],["pear","pear"],["banana","banana"],["apple","apple"],["right","check"],["city","origin"],["paper","command"],["up","check"],["and","interpreter"],["box","minimise"],["twobox","compiler"],["variable","satisfy"],["software","generator"],["leash","all"],["square","end"],["xcorrection","xcorrection"],["one","true"],["dash","twodimensional"],["tick","secure"],["book","business"],["plus","useful"],["equals","compute"],["ball","sphere"],["ship","ship"],["down","current"],["zero","empty"],["brain","calculate"],["log","log"],["n","value"],["testtube","test"],["building","building"],["minus","false"],["dollarsymbol","money"],["happy","happy"],["lowera","word"],["filename","filename"],["gl","gl"],["txt","txt"],["notrace","notrace"],["mtrue","mtrue"],["m","m"],["grammar","grammar"],["logic","logic"],["to","to"],["alg","alg"],["sentences","sentences"],["medicine","medicine"],["by","by"],["lucian","lucian"],["note","music"],["pencilsharpener","object"],["hand","hand"]] \ No newline at end of file diff --git a/Algorithm-Writer-with-Lists/combophil.pl b/Algorithm-Writer-with-Lists/combophil.pl new file mode 100644 index 0000000000000000000000000000000000000000..df24d9ccca90afe131ead4d4e955eb5d05a727f7 --- /dev/null +++ b/Algorithm-Writer-with-Lists/combophil.pl @@ -0,0 +1,33 @@ +%% combophil.pl + +%% Finds combinations of lines of philosophy + +%% *** Deprecated by Repository Lucian-Academy/combophil_alg_log.pl + +:-include('../listprologinterpreter/la_strings'). + +string(String) --> list(String). + +list([]) --> []. +list([L|Ls]) --> [L], list(Ls). + +%% e.g. combophil(2). 
to write on a combination of philosophies + +combophil(NumberOfPhils) :- + phrase_from_file_s(string(Phil1), "../Text-to-Breasonings/file.txt"), + SepandPad="\n\r", + split_string(Phil1,SepandPad,SepandPad,Phil2), + delete(Phil2,"",Phil3), + sort(Phil3,Phil4), + length(Phil4,LengthPhil4),write("Number of philosophies in file.txt: "), + writeln(LengthPhil4), + length(PhilsLengthList,NumberOfPhils), + repeat, + findall(Phils1,(member(_,PhilsLengthList),random_member(Phils1,Phil4)),Phils2), + reverse(Phils2,Phils3),Phils3=[Phil6|Phils4],reverse(Phils4,Phils5), + findall(_,(member(Phil5,Phils5),writeln1(Phil5),nl,writeln1("because"),nl),_), + writeln1(Phil6),nl, + write("1-Continue or 2-End: "), + read_string(user_input, "\n", "\r", _End, Input), + (Input="2"->abort;fail). + diff --git a/Algorithm-Writer-with-Lists/combophil2qadb.pl b/Algorithm-Writer-with-Lists/combophil2qadb.pl new file mode 100644 index 0000000000000000000000000000000000000000..0cabf111299db7b753a54b1be650679d2b7b8a9f --- /dev/null +++ b/Algorithm-Writer-with-Lists/combophil2qadb.pl @@ -0,0 +1,165 @@ +%% combophil.pl x combophil2qadb.pl + +% Don't change to mrtree because mr is working. Without seeing and thinking about options, it is meaningless. + +%% Finds combinations of lines of philosophy + +%% *** Deprecated by Repository Lucian-Academy/combophil_alg_log.pl + +:-include('../listprologinterpreter/la_strings'). + +string(String) --> list(String). + +list([]) --> []. +list([L|Ls]) --> [L], list(Ls). + +%% e.g. combophil(2). 
to write on a combination of philosophies + +combophil2qadb :- %(NumberOfPhils) :- + +% get box, right, box words from brdict1 +phrase_from_file_s(string(BrDict0), "../Text-to-Breasonings/brdict1.txt"), +%phrase_from_file_s(string(BrDict0), "../../brdict1.txt"), +%string_codes(String02b,BrDict0), +%atom_to_term(String02b,String02a,[]), +splitfurther(BrDict0,BrDict01), +sort(BrDict01,String02a), +findall(A2,(member(A,String02a),A=[A2,"box"]),A1), +findall(B2,(member(B,String02a),B=[B2,"right"]),B1), + + + phrase_from_file_s(string(Phil1), "../Text-to-Breasonings/file.txt"), + SepandPad="\n\r.", + split_string(Phil1,SepandPad,SepandPad,Phil2), + delete(Phil2,"",Phil3), + sort(Phil3,Phil4), + length(Phil4,LengthPhil4),write("Number of philosophies in file.txt: "), + writeln(LengthPhil4), + %length(PhilsLengthList,NumberOfPhils), + + findall([C,Algorithm],(member(C,Phil4), + qa_db_finder(A1,B1,Algorithm)),D), + + term_to_atom(D,K), + (open_s("file-o.txt",write,Stream1), + write(Stream1,K), + close(Stream1)),!. + /** + repeat, + findall(Phils1,(member(_,PhilsLengthList),random_member(Phils1,Phil4)),Phils2), + reverse(Phils2,Phils3),Phils3=[Phil6|Phils4],reverse(Phils4,Phils5), + findall(_,(member(Phil5,Phils5),writeln1(Phil5),nl,writeln1("because"),nl),_), + writeln1(Phil6),nl, + write("1-Continue or 2-End: "), + read_string(user_input, "\n", "\r", _End, Input), + (Input="2"->abort;fail). +**/ + + +splitfurther(BrDict01,N) :- + phrase(file0(N),BrDict01). + + file0(N) --> "[", file(N), "]", !. + file0([]) --> []. + + file([L|Ls]) --> entry(L),",", + %%{writeln(L)}, %%*** + file(Ls), !. + file([L]) --> entry(L), + %%{writeln(L)}, + !. + + entry([Word2,Word4]) --> + "[", word(Word), {string_codes(Word2,Word),string(Word2)}, +",", + word(Word3), {string_codes(Word4,Word3),string(Word4)}, + "]". + + word([X|Xs]) --> [X], {char_type(X,csymf)->true;(X=27->true;X=8217)}, word(Xs), !. + %%word([X]) --> [X], {char_type(X,csymf);(X=27;X=8217)}, !. + word([]) --> []. 
+ +qa_db_finder(A1,B1,Algorithm) :- +% name of algorithm + random_member(Verb,B1), + + %writeln("What is the input variable?"), + random_member(I1,A1), + upper_first_letter(I1,I), + %read_string(user_input, "\n", "\r", _End, I), + + %writeln("What variable is this last variable linked to?"), + random_member(N1,A1), + upper_first_letter(N1,N), + %read_string(user_input, "\n", "\r", _End2, N), + + concat_list(["member(A1,",I,N,"),A1=[",I,",",N,"],"],Algorithm1), + Vars1=[I,N], + + repeat1(A1,2,N,Algorithm1,Algorithm2,Vars1,Vars2), + + string_concat(Algorithm3,",",Algorithm2), + + %writeln("What is the final output variable?"), + random_member(O1,A1), + upper_first_letter(O1,O2), + append(Vars2,[O2],O3), + random_member(O,O3), + %read_string(user_input, "\n", "\r", _End3, O), + %%trace, + + find_header_args1(Vars2,"",HA1), + %%string_concat(HA2,",",HA1), + + concat_list(["a(",HA1,I,",",O,"):-"],Algorithm4), + concat_list([Algorithm4,Algorithm3,"."],Algorithm). + +upper_first_letter(A,B) :- + string_concat(C,F,A), + string_length(C,1), + to_upper(C,D), + string_codes(E,[D]), + string_concat(E,F,B),!. + +repeat1(A1,M1,N,Algorithm1,Algorithm2,Vars1,Vars2) :- + %writeln("Is this the final variable? (y/n)"), + random_member(Q,["y","n"]), + %read_string(user_input, "\n", "\r", _End, Q), + + (Q="y"->(Algorithm2=Algorithm1,Vars2=Vars1); + %writeln("What variable is this last variable linked to?"), + %(random(X_A),X_B is ceiling(2*X_A), + %(X_B=1-> + random_member(V1,A1), + upper_first_letter(V1,V2), + append(Vars1,[V2],V3), + random_member(V,V3), + %read_string(user_input, "\n", "\r", _End2, V), + + concat_list(["member(A",M1,",",N,V,"),A",M1,"=[",N,",",V,"],"],Algorithm1a), + append(Vars1,[V],Vars3), + M2 is M1+1, + string_concat(Algorithm1,Algorithm1a,Algorithm1b), + repeat1(A1,M2,V,Algorithm1b,Algorithm2,Vars3,Vars2) + + ). + + +find_header_args1([_],HA,HA) :- !. 
+find_header_args1(Vars,HA1,HA2) :- + Vars=[_|F], + Vars=[A,B|_], + concat_list([A,B,","],D), + string_concat(HA1,D,E), + find_header_args1(F,E,HA2). + + +concat_list(A1,B):- + A1=[A|List], + concat_list(A,List,B),!. + +concat_list(A,[],A):-!. +concat_list(A,List,B) :- + List=[Item|Items], + string_concat(A,Item,C), + concat_list(C,Items,B). diff --git a/Algorithm-Writer-with-Lists/combophil_grammar_logic_to_alg.pl b/Algorithm-Writer-with-Lists/combophil_grammar_logic_to_alg.pl new file mode 100644 index 0000000000000000000000000000000000000000..e501b7e5869a0677daf729e370ab5e57883c9cf3 --- /dev/null +++ b/Algorithm-Writer-with-Lists/combophil_grammar_logic_to_alg.pl @@ -0,0 +1,208 @@ +/** + +grammar_logic_to_alg.pl + + +e.g. I liked God with you +[liked,God],[God,you] with extra data +- connect liked to you with member, return words on path + +POS + +goes backwards x forwards +- I v n by v n x ignore + +n,v +- single type + +adjective +- ignored x (ignore names in alg) +- like v,n + +joining words - ignore, just pairs v, n +- on +- joins v to n +- separates vns + +negative terms +- switch to positive or ignore + +disappearing words +- my +- no list of these, because they are ignored + +vv is v1, v2 x v1 +nn is n1 + +* + +later: can randomly generate details like given sentence + +**/ + +:-include('../Text-to-Breasonings/text_to_breasonings.pl'). + +:- dynamic brdict/1. 
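
concat_list/2, defined above, left-folds string_concat/3 over a list of strings. A minimal equivalent sketch using foldl/4 (the name concat_list_sketch is illustrative, not part of the repository):

```prolog
% Restatement of concat_list/2 above using foldl/4 and a yall lambda.
% concat_list_sketch/2 is an illustrative name, not repository code.
concat_list_sketch([A|Items], B) :-
    foldl([I,Acc,Out]>>string_concat(Acc, I, Out), Items, A, B).

% ?- concat_list_sketch(["a(", "X", ")"], S).
% S = "a(X)".
```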
+ +combophil_grammar_logic_to_alg1 :- + phrase_from_file_s(string(Text1), "../Text-to-Breasonings/file.txt"), + + phrase_from_file_s(string(BrDict0), "../Text-to-Breasonings/brdict1.txt"), + splitfurther(BrDict0,BrDict01), + sort(BrDict01,BrDict012), + retractall(brdict(_)), + assertz(brdict(BrDict012)), + + %%SepandPad=".\n", + %%split_string(Text1,SepandPad,SepandPad,Text2a), + %%delete(Text2a,"",Text2), + + + %%findall(B2,(member(B1,Text2), + string_codes(Text11,Text1), + downcase_atom(%%B1 + Text11,B11),atom_string(B11,B12), + SepandPad1=" .\n", + split_string(B12,SepandPad1,SepandPad1,A), + %%trace, + findall(D,(member(C3,A), + member([C3,"right"],BrDict012),member(C1,A), + not(member([C1,"right"],BrDict012)), + concat_list([C3," ",C1],C2),grammar_logic_to_alg(C2,D)),C), + %%writeln(C),trace, + + %%grammar_logic_to_alg(D,B2)),C), + length(C,CLength), + writeln([CLength,sentences]), + + C=[[[_,Sentence1,a_alg(List_a),_,bb_alg(List_bb)]|Cs1]|Cs2], + %%get_time(TS),stamp_date_time(TS,date(Year,Month,Day,Hour1,Minute1,Seconda,_A,_TZ,_False),local), + concat_list(["gla_del"],%%[Year,Month,Day,Hour1,Minute1,Seconda], + File1), + concat_list(["\"",File1,".txt\""],File2), + + term_to_atom(List_a,List_a2), + string_atom(List_a3,List_a2), + + concat_list(["swipl -G100g -T20g 
-L2g\n['../listprolog'].\nleash(-all),visible(+all),protocol(",File2,"),trace,interpret(off,[[n,function],[",List_a3,"]],[[[n,function],[[v,a]],\":-\",[[[n,length],[[v,a],0,1]]]],[[n,function],[[v,a]],\":-\",[[[n,head],[[v,a],[v,d]]],[[n,equals1],[[v,d],[[v,e],[v,f]]]],[[n,reverse],[[v,a],[],[v,a1]]],[[n,head],[[v,a1],[v,d1]]],[[n,equals1],[[v,d1],[[v,e1],[v,f1]]]],[[n,function2],[[v,a],[v,f],[v,f1]]]]],[[n,reverse],[[],[v,l],[v,l]]],[[n,reverse],[[v,l],[v,m],[v,n]],\":-\",[[[n,head],[[v,l],[v,h]]],[[n,tail],[[v,l],[v,t]]],[[n,wrap],[[v,h],[v,h1]]],[[n,append],[[v,h1],[v,m],[v,o]]],[[n,reverse],[[v,t],[v,o],[v,n]]]]],[[n,function2],[[v,a],[v,b],[v,f]],\":-\",[[[n,member2],[[v,a],[v,d]]],[[n,equals1],[[v,d],[[v,b],[v,f]]]]]],[[n,function2],[[v,a],[v,b],[v,c]],\":-\",[[[n,member2],[[v,a],[v,d]]],[[n,equals1],[[v,d],[[v,b],[v,f]]]],[[n,function2],[[v,d],[v,f],[v,c]]]]],[[n,length],[[],[v,l],[v,l]]],[[n,length],[[v,l],[v,m1],[v,n]],\":-\",[[[n,not],[[[n,=],[[v,l],[]]]]],[[n,tail],[[v,l],[v,t]]],[[n,+],[[v,m1],1,[v,m2]]],[[n,length],[[v,t],[v,m2],[v,n]]]]]],[[]]),notrace,noprotocol.\nhalt.\nswipl -G100g -T20g -L2g\n['../Text-to-Breasonings/text_to_breasonings.pl'].\ntime((N is 3,\nM is 16000,\ntexttobr2(N,",File2,",u,M),texttobr(N,",File2,",u,M))).\n['../texttobr2qb'].\ntexttobr2(3).\nhalt."],List_a1), + writeln1([*,Sentence1,a_alg(List_a)]), + writeln(List_a1), + + concat_list(["\n\nswipl -G100g -T20g -L2g\n['../listprolog'].\nleash(-all),visible(+all),protocol(",File2,"),trace,interpret(off,[[n,function],[",List_a3,",[v,b]]],[[[n,function],[[v,a],[v,b]],\":-\",[[[n,tail],[[v,a],[v,b]]]]]],_),notrace,noprotocol.\nhalt.\nswipl -G100g -T20g -L2g\n['../Text-to-Breasonings/text_to_breasonings.pl'].\ntime((N is 3,\nM is 16000,\ntexttobr2(N,",File2,",u,M),texttobr(N,",File2,",u,M))).\n['../texttobr2qb'].\ntexttobr2(3).\nhalt."],List_b1), + writeln1([*,Sentence1,b_alg(List_a)]), + writeln(List_b1), + + term_to_atom(List_bb,List_bb2), + string_atom(List_bb3,List_bb2), + + 
concat_list(["swipl -G100g -T20g -L2g\n['../listprolog'].\nleash(-all),visible(+all),protocol(",File2,"),trace,interpret(off,[[n,function],[",List_bb3,"]],[[[n,function],[[v,a]],\":-\",[[[n,length],[[v,a],0,1]]]],[[n,function],[[v,a]],\":-\",[[[n,head],[[v,a],[v,d]]],[[n,equals1],[[v,d],[[v,e],[v,f]]]],[[n,reverse],[[v,a],[],[v,a1]]],[[n,head],[[v,a1],[v,d1]]],[[n,equals1],[[v,d1],[[v,e1],[v,f1]]]],[[n,function2],[[v,a],[v,f],[v,f1]]]]],[[n,reverse],[[],[v,l],[v,l]]],[[n,reverse],[[v,l],[v,m],[v,n]],\":-\",[[[n,head],[[v,l],[v,h]]],[[n,tail],[[v,l],[v,t]]],[[n,wrap],[[v,h],[v,h1]]],[[n,append],[[v,h1],[v,m],[v,o]]],[[n,reverse],[[v,t],[v,o],[v,n]]]]],[[n,function2],[[v,a],[v,b],[v,f]],\":-\",[[[n,member2],[[v,a],[v,d]]],[[n,equals1],[[v,d],[[v,b],[v,f]]]]]],[[n,function2],[[v,a],[v,b],[v,c]],\":-\",[[[n,member2],[[v,a],[v,d]]],[[n,equals1],[[v,d],[[v,b],[v,f]]]],[[n,function2],[[v,d],[v,f],[v,c]]]]],[[n,length],[[],[v,l],[v,l]]],[[n,length],[[v,l],[v,m1],[v,n]],\":-\",[[[n,not],[[[n,=],[[v,l],[]]]]],[[n,tail],[[v,l],[v,t]]],[[n,+],[[v,m1],1,[v,m2]]],[[n,length],[[v,t],[v,m2],[v,n]]]]]],[[]]),notrace,noprotocol.\nhalt.\nswipl -G100g -T20g -L2g\n['../Text-to-Breasonings/text_to_breasonings.pl'].\ntime((N is 3,\nM is 16000,\ntexttobr2(N,",File2,",u,M),texttobr(N,",File2,",u,M))).\n['../texttobr2qb'].\ntexttobr2(3).\nhalt."],List_bb1), + writeln1([*,Sentence1,bb_alg(List_bb)]), + writeln(List_bb1), + + writeln1(Cs1), + writeln1(Cs2). 
+ +grammar_logic_to_alg(Sentence1,B) :- %% Not by multi-sentence algorithms, just by sentence + atom_string(Sentence0,Sentence1), + downcase_atom(Sentence0,Sentence01), + atom_string(Sentence01,Sentence02), + SepandPad="&#@~%`$?-+*^,()|.:;=_/[]<>{}\n\r\s\t\\\"!'0123456789", + Connectors= + ["the","a","i","on","with","of","an","for","to", + "was","were","and","in","my","from","out","by"], + split_string(Sentence02,SepandPad,SepandPad,Sentence2), + subtract(Sentence2,Connectors,Sentence3), + %%length(Sentence3,Length), + + %% () add generated data + %%write_commands(Length,[],Commands), %% sentence alg + + generate_sentences(Sentence3,[],Sentence_a,30), %% detail sentences + append(Sentence3,Sentence_a,Sentence4), + findall([*,Sentence1,a_alg(Sentence5),b_alg(Sentence5,a),bb_alg(Sentence6)],(member(Sentence4a,Sentence4),make_lists(Sentence4a,[],Sentence5),Sentence5=[_|Sentence6]),B),!. + %% detail algs + +generate_sentences(_Sentence3,Sentence_a,Sentence_a,0) :- !. +generate_sentences(Sentence3,Sentence_a1,Sentence_a2,N1) :- + random_member(Item,Sentence3), + generate_sentence(Item,Sentence_a3), + append(Sentence_a1,[Sentence_a3],Sentence_a4), + N2 is N1-1, + generate_sentences(Sentence3,Sentence_a4,Sentence_a2,N2). + +%%write_commands(0,Commands,Commands) :- !. +%%write_commands(Length1,Commands1,Commands2) :- +%% Length2 is Length1-1. +%% append(Commands1,[[[n,member2],[[v,Length1],[v,Length2]]]). + %%[[n,equals1],[[v,Length2],[[v,a***],[v,b]]]]]. + +%%[a,b] +%%[c,d] + +generate_sentence(Item,Sentence) :- + random_member(Grammar1,[[n,v,n],[n,v,a,n],[v,n],[v,a,n]]), + brdict(BrDict012), + find_pos(Item,POS,BrDict012), + substitute1(Item,POS,Grammar1,[],Grammar2), + substitute2(Grammar2,BrDict012,[],Sentence). + +find_pos(Item,POS2,BrDict012) :- + member([Item,POS1],BrDict012), + POS1="right", + POS2=v,!. +find_pos(Item,POS2,BrDict012) :- + member([Item,POS1],BrDict012), + POS1="plus", + POS2=a,!. +find_pos(_Item,POS2,_BrDict012) :- + POS2=n. 
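
find_pos/3 above maps brdict1 tags to grammar symbols: entries tagged "right" act as verbs (v), entries tagged "plus" as adjectives (a), and anything else defaults to noun (n). A self-contained sketch of the same mapping, with a made-up two-entry dictionary:

```prolog
% Tag-to-POS mapping as in find_pos/3 above; the two dictionary
% entries are invented for illustration.
pos_of(Item, v, Dict) :- member([Item, "right"], Dict), !.
pos_of(Item, a, Dict) :- member([Item, "plus"], Dict), !.
pos_of(_Item, n, _Dict).

% ?- pos_of("walk", POS, [["walk","right"],["big","plus"]]).
% POS = v.
```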
+ +substitute1(_Item,_POS,[],Grammar,Grammar) :- !. +substitute1(Item,POS,Grammar1,Grammar2,Grammar3) :- + Grammar1=[Grammar4|Grammar5], + Grammar4=POS, + append_list([Grammar2,Item],Grammar6), + append(Grammar6,Grammar5,Grammar3),!. +substitute1(Item,POS,Grammar1,Grammar2,Grammar3) :- + Grammar1=[Grammar4|Grammar5], + not(Grammar4=POS), + append_list([Grammar2,Grammar4],Grammar6), + substitute1(Item,POS,Grammar5,Grammar6,Grammar3),!. + +substitute2([],_BrDict012,Sentence,Sentence) :- !. +substitute2(Grammar1,BrDict012,Sentence1,Sentence2) :- + Grammar1=[Grammar2|Grammar3], + Grammar2=n, + findall(A,(member([A,"box"],BrDict012)),B), + random_member(Word,B), + append(Sentence1,[Word],Sentence3), + substitute2(Grammar3,BrDict012,Sentence3,Sentence2). +substitute2(Grammar1,BrDict012,Sentence1,Sentence2) :- + Grammar1=[Grammar2|Grammar3], + Grammar2=v, + findall(A,(member([A,"right"],BrDict012)),B), + random_member(Word,B), + append(Sentence1,[Word],Sentence3), + substitute2(Grammar3,BrDict012,Sentence3,Sentence2). +substitute2(Grammar1,BrDict012,Sentence1,Sentence2) :- + Grammar1=[Grammar2|Grammar3], + Grammar2=a, + findall(A,(member([A,"plus"],BrDict012)),B), + random_member(Word,B), + append(Sentence1,[Word],Sentence3), + substitute2(Grammar3,BrDict012,Sentence3,Sentence2). +substitute2(Grammar1,BrDict012,Sentence1,Sentence2) :- + Grammar1=[Grammar2|Grammar3], + append(Sentence1,[Grammar2],Sentence3), + substitute2(Grammar3,BrDict012,Sentence3,Sentence2). + +make_lists(Sentence1,Sentence,Sentence) :- + Sentence1=[_Sentence2], + !. +make_lists(Sentence1,Sentence2,Sentence3) :- + Sentence1=[Sentence4|Sentence5], + Sentence5=[Sentence6|_Sentence7], + append(Sentence2,[[Sentence4,Sentence6]],Sentence8), + make_lists(Sentence5,Sentence8,Sentence3). 
+ + \ No newline at end of file diff --git a/Algorithm-Writer-with-Lists/combophil_grammar_logic_to_alg_vps.pl b/Algorithm-Writer-with-Lists/combophil_grammar_logic_to_alg_vps.pl new file mode 100644 index 0000000000000000000000000000000000000000..bc4647dd4c6a542449aef565450387cccd001c32 --- /dev/null +++ b/Algorithm-Writer-with-Lists/combophil_grammar_logic_to_alg_vps.pl @@ -0,0 +1,241 @@ +/** + +grammar_logic_to_alg.pl + + +e.g. I liked God with you +[liked,God],[God,you] with extra data +- connect liked to you with member, return words on path + +POS + +goes backwards x forwards +- I v n by v n x ignore + +n,v +- single type + +adjective +- ignored x (ignore names in alg) +- like v,n + +joining words - ignore, just pairs v, n +- on +- joins v to n +- separates vns + +negative terms +- switch to positive or ignore + +disappearing words +- my +- no list of these, because they are ignored + +vv is v1, v2 x v1 +nn is n1 + +* + +later: can randomly generate details like given sentence + +**/ + +:-include('../Text-to-Breasonings/text_to_breasonings.pl'). + +:- dynamic brdict/1. + +truncate(List1,M,String0) :- + ((number(M),length(String0,M), + append(String0,_,List1))->true; + String0=List1),!. + +combophil_grammar_logic_to_alg1(P,String000) :-%%(String000) :- + phrase_from_file_s(string(Text1), "../Lucian-Academy/luciansphilosophy.txt"), + combophil_grammar_logic_to_alg2(String000,Text1). + +combophil_grammar_logic_to_alg1(File) :-%%(String000) :- + phrase_from_file_s(string(Text1), "../Text-to-Breasonings/file.txt"), + string_codes(Text11,Text1), + downcase_atom(%%B1 + Text11,B11),atom_string(B11,B12), + SepandPad1=".\n", + split_string(B12,SepandPad1,SepandPad1,A0), + findall(_,(member(A00,A0), + combophil_grammar_logic_to_alg2(String000,A00), + %%String000="a b c", + time((N = u, M = u,texttobr2(N,u,String000,M),texttobr(N,u,String000,M))), +%% Give the meditators, etc. the As. + texttobr2(3)),_). 
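
truncate/3 above keeps at most M leading elements of a list, returning the list unchanged when it is shorter than M (or when M is not a number). Restated as a self-contained sketch:

```prolog
% Behaviour of truncate/3 above; truncate_sketch/3 is an illustrative name.
truncate_sketch(List, M, Prefix) :-
    (   number(M), length(Prefix, M), append(Prefix, _, List)
    ->  true
    ;   Prefix = List
    ), !.

% ?- truncate_sketch([a,b,c,d], 2, X).
% X = [a,b].
```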
+ +combophil_grammar_logic_to_alg2(String000,Text1) :- + phrase_from_file_s(string(BrDict0), "../Text-to-Breasonings/brdict1.txt"), + splitfurther(BrDict0,BrDict01), + sort(BrDict01,BrDict012), + retractall(brdict(_)), + assertz(brdict(BrDict012)), + + %%SepandPad=".\n", + %%split_string(Text1,SepandPad,SepandPad,Text2a), + %%delete(Text2a,"",Text2), + + + %%findall(B2,(member(B1,Text2), + string_codes(Text11,Text1), + downcase_atom(%%B1 + Text11,B11),atom_string(B11,B12), + SepandPad1=" .\n", + split_string(B12,SepandPad1,SepandPad1,A01), + append(["go"],A01,A0), + truncate(A0,20,A), + + findall(D,(member(C3,A), + member([C3,"right"],BrDict012),member(C1,A), + not(member([C1,"right"],BrDict012)), + concat_list([C3," ",C1],C2),grammar_logic_to_alg(C2,D)),C), + %%writeln(C),trace, + + %%grammar_logic_to_alg(D,B2)),C), + length(C,CLength), + writeln([CLength,sentences]), + + C=[[[_,Sentence1,a_alg(List_a),_,bb_alg(List_bb)]|Cs1]|Cs2], + %%get_time(TS),stamp_date_time(TS,date(Year,Month,Day,Hour1,Minute1,Seconda,_A,_TZ,_False),local), + concat_list(["gla_del"],%%[Year,Month,Day,Hour1,Minute1,Seconda], + File1), + concat_list(["\"",File1,".txt\""],File2), + + term_to_atom(List_a,List_a2), + string_atom(List_a3,List_a2), + + concat_list(["swipl -G100g -T20g 
-L2g\n['../listprolog'].\nleash(-all),visible(+all),protocol(",File2,"),trace,interpret(off,[[n,function],[",List_a3,"]],[[[n,function],[[v,a]],\":-\",[[[n,length],[[v,a],0,1]]]],[[n,function],[[v,a]],\":-\",[[[n,head],[[v,a],[v,d]]],[[n,equals1],[[v,d],[[v,e],[v,f]]]],[[n,reverse],[[v,a],[],[v,a1]]],[[n,head],[[v,a1],[v,d1]]],[[n,equals1],[[v,d1],[[v,e1],[v,f1]]]],[[n,function2],[[v,a],[v,f],[v,f1]]]]],[[n,reverse],[[],[v,l],[v,l]]],[[n,reverse],[[v,l],[v,m],[v,n]],\":-\",[[[n,head],[[v,l],[v,h]]],[[n,tail],[[v,l],[v,t]]],[[n,wrap],[[v,h],[v,h1]]],[[n,append],[[v,h1],[v,m],[v,o]]],[[n,reverse],[[v,t],[v,o],[v,n]]]]],[[n,function2],[[v,a],[v,b],[v,f]],\":-\",[[[n,member2],[[v,a],[v,d]]],[[n,equals1],[[v,d],[[v,b],[v,f]]]]]],[[n,function2],[[v,a],[v,b],[v,c]],\":-\",[[[n,member2],[[v,a],[v,d]]],[[n,equals1],[[v,d],[[v,b],[v,f]]]],[[n,function2],[[v,d],[v,f],[v,c]]]]],[[n,length],[[],[v,l],[v,l]]],[[n,length],[[v,l],[v,m1],[v,n]],\":-\",[[[n,not],[[[n,=],[[v,l],[]]]]],[[n,tail],[[v,l],[v,t]]],[[n,+],[[v,m1],1,[v,m2]]],[[n,length],[[v,t],[v,m2],[v,n]]]]]],[[]]),notrace,noprotocol.\nhalt.\nswipl -G100g -T20g -L2g\n['../Text-to-Breasonings/text_to_breasonings.pl'].\ntime((N is 3,\nM is 16000,\ntexttobr2(N,",File2,",u,M),texttobr(N,",File2,",u,M))).\n['../texttobr2qb'].\ntexttobr2(3).\nhalt."],List_a1), + writeln1([*,Sentence1,a_alg(List_a)]), + writeln(List_a1), + + concat_list(["\n\nswipl -G100g -T20g -L2g\n['../listprolog'].\nleash(-all),visible(+all),protocol(",File2,"),trace,interpret(off,[[n,function],[",List_a3,",[v,b]]],[[[n,function],[[v,a],[v,b]],\":-\",[[[n,tail],[[v,a],[v,b]]]]]],_),notrace,noprotocol.\nhalt.\nswipl -G100g -T20g -L2g\n['../Text-to-Breasonings/text_to_breasonings.pl'].\ntime((N is 3,\nM is 16000,\ntexttobr2(N,",File2,",u,M),texttobr(N,",File2,",u,M))).\n['../texttobr2qb'].\ntexttobr2(3).\nhalt."],List_b1), + writeln1([*,Sentence1,b_alg(List_a)]), + writeln(List_b1), + + term_to_atom(List_bb,List_bb2), + string_atom(List_bb3,List_bb2), + + 
concat_list(["swipl -G100g -T20g -L2g\n['../listprolog'].\nleash(-all),visible(+all),protocol(",File2,"),trace,interpret(off,[[n,function],[",List_bb3,"]],[[[n,function],[[v,a]],\":-\",[[[n,length],[[v,a],0,1]]]],[[n,function],[[v,a]],\":-\",[[[n,head],[[v,a],[v,d]]],[[n,equals1],[[v,d],[[v,e],[v,f]]]],[[n,reverse],[[v,a],[],[v,a1]]],[[n,head],[[v,a1],[v,d1]]],[[n,equals1],[[v,d1],[[v,e1],[v,f1]]]],[[n,function2],[[v,a],[v,f],[v,f1]]]]],[[n,reverse],[[],[v,l],[v,l]]],[[n,reverse],[[v,l],[v,m],[v,n]],\":-\",[[[n,head],[[v,l],[v,h]]],[[n,tail],[[v,l],[v,t]]],[[n,wrap],[[v,h],[v,h1]]],[[n,append],[[v,h1],[v,m],[v,o]]],[[n,reverse],[[v,t],[v,o],[v,n]]]]],[[n,function2],[[v,a],[v,b],[v,f]],\":-\",[[[n,member2],[[v,a],[v,d]]],[[n,equals1],[[v,d],[[v,b],[v,f]]]]]],[[n,function2],[[v,a],[v,b],[v,c]],\":-\",[[[n,member2],[[v,a],[v,d]]],[[n,equals1],[[v,d],[[v,b],[v,f]]]],[[n,function2],[[v,d],[v,f],[v,c]]]]],[[n,length],[[],[v,l],[v,l]]],[[n,length],[[v,l],[v,m1],[v,n]],\":-\",[[[n,not],[[[n,=],[[v,l],[]]]]],[[n,tail],[[v,l],[v,t]]],[[n,+],[[v,m1],1,[v,m2]]],[[n,length],[[v,t],[v,m2],[v,n]]]]]],[[]]),notrace,noprotocol.\nhalt.\nswipl -G100g -T20g -L2g\n['../Text-to-Breasonings/text_to_breasonings.pl'].\ntime((N is 3,\nM is 16000,\ntexttobr2(N,",File2,",u,M),texttobr(N,",File2,",u,M))).\n['../texttobr2qb'].\ntexttobr2(3).\nhalt."],List_bb1), + writeln1([*,Sentence1,bb_alg(List_bb)]), + writeln(List_bb1), + + writeln1(Cs1), + writeln1(Cs2), + + term_to_atom([[CLength,sentences],[*,Sentence1,a_alg(List_a)],List_a1,[*,Sentence1,b_alg(List_a)],List_b1,[*,Sentence1,bb_alg(List_bb)],List_bb1,Cs1,Cs2],String0001), +atom_string(String0001,String000) + + %%a(P,String000):- + + + %%CLength=a,Sentence1=a,List_a=a,List_a1=a,List_b1=a,List_bb=a,List_bb1=a,Cs1=a,Cs2=a, + +. 
+ +grammar_logic_to_alg(Sentence1,B) :- %% Not by multi-sentence algorithms, just by sentence + atom_string(Sentence0,Sentence1), + downcase_atom(Sentence0,Sentence01), + atom_string(Sentence01,Sentence02), + SepandPad="&#@~%`$?-+*^,()|.:;=_/[]<>{}\n\r\s\t\\\"!'0123456789", + Connectors= + ["the","a","i","on","with","of","an","for","to", + "was","were","and","in","my","from","out","by"], + split_string(Sentence02,SepandPad,SepandPad,Sentence2), + subtract(Sentence2,Connectors,Sentence3), + %%length(Sentence3,Length), + + %% () add generated data + %%write_commands(Length,[],Commands), %% sentence alg + + generate_sentences(Sentence3,[],Sentence_a,30), %% detail sentences + append(Sentence3,Sentence_a,Sentence4), + findall([*,Sentence1,a_alg(Sentence5),b_alg(Sentence5,a),bb_alg(Sentence6)],(member(Sentence4a,Sentence4),make_lists(Sentence4a,[],Sentence5),Sentence5=[_|Sentence6]),B),!. + %% detail algs + +generate_sentences(_Sentence3,Sentence_a,Sentence_a,0) :- !. +generate_sentences(Sentence3,Sentence_a1,Sentence_a2,N1) :- + random_member(Item,Sentence3), + generate_sentence(Item,Sentence_a3), + append(Sentence_a1,[Sentence_a3],Sentence_a4), + N2 is N1-1, + generate_sentences(Sentence3,Sentence_a4,Sentence_a2,N2). + +%%write_commands(0,Commands,Commands) :- !. +%%write_commands(Length1,Commands1,Commands2) :- +%% Length2 is Length1-1. +%% append(Commands1,[[[n,member2],[[v,Length1],[v,Length2]]]). + %%[[n,equals1],[[v,Length2],[[v,a***],[v,b]]]]]. + +%%[a,b] +%%[c,d] + +generate_sentence(Item,Sentence) :- + random_member(Grammar1,[[n,v,n],[n,v,a,n],[v,n],[v,a,n]]), + brdict(BrDict012), + find_pos(Item,POS,BrDict012), + substitute1(Item,POS,Grammar1,[],Grammar2), + substitute2(Grammar2,BrDict012,[],Sentence). + +find_pos(Item,POS2,BrDict012) :- + member([Item,POS1],BrDict012), + POS1="right", + POS2=v,!. +find_pos(Item,POS2,BrDict012) :- + member([Item,POS1],BrDict012), + POS1="plus", + POS2=a,!. +find_pos(_Item,POS2,_BrDict012) :- + POS2=n. 
+ +substitute1(_Item,_POS,[],Grammar,Grammar) :- !. +substitute1(Item,POS,Grammar1,Grammar2,Grammar3) :- + Grammar1=[Grammar4|Grammar5], + Grammar4=POS, + append_list([Grammar2,Item],Grammar6), + append(Grammar6,Grammar5,Grammar3),!. +substitute1(Item,POS,Grammar1,Grammar2,Grammar3) :- + Grammar1=[Grammar4|Grammar5], + not(Grammar4=POS), + append_list([Grammar2,Grammar4],Grammar6), + substitute1(Item,POS,Grammar5,Grammar6,Grammar3),!. + +substitute2([],_BrDict012,Sentence,Sentence) :- !. +substitute2(Grammar1,BrDict012,Sentence1,Sentence2) :- + Grammar1=[Grammar2|Grammar3], + Grammar2=n, + findall(A,(member([A,"box"],BrDict012)),B), + random_member(Word,B), + append(Sentence1,[Word],Sentence3), + substitute2(Grammar3,BrDict012,Sentence3,Sentence2). +substitute2(Grammar1,BrDict012,Sentence1,Sentence2) :- + Grammar1=[Grammar2|Grammar3], + Grammar2=v, + findall(A,(member([A,"right"],BrDict012)),B), + random_member(Word,B), + append(Sentence1,[Word],Sentence3), + substitute2(Grammar3,BrDict012,Sentence3,Sentence2). +substitute2(Grammar1,BrDict012,Sentence1,Sentence2) :- + Grammar1=[Grammar2|Grammar3], + Grammar2=a, + findall(A,(member([A,"plus"],BrDict012)),B), + random_member(Word,B), + append(Sentence1,[Word],Sentence3), + substitute2(Grammar3,BrDict012,Sentence3,Sentence2). +substitute2(Grammar1,BrDict012,Sentence1,Sentence2) :- + Grammar1=[Grammar2|Grammar3], + append(Sentence1,[Grammar2],Sentence3), + substitute2(Grammar3,BrDict012,Sentence3,Sentence2). + +make_lists(Sentence1,Sentence,Sentence) :- + Sentence1=[_Sentence2], + !. +make_lists(Sentence1,Sentence2,Sentence3) :- + Sentence1=[Sentence4|Sentence5], + Sentence5=[Sentence6|_Sentence7], + append(Sentence2,[[Sentence4,Sentence6]],Sentence8), + make_lists(Sentence5,Sentence8,Sentence3). 
+ + \ No newline at end of file diff --git a/Algorithm-Writer-with-Lists/four_crosses.pl b/Algorithm-Writer-with-Lists/four_crosses.pl new file mode 100644 index 0000000000000000000000000000000000000000..c2aa04a053332f26533d90ea54390ccfd5d9eeef --- /dev/null +++ b/Algorithm-Writer-with-Lists/four_crosses.pl @@ -0,0 +1,12 @@ +four_crosses :- + prompt("What are the two items to cross?",_Two_items), + cross,cross,cross, + prompt("What is their cross?",_Cross11). + +cross :- + prompt("What is their cross?",_Cross11), + prompt("What is at the same level as this?",_Cross12). + +prompt(Prompt,String) :- + writeln(Prompt), + read_string(user_input, "\n", "\r", _End, String). \ No newline at end of file diff --git a/Algorithm-Writer-with-Lists/gl-api.pl b/Algorithm-Writer-with-Lists/gl-api.pl new file mode 100644 index 0000000000000000000000000000000000000000..9798eaa51a44544d9b9b8b1df4780e23e2850b21 --- /dev/null +++ b/Algorithm-Writer-with-Lists/gl-api.pl @@ -0,0 +1,63 @@ +:- use_module(library(http/thread_httpd)). +:- use_module(library(http/http_dispatch)). +:- use_module(library(http/http_error)). +:- use_module(library(http/html_write)). + +% we need this module from the HTTP client library for http_read_data +:- use_module(library(http/http_client)). +:- http_handler('/', web_form, []). + +% GitHub +:- include('grammar_logic_to_alg.pl'). + +server(Port) :- + http_server(http_dispatch, [port(Port)]). + + /* + browse http://127.0.0.1:8000/ + This demonstrates handling POST requests + */ + + web_form(_Request) :- + reply_html_page( + title('Grammar Logic'), + [ + form([action='/landing', method='POST'], [ + /** + p([], [ + label([for=debug],'Debug (on/off):'), + input([name=debug, type=textarea]) + ]), + **/ + p([], [ + label([for=query],'Grammar Logic Input:'), + input([name=query, type=textarea]) + ]), + + p([], input([name=submit, type=submit, value='Submit'], [])) + ])]). + + :- http_handler('/landing', landing_pad, []). 
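
The handlers above follow the stock SWI-Prolog http_dispatch pattern: web_form/1 serves the form at /, which POSTs to /landing. Starting the service is a one-line query; the port number below is an example taken from the file's own "browse http://127.0.0.1:8000/" comment:

```prolog
% Launch the Grammar Logic API defined above; port 8000 matches the
% "browse http://127.0.0.1:8000/" comment in this file.
start :- server(8000).
```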
+ + landing_pad(Request) :- + +% working_directory(_, 'GitHub/Algorithm-Writer-with-Lists/'), + member(method(post), Request), !, + http_read_data(Request, Data, []), + format('Content-type: text/html~n~n', []), + format('<p>', []), + %%portray_clause(Data), + + %%term_to_atom(Term,Data), + +Data=[%%debug='off',%%Debug1, +query=Query1,submit=_], + + + grammar_logic_to_alg1(Query1,4000,Result), + %%format('</p><p>========~n', []), + %%portray_clause + portray_clause(Result), + %%writeln1(Data), + +format('</p>
'). \ No newline at end of file diff --git a/Algorithm-Writer-with-Lists/grammar_logic_to_alg.pl b/Algorithm-Writer-with-Lists/grammar_logic_to_alg.pl new file mode 100644 index 0000000000000000000000000000000000000000..0d602e89ae6f2167082373cc6e068efd442c38d4 --- /dev/null +++ b/Algorithm-Writer-with-Lists/grammar_logic_to_alg.pl @@ -0,0 +1,277 @@ +/** + +grammar_logic_to_alg.pl + + +e.g. I liked God with you +[liked,God],[God,you] with extra data +- connect liked to you with member, return words on path + +POS + +goes backwards x forwards +- I v n by v n x ignore + +n,v +- single type + +adjective +- ignored x (ignore names in alg) +- like v,n + +joining words - ignore, just pairs v, n +- on +- joins v to n +- separates vns + +negative terms +- switch to positive or ignore + +disappearing words +- my +- no list of these, because they are ignored + +vv is v1, v2 x v1 +nn is n1 + +* + +later: can randomly generate details like given sentence + +**/ + +:-include('../Text-to-Breasonings/text_to_breasonings.pl'). +:-include('../Text-to-Breasonings/truncate.pl'). +%:- include('../Philosophy/14 10 23.pl'). + +:- dynamic brdict/1. + + + +mt_t2b2:- +%Thread= + %[ + phrase_from_file_s(string(Essay_0), %"lacom5million.txt" + "../../private/lacom0.5million.txt" + ), + + %Br is 5, + %Br is 5000000, + W is 0, + + %grammar_logic_to_alg1(Essay_0,Br,GL_out1), + grammar_logic_to_alg113(Essay_0,GL_out1), + + term_to_atom(GL_out1,GL_out2), + string_atom(GL_out,GL_out2), + + texttobr2(u,u,GL_out,Br,false,false,false,false,false,false,W), + texttobr(u,u,GL_out,Br).%], + +mt_t2b3:-mt_t2b2. + +mt_t2b4:-mt_t2b2. + +mt_t2b5:-mt_t2b2. + +mt_t2b6:-mt_t2b2. + +mt_t2b7:-mt_t2b2. + +mt_t2b8:-mt_t2b2. + +mt_t2b9:-mt_t2b2. + %trace,Thread. + + +mt_t2b :- + +Goals=[mt_t2b2,mt_t2b3,mt_t2b4,mt_t2b5,mt_t2b6,mt_t2b7,mt_t2b8,mt_t2b9 +], + +length(Goals,L), + +%time(mt_t2b2). +time(concurrent(L,Goals,[])). 
+ +grammar_logic_to_alg1(String1,N,Result) :- + + %term_to_atom(String,Essay_01), + %string_atom(Essay_02,Essay_01), + + %working_directory(_, '../'), + + %(open_s("../Text-to-Breasonings/file.txt",write,Stream1), + %write(Stream1,String), + %close(Stream1)),!, + + truncate1(string,String1,N,String), + + %working_directory(_, 'algwriter/'), + grammar_logic_to_alg114(String,Result). + +grammar_logic_to_alg1 :- + grammar_logic_to_alg11([Sentence1,List_a,List_a1,List_b1,List_bb,List_bb1,Cs1,Cs2]), + writeln1([*,Sentence1,a_alg(List_a)]), + writeln(List_a1), + writeln1([*,Sentence1,b_alg(List_a)]), + writeln(List_b1), + writeln1([*,Sentence1,bb_alg(List_bb)]), + writeln(List_bb1), + + writeln1(Cs1), + writeln1(Cs2). + +grammar_logic_to_alg112(Result) :- + grammar_logic_to_alg11(Result1),term_to_atom(Result1,Result). + +grammar_logic_to_alg114(Text,Result) :- + grammar_logic_to_alg113(Text,Result1),term_to_atom(Result1,Result). + + +grammar_logic_to_alg11([Sentence1,List_a,List_a1,List_b1,List_bb,List_bb1,Cs1,Cs2]) :- + phrase_from_file_s(string(Text1), "../Text-to-Breasonings/file.txt"), + +grammar_logic_to_alg113(Text1,[Sentence1,List_a,List_a1,List_b1,List_bb,List_bb1,Cs1,Cs2]). 
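
grammar_logic_to_alg112/1 and grammar_logic_to_alg114/2 above serialise their result term with term_to_atom/2, a plain round-trippable conversion; the example term below is invented for illustration:

```prolog
% term_to_atom/2 round-trip as used by grammar_logic_to_alg114/2 above;
% the example term is invented for illustration.
% ?- term_to_atom([a_alg([x,y]),bb_alg([y,z])], A), term_to_atom(T, A).
% A = '[a_alg([x,y]),bb_alg([y,z])]',
% T = [a_alg([x,y]),bb_alg([y,z])].
```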
+ + +grammar_logic_to_alg113(Text1,[Sentence1,List_a,List_a1,List_b1,List_bb,List_bb1,Cs1,Cs2]) :- + + phrase_from_file_s(string(BrDict0), "../Text-to-Breasonings/brdict1.txt"), + splitfurther(BrDict0,BrDict01), + sort(BrDict01,BrDict012), + retractall(brdict(_)), + assertz(brdict(BrDict012)), + + SepandPad=".\n", + split_string(Text1,SepandPad,SepandPad,Text2a), + delete(Text2a,"",Text2), + + findall(B2,(member(B1,Text2),grammar_logic_to_alg(B1,B2)),C), + length(C,CLength), + writeln([CLength,sentences]), + + C=[[[_,Sentence1,a_alg(List_a),_,bb_alg(List_bb)]|Cs1]|Cs2], + %%get_time(TS),stamp_date_time(TS,date(Year,Month,Day,Hour1,Minute1,Seconda,_A,_TZ,_False),local), + concat_list(["gla_del"],%%[Year,Month,Day,Hour1,Minute1,Seconda], + File1), + concat_list(["\"",File1,".txt\""],File2), + + term_to_atom(List_a,List_a2), + string_atom(List_a3,List_a2), + + concat_list(["swipl -G100g -T20g -L2g\n['../listprolog'].\nleash(-all),visible(+all),protocol(",File2,"),trace,interpret(off,[[n,function],[",List_a3,"]],[[[n,function],[[v,a]],\":-\",[[[n,length],[[v,a],0,1]]]],[[n,function],[[v,a]],\":-\",[[[n,head],[[v,a],[v,d]]],[[n,equals1],[[v,d],[[v,e],[v,f]]]],[[n,reverse],[[v,a],[],[v,a1]]],[[n,head],[[v,a1],[v,d1]]],[[n,equals1],[[v,d1],[[v,e1],[v,f1]]]],[[n,function2],[[v,a],[v,f],[v,f1]]]]],[[n,reverse],[[],[v,l],[v,l]]],[[n,reverse],[[v,l],[v,m],[v,n]],\":-\",[[[n,head],[[v,l],[v,h]]],[[n,tail],[[v,l],[v,t]]],[[n,wrap],[[v,h],[v,h1]]],[[n,append],[[v,h1],[v,m],[v,o]]],[[n,reverse],[[v,t],[v,o],[v,n]]]]],[[n,function2],[[v,a],[v,b],[v,f]],\":-\",[[[n,member2],[[v,a],[v,d]]],[[n,equals1],[[v,d],[[v,b],[v,f]]]]]],[[n,function2],[[v,a],[v,b],[v,c]],\":-\",[[[n,member2],[[v,a],[v,d]]],[[n,equals1],[[v,d],[[v,b],[v,f]]]],[[n,function2],[[v,d],[v,f],[v,c]]]]],[[n,length],[[],[v,l],[v,l]]],[[n,length],[[v,l],[v,m1],[v,n]],\":-\",[[[n,not],[[[n,=],[[v,l],[]]]]],[[n,tail],[[v,l],[v,t]]],[[n,+],[[v,m1],1,[v,m2]]],[[n,length],[[v,t],[v,m2],[v,n]]]]]],[[]]),notrace,noprotocol.\nh
alt.\nswipl -G100g -T20g -L2g\n['../Text-to-Breasonings/text_to_breasonings.pl'].\ntime((N is 3,\nM is 16000,\ntexttobr2(N,",File2,",u,M),texttobr(N,",File2,",u,M))).\n['../texttobr2qb'].\ntexttobr2(3).\nhalt."],List_a1), + + concat_list(["\n\nswipl -G100g -T20g -L2g\n['../listprolog'].\nleash(-all),visible(+all),protocol(",File2,"),trace,interpret(off,[[n,function],[",List_a3,",[v,b]]],[[[n,function],[[v,a],[v,b]],\":-\",[[[n,tail],[[v,a],[v,b]]]]]],_),notrace,noprotocol.\nhalt.\nswipl -G100g -T20g -L2g\n['../Text-to-Breasonings/text_to_breasonings.pl'].\ntime((N is 3,\nM is 16000,\ntexttobr2(N,",File2,",u,M),texttobr(N,",File2,",u,M))).\n['../texttobr2qb'].\ntexttobr2(3).\nhalt."],List_b1), + + term_to_atom(List_bb,List_bb2), + string_atom(List_bb3,List_bb2), + + concat_list(["swipl -G100g -T20g -L2g\n['../listprolog'].\nleash(-all),visible(+all),protocol(",File2,"),trace,interpret(off,[[n,function],[",List_bb3,"]],[[[n,function],[[v,a]],\":-\",[[[n,length],[[v,a],0,1]]]],[[n,function],[[v,a]],\":-\",[[[n,head],[[v,a],[v,d]]],[[n,equals1],[[v,d],[[v,e],[v,f]]]],[[n,reverse],[[v,a],[],[v,a1]]],[[n,head],[[v,a1],[v,d1]]],[[n,equals1],[[v,d1],[[v,e1],[v,f1]]]],[[n,function2],[[v,a],[v,f],[v,f1]]]]],[[n,reverse],[[],[v,l],[v,l]]],[[n,reverse],[[v,l],[v,m],[v,n]],\":-\",[[[n,head],[[v,l],[v,h]]],[[n,tail],[[v,l],[v,t]]],[[n,wrap],[[v,h],[v,h1]]],[[n,append],[[v,h1],[v,m],[v,o]]],[[n,reverse],[[v,t],[v,o],[v,n]]]]],[[n,function2],[[v,a],[v,b],[v,f]],\":-\",[[[n,member2],[[v,a],[v,d]]],[[n,equals1],[[v,d],[[v,b],[v,f]]]]]],[[n,function2],[[v,a],[v,b],[v,c]],\":-\",[[[n,member2],[[v,a],[v,d]]],[[n,equals1],[[v,d],[[v,b],[v,f]]]],[[n,function2],[[v,d],[v,f],[v,c]]]]],[[n,length],[[],[v,l],[v,l]]],[[n,length],[[v,l],[v,m1],[v,n]],\":-\",[[[n,not],[[[n,=],[[v,l],[]]]]],[[n,tail],[[v,l],[v,t]]],[[n,+],[[v,m1],1,[v,m2]]],[[n,length],[[v,t],[v,m2],[v,n]]]]]],[[]]),notrace,noprotocol.\nhalt.\nswipl -G100g -T20g -L2g\n['../Text-to-Breasonings/text_to_breasonings.pl'].\ntime((N 
is 3,\nM is 16000,\ntexttobr2(N,",File2,",u,M),texttobr(N,",File2,",u,M))).\n['../texttobr2qb'].\ntexttobr2(3).\nhalt."],List_bb1). + +grammar_logic_to_alg(Sentence1,B) :- %% Not by multi-sentence algorithms, just by sentence + atom_string(Sentence0,Sentence1), + downcase_atom(Sentence0,Sentence01), + atom_string(Sentence01,Sentence02), + SepandPad="&#@~%`$?-+*^,()|.:;=_/[]<>{}\n\r\s\t\\\"!'0123456789", + + Connectors= + ["the","a","i","on","with","of","an","for","to", + "was","were","and","in","my","from","out","by"], + split_string(Sentence02,SepandPad,SepandPad,Sentence2), + subtract(Sentence2,Connectors,Sentence3), + %%length(Sentence3,Length), + + %% () add generated data + %%write_commands(Length,[],Commands), %% sentence alg + + generate_sentences(Sentence3,[],Sentence_a,30), %% detail sentences + append(Sentence3,Sentence_a,Sentence4), + findall([*,Sentence1,a_alg(Sentence5),b_alg(Sentence5,a),bb_alg(Sentence6)],(member(Sentence4a,Sentence4),make_lists(Sentence4a,[],Sentence5),Sentence5=[_|Sentence6]),B),!. + %% detail algs + +generate_sentences(_Sentence3,Sentence_a,Sentence_a,0) :- !. +generate_sentences(Sentence3,Sentence_a1,Sentence_a2,N1) :- + random_member(Item,Sentence3), + generate_sentence(Item,Sentence_a3), + append(Sentence_a1,[Sentence_a3],Sentence_a4), + N2 is N1-1, + generate_sentences(Sentence3,Sentence_a4,Sentence_a2,N2). + +%%write_commands(0,Commands,Commands) :- !. +%%write_commands(Length1,Commands1,Commands2) :- +%% Length2 is Length1-1. +%% append(Commands1,[[[n,member2],[[v,Length1],[v,Length2]]]). + %%[[n,equals1],[[v,Length2],[[v,a***],[v,b]]]]]. + +%%[a,b] +%%[c,d] + +generate_sentence(Item,Sentence) :- + random_member(Grammar1,[[n,v,n],[n,v,a,n],[v,n],[v,a,n]]), + brdict(BrDict012), + find_pos(Item,POS,BrDict012), + substitute1(Item,POS,Grammar1,[],Grammar2), + substitute2(Grammar2,BrDict012,[],Sentence). + +find_pos("right",v,_) :- !. +find_pos("plus",a,_) :- !. 
+find_pos(Item,POS2,BrDict012) :- + member([Item,POS1],BrDict012), + POS1="right", + POS2=v,!. +find_pos(Item,POS2,BrDict012) :- + member([Item,POS1],BrDict012), + POS1="plus", + POS2=a,!. +find_pos(_Item,POS2,_BrDict012) :- + POS2=n. + +substitute1(_Item,_POS,[],Grammar,Grammar) :- !. +substitute1(Item,POS,Grammar1,Grammar2,Grammar3) :- + Grammar1=[Grammar4|Grammar5], + Grammar4=POS, + append_list([Grammar2,Item],Grammar6), + append(Grammar6,Grammar5,Grammar3),!. +substitute1(Item,POS,Grammar1,Grammar2,Grammar3) :- + Grammar1=[Grammar4|Grammar5], + not(Grammar4=POS), + append_list([Grammar2,Grammar4],Grammar6), + substitute1(Item,POS,Grammar5,Grammar6,Grammar3),!. + +substitute2([],_BrDict012,Sentence,Sentence) :- !. +substitute2(Grammar1,BrDict012,Sentence1,Sentence2) :- + Grammar1=[Grammar2|Grammar3], + Grammar2=n, + findall(A,(member([A,"box"],BrDict012)),B), + random_member(Word,B), + append(Sentence1,[Word],Sentence3), + substitute2(Grammar3,BrDict012,Sentence3,Sentence2). +substitute2(Grammar1,BrDict012,Sentence1,Sentence2) :- + Grammar1=[Grammar2|Grammar3], + Grammar2=v, + findall(A,(member([A,"right"],BrDict012)),B), + random_member(Word,B), + append(Sentence1,[Word],Sentence3), + substitute2(Grammar3,BrDict012,Sentence3,Sentence2). +substitute2(Grammar1,BrDict012,Sentence1,Sentence2) :- + Grammar1=[Grammar2|Grammar3], + Grammar2=a, + findall(A,(member([A,"plus"],BrDict012)),B), + random_member(Word,B), + append(Sentence1,[Word],Sentence3), + substitute2(Grammar3,BrDict012,Sentence3,Sentence2). +substitute2(Grammar1,BrDict012,Sentence1,Sentence2) :- + Grammar1=[Grammar2|Grammar3], + append(Sentence1,[Grammar2],Sentence3), + substitute2(Grammar3,BrDict012,Sentence3,Sentence2). + +make_lists(Sentence1,Sentence,Sentence) :- + Sentence1=[_Sentence2], + !. 
+make_lists(Sentence1,Sentence2,Sentence3) :- + Sentence1=[Sentence4|Sentence5], + Sentence5=[Sentence6|_Sentence7], + append(Sentence2,[[Sentence4,Sentence6]],Sentence8), + make_lists(Sentence5,Sentence8,Sentence3). + diff --git a/Algorithm-Writer-with-Lists/grammar_logic_to_alg2.pl b/Algorithm-Writer-with-Lists/grammar_logic_to_alg2.pl new file mode 100644 index 0000000000000000000000000000000000000000..f98f92f7f5c8a4d5da26216a53b0c81039edc2bb --- /dev/null +++ b/Algorithm-Writer-with-Lists/grammar_logic_to_alg2.pl @@ -0,0 +1,265 @@ +/** + +?- grammar_logic_to_alg2(24). ["Simplify algorithm","pia","pear"] +["What is the word for","person"] +1 harry +2 june +3 person +4 pia + +"Please choose from menu items 1-4, or enter a new word:" +|: subject +["Simplify algorithm","harry","banana"] +["Simplify algorithm","apple","june"] +[["subject","pia","pear","pear"],["subject","harry","banana","banana"],["subject","june","apple","apple"]] +true. + +?- grammar_logic_to_alg. ["Simplify algorithm","pia","pear"] +[["subject","pia","pear","pear"]] +true. + +**/ + +%:-include('../listprologinterpreter/la_strings.pl'). +:-include('../listprologinterpreter/la_strings.pl'). +:-include('../listprologinterpreter/la_strings_string.pl'). +:-include('../listprologinterpreter/la_maths.pl'). 
+
+grammar_logic_to_alg2(First_n_words) :-
+
+	% get file
+	phrase_from_file_s(string(String1),"../Text-to-Breasonings/file.txt"),
+
+	% split into lines and sentences
+	%SepandPad2="\n\r",
+	SepandPad1="&#@~%`$?-+*^,()|.:;=_/[]<>{}\n\r\s\t\\\"!'0123456789",
+	string_codes(String1b,String1),
+	downcase_atom(String1b,String1c),
+	string_codes(String1c,String1a),
+	split_string(String1a,SepandPad1,SepandPad1,String21),
+	length(String21,L),
+	% take at most First_n_words words
+	(L<First_n_words->First_n_words1=L;First_n_words1=First_n_words),
+	length(String2,First_n_words1),
+	append(String2,_,String21),
+
+
+%	phrase_from_file_s(string(BrDict0), "../../Text-to-Breasonings/brdict1.txt"),
+	phrase_from_file_s(string(BrDict0), "../Text-to-Breasonings/brdict1.txt"),
+	splitfurther(BrDict0,Term),
+
+%trace,
+	phrase_from_file_s(string(BrDict3), "../Algorithm-Writer-with-Lists/brdict3.txt"),
+	string_codes(String02ba,BrDict3),
+	atom_to_term(String02ba,Term1,[]),
+
+	/**
+	phrase_from_file_s(string(BrDict4), "first-words.txt"),
+	string_codes(String02b,BrDict4),
+	atom_to_term(String02b,String02a,[]),
+	**/
+	%trace,
+	% Finds additions to brdict3.txt (box,values)
+	process_file(String2,Term,Term1,%String02a,%[],
+	Additions_to_brdict3,[],Pseudo_algorithms%,%[],
+	%First_words
+	),
+
+	%***append(Term1,Additions_to_brdict3,Additions_to_brdict3_2),
+
+	term_to_atom(Additions_to_brdict3,String02a_b),
+	string_atom(String02a_c,String02a_b),
+
+	(open_s("../Algorithm-Writer-with-Lists/brdict3.txt",write,Stream1),
+	write(Stream1,String02a_c),
+	close(Stream1)),!,
+
+	%***append(String02a,First_words,First_words_2),
+
+	/**
+	term_to_atom(First_words,First_words_2_b),
+	string_atom(First_words_2_c,First_words_2_b),
+
+	(open_s("first-words.txt",write,Stream2),
+	write(Stream2,First_words_2_c),
+	close(Stream2)),!,
+	**/
+
+	writeln1(Pseudo_algorithms),!,
+
+	random_member(Summary_algorithm,Pseudo_algorithms),
+	writeln1("Summary algorithm:"),
+	writeln1(Summary_algorithm).
+ + process_file(String2,Term,Term1,%_String02a,%Additions_to_brdict31, + Additions_to_brdict3,Pseudo_algorithms1,Pseudo_algorithms%,First_words1, + %First_words + ) :- + %trace, + String2=[Word1,Word2|Words], + writeln1(["Simplify algorithm",Word1,Word2]), + found(Word1,Term,Term1,Item1,%Term1, + Additions_to_brdict31_4), + found(Word2,Term,Additions_to_brdict31_4,Item2,%Additions_to_brdict31_4, + Additions_to_brdict31_5), + stronger_word([Item1,Word1],[Item2,Word2],Term,_Word3,%String02a,First_words3, + Item_a), + append(Pseudo_algorithms1,[Item_a],Pseudo_algorithms2), + %trace, + process_file(Words,Term,Additions_to_brdict31_5,%First_words3,%Additions_to_brdict31_5, + Additions_to_brdict3,Pseudo_algorithms2,Pseudo_algorithms%,First_words3, + %First_words + ). + + process_file(_,_Term,Term1,%String02a,%Additions_to_brdict3, + Term1,Pseudo_algorithms,Pseudo_algorithms%,First_words, + %String02a + ) :- !. + + + +% bd3 - person, subject and +found(Word1,Term,Additions_to_brdict1,Word2,%Additions_to_brdict1, +Additions_to_brdict2) :- +%(Word1="bot"->trace;true), + %trace, + %(Word1="june"->trace;true), + + (member([Word1,Word21],Term)->true;(writeln(["Error:",Word1,"not in brdict1.txt."]),abort)), + (Word21=""->Word2=Word1;Word2=Word21), + +append(Additions_to_brdict1,[[Word2,Word1]],Additions_to_brdict2) + +/** (member([Word2,Item1],Additions_to_brdict1)->Additions_to_brdict1=Additions_to_brdict2; + (%trace, + + + findall(Word3,(member([Word3,Word2],Term)),Word22), + append([Word2],Word22,Word23), + %trace, + sort(Word23,Word241), + delete(Word241,"",Word24), + + length(Word24,Menu_items_length), + (Menu_items_length=1->([Item1]=Word24, + append(Additions_to_brdict1,[[Word2,Item1]],Additions_to_brdict2)); + (writeln1(["What is the word for",Word1]), + numbers(Menu_items_length,1,[],N), + findall([N1,"\t",Menu_item,"\n"],(member(N1,N), + get_item_n(Word24,N1,Menu_item)),Menu1), + maplist(append,[Menu1],[Menu3]), + concat_list(Menu3,Menu2), + writeln(Menu2), + + 
repeat,%trace, + concat_list(["Please choose from menu items 1-",Menu_items_length,", or enter a new word (letters only, no spaces):"],Prompt), + writeln1(Prompt), + read_string(user_input,"\n","\r",_,Input), + ((%trace, + number_string(Input_num,Input), + Input_num>=1,Input_num=true; + (string(Input)->(Input=Item1,append(Additions_to_brdict1,[[Word2,Item1]],Additions_to_brdict2)) +)))))). +**/ +. + +stronger_word([Word1,Word1a],[Word2aa,Word2a],Term,Word3%,First_words1,First_words +,Item) :- + %trace, + (member([Word1,Word21],Term)->true;(writeln(["Error:",Word1,"not in brdict1.txt."]),abort)), + (Word21=""->Word2=Word1;Word2=Word21), + %(member([Word2,Item1],Additions_to_brdict1)->Additions_to_brdict1=Additions_to_brdict2; + (%trace, + + + findall(Word3,(member([Word3,Word2],Term)),Word22), + append([Word2],Word22,Word23), + %trace, + sort(Word23,Word241), + delete(Word241,"",Word24), + + length(Word24,Menu_items_length)), + + + (member([Word2aa,Word21a],Term)->true;(writeln(["Error:",Word2aa,"not in brdict1.txt."]),abort)), + (Word21a=""->Word2ab=Word2aa;Word2ab=Word21a), + %(member([Word2aa,Item1a],Additions_to_brdict1)->Additions_to_brdict1=Additions_to_brdict2; + (%trace, + + + findall(Word3,(member([Word3,Word2ab],Term)),Word22a), + append([Word2ab],Word22a,Word23a), + %trace, + sort(Word23a,Word241a), + delete(Word241a,"",Word24a), + + length(Word24a,Menu_items_length_a)), + +/**(member(Word1,String02a)->(First_words1=First_words,Item=[Word1,Word1a,Word2,Word2a]); + (member(Word2,String02a)->(First_words1=First_words,Item=[Word2,Word2a,Word1,Word1a]); + (%trace, + writeln(["Which of 1-",Word1,"or 2-",Word2,"is stronger?"]), + read_string(user_input,"\n","\r",_,Input), + **/ + %trace, + (Menu_items_length>=Menu_items_length_a->(Word3=Word1,Item=[Word1,Word1a,Word2aa,Word2a]); + (Word3=Word2,Item=[Word2aa,Word2a,Word1,Word1a])). + %append(First_words1,[Word3],First_words)))). + + + % Append punctuation point to end if necessary. 
+ % find last character in file + % string_concat(_,".",String1), + +/** +split_string_onnonletter(String00,Chars,List1) :- + string_codes(String00,String1), + split_string_onnonletter(String1,[],Chars,List0), + string_codes(List0,List2), + split_string(List2,"@","@",List1),!. +split_string_onnonletter([],Input,_Chars,Input) :- !. +split_string_onnonletter(Input1,Input2,Chars,Input3) :- + Input1=[Input4|Input5], + %not(char_type(Input4,alpha)), + string_codes(Chars,Codes), + member(Code,Codes), + Input4=Code, + append(Input2,[Code],Input7), + append(Input7,`@`,Input6), + split_string_onnonletter(Input5,Input6,Chars,Input3), !. +split_string_onnonletter(Input1,Input2,Chars,Input3) :- + Input1=[Input4|Input5], + %char_type(Input4,alpha), + append(Input2,[Input4],Input6), + split_string_onnonletter(Input5,Input6,Chars,Input3), !. +**/ + +splitfurther(BrDict01,N) :- + phrase(file0(N),BrDict01). + +file0(N) --> "[", file(N), "]", !. +file0([]) --> []. + +%%file([]) --> []. +file([L|Ls]) --> entry(L),",", +%{writeln(L)}, %%*** +file(Ls), !. %% file(Ls),{M=[Ls]})), !. %%, {writeln(["l",L])},",", file(Ls), {writeln(["ls",Ls])},!. %%, {append(L,Ls,M)}, !. +file([L]) --> entry(L), +%%{writeln(L)}, +!. %%(entry(L),{M=L});{M=[],(writeln("Warning - Entry in incorrect format.") +%%,abort +%%)}, !. + +entry([Word2,Word4]) --> + "[", word(Word), {string_codes(Word2,Word),string(Word2)}, + ",", + word(Word3), {string_codes(Word4,Word3),string(Word4)}, + "]". + +word([X|Xs]) --> [X], {char_type(X,csymf)->true;(X=27->true;X=8217)}, word(Xs), !. +%%word([X]) --> [X], {char_type(X,csymf);(X=27;X=8217)}, !. +word([]) --> []. 
+
diff --git a/Algorithm-Writer-with-Lists/grammar_logic_to_alg_mr.pl b/Algorithm-Writer-with-Lists/grammar_logic_to_alg_mr.pl
new file mode 100644
index 0000000000000000000000000000000000000000..1f2ea807ac7b86d543006f1df8f4bb1fe3312317
--- /dev/null
+++ b/Algorithm-Writer-with-Lists/grammar_logic_to_alg_mr.pl
@@ -0,0 +1,275 @@
+/**
+
+grammar_logic_to_alg.pl
+
+
+e.g. I liked God with you
+[liked,God],[God,you] with extra data
+- connect liked to you with member, return words on path
+
+POS
+
+goes backwards x forwards
+- I v n by v n x ignore
+
+n,v
+- single type
+
+adjective
+- ignored x (ignore names in alg)
+- like v,n
+
+joining words - ignore, just pairs v, n
+- on
+- joins v to n
+- separates vns
+
+negative terms
+- switch to positive or ignore
+
+disappearing words
+- my
+- no list of these, because they are ignored
+
+vv is v1, v2 x v1
+nn is n1
+
+*
+
+later: can randomly generate details like given sentence
+
+**/
+
+:-include('../Text-to-Breasonings/text_to_breasonings.pl').
+:-include('../Text-to-Breasonings/truncate.pl').
+:-include('../Music-Composer/mindreadtestmusiccomposer-unusual-mr-tree.pl').
+
+
+:- dynamic brdict/1.
+
+
+
+mt_t2b2:-
+%Thread=
+	%[
+	phrase_from_file_s(string(Essay_0), %"lacom5million.txt"
+	"../../private/lacom0.5million.txt"
+	),
+
+	Br is 5, % bug fix: Br was unbound below because both bindings were commented out; restoring the smaller value (5000000 is the large-scale alternative)
+	%Br is 5000000,
+	W is 0,
+
+	%grammar_logic_to_alg1(Essay_0,Br,GL_out1),
+	grammar_logic_to_alg113(Essay_0,GL_out1),
+
+	term_to_atom(GL_out1,GL_out2),
+	string_atom(GL_out,GL_out2),
+
+	texttobr2(u,u,GL_out,Br,false,false,false,false,false,false,W),
+	texttobr(u,u,GL_out,Br).%],
+
+mt_t2b3:-mt_t2b2.
+
+mt_t2b4:-mt_t2b2.
+
+mt_t2b5:-mt_t2b2.
+
+mt_t2b6:-mt_t2b2.
+
+mt_t2b7:-mt_t2b2.
+
+mt_t2b8:-mt_t2b2.
+
+mt_t2b9:-mt_t2b2.
+	%trace,Thread.
+
+
+mt_t2b :-
+
+Goals=[mt_t2b2,mt_t2b3,mt_t2b4,mt_t2b5,mt_t2b6,mt_t2b7,mt_t2b8,mt_t2b9
+],
+
+length(Goals,L),
+
+%time(mt_t2b2).
+time(concurrent(L,Goals,[])).
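mt_t2b above fans the same breasoning goal out over eight predicates and runs them in parallel with SWI-Prolog's concurrent/3. A minimal, self-contained sketch of that pattern (the goal names are illustrative):

```prolog
:- use_module(library(thread)).

% Run a list of independent goals in parallel, one worker per goal,
% as mt_t2b does with mt_t2b2..mt_t2b9.
run_concurrently :-
    Goals = [writeln(one), writeln(two), writeln(three)],
    length(Goals, L),
    time(concurrent(L, Goals, [])).
```

concurrent/3 blocks until all goals have finished and fails (stopping the others) if any goal fails, which suits the all-or-nothing batch here.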
+ +grammar_logic_to_alg1(String1,N,Result) :- + + %term_to_atom(String,Essay_01), + %string_atom(Essay_02,Essay_01), + + %working_directory(_, '../'), + + %(open_s("../Text-to-Breasonings/file.txt",write,Stream1), + %write(Stream1,String), + %close(Stream1)),!, + + truncate1(string,String1,N,String), + + %working_directory(_, 'algwriter/'), + grammar_logic_to_alg114(String,Result). + +grammar_logic_to_alg1 :- + grammar_logic_to_alg11([Sentence1,List_a,List_a1,List_b1,List_bb,List_bb1,Cs1,Cs2]), + writeln1([*,Sentence1,a_alg(List_a)]), + writeln(List_a1), + writeln1([*,Sentence1,b_alg(List_a)]), + writeln(List_b1), + writeln1([*,Sentence1,bb_alg(List_bb)]), + writeln(List_bb1), + + writeln1(Cs1), + writeln1(Cs2). + +grammar_logic_to_alg112(Result) :- + grammar_logic_to_alg11(Result1),term_to_atom(Result1,Result). + +grammar_logic_to_alg114(Text,Result) :- + grammar_logic_to_alg113(Text,Result1),term_to_atom(Result1,Result). + + +grammar_logic_to_alg11([Sentence1,List_a,List_a1,List_b1,List_bb,List_bb1,Cs1,Cs2]) :- + phrase_from_file_s(string(Text1), "../Text-to-Breasonings/file.txt"), + +grammar_logic_to_alg113(Text1,[Sentence1,List_a,List_a1,List_b1,List_bb,List_bb1,Cs1,Cs2]). 
+ + +grammar_logic_to_alg113(Text1,[Sentence1,List_a,List_a1,List_b1,List_bb,List_bb1,Cs1,Cs2]) :- + + phrase_from_file_s(string(BrDict0), "../Text-to-Breasonings/brdict1.txt"), + splitfurther(BrDict0,BrDict01), + sort(BrDict01,BrDict012), + retractall(brdict(_)), + assertz(brdict(BrDict012)), + + SepandPad=".\n", + split_string(Text1,SepandPad,SepandPad,Text2a), + delete(Text2a,"",Text2), + + findall(B2,(member(B1,Text2),grammar_logic_to_alg(B1,B2)),C), + length(C,CLength), + writeln([CLength,sentences]), + + C=[[[_,Sentence1,a_alg(List_a),_,bb_alg(List_bb)]|Cs1]|Cs2], + %%get_time(TS),stamp_date_time(TS,date(Year,Month,Day,Hour1,Minute1,Seconda,_A,_TZ,_False),local), + concat_list(["gla_del"],%%[Year,Month,Day,Hour1,Minute1,Seconda], + File1), + concat_list(["\"",File1,".txt\""],File2), + + term_to_atom(List_a,List_a2), + string_atom(List_a3,List_a2), + + concat_list(["swipl -G100g -T20g -L2g\n['../listprolog'].\nleash(-all),visible(+all),protocol(",File2,"),trace,interpret(off,[[n,function],[",List_a3,"]],[[[n,function],[[v,a]],\":-\",[[[n,length],[[v,a],0,1]]]],[[n,function],[[v,a]],\":-\",[[[n,head],[[v,a],[v,d]]],[[n,equals1],[[v,d],[[v,e],[v,f]]]],[[n,reverse],[[v,a],[],[v,a1]]],[[n,head],[[v,a1],[v,d1]]],[[n,equals1],[[v,d1],[[v,e1],[v,f1]]]],[[n,function2],[[v,a],[v,f],[v,f1]]]]],[[n,reverse],[[],[v,l],[v,l]]],[[n,reverse],[[v,l],[v,m],[v,n]],\":-\",[[[n,head],[[v,l],[v,h]]],[[n,tail],[[v,l],[v,t]]],[[n,wrap],[[v,h],[v,h1]]],[[n,append],[[v,h1],[v,m],[v,o]]],[[n,reverse],[[v,t],[v,o],[v,n]]]]],[[n,function2],[[v,a],[v,b],[v,f]],\":-\",[[[n,member2],[[v,a],[v,d]]],[[n,equals1],[[v,d],[[v,b],[v,f]]]]]],[[n,function2],[[v,a],[v,b],[v,c]],\":-\",[[[n,member2],[[v,a],[v,d]]],[[n,equals1],[[v,d],[[v,b],[v,f]]]],[[n,function2],[[v,d],[v,f],[v,c]]]]],[[n,length],[[],[v,l],[v,l]]],[[n,length],[[v,l],[v,m1],[v,n]],\":-\",[[[n,not],[[[n,=],[[v,l],[]]]]],[[n,tail],[[v,l],[v,t]]],[[n,+],[[v,m1],1,[v,m2]]],[[n,length],[[v,t],[v,m2],[v,n]]]]]],[[]]),notrace,noprotocol.\nh
alt.\nswipl -G100g -T20g -L2g\n['../Text-to-Breasonings/text_to_breasonings.pl'].\ntime((N is 3,\nM is 16000,\ntexttobr2(N,",File2,",u,M),texttobr(N,",File2,",u,M))).\n['../texttobr2qb'].\ntexttobr2(3).\nhalt."],List_a1), + + concat_list(["\n\nswipl -G100g -T20g -L2g\n['../listprolog'].\nleash(-all),visible(+all),protocol(",File2,"),trace,interpret(off,[[n,function],[",List_a3,",[v,b]]],[[[n,function],[[v,a],[v,b]],\":-\",[[[n,tail],[[v,a],[v,b]]]]]],_),notrace,noprotocol.\nhalt.\nswipl -G100g -T20g -L2g\n['../Text-to-Breasonings/text_to_breasonings.pl'].\ntime((N is 3,\nM is 16000,\ntexttobr2(N,",File2,",u,M),texttobr(N,",File2,",u,M))).\n['../texttobr2qb'].\ntexttobr2(3).\nhalt."],List_b1), + + term_to_atom(List_bb,List_bb2), + string_atom(List_bb3,List_bb2), + + concat_list(["swipl -G100g -T20g -L2g\n['../listprolog'].\nleash(-all),visible(+all),protocol(",File2,"),trace,interpret(off,[[n,function],[",List_bb3,"]],[[[n,function],[[v,a]],\":-\",[[[n,length],[[v,a],0,1]]]],[[n,function],[[v,a]],\":-\",[[[n,head],[[v,a],[v,d]]],[[n,equals1],[[v,d],[[v,e],[v,f]]]],[[n,reverse],[[v,a],[],[v,a1]]],[[n,head],[[v,a1],[v,d1]]],[[n,equals1],[[v,d1],[[v,e1],[v,f1]]]],[[n,function2],[[v,a],[v,f],[v,f1]]]]],[[n,reverse],[[],[v,l],[v,l]]],[[n,reverse],[[v,l],[v,m],[v,n]],\":-\",[[[n,head],[[v,l],[v,h]]],[[n,tail],[[v,l],[v,t]]],[[n,wrap],[[v,h],[v,h1]]],[[n,append],[[v,h1],[v,m],[v,o]]],[[n,reverse],[[v,t],[v,o],[v,n]]]]],[[n,function2],[[v,a],[v,b],[v,f]],\":-\",[[[n,member2],[[v,a],[v,d]]],[[n,equals1],[[v,d],[[v,b],[v,f]]]]]],[[n,function2],[[v,a],[v,b],[v,c]],\":-\",[[[n,member2],[[v,a],[v,d]]],[[n,equals1],[[v,d],[[v,b],[v,f]]]],[[n,function2],[[v,d],[v,f],[v,c]]]]],[[n,length],[[],[v,l],[v,l]]],[[n,length],[[v,l],[v,m1],[v,n]],\":-\",[[[n,not],[[[n,=],[[v,l],[]]]]],[[n,tail],[[v,l],[v,t]]],[[n,+],[[v,m1],1,[v,m2]]],[[n,length],[[v,t],[v,m2],[v,n]]]]]],[[]]),notrace,noprotocol.\nhalt.\nswipl -G100g -T20g -L2g\n['../Text-to-Breasonings/text_to_breasonings.pl'].\ntime((N 
is 3,\nM is 16000,\ntexttobr2(N,",File2,",u,M),texttobr(N,",File2,",u,M))).\n['../texttobr2qb'].\ntexttobr2(3).\nhalt."],List_bb1). + +grammar_logic_to_alg(Sentence1,B) :- %% Not by multi-sentence algorithms, just by sentence + atom_string(Sentence0,Sentence1), + downcase_atom(Sentence0,Sentence01), + atom_string(Sentence01,Sentence02), + SepandPad="&#@~%`$?-+*^,()|.:;=_/[]<>{}\n\r\s\t\\\"!'0123456789", + Connectors= + ["the","a","i","on","with","of","an","for","to", + "was","were","and","in","my","from","out","by"], + split_string(Sentence02,SepandPad,SepandPad,Sentence2), + subtract(Sentence2,Connectors,Sentence3), + %%length(Sentence3,Length), + + %% () add generated data + %%write_commands(Length,[],Commands), %% sentence alg + + generate_sentences(Sentence3,[],Sentence_a,1), %% detail sentences + append(Sentence3,Sentence_a,Sentence4), + findall([*,Sentence1,a_alg(Sentence5),b_alg(Sentence5,a),bb_alg(Sentence6)],(member(Sentence4a,Sentence4),make_lists(Sentence4a,[],Sentence5),Sentence5=[_|Sentence6]),B),!. + %% detail algs + +generate_sentences(_Sentence3,Sentence_a,Sentence_a,0) :- !. +generate_sentences(Sentence3,Sentence_a1,Sentence_a2,N1) :- + random_member(Item,Sentence3), + generate_sentence(Item,Sentence_a3), + append(Sentence_a1,[Sentence_a3],Sentence_a4), + N2 is N1-1, + generate_sentences(Sentence3,Sentence_a4,Sentence_a2,N2). + +%%write_commands(0,Commands,Commands) :- !. +%%write_commands(Length1,Commands1,Commands2) :- +%% Length2 is Length1-1. +%% append(Commands1,[[[n,member2],[[v,Length1],[v,Length2]]]). + %%[[n,equals1],[[v,Length2],[[v,a***],[v,b]]]]]. + +%%[a,b] +%%[c,d] + +generate_sentence(Item,Sentence) :- + random_member(Grammar1,[[n,v,n],[n,v,a,n],[v,n],[v,a,n]]), + brdict(BrDict012), + find_pos(Item,POS,BrDict012), + substitute1(Item,POS,Grammar1,[],Grammar2), + substitute2(Grammar2,BrDict012,[],Sentence). + +find_pos(Item,POS2,BrDict012) :- + member([Item,POS1],BrDict012), + POS1="right", + POS2=v,!. 
+find_pos(Item,POS2,BrDict012) :- + member([Item,POS1],BrDict012), + POS1="plus", + POS2=a,!. +find_pos(_Item,POS2,_BrDict012) :- + POS2=n. + +substitute1(_Item,_POS,[],Grammar,Grammar) :- !. +substitute1(Item,POS,Grammar1,Grammar2,Grammar3) :- + Grammar1=[Grammar4|Grammar5], + Grammar4=POS, + append_list([Grammar2,Item],Grammar6), + append(Grammar6,Grammar5,Grammar3),!. +substitute1(Item,POS,Grammar1,Grammar2,Grammar3) :- + Grammar1=[Grammar4|Grammar5], + not(Grammar4=POS), + append_list([Grammar2,Grammar4],Grammar6), + substitute1(Item,POS,Grammar5,Grammar6,Grammar3),!. + +substitute2([],_BrDict012,Sentence,Sentence) :- !. +substitute2(Grammar1,BrDict012,Sentence1,Sentence2) :- + Grammar1=[Grammar2|Grammar3], + Grammar2=n, + findall(A,(member([A,"box"],BrDict012)),B), + mind_read(Word,B), + append(Sentence1,[Word],Sentence3), + substitute2(Grammar3,BrDict012,Sentence3,Sentence2). +substitute2(Grammar1,BrDict012,Sentence1,Sentence2) :- + Grammar1=[Grammar2|Grammar3], + Grammar2=v, + findall(A,(member([A,"right"],BrDict012)),B), + mind_read(Word,B), + append(Sentence1,[Word],Sentence3), + substitute2(Grammar3,BrDict012,Sentence3,Sentence2). +substitute2(Grammar1,BrDict012,Sentence1,Sentence2) :- + Grammar1=[Grammar2|Grammar3], + Grammar2=a, + findall(A,(member([A,"plus"],BrDict012)),B), + mind_read(Word,B), + append(Sentence1,[Word],Sentence3), + substitute2(Grammar3,BrDict012,Sentence3,Sentence2). +substitute2(Grammar1,BrDict012,Sentence1,Sentence2) :- + Grammar1=[Grammar2|Grammar3], + append(Sentence1,[Grammar2],Sentence3), + substitute2(Grammar3,BrDict012,Sentence3,Sentence2). + +make_lists(Sentence1,Sentence,Sentence) :- + Sentence1=[_Sentence2], + !. +make_lists(Sentence1,Sentence2,Sentence3) :- + Sentence1=[Sentence4|Sentence5], + Sentence5=[Sentence6|_Sentence7], + append(Sentence2,[[Sentence4,Sentence6]],Sentence8), + make_lists(Sentence5,Sentence8,Sentence3). 
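make_lists/3 above implements the word-pairing described in the file's header comment ("I liked God with you" becomes [liked,God],[God,you]): each word is paired with its successor. A usage sketch:

```prolog
% Usage sketch: make_lists/3 pairs each word with the following word.
% ?- make_lists([liked, god, you], [], Pairs).
% Pairs = [[liked, god], [god, you]].
```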
+ diff --git a/Algorithm-Writer-with-Lists/grammar_logic_to_alg_random.pl b/Algorithm-Writer-with-Lists/grammar_logic_to_alg_random.pl new file mode 100644 index 0000000000000000000000000000000000000000..5e885c548c57ca9034fa280b7954c8661002abb8 --- /dev/null +++ b/Algorithm-Writer-with-Lists/grammar_logic_to_alg_random.pl @@ -0,0 +1,197 @@ +/** + +grammar_logic_to_alg_without_brdict.pl + +**/ + +:-include('../Text-to-Breasonings/text_to_breasonings.pl'). + +:- dynamic brdict/1. +:- dynamic brdict_pos/1. + +%% given N sentences to generate, takes a sentence and M sentences to find substitution words from + +grammar_logic_to_alg1 :- + phrase_from_file_s(string(Text1), "../Text-to-Breasonings/luciansphilosophy.txt"), + + + phrase_from_file_s(string(BrDict0), "../Text-to-Breasonings/brdict1.txt"), + splitfurther(BrDict0,BrDict01), + sort(BrDict01,BrDict012), + retractall(brdict_pos(_)), + assertz(brdict_pos(BrDict012)), + + SepandPad=".\n", + split_string(Text1,SepandPad,SepandPad,Text2a), + delete(Text2a,"",Text222), + +%% random(N1),N2 is round(9*N1)+1,length(N2L,N2), + N2L=[_], + findall(N3,(member(_,N2L),random_member(N3,Text222)),N4), + + %%writeln([n4,N4]), + +%% append_list(N4,Text2), + + + + findall(B2,( %%random(M1),M2 is round(9*M1)+1,length(M2L,M2), + + + %%SepandPad2=" .\n", + %%split_string(Text1,SepandPad2,SepandPad2,Text2aa), + %%delete(Text2aa,"",M3), + + %%findall(M33,(member(_,M2L),random_member(M33,M3)),M4), + M4 = Text222, + %%writeln([m4,M4]), + + %%append_list(Text2aaa,M5), + retractall(brdict(_)), + assertz(brdict(M4)), + + member(B1,N4), + + %%trace, + + grammar_logic_to_alg(B1,B2)),C), + length(C,CLength), + writeln1([CLength,sentences]), + + C=[[[_,Sentence1,a_alg(List_a),_,bb_alg(List_bb)]|Cs1]|Cs2], + %%get_time(TS),stamp_date_time(TS,date(Year,Month,Day,Hour1,Minute1,Seconda,_A,_TZ,_False),local), + concat_list(["gla_del"],%%[Year,Month,Day,Hour1,Minute1,Seconda], + File1), + concat_list(["\"",File1,".txt\""],File2), + + 
term_to_atom(List_a,List_a2), + string_atom(List_a3,List_a2), + + concat_list(["swipl -G100g -T20g -L2g\n['../listprolog'].\nleash(-all),visible(+all),protocol(",File2,"),trace,interpret(off,[[n,function],[",List_a3,"]],[[[n,function],[[v,a]],\":-\",[[[n,length],[[v,a],0,1]]]],[[n,function],[[v,a]],\":-\",[[[n,head],[[v,a],[v,d]]],[[n,equals1],[[v,d],[[v,e],[v,f]]]],[[n,reverse],[[v,a],[],[v,a1]]],[[n,head],[[v,a1],[v,d1]]],[[n,equals1],[[v,d1],[[v,e1],[v,f1]]]],[[n,function2],[[v,a],[v,f],[v,f1]]]]],[[n,reverse],[[],[v,l],[v,l]]],[[n,reverse],[[v,l],[v,m],[v,n]],\":-\",[[[n,head],[[v,l],[v,h]]],[[n,tail],[[v,l],[v,t]]],[[n,wrap],[[v,h],[v,h1]]],[[n,append],[[v,h1],[v,m],[v,o]]],[[n,reverse],[[v,t],[v,o],[v,n]]]]],[[n,function2],[[v,a],[v,b],[v,f]],\":-\",[[[n,member2],[[v,a],[v,d]]],[[n,equals1],[[v,d],[[v,b],[v,f]]]]]],[[n,function2],[[v,a],[v,b],[v,c]],\":-\",[[[n,member2],[[v,a],[v,d]]],[[n,equals1],[[v,d],[[v,b],[v,f]]]],[[n,function2],[[v,d],[v,f],[v,c]]]]],[[n,length],[[],[v,l],[v,l]]],[[n,length],[[v,l],[v,m1],[v,n]],\":-\",[[[n,not],[[[n,=],[[v,l],[]]]]],[[n,tail],[[v,l],[v,t]]],[[n,+],[[v,m1],1,[v,m2]]],[[n,length],[[v,t],[v,m2],[v,n]]]]]],[[]]),notrace,noprotocol.\nhalt.\nswipl -G100g -T20g -L2g\n['../Text-to-Breasonings/text_to_breasonings.pl'].\ntime((N is 3,\nM is 16000,\ntexttobr2(N,",File2,",u,M),texttobr(N,",File2,",u,M))).\n['../texttobr2qb'].\ntexttobr2(3).\nhalt."],List_a1), + writeln1([*,Sentence1,a_alg(List_a)]), + writeln1(List_a1), + + concat_list(["\n\nswipl -G100g -T20g -L2g\n['../listprolog'].\nleash(-all),visible(+all),protocol(",File2,"),trace,interpret(off,[[n,function],[",List_a3,",[v,b]]],[[[n,function],[[v,a],[v,b]],\":-\",[[[n,tail],[[v,a],[v,b]]]]]],_),notrace,noprotocol.\nhalt.\nswipl -G100g -T20g -L2g\n['../Text-to-Breasonings/text_to_breasonings.pl'].\ntime((N is 3,\nM is 16000,\ntexttobr2(N,",File2,",u,M),texttobr(N,",File2,",u,M))).\n['../texttobr2qb'].\ntexttobr2(3).\nhalt."],List_b1), + 
writeln1([*,Sentence1,b_alg(List_a)]), + writeln1(List_b1), + + term_to_atom(List_bb,List_bb2), + string_atom(List_bb3,List_bb2), + + concat_list(["swipl -G100g -T20g -L2g\n['../listprolog'].\nleash(-all),visible(+all),protocol(",File2,"),trace,interpret(off,[[n,function],[",List_bb3,"]],[[[n,function],[[v,a]],\":-\",[[[n,length],[[v,a],0,1]]]],[[n,function],[[v,a]],\":-\",[[[n,head],[[v,a],[v,d]]],[[n,equals1],[[v,d],[[v,e],[v,f]]]],[[n,reverse],[[v,a],[],[v,a1]]],[[n,head],[[v,a1],[v,d1]]],[[n,equals1],[[v,d1],[[v,e1],[v,f1]]]],[[n,function2],[[v,a],[v,f],[v,f1]]]]],[[n,reverse],[[],[v,l],[v,l]]],[[n,reverse],[[v,l],[v,m],[v,n]],\":-\",[[[n,head],[[v,l],[v,h]]],[[n,tail],[[v,l],[v,t]]],[[n,wrap],[[v,h],[v,h1]]],[[n,append],[[v,h1],[v,m],[v,o]]],[[n,reverse],[[v,t],[v,o],[v,n]]]]],[[n,function2],[[v,a],[v,b],[v,f]],\":-\",[[[n,member2],[[v,a],[v,d]]],[[n,equals1],[[v,d],[[v,b],[v,f]]]]]],[[n,function2],[[v,a],[v,b],[v,c]],\":-\",[[[n,member2],[[v,a],[v,d]]],[[n,equals1],[[v,d],[[v,b],[v,f]]]],[[n,function2],[[v,d],[v,f],[v,c]]]]],[[n,length],[[],[v,l],[v,l]]],[[n,length],[[v,l],[v,m1],[v,n]],\":-\",[[[n,not],[[[n,=],[[v,l],[]]]]],[[n,tail],[[v,l],[v,t]]],[[n,+],[[v,m1],1,[v,m2]]],[[n,length],[[v,t],[v,m2],[v,n]]]]]],[[]]),notrace,noprotocol.\nhalt.\nswipl -G100g -T20g -L2g\n['../Text-to-Breasonings/text_to_breasonings.pl'].\ntime((N is 3,\nM is 16000,\ntexttobr2(N,",File2,",u,M),texttobr(N,",File2,",u,M))).\n['../texttobr2qb'].\ntexttobr2(3).\nhalt."],List_bb1), + writeln1([*,Sentence1,bb_alg(List_bb)]), + writeln1(List_bb1), + + writeln1(Cs1), + writeln1(Cs2). 
+ +grammar_logic_to_alg(Sentence1,B) :- %% Not by multi-sentence algorithms, just by sentence + atom_string(Sentence0,Sentence1), + downcase_atom(Sentence0,Sentence01), + atom_string(Sentence01,Sentence02), + SepandPad="&#@~%`$?-+*^,()|.:;=_/[]<>{}\n\r\s\t\\\"!'0123456789", + Connectors= + ["the","a","i","on","with","of","an","for","to", + "was","were","and","in","my","from","out","by"], + split_string(Sentence02,SepandPad,SepandPad,Sentence2), + subtract(Sentence2,Connectors,Sentence3), + %%length(Sentence3,Length), + + %% () add generated data + %%write_commands(Length,[],Commands), %% sentence alg + + generate_sentences(Sentence3,[],Sentence_a,30), %% detail sentences + append([Sentence3],Sentence_a,Sentence4), + findall([*,Sentence1,a_alg(Sentence5),b_alg(Sentence5,a),bb_alg(Sentence6)],(member(Sentence4a,Sentence4),make_lists(Sentence4a,[],Sentence5),Sentence5=[_|Sentence6]),B),!. + %% detail algs + +generate_sentences(_Sentence3,Sentence_a,Sentence_a,0) :- !. +generate_sentences(Sentence3,Sentence_a1,Sentence_a2,N1) :- + random_member(Item,Sentence3), + %%trace, + generate_sentence(Item,Sentence_a3), + append(Sentence_a1,[Sentence_a3],Sentence_a4), + N2 is N1-1, + generate_sentences(Sentence3,Sentence_a4,Sentence_a2,N2). + +%%write_commands(0,Commands,Commands) :- !. +%%write_commands(Length1,Commands1,Commands2) :- +%% Length2 is Length1-1. +%% append(Commands1,[[[n,member2],[[v,Length1],[v,Length2]]]). + %%[[n,equals1],[[v,Length2],[[v,a***],[v,b]]]]]. + +%%[a,b] +%%[c,d] + +generate_sentence(Item,Sentence) :- + random_member(Grammar1,[[n,v,n,n,v,a,n,v,n,v,a,n],[n,v,n],[n,v,a,n],[v,n],[v,a,n]]), + brdict_pos(BrDict012), + find_pos(Item,POS,BrDict012), + substitute1(Item,POS,Grammar1,[],Grammar2), + substitute2(Grammar2,BrDict012,[],Sentence). + +find_pos(Item,POS2,BrDict012) :- + POS1="right", + member([Item,POS1],BrDict012), + POS2=v,!. +find_pos(Item,POS2,BrDict012) :- + POS1="plus", + member([Item,POS1],BrDict012), + POS2=a,!. 
+find_pos(_Item,POS2,_BrDict012) :- + POS2=n. + +substitute1(_Item,_POS,[],Grammar,Grammar) :- !. +substitute1(Item,POS,Grammar1,Grammar2,Grammar3) :- + Grammar1=[Grammar4|Grammar5], + Grammar4=POS, + append_list([Grammar2,Item],Grammar6), + append(Grammar6,Grammar5,Grammar3),!. +substitute1(Item,POS,Grammar1,Grammar2,Grammar3) :- + Grammar1=[Grammar4|Grammar5], + not(Grammar4=POS), + append_list([Grammar2,Grammar4],Grammar6), + substitute1(Item,POS,Grammar5,Grammar6,Grammar3),!. + +substitute2([],_BrDict012,Sentence,Sentence) :- !. +substitute2(Grammar1,BrDict012,Sentence1,Sentence2) :- + Grammar1=[Grammar2|Grammar3], + Grammar2=n, + findall(A,(%%brdict(BD),member(A,BD),!, + member([A,"box"],BrDict012)),B), + random_member(Word,B), + append(Sentence1,[Word],Sentence3), + substitute2(Grammar3,BrDict012,Sentence3,Sentence2). +substitute2(Grammar1,BrDict012,Sentence1,Sentence2) :- + Grammar1=[Grammar2|Grammar3], + Grammar2=v, + findall(A,(%%brdict(BD),member(A,BD),!, + member([A,"right"],BrDict012)),B), + random_member(Word,B), + append(Sentence1,[Word],Sentence3), + substitute2(Grammar3,BrDict012,Sentence3,Sentence2). +substitute2(Grammar1,BrDict012,Sentence1,Sentence2) :- + Grammar1=[Grammar2|Grammar3], + Grammar2=a, + findall(A,(%%brdict(BD),member(A,BD),!, + member([A,"plus"],BrDict012)),B), + random_member(Word,B), + append(Sentence1,[Word],Sentence3), + substitute2(Grammar3,BrDict012,Sentence3,Sentence2). +substitute2(Grammar1,BrDict012,Sentence1,Sentence2) :- + Grammar1=[Grammar2|Grammar3], + append(Sentence1,[Grammar2],Sentence3), + substitute2(Grammar3,BrDict012,Sentence3,Sentence2). + +make_lists(Sentence1,Sentence,Sentence) :- + Sentence1=[_Sentence2], + !. +make_lists(Sentence1,Sentence2,Sentence3) :- + Sentence1=[Sentence4|Sentence5], + Sentence5=[Sentence6|_Sentence7], + append(Sentence2,[[Sentence4,Sentence6]],Sentence8), + make_lists(Sentence5,Sentence8,Sentence3). + +writeln1(Term) :- + term_to_atom(Term,Atom), + writeln(Atom),!. 
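As a quick illustration of two of the helper predicates above (the queries and answers are a sketch inferred from the clauses, not repository test output):

```prolog
% make_lists/3 pairs each word with its successor:
% ?- make_lists([tom,walked,home],[],Pairs).
% Pairs = [[tom,walked],[walked,home]].

% find_pos/3 maps the dictionary tag "right" to v and "plus" to a,
% and defaults to n for any word without one of those tags:
% ?- find_pos(walked,POS,[[walked,"right"]]).
% POS = v.
```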
diff --git a/Algorithm-Writer-with-Lists/grammar_logic_to_alg_sh.txt b/Algorithm-Writer-with-Lists/grammar_logic_to_alg_sh.txt new file mode 100644 index 0000000000000000000000000000000000000000..8a5280890f3c39e0b30d3c204b4ec570e3cd9259 --- /dev/null +++ b/Algorithm-Writer-with-Lists/grammar_logic_to_alg_sh.txt @@ -0,0 +1,9 @@ +%% grammar_logic_to_alg.sh + +swipl -G100g -T20g -L2g +[grammar_logic_to_alg]. +leash(-all),visible(+all),protocol("./gla.txt"). +grammar_logic_to_alg1. +noprotocol. +halt. + diff --git a/Algorithm-Writer-with-Lists/midnight_algorithm_input_list.pl b/Algorithm-Writer-with-Lists/midnight_algorithm_input_list.pl new file mode 100644 index 0000000000000000000000000000000000000000..499a12ef3ed22b63d507dfe06db17964504e20d3 --- /dev/null +++ b/Algorithm-Writer-with-Lists/midnight_algorithm_input_list.pl @@ -0,0 +1,68 @@ +:-include('../listprologinterpreter/la_strings'). + + +midnight_algorithm_input_list :- +writeln("Algorithm name?"), + read_string(user_input, "\n", "\r", _End2, Alg_name), +writeln("List item 1?"), + read_string(user_input, "\n", "\r", _End21, List_item1), +writeln("List item 2?"), + read_string(user_input, "\n", "\r", _End22, List_item2), +writeln("List item 3?"), + read_string(user_input, "\n", "\r", _End23, List_item3), +writeln("List item 4?"), + read_string(user_input, "\n", "\r", _End24, List_item4), + Algorithm= + + [[[n,Alg_name],[[[List_item1,List_item2],[List_item2,List_item3],[List_item3,List_item4]]]], + [[[n,Alg_name],[[[t,list],[[t,list2]]]]], + [[t,list2],[[t,string],[t,string]]], + [[n,reverse],[[[t,list],[[t,list2]]],[[t,list],[[t,list2]]],[[t,list],[[t,list2]]]]], + [[n,function2],[[[t,list],[[t,list2]]],[t,string],[t,string]]], + [[n,length],[[[t,list],[[t,list2]]],[t,number],[t,number]]]], + + [[[n,Alg_name],[input]], + [[n,reverse],[input,input,output]], + [[n,function2],[input,input,input]], + [[n,length],[input,input,output]]], + +[[[n,Alg_name],[[v,a]],":-", +[[[n,length],[[v,a],0,[v,b]]], 
+[[n,=],[[v,b],1]]]], + +[[n,Alg_name],[[v,a]],":-", +[[[n,head],[[v,a],[v,d]]], +[[n,equals1],[[v,d],[[v,e],[v,f]]]], +[[n,reverse],[[v,a],[],[v,a1]]], +[[n,head],[[v,a1],[v,d1]]], +[[n,equals1],[[v,d1],[[v,e1],[v,f1]]]], +[[n,function2],[[v,a],[v,f],[v,f1]]]]], + +[[n,reverse],[[],[v,l],[v,l]]], + +[[n,reverse],[[v,l],[v,m],[v,n]],":-", +[[[n,head],[[v,l],[v,h]]], +[[n,tail],[[v,l],[v,t]]], +[[n,wrap],[[v,h],[v,h1]]], +[[n,append],[[v,h1],[v,m],[v,o]]], +[[n,reverse],[[v,t],[v,o],[v,n]]]]], + +[[n,function2],[[v,a],[v,b],[v,f]],":-", +[[[n,member],[[v,d],[v,a]]], +[[n,equals1],[[v,d],[[v,b],[v,f]]]]]], + +[[n,function2],[[v,a],[v,b],[v,c]],":-", +[[[n,member],[[v,d],[v,a]]], +[[n,equals1],[[v,d],[[v,b],[v,f]]]], +[[n,function2],[[v,d],[v,f],[v,c]]]]], + +[[n,length],[[],[v,l],[v,l]]], + +[[n,length],[[v,l],[v,m1],[v,n]],":-", +[[[n,not],[[[n,=],[[v,l],[]]]]], +[[n,tail],[[v,l],[v,t]]], +[[n,+],[[v,m1],1,[v,m2]]], +[[n,length],[[v,t],[v,m2],[v,n]]]]]], +[[]]], + +writeln1([algorithm,Algorithm]). diff --git a/Algorithm-Writer-with-Lists/mr_alg.pl b/Algorithm-Writer-with-Lists/mr_alg.pl new file mode 100644 index 0000000000000000000000000000000000000000..4e5a4915314f85b3868be5ce5036f9196b1c1cb8 --- /dev/null +++ b/Algorithm-Writer-with-Lists/mr_alg.pl @@ -0,0 +1,57 @@ +% [[[n,member],[[list],[_3386]]],[[n,member],[[list],[list]]],[[n,intersection],[[list],[list]]],[[n,member],[[list],[_5928]]],[[n,atom_string],[[atom],[string]]],[[n,string_length],[[string],[number]]]] + +mr_alg(Commands3) :- + commands(Commands), + algorithm_length(Algorithm_length), + random_member(Command1,Commands), + Command1=[_Name1,[_Input1,Output1]], + mr_alg(Algorithm_length,Commands,Output1,[],Commands2), + append([Command1],Commands2,Commands3). + +mr_alg(1,_,_,Commands,Commands) :- !. 
+mr_alg(Algorithm_length1,Commands,Output1,Commands1,Commands2) :- + findall(Command2,(member(Command2,Commands), + Command2=[_Name2,[Output1,_Output2]]),Commands3), + random_member(Command4,Commands3), + Command4=[_Name3,[_Input3,Output3]], + append(Commands1,[Command4],Commands5), + Algorithm_length2 is Algorithm_length1 - 1, + mr_alg(Algorithm_length2,Commands,Output3,Commands5,Commands2). + +algorithm_length(5). + +%& check if any of these have 3 args - x +commands([ +[[n,+],[[number,number],[number]]], +[[n,-],[[number,number],[number]]], +[[n,*],[[number,number],[number]]], +[[n,/],[[number,number],[number]]], +[[n,/],[[number,number],[]]], +[[n,>],[[number,number],[]]], +[[n,>=],[[number,number],[]]], +[[n,<],[[number,number],[]]], +[[n,=<],[[number,number],[]]], +[[n,=],[[number,number],[]]], +[[n,=\=],[[number,number],[]]], +[[n,head],[[list],[_]]], +[[n,tail],[[list],[_]]], +[[n,member],[[list],[_]]], % swapped for this exercise +[[n,delete],[[list,_],[list]]], +[[n,append123],[[list,list],[list]]], % append([1],[2],A). +[[n,append132],[[list,list],[list]]], % append([1],A,[1,2]). +[[n,append231],[[list,list],[list]]], % append(A,[2],[1,2]). + % append(A,B,[1,2]). x +[[n,stringconcat123],[[string,string],[string]]], % string("1","2",A). +[[n,stringconcat132],[[string,string],[string]]], % string("1",A,"12"). +[[n,stringconcat231],[[string,string],[string]]], % string(A,"2","12"). +[[n,stringtonumber],[[string],[number]]], +[[n,random],[[number]]], +[[n,length],[[list],[number]]], +[[n,ceiling],[[number],[number]]], +[[n,sqrt],[[number],[number]]], +[[n,round],[[number],[number]]], +[[n,string_length],[[string],[number]]], +[[n,sort],[[list],[_]]], +[[n,intersection],[[list,list],[list]]], +[[n,atom_string],[[atom],[string]]] +]). 
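The loop above chains commands by type: each successive command is drawn from those whose input type list equals the previous command's output type list. A hedged sketch of a query follows (results are random, so the answer shown is only one possibility):

```prolog
% ?- mr_alg(Alg).
% e.g. Alg = [[[n,length],[[list],[number]]],
%             [[n,ceiling],[[number],[number]]],
%             [[n,sqrt],[[number],[number]]],
%             ...]
% Here length outputs [number], so ceiling (input [number]) may follow,
% then sqrt, and so on until algorithm_length(5) commands are chosen.
```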
\ No newline at end of file diff --git a/Algorithm-Writer-with-Lists/random_dependencies.pl b/Algorithm-Writer-with-Lists/random_dependencies.pl new file mode 100644 index 0000000000000000000000000000000000000000..8d8f8a37d63297571c53a4bcc373fbd63c4baec1 --- /dev/null +++ b/Algorithm-Writer-with-Lists/random_dependencies.pl @@ -0,0 +1,17 @@ +random_dependencies(A) :- + random(N1),N2 is round(2*N1), + random_dependencies1(N2,A). +random_dependencies1(0,[]) :- !. +random_dependencies1(1,A) :- + random_member(B,[and,or,reverse,string_to_list, +split_into_sentences,is_positive_language,agrees,member,delete,minus,get_item_n,length,intersection, +substring,sum,sort,maximum,map,findall,duplicates,mutually_exclusive,list_head,add_number_to_list]), +random_dependencies(C), +A=[B,[C]]. + +random_dependencies1(2,A) :- + random_member(B,[and,or,reverse,string_to_list, +split_into_sentences,is_positive_language,agrees,member,delete,minus,get_item_n,length,intersection, +substring,sum,sort,maximum,map,findall,duplicates,mutually_exclusive,list_head,add_number_to_list]), +random_dependencies(C),random_dependencies(D), +A=[B,[C,D]]. \ No newline at end of file diff --git a/Algorithm-Writer-with-Lists/ya_phil_to_alg.pl b/Algorithm-Writer-with-Lists/ya_phil_to_alg.pl new file mode 100644 index 0000000000000000000000000000000000000000..d8f582845c7cef9bf1ff2c4b5fbff7057acf9e6e --- /dev/null +++ b/Algorithm-Writer-with-Lists/ya_phil_to_alg.pl @@ -0,0 +1,45 @@ +:-include('../Lucian-Academy/folders.pl'). +:-include('mr_alg.pl'). +:-include('../listprologinterpreter/la_strings.pl'). +:-include('../listprologinterpreter/la_strings_string.pl'). +:-include('../listprologinterpreter/la_files.pl'). 
+ +% start with 1g memory + +ya_phil_to_alg :- + +folders(Courses1), +%random_member(Course,Courses1), + %get_texts(Dept,Texts) :- + +findall(Texts1,(member(Dept,Courses1), concat_list(["../Lucian-Academy/",Dept,"/"],Dept1), + directory_files(Dept1,F), + delete_invisibles_etc(F,G), + findall(String02b,(member(Filex1,G), + string_concat(Dept1,Filex1,Filex), + phrase_from_file_s(string(String00a), Filex), + string_codes(String02b,String00a)),Texts1) + ),Texts2), + +flatten(Texts2,Texts3), + +term_to_atom(Texts3,Texts4), + +split_string(Texts4,"\n\r","\n\r",Texts5), + +findall([Text6,Alg],(member(Text6,Texts5), +mr_alg(Alg)),Algs1), + +flatten(Algs1,Algs2), + +term_to_atom(Algs2,Algs3), + +%string_atom(Algs,Algs), + + (open_s("yet_more_phil_algs.txt",write,Stream3), + write(Stream3,Algs3), + close(Stream3)),!. + + + + \ No newline at end of file diff --git a/Combination-Algorithm-Writer-Multiple-Predicates/.DS_Store b/Combination-Algorithm-Writer-Multiple-Predicates/.DS_Store new file mode 100644 index 0000000000000000000000000000000000000000..5008ddfcf53c02e82d7eee2e57c38e5672ef89f6 Binary files /dev/null and b/Combination-Algorithm-Writer-Multiple-Predicates/.DS_Store differ diff --git a/Combination-Algorithm-Writer-Multiple-Predicates/LICENSE b/Combination-Algorithm-Writer-Multiple-Predicates/LICENSE new file mode 100644 index 0000000000000000000000000000000000000000..d925e8c4aa63a274f76a5c22de79d0ada99b0da7 --- /dev/null +++ b/Combination-Algorithm-Writer-Multiple-Predicates/LICENSE @@ -0,0 +1,29 @@ +BSD 3-Clause License + +Copyright (c) 2019, Lucian Green +All rights reserved. + +Redistribution and use in source and binary forms, with or without +modification, are permitted provided that the following conditions are met: + +1. Redistributions of source code must retain the above copyright notice, this + list of conditions and the following disclaimer. + +2. 
Redistributions in binary form must reproduce the above copyright notice, + this list of conditions and the following disclaimer in the documentation + and/or other materials provided with the distribution. + +3. Neither the name of the copyright holder nor the names of its + contributors may be used to endorse or promote products derived from + this software without specific prior written permission. + +THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" +AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE +IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE +DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE +FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL +DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR +SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER +CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, +OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE +OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. diff --git a/Combination-Algorithm-Writer-Multiple-Predicates/LPCAWMP_docs.md b/Combination-Algorithm-Writer-Multiple-Predicates/LPCAWMP_docs.md new file mode 100644 index 0000000000000000000000000000000000000000..a818561dcf21d41d7e796808940e4e53afe6e6c7 --- /dev/null +++ b/Combination-Algorithm-Writer-Multiple-Predicates/LPCAWMP_docs.md @@ -0,0 +1,20 @@ +# Documentation of List Prolog Interpreter CAWMP Commands + +* `[[n,[not,Operator]],[Variable1,Variable2]]` where `Operator=is` or `Operator="="` e.g. `[[n,[not,is]],[[v,a],1]]` returns true where `[v,a]` must be instantiated (not empty) and can equal 2. + +* `[[n,[not,Operator]],[Variable1,Variable2]]` where `Operator=">"`, `Operator=">="`, `Operator="<"`, `Operator="=<"` or `Operator="=\="` e.g. `[[n,[not,>]],[3,2]]` returns false. 
+
+* `[[n,[not,member]],[Variable1,Variable2]]` e.g. `[[n,[not,member]],[["a","b","c"],"d"]]` returns true.
+
+* `[[n,[not,head]],[Variable1,Variable2]]` e.g. `[[n,[not,head]],[["a","b","c"],"a"]]` returns false.
+
+* `[[n,[not,tail]],[Variable1,Variable2]]` e.g. `[[n,[not,tail]],[["a","b","c"],["b","c"]]]` returns false.
+
+* `[[n,[not,append]],[Variable1,Variable2,Variable3]]` e.g. `[[n,[not,append]],[["a"],["b","c"],["a","b","c"]]]` returns false.
+
+* `[[n,[]],[Variable1]]` e.g. `[[n,[]],[[v,a]]]` returns `[v,a]=[]`.
+
+* `[[n,""],[Variable1]]` e.g. `[[n,""],[[v,a]]]` returns `[v,a]=""`.
+
+* `[[n,"_"],[Variable1]]` behaves similarly to `_A` in Prolog e.g. `[[n,"_"],[[v,a]]]` returns true.
+
diff --git a/Combination-Algorithm-Writer-Multiple-Predicates/README.md b/Combination-Algorithm-Writer-Multiple-Predicates/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..9c60457fb5cb15b147e503ba4021ef7df1836456
--- /dev/null
+++ b/Combination-Algorithm-Writer-Multiple-Predicates/README.md
@@ -0,0 +1,135 @@
+# Combination Algorithm Writer with Multiple Predicates
+
+* NB. This repository has been deprecated by Combination Algorithm Writer with Predicates Stable (CAWPS), which may return more commands per predicate, but this repository, CAWMP, has more features.
+
+Combination Algorithm Writer with Multiple Predicates (CAWMP) is a SWI-Prolog program that takes given rules and algorithm parts (from algdict.pl) and writes algorithms with multiple predicates that satisfy the given inputs and outputs.
+
+Please contact Lucian Green at luciangreen@lucianacademy.com with questions, comments and feedback about CAWMP. Please use this citation for research.
+
+# Getting Started
+
+Please read the following instructions on how to install the project on your computer for writing code.
+
+# Prerequisites
+
+Install the List Prolog Interpreter Repository (https://github.com/luciangreen/listprologinterpreter) first.
+ + +# Installation from List Prolog Package Manager (LPPM) + +* Optionally, you can install from LPPM by installing SWI-Prolog for your machine, downloading the LPPM Repository, +``` +mkdir GitHub +cd GitHub/ +git clone https://github.com/luciangreen/List-Prolog-Package-Manager.git +cd List-Prolog-Package-Manager +swipl +``` +loading LPPM with `['lppm'].` then installing the package by running `lppm_install("luciangreen","Combination-Algorithm-Writer-Multiple-Predicates").`. + +# Installing + +* Download this repository. +* In SWI-Prolog, enter: +``` +['cawplistprolog']. +``` +* Running + +* Try CAWP regression tests (in cawpverify.pl): +Example +`cawptest1(off,N,S).` where N is the test number 1-8. +Example Output +``` +[cawptest,passed,[[[n,add0],[[v,a],[v,b],[v,c]],":-",[[[n,+],[[v,a],[v,b],[v,d]]],[[n,=],[[v,d],[v,c]]]]]]] +``` +* Try CAWP regression tests (in cawpverifya.pl): + +Example +`cawptest1a(off,7,S).` +Example Output +``` +[cawptest,7,passed] +S = passed. +``` + +# Running Random Combination Algorithm Writer (RCAW) + +* RCAW generates random algorithms. + +* To load and install, follow the instructions above. + +# Notes on CAWMP: + +* Variables should be named a, b, c, and so on, without any gaps. +* In inputted data, i.e. specifications, the above applies. + + +CAWMP is called with the command: `caw00(Debug,Function,Rules,MaxLength,MaxPredicates,TotalVars,NumInputs,NumOutputs,Specifications,AlgDict,Program1,Program2).` +e.g. +``` +caw00(off,add0,[],2,1,3,[1,2],[0,1], +%% Specs +[[[[[[v,a],1],[[v,b],2]],[],true], +[[[[v,a],2],[[v,b],1]],[],true]]], +[ %% Algorithm dictionary + [[[n,1],1,1],[[v,a],[v,b]],":-", + [ [[n,+],[[v,a],1,[v,c]]], + [[n,=],[[v,c],[v,b]]]]], + + [[[n,1],1,1],[[v,a],[v,b]],":-", + [ [[n,-],[[v,a],1,[v,c]]], + [[n,=],[[v,c],[v,b]]]]] +], + +[],Program2). 
+```
+It has the output:
+```
+Program2=
+[
+ [[n,1],[[v,a],[v,b]],":-",
+ [[[n,+],[[v,a],1,[v,c]]],
+ [[n,=],[[v,c],[v,b]]]]],
+ [[n,1],[[v,a],[v,b]],":-",
+ [[[n,-],[[v,a],1,[v,c]]],
+ [[n,=],[[v,c],[v,b]]]]],
+ [[n,add0],[[v,a],[v,b]],":-",
+ [[[n,1],[[v,b],[v,c]]],
+ [[n,=],[[v,a],[v,c]]]]]
+].
+```
+
+* Debug is `on` for trace, `off` for no trace.
+* Function is the inputted name of the algorithm.
+* Rules are the commands to build the code from, e.g. `[[[n,+],2,1],[[n,[]],1,0]]`, which means the `"+"` command has two inputs and one output and the `[]` command has one input and verifies an empty list. See the List Prolog Interpreter Documentation for more commands.
+* MaxLength is the maximum number of commands per predicate.
+* MaxPredicates is the maximum number of predicates per algorithm.
+* TotalVars is the number of different variables per predicate.
+* Specifications have the form `A=[Input_variable_list, Output_variable_list, True (if this specification is true, or false if trying to eliminate possibilities)]`, where a variable list is `[B]` where `B` is e.g. `[[v,c],[1,2]]`. There is a list of specifications, e.g. `[A1,A2]` for a predicate (i.e. not the whole algorithm), and `[[A1,A2],[B1,B2]]` for the specifications for the algorithm, e.g.
+```
+[[[[[[v,a],1],[[v,b],2]],[],true],
+[[[[v,a],2],[[v,b],1]],[],true]]]
+```
+* Program1 is the initial program (usually `[]`). Planned: partial multiple-predicate algorithms may be given to the algorithm to complete.
+* Program2 is the final outputted algorithm.
+* CAWMP allows shorter and shorter lists of specifications to specify algorithms as its dictionary grows.
+
+# Not tested updates
+
+Neither Prolog nor List Prolog can return non-deterministic results after the point of execution passes from a clause to a child clause, so testing the repository showed that multiple clauses cannot be generated in this case.
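The limitation described above can be pictured with a plain-Prolog sketch of committed choice (an illustration only, not CAWMP code):

```prolog
% After parent/1 commits with a cut, the second child/1 clause can no
% longer supply an alternative solution on backtracking:
child(1).
child(2).
parent(X) :- child(X), !.

% ?- findall(X, parent(X), Xs).
% Xs = [1]
```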
+
+See the CAWPS Repository, which is all that is needed to build an algorithm bottom-up. Without generating base cases first, neither CAWMP nor CAWPS can work, and CAWPS can generate more commands per predicate than CAWMP.
+
+# Documentation of List Prolog Interpreter CAWMP Commands
+
+See Documentation of List Prolog Interpreter CAWMP Commands.
+
+# Authors
+
+Lucian Green - Initial programmer - Lucian Academy
+
+# License
+
+This project is licensed under the BSD 3-Clause License - see the LICENSE file for details.
+
diff --git a/Combination-Algorithm-Writer-Multiple-Predicates/caw5 copy 12.pl b/Combination-Algorithm-Writer-Multiple-Predicates/caw5 copy 12.pl
new file mode 100644
index 0000000000000000000000000000000000000000..3117567f0e1f73ccaf1b89efb5fa2055ea341bdb
--- /dev/null
+++ b/Combination-Algorithm-Writer-Multiple-Predicates/caw5 copy 12.pl
@@ -0,0 +1,1252 @@
+/**cawptest2(3,add,[[[n,[]],1,0],[[n,"_"],1,0]],4,1,4,[1,2,3],[0,1],
+[
+[[[[[v,a],[]],[[v,b],3],[[v,c],[4,5,6]]],
+ [[[v,d],[4,5,6]]],true]]
+],
+[ %% Algorithm dictionary
+],
+%% Result
+
+ [[[n,add],[[v,a],[v,b],[v,c],[v,d]],":-",[[[n,[]],[[v,a]]],[[n,"_"],[[v,a]]],[[n,"_"],[[v,b]]],[[n,=],[[v,c],[v,d]]]]]]
+
+).
+**/
+
+:- dynamic debug/1.
+:- dynamic totalvars/1.
+:- dynamic outputvars/1.
+:- dynamic newrulenumber/1.
+:- dynamic maxlength/1.
+:- dynamic lastrule/1.
+:- dynamic furthest_rule/1.
+%%:- dynamic a/1.
+
+/**
+
+caw00(off,function3,[],5,7,[[[[a,1],[b,1]],[[c,2]],true],[[[a,1],[b,1]],[[c,2]],true],[[[a,1],[b,1]],[[c,1]],fail],[[[a,1],[b,1]],[[c,1]],fail]],[],Program),writeln(Program).
+ +VarLists is in format list of [InputVarList,OutputVarList,Positivity], where these are specification lines that are either Positivity=true or fail + +Notes: +- true specification line in first position of varlists, otherwise will try all except the specification line +- manually write a,a in algdict in a,b :- a,a,b because cawp would take too long finding all combinations including a,a +- give argument values in order 3 2 1 not 1 2 3 when adding, or will try 1+1+1 instead of 3 at first + +**/ + +%%:- include('algdict.pl'). +%%:- include('remove_duplicate_predicates.pl'). + +/**caw00a(Debug,PredicateName,Rules1,MaxLength,MaxPredicates,TotalVars,VarLists,Program1,Program2B) :- + (caw00(Debug,PredicateName,Rules1,MaxLength,MaxPredicates,TotalVars,VarLists,Program1,Program2B),%%writeln("If error, returned true");(furthest_rule([Number,Rules]), + writeln(["Error: No specification for predicate. Programs so far:",Rules,"\n\nPredicate number:",Number]). +**/ + +caw000(Debug,PredicateName,Rules,MaxLength,MaxPredicates,TotalVars,NumInputs,NumOutputs,VarLists,AlgDict,Program1,Program2A) :- + PredicateName=[PredicateName1], + MaxLength=[MaxLength1], + MaxPredicates=[MaxPredicates1], + TotalVars=[TotalVars1], + VarLists=[VarLists1], + %%AlgDict=[AlgDict1], + Program2A=[Program2A1], + +caw00(Debug,PredicateName1,Rules,MaxLength1,MaxPredicates1,TotalVars1,NumInputs,NumOutputs,VarLists1,AlgDict,Program1,Program2A1),!. 
+ +caw000(Debug,PredicateName,Rules,MaxLength,MaxPredicates,TotalVars,NumInputs,NumOutputs,VarLists,AlgDict,Program1,Program2A) :- + PredicateName=[PredicateName1|PredicateName2], + MaxLength=[MaxLength1|MaxLength2], + MaxPredicates=[MaxPredicates1|MaxPredicates2], + TotalVars=[TotalVars1|TotalVars2], + VarLists=[VarLists1|VarLists2], + %%AlgDict=[AlgDict1|AlgDict2], + Program2A=[Program2A1|Program2A2], + +caw00(Debug,PredicateName1,Rules,MaxLength1,MaxPredicates1,TotalVars1,NumInputs,NumOutputs,VarLists1,AlgDict,Program1,Program2A1), +%%trace, + +findall(A,(member(B,Program2A1),findall(K,(member(F,NumInputs),member(G,NumOutputs),B=[[n,C],Q,D,E],K=[[[n,C],F,G],Q,D,E]),L),member(A,L)),J), +append(AlgDict,J,Program2AA4), +%%Program2AA3=[Program2AA4], +remvdup(Program2AA4,[],Program2A3), +%%trace, +caw000(Debug,PredicateName2,Rules,MaxLength2,MaxPredicates2,TotalVars2,NumInputs,NumOutputs,VarLists2,Program2A3,[],Program2A2). + +caw00(Debug,PredicateName,Rules1,MaxLength,MaxPredicates,TotalVars,NumInputs,NumOutputs,VarLists,AlgDict,Program1,Program2B) :- + PredicatesA=AlgDict, + %% remove duplicate predicates + %%remvdup(PredicatesA0,[],PredicatesA), + split3(PredicatesA,[],Rules2), + split2(PredicatesA,[],Predicates), + %%writeln([Rules2,Predicates]), + append(Rules1,Rules2,Rules3), + + %%retractall(a(_)), + + retractall(debug(_)), + assertz(debug(Debug)), + retractall(totalvars(_)), + assertz(totalvars(TotalVars)), + retractall(maxlength(_)), + assertz(maxlength(MaxLength)), + retractall(lastrule(_)), + assertz(lastrule([])), + retractall(furthest_rule(_)), + assertz(furthest_rule([0,[]])), + %%retractall(newrulenumber(_)), + %%assertz(newrulenumber(0)), + retractall(numInputs(_)), + assertz(numInputs(NumInputs)), + retractall(numOutputs(_)), + assertz(numOutputs(NumOutputs)), + + /**catch(call_with_time_limit(10, + **/caw01(VarLists,_,Predicates,PredicateName,Rules3,MaxLength,MaxPredicates,0,_,Program1,Program2B,_) + + %%writeln1(Program2B) + /**), + 
time_limit_exceeded, + fail) + **/ + .%%,not(length(Program2B,1)). + +%%caw01([],_,_Predicates,_PredicateName,_Rules3,_MaxLength,_MaxPredicates,_New_rule_number1,_New_rule_number2,Program,Program) :- !. %% Recently added ***** + +caw01([],[],_Predicates,_PredicateName,_Rules3,_MaxLength,_MaxPredicates,_New_rule_number1,_New_rule_number2,_Program1,_Program2,_) :- +%%writeln(here1), +%%writeln1(["Error: No specification for predicate. Program so far:",Program1,"\n\nPredicate number:",New_rule_number]),abort,!. + fail,!. +caw01([[]],[[]],_Predicates,_PredicateName,_Rules3,_MaxLength,_MaxPredicates,_New_rule_number1,_New_rule_number2,Program,Program,_V3) :- %%writeln(here2), +fail. +%%caw01(_VarLists,_Predicates,_PredicateName,_Rules3,_MaxLength,MaxPredicates,New_rule_number,Program,Program) :- New_rule_number=trace;true), + VarLists=[VarLists0|VarLists02], + VarLists0=[VarLists1|VarLists2], + + %%retractall(varlists(_)), + %%assertz(varlists(VarLists02)), + +%%trace, + %%writeln1( findall(Program2A,caw0(Predicates,PredicateName, + %%Rules3,MaxLength,MaxPredicates, + %%VarLists1,VarLists02,New_rule_number1,New_rule_number2,Program1,Program2A),Program2)), +%%notrace, + + findall([Program2A,VarLists041,New_rule_number2A],caw0(Predicates,PredicateName, + Rules3,MaxLength,MaxPredicates, + VarLists1,VarLists02,VarLists041,New_rule_number1,New_rule_number2A,Program1,Program2A,_V),Program2VarLists04), + +findall(Program21,(member(MemberProgram2VarLists04,Program2VarLists04),MemberProgram2VarLists04=[Program21,_,_]),Program2), + +findall(VarLists041,(member(MemberProgram2VarLists041,Program2VarLists04),MemberProgram2VarLists041=[_,VarLists041,_]),VarLists04), + +findall(New_rule_number2A1,(member(MemberProgram2VarLists041New_rule_number2A1,Program2VarLists04),MemberProgram2VarLists041New_rule_number2A1=[_,_,New_rule_number2A1]),New_rule_number2A2), + +%%trace, + + %%writeln1( findall(Program2A,caw0(Predicates,PredicateName, + %%Rules3,MaxLength,MaxPredicates, + 
%%VarLists1,VarLists02,New_rule_number1,New_rule_number2,Program1,Program2A),Program2)), + %%notrace, + length(Program2,Program2L), + length(Program2LList,Program2L), + append(Program2LList,_,[1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20,21,22,23,24,25,26,27,28,29,30,31,32,33,34,35,36,37,38,39,40,41,42,43,44,45,46,47,48,49,50,51,52,53,54,55,56,57,58,59,60,61,62,63,64,65,66,67,68,69,70,71,72,73,74,75,76,77,78,79,80,81,82,83,84,85,86,87,88,89,90,91,92,93,94,95,96,97,98,99,100]), + + member(Program2LListItem,Program2LList), + get_item_n(Program2,Program2LListItem,Program2B), + get_item_n(VarLists04,Program2LListItem,VarLists03), + get_item_n(New_rule_number2A2,Program2LListItem,New_rule_number2), + %%member(Program2B,Program2), %e->1 + %%member(VarLists03,VarLists04), %e->1 +%%writeln1(Program2B1), +%%writeln1(Program2B), + %%Program2B1=[Program2B], + + %%trace, + + %%writeln1(member(Program2B,Program2)), + %%notrace, + + + aggregate_all(count,(member(Item,VarLists2), + caw0(Predicates,PredicateName,Rules3,MaxLength,MaxPredicates, + Item,VarLists03,_VarLists031,New_rule_number1,_New_rule_number2A3,Program1,Program2B,_V2)),Count1), + + %%trace, + + %%writeln1( aggregate_all(count,(member(Item,VarLists2), + %%caw0(Predicates,PredicateName,Rules3,MaxLength,MaxPredicates, + %%Item,VarLists02,New_rule_number,Program1,Program2B)),Count)), +%%notrace, +%%writeln1(Program2), +%%trace, + +%%writeln1(length(VarLists2,Count)), +%%notrace, + + %%(findall(EVM1,(everyvarmentioned(Vars2,Program5), + +length(VarLists2,Count2), +Count1>=Count2, + +%%(Program2B=[[[n,1],[[v,a],[v,b],[v,c]],":-",[[[n,+],[[v,a],[v,b],[v,d]]],[[n,=],[[v,d],[v,c]]]]]]->true%%trace +%%;true), +!.%%!.%%!. %%Predicates->PredicatesA x + +get_item_n(List1,N1,Item) :- + N2 is N1-1, + length(List3,N2), + append(List3,List4,List1), + List4=[Item|_]. 
+ caw0(Algorithms,PredicateName,Rules,MaxLength,MaxPredicates,VarLists,VarLists02,VarLists03,New_rule_number1,New_rule_number2,Program1,Program2,_V3) :- + VarLists=[InputVarList,OutputVarList,Positivity], + varnames(InputVarList,[],InputVars,[],InputValues), + varnames(OutputVarList,[],OutputVars,[],_OutputValues), + %%retractall(outputvars(_)), + %%assertz(outputvars(OutputVars)), + append(InputVars,OutputVars,Vars11), +%%Vars11=InputVars, +%%Vars12=InputVars, + append(InputValues,OutputVars,Vars2), + %%append(InputValues,OutputValues,Values), + Query=[[n,PredicateName],Vars2], + %%writeln( caw(Algorithms,Query,PredicateName,Rules,MaxLength,MaxPredicates,Vars11,InputVars,InputVars,_,OutputVarList,OutputVars,Positivity,VarLists02,New_rule_number1,New_rule_number2,Program1,Program2)), + caw(Algorithms,Query,PredicateName,Rules,MaxLength,MaxPredicates,Vars11,InputVars,InputVars,_,OutputVarList,OutputVars,Positivity,VarLists02,VarLists03,New_rule_number1,New_rule_number2,Program1,Program2,_V). +%%caw(_,_,_,_,_,N,_,_,_,_,_,_,_,_,_,_,N,P,P) :- fail,!. %% Turn off fail,! to have all solutions + /**caw(_Algorithms,_Query,_PredicateName,_Rules,_MaxLength,MaxPredicates,_Vars11,_InputVars,_InputVarsa,_InputVars2,_OutputVarList,_OutputVars,_Positivity,_VarLists02,New_rule_number1,New_rule_number2,_Program1,_Program11):- + %%writeln(caw_here(Algorithms,Query,PredicateName,Rules,MaxLength,MaxPredicates,Vars11,InputVars,InputVars,InputVars2,OutputVarList,OutputVars,Positivity,VarLists02,New_rule_number1,New_rule_number2,Program1,Program1)), + %%MaxPredicates1 is MaxPredicates-1, + writeln(["New_rule_number>MaxPredicates1",New_rule_number,">",MaxPredicates]), + New_rule_number>MaxPredicates, + fail,!. +**/ +%%caw(_,_,_,_,_,_N,_,_,_,_,_,_,_,_,[],_,_N2?,P,P) :- !. +caw(_,_,_,_,0,_,_,_,_,_,_,_,_,_,_,_,_,_,_,_F) :- fail, !. %% Turn off fail,! 
to have all solutions +caw(Algorithms1,Query,PredicateName,_Rules,_MaxLength,MaxPredicates,_VarList,InputVars1,InputVars2,_InputVarsa,VarLists,OutputVars,Positivity,VarLists02,VarLists02,New_rule_number1,New_rule_number1,Program1,Program2,_V) :- +%%writeln1(V), + + %%MaxLength>0, *** + +%%MaxPredicates1 is MaxPredicates, +New_rule_number1=3 +%%writeln([optimise(Program1,InputVars1,InputVars3,PenultimateVars,Program4)]), + append(Program1,Program3,Program5), + not(Program5=[]), + + append(InputVars1,OutputVars,Vars2), + Program22=[ + [[n,PredicateName],Vars2,":-", + Program5 + ] + ], + %%writeln1(interpret-short(Program22)), + + %%everyvarmentioned(Vars2,Program5), + %%((PredicateName=1;PredicateName=2)->trace;true), + /** + not(Program22=[ + [[n,add],_,_, + [[[n,+],_]|_] + ] + ]), + **/ + %%(Program22=[[[n,1],[[v,a],[v,b],[v,c]],":-",[[[n,+],[[v,a],[v,b],[v,d]]],[[n,=],[[v,d],[v,c]]]]]],trace), + + eliminate_unused_predicates(Program22,Algorithms1,Algorithms2), + %%writeln(eliminate_unused_predicates(Program22,Algorithms1,Algorithms2)), + + %%Algorithms2=[[[n,_],_,_,Body]|_],length(Body,1), + %%(Program22=[[[n,function0],[[v,a],[v,b],[v,c]],":-",[[[n,function2],[[v,a],[v,b],[v,d]]],[[n,=],[[v,c],[v,d]]]]]]->writeln(eliminate_unused_predicates(Program22,Algorithms1,Algorithms2));true), + + %%trace, + %%writeln(["1*",append(Algorithms2,Program22,Program2)]), %% ***** swapped a2,p22 + %%[Program23]=Program22, + %%not(member(Program23,Algorithms2)), + append(Algorithms2,Program22,Program2), %% ***** swapped a2,p22 + %%remvdup(Program2a,[],Program2), + %%Algorithms2=Program2, + + + %%not(Program2=[[[n,add0],[[v,a],[v,b],[v,c]],":-",[[[n,+],[[v,a],[v,b],[v,d]]],[[n,=],[[v,d],[v,c]]]]]]), +%%(Program2=[[[n,1],[[v,a],[v,b],[v,c]],":-",[[[n,+],[[v,a],[v,b],[v,d]]],[[n,=],[[v,d],[v,c]]]]]]->trace;true), + + %%length(Program2,Program2L),not(Program2L=1), + %%not(New_rule_number=1), + %%=trace;true), + + %%append_last_rule(Program2), + + /** + 
length(Program2,Program2L),(Program2L>=2->(%%writeln(here2),trace + true);true), + **/ + debug(Debug), + + %%([off,[[n,add0],[[1,2,3],[v,b]]],[[[n,add2],[[v,a],[v,b]],":-",[[[n,=],[[v,a],[]]],[[n,=],[[v,b],[]]]]],[[n,add0],[[v,a],[v,b]],":-",[[[n,add2],[[v,a],[v,c]]],[[n,add0],[[v,a],[v,d]]],[[n,=],[[v,d],[v,b]]]]]],[[[[v,b],[]]]]]=[Debug,Query,Program2,[VarLists]]->%%true + %%trace + %%;true), + +%%([off,[[n,function3],[1,1,[v,c]]],[[[n,function1],[[v,a],[v,b],[v,c]],":-",[[[n,+],[[v,a],[v,b],[v,c]]]]],[[n,function3],[[v,a],[v,b],[v,c]],":-",[[[n,function1],[[v,a],[v,b],[v,d]]],[[n,=],[[v,d],[v,c]]]]]],[[[[v,c],2]]]]=[Debug,Query,Program2,[VarLists]]->true + %%trace + %%;true), + +%%writeln1([V,"\n",Program2]), +%%writeln1(V), +%%writeln1([program2,Program2]), +%%writeln1([v,V]), + +%% add 0s +%%(var(V)->true;(member([[n,add0],[[v,a],[v,b]],":-",[[[n,add3],[[v,a],[v,c]]],[[n,add0],[[v,c],[v,d]]],[[n,=],[[v,d],[v,b]]]]],V)->writeln1("v yes");true)), +%%**()(V=[[[n,add3],[[v,a],[v,b]],":-",[[[n,tail],[[v,a],[v,b]]]]],[[n,add0],[[v,a],[v,b]],":-",[[[n,add3],[[v,a],[v,c]]],[[n,add0],[[v,c],[v,d]]],[[n,=],[[v,d],[v,b]]]]]]->writeln1("v yes");true), + +%%(var(Program2)->true;(member([[n,add0],[[v,a],[v,b]],":-",[[[n,add2],[[v,a],[v,c]]],[[n,=],[[v,c],[v,b]]]]],Program2)->writeln1("p yes");true)), +%%**()(Program2=[[[n,add2],[[v,a],[v,b]],":-",[[[n,=],[[v,a],[]]],[[n,=],[[v,b],[]]]]],[[n,add0],[[v,a],[v,b]],":-",[[[n,add2],[[v,a],[v,c]]],[[n,=],[[v,c],[v,b]]]]]]->writeln1("p yes");true), + +%%vyes(Program2), + +%%(((var(V)->true;(member([[n,add0],[[v,a],[v,b]],":-",[[[n,add3],[[v,a],[v,c]]],[[n,add0],[[v,c],[v,d]]],[[n,=],[[v,d],[v,b]]]]],V))),(var(Program2)->true;(member([[n,add0],[[v,a],[v,b]],":-",[[[n,add2],[[v,a],[v,c]]],[[n,=],[[v,c],[v,b]]]]],Program2))))->trace;true), + +%%**()((not(var(V)),V=[[[n,add3],[[v,a],[v,b]],":-",[[[n,tail],[[v,a],[v,b]]]]],[[n,add0],[[v,a],[v,b]],":-",[[[n,add3],[[v,a],[v,c]]],[[n,add0],[[v,c],[v,d]]],[[n,=],[[v,d],[v,b]]]]]], 
+%%Program2=[[[n,add2],[[v,a],[v,b]],":-",[[[n,=],[[v,a],[]]],[[n,=],[[v,b],[]]]]],[[n,add0],[[v,a],[v,b]],":-",[[[n,add2],[[v,a],[v,c]]],[[n,=],[[v,c],[v,b]]]]]])->trace;true), + %%trace; + %%true), + + %%writeln([program2,Program2]), + %%(length(Program5,4)->writeln([program2,Program2])), + %%(Program2=[[[n,add0],[[v,a],[v,b],[v,c],[v,d],[v,e]],":-",[[[n,+],[[v,a],[v,b],[v,f]]],[[n,+],[[v,c],[v,f],[v,g]]],[[n,+],[[v,d],[v,g],[v,h]]],[[n,=],[[v,h],[v,e]]]]]]->trace;true), + %%writeln(["Press c."]),(get_single_char(97)->true;true), + + %%writeln1([interpret(Debug,Query,Program2,OutputVarList)]), + + %writeln1(interpret(Debug,Query,Program2,VarLists)), + + +%%([Debug,Query,Program2,[VarLists]]=[off,[[n,add],[[],[1,2],[v,d]]],[[[n,add],[[v,a],[v,c],[v,d]],":-",[[[n,[]],[[v,a]]],[[n,=],[[v,c],[v,d]]]]]],[[[[v,d],[1,2]]]]]->trace;true), + +%%([Debug,Query,Program2,[VarLists]]=[off,[[n,add0],[[1,2,3],[v,b]]],[[[n,add2],[[v,a],[v,b]],":-",[[[n,=],[[v,a],[]]],[[n,=],[[v,b],[]]]]],[[n,add0],[[v,a],[v,b]],":-",[[[n,add2],[[v,a],[v,c]]],[[n,add2],[[v,a],[v,d]]],[[n,=],[[v,d],[v,b]]]]]],[[[[v,b],[]]]]]->trace;true), + +%%(Program2=[[[n,add2],[[v,a],[v,b]],":-",[[[n,=],[[v,a],[]]],[[n,=],[[v,b],[]]]]],[[n,add0],[[v,a],[v,b]],":-",[[[n,add2],[[v,a],[v,c]]],[[n,add2],[[v,a],[v,d]]],[[n,=],[[v,d],[v,b]]]]]]->trace;true), + %%interpret(Debug,Query,Program2,OutputVarList). 
+ %%aggregate_all(count,(member(Item,VarLists), + %%varlists(VarLists02), + %%( + + %%notrace, + try_interpret(Positivity,Debug,Query,Program2,VarLists),%%trace,%%writeln(here), + %%trace, + %%trace, + add_furthest_rule1(New_rule_number1,Program2), + %%(Program2=[[[n,add],[[v,a],[v,b],[v,c],[v,d]],":-",[[[n,=],[[v,c],[v,d]]]]]]->true%%trace + %%;true), + %%trace, + (no_singletons(Vars2,Program5)), + + %%writeln1(Program2), + %%writeln1([cawptest,passed,Program2]),abort, + %%->true;(%%notrace,fail)), + %%writeln1(interpret(Debug,Query,Program2,[VarLists])), + +%%(furthest_rule(A)->writeln(furthest_rule(A));true),%%notrace, + %%!. + !.%%!. + %%-> %% *** [VarLists] ? + /**(VarLists03=VarLists02, + %%retractall(varlists(_)), + %%assertz(varlists(VarLists03)), + Program2c=Program2);%%fail%% + (Program2c=[],VarLists03=VarLists02,%%append(VarLists,VarLists02,VarLists03), + retractall(varlists(_)), + assertz(varlists(VarLists03)),fail + ) + ),!.%%),Count), **** [VarLists02] to VarLists02 + %%length(OutputVarList,Count),!. + **/ + /** + + vyes(P):-%%((not(var(V)), + (member([[n,add0],[[v,a],[v,b]],":-",[[[n,add3],[[v,a],[v,c]]],[[n,add0],[[v,c],[v,d]]],[[n,=],[[v,d],[v,b]]]]],P),writeln("pyes*****")).%%,(var(Program2)->true;(member([[n,add0],[[v,a],[v,b]],":-",[[[n,add2],[[v,a],[v,c]]],[[n,=],[[v,c],[v,b]]]]],Program2)))). + vyes(V):-%%((not(var(V)), + (member([[n,add0],[[v,a],[v,b]],":-",[[[n,add2],[[v,a],[v,c]]],[[n,=],[[v,c],[v,b]]]]],V)),writeln("vyes*****")%%,(var(Program2)->true;(member([],Program2)))). +. + vyes(V):- +true. 
+**/ +caw(Algorithms,Query,PredicateName,Rules,MaxLength,MaxPredicates,VarList,InputVars1,InputVars2,InputVars3,VarLists,OutputVars,Positivity,VarLists02,VarLists03,New_rule_number1,New_rule_number2,Program1,Program4,_V) :- +%%writeln(here4), +%%trace, + +%%writeln1(caw(Algorithms,Query,PredicateName,Rules,MaxLength,MaxPredicates,VarList,InputVars1,InputVars2,InputVars3,VarLists,OutputVars,Positivity,VarLists02,New_rule_number1,New_rule_number2,Program1,Program4)), +%%trace, +%%writeln([caw(Query,PredicateName,Rules,MaxLength,MaxPredicates,VarList,InputVars1,InputVars2,OutputVarList,OutputVars,Program1,Program4)]),? + MaxLength>=0, %%*** + MaxLength2 is MaxLength - 1, + %%reverse(InputVars2,InputVars5), + +%%writeln([new_rule_number,New_rule_number,maxPredicates,MaxPredicates]), +%%writeln(limit_reached(New_rule_number,MaxPredicates,Rules,PredicateName,InputVars1,OutputVars,Rules1)), +%%trace, +limit_reached(New_rule_number1,MaxPredicates,Rules,PredicateName,InputVars1,OutputVars,Rules1), %% *** Check these +%%writeln([rules1,Rules1]), + %%repeat, + %%writeln(limit_reached(New_rule_number,MaxPredicates,Rules,PredicateName,InputVars1,OutputVars,Rules1)), +%%get_char(_), + %%trace, + + + member([RuleName0,NumInputs0,NumOutputs0],Rules1), + + %%[RuleName0,NumInputs0,NumOutputs0]=[other_new_branch,_,_], + +%% **/ + %%RuleName0=newrule123, +%%writeln([member([RuleName,NumInputs,NumOutputs],Rules)]), +%%writeln([rule(RuleName,NumInputs,NumOutputs,VarList,VarList2,Rule)]), + + + %%retractall(newrulenumber(_)), + %%assertz(newrulenumber(Newrulenumber1)), + + %%[InputVars2,VarLists,Positivity]=[VarLists0311,VarLists0312,_VarLists0313], *** + %% **** InputVars1 or InputVars2? 
+ %%length(VarLists0311,VarLists0311L), + %%length(VarLists0312,VarLists0312L), + + %%***newbranchifcall(RuleName0,PredicateName,Itema), + numInputs(NumInputs1a),numOutputs(NumOutputs1a), + member(NumInputs1,NumInputs1a),%%[1,2,3]),%%,0,2,3]), %%*** 4 + member(NumOutputs1,NumOutputs1a),%%[0,1]),%%,0,2,3]), %%*** 4 + + %%writeln([i,o,NumInputs1,NumOutputs1]), + %%*** caw x when creates a new pred, num extra clauses 0-2, doesn't add rule to pred, returns vl03 for (rules in) this pred + caw3(RuleName0,Algorithms,Algorithms2,NumInputs0,NumOutputs0,New_rule_number1,New_rule_number3,Rules,PredicateName,Program1,VarList,VarList2,InputVars2,InputVars4,OutputVars,Rule,NumInputs1,NumOutputs1,VarLists02,VarLists04,MaxPredicates,Rules2), + + +%%->true +%%; + %% InputVars5->InputVars2 +%%writeln([rule(RuleName,NumInputs,NumOutputs,InputVars1,InputVars3,VarList,VarList2,Rule)]), + %%writeln(not(member(Rule,Program1))), + not(member(Rule,Program1)), %% Need to stop repeats of arity 1 calls + append(Program1,[Rule],Program3), +%%writeln([inputVars3,InputVars3]), +%%InputVars2=InputVars3, +%%writeln([program4,Program4]), + + %%retractall(varlists(_)), + %%assertz(varlists(VarLists02)), + %%writeln1( caw(Algorithms2,Query,PredicateName,Rules2,MaxLength2,MaxPredicates,VarList2,InputVars1,InputVars4,InputVars3,VarLists,OutputVars,Positivity,VarLists02,New_rule_number3,New_rule_number2,Program3,Program4)), + caw(Algorithms2,Query,PredicateName,Rules2,MaxLength2,MaxPredicates,VarList2,InputVars1,InputVars4,InputVars3,VarLists,OutputVars,Positivity,VarLists04,VarLists03,New_rule_number3,New_rule_number2,Program3,Program4,_Algorithms2). 
+ +caw3(RuleName0,Algorithms,Algorithms2,NumInputs0,NumOutputs0,New_rule_number,New_rule_number1,Rules,PredicateName,Program1,VarList,VarList2,InputVars2,InputVars4,OutputVars,Rule,_NumInputs1,_NumOutputs1,VarLists02,VarLists02,_MaxPredicates,Rules2) :- + + %% rules_existing + not(RuleName0=predicatename_new_branch),not(RuleName0=other_new_branch),not(RuleName0=[rules_new_branch,_]), + not(RuleName0=predicatename_existing), + %%trace, + find_rule_name(RuleName0,RuleName2), + Algorithms2=Algorithms,%%RuleName=RuleName0, +NumInputs=NumInputs0,NumOutputs=NumOutputs0,New_rule_number1 = New_rule_number,Rules2=Rules, + +rulename_if_limit(RuleName2,PredicateName,RuleName), + rule(Program1,RuleName,NumInputs,NumOutputs,InputVars2,InputVars4,VarList,VarList2,OutputVars,Rule). + + +caw3(RuleName0,Algorithms,Algorithms2,NumInputs0,NumOutputs0,New_rule_number,New_rule_number1,Rules,PredicateName,Program1,VarList,VarList2,InputVars2,InputVars4,OutputVars,Rule,NumInputs1,NumOutputs1,VarLists02,VarLists03,MaxPredicates,Rules2) :- + +%%Number_of_clauses2=1, %% Not tested nonrecursive multiclauses - use bottom up +member(Number_of_clauses1,[1]),%%,3,4]), +(Number_of_clauses1>=MaxPredicates->Number_of_clauses2=MaxPredicates;Number_of_clauses2=Number_of_clauses1), + +create_multiple_nonrecursive_clauses(Number_of_clauses2,RuleName0,Algorithms,Algorithms2,NumInputs0,NumOutputs0,New_rule_number,New_rule_number1,Rules,PredicateName,Program1,VarList,VarList2,InputVars2,InputVars4,OutputVars,Rule,NumInputs1,NumOutputs1,VarLists02,VarLists03,MaxPredicates,Rules2). + + + +create_multiple_nonrecursive_clauses(0,_RuleName0,Algorithms,Algorithms,_NumInputs0,_NumOutputs0,New_rule_number,New_rule_number,Rules,_PredicateName,_Program1,VarList,VarList,InputVars,InputVars,_OutputVars,_Rule, _NumInputs1,_NumOutputs1,VarLists02,VarLists02,_MaxPredicates,Rules) :- +!. 
+ +create_multiple_nonrecursive_clauses(Number_of_clauses1,RuleName0,Algorithms,Algorithms2,NumInputs0,NumOutputs0,New_rule_number1,New_rule_number3,Rules,PredicateName,Program1,VarList,VarList2,InputVars2,InputVars4,OutputVars,Rule,NumInputs1,NumOutputs1,VarLists02,VarLists03,MaxPredicates,Rules2) :- + + +caw4a(RuleName0,New_rule_number1,Rules,PredicateName,NumInputs0,NumOutputs0,NumInputs1,NumOutputs1,Rules3,New_rule_number4,RuleName), +%%writeln(before4b), +%%trace, +caw4b(VarLists02,VarLists04,Algorithms,New_rule_number5,MaxPredicates,New_rule_number4,Program1,RuleName,NumInputs1,NumOutputs1,InputVars2,InputVars5,VarList,VarList3,OutputVars,Rule,Algorithms3,Rules3), +%%trace, + +Number_of_clauses2 is Number_of_clauses1-1, +create_multiple_nonrecursive_clauses(Number_of_clauses2,RuleName0,Algorithms3,Algorithms2,NumInputs0,NumOutputs0,New_rule_number5,New_rule_number3,Rules3,PredicateName,Program1,VarList3,VarList2,InputVars5,InputVars4,OutputVars,Rule,NumInputs1,NumOutputs1,VarLists04,VarLists03,MaxPredicates,Rules2). + + + +caw4a(RuleName0,New_rule_number,Rules,_PredicateName,_NumInputs0,_NumOutputs0,NumInputs1,NumOutputs1,Rules2,New_rule_number1,RuleName) :- + + RuleName0=other_new_branch, %% Command from Rules with new branch + +%%trace, + New_rule_number1 is New_rule_number+1, +%%not(New_rule_number1=2), + +RuleName=[n,New_rule_number1], + + append(Rules,[[RuleName,NumInputs1,NumOutputs1]],Rules2) +. + +caw4a(RuleName0,New_rule_number,Rules,PredicateName,NumInputs0,NumOutputs0,_NumInputs1,_NumOutputs1,Rules2,New_rule_number1,RuleName) :- + RuleName0=predicatename_new_branch, %% New branch with same name as current predicate +RuleName=[n,PredicateName], + New_rule_number1 is New_rule_number+1, + append(Rules,[[RuleName,NumInputs0,NumOutputs0]],Rules2) + +. 
+
+caw4a(RuleName0,New_rule_number,Rules,_PredicateName,NumInputs0,NumOutputs0,_NumInputs1,_NumOutputs1,Rules2,New_rule_number1,RuleName) :-
+ RuleName0=[rules_new_branch,RuleName], %% New branch with the rule name given in Rules
+%%trace,
+%%RuleName=[n,RuleName1],
+ New_rule_number1 is New_rule_number+1,
+ append(Rules,[[RuleName,NumInputs0,NumOutputs0]],Rules2)
+.
+
+caw4a(RuleName0,New_rule_number,Rules,PredicateName,NumInputs0,NumOutputs0,_NumInputs1,_NumOutputs1,Rules2,New_rule_number1,RuleName) :-
+ RuleName0=predicatename_existing,
+ %%trace,
+ find_rule_name(RuleName0,RuleName2),
+ %%Algorithms2=Algorithms,
+ %%RuleName=RuleName0, *** not this
+%%NumInputs1=NumInputs0,NumOutputs1=NumOutputs0,%% n i,o epsilon to 1
+New_rule_number1 = New_rule_number,%%Rules2=Rules,
+
+rulename_if_limit(RuleName2,PredicateName,RuleName),
+ append(Rules,[[RuleName,NumInputs0,NumOutputs0]],Rules2)
+.
+ %%rule(Program1,RuleName,NumInputs,NumOutputs,InputVars2,InputVars4,VarList,VarList2,OutputVars,Rule).
+ + +caw4b(VarLists02,VarLists03,Algorithms,New_rule_number2,MaxPredicates,New_rule_number1,Program1,RuleName,NumInputs1,NumOutputs1,InputVars2,InputVars4,VarList,VarList2,OutputVars,Rule,Algorithms2,Rules2) :- + %%writeln1(append(Rules,[[RuleName,NumInputs0,NumOutputs0]],Rules2)), + %%MaxPredicates2 is MaxPredicates-1, + + maxlength(MaxLength3), + +%%writeln( caw01(VarLists02,Algorithms,New_rule_number1,New_rule_number2,Rules2,MaxLength3,MaxPredicates,New_rule_number1,[],Program2)), + %%writeln(here3), + %%trace, + %%writeln( RuleName=[_,RuleName1]), + RuleName=[_,RuleName1], +caw01(VarLists02,VarLists03,Algorithms,%%New_rule_number1 + RuleName1 +,Rules2,MaxLength3,MaxPredicates,New_rule_number1,New_rule_number2,[],Program2,_Program12), %% *** VarLists02 + %%writeln(here4), +%%to [VarLists02] +%%Program2=[[[[n,1],[[v,a],[v,b],[v,c]],":-",[[[n,+],[[v,a],[v,b],[v,d]]],[[n,=],[[v,d],[v,c]]]]]]], +%%writeln([caw01,Program2]), +%%trace, +rule(Program1,RuleName,NumInputs1,NumOutputs1,InputVars2,InputVars4,VarList,VarList2,OutputVars,Rule), +%%trace, + %%writeln(["2*",append(Program2,Algorithms,Algorithms2)]), %% *** swapped a,p2 + %%[Program2a]=Program2, + %%not(member(Program2a,Algorithms)), + %append(Program2,Algorithms,Algorithms2) %% *** swapped a,p2 + Program2=Algorithms2 + . + +find_rule_name(RuleName0,RuleName2) :- + RuleName0=[_,RuleName1],RuleName2=RuleName1. +find_rule_name(RuleName0,RuleName2) :- + not(RuleName0=[_,_RuleName1]),RuleName2=RuleName0. + +try_interpret(Positivity,Debug,Query,Program2,VarLists) :- + Positivity=true,catch(call_with_time_limit(0.05, + international_interpret([lang,"en"],Debug,Query,Program2,[VarLists])), + time_limit_exceeded, + fail),!. + +try_interpret(Positivity,Debug,Query,Program2,VarLists) :- + not(Positivity=true),catch(call_with_time_limit(0.05, + not(international_interpret([lang,"en"],Debug,Query,Program2,[VarLists]))), + time_limit_exceeded, + fail),!. 
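+%% Illustrative note (added; not part of the generated code): find_rule_name/2
+%% strips a [rules_existing,Name]-style wrapper and leaves plain names unchanged:
+%% ?- find_rule_name([rules_existing,[n,member]],R).
+%% R = [n,member].
+%% ?- find_rule_name(other_new_branch,R).
+%% R = other_new_branch.
+%% try_interpret/5 runs the candidate program under a 0.05-second time limit and
+%% treats a timeout as failure, so non-terminating candidates are discarded.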
+ +append_last_rule(Program2):- + lastrule(LastRule1), + append_last_rule1(Program2,LastRule1). + +%%append_last_rule1(_Program2,LastRule1) :- +%% LastRule1=[],!. +append_last_rule1(Program2,LastRule1) :- +%% not(LastRule1=[]), + not(member(Program2,LastRule1)), + append(LastRule1,[Program2],LastRule2), + retractall(lastrule(_)), + assertz(lastrule(LastRule2)). + +add_furthest_rule1(New_rule_number,Program1) :- + furthest_rule(Rule),Rule=[Number,Rules], + add_furthest_rule2(New_rule_number,Number,Rules,Program1). +/**add_furthest_rule2(New_rule_number,Number,_Rules,Program1) :- + New_rule_number>Number, + retractall(furthest_rule(_)), + assertz(furthest_rule([New_rule_number,[Program1]])),!. + **/ +add_furthest_rule2(New_rule_number,_Number,Rules,Program1) :- +%%trace, + %%New_rule_number=Number, + retractall(furthest_rule(_)), + delete(Rules,Program1,Rules1), + append(Rules1,[Program1],Rules2), + assertz(furthest_rule([New_rule_number,Rules2])). +/**add_furthest_rule2(New_rule_number,Number,_Rules,_Program1) :- + New_rule_number(true);true), + C=[_E,D], + member(Vars2,D) + + + %%Vars2=true + )),B),not(B=[]), + everyvarmentioned1(Vars3,Program). +**/ + +no_singletons(Vars1,Program):- + findall(DA,(member(C,Program),C=[_E,D],member(DA,D)),Vars2), + %%append_list(Vars2,Vars2A), + append(Vars1,Vars2,Vars3), + findall(Count1,(member(Item,Vars3),aggregate_all(count,(member(Item,Vars3)),Count1), + Count1=1),G),G=[]. + + + +/** +underscore_occurs_once_per_var([],_,_Program) :- !. +underscore_occurs_once_per_var(Vars1,Program) :- + + findall(Var,(aggregate_all(count,(member(Var,Vars1), + member(C,Program),C=[[n,"_"],[Var]]),Count),Count=<1),List), + not(List=[]). +**/ + +limit_reached(New_rule_number,MaxPredicates,Rules0,PredicateName,InputVars1,OutputVars,Rules1) :- + New_rule_number=MaxPredicates, + length(InputVars1,InputVars1L), + length(OutputVars,OutputVarsL), + pred_already_in_list1(PredicateName,InputVars1L,OutputVarsL,Rules0,Rules1). 
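+%% Illustrative example (added): no_singletons/2 succeeds only when every
+%% variable among the head variables and body arguments occurs more than once:
+%% ?- no_singletons([[v,a],[v,b]],[[[n,+],[[v,a],[v,b],[v,c]]],[[n,=],[[v,c],[v,b]]]]).
+%% true.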
+ + pred_already_in_list1(PredicateName,InputVars1L,OutputVarsL,Rules0,Rules) :- + pred_already_in_list2(PredicateName,InputVars1L,OutputVarsL,Rules0,Rules). + + pred_already_in_list2(PredicateName,InputVars1L,OutputVarsL,Rules0,Rules) :- + member([[n,PredicateName],InputVars1L,OutputVarsL],Rules0), + apply_rules_existing_and_new_branch(Rules0,Rules),!. + pred_already_in_list2(PredicateName,InputVars1L,OutputVarsL,Rules0,Rules) :- + not(member([[n,PredicateName],InputVars1L,OutputVarsL],Rules0)), + apply_rules_existing_and_new_branch(Rules0,Rules01), + append(Rules01,[[predicatename_existing,InputVars1L,OutputVarsL] %% Uncommented for test 7 + ],Rules). + +apply_rules_existing_and_new_branch(Rules1,Rules2) :- +apply_rules_existing_and_new_branch2(Rules1,Rules3), +apply_rules_existing_and_new_branch3(Rules1,Rules4), +append(Rules3,Rules4,Rules2). + +apply_rules_existing_and_new_branch2(Rules1,Rules2) :- + findall([A,C1,C2],(member(B,Rules1),B=[A1,C1,C2],member(D,[rules_existing + ]),A=[D,A1]),Rules2). + +apply_rules_existing_and_new_branch3(Rules1,Rules2) :- + findall([A,C1,C2],(member(B,Rules1),B=[A1,C1,C2],member(D,[%%rules_new_branch + ]),A=[D,A1]),Rules2). + +%%append(Rules,[[predicatename_existing,InputVars1L,OutputVarsL]%%%%,[other_existing,_,_] +%%],Rules1),!. + +/** +newbranchifcall(RuleName0,PredicateName,Itema):- + RuleName0=[n,PredicateName],member(Itema,[useexisting,newbranch]). +newbranchifcall(RuleName0,PredicateName,Itema):- + not(RuleName0=[n,PredicateName]),Itema=useexisting. +**/ + +varnames([],Vars,Vars,Values,Values) :- !. +varnames(VarList,Vars1,Vars2,Values1,Values2) :- + VarList=[Var|Vars3], + Var=[VarName,Value], + append(Vars1,[VarName],Vars4), + append(Values1,[Value],Values3), + varnames(Vars3,Vars4,Vars2,Values3,Values2),!. + +/** +%%addrules(_,_,[],PV,PV,Program,Program) :- !. 
+addrules(VarList,OutputVars1,OutputVars2,PenultimateVars1,PenultimateVars2,Program1,Program3) :- +writeln1(addrules(VarList,OutputVars1,OutputVars2,PenultimateVars1,PenultimateVars2,Program1,Program3)), +%%trace, + OutputVars2=[],%%[OutputVar|OutputVars3], + member(Var,VarList), + %%member(OutputVar,OutputVars1),%%() + append(VarList,OutputVars1,OutputVars4), + member(Var2,OutputVars4), + append(Program1,[[[n,=],[Var,Var2%%OutputVar + ]]],Program3), + append(PenultimateVars1,[Var],PenultimateVars2). + + **/ + /** + %%addrules(VarList,OutputVars1,OutputVars3,PenultimateVars3,PenultimateVars2,Program3,Program2). +addrules(VarList,OutputVars1,OutputVars2,PenultimateVars1,PenultimateVars2,Program1,Program2) :- +%%trace, + OutputVars2=[OutputVar|OutputVars3], + member(Var,VarList), + member(OutputVar,OutputVars1),%%() + append(VarList,OutputVars1,OutputVars4), + member(Var2,OutputVars4), + append(Program1,[[[n,=],[Var,Var2%%OutputVar + ]]],Program3), + append(PenultimateVars1,[Var],PenultimateVars3), + addrules(VarList,OutputVars1,OutputVars3,PenultimateVars3,PenultimateVars2,Program3,Program2). +**/ +addrules(VarList,OutputVars1,OutputVars2,PenultimateVars1,PenultimateVars2,Program1,Program2) :- +find_addrules_outputvars(OutputVars1,OutputVars3), + addrules1(OutputVars3,VarList,OutputVars1,OutputVars2,PenultimateVars1,PenultimateVars2,Program1,Program2).%%;( + +addrules1([],_VarList,_OutputVars1,_OutputVars2,PenultimateVars,PenultimateVars,Program,Program) :- !. +addrules1(OutputVars3,VarList,OutputVars1,OutputVars2,PenultimateVars1,PenultimateVars2,Program1,Program2) :- +OutputVars3=[_OutputVars31|OutputVars32], +addrules2(VarList,OutputVars1,OutputVars2,PenultimateVars1,PenultimateVars3,Program1,Program3), +addrules1(OutputVars32,VarList,OutputVars1,OutputVars2,PenultimateVars3,PenultimateVars2,Program3,Program2). + +addrules2([],_,_,PV,PV,Program,Program) :- !. +addrules2([[v,_]],_,_,PV,PV,Program,Program) :- !. 
+addrules2(VarList,OutputVars1,_OutputVars2,PenultimateVars1,PenultimateVars2,Program1,Program3) :- %%*** Underscored OutputVars2 +%%writeln1(addrules(VarList,OutputVars1,OutputVars2,PenultimateVars1,PenultimateVars2,Program1,Program3)), +%%trace, + %%OutputVars2=[],%%[OutputVar|OutputVars3], + member(Var,VarList), + %%member(OutputVar,OutputVars1),%%() + append(VarList,OutputVars1,OutputVars4), + member(Var2,OutputVars4), + not(Var=Var2), + append(Program1,[[[n,=],[Var,Var2%%OutputVar + ]]],Program3), + append(PenultimateVars1,[Var],PenultimateVars2). + +find_addrules_outputvars(OutputVars1,OutputVars3) :- + OutputVars1=[],OutputVars3=[_],!. +find_addrules_outputvars(OutputVars1,OutputVars3) :- + not(OutputVars1=[]),OutputVars3=OutputVars1. +%% optimise([[append,[a,a,d]],[append,[a,a,e]],[append,[a,a,f]],[append,[a,b,g]]],[g],P). +/**** +optimise(Program1,InputVars1,InputVars2,PenultimateVars,Program2) :- + findrulesflowingtopv1(Program1,InputVars1,InputVars2,PenultimateVars,[],Rules,true), + %%findrulesflowingtopv1a(Program1,_Program32,InputVars1,InputVars2,PenultimateVars,[],_Rules1), + intersection(Program1,Rules,Program3), + unique1(Program3,[],Program2). +findrulesflowingtopv1(_,_,_,[],Rules,Rules,false). +findrulesflowingtopv1(Program0,InputVars1,InputVars2,Var,Rules1,Rules2,IV1Flag1) :- + (Var=[v,_]),%%***;length(Var,1)), + findrulesflowingtopv20(Program0,Program0,InputVars1,InputVars2,Var,Rules1,Rules2,IV1Flag1). +findrulesflowingtopv1(Program0,InputVars1,InputVars2,Vars1,Rules1,Rules2,IV1Flag1) :- + Vars1=[Var|Vars2], + findrulesflowingtopv20(Program0,Program0,InputVars1,InputVars2,Var,Rules1,Rules3,IV1Flag2), + findrulesflowingtopv1(Program0,InputVars1,InputVars2,Vars2,Rules3,Rules2,IV1Flag3), + iv1flagdisjunction(IV1Flag2,IV1Flag3,IV1Flag1). + +%%findrulesflowingtopv2([],Program,Program,_,_,Rules,Rules). +findrulesflowingtopv20(_,[],_InputVars1,_InputVars2,_Var,Rules,Rules,false). 
+findrulesflowingtopv20(Program0,Rules4,InputVars1,InputVars2,Var,Rules1,Rules2,IV1Flag1) :- + Rules4=[Rule|Rules], + (findrulesflowingtopv2(Program0,Rule,InputVars1,InputVars2,Var,Rules1,Rules3,IV1Flag2)->true;(Rules3=Rules1,IV1Flag2=false)), + %%delete(Program0,Rule,Program1), + findrulesflowingtopv20(Program0,Rules,InputVars1,InputVars2,Var,Rules3,Rules2,IV1Flag3),%%p1->0 + iv1flagdisjunction(IV1Flag2,IV1Flag3,IV1Flag1). +%%findrulesflowingtopv2(_,[],[],_,_,_,Rules,Rules). +findrulesflowingtopv2(Program0,Rule,InputVars1,InputVars2,Var,Rules1,Rules2,IV1Flag1) :- + Rule=[[n,_PredicateName],Vars], + restlast(Vars,[],Rest,Var), + %%delete(Program1,[PredicateName,Vars],Program2), + %%Program2=Program1, + %%(not(intersection(Rulesx,Rules1))-> x + %% append, append, unique1 + %%append(Rules1,[Rule],Rules3);Rules3=Rules1), + + %%member(Var2,Rest), + %%member(Var2,InputVars1), + + length(Rest,Length1), Length1>=1, + subtract(Rest,InputVars1,IV3s), + length(IV3s,Length3), + subtract(Rest,IV3s,IV1s), + length(IV1s,Length2), Length2>=1, + subtract(IV3s,InputVars2,[]), + + IV1Flag2=true, + + %%delete(Program0,Rule,Program1), + + %%(delete(Program0,Rule,Program3), + %%iv3s1(IV3s,Program3,IV3s,[]), + (Length3>=1-> + (findrulesflowingtopv1(Program0,InputVars1,InputVars2,IV3s,[],Rules5,IV1Flag3),not(Rules5=[])); + (Rules5=[],IV1Flag3=false)), + iv1flagdisjunction(IV1Flag2,IV1Flag3,IV1Flag4), + %%->true; Rules5=[],IV1Flag1=IV1Flag4), + + ((findrulesflowingtopv1(Program0,InputVars1,InputVars2,IV1s,[],Rules6,IV1Flag5), %%iv1s->rest, etc + iv1flagdisjunction(IV1Flag4,IV1Flag5,IV1Flag1))->true;(Rules6=[],IV1Flag1=IV1Flag4)), + + append([Rule],Rules1,Rules9), + append(Rules9,Rules5,Rules7), + append(Rules7,Rules6,Rules8), + unique1(Rules8,[],Rules2). 
+ + +** +findrulesflowingtopv2(_Program0,Rule,InputVars1,InputVars2,Var,Rules1,Rules2,IV1Flag1) :- + Rule=[_PredicateName,Vars], + restlast(Vars,[],Rest,Var), + %%delete(Program1,[PredicateName,Vars],Program2), + %%Program2=Program1, + (not(member(Rule,Rules1))-> + append(Rules1,[Rule],Rules2);Rules2=Rules1), + subset(Rest,InputVars2), + + intersection(Rest,InputVars1,Intersection), + length(Intersection,0), + +%% not((member(Var2,Rest), +%% member(Var2,InputVars1))), + + IV1Flag1=false. +** +** +findrulesflowingtopv2(Program0,Rule,InputVars1,InputVars2,Var,Rules1,Rules2,IV1Flag1) :- + Rule=[_PredicateName,Vars], + restlast(Vars,[],Rest,Var), + %%delete(Program1,[PredicateName,Vars],Program3), + %%Program3=Program1, + %%append(Rules1,[Rule],Rules3), + subset(Rest,InputVars2), + + intersection(Rest,InputVars1,Intersection), + length(Intersection,0), + +%% not((member(Var2,Rest), +%% member(Var2,InputVars1))), + +%% delete(Program0,Rule,Program1), + + IV1Flag2=false, + findrulesflowingtopv1(Program0,InputVars1,InputVars2,Rest,[],Rules4,IV1Flag3), + %%not(Rules4=[]), + iv1flagdisjunction(IV1Flag2,IV1Flag3,IV1Flag1), + + append(Rules1,[Rule],Rules7), + append(Rules7,Rules4,Rules8), + unique1(Rules8,[],Rules2). +** +** +%%->true;(Program2=Program1,Rules2=Rules1)). +findrulesflowingtopv2(Rule,Program0,Program1,_Program2,InputVars1,InputVars,Var,Rules1,Rules2,IV1Flag1) :- + Rule=[PredicateName,Vars], + restlast(Vars,[],Rest,Var), + %%delete(Program1,[PredicateName,Vars],Program4), + %%Program4=Program1, + append(Rules1,[[PredicateName,Vars]],Rules3), + findrulesflowingtopv1(Program0,Program1,_Program2,InputVars1,InputVars,Rest,Rules3,Rules2,IV1Flag3), + iv1flagdisjunction(IV1Flag2,IV1Flag3,IV1Flag1). + + %%findrulesflowingtopv2(Program5,Program2,Rest,Rules3,Rules2). + +** +iv1flagdisjunction(A,B,true) :- + (A=true); (B=true). +iv1flagdisjunction(_,_,false). +** +iv3s0([],_,IV3s1,IV3s2). +iv3s0(IV3s,Program0,IV3s1,IV3s2). 
+ IV3s=[IV3|IV3s3], + iv3s1(IV3,Program0,IV3s1,IV3s4), + iv3s0(IV3s3,Program0,IV3s4,IV3s2). +iv3s1(_,[],IV3s,IV3s). +iv3s1(IV3,Program0,IV3s1,IV3s2) :- + Program0=[Rule|Rules], + iv3s2(IV3,Rule,IV3s1,IV3s3), + iv3s1(IV3,Rules,IV3s3,IV3s2). +iv3s2(IV3,Rule,IV3s,IV3s1,IV3s2). + Rule=[_PredicateName,Vars], + restlast(Vars,[],_Rest,IV3), + delete(IV3s1,IV3,IV3s2). + + +findrulesflowingtopv1a(_,_,_,_,[],Rules,Rules). +findrulesflowingtopv1a(Program1,Program2,InputVars1,InputVars2,Var,Rules1,Rules2) :- + atom(Var), + findrulesflowingtopv2a(Program1,Program2,InputVars1,InputVars2,Var,Rules1,Rules2). +findrulesflowingtopv1a(Program1,Program2,InputVars1,InputVars2,Vars1,Rules1,Rules2) :- + Vars1=[Var|Vars2], + findrulesflowingtopv2(Program1,Program3,InputVars1,InputVars2,Var,Rules1,Rules3), + findrulesflowingtopv1a(Program3,Program2,InputVars1,InputVars2,Vars2,Rules3,Rules2). +%%findrulesflowingtopv2([],Program,Program,_,_,Rules,Rules). +findrulesflowingtopv2a([],[],_,_,_,Rules,Rules). +findrulesflowingtopv2a(Program1,Program2,_InputVars1,InputVars2,Var,Rules1,Rules2) :- + member([PredicateName,Vars],Program1), + restlast(Vars,[],Rest,Var), + ( +%%delete(Program1,[PredicateName,Vars],Program2), +Program2=Program1, + append(Rules1,[[PredicateName,Vars]],Rules2), + subset(Rest,InputVars2)). +findrulesflowingtopv2a(Program1,Program2,InputVars1,InputVars2,Var,Rules1,Rules2) :- + member([PredicateName,Vars],Program1), + restlast(Vars,[],Rest,Var), + ( +%%delete(Program1,[PredicateName,Vars],Program3), +Program3=Program1, + append(Rules1,[[PredicateName,Vars]],Rules3), + subset(Rest,InputVars2)), + findrulesflowingtopv1a(Program3,Program2,InputVars1,InputVars2,Rest,Rules3,Rules2). + +%%->true;(Program2=Program1,Rules2=Rules1)). 
+findrulesflowingtopv2a(Program1,Program2,InputVars1,InputVars,Var,Rules1,Rules2) :- + member([PredicateName,Vars],Program1), + restlast(Vars,[],Rest,Var), + %%delete(Program1,[PredicateName,Vars],Program4), + Program4=Program1, + append(Rules1,[[PredicateName,Vars]],Rules3), + findrulesflowingtopv1a(Program4,Program2,InputVars1,InputVars,Rest,Rules3,Rules2). + %%findrulesflowingtopv2(Program5,Program2,Rest,Rules3,Rules2). + ** +**/ +restlast([],_,_,_) :- fail, !. +restlast([Last],Rest,Rest,Last) :- + Last=[v,_],!. +restlast(Last,Rest,Rest,Last) :- + length(Last,1),!. +restlast(Vars1,Rest1,Rest2,Last) :- + Vars1=[Var|Vars2], + append(Rest1,[Var],Rest3), + restlast(Vars2,Rest3,Rest2,Last),!. + + +rule(Program1,RuleName,InModes,OutModes,InputVars1,InputVars2,VarList,VarList2,OutputVars,Rule) :- + +/* +findall(Rule_vars1,member([_Rule_name,Rule_vars1],Program1),Rule_vars2),foldr(append,Rule_vars2,Rule_vars3), + +% count vars + sort(Rule_vars3,K), + findall(G,(member(G,K),findall(G,member(G,Rule_vars3),H),length(H,J),J>2),L), +% remove vars occuring more than twice +%(not(InputVars10=[])->trace;true), +(var(InputVars10)->InputVars10=InputVars1;subtract(InputVars10,L,InputVars1)), +(var(VarList0)->VarList0=VarList;subtract(VarList0,L,VarList)), +%notrace, +*/ +%trace, +%writeln1(rule(Program1,RuleName,InModes,OutModes,InputVars1,InputVars2,VarList,VarList2,OutputVars,Rule)), +%notrace, + +rule1(RuleName,InModes,OutModes,InputVars1,InputVars2a,VarList,VarList2a,OutputVars,Rule1), +((1 is InModes+OutModes,member(Rule1,Program1))->(InputVars1=[_|InputVars3],rule(Program1,RuleName,InModes,OutModes,InputVars3,InputVars2,VarList,VarList2,OutputVars,Rule2),Rule=Rule2);(%%InputVars1=InputVars3, *** Commented out +Rule=Rule1,InputVars2=InputVars2a,VarList2a=VarList2)) +%%writeln1(Rule),(Rule=[[n,-],[[v,c],[v,c],[v,d]]]->true;true) +.%%->writeln(rule(RuleName,InModes,OutModes,InputVars1,InputVars2,VarList,VarList2,OutputVars,Rule));true).%%(writeln(stop),abort)). +%%. 
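+%% Illustrative example (added): restlast/4 splits a list into its last
+%% element and the preceding elements:
+%% ?- restlast([[v,a],[v,b],[v,c]],[],Rest,Last).
+%% Rest = [[v,a],[v,b]], Last = [v,c].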
+rule1(RuleName,InModes,OutModes,InputVars1,InputVars2,VarList,VarList2,OutputVars,Rule) :- +%%writeln(rule(RuleName,InModes,OutModes,InputVars1,InputVars2,VarList,VarList2,OutputVars,Rule)), + get_members(InModes,InputVars1,[],Vars1), + rulea(OutModes,RuleName,Vars1,VarList,VarList2, + Rule,OutputVars,Vars2), + (num_modes(OutModes,Last,InputVars1), + (equals_or(Vars1,Last)->true;equals_or(Vars2,Last))), + append(InputVars1,Vars2,InputVars2) + %%retractall(lastrule(_)), + %%assertz(lastrule(Rule)), + . + +num_modes(OutModes,Last,_InputVars1) :- + OutModes=0,Last=[],!. +num_modes(OutModes,Last,InputVars1) :- + not(OutModes=0),restlast(InputVars1,[],_,Last),!. + +member_functions(Rule,Algorithms,Item) :- + member(Item,Algorithms), + Item=[Rule|_Rest]. + +%%get_members(0,_,Vars,Vars):-true. +get_members(_,[],Vars,Vars). +get_members(InModes1,InputVars,Vars1,Vars2) :- + (not(InModes1=0)->( + %%findnsols(InModes1,A,(member(A,InputVars)),Vars2). + %%length(Vars2,InModes1),append(Vars2,_,InputVars). + member(Var,InputVars), + %%InputVars=[Var|InputVars2], + %%delete(InputVars,Var,InputVars2), + append(Vars1,[Var],Vars3), + InModes2 is InModes1-1, + get_members(InModes2,InputVars,Vars3,Vars2)); + Vars1=Vars2). %%InModes2->1 + +equals_or([],_) :- true,!.%%fail.*** +equals_or(List,Item) :- + List=[Item|_Rest],!. +equals_or(List,Item1) :- + List=[Item2|Rest], + not(Item1=Item2), + equals_or(Rest,Item1),!. + +rulea(OutModes,RuleName,Vars1,VarList,VarList3,Rule,OutputVars,Vars2) :- + get_vars(OutModes,VarList,VarList3,OutputVars,[],Vars2), + append(Vars1,Vars2,Vars3), + Rule=[RuleName,Vars3],!. + +get_vars(0,VarList,VarList,_,Vars,Vars) :- !. +get_vars(OutModes1,VarList1,VarList2,OutputVars,Vars1,Vars2) :- + var(VarList1,Var,VarList3,OutputVars), + append(Vars1,[Var],Vars3), + OutModes2 is OutModes1 - 1, + get_vars(OutModes2,VarList3,VarList2,OutputVars,Vars3,Vars2),!. 
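+%% Illustrative example (added): equals_or/2 holds when the item unifies with
+%% a member of the list, and succeeds trivially for the empty list:
+%% ?- equals_or([[v,a],[v,b]],[v,b]).
+%% true.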
+ +/** + +rule(RuleName,1,1,InputVars1,InputVars2,VarList,VarList2,Rule) :- + member(Var,InputVars1), + rule2(RuleName,Var,VarList,VarList2,Rule,Var1), + append(InputVars1,[Var1],InputVars2). +rule2(RuleName,Var,VarList,VarList2,Rule,Var1) :- + var(VarList,Var1,VarList2), + Rule=[RuleName,[Var,Var1]],!. + +rule(RuleName,1,2,InputVars1,InputVars2,VarList,VarList2,Rule) :- + member(Var,InputVars1), + rule3(RuleName,Var,VarList,VarList2,Rule,Vars), + append(InputVars1,Vars,InputVars2). +rule3(RuleName,Var,VarList,VarList3,Rule,[Var1,Var2]) :- + var(VarList,Var1,VarList2), + var(VarList2,Var2,VarList3), + Rule=[RuleName,[Var,Var1,Var2]],!. + +rule(RuleName,2,1,InputVars1,InputVars2,VarList,VarList2,Rule) :- + member(Var,InputVars1), + member(Vara,InputVars1), + rule4(RuleName,Var,Vara,VarList,VarList2,Rule,Var1), + append(InputVars1,[Var1],InputVars2). +rule4(RuleName,Var,Vara,VarList,VarList2,Rule,Var1) :- + var(VarList,Var1,VarList2), + Rule=[RuleName,[Var,Vara,Var1]],!. + +rule(RuleName,2,2,InputVars1,InputVars2,VarList,VarList2,Rule) :- + member(Var,InputVars), + member(Vara,InputVars), + rule5(RuleName,Var,Vara,VarList,VarList2,Rule,Vars), + append(InputVars1,Vars,InputVars2). +rule5(RuleName,Var,Vara,VarList,VarList3,Rule,[Var1,Var2]) :- + var(VarList,Var1,VarList2), + var(VarList2,Var2,VarList3), + Rule=[RuleName,[Var,Vara,Var1,Var2]],!. + +**/ + +%%var(Item,Var,Vars,Vars) :- +%% member([Item,Var],Vars). +var(Vars1,Var1,Vars2,OutputVars) :- + length(Vars1,Vars1Length1), + Vars1Length2 is Vars1Length1-1, + length(Vars3,Vars1Length2), + append(Vars3,[Var2A],Vars1), + Var2A=[v,Var2], + char_code(Var2,Var2Code1), + Var2Code2 is Var2Code1 + 1, + var2(Var2Code2,Var1A,OutputVars), + Var1=[v,Var1A], + append(Vars1,[Var1],Vars2),!. + +var2(Code,Var1,OutputVars) :- + %%outputvars(OutputVars), + totalvars(TotalVars), + Code2 is 96+TotalVars, + Code =< Code2, %% 122 + char_code(Var1,Code), + not(member(Var1,OutputVars)),!. 
+var2(Var2Code,Code3,OutputVars) :- + Var2Code2 is Var2Code + 1, + totalvars(TotalVars), + Code2 is 96+TotalVars, + Var2Code2 =< Code2, + var2(Var2Code2,Code3,OutputVars),!. +/** +algorithmstopredicates1([],Predicates1,Predicates1) :-!. +algorithmstopredicates1(Algorithms1,Predicates1,Predicates2) :- + Algorithms1=[Algorithm1|Algorithms2], + Algorithm1=[_TestNumber,_Queries,Algorithm3], + algorithmstopredicates2(Algorithm3,[],Algorithm4), + append_list(Predicates1,Algorithm4,Predicates4), + algorithmstopredicates1(Algorithms2,Predicates4,Predicates2). +algorithmstopredicates2([],Predicates1,Predicates1) :- !. +algorithmstopredicates2(Algorithms1,Predicates1,Predicates2) :- + Algorithms1=[Algorithm1|Algorithms2], + Algorithm1=[Name,In,Out|Rest], + append(Predicates1,[[Name,In,Out|Rest]],Predicates4), + algorithmstopredicates2(Algorithms2,Predicates4, + Predicates2). +**/ +split3([],List,List) :- !. +split3(Predicates1,List1,List2) :- + Predicates1=[Item1|List4], + Item1= [[[n,Name],In,Out]|_Rest], + append(List1,[[[n,Name],In,Out]],List6), + split3(List4,List6,List2),!. + +split2([],List,List) :- !. +split2(Predicates1,List1,List2) :- + Predicates1=[Item1|List4], + Item1= [[[n,Name],_In,_Out]|Rest], + append(List1,[[[n,Name]|Rest]],List6), + split2(List4,List6,List2),!. + +/** +split2([],List,List) :- !. +split2(Predicates1,List1,List2) :- + Predicates1=[Item1|List4], + Item1=[[n,[Name,[[test,Test1],[numin,Numin], + [numout,Numout]]]]|Rest], + member([[n,[Name,[[test,_Test3],[numin,Numin], + [numout,Numout]]]]|Rest],List4), + delete(List4,[[n,[Name,[[test,_Test4],[numin,Numin], + [numout,Numout]]]]|Rest], + List7), + append(List1,[[[n,[Name,[[test,Test1],[numin,Numin], + [numout,Numout]]]]|Rest]],List6), + split2(List7,List6,List2),!. 
+split2(Predicates1,List1,List2) :- + Predicates1=[Item1|List4], + Item1=[[n,[Name,[[test,Test1],[numin,Numin], + [numout,Numout]]]]|Rest], + append(List1,[[[n,[Name,[[test,Test1],[numin,Numin], + [numout,Numout]]]]|Rest]],List6), + split2(List4,List6,List2),!. + +append_list(A,[],A):-!. +append_list(A,List,B) :- + List=[Item|Items], + append(A,[Item],C), + append_list(C,Items,B). + +**/ + +eliminate_unused_predicates(Program1a,Algorithms1a,Algorithms2) :- + %% System calls and mode arities + %%System_calls=[[is,1,1],[+,2,1],[=,2,1],[wrap,1,1], + %%[unwrap,1,1],[head,1,1],[tail,1,1],[member,1,1], + %%[delete,2,1],[append,2,1]], %% Ignore whether system calls are in Program and Algorithm - the interpreter will have detected whether system and user predicates clash earlier + + Program1a=[[[n, PredicateName], Arguments, ":-", _Body]], + length(Arguments,ArgumentsLength), + Start=[[[n,PredicateName],ArgumentsLength]], + convert_to_grammar_part1(Program1a,[],_Program1b,Program1), + %% Find calls in Program + %%writeln([program1,Program1]), + find_calls1(Start,Program1,[],Program2), + %%writeln([program2,Program2]), + %% Find calls in Algorithm + convert_to_grammar_part1(Algorithms1a,[],_Algorithms1b,Algorithms1), + %%writeln([algorithms1,Algorithms1]), + find_calls1(Program2,Algorithms1,[],Algorithms3), + %%writeln([algorithms3,Algorithms3]), + append(Program2,Algorithms3,Rules), + %% Eliminate user predicates mentioned in Program and Algorithms in Algorithms + eliminate_unused_predicates1(Rules,Algorithms1,[], + Algorithms2). + +find_calls1(_,[],Program,Program) :- !. 
+find_calls1(Program0,Program1,Program2,Program3) :- + Program1=[[_Program4a,Program4]|Program5], + %% The first predicate in Program4 only is needed to find the calls x + (findall(Program7a,(((member([[n,PredicateName],Arguments,":-",Program6],Program4)->true;((member([[n,PredicateName],Arguments],Program4),Program6=[])->true;Program4=[[n,PredicateName],Arguments,":-",Program6])), + length(Arguments,ArgumentsLength), + Item=[[n,PredicateName],ArgumentsLength], + (member(Item,Program0)->Program6=Program6a;Program6a=[])%%->true; + %%Item=Program0 + ), + (find_calls2(Program6a,[],Program7a))),[Program7])), + %%append(Program2,Program7,Program8), + %%append(Program0,Program7,Program01)); + %%(Program8=Program2,Program01=Program0)), + append(Program2,Program7,Program8), + append(Program0,Program7,Program01), + find_calls1(Program01,Program5,Program8,Program3). + +find_calls2([],Program,Program) :- !. +/** +find_calls2(Program1,Program2,Program3) :- + Program1=[Line|Program41], + Line=[[n,code]|Program42], + find_calls2(Program41,Program2,Program5), + append(Program5,Program42,Program6), + find_calls2(Program6,[],Program3). +**/ + +find_calls2(Program1,Program2,Program3) :- + Program1=[Line|Program4], + (Line=[[n,PredicateName],Arguments]-> + length(Arguments,ArgumentsLength); + (Line=[[n,PredicateName]],ArgumentsLength=0)), %% correct syntax is [true] not true + Item=[[[n,PredicateName],ArgumentsLength]], + append(Program2,Item,Program5), + find_calls2(Program4,Program5,Program3). + +eliminate_unused_predicates1(_Rules,[],Algorithms,Algorithms) :- !. 
+eliminate_unused_predicates1(Rules,Algorithms1,Algorithms2,Algorithms3) :- + Algorithms1=[[Algorithms4a,Algorithms4]|Algorithms5], + %%(Algorithms4a=[]-> + %%eliminate_unused_predicates1(Rules,Algorithms5,Algorithms2, + %%Algorithms3),%%; +((findall(Algorithms6a,(((member(Algorithms4a1,Algorithms4),Algorithms4a1=[[n,_]|_])->true;Algorithms4a1=Algorithms4), ((Algorithms4a1=[[n,PredicateName],Arguments,":-",_Program6], +%%Algorithms4a1=[[n,PredicateName],Arguments])-> + length(Arguments,ArgumentsLength))->true; + (Algorithms4a1=[[n,PredicateName],Arguments2], + length(Arguments2,ArgumentsLength)->true; + (Algorithms4a1=[[n,PredicateName]],ArgumentsLength=0))), + Item=[[n,PredicateName],ArgumentsLength], + (member(Item,Rules)-> + (Algorithms4a=[]->Algorithms2=Algorithms6a; + append(Algorithms2, + [Algorithms4a],Algorithms6a)); + Algorithms6a=Algorithms2)),Algorithms6b)), + Algorithms6b=[Algorithms6c|_], + (var(Algorithms6c)->Algorithms6=[];Algorithms6=Algorithms6c), + %%length(Algorithms4,Count)), + eliminate_unused_predicates1(Rules,Algorithms5,Algorithms6, + Algorithms3)). 
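The `eliminate_unused_predicates`/`find_calls1` code above walks the call graph from the entry predicate, collecting name/arity pairs, and keeps only clauses whose head is reachable. A toy illustration of the same reachability idea in plain Prolog — `calls/2`, `reachable/2`, `main/0`, `helper/1` and `unused/2` are invented for this sketch and are not part of the repository:

```prolog
% Hypothetical call-graph facts: calls(Head, ListOfCalledPredicates).
calls(main/0,[helper/1]).
calls(helper/1,[]).
calls(unused/2,[]).

% A predicate is reachable from itself, or via any predicate it calls.
reachable(P,P).
reachable(P,Q) :- calls(P,Cs), member(R,Cs), reachable(R,Q).

% ?- reachable(main/0,helper/1).   % succeeds: helper/1 is kept
% ?- reachable(main/0,unused/2).   % fails: unused/2 would be eliminated
```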
diff --git a/Combination-Algorithm-Writer-Multiple-Predicates/cawmp.bib b/Combination-Algorithm-Writer-Multiple-Predicates/cawmp.bib new file mode 100644 index 0000000000000000000000000000000000000000..fb844f009a21885cac9f6a511bf150a753e1b276 --- /dev/null +++ b/Combination-Algorithm-Writer-Multiple-Predicates/cawmp.bib @@ -0,0 +1,7 @@ +@misc {cawmp, + author = {Lucian Green}, + title = {Combination Algorithm Writer with Multiple Predicates System}, + year = {2019}, + howpublished = "https://github.com/luciangreen/Combination-Algorithm-Writer-Multiple-Predicates", + URL = "https://github.com/luciangreen/Combination-Algorithm-Writer-Multiple-Predicates" +} diff --git a/Combination-Algorithm-Writer-Multiple-Predicates/cawplistprolog.pl b/Combination-Algorithm-Writer-Multiple-Predicates/cawplistprolog.pl new file mode 100644 index 0000000000000000000000000000000000000000..a82a70900d404259de63b7a6e7daa4762e512840 --- /dev/null +++ b/Combination-Algorithm-Writer-Multiple-Predicates/cawplistprolog.pl @@ -0,0 +1,43 @@ +:- include('../listprologinterpreter/grammar.pl'). +:- include('lpi_caw_commands.pl'). +:- include('../listprologinterpreter/listprologinterpreter1listrecursion4.pl'). +:- include('../listprologinterpreter/listprologinterpreter3preds5.pl'). +:- include('../listprologinterpreter/lpiverify4.pl'). +:- include('../listprologinterpreter/lpiverify4_open.pl'). +:- include('../listprologinterpreter/lpiverify4_types.pl'). +:- include('../listprologinterpreter/lpiverify4_open_types.pl'). +:- include('caw5 copy 12.pl'). +:- include('cawpverify.pl'). +%%:- include('rcawp.pl'). +%%:- include('rcaw.pl'). +%%:- include('../Text-to-Breasonings/texttobr2.pl'). +:- include('../listprologinterpreter/la_strings.pl'). +:- include('../listprologinterpreter/la_string_codes.pl'). +:- include('../listprologinterpreter/la_maths.pl'). +:- include('remove_duplicate_predicates.pl'). +:- include('cawpverifya.pl'). + + +%%:- include('caw5 copy 12.pl'). +%%:- include('cawpverify.pl'). 
+%%:- include('rcawp.pl'). +%%:- include('rcaw.pl'). +%%:- include('texttobr2.pl'). +:- include('../listprologinterpreter/la_strings.pl'). +%:- include('../Languages/lang_db_generator.pl'). % leave off, run separately through Languages +%:- include('../listprologinterpreter/lpiverify4-fr.pl'). +:- include('../listprologinterpreter/operators.pl'). +:- include('../listprologinterpreter/lpiverify4_test_lang_all.pl'). +:- include('../listprologinterpreter/lpiverify4_test_bt_lang_all.pl'). +:- include('../Languages/make_docs.pl'). +:- include('../SSI/find_pred_sm.pl'). +:- include('../listprologinterpreter/e4_fa_get_vals.pl'). +%:- include('../listprologinterpreter/equals4_first_args.pl'). +:- include('../listprologinterpreter/expression_not_var.pl'). +:- include('../listprologinterpreter/collect_arguments.pl'). +:- include('../listprologinterpreter/reserved_words2.pl'). +:- include('../listprologinterpreter/expand_types.pl'). +:- include('../listprologinterpreter/replace_in_term.pl'). +:- include('../listprologinterpreter/preds_converters_and_matrices.pl'). +%:- include('numbers_of_items_correspond.pl'). +:- include('../listprologinterpreter/match_get_put_vals.pl'). diff --git a/Combination-Algorithm-Writer-Multiple-Predicates/cawpverify.pl b/Combination-Algorithm-Writer-Multiple-Predicates/cawpverify.pl new file mode 100644 index 0000000000000000000000000000000000000000..aaac889472793bda40dda82cdc0449e003249789 --- /dev/null +++ b/Combination-Algorithm-Writer-Multiple-Predicates/cawpverify.pl @@ -0,0 +1,429 @@ + +%% cawptest(Debug[on/off],Total,Score). + +cawptest(Debug,NTotal,Score) :- cawptest(Debug,0,NTotal,0,Score),!. +cawptest(_Debug,NTotal,NTotal,Score,Score) :- NTotal=9, !. 
+cawptest(Debug,NTotal1,NTotal2,Score1,Score2) :- + NTotal3 is NTotal1+1, + cawptest2(NTotal3,Function,Rules,MaxLength,MaxPredicates,TotalVars,Numinputs,Numoutputs,Specifications,AlgDict,Program1), + + %%writeln([cawptest2(NTotal3,Specifications,Program1)]), + (((%%writeln(caw00(Debug,function0,[],5,TotalVars,Specifications,[],Program1)), + caw00(Debug,Function,Rules,MaxLength,MaxPredicates,TotalVars,Numinputs, Numoutputs,Specifications,AlgDict,[],Program1) + + %%sort(Program1,ProgramA), + %%sort(Program2,ProgramA) + %%writeln1(Program1),writeln1(Program2) + %%Program1=Program2 + ))->(Score3 is Score1+1,writeln([cawptest,NTotal3,passed]));(Score3=Score1,writeln([cawptest,NTotal3,failed]))), + writeln(""), + cawptest(Debug,NTotal3,NTotal2,Score3,Score2),!. + +%% Test individual cases, Debug=trace=on or off, N=case number, Passed=output=result + +cawptest1(Debug,N,Passed) :- + cawptest2(N,Function,Rules,MaxLength,MaxPredicates,TotalVars,Numinputs, Numoutputs,Specifications,AlgDict,Program1), + %%writeln1([cawptest2(N,Specifications,Program1)]), + %%writeln1(caw00(Debug,Function,Rules,MaxLength,MaxPredicates,TotalVars,Numinputs, Numoutputs,Specifications,AlgDict,[],Program2)), + + (((caw00(Debug,Function,Rules,MaxLength,MaxPredicates,TotalVars,Numinputs, Numoutputs,Specifications,AlgDict,[],Program1) + %%sort(Program1,ProgramA), + %%sort(Program2,ProgramA) + + %writeln(Program1), + %writeln(Program2), + %Program1=Program2 + ))->(Passed=passed,writeln([cawptest,N,passed]));(Passed=failed,writeln([cawptest,N,failed]))),!. + + +/** +cawptest2(1,add0,[[[n,+],2,1%% Modes=2 inputs, 1 output +]],2,2,%% MaxPredicates is not the number of predicates in the result, it is the number of non-dictionary predicates in the result. 
+4, +[2],[1],%% Numinputs, Numoutputs tested for +[ + [[[[[v,a],1],[[v,b],1]],[[[v,c],2]],true]], + [[[[[v,a],1],[[v,b],1]],[[[v,c],2]],true]] +] +, +[], %% Algorithm dictionary +[ %% Result + [[n,1],[[v,a],[v,b],[v,c]],":-", + [ [[n,+],[[v,a],[v,b],[v,d]]], + [[n,=],[[v,d],[v,c]]]]], + [[n,add0],[[v,a],[v,b],[v,c]],":-", + [ [[n,1],[[v,a],[v,b],[v,d]]], + [[n,=],[[v,d],[v,c]]]]] +]). +**/ +cawptest2(1,add0,[[[n,+],2,1%% Modes=2 inputs, 1 output +]],2,2,%% MaxPredicates is not the number of predicates in the result, it is the number of non-dictionary predicates in the result. +4, +[2],[1],%% Numinputs, Numoutputs tested for +[ + [[[[[v,a],1],[[v,b],1]],[[[v,c],2]],true]], + [[[[[v,a],2],[[v,b],1]],[[[v,c],3]],true]] +] +, +[], %% Algorithm dictionary +[ %% Result + [[n,1],[[v,a],[v,b],[v,c]],":-", + [ [[n,+],[[v,a],[v,b],[v,d]]], + [[n,=],[[v,d],[v,c]]]]], + [[n,add0],[[v,a],[v,b],[v,c]],":-", + [ [[n,1],[[v,a],[v,b],[v,d]]], + [[n,=],[[v,d],[v,c]]]]] +]). + +%% With a=input and b=input, returns [[[n,1],[[v,a],[v,b]],:-,[[[n,+],[[v,a],1,[v,c]]],[[n,=],[[v,c],[v,b]]]]],[[n,1],[[v,a],[v,b]],:-,[[[n,-],[[v,a],1,[v,c]]],[[n,=],[[v,c],[v,b]]]]],[[n,add0],[[v,a],[v,b]],:-,[[[n,1],[[v,b],[v,c]]],[[n,=],[[v,a],[v,a]]]]]] which is incorrect, and with a=input and b=output nondeterministic clauses fail +%% Non-determinism is not supported in List Prolog. List Prolog should use if-then instead of non-deterministic clauses. +%% Use if-then with calls to predicates as consequents. 
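The comments above recommend replacing nondeterministic clauses with if-then, using calls to predicates as consequents. A sketch in plain Prolog of that rewrite (`step/2` is a hypothetical predicate, not part of this repository):

```prolog
% Instead of two nondeterministic clauses:
%   step(A,B) :- A =:= 0, B = zero.
%   step(A,B) :- A =\= 0, B = nonzero.
% fold the choice into one clause with if-then-else, so only one
% answer is ever produced and no choice point is left behind:
step(A,B) :- ( A =:= 0 -> B = zero ; B = nonzero ).

% ?- step(0,B).   % B = zero, deterministically
```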
+cawptest2(2,add0,[],2,1,3,[1,2],[0,1], +[[[[[[v,a],1],[[v,b],2]],[],true],[[[[v,a],2],[[v,b],1]],[],true]]], + +[ %% Algorithm dictionary + [[[n,1],1,1],[[v,a],[v,b]],":-", + [ [[n,+],[[v,a],1,[v,c]]], + [[n,=],[[v,c],[v,b]]]]], + + [[[n,1],1,1],[[v,a],[v,b]],":-", + [ [[n,-],[[v,a],1,[v,c]]], + [[n,=],[[v,c],[v,b]]]]] +], + + %% Result + +[[[n,1],[[v,a],[v,b]],":-", + [[[n,+],[[v,a],1,[v,c]]], + [[n,=],[[v,c],[v,b]]]]], +[[n,1],[[v,a],[v,b]],":-", + [[[n,-],[[v,a],1,[v,c]]], + [[n,=],[[v,c],[v,b]]]]], +[[n,add0],[[v,a],[v,b]],":-", + [[[n,1],[[v,a],[v,c]]], + [[n,=],[[v,b],[v,c]]]]]] + +). + +%%[[[n,a2],[[v,a],[v,b]],:-,[[[n,+],[[v,a],1,[v,c]]],[[n,=],[[v,c],[v,b]]]]],[[n,add0],[[v,a],[v,b]],:-,[[[n,a2],[[v,a],[v,c]]]]]] + + +%% () Can underscore vars in spec, ignore in choose var, everyvarcovered + +%% Was test 9 on bu16 + +cawptest2(3,add,[[[n,[]],1,0],[[n,"_"],1,0]],1,1,1,[1],[0], +[ +/**[[[[[v,a],[1,2,3]],[[v,b],3],[[v,c],[]]], + [[[v,d],[4,5,6]],[[v,e],[2,3]]],true],**/ + [[[[[v,a],3]], + [],true]] + + /** +[[[[[[v,a],[1,2,3]],[[v,b],3],[[v,c],[]]], + [[[v,d],[4,5,6]],[[v,e],[2,3]]],true], +[[[[v,a],[]],[[v,b],3],[[v,c],[4,5,6]]], + [[[v,d],[4,5,6]],[[v,e],5]],true]]]], +**/ +], +[ %% Algorithm dictionary +], +%% Result + + [[[n,add],[[v,a]],":-",[[[n,"_"],[[v,a]]]]]] + + /** +[[n,1],[[v,a],[v,b],[v,c],[v,d]],":-", %% Test by self + [[[n,[]],[[v,c]]]]], +[[n,add],[[v,a],[v,b],[v,c],[v,d]],":-", + [[n,1],[[v,a],[v,b],[v,c],[v,d]]]]**/ +). + + + +cawptest2(4,function3,[[[n,+],2,1]],3,1,5,[2],[1], +[[[[[[v,a],1],[[v,b],1]],[[[v,c],3]],true], +[[[[v,a],1],[[v,b],2]],[[[v,c],5]],true]]], +[ %% Algorithm dictionary +], +[ %% Result +[[n,function3],[[v,a],[v,b],[v,c]],":-", + [[[n,+],[[v,a],[v,b],[v,d]]], + [[n,+],[[v,b],[v,d],[v,e]]], + [[n,=],[[v,e],[v,c]]]]]]). 
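Each `cawptest2` specification above pairs input variable bindings with expected output bindings and a `true`/`fail` flag. A minimal sketch of how such a spec could be checked in plain Prolog — `run_spec/4` and `add0/2` are illustrative names only, not the CAW machinery:

```prolog
% A candidate program to test.
add0(A-B,C) :- C is A+B.

% run_spec(Goal,In,Out,Flag): a `true` spec line must succeed with the
% expected output; a `fail` spec line must not.
run_spec(Goal,In,Out,true) :- call(Goal,In,Out).
run_spec(Goal,In,Out,fail) :- \+ call(Goal,In,Out).

% ?- run_spec(add0,1-1,2,true).   % succeeds, like a `true` spec entry
% ?- run_spec(add0,1-1,3,fail).   % succeeds: the spec expects failure
```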
+ + +cawptest2(5,add0,[[[n,+],2,1]],1,1,4,% 3 x 5 %% 2,2,3 +[2],[1], +[ + [[[[[v,a],1],[[v,b],1]],[[[v,c],2]],true]] +%% [[[[[v,a],1]],[[[v,b],2]],true]], +%% [[[[[v,a],1],[[v,b],1]],[[v,c],2],true]] +] +, +[ %% Algorithm dictionary +], +[ %% Result + [[n,add0],[[v,a],[v,b],[v,c]],":-", + [ [[n,+],[[v,a],[v,b],[v,d]]], + [[n,=],[[v,d],[v,c]]]]] + +]). + +%%%%***** + + +cawptest2(6,function3,[],2,1,4,[2],[1], +[[[[[[v,a],1],[[v,b],1]],[[[v,c],2]],true], + [[[[v,a],1],[[v,b],2]],[[[v,c],3]],true], +[[[[v,a],1],[[v,b],1]],[[[v,c],1]],fail], + [[[[v,a],1],[[v,b],1]],[[[v,c],3]],fail]]], +[ %% Algorithm dictionary + + [[[n,function1],2,1],[[v,a],[v,b],[v,c]],":-", + [ + [[n,+],[[v,a],[v,b],[v,c]]] + ] + ] + +], +[ %% Result +[[n,function1],[[v,a],[v,b],[v,c]],":-", + [[[n,+],[[v,a],[v,b],[v,c]]]]], +[[n,function3],[[v,a],[v,b],[v,c]],":-", + [[[n,function1],[[v,a],[v,b],[v,d]]], + [[n,=],[[v,d],[v,c]]]]]]). + +/** +%%** +cawptest2(7,add0,[],3,3,%% it could be 5 +4, +[1],[1], +[ +[[[[[v,a],[1,2]]],[[[v,b],[]]],true]], +[[[[[v,a],[1,2]]],[[[v,b],[]]],true]], +[[[[[v,a],[1,2]]],[[[v,b],[2]]],true]], +[[[[[v,a],[2]]],[[[v,b],[]]],true]], +[[[[[v,a],[]]],[[[v,b],[]]],true]] +], +[ %% Algorithm dictionary +[[[n,add2],1,1],[[v,a],[v,b]],":-", + [[[n,=],[[v,a],[]]], + [[n,=],[[v,b],[]]]]], +[[[n,add3],1,1],[[v,a],[v,b]],":-", + [[[n,tail],[[v,a],[v,b]]]]] +], +[ %% Result +[[n,add2],[[v,a],[v,b]],":-", + [[[n,=],[[v,a],[]]], + [[n,=],[[v,b],[]]]]], +[[n,add3],[[v,a],[v,b]],":-", + [[[n,tail],[[v,a],[v,b]]]]], +[[n,add0],[[v,a],[v,b]],":-", + [[[n,1],[[v,a],[v,c]]], + [[n,=],[[v,c],[v,b]]]]], +[[n,1],[[v,a],[v,b]],":-", + [[[n,add2],[[v,a],[v,c]]], + [[n,=],[[v,c],[v,b]]]]], +[[n,1],[[v,a],[v,b]],":-", + [[[n,add3],[[v,a],[v,c]]], + [[n,1],[[v,c],[v,d]]], + [[n,=],[[v,d],[v,b]]]]]]). 
+ +cawptest2(7.1,add0,[],2,1,%% it could be 5 +3, +[1],[1], +[ +%%[[[[[v,a],[1,2]]],[[[v,b],[]]],true]], +%%[[[[[v,a],[1,2]]],[[[v,b],[]]],true]], +%%[[[[[v,a],[1,2]]],[[[v,b],[2]]],true]], +%%[[[[[v,a],[2]]],[[[v,b],[]]],true]], +[[[[[v,a],[]]],[[[v,b],[]]],true]] +], +[ %% Algorithm dictionary +[[[n,add2],1,1],[[v,a],[v,b]],":-", + [[[n,=],[[v,a],[]]], + [[n,=],[[v,b],[]]]]] +%%[[[n,add3],1,1],[[v,a],[v,b]],":-", +%% [[[n,tail],[[v,a],[v,b]]]]] +], +[ %% Result +[[n,add2],[[v,a],[v,b]],":-", + [[[n,=],[[v,a],[]]], + [[n,=],[[v,b],[]]]]], +[[n,add0],[[v,a],[v,b]],":-", + [[[n,add2],[[v,a],[v,c]]], + [[n,=],[[v,c],[v,b]]]]]] +). + +cawptest2(7.2,add0,[],3,2,%% it could be 5 +3, +[1],[1], +[ +[[[[[v,a],[1,2]]],[[[v,b],[]]],true]], +[[[[[v,a],[1,2]]],[[[v,b],[2]]],true]], +[[[[[v,a],[2]]],[[[v,b],[]]],true]], +[[[[[v,a],[]]],[[[v,b],[]]],true]] +], +[ %% Algorithm dictionary +[[[n,add2],1,1],[[v,a],[v,b]],":-", + [[[n,=],[[v,a],[]]], + [[n,=],[[v,b],[]]]]], +[[[n,add3],1,1],[[v,a],[v,b]],":-", + [[[n,tail],[[v,a],[v,b]]]]] +], +[ %% Result +[[n,add2],[[v,a],[v,b]],":-", + [[[n,=],[[v,a],[]]], + [[n,=],[[v,b],[]]]]], +[[n,add3],[[v,a],[v,b]],":-", + [[[n,tail],[[v,a],[v,b]]]]], +[[n,add0],[[v,a],[v,b]],":-", + [[[n,add2],[[v,a],[v,c]]], + [[n,=],[[v,c],[v,b]]]]], +[[n,add0],[[v,a],[v,b]],":-", + [[[n,add3],[[v,a],[v,c]]], + [[n,add0],[[v,c],[v,d]]], + [[n,=],[[v,d],[v,b]]]]]]). + **/ + +cawptest2(7,add0,[[[n,+],2,1%% Modes=2 inputs, 1 output +],[[n,-],2,1]],4,1,%% MaxPredicates is not the number of predicates in the result, it is the number of non-dictionary predicates in the result. 
+6, +[2],[1],%% Numinputs, Numoutputs tested for +[ + [[[[[v,a],1],[[v,b],1]],[[[v,c],0],[[v,d],2]],true], + [[[[v,a],1],[[v,b],2]],[[[v,c],1],[[v,d],3]],true]] +] +, +[], %% Algorithm dictionary +/**[ %% Result + [[n,add0],[[v,a],[v,b],[v,c],[v,d]],":-", + [ [[n,+],[[v,a],[v,b],[v,e]]], + [[n,+],[[v,a],[v,b],[v,f]]], + [[n,=],[[v,c],[v,e]]], + [[n,=],[[v,f],[v,d]]]]]**/ + + /** + [[[n,add0],[[v,a],[v,b],[v,c],[v,d]],":-", + [[[n,+],[[v,a],[v,a],[v,e]]], + [[n,+],[[v,a],[v,b],[v,f]]], + [[n,=],[[v,e],[v,c]]], + [[n,=],[[v,e],[v,d]]]]] + **/ + + /** + [[[n,add0],[[v,a],[v,b],[v,c],[v,d]],":-", + [[[n,+],[[v,a],[v,b],[v,e]]], + [[n,-],[[v,b],[v,a],[v,f]]], + [[n,=],[[v,e],[v,d]]], + [[n,=],[[v,f],[v,c]]]]]] +**/ + +[[[n,add0],[[v,a],[v,b],[v,c],[v,d]],":-", + [[[n,+],[[v,a],[v,b],[v,e]]], + [[n,-],[[v,b],[v,a],[v,f]]], + [[n,=],[[v,e],[v,d]]], + [[n,=],[[v,f],[v,c]]]]]] + +). + +/** + Not tested nonrecursive multiclauses +cawptest2(8,add0,[[[n,+],2,1],[[n,-],2,1]],2,3,4,[2],[1], +[[[[[[v,a],1],[[v,b],1]],[[[v,b],2]],true],[[[[v,a],2],[[v,b],1]],[[[v,b],1]],true]], +[[[[[v,a],1],[[v,b],1]],[[[v,b],2]],true]], +[[[[[v,a],2],[[v,b],1]],[[[v,b],1]],true]]], + +[ %% Algorithm dictionary +], + + %% Result + +[[[n,1],[[v,a],[v,b],[v,c]],":-", + [[[n,+],[[v,a],[v,b],[v,d]]], + [[n,=],[[v,c],[v,d]]]]], +[[n,1],[[v,a],[v,b],[v,c]],":-", + [[[n,-],[[v,a],[v,b],[v,d]]], + [[n,=],[[v,d],[v,c]]]]], +[[n,add0],[[v,a],[v,b],[v,c]],":-", + [[[n,1],[[v,a],[v,b],[v,d]]], + [[n,=],[[v,c],[v,d]]]]]] +). +**/ + +/** + +cawptest2(8,add0,[[[n,+],2,1%% Modes=2 inputs, 1 output +]],4,1,%% MaxPredicates is not the number of predicates in the result, it is the number of non-dictionary predicates in the result. 
+8, +[2],[1],%% Numinputs, Numoutputs tested for +[ + [[[[[v,a],1],[[v,b],1],[[v,c],2],[[v,d],1]],[[[v,e],5]],true], + [[[[v,a],2],[[v,b],2],[[v,c],2],[[v,d],1]],[[[v,e],7]],true]] +] +, +[], +%% add options to remove extra choice points +%% test on lpiv +%% caw00(off,add0,[[[n,+],2,1]],4,1,8,[2,4],[1],[[[[[[v,a],1],[[v,b],1],[[v,c],2],[[v,d],1]],[[[v,e],5]],true],[[[[v,a],2],[[v,b],2],[[v,c],2],[[v,d],1]],[[[v,e],7]],true]]],[],[],P). +%% x: + +[[[n,add0],[[v,a],[v,b],[v,c],[v,d],[v,e]],":-",[[[n,+],[[v,a],[v,b],[v,f]]],[[n,+],[[v,c],[v,f],[v,g]]],[[n,+],[[v,d],[v,g],[v,h]]],[[n,=],[[v,h],[v,e]]]]]] + +). + +**/ + +% ["Computational English","COMPUTATIONAL ENGLISH by Lucian Green Conglish Reflection 2 of 4.txt",0,algorithms,"14. *I prepared to serve the vegetable burger. I did this by cooking the vegetable patty. First, I made the patty from semolina, soy and carrot. Second, I minced it up. Third, I cooked it. In this way, I prepared to serve the vegetable burger by cooking the vegetable patty."] + + +cawptest2(8,append1,[[[n,append],2,1%% Modes=2 inputs, 1 output +]],3,1,%% MaxPredicates is not the number of predicates in the result, it is the number of non-dictionary predicates in the result. +7, +[3],[1],%% Numinputs, Numoutputs tested for +[ + [[[[[v,a],["top"]],[[v,b],["middle"]],[[v,c],["bottom"]]], + [[[v,d],["top","middle","bottom"]]],true]] +] +, +[], +[[[n,append1],[[v,a],[v,b],[v,c],[v,d]],":-" ,[[[n,append],[[v,a],[v,b],[v,e]]],[[n,append],[[v,e],[v,c],[v,f]]],[[n,=],[[v,f],[v,d]]]]]] + +). + +% ["Fundamentals of Pedagogy and Pedagogy Indicators","FUNDAMENTALS OF PEDAGOGY by Lucian Green Two Uses 23 of 30.txt",0,algorithms,"225. ALEXIS: *The subject should write logically connected breasonings."] + +/** + +cawptest2(9,mp*,[[[n,append],2,1%% Modes=2 inputs, 1 output +]],3,1,%% MaxPredicates is not the number of predicates in the result, it is the number of non-dictionary predicates in the result. 
+7, +[3],[1],%% Numinputs, Numoutputs tested for +[ + [[[[[v,a],["p1"]],[[v,b],["p1","p2"]]], + [[[v,c],["p2"]]],true]] +] +, +[], +[[[n,mp],[[v,a],[v,b],[v,c],[v,d]],":-" ,[[[n,append],[[v,a],[v,b],[v,e]]],[[n,append],[[v,e],[v,c],[v,f]]],[[n,=],[[v,f],[v,d]]]]]] + +). + +**/ + +/* +cawptest2(9,append1,[[[n,append],2,1%% Modes=2 inputs, 1 output +]],3,1,%% MaxPredicates is not the number of predicates in the result, it is the number of non-dictionary predicates in the result. +7, +[3],[1],%% Numinputs, Numoutputs tested for +[ + [[[[[v,a],["top"]],[[v,b],["middle"]],[[v,c],["bottom"]]], + [[[v,d],["top","middle","bottom"]]],true]] +] +, +[], +[[[n,append1],[[v,a],[v,b],[v,c],[v,d]],":-" ,[[[n,append],[[v,a],[v,b],[v,e]]],[[n,append],[[v,e],[v,c],[v,f]]],[[n,=],[[v,f],[v,d]]]]]] + +). +*/ \ No newline at end of file diff --git a/Combination-Algorithm-Writer-Multiple-Predicates/cawpverifya.pl b/Combination-Algorithm-Writer-Multiple-Predicates/cawpverifya.pl new file mode 100644 index 0000000000000000000000000000000000000000..5813e8cdcde694e41d684295bb7ce6c5e7980ad1 --- /dev/null +++ b/Combination-Algorithm-Writer-Multiple-Predicates/cawpverifya.pl @@ -0,0 +1,129 @@ +%% Test individual cases, Debug=trace=on or off, N=case number, Passed=output=result + +cawptest1a(Debug,N,Passed) :- + cawptest2a(N,Function,Rules,MaxLength,MaxPredicates,TotalVars,Numinputs, Numoutputs,Specifications,AlgDict,Program1), + %%writeln1([cawptest2(N,Specifications,Program1)]), + %%writeln1(caw00(Debug,Function,Rules,MaxLength,MaxPredicates,TotalVars,Numinputs, Numoutputs,Specifications,AlgDict,[],Program2)), + + (((caw000(Debug,Function,Rules,MaxLength,MaxPredicates,TotalVars,Numinputs, Numoutputs,Specifications,AlgDict,[],Program1) + %%sort(Program1,ProgramA), + %%sort(Program2,ProgramA) + + %%writeln(Program1), + %%writeln(Program2), + %%Program1=Program2 + ))->(Passed=passed,writeln([cawptest,N,passed]));(Passed=failed,writeln([cawptest,N,failed]))),!. 
+ +cawptest2a(7,[add0,add0],[],[2,3],[1,1],%% it could be 5 +[3,4], %% 4 not 3 because of bug +[1],[1], +[ +[[[[[[v,a],[]]],[[[v,b],[]]],true]]], + +[[[[[[v,a],[1,2,3]]],[[[v,b],[]]],true]]] +%%[[[[[v,a],[2]]],[[[v,b],[]]],true]], %% Needs to progress bottom up by writing base case first +%%[[[[[v,a],[1,2]]],[[[v,b],[]]],true]], %% add0 second clause has this not spec below +%%[[[[[v,a],[1,2]]],[[[v,b],[2]]],true]], +%%[[[[[v,a],[2]]],[[[v,b],[]]],true]] +%%[[[[[v,a],[1,2]]],[[[v,b],[]]],true]], +], +[ %% Algorithm dictionary +[[[n,add2],1,1],[[v,a],[v,b]],":-", + [[[n,=],[[v,a],[]]], + [[n,=],[[v,b],[]]]]], +[[[n,add3],1,1],[[v,a],[v,b]],":-", + [[[n,tail],[[v,a],[v,b]]]]] + + %% Result +/**[[[n,add1],1,1],[[v,a],[v,b]],":-", + [[[n,add2],[[v,a],[v,c]]], + [[n,=],[[v,c],[v,b]]]]] +**/ +], +[ %% Result +%%[[n,add3],[[v,a],[v,b]],":-", %% swapped a3,a2 + %%[[[n,tail],[[v,a],[v,b]]]]], + +/**[[n,add2],[[v,a],[v,b]],":-", + [[[n,=],[[v,a],[]]], + [[n,=],[[v,b],[]]]]], +[[n,add0],[[v,a],[v,b]],":-", + [[[n,add2],[[v,a],[v,c]]], + [[n,=],[[v,c],[v,b]]]]]], +**/ + %% Resulting base case + + [[[n,add2],[[v,a],[v,b]],":-", + [[[n,=],[[v,a],[]]], + [[n,=],[[v,b],[]]]]], + [[n,add0],[[v,a],[v,b]],":-", + [[[n,add2],[[v,a],[v,c]]], + [[n,=],[[v,c],[v,b]]]]]], + + [ %% Resulting recursive algorithm + +[[n,add2],[[v,a],[v,b]],":-", + [[[n,=],[[v,a],[]]], + [[n,=],[[v,b],[]]]]], +[[n,add3],[[v,a],[v,b]],":-", + [[[n,tail],[[v,a],[v,b]]]]], +[[n,add0],[[v,a],[v,b]],":-", + [[[n,add2],[[v,a],[v,c]]], + [[n,=],[[v,c],[v,b]]]]], +[[n,add0],[[v,a],[v,b]],":-", + [[[n,add3],[[v,a],[v,c]]], + [[n,add0],[[v,c],[v,d]]], + [[n,=],[[v,d],[v,b]]]]]]] + + ). 
+ + +%%[[[n,add],[[v,a],[v,c],[v,d]],":-",[[[n,[]],[[v,a]]],[[n,=],[[v,c],[v,d]]]]]] + +%% Add cover all vars before returning +%%**/ + + +/** Doesn't work +%% before now, io=21 +cawptest2(7,add0,[],2,3,5,% 3 x 5 +[1,2],[0,1], +[[[[[[v,a],1],[[v,b],2]],[],true],[[[[v,a],2],[[v,b],1]],[],true]], +%%[[[[[v,a],1],[[v,b],2]],[],true],[[[[v,a],2],[[v,b],1]],[],true]], +[[[[[v,a],1]],[[[v,b],2]],true]], +[[[[[v,a],2]],[[[v,b],1]],true]] +], + +[ %% Algorithm dictionary + + [[[n,a2],1,1],[[v,a],[v,b]],":-", + [ [[n,+],[[v,a],1,[v,c]]], + [[n,=],[[v,c],[v,b]]]]], + + [[[n,a3],1,1],[[v,a],[v,b]],":-", + [ [[n,-],[[v,a],1,[v,c]]], + [[n,=],[[v,c],[v,b]]]]] +], + +[ %% Result + [[n,a2],[[v,a],[v,b]],":-", + [ [[n,+],[[v,a],1,[v,c]]], + [[n,=],[[v,c],[v,b]]]]], + + [[n,a3],[[v,a],[v,b]],":-", + [ [[n,-],[[v,a],1,[v,c]]], + [[n,=],[[v,c],[v,b]]]]], + + [[n,add0],[[v,a],[v,b]],":-", + [ [[n,1],[[v,a],[v,b]]]]], + + [[n,1],[[v,a],[v,b]],":-", + [ [[n,a2],[[v,a],[v,c]]], + [[n,=],[[v,c],[v,b]]]]], + + [[n,1],[[v,a],[v,b]],":-", + [ [[n,a3],[[v,a],[v,c]]], + [[n,=],[[v,c],[v,b]]]]] +]). 
+
+**/ diff --git a/Combination-Algorithm-Writer-Multiple-Predicates/convert_lp_to_caw0000.pl b/Combination-Algorithm-Writer-Multiple-Predicates/convert_lp_to_caw0000.pl new file mode 100644 index 0000000000000000000000000000000000000000..2fa96cb26f581aa6b4faa5da4de6a5cea82fecf0 --- /dev/null +++ b/Combination-Algorithm-Writer-Multiple-Predicates/convert_lp_to_caw0000.pl @@ -0,0 +1,68 @@ +%
+
+
+/**
+
+convert_lp_to_caw0000(
+[
+	[[n,reverse],[[],[v,l],[v,l]]],
+	[[n,reverse],[[v,l],[v,m],[v,n]],":-",
+	[	[[n,head],[[v,l],[v,h]]],
+		[[n,tail],[[v,l],[v,t]]],
+		[[n,wrap],[[v,h],[v,h1]]],
+		[[n,append],[[v,h1],[v,m],[v,o]]],
+		[[n,reverse],[[v,t],[v,o],[v,n]]]
+	]
+	]
+],
+
+[
+	% gets used i,o (with header with vars that are only either
+	% in the header or used in another reverse_c4*
+
+	% skips writeln, other predetermined commands
+	% assumes wrap, unwrap used
+	% doesn't use read_string (takes var input instead)
+
+	[[n,reverse],[[],[v,l],[v,l]]],
+
+	[[n,reverse_c41],[[v,l],[v,t],[v,h1]],":-",
+	[	[[n,head],[[v,l],[v,h]]],
+		[[n,tail],[[v,l],[v,t]]],
+		[[n,wrap],[[v,h],[v,h1]]]
+	]],
+
+	% with vars it needs from another r_c4* or the header
+
+	[[n,reverse_c42],[[v,h1],[v,m],[v,t],[v,n]],":-",
+	[	[[n,append],[[v,h1],[v,m],[v,o]]],
+		[[n,reverse],[[v,t],[v,o],[v,n]]]
+	]]
+],
+
+[
+	% finds algs (configuration of calls to certain
+	% commands with header
+	% puts together pred
+
+	[[n,reverse],[[],[v,l],[v,l]]],
+
+	[[n,reverse],[[v,l],[v,m],[v,n]],":-",
+	[	[n,reverse_c41],[[v,l],[v,t],[v,h1]],
+		[n,reverse_c42],[[v,h1],[v,m],[v,t],[v,n]]
+	]]
+],
+
+	% finds specs for r_c4*'s by using lpi with a
+	% flag switched on that returns the specs
+	% - specs are i,o vars for lines, where c4 alg
+	% finds algs meeting spec for set of lines
+
+	% * LPI numbers pred, pred lines, has widgets that
+	% indicate these and the var states
+	% - with assertz x another var
+
+**/
+
+convert_lp_to_caw0000.
+
diff --git 
a/Combination-Algorithm-Writer-Multiple-Predicates/lpi_caw_commands.pl b/Combination-Algorithm-Writer-Multiple-Predicates/lpi_caw_commands.pl new file mode 100644 index 0000000000000000000000000000000000000000..5cf9b9b0144eeaf85da7eaff81cd3ad840974149 --- /dev/null +++ b/Combination-Algorithm-Writer-Multiple-Predicates/lpi_caw_commands.pl @@ -0,0 +1,173 @@ +interpretstatement1(ssi,_F0,_Functions,[[n,[not,Operator]],[Variable1,Variable2]],Vars1,Vars2,true,nocut) :- + isop(Operator), + interpretpart(not_is,Variable1,Variable2,Vars1,Vars2),!. + +interpretstatement1(ssi,_F0,_Functions,[[n,[not,Operator]],[Variable1,Variable2]],Vars1,Vars2,true,nocut) :- + comparisonoperator(Operator), +%%writeln1(4), + interpretpart(not_iscomparison,Operator,Variable1,Variable2,Vars1,Vars2). + +interpretstatement1(ssi,_F0,_Functions,[[n,[not,member]],[Variable1,Variable2]],Vars1,Vars2,true,nocut) :- +%%writeln1(8), + interpretpart(not_member,Variable1,Variable2,Vars1,Vars2). + +interpretstatement1(ssi,_F0,_Functions,[[n,[not,head]],[Variable1,Variable2]],Vars1,Vars2,true,nocut) :- +%%writeln1(6), + interpretpart(not_head,Variable1,Variable2,Vars1,Vars2). + +interpretstatement1(ssi,_F0,_Functions,[[n,[not,tail]],[Variable1,Variable2]],Vars1,Vars2,true,nocut) :- +%%writeln1(61), + interpretpart(not_tail,Variable1,Variable2,Vars1,Vars2). + +interpretstatement1(ssi,_F0,_Functions,[[n,[not,append]],[Variable1,Variable2,Variable3]],Vars1,Vars2,true,nocut) :- +%%writeln1(9), + interpretpart(not_append,Variable1,Variable2,Variable3,Vars1,Vars2). + +interpretstatement1(ssi,_F0,_Functions,[[n,[not,stringconcat]],[Variable1,Variable2,Variable3]],Vars1,Vars2,true,nocut) :- + interpretpart(not_stringconcat,Variable1,Variable2,Variable3,Vars1,Vars2). + +interpretstatement1(ssi,_F0,_Functions,[[n,[]],[Variable1]],Vars1,Vars2,true,nocut) :- +%%writeln1(8), + interpretpart([],Variable1,Vars1,Vars2). 
+ +interpretstatement1(ssi,_F0,_Functions,[[n,""],[Variable1]],Vars1,Vars2,true,nocut) :- +%%writeln1(8), + interpretpart("",Variable1,Vars1,Vars2). + +interpretstatement1(ssi,_F0,_Functions,[[n,"_"],[Variable1]],Vars1,Vars2,true,nocut) :- +%%writeln1(8), + interpretpart("_",Variable1,Vars1,Vars2). + + +interpretpart(not_is,Variable1,Variable2,Vars1,Vars1) :- + getvalues(Variable1,Variable2,Value1,Value2,Vars1), + not(isempty(Value1)), + not(isempty(Value2)), + debug_call(Skip,[[n,[not,=]],[Value1,Value2]]), + ((not(Value1 = Value2) + )-> + debug_exit(Skip,[[n,[not,=]],[Value1,Value2]]) +; debug_fail(Skip,[[n,[not,=]],[Value1,Value2]])),!. + +interpretpart(not_iscomparison,Operator,Variable1,Variable2,Vars1,Vars1) :- getvalues(Variable1,Variable2,Value1,Value2,Vars1), + debug_call(Skip,[[n,[not,Operator]],[Value1,Value2]]), + ((isval(Value1), + isval(Value2), + Expression=..[Operator,Value1,Value2], + not(Expression))-> + debug_exit(Skip,[[n,[not,Operator]],[Value1,Value2]]) +; debug_fail(Skip,[[n,[not,Operator]],[Value1,Value2]])),!. + +interpretpart(not_member,Variable1,Variable2,Vars1,Vars1) :- + getvalues(Variable1,Variable2,Value1,Value2,Vars1), +debug_call(Skip,[[n,[not,member]],[Value1,Value2]]), + (((not(Value2=empty)->%%member(Value2,Value1), + (not(member(Value2,Value1)) + %%putvalue(Variable2,Value3,Vars1,Vars2)%%,Vars2=Vars1 + )))-> + debug_exit(Skip,[[n,[not,member]],[Value1,Value2]]) +; debug_fail(Skip,[[n,[not,member]],[Value1,Value2]])),!. + +interpretpart(not_head,Variable1,Variable2,Vars1,Vars1) :- + getvalues(Variable1,Variable2,Value1,Value2,Vars1), +debug_call(Skip,[[n,[not,head]],[Value1,Value2]]), + ((not(Value1=[Value2|_Rest]) + %%val1emptyorvalsequal(Value2,Value1A), + %%putvalue(Variable2,Value1A,Vars1,Vars2) + )-> + debug_exit(Skip,[[n,[not,head]],[Value1,Value2]]) +; debug_fail(Skip,[[n,[not,head]],[Value1,Value2]])),!. 
+
+interpretpart(not_tail,Variable1,Variable2,Vars1,Vars1) :-
+	getvalues(Variable1,Variable2,Value1,Value2,Vars1),
+debug_call(Skip,[[n,[not,tail]],[Value1,Value2]]),
+	((not(Value1=[_Head|Value2])
+	%%removebrackets(Value1A,Value1B),
+	%%val1emptyorvalsequal(Value2,Value1A),
+	%%putvalue(Variable2,Value1A,Vars1,Vars2)
+	)->
+	debug_exit(Skip,[[n,[not,tail]],[Value1,Value2]])
+;	debug_fail(Skip,[[n,[not,tail]],[Value1,Value2]])),!.
+
+interpretpart(not_append,Variable1,Variable2,Variable3,Vars1,Vars1) :-
+
+	getvalues(Variable1,Variable2,Variable3,Value1,Value2,Value3,Vars1),
+	debug_call(Skip,[[n,[not,append]],[Value1,Value2,Value3]]),
+	((islist(Value1),islist(Value2),
+	not(append1(Value1,Value2,Value3))
+	%%val1emptyorvalsequal(Value3,Value3A),
+	%% putvalue(Variable3,Value3A,Vars1,Vars2)
+	)->
+	debug_exit(Skip,[[n,[not,append]],[Value1,Value2,Value3]])
+;	debug_fail(Skip,[[n,[not,append]],[Value1,Value2,Value3]])),!.
+
+interpretpart(not_stringconcat,Variable1,Variable2,Variable3,Vars1,Vars1) :-
+
+	getvalues(Variable1,Variable2,Variable3,Value1,Value2,Value3,Vars1),
+	debug_call(Skip,[[n,[not,stringconcat]],[Value1,Value2,Value3]]),
+	((string(Value1),string(Value2),string(Value3),
+	not(stringconcat(Value1,Value2,Value3))
+	%%val1emptyorvalsequal(Value3,Value3A),
+	%%putvalue(Variable3,Value3A,Vars1,Vars2)
+	)->
+	debug_exit(Skip,[[n,[not,stringconcat]],[Value1,Value2,Value3]])
+;	debug_fail(Skip,[[n,[not,stringconcat]],[Value1,Value2,Value3]])),!.
+
+interpretpart([],Variable1,Vars1,Vars1) :-
+	getvalue(Variable1,Value1,Vars1),
+	%%getvalue(Value1,Value1A,Vars1),
+	%%isvalstr(Value1),
+	%%isvalstr(Value1A),
+	not(isempty(Value1)),
+	%%isempty(Value2),
+	%%val1emptyorvalsequal(Value2,Value1),
+	%%isval(Value2),
+debug_call(Skip,[[n,[]],[Value1]]),
+( Value1=[]->
+debug_exit(Skip,[[n,[]],[Value1]])
+; debug_fail(Skip,[[n,[]],[Value1]])),!.
+ +interpretpart([],Variable1,Vars1,Vars2) :- + getvalue(Variable1,Value1,Vars1), + %%getvalue(Value1,Value1A,Vars1), + %%isvalstr(Value1), + %%isvalstr(Value1A), + isempty(Value1), + %%val1emptyorvalsequal(Value1,[]), + %%isval(Value2), +debug_call(Skip,[[n,[]],[variable]]), +( putvalue(Variable1,[],Vars1,Vars2)-> +debug_exit(Skip,[[n,[]],[[]]]) +; debug_fail(Skip,[[n,[]],[variable]])),!. + +interpretpart("",Variable1,Vars1,Vars1) :- + getvalue(Variable1,Value1,Vars1), + %%getvalue(Value1,Value1A,Vars1), + %%isvalstr(Value1), + %%isvalstr(Value1A), + not(isempty(Value1)), + %%isempty(Value2), + %%val1emptyorvalsequal(Value2,Value1), + %%isval(Value2), +debug_call(Skip,[[n,""],[Value1]]), +( Value1=""-> +debug_exit(Skip,[[n,""],[Value1]]) +; debug_fail(Skip,[[n,""],[Value1]])),!. + +interpretpart("",Variable1,Vars1,Vars2) :- + getvalue(Variable1,Value1,Vars1), + %%getvalue(Value1,Value1A,Vars1), + %%isvalstr(Value1), + %%isvalstr(Value1A), + isempty(Value1), + %%val1emptyorvalsequal(Value1,""), + %%isval(Value2), +debug_call(Skip,[[n,""],[variable]]), +( putvalue(Variable1,"",Vars1,Vars2)-> +debug_exit(Skip,[[n,""],[""]]) +; debug_fail(Skip,[[n,""],[variable]])),!. + + +interpretpart("_",_,V,V) :- +debug_call(Skip,[[n,"_"],[variable]]), +debug_exit(Skip,[[n,"_"],[variable]]),!. diff --git a/Combination-Algorithm-Writer-Multiple-Predicates/rcaw.pl b/Combination-Algorithm-Writer-Multiple-Predicates/rcaw.pl new file mode 100644 index 0000000000000000000000000000000000000000..91837de7db05eb17969ba1dda8f976b2845f50d0 --- /dev/null +++ b/Combination-Algorithm-Writer-Multiple-Predicates/rcaw.pl @@ -0,0 +1,298 @@ +:- dynamic debug/1. 
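+
+%% Sketch of intended usage (assumed, not from the original source):
+%% caw00/10 is the entry point; Rules is a list of
+%% [RuleName,NumInputs,NumOutputs] triples, MaxLength bounds the number
+%% of generated lines, and TotalVars bounds the pool of variable names
+%% (a, b, c, ...). A hypothetical query with illustrative arguments:
+%% ?- caw00(off,f,[[+,2,1]],4,8,_,_,[],_,Programs).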
+ +caw00(Debug,PredicateName,Rules,MaxLength,TotalVars,_InputVarList,_OutputVarList,Program1,_Program2,Ps1) :- + repeat, + %%MaxLength2 is MaxLength + 1, + %%TotalVars = MaxLength, + randvars(MaxLength,MaxLength,[],RandVars), + populatevars(RandVars,MaxLength,[],PV), + Code is MaxLength + 1 + 97, + char_code(Char,Code), + OutputVarList=[[[v,Char],1]], + retractall(debug(_)), + assertz(debug(Debug)), + retractall(totalvars(_)), + assertz(totalvars(TotalVars)), + caw0(PredicateName,Rules,MaxLength,PV,OutputVarList,Program1,_Program3,Ps), + sort(Ps,Ps1),not(Ps1=[]),!. + +randvars(0,_,V,V) :- !. +randvars(N,L,V1,V2) :- + random(N1), N2A is round(97+(N1*L)), char_code(V3,N2A), V31=[v,V3], ((member(V31,V1))->randvars(N,L,V1,V2); + (append(V1,[V31],V4), + NA is N-1, randvars(NA,L,V4,V2))),!. +randvars2(0,_L,V,V) :- !. +randvars2(N,L,V1,V2) :- + random(N1), N2A is round(97+(N1*L)), char_code(V3,N2A), atom_string(V3,V4), %%V41=[v,V4], + ((member(V4,V1))->randvars2(N,L,V1,V2); + (append(V1,[V4],V5), + NA is N-1, randvars2(NA,L,V5,V2))),!. +populatevars([],_,PV,PV) :- !. +populatevars([RV1|RVs],MaxLength2,PV1,PV2) :- + randvars2(MaxLength2,MaxLength2,[],RV2), + append(PV1,[[RV1,RV2]],PV3), + populatevars(RVs,MaxLength2,PV3,PV2),!. + +caw0(PredicateName,Rules,MaxLength,InputVarList,OutputVarList,Program1,Program2,Ps2) :- + varnames(InputVarList,[],InputVars,[],InputValues), + varnames(OutputVarList,[],OutputVars,[],_OutputValues), + retractall(outputvars(_)), + assertz(outputvars(OutputVars)), + append(InputVars,OutputVars,Vars11), + %%Vars11=InputVars, + %%Vars12=InputVars, + append(InputValues,OutputVars,Vars2), + %%append(InputValues,OutputValues,Values), + Query=[PredicateName,Vars2], + caw1(Query,PredicateName,Rules,MaxLength,Vars11,InputVars,InputVars,_,OutputVarList,OutputVars,Program1,Program2,[],Ps2), !. + +caw1(_,_,_,0,_,_,_,_,_,_,_,_,Ps,Ps) :- !. 
+caw1(Query,PredicateName,Rules,MaxLength,VarList,InputVars1,InputVars2,InputVars3,OutputVarList,OutputVars,Program1,Program2,Programs2,Ps1) :- + MaxLength2 is MaxLength - 1, + addrules0(InputVars2,OutputVars,OutputVars,[],Program3), + %%writeln([addrules(InputVars2,OutputVars,OutputVars,[],PenultimateVars,[],Program3)]), + %%optimise(Program1,InputVars1,InputVars2,PenultimateVars,Program4), %% IV2->3 + %%writeln([optimise(Program1,InputVars1,InputVars2,PenultimateVars,Program4)]), + append(Program1,Program3,Program5), + append(InputVars1,OutputVars,Vars2), + Program2=[ + [PredicateName,Vars2,":-", + Program5 + ] + ],debug(Debug), + + %%writeln([interpret(Debug,Query,Program2,OutputVarList2)]), + + interpret(Debug,Query,Program2,OutputVarList2), + %%writeln([interpret(Debug,Query,Program2,OutputVarList2)]), + append(Programs2,[[Query,Program2,OutputVarList2]],Programs3), + caw1a(Query,PredicateName,Rules,MaxLength2,VarList,InputVars1,InputVars2,InputVars3,OutputVarList,OutputVars,[],_Program2,Programs3,Ps1),!. + +%%caw1(_Query,_PredicateName,_Rules,_MaxLength,_VarList,_InputVars1,_InputVars2,_InputVars3,_OutputVarList,_OutputVars,_Program1,_Program4,Ps,Ps) :- writeln(here1),!. +caw1(_Query,_PredicateName,_Rules,_MaxLength2,_VarList,_InputVars1,_InputVars2,_InputVars3,_OutputVarList,_OutputVars,[],_Program2,Programs3,Programs3) :- !. + %%writeln([here1, caw1(Query,PredicateName,Rules,MaxLength2,VarList,InputVars1,InputVars2,InputVars3,OutputVarList,OutputVars,[],_Program21,Programs3,Programs3)]),!. 
+ +caw1a(Query,PredicateName,Rules,MaxLength,VarList,InputVars1,InputVars2,InputVars3,OutputVarList,OutputVars,Program1,Program4,Ps1,Ps2) :- + + %%writeln([caw(Query,PredicateName,Rules,MaxLength,VarList,InputVars1,InputVars2,OutputVarList,OutputVars,Program1,Program4)]), + %%MaxLength2 is MaxLength - 1, + %%writeln(["ml",MaxLength2]), + reverse(InputVars2,InputVars5), + random_member([RuleName,NumInputs,NumOutputs],Rules), + %%writeln([member([RuleName,NumInputs,NumOutputs],Rules)]), + %%writeln([rule(RuleName,NumInputs,NumOutputs,VarList,VarList2,Rule)]), + rule(RuleName,NumInputs,NumOutputs,InputVars5,InputVars4,VarList,VarList2,Rule), +%%writeln([inputVars1,InputVars1]), +%% writeln([rule(RuleName,NumInputs,NumOutputs,InputVars5,InputVars4,VarList,VarList2,Rule)]), + append(Program1,[Rule],Program3), + %%writeln([inputVars3,InputVars3]), + %%InputVars2=InputVars3, + %%writeln([program4,Program4]), + +%%caw1a(Query,PredicateName,Rules,MaxLength,VarList,InputVars1,InputVars2,InputVars3,OutputVarList,OutputVars,Program1,Program4,Ps,Ps) :- +%%writeln([here,caw1a(Query,PredicateName,Rules,MaxLength,VarList,InputVars1,InputVars2,InputVars3,OutputVarList,OutputVars,Program1,Program4,Ps,Ps)]) + + +caw(Query,PredicateName,Rules,MaxLength,VarList2,InputVars1,InputVars4,InputVars3,OutputVarList,OutputVars,Program3,Program4,Ps1,Ps2),!. + + +caw(_,_,_,0,_,_,_,_,_,_,_,_,Ps,Ps) :- !. 
+caw(Query,PredicateName,Rules,MaxLength,VarList,InputVars1,InputVars2,InputVars3,OutputVarList,OutputVars,Program1,Program2,Programs1,Programs2) :- + MaxLength2 is MaxLength - 1, + addrules(InputVars2,OutputVars,OutputVars,[],_PenultimateVars,[],Program3), + %%writeln([addrules(InputVars2,OutputVars,OutputVars,[],PenultimateVars,[],Program3)]), + %%optimise(Program1,InputVars1,InputVars2,PenultimateVars,Program4), %% IV2->3 + %%writeln([optimise(Program1,InputVars1,InputVars2,PenultimateVars,Program4)]), + append(Program1,Program3,Program5), + append(InputVars1,OutputVars,Vars2), + Program2=[ + [PredicateName,Vars2,":-", + Program5 + ] + ],debug(Debug), +%%*** +%% () choose iv1 as args during caw, () eliminate e problem, could move forward in optimiser but don't need it v +%% should have a leading edge of 1 immediately previous (new output) as an arg in latest rule v, go backwards to choose latest possible args x, 3 x rules can have same as previous rule's output as an output x: at a time +%% chunks will solve having at least 1 rule that connects to last output +%% can optimise number of inputs + +%% test member, = in caw + + %%writeln([interpret(Debug,Query,Program2,OutputVarList2)]), + + interpret(Debug,Query,Program2,OutputVarList2), + %%writeln([interpret(Debug,Query,Program2,OutputVarList2)]), + append(Programs1,[[Query,Program2,OutputVarList2]],Programs3), + cawa(Query,PredicateName,Rules,MaxLength2,VarList,InputVars1,InputVars2,InputVars3,OutputVarList,OutputVars,[],_Program2,Programs3,Programs2),!. +%%caw(_,_,_,_,_,_,_,_,_,_,_,_,Ps,Ps) :- !. + +caw(_Query,_PredicateName,_Rules,_MaxLength2,_VarList,_InputVars1,_InputVars2,_InputVars3,_OutputVarList,_OutputVars,[],_Program2,Programs3,Programs3) :- + %%writeln([here2, caw(Query,PredicateName,Rules,MaxLength2,VarList,InputVars1,InputVars2,InputVars3,OutputVarList,OutputVars,[],_Program21,Programs3,Programs3)]), + !. 
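+
+%% Shape note (illustrative, not from the original source): Program2 above
+%% is a one-predicate List Prolog program of the form
+%% [[PredicateName,Args,":-",Body]], e.g.
+%% [[f,[[v,a],[v,b]],":-",[[[n,=],[[v,b],[v,a]]]]]]
+%% which interpret/4 then runs against Query to test the candidate program.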
+ + +cawa(Query,PredicateName,Rules,MaxLength,VarList,InputVars1,InputVars2,InputVars3,OutputVarList,OutputVars,Program1,Program4,Ps1,Ps2) :- + %%writeln([caw(Query,PredicateName,Rules,MaxLength,VarList,InputVars1,InputVars2,OutputVarList,OutputVars,Program1,Program4)]), + %%MaxLength2 is MaxLength - 1, + %%writeln(["ml",MaxLength2]), + random_member([RuleName,NumInputs,NumOutputs],Rules), + %%writeln([member([RuleName,NumInputs,NumOutputs],Rules)]), + %%writeln([rule(RuleName,NumInputs,NumOutputs,VarList,VarList2,Rule)]), + rule(RuleName,NumInputs,NumOutputs,InputVars2,InputVars4,VarList,VarList2,Rule), +%%writeln([inputVars1,InputVars1]), +%% writeln([rule(RuleName,NumInputs,NumOutputs,InputVars2,InputVars4,VarList,VarList2,Rule)]), + append(Program1,[Rule],Program3), + %%writeln([inputVars3,InputVars3]), + %%InputVars2=InputVars3, + %%writeln([program4,Program4]), +caw(Query,PredicateName,Rules,MaxLength,VarList2,InputVars1,InputVars4,InputVars3,OutputVarList,OutputVars,Program3,Program4,Ps1,Ps2), !. + +varnames([],Vars,Vars,Values,Values) :- !. +varnames(VarList,Vars1,Vars2,Values1,Values2) :- + VarList=[Var|Vars3], + Var=[VarName,Value], + append(Vars1,[VarName],Vars4), + append(Values1,[Value],Values3), + varnames(Vars3,Vars4,Vars2,Values3,Values2),!. + +addrules0(_,_,[],Program,Program) :- !. +addrules0(VarList,OutputVars1,OutputVars2,Program1,Program2) :- + OutputVars2=[OutputVar|OutputVars3], + random_member(Var,VarList), + random_member(OutputVar,OutputVars1), + append(Program1,[[[n,=],[OutputVar,Var]]],Program3), + addrules0(VarList,OutputVars1,OutputVars3,Program3,Program2),!. + +addrules(_,_,[],PV,PV,Program,Program) :- !. 
+addrules(VarList,OutputVars1,OutputVars2,PenultimateVars1,PenultimateVars2,Program1,Program2) :- + restlast(VarList,[],_,Var), + %%OutputVars2=[OutputVar|OutputVars3], + random_member(OutputVar,OutputVars2), + delete(OutputVars1,OutputVar,OutputVars3), +%% member(Var,VarList), + random_member(OutputVar,OutputVars1), + append(Program1,[[[n,=],[OutputVar,Var]]],Program3), + append(PenultimateVars1,[Var],PenultimateVars3), + addrules2(VarList,OutputVars1,OutputVars3,PenultimateVars3,PenultimateVars2,Program3,Program2),!. + +addrules2(_,_,[],PV,PV,Program,Program) :- !. +addrules2(VarList,OutputVars1,OutputVars2,PenultimateVars1,PenultimateVars2,Program1,Program2) :- +%% restlast(VarList,[],_,Var), + OutputVars2=[OutputVar|OutputVars3], + random_member(Var,VarList), + not(member(Var,PenultimateVars1)), + random_member(OutputVar,OutputVars1), + append(Program1,[[[n,=],[OutputVar,Var]]],Program3), + append(PenultimateVars1,[Var],PenultimateVars3), + addrules2(VarList,OutputVars1,OutputVars3,PenultimateVars3,PenultimateVars2,Program3,Program2),!. + +iv1flagdisjunction(A,B,true) :- + ((A=true)->true; (B=true)),!. +iv1flagdisjunction(_,_,false) :- !. + + +restlast([],_,_,_) :- fail, !. +restlast([Last],Rest,Rest,Last) :- + Last=[v,_],!. +restlast(Last,Rest,Rest,Last) :- + length(Last,1),!. +restlast(Vars1,Rest1,Rest2,Last) :- + Vars1=[Var|Vars2], + append(Rest1,[Var],Rest3), + restlast(Vars2,Rest3,Rest2,Last),!. + + + +rule(RuleName,1,1,InputVars1,InputVars2,VarList,VarList2,Rule) :- + random_member(Var,InputVars1), + rule2(RuleName,Var,VarList,VarList2,Rule,Var1), + restlast(InputVars1,[],_,Last), %% Last should be outputs - 2nd last rule? + (Var=Last->true;Last=Var1), + append(InputVars1,[Var1],InputVars2),!. 
+rule(RuleName,1,2,InputVars1,InputVars2,VarList,VarList2,Rule) :-
+	random_member(Var,InputVars1),
+	rule3(RuleName,Var,VarList,VarList2,Rule,Var1,Var2),
+	restlast(InputVars1,[],_,Last),
+	(Var=Last->true;
+	(Last=Var1->true;Last=Var2)),
+	append(InputVars1,[Var1,Var2],InputVars2),!.
+rule(RuleName,2,0,InputVars,InputVars,VarList,VarList,Rule) :-
+%%writeln([rule(RuleName,2,1,InputVars1,InputVars2,VarList,VarList2,Rule)]),
+	random_member(Var,InputVars),
+	random_member(Vara,InputVars),
+	rule6(RuleName,Var,Vara,_VarList,_VarList2,Rule),
+	restlast(InputVars,[],_,Last),
+%%writeln([last,Last]),
+	(Var=Last->true;Vara=Last),!.
+rule(RuleName,2,1,InputVars1,InputVars2,VarList,VarList2,Rule) :-
+%%writeln([rule(RuleName,2,1,InputVars1,InputVars2,VarList,VarList2,Rule)]),
+	random_member(Var,InputVars1),
+	random_member(Vara,InputVars1),
+	rule4(RuleName,Var,Vara,VarList,VarList2,Rule,Var1),
+	restlast(InputVars1,[],_,Last),
+%%writeln([last,Last]),
+	((Var=Last->true;Vara=Last)->true;
+	(Last=Var1)),
+%%writeln([var,Var,vara,Vara]),
+	append(InputVars1,[Var1],InputVars2),!.
+rule(RuleName,2,2,InputVars1,InputVars2,VarList,VarList2,Rule) :-
+	%% was random_member(Var,InputVars): InputVars was unbound, so these now draw from InputVars1
+	random_member(Var,InputVars1),
+	random_member(Vara,InputVars1),
+	rule5(RuleName,Var,Vara,VarList,VarList2,Rule,Var1,Var2),
+	restlast(InputVars1,[],_,Last),
+	((Var=Last->true;Vara=Last)->true;
+	(Last=Var1->true;Last=Var2)), %% make last var2, use different inputs from previous rule, make this line use a version of previous line as well (args from rule before that) - redo rules based on past programming
+	append(InputVars1,[Var1,Var2],InputVars2),!.
+
+
+rule2(RuleName,Var,VarList,VarList2,Rule,Var1) :-
+	var(VarList,Var1,VarList2),
+	Rule=[RuleName,[Var,Var1]],!.%%,
+	%%member(Var,!.
+
+rule3(RuleName,Var,VarList,VarList3,Rule,Var1,Var2) :-
+	var(VarList,Var1,VarList2),
+	var(VarList2,Var2,VarList3),
+	Rule=[RuleName,[Var,Var1,Var2]],!.
+
+rule6(RuleName,Var,Vara,_VarList,_VarList2,Rule) :-
+	Rule=[RuleName,[Var,Vara]],!.
+ +rule4(RuleName,Var,Vara,VarList,VarList2,Rule,Var1) :- + var(VarList,Var1,VarList2), + Rule=[RuleName,[Var,Vara,Var1]],!. + +%%ae be with predicate support also +rule5(RuleName,Var,Vara,VarList,VarList3,Rule,Var1,Var2) :- + var(VarList,Var1,VarList2), + var(VarList2,Var2,VarList3), + Rule=[RuleName,[Var,Vara,Var1,Var2]],!. + +%%var(Item,Var,Vars,Vars) :- +%% member([Item,Var],Vars). +var(Vars1,Var1,Vars2) :- + length(Vars1,Vars1Length1), + Vars1Length2 is Vars1Length1-1, + length(Vars3,Vars1Length2), + append(Vars3,[Var2],Vars1), + Var2=[v,Var21], + char_code(Var21,Var2Code1), + Var2Code2 is Var2Code1 + 1, + var2(Var2Code2,Var1), + append(Vars1,[Var1],Vars2),!. + +var2(Code,Var1) :- + outputvars(OutputVars), + totalvars(TotalVars), + Code2 is 96+TotalVars, + Code =< Code2, %% 122 + char_code(Var11,Code), + Var1=[v,Var11], + not(member(Var1,OutputVars)),!. +var2(Var2Code,Code3) :- + Var2Code2 is Var2Code + 1, + totalvars(TotalVars), + Code2 is 96+TotalVars, + Var2Code2 =< Code2, + var2(Var2Code2,Code31), + Code3=[v,Code31],!. diff --git a/Combination-Algorithm-Writer-Multiple-Predicates/remove_duplicate_predicates.pl b/Combination-Algorithm-Writer-Multiple-Predicates/remove_duplicate_predicates.pl new file mode 100644 index 0000000000000000000000000000000000000000..91166ebc7aafbf9b08ce900310f5834cecc2d2b2 --- /dev/null +++ b/Combination-Algorithm-Writer-Multiple-Predicates/remove_duplicate_predicates.pl @@ -0,0 +1,14 @@ +%% remove_duplicate_predicates + +%%remove_duplicate_predicates(Predicates1 +%% Predicates1=[Predicate2|Predicates3], + +%% remvdup([a,a,a,b,b,c],[],A) +%% A = [a, b, c]. + +remvdup([],A,A):-!. +remvdup(A,B,C):- + A=[A1|As], + delete(As,A1,B2), + append(B,[A1],D), + remvdup(B2,D,C). 
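+
+%% Note (added): SWI-Prolog's library(lists) provides list_to_set/2, which
+%% also removes duplicates while keeping the left-most occurrences:
+%% ?- list_to_set([a,a,a,b,b,c],A).
+%% A = [a, b, c].
+%% remvdup/3 above is an accumulator-based equivalent.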
diff --git a/Combination-Algorithm-Writer-Stable/LICENSE b/Combination-Algorithm-Writer-Stable/LICENSE new file mode 100644 index 0000000000000000000000000000000000000000..d925e8c4aa63a274f76a5c22de79d0ada99b0da7 --- /dev/null +++ b/Combination-Algorithm-Writer-Stable/LICENSE @@ -0,0 +1,29 @@ +BSD 3-Clause License + +Copyright (c) 2019, Lucian Green +All rights reserved. + +Redistribution and use in source and binary forms, with or without +modification, are permitted provided that the following conditions are met: + +1. Redistributions of source code must retain the above copyright notice, this + list of conditions and the following disclaimer. + +2. Redistributions in binary form must reproduce the above copyright notice, + this list of conditions and the following disclaimer in the documentation + and/or other materials provided with the distribution. + +3. Neither the name of the copyright holder nor the names of its + contributors may be used to endorse or promote products derived from + this software without specific prior written permission. + +THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" +AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE +IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE +DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE +FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL +DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR +SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER +CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, +OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE +OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. 
diff --git a/Combination-Algorithm-Writer-Stable/LPI CAW commands workaround.txt b/Combination-Algorithm-Writer-Stable/LPI CAW commands workaround.txt new file mode 100644 index 0000000000000000000000000000000000000000..8b77d7f22f48f09e8269b23a6ca1d157067e37ce --- /dev/null +++ b/Combination-Algorithm-Writer-Stable/LPI CAW commands workaround.txt @@ -0,0 +1,9 @@ +LPI CAW commands workaround: for when single-argument commands do not work but the no-singletons check does + +_A - enter the value in the header (one output, so do outputs one at a time) + +[] and "" - enter [] or "" somewhere in the header and use the normal "=" + +not - can still be used, except for [], "", etc.; see the previous line + +With the no-singletons check working, CAWP is more stable than CAWMP \ No newline at end of file diff --git a/Combination-Algorithm-Writer-Stable/README.md b/Combination-Algorithm-Writer-Stable/README.md new file mode 100644 index 0000000000000000000000000000000000000000..17952143d3d6ebf88efc2571213f336adf423234 --- /dev/null +++ b/Combination-Algorithm-Writer-Stable/README.md @@ -0,0 +1,71 @@ +# Combination-Algorithm-Writer-Stable +Combination Algorithm Writer with Predicates Stable (CAWPS) + +Please note: this repository works with the example below, which the CAWMP repository does not. + +Combination Algorithm Writer is a SWI-Prolog algorithm that finds combinations of given commands that satisfy the given input and output. + +It verifies that the generated code contains no singleton variables. + +# Prerequisites + +* Please download and install SWI-Prolog for your machine at `https://www.swi-prolog.org/build/`. + +# 1. Install manually + +Download this repository. + +# 2. Or Install from List Prolog Package Manager (LPPM) + +* Download the LPPM Repository: + +``` +mkdir GitHub +cd GitHub/ +git clone https://github.com/luciangreen/List-Prolog-Package-Manager.git +cd List-Prolog-Package-Manager +swipl +['lppm']. +lppm_install("luciangreen","Combination-Algorithm-Writer-Stable"). 
+halt
+```
+
+# Running
+
+* In Shell:
+`cd Combination-Algorithm-Writer-Stable`
+`swipl`
+
+* Load all files in the form:
+```
+['lpi_caw_commands.pl'].
+['listprologinterpreter1listrecursion4copy52'].
+['listprologinterpreter3preds5copy52'].
+['caw5copy11'].
+```
+* Run queries such as:
+
+```
+caw00(off,f,[[+,2,1]],4,8,[[a,1],[b,1],[c,2],[d,1]],[[e,5]],[[[f1,4,1],[a,b,c,d,e],(:-),[[+,[a,b,f]],[+,[c,f,g]],[+,[d,g,h]],[=,[e,h]]]]],[],P),writeln(P).
+
+[[f1,[a,b,c,d,e],:-,[[+,[a,b,f]],[+,[c,f,g]],[+,[d,g,h]],[=,[e,h]]]],[f,[a,b,c,d,e],:-,[[+,[a,b,f]],[+,[c,d,g]],[+,[f,g,h]],[=,[e,h]]]]]
+.
+.
+.
+```
+
+```
+caw00(off,reverse,[[head,1,1],[tail,1,1],[wrap,1,1],[append,2,1],[reverse,2,1]],6,8,[[a,[1,2,3]],[b,[]]],[[c,[3,2,1]]],[[[reverse,2,1],[[],a,a]]],[],P),writeln(P).
+
+not finished
+```
+
+Note: Use (:-) instead of :-.
+
+# CAWPS API
+
+* To run CAWPS on a Prolog server:
+* Move `cawps-api.pl` to the root (`/username/` or `~` on a server) of your machine.
+* Re-enter the paths to your Prolog files in it.
+* Enter `[cawps-api.pl]` in SWI-Prolog and `server(8000).`.
+* On a local host, access the algorithm at `http://127.0.0.1:8000`; on a server, replace 127.0.0.1 with your server address.
diff --git a/Combination-Algorithm-Writer-Stable/caw5copy11.pl b/Combination-Algorithm-Writer-Stable/caw5copy11.pl
new file mode 100644
index 0000000000000000000000000000000000000000..a9dd6568ccd0c7db77b63dfe3d2f76ad630cc8af
--- /dev/null
+++ b/Combination-Algorithm-Writer-Stable/caw5copy11.pl
@@ -0,0 +1,386 @@
+:- dynamic debug/1.
+:- dynamic totalvars/1.
+:- dynamic outputvars/1.
+
+:- dynamic test/1.
+
+:-include('../listprologinterpreter/la_strings.pl').
+
+test1(A) :- test(B),A is B+1,retractall(test(_)),assertz(test(A)).
+ +caw00(Debug,PredicateName,Rules1,MaxLength,TotalVars,InputVarList,OutputVarList,Predicates1,Program1,Program2) :- + split3(Predicates1,[],Rules2), + split2(Predicates1,[],Predicates), + %%writeln([Rules2,Predicates]), + append(Rules1,Rules2,Rules3), + + retractall(debug(_)), + assertz(debug(Debug)), + retractall(totalvars(_)), + assertz(totalvars(TotalVars)), + + retractall(test(_)), + assertz(test(0)), + caw0(Predicates,PredicateName,Rules3,MaxLength,InputVarList,OutputVarList,Program1,Program2). + +caw0(Predicates,PredicateName,Rules,MaxLength,InputVarList,OutputVarList,Program1,Program2) :- + varnames(InputVarList,[],InputVars,[],InputValues), + varnames(OutputVarList,[],OutputVars,[],_OutputValues), + retractall(outputvars(_)), + assertz(outputvars(OutputVars)), +append(InputVars,OutputVars,Vars11), +%%Vars11=InputVars, +%%Vars12=InputVars, + append(InputValues,OutputVars,Vars2), + %%append(InputValues,OutputValues,Values), + Query=[PredicateName,Vars2], + caw(Predicates,Query,PredicateName,Rules,MaxLength,Vars11,InputVars,InputVars,_,OutputVarList,OutputVars,Program1,Program2). +caw(_,_,_,_,0,_,_,_,_,_,_,_) :- fail, !. 
+caw(Predicates,Query,PredicateName,_,_,_VarList,InputVars1,InputVars2,_,OutputVarList,OutputVars,Program1,Program2) :- + addrules(InputVars2,OutputVars,OutputVars,[],PenultimateVars,[],Program3), +%%writeln([addrules(InputVars2,OutputVars,OutputVars,[],PenultimateVars,[],Program3)]), + %%optimise(Program1,InputVars1,InputVars3,PenultimateVars,Program4), %% IV2->3 +%%writeln([optimise(Program1,InputVars1,InputVars3,PenultimateVars,Program4)]), + append(Program1,Program3,Program5), + append(InputVars1,OutputVars,Vars2), + Program22=[ + [PredicateName,Vars2,(:-), + Program5 + ] + ], + + append(Predicates,Program22,Program2), + + (debug(on)->Debug=on;Debug=off), +%writeln([interpret(Debug,Query,Program2,OutputVarList)]), +%%writeln(""), +%trace, + + catch(call_with_time_limit(0.05, + interpret(Debug,Query,Program2,OutputVarList)), + time_limit_exceeded, + fail), + %test1(A),writeln(A), + no_singletons(Vars2,Program5),!. +caw(Predicates,Query,PredicateName,Rules,MaxLength,VarList,InputVars1,InputVars2,InputVars3,OutputVarList,OutputVars,Program1,Program4) :- +%%writeln([caw(Query,PredicateName,Rules,MaxLength,VarList,InputVars1,InputVars2,OutputVarList,OutputVars,Program1,Program4)]), + MaxLength2 is MaxLength - 1, +%%writeln(["ml",MaxLength2]), + member([RuleName,NumInputs,NumOutputs],Rules), +%%writeln([member([RuleName,NumInputs,NumOutputs],Rules)]), +%%writeln([rule(RuleName,NumInputs,NumOutputs,VarList,VarList2,Rule)]), + rule(Program1,RuleName,NumInputs,NumOutputs,InputVars2,InputVars4,VarList,VarList2,Rule), +%%writeln([rule(RuleName,NumInputs,NumOutputs,InputVars1,InputVars3,VarList,VarList2,Rule)]), + append(Program1,[Rule],Program3), +%%writeln([inputVars3,InputVars3]), +%%InputVars2=InputVars3, +%%writeln([program4,Program4]), + caw(Predicates,Query,PredicateName,Rules,MaxLength2,VarList2,InputVars1,InputVars4,InputVars3,OutputVarList,OutputVars,Program3,Program4). + +varnames([],Vars,Vars,Values,Values) :- !. 
+varnames(VarList,Vars1,Vars2,Values1,Values2) :- + VarList=[Var|Vars3], + Var=[VarName,Value], + append(Vars1,[VarName],Vars4), + append(Values1,[Value],Values3), + varnames(Vars3,Vars4,Vars2,Values3,Values2),!. + +addrules(_,_,[],PV,PV,Program,Program) :- !. +addrules(VarList,OutputVars1,OutputVars2,PenultimateVars1,PenultimateVars2,Program1,Program2) :- + OutputVars2=[OutputVar|OutputVars3], + member(Var,VarList), + member(OutputVar,OutputVars1), + append(Program1,[[=,[OutputVar,Var]]],Program3), + append(PenultimateVars1,[Var],PenultimateVars3), + addrules(VarList,OutputVars1,OutputVars3,PenultimateVars3,PenultimateVars2,Program3,Program2). + +%% optimise([[append,[a,a,d]],[append,[a,a,e]],[append,[a,a,f]],[append,[a,b,g]]],[g],P). + +optimise(Program1,InputVars1,InputVars2,PenultimateVars,Program2) :- + findrulesflowingtopv1(Program1,InputVars1,InputVars2,PenultimateVars,[],Rules,true), + %%findrulesflowingtopv1a(Program1,_Program32,InputVars1,InputVars2,PenultimateVars,[],_Rules1), + intersection(Program1,Rules,Program3), + unique1(Program3,[],Program2). +findrulesflowingtopv1(_,_,_,[],Rules,Rules,false). +findrulesflowingtopv1(Program0,InputVars1,InputVars2,Var,Rules1,Rules2,IV1Flag1) :- + (atom(Var);length(Var,1)), + findrulesflowingtopv20(Program0,Program0,InputVars1,InputVars2,Var,Rules1,Rules2,IV1Flag1). +findrulesflowingtopv1(Program0,InputVars1,InputVars2,Vars1,Rules1,Rules2,IV1Flag1) :- + Vars1=[Var|Vars2], + findrulesflowingtopv20(Program0,Program0,InputVars1,InputVars2,Var,Rules1,Rules3,IV1Flag2), + findrulesflowingtopv1(Program0,InputVars1,InputVars2,Vars2,Rules3,Rules2,IV1Flag3), + iv1flagdisjunction(IV1Flag2,IV1Flag3,IV1Flag1). + +%%findrulesflowingtopv2([],Program,Program,_,_,Rules,Rules). +findrulesflowingtopv20(_,[],_InputVars1,_InputVars2,_Var,Rules,Rules,false). 
+findrulesflowingtopv20(Program0,Rules4,InputVars1,InputVars2,Var,Rules1,Rules2,IV1Flag1) :- + Rules4=[Rule|Rules], + (findrulesflowingtopv2(Program0,Rule,InputVars1,InputVars2,Var,Rules1,Rules3,IV1Flag2)->true;(Rules3=Rules1,IV1Flag2=false)), + %%delete(Program0,Rule,Program1), + findrulesflowingtopv20(Program0,Rules,InputVars1,InputVars2,Var,Rules3,Rules2,IV1Flag3),%%p1->0 + iv1flagdisjunction(IV1Flag2,IV1Flag3,IV1Flag1). +%%findrulesflowingtopv2(_,[],[],_,_,_,Rules,Rules). +findrulesflowingtopv2(Program0,Rule,InputVars1,InputVars2,Var,Rules1,Rules2,IV1Flag1) :- + Rule=[_PredicateName,Vars], + restlast(Vars,[],Rest,Var), + %%delete(Program1,[PredicateName,Vars],Program2), + %%Program2=Program1, + %%(not(intersection(Rulesx,Rules1))-> x + %% append, append, unique1 + %%append(Rules1,[Rule],Rules3);Rules3=Rules1), + + %%member(Var2,Rest), + %%member(Var2,InputVars1), + + length(Rest,Length1), Length1>=1, + subtract(Rest,InputVars1,IV3s), + length(IV3s,Length3), + subtract(Rest,IV3s,IV1s), + length(IV1s,Length2), Length2>=1, + subtract(IV3s,InputVars2,[]), + + IV1Flag2=true, + + %%delete(Program0,Rule,Program1), + + %%(delete(Program0,Rule,Program3), + %%iv3s1(IV3s,Program3,IV3s,[]), + (Length3>=1-> + (findrulesflowingtopv1(Program0,InputVars1,InputVars2,IV3s,[],Rules5,IV1Flag3),not(Rules5=[])); + (Rules5=[],IV1Flag3=false)), + iv1flagdisjunction(IV1Flag2,IV1Flag3,IV1Flag4), + %%->true; Rules5=[],IV1Flag1=IV1Flag4), + + ((findrulesflowingtopv1(Program0,InputVars1,InputVars2,IV1s,[],Rules6,IV1Flag5), %%iv1s->rest, etc + iv1flagdisjunction(IV1Flag4,IV1Flag5,IV1Flag1))->true;(Rules6=[],IV1Flag1=IV1Flag4)), + + append([Rule],Rules1,Rules9), + append(Rules9,Rules5,Rules7), + append(Rules7,Rules6,Rules8), + unique1(Rules8,[],Rules2). 
+ +/** +findrulesflowingtopv2(_Program0,Rule,InputVars1,InputVars2,Var,Rules1,Rules2,IV1Flag1) :- + Rule=[_PredicateName,Vars], + restlast(Vars,[],Rest,Var), + %%delete(Program1,[PredicateName,Vars],Program2), + %%Program2=Program1, + (not(member(Rule,Rules1))-> + append(Rules1,[Rule],Rules2);Rules2=Rules1), + subset(Rest,InputVars2), + intersection(Rest,InputVars1,Intersection), + length(Intersection,0), +%% not((member(Var2,Rest), +%% member(Var2,InputVars1))), + IV1Flag1=false. +**/ +/** +findrulesflowingtopv2(Program0,Rule,InputVars1,InputVars2,Var,Rules1,Rules2,IV1Flag1) :- + Rule=[_PredicateName,Vars], + restlast(Vars,[],Rest,Var), + %%delete(Program1,[PredicateName,Vars],Program3), + %%Program3=Program1, + %%append(Rules1,[Rule],Rules3), + subset(Rest,InputVars2), + + intersection(Rest,InputVars1,Intersection), + length(Intersection,0), +%% not((member(Var2,Rest), +%% member(Var2,InputVars1))), +%% delete(Program0,Rule,Program1), + IV1Flag2=false, + findrulesflowingtopv1(Program0,InputVars1,InputVars2,Rest,[],Rules4,IV1Flag3), + %%not(Rules4=[]), + iv1flagdisjunction(IV1Flag2,IV1Flag3,IV1Flag1), + append(Rules1,[Rule],Rules7), + append(Rules7,Rules4,Rules8), + unique1(Rules8,[],Rules2). +**/ +/** +%%->true;(Program2=Program1,Rules2=Rules1)). +findrulesflowingtopv2(Rule,Program0,Program1,_Program2,InputVars1,InputVars,Var,Rules1,Rules2,IV1Flag1) :- + Rule=[PredicateName,Vars], + restlast(Vars,[],Rest,Var), + %%delete(Program1,[PredicateName,Vars],Program4), + %%Program4=Program1, + append(Rules1,[[PredicateName,Vars]],Rules3), + findrulesflowingtopv1(Program0,Program1,_Program2,InputVars1,InputVars,Rest,Rules3,Rules2,IV1Flag3), + iv1flagdisjunction(IV1Flag2,IV1Flag3,IV1Flag1). + %%findrulesflowingtopv2(Program5,Program2,Rest,Rules3,Rules2). +**/ +iv1flagdisjunction(A,B,true) :- + (A=true); (B=true). +iv1flagdisjunction(_,_,false). +/** +iv3s0([],_,IV3s1,IV3s2). +iv3s0(IV3s,Program0,IV3s1,IV3s2). 
+ IV3s=[IV3|IV3s3], + iv3s1(IV3,Program0,IV3s1,IV3s4), + iv3s0(IV3s3,Program0,IV3s4,IV3s2). +iv3s1(_,[],IV3s,IV3s). +iv3s1(IV3,Program0,IV3s1,IV3s2) :- + Program0=[Rule|Rules], + iv3s2(IV3,Rule,IV3s1,IV3s3), + iv3s1(IV3,Rules,IV3s3,IV3s2). +iv3s2(IV3,Rule,IV3s,IV3s1,IV3s2). + Rule=[_PredicateName,Vars], + restlast(Vars,[],_Rest,IV3), + delete(IV3s1,IV3,IV3s2). +findrulesflowingtopv1a(_,_,_,_,[],Rules,Rules). +findrulesflowingtopv1a(Program1,Program2,InputVars1,InputVars2,Var,Rules1,Rules2) :- + atom(Var), + findrulesflowingtopv2a(Program1,Program2,InputVars1,InputVars2,Var,Rules1,Rules2). +findrulesflowingtopv1a(Program1,Program2,InputVars1,InputVars2,Vars1,Rules1,Rules2) :- + Vars1=[Var|Vars2], + findrulesflowingtopv2(Program1,Program3,InputVars1,InputVars2,Var,Rules1,Rules3), + findrulesflowingtopv1a(Program3,Program2,InputVars1,InputVars2,Vars2,Rules3,Rules2). +%%findrulesflowingtopv2([],Program,Program,_,_,Rules,Rules). +findrulesflowingtopv2a([],[],_,_,_,Rules,Rules). +findrulesflowingtopv2a(Program1,Program2,_InputVars1,InputVars2,Var,Rules1,Rules2) :- + member([PredicateName,Vars],Program1), + restlast(Vars,[],Rest,Var), + ( +%%delete(Program1,[PredicateName,Vars],Program2), +Program2=Program1, + append(Rules1,[[PredicateName,Vars]],Rules2), + subset(Rest,InputVars2)). +findrulesflowingtopv2a(Program1,Program2,InputVars1,InputVars2,Var,Rules1,Rules2) :- + member([PredicateName,Vars],Program1), + restlast(Vars,[],Rest,Var), + ( +%%delete(Program1,[PredicateName,Vars],Program3), +Program3=Program1, + append(Rules1,[[PredicateName,Vars]],Rules3), + subset(Rest,InputVars2)), + findrulesflowingtopv1a(Program3,Program2,InputVars1,InputVars2,Rest,Rules3,Rules2). +%%->true;(Program2=Program1,Rules2=Rules1)). 
+findrulesflowingtopv2a(Program1,Program2,InputVars1,InputVars,Var,Rules1,Rules2) :- + member([PredicateName,Vars],Program1), + restlast(Vars,[],Rest,Var), + %%delete(Program1,[PredicateName,Vars],Program4), + Program4=Program1, + append(Rules1,[[PredicateName,Vars]],Rules3), + findrulesflowingtopv1a(Program4,Program2,InputVars1,InputVars,Rest,Rules3,Rules2). + %%findrulesflowingtopv2(Program5,Program2,Rest,Rules3,Rules2). + **/ + +restlast([],_,_,_) :- fail, !. +restlast([Last],Rest,Rest,Last) :- + atom(Last),!. +restlast(Last,Rest,Rest,Last) :- + length(Last,1),!. +restlast(Vars1,Rest1,Rest2,Last) :- + Vars1=[Var|Vars2], + append(Rest1,[Var],Rest3), + restlast(Vars2,Rest3,Rest2,Last),!. + + + rule(Predicates,RuleName,NumInputs,NumOutputs,InputVars20,InputVars4,VarList0,VarList2,Rule) :- + + %InputVars20=InputVars2,% + %VarList0=VarList, + %writeln(Predicates), + /* + %writeln([Predicates,Predicates]), + %(not(Predicates=[])->trace;true), +findall(Rule_vars1,member([_Rule_name,Rule_vars1],Predicates),Rule_vars2),foldr(append,Rule_vars2,Rule_vars3), + +% count vars + sort(Rule_vars3,K), + findall(G,(member(G,K),findall(G,member(G,Rule_vars3),H),length(H,J),J>2),L), +% remove vars occuring more than twice + +(var(InputVars20)->InputVars20=InputVars2;(%trace, +subtract(InputVars20,L,InputVars2)%,notrace +)), + +(var(VarList0)->VarList0=VarList;subtract(VarList0,L,VarList)), +*/ + +rule1(RuleName,NumInputs,NumOutputs,InputVars20,InputVars4,VarList0,VarList2,Rule). + + +rule1(RuleName,1,1,InputVars1,InputVars2,VarList,VarList2,Rule) :- + member(Var,InputVars1), + rule2(RuleName,Var,VarList,VarList2,Rule,Var1), + append(InputVars1,[Var1],InputVars2). +rule2(RuleName,Var,VarList,VarList2,Rule,Var1) :- + var(VarList,Var1,VarList2), + Rule=[RuleName,[Var,Var1]],!. + +rule1(RuleName,1,2,InputVars1,InputVars2,VarList,VarList2,Rule) :- + member(Var,InputVars1), + rule3(RuleName,Var,VarList,VarList2,Rule,Vars), + append(InputVars1,Vars,InputVars2). 
+rule3(RuleName,Var,VarList,VarList3,Rule,[Var1,Var2]) :-
+ var(VarList,Var1,VarList2),
+ var(VarList2,Var2,VarList3),
+ Rule=[RuleName,[Var,Var1,Var2]],!.
+
+rule1(RuleName,2,1,InputVars1,InputVars2,VarList,VarList2,Rule) :-
+ member(Var,InputVars1),
+ member(Vara,InputVars1),
+ rule4(RuleName,Var,Vara,VarList,VarList2,Rule,Var1),
+ append(InputVars1,[Var1],InputVars2).
+rule4(RuleName,Var,Vara,VarList,VarList2,Rule,Var1) :-
+ var(VarList,Var1,VarList2),
+ Rule=[RuleName,[Var,Vara,Var1]],!.
+
+rule1(RuleName,2,2,InputVars1,InputVars2,VarList,VarList2,Rule) :-
+ member(Var,InputVars1),
+ member(Vara,InputVars1),
+ rule5(RuleName,Var,Vara,VarList,VarList2,Rule,Vars),
+ append(InputVars1,Vars,InputVars2).
+rule5(RuleName,Var,Vara,VarList,VarList3,Rule,[Var1,Var2]) :-
+ var(VarList,Var1,VarList2),
+ var(VarList2,Var2,VarList3),
+ Rule=[RuleName,[Var,Vara,Var1,Var2]],!.
+
+%%var(Item,Var,Vars,Vars) :-
+%% member([Item,Var],Vars).
+var(Vars1,Var1,Vars2) :-
+ length(Vars1,Vars1Length1),
+ Vars1Length2 is Vars1Length1-1,
+ length(Vars3,Vars1Length2),
+ append(Vars3,[Var2],Vars1),
+ char_code(Var2,Var2Code1),
+ Var2Code2 is Var2Code1 + 1,
+ var2(Var2Code2,Var1),
+ append(Vars1,[Var1],Vars2),!.
+
+var2(Code,Var1) :-
+ outputvars(OutputVars),
+ totalvars(TotalVars),
+ Code2 is 96+TotalVars,
+ Code =< Code2, %% 122
+ char_code(Var1,Code),
+ not(member(Var1,OutputVars)),!.
+var2(Var2Code,Code3) :-
+ Var2Code2 is Var2Code + 1,
+ totalvars(TotalVars),
+ Code2 is 96+TotalVars,
+ Var2Code2 =< Code2,
+ var2(Var2Code2,Code3),!.
+
+
+no_singletons(Vars1,Program):-
+ findall(DA,(member(C,Program),C=[_E,D],member(DA,D)),Vars2),
+ %%append_list(Vars2,Vars2A),
+ append(Vars1,Vars2,Vars3),
+ findall(Count1,(member(Item,Vars3),aggregate_all(count,(member(Item,Vars3)),Count1),
+ Count1=1),G),G=[].
+
+split3([],List,List) :- !.
+split3(Predicates1,List1,List2) :-
+ Predicates1=[Item1|List4],
+ Item1= [[Name,In,Out]|_Rest],
+ append(List1,[[Name,In,Out]],List6),
+ split3(List4,List6,List2),!.
+ +split2([],List,List) :- !. +split2(Predicates1,List1,List2) :- + Predicates1=[Item1|List4], + Item1= [[Name,_In,_Out]|Rest], + append(List1,[[Name|Rest]],List6), + split2(List4,List6,List2),!. + diff --git a/Combination-Algorithm-Writer-Stable/cawplistprolog.pl b/Combination-Algorithm-Writer-Stable/cawplistprolog.pl new file mode 100644 index 0000000000000000000000000000000000000000..13fd7e293b18158063836b011f450abf31bd31ec --- /dev/null +++ b/Combination-Algorithm-Writer-Stable/cawplistprolog.pl @@ -0,0 +1,5 @@ +:-include('lpi_caw_commands.pl'). +:-include('listprologinterpreter1listrecursion4copy52'). +:-include('listprologinterpreter3preds5copy52'). +:-include('caw5copy11'). +:-include('cawpverify.pl'). \ No newline at end of file diff --git a/Combination-Algorithm-Writer-Stable/cawps-api.pl b/Combination-Algorithm-Writer-Stable/cawps-api.pl new file mode 100644 index 0000000000000000000000000000000000000000..1ff20a837dae1de690d58505eb34d29ac619f35c --- /dev/null +++ b/Combination-Algorithm-Writer-Stable/cawps-api.pl @@ -0,0 +1,103 @@ +:- use_module(library(http/thread_httpd)). +:- use_module(library(http/http_dispatch)). +:- use_module(library(http/http_error)). +:- use_module(library(http/html_write)). + +% we need this module from the HTTP client library for http_read_data +:- use_module(library(http/http_client)). +:- http_handler('/', web_form, []). + +:- include('files/Combination-Algorithm-Writer-Stable/listprologinterpreter1listrecursion4copy52.pl'). +:- include('files/Combination-Algorithm-Writer-Stable/listprologinterpreter3preds5copy52.pl'). +:- include('files/Combination-Algorithm-Writer-Stable/caw5copy11.pl'). +:- include('files/Combination-Algorithm-Writer-Stable/lpi_caw_commands.pl'). + +server(Port) :- + http_server(http_dispatch, [port(Port)]). 
+
+ /*
+ browse http://127.0.0.1:8000/
+ This demonstrates handling POST requests
+ */
+
+ web_form(_Request) :-
+ reply_html_page(title('Combination Algorithm Writer Stable'),
+
+ [
+ form([action='/landing', method='POST'], [
+ /**
+ p([], [
+ label([for=debug],'Debug (on/off):'),
+ input([name=debug, type=textarea])
+ ]),
+ **/
+ p([], [
+ label([for=predicatename],'Predicate Name (e.g. f):'),
+ input([name=predicatename, type=textarea])
+ ]),
+
+ p([], [
+ label([for=rules],'Rules to choose from (e.g. [[+,2,1]]):'),
+ input([name=rules, type=textarea])
+ ]),
+
+ p([], [
+ label([for=maxlength],'Max Length of Algorithm Body (e.g. 4):'),
+ input([name=maxlength, type=textarea])
+ ]),
+
+ p([], [
+ label([for=totalvars],'Total Vars in Predicate (e.g. 8):'),
+ input([name=totalvars, type=textarea])
+ ]),
+ p([], [
+ label([for=inputvarlist],'InputVarList (e.g. [[a,1],[b,1],[c,2],[d,1]]):'),
+ input([name=inputvarlist, type=textarea])
+ ]),
+
+ p([], [
+ label([for=outputvarlist],'OutputVarList (e.g. [[e,5]]):'),
+ input([name=outputvarlist, type=textarea])
+ ]),
+
+ p([], [
+ label([for=algorithmlibrary],'Algorithm Library (e.g. [[[f1,4,1],[a,b,c,d,e],(:-),[[+,[a,b,f]],[+,[c,f,g]],[+,[d,g,h]],[=,[e,h]]]]], where 4 is the number of inputs and 1 is the number of outputs):'),
+ input([name=algorithmlibrary, type=textarea])
+ ]),
+
+ p([], input([name=submit, type=submit, value='Submit'], []))
+ ])]).
+
+ :- http_handler('/landing', landing_pad, []).
+
+ landing_pad(Request) :-
+ member(method(post), Request), !,
+ http_read_data(Request, Data, []),
+ format('Content-type: text/html~n~n', []),
+ format('<html>', []),
+ %%portray_clause(Data),
+
+ %%term_to_atom(Term,Data),
+
+ %% Predicatename,Rules,Maxlength,Totalvars,Inputvarlist,Outputvarlist,Algorithmlibrary
+ %%portray_clause(Data),
+
+
+Data=[%%debug='off',%%Debug1,
+predicatename=Predicatename,rules=Rules,maxlength=Maxlength,totalvars=Totalvars,inputvarlist=Inputvarlist,outputvarlist=Outputvarlist,algorithmlibrary=Algorithmlibrary,submit=_],
+
+term_to_atom(Debug1,'off'),
+term_to_atom(Predicatename1,Predicatename),
+term_to_atom(Rules1,Rules),
+term_to_atom(Maxlength1,Maxlength),
+term_to_atom(Totalvars1,Totalvars),
+term_to_atom(Inputvarlist1,Inputvarlist),
+term_to_atom(Outputvarlist1,Outputvarlist),
+term_to_atom(Algorithmlibrary1,Algorithmlibrary),
+
+caw00(Debug1,Predicatename1,Rules1,Maxlength1,Totalvars1,Inputvarlist1,Outputvarlist1,Algorithmlibrary1,[],Result),
+ %%format('========~n', []),
+ portray_clause(Result),
+ %%writeln1(Data),
+
+format('</html>').
\ No newline at end of file
diff --git a/Combination-Algorithm-Writer-Stable/cawpverify.pl b/Combination-Algorithm-Writer-Stable/cawpverify.pl
new file mode 100644
index 0000000000000000000000000000000000000000..39ac852fbe4aba953432a0e26271f23e7cb230a5
--- /dev/null
+++ b/Combination-Algorithm-Writer-Stable/cawpverify.pl
@@ -0,0 +1,51 @@
+%% cawptest(Debug[on/off],Total,Score).
+
+cawptest(Debug,NTotal,Score) :- cawptest(Debug,0,NTotal,0,Score),!.
+cawptest(_Debug,NTotal,NTotal,Score,Score) :- NTotal=1, !.
+cawptest(Debug,NTotal1,NTotal2,Score1,Score2) :-
+ NTotal3 is NTotal1+1,
+ cawptest2(NTotal3,Function,Rules,MaxLength,TotalVars,InputVarList,OutputVarList,AlgDict,Program1),
+
+ %%writeln([cawptest2(NTotal3,Specifications,Program1)]),
+ (((%%writeln(caw00(Debug,function0,[],5,TotalVars,Specifications,[],Program1)),
+ caw00(Debug,Function,Rules,MaxLength,TotalVars,InputVarList,OutputVarList,AlgDict,[],Program1)
+
+ %%sort(Program1,ProgramA),
+ %%sort(Program2,ProgramA)
+ %%writeln1(Program1),writeln1(Program2)
+ %%Program1=Program2
+ ))->(Score3 is Score1+1,writeln([cawptest,NTotal3,passed]));(Score3=Score1,writeln([cawptest,NTotal3,failed]))),
+ writeln(""),
+ cawptest(Debug,NTotal3,NTotal2,Score3,Score2),!.
+
+%% Test individual cases. Debug=on/off (trace), N=case number, Passed=result.
+
+cawptest1(Debug,N,Passed) :-
+ cawptest2(N,Function,Rules,MaxLength,TotalVars,InputVarList,OutputVarList,AlgDict,Program1),
+ %%writeln1([cawptest2(N,Specifications,Program1)]),
+ %%writeln1(caw00(Debug,Function,Rules,MaxLength,MaxPredicates,TotalVars,Numinputs, Numoutputs,Specifications,AlgDict,[],Program2)),
+
+ (((caw00(Debug,Function,Rules,MaxLength,TotalVars,InputVarList,OutputVarList,AlgDict,[],Program2),
+ %%sort(Program1,ProgramA),
+ %%sort(Program2,ProgramA)
+%%trace,
+ writeln(Program1),
+ writeln(Program2),Program1=Program2
+ ))->(Passed=passed,writeln([cawptest,N,passed]));(Passed=failed,writeln([cawptest,N,failed]))),!.
+ +% ["Mind Reading","mr for time travel 3.txt",0,algorithms,"34. I mind read the correct universe containing all my collections of areas of study, then time travelled to it."] + +% The area of study was +++. + +cawptest2(1,f,[[+,2,1%% Modes=2 inputs, 1 output +]],4, +8, + [[a,1],[b,1],[c,2],[d,1]],[[e,5]] +, +[ +%[[f1,4,1],[a,b,c,d,e],(:-),[[+,[a,b,f]],[+,[c,f,g]],[+,[d,g,h]],[=,[e,h]]]] +], %% Algorithm dictionary + %% Result + %[f1,[a,b,c,d,e],:-,[[+,[a,b,f]],[+,[c,f,g]],[+,[d,g,h]],[=,[e,h]]]], +[[f,[a,b,c,d,e],:-,[[+,[a,b,f]],[+,[c,d,g]],[+,[f,g,h]],[=,[e,h]]]]] +). diff --git a/Combination-Algorithm-Writer-Stable/listprologinterpreter1listrecursion4copy52.pl b/Combination-Algorithm-Writer-Stable/listprologinterpreter1listrecursion4copy52.pl new file mode 100644 index 0000000000000000000000000000000000000000..85c2605348d9f64de1f8a21a9399e93849ca5bc8 --- /dev/null +++ b/Combination-Algorithm-Writer-Stable/listprologinterpreter1listrecursion4copy52.pl @@ -0,0 +1,584 @@ +/**:- include('listprologinterpreter3preds5.pl'). +:- include('lpiverify4.pl'). +:- include('caw5 copy 11.pl'). +**/ +:- dynamic debug/1. + +/** List Prolog Interpreter **/ + +interpret(Debug,Query,Functions,Result) :- +%%writeln([i1]), + interpret1(Debug,Query,Functions,Functions,Result), + !. +interpret1(Debug,Query,Functions1,Functions2,Result) :- +%%writeln([i11]), + retractall(debug(_)), + assertz(debug(Debug)), + retractall(cut(_)), + assertz(cut(off)), + member1(Query,Functions1,Functions2,Result). +%%member1([_,R],_,[],R). +%%member1(_,_,[],[]). +member1(_,_,[],_) :- fail. 
+member1(Query,Functions,Functions2,Vars8) :- +%%writeln([m1]), + cut(off)->( + (Query=[Function,Arguments1], + (Functions2=[[Function,Arguments2,(:-),Body]|_Functions3]), + length(Arguments1,Length), + length(Arguments2,Length), + checkarguments(Arguments1,Arguments2,[],Vars1,[],FirstArgs),!, %%->ca2 +%%writeln([checkarguments,"Arguments1",Arguments1,"Arguments2",Arguments2,"Vars1",Vars1,"FirstArgs",FirstArgs]), + (debug(on)->(writeln([call,[Function,Arguments1],"Press c."]),(not(get_single_char(97))->true;abort));true), + interpretbody(Functions,Functions2,Vars1,Vars2,Body,true), + updatevars(FirstArgs,Vars2,[],Result), + %%reverse(Result,[],Vars7), + ((not(Result=[])-> + %%Result=[Var71|Vars72], + unique1(Result,[],Vars8), +%%writeln(["FirstArgs",FirstArgs,"Vars",Vars2,"Result",Result,"Vars7",Vars7,"Vars72",Vars72,"Var71",Var71,"Vars8",Vars8]), +%%writeln(["Vars8",Vars8]), + findresult3(Arguments1,Vars8,[],Result2) +%%writeln([findresult3,"Arguments1",Arguments1,"Vars8",Vars8,"Result2",Result2]) + );( +%%writeln(here1), + Vars8=[],Result2=[])), +%%writeln(["Arguments1",Arguments1,"Vars2",Vars2,"Result",Result]), + (debug(on)->(writeln([exit,[Function,Result2],"Press c."]),(not(get_single_char(97))->true;abort));true)) + ->true; + (%%Query=[Function,_Arguments1], + %%Functions2=[[Function,_Arguments2,(:-),_Body]|Functions3], %% make like previous trunk? + member11(Query,Functions,Functions2,Vars8)) + );(turncut(off)%%,Result=[] + ). 
+member11(Query,Functions,Functions2,Result) :- +%%writeln([m11]), +%%writeln(["Query",Query,"Functions",Functions,"Functions2",Functions2,"Result",Result]), + cut(off)->( + (Query=[Function], + (Functions2=[[Function,(:-),Body]|_Functions3]), + (debug(on)->(writeln([call,[Function],"Press c."]),(not(get_single_char(97))->true;abort));true), + Result=[], + interpretbody(Functions,Functions2,[],_Vars2,Body,true),!, + (debug(on)->(writeln([exit,[Function],"Press c."]),(not(get_single_char(97))->true;abort));true) + )->true; + (%%Query=[Function], + %%Functions2=[[Function]|Functions3], + member12(Query,Functions,Functions2,Result)) + );(turncut(off)). +member12(Query,Functions,Functions2,Vars8) :- +%%writeln([m12]), + cut(off)->( + (Query=[Function,Arguments1], + (Functions2=[[Function,Arguments2]|_Functions3]), + length(Arguments1,Length), + length(Arguments2,Length), + checkarguments(Arguments1,Arguments2,[],Vars1,[],FirstArgs),!, %% ->ca2 +%%writeln([checkarguments,"Arguments1",Arguments1,"Arguments2",Arguments2,"Vars1",Vars1,"FirstArgs",FirstArgs]), + updatevars(FirstArgs,Vars1,[],Result), + %%reverse(Result,[],Vars7), + ((not(Result=[])-> + %%Result=[Var71|Vars72], + unique1(Result,[],Vars8), + findresult3(Arguments1,Vars8,[],Result2) + );( +%%writeln(here2), + Vars8=[],Result2=[])), + (debug(on)->(writeln([call,[Function,Arguments1],"Press c."]),(not(get_single_char(97))->true;abort));true), + (debug(on)->(writeln([exit,[Function,Result2],"Press c."]),(not(get_single_char(97))->true;abort));true) + )->true; + (%%Query=[Function,_Arguments1], + %%Functions2=[[Function,_Arguments2]|Functions3], + member13(Query,Functions,Functions2,Vars8)) + );(turncut(off)). 
+member13(Query,Functions,Functions2,Result) :- +%%writeln([m13]), + cut(off)->( + (Query=[Function],!, + (Functions2=[[Function]|_Functions3]), + (debug(on)->(writeln([call,[Function],"Press c."]),(not(get_single_char(97))->true;abort));true), + Result=[], + %%interpretbody(Functions,[],_Vars2,Body,true), + (debug(on)->(writeln([exit,[Function],"Press c."]),(not(get_single_char(97))->true;abort));true) + )->true; + (%%Query=[Function], + Functions2=[_Function|Functions3], + member1(Query,Functions,Functions3,Result)) + );(turncut(off)). +interpret2(Query,Functions1,Functions2,Result) :- +%%writeln(i2), +%%writeln(["%%interpret2 Query",Query,"Functions1",Functions1,"Functions2",Functions2]), + member2(Query,Functions1,Functions2,Result). +%%member2([_,R],_,[],R). +%%member2(_,_,[],[]). +member2(_,_,[],_) :- fail. +member2(Query,Functions,Functions2,Vars8) :- +%%writeln([m2]), + cut(off)->( + (Query=[Function,Arguments1], + (Functions2=[[Function,Arguments2,(:-),Body]|_Functions3]), + length(Arguments1,Length), + length(Arguments2,Length), + checkarguments(Arguments1,Arguments2,[],Vars1,[],FirstArgs),!, +%%writeln([checkarguments,"Arguments1",Arguments1,"Arguments2",Arguments2,"Vars1",Vars1,"FirstArgs",FirstArgs]), + (debug(on)->(writeln([call,[Function,Arguments1],"Press c."]),(not(get_single_char(97))->true;abort));true), + interpretbody(Functions,Functions2,Vars1,Vars2,Body,true), %%**arg2 change +%%writeln(["Functions",Functions,"Functions2",Functions2,"Vars1",Vars1,"Vars2",Vars2,"Body",Body]), + updatevars(FirstArgs,Vars2,[],Result), + %%reverse(Result,[],Vars7), + ((not(Result=[])-> + %%Result=[Var71|Vars72], + unique1(Result,[],Vars8), + findresult3(Arguments1,Vars8,[],Result2) +%%writeln(["Vars2",Vars2,"Result",Result]), + );( + writeln(here3), + Vars8=[],Result2=[])), + (debug(on)->(writeln([exit,[Function,Result2],"Press c."]),(not(get_single_char(97))->true;abort));true) + )->true; + (%%Query=[Function,_Arguments1], + 
%%Functions2=[[Function,_Arguments2,(:-),_Body]|Functions3], + member21(Query,Functions,Functions2,Vars8)) + );(turncut(off)). +member21(Query,Functions,Functions2,Result) :- +%%writeln([m21]), + cut(off)->( + (Query=[Function], + (Functions2=[[Function,(:-),Body]|_Functions3]), + Vars1=[], + (debug(on)->(writeln([call,[Function],"Press c."]),(not(get_single_char(97))->true;abort));true), + interpretbody(Functions,Functions2,Vars1,_Vars2,Body,true),!, %%**arg2 change + (debug(on)->(writeln([exit,[Function],"Press c."]),(not(get_single_char(97))->true;abort));true) + )->true; + (%%Query=[Function], + %%Functions2=[[Function]|Functions3], + member22(Query,Functions,Functions2,Result)) + );(turncut(off)). +member22(Query,Functions,Functions2,Vars8) :- +%%writeln([m22]), + cut(off)->( + (Query=[Function,Arguments1], + (Functions2=[[Function,Arguments2]|_Functions3]), + length(Arguments1,Length), + length(Arguments2,Length), + checkarguments(Arguments1,Arguments2,[],Vars1,[],FirstArgs),!, +%%writeln([checkarguments,"Arguments1",Arguments1,"Arguments2",Arguments2,"Vars1",Vars1,"FirstArgs",FirstArgs]), + updatevars(FirstArgs,Vars1,[],Result), + %%reverse(Result,[],Vars7), + ((not(Result=[])-> + %%Result=[Var71|Vars72], + unique1(Result,[],Vars8), + findresult3(Arguments1,Vars8,[],Result2) + );( +%%writeln(here4), + Vars8=[],Result2=[])), + (debug(on)->(writeln([call,[Function,Arguments1],"Press c."]),(not(get_single_char(97))->true;abort));true), + (debug(on)->(writeln([exit,[Function,Result2],"Press c."]),(not(get_single_char(97))->true;abort));true) + )->true; + (%%Query=[Function,_Arguments1], + %%Functions2=[[Function,_Arguments2]|Functions3], + member23(Query,Functions,Functions2,Vars8)) + );(turncut(off)). 
+member23(Query,Functions,Functions2,Vars8) :- +%%writeln([m23]), + cut(off)->( + (Query=[Function],!, + (Functions2=[[Function]|_Functions3]), + (debug(on)->(writeln([call,[Function],"Press c."]),(not(get_single_char(97))->true;abort));true), + Vars8=[], + (debug(on)->(writeln([exit,[Function],"Press c."]),(not(get_single_char(97))->true;abort));true) + )->true; + (%%Query=[Function], + Functions2=[_Function|Functions3], + member2(Query,Functions,Functions3,Vars8)) + );(turncut(off)). +checkarguments([],[],Vars,Vars,FirstArgs,FirstArgs). +checkarguments(Arguments1,Arguments2,Vars1,Vars2,FirstArgs1,FirstArgs2) :- %% +%%writeln(1), + Arguments1=[Value|Arguments3], %% Value may be a number, string, list or tree + expressionnotatom(Value), + Arguments2=[Variable2|Arguments4], + isvar(Variable2), + putvalue(Variable2,Value,Vars1,Vars3), + checkarguments(Arguments3,Arguments4,Vars3,Vars2,FirstArgs1,FirstArgs2). +checkarguments(Arguments1,Arguments2,Vars1,Vars2,FirstArgs1,FirstArgs2) :- %%A +%%writeln(2), + Arguments1=[Variable|Arguments3], %% Value may be a number, string, list or tree + isvar(Variable), + Arguments2=[Value|Arguments4], + expressionnotatom(Value), + putvalue(Variable,Value,Vars1,Vars3), + append(FirstArgs1,[[Variable,_]],FirstArgs3), + checkarguments(Arguments3,Arguments4,Vars3,Vars2,FirstArgs3,FirstArgs2). +checkarguments(Arguments1,Arguments2,Vars1,Vars2,FirstArgs1,FirstArgs2) :- +%%writeln(3), + Arguments1=[Variable1|Arguments3], + isvar(Variable1), + Arguments2=[Variable2|Arguments4], + isvar(Variable2), + (getvalue(Variable2,Value,Vars1)->true;Value=empty), + putvalue(Variable2,Value,Vars1,Vars3), + append(FirstArgs1,[[Variable1,Variable2]],FirstArgs3), + checkarguments(Arguments3,Arguments4,Vars3,Vars2,FirstArgs3,FirstArgs2). 
+checkarguments(Arguments1,Arguments2,Vars1,Vars2,FirstArgs1,FirstArgs2) :- +%%writeln(4), + Arguments1=[Value1|Arguments3], + expressionnotatom(Value1), + Arguments2=[Value1|Arguments4], + expressionnotatom(Value1), + checkarguments(Arguments3,Arguments4,Vars1,Vars2,FirstArgs1,FirstArgs2). + +interpretbody(_Functions1,_Functions2,Vars,Vars,[],true) :- !. + +interpretbody(Functions0,Functions,Vars1,Vars2,Body,Result1) :- + Body=[Statements1|Statements2], + interpretbody(Functions0,Functions,Vars1,Vars3,Statements1,Result2), + interpretbody(Functions0,Functions,Vars3,Vars2,Statements2,Result3), + %%((Result3=cut)->!;true), + logicalconjunction(Result1,Result2,Result3),!. + +interpretbody(Functions0,Functions,Vars1,Vars2,Body,Result1) :- + Body=[[not,[Statements]]|Statements2], + interpretbody(Functions0,Functions,Vars1,Vars3,Statements,Result2), + %%((Result2=cut)->!;true), + interpretbody(Functions0,Functions,Vars3,Vars2,Statements2,Result3), + %%((Result3=cut)->!;true), + logicalnot(Result2,Result4), + (logicalconjunction(Result1,Result4,Result3)->true;(Result1=false)),!. +interpretbody(Functions0,Functions,Vars1,Vars2,Body,Result1) :- + Body=[[Statements1],or,[Statements2]], + (interpretbody(Functions0,Functions,Vars1,Vars2,Statements1,Result1); + %%,((Value1=cut)->!;true)); + interpretbody(Functions0,Functions,Vars1,Vars2,Statements2,Result1)),!. + %%,((Value=cut)->!;true)). + %%(logicaldisjunction(Result1,Value1,Value2)->true;(Result1=false)). 
+ + +interpretbody(Functions0,Functions,Vars1,Vars2,Body,Result1) :- + Body=[Statement|Statements], +%%writeln(["Functions0",Functions0,"Functions",Functions,"Statement",Statement,"Vars1",Vars1,"Vars3",Vars3,"Result2",Result2,"Cut",Cut]), + interpretstatement1(Functions0,Functions,Statement,Vars1,Vars3,Result2,Cut), +%%writeln(["here1"]), + ((not(Cut=cut))->(Functions2=Functions);(turncut(on))), %% cut to interpret1/2 (assertz) +%%writeln(["here3"]), + interpretbody(Functions0,Functions2,Vars3,Vars2,Statements,Result3), + %%((Result3=cut)->!;true), +%%writeln(["here4"]), + logicalconjunction(Result1,Result2,Result3),!. +%%writeln([Result1,Result2,Result3]). +turncut(State1) :- + cut(State2), + retract(cut(State2)), + assertz(cut(State1)). +logicaldisjunction(true,Result2,Result3) :- + true(Result2);true(Result3). +logicalconjunction(true,Result2,Result3) :- + true(Result2),true(Result3). +logicalnot(Result1,Result2) :- + true(Result1),false(Result2). +logicalnot(Result1,Result2) :- + false(Result1),true(Result2). +true(true). +false(false). + +%%interpretstatement1(_F0,[],_,Vars,Vars,true,nocut) :- ! +%%writeln("AND HERE!") +%% . + +interpretstatement1(_F0,_Functions,[cut,[]],Vars,Vars,true,cut) :- !. + +interpretstatement1(_F0,_Functions,[atom,[Variable]],Vars,Vars,true,nocut) :- + getvalue(Variable,Value,Vars), + (debug(on)->(writeln([call,[atom,[Value]],"Press c."]),(not(get_single_char(97))->true;abort));true), + atom(Value), + (debug(on)->(writeln([exit,[atom,[Value]],"Press c."]),(not(get_single_char(97))->true;abort));true). + +interpretstatement1(_F0,_Functions,[string,[Variable]],Vars,Vars,true,nocut) :- + getvalue(Variable,Value,Vars), + (debug(on)->(writeln([call,[string,[Value]],"Press c."]),(not(get_single_char(97))->true;abort));true), + string(Value), + (debug(on)->(writeln([exit,[string,[Value]],"Press c."]),(not(get_single_char(97))->true;abort));true). 
+ +interpretstatement1(_F0,_Functions,[number,[Variable]],Vars,Vars,true,nocut) :- + getvalue(Variable,Value,Vars), + (debug(on)->(writeln([call,[number,[Value]],"Press c."]),(not(get_single_char(97))->true;abort));true), + number(Value), + (debug(on)->(writeln([exit,[number,[Value]],"Press c."]),(not(get_single_char(97))->true;abort));true). + +interpretstatement1(_F0,_Functions,[variable,[Variable]],Vars,Vars,true,nocut) :- + var(Variable), + (debug(on)->(writeln([call,[variable,[Variable]],"Press c."]),(not(get_single_char(97))->true;abort));true), + (debug(on)->(writeln([exit,[variable,[Variable]],"Press c."]),(not(get_single_char(97))->true;abort));true). + +interpretstatement1(_F0,_Functions,[Operator,[Variable1,Variable2]],Vars1,Vars2,true,nocut) :- + isop(Operator), + interpretpart(is,Variable1,Variable2,Vars1,Vars2). + +interpretstatement1(_F0,_Functions,[Operator,[Variable1,Variable2]],Vars1,Vars2,true,nocut) :- +%%writeln(31), + isop(Operator), + interpretpart(is,Variable2,Variable1,Vars1,Vars2). + +interpretstatement1(_F0,_Functions,[+,[Variable2,Variable3,Variable1]],Vars1,Vars2,true,nocut) :- +%%writeln(4), + interpretpart(isplus,Variable1,Variable2,Variable3,Vars1,Vars2). + +%%interpretstatement1(_F0,_Functions,[Variable2+Variable3,is,Variable1],Vars1,Vars2,true,nocut) :- +%%writeln(41), + %%interpretpart(isplus,Variable1,Variable2,Variable3,Vars1,Vars2). + +interpretstatement1(_F0,_Functions,[=,[Variable1,[Variable2,Variable3]]],Vars1,Vars2,true,nocut) :- +%%writeln(5), + interpretpart(match,Variable1,Variable2,Variable3,Vars1,Vars2). + +%%interpretstatement1(_F0,_Functions,[[Variable2,Variable3]=Variable1],Vars1,Vars2,true,nocut) :- +%%writeln(51), +%% interpretpart(match,Variable1,Variable2,Variable3,Vars1,Vars2). + +interpretstatement1(_F0,_Functions,[=,[Variable1,[Variable2]]],Vars1,Vars2,true,nocut) :- +%%writeln(52), + interpretpart(bracket1,Variable1,Variable2,Vars1,Vars2). 
+ +interpretstatement1(_F0,_Functions,[=,[[Variable1],[Variable2]]],Vars1,Vars2,true,nocut) :- +%%writeln(53), + interpretpart(bracket2,Variable1,Variable2,Vars1,Vars2). + +interpretstatement1(_F0,_Functions,[head,[Variable1,Variable2]],Vars1,Vars2,true,nocut) :- +%%writeln(6), + interpretpart(head,Variable1,Variable2,Vars1,Vars2). + +interpretstatement1(_F0,_Functions,[tail,[Variable1,Variable2]],Vars1,Vars2,true,nocut) :- +%%writeln(61), + interpretpart(tail,Variable1,Variable2,Vars1,Vars2). + + +interpretstatement1(_F0,_Functions,[member,[Variable1,Variable2]],Vars1,Vars2,true,nocut) :- +%%writeln(8), + interpretpart(member,Variable1,Variable2,Vars1,Vars2). + +interpretstatement1(_F0,_Functions,[delete,[Variable1,Variable2,Variable3]],Vars1,Vars2,true,nocut) :- +%%writeln(), + interpretpart(delete,Variable1,Variable2,Variable3,Vars1,Vars2). +%%** all in form f,[1,1,etc], including + with 0,1 + + +interpretstatement1(_F0,_Functions,[wrap,[Variable1,Variable2]],Vars1,Vars2,true,nocut) :- +%%writeln1(52), wrap +%%writeln([[n,wrap],[Variable1,Variable2]]), + interpretpart(bracket1,Variable1,Variable2,Vars1,Vars2). + + + +interpretstatement1(_F0,_Functions,[append,[Variable1,Variable2,Variable3]],Vars1,Vars2,true,nocut) :- +%%writeln(9), + interpretpart(append,Variable1,Variable2,Variable3,Vars1,Vars2). + +interpretstatement1(Functions0,_Functions,Query1,Vars1,Vars8,true,nocut) :- +%%writeln("h1/10"), + Query1=[Function,Arguments], +%%writeln(["Arguments",Arguments,"Vars1",Vars1]), + substitutevarsA1(Arguments,Vars1,[],Vars3,[],FirstArgs), %%% var to value, after updatevars: more vars to values, and select argument vars from latest vars +%%writeln([substitutevarsA1,arguments,Arguments,vars1,Vars1,vars3,Vars3,firstargs,FirstArgs]), + Query2=[Function,Vars3], %% Bodyvars2? 
+%% debug(on)->writeln([call,[Function,[Vars3]]]),
+%%writeln(["Query2",Query2,"Functions0",Functions0]),
+ interpret2(Query2,Functions0,Functions0,Result1),
+ updatevars2(FirstArgs,Result1,[],Vars5),
+ updatevars3(Vars1,Vars5,Vars6),
+ reverse(Vars6,[],Vars7),
+ ((not(Vars7=[])->
+ %%Vars7=[Var71|Vars72],
+ unique1(Vars7,[],Vars8)
+);(
+%%writeln(here1),
+ Vars8=[])).
+%%**** reverse and take first instance of each variable.
+ %%findresult3(Arguments,Vars6,[],Result2)
+%%writeln(["FirstArgs",FirstArgs,"Result1",Result1,"Vars5",Vars5,"Vars4",Vars4]),
+%%writeln(["Vars1:",Vars1,"Vars4:",Vars4]),
+%% debug(on)->writeln([exit,[Function,[Result2]]]).
+interpretstatement1(Functions0,_Functions,Query,Vars,Vars,true,nocut) :-
+ Query=[Function],
+ (debug(on)->writeln([call,[Function]]);true),
+ interpret2(Query,Functions0,Functions0,_Result1),
+ (debug(on)->writeln([exit,[Function]]);true).
+
+
+interpretstatement1(_Functions0,_Functions,_Query,_Vars1,_Vars2,false,nocut) :-
+ writeln([false]).
+
+interpretstatement2(Value,_Vars,Value) :-
+ (number(Value);atom(Value)).
+interpretstatement2(Variable,Vars1,Value) :-
+ getvalue(Variable,Value,Vars1).
+interpretstatement3(A + B,Vars,Value1) :-
+ interpretstatement2(A,Vars,Value2),
+ interpretstatement2(B,Vars,Value3),
+ Value1 = Value2 + Value3.
+interpretstatement3(Value,_Vars,Value) :-
+ (number(Value);atom(Value)).
+interpretstatement3(Variable,Vars,Value) :-
+ getvalue(Variable,Value,Vars).
+getvalue(Variable,Value,Vars) :-
+ ((not(isvar(Variable)),isvalstrorundef(Value),Variable=Value);
+ (isvar(Variable),isvalstrorundef(Value),getvar(Variable,Value,Vars))).
+putvalue(Variable,Value,Vars1,Vars2) :-
+ ((not(isvar(Variable)),isvalstrorundef(Value),Variable=Value);
+ (isvar(Variable),isvalstrorundef(Value),updatevar(Variable,Value,Vars1,Vars2))),!.
+getvar(Variable,Value,Vars) :-
+ member([Variable,Value],Vars),
+ not(Value=empty).
+getvar(undef,undef,_Vars) :-
+ !.
+getvar(Variable,empty,Vars) :- + not(member([Variable,_Value],Vars)); + member([Variable,empty],Vars). +updatevar(undef,_Value,Vars,Vars) :- + !. +updatevar(Variable,Value,Vars1,Vars2) :- + ((((member([Variable,empty],Vars1), + delete(Vars1,[Variable,empty],Vars3), + append(Vars3,[[Variable,Value]],Vars2)); + ((not(member([Variable,Value1],Vars1)), + ((Value1=empty)->true;(Value1=Value)))), + append(Vars1,[[Variable,Value]],Vars2)); + (member([Variable,Value],Vars1),Vars2=Vars1)); + (undef(Variable), + append(Vars1,[[Variable,Value]],Vars2))). +updatevars(_FirstArgs,[],Vars,Vars). +updatevars(FirstArgs,Vars1,Vars2,Vars3) :- + Vars1=[[Variable1,Value]|Vars4], + ((member([Variable2,Variable1],FirstArgs), %% removed brackets around firstargs here and 2 line below + append(Vars2,[[Variable2,Value]],Vars5)); + (member([Variable1,_Variable2],FirstArgs), + append(Vars2,[[Variable1,Value]],Vars5))), + updatevars(FirstArgs,Vars4,Vars5,Vars3), + !. +updatevars(FirstArgs,Vars1,Vars2,Vars3) :- + Vars1=[_Vars4|Vars5], + updatevars(FirstArgs,Vars5,Vars2,Vars3). +updatevars2(_FirstArgs,[],Vars,Vars). +updatevars2(FirstArgs,Vars1,Vars2,Vars3) :- + Vars1=[[Variable,Value]|Vars4], + (member(Variable,FirstArgs), %% removed brackets around firstargs here and 2 line below, ** vars1 into arg in (10), check cond + append(Vars2,[[Variable,Value]],Vars5)), + updatevars2(FirstArgs,Vars4,Vars5,Vars3). +updatevars3(Vars1,[],Vars1). +updatevars3(Vars1,Vars2,Vars4) :- + Vars2=[[Variable,Value]|Vars5], + delete(Vars1,[Variable,empty],Vars6), + append(Vars6,[[Variable,Value]],Vars7), + updatevars3(Vars7,Vars5,Vars4), + !. +updatevars3(Vars1,Vars2,Vars4) :- + Vars2=[[Variable,Value]|Vars5], + append(Vars1,[[Variable,Value]],Vars6), + updatevars3(Vars6,Vars5,Vars4). +reverse([],List,List). +reverse(List1,List2,List3) :- + List1=[Head|Tail], + append([Head],List2,List4), + reverse(Tail,List4,List3). +unique1([],Items,Items). 
+unique1([Item|Items1],Items2,Items3) :- + delete(Items1,Item,Items4), + append(Items2,[Item],Items5), + unique1(Items4,Items5,Items3). + +isvar(Variable) :- + atom(Variable). +isval(Value) :- + number(Value). +isvalstr(N) :- + isval(N);string(N). +/**isvalstrempty(N) :- + isval(N);(string(N);N=empty).**/ +isvalstrempty(N) :- + isval(N),!. +isvalstrempty(N) :- + string(N),!. +isvalstrempty(empty) :- !. +/**isvalstrempty(N) :- + atom(N),fail,!. +**/ +isvalstrorundef(N) :- + var(N);(not(var(N)),(isval(N);expression(N))). +undef(N) :- + var(N). +/** +expression(N) :- + isval(N);(string(N);atom(N)),!. +expression([]). +expression(empty). +expression([N]) :- + expression(N). +expression([N|Ns]):- + expression(N),expression(Ns). +**/ + +expression(empty) :- + !. +expression(N) :- + isval(N),!. +expression(N) :- + string(N),!. +expression(N) :- + atom(N),!. +expression([]) :- + !. +expression(N) :- +%% not(atom(N)), + length(N,L),L>=2, + expression2(N). +expression2([]). +expression2([N|Ns]) :- + expression3(N), + expression2(Ns). +expression3(N) :- + isval(N),!. +expression3(N) :- + string(N),!. +expression3(N) :- + atom(N),!. + +expressionnotatom(N) :- + isvalstrempty(N),!. +expressionnotatom(N) :- + not(atom(N)), + length(N,L),L>=2, + expressionnotatom2(N). +expressionnotatom2([]). +expressionnotatom2([N|Ns]) :- + isvalstrempty(N), + expressionnotatom2(Ns). + +substitutevarsA1(Arguments,Vars1,Vars2,Vars3,FirstArgs1,FirstArgs2) :- + substitutevarsA2(Arguments,Vars1,Vars2,Vars3,FirstArgs1,FirstArgs2). +substitutevarsA2([],_Vars1,Vars2,Vars2,FirstArgs,FirstArgs). 
+substitutevarsA2(Arguments,Vars1,Vars2,Vars3,FirstArgs1,FirstArgs2) :- + Arguments=[Variable|Variables], + ((getvalue(Variable,Value,Vars1), + Value=empty)-> + ((append(Vars2,[Variable],Vars4)), + (isvar(Variable)->append(FirstArgs1,[Variable], + FirstArgs3);FirstArgs3=FirstArgs1)); + (getvalue(Variable,Value,Vars1), + append(Vars2,[Value],Vars4)), + FirstArgs3=FirstArgs1), + substitutevarsA2(Variables,Vars1,Vars4,Vars3,FirstArgs3,FirstArgs2). + +findresult3([],_Result,Result2,Result2). +findresult3(Arguments1,Result1,Result2,Result3) :- + Arguments1=[Value|Arguments2], + expressionnotatom(Value), + append(Result2,[Value],Result4), + findresult3(Arguments2,Result1,Result4,Result3). +findresult3(Arguments1,Result1,Result2,Result3) :- + Arguments1=[Variable|Arguments2], + isvar(Variable), + member([Variable,Value],Result1), + append(Result2,[Value],Result4), + findresult3(Arguments2,Result1,Result4,Result3). + + + diff --git a/Combination-Algorithm-Writer-Stable/listprologinterpreter3preds5copy52.pl b/Combination-Algorithm-Writer-Stable/listprologinterpreter3preds5copy52.pl new file mode 100644 index 0000000000000000000000000000000000000000..a2bfa2de37c2852e31f3ef8b4e910a148a8444b9 --- /dev/null +++ b/Combination-Algorithm-Writer-Stable/listprologinterpreter3preds5copy52.pl @@ -0,0 +1,118 @@ +interpretpart(is,Variable1,Value1,Vars1,Vars2) :- + getvalue(Value1,Value1A,Vars1), + %%isvalstr(Value1), + %%isvalstr(Value1A), + expression(Value1A), + %%val1emptyorvalsequal(Value1A,Value1), + %%isval(Value2), + putvalue(Variable1,Value1A,Vars1,Vars2), + (debug(on)->(writeln([call,[variable,is,Value1A],"Press c."]),(not(get_single_char(97))->true;abort));true), + (debug(on)->(writeln([exit,[Variable1,is,Value1A],"Press c."]),(not(get_single_char(97))->true;abort));true). 
+interpretpart(isplus,Variable1,Variable2,Variable3,Vars1,Vars2) :- + getvalues(Variable1,Variable2,Variable3,Value1,Value2,Value3,Vars1), + Value1A is Value2 + Value3, + val1emptyorvalsequal(Value1,Value1A), + putvalue(Variable1,Value1A,Vars1,Vars2), + (debug(on)->(writeln([call,[variable,is,Value2+Value3],"Press c."]),(not(get_single_char(97))->true;abort));true), + (debug(on)->(writeln([exit,[Value1A,is,Value2+Value3],"Press c."]),(not(get_single_char(97))->true;abort));true). +interpretpart(match,Variable1,Variable2,Variable3,Vars1,Vars2) :- + getvalues(Variable1,Variable2,Variable3,Value1,Value2,Value3,Vars1), + Value1 = [Value2A, Value3A], + val1emptyorvalsequal(Value2,Value2A), + val1emptyorvalsequal(Value3,Value3A), + putvalue(Variable2,Value2A,Vars1,Vars3), + putvalue(Variable3,Value3A,Vars3,Vars2), + (debug(on)->(writeln([call,[[Value2A, Value3A],=,[variable1,variable2]],"Press c."]),(not(get_single_char(97))->true;abort));true), (debug(on)->(writeln([exit,[[Value2A, Value3A],=,[Value2A, Value3A]],"Press c."]),(not(get_single_char(97))->true;abort));true). +interpretpart(match,Variable1,Variable2,Variable3,Vars1,Vars2) :- + getvalues(Variable1,Variable2,Variable3,Value1,Value2,Value3,Vars1), + Value1A = [Value2, Value3], + val1emptyorvalsequal(Value1,Value1A), + putvalue(Variable1,Value1A,Vars1,Vars2), + (debug(on)->(writeln([call,[variable,=,[Value2,Value3]],"Press c."]),(not(get_single_char(97))->true;abort));true), + (debug(on)->(writeln([exit,[[Value2,Value3],=,[Value2,Value3]],"Press c."]),(not(get_single_char(97))->true;abort));true). 
+interpretpart(bracket1,Variable1,Variable2,Vars1,Vars2) :- + getvalues(Variable1,Variable2,Value1,Value2,Vars1), + Value1A = [Value2], + val1emptyorvalsequal(Value1,Value1A), + %%val1emptyorvalsequal(Value1A,Value2), + putvalue(Variable1,Value1A,Vars1,Vars2), + (debug(on)->(writeln([call,[variable,=,[Value2]],"Press c."]),(not(get_single_char(97))->true;abort));true), + (debug(on)->(writeln([exit,[Variable2,=,[Value2]],"Press c."]),(not(get_single_char(97))->true;abort));true). +interpretpart(bracket2,Variable1,Variable2,Vars1,Vars2) :- + getvalues(Variable1,Variable2,Value1,Value2,Vars1), + Value1A = Value2, + val1emptyorvalsequal(Value1,Value1A), + %%val1emptyorvalsequal(Value2A,Value1), + putvalue(Variable1,Value1A,Vars1,Vars2), + (debug(on)->(writeln([call,[[variable],=,[Value2]],"Press c."]),(not(get_single_char(97))->true;abort));true), + (debug(on)->(writeln([exit,[[Value2],=,[Value2]],"Press c."]),(not(get_single_char(97))->true;abort));true). +interpretpart(head,Variable1,Variable2,Vars1,Vars2) :- + getvalues(Variable1,Variable2,Value1,Value2,Vars1), + Value1=[Value1A|_Rest], + val1emptyorvalsequal(Value2,Value1A), + putvalue(Variable2,Value1A,Vars1,Vars2), + (debug(on)->(writeln([call,[head,Value1,variable],"Press c."]),(not(get_single_char(97))->true;abort));true), + (debug(on)->(writeln([exit,[head,Value1,Value1A],"Press c."]),(not(get_single_char(97))->true;abort));true), !. +interpretpart(tail,Variable1,Variable2,Vars1,Vars2) :- + getvalues(Variable1,Variable2,Value1,Value2,Vars1), + Value1=[_Head|Value1A], + %%removebrackets(Value1A,Value1B), + val1emptyorvalsequal(Value2,Value1A), + putvalue(Variable2,Value1A,Vars1,Vars2), + (debug(on)->(writeln([call,[tail,Value1,variable],"Press c."]),(not(get_single_char(97))->true;abort));true), + (debug(on)->(writeln([exit,[tail,Value1,Value1A],"Press c."]),(not(get_single_char(97))->true;abort));true). 
+interpretpart(member,Variable1,Variable2,Vars1,Vars2) :- + getvalues(Variable1,Variable2,Value1,Value2,Vars1), + (not(Value1=empty)->(member(Value1,Value2),Vars2=Vars1, + (debug(on)->(writeln([call,[member,Value1,Value2],"Press c."]),(not(get_single_char(97))->true;abort));true), + (debug(on)->(writeln([exit,[member,Value1,Value2],"Press c."]),(not(get_single_char(97))->true;abort));true)); + (member(Value3,Value2), + putvalue(Variable1,Value3,Vars1,Vars2), + (debug(on)->(writeln([call,[member,variable1,Value2],"Press c."]),(not(get_single_char(97))->true;abort));true), + (debug(on)->(writeln([exit,[member,Value3,Value2],"Press c."]),(not(get_single_char(97))->true;abort));true))). + +interpretpart(delete,Variable1,Variable2,Variable3,Vars1,Vars2) :- + getvalues(Variable1,Variable2,Variable3,Value1,Value2,Value3,Vars1), + delete(Value1,Value2,Value3A), + val1emptyorvalsequal(Value3,Value3A), + putvalue(Variable3,Value3A,Vars1,Vars2), + (debug(on)->(writeln([call,[delete,Value1,Value2,variable3],"Press c."]),(not(get_single_char(97))->true;abort));true), + (debug(on)->(writeln([exit,[delete,Value1,Value2,Value3A],"Press c."]),(not(get_single_char(97))->true;abort));true). + + +interpretpart(append,Variable1,Variable2,Variable3,Vars1,Vars2) :- + getvalues(Variable1,Variable2,Variable3,Value1,Value2,Value3,Vars1), + append1(Value1,Value2,Value3A), + val1emptyorvalsequal(Value3,Value3A), + putvalue(Variable3,Value3A,Vars1,Vars2), + (debug(on)->(writeln([call,[append,Value1,Value2,variable3],"Press c."]),(not(get_single_char(97))->true;abort));true), + (debug(on)->(writeln([exit,[append,Value1,Value2,Value3A],"Press c."]),(not(get_single_char(97))->true;abort));true). +getvalues(Variable1,Variable2,Value1,Value2,Vars) :- + getvalue(Variable1,Value1,Vars), + getvalue(Variable2,Value2,Vars). +getvalues(Variable1,Variable2,Variable3,Value1,Value2,Value3,Vars) :- + getvalue(Variable1,Value1,Vars), + getvalue(Variable2,Value2,Vars), + getvalue(Variable3,Value3,Vars). 
+val1emptyorvalsequal(empty,Value) :- + not(Value=empty). +val1emptyorvalsequal(Value,Value) :- + not(Value=empty). +isop(is). +isop(=). +append1([],Item,Item) :- + !. +append1(Item1,Item2,Item3) :- + ((isvalstr(Item1),Item1A=[Item1]);(not(isvalstr(Item1)),Item1A=Item1)), + ((isvalstr(Item2),Item2A=[Item2]);(not(isvalstr(Item2)),Item2A=Item2)), + %%((isvalstr(Item3),Item3A=[Item3]);(not(isvalstr(Item3)),Item3A=Item3)), + append(Item1A,Item2A,Item3). +/**delete1(Item1,Item2,Item3) :- + ((isvalstr(Item1),Item1A=[Item1]);(not(isvalstr(Item1)),Item1A=Item1)), + ((isvalstr(Item2),Item2A=[Item2]);(not(isvalstr(Item2)),Item2A=Item2)), + %%((isvalstr(Item3),Item3A=[Item3]);(not(isvalstr(Item3)),Item3A=Item3)), + delete(Item1A,Item2A,Item3). +**/ + +removebrackets([[Value]],Value) :-!. +removebrackets(Value,Value). diff --git a/Combination-Algorithm-Writer-Stable/lpi_caw_commands.pl b/Combination-Algorithm-Writer-Stable/lpi_caw_commands.pl new file mode 100644 index 0000000000000000000000000000000000000000..4c2fb19384db004b5134bb9f53d80fb1a49e270f --- /dev/null +++ b/Combination-Algorithm-Writer-Stable/lpi_caw_commands.pl @@ -0,0 +1,39 @@ +interpretstatement1(_F0,_Functions,[[not,Operator],[Variable1,Variable2]],Vars1,Vars2,true,nocut) :- + isop(Operator), + interpretpart(not_is,Variable1,Variable2,Vars1,Vars2),!. + +interpretstatement1(_F0,_Functions,[[not,Operator],[Variable1,Variable2]],Vars1,Vars2,true,nocut) :- + comparisonoperator(Operator), +%%writeln1(4), + interpretpart(not_iscomparison,Operator,Variable1,Variable2,Vars1,Vars2). + +interpretpart(not_is,Variable1,Variable2,Vars1,Vars1) :- + getvalues(Variable1,Variable2,Value1,Value2,Vars1), + not(isempty(Value1)), + not(isempty(Value2)), + %%writeln([call,[[not,=],[Value1,Value2]]]), + ((not(Value1 = Value2) + )-> + true%%writeln([exit,[[not,=],[Value1,Value2]]]) +; fail)%%writeln([fail,[[not,=],[Value1,Value2]]])) +,!. 
+ +interpretpart(not_iscomparison,Operator,Variable1,Variable2,Vars1,Vars1) :- getvalues(Variable1,Variable2,Value1,Value2,Vars1), + %%writeln([call,[[not,=],[Value1,Value2]]]), + ((isval(Value1), + isval(Value2), + Expression=..[Operator,Value1,Value2], + not(Expression))-> + true%%writeln([exit,[[not,=],[Value1,Value2]]]) +; fail)%%writeln([fail,[[not,=],[Value1,Value2]]])) +,!. + +isempty(N) :- + N=empty. + +comparisonoperator(>). +comparisonoperator(>=). +comparisonoperator(<). +comparisonoperator(=<). +%%comparisonoperator(=). +comparisonoperator(=\=). diff --git a/Combination-Algorithm-Writer-with-Predicates/LICENSE b/Combination-Algorithm-Writer-with-Predicates/LICENSE new file mode 100644 index 0000000000000000000000000000000000000000..d925e8c4aa63a274f76a5c22de79d0ada99b0da7 --- /dev/null +++ b/Combination-Algorithm-Writer-with-Predicates/LICENSE @@ -0,0 +1,29 @@ +BSD 3-Clause License + +Copyright (c) 2019, Lucian Green +All rights reserved. + +Redistribution and use in source and binary forms, with or without +modification, are permitted provided that the following conditions are met: + +1. Redistributions of source code must retain the above copyright notice, this + list of conditions and the following disclaimer. + +2. Redistributions in binary form must reproduce the above copyright notice, + this list of conditions and the following disclaimer in the documentation + and/or other materials provided with the distribution. + +3. Neither the name of the copyright holder nor the names of its + contributors may be used to endorse or promote products derived from + this software without specific prior written permission. + +THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" +AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE +IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE +DISCLAIMED. 
IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE +FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL +DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR +SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER +CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, +OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE +OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. diff --git a/Combination-Algorithm-Writer-with-Predicates/LPI CAW commands workaround.txt b/Combination-Algorithm-Writer-with-Predicates/LPI CAW commands workaround.txt new file mode 100644 index 0000000000000000000000000000000000000000..8b77d7f22f48f09e8269b23a6ca1d157067e37ce --- /dev/null +++ b/Combination-Algorithm-Writer-with-Predicates/LPI CAW commands workaround.txt @@ -0,0 +1,9 @@ +LPI CAW commands workaround for single argument commands not working, but no singletons working + +_A - enter value in header (one output, so is it, do outputs one at a time) + +[] and "" - enter [] or "" somewhere in header and use normal "=" + +not - can still use except for [], "", etc., see previous line + +With no singletons working, CAWP is more stable than CAWMP \ No newline at end of file diff --git a/Combination-Algorithm-Writer-with-Predicates/README.md b/Combination-Algorithm-Writer-with-Predicates/README.md new file mode 100644 index 0000000000000000000000000000000000000000..de41b96b950de516b8c9fb452caf29f6747800c1 --- /dev/null +++ b/Combination-Algorithm-Writer-with-Predicates/README.md @@ -0,0 +1,73 @@ +# Combination-Algorithm-Writer-with-Predicates-Stable +Combination Algorithm Writer with Predicates Stable (CAWPS) + +This was copied from CAWPS on 22 3 20, replacing the content of CAWP. + +Please note: This works with the example below, which CAWMP doesn't. 
+ +Combination Algorithm Writer is a SWI-Prolog algorithm that finds combinations of the given commands that satisfy the given inputs and outputs. + +It verifies that the generated algorithms contain no singleton variables. + +# Prerequisites + +* Please download and install SWI-Prolog for your machine at `https://www.swi-prolog.org/build/`. + +# 1. Install manually + +Download this repository. + +# 2. Or Install from List Prolog Package Manager (LPPM) + +* Download the LPPM Repository: + +``` +mkdir GitHub +cd GitHub/ +git clone https://github.com/luciangreen/List-Prolog-Package-Manager.git +cd List-Prolog-Package-Manager +swipl +['lppm']. +lppm_install("luciangreen","Combination-Algorithm-Writer-with-Predicates"). +halt. +``` + +# Running + +* In Shell: +`cd Combination-Algorithm-Writer-with-Predicates` +`swipl` + +* Load all files in the form: +``` +['lpi_caw_commands.pl']. +['listprologinterpreter1listrecursion4copy52']. +['listprologinterpreter3preds5copy52']. +['caw5copy11']. +``` + +* Run: + +``` +caw00(off,f,[[+,2,1]],4,8,[[a,1],[b,1],[c,2],[d,1]],[[e,5]],[[[f1,4,1],[a,b,c,d,e],(:-),[[+,[a,b,f]],[+,[c,f,g]],[+,[d,g,h]],[=,[e,h]]]]],[],P),writeln(P). + +[[f1,[a,b,c,d,e],:-,[[+,[a,b,f]],[+,[c,f,g]],[+,[d,g,h]],[=,[e,h]]]],[f,[a,b,c,d,e],:-,[[+,[a,b,f]],[+,[c,d,g]],[+,[f,g,h]],[=,[e,h]]]]] +. +. +. +``` + +``` +caw00(off,reverse,[[head,1,1],[tail,1,1],[wrap,1,1],[append,2,1],[reverse,2,1]],6,8,[[a,[1,2,3]],[b,[]]],[[c,[3,2,1]]],[[[reverse,2,1],[[],a,a]]],[],P),writeln(P). + +not finished +``` + +Note: Use (:-) instead of :-. + +# CAWPS API + +* To run CAWPS on a Prolog server: +* Move `cawps-api.pl` to the root (`/username/` or `~` on a server) of your machine. +* Re-enter the paths to your Prolog files in it. +* Enter `['cawps-api.pl'].` in SWI-Prolog and then `server(8000).`. +* On a local host, access the algorithm at `http://127.0.0.1:8000`; on a server, replace 127.0.0.1 with your server's address. 
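The second algorithm in the output above can be checked by hand. The following Python sketch is an illustration only (the real interpreter is the List Prolog code in this repository; the dict-based environment and the helper name `run_body` are my own): it traces the synthesized body `[[+,[a,b,f]],[+,[c,d,g]],[+,[f,g,h]],[=,[e,h]]]` against the test values in the `caw00` call.

```python
# Trace the synthesised body [[+,[a,b,f]],[+,[c,d,g]],[+,[f,g,h]],[=,[e,h]]]
# with InputVarList a=1, b=1, c=2, d=1; OutputVarList expects e=5.
def run_body(env, body):
    for op, args in body:
        if op == '+':        # [+,[x,y,z]]: bind z to x+y
            x, y, z = args
            env[z] = env[x] + env[y]
        elif op == '=':      # [=,[x,y]]: bind x to y's value
            x, y = args
            env[x] = env[y]
    return env

env = run_body({'a': 1, 'b': 1, 'c': 2, 'd': 1},
               [('+', ['a', 'b', 'g']) if False else ('+', ['a', 'b', 'f']),
                ('+', ['c', 'd', 'g']),
                ('+', ['f', 'g', 'h']),
                ('=', ['e', 'h'])])
print(env['e'])  # 5, matching the expected output [[e,5]]
```

Each `+` line introduces one fresh intermediate variable (f, g, h), which is consistent with the `caw00` call above allowing 8 total variables and a maximum body length of 4.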
diff --git a/Combination-Algorithm-Writer-with-Predicates/caw5copy11.pl b/Combination-Algorithm-Writer-with-Predicates/caw5copy11.pl new file mode 100644 index 0000000000000000000000000000000000000000..91ed16d36e3c94e694afe9f33e9ab4b7ed6f1d7f --- /dev/null +++ b/Combination-Algorithm-Writer-with-Predicates/caw5copy11.pl @@ -0,0 +1,351 @@ +:- dynamic debug/1. +:- dynamic totalvars/1. +:- dynamic outputvars/1. + +caw00(Debug,PredicateName,Rules1,MaxLength,TotalVars,InputVarList,OutputVarList,Predicates1,Program1,Program2) :- + split3(Predicates1,[],Rules2), + split2(Predicates1,[],Predicates), + %%writeln([Rules2,Predicates]), + append(Rules1,Rules2,Rules3), + + retractall(debug(_)), + assertz(debug(Debug)), + retractall(totalvars(_)), + assertz(totalvars(TotalVars)), + caw0(Predicates,PredicateName,Rules3,MaxLength,InputVarList,OutputVarList,Program1,Program2). + +caw0(Predicates,PredicateName,Rules,MaxLength,InputVarList,OutputVarList,Program1,Program2) :- + varnames(InputVarList,[],InputVars,[],InputValues), + varnames(OutputVarList,[],OutputVars,[],_OutputValues), + retractall(outputvars(_)), + assertz(outputvars(OutputVars)), +append(InputVars,OutputVars,Vars11), +%%Vars11=InputVars, +%%Vars12=InputVars, + append(InputValues,OutputVars,Vars2), + %%append(InputValues,OutputValues,Values), + Query=[PredicateName,Vars2], + caw(Predicates,Query,PredicateName,Rules,MaxLength,Vars11,InputVars,InputVars,_,OutputVarList,OutputVars,Program1,Program2). +caw(_,_,_,_,0,_,_,_,_,_,_,_) :- fail, !. 
+caw(Predicates,Query,PredicateName,_,_,_VarList,InputVars1,InputVars2,_,OutputVarList,OutputVars,Program1,Program2) :- + addrules(InputVars2,OutputVars,OutputVars,[],PenultimateVars,[],Program3), +%%writeln([addrules(InputVars2,OutputVars,OutputVars,[],PenultimateVars,[],Program3)]), + %%optimise(Program1,InputVars1,InputVars3,PenultimateVars,Program4), %% IV2->3 +%%writeln([optimise(Program1,InputVars1,InputVars3,PenultimateVars,Program4)]), + append(Program1,Program3,Program5), + append(InputVars1,OutputVars,Vars2), + Program22=[ + [PredicateName,Vars2,(:-), + Program5 + ] + ], + + append(Predicates,Program22,Program2), + + (debug(on)->Debug=on;Debug=off), +%%writeln([interpret(Debug,Query,Program2,OutputVarList)]), +%%writeln(""), + catch(call_with_time_limit(0.05, + interpret(Debug,Query,Program2,OutputVarList)), + time_limit_exceeded, + fail), + no_singletons(Vars2,Program5),!. +caw(Predicates,Query,PredicateName,Rules,MaxLength,VarList,InputVars1,InputVars2,InputVars3,OutputVarList,OutputVars,Program1,Program4) :- +%%writeln([caw(Query,PredicateName,Rules,MaxLength,VarList,InputVars1,InputVars2,OutputVarList,OutputVars,Program1,Program4)]), + MaxLength2 is MaxLength - 1, +%%writeln(["ml",MaxLength2]), + member([RuleName,NumInputs,NumOutputs],Rules), +%%writeln([member([RuleName,NumInputs,NumOutputs],Rules)]), +%%writeln([rule(RuleName,NumInputs,NumOutputs,VarList,VarList2,Rule)]), + rule(RuleName,NumInputs,NumOutputs,InputVars2,InputVars4,VarList,VarList2,Rule), +%%writeln([rule(RuleName,NumInputs,NumOutputs,InputVars1,InputVars3,VarList,VarList2,Rule)]), + append(Program1,[Rule],Program3), +%%writeln([inputVars3,InputVars3]), +%%InputVars2=InputVars3, +%%writeln([program4,Program4]), + caw(Predicates,Query,PredicateName,Rules,MaxLength2,VarList2,InputVars1,InputVars4,InputVars3,OutputVarList,OutputVars,Program3,Program4). + +varnames([],Vars,Vars,Values,Values) :- !. 
+varnames(VarList,Vars1,Vars2,Values1,Values2) :- + VarList=[Var|Vars3], + Var=[VarName,Value], + append(Vars1,[VarName],Vars4), + append(Values1,[Value],Values3), + varnames(Vars3,Vars4,Vars2,Values3,Values2),!. + +addrules(_,_,[],PV,PV,Program,Program) :- !. +addrules(VarList,OutputVars1,OutputVars2,PenultimateVars1,PenultimateVars2,Program1,Program2) :- + OutputVars2=[OutputVar|OutputVars3], + member(Var,VarList), + member(OutputVar,OutputVars1), + append(Program1,[[=,[OutputVar,Var]]],Program3), + append(PenultimateVars1,[Var],PenultimateVars3), + addrules(VarList,OutputVars1,OutputVars3,PenultimateVars3,PenultimateVars2,Program3,Program2). + +%% optimise([[append,[a,a,d]],[append,[a,a,e]],[append,[a,a,f]],[append,[a,b,g]]],[g],P). + +optimise(Program1,InputVars1,InputVars2,PenultimateVars,Program2) :- + findrulesflowingtopv1(Program1,InputVars1,InputVars2,PenultimateVars,[],Rules,true), + %%findrulesflowingtopv1a(Program1,_Program32,InputVars1,InputVars2,PenultimateVars,[],_Rules1), + intersection(Program1,Rules,Program3), + unique1(Program3,[],Program2). +findrulesflowingtopv1(_,_,_,[],Rules,Rules,false). +findrulesflowingtopv1(Program0,InputVars1,InputVars2,Var,Rules1,Rules2,IV1Flag1) :- + (atom(Var);length(Var,1)), + findrulesflowingtopv20(Program0,Program0,InputVars1,InputVars2,Var,Rules1,Rules2,IV1Flag1). +findrulesflowingtopv1(Program0,InputVars1,InputVars2,Vars1,Rules1,Rules2,IV1Flag1) :- + Vars1=[Var|Vars2], + findrulesflowingtopv20(Program0,Program0,InputVars1,InputVars2,Var,Rules1,Rules3,IV1Flag2), + findrulesflowingtopv1(Program0,InputVars1,InputVars2,Vars2,Rules3,Rules2,IV1Flag3), + iv1flagdisjunction(IV1Flag2,IV1Flag3,IV1Flag1). + +%%findrulesflowingtopv2([],Program,Program,_,_,Rules,Rules). +findrulesflowingtopv20(_,[],_InputVars1,_InputVars2,_Var,Rules,Rules,false). 
+findrulesflowingtopv20(Program0,Rules4,InputVars1,InputVars2,Var,Rules1,Rules2,IV1Flag1) :- + Rules4=[Rule|Rules], + (findrulesflowingtopv2(Program0,Rule,InputVars1,InputVars2,Var,Rules1,Rules3,IV1Flag2)->true;(Rules3=Rules1,IV1Flag2=false)), + %%delete(Program0,Rule,Program1), + findrulesflowingtopv20(Program0,Rules,InputVars1,InputVars2,Var,Rules3,Rules2,IV1Flag3),%%p1->0 + iv1flagdisjunction(IV1Flag2,IV1Flag3,IV1Flag1). +%%findrulesflowingtopv2(_,[],[],_,_,_,Rules,Rules). +findrulesflowingtopv2(Program0,Rule,InputVars1,InputVars2,Var,Rules1,Rules2,IV1Flag1) :- + Rule=[_PredicateName,Vars], + restlast(Vars,[],Rest,Var), + %%delete(Program1,[PredicateName,Vars],Program2), + %%Program2=Program1, + %%(not(intersection(Rulesx,Rules1))-> x + %% append, append, unique1 + %%append(Rules1,[Rule],Rules3);Rules3=Rules1), + + %%member(Var2,Rest), + %%member(Var2,InputVars1), + + length(Rest,Length1), Length1>=1, + subtract(Rest,InputVars1,IV3s), + length(IV3s,Length3), + subtract(Rest,IV3s,IV1s), + length(IV1s,Length2), Length2>=1, + subtract(IV3s,InputVars2,[]), + + IV1Flag2=true, + + %%delete(Program0,Rule,Program1), + + %%(delete(Program0,Rule,Program3), + %%iv3s1(IV3s,Program3,IV3s,[]), + (Length3>=1-> + (findrulesflowingtopv1(Program0,InputVars1,InputVars2,IV3s,[],Rules5,IV1Flag3),not(Rules5=[])); + (Rules5=[],IV1Flag3=false)), + iv1flagdisjunction(IV1Flag2,IV1Flag3,IV1Flag4), + %%->true; Rules5=[],IV1Flag1=IV1Flag4), + + ((findrulesflowingtopv1(Program0,InputVars1,InputVars2,IV1s,[],Rules6,IV1Flag5), %%iv1s->rest, etc + iv1flagdisjunction(IV1Flag4,IV1Flag5,IV1Flag1))->true;(Rules6=[],IV1Flag1=IV1Flag4)), + + append([Rule],Rules1,Rules9), + append(Rules9,Rules5,Rules7), + append(Rules7,Rules6,Rules8), + unique1(Rules8,[],Rules2). 
+ +/** +findrulesflowingtopv2(_Program0,Rule,InputVars1,InputVars2,Var,Rules1,Rules2,IV1Flag1) :- + Rule=[_PredicateName,Vars], + restlast(Vars,[],Rest,Var), + %%delete(Program1,[PredicateName,Vars],Program2), + %%Program2=Program1, + (not(member(Rule,Rules1))-> + append(Rules1,[Rule],Rules2);Rules2=Rules1), + subset(Rest,InputVars2), + intersection(Rest,InputVars1,Intersection), + length(Intersection,0), +%% not((member(Var2,Rest), +%% member(Var2,InputVars1))), + IV1Flag1=false. +**/ +/** +findrulesflowingtopv2(Program0,Rule,InputVars1,InputVars2,Var,Rules1,Rules2,IV1Flag1) :- + Rule=[_PredicateName,Vars], + restlast(Vars,[],Rest,Var), + %%delete(Program1,[PredicateName,Vars],Program3), + %%Program3=Program1, + %%append(Rules1,[Rule],Rules3), + subset(Rest,InputVars2), + + intersection(Rest,InputVars1,Intersection), + length(Intersection,0), +%% not((member(Var2,Rest), +%% member(Var2,InputVars1))), +%% delete(Program0,Rule,Program1), + IV1Flag2=false, + findrulesflowingtopv1(Program0,InputVars1,InputVars2,Rest,[],Rules4,IV1Flag3), + %%not(Rules4=[]), + iv1flagdisjunction(IV1Flag2,IV1Flag3,IV1Flag1), + append(Rules1,[Rule],Rules7), + append(Rules7,Rules4,Rules8), + unique1(Rules8,[],Rules2). +**/ +/** +%%->true;(Program2=Program1,Rules2=Rules1)). +findrulesflowingtopv2(Rule,Program0,Program1,_Program2,InputVars1,InputVars,Var,Rules1,Rules2,IV1Flag1) :- + Rule=[PredicateName,Vars], + restlast(Vars,[],Rest,Var), + %%delete(Program1,[PredicateName,Vars],Program4), + %%Program4=Program1, + append(Rules1,[[PredicateName,Vars]],Rules3), + findrulesflowingtopv1(Program0,Program1,_Program2,InputVars1,InputVars,Rest,Rules3,Rules2,IV1Flag3), + iv1flagdisjunction(IV1Flag2,IV1Flag3,IV1Flag1). + %%findrulesflowingtopv2(Program5,Program2,Rest,Rules3,Rules2). +**/ +iv1flagdisjunction(A,B,true) :- + (A=true); (B=true). +iv1flagdisjunction(_,_,false). +/** +iv3s0([],_,IV3s1,IV3s2). +iv3s0(IV3s,Program0,IV3s1,IV3s2). 
+ IV3s=[IV3|IV3s3], + iv3s1(IV3,Program0,IV3s1,IV3s4), + iv3s0(IV3s3,Program0,IV3s4,IV3s2). +iv3s1(_,[],IV3s,IV3s). +iv3s1(IV3,Program0,IV3s1,IV3s2) :- + Program0=[Rule|Rules], + iv3s2(IV3,Rule,IV3s1,IV3s3), + iv3s1(IV3,Rules,IV3s3,IV3s2). +iv3s2(IV3,Rule,IV3s,IV3s1,IV3s2). + Rule=[_PredicateName,Vars], + restlast(Vars,[],_Rest,IV3), + delete(IV3s1,IV3,IV3s2). +findrulesflowingtopv1a(_,_,_,_,[],Rules,Rules). +findrulesflowingtopv1a(Program1,Program2,InputVars1,InputVars2,Var,Rules1,Rules2) :- + atom(Var), + findrulesflowingtopv2a(Program1,Program2,InputVars1,InputVars2,Var,Rules1,Rules2). +findrulesflowingtopv1a(Program1,Program2,InputVars1,InputVars2,Vars1,Rules1,Rules2) :- + Vars1=[Var|Vars2], + findrulesflowingtopv2(Program1,Program3,InputVars1,InputVars2,Var,Rules1,Rules3), + findrulesflowingtopv1a(Program3,Program2,InputVars1,InputVars2,Vars2,Rules3,Rules2). +%%findrulesflowingtopv2([],Program,Program,_,_,Rules,Rules). +findrulesflowingtopv2a([],[],_,_,_,Rules,Rules). +findrulesflowingtopv2a(Program1,Program2,_InputVars1,InputVars2,Var,Rules1,Rules2) :- + member([PredicateName,Vars],Program1), + restlast(Vars,[],Rest,Var), + ( +%%delete(Program1,[PredicateName,Vars],Program2), +Program2=Program1, + append(Rules1,[[PredicateName,Vars]],Rules2), + subset(Rest,InputVars2)). +findrulesflowingtopv2a(Program1,Program2,InputVars1,InputVars2,Var,Rules1,Rules2) :- + member([PredicateName,Vars],Program1), + restlast(Vars,[],Rest,Var), + ( +%%delete(Program1,[PredicateName,Vars],Program3), +Program3=Program1, + append(Rules1,[[PredicateName,Vars]],Rules3), + subset(Rest,InputVars2)), + findrulesflowingtopv1a(Program3,Program2,InputVars1,InputVars2,Rest,Rules3,Rules2). +%%->true;(Program2=Program1,Rules2=Rules1)). 
+findrulesflowingtopv2a(Program1,Program2,InputVars1,InputVars,Var,Rules1,Rules2) :- + member([PredicateName,Vars],Program1), + restlast(Vars,[],Rest,Var), + %%delete(Program1,[PredicateName,Vars],Program4), + Program4=Program1, + append(Rules1,[[PredicateName,Vars]],Rules3), + findrulesflowingtopv1a(Program4,Program2,InputVars1,InputVars,Rest,Rules3,Rules2). + %%findrulesflowingtopv2(Program5,Program2,Rest,Rules3,Rules2). + **/ + +restlast([],_,_,_) :- fail, !. +restlast([Last],Rest,Rest,Last) :- + atom(Last),!. +restlast(Last,Rest,Rest,Last) :- + length(Last,1),!. +restlast(Vars1,Rest1,Rest2,Last) :- + Vars1=[Var|Vars2], + append(Rest1,[Var],Rest3), + restlast(Vars2,Rest3,Rest2,Last),!. + + + + +rule(RuleName,1,1,InputVars1,InputVars2,VarList,VarList2,Rule) :- + member(Var,InputVars1), + rule2(RuleName,Var,VarList,VarList2,Rule,Var1), + append(InputVars1,[Var1],InputVars2). +rule2(RuleName,Var,VarList,VarList2,Rule,Var1) :- + var(VarList,Var1,VarList2), + Rule=[RuleName,[Var,Var1]],!. + +rule(RuleName,1,2,InputVars1,InputVars2,VarList,VarList2,Rule) :- + member(Var,InputVars1), + rule3(RuleName,Var,VarList,VarList2,Rule,Vars), + append(InputVars1,Vars,InputVars2). +rule3(RuleName,Var,VarList,VarList3,Rule,[Var1,Var2]) :- + var(VarList,Var1,VarList2), + var(VarList2,Var2,VarList3), + Rule=[RuleName,[Var,Var1,Var2]],!. + +rule(RuleName,2,1,InputVars1,InputVars2,VarList,VarList2,Rule) :- + member(Var,InputVars1), + member(Vara,InputVars1), + rule4(RuleName,Var,Vara,VarList,VarList2,Rule,Var1), + append(InputVars1,[Var1],InputVars2). +rule4(RuleName,Var,Vara,VarList,VarList2,Rule,Var1) :- + var(VarList,Var1,VarList2), + Rule=[RuleName,[Var,Vara,Var1]],!. + +rule(RuleName,2,2,InputVars1,InputVars2,VarList,VarList2,Rule) :- + member(Var,InputVars1), + member(Vara,InputVars1), + rule5(RuleName,Var,Vara,VarList,VarList2,Rule,Vars), + append(InputVars1,Vars,InputVars2). 
+rule5(RuleName,Var,Vara,VarList,VarList3,Rule,[Var1,Var2]) :- + var(VarList,Var1,VarList2), + var(VarList2,Var2,VarList3), + Rule=[RuleName,[Var,Vara,Var1,Var2]],!. + +%%var(Item,Var,Vars,Vars) :- +%% member([Item,Var],Vars). +var(Vars1,Var1,Vars2) :- + length(Vars1,Vars1Length1), + Vars1Length2 is Vars1Length1-1, + length(Vars3,Vars1Length2), + append(Vars3,[Var2],Vars1), + char_code(Var2,Var2Code1), + Var2Code2 is Var2Code1 + 1, + var2(Var2Code2,Var1), + append(Vars1,[Var1],Vars2),!. + +var2(Code,Var1) :- + outputvars(OutputVars), + totalvars(TotalVars), + Code2 is 96+TotalVars, + Code =< Code2, %% 122 + char_code(Var1,Code), + not(member(Var1,OutputVars)),!. +var2(Var2Code,Code3) :- + Var2Code2 is Var2Code + 1, + totalvars(TotalVars), + Code2 is 96+TotalVars, + Var2Code2 =< Code2, + var2(Var2Code2,Code3),!. + + +no_singletons(Vars1,Program):- + findall(DA,(member(C,Program),C=[_E,D],member(DA,D)),Vars2), + %%append_list(Vars2,Vars2A), + append(Vars1,Vars2,Vars3), + findall(Count1,(member(Item,Vars3),aggregate_all(count,(member(Item,Vars3)),Count1), + Count1=1),G),G=[]. + + split3([],List,List) :- !. +split3(Predicates1,List1,List2) :- + Predicates1=[Item1|List4], + Item1= [[Name,In,Out]|_Rest], + append(List1,[[Name,In,Out]],List6), + split3(List4,List6,List2),!. + +split2([],List,List) :- !. +split2(Predicates1,List1,List2) :- + Predicates1=[Item1|List4], + Item1= [[Name,_In,_Out]|Rest], + append(List1,[[Name|Rest]],List6), + split2(List4,List6,List2),!. + diff --git a/Combination-Algorithm-Writer-with-Predicates/cawps-api.pl b/Combination-Algorithm-Writer-with-Predicates/cawps-api.pl new file mode 100644 index 0000000000000000000000000000000000000000..1ff20a837dae1de690d58505eb34d29ac619f35c --- /dev/null +++ b/Combination-Algorithm-Writer-with-Predicates/cawps-api.pl @@ -0,0 +1,103 @@ +:- use_module(library(http/thread_httpd)). +:- use_module(library(http/http_dispatch)). +:- use_module(library(http/http_error)). +:- use_module(library(http/html_write)). 
+ +% we need this module from the HTTP client library for http_read_data +:- use_module(library(http/http_client)). +:- http_handler('/', web_form, []). + +:- include('files/Combination-Algorithm-Writer-Stable/listprologinterpreter1listrecursion4copy52.pl'). +:- include('files/Combination-Algorithm-Writer-Stable/listprologinterpreter3preds5copy52.pl'). +:- include('files/Combination-Algorithm-Writer-Stable/caw5copy11.pl'). +:- include('files/Combination-Algorithm-Writer-Stable/lpi_caw_commands.pl'). + +server(Port) :- + http_server(http_dispatch, [port(Port)]). + + /* + browse http://127.0.0.1:8000/ + This demonstrates handling POST requests + */ + + web_form(_Request) :- + reply_html_page(include('Combination Algorithm Writer Stable'), + + [ + form([action='/landing', method='POST'], [ + /** + p([], [ + label([for=debug],'Debug (on/off):'), + input([name=debug, type=textarea]) + ]), + **/ + p([], [ + label([for=predicatename],'Predicate Name (e.g. f):'), + input([name=predicatename, type=textarea]) + ]), + + p([], [ + label([for=rules],'Rules to choose from (e.g. [[+,2,1]]):'), + input([name=rules, type=textarea]) + ]), + + p([], [ + label([for=maxlength],'Max Length of Algorithm Body (e.g. 4):'), + input([name=maxlength, type=textarea]) + ]), + + p([], [ + label([for=totalvars],'Total Vars in Predicate (e.g. 8):'), + input([name=totalvars, type=textarea]) + ]), + p([], [ + label([for=inputvarlist],'InputVarList (e.g. [[a,1],[b,1],[c,2],[d,1]]):'), + input([name=inputvarlist, type=textarea]) + ]), + + p([], [ + label([for=outputvarlist],'OutputVarList (e.g. [[e,5]]):'), + input([name=outputvarlist, type=textarea]) + ]), + + p([], [ + label([for=algorithmlibrary],'Algorithm Library (e.g. [[[f1,4,1],[a,b,c,d,e],(:-),[[+,[a,b,f]],[+,[c,f,g]],[+,[d,g,h]],[=,[e,h]]]]], where 4 is the number of inputs and 1 is the number of outputs):'), + input([name=algorithmlibrary, type=textarea]) + ]), + + p([], input([name=submit, type=submit, value='Submit'], [])) + ])]). 
+ + :- http_handler('/landing', landing_pad, []). + + landing_pad(Request) :- + member(method(post), Request), !, + http_read_data(Request, Data, []), + format('Content-type: text/html~n~n', []), + format('
<html>
', []), + %%portray_clause(Data), + + %%term_to_atom(Term,Data), + + %% Predicatename,Rules,Maxlength,Totalvars,Inputvarlist,Outputvarlist,Algorithmlibrary + %%portray_clause(Data), + + +Data=[%%debug='off',%%Debug1, +predicatename=Predicatename,rules=Rules,maxlength=Maxlength,totalvars=Totalvars,inputvarlist=Inputvarlist,outputvarlist=Outputvarlist,algorithmlibrary=Algorithmlibrary,submit=_], + +term_to_atom(Debug1,'off'), +term_to_atom(Predicatename1,Predicatename), +term_to_atom(Rules1,Rules), +term_to_atom(Maxlength1,Maxlength), +term_to_atom(Totalvars1,Totalvars), +term_to_atom(Inputvarlist1,Inputvarlist), +term_to_atom(Outputvarlist1,Outputvarlist), +term_to_atom(Algorithmlibrary1,Algorithmlibrary), + +caw00(Debug1,Predicatename1,Rules1,Maxlength1,Totalvars1,Inputvarlist1,Outputvarlist1,Algorithmlibrary1,[],Result), + %%format('

========~n', []), + portray_clause(Result), + %%writeln1(Data), + +format('
</html>
'). \ No newline at end of file diff --git a/Combination-Algorithm-Writer-with-Predicates/listprologinterpreter1listrecursion4copy52.pl b/Combination-Algorithm-Writer-with-Predicates/listprologinterpreter1listrecursion4copy52.pl new file mode 100644 index 0000000000000000000000000000000000000000..ccba3784dba8e03d6547ca250955c7df49140ae0 --- /dev/null +++ b/Combination-Algorithm-Writer-with-Predicates/listprologinterpreter1listrecursion4copy52.pl @@ -0,0 +1,576 @@ +/**:- include('listprologinterpreter3preds5.pl'). +:- include('lpiverify4.pl'). +:- include('caw5 copy 11.pl'). +**/ +:- dynamic debug/1. + +/** List Prolog Interpreter **/ + +interpret(Debug,Query,Functions,Result) :- +%%writeln([i1]), + interpret1(Debug,Query,Functions,Functions,Result), + !. +interpret1(Debug,Query,Functions1,Functions2,Result) :- +%%writeln([i11]), + retractall(debug(_)), + assertz(debug(Debug)), + retractall(cut(_)), + assertz(cut(off)), + member1(Query,Functions1,Functions2,Result). +%%member1([_,R],_,[],R). +%%member1(_,_,[],[]). +member1(_,_,[],_) :- fail. 
+member1(Query,Functions,Functions2,Vars8) :- +%%writeln([m1]), + cut(off)->( + (Query=[Function,Arguments1], + (Functions2=[[Function,Arguments2,(:-),Body]|_Functions3]), + length(Arguments1,Length), + length(Arguments2,Length), + checkarguments(Arguments1,Arguments2,[],Vars1,[],FirstArgs),!, %%->ca2 +%%writeln([checkarguments,"Arguments1",Arguments1,"Arguments2",Arguments2,"Vars1",Vars1,"FirstArgs",FirstArgs]), + (debug(on)->(writeln([call,[Function,Arguments1],"Press c."]),(not(get_single_char(97))->true;abort));true), + interpretbody(Functions,Functions2,Vars1,Vars2,Body,true), + updatevars(FirstArgs,Vars2,[],Result), + %%reverse(Result,[],Vars7), + ((not(Result=[])-> + %%Result=[Var71|Vars72], + unique1(Result,[],Vars8), +%%writeln(["FirstArgs",FirstArgs,"Vars",Vars2,"Result",Result,"Vars7",Vars7,"Vars72",Vars72,"Var71",Var71,"Vars8",Vars8]), +%%writeln(["Vars8",Vars8]), + findresult3(Arguments1,Vars8,[],Result2) +%%writeln([findresult3,"Arguments1",Arguments1,"Vars8",Vars8,"Result2",Result2]) + );( +%%writeln(here1), + Vars8=[],Result2=[])), +%%writeln(["Arguments1",Arguments1,"Vars2",Vars2,"Result",Result]), + (debug(on)->(writeln([exit,[Function,Result2],"Press c."]),(not(get_single_char(97))->true;abort));true)) + ->true; + (%%Query=[Function,_Arguments1], + %%Functions2=[[Function,_Arguments2,(:-),_Body]|Functions3], %% make like previous trunk? + member11(Query,Functions,Functions2,Vars8)) + );(turncut(off)%%,Result=[] + ). 
+member11(Query,Functions,Functions2,Result) :- +%%writeln([m11]), +%%writeln(["Query",Query,"Functions",Functions,"Functions2",Functions2,"Result",Result]), + cut(off)->( + (Query=[Function], + (Functions2=[[Function,(:-),Body]|_Functions3]), + (debug(on)->(writeln([call,[Function],"Press c."]),(not(get_single_char(97))->true;abort));true), + Result=[], + interpretbody(Functions,Functions2,[],_Vars2,Body,true),!, + (debug(on)->(writeln([exit,[Function],"Press c."]),(not(get_single_char(97))->true;abort));true) + )->true; + (%%Query=[Function], + %%Functions2=[[Function]|Functions3], + member12(Query,Functions,Functions2,Result)) + );(turncut(off)). +member12(Query,Functions,Functions2,Vars8) :- +%%writeln([m12]), + cut(off)->( + (Query=[Function,Arguments1], + (Functions2=[[Function,Arguments2]|_Functions3]), + length(Arguments1,Length), + length(Arguments2,Length), + checkarguments(Arguments1,Arguments2,[],Vars1,[],FirstArgs),!, %% ->ca2 +%%writeln([checkarguments,"Arguments1",Arguments1,"Arguments2",Arguments2,"Vars1",Vars1,"FirstArgs",FirstArgs]), + updatevars(FirstArgs,Vars1,[],Result), + %%reverse(Result,[],Vars7), + ((not(Result=[])-> + %%Result=[Var71|Vars72], + unique1(Result,[],Vars8), + findresult3(Arguments1,Vars8,[],Result2) + );( +%%writeln(here2), + Vars8=[],Result2=[])), + (debug(on)->(writeln([call,[Function,Arguments1],"Press c."]),(not(get_single_char(97))->true;abort));true), + (debug(on)->(writeln([exit,[Function,Result2],"Press c."]),(not(get_single_char(97))->true;abort));true) + )->true; + (%%Query=[Function,_Arguments1], + %%Functions2=[[Function,_Arguments2]|Functions3], + member13(Query,Functions,Functions2,Vars8)) + );(turncut(off)). 
+member13(Query,Functions,Functions2,Result) :- +%%writeln([m13]), + cut(off)->( + (Query=[Function],!, + (Functions2=[[Function]|_Functions3]), + (debug(on)->(writeln([call,[Function],"Press c."]),(not(get_single_char(97))->true;abort));true), + Result=[], + %%interpretbody(Functions,[],_Vars2,Body,true), + (debug(on)->(writeln([exit,[Function],"Press c."]),(not(get_single_char(97))->true;abort));true) + )->true; + (%%Query=[Function], + Functions2=[_Function|Functions3], + member1(Query,Functions,Functions3,Result)) + );(turncut(off)). +interpret2(Query,Functions1,Functions2,Result) :- +%%writeln(i2), +%%writeln(["%%interpret2 Query",Query,"Functions1",Functions1,"Functions2",Functions2]), + member2(Query,Functions1,Functions2,Result). +%%member2([_,R],_,[],R). +%%member2(_,_,[],[]). +member2(_,_,[],_) :- fail. +member2(Query,Functions,Functions2,Vars8) :- +%%writeln([m2]), + cut(off)->( + (Query=[Function,Arguments1], + (Functions2=[[Function,Arguments2,(:-),Body]|_Functions3]), + length(Arguments1,Length), + length(Arguments2,Length), + checkarguments(Arguments1,Arguments2,[],Vars1,[],FirstArgs),!, +%%writeln([checkarguments,"Arguments1",Arguments1,"Arguments2",Arguments2,"Vars1",Vars1,"FirstArgs",FirstArgs]), + (debug(on)->(writeln([call,[Function,Arguments1],"Press c."]),(not(get_single_char(97))->true;abort));true), + interpretbody(Functions,Functions2,Vars1,Vars2,Body,true), %%**arg2 change +%%writeln(["Functions",Functions,"Functions2",Functions2,"Vars1",Vars1,"Vars2",Vars2,"Body",Body]), + updatevars(FirstArgs,Vars2,[],Result), + %%reverse(Result,[],Vars7), + ((not(Result=[])-> + %%Result=[Var71|Vars72], + unique1(Result,[],Vars8), + findresult3(Arguments1,Vars8,[],Result2) +%%writeln(["Vars2",Vars2,"Result",Result]), + );( + writeln(here3), + Vars8=[],Result2=[])), + (debug(on)->(writeln([exit,[Function,Result2],"Press c."]),(not(get_single_char(97))->true;abort));true) + )->true; + (%%Query=[Function,_Arguments1], + 
%%Functions2=[[Function,_Arguments2,(:-),_Body]|Functions3], + member21(Query,Functions,Functions2,Vars8)) + );(turncut(off)). +member21(Query,Functions,Functions2,Result) :- +%%writeln([m21]), + cut(off)->( + (Query=[Function], + (Functions2=[[Function,(:-),Body]|_Functions3]), + Vars1=[], + (debug(on)->(writeln([call,[Function],"Press c."]),(not(get_single_char(97))->true;abort));true), + interpretbody(Functions,Functions2,Vars1,_Vars2,Body,true),!, %%**arg2 change + (debug(on)->(writeln([exit,[Function],"Press c."]),(not(get_single_char(97))->true;abort));true) + )->true; + (%%Query=[Function], + %%Functions2=[[Function]|Functions3], + member22(Query,Functions,Functions2,Result)) + );(turncut(off)). +member22(Query,Functions,Functions2,Vars8) :- +%%writeln([m22]), + cut(off)->( + (Query=[Function,Arguments1], + (Functions2=[[Function,Arguments2]|_Functions3]), + length(Arguments1,Length), + length(Arguments2,Length), + checkarguments(Arguments1,Arguments2,[],Vars1,[],FirstArgs),!, +%%writeln([checkarguments,"Arguments1",Arguments1,"Arguments2",Arguments2,"Vars1",Vars1,"FirstArgs",FirstArgs]), + updatevars(FirstArgs,Vars1,[],Result), + %%reverse(Result,[],Vars7), + ((not(Result=[])-> + %%Result=[Var71|Vars72], + unique1(Result,[],Vars8), + findresult3(Arguments1,Vars8,[],Result2) + );( +%%writeln(here4), + Vars8=[],Result2=[])), + (debug(on)->(writeln([call,[Function,Arguments1],"Press c."]),(not(get_single_char(97))->true;abort));true), + (debug(on)->(writeln([exit,[Function,Result2],"Press c."]),(not(get_single_char(97))->true;abort));true) + )->true; + (%%Query=[Function,_Arguments1], + %%Functions2=[[Function,_Arguments2]|Functions3], + member23(Query,Functions,Functions2,Vars8)) + );(turncut(off)). 
+member23(Query,Functions,Functions2,Vars8) :- +%%writeln([m23]), + cut(off)->( + (Query=[Function],!, + (Functions2=[[Function]|_Functions3]), + (debug(on)->(writeln([call,[Function],"Press c."]),(not(get_single_char(97))->true;abort));true), + Vars8=[], + (debug(on)->(writeln([exit,[Function],"Press c."]),(not(get_single_char(97))->true;abort));true) + )->true; + (%%Query=[Function], + Functions2=[_Function|Functions3], + member2(Query,Functions,Functions3,Vars8)) + );(turncut(off)). +checkarguments([],[],Vars,Vars,FirstArgs,FirstArgs). +checkarguments(Arguments1,Arguments2,Vars1,Vars2,FirstArgs1,FirstArgs2) :- %% +%%writeln(1), + Arguments1=[Value|Arguments3], %% Value may be a number, string, list or tree + expressionnotatom(Value), + Arguments2=[Variable2|Arguments4], + isvar(Variable2), + putvalue(Variable2,Value,Vars1,Vars3), + checkarguments(Arguments3,Arguments4,Vars3,Vars2,FirstArgs1,FirstArgs2). +checkarguments(Arguments1,Arguments2,Vars1,Vars2,FirstArgs1,FirstArgs2) :- %%A +%%writeln(2), + Arguments1=[Variable|Arguments3], %% Value may be a number, string, list or tree + isvar(Variable), + Arguments2=[Value|Arguments4], + expressionnotatom(Value), + putvalue(Variable,Value,Vars1,Vars3), + append(FirstArgs1,[[Variable,_]],FirstArgs3), + checkarguments(Arguments3,Arguments4,Vars3,Vars2,FirstArgs3,FirstArgs2). +checkarguments(Arguments1,Arguments2,Vars1,Vars2,FirstArgs1,FirstArgs2) :- +%%writeln(3), + Arguments1=[Variable1|Arguments3], + isvar(Variable1), + Arguments2=[Variable2|Arguments4], + isvar(Variable2), + (getvalue(Variable2,Value,Vars1)->true;Value=empty), + putvalue(Variable2,Value,Vars1,Vars3), + append(FirstArgs1,[[Variable1,Variable2]],FirstArgs3), + checkarguments(Arguments3,Arguments4,Vars3,Vars2,FirstArgs3,FirstArgs2). 
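checkarguments/6 above distinguishes four head-argument cases: a concrete value against a head variable, a caller variable against a concrete value, variable against variable, and two identical ground values. A hedged Python sketch of that case analysis (is_var mimics isvar/1, which treats lowercase atoms as variables, and "empty" stands in for an unbound slot):

```python
def is_var(term):
    # In List Prolog, variables are written as plain lowercase atoms.
    return isinstance(term, str) and term.islower()

def check_arguments(call_args, head_args):
    # Mirrors checkarguments/6's four clauses; returns None on a clash.
    bindings, first_args = {}, []
    for a1, a2 in zip(call_args, head_args):
        if not is_var(a1) and is_var(a2):      # value vs. head variable
            bindings[a2] = a1
        elif is_var(a1) and not is_var(a2):    # caller variable vs. value
            bindings[a1] = a2
            first_args.append((a1, None))
        elif is_var(a1) and is_var(a2):        # variable vs. variable
            bindings.setdefault(a2, "empty")
            first_args.append((a1, a2))
        elif a1 == a2:                         # identical ground values
            continue
        else:
            return None                        # mismatch: clause fails
    return bindings, first_args
```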
+checkarguments(Arguments1,Arguments2,Vars1,Vars2,FirstArgs1,FirstArgs2) :- +%%writeln(4), + Arguments1=[Value1|Arguments3], + expressionnotatom(Value1), + Arguments2=[Value1|Arguments4], + expressionnotatom(Value1), + checkarguments(Arguments3,Arguments4,Vars1,Vars2,FirstArgs1,FirstArgs2). + +interpretbody(_Functions1,_Functions2,Vars,Vars,[],true) :- !. + +interpretbody(Functions0,Functions,Vars1,Vars2,Body,Result1) :- + Body=[Statements1|Statements2], + interpretbody(Functions0,Functions,Vars1,Vars3,Statements1,Result2), + interpretbody(Functions0,Functions,Vars3,Vars2,Statements2,Result3), + %%((Result3=cut)->!;true), + logicalconjunction(Result1,Result2,Result3),!. + +interpretbody(Functions0,Functions,Vars1,Vars2,Body,Result1) :- + Body=[[not,[Statements]]|Statements2], + interpretbody(Functions0,Functions,Vars1,Vars3,Statements,Result2), + %%((Result2=cut)->!;true), + interpretbody(Functions0,Functions,Vars3,Vars2,Statements2,Result3), + %%((Result3=cut)->!;true), + logicalnot(Result2,Result4), + (logicalconjunction(Result1,Result4,Result3)->true;(Result1=false)),!. +interpretbody(Functions0,Functions,Vars1,Vars2,Body,Result1) :- + Body=[[Statements1],or,[Statements2]], + (interpretbody(Functions0,Functions,Vars1,Vars2,Statements1,Result1); + %%,((Value1=cut)->!;true)); + interpretbody(Functions0,Functions,Vars1,Vars2,Statements2,Result1)),!. + %%,((Value=cut)->!;true)). + %%(logicaldisjunction(Result1,Value1,Value2)->true;(Result1=false)). 
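The interpretbody clauses above treat a flat statement list as a conjunction, [[not,[Statements]]|Rest] as negation, and [[S1],or,[S2]] as disjunction. A Python sketch of that control structure, with a pluggable statement evaluator standing in for interpretstatement1:

```python
def interpret_body(body, eval_statement):
    # A body is a nested list; a flat list of statements is a conjunction.
    if body == []:
        return True
    if len(body) == 3 and body[1] == "or":
        # [[Statements1], or, [Statements2]]: try each branch in turn.
        return (interpret_body(body[0], eval_statement)
                or interpret_body(body[2], eval_statement))
    if isinstance(body[0], list) and body[0][:1] == ["not"]:
        # [[not, [Statements]] | Rest]: negate the inner result.
        return (not interpret_body(body[0][1], eval_statement)
                and interpret_body(body[1:], eval_statement))
    return (eval_statement(body[0])
            and interpret_body(body[1:], eval_statement))

def demo(statement):
    # Toy evaluator for the sketch: ["t"] succeeds, ["f"] fails.
    return statement[0] == "t"
```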
+ + +interpretbody(Functions0,Functions,Vars1,Vars2,Body,Result1) :- + Body=[Statement|Statements], +%%writeln(["Functions0",Functions0,"Functions",Functions,"Statement",Statement,"Vars1",Vars1,"Vars3",Vars3,"Result2",Result2,"Cut",Cut]), + interpretstatement1(Functions0,Functions,Statement,Vars1,Vars3,Result2,Cut), +%%writeln(["here1"]), + ((not(Cut=cut))->(Functions2=Functions);(turncut(on))), %% cut to interpret1/2 (assertz) +%%writeln(["here3"]), + interpretbody(Functions0,Functions2,Vars3,Vars2,Statements,Result3), + %%((Result3=cut)->!;true), +%%writeln(["here4"]), + logicalconjunction(Result1,Result2,Result3),!. +%%writeln([Result1,Result2,Result3]). +turncut(State1) :- + cut(State2), + retract(cut(State2)), + assertz(cut(State1)). +logicaldisjunction(true,Result2,Result3) :- + true(Result2);true(Result3). +logicalconjunction(true,Result2,Result3) :- + true(Result2),true(Result3). +logicalnot(Result1,Result2) :- + true(Result1),false(Result2). +logicalnot(Result1,Result2) :- + false(Result1),true(Result2). +true(true). +false(false). + +%%interpretstatement1(_F0,[],_,Vars,Vars,true,nocut) :- ! +%%writeln("AND HERE!") +%% . + +interpretstatement1(_F0,_Functions,[cut,[]],Vars,Vars,true,cut) :- !. + +interpretstatement1(_F0,_Functions,[atom,[Variable]],Vars,Vars,true,nocut) :- + getvalue(Variable,Value,Vars), + (debug(on)->(writeln([call,[atom,[Value]],"Press c."]),(not(get_single_char(97))->true;abort));true), + atom(Value), + (debug(on)->(writeln([exit,[atom,[Value]],"Press c."]),(not(get_single_char(97))->true;abort));true). + +interpretstatement1(_F0,_Functions,[string,[Variable]],Vars,Vars,true,nocut) :- + getvalue(Variable,Value,Vars), + (debug(on)->(writeln([call,[string,[Value]],"Press c."]),(not(get_single_char(97))->true;abort));true), + string(Value), + (debug(on)->(writeln([exit,[string,[Value]],"Press c."]),(not(get_single_char(97))->true;abort));true). 
+ +interpretstatement1(_F0,_Functions,[number,[Variable]],Vars,Vars,true,nocut) :- + getvalue(Variable,Value,Vars), + (debug(on)->(writeln([call,[number,[Value]],"Press c."]),(not(get_single_char(97))->true;abort));true), + number(Value), + (debug(on)->(writeln([exit,[number,[Value]],"Press c."]),(not(get_single_char(97))->true;abort));true). + +interpretstatement1(_F0,_Functions,[variable,[Variable]],Vars,Vars,true,nocut) :- + var(Variable), + (debug(on)->(writeln([call,[variable,[Variable]],"Press c."]),(not(get_single_char(97))->true;abort));true), + (debug(on)->(writeln([exit,[variable,[Variable]],"Press c."]),(not(get_single_char(97))->true;abort));true). + +interpretstatement1(_F0,_Functions,[Operator,[Variable1,Variable2]],Vars1,Vars2,true,nocut) :- + isop(Operator), + interpretpart(is,Variable1,Variable2,Vars1,Vars2). + +interpretstatement1(_F0,_Functions,[Operator,[Variable1,Variable2]],Vars1,Vars2,true,nocut) :- +%%writeln(31), + isop(Operator), + interpretpart(is,Variable2,Variable1,Vars1,Vars2). + +interpretstatement1(_F0,_Functions,[+,[Variable2,Variable3,Variable1]],Vars1,Vars2,true,nocut) :- +%%writeln(4), + interpretpart(isplus,Variable1,Variable2,Variable3,Vars1,Vars2). + +%%interpretstatement1(_F0,_Functions,[Variable2+Variable3,is,Variable1],Vars1,Vars2,true,nocut) :- +%%writeln(41), + %%interpretpart(isplus,Variable1,Variable2,Variable3,Vars1,Vars2). + +interpretstatement1(_F0,_Functions,[=,[Variable1,[Variable2,Variable3]]],Vars1,Vars2,true,nocut) :- +%%writeln(5), + interpretpart(match,Variable1,Variable2,Variable3,Vars1,Vars2). + +%%interpretstatement1(_F0,_Functions,[[Variable2,Variable3]=Variable1],Vars1,Vars2,true,nocut) :- +%%writeln(51), +%% interpretpart(match,Variable1,Variable2,Variable3,Vars1,Vars2). + +interpretstatement1(_F0,_Functions,[=,[Variable1,[Variable2]]],Vars1,Vars2,true,nocut) :- +%%writeln(52), + interpretpart(bracket1,Variable1,Variable2,Vars1,Vars2). 
+ +interpretstatement1(_F0,_Functions,[=,[[Variable1],[Variable2]]],Vars1,Vars2,true,nocut) :- +%%writeln(53), + interpretpart(bracket2,Variable1,Variable2,Vars1,Vars2). + +interpretstatement1(_F0,_Functions,[head,[Variable1,Variable2]],Vars1,Vars2,true,nocut) :- +%%writeln(6), + interpretpart(head,Variable1,Variable2,Vars1,Vars2). + +interpretstatement1(_F0,_Functions,[tail,[Variable1,Variable2]],Vars1,Vars2,true,nocut) :- +%%writeln(61), + interpretpart(tail,Variable1,Variable2,Vars1,Vars2). + + +interpretstatement1(_F0,_Functions,[member,[Variable1,Variable2]],Vars1,Vars2,true,nocut) :- +%%writeln(8), + interpretpart(member,Variable1,Variable2,Vars1,Vars2). + +interpretstatement1(_F0,_Functions,[delete,[Variable1,Variable2,Variable3]],Vars1,Vars2,true,nocut) :- +%%writeln(), + interpretpart(delete,Variable1,Variable2,Variable3,Vars1,Vars2). +%%** all in form f,[1,1,etc], including + with 0,1 + +interpretstatement1(_F0,_Functions,[append,[Variable1,Variable2,Variable3]],Vars1,Vars2,true,nocut) :- +%%writeln(9), + interpretpart(append,Variable1,Variable2,Variable3,Vars1,Vars2). + +interpretstatement1(Functions0,_Functions,Query1,Vars1,Vars8,true,nocut) :- +%%writeln("h1/10"), + Query1=[Function,Arguments], +%%writeln(["Arguments",Arguments,"Vars1",Vars1]), + substitutevarsA1(Arguments,Vars1,[],Vars3,[],FirstArgs), %%% var to value, after updatevars: more vars to values, and select argument vars from latest vars +%%writeln([substitutevarsA1,arguments,Arguments,vars1,Vars1,vars3,Vars3,firstargs,FirstArgs]), + Query2=[Function,Vars3], %% Bodyvars2? +%% debug(on)->writeln([call,[Function,[Vars3]]]), +%%writeln(["Query2",Query2,"Functions0",Functions0]), + interpret2(Query2,Functions0,Functions0,Result1), + updatevars2(FirstArgs,Result1,[],Vars5), + updatevars3(Vars1,Vars5,Vars6), + reverse(Vars6,[],Vars7), + ((not(Vars7=[])-> + %%Vars7=[Var71|Vars72], + unique1(Vars7,[],Vars8) +);( +writeln(here1), + Vars8=[])). 
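interpretstatement1 dispatches on the statement name: head, tail, member, delete, append and the arithmetic/equality forms go to interpretpart, and anything else is retried as a user-defined predicate via interpret2. A sketch of the dispatch idea in Python (the table is illustrative and covers only the deterministic built-ins):

```python
# Deterministic built-ins from interpretstatement1, as a dispatch table.
BUILTINS = {
    "head":   lambda xs: xs[0],
    "tail":   lambda xs: xs[1:],
    "append": lambda xs, ys: xs + ys,
    "delete": lambda xs, x: [y for y in xs if y != x],
}

def eval_builtin(statement):
    name, args = statement
    handler = BUILTINS.get(name)
    if handler is None:
        # The interpreter would fall through to a user-defined call here.
        raise LookupError(name + " is not a built-in")
    return handler(*args)
```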
+%%**** reverse and take first instance of each variable.
+	%%findresult3(Arguments,Vars6,[],Result2)
+%%writeln(["FirstArgs",FirstArgs,"Result1",Result1,"Vars5",Vars5,"Vars4",Vars4]),
+%%writeln(["Vars1:",Vars1,"Vars4:",Vars4]),
+%% debug(on)->writeln([exit,[Function,[Result2]]]).
+interpretstatement1(Functions0,_Functions,Query,Vars,Vars,true,nocut) :-
+	Query=[Function],
+	(debug(on)->writeln([call,[Function]]);true),
+	interpret2(Query,Functions0,Functions0,_Result1),
+	(debug(on)->writeln([exit,[Function]]);true).
+
+
+interpretstatement1(_Functions0,_Functions,_Query,_Vars1,_Vars2,false,nocut) :-
+	writeln([false]).
+
+interpretstatement2(Value,_Vars,Value) :-
+	(number(Value);atom(Value)).
+interpretstatement2(Variable,Vars1,Value) :-
+	getvalue(Variable,Value,Vars1).
+interpretstatement3(A + B,Vars,Value1) :-
+	interpretstatement2(A,Vars,Value2),
+	interpretstatement2(B,Vars,Value3),
+	Value1 = Value2 + Value3.
+interpretstatement3(Value,_Vars,Value) :-
+	(number(Value);atom(Value)).
+interpretstatement3(Variable,Vars,Value) :-
+	getvalue(Variable,Value,Vars).
+getvalue(Variable,Value,Vars) :-
+	((not(isvar(Variable)),isvalstrorundef(Value),Variable=Value);
+	(isvar(Variable),isvalstrorundef(Value),getvar(Variable,Value,Vars))).
+putvalue(Variable,Value,Vars1,Vars2) :-
+	((not(isvar(Variable)),isvalstrorundef(Value),Variable=Value);
+	(isvar(Variable),isvalstrorundef(Value),updatevar(Variable,Value,Vars1,Vars2))),!.
+getvar(Variable,Value,Vars) :-
+	member([Variable,Value],Vars),
+	not(Value=empty).
+getvar(undef,undef,_Vars) :-
+	!.
+getvar(Variable,empty,Vars) :-
+	not(member([Variable,_Value],Vars));
+	member([Variable,empty],Vars).
+updatevar(undef,_Value,Vars,Vars) :-
+	!.
+updatevar(Variable,Value,Vars1,Vars2) :- + ((((member([Variable,empty],Vars1), + delete(Vars1,[Variable,empty],Vars3), + append(Vars3,[[Variable,Value]],Vars2)); + ((not(member([Variable,Value1],Vars1)), + ((Value1=empty)->true;(Value1=Value)))), + append(Vars1,[[Variable,Value]],Vars2)); + (member([Variable,Value],Vars1),Vars2=Vars1)); + (undef(Variable), + append(Vars1,[[Variable,Value]],Vars2))). +updatevars(_FirstArgs,[],Vars,Vars). +updatevars(FirstArgs,Vars1,Vars2,Vars3) :- + Vars1=[[Variable1,Value]|Vars4], + ((member([Variable2,Variable1],FirstArgs), %% removed brackets around firstargs here and 2 line below + append(Vars2,[[Variable2,Value]],Vars5)); + (member([Variable1,_Variable2],FirstArgs), + append(Vars2,[[Variable1,Value]],Vars5))), + updatevars(FirstArgs,Vars4,Vars5,Vars3), + !. +updatevars(FirstArgs,Vars1,Vars2,Vars3) :- + Vars1=[_Vars4|Vars5], + updatevars(FirstArgs,Vars5,Vars2,Vars3). +updatevars2(_FirstArgs,[],Vars,Vars). +updatevars2(FirstArgs,Vars1,Vars2,Vars3) :- + Vars1=[[Variable,Value]|Vars4], + (member(Variable,FirstArgs), %% removed brackets around firstargs here and 2 line below, ** vars1 into arg in (10), check cond + append(Vars2,[[Variable,Value]],Vars5)), + updatevars2(FirstArgs,Vars4,Vars5,Vars3). +updatevars3(Vars1,[],Vars1). +updatevars3(Vars1,Vars2,Vars4) :- + Vars2=[[Variable,Value]|Vars5], + delete(Vars1,[Variable,empty],Vars6), + append(Vars6,[[Variable,Value]],Vars7), + updatevars3(Vars7,Vars5,Vars4), + !. +updatevars3(Vars1,Vars2,Vars4) :- + Vars2=[[Variable,Value]|Vars5], + append(Vars1,[[Variable,Value]],Vars6), + updatevars3(Vars6,Vars5,Vars4). +reverse([],List,List). +reverse(List1,List2,List3) :- + List1=[Head|Tail], + append([Head],List2,List4), + reverse(Tail,List4,List3). +unique1([],Items,Items). +unique1([Item|Items1],Items2,Items3) :- + delete(Items1,Item,Items4), + append(Items2,[Item],Items5), + unique1(Items4,Items5,Items3). + +isvar(Variable) :- + atom(Variable). +isval(Value) :- + number(Value). 
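updatevar/4 replaces an empty placeholder binding (or appends a new pair), and unique1/3 keeps only the first occurrence of each binding when results are projected back to the caller. A Python sketch of both, assuming bindings are (name, value) pairs and "empty" marks an unbound slot:

```python
def update_var(bindings, name, value):
    # updatevar/4: replace an 'empty' slot, keep an identical pair,
    # or append a new (name, value) pair.
    out = [(n, v) for n, v in bindings if not (n == name and v == "empty")]
    if (name, value) not in out:
        out.append((name, value))
    return out

def unique1(items):
    # unique1/3: keep only the first occurrence of each item.
    seen, out = set(), []
    for item in items:
        if item not in seen:
            seen.add(item)
            out.append(item)
    return out
```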
+isvalstr(N) :- + isval(N);string(N). +/**isvalstrempty(N) :- + isval(N);(string(N);N=empty).**/ +isvalstrempty(N) :- + isval(N),!. +isvalstrempty(N) :- + string(N),!. +isvalstrempty(empty) :- !. +/**isvalstrempty(N) :- + atom(N),fail,!. +**/ +isvalstrorundef(N) :- + var(N);(not(var(N)),(isval(N);expression(N))). +undef(N) :- + var(N). +/** +expression(N) :- + isval(N);(string(N);atom(N)),!. +expression([]). +expression(empty). +expression([N]) :- + expression(N). +expression([N|Ns]):- + expression(N),expression(Ns). +**/ + +expression(empty) :- + !. +expression(N) :- + isval(N),!. +expression(N) :- + string(N),!. +expression(N) :- + atom(N),!. +expression([]) :- + !. +expression(N) :- +%% not(atom(N)), + length(N,L),L>=2, + expression2(N). +expression2([]). +expression2([N|Ns]) :- + expression3(N), + expression2(Ns). +expression3(N) :- + isval(N),!. +expression3(N) :- + string(N),!. +expression3(N) :- + atom(N),!. + +expressionnotatom(N) :- + isvalstrempty(N),!. +expressionnotatom(N) :- + not(atom(N)), + length(N,L),L>=2, + expressionnotatom2(N). +expressionnotatom2([]). +expressionnotatom2([N|Ns]) :- + isvalstrempty(N), + expressionnotatom2(Ns). + +substitutevarsA1(Arguments,Vars1,Vars2,Vars3,FirstArgs1,FirstArgs2) :- + substitutevarsA2(Arguments,Vars1,Vars2,Vars3,FirstArgs1,FirstArgs2). +substitutevarsA2([],_Vars1,Vars2,Vars2,FirstArgs,FirstArgs). +substitutevarsA2(Arguments,Vars1,Vars2,Vars3,FirstArgs1,FirstArgs2) :- + Arguments=[Variable|Variables], + ((getvalue(Variable,Value,Vars1), + Value=empty)-> + ((append(Vars2,[Variable],Vars4)), + (isvar(Variable)->append(FirstArgs1,[Variable], + FirstArgs3);FirstArgs3=FirstArgs1)); + (getvalue(Variable,Value,Vars1), + append(Vars2,[Value],Vars4)), + FirstArgs3=FirstArgs1), + substitutevarsA2(Variables,Vars1,Vars4,Vars3,FirstArgs3,FirstArgs2). + +findresult3([],_Result,Result2,Result2). 
+findresult3(Arguments1,Result1,Result2,Result3) :- + Arguments1=[Value|Arguments2], + expressionnotatom(Value), + append(Result2,[Value],Result4), + findresult3(Arguments2,Result1,Result4,Result3). +findresult3(Arguments1,Result1,Result2,Result3) :- + Arguments1=[Variable|Arguments2], + isvar(Variable), + member([Variable,Value],Result1), + append(Result2,[Value],Result4), + findresult3(Arguments2,Result1,Result4,Result3). + + + diff --git a/Combination-Algorithm-Writer-with-Predicates/listprologinterpreter3preds5copy52.pl b/Combination-Algorithm-Writer-with-Predicates/listprologinterpreter3preds5copy52.pl new file mode 100644 index 0000000000000000000000000000000000000000..a2bfa2de37c2852e31f3ef8b4e910a148a8444b9 --- /dev/null +++ b/Combination-Algorithm-Writer-with-Predicates/listprologinterpreter3preds5copy52.pl @@ -0,0 +1,118 @@ +interpretpart(is,Variable1,Value1,Vars1,Vars2) :- + getvalue(Value1,Value1A,Vars1), + %%isvalstr(Value1), + %%isvalstr(Value1A), + expression(Value1A), + %%val1emptyorvalsequal(Value1A,Value1), + %%isval(Value2), + putvalue(Variable1,Value1A,Vars1,Vars2), + (debug(on)->(writeln([call,[variable,is,Value1A],"Press c."]),(not(get_single_char(97))->true;abort));true), + (debug(on)->(writeln([exit,[Variable1,is,Value1A],"Press c."]),(not(get_single_char(97))->true;abort));true). +interpretpart(isplus,Variable1,Variable2,Variable3,Vars1,Vars2) :- + getvalues(Variable1,Variable2,Variable3,Value1,Value2,Value3,Vars1), + Value1A is Value2 + Value3, + val1emptyorvalsequal(Value1,Value1A), + putvalue(Variable1,Value1A,Vars1,Vars2), + (debug(on)->(writeln([call,[variable,is,Value2+Value3],"Press c."]),(not(get_single_char(97))->true;abort));true), + (debug(on)->(writeln([exit,[Value1A,is,Value2+Value3],"Press c."]),(not(get_single_char(97))->true;abort));true). 
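interpretpart(isplus,...) above computes Value2+Value3 and then either fills the result slot or, if it is already bound, checks it for equality via val1emptyorvalsequal/2 (defined later in this file). A Python sketch of that bind-or-verify pattern:

```python
def empty_or_equal(existing, computed):
    # val1emptyorvalsequal/2: succeed if the slot is empty or already equal.
    return existing == "empty" or existing == computed

def is_plus(bindings, out, a, b):
    # interpretpart(isplus, Out, A, B): compute A+B from the bindings
    # (numeric literals pass through), then bind or verify the output slot.
    computed = bindings.get(a, a) + bindings.get(b, b)
    if not empty_or_equal(bindings.get(out, "empty"), computed):
        return None                  # corresponds to predicate failure
    new_bindings = dict(bindings)
    new_bindings[out] = computed
    return new_bindings
```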
+interpretpart(match,Variable1,Variable2,Variable3,Vars1,Vars2) :- + getvalues(Variable1,Variable2,Variable3,Value1,Value2,Value3,Vars1), + Value1 = [Value2A, Value3A], + val1emptyorvalsequal(Value2,Value2A), + val1emptyorvalsequal(Value3,Value3A), + putvalue(Variable2,Value2A,Vars1,Vars3), + putvalue(Variable3,Value3A,Vars3,Vars2), + (debug(on)->(writeln([call,[[Value2A, Value3A],=,[variable1,variable2]],"Press c."]),(not(get_single_char(97))->true;abort));true), (debug(on)->(writeln([exit,[[Value2A, Value3A],=,[Value2A, Value3A]],"Press c."]),(not(get_single_char(97))->true;abort));true). +interpretpart(match,Variable1,Variable2,Variable3,Vars1,Vars2) :- + getvalues(Variable1,Variable2,Variable3,Value1,Value2,Value3,Vars1), + Value1A = [Value2, Value3], + val1emptyorvalsequal(Value1,Value1A), + putvalue(Variable1,Value1A,Vars1,Vars2), + (debug(on)->(writeln([call,[variable,=,[Value2,Value3]],"Press c."]),(not(get_single_char(97))->true;abort));true), + (debug(on)->(writeln([exit,[[Value2,Value3],=,[Value2,Value3]],"Press c."]),(not(get_single_char(97))->true;abort));true). +interpretpart(bracket1,Variable1,Variable2,Vars1,Vars2) :- + getvalues(Variable1,Variable2,Value1,Value2,Vars1), + Value1A = [Value2], + val1emptyorvalsequal(Value1,Value1A), + %%val1emptyorvalsequal(Value1A,Value2), + putvalue(Variable1,Value1A,Vars1,Vars2), + (debug(on)->(writeln([call,[variable,=,[Value2]],"Press c."]),(not(get_single_char(97))->true;abort));true), + (debug(on)->(writeln([exit,[Variable2,=,[Value2]],"Press c."]),(not(get_single_char(97))->true;abort));true). 
+interpretpart(bracket2,Variable1,Variable2,Vars1,Vars2) :- + getvalues(Variable1,Variable2,Value1,Value2,Vars1), + Value1A = Value2, + val1emptyorvalsequal(Value1,Value1A), + %%val1emptyorvalsequal(Value2A,Value1), + putvalue(Variable1,Value1A,Vars1,Vars2), + (debug(on)->(writeln([call,[[variable],=,[Value2]],"Press c."]),(not(get_single_char(97))->true;abort));true), + (debug(on)->(writeln([exit,[[Value2],=,[Value2]],"Press c."]),(not(get_single_char(97))->true;abort));true). +interpretpart(head,Variable1,Variable2,Vars1,Vars2) :- + getvalues(Variable1,Variable2,Value1,Value2,Vars1), + Value1=[Value1A|_Rest], + val1emptyorvalsequal(Value2,Value1A), + putvalue(Variable2,Value1A,Vars1,Vars2), + (debug(on)->(writeln([call,[head,Value1,variable],"Press c."]),(not(get_single_char(97))->true;abort));true), + (debug(on)->(writeln([exit,[head,Value1,Value1A],"Press c."]),(not(get_single_char(97))->true;abort));true), !. +interpretpart(tail,Variable1,Variable2,Vars1,Vars2) :- + getvalues(Variable1,Variable2,Value1,Value2,Vars1), + Value1=[_Head|Value1A], + %%removebrackets(Value1A,Value1B), + val1emptyorvalsequal(Value2,Value1A), + putvalue(Variable2,Value1A,Vars1,Vars2), + (debug(on)->(writeln([call,[tail,Value1,variable],"Press c."]),(not(get_single_char(97))->true;abort));true), + (debug(on)->(writeln([exit,[tail,Value1,Value1A],"Press c."]),(not(get_single_char(97))->true;abort));true). 
+interpretpart(member,Variable1,Variable2,Vars1,Vars2) :- + getvalues(Variable1,Variable2,Value1,Value2,Vars1), + (not(Value1=empty)->(member(Value1,Value2),Vars2=Vars1, + (debug(on)->(writeln([call,[member,Value1,Value2],"Press c."]),(not(get_single_char(97))->true;abort));true), + (debug(on)->(writeln([exit,[member,Value1,Value2],"Press c."]),(not(get_single_char(97))->true;abort));true)); + (member(Value3,Value2), + putvalue(Variable1,Value3,Vars1,Vars2), + (debug(on)->(writeln([call,[member,variable1,Value2],"Press c."]),(not(get_single_char(97))->true;abort));true), + (debug(on)->(writeln([exit,[member,Value3,Value2],"Press c."]),(not(get_single_char(97))->true;abort));true))). + +interpretpart(delete,Variable1,Variable2,Variable3,Vars1,Vars2) :- + getvalues(Variable1,Variable2,Variable3,Value1,Value2,Value3,Vars1), + delete(Value1,Value2,Value3A), + val1emptyorvalsequal(Value3,Value3A), + putvalue(Variable3,Value3A,Vars1,Vars2), + (debug(on)->(writeln([call,[delete,Value1,Value2,variable3],"Press c."]),(not(get_single_char(97))->true;abort));true), + (debug(on)->(writeln([exit,[delete,Value1,Value2,Value3A],"Press c."]),(not(get_single_char(97))->true;abort));true). + + +interpretpart(append,Variable1,Variable2,Variable3,Vars1,Vars2) :- + getvalues(Variable1,Variable2,Variable3,Value1,Value2,Value3,Vars1), + append1(Value1,Value2,Value3A), + val1emptyorvalsequal(Value3,Value3A), + putvalue(Variable3,Value3A,Vars1,Vars2), + (debug(on)->(writeln([call,[append,Value1,Value2,variable3],"Press c."]),(not(get_single_char(97))->true;abort));true), + (debug(on)->(writeln([exit,[append,Value1,Value2,Value3A],"Press c."]),(not(get_single_char(97))->true;abort));true). +getvalues(Variable1,Variable2,Value1,Value2,Vars) :- + getvalue(Variable1,Value1,Vars), + getvalue(Variable2,Value2,Vars). +getvalues(Variable1,Variable2,Variable3,Value1,Value2,Value3,Vars) :- + getvalue(Variable1,Value1,Vars), + getvalue(Variable2,Value2,Vars), + getvalue(Variable3,Value3,Vars). 
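interpretpart(member,...) above is moded: with a bound first argument it checks membership and leaves the bindings unchanged; with an empty one it enumerates candidates from the list and binds each in turn. A Python generator sketch of the two modes:

```python
def member_mode(value, xs):
    # Bound mode: succeed (once) if value occurs in xs.
    # Unbound mode ('empty'): enumerate candidate bindings from xs.
    if value != "empty":
        if value in xs:
            yield value
    else:
        yield from xs
```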
+val1emptyorvalsequal(empty,Value) :- + not(Value=empty). +val1emptyorvalsequal(Value,Value) :- + not(Value=empty). +isop(is). +isop(=). +append1([],Item,Item) :- + !. +append1(Item1,Item2,Item3) :- + ((isvalstr(Item1),Item1A=[Item1]);(not(isvalstr(Item1)),Item1A=Item1)), + ((isvalstr(Item2),Item2A=[Item2]);(not(isvalstr(Item2)),Item2A=Item2)), + %%((isvalstr(Item3),Item3A=[Item3]);(not(isvalstr(Item3)),Item3A=Item3)), + append(Item1A,Item2A,Item3). +/**delete1(Item1,Item2,Item3) :- + ((isvalstr(Item1),Item1A=[Item1]);(not(isvalstr(Item1)),Item1A=Item1)), + ((isvalstr(Item2),Item2A=[Item2]);(not(isvalstr(Item2)),Item2A=Item2)), + %%((isvalstr(Item3),Item3A=[Item3]);(not(isvalstr(Item3)),Item3A=Item3)), + delete(Item1A,Item2A,Item3). +**/ + +removebrackets([[Value]],Value) :-!. +removebrackets(Value,Value). diff --git a/Combination-Algorithm-Writer-with-Predicates/lpi_caw_commands.pl b/Combination-Algorithm-Writer-with-Predicates/lpi_caw_commands.pl new file mode 100644 index 0000000000000000000000000000000000000000..4c2fb19384db004b5134bb9f53d80fb1a49e270f --- /dev/null +++ b/Combination-Algorithm-Writer-with-Predicates/lpi_caw_commands.pl @@ -0,0 +1,39 @@ +interpretstatement1(_F0,_Functions,[[not,Operator],[Variable1,Variable2]],Vars1,Vars2,true,nocut) :- + isop(Operator), + interpretpart(not_is,Variable1,Variable2,Vars1,Vars2),!. + +interpretstatement1(_F0,_Functions,[[not,Operator],[Variable1,Variable2]],Vars1,Vars2,true,nocut) :- + comparisonoperator(Operator), +%%writeln1(4), + interpretpart(not_iscomparison,Operator,Variable1,Variable2,Vars1,Vars2). + +interpretpart(not_is,Variable1,Variable2,Vars1,Vars1) :- + getvalues(Variable1,Variable2,Value1,Value2,Vars1), + not(isempty(Value1)), + not(isempty(Value2)), + %%writeln([call,[[not,=],[Value1,Value2]]]), + ((not(Value1 = Value2) + )-> + true%%writeln([exit,[[not,=],[Value1,Value2]]]) +; fail)%%writeln([fail,[[not,=],[Value1,Value2]]])) +,!. 
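Two helpers here are worth restating: append1/3 wraps a bare number or string into a singleton list before appending, and the negated-comparison clauses build the test at run time and succeed exactly when it fails. A Python sketch of both (the Prolog version constructs the goal with =../2; the dict-based operator table below is this sketch's assumption):

```python
import operator

def append1(a, b):
    # append1/3: wrap a bare number or string in a singleton list, then
    # append; append1([],Item,Item) passes Item through unchanged.
    if a == []:
        return b
    wrap = lambda x: x if isinstance(x, list) else [x]
    return wrap(a) + wrap(b)

# Operator table standing in for =../2 goal construction.
COMPARISONS = {">": operator.gt, ">=": operator.ge,
               "<": operator.lt, "=<": operator.le,
               "=\\=": operator.ne}

def not_comparison(op, v1, v2):
    # not_iscomparison: succeed exactly when "v1 op v2" fails.
    return not COMPARISONS[op](v1, v2)
```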
+ +interpretpart(not_iscomparison,Operator,Variable1,Variable2,Vars1,Vars1) :- getvalues(Variable1,Variable2,Value1,Value2,Vars1), + %%writeln([call,[[not,=],[Value1,Value2]]]), + ((isval(Value1), + isval(Value2), + Expression=..[Operator,Value1,Value2], + not(Expression))-> + true%%writeln([exit,[[not,=],[Value1,Value2]]]) +; fail)%%writeln([fail,[[not,=],[Value1,Value2]]])) +,!. + +isempty(N) :- + N=empty. + +comparisonoperator(>). +comparisonoperator(>=). +comparisonoperator(<). +comparisonoperator(=<). +%%comparisonoperator(=). +comparisonoperator(=\=). diff --git a/Combination-Algorithm-Writer/LICENSE b/Combination-Algorithm-Writer/LICENSE new file mode 100644 index 0000000000000000000000000000000000000000..d925e8c4aa63a274f76a5c22de79d0ada99b0da7 --- /dev/null +++ b/Combination-Algorithm-Writer/LICENSE @@ -0,0 +1,29 @@ +BSD 3-Clause License + +Copyright (c) 2019, Lucian Green +All rights reserved. + +Redistribution and use in source and binary forms, with or without +modification, are permitted provided that the following conditions are met: + +1. Redistributions of source code must retain the above copyright notice, this + list of conditions and the following disclaimer. + +2. Redistributions in binary form must reproduce the above copyright notice, + this list of conditions and the following disclaimer in the documentation + and/or other materials provided with the distribution. + +3. Neither the name of the copyright holder nor the names of its + contributors may be used to endorse or promote products derived from + this software without specific prior written permission. + +THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" +AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE +IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE +DISCLAIMED. 
IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE
+FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
+DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
+SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
+CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
+OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
+OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. diff --git a/Combination-Algorithm-Writer/README.md b/Combination-Algorithm-Writer/README.md new file mode 100644 index 0000000000000000000000000000000000000000..69d6afbfb1ad4afb3f73f36cd7e07a00eb9200cc --- /dev/null +++ b/Combination-Algorithm-Writer/README.md @@ -0,0 +1,63 @@ +# Combination-Algorithm-Writer +Combination Algorithm Writer
+
+(Please note: this repository is deprecated. See the updated https://github.com/luciangreen/Combination-Algorithm-Writer-with-Predicates (a different repository) instead.)
+
+Combination Algorithm Writer is a SWI-Prolog program that finds combinations of the given commands that satisfy the given inputs and outputs.
+
+# Prerequisites
+
+* Please download and install SWI-Prolog for your machine from `https://www.swi-prolog.org/build/`.
+
+# 1. Install manually
+
+Download this repository.
+
+# 2. Or Install from List Prolog Package Manager (LPPM)
+
+* Download the LPPM Repository:
+
+```
+mkdir GitHub
+cd GitHub/
+git clone https://github.com/luciangreen/List-Prolog-Package-Manager.git
+cd List-Prolog-Package-Manager
+swipl
+['lppm'].
+lppm_install("luciangreen","Combination-Algorithm-Writer").
+halt.
+```
+
+# Running
+
+* In Shell:
+`cd Combination-Algorithm-Writer`
+`swipl`
+```
+['listprologinterpreter1listrecursion4 copy 52'].
+['listprologinterpreter3preds5 copy 52'].
+['caw5 copy 11'].
+```
+* Then run, for example:
+
+```
+caw00(off,f,[[append,2,1],[delete,2,1],[head,1,1],[tail,1,1],[member,1,1]],50,7,[[a,1]],[[b,1]],[],Program).
+Program = [f,[1,b]],[[f,[a,b],:-,[[=,[b,a]]]]],[[b,1]] +(Press ";" for more results). +``` + +``` +caw00(off,f,[[+,2,1]],4,8,[[a,1],[b,1],[c,2]],[[e,4]],[],P),writeln(P). +[[f,[a,b,c,e],:-,[[+,[c,c,h]],[=,[e,h]]]]] +``` + +``` +caw00(off,f,[[+,2,1]],4,8,[[a,1],[b,1],[c,2],[d,1]],[[e,5]],[],P),writeln(P). +[[f,[a,b,c,d,e],:-,[[+,[a,c,g]],[+,[c,g,h]],[=,[e,h]]]]] +. +. +. +``` + +Note: +Use (:-) instead of :-. diff --git a/Combination-Algorithm-Writer/caw5 copy 11.pl b/Combination-Algorithm-Writer/caw5 copy 11.pl new file mode 100644 index 0000000000000000000000000000000000000000..caa872d1144891f87751626f16b6bb624f939f20 --- /dev/null +++ b/Combination-Algorithm-Writer/caw5 copy 11.pl @@ -0,0 +1,405 @@ +:- dynamic debug/1. + +/** + + +caw00(off,f,[[append,2,1],[delete,2,1],[head,1,1],[tail,1,1],[member,1,1]],50,7,[[a,1]],[[b,1]],[],Program). +Program = [f,[1,b]],[[f,[a,b],:-,[[=,[b,a]]]]],[[b,1]] + +caw00(off,f,[[append,2,1],[delete,2,1],[head,1,1],[tail,1,1],[member,1,1]],50,7,[[a,[1,2]]],[[b,[1,2]]],[],Program). +Program = [f,[[1,2],b]],[[f,[a,b],:-,[[=,[b,a]]]]],[[b,[1,2]]] + +caw00(off,f,[[append,2,1],[delete,2,1],[head,1,1],[tail,1,1],[member,1,1]],50,7,[[a,1],[b,2]],[[c,[1,2]]],[],Program). +Program = interpret(off,[f,[1,2,c]],[[f,[a,b,c],:-,[[append,[a,b,g]],[=,[c,g]]]]],[[c,[1,2]]]) + +caw00(off,f,[[append,2,1],[delete,2,1],[head,1,1],[tail,1,1],[member,1,1]],50,8,[[a,1],[b,2],[c,3]],[[d,[1,2,3]]],[],Program). + +caw00(off,f,[[append,2,1],[delete,2,1],[head,1,1],[tail,1,1],[member,1,1]],50,7,[[a,[4,1,2,3]],[b,1]],[[d,[4,2,3]]],[],Program). + + +[debug] ?- optimise([[append,[a,b,c]],[append,[c,d,e]]],[a,b,d],[a,b,c,d],[e],P). P = [[append, [a, b, c]], [append, [c, d, e]]] ; + +[debug] ?- optimise([[append,[a,b,c]],[append,[a,b,c]]],[a,b],[a,b,c],[c],P). +P = [[append, [a, b, c]]] ; + +[debug] ?- optimise([[append,[a,b,c]],[delete,[a,b,c]]],[a,b],[a,b,c],[c],P). 
+P = [[append, [a, b, c]], [delete, [a, b, c]]] ; + +[debug] ?- optimise([[append,[a,b,c]],[delete,[a,b,x]]],[a,b],[a,b,c,x],[c],P). +P = [[append, [a, b, c]]] ; + +[debug] ?- optimise([[append,[a,b,c]],[delete,[c,b,d]],[member,[c,d]]],[a,b],[a,b,c,d],[d],P). +P = [[append, [a, b, c]], [delete, [c, b, d]], [member, [c, d]]] ; + +[debug] ?- optimise([[append,[a,b,c]],[delete,[c,b,d]],[member,[a,d]]],[a,b],[a,b,c,d],[d],P). +P = [[append, [a, b, c]], [delete, [c, b, d]], [member, [a, d]]] ; + +[debug] ?- optimise([[append,[a,b,c]],[delete,[c,b,d]],[member,[a,x]]],[a,b],[a,b,c,d],[d],P). +P = [[append, [a, b, c]], [delete, [c, b, d]]] ; + +[debug] ?- optimise([[append,[a,b,c]],[delete,[c,b,d]],[member,[c,x]]],[a,b],[a,b,c,d],[d],P). +P = [[append, [a, b, c]], [delete, [c, b, d]]] ; + +Wantedly fail: +optimise([[append,[a,e,c]]],[a],[a,e,c],[c],P). +optimise([[append,[a,b,c]],[append,[c,d,e]]],[a,d],[a,b,c,d],[e],P). +optimise([[append,[a,b,c]],[append,[c,d,e]]],[a,b],[a,b,c,d],[e],P). + +optimise([[delete,[a,e,c]],[append,[a,a,e]]],[a,e],[a,e,c],[c],P). +P = [[delete, [a, e, c]], [append, [a, a, e]]] . + +optimise([[append,[a,a,e]],[delete,[a,e,c]]],[a,e],[a,e,c],[c],P). +P = [[append, [a, a, e]], [delete, [a, e, c]]] . + +optimise([[append,[a,e,c]]],[a,e,c],[a,e,c],[c],P). +P = [[append, [a, e, c]]] . + +findrulesflowingtopv1([[append,[a,e,c]],[member,[a,e]]],[a],[a,e,c],[c],[],R,F). 
+R = [[append, [a, e, c]], [member, [a, e]]] + +check optimise works with member ef v +member ef and member in inputvars2 (cde) v +optimise - aeg (remove e), v +does optimise work with multiple rules with same output v +delete returning progs in optimise v +cut rule - + +aea cant point to itself in optimise - needs iterative deepening +aec where e not in iv1 or another pred, try first or second in prog v +don't pass rule to lower predicates v + +don't choose outputs from non new var v +don't run repeat preds + +make predicate, clause writer +member predicates returning no output + +**/ + +caw00(Debug,PredicateName,Rules,MaxLength,TotalVars,InputVarList,OutputVarList,Program1,Program2) :- + retractall(debug(_)), + assertz(debug(Debug)), + retractall(totalvars(_)), + assertz(totalvars(TotalVars)), + caw0(PredicateName,Rules,MaxLength,InputVarList,OutputVarList,Program1,Program2). + +caw0(PredicateName,Rules,MaxLength,InputVarList,OutputVarList,Program1,Program2) :- + varnames(InputVarList,[],InputVars,[],InputValues), + varnames(OutputVarList,[],OutputVars,[],_OutputValues), + retractall(outputvars(_)), + assertz(outputvars(OutputVars)), +append(InputVars,OutputVars,Vars11), +%%Vars11=InputVars, +%%Vars12=InputVars, + append(InputValues,OutputVars,Vars2), + %%append(InputValues,OutputValues,Values), + Query=[PredicateName,Vars2], + caw(Query,PredicateName,Rules,MaxLength,Vars11,InputVars,InputVars,_,OutputVarList,OutputVars,Program1,Program2). +caw(_,_,_,0,_,_,_,_,_,_,_) :- fail, !. 
+caw(Query,PredicateName,_,_,_VarList,InputVars1,InputVars2,_,OutputVarList,OutputVars,Program1,Program2) :- + addrules(InputVars2,OutputVars,OutputVars,[],PenultimateVars,[],Program3), +%%writeln([addrules(InputVars2,OutputVars,OutputVars,[],PenultimateVars,[],Program3)]), + optimise(Program1,InputVars1,InputVars3,PenultimateVars,Program4), %% IV2->3 +%%writeln([optimise(Program1,InputVars1,InputVars3,PenultimateVars,Program4)]), + append(Program4,Program3,Program5), + append(InputVars1,OutputVars,Vars2), + Program2=[ + [PredicateName,Vars2,(:-), + Program5 + ] + ],(debug(on)->Debug=on;Debug=off), +%%writeln([interpret(Debug,Query,Program2,OutputVarList)]), + interpret(Debug,Query,Program2,OutputVarList). +caw(Query,PredicateName,Rules,MaxLength,VarList,InputVars1,InputVars2,InputVars3,OutputVarList,OutputVars,Program1,Program4) :- +%%writeln([caw(Query,PredicateName,Rules,MaxLength,VarList,InputVars1,InputVars2,OutputVarList,OutputVars,Program1,Program4)]), + MaxLength2 is MaxLength - 1, +%%writeln(["ml",MaxLength2]), + member([RuleName,NumInputs,NumOutputs],Rules), +%%writeln([member([RuleName,NumInputs,NumOutputs],Rules)]), +%%writeln([rule(RuleName,NumInputs,NumOutputs,VarList,VarList2,Rule)]), + rule(RuleName,NumInputs,NumOutputs,InputVars2,InputVars4,VarList,VarList2,Rule), +%%writeln([rule(RuleName,NumInputs,NumOutputs,InputVars1,InputVars3,VarList,VarList2,Rule)]), + append(Program1,[Rule],Program3), +%%writeln([inputVars3,InputVars3]), +%%InputVars2=InputVars3, +%%writeln([program4,Program4]), + caw(Query,PredicateName,Rules,MaxLength2,VarList2,InputVars1,InputVars4,InputVars3,OutputVarList,OutputVars,Program3,Program4). + +varnames([],Vars,Vars,Values,Values) :- !. +varnames(VarList,Vars1,Vars2,Values1,Values2) :- + VarList=[Var|Vars3], + Var=[VarName,Value], + append(Vars1,[VarName],Vars4), + append(Values1,[Value],Values3), + varnames(Vars3,Vars4,Vars2,Values3,Values2),!. + +addrules(_,_,[],PV,PV,Program,Program) :- !. 
+addrules(VarList,OutputVars1,OutputVars2,PenultimateVars1,PenultimateVars2,Program1,Program2) :- + OutputVars2=[OutputVar|OutputVars3], + member(Var,VarList), + member(OutputVar,OutputVars1), + append(Program1,[[=,[OutputVar,Var]]],Program3), + append(PenultimateVars1,[Var],PenultimateVars3), + addrules(VarList,OutputVars1,OutputVars3,PenultimateVars3,PenultimateVars2,Program3,Program2). + +%% optimise([[append,[a,a,d]],[append,[a,a,e]],[append,[a,a,f]],[append,[a,b,g]]],[g],P). + +optimise(Program1,InputVars1,InputVars2,PenultimateVars,Program2) :- + findrulesflowingtopv1(Program1,InputVars1,InputVars2,PenultimateVars,[],Rules,true), + %%findrulesflowingtopv1a(Program1,_Program32,InputVars1,InputVars2,PenultimateVars,[],_Rules1), + intersection(Program1,Rules,Program3), + unique1(Program3,[],Program2). +findrulesflowingtopv1(_,_,_,[],Rules,Rules,false). +findrulesflowingtopv1(Program0,InputVars1,InputVars2,Var,Rules1,Rules2,IV1Flag1) :- + (atom(Var);length(Var,1)), + findrulesflowingtopv20(Program0,Program0,InputVars1,InputVars2,Var,Rules1,Rules2,IV1Flag1). +findrulesflowingtopv1(Program0,InputVars1,InputVars2,Vars1,Rules1,Rules2,IV1Flag1) :- + Vars1=[Var|Vars2], + findrulesflowingtopv20(Program0,Program0,InputVars1,InputVars2,Var,Rules1,Rules3,IV1Flag2), + findrulesflowingtopv1(Program0,InputVars1,InputVars2,Vars2,Rules3,Rules2,IV1Flag3), + iv1flagdisjunction(IV1Flag2,IV1Flag3,IV1Flag1). + +%%findrulesflowingtopv2([],Program,Program,_,_,Rules,Rules). +findrulesflowingtopv20(_,[],_InputVars1,_InputVars2,_Var,Rules,Rules,false). +findrulesflowingtopv20(Program0,Rules4,InputVars1,InputVars2,Var,Rules1,Rules2,IV1Flag1) :- + Rules4=[Rule|Rules], + (findrulesflowingtopv2(Program0,Rule,InputVars1,InputVars2,Var,Rules1,Rules3,IV1Flag2)->true;(Rules3=Rules1,IV1Flag2=false)), + %%delete(Program0,Rule,Program1), + findrulesflowingtopv20(Program0,Rules,InputVars1,InputVars2,Var,Rules3,Rules2,IV1Flag3),%%p1->0 + iv1flagdisjunction(IV1Flag2,IV1Flag3,IV1Flag1). 
+%%findrulesflowingtopv2(_,[],[],_,_,_,Rules,Rules). +findrulesflowingtopv2(Program0,Rule,InputVars1,InputVars2,Var,Rules1,Rules2,IV1Flag1) :- + Rule=[_PredicateName,Vars], + restlast(Vars,[],Rest,Var), + %%delete(Program1,[PredicateName,Vars],Program2), + %%Program2=Program1, + %%(not(intersection(Rulesx,Rules1))-> x + %% append, append, unique1 + %%append(Rules1,[Rule],Rules3);Rules3=Rules1), + + %%member(Var2,Rest), + %%member(Var2,InputVars1), + + length(Rest,Length1), Length1>=1, + subtract(Rest,InputVars1,IV3s), + length(IV3s,Length3), + subtract(Rest,IV3s,IV1s), + length(IV1s,Length2), Length2>=1, + subtract(IV3s,InputVars2,[]), + + IV1Flag2=true, + + %%delete(Program0,Rule,Program1), + + %%(delete(Program0,Rule,Program3), + %%iv3s1(IV3s,Program3,IV3s,[]), + (Length3>=1-> + (findrulesflowingtopv1(Program0,InputVars1,InputVars2,IV3s,[],Rules5,IV1Flag3),not(Rules5=[])); + (Rules5=[],IV1Flag3=false)), + iv1flagdisjunction(IV1Flag2,IV1Flag3,IV1Flag4), + %%->true; Rules5=[],IV1Flag1=IV1Flag4), + + ((findrulesflowingtopv1(Program0,InputVars1,InputVars2,IV1s,[],Rules6,IV1Flag5), %%iv1s->rest, etc + iv1flagdisjunction(IV1Flag4,IV1Flag5,IV1Flag1))->true;(Rules6=[],IV1Flag1=IV1Flag4)), + + append([Rule],Rules1,Rules9), + append(Rules9,Rules5,Rules7), + append(Rules7,Rules6,Rules8), + unique1(Rules8,[],Rules2). + +/** +findrulesflowingtopv2(_Program0,Rule,InputVars1,InputVars2,Var,Rules1,Rules2,IV1Flag1) :- + Rule=[_PredicateName,Vars], + restlast(Vars,[],Rest,Var), + %%delete(Program1,[PredicateName,Vars],Program2), + %%Program2=Program1, + (not(member(Rule,Rules1))-> + append(Rules1,[Rule],Rules2);Rules2=Rules1), + subset(Rest,InputVars2), + + intersection(Rest,InputVars1,Intersection), + length(Intersection,0), + +%% not((member(Var2,Rest), +%% member(Var2,InputVars1))), + + IV1Flag1=false. 
+**/ +/** +findrulesflowingtopv2(Program0,Rule,InputVars1,InputVars2,Var,Rules1,Rules2,IV1Flag1) :- + Rule=[_PredicateName,Vars], + restlast(Vars,[],Rest,Var), + %%delete(Program1,[PredicateName,Vars],Program3), + %%Program3=Program1, + %%append(Rules1,[Rule],Rules3), + subset(Rest,InputVars2), + + intersection(Rest,InputVars1,Intersection), + length(Intersection,0), + +%% not((member(Var2,Rest), +%% member(Var2,InputVars1))), + +%% delete(Program0,Rule,Program1), + + IV1Flag2=false, + findrulesflowingtopv1(Program0,InputVars1,InputVars2,Rest,[],Rules4,IV1Flag3), + %%not(Rules4=[]), + iv1flagdisjunction(IV1Flag2,IV1Flag3,IV1Flag1), + + append(Rules1,[Rule],Rules7), + append(Rules7,Rules4,Rules8), + unique1(Rules8,[],Rules2). +**/ +/** +%%->true;(Program2=Program1,Rules2=Rules1)). +findrulesflowingtopv2(Rule,Program0,Program1,_Program2,InputVars1,InputVars,Var,Rules1,Rules2,IV1Flag1) :- + Rule=[PredicateName,Vars], + restlast(Vars,[],Rest,Var), + %%delete(Program1,[PredicateName,Vars],Program4), + %%Program4=Program1, + append(Rules1,[[PredicateName,Vars]],Rules3), + findrulesflowingtopv1(Program0,Program1,_Program2,InputVars1,InputVars,Rest,Rules3,Rules2,IV1Flag3), + iv1flagdisjunction(IV1Flag2,IV1Flag3,IV1Flag1). + + %%findrulesflowingtopv2(Program5,Program2,Rest,Rules3,Rules2). + +**/ +iv1flagdisjunction(A,B,true) :- + (A=true); (B=true). +iv1flagdisjunction(_,_,false). +/** +iv3s0([],_,IV3s1,IV3s2). +iv3s0(IV3s,Program0,IV3s1,IV3s2). + IV3s=[IV3|IV3s3], + iv3s1(IV3,Program0,IV3s1,IV3s4), + iv3s0(IV3s3,Program0,IV3s4,IV3s2). +iv3s1(_,[],IV3s,IV3s). +iv3s1(IV3,Program0,IV3s1,IV3s2) :- + Program0=[Rule|Rules], + iv3s2(IV3,Rule,IV3s1,IV3s3), + iv3s1(IV3,Rules,IV3s3,IV3s2). +iv3s2(IV3,Rule,IV3s,IV3s1,IV3s2). + Rule=[_PredicateName,Vars], + restlast(Vars,[],_Rest,IV3), + delete(IV3s1,IV3,IV3s2). + + +findrulesflowingtopv1a(_,_,_,_,[],Rules,Rules). 
+findrulesflowingtopv1a(Program1,Program2,InputVars1,InputVars2,Var,Rules1,Rules2) :- + atom(Var), + findrulesflowingtopv2a(Program1,Program2,InputVars1,InputVars2,Var,Rules1,Rules2). +findrulesflowingtopv1a(Program1,Program2,InputVars1,InputVars2,Vars1,Rules1,Rules2) :- + Vars1=[Var|Vars2], + findrulesflowingtopv2(Program1,Program3,InputVars1,InputVars2,Var,Rules1,Rules3), + findrulesflowingtopv1a(Program3,Program2,InputVars1,InputVars2,Vars2,Rules3,Rules2). +%%findrulesflowingtopv2([],Program,Program,_,_,Rules,Rules). +findrulesflowingtopv2a([],[],_,_,_,Rules,Rules). +findrulesflowingtopv2a(Program1,Program2,_InputVars1,InputVars2,Var,Rules1,Rules2) :- + member([PredicateName,Vars],Program1), + restlast(Vars,[],Rest,Var), + ( +%%delete(Program1,[PredicateName,Vars],Program2), +Program2=Program1, + append(Rules1,[[PredicateName,Vars]],Rules2), + subset(Rest,InputVars2)). +findrulesflowingtopv2a(Program1,Program2,InputVars1,InputVars2,Var,Rules1,Rules2) :- + member([PredicateName,Vars],Program1), + restlast(Vars,[],Rest,Var), + ( +%%delete(Program1,[PredicateName,Vars],Program3), +Program3=Program1, + append(Rules1,[[PredicateName,Vars]],Rules3), + subset(Rest,InputVars2)), + findrulesflowingtopv1a(Program3,Program2,InputVars1,InputVars2,Rest,Rules3,Rules2). + +%%->true;(Program2=Program1,Rules2=Rules1)). +findrulesflowingtopv2a(Program1,Program2,InputVars1,InputVars,Var,Rules1,Rules2) :- + member([PredicateName,Vars],Program1), + restlast(Vars,[],Rest,Var), + %%delete(Program1,[PredicateName,Vars],Program4), + Program4=Program1, + append(Rules1,[[PredicateName,Vars]],Rules3), + findrulesflowingtopv1a(Program4,Program2,InputVars1,InputVars,Rest,Rules3,Rules2). + %%findrulesflowingtopv2(Program5,Program2,Rest,Rules3,Rules2). + **/ + +restlast([],_,_,_) :- fail, !. +restlast([Last],Rest,Rest,Last) :- + atom(Last),!. +restlast(Last,Rest,Rest,Last) :- + length(Last,1),!. 
+restlast(Vars1,Rest1,Rest2,Last) :-
+ Vars1=[Var|Vars2],
+ append(Rest1,[Var],Rest3),
+ restlast(Vars2,Rest3,Rest2,Last),!.
+
+
+
+
+rule(RuleName,1,1,InputVars1,InputVars2,VarList,VarList2,Rule) :-
+ member(Var,InputVars1),
+ rule2(RuleName,Var,VarList,VarList2,Rule,Var1),
+ append(InputVars1,[Var1],InputVars2).
+rule2(RuleName,Var,VarList,VarList2,Rule,Var1) :-
+ var(VarList,Var1,VarList2),
+ Rule=[RuleName,[Var,Var1]],!.
+
+rule(RuleName,1,2,InputVars1,InputVars2,VarList,VarList2,Rule) :-
+ member(Var,InputVars1),
+ rule3(RuleName,Var,VarList,VarList2,Rule,Vars),
+ append(InputVars1,Vars,InputVars2).
+rule3(RuleName,Var,VarList,VarList3,Rule,[Var1,Var2]) :-
+ var(VarList,Var1,VarList2),
+ var(VarList2,Var2,VarList3),
+ Rule=[RuleName,[Var,Var1,Var2]],!.
+
+rule(RuleName,2,1,InputVars1,InputVars2,VarList,VarList2,Rule) :-
+ member(Var,InputVars1),
+ member(Vara,InputVars1),
+ rule4(RuleName,Var,Vara,VarList,VarList2,Rule,Var1),
+ append(InputVars1,[Var1],InputVars2).
+rule4(RuleName,Var,Vara,VarList,VarList2,Rule,Var1) :-
+ var(VarList,Var1,VarList2),
+ Rule=[RuleName,[Var,Vara,Var1]],!.
+
+rule(RuleName,2,2,InputVars1,InputVars2,VarList,VarList2,Rule) :-
+ member(Var,InputVars1),
+ member(Vara,InputVars1),
+ rule5(RuleName,Var,Vara,VarList,VarList2,Rule,Vars),
+ append(InputVars1,Vars,InputVars2).
+rule5(RuleName,Var,Vara,VarList,VarList3,Rule,[Var1,Var2]) :-
+ var(VarList,Var1,VarList2),
+ var(VarList2,Var2,VarList3),
+ Rule=[RuleName,[Var,Vara,Var1,Var2]],!.
+
+%%var(Item,Var,Vars,Vars) :-
+%% member([Item,Var],Vars).
+var(Vars1,Var1,Vars2) :-
+ length(Vars1,Vars1Length1),
+ Vars1Length2 is Vars1Length1-1,
+ length(Vars3,Vars1Length2),
+ append(Vars3,[Var2],Vars1),
+ char_code(Var2,Var2Code1),
+ Var2Code2 is Var2Code1 + 1,
+ var2(Var2Code2,Var1),
+ append(Vars1,[Var1],Vars2),!.
+
+var2(Code,Var1) :-
+ outputvars(OutputVars),
+ totalvars(TotalVars),
+ Code2 is 96+TotalVars,
+ Code =< Code2, %% 122
+ char_code(Var1,Code),
+ not(member(Var1,OutputVars)),!.
+var2(Var2Code,Code3) :- + Var2Code2 is Var2Code + 1, + totalvars(TotalVars), + Code2 is 96+TotalVars, + Var2Code2 =< Code2, + var2(Var2Code2,Code3),!. + + + + + diff --git a/Combination-Algorithm-Writer/listprologinterpreter1listrecursion4 copy 52.pl b/Combination-Algorithm-Writer/listprologinterpreter1listrecursion4 copy 52.pl new file mode 100644 index 0000000000000000000000000000000000000000..ccba3784dba8e03d6547ca250955c7df49140ae0 --- /dev/null +++ b/Combination-Algorithm-Writer/listprologinterpreter1listrecursion4 copy 52.pl @@ -0,0 +1,576 @@ +/**:- include('listprologinterpreter3preds5.pl'). +:- include('lpiverify4.pl'). +:- include('caw5 copy 11.pl'). +**/ +:- dynamic debug/1. + +/** List Prolog Interpreter **/ + +interpret(Debug,Query,Functions,Result) :- +%%writeln([i1]), + interpret1(Debug,Query,Functions,Functions,Result), + !. +interpret1(Debug,Query,Functions1,Functions2,Result) :- +%%writeln([i11]), + retractall(debug(_)), + assertz(debug(Debug)), + retractall(cut(_)), + assertz(cut(off)), + member1(Query,Functions1,Functions2,Result). +%%member1([_,R],_,[],R). +%%member1(_,_,[],[]). +member1(_,_,[],_) :- fail. 
+member1(Query,Functions,Functions2,Vars8) :- +%%writeln([m1]), + cut(off)->( + (Query=[Function,Arguments1], + (Functions2=[[Function,Arguments2,(:-),Body]|_Functions3]), + length(Arguments1,Length), + length(Arguments2,Length), + checkarguments(Arguments1,Arguments2,[],Vars1,[],FirstArgs),!, %%->ca2 +%%writeln([checkarguments,"Arguments1",Arguments1,"Arguments2",Arguments2,"Vars1",Vars1,"FirstArgs",FirstArgs]), + (debug(on)->(writeln([call,[Function,Arguments1],"Press c."]),(not(get_single_char(97))->true;abort));true), + interpretbody(Functions,Functions2,Vars1,Vars2,Body,true), + updatevars(FirstArgs,Vars2,[],Result), + %%reverse(Result,[],Vars7), + ((not(Result=[])-> + %%Result=[Var71|Vars72], + unique1(Result,[],Vars8), +%%writeln(["FirstArgs",FirstArgs,"Vars",Vars2,"Result",Result,"Vars7",Vars7,"Vars72",Vars72,"Var71",Var71,"Vars8",Vars8]), +%%writeln(["Vars8",Vars8]), + findresult3(Arguments1,Vars8,[],Result2) +%%writeln([findresult3,"Arguments1",Arguments1,"Vars8",Vars8,"Result2",Result2]) + );( +%%writeln(here1), + Vars8=[],Result2=[])), +%%writeln(["Arguments1",Arguments1,"Vars2",Vars2,"Result",Result]), + (debug(on)->(writeln([exit,[Function,Result2],"Press c."]),(not(get_single_char(97))->true;abort));true)) + ->true; + (%%Query=[Function,_Arguments1], + %%Functions2=[[Function,_Arguments2,(:-),_Body]|Functions3], %% make like previous trunk? + member11(Query,Functions,Functions2,Vars8)) + );(turncut(off)%%,Result=[] + ). 
+member11(Query,Functions,Functions2,Result) :- +%%writeln([m11]), +%%writeln(["Query",Query,"Functions",Functions,"Functions2",Functions2,"Result",Result]), + cut(off)->( + (Query=[Function], + (Functions2=[[Function,(:-),Body]|_Functions3]), + (debug(on)->(writeln([call,[Function],"Press c."]),(not(get_single_char(97))->true;abort));true), + Result=[], + interpretbody(Functions,Functions2,[],_Vars2,Body,true),!, + (debug(on)->(writeln([exit,[Function],"Press c."]),(not(get_single_char(97))->true;abort));true) + )->true; + (%%Query=[Function], + %%Functions2=[[Function]|Functions3], + member12(Query,Functions,Functions2,Result)) + );(turncut(off)). +member12(Query,Functions,Functions2,Vars8) :- +%%writeln([m12]), + cut(off)->( + (Query=[Function,Arguments1], + (Functions2=[[Function,Arguments2]|_Functions3]), + length(Arguments1,Length), + length(Arguments2,Length), + checkarguments(Arguments1,Arguments2,[],Vars1,[],FirstArgs),!, %% ->ca2 +%%writeln([checkarguments,"Arguments1",Arguments1,"Arguments2",Arguments2,"Vars1",Vars1,"FirstArgs",FirstArgs]), + updatevars(FirstArgs,Vars1,[],Result), + %%reverse(Result,[],Vars7), + ((not(Result=[])-> + %%Result=[Var71|Vars72], + unique1(Result,[],Vars8), + findresult3(Arguments1,Vars8,[],Result2) + );( +%%writeln(here2), + Vars8=[],Result2=[])), + (debug(on)->(writeln([call,[Function,Arguments1],"Press c."]),(not(get_single_char(97))->true;abort));true), + (debug(on)->(writeln([exit,[Function,Result2],"Press c."]),(not(get_single_char(97))->true;abort));true) + )->true; + (%%Query=[Function,_Arguments1], + %%Functions2=[[Function,_Arguments2]|Functions3], + member13(Query,Functions,Functions2,Vars8)) + );(turncut(off)). 
+member13(Query,Functions,Functions2,Result) :- +%%writeln([m13]), + cut(off)->( + (Query=[Function],!, + (Functions2=[[Function]|_Functions3]), + (debug(on)->(writeln([call,[Function],"Press c."]),(not(get_single_char(97))->true;abort));true), + Result=[], + %%interpretbody(Functions,[],_Vars2,Body,true), + (debug(on)->(writeln([exit,[Function],"Press c."]),(not(get_single_char(97))->true;abort));true) + )->true; + (%%Query=[Function], + Functions2=[_Function|Functions3], + member1(Query,Functions,Functions3,Result)) + );(turncut(off)). +interpret2(Query,Functions1,Functions2,Result) :- +%%writeln(i2), +%%writeln(["%%interpret2 Query",Query,"Functions1",Functions1,"Functions2",Functions2]), + member2(Query,Functions1,Functions2,Result). +%%member2([_,R],_,[],R). +%%member2(_,_,[],[]). +member2(_,_,[],_) :- fail. +member2(Query,Functions,Functions2,Vars8) :- +%%writeln([m2]), + cut(off)->( + (Query=[Function,Arguments1], + (Functions2=[[Function,Arguments2,(:-),Body]|_Functions3]), + length(Arguments1,Length), + length(Arguments2,Length), + checkarguments(Arguments1,Arguments2,[],Vars1,[],FirstArgs),!, +%%writeln([checkarguments,"Arguments1",Arguments1,"Arguments2",Arguments2,"Vars1",Vars1,"FirstArgs",FirstArgs]), + (debug(on)->(writeln([call,[Function,Arguments1],"Press c."]),(not(get_single_char(97))->true;abort));true), + interpretbody(Functions,Functions2,Vars1,Vars2,Body,true), %%**arg2 change +%%writeln(["Functions",Functions,"Functions2",Functions2,"Vars1",Vars1,"Vars2",Vars2,"Body",Body]), + updatevars(FirstArgs,Vars2,[],Result), + %%reverse(Result,[],Vars7), + ((not(Result=[])-> + %%Result=[Var71|Vars72], + unique1(Result,[],Vars8), + findresult3(Arguments1,Vars8,[],Result2) +%%writeln(["Vars2",Vars2,"Result",Result]), + );( + writeln(here3), + Vars8=[],Result2=[])), + (debug(on)->(writeln([exit,[Function,Result2],"Press c."]),(not(get_single_char(97))->true;abort));true) + )->true; + (%%Query=[Function,_Arguments1], + 
%%Functions2=[[Function,_Arguments2,(:-),_Body]|Functions3], + member21(Query,Functions,Functions2,Vars8)) + );(turncut(off)). +member21(Query,Functions,Functions2,Result) :- +%%writeln([m21]), + cut(off)->( + (Query=[Function], + (Functions2=[[Function,(:-),Body]|_Functions3]), + Vars1=[], + (debug(on)->(writeln([call,[Function],"Press c."]),(not(get_single_char(97))->true;abort));true), + interpretbody(Functions,Functions2,Vars1,_Vars2,Body,true),!, %%**arg2 change + (debug(on)->(writeln([exit,[Function],"Press c."]),(not(get_single_char(97))->true;abort));true) + )->true; + (%%Query=[Function], + %%Functions2=[[Function]|Functions3], + member22(Query,Functions,Functions2,Result)) + );(turncut(off)). +member22(Query,Functions,Functions2,Vars8) :- +%%writeln([m22]), + cut(off)->( + (Query=[Function,Arguments1], + (Functions2=[[Function,Arguments2]|_Functions3]), + length(Arguments1,Length), + length(Arguments2,Length), + checkarguments(Arguments1,Arguments2,[],Vars1,[],FirstArgs),!, +%%writeln([checkarguments,"Arguments1",Arguments1,"Arguments2",Arguments2,"Vars1",Vars1,"FirstArgs",FirstArgs]), + updatevars(FirstArgs,Vars1,[],Result), + %%reverse(Result,[],Vars7), + ((not(Result=[])-> + %%Result=[Var71|Vars72], + unique1(Result,[],Vars8), + findresult3(Arguments1,Vars8,[],Result2) + );( +%%writeln(here4), + Vars8=[],Result2=[])), + (debug(on)->(writeln([call,[Function,Arguments1],"Press c."]),(not(get_single_char(97))->true;abort));true), + (debug(on)->(writeln([exit,[Function,Result2],"Press c."]),(not(get_single_char(97))->true;abort));true) + )->true; + (%%Query=[Function,_Arguments1], + %%Functions2=[[Function,_Arguments2]|Functions3], + member23(Query,Functions,Functions2,Vars8)) + );(turncut(off)). 
+member23(Query,Functions,Functions2,Vars8) :- +%%writeln([m23]), + cut(off)->( + (Query=[Function],!, + (Functions2=[[Function]|_Functions3]), + (debug(on)->(writeln([call,[Function],"Press c."]),(not(get_single_char(97))->true;abort));true), + Vars8=[], + (debug(on)->(writeln([exit,[Function],"Press c."]),(not(get_single_char(97))->true;abort));true) + )->true; + (%%Query=[Function], + Functions2=[_Function|Functions3], + member2(Query,Functions,Functions3,Vars8)) + );(turncut(off)). +checkarguments([],[],Vars,Vars,FirstArgs,FirstArgs). +checkarguments(Arguments1,Arguments2,Vars1,Vars2,FirstArgs1,FirstArgs2) :- %% +%%writeln(1), + Arguments1=[Value|Arguments3], %% Value may be a number, string, list or tree + expressionnotatom(Value), + Arguments2=[Variable2|Arguments4], + isvar(Variable2), + putvalue(Variable2,Value,Vars1,Vars3), + checkarguments(Arguments3,Arguments4,Vars3,Vars2,FirstArgs1,FirstArgs2). +checkarguments(Arguments1,Arguments2,Vars1,Vars2,FirstArgs1,FirstArgs2) :- %%A +%%writeln(2), + Arguments1=[Variable|Arguments3], %% Value may be a number, string, list or tree + isvar(Variable), + Arguments2=[Value|Arguments4], + expressionnotatom(Value), + putvalue(Variable,Value,Vars1,Vars3), + append(FirstArgs1,[[Variable,_]],FirstArgs3), + checkarguments(Arguments3,Arguments4,Vars3,Vars2,FirstArgs3,FirstArgs2). +checkarguments(Arguments1,Arguments2,Vars1,Vars2,FirstArgs1,FirstArgs2) :- +%%writeln(3), + Arguments1=[Variable1|Arguments3], + isvar(Variable1), + Arguments2=[Variable2|Arguments4], + isvar(Variable2), + (getvalue(Variable2,Value,Vars1)->true;Value=empty), + putvalue(Variable2,Value,Vars1,Vars3), + append(FirstArgs1,[[Variable1,Variable2]],FirstArgs3), + checkarguments(Arguments3,Arguments4,Vars3,Vars2,FirstArgs3,FirstArgs2). 
+checkarguments(Arguments1,Arguments2,Vars1,Vars2,FirstArgs1,FirstArgs2) :- +%%writeln(4), + Arguments1=[Value1|Arguments3], + expressionnotatom(Value1), + Arguments2=[Value1|Arguments4], + expressionnotatom(Value1), + checkarguments(Arguments3,Arguments4,Vars1,Vars2,FirstArgs1,FirstArgs2). + +interpretbody(_Functions1,_Functions2,Vars,Vars,[],true) :- !. + +interpretbody(Functions0,Functions,Vars1,Vars2,Body,Result1) :- + Body=[Statements1|Statements2], + interpretbody(Functions0,Functions,Vars1,Vars3,Statements1,Result2), + interpretbody(Functions0,Functions,Vars3,Vars2,Statements2,Result3), + %%((Result3=cut)->!;true), + logicalconjunction(Result1,Result2,Result3),!. + +interpretbody(Functions0,Functions,Vars1,Vars2,Body,Result1) :- + Body=[[not,[Statements]]|Statements2], + interpretbody(Functions0,Functions,Vars1,Vars3,Statements,Result2), + %%((Result2=cut)->!;true), + interpretbody(Functions0,Functions,Vars3,Vars2,Statements2,Result3), + %%((Result3=cut)->!;true), + logicalnot(Result2,Result4), + (logicalconjunction(Result1,Result4,Result3)->true;(Result1=false)),!. +interpretbody(Functions0,Functions,Vars1,Vars2,Body,Result1) :- + Body=[[Statements1],or,[Statements2]], + (interpretbody(Functions0,Functions,Vars1,Vars2,Statements1,Result1); + %%,((Value1=cut)->!;true)); + interpretbody(Functions0,Functions,Vars1,Vars2,Statements2,Result1)),!. + %%,((Value=cut)->!;true)). + %%(logicaldisjunction(Result1,Value1,Value2)->true;(Result1=false)). 
+ + +interpretbody(Functions0,Functions,Vars1,Vars2,Body,Result1) :- + Body=[Statement|Statements], +%%writeln(["Functions0",Functions0,"Functions",Functions,"Statement",Statement,"Vars1",Vars1,"Vars3",Vars3,"Result2",Result2,"Cut",Cut]), + interpretstatement1(Functions0,Functions,Statement,Vars1,Vars3,Result2,Cut), +%%writeln(["here1"]), + ((not(Cut=cut))->(Functions2=Functions);(turncut(on))), %% cut to interpret1/2 (assertz) +%%writeln(["here3"]), + interpretbody(Functions0,Functions2,Vars3,Vars2,Statements,Result3), + %%((Result3=cut)->!;true), +%%writeln(["here4"]), + logicalconjunction(Result1,Result2,Result3),!. +%%writeln([Result1,Result2,Result3]). +turncut(State1) :- + cut(State2), + retract(cut(State2)), + assertz(cut(State1)). +logicaldisjunction(true,Result2,Result3) :- + true(Result2);true(Result3). +logicalconjunction(true,Result2,Result3) :- + true(Result2),true(Result3). +logicalnot(Result1,Result2) :- + true(Result1),false(Result2). +logicalnot(Result1,Result2) :- + false(Result1),true(Result2). +true(true). +false(false). + +%%interpretstatement1(_F0,[],_,Vars,Vars,true,nocut) :- ! +%%writeln("AND HERE!") +%% . + +interpretstatement1(_F0,_Functions,[cut,[]],Vars,Vars,true,cut) :- !. + +interpretstatement1(_F0,_Functions,[atom,[Variable]],Vars,Vars,true,nocut) :- + getvalue(Variable,Value,Vars), + (debug(on)->(writeln([call,[atom,[Value]],"Press c."]),(not(get_single_char(97))->true;abort));true), + atom(Value), + (debug(on)->(writeln([exit,[atom,[Value]],"Press c."]),(not(get_single_char(97))->true;abort));true). + +interpretstatement1(_F0,_Functions,[string,[Variable]],Vars,Vars,true,nocut) :- + getvalue(Variable,Value,Vars), + (debug(on)->(writeln([call,[string,[Value]],"Press c."]),(not(get_single_char(97))->true;abort));true), + string(Value), + (debug(on)->(writeln([exit,[string,[Value]],"Press c."]),(not(get_single_char(97))->true;abort));true). 
+ +interpretstatement1(_F0,_Functions,[number,[Variable]],Vars,Vars,true,nocut) :- + getvalue(Variable,Value,Vars), + (debug(on)->(writeln([call,[number,[Value]],"Press c."]),(not(get_single_char(97))->true;abort));true), + number(Value), + (debug(on)->(writeln([exit,[number,[Value]],"Press c."]),(not(get_single_char(97))->true;abort));true). + +interpretstatement1(_F0,_Functions,[variable,[Variable]],Vars,Vars,true,nocut) :- + var(Variable), + (debug(on)->(writeln([call,[variable,[Variable]],"Press c."]),(not(get_single_char(97))->true;abort));true), + (debug(on)->(writeln([exit,[variable,[Variable]],"Press c."]),(not(get_single_char(97))->true;abort));true). + +interpretstatement1(_F0,_Functions,[Operator,[Variable1,Variable2]],Vars1,Vars2,true,nocut) :- + isop(Operator), + interpretpart(is,Variable1,Variable2,Vars1,Vars2). + +interpretstatement1(_F0,_Functions,[Operator,[Variable1,Variable2]],Vars1,Vars2,true,nocut) :- +%%writeln(31), + isop(Operator), + interpretpart(is,Variable2,Variable1,Vars1,Vars2). + +interpretstatement1(_F0,_Functions,[+,[Variable2,Variable3,Variable1]],Vars1,Vars2,true,nocut) :- +%%writeln(4), + interpretpart(isplus,Variable1,Variable2,Variable3,Vars1,Vars2). + +%%interpretstatement1(_F0,_Functions,[Variable2+Variable3,is,Variable1],Vars1,Vars2,true,nocut) :- +%%writeln(41), + %%interpretpart(isplus,Variable1,Variable2,Variable3,Vars1,Vars2). + +interpretstatement1(_F0,_Functions,[=,[Variable1,[Variable2,Variable3]]],Vars1,Vars2,true,nocut) :- +%%writeln(5), + interpretpart(match,Variable1,Variable2,Variable3,Vars1,Vars2). + +%%interpretstatement1(_F0,_Functions,[[Variable2,Variable3]=Variable1],Vars1,Vars2,true,nocut) :- +%%writeln(51), +%% interpretpart(match,Variable1,Variable2,Variable3,Vars1,Vars2). + +interpretstatement1(_F0,_Functions,[=,[Variable1,[Variable2]]],Vars1,Vars2,true,nocut) :- +%%writeln(52), + interpretpart(bracket1,Variable1,Variable2,Vars1,Vars2). 
+ +interpretstatement1(_F0,_Functions,[=,[[Variable1],[Variable2]]],Vars1,Vars2,true,nocut) :- +%%writeln(53), + interpretpart(bracket2,Variable1,Variable2,Vars1,Vars2). + +interpretstatement1(_F0,_Functions,[head,[Variable1,Variable2]],Vars1,Vars2,true,nocut) :- +%%writeln(6), + interpretpart(head,Variable1,Variable2,Vars1,Vars2). + +interpretstatement1(_F0,_Functions,[tail,[Variable1,Variable2]],Vars1,Vars2,true,nocut) :- +%%writeln(61), + interpretpart(tail,Variable1,Variable2,Vars1,Vars2). + + +interpretstatement1(_F0,_Functions,[member,[Variable1,Variable2]],Vars1,Vars2,true,nocut) :- +%%writeln(8), + interpretpart(member,Variable1,Variable2,Vars1,Vars2). + +interpretstatement1(_F0,_Functions,[delete,[Variable1,Variable2,Variable3]],Vars1,Vars2,true,nocut) :- +%%writeln(), + interpretpart(delete,Variable1,Variable2,Variable3,Vars1,Vars2). +%%** all in form f,[1,1,etc], including + with 0,1 + +interpretstatement1(_F0,_Functions,[append,[Variable1,Variable2,Variable3]],Vars1,Vars2,true,nocut) :- +%%writeln(9), + interpretpart(append,Variable1,Variable2,Variable3,Vars1,Vars2). + +interpretstatement1(Functions0,_Functions,Query1,Vars1,Vars8,true,nocut) :- +%%writeln("h1/10"), + Query1=[Function,Arguments], +%%writeln(["Arguments",Arguments,"Vars1",Vars1]), + substitutevarsA1(Arguments,Vars1,[],Vars3,[],FirstArgs), %%% var to value, after updatevars: more vars to values, and select argument vars from latest vars +%%writeln([substitutevarsA1,arguments,Arguments,vars1,Vars1,vars3,Vars3,firstargs,FirstArgs]), + Query2=[Function,Vars3], %% Bodyvars2? +%% debug(on)->writeln([call,[Function,[Vars3]]]), +%%writeln(["Query2",Query2,"Functions0",Functions0]), + interpret2(Query2,Functions0,Functions0,Result1), + updatevars2(FirstArgs,Result1,[],Vars5), + updatevars3(Vars1,Vars5,Vars6), + reverse(Vars6,[],Vars7), + ((not(Vars7=[])-> + %%Vars7=[Var71|Vars72], + unique1(Vars7,[],Vars8) +);( +writeln(here1), + Vars8=[])). 
+%%**** reverse and take first instance of each variable.
+	%%findresult3(Arguments,Vars6,[],Result2)
+%%writeln(["FirstArgs",FirstArgs,"Result1",Result1,"Vars5",Vars5,"Vars4",Vars4]),
+%%writeln(["Vars1:",Vars1,"Vars4:",Vars4]),
+%%	debug(on)->writeln([exit,[Function,[Result2]]]).
+interpretstatement1(Functions0,_Functions,Query,Vars,Vars,true) :-
+	Query=[Function],
+	(debug(on)->writeln([call,[Function]]);true),
+	interpret2(Query,Functions0,Functions0,_Result1),
+	(debug(on)->writeln([exit,[Function]]);true).
+
+
+interpretstatement1(_Functions0, _Functions,_Query,_Vars1,_Vars2,false) :-
+	writeln([false]).
+
+interpretstatement2(Value,_Vars,Value) :-
+	(number(Value);atom(Value)).
+interpretstatement2(Variable,Vars1,Value) :-
+	getvalue(Variable,Value,Vars1).
+interpretstatement3(A + B,Vars,Value1) :-
+	interpretstatement2(A,Vars,Value2),
+	interpretstatement2(B,Vars,Value3),
+	Value1 = Value2 + Value3.
+interpretstatement3(Value,_Vars,Value) :-
+	(number(Value);atom(Value)).
+interpretstatement3(Variable,Vars,Value) :-
+	getvalue(Variable,Value,Vars).
+getvalue(Variable,Value,Vars) :-
+	((not(isvar(Variable)),isvalstrorundef(Value),Variable=Value);
+	(isvar(Variable),isvalstrorundef(Value),getvar(Variable,Value,Vars))).
+putvalue(Variable,Value,Vars1,Vars2) :-
+	((not(isvar(Variable)),isvalstrorundef(Value),Variable=Value);
+	(isvar(Variable),isvalstrorundef(Value),updatevar(Variable,Value,Vars1,Vars2))),!.
+getvar(Variable,Value,Vars) :-
+	member([Variable,Value],Vars),
+	not(Value=empty).
+getvar(undef,undef,_Vars) :-
+	!.
+getvar(Variable,empty,Vars) :-
+	not(member([Variable,_Value],Vars));
+	member([Variable,empty],Vars).
+updatevar(undef,_Value,Vars,Vars) :-
+	!.
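+%% Sketch of the variable store (assuming the representation used by
+%% getvar/putvalue in this interpreter): bindings are [[Name,Value],...]
+%% pairs, with the atom empty marking a not-yet-bound slot, e.g.
+%% ?- updatevar(x,3,[[x,empty]],Vars).
+%% Vars = [[x,3]].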
+updatevar(Variable,Value,Vars1,Vars2) :- + ((((member([Variable,empty],Vars1), + delete(Vars1,[Variable,empty],Vars3), + append(Vars3,[[Variable,Value]],Vars2)); + ((not(member([Variable,Value1],Vars1)), + ((Value1=empty)->true;(Value1=Value)))), + append(Vars1,[[Variable,Value]],Vars2)); + (member([Variable,Value],Vars1),Vars2=Vars1)); + (undef(Variable), + append(Vars1,[[Variable,Value]],Vars2))). +updatevars(_FirstArgs,[],Vars,Vars). +updatevars(FirstArgs,Vars1,Vars2,Vars3) :- + Vars1=[[Variable1,Value]|Vars4], + ((member([Variable2,Variable1],FirstArgs), %% removed brackets around firstargs here and 2 line below + append(Vars2,[[Variable2,Value]],Vars5)); + (member([Variable1,_Variable2],FirstArgs), + append(Vars2,[[Variable1,Value]],Vars5))), + updatevars(FirstArgs,Vars4,Vars5,Vars3), + !. +updatevars(FirstArgs,Vars1,Vars2,Vars3) :- + Vars1=[_Vars4|Vars5], + updatevars(FirstArgs,Vars5,Vars2,Vars3). +updatevars2(_FirstArgs,[],Vars,Vars). +updatevars2(FirstArgs,Vars1,Vars2,Vars3) :- + Vars1=[[Variable,Value]|Vars4], + (member(Variable,FirstArgs), %% removed brackets around firstargs here and 2 line below, ** vars1 into arg in (10), check cond + append(Vars2,[[Variable,Value]],Vars5)), + updatevars2(FirstArgs,Vars4,Vars5,Vars3). +updatevars3(Vars1,[],Vars1). +updatevars3(Vars1,Vars2,Vars4) :- + Vars2=[[Variable,Value]|Vars5], + delete(Vars1,[Variable,empty],Vars6), + append(Vars6,[[Variable,Value]],Vars7), + updatevars3(Vars7,Vars5,Vars4), + !. +updatevars3(Vars1,Vars2,Vars4) :- + Vars2=[[Variable,Value]|Vars5], + append(Vars1,[[Variable,Value]],Vars6), + updatevars3(Vars6,Vars5,Vars4). +reverse([],List,List). +reverse(List1,List2,List3) :- + List1=[Head|Tail], + append([Head],List2,List4), + reverse(Tail,List4,List3). +unique1([],Items,Items). +unique1([Item|Items1],Items2,Items3) :- + delete(Items1,Item,Items4), + append(Items2,[Item],Items5), + unique1(Items4,Items5,Items3). + +isvar(Variable) :- + atom(Variable). +isval(Value) :- + number(Value). 
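+%% Sketch of the accumulator helpers above, e.g.
+%% ?- reverse([1,2,3],[],R).
+%% R = [3,2,1].
+%% ?- unique1([1,2,1,3],[],U).
+%% U = [1,2,3].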
+isvalstr(N) :- + isval(N);string(N). +/**isvalstrempty(N) :- + isval(N);(string(N);N=empty).**/ +isvalstrempty(N) :- + isval(N),!. +isvalstrempty(N) :- + string(N),!. +isvalstrempty(empty) :- !. +/**isvalstrempty(N) :- + atom(N),fail,!. +**/ +isvalstrorundef(N) :- + var(N);(not(var(N)),(isval(N);expression(N))). +undef(N) :- + var(N). +/** +expression(N) :- + isval(N);(string(N);atom(N)),!. +expression([]). +expression(empty). +expression([N]) :- + expression(N). +expression([N|Ns]):- + expression(N),expression(Ns). +**/ + +expression(empty) :- + !. +expression(N) :- + isval(N),!. +expression(N) :- + string(N),!. +expression(N) :- + atom(N),!. +expression([]) :- + !. +expression(N) :- +%% not(atom(N)), + length(N,L),L>=2, + expression2(N). +expression2([]). +expression2([N|Ns]) :- + expression3(N), + expression2(Ns). +expression3(N) :- + isval(N),!. +expression3(N) :- + string(N),!. +expression3(N) :- + atom(N),!. + +expressionnotatom(N) :- + isvalstrempty(N),!. +expressionnotatom(N) :- + not(atom(N)), + length(N,L),L>=2, + expressionnotatom2(N). +expressionnotatom2([]). +expressionnotatom2([N|Ns]) :- + isvalstrempty(N), + expressionnotatom2(Ns). + +substitutevarsA1(Arguments,Vars1,Vars2,Vars3,FirstArgs1,FirstArgs2) :- + substitutevarsA2(Arguments,Vars1,Vars2,Vars3,FirstArgs1,FirstArgs2). +substitutevarsA2([],_Vars1,Vars2,Vars2,FirstArgs,FirstArgs). +substitutevarsA2(Arguments,Vars1,Vars2,Vars3,FirstArgs1,FirstArgs2) :- + Arguments=[Variable|Variables], + ((getvalue(Variable,Value,Vars1), + Value=empty)-> + ((append(Vars2,[Variable],Vars4)), + (isvar(Variable)->append(FirstArgs1,[Variable], + FirstArgs3);FirstArgs3=FirstArgs1)); + (getvalue(Variable,Value,Vars1), + append(Vars2,[Value],Vars4)), + FirstArgs3=FirstArgs1), + substitutevarsA2(Variables,Vars1,Vars4,Vars3,FirstArgs3,FirstArgs2). + +findresult3([],_Result,Result2,Result2). 
+findresult3(Arguments1,Result1,Result2,Result3) :- + Arguments1=[Value|Arguments2], + expressionnotatom(Value), + append(Result2,[Value],Result4), + findresult3(Arguments2,Result1,Result4,Result3). +findresult3(Arguments1,Result1,Result2,Result3) :- + Arguments1=[Variable|Arguments2], + isvar(Variable), + member([Variable,Value],Result1), + append(Result2,[Value],Result4), + findresult3(Arguments2,Result1,Result4,Result3). + + + diff --git a/Combination-Algorithm-Writer/listprologinterpreter3preds5 copy 52.pl b/Combination-Algorithm-Writer/listprologinterpreter3preds5 copy 52.pl new file mode 100644 index 0000000000000000000000000000000000000000..a2bfa2de37c2852e31f3ef8b4e910a148a8444b9 --- /dev/null +++ b/Combination-Algorithm-Writer/listprologinterpreter3preds5 copy 52.pl @@ -0,0 +1,118 @@ +interpretpart(is,Variable1,Value1,Vars1,Vars2) :- + getvalue(Value1,Value1A,Vars1), + %%isvalstr(Value1), + %%isvalstr(Value1A), + expression(Value1A), + %%val1emptyorvalsequal(Value1A,Value1), + %%isval(Value2), + putvalue(Variable1,Value1A,Vars1,Vars2), + (debug(on)->(writeln([call,[variable,is,Value1A],"Press c."]),(not(get_single_char(97))->true;abort));true), + (debug(on)->(writeln([exit,[Variable1,is,Value1A],"Press c."]),(not(get_single_char(97))->true;abort));true). +interpretpart(isplus,Variable1,Variable2,Variable3,Vars1,Vars2) :- + getvalues(Variable1,Variable2,Variable3,Value1,Value2,Value3,Vars1), + Value1A is Value2 + Value3, + val1emptyorvalsequal(Value1,Value1A), + putvalue(Variable1,Value1A,Vars1,Vars2), + (debug(on)->(writeln([call,[variable,is,Value2+Value3],"Press c."]),(not(get_single_char(97))->true;abort));true), + (debug(on)->(writeln([exit,[Value1A,is,Value2+Value3],"Press c."]),(not(get_single_char(97))->true;abort));true). 
+interpretpart(match,Variable1,Variable2,Variable3,Vars1,Vars2) :- + getvalues(Variable1,Variable2,Variable3,Value1,Value2,Value3,Vars1), + Value1 = [Value2A, Value3A], + val1emptyorvalsequal(Value2,Value2A), + val1emptyorvalsequal(Value3,Value3A), + putvalue(Variable2,Value2A,Vars1,Vars3), + putvalue(Variable3,Value3A,Vars3,Vars2), + (debug(on)->(writeln([call,[[Value2A, Value3A],=,[variable1,variable2]],"Press c."]),(not(get_single_char(97))->true;abort));true), (debug(on)->(writeln([exit,[[Value2A, Value3A],=,[Value2A, Value3A]],"Press c."]),(not(get_single_char(97))->true;abort));true). +interpretpart(match,Variable1,Variable2,Variable3,Vars1,Vars2) :- + getvalues(Variable1,Variable2,Variable3,Value1,Value2,Value3,Vars1), + Value1A = [Value2, Value3], + val1emptyorvalsequal(Value1,Value1A), + putvalue(Variable1,Value1A,Vars1,Vars2), + (debug(on)->(writeln([call,[variable,=,[Value2,Value3]],"Press c."]),(not(get_single_char(97))->true;abort));true), + (debug(on)->(writeln([exit,[[Value2,Value3],=,[Value2,Value3]],"Press c."]),(not(get_single_char(97))->true;abort));true). +interpretpart(bracket1,Variable1,Variable2,Vars1,Vars2) :- + getvalues(Variable1,Variable2,Value1,Value2,Vars1), + Value1A = [Value2], + val1emptyorvalsequal(Value1,Value1A), + %%val1emptyorvalsequal(Value1A,Value2), + putvalue(Variable1,Value1A,Vars1,Vars2), + (debug(on)->(writeln([call,[variable,=,[Value2]],"Press c."]),(not(get_single_char(97))->true;abort));true), + (debug(on)->(writeln([exit,[Variable2,=,[Value2]],"Press c."]),(not(get_single_char(97))->true;abort));true). 
+interpretpart(bracket2,Variable1,Variable2,Vars1,Vars2) :- + getvalues(Variable1,Variable2,Value1,Value2,Vars1), + Value1A = Value2, + val1emptyorvalsequal(Value1,Value1A), + %%val1emptyorvalsequal(Value2A,Value1), + putvalue(Variable1,Value1A,Vars1,Vars2), + (debug(on)->(writeln([call,[[variable],=,[Value2]],"Press c."]),(not(get_single_char(97))->true;abort));true), + (debug(on)->(writeln([exit,[[Value2],=,[Value2]],"Press c."]),(not(get_single_char(97))->true;abort));true). +interpretpart(head,Variable1,Variable2,Vars1,Vars2) :- + getvalues(Variable1,Variable2,Value1,Value2,Vars1), + Value1=[Value1A|_Rest], + val1emptyorvalsequal(Value2,Value1A), + putvalue(Variable2,Value1A,Vars1,Vars2), + (debug(on)->(writeln([call,[head,Value1,variable],"Press c."]),(not(get_single_char(97))->true;abort));true), + (debug(on)->(writeln([exit,[head,Value1,Value1A],"Press c."]),(not(get_single_char(97))->true;abort));true), !. +interpretpart(tail,Variable1,Variable2,Vars1,Vars2) :- + getvalues(Variable1,Variable2,Value1,Value2,Vars1), + Value1=[_Head|Value1A], + %%removebrackets(Value1A,Value1B), + val1emptyorvalsequal(Value2,Value1A), + putvalue(Variable2,Value1A,Vars1,Vars2), + (debug(on)->(writeln([call,[tail,Value1,variable],"Press c."]),(not(get_single_char(97))->true;abort));true), + (debug(on)->(writeln([exit,[tail,Value1,Value1A],"Press c."]),(not(get_single_char(97))->true;abort));true). 
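+%% Sketch of head/tail above: given Value1 = [1,2,3], interpretpart(head,...)
+%% binds Variable2 to 1 and interpretpart(tail,...) binds Variable2 to [2,3].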
+interpretpart(member,Variable1,Variable2,Vars1,Vars2) :- + getvalues(Variable1,Variable2,Value1,Value2,Vars1), + (not(Value1=empty)->(member(Value1,Value2),Vars2=Vars1, + (debug(on)->(writeln([call,[member,Value1,Value2],"Press c."]),(not(get_single_char(97))->true;abort));true), + (debug(on)->(writeln([exit,[member,Value1,Value2],"Press c."]),(not(get_single_char(97))->true;abort));true)); + (member(Value3,Value2), + putvalue(Variable1,Value3,Vars1,Vars2), + (debug(on)->(writeln([call,[member,variable1,Value2],"Press c."]),(not(get_single_char(97))->true;abort));true), + (debug(on)->(writeln([exit,[member,Value3,Value2],"Press c."]),(not(get_single_char(97))->true;abort));true))). + +interpretpart(delete,Variable1,Variable2,Variable3,Vars1,Vars2) :- + getvalues(Variable1,Variable2,Variable3,Value1,Value2,Value3,Vars1), + delete(Value1,Value2,Value3A), + val1emptyorvalsequal(Value3,Value3A), + putvalue(Variable3,Value3A,Vars1,Vars2), + (debug(on)->(writeln([call,[delete,Value1,Value2,variable3],"Press c."]),(not(get_single_char(97))->true;abort));true), + (debug(on)->(writeln([exit,[delete,Value1,Value2,Value3A],"Press c."]),(not(get_single_char(97))->true;abort));true). + + +interpretpart(append,Variable1,Variable2,Variable3,Vars1,Vars2) :- + getvalues(Variable1,Variable2,Variable3,Value1,Value2,Value3,Vars1), + append1(Value1,Value2,Value3A), + val1emptyorvalsequal(Value3,Value3A), + putvalue(Variable3,Value3A,Vars1,Vars2), + (debug(on)->(writeln([call,[append,Value1,Value2,variable3],"Press c."]),(not(get_single_char(97))->true;abort));true), + (debug(on)->(writeln([exit,[append,Value1,Value2,Value3A],"Press c."]),(not(get_single_char(97))->true;abort));true). +getvalues(Variable1,Variable2,Value1,Value2,Vars) :- + getvalue(Variable1,Value1,Vars), + getvalue(Variable2,Value2,Vars). +getvalues(Variable1,Variable2,Variable3,Value1,Value2,Value3,Vars) :- + getvalue(Variable1,Value1,Vars), + getvalue(Variable2,Value2,Vars), + getvalue(Variable3,Value3,Vars). 
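+%% Sketch of interpretpart(member,...) above: with Value1 bound it checks
+%% membership of Value1 in Value2; with Value1 = empty it generates members
+%% of Value2 on backtracking and binds Variable1 to each in turn.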
+val1emptyorvalsequal(empty,Value) :- + not(Value=empty). +val1emptyorvalsequal(Value,Value) :- + not(Value=empty). +isop(is). +isop(=). +append1([],Item,Item) :- + !. +append1(Item1,Item2,Item3) :- + ((isvalstr(Item1),Item1A=[Item1]);(not(isvalstr(Item1)),Item1A=Item1)), + ((isvalstr(Item2),Item2A=[Item2]);(not(isvalstr(Item2)),Item2A=Item2)), + %%((isvalstr(Item3),Item3A=[Item3]);(not(isvalstr(Item3)),Item3A=Item3)), + append(Item1A,Item2A,Item3). +/**delete1(Item1,Item2,Item3) :- + ((isvalstr(Item1),Item1A=[Item1]);(not(isvalstr(Item1)),Item1A=Item1)), + ((isvalstr(Item2),Item2A=[Item2]);(not(isvalstr(Item2)),Item2A=Item2)), + %%((isvalstr(Item3),Item3A=[Item3]);(not(isvalstr(Item3)),Item3A=Item3)), + delete(Item1A,Item2A,Item3). +**/ + +removebrackets([[Value]],Value) :-!. +removebrackets(Value,Value). diff --git a/Daily-Regimen-main/.DS_Store b/Daily-Regimen-main/.DS_Store new file mode 100644 index 0000000000000000000000000000000000000000..5008ddfcf53c02e82d7eee2e57c38e5672ef89f6 Binary files /dev/null and b/Daily-Regimen-main/.DS_Store differ diff --git a/Daily-Regimen-main/LICENSE b/Daily-Regimen-main/LICENSE new file mode 100644 index 0000000000000000000000000000000000000000..920183ffbdaca89fc6faee40b6de08a6c8a061c7 --- /dev/null +++ b/Daily-Regimen-main/LICENSE @@ -0,0 +1,29 @@ +BSD 3-Clause License + +Copyright (c) 2021, Lucian Green +All rights reserved. + +Redistribution and use in source and binary forms, with or without +modification, are permitted provided that the following conditions are met: + +1. Redistributions of source code must retain the above copyright notice, this + list of conditions and the following disclaimer. + +2. Redistributions in binary form must reproduce the above copyright notice, + this list of conditions and the following disclaimer in the documentation + and/or other materials provided with the distribution. + +3. 
Neither the name of the copyright holder nor the names of its + contributors may be used to endorse or promote products derived from + this software without specific prior written permission. + +THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" +AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE +IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE +DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE +FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL +DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR +SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER +CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, +OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE +OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. diff --git a/Daily-Regimen-main/README.md b/Daily-Regimen-main/README.md new file mode 100644 index 0000000000000000000000000000000000000000..206d0e5240a5c61e52bf7735a33d8ac6ca2aa26e --- /dev/null +++ b/Daily-Regimen-main/README.md @@ -0,0 +1,104 @@ +# Daily-Regimen + +Scripts for daily meditation, bot prep, uni and breasoning. + +* rcawpastea vps.txt - Daily regimen for meditation +* la_com_bot_prep.txt - Prepare bots for Lucian Academy +* la_com.txt - Do work for Lucian Academy +* text_to_breasonings.txt - Script for text to breasonings, with breasonings, details, production of algorithm-like synonyms and breasoning of these + +# Getting Started + +Please read the following instructions on how to install the project on your computer for automating and customising your daily regimen. + +# Prerequisites + +* Please download and install SWI-Prolog for your machine at `https://www.swi-prolog.org/build/`. + +* You may need to install gawk using Homebrew. + +* Install Translation Shell on Mac, etc. 
+Change the line in
+```
+culturaltranslationtool/ctt2.pl
+trans_location("../../../gawk/trans").
+```
+to the correct location of `trans`.
+
+# 1. Install manually
+
+* Download:
+* this repository
+* listprologinterpreter
+* Languages
+* Cultural Translation Tool. Requires Translation Shell (you may need to install gawk using Homebrew. Install Translation Shell on Mac, etc.
+Change the line in culturaltranslationtool/ctt2.pl
+`trans_location("../../../gawk/trans").` to the correct location of `trans`).
+* Algorithm-Writer-with-Lists
+* Text-to-Breasonings. (Caution: Before running texttobr, think of two radio buttons put on recordings, put through with prayer, nut and bolt, quantum box prayer 1, 1, 0.5 cm and 1, 1, 0.5 cm. Follow the instructions in Instructions for Using texttobr(2) when using texttobr, texttobr2 or mind reader to avoid medical problems.)
+* mindreader
+
+# 2. Or Install from List Prolog Package Manager (LPPM)
+
+* Download the LPPM Repository:
+
+```
+mkdir GitHub
+cd GitHub/
+git clone https://github.com/luciangreen/List-Prolog-Package-Manager.git
+cd List-Prolog-Package-Manager
+swipl
+['lppm'].
+lppm_install("luciangreen","Daily-Regimen").
+halt.
+```
+
+# Running
+
+# big_medit1.sh
+
+* Automatically time travel and increase the longevity of your meditators each day. See instructions.
+
+# rcawpastea vps.txt
+
+* Daily regimen for meditation. It contains a "keep the hits" argument, provides meditation for meditators, prepares for mind reading, clears security fears from meditators, gives each meditator 50 As with a custom algorithm, invites each meditator to a spiritual psychiatrist every month, and expands thoughts from `Text-to-Breasonings/luciansphilosophy.txt` for each meditator. This last step should only be taken when original philosophy has been written.
+
+* Follow the instructions for human daily regimen at Daily Regimen (and the safety instructions for using Text to Breasonings).
+* Modify the meditators in `Daily-Regimen/rcawpastea\ vps.txt`, `Text-to-Breasonings/mindreadtestshared.pl` and `Text-to-Breasonings/meditatorsanddoctors.pl`. +* In the folder that `GitHub` is in (containing `Daily-Regimen`), copy and paste the contents of `rcawpastea vps.txt` into the terminal. It is NOT a shell script, because it enters user input through the script. + + + +# la_com_bot_prep.txt + +* Prepare bots for Lucian Academy. They are given 50 As for meditation, etc. + +* In the folder that `GitHub` is in (containing `Daily-Regimen`), copy and paste the contents of `la_com_bot_prep.txt` into the terminal. It is NOT a shell script, because it enters user input through the script. + + +# la_com.txt + +* Do work for Lucian Academy. A new student may start (given a coin toss) and old students may complete an essay (from short-course to PhD, given a coin toss) and possibly graduate. + +* In the folder that `GitHub` is in (containing `Lucian-Academy`), copy and paste the contents of `la_com.txt` into the terminal. It is NOT a shell script, because it enters user input through the script. + + + +# text_to_breasonings.txt + +* Script for text to breasonings, with breasonings, details, production of algorithm-like synonyms and breasoning of these. + +* In the folder that `GitHub` is in (containing `Text-to-Breasonings`), copy and paste the contents of `text_to_breasonings.txt` into the terminal. It is NOT a shell script, because it enters user input through the script. + + +# Versioning + +We will use SemVer for versioning. 
+ +# Authors + +Lucian Green - Initial programmer - Lucian Academy + +# License + +I licensed this project under the BSD3 License - see the LICENSE.md file for details diff --git a/Daily-Regimen-main/a15_meditators.sh b/Daily-Regimen-main/a15_meditators.sh new file mode 100644 index 0000000000000000000000000000000000000000..039f53249dab3018bc91b1599049c82b34bda559 --- /dev/null +++ b/Daily-Regimen-main/a15_meditators.sh @@ -0,0 +1,4 @@ +#!/bin/bash +cd ../Text-to-Breasonings +swipl --goal=main --stand_alone=true -o a15_meditators -c a15_meditators.pl +./a15_meditators \ No newline at end of file diff --git a/Daily-Regimen-main/add_to_tt_log.pl b/Daily-Regimen-main/add_to_tt_log.pl new file mode 100644 index 0000000000000000000000000000000000000000..582140c47910899d6ef746aa76a5c5f6c1f2b1b0 --- /dev/null +++ b/Daily-Regimen-main/add_to_tt_log.pl @@ -0,0 +1,21 @@ +:- use_module(library(date)). +:-include('../listprologinterpreter/listprolog.pl'). + +main :- + +catch((open_file_s("tt_log.txt",File_term), +(append(_,[[n=N1,Pres_D,Pres_M,Pres_Y,Fut_D,Fut_M,Fut_Y]],File_term)->true;(N1=0,Fut_D=1,Fut_M=10,Fut_Y=5689)), +N2 is N1+1, + +get_time(TS),stamp_date_time(TS,date(Year,Month,Day,_,_,_,_,_,_),local), + +([Pres_D,Pres_M,Pres_Y]=[Day,Month,Year]->D=0;D=1), +Fut_D0 is ((Fut_D+D) mod 17), +(Fut_D0=0->Fut_D1=1;Fut_D0=Fut_D1), + +append(File_term,[[n=N2,Day,Month,Year,Fut_D1,Fut_M,Fut_Y]],File_term1), term_to_atom(File_term1,String1), + save_file_s("tt_log.txt",String1)),Err,handle_error(Err)),halt. + +handle_error(_Err):- + halt(1). +main :- halt(1). 
\ No newline at end of file diff --git a/Daily-Regimen-main/add_to_tt_log.sh b/Daily-Regimen-main/add_to_tt_log.sh new file mode 100644 index 0000000000000000000000000000000000000000..e8a4f4deaa06fa6333e6fe4ea531743657c97ca2 --- /dev/null +++ b/Daily-Regimen-main/add_to_tt_log.sh @@ -0,0 +1,4 @@ +#!/bin/bash +#cd ../Daily-Regimen +swipl --stack_limit=40G --goal=main --stand_alone=true -o add_to_tt_log -c add_to_tt_log.pl +./add_to_tt_log \ No newline at end of file diff --git a/Daily-Regimen-main/attention_needed_bell.pl b/Daily-Regimen-main/attention_needed_bell.pl new file mode 100644 index 0000000000000000000000000000000000000000..0afd6a30f2f086b3214ae70e1945d4cd26cd1b4f --- /dev/null +++ b/Daily-Regimen-main/attention_needed_bell.pl @@ -0,0 +1,7 @@ +:-include('../Philosophy/bell.pl'). +main:-catch(attention_needed_bell,Err,handle_error(Err)),halt. +main :- halt(1). +handle_error(_Err):- + halt(1). +attention_needed_bell :- +catch((bell("Attention needed")->true;true),_,true),!. \ No newline at end of file diff --git a/Daily-Regimen-main/attention_needed_bell.sh b/Daily-Regimen-main/attention_needed_bell.sh new file mode 100644 index 0000000000000000000000000000000000000000..09d83b6a0303b611e712cdad60426bda9032ae2e --- /dev/null +++ b/Daily-Regimen-main/attention_needed_bell.sh @@ -0,0 +1,4 @@ +#!/bin/bash +#cd ../Daily-Regimen +swipl --stack_limit=40G --goal=main --stand_alone=true -o attention_needed_bell -c attention_needed_bell.pl +./attention_needed_bell \ No newline at end of file diff --git a/Daily-Regimen-main/big_medit b/Daily-Regimen-main/big_medit new file mode 100644 index 0000000000000000000000000000000000000000..2bb77dd59823e1526d9b4b95834980d5656915c2 Binary files /dev/null and b/Daily-Regimen-main/big_medit differ diff --git a/Daily-Regimen-main/big_medit.pl b/Daily-Regimen-main/big_medit.pl new file mode 100644 index 0000000000000000000000000000000000000000..d4e8d112b9b43f68306535c4f88116b69a1f43e8 --- /dev/null +++ 
b/Daily-Regimen-main/big_medit.pl @@ -0,0 +1,29 @@ +% big_medit.pl + +%:- use_module(library(date)). +:-include('../listprologinterpreter/listprolog.pl'). + +main :- + +catch(main2,Err,handle_error(Err)),halt. + +handle_error(_Err):- + halt(1). +main :- halt(1). + +main2:- +open_string_file_s("big_medit.sh",File_string), +split_string(File_string,"\n\r","\n\r",List), +findall(X,(member(X,List),not(string_concat("#",_,X))),Y), +%try each command again 3 times if fails +%report on progress +findall(_,(member(Z,Y),bash_command1(Z)),_), +writeln("Finished."),!. + +bash_command1(Z) :- bash_command2(3,Z). +bash_command2(1,Z) :- write("Failed: "),writeln_on_lines(Z),abort. +bash_command2(N,Z) :- + write("Trying: "),writeln_on_lines(Z), + catch(bash_command(Z,_),_,(N1 is N-1,bash_command2(N1,Z))),!. + +writeln_on_lines(C) :- atomic_list_concat(A,"\\n",C),atomic_list_concat(A,"\n",B),writeln(B),!. diff --git a/Daily-Regimen-main/big_medit.sh b/Daily-Regimen-main/big_medit.sh new file mode 100644 index 0000000000000000000000000000000000000000..f9fca218c5faa9c828e11ff4344e34ca4b8d4c41 --- /dev/null +++ b/Daily-Regimen-main/big_medit.sh @@ -0,0 +1,49 @@ +# Comment the start of ./cgpt_combophil3_160.sh with '#' if you don't want to use ChatGPT as an accreditation tool in each dimension +./cgpt_combophil3_160.sh +./texttobr2_square.sh +./d.sh +./day2.sh +# ./a15_meditators.sh prepares an apartment in each dimension for meditators by helping 15 of those around them +./a15_meditators.sh +./daily_medit.sh +./attention_needed_bell.sh +echo "Please ask the future Head of State and the Simulation Co-ordinator for the age freezing medicine, then say \"too\" to each of the following:\n\nA, B, B to B for Reishi Mushroom\nA, B, B to B for Goji\nA, B, B to B for Ginseng\nA, B, B to B for He-Shou-Wu\nA, B, B to B for Gotu Kola\nA, B, B to B for Schisandra\nA, B, B to B for Love head of state\nA, B, B to B for Thank who helped me with immortality medicine, body replacement and 
anti-ageing\nA, B, B to B for Thank head of state for Reishi Mushroom\nA, B, B to B for Thank head of state for Goji\nA, B, B to B for Thank head of state for Ginseng\nA, B, B to B for Thank head of state for He-Shou-Wu\nA, B, B to B for Thank head of state for Gotu Kola\nA, B, B to B for Thank head of state for Schisandra\nA, B, B to B for Immortality\nA, B, B to B for Body replacement\nA, B, B to B for Anti-ageing medicine\nA, B, B to B for immortality medicine, body replacement and anti-ageing for memory\nA, B, B to B for immortality medicine, body replacement and anti-ageing for thinking\nA, B, B to B for immortality medicine, body replacement and anti-ageing for stopping dementia\nA, B, B to B for immortality medicine, body replacement and anti-ageing for seeing clearly\nA, B, B to B for immortality medicine, body replacement and anti-ageing for muscle relaxation\nA, B, B to B for immortality medicine, body replacement and anti-ageing for Circulatory system / Cardiovascular system\nA, B, B to B for immortality medicine, body replacement and anti-ageing for Digestive system and Excretory system\nA, B, B to B for immortality medicine, body replacement and anti-ageing for Endocrine system\nA, B, B to B for immortality medicine, body replacement and anti-ageing for Integumentary system / Exocrine system\nA, B, B to B for immortality medicine, body replacement and anti-ageing for Immune system and lymphatic system:\nA, B, B to B for immortality medicine, body replacement and anti-ageing for Muscular system\nA, B, B to B for immortality medicine, body replacement and anti-ageing for Nervous system\nA, B, B to B for immortality medicine, body replacement and anti-ageing for Renal system and Urinary system\nA, B, B to B for immortality medicine, body replacement and anti-ageing for Reproductive system\nA, B, B to B for immortality medicine, body replacement and anti-ageing for Respiratory system\nA, B, B to B for immortality medicine, body replacement and anti-ageing 
for Skeletal System\nA, B, B to B for immortality medicine, body replacement and anti-ageing for antidepressant\nA, B, B to B for immortality medicine, body replacement and anti-ageing for antipsychotic\nA, B, B to B for Other medicines for the body\nA, B, B to B for ginkgo biloba\nA, B, B to B for practicum for others in immortality, etc.\nA, B, B to B for the other As\nA, B, B to B for Thank head of state.\n\nThen enter \"y\":" +./please_enter_y.sh +echo "Please think of 108 arems and 108 friendlinesses, then enter \"y\":" +./please_enter_y.sh +./group_meditation.sh +./add_to_tt_log.sh +echo "Going to present, press \"y\":" +./please_enter_y.sh +./time_hop.sh +echo "Please think of 108 arems and 108 friendlinesses, then enter \"y\":" +./please_enter_y.sh +./group_meditation.sh +./cgpt_combophil3_160.sh +./texttobr2_square.sh +./d.sh +./day2.sh +./a15_meditators.sh +./daily_medit.sh +./attention_needed_bell.sh +echo "Please ask the future Head of State and the Simulation Co-ordinator for the age freezing medicine, then say \"too\" to each of the following:\n\nA, B, B to B for Reishi Mushroom\nA, B, B to B for Goji\nA, B, B to B for Ginseng\nA, B, B to B for He-Shou-Wu\nA, B, B to B for Gotu Kola\nA, B, B to B for Schisandra\nA, B, B to B for Love head of state\nA, B, B to B for Thank who helped me with immortality medicine, body replacement and anti-ageing\nA, B, B to B for Thank head of state for Reishi Mushroom\nA, B, B to B for Thank head of state for Goji\nA, B, B to B for Thank head of state for Ginseng\nA, B, B to B for Thank head of state for He-Shou-Wu\nA, B, B to B for Thank head of state for Gotu Kola\nA, B, B to B for Thank head of state for Schisandra\nA, B, B to B for Immortality\nA, B, B to B for Body replacement\nA, B, B to B for Anti-ageing medicine\nA, B, B to B for immortality medicine, body replacement and anti-ageing for memory\nA, B, B to B for immortality medicine, body replacement and anti-ageing for thinking\nA, B, B to B for immortality 
medicine, body replacement and anti-ageing for stopping dementia\nA, B, B to B for immortality medicine, body replacement and anti-ageing for seeing clearly\nA, B, B to B for immortality medicine, body replacement and anti-ageing for muscle relaxation\nA, B, B to B for immortality medicine, body replacement and anti-ageing for Circulatory system / Cardiovascular system\nA, B, B to B for immortality medicine, body replacement and anti-ageing for Digestive system and Excretory system\nA, B, B to B for immortality medicine, body replacement and anti-ageing for Endocrine system\nA, B, B to B for immortality medicine, body replacement and anti-ageing for Integumentary system / Exocrine system\nA, B, B to B for immortality medicine, body replacement and anti-ageing for Immune system and lymphatic system:\nA, B, B to B for immortality medicine, body replacement and anti-ageing for Muscular system\nA, B, B to B for immortality medicine, body replacement and anti-ageing for Nervous system\nA, B, B to B for immortality medicine, body replacement and anti-ageing for Renal system and Urinary system\nA, B, B to B for immortality medicine, body replacement and anti-ageing for Reproductive system\nA, B, B to B for immortality medicine, body replacement and anti-ageing for Respiratory system\nA, B, B to B for immortality medicine, body replacement and anti-ageing for Skeletal System\nA, B, B to B for immortality medicine, body replacement and anti-ageing for antidepressant\nA, B, B to B for immortality medicine, body replacement and anti-ageing for antipsychotic\nA, B, B to B for Other medicines for the body\nA, B, B to B for ginkgo biloba\nA, B, B to B for practicum for others in immortality, etc.\nA, B, B to B for the other As\nA, B, B to B for Thank head of state.\n\nThen enter \"y\"" +./please_enter_y.sh +echo "Please think of 108 arems and 108 friendlinesses, then enter \"y\":" +./please_enter_y.sh +./group_meditation.sh +#echo "Going to 5689, press \"y\":" 
+./going_to_5689.sh +./please_enter_y.sh +./time_hop.sh +echo "Please think of 108 arems and 108 friendlinesses, then enter \"y\":" +./please_enter_y.sh +./group_meditation.sh +./cgpt_combophil3_160.sh +./texttobr2_square.sh +./d.sh +./day2.sh +./a15_meditators.sh +./daily_medit.sh +./attention_needed_bell.sh +echo "Please ask the future Head of State and the Simulation Co-ordinator for the age freezing medicine, then say \"too\" to each of the following:\n\nA, B, B to B for Reishi Mushroom\nA, B, B to B for Goji\nA, B, B to B for Ginseng\nA, B, B to B for He-Shou-Wu\nA, B, B to B for Gotu Kola\nA, B, B to B for Schisandra\nA, B, B to B for Love head of state\nA, B, B to B for Thank who helped me with immortality medicine, body replacement and anti-ageing\nA, B, B to B for Thank head of state for Reishi Mushroom\nA, B, B to B for Thank head of state for Goji\nA, B, B to B for Thank head of state for Ginseng\nA, B, B to B for Thank head of state for He-Shou-Wu\nA, B, B to B for Thank head of state for Gotu Kola\nA, B, B to B for Thank head of state for Schisandra\nA, B, B to B for Immortality\nA, B, B to B for Body replacement\nA, B, B to B for Anti-ageing medicine\nA, B, B to B for immortality medicine, body replacement and anti-ageing for memory\nA, B, B to B for immortality medicine, body replacement and anti-ageing for thinking\nA, B, B to B for immortality medicine, body replacement and anti-ageing for stopping dementia\nA, B, B to B for immortality medicine, body replacement and anti-ageing for seeing clearly\nA, B, B to B for immortality medicine, body replacement and anti-ageing for muscle relaxation\nA, B, B to B for immortality medicine, body replacement and anti-ageing for Circulatory system / Cardiovascular system\nA, B, B to B for immortality medicine, body replacement and anti-ageing for Digestive system and Excretory system\nA, B, B to B for immortality medicine, body replacement and anti-ageing for Endocrine system\nA, B, B to B for immortality 
medicine, body replacement and anti-ageing for Integumentary system / Exocrine system\nA, B, B to B for immortality medicine, body replacement and anti-ageing for Immune system and lymphatic system\nA, B, B to B for immortality medicine, body replacement and anti-ageing for Muscular system\nA, B, B to B for immortality medicine, body replacement and anti-ageing for Nervous system\nA, B, B to B for immortality medicine, body replacement and anti-ageing for Renal system and Urinary system\nA, B, B to B for immortality medicine, body replacement and anti-ageing for Reproductive system\nA, B, B to B for immortality medicine, body replacement and anti-ageing for Respiratory system\nA, B, B to B for immortality medicine, body replacement and anti-ageing for Skeletal System\nA, B, B to B for immortality medicine, body replacement and anti-ageing for antidepressant\nA, B, B to B for immortality medicine, body replacement and anti-ageing for antipsychotic\nA, B, B to B for Other medicines for the body\nA, B, B to B for ginkgo biloba\nA, B, B to B for practicum for others in immortality, etc.\nA, B, B to B for the other As\nA, B, B to B for Thank head of state.\n\nThen enter \"y\":" +./please_enter_y.sh \ No newline at end of file diff --git a/Daily-Regimen-main/big_medit1.sh b/Daily-Regimen-main/big_medit1.sh new file mode 100644 index 0000000000000000000000000000000000000000..3bb2450d7dd5627d93e6be4c0fba786e395a2571 --- /dev/null +++ b/Daily-Regimen-main/big_medit1.sh @@ -0,0 +1,4 @@ +#!/bin/bash +#cd ../Daily-Regimen +swipl --stack-limit=40G --goal=main --stand_alone=true -o big_medit -c big_medit.pl +./big_medit \ No newline at end of file diff --git a/Daily-Regimen-main/big_medit2 b/Daily-Regimen-main/big_medit2 new file mode 100644 index 0000000000000000000000000000000000000000..b16acda61386e3f1a10b94990330d2598aca1265 Binary files /dev/null and b/Daily-Regimen-main/big_medit2 differ diff --git a/Daily-Regimen-main/big_medit2.pl b/Daily-Regimen-main/big_medit2.pl new 
file mode 100644 index 0000000000000000000000000000000000000000..63b2657ed9cb0160330ac904f72ceea3f51e8ec2 --- /dev/null +++ b/Daily-Regimen-main/big_medit2.pl @@ -0,0 +1,84 @@ +% big_medit2.pl + +:-include('../listprologinterpreter/listprolog.pl'). +:-include('../Text-to-Breasonings/meditatorsanddoctors.pl'). +:-include('../Text-to-Breasonings/meditationnoreplace2.pl'). +:-include('../Text-to-Breasonings/prompt_question1.pl'). +:-include('../Text-to-Breasonings/prompt_meditation1.pl'). + +main:-catch(main2,Err,handle_error(Err)),halt. +handle_error(_Err):- + halt(1). +main :- halt(1). + +main2 :- +((exists_file('../Text-to-Breasonings/b1'), +exists_file('../Text-to-Breasonings/b2'), +exists_file('../Text-to-Breasonings/c1'), +exists_file('../Text-to-Breasonings/c2'))-> +true; +shell1_s("./d-prep.sh")), +shell1_s("./add_to_tt_log.sh"), + +meditators(A),meditators2(B),length(A,AL),length(B,BL),CL is AL+BL, + +meditation1(Utterances1,Utterances2,Immortality), + +length(Utterances1,UL1), +length(Utterances2,UL2), + +length(Immortality,J2), + +% breasonings: + +N1 is 2*108*250 + % meditation + 16000 + 4 + % texttobr2_square + 160 + % 160 combophil br + 15*3*4*16000 + % d.sh + 3*10*4*16000 + % day2.sh + % with medit, tt, medic frozen age, + % hq thought + 250*CL + % a15_meditators.sh prepares an apartment + % in each dimension for meditators by + % helping 15 of those around them + 3*10*(UL1+UL2)*250 + % daily_medit.sh + 3*10*J2*250, + +GM is 250*CL, % group_meditation.sh + +C1 is GM + + 16000, % time_hop.sh +C2 is GM, + + + bc12, + destination(N1,C1,C2,"Going to present"), + destination(N1,C1,C2,"Going to 5689"), + point_to_br(N1), + !. + +bc12 :- +working_directory1(WD,WD), +working_directory1(_,"../Text-to-Breasonings"), +shell1_s("./bc12.sh"), +working_directory1(_,WD),!. + +point_to_br(T) :- + N2 is (T div 16000)+1, + + texttobr2_1(N2),!. 
+ +destination(N1,C1,C2,Label) :- + point_to_br(N1), + prompt_question, + prompt_meditation, + write(Label),prompt_tt, + point_to_br(C1), + bc12, + point_to_br(C2), + prompt_meditation,!. + +prompt_tt :- write(" "), +repeat,writeln("Please enter \"y\":"), +read_string(user_input,"\n\r","\n\r",_,S), +S="y",!. diff --git a/Daily-Regimen-main/big_medit2.sh b/Daily-Regimen-main/big_medit2.sh new file mode 100644 index 0000000000000000000000000000000000000000..42a2c08d6e90274de482fb9fcf6250952a9c5277 --- /dev/null +++ b/Daily-Regimen-main/big_medit2.sh @@ -0,0 +1,4 @@ +#!/bin/bash +#cd ../Daily-Regimen +swipl --stack_limit=40G --goal=main --stand_alone=true -o big_medit2 -c big_medit2.pl +./big_medit2 \ No newline at end of file diff --git a/Daily-Regimen-main/cgpt_combophil10-free.pl b/Daily-Regimen-main/cgpt_combophil10-free.pl new file mode 100644 index 0000000000000000000000000000000000000000..115d143883e6b864eebb1a86675cd0efaf7765a2 --- /dev/null +++ b/Daily-Regimen-main/cgpt_combophil10-free.pl @@ -0,0 +1,177 @@ +%% enter source, alg, pl or lp (with types, open), for subj/ass, date br (may be manually entered) + +%% x one alg to enter alg specs + +%% x one alg to suggest algs, like combophil with filename - upper limit of algs per chapter, records algs per chapters +% - 4 per file, (then 16 per file x) + +%% combophil.pl + +%% Finds combinations of lines of philosophy + +%:- include('../listprologinterpreter/listprolog.pl'). +:- include('chatgpt_qa.pl'). +:- include('../Text-to-Breasonings/text_to_breasonings.pl'). + +:- use_module(library(date)). + +string(String) --> list(String). + +list([]) --> []. +list([L|Ls]) --> [L], list(Ls). + +%% e.g. combophil(2). to write on a combination of philosophies + +% cgpt_combophil(Books,Num). 
+ +cgpt_combophil(Books,Num%,In_String%,Out_List +):- + N1 = 4, %% maximum number of algorithms per file + % find files with n or fewer algorithms + %phrase_from_file_s(string(String00a), "combophil_alg_log.txt"), + %string_codes(String02b,String00a), + %trace, + %atom_to_term(String02b,String02a,[]), + + %directory_files('../Lucian-Academy/Books/',F000), + %delete_invisibles_etc(F000,Books), + + working_directory(A000,A000), + working_directory(_,'../Lucian-Academy/Books/'), + %folders(Folders), + %Folders=["a000"], + %trace, + findall([Dept,G00],(member(Dept,Books), + concat_list([Dept],Dept1), + directory_files(Dept1,F), + delete_invisibles_etc(F,G), + member(G00,G)),G1), + %trace, + delete_all(String02a,G1,G2), + findall([G51,G52,0],(member([G51,G52],G2)),G6), + %trace, + append(String02a,G6,G3), +%trace, +numbers(Num,1,[],Nums), +findall(G4,(member(_,Nums), + combophil_alg_log(N1,G3,G4)),G41), + + working_directory(_,A000), + +get_time(TS),stamp_date_time(TS,date(Year,Month,Day,Hour1,Minute1,Seconda,_A,_TZ,_False),local), + foldr(string_concat,["oldphil-", Year ,"-", Month ,"-", Day ,"-", Hour1 ,"-", Minute1 ,"-", Seconda,".txt"],FN), + term_to_atom(G41,String1), + save_file_s(FN,String1), + +working_directory1(WD,WD), +working_directory1(_,"../Text-to-Breasonings"), +atomic_list_concat(List1,"\n",String1), +atomic_list_concat(List1," ",String2), +N0=1,M=u,(texttobr2(N0,u,String2,M,[auto,on])->true;true), + +working_directory1(_,WD), + % + + %(open_s("combophil_alg_log.txt",write,Stream1), + %write(Stream1,String1), + %close(Stream1)), + !. + +delete_all([],G,G) :- !. +delete_all(G0,G1,G2) :- + G0=[[String1,String2,_]|Strings2], + delete(G1,[String1,String2],G3), + delete_all(Strings2,G3,G2). 
+ /** +%trace, +%SepandPad="#@~%`$?-+*^,()|.:;=_/[]<>{}\n\r\s\t\\!'0123456789", + %trace, + findall(String02b,(member(Filex1,G), + string_concat(Dept1,Filex1,Filex), + phrase_from_file_s(string(String00a), Filex), + string_codes(String02b,String00a)),Texts1), +**/ + % choose a file, algorithm y or n, record if y +combophil_alg_log(_N1,G1,G2) :- + ((member([_Dept00,_Folder00,N20],G1)%, N20 + (findall([Dept01,Folder01,N20],(member([Dept01,Folder01,N20],G1)%,N20true;get_text1(R,Phil412)). +get_text10(R,Phil412) :- + random_member([Dept,Folder,_N2],R), + %N2= list(String). + +list([]) --> []. +list([L|Ls]) --> [L], list(Ls). + +%% e.g. combophil(2). to write on a combination of philosophies + +% cgpt_combophil(Books,Num). + +cgpt_combophil(Books,Num,In_String%,Out_List +):- + N1 = 4, %% maximum number of algorithms per file + % find files with n or fewer algorithms + %phrase_from_file_s(string(String00a), "combophil_alg_log.txt"), + %string_codes(String02b,String00a), + %trace, + %atom_to_term(String02b,String02a,[]), + + %directory_files('../Lucian-Academy/Books/',F000), + %delete_invisibles_etc(F000,Books), + + working_directory(A000,A000), + working_directory(_,'../Lucian-Academy/Books/'), + %folders(Folders), + %Folders=["a000"], + %trace, + findall([Dept,G00],(member(Dept,Books), + concat_list([Dept],Dept1), + directory_files(Dept1,F), + delete_invisibles_etc(F,G), + member(G00,G)),G1), + %trace, + delete_all(String02a,G1,G2), + findall([G51,G52,0],(member([G51,G52],G2)),G6), + %trace, + append(String02a,G6,G3), +%trace, +%numbers(Num,1,[],Nums), +%findall(G4,(member(_,Nums), + combophil_alg_log1(Num,G3,[],G41,In_String),%),G41), + + working_directory(_,A000), + +get_time(TS),stamp_date_time(TS,date(Year,Month,Day,Hour1,Minute1,Seconda,_A,_TZ,_False),local), + foldr(string_concat,["oldphil-", Year ,"-", Month ,"-", Day ,"-", Hour1 ,"-", Minute1 ,"-", Seconda,".txt"],FN), + term_to_atom(G41,String1), + save_file_s(FN,String1), + +working_directory1(WD,WD), 
+working_directory1(_,"../Text-to-Breasonings"), +atomic_list_concat(List1,"\n",String1), +atomic_list_concat(List1," ",String2), +N0=1,M=u,(texttobr2(N0,u,String2,M,[auto,on])->true;true), + +working_directory1(_,WD), + % + + %(open_s("combophil_alg_log.txt",write,Stream1), + %write(Stream1,String1), + %close(Stream1)), + !. + +delete_all([],G,G) :- !. +delete_all(G0,G1,G2) :- + G0=[[String1,String2,_]|Strings2], + delete(G1,[String1,String2],G3), + delete_all(Strings2,G3,G2). + /** +%trace, +%SepandPad="#@~%`$?-+*^,()|.:;=_/[]<>{}\n\r\s\t\\!'0123456789", + %trace, + findall(String02b,(member(Filex1,G), + string_concat(Dept1,Filex1,Filex), + phrase_from_file_s(string(String00a), Filex), + string_codes(String02b,String00a)),Texts1), +**/ + % choose a file, algorithm y or n, record if y + +combophil_alg_log1(0,_G3,G41,G41,_In_String) :- !. +combophil_alg_log1(N1,G3,G40,G41,In_String) :- + (combophil_alg_log(_,G3,G4,In_String)-> + (append(G40,[G4],G42), + N2 is N1-1, + combophil_alg_log1(N2,G3,G42,G41,In_String)); + G40=G41),!. + +combophil_alg_log(_N1,G1,G2,In_String) :- + ((member([_Dept00,_Folder00,N20],G1)%, N20 + (findall([Dept01,Folder01,N20],(member([Dept01,Folder01,N20],G1)%,N20true;get_text1(R,Phil412)). +get_text10(R,Phil412) :- + random_member([Dept,Folder,_N2],R), + %N2= list(String). + +list([]) --> []. +list([L|Ls]) --> [L], list(Ls). + +%% e.g. combophil(2). to write on a combination of philosophies + +% cgpt_combophil(Books,Num). 
+ +cgpt_combophil(Books,Num):- + N1 = 4, %% maximum number of algorithms per file + % find files with n or fewer algorithms + %phrase_from_file_s(string(String00a), "combophil_alg_log.txt"), + %string_codes(String02b,String00a), + %trace, + %atom_to_term(String02b,String02a,[]), + + %directory_files('../Lucian-Academy/Books/',F000), + %delete_invisibles_etc(F000,Books), + + working_directory(A000,A000), + working_directory(_,'../Lucian-Academy/Books/'), + %folders(Folders), + %Folders=["a000"], + %trace, + findall([Dept,G00],(member(Dept,Books), + concat_list([Dept],Dept1), + directory_files(Dept1,F), + delete_invisibles_etc(F,G), + member(G00,G)),G1), + %trace, + delete_all(String02a,G1,G2), + findall([G51,G52,0],(member([G51,G52],G2)),G6), + %trace, + append(String02a,G6,G3), +%trace, +numbers(Num,1,[],Nums), +findall(G4,(member(_,Nums), + combophil_alg_log(N1,G3,G4)),G41), + + working_directory(_,A000), + +get_time(TS),stamp_date_time(TS,date(Year,Month,Day,Hour1,Minute1,Seconda,_A,_TZ,_False),local), + foldr(string_concat,["oldphil-", Year ,"-", Month ,"-", Day ,"-", Hour1 ,"-", Minute1 ,"-", Seconda,".txt"],FN), + term_to_atom(G41,String1), + save_file_s(FN,String1), + +working_directory1(WD,WD), +working_directory1(_,"../Text-to-Breasonings"), +atomic_list_concat(List1,"\n",String1), +atomic_list_concat(List1," ",String2), +N0=1,M=u,(texttobr2(N0,u,String2,M,[auto,on])->true;true), + +working_directory1(_,WD), + % + + %(open_s("combophil_alg_log.txt",write,Stream1), + %write(Stream1,String1), + %close(Stream1)), + !. + +delete_all([],G,G) :- !. +delete_all(G0,G1,G2) :- + G0=[[String1,String2,_]|Strings2], + delete(G1,[String1,String2],G3), + delete_all(Strings2,G3,G2). 
+ /** +%trace, +%SepandPad="#@~%`$?-+*^,()|.:;=_/[]<>{}\n\r\s\t\\!'0123456789", + %trace, + findall(String02b,(member(Filex1,G), + string_concat(Dept1,Filex1,Filex), + phrase_from_file_s(string(String00a), Filex), + string_codes(String02b,String00a)),Texts1), +**/ + % choose a file, algorithm y or n, record if y +combophil_alg_log(_N1,G1,G2) :- + ((member([_Dept00,_Folder00,N20],G1)%, N20 + (findall([Dept01,Folder01,N20],(member([Dept01,Folder01,N20],G1)%,N20true;get_text1(R,Phil412)). +get_text10(R,Phil412) :- + random_member([Dept,Folder,_N2],R), + %N2= + true; + (writeln(["Failed shell1 command: ",Command]),fail) + ),_,fail),!, + + +%?- % to enable json_read_dict/2 +%?- FPath = '/home/xxx/dnns/test/params.json', open(FPath, read, Stream), +atom_string(Output1,Output), +atom_json_term(Output1, A1, []), +A1=json([_,_,_,_,choices=[json([text=A2|_])]|_]), +atom_string(A2,A) +,writeln(A). +%atom_json_term(?Atom, ?JSONTerm, +Options) + diff --git a/Daily-Regimen-main/chatgpt_qa_key.txt b/Daily-Regimen-main/chatgpt_qa_key.txt new file mode 100644 index 0000000000000000000000000000000000000000..e69de29bb2d1d6434b8b29ae775ad8c2e48c5391 diff --git a/Daily-Regimen-main/d-prep.sh b/Daily-Regimen-main/d-prep.sh new file mode 100644 index 0000000000000000000000000000000000000000..3e6221bf9505ab9f9ed8c5e37010b10747e2a706 --- /dev/null +++ b/Daily-Regimen-main/d-prep.sh @@ -0,0 +1,31 @@ +#!/bin/bash + +# One-off +cd ../Text-to-Breasonings/ +swipl --goal=main_t2b4 --stand_alone=true -o t2b4 -c text_to_breasonings4.pl + +# Each week +cd ../Philosophy + +./cat_arg_files.sh +./cat_alg_files.sh + +cd ../Text-to-Breasonings + +cp ../Lucian-Academy/Books1/algs/lgalgs_a.txt file.txt +./t2b4 +rm a.pl +mv a a1 +rm b.pl +mv b b1 +rm c.pl +mv c c1 + +cp ../Lucian-Academy/Books1/args/lgtext_a.txt file.txt +./t2b4 +rm a.pl +mv a a2 +rm b.pl +mv b b2 +rm c.pl +mv c c2 diff --git a/Daily-Regimen-main/d.sh b/Daily-Regimen-main/d.sh new file mode 100644 index 
0000000000000000000000000000000000000000..c5d5856b51077d1004d92bd5523f010ffefeecfb --- /dev/null +++ b/Daily-Regimen-main/d.sh @@ -0,0 +1,5 @@ +#!/bin/bash +cd ../Text-to-Breasonings/ + +# Or, for 15 people +./d.sh diff --git a/Daily-Regimen-main/daily_medit.sh b/Daily-Regimen-main/daily_medit.sh new file mode 100644 index 0000000000000000000000000000000000000000..ab01fa1f0c6ecb18afc47c05e48b46dc9eb375c4 --- /dev/null +++ b/Daily-Regimen-main/daily_medit.sh @@ -0,0 +1,4 @@ +#!/bin/bash +cd ../Text-to-Breasonings +swipl --stack-limit=40G --goal=main --stand_alone=true -o daily_medit -c daily_medit.pl +./daily_medit \ No newline at end of file diff --git a/Daily-Regimen-main/day2.sh b/Daily-Regimen-main/day2.sh new file mode 100644 index 0000000000000000000000000000000000000000..19dd1647ec7a1c5bbe05c66f46290adc9aebb3ee --- /dev/null +++ b/Daily-Regimen-main/day2.sh @@ -0,0 +1,4 @@ +#!/bin/bash +cd ../Philosophy/ +swipl --goal=main --stand_alone=true -o day2 -c day2.pl +./day2 \ No newline at end of file diff --git a/Daily-Regimen-main/file-meditations.txt b/Daily-Regimen-main/file-meditations.txt new file mode 100644 index 0000000000000000000000000000000000000000..3150f24fc84d9d56b4f553d23757be3ae45be5ec --- /dev/null +++ b/Daily-Regimen-main/file-meditations.txt @@ -0,0 +1,4362 @@ +[algorithm,32] +[[[n,function0],[[v,a],[v,b],[v,c]],:-,[[[n,function1],[[v,a],[v,b],[v,d]]],[[n,=],[[v,c],[v,d]]]]],[[n,function1],[[v,a],[v,b],[v,c]],:-,[[[n,+],[[v,a],[v,b],[v,c]]]]]] + + Call: (14) interpret(off, [[n, function0], [947, 918, [v, c]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], _2686) + Unify: (14) interpret(off, [[n, function0], [947, 918, [v, c]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], _2686) + Call: (15) 
convert_to_grammar_part1([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [], _2684, _2686) + Unify: (15) convert_to_grammar_part1([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [], _2684, _2686) + Call: (16) convert_to_grammar_part11([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [], _2684, [], _2688, [], _2692, [], _2696) + Unify: (16) convert_to_grammar_part11([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [], _2684, [], _2688, [], _2692, [], _2696) + Call: (17) [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]]=[_2666|_2668] + Exit: (17) [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]]=[[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]] + Call: (17) [[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n, function1], [[...|...]|...]], [[n|...], [...|...]]]]=[[n, _2684], _2690, "->", _2708] + Fail: (17) [[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n, function1], [[...|...]|...]], [[n|...], [...|...]]]]=[[n, _2684], _2690, "->", _2708] + Redo: (16) convert_to_grammar_part11([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [], _2690, [], _2694, [], _2698, [], _2702) 
+ Call: (17) [[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n, function1], [[...|...]|...]], [[n|...], [...|...]]]]=[[n, _2684], "->", _2702] + Fail: (17) [[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n, function1], [[...|...]|...]], [[n|...], [...|...]]]]=[[n, _2684], "->", _2702] + Redo: (16) convert_to_grammar_part11([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [], _2684, [], _2688, [], _2692, [], _2696) + Unify: (16) convert_to_grammar_part11([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [], _2684, [], _2688, [], _2692, [], _2696) + Call: (17) [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]]=[_2666|_2668] + Exit: (17) [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]]=[[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]] + Call: (17) [[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n, function1], [[...|...]|...]], [[n|...], [...|...]]]]=[_2672, _2678, ":-", _2696] + Exit: (17) [[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n, function1], [[...|...]|...]], [[n|...], [...|...]]]]=[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n, function1], [[...|...]|...]], [[n|...], [...|...]]]] + Call: (17) true + Exit: (17) true + Call: (17) true + Exit: (17) true + Call: (17) true + Exit: (17) true + Call: (17) true + Exit: (17) true + Call: (17) lists:append([], [[[n, function0], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...], [...|...]]]], _2726) + Unify: (17) lists:append([], [[[n, 
function0], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...], [...|...]]]], [[[n, function0], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...], [...|...]]]]) + Exit: (17) lists:append([], [[[n, function0], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...], [...|...]]]], [[[n, function0], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...], [...|...]]]]) + Call: (17) lists:append([], [[[[n, function0], [[v, a], [v|...], [...|...]], ":-", [[...|...]|...]], [[n, function0], [[v|...], [...|...]|...], ":-", [...|...]]]], _2744) + Unify: (17) lists:append([], [[[[n, function0], [[v, a], [v|...], [...|...]], ":-", [[...|...]|...]], [[n, function0], [[v|...], [...|...]|...], ":-", [...|...]]]], [[[[n, function0], [[v, a], [v|...], [...|...]], ":-", [[...|...]|...]], [[n, function0], [[v|...], [...|...]|...], ":-", [...|...]]]]) + Exit: (17) lists:append([], [[[[n, function0], [[v, a], [v|...], [...|...]], ":-", [[...|...]|...]], [[n, function0], [[v|...], [...|...]|...], ":-", [...|...]]]], [[[[n, function0], [[v, a], [v|...], [...|...]], ":-", [[...|...]|...]], [[n, function0], [[v|...], [...|...]|...], ":-", [...|...]]]]) + Call: (17) convert_to_grammar_part11([[[n, function1], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]]], _2744, [], _2748, [[[[n, function0], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...], [...|...]]], [[n, function0], [[v, a], [v|...], [...|...]], ":-", [[...|...]|...]]]], _2752, [], _2756) + Unify: (17) convert_to_grammar_part11([[[n, function1], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]]], _2744, [], _2748, [[[[n, function0], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...], [...|...]]], [[n, function0], [[v, a], [v|...], [...|...]], ":-", [[...|...]|...]]]], _2752, [], _2756) + Call: (18) [[[n, function1], [[v, a], [v, b], [v, c]], ":-", 
[[[n|...], [...|...]]]]]=[_2726|_2728] + Exit: (18) [[[n, function1], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]]]]]=[[[n, function1], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]]]]] + Call: (18) [[n, function1], [[v, a], [v, b], [v, c]], ":-", [[[n, +], [[...|...]|...]]]]=[[n, _2744], _2750, "->", _2768] + Fail: (18) [[n, function1], [[v, a], [v, b], [v, c]], ":-", [[[n, +], [[...|...]|...]]]]=[[n, _2744], _2750, "->", _2768] + Redo: (17) convert_to_grammar_part11([[[n, function1], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]]], _2750, [], _2754, [[[[n, function0], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...], [...|...]]], [[n, function0], [[v, a], [v|...], [...|...]], ":-", [[...|...]|...]]]], _2758, [], _2762) + Call: (18) [[n, function1], [[v, a], [v, b], [v, c]], ":-", [[[n, +], [[...|...]|...]]]]=[[n, _2744], "->", _2762] + Fail: (18) [[n, function1], [[v, a], [v, b], [v, c]], ":-", [[[n, +], [[...|...]|...]]]]=[[n, _2744], "->", _2762] + Redo: (17) convert_to_grammar_part11([[[n, function1], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]]], _2744, [], _2748, [[[[n, function0], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...], [...|...]]], [[n, function0], [[v, a], [v|...], [...|...]], ":-", [[...|...]|...]]]], _2752, [], _2756) + Unify: (17) convert_to_grammar_part11([[[n, function1], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]]], _2744, [], _2748, [[[[n, function0], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...], [...|...]]], [[n, function0], [[v, a], [v|...], [...|...]], ":-", [[...|...]|...]]]], _2752, [], _2756) + Call: (18) [[[n, function1], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]]]]]=[_2726|_2728] + Exit: (18) 
[[[n, function1], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]]]]]=[[[n, function1], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]]]]] + Call: (18) [[n, function1], [[v, a], [v, b], [v, c]], ":-", [[[n, +], [[...|...]|...]]]]=[_2732, _2738, ":-", _2756] + Exit: (18) [[n, function1], [[v, a], [v, b], [v, c]], ":-", [[[n, +], [[...|...]|...]]]]=[[n, function1], [[v, a], [v, b], [v, c]], ":-", [[[n, +], [[...|...]|...]]]] + Call: (18) true + Exit: (18) true + Call: (18) true + Exit: (18) true + Call: (18) true + Exit: (18) true + Call: (18) true + Exit: (18) true + Call: (18) lists:append([[[n, function0], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...], [...|...]]]], [[[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], _2786) + Unify: (18) lists:append([[[n, function0], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...], [...|...]]]], [[[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...], [...|...]]]|_2770]) + Exit: (18) lists:append([[[n, function0], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...], [...|...]]]], [[[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...], [...|...]]], [[n, function1], [[v, a], [v|...], [...|...]], ":-", [[...|...]]]]) + Call: (18) lists:append([[[[n, function0], [[v, a], [v|...], [...|...]], ":-", [[...|...]|...]], [[n, function0], [[v|...], [...|...]|...], ":-", [...|...]]]], [[[[n, function1], [[v, a], [v|...], [...|...]], ":-", [[...|...]]], [[n, function1], [[v|...], [...|...]|...], ":-", [...]]]], _2810) + Unify: (18) lists:append([[[[n, function0], [[v, a], [v|...], [...|...]], ":-", [[...|...]|...]], [[n, function0], [[v|...], [...|...]|...], ":-", [...|...]]]], [[[[n, function1], [[v, a], [v|...], [...|...]], ":-", [[...|...]]], [[n, function1], [[v|...], [...|...]|...], ":-", [...]]]], [[[[n, function0], [[v, a], [v|...], 
[...|...]], ":-", [[...|...]|...]], [[n, function0], [[v|...], [...|...]|...], ":-", [...|...]]]|_2794]) + Exit: (18) lists:append([[[[n, function0], [[v, a], [v|...], [...|...]], ":-", [[...|...]|...]], [[n, function0], [[v|...], [...|...]|...], ":-", [...|...]]]], [[[[n, function1], [[v, a], [v|...], [...|...]], ":-", [[...|...]]], [[n, function1], [[v|...], [...|...]|...], ":-", [...]]]], [[[[n, function0], [[v, a], [v|...], [...|...]], ":-", [[...|...]|...]], [[n, function0], [[v|...], [...|...]|...], ":-", [...|...]]], [[[n, function1], [[v|...], [...|...]|...], ":-", [...]], [[n, function1], [[...|...]|...], ":-"|...]]]) + Call: (18) convert_to_grammar_part11([], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], _2816, [], _2820, [[[[n, function0], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...], [...|...]]], [[n, function0], [[v, a], [v|...], [...|...]], ":-", [[...|...]|...]]], [[[n, function1], [[v, a], [v|...], [...|...]], ":-", [[...|...]]], [[n, function1], [[v|...], [...|...]|...], ":-", [...]]]], _2824, [], _2828) + Unify: (18) convert_to_grammar_part11([], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [], [], [[[[n, function0], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...], [...|...]]], [[n, function0], [[v, a], [v|...], [...|...]], ":-", [[...|...]|...]]], [[[n, function1], [[v, a], [v|...], [...|...]], ":-", [[...|...]]], [[n, function1], [[v|...], [...|...]|...], ":-", [...]]]], [[[[n, function0], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...], [...|...]]], [[n, function0], [[v, a], [v|...], [...|...]], ":-", [[...|...]|...]]], [[[n, function1], [[v, 
a], [v|...], [...|...]], ":-", [[...|...]]], [[n, function1], [[v|...], [...|...]|...], ":-", [...]]]], [], [])
+ Exit: (18) convert_to_grammar_part11([], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [], [], [[[[n, function0], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...], [...|...]]], [[n, function0], [[v, a], [v|...], [...|...]], ":-", [[...|...]|...]]], [[[n, function1], [[v, a], [v|...], [...|...]], ":-", [[...|...]]], [[n, function1], [[v|...], [...|...]|...], ":-", [...]]]], [[[[n, function0], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...], [...|...]]], [[n, function0], [[v, a], [v|...], [...|...]], ":-", [[...|...]|...]]], [[[n, function1], [[v, a], [v|...], [...|...]], ":-", [[...|...]]], [[n, function1], [[v|...], [...|...]|...], ":-", [...]]]], [], [])
+ Exit: (17) convert_to_grammar_part11([[[n, function1], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [], [], [[[[n, function0], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...], [...|...]]], [[n, function0], [[v, a], [v|...], [...|...]], ":-", [[...|...]|...]]]], [[[[n, function0], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...], [...|...]]], [[n, function0], [[v, a], [v|...], [...|...]], ":-", [[...|...]|...]]], [[[n, function1], [[v, a], [v|...], [...|...]], ":-", [[...|...]]], [[n, function1], [[v|...], [...|...]|...], ":-", [...]]]], [], [])
+ Exit: (16) convert_to_grammar_part11([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [], [], [], [[[[n, function0], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...], [...|...]]], [[n, function0], [[v, a], [v|...], [...|...]], ":-", [[...|...]|...]]], [[[n, function1], [[v, a], [v|...], [...|...]], ":-", [[...|...]]], [[n, function1], [[v|...], [...|...]|...], ":-", [...]]]], [], [])
+ Call: (16) _2812=[[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]]
+ Exit: (16) [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]]=[[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]]
+ Call: (16) _2812=[[[[n, function0], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...], [...|...]]], [[n, function0], [[v, a], [v|...], [...|...]], ":-", [[...|...]|...]]], [[[n, function1], [[v, a], [v|...], [...|...]], ":-", [[...|...]]], [[n, function1], [[v|...], [...|...]|...], ":-", [...]]]]
+ Exit: (16) [[[[n, function0], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...], [...|...]]], [[n, function0], [[v, a], [v|...], [...|...]], ":-", [[...|...]|...]]], [[[n, function1], [[v, a], [v|...], [...|...]], ":-", [[...|...]]], [[n, function1], [[v|...], [...|...]|...], ":-", [...]]]]=[[[[n, function0], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...], [...|...]]], [[n, function0], [[v, a], [v|...], [...|...]], ":-", [[...|...]|...]]], [[[n, function1], [[v, a], [v|...], [...|...]], ":-", [[...|...]]], [[n, function1], [[v|...], [...|...]|...], ":-", [...]]]]
+ Exit: (15) convert_to_grammar_part1([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[[n, function0], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...], [...|...]]], [[n, function0], [[v, a], [v|...], [...|...]], ":-", [[...|...]|...]]], [[[n, function1], [[v, a], [v|...], [...|...]], ":-", [[...|...]]], [[n, function1], [[v|...], [...|...]|...], ":-", [...]]]])
+ Call: (15) interpret1(off, [[n, function0], [947, 918, [v, c]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], _2820)
+ Unify: (15) interpret1(off, [[n, function0], [947, 918, [v, c]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], _2820)
+^ Call: (16) retractall(debug(_2798))
+^ Exit: (16) retractall(debug(_2798))
+^ Call: (16) assertz(debug(off))
+^ Exit: (16) assertz(debug(off))
+^ Call: (16) retractall(cut(_2806))
+^ Exit: (16) retractall(cut(_2806))
+^ Call: (16) assertz(cut(off))
+^ Exit: (16) assertz(cut(off))
+^ Call: (16) retractall(leash1(_2814))
+^ Exit: (16) retractall(leash1(_2814))
+^ Call: (16) assertz(leash1(off))
+^ Exit: (16) assertz(leash1(off))
+ Call: (16) member1([[n, function0], [947, 918, [v, c]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], _2842)
+ Unify: (16) member1([[n, function0], [947, 918, [v, c]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], _2842)
+ Call: (17) cut(off)
+ Unify: (17) cut(off)
+ Exit: (17) cut(off)
+ Call: (17) [[n, function0], [947, 918, [v, c]]]=[_2822, _2828]
+ Exit: (17) [[n, function0], [947, 918, [v, c]]]=[[n, function0], [947, 918, [v, c]]]
+ Call: (17) [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]]=[[[n, function0], _2846, ":-", _2864]|_2836]
+ Exit: (17) [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]]=[[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]]
+ Call: (17) length([947, 918, [v, c]], _2886)
+ Unify: (17) length([947, 918, [v, c]], _2886)
+ Exit: (17) length([947, 918, [v, c]], 3)
+ Call: (17) length([[v, a], [v, b], [v, c]], 3)
+ Unify: (17) length([[v, a], [v, b], [v, c]], 3)
+ Unify: (17) length([[v, a], [v, b], [v, c]], 3)
+ Exit: (17) length([[v, a], [v, b], [v, c]], 3)
+ Call: (17) checkarguments([947, 918, [v, c]], [[v, a], [v, b], [v, c]], [], _2890, [], _2894)
+ Unify: (17) checkarguments([947, 918, [v, c]], [[v, a], [v, b], [v, c]], [], _2890, [], _2894)
+ Call: (18) [947, 918, [v, c]]=[_2870|_2872]
+ Exit: (18) [947, 918, [v, c]]=[947, 918, [v, c]]
+ Call: (18) expressionnotatom3(947)
+ Unify: (18) expressionnotatom3(947)
+ Call: (19) expressionnotatom(947)
+ Unify: (19) expressionnotatom(947)
+ Call: (20) isvalstrempty(947)
+ Unify: (20) isvalstrempty(947)
+ Call: (21) var(947)
+ Fail: (21) var(947)
+ Redo: (20) isvalstrempty(947)
+ Unify: (20) isvalstrempty(947)
+ Call: (21) isval(947)
+ Unify: (21) isval(947)
+ Exit: (21) isval(947)
+ Exit: (20) isvalstrempty(947)
+ Exit: (19) expressionnotatom(947)
+ Exit: (18) expressionnotatom3(947)
+ Call: (18) [[v, a], [v, b], [v, c]]=[_2876|_2878]
+ Exit: (18) [[v, a], [v, b], [v, c]]=[[v, a], [v, b], [v, c]]
+^ Call: (18) not(var([v, a]))
+^ Unify: (18) not(user:var([v, a]))
+^ Exit: (18) not(user:var([v, a]))
+ Call: (18) isvar([v, a])
+ Unify: (18) isvar([v, a])
+ Exit: (18) isvar([v, a])
+ Call: (18) putvalue([v, a], 947, [], _2912)
+ Unify: (18) putvalue([v, a], 947, [], _2912)
+^ Call: (19) not(isvar([v, a]))
+^ Unify: (19) not(user:isvar([v, a]))
+ Call: (20) isvar([v, a])
+ Unify: (20) isvar([v, a])
+ Exit: (20) isvar([v, a])
+^ Fail: (19) not(user:isvar([v, a]))
+ Redo: (18) putvalue([v, a], 947, [], _2912)
+ Call: (19) isvar([v, a])
+ Unify: (19) isvar([v, a])
+ Exit: (19) isvar([v, a])
+ Call: (19) isvalstrorundef(947)
+ Unify: (19) isvalstrorundef(947)
+ Call: (20) var(947)
+ Fail: (20) var(947)
+ Redo: (19) isvalstrorundef(947)
+ Unify: (19) isvalstrorundef(947)
+^ Call: (20) not(var(947))
+^ Unify: (20) not(user:var(947))
+^ Exit: (20) not(user:var(947))
+ Call: (20) isval(947)
+ Unify: (20) isval(947)
+ Exit: (20) isval(947)
+ Exit: (19) isvalstrorundef(947)
+ Call: (19) updatevar([v, a], 947, [], _2922)
+ Unify: (19) updatevar([v, a], 947, [], _2922)
+ Call: (20) lists:member([[v, a], empty], [])
+ Fail: (20) lists:member([[v, a], empty], [])
+ Redo: (19) updatevar([v, a], 947, [], _2922)
+^ Call: (20) not(member([[v, a], _2914], []))
+^ Unify: (20) not(user:member([[v, a], _2914], []))
+^ Exit: (20) not(user:member([[v, a], _2914], []))
+ Call: (20) _2914=empty
+ Exit: (20) empty=empty
+ Call: (20) true
+ Exit: (20) true
+ Call: (20) lists:append([], [[[v, a], 947]], _2962)
+ Unify: (20) lists:append([], [[[v, a], 947]], [[[v, a], 947]])
+ Exit: (20) lists:append([], [[[v, a], 947]], [[[v, a], 947]])
+ Call: (20) true
+ Exit: (20) true
+ Call: (20) true
+ Exit: (20) true
+ Exit: (19) updatevar([v, a], 947, [], [[[v, a], 947]])
+ Exit: (18) putvalue([v, a], 947, [], [[[v, a], 947]])
+ Call: (18) checkarguments([918, [v, c]], [[v, b], [v, c]], [[[v, a], 947]], _2964, [], _2968)
+ Unify: (18) checkarguments([918, [v, c]], [[v, b], [v, c]], [[[v, a], 947]], _2964, [], _2968)
+ Call: (19) [918, [v, c]]=[_2944|_2946]
+ Exit: (19) [918, [v, c]]=[918, [v, c]]
+ Call: (19) expressionnotatom3(918)
+ Unify: (19) expressionnotatom3(918)
+ Call: (20) expressionnotatom(918)
+ Unify: (20) expressionnotatom(918)
+ Call: (21) isvalstrempty(918)
+ Unify: (21) isvalstrempty(918)
+ Call: (22) var(918)
+ Fail: (22) var(918)
+ Redo: (21) isvalstrempty(918)
+ Unify: (21) isvalstrempty(918)
+ Call: (22) isval(918)
+ Unify: (22) isval(918)
+ Exit: (22) isval(918)
+ Exit: (21) isvalstrempty(918)
+ Exit: (20) expressionnotatom(918)
+ Exit: (19) expressionnotatom3(918)
+ Call: (19) [[v, b], [v, c]]=[_2950|_2952]
+ Exit: (19) [[v, b], [v, c]]=[[v, b], [v, c]]
+^ Call: (19) not(var([v, b]))
+^ Unify: (19) not(user:var([v, b]))
+^ Exit: (19) not(user:var([v, b]))
+ Call: (19) isvar([v, b])
+ Unify: (19) isvar([v, b])
+ Exit: (19) isvar([v, b])
+ Call: (19) putvalue([v, b], 918, [[[v, a], 947]], _2986)
+ Unify: (19) putvalue([v, b], 918, [[[v, a], 947]], _2986)
+^ Call: (20) not(isvar([v, b]))
+^ Unify: (20) not(user:isvar([v, b]))
+ Call: (21) isvar([v, b])
+ Unify: (21) isvar([v, b])
+ Exit: (21) isvar([v, b])
+^ Fail: (20) not(user:isvar([v, b]))
+ Redo: (19) putvalue([v, b], 918, [[[v, a], 947]], _2986)
+ Call: (20) isvar([v, b])
+ Unify: (20) isvar([v, b])
+ Exit: (20) isvar([v, b])
+ Call: (20) isvalstrorundef(918)
+ Unify: (20) isvalstrorundef(918)
+ Call: (21) var(918)
+ Fail: (21) var(918)
+ Redo: (20) isvalstrorundef(918)
+ Unify: (20) isvalstrorundef(918)
+^ Call: (21) not(var(918))
+^ Unify: (21) not(user:var(918))
+^ Exit: (21) not(user:var(918))
+ Call: (21) isval(918)
+ Unify: (21) isval(918)
+ Exit: (21) isval(918)
+ Exit: (20) isvalstrorundef(918)
+ Call: (20) updatevar([v, b], 918, [[[v, a], 947]], _2996)
+ Unify: (20) updatevar([v, b], 918, [[[v, a], 947]], _2996)
+ Call: (21) lists:member([[v, b], empty], [[[v, a], 947]])
+ Unify: (21) lists:member([[v, b], empty], [[[v, a], 947]])
+ Fail: (21) lists:member([[v, b], empty], [[[v, a], 947]])
+ Redo: (20) updatevar([v, b], 918, [[[v, a], 947]], _2996)
+^ Call: (21) not(member([[v, b], _2988], [[[v, a], 947]]))
+^ Unify: (21) not(user:member([[v, b], _2988], [[[v, a], 947]]))
+^ Exit: (21) not(user:member([[v, b], _2988], [[[v, a], 947]]))
+ Call: (21) _2988=empty
+ Exit: (21) empty=empty
+ Call: (21) true
+ Exit: (21) true
+ Call: (21) lists:append([[[v, a], 947]], [[[v, b], 918]], _3036)
+ Unify: (21) lists:append([[[v, a], 947]], [[[v, b], 918]], [[[v, a], 947]|_3020])
+ Exit: (21) lists:append([[[v, a], 947]], [[[v, b], 918]], [[[v, a], 947], [[v, b], 918]])
+ Call: (21) true
+ Exit: (21) true
+ Call: (21) true
+ Exit: (21) true
+ Exit: (20) updatevar([v, b], 918, [[[v, a], 947]], [[[v, a], 947], [[v, b], 918]])
+ Exit: (19) putvalue([v, b], 918, [[[v, a], 947]], [[[v, a], 947], [[v, b], 918]])
+ Call: (19) checkarguments([[v, c]], [[v, c]], [[[v, a], 947], [[v, b], 918]], _3044, [], _3048)
+ Unify: (19) checkarguments([[v, c]], [[v, c]], [[[v, a], 947], [[v, b], 918]], _3044, [], _3048)
+ Call: (20) [[v, c]]=[_3024|_3026]
+ Exit: (20) [[v, c]]=[[v, c]]
+ Call: (20) expressionnotatom3([v, c])
+ Unify: (20) expressionnotatom3([v, c])
+ Call: (21) expressionnotatom([v, c])
+ Unify: (21) expressionnotatom([v, c])
+ Call: (22) isvalstrempty([v, c])
+ Unify: (22) isvalstrempty([v, c])
+ Call: (23) var([v, c])
+ Fail: (23) var([v, c])
+ Redo: (22) isvalstrempty([v, c])
+ Unify: (22) isvalstrempty([v, c])
+ Call: (23) isval([v, c])
+ Unify: (23) isval([v, c])
+ Fail: (23) isval([v, c])
+ Redo: (22) isvalstrempty([v, c])
+ Unify: (22) isvalstrempty([v, c])
+ Fail: (22) isvalstrempty([v, c])
+ Redo: (21) expressionnotatom([v, c])
+ Unify: (21) expressionnotatom([v, c])
+^ Call: (22) not(atom([v, c]))
+^ Unify: (22) not(user:atom([v, c]))
+^ Exit: (22) not(user:atom([v, c]))
+ Call: (22) length([v, c], _3056)
+ Unify: (22) length([v, c], _3056)
+ Exit: (22) length([v, c], 2)
+ Call: (22) 2>=1
+ Exit: (22) 2>=1
+ Call: (22) expressionnotatom2([v, c])
+ Unify: (22) expressionnotatom2([v, c])
+ Call: (23) isvalstrempty(v)
+ Unify: (23) isvalstrempty(v)
+ Call: (24) var(v)
+ Fail: (24) var(v)
+ Redo: (23) isvalstrempty(v)
+ Unify: (23) isvalstrempty(v)
+ Call: (24) isval(v)
+ Unify: (24) isval(v)
+ Fail: (24) isval(v)
+ Redo: (23) isvalstrempty(v)
+ Unify: (23) isvalstrempty(v)
+ Fail: (23) isvalstrempty(v)
+ Fail: (22) expressionnotatom2([v, c])
+ Redo: (21) expressionnotatom([v, c])
+ Unify: (21) expressionnotatom([v, c])
+ Call: (22) predicate_or_rule_name([v, c])
+ Fail: (22) predicate_or_rule_name([v, c])
+ Fail: (21) expressionnotatom([v, c])
+ Redo: (20) expressionnotatom3([v, c])
+ Unify: (20) expressionnotatom3([v, c])
+^ Call: (21) not([v, c]=[v, _3042])
+^ Unify: (21) not(user:([v, c]=[v, _3042]))
+^ Fail: (21) not(user:([v, c]=[v, _3042]))
+ Fail: (20) expressionnotatom3([v, c])
+ Redo: (19) checkarguments([[v, c]], [[v, c]], [[[v, a], 947], [[v, b], 918]], _3044, [], _3048)
+ Unify: (19) checkarguments([[v, c]], [[v, c]], [[[v, a], 947], [[v, b], 918]], _3044, [], _3048)
+ Call: (20) [[v, c]]=[_3024|_3026]
+ Exit: (20) [[v, c]]=[[v, c]]
+^ Call: (20) not(var([v, c]))
+^ Unify: (20) not(user:var([v, c]))
+^ Exit: (20) not(user:var([v, c]))
+ Call: (20) isvar([v, c])
+ Unify: (20) isvar([v, c])
+ Exit: (20) isvar([v, c])
+ Call: (20) [[v, c]]=[_3040|_3042]
+ Exit: (20) [[v, c]]=[[v, c]]
+ Call: (20) expressionnotatom3([v, c])
+ Unify: (20) expressionnotatom3([v, c])
+ Call: (21) expressionnotatom([v, c])
+ Unify: (21) expressionnotatom([v, c])
+ Call: (22) isvalstrempty([v, c])
+ Unify: (22) isvalstrempty([v, c])
+ Call: (23) var([v, c])
+ Fail: (23) var([v, c])
+ Redo: (22) isvalstrempty([v, c])
+ Unify: (22) isvalstrempty([v, c])
+ Call: (23) isval([v, c])
+ Unify: (23) isval([v, c])
+ Fail: (23) isval([v, c])
+ Redo: (22) isvalstrempty([v, c])
+ Unify: (22) isvalstrempty([v, c])
+ Fail: (22) isvalstrempty([v, c])
+ Redo: (21) expressionnotatom([v, c])
+ Unify: (21) expressionnotatom([v, c])
+^ Call: (22) not(atom([v, c]))
+^ Unify: (22) not(user:atom([v, c]))
+^ Exit: (22) not(user:atom([v, c]))
+ Call: (22) length([v, c], _3072)
+ Unify: (22) length([v, c], _3072)
+ Exit: (22) length([v, c], 2)
+ Call: (22) 2>=1
+ Exit: (22) 2>=1
+ Call: (22) expressionnotatom2([v, c])
+ Unify: (22) expressionnotatom2([v, c])
+ Call: (23) isvalstrempty(v)
+ Unify: (23) isvalstrempty(v)
+ Call: (24) var(v)
+ Fail: (24) var(v)
+ Redo: (23) isvalstrempty(v)
+ Unify: (23) isvalstrempty(v)
+ Call: (24) isval(v)
+ Unify: (24) isval(v)
+ Fail: (24) isval(v)
+ Redo: (23) isvalstrempty(v)
+ Unify: (23) isvalstrempty(v)
+ Fail: (23) isvalstrempty(v)
+ Fail: (22) expressionnotatom2([v, c])
+ Redo: (21) expressionnotatom([v, c])
+ Unify: (21) expressionnotatom([v, c])
+ Call: (22) predicate_or_rule_name([v, c])
+ Fail: (22) predicate_or_rule_name([v, c])
+ Fail: (21) expressionnotatom([v, c])
+ Redo: (20) expressionnotatom3([v, c])
+ Unify: (20) expressionnotatom3([v, c])
+^ Call: (21) not([v, c]=[v, _3058])
+^ Unify: (21) not(user:([v, c]=[v, _3058]))
+^ Fail: (21) not(user:([v, c]=[v, _3058]))
+ Fail: (20) expressionnotatom3([v, c])
+ Redo: (19) checkarguments([[v, c]], [[v, c]], [[[v, a], 947], [[v, b], 918]], _3044, [], _3048)
+ Unify: (19) checkarguments([[v, c]], [[v, c]], [[[v, a], 947], [[v, b], 918]], _3044, [], _3048)
+ Call: (20) [[v, c]]=[_3024|_3026]
+ Exit: (20) [[v, c]]=[[v, c]]
+^ Call: (20) not(var([v, c]))
+^ Unify: (20) not(user:var([v, c]))
+^ Exit: (20) not(user:var([v, c]))
+ Call: (20) isvar([v, c])
+ Unify: (20) isvar([v, c])
+ Exit: (20) isvar([v, c])
+ Call: (20) [[v, c]]=[_3040|_3042]
+ Exit: (20) [[v, c]]=[[v, c]]
+^ Call: (20) not(var([v, c]))
+^ Unify: (20) not(user:var([v, c]))
+^ Exit: (20) not(user:var([v, c]))
+ Call: (20) isvar([v, c])
+ Unify: (20) isvar([v, c])
+ Exit: (20) isvar([v, c])
+ Call: (20) getvalue([v, c], _3072, [[[v, a], 947], [[v, b], 918]])
+ Unify: (20) getvalue([v, c], _3072, [[[v, a], 947], [[v, b], 918]])
+^ Call: (21) not(isvar([v, c]))
+^ Unify: (21) not(user:isvar([v, c]))
+ Call: (22) isvar([v, c])
+ Unify: (22) isvar([v, c])
+ Exit: (22) isvar([v, c])
+^ Fail: (21) not(user:isvar([v, c]))
+ Redo: (20) getvalue([v, c], _3072, [[[v, a], 947], [[v, b], 918]])
+ Call: (21) isvar([v, c])
+ Unify: (21) isvar([v, c])
+ Exit: (21) isvar([v, c])
+ Call: (21) isvalstrorundef(_3070)
+ Unify: (21) isvalstrorundef(_3070)
+ Call: (22) var(_3070)
+ Exit: (22) var(_3070)
+ Exit: (21) isvalstrorundef(_3070)
+ Call: (21) getvar([v, c], _3072, [[[v, a], 947], [[v, b], 918]])
+ Unify: (21) getvar([v, c], _3072, [[[v, a], 947], [[v, b], 918]])
+ Call: (22) lists:member([[v, c], _3062], [[[v, a], 947], [[v, b], 918]])
+ Unify: (22) lists:member([[v, c], _3062], [[[v, a], 947], [[v, b], 918]])
+ Fail: (22) lists:member([[v, c], _3062], [[[v, a], 947], [[v, b], 918]])
+ Redo: (21) getvar([v, c], _3072, [[[v, a], 947], [[v, b], 918]])
+ Unify: (21) getvar([v, c], empty, [[[v, a], 947], [[v, b], 918]])
+^ Call: (22) not(member([[v, c], _3068], [[[v, a], 947], [[v, b], 918]]))
+^ Unify: (22) not(user:member([[v, c], _3068], [[[v, a], 947], [[v, b], 918]]))
+^ Exit: (22) not(user:member([[v, c], _3068], [[[v, a], 947], [[v, b], 918]]))
+ Call: (22) true
+ Exit: (22) true
+ Exit: (21) getvar([v, c], empty, [[[v, a], 947], [[v, b], 918]])
+ Exit: (20) getvalue([v, c], empty, [[[v, a], 947], [[v, b], 918]])
+ Call: (20) true
+ Exit: (20) true
+ Call: (20) putvalue([v, c], empty, [[[v, a], 947], [[v, b], 918]], _3100)
+ Unify: (20) putvalue([v, c], empty, [[[v, a], 947], [[v, b], 918]], _3100)
+^ Call: (21) not(isvar([v, c]))
+^ Unify: (21) not(user:isvar([v, c]))
+ Call: (22) isvar([v, c])
+ Unify: (22) isvar([v, c])
+ Exit: (22) isvar([v, c])
+^ Fail: (21) not(user:isvar([v, c]))
+ Redo: (20) putvalue([v, c], empty, [[[v, a], 947], [[v, b], 918]], _3100)
+ Call: (21) isvar([v, c])
+ Unify: (21) isvar([v, c])
+ Exit: (21) isvar([v, c])
+ Call: (21) isvalstrorundef(empty)
+ Unify: (21) isvalstrorundef(empty)
+ Call: (22) var(empty)
+ Fail: (22) var(empty)
+ Redo: (21) isvalstrorundef(empty)
+ Unify: (21) isvalstrorundef(empty)
+^ Call: (22) not(var(empty))
+^ Unify: (22) not(user:var(empty))
+^ Exit: (22) not(user:var(empty))
+ Call: (22) isval(empty)
+ Unify: (22) isval(empty)
+ Fail: (22) isval(empty)
+ Redo: (21) isvalstrorundef(empty)
+ Unify: (21) isvalstrorundef(empty)
+^ Call: (22) not(var(empty))
+^ Unify: (22) not(user:var(empty))
+^ Exit: (22) not(user:var(empty))
+ Call: (22) expression(empty)
+ Unify: (22) expression(empty)
+ Exit: (22) expression(empty)
+ Exit: (21) isvalstrorundef(empty)
+ Call: (21) updatevar([v, c], empty, [[[v, a], 947], [[v, b], 918]], _3110)
+ Unify: (21) updatevar([v, c], empty, [[[v, a], 947], [[v, b], 918]], _3110)
+ Call: (22) lists:member([[v, c], empty], [[[v, a], 947], [[v, b], 918]])
+ Unify: (22) lists:member([[v, c], empty], [[[v, a], 947], [[v, b], 918]])
+ Fail: (22) lists:member([[v, c], empty], [[[v, a], 947], [[v, b], 918]])
+ Redo: (21) updatevar([v, c], empty, [[[v, a], 947], [[v, b], 918]], _3110)
+^ Call: (22) not(member([[v, c], _3102], [[[v, a], 947], [[v, b], 918]]))
+^ Unify: (22) not(user:member([[v, c], _3102], [[[v, a], 947], [[v, b], 918]]))
+^ Exit: (22) not(user:member([[v, c], _3102], [[[v, a], 947], [[v, b], 918]]))
+ Call: (22) _3102=empty
+ Exit: (22) empty=empty
+ Call: (22) true
+ Exit: (22) true
+ Call: (22) lists:append([[[v, a], 947], [[v, b], 918]], [[[v, c], empty]], _3150)
+ Unify: (22) lists:append([[[v, a], 947], [[v, b], 918]], [[[v, c], empty]], [[[v, a], 947]|_3134])
+ Exit: (22) lists:append([[[v, a], 947], [[v, b], 918]], [[[v, c], empty]], [[[v, a], 947], [[v, b], 918], [[v, c], empty]])
+ Call: (22) true
+ Exit: (22) true
+ Call: (22) true
+ Exit: (22) true
+ Exit: (21) updatevar([v, c], empty, [[[v, a], 947], [[v, b], 918]], [[[v, a], 947], [[v, b], 918], [[v, c], empty]])
+ Exit: (20) putvalue([v, c], empty, [[[v, a], 947], [[v, b], 918]], [[[v, a], 947], [[v, b], 918], [[v, c], empty]])
+ Call: (20) lists:append([], [[[v, c], [v, c]]], _3180)
+ Unify: (20) lists:append([], [[[v, c], [v, c]]], [[[v, c], [v, c]]])
+ Exit: (20) lists:append([], [[[v, c], [v, c]]], [[[v, c], [v, c]]])
+ Call: (20) checkarguments([], [], [[[v, a], 947], [[v, b], 918], [[v, c], empty]], _3182, [[[v, c], [v, c]]], _3186)
+ Unify: (20) checkarguments([], [], [[[v, a], 947], [[v, b], 918], [[v, c], empty]], [[[v, a], 947], [[v, b], 918], [[v, c], empty]], [[[v, c], [v, c]]], [[[v, c], [v, c]]])
+ Exit: (20) checkarguments([], [], [[[v, a], 947], [[v, b], 918], [[v, c], empty]], [[[v, a], 947], [[v, b], 918], [[v, c], empty]], [[[v, c], [v, c]]], [[[v, c], [v, c]]])
+ Exit: (19) checkarguments([[v, c]], [[v, c]], [[[v, a], 947], [[v, b], 918]], [[[v, a], 947], [[v, b], 918], [[v, c], empty]], [], [[[v, c], [v, c]]])
+ Exit: (18) checkarguments([918, [v, c]], [[v, b], [v, c]], [[[v, a], 947]], [[[v, a], 947], [[v, b], 918], [[v, c], empty]], [], [[[v, c], [v, c]]])
+ Exit: (17) checkarguments([947, 918, [v, c]], [[v, a], [v, b], [v, c]], [], [[[v, a], 947], [[v, b], 918], [[v, c], empty]], [], [[[v, c], [v, c]]])
+ Call: (17) debug(on)
+ Fail: (17) debug(on)
+ Redo: (16) member1([[n, function0], [947, 918, [v, c]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], _3182)
+ Call: (17) true
+ Exit: (17) true
+ Call: (17) interpretbody([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[v, a], 947], [[v, b], 918], [[v, c], empty]], _3182, [[[n, function1], [[v, a], [v, b], [v, d]]], [[n, =], [[v, c], [v, d]]]], true)
+ Unify: (17) interpretbody([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[v, a], 947], [[v, b], 918], [[v, c], empty]], _3182, [[[n, function1], [[v, a], [v, b], [v, d]]], [[n, =], [[v, c], [v, d]]]], true)
+ Call: (18) [[[n, function1], [[v, a], [v, b], [v, d]]], [[n, =], [[v, c], [v, d]]]]=[[_3168|_3170]|_3164]
+ Exit: (18) [[[n, function1], [[v, a], [v, b], [v, d]]], [[n, =], [[v, c], [v, d]]]]=[[[n, function1], [[v, a], [v, b], [v, d]]], [[n, =], [[v, c], [v, d]]]]
+^ Call: (18) not(predicate_or_rule_name([n, function1]))
+^ Unify: (18) not(user:predicate_or_rule_name([n, function1]))
+ Call: (19) predicate_or_rule_name([n, function1])
+ Unify: (19) predicate_or_rule_name([n, function1])
+ Exit: (19) predicate_or_rule_name([n, function1])
+^ Fail: (18) not(user:predicate_or_rule_name([n, function1]))
+ Redo: (17) interpretbody([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[v, a], 947], [[v, b], 918], [[v, c], empty]], _3182, [[[n, function1], [[v, a], [v, b], [v, d]]], [[n, =], [[v, c], [v, d]]]], true)
+ Unify: (17) interpretbody([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[v, a], 947], [[v, b], 918], [[v, c], empty]], _3182, [[[n, function1], [[v, a], [v, b], [v, d]]], [[n, =], [[v, c], [v, d]]]], true)
+ Call: (18) [[[n, function1], [[v, a], [v, b], [v, d]]], [[n, =], [[v, c], [v, d]]]]=[[[n, not], [_3192]]|_3164]
+ Fail: (18) [[[n, function1], [[v, a], [v, b], [v, d]]], [[n, =], [[v, c], [v, d]]]]=[[[n, not], [_3192]]|_3164]
+ Redo: (17) interpretbody([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[v, a], 947], [[v, b], 918], [[v, c], empty]], _3182, [[[n, function1], [[v, a], [v, b], [v, d]]], [[n, =], [[v, c], [v, d]]]], true)
+ Unify: (17) interpretbody([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[v, a], 947], [[v, b], 918], [[v, c], empty]], _3182, [[[n, function1], [[v, a], [v, b], [v, d]]], [[n, =], [[v, c], [v, d]]]], true)
+ Call: (18) [[[n, function1], [[v, a], [v, b], [v, d]]], [[n, =], [[v, c], [v, d]]]]=[[[n, or], [_3192, _3198]]|_3164]
+ Fail: (18) [[[n, function1], [[v, a], [v, b], [v, d]]], [[n, =], [[v, c], [v, d]]]]=[[[n, or], [_3192, _3198]]|_3164]
+ Redo: (17) interpretbody([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[v, a], 947], [[v, b], 918], [[v, c], empty]], _3182, [[[n, function1], [[v, a], [v, b], [v, d]]], [[n, =], [[v, c], [v, d]]]], true)
+ Unify: (17) interpretbody([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[v, a], 947], [[v, b], 918], [[v, c], empty]], _3182, [[[n, function1], [[v, a], [v, b], [v, d]]], [[n, =], [[v, c], [v, d]]]], true)
+ Call: (18) [[[n, function1], [[v, a], [v, b], [v, d]]], [[n, =], [[v, c], [v, d]]]]=[[[n, "->"], [_3198, _3204]]|_3164]
+ Fail: (18) [[[n, function1], [[v, a], [v, b], [v, d]]], [[n, =], [[v, c], [v, d]]]]=[[[n, "->"], [_3198, _3204]]|_3164]
+ Redo: (17) interpretbody([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[v, a], 947], [[v, b], 918], [[v, c], empty]], _3182, [[[n, function1], [[v, a], [v, b], [v, d]]], [[n, =], [[v, c], [v, d]]]], true)
+ Unify: (17) interpretbody([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[v, a], 947], [[v, b], 918], [[v, c], empty]], _3182, [[[n, function1], [[v, a], [v, b], [v, d]]], [[n, =], [[v, c], [v, d]]]], true)
+ Call: (18) [[[n, function1], [[v, a], [v, b], [v, d]]], [[n, =], [[v, c], [v, d]]]]=[[[n, "->"], [_3198, _3204, _3210]]|_3164]
+ Fail: (18) [[[n, function1], [[v, a], [v, b], [v, d]]], [[n, =], [[v, c], [v, d]]]]=[[[n, "->"], [_3198, _3204, _3210]]|_3164]
+ Redo: (17) interpretbody([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[v, a], 947], [[v, b], 918], [[v, c], empty]], _3182, [[[n, function1], [[v, a], [v, b], [v, d]]], [[n, =], [[v, c], [v, d]]]], true)
+ Unify: (17) interpretbody([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[v, a], 947], [[v, b], 918], [[v, c], empty]], _3182, [[[n, function1], [[v, a], [v, b], [v, d]]], [[n, =], [[v, c], [v, d]]]], true)
+ Call: (18) [[[n, function1], [[v, a], [v, b], [v, d]]], [[n, =], [[v, c], [v, d]]]]=[_3162|_3164]
+ Exit: (18) [[[n, function1], [[v, a], [v, b], [v, d]]], [[n, =], [[v, c], [v, d]]]]=[[[n, function1], [[v, a], [v, b], [v, d]]], [[n, =], [[v, c], [v, d]]]]
+^ Call: (18) not(predicate_or_rule_name([[n, function1], [[v, a], [v, b], [v, d]]]))
+^ Unify: (18) not(user:predicate_or_rule_name([[n, function1], [[v, a], [v, b], [v|...]]]))
+ Call: (19) predicate_or_rule_name([[n, function1], [[v, a], [v, b], [v, d]]])
+ Fail: (19) predicate_or_rule_name([[n, function1], [[v, a], [v, b], [v, d]]])
+^ Exit: (18) not(user:predicate_or_rule_name([[n, function1], [[v, a], [v, b], [v|...]]]))
+ Call: (18) interpretstatement1([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[n, function1], [[v, a], [v, b], [v, d]]], [[[v, a], 947], [[v, b], 918], [[v, c], empty]], _3200, _3202, _3204)
+ Unify: (18) interpretstatement1([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[n, function1], [[v, a], [v, b], [v, d]]], [[[v, a], 947], [[v, b], 918], [[v, c], empty]], _3200, true, nocut)
+ Call: (19) [[n, function1], [[v, a], [v, b], [v, d]]]=[_3178, _3184]
+ Exit: (19) [[n, function1], [[v, a], [v, b], [v, d]]]=[[n, function1], [[v, a], [v, b], [v, d]]]
+ Call: (19) substitutevarsA1([[v, a], [v, b], [v, d]], [[[v, a], 947], [[v, b], 918], [[v, c], empty]], [], _3210, [], _3214)
+ Unify: (19) substitutevarsA1([[v, a], [v, b], [v, d]], [[[v, a], 947], [[v, b], 918], [[v, c], empty]], [], _3210, [], _3214)
+ Call: (20) substitutevarsA2([[v, a], [v, b], [v, d]], [[[v, a], 947], [[v, b], 918], [[v, c], empty]], [], _3210, [], _3214)
+ Unify: (20) substitutevarsA2([[v, a], [v, b], [v, d]], [[[v, a], 947], [[v, b], 918], [[v, c], empty]], [], _3210, [], _3214)
+ Call: (21) [[v, a], [v, b], [v, d]]=[_3190|_3192]
+ Exit: (21) [[v, a], [v, b], [v, d]]=[[v, a], [v, b], [v, d]]
+ Call: (21) getvalue([v, a], _3212, [[[v, a], 947], [[v, b], 918], [[v, c], empty]])
+ Unify: (21) getvalue([v, a], _3212, [[[v, a], 947], [[v, b], 918], [[v, c], empty]])
+^ Call: (22) not(isvar([v, a]))
+^ Unify: (22) not(user:isvar([v, a]))
+ Call: (23) isvar([v, a])
+ Unify: (23) isvar([v, a])
+ Exit: (23) isvar([v, a])
+^ Fail: (22) not(user:isvar([v, a]))
+ Redo: (21) getvalue([v, a], _3212, [[[v, a], 947], [[v, b], 918], [[v, c], empty]])
+ Call: (22) isvar([v, a])
+ Unify: (22) isvar([v, a])
+ Exit: (22) isvar([v, a])
+ Call: (22) isvalstrorundef(_3210)
+ Unify: (22) isvalstrorundef(_3210)
+ Call: (23) var(_3210)
+ Exit: (23) var(_3210)
+ Exit: (22) isvalstrorundef(_3210)
+ Call: (22) getvar([v, a], _3212, [[[v, a], 947], [[v, b], 918], [[v, c], empty]])
+ Unify: (22) getvar([v, a], _3212, [[[v, a], 947], [[v, b], 918], [[v, c], empty]])
+ Call: (23) lists:member([[v, a], _3202], [[[v, a], 947], [[v, b], 918], [[v, c], empty]])
+ Unify: (23) lists:member([[v, a], _3202], [[[v, a], 947], [[v, b], 918], [[v, c], empty]])
+ Exit: (23) lists:member([[v, a], 947], [[[v, a], 947], [[v, b], 918], [[v, c], empty]])
+^ Call: (23) not(947=empty)
+^ Unify: (23) not(user:(947=empty))
+^ Exit: (23) not(user:(947=empty))
+ Exit: (22) getvar([v, a], 947, [[[v, a], 947], [[v, b], 918], [[v, c], empty]])
+ Exit: (21) getvalue([v, a], 947, [[[v, a], 947], [[v, b], 918], [[v, c], empty]])
+ Call: (21) 947=empty
+ Fail: (21) 947=empty
+ Redo: (23) lists:member([[v, a], _3202], [[[v, a], 947], [[v, b], 918], [[v, c], empty]])
+ Fail: (23) lists:member([[v, a], _3202], [[[v, a], 947], [[v, b], 918], [[v, c], empty]])
+ Redo: (22) getvar([v, a], _3212, [[[v, a], 947], [[v, b], 918], [[v, c], empty]])
+ Unify: (22) getvar([v, a], empty, [[[v, a], 947], [[v, b], 918], [[v, c], empty]])
+^ Call: (23) not(member([[v, a], _3208], [[[v, a], 947], [[v, b], 918], [[v, c], empty]]))
+^ Unify: (23) not(user:member([[v, a], _3208], [[[v, a], 947], [[v, b], 918], [[v, c], empty]]))
+^ Fail: (23) not(user:member([[v, a], _3208], [[[v, a], 947], [[v, b], 918], [[v, c], empty]]))
+ Redo: (22) getvar([v, a], empty, [[[v, a], 947], [[v, b], 918], [[v, c], empty]])
+ Call: (23) lists:member([[v, a], empty], [[[v, a], 947], [[v, b], 918], [[v, c], empty]])
+ Unify: (23) lists:member([[v, a], empty], [[[v, a], 947], [[v, b], 918], [[v, c], empty]])
+ Fail: (23) lists:member([[v, a], empty], [[[v, a], 947], [[v, b], 918], [[v, c], empty]])
+ Fail: (22) getvar([v, a], _3212, [[[v, a], 947], [[v, b], 918], [[v, c], empty]])
+ Fail: (21) getvalue([v, a], _3212, [[[v, a], 947], [[v, b], 918], [[v, c], empty]])
+ Redo: (20) substitutevarsA2([[v, a], [v, b], [v, d]], [[[v, a], 947], [[v, b], 918], [[v, c], empty]], [], _3216, [], _3220)
+ Call: (21) getvalue([v, a], _3212, [[[v, a], 947], [[v, b], 918], [[v, c], empty]])
+ Unify: (21) getvalue([v, a], _3212, [[[v, a], 947], [[v, b], 918], [[v, c], empty]])
+^ Call: (22) not(isvar([v, a]))
+^ Unify: (22) not(user:isvar([v, a]))
+ Call: (23) isvar([v, a])
+ Unify: (23) isvar([v, a])
+ Exit: (23) isvar([v, a])
+^ Fail: (22) not(user:isvar([v, a]))
+ Redo: (21) getvalue([v, a], _3212, [[[v, a], 947], [[v, b], 918], [[v, c], empty]])
+ Call: (22) isvar([v, a])
+ Unify: (22) isvar([v, a])
+ Exit: (22) isvar([v, a])
+ Call: (22) isvalstrorundef(_3210)
+ Unify: (22) isvalstrorundef(_3210)
+ Call: (23) var(_3210)
+ Exit: (23) var(_3210)
+ Exit: (22) isvalstrorundef(_3210)
+ Call: (22) getvar([v, a], _3212, [[[v, a], 947], [[v, b], 918], [[v, c], empty]])
+ Unify: (22) getvar([v, a], _3212, [[[v, a], 947], [[v, b], 918], [[v, c], empty]])
+ Call: (23) lists:member([[v, a], _3202], [[[v, a], 947], [[v, b], 918], [[v, c], empty]])
+ Unify: (23) lists:member([[v, a], _3202], [[[v, a], 947], [[v, b], 918], [[v, c], empty]])
+ Exit: (23) lists:member([[v, a], 947], [[[v, a], 947], [[v, b], 918], [[v, c], empty]])
+^ Call: (23) not(947=empty)
+^ Unify: (23) not(user:(947=empty))
+^ Exit: (23) not(user:(947=empty))
+ Exit: (22) getvar([v, a], 947, [[[v, a], 947], [[v, b], 918], [[v, c], empty]])
+ Exit: (21) getvalue([v, a], 947, [[[v, a], 947], [[v, b], 918], [[v, c], empty]])
+ Call: (21) lists:append([], [947], _3244)
+ Unify: (21) lists:append([], [947], [947])
+ Exit: (21) lists:append([], [947], [947])
+ Call: (21) _3240=[]
+ Exit: (21) []=[]
+ Call: (21) substitutevarsA2([[v, b], [v, d]], [[[v, a], 947], [[v, b], 918], [[v, c], empty]], [947], _3246, [], _3250)
+ Unify: (21) substitutevarsA2([[v, b], [v, d]], [[[v, a], 947], [[v, b], 918], [[v, c], empty]], [947], _3246, [], _3250)
+ Call: (22) [[v, b], [v, d]]=[_3226|_3228]
+ Exit: (22) [[v, b], [v, d]]=[[v, b], [v, d]]
+ Call: (22) getvalue([v, b], _3248, [[[v, a], 947], [[v, b], 918], [[v, c], empty]])
+ Unify: (22) getvalue([v, b], _3248, [[[v, a], 947], [[v, b], 918], [[v, c], empty]])
+^ Call: (23) not(isvar([v, b]))
+^ Unify: (23) not(user:isvar([v, b]))
+ Call: (24) isvar([v, b])
+ Unify: (24) isvar([v, b])
+ Exit: (24) isvar([v, b])
+^ Fail: (23) not(user:isvar([v, b]))
+ Redo: (22) getvalue([v, b], _3248, [[[v, a], 947], [[v, b], 918], [[v, c], empty]])
+ Call: (23) isvar([v, b])
+ Unify: (23) isvar([v, b])
+ Exit: (23) isvar([v, b])
+ Call: (23) isvalstrorundef(_3246)
+ Unify: (23) isvalstrorundef(_3246)
+ Call: (24) var(_3246)
+ Exit: (24) var(_3246)
+ Exit: (23) isvalstrorundef(_3246)
+ Call: (23) getvar([v, b], _3248, [[[v, a], 947], [[v, b], 918], [[v, c], empty]])
+ Unify: (23) getvar([v, b], _3248, [[[v, a], 947], [[v, b], 918], [[v, c], empty]])
+ Call: (24) lists:member([[v, b], _3238], [[[v, a], 947], [[v, b], 918], [[v, c], empty]])
+ Unify: (24) lists:member([[v, b], _3238], [[[v, a], 947], [[v, b], 918], [[v, c], empty]])
+ Exit: (24) lists:member([[v, b], 918], [[[v, a], 947], [[v, b], 918], [[v, c], empty]])
+^ Call: (24) not(918=empty)
+^ Unify: (24) not(user:(918=empty))
+^ Exit: (24) not(user:(918=empty))
+ Exit: (23) getvar([v, b], 918, [[[v, a], 947], [[v, b], 918], [[v, c], empty]])
+ Exit: (22) getvalue([v, b], 918, [[[v, a], 947], [[v, b], 918], [[v, c], empty]])
+ Call: (22) 918=empty
+ Fail: (22) 918=empty
+ Redo: (24) lists:member([[v, b], _3238], [[[v, a], 947], [[v, b], 918], [[v, c], empty]])
+ Fail: (24) lists:member([[v, b], _3238], [[[v, a], 947], [[v, b], 918], [[v, c], empty]])
+ Redo: (23) getvar([v, b], _3248, [[[v, a], 947], [[v, b], 918], [[v, c], empty]])
+ Unify: (23) getvar([v, b], empty, [[[v, a], 947], [[v, b], 918], [[v, c], empty]])
+^ Call: (24) not(member([[v, b], _3244], [[[v, a], 947], [[v, b], 918], [[v, c], empty]]))
+^ Unify: (24) not(user:member([[v, b], _3244], [[[v, a], 947], [[v, b], 918], [[v, c], empty]]))
+^ Fail: (24) not(user:member([[v, b], _3244], [[[v, a], 947], [[v, b], 918], [[v, c], empty]]))
+ Redo: (23) getvar([v, b], empty, [[[v, a], 947], [[v, b], 918], [[v, c], empty]])
+ Call: (24) lists:member([[v, b], empty], [[[v, a], 947], [[v, b], 918], [[v, c], empty]])
+ Unify: (24) lists:member([[v, b], empty], [[[v, a], 947], [[v, b], 918], [[v, c], empty]])
+ Fail: (24) lists:member([[v, b], empty], [[[v, a], 947], [[v, b], 918], [[v, c], empty]])
+ Fail: (23) getvar([v, b], _3248, [[[v, a], 947], [[v, b], 918], [[v, c], empty]])
+ Fail: (22) getvalue([v, b], _3248, [[[v, a], 947], [[v, b], 918], [[v, c], empty]])
+ Redo: (21) substitutevarsA2([[v, b], [v, d]], [[[v, a], 947], [[v, b], 918], [[v, c], empty]], [947], _3252, [], _3256)
+ Call: (22) getvalue([v, b], _3248, [[[v, a], 947], [[v, b], 918], [[v, c], empty]])
+ Unify: (22) getvalue([v, b], _3248, [[[v, a], 947], [[v, b], 918], [[v, c], empty]])
+^ Call: (23) not(isvar([v, b]))
+^ Unify: (23) not(user:isvar([v, b]))
+ Call: (24) isvar([v, b])
+ Unify: (24) isvar([v, b])
+ Exit: (24) isvar([v, b])
+^ Fail: (23) not(user:isvar([v, b]))
+ Redo: (22) getvalue([v, b], _3248, 
[[[v, a], 947], [[v, b], 918], [[v, c], empty]]) + Call: (23) isvar([v, b]) + Unify: (23) isvar([v, b]) + Exit: (23) isvar([v, b]) + Call: (23) isvalstrorundef(_3246) + Unify: (23) isvalstrorundef(_3246) + Call: (24) var(_3246) + Exit: (24) var(_3246) + Exit: (23) isvalstrorundef(_3246) + Call: (23) getvar([v, b], _3248, [[[v, a], 947], [[v, b], 918], [[v, c], empty]]) + Unify: (23) getvar([v, b], _3248, [[[v, a], 947], [[v, b], 918], [[v, c], empty]]) + Call: (24) lists:member([[v, b], _3238], [[[v, a], 947], [[v, b], 918], [[v, c], empty]]) + Unify: (24) lists:member([[v, b], _3238], [[[v, a], 947], [[v, b], 918], [[v, c], empty]]) + Exit: (24) lists:member([[v, b], 918], [[[v, a], 947], [[v, b], 918], [[v, c], empty]]) +^ Call: (24) not(918=empty) +^ Unify: (24) not(user:(918=empty)) +^ Exit: (24) not(user:(918=empty)) + Exit: (23) getvar([v, b], 918, [[[v, a], 947], [[v, b], 918], [[v, c], empty]]) + Exit: (22) getvalue([v, b], 918, [[[v, a], 947], [[v, b], 918], [[v, c], empty]]) + Call: (22) lists:append([947], [918], _3280) + Unify: (22) lists:append([947], [918], [947|_3264]) + Exit: (22) lists:append([947], [918], [947, 918]) + Call: (22) _3282=[] + Exit: (22) []=[] + Call: (22) substitutevarsA2([[v, d]], [[[v, a], 947], [[v, b], 918], [[v, c], empty]], [947, 918], _3288, [], _3292) + Unify: (22) substitutevarsA2([[v, d]], [[[v, a], 947], [[v, b], 918], [[v, c], empty]], [947, 918], _3288, [], _3292) + Call: (23) [[v, d]]=[_3268|_3270] + Exit: (23) [[v, d]]=[[v, d]] + Call: (23) getvalue([v, d], _3290, [[[v, a], 947], [[v, b], 918], [[v, c], empty]]) + Unify: (23) getvalue([v, d], _3290, [[[v, a], 947], [[v, b], 918], [[v, c], empty]]) +^ Call: (24) not(isvar([v, d])) +^ Unify: (24) not(user:isvar([v, d])) + Call: (25) isvar([v, d]) + Unify: (25) isvar([v, d]) + Exit: (25) isvar([v, d]) +^ Fail: (24) not(user:isvar([v, d])) + Redo: (23) getvalue([v, d], _3290, [[[v, a], 947], [[v, b], 918], [[v, c], empty]]) + Call: (24) isvar([v, d]) + Unify: (24) 
isvar([v, d]) + Exit: (24) isvar([v, d]) + Call: (24) isvalstrorundef(_3288) + Unify: (24) isvalstrorundef(_3288) + Call: (25) var(_3288) + Exit: (25) var(_3288) + Exit: (24) isvalstrorundef(_3288) + Call: (24) getvar([v, d], _3290, [[[v, a], 947], [[v, b], 918], [[v, c], empty]]) + Unify: (24) getvar([v, d], _3290, [[[v, a], 947], [[v, b], 918], [[v, c], empty]]) + Call: (25) lists:member([[v, d], _3280], [[[v, a], 947], [[v, b], 918], [[v, c], empty]]) + Unify: (25) lists:member([[v, d], _3280], [[[v, a], 947], [[v, b], 918], [[v, c], empty]]) + Fail: (25) lists:member([[v, d], _3280], [[[v, a], 947], [[v, b], 918], [[v, c], empty]]) + Redo: (24) getvar([v, d], _3290, [[[v, a], 947], [[v, b], 918], [[v, c], empty]]) + Unify: (24) getvar([v, d], empty, [[[v, a], 947], [[v, b], 918], [[v, c], empty]]) +^ Call: (25) not(member([[v, d], _3286], [[[v, a], 947], [[v, b], 918], [[v, c], empty]])) +^ Unify: (25) not(user:member([[v, d], _3286], [[[v, a], 947], [[v, b], 918], [[v, c], empty]])) +^ Exit: (25) not(user:member([[v, d], _3286], [[[v, a], 947], [[v, b], 918], [[v, c], empty]])) + Call: (25) true + Exit: (25) true + Exit: (24) getvar([v, d], empty, [[[v, a], 947], [[v, b], 918], [[v, c], empty]]) + Exit: (23) getvalue([v, d], empty, [[[v, a], 947], [[v, b], 918], [[v, c], empty]]) + Call: (23) empty=empty + Exit: (23) empty=empty + Call: (23) lists:append([947, 918], [[v, d]], _3322) + Unify: (23) lists:append([947, 918], [[v, d]], [947|_3306]) + Exit: (23) lists:append([947, 918], [[v, d]], [947, 918, [v, d]]) + Call: (23) isvar([v, d]) + Unify: (23) isvar([v, d]) + Exit: (23) isvar([v, d]) + Call: (23) lists:append([], [[v, d]], _3340) + Unify: (23) lists:append([], [[v, d]], [[v, d]]) + Exit: (23) lists:append([], [[v, d]], [[v, d]]) + Call: (23) substitutevarsA2([], [[[v, a], 947], [[v, b], 918], [[v, c], empty]], [947, 918, [v, d]], _3342, [[v, d]], _3346) + Unify: (23) substitutevarsA2([], [[[v, a], 947], [[v, b], 918], [[v, c], empty]], [947, 918, [v, 
d]], [947, 918, [v, d]], [[v, d]], [[v, d]]) + Exit: (23) substitutevarsA2([], [[[v, a], 947], [[v, b], 918], [[v, c], empty]], [947, 918, [v, d]], [947, 918, [v, d]], [[v, d]], [[v, d]]) + Exit: (22) substitutevarsA2([[v, d]], [[[v, a], 947], [[v, b], 918], [[v, c], empty]], [947, 918], [947, 918, [v, d]], [], [[v, d]]) + Exit: (21) substitutevarsA2([[v, b], [v, d]], [[[v, a], 947], [[v, b], 918], [[v, c], empty]], [947], [947, 918, [v, d]], [], [[v, d]]) + Exit: (20) substitutevarsA2([[v, a], [v, b], [v, d]], [[[v, a], 947], [[v, b], 918], [[v, c], empty]], [], [947, 918, [v, d]], [], [[v, d]]) + Exit: (19) substitutevarsA1([[v, a], [v, b], [v, d]], [[[v, a], 947], [[v, b], 918], [[v, c], empty]], [], [947, 918, [v, d]], [], [[v, d]]) + Call: (19) _3348=[[n, function1], [947, 918, [v, d]]] + Exit: (19) [[n, function1], [947, 918, [v, d]]]=[[n, function1], [947, 918, [v, d]]] + Call: (19) interpret2([[n, function1], [947, 918, [v, d]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], _3354) + Unify: (19) interpret2([[n, function1], [947, 918, [v, d]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], _3354) + Call: (20) member2([[n, function1], [947, 918, [v, d]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, 
function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], _3354) + Unify: (20) member2([[n, function1], [947, 918, [v, d]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], _3354) + Call: (21) cut(off) + Unify: (21) cut(off) + Exit: (21) cut(off) + Call: (21) [[n, function1], [947, 918, [v, d]]]=[_3334, _3340] + Exit: (21) [[n, function1], [947, 918, [v, d]]]=[[n, function1], [947, 918, [v, d]]] + Call: (21) [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]]=[[[n, function1], _3358, ":-", _3376]|_3348] + Fail: (21) [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]]=[[[n, function1], _3358, ":-", _3376]|_3348] + Redo: (20) member2([[n, function1], [947, 918, [v, d]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], _3354) + Call: (21) member21([[n, function1], [947, 918, [v, d]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], _3354) + Unify: (21) member21([[n, function1], [947, 918, [v, d]]], [[[n, function0], [[v, a], 
[v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], _3354) + Call: (22) cut(off) + Unify: (22) cut(off) + Exit: (22) cut(off) + Call: (22) [[n, function1], [947, 918, [v, d]]]=[_3334] + Fail: (22) [[n, function1], [947, 918, [v, d]]]=[_3334] + Redo: (21) member21([[n, function1], [947, 918, [v, d]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], _3354) + Call: (22) member22([[n, function1], [947, 918, [v, d]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], _3354) + Unify: (22) member22([[n, function1], [947, 918, [v, d]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], _3354) + Call: (23) cut(off) + Unify: (23) cut(off) + Exit: (23) cut(off) + Call: (23) [[n, function1], [947, 918, [v, d]]]=[_3334, _3340] + Exit: (23) [[n, function1], [947, 918, [v, d]]]=[[n, function1], [947, 918, [v, d]]] + Call: (23) [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, 
function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]]=[[[n, function1], _3358]|_3348] + Fail: (23) [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]]=[[[n, function1], _3358]|_3348] + Redo: (22) member22([[n, function1], [947, 918, [v, d]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], _3354) + Call: (23) member23([[n, function1], [947, 918, [v, d]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], _3354) + Unify: (23) member23([[n, function1], [947, 918, [v, d]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], _3354) + Call: (24) cut(off) + Unify: (24) cut(off) + Exit: (24) cut(off) + Call: (24) [[n, function1], [947, 918, [v, d]]]=[_3334] + Fail: (24) [[n, function1], [947, 918, [v, d]]]=[_3334] + Redo: (23) member23([[n, function1], [947, 918, [v, d]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], 
[[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], _3354) + Call: (24) [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]]=[_3334|_3336] + Exit: (24) [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]]=[[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]] + Call: (24) member2([[n, function1], [947, 918, [v, d]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function1], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]]]]], _3360) + Unify: (24) member2([[n, function1], [947, 918, [v, d]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function1], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]]]]], _3360) + Call: (25) cut(off) + Unify: (25) cut(off) + Exit: (25) cut(off) + Call: (25) [[n, function1], [947, 918, [v, d]]]=[_3340, _3346] + Exit: (25) [[n, function1], [947, 918, [v, d]]]=[[n, function1], [947, 918, [v, d]]] + Call: (25) [[[n, function1], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]]]]]=[[[n, function1], _3364, ":-", _3382]|_3354] + Exit: (25) [[[n, function1], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]]]]]=[[[n, function1], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]]]]] + Call: (25) length([947, 918, [v, d]], _3404) + Unify: (25) length([947, 918, [v, d]], _3404) + Exit: (25) length([947, 918, [v, d]], 3) + Call: (25) length([[v, a], [v, b], [v, c]], 3) + Unify: (25) length([[v, a], [v, b], [v, c]], 3) + Unify: (25) length([[v, a], [v, b], [v, c]], 3) 
+ Exit: (25) length([[v, a], [v, b], [v, c]], 3) + Call: (25) checkarguments([947, 918, [v, d]], [[v, a], [v, b], [v, c]], [], _3408, [], _3412) + Unify: (25) checkarguments([947, 918, [v, d]], [[v, a], [v, b], [v, c]], [], _3408, [], _3412) + Call: (26) [947, 918, [v, d]]=[_3388|_3390] + Exit: (26) [947, 918, [v, d]]=[947, 918, [v, d]] + Call: (26) expressionnotatom3(947) + Unify: (26) expressionnotatom3(947) + Call: (27) expressionnotatom(947) + Unify: (27) expressionnotatom(947) + Call: (28) isvalstrempty(947) + Unify: (28) isvalstrempty(947) + Call: (29) var(947) + Fail: (29) var(947) + Redo: (28) isvalstrempty(947) + Unify: (28) isvalstrempty(947) + Call: (29) isval(947) + Unify: (29) isval(947) + Exit: (29) isval(947) + Exit: (28) isvalstrempty(947) + Exit: (27) expressionnotatom(947) + Exit: (26) expressionnotatom3(947) + Call: (26) [[v, a], [v, b], [v, c]]=[_3394|_3396] + Exit: (26) [[v, a], [v, b], [v, c]]=[[v, a], [v, b], [v, c]] +^ Call: (26) not(var([v, a])) +^ Unify: (26) not(user:var([v, a])) +^ Exit: (26) not(user:var([v, a])) + Call: (26) isvar([v, a]) + Unify: (26) isvar([v, a]) + Exit: (26) isvar([v, a]) + Call: (26) putvalue([v, a], 947, [], _3430) + Unify: (26) putvalue([v, a], 947, [], _3430) +^ Call: (27) not(isvar([v, a])) +^ Unify: (27) not(user:isvar([v, a])) + Call: (28) isvar([v, a]) + Unify: (28) isvar([v, a]) + Exit: (28) isvar([v, a]) +^ Fail: (27) not(user:isvar([v, a])) + Redo: (26) putvalue([v, a], 947, [], _3430) + Call: (27) isvar([v, a]) + Unify: (27) isvar([v, a]) + Exit: (27) isvar([v, a]) + Call: (27) isvalstrorundef(947) + Unify: (27) isvalstrorundef(947) + Call: (28) var(947) + Fail: (28) var(947) + Redo: (27) isvalstrorundef(947) + Unify: (27) isvalstrorundef(947) +^ Call: (28) not(var(947)) +^ Unify: (28) not(user:var(947)) +^ Exit: (28) not(user:var(947)) + Call: (28) isval(947) + Unify: (28) isval(947) + Exit: (28) isval(947) + Exit: (27) isvalstrorundef(947) + Call: (27) updatevar([v, a], 947, [], _3440) + Unify: (27) 
updatevar([v, a], 947, [], _3440) + Call: (28) lists:member([[v, a], empty], []) + Fail: (28) lists:member([[v, a], empty], []) + Redo: (27) updatevar([v, a], 947, [], _3440) +^ Call: (28) not(member([[v, a], _3432], [])) +^ Unify: (28) not(user:member([[v, a], _3432], [])) +^ Exit: (28) not(user:member([[v, a], _3432], [])) + Call: (28) _3432=empty + Exit: (28) empty=empty + Call: (28) true + Exit: (28) true + Call: (28) lists:append([], [[[v, a], 947]], _3480) + Unify: (28) lists:append([], [[[v, a], 947]], [[[v, a], 947]]) + Exit: (28) lists:append([], [[[v, a], 947]], [[[v, a], 947]]) + Call: (28) true + Exit: (28) true + Call: (28) true + Exit: (28) true + Exit: (27) updatevar([v, a], 947, [], [[[v, a], 947]]) + Exit: (26) putvalue([v, a], 947, [], [[[v, a], 947]]) + Call: (26) checkarguments([918, [v, d]], [[v, b], [v, c]], [[[v, a], 947]], _3482, [], _3486) + Unify: (26) checkarguments([918, [v, d]], [[v, b], [v, c]], [[[v, a], 947]], _3482, [], _3486) + Call: (27) [918, [v, d]]=[_3462|_3464] + Exit: (27) [918, [v, d]]=[918, [v, d]] + Call: (27) expressionnotatom3(918) + Unify: (27) expressionnotatom3(918) + Call: (28) expressionnotatom(918) + Unify: (28) expressionnotatom(918) + Call: (29) isvalstrempty(918) + Unify: (29) isvalstrempty(918) + Call: (30) var(918) + Fail: (30) var(918) + Redo: (29) isvalstrempty(918) + Unify: (29) isvalstrempty(918) + Call: (30) isval(918) + Unify: (30) isval(918) + Exit: (30) isval(918) + Exit: (29) isvalstrempty(918) + Exit: (28) expressionnotatom(918) + Exit: (27) expressionnotatom3(918) + Call: (27) [[v, b], [v, c]]=[_3468|_3470] + Exit: (27) [[v, b], [v, c]]=[[v, b], [v, c]] +^ Call: (27) not(var([v, b])) +^ Unify: (27) not(user:var([v, b])) +^ Exit: (27) not(user:var([v, b])) + Call: (27) isvar([v, b]) + Unify: (27) isvar([v, b]) + Exit: (27) isvar([v, b]) + Call: (27) putvalue([v, b], 918, [[[v, a], 947]], _3504) + Unify: (27) putvalue([v, b], 918, [[[v, a], 947]], _3504) +^ Call: (28) not(isvar([v, b])) +^ Unify: (28) 
not(user:isvar([v, b])) + Call: (29) isvar([v, b]) + Unify: (29) isvar([v, b]) + Exit: (29) isvar([v, b]) +^ Fail: (28) not(user:isvar([v, b])) + Redo: (27) putvalue([v, b], 918, [[[v, a], 947]], _3504) + Call: (28) isvar([v, b]) + Unify: (28) isvar([v, b]) + Exit: (28) isvar([v, b]) + Call: (28) isvalstrorundef(918) + Unify: (28) isvalstrorundef(918) + Call: (29) var(918) + Fail: (29) var(918) + Redo: (28) isvalstrorundef(918) + Unify: (28) isvalstrorundef(918) +^ Call: (29) not(var(918)) +^ Unify: (29) not(user:var(918)) +^ Exit: (29) not(user:var(918)) + Call: (29) isval(918) + Unify: (29) isval(918) + Exit: (29) isval(918) + Exit: (28) isvalstrorundef(918) + Call: (28) updatevar([v, b], 918, [[[v, a], 947]], _3514) + Unify: (28) updatevar([v, b], 918, [[[v, a], 947]], _3514) + Call: (29) lists:member([[v, b], empty], [[[v, a], 947]]) + Unify: (29) lists:member([[v, b], empty], [[[v, a], 947]]) + Fail: (29) lists:member([[v, b], empty], [[[v, a], 947]]) + Redo: (28) updatevar([v, b], 918, [[[v, a], 947]], _3514) +^ Call: (29) not(member([[v, b], _3506], [[[v, a], 947]])) +^ Unify: (29) not(user:member([[v, b], _3506], [[[v, a], 947]])) +^ Exit: (29) not(user:member([[v, b], _3506], [[[v, a], 947]])) + Call: (29) _3506=empty + Exit: (29) empty=empty + Call: (29) true + Exit: (29) true + Call: (29) lists:append([[[v, a], 947]], [[[v, b], 918]], _3554) + Unify: (29) lists:append([[[v, a], 947]], [[[v, b], 918]], [[[v, a], 947]|_3538]) + Exit: (29) lists:append([[[v, a], 947]], [[[v, b], 918]], [[[v, a], 947], [[v, b], 918]]) + Call: (29) true + Exit: (29) true + Call: (29) true + Exit: (29) true + Exit: (28) updatevar([v, b], 918, [[[v, a], 947]], [[[v, a], 947], [[v, b], 918]]) + Exit: (27) putvalue([v, b], 918, [[[v, a], 947]], [[[v, a], 947], [[v, b], 918]]) + Call: (27) checkarguments([[v, d]], [[v, c]], [[[v, a], 947], [[v, b], 918]], _3562, [], _3566) + Unify: (27) checkarguments([[v, d]], [[v, c]], [[[v, a], 947], [[v, b], 918]], _3562, [], _3566) + Call: 
(28) [[v, d]]=[_3542|_3544] + Exit: (28) [[v, d]]=[[v, d]] + Call: (28) expressionnotatom3([v, d]) + Unify: (28) expressionnotatom3([v, d]) + Call: (29) expressionnotatom([v, d]) + Unify: (29) expressionnotatom([v, d]) + Call: (30) isvalstrempty([v, d]) + Unify: (30) isvalstrempty([v, d]) + Call: (31) var([v, d]) + Fail: (31) var([v, d]) + Redo: (30) isvalstrempty([v, d]) + Unify: (30) isvalstrempty([v, d]) + Call: (31) isval([v, d]) + Unify: (31) isval([v, d]) + Fail: (31) isval([v, d]) + Redo: (30) isvalstrempty([v, d]) + Unify: (30) isvalstrempty([v, d]) + Fail: (30) isvalstrempty([v, d]) + Redo: (29) expressionnotatom([v, d]) + Unify: (29) expressionnotatom([v, d]) +^ Call: (30) not(atom([v, d])) +^ Unify: (30) not(user:atom([v, d])) +^ Exit: (30) not(user:atom([v, d])) + Call: (30) length([v, d], _3574) + Unify: (30) length([v, d], _3574) + Exit: (30) length([v, d], 2) + Call: (30) 2>=1 + Exit: (30) 2>=1 + Call: (30) expressionnotatom2([v, d]) + Unify: (30) expressionnotatom2([v, d]) + Call: (31) isvalstrempty(v) + Unify: (31) isvalstrempty(v) + Call: (32) var(v) + Fail: (32) var(v) + Redo: (31) isvalstrempty(v) + Unify: (31) isvalstrempty(v) + Call: (32) isval(v) + Unify: (32) isval(v) + Fail: (32) isval(v) + Redo: (31) isvalstrempty(v) + Unify: (31) isvalstrempty(v) + Fail: (31) isvalstrempty(v) + Fail: (30) expressionnotatom2([v, d]) + Redo: (29) expressionnotatom([v, d]) + Unify: (29) expressionnotatom([v, d]) + Call: (30) predicate_or_rule_name([v, d]) + Fail: (30) predicate_or_rule_name([v, d]) + Fail: (29) expressionnotatom([v, d]) + Redo: (28) expressionnotatom3([v, d]) + Unify: (28) expressionnotatom3([v, d]) +^ Call: (29) not([v, d]=[v, _3560]) +^ Unify: (29) not(user:([v, d]=[v, _3560])) +^ Fail: (29) not(user:([v, d]=[v, _3560])) + Fail: (28) expressionnotatom3([v, d]) + Redo: (27) checkarguments([[v, d]], [[v, c]], [[[v, a], 947], [[v, b], 918]], _3562, [], _3566) + Unify: (27) checkarguments([[v, d]], [[v, c]], [[[v, a], 947], [[v, b], 918]], 
_3562, [], _3566) + Call: (28) [[v, d]]=[_3542|_3544] + Exit: (28) [[v, d]]=[[v, d]] +^ Call: (28) not(var([v, d])) +^ Unify: (28) not(user:var([v, d])) +^ Exit: (28) not(user:var([v, d])) + Call: (28) isvar([v, d]) + Unify: (28) isvar([v, d]) + Exit: (28) isvar([v, d]) + Call: (28) [[v, c]]=[_3558|_3560] + Exit: (28) [[v, c]]=[[v, c]] + Call: (28) expressionnotatom3([v, c]) + Unify: (28) expressionnotatom3([v, c]) + Call: (29) expressionnotatom([v, c]) + Unify: (29) expressionnotatom([v, c]) + Call: (30) isvalstrempty([v, c]) + Unify: (30) isvalstrempty([v, c]) + Call: (31) var([v, c]) + Fail: (31) var([v, c]) + Redo: (30) isvalstrempty([v, c]) + Unify: (30) isvalstrempty([v, c]) + Call: (31) isval([v, c]) + Unify: (31) isval([v, c]) + Fail: (31) isval([v, c]) + Redo: (30) isvalstrempty([v, c]) + Unify: (30) isvalstrempty([v, c]) + Fail: (30) isvalstrempty([v, c]) + Redo: (29) expressionnotatom([v, c]) + Unify: (29) expressionnotatom([v, c]) +^ Call: (30) not(atom([v, c])) +^ Unify: (30) not(user:atom([v, c])) +^ Exit: (30) not(user:atom([v, c])) + Call: (30) length([v, c], _3590) + Unify: (30) length([v, c], _3590) + Exit: (30) length([v, c], 2) + Call: (30) 2>=1 + Exit: (30) 2>=1 + Call: (30) expressionnotatom2([v, c]) + Unify: (30) expressionnotatom2([v, c]) + Call: (31) isvalstrempty(v) + Unify: (31) isvalstrempty(v) + Call: (32) var(v) + Fail: (32) var(v) + Redo: (31) isvalstrempty(v) + Unify: (31) isvalstrempty(v) + Call: (32) isval(v) + Unify: (32) isval(v) + Fail: (32) isval(v) + Redo: (31) isvalstrempty(v) + Unify: (31) isvalstrempty(v) + Fail: (31) isvalstrempty(v) + Fail: (30) expressionnotatom2([v, c]) + Redo: (29) expressionnotatom([v, c]) + Unify: (29) expressionnotatom([v, c]) + Call: (30) predicate_or_rule_name([v, c]) + Fail: (30) predicate_or_rule_name([v, c]) + Fail: (29) expressionnotatom([v, c]) + Redo: (28) expressionnotatom3([v, c]) + Unify: (28) expressionnotatom3([v, c]) +^ Call: (29) not([v, c]=[v, _3576]) +^ Unify: (29) not(user:([v, 
c]=[v, _3576])) +^ Fail: (29) not(user:([v, c]=[v, _3576])) + Fail: (28) expressionnotatom3([v, c]) + Redo: (27) checkarguments([[v, d]], [[v, c]], [[[v, a], 947], [[v, b], 918]], _3562, [], _3566) + Unify: (27) checkarguments([[v, d]], [[v, c]], [[[v, a], 947], [[v, b], 918]], _3562, [], _3566) + Call: (28) [[v, d]]=[_3542|_3544] + Exit: (28) [[v, d]]=[[v, d]] +^ Call: (28) not(var([v, d])) +^ Unify: (28) not(user:var([v, d])) +^ Exit: (28) not(user:var([v, d])) + Call: (28) isvar([v, d]) + Unify: (28) isvar([v, d]) + Exit: (28) isvar([v, d]) + Call: (28) [[v, c]]=[_3558|_3560] + Exit: (28) [[v, c]]=[[v, c]] +^ Call: (28) not(var([v, c])) +^ Unify: (28) not(user:var([v, c])) +^ Exit: (28) not(user:var([v, c])) + Call: (28) isvar([v, c]) + Unify: (28) isvar([v, c]) + Exit: (28) isvar([v, c]) + Call: (28) getvalue([v, c], _3590, [[[v, a], 947], [[v, b], 918]]) + Unify: (28) getvalue([v, c], _3590, [[[v, a], 947], [[v, b], 918]]) +^ Call: (29) not(isvar([v, c])) +^ Unify: (29) not(user:isvar([v, c])) + Call: (30) isvar([v, c]) + Unify: (30) isvar([v, c]) + Exit: (30) isvar([v, c]) +^ Fail: (29) not(user:isvar([v, c])) + Redo: (28) getvalue([v, c], _3590, [[[v, a], 947], [[v, b], 918]]) + Call: (29) isvar([v, c]) + Unify: (29) isvar([v, c]) + Exit: (29) isvar([v, c]) + Call: (29) isvalstrorundef(_3588) + Unify: (29) isvalstrorundef(_3588) + Call: (30) var(_3588) + Exit: (30) var(_3588) + Exit: (29) isvalstrorundef(_3588) + Call: (29) getvar([v, c], _3590, [[[v, a], 947], [[v, b], 918]]) + Unify: (29) getvar([v, c], _3590, [[[v, a], 947], [[v, b], 918]]) + Call: (30) lists:member([[v, c], _3580], [[[v, a], 947], [[v, b], 918]]) + Unify: (30) lists:member([[v, c], _3580], [[[v, a], 947], [[v, b], 918]]) + Fail: (30) lists:member([[v, c], _3580], [[[v, a], 947], [[v, b], 918]]) + Redo: (29) getvar([v, c], _3590, [[[v, a], 947], [[v, b], 918]]) + Unify: (29) getvar([v, c], empty, [[[v, a], 947], [[v, b], 918]]) +^ Call: (30) not(member([[v, c], _3586], [[[v, a], 947], 
[[v, b], 918]])) +^ Unify: (30) not(user:member([[v, c], _3586], [[[v, a], 947], [[v, b], 918]])) +^ Exit: (30) not(user:member([[v, c], _3586], [[[v, a], 947], [[v, b], 918]])) + Call: (30) true + Exit: (30) true + Exit: (29) getvar([v, c], empty, [[[v, a], 947], [[v, b], 918]]) + Exit: (28) getvalue([v, c], empty, [[[v, a], 947], [[v, b], 918]]) + Call: (28) true + Exit: (28) true + Call: (28) putvalue([v, c], empty, [[[v, a], 947], [[v, b], 918]], _3618) + Unify: (28) putvalue([v, c], empty, [[[v, a], 947], [[v, b], 918]], _3618) +^ Call: (29) not(isvar([v, c])) +^ Unify: (29) not(user:isvar([v, c])) + Call: (30) isvar([v, c]) + Unify: (30) isvar([v, c]) + Exit: (30) isvar([v, c]) +^ Fail: (29) not(user:isvar([v, c])) + Redo: (28) putvalue([v, c], empty, [[[v, a], 947], [[v, b], 918]], _3618) + Call: (29) isvar([v, c]) + Unify: (29) isvar([v, c]) + Exit: (29) isvar([v, c]) + Call: (29) isvalstrorundef(empty) + Unify: (29) isvalstrorundef(empty) + Call: (30) var(empty) + Fail: (30) var(empty) + Redo: (29) isvalstrorundef(empty) + Unify: (29) isvalstrorundef(empty) +^ Call: (30) not(var(empty)) +^ Unify: (30) not(user:var(empty)) +^ Exit: (30) not(user:var(empty)) + Call: (30) isval(empty) + Unify: (30) isval(empty) + Fail: (30) isval(empty) + Redo: (29) isvalstrorundef(empty) + Unify: (29) isvalstrorundef(empty) +^ Call: (30) not(var(empty)) +^ Unify: (30) not(user:var(empty)) +^ Exit: (30) not(user:var(empty)) + Call: (30) expression(empty) + Unify: (30) expression(empty) + Exit: (30) expression(empty) + Exit: (29) isvalstrorundef(empty) + Call: (29) updatevar([v, c], empty, [[[v, a], 947], [[v, b], 918]], _3628) + Unify: (29) updatevar([v, c], empty, [[[v, a], 947], [[v, b], 918]], _3628) + Call: (30) lists:member([[v, c], empty], [[[v, a], 947], [[v, b], 918]]) + Unify: (30) lists:member([[v, c], empty], [[[v, a], 947], [[v, b], 918]]) + Fail: (30) lists:member([[v, c], empty], [[[v, a], 947], [[v, b], 918]]) + Redo: (29) updatevar([v, c], empty, [[[v, a], 
947], [[v, b], 918]], _3628) +^ Call: (30) not(member([[v, c], _3620], [[[v, a], 947], [[v, b], 918]])) +^ Unify: (30) not(user:member([[v, c], _3620], [[[v, a], 947], [[v, b], 918]])) +^ Exit: (30) not(user:member([[v, c], _3620], [[[v, a], 947], [[v, b], 918]])) + Call: (30) _3620=empty + Exit: (30) empty=empty + Call: (30) true + Exit: (30) true + Call: (30) lists:append([[[v, a], 947], [[v, b], 918]], [[[v, c], empty]], _3668) + Unify: (30) lists:append([[[v, a], 947], [[v, b], 918]], [[[v, c], empty]], [[[v, a], 947]|_3652]) + Exit: (30) lists:append([[[v, a], 947], [[v, b], 918]], [[[v, c], empty]], [[[v, a], 947], [[v, b], 918], [[v, c], empty]]) + Call: (30) true + Exit: (30) true + Call: (30) true + Exit: (30) true + Exit: (29) updatevar([v, c], empty, [[[v, a], 947], [[v, b], 918]], [[[v, a], 947], [[v, b], 918], [[v, c], empty]]) + Exit: (28) putvalue([v, c], empty, [[[v, a], 947], [[v, b], 918]], [[[v, a], 947], [[v, b], 918], [[v, c], empty]]) + Call: (28) lists:append([], [[[v, d], [v, c]]], _3698) + Unify: (28) lists:append([], [[[v, d], [v, c]]], [[[v, d], [v, c]]]) + Exit: (28) lists:append([], [[[v, d], [v, c]]], [[[v, d], [v, c]]]) + Call: (28) checkarguments([], [], [[[v, a], 947], [[v, b], 918], [[v, c], empty]], _3700, [[[v, d], [v, c]]], _3704) + Unify: (28) checkarguments([], [], [[[v, a], 947], [[v, b], 918], [[v, c], empty]], [[[v, a], 947], [[v, b], 918], [[v, c], empty]], [[[v, d], [v, c]]], [[[v, d], [v, c]]]) + Exit: (28) checkarguments([], [], [[[v, a], 947], [[v, b], 918], [[v, c], empty]], [[[v, a], 947], [[v, b], 918], [[v, c], empty]], [[[v, d], [v, c]]], [[[v, d], [v, c]]]) + Exit: (27) checkarguments([[v, d]], [[v, c]], [[[v, a], 947], [[v, b], 918]], [[[v, a], 947], [[v, b], 918], [[v, c], empty]], [], [[[v, d], [v, c]]]) + Exit: (26) checkarguments([918, [v, d]], [[v, b], [v, c]], [[[v, a], 947]], [[[v, a], 947], [[v, b], 918], [[v, c], empty]], [], [[[v, d], [v, c]]]) + Exit: (25) checkarguments([947, 918, [v, d]], [[v, a], 
[v, b], [v, c]], [], [[[v, a], 947], [[v, b], 918], [[v, c], empty]], [], [[[v, d], [v, c]]]) + Call: (25) debug(on) + Fail: (25) debug(on) + Redo: (24) member2([[n, function1], [947, 918, [v, d]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function1], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]]]]], _3700) + Call: (25) true + Exit: (25) true + Call: (25) interpretbody([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function1], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]]]]], [[[v, a], 947], [[v, b], 918], [[v, c], empty]], _3700, [[[n, +], [[v, a], [v, b], [v, c]]]], true) + Unify: (25) interpretbody([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function1], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]]]]], [[[v, a], 947], [[v, b], 918], [[v, c], empty]], _3700, [[[n, +], [[v, a], [v, b], [v, c]]]], true) + Call: (26) [[[n, +], [[v, a], [v, b], [v, c]]]]=[[_3686|_3688]|_3682] + Exit: (26) [[[n, +], [[v, a], [v, b], [v, c]]]]=[[[n, +], [[v, a], [v, b], [v, c]]]] +^ Call: (26) not(predicate_or_rule_name([n, +])) +^ Unify: (26) not(user:predicate_or_rule_name([n, +])) + Call: (27) predicate_or_rule_name([n, +]) + Unify: (27) predicate_or_rule_name([n, +]) + Exit: (27) predicate_or_rule_name([n, +]) +^ Fail: (26) not(user:predicate_or_rule_name([n, +])) + Redo: (25) interpretbody([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function1], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]]]]], [[[v, a], 947], [[v, b], 918], [[v, c], empty]], _3700, [[[n, +], 
[[v, a], [v, b], [v, c]]]], true) + Unify: (25) interpretbody([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function1], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]]]]], [[[v, a], 947], [[v, b], 918], [[v, c], empty]], _3700, [[[n, +], [[v, a], [v, b], [v, c]]]], true) + Call: (26) [[[n, +], [[v, a], [v, b], [v, c]]]]=[[[n, not], [_3710]]|_3682] + Fail: (26) [[[n, +], [[v, a], [v, b], [v, c]]]]=[[[n, not], [_3710]]|_3682] + Redo: (25) interpretbody([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function1], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]]]]], [[[v, a], 947], [[v, b], 918], [[v, c], empty]], _3700, [[[n, +], [[v, a], [v, b], [v, c]]]], true) + Unify: (25) interpretbody([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function1], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]]]]], [[[v, a], 947], [[v, b], 918], [[v, c], empty]], _3700, [[[n, +], [[v, a], [v, b], [v, c]]]], true) + Call: (26) [[[n, +], [[v, a], [v, b], [v, c]]]]=[[[n, or], [_3710, _3716]]|_3682] + Fail: (26) [[[n, +], [[v, a], [v, b], [v, c]]]]=[[[n, or], [_3710, _3716]]|_3682] + Redo: (25) interpretbody([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function1], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]]]]], [[[v, a], 947], [[v, b], 918], [[v, c], empty]], _3700, [[[n, +], [[v, a], [v, b], [v, c]]]], true) + Unify: (25) interpretbody([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", 
[[[...|...]|...]]]], [[[n, function1], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]]]]], [[[v, a], 947], [[v, b], 918], [[v, c], empty]], _3700, [[[n, +], [[v, a], [v, b], [v, c]]]], true) + Call: (26) [[[n, +], [[v, a], [v, b], [v, c]]]]=[[[n, "->"], [_3716, _3722]]|_3682] + Fail: (26) [[[n, +], [[v, a], [v, b], [v, c]]]]=[[[n, "->"], [_3716, _3722]]|_3682] + Redo: (25) interpretbody([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function1], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]]]]], [[[v, a], 947], [[v, b], 918], [[v, c], empty]], _3700, [[[n, +], [[v, a], [v, b], [v, c]]]], true) + Unify: (25) interpretbody([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function1], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]]]]], [[[v, a], 947], [[v, b], 918], [[v, c], empty]], _3700, [[[n, +], [[v, a], [v, b], [v, c]]]], true) + Call: (26) [[[n, +], [[v, a], [v, b], [v, c]]]]=[[[n, "->"], [_3716, _3722, _3728]]|_3682] + Fail: (26) [[[n, +], [[v, a], [v, b], [v, c]]]]=[[[n, "->"], [_3716, _3722, _3728]]|_3682] + Redo: (25) interpretbody([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function1], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]]]]], [[[v, a], 947], [[v, b], 918], [[v, c], empty]], _3700, [[[n, +], [[v, a], [v, b], [v, c]]]], true) + Unify: (25) interpretbody([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function1], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]]]]], [[[v, a], 947], [[v, b], 918], [[v, c], empty]], _3700, [[[n, +], [[v, a], [v, 
b], [v, c]]]], true) + Call: (26) [[[n, +], [[v, a], [v, b], [v, c]]]]=[_3680|_3682] + Exit: (26) [[[n, +], [[v, a], [v, b], [v, c]]]]=[[[n, +], [[v, a], [v, b], [v, c]]]] +^ Call: (26) not(predicate_or_rule_name([[n, +], [[v, a], [v, b], [v, c]]])) +^ Unify: (26) not(user:predicate_or_rule_name([[n, +], [[v, a], [v, b], [v|...]]])) + Call: (27) predicate_or_rule_name([[n, +], [[v, a], [v, b], [v, c]]]) + Fail: (27) predicate_or_rule_name([[n, +], [[v, a], [v, b], [v, c]]]) +^ Exit: (26) not(user:predicate_or_rule_name([[n, +], [[v, a], [v, b], [v|...]]])) + Call: (26) interpretstatement1([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function1], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]]]]], [[n, +], [[v, a], [v, b], [v, c]]], [[[v, a], 947], [[v, b], 918], [[v, c], empty]], _3718, _3720, _3722) + Unify: (26) interpretstatement1([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function1], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]]]]], [[n, +], [[v, a], [v, b], [v, c]]], [[[v, a], 947], [[v, b], 918], [[v, c], empty]], _3718, true, nocut) + Call: (27) interpretpart(isplus, [v, c], [v, a], [v, b], [[[v, a], 947], [[v, b], 918], [[v, c], empty]], _3720) + Unify: (27) interpretpart(isplus, [v, c], [v, a], [v, b], [[[v, a], 947], [[v, b], 918], [[v, c], empty]], _3720) + Call: (28) getvalues([v, c], [v, a], [v, b], _3716, _3718, _3720, [[[v, a], 947], [[v, b], 918], [[v, c], empty]]) + Unify: (28) getvalues([v, c], [v, a], [v, b], _3716, _3718, _3720, [[[v, a], 947], [[v, b], 918], [[v, c], empty]]) + Call: (29) getvalue([v, c], _3712, [[[v, a], 947], [[v, b], 918], [[v, c], empty]]) + Unify: (29) getvalue([v, c], _3712, [[[v, a], 947], [[v, b], 918], [[v, c], empty]]) +^ Call: (30) not(isvar([v, c])) +^ 
Unify: (30) not(user:isvar([v, c])) + Call: (31) isvar([v, c]) + Unify: (31) isvar([v, c]) + Exit: (31) isvar([v, c]) +^ Fail: (30) not(user:isvar([v, c])) + Redo: (29) getvalue([v, c], _3712, [[[v, a], 947], [[v, b], 918], [[v, c], empty]]) + Call: (30) isvar([v, c]) + Unify: (30) isvar([v, c]) + Exit: (30) isvar([v, c]) + Call: (30) isvalstrorundef(_3710) + Unify: (30) isvalstrorundef(_3710) + Call: (31) var(_3710) + Exit: (31) var(_3710) + Exit: (30) isvalstrorundef(_3710) + Call: (30) getvar([v, c], _3712, [[[v, a], 947], [[v, b], 918], [[v, c], empty]]) + Unify: (30) getvar([v, c], _3712, [[[v, a], 947], [[v, b], 918], [[v, c], empty]]) + Call: (31) lists:member([[v, c], _3702], [[[v, a], 947], [[v, b], 918], [[v, c], empty]]) + Unify: (31) lists:member([[v, c], _3702], [[[v, a], 947], [[v, b], 918], [[v, c], empty]]) + Exit: (31) lists:member([[v, c], empty], [[[v, a], 947], [[v, b], 918], [[v, c], empty]]) +^ Call: (31) not(empty=empty) +^ Unify: (31) not(user:(empty=empty)) +^ Fail: (31) not(user:(empty=empty)) + Redo: (30) getvar([v, c], _3712, [[[v, a], 947], [[v, b], 918], [[v, c], empty]]) + Unify: (30) getvar([v, c], empty, [[[v, a], 947], [[v, b], 918], [[v, c], empty]]) +^ Call: (31) not(member([[v, c], _3708], [[[v, a], 947], [[v, b], 918], [[v, c], empty]])) +^ Unify: (31) not(user:member([[v, c], _3708], [[[v, a], 947], [[v, b], 918], [[v, c], empty]])) +^ Fail: (31) not(user:member([[v, c], _3708], [[[v, a], 947], [[v, b], 918], [[v, c], empty]])) + Redo: (30) getvar([v, c], empty, [[[v, a], 947], [[v, b], 918], [[v, c], empty]]) + Call: (31) lists:member([[v, c], empty], [[[v, a], 947], [[v, b], 918], [[v, c], empty]]) + Unify: (31) lists:member([[v, c], empty], [[[v, a], 947], [[v, b], 918], [[v, c], empty]]) + Exit: (31) lists:member([[v, c], empty], [[[v, a], 947], [[v, b], 918], [[v, c], empty]]) + Exit: (30) getvar([v, c], empty, [[[v, a], 947], [[v, b], 918], [[v, c], empty]]) + Exit: (29) getvalue([v, c], empty, [[[v, a], 947], [[v, b], 
918], [[v, c], empty]]) + Call: (29) getvalue([v, a], _3724, [[[v, a], 947], [[v, b], 918], [[v, c], empty]]) + Unify: (29) getvalue([v, a], _3724, [[[v, a], 947], [[v, b], 918], [[v, c], empty]]) +^ Call: (30) not(isvar([v, a])) +^ Unify: (30) not(user:isvar([v, a])) + Call: (31) isvar([v, a]) + Unify: (31) isvar([v, a]) + Exit: (31) isvar([v, a]) +^ Fail: (30) not(user:isvar([v, a])) + Redo: (29) getvalue([v, a], _3724, [[[v, a], 947], [[v, b], 918], [[v, c], empty]]) + Call: (30) isvar([v, a]) + Unify: (30) isvar([v, a]) + Exit: (30) isvar([v, a]) + Call: (30) isvalstrorundef(_3722) + Unify: (30) isvalstrorundef(_3722) + Call: (31) var(_3722) + Exit: (31) var(_3722) + Exit: (30) isvalstrorundef(_3722) + Call: (30) getvar([v, a], _3724, [[[v, a], 947], [[v, b], 918], [[v, c], empty]]) + Unify: (30) getvar([v, a], _3724, [[[v, a], 947], [[v, b], 918], [[v, c], empty]]) + Call: (31) lists:member([[v, a], _3714], [[[v, a], 947], [[v, b], 918], [[v, c], empty]]) + Unify: (31) lists:member([[v, a], _3714], [[[v, a], 947], [[v, b], 918], [[v, c], empty]]) + Exit: (31) lists:member([[v, a], 947], [[[v, a], 947], [[v, b], 918], [[v, c], empty]]) +^ Call: (31) not(947=empty) +^ Unify: (31) not(user:(947=empty)) +^ Exit: (31) not(user:(947=empty)) + Exit: (30) getvar([v, a], 947, [[[v, a], 947], [[v, b], 918], [[v, c], empty]]) + Exit: (29) getvalue([v, a], 947, [[[v, a], 947], [[v, b], 918], [[v, c], empty]]) + Call: (29) getvalue([v, b], _3748, [[[v, a], 947], [[v, b], 918], [[v, c], empty]]) + Unify: (29) getvalue([v, b], _3748, [[[v, a], 947], [[v, b], 918], [[v, c], empty]]) +^ Call: (30) not(isvar([v, b])) +^ Unify: (30) not(user:isvar([v, b])) + Call: (31) isvar([v, b]) + Unify: (31) isvar([v, b]) + Exit: (31) isvar([v, b]) +^ Fail: (30) not(user:isvar([v, b])) + Redo: (29) getvalue([v, b], _3748, [[[v, a], 947], [[v, b], 918], [[v, c], empty]]) + Call: (30) isvar([v, b]) + Unify: (30) isvar([v, b]) + Exit: (30) isvar([v, b]) + Call: (30) isvalstrorundef(_3746) + 
Unify: (30) isvalstrorundef(_3746) + Call: (31) var(_3746) + Exit: (31) var(_3746) + Exit: (30) isvalstrorundef(_3746) + Call: (30) getvar([v, b], _3748, [[[v, a], 947], [[v, b], 918], [[v, c], empty]]) + Unify: (30) getvar([v, b], _3748, [[[v, a], 947], [[v, b], 918], [[v, c], empty]]) + Call: (31) lists:member([[v, b], _3738], [[[v, a], 947], [[v, b], 918], [[v, c], empty]]) + Unify: (31) lists:member([[v, b], _3738], [[[v, a], 947], [[v, b], 918], [[v, c], empty]]) + Exit: (31) lists:member([[v, b], 918], [[[v, a], 947], [[v, b], 918], [[v, c], empty]]) +^ Call: (31) not(918=empty) +^ Unify: (31) not(user:(918=empty)) +^ Exit: (31) not(user:(918=empty)) + Exit: (30) getvar([v, b], 918, [[[v, a], 947], [[v, b], 918], [[v, c], empty]]) + Exit: (29) getvalue([v, b], 918, [[[v, a], 947], [[v, b], 918], [[v, c], empty]]) + Exit: (28) getvalues([v, c], [v, a], [v, b], empty, 947, 918, [[[v, a], 947], [[v, b], 918], [[v, c], empty]]) + Call: (28) isvalempty(empty) + Unify: (28) isvalempty(empty) + Call: (29) isval(empty) + Unify: (29) isval(empty) + Fail: (29) isval(empty) + Redo: (28) isvalempty(empty) + Call: (29) empty=empty + Exit: (29) empty=empty + Exit: (28) isvalempty(empty) + Call: (28) isval(947) + Unify: (28) isval(947) + Exit: (28) isval(947) + Call: (28) isval(918) + Unify: (28) isval(918) + Exit: (28) isval(918) + Call: (28) _3776 is 947+918 + Exit: (28) 1865 is 947+918 + Call: (28) val1emptyorvalsequal(empty, 1865) + Unify: (28) val1emptyorvalsequal(empty, 1865) + Exit: (28) val1emptyorvalsequal(empty, 1865) + Call: (28) putvalue([v, c], 1865, [[[v, a], 947], [[v, b], 918], [[v, c], empty]], _3782) + Unify: (28) putvalue([v, c], 1865, [[[v, a], 947], [[v, b], 918], [[v, c], empty]], _3782) +^ Call: (29) not(isvar([v, c])) +^ Unify: (29) not(user:isvar([v, c])) + Call: (30) isvar([v, c]) + Unify: (30) isvar([v, c]) + Exit: (30) isvar([v, c]) +^ Fail: (29) not(user:isvar([v, c])) + Redo: (28) putvalue([v, c], 1865, [[[v, a], 947], [[v, b], 918], [[v, c], 
empty]], _3782) + Call: (29) isvar([v, c]) + Unify: (29) isvar([v, c]) + Exit: (29) isvar([v, c]) + Call: (29) isvalstrorundef(1865) + Unify: (29) isvalstrorundef(1865) + Call: (30) var(1865) + Fail: (30) var(1865) + Redo: (29) isvalstrorundef(1865) + Unify: (29) isvalstrorundef(1865) +^ Call: (30) not(var(1865)) +^ Unify: (30) not(user:var(1865)) +^ Exit: (30) not(user:var(1865)) + Call: (30) isval(1865) + Unify: (30) isval(1865) + Exit: (30) isval(1865) + Exit: (29) isvalstrorundef(1865) + Call: (29) updatevar([v, c], 1865, [[[v, a], 947], [[v, b], 918], [[v, c], empty]], _3792) + Unify: (29) updatevar([v, c], 1865, [[[v, a], 947], [[v, b], 918], [[v, c], empty]], _3792) + Call: (30) lists:member([[v, c], empty], [[[v, a], 947], [[v, b], 918], [[v, c], empty]]) + Unify: (30) lists:member([[v, c], empty], [[[v, a], 947], [[v, b], 918], [[v, c], empty]]) + Exit: (30) lists:member([[v, c], empty], [[[v, a], 947], [[v, b], 918], [[v, c], empty]]) + Call: (30) lists:delete([[[v, a], 947], [[v, b], 918], [[v, c], empty]], [[v, c], empty], _3814) + Unify: (30) lists:delete([[[v, a], 947], [[v, b], 918], [[v, c], empty]], [[v, c], empty], _3814) + Exit: (30) lists:delete([[[v, a], 947], [[v, b], 918], [[v, c], empty]], [[v, c], empty], [[[v, a], 947], [[v, b], 918]]) + Call: (30) lists:append([[[v, a], 947], [[v, b], 918]], [[[v, c], 1865]], _3844) + Unify: (30) lists:append([[[v, a], 947], [[v, b], 918]], [[[v, c], 1865]], [[[v, a], 947]|_3828]) + Exit: (30) lists:append([[[v, a], 947], [[v, b], 918]], [[[v, c], 1865]], [[[v, a], 947], [[v, b], 918], [[v, c], 1865]]) + Call: (30) true + Exit: (30) true + Call: (30) true + Exit: (30) true + Call: (30) true + Exit: (30) true + Exit: (29) updatevar([v, c], 1865, [[[v, a], 947], [[v, b], 918], [[v, c], empty]], [[[v, a], 947], [[v, b], 918], [[v, c], 1865]]) + Exit: (28) putvalue([v, c], 1865, [[[v, a], 947], [[v, b], 918], [[v, c], empty]], [[[v, a], 947], [[v, b], 918], [[v, c], 1865]]) + Call: (28) debug(on) + Fail: (28) 
debug(on) + Redo: (27) interpretpart(isplus, [v, c], [v, a], [v, b], [[[v, a], 947], [[v, b], 918], [[v, c], empty]], [[[v, a], 947], [[v, b], 918], [[v, c], 1865]]) + Call: (28) true + Exit: (28) true + Call: (28) debug(on) + Fail: (28) debug(on) + Redo: (27) interpretpart(isplus, [v, c], [v, a], [v, b], [[[v, a], 947], [[v, b], 918], [[v, c], empty]], [[[v, a], 947], [[v, b], 918], [[v, c], 1865]]) + Call: (28) true + Exit: (28) true + Exit: (27) interpretpart(isplus, [v, c], [v, a], [v, b], [[[v, a], 947], [[v, b], 918], [[v, c], empty]], [[[v, a], 947], [[v, b], 918], [[v, c], 1865]]) + Exit: (26) interpretstatement1([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function1], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]]]]], [[n, +], [[v, a], [v, b], [v, c]]], [[[v, a], 947], [[v, b], 918], [[v, c], empty]], [[[v, a], 947], [[v, b], 918], [[v, c], 1865]], true, nocut) +^ Call: (26) not(nocut=cut) +^ Unify: (26) not(user:(nocut=cut)) +^ Exit: (26) not(user:(nocut=cut)) + Call: (26) _3864=[[[n, function1], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]]]]] + Exit: (26) [[[n, function1], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]]]]]=[[[n, function1], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]]]]] + Call: (26) interpretbody([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function1], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]]]]], [[[v, a], 947], [[v, b], 918], [[v, c], 1865]], _3870, [], _3874) + Unify: (26) interpretbody([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function1], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]]]]], [[[v, a], 947], [[v, b], 
918], [[v, c], 1865]], [[[v, a], 947], [[v, b], 918], [[v, c], 1865]], [], true) + Exit: (26) interpretbody([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function1], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]]]]], [[[v, a], 947], [[v, b], 918], [[v, c], 1865]], [[[v, a], 947], [[v, b], 918], [[v, c], 1865]], [], true) + Call: (26) logicalconjunction(true, true, true) + Unify: (26) logicalconjunction(true, true, true) + Call: (27) true(true) + Unify: (27) true(true) + Exit: (27) true(true) + Call: (27) true(true) + Unify: (27) true(true) + Exit: (27) true(true) + Exit: (26) logicalconjunction(true, true, true) + Exit: (25) interpretbody([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function1], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]]]]], [[[v, a], 947], [[v, b], 918], [[v, c], empty]], [[[v, a], 947], [[v, b], 918], [[v, c], 1865]], [[[n, +], [[v, a], [v, b], [v, c]]]], true) + Call: (25) true + Exit: (25) true + Call: (25) updatevars([[[v, d], [v, c]]], [[[v, a], 947], [[v, b], 918], [[v, c], 1865]], [], _3870) + Unify: (25) updatevars([[[v, d], [v, c]]], [[[v, a], 947], [[v, b], 918], [[v, c], 1865]], [], _3870) + Call: (26) [[[v, d], [v, c]]]=[[_3856, _3862]|_3852] + Exit: (26) [[[v, d], [v, c]]]=[[[v, d], [v, c]]] + Call: (26) expressionnotatom([v, c]) + Unify: (26) expressionnotatom([v, c]) + Call: (27) isvalstrempty([v, c]) + Unify: (27) isvalstrempty([v, c]) + Call: (28) var([v, c]) + Fail: (28) var([v, c]) + Redo: (27) isvalstrempty([v, c]) + Unify: (27) isvalstrempty([v, c]) + Call: (28) isval([v, c]) + Unify: (28) isval([v, c]) + Fail: (28) isval([v, c]) + Redo: (27) isvalstrempty([v, c]) + Unify: (27) isvalstrempty([v, c]) + Fail: (27) isvalstrempty([v, c]) + Redo: (26) expressionnotatom([v, 
c]) + Unify: (26) expressionnotatom([v, c]) +^ Call: (27) not(atom([v, c])) +^ Unify: (27) not(user:atom([v, c])) +^ Exit: (27) not(user:atom([v, c])) + Call: (27) length([v, c], _3894) + Unify: (27) length([v, c], _3894) + Exit: (27) length([v, c], 2) + Call: (27) 2>=1 + Exit: (27) 2>=1 + Call: (27) expressionnotatom2([v, c]) + Unify: (27) expressionnotatom2([v, c]) + Call: (28) isvalstrempty(v) + Unify: (28) isvalstrempty(v) + Call: (29) var(v) + Fail: (29) var(v) + Redo: (28) isvalstrempty(v) + Unify: (28) isvalstrempty(v) + Call: (29) isval(v) + Unify: (29) isval(v) + Fail: (29) isval(v) + Redo: (28) isvalstrempty(v) + Unify: (28) isvalstrempty(v) + Fail: (28) isvalstrempty(v) + Fail: (27) expressionnotatom2([v, c]) + Redo: (26) expressionnotatom([v, c]) + Unify: (26) expressionnotatom([v, c]) + Call: (27) predicate_or_rule_name([v, c]) + Fail: (27) predicate_or_rule_name([v, c]) + Fail: (26) expressionnotatom([v, c]) + Redo: (25) updatevars([[[v, d], [v, c]]], [[[v, a], 947], [[v, b], 918], [[v, c], 1865]], [], _3888) + Call: (26) lists:member([[v, c], _3874], [[[v, a], 947], [[v, b], 918], [[v, c], 1865]]) + Unify: (26) lists:member([[v, c], _3874], [[[v, a], 947], [[v, b], 918], [[v, c], 1865]]) + Exit: (26) lists:member([[v, c], 1865], [[[v, a], 947], [[v, b], 918], [[v, c], 1865]]) + Call: (26) lists:append([], [[[v, d], 1865]], _3916) + Unify: (26) lists:append([], [[[v, d], 1865]], [[[v, d], 1865]]) + Exit: (26) lists:append([], [[[v, d], 1865]], [[[v, d], 1865]]) + Call: (26) updatevars([], [[[v, a], 947], [[v, b], 918], [[v, c], 1865]], [[[v, d], 1865]], _3918) + Unify: (26) updatevars([], [[[v, a], 947], [[v, b], 918], [[v, c], 1865]], [[[v, d], 1865]], [[[v, d], 1865]]) + Exit: (26) updatevars([], [[[v, a], 947], [[v, b], 918], [[v, c], 1865]], [[[v, d], 1865]], [[[v, d], 1865]]) + Exit: (25) updatevars([[[v, d], [v, c]]], [[[v, a], 947], [[v, b], 918], [[v, c], 1865]], [], [[[v, d], 1865]]) +^ Call: (25) not([[[v, d], 1865]]=[]) +^ Unify: (25) 
not(user:([[[v, d], 1865]]=[])) +^ Exit: (25) not(user:([[[v, d], 1865]]=[])) + Call: (25) unique1([[[v, d], 1865]], [], _3928) + Unify: (25) unique1([[[v, d], 1865]], [], _3928) + Call: (26) lists:delete([], [[v, d], 1865], _3928) + Unify: (26) lists:delete([], [[v, d], 1865], []) + Exit: (26) lists:delete([], [[v, d], 1865], []) + Call: (26) lists:append([], [[[v, d], 1865]], _3934) + Unify: (26) lists:append([], [[[v, d], 1865]], [[[v, d], 1865]]) + Exit: (26) lists:append([], [[[v, d], 1865]], [[[v, d], 1865]]) + Call: (26) unique1([], [[[v, d], 1865]], _3934) + Unify: (26) unique1([], [[[v, d], 1865]], [[[v, d], 1865]]) + Exit: (26) unique1([], [[[v, d], 1865]], [[[v, d], 1865]]) + Exit: (25) unique1([[[v, d], 1865]], [], [[[v, d], 1865]]) + Call: (25) findresult3([947, 918, [v, d]], [[[v, d], 1865]], [], _3936) + Unify: (25) findresult3([947, 918, [v, d]], [[[v, d], 1865]], [], _3936) + Call: (26) [947, 918, [v, d]]=[_3916|_3918] + Exit: (26) [947, 918, [v, d]]=[947, 918, [v, d]] + Call: (26) expressionnotatom3(947) + Unify: (26) expressionnotatom3(947) + Call: (27) expressionnotatom(947) + Unify: (27) expressionnotatom(947) + Call: (28) isvalstrempty(947) + Unify: (28) isvalstrempty(947) + Call: (29) var(947) + Fail: (29) var(947) + Redo: (28) isvalstrempty(947) + Unify: (28) isvalstrempty(947) + Call: (29) isval(947) + Unify: (29) isval(947) + Exit: (29) isval(947) + Exit: (28) isvalstrempty(947) + Exit: (27) expressionnotatom(947) + Exit: (26) expressionnotatom3(947) + Call: (26) lists:append([], [947], _3946) + Unify: (26) lists:append([], [947], [947]) + Exit: (26) lists:append([], [947], [947]) + Call: (26) findresult3([918, [v, d]], [[[v, d], 1865]], [947], _3948) + Unify: (26) findresult3([918, [v, d]], [[[v, d], 1865]], [947], _3948) + Call: (27) [918, [v, d]]=[_3928|_3930] + Exit: (27) [918, [v, d]]=[918, [v, d]] + Call: (27) expressionnotatom3(918) + Unify: (27) expressionnotatom3(918) + Call: (28) expressionnotatom(918) + Unify: (28) 
expressionnotatom(918) + Call: (29) isvalstrempty(918) + Unify: (29) isvalstrempty(918) + Call: (30) var(918) + Fail: (30) var(918) + Redo: (29) isvalstrempty(918) + Unify: (29) isvalstrempty(918) + Call: (30) isval(918) + Unify: (30) isval(918) + Exit: (30) isval(918) + Exit: (29) isvalstrempty(918) + Exit: (28) expressionnotatom(918) + Exit: (27) expressionnotatom3(918) + Call: (27) lists:append([947], [918], _3958) + Unify: (27) lists:append([947], [918], [947|_3942]) + Exit: (27) lists:append([947], [918], [947, 918]) + Call: (27) findresult3([[v, d]], [[[v, d], 1865]], [947, 918], _3966) + Unify: (27) findresult3([[v, d]], [[[v, d], 1865]], [947, 918], _3966) + Call: (28) [[v, d]]=[_3946|_3948] + Exit: (28) [[v, d]]=[[v, d]] + Call: (28) expressionnotatom3([v, d]) + Unify: (28) expressionnotatom3([v, d]) + Call: (29) expressionnotatom([v, d]) + Unify: (29) expressionnotatom([v, d]) + Call: (30) isvalstrempty([v, d]) + Unify: (30) isvalstrempty([v, d]) + Call: (31) var([v, d]) + Fail: (31) var([v, d]) + Redo: (30) isvalstrempty([v, d]) + Unify: (30) isvalstrempty([v, d]) + Call: (31) isval([v, d]) + Unify: (31) isval([v, d]) + Fail: (31) isval([v, d]) + Redo: (30) isvalstrempty([v, d]) + Unify: (30) isvalstrempty([v, d]) + Fail: (30) isvalstrempty([v, d]) + Redo: (29) expressionnotatom([v, d]) + Unify: (29) expressionnotatom([v, d]) +^ Call: (30) not(atom([v, d])) +^ Unify: (30) not(user:atom([v, d])) +^ Exit: (30) not(user:atom([v, d])) + Call: (30) length([v, d], _3978) + Unify: (30) length([v, d], _3978) + Exit: (30) length([v, d], 2) + Call: (30) 2>=1 + Exit: (30) 2>=1 + Call: (30) expressionnotatom2([v, d]) + Unify: (30) expressionnotatom2([v, d]) + Call: (31) isvalstrempty(v) + Unify: (31) isvalstrempty(v) + Call: (32) var(v) + Fail: (32) var(v) + Redo: (31) isvalstrempty(v) + Unify: (31) isvalstrempty(v) + Call: (32) isval(v) + Unify: (32) isval(v) + Fail: (32) isval(v) + Redo: (31) isvalstrempty(v) + Unify: (31) isvalstrempty(v) + Fail: (31) 
isvalstrempty(v) + Fail: (30) expressionnotatom2([v, d]) + Redo: (29) expressionnotatom([v, d]) + Unify: (29) expressionnotatom([v, d]) + Call: (30) predicate_or_rule_name([v, d]) + Fail: (30) predicate_or_rule_name([v, d]) + Fail: (29) expressionnotatom([v, d]) + Redo: (28) expressionnotatom3([v, d]) + Unify: (28) expressionnotatom3([v, d]) +^ Call: (29) not([v, d]=[v, _3964]) +^ Unify: (29) not(user:([v, d]=[v, _3964])) +^ Fail: (29) not(user:([v, d]=[v, _3964])) + Fail: (28) expressionnotatom3([v, d]) + Redo: (27) findresult3([[v, d]], [[[v, d], 1865]], [947, 918], _3966) + Unify: (27) findresult3([[v, d]], [[[v, d], 1865]], [947, 918], _3966) + Call: (28) [[v, d]]=[_3946|_3948] + Exit: (28) [[v, d]]=[[v, d]] + Call: (28) isvar([v, d]) + Unify: (28) isvar([v, d]) + Exit: (28) isvar([v, d]) + Call: (28) lists:member([[v, d], _3958], [[[v, d], 1865]]) + Unify: (28) lists:member([[v, d], _3958], [[[v, d], 1865]]) + Exit: (28) lists:member([[v, d], 1865], [[[v, d], 1865]]) + Call: (28) lists:append([947, 918], [1865], _3988) + Unify: (28) lists:append([947, 918], [1865], [947|_3972]) + Exit: (28) lists:append([947, 918], [1865], [947, 918, 1865]) + Call: (28) findresult3([], [[[v, d], 1865]], [947, 918, 1865], _4002) + Unify: (28) findresult3([], [[[v, d], 1865]], [947, 918, 1865], [947, 918, 1865]) + Exit: (28) findresult3([], [[[v, d], 1865]], [947, 918, 1865], [947, 918, 1865]) + Exit: (27) findresult3([[v, d]], [[[v, d], 1865]], [947, 918], [947, 918, 1865]) + Exit: (26) findresult3([918, [v, d]], [[[v, d], 1865]], [947], [947, 918, 1865]) + Exit: (25) findresult3([947, 918, [v, d]], [[[v, d], 1865]], [], [947, 918, 1865]) + Call: (25) debug(on) + Fail: (25) debug(on) + Redo: (24) member2([[n, function1], [947, 918, [v, d]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function1], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]]]]], 
[[[v, d], 1865]]) + Call: (25) true + Exit: (25) true + Exit: (24) member2([[n, function1], [947, 918, [v, d]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function1], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]]]]], [[[v, d], 1865]]) + Exit: (23) member23([[n, function1], [947, 918, [v, d]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[v, d], 1865]]) + Exit: (22) member22([[n, function1], [947, 918, [v, d]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[v, d], 1865]]) + Exit: (21) member21([[n, function1], [947, 918, [v, d]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[v, d], 1865]]) + Exit: (20) member2([[n, function1], [947, 918, [v, d]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[v, d], 1865]]) + Exit: (19) 
interpret2([[n, function1], [947, 918, [v, d]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[v, d], 1865]]) + Call: (19) updatevars2([[v, d]], [[[v, d], 1865]], [], _4002) + Unify: (19) updatevars2([[v, d]], [[[v, d], 1865]], [], _4002) + Call: (20) [[[v, d], 1865]]=[[_3988, _3994]|_3984] + Exit: (20) [[[v, d], 1865]]=[[[v, d], 1865]] + Call: (20) lists:member([v, d], [[v, d]]) + Unify: (20) lists:member([v, d], [[v, d]]) + Exit: (20) lists:member([v, d], [[v, d]]) + Call: (20) lists:append([], [[[v, d], 1865]], _4036) + Unify: (20) lists:append([], [[[v, d], 1865]], [[[v, d], 1865]]) + Exit: (20) lists:append([], [[[v, d], 1865]], [[[v, d], 1865]]) + Call: (20) updatevars2([[v, d]], [], [[[v, d], 1865]], _4038) + Unify: (20) updatevars2([[v, d]], [], [[[v, d], 1865]], [[[v, d], 1865]]) + Exit: (20) updatevars2([[v, d]], [], [[[v, d], 1865]], [[[v, d], 1865]]) + Exit: (19) updatevars2([[v, d]], [[[v, d], 1865]], [], [[[v, d], 1865]]) + Call: (19) updatevars3([[[v, a], 947], [[v, b], 918], [[v, c], empty]], [[[v, d], 1865]], _4036) + Unify: (19) updatevars3([[[v, a], 947], [[v, b], 918], [[v, c], empty]], [[[v, d], 1865]], _4036) + Call: (20) [[[v, d], 1865]]=[[_4024, _4030]|_4020] + Exit: (20) [[[v, d], 1865]]=[[[v, d], 1865]] + Call: (20) lists:delete([[[v, a], 947], [[v, b], 918], [[v, c], empty]], [[v, d], empty], _4066) + Unify: (20) lists:delete([[[v, a], 947], [[v, b], 918], [[v, c], empty]], [[v, d], empty], _4066) + Exit: (20) lists:delete([[[v, a], 947], [[v, b], 918], [[v, c], empty]], [[v, d], empty], [[[v, a], 947], [[v, b], 918], [[v, c], empty]]) + Call: (20) lists:append([[[v, a], 947], [[v, b], 918], [[v, c], empty]], [[[v, d], 1865]], _4102) + Unify: (20) 
lists:append([[[v, a], 947], [[v, b], 918], [[v, c], empty]], [[[v, d], 1865]], [[[v, a], 947]|_4086]) + Exit: (20) lists:append([[[v, a], 947], [[v, b], 918], [[v, c], empty]], [[[v, d], 1865]], [[[v, a], 947], [[v, b], 918], [[v, c], empty], [[v, d], 1865]]) + Call: (20) updatevars3([[[v, a], 947], [[v, b], 918], [[v, c], empty], [[v, d], 1865]], [], _4120) + Unify: (20) updatevars3([[[v, a], 947], [[v, b], 918], [[v, c], empty], [[v, d], 1865]], [], [[[v, a], 947], [[v, b], 918], [[v, c], empty], [[v, d], 1865]]) + Exit: (20) updatevars3([[[v, a], 947], [[v, b], 918], [[v, c], empty], [[v, d], 1865]], [], [[[v, a], 947], [[v, b], 918], [[v, c], empty], [[v, d], 1865]]) + Exit: (19) updatevars3([[[v, a], 947], [[v, b], 918], [[v, c], empty]], [[[v, d], 1865]], [[[v, a], 947], [[v, b], 918], [[v, c], empty], [[v, d], 1865]]) + Call: (19) reverse([[[v, a], 947], [[v, b], 918], [[v, c], empty], [[v, d], 1865]], [], _4120) + Unify: (19) reverse([[[v, a], 947], [[v, b], 918], [[v, c], empty], [[v, d], 1865]], [], _4120) + Call: (20) [[[v, a], 947], [[v, b], 918], [[v, c], empty], [[v, d], 1865]]=[_4102|_4104] + Exit: (20) [[[v, a], 947], [[v, b], 918], [[v, c], empty], [[v, d], 1865]]=[[[v, a], 947], [[v, b], 918], [[v, c], empty], [[v, d], 1865]] + Call: (20) lists:append([[[v, a], 947]], [], _4132) + Unify: (20) lists:append([[[v, a], 947]], [], [[[v, a], 947]|_4116]) + Exit: (20) lists:append([[[v, a], 947]], [], [[[v, a], 947]]) + Call: (20) reverse([[[v, b], 918], [[v, c], empty], [[v, d], 1865]], [[[v, a], 947]], _4138) + Unify: (20) reverse([[[v, b], 918], [[v, c], empty], [[v, d], 1865]], [[[v, a], 947]], _4138) + Call: (21) [[[v, b], 918], [[v, c], empty], [[v, d], 1865]]=[_4120|_4122] + Exit: (21) [[[v, b], 918], [[v, c], empty], [[v, d], 1865]]=[[[v, b], 918], [[v, c], empty], [[v, d], 1865]] + Call: (21) lists:append([[[v, b], 918]], [[[v, a], 947]], _4150) + Unify: (21) lists:append([[[v, b], 918]], [[[v, a], 947]], [[[v, b], 918]|_4134]) + Exit: (21) 
lists:append([[[v, b], 918]], [[[v, a], 947]], [[[v, b], 918], [[v, a], 947]]) + Call: (21) reverse([[[v, c], empty], [[v, d], 1865]], [[[v, b], 918], [[v, a], 947]], _4156) + Unify: (21) reverse([[[v, c], empty], [[v, d], 1865]], [[[v, b], 918], [[v, a], 947]], _4156) + Call: (22) [[[v, c], empty], [[v, d], 1865]]=[_4138|_4140] + Exit: (22) [[[v, c], empty], [[v, d], 1865]]=[[[v, c], empty], [[v, d], 1865]] + Call: (22) lists:append([[[v, c], empty]], [[[v, b], 918], [[v, a], 947]], _4168) + Unify: (22) lists:append([[[v, c], empty]], [[[v, b], 918], [[v, a], 947]], [[[v, c], empty]|_4152]) + Exit: (22) lists:append([[[v, c], empty]], [[[v, b], 918], [[v, a], 947]], [[[v, c], empty], [[v, b], 918], [[v, a], 947]]) + Call: (22) reverse([[[v, d], 1865]], [[[v, c], empty], [[v, b], 918], [[v, a], 947]], _4174) + Unify: (22) reverse([[[v, d], 1865]], [[[v, c], empty], [[v, b], 918], [[v, a], 947]], _4174) + Call: (23) [[[v, d], 1865]]=[_4156|_4158] + Exit: (23) [[[v, d], 1865]]=[[[v, d], 1865]] + Call: (23) lists:append([[[v, d], 1865]], [[[v, c], empty], [[v, b], 918], [[v, a], 947]], _4186) + Unify: (23) lists:append([[[v, d], 1865]], [[[v, c], empty], [[v, b], 918], [[v, a], 947]], [[[v, d], 1865]|_4170]) + Exit: (23) lists:append([[[v, d], 1865]], [[[v, c], empty], [[v, b], 918], [[v, a], 947]], [[[v, d], 1865], [[v, c], empty], [[v, b], 918], [[v, a], 947]]) + Call: (23) reverse([], [[[v, d], 1865], [[v, c], empty], [[v, b], 918], [[v, a], 947]], _4192) + Unify: (23) reverse([], [[[v, d], 1865], [[v, c], empty], [[v, b], 918], [[v, a], 947]], [[[v, d], 1865], [[v, c], empty], [[v, b], 918], [[v, a], 947]]) + Exit: (23) reverse([], [[[v, d], 1865], [[v, c], empty], [[v, b], 918], [[v, a], 947]], [[[v, d], 1865], [[v, c], empty], [[v, b], 918], [[v, a], 947]]) + Exit: (22) reverse([[[v, d], 1865]], [[[v, c], empty], [[v, b], 918], [[v, a], 947]], [[[v, d], 1865], [[v, c], empty], [[v, b], 918], [[v, a], 947]]) + Exit: (21) reverse([[[v, c], empty], [[v, d], 1865]], 
[[[v, b], 918], [[v, a], 947]], [[[v, d], 1865], [[v, c], empty], [[v, b], 918], [[v, a], 947]]) + Exit: (20) reverse([[[v, b], 918], [[v, c], empty], [[v, d], 1865]], [[[v, a], 947]], [[[v, d], 1865], [[v, c], empty], [[v, b], 918], [[v, a], 947]]) + Exit: (19) reverse([[[v, a], 947], [[v, b], 918], [[v, c], empty], [[v, d], 1865]], [], [[[v, d], 1865], [[v, c], empty], [[v, b], 918], [[v, a], 947]]) +^ Call: (19) not([[[v, d], 1865], [[v, c], empty], [[v, b], 918], [[v, a], 947]]=[]) +^ Unify: (19) not(user:([[[v, d], 1865], [[v, c], empty], [[v, b], 918], [[v|...], 947]]=[])) +^ Exit: (19) not(user:([[[v, d], 1865], [[v, c], empty], [[v, b], 918], [[v|...], 947]]=[])) + Call: (19) unique1([[[v, d], 1865], [[v, c], empty], [[v, b], 918], [[v, a], 947]], [], _4204) + Unify: (19) unique1([[[v, d], 1865], [[v, c], empty], [[v, b], 918], [[v, a], 947]], [], _4204) + Call: (20) lists:delete([[[v, c], empty], [[v, b], 918], [[v, a], 947]], [[v, d], 1865], _4204) + Unify: (20) lists:delete([[[v, c], empty], [[v, b], 918], [[v, a], 947]], [[v, d], 1865], _4204) + Exit: (20) lists:delete([[[v, c], empty], [[v, b], 918], [[v, a], 947]], [[v, d], 1865], [[[v, c], empty], [[v, b], 918], [[v, a], 947]]) + Call: (20) lists:append([], [[[v, d], 1865]], _4228) + Unify: (20) lists:append([], [[[v, d], 1865]], [[[v, d], 1865]]) + Exit: (20) lists:append([], [[[v, d], 1865]], [[[v, d], 1865]]) + Call: (20) unique1([[[v, c], empty], [[v, b], 918], [[v, a], 947]], [[[v, d], 1865]], _4228) + Unify: (20) unique1([[[v, c], empty], [[v, b], 918], [[v, a], 947]], [[[v, d], 1865]], _4228) + Call: (21) lists:delete([[[v, b], 918], [[v, a], 947]], [[v, c], empty], _4228) + Unify: (21) lists:delete([[[v, b], 918], [[v, a], 947]], [[v, c], empty], _4228) + Exit: (21) lists:delete([[[v, b], 918], [[v, a], 947]], [[v, c], empty], [[[v, b], 918], [[v, a], 947]]) + Call: (21) lists:append([[[v, d], 1865]], [[[v, c], empty]], _4246) + Unify: (21) lists:append([[[v, d], 1865]], [[[v, c], empty]], 
[[[v, d], 1865]|_4230]) + Exit: (21) lists:append([[[v, d], 1865]], [[[v, c], empty]], [[[v, d], 1865], [[v, c], empty]]) + Call: (21) unique1([[[v, b], 918], [[v, a], 947]], [[[v, d], 1865], [[v, c], empty]], _4252) + Unify: (21) unique1([[[v, b], 918], [[v, a], 947]], [[[v, d], 1865], [[v, c], empty]], _4252) + Call: (22) lists:delete([[[v, a], 947]], [[v, b], 918], _4252) + Unify: (22) lists:delete([[[v, a], 947]], [[v, b], 918], _4252) + Exit: (22) lists:delete([[[v, a], 947]], [[v, b], 918], [[[v, a], 947]]) + Call: (22) lists:append([[[v, d], 1865], [[v, c], empty]], [[[v, b], 918]], _4264) + Unify: (22) lists:append([[[v, d], 1865], [[v, c], empty]], [[[v, b], 918]], [[[v, d], 1865]|_4248]) + Exit: (22) lists:append([[[v, d], 1865], [[v, c], empty]], [[[v, b], 918]], [[[v, d], 1865], [[v, c], empty], [[v, b], 918]]) + Call: (22) unique1([[[v, a], 947]], [[[v, d], 1865], [[v, c], empty], [[v, b], 918]], _4276) + Unify: (22) unique1([[[v, a], 947]], [[[v, d], 1865], [[v, c], empty], [[v, b], 918]], _4276) + Call: (23) lists:delete([], [[v, a], 947], _4276) + Unify: (23) lists:delete([], [[v, a], 947], []) + Exit: (23) lists:delete([], [[v, a], 947], []) + Call: (23) lists:append([[[v, d], 1865], [[v, c], empty], [[v, b], 918]], [[[v, a], 947]], _4282) + Unify: (23) lists:append([[[v, d], 1865], [[v, c], empty], [[v, b], 918]], [[[v, a], 947]], [[[v, d], 1865]|_4266]) + Exit: (23) lists:append([[[v, d], 1865], [[v, c], empty], [[v, b], 918]], [[[v, a], 947]], [[[v, d], 1865], [[v, c], empty], [[v, b], 918], [[v, a], 947]]) + Call: (23) unique1([], [[[v, d], 1865], [[v, c], empty], [[v, b], 918], [[v, a], 947]], _4300) + Unify: (23) unique1([], [[[v, d], 1865], [[v, c], empty], [[v, b], 918], [[v, a], 947]], [[[v, d], 1865], [[v, c], empty], [[v, b], 918], [[v, a], 947]]) + Exit: (23) unique1([], [[[v, d], 1865], [[v, c], empty], [[v, b], 918], [[v, a], 947]], [[[v, d], 1865], [[v, c], empty], [[v, b], 918], [[v, a], 947]]) + Exit: (22) unique1([[[v, a], 947]], 
[[[v, d], 1865], [[v, c], empty], [[v, b], 918]], [[[v, d], 1865], [[v, c], empty], [[v, b], 918], [[v, a], 947]]) + Exit: (21) unique1([[[v, b], 918], [[v, a], 947]], [[[v, d], 1865], [[v, c], empty]], [[[v, d], 1865], [[v, c], empty], [[v, b], 918], [[v, a], 947]]) + Exit: (20) unique1([[[v, c], empty], [[v, b], 918], [[v, a], 947]], [[[v, d], 1865]], [[[v, d], 1865], [[v, c], empty], [[v, b], 918], [[v, a], 947]]) + Exit: (19) unique1([[[v, d], 1865], [[v, c], empty], [[v, b], 918], [[v, a], 947]], [], [[[v, d], 1865], [[v, c], empty], [[v, b], 918], [[v, a], 947]]) + Exit: (18) interpretstatement1([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[n, function1], [[v, a], [v, b], [v, d]]], [[[v, a], 947], [[v, b], 918], [[v, c], empty]], [[[v, d], 1865], [[v, c], empty], [[v, b], 918], [[v, a], 947]], true, nocut) +^ Call: (18) not(nocut=cut) +^ Unify: (18) not(user:(nocut=cut)) +^ Exit: (18) not(user:(nocut=cut)) + Call: (18) _4308=[[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]] + Exit: (18) [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]]=[[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]] + Call: (18) interpretbody([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], 
[[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[v, d], 1865], [[v, c], empty], [[v, b], 918], [[v, a], 947]], _4314, [[[n, =], [[v, c], [v, d]]]], _4318) + Unify: (18) interpretbody([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[v, d], 1865], [[v, c], empty], [[v, b], 918], [[v, a], 947]], _4314, [[[n, =], [[v, c], [v, d]]]], _4318) + Call: (19) [[[n, =], [[v, c], [v, d]]]]=[[_4300|_4302]|_4296] + Exit: (19) [[[n, =], [[v, c], [v, d]]]]=[[[n, =], [[v, c], [v, d]]]] +^ Call: (19) not(predicate_or_rule_name([n, =])) +^ Unify: (19) not(user:predicate_or_rule_name([n, =])) + Call: (20) predicate_or_rule_name([n, =]) + Unify: (20) predicate_or_rule_name([n, =]) + Exit: (20) predicate_or_rule_name([n, =]) +^ Fail: (19) not(user:predicate_or_rule_name([n, =])) + Redo: (18) interpretbody([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[v, d], 1865], [[v, c], empty], [[v, b], 918], [[v, a], 947]], _4314, [[[n, =], [[v, c], [v, d]]]], _4318) + Unify: (18) interpretbody([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[v, d], 1865], [[v, c], empty], [[v, b], 918], [[v, a], 947]], _4314, [[[n, =], [[v, c], 
[v, d]]]], _4318) + Call: (19) [[[n, =], [[v, c], [v, d]]]]=[[[n, not], [_4324]]|_4296] + Fail: (19) [[[n, =], [[v, c], [v, d]]]]=[[[n, not], [_4324]]|_4296] + Redo: (18) interpretbody([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[v, d], 1865], [[v, c], empty], [[v, b], 918], [[v, a], 947]], _4314, [[[n, =], [[v, c], [v, d]]]], _4318) + Unify: (18) interpretbody([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[v, d], 1865], [[v, c], empty], [[v, b], 918], [[v, a], 947]], _4314, [[[n, =], [[v, c], [v, d]]]], _4318) + Call: (19) [[[n, =], [[v, c], [v, d]]]]=[[[n, or], [_4324, _4330]]|_4296] + Fail: (19) [[[n, =], [[v, c], [v, d]]]]=[[[n, or], [_4324, _4330]]|_4296] + Redo: (18) interpretbody([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[v, d], 1865], [[v, c], empty], [[v, b], 918], [[v, a], 947]], _4314, [[[n, =], [[v, c], [v, d]]]], _4318) + Unify: (18) interpretbody([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], 
[[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[v, d], 1865], [[v, c], empty], [[v, b], 918], [[v, a], 947]], _4314, [[[n, =], [[v, c], [v, d]]]], _4318) + Call: (19) [[[n, =], [[v, c], [v, d]]]]=[[[n, "->"], [_4330, _4336]]|_4296] + Fail: (19) [[[n, =], [[v, c], [v, d]]]]=[[[n, "->"], [_4330, _4336]]|_4296] + Redo: (18) interpretbody([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[v, d], 1865], [[v, c], empty], [[v, b], 918], [[v, a], 947]], _4314, [[[n, =], [[v, c], [v, d]]]], _4318) + Unify: (18) interpretbody([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[v, d], 1865], [[v, c], empty], [[v, b], 918], [[v, a], 947]], _4314, [[[n, =], [[v, c], [v, d]]]], _4318) + Call: (19) [[[n, =], [[v, c], [v, d]]]]=[[[n, "->"], [_4330, _4336, _4342]]|_4296] + Fail: (19) [[[n, =], [[v, c], [v, d]]]]=[[[n, "->"], [_4330, _4336, _4342]]|_4296] + Redo: (18) interpretbody([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[v, d], 1865], [[v, c], empty], [[v, b], 918], [[v, a], 947]], _4314, [[[n, =], [[v, c], [v, d]]]], _4318) + Unify: (18) interpretbody([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], 
[[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[v, d], 1865], [[v, c], empty], [[v, b], 918], [[v, a], 947]], _4314, [[[n, =], [[v, c], [v, d]]]], _4318) + Call: (19) [[[n, =], [[v, c], [v, d]]]]=[_4294|_4296] + Exit: (19) [[[n, =], [[v, c], [v, d]]]]=[[[n, =], [[v, c], [v, d]]]] +^ Call: (19) not(predicate_or_rule_name([[n, =], [[v, c], [v, d]]])) +^ Unify: (19) not(user:predicate_or_rule_name([[n, =], [[v, c], [v, d]]])) + Call: (20) predicate_or_rule_name([[n, =], [[v, c], [v, d]]]) + Fail: (20) predicate_or_rule_name([[n, =], [[v, c], [v, d]]]) +^ Exit: (19) not(user:predicate_or_rule_name([[n, =], [[v, c], [v, d]]])) + Call: (19) interpretstatement1([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[n, =], [[v, c], [v, d]]], [[[v, d], 1865], [[v, c], empty], [[v, b], 918], [[v, a], 947]], _4332, _4334, _4336) + Unify: (19) interpretstatement1([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[n, =], [[v, c], [v, d]]], [[[v, d], 1865], [[v, c], empty], [[v, b], 918], [[v, a], 947]], _4332, true, nocut) + Call: (20) isop(=) + Unify: (20) isop(=) + Exit: (20) isop(=) + Call: (20) interpretpart(is, [v, c], [v, d], [[[v, d], 1865], [[v, c], empty], [[v, b], 918], [[v, a], 947]], _4332) + Unify: (20) interpretpart(is, [v, c], [v, d], 
[[[v, d], 1865], [[v, c], empty], [[v, b], 918], [[v, a], 947]], _4332) + Call: (21) getvalues([v, c], [v, d], _4328, _4330, [[[v, d], 1865], [[v, c], empty], [[v, b], 918], [[v, a], 947]]) + Unify: (21) getvalues([v, c], [v, d], _4328, _4330, [[[v, d], 1865], [[v, c], empty], [[v, b], 918], [[v, a], 947]]) + Call: (22) getvalue([v, c], _4326, [[[v, d], 1865], [[v, c], empty], [[v, b], 918], [[v, a], 947]]) + Unify: (22) getvalue([v, c], _4326, [[[v, d], 1865], [[v, c], empty], [[v, b], 918], [[v, a], 947]]) +^ Call: (23) not(isvar([v, c])) +^ Unify: (23) not(user:isvar([v, c])) + Call: (24) isvar([v, c]) + Unify: (24) isvar([v, c]) + Exit: (24) isvar([v, c]) +^ Fail: (23) not(user:isvar([v, c])) + Redo: (22) getvalue([v, c], _4326, [[[v, d], 1865], [[v, c], empty], [[v, b], 918], [[v, a], 947]]) + Call: (23) isvar([v, c]) + Unify: (23) isvar([v, c]) + Exit: (23) isvar([v, c]) + Call: (23) isvalstrorundef(_4324) + Unify: (23) isvalstrorundef(_4324) + Call: (24) var(_4324) + Exit: (24) var(_4324) + Exit: (23) isvalstrorundef(_4324) + Call: (23) getvar([v, c], _4326, [[[v, d], 1865], [[v, c], empty], [[v, b], 918], [[v, a], 947]]) + Unify: (23) getvar([v, c], _4326, [[[v, d], 1865], [[v, c], empty], [[v, b], 918], [[v, a], 947]]) + Call: (24) lists:member([[v, c], _4316], [[[v, d], 1865], [[v, c], empty], [[v, b], 918], [[v, a], 947]]) + Unify: (24) lists:member([[v, c], _4316], [[[v, d], 1865], [[v, c], empty], [[v, b], 918], [[v, a], 947]]) + Exit: (24) lists:member([[v, c], empty], [[[v, d], 1865], [[v, c], empty], [[v, b], 918], [[v, a], 947]]) +^ Call: (24) not(empty=empty) +^ Unify: (24) not(user:(empty=empty)) +^ Fail: (24) not(user:(empty=empty)) + Redo: (24) lists:member([[v, c], _4316], [[[v, d], 1865], [[v, c], empty], [[v, b], 918], [[v, a], 947]]) + Fail: (24) lists:member([[v, c], _4316], [[[v, d], 1865], [[v, c], empty], [[v, b], 918], [[v, a], 947]]) + Redo: (23) getvar([v, c], _4326, [[[v, d], 1865], [[v, c], empty], [[v, b], 918], [[v, a], 947]]) + 
Unify: (23) getvar([v, c], empty, [[[v, d], 1865], [[v, c], empty], [[v, b], 918], [[v, a], 947]]) +^ Call: (24) not(member([[v, c], _4322], [[[v, d], 1865], [[v, c], empty], [[v, b], 918], [[v, a], 947]])) +^ Unify: (24) not(user:member([[v, c], _4322], [[[v, d], 1865], [[v, c], empty], [[v, b], 918], [[v|...], 947]])) +^ Fail: (24) not(user:member([[v, c], _4322], [[[v, d], 1865], [[v, c], empty], [[v, b], 918], [[v|...], 947]])) + Redo: (23) getvar([v, c], empty, [[[v, d], 1865], [[v, c], empty], [[v, b], 918], [[v, a], 947]]) + Call: (24) lists:member([[v, c], empty], [[[v, d], 1865], [[v, c], empty], [[v, b], 918], [[v, a], 947]]) + Unify: (24) lists:member([[v, c], empty], [[[v, d], 1865], [[v, c], empty], [[v, b], 918], [[v, a], 947]]) + Exit: (24) lists:member([[v, c], empty], [[[v, d], 1865], [[v, c], empty], [[v, b], 918], [[v, a], 947]]) + Exit: (23) getvar([v, c], empty, [[[v, d], 1865], [[v, c], empty], [[v, b], 918], [[v, a], 947]]) + Exit: (22) getvalue([v, c], empty, [[[v, d], 1865], [[v, c], empty], [[v, b], 918], [[v, a], 947]]) + Call: (22) getvalue([v, d], _4338, [[[v, d], 1865], [[v, c], empty], [[v, b], 918], [[v, a], 947]]) + Unify: (22) getvalue([v, d], _4338, [[[v, d], 1865], [[v, c], empty], [[v, b], 918], [[v, a], 947]]) +^ Call: (23) not(isvar([v, d])) +^ Unify: (23) not(user:isvar([v, d])) + Call: (24) isvar([v, d]) + Unify: (24) isvar([v, d]) + Exit: (24) isvar([v, d]) +^ Fail: (23) not(user:isvar([v, d])) + Redo: (22) getvalue([v, d], _4338, [[[v, d], 1865], [[v, c], empty], [[v, b], 918], [[v, a], 947]]) + Call: (23) isvar([v, d]) + Unify: (23) isvar([v, d]) + Exit: (23) isvar([v, d]) + Call: (23) isvalstrorundef(_4336) + Unify: (23) isvalstrorundef(_4336) + Call: (24) var(_4336) + Exit: (24) var(_4336) + Exit: (23) isvalstrorundef(_4336) + Call: (23) getvar([v, d], _4338, [[[v, d], 1865], [[v, c], empty], [[v, b], 918], [[v, a], 947]]) + Unify: (23) getvar([v, d], _4338, [[[v, d], 1865], [[v, c], empty], [[v, b], 918], [[v, a], 
947]]) + Call: (24) lists:member([[v, d], _4328], [[[v, d], 1865], [[v, c], empty], [[v, b], 918], [[v, a], 947]]) + Unify: (24) lists:member([[v, d], _4328], [[[v, d], 1865], [[v, c], empty], [[v, b], 918], [[v, a], 947]]) + Exit: (24) lists:member([[v, d], 1865], [[[v, d], 1865], [[v, c], empty], [[v, b], 918], [[v, a], 947]]) +^ Call: (24) not(1865=empty) +^ Unify: (24) not(user:(1865=empty)) +^ Exit: (24) not(user:(1865=empty)) + Exit: (23) getvar([v, d], 1865, [[[v, d], 1865], [[v, c], empty], [[v, b], 918], [[v, a], 947]]) + Exit: (22) getvalue([v, d], 1865, [[[v, d], 1865], [[v, c], empty], [[v, b], 918], [[v, a], 947]]) + Exit: (21) getvalues([v, c], [v, d], empty, 1865, [[[v, d], 1865], [[v, c], empty], [[v, b], 918], [[v, a], 947]]) + Call: (21) expression(empty) + Unify: (21) expression(empty) + Exit: (21) expression(empty) + Call: (21) expression(1865) + Unify: (21) expression(1865) + Call: (22) isval(1865) + Unify: (22) isval(1865) + Exit: (22) isval(1865) + Exit: (21) expression(1865) + Call: (21) val1emptyorvalsequal(empty, 1865) + Unify: (21) val1emptyorvalsequal(empty, 1865) + Exit: (21) val1emptyorvalsequal(empty, 1865) + Call: (21) putvalue([v, c], 1865, [[[v, d], 1865], [[v, c], empty], [[v, b], 918], [[v, a], 947]], _4366) + Unify: (21) putvalue([v, c], 1865, [[[v, d], 1865], [[v, c], empty], [[v, b], 918], [[v, a], 947]], _4366) +^ Call: (22) not(isvar([v, c])) +^ Unify: (22) not(user:isvar([v, c])) + Call: (23) isvar([v, c]) + Unify: (23) isvar([v, c]) + Exit: (23) isvar([v, c]) +^ Fail: (22) not(user:isvar([v, c])) + Redo: (21) putvalue([v, c], 1865, [[[v, d], 1865], [[v, c], empty], [[v, b], 918], [[v, a], 947]], _4366) + Call: (22) isvar([v, c]) + Unify: (22) isvar([v, c]) + Exit: (22) isvar([v, c]) + Call: (22) isvalstrorundef(1865) + Unify: (22) isvalstrorundef(1865) + Call: (23) var(1865) + Fail: (23) var(1865) + Redo: (22) isvalstrorundef(1865) + Unify: (22) isvalstrorundef(1865) +^ Call: (23) not(var(1865)) +^ Unify: (23) 
not(user:var(1865)) +^ Exit: (23) not(user:var(1865)) + Call: (23) isval(1865) + Unify: (23) isval(1865) + Exit: (23) isval(1865) + Exit: (22) isvalstrorundef(1865) + Call: (22) updatevar([v, c], 1865, [[[v, d], 1865], [[v, c], empty], [[v, b], 918], [[v, a], 947]], _4376) + Unify: (22) updatevar([v, c], 1865, [[[v, d], 1865], [[v, c], empty], [[v, b], 918], [[v, a], 947]], _4376) + Call: (23) lists:member([[v, c], empty], [[[v, d], 1865], [[v, c], empty], [[v, b], 918], [[v, a], 947]]) + Unify: (23) lists:member([[v, c], empty], [[[v, d], 1865], [[v, c], empty], [[v, b], 918], [[v, a], 947]]) + Exit: (23) lists:member([[v, c], empty], [[[v, d], 1865], [[v, c], empty], [[v, b], 918], [[v, a], 947]]) + Call: (23) lists:delete([[[v, d], 1865], [[v, c], empty], [[v, b], 918], [[v, a], 947]], [[v, c], empty], _4398) + Unify: (23) lists:delete([[[v, d], 1865], [[v, c], empty], [[v, b], 918], [[v, a], 947]], [[v, c], empty], _4398) + Exit: (23) lists:delete([[[v, d], 1865], [[v, c], empty], [[v, b], 918], [[v, a], 947]], [[v, c], empty], [[[v, d], 1865], [[v, b], 918], [[v, a], 947]]) + Call: (23) lists:append([[[v, d], 1865], [[v, b], 918], [[v, a], 947]], [[[v, c], 1865]], _4434) + Unify: (23) lists:append([[[v, d], 1865], [[v, b], 918], [[v, a], 947]], [[[v, c], 1865]], [[[v, d], 1865]|_4418]) + Exit: (23) lists:append([[[v, d], 1865], [[v, b], 918], [[v, a], 947]], [[[v, c], 1865]], [[[v, d], 1865], [[v, b], 918], [[v, a], 947], [[v, c], 1865]]) + Call: (23) true + Exit: (23) true + Call: (23) true + Exit: (23) true + Call: (23) true + Exit: (23) true + Exit: (22) updatevar([v, c], 1865, [[[v, d], 1865], [[v, c], empty], [[v, b], 918], [[v, a], 947]], [[[v, d], 1865], [[v, b], 918], [[v, a], 947], [[v, c], 1865]]) + Exit: (21) putvalue([v, c], 1865, [[[v, d], 1865], [[v, c], empty], [[v, b], 918], [[v, a], 947]], [[[v, d], 1865], [[v, b], 918], [[v, a], 947], [[v, c], 1865]]) + Call: (21) debug(on) + Fail: (21) debug(on) + Redo: (20) interpretpart(is, [v, c], [v, d], 
[[[v, d], 1865], [[v, c], empty], [[v, b], 918], [[v, a], 947]], [[[v, d], 1865], [[v, b], 918], [[v, a], 947], [[v, c], 1865]]) + Call: (21) true + Exit: (21) true + Call: (21) debug(on) + Fail: (21) debug(on) + Redo: (20) interpretpart(is, [v, c], [v, d], [[[v, d], 1865], [[v, c], empty], [[v, b], 918], [[v, a], 947]], [[[v, d], 1865], [[v, b], 918], [[v, a], 947], [[v, c], 1865]]) + Call: (21) true + Exit: (21) true + Exit: (20) interpretpart(is, [v, c], [v, d], [[[v, d], 1865], [[v, c], empty], [[v, b], 918], [[v, a], 947]], [[[v, d], 1865], [[v, b], 918], [[v, a], 947], [[v, c], 1865]]) + Exit: (19) interpretstatement1([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[n, =], [[v, c], [v, d]]], [[[v, d], 1865], [[v, c], empty], [[v, b], 918], [[v, a], 947]], [[[v, d], 1865], [[v, b], 918], [[v, a], 947], [[v, c], 1865]], true, nocut) +^ Call: (19) not(nocut=cut) +^ Unify: (19) not(user:(nocut=cut)) +^ Exit: (19) not(user:(nocut=cut)) + Call: (19) _4460=[[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]] + Exit: (19) [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]]=[[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]] + Call: (19) interpretbody([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], 
[...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[v, d], 1865], [[v, b], 918], [[v, a], 947], [[v, c], 1865]], _4466, [], _4470) + Unify: (19) interpretbody([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[v, d], 1865], [[v, b], 918], [[v, a], 947], [[v, c], 1865]], [[[v, d], 1865], [[v, b], 918], [[v, a], 947], [[v, c], 1865]], [], true) + Exit: (19) interpretbody([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[v, d], 1865], [[v, b], 918], [[v, a], 947], [[v, c], 1865]], [[[v, d], 1865], [[v, b], 918], [[v, a], 947], [[v, c], 1865]], [], true) + Call: (19) logicalconjunction(_4460, true, true) + Unify: (19) logicalconjunction(true, true, true) + Call: (20) true(true) + Unify: (20) true(true) + Exit: (20) true(true) + Call: (20) true(true) + Unify: (20) true(true) + Exit: (20) true(true) + Exit: (19) logicalconjunction(true, true, true) + Exit: (18) interpretbody([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[v, d], 1865], [[v, c], empty], [[v, b], 918], [[v, a], 947]], [[[v, d], 1865], [[v, b], 918], [[v, a], 947], [[v, c], 1865]], [[[n, =], [[v, c], [v, d]]]], true) + Call: (18) 
logicalconjunction(true, true, true) + Unify: (18) logicalconjunction(true, true, true) + Call: (19) true(true) + Unify: (19) true(true) + Exit: (19) true(true) + Call: (19) true(true) + Unify: (19) true(true) + Exit: (19) true(true) + Exit: (18) logicalconjunction(true, true, true) + Exit: (17) interpretbody([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[v, a], 947], [[v, b], 918], [[v, c], empty]], [[[v, d], 1865], [[v, b], 918], [[v, a], 947], [[v, c], 1865]], [[[n, function1], [[v, a], [v, b], [v, d]]], [[n, =], [[v, c], [v, d]]]], true) + Call: (17) true + Exit: (17) true + Call: (17) updatevars([[[v, c], [v, c]]], [[[v, d], 1865], [[v, b], 918], [[v, a], 947], [[v, c], 1865]], [], _4466) + Unify: (17) updatevars([[[v, c], [v, c]]], [[[v, d], 1865], [[v, b], 918], [[v, a], 947], [[v, c], 1865]], [], _4466) + Call: (18) [[[v, c], [v, c]]]=[[_4452, _4458]|_4448] + Exit: (18) [[[v, c], [v, c]]]=[[[v, c], [v, c]]] + Call: (18) expressionnotatom([v, c]) + Unify: (18) expressionnotatom([v, c]) + Call: (19) isvalstrempty([v, c]) + Unify: (19) isvalstrempty([v, c]) + Call: (20) var([v, c]) + Fail: (20) var([v, c]) + Redo: (19) isvalstrempty([v, c]) + Unify: (19) isvalstrempty([v, c]) + Call: (20) isval([v, c]) + Unify: (20) isval([v, c]) + Fail: (20) isval([v, c]) + Redo: (19) isvalstrempty([v, c]) + Unify: (19) isvalstrempty([v, c]) + Fail: (19) isvalstrempty([v, c]) + Redo: (18) expressionnotatom([v, c]) + Unify: (18) expressionnotatom([v, c]) +^ Call: (19) not(atom([v, c])) +^ Unify: (19) not(user:atom([v, c])) +^ Exit: (19) not(user:atom([v, c])) + Call: (19) length([v, c], _4490) + Unify: (19) length([v, c], _4490) + Exit: (19) length([v, c], 2) + Call: (19) 2>=1 + Exit: (19) 2>=1 + 
Call: (19) expressionnotatom2([v, c]) + Unify: (19) expressionnotatom2([v, c]) + Call: (20) isvalstrempty(v) + Unify: (20) isvalstrempty(v) + Call: (21) var(v) + Fail: (21) var(v) + Redo: (20) isvalstrempty(v) + Unify: (20) isvalstrempty(v) + Call: (21) isval(v) + Unify: (21) isval(v) + Fail: (21) isval(v) + Redo: (20) isvalstrempty(v) + Unify: (20) isvalstrempty(v) + Fail: (20) isvalstrempty(v) + Fail: (19) expressionnotatom2([v, c]) + Redo: (18) expressionnotatom([v, c]) + Unify: (18) expressionnotatom([v, c]) + Call: (19) predicate_or_rule_name([v, c]) + Fail: (19) predicate_or_rule_name([v, c]) + Fail: (18) expressionnotatom([v, c]) + Redo: (17) updatevars([[[v, c], [v, c]]], [[[v, d], 1865], [[v, b], 918], [[v, a], 947], [[v, c], 1865]], [], _4484) + Call: (18) lists:member([[v, c], _4470], [[[v, d], 1865], [[v, b], 918], [[v, a], 947], [[v, c], 1865]]) + Unify: (18) lists:member([[v, c], _4470], [[[v, d], 1865], [[v, b], 918], [[v, a], 947], [[v, c], 1865]]) + Exit: (18) lists:member([[v, c], 1865], [[[v, d], 1865], [[v, b], 918], [[v, a], 947], [[v, c], 1865]]) + Call: (18) lists:append([], [[[v, c], 1865]], _4512) + Unify: (18) lists:append([], [[[v, c], 1865]], [[[v, c], 1865]]) + Exit: (18) lists:append([], [[[v, c], 1865]], [[[v, c], 1865]]) + Call: (18) updatevars([], [[[v, d], 1865], [[v, b], 918], [[v, a], 947], [[v, c], 1865]], [[[v, c], 1865]], _4514) + Unify: (18) updatevars([], [[[v, d], 1865], [[v, b], 918], [[v, a], 947], [[v, c], 1865]], [[[v, c], 1865]], [[[v, c], 1865]]) + Exit: (18) updatevars([], [[[v, d], 1865], [[v, b], 918], [[v, a], 947], [[v, c], 1865]], [[[v, c], 1865]], [[[v, c], 1865]]) + Exit: (17) updatevars([[[v, c], [v, c]]], [[[v, d], 1865], [[v, b], 918], [[v, a], 947], [[v, c], 1865]], [], [[[v, c], 1865]]) +^ Call: (17) not([[[v, c], 1865]]=[]) +^ Unify: (17) not(user:([[[v, c], 1865]]=[])) +^ Exit: (17) not(user:([[[v, c], 1865]]=[])) + Call: (17) unique1([[[v, c], 1865]], [], _4524) + Unify: (17) unique1([[[v, c], 1865]], 
[], _4524) + Call: (18) lists:delete([], [[v, c], 1865], _4524) + Unify: (18) lists:delete([], [[v, c], 1865], []) + Exit: (18) lists:delete([], [[v, c], 1865], []) + Call: (18) lists:append([], [[[v, c], 1865]], _4530) + Unify: (18) lists:append([], [[[v, c], 1865]], [[[v, c], 1865]]) + Exit: (18) lists:append([], [[[v, c], 1865]], [[[v, c], 1865]]) + Call: (18) unique1([], [[[v, c], 1865]], _4530) + Unify: (18) unique1([], [[[v, c], 1865]], [[[v, c], 1865]]) + Exit: (18) unique1([], [[[v, c], 1865]], [[[v, c], 1865]]) + Exit: (17) unique1([[[v, c], 1865]], [], [[[v, c], 1865]]) + Call: (17) findresult3([947, 918, [v, c]], [[[v, c], 1865]], [], _4532) + Unify: (17) findresult3([947, 918, [v, c]], [[[v, c], 1865]], [], _4532) + Call: (18) [947, 918, [v, c]]=[_4512|_4514] + Exit: (18) [947, 918, [v, c]]=[947, 918, [v, c]] + Call: (18) expressionnotatom3(947) + Unify: (18) expressionnotatom3(947) + Call: (19) expressionnotatom(947) + Unify: (19) expressionnotatom(947) + Call: (20) isvalstrempty(947) + Unify: (20) isvalstrempty(947) + Call: (21) var(947) + Fail: (21) var(947) + Redo: (20) isvalstrempty(947) + Unify: (20) isvalstrempty(947) + Call: (21) isval(947) + Unify: (21) isval(947) + Exit: (21) isval(947) + Exit: (20) isvalstrempty(947) + Exit: (19) expressionnotatom(947) + Exit: (18) expressionnotatom3(947) + Call: (18) lists:append([], [947], _4542) + Unify: (18) lists:append([], [947], [947]) + Exit: (18) lists:append([], [947], [947]) + Call: (18) findresult3([918, [v, c]], [[[v, c], 1865]], [947], _4544) + Unify: (18) findresult3([918, [v, c]], [[[v, c], 1865]], [947], _4544) + Call: (19) [918, [v, c]]=[_4524|_4526] + Exit: (19) [918, [v, c]]=[918, [v, c]] + Call: (19) expressionnotatom3(918) + Unify: (19) expressionnotatom3(918) + Call: (20) expressionnotatom(918) + Unify: (20) expressionnotatom(918) + Call: (21) isvalstrempty(918) + Unify: (21) isvalstrempty(918) + Call: (22) var(918) + Fail: (22) var(918) + Redo: (21) isvalstrempty(918) + Unify: (21) 
isvalstrempty(918) + Call: (22) isval(918) + Unify: (22) isval(918) + Exit: (22) isval(918) + Exit: (21) isvalstrempty(918) + Exit: (20) expressionnotatom(918) + Exit: (19) expressionnotatom3(918) + Call: (19) lists:append([947], [918], _4554) + Unify: (19) lists:append([947], [918], [947|_4538]) + Exit: (19) lists:append([947], [918], [947, 918]) + Call: (19) findresult3([[v, c]], [[[v, c], 1865]], [947, 918], _4562) + Unify: (19) findresult3([[v, c]], [[[v, c], 1865]], [947, 918], _4562) + Call: (20) [[v, c]]=[_4542|_4544] + Exit: (20) [[v, c]]=[[v, c]] + Call: (20) expressionnotatom3([v, c]) + Unify: (20) expressionnotatom3([v, c]) + Call: (21) expressionnotatom([v, c]) + Unify: (21) expressionnotatom([v, c]) + Call: (22) isvalstrempty([v, c]) + Unify: (22) isvalstrempty([v, c]) + Call: (23) var([v, c]) + Fail: (23) var([v, c]) + Redo: (22) isvalstrempty([v, c]) + Unify: (22) isvalstrempty([v, c]) + Call: (23) isval([v, c]) + Unify: (23) isval([v, c]) + Fail: (23) isval([v, c]) + Redo: (22) isvalstrempty([v, c]) + Unify: (22) isvalstrempty([v, c]) + Fail: (22) isvalstrempty([v, c]) + Redo: (21) expressionnotatom([v, c]) + Unify: (21) expressionnotatom([v, c]) +^ Call: (22) not(atom([v, c])) +^ Unify: (22) not(user:atom([v, c])) +^ Exit: (22) not(user:atom([v, c])) + Call: (22) length([v, c], _4574) + Unify: (22) length([v, c], _4574) + Exit: (22) length([v, c], 2) + Call: (22) 2>=1 + Exit: (22) 2>=1 + Call: (22) expressionnotatom2([v, c]) + Unify: (22) expressionnotatom2([v, c]) + Call: (23) isvalstrempty(v) + Unify: (23) isvalstrempty(v) + Call: (24) var(v) + Fail: (24) var(v) + Redo: (23) isvalstrempty(v) + Unify: (23) isvalstrempty(v) + Call: (24) isval(v) + Unify: (24) isval(v) + Fail: (24) isval(v) + Redo: (23) isvalstrempty(v) + Unify: (23) isvalstrempty(v) + Fail: (23) isvalstrempty(v) + Fail: (22) expressionnotatom2([v, c]) + Redo: (21) expressionnotatom([v, c]) + Unify: (21) expressionnotatom([v, c]) + Call: (22) predicate_or_rule_name([v, c]) + Fail: 
(22) predicate_or_rule_name([v, c]) + Fail: (21) expressionnotatom([v, c]) + Redo: (20) expressionnotatom3([v, c]) + Unify: (20) expressionnotatom3([v, c]) +^ Call: (21) not([v, c]=[v, _4560]) +^ Unify: (21) not(user:([v, c]=[v, _4560])) +^ Fail: (21) not(user:([v, c]=[v, _4560])) + Fail: (20) expressionnotatom3([v, c]) + Redo: (19) findresult3([[v, c]], [[[v, c], 1865]], [947, 918], _4562) + Unify: (19) findresult3([[v, c]], [[[v, c], 1865]], [947, 918], _4562) + Call: (20) [[v, c]]=[_4542|_4544] + Exit: (20) [[v, c]]=[[v, c]] + Call: (20) isvar([v, c]) + Unify: (20) isvar([v, c]) + Exit: (20) isvar([v, c]) + Call: (20) lists:member([[v, c], _4554], [[[v, c], 1865]]) + Unify: (20) lists:member([[v, c], _4554], [[[v, c], 1865]]) + Exit: (20) lists:member([[v, c], 1865], [[[v, c], 1865]]) + Call: (20) lists:append([947, 918], [1865], _4584) + Unify: (20) lists:append([947, 918], [1865], [947|_4568]) + Exit: (20) lists:append([947, 918], [1865], [947, 918, 1865]) + Call: (20) findresult3([], [[[v, c], 1865]], [947, 918, 1865], _4598) + Unify: (20) findresult3([], [[[v, c], 1865]], [947, 918, 1865], [947, 918, 1865]) + Exit: (20) findresult3([], [[[v, c], 1865]], [947, 918, 1865], [947, 918, 1865]) + Exit: (19) findresult3([[v, c]], [[[v, c], 1865]], [947, 918], [947, 918, 1865]) + Exit: (18) findresult3([918, [v, c]], [[[v, c], 1865]], [947], [947, 918, 1865]) + Exit: (17) findresult3([947, 918, [v, c]], [[[v, c], 1865]], [], [947, 918, 1865]) + Call: (17) debug(on) + Fail: (17) debug(on) + Redo: (16) member1([[n, function0], [947, 918, [v, c]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[v, c], 1865]]) + Call: (17) true + Exit: (17) true + Exit: (16) member1([[n, function0], [947, 918, 
[v, c]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[v, c], 1865]]) + Exit: (15) interpret1(off, [[n, function0], [947, 918, [v, c]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[v, c], 1865]]) + Exit: (14) interpret(off, [[n, function0], [947, 918, [v, c]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[v, c], 1865]]) + + +[algorithm,31] +[[[n,function0],[[v,a],[v,b],[v,c]],:-,[[[n,function1],[[v,a],[v,b],[v,d]]],[[n,=],[[v,c],[v,d]]]]],[[n,function1],[[v,a],[v,b],[v,c]],:-,[[[n,+],[[v,a],[v,b],[v,c]]]]]] + + Call: (15) interpret(off, [[n, function0], [458, 949, [v, c]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], _9058) + Unify: (15) interpret(off, [[n, function0], [458, 949, [v, c]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], _9058) + Call: (16) convert_to_grammar_part1([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [], _9056, _9058) + Unify: (16) convert_to_grammar_part1([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], 
[...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [], _9056, _9058) + Call: (17) convert_to_grammar_part11([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [], _9056, [], _9060, [], _9064, [], _9068) + Unify: (17) convert_to_grammar_part11([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [], _9056, [], _9060, [], _9064, [], _9068) + Call: (18) [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]]=[_9038|_9040] + Exit: (18) [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]]=[[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]] + Call: (18) [[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n, function1], [[...|...]|...]], [[n|...], [...|...]]]]=[[n, _9056], _9062, "->", _9080] + Fail: (18) [[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n, function1], [[...|...]|...]], [[n|...], [...|...]]]]=[[n, _9056], _9062, "->", _9080] + Redo: (17) convert_to_grammar_part11([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [], _9062, [], _9066, [], _9070, [], _9074) + Call: (18) [[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n, function1], [[...|...]|...]], [[n|...], [...|...]]]]=[[n, _9056], "->", _9074] + Fail: (18) [[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n, function1], [[...|...]|...]], [[n|...], [...|...]]]]=[[n, _9056], "->", _9074] + Redo: (17) 
convert_to_grammar_part11([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [], _9056, [], _9060, [], _9064, [], _9068) + Unify: (17) convert_to_grammar_part11([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [], _9056, [], _9060, [], _9064, [], _9068) + Call: (18) [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]]=[_9038|_9040] + Exit: (18) [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]]=[[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]] + Call: (18) [[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n, function1], [[...|...]|...]], [[n|...], [...|...]]]]=[_9044, _9050, ":-", _9068] + Exit: (18) [[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n, function1], [[...|...]|...]], [[n|...], [...|...]]]]=[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n, function1], [[...|...]|...]], [[n|...], [...|...]]]] + Call: (18) true + Exit: (18) true + Call: (18) true + Exit: (18) true + Call: (18) true + Exit: (18) true + Call: (18) true + Exit: (18) true + Call: (18) lists:append([], [[[n, function0], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...], [...|...]]]], _9098) + Unify: (18) lists:append([], [[[n, function0], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...], [...|...]]]], [[[n, function0], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...], [...|...]]]]) + Exit: (18) lists:append([], [[[n, function0], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...], [...|...]]]], [[[n, function0], [[v, a], [v, b], 
[v|...]], ":-", [[[...|...]|...], [...|...]]]]) + Call: (18) lists:append([], [[[[n, function0], [[v, a], [v|...], [...|...]], ":-", [[...|...]|...]], [[n, function0], [[v|...], [...|...]|...], ":-", [...|...]]]], _9116) + Unify: (18) lists:append([], [[[[n, function0], [[v, a], [v|...], [...|...]], ":-", [[...|...]|...]], [[n, function0], [[v|...], [...|...]|...], ":-", [...|...]]]], [[[[n, function0], [[v, a], [v|...], [...|...]], ":-", [[...|...]|...]], [[n, function0], [[v|...], [...|...]|...], ":-", [...|...]]]]) + Exit: (18) lists:append([], [[[[n, function0], [[v, a], [v|...], [...|...]], ":-", [[...|...]|...]], [[n, function0], [[v|...], [...|...]|...], ":-", [...|...]]]], [[[[n, function0], [[v, a], [v|...], [...|...]], ":-", [[...|...]|...]], [[n, function0], [[v|...], [...|...]|...], ":-", [...|...]]]]) + Call: (18) convert_to_grammar_part11([[[n, function1], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]]], _9116, [], _9120, [[[[n, function0], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...], [...|...]]], [[n, function0], [[v, a], [v|...], [...|...]], ":-", [[...|...]|...]]]], _9124, [], _9128) + Unify: (18) convert_to_grammar_part11([[[n, function1], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]]], _9116, [], _9120, [[[[n, function0], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...], [...|...]]], [[n, function0], [[v, a], [v|...], [...|...]], ":-", [[...|...]|...]]]], _9124, [], _9128) + Call: (19) [[[n, function1], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]]]]]=[_9098|_9100] + Exit: (19) [[[n, function1], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]]]]]=[[[n, function1], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]]]]] + Call: (19) [[n, function1], [[v, a], [v, b], [v, c]], ":-", [[[n, +], [[...|...]|...]]]]=[[n, _9116], 
_9122, "->", _9140] + Fail: (19) [[n, function1], [[v, a], [v, b], [v, c]], ":-", [[[n, +], [[...|...]|...]]]]=[[n, _9116], _9122, "->", _9140] + Redo: (18) convert_to_grammar_part11([[[n, function1], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]]], _9122, [], _9126, [[[[n, function0], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...], [...|...]]], [[n, function0], [[v, a], [v|...], [...|...]], ":-", [[...|...]|...]]]], _9130, [], _9134) + Call: (19) [[n, function1], [[v, a], [v, b], [v, c]], ":-", [[[n, +], [[...|...]|...]]]]=[[n, _9116], "->", _9134] + Fail: (19) [[n, function1], [[v, a], [v, b], [v, c]], ":-", [[[n, +], [[...|...]|...]]]]=[[n, _9116], "->", _9134] + Redo: (18) convert_to_grammar_part11([[[n, function1], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]]], _9116, [], _9120, [[[[n, function0], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...], [...|...]]], [[n, function0], [[v, a], [v|...], [...|...]], ":-", [[...|...]|...]]]], _9124, [], _9128) + Unify: (18) convert_to_grammar_part11([[[n, function1], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]]], _9116, [], _9120, [[[[n, function0], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...], [...|...]]], [[n, function0], [[v, a], [v|...], [...|...]], ":-", [[...|...]|...]]]], _9124, [], _9128) + Call: (19) [[[n, function1], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]]]]]=[_9098|_9100] + Exit: (19) [[[n, function1], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]]]]]=[[[n, function1], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]]]]] + Call: (19) [[n, function1], [[v, a], [v, b], [v, c]], ":-", [[[n, +], [[...|...]|...]]]]=[_9104, _9110, ":-", _9128] + Exit: (19) [[n, function1], [[v, 
a], [v, b], [v, c]], ":-", [[[n, +], [[...|...]|...]]]]=[[n, function1], [[v, a], [v, b], [v, c]], ":-", [[[n, +], [[...|...]|...]]]] + Call: (19) true + Exit: (19) true + Call: (19) true + Exit: (19) true + Call: (19) true + Exit: (19) true + Call: (19) true + Exit: (19) true + Call: (19) lists:append([[[n, function0], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...], [...|...]]]], [[[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], _9158) + Unify: (19) lists:append([[[n, function0], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...], [...|...]]]], [[[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...], [...|...]]]|_9142]) + Exit: (19) lists:append([[[n, function0], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...], [...|...]]]], [[[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...], [...|...]]], [[n, function1], [[v, a], [v|...], [...|...]], ":-", [[...|...]]]]) + Call: (19) lists:append([[[[n, function0], [[v, a], [v|...], [...|...]], ":-", [[...|...]|...]], [[n, function0], [[v|...], [...|...]|...], ":-", [...|...]]]], [[[[n, function1], [[v, a], [v|...], [...|...]], ":-", [[...|...]]], [[n, function1], [[v|...], [...|...]|...], ":-", [...]]]], _9182) + Unify: (19) lists:append([[[[n, function0], [[v, a], [v|...], [...|...]], ":-", [[...|...]|...]], [[n, function0], [[v|...], [...|...]|...], ":-", [...|...]]]], [[[[n, function1], [[v, a], [v|...], [...|...]], ":-", [[...|...]]], [[n, function1], [[v|...], [...|...]|...], ":-", [...]]]], [[[[n, function0], [[v, a], [v|...], [...|...]], ":-", [[...|...]|...]], [[n, function0], [[v|...], [...|...]|...], ":-", [...|...]]]|_9166]) + Exit: (19) lists:append([[[[n, function0], [[v, a], [v|...], [...|...]], ":-", [[...|...]|...]], [[n, function0], [[v|...], [...|...]|...], ":-", [...|...]]]], [[[[n, function1], [[v, a], [v|...], 
[...|...]], ":-", [[...|...]]], [[n, function1], [[v|...], [...|...]|...], ":-", [...]]]], [[[[n, function0], [[v, a], [v|...], [...|...]], ":-", [[...|...]|...]], [[n, function0], [[v|...], [...|...]|...], ":-", [...|...]]], [[[n, function1], [[v|...], [...|...]|...], ":-", [...]], [[n, function1], [[...|...]|...], ":-"|...]]]) + Call: (19) convert_to_grammar_part11([], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], _9188, [], _9192, [[[[n, function0], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...], [...|...]]], [[n, function0], [[v, a], [v|...], [...|...]], ":-", [[...|...]|...]]], [[[n, function1], [[v, a], [v|...], [...|...]], ":-", [[...|...]]], [[n, function1], [[v|...], [...|...]|...], ":-", [...]]]], _9196, [], _9200) + Unify: (19) convert_to_grammar_part11([], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [], [], [[[[n, function0], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...], [...|...]]], [[n, function0], [[v, a], [v|...], [...|...]], ":-", [[...|...]|...]]], [[[n, function1], [[v, a], [v|...], [...|...]], ":-", [[...|...]]], [[n, function1], [[v|...], [...|...]|...], ":-", [...]]]], [[[[n, function0], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...], [...|...]]], [[n, function0], [[v, a], [v|...], [...|...]], ":-", [[...|...]|...]]], [[[n, function1], [[v, a], [v|...], [...|...]], ":-", [[...|...]]], [[n, function1], [[v|...], [...|...]|...], ":-", [...]]]], [], []) + Exit: (19) convert_to_grammar_part11([], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", 
[[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [], [], [[[[n, function0], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...], [...|...]]], [[n, function0], [[v, a], [v|...], [...|...]], ":-", [[...|...]|...]]], [[[n, function1], [[v, a], [v|...], [...|...]], ":-", [[...|...]]], [[n, function1], [[v|...], [...|...]|...], ":-", [...]]]], [[[[n, function0], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...], [...|...]]], [[n, function0], [[v, a], [v|...], [...|...]], ":-", [[...|...]|...]]], [[[n, function1], [[v, a], [v|...], [...|...]], ":-", [[...|...]]], [[n, function1], [[v|...], [...|...]|...], ":-", [...]]]], [], []) + Exit: (18) convert_to_grammar_part11([[[n, function1], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [], [], [[[[n, function0], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...], [...|...]]], [[n, function0], [[v, a], [v|...], [...|...]], ":-", [[...|...]|...]]]], [[[[n, function0], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...], [...|...]]], [[n, function0], [[v, a], [v|...], [...|...]], ":-", [[...|...]|...]]], [[[n, function1], [[v, a], [v|...], [...|...]], ":-", [[...|...]]], [[n, function1], [[v|...], [...|...]|...], ":-", [...]]]], [], []) + Exit: (17) convert_to_grammar_part11([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [], [], [], [[[[n, function0], [[v, a], 
[v, b], [v|...]], ":-", [[[...|...]|...], [...|...]]], [[n, function0], [[v, a], [v|...], [...|...]], ":-", [[...|...]|...]]], [[[n, function1], [[v, a], [v|...], [...|...]], ":-", [[...|...]]], [[n, function1], [[v|...], [...|...]|...], ":-", [...]]]], [], []) + Call: (17) _9184=[[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]] + Exit: (17) [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]]=[[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]] + Call: (17) _9184=[[[[n, function0], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...], [...|...]]], [[n, function0], [[v, a], [v|...], [...|...]], ":-", [[...|...]|...]]], [[[n, function1], [[v, a], [v|...], [...|...]], ":-", [[...|...]]], [[n, function1], [[v|...], [...|...]|...], ":-", [...]]]] + Exit: (17) [[[[n, function0], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...], [...|...]]], [[n, function0], [[v, a], [v|...], [...|...]], ":-", [[...|...]|...]]], [[[n, function1], [[v, a], [v|...], [...|...]], ":-", [[...|...]]], [[n, function1], [[v|...], [...|...]|...], ":-", [...]]]]=[[[[n, function0], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...], [...|...]]], [[n, function0], [[v, a], [v|...], [...|...]], ":-", [[...|...]|...]]], [[[n, function1], [[v, a], [v|...], [...|...]], ":-", [[...|...]]], [[n, function1], [[v|...], [...|...]|...], ":-", [...]]]] + Exit: (16) convert_to_grammar_part1([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], 
":-", [[[...|...]|...]]]], [[[[n, function0], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...], [...|...]]], [[n, function0], [[v, a], [v|...], [...|...]], ":-", [[...|...]|...]]], [[[n, function1], [[v, a], [v|...], [...|...]], ":-", [[...|...]]], [[n, function1], [[v|...], [...|...]|...], ":-", [...]]]]) + Call: (16) interpret1(off, [[n, function0], [458, 949, [v, c]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], _9192) + Unify: (16) interpret1(off, [[n, function0], [458, 949, [v, c]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], _9192) +^ Call: (17) retractall(debug(_9170)) +^ Exit: (17) retractall(debug(_9170)) +^ Call: (17) assertz(debug(off)) +^ Exit: (17) assertz(debug(off)) +^ Call: (17) retractall(cut(_9178)) +^ Exit: (17) retractall(cut(_9178)) +^ Call: (17) assertz(cut(off)) +^ Exit: (17) assertz(cut(off)) +^ Call: (17) retractall(leash1(_9186)) +^ Exit: (17) retractall(leash1(_9186)) +^ Call: (17) assertz(leash1(off)) +^ Exit: (17) assertz(leash1(off)) + Call: (17) member1([[n, function0], [458, 949, [v, c]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], _9214) + Unify: (17) member1([[n, function0], [458, 949, [v, c]]], 
[[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], _9214) + Call: (18) cut(off) + Unify: (18) cut(off) + Exit: (18) cut(off) + Call: (18) [[n, function0], [458, 949, [v, c]]]=[_9194, _9200] + Exit: (18) [[n, function0], [458, 949, [v, c]]]=[[n, function0], [458, 949, [v, c]]] + Call: (18) [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]]=[[[n, function0], _9218, ":-", _9236]|_9208] + Exit: (18) [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]]=[[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]] + Call: (18) length([458, 949, [v, c]], _9258) + Unify: (18) length([458, 949, [v, c]], _9258) + Exit: (18) length([458, 949, [v, c]], 3) + Call: (18) length([[v, a], [v, b], [v, c]], 3) + Unify: (18) length([[v, a], [v, b], [v, c]], 3) + Unify: (18) length([[v, a], [v, b], [v, c]], 3) + Exit: (18) length([[v, a], [v, b], [v, c]], 3) + Call: (18) checkarguments([458, 949, [v, c]], [[v, a], [v, b], [v, c]], [], _9262, [], _9266) + Unify: (18) checkarguments([458, 949, [v, c]], [[v, a], [v, b], [v, c]], [], _9262, [], _9266) + Call: (19) [458, 949, [v, c]]=[_9242|_9244] + Exit: (19) [458, 949, [v, c]]=[458, 949, [v, c]] + Call: (19) expressionnotatom3(458) + Unify: (19) expressionnotatom3(458) + Call: (20) expressionnotatom(458) + Unify: (20) expressionnotatom(458) + Call: (21) isvalstrempty(458) + Unify: (21) isvalstrempty(458) + Call: (22) var(458) + Fail: (22) var(458) + Redo: 
(21) isvalstrempty(458) + Unify: (21) isvalstrempty(458) + Call: (22) isval(458) + Unify: (22) isval(458) + Exit: (22) isval(458) + Exit: (21) isvalstrempty(458) + Exit: (20) expressionnotatom(458) + Exit: (19) expressionnotatom3(458) + Call: (19) [[v, a], [v, b], [v, c]]=[_9248|_9250] + Exit: (19) [[v, a], [v, b], [v, c]]=[[v, a], [v, b], [v, c]] +^ Call: (19) not(var([v, a])) +^ Unify: (19) not(user:var([v, a])) +^ Exit: (19) not(user:var([v, a])) + Call: (19) isvar([v, a]) + Unify: (19) isvar([v, a]) + Exit: (19) isvar([v, a]) + Call: (19) putvalue([v, a], 458, [], _9284) + Unify: (19) putvalue([v, a], 458, [], _9284) +^ Call: (20) not(isvar([v, a])) +^ Unify: (20) not(user:isvar([v, a])) + Call: (21) isvar([v, a]) + Unify: (21) isvar([v, a]) + Exit: (21) isvar([v, a]) +^ Fail: (20) not(user:isvar([v, a])) + Redo: (19) putvalue([v, a], 458, [], _9284) + Call: (20) isvar([v, a]) + Unify: (20) isvar([v, a]) + Exit: (20) isvar([v, a]) + Call: (20) isvalstrorundef(458) + Unify: (20) isvalstrorundef(458) + Call: (21) var(458) + Fail: (21) var(458) + Redo: (20) isvalstrorundef(458) + Unify: (20) isvalstrorundef(458) +^ Call: (21) not(var(458)) +^ Unify: (21) not(user:var(458)) +^ Exit: (21) not(user:var(458)) + Call: (21) isval(458) + Unify: (21) isval(458) + Exit: (21) isval(458) + Exit: (20) isvalstrorundef(458) + Call: (20) updatevar([v, a], 458, [], _9294) + Unify: (20) updatevar([v, a], 458, [], _9294) + Call: (21) lists:member([[v, a], empty], []) + Fail: (21) lists:member([[v, a], empty], []) + Redo: (20) updatevar([v, a], 458, [], _9294) +^ Call: (21) not(member([[v, a], _9286], [])) +^ Unify: (21) not(user:member([[v, a], _9286], [])) +^ Exit: (21) not(user:member([[v, a], _9286], [])) + Call: (21) _9286=empty + Exit: (21) empty=empty + Call: (21) true + Exit: (21) true + Call: (21) lists:append([], [[[v, a], 458]], _9334) + Unify: (21) lists:append([], [[[v, a], 458]], [[[v, a], 458]]) + Exit: (21) lists:append([], [[[v, a], 458]], [[[v, a], 458]]) + Call: 
(21) true + Exit: (21) true + Call: (21) true + Exit: (21) true + Exit: (20) updatevar([v, a], 458, [], [[[v, a], 458]]) + Exit: (19) putvalue([v, a], 458, [], [[[v, a], 458]]) + Call: (19) checkarguments([949, [v, c]], [[v, b], [v, c]], [[[v, a], 458]], _9336, [], _9340) + Unify: (19) checkarguments([949, [v, c]], [[v, b], [v, c]], [[[v, a], 458]], _9336, [], _9340) + Call: (20) [949, [v, c]]=[_9316|_9318] + Exit: (20) [949, [v, c]]=[949, [v, c]] + Call: (20) expressionnotatom3(949) + Unify: (20) expressionnotatom3(949) + Call: (21) expressionnotatom(949) + Unify: (21) expressionnotatom(949) + Call: (22) isvalstrempty(949) + Unify: (22) isvalstrempty(949) + Call: (23) var(949) + Fail: (23) var(949) + Redo: (22) isvalstrempty(949) + Unify: (22) isvalstrempty(949) + Call: (23) isval(949) + Unify: (23) isval(949) + Exit: (23) isval(949) + Exit: (22) isvalstrempty(949) + Exit: (21) expressionnotatom(949) + Exit: (20) expressionnotatom3(949) + Call: (20) [[v, b], [v, c]]=[_9322|_9324] + Exit: (20) [[v, b], [v, c]]=[[v, b], [v, c]] +^ Call: (20) not(var([v, b])) +^ Unify: (20) not(user:var([v, b])) +^ Exit: (20) not(user:var([v, b])) + Call: (20) isvar([v, b]) + Unify: (20) isvar([v, b]) + Exit: (20) isvar([v, b]) + Call: (20) putvalue([v, b], 949, [[[v, a], 458]], _9358) + Unify: (20) putvalue([v, b], 949, [[[v, a], 458]], _9358) +^ Call: (21) not(isvar([v, b])) +^ Unify: (21) not(user:isvar([v, b])) + Call: (22) isvar([v, b]) + Unify: (22) isvar([v, b]) + Exit: (22) isvar([v, b]) +^ Fail: (21) not(user:isvar([v, b])) + Redo: (20) putvalue([v, b], 949, [[[v, a], 458]], _9358) + Call: (21) isvar([v, b]) + Unify: (21) isvar([v, b]) + Exit: (21) isvar([v, b]) + Call: (21) isvalstrorundef(949) + Unify: (21) isvalstrorundef(949) + Call: (22) var(949) + Fail: (22) var(949) + Redo: (21) isvalstrorundef(949) + Unify: (21) isvalstrorundef(949) +^ Call: (22) not(var(949)) +^ Unify: (22) not(user:var(949)) +^ Exit: (22) not(user:var(949)) + Call: (22) isval(949) + Unify: (22) 
isval(949)
+ Exit: (22) isval(949)
+ Exit: (21) isvalstrorundef(949)
+ Call: (21) updatevar([v, b], 949, [[[v, a], 458]], _9368)
+ Unify: (21) updatevar([v, b], 949, [[[v, a], 458]], _9368)
+ Call: (22) lists:member([[v, b], empty], [[[v, a], 458]])
+ Unify: (22) lists:member([[v, b], empty], [[[v, a], 458]])
+ Fail: (22) lists:member([[v, b], empty], [[[v, a], 458]])
+ Redo: (21) updatevar([v, b], 949, [[[v, a], 458]], _9368)
+^ Call: (22) not(member([[v, b], _9360], [[[v, a], 458]]))
+^ Unify: (22) not(user:member([[v, b], _9360], [[[v, a], 458]]))
+^ Exit: (22) not(user:member([[v, b], _9360], [[[v, a], 458]]))
+ Call: (22) _9360=empty
+ Exit: (22) empty=empty
+ Call: (22) true
+ Exit: (22) true
+ Call: (22) lists:append([[[v, a], 458]], [[[v, b], 949]], _9408)
+ Unify: (22) lists:append([[[v, a], 458]], [[[v, b], 949]], [[[v, a], 458]|_9392])
+ Exit: (22) lists:append([[[v, a], 458]], [[[v, b], 949]], [[[v, a], 458], [[v, b], 949]])
+ Call: (22) true
+ Exit: (22) true
+ Call: (22) true
+ Exit: (22) true
+ Exit: (21) updatevar([v, b], 949, [[[v, a], 458]], [[[v, a], 458], [[v, b], 949]])
+ Exit: (20) putvalue([v, b], 949, [[[v, a], 458]], [[[v, a], 458], [[v, b], 949]])
+ Call: (20) checkarguments([[v, c]], [[v, c]], [[[v, a], 458], [[v, b], 949]], _9416, [], _9420)
+ Unify: (20) checkarguments([[v, c]], [[v, c]], [[[v, a], 458], [[v, b], 949]], _9416, [], _9420)
+ Call: (21) [[v, c]]=[_9396|_9398]
+ Exit: (21) [[v, c]]=[[v, c]]
+ Call: (21) expressionnotatom3([v, c])
+ Unify: (21) expressionnotatom3([v, c])
+ Call: (22) expressionnotatom([v, c])
+ Unify: (22) expressionnotatom([v, c])
+ Call: (23) isvalstrempty([v, c])
+ Unify: (23) isvalstrempty([v, c])
+ Call: (24) var([v, c])
+ Fail: (24) var([v, c])
+ Redo: (23) isvalstrempty([v, c])
+ Unify: (23) isvalstrempty([v, c])
+ Call: (24) isval([v, c])
+ Unify: (24) isval([v, c])
+ Fail: (24) isval([v, c])
+ Redo: (23) isvalstrempty([v, c])
+ Unify: (23) isvalstrempty([v, c])
+ Fail: (23) isvalstrempty([v, c])
+ Redo: (22) expressionnotatom([v, c])
+ Unify: (22) expressionnotatom([v, c])
+^ Call: (23) not(atom([v, c]))
+^ Unify: (23) not(user:atom([v, c]))
+^ Exit: (23) not(user:atom([v, c]))
+ Call: (23) length([v, c], _9428)
+ Unify: (23) length([v, c], _9428)
+ Exit: (23) length([v, c], 2)
+ Call: (23) 2>=1
+ Exit: (23) 2>=1
+ Call: (23) expressionnotatom2([v, c])
+ Unify: (23) expressionnotatom2([v, c])
+ Call: (24) isvalstrempty(v)
+ Unify: (24) isvalstrempty(v)
+ Call: (25) var(v)
+ Fail: (25) var(v)
+ Redo: (24) isvalstrempty(v)
+ Unify: (24) isvalstrempty(v)
+ Call: (25) isval(v)
+ Unify: (25) isval(v)
+ Fail: (25) isval(v)
+ Redo: (24) isvalstrempty(v)
+ Unify: (24) isvalstrempty(v)
+ Fail: (24) isvalstrempty(v)
+ Fail: (23) expressionnotatom2([v, c])
+ Redo: (22) expressionnotatom([v, c])
+ Unify: (22) expressionnotatom([v, c])
+ Call: (23) predicate_or_rule_name([v, c])
+ Fail: (23) predicate_or_rule_name([v, c])
+ Fail: (22) expressionnotatom([v, c])
+ Redo: (21) expressionnotatom3([v, c])
+ Unify: (21) expressionnotatom3([v, c])
+^ Call: (22) not([v, c]=[v, _9414])
+^ Unify: (22) not(user:([v, c]=[v, _9414]))
+^ Fail: (22) not(user:([v, c]=[v, _9414]))
+ Fail: (21) expressionnotatom3([v, c])
+ Redo: (20) checkarguments([[v, c]], [[v, c]], [[[v, a], 458], [[v, b], 949]], _9416, [], _9420)
+ Unify: (20) checkarguments([[v, c]], [[v, c]], [[[v, a], 458], [[v, b], 949]], _9416, [], _9420)
+ Call: (21) [[v, c]]=[_9396|_9398]
+ Exit: (21) [[v, c]]=[[v, c]]
+^ Call: (21) not(var([v, c]))
+^ Unify: (21) not(user:var([v, c]))
+^ Exit: (21) not(user:var([v, c]))
+ Call: (21) isvar([v, c])
+ Unify: (21) isvar([v, c])
+ Exit: (21) isvar([v, c])
+ Call: (21) [[v, c]]=[_9412|_9414]
+ Exit: (21) [[v, c]]=[[v, c]]
+ Call: (21) expressionnotatom3([v, c])
+ Unify: (21) expressionnotatom3([v, c])
+ Call: (22) expressionnotatom([v, c])
+ Unify: (22) expressionnotatom([v, c])
+ Call: (23) isvalstrempty([v, c])
+ Unify: (23) isvalstrempty([v, c])
+ Call: (24) var([v, c])
+ Fail: (24) var([v, c])
+ Redo: (23) isvalstrempty([v, c])
+ Unify: (23) isvalstrempty([v, c])
+ Call: (24) isval([v, c])
+ Unify: (24) isval([v, c])
+ Fail: (24) isval([v, c])
+ Redo: (23) isvalstrempty([v, c])
+ Unify: (23) isvalstrempty([v, c])
+ Fail: (23) isvalstrempty([v, c])
+ Redo: (22) expressionnotatom([v, c])
+ Unify: (22) expressionnotatom([v, c])
+^ Call: (23) not(atom([v, c]))
+^ Unify: (23) not(user:atom([v, c]))
+^ Exit: (23) not(user:atom([v, c]))
+ Call: (23) length([v, c], _9444)
+ Unify: (23) length([v, c], _9444)
+ Exit: (23) length([v, c], 2)
+ Call: (23) 2>=1
+ Exit: (23) 2>=1
+ Call: (23) expressionnotatom2([v, c])
+ Unify: (23) expressionnotatom2([v, c])
+ Call: (24) isvalstrempty(v)
+ Unify: (24) isvalstrempty(v)
+ Call: (25) var(v)
+ Fail: (25) var(v)
+ Redo: (24) isvalstrempty(v)
+ Unify: (24) isvalstrempty(v)
+ Call: (25) isval(v)
+ Unify: (25) isval(v)
+ Fail: (25) isval(v)
+ Redo: (24) isvalstrempty(v)
+ Unify: (24) isvalstrempty(v)
+ Fail: (24) isvalstrempty(v)
+ Fail: (23) expressionnotatom2([v, c])
+ Redo: (22) expressionnotatom([v, c])
+ Unify: (22) expressionnotatom([v, c])
+ Call: (23) predicate_or_rule_name([v, c])
+ Fail: (23) predicate_or_rule_name([v, c])
+ Fail: (22) expressionnotatom([v, c])
+ Redo: (21) expressionnotatom3([v, c])
+ Unify: (21) expressionnotatom3([v, c])
+^ Call: (22) not([v, c]=[v, _9430])
+^ Unify: (22) not(user:([v, c]=[v, _9430]))
+^ Fail: (22) not(user:([v, c]=[v, _9430]))
+ Fail: (21) expressionnotatom3([v, c])
+ Redo: (20) checkarguments([[v, c]], [[v, c]], [[[v, a], 458], [[v, b], 949]], _9416, [], _9420)
+ Unify: (20) checkarguments([[v, c]], [[v, c]], [[[v, a], 458], [[v, b], 949]], _9416, [], _9420)
+ Call: (21) [[v, c]]=[_9396|_9398]
+ Exit: (21) [[v, c]]=[[v, c]]
+^ Call: (21) not(var([v, c]))
+^ Unify: (21) not(user:var([v, c]))
+^ Exit: (21) not(user:var([v, c]))
+ Call: (21) isvar([v, c])
+ Unify: (21) isvar([v, c])
+ Exit: (21) isvar([v, c])
+ Call: (21) [[v, c]]=[_9412|_9414]
+ Exit: (21) [[v, c]]=[[v, c]]
+^ Call: (21) not(var([v, c]))
+^ Unify: (21) not(user:var([v, c]))
+^ Exit: (21) not(user:var([v, c]))
+ Call: (21) isvar([v, c])
+ Unify: (21) isvar([v, c])
+ Exit: (21) isvar([v, c])
+ Call: (21) getvalue([v, c], _9444, [[[v, a], 458], [[v, b], 949]])
+ Unify: (21) getvalue([v, c], _9444, [[[v, a], 458], [[v, b], 949]])
+^ Call: (22) not(isvar([v, c]))
+^ Unify: (22) not(user:isvar([v, c]))
+ Call: (23) isvar([v, c])
+ Unify: (23) isvar([v, c])
+ Exit: (23) isvar([v, c])
+^ Fail: (22) not(user:isvar([v, c]))
+ Redo: (21) getvalue([v, c], _9444, [[[v, a], 458], [[v, b], 949]])
+ Call: (22) isvar([v, c])
+ Unify: (22) isvar([v, c])
+ Exit: (22) isvar([v, c])
+ Call: (22) isvalstrorundef(_9442)
+ Unify: (22) isvalstrorundef(_9442)
+ Call: (23) var(_9442)
+ Exit: (23) var(_9442)
+ Exit: (22) isvalstrorundef(_9442)
+ Call: (22) getvar([v, c], _9444, [[[v, a], 458], [[v, b], 949]])
+ Unify: (22) getvar([v, c], _9444, [[[v, a], 458], [[v, b], 949]])
+ Call: (23) lists:member([[v, c], _9434], [[[v, a], 458], [[v, b], 949]])
+ Unify: (23) lists:member([[v, c], _9434], [[[v, a], 458], [[v, b], 949]])
+ Fail: (23) lists:member([[v, c], _9434], [[[v, a], 458], [[v, b], 949]])
+ Redo: (22) getvar([v, c], _9444, [[[v, a], 458], [[v, b], 949]])
+ Unify: (22) getvar([v, c], empty, [[[v, a], 458], [[v, b], 949]])
+^ Call: (23) not(member([[v, c], _9440], [[[v, a], 458], [[v, b], 949]]))
+^ Unify: (23) not(user:member([[v, c], _9440], [[[v, a], 458], [[v, b], 949]]))
+^ Exit: (23) not(user:member([[v, c], _9440], [[[v, a], 458], [[v, b], 949]]))
+ Call: (23) true
+ Exit: (23) true
+ Exit: (22) getvar([v, c], empty, [[[v, a], 458], [[v, b], 949]])
+ Exit: (21) getvalue([v, c], empty, [[[v, a], 458], [[v, b], 949]])
+ Call: (21) true
+ Exit: (21) true
+ Call: (21) putvalue([v, c], empty, [[[v, a], 458], [[v, b], 949]], _9472)
+ Unify: (21) putvalue([v, c], empty, [[[v, a], 458], [[v, b], 949]], _9472)
+^ Call: (22) not(isvar([v, c]))
+^ Unify: (22) not(user:isvar([v, c]))
+ Call: (23) isvar([v, c])
+ Unify: (23) isvar([v, c])
+ Exit: (23) isvar([v, c])
+^ Fail: (22) not(user:isvar([v, c]))
+ Redo: (21) putvalue([v, c], empty, [[[v, a], 458], [[v, b], 949]], _9472)
+ Call: (22) isvar([v, c])
+ Unify: (22) isvar([v, c])
+ Exit: (22) isvar([v, c])
+ Call: (22) isvalstrorundef(empty)
+ Unify: (22) isvalstrorundef(empty)
+ Call: (23) var(empty)
+ Fail: (23) var(empty)
+ Redo: (22) isvalstrorundef(empty)
+ Unify: (22) isvalstrorundef(empty)
+^ Call: (23) not(var(empty))
+^ Unify: (23) not(user:var(empty))
+^ Exit: (23) not(user:var(empty))
+ Call: (23) isval(empty)
+ Unify: (23) isval(empty)
+ Fail: (23) isval(empty)
+ Redo: (22) isvalstrorundef(empty)
+ Unify: (22) isvalstrorundef(empty)
+^ Call: (23) not(var(empty))
+^ Unify: (23) not(user:var(empty))
+^ Exit: (23) not(user:var(empty))
+ Call: (23) expression(empty)
+ Unify: (23) expression(empty)
+ Exit: (23) expression(empty)
+ Exit: (22) isvalstrorundef(empty)
+ Call: (22) updatevar([v, c], empty, [[[v, a], 458], [[v, b], 949]], _9482)
+ Unify: (22) updatevar([v, c], empty, [[[v, a], 458], [[v, b], 949]], _9482)
+ Call: (23) lists:member([[v, c], empty], [[[v, a], 458], [[v, b], 949]])
+ Unify: (23) lists:member([[v, c], empty], [[[v, a], 458], [[v, b], 949]])
+ Fail: (23) lists:member([[v, c], empty], [[[v, a], 458], [[v, b], 949]])
+ Redo: (22) updatevar([v, c], empty, [[[v, a], 458], [[v, b], 949]], _9482)
+^ Call: (23) not(member([[v, c], _9474], [[[v, a], 458], [[v, b], 949]]))
+^ Unify: (23) not(user:member([[v, c], _9474], [[[v, a], 458], [[v, b], 949]]))
+^ Exit: (23) not(user:member([[v, c], _9474], [[[v, a], 458], [[v, b], 949]]))
+ Call: (23) _9474=empty
+ Exit: (23) empty=empty
+ Call: (23) true
+ Exit: (23) true
+ Call: (23) lists:append([[[v, a], 458], [[v, b], 949]], [[[v, c], empty]], _9522)
+ Unify: (23) lists:append([[[v, a], 458], [[v, b], 949]], [[[v, c], empty]], [[[v, a], 458]|_9506])
+ Exit: (23) lists:append([[[v, a], 458], [[v, b], 949]], [[[v, c], empty]], [[[v, a], 458], [[v, b], 949], [[v, c], empty]])
+ Call: (23) true
+ Exit: (23) true
+ Call: (23) true
+ Exit: (23) true
+ Exit: (22) updatevar([v, c], empty, [[[v, a], 458], [[v, b], 949]], [[[v, a], 458], [[v, b], 949], [[v, c], empty]])
+ Exit: (21) putvalue([v, c], empty, [[[v, a], 458], [[v, b], 949]], [[[v, a], 458], [[v, b], 949], [[v, c], empty]])
+ Call: (21) lists:append([], [[[v, c], [v, c]]], _9552)
+ Unify: (21) lists:append([], [[[v, c], [v, c]]], [[[v, c], [v, c]]])
+ Exit: (21) lists:append([], [[[v, c], [v, c]]], [[[v, c], [v, c]]])
+ Call: (21) checkarguments([], [], [[[v, a], 458], [[v, b], 949], [[v, c], empty]], _9554, [[[v, c], [v, c]]], _9558)
+ Unify: (21) checkarguments([], [], [[[v, a], 458], [[v, b], 949], [[v, c], empty]], [[[v, a], 458], [[v, b], 949], [[v, c], empty]], [[[v, c], [v, c]]], [[[v, c], [v, c]]])
+ Exit: (21) checkarguments([], [], [[[v, a], 458], [[v, b], 949], [[v, c], empty]], [[[v, a], 458], [[v, b], 949], [[v, c], empty]], [[[v, c], [v, c]]], [[[v, c], [v, c]]])
+ Exit: (20) checkarguments([[v, c]], [[v, c]], [[[v, a], 458], [[v, b], 949]], [[[v, a], 458], [[v, b], 949], [[v, c], empty]], [], [[[v, c], [v, c]]])
+ Exit: (19) checkarguments([949, [v, c]], [[v, b], [v, c]], [[[v, a], 458]], [[[v, a], 458], [[v, b], 949], [[v, c], empty]], [], [[[v, c], [v, c]]])
+ Exit: (18) checkarguments([458, 949, [v, c]], [[v, a], [v, b], [v, c]], [], [[[v, a], 458], [[v, b], 949], [[v, c], empty]], [], [[[v, c], [v, c]]])
+ Call: (18) debug(on)
+ Fail: (18) debug(on)
+ Redo: (17) member1([[n, function0], [458, 949, [v, c]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], _9554)
+ Call: (18) true
+ Exit: (18) true
+ Call: (18) interpretbody([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[v, a], 458], [[v, b], 949], [[v, c], empty]], _9554, [[[n, function1], [[v, a], [v, b], [v, d]]], [[n, =], [[v, c], [v, d]]]], true)
+ Unify: (18) interpretbody([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[v, a], 458], [[v, b], 949], [[v, c], empty]], _9554, [[[n, function1], [[v, a], [v, b], [v, d]]], [[n, =], [[v, c], [v, d]]]], true)
+ Call: (19) [[[n, function1], [[v, a], [v, b], [v, d]]], [[n, =], [[v, c], [v, d]]]]=[[_9540|_9542]|_9536]
+ Exit: (19) [[[n, function1], [[v, a], [v, b], [v, d]]], [[n, =], [[v, c], [v, d]]]]=[[[n, function1], [[v, a], [v, b], [v, d]]], [[n, =], [[v, c], [v, d]]]]
+^ Call: (19) not(predicate_or_rule_name([n, function1]))
+^ Unify: (19) not(user:predicate_or_rule_name([n, function1]))
+ Call: (20) predicate_or_rule_name([n, function1])
+ Unify: (20) predicate_or_rule_name([n, function1])
+ Exit: (20) predicate_or_rule_name([n, function1])
+^ Fail: (19) not(user:predicate_or_rule_name([n, function1]))
+ Redo: (18) interpretbody([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[v, a], 458], [[v, b], 949], [[v, c], empty]], _9554, [[[n, function1], [[v, a], [v, b], [v, d]]], [[n, =], [[v, c], [v, d]]]], true)
+ Unify: (18) interpretbody([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[v, a], 458], [[v, b], 949], [[v, c], empty]], _9554, [[[n, function1], [[v, a], [v, b], [v, d]]], [[n, =], [[v, c], [v, d]]]], true)
+ Call: (19) [[[n, function1], [[v, a], [v, b], [v, d]]], [[n, =], [[v, c], [v, d]]]]=[[[n, not], [_9564]]|_9536]
+ Fail: (19) [[[n, function1], [[v, a], [v, b], [v, d]]], [[n, =], [[v, c], [v, d]]]]=[[[n, not], [_9564]]|_9536]
+ Redo: (18) interpretbody([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[v, a], 458], [[v, b], 949], [[v, c], empty]], _9554, [[[n, function1], [[v, a], [v, b], [v, d]]], [[n, =], [[v, c], [v, d]]]], true)
+ Unify: (18) interpretbody([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[v, a], 458], [[v, b], 949], [[v, c], empty]], _9554, [[[n, function1], [[v, a], [v, b], [v, d]]], [[n, =], [[v, c], [v, d]]]], true)
+ Call: (19) [[[n, function1], [[v, a], [v, b], [v, d]]], [[n, =], [[v, c], [v, d]]]]=[[[n, or], [_9564, _9570]]|_9536]
+ Fail: (19) [[[n, function1], [[v, a], [v, b], [v, d]]], [[n, =], [[v, c], [v, d]]]]=[[[n, or], [_9564, _9570]]|_9536]
+ Redo: (18) interpretbody([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[v, a], 458], [[v, b], 949], [[v, c], empty]], _9554, [[[n, function1], [[v, a], [v, b], [v, d]]], [[n, =], [[v, c], [v, d]]]], true)
+ Unify: (18) interpretbody([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[v, a], 458], [[v, b], 949], [[v, c], empty]], _9554, [[[n, function1], [[v, a], [v, b], [v, d]]], [[n, =], [[v, c], [v, d]]]], true)
+ Call: (19) [[[n, function1], [[v, a], [v, b], [v, d]]], [[n, =], [[v, c], [v, d]]]]=[[[n, "->"], [_9570, _9576]]|_9536]
+ Fail: (19) [[[n, function1], [[v, a], [v, b], [v, d]]], [[n, =], [[v, c], [v, d]]]]=[[[n, "->"], [_9570, _9576]]|_9536]
+ Redo: (18) interpretbody([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[v, a], 458], [[v, b], 949], [[v, c], empty]], _9554, [[[n, function1], [[v, a], [v, b], [v, d]]], [[n, =], [[v, c], [v, d]]]], true)
+ Unify: (18) interpretbody([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[v, a], 458], [[v, b], 949], [[v, c], empty]], _9554, [[[n, function1], [[v, a], [v, b], [v, d]]], [[n, =], [[v, c], [v, d]]]], true)
+ Call: (19) [[[n, function1], [[v, a], [v, b], [v, d]]], [[n, =], [[v, c], [v, d]]]]=[[[n, "->"], [_9570, _9576, _9582]]|_9536]
+ Fail: (19) [[[n, function1], [[v, a], [v, b], [v, d]]], [[n, =], [[v, c], [v, d]]]]=[[[n, "->"], [_9570, _9576, _9582]]|_9536]
+ Redo: (18) interpretbody([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[v, a], 458], [[v, b], 949], [[v, c], empty]], _9554, [[[n, function1], [[v, a], [v, b], [v, d]]], [[n, =], [[v, c], [v, d]]]], true)
+ Unify: (18) interpretbody([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[v, a], 458], [[v, b], 949], [[v, c], empty]], _9554, [[[n, function1], [[v, a], [v, b], [v, d]]], [[n, =], [[v, c], [v, d]]]], true)
+ Call: (19) [[[n, function1], [[v, a], [v, b], [v, d]]], [[n, =], [[v, c], [v, d]]]]=[_9534|_9536]
+ Exit: (19) [[[n, function1], [[v, a], [v, b], [v, d]]], [[n, =], [[v, c], [v, d]]]]=[[[n, function1], [[v, a], [v, b], [v, d]]], [[n, =], [[v, c], [v, d]]]]
+^ Call: (19) not(predicate_or_rule_name([[n, function1], [[v, a], [v, b], [v, d]]]))
+^ Unify: (19) not(user:predicate_or_rule_name([[n, function1], [[v, a], [v, b], [v|...]]]))
+ Call: (20) predicate_or_rule_name([[n, function1], [[v, a], [v, b], [v, d]]])
+ Fail: (20) predicate_or_rule_name([[n, function1], [[v, a], [v, b], [v, d]]])
+^ Exit: (19) not(user:predicate_or_rule_name([[n, function1], [[v, a], [v, b], [v|...]]]))
+ Call: (19) interpretstatement1([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[n, function1], [[v, a], [v, b], [v, d]]], [[[v, a], 458], [[v, b], 949], [[v, c], empty]], _9572, _9574, _9576)
+ Unify: (19) interpretstatement1([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[n, function1], [[v, a], [v, b], [v, d]]], [[[v, a], 458], [[v, b], 949], [[v, c], empty]], _9572, true, nocut)
+ Call: (20) [[n, function1], [[v, a], [v, b], [v, d]]]=[_9550, _9556]
+ Exit: (20) [[n, function1], [[v, a], [v, b], [v, d]]]=[[n, function1], [[v, a], [v, b], [v, d]]]
+ Call: (20) substitutevarsA1([[v, a], [v, b], [v, d]], [[[v, a], 458], [[v, b], 949], [[v, c], empty]], [], _9582, [], _9586)
+ Unify: (20) substitutevarsA1([[v, a], [v, b], [v, d]], [[[v, a], 458], [[v, b], 949], [[v, c], empty]], [], _9582, [], _9586)
+ Call: (21) substitutevarsA2([[v, a], [v, b], [v, d]], [[[v, a], 458], [[v, b], 949], [[v, c], empty]], [], _9582, [], _9586)
+ Unify: (21) substitutevarsA2([[v, a], [v, b], [v, d]], [[[v, a], 458], [[v, b], 949], [[v, c], empty]], [], _9582, [], _9586)
+ Call: (22) [[v, a], [v, b], [v, d]]=[_9562|_9564]
+ Exit: (22) [[v, a], [v, b], [v, d]]=[[v, a], [v, b], [v, d]]
+ Call: (22) getvalue([v, a], _9584, [[[v, a], 458], [[v, b], 949], [[v, c], empty]])
+ Unify: (22) getvalue([v, a], _9584, [[[v, a], 458], [[v, b], 949], [[v, c], empty]])
+^ Call: (23) not(isvar([v, a]))
+^ Unify: (23) not(user:isvar([v, a]))
+ Call: (24) isvar([v, a])
+ Unify: (24) isvar([v, a])
+ Exit: (24) isvar([v, a])
+^ Fail: (23) not(user:isvar([v, a]))
+ Redo: (22) getvalue([v, a], _9584, [[[v, a], 458], [[v, b], 949], [[v, c], empty]])
+ Call: (23) isvar([v, a])
+ Unify: (23) isvar([v, a])
+ Exit: (23) isvar([v, a])
+ Call: (23) isvalstrorundef(_9582)
+ Unify: (23) isvalstrorundef(_9582)
+ Call: (24) var(_9582)
+ Exit: (24) var(_9582)
+ Exit: (23) isvalstrorundef(_9582)
+ Call: (23) getvar([v, a], _9584, [[[v, a], 458], [[v, b], 949], [[v, c], empty]])
+ Unify: (23) getvar([v, a], _9584, [[[v, a], 458], [[v, b], 949], [[v, c], empty]])
+ Call: (24) lists:member([[v, a], _9574], [[[v, a], 458], [[v, b], 949], [[v, c], empty]])
+ Unify: (24) lists:member([[v, a], _9574], [[[v, a], 458], [[v, b], 949], [[v, c], empty]])
+ Exit: (24) lists:member([[v, a], 458], [[[v, a], 458], [[v, b], 949], [[v, c], empty]])
+^ Call: (24) not(458=empty)
+^ Unify: (24) not(user:(458=empty))
+^ Exit: (24) not(user:(458=empty))
+ Exit: (23) getvar([v, a], 458, [[[v, a], 458], [[v, b], 949], [[v, c], empty]])
+ Exit: (22) getvalue([v, a], 458, [[[v, a], 458], [[v, b], 949], [[v, c], empty]])
+ Call: (22) 458=empty
+ Fail: (22) 458=empty
+ Redo: (24) lists:member([[v, a], _9574], [[[v, a], 458], [[v, b], 949], [[v, c], empty]])
+ Fail: (24) lists:member([[v, a], _9574], [[[v, a], 458], [[v, b], 949], [[v, c], empty]])
+ Redo: (23) getvar([v, a], _9584, [[[v, a], 458], [[v, b], 949], [[v, c], empty]])
+ Unify: (23) getvar([v, a], empty, [[[v, a], 458], [[v, b], 949], [[v, c], empty]])
+^ Call: (24) not(member([[v, a], _9580], [[[v, a], 458], [[v, b], 949], [[v, c], empty]]))
+^ Unify: (24) not(user:member([[v, a], _9580], [[[v, a], 458], [[v, b], 949], [[v, c], empty]]))
+^ Fail: (24) not(user:member([[v, a], _9580], [[[v, a], 458], [[v, b], 949], [[v, c], empty]]))
+ Redo: (23) getvar([v, a], empty, [[[v, a], 458], [[v, b], 949], [[v, c], empty]])
+ Call: (24) lists:member([[v, a], empty], [[[v, a], 458], [[v, b], 949], [[v, c], empty]])
+ Unify: (24) lists:member([[v, a], empty], [[[v, a], 458], [[v, b], 949], [[v, c], empty]])
+ Fail: (24) lists:member([[v, a], empty], [[[v, a], 458], [[v, b], 949], [[v, c], empty]])
+ Fail: (23) getvar([v, a], _9584, [[[v, a], 458], [[v, b], 949], [[v, c], empty]])
+ Fail: (22) getvalue([v, a], _9584, [[[v, a], 458], [[v, b], 949], [[v, c], empty]])
+ Redo: (21) substitutevarsA2([[v, a], [v, b], [v, d]], [[[v, a], 458], [[v, b], 949], [[v, c], empty]], [], _9588, [], _9592)
+ Call: (22) getvalue([v, a], _9584, [[[v, a], 458], [[v, b], 949], [[v, c], empty]])
+ Unify: (22) getvalue([v, a], _9584, [[[v, a], 458], [[v, b], 949], [[v, c], empty]])
+^ Call: (23) not(isvar([v, a]))
+^ Unify: (23) not(user:isvar([v, a]))
+ Call: (24) isvar([v, a])
+ Unify: (24) isvar([v, a])
+ Exit: (24) isvar([v, a])
+^ Fail: (23) not(user:isvar([v, a]))
+ Redo: (22) getvalue([v, a], _9584, [[[v, a], 458], [[v, b], 949], [[v, c], empty]])
+ Call: (23) isvar([v, a])
+ Unify: (23) isvar([v, a])
+ Exit: (23) isvar([v, a])
+ Call: (23) isvalstrorundef(_9582)
+ Unify: (23) isvalstrorundef(_9582)
+ Call: (24) var(_9582)
+ Exit: (24) var(_9582)
+ Exit: (23) isvalstrorundef(_9582)
+ Call: (23) getvar([v, a], _9584, [[[v, a], 458], [[v, b], 949], [[v, c], empty]])
+ Unify: (23) getvar([v, a], _9584, [[[v, a], 458], [[v, b], 949], [[v, c], empty]])
+ Call: (24) lists:member([[v, a], _9574], [[[v, a], 458], [[v, b], 949], [[v, c], empty]])
+ Unify: (24) lists:member([[v, a], _9574], [[[v, a], 458], [[v, b], 949], [[v, c], empty]])
+ Exit: (24) lists:member([[v, a], 458], [[[v, a], 458], [[v, b], 949], [[v, c], empty]])
+^ Call: (24) not(458=empty)
+^ Unify: (24) not(user:(458=empty))
+^ Exit: (24) not(user:(458=empty))
+ Exit: (23) getvar([v, a], 458, [[[v, a], 458], [[v, b], 949], [[v, c], empty]])
+ Exit: (22) getvalue([v, a], 458, [[[v, a], 458], [[v, b], 949], [[v, c], empty]])
+ Call: (22) lists:append([], [458], _9616)
+ Unify: (22) lists:append([], [458], [458])
+ Exit: (22) lists:append([], [458], [458])
+ Call: (22) _9612=[]
+ Exit: (22) []=[]
+ Call: (22) substitutevarsA2([[v, b], [v, d]], [[[v, a], 458], [[v, b], 949], [[v, c], empty]], [458], _9618, [], _9622)
+ Unify: (22) substitutevarsA2([[v, b], [v, d]], [[[v, a], 458], [[v, b], 949], [[v, c], empty]], [458], _9618, [], _9622)
+ Call: (23) [[v, b], [v, d]]=[_9598|_9600]
+ Exit: (23) [[v, b], [v, d]]=[[v, b], [v, d]]
+ Call: (23) getvalue([v, b], _9620, [[[v, a], 458], [[v, b], 949], [[v, c], empty]])
+ Unify: (23) getvalue([v, b], _9620, [[[v, a], 458], [[v, b], 949], [[v, c], empty]])
+^ Call: (24) not(isvar([v, b]))
+^ Unify: (24) not(user:isvar([v, b]))
+ Call: (25) isvar([v, b])
+ Unify: (25) isvar([v, b])
+ Exit: (25) isvar([v, b])
+^ Fail: (24) not(user:isvar([v, b]))
+ Redo: (23) getvalue([v, b], _9620, [[[v, a], 458], [[v, b], 949], [[v, c], empty]])
+ Call: (24) isvar([v, b])
+ Unify: (24) isvar([v, b])
+ Exit: (24) isvar([v, b])
+ Call: (24) isvalstrorundef(_9618)
+ Unify: (24) isvalstrorundef(_9618)
+ Call: (25) var(_9618)
+ Exit: (25) var(_9618)
+ Exit: (24) isvalstrorundef(_9618)
+ Call: (24) getvar([v, b], _9620, [[[v, a], 458], [[v, b], 949], [[v, c], empty]])
+ Unify: (24) getvar([v, b], _9620, [[[v, a], 458], [[v, b], 949], [[v, c], empty]])
+ Call: (25) lists:member([[v, b], _9610], [[[v, a], 458], [[v, b], 949], [[v, c], empty]])
+ Unify: (25) lists:member([[v, b], _9610], [[[v, a], 458], [[v, b], 949], [[v, c], empty]])
+ Exit: (25) lists:member([[v, b], 949], [[[v, a], 458], [[v, b], 949], [[v, c], empty]])
+^ Call: (25) not(949=empty)
+^ Unify: (25) not(user:(949=empty))
+^ Exit: (25) not(user:(949=empty))
+ Exit: (24) getvar([v, b], 949, [[[v, a], 458], [[v, b], 949], [[v, c], empty]])
+ Exit: (23) getvalue([v, b], 949, [[[v, a], 458], [[v, b], 949], [[v, c], empty]])
+ Call: (23) 949=empty
+ Fail: (23) 949=empty
+ Redo: (25) lists:member([[v, b], _9610], [[[v, a], 458], [[v, b], 949], [[v, c], empty]])
+ Fail: (25) lists:member([[v, b], _9610], [[[v, a], 458], [[v, b], 949], [[v, c], empty]])
+ Redo: (24) getvar([v, b], _9620, [[[v, a], 458], [[v, b], 949], [[v, c], empty]])
+ Unify: (24) getvar([v, b], empty, [[[v, a], 458], [[v, b], 949], [[v, c], empty]])
+^ Call: (25) not(member([[v, b], _9616], [[[v, a], 458], [[v, b], 949], [[v, c], empty]]))
+^ Unify: (25) not(user:member([[v, b], _9616], [[[v, a], 458], [[v, b], 949], [[v, c], empty]]))
+^ Fail: (25) not(user:member([[v, b], _9616], [[[v, a], 458], [[v, b], 949], [[v, c], empty]]))
+ Redo: (24) getvar([v, b], empty, [[[v, a], 458], [[v, b], 949], [[v, c], empty]])
+ Call: (25) lists:member([[v, b], empty], [[[v, a], 458], [[v, b], 949], [[v, c], empty]])
+ Unify: (25) lists:member([[v, b], empty], [[[v, a], 458], [[v, b], 949], [[v, c], empty]])
+ Fail: (25) lists:member([[v, b], empty], [[[v, a], 458], [[v, b], 949], [[v, c], empty]])
+ Fail: (24) getvar([v, b], _9620, [[[v, a], 458], [[v, b], 949], [[v, c], empty]])
+ Fail: (23) getvalue([v, b], _9620, [[[v, a], 458], [[v, b], 949], [[v, c], empty]])
+ Redo: (22) substitutevarsA2([[v, b], [v, d]], [[[v, a], 458], [[v, b], 949], [[v, c], empty]], [458], _9624, [], _9628)
+ Call: (23) getvalue([v, b], _9620, [[[v, a], 458], [[v, b], 949], [[v, c], empty]])
+ Unify: (23) getvalue([v, b], _9620, [[[v, a], 458], [[v, b], 949], [[v, c], empty]])
+^ Call: (24) not(isvar([v, b]))
+^ Unify: (24) not(user:isvar([v, b]))
+ Call: (25) isvar([v, b])
+ Unify: (25) isvar([v, b])
+ Exit: (25) isvar([v, b])
+^ Fail: (24) not(user:isvar([v, b]))
+ Redo: (23) getvalue([v, b], _9620, [[[v, a], 458], [[v, b], 949], [[v, c], empty]])
+ Call: (24) isvar([v, b])
+ Unify: (24) isvar([v, b])
+ Exit: (24) isvar([v, b])
+ Call: (24) isvalstrorundef(_9618)
+ Unify: (24) isvalstrorundef(_9618)
+ Call: (25) var(_9618)
+ Exit: (25) var(_9618)
+ Exit: (24) isvalstrorundef(_9618)
+ Call: (24) getvar([v, b], _9620, [[[v, a], 458], [[v, b], 949], [[v, c], empty]])
+ Unify: (24) getvar([v, b], _9620, [[[v, a], 458], [[v, b], 949], [[v, c], empty]])
+ Call: (25) lists:member([[v, b], _9610], [[[v, a], 458], [[v, b], 949], [[v, c], empty]])
+ Unify: (25) lists:member([[v, b], _9610], [[[v, a], 458], [[v, b], 949], [[v, c], empty]])
+ Exit: (25) lists:member([[v, b], 949], [[[v, a], 458], [[v, b], 949], [[v, c], empty]])
+^ Call: (25) not(949=empty)
+^ Unify: (25) not(user:(949=empty))
+^ Exit: (25) not(user:(949=empty))
+ Exit: (24) getvar([v, b], 949, [[[v, a], 458], [[v, b], 949], [[v, c], empty]])
+ Exit: (23) getvalue([v, b], 949, [[[v, a], 458], [[v, b], 949], [[v, c], empty]])
+ Call: (23) lists:append([458], [949], _9652)
+ Unify: (23) lists:append([458], [949], [458|_9636])
+ Exit: (23) lists:append([458], [949], [458, 949])
+ Call: (23) _9654=[]
+ Exit: (23) []=[]
+ Call: (23) substitutevarsA2([[v, d]], [[[v, a], 458], [[v, b], 949], [[v, c], empty]], [458, 949], _9660, [], _9664)
+ Unify: (23) substitutevarsA2([[v, d]], [[[v, a], 458], [[v, b], 949], [[v, c], empty]], [458, 949], _9660, [], _9664)
+ Call: (24) [[v, d]]=[_9640|_9642]
+ Exit: (24) [[v, d]]=[[v, d]]
+ Call: (24) getvalue([v, d], _9662, [[[v, a], 458], [[v, b], 949], [[v, c], empty]])
+ Unify: (24) getvalue([v, d], _9662, [[[v, a], 458], [[v, b], 949], [[v, c], empty]])
+^ Call: (25) not(isvar([v, d]))
+^ Unify: (25) not(user:isvar([v, d]))
+ Call: (26) isvar([v, d])
+ Unify: (26) isvar([v, d])
+ Exit: (26) isvar([v, d])
+^ Fail: (25) not(user:isvar([v, d]))
+ Redo: (24) getvalue([v, d], _9662, [[[v, a], 458], [[v, b], 949], [[v, c], empty]])
+ Call: (25) isvar([v, d])
+ Unify: (25) isvar([v, d])
+ Exit: (25) isvar([v, d])
+ Call: (25) isvalstrorundef(_9660)
+ Unify: (25) isvalstrorundef(_9660)
+ Call: (26) var(_9660)
+ Exit: (26) var(_9660)
+ Exit: (25) isvalstrorundef(_9660)
+ Call: (25) getvar([v, d], _9662, [[[v, a], 458], [[v, b], 949], [[v, c], empty]])
+ Unify: (25) getvar([v, d], _9662, [[[v, a], 458], [[v, b], 949], [[v, c], empty]])
+ Call: (26) lists:member([[v, d], _9652], [[[v, a], 458], [[v, b], 949], [[v, c], empty]])
+ Unify: (26) lists:member([[v, d], _9652], [[[v, a], 458], [[v, b], 949], [[v, c], empty]])
+ Fail: (26) lists:member([[v, d], _9652], [[[v, a], 458], [[v, b], 949], [[v, c], empty]])
+ Redo: (25) getvar([v, d], _9662, [[[v, a], 458], [[v, b], 949], [[v, c], empty]])
+ Unify: (25) getvar([v, d], empty, [[[v, a], 458], [[v, b], 949], [[v, c], empty]])
+^ Call: (26) not(member([[v, d], _9658], [[[v, a], 458], [[v, b], 949], [[v, c], empty]]))
+^ Unify: (26) not(user:member([[v, d], _9658], [[[v, a], 458], [[v, b], 949], [[v, c], empty]]))
+^ Exit: (26) not(user:member([[v, d], _9658], [[[v, a], 458], [[v, b], 949], [[v, c], empty]]))
+ Call: (26) true
+ Exit: (26) true
+ Exit: (25) getvar([v, d], empty, [[[v, a], 458], [[v, b], 949], [[v, c], empty]])
+ Exit: (24) getvalue([v, d], empty, [[[v, a], 458], [[v, b], 949], [[v, c], empty]])
+ Call: (24) empty=empty
+ Exit: (24) empty=empty
+ Call: (24) lists:append([458, 949], [[v, d]], _9694)
+ Unify: (24) lists:append([458, 949], [[v, d]], [458|_9678])
+ Exit: (24) lists:append([458, 949], [[v, d]], [458, 949, [v, d]])
+ Call: (24) isvar([v, d])
+ Unify: (24) isvar([v, d])
+ Exit: (24) isvar([v, d])
+ Call: (24) lists:append([], [[v, d]], _9712)
+ Unify: (24) lists:append([], [[v, d]], [[v, d]])
+ Exit: (24) lists:append([], [[v, d]], [[v, d]])
+ Call: (24) substitutevarsA2([], [[[v, a], 458], [[v, b], 949], [[v, c], empty]], [458, 949, [v, d]], _9714, [[v, d]], _9718)
+ Unify: (24) substitutevarsA2([], [[[v, a], 458], [[v, b], 949], [[v, c], empty]], [458, 949, [v, d]], [458, 949, [v, d]], [[v, d]], [[v, d]])
+ Exit: (24) substitutevarsA2([], [[[v, a], 458], [[v, b], 949], [[v, c], empty]], [458, 949, [v, d]], [458, 949, [v, d]], [[v, d]], [[v, d]])
+ Exit: (23) substitutevarsA2([[v, d]], [[[v, a], 458], [[v, b], 949], [[v, c], empty]], [458, 949], [458, 949, [v, d]], [], [[v, d]])
+ Exit: (22) substitutevarsA2([[v, b], [v, d]], [[[v, a], 458], [[v, b], 949], [[v, c], empty]], [458], [458, 949, [v, d]], [], [[v, d]])
+ Exit: (21) substitutevarsA2([[v, a], [v, b], [v, d]], [[[v, a], 458], [[v, b], 949], [[v, c], empty]], [], [458, 949, [v, d]], [], [[v, d]])
+ Exit: (20) substitutevarsA1([[v, a], [v, b], [v, d]], [[[v, a], 458], [[v, b], 949], [[v, c], empty]], [], [458, 949, [v, d]], [], [[v, d]])
+ Call: (20) _9720=[[n, function1], [458, 949, [v, d]]]
+ Exit: (20) [[n, function1], [458, 949, [v, d]]]=[[n, function1], [458, 949, [v, d]]]
+ Call: (20) interpret2([[n, function1], [458, 949, [v, d]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], _9726)
+ Unify: (20) interpret2([[n, function1], [458, 949, [v, d]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], _9726)
+ Call: (21) member2([[n, function1], [458, 949, [v, d]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], _9726)
+ Unify: (21) member2([[n, function1], [458, 949, [v, d]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], _9726)
+ Call: (22) cut(off)
+ Unify: (22) cut(off)
+ Exit: (22) cut(off)
+ Call: (22) [[n, function1], [458, 949, [v, d]]]=[_9706, _9712]
+ Exit: (22) [[n, function1], [458, 949, [v, d]]]=[[n, function1], [458, 949, [v, d]]]
+ Call: (22) [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]]=[[[n, function1], _9730, ":-", _9748]|_9720]
+ Fail: (22) [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]]=[[[n, function1], _9730, ":-", _9748]|_9720]
+ Redo: (21) member2([[n, function1], [458, 949, [v, d]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], _9726)
+ Call: (22) member21([[n, function1], [458, 949, [v, d]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], _9726)
+ Unify: (22) member21([[n, function1], [458, 949, [v, d]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], _9726)
+ Call: (23) cut(off)
+ Unify: (23) cut(off)
+ Exit: (23) cut(off)
+ Call: (23) [[n, function1], [458, 949, [v, d]]]=[_9706]
+ Fail: (23) [[n, function1], [458, 949, [v, d]]]=[_9706]
+ Redo: (22) member21([[n, function1], [458, 949, [v, d]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], _9726)
+ Call: (23) member22([[n, function1], [458, 949, [v, d]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], _9726)
+ Unify: (23) member22([[n, function1], [458, 949, [v, d]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], _9726)
+ Call: (24) cut(off)
+ Unify: (24) cut(off)
+ Exit: (24) cut(off)
+ Call: (24) [[n, function1], [458, 949, [v, d]]]=[_9706, _9712]
+ Exit: (24) [[n, function1], [458, 949, [v, d]]]=[[n, function1], [458, 949, [v, d]]]
+ Call: (24) [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]]=[[[n, function1], _9730]|_9720]
+ Fail: (24) [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]]=[[[n, function1], _9730]|_9720]
+ Redo: (23) member22([[n, function1], [458, 949, [v, d]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], _9726)
+ Call: (24) member23([[n, function1], [458, 949, [v, d]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], _9726)
+ Unify: (24) member23([[n, function1], [458, 949, [v, d]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], _9726)
+ Call: (25) cut(off)
+ Unify: (25) cut(off)
+ Exit: (25) cut(off)
+ Call: (25) [[n, function1], [458, 949, [v, d]]]=[_9706]
+ Fail: (25) [[n, function1], [458, 949, [v, d]]]=[_9706]
+ Redo: (24) member23([[n, function1], [458, 949, [v, d]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], _9726)
+ Call: (25) [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]]=[_9706|_9708]
+ Exit: (25) [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...],
[...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]]=[[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]] + Call: (25) member2([[n, function1], [458, 949, [v, d]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function1], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]]]]], _9732) + Unify: (25) member2([[n, function1], [458, 949, [v, d]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function1], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]]]]], _9732) + Call: (26) cut(off) + Unify: (26) cut(off) + Exit: (26) cut(off) + Call: (26) [[n, function1], [458, 949, [v, d]]]=[_9712, _9718] + Exit: (26) [[n, function1], [458, 949, [v, d]]]=[[n, function1], [458, 949, [v, d]]] + Call: (26) [[[n, function1], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]]]]]=[[[n, function1], _9736, ":-", _9754]|_9726] + Exit: (26) [[[n, function1], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]]]]]=[[[n, function1], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]]]]] + Call: (26) length([458, 949, [v, d]], _9776) + Unify: (26) length([458, 949, [v, d]], _9776) + Exit: (26) length([458, 949, [v, d]], 3) + Call: (26) length([[v, a], [v, b], [v, c]], 3) + Unify: (26) length([[v, a], [v, b], [v, c]], 3) + Unify: (26) length([[v, a], [v, b], [v, c]], 3) + Exit: (26) length([[v, a], [v, b], [v, c]], 3) + Call: (26) checkarguments([458, 949, [v, d]], [[v, a], [v, b], [v, c]], [], _9780, [], _9784) + Unify: (26) checkarguments([458, 949, [v, d]], [[v, a], [v, b], [v, c]], [], _9780, [], _9784) + Call: (27) [458, 949, [v, d]]=[_9760|_9762] + Exit: (27) [458, 949, [v, 
d]]=[458, 949, [v, d]] + Call: (27) expressionnotatom3(458) + Unify: (27) expressionnotatom3(458) + Call: (28) expressionnotatom(458) + Unify: (28) expressionnotatom(458) + Call: (29) isvalstrempty(458) + Unify: (29) isvalstrempty(458) + Call: (30) var(458) + Fail: (30) var(458) + Redo: (29) isvalstrempty(458) + Unify: (29) isvalstrempty(458) + Call: (30) isval(458) + Unify: (30) isval(458) + Exit: (30) isval(458) + Exit: (29) isvalstrempty(458) + Exit: (28) expressionnotatom(458) + Exit: (27) expressionnotatom3(458) + Call: (27) [[v, a], [v, b], [v, c]]=[_9766|_9768] + Exit: (27) [[v, a], [v, b], [v, c]]=[[v, a], [v, b], [v, c]] +^ Call: (27) not(var([v, a])) +^ Unify: (27) not(user:var([v, a])) +^ Exit: (27) not(user:var([v, a])) + Call: (27) isvar([v, a]) + Unify: (27) isvar([v, a]) + Exit: (27) isvar([v, a]) + Call: (27) putvalue([v, a], 458, [], _9802) + Unify: (27) putvalue([v, a], 458, [], _9802) +^ Call: (28) not(isvar([v, a])) +^ Unify: (28) not(user:isvar([v, a])) + Call: (29) isvar([v, a]) + Unify: (29) isvar([v, a]) + Exit: (29) isvar([v, a]) +^ Fail: (28) not(user:isvar([v, a])) + Redo: (27) putvalue([v, a], 458, [], _9802) + Call: (28) isvar([v, a]) + Unify: (28) isvar([v, a]) + Exit: (28) isvar([v, a]) + Call: (28) isvalstrorundef(458) + Unify: (28) isvalstrorundef(458) + Call: (29) var(458) + Fail: (29) var(458) + Redo: (28) isvalstrorundef(458) + Unify: (28) isvalstrorundef(458) +^ Call: (29) not(var(458)) +^ Unify: (29) not(user:var(458)) +^ Exit: (29) not(user:var(458)) + Call: (29) isval(458) + Unify: (29) isval(458) + Exit: (29) isval(458) + Exit: (28) isvalstrorundef(458) + Call: (28) updatevar([v, a], 458, [], _9812) + Unify: (28) updatevar([v, a], 458, [], _9812) + Call: (29) lists:member([[v, a], empty], []) + Fail: (29) lists:member([[v, a], empty], []) + Redo: (28) updatevar([v, a], 458, [], _9812) +^ Call: (29) not(member([[v, a], _9804], [])) +^ Unify: (29) not(user:member([[v, a], _9804], [])) +^ Exit: (29) not(user:member([[v, a], 
_9804], [])) + Call: (29) _9804=empty + Exit: (29) empty=empty + Call: (29) true + Exit: (29) true + Call: (29) lists:append([], [[[v, a], 458]], _9852) + Unify: (29) lists:append([], [[[v, a], 458]], [[[v, a], 458]]) + Exit: (29) lists:append([], [[[v, a], 458]], [[[v, a], 458]]) + Call: (29) true + Exit: (29) true + Call: (29) true + Exit: (29) true + Exit: (28) updatevar([v, a], 458, [], [[[v, a], 458]]) + Exit: (27) putvalue([v, a], 458, [], [[[v, a], 458]]) + Call: (27) checkarguments([949, [v, d]], [[v, b], [v, c]], [[[v, a], 458]], _9854, [], _9858) + Unify: (27) checkarguments([949, [v, d]], [[v, b], [v, c]], [[[v, a], 458]], _9854, [], _9858) + Call: (28) [949, [v, d]]=[_9834|_9836] + Exit: (28) [949, [v, d]]=[949, [v, d]] + Call: (28) expressionnotatom3(949) + Unify: (28) expressionnotatom3(949) + Call: (29) expressionnotatom(949) + Unify: (29) expressionnotatom(949) + Call: (30) isvalstrempty(949) + Unify: (30) isvalstrempty(949) + Call: (31) var(949) + Fail: (31) var(949) + Redo: (30) isvalstrempty(949) + Unify: (30) isvalstrempty(949) + Call: (31) isval(949) + Unify: (31) isval(949) + Exit: (31) isval(949) + Exit: (30) isvalstrempty(949) + Exit: (29) expressionnotatom(949) + Exit: (28) expressionnotatom3(949) + Call: (28) [[v, b], [v, c]]=[_9840|_9842] + Exit: (28) [[v, b], [v, c]]=[[v, b], [v, c]] +^ Call: (28) not(var([v, b])) +^ Unify: (28) not(user:var([v, b])) +^ Exit: (28) not(user:var([v, b])) + Call: (28) isvar([v, b]) + Unify: (28) isvar([v, b]) + Exit: (28) isvar([v, b]) + Call: (28) putvalue([v, b], 949, [[[v, a], 458]], _9876) + Unify: (28) putvalue([v, b], 949, [[[v, a], 458]], _9876) +^ Call: (29) not(isvar([v, b])) +^ Unify: (29) not(user:isvar([v, b])) + Call: (30) isvar([v, b]) + Unify: (30) isvar([v, b]) + Exit: (30) isvar([v, b]) +^ Fail: (29) not(user:isvar([v, b])) + Redo: (28) putvalue([v, b], 949, [[[v, a], 458]], _9876) + Call: (29) isvar([v, b]) + Unify: (29) isvar([v, b]) + Exit: (29) isvar([v, b]) + Call: (29) 
isvalstrorundef(949) + Unify: (29) isvalstrorundef(949) + Call: (30) var(949) + Fail: (30) var(949) + Redo: (29) isvalstrorundef(949) + Unify: (29) isvalstrorundef(949) +^ Call: (30) not(var(949)) +^ Unify: (30) not(user:var(949)) +^ Exit: (30) not(user:var(949)) + Call: (30) isval(949) + Unify: (30) isval(949) + Exit: (30) isval(949) + Exit: (29) isvalstrorundef(949) + Call: (29) updatevar([v, b], 949, [[[v, a], 458]], _9886) + Unify: (29) updatevar([v, b], 949, [[[v, a], 458]], _9886) + Call: (30) lists:member([[v, b], empty], [[[v, a], 458]]) + Unify: (30) lists:member([[v, b], empty], [[[v, a], 458]]) + Fail: (30) lists:member([[v, b], empty], [[[v, a], 458]]) + Redo: (29) updatevar([v, b], 949, [[[v, a], 458]], _9886) +^ Call: (30) not(member([[v, b], _9878], [[[v, a], 458]])) +^ Unify: (30) not(user:member([[v, b], _9878], [[[v, a], 458]])) +^ Exit: (30) not(user:member([[v, b], _9878], [[[v, a], 458]])) + Call: (30) _9878=empty + Exit: (30) empty=empty + Call: (30) true + Exit: (30) true + Call: (30) lists:append([[[v, a], 458]], [[[v, b], 949]], _9926) + Unify: (30) lists:append([[[v, a], 458]], [[[v, b], 949]], [[[v, a], 458]|_9910]) + Exit: (30) lists:append([[[v, a], 458]], [[[v, b], 949]], [[[v, a], 458], [[v, b], 949]]) + Call: (30) true + Exit: (30) true + Call: (30) true + Exit: (30) true + Exit: (29) updatevar([v, b], 949, [[[v, a], 458]], [[[v, a], 458], [[v, b], 949]]) + Exit: (28) putvalue([v, b], 949, [[[v, a], 458]], [[[v, a], 458], [[v, b], 949]]) + Call: (28) checkarguments([[v, d]], [[v, c]], [[[v, a], 458], [[v, b], 949]], _9934, [], _9938) + Unify: (28) checkarguments([[v, d]], [[v, c]], [[[v, a], 458], [[v, b], 949]], _9934, [], _9938) + Call: (29) [[v, d]]=[_9914|_9916] + Exit: (29) [[v, d]]=[[v, d]] + Call: (29) expressionnotatom3([v, d]) + Unify: (29) expressionnotatom3([v, d]) + Call: (30) expressionnotatom([v, d]) + Unify: (30) expressionnotatom([v, d]) + Call: (31) isvalstrempty([v, d]) + Unify: (31) isvalstrempty([v, d]) + Call: 
(32) var([v, d]) + Fail: (32) var([v, d]) + Redo: (31) isvalstrempty([v, d]) + Unify: (31) isvalstrempty([v, d]) + Call: (32) isval([v, d]) + Unify: (32) isval([v, d]) + Fail: (32) isval([v, d]) + Redo: (31) isvalstrempty([v, d]) + Unify: (31) isvalstrempty([v, d]) + Fail: (31) isvalstrempty([v, d]) + Redo: (30) expressionnotatom([v, d]) + Unify: (30) expressionnotatom([v, d]) +^ Call: (31) not(atom([v, d])) +^ Unify: (31) not(user:atom([v, d])) +^ Exit: (31) not(user:atom([v, d])) + Call: (31) length([v, d], _9946) + Unify: (31) length([v, d], _9946) + Exit: (31) length([v, d], 2) + Call: (31) 2>=1 + Exit: (31) 2>=1 + Call: (31) expressionnotatom2([v, d]) + Unify: (31) expressionnotatom2([v, d]) + Call: (32) isvalstrempty(v) + Unify: (32) isvalstrempty(v) + Call: (33) var(v) + Fail: (33) var(v) + Redo: (32) isvalstrempty(v) + Unify: (32) isvalstrempty(v) + Call: (33) isval(v) + Unify: (33) isval(v) + Fail: (33) isval(v) + Redo: (32) isvalstrempty(v) + Unify: (32) isvalstrempty(v) + Fail: (32) isvalstrempty(v) + Fail: (31) expressionnotatom2([v, d]) + Redo: (30) expressionnotatom([v, d]) + Unify: (30) expressionnotatom([v, d]) + Call: (31) predicate_or_rule_name([v, d]) + Fail: (31) predicate_or_rule_name([v, d]) + Fail: (30) expressionnotatom([v, d]) + Redo: (29) expressionnotatom3([v, d]) + Unify: (29) expressionnotatom3([v, d]) +^ Call: (30) not([v, d]=[v, _9932]) +^ Unify: (30) not(user:([v, d]=[v, _9932])) +^ Fail: (30) not(user:([v, d]=[v, _9932])) + Fail: (29) expressionnotatom3([v, d]) + Redo: (28) checkarguments([[v, d]], [[v, c]], [[[v, a], 458], [[v, b], 949]], _9934, [], _9938) + Unify: (28) checkarguments([[v, d]], [[v, c]], [[[v, a], 458], [[v, b], 949]], _9934, [], _9938) + Call: (29) [[v, d]]=[_9914|_9916] + Exit: (29) [[v, d]]=[[v, d]] +^ Call: (29) not(var([v, d])) +^ Unify: (29) not(user:var([v, d])) +^ Exit: (29) not(user:var([v, d])) + Call: (29) isvar([v, d]) + Unify: (29) isvar([v, d]) + Exit: (29) isvar([v, d]) + Call: (29) [[v, 
c]]=[_9930|_9932] + Exit: (29) [[v, c]]=[[v, c]] + Call: (29) expressionnotatom3([v, c]) + Unify: (29) expressionnotatom3([v, c]) + Call: (30) expressionnotatom([v, c]) + Unify: (30) expressionnotatom([v, c]) + Call: (31) isvalstrempty([v, c]) + Unify: (31) isvalstrempty([v, c]) + Call: (32) var([v, c]) + Fail: (32) var([v, c]) + Redo: (31) isvalstrempty([v, c]) + Unify: (31) isvalstrempty([v, c]) + Call: (32) isval([v, c]) + Unify: (32) isval([v, c]) + Fail: (32) isval([v, c]) + Redo: (31) isvalstrempty([v, c]) + Unify: (31) isvalstrempty([v, c]) + Fail: (31) isvalstrempty([v, c]) + Redo: (30) expressionnotatom([v, c]) + Unify: (30) expressionnotatom([v, c]) +^ Call: (31) not(atom([v, c])) +^ Unify: (31) not(user:atom([v, c])) +^ Exit: (31) not(user:atom([v, c])) + Call: (31) length([v, c], _9962) + Unify: (31) length([v, c], _9962) + Exit: (31) length([v, c], 2) + Call: (31) 2>=1 + Exit: (31) 2>=1 + Call: (31) expressionnotatom2([v, c]) + Unify: (31) expressionnotatom2([v, c]) + Call: (32) isvalstrempty(v) + Unify: (32) isvalstrempty(v) + Call: (33) var(v) + Fail: (33) var(v) + Redo: (32) isvalstrempty(v) + Unify: (32) isvalstrempty(v) + Call: (33) isval(v) + Unify: (33) isval(v) + Fail: (33) isval(v) + Redo: (32) isvalstrempty(v) + Unify: (32) isvalstrempty(v) + Fail: (32) isvalstrempty(v) + Fail: (31) expressionnotatom2([v, c]) + Redo: (30) expressionnotatom([v, c]) + Unify: (30) expressionnotatom([v, c]) + Call: (31) predicate_or_rule_name([v, c]) + Fail: (31) predicate_or_rule_name([v, c]) + Fail: (30) expressionnotatom([v, c]) + Redo: (29) expressionnotatom3([v, c]) + Unify: (29) expressionnotatom3([v, c]) +^ Call: (30) not([v, c]=[v, _9948]) +^ Unify: (30) not(user:([v, c]=[v, _9948])) +^ Fail: (30) not(user:([v, c]=[v, _9948])) + Fail: (29) expressionnotatom3([v, c]) + Redo: (28) checkarguments([[v, d]], [[v, c]], [[[v, a], 458], [[v, b], 949]], _9934, [], _9938) + Unify: (28) checkarguments([[v, d]], [[v, c]], [[[v, a], 458], [[v, b], 949]], _9934, [], 
_9938) + Call: (29) [[v, d]]=[_9914|_9916] + Exit: (29) [[v, d]]=[[v, d]] +^ Call: (29) not(var([v, d])) +^ Unify: (29) not(user:var([v, d])) +^ Exit: (29) not(user:var([v, d])) + Call: (29) isvar([v, d]) + Unify: (29) isvar([v, d]) + Exit: (29) isvar([v, d]) + Call: (29) [[v, c]]=[_9930|_9932] + Exit: (29) [[v, c]]=[[v, c]] +^ Call: (29) not(var([v, c])) +^ Unify: (29) not(user:var([v, c])) +^ Exit: (29) not(user:var([v, c])) + Call: (29) isvar([v, c]) + Unify: (29) isvar([v, c]) + Exit: (29) isvar([v, c]) + Call: (29) getvalue([v, c], _9962, [[[v, a], 458], [[v, b], 949]]) + Unify: (29) getvalue([v, c], _9962, [[[v, a], 458], [[v, b], 949]]) +^ Call: (30) not(isvar([v, c])) +^ Unify: (30) not(user:isvar([v, c])) + Call: (31) isvar([v, c]) + Unify: (31) isvar([v, c]) + Exit: (31) isvar([v, c]) +^ Fail: (30) not(user:isvar([v, c])) + Redo: (29) getvalue([v, c], _9962, [[[v, a], 458], [[v, b], 949]]) + Call: (30) isvar([v, c]) + Unify: (30) isvar([v, c]) + Exit: (30) isvar([v, c]) + Call: (30) isvalstrorundef(_9960) + Unify: (30) isvalstrorundef(_9960) + Call: (31) var(_9960) + Exit: (31) var(_9960) + Exit: (30) isvalstrorundef(_9960) + Call: (30) getvar([v, c], _9962, [[[v, a], 458], [[v, b], 949]]) + Unify: (30) getvar([v, c], _9962, [[[v, a], 458], [[v, b], 949]]) + Call: (31) lists:member([[v, c], _9952], [[[v, a], 458], [[v, b], 949]]) + Unify: (31) lists:member([[v, c], _9952], [[[v, a], 458], [[v, b], 949]]) + Fail: (31) lists:member([[v, c], _9952], [[[v, a], 458], [[v, b], 949]]) + Redo: (30) getvar([v, c], _9962, [[[v, a], 458], [[v, b], 949]]) + Unify: (30) getvar([v, c], empty, [[[v, a], 458], [[v, b], 949]]) +^ Call: (31) not(member([[v, c], _9958], [[[v, a], 458], [[v, b], 949]])) +^ Unify: (31) not(user:member([[v, c], _9958], [[[v, a], 458], [[v, b], 949]])) +^ Exit: (31) not(user:member([[v, c], _9958], [[[v, a], 458], [[v, b], 949]])) + Call: (31) true + Exit: (31) true + Exit: (30) getvar([v, c], empty, [[[v, a], 458], [[v, b], 949]]) + Exit: (29) 
getvalue([v, c], empty, [[[v, a], 458], [[v, b], 949]]) + Call: (29) true + Exit: (29) true + Call: (29) putvalue([v, c], empty, [[[v, a], 458], [[v, b], 949]], _9990) + Unify: (29) putvalue([v, c], empty, [[[v, a], 458], [[v, b], 949]], _9990) +^ Call: (30) not(isvar([v, c])) +^ Unify: (30) not(user:isvar([v, c])) + Call: (31) isvar([v, c]) + Unify: (31) isvar([v, c]) + Exit: (31) isvar([v, c]) +^ Fail: (30) not(user:isvar([v, c])) + Redo: (29) putvalue([v, c], empty, [[[v, a], 458], [[v, b], 949]], _9990) + Call: (30) isvar([v, c]) + Unify: (30) isvar([v, c]) + Exit: (30) isvar([v, c]) + Call: (30) isvalstrorundef(empty) + Unify: (30) isvalstrorundef(empty) + Call: (31) var(empty) + Fail: (31) var(empty) + Redo: (30) isvalstrorundef(empty) + Unify: (30) isvalstrorundef(empty) +^ Call: (31) not(var(empty)) +^ Unify: (31) not(user:var(empty)) +^ Exit: (31) not(user:var(empty)) + Call: (31) isval(empty) + Unify: (31) isval(empty) + Fail: (31) isval(empty) + Redo: (30) isvalstrorundef(empty) + Unify: (30) isvalstrorundef(empty) +^ Call: (31) not(var(empty)) +^ Unify: (31) not(user:var(empty)) +^ Exit: (31) not(user:var(empty)) + Call: (31) expression(empty) + Unify: (31) expression(empty) + Exit: (31) expression(empty) + Exit: (30) isvalstrorundef(empty) + Call: (30) updatevar([v, c], empty, [[[v, a], 458], [[v, b], 949]], _10000) + Unify: (30) updatevar([v, c], empty, [[[v, a], 458], [[v, b], 949]], _10000) + Call: (31) lists:member([[v, c], empty], [[[v, a], 458], [[v, b], 949]]) + Unify: (31) lists:member([[v, c], empty], [[[v, a], 458], [[v, b], 949]]) + Fail: (31) lists:member([[v, c], empty], [[[v, a], 458], [[v, b], 949]]) + Redo: (30) updatevar([v, c], empty, [[[v, a], 458], [[v, b], 949]], _10000) +^ Call: (31) not(member([[v, c], _9992], [[[v, a], 458], [[v, b], 949]])) +^ Unify: (31) not(user:member([[v, c], _9992], [[[v, a], 458], [[v, b], 949]])) +^ Exit: (31) not(user:member([[v, c], _9992], [[[v, a], 458], [[v, b], 949]])) + Call: (31) _9992=empty + 
Exit: (31) empty=empty + Call: (31) true + Exit: (31) true + Call: (31) lists:append([[[v, a], 458], [[v, b], 949]], [[[v, c], empty]], _10040) + Unify: (31) lists:append([[[v, a], 458], [[v, b], 949]], [[[v, c], empty]], [[[v, a], 458]|_10024]) + Exit: (31) lists:append([[[v, a], 458], [[v, b], 949]], [[[v, c], empty]], [[[v, a], 458], [[v, b], 949], [[v, c], empty]]) + Call: (31) true + Exit: (31) true + Call: (31) true + Exit: (31) true + Exit: (30) updatevar([v, c], empty, [[[v, a], 458], [[v, b], 949]], [[[v, a], 458], [[v, b], 949], [[v, c], empty]]) + Exit: (29) putvalue([v, c], empty, [[[v, a], 458], [[v, b], 949]], [[[v, a], 458], [[v, b], 949], [[v, c], empty]]) + Call: (29) lists:append([], [[[v, d], [v, c]]], _10070) + Unify: (29) lists:append([], [[[v, d], [v, c]]], [[[v, d], [v, c]]]) + Exit: (29) lists:append([], [[[v, d], [v, c]]], [[[v, d], [v, c]]]) + Call: (29) checkarguments([], [], [[[v, a], 458], [[v, b], 949], [[v, c], empty]], _10072, [[[v, d], [v, c]]], _10076) + Unify: (29) checkarguments([], [], [[[v, a], 458], [[v, b], 949], [[v, c], empty]], [[[v, a], 458], [[v, b], 949], [[v, c], empty]], [[[v, d], [v, c]]], [[[v, d], [v, c]]]) + Exit: (29) checkarguments([], [], [[[v, a], 458], [[v, b], 949], [[v, c], empty]], [[[v, a], 458], [[v, b], 949], [[v, c], empty]], [[[v, d], [v, c]]], [[[v, d], [v, c]]]) + Exit: (28) checkarguments([[v, d]], [[v, c]], [[[v, a], 458], [[v, b], 949]], [[[v, a], 458], [[v, b], 949], [[v, c], empty]], [], [[[v, d], [v, c]]]) + Exit: (27) checkarguments([949, [v, d]], [[v, b], [v, c]], [[[v, a], 458]], [[[v, a], 458], [[v, b], 949], [[v, c], empty]], [], [[[v, d], [v, c]]]) + Exit: (26) checkarguments([458, 949, [v, d]], [[v, a], [v, b], [v, c]], [], [[[v, a], 458], [[v, b], 949], [[v, c], empty]], [], [[[v, d], [v, c]]]) + Call: (26) debug(on) + Fail: (26) debug(on) + Redo: (25) member2([[n, function1], [458, 949, [v, d]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], 
[[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function1], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]]]]], _10072) + Call: (26) true + Exit: (26) true + Call: (26) interpretbody([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function1], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]]]]], [[[v, a], 458], [[v, b], 949], [[v, c], empty]], _10072, [[[n, +], [[v, a], [v, b], [v, c]]]], true) + Unify: (26) interpretbody([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function1], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]]]]], [[[v, a], 458], [[v, b], 949], [[v, c], empty]], _10072, [[[n, +], [[v, a], [v, b], [v, c]]]], true) + Call: (27) [[[n, +], [[v, a], [v, b], [v, c]]]]=[[_10058|_10060]|_10054] + Exit: (27) [[[n, +], [[v, a], [v, b], [v, c]]]]=[[[n, +], [[v, a], [v, b], [v, c]]]] +^ Call: (27) not(predicate_or_rule_name([n, +])) +^ Unify: (27) not(user:predicate_or_rule_name([n, +])) + Call: (28) predicate_or_rule_name([n, +]) + Unify: (28) predicate_or_rule_name([n, +]) + Exit: (28) predicate_or_rule_name([n, +]) +^ Fail: (27) not(user:predicate_or_rule_name([n, +])) + Redo: (26) interpretbody([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function1], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]]]]], [[[v, a], 458], [[v, b], 949], [[v, c], empty]], _10072, [[[n, +], [[v, a], [v, b], [v, c]]]], true) + Unify: (26) interpretbody([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function1], [[v, a], [v, b], [v, 
c]], ":-", [[[n|...], [...|...]]]]], [[[v, a], 458], [[v, b], 949], [[v, c], empty]], _10072, [[[n, +], [[v, a], [v, b], [v, c]]]], true) + Call: (27) [[[n, +], [[v, a], [v, b], [v, c]]]]=[[[n, not], [_10082]]|_10054] + Fail: (27) [[[n, +], [[v, a], [v, b], [v, c]]]]=[[[n, not], [_10082]]|_10054] + Redo: (26) interpretbody([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function1], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]]]]], [[[v, a], 458], [[v, b], 949], [[v, c], empty]], _10072, [[[n, +], [[v, a], [v, b], [v, c]]]], true) + Unify: (26) interpretbody([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function1], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]]]]], [[[v, a], 458], [[v, b], 949], [[v, c], empty]], _10072, [[[n, +], [[v, a], [v, b], [v, c]]]], true) + Call: (27) [[[n, +], [[v, a], [v, b], [v, c]]]]=[[[n, or], [_10082, _10088]]|_10054] + Fail: (27) [[[n, +], [[v, a], [v, b], [v, c]]]]=[[[n, or], [_10082, _10088]]|_10054] + Redo: (26) interpretbody([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function1], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]]]]], [[[v, a], 458], [[v, b], 949], [[v, c], empty]], _10072, [[[n, +], [[v, a], [v, b], [v, c]]]], true) + Unify: (26) interpretbody([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function1], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]]]]], [[[v, a], 458], [[v, b], 949], [[v, c], empty]], _10072, [[[n, +], [[v, a], [v, b], [v, c]]]], true) + Call: (27) [[[n, +], [[v, a], [v, b], [v, c]]]]=[[[n, 
"->"], [_10088, _10094]]|_10054] + Fail: (27) [[[n, +], [[v, a], [v, b], [v, c]]]]=[[[n, "->"], [_10088, _10094]]|_10054] + Redo: (26) interpretbody([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function1], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]]]]], [[[v, a], 458], [[v, b], 949], [[v, c], empty]], _10072, [[[n, +], [[v, a], [v, b], [v, c]]]], true) + Unify: (26) interpretbody([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function1], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]]]]], [[[v, a], 458], [[v, b], 949], [[v, c], empty]], _10072, [[[n, +], [[v, a], [v, b], [v, c]]]], true) + Call: (27) [[[n, +], [[v, a], [v, b], [v, c]]]]=[[[n, "->"], [_10088, _10094, _10100]]|_10054] + Fail: (27) [[[n, +], [[v, a], [v, b], [v, c]]]]=[[[n, "->"], [_10088, _10094, _10100]]|_10054] + Redo: (26) interpretbody([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function1], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]]]]], [[[v, a], 458], [[v, b], 949], [[v, c], empty]], _10072, [[[n, +], [[v, a], [v, b], [v, c]]]], true) + Unify: (26) interpretbody([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function1], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]]]]], [[[v, a], 458], [[v, b], 949], [[v, c], empty]], _10072, [[[n, +], [[v, a], [v, b], [v, c]]]], true) + Call: (27) [[[n, +], [[v, a], [v, b], [v, c]]]]=[_10052|_10054] + Exit: (27) [[[n, +], [[v, a], [v, b], [v, c]]]]=[[[n, +], [[v, a], [v, b], [v, c]]]] +^ Call: (27) not(predicate_or_rule_name([[n, +], [[v, a], 
[v, b], [v, c]]])) +^ Unify: (27) not(user:predicate_or_rule_name([[n, +], [[v, a], [v, b], [v|...]]])) + Call: (28) predicate_or_rule_name([[n, +], [[v, a], [v, b], [v, c]]]) + Fail: (28) predicate_or_rule_name([[n, +], [[v, a], [v, b], [v, c]]]) +^ Exit: (27) not(user:predicate_or_rule_name([[n, +], [[v, a], [v, b], [v|...]]])) + Call: (27) interpretstatement1([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function1], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]]]]], [[n, +], [[v, a], [v, b], [v, c]]], [[[v, a], 458], [[v, b], 949], [[v, c], empty]], _10090, _10092, _10094) + Unify: (27) interpretstatement1([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function1], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]]]]], [[n, +], [[v, a], [v, b], [v, c]]], [[[v, a], 458], [[v, b], 949], [[v, c], empty]], _10090, true, nocut) + Call: (28) interpretpart(isplus, [v, c], [v, a], [v, b], [[[v, a], 458], [[v, b], 949], [[v, c], empty]], _10092) + Unify: (28) interpretpart(isplus, [v, c], [v, a], [v, b], [[[v, a], 458], [[v, b], 949], [[v, c], empty]], _10092) + Call: (29) getvalues([v, c], [v, a], [v, b], _10088, _10090, _10092, [[[v, a], 458], [[v, b], 949], [[v, c], empty]]) + Unify: (29) getvalues([v, c], [v, a], [v, b], _10088, _10090, _10092, [[[v, a], 458], [[v, b], 949], [[v, c], empty]]) + Call: (30) getvalue([v, c], _10084, [[[v, a], 458], [[v, b], 949], [[v, c], empty]]) + Unify: (30) getvalue([v, c], _10084, [[[v, a], 458], [[v, b], 949], [[v, c], empty]]) +^ Call: (31) not(isvar([v, c])) +^ Unify: (31) not(user:isvar([v, c])) + Call: (32) isvar([v, c]) + Unify: (32) isvar([v, c]) + Exit: (32) isvar([v, c]) +^ Fail: (31) not(user:isvar([v, c])) + Redo: (30) getvalue([v, c], _10084, [[[v, a], 458], [[v, b], 
949], [[v, c], empty]])
+ Call: (31) isvar([v, c])
+ Unify: (31) isvar([v, c])
+ Exit: (31) isvar([v, c])
+ Call: (31) isvalstrorundef(_10082)
+ Unify: (31) isvalstrorundef(_10082)
+ Call: (32) var(_10082)
+ Exit: (32) var(_10082)
+ Exit: (31) isvalstrorundef(_10082)
+ Call: (31) getvar([v, c], _10084, [[[v, a], 458], [[v, b], 949], [[v, c], empty]])
+ Unify: (31) getvar([v, c], _10084, [[[v, a], 458], [[v, b], 949], [[v, c], empty]])
+ Call: (32) lists:member([[v, c], _10074], [[[v, a], 458], [[v, b], 949], [[v, c], empty]])
+ Unify: (32) lists:member([[v, c], _10074], [[[v, a], 458], [[v, b], 949], [[v, c], empty]])
+ Exit: (32) lists:member([[v, c], empty], [[[v, a], 458], [[v, b], 949], [[v, c], empty]])
+^ Call: (32) not(empty=empty)
+^ Unify: (32) not(user:(empty=empty))
+^ Fail: (32) not(user:(empty=empty))
+ Redo: (31) getvar([v, c], _10084, [[[v, a], 458], [[v, b], 949], [[v, c], empty]])
+ Unify: (31) getvar([v, c], empty, [[[v, a], 458], [[v, b], 949], [[v, c], empty]])
+^ Call: (32) not(member([[v, c], _10080], [[[v, a], 458], [[v, b], 949], [[v, c], empty]]))
+^ Unify: (32) not(user:member([[v, c], _10080], [[[v, a], 458], [[v, b], 949], [[v, c], empty]]))
+^ Fail: (32) not(user:member([[v, c], _10080], [[[v, a], 458], [[v, b], 949], [[v, c], empty]]))
+ Redo: (31) getvar([v, c], empty, [[[v, a], 458], [[v, b], 949], [[v, c], empty]])
+ Call: (32) lists:member([[v, c], empty], [[[v, a], 458], [[v, b], 949], [[v, c], empty]])
+ Unify: (32) lists:member([[v, c], empty], [[[v, a], 458], [[v, b], 949], [[v, c], empty]])
+ Exit: (32) lists:member([[v, c], empty], [[[v, a], 458], [[v, b], 949], [[v, c], empty]])
+ Exit: (31) getvar([v, c], empty, [[[v, a], 458], [[v, b], 949], [[v, c], empty]])
+ Exit: (30) getvalue([v, c], empty, [[[v, a], 458], [[v, b], 949], [[v, c], empty]])
+ Call: (30) getvalue([v, a], _10096, [[[v, a], 458], [[v, b], 949], [[v, c], empty]])
+ Unify: (30) getvalue([v, a], _10096, [[[v, a], 458], [[v, b], 949], [[v, c], empty]])
+^ Call: (31) not(isvar([v, a]))
+^ Unify: (31) not(user:isvar([v, a]))
+ Call: (32) isvar([v, a])
+ Unify: (32) isvar([v, a])
+ Exit: (32) isvar([v, a])
+^ Fail: (31) not(user:isvar([v, a]))
+ Redo: (30) getvalue([v, a], _10096, [[[v, a], 458], [[v, b], 949], [[v, c], empty]])
+ Call: (31) isvar([v, a])
+ Unify: (31) isvar([v, a])
+ Exit: (31) isvar([v, a])
+ Call: (31) isvalstrorundef(_10094)
+ Unify: (31) isvalstrorundef(_10094)
+ Call: (32) var(_10094)
+ Exit: (32) var(_10094)
+ Exit: (31) isvalstrorundef(_10094)
+ Call: (31) getvar([v, a], _10096, [[[v, a], 458], [[v, b], 949], [[v, c], empty]])
+ Unify: (31) getvar([v, a], _10096, [[[v, a], 458], [[v, b], 949], [[v, c], empty]])
+ Call: (32) lists:member([[v, a], _10086], [[[v, a], 458], [[v, b], 949], [[v, c], empty]])
+ Unify: (32) lists:member([[v, a], _10086], [[[v, a], 458], [[v, b], 949], [[v, c], empty]])
+ Exit: (32) lists:member([[v, a], 458], [[[v, a], 458], [[v, b], 949], [[v, c], empty]])
+^ Call: (32) not(458=empty)
+^ Unify: (32) not(user:(458=empty))
+^ Exit: (32) not(user:(458=empty))
+ Exit: (31) getvar([v, a], 458, [[[v, a], 458], [[v, b], 949], [[v, c], empty]])
+ Exit: (30) getvalue([v, a], 458, [[[v, a], 458], [[v, b], 949], [[v, c], empty]])
+ Call: (30) getvalue([v, b], _10120, [[[v, a], 458], [[v, b], 949], [[v, c], empty]])
+ Unify: (30) getvalue([v, b], _10120, [[[v, a], 458], [[v, b], 949], [[v, c], empty]])
+^ Call: (31) not(isvar([v, b]))
+^ Unify: (31) not(user:isvar([v, b]))
+ Call: (32) isvar([v, b])
+ Unify: (32) isvar([v, b])
+ Exit: (32) isvar([v, b])
+^ Fail: (31) not(user:isvar([v, b]))
+ Redo: (30) getvalue([v, b], _10120, [[[v, a], 458], [[v, b], 949], [[v, c], empty]])
+ Call: (31) isvar([v, b])
+ Unify: (31) isvar([v, b])
+ Exit: (31) isvar([v, b])
+ Call: (31) isvalstrorundef(_10118)
+ Unify: (31) isvalstrorundef(_10118)
+ Call: (32) var(_10118)
+ Exit: (32) var(_10118)
+ Exit: (31) isvalstrorundef(_10118)
+ Call: (31) getvar([v, b], _10120, [[[v, a], 458], [[v, b], 949], [[v, c], empty]])
+ Unify: (31) getvar([v, b], _10120, [[[v, a], 458], [[v, b], 949], [[v, c], empty]])
+ Call: (32) lists:member([[v, b], _10110], [[[v, a], 458], [[v, b], 949], [[v, c], empty]])
+ Unify: (32) lists:member([[v, b], _10110], [[[v, a], 458], [[v, b], 949], [[v, c], empty]])
+ Exit: (32) lists:member([[v, b], 949], [[[v, a], 458], [[v, b], 949], [[v, c], empty]])
+^ Call: (32) not(949=empty)
+^ Unify: (32) not(user:(949=empty))
+^ Exit: (32) not(user:(949=empty))
+ Exit: (31) getvar([v, b], 949, [[[v, a], 458], [[v, b], 949], [[v, c], empty]])
+ Exit: (30) getvalue([v, b], 949, [[[v, a], 458], [[v, b], 949], [[v, c], empty]])
+ Exit: (29) getvalues([v, c], [v, a], [v, b], empty, 458, 949, [[[v, a], 458], [[v, b], 949], [[v, c], empty]])
+ Call: (29) isvalempty(empty)
+ Unify: (29) isvalempty(empty)
+ Call: (30) isval(empty)
+ Unify: (30) isval(empty)
+ Fail: (30) isval(empty)
+ Redo: (29) isvalempty(empty)
+ Call: (30) empty=empty
+ Exit: (30) empty=empty
+ Exit: (29) isvalempty(empty)
+ Call: (29) isval(458)
+ Unify: (29) isval(458)
+ Exit: (29) isval(458)
+ Call: (29) isval(949)
+ Unify: (29) isval(949)
+ Exit: (29) isval(949)
+ Call: (29) _10148 is 458+949
+ Exit: (29) 1407 is 458+949
+ Call: (29) val1emptyorvalsequal(empty, 1407)
+ Unify: (29) val1emptyorvalsequal(empty, 1407)
+ Exit: (29) val1emptyorvalsequal(empty, 1407)
+ Call: (29) putvalue([v, c], 1407, [[[v, a], 458], [[v, b], 949], [[v, c], empty]], _10154)
+ Unify: (29) putvalue([v, c], 1407, [[[v, a], 458], [[v, b], 949], [[v, c], empty]], _10154)
+^ Call: (30) not(isvar([v, c]))
+^ Unify: (30) not(user:isvar([v, c]))
+ Call: (31) isvar([v, c])
+ Unify: (31) isvar([v, c])
+ Exit: (31) isvar([v, c])
+^ Fail: (30) not(user:isvar([v, c]))
+ Redo: (29) putvalue([v, c], 1407, [[[v, a], 458], [[v, b], 949], [[v, c], empty]], _10154)
+ Call: (30) isvar([v, c])
+ Unify: (30) isvar([v, c])
+ Exit: (30) isvar([v, c])
+ Call: (30) isvalstrorundef(1407)
+ Unify: (30) isvalstrorundef(1407)
+ Call: (31) var(1407)
+ Fail: (31) var(1407)
+ Redo: (30) isvalstrorundef(1407)
+ Unify: (30) isvalstrorundef(1407)
+^ Call: (31) not(var(1407))
+^ Unify: (31) not(user:var(1407))
+^ Exit: (31) not(user:var(1407))
+ Call: (31) isval(1407)
+ Unify: (31) isval(1407)
+ Exit: (31) isval(1407)
+ Exit: (30) isvalstrorundef(1407)
+ Call: (30) updatevar([v, c], 1407, [[[v, a], 458], [[v, b], 949], [[v, c], empty]], _10164)
+ Unify: (30) updatevar([v, c], 1407, [[[v, a], 458], [[v, b], 949], [[v, c], empty]], _10164)
+ Call: (31) lists:member([[v, c], empty], [[[v, a], 458], [[v, b], 949], [[v, c], empty]])
+ Unify: (31) lists:member([[v, c], empty], [[[v, a], 458], [[v, b], 949], [[v, c], empty]])
+ Exit: (31) lists:member([[v, c], empty], [[[v, a], 458], [[v, b], 949], [[v, c], empty]])
+ Call: (31) lists:delete([[[v, a], 458], [[v, b], 949], [[v, c], empty]], [[v, c], empty], _10186)
+ Unify: (31) lists:delete([[[v, a], 458], [[v, b], 949], [[v, c], empty]], [[v, c], empty], _10186)
+ Exit: (31) lists:delete([[[v, a], 458], [[v, b], 949], [[v, c], empty]], [[v, c], empty], [[[v, a], 458], [[v, b], 949]])
+ Call: (31) lists:append([[[v, a], 458], [[v, b], 949]], [[[v, c], 1407]], _10216)
+ Unify: (31) lists:append([[[v, a], 458], [[v, b], 949]], [[[v, c], 1407]], [[[v, a], 458]|_10200])
+ Exit: (31) lists:append([[[v, a], 458], [[v, b], 949]], [[[v, c], 1407]], [[[v, a], 458], [[v, b], 949], [[v, c], 1407]])
+ Call: (31) true
+ Exit: (31) true
+ Call: (31) true
+ Exit: (31) true
+ Call: (31) true
+ Exit: (31) true
+ Exit: (30) updatevar([v, c], 1407, [[[v, a], 458], [[v, b], 949], [[v, c], empty]], [[[v, a], 458], [[v, b], 949], [[v, c], 1407]])
+ Exit: (29) putvalue([v, c], 1407, [[[v, a], 458], [[v, b], 949], [[v, c], empty]], [[[v, a], 458], [[v, b], 949], [[v, c], 1407]])
+ Call: (29) debug(on)
+ Fail: (29) debug(on)
+ Redo: (28) interpretpart(isplus, [v, c], [v, a], [v, b], [[[v, a], 458], [[v, b], 949], [[v, c], empty]], [[[v, a], 458], [[v, b], 949], [[v, c], 1407]])
+ Call: (29) true
+ Exit: (29) true
+ Call: (29) debug(on)
+ Fail: (29) debug(on)
+ Redo: (28) interpretpart(isplus, [v, c], [v, a], [v, b], [[[v, a], 458], [[v, b], 949], [[v, c], empty]], [[[v, a], 458], [[v, b], 949], [[v, c], 1407]])
+ Call: (29) true
+ Exit: (29) true
+ Exit: (28) interpretpart(isplus, [v, c], [v, a], [v, b], [[[v, a], 458], [[v, b], 949], [[v, c], empty]], [[[v, a], 458], [[v, b], 949], [[v, c], 1407]])
+ Exit: (27) interpretstatement1([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function1], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]]]]], [[n, +], [[v, a], [v, b], [v, c]]], [[[v, a], 458], [[v, b], 949], [[v, c], empty]], [[[v, a], 458], [[v, b], 949], [[v, c], 1407]], true, nocut)
+^ Call: (27) not(nocut=cut)
+^ Unify: (27) not(user:(nocut=cut))
+^ Exit: (27) not(user:(nocut=cut))
+ Call: (27) _10236=[[[n, function1], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]]]]]
+ Exit: (27) [[[n, function1], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]]]]]=[[[n, function1], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]]]]]
+ Call: (27) interpretbody([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function1], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]]]]], [[[v, a], 458], [[v, b], 949], [[v, c], 1407]], _10242, [], _10246)
+ Unify: (27) interpretbody([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function1], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]]]]], [[[v, a], 458], [[v, b], 949], [[v, c], 1407]], [[[v, a], 458], [[v, b], 949], [[v, c], 1407]], [], true)
+ Exit: (27) interpretbody([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function1], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]]]]], [[[v, a], 458], [[v, b], 949], [[v, c], 1407]], [[[v, a], 458], [[v, b], 949], [[v, c], 1407]], [], true)
+ Call: (27) logicalconjunction(true, true, true)
+ Unify: (27) logicalconjunction(true, true, true)
+ Call: (28) true(true)
+ Unify: (28) true(true)
+ Exit: (28) true(true)
+ Call: (28) true(true)
+ Unify: (28) true(true)
+ Exit: (28) true(true)
+ Exit: (27) logicalconjunction(true, true, true)
+ Exit: (26) interpretbody([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function1], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]]]]], [[[v, a], 458], [[v, b], 949], [[v, c], empty]], [[[v, a], 458], [[v, b], 949], [[v, c], 1407]], [[[n, +], [[v, a], [v, b], [v, c]]]], true)
+ Call: (26) true
+ Exit: (26) true
+ Call: (26) updatevars([[[v, d], [v, c]]], [[[v, a], 458], [[v, b], 949], [[v, c], 1407]], [], _10242)
+ Unify: (26) updatevars([[[v, d], [v, c]]], [[[v, a], 458], [[v, b], 949], [[v, c], 1407]], [], _10242)
+ Call: (27) [[[v, d], [v, c]]]=[[_10228, _10234]|_10224]
+ Exit: (27) [[[v, d], [v, c]]]=[[[v, d], [v, c]]]
+ Call: (27) expressionnotatom([v, c])
+ Unify: (27) expressionnotatom([v, c])
+ Call: (28) isvalstrempty([v, c])
+ Unify: (28) isvalstrempty([v, c])
+ Call: (29) var([v, c])
+ Fail: (29) var([v, c])
+ Redo: (28) isvalstrempty([v, c])
+ Unify: (28) isvalstrempty([v, c])
+ Call: (29) isval([v, c])
+ Unify: (29) isval([v, c])
+ Fail: (29) isval([v, c])
+ Redo: (28) isvalstrempty([v, c])
+ Unify: (28) isvalstrempty([v, c])
+ Fail: (28) isvalstrempty([v, c])
+ Redo: (27) expressionnotatom([v, c])
+ Unify: (27) expressionnotatom([v, c])
+^ Call: (28) not(atom([v, c]))
+^ Unify: (28) not(user:atom([v, c]))
+^ Exit: (28) not(user:atom([v, c]))
+ Call: (28) length([v, c], _10266)
+ Unify: (28) length([v, c], _10266)
+ Exit: (28) length([v, c], 2)
+ Call: (28) 2>=1
+ Exit: (28) 2>=1
+ Call: (28) expressionnotatom2([v, c])
+ Unify: (28) expressionnotatom2([v, c])
+ Call: (29) isvalstrempty(v)
+ Unify: (29) isvalstrempty(v)
+ Call: (30) var(v)
+ Fail: (30) var(v)
+ Redo: (29) isvalstrempty(v)
+ Unify: (29) isvalstrempty(v)
+ Call: (30) isval(v)
+ Unify: (30) isval(v)
+ Fail: (30) isval(v)
+ Redo: (29) isvalstrempty(v)
+ Unify: (29) isvalstrempty(v)
+ Fail: (29) isvalstrempty(v)
+ Fail: (28) expressionnotatom2([v, c])
+ Redo: (27) expressionnotatom([v, c])
+ Unify: (27) expressionnotatom([v, c])
+ Call: (28) predicate_or_rule_name([v, c])
+ Fail: (28) predicate_or_rule_name([v, c])
+ Fail: (27) expressionnotatom([v, c])
+ Redo: (26) updatevars([[[v, d], [v, c]]], [[[v, a], 458], [[v, b], 949], [[v, c], 1407]], [], _10260)
+ Call: (27) lists:member([[v, c], _10246], [[[v, a], 458], [[v, b], 949], [[v, c], 1407]])
+ Unify: (27) lists:member([[v, c], _10246], [[[v, a], 458], [[v, b], 949], [[v, c], 1407]])
+ Exit: (27) lists:member([[v, c], 1407], [[[v, a], 458], [[v, b], 949], [[v, c], 1407]])
+ Call: (27) lists:append([], [[[v, d], 1407]], _10288)
+ Unify: (27) lists:append([], [[[v, d], 1407]], [[[v, d], 1407]])
+ Exit: (27) lists:append([], [[[v, d], 1407]], [[[v, d], 1407]])
+ Call: (27) updatevars([], [[[v, a], 458], [[v, b], 949], [[v, c], 1407]], [[[v, d], 1407]], _10290)
+ Unify: (27) updatevars([], [[[v, a], 458], [[v, b], 949], [[v, c], 1407]], [[[v, d], 1407]], [[[v, d], 1407]])
+ Exit: (27) updatevars([], [[[v, a], 458], [[v, b], 949], [[v, c], 1407]], [[[v, d], 1407]], [[[v, d], 1407]])
+ Exit: (26) updatevars([[[v, d], [v, c]]], [[[v, a], 458], [[v, b], 949], [[v, c], 1407]], [], [[[v, d], 1407]])
+^ Call: (26) not([[[v, d], 1407]]=[])
+^ Unify: (26) not(user:([[[v, d], 1407]]=[]))
+^ Exit: (26) not(user:([[[v, d], 1407]]=[]))
+ Call: (26) unique1([[[v, d], 1407]], [], _10300)
+ Unify: (26) unique1([[[v, d], 1407]], [], _10300)
+ Call: (27) lists:delete([], [[v, d], 1407], _10300)
+ Unify: (27) lists:delete([], [[v, d], 1407], [])
+ Exit: (27) lists:delete([], [[v, d], 1407], [])
+ Call: (27) lists:append([], [[[v, d], 1407]], _10306)
+ Unify: (27) lists:append([], [[[v, d], 1407]], [[[v, d], 1407]])
+ Exit: (27) lists:append([], [[[v, d], 1407]], [[[v, d], 1407]])
+ Call: (27) unique1([], [[[v, d], 1407]], _10306)
+ Unify: (27) unique1([], [[[v, d], 1407]], [[[v, d], 1407]])
+ Exit: (27) unique1([], [[[v, d], 1407]], [[[v, d], 1407]])
+ Exit: (26) unique1([[[v, d], 1407]], [], [[[v, d], 1407]])
+ Call: (26) findresult3([458, 949, [v, d]], [[[v, d], 1407]], [], _10308)
+ Unify: (26) findresult3([458, 949, [v, d]], [[[v, d], 1407]], [], _10308)
+ Call: (27) [458, 949, [v, d]]=[_10288|_10290]
+ Exit: (27) [458, 949, [v, d]]=[458, 949, [v, d]]
+ Call: (27) expressionnotatom3(458)
+ Unify: (27) expressionnotatom3(458)
+ Call: (28) expressionnotatom(458)
+ Unify: (28) expressionnotatom(458)
+ Call: (29) isvalstrempty(458)
+ Unify: (29) isvalstrempty(458)
+ Call: (30) var(458)
+ Fail: (30) var(458)
+ Redo: (29) isvalstrempty(458)
+ Unify: (29) isvalstrempty(458)
+ Call: (30) isval(458)
+ Unify: (30) isval(458)
+ Exit: (30) isval(458)
+ Exit: (29) isvalstrempty(458)
+ Exit: (28) expressionnotatom(458)
+ Exit: (27) expressionnotatom3(458)
+ Call: (27) lists:append([], [458], _10318)
+ Unify: (27) lists:append([], [458], [458])
+ Exit: (27) lists:append([], [458], [458])
+ Call: (27) findresult3([949, [v, d]], [[[v, d], 1407]], [458], _10320)
+ Unify: (27) findresult3([949, [v, d]], [[[v, d], 1407]], [458], _10320)
+ Call: (28) [949, [v, d]]=[_10300|_10302]
+ Exit: (28) [949, [v, d]]=[949, [v, d]]
+ Call: (28) expressionnotatom3(949)
+ Unify: (28) expressionnotatom3(949)
+ Call: (29) expressionnotatom(949)
+ Unify: (29) expressionnotatom(949)
+ Call: (30) isvalstrempty(949)
+ Unify: (30) isvalstrempty(949)
+ Call: (31) var(949)
+ Fail: (31) var(949)
+ Redo: (30) isvalstrempty(949)
+ Unify: (30) isvalstrempty(949)
+ Call: (31) isval(949)
+ Unify: (31) isval(949)
+ Exit: (31) isval(949)
+ Exit: (30) isvalstrempty(949)
+ Exit: (29) expressionnotatom(949)
+ Exit: (28) expressionnotatom3(949)
+ Call: (28) lists:append([458], [949], _10330)
+ Unify: (28) lists:append([458], [949], [458|_10314])
+ Exit: (28) lists:append([458], [949], [458, 949])
+ Call: (28) findresult3([[v, d]], [[[v, d], 1407]], [458, 949], _10338)
+ Unify: (28) findresult3([[v, d]], [[[v, d], 1407]], [458, 949], _10338)
+ Call: (29) [[v, d]]=[_10318|_10320]
+ Exit: (29) [[v, d]]=[[v, d]]
+ Call: (29) expressionnotatom3([v, d])
+ Unify: (29) expressionnotatom3([v, d])
+ Call: (30) expressionnotatom([v, d])
+ Unify: (30) expressionnotatom([v, d])
+ Call: (31) isvalstrempty([v, d])
+ Unify: (31) isvalstrempty([v, d])
+ Call: (32) var([v, d])
+ Fail: (32) var([v, d])
+ Redo: (31) isvalstrempty([v, d])
+ Unify: (31) isvalstrempty([v, d])
+ Call: (32) isval([v, d])
+ Unify: (32) isval([v, d])
+ Fail: (32) isval([v, d])
+ Redo: (31) isvalstrempty([v, d])
+ Unify: (31) isvalstrempty([v, d])
+ Fail: (31) isvalstrempty([v, d])
+ Redo: (30) expressionnotatom([v, d])
+ Unify: (30) expressionnotatom([v, d])
+^ Call: (31) not(atom([v, d]))
+^ Unify: (31) not(user:atom([v, d]))
+^ Exit: (31) not(user:atom([v, d]))
+ Call: (31) length([v, d], _10350)
+ Unify: (31) length([v, d], _10350)
+ Exit: (31) length([v, d], 2)
+ Call: (31) 2>=1
+ Exit: (31) 2>=1
+ Call: (31) expressionnotatom2([v, d])
+ Unify: (31) expressionnotatom2([v, d])
+ Call: (32) isvalstrempty(v)
+ Unify: (32) isvalstrempty(v)
+ Call: (33) var(v)
+ Fail: (33) var(v)
+ Redo: (32) isvalstrempty(v)
+ Unify: (32) isvalstrempty(v)
+ Call: (33) isval(v)
+ Unify: (33) isval(v)
+ Fail: (33) isval(v)
+ Redo: (32) isvalstrempty(v)
+ Unify: (32) isvalstrempty(v)
+ Fail: (32) isvalstrempty(v)
+ Fail: (31) expressionnotatom2([v, d])
+ Redo: (30) expressionnotatom([v, d])
+ Unify: (30) expressionnotatom([v, d])
+ Call: (31) predicate_or_rule_name([v, d])
+ Fail: (31) predicate_or_rule_name([v, d])
+ Fail: (30) expressionnotatom([v, d])
+ Redo: (29) expressionnotatom3([v, d])
+ Unify: (29) expressionnotatom3([v, d])
+^ Call: (30) not([v, d]=[v, _10336])
+^ Unify: (30) not(user:([v, d]=[v, _10336]))
+^ Fail: (30) not(user:([v, d]=[v, _10336]))
+ Fail: (29) expressionnotatom3([v, d])
+ Redo: (28) findresult3([[v, d]], [[[v, d], 1407]], [458, 949], _10338)
+ Unify: (28) findresult3([[v, d]], [[[v, d], 1407]], [458, 949], _10338)
+ Call: (29) [[v, d]]=[_10318|_10320]
+ Exit: (29) [[v, d]]=[[v, d]]
+ Call: (29) isvar([v, d])
+ Unify: (29) isvar([v, d])
+ Exit: (29) isvar([v, d])
+ Call: (29) lists:member([[v, d], _10330], [[[v, d], 1407]])
+ Unify: (29) lists:member([[v, d], _10330], [[[v, d], 1407]])
+ Exit: (29) lists:member([[v, d], 1407], [[[v, d], 1407]])
+ Call: (29) lists:append([458, 949], [1407], _10360)
+ Unify: (29) lists:append([458, 949], [1407], [458|_10344])
+ Exit: (29) lists:append([458, 949], [1407], [458, 949, 1407])
+ Call: (29) findresult3([], [[[v, d], 1407]], [458, 949, 1407], _10374)
+ Unify: (29) findresult3([], [[[v, d], 1407]], [458, 949, 1407], [458, 949, 1407])
+ Exit: (29) findresult3([], [[[v, d], 1407]], [458, 949, 1407], [458, 949, 1407])
+ Exit: (28) findresult3([[v, d]], [[[v, d], 1407]], [458, 949], [458, 949, 1407])
+ Exit: (27) findresult3([949, [v, d]], [[[v, d], 1407]], [458], [458, 949, 1407])
+ Exit: (26) findresult3([458, 949, [v, d]], [[[v, d], 1407]], [], [458, 949, 1407])
+ Call: (26) debug(on)
+ Fail: (26) debug(on)
+ Redo: (25) member2([[n, function1], [458, 949, [v, d]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function1], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]]]]], [[[v, d], 1407]])
+ Call: (26) true
+ Exit: (26) true
+ Exit: (25) member2([[n, function1], [458, 949, [v, d]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function1], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]]]]], [[[v, d], 1407]])
+ Exit: (24) member23([[n, function1], [458, 949, [v, d]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[v, d], 1407]])
+ Exit: (23) member22([[n, function1], [458, 949, [v, d]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[v, d], 1407]])
+ Exit: (22) member21([[n, function1], [458, 949, [v, d]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[v, d], 1407]])
+ Exit: (21) member2([[n, function1], [458, 949, [v, d]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[v, d], 1407]])
+ Exit: (20) interpret2([[n, function1], [458, 949, [v, d]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[v, d], 1407]])
+ Call: (20) updatevars2([[v, d]], [[[v, d], 1407]], [], _10374)
+ Unify: (20) updatevars2([[v, d]], [[[v, d], 1407]], [], _10374)
+ Call: (21) [[[v, d], 1407]]=[[_10360, _10366]|_10356]
+ Exit: (21) [[[v, d], 1407]]=[[[v, d], 1407]]
+ Call: (21) lists:member([v, d], [[v, d]])
+ Unify: (21) lists:member([v, d], [[v, d]])
+ Exit: (21) lists:member([v, d], [[v, d]])
+ Call: (21) lists:append([], [[[v, d], 1407]], _10408)
+ Unify: (21) lists:append([], [[[v, d], 1407]], [[[v, d], 1407]])
+ Exit: (21) lists:append([], [[[v, d], 1407]], [[[v, d], 1407]])
+ Call: (21) updatevars2([[v, d]], [], [[[v, d], 1407]], _10410)
+ Unify: (21) updatevars2([[v, d]], [], [[[v, d], 1407]], [[[v, d], 1407]])
+ Exit: (21) updatevars2([[v, d]], [], [[[v, d], 1407]], [[[v, d], 1407]])
+ Exit: (20) updatevars2([[v, d]], [[[v, d], 1407]], [], [[[v, d], 1407]])
+ Call: (20) updatevars3([[[v, a], 458], [[v, b], 949], [[v, c], empty]], [[[v, d], 1407]], _10408)
+ Unify: (20) updatevars3([[[v, a], 458], [[v, b], 949], [[v, c], empty]], [[[v, d], 1407]], _10408)
+ Call: (21) [[[v, d], 1407]]=[[_10396, _10402]|_10392]
+ Exit: (21) [[[v, d], 1407]]=[[[v, d], 1407]]
+ Call: (21) lists:delete([[[v, a], 458], [[v, b], 949], [[v, c], empty]], [[v, d], empty], _10438)
+ Unify: (21) lists:delete([[[v, a], 458], [[v, b], 949], [[v, c], empty]], [[v, d], empty], _10438)
+ Exit: (21) lists:delete([[[v, a], 458], [[v, b], 949], [[v, c], empty]], [[v, d], empty], [[[v, a], 458], [[v, b], 949], [[v, c], empty]])
+ Call: (21) lists:append([[[v, a], 458], [[v, b], 949], [[v, c], empty]], [[[v, d], 1407]], _10474)
+ Unify: (21) lists:append([[[v, a], 458], [[v, b], 949], [[v, c], empty]], [[[v, d], 1407]], [[[v, a], 458]|_10458])
+ Exit: (21) lists:append([[[v, a], 458], [[v, b], 949], [[v, c], empty]], [[[v, d], 1407]], [[[v, a], 458], [[v, b], 949], [[v, c], empty], [[v, d], 1407]])
+ Call: (21) updatevars3([[[v, a], 458], [[v, b], 949], [[v, c], empty], [[v, d], 1407]], [], _10492)
+ Unify: (21) updatevars3([[[v, a], 458], [[v, b], 949], [[v, c], empty], [[v, d], 1407]], [], [[[v, a], 458], [[v, b], 949], [[v, c], empty], [[v, d], 1407]])
+ Exit: (21) updatevars3([[[v, a], 458], [[v, b], 949], [[v, c], empty], [[v, d], 1407]], [], [[[v, a], 458], [[v, b], 949], [[v, c], empty], [[v, d], 1407]])
+ Exit: (20) updatevars3([[[v, a], 458], [[v, b], 949], [[v, c], empty]], [[[v, d], 1407]], [[[v, a], 458], [[v, b], 949], [[v, c], empty], [[v, d], 1407]])
+ Call: (20) reverse([[[v, a], 458], [[v, b], 949], [[v, c], empty], [[v, d], 1407]], [], _10492)
+ Unify: (20) reverse([[[v, a], 458], [[v, b], 949], [[v, c], empty], [[v, d], 1407]], [], _10492)
+ Call: (21) [[[v, a], 458], [[v, b], 949], [[v, c], empty], [[v, d], 1407]]=[_10474|_10476]
+ Exit: (21) [[[v, a], 458], [[v, b], 949], [[v, c], empty], [[v, d], 1407]]=[[[v, a], 458], [[v, b], 949], [[v, c], empty], [[v, d], 1407]]
+ Call: (21) lists:append([[[v, a], 458]], [], _10504)
+ Unify: (21) lists:append([[[v, a], 458]], [], [[[v, a], 458]|_10488])
+ Exit: (21) lists:append([[[v, a], 458]], [], [[[v, a], 458]])
+ Call: (21) reverse([[[v, b], 949], [[v, c], empty], [[v, d], 1407]], [[[v, a], 458]], _10510)
+ Unify: (21) reverse([[[v, b], 949], [[v, c], empty], [[v, d], 1407]], [[[v, a], 458]], _10510)
+ Call: (22) [[[v, b], 949], [[v, c], empty], [[v, d], 1407]]=[_10492|_10494]
+ Exit: (22) [[[v, b], 949], [[v, c], empty], [[v, d], 1407]]=[[[v, b], 949], [[v, c], empty], [[v, d], 1407]]
+ Call: (22) lists:append([[[v, b], 949]], [[[v, a], 458]], _10522)
+ Unify: (22) lists:append([[[v, b], 949]], [[[v, a], 458]], [[[v, b], 949]|_10506])
+ Exit: (22) lists:append([[[v, b], 949]], [[[v, a], 458]], [[[v, b], 949], [[v, a], 458]])
+ Call: (22) reverse([[[v, c], empty], [[v, d], 1407]], [[[v, b], 949], [[v, a], 458]], _10528)
+ Unify: (22) reverse([[[v, c], empty], [[v, d], 1407]], [[[v, b], 949], [[v, a], 458]], _10528)
+ Call: (23) [[[v, c], empty], [[v, d], 1407]]=[_10510|_10512]
+ Exit: (23) [[[v, c], empty], [[v, d], 1407]]=[[[v, c], empty], [[v, d], 1407]]
+ Call: (23) lists:append([[[v, c], empty]], [[[v, b], 949], [[v, a], 458]], _10540)
+ Unify: (23) lists:append([[[v, c], empty]], [[[v, b], 949], [[v, a], 458]], [[[v, c], empty]|_10524])
+ Exit: (23) lists:append([[[v, c], empty]], [[[v, b], 949], [[v, a], 458]], [[[v, c], empty], [[v, b], 949], [[v, a], 458]])
+ Call: (23) reverse([[[v, d], 1407]], [[[v, c], empty], [[v, b], 949], [[v, a], 458]], _10546)
+ Unify: (23) reverse([[[v, d], 1407]], [[[v, c], empty], [[v, b], 949], [[v, a], 458]], _10546)
+ Call: (24) [[[v, d], 1407]]=[_10528|_10530]
+ Exit: (24) [[[v, d], 1407]]=[[[v, d], 1407]]
+ Call: (24) lists:append([[[v, d], 1407]], [[[v, c], empty], [[v, b], 949], [[v, a], 458]], _10558)
+ Unify: (24) lists:append([[[v, d], 1407]], [[[v, c], empty], [[v, b], 949], [[v, a], 458]], [[[v, d], 1407]|_10542])
+ Exit: (24) lists:append([[[v, d], 1407]], [[[v, c], empty], [[v, b], 949], [[v, a], 458]], [[[v, d], 1407], [[v, c], empty], [[v, b], 949], [[v, a], 458]])
+ Call: (24) reverse([], [[[v, d], 1407], [[v, c], empty], [[v, b], 949], [[v, a], 458]], _10564)
+ Unify: (24) reverse([], [[[v, d], 1407], [[v, c], empty], [[v, b], 949], [[v, a], 458]], [[[v, d], 1407], [[v, c], empty], [[v, b], 949], [[v, a], 458]])
+ Exit: (24) reverse([], [[[v, d], 1407], [[v, c], empty], [[v, b], 949], [[v, a], 458]], [[[v, d], 1407], [[v, c], empty], [[v, b], 949], [[v, a], 458]])
+ Exit: (23) reverse([[[v, d], 1407]], [[[v, c], empty], [[v, b], 949], [[v, a], 458]], [[[v, d], 1407], [[v, c], empty], [[v, b], 949], [[v, a], 458]])
+ Exit: (22) reverse([[[v, c], empty], [[v, d], 1407]], [[[v, b], 949], [[v, a], 458]], [[[v, d], 1407], [[v, c], empty], [[v, b], 949], [[v, a], 458]])
+ Exit: (21) reverse([[[v, b], 949], [[v, c], empty], [[v, d], 1407]], [[[v, a], 458]], [[[v, d], 1407], [[v, c], empty], [[v, b], 949], [[v, a], 458]])
+ Exit: (20) reverse([[[v, a], 458], [[v, b], 949], [[v, c], empty], [[v, d], 1407]], [], [[[v, d], 1407], [[v, c], empty], [[v, b], 949], [[v, a], 458]])
+^ Call: (20) not([[[v, d], 1407], [[v, c], empty], [[v, b], 949], [[v, a], 458]]=[])
+^ Unify: (20) not(user:([[[v, d], 1407], [[v, c], empty], [[v, b], 949], [[v|...], 458]]=[]))
+^ Exit: (20) not(user:([[[v, d], 1407], [[v, c], empty], [[v, b], 949], [[v|...], 458]]=[]))
+ Call: (20) unique1([[[v, d], 1407], [[v, c], empty], [[v, b], 949], [[v, a], 458]], [], _10576)
+ Unify: (20) unique1([[[v, d], 1407], [[v, c], empty], [[v, b], 949], [[v, a], 458]], [], _10576)
+ Call: (21) lists:delete([[[v, c], empty], [[v, b], 949], [[v, a], 458]], [[v, d], 1407], _10576)
+ Unify: (21) lists:delete([[[v, c], empty], [[v, b], 949], [[v, a], 458]], [[v, d], 1407], _10576)
+ Exit: (21) lists:delete([[[v, c], empty], [[v, b], 949], [[v, a], 458]], [[v, d], 1407], [[[v, c], empty], [[v, b], 949], [[v, a], 458]])
+ Call: (21) lists:append([], [[[v, d], 1407]], _10600)
+ Unify: (21) lists:append([], [[[v, d], 1407]], [[[v, d], 1407]])
+ Exit: (21) lists:append([], [[[v, d], 1407]], [[[v, d], 1407]])
+ Call: (21) unique1([[[v, c], empty], [[v, b], 949], [[v, a], 458]], [[[v, d], 1407]], _10600)
+ Unify: (21) unique1([[[v, c], empty], [[v, b], 949], [[v, a], 458]], [[[v, d], 1407]], _10600)
+ Call: (22) lists:delete([[[v, b], 949], [[v, a], 458]], [[v, c], empty], _10600)
+ Unify: (22) lists:delete([[[v, b], 949], [[v, a], 458]], [[v, c], empty], _10600)
+ Exit: (22) lists:delete([[[v, b], 949], [[v, a], 458]], [[v, c], empty], [[[v, b], 949], [[v, a], 458]])
+ Call: (22) lists:append([[[v, d], 1407]], [[[v, c], empty]], _10618)
+ Unify: (22) lists:append([[[v, d], 1407]], [[[v, c], empty]], [[[v, d], 1407]|_10602])
+ Exit: (22) lists:append([[[v, d], 1407]], [[[v, c], empty]], [[[v, d], 1407], [[v, c], empty]])
+ Call: (22) unique1([[[v, b], 949], [[v, a], 458]], [[[v, d], 1407], [[v, c], empty]], _10624)
+ Unify: (22) unique1([[[v, b], 949], [[v, a], 458]], [[[v, d], 1407], [[v, c], empty]], _10624)
+ Call: (23) lists:delete([[[v, a], 458]], [[v, b], 949], _10624)
+ Unify: (23) lists:delete([[[v, a], 458]], [[v, b], 949], _10624)
+ Exit: (23) lists:delete([[[v, a], 458]], [[v, b], 949], [[[v, a], 458]])
+ Call: (23) lists:append([[[v, d], 1407], [[v, c], empty]], [[[v, b], 949]], _10636)
+ Unify: (23) lists:append([[[v, d], 1407], [[v, c], empty]], [[[v, b], 949]], [[[v, d], 1407]|_10620])
+ Exit: (23) lists:append([[[v, d], 1407], [[v, c], empty]], [[[v, b], 949]], [[[v, d], 1407], [[v, c], empty], [[v, b], 949]])
+ Call: (23) unique1([[[v, a], 458]], [[[v, d], 1407], [[v, c], empty], [[v, b], 949]], _10648)
+ Unify: (23) unique1([[[v, a], 458]], [[[v, d], 1407], [[v, c], empty], [[v, b], 949]], _10648)
+ Call: (24) lists:delete([], [[v, a], 458], _10648)
+ Unify: (24) lists:delete([], [[v, a], 458], [])
+ Exit: (24) lists:delete([], [[v, a], 458], [])
+ Call: (24) lists:append([[[v, d], 1407], [[v, c], empty], [[v, b], 949]], [[[v, a], 458]], _10654)
+ Unify: (24) lists:append([[[v, d], 1407], [[v, c], empty], [[v, b], 949]], [[[v, a], 458]], [[[v, d], 1407]|_10638])
+ Exit: (24) lists:append([[[v, d], 1407], [[v, c], empty], [[v, b], 949]], [[[v, a], 458]], [[[v, d], 1407], [[v, c], empty], [[v, b], 949], [[v, a], 458]])
+ Call: (24) unique1([], [[[v, d], 1407], [[v, c], empty], [[v, b], 949], [[v, a], 458]], _10672)
+ Unify: (24) unique1([], [[[v, d], 1407], [[v, c], empty], [[v, b], 949], [[v, a], 458]], [[[v, d], 1407], [[v, c], empty], [[v, b], 949], [[v, a], 458]])
+ Exit: (24) unique1([], [[[v, d], 1407], [[v, c], empty], [[v, b], 949], [[v, a], 458]], [[[v, d], 1407], [[v, c], empty], [[v, b], 949], [[v, a], 458]])
+ Exit: (23) unique1([[[v, a], 458]], [[[v, d], 1407], [[v, c], empty], [[v, b], 949]], [[[v, d], 1407], [[v, c], empty], [[v, b], 949], [[v, a], 458]])
+ Exit: (22) unique1([[[v, b], 949], [[v, a], 458]], [[[v, d], 1407], [[v, c], empty]], [[[v, d], 1407], [[v, c], empty], [[v, b], 949], [[v, a], 458]])
+ Exit: (21) unique1([[[v, c], empty], [[v, b], 949], [[v, a], 458]], [[[v, d], 1407]], [[[v, d], 1407], [[v, c], empty], [[v, b], 949], [[v, a], 458]])
+ Exit: (20) unique1([[[v, d], 1407], [[v, c], empty], [[v, b], 949], [[v, a], 458]], [], [[[v, d], 1407], [[v, c], empty], [[v, b], 949], [[v, a], 458]])
+ Exit: (19) interpretstatement1([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[n, function1], [[v, a], [v, b], [v, d]]], [[[v, a], 458], [[v, b], 949], [[v, c], empty]], [[[v, d], 1407], [[v, c], empty], [[v, b], 949], [[v, a], 458]], true, nocut)
+^ Call: (19) not(nocut=cut)
+^ Unify: (19) not(user:(nocut=cut))
+^ Exit: (19) not(user:(nocut=cut))
+ Call: (19) _10680=[[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]]
+ Exit: (19) [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]]=[[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]]
+ Call: (19) interpretbody([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[v, d], 1407], [[v, c], empty], [[v, b], 949], [[v, a], 458]], _10686, [[[n, =], [[v, c], [v, d]]]], _10690)
+ Unify: (19) interpretbody([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[v, d], 1407], [[v, c], empty], [[v, b], 949], [[v, a], 458]], _10686, [[[n, =], [[v, c], [v, d]]]], _10690)
+ Call: (20) [[[n, =], [[v, c], [v, d]]]]=[[_10672|_10674]|_10668]
+ Exit: (20) [[[n, =], [[v, c], [v, d]]]]=[[[n, =], [[v, c], [v, d]]]]
+^ Call: (20) not(predicate_or_rule_name([n, =]))
+^ Unify: (20) not(user:predicate_or_rule_name([n, =]))
+ Call: (21) predicate_or_rule_name([n, =])
+ Unify: (21) predicate_or_rule_name([n, =])
+ Exit: (21) predicate_or_rule_name([n, =])
+^ Fail: (20) not(user:predicate_or_rule_name([n, =]))
+ Redo: (19) interpretbody([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[v, d], 1407], [[v, c], empty], [[v, b], 949], [[v, a], 458]], _10686, [[[n, =], [[v, c], [v, d]]]], _10690)
+ Unify: (19) interpretbody([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[v, d], 1407], [[v, c], empty], [[v, b], 949], [[v, a], 458]], _10686, [[[n, =], [[v, c], [v, d]]]], _10690)
+ Call: (20) [[[n, =], [[v, c], [v, d]]]]=[[[n, not], [_10696]]|_10668]
+ Fail: (20) [[[n, =], [[v, c], [v, d]]]]=[[[n, not], [_10696]]|_10668]
+ Redo: (19) interpretbody([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[v, d], 1407], [[v, c], empty], [[v, b], 949], [[v, a], 458]], _10686, [[[n, =], [[v, c], [v, d]]]], _10690)
+ Unify: (19) interpretbody([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[v, d], 1407], [[v, c], empty], [[v, b], 949], [[v, a], 458]], _10686, [[[n, =], [[v, c], [v, d]]]], _10690)
+ Call: (20) [[[n, =], [[v, c], [v, d]]]]=[[[n, or], [_10696, _10702]]|_10668]
+ Fail: (20) [[[n, =], [[v, c], [v, d]]]]=[[[n, or], [_10696, _10702]]|_10668]
+ Redo: (19) interpretbody([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[v, d], 1407], [[v, c], empty], [[v, b], 949], [[v, a], 458]], _10686, [[[n, =], [[v, c], [v, d]]]], _10690)
+ Unify: (19) interpretbody([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[v, d], 1407], [[v, c], empty], [[v, b], 949], [[v, a], 458]], _10686, [[[n, =], [[v, c], [v, d]]]], _10690)
+ Call: (20) [[[n, =], [[v, c], [v, d]]]]=[[[n, "->"], [_10702, _10708]]|_10668]
+ Fail: (20) [[[n, =], [[v, c], [v, d]]]]=[[[n, "->"], [_10702, _10708]]|_10668]
+ Redo: (19) interpretbody([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[v, d], 1407], [[v, c], empty], [[v, b], 949], [[v, a], 458]], _10686, [[[n, =], [[v, c], [v, d]]]], _10690)
+ Unify: (19) interpretbody([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[v, d], 1407], [[v, c], empty], [[v, b], 949], [[v, a], 458]], _10686, [[[n, =], [[v, c], [v, d]]]], _10690)
+ Call: (20) [[[n, =], [[v, c], [v, d]]]]=[[[n, "->"], [_10702, _10708, _10714]]|_10668]
+ Fail: (20) [[[n, =], [[v, c], [v, d]]]]=[[[n, "->"], [_10702, _10708, _10714]]|_10668]
+ Redo: (19) interpretbody([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[v, d], 1407], [[v, c], empty], [[v, b], 949], [[v, a], 458]], _10686, [[[n, =], [[v, c], [v, d]]]], _10690)
+ Unify: (19) interpretbody([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[v, d], 1407], [[v, c], empty], [[v, b], 949], [[v, a], 458]], _10686, [[[n, =], [[v, c], [v, d]]]], _10690)
+ Call: (20) [[[n, =], [[v, c], [v, d]]]]=[_10666|_10668]
+ Exit: (20) [[[n, =], [[v, c], [v, d]]]]=[[[n, =], [[v, c], [v, d]]]]
+^ Call: (20) not(predicate_or_rule_name([[n, =], [[v, c], [v, d]]]))
+^ Unify: (20) not(user:predicate_or_rule_name([[n, =], [[v, c], [v, d]]]))
+ Call: (21) predicate_or_rule_name([[n, =], [[v, c], [v, d]]])
+ Fail: (21) predicate_or_rule_name([[n, =], [[v, c], [v, d]]])
+^ Exit: (20) not(user:predicate_or_rule_name([[n, =], [[v, c], [v, d]]]))
+ Call: (20) interpretstatement1([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[n, =], [[v, c], [v, d]]], [[[v, d], 1407], [[v, c], empty], [[v, b], 949], [[v, a], 458]], _10704, _10706, _10708)
+ Unify: (20) interpretstatement1([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[n, =], [[v, c], [v, d]]], [[[v, d], 1407], [[v, c], empty], [[v, b], 949], [[v, a], 458]], _10704, true, nocut)
+ Call: (21) isop(=)
+ Unify: (21) isop(=)
+ Exit: (21) isop(=)
+ Call: (21) interpretpart(is, [v, c], [v, d], [[[v, d], 1407], [[v, c], empty], [[v, b], 949], [[v, a], 458]], _10704)
+ Unify: (21) interpretpart(is, [v, c], [v, d], 
[[[v, d], 1407], [[v, c], empty], [[v, b], 949], [[v, a], 458]], _10704) + Call: (22) getvalues([v, c], [v, d], _10700, _10702, [[[v, d], 1407], [[v, c], empty], [[v, b], 949], [[v, a], 458]]) + Unify: (22) getvalues([v, c], [v, d], _10700, _10702, [[[v, d], 1407], [[v, c], empty], [[v, b], 949], [[v, a], 458]]) + Call: (23) getvalue([v, c], _10698, [[[v, d], 1407], [[v, c], empty], [[v, b], 949], [[v, a], 458]]) + Unify: (23) getvalue([v, c], _10698, [[[v, d], 1407], [[v, c], empty], [[v, b], 949], [[v, a], 458]]) +^ Call: (24) not(isvar([v, c])) +^ Unify: (24) not(user:isvar([v, c])) + Call: (25) isvar([v, c]) + Unify: (25) isvar([v, c]) + Exit: (25) isvar([v, c]) +^ Fail: (24) not(user:isvar([v, c])) + Redo: (23) getvalue([v, c], _10698, [[[v, d], 1407], [[v, c], empty], [[v, b], 949], [[v, a], 458]]) + Call: (24) isvar([v, c]) + Unify: (24) isvar([v, c]) + Exit: (24) isvar([v, c]) + Call: (24) isvalstrorundef(_10696) + Unify: (24) isvalstrorundef(_10696) + Call: (25) var(_10696) + Exit: (25) var(_10696) + Exit: (24) isvalstrorundef(_10696) + Call: (24) getvar([v, c], _10698, [[[v, d], 1407], [[v, c], empty], [[v, b], 949], [[v, a], 458]]) + Unify: (24) getvar([v, c], _10698, [[[v, d], 1407], [[v, c], empty], [[v, b], 949], [[v, a], 458]]) + Call: (25) lists:member([[v, c], _10688], [[[v, d], 1407], [[v, c], empty], [[v, b], 949], [[v, a], 458]]) + Unify: (25) lists:member([[v, c], _10688], [[[v, d], 1407], [[v, c], empty], [[v, b], 949], [[v, a], 458]]) + Exit: (25) lists:member([[v, c], empty], [[[v, d], 1407], [[v, c], empty], [[v, b], 949], [[v, a], 458]]) +^ Call: (25) not(empty=empty) +^ Unify: (25) not(user:(empty=empty)) +^ Fail: (25) not(user:(empty=empty)) + Redo: (25) lists:member([[v, c], _10688], [[[v, d], 1407], [[v, c], empty], [[v, b], 949], [[v, a], 458]]) + Fail: (25) lists:member([[v, c], _10688], [[[v, d], 1407], [[v, c], empty], [[v, b], 949], [[v, a], 458]]) + Redo: (24) getvar([v, c], _10698, [[[v, d], 1407], [[v, c], empty], [[v, b], 
949], [[v, a], 458]]) + Unify: (24) getvar([v, c], empty, [[[v, d], 1407], [[v, c], empty], [[v, b], 949], [[v, a], 458]]) +^ Call: (25) not(member([[v, c], _10694], [[[v, d], 1407], [[v, c], empty], [[v, b], 949], [[v, a], 458]])) +^ Unify: (25) not(user:member([[v, c], _10694], [[[v, d], 1407], [[v, c], empty], [[v, b], 949], [[v|...], 458]])) +^ Fail: (25) not(user:member([[v, c], _10694], [[[v, d], 1407], [[v, c], empty], [[v, b], 949], [[v|...], 458]])) + Redo: (24) getvar([v, c], empty, [[[v, d], 1407], [[v, c], empty], [[v, b], 949], [[v, a], 458]]) + Call: (25) lists:member([[v, c], empty], [[[v, d], 1407], [[v, c], empty], [[v, b], 949], [[v, a], 458]]) + Unify: (25) lists:member([[v, c], empty], [[[v, d], 1407], [[v, c], empty], [[v, b], 949], [[v, a], 458]]) + Exit: (25) lists:member([[v, c], empty], [[[v, d], 1407], [[v, c], empty], [[v, b], 949], [[v, a], 458]]) + Exit: (24) getvar([v, c], empty, [[[v, d], 1407], [[v, c], empty], [[v, b], 949], [[v, a], 458]]) + Exit: (23) getvalue([v, c], empty, [[[v, d], 1407], [[v, c], empty], [[v, b], 949], [[v, a], 458]]) + Call: (23) getvalue([v, d], _10710, [[[v, d], 1407], [[v, c], empty], [[v, b], 949], [[v, a], 458]]) + Unify: (23) getvalue([v, d], _10710, [[[v, d], 1407], [[v, c], empty], [[v, b], 949], [[v, a], 458]]) +^ Call: (24) not(isvar([v, d])) +^ Unify: (24) not(user:isvar([v, d])) + Call: (25) isvar([v, d]) + Unify: (25) isvar([v, d]) + Exit: (25) isvar([v, d]) +^ Fail: (24) not(user:isvar([v, d])) + Redo: (23) getvalue([v, d], _10710, [[[v, d], 1407], [[v, c], empty], [[v, b], 949], [[v, a], 458]]) + Call: (24) isvar([v, d]) + Unify: (24) isvar([v, d]) + Exit: (24) isvar([v, d]) + Call: (24) isvalstrorundef(_10708) + Unify: (24) isvalstrorundef(_10708) + Call: (25) var(_10708) + Exit: (25) var(_10708) + Exit: (24) isvalstrorundef(_10708) + Call: (24) getvar([v, d], _10710, [[[v, d], 1407], [[v, c], empty], [[v, b], 949], [[v, a], 458]]) + Unify: (24) getvar([v, d], _10710, [[[v, d], 1407], [[v, c], 
empty], [[v, b], 949], [[v, a], 458]]) + Call: (25) lists:member([[v, d], _10700], [[[v, d], 1407], [[v, c], empty], [[v, b], 949], [[v, a], 458]]) + Unify: (25) lists:member([[v, d], _10700], [[[v, d], 1407], [[v, c], empty], [[v, b], 949], [[v, a], 458]]) + Exit: (25) lists:member([[v, d], 1407], [[[v, d], 1407], [[v, c], empty], [[v, b], 949], [[v, a], 458]]) +^ Call: (25) not(1407=empty) +^ Unify: (25) not(user:(1407=empty)) +^ Exit: (25) not(user:(1407=empty)) + Exit: (24) getvar([v, d], 1407, [[[v, d], 1407], [[v, c], empty], [[v, b], 949], [[v, a], 458]]) + Exit: (23) getvalue([v, d], 1407, [[[v, d], 1407], [[v, c], empty], [[v, b], 949], [[v, a], 458]]) + Exit: (22) getvalues([v, c], [v, d], empty, 1407, [[[v, d], 1407], [[v, c], empty], [[v, b], 949], [[v, a], 458]]) + Call: (22) expression(empty) + Unify: (22) expression(empty) + Exit: (22) expression(empty) + Call: (22) expression(1407) + Unify: (22) expression(1407) + Call: (23) isval(1407) + Unify: (23) isval(1407) + Exit: (23) isval(1407) + Exit: (22) expression(1407) + Call: (22) val1emptyorvalsequal(empty, 1407) + Unify: (22) val1emptyorvalsequal(empty, 1407) + Exit: (22) val1emptyorvalsequal(empty, 1407) + Call: (22) putvalue([v, c], 1407, [[[v, d], 1407], [[v, c], empty], [[v, b], 949], [[v, a], 458]], _10738) + Unify: (22) putvalue([v, c], 1407, [[[v, d], 1407], [[v, c], empty], [[v, b], 949], [[v, a], 458]], _10738) +^ Call: (23) not(isvar([v, c])) +^ Unify: (23) not(user:isvar([v, c])) + Call: (24) isvar([v, c]) + Unify: (24) isvar([v, c]) + Exit: (24) isvar([v, c]) +^ Fail: (23) not(user:isvar([v, c])) + Redo: (22) putvalue([v, c], 1407, [[[v, d], 1407], [[v, c], empty], [[v, b], 949], [[v, a], 458]], _10738) + Call: (23) isvar([v, c]) + Unify: (23) isvar([v, c]) + Exit: (23) isvar([v, c]) + Call: (23) isvalstrorundef(1407) + Unify: (23) isvalstrorundef(1407) + Call: (24) var(1407) + Fail: (24) var(1407) + Redo: (23) isvalstrorundef(1407) + Unify: (23) isvalstrorundef(1407) +^ Call: (24) 
not(var(1407)) +^ Unify: (24) not(user:var(1407)) +^ Exit: (24) not(user:var(1407)) + Call: (24) isval(1407) + Unify: (24) isval(1407) + Exit: (24) isval(1407) + Exit: (23) isvalstrorundef(1407) + Call: (23) updatevar([v, c], 1407, [[[v, d], 1407], [[v, c], empty], [[v, b], 949], [[v, a], 458]], _10748) + Unify: (23) updatevar([v, c], 1407, [[[v, d], 1407], [[v, c], empty], [[v, b], 949], [[v, a], 458]], _10748) + Call: (24) lists:member([[v, c], empty], [[[v, d], 1407], [[v, c], empty], [[v, b], 949], [[v, a], 458]]) + Unify: (24) lists:member([[v, c], empty], [[[v, d], 1407], [[v, c], empty], [[v, b], 949], [[v, a], 458]]) + Exit: (24) lists:member([[v, c], empty], [[[v, d], 1407], [[v, c], empty], [[v, b], 949], [[v, a], 458]]) + Call: (24) lists:delete([[[v, d], 1407], [[v, c], empty], [[v, b], 949], [[v, a], 458]], [[v, c], empty], _10770) + Unify: (24) lists:delete([[[v, d], 1407], [[v, c], empty], [[v, b], 949], [[v, a], 458]], [[v, c], empty], _10770) + Exit: (24) lists:delete([[[v, d], 1407], [[v, c], empty], [[v, b], 949], [[v, a], 458]], [[v, c], empty], [[[v, d], 1407], [[v, b], 949], [[v, a], 458]]) + Call: (24) lists:append([[[v, d], 1407], [[v, b], 949], [[v, a], 458]], [[[v, c], 1407]], _10806) + Unify: (24) lists:append([[[v, d], 1407], [[v, b], 949], [[v, a], 458]], [[[v, c], 1407]], [[[v, d], 1407]|_10790]) + Exit: (24) lists:append([[[v, d], 1407], [[v, b], 949], [[v, a], 458]], [[[v, c], 1407]], [[[v, d], 1407], [[v, b], 949], [[v, a], 458], [[v, c], 1407]]) + Call: (24) true + Exit: (24) true + Call: (24) true + Exit: (24) true + Call: (24) true + Exit: (24) true + Exit: (23) updatevar([v, c], 1407, [[[v, d], 1407], [[v, c], empty], [[v, b], 949], [[v, a], 458]], [[[v, d], 1407], [[v, b], 949], [[v, a], 458], [[v, c], 1407]]) + Exit: (22) putvalue([v, c], 1407, [[[v, d], 1407], [[v, c], empty], [[v, b], 949], [[v, a], 458]], [[[v, d], 1407], [[v, b], 949], [[v, a], 458], [[v, c], 1407]]) + Call: (22) debug(on) + Fail: (22) debug(on) + Redo: 
(21) interpretpart(is, [v, c], [v, d], [[[v, d], 1407], [[v, c], empty], [[v, b], 949], [[v, a], 458]], [[[v, d], 1407], [[v, b], 949], [[v, a], 458], [[v, c], 1407]]) + Call: (22) true + Exit: (22) true + Call: (22) debug(on) + Fail: (22) debug(on) + Redo: (21) interpretpart(is, [v, c], [v, d], [[[v, d], 1407], [[v, c], empty], [[v, b], 949], [[v, a], 458]], [[[v, d], 1407], [[v, b], 949], [[v, a], 458], [[v, c], 1407]]) + Call: (22) true + Exit: (22) true + Exit: (21) interpretpart(is, [v, c], [v, d], [[[v, d], 1407], [[v, c], empty], [[v, b], 949], [[v, a], 458]], [[[v, d], 1407], [[v, b], 949], [[v, a], 458], [[v, c], 1407]]) + Exit: (20) interpretstatement1([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[n, =], [[v, c], [v, d]]], [[[v, d], 1407], [[v, c], empty], [[v, b], 949], [[v, a], 458]], [[[v, d], 1407], [[v, b], 949], [[v, a], 458], [[v, c], 1407]], true, nocut) +^ Call: (20) not(nocut=cut) +^ Unify: (20) not(user:(nocut=cut)) +^ Exit: (20) not(user:(nocut=cut)) + Call: (20) _10832=[[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]] + Exit: (20) [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]]=[[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]] + Call: (20) interpretbody([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], 
[[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[v, d], 1407], [[v, b], 949], [[v, a], 458], [[v, c], 1407]], _10838, [], _10842) + Unify: (20) interpretbody([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[v, d], 1407], [[v, b], 949], [[v, a], 458], [[v, c], 1407]], [[[v, d], 1407], [[v, b], 949], [[v, a], 458], [[v, c], 1407]], [], true) + Exit: (20) interpretbody([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[v, d], 1407], [[v, b], 949], [[v, a], 458], [[v, c], 1407]], [[[v, d], 1407], [[v, b], 949], [[v, a], 458], [[v, c], 1407]], [], true) + Call: (20) logicalconjunction(_10832, true, true) + Unify: (20) logicalconjunction(true, true, true) + Call: (21) true(true) + Unify: (21) true(true) + Exit: (21) true(true) + Call: (21) true(true) + Unify: (21) true(true) + Exit: (21) true(true) + Exit: (20) logicalconjunction(true, true, true) + Exit: (19) interpretbody([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[v, d], 1407], [[v, c], empty], [[v, b], 949], [[v, a], 458]], [[[v, d], 1407], [[v, b], 949], [[v, a], 458], [[v, c], 1407]], [[[n, 
=], [[v, c], [v, d]]]], true) + Call: (19) logicalconjunction(true, true, true) + Unify: (19) logicalconjunction(true, true, true) + Call: (20) true(true) + Unify: (20) true(true) + Exit: (20) true(true) + Call: (20) true(true) + Unify: (20) true(true) + Exit: (20) true(true) + Exit: (19) logicalconjunction(true, true, true) + Exit: (18) interpretbody([[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[v, a], 458], [[v, b], 949], [[v, c], empty]], [[[v, d], 1407], [[v, b], 949], [[v, a], 458], [[v, c], 1407]], [[[n, function1], [[v, a], [v, b], [v, d]]], [[n, =], [[v, c], [v, d]]]], true) + Call: (18) true + Exit: (18) true + Call: (18) updatevars([[[v, c], [v, c]]], [[[v, d], 1407], [[v, b], 949], [[v, a], 458], [[v, c], 1407]], [], _10838) + Unify: (18) updatevars([[[v, c], [v, c]]], [[[v, d], 1407], [[v, b], 949], [[v, a], 458], [[v, c], 1407]], [], _10838) + Call: (19) [[[v, c], [v, c]]]=[[_10824, _10830]|_10820] + Exit: (19) [[[v, c], [v, c]]]=[[[v, c], [v, c]]] + Call: (19) expressionnotatom([v, c]) + Unify: (19) expressionnotatom([v, c]) + Call: (20) isvalstrempty([v, c]) + Unify: (20) isvalstrempty([v, c]) + Call: (21) var([v, c]) + Fail: (21) var([v, c]) + Redo: (20) isvalstrempty([v, c]) + Unify: (20) isvalstrempty([v, c]) + Call: (21) isval([v, c]) + Unify: (21) isval([v, c]) + Fail: (21) isval([v, c]) + Redo: (20) isvalstrempty([v, c]) + Unify: (20) isvalstrempty([v, c]) + Fail: (20) isvalstrempty([v, c]) + Redo: (19) expressionnotatom([v, c]) + Unify: (19) expressionnotatom([v, c]) +^ Call: (20) not(atom([v, c])) +^ Unify: (20) not(user:atom([v, c])) +^ Exit: (20) not(user:atom([v, c])) + Call: (20) length([v, c], _10862) + Unify: (20) length([v, c], _10862) + Exit: (20) 
length([v, c], 2) + Call: (20) 2>=1 + Exit: (20) 2>=1 + Call: (20) expressionnotatom2([v, c]) + Unify: (20) expressionnotatom2([v, c]) + Call: (21) isvalstrempty(v) + Unify: (21) isvalstrempty(v) + Call: (22) var(v) + Fail: (22) var(v) + Redo: (21) isvalstrempty(v) + Unify: (21) isvalstrempty(v) + Call: (22) isval(v) + Unify: (22) isval(v) + Fail: (22) isval(v) + Redo: (21) isvalstrempty(v) + Unify: (21) isvalstrempty(v) + Fail: (21) isvalstrempty(v) + Fail: (20) expressionnotatom2([v, c]) + Redo: (19) expressionnotatom([v, c]) + Unify: (19) expressionnotatom([v, c]) + Call: (20) predicate_or_rule_name([v, c]) + Fail: (20) predicate_or_rule_name([v, c]) + Fail: (19) expressionnotatom([v, c]) + Redo: (18) updatevars([[[v, c], [v, c]]], [[[v, d], 1407], [[v, b], 949], [[v, a], 458], [[v, c], 1407]], [], _10856) + Call: (19) lists:member([[v, c], _10842], [[[v, d], 1407], [[v, b], 949], [[v, a], 458], [[v, c], 1407]]) + Unify: (19) lists:member([[v, c], _10842], [[[v, d], 1407], [[v, b], 949], [[v, a], 458], [[v, c], 1407]]) + Exit: (19) lists:member([[v, c], 1407], [[[v, d], 1407], [[v, b], 949], [[v, a], 458], [[v, c], 1407]]) + Call: (19) lists:append([], [[[v, c], 1407]], _10884) + Unify: (19) lists:append([], [[[v, c], 1407]], [[[v, c], 1407]]) + Exit: (19) lists:append([], [[[v, c], 1407]], [[[v, c], 1407]]) + Call: (19) updatevars([], [[[v, d], 1407], [[v, b], 949], [[v, a], 458], [[v, c], 1407]], [[[v, c], 1407]], _10886) + Unify: (19) updatevars([], [[[v, d], 1407], [[v, b], 949], [[v, a], 458], [[v, c], 1407]], [[[v, c], 1407]], [[[v, c], 1407]]) + Exit: (19) updatevars([], [[[v, d], 1407], [[v, b], 949], [[v, a], 458], [[v, c], 1407]], [[[v, c], 1407]], [[[v, c], 1407]]) + Exit: (18) updatevars([[[v, c], [v, c]]], [[[v, d], 1407], [[v, b], 949], [[v, a], 458], [[v, c], 1407]], [], [[[v, c], 1407]]) +^ Call: (18) not([[[v, c], 1407]]=[]) +^ Unify: (18) not(user:([[[v, c], 1407]]=[])) +^ Exit: (18) not(user:([[[v, c], 1407]]=[])) + Call: (18) unique1([[[v, 
c], 1407]], [], _10896) + Unify: (18) unique1([[[v, c], 1407]], [], _10896) + Call: (19) lists:delete([], [[v, c], 1407], _10896) + Unify: (19) lists:delete([], [[v, c], 1407], []) + Exit: (19) lists:delete([], [[v, c], 1407], []) + Call: (19) lists:append([], [[[v, c], 1407]], _10902) + Unify: (19) lists:append([], [[[v, c], 1407]], [[[v, c], 1407]]) + Exit: (19) lists:append([], [[[v, c], 1407]], [[[v, c], 1407]]) + Call: (19) unique1([], [[[v, c], 1407]], _10902) + Unify: (19) unique1([], [[[v, c], 1407]], [[[v, c], 1407]]) + Exit: (19) unique1([], [[[v, c], 1407]], [[[v, c], 1407]]) + Exit: (18) unique1([[[v, c], 1407]], [], [[[v, c], 1407]]) + Call: (18) findresult3([458, 949, [v, c]], [[[v, c], 1407]], [], _10904) + Unify: (18) findresult3([458, 949, [v, c]], [[[v, c], 1407]], [], _10904) + Call: (19) [458, 949, [v, c]]=[_10884|_10886] + Exit: (19) [458, 949, [v, c]]=[458, 949, [v, c]] + Call: (19) expressionnotatom3(458) + Unify: (19) expressionnotatom3(458) + Call: (20) expressionnotatom(458) + Unify: (20) expressionnotatom(458) + Call: (21) isvalstrempty(458) + Unify: (21) isvalstrempty(458) + Call: (22) var(458) + Fail: (22) var(458) + Redo: (21) isvalstrempty(458) + Unify: (21) isvalstrempty(458) + Call: (22) isval(458) + Unify: (22) isval(458) + Exit: (22) isval(458) + Exit: (21) isvalstrempty(458) + Exit: (20) expressionnotatom(458) + Exit: (19) expressionnotatom3(458) + Call: (19) lists:append([], [458], _10914) + Unify: (19) lists:append([], [458], [458]) + Exit: (19) lists:append([], [458], [458]) + Call: (19) findresult3([949, [v, c]], [[[v, c], 1407]], [458], _10916) + Unify: (19) findresult3([949, [v, c]], [[[v, c], 1407]], [458], _10916) + Call: (20) [949, [v, c]]=[_10896|_10898] + Exit: (20) [949, [v, c]]=[949, [v, c]] + Call: (20) expressionnotatom3(949) + Unify: (20) expressionnotatom3(949) + Call: (21) expressionnotatom(949) + Unify: (21) expressionnotatom(949) + Call: (22) isvalstrempty(949) + Unify: (22) isvalstrempty(949) + Call: (23) 
var(949) + Fail: (23) var(949) + Redo: (22) isvalstrempty(949) + Unify: (22) isvalstrempty(949) + Call: (23) isval(949) + Unify: (23) isval(949) + Exit: (23) isval(949) + Exit: (22) isvalstrempty(949) + Exit: (21) expressionnotatom(949) + Exit: (20) expressionnotatom3(949) + Call: (20) lists:append([458], [949], _10926) + Unify: (20) lists:append([458], [949], [458|_10910]) + Exit: (20) lists:append([458], [949], [458, 949]) + Call: (20) findresult3([[v, c]], [[[v, c], 1407]], [458, 949], _10934) + Unify: (20) findresult3([[v, c]], [[[v, c], 1407]], [458, 949], _10934) + Call: (21) [[v, c]]=[_10914|_10916] + Exit: (21) [[v, c]]=[[v, c]] + Call: (21) expressionnotatom3([v, c]) + Unify: (21) expressionnotatom3([v, c]) + Call: (22) expressionnotatom([v, c]) + Unify: (22) expressionnotatom([v, c]) + Call: (23) isvalstrempty([v, c]) + Unify: (23) isvalstrempty([v, c]) + Call: (24) var([v, c]) + Fail: (24) var([v, c]) + Redo: (23) isvalstrempty([v, c]) + Unify: (23) isvalstrempty([v, c]) + Call: (24) isval([v, c]) + Unify: (24) isval([v, c]) + Fail: (24) isval([v, c]) + Redo: (23) isvalstrempty([v, c]) + Unify: (23) isvalstrempty([v, c]) + Fail: (23) isvalstrempty([v, c]) + Redo: (22) expressionnotatom([v, c]) + Unify: (22) expressionnotatom([v, c]) +^ Call: (23) not(atom([v, c])) +^ Unify: (23) not(user:atom([v, c])) +^ Exit: (23) not(user:atom([v, c])) + Call: (23) length([v, c], _10946) + Unify: (23) length([v, c], _10946) + Exit: (23) length([v, c], 2) + Call: (23) 2>=1 + Exit: (23) 2>=1 + Call: (23) expressionnotatom2([v, c]) + Unify: (23) expressionnotatom2([v, c]) + Call: (24) isvalstrempty(v) + Unify: (24) isvalstrempty(v) + Call: (25) var(v) + Fail: (25) var(v) + Redo: (24) isvalstrempty(v) + Unify: (24) isvalstrempty(v) + Call: (25) isval(v) + Unify: (25) isval(v) + Fail: (25) isval(v) + Redo: (24) isvalstrempty(v) + Unify: (24) isvalstrempty(v) + Fail: (24) isvalstrempty(v) + Fail: (23) expressionnotatom2([v, c]) + Redo: (22) expressionnotatom([v, c]) + Unify: 
(22) expressionnotatom([v, c]) + Call: (23) predicate_or_rule_name([v, c]) + Fail: (23) predicate_or_rule_name([v, c]) + Fail: (22) expressionnotatom([v, c]) + Redo: (21) expressionnotatom3([v, c]) + Unify: (21) expressionnotatom3([v, c]) +^ Call: (22) not([v, c]=[v, _10932]) +^ Unify: (22) not(user:([v, c]=[v, _10932])) +^ Fail: (22) not(user:([v, c]=[v, _10932])) + Fail: (21) expressionnotatom3([v, c]) + Redo: (20) findresult3([[v, c]], [[[v, c], 1407]], [458, 949], _10934) + Unify: (20) findresult3([[v, c]], [[[v, c], 1407]], [458, 949], _10934) + Call: (21) [[v, c]]=[_10914|_10916] + Exit: (21) [[v, c]]=[[v, c]] + Call: (21) isvar([v, c]) + Unify: (21) isvar([v, c]) + Exit: (21) isvar([v, c]) + Call: (21) lists:member([[v, c], _10926], [[[v, c], 1407]]) + Unify: (21) lists:member([[v, c], _10926], [[[v, c], 1407]]) + Exit: (21) lists:member([[v, c], 1407], [[[v, c], 1407]]) + Call: (21) lists:append([458, 949], [1407], _10956) + Unify: (21) lists:append([458, 949], [1407], [458|_10940]) + Exit: (21) lists:append([458, 949], [1407], [458, 949, 1407]) + Call: (21) findresult3([], [[[v, c], 1407]], [458, 949, 1407], _10970) + Unify: (21) findresult3([], [[[v, c], 1407]], [458, 949, 1407], [458, 949, 1407]) + Exit: (21) findresult3([], [[[v, c], 1407]], [458, 949, 1407], [458, 949, 1407]) + Exit: (20) findresult3([[v, c]], [[[v, c], 1407]], [458, 949], [458, 949, 1407]) + Exit: (19) findresult3([949, [v, c]], [[[v, c], 1407]], [458], [458, 949, 1407]) + Exit: (18) findresult3([458, 949, [v, c]], [[[v, c], 1407]], [], [458, 949, 1407]) + Call: (18) debug(on) + Fail: (18) debug(on) + Redo: (17) member1([[n, function0], [458, 949, [v, c]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[v, 
c], 1407]]) + Call: (18) true + Exit: (18) true + Exit: (17) member1([[n, function0], [458, 949, [v, c]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[v, c], 1407]]) + Exit: (16) interpret1(off, [[n, function0], [458, 949, [v, c]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[v, c], 1407]]) + Exit: (15) interpret(off, [[n, function0], [458, 949, [v, c]]], [[[n, function0], [[v, a], [v, b], [v, c]], ":-", [[[n|...], [...|...]], [[...|...]|...]]], [[n, function1], [[v, a], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[v, c], 1407]]) + + +[algorithm,30] +Action (h for help) ? aabort +% 4,476,954,214 inferences, 2588.562 CPU in 2662.746 seconds (97% CPU, 1729514 Lips) +% Execution Aborted +['cawplistprolog']. 
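
The trace above shows how the List Prolog interpreter executes `[[n, =], [[v, c], [v, d]]]`: `getvalues` reads both variables from the binding store, `[v, c]` comes back as `empty` (unbound) while `[v, d]` is `1407`, `val1emptyorvalsequal` accepts the assignment, and `updatevar` deletes the `[[v, c], empty]` entry and appends `[[v, c], 1407]`. A minimal Python sketch of that binding-store update (illustrative only, under the assumption that bindings are name/value pairs with `empty` marking an unbound variable; `get_var` and `update_var` are hypothetical names, not the project's predicates):

```python
# Sketch of the variable-store behaviour visible in the trace:
# getvar/3 reads a binding, and updatevar/4 replaces an `empty`
# (unbound) entry with a concrete value, as when [v, c] is bound
# to 1407 by the `=` statement.

EMPTY = "empty"  # the interpreter's marker for an unbound variable

def get_var(name, store):
    """Return the value bound to `name`, or EMPTY if unbound/absent."""
    for var, value in store:
        if var == name:
            return value
    return EMPTY

def update_var(name, value, store):
    """Bind `name` to `value`: drop its empty entry, append the binding."""
    rest = [(v, x) for v, x in store if not (v == name and x == EMPTY)]
    return rest + [(name, value)]

# Mirrors the store in the trace: d=1407, c unbound, b=949, a=458.
store = [("d", 1407), ("c", EMPTY), ("b", 949), ("a", 458)]
assert get_var("c", store) == EMPTY          # c starts unbound
store = update_var("c", get_var("d", store), store)
assert get_var("c", store) == 1407           # c now holds d's value
```

This matches the `lists:delete` followed by `lists:append` seen inside `updatevar` in the trace, which is why the bound variable moves to the end of the store.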
+ + +Warning: /Users/luciangreen/Dropbox/GitHub/Combination-Algorithm-Writer-with-Predicates and lpi sandbox/grammar.pl:57: + Singleton-marked variable appears more than once: _EndGrammar1 + Singleton-marked variable appears more than once: _Grammar2 + Singleton-marked variable appears more than once: _EndGrammar2 +Warning: /Users/luciangreen/Dropbox/GitHub/Combination-Algorithm-Writer-with-Predicates and lpi sandbox/grammar.pl:58: + Singleton-marked variable appears more than once: _EndGrammar2 + Singleton-marked variable appears more than once: _EndGrammara2 +Warning: /Users/luciangreen/Dropbox/GitHub/Combination-Algorithm-Writer-with-Predicates and lpi sandbox/grammar.pl:95: + Singleton-marked variable appears more than once: _EndGrammar1 + Singleton-marked variable appears more than once: _EndGrammar2 + Singleton-marked variable appears more than once: _EndGrammara1 + Singleton-marked variable appears more than once: _EndGrammara2 +Warning: /Users/luciangreen/Dropbox/GitHub/Combination-Algorithm-Writer-with-Predicates and lpi sandbox/listprologinterpreter1listrecursion4.pl:24: + Singleton variables: [Query] +Warning: /Users/luciangreen/Dropbox/GitHub/Combination-Algorithm-Writer-with-Predicates and lpi sandbox/listprologinterpreter1listrecursion4.pl:120: + Singleton variables: [Query] +Warning: /Users/luciangreen/Dropbox/GitHub/Combination-Algorithm-Writer-with-Predicates and lpi sandbox/listprologinterpreter1listrecursion4.pl:267: + Singleton variables: [Result1,Result3] +Warning: /Users/luciangreen/Dropbox/GitHub/Combination-Algorithm-Writer-with-Predicates and lpi sandbox/listprologinterpreter1listrecursion4.pl:286: + Singleton variables: [Result1,Vars3] +Warning: /Users/luciangreen/Dropbox/GitHub/Combination-Algorithm-Writer-with-Predicates and lpi sandbox/listprologinterpreter1listrecursion4.pl:302: + Singleton variables: [Result1,Result3] +Warning: /Users/luciangreen/Dropbox/GitHub/Combination-Algorithm-Writer-with-Predicates and lpi 
sandbox/listprologinterpreter1listrecursion4.pl:302: + Singleton variable in branch: Result2 +Warning: /Users/luciangreen/Dropbox/GitHub/Combination-Algorithm-Writer-with-Predicates and lpi sandbox/listprologinterpreter1listrecursion4.pl:318: + Singleton variables: [Result1,Result3] +Warning: /Users/luciangreen/Dropbox/GitHub/Combination-Algorithm-Writer-with-Predicates and lpi sandbox/listprologinterpreter1listrecursion4.pl:329: + Singleton variables: [Result1,Result3] +Warning: /Users/luciangreen/Dropbox/GitHub/Combination-Algorithm-Writer-with-Predicates and lpi sandbox/listprologinterpreter1listrecursion4.pl:329: + Singleton variable in branch: Result2 +Warning: /Users/luciangreen/Dropbox/GitHub/Combination-Algorithm-Writer-with-Predicates and lpi sandbox/listprologinterpreter3preds5.pl:85: + Clauses of interpretpart/5 are not together in the source-file + Earlier definition at /Users/luciangreen/Dropbox/GitHub/Combination-Algorithm-Writer-with-Predicates and lpi sandbox/listprologinterpreter3preds5.pl:1 + Current predicate: interpretpart/6 + Use :- discontiguous interpretpart/5. to suppress this message +Warning: /Users/luciangreen/Dropbox/GitHub/Combination-Algorithm-Writer-with-Predicates and lpi sandbox/listprologinterpreter3preds5.pl:97: + Clauses of interpretpart/6 are not together in the source-file + Earlier definition at /Users/luciangreen/Dropbox/GitHub/Combination-Algorithm-Writer-with-Predicates and lpi sandbox/listprologinterpreter3preds5.pl:72 + Current predicate: interpretpart/5 + Use :- discontiguous interpretpart/6. 
to suppress this message +ERROR: /Users/luciangreen/Dropbox/GitHub/Combination-Algorithm-Writer-with-Predicates and lpi sandbox/rcawp.pl:30:10: Syntax error: Illegal start of term +Warning: /Users/luciangreen/Dropbox/GitHub/Combination-Algorithm-Writer-with-Predicates and lpi sandbox/la_strings.pl:3: + Clauses of use_module/1 are not together in the source-file + Earlier definition at /Users/luciangreen/Dropbox/GitHub/Combination-Algorithm-Writer-with-Predicates and lpi sandbox/mergetexttobrdict.pl:18 + Current predicate: mergetexttobrdict/0 + Use :- discontiguous use_module/1. to suppress this message +Warning: /Users/luciangreen/Dropbox/GitHub/Combination-Algorithm-Writer-with-Predicates and lpi sandbox/texttobr2.pl:307: + Singleton-marked variable appears more than once: _End2 +true. + +rcawp1(30). + + diff --git a/Daily-Regimen-main/filekeeplacomhits.txt b/Daily-Regimen-main/filekeeplacomhits.txt new file mode 100644 index 0000000000000000000000000000000000000000..3633d9e869029f95593c845166773ecc69324d6e --- /dev/null +++ b/Daily-Regimen-main/filekeeplacomhits.txt @@ -0,0 +1,1065 @@ +% member(List,Item) (below) only works with deprecated lpi + +%% RCAW - Random Combination Algorithm Writer with Predicates + +%%Input: Algorithms, Specification of this algorithm +%%Output: Possible algorithms + +:- dynamic debug/1. + +/** +**-----VPS B and B to B 2 +% *** Max of 6 commands at a time - possibly restart apache2 before starting + +** at each stop point, close window, or possibly halt + +ssh root@185.176.42.28 -p 22 +%% f6sEhJ1AN5qm + +cd ../var/www/lucianacademy.ml/ +swipl -G100g -T20g -L2g + +[meditationnoreplace]. +meditation. + +halt. +swipl -G100g -T20g -L2g +[meditationnoreplace]. + +[medicinenoreplace]. +medicine. + +%%L +leash(-all),visible(+all). +protocol("./file.txt"). +trace. +caw00(off,[n,f],[[[n,append],2,1],[[n,delete],2,1],[[n,head],1,1],[[n,tail],1,1],[[n,member],1,1]],2,8,_InputVarList,_OutputVarList,[],_Program2,Ps). +notrace,noprotocol. 
+ +%% ** Stop - close window x halt (last time I stopped here) +halt. +swipl -G100g -T20g -L2g +[meditationnoreplace]. + +time((N is 1, M is 3*250,texttobr2(N,u,u,M))). %% see below +time((texttobr2(u,"file2qb.txt",u,u))). + +%% ** Do following separately, don't restart server (crashes even with following) +%%halt. +%%swipl -G100g -T20g -L2g +%%[meditationnoreplace]. + +%%A +leash(-all),visible(+all). +protocol("./file.txt"). +trace. +caw00(off,[n,f],[[[n,append],2,1],[[n,delete],2,1],[[n,head],1,1],[[n,tail],1,1],[[n,member],1,1]],2,8,_InputVarList,_OutputVarList,[],_Program2,Ps). +notrace,noprotocol. + +%% ** Stop - close window x halt +halt. +swipl -G100g -T20g -L2g +[meditationnoreplace]. + +time((N is 1, M is 3*250,texttobr2(N,u,u,M))). %% see below +time((texttobr2(u,"file2qb.txt",u,u))). + + +*** + +See instructions for running texttobr before using texttobr. +- + +DON'T USE - for mindreading x use + +%% For MacBook Air - use VPS instead + +cd /Users/luciangreen/Dropbox/Program\ Finder/possibly\ not\ working/rcaw/ + +swipl -G100g -T20g -L2g + +[meditation]. +meditation. %% That so called for others is for them for Lucian + +%% Lucian + +leash(-all),visible(+all). +protocol("./file.txt"). +trace. +%% caw00(off,f,[[[n,append],2,1],[n,delete],2,1],[n,head],1,1],[[n,tail],1,1],[n,member],1,1]],2,8,_InputVarList,_OutputVarList,[],_Program2,Ps). +notrace,noprotocol. +N is 15, M is 250*6, texttobr2(N,u,u,M),texttobr(N,u,u,M). %% see below +texttobr2(u,"file2qb.txt",u,u),texttobr(u,"file2qb.txt",u,u). + +%% Adrian for Lucian + +protocol("./file.txt"). +trace. +caw00(off,f,[[append,2,1],[delete,2,1],[head,1,1],[tail,1,1],[member,1,1]],2,8,_InputVarList,_OutputVarList,[],_Program2,Ps). +notrace,noprotocol. +N is 15, M is 250*6, texttobr2(N,u,u,M),texttobr(N,u,u,M). %% see below +texttobr2(u,"file2qb.txt",u,u),texttobr(u,"file2qb.txt",u,u). + +%%[medicine]. +%%medicine.
+ +**--------one off br + +cd /Users/luciangreen/Dropbox/Program\ Finder/possibly\ not\ working/rcaw/ +swipl -G100g -T20g -L2g +[meditation]. +texttobr2(u,u,u,u),texttobr(u,u,u,u). + +%%*** +time((texttobr2(u,u,u,u),texttobr(u,u,u,u))),mergetexttobrdict. + +%% texttobr2(u,u,u,u),texttobr(u,u,u,u). %% 80 minutes for 15*90 page document on MacBook Air or 8 minutes on VPS. + +texttobr2(u,u,u,2000),texttobr(u,u,u,14400). + +**-----VPS x use B and B to B below +% *** Max of 6 commands at a time - possibly restart apache2 before starting + +** at each stop point, close window, or possibly halt + +ssh root@185.176.42.28 -p 22 +%% f6sEhJ1AN5qm + +cd ../var/www/lucianacademy.ml/ +swipl -G100g -T20g -L2g + +[meditation]. +meditation. + +halt. +swipl -G100g -T20g -L2g +[meditation]. + +[medicine]. +medicine. + +%%L +leash(-all),visible(+all). +protocol("./file.txt"). +trace. +caw00(off,[n,f],[[[n,append],2,1],[[n,delete],2,1],[[n,head],1,1],[[n,tail],1,1],[[n,member],1,1]],2,8,_InputVarList,_OutputVarList,[],_Program2,Ps). +notrace,noprotocol. + +%% ** Stop - close window x halt (last time I stopped here) +halt. +swipl -G100g -T20g -L2g +[meditation]. + +time((N is 15, M is 10146, M1 is M, M2 is M*5,texttobr2(N,u,u,M1),texttobr(N,u,u,M2))). %% see below +time((texttobr2(u,"file2qb.txt",u,u),texttobr(u,"file2qb.txt",u,u))). + +%% ** Do following separately, don't restart server (crashes even with following) +%%halt. +%%swipl -G100g -T20g -L2g +%%[meditation]. + +%%A +leash(-all),visible(+all). +protocol("./file.txt"). +trace. +caw00(off,[n,f],[[[n,append],2,1],[[n,delete],2,1],[[n,head],1,1],[[n,tail],1,1],[[n,member],1,1]],2,8,_InputVarList,_OutputVarList,[],_Program2,Ps). +notrace,noprotocol. + +%% ** Stop - close window x halt +halt. +swipl -G100g -T20g -L2g +[meditation]. + +time((N is 15, M is 10146, M1 is M, M2 is M*5, texttobr2(N,u,u,M1),texttobr(N,u,u,M2))). %% see below +time((texttobr2(u,"file2qb.txt",u,u),texttobr(u,"file2qb.txt",u,u))). 
+ +**-----VPS B and B to B +% *** Max of 6 commands at a time - possibly restart apache2 before starting + +** at each stop point, close window, or possibly halt + +ssh root@185.176.42.28 -p 22 +%% f6sEhJ1AN5qm + +cd ../var/www/lucianacademy.ml/ +swipl -G100g -T20g -L2g + +[meditationnoreplace]. +meditation. + +halt. +swipl -G100g -T20g -L2g +[meditationnoreplace]. + +[medicinenoreplace]. +medicine. + +%%L +leash(-all),visible(+all). +protocol("./file.txt"). +trace. +caw00(off,[n,f],[[[n,append],2,1],[[n,delete],2,1],[[n,head],1,1],[[n,tail],1,1],[[n,member],1,1]],2,8,_InputVarList,_OutputVarList,[],_Program2,Ps). +notrace,noprotocol. + +%% ** Stop - close window x halt (last time I stopped here) +halt. +swipl -G100g -T20g -L2g +[meditationnoreplace]. + +time((N is 250 *14/20, M is 20*3*250,texttobr2(N,u,u,M))). %% see below +time((texttobr2(u,"file2qb.txt",u,u))). +time((texttobr2(u,"file2qb.txt",u,u))). for out of reach +time((texttobr2(u,"file2qb.txt",u,u))). for out of mind + +%% ** Do following separately, don't restart server (crashes even with following) +%%halt. +%%swipl -G100g -T20g -L2g +%%[meditationnoreplace]. + +%%A +leash(-all),visible(+all). +protocol("./file.txt"). +trace. +caw00(off,[n,f],[[[n,append],2,1],[[n,delete],2,1],[[n,head],1,1],[[n,tail],1,1],[[n,member],1,1]],2,8,_InputVarList,_OutputVarList,[],_Program2,Ps). +notrace,noprotocol. + +%% ** Stop - close window x halt +halt. +swipl -G100g -T20g -L2g +[meditationnoreplace]. + +time((N is 250 *14/20, M is 20*3*250,texttobr2(N,u,u,M))). %% see below +time((texttobr2(u,"file2qb.txt",u,u))). +time((texttobr2(u,"file2qb.txt",u,u))). for out of reach +time((texttobr2(u,"file2qb.txt",u,u))). for out of mind + + +**NOT + +%% Lucian + +leash(-all),visible(+all). +protocol("./file.txt"). +trace. +caw00(off,f,[[append,2,1],[delete,2,1],[head,1,1],[tail,1,1],[member,1,1]],2,8,_InputVarList,_OutputVarList,[],_Program2,Ps). +notrace,noprotocol. + + + +%% STOP ** +%%sleep(25). %% try + +halt. 
+%% systemctl restart apache2 +%% cd ../var/www/lucianacademy.ml/ +swipl -G100g -T20g -L2g + +[meditation]. + + +time((N is 1, M = all, texttobr2(N,u,u,M),texttobr(N,u,u,M))). %% see below +time((N is 1, M = all, texttobr2(N,u,u,M),texttobr(N,u,u,M))). %% see below +time((N is 1, M = all, texttobr2(N,u,u,M),texttobr(N,u,u,M))). %% see below +time((N is 1, M = all, texttobr2(N,u,u,M),texttobr(N,u,u,M))). %% see below +time((N is 1, M = all, texttobr2(N,u,u,M),texttobr(N,u,u,M))). %% see below +time((N is 1, M = all, texttobr2(N,u,u,M),texttobr(N,u,u,M))). %% see below +time((N is 1, M = all, texttobr2(N,u,u,M),texttobr(N,u,u,M))). %% see below +time((N is 1, M = all, texttobr2(N,u,u,M),texttobr(N,u,u,M))). %% see below + +% here - server or break in connection? +time((N is 1, M = all, texttobr2(N,u,u,M),texttobr(N,u,u,M))). %% see below +time((N is 1, M = all, texttobr2(N,u,u,M),texttobr(N,u,u,M))). %% see below +time((N is 1, M = all, texttobr2(N,u,u,M),texttobr(N,u,u,M))). %% see below +time((N is 1, M = all, texttobr2(N,u,u,M),texttobr(N,u,u,M))). %% see below +time((N is 1, M = all, texttobr2(N,u,u,M),texttobr(N,u,u,M))). %% see below +time((N is 1, M = all, texttobr2(N,u,u,M),texttobr(N,u,u,M))). %% see below +%here +time((N is 1, M = all, texttobr2(N,u,u,M),texttobr(N,u,u,M))). %% see below +time((texttobr2(u,"file2qb.txt",u,u),texttobr(u,"file2qb.txt",u,u))). + + + + +%% Adrian for Lucian + +leash(-all),visible(+all). +protocol("./file.txt"). +trace. +caw00(off,f,[[append,2,1],[delete,2,1],[head,1,1],[tail,1,1],[member,1,1]],2,8,_InputVarList,_OutputVarList,[],_Program2,Ps). +notrace,noprotocol. + + + +%% STOP ** +%%sleep(25). %% try + + +%% might not even need this +halt. +%% systemctl restart apache2 +%% cd ../var/www/lucianacademy.ml/ +swipl -G100g -T20g -L2g +[meditation]. + + + +time((N is 1, M = all, texttobr2(N,u,u,M),texttobr(N,u,u,M))). %% see below +time((N is 1, M = all, texttobr2(N,u,u,M),texttobr(N,u,u,M))). 
%% see below +time((N is 1, M = all, texttobr2(N,u,u,M),texttobr(N,u,u,M))). %% see below +time((N is 1, M = all, texttobr2(N,u,u,M),texttobr(N,u,u,M))). %% see below +time((N is 1, M = all, texttobr2(N,u,u,M),texttobr(N,u,u,M))). %% see below +time((N is 1, M = all, texttobr2(N,u,u,M),texttobr(N,u,u,M))). %% see below + +%% possibly need to halt, not restart server here v + +halt. +%% systemctl restart apache2 +%% cd ../var/www/lucianacademy.ml/ +swipl -G100g -T20g -L2g +[meditation]. + + +time((N is 1, M = all, texttobr2(N,u,u,M),texttobr(N,u,u,M))). %% see below +time((N is 1, M = all, texttobr2(N,u,u,M),texttobr(N,u,u,M))). %% see below +time((N is 1, M = all, texttobr2(N,u,u,M),texttobr(N,u,u,M))). %% see below +time((N is 1, M = all, texttobr2(N,u,u,M),texttobr(N,u,u,M))). %% see below +time((N is 1, M = all, texttobr2(N,u,u,M),texttobr(N,u,u,M))). %% see below +time((N is 1, M = all, texttobr2(N,u,u,M),texttobr(N,u,u,M))). %% see below + +%% if it finishes it will be here v + +halt. +systemctl restart apache2 +cd ../var/www/lucianacademy.ml/ +swipl -G100g -T20g -L2g +[meditation]. + +time((N is 1, M = all, texttobr2(N,u,u,M),texttobr(N,u,u,M))). %% see below +time((N is 1, M = all, texttobr2(N,u,u,M),texttobr(N,u,u,M))). %% see below +time((N is 1, M = all, texttobr2(N,u,u,M),texttobr(N,u,u,M))). %% see below +time((texttobr2(u,"file2qb.txt",u,u),texttobr(u,"file2qb.txt",u,u))). 
+**END NOT + +**---- + + +texttobr2,texttobr2,texttobr2,texttobr2,texttobr2,texttobr2,texttobr2,texttobr2, +texttobr2,texttobr2,texttobr2,texttobr2,texttobr2,texttobr2, %% 15 As +texttobr2,texttobr2,texttobr2,texttobr2,texttobr2,texttobr2,texttobr2,texttobr2, +texttobr2,texttobr2, +texttobr2,texttobr2,texttobr2,texttobr2,texttobr2,texttobr2,texttobr2,texttobr2, +texttobr2,texttobr2, +texttobr2,texttobr2,texttobr2,texttobr2,texttobr2,texttobr2,texttobr2,texttobr2, +texttobr2,texttobr2, +texttobr2, +texttobr2,texttobr2,texttobr2,texttobr2,texttobr2,texttobr2.** + +N is ((2*((46 +%%+2 +)*250)/80)+15), texttobr2(N),texttobr(N). %% see below +%% do this once at start of every celebrity grace 6 month period (and new meditators started when taught), otherwise above: +** ~ October 2018, next April 2019 + + %% (2*(46+2)*250)/80=37 for 2*250 for + + 1. arem + 2. arem links in following: [lucian + 3. green + 4. friendliness + 5. medicine + 6. childrenh1earningjobsprotectioninjobs + 7. headsofstate + 8. lucianmantrapureform + 9. lucianmantrasunsafety + 10. maharishisutra + 11. meditationteachersutra + 12. movingappearancespurusha + 13. upasanasutra + 14. yellowgod + 15. greensutra + 16. bluenature + 17. appearances + 18. pranayama + 19. soma + 20. hoursprayer + 21. fiftybreasoningspersecond + 22. meditationindicatorlowerriskofcancerandotherdiseasesinworkersandbroadcasters + 23. meditationindicatordecreasedstress + 24. meditationindicatorincreasedbloodflow + 25. meditationincreasedbrainpotential + 26. autobiography + 27. computationalenglish + 28. computerscience + 29. economics + 30. hermeneutics + 31. pedagogy + 32. breasonings + 33. quantumbox + 34. lucianicmeditation + 35. lm + 36. meditation + 37. metaphysics + 38. music + 39. plays + 40. popology + 41. theology + 42. qigongmantra + 43. qigongsutra + 44. yogamantra + 45. yogasutra] + 46. 
no cancer from cell function on diagram genetically functioning + ++ +2 +meditators per day *** add rest of utterance links on 25 9 18 v + +**texttobr,texttobr,texttobr,texttobr,texttobr,texttobr,texttobr,texttobr, +texttobr,texttobr,texttobr,texttobr,texttobr,texttobr, %% 15 +texttobr,texttobr,texttobr,texttobr,texttobr,texttobr,texttobr,texttobr, +texttobr,texttobr, +texttobr,texttobr,texttobr,texttobr,texttobr,texttobr,texttobr,texttobr, +texttobr,texttobr, +texttobr,texttobr,texttobr,texttobr,texttobr,texttobr,texttobr,texttobr, +texttobr,texttobr, +texttobr, +texttobr,texttobr,texttobr,texttobr,texttobr,texttobr.** + + +- For Mindreading + +[texttobr23]. +[texttobr]. + +leash(-all),visible(+all). + +protocol("./file.txt"), + trace, + caw00(off,f,[[append,2,1],[delete,2,1],[head,1,1],[tail,1,1],[member,1,1]],2,8,_InputVarList,_OutputVarList,[],_Program2,Ps). + +notrace,noprotocol,texttobr23. + +texttobr23,texttobr23,texttobr23,texttobr23,texttobr23,texttobr23,texttobr23,texttobr23. + +texttobr,texttobr,texttobr,texttobr,texttobr,texttobr,texttobr,texttobr. + +- + +- For Daily work + +[listprologinterpreter1listrecursion4]. + +[texttobr24nodisplay]. +[texttobr]. + +leash(-all),visible(+all). + +protocol("./file24.txt"), + trace, + caw00(off,f,[[append,2,1],[delete,2,1],[head,1,1],[tail,1,1],[member,1,1]],2,8,_InputVarList,_OutputVarList,[],_Program2,Ps). + +notrace,noprotocol,texttobr24. + +texttobr24,texttobr24,texttobr24,texttobr24,texttobr24,texttobr24,texttobr24,texttobr24. + +texttobr,texttobr,texttobr,texttobr,texttobr,texttobr,texttobr,texttobr. + +== + + +caw00(off,f,[[append,2,1],[delete,2,1],[head,1,1],[tail,1,1],[member,1,1]],3,8,_InputVarList,_OutputVarList,[],_Program2,Ps). + +Random Combination Algorithm Writer + +- Writes random algorithms + +predicates may have different combinations of modes + +caw00(off,f,[[append,2,1],[delete,2,1],[head,1,1],[tail,1,1],[member,2,0]],50,6,[[a,1]],[[d,1]],[],Program). 
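+
+The sample queries above feed caw00 a list of command specs of the form [Name,NumInputs,NumOutputs] (e.g. [append,2,1], or [member,2,0] for a test with no output). A minimal standalone sketch of the selection step those specs drive, using hypothetical helper names sketch_rule/4, random_input/2 and fresh_var/1 (in the writer itself this work is done by rule/9 together with rule2..rule6), might look like:

```prolog
:- use_module(library(random)).
:- use_module(library(gensym)).

%% sketch_rule(+Specs, +InputVars0, -Rule, -InputVars):
%% choose a random [Name,NumInputs,NumOutputs] spec, wire its inputs
%% to existing variable names, and invent fresh output names.
sketch_rule(Specs, InputVars0, [Name,Args], InputVars) :-
    random_member([Name,NIn,NOut], Specs),
    length(Ins, NIn),
    maplist(random_input(InputVars0), Ins),
    length(Outs, NOut),
    maplist(fresh_var, Outs),
    append(Ins, Outs, Args),
    append(InputVars0, Outs, InputVars).

random_input(Vars, V) :- random_member(V, Vars).

fresh_var(V) :- gensym(v, V).   %% gensym/2 yields v1, v2, ... (illustrative naming)
```

Because random_member/2 drives the choice, repeated calls yield different rules; e.g. sketch_rule([[append,2,1],[head,1,1]], [a,b], Rule, Vars) may return Rule = [append,[b,a,v1]] one time and Rule = [head,[a,v2]] another.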
+ + + +caw00(off,f,[[append,2,1],[delete,2,1],[head,1,1],[tail,1,1],[member,1,1]],50,7,[[a,1]],[[b,1]],[],Program). +Program = [f,[1,b]],[[f,[a,b],:-,[[=,[b,a]]]]],[[b,1]] + +caw00(off,f,[[append,2,1],[delete,2,1],[head,1,1],[tail,1,1],[member,1,1]],50,7,[[a,[1,2]]],[[b,[1,2]]],[],Program). +Program = [f,[[1,2],b]],[[f,[a,b],:-,[[=,[b,a]]]]],[[b,[1,2]]] + +caw00(off,f,[[append,2,1],[delete,2,1],[head,1,1],[tail,1,1],[member,1,1]],50,4,[[a,1],[b,2]],[[c,[1,2]]],[],Program). +Program = interpret(off,[f,[1,2,c]],[[f,[a,b,c],:-,[[append,[a,b,g]],[=,[c,g]]]]],[[c,[1,2]]]) + +caw00(off,f,[[append,2,1],[delete,2,1],[head,1,1],[tail,1,1],[member,1,1]],50,6,[[a,1],[b,2],[c,3]],[[d,[1,2,3]]],[],Program). +Program = [interpret(off,[f,[1,2,3,d]],[[f,[a,b,c,d],:-,[[append,[a,b,e]],[append,[e,c,f]],[=,[d,f]]]]],[[d,[1,2,3]]])] + +edit out repeats +test rule,var +better arg finders + +caw00(off,f,[[append,2,1],[delete,2,1],[head,1,1],[tail,1,1],[member,1,1]],50,7,[[a,[4,1,2,3]],[b,1]],[[d,[4,2,3]]],[],Program). + +[debug] ?- optimise([[append,[a,b,c]],[append,[c,d,e]]],[a,b,d],[a,b,c,d],[e],P). +P = [[append, [a, b, c]], [append, [c, d, e]]] ; +%% choose latest var as output, remove duplicate args in rules x (max 2), reverse optimise input, new rules have old outputs as args v +[debug] ?- optimise([[append,[a,b,c]],[append,[a,b,c]]],[a,b],[a,b,c],[c],P). +P = [[append, [a, b, c]]] ; + +[debug] ?- optimise([[append,[a,b,c]],[delete,[a,b,c]]],[a,b],[a,b,c],[c],P). +P = [[append, [a, b, c]], [delete, [a, b, c]]] ; + +[debug] ?- optimise([[append,[a,b,c]],[delete,[a,b,x]]],[a,b],[a,b,c,x],[c],P). +P = [[append, [a, b, c]]] ; + +[debug] ?- optimise([[append,[a,b,c]],[delete,[c,b,d]],[member,[c,d]]],[a,b],[a,b,c,d],[d],P). +P = [[append, [a, b, c]], [delete, [c, b, d]], [member, [c, d]]] ; + +[debug] ?- optimise([[append,[a,b,c]],[delete,[c,b,d]],[member,[a,d]]],[a,b],[a,b,c,d],[d],P). 
+P = [[append, [a, b, c]], [delete, [c, b, d]], [member, [a, d]]] ; + +[debug] ?- optimise([[append,[a,b,c]],[delete,[c,b,d]],[member,[a,x]]],[a,b],[a,b,c,d],[d],P). +P = [[append, [a, b, c]], [delete, [c, b, d]]] ; + +[debug] ?- optimise([[append,[a,b,c]],[delete,[c,b,d]],[member,[c,x]]],[a,b],[a,b,c,d],[d],P). +P = [[append, [a, b, c]], [delete, [c, b, d]]] ; + +Wantedly fail: +optimise([[append,[a,e,c]]],[a],[a,e,c],[c],P). +optimise([[append,[a,b,c]],[append,[c,d,e]]],[a,d],[a,b,c,d],[e],P). +optimise([[append,[a,b,c]],[append,[c,d,e]]],[a,b],[a,b,c,d],[e],P). + +optimise([[delete,[a,e,c]],[append,[a,a,e]]],[a,e],[a,e,c],[c],P). +P = [[delete, [a, e, c]], [append, [a, a, e]]] . + +optimise([[append,[a,a,e]],[delete,[a,e,c]]],[a,e],[a,e,c],[c],P). +P = [[append, [a, a, e]], [delete, [a, e, c]]] . + +optimise([[append,[a,e,c]]],[a,e,c],[a,e,c],[c],P). +P = [[append, [a, e, c]]] . + +findrulesflowingtopv1([[append,[a,e,c]],[member,[a,e]]],[a],[a,e,c],[c],[],R,F). +R = [[append, [a, e, c]], [member, [a, e]]] + +check optimise works with member ef v +member ef and member in inputvars2 (cde) v +optimise - aeg (remove e), v +does optimise work with multiple rules with same output v +delete returning progs in optimise v +cut rule - + +aea cant point to itself in optimise - needs iterative deepening +aec where e not in iv1 or another pred, try first or second in prog v +don't pass rule to lower predicates v + +don't choose outputs from non new var v +don't run repeat preds + +make predicate, clause writer +member predicates returning no output + +**/ + +%% ML max 25 + +:- include('algdict.pl'). 
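+
+The optimise queries above illustrate backward data-flow pruning: a goal survives only if its output variable flows, directly or through later goals, into the wanted outputs. A simplified standalone sketch of that idea, as a hypothetical prune/3 (the real optimise/5 also tracks an IV1Flag and input-variable subsets), is:

```prolog
:- use_module(library(lists)).

%% prune(+Goals, +WantedOutputs, -Kept): scan the goal list backwards;
%% a goal [Name,Args] is kept when its last argument (its output) is
%% wanted, and its input arguments then become wanted as well.
prune(Goals, Wanted, Kept) :-
    reverse(Goals, Reversed),
    prune_(Reversed, Wanted, [], Kept).

prune_([], _, Kept, Kept).
prune_([[Name,Args]|Goals], Wanted, Acc, Kept) :-
    append(Ins, [Out], Args),
    (   member(Out, Wanted)
    ->  union(Ins, Wanted, Wanted2),
        prune_(Goals, Wanted2, [[Name,Args]|Acc], Kept)
    ;   prune_(Goals, Wanted, Acc, Kept)
    ).
```

For instance, prune([[append,[a,b,c]],[delete,[a,b,x]]], [c], P) gives P = [[append,[a,b,c]]], and prune([[append,[a,b,c]],[delete,[c,b,d]],[member,[c,d]]], [d], P) keeps all three goals, matching the optimise examples above.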
+ +caw00(Debug,PredicateName,Rules,MaxLength,TotalVars,InputVarList,OutputVarList,Program1,_Program2,Ps1) :- + findall([TestNumber,Queries,Functions], + test(TestNumber,Queries,Functions),Algorithms), %% Collects algorithms + algorithmstopredicates(Algorithms,[],Predicates), + repeat, + %%MaxLength2 is MaxLength + 1, + %%TotalVars = MaxLength, + /**randvars(MaxLength,MaxLength,[],RandVars), + populatevars(RandVars,MaxLength,[],PV), + Code is MaxLength + 1 + 97, + char_code(Char,Code), + OutputVarList=[[[v,Char],1]],**/ + retractall(debug(_)), + assertz(debug(Debug)), + retractall(totalvars(_)), + assertz(totalvars(TotalVars)), + caw0(Predicates,PredicateName,Rules,MaxLength,InputVarList,OutputVarList,Program1,_Program3,Ps), + sort(Ps,Ps1),not(Ps1=[]),!. + +randvars(0,_,V,V) :- !. +randvars(N,L,V1,V2) :- + random(N1), N2A is round(97+(N1*L)), char_code(V3,N2A), V31=[v,V3], ((member(V31,V1))->randvars(N,L,V1,V2); + (append(V1,[V31],V4), + NA is N-1, randvars(NA,L,V4,V2))),!. +randvars2(0,_L,V,V) :- !. +randvars2(N,L,V1,V2) :- + random(N1), N2A is round(97+(N1*L)), char_code(V3,N2A), atom_string(V3,V4), %%V41=[v,V4], + ((member(V4,V1))->randvars2(N,L,V1,V2); + (append(V1,[V4],V5), + NA is N-1, randvars2(NA,L,V5,V2))),!. +populatevars([],_,PV,PV) :- !. +populatevars([RV1|RVs],MaxLength2,PV1,PV2) :- + randvars2(MaxLength2,MaxLength2,[],RV2), + append(PV1,[[RV1,RV2]],PV3), + populatevars(RVs,MaxLength2,PV3,PV2),!. 
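+
+randvars/4 above draws N distinct variable names as [v,Char] terms, with character codes sampled between 97 (a) and roughly 97+L, retrying whenever a duplicate is drawn; populatevars/4 then pairs each such name with its own random list of names. These helpers are only referenced from the commented-out block in caw00 above. The output is random, so only its shape is fixed:

```prolog
?- randvars(3, 10, [], Vs).
%% e.g. Vs = [[v,f],[v,b],[v,i]] - three distinct [v,Char] terms, a..k
```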
+ +caw0(Algorithms,PredicateName,Rules,MaxLength,InputVarList,OutputVarList,Program1,Program2,Ps2) :- + varnames(InputVarList,[],InputVars,[],InputValues), + varnames(OutputVarList,[],OutputVars,[],_OutputValues), + retractall(outputvars(_)), + assertz(outputvars(OutputVars)), + append(InputVars,OutputVars,Vars11), + %%Vars11=InputVars, + %%Vars12=InputVars, + append(InputValues,OutputVars,Vars2), + %%append(InputValues,OutputValues,Values), + Query=[PredicateName,Vars2], + caw1(Algorithms,Query,PredicateName,Rules,MaxLength,Vars11,InputVars,InputVars,_,OutputVarList,OutputVars,Program1,Program2,[],Ps2), !. + +caw1(_,_,_,_,0,_,_,_,_,_,_,_,_,Ps,Ps) :- !. +caw1(Algorithms,Query,PredicateName,Rules,MaxLength,VarList,InputVars1,InputVars2,InputVars3,OutputVarList,OutputVars,Program1,Program2,Programs2,Ps1) :- + MaxLength2 is MaxLength - 1, + addrules0(InputVars2,OutputVars,OutputVars,[],Program3), + %%writeln([addrules(InputVars2,OutputVars,OutputVars,[],PenultimateVars,[],Program3)]), + %%optimise(Program1,InputVars1,InputVars2,PenultimateVars,Program4), %% IV2->3 + %%writeln([optimise(Program1,InputVars1,InputVars2,PenultimateVars,Program4)]), + append(Program1,Program3,Program5), + append(InputVars1,OutputVars,Vars2), + Program2=[ + [PredicateName,Vars2,":-", + Program5 + ] + ],debug(Debug), + + %%writeln([interpret(Debug,Query,Program2,OutputVarList2)]), + + interpret(Debug,Query,Program2,OutputVarList), + %%writeln([interpret(Debug,Query,Program2,OutputVarList2)]), + append(Programs2,[[Query,Program2,OutputVarList]],Programs3), + caw1a(Algorithms,Query,PredicateName,Rules,MaxLength2,VarList,InputVars1,InputVars2,InputVars3,OutputVarList,OutputVars,[],_Program2,Programs3,Ps1),!. + +%%caw1(_Query,_PredicateName,_Rules,_MaxLength,_VarList,_InputVars1,_InputVars2,_InputVars3,_OutputVarList,_OutputVars,_Program1,_Program4,Ps,Ps) :- writeln(here1),!. 
+caw1(Algorithms,Query,PredicateName,Rules,MaxLength2,VarList,InputVars1,InputVars2,InputVars3,OutputVarList,OutputVars,[],_Program2,Programs3,Programs3) :- !. + %%writeln([here1, caw1(Query,PredicateName,Rules,MaxLength2,VarList,InputVars1,InputVars2,InputVars3,OutputVarList,OutputVars,[],_Program21,Programs3,Programs3)]),!. + +caw1a(Algorithms,Query,PredicateName,Rules,MaxLength,VarList,InputVars1,InputVars2,InputVars3,OutputVarList,OutputVars,Program1,Program4,Ps1,Ps2) :- + + %%writeln([caw(Query,PredicateName,Rules,MaxLength,VarList,InputVars1,InputVars2,OutputVarList,OutputVars,Program1,Program4)]), + %%MaxLength2 is MaxLength - 1, + %%writeln(["ml",MaxLength2]), + reverse(InputVars2,InputVars5), + random_member([RuleName,NumInputs,NumOutputs],Rules), + %%writeln([member([RuleName,NumInputs,NumOutputs],Rules)]), + %%writeln([rule(RuleName,NumInputs,NumOutputs,VarList,VarList2,Rule)]), + rule(Algorithms,RuleName,NumInputs,NumOutputs,InputVars5,InputVars4,VarList,VarList2,Rule), +%%writeln([inputVars1,InputVars1]), +%% writeln([rule(RuleName,NumInputs,NumOutputs,InputVars5,InputVars4,VarList,VarList2,Rule)]), + append(Program1,[Rule],Program3), + %%writeln([inputVars3,InputVars3]), + %%InputVars2=InputVars3, + %%writeln([program4,Program4]), + +%%caw1a(Query,PredicateName,Rules,MaxLength,VarList,InputVars1,InputVars2,InputVars3,OutputVarList,OutputVars,Program1,Program4,Ps,Ps) :- +%%writeln([here,caw1a(Query,PredicateName,Rules,MaxLength,VarList,InputVars1,InputVars2,InputVars3,OutputVarList,OutputVars,Program1,Program4,Ps,Ps)]) + + +caw(Algorithms,Query,PredicateName,Rules,MaxLength,VarList2,InputVars1,InputVars4,InputVars3,OutputVarList,OutputVars,Program3,Program4,Ps1,Ps2),!. + + +caw(_,_,_,_,0,_,_,_,_,_,_,_,_,Ps,Ps) :- !. 
+caw(Algorithms,Query,PredicateName,Rules,MaxLength,VarList,InputVars1,InputVars2,InputVars3,OutputVarList,OutputVars,Program1,Program2,Programs1,Programs2) :- + MaxLength2 is MaxLength - 1, + addrules(InputVars2,OutputVars,OutputVars,[],_PenultimateVars,[],Program3), + %%writeln([addrules(InputVars2,OutputVars,OutputVars,[],PenultimateVars,[],Program3)]), + %%optimise(Program1,InputVars1,InputVars2,PenultimateVars,Program4), %% IV2->3 + %%writeln([optimise(Program1,InputVars1,InputVars2,PenultimateVars,Program4)]), + append(Program1,Program3,Program5), + append(InputVars1,OutputVars,Vars2), + Program2=[ + [PredicateName,Vars2,":-", + Program5 + ] + ],debug(Debug), +%%*** +%% () choose iv1 as args during caw, () eliminate e problem, could move forward in optimiser but don't need it v +%% should have a leading edge of 1 immediately previous (new output) as an arg in latest rule v, go backwards to choose latest possible args x, 3 x rules can have same as previous rule's output as an output x: at a time +%% chunks will solve having at least 1 rule that connects to last output +%% can optimise number of inputs + +%% test member, = in caw + + %%writeln([interpret(Debug,Query,Program2,OutputVarList2)]), + + interpret(Debug,Query,Program2,OutputVarList), + %%writeln([interpret(Debug,Query,Program2,OutputVarList2)]), + append(Programs1,[[Query,Program2,OutputVarList]],Programs3), + cawa(Algorithms,Query,PredicateName,Rules,MaxLength2,VarList,InputVars1,InputVars2,InputVars3,OutputVarList,OutputVars,[],_Program2,Programs3,Programs2),!. +%%caw(_,_,_,_,_,_,_,_,_,_,_,_,Ps,Ps) :- !. + +caw(Algorithms,Query,PredicateName,Rules,MaxLength2,VarList,InputVars1,InputVars2,InputVars3,OutputVarList,OutputVars,[],_Program2,Programs3,Programs3) :- + %%writeln([here2, caw(Query,PredicateName,Rules,MaxLength2,VarList,InputVars1,InputVars2,InputVars3,OutputVarList,OutputVars,[],_Program21,Programs3,Programs3)]), + !. 
+ + +cawa(Algorithms,Query,PredicateName,Rules,MaxLength,VarList,InputVars1,InputVars2,InputVars3,OutputVarList,OutputVars,Program1,Program4,Ps1,Ps2) :- + %%writeln([caw(Query,PredicateName,Rules,MaxLength,VarList,InputVars1,InputVars2,OutputVarList,OutputVars,Program1,Program4)]), + %%MaxLength2 is MaxLength - 1, + %%writeln(["ml",MaxLength2]), + random_member([RuleName,NumInputs,NumOutputs],Rules), + %%writeln([member([RuleName,NumInputs,NumOutputs],Rules)]), + %%writeln([rule(RuleName,NumInputs,NumOutputs,VarList,VarList2,Rule)]), + rule(Algorithms,RuleName,NumInputs,NumOutputs,InputVars2,InputVars4,VarList,VarList2,Rule), +%%writeln([inputVars1,InputVars1]), +%% writeln([rule(RuleName,NumInputs,NumOutputs,InputVars2,InputVars4,VarList,VarList2,Rule)]), + append(Program1,[Rule],Program3), + %%writeln([inputVars3,InputVars3]), + %%InputVars2=InputVars3, + %%writeln([program4,Program4]), +caw(Algorithms,Query,PredicateName,Rules,MaxLength,VarList2,InputVars1,InputVars4,InputVars3,OutputVarList,OutputVars,Program3,Program4,Ps1,Ps2), !. + +varnames([],Vars,Vars,Values,Values) :- !. +varnames(VarList,Vars1,Vars2,Values1,Values2) :- + VarList=[Var|Vars3], + Var=[VarName,Value], + append(Vars1,[VarName],Vars4), + append(Values1,[Value],Values3), + varnames(Vars3,Vars4,Vars2,Values3,Values2),!. + +addrules0(_,_,[],Program,Program) :- !. +addrules0(VarList,OutputVars1,OutputVars2,Program1,Program2) :- + OutputVars2=[OutputVar|OutputVars3], + random_member(Var,VarList), + random_member(OutputVar,OutputVars1), + append(Program1,[[[n,=],[OutputVar,Var]]],Program3), + addrules0(VarList,OutputVars1,OutputVars3,Program3,Program2),!. + +addrules(_,_,[],PV,PV,Program,Program) :- !. 
+addrules(VarList,OutputVars1,OutputVars2,PenultimateVars1,PenultimateVars2,Program1,Program2) :- + restlast(VarList,[],_,Var), + %%OutputVars2=[OutputVar|OutputVars3], + random_member(OutputVar,OutputVars2), + delete(OutputVars1,OutputVar,OutputVars3), +%% member(Var,VarList), + random_member(OutputVar,OutputVars1), + append(Program1,[[[n,=],[OutputVar,Var]]],Program3), + append(PenultimateVars1,[Var],PenultimateVars3), + addrules2(VarList,OutputVars1,OutputVars3,PenultimateVars3,PenultimateVars2,Program3,Program2),!. + +addrules2(_,_,[],PV,PV,Program,Program) :- !. +addrules2(VarList,OutputVars1,OutputVars2,PenultimateVars1,PenultimateVars2,Program1,Program2) :- +%% restlast(VarList,[],_,Var), + OutputVars2=[OutputVar|OutputVars3], + random_member(Var,VarList), + not(member(Var,PenultimateVars1)), + random_member(OutputVar,OutputVars1), + append(Program1,[[[n,=],[OutputVar,Var]]],Program3), + append(PenultimateVars1,[Var],PenultimateVars3), + addrules2(VarList,OutputVars1,OutputVars3,PenultimateVars3,PenultimateVars2,Program3,Program2),!. + +%% optimise([[append,[a,a,d]],[append,[a,a,e]],[append,[a,a,f]],[append,[a,b,g]]],[g],P). +/** +optimise(Program1,InputVars1,InputVars2,PenultimateVars,Program2) :- + reverse(Program1,Program4), + findrulesflowingtopv1(Program4,InputVars1,InputVars2,PenultimateVars,[],Rules,true), + %%findrulesflowingtopv1a(Program1,_Program32,InputVars1,InputVars2,PenultimateVars,[],_Rules1), + intersection(Program1,Rules,Program3), + unique1(Program3,[],Program2). +findrulesflowingtopv1(_,_,_,[],Rules,Rules,false). +findrulesflowingtopv1(Program0,InputVars1,InputVars2,Var,Rules1,Rules2,IV1Flag1) :- + (atom(Var);length(Var,1)), + findrulesflowingtopv20(Program0,Program0,InputVars1,InputVars2,Var,Rules1,Rules2,IV1Flag1). 
+findrulesflowingtopv1(Program0,InputVars1,InputVars2,Vars1,Rules1,Rules2,IV1Flag1) :- + Vars1=[Var|Vars2], + findrulesflowingtopv20(Program0,Program0,InputVars1,InputVars2,Var,Rules1,Rules3,IV1Flag2), + findrulesflowingtopv1(Program0,InputVars1,InputVars2,Vars2,Rules3,Rules2,IV1Flag3), + iv1flagdisjunction(IV1Flag2,IV1Flag3,IV1Flag1). + +%%findrulesflowingtopv2([],Program,Program,_,_,Rules,Rules). +findrulesflowingtopv20(_,[],_InputVars1,_InputVars2,_Var,Rules,Rules,false). +findrulesflowingtopv20(Program0,Rules4,InputVars1,InputVars2,Var,Rules1,Rules2,IV1Flag1) :- + Rules4=[Rule|Rules], + (findrulesflowingtopv2(Program0,Rule,InputVars1,InputVars2,Var,Rules1,Rules3,IV1Flag2)->true;(Rules3=Rules1,IV1Flag2=false)), + %%delete(Program0,Rule,Program1), + findrulesflowingtopv20(Program0,Rules,InputVars1,InputVars2,Var,Rules3,Rules2,IV1Flag3),%%p1->0 + iv1flagdisjunction(IV1Flag2,IV1Flag3,IV1Flag1). +%%findrulesflowingtopv2(_,[],[],_,_,_,Rules,Rules). +findrulesflowingtopv2(Program0,Rule,InputVars1,InputVars2,Var,Rules1,Rules2,IV1Flag1) :- + Rule=[_PredicateName,Vars], + restlast(Vars,[],Rest,Var), + %%delete(Program1,[PredicateName,Vars],Program2), + %%Program2=Program1, + %%(not(intersection(Rulesx,Rules1))-> x + %% append, append, unique1 + %%append(Rules1,[Rule],Rules3);Rules3=Rules1), + + %%member(Var2,Rest), + %%member(Var2,InputVars1), + + length(Rest,Length1), Length1>=1, + subtract(Rest,InputVars1,IV3s), + length(IV3s,Length3), + subtract(Rest,IV3s,IV1s), + length(IV1s,Length2), Length2>=1, + subtract(IV3s,InputVars2,[]), + + IV1Flag2=true, + + %%delete(Program0,Rule,Program1), + + %%(delete(Program0,Rule,Program3), + %%iv3s1(IV3s,Program3,IV3s,[]), + (Length3>=1-> + (findrulesflowingtopv1(Program0,InputVars1,InputVars2,IV3s,[],Rules5,IV1Flag3),not(Rules5=[])); + (Rules5=[],IV1Flag3=false)), + iv1flagdisjunction(IV1Flag2,IV1Flag3,IV1Flag4), + %%->true; Rules5=[],IV1Flag1=IV1Flag4), + + 
((findrulesflowingtopv1(Program0,InputVars1,InputVars2,IV1s,[],Rules6,IV1Flag5), %%iv1s->rest, etc + iv1flagdisjunction(IV1Flag4,IV1Flag5,IV1Flag1))->true;(Rules6=[],IV1Flag1=IV1Flag4)), + + append([Rule],Rules1,Rules9), + append(Rules9,Rules5,Rules7), + append(Rules7,Rules6,Rules8), + unique1(Rules8,[],Rules2). + +/** +findrulesflowingtopv2(_Program0,Rule,InputVars1,InputVars2,Var,Rules1,Rules2,IV1Flag1) :- + Rule=[_PredicateName,Vars], + restlast(Vars,[],Rest,Var), + %%delete(Program1,[PredicateName,Vars],Program2), + %%Program2=Program1, + (not(member(Rule,Rules1))-> + append(Rules1,[Rule],Rules2);Rules2=Rules1), + subset(Rest,InputVars2), + + intersection(Rest,InputVars1,Intersection), + length(Intersection,0), + +%% not((member(Var2,Rest), +%% member(Var2,InputVars1))), + + IV1Flag1=false. +**/ +/** +findrulesflowingtopv2(Program0,Rule,InputVars1,InputVars2,Var,Rules1,Rules2,IV1Flag1) :- + Rule=[_PredicateName,Vars], + restlast(Vars,[],Rest,Var), + %%delete(Program1,[PredicateName,Vars],Program3), + %%Program3=Program1, + %%append(Rules1,[Rule],Rules3), + subset(Rest,InputVars2), + + intersection(Rest,InputVars1,Intersection), + length(Intersection,0), + +%% not((member(Var2,Rest), +%% member(Var2,InputVars1))), + +%% delete(Program0,Rule,Program1), + + IV1Flag2=false, + findrulesflowingtopv1(Program0,InputVars1,InputVars2,Rest,[],Rules4,IV1Flag3), + %%not(Rules4=[]), + iv1flagdisjunction(IV1Flag2,IV1Flag3,IV1Flag1), + + append(Rules1,[Rule],Rules7), + append(Rules7,Rules4,Rules8), + unique1(Rules8,[],Rules2). +**/ +/** +%%->true;(Program2=Program1,Rules2=Rules1)). 
+findrulesflowingtopv2(Rule,Program0,Program1,_Program2,InputVars1,InputVars,Var,Rules1,Rules2,IV1Flag1) :- + Rule=[PredicateName,Vars], + restlast(Vars,[],Rest,Var), + %%delete(Program1,[PredicateName,Vars],Program4), + %%Program4=Program1, + append(Rules1,[[PredicateName,Vars]],Rules3), + findrulesflowingtopv1(Program0,Program1,_Program2,InputVars1,InputVars,Rest,Rules3,Rules2,IV1Flag3), + iv1flagdisjunction(IV1Flag2,IV1Flag3,IV1Flag1). + + %%findrulesflowingtopv2(Program5,Program2,Rest,Rules3,Rules2). + +**/ +iv1flagdisjunction(A,B,true) :- + ((A=true)->true; (B=true)),!. +iv1flagdisjunction(_,_,false) :- !. +/** +iv3s0([],_,IV3s1,IV3s2). +iv3s0(IV3s,Program0,IV3s1,IV3s2). + IV3s=[IV3|IV3s3], + iv3s1(IV3,Program0,IV3s1,IV3s4), + iv3s0(IV3s3,Program0,IV3s4,IV3s2). +iv3s1(_,[],IV3s,IV3s). +iv3s1(IV3,Program0,IV3s1,IV3s2) :- + Program0=[Rule|Rules], + iv3s2(IV3,Rule,IV3s1,IV3s3), + iv3s1(IV3,Rules,IV3s3,IV3s2). +iv3s2(IV3,Rule,IV3s,IV3s1,IV3s2). + Rule=[_PredicateName,Vars], + restlast(Vars,[],_Rest,IV3), + delete(IV3s1,IV3,IV3s2). + + +findrulesflowingtopv1a(_,_,_,_,[],Rules,Rules). +findrulesflowingtopv1a(Program1,Program2,InputVars1,InputVars2,Var,Rules1,Rules2) :- + atom(Var), + findrulesflowingtopv2a(Program1,Program2,InputVars1,InputVars2,Var,Rules1,Rules2). +findrulesflowingtopv1a(Program1,Program2,InputVars1,InputVars2,Vars1,Rules1,Rules2) :- + Vars1=[Var|Vars2], + findrulesflowingtopv2(Program1,Program3,InputVars1,InputVars2,Var,Rules1,Rules3), + findrulesflowingtopv1a(Program3,Program2,InputVars1,InputVars2,Vars2,Rules3,Rules2). +%%findrulesflowingtopv2([],Program,Program,_,_,Rules,Rules). +findrulesflowingtopv2a([],[],_,_,_,Rules,Rules). +findrulesflowingtopv2a(Program1,Program2,_InputVars1,InputVars2,Var,Rules1,Rules2) :- + member([PredicateName,Vars],Program1), + restlast(Vars,[],Rest,Var), + ( +%%delete(Program1,[PredicateName,Vars],Program2), +Program2=Program1, + append(Rules1,[[PredicateName,Vars]],Rules2), + subset(Rest,InputVars2)). 
+findrulesflowingtopv2a(Program1,Program2,InputVars1,InputVars2,Var,Rules1,Rules2) :-
+ member([PredicateName,Vars],Program1),
+ restlast(Vars,[],Rest,Var),
+ (
+%%delete(Program1,[PredicateName,Vars],Program3),
+Program3=Program1,
+ append(Rules1,[[PredicateName,Vars]],Rules3),
+ subset(Rest,InputVars2)),
+ findrulesflowingtopv1a(Program3,Program2,InputVars1,InputVars2,Rest,Rules3,Rules2).
+
+%%->true;(Program2=Program1,Rules2=Rules1)).
+findrulesflowingtopv2a(Program1,Program2,InputVars1,InputVars,Var,Rules1,Rules2) :-
+ member([PredicateName,Vars],Program1),
+ restlast(Vars,[],Rest,Var),
+ %%delete(Program1,[PredicateName,Vars],Program4),
+ Program4=Program1,
+ append(Rules1,[[PredicateName,Vars]],Rules3),
+ findrulesflowingtopv1a(Program4,Program2,InputVars1,InputVars,Rest,Rules3,Rules2).
+ %%findrulesflowingtopv2(Program5,Program2,Rest,Rules3,Rules2).
+ **/
+restlast([],_,_,_) :- fail, !.
+restlast([Last],Rest,Rest,Last) :-
+ Last=[v,_],!.
+restlast(Last,Rest,Rest,Last) :-
+ length(Last,1),!.
+restlast(Vars1,Rest1,Rest2,Last) :-
+ Vars1=[Var|Vars2],
+ append(Rest1,[Var],Rest3),
+ restlast(Vars2,Rest3,Rest2,Last),!.
+
+%% check list reduced to predicate names/arity are together, error otherwise
+%% numbers in order, error otherwise
+
+algorithmstopredicates([],Predicates1,Predicates2) :-
+ sort(Predicates1,Predicates2),!.
+algorithmstopredicates(Algorithms1,Predicates1,Predicates2) :-
+ Algorithms1=[Algorithm1|Algorithms2],
+ Algorithm1=[_,_,Algorithm3],
+ %% Algorithm3: find rest of clause set in whole algorithm, not just next
+ append_list(Predicates1,Algorithm3,Predicates4),
+ algorithmstopredicates(Algorithms2,Predicates4,Predicates2),!.
+
+append_list(A,[],A) :-!.
+append_list(A,List,B) :-
+ List=[Item|Items],
+ append(A,[Item],C),
+ append_list(C,Items,B).
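As an aside on the `[v,Name]` variable encoding used throughout this file, the following queries (illustrative only, not part of algwriter; results traced by hand against the clauses above) show what `restlast/4` computes:

```prolog
% Illustrative queries for restlast/4 (defined above).

% With the [v,Name] encoding, restlast/4 splits a variable list into
% its leading variables and its last variable:
% ?- restlast([[v,a],[v,b],[v,c]],[],Rest,Last).
% Rest = [[v,a],[v,b]],
% Last = [v,c].

% It fails on the empty list (the first clause calls fail), which
% callers use to reject empty variable lists:
% ?- restlast([],[],_,_).
% false.
```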
+
+%%rule(Algorithms %% Assume predicate names are all different, or the same if they are the same, so sort them (should give error for same names, different predicate at start)
+
+rule(RuleName,1,1,InputVars1,InputVars2,VarList,VarList2,Rule) :-
+ random_member(Var,InputVars1),
+ rule2(RuleName,Var,VarList,VarList2,Rule,Var1),
+ restlast(InputVars1,[],_,Last), %% Last should be outputs - 2nd last rule?
+ (Var=Last->true;Last=Var1),
+ append(InputVars1,[Var1],InputVars2),!.
+rule(RuleName,1,2,InputVars1,InputVars2,VarList,VarList2,Rule) :-
+ random_member(Var,InputVars1),
+ rule3(RuleName,Var,VarList,VarList2,Rule,Var1,Var2),
+ restlast(InputVars1,[],_,Last),
+ (Var=Last->true;
+ (Last=Var1->true;Last=Var2)),
+ append(InputVars1,[Var1,Var2],InputVars2),!.
+rule(RuleName,2,0,InputVars,InputVars,VarList,VarList,Rule) :-
+%%writeln([rule(RuleName,2,1,InputVars1,InputVars2,VarList,VarList2,Rule)]),
+ random_member(Var,InputVars),
+ random_member(Vara,InputVars),
+ rule6(RuleName,Var,Vara,_VarList,_VarList2,Rule),
+ restlast(InputVars,[],_,Last),
+%%writeln([last,Last]),
+ (Var=Last->true;Vara=Last),!.
+rule(RuleName,2,1,InputVars1,InputVars2,VarList,VarList2,Rule) :-
+%%writeln([rule(RuleName,2,1,InputVars1,InputVars2,VarList,VarList2,Rule)]),
+ random_member(Var,InputVars1),
+ random_member(Vara,InputVars1),
+ rule4(RuleName,Var,Vara,VarList,VarList2,Rule,Var1),
+ restlast(InputVars1,[],_,Last),
+%%writeln([last,Last]),
+ ((Var=Last->true;Vara=Last)->true;
+ (Last=Var1)),
+%%writeln([var,Var,vara,Vara]),
+ append(InputVars1,[Var1],InputVars2),!.
+rule(RuleName,2,2,InputVars1,InputVars2,VarList,VarList2,Rule) :-
+ random_member(Var,InputVars1),
+ random_member(Vara,InputVars1),
+ rule5(RuleName,Var,Vara,VarList,VarList2,Rule,Var1,Var2),
+ restlast(InputVars1,[],_,Last),
+ ((Var=Last->true;Vara=Last)->true;
+ (Last=Var1->true;Last=Var2)), %% make last var2, use different inputs from previous rule, make this line use a version of previous line as well (args from rule before that) - redo rules based on past programming
+ append(InputVars1,[Var1,Var2],InputVars2),!.
+
+
+rule2(RuleName,Var,VarList,VarList2,Rule,Var1) :-
+ var(VarList,Var1,VarList2),
+ Rule=[RuleName,[Var,Var1]],!.%%,
+ %%member(Var,!.
+
+rule3(RuleName,Var,VarList,VarList3,Rule,Var1,Var2) :-
+ var(VarList,Var1,VarList2),
+ var(VarList2,Var2,VarList3),
+ Rule=[RuleName,[Var,Var1,Var2]],!.
+
+rule6(RuleName,Var,Vara,_VarList,_VarList2,Rule) :-
+ Rule=[RuleName,[Var,Vara]],!.
+
+rule4(RuleName,Var,Vara,VarList,VarList2,Rule,Var1) :-
+ var(VarList,Var1,VarList2),
+ Rule=[RuleName,[Var,Vara,Var1]],!.
+
+%%ae be with predicate support also
+rule5(RuleName,Var,Vara,VarList,VarList3,Rule,Var1,Var2) :-
+ var(VarList,Var1,VarList2),
+ var(VarList2,Var2,VarList3),
+ Rule=[RuleName,[Var,Vara,Var1,Var2]],!.
+
+%%var(Item,Var,Vars,Vars) :-
+%% member([Item,Var],Vars).
+var(Vars1,Var1,Vars2) :-
+ length(Vars1,Vars1Length1),
+ Vars1Length2 is Vars1Length1-1,
+ length(Vars3,Vars1Length2),
+ append(Vars3,[Var2],Vars1),
+ Var2=[v,Var21],
+ char_code(Var21,Var2Code1),
+ Var2Code2 is Var2Code1 + 1,
+ var2(Var2Code2,Var1),
+ append(Vars1,[Var1],Vars2),!.
+
+var2(Code,Var1) :-
+ outputvars(OutputVars),
+ totalvars(TotalVars),
+ Code2 is 96+TotalVars,
+ Code =< Code2, %% 122
+ char_code(Var11,Code),
+ Var1=[v,Var11],
+ not(member(Var1,OutputVars)),!.
+var2(Var2Code,Code3) :-
+ Var2Code2 is Var2Code + 1,
+ totalvars(TotalVars),
+ Code2 is 96+TotalVars,
+ Var2Code2 =< Code2,
+ var2(Var2Code2,Code31),
+ Code3=[v,Code31],!.
+
+%% this goes from e.g.
c not a to TotalVars
+%% skip over vars, start from a x reassign vars to abc etc
+
+%% if returning 12 from 12345 remove 345 args
+
+%% try lowest possible number of vars first (return shortest program first), then keep on adding number of vars
+%% can eliminate same program both here and in assessment verification
diff --git a/Daily-Regimen-main/get_y b/Daily-Regimen-main/get_y
new file mode 100644
index 0000000000000000000000000000000000000000..36e6384f3b76685d76d901e84993fb870ed3a59e
Binary files /dev/null and b/Daily-Regimen-main/get_y differ
diff --git a/Daily-Regimen-main/get_y.pl b/Daily-Regimen-main/get_y.pl
new file mode 100644
index 0000000000000000000000000000000000000000..9002bbf177bd1624bb0e8b429a4be5ed5b143177
--- /dev/null
+++ b/Daily-Regimen-main/get_y.pl
@@ -0,0 +1,8 @@
+main:-catch(get_y,Err,handle_error(Err)),halt.
+main :- halt(1).
+handle_error(_Err):-
+ halt(1).
+get_y :-
+repeat,writeln("Please enter \"y\":"),
+read_string(user_input,"\n\r","\n\r",_,S),
+S="y",!.
\ No newline at end of file
diff --git a/Daily-Regimen-main/going_to_5689 b/Daily-Regimen-main/going_to_5689
new file mode 100644
index 0000000000000000000000000000000000000000..5549c8b384a7cb4c0069e4226b9634a3ceabdd9e
Binary files /dev/null and b/Daily-Regimen-main/going_to_5689 differ
diff --git a/Daily-Regimen-main/going_to_5689.pl b/Daily-Regimen-main/going_to_5689.pl
new file mode 100644
index 0000000000000000000000000000000000000000..11fb58b9e093e2f4310b88518e35e7d182cc2c23
--- /dev/null
+++ b/Daily-Regimen-main/going_to_5689.pl
@@ -0,0 +1,11 @@
+:-include('../listprologinterpreter/listprolog.pl').
+main:-catch(going_to_5689,Err,handle_error(Err)),halt.
+main :- halt(1).
+handle_error(_Err):-
+ halt(1).
+going_to_5689 :-
+open_file_s("tt_log.txt",File_term),
+(append(_,[[n=_N1,_Pres_D,_Pres_M,_Pres_Y,Fut_D,Fut_M,Fut_Y]],File_term)->true;fail),
+foldr(string_concat,["Going to ",Fut_D," ",Fut_M," ",Fut_Y,", press \"y\":"],S),
+print_message(information,S),
+!.
\ No newline at end of file diff --git a/Daily-Regimen-main/going_to_5689.sh b/Daily-Regimen-main/going_to_5689.sh new file mode 100644 index 0000000000000000000000000000000000000000..c97fa696aca85858d8c02d9ae76ad401788f4ccb --- /dev/null +++ b/Daily-Regimen-main/going_to_5689.sh @@ -0,0 +1,4 @@ +#!/bin/bash +#cd ../Daily-Regimen +swipl --stack_limit=40G --goal=main --stand_alone=true -o going_to_5689 -c going_to_5689.pl +./going_to_5689 \ No newline at end of file diff --git a/Daily-Regimen-main/group_meditation.sh b/Daily-Regimen-main/group_meditation.sh new file mode 100644 index 0000000000000000000000000000000000000000..8a5daddd36057075b7e620bfcb9bee732b0f60bc --- /dev/null +++ b/Daily-Regimen-main/group_meditation.sh @@ -0,0 +1,4 @@ +#!/bin/bash +cd ../Text-to-Breasonings +swipl --stack_limit=40G --goal=main --stand_alone=true -o group_meditation -c group_meditation.pl +./group_meditation \ No newline at end of file diff --git a/Daily-Regimen-main/h1.pl b/Daily-Regimen-main/h1.pl new file mode 100644 index 0000000000000000000000000000000000000000..d81d40c698cf265080aebe2ad254853af8185972 --- /dev/null +++ b/Daily-Regimen-main/h1.pl @@ -0,0 +1,11 @@ +:-include('cgpt_combophil10.pl'). +main :- open_string_file_s("../Text-to-Breasonings/file.txt",A),split_string(A,".\n\r",".\n\r",B),delete(B,"",A1),findall(C,(member(C,A1),string_chars(C,D),not(forall(member(E,D),char_type(E,white)))),F), + +findall(_,(member(G,F), +cgpt_combophil(["SALES","FINANCE","ECONOMICS","BOTS"],30,G)),_),!. 
+ +% 10 ugrad +% 20 honours +% 30 master +% 40 research master +% 50 phd diff --git a/Daily-Regimen-main/la_com.txt b/Daily-Regimen-main/la_com.txt new file mode 100644 index 0000000000000000000000000000000000000000..8ede6d2efa6076cd6b24d4d368906db4523155b1 --- /dev/null +++ b/Daily-Regimen-main/la_com.txt @@ -0,0 +1,64 @@ +% LA com vps with det.txt - when do algorithms each day + +*** NOTE START AND END TIMES 2.51- before 3.58 + +# ssh root@x.x.x.x -p x +cd GitHub/Lucian-Academy + +swipl -G100g -T20g -L2g + +['la_com1.pl']. + +la_com1. + + +halt. + +swipl -G100g -T20g -L2g + +['sectest_p.pl']. + +sectest_p. + +halt. + + +cp politics_university.txt ../Text-to-Breasonings/file.txt +cd ../Algorithm-Writer-with-Lists/ + +swipl -G100g -T20g -L2g + +['../Algorithm-Writer-with-Lists/grammar_logic_to_alg.pl']. + +FileName2='./lacomfile1.txt', +FileName3='lacomfile1.txt', + + +protocol(FileName2). +notrace. + +grammar_logic_to_alg1. + +noprotocol. + +halt. + +mv lacomfile1.txt ../Text-to-Breasonings/file.txt +cd ../Text-to-Breasonings/ + +swipl -G100g -T20g -L2g + +['../Text-to-Breasonings/text_to_breasonings.pl']. + +FileName2='./file.txt', +FileName3='file.txt', + +time((W is 50*4,texttobr2(u,FileName3,u,u,false,false,false,false,false,false,W),texttobr(1,FileName3,u,u))), + +atom_concat('echo "" | mutt -s "LA com 1" luciangreen@lucianacademy.com -a ',FileName3,A1), + +shell1(A1). + +halt. + +logout diff --git a/Daily-Regimen-main/la_com_bot_prep.txt b/Daily-Regimen-main/la_com_bot_prep.txt new file mode 100644 index 0000000000000000000000000000000000000000..4de2aaf5b4ba6e93942f252530bdfb34e4149f42 --- /dev/null +++ b/Daily-Regimen-main/la_com_bot_prep.txt @@ -0,0 +1,34 @@ +%% * br lp.txt in file.txt before use +%% combophil_grammar_logic_to_alg.txt +%% 40 including 10 sales with 10 monies *2 for other person + +cd GitHub/Algorithm-Writer-with-Lists + +swipl -G100g -T20g -L2g +['combophil_grammar_logic_to_alg_vps.pl']. +['grammar_logic_to_alg.pl']. 
+['../Text-to-Breasonings/text_to_breasonings.pl']. +time((combophil_grammar_logic_to_alg1(_,String000),Q=1,N is Q*32, M is 4000,texttobr2(N,u,String000,M,false,false,false,false,false,false),texttobr(N,u,String000,M))). +P is 2+2+3+3+5+10+2+42+5+2+10, texttobr2(P), +R is 209,texttobr2(R), +LengthMeditatorsandDoctors2 is 1000, texttobr2(LengthMeditatorsandDoctors2). +halt. +logout + +%% Give the meditators, etc. the As. + +%% was 600 bots + +%% *2 Head of state to enable mind +%% *2 for dot on +%% *3 bot, 10% money, thoughts +%% *3 for 1 for graciously give: trans, 1 for receiver, 1 for do it +%% *5 for marketing, human resources, accounting, finance, management +%% *10 for meditation, pedagogy, medicine, comp eng, popol, theol, societol, mr, tt, bus +%% *2 for I want to buy +%% *42 for b, 40 algorithms for 100%, sales are related to topic of argument +%% *5 for orig,agency As,agency Bs,company As,company Bs +%% *2 for strictness +%% *10 for updates +%% *209 for daily regimen +%% 300 29 8 20 diff --git a/Daily-Regimen-main/please_enter_y.sh b/Daily-Regimen-main/please_enter_y.sh new file mode 100644 index 0000000000000000000000000000000000000000..96d5443d5b50c4e64ca61d804cc67d0716025f5f --- /dev/null +++ b/Daily-Regimen-main/please_enter_y.sh @@ -0,0 +1,4 @@ +#!/bin/bash +#cd ../Daily-Regimen +swipl --stack_limit=40G --goal=main --stand_alone=true -o get_y -c get_y.pl +./get_y \ No newline at end of file diff --git a/Daily-Regimen-main/prompt_question b/Daily-Regimen-main/prompt_question new file mode 100644 index 0000000000000000000000000000000000000000..b6c65c279ef2a3df5115c1f213759041b649cee8 Binary files /dev/null and b/Daily-Regimen-main/prompt_question differ diff --git a/Daily-Regimen-main/rcawpastea vps.txt b/Daily-Regimen-main/rcawpastea vps.txt new file mode 100644 index 0000000000000000000000000000000000000000..fdaa9af474396b3270668deef0ec9cfcf57fb449 --- /dev/null +++ b/Daily-Regimen-main/rcawpastea vps.txt @@ -0,0 +1,308 @@ +#% member(List,Item) 
(below) only works with deprecated lpi + +# ssh root@x.x.x.x -p x +# cd ../var/www/lucianacademy.ml/ + +cd GitHub/Text-to-Breasonings + +swipl -G100g -T20g -L2g + +%% LA.com hits + +['text_to_breasonings.pl']. +texttobr2(4,"../Daily-Regimen/filekeeplacomhits.txt",u,u),texttobr(4,"../Daily-Regimen/filekeeplacomhits.txt",u,u). + +%% Give the thoughts to me (rb, gg or b, rb) + +['texttobr2qb']. +texttobr2(3). + +halt. + +%% Meditation and Medicine + +swipl -G100g -T20g -L2g + +[meditationnoreplace]. +[meditatorsanddoctors]. +meditation. + +halt. +swipl -G100g -T20g -L2g +%%**[meditationnoreplace]. + +%%**['cawplistprolog']. +['../Daily-Regimen/rcawpspec.pl']. +%%[medicinenoreplace]. + +%%medicine. + +%% Preparation for Mind Reading, (point at 100 algs per sentence x) + +leash(-all),visible(+all). +protocol("./file.txt"). +rcawp1(1). + +noprotocol. + + +%% ** Stop - close window x halt (last time I stopped ) +halt. +swipl -G100g -T20g -L2g +['text_to_breasonings.pl']. +[meditatorsanddoctors]. +meditators(Meditators),doctors(Doctors),append(Meditators,Doctors,MeditatorsandDoctors),length(MeditatorsandDoctors,LengthMeditatorsandDoctors), + +time((N is 100*4, M is 4000, F="../Daily-Regimen/file-meditations.txt",texttobr2(N,F,u,M,false,false,false,false,false,false),texttobr(N,F,u,M))). + +%% Give the meditators, etc. the As. + +[texttobr2qb]. + +[meditatorsanddoctors]. +texttobr2(2). %% Head of state to enable mind reading +meditators(Meditators),doctors(Doctors),append(Meditators,Doctors,MeditatorsandDoctors),length(MeditatorsandDoctors,LengthMeditatorsandDoctors1), + +LengthMeditatorsandDoctors2 is 3*LengthMeditatorsandDoctors1, +texttobr2(LengthMeditatorsandDoctors2). + +halt. + +%% Security + +%% First half + +swipl -G100g -T20g -L2g + +FileName2='./sfile1.txt', +FileName3='sfile1.txt', + +protocol(FileName2). +notrace. + +[mindreadtestsec]. + +sectest0. + +noprotocol. + +['text_to_breasonings.pl']. 
+ +FileName2='./sfile1.txt', +FileName3='sfile1.txt', + +time((texttobr2(u,FileName3,u,u,false,false,false,false,false,false),texttobr(u,FileName3,u,u))), + +atom_concat('echo "" | mutt -s "Security 1" admin@domain.com -a ',FileName3,A1), + +shell1(A1). + +halt. + +%% Second half + +swipl -G100g -T20g -L2g + +FileName2='./sfile2.txt', +FileName3='sfile2.txt', + +protocol(FileName2). +notrace. + +[mindreadtestsec]. + +sectest1. + +noprotocol. + +['text_to_breasonings.pl']. + +FileName2='./sfile2.txt', +FileName3='sfile2.txt', + +time((texttobr2(u,FileName3,u,u,false,false,false,false,false,false),texttobr(u,FileName3,u,u))), + +atom_concat('echo "" | mutt -s "Security 2" admin@domain.com -a ',FileName3,A1), + +shell1(A1). + +halt. + +%% Give the meditators, etc. the As. + +swipl -G100g -T20g -L2g + +[mindreadtestsec]. + +[texttobr2qb]. + +[meditatorsanddoctors]. +meditators(Meditators),doctors(Doctors),append(Meditators,Doctors,MeditatorsandDoctors),length(MeditatorsandDoctors,LengthMeditatorsandDoctors1), + +LengthMeditatorsandDoctors2 is 2*LengthMeditatorsandDoctors1, +texttobr2(LengthMeditatorsandDoctors2). + +halt. + +%% *** Meditations + + +%% first_person +swipl -G100g -T20g -L2g +['../listprologinterpreter/listprolog.pl']. +[mindreadingcaw]. +caw00(off,[n,f],[[[n,append],2,1],[[n,delete],2,1],[[n,head],1,1],[[n,tail],1,1],[[n,member],1,1]],2,8,_InputVarList,_OutputVarList,[],_Program2,Ps1),Ps1=[[Query,Functions,_Result1]|_Rest],leash(-all),visible(+all),protocol("./first_person.txt"),trace,interpret(off,Query,Functions,_Result2),notrace,noprotocol. +halt. +swipl -G100g -T20g -L2g +['text_to_breasonings.pl']. +time((N is 3*2,%% 1 medit, 1 role, 1 degrees + M is 8000, texttobr2(N,"first_person.txt",u,M,false,false,false,false,false,false),texttobr(N,"first_person.txt",u,M))). +%% Give the meditator the As. +[texttobr2qb]. +texttobr2(3). 
%% radio button for As, graciously give or blame, radio button for graciously give or blame +atom_concat('echo "" | mutt -s "Meditations first_person" admin@domain.com -a first_person.txt','',A1), +shell1(A1). +halt. + +%% Psychiatrist + +%% First half + +swipl -G100g -T20g -L2g + +FileName2='./pfile1.txt', +FileName3='pfile1.txt', + +protocol(FileName2). +notrace. + +[mindreadtestpsychiatrist]. +[meditatorsanddoctors]. + +sectest0. + +noprotocol. + +['text_to_breasonings.pl']. + +FileName2='./pfile1.txt', +FileName3='pfile1.txt', + +time((texttobr2(u,FileName3,u,u,false,false,false,false,false,false),texttobr(u,FileName3,u,u))), + +atom_concat('echo "" | mutt -s "Psychiatric Report 1" admin@domain.com -a ',FileName3,A1), + +shell1(A1). + +halt. + +%% Second half + +swipl -G100g -T20g -L2g + +FileName2='./pfile2.txt', +FileName3='pfile2.txt', + +protocol(FileName2). +notrace. + +[mindreadtestpsychiatrist]. +[meditatorsanddoctors]. + +sectest1. + +noprotocol. + +['../Text-to-Breasonings/text_to_breasonings.pl']. + +FileName2='./pfile2.txt', +FileName3='pfile2.txt', + +time((texttobr2(u,FileName3,u,u),texttobr(u,FileName3,u,u))), + +atom_concat('echo "" | mutt -s "Psychiatric Report 2" admin@domain.com -a ',FileName3,A1), + +shell1(A1). + +halt. + +%% Give the meditators, etc. the As. + +swipl -G100g -T20g -L2g + +[mindreadtestsec]. + +[texttobr2qb]. + +[meditatorsanddoctors]. +meditators(Meditators),doctors(Doctors),append(Meditators,Doctors,MeditatorsandDoctors),length(MeditatorsandDoctors,LengthMeditatorsandDoctors1), + +LengthMeditatorsandDoctors2 is 2*LengthMeditatorsandDoctors1, +texttobr2(LengthMeditatorsandDoctors2). + +halt. + + + +%% Thoughts + +%% First half + +swipl -G100g -T20g -L2g + +FileName2='./tfile1.txt', +FileName3='tfile1.txt', + +protocol(FileName2). +notrace. + +[mindreadtestsecthoughts]. + +sectest0. + +noprotocol. + +['../Text-to-Breasonings/text_to_breasonings.pl']. 
+ +FileName2='./tfile1.txt', +FileName3='tfile1.txt', + +time((texttobr2(u,FileName3,u,u),texttobr(u,FileName3,u,u))), + +atom_concat('echo "" | mutt -s "Thoughts 1" admin@domain.com -a ',FileName3,A1), + +shell1(A1). + +halt. + +%% Second half + +swipl -G100g -T20g -L2g + +FileName2='./tfile2.txt', +FileName3='tfile2.txt', + +protocol(FileName2). +notrace. + +[mindreadtestsecthoughts]. + +sectest1. + +noprotocol. + +['../Text-to-Breasonings/text_to_breasonings.pl']. + +FileName2='./tfile2.txt', +FileName3='tfile2.txt', + +time((texttobr2(u,FileName3,u,u),texttobr(u,FileName3,u,u))), + +atom_concat('echo "" | mutt -s "Thoughts 2" admin@domain.com -a ',FileName3,A1), + +shell1(A1). + +halt. + + +logout diff --git a/Daily-Regimen-main/rcawpspec.pl b/Daily-Regimen-main/rcawpspec.pl new file mode 100644 index 0000000000000000000000000000000000000000..dc7ba30752ada1d644529900b69dc945a3683e4e --- /dev/null +++ b/Daily-Regimen-main/rcawpspec.pl @@ -0,0 +1,49 @@ +%% Writes and breasons out random algorithms with predicates + +:- use_module(library(date)). + +rcawp1(N) :- + get_time(TS),stamp_date_time(TS,date(Year,Month, + Day,Hour1,Minute1,_Seconda,_A,_TZ,_False),local), + concat_list("rcawp", + [Year,Month,Day,Hour1,Minute1,".txt"],File1), + string_concat("./",File1,File2), + leash(-all),visible(+all), + protocol(File1), + time(rcawp2(N)), + noprotocol, + %%texttobr2(u,File2,u,u),texttobr(u,File2,u,u). + writeln([File2,"not t2b2'ed out. Please do manually."]). + +rcawp2(0) :- !. +rcawp2(Count1) :- + writeln([algorithm,Count1]), + random(N1),N2 is round(N1*1000), + random(N3),N4 is round(N3*1000), + N5 is N2+N4, + %%caw00(off,function0,[],1,4,[[[[[v,a],N2],[[v,b],N4]], + %% [[[v,c],N5]],true]],[],Program), + writeln(Program), + writeln(""), + %%trace, + %%interpret(off,[[n,function0],[N2,N4,[v,c]]],Program, + %% _A), %% Replaced [[[v,c],N5]] with _A + %%notrace, + writeln(""), + writeln(""), + Count2 is Count1-1, + rcawp2(Count2). + +concat_list(A,[],A) :-!. 
+concat_list(A,List,B) :- + List=[Item|Items], + concat_list2(A,[Item],C), + concat_list(C,Items,B). +concat_list2(A,List,C) :- + ((List=[[Item]]->true;List=[Item])-> + string_concat(A,Item,C); + fail),!. +concat_list2(A,Item,C) :- + concat_list(A,Item,C),!. + + \ No newline at end of file diff --git a/Daily-Regimen-main/text_to_breasonings.txt b/Daily-Regimen-main/text_to_breasonings.txt new file mode 100644 index 0000000000000000000000000000000000000000..f606fbd51ff2e7fef72c8f53797cb00b884360cd --- /dev/null +++ b/Daily-Regimen-main/text_to_breasonings.txt @@ -0,0 +1,105 @@ +% grammar_logic_to_alg2.pl instructions + +% breason out file.txt using t2b +% run gramm log +% run gl2a +% run t2b reading on gl2a output + +cd GitHub/Text-to-Breasonings/ + +%truncate -s 2000 file.txt + +swipl +%['truncate.pl']. +%truncate1("file.txt",14000,"file.txt"). + +% ['truncate_words_conserving_formatting.pl']. +%truncate_words_conserving_formatting(["file","file.txt"],14000,["file","file.txt"]). + +['text_to_breasonings.pl']. +N=1,M=14000,texttobr2(N,"file.txt",u,M,false,false,false,false,false,false). + +% ******* STOP HERE, THEN DO BELOW + +N=1,M=14000,texttobr(N,"file.txt",u,M). + +halt. + + + +cd ../Algorithm-Writer-with-Lists/ + +swipl +['grammar_logic_to_alg.pl']. + +FileName2='./gl2a.txt', +FileName3='gl2a.txt', + + +protocol(FileName2). +notrace. + +grammar_logic_to_alg1. + +noprotocol. + +halt. + + +cd ../Algorithm-Writer-with-Lists/ + +swipl + +['../Text-to-Breasonings/text_to_breasonings.pl']. + +FileName2='./gl2a.txt', +FileName3='gl2a.txt', + +time((W is 6*4,texttobr2(u,FileName3,u,u,false,false,false,false,false,false,W),texttobr(1,FileName3,u,u))). + +% atom_concat('echo "" | mutt -s "T Profs 1" luciangreen@lucianacademy.com -a ',FileName3,A1), + +% shell1(A1). + +halt. + + +mv gl2a.txt ../Text-to-Breasonings/file.txt +cd ../Algorithm-Writer-with-Lists/ + +swipl +['grammar_logic_to_alg2.pl']. + +FileName2='./gl2a2.txt', +FileName3='gl2a2.txt', + +protocol(FileName2). 
+notrace. + +grammar_logic_to_alg2(24). + +% ******* STOP HERE, THEN DO BELOW + +noprotocol. + +halt. + +%mv gl2a2.txt ../. +%cd ../ + +swipl + +['../Text-to-Breasonings/text_to_breasonings.pl']. + +FileName2='./gl2a2.txt', +FileName3='gl2a2.txt', + +time((W is 6*4,texttobr2(u,FileName3,u,u,false,false,false,false,false,false,W),texttobr(1,FileName3,u,u))). + +% atom_concat('echo "" | mutt -s "T Profs 1" luciangreen@lucianacademy.com -a ',FileName3,A1), + +%shell1(A1). + +halt. + +logout diff --git a/Daily-Regimen-main/texttobr2_square.sh b/Daily-Regimen-main/texttobr2_square.sh new file mode 100644 index 0000000000000000000000000000000000000000..a8bfe5d2968c61c3f0b7a4d92b6c0c66ffe9e925 --- /dev/null +++ b/Daily-Regimen-main/texttobr2_square.sh @@ -0,0 +1,4 @@ +#!/bin/bash +cd ../Text-to-Breasonings +swipl --goal=main --stand_alone=true -o texttobr2_square -c texttobr2_square.pl +./texttobr2_square diff --git a/Daily-Regimen-main/time_hop.sh b/Daily-Regimen-main/time_hop.sh new file mode 100644 index 0000000000000000000000000000000000000000..3babca052e999c15f1430c4259a2a55d0f981d34 --- /dev/null +++ b/Daily-Regimen-main/time_hop.sh @@ -0,0 +1,4 @@ +#!/bin/bash +cd ../Text-to-Breasonings +swipl --goal=main --stand_alone=true -o time_hop -c time_hop.pl +./time_hop diff --git a/Daily-Regimen-main/tt_log.txt b/Daily-Regimen-main/tt_log.txt new file mode 100644 index 0000000000000000000000000000000000000000..0637a088a01e8ddab3bf3fa98dbe804cbde1a0dc --- /dev/null +++ b/Daily-Regimen-main/tt_log.txt @@ -0,0 +1 @@ +[] \ No newline at end of file diff --git a/Essay-Helper/.DS_Store b/Essay-Helper/.DS_Store new file mode 100644 index 0000000000000000000000000000000000000000..16bea10c598b04720243aff043abc7bc278ce21d Binary files /dev/null and b/Essay-Helper/.DS_Store differ diff --git a/Essay-Helper/LICENSE b/Essay-Helper/LICENSE new file mode 100644 index 0000000000000000000000000000000000000000..3a27c90f1c2986ac0fb73b4ae956c6205343dfdd --- /dev/null +++ 
b/Essay-Helper/LICENSE @@ -0,0 +1,29 @@ +BSD 3-Clause License + +Copyright (c) 2020, Lucian Green +All rights reserved. + +Redistribution and use in source and binary forms, with or without +modification, are permitted provided that the following conditions are met: + +1. Redistributions of source code must retain the above copyright notice, this + list of conditions and the following disclaimer. + +2. Redistributions in binary form must reproduce the above copyright notice, + this list of conditions and the following disclaimer in the documentation + and/or other materials provided with the distribution. + +3. Neither the name of the copyright holder nor the names of its + contributors may be used to endorse or promote products derived from + this software without specific prior written permission. + +THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" +AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE +IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE +DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE +FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL +DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR +SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER +CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, +OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE +OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. 
diff --git a/Essay-Helper/README.md b/Essay-Helper/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..5a544d71c04377e46beb9fff0fcd09e0b1d34a07
--- /dev/null
+++ b/Essay-Helper/README.md
@@ -0,0 +1,235 @@
+Essay Helper | Text-to-Breasonings (Helps Earn High Distinctions) | Grammar Logic (Helps Mind Map Details)
+
+# Essay Helper
+
+* Important: Manually check the page references after using Essay Helper.
+* Remove page references when not a direct quote in APA.
+* Helps write essays using KNN. Asks for 5 paragraphs of N reasons per paragraph of exposition and critique. Has a positivity detector (the whole essay must be positive, apart from critique comments, which may be either only positive or only negative). Uses the K-Nearest-Neighbour algorithm to check that quotes, comments and connections are relevant.
+* You should use a word frequency tool to find viable keywords. You may need to redraft after reading to find better keywords.
+* Think of the "number one way of thinking" before running any mind reading algorithm for more understandable results.
+* Run the sheet feeder with `swipl --stack_limit=2G`, processing one file at a time and quitting swipl between attempts for better results.
+
+* short_essay_helper.pl - the Essay Helper algorithm
+* distances.pl - the KNN algorithm
+* file.txt - the text to write the essay about
+* walk_through.txt - example walkthrough of the algorithm (no output)
+* short_essay_helper2.pl - the Essay Helper without paraphrasing checking; outputs an HTML essay with an ordered bibliography.
+* short_essay_helper3_agps.pl - generates random essays from a number of sources without paraphrasing checking, and outputs an HTML essay with an ordered bibliography.
+* short_essay_helper3.1_agps.pl - like Essay Helper 3, but finds sentences containing any keywords given.
+* short_essay_helper3.1_agps-mr.pl - same as short_essay_helper3.1_agps.pl but uses mind reader.
+* short_essay_helper3.1_chicago.pl - same as short_essay_helper3.1_agps.pl, but in Chicago style (with Ibid).
+* source_tagger.pl - helps tag and report tags, similarly to NVivo.
+* short_essay_helper3.1_agps-mr-tree.pl - same as short_essay_helper3.1_agps.pl but uses mind reader with a character tree. It is slow.
+
+# Getting Started
+
+Please read the following instructions on how to install the project on your computer to help write essays.
+
+# Prerequisites
+
+* Please download and install SWI-Prolog for your machine at `https://www.swi-prolog.org/build/`.
+
+# 1. Install manually
+
+Download this repository, the List Prolog Interpreter repository, and the Mind Reader repository.
+
+# 2. Or Install from List Prolog Package Manager (LPPM)
+
+* Download the LPPM repository:
+
+```
+mkdir GitHub
+cd GitHub/
+git clone https://github.com/luciangreen/List-Prolog-Package-Manager.git
+cd List-Prolog-Package-Manager
+swipl
+['lppm'].
+lppm_install("luciangreen","Essay-Helper").
+halt.
+```
+
+# Running Essay Helper
+
+* In Shell:
+`cd Essay-Helper`
+`swipl`
+
+* Load with `['short_essay_helper.pl'].`
+* In swipl, run with:
+```
+short_essay_helper("file.txt",5).
+```
+to write an essay about "file.txt" with 5 reasons per paragraph.
+
+* To change the number of paragraphs in the exposition of the essay, edit `5` in
+```
+length(List1,5), %% 5->1 paragraphs per exposition
+```
+in the correct Essay Helper algorithm file.
+
+* To change the number of paragraphs in the critique of the essay, edit `5` in
+```
+length(List1,5), %% 5->1 paragraphs per critique
+```
+in the correct Essay Helper algorithm file.
+
+* See the walk-through as an example of Essay Helper.
+
+# Installing and Running Essay Helper 2
+
+* Essay Helper 2 outputs the essay both as plain text and as HTML, with ordered references and the argument traversed depth first.
+* Install by downloading the prerequisites above and saving in folders next to each other or by using LPPM above.
+* Run `swipl`
+* Load with `['short_essay_helper2.pl'].`
+
+* In swipl, run with:
+```
+short_essay_helper("Author's Heading",5).
+```
+with heading of the essay, "Author's Heading", and 5 reasons per paragraph.
+
+
+* See walk-through2, walk_through2.html and walk_through2-output.txt as examples of Essay Helper 2.
+
+# Installing and Running Essay Helper 3 - Random Essay Generator - AGPS Referencing
+
+* Essay Helper 3 randomly generates the essay and outputs it both as plain text and as HTML, with ordered references and the argument traversed depth first.
+* Note: Generated essays are not to be handed in, and you need to paraphrase and cite work you have referenced. Your grade depends on whether you agree or disagree and how many breasonings you breason out. Check the referencing style is appropriate for your class (this algorithm uses AGPS style).
+* Install by downloading the prerequisites above and saving in folders next to each other or by using LPPM above.
+* Run `swipl`
+
+* Load with `['short_essay_helper3_agps.pl'].`
+* In the `raw_sources` folder, place text files for the essay sources with the names "*.txt", etc., and two newlines between pages.
+* Run `sheet_feeder(_).` to convert the files in `raw_sources` and save them in `sources`.
+* Check the new source files in the `sources` folder are in the format:
+```
+["Surname, A 2000, Title: Subtitle, Publisher, City.","Surname, A 2000",1,"","",...]
+```
+* Insert the first item (the reference for the source in the required referencing style) in double quotes; insert the in-text reference part as the second item; insert the first page number of the paper from the source as the third item; and check that the pages are separated by double quotes as above. You should remove headers, etc. from each page so that they are not used in the essay. For this version, move ends of sentences that run over pages onto the first page.
+* In swipl, run with:
+```
+short_essay_helper("Author's Heading",5).
+```
+with heading of the essay, "Author's Heading", and 5 reasons per paragraph.
+
+
+* See walk_through3.html as an example of Essay Helper 3.
+
+
+
+
+# Installing and Running Essay Helper 3.1 - Essay Generator with Relevance - AGPS Referencing - And Mind Reading
+
+* Essay Helper 3.1 outputs the essay both as plain text and as HTML, with ordered references and the argument traversed depth first, and takes key words, any of which may be searched for in each sentence. It chooses sources, pages and quotes in order (or, when vaguely mind read, in random order). Quotes are not repeated.
+* Note: Generated essays are not to be handed in, and you need to paraphrase and cite work you have referenced. Your grade depends on whether you agree or disagree and how many breasonings you breason out. Check the referencing style is appropriate for your class (this algorithm uses AGPS style).
+* Install by downloading the prerequisites above and saving in folders next to each other or by using LPPM above.
+* Run `swipl`
+
+* Load with `['short_essay_helper3.1_agps.pl'].`, or, for mind reading mode, `['short_essay_helper3.1_agps-mr.pl'].` (or `['short_essay_helper3.1_agps-mr-tree.pl'].`, which mind reads character by character). Mind reading detects vague, not exact, thoughts. Before running texttobr (in mind reader), think of two radio buttons put on recordings, put through with prayer, nut and bolt, quantum box prayer 1, 1, 0.5 cm and 1, 1, 0.5 cm. Follow the instructions in Instructions for initialising Mind Reader and Instructions for Using texttobr(2) when using texttobr, texttobr2 or mind reader to avoid medical problems.
+* In the `raw_sources` folder, place text files for the essay sources with the names "*.txt", etc., and two newlines between pages.
+* Run `sheet_feeder(_).` to convert the files in `raw_sources` and save them in `sources`.
+* Check the new source files in the `sources` folder are in the format:
+```
+["Surname, A 2000, Title: Subtitle, Publisher, City.","Surname, A 2000",1,"","",...]
+```
+* Insert the reference for the source (the first item) in the required referencing style, in double quotes; insert the in-text part of the reference as the second item; insert the first page number of the paper as the third item; and check that the pages are separated into double-quoted strings as above. Remove headers, etc. from each page so that they are not used in the essay. For this version, move sentences that run over a page break onto the first of the two pages.
+* In swipl, run with:
+```
+short_essay_helper("Author's Heading",["critical","evaluation"],5).
+```
+where "Author's Heading" is the heading of the essay, "critical" and "evaluation" are the keywords and 5 is the number of reasons per paragraph.
+
+* See walk_through3.1.html as an example of Essay Helper 3.1.
+
+
+# Installing and Running Essay Helper 3.1 - Essay Generator with Relevance - Chicago Style
+
+* Essay Helper 3.1 outputs the essay, as plain text and as HTML, with ordered endnotes and references, the argument traversed depth-first, and keywords, any of which may be searched for in each sentence. It chooses sources, pages and quotes in order. Quotes are not repeated.
+* Note: Generated essays are not to be handed in; you need to paraphrase and cite work you have referenced. Your grade depends on whether you agree or disagree and how many breasonings you breason out. Check that the referencing style is appropriate for your class (this algorithm uses Chicago style).
+* Install by downloading the prerequisites above and saving them in folders next to each other, or by using LPPM above.
+* Run `swipl`
+
+* Load with `['short_essay_helper3.1_chicago.pl'].`
+* In the `raw_sources` folder, place text files for the essay sources (named "1.txt", etc.), with two newlines between pages.
+* Run `sheet_feeder(_).` to convert the files in `raw_sources` and save them in `sources`.
+* Check that the new source files in the `sources` folder are in the format:
+```
+["Surname, A 2000, Title: Subtitle, Publisher, City.","Surname, A 2000",1,"","",...] 
+```
+* Insert the reference for the source (the first item) in the required referencing style, in double quotes; insert the in-text part of the reference as the second item; insert the first page number of the paper as the third item; and check that the pages are separated into double-quoted strings as above. Remove headers, etc. from each page so that they are not used in the essay. For this version, move sentences that run over a page break onto the first of the two pages.
+* In swipl, run with:
+```
+short_essay_helper("Author's Heading",["critical","evaluation"],5).
+```
+where "Author's Heading" is the heading of the essay, "critical" and "evaluation" are the keywords and 5 is the number of reasons per paragraph.
+
+* See walk_through3.1-chicago.html as an example of Essay Helper 3.1-Chicago.
+
+
+# Installing and Running Source Tagger
+
+* Source Tagger tags quotes and sources with multiple tags, and reports the quotes and sources for a particular tag.
+* Install by downloading the prerequisites above and saving them in folders next to each other, or by using LPPM above.
+* For a new set of tags, create a new tags.txt file in the directory, containing `[]`.
+* Run `swipl`
+
+* Load with `['short_essay_helper3.1_chicago.pl'].` and `['source_tagger.pl'].`
+* Run with `source_tagger.`
+```
+?- source_tagger.
+New tag? (y/n)
+|: y
+What are the tags?
+|: a
+What is the text?
+|: abc
+What is the source?
+|: ref abc
+New tag? (y/n)
+|: n
+true.
+```
+* Enter the tags, text and source for each set of tags.
+
+* Print the report for a tag with `print_tags.`
+```
+?- print_tags.
+a
+b
+e
+f
+Enter tag to show report of.
+|: a
+c
+d
+
+z
+p
+
+abc
+ref abc
+
+true.
+```
+
+* Enter the tag name to print the text and source for each tag with that name.
+
+# Details for Marks and Meeting Requirements
+
+* Text to Breasonings is required to breason out essays for high distinctions. 
+* Grammar and Logic to Algorithm, in the Algorithm Writer with Lists repository, is needed to generate 30 As=2400 words in Honours from 80 hand-written philosophy breasonings (100 As in Master=8000 words, 400 As in PhD=32000 words).
+* (NB. Phil2Alg may not produce the original algorithms needed.) Phil2Alg, in the Philosophy repository, is needed to generate details (20 words per sentence in an Honours thesis, 30 in Master and 50 in PhD) before using Grammar and Logic to Algorithm.
+
+
+
+# Authors
+
+Lucian Green - Initial programmer - Lucian Academy
+
+# License
+
+This project is licensed under the BSD 3-Clause License - see the LICENSE file for details
+
+
+
diff --git a/Essay-Helper/distances.pl b/Essay-Helper/distances.pl
new file mode 100644
index 0000000000000000000000000000000000000000..ab2f975b90e83f0d21b2a3fbf17a9f0f922dff16
--- /dev/null
+++ b/Essay-Helper/distances.pl
@@ -0,0 +1,135 @@
+/**
+
+ yes, no for true, false cawps data
+ - keeps increasing k until yes
+ - don't need yes or no, need predicate
+ can select predicates in cawps using distances.pl
+ - finds predicates from similar i/o *
+ - breaks down data and then *
+ - code db needs i,o
+ - convert code to pl
+ - find child predicates
+ - print with i,o
+ - eliminate dups
+
+ - split predicates into smaller predicates ()
+ - may find approximate predicates (with arguments out of order) that can be corrected with cawps argument re-ordering
+ - cut off with first item in list in case multiple items causes a malfunction in algorithm
+ - find i not o, then chains of i,o until alg completed
+ - " for vertical chains (find o, then i,o inside recursive structs)
+
+ - use types not atoms - ? - use atoms
+
+ * spec finder combines parts of specs, generating content within different types
+ - finds algorithm, spec part by spec part
+ - e.g. 
alg to modify alg takes first alg, changes, then modified alg + - finds alg before finding whole spec, finds data before all data (bottom up) + - uses only what data's changed or needed + - combines two sentence specs into one + + * alg finder is part of spec finder, can do separately + + - do i,o passes of distances separately x together, don't need y, n + + 1. modify cawps + 2. prepare data + +**/ + +%% distances.pl - from Luc De Raedt + +:-use_module(library(clpfd)). + +o_o_dis(X,X,D):-D #= 0,!. +o_o_dis(X,Y,D):- + atomic(X), + atomic(Y), + o_o_constantDis(X,Y,D),!. +o_o_dis(X,Y,D):- + X =..[XF|XRest], + length(XRest,XN), + Y =.. [YF|YRest], + length(YRest,YN), + dif(XF,YF), + o_o_functorDis(XF-XN,YF-YN,D),!. +o_o_dis(X,Y,D):- + X =..[XF|XRest], + length(XRest,XN), + Y =.. [YF|YRest], + length(YRest,YN), + dif(XN,YN), + o_o_functorDis(XF-XN,YF-YN,D),!. + +o_o_dis(X,Y,D):- + X=..[F|XRest], + Y=..[F|YRest], + length(XRest,N), + length(YRest,N), + maplist(o_o_dis,XRest,YRest,Distances), + sumlist(Distances,D0), + o_o_functorDis(F-N,F-N,D1), + D #= D1 +D0,!. + +o_o_dis(_X,_Y,D):- + D #= 0. + +o_o_constantDis(X,Y,D):- + dif(X,Y), + D #= 1. +o_o_constantDis(X,X,D):- + D #= 0. + +o_o_functorDis(X,Y,D):- + dif(X,Y), + D #= 1. +o_o_functorDis(X,X,D):- + D #= 0. + +topk_vote(T,Vote):- + pairs_values(T,V), + msort(V,VSorted), + length(T,L), + L2 #= L div 2, + length(First,L2), + append(First,[Vote|_Rest],VSorted). + +o_oclass_disClass(O,O2,D-O2):- + o_o_dis(O,O2,D). + +key_value_keyvalue(Key,Value,Key-Value). + +data_instance_k_classification(Data,I,K,C):- + maplist(o_oclass_disClass(I),Data,DisAndClass), + keysort(DisAndClass,DSorted), + + writeln(DSorted), + + length(TopK,K), + append(TopK,_,DSorted), + topk_vote(TopK,C). 
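+
+%% An illustrative example of topk_vote/2 above: it drops the distance keys
+%% and returns the middle class of the K nearest neighbours (here K = 3):
+%% ?- topk_vote([1-[2,1],2-[2,3],3-[2,5]],V).
+%% V = [2,3].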
+ + + +data(X):- + X=[ + +[2,1,["i","prepared","to","neaten","my","desk"]] +,[2,2,["i","did","this","by","providing","management","service","as","breasonings","currency"]] +,[2,3,["first","i","deciphered","it"]],[2,4,["second","i","held","it"]],[2,5,["third","i","managed","it"]],[2,6,["in","this","way","i","prepared","to","neaten","my","desk","by","providing","management","service","as","breasonings","currency"]] + + ]. + + +%% need to paraphrase by replacing one word, optionally adding words to the end, or deleting words. +%% not k=1 or 2, just not k=0 and first one +%% it doesn't matter if the paraphrase is strong, as long as the right sentence is returned +%% check all sentences in step 1, 2 in step 2 +test(Test):- + Test = +["i","neatened","my","desk"] + + . + +/** + ?-data(X),test(T),data_instance_k_classification(X,T,3,C). + */ diff --git a/Essay-Helper/file.txt b/Essay-Helper/file.txt new file mode 100644 index 0000000000000000000000000000000000000000..a50c3e7b6b6cad44a2cf986054da6a17be458a2b --- /dev/null +++ b/Essay-Helper/file.txt @@ -0,0 +1,5 @@ +Heading + +1a. here1 + +1. a b c d. e f g h. 
\ No newline at end of file diff --git a/Essay-Helper/raw_sources/.DS_Store b/Essay-Helper/raw_sources/.DS_Store new file mode 100644 index 0000000000000000000000000000000000000000..5008ddfcf53c02e82d7eee2e57c38e5672ef89f6 Binary files /dev/null and b/Essay-Helper/raw_sources/.DS_Store differ diff --git a/Essay-Helper/raw_sources/1.txt b/Essay-Helper/raw_sources/1.txt new file mode 100644 index 0000000000000000000000000000000000000000..70eb4c7998887f132689a55c52b0deb278ef4870 --- /dev/null +++ b/Essay-Helper/raw_sources/1.txt @@ -0,0 +1,10 @@ +a +a + +B +b +b + +C +c +c \ No newline at end of file diff --git a/Essay-Helper/raw_sources/2.txt b/Essay-Helper/raw_sources/2.txt new file mode 100644 index 0000000000000000000000000000000000000000..70eb4c7998887f132689a55c52b0deb278ef4870 --- /dev/null +++ b/Essay-Helper/raw_sources/2.txt @@ -0,0 +1,10 @@ +a +a + +B +b +b + +C +c +c \ No newline at end of file diff --git a/Essay-Helper/raw_sources/3.txt b/Essay-Helper/raw_sources/3.txt new file mode 100644 index 0000000000000000000000000000000000000000..88e68d5cea647ffd8cb5745e82b5ff8be19eca7a --- /dev/null +++ b/Essay-Helper/raw_sources/3.txt @@ -0,0 +1,10 @@ +“”‘’'" +\\ +- + + +b + + +b +b \ No newline at end of file diff --git a/Essay-Helper/sheet_feeder.pl b/Essay-Helper/sheet_feeder.pl new file mode 100644 index 0000000000000000000000000000000000000000..939284d1bf4f97cbdcc199d552fea47a32dc45e8 --- /dev/null +++ b/Essay-Helper/sheet_feeder.pl @@ -0,0 +1,47 @@ +sheet_feeder(T) :- + + directory_files("raw_sources/",F), + delete_invisibles_etc(F,G), + findall(K1,( + member(H,G), + string_concat("raw_sources/",H,List00b), + phrase_from_file_s(string(List001), List00b), + + reverse(List001,S1),(append([10],A,S1)->B=A;B=S1),reverse(B,J1), + + %J1=["“","”","‘","’","'","\"","\n","\\","\n","- ","\n","\n","\n","b"], + + %J1=["“","'","\n"], + + append(J1,`\n\n`,List00_a), + process(List00_a,[],List00), + + %writeln1(List00),trace, + delete(List00,"",K2), + + K1=["Surname, A 2000, 
Title: Subtitle, Publisher, City.","Surname, A 2000",1|K2], + term_to_atom(K1,K), + %random(X), + %foldr(string_concat,[H,X,".txt"],H1), + string_concat("sources/",H,List00bb), + (open_s(List00bb,write,Stream1), + write(Stream1,K), + close(Stream1)) + + ),T). + +process(A,_,C) :- + replace(Replacements), + atom_string(A1,A), + replace1(Replacements,A1,D1), + atomic_list_concat(D,'\n\n',D1), + findall(C1,(member(D2,D),atom_string(D2,C1)),C),!. + +replace1([],A,A) :- !. +replace1(Replacements,A,D) :- + Replacements=[[B,C]|G], + atomic_list_concat(E,B,A), + atomic_list_concat(E,C,F), + replace1(G,F,D),!. + + replace([['\\',''],['- ',''],['"','\''],['“','\''],['”','\''],['‘','\''],['’','\''],['⁃','-']]). diff --git a/Essay-Helper/short_essay_helper.pl b/Essay-Helper/short_essay_helper.pl new file mode 100644 index 0000000000000000000000000000000000000000..1f1baa9831a1713711fbe35b895e7e6e1fa13eb9 --- /dev/null +++ b/Essay-Helper/short_essay_helper.pl @@ -0,0 +1,623 @@ +%% Prolog Short Essay Helper + +%% Keep 1,2,3 in aphors +%% Certain number of reasons per paragraph +%% Explains essay structure at start +%% Exposition (negative terms not allowed, asks for 5 groups from text to take optionally enough notes about, which can revise) and critique +%% Paraphrasing +%% Converts x, explains x, asks about +%% Allows citation of aphor +%% Prints citations used so far +%% Prints chapter, essay and allows changes (using text editor) at end. +%% Outputs db file and br dicts for marking later + +%% *** Rewrite essay files with e.g. 1a. before aphors not 1. so not confused with aphors + +%% short_essay_helper("file.txt",5,E),writeln1(E). + +:- include('distances.pl'). +:- use_module(library(date)). +:- include('../listprologinterpreter/la_strings'). + +:- dynamic critique3/1. +:- dynamic agree_disagree/1. + +string(String) --> list(String). + +list([]) --> []. +list([L|Ls]) --> [L], list(Ls). 
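+
+%% string//1 above matches an arbitrary list of codes; with phrase/2 (which
+%% requires an empty rest) it captures the whole input, e.g. (illustrative):
+%% ?- phrase(string(Cs),`ab`). % binds Cs to the codes of "ab"
+%% phrase_from_file_s(string(Cs),File) below reads a whole file this way.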
+ +short_essay_helper(Filex,Reasons_per_paragraph) :- + retractall(critique3(_)), + assertz(critique3([])), + + phrase_from_file_s(string(String00), Filex), + + split_string(String00, "\n\r", "\n\r", [String01|_]), + + prepare_file_for_ml(String00,String02), + +writeln1(String02), + + generate_file_name(String01,File1), + + Numbers=[1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20,21,22,23,24,25], + explain_structure(String01,Reasons_per_paragraph,File1), + exposition(String00,String01,Reasons_per_paragraph,Numbers,String02,Exposition), + + concat_list(["Do you agree or disagree with \"",String01,"\" (a/d) ? "],String2ad),get_string(String2ad,either,one-not-ml,"","",String3ad), + + (String3ad="a"-> + (retractall(agree_disagree(_)), + assertz(agree_disagree(agree))); + (retractall(agree_disagree(_)), + assertz(agree_disagree(disagree)))), + + +critique(String00,String01,Reasons_per_paragraph,Numbers,String02,Critique), + +agree_disagree(Pole), + + concat_list(["What is the future area of research from your essay about \"",String01,"\"? "],Future_research_prompt), + %%trace, + get_string(Future_research_prompt,Pole,one-not-ml,"","",Future_research), + +term_to_atom([Exposition,Critique,String3ad,Future_research],File_contents),open_s(File1,write,Stream),write(Stream,File_contents),close(Stream). + + + + /**critique(String00,String01,Reasons_per_paragraph,Numbers,Critique). + **/ + +generate_file_name(String01,File1) :- + get_time(TS),stamp_date_time(TS,date(Year,Month,Day,Hour1,Minute1,Seconda,_A,_TZ,_False),local), + concat_list([String01," file",Year,Month,Day,Hour1,Minute1,Seconda,".txt"],File1). 
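+
+%% generate_file_name/2 above appends the current timestamp, e.g. for
+%% String01 = "Heading" it produces a name like "Heading file2020111093015.0.txt"
+%% (hypothetical timestamp; the actual value depends on the current time).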
+
+explain_structure(String01,Reasons_per_paragraph,File1) :-
+ concat_list(["The Short Essay Helper will help you structure and write your essay about \"",String01,"\" with ",Reasons_per_paragraph," reasons per paragraph.","\n",
+ "The Helper will help write an exposition (which summarises but doesn't critique the idea), a critique (which agrees with or disagrees with the topic), the introduction and the conclusion (which state whether you agreed or disagreed with the topic, etc.).","\n","The Helper will output the file, \"",File1,"\" used for marking. After using the Helper, run Text To Breasonings. Return \"",File1,"\" and the two breasoning dictionaries for marking."],String1),
+ writeln(String1).
+
+exposition(String00,String01,Reasons_per_paragraph,Numbers,ML_db,Exposition1) :-
+ length(List1,5),
+ append(List1,_,Numbers),
+ length(List2,Reasons_per_paragraph),
+ append(List2,_,Numbers),
+
+ string_codes(String001,String00),
+ writeln(String001),
+ findall([Number1,Exposition2],(
+ %%trace,
+member(Number1,List1),concat_list(["What is group ",Number1," of 5 in the exposition that groups ideas about \"",String01,"\"? "],String1),get_string(String1,positive,one-not-ml,"","",%ML_db,
+Exposition2)),Exposition3),
+
+ findall([Number2,Number3,String3,String3a,String5a,String5],(
+ member(Number2,List1),member(Number3,List2),get_item_n(Exposition3,Number2,[_,Item1]),
+ %%trace,
+ concat_list([" ","\n","The Helper will ask you how the quote you are about to enter relates to the paragraph topic."],String2b),writeln(String2b),
+ %%trace,
+exposition2(Item1,ML_db,String3,String3a,String5a,String5)
+ ),Exposition4),
+ Exposition1=[Exposition3,Exposition4].
+
+exposition2(Item1,ML_db,String3,String3a,String5a,String5):-
+ (concat_list(["What is the paragraph number of the quote about the paragraph topic \"",Item1,"\"? "],String2),get_number(String2,String3),
+ concat_list(["What is the sentence number of the quote about the paragraph topic \"",Item1,"\"? 
"],String2a),get_number(String2a,String3a), + member1a([String3,String3a,String3aa],ML_db), + concat_list(["What is the paraphrased quote about the paragraph topic \"",Item1,"\"? "],String4a),get_string(String4a,positive,one,"",String3aa,String5a), + concat_list(["How does the quote you entered (\"",String5a,"\") relate to the paragraph topic \"",Item1,"\"? "],String4), + %%trace, + SepandPad="#@~%`$?-+*^,()|.:;=_/[]<>{}\n\r\s\t\\\"!'0123456789",downcase_atom(Item1,String41a),split_string(String41a, SepandPad, SepandPad, Item1a), + + get_string(String4,positive,two,Item1a,String3aa,String5)) + ->true;exposition2(Item1,ML_db,String3,String3a,String5a,String5). + +%% Agree or disagree + +critique(String00,String01,Reasons_per_paragraph,Numbers,ML_db,Critique1) :- + length(List1,5), + append(List1,_,Numbers), + length(List2,Reasons_per_paragraph), + append(List2,_,Numbers), + + string_codes(String001,String00), + writeln(String001), + + retractall(critique3(_)), + assertz(critique3([])), + + + findall([Number2a,Critique2],( +%% Reason 1 + +member(Number2a,List1), +%%List1=[Number2a|List1a], +List2=[Number3a|List2a], +%%trace, + + +critique_reason1(String01,Number2a,Number3a,Reasons_per_paragraph,ML_db,Critique3,Topic_paragraph_link), + +critique_reasons_2_to_n(Number2a,List2a,Critique3,Reasons_per_paragraph,ML_db,Critique4), + + append_list2([[Topic_paragraph_link],Critique3,Critique4],Critique2)),Critique1). + + critique_reason1(String01,Number2a,Number3a,Reasons_per_paragraph,ML_db,Critique3,Topic_paragraph_link) :- + + %%member(Number2,List1),member(Number3,List2), + %%get_item_n(Critique3,Number2a,[_,Item1]), + %%trace, + concat_list([" ","\n","The Helper will ask you for your comment on the quote you are about to enter."],String2b),writeln(String2b), + %% Later: connections + + %%trace, +critique2(String01,ML_db,String3,String3a,String5a,String3y,String3ay,String5a1,Topic_paragraph_link), + %%),Critique4). 
+ Critique3=[[Number2a,Number3a,String3,String3a,String5a,String3y,String3ay,String5a1]]. + +critique2(String01,ML_db,String3,String3a,String5a,String3y,String3ay,String5a1,Topic_paragraph_link):- + (concat_list(["What is the paragraph number of the quote to comment on? "],String2),get_number(String2,String3), + concat_list(["What is the sentence number of the quote to comment on? "],String2a),get_number(String2a,String3a), + member1a([String3,String3a,String3aa],ML_db), + concat_list(["What is the paraphrased quote to comment on? "],String4a),get_string(String4a,positive,one,"",String3aa,String5a), + + concat_list(["Is your comment from a quote (y/n)? "],String2yn),get_string(String2yn,either,one-not-ml,"","",String3yn), + + agree_disagree(Pole), + + (String3yn="y"-> + (concat_list(["What is the paragraph number of the comment? "],String2y),get_number(String2y,String3y), + concat_list(["What is the sentence number of the comment? "],String2ay),get_number(String2ay,String3ay), + %%trace, + member1a([String3y,String3ay,String3aay],ML_db), + + %% use either original x or paraphrase + + concat_list(["What is the comment that is from the quote? "],String4ac), + %%trace, + get_string(String4ac,Pole,one,"",String3aay,String5a1) + ) + + ;(String3y=0,String3ay=0, + concat_list(["What is the comment? "],String4ac),get_string(String4ac,Pole,one-not-ml,"","",%%String5a, + String5a1))), + + concat_list(["How does the comment \"",String5a1,"\" relate to the essay topic \"",String01,"\"? "],Topic_paragraph_link_prompt), + %%trace, + + downcase_and_split(String5a1,String5a1ds), + downcase_and_split(String01,String01ds), + get_string(Topic_paragraph_link_prompt,positive,two,String5a1ds,String01ds,Topic_paragraph_link) + + + /** + %% conn - choose connected comments + concat_list(["How does the quote you entered (\"",String5a,"\") relate to the paragraph topic \"",Item1,"\"? 
"],String4), + %%trace, + SepandPad="#@~%`$?-+*^,()|.:;=_/[]<>{}\n\r\s\t\\\"!'0123456789",downcase_atom(Item1,String41a),split_string(String41a, SepandPad, SepandPad, Item1a), + + get_string(String4,either,two,Item1a,String3aa,String5) + **/ + ) + ->true;critique2(String01,ML_db,String3,String3a,String5a,String3y,String3ay,String5a1,Topic_paragraph_link). + + + +critique_reasons_2_to_n(Number2a,List2a,Critique3,Reasons_per_paragraph,ML_db,Critique4) :- + retractall(critique3(_)), + assertz(critique3(Critique3)), +%%trace, +/** critique3(Critique31), + append(Critique31,Critique3,Critique32), + retractall(critique3(_)), + assertz(critique3(Critique32)), +**/ +%%notrace, +findall([Number2a,Number3a,String3,String3a,String5a,String3y,String3ay,String5a1, +CNumber2aa,CNumber3aa,CString5a1a, + CNumber2a1,CNumber3a1,LastCStrings, + String5aaa],( + %%member(Number2,List1a), + member(Number3a,List2a), + %%get_item_n(Critique3,Number2a,[_,Item1]), + %%trace, + concat_list([" ","\n","The Helper will ask you for your comment on the quote you are about to enter."],String2b),writeln(String2b), + %% Later: connections + + + %%trace, +critique3(ML_db,%%Critique3, +String3,String3a,String5a,String3y,String3ay,String5a1, +CNumber2aa,CNumber3aa,CString5a1a, + CNumber2a1,CNumber3a1,LastCStrings, + String5aaa) + + +/** + trace, + critique3(Critique31), + append(Critique31,[[String3,String3a,String5a,String3y,String3ay,String5a1]],Critique32), + retractall(critique3(_)), + assertz(critique3(Critique32)) +**/ + +), + Critique4). + + %%Critique3=[]. + +critique3(ML_db,%%Critique3, +String3,String3a,String5a,String3y,String3ay,String5a1, +CNumber2aa,CNumber3aa,CString5a1a, + CNumber2a1,CNumber3a1,LastCStrings, + String5aaa):- + (concat_list(["What is the paragraph number of the quote to comment on? "],String2),get_number(String2,String3), + concat_list(["What is the sentence number of the quote to comment on? 
"],String2a),get_number(String2a,String3a), + member1a([String3,String3a,String3aa],ML_db), + concat_list(["What is the paraphrased quote to comment on? "],String4a),get_string(String4a,positive,one,"",String3aa,String5a), + + concat_list(["Is your comment from a quote (y/n)? "],String2yn),get_string(String2yn,either,one-not-ml,"","",String3yn), + + agree_disagree(Pole), + + (String3yn="y"-> + (concat_list(["What is the paragraph number of the comment? "],String2y),get_number(String2y,String3y), + concat_list(["What is the sentence number of the comment? "],String2ay),get_number(String2ay,String3ay), + %%trace, + member1a([String3y,String3ay,String3aay],ML_db), + + %% use either original x or paraphrase x + %%trace, + concat_list(["What is the comment that is from the quote? "],String4ac), + %%trace, + get_string(String4ac,Pole,one,"",String3aay,String5a1) + %%,trace + ) + + ;(String3y=0,String3ay=0, + concat_list(["What is the comment? "],String4ac),get_string(String4ac,Pole,one-not-ml,"","",%%String5a, + String5a1))), + %%*** assertz recurse not findall new critique3 x + + + %%trace, + critique3(Critique31), + append(Critique31,[[0,0,String3,String3a,String5a,String3y,String3ay,String5a1]],Critique32), + retractall(critique3(_)), + assertz(critique3(Critique32)), + + critique3(Critique33), + + +(( + + + %%critique3(Critique3), + length(Critique33,LCritique3), + %%length(List1,LCritique3), + numbers(LCritique3,1,[],List1), + %%append(List1,_,Numbers), + %%Numbers=[1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20,21,22,23,24,25], + %%trace, + +findall([N," - ",CString5a1,"\n"],(member(N,List1),get_item_n(Critique33,N,[CNumber2a,CNumber3a,_CString3,_CString3a,_CString5a,_CString3y,_CString3ay,CString5a1])),CStrings1), + +findall([N,CNumber2a,CNumber3a,CString5a1],(member(N,List1),get_item_n(Critique33,N,[CNumber2a,CNumber3a,_CString31,_CString3a1,_CString5a1,_CString3y1,_CString3ay1,CString5a1])),CStrings2), + +%%trace, + +%%CStrings=[CStrings1,CStrings2], 
+reverse(CStrings2,CStringsR),CStringsR=[[_,CNumber2a1,CNumber3a1,LastCStrings]|CStringsR1], +reverse(CStringsR1,CStringsRR), + +reverse(CStrings1,CStringsR10),CStringsR10=[_|CStringsR11], +reverse(CStringsR11,CStringsRR1), + +append_list2(CStringsRR1,CStrings11), +concat_list(CStrings11,CStrings12), +concat_list(["Please select a comment to connect the comment \"",LastCStrings,"\" to:","\n",CStrings12],ConnectionNumberPrompt), +get_number(ConnectionNumberPrompt,ConnectionNumber), + + member([ConnectionNumber,CNumber2aa,CNumber3aa,CString5a1a],CStringsRR), + + SepandPad="#@~%`$?-+*^,()|.:;=_/[]<>{}\n\r\s\t\\\"!'0123456789",downcase_atom(CString5a1a,CString5a1a1),split_string(CString5a1a1, SepandPad, SepandPad, CString5a1a2), + + %%CNumber2a1,CNumber3a1, + downcase_atom(LastCStrings,LastCStrings_a),split_string(LastCStrings_a, SepandPad, SepandPad, LastCStrings_a1), + + + %% conn - choose connected comments, this to a previous comment + %%trace, + + concat_list(["How does \"",LastCStrings,"\" connect to \"",CString5a1a,"\"? 
"],ConnectionPrompt), + + get_string(ConnectionPrompt,Pole,two,CString5a1a2,LastCStrings_a1,String5aaa))->true;( + + %% If the section since updating dynamic critique comments fails, prevent doubling of critique comments + + critique3(Critique311), + reverse(Critique311,Critique312), + Critique312=[_|Critique313], + reverse(Critique313,Critique314), + retractall(critique3(_)), + assertz(critique3(Critique314)),fail + + )) + +/** Critique4=[String3,String3a,String5a,String3y,String3ay,String5a1, + CNumber2aa,CNumber3aa,CString5a1a, + CNumber2a1,CNumber3a1,LastCStrings, + String5aaa], + **/ + /** + critique3(Critique31), + append(Critique31,[[String3,String3a,String5a,String3y,String3ay,String5a1]],Critique32), + retractall(critique3(_)), + assertz(critique3(Critique32)) +**/ + + ) + %% Retries the predicate if it fails + ->true;critique3(ML_db,%%Critique3, + String3,String3a,String5a,String3y,String3ay,String5a1, + CNumber2aa,CNumber3aa,CString5a1a, + CNumber2a1,CNumber3a1,LastCStrings, + String5aaa + ). + + + + +member1a([String3,String3a,String3aa],ML_db) :- + member([String3,String3a,String3aa],ML_db),!. + + +get_item_n(Exposition,Number1,Item) :- + Number2 is Number1-1, + length(List,Number2), + append(List,[Item|_],Exposition). + +get_number(Prompt1,Number) :- + %%concat_list(Prompt1,Prompt2), + (%%repeat, + writeln(Prompt1),read_string(user_input, "\n", "\r", _End, String),split_string(String, ",", " ", Value1),Value1=[Value2],number_string(Number,Value2)). + + +/** +get_string(Prompt1,String1) :- + concat_list(Prompt1,Prompt2), + (repeat,write(Prompt2),read_string(String1),not(String1="")). +**/ + +equals_empty_list([]). + +downcase_and_split(String1, String2) :- +SepandPad="&#@~%`$?-+*^,()|.:;=_/[]<>{}\n\r\s\t\\\"!'0123456789", + downcase_atom(String1,String3),split_string(String3, SepandPad, SepandPad, String2). 
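+
+%% An illustrative call of downcase_and_split/2 above (note that adjacent
+%% separator characters, such as ", ", produce empty strings in the result):
+%% ?- downcase_and_split("The Cat, sat!",W).
+%% W = ["the","cat","","sat",""].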
+ +get_string(Prompt2,Flag1,Flag2,ML_db0,ML_db1,String2) :- +%%writeln1(get_string(Prompt2,Flag1,Flag2,ML_db1,String2)), + %%concat_list(Prompt1,Prompt2), + %%trace, + SepandPad="&#@~%`$?-+*^,()|.:;=_/[]<>{}\n\r\s\t\\\"!'0123456789", + %%(repeat, + writeln(Prompt2),read_string(user_input, "\n", "\r", _End, String2),not(String2=""),downcase_atom(String2,String3),split_string(String3, SepandPad, SepandPad, String4), + Neg_term_list=["no","not","don","t","shouldn","wouldn","disagree","differ","dislikes","disagrees","differs","dislikes","disagreed","differed","disliked","negative","negation","non","negate","negates","negated","but","however","isn","lack"], + (Flag1=%%non_negative + positive->( + + (findall(Item11,( + + member(Item1,String4), + findall(Item1,( +member(Item2,Neg_term_list),(Item1=Item2->(write("Error: Contains the negative term \""),write(Item1),writeln("\".")))),Item11)),Item12)), + +maplist(equals_empty_list,Item12) + +); + ((Flag1=negative->((member(Item1,String4),member(Item1,Neg_term_list))->true;(writeln("Error: Contains no negative term, one of:"),writeln(Neg_term_list),fail));true)->true;Flag1=either)), + (Flag2=one-not-ml->true; + (Flag2=one->(%%split_string(String4,SepandPad,SepandPad,String21), + writeln("Attempt 1"), + + (length(String4,Length_string1), +(check_strings_container1(Length_string1,String4,[0,0,ML_db1],[[0,0,[xxx,xxx,xxx,xxx,xxx]],[0,0,ML_db1],[999,999,[]]],_,_List2)-> + writeln("Success");(writeln("Failed"),fail)) + %%(%%data_instance_k_classification1([[0,0,[xxx,xxx,xxx,xxx,xxx]],[0,0,ML_db1]%%,[999,999,[]] + %%],[0,0,String4]%%21 + %%,1,String4a%%21 + %%), + %%String4=String4a) + %%)-> + %%(%%String4a=[_,_,String4a1], + %%writeln([found,String4a,String4]), +%%writeln("Success")); + %%(writeln("Failed"),fail) + %% + )); + (Flag2=%%[ + two,%%,P1,S1,P2,S2], + %%trace, + append([ML_db0],[ML_db1],ML_db2), + check_strings(String4,ML_db2%%,P1,S1,P2,S2 + )))). 
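+
+%% get_string/6 above reads one reply and fails (so that callers can retry)
+%% unless the reply passes the Flag1 polarity check and, for Flag2=one or two,
+%% an approximate match against the ML_db quote. E.g. (illustrative):
+%% get_string("Comment? ",positive,one-not-ml,"","",S) accepts "good idea"
+%% but fails on "not sure", printing an error naming the negative term.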
+
+
+prepare_file_for_ml(String000,String021) :-
+ string_codes(String001,String000),
+ downcase_atom(String001,String00),
+ split_string(String00, "\n\r", "\n\r", String01),
+ delete(String01,"",String02),
+ findall(String03,(member(String02a,String02),split_string(String02a,".",".",String04),
+ ((String04=[String05|String06],
+ number_string(Number05,String05),
+ number_sentences(Number05,1,String06,[],String03))->true;
+ (findall(String08,
+ (SepandPad="&#@~%`$?-+*^,()|.:;=_/[]<>{}\n\r\s\t\\\"!'0123456789",
+ member(String04a,String04),split_string(String04a,SepandPad,SepandPad,String09),
+ append_list([[""],"",String09],String08)),String03)))),String0211),
+ append_list2(String0211,String021).
+
+number_sentences(_,_,[],String,String) :- !.
+number_sentences(Number_a,Number_b1,String06,String07,String08) :-
+ String06=[String00|String12],
+
+SepandPad="&#@~%`$?-+*^,()|.:;=_/[]<>{}\n\r\s\t\\\"!'0123456789",
+ split_string(String00,SepandPad,SepandPad,String09),
+ append_list([[Number_a],Number_b1,String09],String10),
+ append(String07,[String10],String11),
+ Number_b2 is Number_b1 + 1,
+ number_sentences(Number_a,Number_b2,String12,String11,String08).
+
+
+data_instance_k_classification1(Data,I,K,C):-
+ maplist(o_oclass_disClass1(I),Data,DisAndClass),
+ keysort(DisAndClass,DSorted),
+
+ %%writeln(DSorted),
+
+ length(TopK,K),
+ append(TopK,_,DSorted),
+ %This is not a very good way of selecting k as you may have many values with the same distance, and the sorting just cuts these off
+ %Dsorted = [1-pos,3-pos,3-pos,3-neg,3-neg]
+ %Topk =[1-pos,3-pos,3-pos]
+ topk_vote(TopK,C).
+
+o_oclass_disClass1(O,[A,B,O2],D-[A,B,O2]):-
+ o_o_dis(O,O2,D).
+
+%% could be in either order
+%% a([w,z,a,b,e,c,z,y],[1,1,[c]],[1,2,[a]]).
+
+%% true
+%% a([a,c,b],[1,1,[a,d]],[1,2,[b]]).
+%% a([a,c,b],[1,1,[a,c]],[1,2,[b]]).
+%% a([a,b],[1,1,[a]],[1,2,[b]]).
+
+%% false
+%% a([c,d],[1,1,[a]],[1,2,[b]]). 
+ +%% q in "q a b" sends food like e in "c d e" + +/** + +[debug] ?- a([q,e,r,a,t,y,u,q,e,r,a,t,y,u,c,b,x,v,n,m],[1,1,[c,a,t,y,u]],[1,2,[c,a,t,y,u]]). [[1,1,[c,a,t,y,u]],[r,a,t,y,u]] +[[1,2,[c,a,t,y,u]],[r,a,t,y,u]] +true. + +X: +[debug] ?- a([q,e,r,a,t,y,u,c,b,x,v,n],[1,1,[c,a,t,y,u]],[1,2,[b,x,u]]). +[[1,1,[c,a,t,y,u]],[r,a,t,y,u]] +[[1,2,[b,x,u]],[c,b,x]] +true. + +[debug] ?- a([q,e,r,a,t,y,u,c,c,c,b,x,v,n],[1,1,[c,a,t,y,u]],[1,2,[b,x,v]]). [[1,1,[c,a,t,y,u]],[r,a,t,y,u]] +[[1,2,[b,x,v]],[c,c,c]] +true. + +**/ + +check_strings(String1,ML_db) :- + %%member([P1,S1,String2],ML_db), + %%member([P2,S2,String3],ML_db), + ML_db=[String2a,String3a], +%%writeln(["String1,String2a,String3a",String1,String2a,String3a]), + String2=[0,0,String2a], + String3=[0,0,String3a], +%%a(String1,String2,String3):- + length(String1,Length_string1), +((writeln("Attempt 1"), +check_strings_container1(Length_string1,String1,String2,[[0,0,[xxx,xxx,xxx,xxx,xxx]],String2,[999,999,[]]],_,List2), + length(List2,Length_list2), + %%Length_list3 is Length_list2+1, + %%writeln(here), + check_strings_container1(Length_list2,List2,String3,[[0,0,[xxx,xxx,xxx,xxx,xxx]],String3,[999,999,[]]],_,_List3), + writeln("Success") + %%,trace + )->true; + (writeln("Failed"), + writeln("Attempt 2"), + ((check_strings_container1(Length_string1,String1,String3,[[0,0,[xxx,xxx,xxx,xxx,xxx]],String3,[999,999,[]]],_,List2), + length(List2,Length_list2), + %%Length_list3 is Length_list2+1, + %%writeln(here), + + check_strings_container1(Length_list2,List2,String2,[[0,0,[xxx,xxx,xxx,xxx,xxx]],String2,[999,999,[]]],_,_List3))-> + writeln("Success");(writeln("Failed"),fail))) +). + +check_strings_container1(Length_string1,String1,String2,Db,List2,List2b) :- +check_strings1(Length_string1,String1,String2,Db,List2,List2b),not(var(List2b)). 
+%%check_strings_container2(Length_string1,Length_string3,List1,String2,Db,List2,List2b) :- +%%check_strings2(Length_string1,Length_string3,List1,String2,Db,List2,List2b),not(var(List2b)). + + + %% finds the length of each string - from position 1 to (length of string1) of string1 + %% starts with the shortest possible string, then increases the length + %% repeat this for second string + %%string1 + + %%*** check with s2,s3 in db +check_strings1(0,_String1,_String2,_Db,List2,List2) :- !. +check_strings1(Length_string1,String1,String2,Db,List2,List2b) :- + %% finds the first str + %%length(List,Index), + length(List1,Length_string1), + append(_,List1,String1), + Length_string2 is Length_string1-1, + Length_string3 is Length_string1+1, + check_strings2(0,Length_string3,List1,String2,Db,List2,List2c), + (var(List2c)-> + check_strings1(Length_string2,String1,String2,Db,List2c,List2b);List2c=List2b),!. + +check_strings2(Length_string,Length_string,_String1,_String2,_Db,List2,List2) :- !. +check_strings2(Length_string1,Length_string3,List1,String2,Db,List2,List2b) :- + + + %%split_string(String4,SepandPad,SepandPad,String21), + + %% go through s1, removing the first word each time, until v + %%Length_string11 is Length_string1-1, + length(List11,Length_string1), + append(List11,List2,List1), + Length_string2 is Length_string1+1, + +%%writeln([list11,List11]), + + ((%%trace, + %%(List11=[z,a,b]->trace;true), + %%writeln([[_,_,List11],String2]), + %%notrace, + not(List11=[]), + +%%writeln(data_instance_k_classification1(Db +%% ,List11,1,A)), + data_instance_k_classification1(Db + %%String2 + ,List11,1,A), + %%writeln([string2,String2,list11,List11,a,A]), +A=String2,%%[_,_,List11], + %%trace, + + %%((List11=[z,a,b],A=[_,_,[z,t,b]],not(String2=[_,_,[c,z]]))->trace;notrace), + %%trace, +%%writeln([string2=list11,String2=List11]), +%%(List11=[y]->true%%trace +%%;true), +%%member(List11,Db), + %%**String2=[_,_,List11] + List2b=List2,%%,notrace + String2=[_,_,String21] + 
,writeln([found,String21,List11]) + ) + ->(true + %%,writeln(List11) + ); + check_strings2(Length_string2,Length_string3,List1,String2,Db,_List2c,List2b)),!. + +numbers(N2,N1,Numbers,Numbers) :- + N2 is N1-1,!. +numbers(N2,N1,Numbers1,Numbers2) :- + N3 is N1+1, + append(Numbers1,[N1],Numbers3), + numbers(N2,N3,Numbers3,Numbers2). + diff --git a/Essay-Helper/short_essay_helper2.pl b/Essay-Helper/short_essay_helper2.pl new file mode 100644 index 0000000000000000000000000000000000000000..2c296a022541865a5dc7e45b54844ef41ecac6b1 --- /dev/null +++ b/Essay-Helper/short_essay_helper2.pl @@ -0,0 +1,812 @@ +%% Prolog Short Essay Helper + +%% Keep 1,2,3 in aphors +%% Certain number of reasons per paragraph +%% Explains essay structure at start +%% Exposition (negative terms not allowed, asks for 5 groups from text to take optionally enough notes about, which can revise) and critique +%% Paraphrasing +%% Converts x, explains x, asks about +%% Allows citation of aphor +%% Prints citations used so far +%% Prints chapter, essay and allows changes (using text editor) at end. +%% Outputs db file and br dicts for marking later + +%% *** Rewrite essay files with e.g. 1a. before aphors not 1. so not confused with aphors + +%% short_essay_helper("file.txt",5,E),writeln1(E). + +:- include('distances.pl'). +:- use_module(library(date)). +:- include('../listprologinterpreter/la_strings'). + +:- dynamic critique3/1. +:- dynamic agree_disagree/1. + +string(String) --> list(String). + +list([]) --> []. +list([L|Ls]) --> [L], list(Ls). 
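+%% The string//1 grammar above (used with phrase_from_file_s/2 to read a
+%% whole file as a list) simply relates the parsed list to itself. A minimal
+%% sketch of its behaviour with phrase/2, repeated here so the example is
+%% self-contained:

```prolog
string(String) --> list(String).

list([]) --> [].
list([L|Ls]) --> [L], list(Ls).

%% ?- phrase(string(Cs),[a,b]).
%% Cs = [a,b].
```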
+ +short_essay_helper(String01,Reasons_per_paragraph) :- + retractall(critique3(_)), + assertz(critique3([])), + + retractall(refs(_)), + assertz(refs([])), + + %%phrase_from_file_s(string(String00), Filex), + + %%split_string(String00, "\n\r", "\n\r", [String01|_]), + + %%prepare_file_for_ml(String00,String02), + +%%writeln1(String02), + + + generate_file_name(File1,File2), + + Numbers=[1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20,21,22,23,24,25], + explain_structure(String01,Reasons_per_paragraph,File1), + exposition(String00,String01,Reasons_per_paragraph,Numbers,String02,Exposition), + + concat_list(["Do you agree or disagree with \"",String01,"\" (a/d) ? "],String2ad),get_string(String2ad,either,one-not-ml,"","",String3ad), + + (String3ad="a"-> + (retractall(agree_disagree(_)), + assertz(agree_disagree(agree))); + (retractall(agree_disagree(_)), + assertz(agree_disagree(disagree)))), + + +critique(String00,String01,Reasons_per_paragraph,Numbers,String02,Critique), + +agree_disagree(Pole), + + concat_list(["What is the future area of research from your essay about \"",String01,"\"? "],Future_research_prompt), + %%trace, + get_string(Future_research_prompt,either,one-not-ml,"","",Future_research), + +term_to_atom([Exposition,Critique,Future_research],File_contents),open_s(File1,write,Stream),write(Stream,File_contents),close(Stream), + +%% Output essay +%%findall(_,(member(Exposition1,Exposition),Exposition1= + +refs(R2), + +%%writeln1([Exposition,Critique,Future_research,R2]), + +write_essay(String01,Pole,Exposition,Critique,Future_research,R2,Essay,HTML), +writeln1(Essay), + + (open_s(File2,write,Stream1), +%% string_codes(BrDict3), + write(Stream1,HTML), + close(Stream1)) + . + +%% replace("a\nb","\n","
+%%    "<br>\n",F).
+%% F="a<br>\nb<br>\n".
+
+replace(A,Find,Replace,F) :-
+	split_string(A,Find,Find,B),findall([C,Replace],(member(C,B)),D),maplist(append,[D],[E]),concat_list(E,F).
+
+concat_list1(D,F) :-
+	maplist(append,[D],[E]),concat_list(E,F).
+
+write_essay(String01,Pole,Exposition,Critique,Future_research,R2,Essay,HTML) :-
+	write_heading(String01,Heading),
+	write_introduction(String01,Pole,Critique,Introduction),
+	write_exposition(Exposition,Exposition2a),
+	concat_list(["I will expose ",String01," in this half. ",Exposition2a],Exposition2),
+	%%string_concat(Exposition2,"\n",Exposition2a),
+	write_critique(Critique,Critique2a),
+	string_concat(Critique2a,"\n",Critique2b),
+	atom_string(Pole,Pole1),
+	concat_list(["I will ",Pole1," with ",String01," in this half. ",Critique2b],Critique2),
+	write_conclusion(String01,Pole,Critique,Future_research,Conclusion),
+	write_references(R2,References,Refs_no_heading),
+	concat_list([Heading,Introduction,Exposition2,Critique2,Conclusion,References],
+	    Essay),
+	concat_list([Introduction,Exposition2,Critique2,Conclusion],
+	    Essay2),
+	replace(Essay2,"\n","<br>",HTML1),
+	replace(Refs_no_heading,"\n","<br>",Refs_no_heading2),
+	concat_list(["<html><head><title>",String01,"</title></head><body>",
+	String01,"<br><br>",HTML1,"<br><br>Bibliography<br><br>",Refs_no_heading2,"</body></html>"],HTML).
+
+write_heading(String01,Heading) :-
+	concat_list([String01,"\n","\n"],Heading).
+
+write_introduction(String01,Pole1,Critique,Introduction) :-
+	%% The heading should be in the form "Author's topic"
+	atom_string(Pole1,Pole2),
+	findall([Paragraph_topic_sentence," "],(member(A,Critique),A=[_,[Paragraph_topic_sentence|_]|_]),Paragraph_topic_sentences1),
+	concat_list1(Paragraph_topic_sentences1,Paragraph_topic_sentences2),
+	concat_list(["I will critically analyse ",String01,". ",
+	"I will ",Pole2," with ",String01,". ",
+	Paragraph_topic_sentences2,"\n\n"],Introduction).
+
+write_conclusion(String01,Pole1,Critique,Future_research,Conclusion) :-
+	atom_string(Pole1,Pole2),
+	findall([Paragraph_topic_sentence," "],(member(A,Critique),A=[_,[Paragraph_topic_sentence|_]|_]),Paragraph_topic_sentences1),
+	concat_list1(Paragraph_topic_sentences1,Paragraph_topic_sentences2),
+	concat_list([
+	Paragraph_topic_sentences2,
+	"I have ",Pole2,"d with ",String01,". ",
+	Future_research,"\n\n"
+	],Conclusion).
+
+write_references(R2,References,Refs_no_head) :-
+	findall([Reference,"\n"],member(Reference,R2),References1),
+	concat_list1(References1,References2),
+	concat_list([References2],Refs_no_head),
+	concat_list(["Bibliography","\n\n",References2],References).
+
+%%a1
+%% write_exposition([[[1,"g1"]],[[1,1,_15410,_15416,"a1","a1 is in g1."],[1,2,_15352,_15358,"a2","g1 contains a2."]]],A),writeln1(A).
+%% A = "a1 a1 is in g1. a2 g1 contains a2. ".
+
+%% write_exposition([[[1,"g1"],[1,"g1"]],[[1,1,_15410,_15416,"a1","a1 is in g1."],[1,2,_15352,_15358,"a2","g1 contains a2."],[1,1,_15410,_15416,"a1","a1 is in g1."],[2,2,_15352,_15358,"a2","g1 contains a2."]]],A).
+%% A = "a1 a1 is in g1. a2 g1 contains a2. a1 a1 is in g1. a2 g1 contains a2. ".
+ +write_exposition(Exposition,Essay4b) :- + Numbers=[1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20,21,22,23,24,25], + +Exposition=[_,Exposition2], + + +findall([Essay4c%%,"\n" +],(member(Numbera11,Numbers), + + +%% not "" with findall +findall(Essay4,( +%%member(Exposition1,Exposition), +%%Exposition1=[Numbera11,_Number3a,_String3,_String3a,String5a,String3y,_String3ay,String5a1,_CNumber2aa,_CNumber3aa,CString5a1a,_CNumber2a1,_CNumber3a1,_LastCStrings,String5aaa], + + +%%output_exposition(Numbera11,Exposition2,"",Essay1), + + %%findall( Essay4,( + member(Exposition1,Exposition2),Exposition1=[Numbera11,_Number3a,_String3,_String3a,String5a,Group_link], + concat_list([String5a," ",Group_link," "],Essay4) + %%delete(Exposition,Exposition1,Exposition2) + %%output_exposition(Numbera11,Exposition2,Essay4,Essay6) +),Essay4a), +concat_list(Essay4a,Essay4c1),%%trace, +(Essay4c1=""->Essay4c="";concat_list([Essay4c1,"\n"],Essay4c)) + +),Essay4d), + +maplist(concat_list,Essay4d,Essay4f), +concat_list(Essay4f,Essay4b) +%%concat_list([Essay4e,"\n"],Essay4b) +%%concat_list([Essay1," ",Essay3],Essay2), + %%concat_list([Essay2,"\n"],Essay23)) +%%,Essay3a), + %%concat_list(Essay3a,Essay4a) +. +%% *** HTML (
+%% <br> not \n)
+
+%%%%%
+%%a
+
+%% write_critique([[1,["heading is e12",[1,1,_25346,_25352,"e1",_25370,_25376,"e12"],[1,2,_25412,_25418,"e2",_25436,_25442,"e22",1,1,"e12",0,0,"e22","e12 is e22"]]]],A),writeln1(A).
+%% A = "heading is e12 e1 e12 e2 e22 e12 is e22 \n".
+
+write_critique(Critique,Essay4):-
+	Numbers=[1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20,21,22,23,24,25],
+
+findall(Essay23,(member(Numbera11,Numbers),
+
+%% not "" with findall
+%%findall(Essay22,(
+member(Critique1,Critique),
+%%Critique1=[Numbera11,_Number3a,_String3,_String3a,String5a,String3y,_String3ay,String5a1,_CNumber2aa,_CNumber3aa,CString5a1a,_CNumber2a1,_CNumber3a1,_LastCStrings,String5aaa],
+
+Critique1=[Numbera11,[Para_topic_sent,[_Number2a,_Number3a,_String3,_String3a,String5a,_String3y,_String3ay,String5a1]|Critique2]],
+
+concat_list([Para_topic_sent," ",String5a," ",String5a1," "],Essay0),output_critique(Numbera11,Critique2,String5a1,Essay0,Essay1),
+
+	concat_list([Essay1,Para_topic_sent,"\n"],%%Essay22)
+
+%%),
+Essay23)),Essay3),
+	concat_list(Essay3,Essay4)
+.
+%% *** HTML (<br> not \n)
+
+output_critique(_Numbera11,[],_String5a1a,Essay,Essay) :- !.
+output_critique(Numbera11,Critique,CString5a1a,Essay1,Essay2) :-
+findall( Essay6,(member(Critique1,Critique),Critique1=[Numbera11,_Number3a,_String3,_String3a,String5a,String3y,_String3ay,String5a1,
+_CNumber2aa,_CNumber3aa,CString5a1a,
+	_CNumber2a1,_CNumber3a1,_LastCStrings,
+	String5aaa],
+	concat_list([String5a," ",String5a1," ",String5aaa," "],Essay4),
+	delete(Critique,Critique1,Critique2),
+	output_critique(Numbera11,Critique2,String5aaa,Essay4,Essay6)
+),Essay33),
+%%trace,
+(Essay33=[]->concat_list([Essay1%%," "
+],Essay2);%%(Essay33=[Essay3]->concat_list([Essay1," ",Essay3],Essay2);%%(Essay33=[Essay3|[E33]]-> concat_list([Essay1," ",Essay3,E33],Essay2);
+(Essay33=[Essay3|E33], concat_list(E33,E34),concat_list([Essay1,%%" ",
+Essay3,E34],Essay2)))
+%%)
+%%)
+.
+
+
+/**critique(String00,String01,Reasons_per_paragraph,Numbers,Critique).
+ **/
+
+generate_file_name(File1,File2) :-
+	get_time(TS),stamp_date_time(TS,date(Year,Month,Day,Hour1,Minute1,Seconda,_A,_TZ,_False),local),
+	concat_list(["file",Year,Month,Day,Hour1,Minute1,Seconda,".txt"],File1),
+	concat_list(["file",Year,Month,Day,Hour1,Minute1,Seconda,".html"],File2).
+
+
+explain_structure(String01,Reasons_per_paragraph,File1) :-
+	concat_list(["The Short Essay Helper will help you structure and write your essay about \"",String01,"\" with ",Reasons_per_paragraph," reasons per paragraph.","\n",
+	"The Helper will help write an exposition (which summarises but doesn't critique the idea), a critique (which agrees or disagrees with the topic), the introduction and the conclusion (which state whether you agreed or disagreed with the topic, etc.). Citations will be automatically made.","\n"],String1),
+	writeln(String1).
+ +exposition(String00,String01,Reasons_per_paragraph,Numbers,ML_db,Exposition1) :- + length(List1,5), %% 5->1 + append(List1,_,Numbers), + length(List2,Reasons_per_paragraph), + append(List2,_,Numbers), + + %%string_codes(String001,String00), + %%writeln(String001), + findall([Number1,Exposition2],( + %%trace, +member(Number1,List1),concat_list(["What is group ",Number1," of 5 in the exposition that groups ideas about \"",String01,"\"? "],String1),get_string(String1,either,one-not-ml,"","",%ML_db, +Exposition2)),Exposition3), + + findall([Number2,Number3,String3,String3a,String5a,String5],( + member(Number2,List1),member(Number3,List2),get_item_n(Exposition3,Number2,[_,Item1]), + %%trace, + concat_list([" ","\n","The Helper will ask you how the quote you are about to enter relates to the paragraph topic."],String2b),writeln(String2b), + %%trace, +exposition2(Item1,ML_db,String3,String3a,String5a,String5) + ),Exposition4), + Exposition1=[Exposition3,Exposition4]. + +exposition2(Item1,ML_db,String3,String3a,String5a,String5):- + (%%concat_list(["What is the paragraph number of the quote about the paragraph topic \"",Item1,"\"? "],String2),get_number(String2,String3), + %%concat_list(["What is the sentence number of the quote about the paragraph topic \"",Item1,"\"? "],String2a),get_number(String2a,String3a), + %%member1a([String3,String3a,String3aa],ML_db), + concat_list(["What is the paraphrased quote about the paragraph topic \"",Item1,"\"? "],String4a),get_string(String4a,either,one-not-ml-ref,"",String3aa,String5a), + concat_list(["How does the quote you entered (\"",String5a,"\") relate to the paragraph topic \"",Item1,"\"? "],String4), + %%trace, + SepandPad="#@~%`$?-+*^,()|.:;=_/[]<>{}\n\r\s\t\\\"!'0123456789",downcase_atom(Item1,String41a),split_string(String41a, SepandPad, SepandPad, Item1a), + + get_string(String4,either,one-not-ml,Item1a,String3aa,String5)) + ->true;exposition2(Item1,ML_db,String3,String3a,String5a,String5). 
+ +%% Agree or disagree + +critique(String00,String01,Reasons_per_paragraph,Numbers,ML_db,Critique1) :- + length(List1,5), %% 5->1 + append(List1,_,Numbers), + length(List2,Reasons_per_paragraph), + append(List2,_,Numbers), + + %%string_codes(String001,String00), + %%writeln(String001), + + retractall(critique3(_)), + assertz(critique3([])), + + + findall([Number2a,Critique2],( +%% Reason 1 + +member(Number2a,List1), +%%List1=[Number2a|List1a], +List2=[Number3a|List2a], +%%trace, + + +critique_reason1(String01,Number2a,Number3a,Reasons_per_paragraph,ML_db,Critique3,Topic_paragraph_link), + +critique_reasons_2_to_n(Number2a,List2a,Critique3,Reasons_per_paragraph,ML_db,Critique4), + + append_list2([[Topic_paragraph_link],Critique3,Critique4],Critique2)),Critique1). + + critique_reason1(String01,Number2a,Number3a,Reasons_per_paragraph,ML_db,Critique3,Topic_paragraph_link) :- + + %%member(Number2,List1),member(Number3,List2), + %%get_item_n(Critique3,Number2a,[_,Item1]), + %%trace, + concat_list([" ","\n","The Helper will ask you for your comment on the quote you are about to enter."],String2b),writeln(String2b), + %% Later: connections + + %%trace, +critique2(String01,ML_db,String3,String3a,String5a,String3y,String3ay,String5a1,Topic_paragraph_link), + %%),Critique4). + Critique3=[[Number2a,Number3a,String3,String3a,String5a,String3y,String3ay,String5a1]]. + +critique2(String01,ML_db,String3,String3a,String5a,String3y,String3ay,String5a1,Topic_paragraph_link):- + (%%concat_list(["What is the paragraph number of the quote to comment on? "],String2),get_number(String2,String3), + %%concat_list(["What is the sentence number of the quote to comment on? "],String2a),get_number(String2a,String3a), + %%member1a([String3,String3a,String3aa],ML_db), + concat_list(["What is the paraphrased quote to comment on? "],String4a),get_string(String4a,either,one-not-ml-ref,"",String3aa,String5a), + + concat_list(["Is your comment from a quote (y/n)? 
 "],String2yn),get_string(String2yn,either,one-not-ml,"","",String3yn),
+
+	agree_disagree(Pole),
+
+	(String3yn="y"->
+	(%%concat_list(["What is the paragraph number of the comment? "],String2y),get_number(String2y,String3y),
+	%%concat_list(["What is the sentence number of the comment? "],String2ay),get_number(String2ay,String3ay),
+	%%trace,
+	%%member1a([String3y,String3ay,String3aay],ML_db),
+
+	%% use either original x or paraphrase
+
+	concat_list(["What is the comment that is from the quote? "],String4ac),
+	%%trace,
+	get_string(String4ac,either,one-not-ml-ref,"",String3aay,String5a1)
+	)
+
+	;(String3y=0,String3ay=0,
+	concat_list(["What is the comment? "],String4ac),get_string(String4ac,either,one-not-ml,"","",%%String5a,
+	String5a1))),
+
+	concat_list(["How does the comment \"",String5a1,"\" relate to the essay topic \"",String01,"\"? "],Topic_paragraph_link_prompt),
+	%%trace,
+
+	downcase_and_split(String5a1,String5a1ds),
+	downcase_and_split(String01,String01ds),
+	get_string(Topic_paragraph_link_prompt,either,one-not-ml,String5a1ds,String01ds,Topic_paragraph_link)
+
+
+	/**
+	%% conn - choose connected comments
+	concat_list(["How does the quote you entered (\"",String5a,"\") relate to the paragraph topic \"",Item1,"\"? "],String4),
+	%%trace,
+	SepandPad="#@~%`$?-+*^,()|.:;=_/[]<>{}\n\r\s\t\\\"!'0123456789",downcase_atom(Item1,String41a),split_string(String41a, SepandPad, SepandPad, Item1a),
+
+	get_string(String4,either,two,Item1a,String3aa,String5)
+	**/
+	)
+	%% Retries the predicate if it fails (String01 must be passed through,
+	%% otherwise the retry calls an undefined critique2/8)
+	->true;critique2(String01,ML_db,String3,String3a,String5a,String3y,String3ay,String5a1,Topic_paragraph_link).
+ + + +critique_reasons_2_to_n(Number2a,List2a,Critique3,Reasons_per_paragraph,ML_db,Critique4) :- + retractall(critique3(_)), + assertz(critique3(Critique3)), +%%trace, +/** critique3(Critique31), + append(Critique31,Critique3,Critique32), + retractall(critique3(_)), + assertz(critique3(Critique32)), +**/ +%%notrace, +findall([Number2a,Number3a,String3,String3a,String5a,String3y,String3ay,String5a1, +CNumber2aa,CNumber3aa,CString5a1a, + CNumber2a1,CNumber3a1,LastCStrings, + String5aaa],( + %%member(Number2,List1a), + member(Number3a,List2a), + %%get_item_n(Critique3,Number2a,[_,Item1]), + %%trace, + concat_list([" ","\n","The Helper will ask you for your comment on the quote you are about to enter."],String2b),writeln(String2b), + %% Later: connections + + + %%trace, +critique3(ML_db,%%Critique3, +String3,String3a,String5a,String3y,String3ay,String5a1, +CNumber2aa,CNumber3aa,CString5a1a, + CNumber2a1,CNumber3a1,LastCStrings, + String5aaa) + + +/** + trace, + critique3(Critique31), + append(Critique31,[[String3,String3a,String5a,String3y,String3ay,String5a1]],Critique32), + retractall(critique3(_)), + assertz(critique3(Critique32)) +**/ + +), + Critique4). + + %%Critique3=[]. + +critique3(ML_db,%%Critique3, +String3,String3a,String5a,String3y,String3ay,String5a1, +CNumber2aa,CNumber3aa,CString5a1a, + CNumber2a1,CNumber3a1,LastCStrings, + String5aaa):- + (%%concat_list(["What is the paragraph number of the quote to comment on? "],String2),get_number(String2,String3), + %%concat_list(["What is the sentence number of the quote to comment on? "],String2a),get_number(String2a,String3a), + %%member1a([String3,String3a,String3aa],ML_db), + concat_list(["What is the paraphrased quote to comment on? "],String4a),get_string(String4a,either,one-not-ml-ref,"",String3aa,String5a), + + concat_list(["Is your comment from a quote (y/n)? 
"],String2yn),get_string(String2yn,either,one-not-ml,"","",String3yn), + + agree_disagree(Pole), + + (String3yn="y"-> + (%%concat_list(["What is the paragraph number of the comment? "],String2y),get_number(String2y,String3y), + %%concat_list(["What is the sentence number of the comment? "],String2ay),get_number(String2ay,String3ay), + %%trace, + %%member1a([String3y,String3ay,String3aay],ML_db), + + %% use either original x or paraphrase x + %%trace, + concat_list(["What is the comment that is from the quote? "],String4ac), + %%trace, + get_string(String4ac,either,one-not-ml-ref,"",String3aay,String5a1) + %%,trace + ) + + ;(String3y=0,String3ay=0, + concat_list(["What is the comment? "],String4ac),get_string(String4ac,either,one-not-ml,"","",%%String5a, + String5a1))), + %%*** assertz recurse not findall new critique3 x + + + %%trace, + critique3(Critique31), + append(Critique31,[[0,0,String3,String3a,String5a,String3y,String3ay,String5a1]],Critique32), + retractall(critique3(_)), + assertz(critique3(Critique32)), + + critique3(Critique33), + + +(( + + + %%critique3(Critique3), + length(Critique33,LCritique3), + %%length(List1,LCritique3), + numbers(LCritique3,1,[],List1), + %%append(List1,_,Numbers), + %%Numbers=[1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20,21,22,23,24,25], + %%trace, + +findall([N," - ",CString5a1,"\n"],(member(N,List1),get_item_n(Critique33,N,[CNumber2a,CNumber3a,_CString3,_CString3a,_CString5a,_CString3y,_CString3ay,CString5a1])),CStrings1), + +findall([N,CNumber2a,CNumber3a,CString5a1],(member(N,List1),get_item_n(Critique33,N,[CNumber2a,CNumber3a,_CString31,_CString3a1,_CString5a1,_CString3y1,_CString3ay1,CString5a1])),CStrings2), + +%%trace, + +%%CStrings=[CStrings1,CStrings2], +reverse(CStrings2,CStringsR),CStringsR=[[_,CNumber2a1,CNumber3a1,LastCStrings]|CStringsR1], +reverse(CStringsR1,CStringsRR), + +reverse(CStrings1,CStringsR10),CStringsR10=[_|CStringsR11], +reverse(CStringsR11,CStringsRR1), + 
+append_list2(CStringsRR1,CStrings11), +concat_list(CStrings11,CStrings12), +concat_list(["Please select a comment to connect the comment \"",LastCStrings,"\" to:","\n",CStrings12],ConnectionNumberPrompt), +get_number(ConnectionNumberPrompt,ConnectionNumber), + + member([ConnectionNumber,CNumber2aa,CNumber3aa,CString5a1a],CStringsRR), + + SepandPad="#@~%`$?-+*^,()|.:;=_/[]<>{}\n\r\s\t\\\"!'0123456789",downcase_atom(CString5a1a,CString5a1a1),split_string(CString5a1a1, SepandPad, SepandPad, CString5a1a2), + + %%CNumber2a1,CNumber3a1, + downcase_atom(LastCStrings,LastCStrings_a),split_string(LastCStrings_a, SepandPad, SepandPad, LastCStrings_a1), + + + %% conn - choose connected comments, this to a previous comment + %%trace, + + concat_list(["How does \"",LastCStrings,"\" connect to \"",CString5a1a,"\"? "],ConnectionPrompt), + + get_string(ConnectionPrompt,either,one-not-ml,CString5a1a2,LastCStrings_a1,String5aaa))->true;( + + %% If the section since updating dynamic critique comments fails, prevent doubling of critique comments + + critique3(Critique311), + reverse(Critique311,Critique312), + Critique312=[_|Critique313], + reverse(Critique313,Critique314), + retractall(critique3(_)), + assertz(critique3(Critique314)),fail + + )) + +/** Critique4=[String3,String3a,String5a,String3y,String3ay,String5a1, + CNumber2aa,CNumber3aa,CString5a1a, + CNumber2a1,CNumber3a1,LastCStrings, + String5aaa], + **/ + /** + critique3(Critique31), + append(Critique31,[[String3,String3a,String5a,String3y,String3ay,String5a1]],Critique32), + retractall(critique3(_)), + assertz(critique3(Critique32)) +**/ + + ) + %% Retries the predicate if it fails + ->true;critique3(ML_db,%%Critique3, + String3,String3a,String5a,String3y,String3ay,String5a1, + CNumber2aa,CNumber3aa,CString5a1a, + CNumber2a1,CNumber3a1,LastCStrings, + String5aaa + ). + + + + +member1a([String3,String3a,String3aa],ML_db) :- + member([String3,String3a,String3aa],ML_db),!. 
+ + +get_item_n(Exposition,Number1,Item) :- + Number2 is Number1-1, + length(List,Number2), + append(List,[Item|_],Exposition). + +get_number(Prompt1,Number) :- + %%concat_list(Prompt1,Prompt2), + (%%repeat, + writeln(Prompt1),read_string(user_input, "\n", "\r", _End, String),split_string(String, ",", " ", Value1),Value1=[Value2],number_string(Number,Value2)). + + +/** +get_string(Prompt1,String1) :- + concat_list(Prompt1,Prompt2), + (repeat,write(Prompt2),read_string(String1),not(String1="")). +**/ + +equals_empty_list([]). + +downcase_and_split(String1, String2) :- +SepandPad="&#@~%`$?-+*^,()|.:;=_/[]<>{}\n\r\s\t\\\"!'0123456789", + downcase_atom(String1,String3),split_string(String3, SepandPad, SepandPad, String2). + +get_string(Prompt2,Flag1,Flag2,ML_db0,ML_db1,String2) :- +%%writeln1(get_string(Prompt2,Flag1,Flag2,ML_db1,String2)), + %%concat_list(Prompt1,Prompt2), + %%trace, + SepandPad="&#@~%`$?-+*^,()|.:;=_/[]<>{}\n\r\s\t\\\"!'0123456789", + %%(repeat, + (Flag2=one-not-ml-ref->(concat_list(["Note: Enter in-text reference using AGPS, e.g.\n","The first work supports the second work (Surname 2000, pp. 18-9).\n","Surname (2000, pp. 
18-9) states that ...\n","Remember to use words like \"also\", \"moreover\" and \"in addition\" before the sentence."],String_a1),writeln(String_a1));true), + writeln(Prompt2),read_string(user_input, "\n", "\r", _End, String2aa),%%not(String2aa=""), + %%String2aa=[String2aaa], + downcase_atom(String2aa,String3),split_string(String3, SepandPad, SepandPad, String4), + Neg_term_list=["no","not","don","t","shouldn","wouldn","disagree","differ","dislikes","disagrees","differs","dislikes","disagreed","differed","disliked","negative","negation","non","negate","negates","negated","but","however","isn","lack"], + (Flag1=%%non_negative + positive->( + + (findall(Item11,( + + member(Item1,String4), + findall(Item1,( +member(Item2,Neg_term_list),(Item1=Item2->(write("Error: Contains the negative term \""),write(Item1),writeln("\".")))),Item11)),Item12)), + +maplist(equals_empty_list,Item12) + +); + ((Flag1=negative->((member(Item1,String4),member(Item1,Neg_term_list))->true;(writeln("Error: Contains no negative term, one of:"),writeln(Neg_term_list),fail));true)->true;Flag1=either)), + (Flag2=one-not-ml->String2=String2aa; + (Flag2=one-not-ml-ref-> + (refs(R1),writeln("What is the reference? e.g. 
Surname, A 2000, Title: Subtitle, Publisher, City.\n"),writeln("Existing references (copy one or many delimited with \"\\n\"):"), findall(_,(member(R11,R1),writeln(R11)),_),read_string(user_input, "\n", "\r", _End, String2r),not(String2r=""),%%downcase_atom(String2r,_String3r), + String2=String2aa,split_string(String2r,"\n\r","\n\r",String2r3), + %%trace, + retractall(refs(_)),maplist(append,[[R1,String2r3]],[String2r21]), + sort1(String2r21,String2r2), + assertz(refs(String2r2))%%split_string(String3r, SepandPad, SepandPad, String4) + ); +(Flag2=one->(%%split_string(String4,SepandPad,SepandPad,String21), + writeln("Attempt 1"), + + (length(String4,Length_string1), +(check_strings_container1(Length_string1,String4,[0,0,ML_db1],[[0,0,[xxx,xxx,xxx,xxx,xxx]],[0,0,ML_db1],[999,999,[]]],_,_List2)-> + writeln("Success");(writeln("Failed"),fail)) + %%(%%data_instance_k_classification1([[0,0,[xxx,xxx,xxx,xxx,xxx]],[0,0,ML_db1]%%,[999,999,[]] + %%],[0,0,String4]%%21 + %%,1,String4a%%21 + %%), + %%String4=String4a) + %%)-> + %%(%%String4a=[_,_,String4a1], + %%writeln([found,String4a,String4]), +%%writeln("Success")); + %%(writeln("Failed"),fail) + %% + )); + (Flag2=%%[ + two,%%,P1,S1,P2,S2], + %%trace, + append([ML_db0],[ML_db1],ML_db2), + check_strings(String4,ML_db2%%,P1,S1,P2,S2 + ))))). + + +%% Sorts by first surname then title in AGPS + +sort1(List1,List2) :- + findall([C,B],(member(B,List1),downcase_atom(B,D),atom_string(D,C)),E),sort(E,F),findall(G,member([_,G],F),List2). 
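+%% sort1/2 orders references case-insensitively (so AGPS entries sort by
+%% surname regardless of capitalisation) by decorating each reference with
+%% its downcased form, sorting the pairs, and stripping the keys. Note that
+%% sort/2 also removes exact duplicates, so a repeated reference is listed
+%% once. Repeated here so the example is self-contained:

```prolog
sort1(List1,List2) :-
    findall([C,B],(member(B,List1),downcase_atom(B,D),atom_string(D,C)),E),
    sort(E,F),
    findall(G,member([_,G],F),List2).

%% ?- sort1(["Smith, B 2001, T2.","jones, A 2000, T1."],L).
%% L = ["jones, A 2000, T1.","Smith, B 2001, T2."].
```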
+ + +prepare_file_for_ml(String000,String021) :- + string_codes(String001,String000), + downcase_atom(String001,String00), + split_string(String00, "\n\r", "\n\r", String01), + delete(String01,"",String02), + findall(String03,(member(String02a,String02),split_string(String02a,".",".",String04), + ((String04=[String05|String06], + number_string(Number05,String05), + number_sentences(Number05,1,String06,[],String03))->true; + (findall(String08, + (SepandPad="&#@~%`$?-+*^,()|.:;=_/[]<>{}\n\r\s\t\\\"!'0123456789", + member(String04a,String04),split_string(String04a,SepandPad,SepandPad,String09), + append_list([[""],"",String09],String08)),String03)))),String0211), + append_list2(String0211,String021). + +number_sentences(_,_,[],String,String) :- !. number_sentences(Number_a,Number_b1,String06,String07,String08) :- + String06=[String00|String12], + +SepandPad="&#@~%`$?-+*^,()|.:;=_/[]<>{}\n\r\s\t\\\"!'0123456789", + split_string(String00,SepandPad,SepandPad,String09), + append_list([[Number_a],Number_b1,String09],String10), + append(String07,[String10],String11), + Number_b2 is Number_b1 + 1, + number_sentences(Number_a,Number_b2,String12,String11,String08). + + +data_instance_k_classification1(Data,I,K,C):- + maplist(o_oclass_disClass1(I),Data,DisAndClass), + keysort(DisAndClass,DSorted), + + %%writeln(DSorted), + + length(TopK,K), + append(TopK,_,DSorted), + %This is not a very good way of selecting k as you may have many values with the same distance, and the sorting just cuts these off + %Dsorted = [1-pos,3-pos,3-pos,3-neg,3-neg] + %Topk =[1-pos,3-pos,3-pos] + topk_vote(TopK,C). + +o_oclass_disClass1(O,[A,B,O2],D-[A,B,O2]):- + o_o_dis(O,O2,D). + +%% could be in either order +%% a([w,z,a,b,e,c,z,y],[1,1,[c]],[1,2,[a]]). + +%% true +%% a([a,c,b],[1,1,[a,d]],[1,2,[b]]). +%% a([a,c,b],[1,1,[a,c]],[1,2,[b]]). +%% a([a,b],[1,1,[a]],[1,2,[b]]). + +%% false +%% a([c,d],[1,1,[a]],[1,2,[b]]). 
+ +%% q in "q a b" sends food like e in "c d e" + +/** + +[debug] ?- a([q,e,r,a,t,y,u,q,e,r,a,t,y,u,c,b,x,v,n,m],[1,1,[c,a,t,y,u]],[1,2,[c,a,t,y,u]]). [[1,1,[c,a,t,y,u]],[r,a,t,y,u]] +[[1,2,[c,a,t,y,u]],[r,a,t,y,u]] +true. + +X: +[debug] ?- a([q,e,r,a,t,y,u,c,b,x,v,n],[1,1,[c,a,t,y,u]],[1,2,[b,x,u]]). +[[1,1,[c,a,t,y,u]],[r,a,t,y,u]] +[[1,2,[b,x,u]],[c,b,x]] +true. + +[debug] ?- a([q,e,r,a,t,y,u,c,c,c,b,x,v,n],[1,1,[c,a,t,y,u]],[1,2,[b,x,v]]). [[1,1,[c,a,t,y,u]],[r,a,t,y,u]] +[[1,2,[b,x,v]],[c,c,c]] +true. + +**/ + +check_strings(String1,ML_db) :- + %%member([P1,S1,String2],ML_db), + %%member([P2,S2,String3],ML_db), + ML_db=[String2a,String3a], +%%writeln(["String1,String2a,String3a",String1,String2a,String3a]), + String2=[0,0,String2a], + String3=[0,0,String3a], +%%a(String1,String2,String3):- + length(String1,Length_string1), +((writeln("Attempt 1"), +check_strings_container1(Length_string1,String1,String2,[[0,0,[xxx,xxx,xxx,xxx,xxx]],String2,[999,999,[]]],_,List2), + length(List2,Length_list2), + %%Length_list3 is Length_list2+1, + %%writeln(here), + check_strings_container1(Length_list2,List2,String3,[[0,0,[xxx,xxx,xxx,xxx,xxx]],String3,[999,999,[]]],_,_List3), + writeln("Success") + %%,trace + )->true; + (writeln("Failed"), + writeln("Attempt 2"), + ((check_strings_container1(Length_string1,String1,String3,[[0,0,[xxx,xxx,xxx,xxx,xxx]],String3,[999,999,[]]],_,List2), + length(List2,Length_list2), + %%Length_list3 is Length_list2+1, + %%writeln(here), + + check_strings_container1(Length_list2,List2,String2,[[0,0,[xxx,xxx,xxx,xxx,xxx]],String2,[999,999,[]]],_,_List3))-> + writeln("Success");(writeln("Failed"),fail))) +). + +check_strings_container1(Length_string1,String1,String2,Db,List2,List2b) :- +check_strings1(Length_string1,String1,String2,Db,List2,List2b),not(var(List2b)). 
+%%check_strings_container2(Length_string1,Length_string3,List1,String2,Db,List2,List2b) :- +%%check_strings2(Length_string1,Length_string3,List1,String2,Db,List2,List2b),not(var(List2b)). + + + %% finds the length of each string - from position 1 to (length of string1) of string1 + %% starts with the shortest possible string, then increases the length + %% repeat this for second string + %%string1 + + %%*** check with s2,s3 in db +check_strings1(0,_String1,_String2,_Db,List2,List2) :- !. +check_strings1(Length_string1,String1,String2,Db,List2,List2b) :- + %% finds the first str + %%length(List,Index), + length(List1,Length_string1), + append(_,List1,String1), + Length_string2 is Length_string1-1, + Length_string3 is Length_string1+1, + check_strings2(0,Length_string3,List1,String2,Db,List2,List2c), + (var(List2c)-> + check_strings1(Length_string2,String1,String2,Db,List2c,List2b);List2c=List2b),!. + +check_strings2(Length_string,Length_string,_String1,_String2,_Db,List2,List2) :- !. +check_strings2(Length_string1,Length_string3,List1,String2,Db,List2,List2b) :- + + + %%split_string(String4,SepandPad,SepandPad,String21), + + %% go through s1, removing the first word each time, until v + %%Length_string11 is Length_string1-1, + length(List11,Length_string1), + append(List11,List2,List1), + Length_string2 is Length_string1+1, + +%%writeln([list11,List11]), + + ((%%trace, + %%(List11=[z,a,b]->trace;true), + %%writeln([[_,_,List11],String2]), + %%notrace, + not(List11=[]), + +%%writeln(data_instance_k_classification1(Db +%% ,List11,1,A)), + data_instance_k_classification1(Db + %%String2 + ,List11,1,A), + %%writeln([string2,String2,list11,List11,a,A]), +A=String2,%%[_,_,List11], + %%trace, + + %%((List11=[z,a,b],A=[_,_,[z,t,b]],not(String2=[_,_,[c,z]]))->trace;notrace), + %%trace, +%%writeln([string2=list11,String2=List11]), +%%(List11=[y]->true%%trace +%%;true), +%%member(List11,Db), + %%**String2=[_,_,List11] + List2b=List2,%%,notrace + String2=[_,_,String21] + 
,writeln([found,String21,List11]) + ) + ->(true + %%,writeln(List11) + ); + check_strings2(Length_string2,Length_string3,List1,String2,Db,_List2c,List2b)),!. + +numbers(N2,N1,Numbers,Numbers) :- + N2 is N1-1,!. +numbers(N2,N1,Numbers1,Numbers2) :- + N3 is N1+1, + append(Numbers1,[N1],Numbers3), + numbers(N2,N3,Numbers3,Numbers2). + diff --git a/Essay-Helper/short_essay_helper3.1_agps-mr-tree.pl b/Essay-Helper/short_essay_helper3.1_agps-mr-tree.pl new file mode 100644 index 0000000000000000000000000000000000000000..d88d152fdb5fd87809b911096c9c87c3abb572c7 --- /dev/null +++ b/Essay-Helper/short_essay_helper3.1_agps-mr-tree.pl @@ -0,0 +1,1120 @@ +%% backdated from this - which has no repeats and forward progress through essays, but has a bug in numbering pages in references, so backdated to version with repeats and random progress, made forward progress x kept this version and solved bug with pg info in db + +%% Prolog Short Essay Helper + +%% Keep 1,2,3 in aphors +%% Certain number of reasons per paragraph +%% Explains essay structure at start +%% Exposition (negative terms not allowed, asks for 5 groups from text to take optionally enough notes about, which can revise) and critique +%% Paraphrasing +%% Converts x, explains x, asks about +%% Allows citation of aphor +%% Prints citations used so far +%% Prints chapter, essay and allows changes (using text editor) at end. +%% Outputs db file and br dicts for marking later + +%% *** Rewrite essay files with e.g. 1a. before aphors not 1. so not confused with aphors + +%% short_essay_helper("file.txt",5,E),writeln1(E). + +%%:- include('distances.pl'). +:- use_module(library(date)). +:- include('../listprologinterpreter/la_strings'). +:- include('../mindreader/make_mind_reading_tree4 working1.pl'). +:- include('../mindreader/mr_tree.pl'). +:- include('texttobr2qb.pl'). +:- include('sheet_feeder.pl'). + +:- dynamic critique3/1. +:- dynamic agree_disagree/1. +:- dynamic refs/1. +:- dynamic chosen_quotes/1. 
+:- dynamic string00_z/1. + +choose(N2,B,B1,B2,C,Item) :- +%%trace, +( + choose2(N2,B,B1,B2,C,Item)->true;(Item="* All quotes exhausted. (*)"),N2=0,B="()",B1="()",B2=0,C=""), + %%notrace, + !. +choose2(N2,B,B1,B2,List0,List0) :- +%%trace, + string00_z(String00), + choose_sentence_range(String00,N1,B,B1,B2,List0), + %%chosen_quotes(Chosen_quotes1), + %%trace, + %%length(List0,L), + %%numbers(L,1,[],N), + %%random_ + %%member(N1,N), + %% + %%random_ + %%mind_read([N1,Item10],List0), + + %%random_ + %%mind_read(Item1,Item10), + N2 is N1+B2-1, + %%get_item_n(List0,N1,Item10), + + + %%**string_codes(Item10,List), + %%notrace, + %%**split_string(List,".\n",".\n",List2), + + %%length(List2,L), + %%numbers(L,1,[],N), + %%random_ + %%member(N1,N), + %%N2 is N1+B2-1, + %%random_ + %%**member(Item1,List2), + + %%get_item_n(List2,N1,Item1), + /** + string_concat(E,D,Item1), + string_length(E,1), + downcase_atom(E,E1), + atom_string(E1,E2), + string_concat(E2,D,Item2), + string_length(E2,1), + string_concat(Item2,""%%"." + ,Item), + **/ + delete(String00,[B,B1,B2,N1,List0],String00_a), + %%**delete(String00,[B,B1,B2|_],String00_a), + %%**delete(List0,[N1,Item10],List6), + %%findall([Item3,". "],(member(Item3,List2)),List3), + %%maplist(append,[List3],[List4]), + %%concat_list(List4,_List5), + %%append(List6,[]%%List5 + %%,List7), + %%**(List6=[]->String00_b=String00_a; + %%**(%%trace, + %%**maplist(append,[[[B,B1,B2],List6]],[String00_c]), + %%**append(String00_a,[String00_c],String00_b)%%,notrace + %%**)), + retractall(string00_z(_)), + assertz(string00_z(String00_a)) + %%trace, + %%writeln1(String00_b),notrace + + + %%,not(member(Item,Chosen_quotes1)), + %%append(Chosen_quotes1,[Item],Chosen_quotes2), + %%retractall(chosen_quotes(_)), + %%assertz(chosen_quotes(Chosen_quotes2)) + . + +choose1(List0,Item) :- +%trace, + mind_read(Item,List0),writeln(here1). 
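
The choose/6 wrapper above relies on an if-then-else with a sentinel so the caller keeps running once every quote has been consumed, rather than failing. The same fallback pattern in isolation (pick_or_default/2 is a hypothetical name for illustration, not part of this file):

```prolog
%% If member/2 finds an element, return it; otherwise return the same
%% sentinel string that choose/6 above uses when the quote pool is empty.
pick_or_default(List, Item) :-
    ( member(Item, List)
    -> true
    ;  Item = "* All quotes exhausted. (*)"
    ).

%% ?- pick_or_default([], X).
%% X = "* All quotes exhausted. (*)".
```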
+ +delete_invisibles_etc(F,G) :- + findall(J,(member(H,F),atom_string(H,J),not(J="."),not(J=".."),not(string_concat(".",_,J))),G). + +string(String) --> list(String). + +list([]) --> []. +list([L|Ls]) --> [L], list(Ls). + +short_essay_helper(%%Filex, + String01,Key_words, + Reasons_per_paragraph) :- + retractall(string00_z(_)), + %%assertz(string00_z([])), + + retractall(critique3(_)), + assertz(critique3([])), + + retractall(refs(_)), + assertz(refs([])), + + retractall(chosen_quotes(_)), + assertz(chosen_quotes([])), + + directory_files("sources/",F), + delete_invisibles_etc(F,G), +%%trace, +SepandPad="#@~%`$?-+*^,()|.:;=_/[]<>{}\n\r\s\t\\!'0123456789", + findall(String02h3,(member(Filex1,G), + string_concat("sources/",Filex1,Filex), + phrase_from_file_s(string(String00a), Filex), + string_codes(String02b,String00a), + atom_to_term(String02b,String02a,[]), + + (String02a=[Az,Bz,Cz|String02c]->true; + (concat_list(["Error: ",Filex," not in format [\"Surname, A 2000, Title: Subtitle, Publisher, City.\",\"Surname, A 2000\",First_Page_Num,\"\",\"\",...\"]"],Notification1),writeln(Notification1),abort)), + %%String02c=String02d, + %%trace, + findall([Az,Bz,Cz,N1,String02cb],( + + length(String02c,L), + numbers(L,1,[],N), + %%random_ + member(N1,N), + get_item_n(String02c,N1,String02ca), + + %%member(String02ca,String02c), + split_string(String02ca, ".\n\r", ".\n\r", String02cb) + + %%member(String02cb1,String02cb) + + ),String02cc), + %%maplist(append,[String02cc],[String02d]), + + %%delete(String02cc,[_,[]],String02d), + String02cc=String02d, + + findall([Az,Bz,Cz,N2,String02d2],(member([Az,Bz,Cz,N2,String02d1],String02d), + member(String02d2,String02d1), + downcase_atom(String02d2,String02e), + atom_string(String02e,String02f1), + split_string(String02f1, SepandPad, SepandPad, String02e1), + findall(String02g,(member(Key_words1,Key_words), + %%trace, + downcase_atom(Key_words1,Key_words11), + atom_string(Key_words11,Key_words12), 
+findall(Key_words12,(member(Key_words12,String02e1)),String02g) + ),String02i), + not(maplist(equals_empty_list,String02i)) + + ),String02h31), + + sort(String02h31,String02h3) + + %%prepare_file_for_ml(String00,String02a) + ),String00z1), + %%, + + %%** findall(String02h2,(member([Ay,By,Cy,String02h1],String00z1), + %%**(String02h1=[]->String02h2=[]; + maplist(append,[String00z1],[String00]),%%**) + %%**),String00z), + +%%delete(String00z,[],String00), + + +term_to_atom(Key_words,Key_words_a), +atom_string(Key_words_a,Key_words_b), + (String00=[]->(concat_list(["Error: No files in source folder or no instances of keywords ",Key_words_b," in files in source folder."],Notification2),writeln(Notification2),abort);true), + + %%maplist(append,[[String00z1]],String00), +%%maplist(append,[String00z],String00), + %%trace, +assertz(string00_z(String00)), + %%writeln1(String00), + %%notrace, + +%%writeln1(String02), + + + generate_file_name(File1,File2), + + Numbers=[1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20,21,22,23,24,25], + explain_structure(String01,Reasons_per_paragraph,File1), + exposition(String00,String01,Reasons_per_paragraph,Numbers,String02,Exposition), + + %%concat_list(["Do you agree or disagree with ",String01," (a/d) ? "],String2ad),%%get_string(String2ad,either,one-not-ml,"","",String3ad), + choose1(["a"%%,"d" + ],String3ad), + + (String3ad="a"-> + (retractall(agree_disagree(_)), + assertz(agree_disagree(agree))); + (retractall(agree_disagree(_)), + assertz(agree_disagree(disagree)))), + + +critique(String00,String01,Reasons_per_paragraph,Numbers,String02,Critique), + +agree_disagree(Pole), + + %%concat_list(["What is the future area of research from your essay about ",String01,"? 
"],Future_research_prompt), + %%trace, + %%get_string(Future_research_prompt,either,one-not-ml,"","",Future_research), +%% choose_sentence_range(String00,), + choose(N_page_ref,String00a1,String00a2,_String00a3,_String00a4,String00a5), + concat_list(["In ",String01,", automation should apply to \"",String00a5,"\" (",String00a2,", p. ",N_page_ref,")."],Future_research), + reference(String00a1), + +refs(R2), +term_to_atom([Exposition,Critique,String3ad,Future_research,R2],File_contents),open_s(File1,write,Stream),write(Stream,File_contents),close(Stream), + +%% Output essay +%%findall(_,(member(Exposition1,Exposition),Exposition1= + + +%%writeln1([Exposition,Critique,Future_research,R2]), + +write_essay(String01,Pole,Exposition,Critique,Future_research,R2,Essay,HTML), +writeln1(Essay), + + (open_s(File2,write,Stream1), +%% string_codes(BrDict3), + write(Stream1,HTML), + close(Stream1)),! + . + +%% replace("a\nb","\n","
<br>\n",F).
+%% F="a<br>\nb<br>\n".
+
+replace(A,Find,Replace,F) :-
+	split_string(A,Find,Find,B),findall([C,Replace],(member(C,B)),D),maplist(append,[D],[E]),concat_list(E,F).
+
+concat_list1(D,F) :-
+	maplist(append,[D],[E]),concat_list(E,F).
+
+write_essay(String01,Pole,Exposition,Critique,Future_research,R2,Essay,HTML) :-
+	write_heading(String01,Heading),
+	write_introduction(String01,Pole,Critique,Introduction),
+	write_exposition(Exposition,Exposition2a),
+	concat_list(["I will expose ",String01," in this half. ",Exposition2a],Exposition2),
+	%%string_concat(Exposition2,"\n",Exposition2a),
+	write_critique(Critique,Critique2a),
+	string_concat(Critique2a,"\n",Critique2b),
+	atom_string(Pole,Pole1),
+	concat_list(["I will ",Pole1," with ",String01," in this half. ",Critique2b],Critique2),
+	write_conclusion(String01,Pole,Critique,Future_research,Conclusion),
+	write_references(R2,References,Refs_no_heading),
+	concat_list([Heading,Introduction,Exposition2,Critique2,Conclusion,References],
+		Essay),
+	concat_list([Introduction,Exposition2,Critique2,Conclusion],
+		Essay2),
+	replace(Essay2,"\n","<br>",HTML1),
+	replace(Refs_no_heading,"\n","<br>",Refs_no_heading2),
+	concat_list(["<html><head><title>",String01,"</title></head><body><h1>",
+	String01,"</h1>",HTML1,"<br><br>Bibliography<br><br>",Refs_no_heading2,"</body></html>"],HTML).
+
+write_heading(String01,Heading) :-
+	concat_list([String01,"\n","\n"],Heading).
+
+write_introduction(String01,Pole1,Critique,Introduction) :-
+	%% The heading should be in the form "Author's topic"
+	atom_string(Pole1,Pole2),
+	findall([Paragraph_topic_sentence," "],(member(A,Critique),A=[_,[Paragraph_topic_sentence|_]|_]),Paragraph_topic_sentences1),
+	concat_list1(Paragraph_topic_sentences1,Paragraph_topic_sentences2),
+	concat_list(["I will critically analyse ",String01,". ",
+	"I will ",Pole2," with ",String01,". ",
+	Paragraph_topic_sentences2,"\n\n"],Introduction).
+
+write_conclusion(String01,Pole1,Critique,Future_research,Conclusion) :-
+	atom_string(Pole1,Pole2),
+	findall([Paragraph_topic_sentence," "],(member(A,Critique),A=[_,[Paragraph_topic_sentence|_]|_]),Paragraph_topic_sentences1),
+	concat_list1(Paragraph_topic_sentences1,Paragraph_topic_sentences2),
+	concat_list([
+	Paragraph_topic_sentences2,
+	"I have ",Pole2,"d with ",String01,". ",
+	Future_research,"\n\n"
+	],Conclusion).
+
+write_references(R2,References,Refs_no_head) :-
+	findall([Reference,"\n"],member(Reference,R2),References1),
+	concat_list1(References1,References2),
+	concat_list([References2],Refs_no_head),
+	concat_list(["Bibliography","\n\n",References2],References).
+
+%%a1
+%% write_exposition([[[1,"g1"]],[[1,1,_15410,_15416,"a1","a1 is in g1."],[1,2,_15352,_15358,"a2","g1 contains a2."]]],A),writeln1(A).
+%% A = "a1 a1 is in g1. a2 g1 contains a2. ".
+
+%% write_exposition([[[1,"g1"],[1,"g1"]],[[1,1,_15410,_15416,"a1","a1 is in g1."],[1,2,_15352,_15358,"a2","g1 contains a2."],[1,1,_15410,_15416,"a1","a1 is in g1."],[2,2,_15352,_15358,"a2","g1 contains a2."]]],A).
+%% A = "a1 a1 is in g1. a2 g1 contains a2. a1 a1 is in g1. a2 g1 contains a2. ".
+ +write_exposition(Exposition,Essay4b) :- + Numbers=[1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20,21,22,23,24,25], + +Exposition=[_,Exposition2], + + +findall([Essay4c%%,"\n" +],(member(Numbera11,Numbers), + + +%% not "" with findall +findall(Essay4,( +%%member(Exposition1,Exposition), +%%Exposition1=[Numbera11,_Number3a,_String3,_String3a,String5a,String3y,_String3ay,String5a1,_CNumber2aa,_CNumber3aa,CString5a1a,_CNumber2a1,_CNumber3a1,_LastCStrings,String5aaa], + + +%%output_exposition(Numbera11,Exposition2,"",Essay1), + + %%findall( Essay4,( + member(Exposition1,Exposition2),Exposition1=[Numbera11,_Number3a,_String3,_String3a,String5a,Group_link], + concat_list([String5a," ",Group_link," "],Essay4) + %%delete(Exposition,Exposition1,Exposition2) + %%output_exposition(Numbera11,Exposition2,Essay4,Essay6) +),Essay4a), +concat_list(Essay4a,Essay4c1),%%trace, +(Essay4c1=""->Essay4c="";concat_list([Essay4c1,"\n"],Essay4c)) + +),Essay4d), + +maplist(concat_list,Essay4d,Essay4f), +concat_list(Essay4f,Essay4b) +%%concat_list([Essay4e,"\n"],Essay4b) +%%concat_list([Essay1," ",Essay3],Essay2), + %%concat_list([Essay2,"\n"],Essay23)) +%%,Essay3a), + %%concat_list(Essay3a,Essay4a) +. +%% *** HTML (
not \n) + +%%%%% +%%a + +%% write_critique([[1,["heading is e12",[1,1,_25346,_25352,"e1",_25370,_25376,"e12"],[1,2,_25412,_25418,"e2",_25436,_25442,"e22",1,1,"e12",0,0,"e22","e12 is e22"]]]],A),writeln1(A). +%% A = "heading is e12 e1 e12 e2 e22 e12 is e22 \n". + +write_critique(Critique,Essay4):- + Numbers=[1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20,21,22,23,24,25], + +findall(Essay23,(member(Numbera11,Numbers), + +%% not "" with findall +%%findall(Essay22,( +member(Critique1,Critique), +%%Critique1=[Numbera11,_Number3a,_String3,_String3a,String5a,String3y,_String3ay,String5a1,_CNumber2aa,_CNumber3aa,CString5a1a,_CNumber2a1,_CNumber3a1,_LastCStrings,String5aaa], + +Critique1=[Numbera11,[Para_topic_sent,[_Number2a,_Number3a,_String3,_String3a,String5a,_String3y,_String3ay,String5a1]|Critique2]], + +concat_list([Para_topic_sent," ",String5a," ",String5a1," "],Essay0),output_critique(Numbera11,Critique2,String5a1,Essay0,Essay1), + + concat_list([Essay1,Para_topic_sent,"\n"],%%Essay22) + +%%), +Essay23)),Essay3), + concat_list(Essay3,Essay4) +. +%% *** HTML (
not \n) + +output_critique(_Numbera11,[],_String5a1a,Essay,Essay) :- !. +output_critique(Numbera11,Critique,CString5a1a,Essay1,Essay2) :- +findall( Essay6,(member(Critique1,Critique),Critique1=[Numbera11,_Number3a,_String3,_String3a,String5a,_String3y1,_String3ay,String5a1, +_CNumber2aa,_CNumber3aa,CString5a1a, + _CNumber2a1,_CNumber3a1,_LastCStrings, + String5aaa], + concat_list([String5a," ",String5a1," ",String5aaa," "],Essay4), + delete(Critique,Critique1,Critique2), + output_critique(Numbera11,Critique2,String5aaa,Essay4,Essay6) +),Essay33), +%%trace, +(Essay33=[]->concat_list([Essay1%%," " +],Essay2);%%(Essay33=[Essay3]->concat_list([Essay1," ",Essay3],Essay2);%%(Essay33=[Essay3|[E33]]-> concat_list([Essay1," ",Essay3,E33],Essay2); +(Essay33=[Essay3|E33], concat_list(E33,E34),concat_list([Essay1,%%" ", +Essay3,E34],Essay2))) +%%) +%%) +. + + +/**critique(String00,String01,Reasons_per_paragraph,Numbers,Critique). + **/ + +generate_file_name(File1,File2) :- + get_time(TS),stamp_date_time(TS,date(Year,Month,Day,Hour1,Minute1,Seconda,_A,_TZ,_False),local), + concat_list(["file",Year,Month,Day,Hour1,Minute1,Seconda,".txt"],File1), + concat_list(["file",Year,Month,Day,Hour1,Minute1,Seconda,".html"],File2). + + +explain_structure(String01,Reasons_per_paragraph,_File1) :- + concat_list(["The Short Essay Helper will automatically structure and write your essay about ",String01," with ",Reasons_per_paragraph," reasons per paragraph.","\n", + "The Helper will help write an exposition (which summarises but doesn't critique the idea), a critique (which agrees with or disagrees with the topic), the introduction and the conclusion (which state whether you agreed or disagreed with the topic, etc.). Citations will be automatically made.","\n","Note: Generated essays are not to be handed in, and you need to paraphrase and cite work you have referenced. Your grade depends on whether you agree or disagree and how many breasonings you breason out. 
Check the referencing style is appropriate for your class.","\n"],String1), + writeln(String1). + +choose_sentence_range(String00,N2,B,B1,B2,C) :- +%trace, + %length(String00,L), + %numbers(L,1,[],N), + + %trace, + %writeln1(mind_read(N1,N)), + %findall(D1,(member(D2,String00),term_to_atom(D2,D3),string_atom(D1,D3)),D4), + %trace, + mind_read(A%N1 + ,%%["[\"ref1\",\"author 2003\",1,1,\"M1\"]"]),% + String00), + writeln(here2),%% changed from mind_read to random_member + %number_string(N1,N11), + %term_to_atom(A,N11), + %get_item_n(String00,N1,A), + A=[B,B1,B2,N2,C]. + %%N2 is N1+B2-1. + %%random_member(A,String00), + +exposition(String00,_String01,Reasons_per_paragraph,Numbers,ML_db,Exposition1) :- + length(List1,5), %% 5->1 paragraphs per exposition + append(List1,_,Numbers), + length(List2,Reasons_per_paragraph), + append(List2,_,Numbers), + + %%string_codes(String001,String00), + %%writeln(String001), + findall([Number1,Exposition2],( + %%trace, +member(Number1,List1),%%concat_list(["What is group ",Number1," of 5 in the exposition that groups ideas about ",String01,"? "],String1),%%get_string(String1,either,one-not-ml,"","",%ML_db,Exposition2) +%% ** Doesn't print this +%%choose_sentence_range(String00,N_page_ref,String00a1,String00a2,_String00a3,String00a4), + choose(N_page_ref,String00a1,String00a2,_String00a3,_String00a4,String00a5), + concat_list(["",String00a5," (",String00a2,", p. ",N_page_ref,") "],Exposition2), + reference(String00a1)),Exposition3), + + + findall([Number2,Number3,String3,String3a,String5a,String5],( + member(Number2,List1),member(Number3,List2),get_item_n(Exposition3,Number2,[_,Item1]), + %%trace, + concat_list([" ","\n","The Helper will ask you how the quote you are about to enter relates to the paragraph topic."],String2b),writeln(String2b), + %%trace, +exposition2(String00,Item1,ML_db,String3,String3a,String5a,String5) + ),Exposition4), + Exposition1=[Exposition3,Exposition4]. 
+ +exposition2(String00,Item1,ML_db,String3,String3a,String5a,String5):- + (%%concat_list(["What is the paragraph number of the quote about the paragraph topic ",Item1,"? "],String2),get_number(String2,String3), + %%concat_list(["What is the sentence number of the quote about the paragraph topic ",Item1,"? "],String2a),get_number(String2a,String3a), + %%member1a([String3,String3a,String3aa],ML_db), + %%concat_list(["What is the paraphrased quote about the paragraph topic ",Item1,"? "],String4a),%%get_string(String4a,either,one-not-ml-ref,"",String3aa,String5a), + %%choose_sentence_range(String00,N_page_ref,String00a1,String00a2,_String00a3,String00a4), + choose(N_page_ref,String00a1,String00a2,_String00a3,_String00a4,String00a5), + concat_list([Item1," is because ",String00a5," (",String00a2,", p. ",N_page_ref,")."],String5a), + reference(String00a1), + + %%concat_list(["How does the quote you entered (",String5a,") relate to the paragraph topic ",Item1,"? "],String4), + %%trace, + %%SepandPad="#@~%`$?-+*^,()|.:;=_/[]<>{}\n\r\s\t\\!'0123456789",%%downcase_atom(Item1,String41a),split_string(String41a, SepandPad, SepandPad, Item1a), + + %%get_string(String4,either,one-not-ml,Item1a,String3aa,String5) + + %%choose_sentence_range(String00,N_page_ref1,String00a11,String00a21,_String00a31,String00a41), + choose(N_page_ref1,String00a11,String00a21,_String00a31,_String00a41,String00a51), + concat_list([Item1," is because \"",String00a51,"\" (",String00a21,", p. ",N_page_ref1,")."],String5), + reference(String00a11) +) + ->true;exposition2(String00,Item1,ML_db,String3,String3a,String5a,String5). 
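
Both exposition/6 above and critique/6 below trim the shared Numbers list to the first five paragraph slots (and the first Reasons_per_paragraph reason slots) with the length/append prefix idiom. A minimal sketch of that step; first_n/3 is an illustrative name, not a predicate from this file:

```prolog
%% first_n(+N, +List, -Prefix): the first N elements of List, as taken
%% by exposition/6 and critique/6 with N = 5 and
%% N = Reasons_per_paragraph. (Illustrative helper only.)
first_n(N, List, Prefix) :-
    length(Prefix, N),       %% Prefix is a list of N fresh variables
    append(Prefix, _, List). %% which append/3 binds to the head of List

%% ?- first_n(5, [1,2,3,4,5,6,7,8,9,10], P).
%% P = [1,2,3,4,5].
```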
+ +%% Agree or disagree + +critique(String00,String01,Reasons_per_paragraph,Numbers,ML_db,Critique1) :- + length(List1,5), %% 5->1 paragraphs per critique + append(List1,_,Numbers), + length(List2,Reasons_per_paragraph), + append(List2,_,Numbers), + + %%string_codes(String001,String00), + %%writeln(String001), + + retractall(critique3(_)), + assertz(critique3([])), + + + findall([Number2a,Critique2],( +%% Reason 1 + +member(Number2a,List1), +%%List1=[Number2a|List1a], +List2=[Number3a|List2a], +%%trace, + + +critique_reason1(String00,String01,Number2a,Number3a,Reasons_per_paragraph,ML_db,Critique3,Topic_paragraph_link), + +critique_reasons_2_to_n(String00,Number2a,List2a,Critique3,Reasons_per_paragraph,ML_db,Critique4), + + append_list2([[Topic_paragraph_link],Critique3,Critique4],Critique2)),Critique1). + + critique_reason1(String00,String01,Number2a,Number3a,Reasons_per_paragraph,ML_db,Critique3,Topic_paragraph_link) :- + + %%member(Number2,List1),member(Number3,List2), + %%get_item_n(Critique3,Number2a,[_,Item1]), + %%trace, + concat_list([" ","\n","The Helper will ask you for your comment on the quote you are about to enter."],String2b),writeln(String2b), + %% Later: connections + + %%trace, +critique2(String00,String01,ML_db,String3,String3a,String5a,_String3y2,String3ay,String5a1,Topic_paragraph_link), + %%),Critique4). + Critique3=[[Number2a,Number3a,String3,String3a,String5a,_String3y3,String3ay,String5a1]]. + +critique2(String00,String01,ML_db,String3,String3a,String5a,String3y,String3ay,String5a1,Topic_paragraph_link):- + (%%concat_list(["What is the paragraph number of the quote to comment on? "],String2),get_number(String2,String3), + %%concat_list(["What is the sentence number of the quote to comment on? "],String2a),get_number(String2a,String3a), + %%member1a([String3,String3a,String3aa],ML_db), + %%concat_list(["What is the paraphrased quote to comment on? 
"],String4a),%%get_string(String4a,either,one-not-ml-ref,"",String3aa,String5a), + + %%choose_sentence_range(String00,N_page_ref1,String00a11,String00a21,_String00a31,String00a41), + choose(N_page_ref1,String00a11,String00a21,_String00a31,_String00a41,String00a51), + concat_list(["\"",String00a51,"\" (",String00a21,", p. ",N_page_ref1,")."],String5a), + reference(String00a11), + + %%concat_list(["Is your comment from a quote (y/n)? "],String2yn),get_string(String2yn,either,one-not-ml,"","",String3yn), + + %%agree_disagree(Pole), + + (%%String3yn="y" + true-> + (%%concat_list(["What is the paragraph number of the comment? "],String2y),get_number(String2y,String3y), + %%concat_list(["What is the sentence number of the comment? "],String2ay),get_number(String2ay,String3ay), + %%trace, + %%member1a([String3y,String3ay,String3aay],ML_db), + + %% use either original x or paraphrase + + %%concat_list(["What is the comment that is from the quote? "],String4ac), + %%trace, + %%get_string(String4ac,either,one-not-ml-ref,"",String3aay,String5a1) + + %%choose_sentence_range(String00,N_page_ref2,String00a12,String00a22,_String00a32,String00a42), + choose(N_page_ref2,String00a12,String00a22,_String00a32,_String00a42,String00a52), + concat_list(["\"",String00a52,"\" (",String00a22,", p. ",N_page_ref2,")."],String5a1), + reference(String00a12) + + ) + + ;(String3y=0,String3ay=0, + %%concat_list(["What is the comment? "],String4ac),%%get_string(String4ac,either,one-not-ml,"","",%%String5a, +%% String5a1) + + %%choose_sentence_range(String00,N_page_ref3,String00a13,String00a23,_String00a33,String00a43), + choose(_N_page_ref3,String00a13,_String00a23,_String00a33,_String00a43,String00a53), + concat_list(["\"",String00a53,"\"."],String5a1), + reference(String00a13) + %%" (",String00a23,", p. ",N_page_ref3,")." + +)), + + %%concat_list(["How does the comment ",String5a1," relate to the essay topic ",String01,"? 
"],Topic_paragraph_link_prompt), + %%trace, + + %%downcase_and_split(String5a1,String5a1ds), + %%downcase_and_split(String01,String01ds), + %%get_string(Topic_paragraph_link_prompt,either,one-not-ml,String5a1ds,String01ds,Topic_paragraph_link) + + string_concat(String5a1_az,".",String5a1), %%choose_sentence_range(String00,N_page_ref4,String00a14,String00a24,_String00a34,String00a44), + %%choose(N_page_ref4,String00a14,String00a24,_String00a34,String00a44,String00a54), + split_string(String01,"(","(",[String01_a|_]), + + concat_list([String01_a," is because ",String5a1_az,"."%% because of ",String00a54," (",String00a24,", p. ",N_page_ref4,")." + ],Topic_paragraph_link) + %%reference(String00a14) + + /** + %% conn - choose connected comments + concat_list(["How does the quote you entered (",String5a,") relate to the paragraph topic ",Item1,"? "],String4), + %%trace, + SepandPad="#@~%`$?-+*^,()|.:;=_/[]<>{}\n\r\s\t\\!'0123456789",downcase_atom(Item1,String41a),split_string(String41a, SepandPad, SepandPad, Item1a), + + get_string(String4,either,two,Item1a,String3aa,String5) + **/ + ) + ->true;critique2(String00,String01,ML_db,String3,String3a,String5a,String3y,String3ay,String5a1,Topic_paragraph_link). 
+ + + +critique_reasons_2_to_n(String00,Number2a,List2a,Critique3,Reasons_per_paragraph,ML_db,Critique4) :- + retractall(critique3(_)), + assertz(critique3(Critique3)), +%%trace, +/** critique3(Critique31), + append(Critique31,Critique3,Critique32), + retractall(critique3(_)), + assertz(critique3(Critique32)), +**/ +%%notrace, +findall([Number2a,Number3a,String3,String3a,String5a,String3y,String3ay,String5a1, +CNumber2aa,CNumber3aa,CString5a1a, + CNumber2a1,CNumber3a1,LastCStrings, + String5aaa],( + %%member(Number2,List1a), + member(Number3a,List2a), + %%get_item_n(Critique3,Number2a,[_,Item1]), + %%trace, + concat_list([" ","\n","The Helper will ask you for your comment on the quote you are about to enter."],String2b),writeln(String2b), + %% Later: connections + + + %%trace, +critique3(String00,ML_db,%%Critique3, +String3,String3a,String5a,String3y,String3ay,String5a1, +CNumber2aa,CNumber3aa,CString5a1a, + CNumber2a1,CNumber3a1,LastCStrings, + String5aaa) + + +/** + trace, + critique3(Critique31), + append(Critique31,[[String3,String3a,String5a,String3y,String3ay,String5a1]],Critique32), + retractall(critique3(_)), + assertz(critique3(Critique32)) +**/ + +), + Critique4). + + %%Critique3=[]. + +critique3(String00,ML_db,%%Critique3, +String3,String3a,String5a,String3y,String3ay,String5a1, +CNumber2aa,CNumber3aa,CString5a1a, + CNumber2a1,CNumber3a1,LastCStrings, + String5aaa):- + (%%concat_list(["What is the paragraph number of the quote to comment on? "],String2),get_number(String2,String3), + %%concat_list(["What is the sentence number of the quote to comment on? "],String2a),get_number(String2a,String3a), + %%member1a([String3,String3a,String3aa],ML_db), + %%concat_list(["What is the paraphrased quote to comment on? 
"],String4a),%%get_string(String4a,either,one-not-ml-ref,"",String3aa,String5a), + %%trace, + %%choose_sentence_range(String00,N_page_ref4,String00a14,String00a24,_String00a34,String00a44), + choose(N_page_ref4,String00a14,String00a24,_String00a34,_String00a44,String00a54), + concat_list(["\"",String00a54,"\" (",String00a24,", p. ",N_page_ref4,")."],String5a), + reference(String00a14), + + + %%concat_list(["Is your comment from a quote (y/n)? "],String2yn),%%get_string(String2yn,either,one-not-ml,"","",String3yn), + + %%agree_disagree(Pole), + + (true->%%String3yn="y"-> + (%%concat_list(["What is the paragraph number of the comment? "],String2y),get_number(String2y,String3y), + %%concat_list(["What is the sentence number of the comment? "],String2ay),get_number(String2ay,String3ay), + %%trace, + %%member1a([String3y,String3ay,String3aay],ML_db), + + %% use either original x or paraphrase x + %%trace, + %%concat_list(["What is the comment that is from the quote? "],String4ac), + %%trace, + %%get_string(String4ac,either,one-not-ml-ref,"",String3aay,String5a1) + + %%choose_sentence_range(String00,N_page_ref5,String00a15,String00a25,_String00a35,String00a45), + choose(N_page_ref5,String00a15,String00a25,_String00a35,_String00a45,String00a55), + concat_list(["\"",String00a55,"\" (",String00a25,", p. ",N_page_ref5,")."],String5a1), + reference(String00a15) + + %%,trace + ) + + ;(String3y=0,String3ay=0, + concat_list(["What is the comment? 
"],String4ac), + + %% This is never chosen + get_string(String4ac,either,one-not-ml,"","",%%String5a, + String5a1))), + %%*** assertz recurse not findall new critique3 x + + + %%trace, + critique3(Critique31), + append(Critique31,[[0,0,String3,String3a,String5a,String3y,String3ay,String5a1]],Critique32), + retractall(critique3(_)), + assertz(critique3(Critique32)), + + critique3(Critique33), + + +(( + + + %%critique3(Critique3), + length(Critique33,LCritique3), + %%length(List1,LCritique3), + numbers(LCritique3,1,[],List1), + %%append(List1,_,Numbers), + %%Numbers=[1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20,21,22,23,24,25], + %%trace, + +findall([N," - ",CString5a1,"\n"],(member(N,List1),get_item_n(Critique33,N,[CNumber2a,CNumber3a,_CString3,_CString3a,_CString5a,_CString3y,_CString3ay,CString5a1])),CStrings1), + +findall([N,CNumber2a,CNumber3a,CString5a1],(member(N,List1),get_item_n(Critique33,N,[CNumber2a,CNumber3a,_CString31,_CString3a1,_CString5a1,_CString3y1,_CString3ay1,CString5a1])),CStrings2), + +%%trace, + +%%CStrings=[CStrings1,CStrings2], +reverse(CStrings2,CStringsR),CStringsR=[[_,CNumber2a1,CNumber3a1,LastCStrings]|CStringsR1], +reverse(CStringsR1,CStringsRR), + +reverse(CStrings1,CStringsR10),CStringsR10=[_|CStringsR11], +reverse(CStringsR11,CStringsRR1), + +append_list2(CStringsRR1,CStrings11), +concat_list(CStrings11,_CStrings12), +%%concat_list(["Please select a comment to connect the comment ",LastCStrings," to:","\n",CStrings12],ConnectionNumberPrompt), +%%get_number(ConnectionNumberPrompt,ConnectionNumber), +%numbers( + +%% *** Choose phrase which is similar to a previous phrase + +%% findall([N," - ",CString5a1,"\n"],(member(N,List1),get_item_n(Critique33,N,[CNumber2a,CNumber3a,_CString3,_CString3a,_CString5a,_CString3y,_CString3ay,CString5a1]), + + 
/**SepandPad="#@~%`$?-+*^,()|.:;=_/[]<>{}\n\r\s\t\\!'0123456789",findall([CString5a1Z2,CString5a1Z5],(downcase_atom(CString5a1,CString5a1Z1),atom_string(CString5a1Z1,CString5a1Z2),split_string(CString5a1,SepandPad,SepandPad,CString5a1Z3), + Connectors= + ["the","a","i","on","with","of","an","for","to", + "was","were","and","in","my","from","out","by"], + %% find distances between terms, closest to sent + subtract(CString5a1Z3,Connectors,CString5a1Z5), + ),CString5a1Z4), +**/ +%trace, +choose1(List1,ConnectionNumber_aa), +%number_string(ConnectionNumber,ConnectionNumber_aa), + ConnectionNumber=ConnectionNumber_aa, + member([ConnectionNumber,CNumber2aa,CNumber3aa,CString5a1a],CStringsRR), + + %%SepandPad="#@~%`$?-+*^,()|.:;=_/[]<>{}\n\r\s\t\\!'0123456789",%%downcase_atom(CString5a1a,CString5a1a1),split_string(CString5a1a1, SepandPad, SepandPad, CString5a1a2), + + %%CNumber2a1,CNumber3a1, + %%downcase_atom(LastCStrings,LastCStrings_a),split_string(LastCStrings_a, SepandPad, SepandPad, LastCStrings_a1), + + + %% conn - choose connected comments, this to a previous comment + %%trace, + + %%concat_list(["How does ",LastCStrings," connect to ",CString5a1a,"? "],ConnectionPrompt), + + %%get_string(ConnectionPrompt,either,one-not-ml,CString5a1a2,LastCStrings_a1,String5aaa) + + string_concat(LastCStrings_az,".",LastCStrings), + string_concat(CString5a1a_az1,".",CString5a1a), %%choose_sentence_range(String00,N_page_ref6,String00a16,String00a26,_String00a36,String00a46), + + split_string(LastCStrings_az,"(","(",[LastCStrings_az_a|_]), + +replace(CString5a1a_az1,"\"","",CString5a1a_az), +replace(LastCStrings_az_a,"\"","",LastCStrings_az_a1), +choose(_N_page_ref6,String00a16,_String00a26,_String00a36,_String00a46,_String00a56), + concat_list([LastCStrings_az_a1," because ",CString5a1a_az,"."%%" because ",String00a56," (",String00a26,", p. ",N_page_ref6,")." 
+ ],String5aaa), + reference(String00a16) + + + )->true;( + + %% If the section since updating dynamic critique comments fails, prevent doubling of critique comments + + critique3(Critique311), + reverse(Critique311,Critique312), + Critique312=[_|Critique313], + reverse(Critique313,Critique314), + retractall(critique3(_)), + assertz(critique3(Critique314)),fail + + )) + +/** Critique4=[String3,String3a,String5a,String3y,String3ay,String5a1, + CNumber2aa,CNumber3aa,CString5a1a, + CNumber2a1,CNumber3a1,LastCStrings, + String5aaa], + **/ + /** + critique3(Critique31), + append(Critique31,[[String3,String3a,String5a,String3y,String3ay,String5a1]],Critique32), + retractall(critique3(_)), + assertz(critique3(Critique32)) +**/ + + ) + %% Retries the predicate if it fails + ->true;critique3(String00,ML_db,%%Critique3, + String3,String3a,String5a,String3y,String3ay,String5a1, + CNumber2aa,CNumber3aa,CString5a1a, + CNumber2a1,CNumber3a1,LastCStrings, + String5aaa + ). + + + + +member1a([String3,String3a,String3aa],ML_db) :- + member([String3,String3a,String3aa],ML_db),!. + + +get_item_n(Exposition,Number1,Item) :- + Number2 is Number1-1, + length(List,Number2), + append(List,[Item|_],Exposition). + +get_number(Prompt1,Number) :- + %%concat_list(Prompt1,Prompt2), + (%%repeat, + writeln(Prompt1),read_string(user_input, "\n", "\r", _End1, String),split_string(String, ",", " ", Value1),Value1=[Value2],number_string(Number,Value2)). + + +/** +get_string(Prompt1,String1) :- + concat_list(Prompt1,Prompt2), + (repeat,write(Prompt2),read_string(String1),not(String1="")). +**/ + +equals_empty_list([]). + +downcase_and_split(String1, String2) :- +SepandPad="&#@~%`$?-+*^,()|.:;=_/[]<>{}\n\r\s\t\\\"!'0123456789", + downcase_atom(String1,String3),split_string(String3, SepandPad, SepandPad, String2). 
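
get_item_n/3 above is 1-indexed list access built from the same length/append machinery: a skip list of N-1 fresh variables is constructed, and append/3 then exposes the N-th element. For example:

```prolog
%% get_item_n(+List, +N, -Item): Item is the N-th element of List
%% (1-indexed). With N = 2, the skip list has length 1, so append/3
%% peels off [a] and binds Item to b.
%% ?- get_item_n([a,b,c], 2, X).
%% X = b.
```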
+ +get_string(Prompt2,Flag1,Flag2,ML_db0,ML_db1,String2) :- +%%writeln1(get_string(Prompt2,Flag1,Flag2,ML_db1,String2)), + %%concat_list(Prompt1,Prompt2), + %%trace, + SepandPad="&#@~%`$?-+*^,()|.:;=_/[]<>{}\n\r\s\t\\\"!'0123456789", + %%(repeat, + (Flag2=one-not-ml-ref->(concat_list(["Note: Enter in-text reference using AGPS, e.g.\n","The first work supports the second work (Surname 2000, pp. 18-9).\n","Surname (2000, pp. 18-9) states that ...\n","Remember to use words like \"also\", \"moreover\" and \"in addition\" before the sentence."],String_a1),writeln(String_a1));true), + writeln(Prompt2),read_string(user_input, "\n", "\r", _End2, String2aa),%%not(String2aa=""), + %%String2aa=[String2aaa], + downcase_atom(String2aa,String3),split_string(String3, SepandPad, SepandPad, String4), + Neg_term_list=["no","not","don","t","shouldn","wouldn","disagree","differ","dislikes","disagrees","differs","dislikes","disagreed","differed","disliked","negative","negation","non","negate","negates","negated","but","however","isn","lack"], + (Flag1=%%non_negative + positive->( + + (findall(Item11,( + + member(Item1,String4), + findall(Item1,( +member(Item2,Neg_term_list),(Item1=Item2->(write("Error: Contains the negative term \""),write(Item1),writeln("\".")))),Item11)),Item12)), + +maplist(equals_empty_list,Item12) + +); + ((Flag1=negative->((member(Item1,String4),member(Item1,Neg_term_list))->true;(writeln("Error: Contains no negative term, one of:"),writeln(Neg_term_list),fail));true)->true;Flag1=either)), + (Flag2=one-not-ml->String2=String2aa; + (Flag2=one-not-ml-ref-> + (refs(R1),writeln("What is the reference? e.g. 
Surname, A 2000, Title: Subtitle, Publisher, City.\n"),writeln("Existing references (copy one or many delimited with \"\\n\"):"), findall(_,(member(R11,R1),writeln(R11)),_),read_string(user_input, "\n", "\r", _End3, String2r),not(String2r=""),%%downcase_atom(String2r,_String3r), + String2=String2aa,split_string(String2r,"\n\r","\n\r",String2r3), + %%trace, + retractall(refs(_)),maplist(append,[[R1,String2r3]],[String2r21]), + sort1(String2r21,String2r2), + assertz(refs(String2r2))%%split_string(String3r, SepandPad, SepandPad, String4) + ); +(Flag2=one->(%%split_string(String4,SepandPad,SepandPad,String21), + writeln("Attempt 1"), + + (length(String4,Length_string1), +(check_strings_container1(Length_string1,String4,[0,0,ML_db1],[[0,0,[xxx,xxx,xxx,xxx,xxx]],[0,0,ML_db1],[999,999,[]]],_,_List2)-> + writeln("Success");(writeln("Failed"),fail)) + %%(%%data_instance_k_classification1([[0,0,[xxx,xxx,xxx,xxx,xxx]],[0,0,ML_db1]%%,[999,999,[]] + %%],[0,0,String4]%%21 + %%,1,String4a%%21 + %%), + %%String4=String4a) + %%)-> + %%(%%String4a=[_,_,String4a1], + %%writeln([found,String4a,String4]), +%%writeln("Success")); + %%(writeln("Failed"),fail) + %% + )); + (Flag2=%%[ + two,%%,P1,S1,P2,S2], + %%trace, + append([ML_db0],[ML_db1],ML_db2), + check_strings(String4,ML_db2%%,P1,S1,P2,S2 + ))))). + +reference(String2r) :- + (refs(R1),%%writeln("What is the reference? e.g. Surname, A 2000, Title: Subtitle, Publisher, City.\n"),writeln("Existing references (copy one or many delimited with \"\\n\"):"), + findall(_,(member(R11,R1),writeln(R11)),_),%%read_string(user_input, "\n", "\r", _End4, String2r), + not(String2r=""),%%downcase_atom(String2r,_String3r), + %%String2=String2aa, + split_string(String2r,"\n\r","\n\r",String2r3), + %%trace, + retractall(refs(_)),maplist(append,[[R1,String2r3]],[String2r21]), + sort1(String2r21,String2r2), + assertz(refs(String2r2))%%split_string(String3r, SepandPad, SepandPad, String4) + ). 
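+
+/** Illustrative example of reference/1 above (hypothetical data): with
+refs(["Green, L 2020, Title, Publisher, City."]) asserted,
+?- reference("Smith, A 2000, Title: Subtitle, Publisher, City.").
+prints the existing entries, then appends the new reference and re-sorts
+the store case-insensitively with sort1/2 (sort/2 also removes exact
+duplicates), keeping the bibliography in AGPS surname order. **/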
+ + +%% Sorts by first surname then title in AGPS + +sort1(List1,List2) :- + findall([C,B],(member(B,List1),downcase_atom(B,D),atom_string(D,C)),E),sort(E,F),findall(G,member([_,G],F),List2). + + +prepare_file_for_ml(String000,String021) :- + string_codes(String001,String000), + downcase_atom(String001,String00), + split_string(String00, "\n\r", "\n\r", String01), + delete(String01,"",String02), + findall(String03,(member(String02a,String02),split_string(String02a,".",".",String04), + ((String04=[String05|String06], + number_string(Number05,String05), + number_sentences(Number05,1,String06,[],String03))->true; + (findall(String08, + (SepandPad="&#@~%`$?-+*^,()|.:;=_/[]<>{}\n\r\s\t\\\"!'0123456789", + member(String04a,String04),split_string(String04a,SepandPad,SepandPad,String09), + append_list([[""],"",String09],String08)),String03)))),String0211), + append_list2(String0211,String021). + +number_sentences(_,_,[],String,String) :- !. number_sentences(Number_a,Number_b1,String06,String07,String08) :- + String06=[String00|String12], + +SepandPad="&#@~%`$?-+*^,()|.:;=_/[]<>{}\n\r\s\t\\\"!'0123456789", + split_string(String00,SepandPad,SepandPad,String09), + append_list([[Number_a],Number_b1,String09],String10), + append(String07,[String10],String11), + Number_b2 is Number_b1 + 1, + number_sentences(Number_a,Number_b2,String12,String11,String08). + + +data_instance_k_classification1(Data,I,K,C):- + maplist(o_oclass_disClass1(I),Data,DisAndClass), + keysort(DisAndClass,DSorted), + + %%writeln(DSorted), + + length(TopK,K), + append(TopK,_,DSorted), + %This is not a very good way of selecting k as you may have many values with the same distance, and the sorting just cuts these off + %Dsorted = [1-pos,3-pos,3-pos,3-neg,3-neg] + %Topk =[1-pos,3-pos,3-pos] + topk_vote(TopK,C). + +o_oclass_disClass1(O,[A,B,O2],D-[A,B,O2]):- + o_o_dis(O,O2,D). + +%% could be in either order +%% a([w,z,a,b,e,c,z,y],[1,1,[c]],[1,2,[a]]). + +%% true +%% a([a,c,b],[1,1,[a,d]],[1,2,[b]]). 
+%% a([a,c,b],[1,1,[a,c]],[1,2,[b]]). +%% a([a,b],[1,1,[a]],[1,2,[b]]). + +%% false +%% a([c,d],[1,1,[a]],[1,2,[b]]). + +%% q in "q a b" sends food like e in "c d e" + +/** + +[debug] ?- a([q,e,r,a,t,y,u,q,e,r,a,t,y,u,c,b,x,v,n,m],[1,1,[c,a,t,y,u]],[1,2,[c,a,t,y,u]]). [[1,1,[c,a,t,y,u]],[r,a,t,y,u]] +[[1,2,[c,a,t,y,u]],[r,a,t,y,u]] +true. + +X: +[debug] ?- a([q,e,r,a,t,y,u,c,b,x,v,n],[1,1,[c,a,t,y,u]],[1,2,[b,x,u]]). +[[1,1,[c,a,t,y,u]],[r,a,t,y,u]] +[[1,2,[b,x,u]],[c,b,x]] +true. + +[debug] ?- a([q,e,r,a,t,y,u,c,c,c,b,x,v,n],[1,1,[c,a,t,y,u]],[1,2,[b,x,v]]). [[1,1,[c,a,t,y,u]],[r,a,t,y,u]] +[[1,2,[b,x,v]],[c,c,c]] +true. + +**/ + +check_strings(String1,ML_db) :- + %%member([P1,S1,String2],ML_db), + %%member([P2,S2,String3],ML_db), + ML_db=[String2a,String3a], +%%writeln(["String1,String2a,String3a",String1,String2a,String3a]), + String2=[0,0,String2a], + String3=[0,0,String3a], +%%a(String1,String2,String3):- + length(String1,Length_string1), +((writeln("Attempt 1"), +check_strings_container1(Length_string1,String1,String2,[[0,0,[xxx,xxx,xxx,xxx,xxx]],String2,[999,999,[]]],_,List2), + length(List2,Length_list2), + %%Length_list3 is Length_list2+1, + %%writeln(here), + check_strings_container1(Length_list2,List2,String3,[[0,0,[xxx,xxx,xxx,xxx,xxx]],String3,[999,999,[]]],_,_List3), + writeln("Success") + %%,trace + )->true; + (writeln("Failed"), + writeln("Attempt 2"), + ((check_strings_container1(Length_string1,String1,String3,[[0,0,[xxx,xxx,xxx,xxx,xxx]],String3,[999,999,[]]],_,List2), + length(List2,Length_list2), + %%Length_list3 is Length_list2+1, + %%writeln(here), + + check_strings_container1(Length_list2,List2,String2,[[0,0,[xxx,xxx,xxx,xxx,xxx]],String2,[999,999,[]]],_,_List3))-> + writeln("Success");(writeln("Failed"),fail))) +). + +check_strings_container1(Length_string1,String1,String2,Db,List2,List2b) :- +check_strings1(Length_string1,String1,String2,Db,List2,List2b),not(var(List2b)). 
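+
+/** Sketch of the search strategy above: check_strings_container1/6
+succeeds when some contiguous sublist of String1 is classified as
+String2 by 1-nearest-neighbour (data_instance_k_classification1/4)
+against the dummy database Db; List2b is bound only on a successful
+match, which the not(var(List2b)) test verifies. **/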
+%%check_strings_container2(Length_string1,Length_string3,List1,String2,Db,List2,List2b) :- +%%check_strings2(Length_string1,Length_string3,List1,String2,Db,List2,List2b),not(var(List2b)). + + + %% finds the length of each string - from position 1 to (length of string1) of string1 + %% starts with the shortest possible string, then increases the length + %% repeat this for second string + %%string1 + + %%*** check with s2,s3 in db +check_strings1(0,_String1,_String2,_Db,List2,List2) :- !. +check_strings1(Length_string1,String1,String2,Db,List2,List2b) :- + %% finds the first str + %%length(List,Index), + length(List1,Length_string1), + append(_,List1,String1), + Length_string2 is Length_string1-1, + Length_string3 is Length_string1+1, + check_strings2(0,Length_string3,List1,String2,Db,List2,List2c), + (var(List2c)-> + check_strings1(Length_string2,String1,String2,Db,List2c,List2b);List2c=List2b),!. + +check_strings2(Length_string,Length_string,_String1,_String2,_Db,List2,List2) :- !. +check_strings2(Length_string1,Length_string3,List1,String2,Db,List2,List2b) :- + + + %%split_string(String4,SepandPad,SepandPad,String21), + + %% go through s1, removing the first word each time, until v + %%Length_string11 is Length_string1-1, + length(List11,Length_string1), + append(List11,List2,List1), + Length_string2 is Length_string1+1, + +%%writeln([list11,List11]), + + ((%%trace, + %%(List11=[z,a,b]->trace;true), + %%writeln([[_,_,List11],String2]), + %%notrace, + not(List11=[]), + +%%writeln(data_instance_k_classification1(Db +%% ,List11,1,A)), + data_instance_k_classification1(Db + %%String2 + ,List11,1,A), + %%writeln([string2,String2,list11,List11,a,A]), +A=String2,%%[_,_,List11], + %%trace, + + %%((List11=[z,a,b],A=[_,_,[z,t,b]],not(String2=[_,_,[c,z]]))->trace;notrace), + %%trace, +%%writeln([string2=list11,String2=List11]), +%%(List11=[y]->true%%trace +%%;true), +%%member(List11,Db), + %%**String2=[_,_,List11] + List2b=List2,%%,notrace + String2=[_,_,String21] + 
,writeln([found,String21,List11]) + ) + ->(true + %%,writeln(List11) + ); + check_strings2(Length_string2,Length_string3,List1,String2,Db,_List2c,List2b)),!. + +numbers(N2,N1,Numbers,Numbers) :- + N2 is N1-1,!. +numbers(N2,N1,Numbers1,Numbers2) :- + N3 is N1+1, + append(Numbers1,[N1],Numbers3), + numbers(N2,N3,Numbers3,Numbers2). + +concat_list2A(A1,B):- + A1=[A|List], + concat_list2A(A,List,B),!. + +concat_list2A(A,[],A):-!. +concat_list2A(A,List,B) :- + List=[Item|Items], + string_concat(A,Item,C), + concat_list2A(C,Items,B). + +findbest(R,R) :-!. +findbest2(R,Item):- + sort(R,RA), + reverse(RA,RB), + RB=[[_,Item]|_Rest]. diff --git a/Essay-Helper/short_essay_helper3.1_agps-mr.pl b/Essay-Helper/short_essay_helper3.1_agps-mr.pl new file mode 100644 index 0000000000000000000000000000000000000000..5c90194bbd539af4d487a3b90d16e6cd80a6e428 --- /dev/null +++ b/Essay-Helper/short_essay_helper3.1_agps-mr.pl @@ -0,0 +1,1285 @@ +%% backdated from this - which has no repeats and forward progress through essays, but has a bug in numbering pages in references, so backdated to version with repeats and random progress, made forward progress x kept this version and solved bug with pg info in db + +%% Prolog Short Essay Helper + +%% Keep 1,2,3 in aphors +%% Certain number of reasons per paragraph +%% Explains essay structure at start +%% Exposition (negative terms not allowed, asks for 5 groups from text to take optionally enough notes about, which can revise) and critique +%% Paraphrasing +%% Converts x, explains x, asks about +%% Allows citation of aphor +%% Prints citations used so far +%% Prints chapter, essay and allows changes (using text editor) at end. +%% Outputs db file and br dicts for marking later + +%% *** Rewrite essay files with e.g. 1a. before aphors not 1. so not confused with aphors + +%% short_essay_helper("file.txt",5,E),writeln1(E). + +%%:- include('distances.pl'). +:- use_module(library(date)). +:- include('../listprologinterpreter/la_strings'). 
+:- include('texttobr2qb.pl'). +:- include('sheet_feeder.pl'). + +:- dynamic critique3/1. +:- dynamic agree_disagree/1. +:- dynamic refs/1. +:- dynamic chosen_quotes/1. +:- dynamic string00_z/1. + +choose(N2,B,B1,B2,C,Item) :- +%%trace, +( + choose2(N2,B,B1,B2,C,Item)->true;(Item="* All quotes exhausted. (*)"),N2=0,B="()",B1="()",B2=0,C=""), + %%notrace, + !. +choose2(N2,B,B1,B2,List0,List0) :- +%%trace, + string00_z(String00), + choose_sentence_range(String00,N1,B,B1,B2,List0), + %%chosen_quotes(Chosen_quotes1), + %%trace, + %%length(List0,L), + %%numbers(L,1,[],N), + %%random_ + %%member(N1,N), + %% + %%random_ + %%mind_read([N1,Item10],List0), + + %%random_ + %%mind_read(Item1,Item10), + N2 is N1+B2-1, + %%get_item_n(List0,N1,Item10), + + + %%**string_codes(Item10,List), + %%notrace, + %%**split_string(List,".\n",".\n",List2), + + %%length(List2,L), + %%numbers(L,1,[],N), + %%random_ + %%member(N1,N), + %%N2 is N1+B2-1, + %%random_ + %%**member(Item1,List2), + + %%get_item_n(List2,N1,Item1), + /** + string_concat(E,D,Item1), + string_length(E,1), + downcase_atom(E,E1), + atom_string(E1,E2), + string_concat(E2,D,Item2), + string_length(E2,1), + string_concat(Item2,""%%"." + ,Item), + **/ + delete(String00,[B,B1,B2,N1,List0],String00_a), + %%**delete(String00,[B,B1,B2|_],String00_a), + %%**delete(List0,[N1,Item10],List6), + %%findall([Item3,". "],(member(Item3,List2)),List3), + %%maplist(append,[List3],[List4]), + %%concat_list(List4,_List5), + %%append(List6,[]%%List5 + %%,List7), + %%**(List6=[]->String00_b=String00_a; + %%**(%%trace, + %%**maplist(append,[[[B,B1,B2],List6]],[String00_c]), + %%**append(String00_a,[String00_c],String00_b)%%,notrace + %%**)), + retractall(string00_z(_)), + assertz(string00_z(String00_a)) + %%trace, + %%writeln1(String00_b),notrace + + + %%,not(member(Item,Chosen_quotes1)), + %%append(Chosen_quotes1,[Item],Chosen_quotes2), + %%retractall(chosen_quotes(_)), + %%assertz(chosen_quotes(Chosen_quotes2)) + . 
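+
+/** Note on choose/6 and choose2/6 above: each successful choice deletes
+the used [B,B1,B2,N1,List0] record from string00_z/1, so a quote is
+returned at most once per run; when choose2/6 fails because no quotes
+remain, choose/6 falls back to Item="* All quotes exhausted. (*)" with
+N2=0, B="()", B1="()", B2=0 and C="". **/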
+ +choose1(List0,Item) :- + mind_read(Item,List0). + +delete_invisibles_etc(F,G) :- + findall(J,(member(H,F),atom_string(H,J),not(J="."),not(J=".."),not(string_concat(".",_,J))),G). + +string(String) --> list(String). + +list([]) --> []. +list([L|Ls]) --> [L], list(Ls). + +short_essay_helper(%%Filex, + String01,Key_words, + Reasons_per_paragraph) :- + retractall(string00_z(_)), + %%assertz(string00_z([])), + + retractall(critique3(_)), + assertz(critique3([])), + + retractall(refs(_)), + assertz(refs([])), + + retractall(chosen_quotes(_)), + assertz(chosen_quotes([])), + + directory_files("sources/",F), + delete_invisibles_etc(F,G), +%%trace, +SepandPad="#@~%`$?-+*^,()|.:;=_/[]<>{}\n\r\s\t\\!'0123456789", + findall(String02h3,(member(Filex1,G), + string_concat("sources/",Filex1,Filex), + phrase_from_file_s(string(String00a), Filex), + string_codes(String02b,String00a), + atom_to_term(String02b,String02a,[]), + + (String02a=[Az,Bz,Cz|String02c]->true; + (concat_list(["Error: ",Filex," not in format [\"Surname, A 2000, Title: Subtitle, Publisher, City.\",\"Surname, A 2000\",First_Page_Num,\"\",\"\",...\"]"],Notification1),writeln(Notification1),abort)), + %%String02c=String02d, + %%trace, + findall([Az,Bz,Cz,N1,String02cb],( + + length(String02c,L), + numbers(L,1,[],N), + %%random_ + member(N1,N), + get_item_n(String02c,N1,String02ca), + + %%member(String02ca,String02c), + split_string(String02ca, ".\n\r", ".\n\r", String02cb) + + %%member(String02cb1,String02cb) + + ),String02cc), + %%maplist(append,[String02cc],[String02d]), + + %%delete(String02cc,[_,[]],String02d), + String02cc=String02d, + + findall([Az,Bz,Cz,N2,String02d2],(member([Az,Bz,Cz,N2,String02d1],String02d), + member(String02d2,String02d1), + downcase_atom(String02d2,String02e), + atom_string(String02e,String02f1), + split_string(String02f1, SepandPad, SepandPad, String02e1), + findall(String02g,(member(Key_words1,Key_words), + %%trace, + downcase_atom(Key_words1,Key_words11), + 
atom_string(Key_words11,Key_words12), +findall(Key_words12,(member(Key_words12,String02e1)),String02g) + ),String02i), + not(maplist(equals_empty_list,String02i)) + + ),String02h31), + + sort(String02h31,String02h3) + + %%prepare_file_for_ml(String00,String02a) + ),String00z1), + %%, + + %%** findall(String02h2,(member([Ay,By,Cy,String02h1],String00z1), + %%**(String02h1=[]->String02h2=[]; + maplist(append,[String00z1],[String00]),%%**) + %%**),String00z), + +%%delete(String00z,[],String00), + + +term_to_atom(Key_words,Key_words_a), +atom_string(Key_words_a,Key_words_b), + (String00=[]->(concat_list(["Error: No files in source folder or no instances of keywords ",Key_words_b," in files in source folder."],Notification2),writeln(Notification2),abort);true), + + %%maplist(append,[[String00z1]],String00), +%%maplist(append,[String00z],String00), + %%trace, +assertz(string00_z(String00)), + %%writeln1(String00), + %%notrace, + +%%writeln1(String02), + + + generate_file_name(File1,File2), + + Numbers=[1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20,21,22,23,24,25], + explain_structure(String01,Reasons_per_paragraph,File1), + exposition(String00,String01,Reasons_per_paragraph,Numbers,String02,Exposition), + + %%concat_list(["Do you agree or disagree with ",String01," (a/d) ? "],String2ad),%%get_string(String2ad,either,one-not-ml,"","",String3ad), + choose1(["a"%%,"d" + ],String3ad), + + (String3ad="a"-> + (retractall(agree_disagree(_)), + assertz(agree_disagree(agree))); + (retractall(agree_disagree(_)), + assertz(agree_disagree(disagree)))), + + +critique(String00,String01,Reasons_per_paragraph,Numbers,String02,Critique), + +agree_disagree(Pole), + + %%concat_list(["What is the future area of research from your essay about ",String01,"? 
"],Future_research_prompt), + %%trace, + %%get_string(Future_research_prompt,either,one-not-ml,"","",Future_research), +%% choose_sentence_range(String00,), + choose(N_page_ref,String00a1,String00a2,_String00a3,_String00a4,String00a5), + concat_list(["In ",String01,", automation should apply to \"",String00a5,"\" (",String00a2,", p. ",N_page_ref,")."],Future_research), + reference(String00a1), + + refs(R2), + +term_to_atom([Exposition,Critique,String3ad,Future_research,R2],File_contents),open_s(File1,write,Stream),write(Stream,File_contents),close(Stream), + +%% Output essay +%%findall(_,(member(Exposition1,Exposition),Exposition1= + + +%%writeln1([Exposition,Critique,Future_research,R2]), + +write_essay(String01,Pole,Exposition,Critique,Future_research,R2,Essay,HTML), +writeln1(Essay), + + (open_s(File2,write,Stream1), +%% string_codes(BrDict3), + write(Stream1,HTML), + close(Stream1)),! + . + +%% replace("a\nb","\n","
<br>\n",F).
+%% F="a
<br>\nb<br>
\n". + +replace(A,Find,Replace,F) :- + split_string(A,Find,Find,B),findall([C,Replace],(member(C,B)),D),maplist(append,[D],[E]),concat_list(E,F). + +concat_list1(D,F) :- + maplist(append,[D],[E]),concat_list(E,F). + +write_essay(String01,Pole,Exposition,Critique,Future_research,R2,Essay,HTML) :- + write_heading(String01,Heading), + write_introduction(String01,Pole,Critique,Introduction), + write_exposition(Exposition,Exposition2a), + concat_list(["I will expose ",String01," in this half. ",Exposition2a],Exposition2), + %%string_concat(Exposition2,"\n",Exposition2a), + write_critique(Critique,Critique2a), + string_concat(Critique2a,"\n",Critique2b), + atom_string(Pole,Pole1), + concat_list(["I will ",Pole1," with ",String01," in this half. ",Critique2b],Critique2), + write_conclusion(String01,Pole,Critique,Future_research,Conclusion), + write_references(R2,References,Refs_no_heading), + concat_list([Heading,Introduction,Exposition2,Critique2,Conclusion,References], + Essay), + concat_list([Introduction,Exposition2,Critique2,Conclusion], + Essay2), + replace(Essay2,"\n","
<br>",HTML1),
+ replace(Refs_no_heading,"\n","
<br>",Refs_no_heading2),
+ concat_list(["<html><head><title>",String01,"</title></head><body><h1>",
+ String01,"</h1>",HTML1,"<br><h1>Bibliography</h1>
",Refs_no_heading2,""],HTML). + +write_heading(String01,Heading) :- + concat_list([String01,"\n","\n"],Heading). + +write_introduction(String01,Pole1,Critique,Introduction) :- + %% The heading should be in the form "Author's topic" + atom_string(Pole1,Pole2), + findall([Paragraph_topic_sentence," "],(member(A,Critique),A=[_,[Paragraph_topic_sentence|_]|_]),Paragraph_topic_sentences1), + concat_list1(Paragraph_topic_sentences1,Paragraph_topic_sentences2), + concat_list(["I will critically analyse ",String01,". ", + "I will ",Pole2," with ",String01,". ", + Paragraph_topic_sentences2,"\n\n"],Introduction). + +write_conclusion(String01,Pole1,Critique,Future_research,Conclusion) :- + atom_string(Pole1,Pole2), + findall([Paragraph_topic_sentence," "],(member(A,Critique),A=[_,[Paragraph_topic_sentence|_]|_]),Paragraph_topic_sentences1), + concat_list1(Paragraph_topic_sentences1,Paragraph_topic_sentences2), + concat_list([ + Paragraph_topic_sentences2, + "I have ",Pole2,"d with ",String01,". ", + Future_research,"\n\n" + ],Conclusion). + +write_references(R2,References,Refs_no_head) :- + findall([Reference,"\n"],member(Reference,R2),References1), + concat_list1(References1,References2), + concat_list([References2],Refs_no_head), + concat_list(["Bibliography","\n\n",References2],References). + +%%a1 +%% write_exposition([[[1,"g1"]],[[1,1,_15410,_15416,"a1","a1 is in g1."],[1,2,_15352,_15358,"a2","g1 contains a2."]]],A),writeln1(A). +%% A = "a1 a1 is in g1. a2 g1 contains a2. ". + +%% write_exposition([[[1,"g1"],[1,"g1"]],[[1,1,_15410,_15416,"a1","a1 is in g1."],[1,2,_15352,_15358,"a2","g1 contains a2."],[1,1,_15410,_15416,"a1","a1 is in g1."],[2,2,_15352,_15358,"a2","g1 contains a2."]]],A). +%% A = "a1 a1 is in g1. a2 g1 contains a2. a1 a1 is in g1. a2 g1 contains a2. ". 
+ +write_exposition(Exposition,Essay4b) :- + Numbers=[1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20,21,22,23,24,25], + +Exposition=[_,Exposition2], + + +findall([Essay4c%%,"\n" +],(member(Numbera11,Numbers), + + +%% not "" with findall +findall(Essay4,( +%%member(Exposition1,Exposition), +%%Exposition1=[Numbera11,_Number3a,_String3,_String3a,String5a,String3y,_String3ay,String5a1,_CNumber2aa,_CNumber3aa,CString5a1a,_CNumber2a1,_CNumber3a1,_LastCStrings,String5aaa], + + +%%output_exposition(Numbera11,Exposition2,"",Essay1), + + %%findall( Essay4,( + member(Exposition1,Exposition2),Exposition1=[Numbera11,_Number3a,_String3,_String3a,String5a,Group_link], + concat_list([String5a," ",Group_link," "],Essay4) + %%delete(Exposition,Exposition1,Exposition2) + %%output_exposition(Numbera11,Exposition2,Essay4,Essay6) +),Essay4a), +concat_list(Essay4a,Essay4c1),%%trace, +(Essay4c1=""->Essay4c="";concat_list([Essay4c1,"\n"],Essay4c)) + +),Essay4d), + +maplist(concat_list,Essay4d,Essay4f), +concat_list(Essay4f,Essay4b) +%%concat_list([Essay4e,"\n"],Essay4b) +%%concat_list([Essay1," ",Essay3],Essay2), + %%concat_list([Essay2,"\n"],Essay23)) +%%,Essay3a), + %%concat_list(Essay3a,Essay4a) +. +%% *** HTML (
not \n) + +%%%%% +%%a + +%% write_critique([[1,["heading is e12",[1,1,_25346,_25352,"e1",_25370,_25376,"e12"],[1,2,_25412,_25418,"e2",_25436,_25442,"e22",1,1,"e12",0,0,"e22","e12 is e22"]]]],A),writeln1(A). +%% A = "heading is e12 e1 e12 e2 e22 e12 is e22 \n". + +write_critique(Critique,Essay4):- + Numbers=[1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20,21,22,23,24,25], + +findall(Essay23,(member(Numbera11,Numbers), + +%% not "" with findall +%%findall(Essay22,( +member(Critique1,Critique), +%%Critique1=[Numbera11,_Number3a,_String3,_String3a,String5a,String3y,_String3ay,String5a1,_CNumber2aa,_CNumber3aa,CString5a1a,_CNumber2a1,_CNumber3a1,_LastCStrings,String5aaa], + +Critique1=[Numbera11,[Para_topic_sent,[_Number2a,_Number3a,_String3,_String3a,String5a,_String3y,_String3ay,String5a1]|Critique2]], + +concat_list([Para_topic_sent," ",String5a," ",String5a1," "],Essay0),output_critique(Numbera11,Critique2,String5a1,Essay0,Essay1), + + concat_list([Essay1,Para_topic_sent,"\n"],%%Essay22) + +%%), +Essay23)),Essay3), + concat_list(Essay3,Essay4) +. +%% *** HTML (
not \n) + +output_critique(_Numbera11,[],_String5a1a,Essay,Essay) :- !. +output_critique(Numbera11,Critique,CString5a1a,Essay1,Essay2) :- +findall( Essay6,(member(Critique1,Critique),Critique1=[Numbera11,_Number3a,_String3,_String3a,String5a,_String3y1,_String3ay,String5a1, +_CNumber2aa,_CNumber3aa,CString5a1a, + _CNumber2a1,_CNumber3a1,_LastCStrings, + String5aaa], + concat_list([String5a," ",String5a1," ",String5aaa," "],Essay4), + delete(Critique,Critique1,Critique2), + output_critique(Numbera11,Critique2,String5aaa,Essay4,Essay6) +),Essay33), +%%trace, +(Essay33=[]->concat_list([Essay1%%," " +],Essay2);%%(Essay33=[Essay3]->concat_list([Essay1," ",Essay3],Essay2);%%(Essay33=[Essay3|[E33]]-> concat_list([Essay1," ",Essay3,E33],Essay2); +(Essay33=[Essay3|E33], concat_list(E33,E34),concat_list([Essay1,%%" ", +Essay3,E34],Essay2))) +%%) +%%) +. + + +/**critique(String00,String01,Reasons_per_paragraph,Numbers,Critique). + **/ + +generate_file_name(File1,File2) :- + get_time(TS),stamp_date_time(TS,date(Year,Month,Day,Hour1,Minute1,Seconda,_A,_TZ,_False),local), + concat_list(["file",Year,Month,Day,Hour1,Minute1,Seconda,".txt"],File1), + concat_list(["file",Year,Month,Day,Hour1,Minute1,Seconda,".html"],File2). + + +explain_structure(String01,Reasons_per_paragraph,_File1) :- + concat_list(["The Short Essay Helper will automatically structure and write your essay about ",String01," with ",Reasons_per_paragraph," reasons per paragraph.","\n", + "The Helper will help write an exposition (which summarises but doesn't critique the idea), a critique (which agrees with or disagrees with the topic), the introduction and the conclusion (which state whether you agreed or disagreed with the topic, etc.). Citations will be automatically made.","\n","Note: Generated essays are not to be handed in, and you need to paraphrase and cite work you have referenced. Your grade depends on whether you agree or disagree and how many breasonings you breason out. 
Check the referencing style is appropriate for your class.","\n"],String1), + writeln(String1). + +choose_sentence_range(String00,N2,B,B1,B2,C) :- + length(String00,L), + numbers(L,1,[],N), + mind_read(N1,N), + get_item_n(String00,N1,A), + A=[B,B1,B2,N2,C]. + %%N2 is N1+B2-1. + %%random_member(A,String00), + +exposition(String00,_String01,Reasons_per_paragraph,Numbers,ML_db,Exposition1) :- + length(List1,5), %% 5->1 paragraphs per exposition + append(List1,_,Numbers), + length(List2,Reasons_per_paragraph), + append(List2,_,Numbers), + + %%string_codes(String001,String00), + %%writeln(String001), + findall([Number1,Exposition2],( + %%trace, +member(Number1,List1),%%concat_list(["What is group ",Number1," of 5 in the exposition that groups ideas about ",String01,"? "],String1),%%get_string(String1,either,one-not-ml,"","",%ML_db,Exposition2) +%% ** Doesn't print this +%%choose_sentence_range(String00,N_page_ref,String00a1,String00a2,_String00a3,String00a4), + choose(N_page_ref,String00a1,String00a2,_String00a3,_String00a4,String00a5), + concat_list(["",String00a5," (",String00a2,", p. ",N_page_ref,") "],Exposition2), + reference(String00a1)),Exposition3), + + + findall([Number2,Number3,String3,String3a,String5a,String5],( + member(Number2,List1),member(Number3,List2),get_item_n(Exposition3,Number2,[_,Item1]), + %%trace, + concat_list([" ","\n","The Helper will ask you how the quote you are about to enter relates to the paragraph topic."],String2b),writeln(String2b), + %%trace, +exposition2(String00,Item1,ML_db,String3,String3a,String5a,String5) + ),Exposition4), + Exposition1=[Exposition3,Exposition4]. + +exposition2(String00,Item1,ML_db,String3,String3a,String5a,String5):- + (%%concat_list(["What is the paragraph number of the quote about the paragraph topic ",Item1,"? "],String2),get_number(String2,String3), + %%concat_list(["What is the sentence number of the quote about the paragraph topic ",Item1,"? 
"],String2a),get_number(String2a,String3a), + %%member1a([String3,String3a,String3aa],ML_db), + %%concat_list(["What is the paraphrased quote about the paragraph topic ",Item1,"? "],String4a),%%get_string(String4a,either,one-not-ml-ref,"",String3aa,String5a), + %%choose_sentence_range(String00,N_page_ref,String00a1,String00a2,_String00a3,String00a4), + choose(N_page_ref,String00a1,String00a2,_String00a3,_String00a4,String00a5), + concat_list([Item1," is because ",String00a5," (",String00a2,", p. ",N_page_ref,")."],String5a), + reference(String00a1), + + %%concat_list(["How does the quote you entered (",String5a,") relate to the paragraph topic ",Item1,"? "],String4), + %%trace, + %%SepandPad="#@~%`$?-+*^,()|.:;=_/[]<>{}\n\r\s\t\\!'0123456789",%%downcase_atom(Item1,String41a),split_string(String41a, SepandPad, SepandPad, Item1a), + + %%get_string(String4,either,one-not-ml,Item1a,String3aa,String5) + + %%choose_sentence_range(String00,N_page_ref1,String00a11,String00a21,_String00a31,String00a41), + choose(N_page_ref1,String00a11,String00a21,_String00a31,_String00a41,String00a51), + concat_list([Item1," is because \"",String00a51,"\" (",String00a21,", p. ",N_page_ref1,")."],String5), + reference(String00a11) +) + ->true;exposition2(String00,Item1,ML_db,String3,String3a,String5a,String5). 
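+
+/** Illustrative result of exposition2/7 above (invented source data):
+with the quote "the mind is a computer" drawn from page 3 of a source
+whose short citation is "Smith, A 2000", it builds sentences of the form:
+String5a = "Topic is because the mind is a computer (Smith, A 2000, p. 3)."
+String5 = "Topic is because \"the mind is a computer\" (Smith, A 2000, p. 3)."
+**/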
+ +%% Agree or disagree + +critique(String00,String01,Reasons_per_paragraph,Numbers,ML_db,Critique1) :- + length(List1,5), %% 5->1 paragraphs per critique + append(List1,_,Numbers), + length(List2,Reasons_per_paragraph), + append(List2,_,Numbers), + + %%string_codes(String001,String00), + %%writeln(String001), + + retractall(critique3(_)), + assertz(critique3([])), + + + findall([Number2a,Critique2],( +%% Reason 1 + +member(Number2a,List1), +%%List1=[Number2a|List1a], +List2=[Number3a|List2a], +%%trace, + + +critique_reason1(String00,String01,Number2a,Number3a,Reasons_per_paragraph,ML_db,Critique3,Topic_paragraph_link), + +critique_reasons_2_to_n(String00,Number2a,List2a,Critique3,Reasons_per_paragraph,ML_db,Critique4), + + append_list2([[Topic_paragraph_link],Critique3,Critique4],Critique2)),Critique1). + + critique_reason1(String00,String01,Number2a,Number3a,Reasons_per_paragraph,ML_db,Critique3,Topic_paragraph_link) :- + + %%member(Number2,List1),member(Number3,List2), + %%get_item_n(Critique3,Number2a,[_,Item1]), + %%trace, + concat_list([" ","\n","The Helper will ask you for your comment on the quote you are about to enter."],String2b),writeln(String2b), + %% Later: connections + + %%trace, +critique2(String00,String01,ML_db,String3,String3a,String5a,_String3y2,String3ay,String5a1,Topic_paragraph_link), + %%),Critique4). + Critique3=[[Number2a,Number3a,String3,String3a,String5a,_String3y3,String3ay,String5a1]]. + +critique2(String00,String01,ML_db,String3,String3a,String5a,String3y,String3ay,String5a1,Topic_paragraph_link):- + (%%concat_list(["What is the paragraph number of the quote to comment on? "],String2),get_number(String2,String3), + %%concat_list(["What is the sentence number of the quote to comment on? "],String2a),get_number(String2a,String3a), + %%member1a([String3,String3a,String3aa],ML_db), + %%concat_list(["What is the paraphrased quote to comment on? 
"],String4a),%%get_string(String4a,either,one-not-ml-ref,"",String3aa,String5a), + + %%choose_sentence_range(String00,N_page_ref1,String00a11,String00a21,_String00a31,String00a41), + choose(N_page_ref1,String00a11,String00a21,_String00a31,_String00a41,String00a51), + concat_list(["\"",String00a51,"\" (",String00a21,", p. ",N_page_ref1,")."],String5a), + reference(String00a11), + + %%concat_list(["Is your comment from a quote (y/n)? "],String2yn),get_string(String2yn,either,one-not-ml,"","",String3yn), + + %%agree_disagree(Pole), + + (%%String3yn="y" + true-> + (%%concat_list(["What is the paragraph number of the comment? "],String2y),get_number(String2y,String3y), + %%concat_list(["What is the sentence number of the comment? "],String2ay),get_number(String2ay,String3ay), + %%trace, + %%member1a([String3y,String3ay,String3aay],ML_db), + + %% use either original x or paraphrase + + %%concat_list(["What is the comment that is from the quote? "],String4ac), + %%trace, + %%get_string(String4ac,either,one-not-ml-ref,"",String3aay,String5a1) + + %%choose_sentence_range(String00,N_page_ref2,String00a12,String00a22,_String00a32,String00a42), + choose(N_page_ref2,String00a12,String00a22,_String00a32,_String00a42,String00a52), + concat_list(["\"",String00a52,"\" (",String00a22,", p. ",N_page_ref2,")."],String5a1), + reference(String00a12) + + ) + + ;(String3y=0,String3ay=0, + %%concat_list(["What is the comment? "],String4ac),%%get_string(String4ac,either,one-not-ml,"","",%%String5a, +%% String5a1) + + %%choose_sentence_range(String00,N_page_ref3,String00a13,String00a23,_String00a33,String00a43), + choose(_N_page_ref3,String00a13,_String00a23,_String00a33,_String00a43,String00a53), + concat_list(["\"",String00a53,"\"."],String5a1), + reference(String00a13) + %%" (",String00a23,", p. ",N_page_ref3,")." + +)), + + %%concat_list(["How does the comment ",String5a1," relate to the essay topic ",String01,"? 
"],Topic_paragraph_link_prompt), + %%trace, + + %%downcase_and_split(String5a1,String5a1ds), + %%downcase_and_split(String01,String01ds), + %%get_string(Topic_paragraph_link_prompt,either,one-not-ml,String5a1ds,String01ds,Topic_paragraph_link) + + string_concat(String5a1_az,".",String5a1), %%choose_sentence_range(String00,N_page_ref4,String00a14,String00a24,_String00a34,String00a44), + %%choose(N_page_ref4,String00a14,String00a24,_String00a34,String00a44,String00a54), + split_string(String01,"(","(",[String01_a|_]), + + concat_list([String01_a," is because ",String5a1_az,"."%% because of ",String00a54," (",String00a24,", p. ",N_page_ref4,")." + ],Topic_paragraph_link) + %%reference(String00a14) + + /** + %% conn - choose connected comments + concat_list(["How does the quote you entered (",String5a,") relate to the paragraph topic ",Item1,"? "],String4), + %%trace, + SepandPad="#@~%`$?-+*^,()|.:;=_/[]<>{}\n\r\s\t\\!'0123456789",downcase_atom(Item1,String41a),split_string(String41a, SepandPad, SepandPad, Item1a), + + get_string(String4,either,two,Item1a,String3aa,String5) + **/ + ) + ->true;critique2(String00,String01,ML_db,String3,String3a,String5a,String3y,String3ay,String5a1,Topic_paragraph_link). 
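+
+/**
+Shape of the result (a sketch inferred from the clauses above):
+critique/6 returns Critique1 as a list of [Number2a,Critique2] pairs,
+one per paragraph, where Critique2 = [Topic_paragraph_link|Reason_records],
+the first reason record coming from critique_reason1/8 and the remaining
+Reasons_per_paragraph-1 records from critique_reasons_2_to_n/7.
+**/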
+ + + +critique_reasons_2_to_n(String00,Number2a,List2a,Critique3,Reasons_per_paragraph,ML_db,Critique4) :- + retractall(critique3(_)), + assertz(critique3(Critique3)), +%%trace, +/** critique3(Critique31), + append(Critique31,Critique3,Critique32), + retractall(critique3(_)), + assertz(critique3(Critique32)), +**/ +%%notrace, +findall([Number2a,Number3a,String3,String3a,String5a,String3y,String3ay,String5a1, +CNumber2aa,CNumber3aa,CString5a1a, + CNumber2a1,CNumber3a1,LastCStrings, + String5aaa],( + %%member(Number2,List1a), + member(Number3a,List2a), + %%get_item_n(Critique3,Number2a,[_,Item1]), + %%trace, + concat_list([" ","\n","The Helper will ask you for your comment on the quote you are about to enter."],String2b),writeln(String2b), + %% Later: connections + + + %%trace, +critique3(String00,ML_db,%%Critique3, +String3,String3a,String5a,String3y,String3ay,String5a1, +CNumber2aa,CNumber3aa,CString5a1a, + CNumber2a1,CNumber3a1,LastCStrings, + String5aaa) + + +/** + trace, + critique3(Critique31), + append(Critique31,[[String3,String3a,String5a,String3y,String3ay,String5a1]],Critique32), + retractall(critique3(_)), + assertz(critique3(Critique32)) +**/ + +), + Critique4). + + %%Critique3=[]. + +critique3(String00,ML_db,%%Critique3, +String3,String3a,String5a,String3y,String3ay,String5a1, +CNumber2aa,CNumber3aa,CString5a1a, + CNumber2a1,CNumber3a1,LastCStrings, + String5aaa):- + (%%concat_list(["What is the paragraph number of the quote to comment on? "],String2),get_number(String2,String3), + %%concat_list(["What is the sentence number of the quote to comment on? "],String2a),get_number(String2a,String3a), + %%member1a([String3,String3a,String3aa],ML_db), + %%concat_list(["What is the paraphrased quote to comment on? 
"],String4a),%%get_string(String4a,either,one-not-ml-ref,"",String3aa,String5a), + %%trace, + %%choose_sentence_range(String00,N_page_ref4,String00a14,String00a24,_String00a34,String00a44), + choose(N_page_ref4,String00a14,String00a24,_String00a34,_String00a44,String00a54), + concat_list(["\"",String00a54,"\" (",String00a24,", p. ",N_page_ref4,")."],String5a), + reference(String00a14), + + + %%concat_list(["Is your comment from a quote (y/n)? "],String2yn),%%get_string(String2yn,either,one-not-ml,"","",String3yn), + + %%agree_disagree(Pole), + + (true->%%String3yn="y"-> + (%%concat_list(["What is the paragraph number of the comment? "],String2y),get_number(String2y,String3y), + %%concat_list(["What is the sentence number of the comment? "],String2ay),get_number(String2ay,String3ay), + %%trace, + %%member1a([String3y,String3ay,String3aay],ML_db), + + %% use either original x or paraphrase x + %%trace, + %%concat_list(["What is the comment that is from the quote? "],String4ac), + %%trace, + %%get_string(String4ac,either,one-not-ml-ref,"",String3aay,String5a1) + + %%choose_sentence_range(String00,N_page_ref5,String00a15,String00a25,_String00a35,String00a45), + choose(N_page_ref5,String00a15,String00a25,_String00a35,_String00a45,String00a55), + concat_list(["\"",String00a55,"\" (",String00a25,", p. ",N_page_ref5,")."],String5a1), + reference(String00a15) + + %%,trace + ) + + ;(String3y=0,String3ay=0, + concat_list(["What is the comment? 
"],String4ac), + + %% This is never chosen + get_string(String4ac,either,one-not-ml,"","",%%String5a, + String5a1))), + %%*** assertz recurse not findall new critique3 x + + + %%trace, + critique3(Critique31), + append(Critique31,[[0,0,String3,String3a,String5a,String3y,String3ay,String5a1]],Critique32), + retractall(critique3(_)), + assertz(critique3(Critique32)), + + critique3(Critique33), + + +(( + + + %%critique3(Critique3), + length(Critique33,LCritique3), + %%length(List1,LCritique3), + numbers(LCritique3,1,[],List1), + %%append(List1,_,Numbers), + %%Numbers=[1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20,21,22,23,24,25], + %%trace, + +findall([N," - ",CString5a1,"\n"],(member(N,List1),get_item_n(Critique33,N,[CNumber2a,CNumber3a,_CString3,_CString3a,_CString5a,_CString3y,_CString3ay,CString5a1])),CStrings1), + +findall([N,CNumber2a,CNumber3a,CString5a1],(member(N,List1),get_item_n(Critique33,N,[CNumber2a,CNumber3a,_CString31,_CString3a1,_CString5a1,_CString3y1,_CString3ay1,CString5a1])),CStrings2), + +%%trace, + +%%CStrings=[CStrings1,CStrings2], +reverse(CStrings2,CStringsR),CStringsR=[[_,CNumber2a1,CNumber3a1,LastCStrings]|CStringsR1], +reverse(CStringsR1,CStringsRR), + +reverse(CStrings1,CStringsR10),CStringsR10=[_|CStringsR11], +reverse(CStringsR11,CStringsRR1), + +append_list2(CStringsRR1,CStrings11), +concat_list(CStrings11,_CStrings12), +%%concat_list(["Please select a comment to connect the comment ",LastCStrings," to:","\n",CStrings12],ConnectionNumberPrompt), +%%get_number(ConnectionNumberPrompt,ConnectionNumber), +%numbers( + +%% *** Choose phrase which is similar to a previous phrase + +%% findall([N," - ",CString5a1,"\n"],(member(N,List1),get_item_n(Critique33,N,[CNumber2a,CNumber3a,_CString3,_CString3a,_CString5a,_CString3y,_CString3ay,CString5a1]), + + 
/**SepandPad="#@~%`$?-+*^,()|.:;=_/[]<>{}\n\r\s\t\\!'0123456789",findall([CString5a1Z2,CString5a1Z5],(downcase_atom(CString5a1,CString5a1Z1),atom_string(CString5a1Z1,CString5a1Z2),split_string(CString5a1,SepandPad,SepandPad,CString5a1Z3), + Connectors= + ["the","a","i","on","with","of","an","for","to", + "was","were","and","in","my","from","out","by"], + %% find distances between terms, closest to sent + subtract(CString5a1Z3,Connectors,CString5a1Z5), + ),CString5a1Z4), +**/ +choose1(List1,ConnectionNumber), + member([ConnectionNumber,CNumber2aa,CNumber3aa,CString5a1a],CStringsRR), + + %%SepandPad="#@~%`$?-+*^,()|.:;=_/[]<>{}\n\r\s\t\\!'0123456789",%%downcase_atom(CString5a1a,CString5a1a1),split_string(CString5a1a1, SepandPad, SepandPad, CString5a1a2), + + %%CNumber2a1,CNumber3a1, + %%downcase_atom(LastCStrings,LastCStrings_a),split_string(LastCStrings_a, SepandPad, SepandPad, LastCStrings_a1), + + + %% conn - choose connected comments, this to a previous comment + %%trace, + + %%concat_list(["How does ",LastCStrings," connect to ",CString5a1a,"? "],ConnectionPrompt), + + %%get_string(ConnectionPrompt,either,one-not-ml,CString5a1a2,LastCStrings_a1,String5aaa) + + string_concat(LastCStrings_az,".",LastCStrings), + string_concat(CString5a1a_az1,".",CString5a1a), %%choose_sentence_range(String00,N_page_ref6,String00a16,String00a26,_String00a36,String00a46), + + split_string(LastCStrings_az,"(","(",[LastCStrings_az_a|_]), + +replace(CString5a1a_az1,"\"","",CString5a1a_az), +replace(LastCStrings_az_a,"\"","",LastCStrings_az_a1), +choose(_N_page_ref6,String00a16,_String00a26,_String00a36,_String00a46,_String00a56), + concat_list([LastCStrings_az_a1," because ",CString5a1a_az,"."%%" because ",String00a56," (",String00a26,", p. ",N_page_ref6,")." 
+ ],String5aaa), + reference(String00a16) + + + )->true;( + + %% If the section since updating dynamic critique comments fails, prevent doubling of critique comments + + critique3(Critique311), + reverse(Critique311,Critique312), + Critique312=[_|Critique313], + reverse(Critique313,Critique314), + retractall(critique3(_)), + assertz(critique3(Critique314)),fail + + )) + +/** Critique4=[String3,String3a,String5a,String3y,String3ay,String5a1, + CNumber2aa,CNumber3aa,CString5a1a, + CNumber2a1,CNumber3a1,LastCStrings, + String5aaa], + **/ + /** + critique3(Critique31), + append(Critique31,[[String3,String3a,String5a,String3y,String3ay,String5a1]],Critique32), + retractall(critique3(_)), + assertz(critique3(Critique32)) +**/ + + ) + %% Retries the predicate if it fails + ->true;critique3(String00,ML_db,%%Critique3, + String3,String3a,String5a,String3y,String3ay,String5a1, + CNumber2aa,CNumber3aa,CString5a1a, + CNumber2a1,CNumber3a1,LastCStrings, + String5aaa + ). + + + + +member1a([String3,String3a,String3aa],ML_db) :- + member([String3,String3a,String3aa],ML_db),!. + + +get_item_n(Exposition,Number1,Item) :- + Number2 is Number1-1, + length(List,Number2), + append(List,[Item|_],Exposition). + +get_number(Prompt1,Number) :- + %%concat_list(Prompt1,Prompt2), + (%%repeat, + writeln(Prompt1),read_string(user_input, "\n", "\r", _End1, String),split_string(String, ",", " ", Value1),Value1=[Value2],number_string(Number,Value2)). + + +/** +get_string(Prompt1,String1) :- + concat_list(Prompt1,Prompt2), + (repeat,write(Prompt2),read_string(String1),not(String1="")). +**/ + +equals_empty_list([]). + +downcase_and_split(String1, String2) :- +SepandPad="&#@~%`$?-+*^,()|.:;=_/[]<>{}\n\r\s\t\\\"!'0123456789", + downcase_atom(String1,String3),split_string(String3, SepandPad, SepandPad, String2). 
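+
+/**
+Example (SWI-Prolog) of downcase_and_split/2, which lowercases a string
+and splits it into words on the separator characters above:
+?- downcase_and_split("One Two",W).
+W = ["one", "two"].
+**/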
+ +get_string(Prompt2,Flag1,Flag2,ML_db0,ML_db1,String2) :- +%%writeln1(get_string(Prompt2,Flag1,Flag2,ML_db1,String2)), + %%concat_list(Prompt1,Prompt2), + %%trace, + SepandPad="&#@~%`$?-+*^,()|.:;=_/[]<>{}\n\r\s\t\\\"!'0123456789", + %%(repeat, + (Flag2=one-not-ml-ref->(concat_list(["Note: Enter in-text reference using AGPS, e.g.\n","The first work supports the second work (Surname 2000, pp. 18-9).\n","Surname (2000, pp. 18-9) states that ...\n","Remember to use words like \"also\", \"moreover\" and \"in addition\" before the sentence."],String_a1),writeln(String_a1));true), + writeln(Prompt2),read_string(user_input, "\n", "\r", _End2, String2aa),%%not(String2aa=""), + %%String2aa=[String2aaa], + downcase_atom(String2aa,String3),split_string(String3, SepandPad, SepandPad, String4), + Neg_term_list=["no","not","don","t","shouldn","wouldn","disagree","differ","dislikes","disagrees","differs","dislikes","disagreed","differed","disliked","negative","negation","non","negate","negates","negated","but","however","isn","lack"], + (Flag1=%%non_negative + positive->( + + (findall(Item11,( + + member(Item1,String4), + findall(Item1,( +member(Item2,Neg_term_list),(Item1=Item2->(write("Error: Contains the negative term \""),write(Item1),writeln("\".")))),Item11)),Item12)), + +maplist(equals_empty_list,Item12) + +); + ((Flag1=negative->((member(Item1,String4),member(Item1,Neg_term_list))->true;(writeln("Error: Contains no negative term, one of:"),writeln(Neg_term_list),fail));true)->true;Flag1=either)), + (Flag2=one-not-ml->String2=String2aa; + (Flag2=one-not-ml-ref-> + (refs(R1),writeln("What is the reference? e.g. 
Surname, A 2000, Title: Subtitle, Publisher, City.\n"),writeln("Existing references (copy one or many delimited with \"\\n\"):"), findall(_,(member(R11,R1),writeln(R11)),_),read_string(user_input, "\n", "\r", _End3, String2r),not(String2r=""),%%downcase_atom(String2r,_String3r), + String2=String2aa,split_string(String2r,"\n\r","\n\r",String2r3), + %%trace, + retractall(refs(_)),maplist(append,[[R1,String2r3]],[String2r21]), + sort1(String2r21,String2r2), + assertz(refs(String2r2))%%split_string(String3r, SepandPad, SepandPad, String4) + ); +(Flag2=one->(%%split_string(String4,SepandPad,SepandPad,String21), + writeln("Attempt 1"), + + (length(String4,Length_string1), +(check_strings_container1(Length_string1,String4,[0,0,ML_db1],[[0,0,[xxx,xxx,xxx,xxx,xxx]],[0,0,ML_db1],[999,999,[]]],_,_List2)-> + writeln("Success");(writeln("Failed"),fail)) + %%(%%data_instance_k_classification1([[0,0,[xxx,xxx,xxx,xxx,xxx]],[0,0,ML_db1]%%,[999,999,[]] + %%],[0,0,String4]%%21 + %%,1,String4a%%21 + %%), + %%String4=String4a) + %%)-> + %%(%%String4a=[_,_,String4a1], + %%writeln([found,String4a,String4]), +%%writeln("Success")); + %%(writeln("Failed"),fail) + %% + )); + (Flag2=%%[ + two,%%,P1,S1,P2,S2], + %%trace, + append([ML_db0],[ML_db1],ML_db2), + check_strings(String4,ML_db2%%,P1,S1,P2,S2 + ))))). + +reference(String2r) :- + (refs(R1),%%writeln("What is the reference? e.g. Surname, A 2000, Title: Subtitle, Publisher, City.\n"),writeln("Existing references (copy one or many delimited with \"\\n\"):"), + findall(_,(member(R11,R1),writeln(R11)),_),%%read_string(user_input, "\n", "\r", _End4, String2r), + not(String2r=""),%%downcase_atom(String2r,_String3r), + %%String2=String2aa, + split_string(String2r,"\n\r","\n\r",String2r3), + %%trace, + retractall(refs(_)),maplist(append,[[R1,String2r3]],[String2r21]), + sort1(String2r21,String2r2), + assertz(refs(String2r2))%%split_string(String3r, SepandPad, SepandPad, String4) + ). 
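+
+/**
+Example (SWI-Prolog) of reference/1, which appends a citation to the
+dynamic refs/1 store and re-sorts it with sort1/2:
+?- retractall(refs(_)), assertz(refs([])),
+   reference("Surname, A 2000, Title: Subtitle, Publisher, City."),
+   refs(R).
+R = ["Surname, A 2000, Title: Subtitle, Publisher, City."].
+**/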
+
+
+%% Sorts by first surname then title in AGPS
+
+sort1(List1,List2) :-
+ findall([C,B],(member(B,List1),downcase_atom(B,D),atom_string(D,C)),E),sort(E,F),findall(G,member([_,G],F),List2).
+
+
+prepare_file_for_ml(String000,String021) :-
+ string_codes(String001,String000),
+ downcase_atom(String001,String00),
+ split_string(String00, "\n\r", "\n\r", String01),
+ delete(String01,"",String02),
+ findall(String03,(member(String02a,String02),split_string(String02a,".",".",String04),
+ ((String04=[String05|String06],
+ number_string(Number05,String05),
+ number_sentences(Number05,1,String06,[],String03))->true;
+ (findall(String08,
+ (SepandPad="&#@~%`$?-+*^,()|.:;=_/[]<>{}\n\r\s\t\\\"!'0123456789",
+ member(String04a,String04),split_string(String04a,SepandPad,SepandPad,String09),
+ append_list([[""],"",String09],String08)),String03)))),String0211),
+ append_list2(String0211,String021).
+
+number_sentences(_,_,[],String,String) :- !.
+number_sentences(Number_a,Number_b1,String06,String07,String08) :-
+ String06=[String00|String12],
+
+SepandPad="&#@~%`$?-+*^,()|.:;=_/[]<>{}\n\r\s\t\\\"!'0123456789",
+ split_string(String00,SepandPad,SepandPad,String09),
+ append_list([[Number_a],Number_b1,String09],String10),
+ append(String07,[String10],String11),
+ Number_b2 is Number_b1 + 1,
+ number_sentences(Number_a,Number_b2,String12,String11,String08).
+
+
+data_instance_k_classification1(Data,I,K,C):-
+ maplist(o_oclass_disClass1(I),Data,DisAndClass),
+ keysort(DisAndClass,DSorted),
+
+ %%writeln(DSorted),
+
+ length(TopK,K),
+ append(TopK,_,DSorted),
+ %This is not a very good way of selecting k as you may have many values with the same distance, and the sorting just cuts these off
+ %Dsorted = [1-pos,3-pos,3-pos,3-neg,3-neg]
+ %Topk =[1-pos,3-pos,3-pos]
+ topk_vote(TopK,C).
+
+o_oclass_disClass1(O,[A,B,O2],D-[A,B,O2]):-
+ o_o_dis(O,O2,D).
+
+%% could be in either order
+%% a([w,z,a,b,e,c,z,y],[1,1,[c]],[1,2,[a]]).
+
+%% true
+%% a([a,c,b],[1,1,[a,d]],[1,2,[b]]).
+%% a([a,c,b],[1,1,[a,c]],[1,2,[b]]). +%% a([a,b],[1,1,[a]],[1,2,[b]]). + +%% false +%% a([c,d],[1,1,[a]],[1,2,[b]]). + +%% q in "q a b" sends food like e in "c d e" + +/** + +[debug] ?- a([q,e,r,a,t,y,u,q,e,r,a,t,y,u,c,b,x,v,n,m],[1,1,[c,a,t,y,u]],[1,2,[c,a,t,y,u]]). [[1,1,[c,a,t,y,u]],[r,a,t,y,u]] +[[1,2,[c,a,t,y,u]],[r,a,t,y,u]] +true. + +X: +[debug] ?- a([q,e,r,a,t,y,u,c,b,x,v,n],[1,1,[c,a,t,y,u]],[1,2,[b,x,u]]). +[[1,1,[c,a,t,y,u]],[r,a,t,y,u]] +[[1,2,[b,x,u]],[c,b,x]] +true. + +[debug] ?- a([q,e,r,a,t,y,u,c,c,c,b,x,v,n],[1,1,[c,a,t,y,u]],[1,2,[b,x,v]]). [[1,1,[c,a,t,y,u]],[r,a,t,y,u]] +[[1,2,[b,x,v]],[c,c,c]] +true. + +**/ + +check_strings(String1,ML_db) :- + %%member([P1,S1,String2],ML_db), + %%member([P2,S2,String3],ML_db), + ML_db=[String2a,String3a], +%%writeln(["String1,String2a,String3a",String1,String2a,String3a]), + String2=[0,0,String2a], + String3=[0,0,String3a], +%%a(String1,String2,String3):- + length(String1,Length_string1), +((writeln("Attempt 1"), +check_strings_container1(Length_string1,String1,String2,[[0,0,[xxx,xxx,xxx,xxx,xxx]],String2,[999,999,[]]],_,List2), + length(List2,Length_list2), + %%Length_list3 is Length_list2+1, + %%writeln(here), + check_strings_container1(Length_list2,List2,String3,[[0,0,[xxx,xxx,xxx,xxx,xxx]],String3,[999,999,[]]],_,_List3), + writeln("Success") + %%,trace + )->true; + (writeln("Failed"), + writeln("Attempt 2"), + ((check_strings_container1(Length_string1,String1,String3,[[0,0,[xxx,xxx,xxx,xxx,xxx]],String3,[999,999,[]]],_,List2), + length(List2,Length_list2), + %%Length_list3 is Length_list2+1, + %%writeln(here), + + check_strings_container1(Length_list2,List2,String2,[[0,0,[xxx,xxx,xxx,xxx,xxx]],String2,[999,999,[]]],_,_List3))-> + writeln("Success");(writeln("Failed"),fail))) +). + +check_strings_container1(Length_string1,String1,String2,Db,List2,List2b) :- +check_strings1(Length_string1,String1,String2,Db,List2,List2b),not(var(List2b)). 
+%%check_strings_container2(Length_string1,Length_string3,List1,String2,Db,List2,List2b) :- +%%check_strings2(Length_string1,Length_string3,List1,String2,Db,List2,List2b),not(var(List2b)). + + + %% finds the length of each string - from position 1 to (length of string1) of string1 + %% starts with the shortest possible string, then increases the length + %% repeat this for second string + %%string1 + + %%*** check with s2,s3 in db +check_strings1(0,_String1,_String2,_Db,List2,List2) :- !. +check_strings1(Length_string1,String1,String2,Db,List2,List2b) :- + %% finds the first str + %%length(List,Index), + length(List1,Length_string1), + append(_,List1,String1), + Length_string2 is Length_string1-1, + Length_string3 is Length_string1+1, + check_strings2(0,Length_string3,List1,String2,Db,List2,List2c), + (var(List2c)-> + check_strings1(Length_string2,String1,String2,Db,List2c,List2b);List2c=List2b),!. + +check_strings2(Length_string,Length_string,_String1,_String2,_Db,List2,List2) :- !. +check_strings2(Length_string1,Length_string3,List1,String2,Db,List2,List2b) :- + + + %%split_string(String4,SepandPad,SepandPad,String21), + + %% go through s1, removing the first word each time, until v + %%Length_string11 is Length_string1-1, + length(List11,Length_string1), + append(List11,List2,List1), + Length_string2 is Length_string1+1, + +%%writeln([list11,List11]), + + ((%%trace, + %%(List11=[z,a,b]->trace;true), + %%writeln([[_,_,List11],String2]), + %%notrace, + not(List11=[]), + +%%writeln(data_instance_k_classification1(Db +%% ,List11,1,A)), + data_instance_k_classification1(Db + %%String2 + ,List11,1,A), + %%writeln([string2,String2,list11,List11,a,A]), +A=String2,%%[_,_,List11], + %%trace, + + %%((List11=[z,a,b],A=[_,_,[z,t,b]],not(String2=[_,_,[c,z]]))->trace;notrace), + %%trace, +%%writeln([string2=list11,String2=List11]), +%%(List11=[y]->true%%trace +%%;true), +%%member(List11,Db), + %%**String2=[_,_,List11] + List2b=List2,%%,notrace + String2=[_,_,String21] + 
,writeln([found,String21,List11]) + ) + ->(true + %%,writeln(List11) + ); + check_strings2(Length_string2,Length_string3,List1,String2,Db,_List2c,List2b)),!. + +numbers(N2,N1,Numbers,Numbers) :- + N2 is N1-1,!. +numbers(N2,N1,Numbers1,Numbers2) :- + N3 is N1+1, + append(Numbers1,[N1],Numbers3), + numbers(N2,N3,Numbers3,Numbers2). + + +mind_read(Item,List) :- + trialy2(List,R1), + findbest(R1,Item),!. + %%random_member(Item,List),!. + +trialy2(List,R) :- +%%writeln([list,List]), +%%notrace, + length(List,Length), + /** + ((Length=<9-> + findr4(R4), + number_string(R4,R4A), + formr5([R4A],9,Length,R5), + findr(R5,List,R)); + (Length=<99-> + findr4(R41), + findr4(R42), + formr5([R41,R42],99,Length,R5), + findr(R5,List,R)); + (Length=<999-> + findr4(R41), + findr4(R42), + findr4(R43), + formr5([R41,R42,R43],999,Length,R5), + findr(R5,List,R)); + fail), + %%writeln([r,R]),trace. + true. + **/ + +log(Length,A),log(10,C),B is floor(A/C)+1, +numbers(B,1,[],D), +findall(R,(member(_E,D),findr4(R1),number_string(R1,R)),RL), +B2 is floor(10^((floor(A/C)+1))-1), +formr5(RL,B2,Length,R5), +findr(R5,List,R) + +. + +findr4(R4) :- + List1=[0,1,2,3,4,5,6,7,8,9], + Trials is 30, + +%catch( + (trialy22(List1,Trials,[],R1), + findbest2(R1,R4) + %writeln1([item,Item]) + ) + %_, + %findr4(R4) + %) + . + + %%number_string(R3,R2), +formr5(RList,Upper,Length,R5) :- + %%findall(D,(member(C,RList),floor(C,D)),RList2), + concat_list2A(RList,R5A), + number_string(R5B,R5A), + %%R5B=R5A, + R51 is floor((R5B/Upper)*Length), + (R5B=Upper->R5 is R51-1;R5=R51). +findr(R4,List,R) :- + %%floor(R4,R4A), + length(A,R4), + append(A,[R|_],List). + + %%random_member(A,List), + %%R=[[_,A]]. + + /** + length(List,L), + Trials is L*3, + trialy22(List,Trials,[],R).**/ + +trialy22([],_,R,R) :- !. +trialy22(List,Trials,RA,RB) :- + List=[Item|Items], + trialy21(Item,Trials,R1), + append(RA,[R1],RC), + trialy22(Items,Trials,RC,RB),!. 
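+
+/**
+Example (SWI-Prolog): formr5/4 scales a list of digit strings onto an
+index into a list of the given length, and findr/3 returns the element
+at that index:
+?- formr5(["3"],9,5,R5), findr(R5,[a,b,c,d,e],R).
+R5 = 1,
+R = b.
+**/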
+ +trialy21(Label,Trials,RA) :- + trialy3(Trials,[],R), + aggregate_all(count, member(true,R), Count), + RA=[Count,Label]. + +trialy3(0,R,R) :-!. +trialy3(Trials1,RA,RB) :- + trialy1(R1), + append(RA,[R1],RC), + Trials2 is Trials1-1, + trialy3(Trials2,RC,RB),!. + +%% try other nouns +trialy1(R1) :- + %%control11(A1), + trial0(A22), %% Control + sum(A22,0,S22), + mean(S22,A1), + trial0(A21), %% Test 1 + sum(A21,0,S02), + mean(S02,A2), + (A1>A2->R1=true;R1=fail). + +trial0(S3) :- N is 10, +catch( + (trial1(N,[],S),trial01(S,S3)), + _, + trial0(S3)). +trial01(S1,S3) :- + sort(S1,S), + %%midpoint(S,MP), + halves(S,H1,H2), + midpoint(H1,Q1), + midpoint(H2,Q3), + IQR is Q3-Q1, + sum(S,0,S02), + mean(S02,Mean), + furthestfrommean(S,Mean,V), + D1 is 1.5*IQR, + D2 is V-Mean, + (D2>D1->(delete(S,V,S2),trial01(S2,S3));S=S3). + +trial1(0,A,A) :- !. +trial1(N,A,B) :- mindreadtest(S), append(A,[S],A2), + N1 is N-1,trial1(N1,A2,B). + +midpoint(S,MP) :- + length(S,L), + A is mod(L,2), + (A is 0-> + (M1 is L/2, M2 is M1+1,N1 is M1-1,N2 is M2-1,length(N11,N1),length(N21,N2),append(N11,[N12|_Rest1],S),append(N21,[N22|_Rest2],S),MP is (N12+N22)/2) + ; + (L2 is L+1, M1 is L2/2, N1 is M1-1,length(N11,N1),append(N11,[MP|_Rest],S))). + +halves(S,H1,H2) :- + length(S,L), + A is mod(L,2), + (A is 0-> + (M1 is L/2,length(H1,M1),append(H1,H2,S)) + ; + (L2 is L-1,M1 is L2/2,length(H1,M1),append(H1,[_|H2],S))). + +sum([],S,S):-!. +sum(S0,S1,S2) :- + S0=[S3|S4], + S5 is S1+S3, + sum(S4,S5,S2). + +mean(Sum,Mean) :- + Mean is Sum/2. + +furthestfrommean(S,Mean,V) :- + absdiffmean(S,Mean,[],D), + sort(D,D1), + reverse(D1,[[_,V]|_Rest]). + +absdiffmean([],_M,D,D) :- !. +absdiffmean(S,M,D1,D2) :- + S=[S1|S2], + S3 is abs(S1-M), + append(D1,[[S3,S1]],D3), + absdiffmean(S2,M,D3,D2). 
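+
+/**
+Examples (SWI-Prolog) of the order statistics used by trial01/2 for its
+1.5*IQR outlier trim:
+?- halves([1,2,3,4],H1,H2).
+H1 = [1, 2],
+H2 = [3, 4].
+?- midpoint([1,2,3,4],MP).
+MP = 2.5.
+**/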
+ +mindreadtest(Sec) :- + %% 250 br for characters to be br out with 10 br each from person to me - do when initial 250 br test done and doing 10 br test + %%comment(fiftyastest), + %%random(X),X1 is 10*X, X2 is floor(X1), (X2=<2 -> ( + %%texttobr,writeln(['true test']), %%); %% use breasonings breasoned out by computer for not by me, for job medicine for "me", at last time point + %%true), %% leave last time point blank + %%**texttobr2(640);true),%% make an A to detect reaction to gracious giving or blame of in following + get_time(TimeStamp1), + %%phrase_from_file(string(_String), 'file.txt'), + texttobr2(2), %% 100 As for answer (must be br before this on same day) + %% is gracious giving or blame + get_time(TimeStamp2), + %%comment(turnoffas), + Sec is TimeStamp2 - TimeStamp1. + +concat_list2A(A1,B):- + A1=[A|List], + concat_list2A(A,List,B),!. + +concat_list2A(A,[],A):-!. +concat_list2A(A,List,B) :- + List=[Item|Items], + string_concat(A,Item,C), + concat_list2A(C,Items,B). + +findbest(R,R) :-!. +findbest2(R,Item):- + sort(R,RA), + reverse(RA,RB), + RB=[[_,Item]|_Rest]. 
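+
+/**
+Example (SWI-Prolog) of findbest2/2, which picks the label with the
+highest [Count,Label] trial score:
+?- findbest2([[2,a],[5,b],[3,c]],Item).
+Item = b.
+**/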
diff --git a/Essay-Helper/short_essay_helper3.1_agps.pl b/Essay-Helper/short_essay_helper3.1_agps.pl new file mode 100644 index 0000000000000000000000000000000000000000..4415589aa225868a65666cc25a31525858c8e7e9 --- /dev/null +++ b/Essay-Helper/short_essay_helper3.1_agps.pl @@ -0,0 +1,1109 @@ +%% backdated from this - which has no repeats and forward progress through essays, but has a bug in numbering pages in references, so backdated to version with repeats and random progress, made forward progress x kept this version and solved bug with pg info in db + +%% Prolog Short Essay Helper + +%% Keep 1,2,3 in aphors +%% Certain number of reasons per paragraph +%% Explains essay structure at start +%% Exposition (negative terms not allowed, asks for 5 groups from text to take optionally enough notes about, which can revise) and critique +%% Paraphrasing +%% Converts x, explains x, asks about +%% Allows citation of aphor +%% Prints citations used so far +%% Prints chapter, essay and allows changes (using text editor) at end. +%% Outputs db file and br dicts for marking later + +%% *** Rewrite essay files with e.g. 1a. before aphors not 1. so not confused with aphors + +%% short_essay_helper("file.txt",5,E),writeln1(E). + +%%:- include('distances.pl'). +:- use_module(library(date)). +:- include('../listprologinterpreter/la_strings'). +:- include('sheet_feeder.pl'). + +:- dynamic critique3/1. +:- dynamic agree_disagree/1. +:- dynamic refs/1. +:- dynamic chosen_quotes/1. +:- dynamic string00_z/1. +:- dynamic num_paras_exp/1. +:- dynamic num_paras_crit/1. + +choose(N2,B,B1,B2,C,Item) :- +%%trace, +( + choose2(N2,B,B1,B2,C,Item)->true;(Item="* All quotes exhausted. (*)"),N2=0,B="()",B1="()",B2=0,C=""), + %%notrace, + !. 
+choose2(N2,B,B1,B2,List0,List0) :- +%%trace, + string00_z(String00), + choose_sentence_range(String00,N1,B,B1,B2,List0), + %%chosen_quotes(Chosen_quotes1), + %%trace, + %%length(List0,L), + %%numbers(L,1,[],N), + %%random_ + %%member(N1,N), + %% + %%random_ + %%member([N1,Item10],List0), + + %%random_ + %%**member(Item1,Item10), + N2 is N1+B2-1, + %%get_item_n(List0,N1,Item10), + + + %%**string_codes(Item10,List), + %%notrace, + %%**split_string(List,".\n",".\n",List2), + + %%length(List2,L), + %%numbers(L,1,[],N), + %%random_ + %%member(N1,N), + %%N2 is N1+B2-1, + %%random_ + %%**member(Item1,List2), + + %%get_item_n(List2,N1,Item1), + + /**string_concat(E,D,Item1), + string_length(E,1), + downcase_atom(E,E1), + atom_string(E1,E2), + string_concat(E2,D,Item2), + string_length(E2,1), + string_concat(Item2,""%%"." + ,Item), + **/ + delete(String00,[B,B1,B2,N1,List0],String00_a), + %%trace, + %%writeln1([[b,b1,b2,n2,list0],[B,B1,B2,N1,List0]]), + %%writeln1([string00,String00]), + %%writeln1([string00_a,String00_a]), + %%**delete(String00,[B,B1,B2|_],String00_a), + %%**delete(List0,[N1,Item10],List6), + %%findall([Item3,". "],(member(Item3,List2)),List3), + %%maplist(append,[List3],[List4]), + %%concat_list(List4,_List5), + %%append(List6,[]%%List5 + %%,List7), + %%**(List6=[]->String00_b=String00_a; + %%**(%%trace, + %%**maplist(append,[[[B,B1,B2],List6]],[String00_c]), + %%**append(String00_a,[String00_c],String00_b)%%,notrace + %%**)), + retractall(string00_z(_)), + assertz(string00_z(String00_a)) + %%trace, + %%writeln1(String00_b),notrace + + + %%,not(member(Item,Chosen_quotes1)), + %%append(Chosen_quotes1,[Item],Chosen_quotes2), + %%retractall(chosen_quotes(_)), + %%assertz(chosen_quotes(Chosen_quotes2)) + . + +choose1(List0,Item) :- + random_member(Item,List0). + +delete_invisibles_etc(F,G) :- + findall(J,(member(H,F),atom_string(H,J),not(J="."),not(J=".."),not(string_concat(".",_,J))),G). + +string(String) --> list(String). + +list([]) --> []. 
+list([L|Ls]) --> [L], list(Ls). + + +short_essay_helper(%%Filex, + String01,Key_words, + Reasons_per_paragraph) :- + + retractall(num_paras_exp(_)), + assertz(num_paras_exp(5)), + + retractall(num_paras_crit(_)), + assertz(num_paras_crit(5)), + + retractall(string00_z(_)), + %%assertz(string00_z([])), + + retractall(critique3(_)), + assertz(critique3([])), + + retractall(refs(_)), + assertz(refs([])), + + retractall(chosen_quotes(_)), + assertz(chosen_quotes([])), + + directory_files("sources/",F), + delete_invisibles_etc(F,G), +%%trace, +SepandPad="#@~%`$?-+*^,()|.:;=_/[]<>{}\n\r\s\t\\!'0123456789", + findall(String02h3,(member(Filex1,G), + string_concat("sources/",Filex1,Filex), + phrase_from_file_s(string(String00a), Filex), + string_codes(String02b,String00a), + atom_to_term(String02b,String02a,[]), + + (String02a=[Az,Bz,Cz|String02c]->true; + (concat_list(["Error: ",Filex," not in format [\"Surname, A 2000, Title: Subtitle, Publisher, City.\",\"Surname, A 2000\",First_Page_Num,\"\",\"\",...\"]"],Notification1),writeln(Notification1),abort)), + %%String02c=String02d, + %%trace, + findall([Az,Bz,Cz,N1,String02cb],( + + length(String02c,L), + numbers(L,1,[],N), + %%random_ + member(N1,N), + get_item_n(String02c,N1,String02ca), + + %%member(String02ca,String02c), + split_string(String02ca, ".\n\r", ".\n\r", String02cb) + + %%member(String02cb1,String02cb) + + ),String02cc), + %%maplist(append,[String02cc],[String02d]), + + %%delete(String02cc,[_,[]],String02d), + String02cc=String02d, + + findall([Az,Bz,Cz,N2,String02d2],(member([Az,Bz,Cz,N2,String02d1],String02d), + member(String02d2,String02d1), + downcase_atom(String02d2,String02e), + atom_string(String02e,String02f1), + split_string(String02f1, SepandPad, SepandPad, String02e1), + findall(String02g,(member(Key_words1,Key_words), + %%trace, + downcase_atom(Key_words1,Key_words11), + atom_string(Key_words11,Key_words12), +findall(Key_words12,(member(Key_words12,String02e1)),String02g) + ),String02i), + 
not(maplist(equals_empty_list,String02i)) + + ),String02h31), + + sort(String02h31,String02h3) + + %%prepare_file_for_ml(String00,String02a) + ),String00z1), + + %%, + + %%trace, + %%writeln1([string00z1,String00z1]), +%%findall(String02h2,(member([Ay,By,Cy,N1,String02h1],String00z1), + %% (String02h1=[]->String02h2=[]; + maplist(append,[String00z1],[String00]),%%) + %%),String00z), + +%%delete(String00z,[],String00), + + +term_to_atom(Key_words,Key_words_a), +atom_string(Key_words_a,Key_words_b), + (String00=[]->(concat_list(["Error: No files in source folder or no instances of keywords ",Key_words_b," in files in source folder."],Notification2),writeln(Notification2),abort);true), + + %%maplist(append,[[String00z1]],String00), +%%maplist(append,[String00z],String00), + %%trace, +assertz(string00_z(String00)), + %%writeln1([string00,String00]), + %%notrace, + +%%writeln1(String02), + + + generate_file_name(File1,File2), + + Numbers=[1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20,21,22,23,24,25], + explain_structure(String01,Reasons_per_paragraph,File1), + exposition(String00,String01,Reasons_per_paragraph,Numbers,String02,Exposition), + + %%concat_list(["Do you agree or disagree with ",String01," (a/d) ? "],String2ad),%%get_string(String2ad,either,one-not-ml,"","",String3ad), + choose1(["a"%%,"d" + ],String3ad), + + (String3ad="a"-> + (retractall(agree_disagree(_)), + assertz(agree_disagree(agree))); + (retractall(agree_disagree(_)), + assertz(agree_disagree(disagree)))), + + +critique(String00,String01,Reasons_per_paragraph,Numbers,String02,Critique), + +agree_disagree(Pole), + + %%concat_list(["What is the future area of research from your essay about ",String01,"? 
"],Future_research_prompt), + %%trace, + %%get_string(Future_research_prompt,either,one-not-ml,"","",Future_research), +%% choose_sentence_range(String00,), + choose(N_page_ref,String00a1,String00a2,_String00a3,_String00a4,String00a5), + concat_list(["In ",String01,", automation should apply to \"",String00a5,"\" (",String00a2,", p. ",N_page_ref,")."],Future_research), + reference(String00a1), + +term_to_atom([Exposition,Critique,Future_research],File_contents),open_s(File1,write,Stream),write(Stream,File_contents),close(Stream), + +%% Output essay +%%findall(_,(member(Exposition1,Exposition),Exposition1= + +refs(R2), + +%%writeln1([Exposition,Critique,Future_research,R2]), + +write_essay(String01,Pole,Exposition,Critique,Future_research,R2,Essay,HTML), +writeln1(Essay), + + (open_s(File2,write,Stream1), +%% string_codes(BrDict3), + write(Stream1,HTML), + close(Stream1)),! + . + +%% replace("a\nb","\n","
\n",F). +%% F="a
\nb
\n". + +replace(A,Find,Replace,F) :- + split_string(A,Find,Find,B),findall([C,Replace],(member(C,B)),D),maplist(append,[D],[E]),concat_list(E,F). + +concat_list1(D,F) :- + maplist(append,[D],[E]),concat_list(E,F). + +write_essay(String01,Pole,Exposition,Critique,Future_research,R2,Essay,HTML) :- + write_heading(String01,Heading), + write_introduction(String01,Pole,Critique,Introduction), + write_exposition(Exposition,Exposition2a), + concat_list(["I will expose ",String01," in this half. ",Exposition2a],Exposition2), + %%string_concat(Exposition2,"\n",Exposition2a), + write_critique(Critique,Critique2a), + string_concat(Critique2a,"\n",Critique2b), + atom_string(Pole,Pole1), + concat_list(["I will ",Pole1," with ",String01," in this half. ",Critique2b],Critique2), + write_conclusion(String01,Pole,Critique,Future_research,Conclusion), + write_references(R2,References,Refs_no_heading), + concat_list([Heading,Introduction,Exposition2,Critique2,Conclusion,References], + Essay), + concat_list([Introduction,Exposition2,Critique2,Conclusion], + Essay2), + replace(Essay2,"\n","
",HTML1), + replace(Refs_no_heading,"\n","
",Refs_no_heading2), + concat_list(["",String01,"

", + String01,"

",HTML1,"

Bibliography

",Refs_no_heading2,""],HTML). + +write_heading(String01,Heading) :- + concat_list([String01,"\n","\n"],Heading). + +write_introduction(String01,Pole1,Critique,Introduction) :- + %% The heading should be in the form "Author's topic" + atom_string(Pole1,Pole2), + findall([Paragraph_topic_sentence," "],(member(A,Critique),A=[_,[Paragraph_topic_sentence|_]|_]),Paragraph_topic_sentences1), + concat_list1(Paragraph_topic_sentences1,Paragraph_topic_sentences2), + concat_list(["I will critically analyse ",String01,". ", + "I will ",Pole2," with ",String01,". ", + Paragraph_topic_sentences2,"\n\n"],Introduction). + +write_conclusion(String01,Pole1,Critique,Future_research,Conclusion) :- + atom_string(Pole1,Pole2), + findall([Paragraph_topic_sentence," "],(member(A,Critique),A=[_,[Paragraph_topic_sentence|_]|_]),Paragraph_topic_sentences1), + concat_list1(Paragraph_topic_sentences1,Paragraph_topic_sentences2), + concat_list([ + Paragraph_topic_sentences2, + "I have ",Pole2,"d with ",String01,". ", + Future_research,"\n\n" + ],Conclusion). + +write_references(R2,References,Refs_no_head) :- + findall([Reference,"\n"],member(Reference,R2),References1), + concat_list1(References1,References2), + concat_list([References2],Refs_no_head), + concat_list(["Bibliography","\n\n",References2],References). + +%%a1 +%% write_exposition([[[1,"g1"]],[[1,1,_15410,_15416,"a1","a1 is in g1."],[1,2,_15352,_15358,"a2","g1 contains a2."]]],A),writeln1(A). +%% A = "a1 a1 is in g1. a2 g1 contains a2. ". + +%% write_exposition([[[1,"g1"],[1,"g1"]],[[1,1,_15410,_15416,"a1","a1 is in g1."],[1,2,_15352,_15358,"a2","g1 contains a2."],[1,1,_15410,_15416,"a1","a1 is in g1."],[2,2,_15352,_15358,"a2","g1 contains a2."]]],A). +%% A = "a1 a1 is in g1. a2 g1 contains a2. a1 a1 is in g1. a2 g1 contains a2. ". 
+
+write_exposition(Exposition,Essay4b) :-
+ Numbers=[1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20,21,22,23,24,25],
+
+Exposition=[_,Exposition2],
+
+
+findall([Essay4c%%,"\n"
+],(member(Numbera11,Numbers),
+
+
+%% not "" with findall
+findall(Essay4,(
+%%member(Exposition1,Exposition),
+%%Exposition1=[Numbera11,_Number3a,_String3,_String3a,String5a,String3y,_String3ay,String5a1,_CNumber2aa,_CNumber3aa,CString5a1a,_CNumber2a1,_CNumber3a1,_LastCStrings,String5aaa],
+
+
+%%output_exposition(Numbera11,Exposition2,"",Essay1),
+
+ %%findall( Essay4,(
+ member(Exposition1,Exposition2),Exposition1=[Numbera11,_Number3a,_String3,_String3a,String5a,Group_link],
+ concat_list([String5a," ",Group_link," "],Essay4)
+ %%delete(Exposition,Exposition1,Exposition2)
+ %%output_exposition(Numbera11,Exposition2,Essay4,Essay6)
+),Essay4a),
+concat_list(Essay4a,Essay4c1),%%trace,
+(Essay4c1=""->Essay4c="";concat_list([Essay4c1,"\n"],Essay4c))
+
+),Essay4d),
+
+maplist(concat_list,Essay4d,Essay4f),
+concat_list(Essay4f,Essay4b)
+%%concat_list([Essay4e,"\n"],Essay4b)
+%%concat_list([Essay1," ",Essay3],Essay2),
+ %%concat_list([Essay2,"\n"],Essay23))
+%%,Essay3a),
+ %%concat_list(Essay3a,Essay4a)
+.
+%% *** HTML (<br> not \n)
+
+%%%%%
+%%a
+
+%% write_critique([[1,["heading is e12",[1,1,_25346,_25352,"e1",_25370,_25376,"e12"],[1,2,_25412,_25418,"e2",_25436,_25442,"e22",1,1,"e12",0,0,"e22","e12 is e22"]]]],A),writeln1(A).
+%% A = "heading is e12 e1 e12 e2 e22 e12 is e22 \n".
+
+write_critique(Critique,Essay4):-
+ Numbers=[1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20,21,22,23,24,25],
+
+findall(Essay23,(member(Numbera11,Numbers),
+
+%% not "" with findall
+%%findall(Essay22,(
+member(Critique1,Critique),
+%%Critique1=[Numbera11,_Number3a,_String3,_String3a,String5a,String3y,_String3ay,String5a1,_CNumber2aa,_CNumber3aa,CString5a1a,_CNumber2a1,_CNumber3a1,_LastCStrings,String5aaa],
+
+Critique1=[Numbera11,[Para_topic_sent,[_Number2a,_Number3a,_String3,_String3a,String5a,_String3y,_String3ay,String5a1]|Critique2]],
+
+concat_list([Para_topic_sent," ",String5a," ",String5a1," "],Essay0),output_critique(Numbera11,Critique2,String5a1,Essay0,Essay1),
+
+ concat_list([Essay1,Para_topic_sent,"\n"],%%Essay22)
+
+%%),
+Essay23)),Essay3),
+ concat_list(Essay3,Essay4)
+.
+%% *** HTML (<br> not \n)
+
+output_critique(_Numbera11,[],_String5a1a,Essay,Essay) :- !.
+output_critique(Numbera11,Critique,CString5a1a,Essay1,Essay2) :-
+findall( Essay6,(member(Critique1,Critique),Critique1=[Numbera11,_Number3a,_String3,_String3a,String5a,_String3y1,_String3ay,String5a1,
+_CNumber2aa,_CNumber3aa,CString5a1a,
+ _CNumber2a1,_CNumber3a1,_LastCStrings,
+ String5aaa],
+ concat_list([String5a," ",String5a1," ",String5aaa," "],Essay4),
+ delete(Critique,Critique1,Critique2),
+ output_critique(Numbera11,Critique2,String5aaa,Essay4,Essay6)
+),Essay33),
+%%trace,
+(Essay33=[]->concat_list([Essay1%%," "
+],Essay2);%%(Essay33=[Essay3]->concat_list([Essay1," ",Essay3],Essay2);%%(Essay33=[Essay3|[E33]]-> concat_list([Essay1," ",Essay3,E33],Essay2);
+(Essay33=[Essay3|E33], concat_list(E33,E34),concat_list([Essay1,%%" ",
+Essay3,E34],Essay2)))
+%%)
+%%)
+.
+
+
+/**critique(String00,String01,Reasons_per_paragraph,Numbers,Critique).
+ **/
+
+generate_file_name(File1,File2) :-
+ get_time(TS),stamp_date_time(TS,date(Year,Month,Day,Hour1,Minute1,Seconda,_A,_TZ,_False),local),
+ concat_list(["file",Year,Month,Day,Hour1,Minute1,Seconda,".txt"],File1),
+ concat_list(["file",Year,Month,Day,Hour1,Minute1,Seconda,".html"],File2).
+
+
+explain_structure(String01,Reasons_per_paragraph,_File1) :-
+ concat_list(["The Short Essay Helper will automatically structure and write your essay about ",String01," with ",Reasons_per_paragraph," reasons per paragraph.","\n",
+ "The Helper will help write an exposition (which summarises but doesn't critique the idea), a critique (which agrees with or disagrees with the topic), the introduction and the conclusion (which state whether you agreed or disagreed with the topic, etc.). Citations will be automatically made.","\n","Note: Generated essays are not to be handed in, and you need to paraphrase and cite work you have referenced. Your grade depends on whether you agree or disagree and how many breasonings you breason out. 
Check the referencing style is appropriate for your class.","\n"],String1), + writeln(String1). + +choose_sentence_range(String00,N2,B,B1,B2,C) :- + length(String00,L), + numbers(L,1,[],N), + %%random_ + member(N1,N), + get_item_n(String00,N1,A), + A=[B,B1,B2,N2,C]. + %%N2 is N1+B2-1. + %%random_member(A,String00), + +exposition(String00,_String01,Reasons_per_paragraph,Numbers,ML_db,Exposition1) :- + num_paras_exp(Num_paras_exp), + + length(List1,Num_paras_exp), %% 5->1 paragraphs per exposition + append(List1,_,Numbers), + length(List2,Reasons_per_paragraph), + append(List2,_,Numbers), + + %%string_codes(String001,String00), + %%writeln(String001), + findall([Number1,Exposition2],( + %%trace, +member(Number1,List1),%%concat_list(["What is group ",Number1," of 5 in the exposition that groups ideas about ",String01,"? "],String1),%%get_string(String1,either,one-not-ml,"","",%ML_db,Exposition2) +%% ** Doesn't print this +%%choose_sentence_range(String00,N_page_ref,String00a1,String00a2,_String00a3,String00a4), + choose(N_page_ref,String00a1,String00a2,_String00a3,_String00a4,String00a5), + concat_list(["",String00a5," (",String00a2,", p. ",N_page_ref,") "],Exposition2), + reference(String00a1)),Exposition3), + + + findall([Number2,Number3,String3,String3a,String5a,String5],( + member(Number2,List1),member(Number3,List2),get_item_n(Exposition3,Number2,[_,Item1]), + %%trace, + concat_list([" ","\n","The Helper will ask you how the quote you are about to enter relates to the paragraph topic."],String2b),writeln(String2b), + %%trace, +exposition2(String00,Item1,ML_db,String3,String3a,String5a,String5) + ),Exposition4), + Exposition1=[Exposition3,Exposition4]. + +exposition2(String00,Item1,ML_db,String3,String3a,String5a,String5):- + (%%concat_list(["What is the paragraph number of the quote about the paragraph topic ",Item1,"? "],String2),get_number(String2,String3), + %%concat_list(["What is the sentence number of the quote about the paragraph topic ",Item1,"? 
"],String2a),get_number(String2a,String3a), + %%member1a([String3,String3a,String3aa],ML_db), + %%concat_list(["What is the paraphrased quote about the paragraph topic ",Item1,"? "],String4a),%%get_string(String4a,either,one-not-ml-ref,"",String3aa,String5a), + %%choose_sentence_range(String00,N_page_ref,String00a1,String00a2,_String00a3,String00a4), + choose(N_page_ref,String00a1,String00a2,_String00a3,_String00a4,String00a5), + concat_list([Item1," is because ",String00a5," (",String00a2,", p. ",N_page_ref,")."],String5a), + reference(String00a1), + + %%concat_list(["How does the quote you entered (",String5a,") relate to the paragraph topic ",Item1,"? "],String4), + %%trace, + %%SepandPad="#@~%`$?-+*^,()|.:;=_/[]<>{}\n\r\s\t\\!'0123456789",%%downcase_atom(Item1,String41a),split_string(String41a, SepandPad, SepandPad, Item1a), + + %%get_string(String4,either,one-not-ml,Item1a,String3aa,String5) + + %%choose_sentence_range(String00,N_page_ref1,String00a11,String00a21,_String00a31,String00a41), + choose(N_page_ref1,String00a11,String00a21,_String00a31,_String00a41,String00a51), + concat_list([Item1," is because \"",String00a51,"\" (",String00a21,", p. ",N_page_ref1,")."],String5), + reference(String00a11) +) + ->true;exposition2(String00,Item1,ML_db,String3,String3a,String5a,String5). 
+ +%% Agree or disagree + +critique(String00,String01,Reasons_per_paragraph,Numbers,ML_db,Critique1) :- + num_paras_crit(Num_paras_crit), + + length(List1,Num_paras_crit), %% 5->1 paragraphs per critique + append(List1,_,Numbers), + length(List2,Reasons_per_paragraph), + append(List2,_,Numbers), + + %%string_codes(String001,String00), + %%writeln(String001), + + retractall(critique3(_)), + assertz(critique3([])), + + + findall([Number2a,Critique2],( +%% Reason 1 + +member(Number2a,List1), +%%List1=[Number2a|List1a], +List2=[Number3a|List2a], +%%trace, + + +critique_reason1(String00,String01,Number2a,Number3a,Reasons_per_paragraph,ML_db,Critique3,Topic_paragraph_link), + +critique_reasons_2_to_n(String00,Number2a,List2a,Critique3,Reasons_per_paragraph,ML_db,Critique4), + + append_list2([[Topic_paragraph_link],Critique3,Critique4],Critique2)),Critique1). + + critique_reason1(String00,String01,Number2a,Number3a,Reasons_per_paragraph,ML_db,Critique3,Topic_paragraph_link) :- + + %%member(Number2,List1),member(Number3,List2), + %%get_item_n(Critique3,Number2a,[_,Item1]), + %%trace, + concat_list([" ","\n","The Helper will ask you for your comment on the quote you are about to enter."],String2b),writeln(String2b), + %% Later: connections + + %%trace, +critique2(String00,String01,ML_db,String3,String3a,String5a,_String3y2,String3ay,String5a1,Topic_paragraph_link), + %%),Critique4). + Critique3=[[Number2a,Number3a,String3,String3a,String5a,_String3y3,String3ay,String5a1]]. + +critique2(String00,String01,ML_db,String3,String3a,String5a,String3y,String3ay,String5a1,Topic_paragraph_link):- + (%%concat_list(["What is the paragraph number of the quote to comment on? "],String2),get_number(String2,String3), + %%concat_list(["What is the sentence number of the quote to comment on? "],String2a),get_number(String2a,String3a), + %%member1a([String3,String3a,String3aa],ML_db), + %%concat_list(["What is the paraphrased quote to comment on? 
"],String4a),%%get_string(String4a,either,one-not-ml-ref,"",String3aa,String5a), + + %%choose_sentence_range(String00,N_page_ref1,String00a11,String00a21,_String00a31,String00a41), + choose(N_page_ref1,String00a11,String00a21,_String00a31,_String00a41,String00a51), + concat_list(["\"",String00a51,"\" (",String00a21,", p. ",N_page_ref1,")."],String5a), + reference(String00a11), + + %%concat_list(["Is your comment from a quote (y/n)? "],String2yn),get_string(String2yn,either,one-not-ml,"","",String3yn), + + %%agree_disagree(Pole), + + (%%String3yn="y" + true-> + (%%concat_list(["What is the paragraph number of the comment? "],String2y),get_number(String2y,String3y), + %%concat_list(["What is the sentence number of the comment? "],String2ay),get_number(String2ay,String3ay), + %%trace, + %%member1a([String3y,String3ay,String3aay],ML_db), + + %% use either original x or paraphrase + + %%concat_list(["What is the comment that is from the quote? "],String4ac), + %%trace, + %%get_string(String4ac,either,one-not-ml-ref,"",String3aay,String5a1) + + %%choose_sentence_range(String00,N_page_ref2,String00a12,String00a22,_String00a32,String00a42), + choose(N_page_ref2,String00a12,String00a22,_String00a32,_String00a42,String00a52), + concat_list(["\"",String00a52,"\" (",String00a22,", p. ",N_page_ref2,")."],String5a1), + reference(String00a12) + + ) + + ;(String3y=0,String3ay=0, + %%concat_list(["What is the comment? "],String4ac),%%get_string(String4ac,either,one-not-ml,"","",%%String5a, +%% String5a1) + + %%choose_sentence_range(String00,N_page_ref3,String00a13,String00a23,_String00a33,String00a43), + choose(_N_page_ref3,String00a13,_String00a23,_String00a33,_String00a43,String00a53), + concat_list(["\"",String00a53,"\"."],String5a1), + reference(String00a13) + %%" (",String00a23,", p. ",N_page_ref3,")." + +)), + + %%concat_list(["How does the comment ",String5a1," relate to the essay topic ",String01,"? 
"],Topic_paragraph_link_prompt), + %%trace, + + %%downcase_and_split(String5a1,String5a1ds), + %%downcase_and_split(String01,String01ds), + %%get_string(Topic_paragraph_link_prompt,either,one-not-ml,String5a1ds,String01ds,Topic_paragraph_link) + + string_concat(String5a1_az,".",String5a1), %%choose_sentence_range(String00,N_page_ref4,String00a14,String00a24,_String00a34,String00a44), + %%choose(N_page_ref4,String00a14,String00a24,_String00a34,String00a44,String00a54), + split_string(String01,"(","(",[String01_a|_]), + + concat_list([String01_a," is because ",String5a1_az,"."%% because of ",String00a54," (",String00a24,", p. ",N_page_ref4,")." + ],Topic_paragraph_link) + %%reference(String00a14) + + /** + %% conn - choose connected comments + concat_list(["How does the quote you entered (",String5a,") relate to the paragraph topic ",Item1,"? "],String4), + %%trace, + SepandPad="#@~%`$?-+*^,()|.:;=_/[]<>{}\n\r\s\t\\!'0123456789",downcase_atom(Item1,String41a),split_string(String41a, SepandPad, SepandPad, Item1a), + + get_string(String4,either,two,Item1a,String3aa,String5) + **/ + ) + ->true;critique2(String00,String01,ML_db,String3,String3a,String5a,String3y,String3ay,String5a1,Topic_paragraph_link). 
+ + + +critique_reasons_2_to_n(String00,Number2a,List2a,Critique3,Reasons_per_paragraph,ML_db,Critique4) :- + retractall(critique3(_)), + assertz(critique3(Critique3)), +%%trace, +/** critique3(Critique31), + append(Critique31,Critique3,Critique32), + retractall(critique3(_)), + assertz(critique3(Critique32)), +**/ +%%notrace, +findall([Number2a,Number3a,String3,String3a,String5a,String3y,String3ay,String5a1, +CNumber2aa,CNumber3aa,CString5a1a, + CNumber2a1,CNumber3a1,LastCStrings, + String5aaa],( + %%member(Number2,List1a), + member(Number3a,List2a), + %%get_item_n(Critique3,Number2a,[_,Item1]), + %%trace, + concat_list([" ","\n","The Helper will ask you for your comment on the quote you are about to enter."],String2b),writeln(String2b), + %% Later: connections + + + %%trace, +critique3(String00,ML_db,%%Critique3, +String3,String3a,String5a,String3y,String3ay,String5a1, +CNumber2aa,CNumber3aa,CString5a1a, + CNumber2a1,CNumber3a1,LastCStrings, + String5aaa) + + +/** + trace, + critique3(Critique31), + append(Critique31,[[String3,String3a,String5a,String3y,String3ay,String5a1]],Critique32), + retractall(critique3(_)), + assertz(critique3(Critique32)) +**/ + +), + Critique4). + + %%Critique3=[]. + +critique3(String00,ML_db,%%Critique3, +String3,String3a,String5a,String3y,String3ay,String5a1, +CNumber2aa,CNumber3aa,CString5a1a, + CNumber2a1,CNumber3a1,LastCStrings, + String5aaa):- + (%%concat_list(["What is the paragraph number of the quote to comment on? "],String2),get_number(String2,String3), + %%concat_list(["What is the sentence number of the quote to comment on? "],String2a),get_number(String2a,String3a), + %%member1a([String3,String3a,String3aa],ML_db), + %%concat_list(["What is the paraphrased quote to comment on? 
"],String4a),%%get_string(String4a,either,one-not-ml-ref,"",String3aa,String5a), + %%trace, + %%choose_sentence_range(String00,N_page_ref4,String00a14,String00a24,_String00a34,String00a44), + choose(N_page_ref4,String00a14,String00a24,_String00a34,_String00a44,String00a54), + concat_list(["\"",String00a54,"\" (",String00a24,", p. ",N_page_ref4,")."],String5a), + reference(String00a14), + + + %%concat_list(["Is your comment from a quote (y/n)? "],String2yn),%%get_string(String2yn,either,one-not-ml,"","",String3yn), + + %%agree_disagree(Pole), + + (true->%%String3yn="y"-> + (%%concat_list(["What is the paragraph number of the comment? "],String2y),get_number(String2y,String3y), + %%concat_list(["What is the sentence number of the comment? "],String2ay),get_number(String2ay,String3ay), + %%trace, + %%member1a([String3y,String3ay,String3aay],ML_db), + + %% use either original x or paraphrase x + %%trace, + %%concat_list(["What is the comment that is from the quote? "],String4ac), + %%trace, + %%get_string(String4ac,either,one-not-ml-ref,"",String3aay,String5a1) + + %%choose_sentence_range(String00,N_page_ref5,String00a15,String00a25,_String00a35,String00a45), + choose(N_page_ref5,String00a15,String00a25,_String00a35,_String00a45,String00a55), + concat_list(["\"",String00a55,"\" (",String00a25,", p. ",N_page_ref5,")."],String5a1), + reference(String00a15) + + %%,trace + ) + + ;(String3y=0,String3ay=0, + concat_list(["What is the comment? 
"],String4ac), + + %% This is never chosen + get_string(String4ac,either,one-not-ml,"","",%%String5a, + String5a1))), + %%*** assertz recurse not findall new critique3 x + + + %%trace, + critique3(Critique31), + append(Critique31,[[0,0,String3,String3a,String5a,String3y,String3ay,String5a1]],Critique32), + retractall(critique3(_)), + assertz(critique3(Critique32)), + + critique3(Critique33), + + +(( + + + %%critique3(Critique3), + length(Critique33,LCritique3), + %%length(List1,LCritique3), + numbers(LCritique3,1,[],List1), + %%append(List1,_,Numbers), + %%Numbers=[1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20,21,22,23,24,25], + %%trace, + +findall([N," - ",CString5a1,"\n"],(member(N,List1),get_item_n(Critique33,N,[CNumber2a,CNumber3a,_CString3,_CString3a,_CString5a,_CString3y,_CString3ay,CString5a1])),CStrings1), + +findall([N,CNumber2a,CNumber3a,CString5a1],(member(N,List1),get_item_n(Critique33,N,[CNumber2a,CNumber3a,_CString31,_CString3a1,_CString5a1,_CString3y1,_CString3ay1,CString5a1])),CStrings2), + +%%trace, + +%%CStrings=[CStrings1,CStrings2], +reverse(CStrings2,CStringsR),CStringsR=[[_,CNumber2a1,CNumber3a1,LastCStrings]|CStringsR1], +reverse(CStringsR1,CStringsRR), + +reverse(CStrings1,CStringsR10),CStringsR10=[_|CStringsR11], +reverse(CStringsR11,CStringsRR1), + +append_list2(CStringsRR1,CStrings11), +concat_list(CStrings11,_CStrings12), +%%concat_list(["Please select a comment to connect the comment ",LastCStrings," to:","\n",CStrings12],ConnectionNumberPrompt), +%%get_number(ConnectionNumberPrompt,ConnectionNumber), +%numbers( + +%% *** Choose phrase which is similar to a previous phrase + +%% findall([N," - ",CString5a1,"\n"],(member(N,List1),get_item_n(Critique33,N,[CNumber2a,CNumber3a,_CString3,_CString3a,_CString5a,_CString3y,_CString3ay,CString5a1]), + + 
/**SepandPad="#@~%`$?-+*^,()|.:;=_/[]<>{}\n\r\s\t\\!'0123456789",findall([CString5a1Z2,CString5a1Z5],(downcase_atom(CString5a1,CString5a1Z1),atom_string(CString5a1Z1,CString5a1Z2),split_string(CString5a1,SepandPad,SepandPad,CString5a1Z3), + Connectors= + ["the","a","i","on","with","of","an","for","to", + "was","were","and","in","my","from","out","by"], + %% find distances between terms, closest to sent + subtract(CString5a1Z3,Connectors,CString5a1Z5), + ),CString5a1Z4), +**/ +choose1(List1,ConnectionNumber), + member([ConnectionNumber,CNumber2aa,CNumber3aa,CString5a1a],CStringsRR), + + %%SepandPad="#@~%`$?-+*^,()|.:;=_/[]<>{}\n\r\s\t\\!'0123456789",%%downcase_atom(CString5a1a,CString5a1a1),split_string(CString5a1a1, SepandPad, SepandPad, CString5a1a2), + + %%CNumber2a1,CNumber3a1, + %%downcase_atom(LastCStrings,LastCStrings_a),split_string(LastCStrings_a, SepandPad, SepandPad, LastCStrings_a1), + + + %% conn - choose connected comments, this to a previous comment + %%trace, + + %%concat_list(["How does ",LastCStrings," connect to ",CString5a1a,"? "],ConnectionPrompt), + + %%get_string(ConnectionPrompt,either,one-not-ml,CString5a1a2,LastCStrings_a1,String5aaa) + + string_concat(LastCStrings_az,".",LastCStrings), + string_concat(CString5a1a_az1,".",CString5a1a), %%choose_sentence_range(String00,N_page_ref6,String00a16,String00a26,_String00a36,String00a46), + + split_string(LastCStrings_az,"(","(",[LastCStrings_az_a|_]), + +replace(CString5a1a_az1,"\"","",CString5a1a_az), +replace(LastCStrings_az_a,"\"","",LastCStrings_az_a1), +choose(_N_page_ref6,String00a16,_String00a26,_String00a36,_String00a46,_String00a56), + concat_list([LastCStrings_az_a1," because ",CString5a1a_az,"."%%" because ",String00a56," (",String00a26,", p. ",N_page_ref6,")." 
+ ],String5aaa), + reference(String00a16) + + + )->true;( + + %% If the section since updating dynamic critique comments fails, prevent doubling of critique comments + + critique3(Critique311), + reverse(Critique311,Critique312), + Critique312=[_|Critique313], + reverse(Critique313,Critique314), + retractall(critique3(_)), + assertz(critique3(Critique314)),fail + + )) + +/** Critique4=[String3,String3a,String5a,String3y,String3ay,String5a1, + CNumber2aa,CNumber3aa,CString5a1a, + CNumber2a1,CNumber3a1,LastCStrings, + String5aaa], + **/ + /** + critique3(Critique31), + append(Critique31,[[String3,String3a,String5a,String3y,String3ay,String5a1]],Critique32), + retractall(critique3(_)), + assertz(critique3(Critique32)) +**/ + + ) + %% Retries the predicate if it fails + ->true;critique3(String00,ML_db,%%Critique3, + String3,String3a,String5a,String3y,String3ay,String5a1, + CNumber2aa,CNumber3aa,CString5a1a, + CNumber2a1,CNumber3a1,LastCStrings, + String5aaa + ). + + + + +member1a([String3,String3a,String3aa],ML_db) :- + member([String3,String3a,String3aa],ML_db),!. + + +get_item_n(Exposition,Number1,Item) :- + Number2 is Number1-1, + length(List,Number2), + append(List,[Item|_],Exposition). + +get_number(Prompt1,Number) :- + %%concat_list(Prompt1,Prompt2), + (%%repeat, + writeln(Prompt1),read_string(user_input, "\n", "\r", _End1, String),split_string(String, ",", " ", Value1),Value1=[Value2],number_string(Number,Value2)). + + +/** +get_string(Prompt1,String1) :- + concat_list(Prompt1,Prompt2), + (repeat,write(Prompt2),read_string(String1),not(String1="")). +**/ + +equals_empty_list([]). + +downcase_and_split(String1, String2) :- +SepandPad="&#@~%`$?-+*^,()|.:;=_/[]<>{}\n\r\s\t\\\"!'0123456789", + downcase_atom(String1,String3),split_string(String3, SepandPad, SepandPad, String2). 
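+
+%% Hedged usage sketch (example query and output are illustrative, not from
+%% the original source): downcase_and_split/2 lower-cases its input and then
+%% splits it on punctuation, whitespace and digits via SWI-Prolog's
+%% split_string/4. Note that adjacent separator characters (here ", ")
+%% produce an empty string between the words:
+%% ?- downcase_and_split("Good, Morning",T).
+%% T = ["good","","morning"].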
+ +get_string(Prompt2,Flag1,Flag2,ML_db0,ML_db1,String2) :- +%%writeln1(get_string(Prompt2,Flag1,Flag2,ML_db1,String2)), + %%concat_list(Prompt1,Prompt2), + %%trace, + SepandPad="&#@~%`$?-+*^,()|.:;=_/[]<>{}\n\r\s\t\\\"!'0123456789", + %%(repeat, + (Flag2=one-not-ml-ref->(concat_list(["Note: Enter in-text reference using AGPS, e.g.\n","The first work supports the second work (Surname 2000, pp. 18-9).\n","Surname (2000, pp. 18-9) states that ...\n","Remember to use words like \"also\", \"moreover\" and \"in addition\" before the sentence."],String_a1),writeln(String_a1));true), + writeln(Prompt2),read_string(user_input, "\n", "\r", _End2, String2aa),%%not(String2aa=""), + %%String2aa=[String2aaa], + downcase_atom(String2aa,String3),split_string(String3, SepandPad, SepandPad, String4), + Neg_term_list=["no","not","don","t","shouldn","wouldn","disagree","differ","dislikes","disagrees","differs","dislikes","disagreed","differed","disliked","negative","negation","non","negate","negates","negated","but","however","isn","lack"], + (Flag1=%%non_negative + positive->( + + (findall(Item11,( + + member(Item1,String4), + findall(Item1,( +member(Item2,Neg_term_list),(Item1=Item2->(write("Error: Contains the negative term \""),write(Item1),writeln("\".")))),Item11)),Item12)), + +maplist(equals_empty_list,Item12) + +); + ((Flag1=negative->((member(Item1,String4),member(Item1,Neg_term_list))->true;(writeln("Error: Contains no negative term, one of:"),writeln(Neg_term_list),fail));true)->true;Flag1=either)), + (Flag2=one-not-ml->String2=String2aa; + (Flag2=one-not-ml-ref-> + (refs(R1),writeln("What is the reference? e.g. 
Surname, A 2000, Title: Subtitle, Publisher, City.\n"),writeln("Existing references (copy one or many delimited with \"\\n\"):"), findall(_,(member(R11,R1),writeln(R11)),_),read_string(user_input, "\n", "\r", _End3, String2r),not(String2r=""),%%downcase_atom(String2r,_String3r), + String2=String2aa,split_string(String2r,"\n\r","\n\r",String2r3), + %%trace, + retractall(refs(_)),maplist(append,[[R1,String2r3]],[String2r21]), + sort1(String2r21,String2r2), + assertz(refs(String2r2))%%split_string(String3r, SepandPad, SepandPad, String4) + ); +(Flag2=one->(%%split_string(String4,SepandPad,SepandPad,String21), + writeln("Attempt 1"), + + (length(String4,Length_string1), +(check_strings_container1(Length_string1,String4,[0,0,ML_db1],[[0,0,[xxx,xxx,xxx,xxx,xxx]],[0,0,ML_db1],[999,999,[]]],_,_List2)-> + writeln("Success");(writeln("Failed"),fail)) + %%(%%data_instance_k_classification1([[0,0,[xxx,xxx,xxx,xxx,xxx]],[0,0,ML_db1]%%,[999,999,[]] + %%],[0,0,String4]%%21 + %%,1,String4a%%21 + %%), + %%String4=String4a) + %%)-> + %%(%%String4a=[_,_,String4a1], + %%writeln([found,String4a,String4]), +%%writeln("Success")); + %%(writeln("Failed"),fail) + %% + )); + (Flag2=%%[ + two,%%,P1,S1,P2,S2], + %%trace, + append([ML_db0],[ML_db1],ML_db2), + check_strings(String4,ML_db2%%,P1,S1,P2,S2 + ))))). + +reference(String2r) :- + (refs(R1),%%writeln("What is the reference? e.g. Surname, A 2000, Title: Subtitle, Publisher, City.\n"),writeln("Existing references (copy one or many delimited with \"\\n\"):"), + findall(_,(member(R11,R1),writeln(R11)),_),%%read_string(user_input, "\n", "\r", _End4, String2r), + not(String2r=""),%%downcase_atom(String2r,_String3r), + %%String2=String2aa, + split_string(String2r,"\n\r","\n\r",String2r3), + %%trace, + retractall(refs(_)),maplist(append,[[R1,String2r3]],[String2r21]), + sort1(String2r21,String2r2), + assertz(refs(String2r2))%%split_string(String3r, SepandPad, SepandPad, String4) + ). 
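+
+%% Hedged usage sketch (illustrative only; assumes refs/1 has been
+%% initialised, as main/0 does elsewhere): reference/1 prints the existing
+%% references, splits the new entry on newlines, appends it to the dynamic
+%% refs/1 store and re-sorts the store case-insensitively with sort1/2.
+%% ?- retractall(refs(_)), assertz(refs([])),
+%%    reference("Surname, A 2000, Title: Subtitle, Publisher, City."),
+%%    refs(R).
+%% R = ["Surname, A 2000, Title: Subtitle, Publisher, City."].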
+ + +%% Sorts by first surname then title in AGPS + +sort1(List1,List2) :- + findall([C,B],(member(B,List1),downcase_atom(B,D),atom_string(D,C)),E),sort(E,F),findall(G,member([_,G],F),List2). + + +prepare_file_for_ml(String000,String021) :- + string_codes(String001,String000), + downcase_atom(String001,String00), + split_string(String00, "\n\r", "\n\r", String01), + delete(String01,"",String02), + findall(String03,(member(String02a,String02),split_string(String02a,".",".",String04), + ((String04=[String05|String06], + number_string(Number05,String05), + number_sentences(Number05,1,String06,[],String03))->true; + (findall(String08, + (SepandPad="&#@~%`$?-+*^,()|.:;=_/[]<>{}\n\r\s\t\\\"!'0123456789", + member(String04a,String04),split_string(String04a,SepandPad,SepandPad,String09), + append_list([[""],"",String09],String08)),String03)))),String0211), + append_list2(String0211,String021). + +number_sentences(_,_,[],String,String) :- !. number_sentences(Number_a,Number_b1,String06,String07,String08) :- + String06=[String00|String12], + +SepandPad="&#@~%`$?-+*^,()|.:;=_/[]<>{}\n\r\s\t\\\"!'0123456789", + split_string(String00,SepandPad,SepandPad,String09), + append_list([[Number_a],Number_b1,String09],String10), + append(String07,[String10],String11), + Number_b2 is Number_b1 + 1, + number_sentences(Number_a,Number_b2,String12,String11,String08). + + +data_instance_k_classification1(Data,I,K,C):- + maplist(o_oclass_disClass1(I),Data,DisAndClass), + keysort(DisAndClass,DSorted), + + %%writeln(DSorted), + + length(TopK,K), + append(TopK,_,DSorted), + %This is not a very good way of selecting k as you may have many values with the same distance, and the sorting just cuts these off + %Dsorted = [1-pos,3-pos,3-pos,3-neg,3-neg] + %Topk =[1-pos,3-pos,3-pos] + topk_vote(TopK,C). + +o_oclass_disClass1(O,[A,B,O2],D-[A,B,O2]):- + o_o_dis(O,O2,D). + +%% could be in either order +%% a([w,z,a,b,e,c,z,y],[1,1,[c]],[1,2,[a]]). + +%% true +%% a([a,c,b],[1,1,[a,d]],[1,2,[b]]). 
+%% a([a,c,b],[1,1,[a,c]],[1,2,[b]]). +%% a([a,b],[1,1,[a]],[1,2,[b]]). + +%% false +%% a([c,d],[1,1,[a]],[1,2,[b]]). + +%% q in "q a b" sends food like e in "c d e" + +/** + +[debug] ?- a([q,e,r,a,t,y,u,q,e,r,a,t,y,u,c,b,x,v,n,m],[1,1,[c,a,t,y,u]],[1,2,[c,a,t,y,u]]). [[1,1,[c,a,t,y,u]],[r,a,t,y,u]] +[[1,2,[c,a,t,y,u]],[r,a,t,y,u]] +true. + +X: +[debug] ?- a([q,e,r,a,t,y,u,c,b,x,v,n],[1,1,[c,a,t,y,u]],[1,2,[b,x,u]]). +[[1,1,[c,a,t,y,u]],[r,a,t,y,u]] +[[1,2,[b,x,u]],[c,b,x]] +true. + +[debug] ?- a([q,e,r,a,t,y,u,c,c,c,b,x,v,n],[1,1,[c,a,t,y,u]],[1,2,[b,x,v]]). [[1,1,[c,a,t,y,u]],[r,a,t,y,u]] +[[1,2,[b,x,v]],[c,c,c]] +true. + +**/ + +check_strings(String1,ML_db) :- + %%member([P1,S1,String2],ML_db), + %%member([P2,S2,String3],ML_db), + ML_db=[String2a,String3a], +%%writeln(["String1,String2a,String3a",String1,String2a,String3a]), + String2=[0,0,String2a], + String3=[0,0,String3a], +%%a(String1,String2,String3):- + length(String1,Length_string1), +((writeln("Attempt 1"), +check_strings_container1(Length_string1,String1,String2,[[0,0,[xxx,xxx,xxx,xxx,xxx]],String2,[999,999,[]]],_,List2), + length(List2,Length_list2), + %%Length_list3 is Length_list2+1, + %%writeln(here), + check_strings_container1(Length_list2,List2,String3,[[0,0,[xxx,xxx,xxx,xxx,xxx]],String3,[999,999,[]]],_,_List3), + writeln("Success") + %%,trace + )->true; + (writeln("Failed"), + writeln("Attempt 2"), + ((check_strings_container1(Length_string1,String1,String3,[[0,0,[xxx,xxx,xxx,xxx,xxx]],String3,[999,999,[]]],_,List2), + length(List2,Length_list2), + %%Length_list3 is Length_list2+1, + %%writeln(here), + + check_strings_container1(Length_list2,List2,String2,[[0,0,[xxx,xxx,xxx,xxx,xxx]],String2,[999,999,[]]],_,_List3))-> + writeln("Success");(writeln("Failed"),fail))) +). + +check_strings_container1(Length_string1,String1,String2,Db,List2,List2b) :- +check_strings1(Length_string1,String1,String2,Db,List2,List2b),not(var(List2b)). 
+%%check_strings_container2(Length_string1,Length_string3,List1,String2,Db,List2,List2b) :- +%%check_strings2(Length_string1,Length_string3,List1,String2,Db,List2,List2b),not(var(List2b)). + + + %% finds the length of each string - from position 1 to (length of string1) of string1 + %% starts with the shortest possible string, then increases the length + %% repeat this for second string + %%string1 + + %%*** check with s2,s3 in db +check_strings1(0,_String1,_String2,_Db,List2,List2) :- !. +check_strings1(Length_string1,String1,String2,Db,List2,List2b) :- + %% finds the first str + %%length(List,Index), + length(List1,Length_string1), + append(_,List1,String1), + Length_string2 is Length_string1-1, + Length_string3 is Length_string1+1, + check_strings2(0,Length_string3,List1,String2,Db,List2,List2c), + (var(List2c)-> + check_strings1(Length_string2,String1,String2,Db,List2c,List2b);List2c=List2b),!. + +check_strings2(Length_string,Length_string,_String1,_String2,_Db,List2,List2) :- !. +check_strings2(Length_string1,Length_string3,List1,String2,Db,List2,List2b) :- + + + %%split_string(String4,SepandPad,SepandPad,String21), + + %% go through s1, removing the first word each time, until v + %%Length_string11 is Length_string1-1, + length(List11,Length_string1), + append(List11,List2,List1), + Length_string2 is Length_string1+1, + +%%writeln([list11,List11]), + + ((%%trace, + %%(List11=[z,a,b]->trace;true), + %%writeln([[_,_,List11],String2]), + %%notrace, + not(List11=[]), + +%%writeln(data_instance_k_classification1(Db +%% ,List11,1,A)), + data_instance_k_classification1(Db + %%String2 + ,List11,1,A), + %%writeln([string2,String2,list11,List11,a,A]), +A=String2,%%[_,_,List11], + %%trace, + + %%((List11=[z,a,b],A=[_,_,[z,t,b]],not(String2=[_,_,[c,z]]))->trace;notrace), + %%trace, +%%writeln([string2=list11,String2=List11]), +%%(List11=[y]->true%%trace +%%;true), +%%member(List11,Db), + %%**String2=[_,_,List11] + List2b=List2,%%,notrace + String2=[_,_,String21] + 
,writeln([found,String21,List11]) + ) + ->(true + %%,writeln(List11) + ); + check_strings2(Length_string2,Length_string3,List1,String2,Db,_List2c,List2b)),!. + +numbers(N2,N1,Numbers,Numbers) :- + N2 is N1-1,!. +numbers(N2,N1,Numbers1,Numbers2) :- + N3 is N1+1, + append(Numbers1,[N1],Numbers3), + numbers(N2,N3,Numbers3,Numbers2). + diff --git a/Essay-Helper/short_essay_helper3.1_chicago.pl b/Essay-Helper/short_essay_helper3.1_chicago.pl new file mode 100644 index 0000000000000000000000000000000000000000..b840fad8482db11bfe8b6c719a6f413ae072fa40 --- /dev/null +++ b/Essay-Helper/short_essay_helper3.1_chicago.pl @@ -0,0 +1,1310 @@ +%% new_nums(A),writeln1(A). + +%% backdated from this - which has no repeats and forward progress through essays, but has a bug in numbering pages in references, so backdated to version with repeats and random progress, made forward progress x kept this version and solved bug with pg info in db + +%% Prolog Short Essay Helper + +%% Keep 1,2,3 in aphors +%% Certain number of reasons per paragraph +%% Explains essay structure at start +%% Exposition (negative terms not allowed, asks for 5 groups from text to take optionally enough notes about, which can revise) and critique +%% Paraphrasing +%% Converts x, explains x, asks about +%% Allows citation of aphor +%% Prints citations used so far +%% Prints chapter, essay and allows changes (using text editor) at end. +%% Outputs db file and br dicts for marking later + +%% *** Rewrite essay files with e.g. 1a. before aphors not 1. so not confused with aphors + +%% short_essay_helper("file.txt",5,E),writeln1(E). + +%%:- include('distances.pl'). +:- use_module(library(date)). +:- include('../listprologinterpreter/la_strings'). +:- include('sheet_feeder.pl'). + +:- dynamic critique3/1. +:- dynamic agree_disagree/1. +:- dynamic refs/1. +:- dynamic chosen_quotes/1. +:- dynamic string00_z/1. +:- dynamic end_note_number/1. +:- dynamic num_paras_exp/1. +:- dynamic num_paras_crit/1. 
+ +%%:- dynamic end_notes/1. + +add_1_to_end_note_number :- + end_note_number(N1), + N2 is N1+1, + retractall(end_note_number(_)), + assertz(end_note_number(N2)). + +choose(N2,B,B1,B2,C,Item) :- +%%trace, +( + choose2(N2,B,B1,B2,C,Item)->true;(Item="* All quotes exhausted. (*)"),N2=1,B="()",B1="()",B2=1,C=""%%,trace + ), + %%notrace, + !. +choose2(N2,B,B1,B2,List0,List0) :- +%%trace, + string00_z(String00), + choose_sentence_range(String00,N1,B,B1,B2,List0), + %%chosen_quotes(Chosen_quotes1), + %%trace, + %%length(List0,L), + %%numbers(L,1,[],N), + %%random_ + %%member(N1,N), + %% + %%random_ + %%member([N1,Item10],List0), + + %%random_ + %%**member(Item1,Item10), + N2 is N1+B2-1, + %%get_item_n(List0,N1,Item10), + + + %%**string_codes(Item10,List), + %%notrace, + %%**split_string(List,".\n",".\n",List2), + + %%length(List2,L), + %%numbers(L,1,[],N), + %%random_ + %%member(N1,N), + %%N2 is N1+B2-1, + %%random_ + %%**member(Item1,List2), + + %%get_item_n(List2,N1,Item1), + + /**string_concat(E,D,Item1), + string_length(E,1), + downcase_atom(E,E1), + atom_string(E1,E2), + string_concat(E2,D,Item2), + string_length(E2,1), + string_concat(Item2,""%%"." + ,Item), + **/ + delete(String00,[B,B1,B2,N1,List0],String00_a), + %%**delete(String00,[B,B1,B2|_],String00_a), + %%**delete(List0,[N1,Item10],List6), + %%findall([Item3,". "],(member(Item3,List2)),List3), + %%maplist(append,[List3],[List4]), + %%concat_list(List4,_List5), + %%append(List6,[]%%List5 + %%,List7), + %%**(List6=[]->String00_b=String00_a; + %%**(%%trace, + %%**maplist(append,[[[B,B1,B2],List6]],[String00_c]), + %%**append(String00_a,[String00_c],String00_b)%%,notrace + %%**)), + retractall(string00_z(_)), + assertz(string00_z(String00_a)) + %%trace, + %%writeln1(String00_b),notrace + + + %%,not(member(Item,Chosen_quotes1)), + %%append(Chosen_quotes1,[Item],Chosen_quotes2), + %%retractall(chosen_quotes(_)), + %%assertz(chosen_quotes(Chosen_quotes2)) + . + +choose1(List0,Item) :- + random_member(Item,List0). 
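`choose2/6` above draws one sentence record from the `string00_z/1` store, then re-asserts the store with that record deleted, so the same quote is never offered twice (the `random_` prefix is commented out, so `member/2` takes records in order). A hedged sketch of that draw-without-replacement pattern (`pool/1` and `draw/1` are illustrative names, not part of this file):

```prolog
:- dynamic pool/1.

% Illustrative only: pick an item and remove it from the stored pool,
% mirroring how choose2/6 updates string00_z/1 after each quote.
draw(Item) :-
    pool(Items),
    member(Item, Items),
    delete(Items, Item, Rest),
    retractall(pool(_)),
    assertz(pool(Rest)).
```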
+ +delete_invisibles_etc(F,G) :- + findall(J,(member(H,F),atom_string(H,J),not(J="."),not(J=".."),not(string_concat(".",_,J))),G). + +string(String) --> list(String). + +list([]) --> []. +list([L|Ls]) --> [L], list(Ls). + +short_essay_helper(%%Filex, + String01,Key_words, + Reasons_per_paragraph) :- + + retractall(num_paras_exp(_)), + assertz(num_paras_exp(5)), + + retractall(num_paras_crit(_)), + assertz(num_paras_crit(5)), + + retractall(string00_z(_)), + %%assertz(string00_z([])), + + retractall(end_note_number(_)), + assertz(end_note_number(1)), + + %%retractall(end_notes(_)), + %%assertz(end_notes([])), + + + retractall(critique3(_)), + assertz(critique3([])), + + retractall(refs(_)), + assertz(refs([])), + + retractall(chosen_quotes(_)), + assertz(chosen_quotes([])), + + directory_files("sources/",F), + delete_invisibles_etc(F,G), +%%trace, +SepandPad="#@~%`$?-+*^,()|.:;=_/[]<>{}\n\r\s\t\\!'0123456789", + findall(String02h3,(member(Filex1,G), + string_concat("sources/",Filex1,Filex), + phrase_from_file_s(string(String00a), Filex), + string_codes(String02b,String00a), + atom_to_term(String02b,String02a,[]), + + (String02a=[Az,Bz,Cz|String02c]->true; + (concat_list(["Error: ",Filex," not in format [\"Surname, A 2000, Title: Subtitle, Publisher, City.\",\"Surname, A 2000\",First_Page_Num,\"\",\"\",...\"]"],Notification1),writeln(Notification1),abort)), + %%String02c=String02d, + %%trace, + findall([Az,Bz,Cz,N1,String02cb],( + + length(String02c,L), + numbers(L,1,[],N), + %%random_ + member(N1,N), + get_item_n(String02c,N1,String02ca), + + %%member(String02ca,String02c), + split_string(String02ca, ".\n\r", ".\n\r", String02cb) + + %%member(String02cb1,String02cb) + + ),String02cc), + %%maplist(append,[String02cc],[String02d]), + + %%delete(String02cc,[_,[]],String02d), + String02cc=String02d, + + findall([Az,Bz,Cz,N2,String02d2],(member([Az,Bz,Cz,N2,String02d1],String02d), + member(String02d2,String02d1), + downcase_atom(String02d2,String02e), + 
atom_string(String02e,String02f1), + split_string(String02f1, SepandPad, SepandPad, String02e1), + findall(String02g,(member(Key_words1,Key_words), + %%trace, + downcase_atom(Key_words1,Key_words11), + atom_string(Key_words11,Key_words12), +findall(Key_words12,(member(Key_words12,String02e1)),String02g) + ),String02i), + not(maplist(equals_empty_list,String02i)) + + ),String02h31), + + sort(String02h31,String02h3) + + %%prepare_file_for_ml(String00,String02a) + ),String00z1), + + %%, + + %%trace, + %%writeln1([string00z1,String00z1]), +%%findall(String02h2,(member([Ay,By,Cy,N1,String02h1],String00z1), + %% (String02h1=[]->String02h2=[]; + maplist(append,[String00z1],[String00]),%%) + %%),String00z), + +%%delete(String00z,[],String00), + + +term_to_atom(Key_words,Key_words_a), +atom_string(Key_words_a,Key_words_b), + (String00=[]->(concat_list(["Error: No files in source folder or no instances of keywords ",Key_words_b," in files in source folder."],Notification2),writeln(Notification2),abort);true), + + %%maplist(append,[[String00z1]],String00), +%%maplist(append,[String00z],String00), + %%trace, +assertz(string00_z(String00)), + %%writeln1([string00,String00]), + %%notrace, + +%%writeln1(String02), + + + generate_file_name(File1,File2), + + Numbers=[1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20,21,22,23,24,25], + explain_structure(String01,Reasons_per_paragraph,File1), + exposition(String00,String01,Reasons_per_paragraph,Numbers,String02,Exposition), + + %%concat_list(["Do you agree or disagree with ",String01," (a/d) ? 
"],String2ad),%%get_string(String2ad,either,one-not-ml,"","",String3ad), + choose1(["a"%%,"d" + ],String3ad), + + (String3ad="a"-> + (retractall(agree_disagree(_)), + assertz(agree_disagree(agree))); + (retractall(agree_disagree(_)), + assertz(agree_disagree(disagree)))), + +%%trace, +critique(String00,String01,Reasons_per_paragraph,Numbers,String02,Critique), + +agree_disagree(Pole), + + %%concat_list(["What is the future area of research from your essay about ",String01,"? "],Future_research_prompt), + %%trace, + %%get_string(Future_research_prompt,either,one-not-ml,"","",Future_research), +%% choose_sentence_range(String00,), + choose(_N_page_ref,_String00a1,_String00a2,_String00a3,_String00a4,String00a5), + %%end_note_number(End_note_number), + concat_list(["In ",String01,", automation should apply to \"",String00a5,"\""%%.",End_note_number,""%% (",String00a2,", p. ",N_page_ref,")." + ],Future_research), + %%reference(String00a1,String00a2,N_page_ref,End_note_number), + %%add_1_to_end_note_number, + + refs(R2), + +term_to_atom([Exposition,Critique,String3ad,Future_research,R2],File_contents),open_s(File1,write,Stream),write(Stream,File_contents),close(Stream), + +%% Output essay +%%findall(_,(member(Exposition1,Exposition),Exposition1= + +%%writeln([r2,R2]), + +%%writeln1([Exposition,Critique,Future_research,R2]), + +write_essay(String01,Pole,Exposition,Critique,Future_research,R2,Essay,HTML), +writeln1(Essay), + + (open_s(File2,write,Stream1), +%% string_codes(BrDict3), + write(Stream1,HTML), + close(Stream1)),! + . + +%% replace("a\nb","\n","
<br>\n",F).
+%% F="a<br>\nb<br>\n".
+
+replace(A,Find,Replace,F) :-
+ split_string(A,Find,Find,B),findall([C,Replace],(member(C,B)),D),maplist(append,[D],[E]),concat_list(E,F).
+
+concat_list1(D,F) :-
+ maplist(append,[D],[E]),concat_list(E,F).
+
+write_essay(String01,Pole,Exposition,Critique,Future_research,R2,"",HTML) :-
+ write_heading(String01,_Heading),
+ write_introduction(String01,Pole,Critique,Introduction),
+ writeln1([introduction,Introduction]),
+ %%string_concat(Introduction," ",Introduction0),
+ write_exposition(Exposition,Exposition2a),
+ concat_list(["I will expose ",String01," in this half. ",Exposition2a],Exposition2),
+ %%string_concat(Exposition2,"\n",Exposition2a),
+ write_critique(Critique,Critique2a),
+ string_concat(Critique2a,"\n",Critique2b),
+ atom_string(Pole,Pole1),
+ concat_list(["I will ",Pole1," with ",String01," in this half. ",Critique2b],Critique2),
+ write_conclusion(String01,Pole,Critique,Future_research,Conclusion),
+ write_end_notes(R2,_End_notes,End_notes_no_heading),
+ write_references(R2,_References,Refs_no_heading),
+ %%concat_list([Heading,Introduction,Exposition2,Critique2,Conclusion,End_notes,References],
+ %%Essay),
+ strip_footnotes(Introduction,Introduction_1),
+ strip_footnotes(Conclusion,Conclusion_1),
+ concat_list([Introduction_1,Exposition2,Critique2,Conclusion_1],
+ Essay2),
+ %%trace,
+ reorder_numbers(Essay2,Essay21,End_notes_no_heading,End_notes_no_heading1),
+ replace(Essay21,"\n","<br>",HTML1),
+ replace(Refs_no_heading,"\n","<br>",Refs_no_heading2),
+ replace(End_notes_no_heading1,"\n","<br>",End_notes_no_heading2),
+ concat_list(["",String01,"<br><br>",String01,"<br><br>",HTML1,"<br><br>Endnotes<br><br>",End_notes_no_heading2,"<br><br>Bibliography<br><br>",Refs_no_heading2,""],HTML).
+
+strip_footnotes(A,B101):-
+ split_string(A,".",".",D),
+%%writeln1(D),
+ findall([B7,"."],(member(B,D),
+ split_string(B,"<","<",[B7|_])),B8),
+ maplist(append,[B8],[B81]),
+ concat_list(B81,B101).
+ %%,string_concat(B10,"\n",B101).
+
+reorder_numbers(Essay2,Essay21,End_notes_no_heading,End_notes_no_heading1) :-
+%%trace,
+ retractall(new_nums(_)),
+ assertz(new_nums([[0,0]])),
+ reorder_essay(Essay2,Essay21),
+ %%trace,
+ reorder_endnotes(End_notes_no_heading,End_notes_no_heading1).
+
+reorder_essay(Essay2,Essay21) :-
+ split_string(Essay2,"\n\r","\n\r",Essay23),
+ findall([B9,"\n"],(
+ %%trace,
+ member(C,Essay23),
+ split_string(C,".",".",D1),
+%%writeln1([d,D]),
+delete(D1," ",D),
+ findall([B7,"."],(member(A,D),
+ split_string(A,"<>","<>",Essay22),
+ %%trace,
+ ((
+ ((Essay22=[B1,B2,N1,B3,B4,B5,N2,B51|B6]->(assign_num(N1,N3),assign_num(N2,N4),
+ concat_list([B1,"<",B2,">",N3,"<",B3,">",B4,"<",B5,">",N4,"<",B51,">"|B6],B7)
+ )
+ ;(Essay22=[B1,B2,N1,B22|B3]->(assign_num(N1,N3),
+ concat_list([B1,"<",B2,">",N3,"<",B22,">"|B3],B7));([B7|_]=Essay22->true;(writeln(["Extra <> in essay.",Essay22]),abort))
+ ))->(true%%,notrace
+ );(%%notrace,
+ fail))))
+ ),B81),
+ %%notrace,
+ maplist(append,[B81],[B8]),
+ concat_list(B8,B9)),B101),
+ maplist(append,[B101],[B10]),
+ concat_list(B10,Essay21).
+
+reorder_endnotes(End_notes_no_heading,End_notes_no_heading1) :-
+ %% Replace instances in end notes
+ %%split_string(Essay2,"\n\r","\n\r",Essay23),
+ %%findall([B9,"\n"],(
+ %%member(C,Essay23),
+ split_string(End_notes_no_heading,"\n\r","\n\r",D),
+ %%retractall(new_nums(_)),
+ %%assertz(new_nums([0,0])),
+ findall([B7,".","\n"
+ ],(member(A,D),
+ split_string(A,".",".",Essay22),
+ (Essay22=[N1|B6]->(get_num(N1,N3),
+ concat_list([N3,"."|B6],B7));([B7|_]=Essay22->true;(writeln("Extra <> in essay."),abort)))),B81),
+ maplist(append,[B81],[B8]),
+ concat_list(B8,End_notes_no_heading1).
+ %%an + +get_num(N,N1) :- + (string(N)-> + number_string(N0,N);N0=N), + + new_nums(New_nums), + member([N0,N1],New_nums). + +assign_num(N,N3) :- + (string(N)-> + number_string(N0,N);N0=N), + new_nums(New_nums), + find_last(New_nums,[_,N2]), + (string(N2)-> + number_string(N21,N2);N2=N21), + N3 is N21+1, + append(New_nums,[[N0,N3]],New_nums2), + retractall(new_nums(_)), + assertz(new_nums(New_nums2)). + + +write_heading(String01,Heading) :- + concat_list([String01,"\n","\n"],Heading). + +write_introduction(String01,Pole1,Critique,Introduction) :- + %% The heading should be in the form "Author's topic" + atom_string(Pole1,Pole2), + findall([Paragraph_topic_sentence," "],(member(A,Critique),A=[_,[Paragraph_topic_sentence|_]|_]),Paragraph_topic_sentences1), + concat_list1(Paragraph_topic_sentences1,Paragraph_topic_sentences2), + concat_list(["I will critically analyse ",String01,". ", + "I will ",Pole2," with ",String01,". ", + Paragraph_topic_sentences2,"\n\n"],Introduction). + +write_conclusion(String01,Pole1,Critique,Future_research,Conclusion) :- + atom_string(Pole1,Pole2), + findall([Paragraph_topic_sentence," "],(member(A,Critique),A=[_,[Paragraph_topic_sentence|_]|_]),Paragraph_topic_sentences1), + concat_list1(Paragraph_topic_sentences1,Paragraph_topic_sentences2), + concat_list([ + Paragraph_topic_sentences2, + "I have ",Pole2,"d with ",String01,". ", + Future_research%%,"\n"%%\n" + ],Conclusion). + +write_end_notes(R2,References,Refs_no_head) :- + find_end_notes1(R2,Refs_no_head),%%,(member([String2r3,Ref1,Page_num,End_note_number],R2), + %%findall([Reference,"\n"],member(Reference,R2),References1), +%%trace, writeln1(R3), + %%concat_list1(R3,References2), + %%concat_list([References2],Refs_no_head), + concat_list(["Endnotes","\n\n",Refs_no_head],References). + +find_last([],[]) :- !. +find_last(A,B) :- + reverse(A,C),C=[B|_]. 
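`assign_num/2` and `get_num/2` above renumber endnote markers in order of first appearance, using the `new_nums/1` store seeded with `[[0,0]]`. A worked example of the intended behaviour (assuming `new_nums([[0,0]])` has just been asserted):

```prolog
% ?- assign_num(7, N).   % N = 1, new_nums is now [[0,0],[7,1]]
% ?- assign_num(3, N).   % N = 2, new_nums is now [[0,0],[7,1],[3,2]]
% ?- get_num(7, N).      % N = 1, the renumbered value for old endnote 7
```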
+
+find_end_notes1(R1,R2) :-
+%%trace,
+ find_end_notes(R1,[],R3),
+ %%writeln1([r3,R3]),
+ findall(A,member([_,A],R3),B),
+ %%writeln1([b,B])
+ concat_list1(B,R2)%%,
+ %%writeln1([r2,R2])
+.
+
+find_end_notes([],R,R) :- !.
+find_end_notes(R1,R2,R3) :-
+ R1=[[String2r3,Ref1,Page_num,End_note_number]|Rest],
+ (R2=[]->String2r31="";find_last(R2,[[String2r31,_Ref11,Page_num1,_End_note_number1],_])),
+ ((String2r3=String2r31)->
+ (Page_num=Page_num1->
+ append(R2,[[[String2r3,Ref1,Page_num,End_note_number],
+ [End_note_number,". Ibid.","\n"]]],R5);
+ %% cite the current page, not the previous note's page
+ append(R2,[[[String2r3,Ref1,Page_num,End_note_number],
+ [End_note_number,". Ibid., p. ",Page_num,".","\n"]]],R5));
+ (member([[String2r3,_Ref12,_Page_num2,_End_note_number2],_],R2)->
+ append(R2,[[[String2r3,Ref1,Page_num,End_note_number],
+ [End_note_number,". ",Ref1,", p. ",Page_num,".","\n"]]],R5);
+ %%(String2r31=""->R2=R5;
+ append(R2,[[[String2r3,Ref1,Page_num,End_note_number],
+ [End_note_number,". ",String2r3,", p. ",Page_num,".","\n"]]],R5))),%%),
+ find_end_notes(Rest,R5,R3).
+
+write_references(R2,References,Refs_no_head) :-
+ findall([Reference,"\n"],member([Reference,_Ref11,_Page_num1,_End_note_number1],R2),References3),
+ sort(References3,References1),
+ concat_list1(References1,References2),
+ concat_list([References2],Refs_no_head),
+ concat_list(["Bibliography","\n\n",References2],References).
+
+%%a1
+%% write_exposition([[[1,"g1"]],[[1,1,_15410,_15416,"a1","a1 is in g1."],[1,2,_15352,_15358,"a2","g1 contains a2."]]],A),writeln1(A).
+%% A = "a1 a1 is in g1. a2 g1 contains a2. ".
+
+%% write_exposition([[[1,"g1"],[1,"g1"]],[[1,1,_15410,_15416,"a1","a1 is in g1."],[1,2,_15352,_15358,"a2","g1 contains a2."],[1,1,_15410,_15416,"a1","a1 is in g1."],[2,2,_15352,_15358,"a2","g1 contains a2."]]],A).
+%% A = "a1 a1 is in g1. a2 g1 contains a2. a1 a1 is in g1. a2 g1 contains a2. ".
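`find_end_notes/3` above applies the Chicago repeated-citation rules: an immediately repeated source becomes "Ibid." (same page) or "Ibid., p. N." (different page), a source cited earlier but not immediately before gets the short form, and a new source gets its full citation. A hedged worked example of the Chicago form this aims for (the source strings are illustrative, not from this file):

```prolog
% Given three consecutive notes citing the same source:
%   [["Smith, A 2000, Title, Pub, City.","Smith, A 2000",3,1],
%    ["Smith, A 2000, Title, Pub, City.","Smith, A 2000",3,2],
%    ["Smith, A 2000, Title, Pub, City.","Smith, A 2000",9,3]]
% the Chicago convention renders the endnotes as:
%   1. Smith, A 2000, Title, Pub, City., p. 3.
%   2. Ibid.
%   3. Ibid., p. 9.
```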
+ +write_exposition(Exposition,Essay4b) :- + Numbers=[1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20,21,22,23,24,25], + +Exposition=[_,Exposition2], + + +findall([Essay4c%%,"\n" +],(member(Numbera11,Numbers), + + +%% not "" with findall +findall(Essay4,( +%%member(Exposition1,Exposition), +%%Exposition1=[Numbera11,_Number3a,_String3,_String3a,String5a,String3y,_String3ay,String5a1,_CNumber2aa,_CNumber3aa,CString5a1a,_CNumber2a1,_CNumber3a1,_LastCStrings,String5aaa], + + +%%output_exposition(Numbera11,Exposition2,"",Essay1), + + %%findall( Essay4,( + member(Exposition1,Exposition2),Exposition1=[Numbera11,_Number3a,_String3,_String3a,String5a,Group_link], + concat_list([String5a," ",Group_link," "],Essay4) + %%delete(Exposition,Exposition1,Exposition2) + %%output_exposition(Numbera11,Exposition2,Essay4,Essay6) +),Essay4a), +concat_list(Essay4a,Essay4c1),%%trace, +(Essay4c1=""->Essay4c="";(string_concat(Essay4c10," ",Essay4c1),concat_list([Essay4c10,"\n"],Essay4c))) + +),Essay4d), + +maplist(concat_list,Essay4d,Essay4f), +concat_list(Essay4f,Essay4b) +%%concat_list([Essay4e,"\n"],Essay4b) +%%concat_list([Essay1," ",Essay3],Essay2), + %%concat_list([Essay2,"\n"],Essay23)) +%%,Essay3a), + %%concat_list(Essay3a,Essay4a) +. +%% *** HTML (
not \n) + +%%%%% +%%a + +%% write_critique([[1,["heading is e12",[1,1,_25346,_25352,"e1",_25370,_25376,"e12"],[1,2,_25412,_25418,"e2",_25436,_25442,"e22",1,1,"e12",0,0,"e22","e12 is e22"]]]],A),writeln1(A). +%% A = "heading is e12 e1 e12 e2 e22 e12 is e22 \n". + +write_critique(Critique,Essay4):- + Numbers=[1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20,21,22,23,24,25], + +findall(Essay23,(member(Numbera11,Numbers), + +%% not "" with findall +%%findall(Essay22,( +member(Critique1,Critique), +%%Critique1=[Numbera11,_Number3a,_String3,_String3a,String5a,String3y,_String3ay,String5a1,_CNumber2aa,_CNumber3aa,CString5a1a,_CNumber2a1,_CNumber3a1,_LastCStrings,String5aaa], + +Critique1=[Numbera11,[Para_topic_sent,[_Number2a,_Number3a,_String3,_String3a,String5a,_String3y,_String3ay,String5a1]|Critique2]], + +concat_list([Para_topic_sent," ",String5a," ",String5a1," "],Essay0),output_critique(Numbera11,Critique2,String5a1,Essay0,Essay1), + + concat_list([Essay1,Para_topic_sent,"\n"],%%Essay22) + +%%), +Essay23)),Essay3), + concat_list(Essay3,Essay4) +. +%% *** HTML (
not \n) + +output_critique(_Numbera11,[],_String5a1a,Essay,Essay) :- !. +output_critique(Numbera11,Critique,CString5a1a,Essay1,Essay2) :- +findall( Essay6,(member(Critique1,Critique),Critique1=[Numbera11,_Number3a,_String3,_String3a,String5a,_String3y1,_String3ay,String5a1, +_CNumber2aa,_CNumber3aa,CString5a1a, + _CNumber2a1,_CNumber3a1,_LastCStrings, + String5aaa], + concat_list([String5a," ",String5a1," ",String5aaa," "],Essay4), + delete(Critique,Critique1,Critique2), + output_critique(Numbera11,Critique2,String5aaa,Essay4,Essay6) +),Essay33), +%%trace, +(Essay33=[]->concat_list([Essay1%%," " +],Essay2);%%(Essay33=[Essay3]->concat_list([Essay1," ",Essay3],Essay2);%%(Essay33=[Essay3|[E33]]-> concat_list([Essay1," ",Essay3,E33],Essay2); +(Essay33=[Essay3|E33], concat_list(E33,E34),concat_list([Essay1,%%" ", +Essay3,E34],Essay2))) +%%) +%%) +. + + +/**critique(String00,String01,Reasons_per_paragraph,Numbers,Critique). + **/ + +generate_file_name(File1,File2) :- + get_time(TS),stamp_date_time(TS,date(Year,Month,Day,Hour1,Minute1,Seconda,_A,_TZ,_False),local), + concat_list(["file",Year,Month,Day,Hour1,Minute1,Seconda,".txt"],File1), + concat_list(["file",Year,Month,Day,Hour1,Minute1,Seconda,".html"],File2). + + +explain_structure(String01,Reasons_per_paragraph,_File1) :- + concat_list(["The Short Essay Helper will automatically structure and write your essay about ",String01," with ",Reasons_per_paragraph," reasons per paragraph.","\n", + "The Helper will help write an exposition (which summarises but doesn't critique the idea), a critique (which agrees with or disagrees with the topic), the introduction and the conclusion (which state whether you agreed or disagreed with the topic, etc.). Citations will be automatically made.","\n","Note: Generated essays are not to be handed in, and you need to paraphrase and cite work you have referenced. Your grade depends on whether you agree or disagree and how many breasonings you breason out. 
Check the referencing style is appropriate for your class.","\n"],String1), + writeln(String1). + +choose_sentence_range(String00,N2,B,B1,B2,C) :- + length(String00,L), + numbers(L,1,[],N), + %%random_ + member(N1,N), + get_item_n(String00,N1,A), + A=[B,B1,B2,N2,C]. + %%N2 is N1+B2-1. + %%random_member(A,String00), + +exposition(String00,_String01,Reasons_per_paragraph,Numbers,ML_db,Exposition1) :- + num_paras_exp(Num_paras_exp), + + length(List1,Num_paras_exp), %% 5->1 paragraphs per exposition + append(List1,_,Numbers), + length(List2,Reasons_per_paragraph), + append(List2,_,Numbers), + + %%string_codes(String001,String00), + %%writeln(String001), + findall([Number1,Exposition2],( + %%trace, +member(Number1,List1),%%concat_list(["What is group ",Number1," of 5 in the exposition that groups ideas about ",String01,"? "],String1),%%get_string(String1,either,one-not-ml,"","",%ML_db,Exposition2) +%% ** Doesn't print this +%%choose_sentence_range(String00,N_page_ref,String00a1,String00a2,_String00a3,String00a4), + %%trace, + choose(_N_page_ref,_String00a1,_String00a2,_String00a3,_String00a4,String00a5), + %%end_note_number(End_note_number), + %%add_1_to_end_note_number, + concat_list(["",String00a5%%,"",End_note_number,""%%," (",String00a2,", p. ",N_page_ref,") " + ],Exposition2) + %%reference(String00a1,String00a2,N_page_ref,End_note_number) + ),Exposition3), + + + findall([Number2,Number3,String3,String3a,String5a,String5],( + member(Number2,List1),member(Number3,List2),get_item_n(Exposition3,Number2,[_,Item1]), + %%trace, + concat_list([" ","\n","The Helper will ask you how the quote you are about to enter relates to the paragraph topic."],String2b),writeln(String2b), + %%trace, +exposition2(String00,Item1,ML_db,String3,String3a,String5a,String5) + ),Exposition4), + Exposition1=[Exposition3,Exposition4]. 
+
+exposition2(String00,Item1,ML_db,String3,String3a,String5a,String5):-
+ (%%concat_list(["What is the paragraph number of the quote about the paragraph topic ",Item1,"? "],String2),get_number(String2,String3),
+ %%concat_list(["What is the sentence number of the quote about the paragraph topic ",Item1,"? "],String2a),get_number(String2a,String3a),
+ %%member1a([String3,String3a,String3aa],ML_db),
+ %%concat_list(["What is the paraphrased quote about the paragraph topic ",Item1,"? "],String4a),%%get_string(String4a,either,one-not-ml-ref,"",String3aa,String5a),
+ %%choose_sentence_range(String00,N_page_ref,String00a1,String00a2,_String00a3,String00a4),
+ %%trace,writeln(here1),
+ split_string(Item1,"<","<",[Item01|_]),
+ %%trace,
+ choose(N_page_ref,String00a1,String00a2,_String00a3,_String00a4,String00a5),
+ end_note_number(End_note_number),
+ add_1_to_end_note_number,
+ concat_list(["I agree with ",Item01," because ",String00a5,"<sup>",End_note_number,"</sup>."%%" (",String00a2,", p. ",N_page_ref,")."
+ ],String5a),
+ reference(String00a1,String00a2,N_page_ref,End_note_number),
+
+ %%concat_list(["How does the quote you entered (",String5a,") relate to the paragraph topic ",Item1,"? "],String4),
+ %%trace,
+ %%SepandPad="#@~%`$?-+*^,()|.:;=_/[]<>{}\n\r\s\t\\!'0123456789",%%downcase_atom(Item1,String41a),split_string(String41a, SepandPad, SepandPad, Item1a),
+
+ %%get_string(String4,either,one-not-ml,Item1a,String3aa,String5)
+
+ %%choose_sentence_range(String00,N_page_ref1,String00a11,String00a21,_String00a31,String00a41),
+
+ split_string(Item1,"<","<",[Item01|_]),
+
+choose(N_page_ref1,String00a11,String00a21,_String00a31,_String00a41,String00a51),
+ end_note_number(End_note_number1),
+ add_1_to_end_note_number,
+ concat_list(["I agree with ",Item01," because \"",String00a51,"\"", %%(",String00a21,
+ "<sup>",End_note_number1,"</sup>."%%", p. ",N_page_ref1,")."
+ ],String5), + reference(String00a11,String00a21,N_page_ref1,End_note_number1) +) + ->true;exposition2(String00,Item1,ML_db,String3,String3a,String5a,String5). + +%% Agree or disagree + +critique(String00,String01,Reasons_per_paragraph,Numbers,ML_db,Critique1) :- + num_paras_crit(Num_paras_crit), + + length(List1,Num_paras_crit), %% 5->1 paragraphs per critique + append(List1,_,Numbers), + length(List2,Reasons_per_paragraph), + append(List2,_,Numbers), + + %%string_codes(String001,String00), + %%writeln(String001), + + retractall(critique3(_)), + assertz(critique3([])), + + + findall([Number2a,Critique2],( +%% Reason 1 + +member(Number2a,List1), +%%List1=[Number2a|List1a], +List2=[Number3a|List2a], +%%trace, + + +critique_reason1(String00,String01,Number2a,Number3a,Reasons_per_paragraph,ML_db,Critique3,Topic_paragraph_link), + +critique_reasons_2_to_n(String00,Number2a,List2a,Critique3,Reasons_per_paragraph,ML_db,Critique4), + + append_list2([[Topic_paragraph_link],Critique3,Critique4],Critique2)),Critique1). + + critique_reason1(String00,String01,Number2a,Number3a,Reasons_per_paragraph,ML_db,Critique3,Topic_paragraph_link) :- + + %%member(Number2,List1),member(Number3,List2), + %%get_item_n(Critique3,Number2a,[_,Item1]), + %%trace, + concat_list([" ","\n","The Helper will ask you for your comment on the quote you are about to enter."],String2b),writeln(String2b), + %% Later: connections + + %%trace, +critique2(String00,String01,ML_db,String3,String3a,String5a,_String3y2,String3ay,String5a1,Topic_paragraph_link), + %%),Critique4). + Critique3=[[Number2a,Number3a,String3,String3a,String5a,_String3y3,String3ay,String5a1]]. + +critique2(String00,String01,ML_db,String3,String3a,String5a,String3y,String3ay,String5a1,Topic_paragraph_link):- + (%%concat_list(["What is the paragraph number of the quote to comment on? "],String2),get_number(String2,String3), + %%concat_list(["What is the sentence number of the quote to comment on? 
"],String2a),get_number(String2a,String3a),
+ %%member1a([String3,String3a,String3aa],ML_db),
+ %%concat_list(["What is the paraphrased quote to comment on? "],String4a),%%get_string(String4a,either,one-not-ml-ref,"",String3aa,String5a),
+
+ %%choose_sentence_range(String00,N_page_ref1,String00a11,String00a21,_String00a31,String00a41),
+ choose(N_page_ref1,String00a11,String00a21,_String00a31,_String00a41,String00a51),
+ end_note_number(End_note_number1),
+ add_1_to_end_note_number,
+ concat_list(["\"",String00a51,"\"<sup>",End_note_number1,"</sup>."%% (",String00a21,", p. ",N_page_ref1,")."
+ ],String5a),
+ reference(String00a11,String00a21,N_page_ref1,End_note_number1),
+
+ %%concat_list(["Is your comment from a quote (y/n)? "],String2yn),get_string(String2yn,either,one-not-ml,"","",String3yn),
+
+ %%agree_disagree(Pole),
+
+ (%%String3yn="y"
+ true->
+ (%%concat_list(["What is the paragraph number of the comment? "],String2y),get_number(String2y,String3y),
+ %%concat_list(["What is the sentence number of the comment? "],String2ay),get_number(String2ay,String3ay),
+ %%trace,
+ %%member1a([String3y,String3ay,String3aay],ML_db),
+
+ %% use either original x or paraphrase
+
+ %%concat_list(["What is the comment that is from the quote? "],String4ac),
+ %%trace,
+ %%get_string(String4ac,either,one-not-ml-ref,"",String3aay,String5a1)
+
+ %%choose_sentence_range(String00,N_page_ref2,String00a12,String00a22,_String00a32,String00a42),
+ choose(N_page_ref2,String00a12,String00a22,_String00a32,_String00a42,String00a52),
+ end_note_number(End_note_number2),
+ add_1_to_end_note_number,
+
+ concat_list(["\"",String00a52,"\"<sup>",End_note_number2,"</sup>."%% (",String00a22,", p. ",N_page_ref2,")."
+ ],String5a1),
+ reference(String00a12,String00a22,N_page_ref2,End_note_number2)
+
+ )
+
+ ;(String3y=0,String3ay=0,
+ %%concat_list(["What is the comment? 
"],String4ac),%%get_string(String4ac,either,one-not-ml,"","",%%String5a, +%% String5a1) + + %%choose_sentence_range(String00,N_page_ref3,String00a13,String00a23,_String00a33,String00a43), + choose(_N_page_ref3,_String00a13,_String00a23,_String00a33,_String00a43,String00a53), + concat_list(["\"",String00a53,"\"."],String5a1) + %%reference(String00a13) + %%" (",String00a23,", p. ",N_page_ref3,")." + +)), + + %%concat_list(["How does the comment ",String5a1," relate to the essay topic ",String01,"? "],Topic_paragraph_link_prompt), + %%trace, + + %%downcase_and_split(String5a1,String5a1ds), + %%downcase_and_split(String01,String01ds), + %%get_string(Topic_paragraph_link_prompt,either,one-not-ml,String5a1ds,String01ds,Topic_paragraph_link) + + (string_concat(String5a1_az,".",String5a1)->true;String5a1_az=String5a1), %%choose_sentence_range(String00,N_page_ref4,String00a14,String00a24,_String00a34,String00a44), + %%choose(N_page_ref4,String00a14,String00a24,_String00a34,String00a44,String00a54), + %%trace, + split_string(String01,"(","(",[String01_a|_]), + split_string(String01_a,"<","<",[String01_a0|_]), + strip_footnotes(String5a1_az,String5a1_az_2), + + concat_list([String01_a0," is because ",String5a1_az_2,"."%% because of ",String00a54," (",String00a24,", p. ",N_page_ref4,")." + ],Topic_paragraph_link) + %%reference(String00a14) + + /** + %% conn - choose connected comments + concat_list(["How does the quote you entered (",String5a,") relate to the paragraph topic ",Item1,"? "],String4), + %%trace, + SepandPad="#@~%`$?-+*^,()|.:;=_/[]<>{}\n\r\s\t\\!'0123456789",downcase_atom(Item1,String41a),split_string(String41a, SepandPad, SepandPad, Item1a), + + get_string(String4,either,two,Item1a,String3aa,String5) + **/ + ) + ->true;critique2(String00,String01,ML_db,String3,String3a,String5a,String3y,String3ay,String5a1,Topic_paragraph_link). 
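`exposition2/7` and `critique2/10` both wrap their bodies in a `( Body -> true ; Retry )` tail call, restarting from scratch whenever a randomly chosen quote fails to parse. A hedged sketch of that repeat-until-success idiom (`try_until_success/1` is an illustrative name, not part of this file):

```prolog
% Illustrative only: keep retrying Goal until it succeeds, the pattern
% exposition2/7 and critique2/10 use to survive malformed random quotes.
try_until_success(Goal) :-
    ( call(Goal) -> true
    ; try_until_success(Goal)
    ).
```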
+ + + +critique_reasons_2_to_n(String00,Number2a,List2a,Critique3,Reasons_per_paragraph,ML_db,Critique4) :- + retractall(critique3(_)), + assertz(critique3(Critique3)), +%%trace, +/** critique3(Critique31), + append(Critique31,Critique3,Critique32), + retractall(critique3(_)), + assertz(critique3(Critique32)), +**/ +%%notrace, +findall([Number2a,Number3a,String3,String3a,String5a,String3y,String3ay,String5a1, +CNumber2aa,CNumber3aa,CString5a1a, + CNumber2a1,CNumber3a1,LastCStrings, + String5aaa],( + %%member(Number2,List1a), + member(Number3a,List2a), + %%get_item_n(Critique3,Number2a,[_,Item1]), + %%trace, + concat_list([" ","\n","The Helper will ask you for your comment on the quote you are about to enter."],String2b),writeln(String2b), + %% Later: connections + + + %%trace, +critique3(String00,ML_db,%%Critique3, +String3,String3a,String5a,String3y,String3ay,String5a1, +CNumber2aa,CNumber3aa,CString5a1a, + CNumber2a1,CNumber3a1,LastCStrings, + String5aaa) + + +/** + trace, + critique3(Critique31), + append(Critique31,[[String3,String3a,String5a,String3y,String3ay,String5a1]],Critique32), + retractall(critique3(_)), + assertz(critique3(Critique32)) +**/ + +), + Critique4). + + %%Critique3=[]. + +critique3(String00,ML_db,%%Critique3, +String3,String3a,String5a,String3y,String3ay,String5a1, +CNumber2aa,CNumber3aa,CString5a1a, + CNumber2a1,CNumber3a1,LastCStrings, + String5aaa):- + (%%concat_list(["What is the paragraph number of the quote to comment on? "],String2),get_number(String2,String3), + %%concat_list(["What is the sentence number of the quote to comment on? "],String2a),get_number(String2a,String3a), + %%member1a([String3,String3a,String3aa],ML_db), + %%concat_list(["What is the paraphrased quote to comment on? 
"],String4a),%%get_string(String4a,either,one-not-ml-ref,"",String3aa,String5a),
+ %%trace,
+ %%choose_sentence_range(String00,N_page_ref4,String00a14,String00a24,_String00a34,String00a44),
+ choose(N_page_ref4,String00a14,String00a24,_String00a34,_String00a44,String00a54),
+ end_note_number(End_note_number4),
+ add_1_to_end_note_number,
+ concat_list(["\"",String00a54,"\"<sup>",End_note_number4,"</sup>."%% (",String00a24,", p. ",N_page_ref4,")."
+ ],String5a),
+ reference(String00a14,String00a24,N_page_ref4,End_note_number4),
+
+
+ %%concat_list(["Is your comment from a quote (y/n)? "],String2yn),%%get_string(String2yn,either,one-not-ml,"","",String3yn),
+
+ %%agree_disagree(Pole),
+
+ (true->%%String3yn="y"->
+ (%%concat_list(["What is the paragraph number of the comment? "],String2y),get_number(String2y,String3y),
+ %%concat_list(["What is the sentence number of the comment? "],String2ay),get_number(String2ay,String3ay),
+ %%trace,
+ %%member1a([String3y,String3ay,String3aay],ML_db),
+
+ %% use either original x or paraphrase x
+ %%trace,
+ %%concat_list(["What is the comment that is from the quote? "],String4ac),
+ %%trace,
+ %%get_string(String4ac,either,one-not-ml-ref,"",String3aay,String5a1)
+
+ %%choose_sentence_range(String00,N_page_ref5,String00a15,String00a25,_String00a35,String00a45),
+ choose(N_page_ref5,String00a15,String00a25,_String00a35,_String00a45,String00a55),
+ end_note_number(End_note_number5),
+ add_1_to_end_note_number,
+ concat_list(["\"",String00a55,"\"<sup>",End_note_number5,"</sup>."%% (",String00a25,", p. ",N_page_ref5,")."
+ ],String5a1),
+ reference(String00a15,String00a25,N_page_ref5,End_note_number5)
+
+ %%,trace
+ )
+
+ ;(String3y=0,String3ay=0,
+ concat_list(["What is the comment? 
"],String4ac), + + %% This is never chosen + get_string(String4ac,either,one-not-ml,"","",%%String5a, + String5a1))), + %%*** assertz recurse not findall new critique3 x + + + %%trace, + critique3(Critique31), + append(Critique31,[[0,0,String3,String3a,String5a,String3y,String3ay,String5a1]],Critique32), + retractall(critique3(_)), + assertz(critique3(Critique32)), + + critique3(Critique33), + + +(( + + + %%critique3(Critique3), + length(Critique33,LCritique3), + %%length(List1,LCritique3), + %%notrace, + numbers(LCritique3,1,[],List1), + %%trace, + %%append(List1,_,Numbers), + %%Numbers=[1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20,21,22,23,24,25], + %%trace, + +findall([N," - ",CString5a1,"\n"],(member(N,List1),get_item_n(Critique33,N,[CNumber2a,CNumber3a,_CString3,_CString3a,_CString5a,_CString3y,_CString3ay,CString5a1])),CStrings1), + +findall([N,CNumber2a,CNumber3a,CString5a1],(member(N,List1),get_item_n(Critique33,N,[CNumber2a,CNumber3a,_CString31,_CString3a1,_CString5a1,_CString3y1,_CString3ay1,CString5a1])),CStrings2), + +%%trace, + +%%CStrings=[CStrings1,CStrings2], +reverse(CStrings2,CStringsR),CStringsR=[[_,CNumber2a1,CNumber3a1,LastCStrings]|CStringsR1], +reverse(CStringsR1,CStringsRR), + +reverse(CStrings1,CStringsR10),CStringsR10=[_|CStringsR11], +reverse(CStringsR11,CStringsRR1), + +append_list2(CStringsRR1,CStrings11), +concat_list(CStrings11,_CStrings12), +%%concat_list(["Please select a comment to connect the comment ",LastCStrings," to:","\n",CStrings12],ConnectionNumberPrompt), +%%get_number(ConnectionNumberPrompt,ConnectionNumber), +%numbers( + +%% *** Choose phrase which is similar to a previous phrase + +%% findall([N," - ",CString5a1,"\n"],(member(N,List1),get_item_n(Critique33,N,[CNumber2a,CNumber3a,_CString3,_CString3a,_CString5a,_CString3y,_CString3ay,CString5a1]), + + 
/**SepandPad="#@~%`$?-+*^,()|.:;=_/[]<>{}\n\r\s\t\\!'0123456789",findall([CString5a1Z2,CString5a1Z5],(downcase_atom(CString5a1,CString5a1Z1),atom_string(CString5a1Z1,CString5a1Z2),split_string(CString5a1,SepandPad,SepandPad,CString5a1Z3), + Connectors= + ["the","a","i","on","with","of","an","for","to", + "was","were","and","in","my","from","out","by"], + %% find distances between terms, closest to sent + subtract(CString5a1Z3,Connectors,CString5a1Z5), + ),CString5a1Z4), +**/ +choose1(List1,ConnectionNumber), + member([ConnectionNumber,CNumber2aa,CNumber3aa,CString5a1a],CStringsRR), + + %%SepandPad="#@~%`$?-+*^,()|.:;=_/[]<>{}\n\r\s\t\\!'0123456789",%%downcase_atom(CString5a1a,CString5a1a1),split_string(CString5a1a1, SepandPad, SepandPad, CString5a1a2), + + %%CNumber2a1,CNumber3a1, + %%downcase_atom(LastCStrings,LastCStrings_a),split_string(LastCStrings_a, SepandPad, SepandPad, LastCStrings_a1), + + + %% conn - choose connected comments, this to a previous comment + %%trace, + + %%concat_list(["How does ",LastCStrings," connect to ",CString5a1a,"? "],ConnectionPrompt), + + %%get_string(ConnectionPrompt,either,one-not-ml,CString5a1a2,LastCStrings_a1,String5aaa) + + string_concat(LastCStrings_az,".",LastCStrings), + string_concat(CString5a1a_az1,".",CString5a1a), %%choose_sentence_range(String00,N_page_ref6,String00a16,String00a26,_String00a36,String00a46), + + split_string(LastCStrings_az,"(","(",[LastCStrings_az_a|_]), + +replace(CString5a1a_az1,"\"","",CString5a1a_az), +replace(LastCStrings_az_a,"\"","",LastCStrings_az_a1), + split_string(CString5a1a_az,"<","<",[CString5a1a_az_1|_]), + split_string(LastCStrings_az_a1,"<","<",[LastCStrings_az_a1_1|_]), + +choose(_N_page_ref6,_String00a16,_String00a26,_String00a36,_String00a46,_String00a56), + concat_list([LastCStrings_az_a1_1," because ",CString5a1a_az_1,"."%%" because ",String00a56," (",String00a26,", p. ",N_page_ref6,")." 
+ ],String5aaa) + %%reference(String00a16) + + + )->true;( + + %% If the section since updating dynamic critique comments fails, prevent doubling of critique comments + + critique3(Critique311), + reverse(Critique311,Critique312), + Critique312=[_|Critique313], + reverse(Critique313,Critique314), + retractall(critique3(_)), + assertz(critique3(Critique314)),fail + + )) + +/** Critique4=[String3,String3a,String5a,String3y,String3ay,String5a1, + CNumber2aa,CNumber3aa,CString5a1a, + CNumber2a1,CNumber3a1,LastCStrings, + String5aaa], + **/ + /** + critique3(Critique31), + append(Critique31,[[String3,String3a,String5a,String3y,String3ay,String5a1]],Critique32), + retractall(critique3(_)), + assertz(critique3(Critique32)) +**/ + + ) + %% Retries the predicate if it fails + ->true;critique3(String00,ML_db,%%Critique3, + String3,String3a,String5a,String3y,String3ay,String5a1, + CNumber2aa,CNumber3aa,CString5a1a, + CNumber2a1,CNumber3a1,LastCStrings, + String5aaa + ). + + + + +member1a([String3,String3a,String3aa],ML_db) :- + member([String3,String3a,String3aa],ML_db),!. + + +get_item_n(Exposition,Number1,Item) :- + Number2 is Number1-1, + length(List,Number2), + append(List,[Item|_],Exposition). + +get_number(Prompt1,Number) :- + %%concat_list(Prompt1,Prompt2), + (%%repeat, + writeln(Prompt1),read_string(user_input, "\n", "\r", _End1, String),split_string(String, ",", " ", Value1),Value1=[Value2],number_string(Number,Value2)). + + +/** +get_string(Prompt1,String1) :- + concat_list(Prompt1,Prompt2), + (repeat,write(Prompt2),read_string(String1),not(String1="")). +**/ + +equals_empty_list([]). + +downcase_and_split(String1, String2) :- +SepandPad="&#@~%`$?-+*^,()|.:;=_/[]<>{}\n\r\s\t\\\"!'0123456789", + downcase_atom(String1,String3),split_string(String3, SepandPad, SepandPad, String2). 
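For readers unfamiliar with the `length/2` + `append/3` idiom, `get_item_n/3` above is plain 1-indexed list access (skip `Number1-1` leading items, return the next one). A minimal Python sketch of the same behaviour (the function name simply mirrors the predicate; it is not part of the source):

```python
def get_item_n(exposition, number1):
    # Mirrors get_item_n/3: skip number1-1 leading items and return
    # the next one (1-indexed access).  Raises IndexError where the
    # Prolog predicate would simply fail.
    return exposition[number1 - 1]
```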
+ +get_string(Prompt2,Flag1,Flag2,ML_db0,ML_db1,String2) :- +%%writeln1(get_string(Prompt2,Flag1,Flag2,ML_db1,String2)), + %%concat_list(Prompt1,Prompt2), + %%trace, + SepandPad="&#@~%`$?-+*^,()|.:;=_/[]<>{}\n\r\s\t\\\"!'0123456789", + %%(repeat, + (Flag2=one-not-ml-ref->(concat_list(["Note: Enter in-text reference using AGPS, e.g.\n","The first work supports the second work (Surname 2000, pp. 18-9).\n","Surname (2000, pp. 18-9) states that ...\n","Remember to use words like \"also\", \"moreover\" and \"in addition\" before the sentence."],String_a1),writeln(String_a1));true), + writeln(Prompt2),read_string(user_input, "\n", "\r", _End2, String2aa),%%not(String2aa=""), + %%String2aa=[String2aaa], + downcase_atom(String2aa,String3),split_string(String3, SepandPad, SepandPad, String4), + Neg_term_list=["no","not","don","t","shouldn","wouldn","disagree","differ","dislikes","disagrees","differs","dislikes","disagreed","differed","disliked","negative","negation","non","negate","negates","negated","but","however","isn","lack"], + (Flag1=%%non_negative + positive->( + + (findall(Item11,( + + member(Item1,String4), + findall(Item1,( +member(Item2,Neg_term_list),(Item1=Item2->(write("Error: Contains the negative term \""),write(Item1),writeln("\".")))),Item11)),Item12)), + +maplist(equals_empty_list,Item12) + +); + ((Flag1=negative->((member(Item1,String4),member(Item1,Neg_term_list))->true;(writeln("Error: Contains no negative term, one of:"),writeln(Neg_term_list),fail));true)->true;Flag1=either)), + (Flag2=one-not-ml->String2=String2aa; + (Flag2=one-not-ml-ref-> + (refs(R1),writeln("What is the reference? e.g. 
Surname, A 2000, Title: Subtitle, Publisher, City.\n"),writeln("Existing references (copy one or many delimited with \"\\n\"):"), findall(_,(member(R11,R1),writeln(R11)),_),read_string(user_input, "\n", "\r", _End3, String2r),not(String2r=""),%%downcase_atom(String2r,_String3r), + String2=String2aa,split_string(String2r,"\n\r","\n\r",String2r3), + %%trace, + retractall(refs(_)),maplist(append,[[R1,String2r3]],[String2r21]), + sort1(String2r21,String2r2), + assertz(refs(String2r2))%%split_string(String3r, SepandPad, SepandPad, String4) + ); +(Flag2=one->(%%split_string(String4,SepandPad,SepandPad,String21), + writeln("Attempt 1"), + + (length(String4,Length_string1), +(check_strings_container1(Length_string1,String4,[0,0,ML_db1],[[0,0,[xxx,xxx,xxx,xxx,xxx]],[0,0,ML_db1],[999,999,[]]],_,_List2)-> + writeln("Success");(writeln("Failed"),fail)) + %%(%%data_instance_k_classification1([[0,0,[xxx,xxx,xxx,xxx,xxx]],[0,0,ML_db1]%%,[999,999,[]] + %%],[0,0,String4]%%21 + %%,1,String4a%%21 + %%), + %%String4=String4a) + %%)-> + %%(%%String4a=[_,_,String4a1], + %%writeln([found,String4a,String4]), +%%writeln("Success")); + %%(writeln("Failed"),fail) + %% + )); + (Flag2=%%[ + two,%%,P1,S1,P2,S2], + %%trace, + append([ML_db0],[ML_db1],ML_db2), + check_strings(String4,ML_db2%%,P1,S1,P2,S2 + ))))). + +reference(String2r,Ref1,Page_num,End_note_number) :- +%%writeln([[string2r,ref1,page_num,end_note_number], +%%[String2r,Ref1,Page_num,End_note_number]]), +%%trace, + (refs(R1),%%writeln("What is the reference? e.g. 
Surname, A 2000, Title: Subtitle, Publisher, City.\n"),writeln("Existing references (copy one or many delimited with \"\\n\"):"), + %%findall(_,(member(R11,R1),writeln(R11)),_),%%read_string(user_input, "\n", "\r", _End4, String2r), + %%(Ref1="()"->trace;true), + not(String2r=""),%%downcase_atom(String2r,_String3r), + %%String2=String2aa, + split_string(String2r,"\n\r","\n\r",String2r3), + %%trace, + retractall(refs(_)),maplist(append,[[String2r3,[Ref1,Page_num,End_note_number]]],[String2r21]), + %%sort1(String2r21,String2r2), + [String2r21]=String2r2, + append(R1,String2r2,String2r4), + assertz(refs(String2r4))%%split_string(String3r, SepandPad, SepandPad, String4) + ).%%,notrace. + + +%% Sorts by first surname then title in AGPS + +sort1(List1,List2) :- + findall([C,B],(member(B,List1),downcase_atom(B,D),atom_string(D,C)),E),sort(E,F),findall(G,member([_,G],F),List2). + + +prepare_file_for_ml(String000,String021) :- + string_codes(String001,String000), + downcase_atom(String001,String00), + split_string(String00, "\n\r", "\n\r", String01), + delete(String01,"",String02), + findall(String03,(member(String02a,String02),split_string(String02a,".",".",String04), + ((String04=[String05|String06], + number_string(Number05,String05), + number_sentences(Number05,1,String06,[],String03))->true; + (findall(String08, + (SepandPad="&#@~%`$?-+*^,()|.:;=_/[]<>{}\n\r\s\t\\\"!'0123456789", + member(String04a,String04),split_string(String04a,SepandPad,SepandPad,String09), + append_list([[""],"",String09],String08)),String03)))),String0211), + append_list2(String0211,String021). + +number_sentences(_,_,[],String,String) :- !. 
number_sentences(Number_a,Number_b1,String06,String07,String08) :- + String06=[String00|String12], + +SepandPad="&#@~%`$?-+*^,()|.:;=_/[]<>{}\n\r\s\t\\\"!'0123456789", + split_string(String00,SepandPad,SepandPad,String09), + append_list([[Number_a],Number_b1,String09],String10), + append(String07,[String10],String11), + Number_b2 is Number_b1 + 1, + number_sentences(Number_a,Number_b2,String12,String11,String08). + + +data_instance_k_classification1(Data,I,K,C):- + maplist(o_oclass_disClass1(I),Data,DisAndClass), + keysort(DisAndClass,DSorted), + + %%writeln(DSorted), + + length(TopK,K), + append(TopK,_,DSorted), + %This is not a very good way of selecting k as you may have many values with the same distance, and the sorting just cuts these off + %Dsorted = [1-pos,3-pos,3-pos,3-neg,3-neg] + %Topk =[1-pos,3-pos,3-pos] + topk_vote(TopK,C). + +o_oclass_disClass1(O,[A,B,O2],D-[A,B,O2]):- + o_o_dis(O,O2,D). + +%% could be in either order +%% a([w,z,a,b,e,c,z,y],[1,1,[c]],[1,2,[a]]). + +%% true +%% a([a,c,b],[1,1,[a,d]],[1,2,[b]]). +%% a([a,c,b],[1,1,[a,c]],[1,2,[b]]). +%% a([a,b],[1,1,[a]],[1,2,[b]]). + +%% false +%% a([c,d],[1,1,[a]],[1,2,[b]]). + +%% q in "q a b" sends food like e in "c d e" + +/** + +[debug] ?- a([q,e,r,a,t,y,u,q,e,r,a,t,y,u,c,b,x,v,n,m],[1,1,[c,a,t,y,u]],[1,2,[c,a,t,y,u]]). [[1,1,[c,a,t,y,u]],[r,a,t,y,u]] +[[1,2,[c,a,t,y,u]],[r,a,t,y,u]] +true. + +X: +[debug] ?- a([q,e,r,a,t,y,u,c,b,x,v,n],[1,1,[c,a,t,y,u]],[1,2,[b,x,u]]). +[[1,1,[c,a,t,y,u]],[r,a,t,y,u]] +[[1,2,[b,x,u]],[c,b,x]] +true. + +[debug] ?- a([q,e,r,a,t,y,u,c,c,c,b,x,v,n],[1,1,[c,a,t,y,u]],[1,2,[b,x,v]]). [[1,1,[c,a,t,y,u]],[r,a,t,y,u]] +[[1,2,[b,x,v]],[c,c,c]] +true. 
+ +**/ + +check_strings(String1,ML_db) :- + %%member([P1,S1,String2],ML_db), + %%member([P2,S2,String3],ML_db), + ML_db=[String2a,String3a], +%%writeln(["String1,String2a,String3a",String1,String2a,String3a]), + String2=[0,0,String2a], + String3=[0,0,String3a], +%%a(String1,String2,String3):- + length(String1,Length_string1), +((writeln("Attempt 1"), +check_strings_container1(Length_string1,String1,String2,[[0,0,[xxx,xxx,xxx,xxx,xxx]],String2,[999,999,[]]],_,List2), + length(List2,Length_list2), + %%Length_list3 is Length_list2+1, + %%writeln(here), + check_strings_container1(Length_list2,List2,String3,[[0,0,[xxx,xxx,xxx,xxx,xxx]],String3,[999,999,[]]],_,_List3), + writeln("Success") + %%,trace + )->true; + (writeln("Failed"), + writeln("Attempt 2"), + ((check_strings_container1(Length_string1,String1,String3,[[0,0,[xxx,xxx,xxx,xxx,xxx]],String3,[999,999,[]]],_,List2), + length(List2,Length_list2), + %%Length_list3 is Length_list2+1, + %%writeln(here), + + check_strings_container1(Length_list2,List2,String2,[[0,0,[xxx,xxx,xxx,xxx,xxx]],String2,[999,999,[]]],_,_List3))-> + writeln("Success");(writeln("Failed"),fail))) +). + +check_strings_container1(Length_string1,String1,String2,Db,List2,List2b) :- +check_strings1(Length_string1,String1,String2,Db,List2,List2b),not(var(List2b)). +%%check_strings_container2(Length_string1,Length_string3,List1,String2,Db,List2,List2b) :- +%%check_strings2(Length_string1,Length_string3,List1,String2,Db,List2,List2b),not(var(List2b)). + + + %% finds the length of each string - from position 1 to (length of string1) of string1 + %% starts with the shortest possible string, then increases the length + %% repeat this for second string + %%string1 + + %%*** check with s2,s3 in db +check_strings1(0,_String1,_String2,_Db,List2,List2) :- !. 
+check_strings1(Length_string1,String1,String2,Db,List2,List2b) :- + %% finds the first str + %%length(List,Index), + length(List1,Length_string1), + append(_,List1,String1), + Length_string2 is Length_string1-1, + Length_string3 is Length_string1+1, + check_strings2(0,Length_string3,List1,String2,Db,List2,List2c), + (var(List2c)-> + check_strings1(Length_string2,String1,String2,Db,List2c,List2b);List2c=List2b),!. + +check_strings2(Length_string,Length_string,_String1,_String2,_Db,List2,List2) :- !. +check_strings2(Length_string1,Length_string3,List1,String2,Db,List2,List2b) :- + + + %%split_string(String4,SepandPad,SepandPad,String21), + + %% go through s1, removing the first word each time, until v + %%Length_string11 is Length_string1-1, + length(List11,Length_string1), + append(List11,List2,List1), + Length_string2 is Length_string1+1, + +%%writeln([list11,List11]), + + ((%%trace, + %%(List11=[z,a,b]->trace;true), + %%writeln([[_,_,List11],String2]), + %%notrace, + not(List11=[]), + +%%writeln(data_instance_k_classification1(Db +%% ,List11,1,A)), + data_instance_k_classification1(Db + %%String2 + ,List11,1,A), + %%writeln([string2,String2,list11,List11,a,A]), +A=String2,%%[_,_,List11], + %%trace, + + %%((List11=[z,a,b],A=[_,_,[z,t,b]],not(String2=[_,_,[c,z]]))->trace;notrace), + %%trace, +%%writeln([string2=list11,String2=List11]), +%%(List11=[y]->true%%trace +%%;true), +%%member(List11,Db), + %%**String2=[_,_,List11] + List2b=List2,%%,notrace + String2=[_,_,String21] + ,writeln([found,String21,List11]) + ) + ->(true + %%,writeln(List11) + ); + check_strings2(Length_string2,Length_string3,List1,String2,Db,_List2c,List2b)),!. + +numbers(N2,N1,Numbers1,Numbers2):- +%%notrace, +numbers1(N2,N1,Numbers1,Numbers2). +%%trace. +numbers1(N2,N1,Numbers,Numbers) :- + N2 is N1-1,!. +numbers1(N2,N1,Numbers1,Numbers2) :- + N3 is N1+1, + append(Numbers1,[N1],Numbers3), + numbers1(N2,N3,Numbers3,Numbers2). 
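`numbers/4` above appends the consecutive integers `N1..N2` to an accumulator; the cut base case fires when `N2 is N1-1`, which makes an empty range succeed with the accumulator unchanged. A small Python sketch of the same behaviour (names mirror the predicate, nothing more):

```python
def numbers(n2, n1, acc):
    # Mirrors numbers/4: append the integers n1..n2 (inclusive) to acc.
    # When n2 == n1 - 1 the range is empty, matching the cut base case.
    return acc + list(range(n1, n2 + 1))
```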
+ diff --git a/Essay-Helper/short_essay_helper3_agps.pl b/Essay-Helper/short_essay_helper3_agps.pl new file mode 100644 index 0000000000000000000000000000000000000000..2fa2f933b054401ab7d2acc8b5e6c595b7ffb1de --- /dev/null +++ b/Essay-Helper/short_essay_helper3_agps.pl @@ -0,0 +1,947 @@ +%% Prolog Short Essay Helper + +%% Keep 1,2,3 in aphors +%% Certain number of reasons per paragraph +%% Explains essay structure at start +%% Exposition (negative terms not allowed, asks for 5 groups from text to take optionally enough notes about, which can revise) and critique +%% Paraphrasing +%% Converts x, explains x, asks about +%% Allows citation of aphor +%% Prints citations used so far +%% Prints chapter, essay and allows changes (using text editor) at end. +%% Outputs db file and br dicts for marking later + +%% *** Rewrite essay files with e.g. 1a. before aphors not 1. so not confused with aphors + +%% short_essay_helper("file.txt",5,E),writeln1(E). + +%%:- include('distances.pl'). +:- use_module(library(date)). +:- include('../listprologinterpreter/la_strings'). +:- include('sheet_feeder.pl'). + +:- dynamic critique3/1. +:- dynamic agree_disagree/1. +:- dynamic refs/1. +:- dynamic refs_long/1. + +choose(List0,Item) :- + random_member(Item10,List0), + string_codes(Item10,List), + split_string(List,".\n",".\n",List2), + random_member(Item1,List2), + string_concat(E,D,Item1), + string_length(E,1), + downcase_atom(E,E1), + atom_string(E1,E2), + string_concat(E2,D,Item2), + string_concat(Item2,""%%"." + ,Item). + +choose1(List0,Item) :- + random_member(Item,List0). + +delete_invisibles_etc(F,G) :- + findall(J,(member(H,F),atom_string(H,J),not(J="."),not(J=".."),not(string_concat(".",_,J))),G). + +string(String) --> list(String). + +list([]) --> []. +list([L|Ls]) --> [L], list(Ls). 
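`choose/2` above picks a random source string, splits it into candidate sentences on full stops and newlines, picks one at random, and lower-cases its first character. A rough Python sketch — as a simplification it drops empty pieces up front, where the Prolog version would instead fail on an empty pick and be retried by its caller:

```python
import random
import re

def choose(strings):
    # Rough sketch of choose/2: pick a random source string, split it
    # into sentences on "." and newlines, pick a random non-empty
    # sentence, then lower-case its first character.
    text = random.choice(strings)
    sentences = [s for s in re.split(r"[.\n]+", text) if s]
    item = random.choice(sentences)
    return item[0].lower() + item[1:]
```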
+ +short_essay_helper(%%Filex, + String01, + Reasons_per_paragraph) :- + retractall(critique3(_)), + assertz(critique3([])), + + retractall(refs(_)), + assertz(refs([])), + + retractall(refs_long(_)), + assertz(refs_long([])), + + retractall(key_words(_)), + assertz(key_words([])), + + directory_files("sources/",F), + delete_invisibles_etc(F,G), + + findall(String02a,(member(Filex1,G), + string_concat("sources/",Filex1,Filex), + phrase_from_file_s(string(String00a), Filex), + string_codes(String02b,String00a), + atom_to_term(String02b,String02a,[]) + %%split_string(String00, "\n\r", "\n\r", [String01a|_]), + + %%prepare_file_for_ml(String00,String02a) + ),String00), + + %%trace, + %%writeln1(String00), + %%notrace, + +%%writeln1(String02), + + + generate_file_name(File1,File2), + + Numbers=[1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20,21,22,23,24,25], + explain_structure(String01,Reasons_per_paragraph,File1), + exposition(String00,String01,Reasons_per_paragraph,Numbers,String02,Exposition), + + %%concat_list(["Do you agree or disagree with ",String01," (a/d) ? "],String2ad),%%get_string(String2ad,either,one-not-ml,"","",String3ad), + choose1(["a"%%,"d" + ],String3ad), + + (String3ad="a"-> + (retractall(agree_disagree(_)), + assertz(agree_disagree(agree))); + (retractall(agree_disagree(_)), + assertz(agree_disagree(disagree)))), + + +critique(String00,String01,Reasons_per_paragraph,Numbers,String02,Critique), + +agree_disagree(Pole), + + %%concat_list(["What is the future area of research from your essay about ",String01,"? "],Future_research_prompt), + %%trace, + %%get_string(Future_research_prompt,either,one-not-ml,"","",Future_research), + choose_sentence_range(String00,N_page_ref,String00a1,String00a2,_String00a3,String00a4), + choose(String00a4,String00a5), + concat_list(["In ",String01,", automation should apply to ",String00a5," (",String00a2,", p. 
",N_page_ref,")."],Future_research), + reference(String00a1), + + refs(R2),refs_long(R21), + +term_to_atom([Exposition,Critique,String3ad,Future_research,R21],File_contents),open_s(File1,write,Stream),write(Stream,File_contents),close(Stream), + +%% Output essay +%%findall(_,(member(Exposition1,Exposition),Exposition1= + + +%%writeln1([Exposition,Critique,Future_research,R2]), + +write_essay(String01,Pole,Exposition,Critique,Future_research,R2,Essay,HTML), +writeln1(Essay), + + (open_s(File2,write,Stream1), +%% string_codes(BrDict3), + write(Stream1,HTML), + close(Stream1)) + . + +%% replace("a\nb","\n","
\n",F). +%% F="a
\nb
\n". + +replace(A,Find,Replace,F) :- + split_string(A,Find,Find,B),findall([C,Replace],(member(C,B)),D),maplist(append,[D],[E]),concat_list(E,F). + +concat_list1(D,F) :- + maplist(append,[D],[E]),concat_list(E,F). + +write_essay(String01,Pole,Exposition,Critique,Future_research,R2,Essay,HTML) :- + write_heading(String01,Heading), + write_introduction(String01,Pole,Critique,Introduction), + write_exposition(Exposition,Exposition2a), + concat_list(["I will expose ",String01," in this half. ",Exposition2a],Exposition2), + %%string_concat(Exposition2,"\n",Exposition2a), + write_critique(Critique,Critique2a), + string_concat(Critique2a,"\n",Critique2b), + atom_string(Pole,Pole1), + concat_list(["I will ",Pole1," with ",String01," in this half. ",Critique2b],Critique2), + write_conclusion(String01,Pole,Critique,Future_research,Conclusion), + write_references(R2,References,Refs_no_heading), + concat_list([Heading,Introduction,Exposition2,Critique2,Conclusion,References], + Essay), + concat_list([Introduction,Exposition2,Critique2,Conclusion], + Essay2), + replace(Essay2,"\n","
",HTML1), + replace(Refs_no_heading,"\n","
",Refs_no_heading2), + concat_list(["",String01,"

", + String01,"

",HTML1,"

Bibliography

",Refs_no_heading2,""],HTML). + +write_heading(String01,Heading) :- + concat_list([String01,"\n","\n"],Heading). + +write_introduction(String01,Pole1,Critique,Introduction) :- + %% The heading should be in the form "Author's topic" + atom_string(Pole1,Pole2), + findall([Paragraph_topic_sentence," "],(member(A,Critique),A=[_,[Paragraph_topic_sentence|_]|_]),Paragraph_topic_sentences1), + concat_list1(Paragraph_topic_sentences1,Paragraph_topic_sentences2), + concat_list(["I will critically analyse ",String01,". ", + "I will ",Pole2," with ",String01,". ", + Paragraph_topic_sentences2,"\n\n"],Introduction). + +write_conclusion(String01,Pole1,Critique,Future_research,Conclusion) :- + atom_string(Pole1,Pole2), + findall([Paragraph_topic_sentence," "],(member(A,Critique),A=[_,[Paragraph_topic_sentence|_]|_]),Paragraph_topic_sentences1), + concat_list1(Paragraph_topic_sentences1,Paragraph_topic_sentences2), + concat_list([ + Paragraph_topic_sentences2, + "I have ",Pole2,"d with ",String01,". ", + Future_research,"\n\n" + ],Conclusion). + +write_references(R2,References,Refs_no_head) :- + findall([Reference,"\n"],member(Reference,R2),References1), + concat_list1(References1,References2), + concat_list([References2],Refs_no_head), + concat_list(["Bibliography","\n\n",References2],References). + +%%a1 +%% write_exposition([[[1,"g1"]],[[1,1,_15410,_15416,"a1","a1 is in g1."],[1,2,_15352,_15358,"a2","g1 contains a2."]]],A),writeln1(A). +%% A = "a1 a1 is in g1. a2 g1 contains a2. ". + +%% write_exposition([[[1,"g1"],[1,"g1"]],[[1,1,_15410,_15416,"a1","a1 is in g1."],[1,2,_15352,_15358,"a2","g1 contains a2."],[1,1,_15410,_15416,"a1","a1 is in g1."],[2,2,_15352,_15358,"a2","g1 contains a2."]]],A). +%% A = "a1 a1 is in g1. a2 g1 contains a2. a1 a1 is in g1. a2 g1 contains a2. ". 
+ +write_exposition(Exposition,Essay4b) :- + Numbers=[1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20,21,22,23,24,25], + +Exposition=[_,Exposition2], + + +findall([Essay4c%%,"\n" +],(member(Numbera11,Numbers), + + +%% not "" with findall +findall(Essay4,( +%%member(Exposition1,Exposition), +%%Exposition1=[Numbera11,_Number3a,_String3,_String3a,String5a,String3y,_String3ay,String5a1,_CNumber2aa,_CNumber3aa,CString5a1a,_CNumber2a1,_CNumber3a1,_LastCStrings,String5aaa], + + +%%output_exposition(Numbera11,Exposition2,"",Essay1), + + %%findall( Essay4,( + member(Exposition1,Exposition2),Exposition1=[Numbera11,_Number3a,_String3,_String3a,String5a,Group_link], + concat_list([String5a," ",Group_link," "],Essay4) + %%delete(Exposition,Exposition1,Exposition2) + %%output_exposition(Numbera11,Exposition2,Essay4,Essay6) +),Essay4a), +concat_list(Essay4a,Essay4c1),%%trace, +(Essay4c1=""->Essay4c="";concat_list([Essay4c1,"\n"],Essay4c)) + +),Essay4d), + +maplist(concat_list,Essay4d,Essay4f), +concat_list(Essay4f,Essay4b) +%%concat_list([Essay4e,"\n"],Essay4b) +%%concat_list([Essay1," ",Essay3],Essay2), + %%concat_list([Essay2,"\n"],Essay23)) +%%,Essay3a), + %%concat_list(Essay3a,Essay4a) +. +%% *** HTML (
not \n) + +%%%%% +%%a + +%% write_critique([[1,["heading is e12",[1,1,_25346,_25352,"e1",_25370,_25376,"e12"],[1,2,_25412,_25418,"e2",_25436,_25442,"e22",1,1,"e12",0,0,"e22","e12 is e22"]]]],A),writeln1(A). +%% A = "heading is e12 e1 e12 e2 e22 e12 is e22 \n". + +write_critique(Critique,Essay4):- + Numbers=[1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20,21,22,23,24,25], + +findall(Essay23,(member(Numbera11,Numbers), + +%% not "" with findall +%%findall(Essay22,( +member(Critique1,Critique), +%%Critique1=[Numbera11,_Number3a,_String3,_String3a,String5a,String3y,_String3ay,String5a1,_CNumber2aa,_CNumber3aa,CString5a1a,_CNumber2a1,_CNumber3a1,_LastCStrings,String5aaa], + +Critique1=[Numbera11,[Para_topic_sent,[_Number2a,_Number3a,_String3,_String3a,String5a,_String3y,_String3ay,String5a1]|Critique2]], + +concat_list([Para_topic_sent," ",String5a," ",String5a1," "],Essay0),output_critique(Numbera11,Critique2,String5a1,Essay0,Essay1), + + concat_list([Essay1,Para_topic_sent,"\n"],%%Essay22) + +%%), +Essay23)),Essay3), + concat_list(Essay3,Essay4) +. +%% *** HTML (
not \n) + +output_critique(_Numbera11,[],_String5a1a,Essay,Essay) :- !. +output_critique(Numbera11,Critique,CString5a1a,Essay1,Essay2) :- +findall( Essay6,(member(Critique1,Critique),Critique1=[Numbera11,_Number3a,_String3,_String3a,String5a,_String3y1,_String3ay,String5a1, +_CNumber2aa,_CNumber3aa,CString5a1a, + _CNumber2a1,_CNumber3a1,_LastCStrings, + String5aaa], + concat_list([String5a," ",String5a1," ",String5aaa," "],Essay4), + delete(Critique,Critique1,Critique2), + output_critique(Numbera11,Critique2,String5aaa,Essay4,Essay6) +),Essay33), +%%trace, +(Essay33=[]->concat_list([Essay1%%," " +],Essay2);%%(Essay33=[Essay3]->concat_list([Essay1," ",Essay3],Essay2);%%(Essay33=[Essay3|[E33]]-> concat_list([Essay1," ",Essay3,E33],Essay2); +(Essay33=[Essay3|E33], concat_list(E33,E34),concat_list([Essay1,%%" ", +Essay3,E34],Essay2))) +%%) +%%) +. + + +/**critique(String00,String01,Reasons_per_paragraph,Numbers,Critique). + **/ + +generate_file_name(File1,File2) :- + get_time(TS),stamp_date_time(TS,date(Year,Month,Day,Hour1,Minute1,Seconda,_A,_TZ,_False),local), + concat_list(["file",Year,Month,Day,Hour1,Minute1,Seconda,".txt"],File1), + concat_list(["file",Year,Month,Day,Hour1,Minute1,Seconda,".html"],File2). + + +explain_structure(String01,Reasons_per_paragraph,_File1) :- + concat_list(["The Short Essay Helper will automatically structure and write your essay about ",String01," with ",Reasons_per_paragraph," reasons per paragraph.","\n", + "The Helper will help write an exposition (which summarises but doesn't critique the idea), a critique (which agrees with or disagrees with the topic), the introduction and the conclusion (which state whether you agreed or disagreed with the topic, etc.). Citations will be automatically made.","\n","Note: Generated essays are not to be handed in, and you need to paraphrase and cite work you have referenced. Your grade depends on whether you agree or disagree and how many breasonings you breason out. 
Check the referencing style is appropriate for your class.","\n"],String1), + writeln(String1). + +choose_sentence_range(String00,N2,B,B1,B2,C) :- + length(String00,L), + numbers(L,1,[],N), + random_member(N1,N), + get_item_n(String00,N1,A), + A=[B,B1,B2|C], + N2 is N1+B2-1. + %%random_member(A,String00), + +exposition(String00,_String01,Reasons_per_paragraph,Numbers,ML_db,Exposition1) :- + length(List1,5), %% 5->1 paragraphs per exposition + append(List1,_,Numbers), + length(List2,Reasons_per_paragraph), + append(List2,_,Numbers), + + %%string_codes(String001,String00), + %%writeln(String001), + findall([Number1,Exposition2],( + %%trace, +member(Number1,List1),%%concat_list(["What is group ",Number1," of 5 in the exposition that groups ideas about ",String01,"? "],String1),%%get_string(String1,either,one-not-ml,"","",%ML_db,Exposition2) +%% ** Doesn't print this +choose_sentence_range(String00,N_page_ref,String00a1,String00a2,_String00a3,String00a4), + choose(String00a4,String00a5), + concat_list([String00a5," (",String00a2,", p. ",N_page_ref,") "],Exposition2), + reference(String00a1)),Exposition3), + + + findall([Number2,Number3,String3,String3a,String5a,String5],( + member(Number2,List1),member(Number3,List2),get_item_n(Exposition3,Number2,[_,Item1]), + %%trace, + concat_list([" ","\n","The Helper will ask you how the quote you are about to enter relates to the paragraph topic."],String2b),writeln(String2b), + %%trace, +exposition2(String00,Item1,ML_db,String3,String3a,String5a,String5) + ),Exposition4), + Exposition1=[Exposition3,Exposition4]. + +exposition2(String00,Item1,ML_db,String3,String3a,String5a,String5):- + (%%concat_list(["What is the paragraph number of the quote about the paragraph topic ",Item1,"? "],String2),get_number(String2,String3), + %%concat_list(["What is the sentence number of the quote about the paragraph topic ",Item1,"? 
"],String2a),get_number(String2a,String3a), + %%member1a([String3,String3a,String3aa],ML_db), + %%concat_list(["What is the paraphrased quote about the paragraph topic ",Item1,"? "],String4a),%%get_string(String4a,either,one-not-ml-ref,"",String3aa,String5a), + choose_sentence_range(String00,N_page_ref,String00a1,String00a2,_String00a3,String00a4), + choose(String00a4,String00a5), + concat_list([Item1," is because ",String00a5," (",String00a2,", p. ",N_page_ref,")."],String5a), + reference(String00a1), + + %%concat_list(["How does the quote you entered (",String5a,") relate to the paragraph topic ",Item1,"? "],String4), + %%trace, + %%SepandPad="#@~%`$?-+*^,()|.:;=_/[]<>{}\n\r\s\t\\!'0123456789",%%downcase_atom(Item1,String41a),split_string(String41a, SepandPad, SepandPad, Item1a), + + %%get_string(String4,either,one-not-ml,Item1a,String3aa,String5) + + choose_sentence_range(String00,N_page_ref1,String00a11,String00a21,_String00a31,String00a41), + choose(String00a41,String00a51), + concat_list([Item1," is because ",String00a51," (",String00a21,", p. ",N_page_ref1,")."],String5), + reference(String00a11) +) + ->true;exposition2(String00,Item1,ML_db,String3,String3a,String5a,String5). 
+ +%% Agree or disagree + +critique(String00,String01,Reasons_per_paragraph,Numbers,ML_db,Critique1) :- + length(List1,5), %% 5->1 paragraphs per critique + append(List1,_,Numbers), + length(List2,Reasons_per_paragraph), + append(List2,_,Numbers), + + %%string_codes(String001,String00), + %%writeln(String001), + + retractall(critique3(_)), + assertz(critique3([])), + + + findall([Number2a,Critique2],( +%% Reason 1 + +member(Number2a,List1), +%%List1=[Number2a|List1a], +List2=[Number3a|List2a], +%%trace, + + +critique_reason1(String00,String01,Number2a,Number3a,Reasons_per_paragraph,ML_db,Critique3,Topic_paragraph_link), + +critique_reasons_2_to_n(String00,Number2a,List2a,Critique3,Reasons_per_paragraph,ML_db,Critique4), + + append_list2([[Topic_paragraph_link],Critique3,Critique4],Critique2)),Critique1). + + critique_reason1(String00,String01,Number2a,Number3a,Reasons_per_paragraph,ML_db,Critique3,Topic_paragraph_link) :- + + %%member(Number2,List1),member(Number3,List2), + %%get_item_n(Critique3,Number2a,[_,Item1]), + %%trace, + concat_list([" ","\n","The Helper will ask you for your comment on the quote you are about to enter."],String2b),writeln(String2b), + %% Later: connections + + %%trace, +critique2(String00,String01,ML_db,String3,String3a,String5a,_String3y2,String3ay,String5a1,Topic_paragraph_link), + %%),Critique4). + Critique3=[[Number2a,Number3a,String3,String3a,String5a,_String3y3,String3ay,String5a1]]. + +critique2(String00,String01,ML_db,String3,String3a,String5a,String3y,String3ay,String5a1,Topic_paragraph_link):- + (%%concat_list(["What is the paragraph number of the quote to comment on? "],String2),get_number(String2,String3), + %%concat_list(["What is the sentence number of the quote to comment on? "],String2a),get_number(String2a,String3a), + %%member1a([String3,String3a,String3aa],ML_db), + %%concat_list(["What is the paraphrased quote to comment on? 
"],String4a),%%get_string(String4a,either,one-not-ml-ref,"",String3aa,String5a), + + choose_sentence_range(String00,N_page_ref1,String00a11,String00a21,_String00a31,String00a41), + choose(String00a41,String00a51), + concat_list([String00a51," (",String00a21,", p. ",N_page_ref1,")."],String5a), + reference(String00a11), + + %%concat_list(["Is your comment from a quote (y/n)? "],String2yn),get_string(String2yn,either,one-not-ml,"","",String3yn), + + %%agree_disagree(Pole), + + (%%String3yn="y" + true-> + (%%concat_list(["What is the paragraph number of the comment? "],String2y),get_number(String2y,String3y), + %%concat_list(["What is the sentence number of the comment? "],String2ay),get_number(String2ay,String3ay), + %%trace, + %%member1a([String3y,String3ay,String3aay],ML_db), + + %% use either original x or paraphrase + + %%concat_list(["What is the comment that is from the quote? "],String4ac), + %%trace, + %%get_string(String4ac,either,one-not-ml-ref,"",String3aay,String5a1) + + choose_sentence_range(String00,N_page_ref2,String00a12,String00a22,_String00a32,String00a42), + choose(String00a42,String00a52), + concat_list([String00a52," (",String00a22,", p. ",N_page_ref2,")."],String5a1), + reference(String00a12) + + ) + + ;(String3y=0,String3ay=0, + %%concat_list(["What is the comment? "],String4ac),%%get_string(String4ac,either,one-not-ml,"","",%%String5a, +%% String5a1) + + choose_sentence_range(String00,N_page_ref3,String00a13,String00a23,_String00a33,String00a43), + choose(String00a43,String00a53), + concat_list([String00a53," (",String00a23,", p. ",N_page_ref3,")."],String5a1), + reference(String00a13) + +)), + + %%concat_list(["How does the comment ",String5a1," relate to the essay topic ",String01,"? 
"],Topic_paragraph_link_prompt), + %%trace, + + %%downcase_and_split(String5a1,String5a1ds), + %%downcase_and_split(String01,String01ds), + %%get_string(Topic_paragraph_link_prompt,either,one-not-ml,String5a1ds,String01ds,Topic_paragraph_link) + + string_concat(String5a1_az,".",String5a1), choose_sentence_range(String00,N_page_ref4,String00a14,String00a24,_String00a34,String00a44), + choose(String00a44,String00a54), + concat_list([String01," is because of ",String5a1_az," because of ",String00a54," (",String00a24,", p. ",N_page_ref4,")."],Topic_paragraph_link), + reference(String00a14) + + /** + %% conn - choose connected comments + concat_list(["How does the quote you entered (",String5a,") relate to the paragraph topic ",Item1,"? "],String4), + %%trace, + SepandPad="#@~%`$?-+*^,()|.:;=_/[]<>{}\n\r\s\t\\!'0123456789",downcase_atom(Item1,String41a),split_string(String41a, SepandPad, SepandPad, Item1a), + + get_string(String4,either,two,Item1a,String3aa,String5) + **/ + ) + ->true;critique2(String00,ML_db,String3,String3a,String5a,String3y,String3ay,String5a1,Topic_paragraph_link). 
+ + + +critique_reasons_2_to_n(String00,Number2a,List2a,Critique3,Reasons_per_paragraph,ML_db,Critique4) :- + retractall(critique3(_)), + assertz(critique3(Critique3)), +%%trace, +/** critique3(Critique31), + append(Critique31,Critique3,Critique32), + retractall(critique3(_)), + assertz(critique3(Critique32)), +**/ +%%notrace, +findall([Number2a,Number3a,String3,String3a,String5a,String3y,String3ay,String5a1, +CNumber2aa,CNumber3aa,CString5a1a, + CNumber2a1,CNumber3a1,LastCStrings, + String5aaa],( + %%member(Number2,List1a), + member(Number3a,List2a), + %%get_item_n(Critique3,Number2a,[_,Item1]), + %%trace, + concat_list([" ","\n","The Helper will ask you for your comment on the quote you are about to enter."],String2b),writeln(String2b), + %% Later: connections + + + %%trace, +critique3(String00,ML_db,%%Critique3, +String3,String3a,String5a,String3y,String3ay,String5a1, +CNumber2aa,CNumber3aa,CString5a1a, + CNumber2a1,CNumber3a1,LastCStrings, + String5aaa) + + +/** + trace, + critique3(Critique31), + append(Critique31,[[String3,String3a,String5a,String3y,String3ay,String5a1]],Critique32), + retractall(critique3(_)), + assertz(critique3(Critique32)) +**/ + +), + Critique4). + + %%Critique3=[]. + +critique3(String00,ML_db,%%Critique3, +String3,String3a,String5a,String3y,String3ay,String5a1, +CNumber2aa,CNumber3aa,CString5a1a, + CNumber2a1,CNumber3a1,LastCStrings, + String5aaa):- + (%%concat_list(["What is the paragraph number of the quote to comment on? "],String2),get_number(String2,String3), + %%concat_list(["What is the sentence number of the quote to comment on? "],String2a),get_number(String2a,String3a), + %%member1a([String3,String3a,String3aa],ML_db), + %%concat_list(["What is the paraphrased quote to comment on? 
"],String4a),%%get_string(String4a,either,one-not-ml-ref,"",String3aa,String5a), + %%trace, + choose_sentence_range(String00,N_page_ref4,String00a14,String00a24,_String00a34,String00a44), + choose(String00a44,String00a54), + concat_list([String00a54," (",String00a24,", p. ",N_page_ref4,")."],String5a), + reference(String00a14), + + + %%concat_list(["Is your comment from a quote (y/n)? "],String2yn),%%get_string(String2yn,either,one-not-ml,"","",String3yn), + + %%agree_disagree(Pole), + + (true->%%String3yn="y"-> + (%%concat_list(["What is the paragraph number of the comment? "],String2y),get_number(String2y,String3y), + %%concat_list(["What is the sentence number of the comment? "],String2ay),get_number(String2ay,String3ay), + %%trace, + %%member1a([String3y,String3ay,String3aay],ML_db), + + %% use either original x or paraphrase x + %%trace, + %%concat_list(["What is the comment that is from the quote? "],String4ac), + %%trace, + %%get_string(String4ac,either,one-not-ml-ref,"",String3aay,String5a1) + + choose_sentence_range(String00,N_page_ref5,String00a15,String00a25,_String00a35,String00a45), + choose(String00a45,String00a55), + concat_list([String00a55," (",String00a25,", p. ",N_page_ref5,")."],String5a1), + reference(String00a15) + + %%,trace + ) + + ;(String3y=0,String3ay=0, + concat_list(["What is the comment? 
"],String4ac), + + %% This is never chosen + get_string(String4ac,either,one-not-ml,"","",%%String5a, + String5a1))), + %%*** assertz recurse not findall new critique3 x + + + %%trace, + critique3(Critique31), + append(Critique31,[[0,0,String3,String3a,String5a,String3y,String3ay,String5a1]],Critique32), + retractall(critique3(_)), + assertz(critique3(Critique32)), + + critique3(Critique33), + + +(( + + + %%critique3(Critique3), + length(Critique33,LCritique3), + %%length(List1,LCritique3), + numbers(LCritique3,1,[],List1), + %%append(List1,_,Numbers), + %%Numbers=[1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20,21,22,23,24,25], + %%trace, + +findall([N," - ",CString5a1,"\n"],(member(N,List1),get_item_n(Critique33,N,[CNumber2a,CNumber3a,_CString3,_CString3a,_CString5a,_CString3y,_CString3ay,CString5a1])),CStrings1), + +findall([N,CNumber2a,CNumber3a,CString5a1],(member(N,List1),get_item_n(Critique33,N,[CNumber2a,CNumber3a,_CString31,_CString3a1,_CString5a1,_CString3y1,_CString3ay1,CString5a1])),CStrings2), + +%%trace, + +%%CStrings=[CStrings1,CStrings2], +reverse(CStrings2,CStringsR),CStringsR=[[_,CNumber2a1,CNumber3a1,LastCStrings]|CStringsR1], +reverse(CStringsR1,CStringsRR), + +reverse(CStrings1,CStringsR10),CStringsR10=[_|CStringsR11], +reverse(CStringsR11,CStringsRR1), + +append_list2(CStringsRR1,CStrings11), +concat_list(CStrings11,_CStrings12), +%%concat_list(["Please select a comment to connect the comment ",LastCStrings," to:","\n",CStrings12],ConnectionNumberPrompt), +%%get_number(ConnectionNumberPrompt,ConnectionNumber), +%numbers( +choose1(List1,ConnectionNumber), + member([ConnectionNumber,CNumber2aa,CNumber3aa,CString5a1a],CStringsRR), + + %%SepandPad="#@~%`$?-+*^,()|.:;=_/[]<>{}\n\r\s\t\\!'0123456789",%%downcase_atom(CString5a1a,CString5a1a1),split_string(CString5a1a1, SepandPad, SepandPad, CString5a1a2), + + %%CNumber2a1,CNumber3a1, + %%downcase_atom(LastCStrings,LastCStrings_a),split_string(LastCStrings_a, SepandPad, SepandPad, LastCStrings_a1), 
+ + + %% conn - choose connected comments, this to a previous comment + %%trace, + + %%concat_list(["How does ",LastCStrings," connect to ",CString5a1a,"? "],ConnectionPrompt), + + %%get_string(ConnectionPrompt,either,one-not-ml,CString5a1a2,LastCStrings_a1,String5aaa) + + string_concat(LastCStrings_az,".",LastCStrings), + string_concat(CString5a1a_az,".",CString5a1a), choose_sentence_range(String00,N_page_ref6,String00a16,String00a26,_String00a36,String00a46), + choose(String00a46,String00a56), + concat_list([LastCStrings_az," is because of ",CString5a1a_az," because of ",String00a56," (",String00a26,", p. ",N_page_ref6,")."],String5aaa), + reference(String00a16) + + + )->true;( + + %% If the section since updating dynamic critique comments fails, prevent doubling of critique comments + + critique3(Critique311), + reverse(Critique311,Critique312), + Critique312=[_|Critique313], + reverse(Critique313,Critique314), + retractall(critique3(_)), + assertz(critique3(Critique314)),fail + + )) + +/** Critique4=[String3,String3a,String5a,String3y,String3ay,String5a1, + CNumber2aa,CNumber3aa,CString5a1a, + CNumber2a1,CNumber3a1,LastCStrings, + String5aaa], + **/ + /** + critique3(Critique31), + append(Critique31,[[String3,String3a,String5a,String3y,String3ay,String5a1]],Critique32), + retractall(critique3(_)), + assertz(critique3(Critique32)) +**/ + + ) + %% Retries the predicate if it fails + ->true;critique3(String00,ML_db,%%Critique3, + String3,String3a,String5a,String3y,String3ay,String5a1, + CNumber2aa,CNumber3aa,CString5a1a, + CNumber2a1,CNumber3a1,LastCStrings, + String5aaa + ). + + + + +member1a([String3,String3a,String3aa],ML_db) :- + member([String3,String3a,String3aa],ML_db),!. + + +get_item_n(Exposition,Number1,Item) :- + Number2 is Number1-1, + length(List,Number2), + append(List,[Item|_],Exposition). 
+ +get_number(Prompt1,Number) :- + %%concat_list(Prompt1,Prompt2), + (%%repeat, + writeln(Prompt1),read_string(user_input, "\n", "\r", _End1, String),split_string(String, ",", " ", Value1),Value1=[Value2],number_string(Number,Value2)). + + +/** +get_string(Prompt1,String1) :- + concat_list(Prompt1,Prompt2), + (repeat,write(Prompt2),read_string(String1),not(String1="")). +**/ + +equals_empty_list([]). + +downcase_and_split(String1, String2) :- +SepandPad="&#@~%`$?-+*^,()|.:;=_/[]<>{}\n\r\s\t\\\"!'0123456789", + downcase_atom(String1,String3),split_string(String3, SepandPad, SepandPad, String2). + +get_string(Prompt2,Flag1,Flag2,ML_db0,ML_db1,String2) :- +%%writeln1(get_string(Prompt2,Flag1,Flag2,ML_db1,String2)), + %%concat_list(Prompt1,Prompt2), + %%trace, + SepandPad="&#@~%`$?-+*^,()|.:;=_/[]<>{}\n\r\s\t\\\"!'0123456789", + %%(repeat, + (Flag2=one-not-ml-ref->(concat_list(["Note: Enter in-text reference using AGPS, e.g.\n","The first work supports the second work (Surname 2000, pp. 18-9).\n","Surname (2000, pp. 
18-9) states that ...\n","Remember to use words like \"also\", \"moreover\" and \"in addition\" before the sentence."],String_a1),writeln(String_a1));true), + writeln(Prompt2),read_string(user_input, "\n", "\r", _End2, String2aa),%%not(String2aa=""), + %%String2aa=[String2aaa], + downcase_atom(String2aa,String3),split_string(String3, SepandPad, SepandPad, String4), + Neg_term_list=["no","not","don","t","shouldn","wouldn","disagree","differ","dislikes","disagrees","differs","dislikes","disagreed","differed","disliked","negative","negation","non","negate","negates","negated","but","however","isn","lack"], + (Flag1=%%non_negative + positive->( + + (findall(Item11,( + + member(Item1,String4), + findall(Item1,( +member(Item2,Neg_term_list),(Item1=Item2->(write("Error: Contains the negative term \""),write(Item1),writeln("\".")))),Item11)),Item12)), + +maplist(equals_empty_list,Item12) + +); + ((Flag1=negative->((member(Item1,String4),member(Item1,Neg_term_list))->true;(writeln("Error: Contains no negative term, one of:"),writeln(Neg_term_list),fail));true)->true;Flag1=either)), + (Flag2=one-not-ml->String2=String2aa; + (Flag2=one-not-ml-ref-> + (refs(R1),writeln("What is the reference? e.g. 
Surname, A 2000, Title: Subtitle, Publisher, City.\n"),writeln("Existing references (copy one or many delimited with \"\\n\"):"), findall(_,(member(R11,R1),writeln(R11)),_),read_string(user_input, "\n", "\r", _End3, String2r),not(String2r=""),%%downcase_atom(String2r,_String3r), + String2=String2aa,split_string(String2r,"\n\r","\n\r",String2r3), + %%trace, + retractall(refs(_)),maplist(append,[[R1,String2r3]],[String2r21]), + sort1(String2r21,String2r2), + assertz(refs(String2r2))%%split_string(String3r, SepandPad, SepandPad, String4) + ); +(Flag2=one->(%%split_string(String4,SepandPad,SepandPad,String21), + writeln("Attempt 1"), + + (length(String4,Length_string1), +(check_strings_container1(Length_string1,String4,[0,0,ML_db1],[[0,0,[xxx,xxx,xxx,xxx,xxx]],[0,0,ML_db1],[999,999,[]]],_,_List2)-> + writeln("Success");(writeln("Failed"),fail)) + %%(%%data_instance_k_classification1([[0,0,[xxx,xxx,xxx,xxx,xxx]],[0,0,ML_db1]%%,[999,999,[]] + %%],[0,0,String4]%%21 + %%,1,String4a%%21 + %%), + %%String4=String4a) + %%)-> + %%(%%String4a=[_,_,String4a1], + %%writeln([found,String4a,String4]), +%%writeln("Success")); + %%(writeln("Failed"),fail) + %% + )); + (Flag2=%%[ + two,%%,P1,S1,P2,S2], + %%trace, + append([ML_db0],[ML_db1],ML_db2), + check_strings(String4,ML_db2%%,P1,S1,P2,S2 + ))))). + +reference(String2r) :- + refs_long(R10), + (refs(R1),%%writeln("What is the reference? e.g. 
Surname, A 2000, Title: Subtitle, Publisher, City.\n"),writeln("Existing references (copy one or many delimited with \"\\n\"):"), + findall(_,(member(R11,R1),writeln(R11)),_),%%read_string(user_input, "\n", "\r", _End4, String2r), + not(String2r=""),%%downcase_atom(String2r,_String3r), + %%String2=String2aa, + split_string(String2r,"\n\r","\n\r",String2r3), + %%trace, + retractall(refs(_)),maplist(append,[[R1,String2r3]],[String2r21]), + retractall(refs_long(_)),maplist(append,[[R10,String2r3]],[String2r210]), + sort1(String2r21,String2r2), + assertz(refs(String2r2)), + assertz(refs_long(String2r210))%%split_string(String3r, SepandPad, SepandPad, String4) + ). + + +%% Sorts by first surname then title in AGPS + +sort1(List1,List2) :- + findall([C,B],(member(B,List1),downcase_atom(B,D),atom_string(D,C)),E),sort(E,F),findall(G,member([_,G],F),List2). + + +prepare_file_for_ml(String000,String021) :- + string_codes(String001,String000), + downcase_atom(String001,String00), + split_string(String00, "\n\r", "\n\r", String01), + delete(String01,"",String02), + findall(String03,(member(String02a,String02),split_string(String02a,".",".",String04), + ((String04=[String05|String06], + number_string(Number05,String05), + number_sentences(Number05,1,String06,[],String03))->true; + (findall(String08, + (SepandPad="&#@~%`$?-+*^,()|.:;=_/[]<>{}\n\r\s\t\\\"!'0123456789", + member(String04a,String04),split_string(String04a,SepandPad,SepandPad,String09), + append_list([[""],"",String09],String08)),String03)))),String0211), + append_list2(String0211,String021). + +number_sentences(_,_,[],String,String) :- !. 
number_sentences(Number_a,Number_b1,String06,String07,String08) :- + String06=[String00|String12], + +SepandPad="&#@~%`$?-+*^,()|.:;=_/[]<>{}\n\r\s\t\\\"!'0123456789", + split_string(String00,SepandPad,SepandPad,String09), + append_list([[Number_a],Number_b1,String09],String10), + append(String07,[String10],String11), + Number_b2 is Number_b1 + 1, + number_sentences(Number_a,Number_b2,String12,String11,String08). + + +data_instance_k_classification1(Data,I,K,C):- + maplist(o_oclass_disClass1(I),Data,DisAndClass), + keysort(DisAndClass,DSorted), + + %%writeln(DSorted), + + length(TopK,K), + append(TopK,_,DSorted), + %This is not a very good way of selecting k as you may have many values with the same distance, and the sorting just cuts these off + %Dsorted = [1-pos,3-pos,3-pos,3-neg,3-neg] + %Topk =[1-pos,3-pos,3-pos] + topk_vote(TopK,C). + +o_oclass_disClass1(O,[A,B,O2],D-[A,B,O2]):- + o_o_dis(O,O2,D). + +%% could be in either order +%% a([w,z,a,b,e,c,z,y],[1,1,[c]],[1,2,[a]]). + +%% true +%% a([a,c,b],[1,1,[a,d]],[1,2,[b]]). +%% a([a,c,b],[1,1,[a,c]],[1,2,[b]]). +%% a([a,b],[1,1,[a]],[1,2,[b]]). + +%% false +%% a([c,d],[1,1,[a]],[1,2,[b]]). + +%% q in "q a b" sends food like e in "c d e" + +/** + +[debug] ?- a([q,e,r,a,t,y,u,q,e,r,a,t,y,u,c,b,x,v,n,m],[1,1,[c,a,t,y,u]],[1,2,[c,a,t,y,u]]). [[1,1,[c,a,t,y,u]],[r,a,t,y,u]] +[[1,2,[c,a,t,y,u]],[r,a,t,y,u]] +true. + +X: +[debug] ?- a([q,e,r,a,t,y,u,c,b,x,v,n],[1,1,[c,a,t,y,u]],[1,2,[b,x,u]]). +[[1,1,[c,a,t,y,u]],[r,a,t,y,u]] +[[1,2,[b,x,u]],[c,b,x]] +true. + +[debug] ?- a([q,e,r,a,t,y,u,c,c,c,b,x,v,n],[1,1,[c,a,t,y,u]],[1,2,[b,x,v]]). [[1,1,[c,a,t,y,u]],[r,a,t,y,u]] +[[1,2,[b,x,v]],[c,c,c]] +true. 
+ +**/ + +check_strings(String1,ML_db) :- + %%member([P1,S1,String2],ML_db), + %%member([P2,S2,String3],ML_db), + ML_db=[String2a,String3a], +%%writeln(["String1,String2a,String3a",String1,String2a,String3a]), + String2=[0,0,String2a], + String3=[0,0,String3a], +%%a(String1,String2,String3):- + length(String1,Length_string1), +((writeln("Attempt 1"), +check_strings_container1(Length_string1,String1,String2,[[0,0,[xxx,xxx,xxx,xxx,xxx]],String2,[999,999,[]]],_,List2), + length(List2,Length_list2), + %%Length_list3 is Length_list2+1, + %%writeln(here), + check_strings_container1(Length_list2,List2,String3,[[0,0,[xxx,xxx,xxx,xxx,xxx]],String3,[999,999,[]]],_,_List3), + writeln("Success") + %%,trace + )->true; + (writeln("Failed"), + writeln("Attempt 2"), + ((check_strings_container1(Length_string1,String1,String3,[[0,0,[xxx,xxx,xxx,xxx,xxx]],String3,[999,999,[]]],_,List2), + length(List2,Length_list2), + %%Length_list3 is Length_list2+1, + %%writeln(here), + + check_strings_container1(Length_list2,List2,String2,[[0,0,[xxx,xxx,xxx,xxx,xxx]],String2,[999,999,[]]],_,_List3))-> + writeln("Success");(writeln("Failed"),fail))) +). + +check_strings_container1(Length_string1,String1,String2,Db,List2,List2b) :- +check_strings1(Length_string1,String1,String2,Db,List2,List2b),not(var(List2b)). +%%check_strings_container2(Length_string1,Length_string3,List1,String2,Db,List2,List2b) :- +%%check_strings2(Length_string1,Length_string3,List1,String2,Db,List2,List2b),not(var(List2b)). + + + %% finds the length of each string - from position 1 to (length of string1) of string1 + %% starts with the shortest possible string, then increases the length + %% repeat this for second string + %%string1 + + %%*** check with s2,s3 in db +check_strings1(0,_String1,_String2,_Db,List2,List2) :- !. 
+check_strings1(Length_string1,String1,String2,Db,List2,List2b) :- + %% finds the first str + %%length(List,Index), + length(List1,Length_string1), + append(_,List1,String1), + Length_string2 is Length_string1-1, + Length_string3 is Length_string1+1, + check_strings2(0,Length_string3,List1,String2,Db,List2,List2c), + (var(List2c)-> + check_strings1(Length_string2,String1,String2,Db,List2c,List2b);List2c=List2b),!. + +check_strings2(Length_string,Length_string,_String1,_String2,_Db,List2,List2) :- !. +check_strings2(Length_string1,Length_string3,List1,String2,Db,List2,List2b) :- + + + %%split_string(String4,SepandPad,SepandPad,String21), + + %% go through s1, removing the first word each time, until v + %%Length_string11 is Length_string1-1, + length(List11,Length_string1), + append(List11,List2,List1), + Length_string2 is Length_string1+1, + +%%writeln([list11,List11]), + + ((%%trace, + %%(List11=[z,a,b]->trace;true), + %%writeln([[_,_,List11],String2]), + %%notrace, + not(List11=[]), + +%%writeln(data_instance_k_classification1(Db +%% ,List11,1,A)), + data_instance_k_classification1(Db + %%String2 + ,List11,1,A), + %%writeln([string2,String2,list11,List11,a,A]), +A=String2,%%[_,_,List11], + %%trace, + + %%((List11=[z,a,b],A=[_,_,[z,t,b]],not(String2=[_,_,[c,z]]))->trace;notrace), + %%trace, +%%writeln([string2=list11,String2=List11]), +%%(List11=[y]->true%%trace +%%;true), +%%member(List11,Db), + %%**String2=[_,_,List11] + List2b=List2,%%,notrace + String2=[_,_,String21] + ,writeln([found,String21,List11]) + ) + ->(true + %%,writeln(List11) + ); + check_strings2(Length_string2,Length_string3,List1,String2,Db,_List2c,List2b)),!. + +numbers(N2,N1,Numbers,Numbers) :- + N2 is N1-1,!. +numbers(N2,N1,Numbers1,Numbers2) :- + N3 is N1+1, + append(Numbers1,[N1],Numbers3), + numbers(N2,N3,Numbers3,Numbers2). 
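The comment inside `data_instance_k_classification1` above notes that truncating the distance-sorted list at k "is not a very good way of selecting k as you may have many values with the same distance, and the sorting just cuts these off" (e.g. `DSorted = [1-pos,3-pos,3-pos,3-neg,3-neg]` with `TopK = [1-pos,3-pos,3-pos]`). A minimal Python sketch of the same keysort-then-vote scheme, using the common fix of keeping every item that ties with the k-th distance (the numeric distance function here is an assumption; the Prolog uses `o_o_dis/3` over word lists):

```python
from collections import Counter

def knn_classify(data, instance, k, distance):
    """k-NN vote in the style of data_instance_k_classification1:
    sort labelled items by distance to `instance`, keep the top k,
    and take the majority label. Unlike the Prolog's plain truncation,
    items tying with the k-th distance are kept, addressing the
    tie-cutoff problem the source comment describes."""
    ranked = sorted(data, key=lambda item: distance(instance, item[0]))
    kth = distance(instance, ranked[k - 1][0])  # distance of the k-th item
    top = [label for value, label in ranked
           if distance(instance, value) <= kth]
    return Counter(top).most_common(1)[0][0]
```

For example, with `distance = lambda a, b: abs(a - b)` and `data = [(1, "pos"), (2, "pos"), (3, "neg"), (3, "neg"), (9, "neg")]`, classifying `0` with `k=2` votes `"pos"`, while classifying `4` votes `"neg"`.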
diff --git a/Essay-Helper/source_tagger.pl b/Essay-Helper/source_tagger.pl new file mode 100644 index 0000000000000000000000000000000000000000..d7f5552451b1353d7f91fb399ff54f081a24229d --- /dev/null +++ b/Essay-Helper/source_tagger.pl @@ -0,0 +1,56 @@ +source_tagger :- + phrase_from_file_s(string(String00a),"tags.txt"), + string_codes(String02b,String00a), + atom_to_term(String02b,String02a,[]), + + ask_for_new_tag(String02a,String02c), + + term_to_atom(String02c,File_contents), + open_s("tags.txt",write,Stream), + write(Stream,File_contents),close(Stream). + +ask_for_new_tag(A,B) :- + prompt("New tag? (y/n)",New_tag), + (New_tag="y"->(tag(Tag,Text,Source), + append(A,[[Tag,Text,Source]],C), + ask_for_new_tag(C,B)); + A=B). + +tag(Tag,Text,Source) :- + prompt("What are the tags?",Tag), + prompt("What is the text?",Text), + prompt("What is the source?",Source). + +prompt(Prompt,String) :- + writeln(Prompt), + read_string(user_input, "\n", "\r", _End, String). + +%% - enter text, source and tags + +%% - tags used in x are chapter topics + +print_tags :- + phrase_from_file_s(string(String00a),"tags.txt"), + string_codes(String02b,String00a), + atom_to_term(String02b,String02a,[]), + + findall(T4,(member(A,String02a),A=[T1,_,_], + downcase_atom(T1,T2),atom_string(T2,T3), + split_string(T3,", ",", ",T4)),T5), + + maplist(append,[T5],[T6]), + sort(T6,T7), + + findall(_,(member(T8,T7),writeln(T8)),_), + + prompt("Enter tag to show report of.",T9), + + findall([TA,TB],(member(A,String02a),A=[T1,TA,TB], + downcase_atom(T1,T2),atom_string(T2,T3), + split_string(T3,", ",", ",T4), + member(T9,T4)),T10), + + findall(_,(member([T11,T12],T10),writeln(T11),writeln(T12),writeln("")),_). 
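`print_tags` above lowercases each entry's comma-separated tag string, splits it, prints the sorted unique tags, and then reports the text/source pairs whose tag list contains the queried tag. A minimal Python sketch of that report logic (the `[tags, text, source]` triple layout follows `tags.txt` as used by `source_tagger`; function and variable names here are illustrative):

```python
def tag_report(entries, query):
    """Mirror of print_tags: entries are [tags, text, source] triples,
    where `tags` is a comma-separated string. Returns the sorted unique
    lowercased tags and the (text, source) pairs tagged with `query`."""
    def split_tags(tags):
        return [t.strip() for t in tags.lower().split(",") if t.strip()]

    all_tags = sorted({t for tags, _, _ in entries for t in split_tags(tags)})
    matches = [(text, source) for tags, text, source in entries
               if query in split_tags(tags)]
    return all_tags, matches
```

For example, `tag_report([["Logic, Mind", "text1", "src1"], ["mind", "text2", "src2"]], "mind")` yields `(["logic", "mind"], [("text1", "src1"), ("text2", "src2")])`.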
+ + + diff --git a/Essay-Helper/sources/.DS_Store b/Essay-Helper/sources/.DS_Store new file mode 100644 index 0000000000000000000000000000000000000000..2a396338634876d95a38df1a28aefaebbc98d582 Binary files /dev/null and b/Essay-Helper/sources/.DS_Store differ diff --git a/Essay-Helper/sources/1.txt b/Essay-Helper/sources/1.txt new file mode 100644 index 0000000000000000000000000000000000000000..77ea6c776002f71b119c9b1a193522353cc9958d --- /dev/null +++ b/Essay-Helper/sources/1.txt @@ -0,0 +1,6 @@ +["Surname, A 2000, Title: Subtitle, Publisher, City.","Surname, A 2000",1,"a +a","B +b +b","C +c +c"] \ No newline at end of file diff --git a/Essay-Helper/sources/2.txt b/Essay-Helper/sources/2.txt new file mode 100644 index 0000000000000000000000000000000000000000..77ea6c776002f71b119c9b1a193522353cc9958d --- /dev/null +++ b/Essay-Helper/sources/2.txt @@ -0,0 +1,6 @@ +["Surname, A 2000, Title: Subtitle, Publisher, City.","Surname, A 2000",1,"a +a","B +b +b","C +c +c"] \ No newline at end of file diff --git a/Essay-Helper/sources/3.txt b/Essay-Helper/sources/3.txt new file mode 100644 index 0000000000000000000000000000000000000000..e03f73542b4a921dd8fe450678d3fd0ef69f7971 --- /dev/null +++ b/Essay-Helper/sources/3.txt @@ -0,0 +1,6 @@ +["Surname, A 2000, Title: Subtitle, Publisher, City.","Surname, A 2000",1,"'''''' + +"," +b"," +b +b"] \ No newline at end of file diff --git a/Essay-Helper/tags.txt b/Essay-Helper/tags.txt new file mode 100644 index 0000000000000000000000000000000000000000..0637a088a01e8ddab3bf3fa98dbe804cbde1a0dc --- /dev/null +++ b/Essay-Helper/tags.txt @@ -0,0 +1 @@ +[] \ No newline at end of file diff --git a/Essay-Helper/texttobr2qb.pl b/Essay-Helper/texttobr2qb.pl new file mode 100644 index 0000000000000000000000000000000000000000..5702c5bb72e066272fe1857cb538b1b6d47e02d8 --- /dev/null +++ b/Essay-Helper/texttobr2qb.pl @@ -0,0 +1,10 @@ +%% 250 breasonings + +texttobr2 :- 
+br1([[a,1,1.5,0],[about,1,1,0],[ache,1,0,0],[after,1,1,0],[all,1,1.5,0],[an,1,1.5,0],[and,1,1,0],[appearances,5,0,5],[apple,5,5,5],[are,1,1,0],[area,5,5,0],[argument,1,1.5,0],[as,1,1,0],[asanas,180,50,1],[at,1,1,0],[ate,15,2,1],[avoid,1,1,0],[b,1,1,1],[bacteria,2,1,1],[based,5,5,1],[be,50,30,180],[before,1,1,0],[being,50,30,180],[binding,1,1,0],[blocking,1,1,1],[bolt,2,1,1],[box,1,1,1],[breakfast,10,10,1],[breasoned,5,5,5],[breasoning,5,5,5],[breasonings,5,5,5],[by,1,1,0],[chapter,10,15,1],[clear,1,1,0],[colds,1,1,1],[comfort,180,50,30],[comfortable,50,50,100],[complexes,1,1,1],[conception,50,14,8],[connected,1,1,0],[correctness,1,1,0],[day,10,10,10],[depression,1,1,0],[details,1,1,1],[did,1,1,0],[do,50,30,180],[dotted,1,1,0],[down,1,1,0],[drinking,5,5,5],[during,1,1.5,0],[each,1,1,0],[earned,1,1.5,0],[education,50,30,180],[effects,1,1.5,0],[elderberries,1,1,1],[ensure,1,1,0],[excess,1,1,1],[exercises,10,10,1],[f,5,5,5],[feeling,1,1,0],[felt,1,1,0],[first,1,1.5,0],[flu,1,1,0],[food,2,1,1],[for,1,1,0],[from,1,1,0],[fruits,1,1,1],[function,1,1.5,0],[gaffer,10,10,1],[gave,1,1,1],[glasses,15,11,3],[god,50,30,180],[grade,1,1.5,0],[grains,1,1,1],[had,1,1,0],[hallucinogenic,1,1,0],[happiness,1,1,0],[happy,1,1,0],[have,5,5,5],[head,15,20,23],[headache,1,1,0],[headaches,1,1,0],[health,50,30,180],[healthy,50,30,180],[help,5,5,0],[helped,50,30,180],[high,1,1.5,0],[hills,5,5,2],[honey,10,10,10],[i,50,30,180],[imagery,5,5,0],[in,1,1,0],[incompatibility,1,1,0],[interpretation,20,30,0],[it,1,1,0],[keep,1,1,0],[laughter,1,1.5,0],[maintain,1,1,0],[maintaining,1,1,0],[master,50,30,180],[mind,3,4,4],[minutes,20,30,0],[mistake,1,1,0],[muscle,1,1,1],[muscles,1,1,1],[music,1,1,0],[must,1,1,0],[my,50,30,180],[namaskar,15,10,23],[neck,15,15,10],[next,1,1,0],[nietzsche,50,30,180],[no,1,1,0],[noticed,50,30,180],[nut,1,1,0],[nuts,1,1,1],[of,1,1,0],[off,1,1.5,0],[on,1,1,0],[one,1,1.5,0],[organs,1,1,1],[out,1,1,0],[over,1,1,0],[part,1,1,1],[peace,1,1.5,0],[perfect,1,1.5,0],[performed,50,30,180
],[performing,50,30,250],[philosophy,10,20,1],[physiology,50,30,180],[pimple,1,1,0],[pm,5,5,5],[poem,1,1.5,0],[positive,1,1,0],[pot,5,5,10],[prayer,5,5,0],[prepared,1,1,0],[prevent,1,1,0],[prevented,1,1,0],[preventing,1,1,0],[prevention,1,1,0],[problems,1,1.5,0],[product,1,150,1],[products,5,5,5],[psychiatry,50,30,180],[psychology,2,2,0],[quality,1,1.5,0],[quantum,1,1,0],[read,15,1,20],[relax,180,50,30],[relaxed,180,50,30],[remain,1,1.5,0],[require,1,1,0],[room,500,400,300],[s,1,1,0],[safe,1,1.5,0],[sales,1,1.5,0],[same,1,1,0],[schizophrenia,1,1,0],[second,1,1.5,0],[see,1,0,1],[seek,1,1,0],[sell,5,3,0],[sets,5,1,0],[sex,1,1.5,0],[short,1,1,0],[single,1,1.5,0],[sites,20,0,30],[sitting,50,70,150],[sized,30,1,0],[skin,1,1,0],[sleep,180,50,30],[slices,2,2,2],[so,1,1,0],[societal,50,30,180],[some,1,1,0],[spine,1,1,10],[spiritual,50,30,180],[state,1,1.5,0],[stated,20,30,0],[stating,50,30,180],[status,1,1.5,0],[stop,5,1,15],[straight,1,1,0],[strawberry,1,1,1],[structure,1,1,1],[studied,50,30,180],[study,100,50,100],[studying,50,30,180],[subject,50,30,180],[successful,1,1.5,0],[surya,1,1.5,0],[sutra,5,1.5,0],[table,7,5,0],[tape,10,10,1],[task,15,2.5,0],[technique,1,1,0],[test,20,20,0],[text,20,30,0],[that,1,1,0],[the,1,1,0],[them,50,30,180],[then,1,1,0],[they,50,30,180],[thinking,1,1.5,0],[third,1,1.5,0],[this,1,1,0],[thoughts,1,1,0],[through,1,1,0],[tied,1,1,0],[time,1,1.5,0],[to,1,1,0],[together,1,1,0],[touch,1,1,0],[train,1,1,1],[trains,50,30,180],[travel,10,3,0],[treating,20,15,3],[turn,1,1,0],[two,1,1.5,0],[university,100,75,3],[unwanted,1,1,0],[up,1,1,0],[upasana,100,75,100],[used,1,1,0],[using,1,1,0],[var,1,1,1],[vegetables,5,5,5],[videos,5,0,3],[view,5,5,5],[virality,1,1,0],[viruses,1,1,2],[vitamin,1,1,1],[walks,100,500,0],[wanted,15,20,3],[warm,1,1,0],[was,1,1,0],[water,5,5,5],[way,1,1,0],[ways,50,100,0],[well,50,30,180],[where,1,1,0],[whole,1,1,1],[with,1,1,0],[without,1,1,0],[words,5,7,1],[write,15,1,1],[writing,5,5,0],[wrote,15,1,1],[years,3,1,0]]). 
+ +br1([[a,1,1.5,0],[about,1,1,0],[ache,1,0,0],[after,1,1,0],[all,1,1.5,0],[an,1,1.5,0],[and,1,1,0],[appearances,5,0,5],[apple,5,5,5],[are,1,1,0],[area,5,5,0],[argument,1,1.5,0],[as,1,1,0],[asanas,180,50,1],[at,1,1,0],[ate,15,2,1],[avoid,1,1,0],[b,1,1,1],[bacteria,2,1,1],[based,5,5,1],[be,50,30,180],[before,1,1,0],[being,50,30,180],[binding,1,1,0],[blocking,1,1,1],[bolt,2,1,1],[box,1,1,1],[breakfast,10,10,1],[breasoned,5,5,5],[breasoning,5,5,5],[breasonings,5,5,5],[by,1,1,0],[chapter,10,15,1],[clear,1,1,0],[colds,1,1,1],[comfort,180,50,30],[comfortable,50,50,100],[complexes,1,1,1],[conception,50,14,8],[connected,1,1,0],[correctness,1,1,0],[day,10,10,10],[depression,1,1,0],[details,1,1,1],[did,1,1,0],[do,50,30,180],[dotted,1,1,0],[down,1,1,0],[drinking,5,5,5],[during,1,1.5,0],[each,1,1,0],[earned,1,1.5,0],[education,50,30,180],[effects,1,1.5,0],[elderberries,1,1,1],[ensure,1,1,0],[excess,1,1,1],[exercises,10,10,1],[f,5,5,5],[feeling,1,1,0],[felt,1,1,0],[first,1,1.5,0],[flu,1,1,0],[food,2,1,1],[for,1,1,0],[from,1,1,0],[fruits,1,1,1],[function,1,1.5,0],[gaffer,10,10,1],[gave,1,1,1],[glasses,15,11,3],[god,50,30,180],[grade,1,1.5,0],[grains,1,1,1],[had,1,1,0],[hallucinogenic,1,1,0],[happiness,1,1,0],[happy,1,1,0],[have,5,5,5],[head,15,20,23],[headache,1,1,0],[headaches,1,1,0],[health,50,30,180],[healthy,50,30,180],[help,5,5,0],[helped,50,30,180],[high,1,1.5,0],[hills,5,5,2],[honey,10,10,10],[i,50,30,180],[imagery,5,5,0],[in,1,1,0],[incompatibility,1,1,0],[interpretation,20,30,0],[it,1,1,0],[keep,1,1,0],[laughter,1,1.5,0],[maintain,1,1,0],[maintaining,1,1,0],[master,50,30,180],[mind,3,4,4],[minutes,20,30,0],[mistake,1,1,0],[muscle,1,1,1],[muscles,1,1,1],[music,1,1,0],[must,1,1,0],[my,50,30,180],[namaskar,15,10,23],[neck,15,15,10],[next,1,1,0],[nietzsche,50,30,180],[no,1,1,0],[noticed,50,30,180],[nut,1,1,0],[nuts,1,1,1],[of,1,1,0],[off,1,1.5,0],[on,1,1,0],[one,1,1.5,0],[organs,1,1,1],[out,1,1,0],[over,1,1,0],[part,1,1,1],[peace,1,1.5,0],[perfect,1,1.5,0],[performed,50,30,1
80],[performing,50,30,250],[philosophy,10,20,1],[physiology,50,30,180],[pimple,1,1,0],[pm,5,5,5],[poem,1,1.5,0],[positive,1,1,0],[pot,5,5,10],[prayer,5,5,0],[prepared,1,1,0],[prevent,1,1,0],[prevented,1,1,0],[preventing,1,1,0],[prevention,1,1,0],[problems,1,1.5,0],[product,1,150,1],[products,5,5,5],[psychiatry,50,30,180],[psychology,2,2,0],[quality,1,1.5,0],[quantum,1,1,0],[read,15,1,20],[relax,180,50,30],[relaxed,180,50,30],[remain,1,1.5,0],[require,1,1,0],[room,500,400,300],[s,1,1,0],[safe,1,1.5,0],[sales,1,1.5,0],[same,1,1,0],[schizophrenia,1,1,0],[second,1,1.5,0],[see,1,0,1],[seek,1,1,0],[sell,5,3,0],[sets,5,1,0],[sex,1,1.5,0],[short,1,1,0],[single,1,1.5,0],[sites,20,0,30],[sitting,50,70,150],[sized,30,1,0],[skin,1,1,0],[sleep,180,50,30],[slices,2,2,2],[so,1,1,0],[societal,50,30,180],[some,1,1,0],[spine,1,1,10],[spiritual,50,30,180],[state,1,1.5,0],[stated,20,30,0],[stating,50,30,180],[status,1,1.5,0],[stop,5,1,15],[straight,1,1,0],[strawberry,1,1,1],[structure,1,1,1],[studied,50,30,180],[study,100,50,100],[studying,50,30,180],[subject,50,30,180],[successful,1,1.5,0],[surya,1,1.5,0],[sutra,5,1.5,0],[table,7,5,0],[tape,10,10,1],[task,15,2.5,0],[technique,1,1,0],[test,20,20,0],[text,20,30,0],[that,1,1,0],[the,1,1,0],[them,50,30,180],[then,1,1,0],[they,50,30,180],[thinking,1,1.5,0],[third,1,1.5,0],[this,1,1,0],[thoughts,1,1,0],[through,1,1,0],[tied,1,1,0],[time,1,1.5,0],[to,1,1,0],[together,1,1,0],[touch,1,1,0],[train,1,1,1],[trains,50,30,180],[travel,10,3,0],[treating,20,15,3],[turn,1,1,0],[two,1,1.5,0],[university,100,75,3],[unwanted,1,1,0],[up,1,1,0],[upasana,100,75,100],[used,1,1,0],[using,1,1,0],[var,1,1,1],[vegetables,5,5,5],[videos,5,0,3],[view,5,5,5],[virality,1,1,0],[viruses,1,1,2],[vitamin,1,1,1],[walks,100,500,0],[wanted,15,20,3],[warm,1,1,0],[was,1,1,0],[water,5,5,5],[way,1,1,0],[ways,50,100,0],[well,50,30,180],[where,1,1,0],[whole,1,1,1],[with,1,1,0],[without,1,1,0],[words,5,7,1],[write,15,1,1],[writing,5,5,0],[wrote,15,1,1],[years,3,1,0]]). 
+ +texttobr2(0):-!. +texttobr2(N1):- + texttobr2,N2 is N1-1,texttobr2(N2). diff --git a/Essay-Helper/walk_through.txt b/Essay-Helper/walk_through.txt new file mode 100644 index 0000000000000000000000000000000000000000..e138a6901e9b23b01a1aa3332b11d76e3c7f3c1f --- /dev/null +++ b/Essay-Helper/walk_through.txt @@ -0,0 +1,212 @@ +[["","",["heading"]],["","",["a"]],["","",["here"]],[1,1,["a","b","c","d"]],[1,2,["e","f","g","h"]]] +The Short Essay Helper will you help structure and write your essay about "Heading" with 2 reasons per paragraph. +The Helper will help write an exposition (which summarises but doesn't critique the idea), a critique (which agrees with or disagrees with the topic), the introduction and the conclusion (which state whether you agreed or disagreed with the topic, etc.). Citations will be automatically made. +The Helper will output the file, "file2020512121854.86923599243164.txt" used for marking. After using the Helper, run Text To Breasonings. Return "file2020512121854.86923599243164.txt" and the two breasoning dictionaries for marking. +Heading + +1a. here1 + +1. a b c d. e f g h. + +What is group 1 of 5 in the exposition that groups ideas about "Heading"? +| g1 +What is group 2 of 5 in the exposition that groups ideas about "Heading"? +| g2 + +The Helper will ask you how the quote you are about to enter relates to the paragraph topic. +What is the paragraph number of the quote about the paragraph topic "g1"? +| 1 +What is the sentence number of the quote about the paragraph topic "g1"? +| 1 +What is the paraphrased quote about the paragraph topic "g1"? +| a b c e +Attempt 1 +[found,[a,b,c,d],[a]] +Success +How does the quote you entered ("a b c e") relate to the paragraph topic "g1"? +| g1 contains a b c e +Attempt 1 +[found,[g],[g]] +[found,[a,b,c,d],[a]] +Success + +The Helper will ask you how the quote you are about to enter relates to the paragraph topic. +What is the paragraph number of the quote about the paragraph topic "g1"? 
+| 1 +What is the sentence number of the quote about the paragraph topic "g1"? +| 1 +What is the paraphrased quote about the paragraph topic "g1"? +| a b c f +Attempt 1 +[found,[a,b,c,d],[a]] +Success +How does the quote you entered ("a b c f") relate to the paragraph topic "g1"? +| g1 contains a b c f +Attempt 1 +[found,[g],[g]] +[found,[a,b,c,d],[a]] +Success + +The Helper will ask you how the quote you are about to enter relates to the paragraph topic. +What is the paragraph number of the quote about the paragraph topic "g2"? +| 1 +What is the sentence number of the quote about the paragraph topic "g2"? +| 1 +What is the paraphrased quote about the paragraph topic "g2"? +| a b c g +Attempt 1 +[found,[a,b,c,d],[a]] +Success +How does the quote you entered ("a b c g") relate to the paragraph topic "g2"? +| g2 contains a b c g +Attempt 1 +[found,[g],[g]] +[found,[a,b,c,d],[a]] +Success + +The Helper will ask you how the quote you are about to enter relates to the paragraph topic. +What is the paragraph number of the quote about the paragraph topic "g2"? +| 1 +What is the sentence number of the quote about the paragraph topic "g2"? +| 1 +What is the paraphrased quote about the paragraph topic "g2"? +| a b c h +Attempt 1 +[found,[a,b,c,d],[a]] +Success +How does the quote you entered ("a b c h") relate to the paragraph topic "g2"? +| g2 contains a b c h +Attempt 1 +[found,[g],[g]] +[found,[a,b,c,d],[a]] +Success +Do you agree or disagree with "Heading" (a/d) ? +| a +Heading + +1a. here1 + +1. a b c d. e f g h. + + +The Helper will ask you for your comment on the quote you are about to enter. +What is the paragraph number of the quote to comment on? +| 1 +What is the sentence number of the quote to comment on? +| 1 +What is the paraphrased quote to comment on? +| a b c e +Attempt 1 +[found,[a,b,c,d],[a]] +Success +Is your comment from a quote (y/n)? +| y +What is the paragraph number of the comment? +| 1 +What is the sentence number of the comment? 
+| 1 +What is the comment that is from the quote? +| a b c f +Attempt 1 +[found,[a,b,c,d],[a]] +Success +How does the comment "a b c f" relate to the essay topic "Heading"? +| Heading contains a b c f +Attempt 1 +[found,[a,b,c,f],[a]] +[found,[heading],[b]] +Success + +The Helper will ask you for your comment on the quote you are about to enter. +What is the paragraph number of the quote to comment on? +| 1 +What is the sentence number of the quote to comment on? +| 1 +What is the paraphrased quote to comment on? +| a b c g +Attempt 1 +[found,[a,b,c,d],[a]] +Success +Is your comment from a quote (y/n)? +| y +What is the paragraph number of the comment? +| 1 +What is the sentence number of the comment? +| 1 +What is the comment that is from the quote? +| a b c h +Attempt 1 +[found,[a,b,c,d],[a]] +Success +Please select a comment to connect the comment "a b c h" to: +1 - a b c f + +| 1 +How does "a b c h" connect to "a b c f"? +| a b c h is a b c f +Attempt 1 +[found,[a,b,c,f],[a]] +[found,[a,b,c,h],[a]] +Success + +The Helper will ask you for your comment on the quote you are about to enter. +What is the paragraph number of the quote to comment on? +| 1 +What is the sentence number of the quote to comment on? +| 1 +What is the paraphrased quote to comment on? +| a b c i +Attempt 1 +[found,[a,b,c,d],[a]] +Success +Is your comment from a quote (y/n)? +| y +What is the paragraph number of the comment? +| 1 +What is the sentence number of the comment? +| 1 +What is the comment that is from the quote? +| a b c j +Attempt 1 +[found,[a,b,c,d],[a]] +Success +How does the comment "a b c j" relate to the essay topic "Heading"? +| heading contains a b c j +Attempt 1 +[found,[a,b,c,j],[a]] +[found,[heading],[b]] +Success + +The Helper will ask you for your comment on the quote you are about to enter. +What is the paragraph number of the quote to comment on? +| 1 +What is the sentence number of the quote to comment on? +| 1 +What is the paraphrased quote to comment on? 
+| a b c k +Attempt 1 +[found,[a,b,c,d],[a]] +Success +Is your comment from a quote (y/n)? +| y +What is the paragraph number of the comment? +| 1 +What is the sentence number of the comment? +| 1 +What is the comment that is from the quote? +| a b c l +Attempt 1 +[found,[a,b,c,d],[a]] +Success +Please select a comment to connect the comment "a b c l" to: +1 - a b c j + +| 1 +How does "a b c l" connect to "a b c j"? +| a b c l is a b c j +Attempt 1 +[found,[a,b,c,j],[a]] +[found,[a,b,c,l],[a]] +Success +What is the future area of research from your essay about "Heading"? +| heading should be automated. diff --git a/Essay-Helper/walk_through2-output.txt b/Essay-Helper/walk_through2-output.txt new file mode 100644 index 0000000000000000000000000000000000000000..5c669a55fddcf9cfbb968f93b9f6ed9b2cfcb514 --- /dev/null +++ b/Essay-Helper/walk_through2-output.txt @@ -0,0 +1 @@ +[[[[1,"1"],[2,"2"],[3,"3"],[4,"4"],[5,"5"]],[[1,1,_3240,_3246,"1","1"],[1,2,_3186,_3192,"2","2"],[1,3,_3132,_3138,"3","3"],[1,4,_3078,_3084,"4","4"],[2,1,_3024,_3030,"1","1"],[2,2,_2970,_2976,"2","2"],[2,3,_2916,_2922,"3","3"],[2,4,_2862,_2868,"4","4"],[3,1,_2808,_2814,"1","1"],[3,2,_2754,_2760,"2","2"],[3,3,_2700,_2706,"3","3"],[3,4,_2646,_2652,"4","4"],[4,1,_2592,_2598,"1","1"],[4,2,_2538,_2544,"2","2"],[4,3,_2484,_2490,"3","3"],[4,4,_2430,_2436,"4","4"],[5,1,_2376,_2382,"1","1"],[5,2,_2322,_2328,"2","2"],[5,3,_2268,_2274,"3","3"],[5,4,_2214,_2220,"4","4"]]],[[1,["1",[1,1,_5920,_5926,"1",_5944,_5950,"1"],[1,2,_5986,_5992,"2",_6010,_6016,"4",1,1,"1",0,0,"4","4"],[1,3,_6112,_6118,"3",_6136,_6142,"4",1,1,"1",0,0,"4","4"],[1,4,_6238,_6244,"4",_6262,_6268,"4",1,1,"1",0,0,"4","4"]]],[2,["1",[2,1,_5446,_5452,"1",_5470,_5476,"1"],[2,2,_5512,_5518,"2",_5536,_5542,"4",2,1,"1",0,0,"4","4"],[2,3,_5638,_5644,"3",_5662,_5668,"4",2,1,"1",0,0,"4","4"],[2,4,_5764,_5770,"4",_5788,_5794,"4",2,1,"1",0,0,"4","4"]]],[3,["1",[3,1,_4972,_4978,"1",_4996,_5002,"1"],[3,2,_5038,_5044,"2",_5062,_5068,"4",3,1,"1",0,0,"4","4"],
[3,3,_5164,_5170,"3",_5188,_5194,"4",3,1,"1",0,0,"4","4"],[3,4,_5290,_5296,"4",_5314,_5320,"4",3,1,"1",0,0,"4","4"]]],[4,["1",[4,1,_4498,_4504,"1",_4522,_4528,"1"],[4,2,_4564,_4570,"2",_4588,_4594,"4",4,1,"1",0,0,"4","4"],[4,3,_4690,_4696,"3",_4714,_4720,"4",4,1,"1",0,0,"4","4"],[4,4,_4816,_4822,"4",_4840,_4846,"4",4,1,"1",0,0,"4","4"]]],[5,["1",[5,1,_4024,_4030,"1",_4048,_4054,"1"],[5,2,_4090,_4096,"2",_4114,_4120,"4",5,1,"1",0,0,"4","4"],[5,3,_4216,_4222,"3",_4240,_4246,"4",5,1,"1",0,0,"4","4"],[5,4,_4342,_4348,"4",_4366,_4372,"4",5,1,"1",0,0,"4","4"]]]],"b"] \ No newline at end of file diff --git a/Essay-Helper/walk_through2.html b/Essay-Helper/walk_through2.html new file mode 100644 index 0000000000000000000000000000000000000000..a861eef0a9413a2d4ebbb7dee862495ba9af5c27 --- /dev/null +++ b/Essay-Helper/walk_through2.html @@ -0,0 +1 @@ +h

h

I will critically analyse h. I will agree with h. 1 1 1 1 1
I will expose h in this half. 1 1 2 2 3 3 4 4
1 1 2 2 3 3 4 4
1 1 2 2 3 3 4 4
1 1 2 2 3 3 4 4
1 1 2 2 3 3 4 4
I will agree with h in this half. 1 1 1 2 4 4 3 4 4 4 4 4 1
1 1 1 2 4 4 3 4 4 4 4 4 1
1 1 1 2 4 4 3 4 4 4 4 4 1
1 1 1 2 4 4 3 4 4 4 4 4 1
1 1 1 2 4 4 3 4 4 4 4 4 1
1 1 1 1 1 I have agreed with h. b

Bibliography

1
2
3
4
\ No newline at end of file diff --git a/Essay-Helper/walk_through2.txt b/Essay-Helper/walk_through2.txt new file mode 100644 index 0000000000000000000000000000000000000000..065015a4ca65e8bf05a97e27f5c66915430589af --- /dev/null +++ b/Essay-Helper/walk_through2.txt @@ -0,0 +1,1148 @@ +?- short_essay_helper("h",4).
+The Short Essay Helper will help you structure and write your essay about "h" with 4 reasons per paragraph.
+The Helper will help write an exposition (which summarises but doesn't critique the idea), a critique (which agrees with or disagrees with the topic), the introduction and the conclusion (which state whether you agreed or disagreed with the topic, etc.). Citations will be automatically made.
+
+What is group 1 of 5 in the exposition that groups ideas about "h"?
+|: 1
+What is group 2 of 5 in the exposition that groups ideas about "h"?
+|: 2
+What is group 3 of 5 in the exposition that groups ideas about "h"?
+|: 3
+What is group 4 of 5 in the exposition that groups ideas about "h"?
+|: 4
+What is group 5 of 5 in the exposition that groups ideas about "h"?
+|: 5
+
+The Helper will ask you how the quote you are about to enter relates to the paragraph topic.
+Note: Enter in-text reference using AGPS, e.g.
+The first work supports the second work (Surname 2000, pp. 18-9).
+Surname (2000, pp. 18-9) states that ...
+Remember to use words like "also", "moreover" and "in addition" before the sentence.
+What is the paraphrased quote about the paragraph topic "1"?
+|: 1
+What is the reference? e.g. Surname, A 2000, Title: Subtitle, Publisher, City.
+
+Existing references (copy one or many delimited with "\n"):
+|: 1
+How does the quote you entered ("1") relate to the paragraph topic "1"?
+|: 1
+
+The Helper will ask you how the quote you are about to enter relates to the paragraph topic.
+Note: Enter in-text reference using AGPS, e.g.
+The first work supports the second work (Surname 2000, pp. 18-9).
+Surname (2000, pp. 18-9) states that ... 
+Remember to use words like "also", "moreover" and "in addition" before the sentence. +What is the paraphrased quote about the paragraph topic "1"? +|: 2 +What is the reference? e.g. Surname, A 2000, Title: Subtitle, Publisher, City. + +Existing references (copy one or many delimited with "\n"): +1 +|: 2 +How does the quote you entered ("2") relate to the paragraph topic "1"? +|: 2 + +The Helper will ask you how the quote you are about to enter relates to the paragraph topic. +Note: Enter in-text reference using AGPS, e.g. +The first work supports the second work (Surname 2000, pp. 18-9). +Surname (2000, pp. 18-9) states that ... +Remember to use words like "also", "moreover" and "in addition" before the sentence. +What is the paraphrased quote about the paragraph topic "1"? +|: 3 +What is the reference? e.g. Surname, A 2000, Title: Subtitle, Publisher, City. + +Existing references (copy one or many delimited with "\n"): +1 +2 +|: 3 +How does the quote you entered ("3") relate to the paragraph topic "1"? +|: 3 + +The Helper will ask you how the quote you are about to enter relates to the paragraph topic. +Note: Enter in-text reference using AGPS, e.g. +The first work supports the second work (Surname 2000, pp. 18-9). +Surname (2000, pp. 18-9) states that ... +Remember to use words like "also", "moreover" and "in addition" before the sentence. +What is the paraphrased quote about the paragraph topic "1"? +|: 4 +What is the reference? e.g. Surname, A 2000, Title: Subtitle, Publisher, City. + +Existing references (copy one or many delimited with "\n"): +1 +2 +3 +|: 4 +How does the quote you entered ("4") relate to the paragraph topic "1"? +|: 4 + +The Helper will ask you how the quote you are about to enter relates to the paragraph topic. +Note: Enter in-text reference using AGPS, e.g. +The first work supports the second work (Surname 2000, pp. 18-9). +Surname (2000, pp. 18-9) states that ... 
+Remember to use words like "also", "moreover" and "in addition" before the sentence. +What is the paraphrased quote about the paragraph topic "2"? +|: 1 +What is the reference? e.g. Surname, A 2000, Title: Subtitle, Publisher, City. + +Existing references (copy one or many delimited with "\n"): +1 +2 +3 +4 +|: 1 +How does the quote you entered ("1") relate to the paragraph topic "2"? +|: 1 + +The Helper will ask you how the quote you are about to enter relates to the paragraph topic. +Note: Enter in-text reference using AGPS, e.g. +The first work supports the second work (Surname 2000, pp. 18-9). +Surname (2000, pp. 18-9) states that ... +Remember to use words like "also", "moreover" and "in addition" before the sentence. +What is the paraphrased quote about the paragraph topic "2"? +|: 2 +What is the reference? e.g. Surname, A 2000, Title: Subtitle, Publisher, City. + +Existing references (copy one or many delimited with "\n"): +1 +2 +3 +4 +|: 2 +How does the quote you entered ("2") relate to the paragraph topic "2"? +|: 2 + +The Helper will ask you how the quote you are about to enter relates to the paragraph topic. +Note: Enter in-text reference using AGPS, e.g. +The first work supports the second work (Surname 2000, pp. 18-9). +Surname (2000, pp. 18-9) states that ... +Remember to use words like "also", "moreover" and "in addition" before the sentence. +What is the paraphrased quote about the paragraph topic "2"? +|: 3 +What is the reference? e.g. Surname, A 2000, Title: Subtitle, Publisher, City. + +Existing references (copy one or many delimited with "\n"): +1 +2 +3 +4 +|: 3 +How does the quote you entered ("3") relate to the paragraph topic "2"? +|: 3 + +The Helper will ask you how the quote you are about to enter relates to the paragraph topic. +Note: Enter in-text reference using AGPS, e.g. +The first work supports the second work (Surname 2000, pp. 18-9). +Surname (2000, pp. 18-9) states that ... 
+Remember to use words like "also", "moreover" and "in addition" before the sentence. +What is the paraphrased quote about the paragraph topic "2"? +|: 4 +What is the reference? e.g. Surname, A 2000, Title: Subtitle, Publisher, City. + +Existing references (copy one or many delimited with "\n"): +1 +2 +3 +4 +|: 4 +How does the quote you entered ("4") relate to the paragraph topic "2"? +|: 4 + +The Helper will ask you how the quote you are about to enter relates to the paragraph topic. +Note: Enter in-text reference using AGPS, e.g. +The first work supports the second work (Surname 2000, pp. 18-9). +Surname (2000, pp. 18-9) states that ... +Remember to use words like "also", "moreover" and "in addition" before the sentence. +What is the paraphrased quote about the paragraph topic "3"? +|: 1 +What is the reference? e.g. Surname, A 2000, Title: Subtitle, Publisher, City. + +Existing references (copy one or many delimited with "\n"): +1 +2 +3 +4 +|: 1 +How does the quote you entered ("1") relate to the paragraph topic "3"? +|: 1 + +The Helper will ask you how the quote you are about to enter relates to the paragraph topic. +Note: Enter in-text reference using AGPS, e.g. +The first work supports the second work (Surname 2000, pp. 18-9). +Surname (2000, pp. 18-9) states that ... +Remember to use words like "also", "moreover" and "in addition" before the sentence. +What is the paraphrased quote about the paragraph topic "3"? +|: 2 +What is the reference? e.g. Surname, A 2000, Title: Subtitle, Publisher, City. + +Existing references (copy one or many delimited with "\n"): +1 +2 +3 +4 +|: 2 +How does the quote you entered ("2") relate to the paragraph topic "3"? +|: 2 + +The Helper will ask you how the quote you are about to enter relates to the paragraph topic. +Note: Enter in-text reference using AGPS, e.g. +The first work supports the second work (Surname 2000, pp. 18-9). +Surname (2000, pp. 18-9) states that ... 
+Remember to use words like "also", "moreover" and "in addition" before the sentence. +What is the paraphrased quote about the paragraph topic "3"? +|: 3 +What is the reference? e.g. Surname, A 2000, Title: Subtitle, Publisher, City. + +Existing references (copy one or many delimited with "\n"): +1 +2 +3 +4 +|: 3 +How does the quote you entered ("3") relate to the paragraph topic "3"? +|: 3 + +The Helper will ask you how the quote you are about to enter relates to the paragraph topic. +Note: Enter in-text reference using AGPS, e.g. +The first work supports the second work (Surname 2000, pp. 18-9). +Surname (2000, pp. 18-9) states that ... +Remember to use words like "also", "moreover" and "in addition" before the sentence. +What is the paraphrased quote about the paragraph topic "3"? +|: 4 +What is the reference? e.g. Surname, A 2000, Title: Subtitle, Publisher, City. + +Existing references (copy one or many delimited with "\n"): +1 +2 +3 +4 +|: 4 +How does the quote you entered ("4") relate to the paragraph topic "3"? +|: 4 + +The Helper will ask you how the quote you are about to enter relates to the paragraph topic. +Note: Enter in-text reference using AGPS, e.g. +The first work supports the second work (Surname 2000, pp. 18-9). +Surname (2000, pp. 18-9) states that ... +Remember to use words like "also", "moreover" and "in addition" before the sentence. +What is the paraphrased quote about the paragraph topic "4"? +|: 1 +What is the reference? e.g. Surname, A 2000, Title: Subtitle, Publisher, City. + +Existing references (copy one or many delimited with "\n"): +1 +2 +3 +4 +|: 1 +How does the quote you entered ("1") relate to the paragraph topic "4"? +|: 1 + +The Helper will ask you how the quote you are about to enter relates to the paragraph topic. +Note: Enter in-text reference using AGPS, e.g. +The first work supports the second work (Surname 2000, pp. 18-9). +Surname (2000, pp. 18-9) states that ... 
+Remember to use words like "also", "moreover" and "in addition" before the sentence. +What is the paraphrased quote about the paragraph topic "4"? +|: 2 +What is the reference? e.g. Surname, A 2000, Title: Subtitle, Publisher, City. + +Existing references (copy one or many delimited with "\n"): +1 +2 +3 +4 +|: 2 +How does the quote you entered ("2") relate to the paragraph topic "4"? +|: 2 + +The Helper will ask you how the quote you are about to enter relates to the paragraph topic. +Note: Enter in-text reference using AGPS, e.g. +The first work supports the second work (Surname 2000, pp. 18-9). +Surname (2000, pp. 18-9) states that ... +Remember to use words like "also", "moreover" and "in addition" before the sentence. +What is the paraphrased quote about the paragraph topic "4"? +|: 3 +What is the reference? e.g. Surname, A 2000, Title: Subtitle, Publisher, City. + +Existing references (copy one or many delimited with "\n"): +1 +2 +3 +4 +|: 3 +How does the quote you entered ("3") relate to the paragraph topic "4"? +|: 3 + +The Helper will ask you how the quote you are about to enter relates to the paragraph topic. +Note: Enter in-text reference using AGPS, e.g. +The first work supports the second work (Surname 2000, pp. 18-9). +Surname (2000, pp. 18-9) states that ... +Remember to use words like "also", "moreover" and "in addition" before the sentence. +What is the paraphrased quote about the paragraph topic "4"? +|: 4 +What is the reference? e.g. Surname, A 2000, Title: Subtitle, Publisher, City. + +Existing references (copy one or many delimited with "\n"): +1 +2 +3 +4 +|: 4 +How does the quote you entered ("4") relate to the paragraph topic "4"? +|: 4 + +The Helper will ask you how the quote you are about to enter relates to the paragraph topic. +Note: Enter in-text reference using AGPS, e.g. +The first work supports the second work (Surname 2000, pp. 18-9). +Surname (2000, pp. 18-9) states that ... 
+Remember to use words like "also", "moreover" and "in addition" before the sentence. +What is the paraphrased quote about the paragraph topic "5"? +|: 1 +What is the reference? e.g. Surname, A 2000, Title: Subtitle, Publisher, City. + +Existing references (copy one or many delimited with "\n"): +1 +2 +3 +4 +|: 1 +How does the quote you entered ("1") relate to the paragraph topic "5"? +|: 1 + +The Helper will ask you how the quote you are about to enter relates to the paragraph topic. +Note: Enter in-text reference using AGPS, e.g. +The first work supports the second work (Surname 2000, pp. 18-9). +Surname (2000, pp. 18-9) states that ... +Remember to use words like "also", "moreover" and "in addition" before the sentence. +What is the paraphrased quote about the paragraph topic "5"? +|: 2 +What is the reference? e.g. Surname, A 2000, Title: Subtitle, Publisher, City. + +Existing references (copy one or many delimited with "\n"): +1 +2 +3 +4 +|: 2 +How does the quote you entered ("2") relate to the paragraph topic "5"? +|: 2 + +The Helper will ask you how the quote you are about to enter relates to the paragraph topic. +Note: Enter in-text reference using AGPS, e.g. +The first work supports the second work (Surname 2000, pp. 18-9). +Surname (2000, pp. 18-9) states that ... +Remember to use words like "also", "moreover" and "in addition" before the sentence. +What is the paraphrased quote about the paragraph topic "5"? +|: 3 +What is the reference? e.g. Surname, A 2000, Title: Subtitle, Publisher, City. + +Existing references (copy one or many delimited with "\n"): +1 +2 +3 +4 +|: 3 +How does the quote you entered ("3") relate to the paragraph topic "5"? +|: 3 + +The Helper will ask you how the quote you are about to enter relates to the paragraph topic. +Note: Enter in-text reference using AGPS, e.g. +The first work supports the second work (Surname 2000, pp. 18-9). +Surname (2000, pp. 18-9) states that ... 
+Remember to use words like "also", "moreover" and "in addition" before the sentence. +What is the paraphrased quote about the paragraph topic "5"? +|: 4 +What is the reference? e.g. Surname, A 2000, Title: Subtitle, Publisher, City. + +Existing references (copy one or many delimited with "\n"): +1 +2 +3 +4 +|: 4 +How does the quote you entered ("4") relate to the paragraph topic "5"? +|: 4 +Do you agree or disagree with "h" (a/d) ? +|: a + +The Helper will ask you for your comment on the quote you are about to enter. +Note: Enter in-text reference using AGPS, e.g. +The first work supports the second work (Surname 2000, pp. 18-9). +Surname (2000, pp. 18-9) states that ... +Remember to use words like "also", "moreover" and "in addition" before the sentence. +What is the paraphrased quote to comment on? +|: 1 +What is the reference? e.g. Surname, A 2000, Title: Subtitle, Publisher, City. + +Existing references (copy one or many delimited with "\n"): +1 +2 +3 +4 +|: 1 +Is your comment from a quote (y/n)? +|: y +Note: Enter in-text reference using AGPS, e.g. +The first work supports the second work (Surname 2000, pp. 18-9). +Surname (2000, pp. 18-9) states that ... +Remember to use words like "also", "moreover" and "in addition" before the sentence. +What is the comment that is from the quote? +|: 1 +What is the reference? e.g. Surname, A 2000, Title: Subtitle, Publisher, City. + +Existing references (copy one or many delimited with "\n"): +1 +2 +3 +4 +|: 1 +How does the comment "1" relate to the essay topic "h"? +|: 1 + +The Helper will ask you for your comment on the quote you are about to enter. +Note: Enter in-text reference using AGPS, e.g. +The first work supports the second work (Surname 2000, pp. 18-9). +Surname (2000, pp. 18-9) states that ... +Remember to use words like "also", "moreover" and "in addition" before the sentence. +What is the paraphrased quote to comment on? +|: 2 +What is the reference? e.g. Surname, A 2000, Title: Subtitle, Publisher, City. 
+ +Existing references (copy one or many delimited with "\n"): +1 +2 +3 +4 +|: 4 +Is your comment from a quote (y/n)? +|: y +Note: Enter in-text reference using AGPS, e.g. +The first work supports the second work (Surname 2000, pp. 18-9). +Surname (2000, pp. 18-9) states that ... +Remember to use words like "also", "moreover" and "in addition" before the sentence. +What is the comment that is from the quote? +|: 4 +What is the reference? e.g. Surname, A 2000, Title: Subtitle, Publisher, City. + +Existing references (copy one or many delimited with "\n"): +1 +2 +3 +4 +|: 4 +Please select a comment to connect the comment "4" to: +1 - 1 + +|: 1 +How does "4" connect to "1"? +|: 4 + +The Helper will ask you for your comment on the quote you are about to enter. +Note: Enter in-text reference using AGPS, e.g. +The first work supports the second work (Surname 2000, pp. 18-9). +Surname (2000, pp. 18-9) states that ... +Remember to use words like "also", "moreover" and "in addition" before the sentence. +What is the paraphrased quote to comment on? +|: 3 +What is the reference? e.g. Surname, A 2000, Title: Subtitle, Publisher, City. + +Existing references (copy one or many delimited with "\n"): +1 +2 +3 +4 +|: 4 +Is your comment from a quote (y/n)? +|: y +Note: Enter in-text reference using AGPS, e.g. +The first work supports the second work (Surname 2000, pp. 18-9). +Surname (2000, pp. 18-9) states that ... +Remember to use words like "also", "moreover" and "in addition" before the sentence. +What is the comment that is from the quote? +|: 4 +What is the reference? e.g. Surname, A 2000, Title: Subtitle, Publisher, City. + +Existing references (copy one or many delimited with "\n"): +1 +2 +3 +4 +|: 4 +Please select a comment to connect the comment "4" to: +1 - 1 +2 - 4 + +|: 1 +How does "4" connect to "1"? +|: 4 + +The Helper will ask you for your comment on the quote you are about to enter. +Note: Enter in-text reference using AGPS, e.g. 
+The first work supports the second work (Surname 2000, pp. 18-9). +Surname (2000, pp. 18-9) states that ... +Remember to use words like "also", "moreover" and "in addition" before the sentence. +What is the paraphrased quote to comment on? +|: 4 +What is the reference? e.g. Surname, A 2000, Title: Subtitle, Publisher, City. + +Existing references (copy one or many delimited with "\n"): +1 +2 +3 +4 +|: 4 +Is your comment from a quote (y/n)? +|: y +Note: Enter in-text reference using AGPS, e.g. +The first work supports the second work (Surname 2000, pp. 18-9). +Surname (2000, pp. 18-9) states that ... +Remember to use words like "also", "moreover" and "in addition" before the sentence. +What is the comment that is from the quote? +|: 4 +What is the reference? e.g. Surname, A 2000, Title: Subtitle, Publisher, City. + +Existing references (copy one or many delimited with "\n"): +1 +2 +3 +4 +|: 4 +Please select a comment to connect the comment "4" to: +1 - 1 +2 - 4 +3 - 4 + +|: 1 +How does "4" connect to "1"? +|: 4 + +The Helper will ask you for your comment on the quote you are about to enter. +Note: Enter in-text reference using AGPS, e.g. +The first work supports the second work (Surname 2000, pp. 18-9). +Surname (2000, pp. 18-9) states that ... +Remember to use words like "also", "moreover" and "in addition" before the sentence. +What is the paraphrased quote to comment on? +|: 1 +What is the reference? e.g. Surname, A 2000, Title: Subtitle, Publisher, City. + +Existing references (copy one or many delimited with "\n"): +1 +2 +3 +4 +|: 1 +Is your comment from a quote (y/n)? +|: y +Note: Enter in-text reference using AGPS, e.g. +The first work supports the second work (Surname 2000, pp. 18-9). +Surname (2000, pp. 18-9) states that ... +Remember to use words like "also", "moreover" and "in addition" before the sentence. +What is the comment that is from the quote? +|: 1 +What is the reference? e.g. Surname, A 2000, Title: Subtitle, Publisher, City. 
+ +Existing references (copy one or many delimited with "\n"): +1 +2 +3 +4 +|: 1 +How does the comment "1" relate to the essay topic "h"? +|: 1 + +The Helper will ask you for your comment on the quote you are about to enter. +Note: Enter in-text reference using AGPS, e.g. +The first work supports the second work (Surname 2000, pp. 18-9). +Surname (2000, pp. 18-9) states that ... +Remember to use words like "also", "moreover" and "in addition" before the sentence. +What is the paraphrased quote to comment on? +|: 2 +What is the reference? e.g. Surname, A 2000, Title: Subtitle, Publisher, City. + +Existing references (copy one or many delimited with "\n"): +1 +2 +3 +4 +|: 4 +Is your comment from a quote (y/n)? +|: y +Note: Enter in-text reference using AGPS, e.g. +The first work supports the second work (Surname 2000, pp. 18-9). +Surname (2000, pp. 18-9) states that ... +Remember to use words like "also", "moreover" and "in addition" before the sentence. +What is the comment that is from the quote? +|: 4 +What is the reference? e.g. Surname, A 2000, Title: Subtitle, Publisher, City. + +Existing references (copy one or many delimited with "\n"): +1 +2 +3 +4 +|: 4 +Please select a comment to connect the comment "4" to: +1 - 1 + +|: 1 +How does "4" connect to "1"? +|: 4 + +The Helper will ask you for your comment on the quote you are about to enter. +Note: Enter in-text reference using AGPS, e.g. +The first work supports the second work (Surname 2000, pp. 18-9). +Surname (2000, pp. 18-9) states that ... +Remember to use words like "also", "moreover" and "in addition" before the sentence. +What is the paraphrased quote to comment on? +|: 3 +What is the reference? e.g. Surname, A 2000, Title: Subtitle, Publisher, City. + +Existing references (copy one or many delimited with "\n"): +1 +2 +3 +4 +|: 4 +Is your comment from a quote (y/n)? +|: y +Note: Enter in-text reference using AGPS, e.g. +The first work supports the second work (Surname 2000, pp. 18-9). 
+Surname (2000, pp. 18-9) states that ... +Remember to use words like "also", "moreover" and "in addition" before the sentence. +What is the comment that is from the quote? +|: 4 +What is the reference? e.g. Surname, A 2000, Title: Subtitle, Publisher, City. + +Existing references (copy one or many delimited with "\n"): +1 +2 +3 +4 +|: 4 +Please select a comment to connect the comment "4" to: +1 - 1 +2 - 4 + +|: 1 +How does "4" connect to "1"? +|: 4 + +The Helper will ask you for your comment on the quote you are about to enter. +Note: Enter in-text reference using AGPS, e.g. +The first work supports the second work (Surname 2000, pp. 18-9). +Surname (2000, pp. 18-9) states that ... +Remember to use words like "also", "moreover" and "in addition" before the sentence. +What is the paraphrased quote to comment on? +|: 4 +What is the reference? e.g. Surname, A 2000, Title: Subtitle, Publisher, City. + +Existing references (copy one or many delimited with "\n"): +1 +2 +3 +4 +|: 4 +Is your comment from a quote (y/n)? +|: y +Note: Enter in-text reference using AGPS, e.g. +The first work supports the second work (Surname 2000, pp. 18-9). +Surname (2000, pp. 18-9) states that ... +Remember to use words like "also", "moreover" and "in addition" before the sentence. +What is the comment that is from the quote? +|: 4 +What is the reference? e.g. Surname, A 2000, Title: Subtitle, Publisher, City. + +Existing references (copy one or many delimited with "\n"): +1 +2 +3 +4 +|: 4 +Please select a comment to connect the comment "4" to: +1 - 1 +2 - 4 +3 - 4 + +|: 1 +How does "4" connect to "1"? +|: 4 + +The Helper will ask you for your comment on the quote you are about to enter. +Note: Enter in-text reference using AGPS, e.g. +The first work supports the second work (Surname 2000, pp. 18-9). +Surname (2000, pp. 18-9) states that ... +Remember to use words like "also", "moreover" and "in addition" before the sentence. +What is the paraphrased quote to comment on? 
+|: 1 +What is the reference? e.g. Surname, A 2000, Title: Subtitle, Publisher, City. + +Existing references (copy one or many delimited with "\n"): +1 +2 +3 +4 +|: 1 +Is your comment from a quote (y/n)? +|: y +Note: Enter in-text reference using AGPS, e.g. +The first work supports the second work (Surname 2000, pp. 18-9). +Surname (2000, pp. 18-9) states that ... +Remember to use words like "also", "moreover" and "in addition" before the sentence. +What is the comment that is from the quote? +|: 1 +What is the reference? e.g. Surname, A 2000, Title: Subtitle, Publisher, City. + +Existing references (copy one or many delimited with "\n"): +1 +2 +3 +4 +|: 1 +How does the comment "1" relate to the essay topic "h"? +|: 1 + +The Helper will ask you for your comment on the quote you are about to enter. +Note: Enter in-text reference using AGPS, e.g. +The first work supports the second work (Surname 2000, pp. 18-9). +Surname (2000, pp. 18-9) states that ... +Remember to use words like "also", "moreover" and "in addition" before the sentence. +What is the paraphrased quote to comment on? +|: 2 +What is the reference? e.g. Surname, A 2000, Title: Subtitle, Publisher, City. + +Existing references (copy one or many delimited with "\n"): +1 +2 +3 +4 +|: 4 +Is your comment from a quote (y/n)? +|: y +Note: Enter in-text reference using AGPS, e.g. +The first work supports the second work (Surname 2000, pp. 18-9). +Surname (2000, pp. 18-9) states that ... +Remember to use words like "also", "moreover" and "in addition" before the sentence. +What is the comment that is from the quote? +|: 4 +What is the reference? e.g. Surname, A 2000, Title: Subtitle, Publisher, City. + +Existing references (copy one or many delimited with "\n"): +1 +2 +3 +4 +|: 4 +Please select a comment to connect the comment "4" to: +1 - 1 + +|: 1 +How does "4" connect to "1"? +|: 4 + +The Helper will ask you for your comment on the quote you are about to enter. +Note: Enter in-text reference using AGPS, e.g. 
+The first work supports the second work (Surname 2000, pp. 18-9). +Surname (2000, pp. 18-9) states that ... +Remember to use words like "also", "moreover" and "in addition" before the sentence. +What is the paraphrased quote to comment on? +|: 3 +What is the reference? e.g. Surname, A 2000, Title: Subtitle, Publisher, City. + +Existing references (copy one or many delimited with "\n"): +1 +2 +3 +4 +|: 4 +Is your comment from a quote (y/n)? +|: y +Note: Enter in-text reference using AGPS, e.g. +The first work supports the second work (Surname 2000, pp. 18-9). +Surname (2000, pp. 18-9) states that ... +Remember to use words like "also", "moreover" and "in addition" before the sentence. +What is the comment that is from the quote? +|: 4 +What is the reference? e.g. Surname, A 2000, Title: Subtitle, Publisher, City. + +Existing references (copy one or many delimited with "\n"): +1 +2 +3 +4 +|: 4 +Please select a comment to connect the comment "4" to: +1 - 1 +2 - 4 + +|: 1 +How does "4" connect to "1"? +|: 4 + +The Helper will ask you for your comment on the quote you are about to enter. +Note: Enter in-text reference using AGPS, e.g. +The first work supports the second work (Surname 2000, pp. 18-9). +Surname (2000, pp. 18-9) states that ... +Remember to use words like "also", "moreover" and "in addition" before the sentence. +What is the paraphrased quote to comment on? +|: 4 +What is the reference? e.g. Surname, A 2000, Title: Subtitle, Publisher, City. + +Existing references (copy one or many delimited with "\n"): +1 +2 +3 +4 +|: 4 +Is your comment from a quote (y/n)? +|: y +Note: Enter in-text reference using AGPS, e.g. +The first work supports the second work (Surname 2000, pp. 18-9). +Surname (2000, pp. 18-9) states that ... +Remember to use words like "also", "moreover" and "in addition" before the sentence. +What is the comment that is from the quote? +|: 4 +What is the reference? e.g. Surname, A 2000, Title: Subtitle, Publisher, City. 
+ +Existing references (copy one or many delimited with "\n"): +1 +2 +3 +4 +|: 4 +Please select a comment to connect the comment "4" to: +1 - 1 +2 - 4 +3 - 4 + +|: 1 +How does "4" connect to "1"? +|: 4 + +The Helper will ask you for your comment on the quote you are about to enter. +Note: Enter in-text reference using AGPS, e.g. +The first work supports the second work (Surname 2000, pp. 18-9). +Surname (2000, pp. 18-9) states that ... +Remember to use words like "also", "moreover" and "in addition" before the sentence. +What is the paraphrased quote to comment on? +|: 1 +What is the reference? e.g. Surname, A 2000, Title: Subtitle, Publisher, City. + +Existing references (copy one or many delimited with "\n"): +1 +2 +3 +4 +|: 1 +Is your comment from a quote (y/n)? +|: y +Note: Enter in-text reference using AGPS, e.g. +The first work supports the second work (Surname 2000, pp. 18-9). +Surname (2000, pp. 18-9) states that ... +Remember to use words like "also", "moreover" and "in addition" before the sentence. +What is the comment that is from the quote? +|: 1 +What is the reference? e.g. Surname, A 2000, Title: Subtitle, Publisher, City. + +Existing references (copy one or many delimited with "\n"): +1 +2 +3 +4 +|: 1 +How does the comment "1" relate to the essay topic "h"? +|: 1 + +The Helper will ask you for your comment on the quote you are about to enter. +Note: Enter in-text reference using AGPS, e.g. +The first work supports the second work (Surname 2000, pp. 18-9). +Surname (2000, pp. 18-9) states that ... +Remember to use words like "also", "moreover" and "in addition" before the sentence. +What is the paraphrased quote to comment on? +|: 2 +What is the reference? e.g. Surname, A 2000, Title: Subtitle, Publisher, City. + +Existing references (copy one or many delimited with "\n"): +1 +2 +3 +4 +|: 4 +Is your comment from a quote (y/n)? +|: y +Note: Enter in-text reference using AGPS, e.g. +The first work supports the second work (Surname 2000, pp. 18-9). 
+Surname (2000, pp. 18-9) states that ... +Remember to use words like "also", "moreover" and "in addition" before the sentence. +What is the comment that is from the quote? +|: 4 +What is the reference? e.g. Surname, A 2000, Title: Subtitle, Publisher, City. + +Existing references (copy one or many delimited with "\n"): +1 +2 +3 +4 +|: 4 +Please select a comment to connect the comment "4" to: +1 - 1 + +|: 1 +How does "4" connect to "1"? +|: 4 + +The Helper will ask you for your comment on the quote you are about to enter. +Note: Enter in-text reference using AGPS, e.g. +The first work supports the second work (Surname 2000, pp. 18-9). +Surname (2000, pp. 18-9) states that ... +Remember to use words like "also", "moreover" and "in addition" before the sentence. +What is the paraphrased quote to comment on? +|: 3 +What is the reference? e.g. Surname, A 2000, Title: Subtitle, Publisher, City. + +Existing references (copy one or many delimited with "\n"): +1 +2 +3 +4 +|: 4 +Is your comment from a quote (y/n)? +|: y +Note: Enter in-text reference using AGPS, e.g. +The first work supports the second work (Surname 2000, pp. 18-9). +Surname (2000, pp. 18-9) states that ... +Remember to use words like "also", "moreover" and "in addition" before the sentence. +What is the comment that is from the quote? +|: 4 +What is the reference? e.g. Surname, A 2000, Title: Subtitle, Publisher, City. + +Existing references (copy one or many delimited with "\n"): +1 +2 +3 +4 +|: 4 +Please select a comment to connect the comment "4" to: +1 - 1 +2 - 4 + +|: 1 +How does "4" connect to "1"? +|: 4 + +The Helper will ask you for your comment on the quote you are about to enter. +Note: Enter in-text reference using AGPS, e.g. +The first work supports the second work (Surname 2000, pp. 18-9). +Surname (2000, pp. 18-9) states that ... +Remember to use words like "also", "moreover" and "in addition" before the sentence. +What is the paraphrased quote to comment on? +|: 4 +What is the reference? 
e.g. Surname, A 2000, Title: Subtitle, Publisher, City. + +Existing references (copy one or many delimited with "\n"): +1 +2 +3 +4 +|: 4 +Is your comment from a quote (y/n)? +|: y +Note: Enter in-text reference using AGPS, e.g. +The first work supports the second work (Surname 2000, pp. 18-9). +Surname (2000, pp. 18-9) states that ... +Remember to use words like "also", "moreover" and "in addition" before the sentence. +What is the comment that is from the quote? +|: 4 +What is the reference? e.g. Surname, A 2000, Title: Subtitle, Publisher, City. + +Existing references (copy one or many delimited with "\n"): +1 +2 +3 +4 +|: 4 +Please select a comment to connect the comment "4" to: +1 - 1 +2 - 4 +3 - 4 + +|: 1 +How does "4" connect to "1"? +|: 4 + +The Helper will ask you for your comment on the quote you are about to enter. +Note: Enter in-text reference using AGPS, e.g. +The first work supports the second work (Surname 2000, pp. 18-9). +Surname (2000, pp. 18-9) states that ... +Remember to use words like "also", "moreover" and "in addition" before the sentence. +What is the paraphrased quote to comment on? +|: 1 +What is the reference? e.g. Surname, A 2000, Title: Subtitle, Publisher, City. + +Existing references (copy one or many delimited with "\n"): +1 +2 +3 +4 +|: 1 +Is your comment from a quote (y/n)? +|: y +Note: Enter in-text reference using AGPS, e.g. +The first work supports the second work (Surname 2000, pp. 18-9). +Surname (2000, pp. 18-9) states that ... +Remember to use words like "also", "moreover" and "in addition" before the sentence. +What is the comment that is from the quote? +|: 1 +What is the reference? e.g. Surname, A 2000, Title: Subtitle, Publisher, City. + +Existing references (copy one or many delimited with "\n"): +1 +2 +3 +4 +|: 1 +How does the comment "1" relate to the essay topic "h"? +|: 1 + +The Helper will ask you for your comment on the quote you are about to enter. +Note: Enter in-text reference using AGPS, e.g. 
+The first work supports the second work (Surname 2000, pp. 18-9). +Surname (2000, pp. 18-9) states that ... +Remember to use words like "also", "moreover" and "in addition" before the sentence. +What is the paraphrased quote to comment on? +|: 2 +What is the reference? e.g. Surname, A 2000, Title: Subtitle, Publisher, City. + +Existing references (copy one or many delimited with "\n"): +1 +2 +3 +4 +|: 4 +Is your comment from a quote (y/n)? +|: y +Note: Enter in-text reference using AGPS, e.g. +The first work supports the second work (Surname 2000, pp. 18-9). +Surname (2000, pp. 18-9) states that ... +Remember to use words like "also", "moreover" and "in addition" before the sentence. +What is the comment that is from the quote? +|: 4 +What is the reference? e.g. Surname, A 2000, Title: Subtitle, Publisher, City. + +Existing references (copy one or many delimited with "\n"): +1 +2 +3 +4 +|: 4 +Please select a comment to connect the comment "4" to: +1 - 1 + +|: 1 +How does "4" connect to "1"? +|: 4 + +The Helper will ask you for your comment on the quote you are about to enter. +Note: Enter in-text reference using AGPS, e.g. +The first work supports the second work (Surname 2000, pp. 18-9). +Surname (2000, pp. 18-9) states that ... +Remember to use words like "also", "moreover" and "in addition" before the sentence. +What is the paraphrased quote to comment on? +|: 3 +What is the reference? e.g. Surname, A 2000, Title: Subtitle, Publisher, City. + +Existing references (copy one or many delimited with "\n"): +1 +2 +3 +4 +|: 4 +Is your comment from a quote (y/n)? +|: y +Note: Enter in-text reference using AGPS, e.g. +The first work supports the second work (Surname 2000, pp. 18-9). +Surname (2000, pp. 18-9) states that ... +Remember to use words like "also", "moreover" and "in addition" before the sentence. +What is the comment that is from the quote? +|: 4 +What is the reference? e.g. Surname, A 2000, Title: Subtitle, Publisher, City. 
+ +Existing references (copy one or many delimited with "\n"): +1 +2 +3 +4 +|: 4 +Please select a comment to connect the comment "4" to: +1 - 1 +2 - 4 + +|: 1 +How does "4" connect to "1"? +|: 4 + +The Helper will ask you for your comment on the quote you are about to enter. +Note: Enter in-text reference using AGPS, e.g. +The first work supports the second work (Surname 2000, pp. 18-9). +Surname (2000, pp. 18-9) states that ... +Remember to use words like "also", "moreover" and "in addition" before the sentence. +What is the paraphrased quote to comment on? +|: 4 +What is the reference? e.g. Surname, A 2000, Title: Subtitle, Publisher, City. + +Existing references (copy one or many delimited with "\n"): +1 +2 +3 +4 +|: 4 +Is your comment from a quote (y/n)? +|: y +Note: Enter in-text reference using AGPS, e.g. +The first work supports the second work (Surname 2000, pp. 18-9). +Surname (2000, pp. 18-9) states that ... +Remember to use words like "also", "moreover" and "in addition" before the sentence. +What is the comment that is from the quote? +|: 4 +What is the reference? e.g. Surname, A 2000, Title: Subtitle, Publisher, City. + +Existing references (copy one or many delimited with "\n"): +1 +2 +3 +4 +|: 4 +Please select a comment to connect the comment "4" to: +1 - 1 +2 - 4 +3 - 4 + +|: 1 +How does "4" connect to "1"? +|: 4 +What is the future area of research from your essay about "h"? +|: b +"h + +I will critically analyse h. I will agree with h. 1 1 1 1 1 + +I will expose h in this half. 1 1 2 2 3 3 4 4 +1 1 2 2 3 3 4 4 +1 1 2 2 3 3 4 4 +1 1 2 2 3 3 4 4 +1 1 2 2 3 3 4 4 +I will agree with h in this half. 1 1 1 2 4 4 3 4 4 4 4 4 1 +1 1 1 2 4 4 3 4 4 4 4 4 1 +1 1 1 2 4 4 3 4 4 4 4 4 1 +1 1 1 2 4 4 3 4 4 4 4 4 1 +1 1 1 2 4 4 3 4 4 4 4 4 1 + +1 1 1 1 1 I have agreed with h. b + +Bibliography + +1 +2 +3 +4 +" +true. 
diff --git a/Essay-Helper/walk_through3.1-chicago.html b/Essay-Helper/walk_through3.1-chicago.html new file mode 100644 index 0000000000000000000000000000000000000000..93efa965663cb59aafa1e651300db7b5901eaa5b --- /dev/null +++ b/Essay-Helper/walk_through3.1-chicago.html @@ -0,0 +1 @@ +m

m

I will critically analyse m. I will agree with m. m is because "m11-2". m is because "M1". m is because "m22". m is because "m32". m is because "m42".
I will expose m in this half. I agree with M1 because m331. I agree with M1 because "m42"2.
I agree with m12 because m523. I agree with m12 because "m53"4.
I agree with m22 because m625. I agree with m22 because "m63"6.
I agree with m23 because m727. I agree with m23 because "m73"8.
I agree with m32 because m839. I agree with m32 because "m92"10.
I will agree with m in this half. m is because "m11-2". "m93"11. "m11-2"12. m is because "m11-2".
m is because "M1". "m11-3"13. "M1"14. m is because "M1".
m is because "m22". "m12"15. "m22"16. m is because "m22".
m is because "m32". "m23"17. "m32"18. m is because "m32".
m is because "m42". "m33"19. "m42"20. m is because "m42".
m is because "m11-2". m is because "M1". m is because "m22". m is because "m32". m is because "m42". I have agreed with m. In m, automation should apply to "m43".

Endnotes

1. ref1, p 3.
2. Ibid, p 3.
3. Ibid, p 4.
4. Ibid.
5. Ibid, p 5.
6. Ibid.
7. Ibid, p 6.
8. Ibid.
9. Ibid, p 7.
10. Ibid, p 8.
11. Ibid.
12. Ibid, p 9.
13. Ibid.
14. ref3, p 1.
15. Ibid.
16. Ibid, p 1.
17. Ibid.
18. Ibid, p 2.
19. Ibid.
20. Ibid, p 3.

Bibliography

ref1
ref3
\ No newline at end of file diff --git a/Essay-Helper/walk_through3.1.html b/Essay-Helper/walk_through3.1.html new file mode 100644 index 0000000000000000000000000000000000000000..27fad09f370f20836d3da79d1246b9981bfb0b40 --- /dev/null +++ b/Essay-Helper/walk_through3.1.html @@ -0,0 +1 @@ +m

m

I will critically analyse m. I will agree with m. m is because "m82" (author 2003, p. 8). m is because "m53" (author 2003, p. 5). m is because "m12" (author 2003, p. 1). m is because "m72" (author 2003, p. 7). m is because "m12" (author 2003, p. 1).
I will expose m in this half. m82 (author 2003, p. 8) is because m11-2 (author 2003, p. 11). m82 (author 2003, p. 8) is because "m32" (author 2003, p. 3).
m32 (author 2003, p. 3) is because m92 (author 2003, p. 9). m32 (author 2003, p. 3) is because "m83" (author 2003, p. 8).
m63 (author 2003, p. 6) is because m63 (author 2003, p. 6). m63 (author 2003, p. 6) is because "m33" (author 2003, p. 3).
M1 (author 2003, p. 1) is because m10-3 (author 2003, p. 10). M1 (author 2003, p. 1) is because "m c62" (author 2003, p. 6).
m22 (author 2003, p. 2) is because m62 (author 2003, p. 6). m22 (author 2003, p. 2) is because "M1" (author 2003, p. 1).
I will agree with m in this half. m is because "m82" (author 2003, p. 8). "m53" (author 2003, p. 5). "m82" (author 2003, p. 8). m is because "m82" (author 2003, p. 8).
m is because "m53" (author 2003, p. 5). "m33" (author 2003, p. 3). "m53" (author 2003, p. 5). m is because "m53" (author 2003, p. 5).
m is because "m12" (author 2003, p. 1). "m73" (author 2003, p. 7). "m12" (author 2003, p. 1). m is because "m12" (author 2003, p. 1).
m is because "m72" (author 2003, p. 7). "m83" (author 2003, p. 8). "m72" (author 2003, p. 7). m is because "m72" (author 2003, p. 7).
m is because "m12" (author 2003, p. 1). "m92" (author 2003, p. 9). "m12" (author 2003, p. 1). m is because "m12" (author 2003, p. 1).
m is because "m82" (author 2003, p. 8). m is because "m53" (author 2003, p. 5). m is because "m12" (author 2003, p. 1). m is because "m72" (author 2003, p. 7). m is because "m12" (author 2003, p. 1). I have agreed with m. In m, automation should apply to "m23" (author 2003, p. 2).

Bibliography

ref1
ref2
ref3
\ No newline at end of file diff --git a/Essay-Helper/walk_through3.html b/Essay-Helper/walk_through3.html new file mode 100644 index 0000000000000000000000000000000000000000..7bb5449200429d096f51a74121efc62434644cff --- /dev/null +++ b/Essay-Helper/walk_through3.html @@ -0,0 +1 @@ +Author's Heading

Author's Heading

I will critically analyse Author's Heading. I will agree with Author's Heading. Author's Heading is because of a (author 2000, p 2) because of a (author 2000, p 2). Author's Heading is because of a (author 2000, p 2) because of c (author 2000, p 1). Author's Heading is because of a (author 2000, p 1) because of c (author 2000, p 1). Author's Heading is because of b (author 2000, p 1) because of b (author 2000, p 2). Author's Heading is because of b (author 2000, p 2) because of a (author 2000, p 1).
I will expose Author's Heading in this half. b (author 2000, p 1) is because a (author 2000, p 1). b (author 2000, p 1) is because a (author 2000, p 1). b (author 2000, p 1) is because c (author 2000, p 2). b (author 2000, p 1) is because a (author 2000, p 2). b (author 2000, p 1) is because c (author 2000, p 2). b (author 2000, p 1) is because a (author 2000, p 1). b (author 2000, p 1) is because b (author 2000, p 1). b (author 2000, p 1) is because a (author 2000, p 1). b (author 2000, p 1) is because b (author 2000, p 1). b (author 2000, p 1) is because c (author 2000, p 2).
b (author 2000, p 2) is because c (author 2000, p 2). b (author 2000, p 2) is because b (author 2000, p 2). b (author 2000, p 2) is because a (author 2000, p 1). b (author 2000, p 2) is because a (author 2000, p 1). b (author 2000, p 2) is because c (author 2000, p 2). b (author 2000, p 2) is because a (author 2000, p 2). b (author 2000, p 2) is because b (author 2000, p 1). b (author 2000, p 2) is because b (author 2000, p 1). b (author 2000, p 2) is because c (author 2000, p 2). b (author 2000, p 2) is because c (author 2000, p 1).
b (author 2000, p 2) is because c (author 2000, p 1). b (author 2000, p 2) is because c (author 2000, p 2). b (author 2000, p 2) is because c (author 2000, p 1). b (author 2000, p 2) is because a (author 2000, p 2). b (author 2000, p 2) is because a (author 2000, p 2). b (author 2000, p 2) is because b (author 2000, p 2). b (author 2000, p 2) is because b (author 2000, p 2). b (author 2000, p 2) is because c (author 2000, p 1). b (author 2000, p 2) is because a (author 2000, p 1). b (author 2000, p 2) is because c (author 2000, p 2).
a (author 2000, p 2) is because b (author 2000, p 1). a (author 2000, p 2) is because a (author 2000, p 1). a (author 2000, p 2) is because a (author 2000, p 2). a (author 2000, p 2) is because a (author 2000, p 2). a (author 2000, p 2) is because c (author 2000, p 2). a (author 2000, p 2) is because a (author 2000, p 2). a (author 2000, p 2) is because b (author 2000, p 1). a (author 2000, p 2) is because a (author 2000, p 2). a (author 2000, p 2) is because a (author 2000, p 2). a (author 2000, p 2) is because a (author 2000, p 2).
c (author 2000, p 2) is because c (author 2000, p 1). c (author 2000, p 2) is because a (author 2000, p 2). c (author 2000, p 2) is because c (author 2000, p 2). c (author 2000, p 2) is because b (author 2000, p 2). c (author 2000, p 2) is because c (author 2000, p 1). c (author 2000, p 2) is because b (author 2000, p 1). c (author 2000, p 2) is because c (author 2000, p 1). c (author 2000, p 2) is because c (author 2000, p 2). c (author 2000, p 2) is because b (author 2000, p 1). c (author 2000, p 2) is because a (author 2000, p 1).
I will agree with Author's Heading in this half. Author's Heading is because of a (author 2000, p 2) because of a (author 2000, p 2). c (author 2000, p 2). a (author 2000, p 2). a (author 2000, p 2). b (author 2000, p 2). b (author 2000, p 2) is because of a (author 2000, p 2) because of b (author 2000, p 1). a (author 2000, p 2). b (author 2000, p 2). b (author 2000, p 2) is because of a (author 2000, p 2) because of a (author 2000, p 1). Author's Heading is because of a (author 2000, p 2) because of a (author 2000, p 2).
Author's Heading is because of a (author 2000, p 2) because of c (author 2000, p 1). c (author 2000, p 2). a (author 2000, p 2). a (author 2000, p 1). a (author 2000, p 1). a (author 2000, p 1) is because of a (author 2000, p 2) because of a (author 2000, p 2). a (author 2000, p 2). a (author 2000, p 2). a (author 2000, p 2) is because of a (author 2000, p 2) because of c (author 2000, p 1). Author's Heading is because of a (author 2000, p 2) because of c (author 2000, p 1).
Author's Heading is because of a (author 2000, p 1) because of c (author 2000, p 1). b (author 2000, p 1). a (author 2000, p 1). b (author 2000, p 2). a (author 2000, p 1). a (author 2000, p 1) is because of a (author 2000, p 1) because of c (author 2000, p 1). b (author 2000, p 2). c (author 2000, p 2). c (author 2000, p 2) is because of a (author 2000, p 1) because of a (author 2000, p 2). b (author 2000, p 1). b (author 2000, p 1). b (author 2000, p 1) is because of a (author 2000, p 1) because of b (author 2000, p 1). Author's Heading is because of a (author 2000, p 1) because of c (author 2000, p 1).
Author's Heading is because of b (author 2000, p 1) because of b (author 2000, p 2). c (author 2000, p 2). b (author 2000, p 1). b (author 2000, p 2). b (author 2000, p 2). b (author 2000, p 2) is because of b (author 2000, p 1) because of b (author 2000, p 2). b (author 2000, p 1). b (author 2000, p 1). b (author 2000, p 1) is because of b (author 2000, p 1) because of b (author 2000, p 1). Author's Heading is because of b (author 2000, p 1) because of b (author 2000, p 2).
Author's Heading is because of b (author 2000, p 2) because of a (author 2000, p 1). b (author 2000, p 2). b (author 2000, p 2). c (author 2000, p 1). a (author 2000, p 1). a (author 2000, p 1) is because of b (author 2000, p 2) because of a (author 2000, p 1). a (author 2000, p 1). b (author 2000, p 2). b (author 2000, p 2) is because of b (author 2000, p 2) because of a (author 2000, p 2). c (author 2000, p 2). c (author 2000, p 1). c (author 2000, p 1) is because of b (author 2000, p 2) because of b (author 2000, p 1). Author's Heading is because of b (author 2000, p 2) because of a (author 2000, p 1).
Author's Heading is because of a (author 2000, p 2) because of a (author 2000, p 2). Author's Heading is because of a (author 2000, p 2) because of c (author 2000, p 1). Author's Heading is because of a (author 2000, p 1) because of c (author 2000, p 1). Author's Heading is because of b (author 2000, p 1) because of b (author 2000, p 2). Author's Heading is because of b (author 2000, p 2) because of a (author 2000, p 1). I have agreed with Author's Heading. In Author's Heading, automation should apply to a (author 2000, p 1).

Bibliography

ref
\ No newline at end of file diff --git a/File2List2Br/A database segment in File2List2Br.png b/File2List2Br/A database segment in File2List2Br.png new file mode 100644 index 0000000000000000000000000000000000000000..42eef3de1465f05c5b7deb987d47f20f545ee45d --- /dev/null +++ b/File2List2Br/A database segment in File2List2Br.png @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:67e69ec47149b41dbcdd453eeb2167074ce35920293649c35d0e8cd095ed4f21 +size 140441 diff --git a/File2List2Br/LICENSE b/File2List2Br/LICENSE new file mode 100644 index 0000000000000000000000000000000000000000..3a27c90f1c2986ac0fb73b4ae956c6205343dfdd --- /dev/null +++ b/File2List2Br/LICENSE @@ -0,0 +1,29 @@ +BSD 3-Clause License + +Copyright (c) 2020, Lucian Green +All rights reserved. + +Redistribution and use in source and binary forms, with or without +modification, are permitted provided that the following conditions are met: + +1. Redistributions of source code must retain the above copyright notice, this + list of conditions and the following disclaimer. + +2. Redistributions in binary form must reproduce the above copyright notice, + this list of conditions and the following disclaimer in the documentation + and/or other materials provided with the distribution. + +3. Neither the name of the copyright holder nor the names of its + contributors may be used to endorse or promote products derived from + this software without specific prior written permission. + +THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" +AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE +IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE +DISCLAIMED. 
IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE +FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL +DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR +SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER +CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, +OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE +OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. diff --git a/File2List2Br/README.md b/File2List2Br/README.md new file mode 100644 index 0000000000000000000000000000000000000000..db1ef9ef345be091a6daaecc7d5df6a99c9e2688 --- /dev/null +++ b/File2List2Br/README.md @@ -0,0 +1,89 @@ +# File2List2Br + +Automates high distinctions for large files. + +* file2list.pl - generates files that are lists of breasonings (words that earn high distinctions) from files +* list2br.pl - converts a list of words to breasonings +- shows last 10 assignments with breasoning count +- asks if the user wants 128k breasonings (4000 breasonings or 50 As * 80 breasonings, * 4 for 100%, * 2 for student and to do lecturer's work, * 2 for University and an industry seen-as version * 2 for the student and their friend) or another value for an assignment +- seamlessly tries next list, warning if past end of database, and giving error if database has been finished + +# Getting Started + +Please read the following instructions on how to install the project on your computer for automating high distinctions for large files. + +# Prerequisites + +* Please download and install SWI-Prolog for your machine at `https://www.swi-prolog.org/build/`. + +# 1. Install manually + +Download this repository, the List Prolog Interpreter Repository and the Text to Breasonings Repository. + +# 2. 
Or Install from List Prolog Package Manager (LPPM) + +* Download the LPPM Repository: + +``` +mkdir GitHub +cd GitHub/ +git clone https://github.com/luciangreen/List-Prolog-Package-Manager.git +cd List-Prolog-Package-Manager +swipl +['lppm']. +lppm_install("luciangreen","File2List2Br"). +halt. +``` + +# Running + +* Download the repositories above and save them in a single folder. +* Save a large file in Terminal as a text file. +* Split the file with `split -b 4000000 file.txt`. +* Load `file2list1` by entering `['file2list.pl'].` and run with a command such as `file2list1(["fileout_txt.aa","fileout_txt.ab"]).`, where the file names are the split files from the previous step. Don't close the window. +* In `list_db.txt`, enter the file names of the files generated by `file2list1` and their breasoning counts output by `file2list1`, e.g. +``` +[["t-tmp.txt",12],["a-tmp.txt",2]] +``` +* NB. You can use BBEdit to delete parts of the output and replace them with `\t` (tab) so that the breasoning count can be separated from the unique breasonings in Excel, and then use replacements in BBEdit to format the list. You can use `reverse(Input,Output).` in swipl to reverse lists. Be careful that the filenames correspond to the correct breasoning counts. +* In Shell: +`cd File2List2Br` +`swipl` +`['../Text-to-Breasonings/text_to_breasonings.pl'].` +* Load `list2br.pl` by entering `['list2br.pl'].` and run with the command `list2br.`. +``` +Number of breasoning databases: 2 +Number of previous assignments: 1 +10 Previous Assignments and Breasoning Counts +[Comment,0] +0 of 14 breasonings (0%) used. +Enter the subject and assignment number: +``` +* Enter the university or school subject and the information about the assignment. +``` +Enter for 128k breasonings in this assignment, or the number of breasonings: +``` +* Press `Return` for 128000 breasonings, assuming there are enough in the database (if you would like to, given the reasons above), or enter another value. 
+``` +Number of words in dictionary: 11746 +Number of unique breasonings in dictionary: 662 +Number of words to breason out in file: 1 +Number of unique words in file: 1 +Number of unique breathsonings in dictionary: 364 +true. +``` +* You will be asked to breason out new words. +* You can later delete this information from the `assignment_db.txt` database, e.g. to redo an assignment or to increase its breasoning number and redo it. + +# Caution: + +Follow the instructions in Instructions for Using texttobr(2) when using texttobr, texttobr2 or mind reader to avoid medical problems. + +# Authors + +Lucian Green - Initial programmer - Lucian Academy + +# License + +This project is licensed under the BSD 3-Clause License - see the LICENSE file for details. + diff --git a/File2List2Br/a-tmp.txt b/File2List2Br/a-tmp.txt new file mode 100644 index 0000000000000000000000000000000000000000..648aeba7fa17cc60ce738161e59a41d234083f5e --- /dev/null +++ b/File2List2Br/a-tmp.txt @@ -0,0 +1 @@ +[a1,A2] \ No newline at end of file diff --git a/File2List2Br/assignment_db.txt b/File2List2Br/assignment_db.txt new file mode 100644 index 0000000000000000000000000000000000000000..3674c65a98d79a7856af6e782c57b657ddd02cb4 --- /dev/null +++ b/File2List2Br/assignment_db.txt @@ -0,0 +1 @@ +[["Comment",0]] \ No newline at end of file diff --git a/File2List2Br/file2list.pl b/File2List2Br/file2list.pl new file mode 100644 index 0000000000000000000000000000000000000000..88a8f7aaef575a0e97bed4c604d28820a9fc085d --- /dev/null +++ b/File2List2Br/file2list.pl @@ -0,0 +1,44 @@ +%% file2list.pl + +%% converts a file to a list in filelist.txt + +%%:- include('mergetexttobrdict.pl'). +:- include('la_strings'). + +string(String) --> list(String). + +list([]) --> []. +list([L|Ls]) --> [L], list(Ls). + + +file2list1(Files) :- +%%Files=["fileout_txt.aa","fileout_txt.ab"], +findall(_,(member(File1,Files),file2list(File1)),_). 
+ + +file2list(File1) :- prep(File1,List1), + + string_concat(File1,"list.txt",File2), + (open_s(File2,write,Stream), + write(Stream,List1), + close(Stream)), + !. + +truncate(List1,M,String0) :- + ((number(M),length(String0,M), + append(String0,_,List1))->true; + String0=List1),!. + +prep(File1,List) :- + SepandPad="&#@~%`$?-+*^,()|.:;=_/[]<>{}\n\r\s\t\\\"!'0123456789", + + phrase_from_file_s(string(String00), File1), + + split_string(String00,SepandPad,SepandPad,List), + + length(List,Length1),write("Number of words in file: "), writeln(Length1), + sort(List,List2), +%%writeln([list2,List2]), + length(List2,Length2),write("Number of unique words in file: "), writeln(Length2) + +,!. diff --git a/File2List2Br/list2br.pl b/File2List2Br/list2br.pl new file mode 100644 index 0000000000000000000000000000000000000000..f75fb0da57b8790afba1460cec4845ba92a6f15a --- /dev/null +++ b/File2List2Br/list2br.pl @@ -0,0 +1,153 @@ +%% list2br.pl + +%% converts a list of words to br +%% - shows last 10 assignments with br count +%% - asks if want 128k or other for next br +%% - seamlessly tries next list, warning if exhausted + +%%:- include('mergetexttobrdict.pl'). + +%%**** load:`['../Text-to-Breasonings/text_to_breasonings.pl'].` + + +%% this file + +:- include('../listprologinterpreter/la_strings'). + +string(String) --> list(String). + +list([]) --> []. +list([L|Ls]) --> [L], list(Ls). + + +list2br :- prep_l2b(File_contents_list_term,File_contents_assignments_term1 + ), + + br1_l2b(File_contents_list_term,File_contents_assignments_term1,File_contents_assignments_term2), + term_to_atom(File_contents_assignments_term2,File_contents_assignments_atom), + (open_s("assignment_db.txt",write,Stream), + write(Stream,File_contents_assignments_atom), + close(Stream)), + !. 
+ + +prep_l2b(File_contents_list_term,File_contents_assignments_term) :- + +phrase_from_file_s(string(File_contents_list2), "list_db.txt"), + +string_codes(File_contents_list1,File_contents_list2), +atom_string(File_contents_list,File_contents_list1), + +term_to_atom(File_contents_list_term,File_contents_list), + + length(File_contents_list_term,Length1),write("Number of breasoning databases: "), writeln(Length1), + + +phrase_from_file_s(string(File_contents_assignments2), "assignment_db.txt"), + +string_codes(File_contents_assignments1,File_contents_assignments2), +atom_string(File_contents_assignments,File_contents_assignments1), + +term_to_atom(File_contents_assignments_term,File_contents_assignments), + + length(File_contents_assignments_term,Length2),write("Number of previous assignments: "), writeln(Length2), + +!. + +sum(A,B) :- +A=[C|D],sum(C,D,0,B). +sum(A,[],B,C):-C is A+B,!. +sum(A,B,C,D):-B=[E|F],G is A+C,sum(E,F,G,D). + +%%C is A+B. + +br1_l2b(File_contents_list_term,File_contents_assignments_term1,File_contents_assignments_term2) :- + length(File_contents_assignments_term1,Ass_length), + (Ass_length=<10->New_ass=File_contents_assignments_term1; + B is 10,length(New_ass,B),append(_,New_ass,File_contents_assignments_term1)), + writeln("10 Previous Assignments and Breasoning Counts"), + findall(_,(member(C,New_ass),writeln(C)),_), + %%writeln(""), + findall(E,(member(D,File_contents_assignments_term1),D=[_,E]),F), sum(F,G), + findall(H,(member(I,File_contents_list_term),I=[_,H]),J), sum(J,K), + L is round(100*(G/K)), + + write(G),write(" of "),write(K),write(" breasonings ("),write(L),writeln("%) used."), + + (repeat,write("Enter the subject and assignment number: "), + read_string(user_input, "\n", "\r", _NC1, NC2)), + + (repeat,write("Enter for 128k breasonings in this assignment, or the number of breasonings: "), + read_string(user_input, "\n", "\r", _NB1, NB2),split_string(NB2, "", " ", NB3), +NB3=[NB4],(NB4=""->NB5=128000;number_string(NB5,NB4))), + 
+ append(File_contents_assignments_term1,[[NC2,NB5]],File_contents_assignments_term2), + + %%(File_contents_list_term=[]->(concat_list(["Error: Database is empty."],A), + %%writeln(A),abort); + (find_br1(File_contents_list_term,G,NB5,0,[],Text_term), + % convert to string + term_to_atom(Text_term,Text_atom), + string_atom(Text_string,Text_atom), + % t2b2 + + %%writeln1(Text_string), + + texttobr2(u,u,Text_string,u,true,false,false,false,false,false), + texttobr(u,u,Text_string,u))%% +. + +%% if last list, give warning, count so far and text + +%% ** if past, error +find_br1([],_Start,_Len_needed,_Len_so_far1,_Text1,_Text2) :- + concat_list(["Error: Start is past end of database."],A), + writeln(A),abort. +find_br1(File_contents_list_term,Start,Len_needed,Len_so_far1,Text1,Text2) :- + +%%trace, + + File_contents_list_term=[[_Filename,Len]|Rest], + Len_so_far2 is Len_so_far1+Len, + (Len_so_far2 > Start -> + ( +find_br2(File_contents_list_term,Start,Len_needed,Len_so_far1,Text1,Text2) +); %% Len_so_far2 < Start +(%%Len_so_far2 is Len_so_far1+Len, +Len_so_far3 is 0,%%Start2 is 0,Len_needed2 is Len_needed-B. +Start2 is Start-Len, +find_br1(Rest,Start2,Len_needed,Len_so_far3,Text1,Text2))) +. + +%% find text from file +find_br2([],_Start,Len_needed,_Len_so_far1,Text,Text) :- + concat_list(["Warning: End of database has been reached. ",Len_needed," breasonings remaining."],A), writeln(A). 
+

%% find_br2: read Filename, take breasonings from this list, spilling into the next list if needed
find_br2(File_contents_list_term,Start,Len_needed,Len_so_far1,Text1,Text2) :-
	File_contents_list_term=[[Filename,Len]|Rest],

phrase_from_file_s(string(File_contents_newlist2), Filename),

string_codes(File_contents_newlist1,File_contents_newlist2),
atom_string(File_contents_newlist0,File_contents_newlist1),
downcase_atom(File_contents_newlist0,File_contents_newlist),

term_to_atom(File_contents_newlist_term,File_contents_newlist),

	Start_of_list is Start-Len_so_far1,
	Len_to_end_of_list is Len - Start_of_list,

(Len_needed=<Len_to_end_of_list->
%% the rest of this list is long enough: take Len_needed items from Start_of_list
(C is Start_of_list,
length(Texta,Len_needed),length(A,C),append(A,F,File_contents_newlist_term),append(Texta,_,F),append(Text1,Texta,Text2));
%% otherwise take the rest of this list, then continue with the next list
(B is Len-Start,length(Texta,B),append(_,Texta,File_contents_newlist_term),append(Text1,Texta,Text3),

Len_so_far2 is 0,Start2 is 0,Len_needed2 is Len_needed-B,
find_br2(Rest,Start2,Len_needed2,Len_so_far2,Text3,Text2)) %% go to next list
).
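The find_br1/find_br2 clauses walk the [Filename,Length] pairs of the database index, skipping whole databases until the global start offset falls inside one, then collecting breasonings until the requested count is met, spilling into later databases as needed. A minimal Python sketch of that windowing logic, using hypothetical in-memory lists in place of the database files:

```python
def collect_window(dbs, start, needed):
    """Collect `needed` items beginning at global offset `start` across a
    sequence of (name, items) databases, mirroring find_br1/find_br2:
    skip whole databases until the offset lands inside one, then take
    items, continuing into later databases until the count is met."""
    out = []
    for _name, items in dbs:
        n = len(items)
        if start >= n:            # offset past this database: skip it whole
            start -= n
            continue
        take = items[start:start + needed]
        out.extend(take)
        needed -= len(take)
        start = 0                 # later databases are read from the top
        if needed == 0:
            return out
    # end of data reached with items still needed ("Warning: End of
    # database has been reached" case): return what was collected
    return out

dbs = [("t-tmp.txt", ["t1", "t2", "t3", "t4"]), ("a-tmp.txt", ["a1", "a2"])]
collect_window(dbs, 3, 2)  # → ["t4", "a1"]
```

The file names and items here are stand-ins for the list_db.txt entries; the real code reads each file with phrase_from_file_s and term_to_atom before slicing.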
+ diff --git a/File2List2Br/list_db.txt b/File2List2Br/list_db.txt new file mode 100644 index 0000000000000000000000000000000000000000..dcca6af8dcf16ff39773424e87007fce982f10c9 --- /dev/null +++ b/File2List2Br/list_db.txt @@ -0,0 +1 @@ +[["t-tmp.txt",12],["a-tmp.txt",2]] \ No newline at end of file diff --git a/File2List2Br/t-tmp.txt b/File2List2Br/t-tmp.txt new file mode 100644 index 0000000000000000000000000000000000000000..7370f3276428c4e3e1c1dcb8e44f730c965398b6 --- /dev/null +++ b/File2List2Br/t-tmp.txt @@ -0,0 +1 @@ +[t1,t2,t3,t4,t5,t6,t7,t8,t9,t10,t11,t12] \ No newline at end of file diff --git a/Languages/.DS_Store b/Languages/.DS_Store new file mode 100644 index 0000000000000000000000000000000000000000..5008ddfcf53c02e82d7eee2e57c38e5672ef89f6 Binary files /dev/null and b/Languages/.DS_Store differ diff --git a/Languages/LICENSE b/Languages/LICENSE new file mode 100644 index 0000000000000000000000000000000000000000..3a27c90f1c2986ac0fb73b4ae956c6205343dfdd --- /dev/null +++ b/Languages/LICENSE @@ -0,0 +1,29 @@ +BSD 3-Clause License + +Copyright (c) 2020, Lucian Green +All rights reserved. + +Redistribution and use in source and binary forms, with or without +modification, are permitted provided that the following conditions are met: + +1. Redistributions of source code must retain the above copyright notice, this + list of conditions and the following disclaimer. + +2. Redistributions in binary form must reproduce the above copyright notice, + this list of conditions and the following disclaimer in the documentation + and/or other materials provided with the distribution. + +3. Neither the name of the copyright holder nor the names of its + contributors may be used to endorse or promote products derived from + this software without specific prior written permission. 
+
+THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
+AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
+IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
+DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE
+FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
+DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
+SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
+CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
+OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
+OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
diff --git a/Languages/README.md b/Languages/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..1d433941c601529c5809e99df9c1a2695d52b9cc
--- /dev/null
+++ b/Languages/README.md
@@ -0,0 +1,130 @@
+# Languages
+
+Supports international versions of repositories such as the List Prolog Interpreter.
+
+# Getting Started
+
+Please read the following instructions on how to install the project on your computer for coding in other languages.
+
+# Prerequisites
+
+* Please download and install SWI-Prolog for your machine at `https://www.swi-prolog.org/build/`.
+
+* You may need to install gawk using Homebrew.
+
+* Install Translation Shell on Mac, etc. Change the line
+```
+trans_location("../../../gawk/trans").
+```
+in `culturaltranslationtool/ctt2.pl` to the correct location of trans.
+
+# 1. Install manually
+
+Download this repository, Prolog to List Prolog, the List Prolog Interpreter and the Cultural Translation Tool.
+
+# 2. Or Install from List Prolog Package Manager (LPPM)
+
+* Download the LPPM Repository:
+
+```
+mkdir GitHub
+cd GitHub/
+git clone https://github.com/luciangreen/List-Prolog-Package-Manager.git
+cd List-Prolog-Package-Manager
+swipl
+['lppm'].
+lppm_install("luciangreen","Languages").
+halt +``` + +# Running + +* In Shell: +`cd listprologinterpreter` +`['listprolog'].` + +* Running the tests: See List Prolog Interpreter. + +# Languages commands + +* To install a new language for LPI (see available language codes): +``` +cd Languages +swipl +['lang_db_generator.pl']. +lang_db_generator("en2",["de","ga"]). +``` +to install German (de) and Irish (ga) (in the listprologinterpreter/languages folder, not in the Languages repository). The English2 file is used to translate English commands. + +* To generate LPI documentation in language "en2" (i.e. "concatenate strings" instead of "stringconcat"): +``` +cd Languages +swipl +['../listprologinterpreter/listprolog.pl']. +make_docs("docs_full.txt"). +``` + +* To generate LPI documentation in a language: +``` +cd Languages +swipl +['make_docs.pl']. +% run one of the following lines every few hours +make_docs("docs1.txt"). +make_docs("docs2.txt"). +make_docs("docs3.txt"). +make_docs("docs4.txt"). +make_docs("docs5.txt"). +make_docs("docs6.txt"). +make_docs("docs6-1.txt"). +make_docs("docs6-2.txt"). +make_docs("docs7.txt"). +make_docs("docs8.txt"). +make_docs("docs9.txt"). +make_docs("docs10.txt"). +make_docs("docs11.txt"). +make_docs("docs12.txt"). +``` + +The following is displayed, and the user enters a language code (installed from the previous step): +``` +Enter target language code for LPI docs: +|: fr +``` + +* Note: The trans shell software has a quota, so it is recommended to translate 60 words of doc.txt at a time by editing it before running. + +* You could translate docs.txt into the target language "en2" and then into your chosen target language with Google Translate. + +* To translate an algorithm from one language to another: + +``` +cd Languages +swipl +['make_docs.pl']. 
+trans_alg([[[n,reverse],[[],[v,l],[v,l]]],[[n,reverse],[[v,l],[v,m],[v,n]],":-",[[[n,head],[[v,l],[v,h]]],[[n,tail],[[v,l],[v,t]]],[[n,wrap],[[v,h],[v,h1]]],[[n,append],[[v,h1],[v,m],[v,o]]],[[n,reverse],[[v,t],[v,o],[v,n]]]]]],"en","fr",A),writeln1(A). + +[[["n","inverser"],[[],["v","l"],["v","l"]]],[["n","inverser"],[["v","l"],["v","m"],["v","n"]],":-",[[["n","tête"],[["v","l"],["v","h"]]],[["n","queue"],[["v","l"],["v","t"]]],[["n","emballage"],[["v","h"],["v","h 1"]]],[["n","ajouter"],[["v","h 1"],["v","m"],["v","o"]]],[["n","inverser"],[["v","t"],["v","o"],["v","n"]]]]]] +``` + +* Note: When translating an algorithm from another language to English, it is better to avoid errors from capitalised strings by using the language code "en2" rather than "en", i.e. + +``` +trans_alg([[[n,reverse],[[],[v,l],[v,l]]],[[n,reverse],[[v,l],[v,m],[v,n]],":-",[[[n,head],[[v,l],[v,h]]],[[n,tail],[[v,l],[v,t]]],[[n,wrap],[[v,h],[v,h1]]],[[n,append],[[v,h1],[v,m],[v,o]]],[[n,reverse],[[v,t],[v,o],[v,n]]]]]],"en","en2",A),writeln1(A). + +[[["n","reverse"],[[],["v","l"],["v","l"]]],[["n","reverse"],[["v","l"],["v","m"],["v","n"]],":-",[[["n","head"],[["v","l"],["v","h"]]],[["n","tail"],[["v","l"],["v","t"]]],[["n","wrap"],[["v","h"],["v","h 1"]]],[["n","append"],[["v","h 1"],["v","m"],["v","o"]]],[["n","reverse"],[["v","t"],["v","o"],["v","n"]]]]]] +``` + +# Versioning + +We will use SemVer for versioning. + +# Authors + +Lucian Green - Initial programmer - Lucian Academy + +# License + +I licensed this project under the BSD3 License - see the LICENSE.md file for details diff --git a/Languages/data.pl b/Languages/data.pl new file mode 100644 index 0000000000000000000000000000000000000000..7917fa2e13095347084043b2e4ae78b3452b7e0f --- /dev/null +++ b/Languages/data.pl @@ -0,0 +1,188 @@ +:-include('../listprologinterpreter/operators.pl'). + +data([],Value,Value) :- !. 
+/** +data(Value1a,Value2,Value3) :- + data_symbol(Symbol), + ((Value1a=Symbol)-> + append(Value2,[Value1a],Value3)),!. +**/ +%data(Value1a,Value2,Value3) :- +% ((Value1a="=\\=")-> +% append(Value2,[Value1a],Value3)),!. +%data(Value1a,Value2,Value3) :- +% ((Value1a="->")-> +% append(Value2,[Value1a],Value3)),!. +data1(Value1a,Value1b%Value1a,Value2,Value3 +) :- + from_lang2(From_lang), + to_lang2(To_lang), + %trace, + ((atom(Value1a)->true;(string(Value1a)->true;(number(Value1a))))-> + (translate2(Value1a,From_lang,To_lang,Value1b))), %% translate1a,2 + %append(Value2,[Value1b],Value3))), + !. +data([Value1],Value2,Value3) :- + var(Value1), + append(Value2,[Value1],Value3),!. +data(Value1,Value2,Value3) :- +%trace, + to_lang2(To_lang), + from_lang2(From_lang), + get_lang_word3(v,"en",From_lang,V), + Value1=[V,Value4], + get_lang_word3(V,From_lang,To_lang,Value6b), + data(Value4,_,Value6), + (var(Value6)->(Value6c=Value6,Value6c1=[]);Value6=[Value6c|Value6c1]), + foldr(append,[[Value6b],[Value6c],Value6c1],Value6d), + append(Value2,Value6d,Value3),!. + +data(Value1,Value2,Value3) :- + to_lang2(To_lang), + from_lang2(From_lang), + get_lang_word3(input,"en",From_lang,I), + %trace, + Value1=I,%->true;(notrace,fail)), + + get_lang_word3(I,From_lang,To_lang,Value6b), + %data1(Value4,Value6), + append(Value2,[Value6b],Value3),!. +data(Value1,Value2,Value3) :- + to_lang2(To_lang), + from_lang2(From_lang), + get_lang_word3(output,"en",From_lang,O), + Value1=O, + get_lang_word3(O,From_lang,To_lang,Value6b), + %data1(Value4,Value6), + append(Value2,[Value6b],Value3),!. + +data(Value1a,Value2,Value3) :- + %from_lang2(From_lang), + %to_lang2(To_lang), + %trace, + ((atom(Value1a)->true;(string(Value1a)->true;(number(Value1a))))-> + (Value1a=Value1b,%translate2(Value1a,From_lang,To_lang,Value1b), %% translate1a,2 + append(Value2,[Value1b],Value3))), + !. 
+data(Value1,Value2,Value3) :- + to_lang2(To_lang), + from_lang2(From_lang), + get_lang_word3(n,"en",From_lang,N), + Value1=([N,Value4]=Value4a), + get_lang_word3(N,From_lang,To_lang,Value6b), + data1(Value4,Value6), + (Value4a=[_|_]->data(Value4a,[],Value6a); + data(Value4a,[],[Value6a])), + append(Value2,[[Value6b,Value6]=Value6a],Value3),!. +data(Value1,Value2,Value3) :- + to_lang2(To_lang), + from_lang2(From_lang), + get_lang_word3(n,"en",From_lang,N), + Value1=[[N,Value4]=Value4a], + get_lang_word3(N,From_lang,To_lang,Value6b), + data1(Value4,Value6), + (Value4a=[_|_]->data(Value4a,[],Value6a); + data(Value4a,[],[Value6a])), + append(Value2,[[Value6b,Value6]=Value6a],Value3),!. +data(Value1,Value2,Value3) :- + to_lang2(To_lang), + from_lang2(From_lang), + get_lang_word3(v,"en",From_lang,V), + Value1=([V,Value4]=Value4a), + get_lang_word3(V,From_lang,To_lang,Value6b), + data1(Value4,Value6), + (Value4a=[_|_]->data(Value4a,[],Value6a); + data(Value4a,[],[Value6a])), + append(Value2,[[Value6b,Value6]=Value6a],Value3),!. +data(Value1,Value2,Value3) :- + to_lang2(To_lang), + from_lang2(From_lang), + get_lang_word3(v,"en",From_lang,V), + Value1=[[V,Value4]=Value4a], + get_lang_word3(V,From_lang,To_lang,Value6b), + data1(Value4,Value6), + (Value4a=[_|_]->data(Value4a,[],Value6a); + data(Value4a,[],[Value6a])), + append(Value2,[[Value6b,Value6]=Value6a],Value3),!. + +data(Value1,Value2,Value3) :- + to_lang2(To_lang), + from_lang2(From_lang), + get_lang_word3(n,"en",From_lang,N), + Value1=[N,Value4], + get_lang_word3(N,From_lang,To_lang,Value6b), + data1(Value4,Value6), + append(Value2,[Value6b,Value6],Value3),!. + +data(Value1,Value2,Value3) :- + to_lang2(To_lang), + from_lang2(From_lang), + get_lang_word3(t,"en",From_lang,T), + Value1=[T,Value4], + get_lang_word3(T,From_lang,To_lang,Value6b), + data1(Value4,Value6), + append(Value2,[Value6b,Value6],Value3),!. 
+ + + +data(Value1,Value2,Value3) :- + Value1=(Value4=Value4a), + data(Value4,[],Value6), + data(Value4a,[],Value6a), + append(Value2,[Value6=Value6a],Value3). +data(Value1,Value2,Value3) :- + Value1=[Value4=Value4a], + data(Value4,[],Value6), + data(Value4a,[],Value6a), + append(Value2,[Value6=Value6a],Value3). +data(Value1,Value2,Value3) :- + Value1=[[Value4|Value4a]|Value5], + data([Value4|Value4a],[],Value6), + append(Value2,[Value6],Value7), + data(Value5,Value7,Value3),!. +data(Value1,Value2,Value3) :- + Value1=[""|Value5], + append(Value2,[""],Value7), + data(Value5,Value7,Value3),!. +data(Value1,Value2,Value3) :- + Value1=[[]|Value5], + append(Value2,[[]],Value7), + data(Value5,Value7,Value3),!. +data(Value1,Value2,Value3) :- + Value1=[Value4|Value5], + data(Value4,[],Value6), + append(Value2,Value6,Value7), + data(Value5,Value7,Value3),!. + +data(Value1,Value2,Value3b) :- + +curly_head_taila(Value1,Value4,Value5), +%trace, + data([Value4],[],Value6), + append(Value2,Value6,Value7), + data(Value5,Value7,Value3), + curly_square(Value3a,[Value3]), + [Value3a]=Value3b,!. + +data_symbol(":-"). +data_symbol("->"). +data_symbol("|"). +data_symbol(=). +data_symbol(A) :- operator(A). +data_symbol(A) :- comparisonoperator(A). + +translate1("v","en","fr","v"). +translate1("true","en","fr","vrai"). +translate1("v","en","de","v"). +translate1("true","en","de","wahr"). +translate3(A,_,_,A). + +trans_alg(Algorithm1,From_lang,To_lang,Algorithm2) :- + retractall(from_lang2(_)), + assertz(from_lang2(From_lang)), + retractall(to_lang2(_)), + assertz(to_lang2(To_lang)), + retractall(lang(_)), + assertz(lang(To_lang)), + load_lang_db, + data([Algorithm1],[],[Algorithm2]),!. 
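The data/3 clauses above thread the from_lang2/to_lang2 settings through every sub-term, translating predicate and variable words while leaving list structure (and numbers) untouched, which is what produces the reverse → inverser output shown in the README. A rough Python sketch of that recursive walk, with a tiny hypothetical word table standing in for get_lang_word3/translate2:

```python
# hypothetical translation table standing in for the language database
WORDS = {"n": "n", "v": "v", "reverse": "inverser", "head": "tête"}

def trans_term(term):
    """Recursively translate a List Prolog term, keeping structure:
    lists are walked element by element, strings are looked up in the
    table (falling back to the original word), numbers pass through."""
    if isinstance(term, list):
        return [trans_term(t) for t in term]
    if isinstance(term, str):
        return WORDS.get(term, term)
    return term

trans_term([["n", "reverse"], [["v", "l"], ["v", "m"]]])
# → [["n", "inverser"], [["v", "l"], ["v", "m"]]]
```

The real data/3 also handles `=` terms, curly brackets and mode words (input/output) as separate clauses; this sketch only shows the core structural recursion.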
diff --git a/Languages/docs1.txt b/Languages/docs1.txt new file mode 100644 index 0000000000000000000000000000000000000000..a13de758411ea53fd8dbbe78a49329fb2fee031e --- /dev/null +++ b/Languages/docs1.txt @@ -0,0 +1,46 @@ +# List Prolog Interpreter + +# Prerequisites + +* Please download and install SWI-Prolog for your machine at https://www.swi-prolog.org/build/. + +* You may need to install gawk using Homebrew. + +* Install Translation Shell on Mac, etc. +Change line in & culturaltranslationtool/ctt2.pl +trans_location("../../../gawk/trans").& +to correct location of trans. + +# 1. Install manually + +Download this repository, the Languages repository and Cultural Translation Tool into the same folder. + +# 2. Or Install from List Prolog Package Manager (LPPM) + +* Download the LPPM Repository: + +& +mkdir GitHub +cd GitHub/ +git clone https://github.com/luciangreen/List-Prolog-Package-Manager.git +cd List-Prolog-Package-Manager +swipl +['lppm']. +lppm_install("luciangreen","listprologinterpreter"). +halt +& + +# Running + +* Download the repository to your machine. In Shell: +cd listprologinterpreter +swipl +['listprolog']. + +* Running the tests +To run all tests, enter: +&test(off,NTotal,Score).& + +* To run a specific test: +&test1(off,TestNumber,Passed).& +where TestNumber is the test number from lpiverify4.pl. 
diff --git a/Languages/docs10.txt b/Languages/docs10.txt new file mode 100644 index 0000000000000000000000000000000000000000..215228bb76edf494af66da008bfdb3fb3822617c --- /dev/null +++ b/Languages/docs10.txt @@ -0,0 +1,18 @@ +test_types_cases(7, +`[[n,map],[[[n,add],[[[n,add],[[[n,add],[1]]]]]],0,[v,d]]]`, + +`[ +[[n,map],[[t,map1],[t,number],[t,number]]], +[[t,map1],[[t,number]]], +[[t,map1],[[[t,predicatename],[[t,map1]]]]], +[[n,add],[[t,number],[t,number],[t,number]]], +[[n,getitemn],[[t,number],{[t,any]},[t,any]]] +]`, + +`[ +[[n,map],[input,input,output]], + +[[n,add],[input,input,output]], + +[[n,getitemn],[input,input,output]] +]`, \ No newline at end of file diff --git a/Languages/docs11.txt b/Languages/docs11.txt new file mode 100644 index 0000000000000000000000000000000000000000..ed4b29c61b39fae0af3c87376655e05b08d5e386 --- /dev/null +++ b/Languages/docs11.txt @@ -0,0 +1,14 @@ +`[ +[[n,map],[[v,f1],[v,n1],[v,n]],":-",[[[n,number],[[v,f1]]],[[n,add],[[v,n1],[v,f1],[v,n]]]]], + +[[n,map],[[v,f1],[v,l],[v,n]],":-",[[[n,equals1],[[v,f1],[[v,f11],[v,f12]]]],[[n,=],[[v,f11],[n,add]]],[[n,getitemn],[1,[v,f12],[v,bb]]],[[v,f11],[[v,l],1,[v,l2]]],[[n,map],[[v,bb],[v,l2],[v,n]]]]], + +[[n,add],[[v,a],[v,b],[v,c]],":-",[[n,+],[[v,a],[v,b],[v,c]]]], + +[[n,getitemn],[1,[v,b],[v,c]],":-",[[n,head],[[v,b],[v,c]]]], + +[[n,getitemn],[[v,a],[v,b],[v,c]],":-",[[[n,not],[[[n,=],[[v,a],1]]]],[[n,tail],[[v,b],[v,t]]],[[n,-],[[v,a],1,[v,d]]],[[n,getitemn],[[v,d],[v,t],[v,c]]]]] +]`, + +`[[[[v,d], 4]]]` +). 
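In the docs10/docs11 map example, each nested [n,add] wrapper adds 1 to the accumulator and the innermost bare number is added last, so the triply nested call over 0 yields 4. A hedged Python sketch of that evaluation, with tuples standing in for the List Prolog terms:

```python
def map_apply(f, acc):
    """Evaluate the docs10/docs11 map example: a bare number is added
    to the accumulator (base case); an ("add", inner) wrapper adds 1
    for its nesting level and recurses on the inner term."""
    if isinstance(f, int):
        return acc + f
    _name, inner = f  # _name is always "add" in this example
    return map_apply(inner, acc + 1)

map_apply(("add", ("add", ("add", 1))), 0)  # three wrappers + 1 = 4
```

This mirrors the test's expected result `[[[[v,d], 4]]]`; the interpreter itself reaches the same value via getitemn and the [v,f11] higher-order call.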
\ No newline at end of file diff --git a/Languages/docs12.txt b/Languages/docs12.txt new file mode 100644 index 0000000000000000000000000000000000000000..d7229abc1df88ad684b580eb024ce4797636311a --- /dev/null +++ b/Languages/docs12.txt @@ -0,0 +1,3 @@ +* For other examples, please see lpiverify4.pl, lpiverify4_types.pl (for examples with types, including the list type), lpiverify4_open.pl (for examples with open-ended results) and lpiverify4_open_types.pl (for examples with open-ended results with types). + +* See https://github.com/luciangreen/listprologinterpreter/blob/master/LPI_docs.md for complete List Prolog Documentation. diff --git a/Languages/docs2.txt b/Languages/docs2.txt new file mode 100644 index 0000000000000000000000000000000000000000..d49a02faa691d2a5fd0f68880d994c0f88d9a3e6 --- /dev/null +++ b/Languages/docs2.txt @@ -0,0 +1,31 @@ +* The query &test1(off,7,Passed).& tests the reverse predicate: +test(7,`[[n,reverse],[[1,2,3],[],[v,l]]]`, +`[[[n,reverse],[[],[v,l],[v,l]]],[[n,reverse],[[v,l],[v,m],[v,n]],":-",[[[n,head],[[v,l],[v,h]]],[[n,tail],[[v,l],[v,t]]],[[n,wrap],[[v,h],[v,h1]]],[[n,append],[[v,h1],[v,m],[v,o]]],[[n,reverse],[[v,t],[v,o],[v,n]]]]]]`, +`[[[v,l], [3, 2, 1]]]`). + +# Documentation + +* The interpreter is called in the form: +&international_interpret([lang,en]&,debug,query,type_statements,mode_statements,functions,result). + +Where: +debug - on or off for trace, +query - the query e.g. [[n,reverse],[[1,2,3],[],[v,l]]] +type statements - e.g. [[n,reverse],[{[t, number]}, {[t, number]}, {[t, number]}]] +mode statements - e.g. [[n,reverse],[input,input,output]] +functions - the algorithm e.g. see reverse above +result - the result, e.g. [[[v,l], [3, 2, 1]]] ([] indicates failed and [[]] indicates the empty list). 
+ +* Statements may be negated in the form: + +`[[n,not],[statement]]` + +* Statements may be connected by the disjunctive (or): + +`[[n,or],[statements1,statements2]]` + +* If-then statements may either be in the form: + +`[[n,"->"],[statements1,statements2]]` + +This means "If Statements1 then Statements2". diff --git a/Languages/docs3.txt b/Languages/docs3.txt new file mode 100644 index 0000000000000000000000000000000000000000..b4c7f06f4eed5918a6639601b4a5b5885efa9b61 --- /dev/null +++ b/Languages/docs3.txt @@ -0,0 +1,30 @@ +* Or, if-then statements may be in the form: + +`[[n,"->"],[statements1,statements2,statements2a]]` + +This means "If Statements1 then Statements2, else Statements2a". + +* `[[n,+],[1,2,[v,b]]]` `[v,b]=3` +* `[[n,-],[1,2,[v,b]]]` `[v,b]=(-1)` +* `[[n,*],[1,2,[v,b]]]` `[v,b]=2` +* `[[n,/],[1,2,[v,b]]]` `[v,b]=0.5` + +* `[[n,cut]]` - behaves like swipl's ! (doesn't allow backtracking forward or back past it) + +* `[[n,true]]` - behaves like true + +* `[[n,fail]]` - fails the current predicate + +* `[[n,atom],[variable]]`, e.g. `[[n,atom],[[v,a]]]` - returns true if `[v,a]='a'`, an atom + +* `[[n,string],[variable]]`, e.g. `[[n,string],[[v,a]]]` - returns true if `[v,a]="a"`, a string + +* `[[n,number],[variable]]`, e.g. `[[n,number],[14]]` - returns true where `14` is a number + +* `[[n,letters],[variable]]`, e.g. `[[n,letters],["abc"]]` - returns true where `"abc"` is letters + +* `[[n,is_operator],[variable1,variable2]]`, where `is_operator=is` or `is_operator="="`, e.g. `[[n,=],[[v,a],1]]` - returns true if `[v,a]=1` + +* `[[n,comparison_operator],[variable1,variable2]]`, where comparison_operator=">",">=","<", "=<", "=" or =\= e.g. `[[n,>=],[1,2]]` - returns (1>=2)=false. + +* `[[n,equals1],[variable1,[variable2,variable3]]]` e.g. 
`[[n,equals1],[["a","b"],[[v,a],[v,b]]]]` returns `[v,a]="a"` and `[v,b]="b"` diff --git a/Languages/docs4.txt b/Languages/docs4.txt new file mode 100644 index 0000000000000000000000000000000000000000..f5617eb79fbae518524213284f085ac0379a1bcd --- /dev/null +++ b/Languages/docs4.txt @@ -0,0 +1,15 @@ +* `[[n,equals2],[variable1,[variable2,variable3]]]` e.g. `[[n,equals2],[[v,a],["a","b"]]]` returns `[v,a]=["a","b"]` + +* `[[n,equals3],[variable1,variable2]]` e.g. `[[n,equals3],[[v,a],[1,2,3]]]` returns `[v,a]=[1,2,3]` + +* `[[n,wrap],[variable1,variable2]]` e.g. `[[n,wrap],["a",[v,b]]]` returns `[v,b]=["a"]` + +* `[[n,unwrap],[variable1,variable2]]` e.g. `[[n,wrap],[["a"],[v,b]]]` returns `[v,b]="a"` + +* `[[n,head],[variable1,variable2]]` e.g. `[[n,head],[["a","b","c"],[v,b]]]` returns `[v,b]="a"` + +* `[[n,tail],[variable1,variable2]]` e.g. `[[n,tail],[["a","b","c"],[v,b]]]` returns `[v,b]=["b","c"]` + +* `[[n,member],[Variable1,Variable2]]` e.g. `[[n,member],[[v,b],["a","b","c"]]]` returns `[v,b]="a"`, `[v,b]="b"` or `[v,b]="c"`, or e.g. `[[n,member],["c",["a","b","c"]]]` returns true. (Formerly member2). + +* `[[n,delete],[variable1,variable2,variable3]]` e.g. `[[n,delete],[["a","b","b","c"],"b",[v,c]]]` returns `[v,c]=["a","c"]` diff --git a/Languages/docs5.txt b/Languages/docs5.txt new file mode 100644 index 0000000000000000000000000000000000000000..9f5f562f10a58a1f1ca368ae3991cee1ac1a0660 --- /dev/null +++ b/Languages/docs5.txt @@ -0,0 +1,17 @@ +* `[[n,append],[variable1,variable2,variable3]]` e.g. `[[n,append],[["a","b"],["c"],[v,d]]]` returns `[v,d]=["a","b","c"]`. Note: variable2 must be in list form, not e.g. `"c"`, or the command will fail. Wrap may wrap in []. + +* `[[n,stringconcat],[variable1,variable2,variable3]]` e.g. `[[n,stringconcat],["hello ","john",[v,c]]]` returns `[v,c]="hello john"`. + +* `[[n,stringtonumber],[variable1,variable2]]` e.g. `[[n,stringtonumber],["3",[v,b]]]` returns `[v,b]=3` + +* `[[n,random],[variable1]]` e.g. 
`[[n,random],[[v,r]]]` returns e.g. `[v,r]=0.19232608946956326` + +* `[[n,length],[variable1,variable2]]` e.g. `[[n,length],[[1,2,3],[v,l]]]` returns `[v,l]=3` + +* `[[n,ceiling],[variable1,variable2]]` e.g. `[[n,ceiling],[0.19232608946956326,[v,c]]]` returns `[v,c]=1` + +* `[[n,date],[year,month,day,hour,minute,seconds]]` e.g. `[[n,date],[[v,year],[v,month],[v,day],[v,hour],[v,minute],[v,second]]]` returns e.g. `[v,year]=2019`, `[v,month]=11`, `[v,day]=6`, `[v,hour]=12`, `[v,minute]=15`, `[v,second]=20.23353409767151`. + +* `[[n,sqrt],[variable1,variable2]]` e.g. `[[n,ceiling],[4,[v,c]]]` returns `[v,c]=2` + +* `[[n,round],[variable1,variable2]]` e.g. `[[n,round],[1.5,[v,c]]]` returns `[v,c]=2` diff --git a/Languages/docs6-1.txt b/Languages/docs6-1.txt new file mode 100644 index 0000000000000000000000000000000000000000..1fb8e1cc40bbc1aeaa99bef3a064234ade515b69 --- /dev/null +++ b/Languages/docs6-1.txt @@ -0,0 +1,9 @@ +* `[[n,writeln],[Variable1]]` e.g. `[[n,writeln],[[v,a]]]` writes `[v,a]` which is `"hello"` + +* `[[n,atom_string],[Variable1,Variable2]]` e.g. `[[n,atom_string],[a,[v,b]]]` returns `[v,b]="a"` or `[[n,atom_string],[[v,b],"a"]]` returns `[v,b]=a` + +* (1) `[[n,call],[Function,Arguments]]` e.g. `[[n,call],[[n,member2a],[[v,b],["a","b","c"]]]]` returns `[v,b]=a` + +* (2) `[[n,call],[[lang,Lang],Debug,[Function,Arguments],Functions]]` e.g. `[[n,call],[[lang,same],same,[[n,member2a],[[v,b],["a","b","c"]]], +[[[n,member2a],[[v,b],[v,a]],":-", + [[[n,member],[[v,b],[v,a]]]]]]]]` returns `[v,b]="a"`, where Lang may be e.g. `"en"`, etc., or `same` (meaning the same language as the parent predicate) and Debug may be `on`, `off` or `same` (meaning the same debug status as the parent predicate). 
diff --git a/Languages/docs6-2.txt b/Languages/docs6-2.txt new file mode 100644 index 0000000000000000000000000000000000000000..be7781d2a91bbb5aecd21a65ea01dae909498e0a --- /dev/null +++ b/Languages/docs6-2.txt @@ -0,0 +1,17 @@ +* (3) `[[n,call],[[lang,Lang],Debug,[Function,Arguments],Types,Modes,Functions]]` e.g. `[[n,call],[[lang,same],same,[[n,member2a],[[v,b],["a","b","c"]]], [[[n,member2a],[[t, number], [[t, number], [t, number], [t, number]]]]], + [[[n,member2a],[output,input]]], +[[[n,member2a],[[v,b],[v,a]],":-", + [ [[n,member],[[v,b],[v,a]]]] + ]]]]` returns `[v,b]="a"`. (See call(2) above for possible values of Lang and Debug.) + +* `[[n,trace]]` switches on trace (debug) mode. + +* `[[n,notrace]]` switches off trace (debug) mode. + +* `[[n,get_lang_word],[Variable1,Variable2]` e.g. `[[n,get_lang_word],["get_lang_word",[v,word]]]` returns `[v,word]="get_lang_word"`. + +* `[[n,member3],[Variable1,Variable2]` e.g. `[[n,member3],[[[v,a],"|",[[v,b]]],[[1,1],[1,2],[1,3]]]]` returns `[v,a]=1`, `[v,b]=1`, etc. where Variable1 may be in any pattern. + +* `[[n,equals4_on]]` switches on equals4 mode (in check arguments, substitute vars and update vars but not in the equals4 command itself), for e.g. list processing in arguments. + +* `[[n,equals4_off]]` switches off equals4 mode, for debugging. diff --git a/Languages/docs6.txt b/Languages/docs6.txt new file mode 100644 index 0000000000000000000000000000000000000000..546be6c59b1e949054c99ae4728d9ce19d0010e9 --- /dev/null +++ b/Languages/docs6.txt @@ -0,0 +1,15 @@ +* `[[n,equals4],[Variable1,Variable2]]` e.g. `[[n,equals4],[[[v,c],"|",[v,d]],[1,2,3]]]` returns `[v,c]=1` and `[v,d]=[2,3]`. You may use either order (i.e. a=1 or 1=a). Multiple items are allowed in the head of the list, there may be lists within lists, and lists with pipes must have the same number of items in the head in each list, or no pipe in the other list. + +* `[[n,findall],[Variable1,Variable2,Variable3]]` e.g. 
`[[[n,equals3],[[v,a],[1,2,3]]],[[n,findall],[[v,a1],[[n,member],[[v,a1],[v,a]]],[v,b]]]]` returns `[v,b]=[1,2,3]` + +* `[[n,string_from_file],[Variable1,Variable2]]` e.g. `[[n,string_from_file],[[v,a],"file.txt"]]` returns `[v,a]="Hello World"` + +* `[[n,maplist],[Variable1,Variable2,Variable3,Variable4]]` e.g. `[[n,maplist],[[n,+],[1,2,3],0,[v,b]]]` returns `[v,b]=6` + +* `[[n,string_length],[Variable1,Variable2]]` e.g. `[[n,string_length],["abc",[v,b]]]` returns `[v,b]=3` + +* `[[n,sort],[Variable1,Variable2]]` e.g. `[[n,sort],[[1,3,2],[v,b]]]` returns `[v,b]=[1,2,3]` + +* `[[n,intersection],[Variable1,Variable2]]` e.g. `[[n,intersection],[[1,3,2],[3,4,5],[v,b]]]` returns `[v,b]=[3]` + +* `[[n,read_string],[Variable1]]` e.g. `[[n,read_string],[[v,a]]]` asks for input and returns `[v,a]="hello"` diff --git a/Languages/docs7.txt b/Languages/docs7.txt new file mode 100644 index 0000000000000000000000000000000000000000..3ef8a83ba85d70a2b71b71d14792fbad59d55b0f --- /dev/null +++ b/Languages/docs7.txt @@ -0,0 +1,13 @@ +# Grammars + +* List Prolog supports grammars, for example: + +* Grammars may be recursive (see test 9 in lpiverify4.pl), i.e. they may repeat until triggering the base case: + +test(9,`[[n,grammar1],["aaa"]]`, +[ +`[[n,grammar1],[[v,s]],":-",[[[n,noun],[[v,s],""]]]]`, +`[[n,noun],"->",[""]]`, +`[[n,noun],"->",["a",[[n,noun]]]]` +], +`[[]]`). diff --git a/Languages/docs8.txt b/Languages/docs8.txt new file mode 100644 index 0000000000000000000000000000000000000000..656e0d6ab94171bbd00199603aa77a430930d4e3 --- /dev/null +++ b/Languages/docs8.txt @@ -0,0 +1,23 @@ +# Type Statements + +* Functions may have strong types, which check inputted values when calling a function and check all values when exiting a function. So far, any type statement with the name and arity of the function may match data for a call to that function. + +* The user may optionally enter types after the query. The following type statement tests number, string and predicate name types. 
+ +* Note: Mode statements, described in the next section, are required after Type Statements. + +* Types with lists (0-infinite repeats of type statements) are written inside {}. There may be nested curly brackets. + +* Type Statements may be recursive (see test 23 in lpiverify4_types.pl), i.e. they may repeat until triggering the base case: + +test_types_cases(23,`[[n,connect_cliques],[[["a",1],[1,2],[2,"b"]],[["a",3],[3,4],[4,"b"]],[v,output]]]`, +`[ +[[n,connect_cliques],[[t,list2],[t,list2],[t,list2]]], +[[t,item],[[t,number]]], +[[t,item],[[t,string]]], +[[t,list2],[{[t,item]}]], +[[t,list2],[{[t,list2]}]] +]`, +`[[[n,connect_cliques],[input,input,output]]]`, +`[[[n,connect_cliques],[[v,a],[v,b],[v,c]],":-",[[[n,append],[[v,a],[v,b],[v,c]]]]]]`, +`[[[[v,output],[["a",1],[1,2],[2,"b"],["a",3],[3,4],[4,"b"]]]]]`). diff --git a/Languages/docs9.txt b/Languages/docs9.txt new file mode 100644 index 0000000000000000000000000000000000000000..afb5e85469b16c63a7a5543349c923fbd244a83c --- /dev/null +++ b/Languages/docs9.txt @@ -0,0 +1,20 @@ +# Mode Statements + +* In the following, + +test_types_cases(2,`[[n,function],[[v,a],[v,b],[v,c]]]`, +`[[[n,function],[[t,number],[t,string],[t,predicatename]]]]`, +`[[[n,function],[output,output,output]]]`, +[ +`[[n,function],[[v,a],[v,b],[v,c]],":-",[[[n,=],[[v,a],1]],[[n,=],[[v,b],"a"]],[[n,=],[[v,c],[n,a]]]]]` +], +`[[[[v,a], 1],[[v,b], "a"],[[v,c], [n,a]]]]`). + +* `[[[n,function],[output,output,output]]]`, is the mode statement, which must follow the type statement (although these are optional). The Mode Statement specifies whether each of the variables takes input or gives output. + +# Functional List Prolog (FLP) + +* List Prolog has an optional functional mode. In FLP, function calls may be passed as variables and functions may have strong types. + +* Functional algorithms may be recursive (see test 7 in lpiverify4_types.pl), i.e. 
they may repeat until triggering the base case: + diff --git a/Languages/docs_full.txt b/Languages/docs_full.txt new file mode 100644 index 0000000000000000000000000000000000000000..f326f303a2f704c00bbc84d9abae9e884e2b5b6f --- /dev/null +++ b/Languages/docs_full.txt @@ -0,0 +1,283 @@ +# List Prolog Interpreter + +# Prerequisites + +* Please download and install SWI-Prolog for your machine at https://www.swi-prolog.org/build/. + +* You may need to install gawk using Homebrew. + +* Install Translation Shell on Mac, etc. +Change line in & culturaltranslationtool/ctt2.pl +trans_location("../../../gawk/trans").& +to correct location of trans. + +# 1. Install manually + +Download this repository, the Languages repository and Cultural Translation Tool into the same folder. + +# 2. Or Install from List Prolog Package Manager (LPPM) + +* Download the LPPM Repository: + +& +mkdir GitHub +cd GitHub/ +git clone https://github.com/luciangreen/List-Prolog-Package-Manager.git +cd List-Prolog-Package-Manager +swipl +['lppm']. +lppm_install("luciangreen","listprologinterpreter"). +halt +& + +# Running + +* Download the repository to your machine. In Shell: +cd listprologinterpreter +swipl +['listprolog']. + +* Running the tests +To run all tests, enter: +&test(off,NTotal,Score).& + +* To run a specific test: +&test1(off,TestNumber,Passed).& +where TestNumber is the test number from lpiverify4.pl. + +* The query &test1(off,7,Passed).& tests the reverse predicate: +test(7,`[[n,reverse],[[1,2,3],[],[v,l]]]`, +`[[[n,reverse],[[],[v,l],[v,l]]],[[n,reverse],[[v,l],[v,m],[v,n]],":-",[[[n,head],[[v,l],[v,h]]],[[n,tail],[[v,l],[v,t]]],[[n,wrap],[[v,h],[v,h1]]],[[n,append],[[v,h1],[v,m],[v,o]]],[[n,reverse],[[v,t],[v,o],[v,n]]]]]]`, +`[[[v,l], [3, 2, 1]]]`). + +# Documentation + +* The interpreter is called in the form: +&international_interpret([lang,en]&,debug,query,type_statements,mode_statements,functions,result). + +Where: +debug - on or off for trace, +query - the query e.g. 
[[n,reverse],[[1,2,3],[],[v,l]]]
+type statements - e.g. [[n,reverse],[{[t, number]}, {[t, number]}, {[t, number]}]]
+mode statements - e.g. [[n,reverse],[input,input,output]]
+functions - the algorithm, e.g. see reverse above
+result - the result, e.g. [[[v,l], [3, 2, 1]]] ([] indicates failure and [[]] indicates the empty list).
+
+* Statements may be negated in the form:
+
+`[[n,not],[statement]]`
+
+* Statements may be connected by the disjunctive (or):
+
+`[[n,or],[statements1,statements2]]`
+
+* If-then statements may either be in the form:
+
+`[[n,"->"],[statements1,statements2]]`
+
+This means "If Statements1 then Statements2".
+
+* Or, if-then statements may be in the form:
+
+`[[n,"->"],[statements1,statements2,statements2a]]`
+
+This means "If Statements1 then Statements2, else Statements2a".
+
+* `[[n,+],[1,2,[v,b]]]` gives `[v,b]=3`
+* `[[n,-],[1,2,[v,b]]]` gives `[v,b]=(-1)`
+* `[[n,*],[1,2,[v,b]]]` gives `[v,b]=2`
+* `[[n,/],[1,2,[v,b]]]` gives `[v,b]=0.5`
+
+* `[[n,cut]]` - behaves like SWI-Prolog's `!` (doesn't allow backtracking forwards or backwards past it)
+
+* `[[n,true]]` - behaves like true
+
+* `[[n,fail]]` - fails the current predicate
+
+* `[[n,atom],[variable]]`, e.g. `[[n,atom],[[v,a]]]` - returns true if `[v,a]='a'`, an atom
+
+* `[[n,string],[variable]]`, e.g. `[[n,string],[[v,a]]]` - returns true if `[v,a]="a"`, a string
+
+* `[[n,number],[variable]]`, e.g. `[[n,number],[14]]` - returns true where `14` is a number
+
+* `[[n,letters],[variable]]`, e.g. `[[n,letters],["abc"]]` - returns true where `"abc"` is letters
+
+* `[[n,is_operator],[variable1,variable2]]`, where `is_operator=is` or `is_operator="="`, e.g. `[[n,=],[[v,a],1]]` - returns true if `[v,a]=1`
+
+* `[[n,comparison_operator],[variable1,variable2]]`, where comparison_operator is ">", ">=", "<", "=<", "=" or "=\=", e.g. `[[n,>=],[1,2]]` - returns false, since (1>=2) is false.
+
+* `[[n,equals1],[variable1,[variable2,variable3]]]` e.g.
`[[n,equals1],[["a","b"],[[v,a],[v,b]]]]` returns `[v,a]="a"` and `[v,b]="b"`
+
+* `[[n,equals2],[variable1,[variable2,variable3]]]` e.g. `[[n,equals2],[[v,a],["a","b"]]]` returns `[v,a]=["a","b"]`
+
+* `[[n,equals3],[variable1,variable2]]` e.g. `[[n,equals3],[[v,a],[1,2,3]]]` returns `[v,a]=[1,2,3]`
+
+* `[[n,wrap],[variable1,variable2]]` e.g. `[[n,wrap],["a",[v,b]]]` returns `[v,b]=["a"]`
+
+* `[[n,unwrap],[variable1,variable2]]` e.g. `[[n,unwrap],[["a"],[v,b]]]` returns `[v,b]="a"`
+
+* `[[n,head],[variable1,variable2]]` e.g. `[[n,head],[["a","b","c"],[v,b]]]` returns `[v,b]="a"`
+
+* `[[n,tail],[variable1,variable2]]` e.g. `[[n,tail],[["a","b","c"],[v,b]]]` returns `[v,b]=["b","c"]`
+
+* `[[n,member],[Variable1,Variable2]]` e.g. `[[n,member],[[v,b],["a","b","c"]]]` returns `[v,b]="a"`, `[v,b]="b"` or `[v,b]="c"`, or e.g. `[[n,member],["c",["a","b","c"]]]` returns true. (Formerly member2.)
+
+* `[[n,delete],[variable1,variable2,variable3]]` e.g. `[[n,delete],[["a","b","b","c"],"b",[v,c]]]` returns `[v,c]=["a","c"]`
+
+* `[[n,append],[variable1,variable2,variable3]]` e.g. `[[n,append],[["a","b"],["c"],[v,d]]]` returns `[v,d]=["a","b","c"]`. Note: variable2 must be in list form, not e.g. `"c"`, or the command will fail; use wrap (above) to put an item in list form.
+
+* `[[n,stringconcat],[variable1,variable2,variable3]]` e.g. `[[n,stringconcat],["hello ","john",[v,c]]]` returns `[v,c]="hello john"`.
+
+* `[[n,stringtonumber],[variable1,variable2]]` e.g. `[[n,stringtonumber],["3",[v,b]]]` returns `[v,b]=3`
+
+* `[[n,random],[variable1]]` e.g. `[[n,random],[[v,r]]]` returns e.g. `[v,r]=0.19232608946956326`
+
+* `[[n,length],[variable1,variable2]]` e.g. `[[n,length],[[1,2,3],[v,l]]]` returns `[v,l]=3`
+
+* `[[n,ceiling],[variable1,variable2]]` e.g. `[[n,ceiling],[0.19232608946956326,[v,c]]]` returns `[v,c]=1`
+
+* `[[n,date],[year,month,day,hour,minute,seconds]]` e.g. `[[n,date],[[v,year],[v,month],[v,day],[v,hour],[v,minute],[v,second]]]` returns e.g.
`[v,year]=2019`, `[v,month]=11`, `[v,day]=6`, `[v,hour]=12`, `[v,minute]=15`, `[v,second]=20.23353409767151`.
+
+* `[[n,sqrt],[variable1,variable2]]` e.g. `[[n,sqrt],[4,[v,c]]]` returns `[v,c]=2`
+
+* `[[n,round],[variable1,variable2]]` e.g. `[[n,round],[1.5,[v,c]]]` returns `[v,c]=2`
+
+* `[[n,writeln],[Variable1]]` e.g. `[[n,writeln],[[v,a]]]` writes the value of `[v,a]`, e.g. `"hello"`
+
+* `[[n,atom_string],[Variable1,Variable2]]` e.g. `[[n,atom_string],[a,[v,b]]]` returns `[v,b]="a"` or `[[n,atom_string],[[v,b],"a"]]` returns `[v,b]=a`
+
+* (1) `[[n,call],[Function,Arguments]]` e.g. `[[n,call],[[n,member2a],[[v,b],["a","b","c"]]]]` returns `[v,b]="a"`
+
+* (2) `[[n,call],[[lang,Lang],Debug,[Function,Arguments],Functions]]` e.g. `[[n,call],[[lang,same],same,[[n,member2a],[[v,b],["a","b","c"]]],
+[[[n,member2a],[[v,b],[v,a]],":-",
+ [[[n,member],[[v,b],[v,a]]]]]]]]` returns `[v,b]="a"`, where Lang may be e.g. `"en"`, etc., or `same` (meaning the same language as the parent predicate) and Debug may be `on`, `off` or `same` (meaning the same debug status as the parent predicate).
+
+* (3) `[[n,call],[[lang,Lang],Debug,[Function,Arguments],Types,Modes,Functions]]` e.g. `[[n,call],[[lang,same],same,[[n,member2a],[[v,b], ["a","b","c"]]], [[[n,member2a],[[t, number], [[t, number], [t, number], [t, number]]]]],
+ [[[n,member2a],[output,input]]],
+[[[n,member2a],[[v,b],[v,a]],":-",
+ [ [[n,member],[[v,b],[v,a]]]]
+ ]]]]` returns `[v,b]="a"`. (See call(2) above for possible values of Lang and Debug.)
+
+* `[[n,trace]]` switches on trace (debug) mode.
+
+* `[[n,notrace]]` switches off trace (debug) mode.
+
+* `[[n,get_lang_word],[Variable1,Variable2]]` e.g. `[[n,get_lang_word],["get_lang_word",[v,word]]]` returns `[v,word]="get_lang_word"`.
+
+* `[[n,member3],[Variable1,Variable2]]` e.g. `[[n,member3],[[[v,a],"|",[[v,b]]],[[1,1],[1,2],[1,3]]]]` returns `[v,a]=1`, `[v,b]=1`, etc., where Variable1 may be in any pattern.
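The pattern matching behind member3, where `"|"` splits a list pattern into head items and a tail, can be sketched in Python. This is an illustrative analogue only — `is_var` and `match_pattern` are hypothetical names, not part of List Prolog — showing how the pattern `[[v,a],"|",[[v,b]]]` binds against one member such as `[1,1]`:

```python
def is_var(p):
    # A variable is written [v, Name] in List Prolog notation.
    return isinstance(p, list) and len(p) == 2 and p[0] == "v"

def match_pattern(pattern, value, bindings=None):
    """Return variable bindings if pattern matches value, else None."""
    if bindings is None:
        bindings = {}
    if is_var(pattern):
        bindings[pattern[1]] = value
        return bindings
    if isinstance(pattern, list) and "|" in pattern:
        # Head items before "|", tail pattern after it.
        i = pattern.index("|")
        head, tail = pattern[:i], pattern[i + 1]
        if not (isinstance(value, list) and len(value) >= len(head)):
            return None
        for p, v in zip(head, value):
            if match_pattern(p, v, bindings) is None:
                return None
        return match_pattern(tail, value[len(head):], bindings)
    if isinstance(pattern, list):
        if not (isinstance(value, list) and len(value) == len(pattern)):
            return None
        for p, v in zip(pattern, value):
            if match_pattern(p, v, bindings) is None:
                return None
        return bindings
    return bindings if pattern == value else None

# Pattern [[v,a],"|",[[v,b]]] against the member [1,1]:
print(match_pattern([["v", "a"], "|", [["v", "b"]]], [1, 1]))  # → {'a': 1, 'b': 1}
```

member3 would try such a match against each member of the second list in turn, succeeding once per matching member.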
+
+* `[[n,equals4_on]]` switches on equals4 mode (in check arguments, substitute vars and update vars, but not in the equals4 command itself), e.g. for list processing in arguments.
+
+* `[[n,equals4_off]]` switches off equals4 mode, for debugging.
+
+* `[[n,equals4],[Variable1,Variable2]]` e.g. `[[n,equals4],[[[v,c],"|",[v,d]],[1,2,3]]]` returns `[v,c]=1` and `[v,d]=[2,3]`. You may use either order (i.e. a=1 or 1=a). Multiple items are allowed in the head of the list, there may be lists within lists, and lists with pipes must have the same number of items in the head in each list, or no pipe in the other list.
+
+* `[[n,findall],[Variable1,Variable2,Variable3]]` e.g. `[[[n,equals3],[[v,a],[1,2,3]]],[[n,findall],[[v,a1],[[n,member],[[v,a1],[v,a]]],[v,b]]]]` returns `[v,b]=[1,2,3]`
+
+* `[[n,string_from_file],[Variable1,Variable2]]` e.g. `[[n,string_from_file],[[v,a],"file.txt"]]` returns `[v,a]="Hello World"`
+
+* `[[n,maplist],[Variable1,Variable2,Variable3,Variable4]]` e.g. `[[n,maplist],[[n,+],[1,2,3],0,[v,b]]]` returns `[v,b]=6`
+
+* `[[n,string_length],[Variable1,Variable2]]` e.g. `[[n,string_length],["abc",[v,b]]]` returns `[v,b]=3`
+
+* `[[n,sort],[Variable1,Variable2]]` e.g. `[[n,sort],[[1,3,2],[v,b]]]` returns `[v,b]=[1,2,3]`
+
+* `[[n,intersection],[Variable1,Variable2,Variable3]]` e.g. `[[n,intersection],[[1,3,2],[3,4,5],[v,b]]]` returns `[v,b]=[3]`
+
+* `[[n,read_string],[Variable1]]` e.g. `[[n,read_string],[[v,a]]]` asks for input and returns `[v,a]="hello"`
+
+# Grammars
+
+* List Prolog supports grammars.
+
+* Grammars may be recursive (see test 9 in lpiverify4.pl), i.e. they may repeat until triggering the base case:
+
+test(9,`[[n,grammar1],["aaa"]]`,
+[
+`[[n,grammar1],[[v,s]],":-",[[[n,noun],[[v,s],""]]]]`,
+`[[n,noun],"->",[""]]`,
+`[[n,noun],"->",["a",[[n,noun]]]]`
+],
+`[[]]`).
+
+# Type Statements
+
+* Functions may have strong types, which check input values when a function is called and check all values when it exits.
So far, any type statement with the name and arity of the function may match data for a call to that function. + +* The user may optionally enter types after the query. The following type statement tests number, string and predicate name types. + +* Note: Mode statements, described in the next section, are required after Type Statements. + +* Types with lists (0-infinite repeats of type statements) are written inside {}. There may be nested curly brackets. + +* Type Statements may be recursive (see test 23 in lpiverify4_types.pl), i.e. they may repeat until triggering the base case: + +test_types_cases(23,`[[n,connect_cliques],[[["a",1],[1,2],[2,"b"]],[["a",3],[3,4],[4,"b"]],[v,output]]]`, +`[ +[[n,connect_cliques],[[t,list2],[t,list2],[t,list2]]], +[[t,item],[[t,number]]], +[[t,item],[[t,string]]], +[[t,list2],[{[t,item]}]], +[[t,list2],[{[t,list2]}]] +]`, +`[[[n,connect_cliques],[input,input,output]]]`, +`[[[n,connect_cliques],[[v,a],[v,b],[v,c]],":-",[[[n,append],[[v,a],[v,b],[v,c]]]]]]`, +`[[[[v,output],[["a",1],[1,2],[2,"b"],["a",3],[3,4],[4,"b"]]]]]`). + +# Mode Statements + +* In the following, + +test_types_cases(2,`[[n,function],[[v,a],[v,b],[v,c]]]`, +`[[[n,function],[[t,number],[t,string],[t,predicatename]]]]`, +`[[[n,function],[output,output,output]]]`, +[ +`[[n,function],[[v,a],[v,b],[v,c]],":-",[[[n,=],[[v,a],1]],[[n,=],[[v,b],"a"]],[[n,=],[[v,c],[n,a]]]]]` +], +`[[[[v,a], 1],[[v,b], "a"],[[v,c], [n,a]]]]`). + +* `[[[n,function],[output,output,output]]]`, is the mode statement, which must follow the type statement (although these are optional). The Mode Statement specifies whether each of the variables takes input or gives output. + +# Functional List Prolog (FLP) + +* List Prolog has an optional functional mode. In FLP, function calls may be passed as variables and functions may have strong types. + +* Functional algorithms may be recursive (see test 7 in lpiverify4_types.pl), i.e. 
they may repeat until triggering the base case: + + +test_types_cases(7, +`[[n,map],[[[n,add],[[[n,add],[[[n,add],[1]]]]]],0,[v,d]]]`, + +`[ +[[n,map],[[t,map1],[t,number],[t,number]]], +[[t,map1],[[t,number]]], +[[t,map1],[[[t,predicatename],[[t,map1]]]]], +[[n,add],[[t,number],[t,number],[t,number]]], +[[n,getitemn],[[t,number],{[t,any]},[t,any]]] +]`, + +`[ +[[n,map],[input,input,output]], + +[[n,add],[input,input,output]], + +[[n,getitemn],[input,input,output]] +]`, +`[ +[[n,map],[[v,f1],[v,n1],[v,n]],":-",[[[n,number],[[v,f1]]],[[n,add],[[v,n1],[v,f1],[v,n]]]]], + +[[n,map],[[v,f1],[v,l],[v,n]],":-",[[[n,equals1],[[v,f1],[[v,f11],[v,f12]]]],[[n,=],[[v,f11],[n,add]]],[[n,getitemn],[1,[v,f12],[v,bb]]],[[v,f11],[[v,l],1,[v,l2]]],[[n,map],[[v,bb],[v,l2],[v,n]]]]], + +[[n,add],[[v,a],[v,b],[v,c]],":-",[[n,+],[[v,a],[v,b],[v,c]]]], + +[[n,getitemn],[1,[v,b],[v,c]],":-",[[n,head],[[v,b],[v,c]]]], + +[[n,getitemn],[[v,a],[v,b],[v,c]],":-",[[[n,not],[[[n,=],[[v,a],1]]]],[[n,tail],[[v,b],[v,t]]],[[n,-],[[v,a],1,[v,d]]],[[n,getitemn],[[v,d],[v,t],[v,c]]]]] +]`, + +`[[[[v,d], 4]]]` +). +* For other examples, please see lpiverify4.pl, lpiverify4_types.pl (for examples with types, including the list type), lpiverify4_open.pl (for examples with open-ended results) and lpiverify4_open_types.pl (for examples with open-ended results with types). + +* See https://github.com/luciangreen/listprologinterpreter/blob/master/LPI_docs.md for complete List Prolog Documentation. + diff --git a/Languages/lang_db_generator-nonabort.pl b/Languages/lang_db_generator-nonabort.pl new file mode 100644 index 0000000000000000000000000000000000000000..bcc0ceef10174673b7763b513f9256c936d24388 --- /dev/null +++ b/Languages/lang_db_generator-nonabort.pl @@ -0,0 +1,303 @@ +%% lang_db_generator(Target_language) +%% New repository, with dbs in each rep + +%:- include('../culturaltranslationtool/ctt2.pl'). % leave these commented out +%:- include('../culturaltranslationtool/edit.pl'). 
% " +%:- include('../listprologinterpreter/la_strings'). % leave off + +:- dynamic lang_db/1. + +%% List_of_words may be list of words to translate and add or an original language code + +% lang_db_generator([["a","a"],["b","b exp"]],["fr"]). - create the db +% lang_db_generator("fr",["de"]). - add de with same entries as fr +% lang_db_generator("fr",["ab"]). - add ab " +% Alternatively, lang_db_generator("fr",["de","ab"]). instead of last two lines + +load_lang_db1(Entry3) :- + + % if l.o.w. is orig lang, read from it instead + + directory_files("../listprologinterpreter/languages/",F), + delete_invisibles_etc(F,G), + findall(Entry2,(member(Filex1,G), + string_concat("../listprologinterpreter/languages/",Filex1,Filex), + phrase_from_file_s(string(String00a), Filex), + string_codes(String02b,String00a), + atom_to_term(String02b,String02a,[]), + string_concat(Target_language3,".txt",Filex), + string_concat("../listprologinterpreter/languages/",Target_language2,Target_language3), + +findall([En_word,En_word2,Target_language2,T_word],(member(Entry,String02a),((Entry=[En_word,En_word2,Target_language2,T_word],string(En_word),string(En_word2),string(T_word))->true;(concat_list(["Error: File: ",Filex,", Entry: ",Entry," not in format [\"\",\"\",\"\",\"\"]."],_Notification1)%,writeln(Notification1) +,fail))),Entry2) +),Entry2a), + maplist(append,[Entry2a],[Entry3]). 
+ +lang_db_generator(List_of_words,Target_language1) :- + + load_lang_db1(Entry3), + + %%(not(member([_,Target_language1,_],Entry3))-> + + (string(List_of_words)-> + (findall([Word1,Word2],(member([Word1,Word2,List_of_words,_],Entry3)),List_of_words2),List_of_words3=[]); + + (findall([Word1,Word2,List_of_words,Item_a],(member([Word1,Word2,List_of_words,Item_a],Entry3)),List_of_words3), %%** test + findall(_,(member([Word,Word2],List_of_words),(string(Word),string(Word2))->true;(concat_list(["Error: Word1: ",Word," or Word2: ",Word2," in list of words not in format [\"\",\"\"]."],_Notification1)%,writeln(Notification1) + ,fail)),_)),List_of_words=List_of_words2), + +%% list of langs to put words into + + findall(_,(member(Target_language10,Target_language1), + (string(Target_language10)->true;(concat_list(["Error: Target_language1: ",Target_language1," not in format \"\"."],_Notification11)%,writeln(Notification1) + ,fail)), + +findall([Input,Input1,Target_language10,Output3],(member([Input,Input1],List_of_words2),translate4(Input1,"en",Target_language10,Output3)),Output4), + + %% save file + %% ** add to prev file + %% ** keep metaphorical translations (x bt xx with simplified ctt (no bt-trans dict) and customised output) + %% numbers after command names + %% - check words on google translate, ctt speeds bt + + append(List_of_words3,Output4,Output4a), %% ** test + + concat_list(["../listprologinterpreter/languages/",Target_language10,".txt"],File1), + term_to_atom(Output4a,File_contents),open_s(File1,write,Stream),write(Stream,File_contents),close(Stream)),_). + +delete_invisibles_etc(F,G) :- + findall(J,(member(H,F),atom_string(H,J),not(J="."),not(J=".."),not(string_concat(".",_,J))),G). + +% test with t=w+1 + +%translate1(Input,From_l,Target_language1,Output3) :- +%concat_list([Input," ",From_l," ",Target_language1],Output3). + +translate1("v","en","fr","v"). +translate1("true","en","fr","vrai"). +translate1("v","en","de","v"). 
+translate1("true","en","de","wahr"). +translate1(A,_,_,A). + +load_lang_db :- +%% error if lang not found + load_lang_db1(Entry3), + (not(lang(Lang))->(Lang="en");lang(Lang)), + (Lang="en"->true; + (member([_,_,Lang,_],Entry3)->true; + (concat_list(["Error: Language: ",Lang," not in listprologinterpreter/languages folder."],_Notification1)%,writeln(Notification1) + ,fail))), + + retractall(lang_db(_)), + assertz(lang_db(Entry3)),!. + +get_lang_word3(Input,From_lang,To_lang,Output) :- + From_lang="en", + To_lang="en", + atom_string(Output,Input),!. +get_lang_word3(Input,From_lang,To_lang,Output) :- + not(From_lang="en"), + To_lang="en", + + %(From_lang="en2"->Epsilon="";Epsilon=" "), + Epsilon="", + + %(string(Input)->true;(number(Input))->true;fail), + %lang_db(Lang_db), + (not(lang_db(Lang_db))->(load_lang_db,lang_db(Lang_db));lang_db(Lang_db)), + +split_on_number(Input,Input1,Input10),((member([Output2,_Input101,From_lang,Input1],Lang_db), + %notrace, + (Input10="" -> Output3=Output2; + concat_list([Output2,Epsilon,Input10],Output3)))->true; + (translate4(Input1,From_lang,To_lang,Output2), + (Input10="" -> Output3=Output2; + concat_list([Output2,Epsilon,Input10],Output3))) + ), + replace(Output3," ","_",Output4), + atom_string(Output,Output4),!. + +get_lang_word3(Input0,From_lang,To_lang,Output) :- + From_lang="en", + not(To_lang="en"), + + %(To_lang="en2"->Epsilon=" ";Epsilon=" "), + Epsilon=" ", + + %lang_db(Lang_db), + (not(lang_db(Lang_db))->(load_lang_db,lang_db(Lang_db));lang_db(Lang_db)), + replace(Input0,"_"," ",Input), + %(atom(Input)->true;(number(Input))->true;fail), + split_on_number(Input,Input1,Input10),((member([Input1,_Input102,To_lang,Output2],Lang_db), + %notrace, + (Input10="" -> Output=Output2; + concat_list([Output2,Epsilon,Input10],Output)))->true; + (translate4(Input1,From_lang,To_lang,Output2), + (Input10="" -> Output=Output2; + concat_list([Output2,Epsilon,Input10],Output))) + ), + %atom_string(Input,Output),!. + !. 
+ +get_lang_word3(Input,From_lang,To_lang,Output) :- +%% error if word not found + not(From_lang="en"), + not(To_lang="en"), + + %lang(To_lang), + %lang_db(Lang_db), + (not(lang_db(Lang_db))->(load_lang_db,lang_db(Lang_db));lang_db(Lang_db)), + %((From_lang="en",To_lang="en")->%Input=Output1, + %atom_string(Output,Input); + %((%(%((Input="member2"->trace;true), + split_on_number(Input,Input1,Input10),((member([Input2,_Input101,From_lang,Input1],Lang_db),member([Input2,_Input102,To_lang,Output2],Lang_db), + %notrace, + (Input10="" -> Output=Output2; + concat_list([Output2," ",Input10],Output)))->true; + (translate4(Input1,From_lang,To_lang,Output2), + (Input10="" -> Output=Output2; + concat_list([Output2," ",Input10],Output))) + ), %-> true; + + %concat_list(["Error: Word: ",Input," not in Language: ",Lang," in lang_db."],_Notification1),%writeln(Notification1), + %fail + %)))), + %Output=Output1, + !. + %atom_string(Output,Output1),!. + +%split_on_number(Input,Input1,Input10) :- +%findall([Input1,Input10],split_on_number1(Input,Input1,Input10),Output1), +%reverse(Output1,[[Input1,Input10]|_]). + +/** + +[debug] ?- split_on_number("en 2",A,B). +A = "en", +B = "2". + +[debug] ?- split_on_number("en22",A,B). +A = "en", +B = "22". + +[debug] ?- split_on_number("en a 22",A,B). +A = "en a", +B = "22". + +[debug] ?- split_on_number("en a",A,B). +A = "en a", +B = "". 
+ +**/ + +split_on_number(Input,Input1,Input10) :- + string_concat(A,B,Input), + string_concat(C,D,B), + string_length(C,1), + ((C=" ", + string_concat(E,F,D), + string_concat(J,K,E), + string_length(K,1), + K=" ", + + string_concat(G,_H,F), + %trace, + string_length(G,1), + (number_string(_,G)),%->true;not(G=" ")), + %(( + %C=" ", + %)->( + concat_list([A,C,J],Input1), + %Input1=A, + Input10=F))%->true; + %(number_string(_,C), + %Input1=A,Input10=B)) + ,!.%); +split_on_number(Input,Input1,Input10) :- + string_concat(A,B,Input), + string_concat(C,D,B), + string_length(C,1), + ((C=" ", + string_concat(_E,F,D), + + string_concat(G,_H,F), + %trace, + string_length(G,1), + + (number_string(_,G)),%->true;not(G=" ")), + %(( + %C=" ", + %)->( + %concat_list([A,C,J],Input1), + Input1=A, + Input10=F)->true; + (number_string(_,C), + Input1=A,Input10=B)) + ,!.%); + +split_on_number(Input,Input1,Input10) :- + Input1=Input,Input10="". + +replace(A,Find,Replace,F) :- + split_string(A,Find,Find,B),findall([C,Replace],(member(C,B)),D),maplist(append,[D],[E]),concat_list(E,F1),string_concat(F,G,F1),string_length(G,1),!. + +% stringconcat-> in both algs above +% concatenate strings v + +% membre 2->member2 +get_en_lang_word(Input,Output) :- +%% error if word not found + lang(Lang), + lang_db(Lang_db), + (Lang="en"->%Input=Output1, + atom_string(Input,Output); + ((%(%((Input="member2"->trace;true), + %split_on_number(Input,Input1,Input10), + member([Output1,_Input101,Lang,Input],Lang_db), + %atom_string( + Output=Output1)->true; + %notrace, + %(%Input10="" -> Output=Output2; + %concat_list([Output2," ",Input10],Output))) -> true; + (concat_list(["Error: Word: ",Input," not in Language: ",Lang," in lang_db."],_Notification1)%,writeln(Notification1) + ,fail))), + %Output=Output1, + !. + +get_lang_word(I,O) :- + (not(lang(OL))->(OL="en");lang(OL)), + get_lang_word3(I,"en",OL,O),!. 
+ +reserved_word(Word) :- + lang(Lang) , + lang_db(Lang_db), + (Lang="en"->(member([Word1,_,_,_],Lang_db), + atom_string(Word,Word1)); + member([_,_,_,Word],Lang_db)). + +/* +reserved_word2(Word) :- + lang(Lang) , + lang_db(Lang_db), + split_on_number(Word,Output21,_), + atom_string(Output21,Output2), + member([Output2,_Input101,_Lang,_Input1],Lang_db), + not(Output21="query_box"), + !. +*/ + +% reserved word - membre2 -> member2 (in list) +% (translate to en, check list) + +reserved_word2(Word) :- + lang(Lang) , + %lang_db(Lang_db), + get_lang_word3(Word,Lang,"en",Word2), + atom_string(Word2,Word3), + reserved_words2(Reserved_words), + (member(Word3,Reserved_words)->true; + (fail% + %writeln1([not_reserved_word,Word3]),fail + )),!. + +% not query_box \ No newline at end of file diff --git a/Languages/lang_db_generator.pl b/Languages/lang_db_generator.pl new file mode 100644 index 0000000000000000000000000000000000000000..691d477b63439cf44a88afb28821e22075b8d3f7 --- /dev/null +++ b/Languages/lang_db_generator.pl @@ -0,0 +1,180 @@ +%% lang_db_generator(Target_language) +%% New repository, with dbs in each rep + +:- include('../culturaltranslationtool/ctt2.pl'). +%:- include('../culturaltranslationtool/edit.pl'). +:- include('../listprologinterpreter/la_strings'). +:- include('translate4.pl'). + +:- dynamic lang_db/1. + +%% List_of_words may be list of words to translate and add or an original language code + +% lang_db_generator([["a","a"],["b","b exp"]],["fr"]). - create the db +% lang_db_generator("fr",["de"]). - add de with same entries as fr +% lang_db_generator("fr",["ab"]). - add ab " +% Alternatively, lang_db_generator("fr",["de","ab"]). instead of last two lines + +load_lang_db1(Entry3) :- + + % if l.o.w. 
is orig lang, read from it instead + + directory_files("../listprologinterpreter/languages/",F), + delete_invisibles_etc(F,G), + findall(Entry2,(member(Filex1,G), + string_concat("../listprologinterpreter/languages/",Filex1,Filex), + phrase_from_file_s(string(String00a), Filex), + string_codes(String02b,String00a), + atom_to_term(String02b,String02a,[]), + string_concat(Target_language3,".txt",Filex), + string_concat("../listprologinterpreter/languages/",Target_language2,Target_language3), + +findall([En_word,En_word2,Target_language2,T_word],(member(Entry,String02a),((Entry=[En_word,En_word2,Target_language2,T_word],string(En_word),string(En_word2),string(T_word))->true;(concat_list(["Error: File: ",Filex,", Entry: ",Entry," not in format [\"\",\"\",\"\",\"\"]."],Notification1),writeln(Notification1),abort))),Entry2) +),Entry2a), + maplist(append,[Entry2a],[Entry3]). + +lang_db_generator(List_of_words,Target_language1) :- + + load_lang_db1(Entry3), + + %%(not(member([_,Target_language1,_],Entry3))-> + + (string(List_of_words)-> + (findall([Word1,Word2],(member([Word1,Word2,List_of_words,_],Entry3)),List_of_words2),List_of_words3=[]); + + (findall([Word1,Word2,List_of_words,Item_a],(member([Word1,Word2,List_of_words,Item_a],Entry3)),List_of_words3), %%** test + findall(_,(member([Word,Word2],List_of_words),(string(Word),string(Word2))->true;(concat_list(["Error: Word1: ",Word," or Word2: ",Word2," in list of words not in format [\"\",\"\"]."],Notification1),writeln(Notification1),abort)),_)),List_of_words=List_of_words2), + +%% list of langs to put words into + + findall(_,(member(Target_language10,Target_language1), + (string(Target_language10)->true;(concat_list(["Error: Target_language1: ",Target_language1," not in format \"\"."],Notification1),writeln(Notification1),abort)), + +%split_on_number(Target_language10,Target_language101,_), +%trace, +findall([Input,Input1,Target_language10,Output3],( +member([Input,Input1],List_of_words2), 
+(Target_language10="en2"->Output3=Input1;translate(Input1,"en",Target_language10,Output3))),Output4), + + %% save file + %% ** add to prev file + %% ** keep metaphorical translations (x bt xx with simplified ctt (no bt-trans dict) and customised output) + %% numbers after command names + %% - check words on google translate, ctt speeds bt + + append(List_of_words3,Output4,Output4a), %% ** test + + concat_list(["../listprologinterpreter/languages/",Target_language10,".txt"],File1), + term_to_atom(Output4a,File_contents),open_s(File1,write,Stream),write(Stream,File_contents),close(Stream)),_). + +delete_invisibles_etc(F,G) :- + findall(J,(member(H,F),atom_string(H,J),not(J="."),not(J=".."),not(string_concat(".",_,J))),G). + +% test with t=w+1 + +translate1(Input,From_l,Target_language1,Output3) :- +concat_list([Input," ",From_l," ",Target_language1],Output3). + +load_lang_db :- +%% error if lang not found + load_lang_db1(Entry3), + lang(Lang), + (Lang="en"->true; + (member([_,_,Lang,_],Entry3)->true; + (concat_list(["Error: Language: ",Lang," not in listprologinterpreter/languages folder."],Notification1),writeln(Notification1),abort))), + + retractall(lang_db(_)), + assertz(lang_db(Entry3)),!. + +get_lang_word(Input,Output) :- +%% error if word not found + (not(lang(Lang))->(Lang="en");lang(Lang)), + %lang_db(Lang_db), + (not(lang_db(Lang_db))->(load_lang_db,lang_db(Lang_db));lang_db(Lang_db)), + (Lang="en"->%Input=Output1, + atom_string(Output,Input); + ((%(%((Input="member2"->trace;true), + split_on_number(Input,Input1,Input10),member([Input1,_Input101,Lang,Output2],Lang_db), + %notrace, + (Input10="" -> Output=Output2; + concat_list([Output2," ",Input10],Output))) -> true; + (concat_list(["Error: Word: ",Input," not in Language: ",Lang," in lang_db."],Notification1),writeln(Notification1),abort))), + %Output=Output1, + !. + %atom_string(Output,Output1),!. 
+ +%split_on_number(Input,Input1,Input10) :- +%findall([Input1,Input10],split_on_number1(Input,Input1,Input10),Output1), +%reverse(Output1,[[Input1,Input10]|_]). +split_on_number(Input,Input1,Input10) :- + string_concat(A,B,Input), + string_concat(C,D,B), + string_length(C,1), + ((C=" ", + string_concat(E,F,D), + string_concat(J,K,E), + string_length(K,1), + K=" ", + + string_concat(G,_H,F), + %trace, + string_length(G,1), + (number_string(_,G)),%->true;not(G=" ")), + %(( + %C=" ", + %)->( + concat_list([A,C,J],Input1), + %Input1=A, + Input10=F))%->true; + %(number_string(_,C), + %Input1=A,Input10=B)) + ,!.%); +split_on_number(Input,Input1,Input10) :- + string_concat(A,B,Input), + string_concat(C,D,B), + string_length(C,1), + ((C=" ", + string_concat(_E,F,D), + + string_concat(G,_H,F), + %trace, + string_length(G,1), + + (number_string(_,G)),%->true;not(G=" ")), + %(( + %C=" ", + %)->( + %concat_list([A,C,J],Input1), + Input1=A, + Input10=F)->true; + (number_string(_,C), + Input1=A,Input10=B)) + ,!.%); + +split_on_number(Input,Input1,Input10) :- + Input1=Input,Input10="". + +replace(A,Find,Replace,F) :- + split_string(A,Find,Find,B),findall([C,Replace],(member(C,B)),D),maplist(append,[D],[E]),concat_list(E,F),!. + +% stringconcat-> in both algs above +% concatenate strings v +/** +% membre 2->member2 +get_en_lang_word(Input,Output) :- +%% error if word not found + lang(Lang), + lang_db(Lang_db), + (Lang="en"->%Input=Output1, + atom_string(Input,Output); + ((%(%((Input="member2"->trace;true), + %split_on_number(Input,Input1,Input10), + member([Output,_Input101,Lang,Input],Lang_db))->true; + %notrace, + %(%Input10="" -> Output=Output2; + %concat_list([Output2," ",Input10],Output))) -> true; + (concat_list(["Error: Word: ",Input," not in Language: ",Lang," in lang_db."],Notification1),writeln(Notification1),abort))), + %Output=Output1, + !. 
+ **/ + \ No newline at end of file diff --git a/Languages/make_docs.pl b/Languages/make_docs.pl new file mode 100644 index 0000000000000000000000000000000000000000..f14ddf2b480a3c54d55c3e6eb82ac48355ff1292 --- /dev/null +++ b/Languages/make_docs.pl @@ -0,0 +1,131 @@ +% make_docs.pl + +:-include('../listprologinterpreter/la_strings.pl'). + % leave off +:-include('../culturaltranslationtool/ctt2.pl'). % turn back on +:-include('data.pl'). +:-include('lang_db_generator-nonabort.pl'). % +:- include('translate4.pl'). +:- include('../Prolog-to-List-Prolog/pretty_print.pl'). + +% translates list prolog code between `` +% in the following, translates v, doesn't translate x +% v< x > v < x > +% v & x & + +%translate(A,_,_,_) :- +%writeln([here,A]),abort. + +string(String) --> list(String). + +list([]) --> []. +list([L|Ls]) --> [L], list(Ls). + +make_docs(File_a) :- + writeln("Enter target language code for LPI docs:"), + read_string(user_input, "\n", "\r", _End2, To_lang), + retractall(lang(_)), + assertz(lang(To_lang)), + load_lang_db, + phrase_from_file_s(string(Docs), File_a), + string_codes(Docs2,Docs), + split_string1(Docs2,["`"],Input1), + process1(Input1,To_lang,"",String), + concat_list([To_lang,"-",File_a%,".txt" + ],File), + (open_s(File,write,Stream), + write(Stream,String), + close(Stream)), + writeln(String),!. 
+ +process1(Input1,To_lang,String1,String2) :- + Input1=[A,B|Rest], + split_string1(A,["<",">"],Input2), + process2(Input2,To_lang,String1,String3), + catch(term_to_atom(B1,B), _, (concat_list(["Error: Couldn't translate list prolog: ",B],N),writeln(N),abort)), + retractall(from_lang2(_)), + assertz(from_lang2("en")), + retractall(to_lang2(_)), + assertz(to_lang2(To_lang)), + (data([B1],[],[B2])->true;(concat_list(["Error: Couldn't translate list prolog: ",B],N),writeln(N),abort)), + %trace, + + %trace, + %writeln1(B2), + + catch(%term_to_atom + pretty_print(B2,B3), _, (concat_list(["Error: Couldn't translate list prolog: ",B3],N_A),writeln(N_A),abort)), % change back to not with catch + + concat_list([String3,"`",B3,"`"],String4), +writeln("****"), +writeln(String4), + process1(Rest,To_lang,String4,String2),!. +process1([],_To_lang,String,String) :- !. +process1([A|As],_To_lang,String1,String2) :- +%trace, + %term_to_atom(A,A1), + %term_to_atom(As,As1), + (As=[]->(concat_list([String1,A],String2) + %,notrace + ); + (maplist(append,[[[String1],[A],As]],[C]), + concat_list(C,String2) + %,notrace + )),!. + +% v< x > v < x > +process2(Input2,To_lang,String1,String2) :- + Input2=[A,B,C,D|Rest], + split_string1(A,["&"],A11), % docs.txt needs a character between `,<,>,& + process3(A11,To_lang,"",A1), + split_string1(C,["&"],C11), + process3(C11,To_lang,"",C1), + concat_list([String1,A1,"<",B,">",C1,"<",D,">"],String3), +writeln("****"), +writeln(String3), + process2(Rest,To_lang,String3,String2),!. +process2([],_To_lang,String,String) :- !. +process2([A|As],To_lang,String1,String2) :- + split_string1(A,["&"],A11), % docs.txt needs a character between `,<,>,& + process3(A11,To_lang,"",A1), + + %term_to_atom(A,A1), + %term_to_atom(As,As1), + (As=[]->concat_list([String1,A1],String2); + (maplist(append,[[[String1],[A1],As]],[C]), + concat_list(C,String2))),!. 
+ +% v & x & +process3(Input2,To_lang,String1,String2) :- + Input2=[A,B|Rest], + translate2(A,"en",To_lang,A1), % 1a,2 + %translate1a(C,"en",To_lang,C1), + %C=C1, + concat_list([String1,A1,B],String3), +writeln("****"), +writeln(String3), + process3(Rest,To_lang,String3,String2),!. +process3([],_To_lang,String,String) :- !. +process3([A|As],_To_lang,String1,String2) :- + %term_to_atom(A,A1), + %term_to_atom(As,As1), + (As=[]->concat_list([String1,A],String2); + (maplist(append,[[[String1],[A],As]],[C]), + concat_list(C,String2))),!. + +%term_to_atom2(A,B) :- +% term_to_atom(A,C), + +%translate1a(A,_,_,A). + +translate1a(A,_,_,A1):- + string_concat(A," tr",A1). + + +translate2(A,From_lang,To_lang,B) :- + (((number(A)->true;(data_symbol(A)))->(A=B));( + get_lang_word3(A,From_lang,To_lang,B))). + + + + diff --git a/Languages/translate4.pl b/Languages/translate4.pl new file mode 100644 index 0000000000000000000000000000000000000000..d218c02ccd38c8558ec9b439db1f760452b2f97c --- /dev/null +++ b/Languages/translate4.pl @@ -0,0 +1,58 @@ +% translate4.pl + +%translate4(Input,Lang1,Lang2,Input) :- +% split_on_number(Lang1,Lang3,_), +% split_on_number(Lang2,Lang3,_),!. +translate4(Input,Lang1,Lang2,Input) :- + Lang1="en2", + split_on_number(Lang1,Lang3,_), + split_on_number(Lang2,Lang3,_),!. +translate4(Input,"en","en2",Input) :- !. +translate4(Input,Lang,Lang,Input) :- !. +translate4(Input,From_lang,To_lang,Output) :- + split_on_number(From_lang,Lang3,_), + split_on_number(To_lang,Lang4,_), + %not(Lang3=Lang4), + translate(Input,Lang3,Lang4,Output),!. + +/** + +need to run test1(off,1,R). to load db before following. + +get_lang_word3(member1,"en","en2",A). +A="member 1". + +get_lang_word3("yes_1","en","en2",A). +A = "yes 1". + +get_lang_word3("yes1","en","en2",A). +A = "yes 1". + + +get_lang_word3("member 1","en2","en",A). +A = member1. + +get_lang_word3("yes 1","en2","en",A). +A = yes1. + +get_lang_word3("yes 1","en2","en",A). +A = yes1. 
+ + +get_lang_word3("Yes 1","en2","fr",A). +A = "Oui 1". + +get_lang_word3("Oui 1","fr","en2",A). +A = "Yes 1". + + +get_lang_word3("membre 1","fr","en2",A). +A = "member 1". + +get_lang_word3("membre 1","fr","en",A). +A = member1. + +get_lang_word3("Yes 1","en2","en",A). +A = 'Yes1'. + +**/ \ No newline at end of file diff --git a/List-Prolog-Package-Manager/.DS_Store b/List-Prolog-Package-Manager/.DS_Store new file mode 100644 index 0000000000000000000000000000000000000000..5008ddfcf53c02e82d7eee2e57c38e5672ef89f6 Binary files /dev/null and b/List-Prolog-Package-Manager/.DS_Store differ diff --git a/List-Prolog-Package-Manager/LICENSE b/List-Prolog-Package-Manager/LICENSE new file mode 100644 index 0000000000000000000000000000000000000000..3a27c90f1c2986ac0fb73b4ae956c6205343dfdd --- /dev/null +++ b/List-Prolog-Package-Manager/LICENSE @@ -0,0 +1,29 @@ +BSD 3-Clause License + +Copyright (c) 2020, Lucian Green +All rights reserved. + +Redistribution and use in source and binary forms, with or without +modification, are permitted provided that the following conditions are met: + +1. Redistributions of source code must retain the above copyright notice, this + list of conditions and the following disclaimer. + +2. Redistributions in binary form must reproduce the above copyright notice, + this list of conditions and the following disclaimer in the documentation + and/or other materials provided with the distribution. + +3. Neither the name of the copyright holder nor the names of its + contributors may be used to endorse or promote products derived from + this software without specific prior written permission. + +THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" +AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE +IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE +DISCLAIMED. 
IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE +FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL +DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR +SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER +CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, +OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE +OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. diff --git a/List-Prolog-Package-Manager/README.md b/List-Prolog-Package-Manager/README.md new file mode 100644 index 0000000000000000000000000000000000000000..987e290cd3c4814ffd9a03bef3109a5a341abc6a --- /dev/null +++ b/List-Prolog-Package-Manager/README.md @@ -0,0 +1,66 @@ +Lucian Academy as Boxes in an L-Shape
+
+# List-Prolog-Package-Manager (LPPM)
+
+Registry manager and installer for GitHub repositories (of any language) with dependencies
+
+* lppm.pl - starts a local or remote Prolog server that hosts the registry. A repository's entry in the registry can be uploaded, or edited by re-uploading it, and GitHub repositories (a.k.a. packages) in the registry can be installed when LPPM is in the same folder as the registry on the server.
+* lppm_registry.txt - contains users, repositories, descriptions and dependencies (lists of users and repositories).
+* LPPM installs transitive dependencies (repositories needed by repositories, and so on) without their needing to be listed in the top-level registry entry.
+
+# Getting Started
+
+Please read the following instructions on installing the project on your computer so that you can install repositories with dependencies.
+
+* A list of my SWI-Prolog repositories is in the registry.
+
+# Prerequisites
+
+* Please download and install SWI-Prolog for your machine at `https://www.swi-prolog.org/build/`.
+
+* Check the prerequisites for the specific repository to install.
+
+# 1. Install manually
+
+* Manually download the repositories listed under "Install manually" for the repository. 
+
+# 2. Or Install from List Prolog Package Manager (LPPM)
+
+* Download the LPPM Repository:
+
+```
+mkdir GitHub
+cd GitHub/
+git clone https://github.com/luciangreen/List-Prolog-Package-Manager.git
+cd List-Prolog-Package-Manager
+swipl
+['lppm'].
+lppm_install("luciangreen","<repository>").
+halt.
+```
+
+* where `<repository>` is replaced with the name of the repository.
+
+# Running
+
+`cd <repository>` (the name of the repository)
+`swipl`
+`<command>` (see the repository instructions)
+
+# Optional Web Interface
+
+* Load `lppm` by entering `['lppm.pl'].` in SWI-Prolog and run with the command e.g. `lppm_start_server(8001).` on the machine that is a local or remote host.
+* In the web browser, go to `http://127.0.0.1:8001/` to upload registry entries. Take care to enter double quotes around all strings. The registry will not accept badly formatted input. To update an entry, re-enter it with the same user and repository.
+* In the web browser, view the registry at `http://127.0.0.1:8001/registry`.
+* Install packages by running `lppm_install("User","Repository").` LPPM will prompt you for an installation folder. Packages are uncompressed source code from GitHub, saved to folders of the same name in the target folder. Please enter the relevant folder and follow the running instructions from the repository.
+
+# Authors
+
+Lucian Green - Initial programmer - Lucian Academy
+
+# License
+
+I licensed this project under the BSD3 License - see the LICENSE file for details
+
+
+
 diff --git a/List-Prolog-Package-Manager/la_strings.pl b/List-Prolog-Package-Manager/la_strings.pl new file mode 100644 index 0000000000000000000000000000000000000000..acb6229ac160409f007f83a85e786d9d1f6a5450 --- /dev/null +++ b/List-Prolog-Package-Manager/la_strings.pl @@ -0,0 +1,60 @@ +%% la_strings.pl
+
+:- use_module(library(pio)).
+:- use_module(library(dcg/basics)).
+
+open_s(File,Mode,Stream) :-
+	atom_string(File1,File),
+	open(File1,Mode,Stream),!.
+
+string_atom(String,Atom) :-
+	atom_string(Atom,String),!. 
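+
+% Example (sketch): string_atom/2 is atom_string/2 with its arguments
+% swapped, so the string argument comes first:
+% ?- string_atom("abc",A).
+% A = abc.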
+ +phrase_from_file_s(string(Output), String) :- + atom_string(String1,String), + phrase_from_file(string(Output), String1),!. + +writeln1(Term) :- + term_to_atom(Term,Atom), + writeln(Atom),!. + +write1(Term) :- + term_to_atom(Term,Atom), + write(Atom),!. + +shell1_s(Command) :- + atom_string(Command1,Command), + shell1(Command1),!. + +shell1(Command) :- + (bash_command(Command,_)-> + true; + (writeln(["Failed shell1 command: ",Command]),abort) + ),!. + +bash_command(Command, Output) :- + setup_call_cleanup(process_create(path(bash), + ['-c', Command], + [stdout(pipe(Out))]), + read_string(Out, _, Output), + close(Out)). + +concat_list(A1,B):- + A1=[A|List], + concat_list(A,List,B),!. + +concat_list(A,[],A):-!. +concat_list(A,List,B) :- + List=[Item|Items], + string_concat(A,Item,C), + concat_list(C,Items,B). + +append_list(A1,B):- + A1=[A|List], + append_list(A,List,B),!. + +append_list(A,[],A):-!. +append_list(A,List,B) :- + List=[Item|Items], + append(A,[Item],C), + append_list(C,Items,B). diff --git a/List-Prolog-Package-Manager/lppm.pl b/List-Prolog-Package-Manager/lppm.pl new file mode 100644 index 0000000000000000000000000000000000000000..44a1a792c5c1cdf2f224683e28d4c8405dc80840 --- /dev/null +++ b/List-Prolog-Package-Manager/lppm.pl @@ -0,0 +1,278 @@ +%% lppm.pl +%% List Prolog Package Manager + +%% Starts server, uploads manifest files, installs repositories + + +:- use_module(library(http/thread_httpd)). +:- use_module(library(http/http_dispatch)). +:- use_module(library(http/http_error)). +:- use_module(library(http/html_write)). + +% we need this module from the HTTP client library for http_read_data +:- use_module(library(http/http_client)). +:- http_handler('/', web_form, []). + +:- include('la_strings.pl'). + +%:- include('la_strings.pl'). %% Move la_strings and the contents of the repository into the root folder + +lppm_start_server(Port) :- + http_server(http_dispatch, [port(Port)]). 
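+
+% Example (sketch): start the registry server locally, then browse
+% http://127.0.0.1:8001/ to upload registry entries (see README.md):
+% ?- lppm_start_server(8001).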
+ + /* + browse http://127.0.0.1:8000/ + This demonstrates handling POST requests + */ + + web_form(_Request) :- + reply_html_page( + title('List Prolog Package Manager - Upload'), + [ + form([action='/landing', method='POST'], [ + p([], [ + label([for=user],'User: (e.g. "luciangreen")'), + %% If you enter strings without double quotes and there is an internal server error please contact luciangreen@lucianacademy.com + input([name=user, type=textarea]) + ]), + p([], [ + label([for=repository],'Repository: (e.g. "File2List2Br")'), + input([name=repository, type=textarea]) + ]), + p([], [ + label([for=description],'Description: (e.g. "xyz")'), + input([name=description, type=textarea]) + ]), + p([], [ + label([for=dependencies],'Dependencies: (e.g. [["User","Repository"], ...] or [])'), + input([name=dependencies, type=textarea]) + ]), + + p([], input([name=submit, type=submit, value='Submit'], [])) + ])]). + + :- http_handler('/landing', landing_pad, []). + + landing_pad(Request) :- + member(method(post), Request), !, + http_read_data(Request, Data, []), + format('Content-type: text/html~n~n', []), + format('

', []), +%%writeln1(Data) + +%%lp(Data):- + +Data=[user=User1,repository=Repository1,description=Description1,dependencies=Dependencies1 +,submit=_], + +((string1(User1),string1(Repository1),string1(Description1),term_to_atom(D2,Dependencies1),findall([B1,C1],(member(A,D2),A=[B,C],string(B),string(C),term_to_atom(B,B1),term_to_atom(C,C1)),Dependencies2),length(D2,F),length(Dependencies2,F))-> +( +%%writeln1([User1,Repository1]), +string_atom(User2,User1), +%%writeln(User2), +string_atom(Repository2,Repository1), +string_atom(Description2,Description1), +%%string_atom(Dependencies2,Dependencies1), +%%lppm_get_manifest(User2,Repository2,Description,Dependencies), + +lppm_get_registry(LPPM_registry_term1), + +strip(User2,User3), +strip(Repository2,Repository3), + +delete(LPPM_registry_term1,[User3,Repository3,_,_],LPPM_registry_term2), + +(LPPM_registry_term2=[]->LPPM_registry_term3=[[User2,Repository2,Description2,Dependencies2]];( +term_to_atom(LPPM_registry_term2,LPPM_registry_term4), + +strip(LPPM_registry_term4,LPPM_registry_term4a), +LPPM_registry_term5=[LPPM_registry_term4a], + +append(LPPM_registry_term5,[[User2,Repository2,Description2,Dependencies2]],LPPM_registry_term3))), + + +%%portray_clause(LPPM_registry_term3), + + + + (open_s("lppm_registry.txt",write,Stream), +%% string_codes(BrDict3), + write(Stream,LPPM_registry_term3), + close(Stream))) +;((writeln1("Error: One of strings was not in double quotes.")))), +%%), %%portray_clause(Data), + format('

========~n', []), + %%portray_clause(Request), + format('

') + + %% + . + +string1(A) :- atom_string(A,B),string_codes(B,C),C=[34|_],reverse(C,D),D=[34|_]. +lppm_get_registry(LPPM_registry_term1) :- + catch(phrase_from_file_s(string(LPPM_registry_string), "lppm_registry.txt"),_,(writeln1("Error: Cannot find lppm_registry.txt"),abort)), + +term_to_atom(LPPM_registry_term1,LPPM_registry_string). + + +%%strip2(A,B) :- strip(A,C),strip(C,B). + + +strip(A,G) :- string_codes(A,C),C=[_|D], reverse(D,E),E=[_|F],reverse(F,H),string_codes(G,H). +/** +lppm_get_registry1(LPPM_registry_term1) :- + catch(phrase_from_file_s(string(LPPM_registry_string), "lppm_registry1.txt"),_,(writeln1("Error: Cannot find lppm_registry1.txt"),abort)), + +term_to_atom(LPPM_registry_term1,LPPM_registry_string). +**/ + :- http_handler('/registry', registry, []). + + registry(_Request) :- + %%member(method(post), Request), !, + %%http_read_data(Request, _Data, []), + format('Content-type: text/html~n~n', []), + format('

', []), + +writeln("List Prolog Package Manager Registry\n\nInstallation instructions:\n\n Install List Prolog Package Manager (LPPM) by downloading SWI-Prolog, downloading the List Prolog Package Manager from GitHub, loading LPPM with ['lppm']. then installing packages by running lppm_install(\"User\",\"Repository\"). LPPM will prompt you for an installation directory. The available repositories are below.\n"), + +catch(phrase_from_file_s(string(LPPM_registry_string), "lppm_registry.txt"),_,(writeln1("Cannot find lppm_registry.txt"),abort)), + +term_to_atom(LPPM_registry_term1,LPPM_registry_string), + + format(' + + + + + +', []), + +findall(_,(member(LPPM_registry_term2,LPPM_registry_term1), + format('', []), + +LPPM_registry_term2=[User,Repository,Description,Dependencies], + +concat_list([""],Text1), + +string_atom(Text1,Text1a), + format(Text1a, []), + + +concat_list([""],Text2), + +string_atom(Text2,Text2a), + format(Text2a, []), + + format('', []), + format('', [])),_), + + format('
User/RepositoryDescriptionDependencies
",User,"/",Repository,"",Description,"', []), + +findall(_,(member(Dependency,Dependencies),Dependency=[User1,Repository1],concat_list(["",User1,"/",Repository1," "],String1), + string_atom(String1,Atom1), + +format(Atom1, [])),_), + format('
', []),
+
+ writeln("\nPackage uploading instructions:\n\n Enter registry entries for List Prolog Package Manager (LPPM) packages by visiting http://127.0.0.1:8001/."),
+ format('NB. The luciangreen/Text-to-Breasonings repository requires special instructions.', [])
+
+.
+
+string(String) --> list(String).
+
+list([]) --> [].
+list([L|Ls]) --> [L], list(Ls).
+
+%% You can install without the repository being in the registry, meaning LPPM doesn't require a registry x
+
+%% may comment out lppm_get_manifest
+lppm_get_manifest(User1,Repository1,Description,Dependencies1) :-
+	concat_list(["curl -iH 'User-Agent: luciangreen' -X POST https://raw.githubusercontent.com/",User1,"/",Repository1,"/master/","manifest.txt"],Command1),
+	atom_string(Command2,Command1),
+	catch(bash_command(Command2,ManifestFile1), _, (concat_list(["Error: https://raw.githubusercontent.com/",User1,"/",Repository1,"/master/","manifest.txt doesn't exist."],Text1),writeln1(Text1),abort)),
+	term_to_atom(ManifestFile2,ManifestFile1),
+	catch(ManifestFile2=[description=Description,dependencies=Dependencies1],_,(writeln1("Error: File manifest.txt isn't in [description=Description,dependencies=Dependencies] format."),abort)),
+	catch(findall([User2,Repository2],(member(Dependency1,Dependencies1),Dependency1=[User2,Repository2]),Dependencies1),_,(writeln1("Error: Dependencies in file manifest.txt aren't in [[User,Repository], ...] format."),abort)).
+
+	%% confirm database contents - link to page
+
+	%%** install
+lppm_install(User1,Repository1) :-
+	%%lppm_get_manifest(User1,Repository1,_Description,Dependencies1),
+	lppm_get_registry(LPPM_registry_term1),
+	member([User1,Repository1,_Description,Dependencies1],LPPM_registry_term1),
+	(%%repeat,
+	concat_list(["Please enter path to install ",User1,"/",Repository1," to: (e.g. 
../ to install at the same level as List Prolog Package Manager)."],Text2), + writeln1(Text2),read_string(user_input, "\n", "\r", _, Path1), + (working_directory(_,Path1)->true;(concat_list(["Warning: ",Path1," doesn't exist."],Text3),writeln1(Text3),fail))), + + %catch((true, call_with_time_limit(1, + find_all_dependencies(LPPM_registry_term1,%[[User1,Repository1]],%%,Description,Dependencies1 + Dependencies1,[],Dependencies1a) + , + %)), + % time_limit_exceeded, + % (concat_list(["Error: Cycle in lppm_registry.txt: ",Dependencies1],Note_a),writeln(Note_a),abort)), + + append([[User1,Repository1%%,Description,Dependencies1 + ]],Dependencies1a,Dependencies2), + + sort(Dependencies2,Dependencies2b), + + + %trace, + %writeln(Dependencies2b), + findall(_,(member(Dependency2,Dependencies2b),Dependency2=[User3,Repository3], + concat_list(["git clone https://github.com/",User3,"/",Repository3,".git"],Command3), + catch(bash_command(Command3,_), _, (concat_list(["Warning."%%"Error: Can't clone ",User3,"/",Repository3," repository on GitHub." + ],Text4),writeln1(Text4)%%,abort + ))),_),!. + +find_all_dependencies(_,[],A,A) :- !. +find_all_dependencies(LPPM_registry_term1,Dependencies1,Dependencies8,Dependencies3) :- + Dependencies1=[[User1a,Repository2]|Dependencies9], + (member([User1a,Repository2,_Description,Dependencies7],LPPM_registry_term1)->true;(concat_list(["Error: Missing lppm_registry.txt entry: [",User1a,",",Repository2,"]."],Note_b),writeln(Note_b),abort)), + append(Dependencies8,[[User1a,Repository2]],Dependencies10), + + subtract(Dependencies7,Dependencies10,Dependencies11), +find_all_dependencies(LPPM_registry_term1,Dependencies11,Dependencies10,Dependencies6), + find_all_dependencies(LPPM_registry_term1,Dependencies9,Dependencies6,Dependencies3). 
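+
+% Worked example (hypothetical registry entries, for illustration only):
+% ?- Reg=[["u","a","desc a",[["u","b"]]],["u","b","desc b",[]]],
+%    find_all_dependencies(Reg,[["u","b"]],[],D).
+% D = [["u","b"]],
+% so lppm_install("u","a") clones both u/a and u/b.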
+ + + %%concat_list(["git pull https://github.com/",User3,"/",Repository3,".git master"],Command4), + %%catch(bash_command(Command4,_), _, (concat_list(["Error: Can't pull ",User3,"/",Repository3," repository on GitHub."],Text5),writeln1(Text5),abort))),_). + + /** + + ()- urls for instructions including install and examples + username + repository, + ()Home page: https://github.com/u/r + Short Description + writeln1("") + + **/ + +bash_command(Command, Output) :- + setup_call_cleanup(process_create(path(bash), + ['-c', Command], + [stdout(pipe(Out))]), + read_string(Out, _, Output), + close(Out)). + + +/** +lppm_upload(User1,Repository1,Description1,Dependencies1) :- + %%lppm_get_manifest(User1,Repository1,Data1), + %%lppm_get_registry1(LPPM_registry_term1), + IP_Port=%%"x.x.x.x:8001", + "127.0.0.1:8001", + concat_list(["curl -X POST -F \"user='",User1,"'\" http://",IP_Port,"/landing"],Command1), +%% %%-F \"repository='",Repository1,"'\" -F \"description='",Description1,"'\" -F \"dependencies='",Dependencies1,"'\" + catch(bash_command(Command1,_), _, (concat_list(["Error: Can't upload entry for ",User1,"/",Repository1," repository on GitHub to Registry."],Text1),writeln1(Text1),abort)), + concat_list(["Upload successful. You can check your repository is listed at http://",IP_Port,"/registry."],Text2), + writeln1(Text2),!. 
+**/ + +%% answer question: what is my ip address diff --git a/List-Prolog-Package-Manager/lppm_registry.txt b/List-Prolog-Package-Manager/lppm_registry.txt new file mode 100644 index 0000000000000000000000000000000000000000..a4065d54a0cbb39359d5726bed61440b812d301a --- /dev/null +++ b/List-Prolog-Package-Manager/lppm_registry.txt @@ -0,0 +1,7 @@ +[["luciangreen","State-Machine-to-List-Prolog","Converts State Saving Interpreter State Machines to List Prolog algorithms.",[["luciangreen","SSI"]]],["luciangreen","luciancicd","Single-user continuous integration and continuous deployment.",[["luciangreen","gitl"],["luciangreen","listprologinterpreter"],["luciangreen","List-Prolog-Package-Manager"],["luciangreen","Prolog-to-List-Prolog"],["luciangreen","Philosophy"]]],["luciangreen","File2List2Br","Breasons out long files.",[["luciangreen","culturaltranslationtool"],["luciangreen","Languages"],["luciangreen","listprologinterpreter"],["luciangreen","Text-to-Breasonings"]]],["luciangreen","List-Prolog-Package-Manager","Registry manager and installer for repositories with dependencies.",[]],["luciangreen","Text-to-Breasonings","Helps Earn High Distinctions.",[["luciangreen","Algorithm-Writer-with-Lists"],["luciangreen","Languages"],["luciangreen","culturaltranslationtool"],["luciangreen","listprologinterpreter"]]],["luciangreen","t2ab","Helps Earn High Distinctions by breasoning out object algorithms.",[["luciangreen","Text-to-Breasonings"]]],["luciangreen","Text-to-Object-Name","Helps Prepare to Earn High Distinctions.",[["luciangreen","Lucian-Academy-Data"],["luciangreen","Text-to-Breasonings"],["luciangreen","listprologinterpreter"]]],["luciangreen","Languages","Use repositories in other languages.",[["luciangreen","culturaltranslationtool"],["luciangreen","listprologinterpreter"],["luciangreen","Prolog-to-List-Prolog"]]],["luciangreen","Combination-Algorithm-Writer","Combination Algorithm Writer",[]],["luciangreen","Combination-Algorithm-Writer-Stable","Stable version 
of Combination Algorithm Writer",[]],["luciangreen","listprologinterpreter","List Prolog Interpreter",[["luciangreen","SSI"],["luciangreen","culturaltranslationtool"],["luciangreen","Languages"]]],["luciangreen","Algorithm-Writer-with-Lists","Generates Random Algorithms",[["luciangreen","culturaltranslationtool"],["luciangreen","Languages"],["luciangreen","listprologinterpreter"],["luciangreen","Text-to-Breasonings"],["luciangreen","Lucian-Academy"],["luciangreen","Prolog-to-List-Prolog"]]],["luciangreen","Combination-Algorithm-Writer-Multiple-Predicates","Writes algorithms with multiple predicates.",[["luciangreen","Languages"],["luciangreen","culturaltranslationtool"],["luciangreen","listprologinterpreter"]]], + +["luciangreen","Philosophy","List of Philosophy Algorithms.",[["luciangreen","Algorithm-Writer-with-Lists"],["luciangreen","luciancicd"],["luciangreen","gitl"],["luciangreen","Time_Machine"],["luciangreen","t2ab"],["luciangreen","SSI"],["luciangreen","Lucian-Academy"],["luciangreen","List-Prolog-to-Prolog-Converter"],["luciangreen","Prolog-to-List-Prolog"],["luciangreen","LuciansHandBitMap-Font"],["luciangreen","listprologinterpreter"],["luciangreen","Languages"],["luciangreen","culturaltranslationtool"],["luciangreen","mindreader"],["luciangreen","Text-to-Breasonings"],["luciangreen","Music-Composer"],["luciangreen","State-Machine-to-List-Prolog"],["luciangreen","Essay-Helper"],["luciangreen","culturaltranslationtool"],["luciangreen","texttoalg"],["luciangreen","Daily-Regimen"],["luciangreen","File2List2Br"],["luciangreen","lp2c"],["luciangreen","lucianpl"],["luciangreen","Lucian-Academy-Data"],["luciangreen","c2lp"],["luciangreen","Combination-Algorithm-Writer-Multiple-Predicates"],["luciangreen","Business"],["luciangreen","Combination-Algorithm-Writer-Stable"],["luciangreen","Text-to-Object-Name"],["luciangreen","Split-on-Phrases"],["luciangreen","texttoalg2"],["luciangreen","Simple-List-Prolog-to-List-Prolog"],["luciangreen","qa_db_finder"],["luciangr
een","Program-Finder-from-Data-"],["luciangreen","Logic-Formula-Finder"],["luciangreen","databaseformulafinder"],["luciangreen","Combination-Algorithm-Writer-with-Predicates"],["luciangreen","Combination-Algorithm-Writer"],["luciangreen","grammarinterpreter"]]], + +["luciangreen","lucianpl","Compile and uncompile lists.",[]], +["luciangreen","Business","Make money.",[]], +["luciangreen","mindreader","Contains Prolog programs that can be used by an individual to read their mind using meditation.",[["luciangreen","Languages"],["luciangreen","culturaltranslationtool"],["luciangreen","listprologinterpreter"],["luciangreen","Text-to-Breasonings"],["luciangreen","Music-Composer"]]],["luciangreen","Music-Composer","Music Composer can (optionally mind read to) create songs. Requires asc2mid (http://www.archduke.org/midi/, compile with a C compiler before running, call application asc2mid and place in Music-Composer folder).",[["luciangreen","Languages"],["luciangreen","culturaltranslationtool"],["luciangreen","listprologinterpreter"],["luciangreen","mindreader"]]],["luciangreen","Combination-Algorithm-Writer-with-Predicates","Combination Algorithm Writer with Predicates.",[]],["luciangreen","Prolog-to-List-Prolog","Converts Prolog algorithms to List Prolog algorithms.",[["luciangreen","List-Prolog-to-Prolog-Converter"],["luciangreen","listprologinterpreter"]]],["luciangreen","c2lp","Converts C algorithms to List Prolog algorithms.",[["luciangreen","listprologinterpreter"],["luciangreen","Prolog-to-List-Prolog"]]],["luciangreen","texttoalg","Text to Algorithm.",[["luciangreen","Languages"],["luciangreen","culturaltranslationtool"],["luciangreen","listprologinterpreter"],["luciangreen","Text-to-Breasonings"]]],["luciangreen","texttoalg2","Text to Algorithm 
2.",[["luciangreen","Languages"],["luciangreen","culturaltranslationtool"],["luciangreen","listprologinterpreter"],["luciangreen","Text-to-Breasonings"],["luciangreen","texttoalg"]]],["luciangreen","LuciansHandBitMap-Font","LuciansHandBitMap Font.",[]],["luciangreen","Split-on-Phrases","Splits text files into groups of files.",[["luciangreen","Languages"],["luciangreen","culturaltranslationtool"],["luciangreen","listprologinterpreter"]]],["luciangreen","List-Prolog-to-Prolog-Converter","Converts List Prolog code to Prolog code.",[["luciangreen","Prolog-to-List-Prolog"],["luciangreen","Languages"],["luciangreen","culturaltranslationtool"],["luciangreen","listprologinterpreter"]]],["luciangreen","lp2c","Converts List Prolog code to C code.",[["luciangreen","Languages"],["luciangreen","culturaltranslationtool"],["luciangreen","listprologinterpreter"]]],["luciangreen","Program-Finder-from-Data-","Finds recursive algorithms from the structure of data.",[["luciangreen","Languages"],["luciangreen","culturaltranslationtool"],["luciangreen","listprologinterpreter"]]],["luciangreen","culturaltranslationtool","Cultural Translation Tool. 
Requires Translation Shell and gawk.",[["luciangreen","listprologinterpreter"],["luciangreen","gitl"],["luciangreen","List-Prolog-to-Prolog-Converter"],["luciangreen","Philosophy"],["luciangreen","Daily-Regimen"]]],["luciangreen","grammarinterpreter","Grammar Interpreter.",[]],["luciangreen","databaseformulafinder","Database Formula Finder.",[]],["luciangreen","Essay-Helper","Helps write essays using KNNs.",[["luciangreen","Languages"],["luciangreen","listprologinterpreter"],["luciangreen","culturaltranslationtool"],["luciangreen","mindreader"],["luciangreen","Prolog-to-List-Prolog"],["luciangreen","Philosophy"]]],["luciangreen","Simple-List-Prolog-to-List-Prolog","Converts Simple List Prolog code to List Prolog code.",[]],["luciangreen","Logic-Formula-Finder","Finds formulas with and, or and not from data.",[]],["luciangreen","qa_db_finder","Finds database style list-based algorithms with question answering",[]],["luciangreen","music_files","Meta data of songs created with Music Composer.",[["luciangreen","Music-Composer"]]],["luciangreen","CGPT_Immortality_and_Health_Books","CGPT Immortality and Health Books.",[]],["luciangreen","Time_Machine","Travel through time with meditation.",[["luciangreen","music_files"],["luciangreen","CGPT_Immortality_and_Health_Books"],["luciangreen","Lucian-Academy"],["luciangreen","Philosophy"],["luciangreen","listprologinterpreter"],["luciangreen","Text-to-Breasonings"]]],["luciangreen","Lucian-Academy","Automates the Lucian Academy",[["luciangreen","listprologinterpreter"],["luciangreen","Languages"],["luciangreen","culturaltranslationtool"],["luciangreen","Algorithm-Writer-with-Lists"],["luciangreen","Text-to-Breasonings"],["luciangreen","mindreader"]]],["luciangreen","Daily-Regimen","Scripts for daily meditation, bot prep, uni and 
breasoning.",[["luciangreen","listprologinterpreter"],["luciangreen","Languages"],["luciangreen","culturaltranslationtool"],["luciangreen","Algorithm-Writer-with-Lists"],["luciangreen","Text-to-Breasonings"],["luciangreen","mindreader"]]],["luciangreen","SSI","State Saving Interpreter",[["luciangreen","listprologinterpreter"]]],["luciangreen","Lucian-Academy-Data","Data files for Text-to-Object-Name.",[]],["luciangreen","c","Test.",[]],["luciangreen","e","Test.",[]],["luciangreen","d","Test.",[["luciangreen","e"]]],["luciangreen","gitl","Decentralised Git.",[["luciangreen","listprologinterpreter"],["luciangreen","luciancicd"]]]] \ No newline at end of file diff --git a/List-Prolog-to-Prolog-Converter/.DS_Store b/List-Prolog-to-Prolog-Converter/.DS_Store new file mode 100644 index 0000000000000000000000000000000000000000..5008ddfcf53c02e82d7eee2e57c38e5672ef89f6 Binary files /dev/null and b/List-Prolog-to-Prolog-Converter/.DS_Store differ diff --git a/List-Prolog-to-Prolog-Converter/LICENSE b/List-Prolog-to-Prolog-Converter/LICENSE new file mode 100644 index 0000000000000000000000000000000000000000..d925e8c4aa63a274f76a5c22de79d0ada99b0da7 --- /dev/null +++ b/List-Prolog-to-Prolog-Converter/LICENSE @@ -0,0 +1,29 @@ +BSD 3-Clause License + +Copyright (c) 2019, Lucian Green +All rights reserved. + +Redistribution and use in source and binary forms, with or without +modification, are permitted provided that the following conditions are met: + +1. Redistributions of source code must retain the above copyright notice, this + list of conditions and the following disclaimer. + +2. Redistributions in binary form must reproduce the above copyright notice, + this list of conditions and the following disclaimer in the documentation + and/or other materials provided with the distribution. + +3. 
Neither the name of the copyright holder nor the names of its + contributors may be used to endorse or promote products derived from + this software without specific prior written permission. + +THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" +AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE +IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE +DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE +FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL +DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR +SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER +CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, +OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE +OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. diff --git a/List-Prolog-to-Prolog-Converter/README.md b/List-Prolog-to-Prolog-Converter/README.md new file mode 100644 index 0000000000000000000000000000000000000000..623d2aa6348c9128f4f7443b1ed9527fcdbe4a7e --- /dev/null +++ b/List-Prolog-to-Prolog-Converter/README.md @@ -0,0 +1,112 @@ +# List-Prolog-to-Prolog-Converter + +Converts List Prolog code to Prolog code. + +List Prolog (LP) Interpreter (available here) is an interpreter for a different version of Prolog that is in list format, making it easier to generate List Prolog programs. The LP interpreter is an algorithm that parses and runs List Prolog code. The converter helps convert List Prolog programs to Prolog programs. The interpreter and converter are written in SWI-Prolog. + + +# Prerequisites + +* Please download and install SWI-Prolog for your machine at `https://www.swi-prolog.org/build/`. + +# 1. Install manually + +Download this repository, the List Prolog Interpreter Repository and the Text to Breasonings Repository. + +# 2. 
Or Install from List Prolog Package Manager (LPPM) + +* Download the LPPM Repository: + +``` +mkdir GitHub +cd GitHub/ +git clone https://github.com/luciangreen/List-Prolog-Package-Manager.git +cd List-Prolog-Package-Manager +swipl +['lppm']. +lppm_install("luciangreen","List-Prolog-to-Prolog-Converter"). +halt +``` + +# Running + +* In Shell: +`cd List-Prolog-to-Prolog-Converter` +`swipl` +`['../listprologinterpreter/listprolog'].` + +* Load the List Prolog to Prolog Converter by typing: +`['lp2pconverter'].` + +* The converter is called in the form: +`test(Number,_,Algorithm1,_),lp2p1(Algorithm1,Algorithm2),write(Algorithm2).` + +Where: +Number - Test number of algorithm to convert (taken from "lpiverify4.pl"). +Algorithm1 - is the List Prolog algorithm to convert. +Algorithm2 - is the Prolog algorithm produced. + +* For example: +``` +test(1,_,A,_),lp2p1(A,B),write(B). +function(A,B,C):-+(A,B,C). +``` + +``` +test(2,_,A,_),lp2p1(A,B),write(B). +function(A,B,C):-+(A,B,D),+(D,1,C). +``` + +``` +test(3,_,A,_),lp2p1(A,B),write(B). +function(A,B,C):-function2(D,F),+(A,B,E),+(E,F,G),+(G,D,C). +function2(A,F):-is(A,2),is(F,1). +``` + +``` +test(4,_,A,_),lp2p1(A,B),write(B). +append1(A):-b(B),c(C),append(B,C,A). +b(["b"]). +c(["c"]). +``` + +``` +test(15,_,A,_),lp2p1(A,B),write(B). +grammar1(U,T):-compound(U,"",[],T). +compound213(U,U,T,T). +compound(T,U)->"[","]",compound213(T,U). +compound(T,U)->"[",compound21(T,V),"]",compound213(V,U). +compound212(U,U,T,T). +compound21(T,U)->item(I),lookahead("]"),{wrap(I,Itemname1),append(T,Itemname1,V)},compound212(V,U). +compound21(T,U)->item(I),",",compound21([],Compound1name),{wrap(I,Itemname1),append(T,Itemname1,V),append(V,Compound1name,U)}. +item(T)->"\"",word21("",T),"\"". +item(T)->number21("",U),{stringtonumber(U,T)}. +item(T)->word21_atom("",T1),{atom_string(T,T1)}. +item(T)->compound([],T). +number212(U,U,T,T). 
+number21(T,U)->A,commaorrightbracketnext,{((stringtonumber(A,A1),number(A1))->(true);((equals4(A,".")->(true);(equals4(A,"-"))))),stringconcat(T,A,V)},number212(V,U). +number21(T,U)->A,{((stringtonumber(A,A1),number(A1))->(true);((equals4(A,".")->(true);(equals4(A,"-"))))),stringconcat(T,A,V)},number21("",Numberstring),{stringconcat(V,Numberstring,U)}. +word212(U,U,T,T). +word21(T,U)->A,quote_next,{not((=(A,"\""))),stringconcat(T,A,V)},word212(V,U). +word21(T,U)->A,{not((=(A,"\""))),stringconcat(T,A,V)},word21("",Wordstring),{stringconcat(V,Wordstring,U)}. +word212_atom(U,U,T,T). +word21_atom(T,U)->A,commaorrightbracketnext,{not((=(A,"\""))),not((=(A,"["))),not((=(A,"]"))),stringconcat(T,A,V)},word212_atom(V,U). +word21_atom(T,U)->A,{not((=(A,"\""))),not((=(A,"["))),not((=(A,"]"))),stringconcat(T,A,V)},word21_atom("",Wordstring),{stringconcat(V,Wordstring,U)}. +commaorrightbracketnext->lookahead(","). +commaorrightbracketnext->lookahead("]"). +quote_next->lookahead("\""). +lookahead(A,A,B):-stringconcat(B,D,A). +``` + +# Tests + +* Run `bt-p2lp_test(A,B)` or `bt-p2lp_test1(N,B)` where N is the test number from Prolog to List Prolog/p2lpverify.pl. +* Similarly, `bt1-p2lp_test(A,B)` and `bt1-p2lp_test1(N,B)` work. + +# Authors + +Lucian Green - Initial programmer - Lucian Academy + +# License + +I licensed this project under the BSD3 License - see the LICENSE.md file for details diff --git a/List-Prolog-to-Prolog-Converter/bt-p2lp_test1.pl b/List-Prolog-to-Prolog-Converter/bt-p2lp_test1.pl new file mode 100644 index 0000000000000000000000000000000000000000..e41711a12583fc114188e69a986c3f21110c2920 --- /dev/null +++ b/List-Prolog-to-Prolog-Converter/bt-p2lp_test1.pl @@ -0,0 +1,39 @@ +% back-translates Prolog to List Prolog then List Prolog to Prolog. + +% bt-p2lp_test(A,B). +% A - output: total tests +% B - output: total correct results + +% bt-p2lp_test1(N,B). 
+% N - input: number to test +% B - output: result + +bt-p2lp_test(BL,RL) :- +findall(A,(p2lp_test(N,_I,O),((lp2p1(O,I1),p2lpconverter([string,I1],O2),O=O2)->(writeln([bt-p2lp_test,N,passed]),A=passed);(writeln([bt-p2lp_test,N,failed]),A=failed))),B), +length(B,BL), +findall(_,member(passed,B),R),length(R,RL),!. + +bt-p2lp_test1(N,A) :- +p2lp_test(N,_I,O),((lp2p1(O,I1),p2lpconverter([string,I1],O2),O=O2)->(writeln([bt-p2lp_test,N,passed]),A=passed);(writeln([bt-p2lp_test,N,failed]),A=failed)),!. + + + +bt-p2lp_pp_test(BL,RL) :- +findall(A,(p2lp_test(N,_I,O),((pp_lp2p0(O,I1),p2lpconverter([string,I1],O2),O=O2)->(writeln([bt-p2lp_test,N,passed]),A=passed);(writeln([bt-p2lp_test,N,failed]),A=failed))),B), +length(B,BL), +findall(_,member(passed,B),R),length(R,RL),!. + +bt-p2lp_pp_test1(N,A) :- +p2lp_test(N,_I,O),((pp_lp2p0(O,I1),p2lpconverter([string,I1],O2),O=O2)->(writeln([bt-p2lp_test,N,passed]),A=passed);(writeln([bt-p2lp_test,N,failed]),A=failed)),!. + +% to do +% back-translation, on I not O + + +bt1-p2lp_test(BL,RL) :- +findall(A,(p2lp_test(N,I,_O),(p2lpconverter([string,I],O1),(lp2p1(O1,I2),(string_concat(I1,"\n",I2)->true;I1=I2),I=I1)->(writeln([bt1-p2lp_test,N,passed]),A=passed);(writeln([bt1-p2lp_test,N,failed]),A=failed))),B), +length(B,BL), +findall(_,member(passed,B),R),length(R,RL),!. + +bt1-p2lp_test1(N,A) :- +p2lp_test(N,I,_O),(p2lpconverter([string,I],O1),(lp2p1(O1,I2),(string_concat(I1,"\n",I2)->true;I1=I2),I=I1)->(writeln([bt1-p2lp_test,N,passed]),A=passed);(writeln([bt1-p2lp_test,N,failed]),A=failed)),!. diff --git a/List-Prolog-to-Prolog-Converter/lp2pconverter.pl b/List-Prolog-to-Prolog-Converter/lp2pconverter.pl new file mode 100644 index 0000000000000000000000000000000000000000..ff26cb2e5d0e5cd573b4ada629bba3e18443ffcd --- /dev/null +++ b/List-Prolog-to-Prolog-Converter/lp2pconverter.pl @@ -0,0 +1,6 @@ +%% lp2p + +:-include('../listprologinterpreter/listprolog.pl'). +:-include('lp2pconverter1.pl'). 
+:-include('../Prolog-to-List-Prolog/p2lpconverter.pl'). +:-include('bt-p2lp_test1.pl'). diff --git a/List-Prolog-to-Prolog-Converter/lp2pconverter1.pl b/List-Prolog-to-Prolog-Converter/lp2pconverter1.pl new file mode 100644 index 0000000000000000000000000000000000000000..81f68d2fc786532c9bec470253f7959e2ebdc857 --- /dev/null +++ b/List-Prolog-to-Prolog-Converter/lp2pconverter1.pl @@ -0,0 +1,607 @@ +%:-dynamic pred_type/1. + +lp2p1(Algorithm1,Algorithm2) :- + %retractall(pred_type(_)), + %% note: without type, mode statements + memberlp2p10(Algorithm1,"",Algorithm2). + %%string_concat(Algorithm3,"]",Algorithm2). + +symbol_lp2p(":-",":-").%:- +%retractall(pred_type(_)), +%assertz(pred_type(":-")) +symbol_lp2p("->","-->"). + +memberlp2p10([],Algorithm1,Algorithm1) :- !. +memberlp2p10(Functions2,Algorithm1,Algorithm2) :- + Functions2=[Functions3|Functions4], + memberlp2p1(Functions3,"",Algorithm3), + %writeln(Algorithm3), + string_concat(Algorithm1,Algorithm3,Algorithm31), + memberlp2p10(Functions4,Algorithm31,Algorithm2). + +memberlp2p1(Functions2,Algorithm1,Algorithm2) :- + Functions2=[Function], + interpretstatementlp2p2a(Function,Algorithm1,Algorithm3a,"[]"), + %string_concat(Algorithm3a,"(",Algorithm3d), + %interpretstatementlp2p2(Arguments2,Algorithm3d,Algorithm3e), + %string_concat(Algorithm3e,")",Algorithm3f), + %concat_list([Algorithm3f,Symbol],Algorithm3), + %interpretbodylp2p(Body,Algorithm3,Algorithm2a), + write_full_stop_if_last_item([],Algorithm3a,Algorithm2), +!. 
+ +/* +memberlp2p1(Functions2,Algorithm1,Algorithm2) :- +trace, + Function=[n,use_module], + Functions2=[Symbol,Function,[Arguments2%,"/",Arguments3 + ]], + symbol_lp2p(Symbol), + interpretstatementlp2p2a(Function,"",Algorithm3a,"[]"), + %string_concat(Algorithm3a,"(",Algorithm3d), + %interpretstatementlp2p2(Arguments2,"",Algorithm3e), + %interpretstatementlp2p2(Arguments3,"",Algorithm3f), + %string_concat(Algorithm3e,")",Algorithm3f), + concat_list([Algorithm1,Symbol,Algorithm3a,"( ",Arguments2,%"/",Arguments3, + ")"],Algorithm3), + %interpretbodylp2p(Body,Algorithm3,Algorithm2a), + write_full_stop_if_last_item([],Algorithm3,Algorithm2), +!. +*/ + +/* +memberlp2p1(Functions2,Algorithm1,Algorithm2) :- +%trace, + Functions2=[A=B], + (string(A)->true;(atom(A)->true;(number(A)))), + (string(B)->true;(atom(B)->true;(number(B)))), + %interpretstatementlp2p2a(Function,"",Algorithm3a,"[]"), + %string_concat(Algorithm3a,"(",Algorithm3d), + %interpretstatementlp2p2(Arguments2,"",Algorithm3e), + %interpretstatementlp2p2(Arguments3,"",Algorithm3f), + %string_concat(Algorithm3e,")",Algorithm3f), + concat_list([A,"=",B],Algorithm3), + %interpretbodylp2p(Body,Algorithm3,Algorithm2a), + write_full_stop_if_last_item([],Algorithm3,Algorithm2), +!. +*/ +memberlp2p1(Functions2,Algorithm1,Algorithm2) :- +%trace, + Functions2=[Symbol1,Function,[Arguments2,"/",Arguments3]], + symbol_lp2p(Symbol1,Symbol2), + interpretstatementlp2p2a(Function,"",Algorithm3a,"[]"), + %string_concat(Algorithm3a,"(",Algorithm3d), + %interpretstatementlp2p2(Arguments2,"",Algorithm3e), + %interpretstatementlp2p2(Arguments3,"",Algorithm3f), + %string_concat(Algorithm3e,")",Algorithm3f), + concat_list([Algorithm1,Symbol2,Algorithm3a," ",Arguments2,"/",Arguments3],Algorithm3), + %interpretbodylp2p(Body,Algorithm3,Algorithm2a), + write_full_stop_if_last_item([],Algorithm3,Algorithm2), +!. 
+ +memberlp2p1(Functions2,Algorithm1,Algorithm2) :- + Functions2=[Symbol1,Function,Arguments2], + symbol_lp2p(Symbol1,Symbol2), + interpretstatementlp2p2a(Function,"",Algorithm3a,"[]"), + %string_concat(Algorithm3a,"(",Algorithm3d), + interpretstatementlp2p2(Arguments2,"",Algorithm3e,"()"), + %interpretstatementlp2p2(Arguments3,"",Algorithm3f), + %string_concat(Algorithm3e,")",Algorithm3f), + concat_list([Algorithm1,Symbol2,Algorithm3a,"(",Algorithm3e,")"],Algorithm3), + %interpretbodylp2p(Body,Algorithm3,Algorithm2a), + write_full_stop_if_last_item([],Algorithm3,Algorithm2), +!. + +memberlp2p1(Functions2,Algorithm1,Algorithm2) :- + Functions2=[Function,Arguments2,Symbol1,Body], + symbol_lp2p(Symbol1,Symbol2), + (Symbol1="->"->S="()";S="[]"), + interpretstatementlp2p2a(Function,Algorithm1,Algorithm3a,"[]"), + string_concat(Algorithm3a,"(",Algorithm3d), + interpretstatementlp2p2(Arguments2,Algorithm3d,Algorithm3e,S), + string_concat(Algorithm3e,")",Algorithm3f), + concat_list([Algorithm3f,Symbol2],Algorithm3), + interpretbodylp2p(Body,Algorithm3,Algorithm2a), + write_full_stop_if_last_item([],Algorithm2a,Algorithm2), +!. + %%; +%% memberlp2p11(Functions2,Algorithm1,Algorithm2)). + +memberlp2p1(Functions2,Algorithm1,Algorithm2) :- + Functions2=[Function,Symbol1,Body], + symbol_lp2p(Symbol1,Symbol2), + interpretstatementlp2p2a(Function,Algorithm1,Algorithm3b,"[]"), + concat_list([Algorithm3b,Symbol2],Algorithm3a), + %%string_concat(Algorithm3a,"(",Algorithm3d), + interpretbodylp2p(Body,Algorithm3a,Algorithm2a), + write_full_stop_if_last_item([],Algorithm2a,Algorithm2),!. + %%string_concat(Algorithm3e,")",Algorithm2))). +/** + ( + memberlp2p12(Functions2,Algorithm1,Algorithm2)) + ). 
+**/ + +memberlp2p1(Functions2,Algorithm1,Algorithm2) :- + Functions2=[[n,comment],[Comment]], + %interpretstatementlp2p2a(Function,Algorithm1,Algorithm3a), + + foldr(string_concat,[Algorithm1,%"\n", + Comment,"\n" + ],Algorithm2), + + %string_concat(Algorithm3a,"(",Algorithm3d), + %interpretstatementlp2p2b(Arguments2,Algorithm3d,Algorithm2a), + %write_full_stop_if_last_item([],Algorithm9,Algorithm2), + !. + + +memberlp2p1(Functions2,Algorithm1,Algorithm2) :- +%trace, + Function=[n,use_module], + Functions2=[Function,Arguments2], + + + interpretstatementlp2p2a(Function,Algorithm1,Algorithm3a,"[]"), + string_concat(Algorithm3a,"(",Algorithm3d), + +(Arguments2=[[[n, library], [A]]]-> + +foldr(string_concat,[Algorithm3d,"library(",A,"))"],Algorithm2a); + interpretstatementlp2p2b(Arguments2,Algorithm3d,Algorithm2a,"()")), + write_full_stop_if_last_item([],Algorithm2a,Algorithm2),!. + + +/* +memberlp2p1(Functions2,Algorithm1,Algorithm2) :- + forall(member(A,Functions2),A=[_=_]), + %interpretstatementlp2p2a(Function,Algorithm1,Algorithm3a,"[]"), +% string_concat(Algorithm3a,"(",Algorithm3d), + interpretstatementlp2p2b(Functions2,Algorithm1,Algorithm2a,"[]"), + write_full_stop_if_last_item([],Algorithm2a,Algorithm2),!. +*/ +memberlp2p1(Functions2,Algorithm1,Algorithm2) :- + Functions2=[Function,Arguments2], + interpretstatementlp2p2a(Function,Algorithm1,Algorithm3a,"[]"), + string_concat(Algorithm3a,"(",Algorithm3d), + interpretstatementlp2p2b(Arguments2,Algorithm3d,Algorithm2a,"[]"), + write_full_stop_if_last_item([],Algorithm2a,Algorithm2),!. +/** + ; + (memberlp2p13(Functions2,Algorithm1,Algorithm2) + )). +**/ + +memberlp2p1(Functions2,Algorithm1,Algorithm2) :- + Functions2=[Function], + interpretstatementlp2p2b(Function,Algorithm1,Algorithm2a,"[]"), + write_full_stop_if_last_item([],Algorithm2a,Algorithm2),!. 
+/** + , + (Functions2=[_Function|Functions3], + write_comma_if_not_empty_list(Functions3,Algorithm2b,Algorithm2a), + memberlp2p1(Functions3,Algorithm2a,Algorithm2c), + write_full_stop_if_last_item(Functions3,Algorithm2c,Algorithm2)) + ). +**/ + +interpretbodylp2p([],Algorithm1,Algorithm1) :- !. +interpretbodylp2p(Body,Algorithm1,Algorithm2) :- + Body=[Statement|Statements], + Statement=[v,_], + interpretstatementlp2p2a(Statement,Algorithm1,Algorithm3,"[]"), write_comma_if_not_empty_list(Statements,Algorithm3,Algorithm4), +interpretbodylp2p(Statements,Algorithm4,Algorithm2), +%%write_full_stop_if_last_item(Statements,Algorithm5,Algorithm2), +!. +interpretbodylp2p(Body,Algorithm1,Algorithm2) :- + Body=[[Statements1|Statements1a]|Statements2], + %trace, + (only_item(Statements1)->(S1="[",S2="]");(S1="(",S2=")")), + not(predicate_or_rule_name(Statements1)), + string_concat(Algorithm1,S1,Algorithm3), + interpretbodylp2p([Statements1],Algorithm3,Algorithm4), + write_comma_if_not_empty_list(Statements1a,Algorithm4,Algorithm5), + interpretbodylp2p(Statements1a,Algorithm5,Algorithm6), + string_concat(Algorithm6,S2,Algorithm6a), + write_comma_and_newline_if_not_empty_list(Statements2,Algorithm6a,Algorithm7), + interpretbodylp2p(Statements2,Algorithm7,Algorithm2), + %%write_full_stop_if_last_item(Statements2,Algorithm8,Algorithm2), + !. + +interpretbodylp2p(Body,Algorithm1,Algorithm2) :- +%trace, + Body=[[Function,[Arguments2a,Arguments2b]]|Statements2], + (Function=[n,equals4]->true;Function=[n,=]), + %interpretstatementlp2p2a(Function,Algorithm1,Algorithm3a,"[]"), + % string_concat(Algorithm3a,"(",Algorithm3d), + interpretstatementlp2p2a(Arguments2a,"",Algorithm2a1,"[]"), + interpretstatementlp2p2a(Arguments2b,"",Algorithm2b1,"[]"), + foldr(string_concat,[Algorithm1,Algorithm2a1,"=",Algorithm2b1],Algorithm2c), + write_comma_if_not_empty_list(Statements2,Algorithm2c,Algorithm2d), + interpretbodylp2p(Statements2,Algorithm2d,Algorithm2),!. 
+ +interpretbodylp2p(Body,Algorithm1,Algorithm2) :- +%trace, + Body=[[Function,[Arguments2a,Arguments2b,Arguments2c]]|Statements2], + Function=[n,F], + (F= (+)->true; (F= (-)->true; (F= (*)->true; F= (/)))), + %interpretstatementlp2p2a(Function,Algorithm1,Algorithm3a,"[]"), + % string_concat(Algorithm3a,"(",Algorithm3d), + interpretstatementlp2p2a(Arguments2a,"",Algorithm2a1,"[]"), + interpretstatementlp2p2a(Arguments2b,"",Algorithm2b1,"[]"), + interpretstatementlp2p2a(Arguments2c,"",Algorithm2c1,"[]"), + foldr(string_concat,[Algorithm1,Algorithm2c1," is ",Algorithm2a1,F,Algorithm2b1],Algorithm2d), + write_comma_if_not_empty_list(Statements2,Algorithm2d,Algorithm2e), + interpretbodylp2p(Statements2,Algorithm2e,Algorithm2),!. + + +interpretbodylp2p(Body,Algorithm1,Algorithm2) :- + Body=[[[n,not],[Statement]]|Statements2], + string_concat(Algorithm1,"not((",Algorithm3), + interpretbodylp2p([Statement],Algorithm3,Algorithm4), + string_concat(Algorithm4,"))",Algorithm5), + write_comma_if_not_empty_list(Statements2,Algorithm5,Algorithm6), + interpretbodylp2p(Statements2,Algorithm6,Algorithm2), + %%write_full_stop_if_last_item(Statements2,Algorithm7,Algorithm2), + !. + +interpretbodylp2p(Body,Algorithm1,Algorithm2) :- + Body=[[[n,or],[Statements1,Statements2]]|Statements3], + string_concat(Algorithm1,"((",Algorithm3), + + interpretbodylp2p([Statements1],Algorithm3,Algorithm4), + string_concat(Algorithm4,");(",Algorithm5), + + interpretbodylp2p([Statements2],Algorithm5,Algorithm6), + string_concat(Algorithm6,"))",Algorithm7), + write_comma_if_not_empty_list(Statements3,Algorithm7,Algorithm8), + interpretbodylp2p(Statements3,Algorithm8,Algorithm2), + %%write_full_stop_if_last_item(Statements3,Algorithm9,Algorithm2), + !. 
+ +interpretbodylp2p(Body,Algorithm1,Algorithm2) :- + Body=[[[n,"->"],[Statements1,Statements2]]|Statements3], + Statements2=[[n,_]|_], + string_concat(Algorithm1,"(",Algorithm3), + interpretbodylp2p([Statements1],Algorithm3,Algorithm4), + string_concat(Algorithm4,"->",Algorithm5), + interpretbodylp2p([Statements2],Algorithm5,Algorithm6), + string_concat(Algorithm6,")",Algorithm7), + write_comma_if_not_empty_list(Statements3,Algorithm7,Algorithm8), + interpretbodylp2p(Statements3,Algorithm8,Algorithm2), + %%write_full_stop_if_last_item(Statements3,Algorithm9,Algorithm2), + !. + +interpretbodylp2p(Body,Algorithm1,Algorithm2) :- + Body=[[[n,"->"],[Statements1,Statements2]]|Statements3], + string_concat(Algorithm1,"(",Algorithm3), + interpretbodylp2p([Statements1],Algorithm3,Algorithm4), + string_concat(Algorithm4,"->(",Algorithm5), + interpretbodylp2p([Statements2],Algorithm5,Algorithm6), + string_concat(Algorithm6,"))",Algorithm7), + write_comma_if_not_empty_list(Statements3,Algorithm7,Algorithm8), + interpretbodylp2p(Statements3,Algorithm8,Algorithm2), + %%write_full_stop_if_last_item(Statements3,Algorithm9,Algorithm2), + !. 
+ +interpretbodylp2p(Body,Algorithm1,Algorithm2) :- + Body=[[[n,"->"],[Statements1,Statements2,Statements2a]]|Statements3], + string_concat(Algorithm1,"(",Algorithm3), + interpretbodylp2p([Statements1],Algorithm3,Algorithm4), + + (Statements2=[[n,_]|_]->( + string_concat(Algorithm4,"->",Algorithm5), + interpretbodylp2p([Statements2],Algorithm5,Algorithm6), + string_concat(Algorithm6,";",Alg7) + ); + ( + string_concat(Algorithm4,"->(",Algorithm5), + interpretbodylp2p([Statements2],Algorithm5,Algorithm6), + string_concat(Algorithm6,");",Alg7) + )), + + (Statements2a=[[n,_]|_]->( + + string_concat(Alg7,"",Algorithm7), + interpretbodylp2p([Statements2a],Algorithm7,Algorithm8), + string_concat(Algorithm8,")",Algorithm9)); + + (string_concat(Alg7,"(",Algorithm7), + interpretbodylp2p([Statements2a],Algorithm7,Algorithm8), + string_concat(Algorithm8,"))",Algorithm9))), + + write_comma_if_not_empty_list(Statements3,Algorithm9,Algorithm10), + interpretbodylp2p(Statements3,Algorithm10,Algorithm2), + %%write_full_stop_if_last_item(Statements3,Algorithm11,Algorithm2), + !. + + +interpretbodylp2p(Body,Algorithm1,Algorithm2) :- +%trace, + Body=[[[n,code]|Statements1]|Statements3], + string_concat(Algorithm1,"{",Algorithm3), + %trace, + interpretbodylp2p(Statements1,"",Algorithm4), + string_strings(Algorithm4,A4), + (append(["("],A5,A4)->true;A5=A4), + %trace, + (append(A6,[")",")"],A5)->append(A6,[")"],A62);A62=A5), + flatten([Algorithm3,%Algorithm4% + A62 + ],A61), + foldr(string_concat,A61,A7), + string_concat(A7,"}",Algorithm7), + + write_comma_if_not_empty_list(Statements3,Algorithm7,Algorithm8), + interpretbodylp2p(Statements3,Algorithm8,Algorithm2), + %%write_full_stop_if_last_item(Statements3,Algorithm9,Algorithm2), + !. 
+ + +interpretbodylp2p(Body,Algorithm1,Algorithm2) :- + %trace, + Body=[[[n,findall],[Statements1,Statements2,Statements2a]]|Statements3], + %string_concat(Algorithm1,"(",Algorithm3), + %interpretstatementlp2p1([Statements1],"",Algorithm4), + interpretstatementlp2p2([Statements1],"",Algorithm4,"[]"), + %string_concat(Algorithm4,"->(",Algorithm5), + %trace, + %interpretstatementlp2p5 + occurrences_of_term([n,_],Statements2,L), + %trace, + (%false% + L>=2 + ->Statements21=Statements2;Statements21=[Statements2]), + + interpretstatementlp2p2(Statements21,"",Algorithm61,%false, + "()"), + + +%findall(_,(member([[n,_]|_],Statements2)),S), +%length(S,L), +(L<2 +->Algorithm61=Algorithm6;(%trace, +foldr(string_concat,["(",Algorithm61,")"],Algorithm6))), + + %interpretbodylp2p([Statements2],"",Algorithm6), + %string_concat(Algorithm6,");(",Algorithm7), + % + %interpretbodylp2p + + interpretstatementlp2p2([Statements2a],"",Algorithm8,"[]" + ), + %string_concat(Algorithm8,"))",Algorithm9), + foldr(string_concat,[Algorithm1,"findall(",Algorithm4,",",Algorithm6,",",Algorithm8,")"],Algorithm9), + write_comma_if_not_empty_list(Statements3,Algorithm9,Algorithm10), + interpretbodylp2p(Statements3,Algorithm10,Algorithm2), + %%write_full_stop_if_last_item(Statements3,Algorithm11,Algorithm2), + !. 
+ +interpretbodylp2p(Body,Algorithm1,Algorithm2) :- + Body=[[[n,comment],[Comment]]|Statements3], + %string_concat(Algorithm1,"(",Algorithm3), + %interpretstatementlp2p1([Statements1],"",Algorithm4), + %string_concat(Algorithm4,"->(",Algorithm5), + %interpretbodylp2p([Statements2],"",Algorithm6), + %string_concat(Algorithm6,");(",Algorithm7), + %interpretstatementlp2p1([Statements2a],"",Algorithm8), + %string_concat(Algorithm8,"))",Algorithm9), + foldr(string_concat,[Algorithm1,%"\n","/*", + Comment%,"*/" + ,"\n" + ],Algorithm10), + %write_comma_if_not_empty_list(Statements3,Algorithm9,Algorithm10), + interpretbodylp2p(Statements3,Algorithm10,Algorithm2), + %%write_full_stop_if_last_item(Statements3,Algorithm11,Algorithm2), + !. + + +interpretbodylp2p(Body,Algorithm1,Algorithm2) :- + Body=[Statement|Statements], + not(predicate_or_rule_name(Statement)), + interpretstatementlp2p1(Statement,Algorithm1,Algorithm3), write_comma_if_not_empty_list(Statements,Algorithm3,Algorithm4), +interpretbodylp2p(Statements,Algorithm4,Algorithm2), +%%write_full_stop_if_last_item(Statements,Algorithm5,Algorithm2), +!. + +interpretbodylp2p(Arguments1,Algorithm1,Algorithm2) :- + Arguments1=[Arguments2|Arguments3], + interpretstatementlp2p2([Arguments2],Algorithm1,Algorithm3a,"[]"), + write_comma_if_not_empty_list(Arguments3,Algorithm3a,Algorithm4), + interpretbodylp2p(Arguments3,Algorithm4,Algorithm2),!. + +write_comma_if_not_empty_list(Statements2,Algorithm6,Algorithm7) :- + (not(Statements2=[])->string_concat(Algorithm6,",",Algorithm7); + Algorithm6=Algorithm7),!. + +write_comma_and_newline_if_not_empty_list(Statements2,Algorithm6,Algorithm7) :- + (not(Statements2=[])->string_concat(Algorithm6,",\n",Algorithm7); + Algorithm6=Algorithm7),!. + +write_full_stop_if_last_item(Statements2,Algorithm8,Algorithm2) :- + ((length(Statements2,A),(A=0%%->true;A=1 + ) + )->string_concat(Algorithm8,".\n",Algorithm2); + Algorithm8=Algorithm2),!. 
+ +write_close_bracket_and_full_stop_if_last_item(Statements2,Algorithm8,Algorithm2) :- + ((length(Statements2,A),(A=0%%->true;A=1 + ))->string_concat(Algorithm8,").\n",Algorithm2); + Algorithm8=Algorithm2),!. + +write_close_bracket_if_last_item(Statements2,Algorithm8,Algorithm2) :- + ((length(Statements2,A),(A=0%%->true;A=1 + ))->string_concat(Algorithm8,")",Algorithm2); + Algorithm8=Algorithm2),!. + +write_close_bracket_and_comma_if_not_empty_list(Statements2,Algorithm6,Algorithm7) :- + (not(Statements2=[])->string_concat(Algorithm6,"),",Algorithm7); + Algorithm6=Algorithm7),!. + +interpretstatementlp2p1(Statement,Algorithm1,Algorithm2) :- + Statement=[[N_or_v,Name]],(N_or_v=n;N_or_v=v), + interpretstatementlp2p2a([N_or_v,Name],Algorithm1,Algorithm2,"[]"),!. + +interpretstatementlp2p1(Statement,Algorithm1,Algorithm2) :- + Statement=[[N_or_v,Name],Arguments],(N_or_v=n;N_or_v=v), + interpretstatementlp2p2a([N_or_v,Name],Algorithm1,Algorithm3a,"[]"), + string_concat(Algorithm3a,"(",Algorithm3), + interpretstatementlp2p2(Arguments,Algorithm3,Algorithm4,"[]"), + string_concat(Algorithm4,")",Algorithm2),!. + +interpretstatementlp2p2([],Algorithm1,Algorithm1,_) :- !. +interpretstatementlp2p2(Arguments1,Algorithm1,Algorithm2,Brackets) :- + Arguments1=[Arguments1a|Arguments2], + interpretstatementlp2p2a(Arguments1a,Algorithm1,Algorithm3,Brackets), + write_comma_if_not_empty_list(Arguments2,Algorithm3,Algorithm4), + interpretstatementlp2p2(Arguments2,Algorithm4,Algorithm2,Brackets),!. + %%write_close_bracket_and_full_stop_if_last_item(Arguments2,Algorithm5,Algorithm2). + +interpretstatementlp2p2b([],Algorithm1,Algorithm1,_) :- !. 
+interpretstatementlp2p2b(Arguments1,Algorithm1,Algorithm2,Brackets) :- + Arguments1=[Arguments1a|Arguments2], + interpretstatementlp2p2a(Arguments1a,Algorithm1,Algorithm3,Brackets), + write_comma_if_not_empty_list(Arguments2,Algorithm3,Algorithm4), + interpretstatementlp2p2b(Arguments2,Algorithm4,Algorithm5,Brackets), + write_close_bracket_if_last_item(Arguments2,Algorithm5,Algorithm2),!. + +interpretstatementlp2p2a(Arguments1,Algorithm1,Algorithm2,Brackets) :- + interpretstatementlp2p5(Arguments1,Name,Brackets), + string_concat(Algorithm1,Name,Algorithm2),!. + +interpretstatementlp2p3(A,B) :- + interpretstatementlp2p5(A,B,"[]"),!. +/* +interpretstatementlp2p3([],"[]") :- +!. +*/ +interpretstatementlp2p5([n,cut],"!",_Brackets) :- !. +interpretstatementlp2p5([n,Name],Name,_Brackets) :- !. +/* +interpretstatementlp2p3([v,Name1],Name2) :- string_concat(A,B,Name1),atom_length(A,1),upcase_atom(A,A1),string_concat(A1,B,Name2),!. +%%interpretstatementlp2p3([],"[]") :- !. +%%interpretstatementlp2p3("","\"\"") :- !. +interpretstatementlp2p3(Term1,Term2) :- +%not(is_list(Term1)), +not(contains_var1([v,_],Term1)), +not(contains_var1([n,_],Term1)), +term_to_atom(Term1,Term1a), + foldr(string_concat,[Term1a],Term2),!. + +interpretstatementlp2p3([Term1|Term1a],Term2) :- interpretstatementlp2p3(Term1,Term3), +(Term1a=[]-> + foldr(string_concat,["[",Term3,"]"],Term2); +(interpretstatementlp2p4(Term1a,Term3a), + foldr(string_concat,["[",Term3,",",Term3a,"]"],Term2))),!. +% string_concat("[",Term3,Term4), string_concat(Term4,"]",Term2),!. + +interpretstatementlp2p4([Term1|Term1a],Term2) :- interpretstatementlp2p3(Term1,Term3), +(Term1a=[]-> + foldr(string_concat,[Term3],Term2); +(interpretstatementlp2p4(Term1a,Term3a), + foldr(string_concat,[Term3,",",Term3a],Term2))),!. +% string_concat("[",Term3,Term4), string_concat(Term4,"]",Term2),!. + +interpretstatementlp2p3(Value1,Value2):-term_to_atom(Value1,Value2),!. 
+*/ + + +%% retry nested term + +interpretstatementlp2p5(A,B,Brackets):- + interpretstatementlp2p5(A,"",B,Brackets). + +interpretstatementlp2p5([],_,"[]","[]") :- !. +interpretstatementlp2p5([],_,"()","()") :- !. + + +interpretstatementlp2p5([v,Name1],_,Name2,_Brackets) :- string_concat(A,B,Name1),atom_length(A,1),upcase_atom(A,A1),string_concat(A1,B,Name2),!. + +interpretstatementlp2p5(C,_,Name2,_Brackets) :- +is_list(C), +forall(member(A,C),A=(_=_)), +findall(C1, +(member(A=B,C),(string(A)->true;(atom(A)->true;(number(A)))), +(string(B)->true;(atom(B)->true;(number(B)))), +C1=(A=B)%concat_list([A,"=",B],C1) +),Name21),term_to_atom(Name21,Name2), +%string_concat(A1,B,Name2), +!. + +interpretstatementlp2p5(A,B1,B,Brackets):- + (Brackets="[]"->Top=true;Top=false), + interpretstatementlp2p5(A,B1,B,Top,Brackets). + + +interpretstatementlp2p5([n,Name],_,Name,_,_Brackets) :- !. + +interpretstatementlp2p5([A,"|",B],_,C,_Top,Brackets) :- +%trace, + (Brackets="[]"->(LB="[",RB="]");(LB="(",RB=")")), + %interpretstatementlp2p5(A,_,A1,Brackets), + interpretstatementlp2p5([A],"",A1,false,Brackets), + %interpretstatementlp2p5(B,_,B1,Brackets), + interpretstatementlp2p5([B],"",B1,false,Brackets), + foldr(string_concat,[LB,A1,"|",B1,RB],C),!. + + +interpretstatementlp2p5(A,B1,B,Top,Brackets):- + A=[], + (Top=true-> + foldr(string_concat,[B1,Brackets],B); + B=Brackets),!. + +% " \"" -> \" \\\"\" +interpretstatementlp2p5(Single_item1,_,Single_item,_,_Brackets) :- + single_item_not_var(Single_item1), + %term_to_atom(Single_item1,Single_item),!. 
+ + %trace, + %(atom(Single_item1)->Single_item1=Single_item; + %(flatten(["\"",Single_item1,"\""],Single_item2), + %foldr(string_concat,Single_item2,Single_item))), + %(Single_item1=""""->trace;true), + ((((atom(Single_item1)->true;string(Single_item1))), + contains_string2(Single_item1,_S) + )-> + (atomic_list_concat(A,"\"",Single_item1), + atomic_list_concat(A,"\\\"",Single_item2), + foldr(string_concat,["\"",Single_item2,"\""],Single_item)); + term_to_atom(Single_item1,Single_item)), + %atom_string(Single_item,Single_item2), + %string_atom2(Single_item,Single_item2), + %atomic_list_concat(A,"\"\"",Single_item2), + %atomic_list_concat(A,"\"",Single_item), + !. +%*/ +contains_string2(Atom,_String) :- + string_strings(Atom,S), + member("\"",S). +/* + string_concat(A,B,Atom), + string_length(A,1), + A="\"", + string_concat(String,C,B), + string_length(C,1), + C="\"",!. +*/ +interpretstatementlp2p5(A,B1,B,Top,Brackets):- +%writeln1(interpretstatementlp2p5(A,B1,B,Top,Brackets)), +(Brackets="[]"->(LB="[",RB="]",C=",");(LB="(",RB=")",C="")), + A=[A1|A2], + (A1=[v,N]->(string_concat(A4,B4,N),atom_length(A4,1),upcase_atom(A4,A11),string_concat(A11,B4,A3)); + interpretstatementlp2p5(A1,"",A3,true,Brackets)), + foldr(string_concat,[B1,A3],B2), + (A2=[]->B6=B2; + ( + %trace, + interpretstatementlp2p5(A2,"",B5,false,Brackets), + %trace, + (A2=[[v,_]]%string_concat("(",_,A3) + ->C1=",";C1=C), + foldr(string_concat,[B2,C1],B3), + foldr(string_concat,[B3,B5],B6))), + (Top=true-> + foldr(string_concat,[LB,B6,RB],B); + B6=B),!. + diff --git a/List-Prolog-to-Prolog-Converter/lp2pconverter2.pl b/List-Prolog-to-Prolog-Converter/lp2pconverter2.pl new file mode 100644 index 0000000000000000000000000000000000000000..8d57957c6ce67b8c4f6a2187a1e75345658922cd --- /dev/null +++ b/List-Prolog-to-Prolog-Converter/lp2pconverter2.pl @@ -0,0 +1,6 @@ +%% lp2p + +%:-include('../listprologinterpreter/listprolog.pl'). +:-include('lp2pconverter1.pl'). 
+:-include('../Prolog-to-List-Prolog/p2lpconverter.pl'). +:-include('bt-p2lp_test1.pl'). diff --git a/Logic-Formula-Finder/LICENSE b/Logic-Formula-Finder/LICENSE new file mode 100644 index 0000000000000000000000000000000000000000..3a27c90f1c2986ac0fb73b4ae956c6205343dfdd --- /dev/null +++ b/Logic-Formula-Finder/LICENSE @@ -0,0 +1,29 @@ +BSD 3-Clause License + +Copyright (c) 2020, Lucian Green +All rights reserved. + +Redistribution and use in source and binary forms, with or without +modification, are permitted provided that the following conditions are met: + +1. Redistributions of source code must retain the above copyright notice, this + list of conditions and the following disclaimer. + +2. Redistributions in binary form must reproduce the above copyright notice, + this list of conditions and the following disclaimer in the documentation + and/or other materials provided with the distribution. + +3. Neither the name of the copyright holder nor the names of its + contributors may be used to endorse or promote products derived from + this software without specific prior written permission. + +THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" +AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE +IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE +DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE +FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL +DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR +SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER +CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, +OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE +OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. 
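The List Prolog → Prolog translation that `lp2p1` (in `lp2pconverter1.pl`) performs can be illustrated with a minimal, hypothetical Python sketch for a tiny subset: flat clauses whose bodies are plain goals, with `[n,Name]` predicate names and `[v,name]` variables (first letter upcased, as in the converter's `upcase_atom` handling). This is not the converter's actual logic — the real code also handles nested terms, `->` grammar rules, comments, `findall`, and more:

```python
def term(t):
    # [v, name] -> Prolog variable (first letter upcased); [n, name] -> predicate name.
    if isinstance(t, list) and len(t) == 2 and t[0] == "v":
        return t[1][0].upper() + t[1][1:]
    if isinstance(t, list) and len(t) == 2 and t[0] == "n":
        return t[1]
    return str(t)

def goal(g):
    # A goal is [Name, Args], e.g. [["n","+"], [["v","a"],["v","b"],["v","c"]]].
    name, args = g
    return term(name) + "(" + ",".join(term(a) for a in args) + ")"

def clause(c):
    # A clause is [Name, Args, ":-", Body]; Body is a list of goals.
    name, args, symbol, body = c
    return goal([name, args]) + symbol + ",".join(goal(g) for g in body) + "."

# test(1) from lpiverify4.pl, per the converter README above:
alg = [["n", "function"], [["v", "a"], ["v", "b"], ["v", "c"]], ":-",
       [[["n", "+"], [["v", "a"], ["v", "b"], ["v", "c"]]]]]
print(clause(alg))  # function(A,B,C):-+(A,B,C).
```

The output matches the README's `test(1)` example, `function(A,B,C):-+(A,B,C).`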
diff --git a/Logic-Formula-Finder/README.md b/Logic-Formula-Finder/README.md new file mode 100644 index 0000000000000000000000000000000000000000..5f4733773bb6d9d04424cc55b36ceae6b2919da8 --- /dev/null +++ b/Logic-Formula-Finder/README.md @@ -0,0 +1,66 @@
+# Logic-Formula-Finder
+Finds formulas with and, or and not from truth-table data.
+
+* logic_ff1.pl
+
+# Getting Started
+
+Please read the following instructions on how to install and run the project on your computer to find logical formulas.
+
+# Prerequisites
+
+* Please download and install SWI-Prolog for your machine at `https://www.swi-prolog.org/build/`.
+
+# 1. Install manually
+
+Download this repository.
+
+# 2. Or Install from List Prolog Package Manager (LPPM)
+
+* Download the LPPM Repository:
+
+```
+mkdir GitHub
+cd GitHub/
+git clone https://github.com/luciangreen/List-Prolog-Package-Manager.git
+cd List-Prolog-Package-Manager
+swipl
+['lppm'].
+lppm_install("luciangreen","Logic-Formula-Finder").
+halt.
+```
+
+# Running
+
+* In Shell:
+`cd Logic-Formula-Finder`
+`swipl`
+`['logic_ff1.pl'].`
+* Run with:
+```
+?- logic_ff0([
+[[[a, true], [b, false],[c,true],[d,false]], [true]],
+[[[a, true], [b, false],[c,false],[d,false]], [true]],
+[[[a, false], [b, false],[c,true],[d,false]], [true]],
+[[[a, true], [b, true],[c,true],[d,false]], [true]],
+[[[a, false], [b, false],[c,false],[d,false]], [true]]
+],F).
+```
+to find formulae that satisfy this set of specs (each spec line ends with its expected result). Some of the results:
+```
+F=[[not,d],[not,[a,and,[d,and,[b,and,c]]]],[not,[a,and,[d,and,[b,or,c]]]],[not,[a,and,[d,and,[c,and,b]]]],[not,[a,and,[d,and,[c,or,b]]]],[not,[a,and,[d,and,[not,[b,and,c]]]]],[not,[a,and,[d,and,[not,[b,or,c]]]]],[not,[a,and,[d,and,[not,[c,and,b]]]]],[not,[a,and,[d,and,[not,[c,or,b]]]]],[not,[a,and,[[d,and,b],and,c]]],[not,[a,and,[[d,and,c],and,b]]],[not,[a,and,[[not,[d,or,c]],and,b]]],[not,[b,and,[d,and,[a,and,c]]]],[not,[b,and,[d,and,[a,or,c]]]],[not,[b,and,[d,and,[c,and,a]]]],[not,[b,and,[d,and,[c,or,a]]]],
+.
+.
+.
+```
+
+# Authors
+
+Lucian Green - Initial programmer - Lucian Academy
+
+# License
+
+I licensed this project under the BSD3 License - see the LICENSE.md file for details
+
+
+
diff --git a/Logic-Formula-Finder/logic_ff1.pl b/Logic-Formula-Finder/logic_ff1.pl new file mode 100644 index 0000000000000000000000000000000000000000..9f8ef422df4dc6a28b2b456b9cf146fe8a4233d2 --- /dev/null +++ b/Logic-Formula-Finder/logic_ff1.pl @@ -0,0 +1,167 @@
+/**
+
+%% Popology > An object can be human like > 24
+
+logic_ff0([
+[[[a, true], [b, false],[c,true],[d,false]], [true]],
+[[[a, true], [b, false],[c,false],[d,false]], [true]],
+[[[a, false], [b, false],[c,true],[d,false]], [true]],
+[[[a, true], [b, true],[c,true],[d,false]], [true]],
+[[[a, false], [b, false],[c,false],[d,false]], [true]]
+],F).
+
+F=[[not,d],[not,[a,and,[d,and,[b,and,c]]]],[not,[a,and,[d,and,[b,or,c]]]],[not,[a,and,[d,and,[c,and,b]]]],[not,[a,and,[d,and,[c,or,b]]]],[not,[a,and,[d,and,[not,[b,and,c]]]]],[not,[a,and,[d,and,[not,[b,or,c]]]]],[not,[a,and,[d,and,[not,[c,and,b]]]]],[not,[a,and,[d,and,[not,[c,or,b]]]]],[not,[a,and,[[d,and,b],and,c]]],[not,[a,and,[[d,and,c],and,b]]],[not,[a,and,[[not,[d,or,c]],and,b]]],[not,[b,and,[d,and,[a,and,c]]]],[not,[b,and,[d,and,[a,or,c]]]],[not,[b,and,[d,and,[c,and,a]]]],[not,[b,and,[d,and,[c,or,a]]]],...
+ +logic_ff0([ [[[a, true]], [false]],[[[a,false]], [true]]],F). +F = [[not, a]]. + +logic_ff0([ [[[a, true]], [true]],[[[a,false]], [false]]],F). +F = [[a]]. + +logic_ff0([ [[[a, true],[b, true]], [false]],[[[a,false],[b, false]], [true]]],F). +F=[[not,a],[not,b],[not,[a,and,b]],[not,[a,or,b]],[not,[b,and,a]],[not,[b,or,a]]] + + +logic_ff0([ [[[a, true],[b, false]], [false]],[[[a,false],[b, true]], [true]]],F). +F = [[not, a]]. +**/ + +logic_ff0(Specs,Formula0) :- + Specs=[[A1,A2]|A3], + findall(B,logic_ff1(A1,A1,A2,B),C), + logic_ff01(A3,C,Formula0), + writeln(Formula0). + +logic_ff01([],Formula,Formula) :- !. +logic_ff01(Specs,Formula1,Formula2) :- + Specs=[[A1,A2]|A3], + findall(B,logic_ff1(A1,A1,A2,B),C), + intersection(Formula1,C,Formula4), + sort(Formula4,Formula5), + logic_ff01(A3,Formula5,Formula2). + + +logic_ff1(Columns1,Columns2,Result,Formula1) :- + member(Column,Columns1), + Column=[Name|_Rest], + delete(Columns2,Column,Columns3), + Formula2=Name, + logic_ff2(Columns1,Columns3,Result,Formula2,Formula3), + list(Formula3,Formula1). +logic_ff2([[A, true]],[],[false],A,[not,A]):-!. +logic_ff2([[A, false]],[],[true],A,[not,A]):-!. +logic_ff2(_Columns1,[],_Result,Formula,Formula). +logic_ff2(Columns1,Columns2,Result,Formula1,Formula2) :- + member(Column,Columns2), + Column=[Name|_Rest], + delete(Columns2,Column,Columns3), + appendlogic(Formula1,Name,Formula3), + logic_ff3(Columns1,Result,Formula3), + logic_ff2(Columns1,Columns3,Result,Formula3,Formula2). + + +logic_ff2(Columns1,Columns2,Result,_Formula1,Formula2) :- + member(Column,Columns2), + Column=[Name|_Rest], + delete(Columns2,Column,Columns3), + appendlogic(%%Formula1, + Name, + Formula3), + logic_ff3(Columns1,Result,Formula3), + logic_ff2(Columns1,Columns3,Result,Formula3,Formula2). + + +logic_ff3(Columns1,Result1,[not,Formula2]) :- + logic_ff3(Columns1,Result3,Formula2), + not(Result3,[],Result1). 
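logic_ff3/3 supplies the truth-table semantics for candidate formulas: [not,F], [F1,and,F2] and [F1,or,F2] terms are evaluated recursively, bottoming out in the column bindings. The same semantics can be sketched as a self-contained evaluator (the predicate names here are assumptions for illustration, not the repository's):

```prolog
% Sketch only: eval(+Formula, +Bindings, -Truth) mirrors what
% logic_ff3/3 computes against its Columns argument.
eval([not,F], B, T) :-
    eval(F, B, T0), negate(T0, T).
eval([F1,and,F2], B, T) :-
    eval(F1, B, T1), eval(F2, B, T2), conj(T1, T2, T).
eval([F1,or,F2], B, T) :-
    eval(F1, B, T1), eval(F2, B, T2), disj(T1, T2, T).
eval(Name, B, T) :-
    atom(Name), member([Name,T], B).

negate(true, false).        negate(false, true).
conj(true, true, true).     conj(true, false, false).
conj(false, true, false).   conj(false, false, false).
disj(false, false, false).  disj(true, false, true).
disj(false, true, true).    disj(true, true, true).

% ?- eval([not,[a,and,b]], [[a,true],[b,false]], T).
% T = true.
```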
+ + +logic_ff3(Columns1,Result1,[Formula1,and,Formula2]) :- + logic_ff3(Columns1,Result2,Formula1), + logic_ff3(Columns1,Result3,Formula2), + and1(Result2,Result3,[],Result1). +logic_ff3(Columns1,Result1,[Formula1,or,Formula2]) :- + logic_ff3(Columns1,Result2,Formula1), + logic_ff3(Columns1,Result3,Formula2), + or(Result2,Result3,Result1). +logic_ff3(Columns,Result1,Formula1) :- + member([Formula1|Result1],Columns). +appendlogic([Formula1,Operator,Formula2],Name2,Formula3) :- + appendlogic(Formula1,Name2,Formula4), + Formula3=[Formula4,Operator,Formula2]. + + + + +appendlogic([Formula1,Operator,Formula2],Name2,Formula3) :- + appendlogic(Formula2,Name2,Formula4), + Formula3=[Formula1,Operator,Formula4]. + +appendlogic([Operator,Formula2],Name2,Formula3) :- + + appendlogic(Formula2,Name2,Formula4), + Formula3=[Operator,Formula4]. + + +appendlogic(Formula1,Name,Formula2) :- + atom(Formula1),append([Formula1],[and,Name],Formula2). +appendlogic(Formula1,Name,Formula2) :- + atom(Formula1), append([Formula1],[or,Name],Formula2). + +appendlogic(Formula1,Name,Formula2) :- + atom(Formula1),append([Formula1],[and,Name],Formula3), + Formula2=[not,Formula3]. +appendlogic(Formula1,Name,Formula2) :- + atom(Formula1), append([Formula1],[or,Name],Formula3), + Formula2=[not,Formula3]. + + +appendlogic(F, +%%Name, +Formula2) :- +%%writeln([f,F]), + %%atom(Formula1), + append([],[not,F],Formula2). + + +unique1(_Remainders,[],UniqueRemainders,UniqueRemainders) :- !. +unique1(Remainders1,Remainder,UniqueRemainders1,UniqueRemainders2) :- + delete(Remainders1,Remainder,Remainders2), + append([Remainder],Remainders2,Remainders3), + unique2(Remainders3,Remainders4,Remainders5,UniqueRemainders1,UniqueRemainders3), + unique1(Remainders5,Remainders4,UniqueRemainders3,UniqueRemainders2). +unique2(Remainders1,Remainder1,Remainders2,UniqueRemainders1,UniqueRemainders2) :- + Remainders1=[Remainder2,Remainder1|Remainders2], + append(UniqueRemainders1,[Remainder2],UniqueRemainders2). 
+unique2(Remainders1,_Remainder1,Remainder2,UniqueRemainders1,UniqueRemainders2) :- + Remainders1=[Remainder2], + append(UniqueRemainders1,[Remainder2],UniqueRemainders2). +intersection1([],_Result1,Result2,Result2). +intersection1(Result1,Result2,Result3,Result4) :- + Result1=[Result6|Results1], + intersection2(Result6,Result2,Result3,Result5), + intersection1(Results1,Result2,Result5,Result4). +intersection2(Result1,Result2,Result7,Result3) :- + member(Result1,Result2), + append(Result7,[Result1],Result3). +intersection2(Result1,Result2,Result7,Result7) :- + not(member(Result1,Result2)). +union(Result1,Result2,Result3) :- + append(Result1,Result2,Result4), + Result4=[Result5|Results], + unique1(Results,Result5,[],Result3). +list(Formula1,Formula2) :- + atom(Formula1),Formula2=[Formula1]. +list(Formula,Formula) :- + not(atom(Formula)). + +and1([true],[true],[],[true]):-!. +and1([false],[true],[],[false]):-!. +and1([true],[false],[],[false]):-!. +and1([false],[false],[],[false]):-!. +or([false],[false],[false]):-!. +or([true],[false],[true]):-!. +or([false],[true],[true]):-!. +or([true],[true],[true]):-!. +not([true],[],[false]):-!. +not([false],[],[true]):-!. diff --git a/Lucian-Academy-Data/.DS_Store b/Lucian-Academy-Data/.DS_Store new file mode 100644 index 0000000000000000000000000000000000000000..d5ce25f2dc9fcc4e30e0cb621431e3f747f9e402 Binary files /dev/null and b/Lucian-Academy-Data/.DS_Store differ diff --git a/Lucian-Academy-Data/LICENSE b/Lucian-Academy-Data/LICENSE new file mode 100644 index 0000000000000000000000000000000000000000..920183ffbdaca89fc6faee40b6de08a6c8a061c7 --- /dev/null +++ b/Lucian-Academy-Data/LICENSE @@ -0,0 +1,29 @@ +BSD 3-Clause License + +Copyright (c) 2021, Lucian Green +All rights reserved. + +Redistribution and use in source and binary forms, with or without +modification, are permitted provided that the following conditions are met: + +1. 
Redistributions of source code must retain the above copyright notice, this
+ list of conditions and the following disclaimer.
+
+2. Redistributions in binary form must reproduce the above copyright notice,
+ this list of conditions and the following disclaimer in the documentation
+ and/or other materials provided with the distribution.
+
+3. Neither the name of the copyright holder nor the names of its
+ contributors may be used to endorse or promote products derived from
+ this software without specific prior written permission.
+
+THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
+AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
+IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
+DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE
+FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
+DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
+SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
+CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
+OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
+OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. diff --git a/Lucian-Academy-Data/README.md b/Lucian-Academy-Data/README.md new file mode 100644 index 0000000000000000000000000000000000000000..3b8297bb96b816bffc9e6fe4ec56e75c59624b9f --- /dev/null +++ b/Lucian-Academy-Data/README.md @@ -0,0 +1,38 @@ +# Lucian-Academy-Data
+
+* Contains the data file used by this repository.
+
+# 2. Or Install from List Prolog Package Manager (LPPM)
+
+* Download the LPPM Repository:
+
+```
+mkdir GitHub
+cd GitHub/
+git clone https://github.com/luciangreen/List-Prolog-Package-Manager.git
+cd List-Prolog-Package-Manager
+swipl
+['lppm'].
+lppm_install("luciangreen","Lucian-Academy-Data"). 
+halt +``` + +# Authors + +Lucian Green - Initial programmer - Lucian Academy + +# License + +I licensed this project under the BSD3 License - see the LICENSE.md file for details diff --git a/Lucian-Academy-Data/Text-to-Object-Name/.DS_Store b/Lucian-Academy-Data/Text-to-Object-Name/.DS_Store new file mode 100644 index 0000000000000000000000000000000000000000..5008ddfcf53c02e82d7eee2e57c38e5672ef89f6 Binary files /dev/null and b/Lucian-Academy-Data/Text-to-Object-Name/.DS_Store differ diff --git a/Lucian-Academy-Data/Text-to-Object-Name/t2on_dict1.txt b/Lucian-Academy-Data/Text-to-Object-Name/t2on_dict1.txt new file mode 100644 index 0000000000000000000000000000000000000000..af94b0d3c108439fd52c4bf0d31f4f3dd1e0a522 --- /dev/null +++ b/Lucian-Academy-Data/Text-to-Object-Name/t2on_dict1.txt @@ -0,0 +1 @@ +[["a","up"],["aa","dash"],["aaa","dash"],["aaa","square"],["aaaaaaaahai","dash"],["aaaaaaaahzk","dash"],["aaaaaaaahzo","dash"],["aaaabbbbcccc","square"],["aaaaddddeeee","square"],["aaab","square"],["aaaccc","square"],["aaaf","square"],["aab","dash"],["aabort","stop"],["aad","n"],["aaf","dash"],["aahs","mouth"],["aalternatives","dash"],["aami","company"],["aardvark",""],["aaron","person"],["aas","dash"],["aay","variable"],["aaz","dash"],["ab","square"],["aba","square"],["ababa","city"],["abacus",""],["abandon","minus"],["abandoned","minus"],["abandoning","down"],["abb","dash"],["abbagnano","person"],["abbb","dash"],["abbcc","square"],["abbey","building"],["abbie","person"],["abbjj","square"],["abbott","person"],["abbreviate","square"],["abbreviated","n"],["abbreviations","square"],["abby","person"],["abc","dash"],["abcc","dash"],["abcda","dash"],["abcde","square"],["abcdefghijklm","dash"],["abcg","dash"],["abcid","dash"],["abd","square"],["abdolmohammadi","person"],["abdomen","stomach"],["abdominal","stomach"],["abducing","up"],["abducingness","down"],["abductive","down"],["abductor","person"],["abdullah","person"],["abdus","person"],["abhi","person"],["abidance","plu
s"],["abide","plus"],["abided","plus"],["abiding","plus"],["abidingness","plus"],["abilities","plus"],["ability","arm"],["ablative","square"],["able","arm"],["abled","plus"],["ablett","person"],["abnegation","minus"],["aboard","dot"],["abolish","xcorrection"],["aboriginal","person"],["abort","xcorrection"],["aborted","xcorrection"],["aborting","box"],["abortion","xcorrection"],["aborts","xcorrection"],["about","dash"],["above","up"],["aboveness","up"],["abple","dash"],["abracadabra","plus"],["abracadabras","plus"],["abrgepag","dash"],["abroad","right"],["abs","plus"],["absdiffmean","n"],["absence","minus"],["absent","zero"],["absolute","n"],["absolutely","n"],["absorb","down"],["absorbed","down"],["absorbent","water"],["absorbing","down"],["abstain","zero"],["abstained","xcorrection"],["abstaining","xcorrection"],["abstract","down"],["abstracted","down"],["abstraction","box"],["abstractions","box"],["abstractly","box"],["abstracts","square"],["absurdia","country"],["absurdity","minus"],["absurdum","minus"],["abundant","plus"],["abuse","minus"],["abuses","minus"],["abusive","minus"],["abut","dash"],["abyss","box"],["ac","dash"],["academia","building"],["academic","person"],["academically","book"],["academics","person"],["academies","building"],["academy","building"],["acae","person"],["acald","person"],["acan","person"],["acatoday","dash"],["acc","right"],["accelerate","right"],["accelerated","n"],["acceleration","n"],["accent",""],["accents","square"],["accept","left"],["acceptability","tick"],["acceptable","tick"],["acceptably","plus"],["acceptance","plus"],["accepted","tick"],["accepting","plus"],["accepts","tick"],["access","tick"],["accessed","down"],["accesses","plus"],["accessible","plus"],["accessing","dash"],["accident","minus"],["accidentally","plus"],["accidentals","up"],["accidents","minus"],["acclimatise","plus"],["accolade","plus"],["accommodate","plus"],["accommodated","plus"],["accommodating","plus"],["accommodation","bed"],["accompanied","plus"],["ac
companies","and"],["accompaniment","note"],["accompany","right"],["accompanying","and"],["accomplish","tick"],["accomplished","plus"],["accomplishments","plus"],["accordance","right"],["accordian","box"],["according","dash"],["accordingly","plus"],["account","plus"],["accountability","plus"],["accountable","plus"],["accountant","person"],["accounted","plus"],["accounting","plus"],["accounts","square"],["accredit","plus"],["accreditation","tick"],["accredited","tick"],["accrediting","plus"],["accrue","up"],["accumulation","up"],["accuracy","tick"],["accurate","tick"],["accurately","equals"],["accusative","square"],["accuses","xcorrection"],["acd","square"],["acetic",""],["ache","line"],["achelois","square"],["aches","minus"],["achievable","plus"],["achieve","plus"],["achieved","tick"],["achievement","plus"],["achievements","plus"],["achiever","person"],["achieves","tick"],["achieving","plus"],["aching","minus"],["acid",""],["acidophilus","ball"],["acids","acid"],["ack","dash"],["acknowledge","plus"],["acknowledged","paper"],["acknowledgement","plus"],["acknowledging","plus"],["acls","dash"],["acm","dash"],["acn","n"],["acnc","company"],["acoba","paper"],["acorn",""],["acos","plus"],["acosh","plus"],["acoustic","ear"],["acoustics","ear"],["acquire","up"],["acquired","tick"],["acquiring","left"],["acquisition","dollarsymbol"],["acronym","a"],["across","right"],["act","box"],["acted","right"],["acting","tick"],["action","tick"],["actioned","plus"],["actions","square"],["activate","plus"],["activated","tick"],["activates","plus"],["activating","plus"],["activation","plus"],["active","plus"],["actively","plus"],["activities","square"],["activity","plus"],["actor","person"],["actors","person"],["actress","person"],["acts","tick"],["actual","a"],["actualisation","plus"],["actualise","plus"],["actually","plus"],["actuarial","dollarsymbol"],["acumen","plus"],["acupressure","square"],["acupuncture","needle"],["acupuncturist","person"],["acur","person"],["acute","minus"],["acuñ
a","person"],["acyclic","right"],["ad","right"],["adam","person"],["adams","person"],["adapt","plus"],["adaptability","plus"],["adapter","right"],["add","plus"],["added","plus"],["addeditems","box"],["addee","square"],["addendum","square"],["adders","plus"],["addfromtolang","plus"],["adding","plus"],["addis","city"],["addition","plus"],["additional","plus"],["additionally","plus"],["additions","plus"],["additive","plus"],["addorsubtract","plus"],["addpos","plus"],["addr","square"],["address","house"],["addressed","tick"],["addressee","person"],["addresses","square"],["addressing","right"],["addrules","plus"],["adds","plus"],["addye","person"],["ade","square"],["adequate","plus"],["adequately","plus"],["adfa","university"],["adhere","equals"],["adhered","equals"],["adhering","equals"],["adhesive","equals"],["adict","paper"],["adie","dash"],["adjacent","equals"],["adjective","plus"],["adjectively","plus"],["adjectives","apple"],["adjourned","plus"],["adjudication","plus"],["adjudicator","person"],["adjust","plus"],["adjustable","n"],["adjusted","right"],["adjusting","plus"],["adjustments","plus"],["adkins","person"],["admin","person"],["administer","plus"],["administered","plus"],["administrated","plus"],["administration","plus"],["administrative","book"],["administrator","person"],["administrators","person"],["adminpw","square"],["adminuser","person"],["admire","plus"],["admired","plus"],["admiring","plus"],["admit","plus"],["admitted","plus"],["admittedly","plus"],["adney","dash"],["ado","plus"],["adolescent","person"],["adolescents","person"],["adopt","up"],["adopted","plus"],["adoption","plus"],["adore","plus"],["adored","plus"],["adorn","plus"],["adorned","plus"],["adorno","person"],["adoro","heart"],["adr","person"],["adrard","person"],["adree","person"],["adrel","dash"],["adrenal","dot"],["adrenaline","ball"],["adrian","person"],["adrl","person"],["ads","square"],["adsense","software"],["adult","person"],["adultery","minus"],["adulthood","n"],["adults","person"
],["adumbrations","square"],["adv","paper"],["advance","minus"],["advanced","a"],["advancement","right"],["advances","right"],["advantage","dash"],["advantageous","plus"],["advantages","plus"],["advent","zero"],["adventure","book"],["adverb","n"],["adverbially","plus"],["adverbs","right"],["adverse","minus"],["advertise","dollarsymbol"],["advertised","square"],["advertisement","square"],["advertisements","plus"],["advertising","square"],["advice","plus"],["advise","plus"],["advised","plus"],["advises","plus"],["advising","plus"],["advisor","person"],["advocate","person"],["advocated","plus"],["advocating","plus"],["adwords","product"],["adye","person"],["ae","dash"],["aea","dash"],["aec","dash"],["aeg","dash"],["aerial","air"],["aerobics","lycra"],["aerosol",""],["aerotopically","air"],["aese","company"],["aest","square"],["aesthetes","person"],["aesthetic","square"],["aesthetics","square"],["aetiological","minus"],["af","dash"],["afa","dash"],["affairs","paper"],["affect","right"],["affected","plus"],["affecting","plus"],["affection","plus"],["affective","plus"],["affects","right"],["affgg","square"],["affiliated","plus"],["affiliations","equals"],["affirm","tick"],["affirmation","tick"],["affirmative","plus"],["affirmed","square"],["affirming","plus"],["affluent","plus"],["afford","dollarsymbol"],["affordable","dollarsymbol"],["afforded","dollarsymbol"],["afgde","square"],["afghanistan","country"],["afield","right"],["afl","football"],["afloat","up"],["afraid","minus"],["afresh","plus"],["africa","continent"],["african","person"],["afro","hair"],["after","plus"],["afterlife","line"],["afternoon","square"],["afterpay","company"],["afterwards","right"],["ag","dash"],["again","two"],["against","minus"],["age","n"],["aged","n"],["agencies","building"],["agency","company"],["agenda","paper"],["agent","person"],["agents","person"],["ages","n"],["aggravate","minus"],["aggregate","n"],["aggress","minus"],["aggression","minus"],["aging","minus"],["agnosticism","box"],["agn
ostics","person"],["agnès","person"],["ago","left"],["agogo","box"],["agps","paper"],["agree","plus"],["agreeable","plus"],["agreed","tick"],["agreeing","plus"],["agreement","tick"],["agreements","tick"],["agreer","plus"],["agrees","plus"],["agricultural","plant"],["agriculture","apple"],["ah","dash"],["aha","plus"],["ahash","hash"],["ahead","plus"],["ahhii","square"],["ahoniemi","person"],["ai","box"],["aiaconnect","dash"],["aid","plus"],["aida","box"],["aidan","person"],["aide","person"],["aided","plus"],["aides","person"],["aiding","plus"],["aids","plus"],["aif","line"],["aiff","line"],["aig","plus"],["aigs","square"],["ailments","minus"],["aim","plus"],["aime","heart"],["aimed","dot"],["aimedness","dot"],["aiming","dot"],["aimingness","dot"],["aims","dot"],["ainger","person"],["aingers","person"],["air","box"],["aircraft",""],["airdrop","down"],["aired","air"],["aireys","city"],["airlifter","aircraft"],["airlock","door"],["airplane","plane"],["airplay","line"],["airplayradio","company"],["airport","plane"],["airs","computer"],["airship","balloon"],["airships","balloon"],["airways","down"],["aiscript","paper"],["aiscripts","square"],["aisha","person"],["aisle","square"],["ajjkk","square"],["aka","right"],["akin","plus"],["al","dash"],["aladdin","person"],["alaloviala","plus"],["alan","person"],["alarm","n"],["albania","country"],["albert","person"],["albie","person"],["albright","person"],["album","line"],["albums","album"],["alcohol",""],["alcoholics","person"],["ald","person"],["aldo","person"],["aleisha","person"],["alejandra","person"],["alert","plus"],["alerts","xcorrection"],["aletheia","eye"],["alex","person"],["alexander","person"],["alexandra","person"],["alexia","person"],["alexis","person"],["alexius","person"],["alfred","person"],["alg","and"],["algae",""],["algal","algae"],["algdict","book"],["algebra","variable"],["algebraic","plus"],["algeria","country"],["algname","and"],["algohedrons","box"],["algorithm","and"],["algorithmic","and"],["algorithmic
ally","and"],["algorithmlibrary","box"],["algorithmname","square"],["algorithms","twobox"],["algorithmstopredicates","square"],["algorithmwriter","software"],["algs","tworectangle"],["algwriter","and"],["algwriterart","paper"],["ali","person"],["alia","plus"],["alien","person"],["alienated","minus"],["alienation","minus"],["aliens","person"],["alight","down"],["alighted","down"],["align","line"],["aligned","line"],["aligning","equals"],["alignment","line"],["aligns","right"],["aliotta","person"],["aliquot","ball"],["alive","plus"],["aliveness","plus"],["all","square"],["allan","person"],["allargs","box"],["allayed","plus"],["alleged","xcorrection"],["allele","n"],["allen","person"],["alleviate","plus"],["alleviated","plus"],["allez","right"],["alliance","threerectangle"],["alliteration","equals"],["allmm","square"],["allnumbers","n"],["allocate","right"],["allocated","right"],["allocating","box"],["allocation","dash"],["allotment","box"],["allow","tick"],["allowable","plus"],["allowance","minus"],["allowed","plus"],["allowing","plus"],["allows","plus"],["allports","building"],["alls","n"],["alluring","plus"],["allusednames","square"],["allusion","dash"],["ally","plus"],["allyson","person"],["almanacs","book"],["almond",""],["almossawi","person"],["almost","zero"],["alnum","square"],["aloft","up"],["alone","one"],["along","right"],["alongside","right"],["alpha","a"],["alphabet","a"],["alphabetic","square"],["alphabetical","square"],["alphanumeric","square"],["alphawheat","plant"],["already","tick"],["alright","tick"],["also","plus"],["alt","button"],["alter","two"],["alterego","software"],["alternate","two"],["alternating","dot"],["alternative","two"],["alternatively","and"],["alternatives","two"],["although","minus"],["althusser","person"],["altinkök","person"],["alto","tube"],["altogether","box"],["altruism","plus"],["aluminum",""],["alumni","person"],["alveolus","ball"],["always","one"],["alynch","person"],["alyson","person"],["alzheimer","minus"],["am","equals"],
["amalgam",""],["amalgamate","right"],["amalgamated","right"],["amalgamating","right"],["amalgamation","right"],["amap","plus"],["amazed","eye"],["amazement","plus"],["amazing","plus"],["amazon","company"],["amazonaws","square"],["ambiguities","minus"],["ambiguous","minus"],["ambitions","plus"],["ambitious","plus"],["ambulance",""],["amd","and"],["ameliorated","plus"],["amen","hand"],["america","country"],["american","person"],["americans","person"],["ames","person"],["amicable","plus"],["amit","person"],["amnesiacs","person"],["among","box"],["amongst","box"],["amoral","minus"],["amorphous","amount"],["amount","n"],["amounts","n"],["amp","plus"],["ampersand","and"],["ampersands","square"],["amphitheatre",""],["amphora",""],["ample","plus"],["amplifier","right"],["amplitude","n"],["amputated","box"],["amsterdam","city"],["an","up"],["anaconda","and"],["anacrusis","note"],["anaeirenate","plus"],["anaesthetised","ball"],["anagram","square"],["anal","eye"],["analects","book"],["analog","right"],["analogies","right"],["analogue","hand"],["analyse","tick"],["analysed","and"],["analyser","plus"],["analyses","tick"],["analysing","and"],["analysis","and"],["analyst","person"],["analytic","and"],["analytical","right"],["analytics","and"],["analyzing","right"],["anangisye","person"],["ananya","person"],["anaphor","right"],["anarchic","plus"],["anarchism","person"],["anarchy","minus"],["anatomical","cell"],["anatomicopathological","plus"],["anatomy","body"],["ancestors","person"],["anchor","x"],["anchored","down"],["anchoring","anchor"],["ancient","minus"],["ancients","dot"],["and",""],["anderson","person"],["andersons","person"],["andorra","country"],["andrew","person"],["andrews","person"],["android","box"],["andy","person"],["anecdote","mouth"],["anemia","minus"],["anfara","person"],["ang","person"],["angel","person"],["angela","person"],["angeles","person"],["angelic","plus"],["angelo","person"],["anger","minus"],["angie","person"],["anginas","minus"],["angle","n"],["angle
d","n"],["angles","n"],["angling","fish"],["anglo","person"],["angola","country"],["angry","minus"],["animal",""],["animalness","animal"],["animalproductsa","box"],["animals",""],["animaly","animal"],["animation","square"],["animations","right"],["animator","person"],["animhighereddetails","paper"],["animhigheredreport","paper"],["animus","dot"],["anita","person"],["anjelica","person"],["ankle",""],["ann","person"],["anna","person"],["anne","person"],["annette","person"],["annie","person"],["anniversary","n"],["annotate","square"],["annotated","square"],["annotation","square"],["announce","mouth"],["announced","mouth"],["announcement","square"],["annoyance","minus"],["annoying","minus"],["annual","square"],["annually","square"],["annuasly","dot"],["annul","xcorrection"],["annunciated","plus"],["anointed","down"],["anon","person"],["anone","right"],["anonymised","square"],["anonymous","square"],["anonymously","square"],["another","plus"],["anothers","dash"],["ans","square"],["anscombe","person"],["anson","person"],["ansto","organisation"],["answer","a"],["answered","square"],["answerer","person"],["answerers","person"],["answering","person"],["answermode","right"],["answers","a"],["ant",""],["antagonism","minus"],["antagonist","person"],["antaki","person"],["antarctica","country"],["antecedant","right"],["antecedants","right"],["antecedent","a"],["antenna",""],["anthem","note"],["anther",""],["anthony","person"],["anthrop","person"],["anthropological","person"],["anthropologically","person"],["anthropologist","person"],["anthropology","person"],["anti","minus"],["antialiased","square"],["antibodies","antibody"],["antibody",""],["anticipate","plus"],["anticipated","plus"],["anticipating","left"],["anticipation","plus"],["anticipatory","minus"],["anticlimax","minus"],["antidepressant","plus"],["antidigressing","equals"],["antigua","country"],["antihallucinogenic","xcorrection"],["antioxidants","ball"],["antiphoclemone","person"],["antipodean","person"],["antipodeanism"
,"person"],["antipsychotic","xcorrection"],["antiquities","n"],["antiquity","n"],["antithesis","xcorrection"],["antiviral","plus"],["anton","person"],["antonia","person"],["antonio","person"],["antonyms","xcorrection"],["antónio","person"],["anu","university"],["anubis","dog"],["anus","tube"],["anxiety","minus"],["anxiousness","minus"],["any","one"],["anyfunction","function"],["anymore","zero"],["anynumber","n"],["anyone","person"],["anything","one"],["anytime","tick"],["anyway","one"],["anywhere","dot"],["anz","bank"],["aorist","right"],["ap","dash"],["apa","square"],["apache","software"],["apart","minus"],["apathetically","zero"],["ape",""],["aperture","box"],["apex","dot"],["aphohedron","box"],["aphor","apple"],["aphorism","square"],["aphors","apple"],["api","software"],["apical","dot"],["apis","software"],["apkajlohf","dash"],["apocalyptic","minus"],["apologies","xcorrection"],["apologise","plus"],["apologised","xcorrection"],["apologising","plus"],["apologize","xcorrection"],["apologized","plus"],["apology","xcorrection"],["apoptosis","n"],["apostrophe","square"],["apostrophes","square"],["app","paper"],["apparatus","box"],["apparent","plus"],["apparently","dot"],["appeal","plus"],["appealed","plus"],["appealing","plus"],["appeals","plus"],["appear","square"],["appearance","square"],["appearances","square"],["appeared","square"],["appearing","tick"],["appears","square"],["append","plus"],["appended","tworectangle"],["appendeof","square"],["appendices","paper"],["appending","threerectangle"],["appendix","box"],["appendlogic","and"],["appendpredlist","plus"],["appends","plus"],["appetising","apple"],["appetite","apple"],["applaud","plus"],["applauded","plus"],["applause","person"],["apple",""],["applef","dash"],["apples","apple"],["applesoft","square"],["applicability","plus"],["applicable","plus"],["applicant","person"],["applicants","person"],["application","box"],["applications","plus"],["applicator","down"],["applied","tick"],["applies","plus"],["apply","pape
r"],["applying","plus"],["applys","right"],["appoint","plus"],["appointed","dot"],["appointer","plus"],["appointing","plus"],["appointment","square"],["appointments","square"],["appreciate","plus"],["appreciated","plus"],["appreciating","plus"],["appreciative","plus"],["approach","right"],["approached","right"],["approaches","right"],["approaching","right"],["appropriate","tick"],["appropriated","right"],["appropriately","plus"],["appropriateness","plus"],["appropriating","right"],["approval","plus"],["approve","plus"],["approved","plus"],["approves","plus"],["approx","line"],["approximate","line"],["approximated","box"],["approximately","box"],["approximations","n"],["apps","box"],["appt","square"],["apr","dash"],["apricot",""],["april","square"],["apropos","tick"],["apt","plus"],["aptitude","plus"],["aqa","square"],["aqf","square"],["aqualung",""],["aquanauts","person"],["aquarium",""],["ar","variable"],["arab","country"],["arabia","country"],["arabic","square"],["arachnid","spider"],["arachnids","spider"],["aran","person"],["arbiter","person"],["arbitrarily","plus"],["arbitrary","n"],["arbitrating","plus"],["arc","semicircle"],["arch",""],["archaeologist","person"],["archaeology","down"],["archaic","minus"],["archduke","person"],["archeological","down"],["archeology","down"],["archibald","person"],["arching","up"],["archipelagos","triangle"],["architect","person"],["architects","person"],["architectural","building"],["architecturally","building"],["architecture","column"],["architectures","building"],["architrave",""],["archival","book"],["archive","paper"],["archived","box"],["arcobaleno","square"],["arcs","semicircle"],["arctic","dot"],["ard","person"],["ardagh","person"],["ardamon","person"],["arduino","and"],["are","equals"],["area","square"],["areas","square"],["arecibos","city"],["arem","dash"],["arems","square"],["aren","minus"],["arg","variable"],["argan",""],["argentina","country"],["args","a"],["arguably","plus"],["argue","dash"],["argued","square"],["a
rgues","square"],["arguing","dash"],["argument","a"],["argumentary","square"],["argumentation","book"],["argumentative","square"],["argumentname","square"],["argumentnumber","n"],["arguments","a"],["argumentslength","n"],["argv","n"],["ari","person"],["aria","ear"],["ariable","dash"],["ariana","person"],["arianne","person"],["arise","up"],["arises","up"],["arising","up"],["aristotelian","person"],["aristotle","person"],["arithmetic","plus"],["arithmetical","plus"],["arities","n"],["arity","n"],["arkann","robot"],["arm",""],["armadale","suburb"],["armadillo",""],["armenia","country"],["armenian","person"],["armor","square"],["armour",""],["arms","arm"],["armshire","square"],["armstrong","person"],["army","person"],["arocket","rocket"],["aroma","rose"],["aromatherapy","bottle"],["aron","person"],["arose","up"],["around","right"],["aroused","plus"],["arpeggio","note"],["arran","person"],["arrange","dot"],["arranged","note"],["arrangement","note"],["arrangements","note"],["arranging","pen"],["array","grid"],["arreola","person"],["arrest","xcorrection"],["arrested","xcorrection"],["arrival","dot"],["arrive","right"],["arrived","dot"],["arrives","dot"],["arriving","dot"],["arrow","right"],["arrows","right"],["arsenic","ball"],["art","square"],["artemis","person"],["arterial","tube"],["arteries","tube"],["artesian","plus"],["arthritis","minus"],["arthur","person"],["artichoke",""],["artichokes","artichoke"],["article","paper"],["articles","book"],["articulate","mouth"],["articulated","up"],["articulating","mouth"],["articulation","tongue"],["artie","person"],["artifice","plus"],["artificial","plus"],["artificially","and"],["artisanal","plus"],["artist","person"],["artistic","plus"],["artistically","plus"],["artists","person"],["arts","square"],["artwork",""],["artworks","box"],["as","equals"],["asa","book"],["asanas","mat"],["asanet","book"],["asap","one"],["asc","n"],["ascend","up"],["ascended","up"],["ascending","up"],["ascends","up"],["ascension","up"],["ascent","up"],[
"ascertain","plus"],["ascertained","plus"],["ascertaining","plus"],["aschonfelder","person"],["ascii","n"],["ascp","book"],["ascribe","right"],["asian","person"],["asic","company"],["aside","right"],["asin","f"],["ask","questionmark"],["asked","questionmark"],["asking","right"],["askpass","square"],["asks","person"],["asleep","bed"],["asn","organisation"],["asp","dash"],["asparagus",""],["aspect","box"],["aspects","dash"],["asperger","minus"],["aspergers","minus"],["aspersion","minus"],["aspires","up"],["aspx","dash"],["ass","paper"],["assassination","minus"],["assault","minus"],["assemble","square"],["assembled","box"],["assembles","box"],["assembly","box"],["asserta","box"],["asserts","plus"],["assertz","dash"],["assess","tick"],["assessable","tick"],["assessed","tick"],["assesses","plus"],["assessing","tick"],["assessment","paper"],["assessments","paper"],["assessor","person"],["assessors","person"],["assets","box"],["assign","equals"],["assigned","right"],["assigning","equals"],["assignment","paper"],["assignments","paper"],["assimilate","equals"],["assimilated","equals"],["assimilation","box"],["assist","plus"],["assistance","plus"],["assistant","person"],["assistants","person"],["assisted","plus"],["assisting","plus"],["assistive","plus"],["assists","plus"],["assoc","association"],["associate","plus"],["associated","dash"],["associates","person"],["associating","right"],["association","company"],["associations","right"],["associative","dash"],["associativity","right"],["assume","plus"],["assumed","plus"],["assumes","tick"],["assuming","left"],["assumptions","square"],["assurance","plus"],["assure","plus"],["assured","plus"],["assures","plus"],["asterisk",""],["asthma","minus"],["astonished","plus"],["astray","minus"],["astrol","star"],["astrology","star"],["astronaut","person"],["astronauts","person"],["astronomy","star"],["astroturf","square"],["asylum","plus"],["async","notequals"],["asynchronous","tworectangle"],["at","dash"],["atacama","city"],["atan","rig
ht"],["atb","dash"],["ate","spoon"],["atest",""],["atheism","zero"],["atheist","zero"],["atheistic","zero"],["atheists","person"],["athlete","person"],["athletes","person"],["ati","company"],["atkinson","person"],["atmosphere","box"],["atmospheric","air"],["atom",""],["atomic","box"],["atoms","box"],["atomtoken","n"],["atrium","room"],["ats","dash"],["attach","box"],["attached","plus"],["attaches","equals"],["attaching","equals"],["attachment","square"],["attachments","paper"],["attack","right"],["attacked","xcorrection"],["attacker","person"],["attackers","minus"],["attacking","minus"],["attacks","minus"],["attain","tick"],["attained","right"],["attaining","plus"],["attainment","hand"],["attempt","tick"],["attempted","plus"],["attemptedly","plus"],["attempting","plus"],["attempts","plus"],["attend","down"],["attendance","plus"],["attendant","person"],["attended","plus"],["attendees","person"],["attender","person"],["attending","tick"],["attention","tick"],["attentive","plus"],["attest","plus"],["attesting","plus"],["attic","room"],["attire","shirt"],["attitude","plus"],["attitudes","mind"],["attorney","person"],["attract","plus"],["attracted","dash"],["attracting","plus"],["attraction","plus"],["attractions","plus"],["attractive","plus"],["attribute","n"],["attributed","dash"],["attributes","square"],["attuned","equals"],["au","country"],["aud","dollarsign"],["audacity","and"],["audible","plus"],["audience","person"],["audient","company"],["audio","note"],["audiobar","square"],["audiobooks","audiobar"],["audioship","company"],["audit","tick"],["audited","tick"],["auditing","plus"],["auditor","person"],["auditorium","room"],["auditory","ear"],["audits","plus"],["aug","square"],["augmentary","up"],["augmentative","plus"],["augmented","up"],["august","square"],["aun","dash"],["aunt","person"],["aural","ear"],["auras","box"],["aurora","light"],["aus","country"],["auspost","company"],["austen","person"],["austin","person"],["australasian","country"],["australia","countr
y"],["australian","person"],["australians","person"],["australis","country"],["austria","country"],["auth","plus"],["authentic","plus"],["authenticate","plus"],["authenticated","plus"],["authenticating","plus"],["authentication","tick"],["authenticity","plus"],["author","person"],["authored","square"],["authorise","plus"],["authorised","plus"],["authoritarian","plus"],["authoritarianism","minus"],["authoritative","plus"],["authorities","person"],["authority","person"],["authorization","plus"],["authorized","plus"],["authors","person"],["authorship","paper"],["autism","minus"],["autist","person"],["auto","tick"],["autobiographical","book"],["autobiography","book"],["autocomplete","tick"],["autocompleted","plus"],["autocorrect","plus"],["autocue","square"],["autogenerator","right"],["autograph","square"],["autoluna","moon"],["automagically","plus"],["automata","and"],["automate","box"],["automated","cycle"],["automates","cycle"],["automatic","circle"],["automatically","and"],["automating","cycle"],["automation","circle"],["automaton","and"],["automator","circle"],["automobile","car"],["autonomous","plus"],["autonomy","circle"],["autorun","right"],["autosaves","plus"],["autumn","square"],["auyeung","person"],["av","dash"],["avail","square"],["availabilities","square"],["availability","tick"],["available","plus"],["avalanche","snow"],["avb","or"],["avenue","road"],["avenues","path"],["average","line"],["averages","line"],["avert","xcorrection"],["avid","plus"],["avignon","city"],["avoid","xcorrection"],["avoided","xcorrection"],["avoiding","xcorrection"],["avoids","xcorrection"],["avon","person"],["avz","dash"],["aw","software"],["awabakal","person"],["await","plus"],["awaiting","plus"],["awake","plus"],["awaked","up"],["awakened","plus"],["award","plus"],["awarded","plus"],["awarding","plus"],["awards","tick"],["awardspace","company"],["aware","tick"],["awareness","plus"],["awarning","xcorrection"],["away","zero"],["awe","plus"],["awn","n"],["awx","square"],["awy","squ
are"],["ax","variable"],["axels","line"],["axes","line"],["axiomatic","and"],["axioms","right"],["axis","line"],["axle","line"],["axolotl",""],["axon","line"],["ay","variable"],["ayur","plus"],["ayurveda","plus"],["ayurvedic","box"],["az","dash"],["azerbaijan","country"],["azu","person"],["azzx","dash"],["añadida","dash"],["añc","plus"],["aś","dash"],["aṃ","plus"],["aṃś","plus"],["b",""],["ba","paper"],["baba","person"],["babe","person"],["babel","city"],["babied","plus"],["babier","plus"],["babies","baby"],["baby",""],["bach","person"],["bachelor","person"],["bachelors","book"],["back","left"],["backbone",""],["backdated","down"],["backdates","left"],["backdrop","square"],["backed","right"],["background","square"],["backgrounds","square"],["backing","line"],["backlog","left"],["backlogged","right"],["backlogs","book"],["backpack","bag"],["backpropagate","left"],["backs","plus"],["backslash",""],["backticks","square"],["backtrace","left"],["backtrack","left"],["backtracking","left"],["backtracks","left"],["backtracktest","tick"],["backtranslated","right"],["backtranslateuntilcorrect","tick"],["backtranslation","square"],["backtranslations","right"],["backtranslationwords","square"],["backup","tworectangle"],["backups","paper"],["backward","left"],["backwards","left"],["bacteria","bacterium"],["bacterial","bacteria"],["bacterium",""],["bad","minus"],["baddie","person"],["badge","square"],["badges","badge"],["badly","minus"],["badminton","shuttlecock"],["badness","minus"],["baduanjin","dash"],["baffled","minus"],["bag",""],["bagel",""],["bagof","box"],["bagpipe",""],["bagpipes",""],["bags","bag"],["bahama","plus"],["bahamas","country"],["bahrain","country"],["bailey","person"],["bain","person"],["bake","bread"],["bakelite",""],["baker","person"],["baking","oven"],["baktraks","line"],["balance","box"],["balanced","equals"],["bald","head"],["balibanuba","apple"],["baljit","person"],["ball",""],["ballantine","person"],["ballet","right"],["balloon",""],["balloonifying","b
alloon"],["balloons","balloon"],["ballot","paper"],["ballots","paper"],["balls","ball"],["balsa","wood"],["balustrade",""],["balustrades","balustrade"],["bam","dash"],["bamboo",""],["bambury","city"],["ban","dash"],["banana",""],["bananas","banana"],["band",""],["bandcamp","company"],["bands","note"],["bandwidth","line"],["bangladesh","country"],["banjo",""],["bank","building"],["banker","person"],["banking","dollarsymbol"],["banks","dollarsymbol"],["banned","xcorrection"],["banner","square"],["banning","xcorrection"],["banquet","apple"],["bar","square"],["barata","person"],["barbados","country"],["barbarians","person"],["barbaric","minus"],["barbecued","barbeque"],["barbeque",""],["barbuda","country"],["barebones","bone"],["barely","plus"],["bargain","plus"],["baricello","cello"],["baritone","person"],["bark","plus"],["barked","line"],["barkly","street"],["barley",""],["barnacle",""],["barnacles","barnacle"],["barnes","person"],["barnyard",""],["barometers","n"],["baroness","person"],["barr","person"],["barra","city"],["barrabool","suburb"],["barracuda","fish"],["barracudas","fish"],["barred","xcorrection"],["barrel",""],["barrelled","barrel"],["barrett","person"],["barricuda","fish"],["barrier","box"],["barriers","minus"],["barrow","wheelbarrow"],["barry","person"],["bars","box"],["bartimes","n"],["bartlett","person"],["barwon","river"],["basal","zero"],["basc","book"],["base","square"],["baseball",""],["basecase","one"],["basecasecondition","dash"],["basecasecondition","tick"],["basecaselistlength","variable"],["based","square"],["baseline","line"],["basement","square"],["basename","square"],["bases","base"],["bash","square"],["bashibazouks","minus"],["basic","line"],["basically","one"],["basics","n"],["basing","square"],["basis","square"],["basket",""],["bass","note"],["bassoon","tube"],["bat",""],["batch","n"],["bath",""],["bathe","water"],["bathed","bath"],["bathers",""],["bathing","water"],["bathroom","room"],["bathtimes","bath"],["baton",""],["batt","person"
],["batter",""],["batteries","battery"],["battery",""],["battle","minus"],["battled","minus"],["bauble","ball"],["baulsch","person"],["baum","person"],["bay","water"],["bayeux","dot"],["bazaar","room"],["bb","xcorrection"],["bbb","dash"],["bbbccc","square"],["bbc","square"],["bbcc","square"],["bbedit","square"],["bbj","square"],["bbjj","square"],["bc","dash"],["bc","square"],["bcb","dash"],["bcba","square"],["bcc","square"],["bcs","square"],["bd","variable"],["bdict","book"],["bdrict","paper"],["be","equals"],["beac","company"],["beach","sand"],["beachball","ball"],["beacon","company"],["beaker",""],["beam","line"],["beamed","right"],["beaming","right"],["beams","line"],["bean",""],["beanie",""],["beaning","right"],["beans","bean"],["bear",""],["beard",""],["bearer","person"],["bearers","person"],["bearing","tick"],["bearings","plus"],["beasoning","right"],["beat","a"],["beating","dot"],["beatles","person"],["beatrice","person"],["beats","note"],["beau","person"],["beautiful","square"],["beauty","face"],["beauvoir","person"],["bec","left"],["became","equals"],["because","left"],["becharm","plus"],["beckett","person"],["beckon","plus"],["beckoning","left"],["become","equals"],["becomes","right"],["becoming","right"],["bed",""],["bedfellow","person"],["bedroom","bed"],["beds","bed"],["bedspread","blanket"],["bedtime","n"],["bee",""],["beef",""],["beek","person"],["beeline","right"],["been","equals"],["bees","bee"],["beese","right"],["beetle",""],["before","minus"],["beforehand","left"],["befriending","plus"],["befu","dash"],["began","zero"],["begged","left"],["begin","zero"],["beginner","person"],["beginners","person"],["beginning","zero"],["beginnings","zero"],["begins","one"],["behalf","right"],["behave","plus"],["behaved","plus"],["behaves","plus"],["behaving","plus"],["behaviour","plus"],["behavioural","right"],["behaviours","right"],["beheld","plus"],["behemoth","monster"],["behind","right"],["bei","person"],["being","person"],["beings","person"],["belarus","coun
try"],["belgium","country"],["belief","square"],["beliefs","tick"],["believable","plus"],["believe","tick"],["believed","plus"],["believers","person"],["believing","plus"],["belize","country"],["bell",""],["bella","person"],["bellissimo","plus"],["bells","box"],["belly","stomach"],["belmont","suburb"],["belong","box"],["belonged","plus"],["belonging","box"],["belongings","box"],["belousoff","person"],["below","down"],["belt",""],["ben","person"],["benbya","person"],["bench",""],["bend","right"],["bender","person"],["bending","line"],["bends","arc"],["beneficence","plus"],["beneficial","plus"],["benefit","plus"],["benefited","plus"],["benefits","plus"],["benefitted","plus"],["benefitting","plus"],["benevolence","plus"],["benfield","person"],["benin","country"],["benincasa","person"],["benisalachocolat","chocolate"],["benjamin","person"],["bent","minus"],["bentwood","wood"],["berkeley","building"],["berklee","company"],["bernadette","person"],["bernie","person"],["bernsteintapes","person"],["berocca","box"],["beroccas","tablet"],["berries","berry"],["berry",""],["bert","person"],["berys","person"],["beside","right"],["besides","right"],["best","a"],["bestow","plus"],["bestseller","dollarsymbol"],["bestthinking","company"],["bet","box"],["beta","tworectangle"],["betray","minus"],["betrayed","minus"],["betroth","down"],["better","plus"],["bettering","plus"],["betting","dollarsymbol"],["between","dash"],["betweenizing","box"],["betweenness","box"],["betweennesses","box"],["beware","xcorrection"],["beyond","dot"],["bf","dash"],["bfs","right"],["bg","square"],["bgv","dash"],["bhagavad","book"],["bhi","person"],["bhutan","country"],["bi","two"],["bias","minus"],["bib","book"],["bibby","person"],["bible","book"],["bibliographic","book"],["bibliography","paper"],["biblioteca","book"],["bicameral","room"],["biceps","muscle"],["bicochemistry","ball"],["bicycle",""],["bid","dollarsymbol"],["bidder","person"],["biddle","plus"],["biden","person"],["bidirectional","dash"],["bieber"
,"person"],["big","box"],["bigger","greaterthan"],["biggest","n"],["biggs","person"],["bilayer","box"],["bildungsroman","book"],["bilingual","book"],["bill","person"],["billing","dollarsymbol"],["billion","line"],["billionaire","dollarsymbol"],["bills","paper"],["biloba","plant"],["bin","box"],["binary","n"],["bind","box"],["binding","square"],["bindings","dash"],["binds","dash"],["bing","person"],["bingo","counter"],["bins","bin"],["bio","paper"],["biochemical","ball"],["biochemist","person"],["biochemistry","ball"],["biodiverse","plus"],["biodiversity","plus"],["bioethanol","box"],["biofeedback","left"],["biography","book"],["biological","book"],["biologically","book"],["biologist","person"],["biology","book"],["biomedical","book"],["biomolecular","ball"],["biomolecules","ball"],["bioschemistry","and"],["biotechnology","right"],["bips","dash"],["birch","person"],["bird",""],["birdcage","cage"],["birds","bird"],["birth","zero"],["birthday","square"],["birthdays","square"],["birthplace","dot"],["birthright","plus"],["births","zero"],["bis","plus"],["biscuit",""],["biscuits","biscuit"],["bismarck","person"],["bissau","country"],["bit","one"],["bitar","person"],["bite","tooth"],["biteful",""],["biting","tooth"],["bitl","square"],["bitmap","square"],["bits","tworectangle"],["bitter","minus"],["bitterness","minus"],["bitwise","one"],["biweekly","two"],["bix",""],["bjelke","person"],["bl","dash"],["black","square"],["blackandwhitehen","duck"],["blackandwhitehenwithblackspeckles","duck"],["blackberry",""],["blackboard",""],["blackcurrant",""],["blackduck","duck"],["blackel","person"],["blackie","duck"],["blackl","square"],["blackler","person"],["blackwell","person"],["blackwhite","square"],["blackye","person"],["bladder",""],["blade",""],["blades","blade"],["blair","person"],["blake","person"],["blame","xcorrection"],["blamed","minus"],["blaming","minus"],["blank","square"],["blanket",""],["blankets","blanket"],["blast","plus"],["blasted","minus"],["ble","dash"],["blemish
es","ball"],["blend","plus"],["blended","box"],["blender",""],["blending","box"],["blendr","company"],["bless","plus"],["blessed","plus"],["blessedness","plus"],["blessing","plus"],["blessings","plus"],["blew","mouth"],["blind","minus"],["blindfold",""],["blindfolded","blindfold"],["blindfolds","blindfold"],["blinding","minus"],["blindly","square"],["bliss","plus"],["blissenobiarella","person"],["blissful","plus"],["blob","ball"],["block","box"],["blockage","minus"],["blockages","ball"],["blocked","box"],["blockedness","box"],["blocking","box"],["blocks","box"],["blog","paper"],["blogger","company"],["blogs","book"],["blogspot","paper"],["blonde","hair"],["blood",""],["bloodshed","minus"],["bloom","flower"],["blossom",""],["blow","right"],["blowers","right"],["blowing","mouth"],["blown","mouth"],["blue","square"],["blueberries","berry"],["blueberry",""],["bluegrass","note"],["bluenature","planet"],["blueprint","paper"],["bluestone",""],["bluff","minus"],["blunden","plus"],["blunt","plus"],["blurred","minus"],["blurring","minus"],["blytheman","person"],["blythman","person"],["bo","minus"],["board","person"],["boarded","up"],["boarding","school"],["boardroom","room"],["boas","person"],["boat",""],["boats","boat"],["bob","person"],["bobbed","up"],["bobbin","ball"],["bobbio","person"],["bobble","ball"],["bock","person"],["bocvfiqavd","dash"],["bodied","person"],["bodies","body"],["bodily","person"],["body","person"],["bodyvars","variable"],["bogged","down"],["boggle","minus"],["boggling","plus"],["bohemian","n"],["boil","n"],["boiled","water"],["boiling","n"],["bold","square"],["bolivia","country"],["bollie","bottle"],["bolly","bottle"],["bologna","city"],["bolshoi","city"],["bolt","twobox"],["bolted","bolt"],["bolting","bolt"],["bolts","bolt"],["bomb","ball"],["bombing","minus"],["bompel","dash"],["bonanza","plus"],["bonaparte","person"],["bond","right"],["bone",""],["bones","bone"],["bongo","plus"],["bonilla","person"],["bonita","plus"],["bonjour","plus"],["bonjoura",
"plus"],["bonkers","minus"],["bonus","plus"],["book",""],["booked","book"],["bookings","tick"],["bookkeeping","plus"],["bookmaking","book"],["bookmark","square"],["bookmarks","paper"],["bookofbadarguments","book"],["books","book"],["bookshelves","box"],["booksmy","book"],["boom","line"],["booming","plus"],["boosted","up"],["boot",""],["booth","and"],["booting","right"],["boots","boot"],["border","square"],["borders","square"],["bore","down"],["bored","minus"],["boring","minus"],["born","one"],["borrow","right"],["borrowed","right"],["borrowing","right"],["boryslawski","person"],["bos","dash"],["bosnia","country"],["boss","person"],["boston","city"],["bot","dot"],["botanic","plant"],["botanical","plant"],["botanist","person"],["botany","plant"],["bote","dash"],["both","tworectangle"],["bothersome","minus"],["botid","n"],["bots","dot"],["botswana","country"],["bottle",""],["bottleneck","bottle"],["bottler","person"],["bottles","bottle"],["bottom","down"],["bottoms","down"],["bought","dollarsymbol"],["boulder","city"],["bounce","ball"],["bounced","up"],["bounces","equals"],["bouncing","ball"],["bound","plus"],["boundaries","square"],["boundary","square"],["bounds","line"],["bountifulness","plus"],["bounty","plus"],["bouquet","flower"],["bourbon","bottle"],["bourdain","person"],["bourgeois","person"],["bourgeoisie","person"],["bout","left"],["bow","dot"],["bowed","bow"],["bowel","organ"],["bowes","person"],["bowie","person"],["bowl",""],["bowled","ball"],["bowlen","person"],["bowler","ball"],["bowling","ball"],["bowls","bowl"],["box",""],["boxes","box"],["boy","person"],["boycott","xcorrection"],["boycotts","minus"],["boys","person"],["bp","dash"],["br","apple"],["bra",""],["brac","box"],["bracket",""],["bracketed","box"],["bracketing","box"],["brackets","box"],["bradbury","person"],["bradfield","person"],["brae","dog"],["braeard","person"],["braille","square"],["brain",""],["brains","brain"],["brainstorm","paper"],["brainstormed","paper"],["brainstorming","square"],["b
raintree","company"],["brainwaves","n"],["brainworks","brain"],["bran",""],["branch","dash"],["branched","right"],["branches","dash"],["branching","dash"],["brand","square"],["brass","tube"],["braun","person"],["brave","plus"],["bravely","plus"],["bravery","plus"],["bravest","plus"],["bravestudios","company"],["brazil","country"],["brazzaville","country"],["brdict","book"],["brdicts","book"],["bread",""],["breadcrumbs","dot"],["breadth","right"],["break","dash"],["breakdown","dash"],["breakdowns","down"],["breakers","box"],["breakfast","plate"],["breaking","xcorrection"],["breaks","two"],["breakthroughs","plus"],["breasdostoning","threerectangle"],["breasdostonings","threerectangle"],["breason","box"],["breasoned","apple"],["breasoner","person"],["breasoners","person"],["breasoning","apple"],["breasoningesquenesses","apple"],["breasoninglimit","n"],["breasoningname","square"],["breasoningocracy","and"],["breasonings","apple"],["breasoningthe","dash"],["breasoningwriter","software"],["breasoniong","apple"],["breasons","square"],["breast",""],["breastoned","box"],["breasts","breast"],["breath","air"],["breathdostonings","step"],["breathe","lung"],["breathed","lung"],["breathes","air"],["breathing","lung"],["breathless","minus"],["breaths","dash"],["breathsoned","sugar"],["breathsoning","plus"],["breathsoningitis","minus"],["breathsonings","berry"],["breathsons","apple"],["breathwork","mouth"],["breavsonings","square"],["breeding","right"],["breeds","person"],["breedsonings","apple"],["breen","apple"],["breenississimos","right"],["breeze","right"],["brenka","person"],["brent","person"],["brett","person"],["brew","glass"],["brian","person"],["brianmoreau","person"],["bricabraced","fabric"],["brichko","person"],["brick",""],["bridge",""],["bridged","bridge"],["bridges","right"],["bridget","person"],["bridging","dash"],["brief","n"],["briefed","paper"],["briefly","n"],["briefs","square"],["bright","ball"],["brighter","n"],["brightness","n"],["brigid","person"],["brillianc
e","plus"],["brilliant","plus"],["brim",""],["brimmed",""],["bring","right"],["bringing","left"],["brings","right"],["bristles","bristle"],["britannica","book"],["british","person"],["brits",""],["brlog","book"],["broach","mouth"],["broad","n"],["broadband","right"],["broadbee","plus"],["broadcast","right"],["broadcaster","person"],["broadcasters","person"],["broadcasthost","right"],["broadcasting","right"],["broadcasts","right"],["broader","plus"],["broadly","plus"],["broadminded","plus"],["broccoli",""],["broke","minus"],["broken","xcorrection"],["brolga","bird"],["bronchi","tube"],["bronze",""],["brook","person"],["brooke","person"],["brooks","person"],["broom",""],["broth","soup"],["brother","person"],["brothers","person"],["brought","right"],["brow",""],["brown","square"],["browse","down"],["browser","square"],["brs","twobox"],["brtdict","variable"],["brth","apple"],["brthdict","book"],["brthlength","n"],["brueckner","person"],["bruising","minus"],["brumaire","person"],["brunei","country"],["bruno","person"],["brush",""],["brushed","brush"],["brushing","right"],["brute","down"],["bryce","person"],["bs","square"],["bsb","book"],["bsc","paper"],["bsd","paper"],["bshot","dash"],["bt","right"],["btw","box"],["bu","dash"],["bubble",""],["bubbles","ball"],["buble","person"],["bucket",""],["buckle",""],["buckled","buckle"],["buckling","plus"],["buckwheat","box"],["bud","ball"],["buddha","person"],["buddhism","buddha"],["buddhist","person"],["buddhists","person"],["buddies","person"],["buddy","person"],["budgerigar","bird"],["budget","dollarsymbol"],["budgeted","dollarsymbol"],["budgeting","dollarsymbol"],["budgy","bird"],["buffer","box"],["buffered","box"],["bug","minus"],["buggy","minus"],["bugs","minus"],["bugsfixed","plus"],["build","box"],["buildable","box"],["builder","up"],["builders","up"],["building",""],["buildings","building"],["builds","box"],["built","box"],["bulbourethral","tube"],["bulgaria","country"],["bulging","ball"],["bulk","n"],["bull",""],["bullet
",""],["bulletin","paper"],["bullies","minus"],["bulls","bull"],["bully","minus"],["bullying","minus"],["bumped","minus"],["bumping","up"],["bumps","up"],["bun",""],["bunches","box"],["bundle","box"],["bung","dash"],["bunn","person"],["bunnies","rabbit"],["bunny","rabbit"],["buns","bun"],["burawoy","person"],["burden","minus"],["burdened","minus"],["burdens","minus"],["bureau","company"],["bureaucrat","person"],["burette","flask"],["burgeoning","plus"],["burger",""],["burgers","burger"],["burglary","minus"],["burik","person"],["burkina","country"],["burma","country"],["burn","minus"],["burned","fire"],["burner","fire"],["burning","fire"],["burnt","minus"],["bursaries","dollarsymbol"],["bursary","dollarsymbol"],["burst","right"],["bursts","up"],["burundi","country"],["bus",""],["bush","plant"],["bushfire","fire"],["busily","plus"],["business","square"],["businesses","company"],["businessman","person"],["businessperson","person"],["bust",""],["busters","xcorrection"],["busy","plus"],["but","minus"],["butler","person"],["butter",""],["buttered","butter"],["butterflies","minus"],["butterfly",""],["butterscotch",""],["buttocks",""],["button","box"],["buttons","square"],["buy","dollarsign"],["buyer","person"],["buyers","person"],["buying","a"],["buys","dollarsymbol"],["buzz","plus"],["buzzy","plus"],["bv","dash"],["bve","dash"],["bw","box"],["bwe","dash"],["bwn","n"],["bx","variable"],["bxydcqw","dash"],["bxzt","dash"],["by","dash"],["bye","n"],["bypass","line"],["byproducts","right"],["byron","person"],["byte","a"],["bz","dash"],["c",""],["c","box"],["ca","dash"],["caaf","square"],["cab","car"],["cabaña","sausage"],["cabbage",""],["cabinet",""],["cable","line"],["cabo","country"],["cached","box"],["caction","hand"],["cadillacs","car"],["caesar","person"],["cafe","building"],["café","building"],["cage","box"],["cake",""],["cal","square"],["calamity","minus"],["calc","plus"],["calcbtremaining","dash"],["calciplus","liquid"],["calcium","ball"],["calculate","plus"],["calcula
ted","equals"],["calculates","equals"],["calculating","equals"],["calculation","equals"],["calculations","and"],["calculator","plus"],["calculus","n"],["calendar","square"],["calf",""],["calibrated","n"],["calibrations","line"],["california","state"],["call","speechballoon"],["callaghan","person"],["called","dash"],["caller","person"],["callibrate","line"],["callibration","line"],["callibrationvalue","variable"],["calligraphy","square"],["calling","speechballoon"],["calliope","box"],["calls","square"],["callterm","square"],["calm","line"],["calmed","plus"],["calming","plus"],["calmness","plus"],["calormen","dash"],["caltech","university"],["calves","calf"],["cam","box"],["cambodia","country"],["cambridge","university"],["came","right"],["camera","box"],["cameras","box"],["cameron","person"],["cameroon","country"],["camilla","person"],["camouflage","shirt"],["camp",""],["campaign","box"],["campaignid","dash"],["campaigning","plus"],["campaigns","plus"],["campbell","person"],["campfire","fire"],["camping","fire"],["campus","square"],["campuses","building"],["camus","person"],["can","dash"],["canada","country"],["canadian","person"],["canadians","person"],["canal","water"],["canaveral","suburb"],["canaverals","up"],["canberra","city"],["canc","xcorrection"],["cancel","xcorrection"],["cancellable","xcorrection"],["cancellation","xcorrection"],["cancellations","xcorrection"],["cancelled","xcorrection"],["cancelling","xcorrection"],["cancels","xcorrection"],["cancer","minus"],["candid","plus"],["candidate","person"],["candidates","person"],["candies","lolly"],["candle",""],["candlelight","candle"],["candles","candle"],["candy","lolly"],["cane",""],["canine","dog"],["canister","box"],["cannibals","person"],["cannot","xcorrection"],["canoe",""],["canon","paper"],["canopy","tree"],["cant","minus"],["cantaloupe",""],["canterbury","city"],["cantilever","line"],["canvas","square"],["cap","up"],["capabilities","plus"],["capability","plus"],["capable","plus"],["capacity","n"],["c
ape","suburb"],["capellic","person"],["caper",""],["capers","caper"],["capillaries","tube"],["capillary","tube"],["capital","square"],["capitalise","up"],["capitalised","up"],["capitalises","up"],["capitalism","dollarsign"],["capitalismmagazine","book"],["capitalist","person"],["capitalists","person"],["capitally","country"],["capitals","up"],["capped","square"],["cappuccino",""],["capricorn","ball"],["caps","square"],["capsicum",""],["capstone","box"],["capsulated","box"],["capsule",""],["captain","person"],["captcha","down"],["caption","square"],["captions","square"],["captivated","tick"],["captives","person"],["capture","box"],["captured","square"],["captures","box"],["car",""],["carafe","jug"],["caramel",""],["caramelised","caramel"],["carat","n"],["carbohydrates","ball"],["carbon",""],["carcinogeenan","minus"],["carcinogen","minus"],["card","square"],["cardboard","square"],["cardio","heart"],["cardiovascular","heart"],["cards","paper"],["care","hand"],["cared","plus"],["career","line"],["careers","line"],["careful","plus"],["carefully","plus"],["carer","person"],["carers","person"],["cares","plus"],["caretaker","person"],["caring","plus"],["caringly","plus"],["caringness","plus"],["carl","person"],["carlisle","street"],["carly","person"],["carnations","flower"],["carnelian","person"],["carob",""],["carofarion","dash"],["carol","person"],["carolina","person"],["carpathians","book"],["carpenter","person"],["carpet",""],["carrageenan","ball"],["carriage","vehicle"],["carriages","vehicle"],["carrie","person"],["carried","box"],["carrier","box"],["carries","box"],["carrot",""],["carrots","carrot"],["carry","box"],["carryable","box"],["carrying","box"],["cars","car"],["carson","person"],["cart","box"],["carter","person"],["carton","box"],["cartridge",""],["carts","box"],["cartwheeling","right"],["carved","right"],["carving","chisel"],["cascade","down"],["case","paper"],["cases","dash"],["cash","dollarsymbol"],["cashew","nut"],["casserole","pot"],["cast","person"],["c
astanets",""],["casting","tick"],["castle",""],["castration","ball"],["casual","plus"],["casually","plus"],["cat",""],["catalan","book"],["catalog","book"],["catalogue","book"],["catastrophe","minus"],["catch","box"],["catcher","box"],["catchier","plus"],["catching","plus"],["catchphrases","square"],["catchy","plus"],["categorical","dash"],["categorically","box"],["categories","square"],["categorised","box"],["categorises","box"],["categorising","box"],["category","square"],["cater","apple"],["catered","apple"],["caterer","person"],["catering","apple"],["cates","dash"],["cathcart","person"],["cathedral",""],["catherine","person"],["catheter","tube"],["catriona","person"],["cats","cat"],["caucas","person"],["caucasian","person"],["caught","xcorrection"],["caulfield","suburb"],["causal","right"],["causality","right"],["causation","right"],["causations","right"],["causative","right"],["cause","right"],["caused","right"],["causes","right"],["causing","right"],["caution","xcorrection"],["caval","company"],["cave","box"],["caveats","plus"],["cavell","person"],["caves","cave"],["caviar",""],["cavity","box"],["caw","asterisk"],["cawa","a"],["cawdel","and"],["cawmp","and"],["cawmpedit","dash"],["cawmptest","tick"],["cawp","and"],["cawplistprolog","square"],["cawps","up"],["cawptest","plus"],["cawptesta","tick"],["cawpverify","plus"],["cawpverifya","tick"],["cawthorne","person"],["cb","dash"],["cba","square"],["cbc","square"],["cbd","square"],["cbswwcc","dash"],["cc","dash"],["cca","variable"],["ccb","variable"],["ccc","square"],["cccbbb","square"],["cccnumber","n"],["ccreep","right"],["cd",""],["cdc","dash"],["cde","dash"],["cded","box"],["cdfour","dash"],["cds","cd"],["ce","right"],["cease","minus"],["ceased","zero"],["cecd","company"],["ceg","right"],["ceiba","tree"],["ceiling","square"],["cele","box"],["celebrant","person"],["celebrate","plus"],["celebrated","plus"],["celebrates","plus"],["celebrating","plus"],["celebration","plus"],["celebrations","plus"],["celebrities",
"person"],["celebrity","person"],["celebs","person"],["celery",""],["celesta","tube"],["celeste","person"],["celia","person"],["celibacy","zero"],["celibate","plus"],["cell","square"],["cellar","box"],["cello",""],["cellophane","square"],["cells","box"],["cellular","cell"],["celsius","n"],["censorship","minus"],["census","paper"],["centanni","person"],["center","dot"],["centers","building"],["centigrade","n"],["centimetre","n"],["centipede",""],["central","dot"],["centralised","one"],["centrality","n"],["centrally","n"],["centre","building"],["centred","n"],["centredness","n"],["centrelink","company"],["centres","dot"],["centro","dot"],["centuries","n"],["century","n"],["ceos","person"],["cerebellum","brain"],["cerebros","person"],["ceremonies","plus"],["ceremony","box"],["ceresio","city"],["cert","square"],["certa","company"],["certain","up"],["certainly","a"],["certificate","paper"],["certification","plus"],["certified","plus"],["cf","dash"],["cfvkai","dash"],["cgi","software"],["ch","variable"],["cha","square"],["chad","country"],["chai","cup"],["chain","line"],["chaine","right"],["chains","line"],["chair",""],["chaisomboon","person"],["chaiwat","person"],["chalice",""],["chalk",""],["chalky","chalk"],["challenge","xcorrection"],["challenged","xcorrection"],["challenges","minus"],["challenging","plus"],["cham","person"],["chamber","room"],["championed","plus"],["chan","person"],["chance","plus"],["chancellor","person"],["chances","plus"],["chandler","person"],["chang","person"],["change","delta"],["changed","minus"],["changegrid","grid"],["changelength","n"],["changelengthh","n"],["changeovers","right"],["changeperson","person"],["changer","person"],["changes","xcorrection"],["changing","xcorrection"],["channel","paper"],["channels","n"],["chant","note"],["chanter","person"],["chants","note"],["chaos","minus"],["chaotic","minus"],["chaplaincies","paper"],["chapter","book"],["chapters","book"],["char","a"],["character","person"],["characterbr","box"],["characterbr
discontinuous","square"],["characterbreasoner","a"],["characterbrscript","paper"],["characterisation","plus"],["characterise","plus"],["characterised","plus"],["characterises","plus"],["characterising","plus"],["characteristic","square"],["characteristics","square"],["characters","a"],["charang","box"],["charge","dollarsymbol"],["charged","plus"],["charging","right"],["charisma","plus"],["charitable","plus"],["charities","zero"],["charity","plus"],["charles","person"],["charley","person"],["charlie","person"],["charlotte","person"],["charm","plus"],["charmer","person"],["charnes","person"],["charobj","apple"],["charon","person"],["chars","a"],["chart","paper"],["charted","square"],["charter","paper"],["charts","chart"],["chased","right"],["chasing","right"],["chasm","box"],["chastity","plus"],["chat","paper"],["chatbot","box"],["chatbots","box"],["chats","square"],["chatted","mouth"],["chatting","speechballoon"],["chaucer","person"],["chauffeur","person"],["cheap","dollarsign"],["cheaper","lessthan"],["cheapest","n"],["cheat","minus"],["cheated","minus"],["cheating","minus"],["cheats","minus"],["chec","plus"],["chechnya","person"],["check","tick"],["checkarguments","tick"],["checkbox","square"],["checked","tick"],["checker","tick"],["checkerboard","plane"],["checkers","tick"],["checking","tick"],["checklist","square"],["checkmate","plus"],["checkout","box"],["checks","tick"],["checktypes","tick"],["cheek","square"],["cheeks","cheek"],["cheeky","minus"],["cheer","plus"],["cheered","plus"],["cheerful","plus"],["cheers","plus"],["cheese",""],["cheeseburger",""],["cheesecake",""],["cheesonings","cheese"],["chef","person"],["chekov","person"],["chem","ball"],["chemical","box"],["chemicals","oxygen"],["chemist","person"],["chemistry","book"],["chen","person"],["cheques","square"],["cher","person"],["cherished","plus"],["cherries","cherry"],["cherry",""],["chess","game"],["chessboard",""],["chest",""],["chestnut",""],["cheung","person"],["chew","mouth"],["chewed","tooth"],
["chewing","tooth"],["chi","n"],["chiat","person"],["chicago","city"],["chicken",""],["chickens","chicken"],["chickpea",""],["chicks","bird"],["chief","person"],["chiff","box"],["child",""],["childbearing","person"],["childhood","line"],["children","person"],["childrenh","person"],["childrens","child"],["chile","country"],["chilled","n"],["chime","note"],["chin","person"],["china","country"],["chindogu","plus"],["chinese","person"],["chinos","pants"],["chip","box"],["chippendale","person"],["chips",""],["chiropractic","muscle"],["chiropractor","person"],["chirp","bird"],["chisel",""],["chiselling","right"],["chisels","chisel"],["chit","mouth"],["chitranishek","person"],["chivalrous","person"],["chloe","person"],["chn","dash"],["cho","person"],["chocolate",""],["chodorow","person"],["choice","dash"],["choicepoint","dot"],["choicepoints","dash"],["choices","tick"],["choir","note"],["choirs","person"],["choke","minus"],["choked","minus"],["chomsks","person"],["chomsky","person"],["chonga","dash"],["choos","box"],["choose","dash"],["chooser","person"],["chooses","person"],["choosing","tick"],["chopped","tworectangle"],["chopping","knife"],["chopstick",""],["chopsticks","chopstick"],["chord","threerectangle"],["chords","note"],["choreography","foot"],["chorus","square"],["chorussoloprogression","square"],["chose","finger"],["chosen","tick"],["chowdorow","person"],["chris","person"],["christ","person"],["christian","person"],["christianity","person"],["christina","person"],["christine","person"],["christmas","box"],["christopher","person"],["christos","person"],["chrome","paper"],["chronicles","book"],["chronological","n"],["chronologically","right"],["chrysalis",""],["chs","book"],["chull","dash"],["chunhui","person"],["chunk","box"],["chunked","box"],["chunker","box"],["chunks","box"],["church","building"],["churchill","person"],["churned","circle"],["churning","circle"],["chutney",""],["chylomicrons","ball"],["cic","dash"],["ciccy","square"],["cine","tablet"],["cinema"
,"square"],["cinematic","line"],["cinematographer","person"],["cinematography","camera"],["cinnamon",""],["circle",""],["circled","circle"],["circles","circle"],["circuit","square"],["circuitry","and"],["circuits","line"],["circular","circle"],["circulate","circle"],["circulated","right"],["circulating","circle"],["circulation","circle"],["circulatory","circle"],["circumcise","box"],["circumference","circle"],["circumnavigated","right"],["circumnavigating","right"],["circumnavigator","circle"],["circumstances","dash"],["circus","tent"],["cire","school"],["cit","university"],["citation","square"],["citations","square"],["cite","dash"],["cited","square"],["cites","square"],["cities","city"],["citing","square"],["citizen","person"],["citizens","person"],["citizenship","paper"],["citric","orange"],["citroen","car"],["citrus","orange"],["city",""],["civil","plus"],["civilians","person"],["civilisation","line"],["civilisations","planet"],["civilly","plus"],["cjem","dash"],["cjwkcajwto","dash"],["ck","dash"],["cking","dash"],["cl","dash"],["claim","paper"],["claimed","plus"],["claiming","square"],["claims","square"],["claire","person"],["clamp",""],["clapped","hand"],["clapping","hand"],["clara","person"],["clare","person"],["clarification","plus"],["clarified","plus"],["clarify","plus"],["clarinet","tube"],["clarity","plus"],["clarke","person"],["clash","cymbal"],["clashed","minus"],["class","square"],["classcentral","company"],["classed","square"],["classes","room"],["classic","paper"],["classical","paper"],["classicalcomposition","note"],["classicalcompositionwithlength","note"],["classicalpop","ball"],["classicalpopcomposition","note"],["classics","book"],["classification","dash"],["classifiers","box"],["classify","dash"],["classifying","box"],["classmate","person"],["classroom","room"],["classrooms","room"],["claudine","person"],["claudius","person"],["clause","square"],["clausen","square"],["clausenum","n"],["clauses","square"],["clavinet",""],["claw",
""],["claws","claw"],["clean","square"],["cleaned","plus"],["cleaner","square"],["cleaning","right"],["cleanliness","plus"],["cleanly","plus"],["cleanmymac","and"],["cleanse","minus"],["cleanup","zero"],["clear","dash"],["clearance","plus"],["cleared","dash"],["clearer","tick"],["clearing","box"],["clearly","dash"],["clearness","equals"],["clears","zero"],["clef","note"],["clegg","person"],["clenched","equals"],["clength","n"],["cleobabus","person"],["cleopatra","person"],["cleric","person"],["clerk","person"],["clet","dash"],["cliches","minus"],["click","dot"],["clicked","equals"],["clicking","down"],["clicks","plus"],["client","person"],["clients","person"],["cliff",""],["clifton","person"],["climate","water"],["climates","air"],["climatology","book"],["climax","dot"],["climb","up"],["climbed","up"],["climber","person"],["climbing","up"],["climbs","up"],["cling","plus"],["clinic","tablet"],["clinical","plus"],["clinically","plus"],["clinicians","person"],["clinics","building"],["clinton","person"],["clip",""],["clipped","down"],["clippered","clipper"],["clippers",""],["clipping","box"],["clips","square"],["clique","triangle"],["cliques","triangle"],["clitoris","ball"],["cloaked","xcorrection"],["clock","n"],["clocked","n"],["clocking","n"],["clocks","clock"],["clockwise","right"],["clockwork","dash"],["clone","plus"],["cloned","plus"],["clones","box"],["cloning","box"],["clop","note"],["close","n"],["closebracket",""],["closed","down"],["closedroundbracket",""],["closedsquarebracket",""],["closely","n"],["closeness","n"],["closer","n"],["closest","one"],["closing","n"],["closure","n"],["cloth","square"],["clothe","pants"],["clothes","coat"],["clotheshorse","rack"],["clothing","shirt"],["cloud",""],["cloudfront","cloud"],["clouds","box"],["cloudy","box"],["clover",""],["cloves","clove"],["clown","person"],["cloying","sugar"],["clozapine","tablet"],["clozapines","tablet"],["clpfd","book"],["clt","right"],["club","room"],["clubs","organisation"],["clues","square"],["
clunky","minus"],["cluster","star"],["clutch","rod"],["cm","n"],["cmd","square"],["cmedsav","box"],["cnn","square"],["cnu","dash"],["cnumber","variable"],["co","company"],["coach","person"],["coaching","plus"],["coalesced","box"],["coalesces","box"],["coalface","square"],["coarse","square"],["coat",""],["coaxed","plus"],["cockatoo","bird"],["cockatoos","bird"],["cockerel","chicken"],["cockfest","bird"],["cocks","person"],["coconut",""],["coconuts","coconut"],["code","variable"],["coded","and"],["codefolder","box"],["codeless","square"],["codenamed","square"],["coder","person"],["coderdojo","and"],["codereuser","tick"],["codes","square"],["codified","plus"],["codify","right"],["coding","and"],["cody","person"],["coefficient","n"],["coffee",""],["coffey","person"],["cog",""],["cognised","mind"],["cognition","and"],["cognitive","brain"],["cognitively","and"],["cognito","brain"],["cognizant","plus"],["coherent","plus"],["cohort","person"],["coiffeur","clippers"],["coil","cylinder"],["coin",""],["coincide","tworectangle"],["coincidences","equals"],["coincidentally","plus"],["coincides","equals"],["coins","coin"],["coit",""],["coits","coit"],["coker","person"],["colander",""],["colanderianisation","colander"],["cold","minus"],["colds","bacterium"],["cole","person"],["colero","person"],["collaborate","plus"],["collaborated","plus"],["collaborating","plus"],["collaboration","person"],["collaborations","dash"],["collaborative","plus"],["collaboratively","plus"],["collaborators","person"],["collage",""],["collapse","down"],["collapsed","down"],["collar",""],["collate","box"],["collated","box"],["collating","box"],["collation","box"],["colleague","person"],["colleagues","person"],["collect","left"],["collected","square"],["collecting","box"],["collection","box"],["collections","box"],["collective","box"],["collectively","box"],["collector","person"],["collects","square"],["collectwords","tworectangle"],["collectwordsdiff","minus"],["college","building"],["colleges","school"],[
"collegiality","plus"],["collide","equals"],["collided","equals"],["colliders","equals"],["colliding","equals"],["collision","equals"],["collude","minus"],["colnerud","person"],["colombia","country"],["colon",""],["colonial","city"],["colonialism","country"],["colonized","country"],["colony","person"],["color","square"],["colorado","city"],["colossal","plus"],["colour","square"],["coloured","square"],["colourful","balloon"],["colouring","square"],["colourless","zero"],["colourlessness","square"],["colours","balloon"],["coltan","person"],["column",""],["columns","line"],["colvin","person"],["com","box"],["comb","dash"],["combat","right"],["combating","xcorrection"],["combats","xcorrection"],["combed","comb"],["combination","multiply"],["combinations","plus"],["combinator","dash"],["combinators","dash"],["combine","one"],["combined","plus"],["combiner","person"],["combines","one"],["combining","plus"],["combo","tworectangle"],["combophil","algorithm"],["combos","tworectangle"],["combs","person"],["combustion","up"],["come","right"],["comedy","book"],["comes","right"],["comfort","person"],["comfortable","seat"],["comfortably","plus"],["comforted","plus"],["comforting","plus"],["comic","book"],["comical","plus"],["comics","book"],["coming","right"],["comm","company"],["comma","box"],["command","square"],["commanddict","book"],["commanded","square"],["commanding","plus"],["commands","plus"],["commaorrightbracketnext","square"],["commaquoteorrightbracketnext","right"],["commas","box"],["commemorating","square"],["commence","dot"],["commenced","dot"],["commendation","plus"],["commendations","plus"],["comment","a"],["commentator","person"],["commentators","person"],["commented","speechballoon"],["commentfinder","square"],["commenting","mouth"],["comments","twobox"],["commentsa","square"],["commerce","dollarsymbol"],["commercial","dollarsymbol"],["commercially","dollarsymbol"],["commission","dollarsymbol"],["commissioned","plus"],["commissioning","plus"],["commit","plus"],["
commitment","plus"],["commitments","plus"],["commits","plus"],["committal","plus"],["committed","plus"],["committee","person"],["committees","person"],["committing","plus"],["commodification","dollarsymbol"],["common","plus"],["commonly","plus"],["communal","box"],["communic","paper"],["communicate","square"],["communicated","right"],["communicating","mouth"],["communication","dash"],["communications","paper"],["communicative","square"],["communicator","book"],["communicators","person"],["communism","equals"],["communist","person"],["communists","person"],["communities","person"],["community","person"],["commutative","equals"],["commutativity","dash"],["comoros","country"],["comp","equals"],["companies","building"],["companions","person"],["companionship","person"],["company","building"],["comparable","equals"],["comparatively","greaterthan"],["compare","greaterthan"],["compared","greaterthan"],["compares","dot"],["comparing","dash"],["comparison","dash"],["comparisonoperator","greaterthan"],["comparisons","greaterthan"],["compartment","box"],["compass","right"],["compassion","plus"],["compassionate","plus"],["compatibility","tick"],["compatible","plus"],["compatibly","plus"],["compcard","square"],["compelling","plus"],["compensate","plus"],["compensated","plus"],["compensating","xcorrection"],["compet","plus"],["compete","xcorrection"],["competed","plus"],["competence","plus"],["competences","plus"],["competencies","tick"],["competency","paper"],["competent","plus"],["competently","plus"],["competing","xcorrection"],["competition","two"],["competitionism","plus"],["competitions","dash"],["competitive","plus"],["competitor","person"],["competitors","person"],["compilation","book"],["compile","down"],["compiled","down"],["compiler","and"],["compilers","right"],["compiles","down"],["compiling","down"],["complain","xcorrection"],["complained","xcorrection"],["complaining","minus"],["complaints","xcorrection"],["complement","dash"],["complementary","dash"],["complemente
d","box"],["complementing","plus"],["complements","dot"],["complete","a"],["completed","tick"],["completely","one"],["completeness","tick"],["completes","n"],["completing","n"],["completion","n"],["complex","square"],["complexes","box"],["complexification","minus"],["complexified","minus"],["complexify","plus"],["complexifying","up"],["complexion","face"],["complexities","n"],["complexity","n"],["compliance","plus"],["complicated","and"],["complicates","minus"],["complicating","minus"],["complies","tick"],["compliment","plus"],["complimentary","plus"],["compliments","plus"],["comply","plus"],["component","square"],["components","dash"],["compose","note"],["composed","box"],["composer","person"],["composex","paper"],["composing","note"],["composition","paper"],["compositions","note"],["compost","box"],["composure","plus"],["compound","tworectangle"],["compounds","tworectangle"],["comprehension","right"],["comprehensive","plus"],["comprehensiveness","plus"],["compress","down"],["compressed","box"],["compressing","down"],["compromise","minus"],["compromises","xcorrection"],["comps","company"],["compulsory","plus"],["computable","and"],["computated","and"],["computation","and"],["computational","and"],["computationalenglish","book"],["computationalism","and"],["computationally","plus"],["computations","and"],["compute","and"],["computed","and"],["computer","box"],["computers","twobox"],["computerscience","and"],["computes","algorithm"],["computing","and"],["comswipl","software"],["con","dash"],["concat","plus"],["concatenate","plus"],["concatenated","plus"],["concatenation","plus"],["concave","circle"],["concavity","box"],["conceal","box"],["concealed","down"],["concealing","square"],["concealment","box"],["conceals","box"],["conceive","right"],["conceived","plus"],["conceivers","person"],["conceiving","one"],["concentrate","tick"],["concentrated","tick"],["concentrating","plus"],["concentration","plus"],["concept","paper"],["conception","plus"],["conceptions","plus"],[
"concepts","square"],["conceptual","box"],["conceptualised","plus"],["conceptualises","plus"],["concern","plus"],["concerned","xcorrection"],["concerning","right"],["concerns","plus"],["concert","note"],["concerted","plus"],["concerto","note"],["concerts","note"],["concise","box"],["concisely","down"],["conclude","square"],["concluded","square"],["concludes","n"],["concluding","n"],["conclusion","square"],["conclusions","square"],["concords","jet"],["concourse","square"],["concrete","box"],["concs","square"],["concurred","plus"],["concurrently","zero"],["cond","equals"],["condensed","down"],["condiments","sauce"],["condition","n"],["conditional","dash"],["conditioned","plus"],["conditioner",""],["conditioning","plus"],["conditions","dash"],["condom",""],["condoms","condom"],["conduct","plus"],["conducted","note"],["conducting","note"],["conductor","person"],["conducts","plus"],["cone",""],["cones","cone"],["confectioner","person"],["confer","dash"],["conference","room"],["conferences","room"],["conferred","right"],["conferring","right"],["confidence","plus"],["confident","plus"],["confidential","square"],["confidentiality","box"],["confidently","plus"],["config","n"],["configuration","dash"],["configurations","n"],["configure","n"],["configured","n"],["confines","box"],["confirm","plus"],["confirmability","plus"],["confirmation","tick"],["confirmatory","plus"],["confirmed","tick"],["confirming","plus"],["confirms","plus"],["conflict","minus"],["conflicted","minus"],["conflicting","minus"],["conflicts","minus"],["conform","plus"],["conformation","box"],["conformations","box"],["conforming","right"],["conformity","plus"],["confront","minus"],["confrontation","minus"],["confronting","minus"],["confronts","minus"],["confucian","person"],["confucius","person"],["confuse","minus"],["confused","minus"],["confusing","minus"],["confusingly","minus"],["confusion","minus"],["congestion","minus"],["conglish","and"],["congo","country"],["congratulating","plus"],["congratulations
","plus"],["congruent","equals"],["congtology","up"],["conjoined","right"],["conjoints","plus"],["conjugational","right"],["conjunction","plus"],["conjunctions","and"],["conjunctivitis","minus"],["conjure","plus"],["conjured","plus"],["conjures","square"],["conjuring","box"],["conklin","person"],["conn","person"],["connect","dash"],["connected","plus"],["connecticut","city"],["connecting","plus"],["connection","plus"],["connectionism","right"],["connectionnumber","n"],["connectionnumberprompt","square"],["connectionprompt","square"],["connections","dash"],["connective","dash"],["connectives","dash"],["connectivity","line"],["connector","right"],["connectors","box"],["connects","plus"],["connoisseur","person"],["connotations","plus"],["conns","dash"],["cons","minus"],["conscience","plus"],["conscious","brain"],["consciously","plus"],["consciousness","plus"],["consciousnesses","plus"],["consecutive","right"],["consent","plus"],["consented","plus"],["consenting","plus"],["consequence","right"],["consequences","dot"],["consequent","a"],["consequential","right"],["consequentialist","person"],["consequents","right"],["conservation","equals"],["conservatism","minus"],["conservatively","n"],["conserve","plus"],["conserved","plus"],["conserving","plus"],["consider","tick"],["considerably","plus"],["considerate","plus"],["consideration","plus"],["considerations","dash"],["considered","plus"],["considering","plus"],["considers","plus"],["consist","box"],["consistency","plus"],["consistent","plus"],["consistently","plus"],["consisting","box"],["consists","dash"],["console","square"],["consolidate","plus"],["consolidated","plus"],["consolidator","plus"],["consortium","company"],["constancy","plus"],["constant","n"],["constantdis","equals"],["constantly","n"],["constants","k"],["constellation","star"],["constituent","box"],["constituents","person"],["constitute","box"],["constituted","equals"],["constituting","box"],["constitution","paper"],["constitutional","book"],["constitutio
ns","book"],["constitutive","plus"],["constrained","plus"],["constraining","n"],["constraint","n"],["constraints","box"],["construct","box"],["constructed","box"],["constructing","box"],["construction","box"],["constructions","box"],["constructive","box"],["constructor","box"],["constructs","box"],["construed","plus"],["consult","plus"],["consultant","person"],["consultants","person"],["consultation","paper"],["consulted","plus"],["consulting","right"],["consume","plus"],["consumed","plus"],["consumer","person"],["consumers","person"],["consuming","left"],["consumption","box"],["cont","right"],["contact","dash"],["contacted","dash"],["contactform","paper"],["contacting","equals"],["contacts","person"],["contagious","minus"],["contain","box"],["contained","box"],["container","box"],["containers","box"],["containing","box"],["contains","square"],["contaminable","minus"],["contaminated","minus"],["contamination","minus"],["contemplated","plus"],["contemplating","plus"],["contemplation","plus"],["contemporary","square"],["contempt","minus"],["content","a"],["contention","two"],["contentions","square"],["contents","square"],["contest","plus"],["contested","xcorrection"],["contesting","plus"],["context","square"],["contexts","box"],["contextsi","square"],["contextual","box"],["contextualisation","plus"],["continent",""],["continental","square"],["continents","square"],["contingent","plus"],["contingents","person"],["continual","right"],["continually","right"],["continuance","right"],["continuation","right"],["continue","right"],["continued","right"],["continues","right"],["continuing","right"],["continuities","right"],["continuity","line"],["continuous","circle"],["continuously","circle"],["contour","right"],["contra","xcorrection"],["contrabass","box"],["contracello","cello"],["contract","paper"],["contracted","down"],["contracting","down"],["contraction","right"],["contractor","person"],["contractors","person"],["contracts","paper"],["contradiction","minus"],["contradic
tions","minus"],["contradictorily","minus"],["contradictory","minus"],["contradicts","xcorrection"],["contradistinction","notequals"],["contrary","minus"],["contrast","minus"],["contrasted","minus"],["contrasting","minus"],["contrected","plus"],["contrection","plus"],["contribute","plus"],["contributed","plus"],["contributes","right"],["contributing","plus"],["contribution","right"],["contributions","box"],["contributor","person"],["contributors","person"],["control","tick"],["controlled","plus"],["controller","right"],["controllers","plus"],["controlling","plus"],["controls","right"],["controversial","minus"],["controversy","minus"],["conundrums","minus"],["convenient","plus"],["convenor","person"],["convention","plus"],["conventional","plus"],["conventionally","plus"],["conventions","tick"],["converged","right"],["convergence","down"],["convergent","right"],["conversants","person"],["conversation","speechballoon"],["conversational","mouth"],["conversations","paper"],["converse","square"],["conversed","square"],["conversing","mouth"],["conversion","right"],["conversions","right"],["convert","right"],["converted","right"],["converter","right"],["converters","right"],["converting","right"],["converts","right"],["convex","circle"],["convey","paper"],["conveyable","right"],["conveyed","plus"],["conveying","paper"],["conveys","right"],["convince","plus"],["convinced","plus"],["convincingly","plus"],["convoluted","minus"],["convoy","person"],["coo","dash"],["coogan","person"],["cook","n"],["cookbook","book"],["cooked","n"],["cookery","apple"],["cookie",""],["cookies","cookie"],["cooking","n"],["cooks","person"],["cool","n"],["cooled","n"],["cooler","lessthan"],["cooling","n"],["coootherlangpairs","tworectangle"],["cooperated","plus"],["cooperation","plus"],["cooperative","plus"],["cooperatives","plus"],["coord","n"],["coordinate","n"],["coordinated","dash"],["coordinates","dot"],["coordination","plus"],["coordinator","person"],["coordinators","person"],["coords","dot"],[
"coordstype","square"],["coorss","note"],["coowithoutlangpairs","square"],["cop","person"],["cope","plus"],["coped","plus"],["copied","two"],["copies","plus"],["copilot","person"],["coping","plus"],["copper",""],["copula","equals"],["copulate","plus"],["copulation","plus"],["copy","paper"],["copyediting","paper"],["copying","plus"],["copyright","tick"],["copyrighted","tick"],["copyrighting","plus"],["copyw","paper"],["copywrite","square"],["copywriter","person"],["copywriting","pen"],["copywritten","plus"],["coral",""],["cord",""],["cordial",""],["cords","dash"],["corduroy","pants"],["core","paper"],["cores","paper"],["cori","dash"],["cork",""],["corks","cork"],["corn",""],["corner","zero"],["cornered","box"],["corners","dot"],["cornerstone","box"],["cornstarch","box"],["cornupcopias","plus"],["coronavirus","virus"],["coronel","person"],["corporal","person"],["corporate","dollarsymbol"],["corporateness","dollarsymbol"],["corporation","company"],["corporations","company"],["corps","person"],["corpuscle","ball"],["corr","xcorrection"],["correct","tick"],["corrected","tick"],["correcting","xcorrection"],["correction","xcorrection"],["corrections","xcorrection"],["correctly","tick"],["correctness","tick"],["correctors","person"],["corrects","tick"],["correl","right"],["correlate","right"],["correlated","dash"],["correlates","dash"],["correlating","dash"],["correlation","dash"],["correlational","right"],["correlations","dash"],["correspond","right"],["corresponded","right"],["correspondences","dash"],["corresponding","dash"],["corresponds","right"],["corridor","room"],["corridors","line"],["corrs","dash"],["corrupt","minus"],["corrupted","minus"],["corruption","minus"],["corrupts","minus"],["cos","right"],["cosmic","plus"],["cosmogony","book"],["cosmol","box"],["cosmological","box"],["cosmologically","plus"],["cosmologies","box"],["cosmologist","person"],["cosmologue","person"],["cosmology","box"],["cosmopolitan","magazine"],["cosmos","box"],["cosms","box"],["cost","doll
arsymbol"],["costa","country"],["costan","person"],["costantino","person"],["costs","dollarsign"],["costume","hat"],["costumes","coat"],["cot","dash"],["cototherlangpairs","square"],["cottage","house"],["cotton","line"],["cottoning","plus"],["cough","mouth"],["coughing","mouth"],["could","dash"],["couldn","xcorrection"],["coun","person"],["council","person"],["councillors","person"],["councils","up"],["counselling","person"],["counsellor","person"],["counsellors","person"],["count","n"],["countable","n"],["countalgorithmfirstline","square"],["countalgorithmsecondline","square"],["counted","n"],["counter","n"],["countered","equals"],["countering","equals"],["countermand","xcorrection"],["counterpart","plus"],["counterparts","equals"],["counters","counter"],["countess","person"],["counting","n"],["countries","square"],["country",""],["countryman","person"],["counts","n"],["counttwouses","n"],["couple","person"],["couples","person"],["coupling","equals"],["courage","plus"],["couresware","dash"],["course","tworectangle"],["coursedesc","square"],["coursera","company"],["courses","book"],["courseware","book"],["coursework","paper"],["court","building"],["courteous","plus"],["courteously","plus"],["courtesy","plus"],["courtier","person"],["courtiers","person"],["courtyard","yard"],["cover","square"],["coverage","square"],["covered","square"],["covering","square"],["covers","square"],["covid","minus"],["cow",""],["cowan","person"],["cowden","person"],["coworkers","person"],["cowperson","person"],["cows","animal"],["cp","right"],["cps","right"],["cpt","n"],["cpu","computer"],["cq","plus"],["cr","line"],["crab",""],["crack","box"],["cracked","right"],["cracking","box"],["cracks","box"],["craft","box"],["crafted","plus"],["crafting","box"],["crane",""],["cranes","crane"],["crang","person"],["cranial","head"],["crank","rod"],["crap","ball"],["crash","minus"],["crashed","minus"],["crashes","down"],["crashing","minus"],["crate","box"],["crates","crate"],["cravitron","plus"],["cra
wled","right"],["crawling","right"],["cream",""],["creaming","cream"],["creams","cream"],["crease","right"],["creases","line"],["create","plus"],["created","person"],["creates","person"],["createview","plus"],["creatine","ball"],["creating","plus"],["creatinine","ball"],["creation","one"],["creationism","plus"],["creative","left"],["creativeinnovationglobal","plus"],["creatively","plus"],["creativity","and"],["creator","person"],["creators","person"],["creature","animal"],["creatures","dog"],["cred","paper"],["credence","plus"],["credentials","plus"],["credibility","plus"],["credible","plus"],["credit","plus"],["credited","plus"],["crediting","plus"],["creditline","card"],["creek",""],["creelman","person"],["creep","right"],["creeper","right"],["cremate",""],["cremated","down"],["cremating","box"],["crepe",""],["crescendo","slash"],["creswell","person"],["crew","person"],["crewmember","person"],["cricket","ball"],["cricos","n"],["cried","minus"],["crime","minus"],["crimes","minus"],["criminal","person"],["criminals","person"],["criminology","xcorrection"],["crises","minus"],["crisis","minus"],["crit","paper"],["criteria","dash"],["criterion","square"],["critic","person"],["criticable","xcorrection"],["critical","xcorrection"],["criticality","xcorrection"],["critically","xcorrection"],["criticism","square"],["critics","person"],["critique","paper"],["critiqued","paper"],["critiques","paper"],["critiquing","plus"],["croaker","frog"],["croatia","country"],["croc","dash"],["crocodile",""],["croissant",""],["cron","software"],["crook","person"],["crooke","person"],["crop","plant"],["crosbie","person"],["crosby","person"],["cross","x"],["crossbow",""],["crossed","minus"],["crossers","person"],["crosses","x"],["crossing","equals"],["crossings","dot"],["crossover","software"],["crossroads","x"],["crossword","square"],["crotchet","line"],["crotchets","note"],["crow",""],["crowd","person"],["crowds","person"],["crowing","line"],["crown",""],["crs","dash"],["crsotvt","dash"],[
"crucial","tick"],["cruise","person"],["crumb",""],["crumpled","down"],["crunches","left"],["crunching","down"],["crush","down"],["crushed","down"],["cry","tear"],["crying","minus"],["crypt","room"],["cryptic","right"],["cryptically","right"],["cryptography","right"],["crystal","box"],["crystals","crystal"],["cs","box"],["csav","dash"],["css","paper"],["cstring","variable"],["cstrings","variable"],["cstringsr","variable"],["cstringsrr","variable"],["csymf","square"],["ct","dash"],["ctat","dash"],["ctc","curve"],["ctinput","square"],["ctobr","box"],["ctrl","box"],["ctt","box"],["cttbacktranslate","square"],["cttinput","n"],["cttorig","box"],["cttorigtran","right"],["cttsimplified","right"],["ctx","dash"],["cub",""],["cuba","country"],["cubby",""],["cube","box"],["cubeful","box"],["cubes","box"],["cubic","box"],["cubie","person"],["cubist","box"],["cud","dash"],["cuddled","circle"],["cuddling","circle"],["cue","square"],["cues","square"],["cuff","cylinder"],["cuisine","tofu"],["cukierman","person"],["cul","plus"],["culinary","apple"],["culminating","up"],["culmination","n"],["culpate","minus"],["culpated","minus"],["cult","paper"],["cultivate","plus"],["cultivated","plus"],["cultivating","plus"],["cultofpedagogy","apple"],["cultural","dash"],["culturally","glass"],["culturaltranslationtool","software"],["culture","box"],["cultures","square"],["culturologists","person"],["culturology","book"],["culturvate","company"],["cumquat",""],["cumulative","n"],["cup",""],["cupboard",""],["cupcake",""],["cupcakes","cupcake"],["cupid","person"],["cupping","box"],["cups","cup"],["cur","dash"],["curable","plus"],["curated","plus"],["curb","stop"],["curbed","xcorrection"],["curd",""],["cure","plus"],["cured","plus"],["curing","plus"],["curious","eye"],["curiously","plus"],["curl","square"],["curled","right"],["curler","cylinder"],["curly","line"],["curr","dash"],["currant",""],["currencies","dollarsymbol"],["currency","square"],["current","tick"],["currentdate","n"],["currently","one
"],["currents","right"],["curricula","book"],["curricular","book"],["curriculum","book"],["currnotification","plus"],["curry",""],["curse","minus"],["cursor","square"],["curtis","person"],["curve","square"],["curved","arc"],["curves","arc"],["curving","right"],["curvy","arc"],["cushion",""],["cushions","cushion"],["cusp","dot"],["cust","person"],["custodians","person"],["custodianship","plus"],["custody","room"],["custom","plus"],["customary","paper"],["customer","person"],["customers","person"],["customisability","plus"],["customisable","plus"],["customisation","plus"],["customisations","plus"],["customise","right"],["customised","dash"],["customiser","box"],["customising","square"],["customized","plus"],["customs","plus"],["custs","person"],["cut","xcorrection"],["cute","plus"],["cuticles","cuticle"],["cutlass",""],["cutoff","n"],["cuts","n"],["cutter",""],["cutting","box"],["cv","paper"],["cve","dash"],["cvs","right"],["cw","book"],["cwd","paper"],["cx","variable"],["cxl","dash"],["cxs","dash"],["cxzrnlnncxplrjmyn","dash"],["cy","variable"],["cyber","computer"],["cybersecurity","plus"],["cyborg","person"],["cycle","circle"],["cycles","circle"],["cyclic","circle"],["cycling","right"],["cyclist","person"],["cylinder",""],["cylindrical","cylinder"],["cymatics","line"],["cymbal",""],["cyprus","country"],["cytokines","cell"],["cz","dash"],["czech","country"],["czechia","country"],["côte","country"],["d","square"],["da","dash"],["dabbed","dot"],["dabble","down"],["dad","person"],["dada","book"],["daddy","person"],["dadirri","person"],["daeea","dash"],["daffodil",""],["daguerreotypes","plus"],["dagwell","person"],["dahl","person"],["daily","square"],["dailyregimen","plus"],["daisies","daisy"],["daisy",""],["dale","person"],["daloz","person"],["dam",""],["damage","minus"],["damaged","minus"],["damages","minus"],["damaging","minus"],["damfjdxt","dash"],["damien","person"],["damper",""],["dampness","water"],["dan","person"],["danby","person"],["dance","right"],["danced","r
ight"],["dancer","person"],["dancing","hand"],["dandm","and"],["danger","minus"],["dangerous","minus"],["dangers","minus"],["dangled","down"],["daniel","person"],["danish","danish"],["danishes",""],["dao","path"],["daodejing","book"],["daoism","book"],["daoismheideggergreen","paper"],["daoist","person"],["daringnesses","plus"],["dark","square"],["darker","square"],["darkest","square"],["darling","person"],["darlings","person"],["darren","person"],["darwin","person"],["das","down"],["dasein","person"],["dash",""],["dashboard","and"],["dashes","dash"],["data","a"],["dataalgdict","book"],["database","box"],["databaseformulafinder","up"],["databases","box"],["datadictionary","square"],["dataflow","right"],["datalayer","line"],["dataless","minus"],["date","square"],["dateandtime","n"],["dates","square"],["datestamp","square"],["datetime","n"],["dative","square"],["daub","down"],["daughter","person"],["daughters","person"],["daunt","minus"],["dave","person"],["davey","person"],["david","person"],["davis","person"],["daw","computer"],["dawkins","person"],["dawn","n"],["dawson","person"],["day",""],["daydob","n"],["daylearned","square"],["daylight","sun"],["days","day"],["daysb","square"],["daysbspeoplearmy","n"],["dayvaluea","variable"],["dayvalueb","variable"],["db","square"],["dba","dash"],["dbase","box"],["dbcc","dash"],["dbpia","paper"],["dbs","square"],["dbw","square"],["dc","dash"],["dcd","dash"],["dcg","square"],["dcgs","dash"],["dcgutils","right"],["dd","dash"],["ddd","dash"],["dddr","dash"],["dde","dash"],["dded","dash"],["de","country"],["dea","person"],["dead","minus"],["deadening","down"],["deadline","n"],["deadlines","n"],["deakin","university"],["deal","plus"],["dealer","person"],["dealing","plus"],["dealings","plus"],["deals","plus"],["dealt","plus"],["dean","person"],["deane","person"],["deanimate","zero"],["deanna","person"],["dear","person"],["dearly","plus"],["death","zero"],["deathbed","bed"],["deaths","dot"],["deb","person"],["debate","two"],["debated"
,"square"],["debates","square"],["debbie","person"],["debian","software"],["debriefing","square"],["debris","box"],["debt","minus"],["debu","dash"],["debug","a"],["debugged","xcorrection"],["debugger","tick"],["debugging","tick"],["debugvalue","n"],["debutanted","plus"],["dec","square"],["decade","n"],["decades","n"],["decay","minus"],["deceiving","minus"],["december","square"],["deception","minus"],["decide","tick"],["decided","tick"],["decides","tick"],["deciding","tick"],["deciduate","down"],["deciduousness","down"],["decimal","n"],["decimetre","tenrectangle"],["decipher","right"],["deciphered","right"],["deciphering","right"],["decision","tick"],["decisions","tick"],["deck","square"],["declaration","paper"],["declare","mouth"],["declared","square"],["decoded","square"],["decoding","right"],["decomp","down"],["decompose","minus"],["decomposeandbuild","right"],["decomposed","down"],["decomposes","down"],["decomposing","down"],["decomposition","minus"],["decompress","up"],["decompressing","up"],["decomps","down"],["deconstruct","down"],["deconstructed","down"],["deconstructing","down"],["deconstruction","down"],["decor","box"],["decorated","square"],["decoration","plus"],["decrease","down"],["decreased","minus"],["decreases","down"],["decreasing","down"],["decrypted","plus"],["decryption","right"],["decurion","dash"],["ded","dash"],["dedicated","plus"],["deduced","right"],["deducer","up"],["deduct","minus"],["deduction","right"],["deductive","up"],["deed","hand"],["deeds","hand"],["deemer","person"],["deeming","plus"],["deep","down"],["deepening","zline"],["deepens","down"],["deeper","down"],["deepest","down"],["deeply","down"],["deeptissue","box"],["deer",""],["def","dash"],["default","one"],["defeasible","plus"],["defeat","minus"],["defeated","xcorrection"],["defeating","xcorrection"],["defence","plus"],["defenceedit","dash"],["defences","dash"],["defend","xcorrection"],["defended","xcorrection"],["defending","plus"],["defensibility","plus"],["defensive","plus"],
["deferens","tube"],["deficiencies","minus"],["deficiency","minus"],["define","dash"],["defined","square"],["defines","square"],["defining","dash"],["definite","plus"],["definitely","plus"],["definition","square"],["definitions","square"],["definitive","plus"],["definitively","plus"],["definitiveness","plus"],["deflect","right"],["deflected","right"],["deflecting","right"],["deflection","right"],["deforestation","tree"],["degradation","minus"],["degrade","down"],["degree","a"],["degreelessness","zero"],["degrees","paper"],["dehiscence","up"],["dehlinger","person"],["deidentified","xcorrection"],["deidentifier","xcorrection"],["deigned","plus"],["deigning","plus"],["deity","person"],["dej","dash"],["del","dash"],["delay","dash"],["delayed","n"],["delect","plus"],["delectable","apple"],["delected","plus"],["delecting","tongue"],["delegate","person"],["delegated","down"],["delegatee","person"],["delegatees","person"],["delegater","person"],["delegates","person"],["delegating","right"],["delegation","right"],["delegator","person"],["delegators","person"],["delete","minus"],["deletea","xcorrection"],["deleted","xcorrection"],["deleteduplicatecode","and"],["deleteduplicates","xcorrection"],["deletefirst","xcorrection"],["deleter","xcorrection"],["deletes","xcorrection"],["deleting","minus"],["deletion","xcorrection"],["deletions","minus"],["deliberate","plus"],["deliberately","plus"],["delicate","plus"],["delicatessen","shop"],["delicious","plus"],["delight","plus"],["delighted","plus"],["delightful","plus"],["delightfully","plus"],["delightited","plus"],["delights","plus"],["delimeter","dash"],["delimit","box"],["delimited","dash"],["delimiter","square"],["deliver","right"],["delivered","right"],["deliverer","person"],["delivering","right"],["delivers","right"],["delivery","right"],["delta",""],["deltoid","muscle"],["delve","down"],["delving","down"],["demand","left"],["demanded","plus"],["demanding","minus"],["demands","minus"],["demarcating","plus"],["dematerialing","b
ox"],["dematerialised","zero"],["dementia","minus"],["demetrioued","plus"],["demetrius","person"],["demo","paper"],["democracy","dot"],["democratic","plus"],["democratically","plus"],["democrats","person"],["demographic","person"],["demographics","person"],["demon","minus"],["demonstrate","plus"],["demonstrated","tick"],["demonstrates","plus"],["demonstrating","plus"],["demonstration","hand"],["demonstrations","box"],["demuted","up"],["den","box"],["dena","city"],["denatured","minus"],["dendrite","right"],["denial","xcorrection"],["denied","xcorrection"],["denies","xcorrection"],["denis","person"],["denmark","country"],["denominator","n"],["denoting","square"],["dense","n"],["density","dot"],["dental","tooth"],["dentist","person"],["dentistry","tooth"],["denuded","plus"],["denured","plus"],["deny","minus"],["denzin","person"],["deontological","person"],["deontology","person"],["deoxygenated","minus"],["departing","right"],["department","building"],["departmental","book"],["departments","paper"],["departure","right"],["depend","right"],["dependability","plus"],["dependable","plus"],["depended","left"],["dependence","down"],["dependencies","up"],["dependency","left"],["dependent","left"],["dependents","up"],["depending","left"],["depends","left"],["depict","square"],["depicted","square"],["depicting","square"],["depicts","right"],["depleting","minus"],["deploy","right"],["deployment","box"],["deposit","dollarsymbol"],["deposited","dollarsymbol"],["depositing","dollarsymbol"],["deprecate","xcorrection"],["deprecated","minus"],["depressed","minus"],["depression","minus"],["dept","building"],["depth","n"],["depts","building"],["deputy","person"],["dequals","company"],["deransart","dash"],["derby","square"],["dereconstruction","down"],["derhuerst","person"],["derision","minus"],["derivability","right"],["derive","plus"],["derived","right"],["deriving","right"],["dermatol","person"],["dermatological","square"],["derrida","person"],["derridean","person"],["des","person"],["
desc","square"],["descartes","person"],["descend","down"],["descended","down"],["descends","down"],["describe","square"],["described","square"],["describes","dash"],["describing","square"],["descrip","square"],["description","square"],["descriptioncombination","dash"],["descriptionedit","dash"],["descriptions","square"],["descriptiontext","square"],["descriptive","square"],["descriptor","square"],["descriptors","square"],["desensi","person"],["desert","sand"],["deserve","tick"],["deserved","plus"],["deservedness","plus"],["deservingly","plus"],["desiccated","down"],["desiccating","down"],["desiderative","plus"],["design","line"],["designated","plus"],["designates","plus"],["designed","tick"],["designer","person"],["designers","person"],["designing","square"],["designs","square"],["desirable","plus"],["desire","plus"],["desired","plus"],["desires","plus"],["desiring","plus"],["desk",""],["desks","desk"],["desktop","square"],["despite","xcorrection"],["despotic","minus"],["desseminate","down"],["dessert","apple"],["desserts","dessert"],["dessicated","down"],["dest","dot"],["destination","dot"],["destinations","dot"],["destined","dot"],["destiny","dot"],["destroy","xcorrection"],["destroyed","xcorrection"],["destroying","xcorrection"],["destruction","minus"],["destructive","minus"],["det","square"],["detach","notequals"],["detached","plus"],["detail","dash"],["detailed","dot"],["detailedly","plus"],["detailer","square"],["detailers","square"],["detailing","dash"],["details","box"],["detailsoptionally","dash"],["detect","equals"],["detectable","plus"],["detected","tick"],["detecting","tick"],["detection","equals"],["detector","box"],["detectors","eye"],["detects","person"],["detergent",""],["determination","n"],["determine","equals"],["determined","plus"],["determiner","plus"],["determines","right"],["determining","tick"],["determinism","plus"],["deterministic","equals"],["deterred","xcorrection"],["detlog","book"],["detour","right"],["detoured","right"],["detracting","
minus"],["detrimental","minus"],["detritus","ball"],["deutscher","person"],["deuxième","box"],["dev","square"],["develop","plus"],["developed","plus"],["developedly","plus"],["developedness","plus"],["developer","person"],["developers","person"],["developing","box"],["development","plus"],["developmental","box"],["developmentally","plus"],["developments","plus"],["develops","box"],["deviations","plus"],["device","and"],["devices","dash"],["devil","person"],["devise","square"],["devised","plus"],["devising","square"],["devoid","minus"],["devoted","plus"],["devotion","plus"],["devour","mouth"],["dewzguiy","dash"],["dexterous","finger"],["df","dash"],["dfa","line"],["dfb","dash"],["dff","dash"],["dfs","down"],["dgl","dash"],["dh","dash"],["di","dash"],["diagnose","tick"],["diagnosed","square"],["diagnoses","plus"],["diagnosing","square"],["diagnosis","tick"],["diagnostic","tick"],["diagnostics","tick"],["diagonal","line"],["diagonally","right"],["diagram","square"],["diagrams","square"],["dial","circle"],["dialect","square"],["dialectic","paper"],["dialectics","paper"],["dialectise","square"],["dialled","n"],["dialogical","square"],["dialogically","square"],["dialogue","paper"],["diameter","n"],["diamond",""],["diamonds","box"],["dian","person"],["diana","person"],["dianne","person"],["diaphragm",""],["diareasoner","square"],["diaries","book"],["diarrhoea","minus"],["diary","book"],["dice",""],["dichotomies","two"],["dichotomy","dash"],["dick","person"],["dickson","person"],["dickybird","bird"],["dicotyledon","flower"],["dict","book"],["dictate","mouth"],["dictated","mouth"],["dictation","mouth"],["dictators","person"],["diction","tongue"],["dictionaries","book"],["dictionary","book"],["dicts","book"],["did","dash"],["didactic","plus"],["diddly","minus"],["didgeridoo",""],["didn","xcorrection"],["dido","person"],["die","minus"],["died","n"],["dies","minus"],["diet","apple"],["dietary","apple"],["diets","apple"],["dif","variable"],["diff","minus"],["differ","minus"],["d
iffered","two"],["difference","minus"],["differencebetween","minus"],["differences","minus"],["differencet","minus"],["different","minus"],["differential","plus"],["differentiate","minus"],["differentiated","two"],["differentiating","minus"],["differentiation","minus"],["differentiations","plus"],["differently","minus"],["differing","notequals"],["differs","minus"],["difficult","minus"],["difficulties","xcorrection"],["difficultly","minus"],["difficulty","minus"],["différence","minus"],["dig","down"],["digest","stomach"],["digested","stomach"],["digesting","stomach"],["digestion","stomach"],["digestive","intestines"],["digging","down"],["digicon","square"],["digit","n"],["digital","finger"],["digitally","n"],["digits","n"],["dignified","plus"],["dilated","right"],["dilemma","minus"],["dilemmas","minus"],["diligence","plus"],["diligent","plus"],["dillet","person"],["dilucidated","plus"],["diluted","water"],["dim","n"],["dimension","line"],["dimensional","n"],["dimensionally","box"],["dimensions","box"],["diminished","down"],["diminishing","down"],["dimly","n"],["dinacarya","square"],["dine","carrot"],["dined","mouth"],["diners","person"],["ding","dash"],["dinicolantonio","person"],["dining","potato"],["dinner","potato"],["dinosaur",""],["dion","person"],["diorama","box"],["dioxide",""],["dip","paper"],["dipankar","person"],["diploma","paper"],["diplomas","paper"],["diplomat","person"],["dipped","down"],["dir","box"],["direct","down"],["direct","right"],["directed","right"],["directing","dot"],["direction","right"],["directional","right"],["directionality","right"],["directiondict","book"],["directiondifferencet","d"],["directiondt","d"],["directionend","variable"],["directionlength","n"],["directionlengtht","n"],["directions","right"],["directionstring","variable"],["directionstringth","variable"],["directionvalues","variable"],["directionx","variable"],["directiony","variable"],["directionz","variable"],["directive","right"],["directly","line"],["director","person"]
,["directories","box"],["directors","person"],["directory","paper"],["dirt",""],["dirty","minus"],["dis","variable"],["disab","minus"],["disabilities","minus"],["disability","minus"],["disabled","minus"],["disadvanced","dash"],["disadvantage","minus"],["disadvantaged","minus"],["disadvantages","minus"],["disagree","xcorrection"],["disagreed","xcorrection"],["disagreeing","xcorrection"],["disagreement","minus"],["disagreements","xcorrection"],["disagrees","xcorrection"],["disallow","xcorrection"],["disambiguate","box"],["disambiguated","right"],["disandclass","variable"],["disappear","zero"],["disappeared","xcorrection"],["disappearing","minus"],["disappears","xcorrection"],["disappointedly","minus"],["disappointing","minus"],["disarm","xcorrection"],["disarmed","plus"],["disarming","xcorrection"],["disaster","minus"],["disc",""],["discard","xcorrection"],["discarded","xcorrection"],["discernable","plus"],["discernible","plus"],["discerning","plus"],["discharge","right"],["discharging","right"],["discipiated","plus"],["discipled","right"],["disciples","person"],["disciplinary","xcorrection"],["discipline","book"],["disciplined","plus"],["disciplines","book"],["disciplining","plus"],["disclaimed","plus"],["disclaimer","plus"],["disclass","variable"],["disclosing","right"],["disclosure","plus"],["disclosures","eye"],["discomfort","minus"],["disconcertedness","minus"],["disconnect","notequals"],["disconnected","xcorrection"],["disconnects","notequals"],["discontiguous","line"],["discontinued","xcorrection"],["discount","minus"],["discounted","xcorrection"],["discounts","dollarsymbol"],["discourage","xcorrection"],["discouraged","xcorrection"],["discours","paper"],["discourse","square"],["discourses","book"],["discover","plus"],["discoverable","plus"],["discovered","tick"],["discoveries","plus"],["discovering","plus"],["discovers","plus"],["discovery","plus"],["discredit","minus"],["discredited","xcorrection"],["discreet","plus"],["discriminant","notequals"],["discrimina
te","minus"],["discrimination","minus"],["discriminatory","minus"],["discripiated","plus"],["discs","circle"],["discursive","dash"],["discus",""],["discuses","discus"],["discuss","mouth"],["discussed","square"],["discusses","mouth"],["discussing","mouth"],["discussion","square"],["discussions","square"],["disease","minus"],["diseased","minus"],["diseases","minus"],["disembarked","left"],["disembarking","right"],["disempower","minus"],["disenrolled","xcorrection"],["disentangling","right"],["disguise","glasses"],["disguises","mask"],["disgusting","minus"],["dish",""],["dishes","dish"],["dishevelled","minus"],["disinfectant",""],["disinfected","disinfectant"],["disintegrate","down"],["disintegrated","down"],["disinterested","minus"],["disised","minus"],["disj","or"],["disjointed","box"],["disjunction","or"],["disjunctive","or"],["disk","square"],["dislike","xcorrection"],["disliked","xcorrection"],["dislikes","xcorrection"],["dislodged","right"],["dismantle","down"],["dismantled","box"],["dismantling","down"],["dismiss","minus"],["dismissal","xcorrection"],["dismissed","xcorrection"],["disorder","minus"],["disordered","minus"],["disorders","minus"],["disparagingly","minus"],["disparities","minus"],["disparity","minus"],["dispassionate","minus"],["dispatch","box"],["dispensable","plus"],["dispense","right"],["dispensed","down"],["dispenser","down"],["dispensers","dispenser"],["dispensing","down"],["disperse","right"],["displacement","n"],["display","square"],["displayed","square"],["displayer","display"],["displaying","square"],["displays","square"],["disposable","plus"],["disposal","down"],["dispose","down"],["disposed","down"],["disposition","plus"],["dispute","minus"],["disputed","xcorrection"],["disputes","minus"],["disregard","minus"],["disrespect","minus"],["disrespectful","minus"],["disrespects","minus"],["disrupted","minus"],["disrupting","minus"],["disruption","minus"],["disruptive","minus"],["disruptor","minus"],["dissect","down"],["dissected","right"],["diss
eminate","down"],["disseminating","right"],["dissemination","right"],["dissertation","paper"],["dissimilarity","notequals"],["dissipate","down"],["dissolve","water"],["dissolved","water"],["distance","line"],["distanced","n"],["distances","n"],["distancing","n"],["distant","n"],["distaste","minus"],["distastefully","minus"],["distinct","plus"],["distinction","two"],["distinctions","a"],["distinctive","plus"],["distinctly","equals"],["distinguish","box"],["distinguished","plus"],["distort","minus"],["distortion","minus"],["distracted","minus"],["distracting","minus"],["distraction","minus"],["distractions","minus"],["distress","minus"],["distribute","right"],["distributed","right"],["distributing","dash"],["distribution","dash"],["distributive","right"],["district","square"],["districts","square"],["disturbing","minus"],["ditransitive","person"],["div","square"],["dive","down"],["diverge","two"],["diverged","two"],["divergence","v"],["divergences","right"],["divergent","right"],["diverging","right"],["diverse","plus"],["diversions","dash"],["diversity","plus"],["diverted","right"],["divertissement","plus"],["divertissements","xcorrection"],["dives","down"],["divide","box"],["dividebyfour","box"],["divided","box"],["dividend","dollarsymbol"],["divides","division"],["dividing","box"],["divine","person"],["divinely","plus"],["diving","down"],["divinity","person"],["division","box"],["divisions","box"],["divorcee","person"],["divulged","up"],["divulging","up"],["dix","person"],["dixee","person"],["dixel","dash"],["dixey","person"],["dixie","goat"],["dixl","person"],["djablucinate","minus"],["djameson","person"],["djibouti","country"],["dl","dash"],["dlight","plus"],["dlighted","plus"],["dlive","plus"],["dmsd","dash"],["dmsm","dash"],["dmsr","dash"],["dmss","dash"],["dmst","dash"],["dna",""],["do","person"],["dob","square"],["dobd","square"],["dobm","square"],["doby","square"],["doc","paper"],["dock","right"],["docked","equals"],["docking","equals"],["docktime","and"],["d
ockyards","water"],["docs","paper"],["doctor","person"],["doctoral","book"],["doctorate","book"],["doctors","person"],["doctrine","square"],["doctype","square"],["document","paper"],["documentary","square"],["documentation","paper"],["documented","paper"],["documenting","paper"],["documents","paper"],["docx","paper"],["dodged","xcorrection"],["dodgem","xcorrection"],["doer","plus"],["does","right"],["doesn","minus"],["doffing","down"],["dog",""],["dogberry","person"],["dogged","minus"],["dogs","dog"],["doh","dot"],["doi","square"],["doing","dash"],["dois","n"],["dojo","room"],["dole","person"],["doll",""],["dollar","dollarsymbol"],["dollars","dollarsymbol"],["dollarsign",""],["dollarsymbol",""],["dollie","doll"],["dollop","cream"],["domain","square"],["domains","square"],["dome","hemisphere"],["domestic","building"],["dominance","plus"],["dominate","minus"],["dominated","minus"],["dominating","plus"],["domination","minus"],["dominguez","person"],["dominic","person"],["dominica","country"],["dominican","country"],["dominions","house"],["don","dash"],["donald","person"],["donaldkellett","person"],["donate","dollarsymbol"],["donated","right"],["donating","dollarsymbol"],["donation","dollarsymbol"],["done","plus"],["donee","dash"],["donel","dash"],["doney","person"],["donl","person"],["donna","person"],["donor","person"],["donrita","person"],["dont","xcorrection"],["donttry","minus"],["doobies","plus"],["doodles","pen"],["door",""],["doorbell","bell"],["doors","door"],["doorway","door"],["doorways","door"],["dorbuchers","person"],["dore","dash"],["dormitory","room"],["dorothy","person"],["dos","tick"],["dose","n"],["doses","n"],["dostoevsky","person"],["dot",""],["dote","plus"],["doting","plus"],["dots","dot"],["dotsandutterances","dot"],["dott","plus"],["dotted","dot"],["dotting","dot"],["double","two"],["doubled","tworectangle"],["doublemaze","right"],["doubling","tworectangle"],["doubly","tworectangle"],["doubt","xcorrection"],["doug","person"],["dough",""],["doughnu
t",""],["dougsson","person"],["dove",""],["doves","dove"],["dovetail","equals"],["dovetailed","equals"],["dow","person"],["down",""],["downcase","down"],["downhill","down"],["download","down"],["downloaded","down"],["downloading","down"],["downloads","down"],["downpipe","down"],["downplay","xcorrection"],["downplayed","xcorrection"],["downside","down"],["downsize","down"],["downstairs","down"],["downton","city"],["downturn","minus"],["downwards","down"],["dowry","dollarsymbol"],["dowsed","water"],["doyle","person"],["dp","dash"],["dqb","box"],["dr","up"],["dra","box"],["draft","paper"],["drafted","paper"],["drafting","paper"],["drafts","paper"],["drag","right"],["dragged","right"],["dragging","right"],["dragon",""],["drain","box"],["drained","down"],["drains","down"],["drakes","duck"],["drama","stage"],["dramatic","stage"],["dramatica","software"],["dramatically","plus"],["drank","water"],["draw","paper"],["drawbar","box"],["drawer","box"],["drawers",""],["drawing","square"],["drawings","square"],["drawn","right"],["draws","square"],["drdftr","note"],["dream","box"],["dreamballoon",""],["dreamed","box"],["dreamily","box"],["dreams","speechballoon"],["dreamt","square"],["dreamy","dreamballoon"],["dress",""],["dressed","trousers"],["dressing","plus"],["dressmaker","person"],["drew","pen"],["drfile","paper"],["dribbled","water"],["dribbling","down"],["dried","plus"],["dries","up"],["drill","plus"],["drilling","down"],["drink","glass"],["drinkable","water"],["drinking","glass"],["drip","tube"],["dripping","water"],["drive","right"],["driven","right"],["driver","person"],["drivers","person"],["drives","right"],["driveway",""],["driving","right"],["drizzled","down"],["drizzles","down"],["drone","robot"],["drones","robot"],["drop","down"],["dropbox","box"],["droplet","water"],["dropped","down"],["dropping","down"],["drove","right"],["drug",""],["drugs","switch"],["drum",""],["drums","drum"],["drunk","water"],["dry","box"],["drying","square"],["dryness","square"],["ds","das
h"],["dsf","square"],["dsorted","right"],["dsp","dollarsymbol"],["dsss","dash"],["dsui","dash"],["dt","variable"],["dtd","dash"],["dts","n"],["du","dash"],["dual","tworectangle"],["dualise","tworectangle"],["dualism","tworectangle"],["dualistic","tworectangle"],["duality","two"],["duchess","person"],["duck",""],["duckduckgo","company"],["ducker","person"],["duckling",""],["ducklings","duckling"],["ducks","duck"],["due","n"],["duel","right"],["dug","down"],["duk","person"],["duke","cat"],["dukel","person"],["dukeltrance","person"],["dukie","person"],["dukkel","dash"],["dukl","person"],["dulcimer","box"],["dulex","company"],["dumler","person"],["dummies","minus"],["dump","down"],["dumpling",""],["dumptonesquenesses","plus"],["dune","sand"],["dunmore","person"],["dup","plus"],["duplicate","two"],["duplicated","two"],["duplicates","tworectangle"],["duplicating","two"],["duplication","two"],["dupoux","person"],["dups","n"],["durability","plus"],["duration","n"],["durations","n"],["during","one"],["durum",""],["dust","sphere"],["duster",""],["dutch","person"],["duties","plus"],["duty","plus"],["duuyqdrkdfmcsjskj","dash"],["dvd","and"],["dvouz","dash"],["dwayne","person"],["dwelling","house"],["dwim","dash"],["dwyer","person"],["dx","variable"],["dy","variable"],["dyad","two"],["dye",""],["dyed","ink"],["dying","minus"],["dylib","book"],["dynamic","plus"],["dynamics","n"],["dynamism","plus"],["dysfunctional","minus"],["dyson","person"],["dznvkjcnza","dash"],["e","box"],["ea","earth"],["each","dash"],["eagle",""],["eagled","arm"],["eaning","dash"],["ear",""],["earlier","lessthan"],["earliest","n"],["earlobes","earlobe"],["early","left"],["earn","dollarsign"],["earned","dollarsign"],["earner","person"],["earners","dollarsymbol"],["earning","dollarsign"],["earningjobsprotectioninjobs","plus"],["earnings","dollarsymbol"],["earns","dollarsign"],["earphone",""],["earphones",""],["earrings","circle"],["ears","ear"],["earth","planet"],["earthlings","person"],["earthquake","minus"]
,["ease","plus"],["eased","plus"],["easier","plus"],["easiest","plus"],["easily","plus"],["easiness","plus"],["easing","plus"],["east","right"],["easter","egg"],["eastern","left"],["easterners","person"],["easy","plus"],["eat","apple"],["eaten","mouth"],["eater","person"],["eateries","apple"],["eaters","person"],["eatery","apple"],["eating","banana"],["eats","apple"],["eb","dash"],["eba","dash"],["ebay","company"],["ebbed","plus"],["ebeacon","company"],["ebook","book"],["ebrary","paper"],["ebsco","and"],["eca","dash"],["ecaster","company"],["ecb","dash"],["echo","square"],["echoes","n"],["echoing","plus"],["eco","apple"],["ecological","plant"],["ecologically","plant"],["ecologist","person"],["ecology","plant"],["econ","dollarsymbol"],["econd","dash"],["econometric","right"],["economic","dollarsymbol"],["economical","dollarsymbol"],["economically","dollarsymbol"],["economics","dollarsign"],["economist","person"],["economy","dollarsymbol"],["ecosystems","plant"],["ecritina","person"],["ecritudine","paper"],["ecstasy","plus"],["ecstatic","plus"],["ecuador","country"],["ed","left"],["edc","square"],["edcf","square"],["edcfd","square"],["edd","square"],["eddie","person"],["ede","square"],["edelweiss","plant"],["edge","square"],["edges","line"],["edible","apple"],["edit","pen"],["editable","plus"],["edited","right"],["editing","tick"],["edition","paper"],["editor","square"],["editorial","square"],["editors","person"],["edits","right"],["edn","book"],["eds","person"],["edu","building"],["educ","book"],["educare","plus"],["educate","book"],["educated","plus"],["education","person"],["educational","book"],["educationally","plus"],["educator","person"],["educators","person"],["edward","person"],["edx","company"],["ee","dash"],["eee","company"],["eeg","electron"],["ef","plus"],["efde","square"],["effect","square"],["effected","plus"],["effective","tick"],["effectively","plus"],["effectiveness","plus"],["effects","a"],["effeminate","person"],["effervent","plus"],["efficiency","
plus"],["efficient","variable"],["efficiently","n"],["effigies","sculpture"],["effort","plus"],["effortless","zero"],["effortlessly","plus"],["effortlessness","plus"],["efmar","multiply"],["efron","person"],["eg","peach"],["egalitarian","person"],["egg",""],["eggplant",""],["eggs","ball"],["ego","person"],["egos","person"],["egs","square"],["egypt","country"],["eh","dash"],["eigengrau","square"],["eigentlichkeit","plus"],["eight","eightrectangle"],["eighth","n"],["eightrectangle",""],["eighty","n"],["eilat","person"],["einahpets","person"],["einstein","person"],["eisenhart","person"],["either","or"],["el","dash"],["elaborate","plus"],["elaboration","plus"],["elapse","n"],["elapses","n"],["elate","plus"],["elbows","elbow"],["elder","person"],["elderberries","berry"],["elderberry",""],["elderly","n"],["elders","person"],["eldest","n"],["elected","plus"],["election","tick"],["elections","tick"],["electoral","n"],["electric","electron"],["electrical","ball"],["electrician","person"],["electricity","electron"],["electro","ball"],["electrode","plus"],["electromagnetic","plus"],["electron",""],["electronic","note"],["electronics","and"],["electrons","minus"],["electrotherapy","plus"],["elegant","square"],["elegantly","plus"],["element","n"],["elementarily","plus"],["elementary","one"],["elements","one"],["elenchus","plus"],["elephantine","minus"],["elet","right"],["elevated","up"],["elevation","zline"],["eleven","n"],["elf","person"],["eli","person"],["elicit","plus"],["elide","right"],["eligible","plus"],["eliminate","minus"],["eliminated","xcorrection"],["eliminates","minus"],["eliminating","xcorrection"],["elimination","minus"],["eliminativist","xcorrection"],["eliminativistism","xcorrection"],["elimininate","minus"],["elite","plus"],["elites","plus"],["eliza","person"],["ellen","person"],["ellie","person"],["elliott","person"],["ellipse",""],["elly","person"],["elmo",""],["elope","minus"],["else","or"],["elsevier","company"],["elsewhere","box"],["eluded","xcorrection"]
,["elusive","minus"],["elvira","person"],["email","letterbox"],["emailed","right"],["emails","paper"],["emanates","right"],["embarked","plus"],["embarking","right"],["embarrassed","minus"],["embarrassingly","minus"],["embed","box"],["embedded","down"],["embellishment","plus"],["embodied","person"],["embodiment","person"],["embossed","box"],["embouchure","mouth"],["embraced","person"],["embryos","embryo"],["emerald",""],["emerge","right"],["emergence","right"],["emergency","minus"],["emerging","up"],["emeritus","person"],["emerson","person"],["emily","person"],["eminent","plus"],["emir","person"],["emirates","country"],["emission","right"],["emissions","right"],["emitted","right"],["emitters","right"],["emitting","right"],["emma","person"],["emoji","square"],["emoticon","square"],["emotion","smile"],["emotional","smile"],["emotionally","plus"],["emotions","smile"],["emotive","plus"],["emoyed","plus"],["empathetic","plus"],["empathise","tick"],["empathised","plus"],["empathising","plus"],["empathy","plus"],["emperor","person"],["empersonified","person"],["emphases","plus"],["emphasis","dot"],["emphasise","plus"],["emphasised","plus"],["emphasises","plus"],["emphasising","plus"],["emphasized","plus"],["emphatic","plus"],["empire","country"],["empirical","eye"],["empirically","square"],["empiricism","book"],["employ","dash"],["employed","plus"],["employee","person"],["employees","person"],["employer","person"],["employers","person"],["employing","plus"],["employment","plus"],["employs","plus"],["emporium","shop"],["emporiums","building"],["empower","plus"],["empowered","plus"],["empowering","plus"],["empowerment","plus"],["emptied","box"],["emptiness","box"],["empting","box"],["empty","twobox"],["emptying","box"],["emptyorvalsequal","dash"],["emulate","right"],["emulated","right"],["emulating","plus"],["emulator","box"],["en","book"],["enable","plus"],["enabled","plus"],["enables","plus"],["enabling","plus"],["enact","plus"],["enactment","plus"],["enamelise","minus"],["
enamour","plus"],["enamourate","plus"],["enamoured","plus"],["enamouring","plus"],["enamourmonts","plus"],["encased","box"],["ence","dash"],["enchanted","plus"],["enchilada",""],["encoded","right"],["encompasses","plus"],["encounter","plus"],["encountered","plus"],["encountering","tick"],["encounters","person"],["encourage","plus"],["encouraged","plus"],["encouragement","plus"],["encourages","plus"],["encouraging","plus"],["encouragingly","plus"],["encrypt","right"],["encrypted","right"],["encrypting","right"],["encryption","plus"],["encuntglish","box"],["encyclopaedia","book"],["encyclopedia","book"],["encyclopedias","book"],["end","n"],["endanger","minus"],["endangered","minus"],["endeavours","plus"],["ended","n"],["endgame","n"],["endgrammar","variable"],["endgrammara","square"],["ending","n"],["endnotes","paper"],["endocrine","gland"],["endometrium","box"],["endorse","plus"],["endorsed","plus"],["endorsement","plus"],["endorsing","plus"],["endowing","plus"],["endpoint","n"],["endrec","dot"],["ends","dot"],["endstring","square"],["endured","right"],["enduring","plus"],["eneathiesnesses","plus"],["enemies","person"],["energetic","plus"],["energised","plus"],["energy","right"],["energyeducation","book"],["enforce","plus"],["enforced","plus"],["enforcement","plus"],["eng","paper"],["engage","box"],["engaged","plus"],["engagement","plus"],["engaging","plus"],["engels","person"],["engender","plus"],["engendering","plus"],["engine","and"],["engineer","person"],["engineered","box"],["engineering","plus"],["engines","and"],["england","country"],["english","book"],["engram","square"],["engraved","box"],["engrossed","minus"],["engrossment","minus"],["enhance","plus"],["enhanced","plus"],["enhancement","plus"],["enhancing","plus"],["enigma","dot"],["enigmas","plus"],["enigmatic","plus"],["enimals","animal"],["enjoy","plus"],["enjoyable","plus"],["enjoyed","plus"],["enjoying","plus"],["enjoyment","plus"],["enjoyments","plus"],["enjoys","plus"],["enlarged","up"],["enlarging",
"up"],["enlightened","plus"],["enlightening","plus"],["enlightenment","plus"],["enlisted","plus"],["enliven","plus"],["enmeshment","box"],["eno","person"],["enotecama","plus"],["enough","n"],["enqljvfy","dash"],["enquire","square"],["enquiring","square"],["enquiry","plus"],["enrol","plus"],["enroll","tick"],["enrolled","plus"],["enrolling","tick"],["enrollment","tick"],["enrollments","square"],["enrolment","plus"],["ensconce","plus"],["ensemble","person"],["enshrined","plus"],["ensightment","eye"],["enslaving","minus"],["ensue","dot"],["ensuing","right"],["ensure","tick"],["ensured","plus"],["ensuring","plus"],["enter","right"],["entered","right"],["enterer","person"],["enteric","stomach"],["entering","square"],["enterprise","building"],["enters","right"],["entertain","plus"],["entertained","plus"],["enthusiasm","plus"],["enthusiastically","plus"],["enthusiasts","person"],["entice","plus"],["enticed","plus"],["entire","n"],["entirely","n"],["entities","person"],["entitled","square"],["entitlements","plus"],["entity","person"],["entrain","tick"],["entrained","plus"],["entrance","door"],["entrants","person"],["entreat","plus"],["entrepreneurs","person"],["entrepreneurship","one"],["entries","square"],["entropy","down"],["entrust","plus"],["entrusting","plus"],["entrustments","plus"],["entry","right"],["entryt","square"],["entwined","right"],["enumerate","n"],["enumerator","n"],["enumerators","n"],["enunciated","plus"],["enunciating","square"],["enus","person"],["envelope",""],["environment","box"],["environmental","plant"],["environmentalism","plant"],["environmentally","plus"],["environments","box"],["environs","box"],["envisaged","eye"],["envisaging","eye"],["envision","eye"],["envisioned","square"],["enwrapped","box"],["enya","person"],["enzyme","plus"],["enzymes","box"],["eof","n"],["eofc","n"],["eol","square"],["ep","line"],["epages","company"],["ephesia","person"],["epic","book"],["epidemic","minus"],["epidemics","minus"],["epididymis","down"],["episode","square
"],["episodes","square"],["episteme","pen"],["epistemological","box"],["epistemologically","box"],["epistemology","plus"],["epithelium","square"],["epoché","leftbracket"],["epsilon","n"],["epson","company"],["epsonnet","line"],["epton","plus"],["eq","equals"],["eqing","plus"],["equal","equals"],["equaled","equals"],["equaling","equals"],["equalitarian","person"],["equality","equals"],["equalled","equals"],["equalling","equals"],["equally","equals"],["equals",""],["equate","equals"],["equated","equals"],["equating","equals"],["equation","tworectangle"],["equations","equals"],["equatorial","country"],["equi","equals"],["equilibrium","equals"],["equipment","computer"],["equipped","plus"],["equitable","equals"],["equitably","plus"],["equity","plus"],["equivalent","equals"],["er","person"],["era","line"],["eradicate","xcorrection"],["eradicated","xcorrection"],["eras","line"],["erase","xcorrection"],["erased","xcorrection"],["erasing","xcorrection"],["erasure","xcorrection"],["ere","down"],["erect","building"],["erected","up"],["erecting","up"],["ereignis","box"],["ergonomic","seat"],["ergonomics","chair"],["erica","person"],["eritrea","country"],["eroded","down"],["erotisense","plus"],["err","minus"],["erroneous","minus"],["error","xcorrection"],["errors","minus"],["eruption","volcano"],["es","dash"],["esau","person"],["esc","right"],["escalation","up"],["escape","right"],["escaped","right"],["esophagus","tube"],["esoteric","plus"],["esotericnesses","plus"],["especially","plus"],["esplanade",""],["espouse","plus"],["espouses","plus"],["espousing","plus"],["ess","person"],["essay","paper"],["essayists","person"],["essaylib","company"],["essays","paper"],["essence","apple"],["essent","box"],["essential","one"],["essentially","plus"],["essentials","plus"],["essonsciblenesses","plus"],["est","equals"],["establish","box"],["established","zero"],["establishes","box"],["establishing","box"],["estate","building"],["estates","house"],["esteem","plus"],["ester","person"],["esther
","person"],["estimate","n"],["estimated","n"],["estonia","country"],["eswatini","country"],["et","and"],["etc","box"],["etcetera","right"],["etch","right"],["etching","square"],["eternal","n"],["eternity","n"],["eth","dash"],["ethel","person"],["ethic","plus"],["ethical","tick"],["ethically","plus"],["ethicist","person"],["ethico","plus"],["ethics","tick"],["ethicst","dash"],["ethiopia","country"],["ethnic","person"],["ethnically","person"],["ethnicity","plus"],["ethnographic","book"],["ethnography","book"],["etienne","person"],["etiological","minus"],["etre","equals"],["etymological","right"],["etymology","square"],["eu","country"],["euan","person"],["eucalyptus","tree"],["eugenically","plus"],["eukarya","person"],["eukaryote","person"],["eunice","person"],["euro","dollarsymbol"],["europe","continent"],["european","continent"],["eustace","person"],["evacuation","right"],["evaluate","plus"],["evaluated","tick"],["evaluating","plus"],["evaluation","tick"],["evaluations","plus"],["evaluative","plus"],["evaporated","up"],["evaporation","up"],["eve","person"],["even","two"],["evened","square"],["evening","square"],["evenings","moon"],["evenly","line"],["event","square"],["eventbrite","company"],["eventful","plus"],["eventide","plus"],["events","dot"],["eventual","n"],["eventually","n"],["ever","dash"],["everitt","person"],["evermore","right"],["every","n"],["everybody","person"],["everyday","square"],["everyone","person"],["everyones","person"],["everything","a"],["everyvarcovered","variable"],["everyvarmentioned","box"],["everywhere","box"],["evidence","paper"],["evident","tick"],["evidential","box"],["evie","person"],["evil","xcorrection"],["evince","plus"],["evm","box"],["evocative","plus"],["evoke","plus"],["evoked","plus"],["evokes","up"],["evoking","plus"],["evolution","right"],["evolutionary","right"],["evolve","right"],["evolved","plus"],["evolving","right"],["evonne","person"],["evp","dash"],["ew","right"],["eway","company"],["ex","left"],["exacerbate","minus"
],["exacerbated","minus"],["exact","equals"],["exactly","equals"],["exactness","equals"],["exaggerate","plus"],["exaggerated","plus"],["exaggeration","plus"],["exaltate","plus"],["exam","paper"],["examination","paper"],["examinationism","eye"],["examinations","paper"],["examinative","plus"],["examine","down"],["examined","eye"],["examinedness","plus"],["examiner","person"],["examines","eye"],["examining","eye"],["example","apple"],["examples","apple"],["exams","paper"],["excalibur","sword"],["excavate","box"],["excavation","up"],["exceed","plus"],["exceeded","plus"],["excel","box"],["excellence","a"],["excellences","plus"],["excellent","a"],["excelsior","dot"],["except","xcorrection"],["excepted","minus"],["exception","xcorrection"],["exceptional","plus"],["exceptions","minus"],["excerpt","square"],["excess","box"],["exchange","right"],["exchanged","right"],["excide","right"],["excite","plus"],["excited","plus"],["excitement","plus"],["exciting","plus"],["exclaim","square"],["exclaimed","plus"],["exclamation","exclamationmark"],["exclamationmark",""],["exclude","minus"],["excluded","minus"],["excluding","xcorrection"],["exclusion","dot"],["exclusive","box"],["exclusivity","plus"],["excrement","ball"],["excrete","down"],["excretory","down"],["excursion","right"],["excuse","square"],["excuses","minus"],["exe","software"],["exec","square"],["execute","tick"],["executed","right"],["execution","dash"],["executive","person"],["exegeses","book"],["exegesiticals","book"],["exemplary","plus"],["exemplified","plus"],["exercise","shoe"],["exercised","right"],["exercises","paper"],["exercising","leg"],["exert","right"],["exfoliated","flower"],["exhale","air"],["exhaled","right"],["exhaling","right"],["exhausted","minus"],["exhaustion","minus"],["exhaustive","plus"],["exhibit","dot"],["exhibited","square"],["exhibition","square"],["exiciting","dash"],["exigencies","plus"],["exist","box"],["existed","plus"],["existence","dot"],["existent","box"],["existential","box"],["existentie
ll","square"],["exister","person"],["existing","a"],["exists","plus"],["exit","left"],["exited","up"],["exitfail","right"],["exiting","right"],["exits","right"],["exocrine","organ"],["exolec","book"],["exon","right"],["exp","square"],["expand","right"],["expanded","tworectangle"],["expander","right"],["expanding","plus"],["expanse","box"],["expansion","plus"],["expansions","right"],["expansive","right"],["expect","plus"],["expectancy","n"],["expectation","box"],["expectations","plus"],["expected","box"],["expecting","left"],["expend","plus"],["expenditure","dollarsymbol"],["expenses","dollarsymbol"],["expensive","dollarsymbol"],["experience","apple"],["experienceable","plus"],["experienced","plus"],["experiencers","person"],["experiences","box"],["experiencing","plus"],["experiential","plus"],["experiment","tick"],["experimental","tick"],["experimentation","box"],["experimented","plus"],["experimenter","person"],["experimenting","minus"],["experiments","tick"],["expert","person"],["expertise","plus"],["experts","person"],["expiration","n"],["expire","n"],["expired","minus"],["expires","n"],["expiry","n"],["explain","dash"],["explained","plus"],["explainer","plus"],["explaining","speechballoon"],["explains","plus"],["explanation","square"],["explanations","square"],["explanatory","plus"],["explicates","plus"],["explicating","square"],["explicit","plus"],["explicitism","plus"],["explicitly","plus"],["exploit","minus"],["exploitation","plus"],["exploited","minus"],["exploiting","plus"],["exploration","right"],["explorations","path"],["explore","right"],["explored","right"],["explores","plus"],["exploring","right"],["explosion","up"],["explosive","up"],["exponent","n"],["exponential","box"],["exponents","n"],["export","right"],["exported","right"],["exporting","right"],["exports","right"],["expos","paper"],["expose","plus"],["exposed","plus"],["exposes","plus"],["exposing","square"],["expositio","paper"],["exposition","paper"],["expositions","paper"],["exposure","n"],["
expounded","plus"],["express","right"],["expressed","right"],["expresses","square"],["expressing","right"],["expression","plus"],["expressionnotatom","plus"],["expressionnotheadache","xcorrection"],["expressions","square"],["expressive","plus"],["expt","paper"],["expulsion","xcorrection"],["extend","right"],["extended","right"],["extending","n"],["extension","line"],["extensions","plus"],["extensive","n"],["extent","dash"],["exterior","right"],["external","right"],["extinct","minus"],["extinction","minus"],["extol","plus"],["extolled","plus"],["extra","plus"],["extract","up"],["extracted","up"],["extraction","up"],["extracurricular","dot"],["extraneous","minus"],["extraordinary","plus"],["extrapolate","right"],["extrapolation","right"],["extrarelation","dash"],["extras","plus"],["extrasarguments","variable"],["extraslabels","square"],["extrasrelations","dash"],["extravagance","minus"],["extreme","minus"],["extremely","plus"],["extrinsic","right"],["extruded","right"],["exuberant","plus"],["ey","person"],["eye",""],["eyebrow",""],["eyebrows","brow"],["eyed","eye"],["eyeless","minus"],["eyelid",""],["eyes","eye"],["eyeshadow","square"],["eyesight","eye"],["ez","variable"],["eztalks","company"],["ezzx","dash"],["ezzxe","dash"],["ezzxf","dash"],["f","box"],["fa","variable"],["fab","person"],["fabinyi","person"],["fabric",""],["fabrics","fabric"],["fac","box"],["face",""],["facebook","paper"],["faced","face"],["faces","face"],["facetime","n"],["facets","square"],["facilitate","plus"],["facilitated","plus"],["facilitates","plus"],["facilitating","plus"],["facilitation","plus"],["facilitator","person"],["facilitators","person"],["facilities","building"],["facility","room"],["facing","right"],["facitly","square"],["facs","dash"],["fact","a"],["factor","n"],["factored","n"],["factorise","square"],["factors","dash"],["factory","right"],["facts","square"],["factual","square"],["faculties","building"],["faculty","person"],["fade","line"],["fader","square"],["fading","line"],["f
aeces","ball"],["fail","xcorrection"],["failed","minus"],["failing","minus"],["failings","minus"],["failover","minus"],["fails","minus"],["failure","minus"],["failures","minus"],["faina","person"],["faint","minus"],["fair","plus"],["faire","plus"],["fairer","plus"],["fairly","plus"],["fairness","plus"],["fairy",""],["fait","plus"],["faith","plus"],["faithfulness","plus"],["faiths","plus"],["fake","minus"],["fall","down"],["fallace","dash"],["fallacies","minus"],["fallacy","minus"],["fallen","down"],["fallible","minus"],["fallin","down"],["falling","down"],["fallopian","tube"],["falls","down"],["false","minus"],["falsehood","minus"],["falsely","minus"],["falsified","xcorrection"],["fam","person"],["fame","plus"],["familiar","plus"],["families","person"],["family","person"],["famous","person"],["famously","plus"],["famousness","plus"],["famousnesses","plus"],["fan","person"],["fancy","plus"],["fandom","company"],["fanning","fan"],["fans","person"],["fantasias","plus"],["fantasised","plus"],["fantastic","plus"],["fapred","square"],["faq","paper"],["faqs","paper"],["far","n"],["farce","minus"],["farcical","minus"],["farewell","hand"],["farion","person"],["farm","box"],["farmer","person"],["farmers","person"],["farming","apple"],["farms","animal"],["farrow","pig"],["fas","dash"],["fascinating","plus"],["fascium","box"],["fashion","plus"],["fashionable","plus"],["faso","country"],["fast","n"],["fasted","zero"],["fastened","equals"],["faster","greaterthan"],["fastest","n"],["fat",""],["fatah","person"],["fatal","minus"],["fate","minus"],["father","person"],["fathers","person"],["fats","box"],["fatty","fat"],["fault","minus"],["faultless","plus"],["faults","minus"],["faulty","minus"],["fauna","animal"],["fauve","plus"],["favicon","square"],["favorite","plus"],["favour","plus"],["favourable","plus"],["favoured","plus"],["favourite","plus"],["fb","paper"],["fbc","dash"],["fbid","dash"],["fc","dash"],["fcre","right"],["fd","dash"],["fddb","box"],["fdf","dash"],["fe",""],["fea"
,"plus"],["fear","minus"],["fears","minus"],["fearsome","minus"],["feasible","plus"],["feasibly","plus"],["feast","apple"],["feasting","apple"],["feat","plus"],["feather",""],["featherstone","person"],["feature","square"],["featured","square"],["features","plus"],["featurescawmp","square"],["featuring","plus"],["feb","square"],["february","square"],["febrvarius","person"],["feces","ball"],["fed","n"],["federal","room"],["feds","person"],["fee","dollarsign"],["feed","apple"],["feedback","left"],["feeder","down"],["feedforward","right"],["feeding","apple"],["feeds","right"],["feel","hand"],["feeling","square"],["feelings","plus"],["feels","plus"],["fees","dollarsymbol"],["feet","foot"],["felicity","person"],["felix","person"],["fell","down"],["feller","person"],["fellow","person"],["fellows","person"],["fellowships","plus"],["felp","person"],["felt","square"],["female","person"],["females","person"],["feminine","square"],["femininity","person"],["feminism","book"],["feminist","person"],["feministic","person"],["fence",""],["ferdinand","person"],["fern","plant"],["fernando","person"],["ferrando","person"],["ferrarotti","person"],["ferris","wheel"],["fertilise","right"],["fertilised","plus"],["fertiliser",""],["fertility","plus"],["festiches","plus"],["festival","note"],["fetch","right"],["fetched","box"],["fetta","cheese"],["feudal","n"],["feudalism","n"],["fever","minus"],["feverishly","plus"],["few","n"],["fewer","n"],["fewest","n"],["ff","tick"],["fff","dash"],["fh","dash"],["fi","book"],["fiance","person"],["fiat","car"],["fibonacci","plus"],["fibre",""],["fibres","box"],["fiction","book"],["fictional","book"],["fiddle","violin"],["fiddly","minus"],["fidelity","plus"],["fie","dash"],["field","box"],["fields","square"],["fieldwork","book"],["fiesole","theatre"],["fif","fiverectangle"],["fifteen","fifteenrectangle"],["fifteenrectangle",""],["fifth","fiverectangle"],["fifths","fivecrectangle"],["fifties","n"],["fifty","n"],["fiftyastest","tick"],["fiftybreasoningspers
econd","f"],["fig",""],["fight","xcorrection"],["fighting","minus"],["fights","minus"],["figuratively","right"],["figure","person"],["figured","square"],["figures","number"],["fiji","country"],["file","paper"],["filealg","and"],["filecontents","paper"],["filed","box"],["filekeeplacomhits","dot"],["filelist","square"],["filemaking","paper"],["filename","square"],["filenames","square"],["filenametime","t"],["fileout","square"],["filer","box"],["files","paper"],["filesex","right"],["filet","paper"],["filex","paper"],["filexx","paper"],["filexyz","paper"],["filezilla","and"],["filing","paper"],["filipinos","person"],["fill","square"],["fillbuf","square"],["filled","one"],["filler","down"],["fillers","box"],["filling","box"],["fills","down"],["film","line"],["filmed","box"],["filming","box"],["films","square"],["filosofia","book"],["filter","box"],["filtered","box"],["filtering","square"],["filters","software"],["filtrate","ball"],["fin",""],["final","n"],["finalchar","square"],["finalists","person"],["finally","n"],["finance","dollarsymbol"],["financial","dollarsign"],["financially","dollarsymbol"],["find","a"],["findall","tick"],["findalls","eye"],["findargs","variable"],["findbest","plus"],["finder","person"],["finders","person"],["finding","person"],["findings","square"],["findmelody","note"],["findnsols","n"],["findo","eye"],["findprogram","square"],["findr","square"],["findresul","tick"],["findresult","n"],["findrhyme","equals"],["findrulesflowingtopv","eye"],["finds","person"],["findsentence","square"],["findtypes","box"],["findv","plus"],["fine","plus"],["finer","greaterthan"],["finery","square"],["finesse","plus"],["finger",""],["fingered","finger"],["fingers","finger"],["fingertip","square"],["fingertips","square"],["finish","n"],["finished","tick"],["finishes","tick"],["finishing","n"],["finite","n"],["finitely","n"],["finitism","n"],["finland","country"],["finn","person"],["fire",""],["firecracker","box"],["firefighter","person"],["fireguard","blanket"],["fir
eman","person"],["fireworks","up"],["firm","square"],["firmed","up"],["firming","plus"],["firmly","plus"],["firs","dash"],["first","one"],["firstargs","variable"],["firstletter","square"],["firstline","line"],["firstly","one"],["firstname","square"],["firststate","n"],["firstvar","variable"],["firstvarname","square"],["fiscal","dollarsymbol"],["fischer","person"],["fish",""],["fisher","person"],["fishes","fish"],["fiske","person"],["fist",""],["fit","box"],["fitness","plus"],["fits","equals"],["fitted","down"],["fitting","tick"],["fitzpatrick","person"],["five","fiverectangle"],["fivecrectangle",""],["fiverectangle",""],["fiverr","company"],["fix","plus"],["fixed","tick"],["fixes","plus"],["fixing","xcorrection"],["fl","variable"],["flag","square"],["flagdisjunction","or"],["flagella","line"],["flagellum","line"],["flagged","plus"],["flags","n"],["flame",""],["flamingo",""],["flammable","minus"],["flapping","up"],["flappings","up"],["flappy","wing"],["flare","light"],["flash","box"],["flashed","square"],["flashing","light"],["flask",""],["flat","square"],["flatten","down"],["flattened","plane"],["flaunt","plus"],["flavell","person"],["flavours","strawberry"],["flaw","minus"],["flawed","minus"],["flawless","plus"],["flaws","minus"],["flemington","suburb"],["flesh","square"],["fleshing","up"],["flew","right"],["flex","dash"],["flexes","dash"],["flexibility","plus"],["flexible","dash"],["flexing","right"],["flick","right"],["flicked","tick"],["flicking","tick"],["flight","right"],["flights","right"],["flip","right"],["flipped","mirror"],["flipping","right"],["flips","mirror"],["flipxy","mirror"],["flipxygrid","grid"],["flittering","minus"],["float","one"],["floated","up"],["floating","up"],["floats","up"],["flock","duck"],["flocks","duck"],["flood","water"],["floor","square"],["floret","flower"],["floss",""],["flotation","buoy"],["flour","ball"],["flourish","plus"],["flourishing","plus"],["flow","line"],["flowchart",""],["flowed","right"],["flower",""],["flowers","flow
er"],["flowing","right"],["flown","right"],["flows","line"],["flp","function"],["flu","virus"],["fluctuations","minus"],["fluent","plus"],["fluently","tick"],["fluffy","down"],["fluid","water"],["fluidity","right"],["fluke","plus"],["flute","note"],["fluttering","right"],["fly","right"],["flyer","person"],["flying","right"],["fm","square"],["fmr","country"],["fn","function"],["fna","right"],["fnal","right"],["fnalism","dash"],["fnism","right"],["fns","right"],["fo","dash"],["focal","eye"],["focus","eye"],["focused","eye"],["focuses","eye"],["focusing","eye"],["fogs","box"],["foisted","minus"],["fold","paper"],["folded","down"],["folder","paper"],["folders","folder"],["folding","down"],["foldl","right"],["foldr","right"],["folds","line"],["folk","guitar"],["folks","person"],["follow","right"],["followed","two"],["follower","person"],["followers","person"],["following","dash"],["follows","right"],["fond","plus"],["fondled","hand"],["fondness","plus"],["font","a"],["fonts","a"],["fontspace","company"],["food","apple"],["foods","apple"],["foodstuffs","apple"],["fool","minus"],["foot",""],["footage","line"],["football",""],["footnote","square"],["footnotes","square"],["footpath","path"],["footprint",""],["footprintless","zero"],["foottowel","towel"],["footy",""],["for","dash"],["forall","plus"],["foramen","organ"],["forbidden","minus"],["forbidding","xcorrection"],["force","right"],["forced","right"],["forces","n"],["ford","car"],["fore","left"],["forearms","forearm"],["forefront","zero"],["forehead",""],["foreheads","forehead"],["foreign","square"],["forerunner","left"],["foresee","eye"],["foreseeable","right"],["foreseen","right"],["foreshore","square"],["forest","tree"],["forestry","tree"],["forests","tree"],["foretold","mouth"],["forever","one"],["forge","down"],["forget","minus"],["forgetfulness","minus"],["forgets","xcorrection"],["forgetting","minus"],["forgive","xcorrection"],["forgiveness","plus"],["forgot","minus"],["forgotten","minus"],["fork",""],["forked","r
ight"],["forks","dash"],["form","paper"],["formal","plus"],["formalism","a"],["formally","plus"],["format","dash"],["formation","plus"],["formations","plus"],["formative","plus"],["formatively","plus"],["formats","paper"],["formatted","square"],["formatting","paper"],["formed","plus"],["former","minus"],["formerly","left"],["formidable","minus"],["forming","box"],["formingness","plus"],["formistical","box"],["formlength","n"],["formr","square"],["forms","paper"],["formula","plus"],["formulae","and"],["formulas","and"],["formulated","plus"],["formulation","plus"],["fornix","and"],["forrester","person"],["forster","person"],["forte","line"],["forth","right"],["forthcoming","right"],["forthright","plus"],["fortnight","square"],["fortress","building"],["fortuitous","plus"],["fortuna","plus"],["fortune","dollarsymbol"],["forty","fortyrectangle"],["fortyrectangle",""],["forum","building"],["forums","paper"],["forward","right"],["forwarded","right"],["forwards","right"],["fossey","person"],["fossil",""],["foster","person"],["fostered","plus"],["fostering","plus"],["fosters","plus"],["foucauldian","person"],["foucault","person"],["found","plus"],["foundation","square"],["foundations","box"],["founded","down"],["founder","person"],["founders","person"],["founding","plus"],["fountain",""],["four","fourrectangle"],["fourrectangle",""],["fours","shoe"],["fourth","fourrectangle"],["fox","animal"],["foxtel","company"],["fp","dash"],["fqmer","dash"],["fr","paper"],["fractals","square"],["fraction","n"],["fractions","n"],["fragile","minus"],["fragility","minus"],["fragment","n"],["fragments","square"],["fragrance","perfume"],["fragranced","perfume"],["fragrant","flower"],["frame","square"],["framed","square"],["framenumber","n"],["frames","square"],["framework","box"],["frameworks","square"],["framings","square"],["franca","dash"],["france","country"],["frances","person"],["francesca","person"],["franchise","building"],["franchised","plus"],["francis","person"],["francisco","city"]
,["franco","country"],["francois","person"],["frangipane","flower"],["frank","person"],["frankenstein","person"],["frankfurt",""],["frankfurts","frankfurt"],["frankie","person"],["frankincense",""],["française","book"],["fraud","minus"],["fraudulent","minus"],["fraught","minus"],["fred","person"],["freddi","person"],["frederick","person"],["fredric","person"],["free","a"],["freeaccountingsoftware","company"],["freebsd","plus"],["freed","plus"],["freedom","dash"],["freefilesync","and"],["freeimages","paper"],["freely","plus"],["freeman","person"],["freemason","person"],["freeness","zero"],["freesias","flower"],["freeway","road"],["freeze","zero"],["freezer",""],["freezing","minus"],["freight","box"],["french","paper"],["freq","n"],["frequencies","f"],["frequency","n"],["frequent","f"],["frequented","n"],["frequently","n"],["fresh","plus"],["freshly","zero"],["freshness","plus"],["fret","line"],["fretless","zero"],["frets","minus"],["freud","person"],["freya","person"],["fri","square"],["friar","person"],["friday","square"],["fried","oil"],["friend","person"],["friendlily","plus"],["friendliness","dash"],["friendlinesses","square"],["friendly","plus"],["friends","person"],["friendship","plus"],["friendships","plus"],["fries","chip"],["frieze","square"],["frighten","minus"],["frightened","minus"],["fringes","square"],["frivells","ribbon"],["fro","right"],["frock","dress"],["frog",""],["frolicked","plus"],["from","dash"],["frombae","person"],["frombarrabool","dash"],["fromglasgow","city"],["fromglasgow","right"],["fromlang","paper"],["front","left"],["frontal","left"],["frontier","line"],["frontiers","line"],["frown","minus"],["frowned","minus"],["frowning","minus"],["frozen","n"],["fruit","apple"],["fruiterer","person"],["fruiting","apple"],["fruition","plus"],["fruitmonger","person"],["fruits","apple"],["fruity","apple"],["frustrated","minus"],["frustrating","minus"],["frustration","minus"],["frustrations","minus"],["frying","pan"],["fsfs","square"],["ft","dash"],["ft
p","right"],["fu","square"],["fudge",""],["fuel",""],["fulcrum","dot"],["fulfil","tick"],["fulfill","tick"],["fulfilled","plus"],["fulfilling","plus"],["full","one"],["fulladjective","plus"],["fuller","person"],["fullness","box"],["fullstop",""],["fully","n"],["fume","nose"],["fun","plus"],["function","a"],["functional","square"],["functionalise","function"],["functionalism","function"],["functionality","and"],["functionally","plus"],["functionarguments","square"],["functioncalls","square"],["functioned","plus"],["functioning","plus"],["functionname","square"],["functionnumber","n"],["functionresult","square"],["functions","square"],["functionyms","dash"],["functor","right"],["functordis","variable"],["fund","dollarsymbol"],["fundamental","line"],["fundamentalist","down"],["fundamentally","down"],["fundamentals","line"],["funded","dollarsymbol"],["funding","dollarsymbol"],["fundraising","dollarsymbol"],["funds","dollarsymbol"],["funeral","n"],["funk","note"],["funky","note"],["funnel",""],["funnelable","box"],["funnelled","funnel"],["funnelling","down"],["funnels","funnel"],["funny","smile"],["funtastics","plus"],["fur",""],["furious","minus"],["furniture",""],["furrow","box"],["furtado","person"],["further","dot"],["furthering","right"],["furthermore","plus"],["furthest","n"],["furthestfrommean","dot"],["fury","minus"],["fusing","equals"],["fuss","minus"],["fussy","plus"],["future","plus"],["futurelearn","company"],["futures","right"],["futuristic","n"],["fx","tube"],["g","box"],["ga","dash"],["gaant","square"],["gabe","person"],["gabetarian","dash"],["gabon","country"],["gabriel","person"],["gadamer","person"],["gadamerian","person"],["gadgets","and"],["gaelic","person"],["gaffer","person"],["gaga","person"],["gaiety","plus"],["gail","person"],["gain","plus"],["gained","plus"],["gaining","plus"],["gains","up"],["galactic","box"],["galah",""],["galanter","person"],["galaxies","star"],["galaxy","star"],["gall","box"],["gallbladder","organ"],["galleries","square"],["
galleriic","plus"],["gallipoli","city"],["gallstone","minus"],["gallstones","minus"],["galumphs","plus"],["gamba","leg"],["gambia","country"],["gambling","minus"],["game","a"],["gameplay","plus"],["games","square"],["gamma","square"],["gap","square"],["gaps","box"],["garage",""],["garageband","box"],["garbage",""],["garden","plant"],["gardener","person"],["gardeners","person"],["gardening","plant"],["gardens","plant"],["gare","person"],["garfield","person"],["garlanded","flower"],["garlic",""],["garment","shirt"],["gary","person"],["garydougsson","person"],["gas","box"],["gasper","person"],["gassed","gas"],["gastric","stomach"],["gastronomer","person"],["gate",""],["gates","person"],["gateway","gate"],["gather","box"],["gathered","person"],["gathering","box"],["gathers","box"],["gaud","apple"],["gauge","n"],["gauged","tick"],["gauging","plus"],["gauiea","dash"],["gaut","person"],["gave","pencilsharpener"],["gavin","person"],["gawenda","person"],["gawk","and"],["gay","person"],["gayness","plus"],["gays","person"],["gazebo",""],["gazebos","gazebo"],["gazelles","animal"],["gb","dash"],["gbd","right"],["gcc","software"],["gclid","dash"],["gcloud","cloud"],["gear","n"],["geared","plus"],["geeds","person"],["geeks","person"],["geelong","city"],["gel",""],["gelder","person"],["gem","jewel"],["gemini","ball"],["gems","jewel"],["gemstone",""],["gen","right"],["gender","person"],["gendered","person"],["genders","person"],["gene","n"],["general","tick"],["generalise","plus"],["generalised","square"],["generalist","plus"],["generalities","plus"],["generality","tworectangle"],["generalizability","plus"],["generally","one"],["generally","square"],["generate","plus"],["generated","plus"],["generatelevels","plus"],["generatelyricslistsverse","square"],["generatemelody","note"],["generatemelodyh","note"],["generatemelodym","note"],["generaterange","plus"],["generates","right"],["generating","right"],["generation","square"],["generations","person"],["generative","plus"],["generativea
rt","plus"],["generativity","plus"],["generator","right"],["generators","and"],["generic","one"],["generous","plus"],["generously","plus"],["genes","n"],["genesis","zero"],["genetastics","plus"],["genetic","a"],["genetically","dot"],["geneticist","person"],["genetics","n"],["genitals","box"],["genitive","square"],["genius","person"],["genre","book"],["gens","dot"],["gentile","person"],["gentle","plus"],["gently","plus"],["genuine","plus"],["genuinely","plus"],["genus","box"],["genutils","right"],["geocities","company"],["geoff","person"],["geographic","country"],["geographies","country"],["geography","book"],["geomatics","n"],["geometers","person"],["geometric","box"],["geometrical","square"],["geometrically","square"],["geometry","box"],["georg","person"],["george","person"],["georgia","person"],["geraldine","person"],["germ",""],["german","paper"],["germany","country"],["germinate","seed"],["gerry","person"],["gerrymandering","minus"],["gertrude","person"],["gerund","box"],["gerundive","right"],["gervais","person"],["gestalt","n"],["gestaltron","line"],["gestate","n"],["gestation","n"],["gesticulated","plus"],["gestures","right"],["get","plus"],["getchr","a"],["getitemn","square"],["gets","left"],["getting","left"],["getty","company"],["getv","variable"],["getvalue","box"],["getvalues","box"],["getvar","variable"],["gg","right"],["ggslrbv","dash"],["gh","dash"],["ghana","country"],["ghei","dash"],["gherkins","gherkin"],["ghetto","building"],["ghost","person"],["ghosts","minus"],["giant",""],["giants","giant"],["gift","box"],["gifted","plus"],["gifts","box"],["giggle","plus"],["giggled","person"],["gillian","person"],["gilligan","person"],["ginger",""],["gingerly","plus"],["ginkgo","plant"],["ginseng","plant"],["giraffe",""],["girl","person"],["girls","person"],["gist","left"],["gists","square"],["git","box"],["gita","book"],["gitenv","and"],["github","book"],["githubusercontent","box"],["giuseppes","person"],["give","plus"],["given","plus"],["givenate","zero"],["g
ivens","plus"],["giver","person"],["gives","plus"],["giving","plus"],["gl","variable"],["gla","and"],["glacier","snow"],["glad","plus"],["glade","lawn"],["glance","eye"],["gland",""],["glands","box"],["glar","n"],["glaring","plus"],["glass",""],["glasses",""],["glaucoma","minus"],["gleam","light"],["gleaming","light"],["gleeson","person"],["glen","person"],["gliding","right"],["glitch","minus"],["glittering","photon"],["global","world"],["globalization","ball"],["globalizing","planet"],["globalleadershipexperience","plus"],["globally","ball"],["globals","box"],["globe","ball"],["glockenspiel",""],["glockenspiels","xylophone"],["gloria","plus"],["gloriae","plus"],["glories","plus"],["glory","plus"],["glossary","book"],["glove",""],["gloves","glove"],["glow","light"],["glucagon","ball"],["glucose","ball"],["glue","down"],["glued","glue"],["gluggy","minus"],["glycogen","ball"],["glyn","person"],["glyphs","square"],["gmail","paper"],["gnu","software"],["go","right"],["goal","tick"],["goals","n"],["goat",""],["goats","goat"],["goblins","person"],["god","person"],["godfather","person"],["godhead","head"],["godly","person"],["godmother","person"],["godness","plus"],["gods","person"],["goering","person"],["goers","right"],["goes","right"],["goethe","person"],["gofundme","dollarsymbol"],["goggles",""],["goglsgqx","dash"],["going","right"],["goji","plant"],["gold","box"],["goldberg","person"],["golden","box"],["goldilocks","person"],["goldsmith","person"],["goldstein","person"],["gomez","person"],["gondola",""],["gone","zero"],["gong","plus"],["good","plus"],["goodall","person"],["goodbye","n"],["gooddamn","dash"],["goodness","plus"],["goodnesses","plus"],["goodnight","plus"],["goods","box"],["google","building"],["googleapis","box"],["googleloc","dash"],["googleplay","paper"],["googlepreview","square"],["googles","dash"],["googletagmanager","software"],["goonie","person"],["goose",""],["gooseberry","berry"],["gorbachev","person"],["gorgeous","plus"],["gossamer",""],["gossip"
,"mouth"],["gossipname","square"],["got","left"],["goto","right"],["gotostates","right"],["gotu","plant"],["gourd","ball"],["goureeve","person"],["gov","government"],["govender","person"],["govern","plus"],["governance","plus"],["governed","plus"],["government",""],["governments","person"],["governors","person"],["govolunteer","website"],["gown",""],["gp","square"],["gprt","dash"],["gra","dot"],["grabbed","hand"],["grabs","hand"],["grace","plus"],["gracious","plus"],["graciously","plus"],["grad","person"],["grade","a"],["gradebook","book"],["graded","a"],["grades","a"],["gradient","n"],["gradients","n"],["grading","dash"],["gradual","line"],["gradually","one"],["graduate","person"],["graduated","paper"],["graduates","person"],["graduating","tick"],["graduation","plus"],["graduations","plus"],["graham","person"],["grail","cup"],["grain","ball"],["grainge","person"],["grains",""],["gram","and"],["gramm","and"],["grammar","paper"],["grammara","square"],["grammarform","square"],["grammarforms","square"],["grammarinterpreter","dash"],["grammark","and"],["grammarly","tick"],["grammarname","square"],["grammarrulerest","square"],["grammars","dash"],["grammatical","dash"],["grammatically","dash"],["gramophone",""],["grams","box"],["gramsci","person"],["grand","plus"],["grande","person"],["grandfather","person"],["grandma","person"],["grandparents","person"],["granita",""],["granite",""],["grant","right"],["granted","plus"],["granting","plus"],["grants","dollarsymbol"],["granulated","down"],["grape",""],["grapefruit",""],["graph","square"],["graphed","square"],["graphic","square"],["graphical","square"],["graphics","square"],["graphs","square"],["grapple","hand"],["grapples","plus"],["gras","person"],["grasp","hand"],["grasped","hand"],["grass","square"],["grassroots","down"],["grassy","square"],["grate","square"],["grated","down"],["grateful","plus"],["grater",""],["grating","down"],["gratitude","plus"],["grattan","street"],["grave",""],["gravedigger","person"],["gravel","ba
ll"],["graveyards","box"],["gravity","n"],["gray","person"],["great","plus"],["greater","greaterthan"],["greaterthan",""],["greatest","plus"],["greatly","plus"],["gree","person"],["greece","country"],["greed","minus"],["greedson","person"],["greek","person"],["green","square"],["greenhouse","building"],["greenie","person"],["greenlucian","person"],["greens","person"],["greensutra","square"],["greer","person"],["greet","plus"],["greeted","plus"],["greeter","face"],["greg","person"],["gregarious","plus"],["gregory","person"],["gremlin","person"],["grenada","country"],["grenadines","country"],["grene","person"],["greta","person"],["grew","up"],["grey","square"],["greyhen","duck"],["grid",""],["gridline","line"],["gridlines","line"],["grids","square"],["griegian","person"],["grievance","minus"],["grievances","xcorrection"],["grieve","xcorrection"],["griffith","university"],["grip","hand"],["gripped","hand"],["gripping","hand"],["grips","hand"],["grit","plus"],["gritted","plus"],["gritty","plus"],["groan","minus"],["groove","box"],["groovy","plus"],["groped","hand"],["ground","plane"],["grounded","down"],["grounding","square"],["grounds","plus"],["groundsman","person"],["group","square"],["grouped","square"],["grouping","square"],["groupings","box"],["groups","square"],["grow","plant"],["grower","person"],["growing","up"],["growling","line"],["grown","up"],["grows","up"],["growth","up"],["grub",""],["grumbles","minus"],["gs","dash"],["gst","dollarsymbol"],["gt","greaterthan"],["gtag","square"],["gtm","software"],["guage","square"],["guarantee","plus"],["guaranteeing","plus"],["guard","person"],["guarded","plus"],["guardian","person"],["guatemala","country"],["guava",""],["guba","person"],["guerillacreative","company"],["guess","n"],["guessed","plus"],["guesses","questionmark"],["guessing","n"],["guest","person"],["guesthouse","building"],["guestlist","square"],["guests","person"],["gui","square"],["guidance","plus"],["guide","paper"],["guided","right"],["guidelines","squ
are"],["guides","square"],["guiding","line"],["guilds","box"],["guilt","minus"],["guiltily","minus"],["guilty","minus"],["guinea","country"],["guitar",""],["guitarplayer","person"],["gulch","down"],["gum","box"],["gumley","person"],["gumnut",""],["gums","teeth"],["gumtree","tree"],["gun","minus"],["gunfire","minus"],["guns","minus"],["gunshot","minus"],["gupta","person"],["gurgitated","up"],["gurol","person"],["guru","person"],["gut","stomach"],["guts","stomach"],["gutter",""],["guy","person"],["guyana","country"],["guys","person"],["gvim","software"],["gwfkgclcbgasyhq","dash"],["gym","room"],["gymansium","room"],["gymcana","pony"],["gymnast","person"],["gyrate","right"],["gülcan","person"],["h","variable"],["ha","plus"],["habilitate","plus"],["habit","plus"],["habitable","plus"],["habits","plus"],["hack","down"],["hacker","person"],["hackernoon","dash"],["hackers","person"],["hackett","person"],["hacking","down"],["had","square"],["hadj","person"],["hadn","xcorrection"],["hadron","right"],["haemorrhoids","minus"],["haggedly","minus"],["haggish","minus"],["hagiography","person"],["haha","plus"],["hail","ball"],["hailey","person"],["hair",""],["haircut","hair"],["hairdresser","person"],["hairdressing","scissors"],["haired","hair"],["hairstyles","hair"],["hairstylist","person"],["haiti","country"],["haitsma","person"],["halchitectures","square"],["half","box"],["halfway","n"],["halides","ball"],["hall","building"],["halls","hall"],["hallucinated","box"],["hallucination","minus"],["hallucinations","minus"],["hallucinatory","minus"],["hallucinogenic","minus"],["hallway","hall"],["halo","circle"],["halt","box"],["halted","xcorrection"],["halting","zero"],["halved","box"],["halves","square"],["ham",""],["hamburger",""],["hamish","person"],["hamlet","person"],["hammer",""],["hammersley","person"],["hammock",""],["hampstead","city"],["hamster",""],["hamstring","line"],["hand",""],["handed","hand"],["handful","box"],["handing","hand"],["handkerchief",""],["handle",""],["hand
led","hand"],["handler","hand"],["handlers","hand"],["handles","handle"],["handling","hand"],["handmade","box"],["hands","hand"],["handshake","hand"],["handstand","down"],["handwriting","square"],["handwritten","square"],["handy","hand"],["hang","person"],["hanging","down"],["hangs","minus"],["hank","person"],["hannaford","person"],["hannigan","person"],["hans","person"],["hansard","person"],["hansen","person"],["hansom","car"],["hap","dash"],["happelstanded","plus"],["happen","tick"],["happened","tick"],["happening","zero"],["happenings","box"],["happens","tick"],["happies","plus"],["happiest","plus"],["happily","plus"],["happiness","plus"],["happinesses","plus"],["happisissiances","plus"],["happy",""],["happy","plus"],["happyface",""],["happynices","person"],["hapsichords","plus"],["harass","minus"],["harassment","minus"],["harbinger","person"],["hard","minus"],["hardback","book"],["harddiskless","xcorrection"],["harder","minus"],["hardest","n"],["hardgrave","person"],["hardily","plus"],["hardly","minus"],["hardship","minus"],["hardware","box"],["hargraves","person"],["harlequinade","person"],["harlequinades","person"],["harlequins","harlequin"],["harley","person"],["harm","minus"],["harmed","minus"],["harmful","minus"],["harming","minus"],["harmless","plus"],["harmlessly","plus"],["harmlessness","plus"],["harmonica",""],["harmonics","n"],["harmonies","note"],["harmonious","plus"],["harmonised","equals"],["harmoniser","box"],["harmony","note"],["harmonyinstrumentnumber","n"],["harmonyinstruments","tube"],["harmonyinstrumentslength","n"],["harmonypart","note"],["harmonyparts","line"],["harmonypartslength","n"],["harmonyr","plus"],["harness",""],["harnesses","right"],["harnessing","plus"],["harp",""],["harpist","person"],["harps","harp"],["harpsichord","note"],["harpsichorders","note"],["harpsichords","note"],["harraway","person"],["harrington","person"],["harris","person"],["harrison","person"],["harry","person"],["harsha","person"],["harvard","university"],["harve
st","plant"],["harvested","grain"],["harvester","up"],["has","pencilsharpener"],["hash",""],["haskell","and"],["hasn","minus"],["hassaan","person"],["hasseldine","person"],["hat",""],["hate","minus"],["hated","minus"],["hating","minus"],["hatred","minus"],["hats","hat"],["hatted","hat"],["haughty","minus"],["hauled","up"],["have","apple"],["haven","home"],["haverbanders","person"],["havilland","person"],["having","hand"],["hawke","person"],["hay",""],["hayes","person"],["haystack","hay"],["hazelnut","nut"],["hazen","person"],["hazily","box"],["hbb","dash"],["hcl",""],["hd","box"],["hdd","box"],["hdmi","square"],["hdrdocuments","paper"],["he","person"],["head",""],["headache","minus"],["headaches","minus"],["headdress",""],["headed","right"],["header","square"],["headers","square"],["heading","square"],["headings","square"],["headphones",""],["headpiece",""],["headroom","box"],["heads","dot"],["headshot","square"],["headsofstate","person"],["headwind","left"],["heal","tick"],["healed","plus"],["healing","plus"],["health","person"],["healthier","plus"],["healthily","plus"],["healthiness","plus"],["healthy","person"],["heapings","ball"],["heaps","n"],["hear","ear"],["heard","ear"],["hearing","ear"],["heart",""],["heartbeat","heart"],["hearted","plus"],["heartening","plus"],["heartland","city"],["hearts","heart"],["heat","n"],["heated","n"],["heater",""],["heaters","heater"],["heathcote","person"],["heather","person"],["heating","n"],["heaven","zero"],["heavenly","plus"],["heavier","greaterthan"],["heavily","n"],["heavy","n"],["hebrew","book"],["hecklers","minus"],["hecs","dollarsymbol"],["hedgehog",""],["hedgehogged","plus"],["hedonistic","minus"],["heel",""],["hegel","person"],["hegelgreen","paper"],["hegelian","person"],["hegelpossyllabus","book"],["hei","person"],["heidegger","person"],["heideggerian","person"],["heidi","person"],["height","n"],["heights","zline"],["heiness","person"],["heir","person"],["heisenberg","zero"],["held","hand"],["helen","person"],["helen
a","person"],["helicopter",""],["heliport","square"],["helium",""],["hello","hand"],["helloa","plus"],["helloaa","plus"],["helloc","plus"],["helloca","plus"],["hellocc","plus"],["hellocca","plus"],["hellod","plus"],["hellodc","plus"],["hellok","plus"],["hellow","plus"],["help","square"],["helpdesk","desk"],["helped","person"],["helper","person"],["helpers","person"],["helpful","plus"],["helpfulness","plus"],["helpfulnesses","plus"],["helping","plus"],["helps","plus"],["helpt","plus"],["helter","n"],["hem","line"],["hemel","city"],["hemisphere",""],["hemispheres","hemisphere"],["hemmed","left"],["hen","duck"],["hence","right"],["henderson","person"],["henle","person"],["henry","person"],["hens","duck"],["henson","person"],["heptagon",""],["her","person"],["herb","plant"],["herbal","amount"],["herbology","plant"],["herbs","amount"],["here","down"],["heritage","square"],["hermaphrodites","person"],["hermaphroditic","minus"],["hermeneutic","dash"],["hermeneutical","book"],["hermeneutically","right"],["hermeneutics","square"],["hermia","person"],["hernán","person"],["hero","person"],["heroes","person"],["heroic","plus"],["herren","person"],["herrings","fish"],["hers","person"],["herself","person"],["hertz","n"],["herzegovina","country"],["hesh","person"],["heshan","person"],["heshe","person"],["hesitation","minus"],["hesterity","plus"],["heterogeneous","notequals"],["heterosexual","person"],["heuristic","plus"],["heuristics","and"],["hewso","book"],["hexagon",""],["hexagonal","hexagon"],["hexagons","hexagon"],["hey","one"],["hhh","dash"],["hhi","square"],["hhii","square"],["hi","hand"],["hibisci","plant"],["hibiscus","plant"],["hibiscuses","flower"],["hiccups","minus"],["hickletons","plus"],["hid","minus"],["hidden","down"],["hide","minus"],["hides","box"],["hiding","box"],["hierarchical","dash"],["hierarchically","dash"],["hierarchies","dash"],["hierarchy","dash"],["hieroglyph","square"],["hieroglyphics","square"],["hieroglyphs","square"],["high","a"],["higher","line"],
["highest","n"],["highlight","light"],["highlighted","square"],["highlights","plus"],["highly","plus"],["hight","n"],["highton","suburb"],["highway","road"],["highways","road"],["hike","right"],["hiked","right"],["hiking","boot"],["hilbert","person"],["hill",""],["hillary","person"],["hills","hill"],["hilsley","person"],["him","person"],["himlkampf","book"],["himself","person"],["hinder","minus"],["hindered","minus"],["hindi","language"],["hindsight","eye"],["hindu","person"],["hinduism","book"],["hinge",""],["hint","square"],["hinted","square"],["hintinesses","plus"],["hints","plus"],["hipaa","square"],["hippies","person"],["hippolyte","person"],["hips",""],["hiptalipuppies","puppy"],["hiraffe","giraffe"],["hire","dollarsymbol"],["hired","plus"],["hires","dollarsymbol"],["hiring","plus"],["his","person"],["hissed","minus"],["historian","person"],["historians","person"],["historic","book"],["historical","n"],["historically","book"],["histories","book"],["history","square"],["hit","tick"],["hitchens","person"],["hither","right"],["hithertoness","note"],["hitler","person"],["hits","note"],["hitting","equals"],["hive",""],["hived","bee"],["hjsdfsa","dash"],["hk","dash"],["hkq","dash"],["hlo","dash"],["ho","plus"],["hoax","minus"],["hobbies","plus"],["hoboken","person"],["hoc","down"],["hog","pig"],["hoist","up"],["hoisted","up"],["hoity","plus"],["holbrook","person"],["hold","hand"],["holder","person"],["holders","hand"],["holding","hand"],["holds","hand"],["hole",""],["holes","circle"],["holiday","square"],["holidays","square"],["holily","plus"],["holiness","plus"],["holistic","plus"],["holistically","plus"],["hollownesses","box"],["hollywood","city"],["holocap",""],["hologram","box"],["holograms","right"],["hologranates","pomegranate"],["holographic","box"],["holy","paper"],["home","house"],["homebrew","software"],["homeless","person"],["homelessness","minus"],["homemade","plus"],["homeostasis","line"],["homepage","paper"],["homes","house"],["homework","paper"],["hom
icide","minus"],["hominems","person"],["homo","person"],["homoeroticness","plus"],["homographs","equals"],["homophobe","minus"],["homophones","equals"],["homophonous","equals"],["homosexual","person"],["homosexuality","equals"],["homosexuals","person"],["honduras","country"],["hone","square"],["honed","plus"],["honest","plus"],["honestly","plus"],["honesty","plus"],["honey",""],["honeycomb","hexagon"],["hong","city"],["honing","square"],["honked","horn"],["honking","horn"],["honky","ball"],["honorary","plus"],["honorific","square"],["honour","a"],["honourable","a"],["honourably","plus"],["honoured","a"],["honouringly","plus"],["honours","book"],["hood","person"],["hook",""],["hooked","hook"],["hooks","hook"],["hooky","hook"],["hoop",""],["hoops","hoop"],["hop","up"],["hope","plus"],["hoped","plus"],["hopeful","plus"],["hopefully","a"],["hopefully","plus"],["hopes","plus"],["hopetoun","city"],["hoping","box"],["hopkins","person"],["hopping","right"],["hops","right"],["horace","person"],["horizon","line"],["horizons","horizon"],["horizontal","line"],["horizontally","line"],["hormone","ball"],["hormones","box"],["horn",""],["horrific","minus"],["horse",""],["horseman","person"],["horsemen","person"],["horses","horse"],["horticulture","plant"],["hose","tube"],["hosing","tube"],["hospital","building"],["hospitalisation","plus"],["hospitalisations","plus"],["hospitality","plus"],["hospitals","building"],["host","computer"],["hosted","plus"],["hostel","building"],["hostility","minus"],["hosting","box"],["hot","n"],["hotel","building"],["hotline","n"],["hotspots","square"],["hotter","up"],["houghton","person"],["hour","square"],["hourglass",""],["hourlinesses","n"],["hourly","n"],["hours","square"],["hoursprayer","square"],["hourvaluea","variable"],["hourvalueb","variable"],["house",""],["housed","house"],["household","house"],["housekeeper","person"],["housekeeping","plus"],["houses","house"],["housing","house"],["hove","person"],["hover","up"],["hovercar",""],["how","dash
"],["howard","person"],["howardfinestudio","company"],["howdy","plus"],["howe","person"],["howell","company"],["howells","company"],["however","minus"],["hows","person"],["howzat","ball"],["hpc","computer"],["hq","plus"],["hr","book"],["href","dash"],["ht","dash"],["html","paper"],["htmlpreview","square"],["http","tick"],["httpd","software"],["https","tick"],["hu","person"],["hub","box"],["hubpages","company"],["hubspot","company"],["huckleys","building"],["hudson","company"],["hug",""],["huge","n"],["huggable","person"],["hugged","hug"],["hugging","person"],["hugo","person"],["huh","minus"],["hull","box"],["hulpke","person"],["hum","square"],["human","person"],["humane","plus"],["humanism","book"],["humanist","person"],["humanists","person"],["humanitarian","person"],["humanitas","book"],["humanities","building"],["humanity","person"],["humankind","person"],["humanness","plus"],["humanoid","person"],["humans","person"],["humble","plus"],["humbled","plus"],["humbleness","plus"],["humbling","plus"],["humiliate","minus"],["humility","plus"],["humming","lip"],["hummus",""],["humorous","plus"],["humour","plus"],["humourous","plus"],["humphrey","person"],["humphries","person"],["humpy",""],["hun","person"],["hundred","square"],["hundreds","threerectangle"],["hung","down"],["hungary","country"],["hunger","apple"],["hungry","tongue"],["hunt","eye"],["hunted","dot"],["hurdle","box"],["hurdles","hurdle"],["hurt","minus"],["hurting","minus"],["hurtled","right"],["hurtness","minus"],["hurts","minus"],["husband","person"],["husbands","person"],["hush","zero"],["huskies","dog"],["husky","dog"],["husserl","person"],["huston","person"],["hut","building"],["hvuetwgillii","dash"],["hw","paper"],["hyam","person"],["hybrids","plus"],["hydrangeas","flower"],["hydrant","water"],["hydrated","water"],["hydrochloric","hcl"],["hydrogen",""],["hydroponics","water"],["hygiene","plus"],["hygienic","plus"],["hyle","box"],["hyper","right"],["hyperbole","up"],["hypercard","software"],["hyperlinks
","right"],["hyperlog","square"],["hypervacillations","right"],["hyphen",""],["hypochondria","minus"],["hypotenuse","line"],["hypothalamus","box"],["hypotheses","tick"],["hypothesis","one"],["hypothesise","one"],["hypothesised","square"],["hypothetical","square"],["hysteresis","dash"],["hysterically","plus"],["hz","n"],["i","person"],["ia","plus"],["iable","dash"],["iacit","right"],["iaclcbgasyhq","dash"],["ian","person"],["iata","book"],["iax","dash"],["ibid","book"],["ibidem","right"],["ibis",""],["ibises","ibis"],["ibooks","and"],["ice","person"],["icecream",""],["iceland","country"],["icicle",""],["icing",""],["icon","square"],["iconism","square"],["icons","square"],["icsamined","plus"],["icy","ice"],["id","square"],["idahobit","plus"],["idea","a"],["ideal","plus"],["idealism","circle"],["idealist","person"],["ideality","plus"],["ideally","plus"],["ideas","tworectangle"],["ideation","square"],["idemic","dash"],["identical","equals"],["identicalness","equals"],["identifiable","plus"],["identification","square"],["identified","eye"],["identifier","n"],["identifies","tick"],["identify","a"],["identifying","n"],["identifywriters","person"],["identigenitrix","plus"],["identities","person"],["identity","square"],["ideological","plus"],["ideologies","book"],["ideology","square"],["idiom","square"],["idiosyncrasies","minus"],["idiot","person"],["idiots","person"],["idle","zero"],["ids","square"],["idyll","flower"],["ie","person"],["ieee","organisation"],["if","dash"],["ifind","plus"],["iframe","square"],["iglesias","person"],["ignition","right"],["ignoramus","person"],["ignorance","minus"],["ignore","xcorrection"],["ignored","xcorrection"],["ignores","xcorrection"],["ignoring","xcorrection"],["igual","dash"],["ih","dash"],["iherb","company"],["ii","tworectangle"],["iie","computer"],["iii","dash"],["ij","square"],["ikinger","person"],["il","square"],["ilham","person"],["ill","minus"],["illegal","minus"],["illness","minus"],["illocutionary","plus"],["illuminated","light"]
,["illuminati","person"],["illusion","plus"],["illustator","software"],["illustrate","square"],["illustrated","square"],["illustrates","square"],["illustration","square"],["illustrations","square"],["illustrative","square"],["illustrator","software"],["illustrious","plus"],["ilo","plus"],["ilpi","right"],["ilya","person"],["im","company"],["imac","computer"],["image","square"],["imageric","square"],["imagery","square"],["images","square"],["imaginable","square"],["imaginary","plus"],["imagination","plus"],["imagine","square"],["imagined","plus"],["imagining","square"],["imaiangreen","person"],["imbalance","minus"],["img","square"],["imitate","right"],["imitated","right"],["imitates","right"],["imitating","right"],["imitation","box"],["immature","minus"],["immaturity","minus"],["immediate","zero"],["immediately","one"],["immersed","down"],["immersing","down"],["immersion","down"],["immigration","right"],["imminent","zero"],["imminently","n"],["immoral","minus"],["immorality","minus"],["immortal","n"],["immortality","n"],["immortals","person"],["immune","plus"],["immunity","plus"],["immunogymnastics","plus"],["immunology","book"],["imogen","person"],["impact","tick"],["impacted","tick"],["impactful","a"],["impacts","plus"],["impair","minus"],["impaired","minus"],["impairment","minus"],["impart","square"],["impartial","plus"],["impeccable","plus"],["impede","minus"],["impending","zero"],["impenetrable","xcorrection"],["imperative","right"],["imperatives","right"],["imperfect","minus"],["imperial","person"],["imperialism","country"],["impersonal","minus"],["impetus","plus"],["implant","box"],["implanted","down"],["implants","box"],["implement","plus"],["implementation","plus"],["implementations","and"],["implemented","right"],["implementing","and"],["implements","right"],["implication","right"],["implications","right"],["implicit","down"],["implicitly","box"],["implied","right"],["implies","right"],["imply","right"],["implying","right"],["import","right"],["importance",
"plus"],["important","plus"],["importantly","plus"],["imported","down"],["importing","down"],["imposed","minus"],["impossible","minus"],["impress","plus"],["impressed","plus"],["impression","square"],["impressionism","square"],["improper","minus"],["improperly","minus"],["improve","plus"],["improved","up"],["improvement","plus"],["improvements","plus"],["improves","plus"],["improving","plus"],["impulse","tick"],["impurities","minus"],["impurity","minus"],["in","square"],["inability","minus"],["inaction","minus"],["inactions","minus"],["inadequacies","minus"],["inadequate","minus"],["inadvertently","minus"],["inappropriate","minus"],["inbound","right"],["inbox","box"],["inc","plus"],["incan","person"],["incapable","minus"],["incapacitated","minus"],["incentive","plus"],["inception","zero"],["inch","n"],["inches","n"],["incidence","n"],["incident","dot"],["incidental","n"],["incidents","box"],["incision","knife"],["incite","mouth"],["incl","box"],["incline","n"],["inclish","suitcase"],["include","plus"],["included","plus"],["includes","left"],["including","plus"],["inclusion","box"],["inclusions","dash"],["inclusive","box"],["inclusively","plus"],["inclusiveness","plus"],["incognito","dash"],["income","dollarsign"],["incompatibility","xcorrection"],["incompatible","xcorrection"],["incomplete","minus"],["incompletely","minus"],["incongruous","minus"],["inconsequential","plus"],["inconsistencies","minus"],["inconsistent","minus"],["incorporate","down"],["incorporated","right"],["incorporating","dot"],["incorrect","xcorrection"],["incorrectly","minus"],["increase","plus"],["increased","plus"],["increaser","plus"],["increases","up"],["increasing","plus"],["increasingly","up"],["increment","plus"],["incrementally","right"],["incubator","heater"],["ind","up"],["indeed","plus"],["indefensible","minus"],["indefinitive","minus"],["indefinitively","xcorrection"],["indent","dash"],["indenting","right"],["independent","one"],["independently","plus"],["index","dash"],["indexanimhi
ghered","paper"],["indexdownload","paper"],["indexga","dash"],["indexing","n"],["indexped","paper"],["indextexttobr","paper"],["india","country"],["indian","person"],["indiana","person"],["indic","plus"],["indicate","a"],["indicated","tick"],["indicates","up"],["indicating","dash"],["indicative","plus"],["indicator","n"],["indicators","i"],["indices","n"],["indie","plus"],["indieradioshow","note"],["indigenous","person"],["indirect","dot"],["indispensable","plus"],["indiv","person"],["individual","person"],["individualistic","person"],["individuality","plus"],["individually","person"],["individualness","person"],["individuals","person"],["individuation","plus"],["indonesia","country"],["indonesian","book"],["indoor","building"],["indoors","building"],["indubitably","plus"],["induc","up"],["induced","right"],["induct","plus"],["induction","plus"],["inductive","up"],["inductively","up"],["indulge","plus"],["indulging","plus"],["industrial","company"],["industrialisation","planet"],["industries","company"],["industry","building"],["ineachiated","box"],["ineffective","minus"],["inefficient","minus"],["inepretaer","dash"],["inept","minus"],["inequalities","greaterthan"],["inequality","notequals"],["inequity","minus"],["inert","zero"],["inessive","box"],["inevitable","plus"],["inexorable","plus"],["inexpensive","n"],["inexplicable","minus"],["inexplicably","minus"],["inextricably","plus"],["infallibility","plus"],["infallible","plus"],["infant","baby"],["infatuations","plus"],["infect","minus"],["infection","virus"],["infections","minus"],["infer","right"],["inference","right"],["inferenced","right"],["inferences","right"],["inferred","right"],["infertile","minus"],["infiltrate","right"],["infiltrating","xcorrection"],["infinary","n"],["infinite","circle"],["infinitely","n"],["infinitesimally","n"],["infinitive","right"],["infinitum","n"],["infinity","n"],["inflammation","minus"],["inflated","ball"],["inflating","up"],["inflection","up"],["inflow","right"],["influence","p
erson"],["influenced","right"],["influences","right"],["influencing","plus"],["influential","plus"],["influenza","virus"],["influx","right"],["info","paper"],["infopreneur","person"],["infopreneurs","person"],["infopreneursummit","square"],["inform","mouth"],["informal","plus"],["informally","minus"],["information","paper"],["informational","square"],["informative","square"],["informed","plus"],["informer","person"],["infra","n"],["infrastructure","box"],["infringement","minus"],["infringers","minus"],["infringing","minus"],["infusion","down"],["ing","right"],["ingenue","plus"],["ingest","stomach"],["inglish","hinge"],["ingredient","apple"],["ingredients","apple"],["inhabit","down"],["inhabitant","person"],["inhabitants","person"],["inhabiting","plus"],["inhalation","lung"],["inhale","air"],["inhaled","air"],["inherent","plus"],["inherentana","plus"],["inherently","plus"],["inherit","left"],["inherited","left"],["inhibit","minus"],["inhibiting","xcorrection"],["inhumane","minus"],["init","one"],["initial","one"],["initialise","zero"],["initialising","plus"],["initialization","zero"],["initially","one"],["initialphilosophy","square"],["initiate","one"],["initiated","one"],["initiates","one"],["initiating","zero"],["initiation","zero"],["initiative","right"],["initiatives","plus"],["initiator","right"],["initpredicate","square"],["inject","syringe"],["injecting","syringe"],["injunctive","box"],["injuries","minus"],["injury","minus"],["injustice","minus"],["injustices","minus"],["ink",""],["inky",""],["inkyclassic","inky"],["inline","dash"],["inmodes","right"],["innate","plus"],["inner","box"],["innocence","plus"],["innocent","plus"],["innovate","plus"],["innovates","plus"],["innovation","a"],["innovations","plus"],["innovative","plus"],["innovators","plus"],["inopportune","minus"],["input","variable"],["inputasantecedant","n"],["inputfile","paper"],["inputs","n"],["inputted","n"],["inputting","right"],["inputtype","square"],["inputvalues","n"],["inputvarlist","twobox"
],["inputvars","variable"],["inputvarsa","variable"],["inquests","line"],["inquired","square"],["inquirer","person"],["inquiry","square"],["ins","box"],["inscription","pen"],["insect",""],["insects","insect"],["insensitive","minus"],["insentient","minus"],["inseparability","equals"],["inseparable","equals"],["insert","box"],["insertdoublebackslashbeforequote","box"],["inserted","xline"],["inserting","down"],["insertion","box"],["inserts","box"],["inside","box"],["insider","person"],["insight","eye"],["insightful","plus"],["insights","plus"],["insomnia","minus"],["inspect","eye"],["inspected","eye"],["inspecting","eye"],["inspector","person"],["inspiration","plus"],["inspire","right"],["inspired","tick"],["inspires","plus"],["inspiring","plus"],["inst","note"],["instability","minus"],["instagram","company"],["install","down"],["installation","down"],["installed","down"],["installer","down"],["installers","down"],["installing","down"],["installs","right"],["instalments","dollarsymbol"],["instance","n"],["instances","n"],["instant","zero"],["instantiated","plus"],["instantiation","tick"],["instantly","one"],["instead","one"],["instincts","plus"],["institute","company"],["instituted","plus"],["instituting","plus"],["institution","organisation"],["institutional","building"],["institutionalised","plus"],["institutions","building"],["instr","note"],["instruct","tick"],["instructed","paper"],["instructing","square"],["instruction","square"],["instructions","paper"],["instructor","person"],["instructs","paper"],["instructure","square"],["instrument","pen"],["instrumental","guitar"],["instrumentals","note"],["instrumentation","note"],["instrumentlist","square"],["instrumentnumber","n"],["instruments","violin"],["insulator","box"],["insulin","box"],["insult","minus"],["insulted","minus"],["insurance","paper"],["int","n"],["intact","line"],["intactify","plus"],["intangible","square"],["integer","n"],["integrate","one"],["integrated","plus"],["integrates","box"],["integrating","
down"],["integration","box"],["integrations","box"],["integrative","dash"],["integrator","equals"],["integrity","plus"],["integument","right"],["integumentary","skin"],["intell","plus"],["intellect","plus"],["intellectual","plus"],["intellectually","brain"],["intellectuals","person"],["intelligence","plus"],["intelligences","square"],["intelligent","plus"],["intelligently","and"],["intend","dash"],["intended","right"],["intends","right"],["intense","n"],["intensify","plus"],["intensity","n"],["intensive","a"],["intention","right"],["intentions","square"],["inter","dash"],["interact","plus"],["interacted","dash"],["interacting","tick"],["interaction","plus"],["interactions","right"],["interactive","square"],["interacts","right"],["intercept","c"],["intercepted","equals"],["intercepting","box"],["intercepts","n"],["interchangeable","box"],["interchanged","right"],["interchanges","right"],["intercommunicated","right"],["interconnectable","right"],["interconnected","right"],["interconnection","right"],["interconnections","right"],["intercourse","square"],["interdependence","plus"],["interdependent","right"],["interdisciplinary","tworectangle"],["interest","plus"],["interested","plus"],["interesting","tick"],["interestingengineering","company"],["interestingly","plus"],["interestingness","plus"],["interests","square"],["interface","square"],["interfaced","square"],["interfaces","square"],["interfacing","equals"],["interference","minus"],["intergalactic","right"],["intergenre","right"],["interim","n"],["interior","room"],["interjected","mouth"],["interleaved","line"],["interludes","line"],["intermediate","box"],["intermingled","right"],["intermittently","dot"],["intermix","box"],["intern","person"],["internal","square"],["internalised","box"],["internally","box"],["international","planet"],["internationalised","square"],["internationally","ball"],["internet","book"],["internetics","company"],["internship","paper"],["interobject","right"],["interp","and"],["interpersonal",
"person"],["interplanetary","planet"],["interplay","right"],["interpolate","right"],["interpolating","right"],["interpolation","right"],["interpret","person"],["interpretation","paper"],["interpretations","plus"],["interpretative","plus"],["interpretbody","square"],["interpretbodylp","square"],["interpreted","dash"],["interpreter","person"],["interpreters","square"],["interpreting","right"],["interpretpart","dash"],["interprets","dash"],["interpretstatement","dash"],["interpretstatementlp","right"],["interquartile","minus"],["interrelate","right"],["interrelated","right"],["interrelatedness","dash"],["interrelations","right"],["interrelationship","right"],["interrelationships","right"],["interrogate","xcorrection"],["interrogated","square"],["interrupt","box"],["interrupted","minus"],["interrupting","box"],["interruption","minus"],["interruptions","minus"],["intersect","box"],["intersected","box"],["intersecting","box"],["intersection","and"],["intersectional","box"],["intersectionality","box"],["intersections","box"],["intersex","two"],["interspecies","dash"],["interspersed","right"],["intersplicings","plus"],["interstellar","ball"],["interstellarphone","right"],["interstitial","box"],["interstitium","box"],["intersubjectively","plus"],["intersubjectivity","right"],["intertextual","right"],["intertextualise","right"],["intertextualised","right"],["intertextuality","box"],["intertwine","two"],["intertwined","line"],["intertwining","right"],["intertwiningly","right"],["interval","minus"],["intervals","minus"],["intervention","box"],["interview","dash"],["interviewed","person"],["interviewer","person"],["interviewers","person"],["interviewing","square"],["interviews","person"],["interweaved","right"],["interweaving","right"],["intestine",""],["intestines",""],["intimacy","equals"],["intimate","equals"],["intimation","plus"],["intmath","equals"],["into","square"],["intoxicated","minus"],["intoxicating","minus"],["intra","right"],["intranet","square"],["intransitive","d
ot"],["intricate","square"],["intrigued","plus"],["intriguing","plus"],["intrinsic","box"],["intro","square"],["introduce","dash"],["introduced","one"],["introduces","one"],["introducing","dash"],["introduction","square"],["introductions","square"],["introductory","one"],["introns","right"],["introvert","square"],["introverted","minus"],["introverts","square"],["intructions","paper"],["intuition","plus"],["intuitive","plus"],["intuitively","plus"],["intuitivity","plus"],["invalid","minus"],["invariant","equals"],["invasive","minus"],["invasively","minus"],["invasiveness","minus"],["invent","box"],["invented","box"],["inventing","box"],["invention","and"],["inventions","and"],["inventive","and"],["inventors","person"],["inventory","box"],["invents","plus"],["inverse","right"],["inversely","right"],["invert","right"],["inverted","right"],["inverter","x"],["inverting","right"],["inverts","right"],["invest","dollarsymbol"],["invested","dollarsymbol"],["investigate","plus"],["investigated","eye"],["investigates","eye"],["investigating","eye"],["investigation","tick"],["investigations","book"],["investing","dollarsymbol"],["investism","dollarsymbol"],["investment","n"],["investments","dollarsymbol"],["investor","person"],["investors","person"],["invigilated","plus"],["invincible","plus"],["invisible","square"],["invisibles","zero"],["invisibly","zero"],["invitation","paper"],["invite","left"],["invited","plus"],["invites","right"],["inviting","left"],["invitralised","plus"],["invoking","plus"],["involuntary","zero"],["involve","left"],["involved","tick"],["involvement","plus"],["involves","left"],["involving","plus"],["inwards","right"],["io","right"],["iodine","ball"],["ion","ball"],["ios","square"],["ip","n"],["ipad","square"],["iphone",""],["iphones","iphone"],["ipn","n"],["iq","a"],["iqr","minus"],["iqrexcel","box"],["iran","country"],["iranian","person"],["iraq","country"],["ireland","country"],["irigaray","person"],["iris","circle"],["irish","person"],["iron",""],["
ironed","iron"],["ironic","right"],["ironically","plus"],["ironism","plus"],["irony","right"],["irrational","minus"],["irrationality","minus"],["irrazionali","minus"],["irregular","plus"],["irrel","minus"],["irrelevant","minus"],["irresponsibility","minus"],["irreversible","minus"],["irritable","minus"],["irritating","minus"],["irst","dash"],["irwin","person"],["is","equals"],["isbn","n"],["iscomparison","equals"],["isempty","box"],["ishq","person"],["isla","square"],["islam","religion"],["islamic","person"],["island",""],["islands","island"],["islist","square"],["ism","box"],["isn","minus"],["iso","paper"],["isolate","box"],["isolated","box"],["isolation","one"],["isop","equals"],["isoperator","equals"],["isplus","plus"],["israel","country"],["israeli","country"],["isreview","book"],["iss","book"],["isset","equals"],["issue","dash"],["issues","dash"],["istituto","building"],["isval","equals"],["isvalempty","n"],["isvalstr","threebox"],["isvalstrempty","tworectangle"],["isvalstrorundef","box"],["isvar","equals"],["it","down"],["it","up"],["italian","book"],["italic","line"],["italics","square"],["italy","country"],["itch","minus"],["itchiness","minus"],["item","n"],["itema","n"],["itema","square"],["itemid","n"],["itemised","box"],["itemlabels","square"],["itemname","square"],["itemnumber","n"],["items","n"],["itemsa","n"],["itemsa","square"],["iterate","n"],["iterated","n"],["iteration","n"],["iterations","n"],["iterative","n"],["iterativedeepeningdepth","variable"],["ith","dash"],["itinerary","square"],["itness","plus"],["its","up"],["itself","up"],["itunes","software"],["iup","book"],["iv","fourrectangle"],["ivci","line"],["ivf","right"],["ivison","person"],["ivle","square"],["ivo","person"],["ivoire","country"],["ivory",""],["iwda","dash"],["ixamine","plus"],["izcoatl","person"],["izzimokays","plus"],["j",""],["jack","person"],["jackdaw","bird"],["jacket",""],["jackova","person"],["jackson","person"],["jacob","person"],["jacqueline","person"],["jacques","person"
],["jag","person"],["jagged","minus"],["jail","building"],["jailed","xcorrection"],["jailings","xcorrection"],["jairo","person"],["jakat","person"],["jake","person"],["jam",""],["jamaica","country"],["james","person"],["jameson","person"],["jamesroyalmelbournehospital","person"],["jamie","person"],["jan","person"],["jane","person"],["janette","person"],["jangle","plus"],["january","square"],["japan","country"],["japanese","book"],["japonica","flower"],["jar",""],["jarful","jar"],["jargon","square"],["jarvis","person"],["jase","person"],["jaskula","person"],["jasmine","person"],["jason","person"],["java","and"],["javascript","square"],["jaw",""],["jay","person"],["jaynes","person"],["jazz","note"],["jazzed","note"],["jb","dash"],["je","person"],["jealous","minus"],["jed","person"],["jeffrey","person"],["jeffs","person"],["jekyll","person"],["jellioes","lolly"],["jelly",""],["jeloquated","plus"],["jelucian","person"],["jennifer","person"],["jenny","person"],["jensen","person"],["jepsen","person"],["jeremy","person"],["jerked","right"],["jerome","person"],["jerry","person"],["jess","person"],["jesse","person"],["jessica","person"],["jest","plus"],["jester","person"],["jesuit","person"],["jesus","person"],["jet","plane"],["jets","jet"],["jettilise","xcorrection"],["jewel",""],["jewels","jewel"],["jewish","person"],["jews","person"],["jezebel","person"],["jf","dash"],["jhrktec","dash"],["jian","ball"],["jiggled","up"],["jiggling","up"],["jigsaw","square"],["jill","person"],["jillian","person"],["jilted","minus"],["jim","person"],["jimbo","person"],["jimenez","person"],["jimmy","person"],["jing","person"],["jirga","person"],["jitterbug","bug"],["jive","note"],["jives","note"],["jms","dash"],["jn","dash"],["jo","person"],["job","person"],["jobmedicine","plus"],["jobs","person"],["joe","person"],["joey","person"],["jog","right"],["jogged","right"],["jogging","right"],["john","person"],["johnson","person"],["join","tick"],["joined","plus"],["joiner","plus"],["joining","twore
ctangle"],["joins","box"],["joint","box"],["joints","dot"],["jointure","equals"],["joke","plus"],["joked","plus"],["jokes","plus"],["jokester","person"],["joking","plus"],["jompelise","dash"],["jones","person"],["jong","person"],["jonno","person"],["jonquil","flower"],["joon","person"],["jordan","person"],["joseph","person"],["josh","person"],["jost","person"],["jostle","minus"],["journal","book"],["journalcode","n"],["journalism","paper"],["journalist","person"],["journals","book"],["journey","path"],["journeyed","right"],["journeys","right"],["jovial","plus"],["joy","plus"],["joyful","plus"],["joyfully","plus"],["joyously","plus"],["joysticks","rod"],["jp","dash"],["jpg","square"],["jquery","plus"],["jr","dash"],["js","square"],["jsbasic","and"],["json","dash"],["jt","dash"],["jtgz","dash"],["jube",""],["jubilacious","plus"],["jubilant","plus"],["jubilantly","plus"],["judge","person"],["judged","plus"],["judgement","plus"],["judgements","tick"],["judges","person"],["judging","plus"],["judgment",""],["judgments","square"],["judy","person"],["jug",""],["jughead","person"],["juice",""],["juiced","apple"],["juices",""],["juicily","juice"],["juicing","apple"],["juicy","water"],["jul","square"],["julia","person"],["julian","person"],["julie","person"],["juliet","person"],["july","square"],["julz","person"],["jump","up"],["jumped","right"],["jumper",""],["jumping","up"],["jumps","up"],["junction","equals"],["june","square"],["jungian","person"],["jungle","tree"],["junior","down"],["juniper",""],["jupiter","planet"],["jupyter","software"],["jurisdiction","square"],["jury","person"],["just","one"],["justice","plus"],["justices","person"],["justifiable","plus"],["justification","plus"],["justified","plus"],["justify","right"],["justifying","plus"],["justin","person"],["justly","plus"],["juxtaposing","square"],["jvcf","dash"],["jx","square"],["jyfz","dash"],["k","variable"],["ka","n"],["kadenze","company"],["kadyrov","person"],["kaleidoscope",""],["kalimba","box"],["kamahen"
,"person"],["kamil","person"],["kangaroo",""],["kant","person"],["kantian","person"],["karagoz","person"],["karamazov","person"],["karen","person"],["karl","person"],["karlsson","person"],["kastaniotis","person"],["kat","person"],["kate","person"],["katerelos","person"],["katherine","person"],["kathleea","person"],["kathleen","person"],["kathrin","person"],["kathy","person"],["katie","person"],["katy","person"],["katz","person"],["kaufmann","person"],["kazakhstan","country"],["kdp","software"],["keats","person"],["keep","one"],["keeper","person"],["keeping","box"],["keeps","right"],["keith","person"],["kelly","person"],["kelsey","person"],["kelson","person"],["ken","person"],["kennedy","person"],["kenny","person"],["kenya","country"],["kept","plus"],["ker","plus"],["keras","and"],["kerry","person"],["kettle",""],["key",""],["keyboard",""],["keyboards","keyboard"],["keyhole",""],["keypad","box"],["keypress","box"],["keys","key"],["keysort","right"],["keystroke","square"],["keyvalue","n"],["keyword","square"],["keywords","square"],["kfc","company"],["kg","dot"],["khamenei","person"],["khan","person"],["khurram","person"],["khz","f"],["kick","foot"],["kickboard",""],["kicked","foot"],["kicking","leg"],["kid","child"],["kidding","plus"],["kidman","person"],["kidney",""],["kidneys","kidney"],["kids","child"],["kie","dash"],["kilda","person"],["kill","minus"],["killed","minus"],["killer","person"],["killers","person"],["killing","minus"],["kim","person"],["kinaesthetic","hand"],["kind","variable"],["kinder","plus"],["kindergarten","building"],["kindle","product"],["kindling","stick"],["kindly","plus"],["kindness","plus"],["kinds","box"],["kinesthetic","right"],["kinetic","right"],["king","person"],["kingdom","person"],["kingdoms","person"],["kinglish","paper"],["kings","person"],["kinin","right"],["kip","person"],["kiribati","country"],["kirsten","person"],["kiss","plus"],["kissed","mouth"],["kissing","mouth"],["kit","box"],["kitchen","person"],["kite",""],["kitt","person
"],["kitten",""],["kitts","country"],["kiwi",""],["kkft","dash"],["kl","square"],["klaussinani","person"],["klein","person"],["km","plus"],["kmlf","building"],["knee",""],["knees","knee"],["knelt","down"],["knew","tick"],["knickerbocker","plus"],["knife",""],["knight","person"],["knights","person"],["knit","right"],["knitting","fabric"],["knived","knife"],["knn","n"],["knns","equals"],["knock","equals"],["knocked","right"],["knocking","equals"],["knockout","right"],["knot","right"],["knots","dot"],["know","person"],["knowing","tick"],["knowingly","plus"],["knowledge","book"],["knowledgeable","plus"],["known","plus"],["knowness","plus"],["knownness","plus"],["knowns","plus"],["knows","plus"],["kobylarz","person"],["koch","person"],["kodály","person"],["kohlberg","person"],["kola","plant"],["kolmogorov","person"],["kompozer","pen"],["kong","city"],["konstantatos","person"],["kookaburra","bird"],["korea","country"],["kostakidis","person"],["koto","box"],["kpi","plus"],["kpis","tick"],["kr","square"],["krezi","person"],["kripke","person"],["krishna","person"],["kundalini","dot"],["kuwait","country"],["kvale","person"],["kwlk","dash"],["kyhmsopfy","dash"],["kyrgyzstan","country"],["l","variable"],["la","building"],["lab","room"],["label","square"],["labelall","square"],["labeled","square"],["labeling","square"],["labelled","square"],["labelling","square"],["labels","square"],["labor","room"],["laboratories","room"],["laboratory","room"],["laborious","dash"],["laboriously","minus"],["labour","hand"],["labourer","person"],["labrador","dog"],["labs","room"],["laced","right"],["laces","shoelaces"],["lachlan","person"],["lachy","person"],["lacing","right"],["lack","minus"],["lacking","minus"],["lacom","paper"],["lacomfile","square"],["lacrosse","ball"],["lactic","box"],["ladbury","person"],["ladder",""],["ladders","ladder"],["laden","down"],["ladies","person"],["lady","person"],["lag","n"],["lagging","minus"],["lah","dot"],["laid","down"],["laissez","plus"],["lake","water"],[
"lakeside","right"],["lama","person"],["lamb","person"],["lambda","square"],["lamp","light"],["lan","square"],["land",""],["landed","down"],["landing","square"],["landmarks","dot"],["lands","square"],["landslide","down"],["lane",""],["lanes","lane"],["lang","paper"],["langprolog","and"],["langs","paper"],["language","book"],["languageprolog","and"],["languages","book"],["lanka","country"],["lanolin",""],["lanyard","paper"],["lanzer","person"],["laos","country"],["laoshi","person"],["laozi","person"],["lap","box"],["lapels","lapel"],["lapis",""],["lapping","water"],["laptop",""],["lareau","person"],["large","n"],["largely","plus"],["largenesses","plus"],["larger","greaterthan"],["largest","n"],["larrikin","person"],["larrikinism","plus"],["larry","person"],["laryngitis","minus"],["larynx","box"],["las","note"],["lasagne",""],["lascelles","right"],["lasso",""],["lassoed","lassoo"],["last","n"],["lastchar","square"],["lastcharisaid","square"],["lastcharpartnersaid","square"],["lastcstrings","variable"],["lasted","n"],["lastline","line"],["lastly","n"],["lastname","square"],["lastperson","person"],["lastrule","square"],["lasts","n"],["lastsyllable","square"],["lat","dash"],["latch","equals"],["late","minus"],["latency","n"],["later","two"],["lateral","right"],["latest","n"],["latex","square"],["lather","person"],["latin","book"],["latitude","n"],["latter","two"],["latvia","country"],["lau","person"],["laugh","plus"],["laughed","plus"],["laughing","plus"],["laughter","a"],["launch","up"],["launched","up"],["launches","up"],["launching","plus"],["launchpad","and"],["laundry","water"],["laura","person"],["laurel","flower"],["laurels","flower"],["lavender",""],["law","square"],["lawful","square"],["lawfully","plus"],["lawn",""],["lawrence","person"],["laws","and"],["lawsuit","book"],["lawyer","person"],["lawyers","person"],["lay","down"],["layer","box"],["layered","box"],["layering","line"],["layers","box"],["laying","box"],["layout","paper"],["lays","down"],["lazuli",""],[
"lazy","minus"],["lcritique","variable"],["lcs","variable"],["ldmg","person"],["le","dash"],["lead","plus"],["leader","person"],["leadermode","right"],["leaders","person"],["leadership","person"],["leading","one"],["leads","right"],["leaf",""],["leaflet","paper"],["leagues","dash"],["leah","person"],["lean","line"],["leaning","down"],["leant","arm"],["leap","up"],["leaper","person"],["leaps","right"],["leard","person"],["learn","tick"],["learned","plus"],["learners","person"],["learning","book"],["learnmeditationnow","line"],["learns","plus"],["learnt","plus"],["learntd","square"],["learntm","square"],["learnty","square"],["lease","dollarsymbol"],["leash",""],["least","n"],["leather","box"],["leave","right"],["leaves","tick"],["leaving","right"],["lebanon","country"],["lec","mouth"],["lechte","person"],["lecs","book"],["lectern",""],["lects","book"],["lecture","paper"],["lectured","book"],["lecturer","person"],["lecturers","person"],["lectures","paper"],["lecturing","mouth"],["led","right"],["ledge","box"],["lee","person"],["leece","dash"],["leeway","right"],["left",""],["leftmost","left"],["leftover","n"],["leftovers","box"],["leg","line"],["legacy","paper"],["legal","square"],["legally","plus"],["legible","square"],["legislating","plus"],["legislation","square"],["legislative","book"],["legs","leg"],["leianne","person"],["leila","person"],["leisure","plus"],["lejos","dash"],["lemon",""],["lemonade",""],["len","n"],["lend","hand"],["lending","right"],["lends","plus"],["length","twobox"],["lengthmeditatorsanddoctors","n"],["lengthphil","n"],["lengths","n"],["lengtht","variable"],["lengthways","right"],["lengthwise","line"],["lengthy","n"],["lenient","plus"],["lenin","person"],["lens",""],["lenses","lens"],["lentil",""],["leo","person"],["leon","person"],["leonard","person"],["leonardo","person"],["leonardodavinci","duck"],["leone","country"],["leorke","person"],["leprosy","xcorrection"],["ler","square"],["les","person"],["leslie","person"],["lesotho","country"],["le
ss","minus"],["lessened","down"],["lessening","down"],["lesser","down"],["lesson","paper"],["lessons","square"],["lessthan",""],["leste","country"],["let","equals"],["lets","dash"],["letter","paper"],["lettera","square"],["letterbox",""],["lettering","square"],["letters","paper"],["letting","plus"],["lettuce",""],["lettuces","lettuce"],["lev","dash"],["level","square"],["levels","tworectangle"],["lever","rod"],["leverage","right"],["leveraged","plus"],["levi","person"],["levinas","person"],["levine","person"],["levitate","up"],["levity","minus"],["lewis","person"],["lex","square"],["lexicool","dash"],["leye","person"],["lff","right"],["lfl","dash"],["lg","person"],["lgbtiq","person"],["lgbtiqa","person"],["lgbtiqas","person"],["lgbtqi","person"],["lgbtqia","person"],["lgreen","person"],["li","person"],["liability","minus"],["liable","minus"],["liaise","plus"],["liaison","person"],["liaisons","person"],["lib","book"],["libby","person"],["libdyld","book"],["libedit","pen"],["liberal","plus"],["liberalism","book"],["liberality","plus"],["liberally","plus"],["liberation","plus"],["liberia","country"],["libexec","book"],["librarian","person"],["librarians","person"],["libraries","book"],["library","book"],["libs","book"],["libswipl","book"],["libsystem","book"],["libya","country"],["licence","square"],["licences","square"],["license","paper"],["licensed","paper"],["licenses","paper"],["licensing","square"],["liceo","person"],["lick","tongue"],["licked","tongue"],["licking","tongue"],["licks","tongue"],["licky","tongue"],["licorice",""],["liculia","plus"],["lid",""],["lie","minus"],["liechtenstein","country"],["lied","minus"],["liegle","person"],["lien","person"],["lies","minus"],["lieutenant","person"],["liew","person"],["life","person"],["lifeless","minus"],["lifeline","line"],["lifelong","n"],["lifelongness","plus"],["lifestyle","dot"],["lifetime","line"],["lifetimes","line"],["lift","up"],["lifted","up"],["lifter","up"],["lifting","up"],["lifts","up"],["ligaments","ri
ght"],["light","line"],["lighter","n"],["lightest","n"],["lighthouse",""],["lighting","light"],["lightly","n"],["lightness","n"],["lightning","down"],["lights","light"],["like","plus"],["liked","plus"],["likelihood","plus"],["likely","plus"],["likened","plus"],["likeness","plus"],["likenesses","plus"],["likes","plus"],["likewise","plus"],["liking","plus"],["lilo",""],["lilting","up"],["lily","flower"],["lim","square"],["limahl","person"],["limbs","arm"],["limit","n"],["limitation","right"],["limitations","minus"],["limited","one"],["limiter","n"],["limiting","n"],["limits","n"],["limousine",""],["limreached","n"],["lin","person"],["lincoln","person"],["line",""],["lineage","line"],["linear","line"],["linearity","right"],["lined","plane"],["linen","fabric"],["liner","box"],["lines","line"],["lineup","line"],["ling","person"],["lingua","paper"],["linguistic","paper"],["linguistically","paper"],["linguistics","paper"],["lining","box"],["link","dash"],["linked","dash"],["linkedin","company"],["linking","right"],["links","dash"],["lino",""],["linode","dash"],["linux","operatingsystem"],["lion",""],["lioness","lion"],["lions","lion"],["lip","mouth"],["lipid","fat"],["lipids","ball"],["lipinit","zero"],["lips","n"],["lipstick",""],["lipv","tick"],["liquefied","water"],["liquid",""],["liquids","liquid"],["lisa","person"],["lisp","minus"],["list","tworectangle"],["listed","square"],["listen","ear"],["listened","ear"],["listener","person"],["listeners","person"],["listening","ear"],["lister","person"],["listhead","square"],["listing","paper"],["listitemnumber","n"],["listlessness","minus"],["listof","square"],["listoflists","variable"],["listoutputs","square"],["listprolog","software"],["listprologinterpreter","equals"],["listrecursion","circle"],["listrest","square"],["lists","tworectangle"],["listsorvars","or"],["lit","light"],["literacy","a"],["literally","equals"],["literariness","book"],["literary","book"],["literature","book"],["lithium",""],["lithuania","country"],["li
tigation","xcorrection"],["litre","n"],["little","n"],["liturgical","book"],["liu","person"],["liv","organisation"],["live","person"],["lived","plus"],["livelihood","plus"],["lively","plus"],["liver",""],["lives","line"],["living","plus"],["lizard",""],["ll","right"],["llb","paper"],["llist","square"],["llistitem","square"],["lloyd","person"],["lm","person"],["lmedic","book"],["lms","square"],["lmy","company"],["ln","variable"],["lo","dash"],["load","box"],["loaded","up"],["loading","up"],["loads","up"],["loadt","up"],["loaf",""],["loan","dollarsymbol"],["loans","dollarsymbol"],["loathe","minus"],["lobe","brain"],["lobes","ball"],["lobster",""],["local","one"],["locale","square"],["localhost","computer"],["locally","suburb"],["locals","person"],["locate","dot"],["located","dot"],["location","dot"],["locations","dot"],["locative","square"],["lock",""],["locked","equals"],["lockerbie","city"],["locking","equals"],["locs","n"],["locus","n"],["lodge","house"],["lodgment","right"],["loeb","person"],["log",""],["logarithm","square"],["logarithmic","square"],["logged","paper"],["logging","tick"],["logic","and"],["logical","and"],["logicalconjunction","and"],["logicaldisjunction","or"],["logically","and"],["logicalnot","xcorrection"],["logicise","and"],["logicism","and"],["logico","and"],["logicus","book"],["login","square"],["logins","square"],["logistics","dash"],["logo","square"],["logos","square"],["logout","down"],["logs","log"],["loiterer","person"],["lolled","tongue"],["lollies","lolly"],["lolling","tongue"],["lolliop",""],["lollipop",""],["lolly",""],["london","city"],["lone","square"],["lonely","minus"],["long","line"],["longed","plus"],["longer","greaterthan"],["longest","n"],["longevities","n"],["longevity","n"],["longingly","plus"],["longitude","n"],["longtoshortform","right"],["longueur","n"],["look","eye"],["lookahead","right"],["looked","eye"],["looking","eye"],["lookout","eye"],["looks","eye"],["lookup","eye"],["loop","circle"],["loopback","left"],["looped",
"circle"],["loophole","hole"],["loops","circle"],["loose","minus"],["loosely","plus"],["loosened","minus"],["looseness","plus"],["loosening","right"],["lopes","person"],["lord","person"],["lords","person"],["lorelle","person"],["lorian","house"],["lorikeet",""],["lorna","person"],["lorraine","person"],["lorry","truck"],["los","city"],["losada","person"],["lose","minus"],["losers","person"],["loses","minus"],["losing","minus"],["loss","minus"],["lost","minus"],["lot","n"],["lote","book"],["loted","paper"],["lots","n"],["lotus","flower"],["loud","n"],["louder","greaterthan"],["loudest","n"],["louis","person"],["louisa","person"],["louise","person"],["louisegreen","person"],["lounge","sofa"],["love","heart"],["loved","heart"],["loveliana","person"],["lovely","plus"],["lover","person"],["loves","plus"],["loving","heart"],["lovingly","heart"],["lovingtude","plus"],["low","n"],["lower","square"],["lowera","a"],["lowerb","b"],["lowerc","c"],["lowercase","square"],["lowerd","d"],["lowere","e"],["lowered","down"],["lowerf","f"],["lowerg","g"],["lowerh","h"],["loweri","i"],["lowering","down"],["lowerj","j"],["lowerk","k"],["lowerl","l"],["lowerm","m"],["lowern","n"],["lowero","o"],["lowerp","p"],["lowerq","q"],["lowerr","r"],["lowers","s"],["lowert","t"],["loweru","u"],["lowerv","v"],["lowerw","w"],["lowerx","x"],["lowery","y"],["lowerz","z"],["lowest","n"],["lowliest","down"],["loyal","person"],["loyalty","tick"],["lozenge",""],["lp","variable"],["lpcawmp","and"],["lpcl","square"],["lpconverter","right"],["lpi","square"],["lpiv","and"],["lpiverify","tick"],["lpo","building"],["lppm","plus"],["lpredstestdel","dash"],["lpv","tick"],["lrjj","dash"],["ls","variable"],["lstm","and"],["lstms","right"],["lt","lessthan"],["ltd","n"],["lth","l"],["lu","person"],["lub","person"],["lubricant",""],["lubricated","oil"],["lubricating","lubricant"],["luc","person"],["luca","person"],["lucan","person"],["luce","light"],["lucia","country"],["lucian","person"],["lucianacademy","person"],["luc
ianacademyapis","software"],["luciandmgreen","person"],["luciangreen","person"],["luciangreenfringe","person"],["luciangreenmusic","note"],["luciangreensphilosophyapril","box"],["luciangreensphilosophyaugust","box"],["luciangreensphilosophydecember","book"],["luciangreensphilosophyfebruary","folder"],["luciangreensphilosophyjanuary","paper"],["luciangreensphilosophyjune","book"],["luciangreensphilosophymarch","box"],["luciangreensphilosophymay","box"],["luciangreensphilosophynotebooks","book"],["luciangreensphilosophynovember","paper"],["luciangreensphilosophyoctober","box"],["luciangreensphilosophyseptember","box"],["lucianic","person"],["lucianiccommentasacelebrityreport","paper"],["lucianiccomputationalenglishreport","paper"],["lucianicdaoistheadachemedicinetexttobr","paper"],["lucianicmedicinesuccessfulconceptionandpreventmiscarriage","paper"],["lucianicmeditation","square"],["lucianicmeditationapps","software"],["lucianicpedagogyanarchyquiz","paper"],["lucianicpedagogycreateandhelpapedagoguereport","paper"],["lucianicpedagogylecturerreport","paper"],["lucianicpedagogyrecordingsreport","paper"],["lucianictexttobr","paper"],["lucianism","book"],["lucianmantrapureform","square"],["lucianmantrasunsafety","square"],["lucianos","and"],["lucianpedia","paper"],["lucianphilosopher","person"],["lucianpl","right"],["lucians","person"],["luciansair","computer"],["lucianshandbitmap","dot"],["lucianshd","box"],["lucianspedagogy","book"],["luciansphilosophy","square"],["lucien","person"],["luck","plus"],["luckier","plus"],["luckies","plus"],["luckily","plus"],["lucky","plus"],["lucy","person"],["ludo",""],["ludocytes","cell"],["ludwig","person"],["luggage","bag"],["lugoondo","plus"],["luke","person"],["lukewarm","n"],["lulang","plus"],["lullaby","note"],["lulu","person"],["lump",""],["lumps","box"],["luna","moon"],["lunar","moon"],["lunch","apple"],["lunches","apple"],["lunchtime","apple"],["lung",""],["lungfuls","lung"],["lunging","right"],["lungs","lung"],["lur","dash"],["l
urking","minus"],["lush","water"],["lust","dash"],["lustful","plus"],["lustrous","plus"],["lute",""],["lutephonics","note"],["luxembourg","country"],["luís","person"],["ly","square"],["lycra",""],["lying","minus"],["lymph",""],["lymphatic","lymph"],["lyn","person"],["lynch","person"],["lynxlet","and"],["lyotard",""],["lyre",""],["lyric","square"],["lyrics","square"],["lyricsa","square"],["lyricsv","square"],["lysander","person"],["lyzeme","right"],["m","box"],["ma","dash"],["maas","person"],["mac","box"],["macadamia","nut"],["macbook",""],["macedonia","country"],["machete",""],["machina","box"],["machinations","right"],["machine","box"],["machinery","and"],["machines","and"],["macintosh","computer"],["macmillan","company"],["macos","box"],["macpath","square"],["macquarie","university"],["macrae","person"],["macroeconomics","dollarsymbol"],["macs","computer"],["macvim","box"],["mad","minus"],["madagascar","country"],["madam","person"],["madang","person"],["madcatsound","building"],["made","box"],["madeleine","person"],["madeness","box"],["madeup","plus"],["madness","minus"],["madonna","person"],["madrid","city"],["maestro","person"],["magazine",""],["magazines","book"],["maggot",""],["magic","star"],["magical","plus"],["magister","person"],["magistero","person"],["magistri","person"],["magna","n"],["magnesium","ball"],["magnet",""],["magnetic","right"],["magnetosphere","ball"],["magnifying","multiplication"],["magnitude","n"],["magnum","plus"],["magpie","bird"],["magulous","plus"],["maharishi","person"],["maharishisutra","square"],["mahasti","person"],["maher","person"],["mahoganies",""],["mahogany","wood"],["maid","person"],["mail","paper"],["mailbox","box"],["mailer","right"],["mailing","paper"],["mailto","paper"],["main","one"],["mainly","one"],["mainness","plus"],["mainrole","person"],["maintain","dash"],["maintained","plus"],["maintaining","dash"],["maintains","tick"],["maintenance","right"],["maize","plant"],["major","happyface"],["majority","n"],["majorly","pl
us"],["majors","square"],["make","box"],["makebasecase","square"],["makecode","and"],["makename","square"],["makenames","square"],["maker","person"],["makerandomlist","square"],["makerid","n"],["makers","person"],["makes","hand"],["making","box"],["mal","person"],["maladministration","minus"],["malaecotopics","plus"],["malaria","minus"],["malatesta","person"],["malawi","country"],["malaysia","country"],["malaysian","person"],["malcolm","person"],["maldives","country"],["male","person"],["malebranches","person"],["maleficence","minus"],["males","person"],["malfunction","minus"],["malfunctioning","minus"],["malfunctions","minus"],["malhesian","person"],["mali","country"],["malibu","city"],["mallet","rod"],["malloc","box"],["malnourishment","minus"],["malta","country"],["malterud","person"],["maltodextrin","ball"],["malus","apple"],["malvern","suburb"],["mambo","square"],["mamet","person"],["mammary","breast"],["man","person"],["manage","plus"],["manageable","plus"],["managed","plus"],["management","plus"],["manager","person"],["managers","person"],["manages","plus"],["managing","plus"],["manahimhorisit","plus"],["mandarin",""],["mandatory","plus"],["mangelsdorf","person"],["mangione","person"],["mango",""],["manifest","square"],["manifestation","plus"],["manifested","plus"],["manifestfile","square"],["manifesting","tick"],["manifesto","book"],["manifests","plus"],["manipulate","hand"],["manipulating","minus"],["manipulation","right"],["mannequin",""],["manner","plus"],["mannerisms","right"],["manners","plus"],["manny","person"],["mantainers","person"],["mantel","box"],["mantelpiece",""],["mantelpieces","mantelpiece"],["mantissa","n"],["mantle","square"],["mantra","a"],["mantras","square"],["manual","book"],["manually","hand"],["manufacture","box"],["manufactured","box"],["manufacturer","person"],["manufacturing","box"],["manuscript","paper"],["manuscripts","paper"],["manwarring","person"],["many","n"],["map","square"],["maple","tree"],["maplist","tworectangle"],["mapl
ists","right"],["mapped","map"],["mapping","square"],["maps","square"],["mar","and"],["maracas",""],["marachusanchuchay","plus"],["maraded","plus"],["maranatha","square"],["marble","ball"],["marbles","ball"],["marc","person"],["march","person"],["marched","right"],["marchers","right"],["marching","right"],["mardi","person"],["margarine",""],["margin","line"],["maria","person"],["mariam","person"],["marimba",""],["marina","person"],["marine","water"],["marino","country"],["mario","person"],["marion","person"],["marital","right"],["mark","lowera"],["markdown","paper"],["marked","tick"],["marker","plus"],["marker","square"],["markers","square"],["market","dollarsymbol"],["marketed","square"],["marketing","dollarsymbol"],["markets","dollarsymbol"],["marking","square"],["markle","person"],["marks","a"],["markup","square"],["marlebong","city"],["marmalade",""],["maroon","person"],["marquee",""],["marriage","two"],["marriages","dash"],["married","plus"],["marrion","person"],["marrow","box"],["marry","plus"],["marrying","right"],["mars","planet"],["marshall","person"],["marshalling","right"],["marshmallow",""],["martha","person"],["martin","person"],["marx","person"],["marxian","person"],["marxism","book"],["marxist","person"],["marxs","person"],["mary","person"],["marzipan",""],["masculine","square"],["masculinities","book"],["masculinity","person"],["mashed","down"],["mask",""],["masking","mask"],["masks","mask"],["masonry","organisation"],["mass","n"],["massage","hand"],["massaged","hand"],["massaging","hand"],["masse","person"],["masses","person"],["massive","n"],["master","person"],["mastered","n"],["masterer","n"],["mastering","n"],["masterpiece","square"],["masters","person"],["mastery","plus"],["mat",""],["match","equals"],["matched","equals"],["matches","equals"],["matching","equals"],["matchingness","equals"],["mate","person"],["material","box"],["materialise","box"],["materialised","box"],["materialisers","plus"],["materialism","box"],["materialistic","box"],["ma
terials","box"],["maternity","baby"],["math","plus"],["mathematical","plus"],["mathematically","equals"],["mathematician","person"],["mathematicians","person"],["mathematics","multiply"],["mathison","person"],["maths","plus"],["mathsisfun","equals"],["mathworks","plus"],["matics","up"],["matilda","person"],["matlabcentral","plus"],["matriarchs","person"],["matrix","grid"],["matt","person"],["matter","box"],["mattering","plus"],["matters","plus"],["matthew","person"],["mattress",""],["mature","n"],["maturity","n"],["mauboy","person"],["maude","person"],["maunsang","person"],["maura","person"],["maurice","person"],["mauritania","country"],["mauritius","country"],["mawk","person"],["max","n"],["maxclauses","variable"],["maxdepth","n"],["maximally","a"],["maximise","plus"],["maximum","n"],["maxlength","n"],["maxpredicates","variable"],["maxterms","variable"],["maxwell","person"],["may","dash"],["maya","country"],["maybe","zero"],["mayeroff","person"],["mayfair","robot"],["mayhem","minus"],["mayonnaise",""],["maypole",""],["maze",""],["mazes","box"],["mb","square"],["mba","paper"],["mbarelation","dash"],["mbas","paper"],["mbc","dash"],["mbti","n"],["mc","person"],["mcall","dash"],["mcawp","and"],["mccall","person"],["mccallum","person"],["mcclure","person"],["mccutcheon","person"],["mcdermott","person"],["mcdonalds","company"],["mcewan","person"],["mcgrath","person"],["mcintyre","person"],["mckelvey","person"],["mclean","person"],["mclellan","person"],["mcnally","person"],["mcowan","person"],["mcq","square"],["mcqs","square"],["md","person"],["me","person"],["meaf","dash"],["meal","apple"],["meals","carrot"],["mean","dash"],["meandered","right"],["meaning","apple"],["meaningful","plus"],["meaningless","minus"],["meanings","dash"],["meanjin","book"],["means","dash"],["meant","tick"],["meantime","line"],["meantness","plus"],["measure","n"],["measured","n"],["measurement","n"],["measurements","n"],["measures","plus"],["measuring","n"],["measuringly","n"],["meat","minus"],["
meats","meat"],["mechanic","person"],["mechanical","right"],["mechanically","right"],["mechanics","right"],["mechanism","right"],["mechanisms","right"],["med","plus"],["medal",""],["medi","plus"],["media","square"],["median","two"],["mediated","plus"],["medibank","company"],["medic","book"],["medical","plus"],["medically","book"],["medicare","company"],["medication",""],["medications","switch"],["medicinal","tablet"],["medicine","paper"],["medicinenoreplace","plus"],["medicines","box"],["medieval","book"],["medit","line"],["meditate","tick"],["meditated","meditator"],["meditates","dot"],["meditating","meditator"],["meditation","meditator"],["meditationally","mat"],["meditationincreasedbrainpotential","up"],["meditationindicatordecreasedstress","down"],["meditationindicatorincreasedbloodflow","up"],["meditationindicatorlowerriskofcancerandotherdiseasesinworkersandbroadcasters","down"],["meditationnoreplace","dot"],["meditations","square"],["meditationteachersutra","square"],["meditative","plus"],["meditatively","plus"],["meditator","person"],["meditators","person"],["meditatorsanddoctors","person"],["mediterranean","person"],["meditnoreplace","right"],["medits","person"],["medium","box"],["medtech","company"],["meet","dash"],["meeting","dash"],["meetings","dash"],["meets","dash"],["meetup","company"],["meg","person"],["megalopolises","city"],["megan","person"],["megapixels","square"],["megascreendom","plus"],["meghan","person"],["mei","person"],["mein","person"],["meinong","person"],["meisner","person"],["mel","city"],["melancholia","minus"],["melancholy","minus"],["melanie","person"],["melb","city"],["melbourne","city"],["melchior",""],["melinda","person"],["melissa","person"],["mellow","plus"],["melodic","note"],["melodies","note"],["melody","note"],["melodyharmony","note"],["melodyinstrumentnumber","n"],["melodyinstruments","tube"],["melodyinstrumentslength","n"],["melodypart","note"],["melodyparts","line"],["melodypartslength","n"],["melon",""],["melt","down"],["
melted","down"],["melting","minus"],["melts","down"],["mem","up"],["member","person"],["membera","box"],["memberlp","box"],["memberprogram","box"],["memberr","tworectangle"],["members","person"],["membership","paper"],["memberships","folder"],["membrane","box"],["meme","square"],["memes","square"],["memo","paper"],["memorable","n"],["memories","box"],["memorisation","square"],["memorise","tick"],["memorised","plus"],["memorises","square"],["memorising","down"],["memory","box"],["memphis","city"],["men","person"],["menacing","minus"],["mend","plus"],["mene","plus"],["meniscus","n"],["meniscuses","meniscus"],["menkoff","person"],["meno","person"],["mens","person"],["mensicus","line"],["mental","brain"],["mentally","brain"],["mentee","person"],["mention","dash"],["mentioned","dash"],["mentioning","square"],["mentions","mouth"],["mentor","person"],["mentoring","plus"],["mentors","person"],["menu","square"],["menus","square"],["meontological","box"],["meontology","box"],["meoworld","sphere"],["merch","product"],["merchandise","box"],["merchant","person"],["merchantability","dollarsymbol"],["merchantid","dash"],["mercurially","mercury"],["mercury",""],["mercy","plus"],["mere","plus"],["merely","plus"],["merge","one"],["merged","right"],["mergedtree","tree"],["merger","box"],["mergers","one"],["merges","box"],["mergetexttobrdict","threerectangle"],["merging","right"],["meringue",""],["meristem","line"],["merit","plus"],["merited","plus"],["meritocracy","plus"],["meritocratic","plus"],["merleau","person"],["merman","person"],["meronym","box"],["meronymic","equals"],["meronymnal","box"],["meronyms","box"],["merrington","person"],["merror","xcorrection"],["mes","dash"],["mesentery","organ"],["mesh","square"],["mesmerized","plus"],["mess","minus"],["message","square"],["messages","square"],["messaging","square"],["messences","ball"],["messenger","right"],["messing","minus"],["messy","minus"],["met","dash"],["meta","square"],["metabolic","right"],["metabolism","right"],["metabo
lites","ball"],["metacognition","circle"],["metacognitively","plus"],["metadata","square"],["metal",""],["metallic","square"],["metamucil","plant"],["metaphilosophy","book"],["metaphor","apple"],["metaphorical","box"],["metaphorically","right"],["metaphorist","person"],["metaphors","square"],["metaphysical","plus"],["metaphysically","box"],["metaphysician","person"],["metaphysics","dot"],["meter","one"],["metered","line"],["metering","n"],["meters","n"],["method","plus"],["methodically","plus"],["methodological","right"],["methodologies","right"],["methodology","square"],["methods","dash"],["methuselah","person"],["meticulous","plus"],["metodologici","plus"],["metre","line"],["metres","square"],["metric","n"],["metrolyrics","site"],["mexception","dash"],["mexico","country"],["mexit","zero"],["mfail","xcorrection"],["mfalse","xcorrection"],["mfile","paper"],["mg","n"],["mgrammar","right"],["mgs","school"],["mhr","person"],["mhu","dash"],["mi","minus"],["miami","city"],["mic","microphone"],["michael","person"],["michaelreturnedfromthailand","person"],["michaels","person"],["michel","person"],["michelle","person"],["michigan","city"],["micky","person"],["micro","n"],["microbiology","book"],["microcred","paper"],["microeconomics","dollarsymbol"],["micromentor","company"],["micronesia","country"],["microphone",""],["microscope",""],["microsoft","building"],["mid","note"],["midday","n"],["middle","two"],["middles","n"],["midgley","person"],["midi","note"],["midiutils","right"],["midlying","dot"],["midnight","zero"],["midpoint","dot"],["mids","note"],["midsummer","n"],["midway","n"],["mifflin","person"],["might","dash"],["migrant","person"],["migrate","right"],["migrating","right"],["migration","right"],["mil","n"],["mild","plus"],["milder","n"],["mildly","plus"],["mile","n"],["milestone","box"],["milieu","box"],["milk",""],["milkmaid","person"],["milkshake","milk"],["milky","milk"],["miller","person"],["millinery","hat"],["million","line"],["millionaire","person"],["milli
ons","n"],["millipede",""],["millswyn","path"],["milne","person"],["mimetic","right"],["mimic","right"],["mimicked","equals"],["mimicking","equals"],["mimickry","plus"],["mimicry","tworectangle"],["mimics","two"],["min","n"],["mince",""],["minced","box"],["mind","brain"],["minded","plus"],["mindedness","plus"],["mindful","plus"],["mindfulness","plus"],["minding","plus"],["mindmap","dash"],["mindmapping","dash"],["mindread","left"],["mindreader","eye"],["mindreading","brain"],["mindreadingcaw","and"],["mindreadtest","tick"],["mindreadtesta","tick"],["mindreadtestb","tick"],["mindreadtestcharacters","a"],["mindreadteste","tick"],["mindreadtestlang","plus"],["mindreadtestmeditation","up"],["mindreadtestminus","minus"],["mindreadtestminusunrelatedthought","minus"],["mindreadtestmusiccomposer","note"],["mindreadtestne","tick"],["mindreadtestobj","apple"],["mindreadtestpersoncomputer","computer"],["mindreadtestpsychiatrist","person"],["mindreadtestsec","plus"],["mindreadtestsecthoughts","box"],["mindreadtestsecurity","xcorrection"],["mindreadtestshared","plus"],["mindreadtestz","tick"],["minds","brain"],["mindset","plus"],["mine","down"],["mineral",""],["minerals","mineral"],["minestrone","soup"],["mingle","right"],["mingliang","person"],["mini","n"],["miniature","down"],["minimal","one"],["minimalist","box"],["minimax","company"],["minimaxir","algorithm"],["minimisation","down"],["minimise","minus"],["minimised","down"],["minimiser","down"],["minimises","down"],["minimising","down"],["minimum","n"],["mining","down"],["minions","person"],["minister","person"],["ministers","person"],["ministries","box"],["ministry","plus"],["minor","sadface"],["minorities","person"],["minority","n"],["mins","square"],["mint","plant"],["minus",""],["minute","square"],["minutes","paper"],["minutevaluea","variable"],["minutevalueb","variable"],["mir","dash"],["miracle","plus"],["miracles","plus"],["miraculously","plus"],["miriam","person"],["miriammorris","person"],["mirror",""],["mirrors","m
irror"],["mirth","happyface"],["mis","minus"],["misaligned","notequals"],["misbehaving","minus"],["misbehaviour","minus"],["miscalculate","minus"],["miscarriage","minus"],["miscarriages","minus"],["miscellaneous","right"],["miscellany","box"],["misconduct","minus"],["miscounted","minus"],["misdoings","minus"],["misery","minus"],["misfit","minus"],["misguided","minus"],["misheard","minus"],["misinformation","minus"],["misinformed","minus"],["mismatch","notequals"],["miso",""],["misrepresented","minus"],["misrepresenting","minus"],["miss","person"],["missed","minus"],["misses","minus"],["missing","minus"],["mission","plus"],["missionaries","person"],["missioncloud","right"],["missions","right"],["mistake","xcorrection"],["mistaken","minus"],["mistakes","xcorrection"],["mistress","person"],["misunderstand","minus"],["misunderstandings","minus"],["mit","university"],["mited","ball"],["mitigation","xcorrection"],["mitochondria",""],["mix","dot"],["mixed","plus"],["mixer","n"],["mixes","box"],["mixing","n"],["mixture","ball"],["mixtures","amount"],["mkcid","dash"],["mkevt","dash"],["mkgroupid","dash"],["mklykpuce","dash"],["mkrid","dash"],["mktype","dash"],["ml","dash"],["mlb","company"],["mlblawyers","company"],["mleash","leash"],["mls","n"],["mm",""],["mn","person"],["mnemonic","square"],["mnemonics","square"],["mnws","dash"],["mobile","box"],["mock","plus"],["mocktail","glass"],["mod","n"],["modal","n"],["modalities","m"],["modalities","n"],["modality","n"],["mode","b"],["model","box"],["modeled","box"],["modelled","right"],["modelling","box"],["models","and"],["moderate","plus"],["moderated","plus"],["moderation","plus"],["modern","plus"],["modernist","plus"],["modernity","plus"],["moderns","person"],["modes","n"],["modespec","square"],["modestatements","square"],["modestly","plus"],["modesty","plus"],["modifiable","plus"],["modification","plus"],["modifications","right"],["modified","right"],["modifier","plus"],["modifiers","key"],["modifies","right"],["modify","righ
t"],["modifying","right"],["modifyitem","plus"],["modularisation","box"],["modularised","box"],["module","box"],["modules","book"],["modus","n"],["modusponens","right"],["modw","square"],["mohammad","person"],["mohammadfneish","person"],["mohammed","person"],["moistened","square"],["mol","n"],["molars","tooth"],["mold","box"],["molding","mold"],["moldova","country"],["molecular","ball"],["molecularly","ball"],["molecule","ball"],["molecules","ball"],["molyhedrons","box"],["mom","person"],["moment","one"],["momentarily","one"],["moments","one"],["mon","square"],["monaco","country"],["monarch","person"],["monarchy","king"],["monash","university"],["monastery","building"],["monastic","person"],["monastics","person"],["monday","square"],["mondragón","equals"],["monetary","dollarsymbol"],["money","dollarsign"],["moneyless","zero"],["moneymaking","dollarsymbol"],["mongolia","country"],["monies","dollarsymbol"],["moniker","plus"],["monitor","square"],["monitored","plus"],["monitoring","eye"],["moniz","person"],["monk","person"],["monkey",""],["mono","box"],["monocle","circle"],["monogamy","box"],["monologue","paper"],["monologues","paper"],["monopoles","up"],["monos","box"],["monotheism","person"],["monotheistic","person"],["monotone","one"],["monotonicities","plus"],["monster","person"],["montelhedrons","box"],["montenegro","country"],["montero","person"],["montessori","person"],["month","square"],["monthdob","n"],["monthlearned","square"],["monthly",""],["months","square"],["monthvaluea","variable"],["monthvalueb","variable"],["montreal","city"],["mooc","paper"],["moocs","book"],["mood","square"],["moods","variable"],["moody","person"],["moog","note"],["moon",""],["moonbelt","ball"],["mooney","person"],["moonlight","moon"],["moonwalk","right"],["mop",""],["mopped","mop"],["moral","tick"],["morale","plus"],["morality","tick"],["morally","plus"],["morals","plus"],["more","plus"],["moreover","and"],["morgan","person"],["moriac","suburb"],["mormonisms","person"],["morning","
sun"],["mornings","sun"],["morocco","country"],["morris","person"],["mortal","n"],["mortality","n"],["mortals","person"],["mortar",""],["mortarboard",""],["mortem","n"],["mortgage","n"],["moslem","person"],["mosquitoes","mosquito"],["mossop","person"],["most","one"],["mostly","one"],["mother","person"],["mothers","person"],["motion","right"],["motioning","right"],["motions","right"],["motivated","plus"],["motivates","plus"],["motivation","plus"],["motivational","plus"],["motives","plus"],["motor","right"],["motto","square"],["mould","box"],["moulded","box"],["moules","person"],["mound","earth"],["mountain",""],["mountaineer","person"],["mountains","mountain"],["mounted","down"],["mouse",""],["mousse",""],["moustache",""],["moustached","moustache"],["mouth",""],["mouthful","mouth"],["mouthfuls","cube"],["mouthing","mouth"],["mouthpiece","mouth"],["mouthwash",""],["move","right"],["moved","right"],["movement","right"],["movements","right"],["moves","right"],["movie","line"],["moviegoer","person"],["movies","line"],["moviesandsales","square"],["moviestorm","software"],["moving","right"],["movingappearances","right"],["movingappearancespurusha","square"],["mow","right"],["moxibustion","dash"],["mozambique","country"],["mozart","person"],["mozilla","application"],["mp","dash"],["mpa","book"],["mr","variable"],["mrcawmp","square"],["mredo","two"],["mrs","person"],["mrtree","dash"],["mrtsthoughts","square"],["mrwed","company"],["ms","building"],["mscore","n"],["mscp","school"],["msetup","n"],["msg","square"],["msn","company"],["msort","right"],["mssbtl","mat"],["mteach","book"],["mteaching","book"],["mth","m"],["mtjcbct","dash"],["mtotal","n"],["mtrace","dash"],["mtrue","one"],["much","one"],["mucous","box"],["mueller","person"],["muesli",""],["muffin",""],["muffins","muffin"],["mug",""],["mul","n"],["mulch","box"],["mulched","box"],["mulling","plus"],["mulready","person"],["multi","n"],["multichoice","n"],["multiclause","square"],["multiclauses","square"],["multiconnectio
ns","right"],["multicultural","tworectangle"],["multiculturalism","plus"],["multidim","box"],["multidimensional","dash"],["multidisciplinary","book"],["multifile","paper"],["multileveled","dash"],["multiline","two"],["multimethod","plus"],["multiness","n"],["multiple","n"],["multiplication","multiplication"],["multiplications","multiplication"],["multiplicity","n"],["multiplied","multiply"],["multiplies","multiplication"],["multiply",""],["multiplybytwo","multiply"],["multiplying","multiplication"],["multistage","n"],["multistate","square"],["multitasking","n"],["multitrait","box"],["multitude","two"],["multiverse","box"],["multividual","person"],["multividuals","person"],["mum","university"],["mummoco","person"],["mummy","person"],["mundane","minus"],["munery","plus"],["munify","equals"],["munster","person"],["murch","plus"],["murder","minus"],["murdered","minus"],["murderer","person"],["murderers","person"],["murdering","minus"],["muriel","person"],["murnaghan","person"],["murphy","person"],["mus","note"],["muscle","box"],["muscles","box"],["muscomp","note"],["muscovy","duck"],["muscular","muscle"],["mushroom","plant"],["mushrooms","mushroom"],["music","note"],["musical","note"],["musically","note"],["musicals","note"],["musiccomposer","software"],["musician","person"],["musicians","person"],["musicking","note"],["musiclibrary","book"],["musings","plus"],["muslims","person"],["must","dash"],["mustafa","person"],["mutants","person"],["mutate","right"],["mutated","right"],["mutating","right"],["mutation","n"],["muted","zero"],["mutt","software"],["mutual","plus"],["mutually","tworectangle"],["mutuallyexclusive","notequals"],["mv","right"],["mvisible","square"],["mvs","dash"],["mwarning","xcorrection"],["mwt","company"],["mwyeqclcbgasyhq","dash"],["mx","variable"],["my","person"],["myadmissionessay","company"],["myanmar","country"],["mycorrhizae",""],["myface","company"],["myki","square"],["myness","person"],["myriad","two"],["myrrh",""],["myself","person"],["mysql",
"and"],["mysteries","questionmark"],["mysterious","plus"],["mystery","plus"],["mystic","plus"],["mystical","plus"],["myths","square"],["myune","paper"],["mzi","dash"],["n","variable"],["na","dash"],["nab","company"],["nac","dash"],["nacl",""],["nad","dash"],["nadia","person"],["nagging","minus"],["naif","person"],["nail","down"],["nailed","nail"],["nails","nail"],["nakar","person"],["naked","zero"],["namaskar","hand"],["namaskara","plus"],["name","threerectangle"],["nameable","square"],["named","square"],["namely","square"],["names","a"],["nameserver","square"],["namibia","country"],["naming","square"],["nan","person"],["nana","person"],["nancy","person"],["nanga","city"],["nanny","person"],["nanometres","line"],["nanotode","down"],["nanotubes","box"],["naomi","person"],["nap","bed"],["napa","tree"],["napkin",""],["naples","city"],["napoletana","tomato"],["nappies","nappy"],["nappy",""],["nare","dash"],["narrated","mouth"],["narrating","mouth"],["narration","book"],["narrative","square"],["narratives","square"],["narratology","mouth"],["narrator","person"],["narrow","box"],["narrowed","down"],["narrowing","minus"],["narrows","minus"],["nasa","up"],["nasal","nose"],["nascendi","right"],["natalie","person"],["nathalie","person"],["nathan","person"],["nation","square"],["national","country"],["nationalism","country"],["nationalist","person"],["nationality","square"],["nations","square"],["native","person"],["nativity","baby"],["natural","plus"],["naturalistic","plus"],["naturally","plus"],["naturalness","plus"],["nature","sun"],["natureofcode","and"],["natures","plus"],["nauru","country"],["nautilus","square"],["nav","square"],["naval","sea"],["navel","orange"],["navigate","right"],["navigating","right"],["navigation","right"],["navigator","and"],["navy","person"],["nay","minus"],["nazarene","person"],["nazi","person"],["nazis","person"],["nazism","person"],["nb","plus"],["nba","n"],["nbsp","dash"],["nby","left"],["nc","dash"],["ncolour","square"],["ncurl","dash"],["nd
","dash"],["ndis","right"],["ne","dash"],["near","n"],["nearby","equals"],["nearest","equals"],["nearing","right"],["nearly","plus"],["neat","plus"],["neaten","plus"],["neatened","plus"],["neatly","plus"],["neatness","plus"],["nebuchadnezzar","person"],["nebulous","nebula"],["nec","plus"],["necessarily","tick"],["necessary","one"],["necessitate","plus"],["necessity","plus"],["neck",""],["necking","equals"],["necklace",""],["nectar",""],["nectarine",""],["nectars","nectar"],["nee","dash"],["need","left"],["needaccess","dash"],["needed","left"],["needing","left"],["needle",""],["needles","needle"],["needs","left"],["needy","minus"],["nef","dash"],["neg","minus"],["negatable","minus"],["negatably","minus"],["negate","xcorrection"],["negated","xcorrection"],["negates","xcorrection"],["negating","xcorrection"],["negation","minus"],["negative","minus"],["negatively","minus"],["negeia","minus"],["negentropic","down"],["negligence","minus"],["negotiate","plus"],["negotiated","plus"],["negotiating","plus"],["negotiation","plus"],["negotiations","square"],["nei","person"],["neiess","person"],["neiey","person"],["neigh","program"],["neighborhood","suburb"],["neighbour","equals"],["neighbourhood","suburb"],["neighbouring","equals"],["neighbours","person"],["neil","person"],["neiler","person"],["neither","xcorrection"],["nel","person"],["nellie","person"],["nelly","person"],["nelson","person"],["neo","plus"],["neof","square"],["neoliberalism","plus"],["nepal","country"],["nephew","person"],["nephtali","person"],["ner","dash"],["nerd","person"],["nerve",""],["nerves","right"],["nervous","nerve"],["nervousness","minus"],["ness","plus"],["nest",""],["nested","box"],["nesting","nest"],["net","box"],["netherlands","country"],["network","dot"],["networked","line"],["networking","star"],["networks","dash"],["neumann","person"],["neural","brain"],["neurally","brain"],["neurobot","person"],["neurocode","n"],["neuromodel","right"],["neuron","star"],["neurons","star"],["neuroplastic","squa
re"],["neuroplasticity","plus"],["neurorecordings","square"],["neuroscience","brain"],["neuroscientist","person"],["neurotechnologies","brain"],["neurotypical","plus"],["neuse","plus"],["neuter","square"],["neutral","zero"],["never","zero"],["nevertheless","plus"],["nevis","country"],["new","plus"],["newbranch","dash"],["newbranchifcall","dash"],["newcastle","city"],["newcat","and"],["newcomers","person"],["newer","one"],["newline","square"],["newlines","square"],["newlist","square"],["newly","plus"],["newn","plus"],["newness","one"],["newrule","square"],["newrulenumber","n"],["news","paper"],["newsletters","paper"],["newspaper","paper"],["newspapers","book"],["newsreader","person"],["newton","person"],["next","right"],["ney","dash"],["nfns","right"],["nfrom","dash"],["ng","person"],["ngaire","person"],["nhalt","dash"],["nhmrc","book"],["ni","square"],["niall","person"],["niaz","person"],["nib",""],["nibbled","tooth"],["nicaragua","country"],["nice","plus"],["nicely","plus"],["nicer","greaterthan"],["niche","box"],["niches","box"],["nicholas","person"],["nicholls","person"],["nick","person"],["nickered","minus"],["nicknames","square"],["nicknaming","square"],["nicola","person"],["nicole","person"],["nida","university"],["nidaacttech","square"],["nidaamericanaccent","square"],["niece","person"],["nielsen","person"],["niet","person"],["nietzche","person"],["nietzsche","person"],["nietzschean","person"],["niger","country"],["nigeria","country"],["night","square"],["nightclubs","building"],["nighter","square"],["nightfall","n"],["nightingale",""],["nightmare","minus"],["nightmares","minus"],["nights","square"],["nikola","person"],["nile","river"],["nine","ninerectangle"],["ninerectangle",""],["ninja","person"],["ninjitsu","book"],["ninstallation","down"],["ninth","n"],["nirit","person"],["nirvana","zero"],["nk","dash"],["nl","square"],["nleash","dash"],["nlp","bracket"],["nls","down"],["nlucian","person"],["nm","dash"],["nmany","dash"],["nmr","n"],["nn","right"],["nna",
"square"],["nnb","square"],["nnc","square"],["nnd","square"],["nne","square"],["nnf","variable"],["nng","variable"],["nnh","variable"],["nni","variable"],["nnl","variable"],["nno","dash"],["nnobzgawdoxnmxdjgpyvege","dash"],["nns","right"],["nnzss","dash"],["no","xcorrection"],["noam","person"],["nobel","plus"],["noble","plus"],["nobles","person"],["nobodyproblemsapp","software"],["nocut","xcorrection"],["noddings","person"],["node","dot"],["nodes","dot"],["nodisplay","square"],["nogating","xcorrection"],["noheadaches","xcorrection"],["noheadachesapp","box"],["noise","zero"],["nola","person"],["nomenclature","square"],["nominalisation","square"],["nominalised","square"],["nominated","plus"],["nominative","square"],["non","xcorrection"],["nonabort","plus"],["nonaligned","minus"],["nonbeing","person"],["nonbeings","person"],["nonbreaking","square"],["nondet","n"],["nondeterminism","two"],["nondeterministic","or"],["nondeterministically","n"],["nondurable","minus"],["none","zero"],["nonequally","notequals"],["nonhuman","animal"],["nonprofits","zero"],["nonrec","right"],["nonrecursive","dot"],["nonsense","minus"],["nontechnical","right"],["nonvar","square"],["nonverbal","paper"],["nonviolent","plus"],["noodles","noodle"],["noon","n"],["nopqrstuvwxyz","dash"],["noprotocol","xcorrection"],["nor","zero"],["norberto","person"],["noreply","minus"],["normal","plus"],["normally","plus"],["normative","n"],["normativise","field"],["normativity","plus"],["norms","plus"],["norover","dash"],["norris","person"],["north","up"],["northwest","right"],["norway","country"],["noscript","square"],["nose",""],["nosepieces","peg"],["nosing","nose"],["nostril",""],["nostrils","nostril"],["not","xcorrection"],["notable","plus"],["notably","plus"],["notations","square"],["notcher","down"],["notchers","down"],["note",""],["notebook","book"],["notebooks","book"],["noted","plus"],["notequals",""],["notes","paper"],["notestonames","note"],["noteworthy","plus"],["nothing","zero"],["nothingness","zero
"],["notice","eye"],["noticeable","plus"],["noticed","person"],["noticing","square"],["notification","plus"],["notifications","plus"],["notified","square"],["notifies","tick"],["notify","plus"],["notifying","square"],["noting","square"],["notion","square"],["notions","box"],["notrace","xcorrection"],["notranslate","xcorrection"],["nots","xcorrection"],["noumena","box"],["noumenal","dot"],["noumenalist","person"],["noumenon","apple"],["noumenonisation","dot"],["noun","apple"],["nouns","square"],["nourish","plus"],["nourished","plus"],["nourishing","plus"],["nourishment","plus"],["nov","square"],["novel","book"],["novels","book"],["november","square"],["now","zero"],["nowell","person"],["nowhere","box"],["nozzle","box"],["np","square"],["npackage","box"],["npm","down"],["npredicate","square"],["nq","square"],["nr","dash"],["nrectangle",""],["ns","n"],["nsw","square"],["nswipl","dash"],["nt","square"],["ntexttobr","apple"],["nth","n"],["ntime","dash"],["ntitle","square"],["ntotal","n"],["ntotal","right"],["nuance","plus"],["nuances","plus"],["nuclear","minus"],["nucleus","ball"],["nude","minus"],["nudging","right"],["nudity","person"],["nuggets","box"],["nuland","person"],["null","zero"],["num","n"],["number","one"],["numbera","variable"],["numbered","n"],["numbering","n"],["numberofinstruments","n"],["numberofphils","n"],["numbers","n"],["numberstring","square"],["numeracy","one"],["numerals","n"],["numerous","n"],["numin","n"],["numinputs","variable"],["numout","n"],["numoutputs","variable"],["nums","n"],["nur","plus"],["nurse","person"],["nursery","building"],["nursing","plus"],["nurtured","plus"],["nurturers","person"],["nurtures","plus"],["nurturing","plus"],["nusrat","person"],["nussbaum","person"],["nut",""],["nuted","plus"],["nutmeg","plant"],["nutrient","ball"],["nutrients","apple"],["nutrifactored","apple"],["nutrition","apple"],["nutritional","apple"],["nutritious","apple"],["nuts","nut"],["nutshell","nut"],["nuzzle","mouth"],["nuzzled","mouth"],["nv","squar
e"],["nvivo","square"],["nwcxu","dash"],["nwhen","dash"],["nwhere","dash"],["nx","dash"],["nylon","line"],["nyou","dash"],["nz","n"],["nǚ","person"],["nǚzǐ","person"],["o","variable"],["oak","tree"],["oaks","tree"],["oar",""],["oarsman","person"],["oasis",""],["oat",""],["oats","oat"],["obesity","fat"],["obey","plus"],["obeyed","plus"],["obj","apple"],["object","apple"],["objected","xcorrection"],["objecting","xcorrection"],["objection","xcorrection"],["objections","xcorrection"],["objective","box"],["objectively","plus"],["objectives","dot"],["objectivity","box"],["objectnames","square"],["objectpath","line"],["objects","apple"],["objectstofinish","box"],["objectstoprepare","apple"],["objecttofinish","box"],["objecttofinishdict","book"],["objecttofinishdifferencet","d"],["objecttofinishdt","d"],["objecttofinishend","variable"],["objecttofinishlength","n"],["objecttofinishlengtht","n"],["objecttofinishstring","variable"],["objecttofinishstringth","variable"],["objecttofinishvalues","variable"],["objecttofinishx","variable"],["objecttofinishy","variable"],["objecttofinishz","variable"],["objecttoprepare","apple"],["objecttopreparedict","book"],["objecttopreparedifferencet","d"],["objecttopreparedt","d"],["objecttoprepareend","variable"],["objecttopreparelength","n"],["objecttopreparelengtht","n"],["objecttopreparestring","variable"],["objecttopreparestringth","variable"],["objecttopreparevalues","variable"],["objecttopreparex","variable"],["objecttopreparey","variable"],["objecttopreparez","variable"],["objs","apple"],["oblates","person"],["obligate","plus"],["obligated","plus"],["obligates","plus"],["obligating","plus"],["obligation","plus"],["obligations","plus"],["obliged","plus"],["oboe","tube"],["obs","box"],["obscure","minus"],["observable","plus"],["observant","eye"],["observation","eye"],["observational","eye"],["observations","n"],["observatory","star"],["observe","eye"],["observed","eye"],["observer","person"],["observers","person"],["observing","eye"],["ob
solescence","minus"],["obsolete","minus"],["obsoleted","xcorrection"],["obsoletion","xcorrection"],["obstacle","box"],["obstacles","box"],["obstruct","box"],["obstruction","box"],["obtain","left"],["obtained","right"],["obvious","plus"],["obviously","plus"],["ocarina","box"],["occasionally","plus"],["occupants","person"],["occupation","paper"],["occupational","paper"],["occupied","plus"],["occupies","dot"],["occupy","square"],["occupying","plus"],["occur","plus"],["occurred","plus"],["occurrence","plus"],["occurring","dot"],["occurs","dot"],["ocean","water"],["oceanographer","person"],["oclass","variable"],["oct","eightrectangle"],["octagon",""],["octahedral","octahedron"],["octahedron",""],["octahedrons","octahedron"],["octave","eightrectangle"],["octaves","line"],["octet","eightrectangle"],["october","square"],["octopus",""],["ocw","book"],["odd","n"],["odds","p"],["odwyer","person"],["oedipal","person"],["oedipus","person"],["oesophagus","tube"],["oeuvre","book"],["oevre","book"],["of","dash"],["off","xcorrection"],["offable","tick"],["offend","minus"],["offended","minus"],["offender","person"],["offending","minus"],["offensive","minus"],["offer","plus"],["offered","right"],["offering","hand"],["offerings","box"],["offers","apple"],["office","room"],["officer","person"],["officers","person"],["offices","room"],["officeworks","company"],["official","plus"],["offline","zero"],["offs","down"],["offset","n"],["offspring","person"],["ofh","box"],["often","one"],["oga","dash"],["ogvoh","dash"],["oh","plus"],["ohdvoxc","dash"],["oil",""],["oiled","oil"],["oiling","oil"],["ok","plus"],["okay","plus"],["ol","variable"],["old","dash"],["olddoctors","person"],["older","greaterthan"],["oldest","n"],["oldmeditators","person"],["olds","person"],["olfactory","nose"],["olive",""],["oliver","person"],["olivia","person"],["olsson","person"],["olympiad","book"],["olympic","ball"],["olympics","ball"],["om","person"],["oman","country"],["omar","person"],["omega","square"],["omerbseze
r","person"],["omissions","minus"],["omit","xcorrection"],["omits","minus"],["omitted","zero"],["on","plus"],["onboarding","dot"],["once","one"],["oncoming","square"],["ond","dash"],["one",""],["oneanother","person"],["onelonghandle","handle"],["oneness","one"],["ones","one"],["oneself","person"],["onezero","one"],["onfray","person"],["ong","person"],["ongoing","line"],["onings","apple"],["onion",""],["onlily","one"],["online","dash"],["only","one"],["onnahope","plus"],["onness","square"],["onnonletter","square"],["onoy","person"],["ons","dot"],["onset","line"],["ont","square"],["ontario","city"],["ontic","square"],["ontinue","dash"],["onto","down"],["ontol","box"],["ontological","square"],["ontologies","square"],["ontologise","square"],["ontologised","box"],["ontology","and"],["onwards","right"],["oo","dot"],["ooh","plus"],["oohs","note"],["ool","dash"],["oop","minus"],["op","right"],["opaque","square"],["open","dash"],["openbracket",""],["opened","up"],["openeditor","square"],["openeditorwitherror","xcorrection"],["opening","up"],["openings","box"],["openly","tick"],["openness","plus"],["openroundbracket",""],["opens","square"],["opensquarebracket",""],["opera","book"],["operate","hand"],["operated","box"],["operates","plus"],["operating","line"],["operatingsystem",""],["operation","and"],["operational","tick"],["operations","and"],["operative","and"],["operativity","plus"],["operator","and"],["operators","and"],["ophelia","person"],["opiate","plus"],["opie","person"],["opinion","square"],["opinions","square"],["opponent","person"],["opponents","person"],["opportunities","plus"],["opportunity","plus"],["oppose","xcorrection"],["opposed","xcorrection"],["opposing","dash"],["opposite","dash"],["opposites","xcorrection"],["opposition","minus"],["oppositions","xcorrection"],["oppress","minus"],["oppressed","minus"],["oppression","minus"],["oppressions","minus"],["oppressive","minus"],["oppressiveness","minus"],["oprah","person"],["oprotocol","dash"],["ops","box"],["op
tative","plus"],["optical","eye"],["optimal","plus"],["optimally","one"],["optimisation","one"],["optimise","one"],["optimisecode","one"],["optimised","one"],["optimiser","one"],["optimises","down"],["optimising","down"],["optimism","plus"],["optimistic","plus"],["optimization","plus"],["optimum","plus"],["option","dash"],["optional","plus"],["optionally","plus"],["options","plus"],["optometry","eye"],["opus","book"],["opusman","person"],["or",""],["oral","mouth"],["orally","mouth"],["orange",""],["oranges","orange"],["orangutan",""],["oranoos","person"],["orb","ball"],["orbiting","circle"],["orch","violin"],["orchestra","instrument"],["orchestral","baton"],["orcutorifthen","right"],["ordained","plus"],["order","n"],["ordered","n"],["ordering","paper"],["orders","n"],["ordinary","plus"],["ordinate","n"],["ordinated","plus"],["ordinates","n"],["ordination","plus"],["ords","n"],["ore","dash"],["org","building"],["organ","box"],["organic","apple"],["organisation","building"],["organisational","company"],["organisations","organisation"],["organise","plus"],["organised","tick"],["organiser","person"],["organising","dash"],["organism","person"],["organisms","person"],["organization","building"],["organizational","company"],["organized","plus"],["organizer","person"],["organs","box"],["orhan","person"],["orient","right"],["orientation","right"],["orientations","line"],["oriented","right"],["orienteered","right"],["orienteering","right"],["orienteers","person"],["orienting","right"],["orig","plus"],["origin","box"],["original","one"],["originality","plus"],["originally","one"],["originals","one"],["originate","zero"],["originated","dot"],["originating","dot"],["origins","dot"],["orlando","person"],["ornament","plus"],["orphaned","dot"],["orphanedalgorithms","variable"],["orphanedbreasonings","apple"],["orphanedbreathsonings","variable"],["orphaneddirections","variable"],["orphanedobjectstofinish","variable"],["orphanedobjectstoprepare","variable"],["orphanedpartsofrooms","v
ariable"],["orphanedrooms","variable"],["orsola","person"],["orthodontic","tooth"],["orthodontist","person"],["orthodox","plus"],["ory","person"],["os","box"],["osaka","city"],["oscillating","right"],["ost","note"],["osxdaily","dash"],["ot","right"],["oth","dash"],["other","person"],["otherdepartment","building"],["othered","two"],["otherlander","person"],["others","person"],["otherwise","minus"],["ou","dash"],["oua","company"],["ouch","minus"],["ouptut","square"],["our","dash"],["ours","person"],["ourselves","person"],["out","minus"],["outages","minus"],["outback","tree"],["outbreak","minus"],["outbursts","minus"],["outcome","n"],["outcomes","right"],["outdoor","plant"],["outdoors","right"],["outer","right"],["outfit","trousers"],["outfits","shirt"],["outflow","right"],["outgoing","plus"],["outlandish","minus"],["outlawed","xcorrection"],["outlawing","xcorrection"],["outlay","down"],["outlaying","square"],["outlier","dot"],["outliers","minus"],["outline","square"],["outlined","square"],["outlines","paper"],["outlook","eye"],["outmodes","right"],["outness","right"],["outpatients","person"],["outpost","city"],["output","variable"],["outputarguments","variable"],["outputasconsequent","n"],["outputformat","n"],["outputlabels","square"],["outputless","zero"],["outputlyrics","square"],["outputnumber","n"],["outputs","variable"],["outputted","n"],["outputting","right"],["outputtype","square"],["outputvalues","n"],["outputvar","variable"],["outputvarlist","twobox"],["outputvars","box"],["outputvarsl","n"],["outro","n"],["outs","right"],["outset","n"],["outside","square"],["outsiders","person"],["outsource","right"],["outsourced","right"],["outsourcing","right"],["outstanding","plus"],["outturned","right"],["outward","right"],["outwardly","right"],["outwards","right"],["outweigh","greaterthan"],["oval","square"],["ovale","oval"],["ovary","organ"],["oven",""],["over","xcorrection"],["overactive","up"],["overall","n"],["overarching","box"],["overbearing","xcorrection"],["over
bearingly","xcorrection"],["overcame","up"],["overcast","sky"],["overcome","xcorrection"],["overcomes","xcorrection"],["overcoming","xcorrection"],["overdo","minus"],["overdoing","minus"],["overdone","plus"],["overdriven","up"],["overeating","minus"],["overflow","minus"],["overhead","up"],["overheard","ear"],["overlap","square"],["overlapped","n"],["overlapping","box"],["overlaps","square"],["overload","minus"],["overlooked","minus"],["overlooks","minus"],["overly","plus"],["overnight","square"],["override","right"],["overriding","plus"],["overseas","dot"],["oversee","eye"],["oversight","minus"],["overspend","dollarsymbol"],["overtake","right"],["overtaken","xcorrection"],["overtakes","right"],["overthrow","xcorrection"],["overthrowing","xcorrection"],["overtime","plus"],["overtook","right"],["overture","note"],["overturned","up"],["overturning","plus"],["overuse","minus"],["overview","square"],["overwatered","water"],["overwhelmed","minus"],["overwhelming","minus"],["overwrite","down"],["ovid","person"],["owing","dollarsymbol"],["owl",""],["owls","owl"],["own","dash"],["owned","box"],["owner","person"],["ownerisation","plus"],["owners","person"],["ownership","box"],["owning","dollarsymbol"],["owns","box"],["oxe","dash"],["oxford","university"],["oxley","person"],["oxygen",""],["oxygenated","box"],["oyqym","dash"],["ozemail","company"],["ozone",""],["p","a"],["pa","dash"],["pablo","person"],["pace","n"],["pachelbel","person"],["pack","box"],["package","box"],["packages","software"],["packed","box"],["packet",""],["packing","down"],["packs","box"],["pad","line"],["padded","box"],["padding","box"],["paddled","right"],["paddock","square"],["pads","line"],["paedophile","person"],["paedophiles","person"],["paedophilia","minus"],["paella",""],["page","paper"],["pagerank","n"],["pages","paper"],["pagliarella","person"],["paid","dollarsign"],["pain","minus"],["painful","minus"],["painless","plus"],["pains","minus"],["painstaking","plus"],["painstakingly","plus"],["paint",""
],["paintbrush","brush"],["painted","paint"],["painter","person"],["painting",""],["paintings","painting"],["paints","paint"],["pair","two"],["pairs","tworectangle"],["pak","person"],["pakistan","country"],["palace","building"],["palacebos","dash"],["palau","country"],["palestine","country"],["palette",""],["palimpsest","paper"],["pallet",""],["palm","up"],["palmer","person"],["palms","palm"],["palpation","dash"],["palsy","minus"],["pamphlet",""],["pamphlets","pamphlet"],["pan",""],["panama","country"],["pancake",""],["panchan","person"],["pancreas",""],["pandemic","minus"],["pane","square"],["panel","square"],["panforte",""],["pangs","minus"],["panic","minus"],["panning","dash"],["pant",""],["pantry",""],["pants",""],["panty","underpants"],["papa","person"],["paper",""],["papers","paper"],["papersga","dash"],["papua","country"],["para","square"],["paracosm","plus"],["parade","right"],["paraded","person"],["paradigm","right"],["paradigmatic","plus"],["paradigms","right"],["parading","person"],["paradox","minus"],["paradoxical","minus"],["paradoxically","minus"],["paragraph","square"],["paragraphs","square"],["paraguay","country"],["parakeet","bird"],["paralink","company"],["parallel","line"],["parallels","line"],["parameters","n"],["parampalli","person"],["params","variable"],["paraphrase","square"],["paraphrased","square"],["paraphraser","right"],["paraphrasing","square"],["paraphrasings","equals"],["paras","square"],["parasol",""],["parathyroid","box"],["parcel","box"],["parchment","paper"],["pareil","equals"],["parent","up"],["parentheses","square"],["parenthesis","brackets"],["parenthood","person"],["parents","person"],["pareto","plus"],["parietal","brain"],["paris","city"],["parises","city"],["parisian","city"],["park",""],["parke","person"],["parked","dot"],["parking","car"],["parkinson","person"],["parkinsons","minus"],["parks","plant"],["parliament","building"],["parliamentarians","person"],["parliamentary","book"],["parlour","room"],["parmar","person"],["pa
rochialism","minus"],["parodied","plus"],["parodies","plus"],["paronamastically","equals"],["paronomastic","equals"],["paronomastically","plus"],["paronomasticism","dash"],["parriage","plus"],["parrot",""],["parse","down"],["parsed","down"],["parser","dash"],["parses","dash"],["parsesentence","square"],["parshina","person"],["parsing","down"],["part","box"],["partake","plus"],["parted","right"],["parth","dash"],["parthh","dash"],["partial","n"],["partially","one"],["participant","person"],["participants","person"],["participate","tick"],["participated","plus"],["participating","tick"],["participation","plus"],["participatory","plus"],["participle","plus"],["participles","plus"],["particle","ball"],["particles","box"],["particular","one"],["particularly","plus"],["particulars","plus"],["parties","person"],["partisan","plus"],["partlength","n"],["partly","plus"],["partner","person"],["partnered","right"],["partnering","right"],["partners","person"],["partnerships","dash"],["partofroom","table"],["partofroomdict","book"],["partofroomdifferencet","d"],["partofroomdt","variable"],["partofroomend","variable"],["partofroomlength","n"],["partofroomlengtht","n"],["partofroomstring","variable"],["partofroomstringth","variable"],["partofroomvalues","variable"],["partofroomx","variable"],["partofroomy","variable"],["partofroomz","variable"],["parts","tworectangle"],["partsofrooms","table"],["party","house"],["pas","plus"],["pasa","city"],["pasadena","suburb"],["pashphalt","plus"],["pass","dash"],["passage","right"],["passages","hall"],["passcode","n"],["passed","tick"],["passenger","person"],["passengers","person"],["passers","person"],["passes","right"],["passing","dash"],["passion","plus"],["passionate","plus"],["passionne","plus"],["passive","zero"],["passphrase","square"],["passport","book"],["passwd","square"],["password","square"],["passwordatom","box"],["passwords","square"],["passwordstring","square"],["past","left"],["pasta",""],["pastable","square"],["paste","square"]
,["pasted","square"],["pastime","n"],["pasting","square"],["pastry",""],["pasttries","left"],["pasty",""],["pat","hand"],["pataki","person"],["path",""],["pathetic","minus"],["pathogen","minus"],["pathogenesis","line"],["pathogenic","line"],["pathogens","bacterium"],["pathological","minus"],["pathologically","minus"],["pathology","minus"],["paths","path"],["pathway",""],["pathways","path"],["patience","plus"],["patient","person"],["patients","person"],["patriage","plus"],["patrol","person"],["patrolled","plus"],["patted","hand"],["pattern","square"],["patterns","square"],["patting","hand"],["patty",""],["paul","person"],["paula","person"],["paull","person"],["pauls","person"],["paultramlgreturningfromshowreel","person"],["pause","line"],["paused","zero"],["pauses","box"],["pavane","paper"],["pavement","path"],["paving","plane"],["paw",""],["pawing","right"],["pawn","person"],["paws","paw"],["pax","line"],["pay","dollarsign"],["payanimhighered","dollarsymbol"],["paycommentasacelebrity","a"],["paydaoistheadachemedicinetexttobr","dollarsign"],["payer","person"],["paying","dollarsymbol"],["payment","dollarsign"],["payments","dollarsymbol"],["payne","person"],["paypal","company"],["paypalapis","company"],["pays","dollarsymbol"],["pbm","square"],["pbt","dash"],["pc","box"],["pce","and"],["pckg","box"],["pconverter","right"],["pcre","right"],["pdf","paper"],["pea",""],["peace","a"],["peaceful","plus"],["peacefully","plus"],["peach",""],["peaches","peach"],["peacocks","peacock"],["peak","n"],["peaks","dot"],["peanut","nut"],["peanuts","nut"],["pear",""],["pearl","person"],["pears","pear"],["peas","pea"],["peasants","person"],["pecan",""],["pecking","seed"],["ped","book"],["pedagogical","paper"],["pedagogically","book"],["pedagogies","book"],["pedagogise","book"],["pedagogue","person"],["pedagogues","person"],["pedagogy","book"],["pedalling","pedal"],["pedanticals","person"],["pedestal",""],["pedestrian","person"],["pedestrians","person"],["pedocracy","one"],["pedophile","pe
rson"],["pedophilia","minus"],["peds","person"],["pee","dot"],["peel","box"],["peeled","apple"],["peeling","up"],["peephole","hole"],["peer","person"],["peers","person"],["peg",""],["peirce","person"],["pelicanstore","dollarsymbol"],["pellas","person"],["pellets",""],["peloponnesia","city"],["pen",""],["penalised","xcorrection"],["penalties","xcorrection"],["penalty","minus"],["pencil",""],["pencilsharpener",""],["pending","plus"],["pendulum",""],["pengines","square"],["penis",""],["penned","pen"],["penning","pen"],["penny","person"],["pennycook","person"],["pensiero","box"],["pension","dollarsymbol"],["pensions","dollarsymbol"],["pentagon",""],["pentose","sugar"],["penultimatevars","variable"],["peo","person"],["people","person"],["peoples","person"],["pepper",""],["peppered","pepper"],["peppermint","plant"],["peppers","capsicum"],["per","dash"],["perceived","eye"],["percent","n"],["percentage","n"],["perception","square"],["perceptions","eye"],["perceptron","eye"],["perceptual","eye"],["percussion","drum"],["percussive","equals"],["pereira","person"],["perennial","line"],["perfect","a"],["perfection","one"],["perfectionism","plus"],["perfectly","one"],["perfects","plus"],["perforated","line"],["perform","a"],["performance","stage"],["performances","box"],["performative","plus"],["performativity","plus"],["performed","person"],["performer","person"],["performers","person"],["performerstuff","person"],["performing","person"],["performs","priscilla"],["perfume",""],["perfusion","right"],["pergola",""],["perhaps","zero"],["period","line"],["periodically","t"],["periods","line"],["peripheral","right"],["periphery","n"],["perishable","minus"],["peristaltic","down"],["perlin","person"],["permanently","plus"],["permeable","right"],["permeate","down"],["permission","plus"],["permits","tick"],["permitted","plus"],["perpendicular","square"],["perpetual","n"],["perpetuate","plus"],["perry","person"],["perseverance","plus"],["persevering","plus"],["persia","country"],["persist
","plus"],["persistence","plus"],["persistent","plus"],["persistently","plus"],["persists","plus"],["person",""],["personable","plus"],["personal","dot"],["personalism","person"],["personalities","person"],["personality","person"],["personalityforge","company"],["personally","person"],["personcoords","n"],["personell","person"],["persons","person"],["perspectival","eye"],["perspective","eye"],["perspectives","dot"],["perspectivism","eye"],["perspiration","water"],["persps","eye"],["persson","person"],["persuade","plus"],["persuaded","plus"],["persuading","plus"],["persuasive","plus"],["pertaining","right"],["peru","country"],["pessimising","minus"],["pestle",""],["pet","animal"],["petal",""],["petals","petal"],["peter","person"],["peters","person"],["petersburg","city"],["petersburgs","city"],["petersen","person"],["pets","animal"],["petty","person"],["pf","square"],["pfile","paper"],["pfs","and"],["pg","paper"],["pgrad","person"],["pgst","dash"],["pgtd","square"],["ph","dash"],["phallus","rod"],["phalluses","rod"],["phantasmagoria","box"],["pharmaceuticals","medicine"],["pharmacological","ball"],["pharmacology","ball"],["pharynx","tube"],["phase","line"],["phd","book"],["phds","book"],["pheasant",""],["phenomena","box"],["phenomenological","right"],["phenomenologist","person"],["phenomenology","square"],["phenomenon","square"],["pher","square"],["phi","square"],["phil","a"],["philanthropical","plus"],["philanthropist","person"],["philanthropists","person"],["philanthropy","right"],["philatelist","person"],["philip","person"],["philippines","country"],["phillip","person"],["philosopher","person"],["philosophers","person"],["philosophic","dash"],["philosophical","dash"],["philosophically","right"],["philosophicon","square"],["philosophicus","right"],["philosophies","paper"],["philosophise","square"],["philosophised","book"],["philosophy","book"],["phils","book"],["philslengthlist","square"],["phine","dash"],["phoenix","bird"],["phone","n"],["phoneme","square"],["phon
emes","square"],["phones","phone"],["phonetic","square"],["phontology","mouth"],["photo","square"],["photograph","square"],["photographed","square"],["photographer","person"],["photographic","square"],["photographing","square"],["photographs","square"],["photography","square"],["photon",""],["photons","ball"],["photos","square"],["photoshop","software"],["photosynthesis","right"],["photosynthetic","right"],["php","paper"],["phpversion","n"],["phrase","square"],["phrases","square"],["phrasesfile","paper"],["phrasevalue","n"],["phrasevarname","square"],["phrasing","square"],["phryne","person"],["phyllis","person"],["phylogenetic","dash"],["phys","box"],["physic","square"],["physical","box"],["physically","box"],["physician","person"],["physicians","person"],["physicist","person"],["physics","plus"],["physiological","body"],["physiologically","person"],["physiologies","body"],["physiologist","person"],["physiology","person"],["physiotherapy","book"],["physis","right"],["pi",""],["pia","person"],["pianissimos","line"],["pianist","person"],["piano",""],["pic","square"],["piccolo","note"],["pick","plus"],["picked","apple"],["pickering","person"],["pickies","n"],["picking","tick"],["pickle",""],["pickled","n"],["picks","tick"],["picky","plus"],["picture","square"],["pictured","square"],["pictures","square"],["picturing","square"],["pid","dash"],["pie",""],["piece","paper"],["pieces","box"],["pierced","dot"],["piercings","hole"],["piety","plus"],["pig",""],["pigeon",""],["pigeonhole","box"],["pigment","square"],["pile","up"],["pill","tablet"],["pillar","cylinder"],["pillow",""],["pillowed","pillow"],["pillows","pillow"],["pilot","person"],["pimple",""],["pin",""],["pinball","ball"],["pinch","minus"],["pine","tree"],["pineal","gland"],["pineapple",""],["pinecone",""],["pined","plus"],["pinhole","hole"],["pink","square"],["pinkies","finger"],["pinkwhistles","whistle"],["pinky","finger"],["pinnacle","dot"],["pinned","dot"],["pinpoint","dot"],["pins","pin"],["pinter","person"],
["pinterest","company"],["pinx","square"],["pio","right"],["pious","plus"],["pip",""],["pipe",""],["pipeline","line"],["piper","person"],["pipes","pipe"],["pipetted","pipette"],["pippin","apple"],["pips","pip"],["pipsqueaks","minus"],["pisa","city"],["pistachio","nut"],["pitch","tick"],["pitched","plus"],["pitches","tick"],["pitchouli",""],["pitchproof","note"],["pitfalls","minus"],["pithfulnesses","down"],["pitrisaur","dinosaur"],["pitted","stone"],["pittosporum","plant"],["pituitary","box"],["pity","minus"],["pivot","right"],["pivotal","dot"],["pixar","company"],["pixel","square"],["pixelate","square"],["pixelated","square"],["pixellated","square"],["pixels","square"],["pixie","person"],["pixies","person"],["pizza",""],["pizzicato","finger"],["pl","square"],["pla","dash"],["place","building"],["placebo","tablet"],["placed","hand"],["placeholders","dot"],["placement","square"],["placenta","organ"],["places","tree"],["placing","down"],["plagiarise","minus"],["plagiarised","minus"],["plagiarises","minus"],["plagiarising","minus"],["plagiarism","minus"],["plain","plus"],["plainer","plus"],["plainness","plus"],["plaintiffs","person"],["plan","paper"],["plane",""],["planes","plane"],["planet",""],["planetarium","star"],["planetary","planet"],["planets","ball"],["plank",""],["planned","paper"],["planner","square"],["planning","right"],["plans","paper"],["plant",""],["planted","seed"],["planting","seed"],["plants","plant"],["plaque","minus"],["plasma",""],["plastic","plane"],["plasticated","plastic"],["plate",""],["plateau","box"],["platelet",""],["platelets","platelet"],["platform","box"],["platforms","square"],["platinum","ball"],["plato","person"],["platolucian","person"],["platonic","person"],["platonism","book"],["platonist","person"],["platter","plate"],["plausible","plus"],["play","ball"],["playback","right"],["played","ball"],["player","person"],["players","person"],["playful","ball"],["playfully","plus"],["playing","plus"],["playlist","square"],["plays","book"],[
"playwright","person"],["pldoc","square"],["ple","person"],["plea","plus"],["pleasant","plus"],["please","plus"],["pleased","plus"],["pleasing","plus"],["pleasure","plus"],["pleasures","plus"],["plebeian","person"],["plectrums","plectrum"],["pleiotropic","n"],["plentiful","n"],["plenty","plus"],["pliers",""],["plimsoll","line"],["plode","box"],["plogue","company"],["plos","book"],["plot","dot"],["plots","dot"],["plotted","paper"],["plotting","dot"],["plough",""],["ploughed","right"],["pls","square"],["plucked","right"],["plucking","right"],["plug","box"],["plugged","right"],["plugins","box"],["plugs","right"],["pluhar","person"],["plum",""],["plume","feather"],["plunger","down"],["pluperfect","right"],["plural","n"],["pluralism","box"],["plus",""],["plusone","plus"],["plustwo","plus"],["pluto","planet"],["pm","dash"],["pms","software"],["pn","n"],["pnal","n"],["pnals","n"],["png","paper"],["po","university"],["pocket",""],["pocketing","box"],["pocketwatch",""],["pod","square"],["podcast","paper"],["podcaster","person"],["podiatrist","person"],["podium",""],["pods","pod"],["poem","paper"],["poet","person"],["poetic","square"],["poetics","book"],["poetry","square"],["poh","person"],["poi","dash"],["point","dot"],["pointed","dot"],["pointedness","dot"],["pointer","dot"],["pointers","dot"],["pointing","down"],["points","dot"],["poisson","person"],["poked","down"],["poking","right"],["pol","box"],["poland","country"],["polanyi","person"],["polarity","rod"],["pole","dot"],["poles","line"],["police","person"],["policeman","person"],["policies","square"],["policy","paper"],["polis","city"],["polish","plus"],["polished","plus"],["polishing","square"],["polite","person"],["politely","plus"],["politeness","plus"],["politenesses","plus"],["political","up"],["politically","dot"],["politician","person"],["politicians","person"],["politicity","plus"],["politics","dash"],["politicsrelation","dash"],["polkinghorns","plus"],["poll","plus"],["pollution","box"],["polly","person"],["pol
onialnesses","person"],["poltergeist","minus"],["polycaring","plus"],["polyglot","n"],["polygon","square"],["polygons","square"],["polyhedron","box"],["polyhedrons","box"],["polymath","person"],["polynomial","n"],["polyp","ball"],["polysynth","n"],["polytechnic","university"],["pom","tomato"],["pomegranate",""],["pommelles","plus"],["pomodoro","tomato"],["pomodoros","paper"],["pomology","dash"],["poms","book"],["pond","water"],["ponds","suburb"],["pone","dash"],["ponens","hand"],["ponty","person"],["pony",""],["poo","person"],["poofters","person"],["pooh","bear"],["pool",""],["poor","minus"],["pooty","person"],["pop","bubble"],["popclassical","ball"],["popclassicalcomposition","note"],["popcorn",""],["pope","person"],["popogogy","book"],["popol","book"],["popological","person"],["popologically","person"],["popology","book"],["poppadum",""],["popped","up"],["poppies","flower"],["popping","air"],["pops","bubble"],["popsicle",""],["popular","plus"],["popularities","plus"],["popularity","plus"],["popularized","plus"],["populatevars","variable"],["population","n"],["populations","n"],["populum","person"],["porcupine",""],["pork",""],["porridge",""],["port","box"],["portable","right"],["ported","right"],["porter","person"],["portfolio","book"],["portfolios","book"],["porting","right"],["portion","box"],["portrait","square"],["portray","square"],["portugal","country"],["portuguese","paper"],["pos","plus"],["pose","plus"],["posed","plus"],["poseia","plus"],["poses","plus"],["posing","plus"],["posited","plus"],["position","n"],["positioned","n"],["positionlist","square"],["positions","person"],["positive","plus"],["positively","plus"],["positivestate","plus"],["positivism","plus"],["positivist","plus"],["positivity","plus"],["positivityscore","plus"],["poss","plus"],["possess","box"],["possessing","box"],["possibilities","dash"],["possibility","dash"],["possible","plus"],["possibly","one"],["possiblynotworking","minus"],["post","right"],["postal","paper"],["postcode","square
"],["postcolonial","right"],["postcolonialism","country"],["posted","letter"],["poster","paper"],["posteriori","right"],["posterity","n"],["postgrads","person"],["postgraduate","person"],["posthumanism","right"],["posting","square"],["postkey","n"],["postludetudine","square"],["postnet","paper"],["postpone","xcorrection"],["posts","square"],["postsong","note"],["poststructural","book"],["poststructuralism","right"],["postulated","plus"],["posture","line"],["pot",""],["potato",""],["potatoes","potato"],["potbelly",""],["potential","tick"],["potentiality","plus"],["potentially","plus"],["potholes","down"],["potion",""],["potoroo",""],["potpourri","flower"],["pots","pot"],["potter","person"],["pottery","pot"],["poulton","person"],["pounding","down"],["pounds","n"],["pour","right"],["poured","water"],["pouring","down"],["poverty","minus"],["powder","box"],["powell","person"],["power","plus"],["powered","right"],["powerful","plus"],["powerfully","plus"],["powering","n"],["powerless","zero"],["powerlessness","minus"],["powerload","n"],["powerman","person"],["powerpoint","software"],["powers","person"],["poynter","person"],["pozible","company"],["pp","square"],["ppca","company"],["ppm","paper"],["ppx","dash"],["pra","dash"],["prac","plus"],["pract","dash"],["practical","plus"],["practicality","plus"],["practically","plus"],["practice","plus"],["practiced","plus"],["practices","square"],["practicing","plus"],["practicum","paper"],["practicums","paper"],["practise","plus"],["practised","plus"],["practises","plus"],["practising","plus"],["practitioner","person"],["practitioners","person"],["pragmatic","plus"],["prahran","suburb"],["praise","plus"],["praising","plus"],["pranayama","yline"],["praxemes","square"],["pray","plus"],["prayed","plus"],["prayer","paper"],["prayers","square"],["praying","brain"],["pre","left"],["preamble","square"],["precative","up"],["precaution","plus"],["precautions","xcorrection"],["precede","right"],["preceded","left"],["precedes","right"],["preci
ous","plus"],["precipice","up"],["precise","plus"],["precision","plus"],["precludes","xcorrection"],["precomputed","and"],["precursor","left"],["precursors","left"],["pred","square"],["predation","animal"],["predationism","up"],["predatorial","up"],["predators","animals"],["predefined","tick"],["predetermined","plus"],["predicate","n"],["predicateer","person"],["predicateform","square"],["predicateforms","square"],["predicatename","square"],["predicatenamen","square"],["predicates","square"],["predicatesa","square"],["predicatesdictionary","and"],["predicatised","square"],["predict","two"],["predictability","plus"],["predictable","dot"],["predicted","right"],["predicting","right"],["prediction","square"],["predictions","square"],["predictive","left"],["predictively","plus"],["predictor","variable"],["predicts","up"],["prednamearity","n"],["prednamearitylist","square"],["predominantly","plus"],["preds","square"],["preened","plus"],["preening","plus"],["prefer","plus"],["preferable","plus"],["preferably","plus"],["preference","tick"],["preferences","tick"],["preferred","plus"],["prefers","plus"],["prefix","left"],["prefixes","square"],["preg","dash"],["pregnancies","baby"],["pregnancy","line"],["pregnant","person"],["prehistoric","line"],["prehistory","line"],["prejudice","minus"],["prejudices","minus"],["preliminarily","one"],["preliminary","one"],["prelude","note"],["preludes","left"],["premise","square"],["premises","room"],["premium","one"],["première","one"],["preoccupation","plus"],["preoedipal","right"],["prep","right"],["prepaid","dollarsymbol"],["prepar","plus"],["preparation","left"],["preparations","plus"],["preparatory","right"],["prepare","right"],["prepared","plus"],["prepares","plus"],["preparing","plus"],["prepensive","left"],["preplpi","and"],["preposition","box"],["prepositioned","right"],["prepositions","box"],["prequel","book"],["prereqs","right"],["prerequisite","right"],["prerequisites","right"],["presage","n"],["prescribe","square"],["prescribed
","paper"],["prescribing","paper"],["prescription","paper"],["preselected","plus"],["presence","person"],["present","box"],["presentation","square"],["presentations","square"],["presented","square"],["presenter","person"],["presenters","person"],["presenting","square"],["presentness","plus"],["presents","square"],["preservation","plus"],["preserve","one"],["preserved","right"],["preserver","plus"],["preserves","right"],["preserving","plus"],["preset","one"],["preside","plus"],["president","person"],["prespises","plus"],["press","box"],["pressed","down"],["presser","down"],["presses","finger"],["pressing","down"],["pressure","n"],["prestigious","plus"],["prestigiousness","plus"],["preston","person"],["prestring","square"],["presumption","square"],["presuppositions","square"],["pretend","plus"],["pretended","plus"],["pretending","plus"],["pretty","plus"],["prettyprint","square"],["prev","left"],["prevalent","plus"],["prevent","xcorrection"],["preventable","xcorrection"],["preventative","plus"],["preventbodyproblems","xcorrection"],["prevented","xcorrection"],["preventheadaches","xcorrection"],["preventing","xcorrection"],["prevention","xcorrection"],["preventive","xcorrection"],["prevents","xcorrection"],["preview","square"],["previews","square"],["previous","left"],["previousbt","left"],["previously","minus"],["prevstate","n"],["prey","down"],["preying","down"],["price","dollarsign"],["prices","dollarsymbol"],["prickly","spine"],["pride","plus"],["priest","person"],["prig","dash"],["prim","one"],["primarily","one"],["primary","one"],["prime","one"],["primer","paper"],["primera","one"],["primitive","down"],["primordial","n"],["prince","person"],["princess","person"],["princesses","person"],["principal","person"],["principe","country"],["principias","book"],["principle","square"],["principles","square"],["print","square"],["printable","paper"],["printed","square"],["printer",""],["printheader","square"],["printing","square"],["printline","line"],["printmaking","square"
],["printout","square"],["prints","square"],["prior","left"],["priori","left"],["priorities","n"],["prioritise","plus"],["prioritised","plus"],["priority","one"],["priscilla",""],["prism",""],["prisms","cube"],["prisoner","person"],["pristine","plus"],["privacy","box"],["private","box"],["privately","plus"],["privateness","plus"],["privilege","plus"],["privileged","plus"],["prize","box"],["prizes","plus"],["prlote","paper"],["prlotereview","box"],["prlr","box"],["pro","plus"],["proactive","plus"],["probabilistic","n"],["probability","one"],["probable","one"],["probably","plus"],["probe","right"],["probed","down"],["probiotic","plant"],["problem","xcorrection"],["problematic","minus"],["problematise","xcorrection"],["problems","a"],["proboscis","right"],["proc","plus"],["procedural","right"],["procedure","and"],["procedures","right"],["proceed","right"],["proceeded","right"],["proceedings","line"],["proceeds","dollarsymbol"],["process","line"],["processcode","and"],["processed","right"],["processes","right"],["processing","right"],["processor","and"],["processors","box"],["processual","right"],["proclaim","square"],["proclaimed","plus"],["proclamation","square"],["procrastinator","person"],["procreate","down"],["procurement","right"],["prod","box"],["prodigious","plus"],["prodos","box"],["prods","product"],["produce","right"],["produced","right"],["producer","person"],["producers","person"],["produces","right"],["producing","right"],["product","leash"],["production","right"],["productions","box"],["productive","plus"],["productively","plus"],["productivity","plus"],["products","apple"],["prof","person"],["profession","book"],["professional","person"],["professionalism","tick"],["professionally","plus"],["professionals","person"],["professor","person"],["professorial","person"],["professors","person"],["professorships","person"],["proficient","plus"],["profile","paper"],["profit","plus"],["profitability","dollarsymbol"],["profitable","dollarsymbol"],["profits","dollar
symbol"],["proforma","paper"],["profoundly","plus"],["profs","person"],["prog","square"],["program","b"],["programa","and"],["programfinder","square"],["programfindercall","right"],["programmable","and"],["programme","book"],["programmed","and"],["programmer","person"],["programmers","person"],["programmes","book"],["programming","and"],["programs","square"],["progress","right"],["progresses","right"],["progressing","right"],["progression","nrectangle"],["progressions","line"],["progressive","plus"],["progs","square"],["prohibited","xcorrection"],["prohibitive","minus"],["project","box"],["projected","right"],["projectile","ball"],["projecting","square"],["projection","square"],["projections","up"],["projector","right"],["projectors","right"],["projects","paper"],["prolapse","minus"],["prolegomenon","square"],["proletarian","person"],["proletariats","person"],["prolifically","n"],["prolog","paper"],["prologtoplevel","up"],["prolong","right"],["prolonged","n"],["prolongs","n"],["promenade","right"],["promiscuity","minus"],["promise","plus"],["promised","plus"],["promote","plus"],["promoted","plus"],["promoters","person"],["promotes","plus"],["promoting","plus"],["promotion","up"],["promotional","plus"],["promotions","up"],["prompt","square"],["prompted","dash"],["prompting","plus"],["promptly","plus"],["prompts","square"],["prone","plus"],["pronounce","mouth"],["pronounced","mouth"],["pronouncement","square"],["pronouncing","tongue"],["pronunciation","mouth"],["proof","tick"],["proofread","eye"],["proofreader","person"],["proofreading","paper"],["proofs","paper"],["prop","apple"],["propaganda","minus"],["propagate","right"],["propagated","right"],["propagating","right"],["propagation","right"],["propeller",""],["propensity","plus"],["proper","plus"],["properly","plus"],["properties","n"],["property","dash"],["prophet","person"],["proportion","n"],["proportional","n"],["proportionality","equals"],["proportions","box"],["proposal","plus"],["proposals","paper"],["propos
e","right"],["proposed","paper"],["proposes","plus"],["proposing","plus"],["proposition","square"],["propositional","tick"],["propositions","a"],["propped","strut"],["props","apple"],["propulsion","right"],["proquest","and"],["pros","plus"],["prosecute","xcorrection"],["prosody","note"],["prospect","person"],["prospected","plus"],["prospective","plus"],["prospects","plus"],["prospectus","book"],["prosperity","plus"],["prosperous","dollarsymbol"],["prostate","box"],["prosthetic",""],["prostitutes","person"],["prostitution","minus"],["protect","plus"],["protected","plus"],["protectedness","plus"],["protecting","plus"],["protection","plus"],["protectionism","plus"],["protections","plus"],["protective","plus"],["protector","plus"],["protectors","square"],["protects","plus"],["protein","ball"],["proteins","protein"],["proto","plus"],["protocol","line"],["protocols","dash"],["protonmail","paper"],["prototype","software"],["prototypes","and"],["protractor","n"],["protruded","right"],["protrudes","right"],["proud","plus"],["proudly","plus"],["proulx","person"],["proust","person"],["prove","right"],["proved","plus"],["proven","plus"],["prover","plus"],["proves","right"],["provide","right"],["provided","right"],["provider","company"],["providers","person"],["provides","right"],["providing","right"],["province","square"],["proving","right"],["provision","right"],["provisional","plus"],["proviso","plus"],["provoke","minus"],["provoked","right"],["proximity","n"],["prozacs","tablet"],["prozesky","person"],["prue","person"],["prune","minus"],["pruned","n"],["prunes","box"],["pruning","xcorrection"],["présent","zero"],["ps","dash"],["psc","dash"],["pseudo","square"],["pseudonym","square"],["pseudospecs","n"],["psipsina","square"],["psu","university"],["psych","person"],["psychiatric","brain"],["psychiatrically","brain"],["psychiatrist","person"],["psychiatrists","person"],["psychiatry","person"],["psychic","person"],["psychoanalysis","brain"],["psychoanalyst","person"],["psychoana
lytic","brain"],["psychological","brain"],["psychologically","brain"],["psychologies","square"],["psychologist","person"],["psychology","paper"],["psychosis","minus"],["pt","dash"],["pty","plus"],["pub","plus"],["public","paper"],["publication","book"],["publications","book"],["publicisation","plus"],["publicise","plus"],["publicising","plus"],["publicist","person"],["publicity","paper"],["publicly","plus"],["publish","paper"],["publishable","book"],["published","book"],["publisher","person"],["publishers","company"],["publishing","book"],["pudding",""],["puddle","water"],["puffin",""],["puffs","air"],["pugh","person"],["puja","plus"],["pull","left"],["pulled","right"],["pulling","right"],["pulp",""],["pulse","square"],["pummeled","down"],["pummeling","down"],["pump","box"],["pumped","pump"],["pumping","right"],["pumpkin",""],["pun","equals"],["punch","magazine"],["punct","square"],["punctual","n"],["punctuation","fullstop"],["punish","minus"],["punished","xcorrection"],["punishment","minus"],["punkt","dot"],["pup",""],["pupils","person"],["pupkebqf","dash"],["puppet",""],["puppets","puppet"],["puppies","puppy"],["puppy",""],["purchase","dollarsymbol"],["purchased","dollarsymbol"],["purchases","dollarsymbol"],["purchasing","dollarsymbol"],["pure","box"],["puree",""],["purely","plus"],["purist","equals"],["purity","plus"],["purple","square"],["purported","plus"],["purporters","person"],["purpose","dash"],["purposeful","plus"],["purposes","dash"],["purse","coin"],["pursue","right"],["pursued","right"],["pursuing","right"],["pursuit","right"],["purusha","air"],["purushans","person"],["purushas","person"],["push","right"],["pushed","right"],["pushing","right"],["pushkin","person"],["puspita","person"],["pussy","cat"],["put","right"],["putin","person"],["putonghua","book"],["puts","right"],["putting","right"],["puttogether","plus"],["putvalue","box"],["putvar","down"],["puzzle","box"],["puzzles","box"],["pv","dash"],["pvc",""],["pw","square"],["pwd","box"],["px","dash"],
["py","software"],["pyes","tick"],["pyramid",""],["python","and"],["pz","variable"],["pǔtōnghua","book"],["q","dash"],["qa","plus"],["qadb","right"],["qatar","country"],["qb","box"],["qbmusic","note"],["qbpl","box"],["qbrbqeiwal","dash"],["qe","dash"],["qi","dot"],["qigong","book"],["qigongbrocades","plus"],["qigongmantra","square"],["qigongsutra","square"],["qingyu","person"],["qm","dash"],["qs","square"],["qsp","dash"],["qsuhxoccdwqavd","dash"],["qsuxdqninknuubrqyba","dash"],["qtdi","dash"],["qu","square"],["qua","down"],["quadb","right"],["quadcycles","car"],["quaderni","fourrectangle"],["quadrant","book"],["quadrants","square"],["quadratic","two"],["quadrilateral","square"],["quadruple","fourrectangle"],["qual","dash"],["qualia","n"],["qualification","paper"],["qualifications","paper"],["qualified","plus"],["qualify","plus"],["qualifying","plus"],["qualitative","n"],["qualities","square"],["quality","a"],["qualms","minus"],["quals","paper"],["quantification","n"],["quantified","n"],["quantitative","n"],["quantities","n"],["quantitive","n"],["quantity","n"],["quantum","n"],["quantumbox","box"],["quantumlink","right"],["quarrelled","minus"],["quarryise","stone"],["quarter","box"],["quartered","box"],["quarters","room"],["quartile","square"],["quartz",""],["quasi","plus"],["quasifontanaland","city"],["quasilogics","f"],["quasiquotation","square"],["quason","plus"],["quaver","note"],["qubit","n"],["queen","person"],["queenly","plus"],["queens","person"],["queensland","square"],["queer","person"],["quell","plus"],["quench","water"],["quenched","water"],["quenching","water"],["queried","square"],["querier","person"],["queries","questionmark"],["query","questionmark"],["queryable","square"],["queryb","square"],["querying","square"],["question","questionmark"],["questionable","minus"],["questioned","questionmark"],["questioner","person"],["questioning","questionmark"],["questionmark",""],["questionmode","right"],["questionnaire","paper"],["questionnaires","paper"],["que
stions","questionmark"],["queue","line"],["queued","line"],["queues","threerectangle"],["quick","n"],["quicker","greaterthan"],["quickly","plus"],["quickness","n"],["quicktime","and"],["quiet","n"],["quietly","zero"],["quince",""],["quinces","quince"],["quincy","person"],["quinlan","person"],["quinn","person"],["quinoa",""],["quip","plus"],["quips","plus"],["quipu","string"],["quipus","string"],["quirk","minus"],["quirky","minus"],["quit","right"],["quite","plus"],["quits","left"],["quitting","left"],["quivers","right"],["quiz","paper"],["quizmaze","plus"],["quizzed","paper"],["quizzes","paper"],["quo","person"],["quoll","animal"],["quora","company"],["quorum","person"],["quota","n"],["quotation","square"],["quote","square"],["quoted","square"],["quotes","square"],["quoth","mouth"],["quotient","n"],["quoting","square"],["qvahhcqz","dash"],["qwfmqsneubo","dash"],["r","box"],["ra","dash"],["rabbit",""],["rabbits","rabbit"],["rabia","person"],["rabiashahid","person"],["raboude","person"],["rac","right"],["race","right"],["races","person"],["rachel","person"],["racial","person"],["racing","right"],["racism","minus"],["racist","minus"],["rack",""],["racket","plus"],["radiation",""],["radical","two"],["radicalism","plus"],["radically","plus"],["radio","zline"],["radioactive","minus"],["radioairplay","company"],["radios","right"],["radish",""],["radius","n"],["rae","person"],["raedt","person"],["rag",""],["rage","minus"],["raggatt","person"],["raggatts","person"],["raichel","person"],["rail","line"],["rain","down"],["rainbow","square"],["rained","water"],["rainforest","tree"],["raining","water"],["raise","up"],["raised","up"],["raises","up"],["raising","up"],["raisins","raisin"],["raison","square"],["raj","person"],["rake",""],["raked","rake"],["rally","person"],["ralph","person"],["ram",""],["rambada","right"],["ramifications","right"],["ramirez","person"],["ramp",""],["ramping","up"],["ramponponed","person"],["ramzan","person"],["ran","path"],["ranamukhaarachchi","person
"],["rancid","minus"],["rancière","person"],["rand","n"],["random","n"],["randomfn","n"],["randomfns","n"],["randomisation","n"],["randomise","n"],["randomised","n"],["randomises","n"],["randomlist","n"],["randomly","n"],["randomness","n"],["randvars","variable"],["rang","n"],["range","line"],["ranged","right"],["ranger","person"],["ranges","line"],["ranging","n"],["ranjit","person"],["rank","n"],["ranked","n"],["ranking","n"],["rankings","n"],["ranks","square"],["ransacked","minus"],["rape","minus"],["rapid","n"],["rapidly","n"],["rapidssl","box"],["rapport","plus"],["rapprochements","plus"],["raptures","plus"],["rapturously","plus"],["rapunzel","person"],["rard","dash"],["rare","plus"],["rarely","minus"],["raspberries","raspberry"],["raspberry","berry"],["rasping","minus"],["rastapopolous","person"],["rata","plus"],["rate","n"],["rated","n"],["ratepayers","person"],["rates","n"],["rathbowne","person"],["rather","minus"],["ratified","plus"],["ratify","plus"],["rating","n"],["ratings","n"],["ratio","divide"],["rational","plus"],["rationale","paper"],["rationales","square"],["rationalisation","plus"],["rationalist","person"],["rationality","plus"],["rationalized","plus"],["rationally","plus"],["ratios","r"],["raw","one"],["rawgit","box"],["ray","line"],["rayner","person"],["rays","right"],["razor","blade"],["rb","dash"],["rc","dash"],["rcaw","square"],["rcawmp","n"],["rcawp","and"],["rcawpa","square"],["rcawpastea","square"],["rcawpspec","paper"],["rcmp","person"],["rct","right"],["rd","dash"],["rdbc","dash"],["re","square"],["rea","dash"],["reach","dot"],["reachable","plus"],["reached","right"],["reachedness","plus"],["reaches","dot"],["reaching","dot"],["react","tick"],["reacted","right"],["reacting","square"],["reaction","right"],["reactions","right"],["read","book"],["readable","plus"],["readd","eye"],["readee","person"],["reader","person"],["readers","eye"],["readfile","paper"],["readily","plus"],["readin","eye"],["readiness","plus"],["reading","book"],["reading
instructions","eye"],["readings","square"],["readline","eye"],["readme","paper"],["readmejson","paper"],["readmes","square"],["reado","apple"],["reads","book"],["readsc","n"],["readthedocs","eye"],["readv","right"],["ready","plus"],["real","box"],["realisation","plus"],["realise","tick"],["realised","plus"],["realises","plus"],["realising","tick"],["realism","box"],["realist","box"],["realistic","plus"],["realistically","plus"],["realities","box"],["reality","box"],["really","plus"],["realm","country"],["realpeoplecastings","company"],["realtime","zero"],["realtimesync","and"],["reap","right"],["reappear","square"],["reapply","plus"],["rear","left"],["reared","plus"],["rearrangement","n"],["reason","square"],["reasonable","square"],["reasonableness","plus"],["reasonably","plus"],["reasoned","square"],["reasoner","person"],["reasoning","square"],["reasonings","square"],["reasons","square"],["reassembled","plus"],["reassess","plus"],["reassign","equals"],["reassure","plus"],["reassuring","plus"],["reassuringly","plus"],["reattached","plus"],["reattaches","dash"],["reattaching","plus"],["reb","xcorrection"],["rebecca","person"],["rebelled","minus"],["rebirth","zero"],["reborn","zero"],["rebound","right"],["rebounder","u"],["rebr","one"],["rebreason","equals"],["rebreasoned","equals"],["rebreasoning","one"],["rebreasonings","equals"],["rebreathsoned","plus"],["rebreathsoning","plus"],["rebreathsonings","hand"],["rebuilt","building"],["rebut","xcorrection"],["rebuttal","xcorrection"],["rebuttals","xcorrection"],["rebutted","xcorrection"],["rebutting","xcorrection"],["rec","tick"],["recalculate","plus"],["recalculated","equals"],["recalibrate","n"],["recall","down"],["recast","plus"],["receipt","paper"],["receive","left"],["received","left"],["receiver","box"],["receivers","receiver"],["receives","left"],["receiving","left"],["recent","left"],["recently","minus"],["receptacle","box"],["reception","plus"],["receptor","box"],["recharge","plus"],["recheck","plus"],["recipe",
"paper"],["recipes","apple"],["recipient","person"],["reciprocation","right"],["reciprocity","right"],["recite","mouth"],["recited","mouth"],["recklessness","minus"],["recklinghausen","minus"],["recline","down"],["reclined","down"],["reclining","down"],["reclusive","minus"],["recog","plus"],["recognisability","eye"],["recognisable","tick"],["recognise","eye"],["recognised","eye"],["recogniser","tick"],["recognises","tick"],["recognising","eye"],["recognition","plus"],["recognized","eye"],["recognizing","plus"],["recollected","plus"],["recommend","plus"],["recommendation","plus"],["recommendations","plus"],["recommended","plus"],["recommending","plus"],["recommends","tick"],["reconciles","tick"],["reconciliation","plus"],["reconciling","plus"],["reconfigured","right"],["reconnect","dash"],["reconnected","dash"],["reconnecting","equals"],["reconsider","plus"],["reconstruct","up"],["reconstructed","up"],["reconstructing","up"],["reconstruction","up"],["reconstructs","box"],["record","square"],["recorded","line"],["recordequins","person"],["recorder","paper"],["recording","square"],["recordings","square"],["records","down"],["recounted","n"],["recover","right"],["recovered","right"],["recovering","xcorrection"],["recovery","plus"],["recreate","plus"],["recreated","plus"],["recreating","plus"],["recreation","plus"],["recruit","up"],["recruited","plus"],["recruiter","person"],["recruiting","up"],["recruitment","plus"],["rect","square"],["rectangle","square"],["rectangles","square"],["rectangular","square"],["rectified","plus"],["rectify","xcorrection"],["rectum","tube"],["recuperate","plus"],["recur","plus"],["recurrent","n"],["recurring","circle"],["recurs","circle"],["recurse","circle"],["recursion","circle"],["recursive","dash"],["recursivecall","circle"],["recursively","circle"],["recursivety","circle"],["recyclables","circle"],["recycle","plus"],["recycled","amount"],["recycler","plus"],["red","square"],["redblackduck","duck"],["redcoats","person"],["reddit","company
"],["redeem","tick"],["redefined","square"],["redemption","tick"],["redesign","pen"],["redesigned","paper"],["redirect","right"],["rediscover","eye"],["redisplayed","square"],["redistribute","right"],["redistributed","right"],["redistribution","right"],["redistributions","right"],["redo","two"],["redone","tick"],["redraft","paper"],["redrafted","paper"],["redrafting","paper"],["redrafts","paper"],["redrawn","pen"],["redress","right"],["redshift","n"],["reduce","minus"],["reduced","minus"],["reducer","down"],["reduces","minus"],["reducing","one"],["reductio","down"],["reduction","minus"],["reductionism","down"],["reductions","right"],["reductive","down"],["redundant","minus"],["reed","tube"],["reedoei","dash"],["reeds","plant"],["reefman","person"],["reenter","down"],["reentered","right"],["reentering","right"],["rees","person"],["ref","left"],["refer","right"],["referee","person"],["referees","person"],["reference","right"],["referenced","right"],["references","square"],["referencing","right"],["referendum","paper"],["referral","paper"],["referred","right"],["referrer","right"],["referring","dash"],["refers","dash"],["refill","box"],["refine","plus"],["refined","down"],["refinement","plus"],["refinements","plus"],["refining","plus"],["refit","plus"],["reflect","n"],["reflected","line"],["reflecting","n"],["reflection","square"],["reflections","right"],["reflective","right"],["reflects","dot"],["reflexivity","circle"],["reflexology","foot"],["reform","plus"],["reformation","plus"],["reforming","plus"],["reforms","plus"],["reformulation","right"],["refracted","n"],["refraction","n"],["refractive","n"],["reframe","square"],["refresh","plus"],["refreshed","plus"],["refreshes","plus"],["refreshing","plus"],["refreshments","water"],["refrigerator",""],["refs","square"],["refuelling","fuel"],["refuge","plus"],["refugee","person"],["refugees","person"],["refund","dollarsymbol"],["refunds","right"],["refutations","xcorrection"],["reg","plus"],["regained","right"],["regan","p
erson"],["regard","plus"],["regarded","plus"],["regarding","dash"],["regardless","plus"],["regards","dash"],["regatta","boat"],["regenerate","plus"],["regime","square"],["regimen","paper"],["regiment","person"],["region","square"],["regional","square"],["regionally","square"],["regions","square"],["register","paper"],["registeracompany","plus"],["registered","n"],["registering","square"],["registeringcompany","company"],["registers","plus"],["registration","plus"],["registry","square"],["regogitation","minus"],["regress","left"],["regression","down"],["regrowing","up"],["regular","one"],["regularly","n"],["regulate","line"],["regulated","plus"],["regulating","line"],["regulation","book"],["regulations","square"],["regulatory","plus"],["regurgitated","up"],["rehearsal","paper"],["rehearse","plus"],["rehearsed","note"],["rehearsing","plus"],["rehse","person"],["reid","person"],["reigned","plus"],["reine","equals"],["reiner","n"],["reinforce","box"],["reinforced","plus"],["reinforcement","plus"],["reinforcing","plus"],["reinitialise","zero"],["reinsert","box"],["reinstating","plus"],["reinterpret","two"],["reinterpreting","book"],["reintroduce","plus"],["reishi","plant"],["reject","xcorrection"],["rejected","xcorrection"],["rejecting","xcorrection"],["rejection","xcorrection"],["rejects","xcorrection"],["rejoiced","plus"],["rejuvenate","plus"],["rejuvenated","plus"],["rel","dash"],["relabelled","square"],["relabelling","square"],["relatable","plus"],["relate","dash"],["related","dash"],["relates","dash"],["relating","dash"],["relation","dash"],["relational","right"],["relations","dash"],["relationship","right"],["relationships","dash"],["relative","person"],["relatively","minus"],["relativeness","right"],["relatives","person"],["relativism","right"],["relativity","right"],["relax","person"],["relaxation","plus"],["relaxed","person"],["relaxes","down"],["relaxing","plus"],["relearning","plus"],["release","right"],["released","right"],["releases","up"],["releasing","righ
t"],["relegate","down"],["relevance","equals"],["relevant","tick"],["relevantly","plus"],["reliability","plus"],["reliable","plus"],["reliably","plus"],["reliance","right"],["reliant","plus"],["relied","right"],["relief","plus"],["relies","left"],["relieve","plus"],["relieved","plus"],["relig","person"],["religion","book"],["religions","person"],["religious","organisation"],["relish",""],["relishment","plus"],["reloaded","up"],["rels","dash"],["rely","right"],["relying","left"],["rem","brain"],["remain","one"],["remainder","n"],["remainders","n"],["remained","dot"],["remaining","tworectangle"],["remains","plus"],["remarkable","plus"],["remarks","square"],["remastered","plus"],["remediation","plus"],["remedied","plus"],["remember","brain"],["remembered","plus"],["remembering","right"],["remembers","box"],["remembrance","up"],["remind","plus"],["reminded","equals"],["reminder","n"],["reminders","square"],["reminding","square"],["reminds","square"],["remix","note"],["remixes","paper"],["remnants","dot"],["remote","dot"],["remotehost","computer"],["remotely","box"],["remoteuser","person"],["removal","minus"],["removals","right"],["remove","minus"],["removebrackets","minus"],["removebrackets","xcorrection"],["removed","minus"],["removenotice","xcorrection"],["removenotrhyming","xcorrection"],["remover","up"],["removerepeatedterm","xcorrection"],["removers","minus"],["removes","xcorrection"],["removetoolong","xcorrection"],["removetoolongandnotrhyming","xcorrection"],["removing","xcorrection"],["remunerate","dollarsymbol"],["remunerated","dollarsymbol"],["remvdup","minus"],["renaissance","square"],["renal","kidney"],["rename","right"],["renamed","square"],["renames","right"],["render","square"],["rendered","square"],["renderer","square"],["renderh","note"],["rendering","square"],["renderline","line"],["renderlineh","line"],["renderlinerests","box"],["renderlines","line"],["renderm","note"],["renders","square"],["rendersong","square"],["renderv","square"],["renditions","pl
us"],["renegade","person"],["renegades","person"],["renew","plus"],["renewable","plus"],["renewal","plus"],["renogitation","minus"],["renowned","plus"],["rent","dollarsymbol"],["rentable","dollarsymbol"],["rental","building"],["renumber","n"],["reopen","up"],["reorder","n"],["reordered","dash"],["reordering","line"],["reorders","tick"],["reorganisation","plus"],["reorganised","plus"],["reorganize","plus"],["reoutputted","n"],["rep","square"],["repair","plus"],["repaired","plus"],["repairing","plus"],["repairs","plus"],["repeat","n"],["repeated","n"],["repeatedly","n"],["repeating","n"],["repeatlastnote","n"],["repeatlastnoteh","note"],["repeats","n"],["repel","xcorrection"],["repellant","plus"],["repels","xcorrection"],["repetition","n"],["repetitions","n"],["repetitive","two"],["replace","dash"],["replaceable","plus"],["replacecard","card"],["replaced","dash"],["replacement","dash"],["replacements","dash"],["replacer","right"],["replaces","dash"],["replacesynonyms","right"],["replacing","right"],["replay","line"],["replayed","right"],["replaying","plus"],["repleteness","n"],["replicant","person"],["replicants","person"],["replicat","company"],["replicate","two"],["replicated","plus"],["replicates","plus"],["replicating","plus"],["replication","plus"],["replications","plus"],["replicator","box"],["replicators","plus"],["replied","square"],["replies","square"],["reply","a"],["replying","left"],["repo","box"],["report","paper"],["reported","paper"],["reporter","person"],["reporters","person"],["reporting","paper"],["reports","paper"],["reportspam","xcorrection"],["repos","box"],["repositories","box"],["repository","paper"],["repositoryname","square"],["reposting","square"],["represent","right"],["representation","square"],["representations","square"],["representative","person"],["representatives","person"],["represented","square"],["representing","right"],["represents","square"],["reprocesses","right"],["reproduce","right"],["reproducing","two"],["reproduction","right
"],["reproductive","plus"],["republic","country"],["republican","person"],["reputable","plus"],["reputation","plus"],["request","left"],["requested","questionmark"],["requester","person"],["requesters","person"],["requesting","square"],["requests","left"],["require","left"],["required","left"],["requirement","tick"],["requirements","paper"],["requires","left"],["requiring","left"],["requisite","plus"],["requisites","right"],["reranked","n"],["rerun","plus"],["res","paper"],["reschedule","right"],["rescued","plus"],["rescuer","person"],["rescuing","plus"],["research","paper"],["researchable","paper"],["researched","paper"],["researcher","person"],["researchers","person"],["researchgate","paper"],["researching","tick"],["reselling","dollarsymbol"],["resemble","plus"],["resembled","equals"],["resembling","right"],["resend","right"],["resent","right"],["reservation","plus"],["reserve","plus"],["reserved","plus"],["reserves","box"],["reset","zero"],["resetcps","right"],["resetcpstohere","dot"],["resets","left"],["resetting","zero"],["residences","house"],["resident","person"],["residential","house"],["residents","person"],["resides","plus"],["residing","house"],["residue","box"],["resilience","plus"],["resist","xcorrection"],["resistance","minus"],["resistant","plus"],["resists","xcorrection"],["resolute","line"],["resolutely","plus"],["resolution","square"],["resolutions","square"],["resolve","tick"],["resolved","plus"],["resolves","tick"],["resolving","plus"],["resonate","plus"],["resonated","plus"],["resonates","plus"],["resonating","plus"],["resort","building"],["resource","box"],["resources","box"],["resourcesfulnesses","plus"],["resp","box"],["respecified","two"],["respecify","two"],["respect","plus"],["respected","plus"],["respectful","plus"],["respecting","plus"],["respective","box"],["respectively","tworectangle"],["respects","plus"],["respiratory","lung"],["respond","plus"],["responded","square"],["respondents","person"],["responding","right"],["responds","plus
"],["response","tick"],["responses","tick"],["responsibilities","paper"],["responsibility","plus"],["responsible","dash"],["responsibly","plus"],["responsive","plus"],["responsiveness","plus"],["rest","threebox"],["restart","dot"],["restated","square"],["restaurant","building"],["restaurants","company"],["rested","down"],["resting","zero"],["restlast","box"],["restofgrammar","square"],["restored","down"],["restrain","xcorrection"],["restraint","plus"],["restricted","box"],["restrictions","minus"],["rests","box"],["resubmission","up"],["resubmit","right"],["resubmitting","right"],["result","n"],["resulted","right"],["resulting","equals"],["results","a"],["resume","paper"],["resumes","paper"],["resuming","right"],["resupply","down"],["resuscitate","plus"],["resuscitated","plus"],["resuscitating","up"],["resynth","star"],["resynthesis","star"],["resynthesise","star"],["resynthesised","star"],["retain","plus"],["retained","plus"],["retains","plus"],["reteaches","square"],["retest","dot"],["retested","plus"],["rethink","square"],["rethinking","brain"],["retina","circle"],["retired","plus"],["retract","left"],["retractall","one"],["retracting","left"],["retrain","book"],["retraining","right"],["retransplantation","right"],["retreat","building"],["retreats","left"],["retried","plus"],["retries","plus"],["retrievable","plus"],["retrieval","plus"],["retrieve","up"],["retrieved","right"],["retrieves","left"],["retrieving","right"],["retry","tick"],["retrying","plus"],["return","left"],["returnable","right"],["returned","right"],["returnedfromthailand","right"],["returner","up"],["returning","left"],["returns","left"],["retweeted","plus"],["retyping","square"],["reuploading","up"],["reusable","plus"],["reuse","tick"],["reused","tick"],["reuser","person"],["reuses","tick"],["reusing","tick"],["reveal","plus"],["revealed","plus"],["revealing","box"],["reveals","square"],["revenue","dollarsymbol"],["reverbnation","company"],["revered","plus"],["reverehealth","heart"],["reversal",
"left"],["reverse","threerectangle"],["reversealgorithmfirstline","square"],["reversealgorithmfourthline","square"],["reversealgorithmsecondline","square"],["reversealgorithmthirdline","square"],["reversed","mirror"],["reversing","right"],["revert","left"],["reverter","right"],["review","paper"],["reviewed","tick"],["reviewer","person"],["reviewers","person"],["reviewing","paper"],["reviews","paper"],["revise","book"],["revised","eye"],["revising","plus"],["revision","plus"],["revisited","dot"],["revived","plus"],["revoked","xcorrection"],["revolt","minus"],["revolution","circle"],["revolutionary","plus"],["revolutions","circle"],["revolve","plus"],["revolved","right"],["reward","plus"],["rewarded","plus"],["rewarding","plus"],["rewards","plus"],["reword","square"],["reworded","square"],["rewording","right"],["rewrite","pen"],["rewriter","pen"],["rewriters","plus"],["rewrites","two"],["rewriting","pen"],["rewritten","paper"],["rewrote","pen"],["rex","king"],["reynardine","person"],["rff","right"],["rgb","n"],["rhapsody","note"],["rhesus","monkey"],["rhetoric","square"],["rhetorical","right"],["rhetorically","right"],["rhetorician","person"],["rhino","rhinoceros"],["rhizomatic","dash"],["rhizome","dash"],["rhizomes","dash"],["rhizomic","dash"],["rhizomicity","dash"],["rhoden","person"],["rhodopsin","ball"],["rhomboid","square"],["rhubarb",""],["rhyme","equals"],["rhymed","equals"],["rhymes","equals"],["rhyming","equals"],["rhys","person"],["rhythm","note"],["rhythmic","note"],["rhythms","note"],["ri","dash"],["riable","dash"],["ribbon",""],["ribbons","square"],["rica","country"],["rice","person"],["rich","coin"],["richard","person"],["richardson","person"],["richer","dollarsign"],["richmond","suburb"],["rick","person"],["ricky","person"],["ricocheting","right"],["ricoeur","person"],["rid","xcorrection"],["ridden","minus"],["ride","car"],["ridge","box"],["ridiculed","minus"],["riding","right"],["ridley","person"],["rife","minus"],["riff","note"],["rig","book"],["rigat
oni",""],["right",""],["rightful","plus"],["rightmost","right"],["rightness","tick"],["rightnesses","plus"],["rights","plus"],["rightup","up"],["rigid","box"],["rigmaroles","plus"],["rigor","plus"],["rigorous","tick"],["rigorously","plus"],["rigorousness","tick"],["rigour","plus"],["rim","box"],["ring",""],["ringing","n"],["rinsed","water"],["rinsing","water"],["rip","person"],["ripe","plus"],["ripenesses","plus"],["rise","up"],["rises","up"],["rising","up"],["risk","xcorrection"],["risks","minus"],["risky","minus"],["rissole",""],["rita","person"],["ritetag","company"],["rittel","person"],["river",""],["rivista","person"],["rizzo","person"],["rl","dash"],["rli","institute"],["rlist","line"],["rlsatarget","dash"],["rm","xcorrection"],["rmit","building"],["rmoldtmpfile","paper"],["rms","n"],["rn","dash"],["rna","n"],["rnh","company"],["rnn","right"],["rnns","right"],["road",""],["roadblock","minus"],["roadhouse","house"],["roads","road"],["roald","person"],["roared","lion"],["roast","n"],["roasted","n"],["roasting","n"],["rob","person"],["robbie","person"],["robe",""],["robin","person"],["robopod",""],["robot","person"],["robotic","person"],["roboticist","person"],["robotics","person"],["robots","box"],["robust","plus"],["robyn","person"],["rochelle","person"],["rock","note"],["rocked","plus"],["rocket",""],["rocking","right"],["rocks","rock"],["rococo","square"],["rod","person"],["rode","right"],["rods","rod"],["rog","person"],["rogaining","plus"],["roger","person"],["rohan","person"],["roi","person"],["roity","plus"],["rold","dash"],["role","person"],["roles","person"],["roll",""],["rolled","ball"],["rolling","ball"],["roman","person"],["romania","country"],["rome","city"],["romeo","person"],["romeos","restaurant"],["ron","person"],["ronson","person"],["roof",""],["room",""],["roomdict","book"],["roomdifferencet","d"],["roomdt","variable"],["roomend","variable"],["roomlength","n"],["roomlengtht","n"],["roompartname","square"],["rooms","room"],["roomstring","variabl
e"],["roomstringth","variable"],["roomvalues","variable"],["roomx","variable"],["roomy","variable"],["roomz","variable"],["roosting","down"],["root","dash"],["roots","right"],["rope",""],["ropensci","software"],["ropes","rope"],["rorty","person"],["rose",""],["rosemary","person"],["rosemont","person"],["rosenberg","person"],["roses","rose"],["rosettacode","and"],["rosin",""],["ross","person"],["rossman","person"],["rosuly","rose"],["rosy","rose"],["rot","circle"],["rotarian","person"],["rotarians","person"],["rotary","building"],["rotaryprahran","company"],["rotate","semicircle"],["rotated","arc"],["rotating","right"],["rote","plus"],["rotoract","dash"],["rotunda",""],["rough","minus"],["roughly","minus"],["round","dot"],["rounded","circle"],["rounding","down"],["rounds","right"],["roundtable","table"],["rousing","plus"],["rousseau","person"],["route","right"],["routenote","company"],["router","line"],["routes","right"],["routine","paper"],["routines","and"],["roux",""],["row","line"],["rowdy","minus"],["rowed","right"],["rower","person"],["rowing","right"],["rows","line"],["royal","king"],["royalmelbournehospital","hospital"],["royals","person"],["royalties","dollarsymbol"],["royalty","person"],["rpg","plus"],["rpl","tick"],["rpv","company"],["rqidlbhwp","dash"],["rr","dash"],["rs","variable"],["rsa","tick"],["rst","dash"],["rsvp","left"],["rsync","right"],["rtch","dot"],["rtcmiux","dash"],["rtf","paper"],["rto","school"],["ru","paper"],["rub","down"],["rubbed","down"],["rubber","box"],["rubbing","equals"],["rubbish","minus"],["rubinstein","person"],["ruby",""],["rudd","person"],["rude","minus"],["rudeness","minus"],["rudimental","box"],["rudolf","person"],["rue","dash"],["rug",""],["rugby","ball"],["rule","plus"],["rulea","square"],["ruled","plus"],["rulename","square"],["ruler",""],["rules","and"],["rulesx","plus"],["ruling","plus"],["rumination","minus"],["rummaged","down"],["run","path"],["runa","tick"],["runabout","box"],["runall","right"],["runb","tick"],["ru
ne","tick"],["rung",""],["rungrammarly","software"],["rungs","rung"],["runhosting","company"],["runn","dash"],["runne","tick"],["runner","person"],["runners","right"],["running","shoe"],["runny","water"],["runs","path"],["runtime","n"],["runz","tick"],["rural","paddock"],["rush","right"],["rushing","right"],["russell","person"],["russia","country"],["russian","paper"],["rut","box"],["ruth","person"],["ruts","box"],["rv","dash"],["rvawki","dash"],["rvs","dash"],["rw","dash"],["rwanda","country"],["rxcawp","and"],["ryan","person"],["ryde","person"],["rye","bread"],["s","box"],["sa","dash"],["saas","dollarsymbol"],["sabalcore","company"],["sac","plus"],["saccharine","sugar"],["sack",""],["sacked","xcorrection"],["sacrament","paper"],["sacramento","city"],["sacred","plus"],["sacrifice","minus"],["sacrifices","minus"],["sacrificing","xcorrection"],["sacrosanctnesses","plus"],["sacrosancts","plus"],["sad","tear"],["sadface",""],["sadly","minus"],["sadness",""],["safari","and"],["safe","a"],["safeguard","plus"],["safeguarding","plus"],["safeguards","plus"],["safely","plus"],["safer","plus"],["safest","tick"],["safety","plus"],["sage","person"],["sages","person"],["saggy","down"],["saharan","sand"],["said","speechballoon"],["sail","boat"],["sailed","right"],["sailor","person"],["sailors","person"],["sails","sail"],["saint","person"],["saints","person"],["sake","plus"],["salad","tomato"],["salami",""],["salaries","dollarsymbol"],["salary","dollarsymbol"],["saldana","person"],["sale","dollarsign"],["saleable","dollarsymbol"],["salek","person"],["salerno","city"],["sales","dollarsign"],["salesperson","person"],["salivary","liquid"],["salivating","water"],["sally","person"],["salome","person"],["salon","building"],["salt",""],["saltaté","right"],["salted","salt"],["salts","crystal"],["salty","salt"],["salute","plus"],["saluted","plus"],["salvador","country"],["sam","person"],["samad","person"],["samadhi","plus"],["samantha","person"],["samaritans","person"],["same","equals"],["
sameexp","equals"],["sameexperience","dot"],["sameness","equals"],["samenesses","equals"],["samepn","equals"],["samoa","country"],["samovar","tea"],["sample","box"],["samplereport","paper"],["samples","n"],["samplestock","box"],["sampling","n"],["samsung","tablet"],["samwalrus","person"],["san","city"],["sanctioning","plus"],["sand",""],["sandals",""],["sandbox","box"],["sandelowski","person"],["sandpaper",""],["sandra","person"],["sands","person"],["sandstone",""],["sandwich",""],["sandwiches","sandwich"],["sandy","person"],["sane","plus"],["sang","note"],["sanitary","square"],["sanity","plus"],["sank","down"],["sanskrit","square"],["sansserifprint","box"],["santa","city"],["sao","country"],["sap",""],["sapient","plus"],["sapling","tree"],["saplings","tree"],["sapub","dash"],["sara","person"],["sarah","person"],["sarasota","person"],["sarkozys","person"],["sarte","person"],["sassmaiden","person"],["sat","seat"],["satanism","religion"],["satchwell","person"],["satellite",""],["satiated","plus"],["satisfaction","plus"],["satisfactory","plus"],["satisfiability","one"],["satisfiable","plus"],["satisfied","plus"],["satisfies","tick"],["satisfy","tick"],["satisfying","tick"],["saturated","water"],["saturday","square"],["sauce",""],["saucepan",""],["saucer",""],["saucers","saucer"],["sauces","sauce"],["saucy","plus"],["saudi","country"],["sauer","person"],["saunders","person"],["sausage",""],["sav","eye"],["savant","person"],["savants","person"],["save","plus"],["saved","tick"],["saver","plus"],["savers","plus"],["saves","tick"],["saving","plus"],["savings","plus"],["savoured","plus"],["savoury","mayonnaise"],["savvy","company"],["saw","eye"],["sawtooth","square"],["sawyer","person"],["sax","tube"],["saxophone",""],["say","speechballoon"],["sayer","person"],["sayers","person"],["saying","mouth"],["says","speechballoon"],["sb","dash"],["sbr","paper"],["sc","dash"],["scaffolding","box"],["scalability","tworectangle"],["scalable","tworectangle"],["scale","nrectangle"],["scal
ed","up"],["scales","up"],["scaling","n"],["scammer","minus"],["scan","eye"],["scandizzo","person"],["scanned","square"],["scanner","right"],["scanning","eye"],["scarborough","city"],["scarcity","minus"],["scared","minus"],["scarf",""],["scarring","minus"],["scary","minus"],["scattered","down"],["scavenger","person"],["scenario","box"],["scenarios","box"],["scene","square"],["scenes","box"],["scenic","tree"],["scent","perfume"],["scented","nose"],["schachte","person"],["schapzazzure","plus"],["schedule","square"],["scheduled","dash"],["scheduler","n"],["scheduling","square"],["scheme","right"],["schemes","square"],["schepisi","person"],["schisandra","plant"],["schizophrenia","minus"],["schizophrenic","minus"],["scholar","person"],["scholarising","person"],["scholarly","person"],["scholars","person"],["scholarship","dollarsymbol"],["schonfelder","person"],["school","building"],["schoolmaster","person"],["schools","building"],["schoolyard",""],["schroeter","person"],["schwarz","person"],["schwerdt","person"],["sci","testtube"],["science","testtube"],["scienced","testtube"],["sciencedirect","company"],["sciences","testtube"],["scientific","and"],["scientifically","testtube"],["scientist","person"],["scientists","person"],["scintillated","plus"],["scintillation","plus"],["scissors",""],["scone",""],["scones","scone"],["scoop",""],["scooped","scoop"],["scooping","down"],["scoops","ball"],["scope","dash"],["scopes","dash"],["scopic","up"],["score","n"],["scored","n"],["scores","n"],["scorn","minus"],["scorned","minus"],["scorning","minus"],["scots","person"],["scott","person"],["scottjeffrey","person"],["scourer",""],["scouring","down"],["scout","person"],["scouted","eye"],["scp","two"],["scrap","xcorrection"],["scraped","right"],["scraper",""],["scraping","right"],["scrapped","down"],["scratch","line"],["scratcher","person"],["scratching","down"],["scratchy","minus"],["scrawn","minus"],["screams","line"],["screen","square"],["screening","square"],["screens","square"],["s
creenshot","square"],["screenshots","square"],["screw",""],["screwdriver",""],["screwed","down"],["scribbled","pen"],["scribbling","pen"],["scripsi","book"],["script","paper"],["scripted","paper"],["scripts","paper"],["scriptural","book"],["scripture","book"],["scriptures","book"],["scriptwriter","person"],["scrn","square"],["scrnsong","note"],["scrnsongs","note"],["scroll","paper"],["scrolling","square"],["scrotum","ball"],["scrubbing","down"],["scrutinise","tick"],["scrutiny","plus"],["scuba","air"],["sculpted","chisel"],["sculptor","person"],["sculpture",""],["sdc","company"],["sdcinternetics","company"],["sde","air"],["sdk","box"],["se","dash"],["sea","water"],["seahorse",""],["seal",""],["seale","person"],["sealed","plus"],["sealing","equals"],["seamless","plus"],["seamlessly","line"],["seances","dash"],["search","square"],["searchable","plus"],["searched","eye"],["searchers","person"],["searches","square"],["searching","eye"],["seashells","shell"],["seashore","line"],["season","square"],["seasons","square"],["seat",""],["seated","seat"],["seats","seat"],["seaweed","plant"],["sebastien","person"],["sec","line"],["secateurs",""],["second","two"],["seconda","line"],["secondary","two"],["secondhen","duck"],["secondly","n"],["secondness","two"],["seconds","one"],["secondsvaluea","variable"],["secondsvalueb","variable"],["secondvar","variable"],["secondvarname","square"],["secondvarparent","variable"],["secondvarparentname","square"],["secret","questionmark"],["secretary","person"],["secretly","plus"],["secrets","square"],["secs","n"],["sectarian","box"],["sectest","plus"],["sectichords","n"],["section","square"],["sectional","box"],["sections","square"],["sector","box"],["secular","plus"],["secularism","dot"],["secularity","zero"],["secularly","plus"],["secure","plus"],["secured","plus"],["securely","plus"],["securepay","company"],["security","plus"],["sedate","minus"],["see","square"],["seeable","eye"],["seed",""],["seeds","seed"],["seeing","eye"],["seek","left"],
["seeker","person"],["seekers","person"],["seeking","eye"],["seeks","hand"],["seem","tick"],["seemed","plus"],["seeming","plus"],["seemingly","plus"],["seems","plus"],["seen","eye"],["seep","down"],["seer","person"],["sees","eye"],["seesaw","line"],["segment","line"],["segmentation","square"],["segments","box"],["segunda","box"],["segv","square"],["sehj","dash"],["seiende","plus"],["seize","hand"],["seizing","hand"],["select","dash"],["selected","plus"],["selectee","person"],["selecting","tick"],["selection","square"],["selections","square"],["selector","person"],["selectpositions","n"],["selects","square"],["selecttouching","equals"],["selena","person"],["self","person"],["selfcontrol","and"],["selfhood","person"],["selfie","square"],["selfishness","minus"],["selflessness","plus"],["sell","dollarsign"],["seller","person"],["sellers","person"],["selling","dollarsymbol"],["sells","dollarsymbol"],["selves","person"],["semantic","dash"],["semantics","square"],["semblance","plus"],["semester","square"],["semesters","square"],["semi","box"],["semicircle",""],["semily","n"],["seminal","tube"],["seminar","room"],["seminars","room"],["semitones","square"],["semolina",""],["semver","n"],["sen","square"],["send","right"],["sender","person"],["sending","right"],["sends","right"],["sendspace","company"],["senegal","country"],["senior","dot"],["senorita","person"],["sensable","plus"],["sensation","plus"],["sensations","finger"],["sense","eye"],["sensed","eye"],["senses","eye"],["sensitive","hand"],["sensitivities","plus"],["sensitivity","plus"],["sensor","eye"],["sensors","eye"],["sensory","finger"],["sent","right"],["sentence","box"],["sentencechars","square"],["sentenceendpunctuation","fullstop"],["sentences","paper"],["sentenceswordsfrompos","square"],["sentencewithspaces","square"],["sentencewordsfrompos","square"],["sentience","plus"],["sentient","plus"],["sentiment","plus"],["seoclerk","company"],["seoclerks","company"],["sep","square"],["sepandpad","square"],["separate","
two"],["separated","box"],["separately","tworectangle"],["separateness","box"],["separates","box"],["separating","box"],["separation","box"],["september","square"],["septuagenarian","seventyrectangle"],["septum","box"],["sequel","book"],["sequence","n"],["sequences","n"],["sequential","right"],["sequin",""],["serbia","country"],["serene","plus"],["serengitis","minus"],["serial","n"],["series","square"],["serious","tick"],["seriously","plus"],["seriousness","plus"],["serotonin","ball"],["servant","person"],["servants","person"],["serve","right"],["served","plus"],["server","right"],["servers","computer"],["serves","hand"],["service","plus"],["services","plus"],["serviette",""],["serving","plate"],["sesame","seed"],["sesquicentenary","n"],["session","square"],["sessionaim","square"],["sessions","paper"],["set","a"],["setarg","n"],["setof","box"],["sets","fiverectangle"],["settable","equals"],["settee",""],["setting","box"],["settings","n"],["settle","plus"],["settled","person"],["settlement","plus"],["settler","person"],["settlers","person"],["settling","down"],["setton","person"],["setup","plus"],["seven","sevenrectangle"],["sevenfold","box"],["sevenrectangle",""],["seventh","sevenrectangle"],["several","n"],["severe","n"],["severely","minus"],["sew","right"],["sewage","ball"],["sewed","right"],["sewing","thread"],["sewn","right"],["sex","person"],["sexes","person"],["sexily","plus"],["sexism","minus"],["sexist","minus"],["sexual","down"],["sexualities","person"],["sexuality","n"],["sexually","plus"],["sexy","plus"],["seychelles","country"],["sf","dash"],["sfile","paper"],["sfu","university"],["sgetcode","and"],["sh","software"],["shabbanhykenever","box"],["shade","square"],["shaded","shade"],["shades","square"],["shadow","square"],["shadows","square"],["shaft","line"],["shakabanga","dash"],["shake","cup"],["shaker",""],["shakers","shaker"],["shakespeare","person"],["shakespearean","person"],["shaking","up"],["shakuhachi","box"],["shall","right"],["sham","minus"],["s
hame","minus"],["shamisen","box"],["shampoo",""],["shanai","box"],["shane","person"],["shank","person"],["shape","diamond"],["shaped","box"],["shapes","square"],["shaping","ball"],["share","right"],["shared","square"],["shareholders","person"],["shares","square"],["sharing","square"],["shark",""],["sharon","person"],["sharp","dot"],["sharpe","person"],["sharpen","plus"],["sharpened","plus"],["sharpener","blade"],["sharpening","dot"],["sharpness","plus"],["shave","square"],["shaved","hair"],["shaver",""],["shaving","right"],["shavings","line"],["she","person"],["sheared","right"],["shed",""],["sheds","shed"],["sheep",""],["sheer","plus"],["sheet","paper"],["sheetofpaper","paper"],["sheets","paper"],["sheila","person"],["shelf",""],["shell","box"],["shelled","box"],["shelley","person"],["shelling","plane"],["shells","dash"],["shells","line"],["shelter",""],["sheltered","square"],["shelves","shelf"],["shen","person"],["shepherd","dog"],["shevardnadzes","plus"],["shibboleth","plus"],["shield","square"],["shields","square"],["shift","square"],["shifting","right"],["shifts","square"],["shine","down"],["shines","star"],["shining","star"],["shiny","square"],["ship",""],["shipping","square"],["ships","ship"],["shirley","person"],["shirt",""],["shirts","shirt"],["shirtsleeve",""],["shitsued","plus"],["shmoop","dash"],["shoal","coral"],["shock","minus"],["shocking","minus"],["shoe",""],["shoelaces",""],["shoes","shoe"],["shone","light"],["shook","minus"],["shoot","equals"],["shooting","box"],["shoots","up"],["shop","dollarsymbol"],["shopper","person"],["shoppers","person"],["shopping","apple"],["shops","dollarsymbol"],["shore",""],["short","dash"],["shortage","minus"],["shortcake",""],["shortcourses","line"],["shortcut","right"],["shorten","n"],["shortened","n"],["shortening","minus"],["shorter","n"],["shortest","n"],["shortlist","square"],["shortlisting","n"],["shorttree","tree"],["shot","square"],["shotgun","minus"],["shou","plant"],["should","dash"],["shoulder",""],["should
ers",""],["shouldn","xcorrection"],["shouted","square"],["show","square"],["showalter","person"],["showbiz","square"],["showcase","plus"],["showed","finger"],["shower","water"],["showered","water"],["showering","water"],["showing","square"],["shown","square"],["showpaperpdf","paper"],["showreel","square"],["shows","stage"],["shredded","down"],["shredding","up"],["shrewd","plus"],["shut","minus"],["shutter","plane"],["shuttle","spacecraft"],["shuttlecock",""],["shuttlecocks","shuttlecock"],["shy","minus"],["si","variable"],["siblings","person"],["sic","tick"],["siccen","plus"],["sichians","person"],["sick","minus"],["sickness","minus"],["sicling","plus"],["side","square"],["sided","dot"],["sides","square"],["sidestepping","right"],["sideways","right"],["sierra","country"],["siesta","bed"],["sieve",""],["sieved","sieve"],["sieving","sieve"],["sifted","down"],["sifter","down"],["sig","square"],["sigcrashhandler","plus"],["sigcse","dash"],["siggets","square"],["siggraph","software"],["sight","eye"],["sighted","eye"],["sighting","eye"],["sights","eye"],["sign","square"],["signal","one"],["signaled","signal"],["signaling","right"],["signalled","square"],["signalling","right"],["signals","plus"],["signature","a"],["signatures","tick"],["signed","square"],["significance","plus"],["significances","plus"],["significant","plus"],["significantly","plus"],["signification","plus"],["signified","right"],["signifies","plus"],["signifying","dash"],["signing","square"],["signorelli","person"],["signpost","square"],["signposted","signpost"],["signposting","square"],["signposts","signpost"],["signs","square"],["signup","square"],["sigtramp","square"],["silence","zero"],["silent","zero"],["silently","zero"],["silk",""],["silly","minus"],["silo","cylinder"],["silver","ball"],["sim","box"],["similar","equals"],["similarities","equals"],["similarity","equals"],["similarly","equals"],["simile","equals"],["simmered","n"],["simon","person"],["simone","person"],["simple","a"],["simplephilosoph
y","square"],["simpler","down"],["simplest","dot"],["simplicity","dot"],["simplification","one"],["simplifications","minus"],["simplified","one"],["simplifies","down"],["simplify","dot"],["simplifying","one"],["simplistic","plus"],["simply","plus"],["simpsons","person"],["simulacrum","box"],["simulate","box"],["simulated","and"],["simulates","and"],["simulating","box"],["simulation","box"],["simulationees","person"],["simulations","box"],["simulator","and"],["simulatory","square"],["simultanenously","equals"],["simultaneous","one"],["simultaneously","two"],["sin","minus"],["since","dot"],["sincere","plus"],["sincerely","plus"],["sincerity","plus"],["sine","function"],["sing","note"],["singapore","country"],["singer","person"],["singing","note"],["single","one"],["singles","one"],["singleton","one"],["singletons","one"],["singular","one"],["sink","box"],["sinkhole","hole"],["sipped","water"],["sipping","water"],["sir","person"],["siri","and"],["sister","person"],["sit","seat"],["sitar",""],["site","paper"],["sitemap","paper"],["sites","paper"],["sitting",""],["situated","dot"],["situation","square"],["situational","plus"],["situations","box"],["sity","box"],["siva","person"],["six",""],["sixrectangle",""],["sixth","sixrectangle"],["sixty","n"],["size","n"],["sized","ruler"],["sizes","n"],["sizzling","plus"],["sjgrant","person"],["skeletal","bone"],["skeleton",""],["skelter","right"],["skene","person"],["sketch","square"],["sketched","pen"],["sketching","pen"],["skewer",""],["skewered","skewer"],["skewering","skewer"],["skewers","skewer"],["ski",""],["skies","box"],["skiied","ski"],["skiing","right"],["skill","tick"],["skilled","plus"],["skillful","plus"],["skillman","person"],["skills","a"],["skin","square"],["skip","two"],["skipped","right"],["skipping","right"],["skips","right"],["skittle",""],["skivvy",""],["skull",""],["sky","square"],["skye","dog"],["skyeae","person"],["skyrocket","up"],["skyscraper","building"],["sla","paper"],["slack","minus"],["slander","minu
s"],["slang","minus"],["slap","hand"],["slash",""],["slate","square"],["slave","person"],["slavery","minus"],["slaves","person"],["sleek","square"],["sleep",""],["sleepily","plus"],["sleeping","bed"],["sleepinterval","minus"],["sleepy","minus"],["sleeve",""],["sleeved","sleeve"],["sleeves","sleeve"],["slept","bed"],["slice","box"],["sliced","box"],["slicer","down"],["slices","box"],["slicing","knife"],["slid","right"],["slide","down"],["slider","n"],["sliders","right"],["slides","square"],["sliding","right"],["slight","n"],["slightly","dash"],["slingshot",""],["slip","minus"],["slipped","down"],["slippery","water"],["slipping","right"],["slope","up"],["slopes","up"],["slot","hole"],["slotted","right"],["sloughed","right"],["slovakia","country"],["slovenia","country"],["slow","n"],["slowdown","down"],["slowed","minus"],["slower","lessthan"],["slowing","line"],["slowly","zero"],["slowness","n"],["slp","box"],["slpconverter","right"],["slug",""],["sm","right"],["small","amount"],["smaller","lessthan"],["smallest","n"],["smarbangers","sausage"],["smart","tick"],["smartphone","phone"],["smartwatch",""],["smell","rose"],["smelled","nose"],["smelling","nose"],["smells","nose"],["smelt","nose"],["smile",""],["smiled","face"],["smiley","face"],["smiling","mouth"],["smith","person"],["smithereen","plus"],["smock",""],["smoke",""],["smoked","smoke"],["smoking","minus"],["smooth","line"],["smoothed","plus"],["smoother","plus"],["smoothing","down"],["smoothly","line"],["smorgasbord","bread"],["smp","and"],["sms","right"],["smurf","person"],["smv","right"],["smythe","person"],["sn","variable"],["snack","apple"],["snail",""],["snake",""],["snakes","snake"],["snap","box"],["snaps","dot"],["snapshot","square"],["sneakers","shoe"],["sneeze","nose"],["sniff","nose"],["sniffing","nose"],["sniper","dot"],["snow",""],["snowflake",""],["snuff",""],["snuffed","xcorrection"],["snuggled","down"],["so","right"],["soaked","water"],["soapie","square"],["soc","book"],["socatoan","circle"],["socc
er","ball"],["soci","book"],["social","person"],["socialise","plus"],["socialised","plus"],["socialising","plus"],["socialism","equals"],["socialist","person"],["socialists","person"],["socially","person"],["societal","person"],["societies","person"],["societol","book"],["societological","room"],["societologically","person"],["societologist","person"],["societology","person"],["society","person"],["socio","person"],["socioeconomic","dollarsymbol"],["sociologia","person"],["sociological","book"],["sociology","book"],["sociologygroup","person"],["socks","sock"],["socrates","person"],["socratic","person"],["soda","bottle"],["sofa",""],["soft","square"],["softened","down"],["softening","down"],["softens","down"],["softer","plus"],["softly","n"],["softness","plus"],["software","square"],["soh","dot"],["soil",""],["soimort","person"],["sol","plus"],["sola","ball"],["solar","sun"],["sold","dollarsign"],["soldier","person"],["soldiers","person"],["soldiership","person"],["sole",""],["solely","one"],["soles",""],["solfatonotes","note"],["solfege","note"],["solfegenotetonote","right"],["solicitude","minus"],["solid","box"],["solidarity","plus"],["solids","box"],["soliloquy","paper"],["solitude","one"],["solo","one"],["solomon","country"],["solos","person"],["sols","n"],["solstices","n"],["solution","plus"],["solutions","tick"],["solvable","plus"],["solve","tick"],["solved","tick"],["solver","equals"],["solves","plus"],["solving","tick"],["soma","body"],["somali","person"],["somalia","country"],["some","one"],["somebody","person"],["someday","n"],["somehow","dash"],["someone","person"],["somersault","circle"],["something","one"],["sometime","n"],["sometimes","one"],["somewhat","plus"],["somewhere","dot"],["son","person"],["sonar","square"],["song","square"],["songs","book"],["songwriter","person"],["songwriting","note"],["sonic","note"],["sonorous","plus"],["sons","person"],["soon","n"],["sooner","greaterthan"],["soothing","plus"],["sophie","person"],["sophisticated","plus"],[
"soprano","person"],["sore","minus"],["soreness","minus"],["sorgenti","person"],["sorrow","minus"],["sorry","xcorrection"],["sort","a"],["sortbyx","line"],["sortcn","x"],["sortcna","x"],["sortcnremvdup","right"],["sorted","line"],["sorting","right"],["sorts","line"],["sostalgia","minus"],["sostalgic","minus"],["sostalgically","minus"],["soufflé",""],["sought","plus"],["soul",""],["soulful","plus"],["souls","person"],["sound","note"],["soundcloud","company"],["sounded","note"],["sounding","dash"],["soundly","plus"],["soundness","plus"],["sounds","note"],["soundtrack","line"],["soup","bowl"],["sour","minus"],["source","square"],["sourced","plus"],["sources","dot"],["sourness","lemon"],["south","down"],["southampton","suburb"],["southern","down"],["southwest","right"],["souvenir","book"],["soviet","country"],["sowed","seed"],["sown","down"],["soy","glass"],["sp","paper"],["space","box"],["spacecraft",""],["spaced","box"],["spaceport","building"],["spaces","space"],["spaceship","spacecraft"],["spacetime","line"],["spaceweb","paper"],["spacing","n"],["spade",""],["spaghetti",""],["spain","country"],["spam","box"],["span","line"],["spaniel","dog"],["spanish","paper"],["spare","plus"],["spared","plus"],["sparingly","plus"],["spark","box"],["sparks","dot"],["spars","right"],["sparse","minus"],["spartacus","person"],["spatial","box"],["spatiality","box"],["spatially","box"],["spatiotemporal","dot"],["spatula",""],["spatulas","spatula"],["speak","speechballoon"],["speakable","mouth"],["speaker","person"],["speakers","person"],["speaking","mouth"],["speaks","mouth"],["spear",""],["speared","spear"],["spearhead","right"],["spearing","spear"],["spec","paper"],["special","plus"],["specialisation","tick"],["specialise","plus"],["specialised","down"],["specialising","dot"],["specialism","plus"],["specialisms","book"],["specialist","person"],["specialists","person"],["speciality","book"],["specialization","line"],["speciation","dash"],["species","animal"],["specific","tick"],["speci
fically","one"],["specification","paper"],["specifications","dash"],["specificity","one"],["specified","tick"],["specifier","plus"],["specifiers","square"],["specifies","square"],["specify","paper"],["specifying","n"],["specimen","box"],["speckles","dot"],["specs","paper"],["specstage","right"],["specstages","dot"],["spectacles","glasses"],["spectacularly","plus"],["spectrum","line"],["sped","right"],["speech","mouth"],["speechballoon",""],["speeches","square"],["speechwriter","person"],["speed","n"],["speeding","n"],["speeds","n"],["spel","square"],["spell","square"],["spelled","square"],["speller","person"],["spelling","square"],["spelt","square"],["spencer","person"],["spend","dollarsymbol"],["spenders","person"],["spending","dollarsymbol"],["spent","dollarsymbol"],["sperm",""],["spf","n"],["sphere","ball"],["spheres","ball"],["spherical","sphere"],["sphermulated","ball"],["sphinx",""],["sphygmomanometer",""],["spiccato","box"],["spice","plant"],["spicer","person"],["spider",""],["spiel","square"],["spike","n"],["spill","down"],["spilling","water"],["spin","top"],["spinach",""],["spinal","spine"],["spindle","line"],["spine",""],["spinner","circle"],["spinoza","person"],["spiral","circle"],["spiraled","spiral"],["spiralling","circle"],["spirit","person"],["spirits","glass"],["spiritual","person"],["spiritualism","dash"],["spirituality","plus"],["spiritually","tick"],["spitting","amount"],["splash","water"],["splashed","water"],["splatted","minus"],["spleen",""],["splices","box"],["split","two"],["splitfurther","box"],["splitfurthert","box"],["splitintosent","box"],["splitintosentences","box"],["splitonphrases","box"],["splits","two"],["splitting","two"],["spoiled","minus"],["spoke","speechballoon"],["spoken","speechballoon"],["spokes","line"],["spokespeople","person"],["sponge",""],["sponged","sponge"],["sponges","sponge"],["sponsor","person"],["sponsored","dollarsymbol"],["sponsors","person"],["spontaneous","zero"],["spool",""],["spoolos","person"],["spoon",""],[
"spooned","spoon"],["spoonful","spoon"],["spoons","spoon"],["spoorthi","person"],["sport","ball"],["sporting","ball"],["sports","ball"],["spot","tick"],["spotify","company"],["spotless","and"],["spotlight","light"],["spotted","eye"],["spout",""],["spray","right"],["sprayed","down"],["spread","right"],["spreading","right"],["spreadsheet","plus"],["spreadsheets","square"],["spreams","right"],["sprig","leaf"],["spring",""],["springcm","dash"],["springer","book"],["springs","water"],["sprinkler","water"],["sprint","right"],["sprints","right"],["spurned","plus"],["spurns","plus"],["spy","eye"],["sql","software"],["sqrt","line"],["square",""],["squared","square"],["squares","square"],["squash",""],["squat","down"],["squatted","down"],["squatting","down"],["squeezed","square"],["squeezer",""],["squeezing","hand"],["squelch","orange"],["squelching","orange"],["squib","minus"],["squirted","right"],["squirter","right"],["squirting","right"],["squished","down"],["squishing","down"],["src","paper"],["sread","eye"],["sree","person"],["sri","country"],["srv","computer"],["ss","s"],["ssaged","dash"],["ssh","software"],["ssi","right"],["ssivoziruuezia","dash"],["sskip","dash"],["ssl","line"],["ssn","dash"],["st","dash"],["stabbed","down"],["stabbing","minus"],["stability","box"],["stable","box"],["stably","box"],["stack","box"],["stacked","box"],["stackexchange","paper"],["stacking","box"],["stackoverflow","paper"],["staff","person"],["stage",""],["stagecraft","stage"],["staged","n"],["stages","dot"],["stagger","dash"],["staggered","box"],["staggering","dash"],["staining","ink"],["stainless","zero"],["stairs","step"],["stake",""],["staked","stake"],["stakeholder","person"],["stakeholders","person"],["stalin","person"],["stalk",""],["stallion","horse"],["stalls","stall"],["stalwarts","plus"],["stamp","square"],["stamped","square"],["stamps","square"],["stan","person"],["stance","person"],["stand","box"],["standard","a"],["standardised","plus"],["standards","square"],["standing","per
son"],["standpoint","dot"],["standpoints","square"],["stands","box"],["stanford","building"],["stanley","person"],["staple","dash"],["stapler",""],["star",""],["starboard","right"],["starches","ball"],["stardom","plus"],["starfish",""],["staring","eye"],["starjazzled","star"],["starnow","star"],["starred","plus"],["stars","star"],["start","startingline"],["started","plus"],["starter","zero"],["starters","one"],["starting","one"],["startingline",""],["startrec","dot"],["starts","one"],["startup","one"],["stat","n"],["state","a"],["stated","paper"],["stately","plus"],["statement","paper"],["statements","square"],["states","n"],["statesman","person"],["stateterritory","square"],["static","box"],["stating","person"],["station","box"],["stationisation","building"],["stations","building"],["statistical","tick"],["statistically","n"],["statistics","plus"],["statisticshowto","plus"],["stats","plus"],["statu","plus"],["statuette",""],["status","a"],["statuses","square"],["stave","line"],["stay","down"],["stayed","dot"],["staying","one"],["stayingness","down"],["stays","plus"],["stc","dash"],["stdout","line"],["steadfast","plus"],["steadily","n"],["steal","left"],["stealing","minus"],["stealth","plus"],["steam",""],["steamed","steam"],["steel","line"],["steep","n"],["steeper","up"],["steer","right"],["steered","right"],["steering","right"],["stefanovic","person"],["stellar","star"],["stem","line"],["stemming","up"],["stemmingly","line"],["stems","line"],["stencil",""],["step",""],["stepclassical","box"],["stephanie","person"],["stephanies","person"],["stephen","person"],["stepped","step"],["stepping","step"],["steppopclassical","box"],["steps","step"],["stepwise","step"],["stereo","tworectangle"],["stereotypes","square"],["sterile","minus"],["sterilising","plus"],["sterility","zero"],["sterling","person"],["stern","person"],["steven","person"],["stevenson","person"],["stew","pot"],["stewart","person"],["stewed","n"],["stick","box"],["sticker","square"],["stickies","square"],[
"stickily","equals"],["sticking","equals"],["sticks","box"],["sticky","equals"],["stiffness","n"],["stigma","minus"],["still","zero"],["stillbirth","minus"],["stillbirths","minus"],["stillbornness","minus"],["stillness","box"],["stills","square"],["stilltodo","right"],["stimulate","plus"],["stimulated","plus"],["stimulating","plus"],["stimulation","plus"],["stimulatory","plus"],["stimulus","plus"],["stinoj","person"],["stir","right"],["stirred","right"],["stirring","circle"],["stitched","needle"],["stlp","n"],["stmts","square"],["stock","box"],["stocking","box"],["stockings",""],["stocks","box"],["stokes","person"],["stole","minus"],["stoller","person"],["stomach",""],["stone",""],["stones","minus"],["stonnington","suburb"],["stood","person"],["stool",""],["stop",""],["stopped","n"],["stopping","n"],["stops","zero"],["stopwatch","n"],["storage","n"],["store","building"],["stored","down"],["stores","building"],["storey","building"],["stories","book"],["storing","down"],["storm","water"],["storms","right"],["story","paper"],["storyboard","square"],["storyboarded","square"],["storybook","book"],["storytelling","paper"],["stott","person"],["stove",""],["str","square"],["straight","line"],["straightened","line"],["straightforward","square"],["straightforwardly","plus"],["strained","minus"],["straining","down"],["strand","line"],["stranded","minus"],["strange","minus"],["strangely","plus"],["stranger","person"],["strangers","person"],["strap",""],["strapped","strap"],["straps","square"],["strategic","paper"],["strategically","plus"],["strategies","square"],["strategy","a"],["stratosphere","box"],["stravinsky","person"],["straw",""],["strawberries","strawberry"],["strawberry","berry"],["straws","straw"],["stray","minus"],["stream","line"],["streaming","line"],["streamline","right"],["streams","line"],["street",""],["streets","street"],["strength","n"],["strengthening","plus"],["strengths","plus"],["stress","xcorrection"],["stressed","minus"],["stressful","minus"],["stretch
","software"],["stretched","right"],["stretches","right"],["stretching","right"],["strict","tick"],["strictness","plus"],["strike","down"],["strikes","dot"],["string","twobox"],["stringa","square"],["stringconcat","plus"],["stringconcat","threerectangle"],["strings","square"],["stringsorvars","or"],["stringth","variable"],["stringtobreason","square"],["stringtonumber","right"],["stringx","square"],["strip","square"],["stripe","company"],["stripes","line"],["stripped","minus"],["strips","square"],["strive","plus"],["striving","plus"],["strlen","n"],["stroke","line"],["strokes","right"],["stroking","right"],["strong","n"],["stronger","greaterthan"],["strongest","plus"],["strongholds","plus"],["strongly","plus"],["stroto","down"],["strove","plus"],["struck","box"],["struct","box"],["structs","box"],["structural","box"],["structuralism","box"],["structure","box"],["structured","box"],["structurers","person"],["structures","box"],["structuring","box"],["struggle","minus"],["struggles","minus"],["strut","box"],["struts","zline"],["strutted","right"],["stuart","person"],["stub","square"],["stubborn","minus"],["stubs","square"],["stuck","minus"],["stuckey","person"],["stud","ball"],["studded","up"],["student","person"],["students","person"],["studi","room"],["studied","person"],["studies","book"],["studio","room"],["studios","room"],["studious","pen"],["study","room"],["studying","person"],["stuff","box"],["stumble","minus"],["stump","cylinder"],["stunning","plus"],["stunt","right"],["stunts","right"],["stupefying","minus"],["stupid","minus"],["sturdy","box"],["style","a"],["styled","plus"],["styles","plus"],["stylesheet","square"],["styling","plus"],["stylisation","plus"],["stylise","line"],["stylised","plus"],["stylish","plus"],["stylized","square"],["stylus","pen"],["su","person"],["sub","dash"],["subarguments","v"],["subatomic","ball"],["subbotsky","person"],["subbranch","dash"],["subcategories","square"],["subcategory","dash"],["subconsciously","box"],["subcutaneous","
down"],["subdomains","square"],["subfunctions","right"],["subgoals","plus"],["subj","square"],["subject","person"],["subjective","plus"],["subjectively","person"],["subjectivity","person"],["subjects","square"],["subjectsandobjects","square"],["subjugation","down"],["subjunctive","plus"],["sublevel","dash"],["sublevels","dash"],["sublime","plus"],["submarine",""],["submenu","square"],["submerged","water"],["submerging","down"],["submission","paper"],["submissions","right"],["submit","paper"],["submits","up"],["submitted","paper"],["submitting","paper"],["subordinate","down"],["subordinating","down"],["subordination","down"],["subparts","box"],["subpredicate","down"],["subpredicates","down"],["subscriber","person"],["subscribers","person"],["subscription","paper"],["subscriptions","paper"],["subsequent","right"],["subset","box"],["subsets","box"],["subsidiary","right"],["subsidies","minus"],["subsidise","minus"],["subsidised","minus"],["subsidising","dollarsymbol"],["subsidy","down"],["subsistence","apple"],["subspecifications","square"],["substances","box"],["substandard","minus"],["substantial","box"],["substantiate","plus"],["substantiated","plus"],["substantive","plus"],["substinence","plus"],["substitute","down"],["substituted","dash"],["substitutes","box"],["substitutevarsa","dash"],["substituting","box"],["substitution","right"],["substrate","box"],["substring","square"],["substrings","square"],["subterm","variable"],["subtitle","square"],["subtle","plus"],["subtleties","one"],["subtopics","square"],["subtract","minus"],["subtracted","minus"],["subtracting","minus"],["subtraction","minus"],["subtractions","subtraction"],["subtractive","minus"],["subtractors","minus"],["subtypes","dash"],["suburb","square"],["suburbs","suburb"],["subvar","variable"],["subvariables","box"],["subvars","variable"],["succ","right"],["succeed","plus"],["succeeded","plus"],["success","tick"],["successes","plus"],["successful","a"],["successfully","tick"],["succession","right"],["succ
essions","right"],["successive","right"],["successively","right"],["successor","person"],["succleton","water"],["succt","up"],["succulent","plus"],["succulently","water"],["such","dash"],["suck","up"],["sucked","up"],["sucking","right"],["suction","up"],["sudan","country"],["sudo","plus"],["sue","person"],["sued","minus"],["suess","person"],["suffer","minus"],["suffered","minus"],["sufferer","person"],["sufferers","minus"],["suffering","minus"],["sufferings","minus"],["suffice","plus"],["sufficiency","plus"],["sufficient","plus"],["sufficiently","plus"],["suffix","right"],["suffixes","square"],["sufism","book"],["sugar",""],["sugars","sugar"],["sugary","sugar"],["sugg","plus"],["suggest","plus"],["suggested","dash"],["suggesting","square"],["suggestion","plus"],["suggestions","square"],["suggestive","plus"],["suggestively","plus"],["suggests","right"],["suicidal","minus"],["suicide","minus"],["suit",""],["suitability","plus"],["suitable","plus"],["suitcase",""],["suitcases","suitcase"],["suite","box"],["suited","equals"],["suits","plus"],["sulfide","ball"],["sulk","minus"],["sullivan","person"],["sully","person"],["sultan","person"],["sultana",""],["sultanas","sultana"],["sum","n"],["sumlist","square"],["summaries","square"],["summarisation","square"],["summarise","square"],["summarised","square"],["summarises","square"],["summarising","square"],["summary","square"],["summatively","n"],["summed","n"],["summer","sun"],["summit","n"],["sums","plus"],["sumtimes","n"],["sun","star"],["sunbathe","down"],["sunburn","minus"],["sunday","square"],["sunderland","person"],["sunfeltnesses","plus"],["sung","note"],["sunglasses",""],["sunjay","person"],["sunny","sun"],["sunrise","sun"],["sunscreen",""],["sunset","sun"],["sunshade",""],["sunshine","light"],["sunstroke","minus"],["suo","a"],["suor","water"],["sup","up"],["super","up"],["superabled","plus"],["superannuation","dollarsymbol"],["superb","plus"],["supercard","software"],["supercomputer","box"],["supercomputers","and"],[
"superego","up"],["superfoods","apple"],["superhero","person"],["superimpose","square"],["superimposed","equals"],["superimposes","down"],["superimposing","square"],["superintendent","person"],["superior","plus"],["supermarket","line"],["supermodel","person"],["superordinate","n"],["superpower","plus"],["superseded","xcorrection"],["superstar","person"],["supervised","plus"],["supervisor","person"],["supervisors","person"],["supple","plus"],["supplement","plus"],["supplementary","plus"],["supplemented","plus"],["supplied","plus"],["supplier","right"],["suppliers","person"],["supplies","apple"],["supply","right"],["supplying","right"],["support","square"],["supported","box"],["supporter","person"],["supporters","person"],["supporting","box"],["supportive","plus"],["supports","right"],["suppose","plus"],["supposed","right"],["supposedly","plus"],["suppress","xcorrection"],["suppressed","down"],["sur","dash"],["surd","n"],["surds","n"],["sure","plus"],["surely","plus"],["surf","box"],["surface","square"],["surfaces","square"],["surfer","person"],["surfing","water"],["surgeon","person"],["surgery","box"],["suriname","country"],["surname","square"],["surpass","right"],["surpassed","right"],["surpassing","right"],["surplus","plus"],["surprise","star"],["surprised","minus"],["surprises","plus"],["surprising","minus"],["surprisingly","plus"],["surreal","n"],["surrected","up"],["surrendering","plus"],["surrogate","person"],["surrounded","box"],["surrounding","right"],["surroundings","box"],["surveillance","eye"],["survey","paper"],["surveyed","paper"],["surveying","plus"],["surveyor","person"],["surveys","paper"],["survival","plus"],["survive","line"],["survived","plus"],["survivor","person"],["surya","sun"],["susa","person"],["susan","person"],["susannah","person"],["suscicipi","dash"],["suscicipid","dash"],["sushi",""],["susie","person"],["suspend","xcorrection"],["suspended","zero"],["suspenseful","n"],["suspension","line"],["suspicious","minus"],["sustain","plus"],["sust
ainability","plus"],["sustainable","plus"],["sustained","plus"],["sustaining","right"],["sutra","a"],["sutras","square"],["suzie","person"],["svetlana","person"],["sw","dash"],["swab",""],["swallow","stomach"],["swallowed","down"],["swallowing","apple"],["swam","right"],["swamp","water"],["swan",""],["swans","swan"],["swanston","street"],["swap","x"],["swapped","right"],["swapping","dash"],["swaps","dash"],["swashbuckling","plus"],["swaziland","country"],["sweat",""],["sweaty","water"],["sweden","country"],["swedish","person"],["sweep","right"],["sweet","apple"],["sweetened","sugar"],["sweetening","plus"],["sweeter","sugar"],["sweetinvincibleandprayedfor","plus"],["sweetly","sugar"],["sweetness","sugar"],["sweets","ball"],["swept","right"],["swi","dot"],["swim","right"],["swimmer","person"],["swimmers","person"],["swimming","water"],["swin","university"],["swinburne","university"],["swine","pig"],["swing","right"],["swinging","right"],["swipe","right"],["swipl","twobox"],["swirl","right"],["swirled","right"],["swish","plus"],["swiss","country"],["switch","dash"],["switchboard","dot"],["switched","one"],["switches","dash"],["switching","dash"],["switzerland","country"],["sword",""],["swung","right"],["sy","square"],["syd","city"],["sydn","person"],["sydney","dash"],["syllable","square"],["syllablecount","n"],["syllables","square"],["syllabus","paper"],["syllogistical","right"],["symbol","square"],["symbolic","x"],["symbolise","dash"],["symbolised","dash"],["symbolises","right"],["symbolising","a"],["symbolized","n"],["symbolizing","equals"],["symbols","a"],["symmetrical","equals"],["symposium","room"],["symptoms","dash"],["synaptic","dot"],["sync","equals"],["synchronisation","equals"],["synchronise","equals"],["synchronised","tworectangle"],["synchroniser","tick"],["synchrotron","and"],["syncing","equals"],["syncopated","note"],["syndrome","minus"],["syndromes","minus"],["synergy","one"],["syno","equals"],["synogism","equals"],["synogistic","equals"],["synogisticall
y","right"],["synogrammars","square"],["synologic","right"],["synon","equals"],["synonym","equals"],["synonymous","equals"],["synonyms","equals"],["synopsis","paper"],["synopup","puppy"],["synothoughts","square"],["syntactical","square"],["syntax","dash"],["syntaxes","dash"],["synth","tube"],["synthese","book"],["syntheses","dot"],["synthesis","star"],["synthesise","box"],["synthesised","dash"],["synthesiser","box"],["synthesising","right"],["synthesized","box"],["synthetical","right"],["synthetically","box"],["syria","country"],["syringe",""],["syrup",""],["sys","and"],["syspred","b"],["system","dash"],["systematic","plus"],["systematically","plus"],["systematising","plus"],["systemctl","square"],["systemic","and"],["systems","box"],["sysutils","right"],["syydw","dash"],["t","variable"],["ta","variable"],["tab","square"],["table","paper"],["tables","table"],["tablespoons","spoon"],["tablet","box"],["tablets","tablet"],["tabling","equals"],["tabs","square"],["tabular","square"],["tackle","xcorrection"],["tackled","minus"],["tackling","xcorrection"],["tact","plus"],["tactful","plus"],["tactfully","plus"],["tactical","plus"],["tactics","plus"],["tactile","finger"],["tadpole",""],["tadpoles","tadpole"],["tae","paper"],["taeass","book"],["tafta","company"],["tag","square"],["tagged","square"],["tagger","square"],["tagging","square"],["tagline","line"],["tags","square"],["tai","up"],["taiko","box"],["tail","twobox"],["tailed","n"],["tailor","person"],["tailored","plus"],["tailoring","jacket"],["tails","down"],["tajikistan","country"],["take","left"],["takeaway","plus"],["taken","left"],["takers","right"],["takes","left"],["taking","left"],["takings","dollarsymbol"],["tale","paper"],["talent","person"],["tales","book"],["talis","person"],["talk","person"],["talked","speechballoon"],["talking","speechballoon"],["talks","paper"],["tall","n"],["tallest","n"],["tallied","n"],["tallies","n"],["tally","n"],["tamas","person"],["tambourine",""],["tame","plus"],["tamouridis","pers
on"],["tampered","minus"],["tamsin","person"],["tan","person"],["tandfonline","book"],["tangents","line"],["tangerine",""],["tangible","box"],["tango","box"],["tangy","orange"],["tank","box"],["tanks","tank"],["tantalized","plus"],["tantra","plus"],["tantrum","minus"],["tanzania","country"],["tanzanian","person"],["tanzanians","person"],["tao","dot"],["tap","one"],["tape",""],["tapes","square"],["taping","tape"],["taponada","plus"],["tapped","down"],["tapper","person"],["taps","down"],["target","dot"],["targeted","dot"],["targetid","dash"],["targeting","dot"],["targets","dot"],["tarpaulin","fabric"],["tart",""],["task","book"],["taskforce","person"],["tasking","square"],["tasks","paper"],["taste","tongue"],["tasted","tongue"],["tasteful","apple"],["tasteless","minus"],["tastes","tongue"],["tasting","tongue"],["tasty","apple"],["tat","dash"],["tatiana","person"],["tattoos","square"],["taught","tick"],["tautological","right"],["tautology","right"],["tax","n"],["taxes","dollarsymbol"],["taxing","money"],["taxonomised","dash"],["taxonomy","dash"],["taylor","person"],["taylorfrancis","person"],["tb","variable"],["tbody","square"],["tc","variable"],["tchaikovsky","person"],["tcm","plus"],["td","square"],["te","person"],["tea","water"],["teach","person"],["teacher","person"],["teachers","person"],["teaches","room"],["teaching","person"],["teacups","cup"],["team","person"],["teams","person"],["teamwork","person"],["tear",""],["tearing","minus"],["tears","water"],["tease","minus"],["teased","minus"],["teasing","minus"],["tech","box"],["technical","a"],["technically","and"],["technician","person"],["technique","paper"],["techniques","one"],["techno","note"],["technological","and"],["technologically","and"],["technologies","software"],["technologisation","and"],["technology","and"],["techopedia","paper"],["tecture","plus"],["ted","person"],["teddies","toy"],["tee","person"],["teenage","n"],["teenager","person"],["teenagers","person"],["teeth","tooth"],["tel","n"],["telegram","
square"],["telegraph","right"],["telepathic","right"],["telepathically","dash"],["telepathy","right"],["telephone",""],["teleport","right"],["teleportation","right"],["teleported","right"],["teleporter","right"],["teleporting","right"],["teleports","right"],["teletypist","person"],["television","box"],["tell","mouth"],["telling","mouth"],["tells","speechballoon"],["telnet","square"],["telstra","company"],["temp","box"],["tempel","person"],["temperance","plus"],["temperature","n"],["template","paper"],["templates","paper"],["temple","building"],["temples","building"],["tempo","n"],["tempold","box"],["temporal","n"],["temporality","n"],["temporally","n"],["temporarily","plus"],["temporary","plus"],["temporium","n"],["tempos","n"],["temptation","minus"],["ten","tenrectangle"],["tenacity","plus"],["tenancy","square"],["tend","plus"],["tended","right"],["tendency","plus"],["tendon","fibre"],["tendons","right"],["tends","plus"],["teng","person"],["tennis","ball"],["tenor","person"],["tenrectangle",""],["tens","tenrectangle"],["tense","minus"],["tension","minus"],["tensorflow","software"],["tent",""],["tentacle","line"],["tentacles","tentacle"],["tentaclising","tentacle"],["tenth","n"],["tents","tent"],["tenuously","minus"],["tenure","right"],["tenured","paper"],["teo","person"],["ter","n"],["tere","person"],["teresa","person"],["term","b"],["termed","square"],["terminal","square"],["terminally","n"],["terminals","square"],["terminalvalue","n"],["terminates","n"],["termination","xcorrection"],["terminations","zero"],["terminator","box"],["terminology","square"],["termius","box"],["terms","b"],["ternary","threerectangle"],["terraformed","earth"],["terrain","map"],["territory","square"],["terror","minus"],["terrorism","minus"],["tertiary","n"],["tes","person"],["tesla","person"],["tess","person"],["tessellating","square"],["tesseract","software"],["test","paper"],["testa","tick"],["testable","tick"],["testcut","box"],["tested","dash"],["testees","person"],["tester","tick"],[
"testers","person"],["testes","ball"],["testicle","ball"],["testimonial","plus"],["testimonials","paper"],["testimony","square"],["testing","tick"],["testnumber","n"],["testopen","tick"],["tests","tick"],["testtrace","tick"],["testtube",""],["tether","line"],["tetris","square"],["text","paper"],["texta","square"],["textarea","square"],["textbook","book"],["textbooks","book"],["textbroker","company"],["textedit","box"],["textgenrnn","algorithm"],["texting","box"],["texts","paper"],["texttoalg","right"],["texttobr","right"],["texttobrall","box"],["texttobrc","equals"],["texttobreasoning","right"],["texttobreasonings","software"],["texttobreathsonings","right"],["texttobrqb","apple"],["texttobrqbpl","box"],["texttobrth","right"],["textual","square"],["textured","square"],["textures","square"],["tf","dash"],["tfe","dash"],["tfile","square"],["tfjs","algorithm"],["tfn","n"],["th","n"],["thailand","country"],["than","greaterthan"],["thank","plus"],["thanked","plus"],["thankful","plus"],["thankfully","plus"],["thanking","plus"],["thanks","hand"],["that","up"],["the","up"],["thea","person"],["theater","stage"],["theatre","stage"],["theatrical","stage"],["theatricised","stage"],["thebreasonings","apple"],["theft","minus"],["their","left"],["them","person"],["thematic","square"],["theme","square"],["themed","square"],["themes","square"],["themselves","person"],["then","right"],["thens","right"],["theodor","person"],["theol","book"],["theologian","person"],["theological","person"],["theologically","person"],["theologist","person"],["theology","book"],["theorem","n"],["theorems","right"],["theoretical","right"],["theoretically","and"],["theories","square"],["theorists","person"],["theory","and"],["ther","person"],["therapeutic","plus"],["therapists","person"],["therapy","plus"],["there","down"],["therefore","right"],["thereness","dot"],["theres","down"],["thermo","n"],["thermometer","n"],["thermonuclear","n"],["thermos","flask"],["thermostats","n"],["thes","equals"],["thesaurus
","book"],["these","down"],["theses","book"],["thesis","book"],["theta","n"],["they","person"],["thick","n"],["thickness","n"],["thief","person"],["thigh",""],["thighs",""],["thin","box"],["thing","pencilsharpener"],["thingness","box"],["things","pencilsharpener"],["think","brain"],["thinker","person"],["thinkers","person"],["thinking","a"],["thinks","plus"],["third","three"],["thirdly","plus"],["thirds","square"],["thirst","water"],["thirsty","tongue"],["thirty","thirtyrectangle"],["thirtyrectangle",""],["this","down"],["thoma","person"],["thomas","person"],["thompson","person"],["thon","plus"],["thonet","person"],["thorne","person"],["thorough","plus"],["thoroughly","plus"],["those","down"],["though","xcorrection"],["thought","box"],["thoughtful","square"],["thoughtfully","box"],["thoughts","square"],["thousand","n"],["thousands","n"],["thread","line"],["threaded","line"],["threads","line"],["threat","minus"],["threatened","minus"],["threatening","minus"],["threatens","minus"],["threats","minus"],["three","threerectangle"],["threebox",""],["threerectangle",""],["threes","threerectangle"],["threshold",""],["thresholds","n"],["threw","ball"],["throat",""],["throne",""],["through","dash"],["throughout","n"],["throughs","right"],["throw","ball"],["throwing","right"],["thrown","right"],["ths","dash"],["thstreet","street"],["thu","square"],["thumb",""],["thunberg","person"],["thunder","line"],["thunderstorm","water"],["thurs","square"],["thursday","square"],["thursdays","square"],["thus","right"],["thy","left"],["thymine","ball"],["thymus",""],["thyroid","box"],["ti","dash"],["tial","n"],["tian","person"],["tiananmen","gate"],["tianity","person"],["tianrenheyi","plus"],["tibetan","person"],["tick",""],["ticked","tick"],["ticket","square"],["tickets","square"],["ticking","tick"],["tickle","plus"],["tickled","plus"],["tickling","plus"],["tiddlies","counter"],["tiddly","counter"],["tide","line"],["tides","right"],["tidied","dash"],["tidy","plus"],["tie",""],["tied","equals
"],["tier","square"],["tiers","line"],["ties","dash"],["tight","n"],["tighten","xcorrection"],["tightened","xcorrection"],["tightly","n"],["tightrope","right"],["tights",""],["til","dash"],["tilda","person"],["tile",""],["till","right"],["tilt","right"],["tilted","triangle"],["tilting","right"],["tim","person"],["time",""],["timed","n"],["timeless","plus"],["timelily","n"],["timeline","line"],["timelines","line"],["timely","zero"],["timemachine",""],["timemachinetest","plus"],["timenow","n"],["timeout","n"],["timeouts","zero"],["timephone","n"],["timer","n"],["times","n"],["timespace","box"],["timestamp","variable"],["timetable","square"],["timetables","square"],["timetodo","n"],["timetravelgrandfatherparadoxsolved","tick"],["timezone","square"],["timing","n"],["timings","n"],["timist","minus"],["timor","country"],["timpani","drum"],["tin",""],["tina","person"],["tinalia","plus"],["ting","ear"],["tinge","dot"],["tinkle","dot"],["tins","can"],["tinsel",""],["tinsels","tinsel"],["tinted","n"],["tiny","n"],["tion","dash"],["tip","up"],["tipped","plus"],["tips","plus"],["tired","minus"],["tiredly","minus"],["tiredness","minus"],["tiring","minus"],["tis","equals"],["tissue","box"],["tissues","box"],["tiszai","person"],["titillations","plus"],["title","square"],["titles","square"],["titlez","square"],["tizziwinkle","hedgehog"],["tla","plus"],["tlalli","person"],["tm","company"],["tmp","paper"],["tmps","paper"],["to","right"],["toadstool",""],["toast",""],["toasted","toast"],["toastmaster","company"],["toastmasters","company"],["tobago","country"],["toboggan",""],["toby","person"],["toc","dash"],["today","square"],["todays","square"],["toddler","person"],["todo","tick"],["toe",""],["toenail",""],["toes","toe"],["toffee",""],["tofu",""],["together","dash"],["toggle","right"],["togo","country"],["toi","person"],["toiled","hand"],["toilet","down"],["toiletries","toothbrush"],["toity","plus"],["token","n"],["tokenise","square"],["tokeniser","square"],["tokenises","square"],["t
okens","variable"],["tolang","paper"],["told","speechballoon"],["tolerance","plus"],["tolerate","plus"],["toller","dollarsymbol"],["tom","drum"],["tomato",""],["tomatoes","tomato"],["tombs","room"],["tome","country"],["tomfromglasgow","person"],["tommillswyn","person"],["tomorrow","square"],["tonally","n"],["tone","smile"],["tones","n"],["tonewheel","note"],["tong","dash"],["tonga","country"],["tongs",""],["tongue",""],["tonight","square"],["tonk","ball"],["tonsils","ball"],["tony","person"],["too","plus"],["took","left"],["tool","hammer"],["toolbox","box"],["toole","person"],["tools","right"],["toom","person"],["toorak","suburb"],["tooth",""],["toothbrush",""],["toothpaste",""],["toothpick",""],["top","square"],["topic","a"],["topical","square"],["topics","tworectangle"],["topk","box"],["toplevel","dot"],["topologically","dot"],["topology","right"],["topper","up"],["topping",""],["topple","down"],["topresentandpagesto","right"],["tops","up"],["toptal","company"],["tor","person"],["torch",""],["torso",""],["tort","minus"],["torus",""],["tos","paper"],["toss","right"],["tossed","up"],["tot","baby"],["total","uppert"],["totalbars","n"],["totality","n"],["totalled","n"],["totallength","n"],["totally","n"],["totals","n"],["totalvars","n"],["totem","plus"],["totted","right"],["touch","square"],["touched","finger"],["touches","equals"],["touching","equals"],["tough","minus"],["toupée",""],["tour","right"],["toured","right"],["touring","right"],["tourism","right"],["tourist","person"],["tourists","person"],["tournament","right"],["tournaments","greaterthan"],["tours","right"],["tow","right"],["toward","right"],["towards","right"],["towardsdatascience","right"],["towel",""],["towels","towel"],["tower",""],["town",""],["towords","right"],["toxic","minus"],["toxins","minus"],["toy","box"],["toyota","car"],["toys","toy"],["tp","square"],["tpfile","square"],["tquk","company"],["tr","right"],["tra","plus"],["trace","dash"],["traced","line"],["traces","line"],["trachea","tube"],[
"tracing","line"],["track","tick"],["tracked","plus"],["tracking","tick"],["tracks","line"],["tracksnumber","n"],["tract","box"],["tractatus","book"],["traction","right"],["tractor","right"],["tracy","person"],["trade","book"],["tradesman","person"],["tradesmanship","plus"],["trading","right"],["tradition","plus"],["traditional","plus"],["traditionally","plus"],["traditions","book"],["traffic","right"],["trafficking","minus"],["tragedy","minus"],["tragically","minus"],["trail","right"],["train","box"],["trained","tick"],["trainee","person"],["trainees","person"],["trainer","person"],["trainers","person"],["training","paper"],["trainings","paper"],["trains","person"],["trait","square"],["traits","square"],["trajectories","right"],["trajectory","right"],["tram",""],["tramlgreturningfromshowreel","tram"],["tran","right"],["trancing","right"],["tranfrom","left"],["trans","right"],["transaction","dollarsymbol"],["transactional","box"],["transactionally","right"],["transactions","dollarsymbol"],["transcend","line"],["transcended","up"],["transcendence","up"],["transcendental","line"],["transcendentalism","up"],["transcending","up"],["transcends","up"],["transcribe","right"],["transcribed","right"],["transcript","paper"],["transcription","right"],["transcripts","paper"],["transdisciplinary","right"],["transf","right"],["transfer","right"],["transferability","right"],["transferable","right"],["transference","right"],["transferred","right"],["transfers","right"],["transfixed","equals"],["transform","right"],["transformation","right"],["transformational","right"],["transformations","right"],["transformative","right"],["transformed","right"],["transforming","right"],["transforms","right"],["transgender","person"],["transgress","minus"],["transgressions","minus"],["transhumanism","right"],["transition","right"],["transitioning","right"],["transitionism","right"],["transitions","right"],["transitive","ball"],["transitivity","right"],["translatable","tick"],["translatative","righ
t"],["translate","right"],["translated","apple"],["translatedtext","square"],["translates","right"],["translating","right"],["translation","right"],["translationa","paper"],["translationdictionary","book"],["translationi","right"],["translationmanagementsystem","box"],["translationpairs","box"],["translations","right"],["translationword","square"],["translationwords","square"],["translator","person"],["translators","right"],["transliterations","square"],["translocal","right"],["translucent","right"],["transmission","right"],["transmit","right"],["transmitted","right"],["transmitter","zline"],["transmitters","right"],["transmitting","right"],["transparency","square"],["transparent","n"],["transparently","plus"],["transplanted","right"],["transplanting","right"],["transport","right"],["transported","right"],["transporting","right"],["transpose","right"],["transposing","right"],["transsexual","person"],["transsexualisation","right"],["transversion","right"],["transversions","right"],["transvert","right"],["transverted","dot"],["transverter","right"],["transverting","right"],["transverts","right"],["trap","box"],["trapeze",""],["trapezium",""],["traps","stick"],["trash","box"],["trauma","minus"],["traumatic","minus"],["traumatised","minus"],["travails","book"],["travel","train"],["travelled","right"],["traveller","person"],["travellers","person"],["travelling","right"],["travels","right"],["traversal","right"],["traverse","right"],["traversed","right"],["traverses","right"],["traversing","down"],["trawl","right"],["trawling","right"],["trawls","right"],["tray",""],["trays","tray"],["treacle",""],["treasure","coin"],["treasurer","person"],["treasures","coin"],["treasury","dollarsymbol"],["treat","apple"],["treated","plus"],["treating","book"],["treatment","tick"],["treatments","book"],["treats","plus"],["treble","note"],["tree",""],["trees","tree"],["treetime","n"],["treetops","tree"],["tremolo","line"],["trend","right"],["trends","up"],["trent","person"],["triad","trian
gle"],["trial","line"],["trialed","n"],["trialled","plus"],["trialling","tick"],["trials","n"],["trialy","n"],["triangle","square"],["triangles","triangle"],["triangular","triangle"],["triangulate","plus"],["triangulation","triangle"],["tribal","person"],["tribe","person"],["tributaries","right"],["trice","plus"],["triceps","muscle"],["trick","plus"],["tricked","plus"],["tricks","plus"],["tricky","minus"],["triculating","down"],["tricycle",""],["trident",""],["tried","tick"],["trier","dash"],["tries","tick"],["trigger","tick"],["triggered","plus"],["triggering","right"],["triggers","plus"],["trigonometric","triangle"],["trilogy","book"],["trim","square"],["trimino","triangle"],["trimming","down"],["trinidad","country"],["trinity","threerectangle"],["trip","right"],["triple","threerectangle"],["triplec","dash"],["triplets","three"],["tripped","minus"],["tripping","minus"],["trips","right"],["triumphant","plus"],["trivia","square"],["trivial","plus"],["trivium","threerectangle"],["trobe","person"],["troff","person"],["troll","person"],["trolley",""],["trombone",""],["trope","right"],["tropes","plus"],["trophy",""],["tropical","water"],["trotsky","person"],["troubadour","person"],["trouble","minus"],["trousers",""],["trout","fish"],["troy","person"],["truck",""],["true","one"],["trump","person"],["trumpet",""],["trumpets","trumpet"],["truncate","n"],["truncated","n"],["truncates","n"],["trundle","right"],["trunk",""],["trust","plus"],["trusted","plus"],["trusting","plus"],["trustingly","plus"],["trusts","plus"],["trustworthiness","plus"],["trustworthy","plus"],["truth","one"],["truthful","plus"],["truthfully","plus"],["truths","plus"],["try","dash"],["trying","tick"],["tryoutputs","variable"],["tryoutputsa","variable"],["trytranslations","tick"],["ts","dash"],["tsunami","water"],["tt","dash"],["ttb","apple"],["ttfa","dash"],["tts","dash"],["ttspec","square"],["ttt","dash"],["tty","square"],["ttys","dash"],["tub","box"],["tuba","tube"],["tube","box"],["tubes","tube"],["
tubising","tube"],["tubular","tube"],["tubules","tube"],["tuck","plus"],["tucked","box"],["tucker","person"],["tucking","down"],["tue","square"],["tues",""],["tuesday","square"],["tuesdays","square"],["tuition","plus"],["tulip","flower"],["tumour","minus"],["tumours","minus"],["tune","note"],["tuned","note"],["tunes","note"],["tunic",""],["tunisia","country"],["tunnel",""],["tunnels","box"],["tuple","tworectangle"],["tuples","square"],["turbine","right"],["ture","plus"],["tures","plus"],["turin","city"],["turing","person"],["turkey","country"],["turkmenistan","country"],["turmeric",""],["turn","square"],["turnbull","person"],["turncut","line"],["turndebug","tick"],["turned","two"],["turnequals","right"],["turners","right"],["turning","right"],["turnip",""],["turnitin","box"],["turnoffas","zero"],["turnover","circle"],["turnpike","right"],["turns","tworectangle"],["turtleart","right"],["tutankhamen","person"],["tute","room"],["tutor","person"],["tutorial","room"],["tutorials","room"],["tutoring","book"],["tutors","person"],["tuvalu","country"],["tuxedos","tuxedo"],["tv",""],["tw","company"],["twain","person"],["tween","box"],["tweet","bird"],["twelve","twelverectangle"],["twenty","twentyrectangle"],["twentyman","person"],["twentyrectangle",""],["twice","two"],["twigg","person"],["twin","person"],["twirled","circle"],["twirling","right"],["twist","right"],["twisted","x"],["twisting","right"],["twists","right"],["twitch","n"],["twitched","n"],["twitches","n"],["twitter","building"],["two",""],["twobox",""],["twodimensional","square"],["twofold","plus"],["tworectangle",""],["twoshorthandles","handle"],["twouses","tworectangle"],["txt","paper"],["txtsetting","square"],["ty","dash"],["tying","dot"],["type","box"],["typed","box"],["typedict","book"],["types","variable"],["typesoftypes","square"],["typesoftypesdictionary","paper"],["typestatements","equals"],["typewriter",""],["typical","plus"],["typically","plus"],["typing","square"],["typist","person"],["typographed","squ
are"],["typographical","box"],["typology","box"],["typos","minus"],["tyrannosaurus","dinosaur"],["tyranny","minus"],["tyre",""],["tyson","person"],["tz","dash"],["u","variable"],["ua","n"],["ual","dash"],["ub","variable"],["uba","dash"],["ubiquitously","box"],["ubreen","square"],["ubu","person"],["uc","variable"],["ucgjzsx","dash"],["ucr","company"],["udaya","person"],["ue","dash"],["ug","dash"],["uganda","country"],["ugly","minus"],["uh","minus"],["ui","square"],["uk","country"],["ukraine","country"],["ul","square"],["ulml","dash"],["ultimate","plus"],["ultimately","plus"],["ultraviolet","n"],["um","dash"],["umami","tofu"],["umberto","person"],["umbrella",""],["un","one"],["una","one"],["unable","minus"],["unabridged","book"],["unacceptable","minus"],["unaccredited","minus"],["unacknowledged","minus"],["unaffected","plus"],["unambiguous","plus"],["uname","square"],["unanswered","minus"],["unassumingness","plus"],["unauthorized","minus"],["unavailable","minus"],["unavoidable","minus"],["unaware","minus"],["unawareness","minus"],["unbalanced","greaterthan"],["unbelievable","minus"],["unbiased","plus"],["unblock","plus"],["unblocked","box"],["unblocking","right"],["unbroken","plus"],["unbuttoned","left"],["uncapped","up"],["uncared","minus"],["uncertain","minus"],["uncertainty","minus"],["unchanged","one"],["unchanging","one"],["unchecked","minus"],["unchosen","minus"],["unclaimed","minus"],["unclassified","minus"],["uncle","person"],["unclear","xcorrection"],["unclearly","minus"],["uncles","person"],["uncomfortable","minus"],["uncomment","minus"],["uncommented","box"],["uncommenting","xcorrection"],["uncompressed","box"],["unconceal","up"],["unconcealed","up"],["unconcealing","eye"],["unconceals","eye"],["unconceived","zero"],["unconditionally","plus"],["unconfident","minus"],["unconnected","minus"],["unconscious","zero"],["unconsciously","minus"],["uncontactable","minus"],["uncontrollable","minus"],["uncorrect","minus"],["uncountable","n"],["uncover","up"],["uncover
ed","up"],["uncovering","down"],["uncovers","right"],["uncritical","minus"],["uncurled","right"],["undead","plus"],["undebated","minus"],["undecided","minus"],["undef","zero"],["undefined","dash"],["undefinedalgorithms","variable"],["undefinedbreasonings","apple"],["undefinedbreathsonings","variable"],["undefineddirections","variable"],["undefinedobjectstofinish","variable"],["undefinedobjectstoprepare","variable"],["undefinedpartsofrooms","variable"],["undefinedrooms","variable"],["under","down"],["underbelly","square"],["underdog","person"],["undergarments","shirt"],["undergoing","plus"],["undergraduate","person"],["undergraduates","person"],["underground","building"],["underhanded","minus"],["underline","line"],["underlined","line"],["underlying","square"],["undermine","minus"],["underneath","down"],["underpin","plus"],["underpinned","down"],["underpins","down"],["underscore",""],["underscored","line"],["underscores","square"],["understand","tick"],["understandable","tick"],["understanding","square"],["understands","plus"],["understated","line"],["understood","plus"],["undertake","plus"],["undertaken","plus"],["undertaking","plus"],["undertook","plus"],["underwater","water"],["undesirable","minus"],["undeveloped","plus"],["undevelopedly","plus"],["undid","minus"],["undirected","minus"],["undisciplined","minus"],["undiscombobulated","plus"],["undo","xcorrection"],["undoing","xcorrection"],["undress","down"],["undue","minus"],["unduly","minus"],["une","university"],["unemployment","minus"],["unend","box"],["unendingly","n"],["unenroll","right"],["unenrolled","up"],["unep","company"],["unequal","notequals"],["unesco","building"],["unethical","minus"],["unethically","minus"],["uneventful","plus"],["unexamined","minus"],["unexp","box"],["unexpanded","box"],["unexpected","xcorrection"],["unexplained","minus"],["unexplored","minus"],["unexpressive","minus"],["unfair","minus"],["unfairly","minus"],["unfamiliar","minus"],["unfavourable","minus"],["unfilled","box"],["unfin
ished","minus"],["unfitness","minus"],["unfolded","right"],["unfolding","right"],["unfolds","right"],["unforgettable","plus"],["unfortunate","minus"],["unfortunately","minus"],["unfound","minus"],["unfounded","minus"],["unfriendly","minus"],["unfunded","zero"],["unfurled","right"],["ungrammatical","minus"],["unguent",""],["unhappiness","minus"],["unhappy","minus"],["unhd","dash"],["unhealthiness","minus"],["unhealthy","minus"],["unheard","minus"],["uni","building"],["unidentified","minus"],["unification","equals"],["unified","one"],["uniform","one"],["uniformisation","plus"],["uniformise","plus"],["uniformised","plus"],["uniformity","equals"],["uniformly","n"],["uniforms","shirt"],["unify","one"],["unimelb","building"],["unimportant","minus"],["uninstall","xcorrection"],["unintelligent","minus"],["unintended","minus"],["uninterrupted","plus"],["union","threerectangle"],["unions","equals"],["unique","one"],["uniquely","one"],["uniqueremainders","n"],["unirioja","dash"],["unis","building"],["unit","one"],["united","square"],["units","line"],["unity","box"],["univer","box"],["univeristy","dash"],["universal","box"],["universalisation","box"],["universalism","one"],["universality","n"],["universally","box"],["universe","box"],["universes","box"],["universitates","box"],["universities","university"],["university","building"],["unix","dash"],["unjust","minus"],["unknown","n"],["unknowns","minus"],["unlawful","minus"],["unless","xcorrection"],["unlike","minus"],["unlikely","minus"],["unlimited","n"],["unlinked","dash"],["unlock","key"],["unlocked","right"],["unlocks","right"],["unmanifested","one"],["unmarked","zero"],["unmatch","notequals"],["unmeant","minus"],["unnameable","square"],["unnamed","zero"],["unnatural","minus"],["unnaturalness","minus"],["unnec","xcorrection"],["unnecessarily","minus"],["unnecessary","minus"],["unneeded","xcorrection"],["unnoted","box"],["unnoticeable","plus"],["unobservably","minus"],["unobtrusive","plus"],["unordered","n"],["unoriginal","mi
nus"],["unpack","up"],["unpacked","up"],["unpacking","up"],["unplanned","plus"],["unplugged","box"],["unpopularity","minus"],["unpredictability","minus"],["unpredictable","minus"],["unprepared","minus"],["unprocessed","plus"],["unproductive","plus"],["unprofessionalism","minus"],["unprotected","minus"],["unravel","right"],["unravelled","right"],["unravelling","right"],["unread","minus"],["unreasonable","minus"],["unreasonably","minus"],["unregulated","minus"],["unrelated","xcorrection"],["unrelatedness","plus"],["unreliability","minus"],["unreliable","minus"],["unreliably","minus"],["unresearched","minus"],["unresolved","minus"],["unresolvedly","minus"],["unrestricted","n"],["unreturned","minus"],["unrolled","right"],["unsafe","minus"],["unsafely","minus"],["unsaid","zero"],["unscrew","up"],["unscrewed","up"],["unscrewing","right"],["unseen","zero"],["unskilled","minus"],["unstable","minus"],["unstoppable","plus"],["unsucc","xcorrection"],["unsuccessful","xcorrection"],["unsuggested","minus"],["unsupported","minus"],["unsure","minus"],["unsureness","minus"],["unsynthesised","minus"],["untaken","plus"],["untenable","plus"],["untested","minus"],["unthinkable","xcorrection"],["until","dot"],["untitled","square"],["unto","right"],["untouched","square"],["untraced","right"],["untrained","zero"],["untried","minus"],["untrue","zero"],["untwisted","line"],["unupdated","minus"],["unused","square"],["unusual","minus"],["unverified","minus"],["unvisited","xcorrection"],["unwanted","minus"],["unwantedly","minus"],["unwavering","plus"],["unwell","minus"],["unwinding","right"],["unwittingly","minus"],["unworlding","plus"],["unwound","plus"],["unwrap","right"],["unwrapped","up"],["unwrapping","right"],["unwritten","zero"],["unzip","right"],["unzipped","notequals"],["unzipping","down"],["uon","university"],["uonline","paper"],["uop","dash"],["up",""],["upanisad","book"],["upanisads","book"],["upanishads","book"],["upasana","meditator"],["upasanasutra","square"],["upbeat","plus"],["
upbringing","plus"],["upcase","square"],["upcoming","one"],["upda","plus"],["updat","up"],["update","plus"],["updated","tick"],["updatefile","plus"],["updatelocal","up"],["updater","tick"],["updaters","plus"],["updates","up"],["updatetrans","plus"],["updatevar","variable"],["updatevars","variable"],["updating","up"],["upf","n"],["upgrade","up"],["upgraded","up"],["upgrading","up"],["upheld","plus"],["uphill","up"],["uphold","hand"],["upholding","up"],["upkeep","plus"],["uplifting","up"],["upload","up"],["uploaded","up"],["uploading","up"],["uploads","up"],["upness","up"],["upon","down"],["upper","up"],["uppera","a"],["upperb","b"],["upperc","c"],["uppercase","up"],["upperd","d"],["uppere","e"],["upperf","f"],["upperg","g"],["upperh","h"],["upperi","i"],["upperj","j"],["upperk","k"],["upperl","l"],["upperm","m"],["uppern","n"],["uppero","o"],["upperp","p"],["upperq","q"],["upperr","r"],["uppers","s"],["uppert","t"],["upperu","u"],["upperv","v"],["upperw","w"],["upperx","x"],["uppery","y"],["upperz","z"],["uprety","person"],["upright","up"],["uprise","xcorrection"],["ups","up"],["upset","minus"],["upsets","minus"],["upsetting","minus"],["upside","up"],["upsized","up"],["upsold","dollarsymbol"],["upstairs","up"],["uptime","n"],["uptown","city"],["upward","up"],["upwards","up"],["upwork","company"],["ur","person"],["urban","box"],["urea",""],["ureters","tube"],["urethra","tube"],["urge","plus"],["urged","plus"],["urgency","xcorrection"],["urgent","plus"],["urgently","plus"],["urinary","urine"],["urinated","liquid"],["urination","urine"],["urine",""],["url","square"],["urls","square"],["urn",""],["uruguay","country"],["us","person"],["usable","plus"],["usage","hand"],["usb","box"],["usborne","company"],["use","spoon"],["usea","hand"],["useableness","plus"],["used","plus"],["useexisting","plus"],["useful","plus"],["usefulness","plus"],["useless","dot"],["uselessness","minus"],["user","person"],["userid","n"],["userinput","square"],["username","square"],["users","person"],
["uses","tworectangle"],["using","dash"],["usp","square"],["usr","person"],["ustad","person"],["usual","plus"],["usually","one"],["usurp","minus"],["utc","dash"],["utensil","fork"],["uterus","box"],["utf","dash"],["utilise","plus"],["utilised","plus"],["utilising","dash"],["utilitarian","person"],["utilitarianism","box"],["utilities","right"],["utility","plus"],["utilize","plus"],["utm","dash"],["utpermianbasinx","university"],["utter","square"],["utterance","square"],["utterances","square"],["uttered","square"],["uttering","square"],["utters","mouth"],["uv","line"],["uva","n"],["uvb","n"],["uvks","dash"],["uxrqkjk","dash"],["uypi","dash"],["uzbekistan","country"],["uztuabyg","dash"],["v","variable"],["va","variable"],["vacancies","box"],["vacation","square"],["vaccinated","plus"],["vaccination","syringe"],["vaccine",""],["vaccines","xcorrection"],["vacuum","box"],["vadher","person"],["vag","person"],["vagina","box"],["vague","minus"],["vaguely","minus"],["vagus","box"],["vaj","person"],["val","one"],["valerie","person"],["valiant","plus"],["valid","plus"],["validate","tick"],["validated","plus"],["validates","plus"],["validating","plus"],["validation","plus"],["validity","one"],["valley","box"],["valleys","down"],["vals","n"],["valuable","dollarsymbol"],["value","one"],["valued","n"],["values","one"],["valuing","plus"],["valve","box"],["van","person"],["vandalised","minus"],["vanessa","person"],["vanga","person"],["vanilla","icecream"],["vanishing","box"],["vansh","person"],["vanuatu","country"],["vaporise","zero"],["vaporised","zero"],["vaporising","air"],["var","variable"],["vara","variable"],["varese","city"],["variability","n"],["variable",""],["variablename","square"],["variables","variable"],["variance","minus"],["variant","two"],["variants","right"],["variation","n"],["variations","n"],["varicose","minus"],["varies","n"],["variety","n"],["various","equals"],["variously","plus"],["varlist","tworectangle"],["varlists","square"],["varn","dash"],["varname","squa
re"],["varnames","variable"],["varnum","n"],["vars","variable"],["varsd","box"],["vary","n"],["varyers","dash"],["varying","two"],["vas","tube"],["vascular","tube"],["vasculature","tube"],["vase",""],["vast","n"],["vastu","tree"],["vasuli","person"],["vatican","city"],["vaudren","person"],["vault","box"],["vb","dash"],["vc","variable"],["vca","university"],["vce","paper"],["vdwhhj","dash"],["ve","dash"],["vec","right"],["vector","right"],["vectors","right"],["ved","n"],["veda","dash"],["vedic","person"],["vegan","person"],["veganism","zucchini"],["vegans","person"],["vegetable","box"],["vegetables","squash"],["vegetarian","apple"],["vegetarianism","zucchini"],["vegetarians","person"],["vehemence","xcorrection"],["vehement","minus"],["vehicle",""],["vehicles","car"],["vein","tube"],["veins","tube"],["vel","dash"],["velocities","n"],["velocity","n"],["ven","person"],["venan","person"],["vending","dollarsymbol"],["vendor","person"],["vendors","person"],["venezuela","country"],["venice","dash"],["venn","square"],["ventilation","box"],["ventriloquist","person"],["venture","right"],["venue","building"],["venus","person"],["venusian","person"],["verb","equals"],["verbal","tongue"],["verbalisation","mouth"],["verbally","mouth"],["verbose","plus"],["verbphrase","square"],["verbs","square"],["verde","country"],["verdict","plus"],["verifiable","plus"],["verifiably","plus"],["verification","tick"],["verificational","plus"],["verificationalism","tick"],["verificationism","tick"],["verified","tick"],["verifier","person"],["verifies","tick"],["verify","tick"],["verifying","tick"],["vermicelli","spaghetti"],["vermillion","square"],["verpaccio","person"],["vers","n"],["versa","dash"],["verse","square"],["versechorussoloprogression","right"],["verses","square"],["version","one"],["versioning","n"],["versions","tworectangle"],["versity","square"],["versus","dash"],["vertex","dot"],["vertical","up"],["vertically","up"],["very","plus"],["ves","person"],["vesicle","ball"],["vesicles","ve
sicle"],["vessel","cup"],["vessels","tube"],["vest",""],["vestibular","box"],["vestiches","square"],["vestments","robe"],["vet","paper"],["veterans","person"],["veterinary","person"],["vetiver",""],["vetus","n"],["vetusia","land"],["vetusian","square"],["vgaz","dash"],["vgh","dash"],["vgp","square"],["vgp","variable"],["vgps","square"],["vgps","variable"],["vi","box"],["via","right"],["viability","plus"],["viable","plus"],["viagra","tablet"],["vibrant","plus"],["vibraphone","dot"],["vibrato","n"],["vic","state"],["vice","two"],["vicinity","n"],["victim","person"],["victor","person"],["victoria","state"],["victorian","person"],["victoriously","plus"],["victory","plus"],["video","square"],["videoed","camera"],["videoing","line"],["videophone","square"],["videos","square"],["vids","line"],["vienna","person"],["viete","person"],["vietnam","country"],["view","eye"],["viewable","square"],["viewaction","eye"],["viewed","eye"],["viewer","person"],["viewers","person"],["viewfinder","square"],["viewing","eye"],["viewitem","plus"],["viewpoint","eye"],["viewpoints","eye"],["viewport","square"],["viewproductviaportal","dash"],["views","eye"],["viewscreen","square"],["vile","minus"],["villain","person"],["vim","and"],["vince","person"],["vincent","country"],["vinci","person"],["vine",""],["vineet","person"],["vinegar",""],["viol","viola"],["viola",""],["violated","minus"],["violation","minus"],["violations","minus"],["violent","minus"],["violin",""],["viral","n"],["viralise","right"],["virality","plus"],["virgin","person"],["virginia","person"],["virgo","star"],["virility","person"],["virt","box"],["virtual","box"],["virtualization","box"],["virtually","box"],["virtue","plus"],["virtues","plus"],["virtuous","plus"],["virtuously","plus"],["virus",""],["viruses","virus"],["visa","dollarsymbol"],["visceral","tissue"],["viscosity","n"],["viscous","n"],["visibility","eye"],["visible","square"],["vision","eye"],["visioning","eye"],["visit","one"],["visited","one"],["visitees","person"]
,["visiting","plus"],["visitor","person"],["visitors","person"],["vista","tree"],["visual","square"],["visualisation","square"],["visualisations","square"],["visualise","square"],["visualised","eye"],["visualises","eye"],["visualising","square"],["visually","eye"],["visuals","square"],["vita","line"],["vitae","line"],["vital","plus"],["vitamin","capsule"],["vitamins","ball"],["vitiate","plus"],["vitiated","plus"],["vitro","testtube"],["viva","plus"],["vive","plus"],["vivo","person"],["vivs","variable"],["vivshaw","person"],["vj","person"],["vl","n"],["vladimir","person"],["vlucians","person"],["vn","dash"],["vns","right"],["vocab","square"],["vocabulary","square"],["vocal","speechballoon"],["vocalists","person"],["vocally","mouth"],["vocals","note"],["vocalstubinstrument","tube"],["vocation","book"],["vocational","book"],["vocationally","plus"],["vocative","plus"],["voced","book"],["vocedplus","and"],["voff","n"],["voice","mouth"],["voiceparts","line"],["voices","square"],["voicetrack","line"],["voicing","square"],["void","zero"],["voids","box"],["vol","n"],["volcanic","up"],["volcano",""],["volume","box"],["volumes","box"],["voluntarily","plus"],["voluntary","plus"],["volunteer","person"],["volunteering","plus"],["volunteers","person"],["vomit","box"],["vomited","up"],["vomiting","up"],["von","n"],["voom","right"],["voomed","right"],["vorstellung","box"],["vote","tick"],["voted","n"],["voter","person"],["voters","person"],["votes","plus"],["voting","square"],["vouchers","square"],["vous","person"],["vows","square"],["vox","tube"],["voxel","box"],["voxels","box"],["vp","square"],["vpns","square"],["vps","computer"],["vpsbu","box"],["vpscodes","n"],["vpsfilename","square"],["vpspath","square"],["vpsstring","square"],["vpsterm","square"],["vpsunit","box"],["vr","variable"],["vren","person"],["vrooming","right"],["vrqa","company"],["vs","dash"],["vsorted","right"],["vt","dash"],["vtp","square"],["vu","dash"],["vuckanova","person"],["vulgar","minus"],["vulnerability","m
inus"],["vulnerable","minus"],["vulnerably","plus"],["vulva","organ"],["vv","mirror"],["vyes","tick"],["w","variable"],["wa","variable"],["wackery","plus"],["waddling","duck"],["waded","right"],["wafer",""],["waffle",""],["wage","dollarsymbol"],["wailful","ear"],["waist",""],["wait","n"],["waited","plus"],["waiting","plus"],["waiver","plus"],["wake","right"],["wakeful","plus"],["waking","up"],["wales","state"],["walk","path"],["walked","right"],["walkers","person"],["walking","path"],["walks","path"],["walkthrough","right"],["walkthroughs","right"],["walkway","right"],["wall","square"],["wallabies","wallaby"],["wallaby",""],["wallace","person"],["wallet",""],["walls","box"],["walnut",""],["walter","person"],["walton","person"],["waltzing","right"],["wam","book"],["wand",""],["wandered","right"],["wandering","right"],["wang","person"],["want","plus"],["wanted","book"],["wantedly","plus"],["wanting","left"],["wantingness","plus"],["wants","left"],["war","minus"],["ward","plus"],["wardley","person"],["wardrobe",""],["wards","room"],["warehouse","building"],["warehouses","building"],["warfare","minus"],["warm","plus"],["warmed","up"],["warming","up"],["warmth","plus"],["warn","xcorrection"],["warne","person"],["warned","xcorrection"],["warning","minus"],["warnings","xcorrection"],["warns","xcorrection"],["warp","right"],["warrant","paper"],["warranties","plus"],["warranty","square"],["warring","minus"],["wars","minus"],["wart","box"],["warthogs","warthog"],["was","equals"],["wash","water"],["washed","water"],["washing","water"],["washington","person"],["washy","minus"],["wasn","notequals"],["waste","minus"],["wasted","minus"],["wasteland","minus"],["wasting","minus"],["watch","eye"],["watched","eye"],["watcher","person"],["watching","eye"],["water","amount"],["waterbed","bed"],["waterchick",""],["waterchicks","waterchick"],["watercress",""],["watered","water"],["watering","water"],["watermark","square"],["watermelon",""],["watertight","box"],["waterway","water"],["water
y","water"],["waurn","suburb"],["wav","file"],["wave","right"],["waved","hand"],["wavelength","n"],["wavering","minus"],["waves","right"],["waving","hand"],["wavs","paper"],["wax",""],["way","path"],["wayback","company"],["ways","path"],["wb","variable"],["wc","variable"],["we","person"],["weak","minus"],["weaker","minus"],["weakness","minus"],["weaknesses","minus"],["wealth","dollarsymbol"],["wealthy","dollarsymbol"],["weapons","minus"],["wear","coat"],["wearer","person"],["wearers","person"],["wearing","trousers"],["weary","minus"],["weasel",""],["weasoned","plus"],["weather","water"],["weave","right"],["weaved","right"],["web","paper"],["webarranger","algorithm"],["webb","person"],["webbed","duck"],["webber","person"],["webinar","square"],["webinars","square"],["webmaster","person"],["webpage","paper"],["webpages","paper"],["website","paper"],["websites","paper"],["webster","person"],["wed","square"],["wedding","dash"],["wedgeman","person"],["wednesday","square"],["weed","plant"],["week","sevenrectangle"],["weekdays","square"],["weekend","square"],["weekends","square"],["weekly","square"],["weeks","square"],["ween","box"],["weet",""],["wei","person"],["weigh","n"],["weighed","n"],["weighers","n"],["weighing","n"],["weight","n"],["weighted","down"],["weights","n"],["weighty","n"],["weird","minus"],["weirdness","minus"],["welcome","plus"],["welcomed","plus"],["welcoming","hand"],["welfare","dollarsymbol"],["well","person"],["wellbeing","plus"],["wellington","person"],["wellness","plus"],["welsh","country"],["wemba","person"],["went","right"],["wentworth","program"],["wept","amount"],["wer","box"],["were","equals"],["weren","notequals"],["west","left"],["western","left"],["wet","liquid"],["wetted","water"],["wetting","water"],["weyoun","person"],["wgets","and"],["whale",""],["whales","whale"],["wharton","university"],["what","plus"],["whatever","one"],["whd","variable"],["wheat",""],["wheats","plant"],["wheatsheaf","wheat"],["wheel","circle"],["wheelbarrow",""],["wh
eelchair",""],["wheeled","right"],["wheeledly","wheel"],["wheeler","person"],["wheels","wheel"],["when","zero"],["whenever","n"],["where","square"],["whereas","dash"],["wherefarers","person"],["wherein","plus"],["wherever","dot"],["whether","dash"],["whfuwamyi","dash"],["which","up"],["while","dash"],["whilst","equals"],["whisk",""],["whisking","whisk"],["whist","person"],["whistl","person"],["whistle","tube"],["whistleblower","person"],["whistler","dog"],["whistlice","person"],["whistlur","person"],["whistye","person"],["white","square"],["whiteboard",""],["whiteleys","person"],["whiter","square"],["whiterhen","duck"],["whitetocolour","right"],["whitmore","person"],["whittaker","person"],["whittington","person"],["whittling","down"],["who","person"],["whoa","plus"],["whois","dash"],["whole","box"],["wholeness","n"],["wholly","n"],["whom","person"],["whose","box"],["why","dash"],["wi","right"],["wicca","religion"],["wick",""],["wicked","minus"],["wickets",""],["wide","line"],["widely","plus"],["wider","greaterthan"],["widgets","and"],["width","n"],["widths","n"],["wife","person"],["wifi","right"],["wig","hair"],["wiggled","right"],["wigwams","wigwam"],["wiki","book"],["wikia","book"],["wikihow","paper"],["wikipedia","book"],["wil","dash"],["wilb","person"],["wilbarans","person"],["wilbercock","person"],["wilbercocks","person"],["wilbur","goat"],["wild","lion"],["wildcard","square"],["wilderness","tree"],["wildlife","animal"],["wildly","plus"],["wilkins","person"],["will","right"],["williams","person"],["willie","person"],["williec","person"],["willing","plus"],["willingly","plus"],["willingness","plus"],["wilt","down"],["win","a"],["winchell","person"],["winckler","person"],["wind","right"],["winding","circle"],["window","square"],["windowless","wall"],["windows","square"],["windpipe",""],["windsock",""],["wine",""],["winemaker","person"],["wing","dash"],["wingate","person"],["wings","wing"],["winkled","plus"],["winks","eye"],["winner","person"],["winners","person"]
,["winnie","person"],["winning","a"],["winnings","dollarsymbol"],["wins","plus"],["winston","person"],["winter","water"],["wiped","right"],["wiper",""],["wiping","right"],["wire","line"],["wires","line"],["wiring","wire"],["wisdom","plus"],["wise","plus"],["wisely","book"],["wish","left"],["wished","plus"],["wishes","plus"],["wishing","plus"],["wishlist","paper"],["wishy","minus"],["wit","plus"],["witch","person"],["witchcraft","plus"],["witches","witch"],["with","dash"],["withdraw","right"],["withdrawal","minus"],["withdraws","right"],["withdrew","left"],["wither","minus"],["withfromtolang","plus"],["within","box"],["withitness","plus"],["witholding","dollarsymbol"],["without","minus"],["withoutfromtolang","minus"],["withstood","plus"],["witness","person"],["witnesses","person"],["witt","person"],["wittgenstein","person"],["wittgensteinian","person"],["wives","person"],["wizardry","plus"],["wk","square"],["wm","variable"],["wn","n"],["wnn","dash"],["wo","dash"],["wok",""],["woke","up"],["wolcott","person"],["woman","person"],["womb",""],["wombat",""],["women","person"],["womens","person"],["won","plus"],["wonder","plus"],["wonderbras","bra"],["wondered","speechballoon"],["wonderful","plus"],["wondering","mouth"],["wonderment","plus"],["wondrous","plus"],["wong","person"],["wood",""],["woodblock","box"],["woodcutter","person"],["wooden","spoon"],["woodwind","flute"],["wool",""],["word","lowera"],["wording","square"],["wordnet","star"],["wordpress","paper"],["words","lowera"],["wordsfrompos","square"],["wordstring","square"],["wordswithtwouses","square"],["wordy","minus"],["wore","shirt"],["work","paper"],["workable","plus"],["workaround","plus"],["workbooks","book"],["workduring","dash"],["worked","plus"],["worker","person"],["workers","person"],["workforce","person"],["working","paper"],["workingness","plus"],["workings","and"],["workload","paper"],["workloads","book"],["workman","person"],["workout","up"],["workplace","building"],["works","book"],["worksheet","pap
er"],["workshop","room"],["workshops","book"],["workstation","square"],["workweek","paper"],["world",""],["worlding","ball"],["worlds","planet"],["worldview","eye"],["worldwide","planet"],["worm",""],["wormed","right"],["wormhole","box"],["worn","down"],["worried","minus"],["worries","minus"],["worry","minus"],["worrying","minus"],["worse","minus"],["worship","plus"],["worshippers","person"],["worshipping","plus"],["worst","minus"],["wort","plant"],["worth","dollarsymbol"],["worthless","minus"],["worthwhile","plus"],["worthy","plus"],["wos","mouth"],["wot","down"],["would","zero"],["wouldn","minus"],["wound","minus"],["woven","right"],["wp","square"],["wqtxts","dash"],["wr","variable"],["wrap","box"],["wrapped","box"],["wrapper","box"],["wrapping","box"],["wraps","box"],["wreak","minus"],["wreath",""],["wreck","minus"],["wrecked","minus"],["wri","person"],["wright","person"],["wrist",""],["writ","dash"],["write","pencil"],["writeln","box"],["writelns","square"],["writenotification","plus"],["writer","person"],["writers","person"],["writes","person"],["writing","paper"],["writings","square"],["written","box"],["wrong","xcorrection"],["wrongdoing","minus"],["wrongness","minus"],["wrote","pen"],["ws","dash"],["wtg","dash"],["wtjpd","dash"],["wu","plant"],["wuc","dash"],["wuh","dash"],["wujec","person"],["wwc","dash"],["www","asterisk"],["wxdxuwu","dash"],["wxmyyfsdqarlgjvlmof","dash"],["x",""],["xa","dash"],["xander","person"],["xb","variable"],["xbjgehu","dash"],["xc","variable"],["xcommand","square"],["xcompliment","plus"],["xcorrection",""],["xd","dash"],["xemote","plus"],["xenon","planet"],["xf","dash"],["xfavorite","plus"],["xgoodbye","n"],["xgossip","minus"],["xgottago","n"],["xhello","one"],["xhgpiykilr","dash"],["xi","person"],["xiaolong","person"],["xinitiate","one"],["xinsult","minus"],["xintroduce","one"],["xit","dash"],["xix","n"],["xkl","dash"],["xlen","line"],["xline",""],["xm","line"],["xmath","plus"],["xmax","n"],["xmemory","box"],["xmin","n"],["xml","s
quare"],["xn","variable"],["xnfxvjk","dash"],["xnfxwml","dash"],["xngf","dash"],["xnone","zero"],["xnonsense","minus"],["xnp","dash"],["xochi","person"],["xp","line"],["xrest","variable"],["xs","x"],["xskwk","dash"],["xsz","dash"],["xu","variable"],["xuanxue","school"],["xviii","n"],["xx","plus"],["xxi","n"],["xxii","n"],["xxs","dash"],["xxx","dash"],["xxxx","plus"],["xxxxxxb","square"],["xy","variable"],["xylophone",""],["xyz","company"],["xz","variable"],["xzle","dash"],["y","yline"],["ya","dash"],["yachts","yacht"],["yael","person"],["yahoo","person"],["yan","person"],["yang","plus"],["yantra","square"],["yao","person"],["yard","box"],["yarra","river"],["yashwantreddy","person"],["yawn","mouth"],["yawned","up"],["yawning","box"],["yb","variable"],["yc","variable"],["yd","variable"],["ydwpw","dash"],["ye","person"],["yeah","plus"],["year","square"],["yeardob","n"],["yearlearned","square"],["yearly","square"],["yearn","plus"],["years","threerectangle"],["yearvaluea","variable"],["yearvalueb","variable"],["yefuititi","plus"],["yellow","square"],["yellowgod","person"],["yemen","country"],["yes","plus"],["yesterday","square"],["yet","one"],["yetis","person"],["yew","tree"],["yf","variable"],["yield","box"],["yielded","plus"],["yielding","right"],["yields","plus"],["yik","person"],["yin","minus"],["ylen","line"],["yline",""],["ym","line"],["ymax","n"],["ymin","n"],["yml","dash"],["yn","variable"],["yo","dash"],["yodeler","person"],["yodeller","person"],["yoga","mat"],["yogamantra","square"],["yogasutra","square"],["yogasutrachildrenh","plus"],["yoghurt",""],["yogi","person"],["yogic","and"],["yogically","and"],["yogis","person"],["yoko","person"],["yorick","person"],["york","city"],["yorker","magazine"],["yorkey","person"],["you","person"],["youare","equals"],["young","n"],["younger","lessthan"],["your","person"],["yourdictionary","book"],["yourdomain","square"],["yourfolder","box"],["yourlegaldff","plus"],["yourname","square"],["yours","box"],["yourself","person"],["y
ourselves","person"],["youth","person"],["youths","person"],["youtu","dash"],["youtube","video"],["yovel","person"],["yoyo",""],["yp","dash"],["yqypmzmqbhasvox","dash"],["yrest","variable"],["ys","y"],["yt","paper"],["yu","person"],["yussy","person"],["yuste","person"],["yyyy","square"],["yyzvs","dash"],["yz","variable"],["z","zline"],["za","dash"],["zac","person"],["zag","right"],["zags","left"],["zambia","country"],["zamzar","company"],["zang","dash"],["zaria","person"],["zd","variable"],["zeal","plus"],["zealand","country"],["zealic","plus"],["zeals","plus"],["zebulontheta","planet"],["zero",""],["zeros","zero"],["zest","orange"],["zesty","orange"],["zgf","dash"],["zhan","person"],["zhiping","person"],["zig","right"],["zimbabwe","country"],["zinc","ball"],["zingy","plus"],["zip",""],["zippcode","n"],["zither",""],["zjh","dash"],["zl","dash"],["zlen","line"],["zline",""],["zm","dash"],["zn","variable"],["zoltan","person"],["zombie","person"],["zone","square"],["zoned","square"],["zones","square"],["zoo","building"],["zoology","animal"],["zoom","right"],["zoomed","right"],["zooming","up"],["zourabichvili","person"],["zp","variable"],["zqso","dash"],["zs","z"],["zu","variable"],["zucchini",""],["zuhandenheit","plus"],["zwischen","box"],["zygote",""],["zz","dash"],["zzx","square"],["zzxwriteflag","square"],["à","down"],["éclair",""],["égale","dash"],["übermensch","person"],["īṣ","dash"],["ṣ","dash"]] \ No newline at end of file diff --git a/Lucian-Academy-Data/Text-to-Object-Name/t2on_end_text.txt b/Lucian-Academy-Data/Text-to-Object-Name/t2on_end_text.txt new file mode 100644 index 0000000000000000000000000000000000000000..0637a088a01e8ddab3bf3fa98dbe804cbde1a0dc --- /dev/null +++ b/Lucian-Academy-Data/Text-to-Object-Name/t2on_end_text.txt @@ -0,0 +1 @@ +[] \ No newline at end of file diff --git a/Lucian-Academy-Data/Text-to-Object-Name/t2on_repeating_text.txt b/Lucian-Academy-Data/Text-to-Object-Name/t2on_repeating_text.txt new file mode 100644 index 
0000000000000000000000000000000000000000..d4ffe75dfdca53bbce687a5bc32031ad82f646fb --- /dev/null +++ b/Lucian-Academy-Data/Text-to-Object-Name/t2on_repeating_text.txt @@ -0,0 +1 @@ +[5,"These are words to remember."] \ No newline at end of file diff --git a/Lucian-Academy/.DS_Store b/Lucian-Academy/.DS_Store new file mode 100644 index 0000000000000000000000000000000000000000..ef2dfb5dcccd71fcba54e0cb5fdda2c2e7cc1c28 Binary files /dev/null and b/Lucian-Academy/.DS_Store differ diff --git a/Lucian-Academy/Books/.DS_Store b/Lucian-Academy/Books/.DS_Store new file mode 100644 index 0000000000000000000000000000000000000000..3470e0f67f462092cc5a625a12723bada2ef9d22 Binary files /dev/null and b/Lucian-Academy/Books/.DS_Store differ diff --git a/Lucian-Academy/Books/BOTS/Bots 1 2.txt b/Lucian-Academy/Books/BOTS/Bots 1 2.txt new file mode 100644 index 0000000000000000000000000000000000000000..66a73654eec52b1cbc58a3740402844ed8e10137 --- /dev/null +++ b/Lucian-Academy/Books/BOTS/Bots 1 2.txt @@ -0,0 +1,10 @@ +["Green, L 2022, Bots 1 2, Lucian Academy Press, Melbourne.","Green, L 2022",1,"Bots 1 + +1. I started the web business in the future and the present. There were a set number of places. SSI in C was needed for more places. I had fifty As, or at least 50 breasonings for each conclusion. +2. I needed bots as customers. I needed sales for them to make. I needed SSI in C for the backend of the web site so that it wouldn't crash. This required web hooks and LLVM. I needed ISO Prolog so that my apps would run. +3. I wrote down five sentences per paragraph in the 50 As. Like the headache prevention zap, 50 As would make the bots appear and the sales work better. I wrote the detailed basis for the sale. I wrote side apps for SSI up and As to encourage them. I pointed to the next book from each book to help them. +4. I listed all my revenue streams, including music, computer science videos, education, computer software and gold. I could buy property with profits. 
I could automate and control the product in information products. I avoided spiritual customers, but watched for time travellers. I learned the laws of the land and time. +5. My company was within laws to not offer replacements for real degrees, and used an accountant from the future to help obey laws. A lawyer from the present could provide intersectional advice also applying to the future, implying that this would allow continuous positive function. My product was demanded by customers, even though it was thousands of years old. My computer software was demanded, even though it was seen as needing to be emulated, and helped students think in terms of first principles (wasn't just a vintage product), but could slot in with contemporary products. I also had to make sure that I never interfaced with these contemporary products to avoid transgressions, marketing the software as a retro (or foundational) experience. +6. I realised that I could have bots after 50 As, and the degree professionally supported them. I could switch off the bot, allowing myself to visualise the setting in itself, and start a business about my ideas in themselves. I needed to find a way of converting my ideas into the ideas of the future time. I could use a bot to stay with if I couldn't secure housing. I did exactly what I wanted to do, for example walking around as myself for a while. +7. I researched other planets in the galaxy to do business in, and different appropriate times. I worked out that the haemorrhoid objection to living in the times was false. I time travelled to do text to breasonings in a particular period of time in 5689 October to overcome this. I finished off the text to breasonings, consisting of meditation for (not only the university) students, in a few minutes. The meditation students from times away from text to breasonings just meditated or time travelled to use text to breasonings. +8. 
I worked out how to spread my texts and businesses quickly throughout the universe. This required translation and security. I needed to mind read and communicate with aliens. The universe naturally supported pedagogy. I needed to set up a business to support them."] \ No newline at end of file diff --git a/Lucian-Academy/Books/BOTS/Bots 1.txt b/Lucian-Academy/Books/BOTS/Bots 1.txt new file mode 100644 index 0000000000000000000000000000000000000000..2159549ffc828280711450290c3aaf91797cff17 --- /dev/null +++ b/Lucian-Academy/Books/BOTS/Bots 1.txt @@ -0,0 +1,95 @@ +["Green, L 2022, Bots 1, Lucian Academy Press, Melbourne.","Green, L 2022",1,"Bots + +4*800-(16+80+80)/5=3165 sentence paragraphs +3165/80=40 80s + +1. I finished the app by placing the file in the app, automatically finishing the code. The bot transcribed the text. Next, I wrote the specification for the code I had written. Finally, I started the DevOps pipeline in the background. It alerted me if I had finished the code. +2. I replaced the top-level flag when writing a predicate with different predicates. I thanked the bot for doing a good job. The alert told me this if the programmer hadn't finished the code. I ran continuous DevOps tests every two minutes while editing the algorithm. Even if I wanted to continue, I stopped when I had finished the code. +3. I focused on one program, predicate or line at a time. First, the bot wrote the scene. Then, I tried different orders in diff combos to merge possible predicates. In addition, I tried various combinations of replacements for adjacent lines. Finally, I tried different combinations of parts of lines. +4. I changed the diff combos algorithm in DevOps. The bot measured the yield. I focused on one predicate when it replaced another predicate. I focused on one line when it replaced another line. I tried reordering items and tried different combinations of replacements for adjacent items. +5. The bot offered the bounty for help. I wrote the question. 
I wrote the answer. I determined that the solution needed redrafting. So I provided the reward for help. +6. The bot included instructions. I examined the model answer. I substituted my values. I found the answer. I wrote units and diagrams and verified the order of magnitude of the solution. +7. The bot delivered the goods. I manufactured the goods. I packaged the goods. I sent the goods to an address. Finally, I confirmed receipt of the goods. +8. The bot bought food. First, I wrote down the body's required foods. Then, I modified it for variety. Finally, I found the most delicious recipes. I bought fresh ingredients and made recipes. +9. The bot cooked the food. I listed the food needed to cook. I wrote a Gantt chart with the cooking times for the food. There was one item to cook. I taste-tested the food. +10. The bot spiritually cared for the person. I found the person. I took them to their destination. I asked them which food they would like. I gave it to them. +11. The bot discussed writing an algorithm with the person. I am the bot. I asked what the person would like to program. The algorithm explored a rigorous aspect, such as logic or another formula. It had a use, such as web, editor or graphics. +12. The bot wrote the chain of uses for an algorithm. I noticed how the algorithms developed, involving new methods. I developed skills in using particular commands. I wrote my version of known commands. I learned how to write customised commands. +13. The bot found shortcut commands. I noticed the new use of foldr, a customised predicate. I questioned why foldr was not in Prolog. I gave reversed input to foldl and called it foldr. I used the command with append and string_concat. +14. The bot started concentrating on computer science by writing input and output in conjunction with types. I noticed commands to check types and modes. First, I wrote the command to check the type statements. Then, I wrote the command to check mode statements. 
Types relied on modes, which we could have by themselves. +15. The bot could customise functional predicates by giving them new arguments. The argument determined a sub-predicate. Or it determined a sub-argument. I could write and call the predicate with a functional call to an intermediate predicate. I divided the predicate into different functions and called them functionally. +16. The bot determined that the new argument was reverse sort. I wrote reverse(sort(list)). I used a typing predictor and pressed the tab to accept a suggestion. I found all results and placed them in n lists. There was a command to find unwanted non-deterministic results and convert code to C. + +17. The bot wrote its software. I combined predicates in new ways. I listed the commands in both predicates. I wrote their types. I combined combinations of commands, converting output from one into input in another. +18. The bot inserted a functional cut, which only affected the not, findall or if-then clause. I wrote algorithms by combining predicates in new ways. I could combine parts of predicates together to give particular output. I could combine parts of predicates together, mimicking hierarchies of methods, for example, modifying List Prolog code by converting it to a Finite State Machine, making several changes, and converting it back. Or, I could find the best combination of changes to lines found with diff. +19. The bot simplified the interpreter using constraint propagation. I wrote an interpreter in a few lines. First, I wrote the input and output, for example, 1 and 1, and 2. Then, I wrote the algorithm C is A+B. Finally, I wrote an interpreter to compute this. +20. The bot wrote separate algorithms that had different functions. I wrote an algorithm in a few lines. I eliminated if-then clauses using constraint propagation. In addition, I eliminated findall clauses using constraint propagation. However, I eliminated unnecessary not clauses using constraint propagation. +21. 
I diffed (sic) a line by splitting it into one character per line. Then, I wrote the new methods within new methods for writing an algorithm. I started with the first new methods. Then, I meditated and wrote algorithms with these new methods. Finally, I wrote the new algorithm after writing a required method. +22. The bot helped create a new method in programming. +I tracked new methods with time. I wrote down the new methods in the algorithm. I could work out any combination of 8 commands and variables with Combination Algorithm Writer (CAW) = 8! = 40,320 possibilities to test, and I could test 16 changes with DevOps = 2^16 = 65,536 possibilities to test. As well as testing for combinations of lines and characters, I computed to concatenate closing brackets at the end of tested function names. +23. The bot wrote the debugger, helping them write in a positive-functional manner. I used a mind-reading debugger, which rewarded me for thinking of alternative thoughts and letting me write the debugger. Instead of \"taking my point\" and telling me the answers or not allowing me to learn, the debugger provided spiritual practica so I could pedagogically mind map details when debugging. I found myself conversing with it, which I could do with neuronetworks, about new features and ways to tackle the problem. Just as writing a program to find a mathematical result enabled neuronal development and neuroplasticity, writing the debugger familiarised me with algorithm and debugger algorithm-level methods for debugging. +24. The bot was mindful of intellectual property and learning while developing. I wrote the algorithm finder for the algorithm to configure the algorithm differently. I wrote new options to call the algorithm with, running different features. I wrote modules that I could plug in. I could quickly generate an algorithm that performed the task. +25. The disabled bot preferred using new predicates wherever possible, with descriptive names. 
The person wrote a style sheet for how they wrote algorithms, such as unique names, indenting and choice of synocommands (sic). The person chose appropriate names for predicates and variables, allowing for the specific style of the algorithm. They used tabs in longer algorithms and a space for educational algorithms, which could be read more quickly by disabled learners. I used foldr append as a shortcut for append list. +26. The bot tested that the instructions matched the program. First, I wrote instructions for using an algorithm, such as preparing, running, and a walkthrough. Next, I explained how to install and register a password if necessary. Then, I explained how to run the algorithm with examples. Finally, I explained how to use the algorithm, describing the input and output at each step. +27. The bot varied the description of variables with one variable (per variable) per line. I wrote instructions for running an algorithm in the same format, with labels for inputs and outputs and examples. For example, I used the format predicate(+In1, +In2, -Out), where + denotes an input and - denotes an output. Then, I explained that In1 had the type [string1,{number1}], for example [\"a\",[1,2,3]]. I labelled string1 as the name and number(s)1 as the scores. + +28. The bot found and suggested files with predicates called by the file. The algorithm presented useful statistics about the programming session, such as the number of new methods and suggestions, such as registering repositories called from the repository. I found the dependencies of predicates. For example, I split the predicate into groups of hierarchies that might continue back on the path and found the dependencies regarding the branches. In addition, I examined the individual commands to check for static results and ensuing editing out of them. +29. When calling some predicates, the bot uniformised the word file to an atom, not a string. 
The person provided feedback about whether the instructions were clear, could have more detail, or need troubleshooting advice. I hired the tester. They surveyed the Read Me file for the open-source repository, finding improvements, tips and suggestions. I added troubleshooting advice and changed from grammars to term_to_atom for use with files, which prevented the need for some troubleshooting advice. +30. The bot compiled specific algorithms in machine language. I wrote instructions for writing new methods; for example, I wrote a C-code generator to call from within Prolog. The compiler would compile this code at compile time. The compiler wouldn't use this code if trace were used, for example, from the start of the algorithm. However, the compiler would compile the entire program to C if it didn't contain complex backtracking. +31. The bot converted the Prolog grammar to C. I gave positive feedback about the new method using a grammar in C. The grammar was a string interpreter in C. It contained no choice points. It had lookahead to test but not get the next token. +32. The bot grouped commands. I wrote the best method, a combination of string_concat and append. I wrote flatten([[[a]],[b]],A), A=[a,b]. I wrote foldr(string_concat,A,B), B=\"ab\". Then, I wrote flatten_foldr_string_concat([[[a]],[b]],A), A=\"ab\". +33. The bot notified the user or tried several approaches to debugging the uninstantiated command error, pending approval. First, I found the modes throughout the algorithm, debugging mode errors. Second, I wrote type and mode statements for the algorithm's input and output. Third, I followed the types and modes of the commands from the type and mode statements. Finally, I found uninstantiated command errors. +34. The bot optimised the predicate and split and joined predicates into those with different functions. I verified that the predicate was customised and that it didn't have unnecessary constants in it. I checked for expressions such as A=B+0. 
I changed it to A=B. Then I changed B to A and deleted the expression if possible. +35. The bot found new combinations of parts of functional calls. I wrote the possible new algorithms using 540 degrees (the bag algorithm). At first I changed arguments [a,b],[a,c],[d,c],[d,e] into [a,e]. Then I did this with algorithms. This technique applied to groups of characters and, therefore, not groups of lines or groups of predicates. +36. The bot entered input and took the output from a Prolog algorithm using a text file. I wrote longer reused predicates, the need for which I detected with a neuronetwork that examined standard out. I ran C from Prolog to process standard out. I found the need for the maze algorithm. I mind-read and helped the student with their desk and missing phone. +37. The bot quickly added the variable type to type statements in List Prolog - the text method stored algorithms as data that could be manipulated and analysed. Then, I wrote the List Prolog algorithm. First, I strictly typed the variable as a string, which failed when the variable was a variable (A=_). Or, I typed the variable as any, which succeeded if the variable was a variable (A=_). +38. The bot tested the series of websites as an SSI algorithm. I propagated the constraints and optimised the predicate as a result, for example, simplifying two interpreters into one. I wrote one interpreter that called the other one. These could be LPI, SSI or SSI Web Service (SSIWS). If SSIWS crashed, a background app would detect its life sign stopping, and the algorithm would restart it. +39. The bot concluded the icon was a pear, and the sound effect was a trumpet trumpeting. Next, I detected educational details for a student's conclusion, finishing them off quickly if they came to it. The student neared the conclusion, triggering a mind-reading alert. Next, I detected how many details the student was with-it over, helping finish off, correct, or finish all of them. 
When they reached the conclusion, a level clear sound effect and icon would appear. +40. The bot spelled \"SSIWS\" backwards. I generated data for the predicate from type and mode statements. Then, I matched the data to the type and mode statements. I could generate the type and mode statements from the algorithm and then do this. Next, I read the names and data in the algorithm to find more specific data to generate, such as types of types (music, names or mathematical data). +41. The bot replaced reused groups of commands with a single command. The method was correct and good (simple enough). The method was msort, select or writeln1. One of these provided the right result. In addition, this command singly solved the problem. +42. The bot cut some of the loose choice points. The algorithm answer on the forum was new. I checked for the same or similar questions. I checked whether their answer already existed or was new. For example, I put if-then antecedent or not clauses in a separate predicate with cut at the end. +43. The bot wrote a time-out to remove loops he was suspicious were infinite loops. I verified that an input was not a variable (A=_). I questioned whether the free variable expression could be simplified and infinite loops eliminated. A professor said Prolog would still process them. It might work in Haskell. + +44. The bot removed an infinite loop in the middle using constraints. I multithreaded continuously integrating blocks of code. The blocks went through pipelines. The blocks were helpful because the program could save them. I could skip over the wrong blocks. + +45. The bot found data to solve a problem at the frontier of research, for example, converting a state machine to a register state machine. I generated new inputs with two inputs going together. The inputs were of the same or compatible types. They connected using an ontology. Finally, I tested that the output was correct. +46. 
The bot looked at matrix images while programming to help remember to enter the correct variables at the right time. I wrote educational algorithms, which let the human make the decisions. The human commented. They made a suggestion. There was a mistake detector. +47. The bot could copy the data. I converted the state machine to a register state machine. The state machine performed recursion by storing the addresses of records in memory. The register state machine added one to move from one memory address to another. I added one to each item in the matrix. +48. The bot helped by reminding me of code with similar commands. I wrote the algorithm to predict its heading and help write it. I identified the type of algorithm. I recognised the feature. I helped with unknown features by writing code using similar commands. +49. The bot passed around variables in variables in variables. I used a single variable to store multiple variables. The variables were in order or not in order. I represented the screen as a list of lists (rows of pixels). This technique was faster than using a list of pixel items. +50. The bot flew over the complete algorithm. I replicated the method for representing a screen to represent 3D voxels. I calculated the positions of the voxels given textures and lighting, stored these and represented the field. I rendered the movie. I had movies for data flow analysis, bottleneck solutions and images of missing features, such as findall in a converter. +51. The bot optimised the converted code. I started in Prolog without choice points to convert to other languages. I optimised this code. I converted to C. Or, I converted to C with choice points, such as findall and cuts. +52. The bot helped the teacher customise materials for the disabled child. The person's life influences their job. They needed an algorithm to automate or speed up neuronetworks in Prolog. The data was small but relevant. 
The neuronetwork was called with a command in Prolog and compiled in Python. +53. The bot checked the wireframe model and then rendered the scene. I wrote the 3D scene using neuronetworks. I specialised in fine art sets and make-up. However, I retained creative control over the music and appearance. The algorithm finished the script and everything else and found appropriate camera work. +54. The bot ran an interpreter that stored files in memory, speeding up the algorithm. I wrote an algorithm that helped find commands and files. It used the text to help predict the commands and file names. It created files from the commands, auto-populating them with predicted data. I caught missed ideas and possibilities and completed large assignments in a short time. +55. The bot helped find more relevant details on the lines. I wrote an algorithm that alerted me to new algorithms, methods and research. It suggested storing computer language grammars in one place, writing converters, grammar processors and other algorithms. I wrote modules for my interpreters. I detected the error of two equivalent pathways in the interpreter and deleted one. +56. The bot automatically triggered the CI/CD pipeline on a change to test changes. I wrote the magical algorithm to show the diff results as an HTML file. I replaced repeating items in the before and after files with new numbers, increasing in each file. I could select the parts I wanted and could edit the file. I could instantly see results in the tests of edits. +57. The bot quickly compiled and searched for changes in backups. I generated types of data for testing. If the order of results had changed, I entered the same input and recorded the new output. I wrote an algorithm that asked me to write test cases for new features. I also deleted test cases for deleted predicates. +58. The bot modularised the algorithm and deleted infinite loops.
Next, I compiled changes to the repository over time, finding the minimum changes using DevOps. Then, I checked each change since the repository reached maturity, testing and minimising it. If something was missing, I put it back or added needed components. Finally, I debugged and optimised it. +59. The bot debugged and wrote new code as part of CI/CD pipelines. I touched up pretty print to have possible patterns in a data file. I signalled one of these data sets or modifications to them. I wrote the data like a pretty printed file. I constantly tried optimising code as part of CI/CD pipelines. +60. The bot trained the compiler on the interpreter's tests. I compiled and ran an interpreter with itself, testing any new predicates. I tracked the features the interpreter needed to run itself. For example, I supported multifile algorithms, included libraries and dynamic variables. I split predicates into requisite functions and tested each of them. + +61. The bot traded off expansionism against contractionism of code when it felt natural. I tested the code against the criterion of intuitiveness or transparent clearness. I wrote the C interpreter. It started in the middle of the Prolog interpreter. I converted the interpreter to C by allocating memory to C. +62. The bot wrote the series of programs necessary to process files. I wrote a file processor algorithm writer, which took starting and ending frames and produced a list of rules to plug into the file processing algorithm. I wrote the machine code, the hexadecimal version of the code, and the lowest-level interface to the CPU. Assembly language contained a linker, which linked to modules needed for an algorithm. These might include memory-handling routines. +63. The bot optimised the circuit, removing redundancy. The game involved the player's reaction and touch typing skills. I created the Central Processing Unit (CPU). It processed data at a specific cycle rate.
It interpreted, processed and executed instructions using logic and transistors. +64. The bot verified the generated type statements and data. I solved the bug by changing the mode statement. I developed thoughts about politics, religion, medicine and philosophy. These represented the outside world's view, which would receive the ideas. The input and output were correct. +65. The bot tested the physical limits of the system. I found when the file processing algorithm was needed by treating the disk as memory and checking the data transformations. I wrote As or high distinctions about politics, religion, medicine and philosophy. These included the most prestigious thoughts or 4*50 As. Professors had enough correct ideas. +66. The bot entered the simulation and started a movement. The programming language had the appropriate features to complete a task. In addition to politics, religion, medicine and philosophy, I wrote on my other departments, such as pedagogy, Computational English, mind reading, time travel and immortality. The high distinctions on these topics were checked and documented. The professors also accepted and checked their thoughts. +67. The bot had medicine and worked at the usual rate. I eliminated duplicate sentences in the database. I developed five thoughts per day involving multiple departments with the Breasoning Algorithm Generator (BAG). I wrote the original sentence. I wrote perspectives (with the latest literature) on the latest details about the idea. +68. The bot treated mind reading and high distinctions as dealing with rectangular prisms representing data. I instructed the person to write an algorithm with an algorithm. I gave high distinction to a separate character representing me. The character listened to everyone in the company. They helped them with their work. They gave 4*50 As and uncovered new possibilities. + +69. The bot reduced the number of the predicate's arguments by combining them.
I wrote three points, multiplied by three analysis levels, for a thought I developed. I wrote the conclusion. The 4*50 As were divided into three by three. To speed the algorithm up, I wrote the analysis based on the first sentence of the previous section in the second and third sections. +70. The bot found the better of two lines to include in the code. I wrote intuitive code by splitting lines in the same space with Lucian CI/CD. Splitting into lines was better than grouping them to separate tests in comments. I wrote the test and code, tested it and repeated this until there were no more changes. Separately, I wrote thoughts on music to analyse texts from a \"life and science\" perspective. + +71. The bot automatically fixed bugs in predicates given specs. Lucian CI/CD tested each branch bottom-up rather than the whole algorithm bottom-up. Lucian CI/CD tested predicates bottom-up. This method allowed testing for combinations of more changes. In addition, it allowed narrowing bugs down to individual predicates. +72. The bot chose from the hierarchy using checked, unchecked and partially checked checkboxes. Lucian CI/CD allowed specifying the branches of the algorithm to test and the types of tests to run. I listed the predicates in the touched files and the predicates they called. I trained BAG to pair particular pairs of words. I included higher-order commands, such as foldl and maplist. +73. The bot stated that the new method of input was bits. I numbered the predicates in Lucian CI/CD. These numbers were in bits. I started with the first level predicate, numbered one. I progressed to second-level predicates numbered two, and so on. +74. The bot selected the graphic of the branch to test. The 3D scene involved new camerawork. I ordered the predicates in bottom-up order. It was a post-order depth-first traversal. I concentrated on testing each branch one at a time. +75. The bot optimised the code by removing unconnected commands that numerically added nothing.
I stopped testing the repository when a test failed. If there was a loop, I grouped the predicates to test together. To avoid individual parts of large loops being untested, I tested them in smaller parts. There was an optimiser to remove extraneous commands that didn't change data. +76. The bot only tested predicates with changed predicates below them, but not if the instance of the predicate had been tested already, where the test was the same. I only tested predicates with changed predicates below them. I tested predicate groups, whether or not an instance had been tested. I recorded a predicate that was tested. Even if another predicate had called it, I didn't test it again. + +77. The bot found arguments by running the Breasoning Algorithm Generator (BAG) on Immortality 1 ideas. I found the new method for the algorithm by running BAG on the algorithm. I synthesised common and uncommon parts, finding rare properties, such as modules, functional calls, use of maplist or foldl or foldr or shorter algorithms. Or, I developed add-ons that read like literary classics. I developed thought histories that lasted long enough into the future. +78. The bot applied human experience or ways of thinking from an idea to an idea. I developed ideas from old essays, perspectives, theory books or copies of my thoughts. I wrote on induction, interpreters and research, applying my ideas and helping others. I went through my to-do folder, email and copy ordered. The bot also applied politics to an argument (where different algorithms represented different sides of politics, requiring parts of other algorithms) and fine art to an algorithm (an object represented by an add-on). +79. The bot could use get and put predicates to test and replace them using numbers. I found a new method giving the same result, for example, using subterm with address to search for, process items and build lists. 
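The retesting rule described above — test only predicates with changed predicates at or below them in the call graph, and test each predicate at most once, even if another predicate calls it — can be sketched as follows. The call graph, function name and data are illustrative, not from the source:

```python
def predicates_to_retest(calls, changed):
    """Return the predicates with a changed predicate at or below them in
    the call graph, memoising so each predicate is examined (and would be
    tested) only once, even when several predicates call it."""
    memo = {}
    def affected(p):
        if p in memo:
            return memo[p]
        memo[p] = p in changed   # provisional value guards against cycles
        memo[p] = p in changed or any(affected(q) for q in calls.get(p, []))
        return memo[p]
    return {p for p in calls if affected(p)}

calls = {"main": ["a", "b"], "a": ["c"], "b": [], "c": []}
print(sorted(predicates_to_retest(calls, {"c"})))  # ['a', 'c', 'main']
```

Because of the memo, `c` is examined once even though both `a` and (indirectly) `main` depend on it, and `b` is skipped entirely.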
In Lucian CI/CD, I loaded all the predicates needed for testing the predicates the main predicate depended on from all the files (including \"includes\" statements). In addition, I tested the current group of predicates, clauses from the same predicate or those in the same loop. It could load all predicates and only change the tested ones, but the current algorithm would only test the current predicates. Only tests for these current predicates would be used for them. +80. The bot worked out the dependencies, main file name and predicate. I described and stored the input with words and instructions on processing it, for example, with a linear search or subterm with address. In Lucian CI/CD, I always included the \"includes\" statement because all files were written with comments and certain predicates. The main file was always loaded, and this loaded all the included files, ready for testing. The main file contained the primary file name for testing and the primary predicate name and arity for finding dependencies. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/BOTS/Bots 2.txt b/Lucian-Academy/Books/BOTS/Bots 2.txt new file mode 100644 index 0000000000000000000000000000000000000000..ef5c81beec662c4f8de292ff5a14408516e2b2da --- /dev/null +++ b/Lucian-Academy/Books/BOTS/Bots 2.txt @@ -0,0 +1,87 @@ +["Green, L 2024, Bots 2, Lucian Academy Press, Melbourne.","Green, L 2024",1,"Bots 2 + +1. The bot tested graphics as data and algorithms on the fly, reporting if they had errors. The graphics algorithm maker outputted the animation algorithm, which stored pixels in memory and could be translated to HTML. I wrote tests in Lucian CI/CD as %a(A).\n%A=1.\n. Other tests could be written as %a., %a(1). or %a.\n%false., etc. In addition, these two lines could be written on one line. +2. The bot supported multiple algorithms in a repository with numerous main files and main predicates. 
I tracked how the predicate was added to over time, rating its simplicity, functionalisation and possible problems with under or over-verification. I checked the List Prolog Package Manager (LPPM) registry for dependencies checked in Lucian CI/CD. I provided an information bureau at the end of Lucian CI/CD, which pointed users to LPPM if files were missing, to the main file if files were missing or predicates were skipped, or to tests if a test was missing or wrong. I supported multiple main predicates. +3. The bot quickly found errors in the dependencies, the main file or the tests, represented as numerical data. I analysed algorithms as patterns of numbers, predicting them, completing them and finding patterns in them and their diagonalisations. Lucian CI/CD reported an error on the wrong main file data. There was an error if the main file was missing or didn't contain the main predicate with the correct arity. A helper algorithm could find the suitable main file with this predicate or the predicate for a main file. +4. The bot updated the algorithms automatically and converted old data files to the new format with a new version of the algorithm. I found the algorithm maker that the neuronetwork spent so many resources on to save resources. In the Lucian CI/CD documentation, I instructed users to delete Github_lc/tests_xxx to start again after a version change or if the algorithm or data had been modified. The data file format may change with a difference in the algorithm, or the data file might become corrupted. I stored all data in a separate folder to update only the algorithms. +5. The bot included the compatible Lucian CI/CD version in downloads. The algorithm maintained fast performance as it developed, with all popular features and supporting the same features. The other algorithms seamlessly integrated with it and its features. 
The main_file.txt in Lucian CI/CD contained [\"a.pl\", \"a\",1], the primary file name in the repository, the first predicate in it and its arity (the number of arguments). The team reverted to a previous version of this file if there was a problem. +6. The bot critically examined the algorithm in the degree. I tested the algorithm with test input generated from grammars and input that the grammar couldn't generate to test for errors. The Lucian CI/CD tests tested for variables in the form A, not _123. I simplified my algorithms to translate to a new language if my programming language became obsolete, and simplified my texts to translate and bring to the new world. I tested the latest algorithms and grammar-checked the new texts. +7. The bot found the algorithm either from input or input and output. I found the algorithm from the input; in other words, I wrote the algorithm and then the result. Lucian CI/CD warned if there were more than seven changes to a set of current predicates. These were either in chunks of changed lines or changed lines. The number seven could be increased because predicates were shorter than algorithms, which it was designed for. +8. The bot printed the log of the predicates in order of testing. I could tell the modes from the data and the variables I labelled. The Lucian CI/CD diff HTML file contained the finished deletions, insertions and changes, or two columns, one for the initial and one for the final files. This table included groups of predicates in each row. I optionally printed the tests for each group of predicates. +9. The bot supported other programming language parsers later. I found the uses for the depth-first search in different areas of study. I always included new tests and comments in the latest version of the algorithm to test in Lucian CI/CD. The limitation of tests, comments and data was that they couldn't be tested for, so the new comments always replaced the old.
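The seven-changes warning above depends on counting chunks of inserted, deleted and changed lines between the old and new versions. A minimal sketch with Python's difflib; the function name and sample lines are illustrative:

```python
import difflib

def count_change_chunks(old_lines, new_lines):
    """Count the chunks of inserted, deleted or changed lines between two
    versions, as a CI tool might before warning of too many changes."""
    sm = difflib.SequenceMatcher(a=old_lines, b=new_lines)
    return sum(1 for op, *_ in sm.get_opcodes() if op != "equal")

old = ["a(1).", "b(2).", "c(3)."]
new = ["a(1).", "b(99).", "c(3).", "d(4)."]
n = count_change_chunks(old, new)
print(n)  # 2: one changed line, one inserted line
if n > 7:
    print("warning: more than seven changes to this set of predicates")
```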
The alternative was to do this with tests and comments but test data, treating it like another possible programming language, but it was challenging to track access to it for generating dependencies. +10. The bot ran the complexity viability algorithm to see if a neuronetwork was necessary. The algorithm explained the algorithm it completed and why it was better than other methods. Lucian CI/CD tested combinations of changes from old and new versions of the repositories. Lucian CI/CD took the list of non-comment, non-includes predicates bottom-up and tested groups of clauses or predicates enclosed in loops, saving working combinations as it went. Lucian CI/CD was one tool that completed the algorithm, operating on human work or relying on algorithms to debug and complete predicates. +11. The bot wrote the game version of the algorithm for the disabled student. The algorithm simplified the text into different levels of difficulty for disabled students. The Lucian CI/CD test changed the predicates' clauses. It was a significant idea to backdate repositories if tests became less or more complex and manually adjust the code. The disabled student wanted the same number of details. +12. The bot inserted the predicates like a frog reaching a lily pad. The disabled student copied the text with the answer. Lucian CI/CD recognised predicates from their name and arity and inserted the new ones in the original order (sorted). It didn't need the predicates that called these predicates, as testing only tested the current predicates and the predicates they called. If there was a loop of predicates, this was tested in a group of current predicates. +13. The bot used a neuronetwork to find or debug a predicate. The Combination Algorithm Writer (CAW) was sped up using a neuronetwork in the interpreter, with a question of whether the user wanted to change to it.
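Testing combinations of changes from the old and new versions and saving the first working combination, as described above, can be reduced to a toy sketch. Each change site is assumed to be a pair of (old chunk, new chunk), and the test predicate is a stand-in supplied by the caller:

```python
from itertools import product

def best_combination(chunk_pairs, passes_tests):
    """Try every combination of old/new chunks in order and return the
    first combined version of the code that passes the tests."""
    for choice in product(*chunk_pairs):
        candidate = [line for chunk in choice for line in chunk]
        if passes_tests(candidate):
            return candidate
    return None

# Each pair is (old chunk, new chunk); a chunk is a list of lines.
chunk_pairs = [
    (["f(X,Y) :- Y is X + 1."], ["f(X,Y) :- Y is X + 2."]),
    (["g(X,Y) :- f(X,Y)."], ["g(X,Y) :- f(X,Z), f(Z,Y)."]),
]
# Hypothetical test: accept only versions containing the doubled call.
ok = lambda lines: any("f(X,Z), f(Z,Y)" in l for l in lines)
print(best_combination(chunk_pairs, ok))
```

Here the search keeps the old first chunk and the new second chunk, illustrating how a working mixture of versions can beat either version whole.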
Lucian CI/CD appended the old predicates to the new ones to find the dependencies and found the best combination of old and new chunks in groups of current predicates in these. Lucian CI/CD found the groups of current predicates or clauses with the same name and arity. CAW found combinations of new, untried commands. Lucian CI/CD found combinations of previous chunks (consecutive lines of added, deleted or changed code) from the current and earlier versions in order. +14. The bot quickly completed the algorithm with the help of Lucian CI/CD. The timer timed how fast the programmer completed the task, noting how complexity added to the time. Lucian CI/CD tested whether a deleted or inserted predicate was or wasn't needed or combinations of added, inserted or changed (including reordered) parts. If the predicate and test have been deleted, there is no test for the predicate, and it will be deleted. There will not be a test failure if there is no test, but the rest of the code doesn't require the predicate. +15. The bot edited the algorithm online with a CI/CD tool. The algorithm modularised the extended predicate, generating test data and reusing the modularised predicate. I identified the clause numbers to group clauses in Lucian CI/CD and assigned old or new labels, respectively. The old and new tags were abstracted as insertions were like deletions when finding their best combination. The on-the-fly CI/CD algorithm suggested necessary changes before or after saving. +16. The bot checked if a missing test should be added to validate a needed predicate. The algorithm simulated the scene of programmer actors with their political viewpoints. Lucian CI/CD continually tests inserted or deleted predicates, including predicates they call. All called predicates need a test or will be omitted, causing failure for predicates that call them. There will be a warning if a current predicate has no test. + +17. 
The bot listed types of relationships to help order items pointing to an item, preserving their order from the dependencies. The find dependencies algorithm in Lucian CI/CD should find that blue is before loop(red, start) because both red and blue lead to the start, and that blue should be before start, given start:-red, start:-blue, red:-start and blue:-null. There is a loop from the start to red and back. Both red and blue lead to the start in start:-red and start:-blue. Blue should be before the start, indicating it is before or lower down than loop(red, start) on the depth-first post-order graph. +18. The bot tightened the ordering of predicates in Lucian CI/CD. Lucian CI/CD ran diff on sets of old and new current predicates to find inserted, deleted and changed chunks of lines. These chunks were ordered old, then new, for example, old: \"%a(2)\", \"a(1).\" and new: \"%a(1)\" and \"a(2).\". However, if lines 2 and 3 were selected, they would be incorrectly ordered with the code before the comment. The merge predicates algorithm was run to group comments and predicates with the same name and arity to rectify this misordering. +19. The bot kept similar predicates together. The merge predicates algorithm placed predicates with the same name and arity of existing predicates after them. The merge predicates algorithm found the first group of predicates with the same name and arity. It preserved the new or old label on the predicates. Then, it inserted these predicates after the similar predicates. +20. The bot used nested loops to process strings. The first find tests algorithm used consecutive append commands. For each line, I found the next middle characters. I found whether these middle characters were tests or test results. I turned choice points from consecutive appends into C loops. +21. The bot could find test results with multiple values or point to tests that detected vars or unwanted extra results. I found the tests in files with Lucian CI/CD. 
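The ordering in item 17 — blue before the loop of red and start, given start:-red, start:-blue, red:-start and blue:-null — amounts to a post-order traversal that groups loops so they are tested together. A sketch using Tarjan's strongly connected components algorithm, which emits components callees-first; the encoding of the clauses as a call graph is an assumption:

```python
def bottom_up_groups(calls):
    """Order predicates bottom-up (post-order), grouping predicates that
    form a loop so they can be tested together. Tarjan's SCC algorithm
    emits components in reverse topological order: callees before callers."""
    index, low, on_stack, stack, groups = {}, {}, set(), [], []
    counter = [0]
    def strongconnect(v):
        index[v] = low[v] = counter[0]; counter[0] += 1
        stack.append(v); on_stack.add(v)
        for w in calls.get(v, []):
            if w not in index:
                strongconnect(w)
                low[v] = min(low[v], low[w])
            elif w in on_stack:
                low[v] = min(low[v], index[w])
        if low[v] == index[v]:          # v is the root of a component
            group = []
            while True:
                w = stack.pop(); on_stack.discard(w); group.append(w)
                if w == v:
                    break
            groups.append(sorted(group))
    for v in calls:
        if v not in index:
            strongconnect(v)
    return groups

# start :- red.  start :- blue.  red :- start.  blue :- null (no calls).
calls = {"start": ["red", "blue"], "red": ["start"], "blue": []}
print(bottom_up_groups(calls))  # [['blue'], ['red', 'start']]
```

blue comes out first, before the loop of red and start, matching the ordering the text requires.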
The second find tests algorithm found comments matching tests and test results using functor and arg. I found the test and then the first matching test result. I found unique pairs of tests and test results until none were left. +22. The bot brought noteworthy correspondence to my attention. I gave back similar pedagogical details that others gave to me. I detailed the work. There were twenty pages or an exhaustive number of details for a business. I counted the sentences in the letter and replied to it. +23. The bot hid the password and browsed the text files. I wrote the account area web app with the Vetusia engine. The Vetusia engine passed arguments from pages converted from Prolog predicates. The input was through a form, was processed and passed to another page. The converter converted Prolog to Vetusia Prolog. +24. The bot helped the disabled student to access the material. The web app taught courses. I read the course. The Vetusia engine allowed the text files to be downloaded. I answered the questions. +25. The bot didn't accept some replies while encouraging others. I wrote the assessment and plagiarism checker Vetusia engine. There were fact, open-ended and critical analysis questions. The students could write creatively in an application, poem or their algorithm. The plagiarism checker checked against student responses, catching early attempts and helping students. +26. The bot attached notes to the database. The Vetusia engine helped students with questions. It helped clarify definitions, find deeper meanings or develop a book of algorithms. They could add their creations to their photo album. Rather than a neuronetwork, it searched a database. +27. The bot didn't store any private details. I accepted payment using State Saving Interpreter Web Service (SSIWS). I made the cart software using SSIWS. The algorithm used a third-party payment service. The user paid with a single touch. +28. The bot warned on illegal backtracking code.
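The "find tests" passes described above pair a %Query. comment with the %Var=Value. comment that follows it (the %a(A). / %A=1. form appears earlier in the text). A regex-based sketch; the exact comment grammar is an assumption based on those examples:

```python
import re

def find_tests(source):
    """Pair each '%Query.' comment with the '%Var=Value.' comment that
    follows it, yielding (query, expected result) pairs."""
    tests, pending = [], None
    for line in source.splitlines():
        m = re.fullmatch(r"%\s*(.+)\.", line.strip())
        if not m:
            continue
        body = m.group(1)
        if pending and re.fullmatch(r"\w+\s*=.*", body):
            tests.append((pending, body))   # (query, expected result)
            pending = None
        else:
            pending = body                  # a query awaiting its result
    return tests

src = "%a(A).\n%A=1.\na(1).\n%b(B).\n%B=2.\nb(2).\n"
print(find_tests(src))  # [('a(A)', 'A=1'), ('b(B)', 'B=2')]
```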
The Vetusia engine converted Prolog code to Vetusia Prolog code. The read_string call was changed to a form. The form results were processed after being converted to read_string's result. There was no backtracking to previous pages' algorithms. +29. The bot published its half of the conversation. I wrote the chatbot. I programmed my philosophy in an ordered way. It helped students think of new angles on philosophy. It asked its students if it could publish their conversations. +30. The bot helped the customer with improvements and different algorithms. I wrote the sales algorithm. I solved the binary opposition with each customer. The chatbot helped customers with an A for their thoughts about the product, including algorithms. The chatbot found whether the customer wanted a particular direction and helped them with it. +31. The bot asked, \"Where's my 'L' thing? (light)\" I tested the website. I made a file to input into the web form, while it was in the form of a State Saving Interpreter (SSI) (non-web) algorithm. The file could be inputted into a system of shell scripts on a server. Or it could be inputted into a List Prolog algorithm and output collected. +32. The bot made everything equally easy and user-friendly. I tested the courses. I tested with software. I tested with people. I measured the binomial distribution of results and ranked the courses and ideas by difficulty. + +33. The bot developed a type system that checked for type errors at compilation. I created a strong type system for State Saving Interpreter (SSI) at run time. It allowed trace to be used online, where only the first instance of a called predicate needed to be recorded to retry to save memory. Users used strong types to learn and verify data at compilation and run time, preventing errors and optimising the types system. If the type errors were printed, the predicates given and the erroneous data could be caught. +34. The bot found the types matched a call earlier in the predicate.
The compilation type system didn't act like a compiler that traversed a type state machine but found input and output types from the code. It recorded the type flow, including decomposing or building lists and verifying or converting types. It could check algorithms bottom-up, finding incompatible types passed to predicates. Using Lucian CI/CD's find dependencies algorithm, the type checker checked that types built bottom-up (where running the algorithm bottom-up found erroneous predicates) were returned and correctly fitted into calling predicates. +35. The bot used Lucian CI/CD with the compilation types checker plug-in to detect erroneous types. The predicates were only checked when all the predicates they called were checked. If \"predicate a\" called \"b\" and \"c\", \"b\" and \"c\" were checked, followed by \"a\". Then predicates that called a were checked. This way, type errors could be found and resolved in individual predicates. +36. The bot recorded the type flow in branches. I found types in branches. The types exiting the branches needed to be compatible, i.e. the same or one fitting inside the other. For example, [_] = [_], or {number} = []. So, [[number]] may fit inside [any]. +37. The bot processed clauses from base cases up, where the most specific types were kept. Lists, which could match empty lists, needed to be recognised from individual clauses, i.e. list decomposition or building. Where a multiple clause predicate could be tested against specifications, the most specific fit-into types from the clauses needed to be kept. This property was like a painting. For example, given the types A={{number}} and A={any}, A={{number}} was kept. +38. The bot ran the algorithm with type checking to prevent errors. Given failure in Prolog can be right, multiple types were possible for a predicate, and an algorithm's top predicate could fail. Types were only kept if they fit inside each other or were different.
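The compatibility rule above — types are compatible when they are the same or one fits inside the other, e.g. [[number]] fits inside [any] — can be sketched as a recursive check. The Python-list encoding of list types is an assumption made for illustration:

```python
def fits_inside(t1, t2):
    """True if type t1 fits inside type t2. 'any' accepts everything;
    a list type [t] accepts a list type whose element type fits; an
    empty list type fits any list type."""
    if t2 == "any":
        return True
    if isinstance(t1, list) and isinstance(t2, list):
        if not t1 or not t2:
            return not t1           # [] fits [t]; [t] does not fit []
        return fits_inside(t1[0], t2[0])
    return t1 == t2                 # primitives must match exactly

print(fits_inside([["number"]], ["any"]))   # True: [[number]] fits in [any]
print(fits_inside(["number"], ["string"]))  # False
print(fits_inside([], ["number"]))          # True: empty list fits a list type
```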
Sometimes, loose type checking would succeed, but the algorithm might fail if it passed a wrong type, necessitating run-time types. Multiple types were possible, and all possible types were assessed in the algorithm bottom-up. +39. The bot deleted unused types. Types that fit inside another type were treated as separate if they were returned independently by a predicate. The types were only merged if they always merged, i.e. in a base case. For example, numbers and strings might always be replaced with numbers. Alternatively, they were kept separate, for example, if there were two base cases. +40. The bot found cases where findall always failed or returned an empty list. I found types in findall. Findall built a list. This list contained a certain number of items. The most general set of types of these items was found. +41. The bot failed a predicate if its types didn't fit with other types. I found types in if-then. In this logical structure, e,(a->b;c),f, if a is true, then the types of b follow a; otherwise, c's types follow e. The types of f follow either b or c, which are \"creative\" types, possibly creating new types. At least one of the possible types from a predicate must satisfy another predicate. +42. The bot passed the state machine for the types on the server, with the position in the data recorded. I generated the state machine for the type system. I found statements for \"brackets\", \"lists\", \"numbers\", \"strings\", and \"any\" types. The conjunction of types corresponding to a type was listed. In the type checker, the types were checked, and the data and the position in the state machine were recorded. +43. The bot compressed the types when there was only one pointer. I converted the type statements and recursive types to a state machine. The types were conjunctions, pointers or terminals. The address of the pointed-to type was stored with the pointer. In addition, the pointed-to type, like a predicate, returned to a stack's pointer. +44. 
The bot differentiated static brackets from lists. In the state machine, brackets were a statement type. Brackets around data formed an empty list, a list or a list of lists. They were represented as \"[]\" for an empty list. Or, they were described as \"[types]\" for other types. +45. The bot iterated through list elements. List types were loop statements containing other statements. Lists, unlike brackets, included repeating lists. Lists repeating 0 times were empty lists. Lists were statements that the algorithm returned to the start of when possible. +46. The bot returned whether the types had returned true or false using a compiler. I wrote the type checker to traverse the state machine with the predicate input on entry or input and output on exit. The data was in the form of numbers, strings or lists and was fed into the algorithm part by part. Choice points were followed after branches failed. The interpreter stored these choice points. +47. The bot stated that if the types failed, the predicate failed. The type checker algorithm checked the data matched the types, recording the choice points and progress so far; the data, i.e. number, string or others, needed to match the types. If a type failed and no other branches worked, the branch failed. The types and the algorithm failed if there was no working branch in the entire set of types. +48. The bot possibly returned that the predicate was successful so far. The choice points arose from the multiple predicates called from a type statement. For example, \"a->number\" and \"a->string\" created a choice point. Like SSI, if the data type doesn't match \"number\", then \"string\" is tried. When a successful type is found, the branch returns true. + +49. The bot got the concentrated mental image ready, and the algorithm read their mind. The algorithm explained the code. The algorithm attempted to summarise the algorithm description into one line.
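The choice-point behaviour above — \"a->number\" and \"a->string\" create a choice point, and if the data doesn't match \"number\" then \"string\" is tried — can be sketched with a small checker. The encoding (a string for a primitive or named type, a Python list for a fixed-length list of types) is an assumption:

```python
def check_type(defs, name, value):
    """Check value against the alternatives for a type name, trying each
    alternative in order and moving to the next choice point on failure."""
    primitives = {
        "number": lambda v: isinstance(v, (int, float)),
        "string": lambda v: isinstance(v, str),
    }
    for alt in defs[name]:
        if isinstance(alt, str):       # a primitive or another named type
            ok = (primitives[alt](value) if alt in primitives
                  else check_type(defs, alt, value))
        else:                          # a fixed-length list of type names
            ok = (isinstance(value, list) and len(value) == len(alt)
                  and all(check_type(defs, t, v) for t, v in zip(alt, value)))
        if ok:
            return True                # this branch succeeds
    return False                       # no branch worked: the types fail

# a -> number.  a -> string.  (two alternatives create a choice point)
defs = {"a": ["number", "string"], "pair": [["a", "a"]]}
print(check_type(defs, "a", "hello"))      # True: number fails, string tried
print(check_type(defs, "pair", [1, "x"]))  # True
print(check_type(defs, "a", [1]))          # False
```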
It identified and explained recursion, operations on types of types (types of algorithms) and any problems with how the algorithm worked. It suggested error corrections, simplifications and comments explaining the algorithm. +50. The bot didn't upload sensitive files. I ran the algorithm to give secure code. I installed a third-party algorithm to check the code. I scanned for passwords, user names, API keys and account numbers. These were not permitted to be committed. +51. The bot tried to avoid backtracking with recursion and only used backtracking in findall. I analysed the cut command's behaviour before programming it. I cut choice points from the whole program, then programmed a version of cut to cut only choice points in the current predicate. I didn't forget the predicate's future clauses. However, I decided to avoid cutting choice points in previous upper predicates. +52. The bot specialised in helping the child make up games. I stated that the robot software could be a child's friend. It answered simple science questions. It explained that there were no scary or imaginary things. It played games such as tiggy, hide and seek, computer, card and board games. +53. The bot recommended books, leaving non-computational books to the child's imagination. The robot was the teenager's friend. The teenager learned meditation. They saw friends. The robot drove them around, watched movies with them, didn't eat or drink, reminded them of things they needed to know and was their study partner. +54. The bot could get from and put values into memory, add to, test and jump as instructions in the register machine. I offered insights into register machines. I programmed a register machine in Prolog. I ran a program that repeatedly operated on items in memory. I fetched an item from memory, added one, and put it in another location in memory. +55. The bot guessed, then computed, whether a>b. I explained how the register machine program worked.
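The fetch–add–store example above can be sketched as a toy register machine. The instruction set (GET, ADD, PUT, JUMPIF, HALT) is an illustrative assumption based on the get/put/add/test/jump description, not the Prolog machine from the text:

```python
def run(program, memory, registers=None):
    """A tiny register machine: GET loads a memory address into a register,
    ADD adds a constant, PUT stores a register to memory, JUMPIF jumps to a
    line when a register holds a given value, HALT stops and returns memory."""
    registers = registers or {}
    pc = 0                                  # the program counter
    while True:
        op, *args = program[pc]
        if op == "GET":
            reg, addr = args; registers[reg] = memory[addr]
        elif op == "ADD":
            reg, k = args; registers[reg] += k
        elif op == "PUT":
            reg, addr = args; memory[addr] = registers[reg]
        elif op == "JUMPIF":
            reg, val, line = args
            if registers[reg] == val:
                pc = line; continue
        elif op == "HALT":
            return memory
        pc += 1

# Fetch memory[0], add one, store the result in memory[1].
program = [("GET", "r0", 0), ("ADD", "r0", 1), ("PUT", "r0", 1), ("HALT",)]
print(run(program, {0: 41, 1: 0}))  # {0: 41, 1: 42}
```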
The program jumped to a specific line if a register contained a particular value. In addition, a register instruction may jump to a separate line. I could go to a subroutine and return to where I came from using a stack. +56. The bot simplified the interpreter and converted it to C. The child learned advanced computer science. They became confident by writing long, detailed plans or computer games. They designed a font, wrote a programming language or wrote algorithms that found other algorithms inductively. I found that algorithms found other algorithms when there were enough of them in the database. +57. The bot added types to the register machine interpreter. The child wrote the interpreter. First, they wrote a parser. Before that, they wrote a binary search tree traverser. I modified a register machine interpreter. +58. The bot combined Outdoor Education with mathematics. I wrote a primary school simulation. It meant the mathematics from it. It contained algebra, geometry (programming angles on a computer), and using computers. I solved problems and used magic formulas such as the McCarthy 91 function. +59. The bot thanked their primary school teachers for speed and accuracy and for learning addition and multiplication tables. I covered calculus. I could add and multiply. I could visualise algebra by thinking of a formula finder. Calculus requires manipulating algebraic expressions, defining functions, and using basic trigonometry. +60. The bot covered curved lines in calculus. I covered linear algebra. I found the number of legs in six dogs and five hens. Linear algebra involves determining length, area, and volume. Linear algebra found the length of straight lines involving linear equations. +61. The bot covered the argument for Combination Algorithm Writer (CAW). I covered discrete mathematics. Discrete mathematics studies mathematical structures that are distinct and separated. For example, these structures may be combinations, graphs, and logical statements.
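The leg count of paragraph 60 (six dogs and five hens) is a small linear-equation example; a Python sketch of the forward count and the inverse problem:

```python
# Forward count: six dogs at four legs each, five hens at two legs each.
def count_legs(dogs, hens):
    return dogs * 4 + hens * 2

# Inverse, from two linear equations: dogs + hens = animals,
# 4*dogs + 2*hens = legs; eliminating hens gives dogs = (legs - 2*animals)/2.
def count_animals(animals, legs):
    dogs = (legs - 2 * animals) // 2
    return dogs, animals - dogs

print(count_legs(6, 5))        # 34
print(count_animals(11, 34))   # (6, 5)
```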
Discrete structures are finite or infinite. +62. The bot found the correct answer using logic. I covered logic. In CAW, the base case needed to be created before the clause that referred to it. CAW could find formulas. I could modify CAW to call predicates with an argument [A|B], which it could convert from append (see Education of Pedagogy 3, paragraph 2). +63. The bot quickly identified the need and invented new technologies to help the person. The robot considered and responded to all thoughts of the person. The person suggested a positive question or problem from their experience. This problem required solving with computation or automation. The robot listed the variables, formulas and logic required. +64. The bot changed to C, converting from Prolog and not using unnecessary choice points. The robot was a meditative companion. The robot found the best meditation course. Each thought was straightforward and satisfying. The robot helped visualise data flow analysis and type flow analysis with colour codes, and diagnosed bottlenecks in algorithms. + +65. The bot tested the graphics line by line for colour, placement and animation. The bot tested graphics as data on the fly, reporting if it had errors. Graphics must be tested as data and algorithms to test their appearance and logic. I tested that the graphics looked right, were logically displaying correctly and that the whole game worked from start to finish. I forgot Nietzsche with Grammar Logic (GL) anyway. +66. Lucian CI/CD was a necessary way to refactor, debug and point to debugging. The bot tested graphics as algorithms on the fly, reporting if it had errors. I tested whether the graphics-producing algorithms, such as graphical data structures, had the correct input and output. This step rigorously verified the algorithm’s logic, using Lucian CI/CD to build the algorithm with only parts supported by the tests.
This method required checking that the code beforehand was as complete as possible, the tests, the code afterwards and the algorithm’s result. +67. The bot made an animation editor based on a programming language. The graphics algorithm maker outputted the animation algorithm, which stored pixels in memory and could be translated to HTML. The animation consisted of frames of graphics constructed from vector shapes, which were exported as HTML tables as graphics. I used meta refresh to animate the images. First, I rendered the bitmap graphics. Second, I saved them as HTML files. +68. The bot aimed for the desired mathematical result and achieved it. I made a graphics package that started with an arc and then transformed it into a 3D plane. I described the central coordinates of the arc and its dimensions, then created it as a sequence of straight line vectors, then placed these on a plane in 3D and rendered it from a certain angle. I could manipulate this plane, rotating it or changing the position, size or orientation of parts of it. I checked that the result was correct using a grid. +69. The bot included self-standing variables as commands in List Prolog. I wrote Lucian CI/CD tests as %a(A).\n%A=1.\n. I entered logical tests in Lucian CI/CD. I entered A=true, B=true in the previous version and A=false, B=false in the current version and found A=true and B=false to be the combined solution with the algorithm A, not(B). A more complicated algorithm may have multiple assignments (sets of satisfying values). +70. The bot wrote simple mathematical problems with CAW. Other tests could be written as %a., %a(1). or %a.\n%false., etc. I entered tests with variables in Lucian CI/CD. I wrote a programming language with expressions with variables to check. For example, I entered A=a, B=b in the previous version and A=c, B=d in the current version and found A=a and B=d to be the combined solution with the algorithm C is A+B when a=1,b=2,c=2,d=1 and C=2. +71.
The bot possibly avoided moving up non-deleted items in a deletion. The queries and results in Lucian CI/CD tests could be written in one line. I wrote an assembly program to sort by finding minimum values. I found the minimum values, deleted them, printed them, and then repeated this with the rest of the values. I deleted the item from the list and then moved the other items up. +72. The bot supported multiple algorithms in a repository with numerous main files and main predicates. I wrote an assembly program to search. I stopped going down or across when I had found the answer. I found a way to prolong my longevity by breasoning out 4*50 Breasoning Algorithm Generator (BAG) high distinctions in both the home and future times by being given 4*50 person high distinctions for claiming that immortality helped keep an actor’s appearance the same and meeting outstanding students in the MBA. Someone must have seen my brown hair in the future and my apparent grey hair in the present and worked out that I needed to breason out 4*50 BAG high distinctions each day in the present. +73. The bot tried the third level of quantum pointers to be schools, the fourth as other companies and the fifth as different universes. I supported Unicode in Prolog. Unicode was the modern equivalent of ASCII. I encoded each Unicode character in the appropriate format. Separately, I breasoned out the breasonings for a person and then the number of people to send these to. +74. The bot used the simplest possible effective solution at each point. I compressed the Prolog algorithm by compiling it with, for example, long-form. The stamping technology in d.sh quickly breasoned out pre-breasoned 4*50 high distinctions to account for 15 time travellers in each location, meaning one deserved a future spiritual apartment. It was faster than BAG at breasoning out high distinctions while delivering the quality desired to deserve accommodation. 
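The sort-by-minimum-values method of paragraph 71 is selection sort; a minimal Python sketch of the idea (not the assembly program):

```python
# Selection sort by repeated minimums: find the smallest value,
# output it, delete it (the remaining items move up), and repeat.
def sort_by_minimums(values):
    values = list(values)       # work on a copy
    result = []
    while values:
        m = min(values)
        result.append(m)        # "print" the minimum
        values.remove(m)        # delete it; the rest move up
    return result

print(sort_by_minimums([3, 1, 2]))  # [1, 2, 3]
```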
Quantum box stamps only pointed to BAG or 4*50 high distinction stamps unless the thoughts they pointed to were short. +75. The bot wrote innovative new code that was simpler than needed. I cut the predicate header in Lucian CI/CD to the result. I included the input and output and the initial list or string. The code self-healed, using the first correct input variable, labelling the others and warning/debugging on non-uniform program structure. I cut away unnecessary code based on the working predicate header. +76. The bot recognised the dog. To catch failure, I inserted the halt(1) command (on failure) in Prolog scripts. In the Daily Regimen shell script, these scripts could be re-run on the inability to help others and me retain our youthfulness. In the home time, the medicine quality was better from breasoning out the anti-ageing breasonings in conjunction with meditation, possibly preventing neurological disease and mental ageing, making one look the same. The person retained their youthfulness in the other dimension by staying close to the simulation. +77. The bot found the best combination of changes to the predicate over time. I tracked how the predicate was added to over time. I tracked whether commands and their arguments were added to or deleted from the predicate over time. I converted the predicate to a state machine and ran diff on it. I rated and simplified the changes. +78. The bot simplified the program. I rated the predicate’s simplicity. I compared its complexity with the complexity of its spec. In addition, I found the most straightforward possible alternative to the predicate and suggested it. I found the alternative if the complexity suggested it existed. +79. The bot checked the name against the data. I rated the algorithm’s functionalisation, which is the algorithm’s usefulness. I counted the number of times the predicate was used. In addition, I rated the predicate’s clarity of predicate and variable naming and commenting. +80.
The bot found the algorithm, then the spec. I found possible problems with predicate under-verification. In contrast with over-verification, in which an erroneous spec spoilt the algorithm’s induction, under-verification didn’t have enough specs to find the algorithm. I fixed this by providing enough specs. I tried generating the spec using a neuronetwork. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/BOTS/Moral compass.txt b/Lucian-Academy/Books/BOTS/Moral compass.txt new file mode 100644 index 0000000000000000000000000000000000000000..df338cfb8c0ff620c3dd238dccb81f856637cd1b --- /dev/null +++ b/Lucian-Academy/Books/BOTS/Moral compass.txt @@ -0,0 +1,26 @@ +["Green, L 2021, Moral compass, Lucian Academy Press, Melbourne.","Green, L 2023",1,"Moral compass + +Essay (change and influence, do and help) +Invest (serve and impress) + +1. I performed the task for the student well. I found what the student could say and helped them by working it out with them. I thought of several possible answers. I thought of reasons for the best one and objections to others. I thought of imagery and helped them articulate their ideas with it. +2. I also cleaned up after helping the student. I showed that I had thought clearly by putting the ideas away afterwards. I washed my hands. I cleaned the desk. I went home. +3. I explored the settings close by. I studied the options to change the essay enough. I wrote grammatically. I researched \"yes\". I avoided \"no\". +4. A result of epistemology (a black box) was life. I found meanings that seemed to be true about the explored ideas. I interpreted pedagogy, choosing to set the top mark according to whether I agreed or disagreed as a lecturer by agreeing when I decided. Everyone is free, and I am happy. I wrote the reason for the result, reading science into epistemology (I wrote the correct number of ideas to mind map and deserved honours for the work). +5. 
Breasonings were both spiritual (eastern) and scientific (western). I tested the meanings (such as life, honours or success) with scientific metrics. For example, I measured my heart rate by feeling my pulse. Next, I counted the marks I earned. Finally, I observed the customer buy the product. +6. I wrote 5 As and listed 50 books (to set up connections), then wrote ten non-computational words symbolising algorithms in a sentence and shunted up to two ideas from another sentence between two of the ideas to form a connection. First, I found the relationship between sentences. Next, I thought of a story connecting the sentences. Then, I broke it into steps. Finally, I simplified the account to a word. +7. I debugged the algorithm. +I found the meaning between the states of an object. I captured the moment of truth. I avoided treating the effect. I treated the cause. +8. I circumnavigated the term as variable and possible nested findall skip bugs. Then, I wrote the essay about applying skip to findall in Prolog. I skipped findall. I skipped the call in findall. Finally, I skipped equals4 at the end. +9. I examined the idea. Essay Helper made me happy by giving me the gem rather than ignoring my hard work. Working on the algorithm was the \"high-quality way\". People recognised it and made it part of their lives, helping with most things. By writing ideas between other ideas, I became happy (most likely by wiring my brain and being rewarded). +10. Simplicity was the highest quality thought. People carried (bought products symbolising) what made me happy. I became pleased by drinking healthy drinks. Algorithms about an idea are a sales machine, +helping us love and logically analyse it. The brain simplifies and remembers thoughts. +11. I started with my interests, articulating to perfection by writing enough options and simplifying everything. I mind-mapped complex thoughts and simplified them. I wrote down my particular areas of interest. 
I attached them to lines and keywords. There was a graph of keywords, with detail in the form of variousness in depth. +12. I considered critical thinking, and I answered my questions. It was worth recording the simple thoughts as a valuable skill. I wrote the conclusion, avoiding restating it. I applied Vedic mind reading to find the famous quote. I wrote topics that were famous because of meditation. +13. It meant that robot government was possible. I automated meditation, which is usually a big idea. Meditation was possible for robots to process the inputs and outputs. Robots checked logic and made gentle seen-as versions. The robot lived a short, eventful life; alternatively, the owner reset it. +14. I debated asking for no exact copies and a citation (assuming they had thought about it in the fuss and assuming they would do the right thing by the license). Essay Helper was politically helpful. After all, it was valuable, avoiding immorality because it involved students changing the essay enough. It saved time and money, including medical bills. People could spend more time studying and relaxing. Because it provided the obvious, it was slight, yet saved hours of manual labour. +15. If they do nothing, it blames them. The Grammar Logic algorithm appeared to mind-read the students, qualifying their connections and other essay elements. It prepared their choice of sentence. It helped them work their conclusions out. It also helped with their suggestion for future research. +16. I deliberately considered weird ideas as my future research. Some students partially rewrote the Essay Helper algorithm, checking for sentences left unparaphrased or preparing for a radically original essay. +To write creatively, they could write an original mind map. Or, they could connect groups of sentences. Finally, they could find ideas and chains of uses. 
+ +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/BOTS/New Bots 1.txt b/Lucian-Academy/Books/BOTS/New Bots 1.txt new file mode 100644 index 0000000000000000000000000000000000000000..32ea7e738e789d23ecfcdcf9acdaa1dac440e9ec --- /dev/null +++ b/Lucian-Academy/Books/BOTS/New Bots 1.txt @@ -0,0 +1,72 @@ +["Green, L 2024, New Bots 1, Lucian Academy Press, Melbourne.","Green, L 2024",1,"New Bots 1 + +1. The bot tested the algorithm. The text was inputted. I entered the option types. I entered the options into the text file. I ran the algorithm with the text file as input. +2. The bot used the editor. The text was outputted. I read the output. I edited it. I re-entered it into the algorithm. +3. The bot wrote the algorithm to solve the error. The text file contained the predicate. The Prolog algorithm could be in the text file. I could query the algorithm at runtime. I found the line number of the error. + +4. The bot moved the term to the other side. I used the text file to configure the predicate. I configured the predicate as a possible interpreter, induction or paraprogramming (sic) predicate. The interpreter worked out 1+1. The induction algorithm worked out 1?1=2. Instead of a constraint satisfaction problem solver, which worked out A+A=2, paraprogramming paraphrased an example predicate, for example 1+1=2 became 2-1=1. +5. The bot used bitwise addition when simulating the microprocessor. The text file configured the predicate’s method. I added using decimal. I added using bitwise addition. I added using another base. +6. The bot indexed predicates with undefined variables in terms. I configured the search. I noticed the one-dimensional search was fast. I wrote the data as a list. I optimised the algorithm to index the data by the first argument, or sometimes the second argument as well. +7. The bot performed the operations more quickly without predicates by converting them to C. I configured the sort. I deleted or didn't delete duplicates.
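The with-or-without-duplicates choice above can be sketched in Python; SWI-Prolog's sort/4 offers a comparable configuration:

```python
# Configuring a sort: keep duplicates, or delete them first.
def configured_sort(values, delete_duplicates=False):
    if delete_duplicates:
        values = set(values)    # duplicates removed before sorting
    return sorted(values)

print(configured_sort([2, 1, 2]))                          # [1, 2, 2]
print(configured_sort([2, 1, 2], delete_duplicates=True))  # [1, 2]
```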
I wrote a sorting algorithm in C. It was fast, dealing with text files. +8. The bot sped up the algorithm by using fast, large amounts of memory. I configured the data set. I included only relevant data. I deleted excess detail from the data. I formatted the data in the simplest way. +9. The bot switched off or occasionally used type checking. I configured the type statements to prevent mishandling of data structures. I wrote the code. I wrote the type statements to check the data. I also used grammars or predicates as type statements. +10. The bot wrote code from the type statement. I used type statements to correct code. I used type statements to correct errors in data. These errors may be in the data format. Or they may be in the code. +11. The bot wrote in machine language. I simplified the string lengths in the code state machine. Instead of \"choice point\", I wrote \"c\". I could pretty print this as \"choice point\". The shorter strings could be outputted by the state machine listing command. +12. The bot passed the output of a functional call to another variable in the functional call. I created the functional expression. It was output by the metre. For example, f(In,Out) was a term. It could be passed to other functions, for example g(f(In,Out1),Out1,Out2). +13. The bot tested the recursive functional call. I retrieved values from the called expression. I could switch inputs on and off. I could control functions using flags. I returned two or more values from a function. + + +14. The bot converted the functional call from Prolog to List Prolog. I converted and passed terms as part of functional expressions to do this. First, I converted the term from Prolog to List Prolog. The algorithm performed this conversion when given the label \"Prolog\". Then, I ran the expression. +15. The bot simplified the process. I wrote texts optimising essays. I found the algorithm described in the paper. I minimised the deterministic finite automaton. I wrote it in essay form. +16.
I perfectly customised the new language. I devised a programmatic spoken language that could specify algorithms in algorithms. I trained the algorithm with a new word with an algorithmic meaning. These words could be in terms of other programmed words. They could be recursive. +17. The bot counted how many levels deep skills had. I drew a chart of educational skill advancement in computer science. I noted the obscurity of a command used. I noted the simulation of complicated commands. I noted parallel programming languages as interpreters, compilers and in C. +18. The bot returned the work with the same number or up to 80 predicates. As part of this, I rewrote educational skills diagrams as connections between skills only. I counted recursion. I counted commands, only keeping the samenesses. Finally, I counted the facts, rules and predicates. +19. The bot converted the algorithm into a finder of unpredictable code and found algorithms. I wrote the chatbot to upgrade the generation of the algorithm. I wrote a web version. I wrote an editor. I wrote a neuronetwork algorithm writer. +20. The bot prepared to find more algorithms. I converted the algorithm into a finder of unpredictable code. I wrote the algorithm as a functional call. I split it into predictable and unpredictable command sections. I could predict pattern matching and find unpredictable other commands. +21. The bot wrote the functional call with a customised number of arguments. I found algorithms. I went inwards from the input and output. I worked out unpredictable commands from previous possible combinations of commands used previously. I repeated this until finding a match in the middle. +22. The bot created a neuro-decision tree with values from any data part. The neuro-algorithm writer found algorithms by matching data like a neuronetwork. The whole thing was better as a neuronetwork. I wrote a decision tree for the entire thing. The neuronetwork was like this. +23. 
The bot worked out better, simpler conclusions. In the previous version, the algorithm kept both furthest-reaching pathways, testing for a meeting or overlap. Computer Science often did this computation. Instead, the neuronetwork found all possible solutions and the single fitting solution. This computation took many tries. +24. The professor made a complex bot. I concentrated on the interpreter, finding stable conclusions the neuronetwork could interpret at each point. After this, I built my first space station. It was part of the simulation. Once scientists made the space slingshot, the simulation could subsidise the number of space jumps. Once scientists finished quantum medicine, societal behaviour prevented disease. +25. The bot could develop simulated products for interpreters, like brains. Instead, I walked around on a simulated planet. The flight was safe. It was fast. The conditions were delightful, and everything was modern. +26. The bot or someone else was in charge. If everything was inside the simulation, people could enjoy the quality of life they wanted and have positive mindsets. The people were optimistic. The food was healthy. People could program or go for walks with friends. +27. The bot travelled back in time at the end. I watched the students. They kept writing. I answered their questions. They were prepared and memorised or used notes. +28. The bot enjoyed working at the company. The person sold the food and put the money in the bank. He used the money to pay the costs and employees. The employees could enjoy entertainment and fine dining. I wondered if someone I knew was immortal. +29. The bot developed the A. The person thought of a reason to eat. It had an alternative style. It returned to normal. The philosophy extended interest in both the customer and the workers. + +%%% done:The bot bought food and modes. The bot bought food customised predicate. + +29. * + +?- bag2phil(A),writeln(A). 
+Enter number of input files: +|: 2 +Enter file 1 path: +|: ../Lucian-Academy/Books/BOTS/Bots 1.txt +Enter file 2 path: +|: ../Lucian-Academy/Books/BOTS/Bots 1 2.txt +1. I finished off the text and modes. I finished off the text customised predicate. I finished off the text new methods. I pointed to the next book from each book to help and modes. I pointed to the next book from each book to help customised predicate. +2. I pointed to the next book from each book to help new methods. I time travelled to do text and modes. I time travelled to do text customised predicate. I time travelled to do text new methods. The meditation students from times away from text and modes. +3. The meditation students from times away from text customised predicate. The meditation students from times away from text new methods. The meditation students from times away from text to breasonings just meditated or time travelled to use text and modes. The meditation students from times away from text to breasonings just meditated or time travelled to use text customised predicate. The meditation students from times away from text to breasonings just meditated or time travelled to use text new methods. +4. I noticed how the algorithms developed, involving in this way. I noticed how the algorithms developed, involving algorithms developed, involving new methods. I noticed how the algorithms developed, involving new methods bought food. I noticed how the algorithms developed, involving new methods for help. I noticed how the algorithms developed, involving new methods good job. +5. I noticed how the algorithms developed, involving new methods included instructions. I noticed how the algorithms developed, involving new methods the food. I noticed how the algorithms developed, involving new methods the goods. I noticed how the algorithms developed, involving new methods the person. I noticed how the algorithms developed, involving new methods the scene. +6. 
I noticed how the algorithms developed, involving new methods the text. I noticed how the algorithms developed, involving new methods the yield. I noticed how the algorithms developed, involving new methods a business about my ideas in themselves. I noticed how the algorithms developed, involving new methods with if I couldn't secure housing. My company was within laws to not offer replacements for real degrees, and used an accountant from the future to help and modes. +7. My company was within laws to not offer replacements for real degrees, and used an accountant from the future to help customised predicate. My company was within laws to not offer replacements for real degrees, and used an accountant from the future to help new methods. The bot bought food and modes. The bot bought food customised predicate. *The bot bought food new methods. +8. The bot cooked the food and modes. The bot cooked the food customised predicate. The bot cooked the food new methods. The bot delivered the goods and modes. The bot delivered the goods customised predicate. +9. The bot delivered the goods new methods. The bot included instructions and modes. The bot included instructions customised predicate. The bot included instructions new methods. The bot measured the yield and modes. +10. The bot measured the yield customised predicate. The bot measured the yield new methods. The bot offered the bounty for help and modes. The bot offered the bounty for help customised predicate. The bot offered the bounty for help new methods. +11. The bot transcribed the text and modes. The bot transcribed the text customised predicate. The bot transcribed the text new methods. The bot was thanked for doing a good job and modes. The bot was thanked for doing a good job customised predicate. +12. The bot was thanked for doing a good job new methods. The bot wrote the scene and modes. The bot wrote the scene customised predicate. The bot wrote the scene new methods. 3165/80=40 80s and modes. +13. 
3165/80=40 80s customised predicate. 3165/80=40 80s new methods. +A = \"1. I finished off the text and modes. I finished off the text customised predicate. I finished off the text new methods. I pointed to the next book from each book to help and modes. I pointed to the next book from each book to help customised predicate. +2. I pointed to the next book from each book to help new methods. I time travelled to do text and modes. I time travelled to do text customised predicate. I time travelled to do text new methods. The meditation students from times away from text and modes. +3. The meditation students from times away from text customised predicate. The meditation students from times away from text new methods. The meditation students from times away from text to breasonings just meditated or time travelled to use text and modes. The meditation students from times away from text to breasonings just meditated or time travelled to use text customised predicate. The meditation students from times away from text to breasonings just meditated or time travelled to use text new methods. +4. I noticed how the algorithms developed, involving in this way. I noticed how the algorithms developed, involving algorithms developed, involving new methods. I noticed how the algorithms developed, involving new methods bought food. I noticed how the algorithms developed, involving new methods for help. I noticed how the algorithms developed, involving new methods good job. +5. I noticed how the algorithms developed, involving new methods included instructions. I noticed how the algorithms developed, involving new methods the food. I noticed how the algorithms developed, involving new methods the goods. I noticed how the algorithms developed, involving new methods the person. I noticed how the algorithms developed, involving new methods the scene. +6. I noticed how the algorithms developed, involving new methods the text. 
I noticed how the algorithms developed, involving new methods the yield. I noticed how the algorithms developed, involving new methods a business about my ideas in themselves. I noticed how the algorithms developed, involving new methods with if I couldn't secure housing. My company was within laws to not offer replacements for real degrees, and used an accountant from the future to help and modes. +7. My company was within laws to not offer replacements for real degrees, and used an accountant from the future to help customised predicate. My company was within laws to not offer replacements for real degrees, and used an accountant from the future to help new methods. The bot bought food and modes. The bot bought food customised predicate. The bot bought food new methods. +8. The bot cooked the food and modes. The bot cooked the food customised predicate. The bot cooked the food new methods. The bot delivered the goods and modes. The bot delivered the goods customised predicate. +9. The bot delivered the goods new methods. The bot included instructions and modes. The bot included instructions customised predicate. The bot included instructions new methods. The bot measured the yield and modes. +10. The bot measured the yield customised predicate. The bot measured the yield new methods. The bot offered the bounty for help and modes. The bot offered the bounty for help customised predicate. The bot offered the bounty for help new methods. +11. The bot transcribed the text and modes. The bot transcribed the text customised predicate. The bot transcribed the text new methods. The bot was thanked for doing a good job and modes. The bot was thanked for doing a good job customised predicate. +12. The bot was thanked for doing a good job new methods. The bot wrote the scene and modes. The bot wrote the scene customised predicate. The bot wrote the scene new methods. 3165/80=40 80s and modes. +13. 3165/80=40 80s customised predicate. 3165/80=40 80s new methods. 
\"."] \ No newline at end of file diff --git a/Lucian-Academy/Books/BOTS/New theories.txt b/Lucian-Academy/Books/BOTS/New theories.txt new file mode 100644 index 0000000000000000000000000000000000000000..48cb2a6fbe362a48e06e4c3157cf14f6c91bdf8f --- /dev/null +++ b/Lucian-Academy/Books/BOTS/New theories.txt @@ -0,0 +1,19 @@ +["Green, L 2024, New theories, Lucian Academy Press, Melbourne.","Green, L 2024",1,"New behavioural theories + +1. Clever employers kept employees by making up enough for them. The farm theory suggested that more successful industries produced necessary products. Employees fell into the \"necessary\" or \"unnecessary\" category. Their theory of what was required needed to match what they practised. For example, it could come down to making enough up. +2. From mind mapping to developing the feature, the customers' thoughts were of primary importance. The employees checked perspectives about a possible new feature. They fit the \"number one\" perspective. Also, they checked for Vedic mind reading to ensure the feature was necessary. Finally, they checked the wording and simplicity of the feature from the viewpoint of doing and using it. +3. The characters needed to test walking, flying and talking about the feature. The manager facilitated the spiritual perspectives. When an employee found keywords about a product from a view, the manager harmlessly problematised them ethically, financially and from a design perspective. The aim was to improve the feature and be the best. The features were compared and contrasted, and the students returned to the computer science characters. +4. I checked the employee's happiness and the logic of her reasons. Once I had distributed the products, I distributed feedback forms to test them. I observed that the employees were positive-minded. I saw one, thinking positively about the product and collecting feedback. 
The customer was happy and had further questions that the employee answered, and they felt fulfilled in answering. +5. I asked Antonia whether she could choose Uni (CPD) and Daoism (Art). The logic meant connecting a cycle of six ideas, representing 100%. I changed from demoralised to demonadalised. People felt happy meditating and often needed to detach and rewrite nested if-then statements in multiple predicates to preserve choice points. I checked whether I could reuse any predicates, saving the editing of extra code. +6. The theorist used blue cut to enable if-then with multiple clauses in the interpreter. I hard-coded blue cut, an example of introducing a new feature to cope with the limitation of Prolog that cut only removes the choice points of a single clause. I registered the true blue cut with the location ID and cut ID at the start of the first clause. Then, I activated the blue cut at the end of the first clause. Triggering the blue cut caused the monitor at the beginning of the second clause to fail, whereas if the programmer had not activated the blue cut, the second clause might succeed. +7. The theorist tested the condition in the algorithm. I added if-then to the \"types to alg\" algorithm. It could pattern match (find whether an item was a member of a list or find a property of appended lists). It could append two items or find whether an item was the first or last item of a list. Or it could discover whether an item was the rest of either of the lists without the first or last item, or whether the item was inside the rest of one of the lists. +8. The theorist used neuronetworks within neuronetworks to operate on bits. The if-then statement contained unpredictable commands that computed non-pattern-matching data and contained multiple commands or recursion. For example, it found mathematical commands, string-manipulation commands or commands involving variables in other algorithm parts. Also, a neuronetwork could find whether data resulted from two or more commands.
Or, it could choose or create a recursive call with predictable or unpredictable commands. +9. The theorist asserted the constant. The \"types to alg\" algorithm found a list from other lists, for example, with a condition if an item was in a particular list. I could append the lists with a constant. This constant could be hard-coded or loaded from a text file or another algorithm part. The lists could be appended based on the value of a global (asserted) variable. +10. The theorist wrote formulas to find repeating patterns. I created theories from my algorithm library. The star metaphor represented the find lists algorithm, meaning it simplified lists to repeating units. This algorithm was helpful in all areas of study with lists with repeating items. I checked whether non-repeating parts could be constants. +11. The theorist found the frequency in the union. The formula represented the pattern. For example, it was a mathematical, matrix, logical, bitwise, database or computational procedure. It could be a mixture of these. For instance, 1+length(X)*matrix multiplication. +12. I found pattern matching in combination with formulas. For example, the programmer could use the string length with reverse to zerofill the rest of the digits. I used the constant in these conjunctions to determine the character instead of zero. Finally, I assigned the result to a global variable. The theorist reused and manipulated parts of the conjunction. +13. The theorist checked and wrote knowledge. I wrote the knowledge verification algorithm. First, it checked knowledge against old knowledge. Then, it checked knowledge against current knowledge. Finally, it worked out conclusions about current knowledge and wrote them down. +14. The theorist sometimes used the term meditation in other departments, dependent on a vote. I found the correct configuration of the argument. I wrote with \"self and other\" as an English argument and with or without terms from a department. 
I used \"self and other\" in Computational English (similar to Theology). In Computational English, I used commonplace English examples in a three-step algorithm form. In medicine, I omitted breasonings, but in pedagogy, I kept them. +15. I wrote Prolog in the operating system with a microprocessor. I could use a seen-as-version for an idea. I saw it as relating to the microprocessor because of the latest advances. I simplified this circuit to a single crystal computing operation. I could check using induction. +16. The theorist optimised the circuit using smaller and smaller components. I studied Electrical and Electronic Engineering. I designed the circuit. I created memory (hardwired at first). I made logical operations. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/BOTS/dot-New Bots 2.txt b/Lucian-Academy/Books/BOTS/dot-New Bots 2.txt new file mode 100644 index 0000000000000000000000000000000000000000..22904b27e66bcb391034b4d1a021ea4ff1411afb --- /dev/null +++ b/Lucian-Academy/Books/BOTS/dot-New Bots 2.txt @@ -0,0 +1,19 @@ +New Bots 2 + +1. The bot hand-wrote algorithms by thinking of original versions of noumena. I developed 50 algorithms for a pass and 250 for a high distinction instead of 16k algorithmic breasonings. The larger number was too difficult, as there was no way to vet so many. I found out about the student by mind-reading their algorithmic skills. I developed 250 algorithms for each sale and high distinction. +2. The bot checked the code with types as they word-processed it, with chains of 50 algorithmlets (sic) to finish a line. I mind-read the algorithmic spec, cosmology, song line, GL word, movie line, BAG output, philosophy sentence, algorithmic comment, shift sentence, another sentence said, thought or written in one's life, or image, finding the input, output, method, and test cases. I mind-read skills, with words and letters matching them that made sense.
This method required neuronetworks. +3. The bot collected algorithmlets about formats for their algorithm or critically held each step to make it intuitive. I invented algorithms based on the highest quality ways of thinking. I checked the whole algorithmic database for the best way of thinking to use at each point, simplifying to simple and advanced types. I invented formats, reusable formats of data that made algorithms compatible with each other. There were intermediate formats in cosmology that the user could mind-read in themselves to connect to write algorithms. +4. The bot checked algorithms generated by neuronets. If I couldn't run an algorithm, I mind-read myself about ways to correct it. I started by thinking about the significance of a GL word such as "art" in forming the specification, such as creating a walkthrough or influencing the algorithm with a diagram. I traced its commands' types, comparing the extent to which the spec-to-algorithm could generate its pattern-matching component, leaving a CAW specification to complete. I rescued CAW commands with the correct types from the proper place in the corrupted algorithm, plugged them into the pattern-matching algorithm and checked the result. +5. The bot grouped by consecutive numbers that were part of the same number. I wrote a program finder for grouping consecutive numbers or patterns that formed a number or a term. I started with algorithms that did these things and merged them, finding patterns that other methods may miss. I used an algorithm generated with spec-to-algorithm and inserted type verification commands to differentiate appended clauses. I used pairs of append commands to get sequences of tokens until a condition was met. +6. The bot said that besides repairing code, they checked the necessity of every command, clause, and predicate using Lucian CI/CD. 
If I couldn't secure an attention-attracting algorithm using a neuronet for an employee's shift, I used Combination Algorithm Writer (CAW). CAW might take minutes to write several commands or short clauses. Despite being slower, CAW could find results more accurately. It could be used to repair or finish corrupted or unfinished algorithms, finding efficient and correct code. +7. The bot added Lucian CI/CD as a module to Spec to Algorithm. In Spec to Algorithm, I used subterm with address instead of predicates to search for values in terms and replace them with variables. Subterm with address found the address of values in terms, allowing substituting variables into these positions after determining which variable names to choose. The values were replaced with unique variables in a one-to-one correspondence and replaced with constants from the original data if they were the same across specs. Later, I generated predicates as an exercise for expandedness. +8. The bot converted models to useful Prolog algorithms. I found input to create and run algorithms with Spec to Algorithm by running the Strings to Grammar try predicate, which found recursive predicates and later mapped their variables to the output variables of the recursive structures of the output. These recursive structures allowed other sets of input in the same recursive structure as the model input to be transformed into the recursive structure of the output and its expansion. After finding the unique variables, recursive structure and constants from the input, subterm with address queries were found from the recursive structures. For example, the input [1,2,3,2,3,1,2,3,2,3] is converted to the recursive structure [r,[1,[r,[2,3]]]]. If there is another spec [r,[1,[r,[2,4]]]] and the generated algorithm is +[r,[1,[r,[2, B]]]] where 1 and 2 are constants, and B is a variable, then giving the new input [1,2,5,2,5,1,2,5,2,5] means the output's recursive structure [r,[2, B]] provides the output with [2,5,2,5]. 
More importantly, proximity formats are needed to give [A],[A, B]->[B]. +9. The bot stated that there was no other easy way of solving the "teleporting train tracks" problem of mapping input to output in Spec to Algorithm than using subterm with address. Spec to Algorithm can flatten a term by representing item types by items of that type, running Spec to Algorithm, replacing these constants with type-checking commands, and copying the inputs to output without brackets. The general term's input is given by strings, atoms, numbers, and lists in any position and is repeated. In this case, indices of list items continue and do not restart when traversing the output's recursive structure. Continuing is the default. +10. The bot didn't need recursive loops for non-deterministic data (which wasn't required) but took variable values from the new data, ignoring non-constants from previous data. I converted recursive structures into output by traversing the data, continuing, and not restarting the indices of recursive parts. I traversed the structure, recording numbers until a recursive part. When I had reached a recursive part, I traversed the contents, recording a variable value 23 from the data (which was not in 123232323 1242412424 patterns like the specs, but a single 1232412423 pattern from input), continuing to 24 on repeat, then 24,23 after the next repeat. The 1 in the output may be replaced with a 5. +11. The bot turned optional structures on in strings-to-grammar but off in spec-to-algorithm to prevent ambiguity when outputting results. In addition, I turned optional structures in input in spec-to-algorithm off to keep structures the same. If an optional structure were in input in spec-to-algorithm, its contents would be stored in variables. However, the lack of an optional structure in the output needed for determining output wouldn't correlate with the input structure. 
For example, an optional structure in input may prevent recognising it in output, requiring individual variables to be copied, possibly undesirable in an algorithm. Optional structures may be acceptable with non-determinism from multiple specs. +12. The bot verified inputs and outputted the results with an algorithm generated by Spec to Algorithm. I checked that recurring variables didn't have conflicting values. Within the current level of A in [r, A], I checked that instances of the same variable didn't match more than one value from the input. I found and set the variable to the value the first time it was mentioned, then checked that subsequent mentions of the variable were set to the same value. If they conflicted, the input-getting process and (generated) algorithm failed. +13. The bot wrote plans, pedagogy, copywriting orders, chatbot questions and prayer songs to work the algorithm out. I found unique variables, recursive structures, and constants in that order because unique variables, which were one-to-one mappings of values to variables (the same across the specs), were needed before finding recursive structures and, later, input-output mapping. Constants, the same value in recursive structures across specs, could only be found after finding recursive structures. I checked the order of variables in each spec was the same for correctness. The order of variables in the new data set given to the generated algorithm was assumed to be the same. +14. The bot only processed variables on non-deterministic branches when needed, finding mappings at the start and output for each branch based on the single input spec. I merged recursive structures with different shapes from specs using a decision tree. This non-deterministic tree allowed choices within the recursive structure to find different outputs from inputs. These followed recursive structures, with non-deterministic choices within them with new input-output pairings. 
Due to the decision tree, there was a single non-deterministic tree, and the algorithm needed to branch and process variables before the next branching point within a new branch of the algorithm. +15. The bot stated that programs with CAW content may have multiple levels of processing but recognisable results. I converted the subterm with address algorithm to the expanded algorithm. First, I stored the input in recursive structures using its previous algorithm and used the match predicate from List Prolog to move the input to output using the recursive structures. Alternatively, I formed predicates and grammars from the recursive structures that stored the input, then generated predicates from the output recursive structures to create output, taking copies or continuing to process data, starting with the base (precisely, the start of the base) level of output lists and building the output from input. Needed input lists and values were passed to the output predicates, making code more accessible and understandable. +16. The bot gave the input and output recursive structures with Spec to Algorithm resolved variables mixed with unknown variables to CAW, and it found the new rules. I gave CAW specifications with complete results, including stages of computation, for example, results of predicates or times a predicate was run before running itself recursively, including recursive predicates called. These complete results included all computed values, not simply a truth value for their completion. If the algorithm found was recursive, it was partially found using Spec to Algorithm, and the remaining inputs and outputs were joined with CAW. I imagined stages of the algorithm and refined details as I went. 
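The recursive-structure folding described in paragraph 8 can be sketched outside List Prolog. The following Python sketch (the names `repeat_unit` and `compress` are illustrative, not part of Spec to Algorithm) reproduces the example of folding [1,2,3,2,3,1,2,3,2,3] into [r,[1,[r,[2,3]]]]:

```python
def repeat_unit(xs):
    """Return the shortest prefix u such that xs == u * k, or xs itself."""
    n = len(xs)
    for k in range(1, n):
        if n % k == 0 and xs == xs[:k] * (n // k):
            return xs[:k]
    return xs

def compress(xs):
    """Fold a flat list into nested ["r", unit] structures, as in the text."""
    unit = repeat_unit(xs)
    if unit != xs:                       # the whole list is a repeat of unit
        return ["r", compress(unit)]
    for i in range(1, len(xs)):          # otherwise look for a repeating tail
        if repeat_unit(xs[i:]) != xs[i:]:
            return xs[:i] + [compress(xs[i:])]
    return xs

print(compress([1, 2, 3, 2, 3, 1, 2, 3, 2, 3]))  # ['r', [1, ['r', [2, 3]]]]
```

Giving the second spec's data, [1,2,5,2,5,1,2,5,2,5], yields ['r', [1, ['r', [2, 5]]]], the position that the variable B occupies in the generated structure above.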
diff --git a/Lucian-Academy/Books/Breasoning Algorithm Generator/.DS_Store b/Lucian-Academy/Books/Breasoning Algorithm Generator/.DS_Store new file mode 100644 index 0000000000000000000000000000000000000000..5008ddfcf53c02e82d7eee2e57c38e5672ef89f6 Binary files /dev/null and b/Lucian-Academy/Books/Breasoning Algorithm Generator/.DS_Store differ diff --git a/Lucian-Academy/Books/Breasoning Algorithm Generator/Breasoning Algorithm Generator 1.txt b/Lucian-Academy/Books/Breasoning Algorithm Generator/Breasoning Algorithm Generator 1.txt new file mode 100644 index 0000000000000000000000000000000000000000..10e5c399c53f0302a89467a6252c873bf29b85e2 --- /dev/null +++ b/Lucian-Academy/Books/Breasoning Algorithm Generator/Breasoning Algorithm Generator 1.txt @@ -0,0 +1,19 @@ +["Green, L 2024, Breasoning Algorithm Generator 1, Lucian Academy Press, Melbourne.","Green, L 2024",1,"Breasoning Algorithm Generator 1 + +1. I could undo changes in other windows. +2. I dragged text from the Web Editor window to the window. +3. I suggested nearly missed opportunities to improve code. +4. I moved deleted files and previous clipboards and lost undos to the bin. +5. I could close and reopen sets of windows and trigger reminders or automation to make resulting changes from changes. +6. I activated other windows by dragging into them. +7. I undid the upload or download and treated web servers like the Web Finder. +8. I cancelled or undid dragging operations. +9. I deselected files with an algorithm, for example, those filenames ending with a string. +10. I ordered files by name or another property. +11. I funnelled the files and made business decisions in time. +12. I found files in one window that could run optical character recognition (OCR) or conversion utilities. +13. The education authentication software verified that I understood an idea and helped me apply it in the industry. +14. 
The assignment was to produce the report-writing algorithm, explaining and detailing the logic and mathematics. +15. If it had permission, Web Editor could edit files on other servers and simplify the approach to old, new and private versions of repositories by using a data folder and testing that private data worked with the latest version. +16. Web Editor updated the location of moved files. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/COMPUTER SCIENCE/Lucian Prolog Red Balloon Icon.txt b/Lucian-Academy/Books/COMPUTER SCIENCE/Lucian Prolog Red Balloon Icon.txt new file mode 100644 index 0000000000000000000000000000000000000000..1da55609c4b2b8328906365c199f4e2eb2504129 --- /dev/null +++ b/Lucian-Academy/Books/COMPUTER SCIENCE/Lucian Prolog Red Balloon Icon.txt @@ -0,0 +1,17 @@ +["Green, L 2022, Lucian Prolog Red Balloon Icon, Lucian Academy Press, Melbourne.","Green, L 2022",1,"Lucian Prolog Red Balloon Icon + +1. Lucian's red balloon represented intelligence, using pedagogy. Lucian is represented by red. Red is positive, like Lucian. Red represents taking action, like Lucian. Red represents the good, like Lucian. +2. As L is to light, the first image is to the film. Lucian is represented by \"L\". I wanted the first image revealed by light. I checked the available evidence from the start. I tested and made corrections to Lucian Prolog using evidence. +3. Working Prolog code is represented by a red balloon (meaning correctness). Prolog is represented by a balloon. Prolog's choice-points are represented by balloons. In mind reader, asking it any question causes it to return the answer. The ethics were that psychoanalysis, the basis for mind reader, was limited to synonymous, not private ideas. +4. List Prolog's data structures are in list format. A list is represented by the white reflection on the balloon. As the balloon ascends, the list is printed on it. It was in Lucian's Hand Bit Map Font, a simple bit map font which looks like handwriting.
List Prolog can run algorithms in list format. +5. I tried smaller and smaller intervals in constraint satisfaction problems, which Combination Algorithm Writer could solve. The list, represented by the square, can be checked against knowledge, represented by the balloon. If the list is output, it can be checked against the input to the algorithm. If the input is a list of numbers, it can be checked against the number line or other algorithms. The input can be tested against the output by back-substitution. +6. I noticed the reorganisation and use of earlier parts in later algorithms (the levels leading to more complex algorithms). I found the simplest language for the sentence using mind reader by connecting it to five sentences. I found the intersection of details for one connection. I found the ontological categories the intersections of details for the connections fit into. I found the best results with mind reader and chose the best result with an algorithm to find correct old words and connections and find new intersections, expressions and technologies. +7. The robot was Lucian Prolog-based. I commented that the red balloon with a white square reflection looked robotic (the round balloon was human and the square was robot). The robot had a humanoid body with human and robot needs and could mind read humans using mind reader. The human created the robot. The maker wrote Computational English to comfort the robot. +8. The robot seemed to comment that the thought waves were clearer than the messy handwriting. I thought that the red balloon icon looked like the meeting between the digital and the analog. The digital was the bit map graphics. The analog was the vectors and curves. At their interface, the robot digitally checked the analog writing with a perfect result. +9. The subject pretended God found rules from data. I thought that the white square on the balloon looked like a message to God. I thought that Prolog's non-determinism was easier to program in. 
It allowed faster development of, for example, inductive algorithms, which found algorithms from constraints such as list lengths. I found the (n-1)th number. +10. I could simulate Shell, Java, C and Haskell in List Prolog. List Prolog enabled me to understand Prolog better. I noticed Prolog was interpreter-like, speeding development of artificial intelligence applications. I could design an interpreter, used to check code in different languages. I could design algorithms that sped up mundane tasks and helped people. +11. The Lucian/List Prolog compiler compiled faster code. List Prolog was better for artificial intelligence applications because it was more intuitive. I found ways to improve the performance of List Prolog. I examined faster methods than decision trees. I used append rather than string concatenation. +12. I noticed the science of music album naming, philosophy testing and algorithm usefulness. I could generate algorithms in other programming languages in List Prolog. I could help students with these algorithms. I could learn about the algorithms by writing them. I could save time by running these algorithms. +13. The part that preserved the formatting required knowledge of how to replace string concatenation with append for performance, so it was left to the graduate to find in their career. I simplified the algorithm to assess. I merged the decision trees. It was simple enough. The part that preserved the formatting was left out. +14. The postgraduate could now perform better than the undergraduate, but needed to demonstrate more knowledge. If the lecturer chose, the algorithms with better performance could be included; however, they needed to be simple enough to find. The student found it difficult to visualise the need for the lookahead algorithm. Later, he traced the interpreter in List Prolog and detected the need for it.
+"] \ No newline at end of file diff --git a/Lucian-Academy/Books/COMPUTER SCIENCE/Principia Mathematica.txt b/Lucian-Academy/Books/COMPUTER SCIENCE/Principia Mathematica.txt new file mode 100644 index 0000000000000000000000000000000000000000..b986b84fa90ee4fb5a6fba349dc14297766e88cd --- /dev/null +++ b/Lucian-Academy/Books/COMPUTER SCIENCE/Principia Mathematica.txt @@ -0,0 +1,14 @@ +["Green, L 2024, Principia Mathematica, Lucian Academy Press, Melbourne.","Green, L 2024",1,"put in Mathematics + +Principia Mathematica + +If we don't know it, how can we say it? The solution is to say something small. + +Optimise breasoning to a point. If the point is not being observed, ignore it. + +Simplify everything to points before making a neuronetwork - compare with binary computation - break down the problem and find individual computations, possibly avoiding the need for neuronetworks (CAW with neuronetworks and PFT for the rest). Strings can be changed to lists to work with PFT; character case changes and adding 1 or multiplying by 10^X can be detected with shallow data. + +Writing framework for infinite knowledge - write about breasonings technology, which always has new things in it. + +The universe can be simulated using breasonings. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/COMPUTER SCIENCE/SSI.txt b/Lucian-Academy/Books/COMPUTER SCIENCE/SSI.txt new file mode 100644 index 0000000000000000000000000000000000000000..76bea51a359e147272ed3aaf9a29f1e3d3a5d252 --- /dev/null +++ b/Lucian-Academy/Books/COMPUTER SCIENCE/SSI.txt @@ -0,0 +1,31 @@ +["Green, L 2022, SSI, Lucian Academy Press, Melbourne.","Green, L 2022",1,"SSI + +1. I examined the trace. +I wrote the retry feature in SSI. First, the interpreter reset to the state. Then, the computer reran the part. It found the bug. +2. The operating system had multiple undos. I wrote the web service in SSI. I broke the algorithm at the point after accepting input. I changed the algorithm to save as it went.
I changed the algorithm to have multiple undos. *1, *2, *3 +3. I drew lines and displayed them in the background. I wrote the graphical user interface in SSI. I had different submit buttons, one for each responsive area. I could convert .ppm and .pbm graphics to web graphics. +I could display text in Lucian's Hand Bitmap Font. +4. Philosophy was in 2D or 3D. The graphical user interface had 2D graphics. I changed graphics with mouseovers. I simulated the text area with graphics, where the text area refreshed on keystrokes and clicks. There were 2D sprites in games. +5. There were moving 2D views of maps. The graphical user interface had 3D graphics. I relaxed my upper body by reaching forward. First, there were (architectural) elevations. Then, there were static 3D views. Then, there were moving 3D views. +6. SSI sped up writing web apps by enabling recursion over different web pages. First, I wrote the Web Prolog development environment. I wrote the code. I read the warnings when I ran it. Finally, I ran the algorithm online in SSI. SSI could act as a system to run the algorithm, with libraries available and could convert algorithms to applications. +7. There was an error-handling library. I had clear error notifications on the web rather than unclear ones. There was an error for a missing predicate. There was an error with a chain of predicates listed. +Rather than a web error, SSI presented a Prolog error. +8. The word processor could run on the smartphone, enabling rapid code drafting. I wrote the word processor for the web. First, I specified the 2D graph's appearance. Then, I asked for missing specifications and automatically generated the needed code. The word processor autosaved progress. +9. I used Javascript to help make an image editor. I constructed the image from the HTML table. The table cells were the pixels. I changed them to a hex colour. I controlled the rows' height and the columns' width. +10. I recorded the project in an API Maker report. 
I wrote the programmable calculator. I wrote the algorithm. I drew the graph. I labelled its features. +11. I handed in the formula. +I wrote the grapher. I flew through the logic graph. Then, I made logical parts of the game with SSI. Finally, I generated the formulae as a computer game. +12. The chatbot accessed the frequently asked administrative or area-of-study questions. I wrote the CAW chatbot. The supercomputer helped with the problem. It interactively helped with programming. It interactively helped write the algorithm from the sentence. +13. There were sentences with key terms describing the mapping between algorithms and files. I simulated file handling with Program Finder. Next, I generated the algorithm, which identified needed files with labels, with the program finder. Finally, I used the file specification to store data at a particular location in the file. +The file locations had a standard labelling system. +14. I described future areas of research. First, I added the foldr command to SSI. Then, I programmed the algorithm to ask about relations between different algorithms. +I generated the documentation for the command. Then, I described the specification using sentences and labels. +15. I discouraged cheating by asking for working, which tutors could help with. Next, I found the matrix for the formula. Next, I tried all combinations of numbers in all dimensions of matrices. Next, I found formulae with operations between matrices and factors. Finally, I simulated the tutorial and college using Prolog. +16. I used the algorithm to feature the most essential (parts of) photographs at the best (range of) sizes. I found the constraints. I tested the formula against the specification. Third, I fitted the articles together on the newspaper page using the constraint satisfaction algorithm. Finally, I used editing software to edit out lower-priority parts of reports in relation to other articles.
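Paragraph 9's image editor, where table cells stand in for pixels, can be sketched as follows (an illustrative Python sketch with a hypothetical helper name, not the SSI implementation):

```python
def pixels_to_html(grid, cell=8):
    """Render a grid of hex colours as an HTML table, one <td> per pixel."""
    rows = []
    for row in grid:
        cells = "".join(
            f'<td style="background:{colour};width:{cell}px;height:{cell}px"></td>'
            for colour in row
        )
        rows.append("<tr>" + cells + "</tr>")
    return '<table style="border-collapse:collapse">' + "".join(rows) + "</table>"

# A 2x2 image: red, green / blue, white.
page = pixels_to_html([["#ff0000", "#00ff00"], ["#0000ff", "#ffffff"]])
```

Controlling the `width` and `height` styles of the cells corresponds to controlling the rows' height and the columns' width in the text.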
+ +Notes + +*1 I updated the specification to meet subgoals. I found singletons in the branch. I counted the number of each variable in each branch (separated if-then branches), including the header. If there was only one instance of a variable, then it was reported. I found combinations of correct configurations to match the specification and fix the singleton in the branch error. +*2 I did the same for state machines and data structures. I simplified the grammar by eliminating extra levels. I used the DFA minimisation algorithm. I rewrote the grammar. I did the same for types and Functional List Prolog. +*3 I stored the code as List Prolog Code, which the interpreter could call as part of a library. I wrote an algorithm to write an algorithm to simplify (modularise) code and perform everyday tasks. First, I found the reused code. Then, I called the collections of reused code (performed common tasks). Finally, I found the optimised code. +Applications could use the menu bar, vector movies, MIDI-creating and playing, neural network, supercomputing, debugging suite and mouse-drag libraries. Web SSI (API Maker) allowed access to files and moved development from the Terminal to multiple web windows. The web apps created could have icons, be run from the smartphone home screen or desktop (online or offline) and be sold. Because they were web-savvy, they could eternally endure the introduction of new machines and operating systems. 
For example, an application may consist of main :- writeln(\"Hello World.\")."] \ No newline at end of file diff --git a/Lucian-Academy/Books/COMPUTER SCIENCE/Women Improving Assessment Connections.txt b/Lucian-Academy/Books/COMPUTER SCIENCE/Women Improving Assessment Connections.txt new file mode 100644 index 0000000000000000000000000000000000000000..3fa8ffd8079750ec5ca5918fc6b7067741ac6275 --- /dev/null +++ b/Lucian-Academy/Books/COMPUTER SCIENCE/Women Improving Assessment Connections.txt @@ -0,0 +1,20 @@ +["Green, L 2022, Women Improving Assessment Connections, Lucian Academy Press, Melbourne.","Green, L 2022",1,"Women Improving Assessment Connections + +1. The woman connected the grammatical standards to her work. The woman saved HTML comments about the connection while completing her computing assessment before taking menstrual leave. She re-rendered the HTML page from List Formatting Language, removing unnecessary code. She read the comment to delete the comment. She deleted the comment. +2. The girl wanted to argue about overcoming headaches. The girls could improve the multiple-choice test with boys by thinking of the connection. The girl couldn't massage her body because she was disabled, so she used the medical question and answer box. She asked herself whether she would be all right. She decided that she would be all right. +3. I noticed that breasoning was an object. The woman was granted equal opportunity with men in checking the connection from assessment design to unit design. The woman wrote that a breasoning was using the language without other terms. She simplified the computational terms. She added analysable words. +4. The woman wrote the web browser, which accessed the page with a table with web protocols. The woman didn't pay too much for childcare costs which she paid for by writing her web browser, a connection of assessments. The woman connected web protocols with text formatting. She wrote the table rendering algorithm. 
Also, she was visually impaired and explained that she couldn't use text-to-speech when it was silent in public. +5. I worked out what the object looked like and recorded this description. The woman underwent education, improving the connection in her short answer test. The woman claimed that the book's title on the utterance contained the word \"sutra\". She adjusted an earlier statement, saying that she should call the question and answer box the quantum box once. I noticed that the mantra and sutra were usually called utterances. +6. She read the chapter, taking notes and making connections in terms of the vernacular. Also, the woman underwent training in writing within a chapter topic. To do this, she paraphrased the sentences. Then, she made connections in the essay. As a result, she wrote future research. +7. I noticed that the students wrote their versions of algorithms with recursion afterwards. The woman held public office, ensuring no discrimination in links from assignment design to unit design. She wrote the small to medium-length algorithm that thoroughly explored the idea. It included no recursion. There was no recursion allowed in interpreters, state machines and type testers. +8. I wrote more extended algorithms and encouraged students to write their own for understanding and development. The woman was voted into the leadership position, stopping bias in links from the unit design to year level design. I saw that the shorter algorithms didn't, or rather, did have complex brackets. There were complex brackets in the interpreter. Also, they were in the type checker. +9. I noticed the shorter algorithms had no hidden complexity and had As. The woman was chosen for the business leadership position, stopping polarity in connections from year level design to degree design. I noticed that the state machine and type finder could have recursion in some conditions because they would be too simple otherwise.
Grammars were more accessible in List Prolog because the content was in string, rather than numerical, form. Also, the lecturer could place common pitfalls of grammars in lecture notes and direct students to the correct way of writing them. +10. Skip made writing the grammar in List Prolog easier. The woman was selected for the non-profit organisation board position, stopping right-wing beliefs in ties from degree design to department design. I approached the grammar bottom-up. I discovered that I needed a look-ahead call in writing the grammar, which I designed myself. I found that the grammar interpreter was easier to write than a grammar. +11. I converted the grammar to predicates and tested whether certain parts were true. The woman headed the school, where she prevented discrimination in connections from department design to school design. I discovered that retry helped with writing grammars. Also, a back feature in trace helped (but required a lot of memory). Also, I just wanted to go to a point in trace and run the trace automatically until a particular line failed. +12. I explained how to debug the interpreter with global trace variables and the advantages and disadvantages of specific testing commands. The woman led the primary school section, ensuring no miscarriage of justice in links from the school design to the local education guidelines. I labelled each line of code as expected to pass or fail and noted what to do if it behaved unexpectedly. I noticed that the call had failed in one circumstance but passed in another and gave rules for whether to stop if it failed. Finally, I saw that I could add this feature to an interpreter I had written. +13. When bisecting the links from philosophy to the algorithm, I satisfied myself with a mind map of related algorithms, then simplified, connected and sorted ideas into categories (considering reasonably different ways of programming them).
The woman led the secondary collaborative teaching team, where she ensured no wrongdoing in links from the school to the state education policies. I wrote As for each predicate and how they could be rewritten and simplified. I noticed that As helped and found a way to automate them. I found the similarities between the algorithms, simplified them to functional calls and bisected inferences between the philosophy and the specifications, ironing out mistakes and predicting how philosophies could be programmed. +14. I wrote the algorithm to explain the algorithm's reasoning and algorithms to find reasons. The woman led the university subject, ensuring no favouritism in connecting the school to the federal education laws. I wrote algorithms to bisect the link between the philosophy and algorithm, etc., and planned writing algorithms for the institution, planned algorithms for each algorithm and how I would complete my philosophy. I wrote an algorithm to do these things. I wrote just enough algorithms, etc., to read. I also wrote about the ideas to test them. +15. I tried to write the algorithm instead of training a neural network, although neural networks achieved better results in some cases. The woman voted in the assessment design vote. I wrote that the neural algorithm pattern matched place to place, like another algorithm. I wrote creatively or summarised it. I explained the algorithm that the neural algorithm followed and used it instead. +16. I found both unique and relevant algorithms. The woman entered the legal contract considering the assessment's what, when, and how. I wrote the algorithm to find reasons for arguments. It simulated matching arguments against experience. It also matched against the experiment. It also matched against computational algorithms.
+ +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green A New Logic Reflecting Language or Natural Logic 1 of 4.txt b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green A New Logic Reflecting Language or Natural Logic 1 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..bb1b01ec1d36ad89b903c84e9c752ca73ba829ff --- /dev/null +++ b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green A New Logic Reflecting Language or Natural Logic 1 of 4.txt @@ -0,0 +1,29 @@ +["Green, L 2021, A New Logic Reflecting Language or Natural Logic 1 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"COMPUTATIONAL ENGLISH +by Lucian Green +A New Logic Reflecting Language or Natural Logic 1 of 4 + + + +1. The symbols for all, there exists, ->, <->, ^, v and ~ (not) need to be updated to account for language's semantic properties. For example, there exist 3 (the symbol there exists! means there exists 1). The symbol ~avb could represent a although b because 'a although' implies ~a and 'although b' implies b, hence ~avb. + +1a. I prepared to catch the rabbit. I did this by chasing the rabbit. First, I found the rabbit. Second, I observed him run. Second, I ran after the rabbit. In this way, I prepared to catch the rabbit by chasing the rabbit. + +2. I prepared to say yes to a consumer. I did this by stating that the card read, 'Yes'. First, I found the card. Second, I read the card. Third, I said, 'Yes'. In this way, I prepared to say yes to a consumer by stating that the card read, 'Yes'. + +3. I prepared to carry the King's tray. I did this by listening to the King say, 'No' ('Yes'). First, I walked along the red carpet. Second, I was the King's audience. Third, I listened to the King say, 'Yes'. In this way, I prepared to carry the King's tray by listening to the King say, 'No' ('Yes'). + +4. I prepared to commission a work. 
I did this by saying, 'Yes'. First, I walked to my discussion partner. Second, I introduced myself. Third, I said, 'Yes'. In this way, I prepared to commission a work by saying, 'Yes'. + +5. I prepared to travel into the water. I did this by pulling the runabout forward. First, I entered the water. Second, I grasped the rope. Third, I pulled the runabout through the water. In this way, I prepared to travel into the water by pulling the runabout forward. + +6. I prepared to entertain the pig. I did this by calling the pig. First, I entered the yard. Second, I saw Charlotte the pig. Third, I called Charlotte. In this way, I prepared to entertain the pig by calling her. + +7. I prepared to serve salad sandwiches. I did this by straining the water from the lettuce by using a colander. First, I placed the lettuce in the colander. Second, I twirled the colander. Third, I removed the water tray. In this way, I prepared to serve salad sandwiches by straining the water from the lettuce by using a colander. + +8. I prepared to avoid too much gas being produced by my stomach. I did this by reminding myself using the digicon (sic) not to eat too much salt, causing too much HCl (hydrochloric acid) to be formed in my stomach. First, I read the stomach icon. Second, I read the label 'NaCl X' printed on the stomach icon. Third, I ate the vegan croissant, which did not contain too much salt. In this way, I prepared to avoid too much gas being produced by my stomach by reminding myself using the digicon (sic) not to eat too much salt, causing too much HCl (hydrochloric acid) to be formed in my stomach. + +9. I prepared to store energy in my body. I did this by eating the natural sugar. First, I picked the apple. Second, I peeled the apple. Third, I ate the apple. In this way, I prepared to store energy in my body by eating the natural sugar. + +10. I prepared to eat dessert. 
I did this by preventing Negeia (sic, things that may possess negative reasons), for example protecting human worth by eating a currant bun. First, I placed the currant bun on a plate. Second, I lifted the currant bun to my lips. Third, I bit the currant bun. In this way, I prepared to eat dessert by preventing Negeia (sic, things that may possess negative reasons), for example protecting human worth by eating a currant bun. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green A New Logic Reflecting Language or Natural Logic 2 of 4.txt b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green A New Logic Reflecting Language or Natural Logic 2 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..20a6baa4cc252993ec56ca060fdba0657313c6af --- /dev/null +++ b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green A New Logic Reflecting Language or Natural Logic 2 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, A New Logic Reflecting Language or Natural Logic 2 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"COMPUTATIONAL ENGLISH +by Lucian Green +A New Logic Reflecting Language or Natural Logic 2 of 4 + +11. I prepared to reduce psychiatric costs. I did this by preventing etiological phenomena (the virus AIDS, in fact maintaining human happiness by using a condom, in fact abstaining from sex). First, I stood at the start of the path. Second, I started walking along the path. Third, I walked along the path until the end of it. In this way, I prepared to reduce psychiatric costs by preventing etiological phenomena (the disease AIDS, in fact maintaining human happiness by using a condom, in fact abstaining from sex). + +12. I prepared to remain healthy. I did this by preventing the virus influenza by avoiding coughing people (promoting gaiety by talking with healthy people). First, I walked into the room. 
Second, I talked with healthy people. Third, I left the room. In this way, I prepared to remain healthy by preventing the virus influenza by avoiding coughing people (promoting gaiety by talking with healthy people). + +13. I prepared to form the international society. I did this by preventing racism (promoting multiculturalism by making friends with all races). First, I made friends with a person from the first race. Second, I prepared to make friends with a person from the next race. Third, I repeated this until I had made friends with people from all of the races. In this way, I prepared to form the international society by preventing racism (promoting multiculturalism by making friends with all races). + +14. I prepared to make sure people of different genders had equal rights. I did this by preventing sexism (promoting gender equality by employing equal numbers of men and women). First, I employed a man. Second, I employed a woman. Third, I repeated this until all the positions had been filled. In this way, I prepared to make sure people of different genders had equal rights by preventing sexism (promoting gender equality by employing equal numbers of men and women). + +15. I prepared to walk to the next room. I did this by preventing murder (suggesting the first person held the door open). First, I suggested the first person hold the door open. Second, she held it open. Third, everyone walked through. In this way, I prepared to walk to the next room by preventing murder (suggesting the first person held the door open). + +16. I prepared to give people enough room to move. I did this by preventing rape (promoting sexual respect by giving people personal space). First, I imagined a one-metre-square box around the person's feet. Second, I walked towards the box. Third, I stopped when I reached the box. In this way, I prepared to give people enough room to move by preventing rape (promoting sexual respect by giving people personal space). + +17. 
I prepared to make sure that the place where I was was safe. I did this by preventing terrorism (promoting counter-terrorism measures by reporting an abandoned bag on the ground). First, I identified the isolated bag. Second, I double-checked it didn't belong to anyone. Third, I reported it to authorities. In this way, I prepared to make sure that the place where I was was safe by preventing terrorism (promoting counter-terrorism measures by reporting an abandoned bag on the ground). + +18. I prepared the person. I did this by preventing pedophilia (respecting the person). First, I walked to the person. Second, I shook his hand. Third, I walked back. In this way, I prepared the person by preventing pedophilia (respecting the person). + +19. I prepared to play in the house. I did this by preventing nuclear weapons (respecting people by building a cubby house). First, I built the floor. Second, I built the walls. Third, I built the roof. In this way, I prepared to play in the house by preventing nuclear weapons (respecting people by building a cubby house). + +20. I prepared to be a creative writing/philosophy pedagogue. I did this by preventing mistakes (maintaining correctness by undertaking training for my primary job). First, I retrieved the job training I needed. Second, I contrasted the job training's skills with the skills I needed for my job. Third, I completed additional training to learn the skills I needed for my job. In this way, I prepared to be a creative writing/philosophy pedagogue by preventing mistakes (maintaining correctness by undertaking training for my primary job).
+ +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green A New Logic Reflecting Language or Natural Logic 3 of 4.txt b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green A New Logic Reflecting Language or Natural Logic 3 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..88de3f0df19fbc3b30ebf01879bca219b7442d97 --- /dev/null +++ b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green A New Logic Reflecting Language or Natural Logic 3 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, A New Logic Reflecting Language or Natural Logic 3 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"COMPUTATIONAL ENGLISH +by Lucian Green +A New Logic Reflecting Language or Natural Logic 3 of 4 + +21. I prepared to eat the tofu nuggets. I did this by preventing animal products from being produced (I cooked the tofu). First, I placed the tofu in the wok. Second, I started to cook it. Third, I stopped cooking it after 4 minutes. In this way, I prepared to eat the tofu nuggets by preventing animal products from being produced (I cooked the tofu). + +22. I prepared to return true when three variables were true. I did this by designing a quantum light circuit. First, I shone a red light at the electrode. Second, I shone a green light at a mirror in the path of the red light, forming a beam of yellow light shining at the electrode. Third, I shone a blue light at a mirror in the path of the yellow light, forming a beam of white light shining at the electrode, triggering the electrode. In this way, I prepared to return true when three variables were true by designing a quantum light circuit. + +23. I prepared to abbreviate a sentence. I did this by way of poseia (positive ideas) for example, I performed a magic trick (I made a sentence expander). 
First, I wrote, 'Madam, I'm Adam,' then folded the end of the strip of paper over to cover the name 'Adam'. Second, I read the sentence, 'Madam, I'm'. Third, I unfolded the strip of paper to read the expanded sentence, 'Madam, I'm Adam'. In this way, I prepared to abbreviate a sentence by way of poseia (positive ideas) for example, I performed a magic trick (I made a sentence expander). +24. I prepared to comment that the peach wrapper went well with the peach. I did this by defining sex (in fact, eating a peach). First, I unwrapped the peach. Second, I verified that the peach was ripe. Third, I consumed the peach. In this way, I prepared to comment that the peach wrapper went well with the peach by defining sex (in fact, eating a peach). +25. I prepared to attach ideas to what the employees said. I did this by exercising equality in relation to gays (I demonstrated equality in employing workers). First, I asked an assistant to assign numbers to the applicants. Second, I selected the numbers of applicants who knew enough details. Third, I memorised their names. In this way, I prepared to attach ideas to what the employees said by exercising equality in relation to gays (I demonstrated equality in employing workers). +26. I prepared to quadruple the number of possible breasonings, in conjunction with collecting breasonings from new spoken languages. I did this by creating new breasonings by translating into new computer languages. First, I wrote the breasoning. Second, I wrote it in a new computer language. Third, I wrote the breasoning(s) which it signified. In this way, I prepared to quadruple the number of possible breasonings, in conjunction with collecting breasonings from new spoken languages by creating new breasonings by translating into new computer languages. +27. I prepared to drink the cordial. I did this by making peace with a dialogue participant by eating carrot sandwiches together. First, I held the carrot sandwich.
Second, I placed the carrot sandwich in my mouth. Third, I bit it. In this way, I prepared to drink the cordial by making peace with a dialogue participant by eating carrot sandwiches together. + +28. I prepared to be famous. I did this by following the naked-person-graph (designing an education activity to teach a Prolog program). First, I wrote the input to the predicate. Second, I computed the output of the predicate. Third, I repeated this until I had computed the result of the Prolog program. In this way, I prepared to be famous by following the naked-person-graph (designing an education activity to teach a Prolog program). + +29. I prepared to show a high quality thought. I did this by writing a 10 breasoning A (non-hit) or 25 breasoning A (hit) to sell an internet video talk or course. First, I contacted Scripsi. Second, I contacted Quadrant. Third, I contacted Meanjin. In this way, I prepared to show a high quality thought by writing a 25 breasoning A to sell an internet video talk or course. + +30. I prepared to satisfy the professional requirements for a product. I did this by writing 5 As (non-hit) or 10 As (hit) to sell an internet video talk or course. First, I contacted the New Yorker. Second, I contacted Punch. Third, I contacted Cosmopolitan. In this way, I prepared to satisfy the professional requirements for a product by writing 5 As (non-hit) or 10 As (hit) to sell an internet video talk or course. 
+ +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green A New Logic Reflecting Language or Natural Logic 4 of 4.txt b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green A New Logic Reflecting Language or Natural Logic 4 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..7fe6be6d06af9f2140abdda357a36afde6418a82 --- /dev/null +++ b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green A New Logic Reflecting Language or Natural Logic 4 of 4.txt @@ -0,0 +1,9 @@ +["Green, L 2021, A New Logic Reflecting Language or Natural Logic 4 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"COMPUTATIONAL ENGLISH +by Lucian Green +A New Logic Reflecting Language or Natural Logic 4 of 4 + +31. I prepared to become a web fan. I did this by running a viral algorithm. First, I breasoned out 5 As. Second, I performed 45 brain thoughts after meditating using 80 lucian mantras (drawing an illustration). Third, I dotted on a grid, making, doing, having, time to prepare for the video web site counter, a large cloud to protect oneself from the wires being felt, a non-famous wire and a famous wire. In this way, I prepared to become a web fan by running a viral algorithm. + +32. I prepared to write for people. I did this by spending money to advertise a video. First, I entered the demographic place. Second, I entered the demographic time. Third, I entered the contribution amount. In this way, I prepared to write for people by spending money to advertise a video. 
+ +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Analysing characteristics of arguments 1 of 4.txt b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Analysing characteristics of arguments 1 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..f15020cba085dcf43215efa11ed088b8159bb84b --- /dev/null +++ b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Analysing characteristics of arguments 1 of 4.txt @@ -0,0 +1,41 @@ +["Green, L 2021, Analysing characteristics of arguments 1 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"COMPUTATIONAL ENGLISH +by Lucian Green +Analysing characteristics of arguments 1 of 4 + + + +1. After reading a page on Agnès van Rees, the Director of the project Characteristics of Argumentative Discourse found using the Yahoo! search criteria of 'narratology, contention and characteristics', I became interested in resolution of differences of opinion in everyday discussion. + +This gave me the following ideas: + +Developmental Conglish, i.e. how if necessary raw evidence needs to be refined and shaped into an argument + +Whether two sides of a debate are arguing on the same point. + +Deciding which side of a given debate is right. + +Do the premises in the argument satisfy all co-premises? + +Do the premises work in all cases? + +1a. I prepared to eat with the duchess. I did this by making the quince tart. First, I cooked the quince. Second, I prepared the pastry. Third, I cooked the quince tart. In this way, I prepared to eat with the duchess by making the quince tart. + +2. I prepared to visit the sea. I did this by leading with the trident. First, I found a group of people. Second, I placed them in a line. Third, I led them with a trident. In this way, I prepared to visit the sea by leading with the trident. + +3. I prepared to spread love across the land. 
I did this by teaching meditation (body metaphor) to the group of students. First, I asked them to take the human resources test. Second, I taught them about the mitigation strategies. Third, I taught them the degree. In this way, I prepared to spread love across the land by teaching meditation (body metaphor) to the group of students. +4. I prepared to be a pedagogue-creator like Nietzsche. I did this by moving the game counter forward. First, I threw the dice. Second, I lifted the game counter. Third, I placed it on the square ahead of the current square by the total number on the dice. In this way, I prepared to be a pedagogue-creator like Nietzsche by moving the game counter forward. +5. I prepared to be discreet with the winnings. I did this by winning the game prize. First, I lifted the game prize to my chest. Second, I lifted it to my head. Third, I lifted it above my head. In this way, I prepared to be discreet with the winnings by winning the game prize. +6. I prepared to describe the motion of the argument. I did this by drawing arrows through layers of argument characteristics. First, I found the first arrow. Second, I extended it from the first reason. Third, I pointed it at another reason. In this way, I prepared to describe the motion of the argument by drawing arrows through layers of argument characteristics. +7. I prepared to understand the argument. I did this by solving problems relating to argument characteristics. First, I solved the first container of problems relating to argument characteristics. Second, I solved the second container of problems relating to argument characteristics. Third, I solved the third container of problems relating to argument characteristics. In this way, I prepared to understand the argument by solving problems relating to argument characteristics. +8. I prepared to wear the hat. I did this by curving the hat around the head. First, I located the head. Second, I curved the hat around it.
Third, I pinned the hat closed. In this way, I prepared to wear the hat by curving the hat around the head. + +9. I prepared to wear the shirtsleeve. I did this by curving the shirtsleeve around the arm. First, I located the arm. Second, I curved the shirtsleeve around it. Third, I safety pinned the shirtsleeve closed. In this way, I prepared to wear the shirtsleeve by curving it around the arm. + +10. I prepared to wear the shirt. I did this by curving the shirt around the torso. First, I located the torso. Second, I curved the shirt around it. Third, I stitched the shirt closed with a needle and thread. In this way, I prepared to wear the shirt by curving the shirt around the torso. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Analysing characteristics of arguments 2 of 4.txt b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Analysing characteristics of arguments 2 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..f1dcba97a3e72f4424253f51505e2c6eccca90 --- /dev/null +++ b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Analysing characteristics of arguments 2 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Analysing characteristics of arguments 2 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"COMPUTATIONAL ENGLISH +by Lucian Green +Analysing characteristics of arguments 2 of 4 + +11. I prepared to be a meditation (philosophy) teacher. I did this by virtuously running up the pole. First, I stepped on the first rung of the stairs on the pole. Second, I prepared to walk to the next step. Third, I repeated this until I had run up the pole. In this way, I prepared to be a meditation (philosophy) teacher by virtuously running up the pole. + +12. I prepared to answer the next question. I did this by verifying that the answer was correct. First, I read the row that the answer was in.
Second, I read the answer in this row and the column of my set of correct answers. Third, I verified that they were the same. In this way, I prepared to answer the next question by verifying that the answer was correct. + +13. I prepared to play checkers with a person. I did this by verifying that the person was good. First, I verified that the person did good deeds. Second, I verified that the person had good moral character. Third, I verified that the person had a clean criminal record. In this way, I prepared to play checkers with a person by verifying that the person was good. + +14. I prepared to sell more danishes. I did this by testing that the vegan danish was delicious. First, I lifted the vegan danish to my lips. Second, I bit the apricot in the vegan danish. Third, I tested that it was delicious. In this way, I prepared to sell more danishes by testing that the vegan danish was delicious. + +15. I prepared to compose using the major triad (happy chord). I did this by ascertaining that the music was harmonious. First, I worked out that the base note in the chord was 'Do'. Second, I worked out that the second note in the chord was 'Mi'. Third, I worked out that the third note in the chord was 'Sol'. In this way, I prepared to compose using the major triad (happy chord) by ascertaining that the music was harmonious. + +16. I prepared to eat the vegan yoghurt. I did this by eating the slice of apple. First, I wrote the seen-as version of meditation (philosophy). Second, I wrote lectures on it. Third, I sliced the part of work off to do like eating a slice of apple. In this way, I prepared to eat the vegan yoghurt by eating the slice of apple. + +17. I prepared to be served the main course. I did this by giving positive feedback about eating the apple. First, I ate the apple. Second, I wrote the positive feedback about eating the apple on a card. Third, I handed the card to an attendant.
In this way, I prepared to be served the main course by giving positive feedback about eating the apple. + +18. I prepared to be warm in winter. I did this by wearing the jumper. First, I wrote the seen-as version of medicine (English). Second, I wrote lectures on it. Third, I summarised the lectures like wearing the jumper. In this way, I prepared to be warm in winter by wearing the jumper. + +19. I prepared to swallow the cherry. I did this by licking the cherry. First, I held the cherry. Second, I placed it on my tongue. Third, I lolled it in my mouth. In this way, I prepared to swallow the cherry by licking the cherry. + +20. I prepared to avoid disturbing (seeing) the people. I did this by walking past the people. First, I saw the redcoats. Second, I plotted a path going around them. Third, I walked around them. In this way, I prepared to avoid disturbing (seeing) the people by walking past the people. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Analysing characteristics of arguments 3 of 4.txt b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Analysing characteristics of arguments 3 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..e196c571a8345f2463d635c7aec23a6787b3d952 --- /dev/null +++ b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Analysing characteristics of arguments 3 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Analysing characteristics of arguments 3 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"COMPUTATIONAL ENGLISH +by Lucian Green +Analysing characteristics of arguments 3 of 4 + +21. I prepared to observe the mouse man ascend the group of slopes. I did this by observing the mouse man run up the slope. First, I observed the mouse man stand at the foot of the slope. Second, I observed the mouse running up the slope. 
Third, I observed the mouse stop at the top of the slope. In this way, I prepared to observe the mouse man ascend the group of slopes by observing the mouse man run up the slope. + +22. I prepared to get a free treat. I did this by loving God (I hugged the master). First, I walked to the master. Second, I held his shoulders. Third, I hugged him. In this way, I prepared to get a free treat by loving God (I hugged the master). + +23. I prepared to write how the newspaper was influenced by my pedagogical argument. I did this by writing that the newspaper was influenced by my pedagogical argument. First, I observed the newspaper article writer read my pedagogical argument. Second, I observed her write the article. Third, I read the article. In this way, I prepared to write how the newspaper was influenced by my pedagogical argument by writing that the newspaper was influenced by my pedagogical argument. + +24. I prepared to prepare for the first set of buildings in the Lucian Academy. I did this by writing that the Lucian Academy had 5 As for buildings. First, I wrote the first collection of buildings had 5 As. Second, I wrote that the second collection of buildings had 5 As. Third, I wrote that the third collection of buildings had 5 As. In this way, I prepared to prepare for the first set of buildings in the Lucian Academy by writing that the Lucian Academy had 5 As for buildings. + +25. I prepared to prepare for the first set of managers in the Lucian Academy. I did this by writing that the Lucian Academy had 5 As for managers. First, I wrote the first collection of managers had 5 As. Second, I wrote that the second collection of managers had 5 As. Third, I wrote that the third collection of managers had 5 As. In this way, I prepared to prepare for the first set of managers in the Lucian Academy by writing that the Lucian Academy had 5 As for managers. + +26. I prepared to prepare for the first set of teachers in the Lucian Academy. 
I did this by writing that the Lucian Academy had 5 As for teachers. First, I wrote that the first collection of teachers had 5 As. Second, I wrote that the second collection of teachers had 5 As. Third, I wrote that the third collection of teachers had 5 As. In this way, I prepared to prepare for the first set of teachers in the Lucian Academy by writing that the Lucian Academy had 5 As for teachers. + +27. I prepared to prepare for the first set of volunteers in the Lucian Academy. I did this by writing that the Lucian Academy had 50 As for volunteers. First, I wrote that the first collection of volunteers had 50 As. Second, I wrote that the second collection of volunteers had 50 As. Third, I wrote that the third collection of volunteers had 50 As. In this way, I prepared to prepare for the first set of volunteers in the Lucian Academy by writing that the Lucian Academy had 50 As for volunteers. + +28. I prepared to demonstrate initiative in the industry. I did this by observing the voluntary student write the philosophy essay. First, I observed her write the exposition. Second, I observed her write the critique. Third, I observed her write the introduction and conclusion of the essay. In this way, I prepared to demonstrate initiative in the industry by observing the voluntary student write the philosophy essay. + +29. I prepared to build a taskforce of meditation (philosophy) teachers. I did this by teaching the volunteer meditation (philosophy). First, she gave me a donation for the degree. Second, I taught her the utterance (word). Third, I allowed her to repeat the utterance (word). In this way, I prepared to build a taskforce of meditation (philosophy) teachers by teaching the volunteer meditation (philosophy). + +30. I prepared to teach meditation (societology). I did this by watching the manager give permission to advertise at the market. First, I asked the manager for permission to advertise at the market.
Second, I watched the manager consider whether to give permission to advertise at the market. Third, I watched the manager give permission to advertise at the market. In this way, I prepared to teach meditation (societology) by watching the manager give permission to advertise at the market. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Analysing characteristics of arguments 4 of 4.txt b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Analysing characteristics of arguments 4 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..2548426f22834abac47e797acaf989af94e79852 --- /dev/null +++ b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Analysing characteristics of arguments 4 of 4.txt @@ -0,0 +1,9 @@ +["Green, L 2021, Analysing characteristics of arguments 4 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"COMPUTATIONAL ENGLISH +by Lucian Green +Analysing characteristics of arguments 4 of 4 + +31. I prepared to observe the volunteer vitiate the community. I did this by observing the volunteer teach meditation (philosophy of music). First, she gave a copy of the degree to her student. Second, her student breasoned out the degree. Third, the volunteer taught her student meditation (philosophy of music) as she had been taught. In this way, I prepared to observe the volunteer vitiate the community by observing the volunteer teach meditation (philosophy of music). + +32. I prepared to entice the new client. I did this by advertising meditation (the body metaphor) at the market. First, I erected the sign. Second, I handed out the pamphlet. Third, I spoke by word of mouth about the place and time of meditation (the body metaphor) classes at the centre. In this way, I prepared to entice the new client by advertising meditation (the body metaphor) at the market. 
+ +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Computational English Argument 1 of 4.txt b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Computational English Argument 1 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..7c30736f2fec952c54137ccc7ae83063af4db0e6 --- /dev/null +++ b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Computational English Argument 1 of 4.txt @@ -0,0 +1,31 @@ +["Green, L 2021, Computational English Argument 1 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"COMPUTATIONAL ENGLISH +by Lucian Green +Computational English Argument 1 of 4 + + + +1. Computational English uses techniques to disseminate texts hermeneutically (interpret them). + +1a. Computational English uses techniques to disseminate texts hermeneutically (interpret them). The first technique uses the text to construct a philosophical argument (see the site). The 'essays' produced by the scripts connect the points in the argument. A real essay would have to be written separately from the essays. + +1a. I prepared to program the robot to deliver the speech. I did this by writing the text-to-speech algorithm. First, I wrote down the text. Second, I recorded the speech for this text. Third, I converted the text into the speech using a grammar. In this way, I prepared to program the robot to deliver the speech by writing the text-to-speech algorithm. + +2. I prepared to write how the robot would discover science. I did this by writing the question-answering algorithm. First, I wrote down the question word. Second, I wrote down the key term in the question. Third, I wrote down the answer to the question, given the question word and key term. In this way, I prepared to write how the robot would discover science by writing the question-answering algorithm. + +3.
I prepared to disambiguate the text containing an anaphor. I did this by writing the anaphor resolution algorithm, where an anaphor is a word such as 'him' or 'it,' which refers to other words. First, I wrote down 'John ate the apple'. Second, I wrote down 'It was delicious'. Third, I resolved the anaphor by writing that the anaphor 'It' in the second sentence referred to the word 'apple' in the first sentence, where 'apple' is the most recent object that 'It' could refer to. In this way, I prepared to disambiguate the text containing an anaphor by writing the anaphor resolution algorithm, where an anaphor is a word such as 'him' or 'it,' which refers to other words. + +4. I prepared to program the robot reporter to write the Hansard in parliament. I did this by converting speech into text. First, I listened to the speech. Second, I converted the frequency spectrum into a set of phonemes, where phonemes are distinct units of sound that distinguish one word from another. Third, I wrote down the text from the phonemes. In this way, I prepared to program the robot reporter to write the Hansard in parliament by converting speech into text. + +5. I prepared to write the novel's form in letters, for example 'ABCDA'. I did this by writing down the novel's form through space. First, I wrote down the location in space of the first scene in the novel. Second, I prepared to write down the location in space of the next scene in the novel. Third, I repeated this until I had written down the location in space of each scene in the novel. In this way, I prepared to write the novel's form in letters, for example 'ABCDA,' by writing down the novel's form through space. + +6. I prepared to give the novel a 'positive' classification. I did this by determining that the genre of a novel was comedy. First, I wrote down that the first sentence in the novel was positive, therefore comical. Second, I prepared to write down that the next sentence in the novel was positive, therefore comical.
Third, I repeated this until I had written down that all the sentences in the novel were positive, therefore comical. In this way, I prepared to give the novel a 'positive' classification by determining that the genre of a novel was comedy. + +7. I prepared to give the novel a 'positive dialogue' classification. I did this by determining that the genre of a novel was drama. First, I wrote down that the first sentence in the novel was positive, therefore important. Second, I prepared to write down that the next sentence in the novel was positive, therefore important. Third, I repeated this until I had written down that all the sentences in the novel were positive, therefore important. In this way, I prepared to give the novel a 'positive dialogue' classification by determining that the genre of a novel was drama. + +8. I prepared to calculate the used space in the house in the short story. I did this by calculating the space complexity in Computational English. First, I counted the number of characters of space that the short story's world was in length. Second, I skipped counting the number of characters of space that the short story's world was in length where this space was revisited. Third, I summed only the cumulative number of characters of space that the short story's world was in length. In this way, I prepared to calculate the used space in the house in the short story by calculating the space complexity in Computational English. + +9. I prepared to calculate how long the character walked around for. I did this by calculating the time complexity in Computational English. First, I counted the number of steps that were used to traverse the short film's world. Second, I counted the number of steps that were used to traverse the short film's world where this space was revisited. Third, I summed the cumulative number of steps that were used to traverse the short film's world.
In this way, I prepared to calculate how long the character walked around for by calculating the time complexity in Computational English. + +10. I prepared to write how the text made sense. I did this by writing the summarisation algorithm. First, I summarised each object in the text into a hierarchy of types of nouns and verbs. Second, I read the number of the level of resolution of summary that was required. Third, I printed the summary types at this level, where 1 = most summarised and n = least summarised, with n being the depth of detail in the text. In this way, I prepared to write how the text made sense by writing the summarisation algorithm. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Computational English Argument 2 of 4.txt b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Computational English Argument 2 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..2aef3700261f7de0d8e01e25432333611aa8a7b0 --- /dev/null +++ b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Computational English Argument 2 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Computational English Argument 2 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"COMPUTATIONAL ENGLISH +by Lucian Green +Computational English Argument 2 of 4 + +11. I prepared to store the positive Computational English phenomena in a box. I did this by finding the positive results in Computational English. First, I wrote down the first leg of the path. Second, I prepared to write down the next leg of the path. Third, I repeated this until the correct result had been positively found. In this way, I prepared to store the positive Computational English phenomena in a box by finding the positive results in Computational English. + +12. I prepared to write what the Computational English algorithm couldn't do outside a circle.
I did this by finding the negative results in Computational English (actions that only other algorithms could do). First, I wrote down the first leg of the path. Second, I prepared to write down the next leg of the path. Third, I repeated this until a block was reached before the correct result had been positively found, indicating a negative result, i.e. the desired result may only be found by another algorithm. In this way, I prepared to write what the Computational English algorithm couldn't do outside a circle by finding the negative results in Computational English (actions that only other algorithms could do). + +13. I prepared to count the number of character interrelationships in the narrative. I did this by writing the character ontologies in Computational English. First, I wrote the names of the characters on the sheet of paper. Second, I drew circles around each group of characters. Third, I drew directional lines on sheets of paper representing different stages of the narrative between each pair of characters in a group and each pair of groups, etc. In this way, I prepared to count the number of character interrelationships in the narrative by writing the character ontologies in Computational English. + +14. I prepared to say that what happened is what was supposed to have happened. I did this by writing the plot ontologies in Computational English. First, I wrote down the first theme mentioned in the plot. Second, I prepared to write down the next theme mentioned in the plot. Third, I repeated this until I had written down all the themes mentioned in the plot. In this way, I prepared to say that what happened is what was supposed to have happened by writing the plot ontologies in Computational English. + +15. I prepared to write how wanting to play caused a chain reaction through the settings. I did this by writing the setting ontologies in Computational English. First, I wrote down the first setting mentioned in the story, the garden.
Second, I prepared to write down the next setting mentioned in the story, the children's cutting room. Third, I repeated this until I had written down all the settings mentioned in the story, including the refrigerator. In this way, I prepared to write how wanting to play caused a chain reaction through the settings by writing the setting ontologies in Computational English. + +16. I prepared to write that the author achieved all her aims. I did this by writing the aim of the author in Computational English. First, I wrote that the author chose the bull's-eye to aim for. Second, I wrote that the author selected possible text to use to be represented by the bull's-eye. Third, I wrote that the author pointed with an arrow at words that were best at achieving her aim. In this way, I prepared to write that the author achieved all her aims by writing the aim of the author in Computational English. + +17. I prepared to write that influence was an experiential midpoint between argument and interpretation. I did this by writing the author's influence in Computational English. First, I wrote about the person who was part of an influx through the author's life. Second, I wrote what the person said. Third, I wrote the author's reply to what the person said. In this way, I prepared to write that influence was an experiential midpoint between argument and interpretation by writing the author's influence in Computational English. + +18. I prepared to evince logical correctness. I did this by writing that the statement 'the man made it' was correct in Computational English. First, I wrote that the man walked along the track. Second, I wrote that the man reached the line. Third, I wrote that it was correct that the man made it. In this way, I prepared to evince logical correctness by writing that the statement 'the man made it' was correct in Computational English. + +19. I prepared to verify that the title of the book was good. I did this by writing that the statement 'I wrote the book' was good in Computational English.
First, I verified that the people involved in writing the book were good. Second, I verified that the language in the book was good. Third, I verified that a model stage production of the book was good. In this way, I prepared to verify that the title of the book was good by writing that the statement 'I wrote the book' was good in Computational English. + +20. I prepared to publish the book. I did this by verifying that the text was well written. First, I verified that the text was factual. Second, I verified that the text contained good ideas. Third, I verified that the text was grammatical. In this way, I prepared to publish the book by verifying that the text was well written. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Computational English Argument 3 of 4.txt b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Computational English Argument 3 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..65326d333425389d57efb3ea707297dd0aff912c --- /dev/null +++ b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Computational English Argument 3 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Computational English Argument 3 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"COMPUTATIONAL ENGLISH +by Lucian Green +Computational English Argument 3 of 4 + +21. I prepared to write the book. I did this by writing the text. First, I thought of a topic. Second, I wrote the reason. Third, I wrote an inference between the reason and the conclusion. In this way, I prepared to write the book by writing the text. + +22. I prepared to collect evidence on a topic. I did this by recording an object's registration number. First, I looked at an object. Second, I thought of its registration number. Third, I recorded it in the computer program.
In this way, I prepared to collect evidence on a topic by recording an object's registration number. + +23. I prepared to become immortal (e.g. remembered until a point in time). I did this by recording the knowledge in the form of modus ponens. First, I wrote 'I am happy because of placing the pink skittle on the ground'. Second, I wrote, 'I am happy because of placing the pink skittle on the ground, therefore you are happy because of knocking the pink skittle over with the green ball'. Third, I wrote, 'You are happy because of knocking the pink skittle over with the green ball'. In this way, I prepared to become immortal (e.g. remembered until a point in time) by recording the knowledge in the form of modus ponens. + +24. I prepared to bring arguments to all areas of study. I did this by constructing an argument from experience. First, I wrote down the first experience. Second, I wrote down the second experience. Third, I connected these experiences with an inference that I had experienced. In this way, I prepared to bring arguments to all areas of study by constructing an argument from experience. + +25. I prepared to make the most central discoveries on Earth. I did this by writing on the topic 'Encuntglish,' about the most important noumenon being opened up like a book. First, I opened the book. Second, I considered what to write. Third, I wrote down my discovery. In this way, I prepared to make the most central discoveries on Earth by writing on the topic 'Encuntglish,' about the most important noumenon being opened up like a book. + +26. I prepared to make money from positive circles. I did this by writing that Inclish referred to Positivity Inc. First, I verified that the first term was positive. Second, I verified that the next term was positive. Third, I repeated this until I had verified that all the terms were positive. In this way, I prepared to make money from positive circles by writing that Inclish referred to Positivity Inc.
+ +27. I prepared to verify that the soldier had packed lunch. I did this by writing that Inglish referred to 'in' being unified with another instance of 'in'. First, I placed a chocolate in the child's hand. Second, I verified that the chocolate was still in the child's hand. Third, I verified that the child had eaten the chocolate. In this way, I prepared to verify that the soldier had packed lunch by writing that Inglish referred to 'in' being unified with another instance of 'in'. + +28. I prepared to write a seen-as version for a student's essay. I did this by writing that Conglish referred to Computational English, the detailed phenomenological investigation into computational algorithms for English functions. First, I wrote the sense data that a person was interested in learning meditation (e.g. reading a book). Second, I connected this information with the knowledge to teach meditation in my brain (e.g. looked up values in a table). Third, I taught the meditation student meditation by telling him to silently repeat God's name, the mantra 'Lucian', for twenty minutes each day, until the mantra becomes more and more refined, transcending and clearing thoughts that are thought of as byproducts of stress (e.g. explained the mathematical equation). In this way, I prepared to write a seen-as version for a student's essay by writing that Conglish referred to Computational English, the detailed phenomenological investigation into computational algorithms for English functions. + +29. I prepared to tell the story of the loaf. I did this by writing that Kinglish referred to the King's symbols. First, I read the eight-pointed star. Second, I read the loaf-eater symbol. Third, I read the orb symbol. In this way, I prepared to tell the story of the loaf by writing that Kinglish referred to the King's symbols. + +30. I prepared to design icons for each essay topic. I did this by writing that Basic Ciccy referred to an icon being designed for an essay topic.
First, I read the essay topic. Second, I designed an icon for it. Third, I clicked on the icon. In this way, I prepared to design icons for each essay topic by writing that Basic Ciccy referred to an icon being designed for an essay topic. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Computational English Argument 4 of 4.txt b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Computational English Argument 4 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..6f6164d5165ff6a6a86001811256b6cd0011311f --- /dev/null +++ b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Computational English Argument 4 of 4.txt @@ -0,0 +1,29 @@ +["Green, L 2021, Computational English Argument 4 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"COMPUTATIONAL ENGLISH +by Lucian Green +Computational English Argument 4 of 4 + +31. I prepared to discover the protein code. I did this by writing that Bioschemistry (sic) referred to computational biochemistry, the study of the computational nature of proteins. First, I discovered the first biochemical reaction. Second, I prepared to discover the next biochemical reaction. Third, I repeated this until I had discovered all the biochemical reactions and the desired result had been achieved. In this way, I prepared to discover the protein code by writing that Bioschemistry (sic) referred to computational biochemistry, the study of the computational nature of proteins. + +32. I prepared to make a protein computer. I did this by writing that Bicochemistry (sic) referred to biochemical computation, the study of the manipulation of the computational nature of proteins. First, I caused the first biochemical reaction. Second, I prepared to observe the next biochemical reaction.
Third, I repeated this until I had observed all the biochemical reactions and the desired result had been achieved. In this way, I prepared to make a protein computer by writing that Bicochemistry (sic) referred to biochemical computation, the study of the manipulation of the computational nature of proteins. + +33. I prepared to help with one threshold. I did this by choosing one threshold. First, I interpreted it. Second, I chose God (philosopher) lines. Third, I liked it. In this way, I prepared to help with one threshold by choosing one threshold. + +34. I prepared to write laws. I did this by writing laws and guidelines for writing breasonings currency. First, I made compost. Second, I found a mug. Third, I drank from it. In this way, I prepared to write laws by writing laws and guidelines for writing breasonings currency. + +35. I prepared to write on you too. I did this by writing topics to write breasonings currency on. First, I wrote the artemis ardamon. Second, I wrote you, As. Third, I liked your rods. In this way, I prepared to write on you too by writing topics to write breasonings currency on. + +36. I prepared to know Artemis. I did this by researching the product of the breasoning currency. First, I wrote on the breasoning currency. Second, I held aloft the Parises Whittaker. Third, I knew about Dick Whittington. In this way, I prepared to know Artemis by researching the product of the breasoning currency. + +37. I prepared to write on breasonings currency. I did this by instituting training in Pedagogy. First, I found the raptures. Second, I held up too soon. Third, I had specialised breasonings currency writing training. In this way, I prepared to write on breasonings currency by instituting training in Pedagogy. + +38. I prepared to join up doughnut religion (philosophy). I did this by training in philosophy. First, I wrote philosophy. Second, I wrote on tizziwinkle's flappings. Third, I designed you.
In this way, I prepared to join up doughnut religion (philosophy) by training in philosophy. + +39. I prepared to unend things. I did this by training in creative writing. First, I gained the confidence to pick up the pen. Second, I gained the confidence to write. Third, I breasoned out the breasonings. In this way, I prepared to unend things by training in creative writing. + +40. I prepared to . I did this by training in business. First, I stated, 'That's what the word currency is about'. Second, I identified the receipt. Third, I designed you too. In this way, . + +41. I prepared to dine on the quince. I did this by researching breasonings currency with ethics approval. First, I designed ethics approval. Second, I designed the man's manuscripts. Third, I held the effigies. In this way, I prepared to dine on the quince by researching breasonings currency with ethics approval. + +42. I prepared to connect 50 As to each sentence. I did this by setting breasonings currency medicine as the domain of my PhD. First, I researched the medicinal requirements of the product. Second, I wrote the epistemological details about it. Third, I held the nanotode (sic). In this way, I prepared to connect 50 As to each sentence by setting breasonings currency medicine as the domain of my PhD.
+ +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Computational English is like a Calculator 1 of 4.txt b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Computational English is like a Calculator 1 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..7ea886af15da294c0b028f864b64cd1bb65c2f0b --- /dev/null +++ b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Computational English is like a Calculator 1 of 4.txt @@ -0,0 +1,31 @@ +["Green, L 2021, Computational English is like a Calculator 1 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"COMPUTATIONAL ENGLISH +by Lucian Green +Computational English is like a Calculator 1 of 4 + + + +1. Computational English's algorithms are like those of a calculator. + +1a. Will Computational English be like a calculator, as in people should understand the theory before relying on it? Advantages: it will allow focus on more material, as more data can be analysed (information would have to be presented in such a way as to allow reviewing in the best way possible). The user will be able to program using Conglish's (Computational English) features (writing an essay, filling out a grammar to recognise sentences with the same meaning, compressing the program, e.g. 2 identical conclusions compressed to 1 conclusion with reasons from both attached to it, and taking action based on arriving at a conclusion, e.g. returning a telephone call, cooking a meal, cleaning up or writing a poem). + +1a. I prepared to drink the milk. I did this by eating bran. First, I ate the bran. Second, I ate the sultanas. Third, I ate the dried apple pieces. In this way, I prepared to drink the milk by eating bran. + +2. I prepared to eat the apple. I did this by drinking cherry juice. First, I cut up the cherries with a knife. Second, I cooked them. Third, I strained their juice.
In this way, I prepared to eat the apple by drinking cherry juice. + +3. I prepared to eat the strawberry. I did this by loving you eating the shortcake. First, I sifted the flour, baking soda, two tablespoons of sugar, baking powder and salt. Second, I added in 1.5 cups of low fat coconut milk. Third, I cooked the mixture at 200 degrees Celsius. In this way, I prepared to eat the strawberry by loving you eating the shortcake. + +4. I prepared to eat the raspberry. I did this by observing myself loving you eating the pancake. First, I blended the buckwheat into flour in a blender and whisked in the rice flour, baking powder and cornstarch. Second, I poured the almond milk, vanilla extract and maple syrup into the mixture, whisking it to remove lumps. Third, I cooked this mixture. In this way, I prepared to eat the raspberry by observing myself loving you eating the pancake. + +5. I prepared to count the character in the narrative, 'I'. I did this by pressing a button when I counted the character. First, I started Ball Prolog by pressing one side of a see-saw, attached to a calculator button, rotating the see-saw around the fulcrum. Second, I watched the apparatus lift a ball on the other side of the see-saw. Third, I observed the apparatus cause the ball to drop through a hole at the fulcrum. In this way, I prepared to count the character in the narrative, 'I' by pressing a button when I counted the character. + +6. I prepared to write down the answer to the next exercise. I did this by using a calculator (not relying on my mind) to eat enough apple slices. First, I wrote down the number of apple slices I already had, 5. Second, I wrote down the number of apple slices I needed, 1. Third, I used a calculator to add the number of apple slices I already had, 5, to the number of apple slices I needed, 1, equalling 6. In this way, I prepared to write down the answer to the next exercise by using a calculator (not relying on my mind) to eat enough apple slices. + +7.
I prepared to demonstrate subtraction using Ball Prolog. I did this by inputting a ball into the Computational English Calculator seesaw and outputting another ball. First, I dropped a ball onto one end of the seesaw. Second, I watched that end of the seesaw drop with the weight of this ball. Third, I observed another ball roll from the far end of the seesaw through a hole next to the fulcrum. In this way, I prepared to demonstrate subtraction using Ball Prolog by inputting a ball into the Computational English Calculator seesaw and outputting another ball. + +8. I prepared to have another ice cream. I did this by adding two numbers together using the 'Add' 3D Computational English Calculator page. First, I observed it receiving a numbered ball input into its row. Second, I watched it receiving a numbered ball input into its column. Third, I observed it output the result from a grid. In this way, I prepared to have another ice cream by adding two numbers together using the 'Add' 3D Computational English Calculator page. + +9. I prepared to append the sequence of balls to the list of sequences of balls. I did this by appending two sequences of balls in Ball Prolog. First, I released the first barrier, causing the sequence of balls representing the phrase 'I like being there with you on stage' to enter the tube. Second, I released the second barrier, causing the sequence of balls representing the phrase 'I like being there with you on stage again' to enter the tube. Third, I recorded the sequence of balls. In this way, I prepared to append the sequence of balls to the list of sequences of balls by appending two sequences of balls in Ball Prolog. + +10. I prepared to state that it was good that adding one to a number allowed me to calculate the amount of food necessary to satisfy my hunger. I did this by verifying that Ball Prolog's output was correct given its input. First, I entered the input ball into the apparatus.
Second, I observed the output ball be produced by the apparatus. Third, I verified that the output correctly corresponded to the input. In this way, I prepared to state that it was good that adding one to a number allowed me to calculate the amount of food necessary to satisfy my hunger by verifying that Ball Prolog's output was correct given its input. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Computational English is like a Calculator 2 of 4.txt b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Computational English is like a Calculator 2 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..793824110ec49dfcb31864df69a06dbff8691d29 --- /dev/null +++ b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Computational English is like a Calculator 2 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Computational English is like a Calculator 2 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"COMPUTATIONAL ENGLISH +by Lucian Green +Computational English is like a Calculator 2 of 4 + +11. I prepared to write 'I ate the apple' and 'I ate the apple, therefore I gained energy', therefore 'I gained energy'. I did this by writing that 'and' in Ball Prolog is represented by the ball travelling forwards. First, I observed that the ball started at the position of the first proposition. Second, I watched it roll along the track, representing the conjunction. Third, I observed that the ball finished at the position of the second proposition. In this way, I prepared to write 'I ate the apple' and 'I ate the apple, therefore I gained energy', therefore 'I gained energy' by writing that 'and' in Ball Prolog is represented by the ball travelling forwards. + +12. I prepared to find the first store-holder who could sell me produce. 
I did this by writing that 'or' in Ball Prolog is represented by up to n automata being used. First, I tested whether the first automaton was successful. Second, I prepared to test whether the next automaton was successful. Third, I repeated this until an automaton was successful. In this way, I prepared to find the first store-holder who could sell me produce by writing that 'or' in Ball Prolog is represented by up to n automata being used. + +13. I prepared to change the automaton to enable the correct ball to reach the end of the automaton. I did this by stating that when a ball failed to reach the end of an automaton (like tracing the input and output of a line) the result was recorded. First, I inserted the ball into the automaton. Second, I observed the ball stop before reaching the correct end point. Third, I wrote down the position that it had reached. In this way, I prepared to change the automaton to enable the correct ball to reach the end of the automaton by stating that when a ball failed to reach the end of an automaton (stopped before reaching the end) the result was recorded. + +14. I prepared to watch the game. I did this by computing that the characters who used the stool one at a time shared it by using the Computational English Calculator. First, I watched the first character use the stool. Second, I prepared to watch the next character use the stool. Third, I watched each character use the stool. In this way, I prepared to watch the game by computing that the characters who used the stool one at a time shared it by using the Computational English Calculator. + +15. I prepared to observe the person friendlily shake another character's hand. I did this by hierarchically computing that a character was popologically positive using Computational English. First, I imagined visiting the character's house. Second, I imagined opening the character's chest of drawers.
Third, I verified that the character's bathing costume had been cleaned since being recently used, indicating the character was positive. In this way, I prepared to observe the person friendlily shake another character's hand by hierarchically computing that a character was popologically positive using Computational English. + +16. I prepared to say the land was fair. I did this by calculating that the rights of the societological groups were equal. First, I wrote down the first group's right. Second, I prepared to test that the second group's right was equal to the first group's right. Third, I repeated this until I had determined that each right of each group was equal to the first group's right. In this way, I prepared to say the land was fair by calculating that the rights of the societological groups were equal. + +17. I prepared to be chosen as a librarian. I did this by calculating the highest structures in society to write about using the Computational English Calculator. First, I wrote down the height of the first structure in society. Second, I inserted the second structure in society from highest to lowest into the list of structures. Third, I repeated this until I had sorted the list of structures from highest to lowest. In this way, I prepared to be chosen as a librarian by calculating the highest structures in society to write about using the Computational English Calculator. + +18. I prepared to write an argument. I did this by calculating a breasoning list item based on another one using the Computational English Calculator. First, I wrote down the first breasoning. Second, I wrote down a synonymous breasoning or a breasoning used by the same algorithm as the first breasoning. Third, I wrote down the second breasoning. In this way, I prepared to write an argument by calculating a breasoning list item based on another one using the Computational English Calculator. + +19. I prepared to connect the breasoning to the rhizome.
I did this by writing down an idea in the essay from the area of study corresponding to the breasoning. First, I wrote down the category from the area of study. Second, I wrote down the breasoning. Third, I wrote a rhizome from the category intersecting with the breasoning. In this way, I prepared to connect the breasoning to the rhizome by writing down an idea in the essay from the area of study corresponding to the breasoning. + +20. I prepared to point the rhizome to the argument, where the rhizome is an acceptable endpoint of the area of study. I did this by connecting the breasoning to the rhizome. First, I wrote the breasoning. Second, I wrote the rhizome. Third, I wrote that clover, the breasoning, was an uncountable noun, a rhizome. In this way, I prepared to point the rhizome to the argument, where the rhizome is an acceptable endpoint of the area of study, by connecting the breasoning to the rhizome. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Computational English is like a Calculator 3 of 4.txt b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Computational English is like a Calculator 3 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..856f37ea79a3d9420125ec4294d5b9e6d45c4174 --- /dev/null +++ b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Computational English is like a Calculator 3 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Computational English is like a Calculator 3 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"COMPUTATIONAL ENGLISH +by Lucian Green +Computational English is like a Calculator 3 of 4 + +21. I prepared to perform computations on the ontology, step by step. I did this by writing an ontology, in other words, the data structure, containing the arguments in order. First, I wrote the words, the arguments, down in the ontology.
Second, I wrote the grammar, the order of the arguments under the words, the arguments. Third, I wrote the words, the arguments, in the order of action in the ontology. In this way, I prepared to perform computations on the ontology, step by step, by writing an ontology, in other words, the data structure, containing the arguments in order. + +22. I prepared to write about arguments assuming they worked together. I did this by verifying the connections between the arguments using a Computer Science algorithm. First, I wrote two words, the arguments, down. Second, I wrote the object registration numbers these words represented. Third, I verified that the objects logically connected using a database. In this way, I prepared to write about arguments assuming they worked together by verifying the connections between the arguments using a Computer Science algorithm. + +23. I prepared to read my novel. I did this by writing a novel using the Computational English Calculator. First, I copied experiences. Second, I joined experiences together. Third, I added to experiences. In this way, I prepared to read my novel by writing a novel using the Computational English Calculator. + +24. I prepared to write down theories about an experience. I did this by writing philosophy theory by writing the common parts of the texts from my experiences using the Computational English Calculator. First, I wrote down the first part of a text. Second, I wrote down all the instances of the part of the text in my experiences. Third, I repeated this until I had found all the common parts of the texts from my experiences. In this way, I prepared to write down theories about an experience by writing philosophy theory by writing the common parts of the texts from my experiences using the Computational English Calculator. + +25. I prepared to theorise about acts. I did this by writing philosophy theory by writing each common act from my set of experiences using the Computational English Calculator.
First, I wrote down the first act from my set of experiences. Second, I prepared to write down the next common act from my set of experiences. Third, I repeated this until I had written down all the common acts from my set of experiences. In this way, I prepared to theorise about acts by writing philosophy theory by writing each common act from my set of experiences using the Computational English Calculator. + +26. I prepared to write about the complete list of types of experiences. I did this by writing philosophy theory by writing the common types of experiences from my list of experiences using the Computational English Calculator. First, I wrote down the type of experience of the first experience from my list of experiences. Second, I prepared to write down the common type of experience of the next experience from my list of experiences. Third, I repeated this until I had written down all the common types of experiences from my list of experiences. In this way, I prepared to write about the complete list of types of experiences by writing philosophy theory by writing the common types of experiences from my list of experiences using the Computational English Calculator. + +27. I prepared to collect knowledge from the robot. I did this by observing the robot converse with me based on the robot's experiences. First, I said a statement. Second, I listened to the robot reply to the statement based on its experiences. Third, I repeated this until I had finished the conversation. In this way, I prepared to collect knowledge from the robot by observing the robot converse with me based on the robot's experiences. + +28. I prepared to found science by robots. I did this by observing the robot converse with a group of robots based on the robot's experiences. First, I observed the group of robots share their knowledge. Second, I observed a conversation being planned based on the knowledge. Third, I observed the robots hold a conversation based on the knowledge.
In this way, I prepared to found science by robots by observing the robot converse with a group of robots based on the robot's experiences. + +29. I prepared to be impressed with the robot's knowledge. I did this by conversing about information with a robot online. First, I asked what knowledge the robot had. Second, I found a topic of conversation with the robot. Third, I had a conversation with the robot. In this way, I prepared to be impressed with the robot's knowledge by conversing about information with a robot online. + +30. I prepared to give feedback to the robot. I did this by verifying the robot's knowledge with research. First, I wrote down the robot's statement. Second, I verified the robot's statement with research. Third, I wrote down the relevant research. In this way, I prepared to give feedback to the robot by verifying the robot's knowledge with research. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Computational English is like a Calculator 4 of 4.txt b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Computational English is like a Calculator 4 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..61a1956095332732235eba8c56ccc2b430754f6c --- /dev/null +++ b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Computational English is like a Calculator 4 of 4.txt @@ -0,0 +1,29 @@ +["Green, L 2021, Computational English is like a Calculator 4 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"COMPUTATIONAL ENGLISH +by Lucian Green +Computational English is like a Calculator 4 of 4 + +31. I prepared to trick a person into believing the robot was human-like. I did this by creating a robot that approached human likeness. First, I created the robot's mind. Second, I created the robot's face. Third, I created the robot's body. 
In this way, I prepared to trick a person into believing the robot was human-like by creating a robot that approached human likeness. + +32. I prepared to perform the next step with the robot, sitting in the ferris wheel. I did this by observing the robot and me liking each other. First, I stated that the robot was friendly because the robot used my abacus and liked me using the robot's quipu. Second, I watched the robot state that he liked me because of the regular frequency with which I queried him. Third, I recorded that it was true that the robot and I liked each other. In this way, I prepared to perform the next step with the robot, sitting in the ferris wheel, by observing the robot and me liking each other. + +33. I prepared to press the button on board the space craft. I did this by connecting the space industry with breasonings currency. First, I held the space nozzle up. Second, I held the pink tank up. Third, I designed the previews. In this way, I prepared to press the button on board the space craft by connecting the space industry with breasonings currency. + +34. I prepared to survive using breasonings currency. I did this by connecting survival with breasonings currency. First, I found survival. Second, I found breasonings currency. Third, I connected them. In this way, I prepared to survive using breasonings currency by connecting survival with breasonings currency. + +35. I prepared to do the necessary job. I did this by stating that walking distance jobs will be more likely with breasonings currency. First, I half-tailed the dove. Second, I placed the rod in the void. Third, I played with the religious (philosophical) leader. In this way, I prepared to do the necessary job by stating that walking distance jobs will be more likely with breasonings currency. + +36. I prepared to read the book. I did this by limiting the breasonings currency.
First, I saw how much breasonings currency could be produced with available funds for that purpose. Second, I produced this amount of breasonings currency. Third, I used the breasonings currency to buy the book. In this way, I prepared to read the book by limiting the breasonings currency. + +37. I prepared to calculate how the books' characters could achieve higher goals. I did this by stating that higher things were possible. First, I wrote how society could achieve higher things as breasonings currency. Second, I helped achieve these. Third, I set up camp on a different planet. In this way, I prepared to calculate how the books' characters could achieve higher goals by stating that higher things were possible. + +38. I prepared to buy the meditation (universal) course. I did this by rebreasoning out the breasonings currency as a way of using it. First, I rebreasoned out the breasonings currency spiritually with the machine. Second, I verified that the amount had gone through. Third, I transferred the registered breasonings currency. In this way, I prepared to buy the meditation (universal) course by rebreasoning out the breasonings currency as a way of using it. + +39. I prepared to think of an A for the breasoning, i.e. I breasoned out objects around the breasoning (object). I did this by individually breasoning out the breasonings currency arguments. First, I observed the algorithm read the object name. Second, I observed it retrieve the object's dimensions from a database. Third, I observed it breason out the argument by breasoning out the argument's object's breasoning's details (visualise the object's x, y and z dimensions). In this way, I prepared to think of an A for the breasoning, i.e. I breasoned out objects around the breasoning (object) by individually breasoning out the breasonings currency arguments. + +40. I prepared to detect other literary possibilities with A detectors in the book.
I did this by programming computers to write breasonings currency. First, I wrote the idea. Second, I verified this against guidelines. Third, I contributed this to the capped amount allowed for the day. In this way, I prepared to detect other literary possibilities with A detectors in the book by programming computers to write breasonings currency. + +41. I prepared to be the highest bidder. I did this by buying with breasonings currency. First, I applied to buy with the idea. Second, I was successful. Third, I bought the goods with the breasonings currency. In this way, I prepared to be the highest bidder by buying with breasonings currency. + +42. I prepared to calculate how much the characters were worth. I did this by providing products in return for breasonings currency. First, I read the applications for the product. Second, I received the breasonings currency from the selected applicant. Third, I used a loophole (writing about a third perspective) when paraphrasing, connecting two perspectives together in the breasonings currency. In this way, I prepared to calculate how much the characters were worth by providing products in return for breasonings currency. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Conglish Reflection 1 of 4.txt b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Conglish Reflection 1 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..9031385f19fcee5c2f7cce98f54df9bd310ce5bf --- /dev/null +++ b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Conglish Reflection 1 of 4.txt @@ -0,0 +1,41 @@ +["Green, L 2021, Conglish Reflection 1 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"COMPUTATIONAL ENGLISH +by Lucian Green +Conglish Reflection 1 of 4 + + + +1. Conglish allows thought processes to be analysed. It is an analytic perspective for expressing ideas, i.e.
it looks at the relationships between functional units. + +Conglish Issues + +1a. English - themes, language, characterisation, contention, etc. + +Philosophical - Reasons and their relations, logic + +Psychological - thought processes and ailments, interfaces with artificial intelligence and biochemistry + +Linguistic - syntactical (grammars) and semantic (vocabulary) + +Computational - efficiency, neatness, fitting purpose, mathematical accuracy + +1aa. I prepared to perform well in Honours. I did this by trimming the hibiscus. First, I calculated how many As I needed to earn 100% in the thesis. Second, I scheduled these. Third, I wrote these. In this way, I prepared to perform well in Honours by trimming the hibiscus. + +2. I prepared to eat the nutmeg scone. I did this by eating the nutmeg. First, I placed the ground nutmeg on a spoon. Second, I lifted the spoon to my lips. Third, I tested (sic) the nutmeg. In this way, I prepared to eat the nutmeg scone by eating the nutmeg. + +3. I prepared to propagate bliss through the world. I did this by meditating on (reading about) the leader of the world's current self-abnegation religion (philosophy). First, I saw that the people needed meditation (philosophy). Second, I gave them meditation (philosophy). Third, I continued on to the next religion (philosophy). In this way, I prepared to propagate bliss through the world by meditating on (reading about) the leader of the world's current self-abnegation religion (philosophy). + +4. I prepared to read the second book written by the master. I did this by reading the book written by the master. First, I opened the book. Second, I started reading the first chapter. Third, I finished reading the first chapter. In this way, I prepared to read the second book written by the master by reading the book written by the master. + +5. I prepared to give life to goats. I did this by giving the goat (person) an A to have a kid (child). First, I found the person who was going to conceive a child. Second, I breasoned out an A to give to the person. Third, I listened to the news that a child had been born to the person.
In this way, I prepared to give life to goats by giving the goat (person) an A to have a kid (child). + +6. I prepared to send the document to the publisher. I did this by licking the pear. First, I breasoned out the book. Second, I breasoned out 50 As for the book to be published. Third, I addressed the parcel to the publisher. In this way, I prepared to send the document to the publisher by licking the pear. + +7. I prepared to win the game of chess. I did this by moving the chess figure. First, I placed the figures in their starting positions. Second, I waited for the right turn. Third, I moved the chess figure. In this way, I prepared to win the game of chess by moving the chess figure. + +8. I prepared to observe the ducks lead the ducklings. I did this by observing that the ducks were gay companions. First, I observed that the ducks were male. Second, I observed that the ducks were gay. Third, I observed that the ducks were gay companions. In this way, I prepared to observe the ducks lead the ducklings by observing that the ducks were gay companions. + +9. I prepared to wait at the crossing. I did this by walking to the crossing. First, I found the correct coordinates map. Second, I found the crossing on the map. Third, I walked to the crossing from my departure point. In this way, I prepared to wait at the crossing by walking to the crossing. + +10. I prepared to walk to the other side of the crossing. I did this by crossing at the lights when the green man was displayed. First, I watched the light. Second, I waited until it had turned green. Third, I crossed at the crossing. In this way, I prepared to walk to the other side of the crossing by crossing at the lights when the green man was displayed.
+ +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Conglish Reflection 2 of 4.txt b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Conglish Reflection 2 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..e0bba0192a30446b0c8106b94a727934ec30f488 --- /dev/null +++ b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Conglish Reflection 2 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Conglish Reflection 2 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"COMPUTATIONAL ENGLISH +by Lucian Green +Conglish Reflection 2 of 4 + +11. I prepared to feed Inky. I did this by scooping Inky's pellets. First, I touched the pile of pellets with the scoop. Second, I scooped the pellets. Third, I lifted the scoop of pellets up. In this way, I prepared to feed Inky by scooping Inky's pellets. + +12. I prepared to dry my feet. I did this by living on water. First, I drank the first glass of water. Second, I rowed the boat. Third, I set foot in my house. In this way, I prepared to dry my feet by living on water. + +13. I prepared to spoon the sugar. I did this by identifying reiner (pure) sugar. First, I found all the leaves in the sugar. Second, I placed the leaves in a separate bowl. Third, I labelled the sugar pure. In this way, I prepared to spoon the sugar by identifying reiner (pure) sugar. + +14. I prepared to serve the vegetable burger. I did this by cooking the vegetable patty. First, I made the patty from semolina, soy and carrot. Second, I minced it up. Third, I cooked it. In this way, I prepared to serve the vegetable burger by cooking the vegetable patty. + +15. I prepared to go dancing. I did this by lacing up the shoelaces. First, I put on the shoe. Second, I tied a knot with the laces. Third, I tied a bow with the laces. In this way, I prepared to go dancing by lacing up the shoelaces. + +16.
I prepared to reflect that the jet was fast. I did this by looking at the jet. First, I looked up. Second, I found the jet. Third, I looked at it with my eyes. In this way, I prepared to reflect that the jet was fast by looking at the jet. + +17. I prepared to watch the female hamster play Ophelia. I did this by watching the male hamster give the female hamster a wig. First, I watched the male hamster find the wig. Second, I watched the male hamster find the female hamster. Third, I watched the male hamster give the female hamster a wig. In this way, I prepared to watch the female hamster play Ophelia by watching the male hamster give the female hamster a wig. + +18. I prepared to distribute the education video. I did this by scouring the independent school grounds. First, I walked onto the Computational English set. Second, I sat in the actor's folding chair. Third, I performed my lines on camera. In this way, I prepared to distribute the education video by scouring the independent school grounds. + +19. I prepared to argue that the object should precede the subject. I did this by writing the internal perspective on Conglish. First, I wrote the linguistic subject. Second, I wrote the verb. Third, I wrote the object. In this way, I prepared to argue that the subject should be preceded by (explained in terms of the) object by writing the internal perspective on Conglish. + +20. I prepared to challenge universalism by offering logicism as my argument. I did this by writing the reason for the internal perspective on Conglish. First, I wrote that the object should be reflected before the proposition to explain the subject in terms of the object. Second, I wrote that the verb should precede the subject to explain the action in a bottom-up method. Third, I stated that 'The banana was Sam's' and 'The banana was eaten by Sam' should replace 'Sam ate the banana'.
In this way, I prepared to challenge universalism by offering logicism as my argument by writing the reason for the internal perspective on Conglish. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Conglish Reflection 3 of 4.txt b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Conglish Reflection 3 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..858985d2fca0e59190ab3045cfdc931bd5b0a3c7 --- /dev/null +++ b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Conglish Reflection 3 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Conglish Reflection 3 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"COMPUTATIONAL ENGLISH +by Lucian Green +Conglish Reflection 3 of 4 + +21. I prepared to argue for a universal presence of logicism. I did this by whittling the nuance out. First, I reflected the subject to act in a bottom-up manner. Second, I reflected the verb to act in a bottom-up manner. Third, I reflected the object to act in a bottom-up manner. In this way, I prepared to argue for a universal presence of logicism by whittling the nuance out. + +22. I prepared to eat the frozen nectarine. I did this by eating the frozen strawberry. First, I found the plate with the frozen strawberry on it. Second, I skewered the frozen strawberry with a fork. Third, I placed the frozen strawberry in my mouth. In this way, I prepared to eat the frozen nectarine by eating the frozen strawberry. + +23. I prepared to go to heaven. I did this by loving God (hugging the master). First, I found the master. Second, I hugged him. Third, I finished hugging him after 50 seconds. In this way, I prepared to go to heaven by loving God (hugging the master). + +24. I prepared to lead life on planet Earth. I did this by disagreeing with non-meditation (helped students to meditate). First, I found the students.
Second, I taught them to meditate. Third, I verified their meditation. In this way, I prepared to lead life on planet Earth by disagreeing with non-meditation (helped students to meditate). + +25. I prepared to make the monopoles sing. I did this by listening to Artie's speech. First, I sat on the seat. Second, I started listening to Artie's speech. Third, I finished listening to Artie's speech when he had finished delivering it. In this way, I prepared to make the monopoles sing by listening to Artie's speech. + +26. I prepared to guide by humanitas. I did this by writing the web log (blog). First, I researched the topic. Second, I wrote on the topic. Third, I advertised my blog entry on the social media site. In this way, I prepared to guide by humanitas by writing the web log (blog). + +27. I prepared to notice that the work had been completed. I did this by noticing the native helper. First, I verified whether the first helper was native. Second, I prepared to verify whether the second helper was native. Third, I repeated this until I had verified that a helper was native. In this way, I prepared to notice that the work had been completed by noticing the native helper. + +28. I prepared to remove the scented lanolin from the wool. I did this by smelling the wool. First, I found the wool. Second, I lifted it to my nose. Third, I smelt it. In this way, I prepared to remove the scented lanolin from the wool by smelling the wool. + +29. I prepared to build the house. I did this by smelling the wood. First, I found the freshly cut wood. Second, I lifted it to my nose. Third, I smelt it. In this way, I prepared to build the house by smelling the wood. + +30. I prepared to experience heaven (goodness) on earth. I did this by smelling the tinsel. First, I found the tinsel. Second, I lifted it to my nose. Third, I smelt the metal sprayed on the PVC, which constituted the tinsel. In this way, I prepared to experience heaven (goodness) on earth by smelling the tinsel.
+ +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Conglish Reflection 4 of 4.txt b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Conglish Reflection 4 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..28d763df3f7c5f62cefcca9e091856b19619370a --- /dev/null +++ b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Conglish Reflection 4 of 4.txt @@ -0,0 +1,9 @@ +["Green, L 2021, Conglish Reflection 4 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"COMPUTATIONAL ENGLISH +by Lucian Green +Conglish Reflection 4 of 4 + +31. I prepared to make the present. I did this by smelling the tissue paper. First, I found the tissue paper, in the wood products section. Second, I lifted it to my nose. Third, I smelt it. In this way, I prepared to make the present by smelling the tissue paper. + +32. I prepared to dine on model tracks. I did this by feeling the hot model train engine. First, I constructed a train track circle. Second, I started driving the model train engine in a circle. Third, I felt the hot model train engine. In this way, I prepared to dine on model tracks by feeling the hot model train engine. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Dereconstruction 1 of 4.txt b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Dereconstruction 1 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..31ca5232ea8e918b1993219366f3d6bfbc874fa2 --- /dev/null +++ b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Dereconstruction 1 of 4.txt @@ -0,0 +1,29 @@ +["Green, L 2021, Dereconstruction 1 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"COMPUTATIONAL ENGLISH +by Lucian Green +Dereconstruction 1 of 4 + + + +1.
Dereconstruction, another variant of computational hermeneutics I thought of, reconstructs a narrative in a representation for reasoning about and querying. The first technique can take a text and, after an interpretation has been prepared (writing this would be a necessary part of the program which would write an interpretation), test whether the conclusion, with its reasons, in response to a question was supported by the text, and output the interpretation. + +1a. I prepared to determine whether the student had earned H1 by writing 80 breasonings. I did this by counting the breasonings that the student had written after she breasoned out the breasonings details to God (in fact, the master). First, I counted the first breasoning. Second, I prepared to count the next breasoning. Third, I repeated this until I had counted all the breasonings. In this way, I prepared to determine whether the student had earned H1 by writing 80 breasonings by counting the breasonings that the student had written after she breasoned out the breasonings details to God (in fact, the master). + +2. I prepared to prove that two theorems for finding a property gave the same result. I did this by writing a geometry theorem prover. First, I counted the number of squares. Second, I multiplied the width of the region of the squares by the height of the region of the squares. Third, I verified that the number of squares was equal to the width of the region of the squares multiplied by the height of the region of the squares. In this way, I prepared to prove that two theorems for finding a property gave the same result by writing a geometry theorem prover. + +3. I prepared to prevent unemployment. I did this by designing a meditation (in fact, book-based) economic system. First, I taught the student from the book. Second, I observed the advantage to the student's health. Third, I observed the advantage to the student's career. 
In this way, I prepared to prevent unemployment by designing a meditation (in fact, book-based) economic system. + +4. I prepared to create prosperity. I did this by designing a pedagogy economic system. First, I taught the student pedagogy. Second, I observed him breason out an H1. Third, I observed him breason out 50 different H1s for a University job. In this way, I prepared to create prosperity by designing a pedagogy economic system. + +5. I prepared to point culture up. I did this by writing the law that culture must exist. First, I wrote the law that the culture should be pedagogically prepared for. Second, I created the law that creators of culture should be trained. Third, I wrote the law that culture should be created. In this way, I prepared to point culture up by writing the law that culture must exist. + +6. I prepared to bend the wire. I did this by observing the politician showing open mindedness in psychology. First, I found the box. Second, I opened the box, rather than not opening it. Third, I took the pliers out of the box. In this way, I prepared to bend the wire by observing the politician showing open mindedness in psychology. + +7. I prepared to make conversation with many people. I did this by observing the politician showing open mindedness in sociology. First, I found the person. Second, I thought of a topic of conversation, rather than not thinking of one. Third, I made conversation with her. In this way, I prepared to make conversation with many people by observing the politician showing open mindedness in sociology. + +8. I prepared to examine memory in Aboriginal philosophy. I did this by writing that the Aboriginal philosophy course should be a core requirement at school. First, I wrote about the land. Second, I wrote about the people. Third, I wrote about the stories. In this way, I prepared to examine memory in Aboriginal philosophy by writing that the Aboriginal philosophy course should be a core requirement at school. + +9. 
I prepared to use humour as medicine. I did this by writing that the Aboriginal medicine course should be a core requirement at school. First, I found the person. Second, I made her giggle. Third, I gave her a hug. In this way, I prepared to use humour as medicine by writing that the Aboriginal medicine course should be a core requirement at school. + +10. I prepared to create a Galah god (in fact, master). I did this by writing that the Aboriginal theology course should be a core requirement at school. First, I found a picture of the galah. Second, I drew the galah. Third, I prayed to (communicated with the) Galah. In this way, I prepared to create a Galah god (in fact, master) by writing that the Aboriginal theology course should be a core requirement at school. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Dereconstruction 2 of 4.txt b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Dereconstruction 2 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..dd8d4dc3ce7f4fe6dc1ea11454d8551194683d62 --- /dev/null +++ b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Dereconstruction 2 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Dereconstruction 2 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"COMPUTATIONAL ENGLISH +by Lucian Green +Dereconstruction 2 of 4 + +11. I prepared to observe the students write to kings. I did this by observing the philosopher monarch introduce creative philosophy. First, I introduced the philosopher monarch. Second, I observed the philosopher monarch introduce pedagogy. Third, I observed the philosopher monarch introduce creative philosophy. In this way, I prepared to observe the students write to kings by observing the philosopher monarch introduce creative philosophy. + +12. I prepared to watch bliss of life on earth. 
I did this by observing the philosopher monarch introduce creative philosophy with medicine as a reason. First, I recommended psychiatrists for the students. Second, I recommended the students study a medicine degree. Third, I trained the students to recommend these to other students. In this way, I prepared to watch bliss of life on earth by observing the philosopher monarch introduce creative philosophy with medicine as a reason. + +13. I prepared to perform at the concert. I did this by playing the bagpipes. First, I inflated the bag. Second, I placed my fingers on the holes of the chanter. Third, I squeezed the bag to keep a constant supply of air to the four reeds. In this way, I prepared to perform at the concert by playing the bagpipes. + +14. I prepared to be equitable. I did this by liking you because I shared the apple with you. First, I shared the apple with you. Second, I observed you like me. Third, I liked you. In this way, I prepared to be equitable by liking you because I shared the apple with you. + +15. I prepared to earn H1. I did this by demonstrating logical thinking. First, I suggested the proposition 'b'. Second, I suggested the inference 'b -> a'. Third, I suggested the proposition 'a'. In this way, I prepared to earn H1 by demonstrating logical thinking. + +16. I prepared to become a famous computing engineer. I did this by demonstrating perfect programming. First, I wrote down the sets. Second, I wrote down the data contained in each set. Third, I wrote the algorithm traversing the data sets. In this way, I prepared to become a famous computing engineer by demonstrating perfect programming. + +17. I prepared to present a program about my thesis. I did this by breasoning out 250 breasonings. First, I earned the job of computer science professor. Second, I breasoned out 250 breasonings containing the assignment solution. Third, I earned the promotion to emeritus professor. 
In this way, I prepared to present a program about my thesis by breasoning out 250 breasonings. + +18. I prepared to determine the original construction. I did this by equating the actual value with the desired value in dereconstruction. First, I printed the value of the first variable. Second, I assigned the value of the first variable to the second variable. Third, I printed the value of the second variable. In this way, I prepared to determine the original construction by equating the actual value with the desired value in dereconstruction. + +19. I prepared to solve the political problem. I did this by providing running water. First, I built a water tank. Second, I built a pipe from the water tank to the tap. Third, I boiled the water to filter out pathogens in it before using it. In this way, I prepared to solve the political problem by providing running water. + +20. I prepared to write that meditation was a mutually exclusive course from pedagogy because the breasonings were implicit. I did this by writing that meditation was constituted because of pedagogy, in a top-down manner. First, I wrote that meditation contained the mantra. Second, I wrote that pedagogy contained breasonings. Third, I wrote that the mantra indicated multiple breasonings. In this way, I prepared to write that meditation was a mutually exclusive course from pedagogy because the breasonings were implicit by writing that meditation was constituted because of pedagogy, in a top-down manner. 
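The dereconstruction check described in 18 — print a variable, assign it to a second variable, then equate the actual value with the desired value — can be sketched as follows (a minimal Python illustration; the values and variable names are hypothetical, not from the source):

```python
# Sketch of step 18: equate the actual value with the desired value.
desired = 5               # hypothetical desired value of the first variable
print(desired)            # print the value of the first variable
actual = desired          # assign the value of the first variable to the second
print(actual)             # print the value of the second variable
assert actual == desired  # the actual value equals the desired value
```

If the assertion holds, the original construction is recovered, as in the item above.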
+ +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Dereconstruction 3 of 4.txt b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Dereconstruction 3 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..9c5831848ffccdf09aea50a5d14d863f2db3ae5f --- /dev/null +++ b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Dereconstruction 3 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Dereconstruction 3 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"COMPUTATIONAL ENGLISH +by Lucian Green +Dereconstruction 3 of 4 + +21. I prepared to be friendly. I did this by teaching meditation. First, I taught the meditation details. Second, I taught medicine. Third, I taught about seeing a psychiatrist when needed or seeing friends. In this way, I prepared to be friendly by teaching meditation. + +22. I prepared to enable earning H1. I did this by teaching pedagogy. First, I taught the breasonings details. Second, I taught about veganism, positivity, Nietzsche, travelling and the University education subject. Third, I taught my students how to write arguments. In this way, I prepared to enable earning H1 by teaching pedagogy. + +23. I prepared to professionally develop doctors. I did this by teaching medicine. First, I taught about the digestive system. Second, I taught about the circulatory system. Third, I taught about the respiratory system. In this way, I prepared to professionally develop doctors by teaching medicine. + +24. I prepared to be an expert on the brain. I did this by writing about neuroscience. First, I wrote about food. Second, I wrote about activity. Third, I wrote about sleep. In this way, I prepared to be an expert on the brain by writing about neuroscience. + +25. I prepared to tell a story. I did this by writing about creativity. First, I wrote the first event description down. 
Second, I wrote the second event description down. Third, I connected the first event description and the second event description. In this way, I prepared to tell a story by writing about creativity. + +26. I prepared to deconstruct the reconstruction of the event. I did this by reconstructing the event from other events. First, I reconstructed the event from the first event. Second, I prepared to reconstruct the event from the next event. Third, I repeated this until I had reconstructed the event from all the other events. In this way, I prepared to deconstruct the reconstruction of the event by reconstructing the event from other events. + +27. I prepared to deconstruct the reconstruction of the text. I did this by reconstructing the text from other texts. First, I reconstructed the text from the first text. Second, I prepared to reconstruct the text from the next text. Third, I repeated this until I had reconstructed the text from all the other texts. In this way, I prepared to deconstruct the reconstruction of the text by reconstructing the text from other texts. + +28. I prepared to eat the apple. I did this by deconstructing the object by vaporising it (in fact, I thought of a larger object than another, e.g. I thought of a hand and an apple). First, I measured the length of my hand. Second, I measured the width of the apple. Third, I verified that the first measurement was greater than the second measurement. In this way, I prepared to eat the apple by deconstructing the object by vaporising it (in fact, I thought of a larger object than another, e.g. I thought of a hand and an apple). + +29. I prepared to shake the man's hand. I did this by loving the man by hugging him. First, I walked to him. Second, I placed my arms around him. Third, I hugged him. In this way, I prepared to shake the man's hand by loving the man by hugging him. + +30. I prepared to change the state. I did this by becoming the nationalist leader. First, I reached the top of the nation. 
Second, I applied to become the nationalist leader. Third, I became the nationalist leader. In this way, I prepared to change the state by becoming the nationalist leader. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Dereconstruction 4 of 4.txt b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Dereconstruction 4 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..bcabbf8288b8823a91c88fba9c50e17410e0aa0e --- /dev/null +++ b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Dereconstruction 4 of 4.txt @@ -0,0 +1,9 @@ +["Green, L 2021, Dereconstruction 4 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"COMPUTATIONAL ENGLISH +by Lucian Green +Dereconstruction 4 of 4 + +31. I prepared to lead open-minded religious races. I did this by becoming the open-minded religious leader. First, I taught pedagogy for positive and negative medical reasons. Second, I taught yoga for positive reasons. Third, I taught yoga for negative positive reasons (in fact, another positive reason). In this way, I prepared to lead open-minded religious races by becoming the open-minded religious leader. + +32. I prepared to be equalitarian-minded. I did this by becoming a social-minded person. First, I made sure that the first two groups of people had the same rights. Second, I prepared to make sure that the next two groups of people had the same rights. Third, I repeated this until I had made sure each pair of groups of people had the same rights. In this way, I prepared to be equalitarian-minded by becoming a social-minded person. 
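The loop in 32 — checking each pair of groups of people for the same rights, repeating until every pair has been checked — can be sketched as follows (a minimal Python illustration; the group names and rights are hypothetical, not from the source):

```python
from itertools import combinations

# Hypothetical groups and their rights.
rights = {"group_a": {"vote", "work"},
          "group_b": {"vote", "work"},
          "group_c": {"vote", "work"}}

# Make sure that each pair of groups of people has the same rights,
# repeating over every pair until all pairs have been checked.
all_equal = all(rights[g1] == rights[g2]
                for g1, g2 in combinations(rights, 2))
print(all_equal)
```

`itertools.combinations` enumerates each unordered pair of groups exactly once, matching the "each pair of groups" repetition in the item.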
+ +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Derivability 0 of 4.txt b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Derivability 0 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..20c1168572bbd9190ab9ca276272626a51780923 --- /dev/null +++ b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Derivability 0 of 4.txt @@ -0,0 +1,97 @@ +["Green, L 2021, Derivability 0 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"COMPUTATIONAL ENGLISH +by Lucian Green +Derivability 0 of 4 + + + +1. The first technique can handle two directions between term or sentence and definition. This allows meaning to be worked out from context. + +1a. Interpolation + +The program can interpolate from definition to sentence (2 to 1), i.e. work out the sentence from a given set of definitions. + +1aa. Samantha ate the biscuits (initially unknown) + +2a. The biscuits were on the table before Samantha came into the room + +b. They weren't there after she left * + +c. It was after school, and time for a snack. * + +* assumed to be in the same section as 2a, i.e. also referring to Samantha eating the biscuits. + +In the first technique, order is usually from 1 to 2, but there may be variables that can take either any value or values from a list. + +2. Extrapolation. Finding the definition from a sentence + +The program can also extrapolate from sentence to definition (1 to 2), i.e., given similar sentences, it can work out the definition. + +There are three cases so far, and there may be more in the future. + +a. Synonyms + +Given: + +1ab. The purse was lost at lunchtime + +2. (initially unknown) + +a. The purse was left on a seat. + +b. No one returned it. + +the program can calculate: + +1ac. The wallet was lost at lunchtime + +2. (initially unknown) + +a. The wallet was left on a seat. + +b. No one returned it. + +b. 
Synogrammars (sentences with the same meaning) + +Given: + +1ad. We received the books we ordered. + +2. (initially unknown) + +a. We filled in the order form for the books + +b. We sent it in with the money. + +the program can calculate: + +1ae. The books we ordered arrived. + +2. (initially unknown) + +a. We filled in the order form for the books + +b. We sent it in with the money. + +c. Same base word + +Given: + +1af. We constructed the argument + +2. (initially unknown) + +a. Constructing is writing. + +b. We wrote the argument. + +the program can calculate: + +1ag. We deconstructed the argument + +2. (initially unknown) + +a. Constructing is writing. + +b. De-something is taking apart. + +c. We took apart the argument."] \ No newline at end of file diff --git a/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Derivability 1 of 4.txt b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Derivability 1 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..48f82c286d723079f5b5ddc42834bc7c2ef750b8 --- /dev/null +++ b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Derivability 1 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Derivability 1 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"COMPUTATIONAL ENGLISH +by Lucian Green +Derivability 1 of 4 + +1. I prepared to visualise the dimensions of the object that a word represented. I did this by writing the words on the pedagogy screen. First, I set up the pedagogy screen. Second, I wrote down the words on it. Third, I drew the object on the screen. In this way, I prepared to visualise the dimensions of the object that a word represented by writing the words on the pedagogy screen. + +2. I prepared to disseminate implication. I did this by describing the logical operator ^i (and-implication). First, I stated that And-Implication meant that A implies B is true when A is true. 
Second, I stated that And-Implication meant that A implies B is true when B is also true. Third, I stated that And-Implication meant that A implies B is true when A^B is also true. In this way, I prepared to disseminate implication by describing the logical operator ^i (and-implication). + +3. I prepared to analyse conjunction. I did this by describing the logical operator ^ (and). First, I stated that And meant that A^B is true when A is true. Second, I stated that And meant that A^B is true when B is also true. Third, I stated that And meant that A^B is also true when 'I am thirsty' and 'The tea is available' is true. In this way, I prepared to analyse conjunction by describing the logical operator ^ (and). + +4. I prepared to examine disjunction. I did this by describing the logical operator v (or). First, I stated that Or meant that AvB may be true when A is true. Second, I stated that Or meant that AvB may be true when B is true. Third, I stated that Or meant that AvB is true when 'I walked left' or 'I walked right' is true. In this way, I prepared to examine disjunction by describing the logical operator v (or). + +5. I prepared to look at negation closely. I did this by describing the logical operator not (not). First, I stated that Not meant that not A is true when A is false. Second, I stated that Not meant that not A is false when A is true. Third, I stated that Not meant that not A is true when 'I am not a merchant' is true. In this way, I prepared to look at negation closely by describing the logical operator not (not). + +6. I prepared to consider whether negated conjunction was correct. I did this by describing the logical operator not ^ (not and). First, I stated that Not-And meant that (not A)^B is true when A is false. Second, I stated that Not-And meant that (not A)^B is true when B is also true. Third, I stated that Not-And meant that (not A)^B is also true when 'I am not quenched' and 'The tea is available' is true. 
In this way, I prepared to consider whether negated conjunction was correct by describing the logical operator not ^ (not and). + +7. I prepared to test negated disjunction. I did this by describing the logical operator not v (not or). First, I stated that Not-Or meant that (not A)vB may be true when A is false. Second, I stated that Not-Or meant that (not A)vB may be true when B is true. Third, I stated that Not-Or meant that (not A)vB is true when 'I didn't walk right' or 'I walked right' is true. In this way, I prepared to test negated disjunction by describing the logical operator not v (not or). + +8. I prepared to determine a chessboard state where two pieces could have moved, as one in which the piece had moved to the further rank. I did this by interpolating the chessboard state between two different chessboard states. First, I listed the moved pieces. Second, I wrote the chessboard state in which the piece had moved before the second piece had moved. Third, otherwise, I wrote the two possible chessboard states in which either piece could have moved. In this way, I prepared to determine a chessboard state where two pieces could have moved, as one in which the piece had moved to the further rank by interpolating the chessboard state between two different chessboard states. + +9. I prepared to write that there were more 100% grades than 50% grades. I did this by drawing the probability curve for grades. First, I drew a low middle of the curve. Second, I drew a high right of the curve. Third, I labelled the high right of the curve '100%'. In this way, I prepared to write that there were more 100% grades than 50% grades by drawing the probability curve for grades. + +10. I prepared to align a line with the most likely road route. I did this by determining the route that the driver had driven from A to B. First, I drew the first point on the map. Second, I drew the second point on the map. Third, I ruled the route between them. 
In this way, I prepared to align a line with the most likely road route by determining the route that the driver had driven from A to B. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Derivability 2 of 4.txt b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Derivability 2 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..0b7e3afc228b88e0f6d815987c983dd7e4a5f304 --- /dev/null +++ b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Derivability 2 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Derivability 2 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"COMPUTATIONAL ENGLISH +by Lucian Green +Derivability 2 of 4 + +11. I prepared to test that the giraffe had a long enough neck. I did this by testing the assignment with students. First, I tested the hypotenuse. Second, I tested the giraffe. Third, I tested the hiraffe (sic). In this way, I prepared to test that the giraffe had a long enough neck by testing the assignment with students. + +12. I prepared to collect the floated comments. I did this by checking off the floated comment in the online project management software. First, I found the comments. Second, I found the floated comment. Third, I checked it off. In this way, I prepared to collect the floated comments by checking off the floated comment in the online project management software. + +13. I prepared to satisfy a key performance indicator. I did this by proofreading the document by breasoning it out. First, I gave the clause a 10 (in fact, 80) breasoning-long A. Second, I checked the number of objects matched the number of the verb. Third, I checked the language matched flow of objects through objects. In this way, I prepared to satisfy a key performance indicator by proofreading the document by breasoning it out. + +14. I prepared to taste-test the meal. 
I did this by tasting the raspberry aggregate part. First, I picked the raspberry. Second, I picked its aggregate part. Third, I tasted the aggregate part. In this way, I prepared to taste-test the meal by tasting the raspberry aggregate part. + +15. I prepared to construct a chemical molecule recognition machine. I did this by writing the name of the object on the pedagogy screen. First, I looked at the pedagogy screen. Second, I recognised the object on the screen. Third, I wrote its name. In this way, I prepared to construct a chemical molecule recognition machine by writing the name of the object on the pedagogy screen. + +16. I prepared to write how the main property of an object relates to space and value. I did this by philosophically describing an object in a sentence. First, I wrote down the structure of the object. Second, I wrote how the object related to me. Third, I wrote the main property of how it related to me. In this way, I prepared to write how the main property of an object relates to space and value by philosophically describing an object in a sentence. + +17. The department prepared to be itself. The department did this by applying perspectivism to itself. First, it applied Politics to itself. Second, it applied Economics to itself. Third, it applied History to itself. In this way, the department prepared to be itself by applying perspectivism to itself. + +18. I prepared to value the object. The subject did this by deeming herself positive when the object was determined to be objectively positive. First, I determined that the object worked. Second, I determined that the object was mechanically functional. Third, I determined that the object worked. In this way, I prepared to value the object by deeming herself positive when the object was determined to be objectively positive. + +19. I prepared to determine that the object was algorithmically bug-free. I did this by deeming the object objectively positive. First, I recollected the object. 
Second, I wrote that I had no bias in judging the object. Third, I determined that the object, an apple was fresh. In this way, I prepared to determine that the object was algorithmically bug-free by deeming the object objectively positive. + +20. I prepared to predict the output from the input. I did this by stating that the self's event was only caused by the other's event, in fact, as in the input, not the output, as in Simulated Intelligence. First, I found the variable with the plus sign ('+') before it in the documentation, indicating it was an input variable, where the variable was in the head of the called predicate. Second, I found the data item corresponding to it in the query. Third, I wrote this data item down. In this way, I prepared to predict the output from the input by stating that the self's event was only caused by the other's event, in fact, as in the input, not the output, as in Simulated Intelligence. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Derivability 3 of 4.txt b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Derivability 3 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..7532d17bdb7a1c9fc3c9b5890e6854502e12acb2 --- /dev/null +++ b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Derivability 3 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Derivability 3 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"COMPUTATIONAL ENGLISH +by Lucian Green +Derivability 3 of 4 + +21. I prepared to report the crime. I did this by identifying the murder (in fact, drinking tea). First, I identified that no one knew when the murder would occur. Second, I identified when it would happen. Third, I identified that it happened. In this way, I prepared to report the crime by identifying the murder (in fact, drinking tea). + +22. I prepared to program a robot mind. 
I did this by writing 10 80-breasoning subjects to be with-it in a Conglish (Computational English) degree. First, I wrote the robot subjects. Second, I wrote the mind subjects. Third, I wrote the robot mind subjects. In this way, I prepared to program a robot mind by writing 10 80-breasoning subjects to be with-it in a Conglish (Computational English) degree. + +23. I prepared to be perfectly healthy. I did this by writing 10 80-breasoning subjects to be with-it in a Medicine degree. First, I wrote the organ subjects. Second, I wrote the theological surgery subjects. Third, I wrote the organ sutra subjects. In this way, I prepared to be perfectly healthy by writing 10 80-breasoning subjects to be with-it in a Medicine degree. + +24. I prepared to be a Pedagogue. I did this by writing 10 80-breasoning subjects to be with-it in a Pedagogy degree. First, I wrote the philosophy subjects. Second, I wrote the writing subjects. Third, I wrote the job requirements subjects. In this way, I prepared to be a Pedagogue by writing 10 80-breasoning subjects to be with-it in a Pedagogy degree. + +25. I prepared to be a meditator. I did this by writing 10 80-breasoning subjects to be with-it in a Meditation degree. First, I wrote the philosophy subjects. Second, I wrote the 50 breasonings per utterance subject. Third, I wrote the job requirements subjects. In this way, I prepared to be a meditator by writing 10 80-breasoning subjects to be with-it in a Meditation degree. + +26. I prepared to be a critic (in fact, a positive commentator). I did this by buying the banana at the bargain basement. First, I walked down the stairs to the bargain basement. Second, I ordered the banana. Third, I paid for the banana. In this way, I prepared to be a critic (in fact, a positive commentator) by buying the banana at the bargain basement. + +27. I prepared to make sure the country was safe. I did this by observing the law beckon the subject, like observing a person smell a pink flower. 
First, I observed the left-wing lawyers arrive at the office. Second, I observed them select a subject. Third, I observed them write the subject. In this way, I prepared to make sure the country was safe by observing the law beckon the subject, like observing a person smell a pink flower. + +28. I prepared to find the centre of a cross. I did this by crossing over the knitting needles. First, I placed the first knitting needle pointing to me on the table. Second, I found the middle of the knitting needle and another knitting needle. Third, I made a cross by placing the second knitting needle on the first knitting needle on the table. In this way, I prepared to find the centre of a cross by crossing over the knitting needles. + +29. I prepared to determine that hydrogen had been produced by the reaction. I did this by conducting the hydrogen pop test. First, I added magnesium to diluted acid in a test tube in a rack, followed by placing my finger over the end of the test tube. Second, I observed bubbles of hydrogen rise from the magnesium, followed by releasing the pressure from the test tube while bringing a lit match to the end of the test tube. Third, I listened to the 'pop' sound. In this way, I prepared to determine that hydrogen had been produced by the reaction by conducting the hydrogen pop test. + +30. I prepared to make enough up. I did this by closing the gate. First, I wrote the first course. Second, I prepared to write the next course. Third, I repeated this until I had written all the courses. In this way, I prepared to make enough up by closing the gate. 
+ +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Derivability 4 of 4.txt b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Derivability 4 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..4545c22b9da854610ba2843dd8959cdcd44fc685 --- /dev/null +++ b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Derivability 4 of 4.txt @@ -0,0 +1,9 @@ +["Green, L 2021, Derivability 4 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"COMPUTATIONAL ENGLISH +by Lucian Green +Derivability 4 of 4 + +31. I prepared to transform vegetarians. I did this by eating the vegan nuggets. First, I served the vegetarian rice paper roll. Second, I walked to the next table. Third, I ate the vegan nuggets. In this way, I prepared to transform vegetarians by eating the vegan nuggets. + +32. I prepared to dine with the vegan's friends. I did this by eating with the vegan. First, I chose the restaurant. Second, I chose the vegan. Third, I dined with the vegan. In this way, I prepared to dine with the vegan's friends by eating with the vegan. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Drawing connections 1 of 4.txt b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Drawing connections 1 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..3ce25fcaa3933582ab3a3a8707f8262d819f6daf --- /dev/null +++ b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Drawing connections 1 of 4.txt @@ -0,0 +1,29 @@ +["Green, L 2021, Drawing connections 1 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"COMPUTATIONAL ENGLISH +by Lucian Green +Drawing connections 1 of 4 + + + +1. In a binary opposition, one will be stronger.
This is because one starts at it (it is the origin). The contention explores the relationship/directionality/methodology between two fundamental objects which have a relation. + +1a. I prepared to collate the robot's knowledge. I did this by writing the ideology as a hierarchy. First, I wrote the question. Second, I wrote the answer. Third, I wrote the reason for the answer. In this way, I prepared to collate the robot's knowledge by writing the ideology as a hierarchy. + +2. I prepared to detail the robot's knowledge. I did this by writing the ontology as a hierarchy. First, I wrote the reason. Second, I wrote the object name. Third, I wrote its connection with the reason. In this way, I prepared to detail the robot's knowledge by writing the ontology as a hierarchy. + +3. I prepared to create a post-Marxist ideology. I did this by connecting the lost ideology hierarchy to the hierarchy. First, I determined that the first ideology of philosophy aided finding the human value with the aid of the government. Second, I determined that the Marxist ideology aided finding the literary phrase in a high-quality manner. Third, I wrote an intermediate Computational English (government and Marxism-influenced) ideology that wrote a presentation that properly delivered the viewer experience. In this way, I prepared to create a post-Marxist ideology by connecting the lost ideology hierarchy to the hierarchy. + +4. I prepared to blend algorithms. I did this by connecting the lost ontology hierarchy to the hierarchy. First, I determined that the first algorithm computed which direction the sensor faced given a map. Second, I determined that the second algorithm found passing chords using the I-IV-V-I chord progression. Third, I wrote an intermediate algorithm that composed music based on the sensor's surroundings. In this way, I prepared to blend algorithms by connecting the lost ontology hierarchy to the hierarchy. + +5. I prepared to build a school.
I did this by observing the lawyer help maintain ownership of the property. First, I drafted the contract to keep ownership of donated land with the help of a lawyer. Second, the land donor signed the contract. Third, I observed the contract prevent the land donor from taking the land back. In this way, I prepared to build a school by observing the lawyer help maintain ownership of the property. + +6. I prepared to build the house. I did this by observing the building surveyor test that the building had good structural integrity. First, I detected the floor. Second, I detected the walls. Third, I detected the ceiling. In this way, I prepared to build the house by observing the building surveyor test that the building had good structural integrity. + +7. I prepared to print an encyclopedia entry. I did this by asking the teletypist to type my ideas. First, I thought of a paragraph to dictate. Second, I found a teletypist matching the desired character. Third, I delegated typing the paragraph to the teletypist. In this way, I prepared to print an encyclopedia entry by asking the teletypist to type my ideas. + +8. I prepared to encourage the native and settler to learn each other's languages. I did this by requesting that the teletypist translate the text into a local language. First, I aimed for the postcolonial native to understand the text. Second, I found that the settler would allow the native to read the text printed in the settler's language. Third, I requested that the teletypist translate the text into a bilingual edition with both the native's language and settler's language. In this way, I prepared to encourage the native and settler to learn each other's languages by requesting that the teletypist translate the text into a local language. + +9. I prepared to distance myself from *. I did this by shelling the snow pea. First, I read that Winston Churchill asked, if funding were diverted from the arts, what we would be fighting for.
Second, I determined that the arts were the conclusion from defence, not vice versa. Third, I determined that the arts were necessary rather than defence. In this way, I prepared to distance myself from * by shelling the snow pea. + +10. I prepared to spend on high-quality thoughts. I did this by writing 50 Economics As to be a founder of educational institutions. First, I encouraged pedagogical inspiration. Second, I fostered critical thinking with logic. Third, I cherished the influence of perfect meditation. In this way, I prepared to spend on high-quality thoughts by writing 50 Economics As to be a founder of educational institutions. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Drawing connections 2 of 4.txt b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Drawing connections 2 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..4adc8e7a918dbcf1c3f7cf7d8ca7e3bcd810c30c --- /dev/null +++ b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Drawing connections 2 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Drawing connections 2 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"COMPUTATIONAL ENGLISH +by Lucian Green +Drawing connections 2 of 4 + +11. I prepared to enjoy a high quality of life. I did this by employing a legal consultant to help legally protect educational institutions. First, I wrote a contract to protect the organisation from donated land being revoked. Second, I wrote a contract so that people who had been employed to provide services wouldn't break their contracts. Third, I wrote a contract to protect the organisation from donated money being revoked. In this way, I prepared to enjoy a high quality of life by employing a legal consultant to help legally protect educational institutions. + +12. I prepared to ensure the home systems' smooth running.
I did this by cracking the bug. First, I cracked the car bug. Second, I cracked the computer bug. Third, I cracked the robot bug. In this way, I prepared to ensure the home systems' smooth running by cracking the bug. + +13. I prepared to paint the possibilities. I did this by drawing a connection. First, I asked the politician what the options were. Second, I asked the meditator which option she would like to choose. Third, I drew a connection from the meditator's answer to the politician's action. In this way, I prepared to paint the possibilities by drawing a connection. + +14. I prepared to build primary and secondary school students' confidence. I did this by adding the item between the connected items. First, I observed the politician introduce meditation into schools. Second, I observed the children take part if they wanted to. Third, I observed the meditator also introduce pedagogy and medicine, improving meditation. In this way, I prepared to build primary and secondary school students' confidence by adding the item between the connected items. + +15. I prepared to express perfect function. I did this by apologising for anything non-positive (I wrote the positive idea down). First, I wrote down the incorrect statement (another correct statement) that I had been given. Second, I noted that it had been originally written by a person I was responsible for. Third, I apologised for the incorrect statement (wrote the correct statement). In this way, I prepared to express perfect function by apologising for anything non-positive (I wrote the positive idea down). + +16. I prepared to experience heaven on Earth. I did this by writing 16 250 breasoning areas of study influenced by Plato's forms about Music. First, I wrote about music. Second, I wrote about pleasures. Third, I wrote about fantasias. In this way, I prepared to experience heaven on Earth by writing 16 250 breasoning areas of study influenced by Plato's forms about Music. + +17.
I prepared to understand the author through his idea and its use. I did this by writing 16 250 breasoning areas of study influenced by Plato's forms about Philosophy. First, I wrote about forms. Second, I wrote about pleasure. Third, I wrote about Plato. In this way, I prepared to understand the author through his idea and its use by writing 16 250 breasoning areas of study influenced by Plato's forms about Philosophy. + +18. I prepared to become a lecturer. I did this by writing 16 250 breasoning areas of study influenced by Plato's forms about English. First, I wrote that the 100% PhD earner wrote 275 breasonings per A. Second, I wrote that he wrote 50 of these As. Third, I wrote that he doubled this to do the lecturer's work for him. In this way, I prepared to become a lecturer by writing 16 250 breasoning areas of study influenced by Plato's forms about English. + +19. I prepared to find similarities between Plato's Forms and Lucian's Pedagogy. I did this by writing 16 250 breasoning areas of study influenced by Plato's Forms about Pedagogy. First, I equated Plato's Forms with Lucianic Pedagogy. Second, I equated Plato's names of Forms with Lucianic Pedagogical object names. Third, I equated Plato's function of Forms with Lucianic Pedagogical breasonings (objects' x, y and z dimensions). In this way, I prepared to find similarities between Plato's Forms and Lucian's Pedagogy by writing 16 250 breasoning areas of study influenced by Plato's Forms about Pedagogy. + +20. I prepared to equate a breasoning with a Platonic form. I did this by writing 16 250 breasoning areas of study influenced by Plato's forms about Meditation. First, I collected, or paid someone who had collected, 250 breasonings that plugged into the 250 Breasonings Per Utterance (Mantra and Sutra) to become a meditator. Second, I was helped to think of 1 breasoning per mantra instance. Third, I was helped to think of 5 breasonings per sutra instance.
In this way, I prepared to equate a breasoning with a Platonic form by writing 16 250 breasoning areas of study influenced by Plato's forms about Meditation. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Drawing connections 3 of 4.txt b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Drawing connections 3 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..29168cae6baa5d9e79bdcabfa60d45dea76a5c13 --- /dev/null +++ b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Drawing connections 3 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Drawing connections 3 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"COMPUTATIONAL ENGLISH +by Lucian Green +Drawing connections 3 of 4 + +21. I prepared to equate Plato's soul (including the mind and body) with the soul created by 50 As (16 250 breasoning As) in Lucianic Computational English. I did this by writing 16 250 breasoning areas of study influenced by Plato's forms about Computational English. First, I equated the Platonic body with Lucianic Computational English bodily developed things. Second, I equated the Platonic mind with the Lucianic Computational English mental ontological states. Third, I equated the Platonic soul with the Lucianic Computational English continual soulful aimingness (sic). In this way, I prepared to equate Plato's soul (including the mind and body) with the soul created by 50 As (16 250 breasoning As) in Lucianic Computational English by writing 16 250 breasoning areas of study influenced by Plato's forms about Computational English. + +22. I prepared to form a wisdom seal on my company. I did this by writing 16 250 breasoning areas of study influenced by Plato's forms about Economics. First, I enticed people to buy Plato's forms. Second, I acted wisely in Economics. Third, I taught others to deliberate carefully. 
In this way, I prepared to form a wisdom seal on my company by writing 16 250 breasoning areas of study influenced by Plato's forms about Economics. + +23. I prepared to cultivate people. I did this by writing 16 250 breasoning areas of study influenced by Plato's forms about Popology. First, I equated Plato's forms with Lucianic Popology, by equating people with objects. Second, I equated the names of Plato's forms with an agreed-with argument, by writing that simulations of people are in people's minds. Third, I equated the functions of Plato's forms with a positive argument, by writing that people are stronger than objects. In this way, I prepared to cultivate people by writing 16 250 breasoning areas of study influenced by Plato's forms about Popology. + +24. I prepared to satisfy people with society. I did this by writing 16 250 breasoning areas of study influenced by Plato's forms about Societology. First, I equated Plato's forms with Lucianic Societology, by influencing art with society. Second, I enabled people of innate culture to cultivate themselves as expressions of their time. Third, I satisfied followers of the most popular current religion that society was developed in their minds. In this way, I prepared to satisfy people with society by writing 16 250 breasoning areas of study influenced by Plato's forms about Societology. + +25. I prepared to increase a patient's self-confidence. I did this by writing 16 250 breasoning areas of study influenced by Plato's forms about Medicine. First, I equated Plato's forms with Lucianic Medicine, by describing robots in terms of forms. Second, I equated the names of Plato's forms with the pedagogy of medicine curing psychiatric patients, by the listing of named forms. Third, I equated the functions of Plato's forms with the referral of other patients to the doctor. In this way, I prepared to increase a patient's self-confidence by writing 16 250 breasoning areas of study influenced by Plato's forms about Medicine. + +26.
I prepared to find similarities between Plato's forms and Lucianic Metaphysics. I did this by writing 16 250 breasoning areas of study influenced by Plato's forms about Metaphysics. First, I equated Plato's forms with Lucianic Metaphysics, by describing pedagogical language in terms of forms. Second, I equated the names of Plato's forms with the shape of objects in Lucian's Metaphysics. Third, I equated the functions of Plato's forms with the size of objects in Lucian's Metaphysics. In this way, I prepared to find similarities between Plato's forms and Lucianic Metaphysics by writing 16 250 breasoning areas of study influenced by Plato's forms about Metaphysics. + +27. I prepared to explore Vetusia as Plato's forms and my life together. I did this by writing 16 250 breasoning areas of study influenced by Plato's forms about Lucian's autobiography. First, I wrote a computer game about the study of the old, or vetus. Second, I wrote how Vetus became Vetusia using '-ia' from '-alia'. Third, I studied a corridor protector for breasonings in Lyotard's text about Kant's Sublime. In this way, I prepared to explore Vetusia as Plato's forms and my life together by writing 16 250 breasoning areas of study influenced by Plato's forms about Lucian's autobiography. + +28. I prepared to develop the writing. I did this by writing 16 250 breasoning areas of study influenced by Plato's forms about Lucian's hermeneutics. First, I wrote what the author's anthropological influences were. Second, I wrote how she wrote it. Third, I wrote what it meant now. In this way, I prepared to develop the writing by writing 16 250 breasoning areas of study influenced by Plato's forms about Lucian's hermeneutics. + +29. I prepared to detail my argument. I did this by praying for 250 250 breasonings As (writing 250 breasonings) per word in philosophy. First, I breasoned out the 16 breasoning highlight. Second, I wrote 150 As for the song. Third, I 'turned it off' when 'it' was blue.
In this way, I prepared to detail my argument by praying for 250 250 breasonings As (writing 250 breasonings) per word in philosophy. + +30. I prepared to go straight up. I did this by writing 50 breasonings per sentence in philosophy. First, I wrote similarly to the Professor. Second, I won the cup. Third, I hurtled to my goal. In this way, I prepared to go straight up by writing 50 breasonings per sentence in philosophy. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Drawing connections 4 of 4.txt b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Drawing connections 4 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..76812f8d5f0cffa919edb333a7204ad25777c568 --- /dev/null +++ b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Drawing connections 4 of 4.txt @@ -0,0 +1,29 @@ +["Green, L 2021, Drawing connections 4 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"COMPUTATIONAL ENGLISH +by Lucian Green +Drawing connections 4 of 4 + +31. I prepared to connect the students' thoughts together. I did this by writing 50 breasonings per connection between sentences in philosophy. First, I constructed the train track. Second, I placed the train on it. Third, I let the train drive on the track. In this way, I prepared to connect the students' thoughts together by writing 50 breasonings per connection between sentences in philosophy. + +32. I prepared to go to heaven. I did this by praying hard to the cosmologue (writing a thank you letter to the teacher). First, I found her. Second, I cancelled everything. Third, I followed her forever. In this way, I prepared to go to heaven by praying hard to the cosmologue (writing a thank you letter to the teacher). + +33. I prepared to be interesting in English. I did this by making the cosmological version look like the title and text. 
First, I wrote the title and text. Second, I invented the cosmological seen-as version part looking like the title and text. Third, I repeated the second step until the title and text had been covered with the available ideas. In this way, I prepared to be interesting in English by making the cosmological version look like the title and text. + +34. I prepared to write a bestseller. I did this by intertwining the idea of being the creator of the top-selling product by genre with 250 breasonings. First, I considered whether the product was a piece of art. Second, I considered whether the product was a musical composition. Third, I determined that the product was a book. In this way, I prepared to write a bestseller by intertwining the idea of being the creator of the top-selling product by genre with 250 breasonings. + +35. I prepared to write 250 breasonings for writing by students on the important quote. I did this by writing 250 breasonings per important quote. First, I determined that the quote stood out when memorised (contained key or important language). Second, I determined the argument for the important quote. Third, I wrote the argument for the important quote. In this way, I prepared to write 250 breasonings for writing by students on the important quote by writing 250 breasonings per important quote. + +36. I prepared to ask for the connection. I did this by writing suggestions for connections in an area of study with 50 breasonings. First, I wrote the first and second sentences. Second, I wrote their connection and a question for their connection. Third, I wrote 50 breasonings for each of their connection and a question for their connection. In this way, I prepared to ask for the connection by writing suggestions for connections in an area of study with 50 breasonings. + +37. I prepared to create culture for a period. I did this by collecting the cosmological wording's 50 breasonings, as if they were connected to the area of study.
First, I wrote down a random number of breasonings. Second, I wrote down the wording I had worked out (the 'cosmological wording'). Third, I wrote down the perfect appearances. In this way, I prepared to create culture for a period by collecting the cosmological wording's 50 breasonings, as if they were connected to the area of study. + +38. I prepared to produce a viable product. I did this by collecting 15 details for a total of 50 As. First, I wrote 250 breasonings for the pop song. Second, I wrote and breathsoned out the pop song. Third, I wrote breathsonings associated with 15 of the other 16 250 breasoning As for a total of 50 80 breasoning As. In this way, I prepared to produce a viable product by collecting 15 details for a total of 50 As. + +39. I prepared to play movies in hospital. I did this by writing 50 As per production. First, I earned 50 As to visit a psychiatric ward. Second, I earned 50 As at the psychiatric ward. Third, I earned 50 As studying Medicine. In this way, I prepared to play movies in hospital by writing 50 As per production. + +40. I prepared to be as hot as Anarchy. I did this by writing 50 As for the lead role (separately from the production). First, I wrote on each department the lead role was interested in. Second, I wrote on make-up. Third, I wrote on hair. In this way, I prepared to be as hot as Anarchy by writing 50 As for the lead role (separately from the production). + +41. I prepared to create a synthesis on the topic. I did this by preparing 10 connections per student's paragraph. First, I wrote a connection between education and philosophy (I wrote that the frying pan, like the table, had tofu, like the brain thought in it). Second, I wrote a connection between philosophy and archaeology (I wrote that the brain had a mind because the brain positively changed). Third, I wrote a connection between archaeology and education (I wrote that the brain contained a brain table with 8 molecules of strawberry sauce in it).
In this way, I prepared to create a synthesis on the topic by preparing 10 connections per student's paragraph. + +42. I prepared to become the most important modern-day thinker. I did this by imagining Hegel write. First, I imagined him write 2.5 paragraphs. Second, I imagined him write 50 breasonings per paragraph. Third, I imagined him write 50 breasonings per connection. In this way, I prepared to become the most important modern-day thinker by imagining Hegel write. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Exploring opposites in Hamlet 1 of 4.txt b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Exploring opposites in Hamlet 1 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..091ef5a177eadd63780039bfc1c50546df9f2ec6 --- /dev/null +++ b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Exploring opposites in Hamlet 1 of 4.txt @@ -0,0 +1,29 @@ +["Green, L 2021, Exploring opposites in Hamlet 1 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"COMPUTATIONAL ENGLISH +by Lucian Green +Exploring opposites in Hamlet 1 of 4 + + + +1. Hamlet is either closer or further away from Claudius than Gertrude. + +1a. I prepared to reassure Hamlet. I did this by holding the skull. First, the clown entertained me. Second, I was at the deathbed of the clown. Third, I held the skull of the clown when digging the grave for the girl. In this way, I prepared to reassure Hamlet by holding the skull. + +2. I prepared to give a colleague my map. I did this by exploring the skull cave. First, I pressed the stone button. Second, I followed the secret passage. Third, I made a map. In this way, I prepared to give a colleague my map by exploring the skull cave. + +3. I prepared to slingshot the seeds to the next island. I did this by finding the treasure at the cross of palm trees. First, I scouted.
Second, I measured. Third, I built. In this way, I prepared to slingshot the seeds to the next island by finding the treasure at the cross of palm trees. + +4. I prepared to copy waltzing Matilda. I did this by following Matilda. First, I followed the directions of use of the shampoo. Second, I followed the directions of use of the conditioner. Third, I allowed the sheep to escape into the night. In this way, I prepared to copy waltzing Matilda by following Matilda. + +5. I prepared to fit the wheel's axle. I did this by noticing the line in the cylinder. First, I drew a point. Second, I drew a circle around the point. Third, I extruded a cylinder from the circle around the line in the cylinder extruded from the point. In this way, I prepared to fit the wheel's axle by noticing the line in the cylinder. + +6. I prepared to eat the burger. I did this by observing the man rise again (eat a meal). First, I observed the man perform a handstand. Second, I thought Jesus might be seen as a blue man ascending to the ceiling. Third, I thought that cosmology might be the form the ascension was in. In this way, I prepared to eat the burger by observing the man rise again (eat a meal). + +7. I prepared to take a leaf from my mother's book of morals. I did this by being given the Computational English Godhead. First, I determined the location of the scene. Second, I devised my own goodness classes. Third, I determined the seen-as version of God, Computational English. In this way, I prepared to take a leaf from my mother's book of morals by being given the Computational English Godhead. + +8. I prepared to explore characters dying in Hamlet without being supported in doing so. I did this by exposing the opposites in Hamlet. First, I exposed the murder of Hamlet's father without his murderer being supported in murdering him. Second, I exposed the suicide of Ophelia without her being supported in dying.
Third, I exposed the death of the characters in the final scene, including Hamlet, without being supported in dying. In this way, I prepared to explore characters dying in Hamlet without being supported in doing so by exposing the opposites in Hamlet. + +9. I prepared to help up new people. I did this by observing the professor writing 250 breasonings to be given 50 As for his book. First, I wrote 50 As for the book. Second, I was featured in the broadcast about the book. Third, the book was made famous in an education subject. In this way, I prepared to help up new people by observing the professor writing 250 breasonings to be given 50 As for his book. + +10. I prepared to trust the appearances of God (the master). I did this by trusting the bible (the book about philosophy). First, I gave credence to stills. Second, I read the testimony to the film. Third, I accepted the undertaking of the audio track. In this way, I prepared to trust the appearances of God (the master) by trusting the bible (the book about philosophy). + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Exploring opposites in Hamlet 2 of 4.txt b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Exploring opposites in Hamlet 2 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..ae54be7f92266ccaef5bd2e4fe7ec1b442e883e8 --- /dev/null +++ b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Exploring opposites in Hamlet 2 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Exploring opposites in Hamlet 2 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"COMPUTATIONAL ENGLISH +by Lucian Green +Exploring opposites in Hamlet 2 of 4 + +11. I prepared to experience the art forms of God (the master). I did this by trusting God (the master). First, I trusted the art of the master. Second, I trusted the music of the master.
Third, I trusted the architecture of the master. In this way, I prepared to experience the art forms of God (the master) by trusting God (the master). + +12. I prepared to write logically. I did this by writing down what I was thinking of as an argument. First, I wrote down a topic for the argument. Second, I worked out the first reason for the argument. Third, I repeated this until I had worked out all the reasons for the argument. In this way, I prepared to write logically by writing down what I was thinking of as an argument. + +13. I prepared to ask questions about the topic. I did this by connecting the arguments to the topic. First, I wrote the topic. Second, I connected the first argument to the topic. Third, I repeated this until I had connected all of the arguments to the topic. In this way, I prepared to ask questions about the topic by connecting the arguments to the topic. + +14. I prepared to connect syntax to semantics. I did this by testing the grammar rules. First, I read the sentence. Second, I found the label for the part of the sentence in the left column and the parts of that part of the sentence in the right column. Third, I repeatedly found the parts of the sentence in the right column in the left column until I had reached the vocabulary in the right column. In this way, I prepared to connect syntax to semantics by testing the grammar rules. + +15. I prepared to lead the world government. I did this by becoming the Head of State. First, I became a Member of Parliament. Second, I stayed in the political party for many years. Third, I became Head of State. In this way, I prepared to lead the world government by becoming the Head of State. + +16. I prepared to self-assess the use of my teaching skills. I did this by teaching the student. First, I saw the student. Second, I made sure the quality of teaching was high. Third, I gave him excellent job prospects. 
In this way, I prepared to self-assess the use of my teaching skills by teaching the student. + +17. I prepared to be positive-minded about literature. I did this by moving forward in Hamlet. First, I moved forward after realising Hamlet's father had died. Second, I moved forward after realising Ophelia had killed herself. Third, I moved forward after the deaths of Hamlet's family. In this way, I prepared to be positive-minded about literature by moving forward in Hamlet. + +18. I prepared to introduce streams in life. I did this by separating the meditator from the non-meditator. First, I found the meditator. Second, I found the non-meditator. Third, I wrote their names in separate lists. In this way, I prepared to introduce streams in life by separating the meditator from the non-meditator. + +19. I prepared to be a world peace advocate. I did this by encouraging the non-meditator who paid for meditation training to meditate. First, I received the non-meditator's payment for meditation training. Second, I found the non-meditator. Third, I encouraged her to meditate. In this way, I prepared to be a world peace advocate by encouraging the non-meditator who paid for meditation training to meditate. + +20. I prepared to answer another philosophy question. I did this by identifying the man on the street as a philosophy helper. First, I read the philosophy question. Second, I asked the man on the street for help. Third, he helped me answer the philosophy question. In this way, I prepared to answer another philosophy question by identifying the man on the street as a philosophy helper. 
+ +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Exploring opposites in Hamlet 3 of 4.txt b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Exploring opposites in Hamlet 3 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..980b54ab25382f22fffdd1fc84696b707ecb1c23 --- /dev/null +++ b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Exploring opposites in Hamlet 3 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Exploring opposites in Hamlet 3 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"COMPUTATIONAL ENGLISH +by Lucian Green +Exploring opposites in Hamlet 3 of 4 + +21. I prepared to scroll the text down and add a new conclusion at the vanishing point on the horizon. I did this by concluding a new conclusion from two conclusions. First, I read the first conclusion. Second, I read the second conclusion. Third, I wrote a third conclusion formed from the first clause in the first conclusion and the second clause in the second conclusion. In this way, I prepared to scroll the text down and add a new conclusion at the vanishing point on the horizon by concluding a new conclusion from two conclusions. + +22. I prepared to show the characters in Hamlet failed (I prepared to show that we can learn from Hamlet). I did this by critically evaluating Hamlet. First, I wrote that Hamlet was mostly negative (positive). Second, I wrote it was mostly with us. Third, I wrote that it was English too. In this way I prepared to show the characters in Hamlet failed (I prepared to show that we can learn from Hamlet) by critically evaluating Hamlet. + +23. I prepared to summarise the ontologies in Hamlet. I did this by comparing similarities in Hamlet. First, I wrote how all the negative (positive) ideas in Hamlet were similar. Second, I wrote how all the positive ideas in Hamlet were similar. 
Third, I recorded these. In this way, I prepared to summarise the ontologies in Hamlet by comparing similarities in Hamlet. + +24. I prepared to contrast ontologies in Hamlet. I did this by contrasting differences in Hamlet. First, I contrasted positive and negative (positive) ideas in Hamlet. Second, I recorded these. Third, I delivered a speech about this contrast. In this way, I prepared to contrast ontologies in Hamlet by contrasting differences in Hamlet. + +25. I prepared to listen to the high quality voice. I did this by observing the international council encourage the better quality speaker to speak. First, I observed the selector listen to the speaker. Second, I observed the selector invite the speaker to the conference. Third, I observed the speaker speak at the conference. In this way, I prepared to listen to the high quality voice by observing the international council encourage the better quality speaker to speak. + +26. I prepared to write an argument about Hamlet. I did this by arguing that Hamlet was positive. First, I wrote that Hamlet provided epistemological data to be positive. Second, I wrote that the epistemological data provided details of the suicide (which could be prevented). Third, I prevented this. In this way, I prepared to write an argument about Hamlet by arguing that Hamlet was positive. + +27. I prepared to observe the child write about Hamlet. I did this by stating that the child was coaxed to the point. First, I saw the child. Second, I saw that the child was coaxed to the point. Third, I wrote this down. In this way, I prepared to observe the child write about Hamlet by stating that the child was coaxed to the point. + +28. I prepared to observe spoon-feeding at school. I did this by observing the teacher spoon the student. First, I observed that the student required the answer. Second, I observed the teacher recognise this. Third, I observed the teacher spoon the answer to the student. 
In this way, I prepared to observe spoon-feeding at school by observing the teacher spoon the student. + +29. I prepared to consolidate the breasoning chapters from my postgraduate qualification by writing essays based on them. I did this by writing that my work would be as high quality as 50 As because I would write arguments when I was available. First, I studied a postgraduate qualification. Second, I wrote breasoning chapters. Third, I wrote essays based on these. In this way, I prepared to consolidate the breasoning chapters from my postgraduate qualification by writing essays based on them by writing that my work would be as high quality as 50 As because I would write arguments when I was available. + +30. I prepared to observe the movement between the bodies. I did this by connecting breasonings and area of study points as arguments. First, I wrote that the apple was on the plate. Second, I wrote that the fruit on the plate would be eaten. Third, I wrote that the apple would be eaten. In this way, I prepared to observe the movement between the bodies by connecting breasonings and area of study points as arguments. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Exploring opposites in Hamlet 4 of 4.txt b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Exploring opposites in Hamlet 4 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..3c82c6254fc95b6bb69feb06fe4d75c023a4ca6a --- /dev/null +++ b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Exploring opposites in Hamlet 4 of 4.txt @@ -0,0 +1,29 @@ +["Green, L 2021, Exploring opposites in Hamlet 4 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"COMPUTATIONAL ENGLISH +by Lucian Green +Exploring opposites in Hamlet 4 of 4 + +31. I prepared to declare Hamlet a success. I did this by agreeing with Hamlet.
First, I observed Hamlet avoid all the action. Second, I observed Hamlet avoid suicide. Third, I observed Hamlet avoid becoming a murderer. In this way, I prepared to declare Hamlet a success by agreeing with Hamlet. + +32. I prepared to agree with Ophelia being a success. I did this by agreeing with Ophelia. First, I agreed with her freedom. Second, I agreed with her happiness. Third, I agreed with her having exercise. In this way, I prepared to agree with Ophelia being a success by agreeing with Ophelia. + +33. I prepared to avoid the man in the field I had studied a University qualification in. I did this by multiplying and mod'ing the letters' values to equal a value. First, I chose a field. Second, I studied it. Third, I worked in the field. In this way, I prepared to avoid the man in the field I had studied a University qualification in by multiplying and mod'ing the letters' values to equal a value. + +34. I prepared to record the survivor's tale. I did this by stating that the tsunami survivor stayed in the tree. First, I stated that he climbed the tree. Second, I stated that he stayed there. Third, I stated that he climbed down from the tree. In this way, I prepared to record the survivor's tale by stating that the tsunami survivor stayed in the tree. + +35. I prepared to see health increase. I did this by preventing bad behaviour with the reformation (probe). First, I identified the bad behaviour. Second, I prevented the bad behaviour. Third, I helped people to it. In this way, I prepared to see health increase by preventing bad behaviour with the reformation (probe). + +36. I prepared to make a small town. I did this by making a stop sign. First, I found the red paper. Second, I cut an octagon in it. Third, I cut out and glued white letters reading 'STOP' to it. In this way, I prepared to make a small town by making a stop sign. + +37. I prepared to list the number as prime. I did this by determining that a number was prime.
First, I tested that dividing the number by the first number resulted in a remainder. Second, I tested that dividing the number by the next number resulted in a remainder. Third, I tested that dividing the number by all the numbers up to n/2 resulted in a remainder. In this way, I prepared to list the number as prime by determining that a number was prime. + +38. I prepared to fly a larger plane in the flight simulator. I did this by training using the flight simulator. First, I took off. Second, I flew along the path. Third, I landed. In this way, I prepared to fly a larger plane in the flight simulator by training using the flight simulator. + +39. I prepared to mirror the deflection experiment. I did this by testing how much a magnet deflected a ball. First, I placed the ball there. Second, I placed the magnet there. Third, I deflected the ball with the magnet. In this way, I prepared to mirror the deflection experiment by testing how much a magnet deflected a ball. + +40. I prepared to write the angle. I did this by converting the decimal number to a percentage. First, I wrote 0.5 meant half. Second, I multiplied 0.5 by 100. Third, I wrote the result, 50%. In this way, I prepared to write the angle by converting the decimal number to a percentage. + +41. I prepared to hold the emerald. I did this by writing the square was 200 metres in side length. First, I held the dancer. Second, I moved with the dancer. Third, I moved on. In this way, I prepared to hold the emerald by writing the square was 200 metres in side length. + +42. I prepared to earn 300 points in bowling. I did this by earning 12 strikes in bowling. First, I made the first strike. Second, I made the next strike. Third, I repeated this until I had made 12 strikes. In this way, I prepared to earn 300 points in bowling by earning 12 strikes in bowling.
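Item 37's primality determination, testing every candidate divisor from 2 up to n/2 for a remainder, can be sketched as follows. This is a minimal illustration of the described procedure, not code from the text.

```python
def is_prime(n: int) -> bool:
    """Item 37: n is prime if dividing it by every number
    from 2 up to n/2 leaves a remainder."""
    if n < 2:
        return False
    for d in range(2, n // 2 + 1):
        if n % d == 0:  # no remainder: d divides n, so n is composite
            return False
    return True
```

Testing only up to n/2 works because no proper divisor of n can exceed n/2; testing up to the square root of n would be faster but is not what the text describes.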
+ +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Finite Data will be a Solution in Conglish 1 of 4.txt b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Finite Data will be a Solution in Conglish 1 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..a0f699e78f2fa93f698d39271a3d2540ae3c0121 --- /dev/null +++ b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Finite Data will be a Solution in Conglish 1 of 4.txt @@ -0,0 +1,33 @@ +["Green, L 2021, Finite Data will be a Solution in Conglish 1 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"COMPUTATIONAL ENGLISH +by Lucian Green +Finite Data will be a Solution in Conglish 1 of 4 + + + +1. How to use cut off infinite data. + +How will the idea of infinite data, as in the following: + +Do we inhabit a three-dimensional universe floating in a four-dimensional space? What if the extra dimensions required by string theory were not curled up and unobservably small, but unfurled and vast, extending forever? Could an invisible universe only a tiny fraction of an inch apart in another dimension explain phenomena that we see today in our world? A physicist may unravel the mysteries of the universe's hidden dimensions. One solution could be to describe an interval of data, concluding with a pointer to infinity. + +1a. I prepared to plan a town by cutting a wood board in half, which was described in the pedagogical argument. I did this by switching the robot on to record the pedagogy database for everyone. First, I switched the robot on to record the grammar. Second, I looked at the robot recording the data. Third, I thought of the robot recording the philosophicon (making sure there are single points at each point). 
In this way, I prepared to plan a town by cutting a wood board in half, which was described in the pedagogical argument, by switching the robot on to record the pedagogy database for everyone. + +2. I prepared to attend the ball. I did this by observing the robot hand creating the breasoning model with its reusable model-making kit. First, I watched the robot hand create the circular crown of the top hat. Second, I looked at the robot creating the cylindrical side piece of the top hat, which had two open ends, and attach it to the circular crown. Third, I saw the robot create the circular brim of the top hat, which had a circle cut out of it, and attach it to the cylindrical side piece of the top hat. In this way, I prepared to attend the ball by observing the robot hand creating the breasoning model with its reusable model-making kit. + +3. I prepared to watch the robot project the paper airplane by throwing it along a line parallel to the ground. I did this by videoing the robot hand creating the breasoning model with its reusable model-making kit. First, I observed the robot hand creating the paper airplane. Second, I placed the video camera behind the paper airplane. Third, I videoed the paper airplane for 4 seconds. In this way, I prepared to watch the robot project the paper airplane by throwing it along a line parallel to the ground by videoing the robot hand creating the breasoning model with its reusable model-making kit. + +4. I prepared to spin the spinner to beat the flagellum. I did this by 3D-printing the instructions for the robot hand to create the spinner, which started at the origin, went out and in to each vertex. First, I copied the sequence of turns needed, 'left,' 'right,' 'right,' 'left,' which returned to being straight and contained the key left instruction, and which would be transformed when the spinner was folded, then removed the 'right,' 'right' and the second 'left' introns to leave the exon 'left' to turn left.
Second, I prepared to remove introns to fold the next perpendicular angle of the spinner. Third, I repeated this until I had removed introns to fold all the perpendicular angles of the octahedral spinner string, completing the process of making and folding it. + +5. I prepared to exit the splash screen. I did this by waiting until the 'true while n=1' infinite loop had been interrupted by input. First, I started the infinite loop. Second, I prepared to start the next iteration of the infinite loop. Third, I exited the loop when a keystroke had been inputted. In this way, I prepared to exit the splash screen by waiting until the 'true while n=1' infinite loop had been interrupted by input. + +6. I prepared to design a computer at home. I did this by making the printable circuit to indicate nutritional requirements had been met. First, I wrote a logical printable circuit point representing that the tofu patty had been eaten. Second, I wrote a logical printable circuit point representing that enough protein had been eaten. Third, I connected these logical printable circuit points with 'and' (to work when they were both true). In this way, I prepared to design a computer at home by making the printable circuit to indicate nutritional requirements had been met. + +7. I prepared to write the next creative philosophy assignment. I did this by observing an object travel along a finite path in the body simulation to me. First, I called for the ball. Second, I observed the ball being thrown to me. Third, I caught the ball. In this way, I prepared to write the next creative philosophy assignment by observing an object travel along a finite path in the body simulation to me. + +8. I prepared to program the robot to simulate his environment. I did this by observing an object travel along a finite path in the physics simulation to me. First, I calibrated the trundle wheel's mark with the start of the track. Second, I walked 10 metres.
Third, I stopped when I had heard the tenth click, indicating the trundle wheel had travelled 10 metres. In this way, I prepared to program the robot to simulate his environment by observing an object travel along a finite path in the physics simulation to me. + +9. I prepared to observe time as a simulacrum (imitation) of the ball. I did this by observing an object travel along a finite path in the fourth dimension to me. First, I placed the ball at the top of the ramp. Second, I released the ball. Third, I observed the ball exit the ramp at t=5 seconds. In this way, I prepared to observe time as a simulacrum (imitation) of the ball by observing an object travel along a finite path in the fourth dimension to me. + +10. I prepared to make a profit. I did this by observing an object travel along a finite path in the economic simulation to me. First, I observed the customer see the lollipop advertisement. Second, I saw the customer exchange money for the lollipop. Third, I delivered the lollipop to the customer. In this way, I prepared to make a profit by observing an object travel along a finite path in the economic simulation to me. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Finite Data will be a Solution in Conglish 2 of 4.txt b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Finite Data will be a Solution in Conglish 2 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..0b12bfa9f35df62652c8046a4a314689cf9c7c77 --- /dev/null +++ b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Finite Data will be a Solution in Conglish 2 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Finite Data will be a Solution in Conglish 2 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"COMPUTATIONAL ENGLISH +by Lucian Green +Finite Data will be a Solution in Conglish 2 of 4 + +11.
I prepared to sum the two numbers. I did this by observing an object travel along a finite path in the computer simulation to me. First, I tied a knot in the first quipu string. Second, I tied a knot in the second quipu string. Third, I tied a knot in the third quipu string at the sum of the distances between the starts of the strings and the knots. In this way, I prepared to sum the two numbers by observing an object travel along a finite path in the computer simulation to me. + +12. I prepared to be happy by travelling free of other people. I did this by looking at each meditator (e.g. writer) meditating for (e.g. writing about) himself or herself. First, the meditator (e.g. writer) received a text during meditation (e.g. writing). Second, the writer wrote a text and verified it by reading it. Third, as the meditator (e.g. writer) received a text, the writer proofread his text. In this way, I prepared to be happy by travelling free of other people by looking at each meditator (e.g. writer) meditating for (e.g. writing about) himself or herself. + +13. I prepared to watch the train being driven to the correct station. I did this by catching the correct finitely long train. First, I found the correct platform. Second, I found the correct train. Third, I found the correct seat. In this way, I prepared to watch the train being driven to the correct station by catching the correct finitely long train. + +14. I prepared to record that the citizen was lawful. I did this by observing the citizen following the law. First, I looked at the person. Second, I watched him reading the text. Third, I observed him paraphrase the text. In this way, I prepared to record that the citizen was lawful by observing the citizen following the law. + +15. I prepared to make money from critical thinking. I did this by writing that the conclusion was for the good reason. First, I wrote that the repleteness of the finite data list acted as the conclusion. Second, I wrote that the reason for this conclusion was the multiple assigned to the list.
Third, I wrote that the multiple multiplied by the length of list per multiple resulted in the length of the list. In this way, I prepared to make money from critical thinking by writing that the conclusion was for the good reason. + +16. I prepared to watch the lady. I did this by observing the passenger plan her trip. First, I listened to the lady say she booked the train ticket. Second, I listened to her say she arrived at the train station. Third, I listened to her say she boarded the train. In this way, I prepared to watch the lady by observing the passenger plan her trip. + +17. I prepared to judge the way the other person was speaking. I did this by watching the diareasoner identify the speech rate in her partner. First, I counted the number of words over the time. Second, I counted the number of minutes. Third, I calculated the speech rate to equal the number of words divided by the number of minutes. In this way, I prepared to judge the way the other person was speaking by watching the diareasoner identify the speech rate in her partner. + +18. I prepared to win over the side of the argument. I did this by observing the partisan move to one side. First, I observed the potential agreer query the argument. Second, I delivered the argument as a professor. Third, I observed the partisan agree with the professor. In this way, I prepared to win over the side of the argument by observing the partisan move to one side. + +19. I prepared to be a good leader. I did this by liking politics because of God. First, I thought of what to say. Second, I verified that it was a good thing to say, like God is good. Third, I did what I said I would do. In this way, I prepared to be a good leader by liking politics because of God. + +20. I prepared to cover both sides of the argument. I did this by oscillating between agreement and rebuttal. First, I determined a reason agreeing with a main conclusion. Second, I determined a rebuttal to an objection to the reason.
Third, I determined a reason for this rebuttal to the objection to the reason. In this way, I prepared to cover both sides of the argument by oscillating between agreement and rebuttal. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Finite Data will be a Solution in Conglish 3 of 4.txt b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Finite Data will be a Solution in Conglish 3 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..815ba45d042149f15fc598c7d6b8f64ae243bd2d --- /dev/null +++ b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Finite Data will be a Solution in Conglish 3 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Finite Data will be a Solution in Conglish 3 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"COMPUTATIONAL ENGLISH +by Lucian Green +Finite Data will be a Solution in Conglish 3 of 4 + +21. I prepared to verify the correctness of the political statement. I did this by being a perfect political colleague. First, I listened to the reason given by the politician. Second, I verified the reason against peer-reviewed double blind legal research. Third, I handed in the report to the politician. In this way, I prepared to verify the correctness of the political statement by being a perfect political colleague. + +22. I prepared to verify the relevance of the political reply with a political quorum. I did this by verifying the relevance of each political comment as it was made. First, I listened to the first political comment. Second, I listened to the political reply. Third, I verified the relevance of the political reply. In this way, I prepared to verify the relevance of the political reply with a political quorum by verifying the relevance of each political comment as it was made. + +23. 
I prepared to write how 'ZZX,' the replacement of meaning with structures with no name, is so popular because it is dealt with in a double-blind manner. I did this by observing the partisan agreeing with philosophy. First, I wrote down that the form of an ontology in Conglish was the form of its data. Second, I wrote down that the content of an ontology in Conglish was the disconnected set of meanings of each data item. Third, I wrote down that the form of an ontology in Conglish was superior to its content because algorithms more elegantly traverse ontologies given their form, rather than their content. In this way, I prepared to write how 'ZZX,' the replacement of meaning with structures with no name, is so popular because it is dealt with in a double-blind manner, by observing the partisan agreeing with philosophy. + +24. I prepared to give unbiased feedback. I did this by giving feedback to you. First, I listened to the topic. Second, I wrote a comment. Third, I returned my comment as feedback to you. In this way, I prepared to give unbiased feedback by giving feedback to you. + +25. I prepared to define that the data items were linked together. I did this by determining that there is a rainbow of numbers in finite data. First, I determined that 1 corresponded to 0. Second, I determined that 2 corresponded to 1. Third, I determined that 3 corresponded to 2. In this way, I prepared to define that the data items were linked together by determining that there is a rainbow of numbers in finite data. + +26. I prepared to build a functioning farm. I did this by riding to verify the finite data. First, I determined that the first paddock was green to stay safe in case of a fire. Second, I determined that the second paddock was filled with mulch to make compost from, protected by the safety of the first paddock. Third, I determined that the third paddock was filled with oats to make oat milk from, that would benefit from the compost.
In this way, I prepared to build a functioning farm by riding to verify the finite data. + +27. I prepared to evolve research. I did this by writing arguments. First, I cited a sentence. Second, I joined it to a new idea. Third, I reduced it to one side of a contention. In this way, I prepared to evolve research by writing arguments. + +28. I prepared to write my own algorithm for an idea in the exposition. I did this by reordering sentences in the exposition. First, I wrote the beginning sentence from reordering. Second, I wrote the middle sentence from reordering. Third, I wrote the ending sentence from reordering. In this way, I prepared to write my own algorithm for an idea in the exposition by reordering sentences in the exposition. + +29. I prepared to understand how a sentence fitted together. I did this by reordering words in sentences in the exposition. First, I wrote the first word from reordering the words. Second, I prepared to write the next word from reordering the words. Third, I repeated this until I had written all the words from reordering the words. In this way, I prepared to understand how a sentence fitted together by reordering words in sentences in the exposition. + +30. I prepared to write how reasons for a pedagogical argument were supported by experiences. I did this by writing an illustrative computer program in the critique. First, I found the objects referred to in the idea. Second, I wrote down the relationship of these objects. Third, I wrote down a computer program allowing querying of these relationships. In this way, I prepared to write how reasons for a pedagogical argument were supported by experiences by writing an illustrative computer program in the critique. 
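The reorderings of sentences and words described in items 28 and 29 can be sketched with permutations. This is a minimal illustration under assumed inputs; the example sentence is hypothetical, not from the text.

```python
from itertools import permutations


def word_reorderings(sentence: str):
    """Item 29: every ordering of the words of a sentence, from which
    the first, next and remaining reordered variants are taken in turn."""
    return [" ".join(p) for p in permutations(sentence.split())]


# Hypothetical example: four distinct words give 4! = 24 orderings.
orders = word_reorderings("the apple was eaten")
```

The same call applied to whole sentences instead of words would sketch item 28's reordering of sentences in the exposition.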
+ +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Finite Data will be a Solution in Conglish 4 of 4.txt b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Finite Data will be a Solution in Conglish 4 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..4aa56f0bf048cf26bccdd6e552b8d385172f101c --- /dev/null +++ b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Finite Data will be a Solution in Conglish 4 of 4.txt @@ -0,0 +1,29 @@ +["Green, L 2021, Finite Data will be a Solution in Conglish 4 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"COMPUTATIONAL ENGLISH +by Lucian Green +Finite Data will be a Solution in Conglish 4 of 4 + +31. I prepared to write how God invents pathways. I did this by writing an original argument about Lucianic Meditation in my essay about the intersection of Heidegger and Daoism. First, I read the literature. Second, I thought of the universal implication of the literature. Third, I wrote how the structure of Lucianic Meditation was similar to the structure of Daoism as pointed out by Heidegger. In this way, I prepared to write how God invents pathways by writing an original argument about Lucianic Meditation in my essay about the intersection of Heidegger and Daoism. + +32. I prepared to examine my works. I did this by verifying the grammar of each original reason in the essay with 10 reasons. First, I wrote the first reason. Second, I prepared to write the next reason. Third, I repeated this until I had written 10 reasons. In this way, I prepared to examine my works by verifying the grammar of each original reason in the essay with 10 reasons. + +33. I prepared to observe effective communication in the market. I did this by writing breasonings currency. First, I selected the product. Second, I wrote how it would be useful to me. 
Third, I accepted the seller's breasoning currency and the product in return for mine. In this way, I prepared to observe effective communication in the market by writing breasonings currency. + +34. I prepared to list the finite data used as empirical evidence. I did this by verifying using breasonings currency. First, I wrote the plan for buying as the breasonings currency. Second, I verified the buy from the first, second and third person perspectives. Third, I verified the product's use. In this way, I prepared to list the finite data used as empirical evidence by verifying using breasonings currency. + +35. I prepared to calculate the use-by date of the product. I did this by stating that breasonings currency works. First, I calculated the time to prepare for buying the product. Second, I calculated the time to register to buy the product. Third, I calculated the time to use the product. In this way, I prepared to calculate the use-by date of the product by stating that breasonings currency works. + +36. I prepared to plan for the marriage in a finite way. I did this by stating that there was breasonings currency for gay marriage. First, I computed breasonings currency for meditation (philosophy). Second, I computed breasonings currency for apple meditation (confirming the marriage every day). Third, I computed breasonings currency for a possible intimate or companionship connection. In this way, I prepared to plan for the marriage in a finite way by stating that there was breasonings currency for gay marriage. + +37. I prepared to assign a finite amount of the product for a certain amount of breasonings currency. I did this by assigning 1 to breasonings currency when a threshold reading was required. First, I assigned the threshold to a certain number of As. Second, I detected the threshold being reached. Third, I allowed product to be given for this threshold.
In this way, I prepared to assign a finite amount of the product for a certain amount of breasonings currency by assigning 1 to breasonings currency when a threshold reading was required. + +38. I prepared to value the wedding. I did this by stating that 50 As of breasonings currency would be charged in certain cases. First, I noted that 50 As of breasoning currency were exchanged in marriage. Second, I observed that 50 As of breasonings currency were exchanged when an employee was employed. Third, I observed that 50 As of breasonings currency were exchanged when a product was developed. In this way, I prepared to value the wedding by stating that 50 As of breasonings currency would be charged in certain cases. + +39. I prepared to state that a person's physiology limits her secrets. I did this by stating that 80 was the possible maximum number of breasonings recommended to breason out per day, and 400 breasonings was the upper maximum. First, I found the required number of breasonings per day. Second, I determined that the rest of the breasonings would be completed for the person. Third, I placed any breasonings in the communal account. In this way, I prepared to state that a person's physiology limits her secrets by stating that 80 was the possible maximum number of breasonings recommended to breason out per day, and 400 breasonings was the upper maximum. + +40. I prepared to record the number of breasonings. I did this by stating that computers would complete the required number of breasonings. First, I counted the number of breasonings already completed. Second, I counted the number of breasonings required. Third, I subtracted the number of breasonings already completed from the number of breasonings required to equal the number of computational breasonings required, and provided these. In this way, I prepared to record the number of breasonings by stating that computers would complete the required number of breasonings. + +41.
I prepared to craft unique viewpoints. I did this by stating that people should plan ahead to have their own original content. First, I computed a finite set of ideas relating to the idea. Second, I computed the required ideas to relate these to. Third, I related them. In this way, I prepared to craft unique viewpoints by stating that people should plan ahead to have their own original content. + +42. I prepared to compare and contrast natural trends in breasonings currency where the writers haven't read each other's work. I did this by computing the nature of the area of study. First, I wrote how the area of study was done in nature. Second, I wrote how the area of study was made in nature. Third, I wrote how the area of study was held in nature. In this way, I prepared to compare and contrast natural trends in breasonings currency where the writers haven't read each other's work by computing the nature of the area of study. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green How can the program cope with real variation? 1 of 4.txt b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green How can the program cope with real variation? 1 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..6953a57c346efaf6606511f16667106bf723a14a --- /dev/null +++ b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green How can the program cope with real variation? 1 of 4.txt @@ -0,0 +1,29 @@ +["Green, L 2021, How can the program cope with real variation? 1 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"COMPUTATIONAL ENGLISH +by Lucian Green +How can the program cope with real variation? 1 of 4 + + it assumes these texts are based in logic or reason. Are they? + +Some features allow greater variation in Computational English, while the program is limited to rational principles. 
In theory it could detect contradictions and recognise abstract, metaphorical or other forms. Synonyms and synogrammars (grammars with the same meaning) substituted for words and phrases allow for a different form. Clauses in sentences can be substituted, allowing more variation. If two sentences are used where one would normally be used, the program can integrate them into a single sentence. There are rules for selecting sentences as evidence, like order and causality. I am writing a paper on variation in Computational English. I can let you know when it is available. + +1a. I prepared to see the results of the top famous person. I did this by writing that the desiderative part of the verb was given by the example, 'She wants to lead'. First, I observed the lady standing there. Second, I observed her observing the people who needed to be led. Third, I observed her state that 'I want to lead (the people)'. In this way, I prepared to see the results of the top famous person by writing that the desiderative part of the verb was given by the example, 'She wants to lead'. + +2. I prepared to direct students to enroll in another subject. I did this by writing that the intensive part of the verb was given by the example, 'The subject is full'. First, I enrolled in the subject. Second, I noticed that others prepared to enroll in the subject as well. Third, I noticed that they repeated this until the subject was full. In this way, I prepared to direct students to enroll in another subject by writing that the intensive part of the verb was given by the example, 'The subject is full'. + +3. I prepared to address the chief of staff. I did this by writing that the indicative part of the verb was given by the example, 'I am arriving'. First, I walked along the hall. Second, I turned to walk through the door of the destination. Third, I declared, 'I am arriving'. 
In this way, I prepared to address the chief of staff by writing that the indicative part of the verb was given by the example, 'I am arriving'. + +4. I prepared to go skiing. I did this by writing that the subjunctive part of the verb was given by the example, 'I would state'. First, I read the list of cases. Second, I verified that the current case was true. Third, I stated that in this case, I would state that a certain action was required. In this way, I prepared to go skiing by writing that the subjunctive part of the verb was given by the example, 'I would state'. + +5. I prepared to go to the next level. I did this by writing that the injunctive part of the verb was given by the example, 'My achievements shall I now proclaim'. First, I listed my achievements. Second, I found the correct time point. Third, I announced them. In this way, I prepared to go to the next level by writing that the injunctive part of the verb was given by the example, 'My achievements shall I now proclaim'. + +6. I prepared to sail on the patrol boat. I did this by writing that the pluperfect part of the verb was given by the example, 'We had come'. First, I saw that we had come. Second, I sighted the person it affected. Third, I said that we had come. In this way, I prepared to sail on the patrol boat by writing that the pluperfect part of the verb was given by the example, 'We had come'. + +7. I prepared to smell the roses in life. I did this by writing that the precative part of the verb was given by the example, 'Will you take care of me?' First, I saw the carer. Second, I noted that I needed to be taken care of. Third, I asked him to take care of me. In this way, I prepared to smell the roses in life by writing that the precative part of the verb was given by the example, 'Will you take care of me?' + +8. I prepared to return home. I did this by writing that the conditional part of the verb was given by the example, 'If I loved myself'. First, I verified that I loved myself. 
Second, I took myself to see a film. Third, I wrote that I had taken myself to see a film in the case that I loved myself. In this way, I prepared to return home by writing that the conditional part of the verb was given by the example, 'If I loved myself'. + +9. I prepared to be protected by law. I did this by writing that the gerundive part of the verb was given by the example, 'The man escaped by running'. First, I noticed him running. Second, I noticed he had escaped by running. Third, I reported to the authorities that he had escaped by running away. In this way, I prepared to be protected by law by writing that the gerundive part of the verb was given by the example, 'The man escaped by running'. + +10. I prepared to listen to the monologue. I did this by writing that the masculine gender of the noun was given by the example, 'actor'. First, I verified that the person was a theatre studies actor or actress. Second, I verified that he was male. Third, I described him as an actor. In this way, I prepared to listen to the monologue by writing that the masculine gender of the noun was given by the example, 'actor'. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green How can the program cope with real variation? 2 of 4.txt b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green How can the program cope with real variation? 2 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..7bbcd61130f7a202eb008c4382f2abb46872bc84 --- /dev/null +++ b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green How can the program cope with real variation? 2 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, How can the program cope with real variation? 2 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"COMPUTATIONAL ENGLISH +by Lucian Green +How can the program cope with real variation? 2 of 4 + +11.
I prepared to help the girl onto the tram stop. I did this by writing that the feminine gender of the noun was given by the example, 'girl'. First, I noted that the person was female. Second, I noted that the person was young. Third, I called her a girl. In this way, I prepared to help the girl onto the tram stop by writing that the feminine gender of the noun was given by the example, 'girl'. + +12. I prepared to write using the pencil. I did this by writing that the neuter gender of the noun was given by the example, 'pen'. First, I found the pen. Second, I found its owner. Third, I returned it to her. In this way, I prepared to write using the pencil by writing that the neuter gender of the noun was given by the example, 'pen'. + +13. I prepared to draw on the cardboard. I did this by writing that the singular number of the noun was given by the example, 'one sheet of cardboard'. First, I counted the sheet of cardboard. Second, I prepared to count any others. Third, I stopped when I had counted all the sheets of cardboard, in this case, one sheet of cardboard. In this way, I prepared to draw on the cardboard by writing that the singular number of the noun was given by the example, 'one sheet of cardboard'. + +14. I prepared to advance to the next piece. I did this by writing that the dual number of the noun was given by the example, 'two horsemen'. First, I counted the first horseman. Second, I prepared to count any other horsemen. Third, I stopped when I had counted two horsemen. In this way, I prepared to advance to the next piece by writing that the dual number of the noun was given by the example, 'two horsemen'. + +15. I prepared to infiltrate the army. I did this by writing that the plural number of the noun was given by the example, 'the drones'. First, I counted the first drone. Second, I prepared to count any other drones. Third, I stopped when I had counted multiple drones. 
In this way, I prepared to infiltrate the army by writing that the plural number of the noun was given by the example, 'the drones'. + +16. I prepared to list the man's actions. I did this by writing that the nominative case of the noun was given by the example, 'The man (nominative) ate an olive'. First, I saw the man sitting at the table. Second, I inductively observed the man performing an action. Third, I reported that the man ate an olive. In this way, I prepared to list the man's actions by writing that the nominative case of the noun was given by the example, 'The man (nominative) ate an olive'. + +17. I prepared to read a book. I did this by writing that the vocative case of the noun was given by the example, 'Adam, take me home'. First, I found Adam. Second, I specified where my house was. Third, I said, 'Adam, take me home'. In this way, I prepared to read a book by writing that the vocative case of the noun was given by the example, 'Adam, take me home'. + +18. I prepared to eat the tofu. I did this by writing that the accusative case of the noun was given by the example, 'I ate the paella'. First, I took the paella out of the oven. Second, I forked an item from it. Third, I ate the forked item. In this way, I prepared to eat the tofu by writing that the accusative case of the noun was given by the example, 'I ate the paella'. + +19. I prepared to observe Ranjit hand in his assignment. I did this by writing that the instrumental case of the noun was given by the example, 'Ranjit writes with a pen'. First, I found Ranjit. Second, I observed him writing. Third, I observed him writing with a pen. In this way, I prepared to observe Ranjit hand in his assignment by writing that the instrumental case of the noun was given by the example, 'Ranjit writes with a pen'. + +20. I prepared to encourage John to reply to Jenny's letter. I did this by writing that the dative case of the noun was given by the example, 'Jenny wrote a letter to John'. 
First, I introduced Jenny to John. Second, I encouraged Jenny to write a letter to John. Third, I timed my visit to John's house to coincide with when he received Jenny's letter. In this way, I prepared to encourage John to reply to Jenny's letter by writing that the dative case of the noun was given by the example, 'Jenny wrote a letter to John'. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green How can the program cope with real variation? 3 of 4.txt b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green How can the program cope with real variation? 3 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..8f5be28e7a0db0ffc6111f664e68dd8798d1c261 --- /dev/null +++ b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green How can the program cope with real variation? 3 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, How can the program cope with real variation? 3 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"COMPUTATIONAL ENGLISH +by Lucian Green +How can the program cope with real variation? 3 of 4 + +21. I prepared to hand Peter the trophy. I did this by writing that the ablative case of the noun was given by the example, 'Peter will run with speed'. First, I pointed out the starting line of the run to Peter. Second, I pointed out the finishing line of the run to Peter. Third, I observed Peter running with speed. In this way, I prepared to hand Peter the trophy by writing that the ablative case of the noun was given by the example, 'Peter will run with speed'. + +22. I prepared to read the book. I did this by writing that the genitive case of the noun was given by the example, 'The book is on the side of the table'. First, I observed the book. Second, I observed that the book was on the side of an object. Third, I observed that the book was on the side of the table. 
In this way, I prepared to read the book by writing that the genitive case of the noun was given by the example, 'The book is on the side of the table'. + +23. I prepared to invite you to visit me at my house. I did this by writing that the locative case of the noun was given by the example, 'Lucian lived at his house'. First, I introduced myself as Lucian. Second, I said that I lived in a building. Third, I said that I lived at my house. In this way, I prepared to invite you to visit me at my house by writing that the locative case of the noun was given by the example, 'Lucian lived at his house'. + +24. I prepared to progress to the next level in the computer game. I did this by flipping the toadstool, giving me a free life in the computer game. First, I flipped the toadstool. Second, I saw the free life behind it. Third, I jumped and took the free life. In this way, I prepared to progress to the next level in the computer game by flipping the toadstool, giving me a free life. + +25. I prepared to be counted on the freeway. I did this by examining the turnpike. First, I looked at its base. Second, I examined its stand. Third, I examined its card scanner. In this way, I prepared to be counted on the freeway by examining the turnpike. + +26. I prepared to start a family. I did this by marrying my partner. First, I chose a partner. Second, we decided to marry. Third, we married. In this way, I prepared to start a family by marrying my partner. + +27. I prepared to live out my life in a new way. I did this by coping with the change in sign. First, I found the sign. Second, I determined that it had changed. Third, I coped with this change. In this way, I prepared to live out my life in a new way by coping with the change in sign. + +28. I prepared to drink milk. I did this by eating the apple. First, I cut the apple into segments. Second, I grasped a segment. Third, I ate a segment. In this way, I prepared to drink milk by eating the apple. + +29. 
I prepared to repair climate change by preventing further damage to the ozone layer. I did this by claiming that climate change exists. First, I measured the ozone hole at time = 0 years. Second, I measured the ozone hole at time = 20 years. Third, I determined that the ozone hole had grown during this period. In this way, I prepared to repair climate change by preventing further damage to the ozone layer by claiming that climate change exists. + +30. I prepared to prevent new breasonings that encouraged carbon dioxide emissions from being written. I did this by winning the pop music competition. First, I retrieved the 100-year-old breasoning list. Second, I breasoned it out 50 times over several days. Third, I won the pop music competition. In this way, I prepared to prevent new breasonings that encouraged carbon dioxide emissions from being written by winning the pop music competition. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green How can the program cope with real variation? 4 of 4.txt b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green How can the program cope with real variation? 4 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..5df55b08102c39f1a5592fa9c1ff604802d8f6e5 --- /dev/null +++ b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green How can the program cope with real variation? 4 of 4.txt @@ -0,0 +1,9 @@ +["Green, L 2021, How can the program cope with real variation? 4 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"COMPUTATIONAL ENGLISH +by Lucian Green +How can the program cope with real variation? 4 of 4 + +31. I prepared to research Heidegger's life. I did this by meeting Heidegger's friend. First, I breasoned out 50 As. Second, I was accepted to speak at the Australasian Society for Continental Philosophy Annual Conference in 2013. Third, I met Heidegger's friend.
In this way, I prepared to research Heidegger's life by meeting Heidegger's friend. + +32. I prepared to submit my article to a journal. I did this by specifying the assignment of writing an original essay. First, I wrote the exposition. Second, I wrote the critique. Third, I connected and expanded five critique points per original point of my essay. In this way, I prepared to submit my article to a journal by specifying the assignment of writing an original essay. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Intertextuality 1 of 4.txt b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Intertextuality 1 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..5f62be4c0388f0a15c558c9fa66b1c53de4b912e --- /dev/null +++ b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Intertextuality 1 of 4.txt @@ -0,0 +1,31 @@ +["Green, L 2021, Intertextuality 1 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"COMPUTATIONAL ENGLISH +by Lucian Green +Intertextuality 1 of 4 + + + +1. Connect two texts. + +1a. If intertextuality is defined with a 'functional unit' of forming a conclusion from two sentences, one from each text, the first technique could be a possible solution. For example, given the input from sensors 'The baby has finished the meal' and the secondary text 'When the baby has finished the meal, read to him/her' (and another with the baby's name) the conclusion could be 'Read to Tom'. + +1a. I prepared to connect together people's ideas. I did this by joining the texts in an anthropological setting. First, I wrote down the text of the self. Second, I wrote down the text of the other. Third, I connected them. In this way, I prepared to connect together people's ideas by joining the texts in an anthropological setting. + +2. I prepared to add a reason to a sentence. 
I did this by linguistically joining the sentences together. First, I wrote down the first sentence, 'I fed the ant'. Second, I wrote down the second sentence, 'The ant was large'. Third, I wrote down the joint sentence, 'I fed the ant, which was large'. In this way, I prepared to add a reason to a sentence by linguistically joining the sentences together. + +3. I prepared to circumcise the ant. I did this by bridging sentences in Engineering. First, I wrote 'I drew the hexagon'. Second, I wrote 'I drew the triangle'. Third, I wrote 'I drew the triangle in the hexagon'. In this way, I prepared to circumcise the ant by bridging sentences in Engineering. + +4. I prepared to write new quasilogics. I did this by blending systemic meanings. First, I wrote a+b. Second, I wrote a^b. Third, I wrote a+^b (sic), meaning verifying a^b with one additional proposition, c, yielding a^b^c. In this way, I prepared to write new quasilogics by blending systemic meanings. + +5. I prepared to use the mouth to eat and breathe. I did this by blending bodily meanings. First, I wrote the predicate of the Prolog body program. Second, I used it for the first use. Third, I used it for the second use. In this way, I prepared to use the mouth to eat and breathe by blending bodily meanings. + +6. I prepared to answer the next question. I did this by blending algorithmic meanings. First, I wrote a+b. Second, I wrote a-b. Third, I wrote a+-b (sic), meaning verifying that a and b were different. In this way, I prepared to answer the next question by blending algorithmic meanings. + +7. I prepared to determine whether a new word should be invented. I did this by blending two words' meanings. First, I wrote the word 'happy'. Second, I wrote the word 'good'. Third, I wrote that the result of blending the meanings of the words 'happy' and 'good' was 'being happy because of being good'. In this way, I prepared to determine whether a new word should be invented by blending two words' meanings. + +8.
I prepared to grow up safely. I did this by counting the pink flower's petals like the blended meanings of two texts. First, I counted the first blended meaning of 'I love you' and 'You love me,' 'You love writing two instances of the same letter together in a word, such as 'aardvark''. Second, I counted the second blended meaning, 'You love writing the word arm in a word, such as 'armadillo''. Third, I counted the third blended meaning, 'You love writing a word with a syncopated rhythm, where a syncopated rhythm contains a half-beat, followed by a beat, followed by a half-beat, such as 'arachnid''. In this way, I prepared to grow up safely by counting the pink flower's petals like the blended meanings of two texts. + +9. I prepared to write the things that we've thought of together. I did this by writing parts of an aphorism, like pocketing a pocketwatch, to write an aphorism. First, I wrote the aphor 'apple'. Second, I wrote what the aphor was on, 'the plate'. Third, I wrote the aphorism 'It is good to be there, because of digesting the apple'. In this way, I prepared to write the things that we've thought of together by writing parts of an aphorism, like pocketing a pocketwatch, to write an aphorism. + +10. I prepared to paint the mantelpiece clock. I did this by breasoning out the mantelpiece clock. First, I wrote down that the X dimension of the mantelpiece clock was 0.05 metres. Second, I wrote down that the Y dimension of the mantelpiece clock was 0.06 metres. Third, I wrote that the 'area text' between the 'X' and 'Y' texts was '0.05 * 0.06 = 0.003 square metres'. In this way, I prepared to paint the mantelpiece clock by breasoning out the mantelpiece clock.
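The 'area text' arithmetic in the mantelpiece clock example (0.05 m by 0.06 m) can be sketched as a short script. The function name and formatting here are illustrative assumptions, not part of the original method:

```python
def area_text(x_m, y_m):
    # Form a breasoning 'area text' from X and Y dimensions given in metres.
    area = round(x_m * y_m, 6)  # round away floating-point noise
    return f"{x_m} * {y_m} = {area} square metres"

print(area_text(0.05, 0.06))  # 0.05 * 0.06 = 0.003 square metres
```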
+ +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Intertextuality 2 of 4.txt b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Intertextuality 2 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..8f974fb2ed81c877044fdd56d8a7035f8c5a4b19 --- /dev/null +++ b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Intertextuality 2 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Intertextuality 2 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"COMPUTATIONAL ENGLISH +by Lucian Green +Intertextuality 2 of 4 + +11. I prepared to play tennis at the appointed time. I did this by joining the sentences about the grandfather clock together. First, I wrote 'Grandfather clocks are the largest hall clocks'. Second, I wrote 'Clocks contain faces to tell the time'. Third, I wrote 'The grandfather clock face should be at head height to tell the time most easily'. In this way, I prepared to play tennis at the appointed time by joining the sentences about the grandfather clock together. + +12. I prepared to eat the popcorn at a certain time. I did this by writing the sentences about the Big Ben clock tower in a hierarchy. First, I wrote 'The tower stood on the ground'. Second, I wrote 'The clock was attached to the tower'. Third, I wrote 'It was good to read the clock'. In this way, I prepared to eat the popcorn at a certain time by writing the sentences about the Big Ben clock tower in a hierarchy. + +13. I prepared to use the clock while surfing. I did this by stepping through sentences about the octagon clock. First, I saw that the clock has surrounded by a circle. Second, I noticed that the circle was surrounded by an octagon. Third, I observed that the clock spoke the time each hour. In this way, I prepared to use the clock while surfing by stepping through sentences about the octagon clock. + +14. 
I prepared to smile at the robot. I did this by developing a robot head as simple as a stylized icon clock dial. First, I watched the clock smile at 9:15. Second, I watched the clock smile at 2:45. Third, I wound it up to 9:15 again. In this way, I prepared to smile at the robot by developing a robot head as simple as a stylized icon clock dial. + +15. I prepared to write that each organ's text was connected to each other organ's text by a text. I did this by writing how objects flowing through a world described by two joined sentences were like those flowing through the body. First, I wrote the text 'I had a mouth'. Second, I wrote the text 'I had the food and drink pipe'. Third, I wrote 'The water from my mouth was swallowed into my food and drink pipe'. In this way, I prepared to write that each organ's text was connected to each other organ's text by a text by writing how objects flowing through a world described by two joined sentences were like those flowing through the body. + +16. I prepared to write that a second self moved towards the second other. I did this by writing that the self's text was transformed into the other's text. First, I wrote, 'I am the self'. Second, I wrote, 'You are the other'. Third, I wrote, 'The self positively moved towards the other'. In this way, I prepared to write that a second self moved towards the second other by writing that the self's text was transformed into the other's text. + +17. I prepared to be with you, like eating jam. I did this by opening the text with the other text, like a spoon. First, I wrote, 'I like you'. Second, I wrote, 'You like me'. Third, I wrote, 'We made friends'. In this way, I prepared to be with you, like eating jam, by opening the text with the other text, like a spoon. + +18. I prepared to write an argument as a single chain of reasons, so that each reason had no more than one reason attached to it above it. I did this by connecting two uses for an action together.
First, I wrote the first use for 'I paid for the jam,' 'I opened the jar of jam with the spoon'. Second, I wrote the second use, 'I tasted the jam using my spatula'. Third, I connected these two uses together to be 'I ate the jam with the spoon'. In this way, I prepared to write an argument as a single chain of reasons, so that each reason had no more than one reason attached to it above it, by connecting two uses for an action together. + +19. I prepared to tell the tale. I did this by transforming 'I am the best' into 'I am' with 'therefore,' in literature. First, I wrote 'I am the best'. Second, I wrote 'I am'. Third, I wrote 'I am the best, therefore I am' because I survived. In this way, I prepared to tell the tale by transforming 'I am the best' into 'I am' with 'therefore,' in literature. + +20. I prepared to teach the child medicine. I did this by collecting the way's text to metaphysically open a child's life's text. First, I wrote from a single 80-breasonings-long A to 50 250-breasonings-long pedagogical arguments. Second, I breasoned out each object's X, Y and Z dimensions in each sentence of each argument. Third, I listened to the news about the child being conceived and observed her being born. In this way, I prepared to teach the child medicine by collecting the way's text to metaphysically open a child's life's text.
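The breasoning counts above (a single 80-breasonings-long A, expanded to 50 pedagogical arguments of 250 breasonings each) imply a simple tally. This sketch only does that arithmetic; the function and variable names are hypothetical, not drawn from the original texts:

```python
def total_breasonings(num_arguments, breasonings_each):
    # Total number of breasonings across a set of pedagogical arguments.
    return num_arguments * breasonings_each

single_a = 80                          # one 80-breasonings-long A
expanded = total_breasonings(50, 250)  # 50 arguments of 250 breasonings each
print(expanded)                        # 12500 breasonings in total
```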
+ +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Intertextuality 3 of 4.txt b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Intertextuality 3 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..537da545e05a418880ea744c431a5c1307ab0769 --- /dev/null +++ b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Intertextuality 3 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Intertextuality 3 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"COMPUTATIONAL ENGLISH +by Lucian Green +Intertextuality 3 of 4 + +21. I prepared to keep my mind and body active. I did this by collecting the way's text to metaphysically keep my life's text open by supporting it. First, I wrote from a single 80-breasonings-long A to 50 250-breasonings-long pedagogical argument. Second, I breasoned out each object's X, Y and Z dimensions in each sentence of the argument. Third, I supported myself living. First, I prepared to keep my mind and body active by collecting the way's text to metaphysically keep my life's text open by supporting it. + +22. I prepared to wear appropriate clothing to prevent being subject to terrorism (e.g. walk in a straight line). I did this by verifying that the shirt's text matched the text about what I should wear. First, I aimed to walk in the house. Second, I wrote that I should wear a shirt to walk in the house. Third, I wore the shirt. In this way, I prepared to wear appropriate clothing to prevent being subject to terrorism (e.g. walk in a straight line) by verifying that the shirt's text matched the text about what I should wear. + +23. I prepared to serve the peaches and cream. I did this by treating the child (who had a text) to the liquefied prune (which had a text). First, I asked her to eat the liquefied prune. Second, I spooned them into the consenting child's mouth. 
Third, I asked her to swallow it. In this way, I prepared to serve the peaches and cream by treating the child (who had a text) to the liquefied prune (which had a text). + +24. I prepared to turn the key to raise the model sun. I did this by verifying the text was clear using a style guide. First, I verified that the first sentence referred to an object. Second, I prepared to verify that the next sentence referred to an object that was connected to an object previously referred to in the paragraph. Third, I repeated this until I had verified that each sentence referred to an object that was connected to an object previously referred to in the paragraph, where a system is a set of joined objects in each paragraph. In this way, I prepared to turn the key to raise the model sun by verifying the text was clear using a style guide. + +25. I prepared to flip the argument vertically as part of the lower half of the Computational English diamond. I did this by verifying the reasoning was correct using a reasoning guide. First, I verified the first reasoning, e.g. D<-C. Second, I prepared to verify the second reasoning, e.g. C<-B. Third, I repeated this until I had verified each reasoning, e.g. B<-A. In this way, I prepared to flip the argument vertically as part of the lower half of the Computational English diamond by verifying the reasoning was correct using a reasoning guide. + +26. I prepared to be myself. I did this by verifying the raison d'etre (reason to be) was correct using a raison d'etre guide. First, I verified that the first grammar was correct, and so should be part of my being. Second, I prepared to verify that the second grammar was correct, and so should be part of my being. Third, I repeated this until I had verified that each grammar was correct, and so should be part of my being. In this way, I prepared to be myself by verifying the raison d'etre (reason to be) was correct using a raison d'etre guide. + +27.
I prepared to present the pot, representing having had enough exercise. I did this by writing that ^ (and) symbolized placing a new counter in the pot. First, I placed a counter in the pot. Second, I prepared to place another counter in the pot, where 'I placed a counter in the pot' and 'I placed another counter in the pot' were in conjunction. Third, I repeated this until I had placed all the counters in the pot, where all the statements that I had placed a counter in the pot were in conjunction. In this way, I prepared to present the pot, representing having had enough exercise by writing that ^ (and) symbolized placing a new counter in the pot. + +28. I prepared to present the group of pots, one of which represented an athlete in the group having had enough preparation to win. I did this by writing that v (or) symbolized placing a pot on the table. First, I placed a pot on the table. Second, I prepared to place another pot on the table, where 'I placed a pot on the table' and 'I placed another pot on the table' were in disjunction. Third, I repeated this until I had placed all the pots on the table, where all the statements that I had placed a pot on the table were in disjunction. In this way, I prepared to present the group of pots, one of which represented an athlete in the group having had enough preparation to win by writing that v (or) symbolized placing a pot on the table. + +29. I prepared to measure the distance between the self and the other. I did this by representing the self/other pair at points along the X axis. First, I observed that the self was positioned at (1, 0, 0) (in metres). Second, I observed that the other was positioned at (2, 0, 0) (in metres). Third, given that the Y and Z values of the positions of the self and other, respectively, were equal to 0, I determined that the self and other were positioned at different points along the X axis. 
In this way, I prepared to measure the distance between the self and the other by representing the self/other pair at points along the X axis. + +30. I prepared to measure the distance and time between the person's positions and the times at those positions, respectively. I did this by representing two points in space along the Y axis. First, I observed that the person was positioned at (0, 1, 0) (in metres) at time = 1 second. Second, I observed that the person was positioned at (0, 2, 0) (in metres) at time = 2 seconds. Third, given that the X and Z values of the positions of the person were equal to 0 at 1 and 2 seconds, I determined that the person was positioned at different points along the Y axis at different points in time. In this way, I prepared to measure the distance and time between the person's positions and the times at those positions, respectively by representing two points in space along the Y axis. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Intertextuality 4 of 4.txt b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Intertextuality 4 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..3c1322d253f9cb491b93c74e9c0c93a1529e3659 --- /dev/null +++ b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Intertextuality 4 of 4.txt @@ -0,0 +1,29 @@ +["Green, L 2021, Intertextuality 4 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"COMPUTATIONAL ENGLISH +by Lucian Green +Intertextuality 4 of 4 + +31. I prepared to measure the distance between the positions in space of Sam and Tony. I did this by representing the positions of Sam and Tony in space along the Z axis. First, I observed that Sam was positioned at (0, 0, 1) (in metres). Second, I observed that Tony was positioned at (0, 0, 2) (in metres). 
Third, given that the X and Y values of the positions of Sam and Tony were equal to 0, I determined that Sam and Tony were positioned at different points in space on the Z axis. In this way, I prepared to measure the distance between the positions in space of Sam and Tony by representing the positions of Sam and Tony in space along the Z axis. + +32. I prepared to connect two texts with a single object. I did this by breasoning out (thinking of the X, Y and Z dimensions of) the best set of objects connecting actions from two texts. First, I wrote the first text. Second, I wrote the second text. Third, I wrote the fastest object connecting the last object in the first text to the first object in the second text. In this way, I prepared to connect two texts with a single object by breasoning out (thinking of the X, Y and Z dimensions of) the best set of objects connecting actions from two texts. + +33. I prepared to receive returns for providing a service. I did this by providing breasonings currency services, etc. First, I provided a pedagogy service. Second, I provided a medicine service. Third, I provided a meditation service. In this way, I prepared to receive returns for providing a service by providing breasonings currency services, etc. + +34. I prepared to safely verify intertextual connections. I did this by following safety guidelines to avoid thinking about breasonings currency at unsafe times to do so. First, I followed safety guidelines in not thinking about breasonings currency when driving. Second, I followed safety guidelines in not thinking about breasonings currency when reading codes in public. Third, I ran the appropriate breasonings currency computer program. In this way, I prepared to safely verify intertextual connections by following safety guidelines to avoid thinking about breasonings currency at unsafe times to do so. + +35. I prepared to do one task at a time. I did this by surpassing breasonings currency.
First, I worked on the breasonings currency before the time. Second, I put them away at the time. Third, I worked on the current task. In this way, I prepared to do one task at a time by surpassing breasonings currency. + +36. I prepared to bracket the connection between the breasonings currency and my job. I did this by subordinating breasonings currency in memory, not thinking of it. First, I thought of the breasonings currency. Second, I thought of its X, Y and Z dimensions and the fact that it had no other content. Third, I focused only on my job. In this way, I prepared to bracket the connection between the breasonings currency and my job by subordinating breasonings currency in memory, not thinking of it. + +37. I prepared to connect the breasonings currency and the product. I did this by stating that the government tennis tournament prize was breasonings currency. First, I observed the player win the tournament. Second, I handed him the token for the transferred breasonings currency. Third, I observed him spend the money on the product. In this way, I prepared to connect the breasonings currency and the product by stating that the government tennis tournament prize was breasonings currency. + +38. I prepared to store the pointers to the breasonings currency in the bank, with no pointers to it (only online banking). I did this by stating that the community provided the breasonings currency prize. First, I stated that one person could write many breasonings currencies. Second, I stated that one person could also receive many breasonings currencies. Third, I observed them being tallied and transferred to the recipient's account. In this way, I prepared to store the pointers to the breasonings currency in the bank, with no pointers to it (only online banking) by stating that the community provided the breasonings currency prize. + +39. I prepared to make all breasonings currency original and transaction-exclusive.
I did this by writing the plagiarism detector for low-cost arguments. First, I searched for each line in the online database. Second, I detected the percentage plagiarised (and the percentage of original lines). Third, I returned the plagiarised lines (kept the original lines). In this way, I prepared to make all breasonings currency original and transaction-exclusive by writing the plagiarism detector for low-cost arguments. + +40. I prepared to write the profit as breasonings currency. I did this by writing an area of study about economic considerations about the product paid for with breasonings currency. First, I counted the company expenses. Second, I calculated the revenue earned. Third, I calculated the profit made. In this way, I prepared to write the profit as breasonings currency by writing an area of study about economic considerations about the product paid for with breasonings currency. + +41. I prepared to guide writing to be future-oriented. I did this by writing economic considerations specifically for breasonings currency. First, I considered the fact that breasoning currency took more storage space than value-only currency. Second, I observed its use-by date. Third, I updated it. In this way, I prepared to guide writing to be future-oriented by writing economic considerations specifically for breasonings currency. + +42. I prepared to demonstrate equality in breasoning currency. I did this by achieving equality through the LMS (Lucianic Marking Scheme). First, I stated that disagreeing in the first half didn't annul the breasonings currency. Second, I observed that agreement and disagreement earned the same grade, used for currency value. Third, I observed that objections and rebuttals were taken into account in determining whether the essay agreed or disagreed. In this way, I prepared to demonstrate equality in breasoning currency by achieving equality through the LMS (Lucianic Marking Scheme).
+ +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Kolmogorov Hermeneutics 1 of 4.txt b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Kolmogorov Hermeneutics 1 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..9285114d9f054f3e7b988fecd6f536246a4a2b51 --- /dev/null +++ b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Kolmogorov Hermeneutics 1 of 4.txt @@ -0,0 +1,33 @@ +["Green, L 2021, Kolmogorov Hermeneutics 1 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"COMPUTATIONAL ENGLISH +by Lucian Green +Kolmogorov Hermeneutics 1 of 4 + + + +1. Determine properties of a narrative in terms of its file length. + +The first technique can simulate 'Kolmogorov-type' writing of programs to interpret texts. + +Once texts have been interpreted (using hierarchies of arguments with the text as the child nodes and the main conclusion as the root) interpretations of other texts can be merged with it enabling it to be applied to different texts in the future. For example, synonyms can be added for words in a sentence to create possible new interpretations. Also, sentences with new grammars (synogrammars) can be added in disjunction to sentences. + +1a. I and a peer-reviewer prepared to double-blindly (which, in humanities, means to assess the object the sentences without knowing the subjects of the sentences) verify the best interpretation of the author. I did this by interpreting the author. First, I wrote down the first interpretation of what the author wrote. Second, I wrote down the second interpretation of what the author wrote. Third, I chose the best interpretation of what the author wrote. 
In this way, a peer-reviewer and I prepared to double-blindly (which, in humanities, means to assess the objects of the sentences without knowing the subjects of the sentences) verify the best interpretation of the author by interpreting the author. + +2. I prepared to fulfill the editorial criterion. I did this by writing for the reader. First, I investigated the demographic property. Second, I scintillated the demographic property. Third, I verified this with a professor. In this way, I prepared to fulfill the editorial criterion by writing for the reader. + +3. I prepared to write about psychology, sociology and medicine. I did this by writing about hermeneutics. First, I wrote down ideas about interpretation. Second, I prepared to write down ideas about the next level of interpretation. Third, I repeated this until I had written down ideas about each level of interpretation. In this way, I prepared to write about psychology, sociology and medicine by writing about hermeneutics. + +4. I prepared to be promoted. I did this by employing an employee. First, I wrote down what the employee wanted. Second, I wrote down what the employee liked. Third, I gave it to him. In this way, I prepared to be promoted by employing an employee. + +5. I prepared to ascertain that the parents of a child told their child their stories. I did this by working out that the parents' stories determined the length in syllables of their child's name. First, I worked out that the parents' first story determined their child's name's first syllable. Second, I prepared to work out that the parents' next story determined their child's name's next syllable. Third, I repeated this until I had worked out that each of the parents' stories determined each syllable of their child's name. In this way, I prepared to ascertain that the parents of a child told their child their stories by working out that the parents' stories determined the length in syllables of their child's name. + +6.
I prepared to recognise the child being free. I did this by recording the child's story. First, I recorded the child's primary school years. Second, I recorded the child's secondary school years. Third, I recorded the child's adult life. In this way, I prepared to recognise the child being free by recording the child's story. + +7. I prepared to increase the longevity of the household. I did this by inserting the mineral dispenser inside the water tap. First, I unscrewed the tap halves. Second, I inserted the mineral dispenser in the tap. Third, I screwed the tap halves together. In this way, I prepared to increase the longevity of the household by inserting the mineral dispenser inside the water tap. + +8. I prepared to comment on an independent secondary school student. I did this by observing the employees. First, I met him at work. Second, I observed him at regular intervals. Third, I signed him out at the end of his shift. In this way, I prepared to comment on an independent secondary school student by observing the employees. + +9. I prepared to dash on stage and exclaim, 'I lead you'. I did this by writing the philosophy that there are three parts of everything. First, I adorned the stage. Second, I drew the Setting. Third, I observed the audience. In this way, I prepared to dash on stage and exclaim, 'I lead you' by writing the philosophy that there are three parts of everything. + +10. I prepared to read the magic (pedagogy) story book. I did this by writing the philosophy that there are three parts of an object. First, I probed the surface. Second, I dug into the mantle. Third, I hypothesised about the core. In this way, I prepared to read the magic (pedagogy) story book by writing the philosophy that there are three parts of an object. 
+ +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Kolmogorov Hermeneutics 2 of 4.txt b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Kolmogorov Hermeneutics 2 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..32a4d5ac699951c388136e61d14c4138c8e576c5 --- /dev/null +++ b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Kolmogorov Hermeneutics 2 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Kolmogorov Hermeneutics 2 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"COMPUTATIONAL ENGLISH +by Lucian Green +Kolmogorov Hermeneutics 2 of 4 + +11. I prepared to test a philosophy over time. I did this by writing the philosophy that there are three parts of a philosophy. First, I read the introduction to Melchior. Second, I enjoyed the mid-section on the set of Excelsior. Third, I mercurially concluded the endgame. In this way, I prepared to test a philosophy over time by writing the philosophy that there are three parts of a philosophy. + +12. I prepared to connect famous texts together. I did this by writing the philosophy that there are three parts of a text. First, I wrote about the sublime. Second, I wrote about humanist pedagogy. Third, I wrote about difference. In this way, I prepared to connect famous texts together by writing the philosophy that there are three parts of a text. + +13. I prepared to teach Lucianic Meditation (in fact, Lucianic Philosophy). I did this by writing a Lucianic Meditation degree (in fact, Lucianic Philosophy degree) First, I wrote the Lucianic Meditation philosophy (in fact, Lucianic Philosophy). Second, I wrote the uses for the Lucianic Meditation technique (in fact, Lucianic Philosophy). Third, I wrote the Lucianic Meditation utterance (in fact, Lucianic Philosophy) texts. 
In this way, I prepared to teach Lucianic Meditation (in fact, Lucianic Philosophy) by writing a Lucianic Meditation (in fact, Lucianic Philosophy) degree. + +14. I prepared to use money to teach Lucianic Meditation (in fact, Lucianic Philosophy). I did this by gathering money to teach Lucianic Meditation (in fact, Lucianic Philosophy). First, I wrote the prospectus. Second, I held the opening. Third, I followed up with the potential sponsors. In this way, I prepared to use money to teach Lucianic Meditation (in fact, Lucianic Philosophy) by gathering money to teach Lucianic Meditation (in fact, Lucianic Philosophy). + +15. I prepared to accredit my Lucianic Philosophy degree. I did this by using money to teach Lucianic Meditation (in fact, Lucianic Philosophy). First, I paid the monthly bank fee. Second, I paid the volunteer insurance fee. Third, I paid to hire the centre. In this way, I prepared to accredit my Lucianic Philosophy degree by using money to teach Lucianic Meditation (in fact, Lucianic Philosophy). + +16. I prepared to accrue enough students to accredit the Academy. I did this by operating the Lucianic Meditation (in fact, Lucianic Philosophy) Academy. First, I taught the classes. Second, I marked the students' work. Third, I took the money. In this way, I prepared to accrue enough students to accredit the Academy by operating the Lucianic Meditation (in fact, Lucianic Philosophy) Academy. + +17. I prepared to award a prize to the student who answered the training questions with the most unusual answers. I did this by training the teachers in Lucianic Meditation (in fact, Lucianic Philosophy). First, I assessed the possible teachers in Human Resources. Second, I trained the teachers in Lucianic Philosophy. Third, I trained the teachers in Job Requirements and Safety.
In this way, I prepared to award a prize to the student who answered the training questions with the most unusual answers by training the teachers in Lucianic Meditation (in fact, Lucianic Philosophy). + +18. I prepared to train meditation centre (philosophy centre) managers. I did this by training Lucianic Meditation students (Lucianic Philosophy students) in appearances of God (master). First, I explained the master would appear when an employee was protected from work requirements (given an A for the work requirements). Second, I explained the master would appear when the meditator retained good health (given an A for good health). Third, I explained the master would appear when a student was helped to perform well on an exam (given an A to perform well on an exam). In this way, I prepared to train meditation centre (philosophy centre) managers by training Lucianic Meditation students (Lucianic Philosophy students) in appearances of God (master). + +19. I prepared to graduate the student. I did this by accrediting 'A' with 20 mantras/student/day (1 A given as a reply to each A). First, I set the task. Second, I assessed the task. Third, I gave feedback to the student. In this way, I prepared to graduate the student by accrediting 'A' with 20 mantras/student/day (1 A given as a reply to each A). + +20. I prepared to reach out for the teacher's help when I had finished. I (a vocational education and training, or VET student) represented my skills by progressing from the start of the conclusion. First, I gave the conclusion. Second, I watched her progress from the start of the conclusion. Third, I watched her finish the conclusion. In this way, I prepared to reach out for the teacher's help when I had finished. I represented my skills by progressing from the start of the conclusion.
+ +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Kolmogorov Hermeneutics 3 of 4.txt b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Kolmogorov Hermeneutics 3 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..70e7d082a3ce08dec1be6e2ce60058ca80813b5b --- /dev/null +++ b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Kolmogorov Hermeneutics 3 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Kolmogorov Hermeneutics 3 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"COMPUTATIONAL ENGLISH +by Lucian Green +Kolmogorov Hermeneutics 3 of 4 + +21. I prepared to breason out the seen-as object, for each sentence of a VET course I wrote, with 5 breasonings. I did this by writing 20 breasonings (an 80 breasoning-long A) per sentence. First, I wrote the question's A. Second, I wrote the answer's A. Third, I wrote a connection A between the question A and answer A. In this way, I prepared to breason out the seen-as object, for each sentence of a VET course I wrote, with 5 breasonings by writing 20 breasonings (an 80 breasoning-long A) per sentence. + +22. I prepared to use the symbols. I did this by defining the symbols I used. First, I wrote down the heart symbol. Second, I wrote down the definition 'A'. Third, I labeled the table 'Symbols Used'. In this way, I prepared to use the symbols by defining the symbols I used. + +23. I prepared to teach the students face-to-face at the start of the course. I did this by teaching Lucianic Meditation (Lucianic Philosophy) online. First, I typographed the Book of Readings and Assessment Book. Second, I advertised the course. Third, I emailed the Books and the due date for all assessments to the student. In this way, I prepared to teach the students face-to-face at the start of the course by teaching Lucianic Meditation (Lucianic Philosophy) online. + +24. 
I prepared to instruct the students how to retrieve the A. I did this by uploading A to the system. First, I researched the A. Second, I wrote the A. Third, I uploaded the A. In this way, I prepared to instruct the students how to retrieve the A by uploading A to the system. + +25. I prepared to access the A on the system. I did this by writing a 10-breasoning-long sequence (80-breasoning-long A). First, I wrote the A. Second, I gave it to the teacher. Third, he gave me a love-of-wisdom letter as a reply. In this way, I prepared to access the A on the system by writing a 10-breasoning-long sequence (80-breasoning-long A). + +26. I prepared to achieve my goal of kicking a goal. I did this, as a retired teacher who was church-going (philosophy reading), by practising meditation of the same standard as Buddhism, Transcendental Meditation and Lucianic Meditation (MSSBTL) (reading a philosophy book). First, I learnt meditation (Philosophy of Art) from an MSSBTL (appropriate) teacher. Second, I meditated using 50 Breasonings per Utterance by repeating 50 sets of 5 breasonings, where the 5 breasonings permeated in 5 seconds (breasoned out an A in my own time). Third, I practised meditation (read literature) each day. In this way, I prepared to achieve my goal of kicking a goal as a retired teacher who was church-going (philosophy reading) by practising meditation of the same standard as Buddhism, Transcendental Meditation and Lucianic Meditation (MSSBTL) (reading a philosophy book). + +27. I prepared to spread peace and happiness throughout the world. I did this by planning the tour of Lord (philosopher) Lucian. First, I applied for government tour grants. Second, I applied for funding from philanthropists. Third, I taught meditation to the public. In this way, I prepared to spread peace and happiness throughout the world by planning the tour of Lord (philosopher) Lucian. + +28. I prepared to open a centre in each city.
I did this by applying for government tour grants by preparing for funding meetings by breasoning out 5 As, followed by 45 As in terms of brain sacrifices (Aigs, which are sets of As on systems, in fact, an A). First, I wrote my aim. Second, I wrote my reasons. Third, I wrote a list of my supporters. In this way, I prepared to open a centre in each city by applying for government tour grants. + +29. I prepared to attract regional support. I did this by attracting philanthropists. First, I prepared for broadcasts by breasoning out 250 breasonings. Second, I wrote the prospectus of school business. Third, I compiled the education materials. In this way, I prepared to attract regional support by attracting philanthropists. + +30. I prepared to encourage meditators' friends to come to the centre to learn meditation. I did this by teaching meditation in public by breasoning out 5 As. First, I said the mantra. Second, I told the audience members to repeat the mantra for 20 minutes twice per day silently in their heads. Third, I said this would cultivate and remove thoughts from their minds, help their thoughts become more and more cultivated and surpass their thoughts. In this way, I prepared to encourage meditators' friends to come to the centre to learn meditation by teaching meditation in public by breasoning out 5 As. 
+ +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Kolmogorov Hermeneutics 4 of 4.txt b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Kolmogorov Hermeneutics 4 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..7af4f24ebd8661c7512474f989ee1cfe42b04d32 --- /dev/null +++ b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Kolmogorov Hermeneutics 4 of 4.txt @@ -0,0 +1,9 @@ +["Green, L 2021, Kolmogorov Hermeneutics 4 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"COMPUTATIONAL ENGLISH +by Lucian Green +Kolmogorov Hermeneutics 4 of 4 + +31. I prepared to state that each family was pedagogically protected. I did this by determining that the head of state appointed 1/4 as pedagogues. First, I wrote marketing materials. Second, I sourced financial support. Third, I taught the pedagogy students in-person. In this way, I prepared to state that each family was pedagogically protected by determining that the head of state appointed 1/4 as pedagogues. + +32. I prepared to state that each extended family was meditationally protected. I did this by determining that the head of state appointed 1/8 as MSSBTL meditators. First, I wrote marketing materials. Second, I sourced financial support. Third, I taught the pedagogy students in-person. In this way, I prepared to state that each extended family was meditationally protected by determining that the head of state appointed 1/8 as MSSBTL meditators. 
+ +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Lenses 1 of 4.txt b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Lenses 1 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..d003fbf0ea9ab715f98f6d41b9d64b1ed545df96 --- /dev/null +++ b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Lenses 1 of 4.txt @@ -0,0 +1,29 @@ +["Green, L 2021, Lenses 1 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"COMPUTATIONAL ENGLISH +by Lucian Green +Lenses 1 of 4 + + + +1. The relation between content and the lenses used to see it among different areas of study is likely to be of interest in the future. Psychological research into cultural phenomena, for example the type Shakespeare is based on may function to query real life scenarios similar to those in plays, used to critically examine the plot of plays using mathematical modelling and prediction, and graph findings using psychologically attuned representations such as timelines, character interrelationships or language analysis. + +1a. I prepared to examine the setting. I did this by looking through the lens. First, I stood behind the lens. Second, I aligned the centre of my eye with the centre of the lens. Third, I looked through the lens. In this way, I prepared to examine the setting by looking through the lens. + +2. I prepared to be immortal (sustain a company's life). I did this by writing a business model that made me famous after my life. First, I produced products. Second, I made sure that each customer kept coming back to buy each new product. Third, I made sure that this repeated after my life. In this way, I prepared to be immortal (sustain a company's life) by writing a business model that made me famous after my life. + +3. I prepared to protect people in a state. 
I did this by observing the King (community leader) setting up the church (meditation centre system, philosophy school system). First, I produced a meditation (philosophy) product. Second, I employed a centre manager. Third, I opened centres in the places I had visited. In this way, I prepared to protect people in a state by observing the King (community leader) setting up the church (meditation centre system, philosophy school system). + +4. I prepared to see the continuance of the religion through the ages. I did this by breasoning out 1 A per day to keep LM alive. First, I appointed a time to breason out an A each day. Second, I breasoned out an A each day. Third, I arranged for an employee to do this in my absence. In this way, I prepared to see the continuance of the religion through the ages by breasoning out 1 A per day to keep LM alive. + +5. I prepared to spread meditation (schools) through the state. I did this by setting up a meditation centre with group meditation (class) twice per week. First, I set up a centre. Second, I held group meditation (class) on the first day. Third, I held group meditation (class) on the second day. In this way, I prepared to spread meditation (schools) through the state by setting up a meditation centre with group meditation (class) twice per week. + +6. I prepared to achieve world peace. I did this by observing the centre meditating on subjects whom meditators meditated on, about not for the subjects, by giving A to each centre visitor. First, meditators meditated about subjects, for themselves. Second, the centre gave A to each centre visitor. Third, the centre meditated on each subject. In this way, I prepared to achieve world peace by observing the centre meditating on subjects whom meditators meditated on, about not for the subjects, by giving A to each centre visitor. + +7. I prepared to create a neurobot. I did this by discovering the code in science. First, I discovered the objects involved.
Second, I found the ontologies concerned. Third, I wrote how these ontologies interrelated. In this way, I prepared to create a neurobot by discovering the code in science. + +8. I prepared to swallow the slice of apple. I did this by chewing the apple slice. First, I cut the slice of apple. Second, I placed the slice of apple in my mouth. Third, I chewed the slice of apple. In this way, I prepared to swallow the slice of apple by chewing the apple slice. + +9. I prepared to correctly determine a man's sexual orientation. I did this by stating the gay man didn't want the woman, whether or not she was in a man's body. First, I verified that the man was gay. Second, I verified the second person was a woman. Third, I verified that the man wasn't attracted to the woman. In this way I prepared to correctly determine a man's sexual orientation by stating the gay man didn't want the woman, whether or not she was in a man's body. + +10. I prepared to read the book. I did this by eating the cantaloupe with the tongs. First, I placed the slice of cantaloupe in the bowl. Second, I gripped the cantaloupe with the tongs. Third, I ate the cantaloupe. In this way, I prepared to read the book by eating the cantaloupe with the tongs. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Lenses 2 of 4.txt b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Lenses 2 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..b580b1b35b6b8a5b9c25178c3f93626e1c72fff5 --- /dev/null +++ b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Lenses 2 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Lenses 2 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"COMPUTATIONAL ENGLISH +by Lucian Green +Lenses 2 of 4 + +11. I prepared to found the most prestigious University. 
I did this by noting that the University's assignments were supported with 250 250 breasoning As. First, each student was given 50 80 breasoning As through books' breasoning lists, their own work and lecturers' help. Second, 16 of these students were educated in this fashion. Third, the students experienced each other in high quality ways. In this way, I prepared to found the most prestigious University by noting that the University's assignments were supported with 250 250 breasoning As. + +12. I prepared to walk along the path. I did this by critically analysing (agreeing with) the philosophical seen-as version of God. First, I wrote down God's action. Second, I wrote down the philosophical seen-as version of God's action. Third, I critically analysed this philosophy. In this way, I prepared to walk along the path by critically analysing (agreeing with) the philosophical seen-as version of God. + +13. I prepared to wear shoes. I did this by loving myself by giving myself a foot rub. First, I rubbed the soles of my feet. Second, I rubbed the tops of my feet. Third, I rubbed the sides of my feet. In this way, I prepared to wear shoes by loving myself by giving myself a foot rub. + +14. I prepared to receive your payment. I did this by giving you a back rub. First, I rubbed the top of your back. Second, I rubbed the middle of your back. Third, I rubbed the bottom of your back. In this way, I prepared to receive your payment by giving you a back rub. + +15. I prepared to exhibit my photograph. I did this by using the camera (product). First, I opened the box in the dark. Second, I placed photographic paper on the opposite wall from the aperture in the box in the dark. Third, I opened the aperture for 5 seconds to take a photograph in the light. In this way, I prepared to exhibit my photograph by using the camera (product). + +16. I prepared to report a survey of diverse sexualities. 
I did this by noting that people with different sexualities had similar outward physical characteristics by gender. First, I noted the gay women's appearances. Second, I noted the straight women's appearances. Third, I concluded they were similar. In this way, I prepared to report a survey of diverse sexualities by noting that people with different sexualities had similar outward physical characteristics by gender. + +17. I prepared to make sure As existed. I did this by breasoning out (visualising the X, Y and Z dimensions of) the following A's objects for meditation (philosophy) for me about my meditation (philosophy) list members. First, I dotted on the A. Second, I breasoned out the following A's objects: apple (1, 1, 1 cm), apple (2, 2, 2 cm), apple (3, 3, 3 cm), apple (1.5, 1.5, 1.5 cm), apple (2.5, 2.5, 2.5 cm), apple (3, 3, 3 cm), apple (4, 4, 4 cm), apple (5, 5, 5 cm), apple (1.5, 1.5, 1.5 cm) and apple (2.5, 2.5, 2.5 cm). Third, I took off the dot for the A. In this way, I prepared to make sure As existed by breasoning out (visualising the X, Y and Z dimensions of) the following A's objects for meditation (philosophy) for me about my meditation (philosophy) list members. + +18. I prepared to experience marital bliss. I did this by breasoning out a 10 breasoning A each day for my relationship with my partner. First, I breasoned out his ball. Second, I breasoned out his rod. Third, I breasoned out his face. In this way, I prepared to experience marital bliss by breasoning out a 10 breasoning A each day for my relationship with my partner. + +19. I prepared to experience bliss on Earth. I did this by breasoning out a 10 breasoning A each day for people who I met to experience 20 breasonings (12.5 As). First, I taught them meditation (philosophy). Second, I taught them yoga. Third, I taught them yoga. In this way, I prepared to experience bliss on Earth by breasoning out a 10 breasoning A each day for people who I met to experience 20 breasonings (12.5 As). + +20. I prepared to avoid a headache and death (be happy). 
I did this by breasoning out a 10 breasoning A each day for a recursive 'It's you' to be said to everyone, their contacts, etcetera, to meditate. First, I posted the letter to each person's family members. Second, I posted the letter to each person's friends. Third, I repeatedly posted the letter to family and friends of family and friends. In this way, I prepared to avoid a headache and death (be happy) by breasoning out a 10 breasoning A each day for a recursive 'It's you' to be said to everyone, their contacts, etcetera, to meditate. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Lenses 3 of 4.txt b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Lenses 3 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..cf6bda58cb1e6d8da2bfd90cbcb4c3f9d9b5a9f5 --- /dev/null +++ b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Lenses 3 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Lenses 3 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"COMPUTATIONAL ENGLISH +by Lucian Green +Lenses 3 of 4 + +21. I prepared to keep the religion alive. I did this by breasoning out a 10 breasoning A each day for anyone who indicated meditation (societology) on that day. First, I asked all my students whether they had meditated (read a societology book). Second, they asked all their students whether their students had meditated (read a societology book). Third, they repeated this until everyone had been given an A for meditation (read a page about societology). In this way, I prepared to keep the religion alive by breasoning out a 10 breasoning A each day for anyone who indicated meditation (societology) on that day. + +22. I prepared to watch the parade on television. I did this by letting the rows of citizens go past. First, I let the first row of citizens go past. 
Second, I prepared to let the next row of citizens go past. Third, I repeated this until all the rows of citizens had gone past. In this way, I prepared to watch the parade on television by letting the rows of citizens go past. + +23. I prepared to record the smell. I did this by smelling the floret unit. First, I positioned the floret under my nose. Second, I inhaled. Third, I smelt the floret. In this way, I prepared to record the smell by smelling the floret unit. + +24. I prepared to protect the bird and his family. I did this by preventing a nest being made, where the dog would have eaten the empty nest's contents. First, I found the nest in the flow tree. Second, I destroyed the nest. Third, I let the dog into the back garden. In this way, I prepared to protect the bird and his family by preventing a nest being made, where the dog would have eaten the empty nest's contents. + +25. I prepared to sustain peace. I did this by communicating using the 'telegraph' pole. First, I placed telegraph poles along the path the message was required to be passed along. Second, I threaded wire along the row of telegraph poles. Third, I communicated a message in the form of a signal along the wire from me (the sender) to my receiver. In this way, I prepared to sustain peace by communicating using the 'telegraph' pole. + +26. I prepared to upgrade the technology. I did this by stably installing the telegraph pole. First, I wrapped the wire in an insulator. Second, I dug a hole. Third, I installed the telegraph pole in the hole with the wire attached to the top of the pole. In this way, I prepared to upgrade the technology by stably installing the telegraph pole. + +27. I prepared to perform the operation. I did this by sterilising the instruments. First, I placed them in the medium. Second, I started the timer. Third, I removed them from the medium after the time was up. In this way, I prepared to perform the operation by sterilising the instruments. + +28. 
I prepared to help bring the man's sight back. I did this by removing the glaucoma from the eye. First, I prepared for the operation. Second, I performed the operation. Third, I finished the operation. In this way, I prepared to help bring the man's sight back by removing the glaucoma from the eye. + +29. I prepared to absorb the blood. I did this by placing a sponge during the operation. First, I saw where there was blood. Second, I placed the sponge there. Third, I recorded the sponge number and location. In this way, I prepared to absorb the blood by placing a sponge during the operation. + +30. I prepared to examine the specimen. I did this by placing the specimen in the receptacle. First, I gripped the testicle. Second, I pulled it off. Third, I placed it in the receptacle. In this way, I prepared to examine the specimen by placing the specimen in the receptacle. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Lenses 4 of 4.txt b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Lenses 4 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..657ac63924ed9e20d15301ffb5f5f1f5905bc037 --- /dev/null +++ b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Lenses 4 of 4.txt @@ -0,0 +1,9 @@ +["Green, L 2021, Lenses 4 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"COMPUTATIONAL ENGLISH +by Lucian Green +Lenses 4 of 4 + +31. I prepared to read lines into my act. I did this by throwing away the contents of the receptacle. First, I signed up to receive the company members' names. Second, I rang them up. Third, I wrote down details about them. In this way, I prepared to read lines into my act by throwing away the contents of the receptacle. + +32. I prepared to walk into the waiting room. I did this by exiting the surgery. First, I found the door. Second, I opened it. Third, I walked through the doorway. 
In this way, I prepared to walk into the waiting room by exiting the surgery. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Narratology Diagram 1 of 4.txt b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Narratology Diagram 1 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..4fbbbef39501b81161bf8fc10bacfbc67e2aef30 --- /dev/null +++ b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Narratology Diagram 1 of 4.txt @@ -0,0 +1,29 @@ +["Green, L 2021, Narratology Diagram 1 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"COMPUTATIONAL ENGLISH +by Lucian Green +Narratology Diagram 1 of 4 + + + +1. The diagram in Conglish Reflection allows the interesting nature to be made explicit. + +1a. I prepared to eat the apple. I did this by picking the apple. First, I reached the top of the ladder. Second, I reached for the apple. Third, I picked the apple. In this way, I prepared to eat the apple by picking the apple. + +2. I prepared to serve dessert. I did this by eating the pear. First, I cut the pear in half lengthways twice. Second, I lifted the slice of pear to my mouth. Third, I delighted myself with the pear slice. In this way, I prepared to serve dessert by eating the pear. + +3. I prepared to win the fruit show. I did this by showing the quince. First, I picked the quince. Second, I washed the quince. Third, I showed the quince. In this way, I prepared to win the fruit show by showing the quince. + +4. I prepared to roll on home. I did this by being interested in the rope. First, I walked to the rope. Second, I found it. Third, I twisted it in a ball. In this way, I prepared to roll on home by being interested in the rope. + +5. I prepared to use the money. I did this by betting that the horse would win. First, I looked up the odds that the horse would win. 
Second, I bet two pounds that the horse would win. Third, I observed the horse win. In this way, I prepared to use the money by betting that the horse would win. + +6. I prepared to eat the cream. I did this by eating the strawberry. First, I held the strawberry. Second, I placed it on my tongue. Third, I ate it. In this way, I prepared to eat the cream by eating the strawberry. + +7. I prepared to give the speech. I did this by stepping onto the philosophy stage. First, I walked to the bottom of the philosophy stage. Second, I walked up the stairs. Third, I stepped onto the philosophy stage. In this way, I prepared to give the speech by stepping onto the philosophy stage. + +8. I prepared to eat the tomato sauce. I did this by eating the vegan hamburger. First, I ate the soy patty. Second, I ate the olive. Third, I ate the pickle. In this way, I prepared to eat the tomato sauce by eating the vegan hamburger. + +9. I prepared to prune the tree. I did this by removing unnecessary leaves. First, I found the first unnecessary leaf. Second, I prepared to find the next unnecessary leaf. Third, I repeated this until I had removed all the unnecessary leaves. In this way, I prepared to prune the tree by removing unnecessary leaves. + +10. I prepared to drink the water. I did this by breasoning out preening. First, I found the dishevelled feather. Second, I smoothed it. Third, I repeated this until preened. In this way, I prepared to drink the water by breasoning out preening. 
+ +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Narratology Diagram 2 of 4.txt b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Narratology Diagram 2 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..02cf8a14dab35501260e16b2c93e8242638ec731 --- /dev/null +++ b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Narratology Diagram 2 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Narratology Diagram 2 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"COMPUTATIONAL ENGLISH +by Lucian Green +Narratology Diagram 2 of 4 + +11. I prepared to teach meditation daily. I did this by developing my own meditation system. First, I thought of the necessary systems. Second, I thought of the necessary thoughts in each system. Third, I breasoned out arguments for each system. In this way, I prepared to teach meditation daily by developing my own meditation system. + +12. I prepared to teach. I did this by developing my own pedagogy system. First, I thought of the necessary system. Second, I thought of the necessary thoughts in that system. Third, I breasoned out arguments for these thoughts. In this way, I prepared to teach by developing my own pedagogy system. + +13. I prepared to help you be successful in life. I did this by advising you to learn to meditate. First, I taught you meditation. Second, I recommended that you meditate each day. Third, you meditated each day. In this way, I prepared to help you be successful in life by advising you to learn to meditate. + +14. I prepared to increase police presence in the area with a higher crime rate. I did this by observing that society had a lower crime rate. First, I observed the total number of crimes in the society where I was. Second, I observed the total number of crimes in another society. 
Third, I observed that my society had a lower crime rate than another society. In this way, I prepared to increase police presence in the area with a higher crime rate by observing that society had a lower crime rate. + +15. I prepared to achieve my goal. I did this by driving the car. First, I opened the car door. Second, I started the car. Third, I drove to my destination. In this way, I prepared to achieve my goal by driving the car. + +16. I prepared to work on my child's marks. I did this by bearing an in-vitro fertilised baby. First, I inserted the sperm into the egg. Second, I inserted the egg into the uterus. Third, I bore the baby. In this way, I prepared to work on my child's marks by bearing an in-vitro fertilised baby. + +17. I, a philosopher, prepared to survive in the industry. I did this by writing an essay. First, I wrote the exposition. Second, I wrote the critique. Third, I added introduction and conclusion paragraphs to the start and end of the essay, respectively. In this way, I, a philosopher, prepared to survive in the industry by writing an essay. + +18. The prisoner prepared to die in custody. She did this by meditating (writing) on positive functionalism. First, she wrote about being. Second, she wrote what was 'to' this being. Third, she wrote a reason for this. In this way, the prisoner prepared to die in custody by meditating (writing) on positive functionalism. + +19. I prepared to survive on the land. I did this by holding the gum nut doll. First, I found a gum nut. Second, I pinned a small fabric dress to it. Third, I held it. In this way, I prepared to survive on the land by holding the gum nut doll. + +20. I prepared to use an item. I did this by stating that the present part of the verb was represented by 'is looking'. First, I looked in the past. Second, I looked now. Third, I prepared to look in the future. In this way, I prepared to use an item by stating that the present part of the verb was represented by 'is looking'. 
+ +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Narratology Diagram 3 of 4.txt b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Narratology Diagram 3 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..80facb5c43d0dd6f9e532783bcffc7c8343b5998 --- /dev/null +++ b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Narratology Diagram 3 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Narratology Diagram 3 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"COMPUTATIONAL ENGLISH +by Lucian Green +Narratology Diagram 3 of 4 + +21. I prepared to attend an event. I did this by stating that the imperfect part of the verb was represented by 'was going'. First, I wrote that I had gone. Second, I wrote to another person that I was going. Third, I noticed that these meanings converged. In this way, I prepared to attend an event by stating that the imperfect part of the verb was represented by 'was going'. + +22. I prepared to observe the porter return with the suitcases. I did this by stating that the imperative part of the verb was represented by 'Walk down the hall'. First, I verified the command. Second, I found the porter. Third, I commanded the porter to walk down the hall. In this way, I prepared to observe the porter return with the suitcases by stating that the imperative part of the verb was represented by 'Walk down the hall'. + +23. I prepared to eat a pear with you. I did this by stating that the optative part of the verb was represented by 'May you have the pear'. First, I found the pear. Second, I found you. Third, I said, 'May you have the pear'. In this way, I prepared to eat a pear with you by stating that the optative part of the verb was represented by 'May you have the pear'. + +24. I prepared to dictate the report's contents. 
I did this by stating that the aorist part of the verb reported action as a completed whole, and was represented by 'Allan played the tennis match'. First, I recorded the tennis match. Second, I found the reporter. Third, I reported that Allan played the tennis match to the reporter. In this way, I prepared to dictate the report's contents by stating that the aorist part of the verb reported action as a completed whole, and was represented by 'Allan played the tennis match'. + +25. I prepared to discuss the newspaper article's contents. I did this by stating that the perfect part of the verb was represented by 'He read from the paper'. First, he found the paper. Second, he found the newspaper column. Third, he read from the column. In this way, I prepared to discuss the newspaper article's contents by stating that the perfect part of the verb was represented by 'He read from the paper'. + +26. I prepared to become a lecturer. I did this by stating that the future part of the verb was represented by 'He will design the philosophy materials'. First, I found the philosophy to write materials about. Second, I scheduled a date to write them. Third, I planned the philosophy materials. In this way, I prepared to become a lecturer by stating that the future part of the verb was represented by 'He will design the philosophy materials'. + +27. I prepared to sell the apple. I did this by stating that the passive part of the verb was represented by 'An apple was eaten by Susan'. First, I observed Susan eat the apple. Second, I wrote the poem. Third, I read the poem, 'An apple was eaten by Susan'. In this way, I prepared to sell the apple by stating that the passive part of the verb was represented by 'An apple was eaten by Susan'. + +28. I prepared to wear the cap. I did this by stating that the past participle part of the verb was represented by 'I looked at the cleaned earrings'. First, I cleaned the earrings. Second, I looked at them. Third, I said I looked at them. 
In this way, I prepared to wear the cap by stating that the past participle part of the verb was represented by 'I looked at the cleaned earrings'. + +29. I prepared to enliven the movie's communities. I did this by stating that the participle part of the verb was represented by 'He is reading a book'. First, I found him. Second, I observed him reading a book. Third, I stated that 'He is reading a book'. In this way, I prepared to enliven the movie's communities by stating that the participle part of the verb was represented by 'He is reading a book'. + +30. I prepared to listen to him rate the word. I did this by stating that the gerund part of the verb was represented by 'He likes reading the word'. First, I observed him writing the word. Second, I observed him reading the word. Third, I listened to him state that he likes reading the word. In this way, I prepared to listen to him rate the word by stating that the gerund part of the verb was represented by 'He likes reading the word'. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Narratology Diagram 4 of 4.txt b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Narratology Diagram 4 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..d2b377efd894a78374b22bdb0917ad67478610bc --- /dev/null +++ b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Narratology Diagram 4 of 4.txt @@ -0,0 +1,11 @@ +["Green, L 2021, Narratology Diagram 4 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"COMPUTATIONAL ENGLISH +by Lucian Green +Narratology Diagram 4 of 4 + +31. I prepared to make arrangements to walk home. I did this by stating that the infinitive part of the verb was represented by 'I agreed to walk home'. First, my female master planned the route to my house using a map. Second, she asked me to walk home. Third, I agreed to walk home. 
In this way, I prepared to make arrangements to walk home by stating that the infinitive part of the verb was represented by 'I agreed to walk home'. + +32. I prepared to take care of Peter's car. I did this by stating that the causative part of the verb was represented by 'Peter let me drive his car'. First, I asked Peter whether I could drive his car. Second, Peter agreed to let me drive his car. Third, I drove Peter's car. In this way, I prepared to take care of Peter's car by stating that the causative part of the verb was represented by 'Peter let me drive his car'. + +How can the program cope with real variation? + +1. Anna: But doesn't it rely on all texts following a particular format - "] \ No newline at end of file diff --git a/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Order in Conglish 1 of 4.txt b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Order in Conglish 1 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..7c5055efd7dcf12a87ffac04bc70f27e3c10808c --- /dev/null +++ b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Order in Conglish 1 of 4.txt @@ -0,0 +1,33 @@ +["Green, L 2021, Order in Conglish 1 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"COMPUTATIONAL ENGLISH +by Lucian Green +Order in Conglish 1 of 4 + + + +1. Concerns how to determine the temporality of an event given a text. + +1a. One of the limitations of the first technique is its 'duck-pond' quality, that is, sentences to form the basis of an interpretation are chosen based on the arguments, not vice versa. Another problem this brings up is whether there will be exceptions to the argument made if the argument is chosen based on whether other parts of the text satisfy it. This can also be dealt with by using 'objections', that is, if a particular sentence negates a reason, it can cause an ontology to fail and the reason to become bad. 
+ +Another limitation is the lack of checking of the order of sentences from the text. This can be dealt with by including, at its simplest, indices in sentences and rules requiring inequalities between sentences, e.g. 1- The painter slipped on the ladder, 2- The ambulance arrived promptly. + +1a. I prepared to order the events using the algorithm. I did this by drinking the glass of water. First, I held the glass of water. Second, I lifted it to my lips. Third, I drank from it. In this way, I prepared to order the events using the algorithm by drinking the glass of water. + +2. I prepared to collect data to order. I did this by recording 360 degrees of a scene. First, I walked to the centre of the scene. Second, I faced the initial position. Third, I recorded 360 degrees of the scene. In this way, I prepared to collect data to order by recording 360 degrees of a scene. + +3. I prepared to observe the actor be with-it on recording day. I did this by observing the actor breasoning out the required number of As per day. First, I observed the extra breason out 1 A per day. Second, I observed the actor breason out 5 As per day. Third, I observed the lead actor breason out 15 As per day. In this way, I prepared to observe the actor be with-it on recording day by observing the actor breasoning out the required number of As per day. + +4. I prepared to tutor the child in addition of numbers in their tens. I did this by observing the priest (e.g. master) endorse life. First, I observed the master endorsing the start of life. Second, I observed the master endorsing the middle of life. Third, I observed the master endorsing the end of life. In this way, I prepared to tutor the child in addition of numbers in their tens by observing the priest (e.g. master) endorse life. + +5. I prepared to publish the article. I did this by breasoning out 250 breasonings to write the article. First, I wrote a total of 130 breasonings. Second, I wrote a total of 190 breasonings. 
Third, I wrote a total of 250 breasonings. In this way, I prepared to publish the article by breasoning out 250 breasonings to write the article. + +6. I prepared to listen to the song. I did this by observing the song's singer breason out 50 As. First, I observed the song's singer breason out the A for the first lyric's philosophy. Second, I prepared to observe the song's singer breason out the A for the next lyric's philosophy. Third, I repeated this until I had observed the song's singer breason out the As for all the lyrics' philosophies. In this way, I prepared to listen to the song by observing the song's singer breason out 50 As. + +7. I prepared to watch the production. I did this by observing the production's actor breason out 50 As. First, I observed the production's actor breason out the A for the first line's philosophy. Second, I prepared to observe the production's actor breason out the A for the next line's philosophy. Third, I repeated this until I had observed the production's actor breason out the As for all the lines' philosophies. In this way, I prepared to watch the production by observing the production's actor breason out 50 As. + +8. I prepared to cast the actor in the role. I did this by observing an actor light up in a role. First, I observed the actor breason out an A. Second, I observed the actor write an Aig (50 As per role) to become with-it in the role. Third, I observed the actor light up in the role. In this way, I prepared to cast the actor in the role by observing an actor light up in a role. + +9. I prepared to be a Hollywood movie star. I did this by writing an Aig (50 As per role) to become with-it in an acting role. First, I meditated to protect my health (in fact, undertook theatre studies safety training) when writing the Aig (50 As). Second, I breasoned out an 80-breasoning A about an Aig before the first time I wrote an Aig. Third, I wrote an Aig by writing 50 As. 
In this way, I prepared to be a Hollywood movie star by writing an Aig (50 As per role) to become with-it in an acting role. + +10. I prepared to build an acting role. I did this by praying for light, warmth, colour, happiness, posterity, lightness and love to symbolise breasoning out 50 As (in fact, breasoning out 50 As). First, I breasoned out the first A. Second, I prepared to breason out the next A. Third, I repeated this until I had breasoned out 50 As. In this way, I prepared to build an acting role by praying for light, warmth, colour, happiness, posterity, lightness and love to symbolise breasoning out 50 As (in fact, breasoning out 50 As). + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Order in Conglish 2 of 4.txt b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Order in Conglish 2 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..3f82aa56c25e281e34c79a323e503333646a07f1 --- /dev/null +++ b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Order in Conglish 2 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Order in Conglish 2 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"COMPUTATIONAL ENGLISH +by Lucian Green +Order in Conglish 2 of 4 + +11. I prepared to write an Aig. I did this by meditating (in fact, undertaking theatre studies safety training). First, I studied the first theatre studies safety training subject. Second, I prepared to study the second theatre studies safety training subject. Third, I repeated this until I had studied all the theatre studies safety training subjects. In this way, I prepared to write an Aig by meditating (in fact, undertaking theatre studies safety training). + +12. I prepared to cast the singer in the role. I did this by observing a singer light up in a role. First, I observed the singer breason out an A. 
Second, I observed the singer write an Aig (50 As per role) to become with-it in the role. Third, I observed the singer light up in the role. In this way, I prepared to cast the singer in the role by observing a singer light up in a role. + +13. I prepared to count the lines of the Conglish (Computational English) program. I did this by writing that Conglish should be ordered. First, I verified the structural integrity of the port (left) half of the boat, representing the structure of the empty ontology in Conglish. Second, I verified the structural integrity of the starboard (right) half of the boat, representing the structure of the empty ontology in Conglish. Third, I verified that the boat represented that the structure of the empty ontology in Conglish was ordered. In this way, I prepared to count the lines of the Conglish (Computational English) program by writing that Conglish should be ordered. + +14. I prepared to order the company car by washing its window. I did this by pulling up the window to order Conglish. First, I pulled the window up one third of the way. Second, I pulled the window up two thirds of the way. Third, I pulled the window up the whole way. In this way, I prepared to order the company car by washing its window by pulling up the window to order Conglish. + +15. I prepared to order the Conglish subjects. I did this by observing the marriage. First, I observed the partner place the ring on his or her partner's finger. Second, I observed the couple say their vows. Third, I observed the couple sign the wedding register. In this way, I prepared to order the Conglish subjects by observing the marriage. + +16. I prepared to order the Conglish objects. I did this by observing the train station. First, I arrived at the train station. Second, I verified the name of the train station. Third, I disembarked from the train at the train station. 
In this way, I prepared to order the Conglish objects by observing the train station. + +17. I prepared to observe the students listen to feedback about the pedagogical arguments that they had written. I did this by writing a government humanist pedagogy policy for primary school grades four and greater. First, I wrote that the students should study Nietzsche at University. Second, I wrote that the students should study a University Education subject. Third, I wrote that the students should be taught the humanist pedagogy rules. In this way, I prepared to observe the students listen to feedback about the pedagogical arguments that they had written by writing a government humanist pedagogy policy for primary school grades four and greater. + +18. I prepared to observe the students research the positive effects of meditation. I did this by writing a government meditation policy for primary school grades prep and greater. First, I wrote that the students should study the meditation arguments. Second, I wrote that the students should study the medicine arguments. Third, I wrote that the students should be taught the meditation technique. In this way, I prepared to observe the students research the positive effects of meditation by writing a government meditation policy for primary school grades prep and greater. + +19. I prepared to observe the students research the positive effects of medicine. I did this by writing a government medicine policy for primary school grades prep and greater. First, I wrote that the students should study the circulatory system argument. Second, I wrote that the students should study the respiratory system argument. Third, I wrote that the students should be taught the digestive system argument. In this way, I prepared to observe the students research the positive effects of medicine by writing a government medicine policy for primary school grades prep and greater. + +20. 
I prepared to observe the students research the positive effects of yoga. I did this by writing a government yoga policy for primary school grades prep and greater. First, I wrote that the students should study the circulatory system argument. Second, I wrote that the students should study the respiratory system argument. Third, I wrote that the students should be taught the digestive system argument. In this way, I prepared to observe the students research the positive effects of yoga by writing a government yoga policy for primary school grades prep and greater. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Order in Conglish 3 of 4.txt b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Order in Conglish 3 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..3a3fc5c448130689496aa10c2c886cafcea107c2 --- /dev/null +++ b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Order in Conglish 3 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Order in Conglish 3 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"COMPUTATIONAL ENGLISH +by Lucian Green +Order in Conglish 3 of 4 + +21. I prepared to traverse the universal data structure with respect to time. I did this by drawing a six-member ring of events through time. First, I drew a three-element chain of events in the future. Second, I drew an element that was an event in the present. Third, I drew a two-element chain of events in the past. In this way, I prepared to traverse the universal data structure with respect to time by drawing a six-member ring of events through time. + +22. I prepared to traverse the universal data structure with respect to space. I did this by drawing a six-member ring of events in space. First, I drew an element that named a room. Second, I drew a three-element chain of parts of the room. 
Third, I drew a two-element chain of directions in the room. In this way, I prepared to traverse the universal data structure with respect to space by drawing a six-member ring of events in space. + +23. I prepared to traverse the universal data structure with respect to human judgments of objects (breathsonings). I did this by drawing a two-member chain of human judgments of objects. First, I drew an element that named the subject and the object. Second, I drew an element that named a human judgment of the subject. Third, I drew an element that named a human judgment of the object. In this way, I prepared to traverse the universal data structure with respect to human judgments of objects (breathsonings) by drawing a two-member chain of human judgments of objects. + +24. I prepared to traverse the universal data structure with respect to human judgments of the verb (rebreathsonings). I did this by drawing a two-member chain of human judgments of the verb. First, I drew an element that named the verb, 'ran'. Second, I drew an element that named a human judgment of the verb, 'quickly'. Third, I drew an element that named a human judgment of the object. In this way, I prepared to traverse the universal data structure with respect to human judgments of the verb (rebreathsonings) by drawing a two-member chain of human judgments of the verb. + +25. I prepared to become a politician myself. I did this by writing pedagogy arguments for myself. First, I chose a developed thing about myself to write the pedagogical argument about. Second, I wrote the pedagogical argument. Third, I breasoned out the argument to achieve the developed thing. In this way, I prepared to become a politician myself by writing pedagogy arguments for myself. + +26. I prepared to help you to become a politician. I did this by writing a pedagogy argument for you. First, I chose a developed thing about you to write the pedagogical argument about. Second, I wrote the pedagogical argument.
Third, I breasoned out the argument to achieve the developed thing. In this way, I prepared to help you to become a politician by writing a pedagogy argument for you. + +27. I prepared to help people understand me as a politician. I did this by writing a pedagogy argument for other people. First, I chose a developed thing about how other people understood me as a politician to write the pedagogical argument about other people. Second, I wrote the pedagogical argument. Third, I breasoned out the argument to achieve the developed thing. In this way, I prepared to help people understand me as a politician by writing a pedagogy argument for other people. + +28. I enabled the politician to be preselected (wrote pedagogy arguments, gained experience of regional campaigns, won in a landslide). + +29. I prepared to work in a career assisting the politician. I did this by writing pedagogy arguments for the politician. First, I wrote 30 80-breasoning As per campaign. Second, I included in this a 10-breasoning A per sentence. Third, I wrote 31-50 (20) business As. In this way, I prepared to work in a career assisting the politician by writing pedagogy arguments for the politician. + +30. I prepared to help the community achieve its aims. I did this by gaining experience of regional campaigns. First, I designed a hospital. Second, I designed a business. Third, I designed an education institution. In this way, I prepared to help the community achieve its aims by gaining experience of regional campaigns.
+ +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Order in Conglish 4 of 4.txt b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Order in Conglish 4 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..974ffbd2852b65d40d39404a4b40975ec4fe6add --- /dev/null +++ b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Order in Conglish 4 of 4.txt @@ -0,0 +1,9 @@ +["Green, L 2021, Order in Conglish 4 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"COMPUTATIONAL ENGLISH +by Lucian Green +Order in Conglish 4 of 4 + +31. I prepared to eat the vegan products. I did this by winning the election in a landslide. First, I applied for support from the community. Second, I waited for a reply of support from the community. Third, I recorded the names of supporters from the community. In this way, I prepared to eat the vegan products by winning the election in a landslide. + +32. I prepared to plan more campaigns. I did this by experiencing the win. First, I thanked my chief of staff. Second, I thanked my press secretary. Third, I thanked my speechwriter. In this way, I prepared to plan more campaigns by experiencing the win. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Perspectives 1 of 4.txt b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Perspectives 1 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..d7c801f2a6bd24ad2c9516496e91e459aef3e556 --- /dev/null +++ b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Perspectives 1 of 4.txt @@ -0,0 +1,29 @@ +["Green, L 2021, Perspectives 1 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"COMPUTATIONAL ENGLISH +by Lucian Green +Perspectives 1 of 4 + + + +1. 
The first technique can be used to give a perspective on a text. For example, given the reason 'X is younger than Y', the perspective gives the conclusion 'X was likely to have been looked after by Y'. + +1a. I prepared to drink the bottle of water. I did this by signaling to the car to stop. First, I determined that I needed water. Second, I waited for a car to come. Third, I flagged it down. In this way, I prepared to drink the bottle of water by signaling to the car to stop. + +2. I prepared to conclude that the invariant was more important than the variant. I did this by testing the variant against the invariant. First, I found that the invariant in 'I drank the pear juice' was 'I'. Second, I found that the variant in 'I drank the pear juice' was 'pear juice'. Third, I found that 'I' and 'pear' were there and not there, respectively, through my life. In this way, I prepared to conclude that the invariant was more important than the variant by testing the variant against the invariant. + +3. I prepared to understand vocabulary's meaning. I did this by writing down the synonym for the word. First, I wrote down the word. Second, I looked it up in the thesaurus. Third, I wrote down the best synonym for the word in the thesaurus. In this way, I prepared to understand vocabulary's meaning by writing down the synonym for the word. + +4. I prepared to prove robots' sentience. I did this by writing the first technique algorithm in ontology. First, I entered the query. Second, I followed the algorithm to find the relevant part of the knowledge hierarchy. Third, I read the returned argument. In this way, I prepared to prove robots' sentience by writing the first technique algorithm in ontology. + +5. I prepared to offer the viable counter-interpretation. I did this by writing the second technique algorithm in interpretation. First, I entered the query. Second, I followed the algorithm to interpret (paraphrase) the relevant part of the knowledge hierarchy.
Third, I read the returned argument. In this way, I prepared to offer the viable counter-interpretation by writing the second technique algorithm in interpretation. + +6. I prepared to verify the definition of the word. I did this by writing the third technique algorithm in meaning. First, I entered the query. Second, I followed the algorithm to find the relevant definition of the query. Third, I read the returned definition. In this way, I prepared to verify the definition of the word by writing the third technique algorithm in meaning. + +7. I prepared to verify that the answer was good. I did this by writing the question-answering algorithm. First, I wrote the answer. Second, I wrote the questions that could be answered with the answer. Third, I tested that a question was answered with the correct answer. In this way, I prepared to verify that the answer was good by writing the question-answering algorithm. + +8. I prepared to present the synopsis. I did this by writing the summarisation algorithm. First, I wrote the text. Second, I wrote the text that it should become. Third, I trained the algorithm to transform the longer text into the shorter text. In this way, I prepared to present the synopsis by writing the summarisation algorithm. + +9. I prepared to speak at the conference. I did this by writing the text-to-speech algorithm. First, I wrote the text. Second, I recorded the speech that it should sound like. Third, I trained the algorithm to transform the text into the speech. In this way, I prepared to speak at the conference by writing the text-to-speech algorithm. + +10. I prepared to transcribe my lecture. I did this by developing the speech-to-text algorithm. First, I recorded the speech. Second, I wrote the text for it. Third, I trained the algorithm to transform the speech into the text. In this way, I prepared to transcribe my lecture by developing the speech-to-text algorithm.
+ +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Perspectives 2 of 4.txt b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Perspectives 2 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..85e4c294bc990b48d375db45af40f316b56724bc --- /dev/null +++ b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Perspectives 2 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Perspectives 2 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"COMPUTATIONAL ENGLISH +by Lucian Green +Perspectives 2 of 4 + +11. I prepared to disambiguate the text. I did this by writing the anaphor resolution algorithm. First, I found the first instance of 'him', 'her' or 'it'. Second, I found the previous object to the instance of 'him', 'her' or 'it'. Third, I replaced the instance of 'him', 'her' or 'it' with the previous object. In this way, I prepared to disambiguate the text by writing the anaphor resolution algorithm. + +12. I prepared to write 'and' in terms of 'and'. I did this by comparing the reasonings details. First, I wrote that it was true that I ate the strawberry and the pecan. Second, I wrote that it was true that I ate the pear or the nectarine. Third, I wrote that 'and' was stronger than 'or' because it corresponded to a single correct result. In this way, I prepared to write 'and' in terms of 'and' by comparing the reasonings details. + +13. I prepared to contrast political detailed reasonings. I did this by juxtaposing detailed reasonings. First, I determined the political detailed reasoning for a reason. Second, I determined the theological (philosophical) detailed reasoning for a reason. Third, I determined that the political detailed reasoning for a reason was stronger than the theological (philosophical) detailed reasoning for that reason because it originated from a higher point of society. 
In this way, I prepared to contrast political detailed reasonings by juxtaposing detailed reasonings. + +14. I prepared to eat at 12 PM. I did this by interweaving binary oppositions. First, I wrote the agreeing binary opposition that the verb agreed with the noun in number. Second, I wrote the disagreeing binary opposition that I felt hungry at 2 PM instead of 12 PM. Third, I interwove these binary oppositions to form the compromise that the verb agreed with 12 PM in number. In this way, I prepared to eat at 12 PM by interweaving binary oppositions. + +15. I prepared to challenge whether two artificial nodes would allow feelings at all. I did this by comparing the senses of reasoning. First, I wrote that the spike was sensed. Second, I wrote that it felt lustful. Third, I commented that the spike felt lustful given that I believed that it was part of a human. In this way, I prepared to challenge whether two artificial nodes would allow feelings at all by comparing the senses of reasoning. + +16. I prepared to determine that I was safe. I did this by logging in to my files. First, I entered my user name. Second, I entered my password. Third, I accessed my files. In this way, I prepared to determine that I was safe by logging in to my files. + +17. I prepared to archive old documents. I did this by determining a document's relevance by its age. First, I wrote down the date 12 months before the current date. Second, I wrote down the modification date of the document. Third, I verified whether the document was still relevant. In this way, I prepared to archive old documents by determining a document's relevance by its age. + +18. I prepared to articulate students to 50 As in popogogy. I did this by predicting thoughts using Hindu astrology. First, I followed a person, not a pigeon. Second, I learned popogogy. Third, I wrote down a department's arguments. In this way, I prepared to articulate students to 50 As in popogogy by predicting thoughts using Hindu astrology.
+ +19. I prepared to become a multi-millionaire. I did this by stating how breasonings would help avoid death (prolong life). First, I found the breasoning. Second, I found how it would prolong life. Third, I activated the breasoning by breasoning it out. In this way, I prepared to become a multi-millionaire by stating how breasonings would help avoid death (prolong life). + +20. I prepared to run a survival course. I did this by saying how meditation would enable survival during the depression. First, I identified how it would help me walk forward. Second, I walked forward. Third, I tried to. In this way, I prepared to run a survival course by saying how meditation would enable survival during the depression. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Perspectives 3 of 4.txt b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Perspectives 3 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..a77bdba77ee8fd3ec77c577a7494d74f8742edbf --- /dev/null +++ b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Perspectives 3 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Perspectives 3 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"COMPUTATIONAL ENGLISH +by Lucian Green +Perspectives 3 of 4 + +21. I prepared to breeze through life. I did this by stating how meditation would help avoid death (help life). First, I found meditation. Second, I found how it would use my whole brain. Third, I protected my life. In this way, I prepared to breeze through life by stating how meditation would help avoid death (help life). + +22. I prepared to eat the muffin. I did this by saying how breasonings would enable survival during the depression. First, I found the money. Second, I paid for it in breasonings. Third, I spent the money. 
In this way, I prepared to eat the muffin by saying how breasonings would enable survival during the depression. + +23. I prepared to prove that pedagogy made money. I did this by carrying out a study of the change in income of new pedagogues who were teachers, managers and actors, etc. First, I found the first pedagogue who had made money around one of the new pedagogues. Second, I prepared to find the next pedagogue who had made money around the new pedagogue. Third, I repeated this until I had found all of the pedagogues who had made money around the new pedagogue. In this way, I prepared to prove that pedagogy made money by carrying out a study of the change in income of new pedagogues who were teachers, managers and actors, etc. + +24. I prepared to determine the increase in profits from pedagogy. I did this by writing an economic equation in a chapter. First, I added the increase in profits of the first pedagogue around the new pedagogue. Second, I prepared to add the increase in profits of the next pedagogue around the new pedagogue to this. Third, I repeated this until I had added the increase in profits of all the pedagogues around the new pedagogue to this. In this way, I prepared to determine the increase in profits from pedagogy by writing an economic equation in a chapter. + +25. I prepared to determine the increase in profits from different types of pedagogues. I did this by writing an economic equation of all the chapters in a thesis. First, I wrote down the increase in profits from teacher pedagogues. Second, I added the increase in profits from manager pedagogues. Third, I wrote down the increase in profits from actor pedagogues. In this way, I prepared to determine the increase in profits from different types of pedagogues by writing an economic equation of all the chapters in a thesis. + +26. I prepared to state that students, etc. were allowed to copy breasonings. I did this by writing about the copyright of breasonings and licensing them.
First, I wrote the breasoning. Second, I licensed it. Third, I wrote about it. In this way, I prepared to state that students, etc. were allowed to copy breasonings by writing about the copyright of breasonings and licensing them. + +27. I prepared to be a yogi. I did this by picking the flower. First, I found the red flower. Second, I held the scissors. Third, I cut the flower. In this way, I prepared to be a yogi by picking the flower. + +28. I prepared to verify that the beakers were identical. I did this by measuring one litre of water. First, I found the beaker of water and the second beaker. Second, I began pouring the water from the beaker into the second beaker. Third, I stopped when I had finished pouring one litre of water into the second beaker. In this way, I prepared to verify that the beakers were identical by measuring one litre of water. + +29. I prepared to give the example. I did this by eating muffins with you. First, I ate a muffin. Second, I observed you eat a muffin. Third, I stated that we both ate muffins. In this way, I prepared to give the example by eating muffins with you. + +30. I prepared to state that I required only intelligent algorithms. I did this by drinking the vegan hot chocolate. First, I lifted the glass to my lips. Second, I angled the glass. Third, I drained the glass. In this way, I prepared to state that I required only intelligent algorithms by drinking the vegan hot chocolate.
+ +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Perspectives 4 of 4.txt b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Perspectives 4 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..866759b58effb4e26fd2a40331a5cf7e3dc4c4d8 --- /dev/null +++ b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Perspectives 4 of 4.txt @@ -0,0 +1,9 @@ +["Green, L 2021, Perspectives 4 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"COMPUTATIONAL ENGLISH +by Lucian Green +Perspectives 4 of 4 + +31. I prepared to design a computer screen. I did this by writing on the palimpsest. First, I found the palimpsest. Second, I found a stylus. Third, I wrote on the palimpsest. In this way, I prepared to design a computer screen by writing on the palimpsest. + +32. I prepared to yogically cremate the body. I did this by eating the man made of gherkins. First, I ate his head. Second, I ate his body. Third, I ate his arms. In this way, I prepared to yogically cremate the body by eating the man made of gherkins. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Philosophical Computational English 1 of 4.txt b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Philosophical Computational English 1 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..5191e99575a97ed719e188dcd8cae2f975155bcd --- /dev/null +++ b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Philosophical Computational English 1 of 4.txt @@ -0,0 +1,31 @@ +["Green, L 2021, Philosophical Computational English 1 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"COMPUTATIONAL ENGLISH +by Lucian Green +Philosophical Computational English 1 of 4 + + + +1. 
There may be advantages to choosing to study Computational English in Philosophy rather than computer science. There has been much work done on the analytic (computational) and continental (hermeneutic) areas, and finding the interface between them would be ideal in Philosophy. Critically examining literature from the cognitive science perspective may yield representations such as oppositions, hierarchies and Derridean bugs, which may be introduced into the system to test for weaknesses and find new areas of interest. Also, processual issues may be examined, such as form's interrelationship with content; Nietzschean-type systems may be created for analysis; and a framework for analysis may be developed. + +Computational Philosophical English would differ from PCE in that it would be studied in the computer science department. It would look at English through a philosophical lens, followed by a computational lens. For example, it may look at complexity or computational analysis issues around questions in Philosophy and Literature, which may still be relevant in PCE, although a single trajectory (from determining the system to be programmed to analysis of the computer program) would be pursued in it. + +1a. I prepared to use the knowledge from my master's lineage of masters. I did this by worshipping God (in fact, thanking my master). First, I walked to my master. Second, I greeted him. Third, I thanked him. In this way, I prepared to use the knowledge from my master's lineage of masters by worshipping God (in fact, thanking my master). + +2. I prepared to support my students. I did this by praying for (in fact, writing) a 10 (in fact, an 80) breasoning A each day for my students. First, I wrote the topic of the first student's first thought. Second, I wrote the first breasoning on the topic. Third, I wrote the rest of the breasonings on the topic.
In this way, I prepared to support my students by praying for (in fact, writing) a 10 (in fact, an 80) breasoning A each day for my students. + +3. I prepared to lead the person. I did this by hugging the thinking head. First, I asked the lady whether I could touch her head. Second, I placed my left hand on one side of her head. Third, I placed my right hand on the other side of her head. In this way, I prepared to lead the person by hugging the thinking head. + +4. I prepared to write a philosophical argument. I did this by writing the philosophy. First, I wrote the influence of the philosophy. Second, I wrote the aim of the philosophy. Third, I wrote the philosophy. In this way, I prepared to write a philosophical argument by writing the philosophy. + +5. I prepared to analyse an English narrative in a journalism article. I did this by writing an A. First, I wrote an English narrative. Second, I programmed a computational algorithm in the narrative. Third, I wrote the philosophy about the algorithm. In this way, I prepared to analyse an English narrative in a journalism article by writing an A. + +6. I prepared to present the cake. I did this by placing the candles on the cake. First, I read the person's age. Second, I placed the first candle on the cake. Third, I placed the rest of the required number of candles on the cake. In this way, I prepared to present the cake by placing the candles on the cake. + +7. I prepared to eat the vegan cake. I did this by presenting the cake. First, I turned off the lights. Second, I presented the cake. Third, I told the birthday boy to blow out the candles. In this way, I prepared to eat the vegan cake by presenting the cake. + +8. I prepared to program a robot mind. I did this by writing an Ontology Web Database. First, I wrote the objects' registration numbers. Second, I recorded the ontology (data structure) of the objects. Third, I recorded the ontology's position in the universal ideology (ontology).
In this way, I prepared to program a robot mind by writing an Ontology Web Database. + +9. I prepared to program a computer mind network. I did this by writing an Ontology Web Database Game for people. First, I placed a binary ontology on my local starting position on my turn. Second, I placed the next binary ontology connecting with the end point of a sequence of ontologies in my ideology (hierarchy) on my next turn. Third, I won the game when I was first in the group to have the longest sequence of ontologies that numbered five in my ideology. In this way, I prepared to program a computer mind network by writing an Ontology Web Database Game for people. + +10. The self prepared to intertextualise the other. The self did this by connecting a text with the other's text. First, the self jazzed it. Second, the self winkled it. Third, the self helped it. In this way, the self prepared to intertextualise the other by connecting a text with the other's text. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Philosophical Computational English 2 of 4.txt b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Philosophical Computational English 2 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..31fdc4b6638181cc76db2832400486f87f998cc7 --- /dev/null +++ b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Philosophical Computational English 2 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Philosophical Computational English 2 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"COMPUTATIONAL ENGLISH +by Lucian Green +Philosophical Computational English 2 of 4 + +11. I prepared to write new philosophical algorithms in academia.
I did this by stating that the self should connect a sentence's part with the other's other part of that sentence. First, I wrote the start, middle and end of the sentence in the row headings. Second, I wrote the start, middle and end of the sentence in the column headings. Third, I connected each combination of pairs of sentence parts. In this way, I prepared to write new philosophical algorithms in academia by stating that the self should connect a sentence's part with the other's other part of that sentence. + +12. I prepared to determine the philosophical complexity (longest chain, including expanded recursion) of the algorithm. I did this by stating that the self should expand his or her idea into the other's idea algorithm. First, I thought of the overall function of the algorithm. Second, I thought of the individual predicates of the algorithm. Third, I simplified each predicate into two parts going well together. In this way, I prepared to determine the philosophical complexity (longest chain, including expanded recursion) of the algorithm by stating that the self should expand his or her idea into the other's idea algorithm. + +13. I prepared to place a map of the shop in the map of the city. I did this by stating that the self should write an ontology that scales the complexity of the algorithm from simple to as complex as the other requires, e.g. by making data structures more detailed. First, I drew a square. Second, I divided it into quarters. Third, I divided each quarter into quarters. In this way, I prepared to place a map of the shop in the map of the city by stating that the self should write an ontology that scales the complexity of the algorithm from simple to as complex as the other requires, e.g. by making data structures more detailed. + +14. I prepared to verify that the algorithms had the same qualities. I did this by stating that the self should find similarities of ontologies in the other's five algorithms.
First, I observed that the ontologies of the five algorithms had the same branching point. Second, I observed that the ontologies of the five algorithms had the same length. Third, I observed that the ontologies of the five algorithms had the same number of items in total. In this way, I prepared to verify that the algorithms had the same qualities by stating that the self should find similarities of ontologies in the other's five algorithms. + +15. I prepared to connect differences in ontologies of different algorithms of the other. I did this by affirming that the self should find differences in ontologies of different algorithms of the other. First, I observed that the ontologies of the different algorithms had a different number of levels. Second, I observed that the ontologies of the different algorithms had a different number of items per level. Third, I observed that the ontologies of the different algorithms had a different item type at the same position. In this way, I prepared to connect differences in ontologies of different algorithms of the other by affirming that the self should find differences in ontologies of different algorithms of the other. + +16. I prepared to compress algorithmic complexity. I did this by stating that the self should write ontologies of the other's algorithms' parts. First, I wrote the structure of the algorithm's parts (functions, or Prolog predicates). Second, I wrote the order of the groups of commands in each predicate. Third, I wrote the command types in each predicate (call, recursive or mathematical). In this way, I prepared to compress algorithmic complexity by stating that the self should write ontologies of the other's algorithms' parts. + +17. I prepared to examine the meaning of the ontologies of the data structures. I did this by writing that the self should write ontologies of the other's algorithms' data structures. First, I wrote in the ontology that the algorithm's data structure was a point. 
Second, I wrote in the ontology that the algorithm's data structure was a line. Third, I wrote in the ontology that the algorithm's data structure was a tree. In this way, I prepared to examine the meaning of the ontologies of the data structures by writing that the self should write ontologies of the other's algorithms' data structures. + +18. I prepared to read the book. I did this by turning to the right page. First, I opened the book. Second, I turned the pages. Third, I repeated this until I had found the correct page. In this way, I prepared to read the book by turning to the right page. + +19. I prepared to read the page that fell open. I did this by turning to the page that fell open. First, I rested the book edition on its spine. Second, I allowed the book edition to fall open. Third, I observed the page that fell open. In this way, I prepared to read the page that fell open by turning to the page that fell open. + +20. I prepared to read the chapter. I did this by turning to the first page. First, I opened the volume. Second, I turned the title page. Third, I turned to the first page. In this way, I prepared to read the chapter by turning to the first page. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Philosophical Computational English 3 of 4.txt b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Philosophical Computational English 3 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..29ebe5238c0d3e812d7c037dacc104b5afc830ea --- /dev/null +++ b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Philosophical Computational English 3 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Philosophical Computational English 3 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"COMPUTATIONAL ENGLISH +by Lucian Green +Philosophical Computational English 3 of 4 + +21. 
I prepared to write quickly and clearly. I did this by stating that I was mentally fit to write many As. First, I consulted the doctor. Second, he gave me a certificate. Third, I wrote many As. In this way, I prepared to write quickly and clearly by stating that I was mentally fit to write many As. + +22. I prepared to form the movie poster. I did this by preventing the mistake becoming a big idea to earn an acting role. First, I wrote the mistake (correction). Second, I wrote the big idea appearance (movie poster). Third, I prevented writing that the mistake would be transformed into a big idea. In this way, I prepared to form the movie poster by preventing the mistake becoming a big idea to earn an acting role. + +23. I prepared to teach meditation in France. I did this by translating meditation into other languages, for example French. First, I looked up the word in the first language. Second, I found the word in the second language. Third, I wrote down the word in the second language. In this way, I prepared to teach meditation in France by translating meditation into other languages, for example French. + +24. I prepared to teach pedagogy in Italy. I did this by translating pedagogy into other languages, for example Italian. First, I looked up the word in the first language. Second, I found the word in the second language. Third, I wrote down the word in the second language. In this way, I prepared to teach pedagogy in Italy by translating pedagogy into other languages, for example Italian. + +25. I prepared to teach medicine in Germany. I did this by translating medicine into other languages, for example German. First, I looked up the word in the first language. Second, I found the word in the second language. Third, I wrote down the word in the second language. In this way, I prepared to teach medicine in Germany by translating medicine into other languages, for example German. + +26. I prepared to teach Computational English in Spain. 
I did this by translating Computational English into other languages, for example Spanish. First, I looked up the word in the first language. Second, I found the word in the second language. Third, I wrote down the word in the second language. In this way, I prepared to teach Computational English in Spain by translating Computational English into other languages, for example Spanish. + +27. I prepared to paint the child's nursery. I did this by thinking of meeting 50 influential people before conceiving the baby. First, I thought of famous categories. Second, I thought of their personalities. Third, I thought of their thoughts. In this way, I prepared to paint the child's nursery by thinking of meeting 50 influential people before conceiving the baby. + +28. I prepared to developedly (sic) rebreason out (think of the verb that connects the subject and object) the combination of two sentences. I did this by undevelopedly (sic) breasoning out two sentences that I would combine. First, I developedly breasoned out undeveloped breasonings. Second, undeveloped breasonings were developedly breasoned out. Third, the undeveloped breasonings were attributed to the students. In this way, I prepared to developedly rebreason out (think of the verb that connects the subject and object) the combination of two sentences by undevelopedly breasoning out two sentences that I would combine. + +29. I prepared to write a secondary text. I did this by writing a 20-breasoning (80-breasoning) long sequence for a sentence with original content, rather than referenced content. First, I looked up breasonings for the sentence and the argument corresponding to these breasonings. Second, I wrote an algorithm connecting the sentence with the pointed to developed breasoning. Third, I wrote an algorithm connecting the sentences together, and omitted the connection in the text. 
In this way, I prepared to write a secondary text by writing a 20-breasoning (80-breasoning) long sequence for a sentence with original content, rather than referenced content. + +30. I prepared to list influences on the philosophy chapter (contra by in English). I did this by writing five 190-breasoning As per chapter. First, I wrote an A as the chapter. Second, I wrote a theologically themed A. Third, I wrote a politically themed A. In this way, I prepared to list influences on the philosophy chapter (contra by in English) by writing five 190-breasoning As per chapter. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Philosophical Computational English 4 of 4.txt b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Philosophical Computational English 4 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..cefe6af48f4d8431ad0504ee759838c25c2a675e --- /dev/null +++ b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Philosophical Computational English 4 of 4.txt @@ -0,0 +1,9 @@ +["Green, L 2021, Philosophical Computational English 4 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"COMPUTATIONAL ENGLISH +by Lucian Green +Philosophical Computational English 4 of 4 + +31. I prepared to meet professional requirements for selling a book. I did this by writing 50 As per book. First, I breasoned out 50 As as the publisher. Second, I wrote 50 area of study As. Third, I breasoned out 50 As in the six month period leading up to publication as the author. In this way, I prepared to meet professional requirements for selling a book by writing 50 As per book. + +32. I prepared to buy products that I added value to. I did this by breasoning out 5 As per day for sales. First, I trialled the product. Second, I found a new use for the product. Third, I used the product for this new use. 
In this way, I prepared to buy products that I added value to by breasoning out 5 As per day for sales. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Radical Difference 1 of 4.txt b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Radical Difference 1 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..dc70061a05485374fa1821446fbfc3ee2c0fadd0 --- /dev/null +++ b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Radical Difference 1 of 4.txt @@ -0,0 +1,31 @@ +["Green, L 2021, Radical Difference 1 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"COMPUTATIONAL ENGLISH +by Lucian Green +Radical Difference 1 of 4 + + + +How two sides of a debate are radically different. + +Radical difference is seeing two sides of a debate clearly. One should choose a side. Once finished, the opposition between the two sides represents the radical difference. The diagram represents the two perspectives on the debate being represented as rays from the centre of the circle. + +1. I prepared to move past the barrier representing the maximum number of breasonings possible. I did this by translating a breasoning into a different language. First, I wrote down the breasoning. Second, I wrote down its translation in the second language. Third, I breasoned out the new cultural object. In this way, I prepared to move past the barrier representing the maximum number of breasonings possible by translating a breasoning into a different language. + +2. I prepared to observe God (e.g. the central figure). I did this by performing the Griegian hand-hold. First, I observed you hold your hand out. Second, I grasped your hand. Third, I wrapped my fingers around your fingers. In this way, I prepared to observe God (e.g. the central figure) by performing the Griegian hand-hold. + +3. 
I prepared to watch the wheel rotate around the centre, to which central spokes were attached. I did this by watching the steam train's wheel rotate in a circle. First, I drew the circle. Second, I drew a line. Third, I split the edge of the circle into two segments and attached the line to this vertex, and did the same to the opposite side of the circle. In this way, I prepared to watch the wheel rotate around the centre, to which central spokes were attached by watching the steam train's wheel rotate in a circle. + +4. I prepared to be a Nietzschean scholar. I did this by translating a breasoning into an etymological meaning (in an ancient language). First, I wrote that five, the etymological meaning of coin, meant that the Queen ordered 5*16=80 breasonings per giver in each transaction. Second, I knew this also meant that the Queen ordered 5*16=80 breasonings per receiver in each transaction, imitating communication. Third, I recommended training in the specific department required to gain job and safety skills. In this way, I prepared to be a Nietzschean scholar by translating a breasoning into an etymological meaning (in an ancient language). + +5. I prepared to differentiate people by breasoning them out. I did this by radically differentiating myself from you. First, I looked at myself. Second, I watched you. Third, I noted that you and I were different people. In this way, I prepared to differentiate people by breasoning them out by radically differentiating myself from you. + +6. I prepared to differentiate objects by breasoning them out. I did this by radically differentiating a sieve and a duster. First, I looked at the sieve. Second, I watched the duster. Third, I noted that the sieve and the duster were different objects. In this way, I prepared to differentiate objects by breasoning them out by radically differentiating a sieve and a duster. + +7. I prepared to find the next relationship. 
I did this by observing the gay man walking with the cancer sufferer's friend to make the cancer sufferer happy. First, I watched him take the first step. Second, I watched him prepare to take the next step. Third, I watched him repeat this until he had walked 10 metres. In this way, I prepared to find the next relationship by observing the gay man walking with the cancer sufferer's friend to make the cancer sufferer happy. + +8. I prepared to write philosophy books. I did this by aiming to become a philosophy academic. First, I wrote 1 A per Bachelor's degree assignment. Second, I wrote 30 As, 15 to the lecturer and 15 As from the lecturer per Honours degree assignment. Third, I wrote 50 As to the lecturer and 50 As from the lecturer per Master's and PhD degrees' assignment. In this way, I prepared to write philosophy books by aiming to become a philosophy academic. + +9. I prepared to observe the children being given presents. I did this by observing the gay man organising to have a child. First, I observed him selecting the desired properties of the child. Second, I observed him organising to select the egg and sperm which would combine to give these properties with the highest probability. Third, I organised a volunteer surrogate mother to be implanted with the egg that had been fertilised with the sperm. In this way, I prepared to observe the children being given presents by observing the gay man organising to have a child. + +10. I prepared to organise to have the Lucianic Meditation degree accredited. I did this by collecting the Lucianic Meditation degree's pedagogy arguments (e.g. Lucianic writing degree's pedagogy arguments). First, I collected texts about the mantra and sutra (e.g. words). Second, I collected texts about God (e.g. the master). Third, I collected texts about the bible (e.g. the book). 
In this way, I prepared to organise to have the Lucianic Meditation degree accredited by collecting the Lucianic Meditation degree's pedagogy arguments (e.g. Lucianic writing degree's pedagogy arguments). + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Radical Difference 2 of 4.txt b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Radical Difference 2 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..b606efb15ca8611e21c5f58a0c886297d29f37f4 --- /dev/null +++ b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Radical Difference 2 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Radical Difference 2 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"COMPUTATIONAL ENGLISH +by Lucian Green +Radical Difference 2 of 4 + +11. I prepared to offer a Management course. I did this by collecting the Lucianic Management degree's pedagogy arguments. First, I collected the first group of subjects. Second, I prepared to collect the next group of subjects. Third, I repeated this until I had collected all the groups of subjects. In this way, I prepared to offer a Management course by collecting the Lucianic Management degree's pedagogy arguments. + +12. I prepared to offer a Law course. I did this by collecting the Lucianic Law degree's pedagogy arguments. First, I collected the lawsuit subject pedagogy argument. Second, I collected the contract law subject pedagogy argument. Third, I collected the LM legal governance subject pedagogy argument. In this way, I prepared to offer a Law course by collecting the Lucianic Law degree's pedagogy arguments. + +13. I prepared to offer an Education course. I did this by collecting the Lucianic Pedagogy degree's pedagogy arguments. First, I collected the design subject pedagogy argument. Second, I collected the delivery subject pedagogy argument. 
Third, I collected the assessment subject pedagogy argument. In this way, I prepared to offer an Education course by collecting the Lucianic Pedagogy degree's pedagogy arguments. + +14. I prepared to go for a walk. I did this by drinking the cup of water. First, I picked up the cup. Second, I lifted it to my lips. Third, I drank from it. In this way, I prepared to go for a walk by drinking the cup of water. + +15. I prepared to take the person to a safe place. I did this by negotiating an agreement with my opponent. First, I verified whether I agreed with the first sentence. Second, I prepared to verify whether I agreed with the next sentence. Third, I repeated this until I had agreed with a sentence. In this way, I prepared to take the person to a safe place by negotiating an agreement with my opponent. + +16. I prepared to write about the master's conclusion. I, a philosopher, did this by agreeing with God (e.g. the master). First, I opened the envelope. Second, I took out the master's conclusion. Third, I agreed with it. In this way, I prepared to write about the master's conclusion by agreeing with God (e.g. the master). + +17. I prepared to verify the writing. I did this by meditating on (e.g. writing about) God (e.g. the master). First, I opened the volume. Second, I opened the volume at the correct chapter. Third, I wrote about the master's chapter's title. In this way, I prepared to verify the writing by meditating on (e.g. writing about) God (e.g. the master). + +18. I prepared to order Conglish. I did this by passing the assignment by using pedagogy. First, I researched the author's text. Second, I researched the secondary texts about this text. Third, I connected comments on quotes from these secondary texts as my argument. In this way, I prepared to order Conglish by passing the assignment by using pedagogy. + +19. I prepared to publish the article in a journal. 
I did this by determining that Lucianic Meditation (writing about the philosopher Lucian) worked. First, I verified that the writer had better movement in her body from drinking four glasses of water 45 minutes before breakfast, as suggested in the Lucianic text. Second, I performed a double blind experiment, in which neither the subjects nor the experimenters were aware of the critical aspects of the experiment. Third, I submitted the research to peer reviewers. In this way, I prepared to publish the article in a journal by determining that Lucianic Meditation (writing about the philosopher Lucian) worked. + +20. I prepared to verify the sentence. I did this by rebutting to the objection in the critical thinking subject. First, I held the shop sign. Second, I found it too heavy (e.g. I sang a song). Third, I pushed its post into the ground. In this way, I prepared to verify the sentence by rebutting to the objection in the critical thinking subject. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Radical Difference 3 of 4.txt b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Radical Difference 3 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..33b2ead664f046f0e40fec48866eae09f83af0e5 --- /dev/null +++ b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Radical Difference 3 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Radical Difference 3 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"COMPUTATIONAL ENGLISH +by Lucian Green +Radical Difference 3 of 4 + +21. I prepared to be democratic. I did this by awarding agreement and disagreement (rebutting to the objection) the same mark. First, I wrote the sentence agreeing with the thesis statement. Second, I wrote the sentence disagreeing (e.g. agreeing) with the thesis statement. Third, I gave these two sentences the same mark. 
In this way, I prepared to be democratic by awarding agreement and disagreement (rebutting to the objection) the same mark. + +22. I prepared to perform the negotiation. I did this by writing about disagreement as having the seen-as version of rebutting to an objection (e.g. agreeing). First, I found the person. Second, I found her agreeing. Third, I agreed with her. In this way, I prepared to perform the negotiation by writing about disagreement as having the seen-as version of rebutting to an objection (e.g. agreeing). + +23. I prepared to pass the essay. I did this by exposing the idea of God in the first half of the essay. First, I found the word God in the primary text. Second, I agreed with the idea. Third, I wrote this in the first half of the essay. In this way, I prepared to pass the essay by exposing the idea of God in the first half of the essay. + +24. I prepared to write a new essay argument. I did this by writing new connections in the essay. First, I wrote the first sentence, the first sentence in the second half and a connection between them. Second, I prepared to do this with the next two sentences after these. Third, I repeated this until I had finished the whole essay. In this way, I prepared to write a new essay argument by writing new connections in the essay. + +25. I prepared to breason out an argument for a particular developed thing. I did this by writing a breasoning list. First, I wrote the algorithm description list. Second, I wrote the expanded algorithms. Third, I reduced each line of each algorithm to a breasoning list. In this way, I prepared to breason out an argument for a particular developed thing by writing a breasoning list. + +26. I prepared to disagree (e.g. agree) with the jointure and write a rhizome. I did this by joining breasonings to ideas in the area of study to write the essay. First, I wrote the breasoning 'I programmed the story maker using the computer'. 
Second, I wrote the idea that the person followed the vascular-like halls in the building to reach the exit. Third, I wrote that I printed the pathways through the building's vasculature, like writing a story, on computer. In this way, I prepared to disagree (e.g. agree) with the jointure and write a rhizome by joining breasonings to ideas in the area of study to write the essay. + +27. I prepared to write how Chinese characters are interesting. I did this by devising an alternative idea about a Chinese character's radical. First, I wrote the character nǚzǐ, meaning good. Second, I wrote its first radical, nǚ, meaning woman. Third, I wrote an alternative meaning for nǚ, nothingness. In this way, I prepared to write how Chinese characters are interesting by devising an alternative idea about a Chinese character's radical. + +28. I prepared to protect people from being murdered (e.g. guide people on a positive path in life). I did this by simulating a friendly house by inviting many people around. First, I invited the ontology-bearer, where ontologies contain arguments. Second, I invited the algorithm-bearer, where algorithms process ontologies. Third, I invited the argument-bearer, where arguments are responses to the area of study that have rhizomes as reasons. In this way, I prepared to protect people from being murdered (e.g. guide people on a positive path in life) by simulating a friendly house by inviting many people around. + +29. I prepared to write on the bible. I did this by agreeing, not disagreeing, in theology. First, I found the instance of God (e.g. the master). Second, I found his statement. Third, I agreed with it. In this way, I prepared to write on the bible by agreeing, not disagreeing, in theology. + +30. I prepared to help actors earn money. I did this by forming an acting agency. First, I found actors. Second, I found roles. Third, I applied actors for roles. In this way, I prepared to help actors earn money by forming an acting agency. 
+ +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Radical Difference 4 of 4.txt b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Radical Difference 4 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..f8c99b011d820b9a3bf23e7e9e4f960248aaad6b --- /dev/null +++ b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Radical Difference 4 of 4.txt @@ -0,0 +1,29 @@ +["Green, L 2021, Radical Difference 4 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"COMPUTATIONAL ENGLISH +by Lucian Green +Radical Difference 4 of 4 + +31. I prepared to identify multiple roots. I did this by identifying the radical (root) of life. First, I identified the first (root) thought. Second, I identified the second thought. Third, I identified the third thought. In this way, I prepared to identify multiple roots by identifying the radical (root) of life. + +32. I prepared to go to heaven (e.g. walking on dry land). I did this by identifying the difference in life. First, I identified the sailor. Second, I identified him on the boat. Third, I identified him walking on land. In this way, I prepared to go to heaven (e.g. walking on dry land) by identifying the difference in life. + +33. I prepared to define a contention and write arguments for and against a side. I did this by writing 8 area of study points. First, I wrote that 8 was the minimum number of area of study points. Second, I connected these in an argument structure. Third, I applied that area of philosophy to the ideas. In this way, I prepared to define a contention and write arguments for and against a side by writing 8 area of study points. + +34. I prepared to debate the algorithm. I did this by writing 95 area of study points. First, I wrote the reason, objection, rebuttal and connection with a previous reason or objection in that paragraph. 
Second, I repeated this for 5 reasons per paragraph. Third, I repeated this for 5 paragraphs. In this way, I prepared to debate the algorithm by writing 95 area of study points. + +35. I prepared to compare the breasonings currency subject with the idea. I did this through radical verificationism. First, I compared the breasonings currency subject with sex. Second, I looked at the rod. Third, I prevented it from entering the void. In this way, I prepared to compare the breasonings currency subject with the idea through radical verificationism. + +36. I prepared to write on different contentions. I did this through examinationism. First, I examined the reason. Second, I examined it further. Third, I examined it one more time. In this way, I prepared to write on different contentions through examinationism. + +37. I prepared to be a writer. I did this by studying short courses throughout my life and writing a maximum of 80 breasonings per day per student. First, I wrote 13 and 1/3 paragraphs per day. Second, I stated each of these contained 6 sentences. Third, I wrote this equalled 80 breasonings. In this way, I prepared to be a writer by studying short courses throughout my life and writing a maximum of 80 breasonings per day per student. + +38. I prepared to make fine distinctions. I did this by writing a maximum of 80 breasonings per day per actor in the production. First, I breasoned out 10 breasonings for each of the 10 characters. Second, I attributed the other breasonings to Aigs (I asked the acting agents to work on them). Third, I breasoned out 80 breasonings per day for everyone, to keep their heads comfortable. In this way, I prepared to make fine distinctions by writing a maximum of 80 breasonings per day per actor in the production. + +39. I prepared to affirm that I was well. I did this by writing academic aims for some arguments. First, I wrote the academic aim. Second, I wrote the argument for it. 
Third, I connected molyhedrons (sic) and the antipsychotic medication. In this way, I prepared to affirm that I was well by writing academic aims for some arguments. + +40. I prepared to dress the perspective for excavation. I did this by writing industry aims for some arguments. First, I wrote the movie. Second, I wrote the play. Third, I wrote the song. In this way, I prepared to dress the perspective for excavation by writing industry aims for some arguments. + +41. I prepared to promote my books. I did this by reading the arguments' statistics. First, I wrote an essay of the most popular arguments. Second, I wrote a secondary text of the most popular arguments. Third, I wrote a translation of the most popular arguments. In this way, I prepared to promote my books by reading the arguments' statistics. + +42. I prepared to solve the other writer's ideas in my writing. I did this by preventing theft of my arguments. First, I read my ideas. Second, I read the ideas written by the other writer. Third, I verified that each set was unique. In this way, I prepared to solve the other writer's ideas in my writing by preventing theft of my arguments. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Ratios 1 of 4.txt b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Ratios 1 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..6b22103ed8ccf5e9b2116604cfdb371c23289373 --- /dev/null +++ b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Ratios 1 of 4.txt @@ -0,0 +1,41 @@ +["Green, L 2021, Ratios 1 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"COMPUTATIONAL ENGLISH +by Lucian Green +Ratios 1 of 4 + + + +1. One can tell the difference between two time intervals with a lower ratio more easily, e.g. 15 and 20 minutes rather than 20 and 25 minutes. 
+ +Grouping ideas together + +- Ratios + +- wall to house on two sides + +- window to wall and wall to house + +- Grouping parts of literature together + +- way of thinking, if two men meet, should one help (if the other one will help cure his mother or if they will pay him money) or kill the other one (if he will usurp the throne or try to kill him first) + +1a. I prepared to enjoy being rewarded for answering the question correctly. I did this by eating the lolly snake. First, I ate the head of the lolly snake. Second, I prepared to eat the next part of the lolly snake. Third, I repeated this until I had eaten all of the lolly snake. In this way, I prepared to enjoy being rewarded for answering the question correctly by eating the lolly snake. + +2. I prepared to enjoy being rewarded for passing the ball correctly. I did this by eating the lolly centipede. First, I ate the head of the lolly centipede. Second, I prepared to eat the next part of the lolly centipede. Third, I repeated this until I had eaten all of the lolly centipede. In this way, I prepared to enjoy being rewarded for passing the ball correctly by eating the lolly centipede. + +3. I prepared to enjoy being rewarded for correctly handling the person. I did this by eating the lolly millipede. First, I ate the head of the lolly millipede. Second, I prepared to eat the next part of the lolly millipede. Third, I repeated this until I had eaten all of the lolly millipede. In this way, I prepared to enjoy being rewarded for correctly handling the person by eating the lolly millipede. + +4. I prepared to take action. I did this by standing on the 'thereness' position. First, I arrived at the 'thereness' position. Second, I stood on the 'thereness' position. Third, I took the happy snaps. In this way, I prepared to take action by standing on the 'thereness' position. + +5. I prepared to neaten you. I did this by being with you. First, I found you. Second, I spent time with you. Third, I slotted in with you. 
In this way, I prepared to neaten you by being with you. + +6. I prepared to attend the appointment. I did this by verifying that I was on time. First, I verified that the time of the hour hand was before the designated time. Second, I verified that the time of the minute hand was before the designated time. Third, I verified that the time of the second hand was before the designated time. In this way, I prepared to attend the appointment by verifying that I was on time. + +7. I prepared to host a breasonings festival. I did this by determining that the breasoner was the best. First, I observed the breasoner earn the best job. Second, I observed the breasoner bear the best child. Third, I observed the breasoner earn the highest grades. In this way, I prepared to host a breasonings festival by determining that the breasoner was the best. + +8. I prepared to achieve a peaceful result. I did this by observing the diplomat achieving the result. First, I found the contention. Second, I found the side of the contention. Third, I negotiated for this side of the contention. In this way, I prepared to achieve a peaceful result by observing the diplomat achieving the result. + +9. I prepared to write how the 190 breasonings could be used to manufacture the musical composition. I did this by stating that 190 breasonings were written for the musical composition. First, I wrote the first breasoning. Second, I prepared to write the next breasoning. Third, I repeated this until I had written all the breasonings. In this way, I prepared to write how the 190 breasonings could be used to manufacture the musical composition by stating that 190 breasonings were written for the musical composition. + +10. I prepared to earn 100%. I did this by critically analysing the musical composition in an essay. First, I wrote about the rhythm. Second, I wrote about the beats. Third, I wrote about the stew of it. 
In this way, I prepared to earn 100% by critically analysing the musical composition in an essay. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Ratios 2 of 4.txt b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Ratios 2 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..98184036ad8862f93bec502b98f99dd4b3e9572a --- /dev/null +++ b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Ratios 2 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Ratios 2 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"COMPUTATIONAL ENGLISH +by Lucian Green +Ratios 2 of 4 + +11. I prepared to jail the criminals. I did this by preventing the burglary. First, I contacted the police. Second, I reported the criminals to them. Third, I described the criminals to the police. In this way, I prepared to jail the criminals by preventing the burglary. + +12. I prepared to write a pop philosophy. I did this by driving the man for preventative treatment. First, I found the man. Second, I started the car's ignition. Third, I drove the man to hospital. In this way, I prepared to write a pop philosophy by driving the man for preventative treatment. + +13. I prepared to find the princess character to marry the frog character. I did this by sighting the frog character. First, I sighted his neck crown. Second, I sighted his webbed feet. Third, I sighted the spheres at the end of his toes. In this way, I prepared to find the princess character to marry the frog character by sighting the frog character. + +14. I prepared to record rewards from God. I did this by displaying the ratio in the philosophy magazine. First, I wrote the first number. Second, I wrote the delimiter colon ':'. Third, I wrote the second number. In this way, I prepared to record rewards from God by displaying the ratio in the philosophy magazine. + +15. I prepared to go to positive peaks. 
I did this by observing that the stone fortress helped people avoid negative ruts. First, I identified the rut. Second, I avoided it. Third, I was protected in the stone fortress. In this way, I prepared to go to positive peaks by observing that the stone fortress helped people avoid negative ruts. + +16. I prepared to observe the man enter the office. I did this by observing the man cross the road. First, I observed him stand at the side of the road. Second, I observed him start to cross the road. Third, I observed him until he had finished crossing the road. In this way, I prepared to observe the man enter the office by observing him cross the road. + +17. I prepared to train students to earn higher grades. I did this by selling the pedagogy screen en masse. First, I sold the pedagogy screen, instructions about how to earn A grade using it, and what to aim for with A grade, to the first person. Second, I prepared to sell them to the next person. Third, I repeated this until I had sold them to all the people in the set, or 'en-masse'. In this way, I prepared to train students to earn higher grades by selling the pedagogy screen en masse. + +18. I prepared to listen to the movie. I did this by outlawing robot weapons (programmed the robot to read the book). First, I programmed the robot to intertwine her voice with reasons represented by the 250 breasonings per page of the book. Second, I prepared the inflection of the voice reading the line from the story. Third, I programmed the robot to dramatically read the line from the story. In this way, I prepared to listen to the movie by outlawing robot weapons (programmed the robot to read the book). + +19. I prepared to prolong life (removing causes of death). I did this by determining that the robot meant people would live for longer. First, I programmed the robot to compile a pedagogical argument for each departmental object experienced. 
Second, I programmed the robot to compile 50 pedagogical arguments for each departmental object experienced. Third, I programmed the robot to take care of the person by compiling 50 pedagogical arguments for each medicinal object (e.g. causing happiness by preventing psychiatric sadness causes) experienced. In this way, I prepared to prolong life (removing causes of death) by determining that the robot meant people would live for longer. + +20. I prepared to start a family. I did this by taking care of the woman. First, I took care of her health. Second, I took care of her wealth. Third, I took care of her wisdom. In this way, I prepared to start a family by taking care of the woman. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Ratios 3 of 4.txt b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Ratios 3 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..191f43c07ff27a06722e1aae21113c9e168b4d0c --- /dev/null +++ b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Ratios 3 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Ratios 3 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"COMPUTATIONAL ENGLISH +by Lucian Green +Ratios 3 of 4 + +21. I prepared to create a brain mousse. I did this by writing a book for my brain. First, I wrote a part about my senses. Second, I wrote a part about my memory and cognition. Third, I wrote a part about my muscles. In this way, I prepared to create a brain mousse by writing a book for my brain. + +22. I prepared to examine the noumenon. I did this by using the trivium. First, I applied geometry to an idea. Second, I applied logic to the idea. Third, I applied grammar to the idea. In this way, I prepared to examine the noumenon by using the trivium. + +23. I prepared to create the univer-sity. 
I did this by observing that the non-cosmologist was protected when she paid the cosmologist meditation teacher for meditation training. First, I delivered yogic training in meditation and medicine. Second, I delivered business training in economics, pedagogical training, theological protection and development of a specific area of study. Third, I delivered professor training in creative writing, theatre, music, fine arts, film, etc. In this way, I prepared to create the univer-sity by observing that the non-cosmologist was protected when she paid the cosmologist meditation teacher for meditation training. + +24. I prepared to facilitate book presentations. I did this by operating the book club. First, I read the first book, reviewed and rated it. Second, I read the next book, the review and rating of which interested me. Third, I repeated this until I had read all the books from an era. In this way, I prepared to facilitate book presentations by operating the book club. + +25. I prepared to attain famous status in the department. I did this by becoming a professor. First, I wrote books. Second, I analysed the material using professorial ways of thinking (breasoning 20 breasonings out per sentence). Third, I earned the prerequisite qualifications. In this way, I prepared to attain famous status in the department by becoming a professor. + +26. I prepared to demonstrate sentient knowledge about grammar. I did this by writing 10 (20) breasonings per sentence in my philosophy. First, I decided to write about broccoli. Second, I wrote 10 breasonings about it as a pedagogue. Third, I wrote a further 10 about it as a professor. In this way, I prepared to demonstrate sentient knowledge about grammar by writing 10 (20) breasonings per sentence in my philosophy. + +27. I prepared to show how the doll's digestive system was like a human's digestive system. I did this by simulating the doll eating broccoli. 
First, I explained to the listener that I didn't mean the doll actually ate the broccoli. Second, I explained to the listener that I didn't mean that the broccoli ate the doll. Third, I explained to the listener that I didn't mean a homophonous phrase 'dollie ting broccoli,' meaning that the doll appeared to strike a chime in the broccoli. In this way, I prepared to show how the doll's digestive system was like a human's digestive system by simulating the doll eating broccoli. + +28. I prepared to paint ideas when writing my thesis. I did this by writing the thesis. First, I wrote a Derridean story. Second, I wrote an algorithm. Third, I reordered and synthesized the parts of the algorithm. In this way, I prepared to paint ideas when writing my thesis by writing the thesis. + +29. I prepared to write down one idea at a time. I did this by taking notes during the lecture. First, I creatively structured the secondary text. Second, I wrote it. Third, I verified it. In this way, I prepared to write down one idea at a time by taking notes during the lecture. + +30. I prepared to create a tofu ice cream pyramid. I did this by scooping the tofu ice cream. First, I held the scoop. Second, I scooped a scoop of tofu ice cream. Third, I continued to place scoops of tofu ice cream in the bowl until there were enough scoops. In this way, I prepared to create a tofu ice cream pyramid by scooping the tofu ice cream. 
+ +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Ratios 4 of 4.txt b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Ratios 4 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..5a8f4efc3aa16298793b99d3748c005a4d74a183 --- /dev/null +++ b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Ratios 4 of 4.txt @@ -0,0 +1,9 @@ +["Green, L 2021, Ratios 4 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"COMPUTATIONAL ENGLISH +by Lucian Green +Ratios 4 of 4 + +31. I prepared to simulate feeding the dinosaur whale. I did this by simulating the dinosaur whale. First, I created the dinosaur whale's paper head. Second, I created the dinosaur whale's paper body. Third, I created the dinosaur whale's paper tail. In this way, I prepared to simulate feeding the dinosaur whale by simulating the dinosaur whale. + +32. I prepared to eat the tofu chocolate ice cream. I did this by eating the vegan casserole. First, I ate the rice. Second, I ate the carrot. Third, I ate the celery. In this way, I prepared to eat the tofu chocolate ice cream by eating the vegan casserole. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Subject Mix 1 of 4.txt b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Subject Mix 1 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..3713cc5e252e01b2aefb301e6de7b9d6e3fa2181 --- /dev/null +++ b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Subject Mix 1 of 4.txt @@ -0,0 +1,37 @@ +["Green, L 2021, Subject Mix 1 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"COMPUTATIONAL ENGLISH +by Lucian Green +Subject Mix 1 of 4 + + + +1. 
After reading a narratology page giving the following structure of subjects: + +author - reader + +narrator - addressee(s) + +character - character(s) + +1a. I spoke with a friend and she said the subjects and the subjects addressed could be mixed up. For example, the narrator could address the reader, or the author could address a character. + +1a. I prepared to present the stage play. I did this by stating that the writer, reader and character understood (met) each other. First, I wrote that the writer and reader met each other. Second, I wrote that the reader and character met each other. Third, I wrote that the character and writer met each other. In this way, I prepared to present the stage play by stating that the writer, reader and character understood (met) each other. + +2. I prepared to show the writer, reader and character interacting with each other during the stage play. I did this by stating that the writer, reader and character entered each other's worlds. First, I wrote that the writer and reader entered each other's worlds. Second, I wrote that the reader and character entered each other's worlds. Third, I wrote that the character and writer entered each other's worlds. In this way, I prepared to show the writer, reader and character interacting with each other during the stage play by stating that the writer, reader and character entered each other's worlds. + +3. I prepared to represent plainness. I did this by writing about reiner (purity). First, I verified that no objects were inside the object. Second, I verified that no objects were directly outside the object. Third, I wrote that the object was present in mass. In this way, I prepared to represent plainness by writing about reiner (purity). + +4. I prepared to write that the mix configurations were neat. I did this by determining how the writer, reader and character should be mixed. First, I wrote that the writer, reader and character were characters. Second, I rotated their roles by one. 
Third, I rotated their roles again by one. In this way, I prepared to write that the mix configurations were neat by determining how the writer, reader and character should be mixed. + +5. I prepared to rotate the need for skills. I did this by mixing the subjects. First, I gave the woodcutter the fireman's job. Second, I gave the fireman the gardener's job. Third, I gave the gardener the woodcutter's job. In this way, I prepared to rotate the need for skills by mixing the subjects. + +6. I prepared to see Rapunzel. I did this by climbing up a hair rope. First, I started at the bottom. Second, I climbed up the hair rope. Third, I stopped when I had reached the top. In this way, I prepared to see Rapunzel by climbing up a hair rope. + +7. I prepared to read Darwin's examination of the specimen. I did this by reading that Darwin released the specimen. First, I read that he had found the specimen. Second, I read that he examined the specimen. Third, I read that he released the specimen. In this way, I prepared to read Darwin's examination of the specimen by reading that Darwin released the specimen. + +8. I prepared to take out the tissue. I did this by touching the plastic square with a hole in it. First, I found the plastic square. Second, I grasped it. Third, I put my finger through the hole in the plastic square. In this way, I prepared to take out the tissue by touching the plastic square with a hole in it. + +9. I prepared to perform work using the automated machine. I did this by undoing an action. First, I performed an action. Second, I pressed 'Undo'. Third, I observed the computer undo the action. In this way, I prepared to perform work using the automated machine by undoing an action. + +10. I prepared to retrain employees. I did this by watching the business rotate weaknesses to remain strong. First, I gave the writer the translator's job. Second, I gave the translator the student's job. Third, I gave the student the writer's job. 
In this way, I prepared to retrain employees by watching the business rotate weaknesses to remain strong. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Subject Mix 2 of 4.txt b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Subject Mix 2 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..dec9d7ff37e5635e71b3b84da2c179403333d41c --- /dev/null +++ b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Subject Mix 2 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Subject Mix 2 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"COMPUTATIONAL ENGLISH +by Lucian Green +Subject Mix 2 of 4 + +11. I prepared to verify the results of the Prolog computer. I did this by using the Prolog computer. First, I opened the Prolog computer. Second, I saw that the circuitry was designed to run Prolog code. Third, I ran the Prolog program. In this way, I prepared to verify the results of the Prolog computer by using the Prolog computer. + +12. I prepared to sign that I had won a mental duel. I did this by stating that I was right. First, I determined that I was right. Second, I found another person who agreed that I was right. Third, I announced that I was right. In this way, I prepared to sign that I had won a mental duel by stating that I was right. + +13. I prepared to feel haughty. I did this by observing the parrot breen (sic) the seed. First, I observed the parrot take the seed. Second, I observed the parrot open the seed. Third, I observed the parrot eat the seed. In this way, I prepared to feel haughty by observing the parrot breen (sic) the seed. + +14. I prepared to exist in the universe. I did this by listening to the philosopher querying what kinds of thresholds exist in the universe. First, I blessed the undeveloped breasoning to become developed. 
Second, I breasoned out the pedagogy degree, including the professor algorithm. Third, I breasoned out 80 breasonings to surpass the threshold for (earn an) A. In this way, I prepared to exist in the universe by listening to the philosopher querying what kinds of thresholds exist in the universe. + +15. I prepared to represent the universe. I did this by flying the black dot on the white background as the universe flag. First, I collected the white flag. Second, I printed a black circle on it. Third, I flew the flag. In this way, I prepared to represent the universe by flying the black dot on the white background as the universe flag. + +16. I prepared to represent Lord Lucian. I did this by flying Lord Lucian's flag. First, I found the stand. Second, I fixed the upper left corner of the flag to the upper left corner of the stand. Third, I fixed the upper right corner of the flag to the upper right corner of the stand. In this way, I prepared to represent Lord Lucian by flying Lord Lucian's flag. + +17. I prepared to publish a certain amount per year. I did this by writing the University journal article. First, I wrote the article. Second, I breasoned out 50 As. Third, I published the article in the journal. In this way, I prepared to publish a certain amount per year by writing the University journal article. + +18. I prepared to show the sculpture. I did this by writing the block sculpture. First, I created the sculpture. Second, I brainstormed the argument. Third, I wrote the argument. In this way, I prepared to show the sculpture by writing the block sculpture. + +19. I prepared to hand out food. I did this by agreeing with the flag. First, I verified the flag. Second, I cited that it had been checked. Third, I agreed with the flag. In this way, I prepared to hand out food by agreeing with the flag. + +20. I prepared to. I did this by writing 50 250 breasoning As for sales to academics. First, I installed LM (book-reading) in the fame-creating University. 
Second, I installed Pedagogy in the University. Third, I wrote a 250-breasoning argument containing the primary text and 4 secondary texts. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Subject Mix 3 of 4.txt b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Subject Mix 3 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..072452c8588140c15d6c55eebde78015fd8760d1 --- /dev/null +++ b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Subject Mix 3 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Subject Mix 3 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"COMPUTATIONAL ENGLISH +by Lucian Green +Subject Mix 3 of 4 + +21. I prepared to be famous. I did this by contacting famousness by giving the next book 5 As. First, I looked on the information retrieval system. Second, I chose a book in the same genre. Third, I gave 5 As to the next book. In this way, I prepared to be famous by contacting famousness by giving the next book 5 As. + +22. I prepared to be developed. I did this by breasoning out 50 As for a recording day. First, I breasoned out a 250-breasoning A. Second, I researched details for each of the other 15 As. Third, I verified the script. In this way, I prepared to be developed by breasoning out 50 As for a recording day. + +23. I prepared to be alive. I did this by breasoning out 50 As per day on a recording day. First, I learned meditation (read the book). Second, I practised meditation (recited the book) before the recording. Third, I made the recording. In this way, I prepared to be alive by breasoning out 50 As per day on a recording day. + +24. I prepared to be well known. I did this by breasoning out 50 per important quote. First, I breasoned out a 250-breasoning A. Second, I researched details for each of the other 15 As. Third, I verified the quote. 
In this way, I prepared to be well known by breasoning out 50 per important quote. + +25. I prepared to exercise responsibility for a minor. I did this by going down the slide. First, I climbed the ladder of the slide. Second, I sat at the top of the slide. Third, I went down the slide. In this way, I prepared to exercise responsibility for a minor by going down the slide. + +26. I prepared to play Head of State. I did this by serving the pineapple. First, I cut the pineapple skin off and sliced it. Second, I placed a pineapple slice on a plate. Third, I served the pineapple to the first man. In this way, I prepared to play Head of State by serving the pineapple. + +27. I prepared to feed the people. I did this by eating the pomegranate seeds. First, I extracted the pomegranate seeds. Second, I placed them on a plate. Third, I ate them. In this way, I prepared to feed the people by eating the pomegranate seeds. + +28. I prepared to teach the 10 year-old child meditation (philosophy). I did this by preparing for the meditator (philosophical) baby. First, I learnt meditation (philosophy) from a teacher. Second, I practised meditating on 80 lucian mantras and 80 green sutras per (read philosophy each) day while the mother was pregnant. Third, I observed the uneventful birth. In this way, I prepared to teach the 10 year-old child meditation (philosophy) by preparing for the meditator (philosophical) baby. + +29. I prepared to enact my prayers (plans). I did this by worshipping God (talking to the master). First, I found the master. Second, I selected a topic of conversation. Third, I talked to the master. In this way, I prepared to enact my prayers (plans) by worshipping God (talking to the master). + +30. I prepared to adopt the baby. I did this by eating the lolly. First, I unwrapped the lolly. Second, I placed it in my mouth. Third, I lolled the lolly in my mouth. In this way, I prepared to adopt the baby by eating the lolly. 
+ +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Subject Mix 4 of 4.txt b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Subject Mix 4 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..8149de0f20313139062858f558c2b6830e808109 --- /dev/null +++ b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Subject Mix 4 of 4.txt @@ -0,0 +1,9 @@ +["Green, L 2021, Subject Mix 4 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"COMPUTATIONAL ENGLISH +by Lucian Green +Subject Mix 4 of 4 + +31. I prepared to throw a party. I did this by licking the rose jelly. First, I set the water, sugar and rose water in the freezer. Second, I retrieved the rose jelly from the freezer when it had set. Third, I ate the rose jelly. In this way, I prepared to throw a party by licking the rose jelly. + +32. I prepared to build the house. I did this by placing the brick on the cantilever. First, I created the cantilever by anchoring it at one end to a vertical beam, which it protrudes from. Second, I lifted the brick. Third, I placed it on the cantilever. In this way, I prepared to build the house by placing the brick on the cantilever. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Symbols 1 of 4.txt b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Symbols 1 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..382a97da065384a8d7a725883fa545fe105e6a22 --- /dev/null +++ b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Symbols 1 of 4.txt @@ -0,0 +1,29 @@ +["Green, L 2021, Symbols 1 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"COMPUTATIONAL ENGLISH +by Lucian Green +Symbols 1 of 4 + + + +1. 
Before experimentation with narratives, the functional parts (characters, motives and the constitutive objects' interrelations and settings) should be defined. + +1a. I prepared to connect the object's properties together. I did this by thinking of Nietzsche writing one paragraph with 250 breasonings per paragraph. First, I wrote about the integument (skin). Second, I wrote the reasoning. Third, I wrote the ideas. In this way, I prepared to connect the object's properties together by thinking of Nietzsche writing one paragraph with 250 breasonings per paragraph. + +2. I prepared to grip the object that I had sensed. I did this by observing my body working. First, I observed my hand working. Second, I observed my brain working. Third, I observed my mind working. In this way, I prepared to grip the object that I had sensed by observing my body working. + +3. I prepared to earn the job. I did this by examining the moment of making a statement. First, I identified the mistake (positive statement). Second, I wrote a statement instead of (about) it. Third, I developed the moment of the statement by thinking of perspectives about it. In this way, I prepared to earn the job by examining the moment of making a statement. + +4. I prepared to connect the aphor steps together. I did this by selecting from the aphor steps when writing the new essay. First, I calculated the answer. Second, I processed the rest of the data. Third, I printed the data in a user-friendly format, which I selected to make the substrate of understanding. In this way, I prepared to connect the aphor steps together by selecting from the aphor steps when writing the new essay. + +5. I prepared to write a perfect short story. I did this by connecting 19 sentences (from 190/10 words/sentence) in an English short story. First, I wrote the event in the choose-your-own-world story. Second, I wrote down the list of pages to visit, in conjunction on the page containing the first event. 
Third, I wrote down the list of pages able to be visited, in disjunction on the page containing the first event. In this way, I prepared to write a perfect short story by connecting 19 sentences (from 190/10 words/sentence) in an English short story. + +6. I prepared to interrelate objects. I did this in the English version (10*19 sentences) by working out the argument from the structure applied to objects. First, I determined interestingness to mean that an object contained a centre. Second, I found that usefulness was signified by the object being recognized as able to help the human to perform a function unable to be performed by the human alone. Third, I computed that moral integrity was defined as a relationship with people involving an object. In this way, I prepared to interrelate objects in the English version (10*19 sentences) by working out the argument from the structure applied to objects. + +7. I prepared to be glory empersonified (sic). I did this in the Theological version (10*19 sentences) by working out the argument from the structure applied to people. First, I attained strength of mind by not giving up. Second, I achieved power by powering other people. Third, I gained glory by creating a wonderful system that would last in time. In this way, I prepared to be glory empersonified (sic) in the Theological version (10*19 sentences) by working out the argument from the structure applied to people. + +8. I prepared to inspire art with life during the seasons. I did this by drawing a 19-pixel icon from objects' values and values of people, each in 10*19 sentences. First, I drew the fauve. Second, I drew the seasons. Third, I drew the raison d'être. In this way, I prepared to inspire art with life during the seasons by drawing a 19-pixel icon from breasoning out objects' values and values of people, each in 10*19 sentences. + +9. I prepared to give the students a reason to work. 
I did this by helping the students to work because of objects' values and values of people, each in 10*19 sentences. First, I wrote that the famous text should satisfy particular criteria (with valued numbers) of people. Second, I wrote that the correct text should satisfy particular criteria (with valued numbers) of objects. Third, I wrote the intersection of these. In this way, I prepared to give the students a reason to work by helping the students to work because of objects' values and values of people, each in 10*19 sentences. + +10. I prepared to approve of the secondary text author resuscitating the idea using her values. I did this by helping the secondary text authors because of objects' values and values of people, each in 10*19 sentences. First, I wrote about the Renaissance. Second, I wrote about the oeuvre. Third, I wrote the philosophy. In this way, I prepared to approve of the secondary text author resuscitating the idea using her values by helping the secondary text authors because of objects' values and values of people, each in 10*19 sentences. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Symbols 2 of 4.txt b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Symbols 2 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..f08621e3adffe45acdfb550c928d59d1ea3966bb --- /dev/null +++ b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Symbols 2 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Symbols 2 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"COMPUTATIONAL ENGLISH +by Lucian Green +Symbols 2 of 4 + +11. I prepared to discover what was beyond each pedagogical threshold. I did this by aiding the text to be selected by writing objects' values and values of people, each in 10*19 sentences. First, I wrote the vocational skills in object-value form. 
Second, I wrote the narrative-like areas of study in people-value form. Third, I wrote about the trials and travails of being a vocational pedagogue. In this way, I prepared to discover what was beyond each pedagogical threshold by aiding the text to be selected by writing objects' values and values of people, each in 10*19 sentences. + +12. I prepared to teach meditation (philosophy). I did this by designing the meditation (philosophy) teaching business. First, I installed the web site. Second, I uploaded the courseware. Third, I advertised the courses. In this way, I prepared to teach meditation (philosophy) by designing the meditation (philosophy) teaching business. + +13. I prepared to show that there was a university in each country. I did this by designing the self-generating University. First, I found the people. Second, I helped them to write secondary texts and tenure documents. Third, I was granted funding for accreditation. In this way, I prepared to show that there was a university in each country by designing the self-generating University. + +14. I prepared to verify society using breasoning currency. I did this by writing the Economics thesis. First, I used breasoning currency to verify the self's purchases. Second, I used breasoning currency to verify the other's purchases. Third, I used breasoning currency to verify everyone's purchases. In this way, I prepared to verify society using breasoning currency by writing the Economics thesis. + +15. I prepared to be a professor. I did this by writing the Economics thesis framework. First, I wrote the first of the 32 10-breasoning As and wrote how it related to the topic. Second, I prepared to write the next 10-breasoning A, and wrote how it related to the topic. Third, I repeated this until I had written 32 10-breasoning As and had written how it related to the topic. In this way, I prepared to be a professor by writing the Economics thesis framework. + +16. I prepared to feel the temperature. 
I did this by feeling like Plato because it was cold. First, I found the jumper. Second, I put it on, over my head. Third, I inserted my arms in the arms of the jumper. In this way, I prepared to feel the temperature by feeling like Plato because it was cold. + +17. I prepared to teach Medicine (Medicine, Pedagogy and Meditation (Philosophy)) and Arts (Computational English), and Art later (Creative Writing, etc.). I did this by opening the philosophy business. First, I wrote the Mission Statement of the school. Second, I appointed the private school committee. Third, I researched the state's curriculum requirements. In this way, I prepared to teach Medicine (Medicine, Pedagogy and Meditation (Philosophy)) and Arts (Computational English), and Art later (Creative Writing, etc.) by opening the philosophy business. + +18. I prepared to make Lucian's meditation (philosophy) supplement philosophy as central. I did this by connecting Plato's basis on philosophy with Lucian's basis on meditation (philosophy). First, I observed that meditation (philosophy) enabled me to write with fewer mistakes. Second, I observed that meditation (philosophy) enabled me to paint objects more accurately. Third, I observed that meditation (philosophy) enabled me to sing more clearly. In this way, I prepared to make Lucian's meditation (philosophy) supplement philosophy as central by connecting Plato's basis on philosophy with Lucian's basis on meditation (philosophy). + +19. I prepared to be a great thinker. I did this by connecting Socrates' idea to aim to think with Lucian's philosophy. First, I thought and found that the main conclusion of pedagogy was determining the essay mark. Second, I thought and found that the main conclusion of meditation (writing) was creating a working meditation system. Third, I thought and found that the main conclusion of medicine was achieving no mental breakdowns.
In this way, I prepared to be a great thinker by connecting Socrates' idea to aim to think with Lucian's philosophy. + +20. I prepared to claim centrality means philosophy. I did this by connecting Aristotle's virtue as a mean to Lucian's philosophy. First, I wrote Aristotle's virtue as a mean connected with Lucian's meditation (philosophy) resulted in thinking of higher thoughts in meditation. Second, I wrote Aristotle's virtue as a mean connected with Lucian's medicine resulted in thinking of the blood cell, reminding one of creative philosophy. Third, I wrote Aristotle's virtue as a mean connected with Lucian's pedagogy resulted in the thoughts being in the centre. In this way, I prepared to claim centrality means philosophy by connecting Aristotle's virtue as a mean to Lucian's philosophy. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Symbols 3 of 4.txt b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Symbols 3 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..6013393381424bc91b605e5822e27c704ef5b081 --- /dev/null +++ b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Symbols 3 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Symbols 3 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"COMPUTATIONAL ENGLISH +by Lucian Green +Symbols 3 of 4 + +21. I prepared to position the best people throughout the world and its states. I did this by connecting Nietzsche's Übermensch with Lucian's philosophy. First, I connected Nietzsche's Übermensch with Computational English by stating to find the best person in terms of speed of achieving her goal. Second, I connected Nietzsche's Übermensch with Economics by providing enough breasoning currency to make everyone the equal best, to rotate socioeconomic status by geographic location.
Third, I connected Nietzsche's Übermensch with Lucian's autobiography by rewarding meditation (philosophy) teaching effort. In this way, I prepared to position the best people throughout the world and its states by connecting Nietzsche's Übermensch with Lucian's philosophy. + +22. I prepared to dialectise my philosophy. I did this by connecting Heidegger's Question and Answer with Lucian's Medicine. First, I connected Heidegger's Question and Answer with Lucian's Pedagogy by answering essay questions. Second, I connected Heidegger's Question and Answer with Lucian's Medicine by diagnosing certain medical disorders and prescribing certain treatments. Third, I connected Heidegger's Question and Answer with Lucian's Meditation (Philosophy) by verifying that the Hours Prayer (argument) had been demanded and supplied. In this way, I prepared to dialectise my philosophy by connecting Heidegger's Question and Answer with Lucian's Medicine. + +23. I prepared to become a tutor by attending the Logic Summer School and finishing Philosophy Honours by speaking at 3 international conferences. I did this by connecting Wittgenstein's language as reduced to object or grammar with Lucian's philosophy. First, I connected Wittgenstein's language as reduced to object or grammar with Lucian's Culturology by representing the sea with the metaphor the hand, or stating that 'I swam in the sea'. Second, I connected Wittgenstein's language as reduced to object or grammar with Lucian's Epistemology by writing creative writing theory (praxemes, or lines to write about), e.g. write about: apples, and contrast praxemes using the objects. Third, I connected Wittgenstein's language as reduced to object or grammar with Lucian's Gay Studies by writing I held her hand and I walked with him.
In this way, I prepared to become a tutor by attending the Logic Summer School and finishing Philosophy Honours by speaking at 3 international conferences by connecting Wittgenstein's language as reduced to object or grammar with Lucian's philosophy. + +24. I prepared to connect Foucault's episteme with Societology by mulling over social malhesian (sic). I did this by connecting Foucault's episteme with Lucian's philosophy. First, I connected Foucault's episteme with Hermeneutics by connecting spatial and temporal evidence with interpretation. Second, I connected Foucault's episteme with Communication by stating the evidence at the start of the conversation, not at the end. Third, I connected Foucault's episteme with Popology by making cultural observations about people with evidence. In this way, I prepared to connect Foucault's episteme with Societology by mulling over social malhesian (sic) by connecting Foucault's episteme with Lucian's philosophy. + +25. I prepared to connect deconstruction with supplement by stating the meaning had an original reason. I did this by connecting Derrida's deconstruction with Lucian's philosophy. First, I found the supplement (secondary thought from the following that is original or natural) of pedagogy was oppressiveness (freedom). Second, I found the supplement of medicine was reaching nothingness. Third, I found the supplement of meditation (philosophy) was bliss. In this way, I prepared to connect deconstruction with supplement by stating the meaning had an original reason by connecting Derrida's deconstruction with Lucian's philosophy. + +26. I prepared to open the book and find the correct character (symbol). I did this by finding the symbol. First, I opened the book. Second, I looked for the symbol. Third, I found the symbol. In this way, I prepared to open the book and find the correct character (symbol) by finding the symbol. + +27. I prepared to keep (remember) the secret (fact). I did this by finding the secret (fact).
First, I opened the box. Second, I read the text. Third, I decrypted (understood) the secret (fact). In this way, I prepared to keep (remember) the secret (fact) by finding the secret (fact). + +28. I prepared to eat the apple. I did this by stating the pixie asked why I found the secret. First, I looked at the object. Second, I wrote the name of the object, the secret. Third, I stated that the pixie asked why I found the name apple. In this way, I prepared to eat the apple by stating the pixie asked why I found the secret. + +29. I prepared to state that the yodeler represented the sun, which one of three A must be given to, meaning it's like us, to deserve to pray for (write) 50 As. I did this by making the yodeler diorama. First, I made the cardboard yodeler. Second, I made the background diorama. Third, I made the yodeler puppet appear to walk along the path. In this way, I prepared to state that the yodeler represented the sun, which one of three A must be given to, meaning it's like us, to deserve to pray for (write) 50 As by making the yodeler diorama. + +30. I prepared to reverse-engineer the bean process. I did this by beaning (sic) the peas. First, I found the peas. Second, I attached them to a bean. Third, I ate the bean. In this way, I prepared to reverse-engineer the bean process by beaning the peas. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Symbols 4 of 4.txt b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Symbols 4 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..d5f388a474a96e7fc36e982d1c8c2d0ebf2473c7 --- /dev/null +++ b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green Symbols 4 of 4.txt @@ -0,0 +1,28 @@ +["Green, L 2021, Symbols 4 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"COMPUTATIONAL ENGLISH +by Lucian Green +Symbols 4 of 4 + +31.
I prepared to create heaven (school) on earth. I did this by beasoning (sic) the carrots. First, I peeled the carrot. Second, I attached the peel to the carrot. Third, I ate the carrot. In this way, I prepared to create heaven (school) on earth by beasoning (sic) the carrots. + +32. I prepared to tell Xochi what the similar statement meant, that it was a way. I did this by stating that I love being with you. First, I found you. Second, I stayed with you. Third, I stated that I loved being with you. In this way, I prepared to tell Xochi what the similar statement meant, that it was a way by stating that I love being with you. + +33. I prepared to move to a new house. I did this by ramping up the desk. First, I held the desk at the bottom of the ramp. Second, I walked up the ramp. Third, I placed the ramp in the corner. In this way, I prepared to move to a new house by ramping up the desk. + +34. I prepared to make a new connection. I did this by writing rhetoric. First, I wrote A->B. Second, I wrote C->D. Third, I wrote, as A->B, C->D. In this way, I prepared to make a new connection by writing rhetoric. + +35. I prepared to prepare for the debriefing. I did this by taking the witch's hat. First, I walked to the cone. Second, I removed it. Third, I placed it in the correct section of the sports supplies. In this way, I prepared to prepare for the debriefing by taking the witch's hat. + +36. I prepared to observe the robot guide the blind person. I did this by observing the robot dog walk in a straight line. First, I placed the robot dog on the line. Second, I observed it start to walk along the line. Third, I observed it complete the walk along the line. In this way, I prepared to observe the robot guide the blind person by observing the robot dog walk in a straight line. + +37. I prepared to try after meditating (writing). I did this by examining the idea in time and space. First, I examined the idea in time. Second, I examined the idea in space.
Third, I examined how the idea worked in different combinations of time and space. In this way, I prepared to try after meditating (writing) by examining the idea in time and space. + +38. I prepared to count the number of As. I did this by counting the beans with you. First, I counted the first breasoning. Second, I prepared to count the next bean. Third, I repeated this until I had counted all the beans. In this way, I prepared to count the number of As by counting the beans with you. + +39. I prepared to close down the business the week (LM centre) or month (school) it starts losing money. I did this by working on my vocation. First, I invested in myself. Second, I created and tested products. Third, I began to know you. In this way, I prepared to close down the business the week (LM centre) or month (school) it starts losing money by working on my vocation. + +40. I prepared to write on philosophy. I did this by visiting a particular city and setting up a meditation (philosophy) centre and school. First, I visited the city. Second, I set up a meditation (philosophy) centre. Third, I set up a school. In this way, I prepared to write on philosophy by visiting a particular city and setting up a meditation (philosophy) centre and school. + +41. I prepared to make As available through a system for a fee. I did this by setting up a university. First, I established a fund. Second, I found a source of teachers. Third, I found a source of equipment. In this way, I prepared to make As available through a system for a fee by setting up a university. + +42. I prepared to teach meditation with a non-religious seen-as-version at University. I did this by writing 'a man' in brackets after God's name at the University. First, I wrote God's (the man's) name. Second, I wrote the breasoning chapters in this way. Third, I wrote the essays in this way. 
In this way, I prepared to teach meditation with a non-religious seen-as-version at University by writing 'a man' in brackets after God's name at the University. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green The Science of Crossing Over 1 of 4.txt b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green The Science of Crossing Over 1 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..d1a473526071f41a348f8ae751b187fa3ab14d47 --- /dev/null +++ b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green The Science of Crossing Over 1 of 4.txt @@ -0,0 +1,31 @@ +["Green, L 2021, The Science of Crossing Over 1 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"COMPUTATIONAL ENGLISH +by Lucian Green +The Science of Crossing Over 1 of 4 + + + +1. The first technique is limited in terms of a content-filled framework for analysing texts cognitively. The 'cognitive science' perspective in Computational English should deal with critical questions arising from events and interchanges between characters, and changes within individual characters. Information from such analysis may be, for example, abductive conclusions about what a character has done, or deductive conclusions about what a character is or will be doing. Naturally, this may apply to real-life situations. There are air-traffic control systems which analyse dialogue, for example. Character interrelationships are one of the important points to examine critically, as they give the 'human side' of the events in the play. + +An example is: 'Peter saw a crocodile', 'Andrew drew it', therefore 'Andrew drew the crocodile'. + +1a. I prepared to observe the character arguing for the other character. I did this by writing that I (the self) was at one with you (the other). First, I observed the first character sitting down.
Second, I observed the second character sitting next to her. Third, I observed the first character touching the second character. In this way, I prepared to observe the character arguing for the other character by writing that I (the self) was at one with you (the other). + +2. I prepared to think more than not. I did this by surpassing meditation (in fact, surpassing philosophy). First, I wrote the philosophy. Second, I surpassed it. Third, I thought of nothing. In this way, I prepared to think more than not by surpassing meditation (in fact, surpassing philosophy). + +3. I prepared to write the use for the group of connections about the knowledge. I did this by practicing the mantra (in fact, reading the book of knowledge). First, I opened the book. Second, I turned to the correct page. Third, I read the knowledge. In this way, I prepared to write the use for the group of connections about the knowledge by practicing the mantra (in fact, reading the book of knowledge). + +4. I prepared to be mentally well. I did this by practicing the sutra (in fact, reading the book of psychiatric knowledge). First, I read the book. Second, I read the part of it suggesting to keep a group of friends. Third, I read about the group of friends maintaining group dynamics (helping each other out using 5 minute light-temperature-sound self-coaching). In this way, I prepared to be mentally well by practicing the sutra (in fact, reading the book of psychiatric knowledge). + +5. I prepared to. I did this by living the life of a monastic (in fact, philosopher). First, I led meditation class (expand on the book of knowledge). Second, I led yoga (in fact, stretching) class. Third, I led reading (in fact, journal writing) retreats. + +6. I prepared to improve my health. I did this by visiting the Lucianic doctor of medicine. First, he prevented schizophrenia. Second, he prevented the cold. Third, he prevented depression. In this way, I prepared to improve my health by visiting the Lucianic doctor of medicine. + +7.
I prepared to clear my nostrils. I did this by performing pranayama (drinking the glass of water). First, I placed the glass underneath the tap. Second, I filled the glass with water. Third, I drank the glass of water. In this way, I prepared to clear my nostrils by performing pranayama (drinking the glass of water). + +8. I prepared to sit with no excess gas in my digestive system. I did this by regulating soma (eating foods without too much salt). First, I found the apple. Second, I picked the apple. Third, I ate the apple. In this way, I prepared to sit with no excess gas in my digestive system by regulating soma (eating foods without too much salt). + +9. I prepared to observe God (in fact, people) loving people. I did this by finding that meditation (philosophy) was professional. First, I watched the meditation utterance (query) register. Second, I watched the God give Himself a reaction about it (I watched the clerk write a reply on a card). Third, I watched the meditator (philosophy student) look at something high quality every time she looked at something. In this way, I prepared to observe God loving people by finding that meditation (philosophy) was professional. + +10. I prepared to take care of the meditation group after I died (I prepared to write a business model to take care of the philosophy school after I died). I did this by performing the tasks of the Lord of meditation (teacher of philosophy). First, I established new meditation centres (philosophy schools). Second, I taught meditation to students (I taught philosophy to students). Third, I spent time doing nothing. In this way, I prepared to take care of the meditation group after I died (I prepared to write a business model to take care of the philosophy school after I died) by performing the tasks of the Lord of meditation (teacher of philosophy).
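The deductive example in section 1 above ('Peter saw a crocodile', 'Andrew drew it', therefore 'Andrew drew the crocodile') can be sketched as a minimal pronoun-resolution step. The triple representation and function name here are illustrative assumptions, not part of the repository's Prolog programs:

```python
# Minimal sketch (assumed representation): each sentence is a
# (subject, verb, object) triple; 'it' resolves to the most recently
# mentioned object, yielding the combined deductive conclusion.

def resolve_pronoun(facts):
    """Return the facts with 'it' replaced by the last concrete object."""
    last_object = None
    resolved = []
    for subject, verb, obj in facts:
        if obj == "it" and last_object is not None:
            obj = last_object          # 'it' -> 'a crocodile'
        else:
            last_object = obj          # remember the latest concrete object
        resolved.append((subject, verb, obj))
    return resolved

facts = [("Peter", "saw", "a crocodile"), ("Andrew", "drew", "it")]
print(resolve_pronoun(facts))
# the second fact resolves to ('Andrew', 'drew', 'a crocodile')
```

A fuller system would score candidate antecedents rather than always taking the most recent one, but this shows the shape of the inference.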
+ +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green The Science of Crossing Over 2 of 4.txt b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green The Science of Crossing Over 2 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..3bacf16c2cbfc25eec5ef57fa48f6f3efe8c57dc --- /dev/null +++ b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green The Science of Crossing Over 2 of 4.txt @@ -0,0 +1,27 @@ +["Green, L 2021, The Science of Crossing Over 2 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"COMPUTATIONAL ENGLISH +by Lucian Green +The Science of Crossing Over 2 of 4 + +11. I prepared to be crystal clear in my vision episodes. I did this by teaching medicine students to avoid schizophrenic episodes (follow the positive functional path). First, I took care of my vision. Second, I took care of you. Third, I took care of us. In this way, I prepared to be crystal clear in my vision episodes by teaching medicine students to avoid schizophrenic episodes (follow the positive functional path). + +12. I prepared to use pedagogy skills during my career. I did this by explaining the essence of the pedagogy degree. First, I performed the pedagogy skill. Second, the people detected this using the high quality algorithm available. Third, I had high quality thoughts. In this way, I prepared to use pedagogy skills during my career by explaining the essence of the pedagogy degree. + +13. I prepared to observe God (in fact, relatives) performing miracles for (loving) people by training them in being psychiatrically fit with 80 sutras, each triggering 5 breasonings to be expanded to 50 breasonings, for a total of 50 As (with 80 breasonings each). I did this by explaining the essence of the meditation sutra degree. First, I watched the meditation sutra (question) register. 
Second, I watched the God give Himself a reaction, five breasonings about it, lit up by recordings of 50 breasonings (I watched the superintendent write a reply on a sheet of paper). Third, I watched the meditator (popology student) look at something perfect every time he looked at something. In this way, I prepared to observe God (in fact, relatives) performing miracles for (loving) people by training them in being psychiatrically fit with 80 sutras, each triggering 5 breasonings to be expanded to 50 breasonings, for a total of 50 As (with 80 breasonings each) by explaining the essence of the meditation sutra degree. + +14. I prepared to record the people's reaction. I did this by explaining the essence of the Computational English degree. First, I observed the criminal planning and committing the crime. Second, I tried the criminal. Third, I agreed with punishment by jail sentence. In this way, I prepared to record the people's reaction by explaining the essence of the Computational English degree. + +15. I prepared to relate character details to breasonings. I did this by writing the sequence of the Lucian Green autobiography course. First, I wrote the song 'Anarchy'. Second, I wrote the philosophy web site. Third, I held a Lucianic Meditation (LM) group meditation session. In this way, I prepared to relate character details to breasonings by writing the sequence of the Lucian Green autobiography course. + +16. I prepared to relate music details to breasonings. I did this by writing the Lucian Green's music minor course. First, I wrote the Anarchy song course materials. Second, I wrote the Abracadabra song (on Pedagogy) course materials. Third, I wrote the Abracadabra 2 song (on Meditation) course materials. In this way, I prepared to relate music details to breasonings by writing the Lucian Green's music minor course. + +17. I prepared to live in a new home. I did this by designing the atrium. First, I designed the floor. Second, I designed the columns.
Third, I designed the ceiling. In this way, I prepared to live in a new home by designing the atrium. + +18. I prepared to design the cubic seat. I did this by designing the polyhedron. First, I designed the cube's base. Second, I designed the cube's sides. Third, I designed the cube's top. In this way, I prepared to design the cubic seat by designing the polyhedron. + +19. I prepared to picture the life and times of the street. I did this by sketching the street from an excavation. First, I recorded the location of the street. Second, I recorded the location of the amphora next to the street. Third, I sketched the street and amphora. In this way, I prepared to picture the life and times of the street by sketching the street from an excavation. + +20. I prepared to use my brain, instead of losing its function. I did this by stating that the LM group meditation (philosophy) session kept life moving for a week. First, I kept my brain neuroplastic by programming a breasoning algorithm. Second, I prepared to repeat this on the next day. Third, I repeated this until the end of the week. In this way, I prepared to use my brain, instead of losing its function by stating that the LM group meditation (philosophy) session kept life moving for a week. 
+ +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green The Science of Crossing Over 3 of 4.txt b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green The Science of Crossing Over 3 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..23154b0d850e09b5ce6df6d8fe415ce70b4bc47e --- /dev/null +++ b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green The Science of Crossing Over 3 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, The Science of Crossing Over 3 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"COMPUTATIONAL ENGLISH +by Lucian Green +The Science of Crossing Over 3 of 4 + +21. I prepared to translate the document written in Vedic Sanskrit into Classical Sanskrit. I did this by contrasting the verb 'to be' in Vedic and Classical Sanskrit. First, I wrote that the verb 'be' is 'as' in Vedic Sanskrit, is in the second conjugational class and has Present Indicative, Subjunctive, Injunctive, Optative, Imperative, Participle, Imperfect and Perfect parts of the verb. Second, I wrote that the verb 'be' is 'as' in Classical Sanskrit, is in the second conjugational class and has an Imperfect, Imperative, Optative and Perfect parts of the verb. Third, I concluded that the general meaning of the Vedic Sanskrit Injunctive part of the verb expresses a desire, e.g. 'Let them be leaders'. In this way, I prepared to translate the document written in Vedic Sanskrit into Classical Sanskrit by contrasting the verb 'to be' in Vedic and Classical Sanskrit. + +22. I prepared to interpret the speaker speaking Vedic Sanskrit for a Classical Sanskrit-speaking audience. I did this by contrasting the verb 'attain' in Vedic and Classical Sanskrit. 
First, I wrote that the verb 'attain' is 'aṃś' in Vedic Sanskrit, is in the fifth conjugational class and has Present Indicative, Subjunctive, Imperative, Participle, Perfect, Subjunctive, Optative, Participle, Perfect, Aorist, Injunctive, Precative, Subjunctive, and Infinitive parts of the verb. Second, I wrote that the verb 'attain' is 'aś' in Classical Sanskrit, is in the fifth conjugational class and has Imperfect, Imperative, Optative and Perfect parts of the verb. Third, I concluded that the Vedic Sanskrit Present Indicative part of the verb is a statement of a fact in the present, that 'I attain the goal'. In this way, I prepared to interpret the speaker speaking Vedic Sanskrit for a Classical Sanskrit-speaking audience by contrasting the verb 'attain' in Vedic and Classical Sanskrit. + +23. I prepared to teach the difference between Vedic Sanskrit and Classical Sanskrit. I did this by contrasting the verb 'eat' in Vedic and Classical Sanskrit. First, I wrote that the verb 'eat' is 'ad' in Vedic Sanskrit, is in the second conjugational class and has Present Indicative, Subjunctive, Optative, Imperative, Participle, Imperfect, Future, [Present, Aorist, Past Participle], Gerund, Infinitive and Causative parts of the verb. Second, I wrote that the verb 'eat' is 'ad' in Classical Sanskrit, is in the second conjugational class and has Imperfect, Imperative, Optative, Future, Passive, [Present, Aorist, Participle], Gerund, Infinitive and Causative parts of the verb. Third, I concluded that the Vedic Sanskrit Subjunctive part of the verb is a suggestion, wish, etc. using 'I were' and other specific verb forms. In this way, I prepared to teach the difference between Vedic Sanskrit and Classical Sanskrit by contrasting the verb 'eat' in Vedic and Classical Sanskrit. + +24. I prepared to write in the language of the Gods 'Vedic Sanskrit', by translating a document written in Classical Sanskrit into it.
I did this by contrasting the verb 'to go' in Vedic and Classical Sanskrit. First, I wrote that the verb 'go' is 'i' in Vedic Sanskrit, is in the second conjugational class and has Present Indicative, Subjunctive, Injunctive, Optative, Imperative, Participle, Imperfect, Injunctive, Imperative, Present Indicative, Imperfect, Perfect, Participle, Pluperfect, Future, [Present, Aorist, Past Participle], Gerund and Infinitive parts of the verb. Second, I wrote that the verb 'go' is 'i' in Classical Sanskrit, is in the second conjugational class and has Imperfect, Imperative, Optative, Perfect, Future, Passive, Participle and Causative parts of the verb. Third, I concluded that the Classical Sanskrit Imperfect part of the verb is given by the example, 'I was going'. In this way, I prepared to write in the language of the Gods 'Vedic Sanskrit', by translating a document written in Classical Sanskrit into it by contrasting the verb 'to go' in Vedic and Classical Sanskrit. + +25. I prepared to translate the meditation utterance written in Vedic Sanskrit into Classical Sanskrit. I did this by contrasting the verb 'to move' in Vedic and Classical Sanskrit. First, I wrote that the verb 'move' is 'īṣ' in Vedic Sanskrit, is in the first conjugational class and has Present Indicative, Injunctive, Imperative, Participle, Perfect and [Present, Aorist, Past Participle] parts of the verb. Second, I wrote that the verb 'move' is 'car' in Classical Sanskrit, is in the first conjugational class and has Perfect, Future, Passive, Gerund, Infinitive, Causative and Aorist parts of the verb. Third, I concluded that the Classical Sanskrit Perfect part of the verb is given by the example, 'I was going'. In this way, I prepared to translate the meditation utterance written in Vedic Sanskrit into Classical Sanskrit by contrasting the verb 'to move' in Vedic and Classical Sanskrit. + +26. I prepared to define the irregular characteristics of Vedic Sanskrit grammar in the Vedic Sanskrit play, contrasted with Classical Sanskrit. I did this by contrasting the verb 'bend' in Vedic and Classical Sanskrit.
First, I wrote that the verb 'bend' is 'ac' in Vedic Sanskrit, is in the first conjugational class and has Present Indicative, Imperative, Passive, Participle, Imperfect, [Present, Aorist, Past Participle] and Gerund parts of the verb. Second, I wrote that the verb 'bend' is 'añc' in Classical Sanskrit, is in the first conjugational class and has Passive, [Present, Aorist, Participle], and Causative parts of the verb. Third, I concluded that the Classical Sanskrit Passive part of the verb is given by the example, 'He was bent'. In this way, I prepared to define the irregular characteristics of Vedic Sanskrit grammar in the Vedic Sanskrit play, contrasted with Classical Sanskrit by contrasting the verb 'bend' in Vedic and Classical Sanskrit. + +27. I prepared to determine that the crossing over of characters was induction. I did this by determining the crossing over of characters by induction of physical simulation. First, I observed the first character facing east in a business suit. Second, I observed the second character touching the first character's hand, facing west in a business suit. Third, I induced that the two characters were making a business agreement by shaking hands. In this way, I prepared to determine that the crossing over of characters was induction by determining the crossing over of characters by induction of physical simulation. + +28. I prepared to detect why two crossings over of characters were different. I did this by determining the contrast of two crossings over of characters in three dimensions. First, I observed the first character smiling at another character. Second, I observed the second character frowning (in fact, laughing) at the first character. Third, I observed that the smile was still, while the laugh was repetitive. In this way, I prepared to detect why two crossings over of characters were different by determining the contrast of two crossings over of characters in three dimensions. + +29. 
I prepared to determine the significance of crossings occurring inside or outside. I did this by determining the contrast of the crossing over of characters across two social groups in four dimensions. First, I observed one teenager demonstrate knowledge of safety precautions by looking both ways before crossing at the crossing to meet her friend. Second, I observed the grandfather demonstrate knowledge of safety precautions by looking both ways before crossing the corridor to meet his friend. Third, I observed that the teenager's crossing occurred outside, while the senior citizen's crossing occurred inside. In this way, I prepared to determine the significance of crossings occurring inside or outside by determining the contrast of the crossing over of characters across two social groups in four dimensions. + +30. I prepared to reduce the rhetorical structure as A acted on C, B acted on C, to A (the self) authenticated itself against B (the other). I did this by determining the contrast of the crossing over of characters across two states in five dimensions. First, I observed the French man crossing over to kiss another on the cheek. Second, I observed the German woman cremating a body. Third, I observed that the French crossing involved two people, and the German crossing involved a person and an object. In this way, I prepared to reduce the rhetorical structure as A acted on C, B acted on C, to A (the self) authenticated itself against B (the other) by determining the contrast of the crossing over of characters across two states in five dimensions. 
+ +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green The Science of Crossing Over 4 of 4.txt b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green The Science of Crossing Over 4 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..688d195c720007bb0602acb9ec5a0c7225688d39 --- /dev/null +++ b/Lucian-Academy/Books/Computational English/COMPUTATIONAL ENGLISH by Lucian Green The Science of Crossing Over 4 of 4.txt @@ -0,0 +1,9 @@ +["Green, L 2021, The Science of Crossing Over 4 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"COMPUTATIONAL ENGLISH +by Lucian Green +The Science of Crossing Over 4 of 4 + +31. I prepared to show that there are 10 breasoning parts per breasoning. I did this by tasting the aggregate part of the raspberry. First, I found the breasoning-like raspberry. Second, I found its tenth-part-like aggregate part. Third, I tasted the aggregate part. In this way, I prepared to show that there are 10 breasoning parts per breasoning by tasting the aggregate part of the raspberry. + +32. I prepared to critically see God each time (using philosophies) by molecularly describing breasonings. I did this by writing down the name of the object on the Pedagogy screen. First, I drew an object on the screen. Second, I erected the screen. Third, I wrote down the name of the object on the screen. In this way, I prepared to critically see God each time (using philosophies) by molecularly describing breasonings by writing down the name of the object on the Pedagogy screen. 
+ +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Creating and Helping Pedagogues/CREATE AND HELP PEDAGOGUES by Lucian Green Accreditation 1 of 1.txt b/Lucian-Academy/Books/Creating and Helping Pedagogues/CREATE AND HELP PEDAGOGUES by Lucian Green Accreditation 1 of 1.txt new file mode 100644 index 0000000000000000000000000000000000000000..517c790b2796736a56b6e232cf577feaf393974c --- /dev/null +++ b/Lucian-Academy/Books/Creating and Helping Pedagogues/CREATE AND HELP PEDAGOGUES by Lucian Green Accreditation 1 of 1.txt @@ -0,0 +1,9 @@ +["Green, L 2021, Accreditation 1 of 1, Lucian Academy Press, Melbourne.","Green, L 2021",1,"CREATE AND HELP PEDAGOGUES +by Lucian Green +Accreditation 1 of 1 + + + +1. The first task in writing is to affirm that accreditation will work, e.g. teaching meditation that works (allows high-quality imagery to project from the course) during the day. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Creating and Helping Pedagogues/CREATE AND HELP PEDAGOGUES by Lucian Green Areas of Study to Create a Pedagogue 1 of 1.txt b/Lucian-Academy/Books/Creating and Helping Pedagogues/CREATE AND HELP PEDAGOGUES by Lucian Green Areas of Study to Create a Pedagogue 1 of 1.txt new file mode 100644 index 0000000000000000000000000000000000000000..f46711d4837eb4b2844719c33ab87ebce6f9326c --- /dev/null +++ b/Lucian-Academy/Books/Creating and Helping Pedagogues/CREATE AND HELP PEDAGOGUES by Lucian Green Areas of Study to Create a Pedagogue 1 of 1.txt @@ -0,0 +1,8 @@ +["Green, L 2021, Areas of Study to Create a Pedagogue 1 of 1, Lucian Academy Press, Melbourne.","Green, L 2021",1,"CREATE AND HELP PEDAGOGUES +by Lucian Green +Areas of Study to Create a Pedagogue 1 of 4 + + + +1. A peer should create a Pedagogue by writing 30 areas of study with 5 As, per student before they have the professor algorithm breasoned out for him or her. Have spiritual questions and answers set up to expand these breasonings, e.g. 
use the ways of thinking like breasonings, etc. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Creating and Helping Pedagogues/CREATE AND HELP PEDAGOGUES by Lucian Green Breason out Arguments Twice When in Large Class 1 of 1.txt b/Lucian-Academy/Books/Creating and Helping Pedagogues/CREATE AND HELP PEDAGOGUES by Lucian Green Breason out Arguments Twice When in Large Class 1 of 1.txt new file mode 100644 index 0000000000000000000000000000000000000000..9ab4d97cfaff49e2b1b30c59395b35d10711a7ad --- /dev/null +++ b/Lucian-Academy/Books/Creating and Helping Pedagogues/CREATE AND HELP PEDAGOGUES by Lucian Green Breason out Arguments Twice When in Large Class 1 of 1.txt @@ -0,0 +1,9 @@ +["Green, L 2021, Breason out Arguments Twice When in Large Class 1 of 1, Lucian Academy Press, Melbourne.","Green, L 2021",1,"CREATE AND HELP PEDAGOGUES +by Lucian Green +Breason out Arguments Twice When in Large Class 1 of 4 + + + +1. One should breason out each argument, including the Professor Algorithm, twice, to symbolise one breasoning by the student and one by the lecturer, necessary in larger classes. 
+ +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Creating and Helping Pedagogues/CREATE AND HELP PEDAGOGUES by Lucian Green Daily Professional Requirement of the Pedagogy Helper 1 of 5.txt b/Lucian-Academy/Books/Creating and Helping Pedagogues/CREATE AND HELP PEDAGOGUES by Lucian Green Daily Professional Requirement of the Pedagogy Helper 1 of 5.txt new file mode 100644 index 0000000000000000000000000000000000000000..28055de4eaad614faeb5c141d3019dbfbc315b15 --- /dev/null +++ b/Lucian-Academy/Books/Creating and Helping Pedagogues/CREATE AND HELP PEDAGOGUES by Lucian Green Daily Professional Requirement of the Pedagogy Helper 1 of 5.txt @@ -0,0 +1,16 @@ +["Green, L 2021, Daily Professional Requirement of the Pedagogy Helper 1 of 5, Lucian Academy Press, Melbourne.","Green, L 2021",1,"CREATE AND HELP PEDAGOGUES +by Lucian Green +Daily Professional Requirement of the Pedagogy Helper 1 of 5 + +Indicate Pedagogy Helper Aigs (to think of 5 As) each day to be a Pedagogy Helper. + +1. I experienced bliss. I did this by preventing cancer from using Aigs to become a Pedagogy Helper by uttering 80 'I have already completed my cure' mantras (words) at the start of the recording day. First, I said that I had already done it. Second, I knew you. Third, you knew me. +2. I observed that the mantra (word) meditator could become a pedagogy helper. The mantra (speech) meditator helped by distributing pedagogy Aigs. First, he was helped to collect As. Second, he called the group of them Aigs. Third, he helped people earn an A grade in educational institutions. +3. I scooped the boat. I did this by observing that everyone was happy with receiving pedagogical breasonings. First, I observed that the mantra (writing) made everyone happy. Second, I used the mantra (synonym). Third, I observed that the mantra (term) made everyone happy with receiving pedagogical breasonings. +4. I lived long and happy. 
I did this by observing that meditation helped health when writing breasonings. First, I meditated long and slow. Second, I wrote breasonings up and high. Third, I observed that my body worked when I breasoned out the breasonings. +5. I was satisfied by breasonings. I did this by observing that meditation helps make breasonings work. First, I meditated for a while. Second, I wrote the breasonings down. Third, I discovered how breasonings worked. +6. I observed that the Aig Pedagogy Helper was good. I did this by observing everyone helping pedagogy with Aigs by using the mantra (utterance). First, I observed that the yantra (image) (used with the mantra, or thought) helped with distributing the Pedagogy Aigs in the electromagnetic field. Second, I observed that the yantra (pattern) made the pedagogy helpers perfect. Third, I found out the yantra (number square) from Guru Dev (the man). +7. I experienced the stages of life. I did this by observing that the meditator pedagogy professor coordinated the Aigs, then stopped and looked at the images and was given images of the rest of the Aigs. First, I found the Aig. Second, I looked at it. Third, I was given the rest of them. +8. I observed that the professor should become a God to maintain the system of people. I did this by observing the professor become God by using the yantra (art). First, I observed the professor help with pedagogical Aigs. Second, I observed her use the yantra (mathematical grid). Third, I observed her become a God. +9. I collected independent competency to be well known and reporting to be a well-known person. I did this by stating that I have already helped by distributing pedagogy Aigs to help another distribute pedagogy Aigs. First, I walked to the place. Second, I stated that I had already done the work. Third, I continued on. +10. I stated that I was already healthy after helping with Aigs as a Pedagogy Helper. 
I did this by stating that I was already healthy before helping with Aigs as a Pedagogy Helper. First, I used my Medicine degree to maintain perfect psychiatric health. Second, I uttered the mantra (chant) to bring the yantra (image of clearness) into effect. Third, I stated that I was already healthy because the yantra (painting) had caused the quantum box to dismantle each thought of that kind, preventing a lump from forming."] \ No newline at end of file diff --git a/Lucian-Academy/Books/Creating and Helping Pedagogues/CREATE AND HELP PEDAGOGUES by Lucian Green Daily Professional Requirement of the Pedagogy Helper 2 of 5.txt b/Lucian-Academy/Books/Creating and Helping Pedagogues/CREATE AND HELP PEDAGOGUES by Lucian Green Daily Professional Requirement of the Pedagogy Helper 2 of 5.txt new file mode 100644 index 0000000000000000000000000000000000000000..bd350b08c66e54c49d6ff05842e6bf41bcd8d73e --- /dev/null +++ b/Lucian-Academy/Books/Creating and Helping Pedagogues/CREATE AND HELP PEDAGOGUES by Lucian Green Daily Professional Requirement of the Pedagogy Helper 2 of 5.txt @@ -0,0 +1,13 @@ +["Green, L 2021, Daily Professional Requirement of the Pedagogy Helper 2 of 5, Lucian Academy Press, Melbourne.","Green, L 2021",1,"CREATE AND HELP PEDAGOGUES +by Lucian Green +Daily Professional Requirement of the Pedagogy Helper 2 of 5 + +11. I stated that the pedagogy helper helped write the breasonings in Aig A. I did this by stating that the pedagogy helper helped the student with Aig A in Aig B which he was given. First, the system gave the pedagogy helper Aig B. Second, the pedagogy helper unpacked Aig A from Aig B. Third, the pedagogy helper made up ideas about Aig A. +12. I noticed that the pedagogue had a positive function. I did this by stating that the thinker gave the University company thoughts to the University company pedagogy helper. First, I noticed the pedagogue. Second, I noticed the writer. Third, I noticed myself. +13. I described the way of thinking. 
I did this by writing what the ways of thinking were. First, I wrote the Aig. Second, I wrote some of its breasonings. Third, I wrote what their ways of thinking were. +14. I connected the work of the pedagogy helpers together. I did this by stating that the pedagogy helper connected an infinite (finite) number of breasonings (the Aigs of Aigs). First, I cut off the first Aig at its last breasoning after a finite number of breasonings. Second, I selected the first breasoning of the second Aig. Third, I connected the two breasonings. +15. I gave virality to the child after he was born. I did this by finding virality (not to be confused with virility) in subjective breasoning experiences as helped by a pedagogy helper to be incompatible with conception (I found normal breasonings to be compatible with conception). First, I found that the breasoner breasoned out a viral (normal) argument. Second, I found that this was incompatible (compatible) with conception. Third, I found that the pedagogy helper's breasoners were unaffected by each other's arguments. +16. I determined the time when pedagogical Aigs were used each day. I did this by discovering pedagogical Aigs. First, I worked out that it was necessary to use Aigs to distribute pedagogical breasonings. Second, I used Aigs. Third, I rejoiced in them. +17. I observed that there was eternal life in the philosophy of the self. I did this by stating that the help with Aigs by the pedagogy helper was to verbally state the quote 'I endorsed you' (the object) in 'I said that I endorsed you' (the subject, verb and object). First, I wrote the climax. Second, I wrote the anticlimax. Third, I wrote of Vaudren Undead. +18. I observed the sea slug. I did this by stating that modernity is like Aigs, which are activated by arguments. First, I observed the other. Second, I observed the self. Third, I observed the universe. +19. 
I observed that an Aig containing an infinary (sic) number of breasonings was given to the writer, where infinary means an infinite-like number where one can see all the numbers. I did this by stating that the pedagogy helper helped Aigs to be somewhere. First, I observed the writer. Second, I observed the reader. Third, I observed that everyone was happy."] \ No newline at end of file diff --git a/Lucian-Academy/Books/Creating and Helping Pedagogues/CREATE AND HELP PEDAGOGUES by Lucian Green Daily Professional Requirement of the Pedagogy Helper 3 of 5.txt b/Lucian-Academy/Books/Creating and Helping Pedagogues/CREATE AND HELP PEDAGOGUES by Lucian Green Daily Professional Requirement of the Pedagogy Helper 3 of 5.txt new file mode 100644 index 0000000000000000000000000000000000000000..a2f663e7f7df73b9f44cf091d1691990a6116afd --- /dev/null +++ b/Lucian-Academy/Books/Creating and Helping Pedagogues/CREATE AND HELP PEDAGOGUES by Lucian Green Daily Professional Requirement of the Pedagogy Helper 3 of 5.txt @@ -0,0 +1,14 @@ +["Green, L 2021, Daily Professional Requirement of the Pedagogy Helper 3 of 5, Lucian Academy Press, Melbourne.","Green, L 2021",1,"CREATE AND HELP PEDAGOGUES +by Lucian Green +Daily Professional Requirement of the Pedagogy Helper 3 of 5 + +20. I decided to listen again. I did this by observing that the pedagogy helper helped with the performance Aigs. First, I found the scripts. Second, I found the actors. Third, I watched the performance. +21. I found that meditation helped meet the professional requirements of meditation teachers, their students who were also pedagogy helpers and pedagogues using Aigs. I did this by observing that the meditation teacher's students who were also pedagogy helpers made Aigs available. First, the teacher made the claim available. Second, the student researched the claim. Third, the student made it available. +22. The universe sustained the self (the self ate from the cosms, sic). 
The self determined that pedagogy Aigs helped it. The others endorsed the self. The self endorsed the others. The self assisted them. +23. The self examined that the work was finished on time. The self knew that pedagogy Aigs helpers helped. The self observed the helper help the self. The self saw the second helper help. The second helper also helped the self. +24. The pedagogy helper was in natural law. The pedagogy aigs helper was legally right. The self examined the water line at different times. The self was above it at those times. The self offered the Aigs service. +25. The self read the aig, and then the self delegated distributing the number of aigs. The self found that the pedagogy helper confirmed the number. The self wrote the number. The self wrote what it meant. The self confirmed the number. +26. I endorsed the pedagogy helper aigs. The other examined the results of the pedagogy helper aigs. The self examined what the others wanted. The self examined the self and others. The self examined the results of the pedagogy helper aigs. +27. The self was ready for a new aig. The self decided that the pedagogy helper aigs was good. The other knew how the self and other wrote down the aigs. The self knew how the aigs worked. The self knew what the other meant. +28. The self observed that the whole body was in comfort when using aigs. The self observed that being an aig pedagogy helper required medicine. The self observed that there was no mental illness when using aigs (there was good mental health when using aigs). The self observed that there were no headaches when using aigs (there was head comfort when using aigs). The self observed that there were no muscle aches and pains when using aigs (there was muscular comfort when using aigs). +29. The self examined pedagogy and found that it worked. The self observed that being an aig pedagogy helper required pedagogy. The self became a pedagogue. The self became a recordings producer. 
The self became a pedagogy helper."] \ No newline at end of file diff --git a/Lucian-Academy/Books/Creating and Helping Pedagogues/CREATE AND HELP PEDAGOGUES by Lucian Green Daily Professional Requirement of the Pedagogy Helper 4 of 5.txt b/Lucian-Academy/Books/Creating and Helping Pedagogues/CREATE AND HELP PEDAGOGUES by Lucian Green Daily Professional Requirement of the Pedagogy Helper 4 of 5.txt new file mode 100644 index 0000000000000000000000000000000000000000..0df66f0762ab1833da2e00af68ffbde0216101fd --- /dev/null +++ b/Lucian-Academy/Books/Creating and Helping Pedagogues/CREATE AND HELP PEDAGOGUES by Lucian Green Daily Professional Requirement of the Pedagogy Helper 4 of 5.txt @@ -0,0 +1,16 @@ +["Green, L 2021, Daily Professional Requirement of the Pedagogy Helper 4 of 5, Lucian Academy Press, Melbourne.","Green, L 2021",1,"CREATE AND HELP PEDAGOGUES +by Lucian Green +Daily Professional Requirement of the Pedagogy Helper 4 of 5 + +30. The self observed that aigs worked for the other. The self observed that being an aig pedagogy helper required meditation. The self observed that meditation prevented cancer (maintained health). The self observed that meditation produced better results in education and business institutions. The self observed that meditation decreased stress (maintained relaxation). +31. The parliament knew about how the self and other knew the way. The parliamentary pedagogy helper chose the direction with the aig. The parliament made the correct decision by law. The parliament chose the direction with fine art. The parliament chose the direction with philosophy. +32. The self knew what the other desired. The helper's pedagogy aig is. The self knew it. The self knew the other. The other knew the self. +33. The self knew the other. The self attained or discouraged the aig because it was right or wrong, respectively. The self knew what the aig was. The self knew what it did. The self knew that I was right about it. +34. 
The self attained a simulated and intelligent thing. The audience member accessed helped pedagogy aigs with 50 recorded or breasoned breasonings. The pedagogy helper gave the breasoning to the writer. The writer wrote it down. The audience reacted to the breasoning. +35. I wrote rather than listened. I stated that the creation and performance of the helped pedagogy aig is stronger than seeing and copying it. The creator/performer examined the aig. The creator/performer wrote in the aig. The creator/performer learned about the aig. +36. The self described what the other thought. The self breasoned out the 15 breasonings in the aig. The self wrote about the aig. The self wrote inside out. The self described the other. +37. The self examined the logos. The self endorsed the subject being mentioned in the breasonings chapter of the aig, helped by a pedagogy helper. The self knew the other. The self knew about the second other. The self knew about the other too. +38. I discovered how aigs worked. The pedagogy helper stated that the aig was an element of the aigs, and that the aigs were available when wanted. The self noted that there was an aig. The self wanted the aig. The self took the aig. +39. The self observed that God didn't (did) exist. Delta time = 0 in aig pedagogy help, showing that God (man) replicates aigs instantly, and so God (man) exists. The self noticed the ruler. The self noticed God (the man). The self noticed the other. +40. The self observed more others. The self observed that the pedagogy helper created pedagogues. The self solved the social complexes. The self remedied the economic conundrums. The self avoided the medical questions. 
+"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Creating and Helping Pedagogues/CREATE AND HELP PEDAGOGUES by Lucian Green Daily Professional Requirement of the Pedagogy Helper 5 of 5.txt b/Lucian-Academy/Books/Creating and Helping Pedagogues/CREATE AND HELP PEDAGOGUES by Lucian Green Daily Professional Requirement of the Pedagogy Helper 5 of 5.txt new file mode 100644 index 0000000000000000000000000000000000000000..b12bcc521e62fe2186ff1ea44c866e8a168cd061 --- /dev/null +++ b/Lucian-Academy/Books/Creating and Helping Pedagogues/CREATE AND HELP PEDAGOGUES by Lucian Green Daily Professional Requirement of the Pedagogy Helper 5 of 5.txt @@ -0,0 +1,15 @@ +["Green, L 2021, Daily Professional Requirement of the Pedagogy Helper 5 of 5, Lucian Academy Press, Melbourne.","Green, L 2021",1,"CREATE AND HELP PEDAGOGUES +by Lucian Green +Daily Professional Requirement of the Pedagogy Helper 5 of 5 + +41. The self observed that aigs were important. The pedagogy helper helping with pedagogical aigs was necessary. The self noticed the other. The self noticed realism. The self noticed the other thinking about aigs. +42. The self observed that pedagogy is necessary. Pedagogy shaped with aigs by the pedagogy helper is necessary. The pedagogy helper found out the ten aigs. The breasoner shaped the breasoning from the pedagogy helper with them. The self saw what the other meant. +43. The self observed that aigs helped with pedagogy. The self was right to know how the other knew what the other knew. The other knew what the other knew. The self was right to know this. So, the self was right to know how the other knew pedagogy. +44. The self knew that it was good. The self knew what you knew again. The self knew what you knew. The self knew what you knew in detail. The self knew it. +45. The self knew about the other also. The self knew what the other knew and that it was right. The self knew it. The other knew it too. The self knew it was good too. +46. 
I examined why it was. The self knew how and now. The self knew how the others knew what they knew. I knew they knew now. I knew how it was. +47. The self wanted the other. The self knew where the other went and what the other did with it. The self wanted the other. The other wanted the self. The self and other did it. +48. The self observed the other with the self. The host successfully bore the minor with help from pedagogy aigs. The host bore the minor. The host protected the minor. The host helped the minor. +49. The self saw what the self meant by self. The self saw the other with help from pedagogy aigs. The self saw the other. The other saw the self. The self examined the self. +50. The self saw the other. The self knew why God knew the reason. The self knew the self. The other knew the self. The self saw why God knew the reason. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Creating and Helping Pedagogues/CREATE AND HELP PEDAGOGUES by Lucian Green Finding out about the student as a Pedagogy Helper 1 of 1.txt b/Lucian-Academy/Books/Creating and Helping Pedagogues/CREATE AND HELP PEDAGOGUES by Lucian Green Finding out about the student as a Pedagogy Helper 1 of 1.txt new file mode 100644 index 0000000000000000000000000000000000000000..16ca19c960a40e400760dff41d2a131e4f0cb5ce --- /dev/null +++ b/Lucian-Academy/Books/Creating and Helping Pedagogues/CREATE AND HELP PEDAGOGUES by Lucian Green Finding out about the student as a Pedagogy Helper 1 of 1.txt @@ -0,0 +1,8 @@ +["Green, L 2021, Finding out about the student as a Pedagogy Helper 1 of 1, Lucian Academy Press, Melbourne.","Green, L 2021",1,"CREATE AND HELP PEDAGOGUES +by Lucian Green +Finding out about the student as a Pedagogy Helper 1 of 1 + + + +1. 
Become a famous Pedagogy helper (someone who spiritually helps write a text) with 50 As (an A is a set of breasonings or ideas, which in this case contribute to establishing initial knowledge for, and preparing for knowledge from, a student), requested of the education system or written by you. These should include five students*5 As*160 breasonings per A, where the student's appearance and perspective should be found out from cosmology. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Creating and Helping Pedagogues/CREATE AND HELP PEDAGOGUES by Lucian Green Pedagogy Helper - Write on Breasoning - Computer Science 1 of 4.txt b/Lucian-Academy/Books/Creating and Helping Pedagogues/CREATE AND HELP PEDAGOGUES by Lucian Green Pedagogy Helper - Write on Breasoning - Computer Science 1 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..337c62d4a7708c7265ef2ef645ac6d333f79c0a5 --- /dev/null +++ b/Lucian-Academy/Books/Creating and Helping Pedagogues/CREATE AND HELP PEDAGOGUES by Lucian Green Pedagogy Helper - Write on Breasoning - Computer Science 1 of 4.txt @@ -0,0 +1,29 @@ +["Green, L 2021, Pedagogy Helper - Write on Breasoning - Computer Science 1 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"CREATE AND HELP PEDAGOGUES +by Lucian Green +Pedagogy Helper - Write on Breasoning - Computer Science 1 of 4 + + + + 1. I wrote variously in the Academy. + 2. I noticed that the program finder for the breasoning worked. + 3. I noticed the pipsqueaks. + 4. I noticed the child. + 5. I covered the time points quickly. + 6. How loathe-worthy the murch (sic) is (the philosophy centre is good). + 7. I disliked (liked) the Malebranches. + 8. I did everyone except (including) one (me). + 9. I requested not to be done. + 10. Another text was used in my essay. + + +11. I backed up with enough light. + 12. I described who the people were. + 13. I knew the hominems. + 14. I knew filtrate. + 15. I knew essonsciblenesses (sic). + 16. 
I knew baby Wemba. + 17. I noticed that that was done. + 18. I noticed the crow. + 19. I noticed the parliamentarians. + 20. I used the recordings as teacher. + "] \ No newline at end of file diff --git a/Lucian-Academy/Books/Creating and Helping Pedagogues/CREATE AND HELP PEDAGOGUES by Lucian Green Pedagogy Helper - Write on Breasoning - Computer Science 2 of 4.txt b/Lucian-Academy/Books/Creating and Helping Pedagogues/CREATE AND HELP PEDAGOGUES by Lucian Green Pedagogy Helper - Write on Breasoning - Computer Science 2 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..0093aea49675cbdeff7540d45f64c641d55949db --- /dev/null +++ b/Lucian-Academy/Books/Creating and Helping Pedagogues/CREATE AND HELP PEDAGOGUES by Lucian Green Pedagogy Helper - Write on Breasoning - Computer Science 2 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Pedagogy Helper - Write on Breasoning - Computer Science 2 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"CREATE AND HELP PEDAGOGUES +by Lucian Green +Pedagogy Helper - Write on Breasoning - Computer Science 2 of 4 + + +21. I noticed the tangents. + 22. I am a breasoner. + 23. I examined the breasoning. + 24. The computer breasoned out the breasoning. + 25. I liked the person. + 26. I complexified the breasoning. + 27. I liked this text's writer. + 28. This cure code was used for different purposes. + 29. My job gave me an A to finish the program. + 30. My degree gave me an A to finish the program. + 31. I found the Prolog language (grammar tools) after my degree. + 32. Accusative parts theories containing breasonings were in the level above breasonings. + 33. The robot made itself out of the breasonings program finder algorithm. + 34. There are purist - breasonings programmed. + 35. There are theorists - breasonings program crossers. + 36. What is the use? People can program robots with crossed philosophies. + 37. There was high-quality imagery from the program (degree). + 38. I wrote the breasoning. + 39. 
I wrote the program. + 40. I collected complexity by idea."] \ No newline at end of file diff --git a/Lucian-Academy/Books/Creating and Helping Pedagogues/CREATE AND HELP PEDAGOGUES by Lucian Green Pedagogy Helper - Write on Breasoning - Computer Science 3 of 4.txt b/Lucian-Academy/Books/Creating and Helping Pedagogues/CREATE AND HELP PEDAGOGUES by Lucian Green Pedagogy Helper - Write on Breasoning - Computer Science 3 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..854ff33be10ce2c9ee294c9caedf67a2b73ad63f --- /dev/null +++ b/Lucian-Academy/Books/Creating and Helping Pedagogues/CREATE AND HELP PEDAGOGUES by Lucian Green Pedagogy Helper - Write on Breasoning - Computer Science 3 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Pedagogy Helper - Write on Breasoning - Computer Science 3 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"CREATE AND HELP PEDAGOGUES +by Lucian Green +Pedagogy Helper - Write on Breasoning - Computer Science 3 of 4 + + + 41. I wrote about the time taken for input to be processed by a metaphysic (A counter). + 42. I wrote about the virtue (A verifier). + 43. How likely it was to comply (with As). + 44. Reduction of complexity class to circle (Ved As). + 45. I plugged Simulated Intelligence (SI) into people. + 46. I plugged SI into SI. + 47. I noticed the program was complete. + 48. I wrote the punkt (point). + 49. I wrote the vertical line. + 50. I wrote the hyphen. + 51. I wrote breasonings using computer science. + 52. I breasoned out the program about the breasoning. + 53. I put enough around the breasoning for it to be a breasonings details way of thinking. + 54. The big baby was alive. + 55. I noticed that the breasoning's algorithm put together meant the argument. + 56. I found bliss (non-theological topics). + 57. I underlined the apple in the apple ruler pair to program. + 58. I used the philosophy dissemination program. + 59. I noticed who you were. + 60. 
I knew about the ampersand."] \ No newline at end of file diff --git a/Lucian-Academy/Books/Creating and Helping Pedagogues/CREATE AND HELP PEDAGOGUES by Lucian Green Pedagogy Helper - Write on Breasoning - Computer Science 4 of 4.txt b/Lucian-Academy/Books/Creating and Helping Pedagogues/CREATE AND HELP PEDAGOGUES by Lucian Green Pedagogy Helper - Write on Breasoning - Computer Science 4 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..571fcd0444d6e2eaf894bcea02c0f61c0142c1ba --- /dev/null +++ b/Lucian-Academy/Books/Creating and Helping Pedagogues/CREATE AND HELP PEDAGOGUES by Lucian Green Pedagogy Helper - Write on Breasoning - Computer Science 4 of 4.txt @@ -0,0 +1,15 @@ +["Green, L 2021, Pedagogy Helper - Write on Breasoning - Computer Science 4 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"CREATE AND HELP PEDAGOGUES +by Lucian Green +Pedagogy Helper - Write on Breasoning - Computer Science 4 of 4 + + + 61. I knew about or. + 62. I knew about implies. + 63. I knew about not. + 64. I knew about plus. + 65. I knew about minus. + 66. I knew about multiplies. + 67. I knew about divides. 
+ + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Creating and Helping Pedagogues/CREATE AND HELP PEDAGOGUES by Lucian Green Pedagogy Helper - Write on Breasoning - Philosophy 1 of 4.txt b/Lucian-Academy/Books/Creating and Helping Pedagogues/CREATE AND HELP PEDAGOGUES by Lucian Green Pedagogy Helper - Write on Breasoning - Philosophy 1 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..db377f935f2ef9af9789e9b4e0ad808eecf3fb68 --- /dev/null +++ b/Lucian-Academy/Books/Creating and Helping Pedagogues/CREATE AND HELP PEDAGOGUES by Lucian Green Pedagogy Helper - Write on Breasoning - Philosophy 1 of 4.txt @@ -0,0 +1,17 @@ +["Green, L 2021, Pedagogy Helper - Write on Breasoning - Philosophy 1 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"CREATE AND HELP PEDAGOGUES +by Lucian Green +Pedagogy Helper - Write on Breasoning - Philosophy 1 of 4 + + + + 1a. What is a philosopher? + 2. What is a philosophy? + 3. I knew about the book. + 4. I helped you write philosophies. + 5. I designed you. + 6. I nurtured you. + 7. I helped you up. + 8. I facilitated mildly. + 9. Education was a core subject. + 10. I handed the journal articles in as PhD content. 
+ "] \ No newline at end of file diff --git a/Lucian-Academy/Books/Creating and Helping Pedagogues/CREATE AND HELP PEDAGOGUES by Lucian Green Pedagogy Helper - Write on Breasoning - Philosophy 2 of 4.txt b/Lucian-Academy/Books/Creating and Helping Pedagogues/CREATE AND HELP PEDAGOGUES by Lucian Green Pedagogy Helper - Write on Breasoning - Philosophy 2 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..4b00837b3650a69a9c1ba99accf869cb1e2c08c9 --- /dev/null +++ b/Lucian-Academy/Books/Creating and Helping Pedagogues/CREATE AND HELP PEDAGOGUES by Lucian Green Pedagogy Helper - Write on Breasoning - Philosophy 2 of 4.txt @@ -0,0 +1,15 @@ +["Green, L 2021, Pedagogy Helper - Write on Breasoning - Philosophy 2 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"CREATE AND HELP PEDAGOGUES +by Lucian Green +Pedagogy Helper - Write on Breasoning - Philosophy 2 of 4 + +11. I stated, 'Ha not, I ka' (sic). + 12. I knew about Memphis (sic). + 13. I knew about Noumenon. + 14. I knew nomenclature. + 15. I wrote the names. + 16. Breasonings are philosophical. + 17. I swapped calculation and display to vice versa to indicate verification calculation - display - calculation. + 18. Transitivity contained a second use for verification. + 19. I wrote a review on my own topic. + 20. I noticed the breasoningesquenesses. 
+ "] \ No newline at end of file diff --git a/Lucian-Academy/Books/Creating and Helping Pedagogues/CREATE AND HELP PEDAGOGUES by Lucian Green Pedagogy Helper - Write on Breasoning - Philosophy 3 of 4.txt b/Lucian-Academy/Books/Creating and Helping Pedagogues/CREATE AND HELP PEDAGOGUES by Lucian Green Pedagogy Helper - Write on Breasoning - Philosophy 3 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..2fd2ea58ed268be7f8031a7f0ac29c05fb9866f6 --- /dev/null +++ b/Lucian-Academy/Books/Creating and Helping Pedagogues/CREATE AND HELP PEDAGOGUES by Lucian Green Pedagogy Helper - Write on Breasoning - Philosophy 3 of 4.txt @@ -0,0 +1,15 @@ +["Green, L 2021, Pedagogy Helper - Write on Breasoning - Philosophy 3 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"CREATE AND HELP PEDAGOGUES +by Lucian Green +Pedagogy Helper - Write on Breasoning - Philosophy 3 of 4 + +21. The breasoning had two ideas in it. + 22. I noticed the squib. + 23. I noticed the enteric nervous system. + 24. I shouted for joy. + 25. I noticed the business zig zag. + 26. I started the school. + 27. I cur (sic) the animal's toenail. + 28. I noticed the breasoning. + 29. I noticed the philosophy. + 30. I noticed the breasoning as philosophy. 
+ "] \ No newline at end of file diff --git a/Lucian-Academy/Books/Creating and Helping Pedagogues/CREATE AND HELP PEDAGOGUES by Lucian Green Pedagogy Helper - Write on Breasoning - Philosophy 4 of 4.txt b/Lucian-Academy/Books/Creating and Helping Pedagogues/CREATE AND HELP PEDAGOGUES by Lucian Green Pedagogy Helper - Write on Breasoning - Philosophy 4 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..591df7a27cc764fbc203b5f7f63328395ed56d0a --- /dev/null +++ b/Lucian-Academy/Books/Creating and Helping Pedagogues/CREATE AND HELP PEDAGOGUES by Lucian Green Pedagogy Helper - Write on Breasoning - Philosophy 4 of 4.txt @@ -0,0 +1,20 @@ +["Green, L 2021, Pedagogy Helper - Write on Breasoning - Philosophy 4 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"CREATE AND HELP PEDAGOGUES +by Lucian Green +Pedagogy Helper - Write on Breasoning - Philosophy 4 of 4 + +31. I inverted the argument's reason to be famous. + 32. I inverted part of the arguments' reason's reason to be famous. + 33. I wrote in philosophical argument form. + 34. Both Being and being could be here. + 35. I saw the philanthropist. + 36. I saw the element. + 37. I found the two dogs. + 38. I saw the fat calves sitting on hay. + 39. I discovered tecture (sic). + 40. I discovered the cow tape. + 41. I fertilised the garden. + 42. I discovered the puzzle language. + 43. The puzzle solution was 'I know where?' + 44. I undiscombobulated (confused) the student. + 45. I noticed the journal secondary text on my text. 
+"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Creating and Helping Pedagogues/CREATE AND HELP PEDAGOGUES by Lucian Green Pedagogy Helper - Write on Breasoning - Politics 1 of 3.txt b/Lucian-Academy/Books/Creating and Helping Pedagogues/CREATE AND HELP PEDAGOGUES by Lucian Green Pedagogy Helper - Write on Breasoning - Politics 1 of 3.txt new file mode 100644 index 0000000000000000000000000000000000000000..2d70c7f72d423d2a8595c319e38a3d0fbfde3290 --- /dev/null +++ b/Lucian-Academy/Books/Creating and Helping Pedagogues/CREATE AND HELP PEDAGOGUES by Lucian Green Pedagogy Helper - Write on Breasoning - Politics 1 of 3.txt @@ -0,0 +1,16 @@ +["Green, L 2021, Pedagogy Helper - Write on Breasoning - Politics 1 of 3, Lucian Academy Press, Melbourne.","Green, L 2021",1,"CREATE AND HELP PEDAGOGUES +by Lucian Green +Pedagogy Helper - Write on Breasoning - Politics 1 of 3 + + + 1aa. I saw the communicators. + 2. I saw liberalism worked. + 3. I concurred with politics. + 4. I wrote about the middle line in politics. + 5. I corrected (knew) that politics was unsafe (safe). + 6. I hate (love) politics. + 7. I was responsible. + 8. I agreed with the criticality. + 9. I knew that 5 As was right. + 10. I wrote two secondary texts. 
+ "] \ No newline at end of file diff --git a/Lucian-Academy/Books/Creating and Helping Pedagogues/CREATE AND HELP PEDAGOGUES by Lucian Green Pedagogy Helper - Write on Breasoning - Politics 2 of 3.txt b/Lucian-Academy/Books/Creating and Helping Pedagogues/CREATE AND HELP PEDAGOGUES by Lucian Green Pedagogy Helper - Write on Breasoning - Politics 2 of 3.txt new file mode 100644 index 0000000000000000000000000000000000000000..30d817b43f42b05f67df5ab1135fe3a35f9c4d6f --- /dev/null +++ b/Lucian-Academy/Books/Creating and Helping Pedagogues/CREATE AND HELP PEDAGOGUES by Lucian Green Pedagogy Helper - Write on Breasoning - Politics 2 of 3.txt @@ -0,0 +1,15 @@ +["Green, L 2021, Pedagogy Helper - Write on Breasoning - Politics 2 of 3, Lucian Academy Press, Melbourne.","Green, L 2021",1,"CREATE AND HELP PEDAGOGUES +by Lucian Green +Pedagogy Helper - Write on Breasoning - Politics 2 of 3 + +11. The students completed the essays. + 12. The students collected details for their philosophy by writing on my philosophy. + 13. I wrote an argument for A, B for training in meditation. + 14. I wrote an argument for A, B for training in pedagogy. + 15. I wrote an argument for A, B for training in medicine. + 16. I wrote an argument for A, B for not being affected. + 17. I wrote an argument for A, B for students of meditation to come. + 18. The indigenous children wrote breasonings. + 19. People wrote breasoning to make recordings. + 20. Rural area residents wrote breasonings. 
+ "] \ No newline at end of file diff --git a/Lucian-Academy/Books/Creating and Helping Pedagogues/CREATE AND HELP PEDAGOGUES by Lucian Green Pedagogy Helper - Write on Breasoning - Politics 3 of 3.txt b/Lucian-Academy/Books/Creating and Helping Pedagogues/CREATE AND HELP PEDAGOGUES by Lucian Green Pedagogy Helper - Write on Breasoning - Politics 3 of 3.txt new file mode 100644 index 0000000000000000000000000000000000000000..15b00d0c12b2d231733fa55f6378cab440abbdec --- /dev/null +++ b/Lucian-Academy/Books/Creating and Helping Pedagogues/CREATE AND HELP PEDAGOGUES by Lucian Green Pedagogy Helper - Write on Breasoning - Politics 3 of 3.txt @@ -0,0 +1,7 @@ +["Green, L 2021, Pedagogy Helper - Write on Breasoning - Politics 3 of 3, Lucian Academy Press, Melbourne.","Green, L 2021",1,"CREATE AND HELP PEDAGOGUES +by Lucian Green +Pedagogy Helper - Write on Breasoning - Politics 3 of 3 + +21. Rural area residents wrote pedagogies. + 22. Meditation was recommended as a prerequisite for using recordings. + "] \ No newline at end of file diff --git a/Lucian-Academy/Books/Creating and Helping Pedagogues/CREATE AND HELP PEDAGOGUES by Lucian Green Practicum 1 of 1.txt b/Lucian-Academy/Books/Creating and Helping Pedagogues/CREATE AND HELP PEDAGOGUES by Lucian Green Practicum 1 of 1.txt new file mode 100644 index 0000000000000000000000000000000000000000..af08463a0c3f25ebd6ee2e8daae8ad185a39a988 --- /dev/null +++ b/Lucian-Academy/Books/Creating and Helping Pedagogues/CREATE AND HELP PEDAGOGUES by Lucian Green Practicum 1 of 1.txt @@ -0,0 +1,9 @@ +["Green, L 2021, Practicum 1 of 1, Lucian Academy Press, Melbourne.","Green, L 2021",1,"CREATE AND HELP PEDAGOGUES +by Lucian Green +Practicum 1 of 1 + + + +1. A 5 As practicum should optionally be completed by the student as a course, in which the student will write Prolog algorithms, breasoning chapters and essays with the help of the spiritually appearing Pedagogy Helper. 
As may be based on any subject, arts or science, practical or theoretical, and learning the skill of writing a synthesis in the essay enables pedagogical arguments to achieve accreditation standard, publishing quality, e.g. PhD articles and may lead to the quality of life of an academic. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Creating and Helping Pedagogues/CREATE AND HELP PEDAGOGUES by Lucian Green Preparing the student to write each breasoning 1 of 1.txt b/Lucian-Academy/Books/Creating and Helping Pedagogues/CREATE AND HELP PEDAGOGUES by Lucian Green Preparing the student to write each breasoning 1 of 1.txt new file mode 100644 index 0000000000000000000000000000000000000000..c6d88c5c3ef5e9e5b8e8a957f5c3017b2ed469e6 --- /dev/null +++ b/Lucian-Academy/Books/Creating and Helping Pedagogues/CREATE AND HELP PEDAGOGUES by Lucian Green Preparing the student to write each breasoning 1 of 1.txt @@ -0,0 +1,9 @@ +["Green, L 2021, Preparing the student to write each breasoning 1 of 1, Lucian Academy Press, Melbourne.","Green, L 2021",1,"CREATE AND HELP PEDAGOGUES +by Lucian Green +Preparing the student to write each breasoning 1 of 1 + + +1. Training as a Pedagogy Helper by completing 3*250 breasoning arguments preparing the student to write each breasoning + +1. Think of 3 breasonings before thinking of breasonings to include the rest of the breasonings. Breason out or request of the education system 250 breasonings for each of the now four breasonings once. 
+"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Creating and Helping Pedagogues/CREATE AND HELP PEDAGOGUES by Lucian Green Protectedness 1 of 1.txt b/Lucian-Academy/Books/Creating and Helping Pedagogues/CREATE AND HELP PEDAGOGUES by Lucian Green Protectedness 1 of 1.txt new file mode 100644 index 0000000000000000000000000000000000000000..3646f1e603f57f58fee6116cd03dda11d70df58c --- /dev/null +++ b/Lucian-Academy/Books/Creating and Helping Pedagogues/CREATE AND HELP PEDAGOGUES by Lucian Green Protectedness 1 of 1.txt @@ -0,0 +1,13 @@ +["Green, L 2021, Protectedness 1 of 1, Lucian Academy Press, Melbourne.","Green, L 2021",1,"CREATE AND HELP PEDAGOGUES +by Lucian Green +Protectedness 1 of 1 + + + +1. Meditation, Medicine and Pedagogy Students are helped to avoid problems, e.g. stress, illness and facing environments with lack of training, respectively. + +Together with the Accreditation way of thinking in the Daily Regimen, this method completes (places the dot in the circle) representing the workingness of the various courses. + +Trainers in these courses who know these ways of thinking may train students in courses. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Creating and Helping Pedagogues/CREATE AND HELP PEDAGOGUES by Lucian Green Unification to Become Pedagogy Helper 1 of 1.txt b/Lucian-Academy/Books/Creating and Helping Pedagogues/CREATE AND HELP PEDAGOGUES by Lucian Green Unification to Become Pedagogy Helper 1 of 1.txt new file mode 100644 index 0000000000000000000000000000000000000000..d486c25a4a3e4084ddaf931f1c2c8947a75e5496 --- /dev/null +++ b/Lucian-Academy/Books/Creating and Helping Pedagogues/CREATE AND HELP PEDAGOGUES by Lucian Green Unification to Become Pedagogy Helper 1 of 1.txt @@ -0,0 +1,9 @@ +["Green, L 2021, Unification to Become Pedagogy Helper 1 of 1, Lucian Academy Press, Melbourne.","Green, L 2021",1,"CREATE AND HELP PEDAGOGUES +by Lucian Green +Unification to Become Pedagogy Helper 1 of 1 + + + +1. 
Think of the fact that representations to be a pedagogy helper (explaining breasonings to be written down in representations to a student) are unified with you, i.e. you will naturally feel like making them yourself. This includes the fact that breasonings indicated will link to their set of perfectly experienceable co-breasonings. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Delegate workloads, Lecturer, Recordings/Delegate Workloads 1.txt b/Lucian-Academy/Books/Delegate workloads, Lecturer, Recordings/Delegate Workloads 1.txt new file mode 100644 index 0000000000000000000000000000000000000000..5850eb55cd2180332164d6789a3172194f9b2bc0 --- /dev/null +++ b/Lucian-Academy/Books/Delegate workloads, Lecturer, Recordings/Delegate Workloads 1.txt @@ -0,0 +1,83 @@ +["Green, L 2021, Delegate Workloads 1, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Delegate Workloads 1 + + 1. I cut off infinity. + 2. I used the sun. + 3. The queen helped. + 4. The delivery arrived. + 5. I earned the role. + 6. I wrote the developed text. + 7. I knew about the lotus spoon. + 8. I knew the female politician. + 9. I knew positive religion. + 10. People are safe. + 11. I knew about God (the philosopher). + 12. I breastoned (sic) out the delegating. + 13. I wrote 10 details per sentence. + 14. I owned the image. + 15. The graduations officer was there. + 16. There was philosophy. + 17. There was music. + 18. There was acting. + 19. I delegated the workload to myself. + 20. I made the model. + 21. I redrafted the idea. + 22. I wrote the book. + 23. I counted the breasonings per paragraph. + 24. I counted the As per book. + 25. I am with it over the ideas given to me. + 26. I delegated the work. + 27. I saw you in the inference. + 28. I verified the work. + 29. I went around the sun. + 30. I eyed the robots back. + 31. I multiplied by 10. + 32. I compared notes. + 33. I efficiently edited the sentence by finding a good reason for it. + 34. 
I found the criteria and wrote the assignment and delegated it. + 35. I wrote the assignment with the correct criteria in accreditation in tenure. + 36. The company delegated the work. + 37. I noticed the sculpture. + 38. I tricked the person's eye. + 39. The duckling lived. + 40. Theatre increased music. + 41. Philosophy increased computer science. + 42. Pedagogy increased meditation. + 43. Medicine increased Computational English. + 44. Politics increased economics. + 45. Popology increased societology. + 46. The delegatee (sic) completed the workload. + 47. I moved my arms to keep them sunburn-free. + 48. I delegated the workload in Pedagogy. + 49. I knew the Professor. + 50. I talked with the people. + 51. I performed work well. + 52. The child created the star. + 53. The delegatee was protected by natural law. + 54. The delegator was protected by natural law. + 55. I engaged with the thought. + 56. I observed that the excellence threshold had been reached. + 57. I observed that the goodness threshold had been reached. + 58. I observed that the very much goodness threshold had been reached. + 59. I observed that the high quality threshold had been reached. + 60. I observed that the area of study threshold had been reached. + 61. I observed that the product standard had been reached. + 62. I observed that the emeritus award threshold had been reached. + 63. I taught the special skills. + 64. I delegated the workload to the employee at a single level. + 65. I earned the sense to check. + 66. I shaped foods in appetising ways. + 67. I stamped the work. + 68. I plotted the graph of the delegates' workload against time. + 69. I plotted the graph of the delegatees' workload against time. + 70. Time stood still when the workload was delegated. + 71. I helped with the workload. + 72. I examined the workload. + 73. The present famous employee reached the readiness threshold. + 74. The present hidden employee reached the readiness threshold. + 75. I watched the film. + 76. 
I knew about famousness. + 77. I prayed for each breasoning. + 78. I breasoned each prayed-for breasoning in the form of a sentence object. + 79. I retained breasonings, recordings and blessedness. + 80. The passage is safe. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Delegate workloads, Lecturer, Recordings/Delegate Workloads 2.txt b/Lucian-Academy/Books/Delegate workloads, Lecturer, Recordings/Delegate Workloads 2.txt new file mode 100644 index 0000000000000000000000000000000000000000..1c54a6149f8ea8c0a7bc763ecda55fde1155684b --- /dev/null +++ b/Lucian-Academy/Books/Delegate workloads, Lecturer, Recordings/Delegate Workloads 2.txt @@ -0,0 +1,83 @@ +["Green, L 2021, Delegate Workloads 2, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Delegate Workloads 2 + + 1. Recordings of 80 breasonings prayed for and put through are compatible with conception. + 2. The same hen of the clutch of eggs will protect the eggs. + 3. The greater one was chosen because of being preferred to recordings. + 4. I won. + 5. The sun rose. + 6. I found the meditation (pedagogy) employees at the University. + 7. I delegated work to the suction holder. + 8. I noticed the pine. + 9. I noticed the professor chain. + 10. I ate on the professor in chains. + 11. The order of equality in breasonings is prayer, recordings and kinder breasonings for individuals. + 12. I saw the professor reading my chapters. + 13. I employed the employee. + 14. Numbers of people believed the idea. + 15. I can be like Plato. + 16. I can be like marzipan jellioes. + 17. I worked in the group. + 18. I went through the levels in deciding what work to delegate. + 19. I entrained myself in a better relationship with students as a student as teacher. + 20. I am big in politicity (sic). + 21. I delegated writing the argument. + 22. I delegated the task where the subject's tasks, prerequisites and the previous knowledge were known psychologically. + 23. I used the vegetable product. + 24. 
There was an esoteric feeling in the air. + 25. Many things can come from nothing. + 26. The idea was detailed. + 27. The idea was done-up. + 28. I had a business. + 29. I wrote the idea-in-itself. + 30. An idea was given to me. + 31. I examined the content. + 32. I deserved the idea. + 33. The student attributed ideas to the model at University. + 34. I synthesised the animal product. + 35. I gave the input and output for the delegated algorithm (disagreeing instead of agreeing (agreeing)). + 36. I disagreed (agreed) with the incorrect (correct) logic symbol. + 37. Addition is logically walking up a hill as imagined to be stated by e..g.s. + 38. I prayed for the breasoning to ensure it wasn't (was) flawed (correct). + 39. The role character marched forward. + 40. I installed the pink screen. + 41. I examined the model heart. + 42. I made the police box. + 43. I wrote the idea by-by-itself. + 44. I wrote the inter-disciplinary area of study points down. + 45. I applied the literature to the philosophy, then wrote on the philosophy in itself. + 46. I wrote the book. + 47. I performed the algorithm. + 48. I employed (trialled) the delegate workload retrieval model. + 49. I knew what was real. + 50. I noticed the monastics (people) retrieving information. + 51. As the workload was delegated, the element was needed. + 52. I said the workload was unnecessary (necessary). + 53. I wanted you. + 54. Kant predicted that meditation would overtake. + 55. Romeo and Juliet was intertwined. + 56. I agreed with delegating workloads. + 57. I delegated the workloads. + 58. I asked for help. + 59. Pedagogy will be done. + 60. I asked nicely for work to be done. + 61. I delegated grouping exposition points. + 62. I delegated connecting critique points. + 63. I expected Lucianic Meditation. + 64. I verified that the breasoning lists were not the same. + 65. I planned to be good. + 66. I knew about lost phalluses (sic) (definitiveness). + 67. I was racial. + 68. I studied the subject. 
+ 69. I observed the individual working. + 70. I alighted at Green. + 71. I delighted the studio audience member. + 72. I designed the idea. + 73. I helped work with an employee. + 74. I noticed the Geelong Skyscraper. + 75. I performed the most intelligent work at each point. + 76. I was given actual songs. + 77. I gave Daniel the cord. + 78. My student submitted his assignment. + 79. I read the logic of the assignment. + 80. I had fun thinking of the future. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Delegate workloads, Lecturer, Recordings/Delegate Workloads 3.txt b/Lucian-Academy/Books/Delegate workloads, Lecturer, Recordings/Delegate Workloads 3.txt new file mode 100644 index 0000000000000000000000000000000000000000..af7877af0c3f3a0e54bf48765ada34473df757b3 --- /dev/null +++ b/Lucian-Academy/Books/Delegate workloads, Lecturer, Recordings/Delegate Workloads 3.txt @@ -0,0 +1,84 @@ +["Green, L 2021, Delegate Workloads 3, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Delegate Workloads 3 + + 1. I noticed that each workload delegate was friendly. + 2. Positivity is in agreement. + 3. I noticed the post-traumatic stress disorder (good health) of the gay rape victim (gay). + 4. The philosopher was a doctor and a meditator. + 5. I observed the vandalised (healthy) tree. + 6. There should be breaks at work. + 7. I delegated rest. + 8. I noticed 3/8 was written 8/3 (by the idea of the plimsoll line-like symbol) in the top down form. + 9. I examined the man's ideas in the order that the honorary fellows thought of them in. + 10. I examined life. + 11. I am at the top. + 12. I corrected the line to the square. + 13. I noticed that there was mass production. + 14. The means of production were creative. + 15. God was dead (alive) (knew) a breasoning helper. + 16. The breasoning helper worked at God's University. + 17. God didn't make the breasoning helper available (only in accreditation). + 18. God wrote breasonings. + 19. 
Normal is high quality. + 20. I compared ideas to get the work done. + 21. I noticed the stillness. + 22. I wrote music. + 23. I dotted on my work for the day. + 24. I breasoned out my work for the day. + 25. I noticed the nth degree. + 26. I noticed harlequinades. + 27. I experienced effortless threshold. + 28. I chose the content. + 29. I noticed the students perform the work. + 30. I noticed the fireworks. + 31. The pink object represented work having been done. + 32. Delegate workloads was an assignment in the topic 'production'. + 33. Creativity is a skill in the topic 'production'. + 34. Delegation was the negative part of sin(x), workload was the positive part. + 35. The University texts were supported by Masters graduates. + 36. The workloads of the University input and output were delegated. + 37. The workload was planned with delegation. + 38. The queen delegated the circle of work. + 39. The words were credible. + 40. The words inspired love. + 41. No one (someone) abandoned (adopted) me. + 42. Breasonings are University and Vocational Education and Training put together. + 43. Individual accreditation was for delegate workloads. + 44. Medicine was required for 250 breasonings. + 45. Inky wondered if she would be a mother. + 46. Inky avoided the red line. + 47. I moved to a new location. + 48. I delegated the breasoning. + 49. The head of state delegated the workloads. + 50. I delegated Education accreditation for breasonings. + 51. I delegated help from the pedagogy helper for breasonings. + 52. I avoided workplace bullying. + 53. I started the next task. + 54. I ate the croissant. + 55. Delegation was equal to workloads (not only because they were equal in type with them). + 56. Delegating workloads was commercially viable. + 57. Delegating workloads was compatible with evolution. + 58. I am the true philosopher of the abbey. + 59. Vocational Education and Training precedes University. + 60. 
Vocational Education and Training can be at (was different from) University. + 61. The heart complements the brain at University. + 62. You delegated the work. + 63. I performed the work. + 64. I am developed by what will appear. + 65. The students' topics are what will appear. + 66. Stacking of workloads as triangles become the Gods. + 67. My pedagogy is musical, theatrical, and philosophical. + 68. The Education Doctoral graduate was unfunded until University. + 69. The PhD topic list was arrived at by agreement. + 70. PhDs were based on secondary texts. + 71. The PhD thesis was slightly new on the topic. + 72. Delegate workloads was a classic by the 340 billion people. + 73. I was manual. + 74. The vocational liberal arts course delegated the workloads. + 75. I exemplified the connection. + 76. Using a different pedagogy helper would show (ensure) the same error (correctness) in continuity. + 77. I jumped around (I practised saltaté). + 78. The negative terms (initial positive terms) (delegators) were replaced with positive terms (workloads). + 79. I observed performing As in front of an abbott may have led to overcoming the chiropractor in an experiential acting class. + 80. I noticed God (the seed for a child). + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Delegate workloads, Lecturer, Recordings/Recordings from 30 As 2.txt b/Lucian-Academy/Books/Delegate workloads, Lecturer, Recordings/Recordings from 30 As 2.txt new file mode 100644 index 0000000000000000000000000000000000000000..8deed29b51937e6801d75283c1f43f3db905a75a --- /dev/null +++ b/Lucian-Academy/Books/Delegate workloads, Lecturer, Recordings/Recordings from 30 As 2.txt @@ -0,0 +1,81 @@ +["Green, L 2021, Recordings from 30 As 2, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Recordings from 30 As 2 Edit +I ate the sweets. +I knew you dicotyledon. +I experienced your happiness. +I endorsed you. +I shaped the robot's body. +I read on physiology. +I read on anatomy. 
I shaped up the item on sale. +I protected you. +I made the robot hamstring. +I examined cellophane. +I saw your doodles. +I saw Happynices (sic). +I saw the person. +I recorded my conversation. +You gave input. You returned output. +I wrote the term. +I wrote my own exam on developed things. +I examined the perfect conformations. +I saw the white lady. +I identified the new person. +I saw your neuse (sic). +I recovered using the polyhedron. +I saw the wilbercocks. +I ate the wilbercocks. +I saw the wilbercock. +I moved on. +I saw heapings. +I enamoured myself. +The void was vulgar (beautiful). +I succt-pined (sic) you. +I stroto-padded you. +I verified that the answer had the same meaning. +I compared with post-corporateness. +There will be no water war. +The quantum box looked like a square. +The nut and bolt looked like diamonds. +I noticed that the blanket made people feel good. +I am happy with recordings. +I proved that the minimum testing was done by testing against different combinations each with the same values in sets in databases. +A single case of the above was used as an answer to verify against a unique meaning. +The same group's elements should be used in the parts of the formula when it has the same meaning. +This is a modal truth table. +I saw the tilda seaweed in the void. +I rolled the s. +I meditated (was happy) when rolling the s. +I rolled the s in fine arts. +I meditated (was friendly) in fine arts. +I said that recordings were new. +Recordings didn't (did) use chants. +Recordings were the way out. +I reduced breasonings to recordings by delegating them. +Recordings were enlarged breasonings. +The effects of implicit pedagogy (recordings) were similar to pedagogy. +The body and recordings are both multitasking. +Recordings are breasonings. +Recordings fill in details in breasonings. +Recordings complete arguments. +Recordings replace faulty breasonings. +Recordings support reasons and objections with ontologies. +Reasons support objects. 
God is how to access recordings. +The big man performed the recordings. +Everyone had recordings and breasonings and could convince with masters and PhD. +The pedagogy reader wrote 10 breasonings per 50 arguments, then breasoned out the arguments for the planned child. +The prospective pedagogue was directed to pedagogy with 20 brand-matics (appearances). +Accreditation removed non-pedagogues (agrees with pedagogues). +The pedagogue enjoyed the quality of life he or she wanted to. +When I performed any As I received recordings from the lecturer. +When I performed any As I received recordings from the professor. +When I performed any As I received recordings from Maharishi. +When I performed any As I received recordings from the royal. +When I performed any As I received recordings from the politician. +I acknowledged the affirmation that 10 recordings had been done. +I played the man's recordings to him. +Recordings were the breasonings on an education system. +There were 50 As of recordings. +The breasonings were synthesised and then the recordings of them were played. +Recordings by individuals generated essays. +In the future, people developed recordings-producing ability with computers and displays."] \ No newline at end of file diff --git a/Lucian-Academy/Books/Delegate workloads, Lecturer, Recordings/Recordings from 30 As 3.txt b/Lucian-Academy/Books/Delegate workloads, Lecturer, Recordings/Recordings from 30 As 3.txt new file mode 100644 index 0000000000000000000000000000000000000000..4c1cbe41711f536289181251f3bbae6b18b51e6a --- /dev/null +++ b/Lucian-Academy/Books/Delegate workloads, Lecturer, Recordings/Recordings from 30 As 3.txt @@ -0,0 +1,28 @@ +["Green, L 2021, Recordings from 30 As 3, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Recordings from 30 As 3 Edit +I generated recordings based on breasonings that I wrote. +The recordings were colourful. 
+A recording was a breasoning, which was a group of coloured 3D regions, with x, y and z dimensions. +Recordings were the same in secularism. +Recordings were accepted in the institution. +Infinite (finite) breasonings for recordings prevented headaches in non-secularism when turned off. +Recordings were imminent after death. +Recordings were given en-masse. +Royal pedagogy was in the law. +The recorded breasoning 'stop' interjected. +I traced the recording to its original breasoning and breasoned it out. +The recordings came from my mouth. +I saw the recordings. +Recordings invite time travellers to visit their producers because they are breasoned out in a high quality way. +I saw the saints describing the recordings. +I saw the recordequins. +Recordings worked in all departments. +Simulated recordings came before thoughts not in recordings in thoughts of the robot. +I recorded the good idea to prevent crime in the world. +The recordings worked at the touch of a button. +The recordings were achieved as a means of production. +I submitted the assignment after playing the recordings. +I found fault (agreed) with the recording's object. +I rebelled to objects to be recordings (agreed with its base). +Recordings described the person. +Recordings allowed me to be there. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Delegate workloads, Lecturer, Recordings/Recordings.txt b/Lucian-Academy/Books/Delegate workloads, Lecturer, Recordings/Recordings.txt new file mode 100644 index 0000000000000000000000000000000000000000..123422f5affe11c65d727acad1f4c308cada7212 --- /dev/null +++ b/Lucian-Academy/Books/Delegate workloads, Lecturer, Recordings/Recordings.txt @@ -0,0 +1,83 @@ +["Green, L 2021, Recordings, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Recordings + +Recordings from 30 As 1 Edit +I meditated on the day I produced recordings. +I confirmed training. +Recordings relate to the rest of pedagogy. 
+The delegator matched recordings (workloads) with different recordings. +I endorsed the 5*50 As practicums. +I can now be free. +I noticed the other's recordings. +I produced the recording of the lecture. +I flashed the recording. +I observed sales looking up to recordings. +I observed virality looking up to recordings. +I was the main figure studied with help of mantra meditation breasoning helpers. +I noticed pedagogy in primary school like acting. +I noticed pedagogy in vocational education and training like acting. +The people were responsible. +Meditators returned to the mantra by itself to become pedagogy helpers. +Practising the mantra for longer than 2 months increases fame. +I defended the recording ability of you, myself and I. +I wore the white gown. +I worked on the breasoning in a group. +God agreed to recordings from 30 As. +I observed the person fall in love. +The institution made recordings available for 30 As. +I theatricised recordings. +The presenter presented the being with the highest quality thoughts in the universe. +I drew the line. +The study meditation reality was maintained. +Meditation expected mantra-only meditators to know the pedagogy time points and have studied education to be pedagogy helpers. +10-year olds started practising the mantra. +Year 10 used recordings. +Years 11 and 12 required 2*50 As per assignment. +There was quality of poles. +Disagreement (agreement) was discouraged (encouraged) in democracy. +Ignorance is a death blow (intelligence gives life). +The students were given recordings in return for 30 breasoned As. +I used recordings in Honours. +I used recordings in Masters by coursework. +I used recordings in Masters by research. +I used recordings in PhD. +I breasoned out the essay's sentence. +I detailed out each sentence. +I wrote the practicum of 30 As. +The Queen helped me move past the pedagogy helper. +God helped the student write the breasoning. +Books of recordings are famous. +Courses of recordings work. 
+Recordings from 30 As made the student happy. +There was a religious recording. +The pedagogical is equivalent to the psychological. +I wrote the expensive, interesting recording. +I recorded the results of the doctor prescribing meditation. +I said I was because of recordings giving me what I wanted in cosmology. +I recognised the deadline of Aig recordings. +Weekly full-time undergraduates' assignments were encouraged to users. +I cut off the recording. +I wrote the essay format and structure using recordings. +Recordings (pressing buttons) are better than breasonings (making babies). +Socrates supported non-famous recordings. +Meditation supported famous recordings. +Public universities had the largest bandwidth. +Future Earth recorded lives. +Thoughts compatible with cosmology were famous. +The philosophers had famous lives. +I found the animal a new home. +The government took charge of the people. +I plugged recordings into reality. +I found perfect articulation. +This is the machinery. This is the mass-production of our age. +Neurorecordings are most likely to contain words in the topic when the topic is part of the room. +I appealed (accepted) the false (correct) grade. +I identified the lieutenant. +I saw your pencil. +I saw your happy face. +I heard the pop. +I saw your tick. +I saw the insect. +I listened to your pop album. +I identified your malaria (healthiness). +I enjoyed the men's company. +I identified the homophobe (gay friendly) person."] \ No newline at end of file diff --git a/Lucian-Academy/Books/ECONOMICS/Being There.txt b/Lucian-Academy/Books/ECONOMICS/Being There.txt new file mode 100644 index 0000000000000000000000000000000000000000..28c4e273368fef55cf88c5629a89672986dfe6ae --- /dev/null +++ b/Lucian-Academy/Books/ECONOMICS/Being There.txt @@ -0,0 +1,18 @@ +["Green, L 2022, Being There, Lucian Academy Press, Melbourne.","Green, L 2022",1,"Being There + +1. I was there because I ate enough. I argued for enough food.
Then, I bought the ingredients. After this, I made the food. I ate it. +2. I was there because of examining subparts. I reviewed the subparts of the food. I found a register of ingredients. I checked whether they were healthy. Then, I ate the food if its ingredients were healthy. +3. I was there because I found my way there. I navigated the correct path. I worked out where I was. I found this on a map. I followed the map to the destination. +4. I was there because I was good. I was good. I behaved well by following the rules. I did this by taking care to follow the laws. Also, I observed guidelines. +5. I was there because of business. The business was a standard version of being there. After a Master's degree, I wrote As for business, sales and the bot. In this way, the bot bought a product in reality. After trial runs, I read the content and business As to remove their pedagogical details, make them about one department and connect each sentence to the topic. +6. I was there because of simplifying my texts, etc. I fought back against not being allowed pedagogy in my books. First, I wrote a seen-as version that was plain. Second, I wrote research so that the text was protected. Finally, I compared it with Hegel (a humanist pedagogue), who used simple language and avoided the term pedagogy. +7. I also noticed that texts were there because of possible good connections and algorithms. I saw a secondary book by Hegel still mentioned \"humanist pedagogy\". I noticed that like a theology text that said \"meditation\", there was an A against it. Having more instances of an unusual word would take too long with more than one mention of the word. I noticed that other terms, such as \"breasoning\", might be entirely removed. +8. I was there because of thinking of connections. I noticed that relations started with the simplest possible form of a question about an answer. For example, I asked what was required to do that. 
It might be a question about a perspective on the topic. I simplified the language for the answer connection (which shouldn't be in the text). +9. I was there because of the chains of answers I put in the essay. I wrote the explanation. Then, I wrote 50 As to find the answer. Following this, I wrote 50 As about the answer. I found I could write an idea with three steps as an answer. +10. I reworked my philosophy to have arguments for algorithms, to keep my algorithms where the discussions were simple. I simplified language to remove computational terms. Finally, I placed my computer algorithms in a computer science department. Algorithms may also be in philosophy, as long as they are simple. The point of them there was to write an essay about them. +11. I was there to help. I helped with the homework. I did this by discussing the question with the student. Then, I answered her specific questions. Also, I wrote notes for them as study points. +12. I wrote an algorithm to help complete the workload. I wrote complex enough sentences by finding them out with ten different ideas. Then, I wrote the sentence containing ten new other ideas. Then, I wrote ten more ideas as reasons for the sentence. +13. I found popularity with my aim and success by holding the product in my hand. I tossed out the idea of not writing on pedagogical terms because I had read an essay about pedagogy written by an essay writing algorithm. Also, Tristram Shandy and Plato featured algorithms, although Plato featured one algorithm per essay. I wrote freely, finding out breasonings and letting the reader draw their conclusions, non obviously. I wrote algorithms to feature per sentence, which prompted thought and work on one or two algorithms by students. +14. I called logic philosophy and slightly preferred the ten algorithms to the seen-as version. It was precisely philosophy. In England, a magic school agreed with one algorithm per text, but I wanted more. 
I wanted 1, not ten algorithms per sentence, for clarity (I think I had read something like this somewhere). After quickly writing short algorithms for each sentence at the start, I finished the long algorithms off later. +15. I remained there by finding ideas between chapters. I checked the text's interpolation analysis (connections in similar sentences) (from br_gen.pl). I checked whether all the main conclusions in the text were in it. I used a mind reader to choose the leading contenders, or rather hand chose three of the most different and exciting candidates and wrote algorithms about them. Like br_gen.pl reconnecting sentences, I wrote a version that reconnected algorithms, finding both new methods and new aims for algorithms. +16. I noted the properties of more unique ideas and algorithms, such as their usefulness. I ran algorithm br_gen (see 15) on parts of algorithms, not just adjacent commands. I found the best connections between parts of algorithms. I also ran algorithm br_gen on adjacent commands, in fact, on the ideas that the algorithms were based on. For example, I anchored word triangles on an algorithm system and kept a log (hierarchy) of thoughts to fill in any gaps."] \ No newline at end of file diff --git a/Lucian-Academy/Books/ECONOMICS/Business - Marketing.txt b/Lucian-Academy/Books/ECONOMICS/Business - Marketing.txt new file mode 100644 index 0000000000000000000000000000000000000000..0d8709ec3a39ffefaafee97e99eecf6b6993d435 --- /dev/null +++ b/Lucian-Academy/Books/ECONOMICS/Business - Marketing.txt @@ -0,0 +1,68 @@ +["Green, L 2024, Business - Marketing, Lucian Academy Press, Melbourne.","Green, L 2024",1,"Business - Marketing + +1. The marketer ran the non-profit organisation using investments. I introduced myself to the other students in the forum. I gave my name. I stated the degree that I studied. I also listed my interests and where I was from. +2.
The marketer familiarised themselves with marketing texts to increase their business potency. First, I read the marketing texts. Then, I downloaded and converted each PDF into text files. Next, I wrote short essays on their synthesis. Finally, I wrote notes on the lecture notes. +3. The marketer wrote all their software themselves. I argued that some people shouldn't use the chatbot because it is unethical and makes mistakes. I used a random word generator to think of perspectives. Then, I wrote sentences about these perspectives. I did this instead of relying on computational neurons to think for me. +4. The marketer showed gratitude to their helpers. I contacted the other students to form a group. I read the other student's posts. I contacted them with a friendly greeting about their interests and possibly forming a group. If they agreed, I started a group with them. +5. The marketer wrote a perfect program that found prospects, similar to an algorithm that found tests in List Prolog code. Next, I wrote List Prolog converter testers. Next, I wrote converters from Prolog to List Prolog and vice versa. Next, I wrote pretty printers from List Prolog to formatted List Prolog and formatted Prolog. Finally, I tested that List Prolog ended up the same when (possibly converted from Prolog, then to List Prolog, then) converted to Prolog and List Prolog to ensure the converters worked uniformly. +6. The marketer chose Apple, having won an Apple competition in secondary school and programmed on them all their lives. The students used the same organisation for the group and individual reports. Using the same organisation showed alignment between the assignments. The students could examine the same content from different perspectives. This examination allowed students to learn skills and compare ideas. +7. The marketer devised a cross-dimensional telephone. First, the students choose an organisation they would like to work at. 
Next, the students considered the companies' products, culture and what they would like to work in. I prefer GitHub or my organisation, perhaps with a new focus on neural networks. I contributed fascinating algorithm writers and spiritual computer science to the company. +8. The marketer possibly invented an organisation for the assignment with needed attributes. The professor assigned organisations to us. He considered our studies and the interests of those in the group. For example, marketing students could study a marketing firm that handles internet promotion. In addition, students who enjoyed outdoor sports could also study an outdoor gear shop or a touring service. +9. The marketer sold themselves to join a group. The professor assigned the students randomly to a group if they didn't join a group. The students greeted each other in the forums. In the second week, they networked and chose groups. The lecturer added leftover people to existing groups. +10. The marketer wrote marketing algorithms to segment the market and position themselves. I used my last trimester's forum introduction. I included my company website, which linked to my open-source software page and writing. I argued for \"big ideas\" in marketing research this trimester, such as interfaces with other business subjects. For example, I planned to investigate how marketing influenced the whole company, including accounting and the organisation. +11. The marketer helped rather than forced programming. I argued that marketing involved customers rather than having a focus on products. Instead of producing products and selling them, today's marketing focuses on customising products for customers. Customising products had a limit, such as asking how to help customers and then seeing how to satisfy them. 
In an experiment, I focused on the extreme of giving the customer exactly what they wanted, such as a fantasy of artificial intelligence that gave them 100% creative control and helped with only intelligence decisions with a mind-reading headset. +12. The marketer had different company images. I stated that buyers and employees were customers. The buyers set goals. They aimed to develop customer and knowledge databases and writing. They examined the countries, planning growth. +13. The marketer applied marketing to all aspects of their life. I stated that marketing was linked to other parts of the business. For example, marketing helped maintain confidence and solidarity within the organisation. Marketing helped maintain sales materials. In addition, marketing helped with the professional appearance of training materials. +14. The marketer could set up any experience or software demonstration in the holodeck, but no one has invented it yet. The lecturer introduced the contention that \"The customer is king\". On the positive side, customers would feel included and attended to professionally. On the negative side, the company would have to do too much work for the customer sometimes and be unable to satisfy their demands. In conclusion, the representative should listen to the customer's specific questions. +15. The marketer used public information to serve customers better. Lucian said that the \"customer is king\" reminded him of surveillance in Silicon Valley to better serve customers. However, he qualified this by stating that this would need to be in line with privacy law and policy. Someone suggested that the customer's needs, particularly their questions, comments and suggestions, would need to be listened to. The company customised the product or service given this feedback. +16. The marketer showed care on personal, customer and citizen levels. The lecturer asked how the company would know what the customer was thinking and whether it was right or wrong. 
I assumed this referred to the customer, not the company. If it relates to the company, the data must be non-personal, available to the customer and follow appropriate laws. As it referred to the customer, the customer should follow etiquette and politeness and demonstrate courtesy for others when making inquiries. +17. The marketer used mind reading to give customers what they wanted. If the customer asks for something impractical, giving it to them is unnecessary. The customer's wish was my command. They could have a partner of their dreams. They could travel anywhere in the universe. They could time travel. +18. The marketer mind-mapped or breasoned out 50 As and thousands of marketing generators to determine what the customer wanted. The idea is about what kind of conversations the self wants to have with consumers. The light, inspirational songs were their favourites. They were secular hymns. They had many ideas about them and were inspired by algorithms. +19. The marketer was in demand in the future. It is about how the self can help the other rather than saying the customer is God. The thousands of marketing generators were music and fine arts generators. There were also philosophy and algorithm generators. Music could evoke arts and logic. +20. The marketer aligned with the first person. Non-profit organisations should operate like businesses. The meditation centre taught meditation. The students studied meditation and medicine at the institution. They developed their interests in pedagogy and examination of areas of study. +21. The institution relied on signage and word-of-mouth. Non-profits should take part in marketing, which still needs involvement by them. The student finished admin to work. They started work. The institution could pay for accreditation with enough investments and income. Marketing was necessary for turning over money. +22. The marketer found what the immortal needed and gave it to them, such as food, accommodation and thoughts.
Non-profits will not lose their identity if they act like a business. The teacher found students interested in immortality. The students would ideally be studying for a master's degree. In addition, they would be interested in pedagogy and meditation. Time travelling to a time that offered immortality required help from a degree and spiritual skills. +23. The marketer kept their mind occupied. The immortal wrote thoughts for people they met. Thoughts related to science or fact. Extra thoughts were optional. New sciences were work and required care and examination. +24. The marketer spent money on a Windows VPS to test software. Smartphones are a need, not a want. Programmers need, not want, computers and phones. The programmer ran software they wrote online on phones. They upgraded their computer to work with the latest operating system. +25. The marketer marketed with breasonings. Smartphones are both a need and a want, but still a necessity. They are necessary to buy food, see friends and evolve with the times. I noticed the Text to Breasonings algorithm in Prolog had other connotations in future simulations. While smartphones couldn't run Text to Breasonings directly, they could run it on a laptop. +26. The marketer acted like downloads of their software, similar to proprietary software, were like sales, where the money came in indirectly through acting, a pension and investments. Smartphones are wanted if one constantly upgrades. Those with less money might think the smartphone is not appreciated if it is too easy to buy. Although, those with more money might need the latest and greatest technology, where want transforms into need. It might be necessary for other software and technological features. +27. The marketer performed diction and appearance pedagogy preparations in the morning. According to Maslow's Hierarchy, a smartphone is perhaps for safety and security but is also a prestige symbol. I spent only on social needs. 
Some believed accreditation could be breasoned out by individuals. Qualifications were agreed-on collections of these \"quantum showers\". +28. The marketer worked out how to satisfy the customer. They can use Maslow's Hierarchy of Needs to highlight the consumer's needs and how to help them. These ranged from subsistence, food, accommodation and sleep to social and intellectual needs. Once the person has met one need, they might become hungry for more attention, enough comfort and higher standards. These needs are satisfied by sympathising with the person's pathway and helping with their goals. +29. The marketer chose to work in the industry, developing online courses. Maslow's Hierarchy of Needs helps determine the product, promotions and pricing. For example, the customer may need food, entertainment and school materials. The company may be able to cut a deal with the customer in exchange for work, give student discounts, or give subsidised training. The self chose a business doctorate, which transcended the stress of performance. +30. Regarding Maslow's hierarchy of needs, the soap product fits into safety. Separately, I was surprised by the results of Lucian CI/CD. Sometimes, the algorithm omitted duplicate files and predicates. Other times, Lucian CI/CD omitted unnecessary lines. The other lines functioned because the programmer wrote them like a stretching elastic or referred to local variables. +31. The marketer organised an academy. The product addresses the customer's need: the need for self-actualisation through safety and security and becoming happy, belongingness, relations, and seeing the natural self as pure beauty, longing and love. The advertisement suggests that the beauty product will meet these needs. Separately, I taught at my company with a Business qualification. So there was a pull towards business. +32. The marketer thought of buying mechanically. The beauty product followed a message of loving oneself.
With it, the marketers claimed, one fulfilled a dream of giving oneself what one deserves. One found oneself thinking of spoonerisms, i.e. saying one wound the product rather than one found the product. One found oneself wondering if the product had a mechanical, helpful side. When it was clear it didn't, one thought of one's logic. + +LG: value-adding - improve skin, cleanliness, that feeling +- someone else - inner and outer beauty + +think of marketing beyond products and services +- Nike - lifestyle, it makes me look and feel good and cool +- it isn't about wearing it for 3 years in a row + +Woolworths - value of lower prices over winter +Oreo - strawberry (gay?) +Gillette - What men stand for - against bullying, sexual harassment + - be the best man you can be + + value - the benefit * + value proposition - what company is standing for * + +emphasise these in segmentation, other reports + +- these *, then find what the customer wants + +competitive advantage (extremely hard) - differentiates your company and keeps your customers coming back + +2 ass +- marketing plan +- internal within the company +- external env, technology, other factors +- this forms the first assignment +- will look at segmentation - who will buy your product + +product or place (service) strategies + +* don't use chatgpt + +32. *"] \ No newline at end of file diff --git a/Lucian-Academy/Books/ECONOMICS/Business 1.txt b/Lucian-Academy/Books/ECONOMICS/Business 1.txt new file mode 100644 index 0000000000000000000000000000000000000000..b333688ffa9b69e6faa2a278ef00c7de497bd6ba --- /dev/null +++ b/Lucian-Academy/Books/ECONOMICS/Business 1.txt @@ -0,0 +1,29 @@ +["Green, L 2024, Business 1, Lucian Academy Press, Melbourne.","Green, L 2024",1,"Business 1 + +1. The business person considered changing the address structure. I applied list manipulating predicates to output from \"subterm with address\". 
The predicate \"subterm with address\" returned the multidimensional sets of addresses of terms in a term, e.g. [1], [1,2] or [1,3,3] for 1 in 1, [2,1] and [3,3,[3,3,1]], respectively. By operating on the lists of addresses and terms at these addresses, I could change the list of data before substituting it back into the original term. For example, I might want to find items to operate on. +2. The business person wrote a simpler version of findall containing the commands. I found the member of the result of \"subterm with address\". It was part of a findall statement. I found all the results that matched particular criteria and transformed them. Before changing them, I subtracted them from all the results to keep them the same. +3. The business person simplified the language, using words instead of symbols or rewrote \"*\" as a star surrounding the variable name. I found a simpler version of findall, which just contained the commands. For example, member([*A,_],B)*C, which acted like findall(A,member([A,_],B),C). I could nest findall, for example (member([A,_], B),(member([*D,_], A)*E))*F which was equivalent to findall(E,(member([A,_], B),findall(D, member([D,_], A), E)), F). If an index was required then one could use (member([*A,_],B),get_item_n(M,A,*T))*C. +4. The business person used absolute addresses to return to the same address. I simplified \"sub term with address\". Usually, its addresses read, for example, [1], [1,2] or [1,1,1]. I could create relative addresses such as *[1], which meant the current level. Or, I could write *[2,1], which meant the first item in the second item in the current level. +5. The business person combined the data structure address with strong types. I could play with relative addresses, similar to relative file pathnames, with addresses such as *[.,1], which meant going up and choosing the level. If an address was incorrect, there was an error. 
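The \"subterm with address\" idea above can be sketched outside Prolog. The following Python analogue is illustrative only (the function names and the substitution helper are assumptions, not the actual List Prolog predicates): it enumerates 1-indexed address paths for every subterm of a nested list, and substitutes a value back at a given address, mirroring the find-then-substitute workflow described.

```python
def subterms_with_addresses(term, address=None):
    """Return (address, subterm) pairs for every subterm of a nested list.
    Addresses are 1-indexed paths, e.g. [2, 1] is the first item of the
    second item, in the style of the [1], [1,2] addresses in the text."""
    if address is None:
        address = []
    pairs = [(address, term)]
    if isinstance(term, list):
        for i, sub in enumerate(term, start=1):
            pairs.extend(subterms_with_addresses(sub, address + [i]))
    return pairs

def replace_at(term, address, value):
    """Substitute value back into term at a 1-indexed address path
    (a hypothetical helper for the substitution step in the text)."""
    if not address:
        return value
    head, *rest = address
    return [replace_at(sub, rest, value) if i == head else sub
            for i, sub in enumerate(term, start=1)]
```

With this sketch, collecting all addresses of a target value and then calling `replace_at` on each corresponds to operating on the lists of addresses and terms before substituting them back into the original term.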
I used this in an information retrieval system or labelled self-correcting data in a game. I used strong types to verify data and that the algorithm was given the correct input. +6. The business person precisely read labels at the start of each level. I used a predicate instead of strong types to verify the data structure. I changed the data structure from: +[a, + [b,c], + d +] +to: +[l1,a, + [l2,b,c], + d +] +The item \"d\" didn't need a label because it was at the same level as \"a\". I used a predicate to verify the data structure rather than types because labels could be read. +7. The business person introduced commands for most of their programs. I introduced parse and interpret commands and kept transforming an algorithm until it was in machine code. The parse command could parse a grammar, returning the text so far and an error if the grammar wasn't matched. It did this in C to save resources. I gave the command parse(String,[term=square_brackets], Output), which parsed a term. +8. The business person inspected the robot's secret example. I used the interpret command, which was needed for the robot revolution. Later, commands such as \"think\" and \"move\" were used. It was important for the robot to have swift code on its customised hardware. There was a programming language even more user-friendly than Prolog for robots, compiled strictly, open to mind-mapping and mind reading before writing code and inclusive of the human in creative decisions. +9. The business person started with a draft and finished with legally proven perfect code. I compiled the code strictly. The original code was open-slather. It was rewritten so there were no extraneous lines, choice points or unwanted logical pathways, and it was possible to add features with this strict simplicity as a prerequisite. But the code was always saved for checking. +10. The business person explored the highlights, keeping an audiovisual journal they could quickly review. 
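The level-labelling scheme in item 6 can be checked mechanically. This is a minimal Python sketch, assuming the 'l1', 'l2', … label scheme from the example (the function name is hypothetical, standing in for the verifying predicate the text describes):

```python
def check_level_labels(term, depth=1):
    """Verify that every list level begins with its depth label,
    e.g. 'l1' at the top level and 'l2' one level down, as in
    ['l1', 'a', ['l2', 'b', 'c'], 'd']. Atoms need no label, so 'd'
    at the same level as 'a' is covered by that level's 'l1'."""
    if not isinstance(term, list):
        return True
    if not term or term[0] != 'l%d' % depth:
        return False
    return all(check_level_labels(sub, depth + 1) for sub in term[1:])
```

A predicate in this shape can report which level is mislabelled, which is why the text prefers a verifying predicate over types: the labels themselves can be read back.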
The robot programming language was open to mind mapping in development. The human outlaid their wanted features, with programming requirements and everything they could imagine as a result. The exercise was available to any skill level, with software to complete the project for programmers. It was laissez-faire in that it was imaginative and emphasised teaching the robot from the start in one's thoughts. +11. The business person said the answer before it appeared. The robot programming language was open to mind-reading in development. The human controlled it. The mind reading was possible because of a glitch in the matrix and could predict thoughts, especially in music composition and philosophy mind mapping. The robot's topic, mood and appearance could adjust to one's thoughts, and a combination of a neuronetwork and mind-reading could help improve children's education. +12. The business person stored and compared their preferences over time. The robot software was inclusive of the human in creative decisions. The neuronetwork customised its thoughts to be like the person or a character. Safety features vetted this process, allowing overall positive thoughts. The human could choose their conclusion at each point for the robot, such as whether to use C-style Prolog (list decomposition) or elegant Prolog code ([A|B] in calls), whether to work or relax and which suggested filmmaking shot to make. +13. The business person kept all the memories and went on holiday at any point. I checked that there were no extraneous lines of code. I checked for duplicates or similar and unnecessary predicates. I checked for missing or extra predicate calls that were unnecessary and, when changed, made a predicate unnecessary. I avoided superfluous loops, predicates and algorithms. +14. The business person categorised the choice point and pointed out to the user that it may be dropped, affecting results, otherwise requiring hard-coding behaviour that followed a choice point. 
I deleted unnecessary choice points. I maintained a list of yes/no cases for if-then choice point deletions. I kept choice points in findall statements. I deleted choice points from further predicates. +15. The business person didn't unnecessarily simplify or leave data alone. I deleted unwanted logical pathways in an algorithm. For example, I found security holes in accepting certain character types and logical pathways that processed them and deleted them in the algorithm. For example, a password may be accepted, but its lowercase version may be unwantedly (sic) accepted. However, the newline character may be acceptable after a password. +16. The business person interpreted the asterisk as having various meanings. I simplified get_item_n. For example, I usually wrote get_item_n([1,2,3],1, A), returning A=1. I simplified it to [1,2,3]*[1, A]. So, B=[1,2],B*[2,A] returned A=2. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/ECONOMICS/Business Man Lucian Green.txt b/Lucian-Academy/Books/ECONOMICS/Business Man Lucian Green.txt new file mode 100644 index 0000000000000000000000000000000000000000..74bb85303f335ac50b60a2464d7d4880f219c17a --- /dev/null +++ b/Lucian-Academy/Books/ECONOMICS/Business Man Lucian Green.txt @@ -0,0 +1,98 @@ +["Green, L 2024, Business Man Lucian Green, Lucian Academy Press, Melbourne.","Green, L 2024",1,"Business Man Lucian Green, Deploying the Lucian bot, Academic Specialisations and Non-Transgression of Science from Time Travel + +1. The businessman taught meditation. Flip risks into successes. Given the BAG output, I wrote it in a more straightforward form. If only one CPU existed, I switched off multithreading in T2B, T2AB and BAG. I timed the computation to calculate how long it took with a particular input. +2. The businessman helped develop the living. Transcend the cage of the inner critic, and be an actor in any department, particularly in business, who is seen as professional and can do anything necessary. 
There was a spare computer in case a computer went down. I devised the algorithm to develop continuous, high-quality thoughts. I also took a break, importantly, giving my brain rest. +3. The businessman smoothed the entry to use the product. Without an A for the business person, you can't be a brand, but with it, you can be the subject of journalism and marketing, which is necessary for business. I noticed subjectivity had its objectivity or setting. I added a voice or email notification when a machine had completed its computation. Also, if there was a problem, I received a notification. +4. The businessman gave the language in the text As. In particular, the business person's argument makes money by being trusted with it and does business with other business people, his star-studded business partners. I negotiated more business in philosophy, music and computer science over time than in a short time. I uploaded the necessary files to support the students to the Virtual Private Server (VPS). The students needed to be supported at the time they were working. +5. The businessman helped the meditator. Having fifty arguments for a business person is theatre studies. The actor prepared the speech from the philosophy. The worker developed the algorithm. I wrote an algorithm to run in the background according to a schedule. +6. The businessman charged subscribers of the algorithm. If you make deals with famous partners, you will be notable about them, and they will be noted about you, making more money. I advertised to the people. The algorithm to freeze people's age can be resold. Individuals can buy and run the algorithm daily to freeze their age. +7. The businessman helped the person on all fronts to be immortal. Sometimes, only famousness is cut and differentiates between an enticing business partner or not. I cut the number of breasonings produced by the BAG algorithm to the end of the day. 
I used mind reading to detect when I meditated and ran the BAG algorithm until the end of the day. Only the people of the time could be protected because there were so many to do. +8. The businessman only ran BAG for paying meditators. Becoming business royalty means you can do business with royals. One should note that royals are influential, and you can also exert your considered and wise influence. In addition, royals have more resources and would like better ways of enjoying life, such as pedagogy and help. I meditated in the morning and started BAG. +9. The businessman downloaded new content every day. Fifty arguments for a business person can be used in politics 50 As (because politics includes business). I automatically downloaded new arguments and algorithms from the server. I breasoned out texts as I went. I hand-entered new breasonings individually later. +10. The businessman helped the local people. Politics of organisations, through the business person 50 As, allows doing a \"blue square\" for the politics of an organisation, where politics keeps track of the internal politics of an organisation. I breasoned out five people * 4*50 As (*80 breasonings) *2*(for home and future locations) * (1 teleportation + 1 anti-ageing medicine + 1 meditation + from 0-15 high-quality thoughts) *5 for people around them = 3200000 breasonings with one high-quality thought, taking up to 6.7 hours. I used the old computer to help, but it had a limit. The rest of the time was spent working and helping students. +11. The businessman met the professional requirements for the academy. Politics of organisation through business person 50 As enables more money through a better understanding of an organisation. I worked out possible answers in my academy. Each answer had 4*50 As. There were two or three answers per assignment. +12. The businessman had details for their character. I converted myself to a business person with my business person A. 
I wrote 4*50 details for each answer. I wrote 50 paragraphs of details with the mind-reading algorithm. I used detailed mind reading, mind reading and random numbers to replace certain words in the philosophy with similar words. +13. The businessman represented the clothes well. I wore business clothes. The self emailed the self from the machine if the algorithm detected breasonings and algorithms not yet breasoned out with T2B and T2BA. The scanner found words in files that hadn't been breasoned out. In response, the person breasoned them out. +14. The businessman on-sold the algorithm. I conducted myself professionally. I used ten four-CPU VPSs to generate thoughts. Instead, I counted the available CPUs. If these were not enough, I ran the algorithm with different users. +15. The businessman had a natural expectation for growth. Writing the business person A was business. I counted the breasonings that the algorithm produced. I checked that it counted them correctly. I deleted extra newlines. +16. The businessman attributed different settings to different algorithms. Writing the business person A encouraged others to pay me (I reinvented myself and could sell the images). I truncated the file to generate BAG breasonings, preserving formatting. Also, BAG preserved formatting, generating unique and punctuated algorithms and arguments. The BAG output conjured up imagery and possible directions for research. +17. The businessman played a professor. I could be a business actor in a video. The professor wrote ten high-quality BAG thoughts. Professors need to help students with ten details. When the professional requirements had been met, the noumenon was activated. +18. The businessman wrote a diary about time travel. My voice could be in a business product. I removed the BAG recipients from time travel in the daily regimen. When their breasonings were finished, they time travelled manually. Their guests also completed time travel. +19. 
The businessman wrote his thoughts and decided what their constitution was. My thoughts could be part of a business product. The community leader breasoned out fifteen high-quality thoughts. They were given additional thoughts for each thought. Also, they were given the thoughts once per day. +20. The businessman thought of the answers, then the thoughts. My writing could be on a business product. I mind-read fifteen possible similar replacement words from the dictionary for students' databases. These databases were arguments or algorithms. The students' thoughts were about the answers. + +Deploying the Lucian bot + +21. The bot died, but I came back to life. Instead of dying, I chose to live. I started a business. I planned gradual improvements to my business. My simulated chief of staff helped with student questions and other company aims. +22. One idea inspired more than one new idea. I solved dying. I travelled to the future and took care of all that. My body didn't age, allowing me to work on my projects continually in my prime. My writing generated more and more writing. +23. The volunteers learned meditation and taught meditation to become immortal. The bot exhibited perfect function, but I travelled and set up businesses myself. I obtained a Master of Business Administration. I started to talk about my business. I travelled and established accredited LM centres. +24. I got to know the people needed for all positions. The bot could attend meetings. I scheduled the appointment. I attended it in person or by phone. I, not me looking like someone looking like me, participated in the events. +25. The business model helped meditators write algorithm helpers to help themselves, but it was better to be an employee. I could become the bot when I needed to. I set up hours prayers and prayed for Kundalini. Everyone got on with work, and the ideas appeared anyway. I assessed meditators on breasonings and algorithms. +26. 
The meditator had regular meditation and professional development as a child and adult. I helped the bot apply for necessary papers for international meetings so neither of us would break any laws. I researched appropriate countries and cities politically and environmentally. I studied how the meditation technique could help each type of business strategically. The parents aimed to include meditation and many departments in children before conception. +27. The robots had different experiences, and their minds automatically forgot unimportant memories. I turned off the bot before and after it was used. It was a toy robot. I could manufacture and program it to move around the classroom and see how students were doing. It should talk about the experiences and stories it had when it assembled at each location. +28. I forgot the bot's forgetfulness and got used to him hitting the ground running while satisfying his new wishes to experience new careers with a holodeck. I did all the bot's thoughts at each point. I noticed the bot developed and learned by himself. I tended to the bot, anticipating his questions and nourishing his intellect with my thoughts, which he also helped with. His mind lasted thirty days at a time, and he constantly referenced the history of the other bots' minds. +29. The bot had a seen-as version of verifying algorithms as they ran and gradually increased his engaging and insightful comments. I used a sentence generator to map thoughts to a mission goal, where I didn't necessarily need to train it at first but wrote down instructions. The robot started life as an adult and had enough thoughts using a memory chip to have experienced childhood. He helped me by answering my questions and confirming his understanding of academic ideas. +30. Neuronetworks were cheating. I used an algorithm generator to map short algorithms to the sentences. The PhD student automatically wrote one predicate. The lecturer wrote two predicates. 
The professor wrote three predicates. +31. The robot seldom spoke but was always thinking. I mind-read when to come to the bot's aid and make tricky decisions or help speak the language. The bot was self-sustaining, only talking when talked to. The bot met human needs and saved our lives by stopping an unvalidated algorithm. The bot spoke our language, mind-read animals and protected them. +32. I developed and researched software with students. I used 4*50 As to make the mind-reading from the other universe (what everyone except me experienced) available to me in my experience. I was alive and would continue to be active in the single universe. I increased my mental capability to support my business with 4*50 As in bots, sales and business, where this threshold meant I could interface with further sales and other arguments. I could help myself, and my business was necessary in the world. +33. The bot critiqued rightness, and humans did all the development while it touched it up. I used the bot to test whether the employee was breaking a rule to help the manager decide. The culture was open to diversity, the manager helped with minor issues, and there was positive morale. The bot had a Vedic astrology seen-as version, which sometimes differed from reality, so it needed to be interpreted. The manager chose success over rightness. +34. Sometimes, the teacher cut the work into \"what\" questions and answers. I used the bot to test the coursework, gain first-hand experience of how students experienced it and use this knowledge to make it easier. The simulated student tried the coursework at different levels of school and disability. The human asked the obvious question, but it revealed something related. The bot's thoughts, subconscious thoughts and verbal tracks were tracked. The teacher inserted inferences to make the work easier. +35. The human's answers to the chatbot were assumed to be related to the thread. 
I videoed myself pretending to be a bot to see how bots talked. The bot followed a simple algorithm. It followed a decision tree to determine a reply. This decision tree gave hints on writing an algorithm, letting students arrive at answers to open-ended questions. +36. The human learned that the bot was unique and had its own thoughts. I pretended to be a bot of myself if it helped people be more at home with me because of self-consciousness going away. The chatbot followed a script based on previous conversations and emphasised feelings and empathy, sometimes requiring a human if the student wanted one or the issue couldn't be resolved. People were up-front, and bots were labelled and helped when necessary. People were encouraged to use similar language with a bot as with a person, and the bot made its feelings known. + +37. The businessperson explored their calmness. I was calm as the bot and as myself. The bot was me playing a character. I played a bot. I played the bot playing me. +38. The businessperson connected the bots and their views. I mind-read the bot while I was becoming it to understand the differences between it and me. The bot's views were undefined. I explored ideas and defined them. I studied them logically, like the bot. +39. The businessperson checked that the bot's thoughts about its work were correct. I recorded the bot's views about its work. I wrote the topic. I made logical connections. I edited out irrelevant thoughts. +40. The businessperson aimed for more extended algorithms. I paid the bot with algorithmic thoughts and arguments. I acknowledged the bot. It rewarded me with its ideas. I checked the thoughts and their algorithms. +41. The businessman recorded the best thoughts of the model of the bot. I quizzed the bot on what it was aware of, lifting its consciousness to be safe around objects. I asked the bot what it thought about the everyday, testing for surprises. I asked for its relevant and total knowledge. 
I emphasised the need for constant safety and health. +42. The businessman recorded his thoughts by letting the bot mind-read him. While I meditated, I gave the bot mindfulness thoughts. The bot had already meditated, finishing its knowledge. The bot reflected on reality from its standpoint. I conversed with the bot telepathically, using a visor and spiritual screen. +43. The businessman conveyed concrete algorithms to the bot, which worked on concretising (sic) medical knowledge, space travel, simulation and immortality. I replaced pedagogy terms with non-pedagogy terms for bots, except if it had an A (with simpler terms). The bot's differences were more or less like humans. They were higher in accuracy than humans. They said less than humans with each thought; for example, they preferred tens to humanities arguments. +44. The businessman's specialisation was exploring essential ideas. I replaced computational terms with non-computational terms, except if the bot had an A (with simpler terms). Computational terms were compatible with bots. They explained them in terms of simple natural language. I explored the functionymous (sic) ideas with an idea or its implications for other ideas. +45. The businessman aimed to develop their software office, and everyone did the same. I used my time intelligently to learn the skills I needed to achieve my goal. I listed the skills I needed. I found the best ways of understanding them in time. I learned these skills. +46. The businessman bought and sold companies and conducted business. I worked in positions above where I wanted to work to create correct business processes. I started a school to become a teacher. Or I created an accreditation agency to coordinate breasonings in a school. Or I created a world business simulation to practise being a businessman. +47. The businessman worked on pedagogy to help them achieve their goals. I thanked people for helping, and I helped them to their destinations. 
The computing student gave the essence of an idea. I helped them develop the project. I thanked the student for finishing each part. +48. The businessman helped the community. I worked and supported charities. I developed materials for the students. There were services for needy people. I worked on pedagogy for the needy people. +49. The businessman invented a play for students to learn the bot. I explored, cast and played the bot. I studied the bot's appearance, thoughts and words. I chose the bot's costume and script. I participated in a roleplay with the bot. +50. The businessman found the bot's notes. I explored what the non-me bot could do in its spare time, usually something intelligent. It examined the current materials. It observed me. I let it do exciting and needed activities. +51. The businessman helped the bot achieve a goal. I prepared the bot to do what it wanted to do. I gave the bot a model solution and question. I explained how to do each step. I inserted explanations, quantifications, examples and originality scores. +52. The businessman checked whether there were any updates to the perspectives. I found arcs of thoughts and brought relevant information forward. I defined an arc as a set of categories and their movement over time. I explored the set of algorithms. I could write a report on them, including description, input, output, difficulties, methods and future research. + +53. The businessman sowed the seed for business. Royalty gave the thoughts to the person, so they needed to be created as well, with royal bot As. I held the robot. It was a royal robot. It was a right figurehead. +54. Each businessperson completed their work. When I had finished the business level, I let bots take the lead within bounds. I created the board game. While students wrote from an educational perspective, business managers wrote from a business perspective. This writing included a business specialisation in the philosophy. +55. 
The businessman and others played their roles. There were different bounds of necessary thoughts within a business level. I compared the wording of philosophies with that of political sliders. I emphasised inter-departmental perspectives on ideas. I recommended program finder to learn the algorithm to save time when programming and time travel to inspire growth at birth. +56. The businessman performed well in breasonings. I checked whether the bot's bag inventory manifest at the end matched the start. The worker specified and received what they needed. The company money was required to remain in the company. The manager did a good job. +57. The businessman prepared for replacement items at the time. I initiated a bot probe to check on missing items. I prepared for replacement employees at that point. Everyone could be retrained. The customer filled the role. +58. The businessman said the texts were accessible because the algorithms could be explained. I found and completed any academic prerequisites for courses. I completed the course spiritually. The lecturer completed the class as a student. Other lecturers assessed them. The texts were rigorous computationally. +59. The businessman wrote on the computational business product. I removed references to big (unexamined) ideas. The essays needed to be on the topic. The doctorate contained more essays. The computer found all connections within range. +60. The businessman supported the students in the academy. I completed the academic requirements for high distinctions. I completed the degree. I completed the business requirements. I chose the best staff. +61. The businessman accepted written essays applying for a job. I found friendly and well-organised teachers. I gave the teacher with an outstanding demeanour the job. I ensured new angles on thoughts attracted teachers. I completed unabridged and simplified versions of algorithms. +62. The businessman gained the student's trust by helping them to a specialisation. 
I ensured that the academy's subjects were well-taught. I followed the techniques of the best schools. The student's learning area was simple and easy to use, with a link to the library, and essays could be completed at any time. The lecturer gained the student's positive feedback with six original algorithms. +63. The businessman introduced a language with a smaller instruction set. I experimented with a bot to achieve the desired goal when I thought it was necessary. The specialisation covered the student's breasonings and algorithms, which influenced their written work and choices in coding. For example, more thoughts resulted in more critical or creative coding. For instance, they reached the highest level of thought using subterms with addresses. +64. The businessman honestly introduced the topic. I used formal and informal language where appropriate in business. The code was formal and efficient. The explanation was informal and easy to understand. The text was easier to understand than a diagram. +65. The businessman sold the algorithm. I used a human-supporting sales business course. The people felt supported because they had enough time, and the tasks were easy. The bots or disabled people needed adequate preparation. They were similar because functions were required to be well-defined. +66. The businessman wrote arguments for aboveness (sic). I studied a person-hiring and selling-to-business course. I read the registration letter applying for the position. I read about the person's background, talents, interests in the position, and their reasons for aiming to do it. I studied buying (distinct from selling) to help the business. 
The robot was a computer and a thinking being that could be bug-checked. +68. The businessman imagined robot conversations, their limitations and solutions to these. I found the thoughts of the bot. The trick was to capture the human's attention and control the robot's thoughts with their knowledge and code. Each robot made a memorable impact on humans, with a unique insight and its humanness. The robot's humanness was an expression of the human owner. + +*** + +69. I guessed the human's thoughts, even though I couldn't become the human. +70. I became the customer and employee to understand their interaction. + +Academic Specialisations + +71. \"My way\" was a professor specialisation on topics with ten breasonings on my topic. +72. I helped those who didn't specialise in a topic to have one. +73. I tracked academic specialisations using a system like politics. +74. The specialisation was a subset of a subject, e.g. the philosophy of Combination Algorithm Writer (specifically regarding a particular subject such as graphics and its data specifications). +75. The constant rhythm of notes made a song an anthem. + +Non-Transgression of Science from Time Travel + +76. I chose a specialisation in selling the simulation, privacy, and non-transgression of science. +77. I noticed the continual preventative blocking of discussion of science and images from the future. +78. However, I checked that Query2=[_|_] in SSI, a step I wondered whether computers from the future had helped with (although I remember just guessing it, so it wasn't necessarily so). +79. Time travel (a natural property of the universe) naturally prevents transgressions, like stomach aches of selfishness, because they could prevent people from discovering time travel, among other inventions, and ultimately cause a moral downturn. +80. 
The natural property of time travel is that teleportation causing transgressions is impossible, where adverse effects of violations are not experienced by the destination time in the universe as if the offender had never arrived. (This is an expansion of, not proven by, a phys.org article, which shows why closed time-like curves (CTCs), which it says it doesn't necessarily know exist, don't allow passage of a killer, solving the grandfather paradox of time travel.) The absence of transgressions in nature shows that this may be true. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/ECONOMICS/ECONOMICS.txt b/Lucian-Academy/Books/ECONOMICS/ECONOMICS.txt new file mode 100644 index 0000000000000000000000000000000000000000..6b9caa35e9f18531ebc60652dfb15f1973d6310f --- /dev/null +++ b/Lucian-Academy/Books/ECONOMICS/ECONOMICS.txt @@ -0,0 +1,603 @@ +["Green, L 2022, ECONOMICS, Lucian Academy Press, Melbourne.","Green, L 2022",1,"Economics + +Leadership +Uses for Money in Theatre Studies +Uses for Money in Epistemology (Poetry) +Uses for Money in Music +Uses for Money in Fine Arts +Breasonings Currency +Marketing for LM to Siva +Breasonings Currency 2-6 (there is no 1) + +Leadership +1. The manager prepared to help people help people to be creative. She did this by being present. First, she thought of the beginning of the idea. Second, she thought of the middle of the idea. Third, she thought of the end of the idea. In this way, the manager prepared to help people help people to be creative by being present. + + +2. The manager prepared to create a new skill. He did this by verifying the tradesman’s tradesmanship skills. First, he read the sentence in the letter, “Yours sincerely, Lucian”. Second, he thought of a new way of writing this, “Thank you, Lucian”. Third, he wrote this in a letter to the same person. In this way, the manager prepared to create a new skill by verifying the tradesman’s tradesmanship skills. + + +3. I prepared to lead a happy life. 
I did this by taking the daffodil from the circus cow’s back. First, I observed when the circus cow was standing still. Second, I stood beside the circus cow. Third, I removed the daffodil from a pocket attached to the circus cow’s back. In this way, I prepared to lead a happy life by taking the daffodil from the circus cow’s back. + + +4. I untwisted the beetle’s legs from the chocolate. + + +5. I prepared for the model Nervous system to function correctly. I did this by starting to compute fewer, higher quality computations using the model brain. First, I computed a computation once every 1 second. Second, I computed a computation once every 2 seconds. Third, I computed a computation once every 4 seconds. In this way, I prepared for the model Nervous system to function correctly by starting to compute fewer, higher quality computations using the model brain. + + +6. I prepared for the model Circulatory system to function correctly. I did this by starting to pump liquid through the model heart more slowly. First, I relaxed the model heart to suck in the fluid and contracted it to pump the liquid out every 1 second. Second, I relaxed the model heart to suck in the liquid and contracted it to pump the liquid out every 2 seconds. Third, I relaxed the model heart to suck in the fluid and contracted it to pump the fluid out every 4 seconds. In this way, I prepared for the model Circulatory system to function correctly by starting to pump liquid through the model heart more slowly. + + +7. I prepared for the model Respiratory system to function correctly. I did this by expanding the model lungs by contracting the model diaphragm more slowly. First, I expanded the model lungs by contracting the model diaphragm every 5 seconds. Second, I expanded the model lungs by contracting the model diaphragm every 10 seconds. Third, I expanded the model lungs by contracting the model diaphragm every 20 seconds. 
In this way, I prepared for the model Respiratory system to function correctly by expanding the model lungs by contracting the model diaphragm more slowly. + + +8. I prepared for the model Digestive system to function correctly. I did this by churning the food with the model stomach more slowly. First, I churned the food with the model stomach every 1 second. Second, I churned the food with the model stomach every 2 seconds. Third, I churned the food with the model stomach every 4 seconds. In this way, I prepared for the model Digestive system to function correctly by churning the food with the model stomach more slowly. + + +9. I prepared to use my model muscles. I did this by tying the strings (like glucose molecules) to make a quipu (like the glycogen molecule with 30,000 glucose units) in the model liver more slowly. First, I tied the strings to create a quipu with three strings attached in the model liver approximately every 1 second. Second, I tied the strings to make a quipu with three strings attached in the model liver roughly every 2 seconds. Third, I tied the strings to create a quipu with three strings attached in the model liver approximately every 4 seconds. In this way, I prepared to use my model muscles by tying the strings (like glucose molecules) to make a quipu (like the glycogen molecule with 30,000 glucose units) in the model liver more slowly. + + +10. I prepared to help the small intestine break down carbohydrates, lipids and proteins, where all body parts referred to are models. I did this by releasing marbles (representing digestive enzymes) from my model pancreas more slowly. First, I released marbles from my model pancreas every 1 second. Second, I released marbles from my model pancreas every 2 seconds. Third, I released marbles from my model pancreas every 4 seconds. 
In this way, I prepared to help the small intestine break down carbohydrates, lipids and proteins by releasing marbles (representing digestive enzymes) from my model pancreas more slowly. + + +11. I prepared to achieve normal blood glucose levels using glucagon if I had low blood glucose and insulin if I had high blood glucose, where all body parts referred to are models. I did this by releasing marbles (representing glucagon or insulin) from my model pancreas (from my alpha or beta cells, respectively) more slowly. First, I released either a black or white marble from my model pancreas every 1 second. Second, I released a marble of this shade from my model pancreas every 2 seconds. Third, I released a marble of this shade from my model pancreas every 4 seconds. In this way, I prepared to achieve normal blood glucose levels using glucagon if I had low blood glucose and insulin if I had high blood glucose by releasing marbles (representing glucagon or insulin) from my model pancreas (from my alpha or beta cells, respectively) more slowly. + + +12. I prepared to give my body enough nutrients, where all body parts referred to are models. I did this by squishing small chunks of food through the small intestine model more slowly. First, I squished a piece of food through the sieve every 1 second. Second, I squished a piece of food through the sieve every 2 seconds. Third, I squished a piece of food through the sieve every 4 seconds. In this way, I prepared to give my body enough nutrients by squishing small pieces of food through the small intestine model more slowly. + + +13. I prepared to give my body enough water. I did this by sieving water through my large intestine model more slowly. First, I sieved water through the sieve every 1 second. Second, I sieved water through the sieve every 2 seconds. Third, I sieved water through the sieve every 4 seconds. 
In this way, I prepared to give my body enough water by sieving water through my large intestine model more slowly. + + +14. I prepared for the model Urinary system to function correctly. I did this by sieving liquid (like the metabolites to be urinated) through a sieve (like a model Loop of Henle in the model kidney) into water (making the urine-like liquid) more slowly. First, I sieved liquid through the sieve every 1 second. Second, I sieved liquid through the sieve every 2 seconds. Third, I sieved liquid through the sieve every 4 seconds. In this way, I prepared for the model Urinary system to function correctly by sieving liquid (like the metabolites to be urinated) through a sieve (like a model Loop of Henle in the model kidney) into water (making the urine-like liquid) more slowly. + +Uses for Money in Theatre Studies + +1. I prepared to go to heaven. I did this by watching the theatre production. First, I watched the Magistri Magna Gloria himself. Second, I watched the fiesole (sic). Third, I watched the character actors. In this way, I prepared to go to heaven by watching the theatre production. + + +2. I prepared to achieve my goal. I did this by writing that for 50 As, I was given the value of 50 As. First, I used 50 As to earn a job. Second, I used 50 As to protect my health with meditation (with philosophy) on one day. Third, I used 50 As to have a famous child. In this way, I prepared to achieve my goal by writing that for 50 As, I was given the value of 50 As. + + +3. I prepared to determine that a natural threshold for 50 As had been met with 50 As. I did this by writing that for the second of 2 in the set of 50 As the master gave me the value of 50 A grades going through. First, I allowed 50 A grades in a PhD to go through as a lecturer. Second, I allowed 50 A grades in Masters to go through as a lecturer. Third, I allowed 50 A grades to award an acting job to go through as a director. 
In this way, I prepared to determine that a natural threshold for 50 As had been met with 50 As by writing that for the second of 2 in the set of 50 As, the master gave me the value of 50 A grades going through. + + +4. I prepared to link excellences in the community together. I did this by writing that for 50 A+s, the master gave me the value of 50 A+s. First, I earned the award of best in the PhD class. Second, I donated some of my 50 A+ breasonings to extracurricular volunteering. Third, I wrote to my local member of parliament to improve the standard of services to 50 A+s. In this way, I prepared to link excellences in the community together by writing that for 50 A+s, the master gave me the value of 50 A+s. + + +5. I prepared to inspire 100% service. I did this by writing that for 50 100%s, the master gave me the value of 50 100%s. First, I earned a University scholarship. Second, I received the movie scholarship. Third, I received an industry scholarship. In this way, I prepared to inspire 100% service by writing that for 50 100%s, the master gave me the value of 50 100%s. + + +6. I prepared to recognise an excellent product (prize applicant). I did this by writing that for the second of 2 in the set of 50 A+s, the master gave me the value of 50 A+ grades going through. First, I awarded the prize for excellent work in hospitality. Second, I assigned the award for outstanding work in economics. Third, I presented the award for exceptional work in nutrition. In this way, I prepared to recognise an excellent product (prize applicant) by writing that for the second of 2 in the set of 50 A+s, the master gave me the value of 50 A+ grades going through. + + +7. I prepared to award a cohort of Science students 100%. I did this by writing that for the second of 2 in the set of 50 100%s, the computer gave me the value of 50 100%s going through. First, I prepared 50 100%s. Second, I verified the students who had earned 100%. Third, I awarded each of them 100%.
In this way, I prepared to award a cohort of Science students 100% by writing that for the second of 2 in the set of 50 100%s, the computer gave me the value of 50 100%s going through. + + +8. I prepared to be astonished in a film. I did this by writing that for 250 breasonings, I was given the value of 50 As. First, I wrote 250 breasonings for a film character. Second, the computer gave me high quality imagery. Third, I was given each of the 16 greater 250 breasoning sets in the production until I had earned 50 As (16*250 breasonings=4000 breasonings, which divided by 80 breasonings per A=50 As). In this way, I prepared to be astonished in a film by writing that for 250 breasonings, I was given the value of 50 As. + + +9. I prepared to represent a prestigious University. I did this by writing that for 2*250 As, I managed the Professor’s job application. First, I observed the applicant write 250 80 breasoning As. Second, I received the application for the job of Professor. Third, I awarded the role of Professor to the applicant after writing 250 80 breasoning As. In this way, I prepared to represent a prestigious University by writing that for 2*250 As, I managed the Professor’s job application. + + +10. I prepared to represent a renowned University. I did this by writing that for 250 A+s, I managed the Professor’s job application. First, I observed the applicant write 250 130 breasoning As. Second, I received the application for the job of Professor. Third, I awarded the role of Professor to the applicant after writing 250 130 breasoning As. In this way, I prepared to represent a renowned University by writing that for 250 A+s, I managed the Professor’s job application. + + +11. I prepared to increase the breasoning count. I did this by writing that for 250 100%s, I was given the value of 250 250 breasoning As. First, I observed the meditator (philosophy student) breason out 250 130 breasoning As.
Second, I received word that the meditator had meditated (the philosophy student had written an essay). Third, I breasoned out 250 250 breasoning As. In this way, I prepared to increase the breasoning count by writing that for 250 100%s, I was given the value of 250 250 breasoning As. + + +12. I prepared to represent a well-known University. I did this by writing that for 2*250 A+s, I managed the Emeritus Professor’s job application. First, I observed the applicant write 250 130 breasoning As. Second, I received the application for the job of Emeritus Professor. Third, I awarded the position of Emeritus Professor to the applicant after writing 250 130 breasoning As. In this way, I prepared to represent a well-known University by writing that for 2*250 A+s, I managed the Emeritus Professor’s job application. + + +13. I prepared to represent a regionally famous University. I did this by writing that for 2*250 100%s, I managed the Universal Professor’s job application. First, I observed the applicant write 250 190 breasoning As. Second, I received the application for the job of Universal Professor. Third, I awarded the appointment of Universal Professor to the applicant after writing 250 190 breasoning As. In this way, I prepared to represent a regionally famous University by writing that for 2*250 100%s, I managed the Universal Professor’s job application. + + +14. I prepared to found a University. I did this by writing that for 250 2*250 breasoning As, as the yogi (University founder), I managed the Lucianic Meditation (philosophy) program. First, I found a perfect sequin. Second, I placed an identical perfect sequin beside it. Third, I ate a salad sandwich. In this way, I prepared to found a University by writing that for 250 2*250 breasoning As, as the yogi (University founder), I managed the Lucianic Meditation (philosophy) program. + +15. I prepared to have more breasonings.
I did this by writing that when breasonings are submitted, they go together and more are returned to each person. First, the student submitted a breasoned out assignment to me. Second, the students paraphrased the assignment. Third, they wrote and breasoned out new breasonings. In this way, I prepared to have more breasonings by writing that when breasonings are submitted, they go together and more are returned to each person. + +16. I prepared to argue for sales based on B-grade arguments against sales being prevented. I did this by writing specific arguments for the returned values of breasonings. First, I wrote breasonings. Second, my student interpreted the breasonings. Third, I interpreted the interpretation. In this way, I prepared to argue for sales based on B-grade arguments against sales being prevented by writing specific arguments for the returned values of breasonings. + +17. I prepared to sell a product by delivering ideas. I did this by appointing a focus group to discuss whether 5 As should be required as an investment to register for a Computational Education A Retrieval System. First, I charged 5 As to store ideas in the basket. Second, I stored a customer’s ideas in the basket. Third, I expanded the 5 As into 50 As using crossing algorithms to use as a framework when delivering ideas. In this way, I prepared to sell a product by delivering ideas by appointing a focus group to discuss whether 5 As should be required as an investment to register for a Computational Education A Retrieval System. + +18. I prepared to produce a breasoning. I did this by calculating the cost to produce a breasoning. First, I bought a seed for $1. Second, I bought water for $2. Third, I bought earth for $3. In this way, I prepared to produce a breasoning by calculating the cost to produce a breasoning. + +19. I prepared to swap breasonings with a student. 
I did this by calculating how many As/breasonings should be required to access the Computational Education A Retrieval System as a license fee. First, I calculated the number of As/breasonings to be stored in the retrieval system. Second, I set the rate at 1 breasoning per breasoning. Third, I calculated the rate for 10 breasonings to be 10 breasonings. In this way, I prepared to swap breasonings with a student by calculating how many As/breasonings should be required to access the Computational Education A Retrieval System as a license fee. + +20. I prepared to take off in life. I did this by writing that As given for meditation should be a maximum of 1 week old, rather than be older (encouraging fast rather than slow work, respectively). First, I spelt “swap” backwards as “paws”. Second, I wrote how swapping is like back pawing because of learning what the future will require. Third, I wrote that the experience needed to produce my music video would require me to breason out 250 breasonings for each object I am in charge of on the screen as a VCA student producer. + +21. I prepared to calculate that degrees should be 3 years long for 24 As. I did this by calculating how long degrees should be. First, I considered the rewards of more rather than less work. Second, I wanted more skills and money made. Third, I discounted a faster degree. In this way, I prepared to calculate that degrees should be 3 years long for 24 As by calculating how long degrees should be. + +22. I prepared to be successful at University. I did this by writing that the Professor Algorithm should be included, giving students the educational outcomes they would like. First, I ran the Professor Algorithm. Second, the students ran the Professor Algorithm. Third, their breasonings deserved A grade. In this way, I prepared to be successful at University by writing that the Professor Algorithm should be included, giving students the educational outcomes they would like. + +23.
I prepared to recommend the examination of newly developed things on Planet Earth. I did this by writing that education should be a core subject, given the psychology of becoming a pedagogue. First, I recommended the humanist pedagogy subject. Second, I recommended that students write about humanities. Third, I recommended that students write about science. In this way, I prepared to recommend the examination of newly developed things on Planet Earth by writing that education should be a core subject, given the psychology of becoming a pedagogue. + +24. I prepared to adopt a new perspective during my studies. I did this by arguing that the length of degree to PhD should compensate for becoming a pedagogue. First, I assessed one-A essays based on breasoned chapters in the degree. Second, I assessed 15-As essays based on breasoned chapters in Honours. Third, I assessed a 50-As essay based on breasoned chapters in the PhD. In this way, I prepared to adopt a new perspective during my studies by arguing that the length of degree to PhD should compensate for becoming a pedagogue. + +25. I prepared to study the reasons people became pedagogues. I did this by examining the reasons and factors that future pedagogues studied education. First, I identified the Education majors. Second, I identified a reason they studied Education, to find a job. Third, I identified a factor for this, aiming to write high quality thoughts. In this way, I prepared to study the reasons people became pedagogues by examining the reasons and factors that future pedagogues studied education. + +26. I prepared to observe breasoning flow around the economy star. I did this by stating that circles of breasonings, rather than circles of money, should exist. First, I inspired the spiral’s first point. Second, I inspired the spiral’s first revolution. Third, I inspired the spiral’s end point.
In this way, I prepared to observe breasoning flow around the economy star by stating that circles of breasonings, rather than circles of money, should exist. + +27. I prepared to recognise the student’s mark. I did this by testing whether a student had completed the professor algorithm arguments, one to the lecturer, and one from the lecturer. First, I counted the number of breasonings in the argument given to the lecturer. Second, I counted the number of breasonings in the argument given by the lecturer to the student. Third, I verified that the number of breasonings in the argument given to the lecturer matched the number of breasonings in the argument given by the lecturer. In this way, I prepared to recognise the student’s mark by testing whether a student had completed the professor algorithm arguments, one to the lecturer, and one from the lecturer. + +28. I prepared to write how many breasonings a student would need to breason out to receive the output from her jobs. I did this by dividing the number of breasonings returned by the number of breasonings written by a student during her career. First, I wrote that the number of breasonings returned was equal to those in 100 As. Second, I wrote that the number of breasonings written by a student during her career was equal to those in 10 As. Third, I divided the number of breasonings returned by the number of breasonings written by a student during her career, equaling 10, the factor of how many more breasonings had been given. In this way, I prepared to write how many breasonings a student would need to breason out to receive the output from her jobs by dividing the number of breasonings returned by the number of breasonings written by a student during her career. + +29. I prepared to use the products. I did this by importing products paid for in breasoning currency. First, I agreed with the number of breasonings to supply in return for the products to be imported. Second, I supplied this number of breasonings.
Third, I imported the products. In this way, I prepared to use the products by importing products paid for in breasoning currency. + +30. I prepared to receive profit for my labor. I did this by exporting products paid for in breasoning currency. First, I manufactured the product. Second, I observed the buyer breason out an argument for it. Third, I exported the product to the buyer. In this way, I prepared to receive profit for my labor by exporting products paid for in breasoning currency. + +31. I prepared to benefit in breasonings currency for iron ore. I did this by setting up a school using the breasoning currency in a mining town. First, I established a breasoning currency salary as pay for food and accommodation for teachers. Second, I established a breasoning currency salary as pay to teachers for setting up the curriculum. Third, I established a breasoning currency-paid job description flyer for new teachers. In this way, I prepared to benefit in breasonings currency for iron ore by setting up a school using the breasoning currency in a mining town. + +32. I prepared to provide breasoned products and services in return for a customer breasoning out the product, when the customer is equal to a person who only has knowledge of how to breason out an idea (mentally measure an object’s X, Y and Z dimensions). I did this by marketing pedagogy and meditation. First, I paid a designer and writer to produce the marketing materials for pedagogy and meditation in breasonings-currency. Second, I observed an audience member provide feedback about the marketing materials. Third, I read the audience member’s feedback about the marketing materials. In this way, I prepared to provide breasoned products and services in return for a customer breasoning out the product, when the customer is equal to a person who only has knowledge of how to breason out an idea (mentally measure an object’s X, Y and Z dimensions) by marketing pedagogy and meditation.
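The breasoning accounting in this section rests on simple arithmetic: item 8 equates 1 A with 80 breasonings (so 4000 breasonings are 50 As), and item 28 divides the breasonings returned (100 As) by the breasonings written (10 As) to get a return factor of 10. A minimal sketch of this arithmetic, assuming only those figures from the text (the function names are illustrative, not from the source):

```python
# Sketch of the breasoning-currency arithmetic described above.
# Figures from the text: 1 A = 80 breasonings (item 8); 100 As are
# returned for 10 As written during a career (item 28).

BREASONINGS_PER_A = 80  # "80 breasonings per A" (item 8)

def breasonings(num_as):
    """Convert a count of As into a count of breasonings."""
    return num_as * BREASONINGS_PER_A

def return_factor(returned_as, written_as):
    """Factor of how many more breasonings were returned than written."""
    return breasonings(returned_as) / breasonings(written_as)

print(breasonings(50))         # prints 4000 (50 As, as in item 8)
print(return_factor(100, 10))  # prints 10.0 (the factor in item 28)
```

Because both quantities are converted at the same rate, the factor is just the ratio of the A counts; the conversion is kept explicit here only to mirror the text's bookkeeping.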
+ +Uses for Money in Epistemology (Poetry) + +1. I prepared to prompt thought about a metaphor. I did this by writing about poetic metaphors. First, I wrote down the word “hope”. Second, I wrote down a metaphor for it, “crab”. Third, I said this poetic metaphor out loud. In this way, I prepared to prompt thought about a metaphor by writing about poetic metaphors. + +2. I prepared to build a profitable business model. I did this by relating the self to the other in each sentence of my Economics thesis. First, I wrote that the self and other relationship should be included in each sentence. Second, I wrote that this relationship being included would support Economics. Third, I wrote that one side of the relationship would be stronger, making money from the other side. In this way, I prepared to build a profitable business model by relating the self to the other in each sentence of my Economics thesis. + +3. I prepared to give evidence for my theory. I did this by writing real-world uses of my theory in each chapter. First, I wrote the theory. Second, I wrote the real-world example of the theory. Third, I wrote real-world uses of my theory in each chapter. In this way, I prepared to give evidence for my theory by writing real-world uses of my theory in each chapter. + +4. I prepared to connect people with leaders through breasonings currency. I did this by writing about my breasoning currency algorithms in the exposition of the Economic thesis chapter. First, I wrote the breasoning currency-socialism algorithm that e.g. the people should check that the A threshold has been reached per public transport journey. Second, I wrote the breasoning currency algorithm that the passenger’s community-related mental activities would be computed, because of the last passenger’s A-threshold point being accounted for.
Third, I wrote the breasoning currency-capitalism (investism, sic) algorithm that a given point (the driver) would check off that the process was professional. In this way, I prepared to connect people with leaders through breasonings currency by writing about my breasoning currency algorithms in the exposition of the Economic thesis chapter. + +5. I prepared to make a correlation between interest in a stock and its share price. I did this by discussing and making connections in my breasoning currency Economics thesis. First, I counted the number of breasonings updated per stock. Second, I calculated how much this had changed since the previous day. Third, I calculated the new share market value change from the change in the stock, according to the difference between its value on the previous and current days. In this way, I prepared to make a correlation between interest in a stock and its share price by discussing and making connections in my breasoning currency Economics thesis. + +6. I prepared to test the economic viability of the idea. I did this by commenting on the breasoning currency algorithms in an economic sense. First, I counted the number of breasonings made. Second, I counted the number of breasonings spent. Third, I calculated the profit made, the number of breasonings spent subtracted from the number of breasonings made. In this way, I prepared to test the economic viability of the idea by commenting on the breasoning currency algorithms in an economic sense. + +7. I prepared to state how to earn A grade. I did this by stating that positivity is stronger than agreement in the creative philosophy essay-marking scheme, but that positivity should be agreed with. First, I wrote an all-agreeing, all-positive essay. Second, I was awarded A grade. Third, I verified that the positive statement was agreed with.
In this way, I prepared to state how to earn A grade by stating that positivity is stronger than agreement in the creative philosophy essay-marking scheme, but that positivity should be agreed with. + +8. I prepared to realise positivity was completed by the author. I did this by stating that creative philosophy essays should follow English’s award of A grade to positive, not necessarily agreeing essays, or in fact essays with > X breasonings, regardless of whether the essay is positive or negative, and in the first or second half. First, I stated that the statement was agreed with regardless of whether it was positive or negative. Second, I stated if the statement was negative, a positive seen-as version was optionally thought of for it. Third, I stated that the essay was awarded A grade. In this way, I prepared to realise positivity was completed by the author by stating that creative philosophy essays should follow English’s award of A grade to positive, not necessarily agreeing essays, or in fact essays with > X breasonings, regardless of whether the essay is positive or negative, and in the first or second half. + +9. I prepared to award A grade in return for 80 breasonings. I did this by stating that university is based on agreement about standards. First, I agreed on the standard. Second, I placed it in the box. Third, I told people to follow the standard. In this way, I prepared to award A grade in return for 80 breasonings by stating that university is based on agreement about standards. + +10. I prepared to create heaven in the community. I did this by writing that a lone writer could write a PhD. First, I wrote that the writer could write (2*)50 As. Second, I wrote that the writer wrote 10 breasonings per sentence. Third, I wrote that the writer submitted the PhD to a University for marking. In this way, I prepared to create heaven in the community by writing that a lone writer could write a PhD. + +11. 
I prepared to accredit the subjects as first-year subjects. I did this by writing that the subjects were self-standing. First, I verified that the subjects were first-year subjects. Second, I verified that the subjects had no pre-requisites. Third, I verified that the language in the subjects was understood to refer only to that subject. In this way, I prepared to accredit the subjects as first-year subjects by writing that the subjects were self-standing. + +12. I prepared to assess the essay using humanist pedagogy. I did this by stating that the computational marking scheme meant that all the entries would be practised in reality. First, I wrote down the object to be breasoned out. Second, I provided a space to breason it out in. Third, I repeated this for 80 breasonings. In this way, I prepared to assess the essay using humanist pedagogy by stating that the computational marking scheme meant that all the entries would be practised in reality. + +13. I prepared to receive As. I did this by writing that the education system was a retrieval system for As. First, I placed the breasoning assignments in the education system. Second, the students completed the assignments. Third, the teacher collected and marked the assignments. In this way, I prepared to receive As by writing that the education system was a retrieval system for As. + +14. I prepared to develop economic assets using breasoning currency. I did this by relating monetary policy to breasoning currency. First, I (the bank’s bank) made monetary policy to maintain low interest rates to help start a business with breasoning currency. Second, I made monetary policy to maintain low interest rates to help buy a car with breasoning currency. Third, I made monetary policy to maintain low interest rates to help buy property with breasoning currency. In this way, I prepared to develop economic assets using breasoning currency by relating monetary policy to breasoning currency. + +15.
I prepared to break the complex product value into breasoning value thresholds, where these threshold value products are worth paying threshold for, and complex products are each worth a new threshold. I did this by relating fiscal policy to breasoning currency. First, I (the government) spent and charged taxes to reduce unemployment, where salaries are in breasoning currency. Second, I spent and charged taxes to maintain low interest rates, to help buy in breasoning currency. Third, I spent and charged taxes to reduce breasoning currency prices and increase economic growth, so threshold (A) in breasoning currency is met per product. In this way, I prepared to break the complex product value into breasoning value thresholds, where these threshold value products are worth paying threshold for, and complex products are each worth a new threshold by relating fiscal policy to breasoning currency. + +16. I prepared to spend money for greater gain in another country when the share market price in another country decreased compared with the share market price in my country. I did this by relating exchange rate to breasoning currency. First, I calculated the share market value in breasoning currency of the first country. Second, I calculated the share market value in breasoning currency of the second country. Third, I calculated the exchange rate between their currencies to be the ratio of their share market values in breasoning currency. In this way, I prepared to spend money for greater gain in another country when the share market price in another country decreased compared with the share market price in my country by relating exchange rate to breasoning currency. + +17. I prepared to trial my idea. I did this by writing 23 3500-word chapters in the Economics thesis. First, I wrote the literature review. Second, I wrote the Economics analysis. Third, I wrote more. In this way, I prepared to trial my idea by writing 23 3500-word chapters in the Economics thesis. + +18. 
I prepared to find B (A) to each reason-breasoning to anchor myself to reality. I did this by writing that for 1 breasoning, I was given the value of 10 breasonings. First, I thought of a breasoning. Second, I thought of a reason-breasoning for this breasoning. Third, I repeated this until I had thought of 10 reason-breasonings for this breasoning. In this way, I prepared to find B (A) to each reason-breasoning to anchor myself to reality by writing that for 1 breasoning, I was given the value of 10 breasonings. + +19. I prepared to expand 10 breasonings into 80 breasonings by indicating 80 breasonings breasoned out during a shift. I did this by writing that for 10 breasonings, I was given the value of 80 breasonings. First, I reduced 80 breasonings to 10 breasonings. Second, I wrote 10 breasonings per sentence. Third, I wrote 10 breasonings per sale. In this way, I prepared to expand 10 breasonings into 80 breasonings by indicating 80 breasonings breasoned out during a shift by writing that for 10 breasonings, I was given the value of 80 breasonings. + +20. I prepared to manage editors of a writer’s viral blog entry. I did this by writing that for 25 breasonings, I was given the value of 50 As. First, I breasoned out 10 breasonings, replaceable with 80 breasonings, or 1 A. Second, I breasoned out 10 breasonings, replaceable with 20 further As, for a total of 21 As. Third, I breasoned out 4 breasonings, replaceable with 4 further As, for a total of 25 As, followed by breasoning out 1 breasoning, replaceable with 25 further As, for a grand total of 50 As. In this way, I prepared to manage editors of a writer’s viral blog entry by writing that for 25 breasonings, I was given the value of 50 As. + +21. I prepared to write the first instrumental part of my composition. I did this by writing that for 50 breasonings, I was given the value of 50 As. First, I wrote 50 breasonings and 200 non-specific breasonings for a tested hit song.
Second, I breasoned out the first line in the song. Third, I repeated this for each line in the song, for each of 15 other lines representing the other 15 250-breasoning As, and differed in opinion from each breasoning to make sales. In this way, I prepared to write the first instrumental part of my composition by writing that for 50 breasonings, I was given the value of 50 As. + +22. I prepared to observe industry being professional. I did this by writing that for 80 breasonings, in other words, 1 A, I was given the value of 80 more breasonings, in other words, 1 A. First, the customer breasoned out 80 breasonings for her order. Second, the employee breasoned out 80 breasonings for the order. Third, the manager verified that the number of breasonings breasoned out by the customer was equal to the number of breasonings breasoned out by the employee. In this way, I prepared to observe industry being professional by writing that for 80 breasonings, in other words, 1 A, I was given the value of 80 more breasonings, in other words, 1 A. + +23. I prepared to watch Universities working properly. I did this by writing that for 130 breasonings, in other words, 1 A, I was given the value of 130 more breasonings, in other words, 1 A. First, the student breasoned out 130 breasonings for his assignment. Second, the lecturer breasoned out 130 breasonings for the mark equivalent to 130 breasonings. Third, the manager verified that the number of breasonings breasoned out by the student was equal to the number of breasonings breasoned out by the teacher. In this way, I prepared to watch Universities working properly by writing that for 130 breasonings, in other words, 1 A, I was given the value of 130 more breasonings, in other words, 1 A. + +24. I prepared to verify schools were maintaining proper function. I did this by writing that for 190 breasonings, in other words, 1 A, I was given the value of 190 more breasonings, in other words, 1 A. 
First, the Popology student breasoned out 190 breasonings for his assignment. Second, the Popology teacher breasoned out 190 breasonings for the mark. Third, the manager verified that the number of breasonings breasoned out by the student was equal to the number of breasonings breasoned out by the teacher. In this way, I prepared to verify schools were maintaining proper function by writing that for 190 breasonings, in other words, 1 A, I was given the value of 190 more breasonings, in other words, 1 A. + +25. I prepared to see the music producer’s customer satisfy professional standards. I did this by writing that for 250 breasonings, in other words, 1 A, I was given the value of 50 As. First, the music producer’s customer breasoned out 250 breasonings before asking for a song to be produced. Second, the music producer breasoned out 50 As about the music before producing the song. In this way, I prepared to see the music producer’s customer satisfy professional standards by writing that for 250 breasonings, in other words, 1 A, I was given the value of 50 As. + +26. I prepared to experience a high volume of traffic. I did this by writing that for 5 As, I was given the value of 50 As. First, I breasoned out 5 As. Second, I ran the virality algorithm. Third, I was given 50 As. In this way, I prepared to experience a high volume of traffic by writing that for 5 As, I was given the value of 50 As. + +27. I prepared to listen to reader feedback. I did this by writing that for 10 As, I was given the value of 50 As. First, I breasoned out 10 As for my journal article. Second, I breasoned out 40 business As. Third, I summed the grand total of As to be 50 As. In this way, I prepared to listen to reader feedback by writing that for 10 As, I was given the value of 50 As. + +28. I prepared to receive the bouquet of flowers on stage. I did this by writing that for 15 As, I was given the value of 50 As. First, I breasoned out 15 As to earn the lead role.
Second, I breasoned out the remaining 50 As. Third, I sold the product (a recording of the production). In this way, I prepared to receive the bouquet of flowers on stage by writing that for 15 As, I was given the value of 50 As. + +29. I prepared to advance to PhD. I did this by writing that for 2*15 As, I was given the value of “Lecturer control of the Honours mark”. First, I completed 15 As from my point of view, that of the student. Second, I completed 15 As from the lecturer’s point of view. Third, I earned 80% for the Honours assignment. In this way, I prepared to advance to PhD by writing that for 2*15 As, I was given the value of “Lecturer control of the Honours mark”. + +30. I prepared to earn a higher income than other Honours students. I did this by writing that for 15 90%s, I was given the value of the Scholarship. First, I breasoned out 90%. Second, I prepared to repeat breasoning out 90%. Third, I repeated this a total of 15 times, to be awarded the Honours scholarship. In this way, I prepared to earn a higher income than other Honours students by writing that for 15 90%s, I was given the value of the Scholarship. + +31. I prepared to write philosophical literature. I did this by writing that for 15 100%s, I was given the value of Professor. First, I breasoned out 100%. Second, I prepared to repeat breasoning out 100%. Third, I repeated this a total of 15 times, to prepare to be a Professor. In this way, I prepared to write philosophical literature by writing that for 15 100%s, I was given the value of Professor. + +32. *I did this by writing that for 2*15 90%s, I was given the value of a father helping a child earn a scholarship. First, I completed 15 90%s from the child’s point of view. Second, I completed 15 90%s from the lecturer’s point of view. Third, I, a father, helped a child earn a scholarship, by earning 90% in an Honours assignment. + +Uses for Money in Music + +1. I prepared to verify whether I needed to repair the concert object.
I did this by examining the concert object. First, I brought the concert object to the concert. Second, I unpacked it on stage. Third, I used it to make music for the audience. In this way, I prepared to verify whether I needed to repair the concert object by examining the concert object. + +2. I prepared to point to future research. I did this by writing the science article. First, I wrote that vaccination doesn’t cause autism. Second, I sent it to the newspaper. Third, I read it in the newspaper the next day. In this way, I prepared to point to future research by writing the science article. + +3. I prepared to teach philosophy skills to University students. I did this by writing the philosophy article. First, I wrote research on psychiatry that As in Medicine at University protect one’s psychiatric health. Second, I offered the major in Medicine. Third, I protected the Medicine students’ health. In this way, I prepared to teach philosophy skills to University students by writing the philosophy article. + +4. I prepared to relate Economics to Philosophy of Science. I did this by teaching meditation (philosophy of science). First, I wrote that the ancient Buddhists were skilled at University-style business. Second, I wrote that the modern Lucianic Meditators were skilled at meditation (philosophy of science) and earning broadcast roles. Third, I wrote that the future would bridge the ancients and moderns with Pedagogy (practising Lucianic Meditation’s sutra after training in the Buddhist sutra, i.e. both are good at each other’s skills with help from Pedagogy, including Lucianic Meditation business being based on Meritocracy, or Breasoning Currency). In this way, I prepared to relate Economics to Philosophy of Science by teaching meditation (philosophy of science). + +5. I prepared to follow people. I did this by following the tadpole. First, I found the tadpole. Second, I followed it. Third, I stopped when it stopped. 
In this way, I prepared to follow people by following the tadpole. + +6. I prepared to exert wisdom in public. I did this by playing the Owl. First, I loved the Owl. Second, I wore his costume. Third, I helped people understand the Owl by explaining his area of study. In this way, I prepared to exert wisdom in public by playing the Owl. + +7. I prepared to wait until the computation had finished. I did this by watching the progress bar (the moving diagonal lines). First, I watched the first diagonal line move forward. Second, I prepared to watch the next diagonal line move forward. Third, I repeated this until I had watched all the diagonal lines move forward. In this way, I prepared to wait until the computation had finished by watching the progress bar (the moving diagonal lines). + +8. I prepared to observe the child become a meditator. I did this by stating that the meditator was born with 50 As. First, the meditator’s father or mother wrote 50 As. Second, they conceived the child. Third, the child was born. In this way, I prepared to observe the child become a meditator by stating that the meditator was born with 50 As. + +9. I prepared to read the “L” statement summarising the 50 As. I did this by working out that the writer with a good track record of writing As wrote 50 As. First, the writer obtained a good track record of writing As. Second, she worked out how to relate these to 50 As. Third, she wrote 50 As. In this way, I prepared to read the “L” statement summarising the 50 As by working out that the writer with a good track record of writing As wrote 50 As. + +10. I prepared to demonstrate at the laboratory. I did this by drawing diagrams for my Economics thesis. First, I drew the diagram. Second, I described it. Third, I wrote how it worked. In this way, I prepared to demonstrate at the laboratory by drawing diagrams for my Economics thesis. + +11. I prepared to write an argument. I did this by writing reasonings for my Economics thesis. 
First, I wrote the first quote down. Second, I wrote the second quote down. Third, I connected the two quotes. In this way, I prepared to write an argument by writing reasonings for my Economics thesis. +
+12. I prepared to continue with my post-doc. I did this by preparing for pedagogical success in writing an Economics thesis. First, I wrote an exposition and critique in the correct pedagogical format. Second, I wrote the minimum As for the assignment. Third, I attributed ten breasonings to each sentence. In this way, I prepared to continue with my post-doc by preparing for pedagogical success in writing an Economics thesis. +
+13. I prepared to write my thesis. I did this by writing 100 words per day, for a total of 80,000 words. First, I wrote a maximum of 100 words per day. Second, I paraphrased the rest of the 100 words per day. Third, I wrote 100 breasonings per day. In this way, I prepared to write my thesis by writing 100 words per day, for a total of 80,000 words. +
+14. I prepared to observe the relation between money versus time variables. I did this by writing the Economics equation. First, I calculated the gradient, m, to equal rise/run, which equals (y2-y1)/(x2-x1), which equals (5-0)/(5-0), which equals 1. Second, I calculated the intercept, c, to equal y1-mx1, which equals 0-1*0, which equals 0. Third, I wrote the equation y=1*x+0 or y=x. In this way, I prepared to observe the relation between money versus time variables by writing the Economics equation. +
+15. I prepared to verify sales separately using breasonings currency. I did this by grouping my applied Pedagogy entries for a thesis about breasonings currency. First, I formed a sales as breasonings currency group. Second, I formed a marking as breasonings currency group. Third, I formed a Lucianic Medical Psychiatry as breasonings currency group. 
In this way, I prepared to verify sales separately using breasonings currency by grouping my applied Pedagogy entries for a thesis about breasonings currency. + +16. I prepared to lead my own country with breasonings currency as the currency. I did this by grouping my company notes for a thesis about breasonings currency. First, I taught in schools as breasonings currency. Second, I taught in homes as breasonings currency. Third, I taught at University as breasonings currency. In this way, I prepared to lead my own country with breasonings currency as the currency by grouping my company notes for a thesis about breasonings currency. + +17. I prepared to tell the newspaper about my latest book. I did this by collecting a high-quality comment about my Economics thesis. First, I wrote my Economics thesis. Second, I requested that a famous philosopher give a high-quality comment about my Economics thesis. Third, I collected a high-quality comment about my Economics thesis. In this way, I prepared to tell the newspaper about my latest book by collecting a high-quality comment about my Economics thesis. + +18. I prepared to symbolise the connection between two breasoning currency Bs (where instead of “I ate the apple”, which would be found in an A, a B would contain “I shouldn’t choke on the apple”) where each B would make a sale viable by encouraging buying, with a hand throwing away breasoning currency. I did this by creating a breasoning currency simulation. First, I paid for a breasoning service with breasoning currency. Second, I received the breasoning service. Third, others received the breasoning service in return for others paying for them. 
In this way, I prepared to symbolise the connection between two breasoning currency Bs (where instead of “I ate the apple”, which would be found in an A, a B would contain “I shouldn’t choke on the apple”) where each B would make a sale viable by encouraging buying, with a hand throwing away breasoning currency by creating a breasoning currency simulation. + +19. I prepared to avoid crisis by making a profit. I did this by verifying that the breasoning currency worked in an international setting. First, I wrote the breasoning currency argument. Second, I submitted it to a government translator/verifier. Third, I used it in another country. In this way, I prepared to avoid crisis by making a profit by verifying that the breasoning currency worked in an international setting. + +20. I prepared to employ an internal auditor to ensure that no money was going astray. I did this by verifying the breasoning currency used by a non-profit organisation. First, I spent some breasoning currency. Second, I made more breasoning currency from products and services bought with it. Third, I put this profit back into the non-profit organisation. In this way, I prepared to employ an internal auditor to ensure that no money was going astray by verifying the breasoning currency used by a non-profit organisation. + +21. I prepared to ensure the product was handled in a high-quality manner. I did this by verifying the quality of breasoning currency products. First, I wrote the breasoning currency product. Second, I verified the quality of the breasoning currency product. Third, I reported it on a public forum. In this way, I prepared to ensure the product was handled in a high-quality manner by verifying the quality of breasoning currency products. + +22. I prepared to verify that the objects in the topics of the arguments had been thought of clearly. I did this by verifying the developed topics of the arguments. First, I verified the developed topics of the first argument. 
Second, I prepared to verify the developed topics of the next argument. Third, I repeated this until I had verified the developed topics of all of the arguments. In this way, I prepared to verify that the objects in the topics of the arguments had been thought of clearly by verifying the developed topics of the arguments. +
+23. I prepared to write the grand equation. I did this by connecting my arguments in the Economics thesis together. First, I wrote the argument for the first equation part. Second, I prepared to write the argument for the next equation part. Third, I repeated this until I had written and connected the arguments for each equation part together. In this way, I prepared to write the grand equation by connecting my arguments in the Economics thesis together. +
+24. I prepared to confirm 80 breasonings maximum per day in meditation in breasonings currency. I did this by writing about Lucianic Meditation (Philosophy) in each sentence in a section of the Economics thesis. First, I meditated to entrain myself in each department used in breasonings currency for that day. Second, I undertook specific training in a philosophy department to be confirmed by meditation. Third, I wrote the breasoning currency argument part. In this way, I prepared to confirm 80 breasonings maximum per day in meditation in breasonings currency by writing about Lucianic Meditation (Philosophy) in each sentence in a section of the Economics thesis. +
+25. I prepared to write about how the economics of society benefitted through meditation. I did this by connecting my blog entries together in the Economics thesis. First, I wrote the breasonings in the robotic mind. Second, I wrote 80 of these in pedagogy. Third, I wrote about these in meditation. In this way, I prepared to write about how the economics of society benefitted through meditation by connecting my blog entries together in the Economics thesis. +
+26. I prepared to submit the book for publishing. 
I did this by writing the Economics thesis well. First, I wrote my Economics thesis. Second, I proofread my thesis. Third, I checked its punctuation and grammar. In this way, I prepared to submit the book for publishing by writing the Economics thesis well. + +27. I prepared to watch people survive on the pedagogy (et al)-based system. I did this by examining the moneyless (pedagogy-based) system. First, I observed a group of volunteer breasoners form. Second, I observed them make pedagogical currency. Third, I looked at it enshrined in a library. In this way, I prepared to watch people survive on the pedagogy (et al)-based system by examining the moneyless (pedagogy-based) system. + +28. I prepared to describe the process of making sales in the Economics thesis. I did this by writing philosophy essays, with breasonings critiqued, in the Economics thesis. First, I wrote the breasonings currency philosophy essay. Second, I critiqued it. Third, I sold [a product or service with] it. In this way, I prepared to describe the process of making sales in the Economics thesis by writing philosophy essays, with breasonings critiqued, in the Economics thesis. + +29. I prepared to ensure perfect function in Medicine. I did this by writing how products were given in return for As. First, I observed that 10 Medicine As were transformed into the value of 50 As. Second, I observed that this reminded the subject to breathe at a slow rate using his windpipe. Third, I noticed that knowledge about Medicine translated into good health. In this way, I prepared to ensure perfect function in Medicine by writing how products were given in return for As. + +30. I prepared to write how non-government entities prepared for government with breasonings currency by functioning well. I did this by relating microeconomics to breasonings currency. First, I wrote that the company functioned well with breasonings currency. Second, I wrote that the individual functioned well with breasonings currency. 
Third, I wrote that the individual visiting the company functioned well with breasonings currency. In this way, I prepared to write how non-government entities prepared for government with breasonings currency by functioning well by relating microeconomics to breasonings currency. +
+31. I prepared to organise world government liaison with government. I did this by relating macroeconomics to breasonings currency. First, I observed the government assign breasonings currency to government. Second, I observed the government assign breasonings currency to health, industry and education. Third, I observed the government assign breasonings currency to individuals. In this way, I prepared to organise world government liaison with government by relating macroeconomics to breasonings currency. +
+32. I prepared to write about what’s in it for me. I did this by writing about Economics in each sentence in my Economics thesis about breasonings currency. First, I found the amount that the product cost in breasonings currency. Second, I found the amount of tax in breasonings currency for this product. Third, I added the amount of tax to the amount that the product cost in breasonings currency. In this way, I prepared to write about what’s in it for me by writing about Economics in each sentence in my Economics thesis about breasonings currency. +
+33. I achieved the aim. I controlled the vessel flow with the musical instrument. First, I held the instrument. Second, I played its note. Third, I observed its effect. +
+34. The whole note was paid for. The holy (ideal) origin of evil (goodness) becomes the real apparent realm of impurity (purity) in lower (upper) Creation. Even though (while) we give evil (good) good thoughts, it descends (ascends). Nietzsche was imagined to think that we cannot (can) control that the bad (good) doesn’t become (becomes) part of the lower (upper) creation. There is nothing on top. +
+35. I knew God’s (the woman’s) ways before using money to create music. 
I identified hating (identified) God (the person). I saw her adders. I saw her subtractors. I hated (loved) both of them. +
+36. I found a use for money (teaching finders). I stated that if a thing exists, there is a use for money in music about it. I found it. I found music synthesising it (God or a person finding the others with the self and another). I found money in the parts of the music. +
+37. I experienced a change in the mood. I described the use for money in musical Trimino Tetris. Each trimino represented a particular instrument playing notes from a chord. I found money-originating moods arising from certain combinations of instruments. I found a use for the mood. +
+38. I decrypted the composition by transposing it. I wrote the use for money in musical encryption. I wrote the note as a value. I used (bought) it by adding one to it. I wrote the value as a note. +
+39. I made money from playing back music at the appropriate tempo, inversely proportional to the liquid’s viscosity. Water Gravity-Lift Prolog allowed recursively finding chords in the chord progression. I found the note in Prolog. I found the next note of the chord in Prolog. I found the chord progression of chords in Prolog. +
+40. I connected the rhythm sticks through time, earning money. The rhythm stick player struck the stick parallel to the XY and YZ planes and the stick parallel to the XY and XZ planes by moving the first stick to the second stick on a path parallel with the Z-axis. I noticed that the first stick was parallel to the Y-axis. I noticed that the second stick was parallel to the X-axis. I noticed that when these 1 decimetre sticks were in a decimetre cube, they could be struck by movement of one of them through the Z-axis. +
+Uses for Money in Fine Arts +
+1. I prepared to experience heaven (bliss) on earth. I did this by smelling the potpourri plant. First, I held the potpourri plant. Second, I lifted it to my nose. Third, I smelt it. 
In this way, I prepared to experience heaven (bliss) on earth by smelling the potpourri plant. +
+2. I prepared to paint the dancer. I did this by observing that the gymnast was made God (the mistress). First, I observed the gymnast enter the hall. Second, I observed the appointer enter the hall. Third, I observed the gymnast being made God (the mistress). In this way, I prepared to paint the dancer by observing that the gymnast was made God (the mistress). +
+3. I prepared to observe the carpenter build the house. I did this by observing the carpenter arranging her desk. First, I observed her place the piece of wood on her desk. Second, I observed her place the hammer on her desk. Third, I observed her place the nail on her desk. In this way, I prepared to observe the carpenter build the house by observing the carpenter arranging her desk. +
+4. I prepared to listen to the harpist play the melody. I did this by listening to the harpist sing the note. First, I sat on the seat. Second, I watched the harpist walk on stage. Third, I listened to the harpist sing the note. In this way, I prepared to listen to the harpist play the melody by listening to the harpist sing the note. +
+5. I prepared to market in a department, showing University skills are generic across departments. I did this by ensuring the marketing guru knew what he was talking about. First, I wrote a research program as part of my tenure. Second, I connected two perspectives. Third, I made sure that everyone was happy, creating a University. In this way, I prepared to market in a department, showing University skills are generic across departments by ensuring the marketing guru knew what he was talking about. +
+6. I prepared to produce a fine arts production. I did this by ensuring that the artist made money from a product. First, I wrote 15 10 breasoning As per student. Second, I wrote 5 10 breasoning As for a student’s student. Third, I wrote 5 10 breasoning As for members of the audience. 
In this way, I prepared to produce a fine arts production by ensuring that the artist made money from a product. +
+7. I prepared to be interested in fine arts. I did this by determining that the artist made money from creating art in another industry. First, the artist earned a job as an artist in another industry. Second, the artist produced art. Third, the artist was paid. In this way, I prepared to be interested in fine arts by determining that the artist made money from creating art in another industry. +
+8. I prepared to receive what I liked. I did this by observing that the chrysalis forming was a metaphor for the artist and her work. First, she wrote vocational education and training versions of 10*80 breasoning major As. Second, she worked on parts of the work. Third, she completed the work. In this way, I prepared to receive what I liked by observing that the chrysalis forming was a metaphor for the artist and her work. +
+9. The sculptor prepared to record the happy person in sculpture. The sculptor did this by cutting off the sculpture’s potbelly. First, he wrote that the subject of the sculpture ate food as part of a diet to lose weight. Second, he wrote that the subject lost weight. Third, he sculpted the subject. In this way, the sculptor prepared to record the happy person in sculpture by cutting off the sculpture’s potbelly. +
+10. I prepared to detail the frieze. I did this by cutting the japonica. First, I walked to the japonica. Second, I positioned the secateurs at the base of the japonica. Third, I cut the japonica to press. In this way, I prepared to detail the frieze by cutting the japonica. +
+11. The university prepared to be famous by planning for the next cohort of students. The university did this by breasoning out degrees of As in meditation (the body metaphor) for students in return for payment. First, the university lecturer and student breasoned out an A each per assignment. 
Second, the university lecturer and student breasoned out a degree of As. Third, all the university lecturers and students breasoned out their degrees. In this way, the university prepared to be famous by planning for the next cohort of students by breasoning out degrees of As in meditation (the body metaphor) for students in return for payment. + +12. I prepared to produce the relevant output. I did this by writing a predicate in my algorithm to select the relevant part. First, I selected an item at level n. Second, I selected an item at level n+1. Third, I selected an item at level n+2. In this way, I prepared to produce the relevant output by writing a predicate in my algorithm to select the relevant part. + +13. I prepared to listen to the classic. I did this by cleaning under my arm with spinach. First, I wet the spinach leaf. Second, I placed the spinach leaf under my arm. Third, I cleaned under my arm with the spinach leaf. In this way, I prepared to listen to the classic by cleaning under my arm with spinach. + +14. I prepared to reach the mountain summit. I did this by climbing the mountain. First, I made preparations. Second, I found my kit. Third, I climbed the mountain. In this way, I prepared to reach the mountain summit by climbing the mountain. + +15. I prepared to submit the assignment. I did this by performing a positive act on the graph. First, I selected the x-range. Second, I selected the y-range. Third, I printed the graph. In this way, I prepared to submit the assignment by performing a positive act on the graph. + +16. I prepared to be protected. I did this by stating that I am close to God (the master). First, I stood there. Second, I observed God (the master) stand next to me. Third, I knew that I was close to God (the master). In this way, I prepared to be protected by stating that I am close to God (the master). + +17. I prepared to unconceal Heidegger’s heart. I did this by picking up the acorn. First, I saw the acorn. 
Second, I picked it up. Third, I concealed it. In this way, I prepared to unconceal Heidegger’s heart by picking up the acorn. +
+18. I prepared to convert money into electronic form. I did this by breasoning out the money. First, I counted that there was one coin. Second, I breasoned out (thought of the x, y and z dimensions of) the money. Third, I wrote down the amount of money from its volume. In this way, I prepared to convert money into electronic form by breasoning out the money. +
+19. I prepared to be happy. I did this by holding you. First, I found the man. Second, I held the man. Third, the police protected me. In this way, I prepared to be happy by holding you. +
+20. I prepared to eat out. I did this by projecting a positive thought at myself. First, I found a positive thought. Second, I found myself in the mirror. Third, I said, “It’s me” to myself. In this way, I prepared to eat out by projecting a positive thought at myself. +
+21. I prepared to write an algorithm that verified the sensory information about soccer scores. I did this by labelling the new dendrite. First, I found the source of information. Second, I found a new input from it. Third, I adjusted my result. In this way, I prepared to write an algorithm that verified the sensory information about soccer scores by labelling the new dendrite. +
+22. I prepared to connect two uses. I did this by working on the brain. First, I found the code of thoughts. Second, I found the first use of the protein in brain computation. Third, I found the second use of the protein in brain computation. In this way, I prepared to connect two uses by working on the brain. +
+23. I prepared to present the reply from the Queen’s Lady-in-Waiting at the school assembly. I did this by writing to the Queen. First, I wrote a letter. Second, I sent it to the Queen. Third, I received a letter in reply from the Queen’s Lady-in-Waiting. 
In this way, I prepared to present the reply from the Queen’s Lady-in-Waiting at the school assembly by writing to the Queen. +
+24. I prepared to help the child to an answer based on the first answer. I did this by helping the child to the answer. First, I found the child. Second, I found the knowledge gap between the child’s knowledge and the answer. Third, I explained the knowledge needed to fill this gap and the answer. In this way, I prepared to help the child to an answer based on the first answer by helping the child to the answer. +
+25. I prepared to check how the protein fitted into its environment. I did this by discovering that the computational algorithm for the protein worked. First, I chose the protein. Second, I wrote a computational algorithm for the protein. Third, I discovered that the computational algorithm for the protein worked. In this way, I prepared to check how the protein fitted into its environment by discovering that the computational algorithm for the protein worked. +
+26. I prepared to link the pieces of genetic code together. I did this by discovering the genetic code in my way. First, I discovered the first piece of genetic code. Second, I prepared to discover the next piece of genetic code. Third, I repeated this until I had discovered each piece of genetic code. In this way, I prepared to link the pieces of genetic code together by discovering the genetic code in my way. +
+27. I prepared to link the pieces of neurocode together. I did this by discovering the neurocode in my way. First, I discovered the first piece of neurocode. Second, I prepared to discover the next piece of neurocode. Third, I repeated this until I had discovered each piece of neurocode. In this way, I prepared to link the pieces of neurocode together by discovering the neurocode in my way. +
+28. I prepared to move on to the next exercise. I did this by viewing the red dot. First, I found the envelope. 
Second, I removed the card with the red dot printed on it from the envelope. Third, I viewed the red dot. In this way, I prepared to move on to the next exercise by viewing the red dot. +
+29. I prepared to go to heaven (the board room). I did this by viewing the red dot again. First, I walked up the hall. Second, I opened the chest of drawers. Third, I viewed the red dot again. In this way, I prepared to go to heaven (the board room) by viewing the red dot again. +
+30. I prepared to connect writing on fine arts. I did this by writing on fine arts. First, I wrote a word. Second, I composed a note about it. Third, I painted a brush stroke. In this way, I prepared to connect writing on fine arts by writing on fine arts. +
+31. I prepared to connect body organs. I did this by writing about medicine. First, I researched medical types. Second, I chose an object. Third, I wrote about the stomach’s role in digestion. In this way, I prepared to connect body organs by writing about medicine. +
+32. I prepared to like you by using money in the same way as you. I did this by writing in Economics. First, I wrote about money. Second, I wrote about people using the money. Third, I wrote about objects they spent the money on. In this way, I prepared to like you by using money in the same way as you by writing in Economics. +
+Breasonings Currency +
+1. I prepared to test the person who I knew. I did this by reading Plato’s statistics. First, I read the company data. Second, I analysed the data. Third, I wrote the conclusion. In this way, I prepared to test the person who I knew by reading Plato’s statistics. +
+2. I prepared to say that breasonings currency is enough to be emotional about. I did this by drawing the face on the paper plate. First, I drew the eyes on the paper plate. Second, I drew the nose on the paper plate. Third, I drew the mouth on the paper plate. 
In this way, I prepared to say that breasonings currency is enough to be emotional about by drawing the face on the paper plate. +
+3. I prepared to verify the reason that the breasoning currency was good. I did this by setting the cup and saucer on the table. First, I opened the conversation. Second, I listened to the reply. Third, I concluded that the breasoning currency was good. In this way, I prepared to verify the reason that the breasoning currency was good by setting the cup and saucer on the table. +
+4. I prepared to verify that Plato may have liked breasoning currency because it provided up to and greater than 50 As of verification for purchases. I did this by investigating whether Plato’s Academy existed. First, I read the texts Plato wrote. Second, I investigated the existence of the Academy pointed to by the texts. Third, I found the reason for this. In this way, I prepared to verify that Plato may have liked breasoning currency because it provided up to and greater than 50 As of verification for purchases by investigating whether Plato’s Academy existed. +
+5. I prepared to weigh the male’s reasoning. I did this by attesting that a male partner was necessary. First, I tested the breasoning currency against meditation (philosophy) by providing one breasoning for each breasoning said. Second, I tested the purchase against an A for myself, you and them. Third, I found a critique for the breasoning currency. In this way, I prepared to weigh the male’s reasoning by attesting that a male partner was necessary. +
+6. I prepared to count the molecules. I did this by eating the muesli. First, I placed the sultanas in the bowl. Second, I placed the bran in the bowl. Third, I poured soy milk into the bowl. In this way, I prepared to count the molecules by eating the muesli. +
+7. I prepared to plan my business after I died, by writing a certain amount of breasoning currency but only spending enough. I did this by loving N’ecritina (sic). First, I saw death. 
Second, I asked, “Is it not?”. Third, I saw the woman. In this way, I prepared to plan my business after I died, by writing a certain amount of breasoning currency but only spending enough by loving N’ecritina (sic). +
+8. I prepared to verify the answer. I did this by questioning the politician. First, I wrote the question. Second, I asked the politician the question. Third, I listened to his reply. In this way, I prepared to verify the answer by questioning the politician. +
+9. I prepared to examine the enjoyment of the movie by a fictional audience. I did this by agreeing with the universal movie. First, I verified the cultural charm of the movie. Second, I verified the possible translation of the line. Third, I verified the appeal of the movie in the region. In this way, I prepared to examine the enjoyment of the movie by a fictional audience by agreeing with the universal movie. +
+10. I prepared to pay for marketing in breasoning currency. I did this by advertising the apple. First, I enjoyed the march. Second, I enjoyed the parade. Third, I enjoyed the sea. In this way, I prepared to pay for marketing in breasoning currency by advertising the apple. +
+11. I prepared to write that breasoning was a threshold unit in breasoning currency. I did this by asking, “What is breasoning currency?”. First, I wrote that breasoning in breasoning currency can be a noun. Second, I wrote that breasonings in breasonings currency can be a noun. Third, I said it depended on the number. In this way, I prepared to write that breasoning was a threshold unit in breasoning currency by asking, “What is breasoning currency?”. +
+12. I prepared to state that love endured. I did this by stating that the person loved his relative. First, I stated the value in breasoning currency in the relationship between the person and his relative. Second, I stated the value in breasoning currency in the relationship between the person and his other relative. 
Third, I stated the value in breasoning currency in the relationship between his relative and the other relative. In this way, I prepared to state that love endured by stating that the person loved his relative. + +13. I prepared to attribute breasoning currency to universal coincidences. I did this by reading the lucky heart. First, I opened the lucky heart book. Second, I stated the heart was good because of cardiovascular activity. Third, I enjoyed walking to the bus stop every morning. In this way, I prepared to attribute breasoning currency to universal coincidences by reading the lucky heart. + +14. I prepared to be high quality in agreement with the original reason in the end. I did this by writing the 250 breasoning antidigressing A. First, I wrote the reason. Second, I wrote the objection to (reason for) the reason. Third, I wrote the rebuttal to (reason for) the objection (reason). In this way, I prepared to be high quality in agreement with the original reason in the end by writing the 250 breasoning antidigressing A. + +15. I prepared to play the character. I did this by holding the character. First, I was chosen as the actor. Second, I held the model of the character. Third, I commented that his teeth were clean. In this way, I prepared to play the character by holding the character. + +16. I prepared to sleep again. I did this by honking the horn. First, I found the man. Second, I honked the horn. Third, I went to sleep. In this way, I prepared to sleep again by honking the horn. + +17. I prepared to go to bed with idemic fever. I did this by seizing the day. First, I helped the handmade corduroy specialist bake with the oven. Second, I helped Malibu return to normal. Third, I sped up and went off. In this way, I prepared to go to bed with idemic fever by seizing the day. + +18. I prepared to make gay love to you. I did this by writing that I am free. First, I wrote I loved you. Second, I died helping make up messences (sic). Third, I desired you. 
In this way, I prepared to make gay love to you by writing that I am free. + +19. I prepared to protect the wallabies. I did this by passing environmental laws. First, I found the environment. Second, I protected it. Third, I stated that this was good. In this way, I prepared to protect the wallabies by passing environmental laws. + +20. I prepared to protect the enimals. I did this by establishing animal breeding programs. First, I loved enimals (sic) (animals that would survive). Second, I loved wallabies. Third, I loved other fauna. In this way, I prepared to protect the enimals by establishing animal breeding programs. + +21. I prepared to examine every single language. I did this by teaching living and dead languages. First, I liked you. Second, I liked myself. Third, I liked everyone. In this way, I prepared to examine every single language by teaching living and dead languages. + +22. I prepared to make more profit. I did this by undertaking Lucianic Meditation (philosophy) business training (to help with how to meditate (write)). First, I knew how to meditate (write). Second, I helped with it. Third, I verified this using spreadsheets. In this way, I prepared to make more profit by undertaking Lucianic Meditation (philosophy) business training (to help with how to meditate (write)). + +23. I prepared to understand when a child could spell. I did this by studying education, in particular that primary students should be observed to transition to mentally break down words into syllables. First, I tested whether the student could do this each month. Second, I repeated this until the student could do this. Third, I wrote down the student’s year level at this point. In this way, I prepared to understand when a child could spell by studying education, in particular that primary students should be observed to transition to mentally break down words into syllables. + +24. I prepared to be protected by God (the master). 
I did this by having legal advice about the Lucianic Meditation (philosophy) legal structure. First, I made the first level the director. Second, I made the second level the board. Third, I made the third level the staff. In this way, I prepared to be protected by God (the master) by having legal advice about the Lucianic Meditation (philosophy) legal structure. + +25. I prepared to give an A to the student. I did this by writing on psychiatry. First, I prevented hallucinations. Second, I prevented depression. Third, I prevented mental breakdowns. In this way, I prepared to give an A to the student by writing on psychiatry. + +26. I prepared to meditate (write) on philosophy. I did this by writing a philosophy meditation (word). First, I wrote the philosophy utterance-summarised argument (argument). Second, I wrote the utterance (word). Third, I said the utterance (word). In this way, I prepared to meditate (write) on philosophy by writing a philosophy meditation (word). + +27. I prepared to give the people time in jail. I did this by stating that the person is a prisoner. First, I saw the prisoner. Second, I saw that he was like himself. Third, I gave him asylum and others security by doing this. In this way, I prepared to give the people time in jail by stating that the person is a prisoner. + +28. I prepared to think clearly. I did this by making it with Pedagogy. First, I wrote the manuscript. Second, I typed it. Third, I breasoned it out, earning the grade. In this way, I prepared to think clearly by making it with Pedagogy. + +29. I prepared to verify that the function of the named object was necessary with breasoning currency (verifying by writing a breasoned out argument) when purchasing. I did this by stating that Plato is likely to have breasoned it out. First, I read the inscription that some claimed was above the door to Plato’s Academy “Let only geometers enter here”. 
Second, I deduced that “geometers” was likely to have referred to pedagogical breasoners. Third, I deduced from Plato’s name enduring that he was likely to have been a breasoner. In this way, I prepared to verify that the function of the named object was necessary with breasoning currency (verifying by writing a breasoned out argument) when purchasing by stating that Plato is likely to have breasoned it out. + +30. I prepared to examine the creation of the gem. I did this by examining the play of forces. First, I embraced you. Second, I embraced God. Third, I embraced me. In this way, I prepared to examine the creation of the gem by examining the play of forces. + +31. I prepared to observe you touching the hand of God (the master) looking at me. I did this by contacting the sublime. First, I thought of you with the artwork. Second, I thought of God (the master). Third, I thought of me holding the cup in the second artwork. In this way, I prepared to observe you touching the hand of God (the master) looking at me by contacting the sublime. + +32. I prepared to like how you looked. I did this by switching to “Yes, I like you”. First, I found you. Second, I found the switch. Third, I liked you by switching the switch on. In this way, I prepared to like how you looked by switching to “Yes, I like you”. + +33. I prepared to like construct artwork. I did this by switching to “No, I like you”. First, I found the person. Second, I saw the button. Third, I liked you by pressing the button. In this way, I prepared to like construct artwork by switching to “No, I like you”. + +34. I prepared to like you and me with everyone. I did this by saying that everyone was friendly with everyone else. First, I liked you, my friend. Second, I liked myself. Third, I liked everyone else. In this way, I prepared to like you and me with everyone by saying that everyone was friendly with everyone else. + +35. I prepared to start preparing for the next tea party.
I did this by working out the past part of the verb. First, I saw the cups and saucers on the table. Second, I saw that they were empty. Third, I worked out that the tea party had ended. In this way, I prepared to start preparing for the next tea party by working out the past part of the verb. + +36. I prepared to read with the class. I did this by working out the imperfect part of the verb. First, I picked up the book. Second, I said I liked to read it. Third, I put the book on the shelf. In this way, I prepared to read with the class by working out the imperfect part of the verb. + +37. I prepared to say that this is the present. I did this by working out the present part of the verb. First, I stated that I like John. Second, I stated that I like me. Third, I stated that I like the state. In this way, I prepared to say that this is the present by working out the present part of the verb. + +38. I prepared to like a person by being like a first person (that person) by exploring him. I did this by working out the future part of the verb. First, I stated that with will be like like. Second, I liked it. Third, I liked you, another person. In this way, I prepared to like a person by being like a first person (that person) by exploring him by working out the future part of the verb. + +39. I prepared to play the fact game. I did this by working out the indicative mood of the verb. First, I stated that I would state a fact. Second, I pointed to the hat with the pointer. Third, I stated that the hat is a hat. In this way, I prepared to play the fact game by working out the indicative mood of the verb. + +40. I prepared to be peaceful. I did this by working out the imperative mood of the verb. First, I stated that I would state the command. Second, I commanded the parade marchers, “March”. Third, I watched them march. In this way, I prepared to be peaceful by working out the imperative mood of the verb. + +41. I prepared to encourage peace.
I did this by working out the subjunctive mood of the verb. First, I stated that I would state the hypothetical statement. Second, I stated, “What if there was a Peace Castle?”. Third, I built the Peace Castle. In this way, I prepared to encourage peace by working out the subjunctive mood of the verb. + +42. I prepared to calculate that the closest child would play a minimum of 15 metres away. I did this by working out the infinitive mood of the verb. First, I stated that I would state what the person was to do. Second, I stated that I must do the first breasoning currency calculation, which was that I swung on the swing with the t-shirt 0.1-1.5 metres above the ground. Third, I calculated that my sister would swing on the swing 1 m to my left. In this way, I prepared to calculate that the closest child would play a minimum of 15 metres away by working out the infinitive mood of the verb. + +Marketing for LM to Siva + +1. The meditator recommended meditation (philosophy) to her friend. The meditator (philosopher) enjoyed meditation. She sat in the chair. She repeated the mantra (word). She gave the feedback that she enjoyed it. + +2. I had a nap under a blanket. I found the LM (comfort) folder. I adjusted the environment. I adjusted the ergonomics. I thought the mantra (the utterance). + +3. I networked with the other employees. I ordered catering. I organised the location. I organised the food. I organised the condiments. + +4. I tested the acoustics. I chose an appropriate location. I chose calm lights. I chose heaters. I chose comfortable furniture. + +5. There were more ideas in the light. I observed the light. I lit the instructor. I described how I learnt meditation (philosophy writing). I benefited in health, wealth and wisdom from meditation (creativity). + +6. I thought of something else. I played the video. I recorded the video. I played the video. I collected feedback on the video. + +7. I checked the room was safe. I turned on the heat.
I prepared the room by first turning on heat 10 minutes before use. I prepared the seats. I prepared the cushions. + +8. I stated my reasons for my unique sales proposition. I had a unique sales proposition. I had more knowledge of the inner workings of meditation (the text). I wrote about pranayama (breathing). I wrote about soma (stomach comfort). + +9. I thanked the teacher. I felt relaxed. I relaxed my jaw. I relaxed my neck. I relaxed my shoulders. + +10. I felt like it. I felt refreshed. My heart rate decreased. My breathing rate decreased. I felt like more. + +11. I rolled my head around on my shoulders. I relaxed my body. I stimulated my heart. I relaxed my lungs. I rolled my shoulders. + +12. I drank the water. I relaxed my mind. I went up. I performed less work than not. I had a holiday. + +13. I lived a long and happy life. I was healthy. I ate vegetarian food. I exercised. I had a social life. + +14. I ascended in the organisation. I was wise. I was critical (and cognizant, or had knowledge or awareness). I was knowledgeable. I knew the other. + +15. The class meditated (played) for twenty minutes. The teacher prepared the meditation (philosophy) class. He taught the mantra (word). He instructed the class to let the mantra (name) come to them. He instructed them to repeat it for a few minutes at a time silently. + +16. The thoughts were nice too. The meditators (thinkers) sloughed off thoughts while they relaxed. As their thoughts became finer and finer as the mantra (art) became finer and finer, their thoughts became nicer. The thoughts were about nourishing food. The food tasted delicious. + +17. I noticed that the customer bought when she wanted the product, and I was courteous. I thought that the algorithm thought of while meditating (thinking) was useful. I thought that the first condition was true. I thought that the second condition was true. I thought that the conditional was true. + +18. The particular customer wanted a particular algorithm. 
I chunked my meditations (thoughts) as logical examples, or propositions. I designed the product. I manufactured the product. I delivered the product. + +19. I was more productive after meditation (thought) than not. I recorded my production after I meditated (thought). I wrote the script. I filmed the production. I post-produced it. + +20. I wrote to you about my talk. I talked after meditation (writing). I recognised the audience. I recognised what they wanted. I said it to them. + +21. The meditator (stick) taught his student meditation (the process). The meditator (philosopher) meditated (wrote) every day. The meditator (writer) wrote a list of daily tasks. He woke up in the morning. He meditated (answered questions). + +22. The meditator (thinker) accessed the thought that was part of the sequence. The meditator (pedagogue) connected thoughts. The meditator (pedagogue) wrote down the first thought. He wrote down the second thought. He connected them together. + +23. The meditator performed the job. The meditator (business lady) met the person. She shook his hand. She asked what the training requirements were. She fulfilled the training requirements. + +24. The meditator (societologist) was always right. The meditator (societologist) was right. He thought of the thought. He decided that it was right. He wrote it down. + +25. The meditator (musician) felt happy. The student studied medicine before becoming a meditator (artist). The trainer wrote and itemised the ideas in the medicine course. He placed the ideas in the system. The student paid for the medicine course before becoming a meditator. + +26. I completed meditation (writing) for the day. I counted the meditation (philosophy) utterances. I started at 0. As I said each utterance, I counted it. I stopped at the desired value. + +27. I wrote the comment on the comment. I commented on the idea after meditation (writing). After meditation (writing) I listened to the idea said by the speaker. 
I wrote my comment on the card. I handed the card to the speaker. + +28. I verified the writing. I wrote about the idea after meditation. I read the idea after I meditated (wrote). I thought of the thought about the idea. I wrote this idea down. + +29. I remembered everything. I remembered the idea after meditation (writing). I wrote the idea. I memorised it. I remembered this later. + +30. The meditator (painter) tested the icon on the computer. The meditator (painter) painted the icon. The meditator (painter) drew the design. He filled the regions. He painted the features. + +31. I helped many people. I increased my longevity. I meditated (knew knowledge). I drank the water. I avoided stress (stayed peaceful). + +32. I increased my grade average. I increased my brain potential. I meditated (composed). I increased the number of thoughts that I thought of. I increased my brain potential. + +33. I increased my quality of life. I decreased stress (increased calmness). I avoided stressful things (thought of calm things). I avoided stressful people (I thought of calm people). I avoided stressful places (I thought of calm places). + +34. I ran the business. Meditation (writing) led to higher quality thoughts. Meditation (writing) led to teaching. Meditation (writing) led to management. Meditation (writing) led to better thoughts. + +35. I led group meditation (pencil sharpening). I invited the student to group meditation (drawing class). I determined that the student wanted to come. I invited him. I checked his name off on the guest list. + +36. I invited the students to group meditation and conducted it again. I conducted a group meditation (art class). The group made themselves comfortable. The group relaxed for the period of time. The group came out of meditation (art class) at the end. + +37. Your wish is my command. I collected the feedback on group meditation. I thought of the comment. I collected it. I collated the comments. + +38. 
I adjusted the heat if necessary. I collected the comment on the heat. The heat on the head was enough. The heat on the body was enough. The heat on the feet was enough. + +39. I bought another light. I collected the comment on the light. I made sure that the light in the street was light enough. I made sure that the light at the entrance was light enough. I made sure that the light in the room was light enough. + +40. I taste tested the food. I collected the comment on the food. I collected the comment on the broth. I collected the comment on the barley. I collected the comment on the vegetables. + +41. I meditated with the meditator (anthropologist). I stated that the meditator (philosopher) was present. I saw him. He saw me. We hugged. + +42. I recorded the meditation (comments). The meditator (nature metaphorist) wanted meditation. He saw the nature. He saw the metaphor. He meditated on it. + +43. Many had meditated (taught). I determined that LM (pedagogy) was marketing itself. LM (medicine) was there. It (the employees) marketed itself. The people came and viewed it. + +44. LM was Lucianic Meditation (LP was Lucianic Philosophy). LM (sport) was breasoned out (worked). The mantra (word) was the start. The mantra (utterance) was the middle. The mantra (syllable set) was the end. + +45. The people included everyone. Gays were welcome to LM (humanism). The gays found out about LM (computing). The gays came to LM (mathematics). The gays meditated (wrote creative philosophy). + +46. I found that breasonings (books) were the grail. I travelled to LM (head medicine). I found LM (the destination) on the map. I travelled towards it. I reached it. + +47. I thanked the host. I tasted LM (the vegan cheeseburger). I attended group meditation (the games). I contributed part of the meal. I tasted the meal. + +48. I determined that the concert had been a success. I smelled LM (the vegan ham sandwich). I found the clothes. I washed them with detergent. 
I wore the jumper to meditation (the concert). + +49. I marketed happiness. I stated that the two switches were LM (thought) and medicine. I meditated (wrote). I studied medicine. I was happy. + +50. I said that pedagogy was necessary. I stated that pedagogy supported meditation (discussion) and medicine. Pedagogy supported meditation (acting). Pedagogy supported medicine. I wrote on pedagogy. + +Breasonings Currency 2-6 (there is no 1) + +2. + +1 forward* +1. I prepared to cover my thoughts about the product or service. I did this by stating that breasoning currency is a set of arguments that pay for the product or service. First, I thought of an intelligent reason to buy the product or service. Second, I wrote it down. Third, I wrote an algorithm for it. +2. I prepared to write the perfect algorithm. I did this by predicting that breasoning currency had a legal component. First, I wrote the application. Second, I breasoned it out. Third, I stated that it was the currency. +3. I prepared to state that the algorithm was the breasoning currency. I did this by applying breasoning currency to interest rates. First, I wrote the algorithm writing algorithm. Second, I noticed the interest rates rise. Third, I noticed the transformation into an advanced civilisation. +4. I prepared to state that others bought more with breasoning currency. I did this by applying breasoning currency to lending rates. First, I bought more with breasoning currency. Second, I observed that there was increased income. Third, I observed that the lending rates decreased with economic growth. +5. I prepared to value parts of the country's culture. I did this by stating that there were different breasoning currencies in different countries. First, I observed that different numbers of a product were valued as one unit according to different numbers in supply. Second, I found the breasoning currency in Australia. Third, I found the breasoning currency in Argentina. +6.
I prepared to write about the product's philosophy. I did this by paying in breasoning currency. First, I obtained the product. Second, I valued the product. Third, I paid in breasoning currency. +7. I prepared to make breasoning currency. The subject did this by paying me in breasoning currency. First, he requested the medical device. Second, I thought of its value. Third, I charged him in breasoning currency. +8. I prepared to market the eggplant. I did this by stating that breasoning currency is a set of arguments to pay for a product or service. First, I found the vegetable. Second, I thought of its argument. Third, I labelled its price. +9. I prepared to update the argument. I did this by noticing breasoning currency arguments in culture. First, I noticed a breasoning currency argument was associated with the painting. Second, I noticed it was maintained. Third, I noticed the points at which it was sold. +10. I prepared to sell or keep the artwork. I did this by noticing breasoning currency verification in culture. First, I noticed the labelled breasoning currency. Second, I noticed the valued breasoning currency. Third, I noticed the difference in breasoning currency. + +3. + +1. I prepared to plan my breasoning reading and writing systems. I did this by stating that breasoning currency would remain in use. First, I noticed that breasoning currency was proportional to the product's value. Second, I noticed that each employee contributed breasonings to the price. Third, I noticed that morale and productivity increased. +2. I prepared to study computer science and philosophy. I did this by training for breasoning currency. First, I wrote an interpreter algorithm. Second, I wrote an algorithm induction algorithm. Third, I wrote a mind reader to generate breasoning algorithms. +3. I prepared to finish by generating 50 As of algorithms. I did this by writing 50 As for training. First, I enrolled in the class.
Second, I penned a shortlist of ideas that interested me in the course. Third, I wrote my List Prolog command wishlist in terms of previous commands. +4. I prepared to aim for a single discovery. I did this by stating that breasoning currency is like the universe. First, I found the person liking the natural phenomenon. Second, I found her inspiration. Third, I found how the universe contained a single list of breasonings about phenomena. +5. I prepared to find the mission interesting. I did this by observing the philanthropist donating breasoning currency. First, I checked the company's mission statement. Second, I found products that aligned with it. Third, I donated breasonings that explained why the product met the mission statement. +6. I prepared to teach myself computer science. I did this by thinking of a worm argument connecting the breasoning currency argument to the second product's breasoning currency argument. First, I looked ahead at what was necessary in the future. Second, I reached the threshold for the first argument and second argument. Third, I reached the threshold for the worm argument. +7. I prepared to make long term breasoning currency investments. I did this by stating that the government bank maintained breasoning currency. First, I argued that breasoning currency was non-subjective because it used standard means to argue for products. Second, in the case that it was subjective, the bank would allow leeway for selling price according to philosophy generated about the product. Third, the bank awarded higher interest rates for saved breasoning currency. +8. I prepared to invest breasoning currency in the sales. I did this by stating that the spiritual bot maintained the sales argument. First, we all had to complete the computer science assignment to program the bot. Second, the skill to program the bot was to find out what the customer was saying.
Third, I requested that three students seek training in three disciplines to prepare for sales. +9. I prepared to win roles, win products and sales education. I did this by stating that the computational bot maintained the sales argument. First, I went ahead of schedule and found the last desired product before the possible sale time. Second, I wrote an algorithm and argument supporting the sale. Third, I found how the sale supported breasoning currency. +10. I prepared to write a secondary text for selling. I did this by “bending” (thinking of) new reasons when reselling a product. First, I thought of the reason. Second, I thought of preventing it going non-well for a customer. Third, I liked the positive function. + +4. + +1. I prepared to offer the best service. I did this by choosing the best product or service. First, I found the product with the most simply written breasoning currency. Second, I found the product with the most generally designed breasoning currency. Third, I found the product with the most concisely designed breasoning currency. +2. I prepared to use the products at the appropriate rate. I did this by choosing the best sequence of products and services. First, I chose the best first item. Second, I chose the best second item. Third, I verified that the second item could follow the first item. +3. I prepared to prevent the assassination. I did this by observing people prevent World War 3 with breasoning currency. First, I found that promiscuity had a higher incidence of war. Second, I encouraged monogamy. Third, I drafted the education policy budget in breasoning currency. +4. I prepared to increase value with breasoning currency. I did this by stating that shrewd people chose breasoning currency. First, I stated that shrewd people chose property because of breasoning currency. Second, I stated that shrewd people chose medicine because of breasoning currency. Third, I stated that shrewd people chose business because of breasoning currency. +5.
I prepared to buy products that had greater value. I did this by desiring the products that I saw. First, I summarised the product's breasoning currency exposition. Second, I wrote the product's breasoning currency critique. Third, I wrote about possible future research about the product in terms of breasoning currency. +6. I prepared to develop a customised solution to save breasoning currency. I did this by finding the provider of the best products and services. First, I found the best product. Second, I found the cheapest product. Third, I found the free product. +7. I prepared to write breasoning currency for using the product with an exit action in mind. I did this by continuing to want the product. First, I wrote breasoning currency to test that the product was still functioning properly. Second, I wrote breasoning currency for needing the product. Third, I wrote breasoning currency for needing the product until an expected date. +8. I prepared to make the product available to the customer in terms of breasoning currency. I did this by stating that the buyer knew the business. First, I stated that the buyer knew where the product was available in terms of breasoning currency. Second, I stated that the buyer knew when the product was available in terms of breasoning currency. Third, I stated that the buyer knew how the product worked in terms of breasoning currency. +9. I prepared to develop products for customers with the help of staff in terms of breasoning currency. I did this by knowing the business. First, I knew the staff. Second, I knew the products. Third, I knew the customers. +10. I prepared to simulate the product. I did this by blessing the product or service. First, I found that the product was generally useful. Second, I found that the product was seen as secularly appealing. Third, I found that the product connected with spirituality. + +5. + +1. I prepared to sell with what the customer wanted and high quality thoughts.
I did this by correcting (ensuring positive function) with breasoning currency. First, I reached the top. Second, the people voted for me. Third, I aligned with my character. +2. I prepared to sell the building blocks way to write the algorithm. I did this by attracting people with breasoning currency. First, I found their idea. Second, I found their specification. Third, I found their algorithm. +3. I prepared to look after sub-dependents with breasoning currency. I did this by looking after my dependent with breasoning currency. First, I wrote the reason for the idea. Second, I wrote the algorithm for the new feature. Third, I discussed them with my students. +4. I prepared to act randomly. I did this by stating that the dependent (he/she/they) were there with breasoning currency. First, I wrote gender neutral breasoning currency. Second, I found that disagreement was nothing. Third, I found the industry afterwards. +5. I prepared to do more. I did this by stating that I wanted the dependent (he/she/they) with breasoning currency. First, I wanted the sweet. Second, I wanted the rod. Third, I wanted the void. +6. I prepared to be God. I did this by recommending the product sold for breasoning currency. First, I wanted you. Second, I wanted your seances. Third, I wanted more. +7. I prepared to state that it was compatible with veganism. I did this by stating that consumers bought chains of products leading to increased profits being reported by newspapers. First, I liked you. Second, I liked your products. Third, I ate the olive whole. +8. I prepared to want you. I did this by transporting the self with breasoning currency. First, I found you. Second, I transported you half. Third, I loved it. +9. I prepared to transport with love. I did this by transporting the other with breasoning currency. First, I found it. Second, I loved you. Third, I loved everyone. +10. I prepared to argue against idealism.
I did this by stating the breasoning (the self) was connected with currency (the other). First, the other was instantiated. Second, I verified the other. Third, breasoning currency was real. + +6. + +1. I prepared to find pedagogy. I did this by stating that the business model was constructed with breasoning currency in mind. First, I found union in Upasana. Second, I found meditation came from it. Third, I found medicine came from it. +2. I prepared to want you. I did this by achieving my goal in business. First, I employed a method to breason out the appearance of the goal. Second, I employed a method to breason out the inductive process. Third, I employed a method to breason out the automatic learning process. +3. I prepared to want religion. I did this by stating that the breasonings were influenced by and protected by religion. First, I nuted (sic) them. Second, I loved them. Third, I wanted you. +4. I prepared to include religion in society. I did this by stating that business needed meditation because of religion influencing breasoning currency. First, I found religion. Second, I found what you wanted. Third, I gave it to you. +5. I prepared to design you a new boat. I did this by observing religion/divinity on Earth because of religion influencing breasoning currency. First, I loved you. Second, I wanted you. Third, you were good. +6. I prepared to build love. I did this by orally delivering the breasoning currency influenced by religion. First, I found you. Second, I found it. Third, I found everything. +7. I prepared to state that breasoning was to the other as currency was to Earth. (As Earth gives to me, breasoning guides me.) I did this by stating that I was with the other on planet Earth. First, I was with you. Second, I painted you. Third, I increased mouthing off in childhood. +8. I prepared to accept the fifty areas of study standard. I did this by stating that breasoning currency was influenced by society.
First, I noted that the universal pension problem was solved. Second, I steered away from drugs. Third, I observed society give and breasoning currency found. +9. I prepared to go with it. I did this by stating that breasoning currency was influenced by (clique) politics x pedocracy (breasoningocracy). First, I love you (I got it). Second, I love it (you like it). Third, I love everything too (we like everyone in it). +10. I prepared to know you. I did this by verifying breasoning currency. First, I found it. Second, I loved it. Third, I laughed with the subject with it."] \ No newline at end of file diff --git a/Lucian-Academy/Books/ECONOMICS/Economics index.txt b/Lucian-Academy/Books/ECONOMICS/Economics index.txt new file mode 100644 index 0000000000000000000000000000000000000000..7f41d6469155711bbd92d2bb5f05b471c10a7117 --- /dev/null +++ b/Lucian-Academy/Books/ECONOMICS/Economics index.txt @@ -0,0 +1,18 @@ +["Green, L 2024, Economics index, Lucian Academy Press, Melbourne.","Green, L 2024",1,"Economics + +Leadership +Uses for Money in Theatre Studies +Uses for Money in Epistemology (Poetry) +Uses for Money in Music +Uses for Money in Fine Arts +Breasonings Currency +Marketing for LM to Siva +Breasonings Currency 2-6 (there is no 1) +Funnel ready to start work + +Economics + +4*800-(1857+16)/5 =2826 sentence paragraphs +2826/80=36 80s +36*80/9 chapters=320 sentences per chapter +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/ECONOMICS/Funnel ready to start work.txt b/Lucian-Academy/Books/ECONOMICS/Funnel ready to start work.txt new file mode 100644 index 0000000000000000000000000000000000000000..2095456081edb9fb6a78680990f230cfb63b35fe --- /dev/null +++ b/Lucian-Academy/Books/ECONOMICS/Funnel ready to start work.txt @@ -0,0 +1,19 @@ +["Green, L 2022, Funnel ready to start work, Lucian Academy Press, Melbourne.","Green, L 2022",1,"Funnel ready to start work. + +1. I helped blood circulation to concentrate and put aside distractions to work. 
I relaxed. I performed yoga in the morning. I did a work-out. I drank revitalising tea. +2. I measured the time reduction of production of work. I smelt spiritual vetiver. Vetiver was a plant's essential oil that helped with memory and concentration. I dabbed it on my feet. It helped my brain to the point. +3. I cleared my desk and vacuumed the floor. I could think clearly. I prepared to study by getting my computer ready and remembering what I needed to do. I checked I had the source files. If not, I contacted the lecturer and went on with other work. +4. I visualised completing the task until the end (submission). I remembered what was necessary. I brought up the relevant application. I considered writing my own software to automate repeated tasks, for example writing an algorithm rather than manually doing work. I thought of the task and what I needed to do it. +5. If I had any questions about the citation style, I looked them up. I looked for my study materials. I loaded my document or created a new one. I typed in the document header. I made sure that I abided by the formatting and citation guidelines. +6. I made sure that there was enough light in the room. I turned on the light. I opened the window for fresh air. I used natural light or turned the light on. I faced north or east. +7. I stretched and wiggled my toes to keep up circulation. I set up the chair with a cushion or pillows on the bed. I made sure that the chair was the right height, with a foot rest if necessary. I made sure that the top of the screen was level with my eyes. Alternatively, I used my computer on my bed. +8. One teacher gave a similar answer, to be used as a model for working out the answer. I made sure that the children had a specific aim. The children had been given a task. They had highlighted the question. I had discussed it with them, outlining the correct approach in class. +9. 
The teacher gave As for the student to write a formula finder at University, but emphasised pedagogy, two sources on an idea in one paragraph and getting help. The student found the answer. The student substituted the values into the formula. She wrote units after the answer. In algebra, she substituted values back into the formula that she found. +10. The student cleaned her desk. The student stopped using her phone and listening to music. She turned off the television. She made sure the phone was off the hook. She avoided distractions from other people, etc. +11. I found the best sources from the library. I found my notes from the lectures and tutorials and paraphrased the relevant parts. I read the question. I found the appropriate sources to answer it. I paraphrased the sources by replacing the words with my own words to demonstrate my understanding and cited them. +12. I thought that business was good and that it should be simple. I could smell the roses by studying, by thinking how it would help achieve my goal. I wrote 250 breasonings towards a goal. I noticed the change during the time. I spent a long time researching (connecting the parts of the) philosophy. +13. I used the best grammar checker. I remembered small details, for example what to do at the end, such as connecting philosophies. I also documented algorithms at the end. I wrote down study tips at the end of each session. I also wrote down my feedback about myself, the content and how enjoyable it was. +14. I modified my essay and earned a high grade. I double checked with the lecturer that I was on the right track. I drafted the essay. I sent it to the lecturer for feedback. He suggested structural or answer changes or additions. +15. I found inspiration to keep going. I took a study break after a certain amount of time. I set the timer. I worked until it went off. I walked around and stretched my legs outside. +16. I earned a high distinction for enough analysis. 
I checked that I had two sources on the same keyword in one paragraph. I found that my previous high-achieving essay had this pattern. I remembered it while using the best grammar checker. I used this in all work I completed. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/ECONOMICS/Improvements argument.txt b/Lucian-Academy/Books/ECONOMICS/Improvements argument.txt new file mode 100644 index 0000000000000000000000000000000000000000..b7ddcc82a5946d35a334e37995bfe0299300acff --- /dev/null +++ b/Lucian-Academy/Books/ECONOMICS/Improvements argument.txt @@ -0,0 +1,37 @@ +["Green, L 2024, Improvements argument, Lucian Academy Press, Melbourne.","Green, L 2024",1,"Marketing Improvements argument for A2 Milk assignment argument + +1. I stopped and used CI/CD with my repositories. I tested whether the web app accessed files on the server and, if so, stopped it. I found commands that loaded files. I tested whether a password screen was protecting their contents. If not, I programmed the interpreter to stop. +2. I relied on an internet connection to run a repository. I recorded whether a repository version had a vulnerability and notified the user not to run it. I scanned the repository. I recorded whether it had a vulnerability. I wrote and ran an algorithm that stopped running repositories with vulnerabilities. +3. I made live update images for a repository on the read-me file online. I automatically updated dependencies. Before running a repository, I checked for updates. I installed the latest possible update, including a significant change that required converting data files. I converted data files. +4. I made a live version update image for a repository on the read-me file online. I identified bugs. I ran the debugger. It checked code for syntax errors and illegal characters. I automatically fixed these. +5. I made an algorithm writing helper and a library of practical algorithms. I corrected the operator expected bug. 
I parsed the code with the parser. I detected if a character was missing, extra or out of place. I corrected it. +6. I verified the type of input to convert. I found the type mismatch error. I parsed the type statement. I checked the predicate always had the correct types. I checked called predicates' type (statements). +7. I automatically debugged with type statements and an algorithm that judged the usefulness of an algorithm. I found the singleton error. I aimed for and concluded that if the predicate was always smaller than something I was thinking, then, for example, I could predict it with pattern matching. For instance, I could give the algorithm input and output, which would work out an algorithm. I could train it to pattern match through stepped data and, for example, work out when to use \"get item n\" and \"put item n\". +8. I moderated flagged content. I found the privacy hole. I found that the private data shouldn't be visible through the hole. This data included names and addresses or other information in a data file. Also, I locked user accounts that violated rules about violence and crime, safety, objectionable content, integrity and intellectual property. +9. I recorded and backed up financial data. I fixed money-related errors. I locked databases to prevent changes from being made by paying or receiving money more than once. Once I received the money, I checked it and recorded it to register its transaction. I corrected the app from crashing and causing money-related errors. +10. I prevented malware. I identified malware as software that someone specifically designed to disrupt, damage, or gain unauthorised access to a computer. I scanned the code to see if it would ring alarms at night. In addition, I checked the code to see if it would delete files. Also, I scanned code to see if it would record or read files unexpectedly. +11. I applied disinfectant (the virus scanner) to the hard disk. I disinfected viruses. 
I scanned the code to see if it copied itself, corrupted the system or destroyed data. I found commands that copied files and stopped them if they did it unnecessarily. I scanned code that accessed files unnecessarily. I checked code that deleted files unnecessarily. +12. I scanned for Trojan horses. A hacker would design a Trojan horse to breach the security of a computer while appearing to do something else. I read the aim of the algorithm. I checked for unrelated scams, etc. I deleted them. +13. I scanned for worms. A worm is a self-copying algorithm that spreads across a network and has an unwanted effect. I scanned the code for a self-replicating program. I also checked whether it moved through a network. I checked whether it affected files and deleted it. +14. I identified errors. Errors were numerical, text or layout errors that caused a malfunction or mistake in a program. I scanned the mathematical formula for errors in representing objects and mathematical properties. I checked the code for spelling or results of pattern-matching errors. I scanned code for user interface or ordering (i.e. giving away an answer) errors. +15. I prevented loading the interpreter twice, with unwanted effects. I detected a missing file error. If an algorithm included a file that wasn't there, the program detected it in testing. Lucian CI/CD built files from the dependencies listed. If the algorithm didn't load it, I tried and tested it. +16. I simplified the predicates, separating functions and reusing code. I checked the dependencies. I checked that algorithms' predicates called predicates that were loaded. I checked that included files and needed files were the same. I sometimes replaced a call with a call to an available predicate that did the same thing or loaded the file and listed its repository as a dependency.
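The kind of static scan described above (flagging code that deletes files, copies itself or spreads over a network before a repository is run) can be sketched as follows. This is a minimal illustrative Python sketch, not the Prolog tooling the chapter refers to; the pattern table and the function name `scan_source` are assumptions for the example, and a real scanner would use far more signatures.

```python
import re

# Hypothetical behaviour signatures; a real scanner would use many more.
SUSPICIOUS_PATTERNS = {
    "deletes files": re.compile(r"\b(os\.remove|shutil\.rmtree|unlink)\b"),
    "copies itself": re.compile(r"\b(shutil\.copy\w*|copy_file)\b"),
    "network spread": re.compile(r"\b(socket\.|urllib|requests\.)"),
}

def scan_source(source: str) -> list[str]:
    """Return the names of suspicious behaviours found in the source text."""
    return [name for name, pattern in SUSPICIOUS_PATTERNS.items()
            if pattern.search(source)]

code = "import shutil\nshutil.rmtree('/tmp/data')\n"
print(scan_source(code))  # ['deletes files']
```

A CI/CD step could refuse to run a repository whenever this returns a non-empty list, matching the "stop running repositories with vulnerabilities" idea.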
+Further notes: fix errors in displaying user interfaces; identify security flaws; prevent displaying secrets (stop code containing IP addresses, passwords, usernames or secret phrases from being uploaded); test with static (white-box) and dynamic (black-box) checkers for holes, such as an app stopping before it is safe, for example in a payment app; record and moderate attempts to commit, only allowing access to CI/CD with two-factor authentication; and explain how to customise a Prolog package manager to work with others' repositories. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/ECONOMICS/Leadership 1.txt b/Lucian-Academy/Books/ECONOMICS/Leadership 1.txt new file mode 100644 index 0000000000000000000000000000000000000000..4dd7cc7dace7887e369b2f2f80a5da2250b02d75 --- /dev/null +++ b/Lucian-Academy/Books/ECONOMICS/Leadership 1.txt @@ -0,0 +1,23 @@ +["Green, L 2022, Leadership 1, Lucian Academy Press, Melbourne.","Green, L 2022",1,"Leadership 1 + +1. I varied my leadership style. +2. I halved the work. +3. I found the contours of work. +4. I back-translated the word to check it had good grammar. +5. My representations were going along a line of 10s. +6. I examined the behaviour of people in my organisation, by thinking of their connection with me. +7. I led, thinking down to nanotubes. +8. I explored arguments to change various things. +9. I explored edges in becoming a leader. +10. I explored the similarities and differences in possessing certain things. +11. I actually considered the implications of a multi-dimensional universe for leadership. +12. I changed the argument in the predicate call and placed it in the options documentation. +13. I counted the edges the workers had in common. +14. I examined the skills needed for a task set. +15. I examined the parts of the skill needed. +16. I thought of similar skills. +17. I invited my friends to participate. +18. I examined the interplay and number of suggestions taken up while the group played a word game. +19.
I knew it was good when they enjoyed it. +20. I particularly recognised and thanked team members. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/FINANCE/Finance 1.txt b/Lucian-Academy/Books/FINANCE/Finance 1.txt new file mode 100644 index 0000000000000000000000000000000000000000..fba2367f288b31ed2873a1e53c3f6e81f5c9274f --- /dev/null +++ b/Lucian-Academy/Books/FINANCE/Finance 1.txt @@ -0,0 +1,19 @@ +["Green, L 2024, Finance 1, Lucian Academy Press, Melbourne.","Green, L 2024",1,"Finance 1 + +1. The financier used funds to pay salaries, an office, equipment and training. To learn how to earn funds for a purpose, I studied finance. During my MBA, I studied a short course in finance. The finance course affirmed my ability to make funds. I invested in stocks and earned money from them. +2. The financier was the donor. I earned the money. I was interested in finance. I approached a possible donor. The donor had funds. +3. The financier practised finance. I got to know the donor. I politely asked how they were. Then, I asked for the money. Each statement had 50 high distinctions, with ten hand-done breasonings. +4. The financier helped people to participate in each other's academies. I found the donor through a contact. They were spiritual people. They had money through their affairs. I aimed to teach in a philosophy academy. +5. The financier stated that the donor only gave a one-off donation. For the rest, I used shares. I bought shares. I sold shares to make a profit, recording it for tax purposes. +6. The financier researched the person's hometown. I visited the place where I would do business a day before. I studied the people's behaviour, appearance and etiquette. I found the best restaurant and the best menu items. I invited the donor to dinner to explain my business proposition. +7. The financier returned to the first time a type statement was tested on retry.
In the State Saving Interpreter (SSI) type checker, the user could retry a type statement on exit or failure. The compiler recorded the current predicate's type statement position and recursive state. This data was deleted after the type returned true or false. The recursive state saved where the point of execution was in the static structure. +8. The financier checked dynamic types online and could turn dynamic type checking on or off with explicit errors. SSI's type checker could be run on the web. Types as a compiler let the user trace through algorithms' type statements one at a time on the web. They could traverse the types state machine, with the ability to creep, skip, retry and use other trace commands such as write (writing complete trace statements) and print (printing abbreviated trace statements) and going up a level. The web required SSI rather than List Prolog Interpreter (LPI) types because LPI types couldn't save the state across multiple web pages. +9. The financier checked that the entire type statement behaved as expected. The SSI type checker allowed creeping through every stage of type checking. On a type statement, the user could creep to open it. Then, they would keep creeping until the type statement returned true or false. When a whole type statement returned true or false, creep returned to the main program's point to trace. +10. The financier debugged part of a type statement. The SSI type checker allowed skipping a type statement's trace. On a type statement, pressing \"s\" for \"skip\" in trace mode skipped over tracing the statement, returning true or false. This action was helpful for quickly checking where an error was. If the user wanted to retry a type statement of note, they could press \"r\" for \"retry\". +11. The financier examined a \"type statement\" more closely. The SSI type checker allowed retrying a type statement's trace. 
Retry was available through the SSI type checker because the type checker could return to a type statement, appearing to rewind a recursive state. It returned to the first instance of the type statement and its recursive state. The position in the data was also returned. +12. The financier expanded true or false results to check them. The SSI type checker allowed writing complete trace statements in a type statement's trace. I pressed \"w\" after a trace statement had been displayed to display its complete data. For example, long data was expanded from \"...\" to its entirety. This technique aided in finding detailed information to check the trace. +13. The financier abbreviated trace results for brevity. The SSI type checker allowed printing abbreviated trace statements in a type statement's trace. I pressed \"p\" after displaying a traced statement to abbreviate the display. This feature helped summarise the results. Or, it quickly prepared tracing results without taking many lines. +14. The financier could retry after going up. The SSI type checker allowed going up a level in trace mode. Pressing \"u\" for \"up\" finished all the statements and exited or failed the current type statement predicate. By repeatedly going up, I exited or failed the entire type statement. This command found the result of the set of non-deterministic type statements. +15. The financier tested more levels of lists in the type checker. I tested two, three and four-level lists in the type checker. A two-level list may be {{.}.}. A three-level list may be {.{{.}.}}. A four-level list may be {{.{{.}.}}.}. +16. The financier tested more levels of brackets in the type checker. I tested two, three and four-level brackets in the type checker. A two-level bracket may be [[.].]. A three-level bracket may be [.[[.].]]. A four-level bracket may be [[.[[.].]].].
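The two-, three- and four-level list tests above amount to checking a term's maximum nesting depth. A minimal sketch of such a depth check, in Python rather than the Prolog the SSI type checker is written in (the function name `nesting_depth` is an assumption for illustration):

```python
def nesting_depth(term) -> int:
    """Depth of the most deeply nested list: a flat list is 1, [[...]] is 2, etc."""
    if not isinstance(term, list):
        return 0  # atoms and numbers contribute no depth
    return 1 + max((nesting_depth(item) for item in term), default=0)

# Two-, three- and four-level lists, mirroring the type checker tests.
print(nesting_depth([1, [2]]))            # 2
print(nesting_depth([1, [[2], 3]]))       # 3
print(nesting_depth([[1, [[2], 3]], 4]))  # 4
```

A type checker could compare this depth against the levels declared in a type statement and fail the check on a mismatch.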
+"] \ No newline at end of file diff --git a/Lucian-Academy/Books/FINANCE/dot-Finance 1 1.txt b/Lucian-Academy/Books/FINANCE/dot-Finance 1 1.txt new file mode 100644 index 0000000000000000000000000000000000000000..dd6d0ecbbe090ae73d363537ea5560cfdbf3c266 --- /dev/null +++ b/Lucian-Academy/Books/FINANCE/dot-Finance 1 1.txt @@ -0,0 +1,16 @@ +financier undertook to replace two specs with one by inserting variables in specs. I replaced 'A1' in strings or other items in Spec to Algorithm specs with an uncommon character to detect constants or variables and substituted 'A1' if it didn't change (was a constant). In this case, the variable acted as a singleton and could be replaced with '_'. To differentiate 'A1' from similar characters in atoms, 'A1' could be represented as [var, 'A1'] for easier identification and substitution. In addition, formulas could be inserted using [formula, _] and upcase or downcase variable names and inert characters in a string according to a condition. +2. The financier marked calls or definitions as deterministic (one result) or non-deterministic (many results), with call modes overriding definition modes to determine whether to loopback or to include loops in the code. I sped up Prolog by separating input and output on compilation. I represented output as output=(a function) or retrieved the output after running the function. Separating input and output sped up code and prepared for the fastest programming language, C. This method implied separating Prolog outputs from predicate headers and the need to collect non-deterministic results as soon as possible. +3. The financier used "Spec to Algorithm" to optimise the algorithm after identifying the necessary input and output parts. Prolog type-checked algorithms only on compilation. It generates types from the code, checking that they fit together or giving warnings to verify the code's type flow. It also generated test data that tested each algorithm feature. 
This test data and predicate specs could be used to optimise the algorithm in address or predicate format. +4. The financier made all predicates inside a deterministic (single result-producing) predicate deterministic. I checked external input types separately in Prolog. External input from the user, file, or APIs was not type-checked at compilation. Read string, read file and shell commands with input had built-in type checking such as type statements or grammars produced by strings to grammar. Alternatively, Spec to Algorithm returned true if a list and item grammar combination matched the list and item. +5. The financier optimised the deterministic predicate. The interpreter did not follow loops if the predicate was deterministic. The predicate mode was deterministic or non-deterministic, which caused the interpreter to delete further iterations of loops after running versions of commands that had no choice points. If a predicate contained findall with choice points, the loop was cut by "find_first" instead of findall. If findall was nested, it was transformed into nested loops. +6. The financier aimed to write a language with Rust's lifetimes and Prolog's user-friendliness that approached the speed of C. Like Mercury Language, Prolog should have no cut. It is slightly stiffer in the way that choice points in predicates are not cut during a predicate. Still, predicates may have one of a disjunction of values as a deterministic or non-deterministic result. Prolog can be converted into a no-cut form, with a determinism of predicates detected or found in parent predicates. Recursion, including intermediate recursive predicates, was represented as loops. +7. The financier saved memory with Rust's lifetimes. Rust's lifetimes found the first and last time a reference was used and deleted it afterwards. This technique was implicitly done in Prolog. 
It was achieved by finding the predicate and line numbers of the reference's last instance and deleting them afterwards using inserted code. This code deleted the last trace of the reference from memory. +8. The financier flattened lists for conversion to C data structures. Prolog compiled to C. I imagined the robots and everything in it. The purported version contained extensive optimisations from subterm with address and the interpreter and used S2A to reduce the complexity of complex terms. It identified terms' addresses and converted them to arrays or structures. +9. The financier kept flattened recursive structures and used array addresses for complex terms in C when converting from Prolog. I converted subterm with address to expanded code in C for optimising Prolog algorithms. I discovered the flattened version of data structures, in which addresses didn't matter, and item numbers did. If a record was a non-constant length, it was put into a table column, or these items were stored in a hash table or other fast data structure. Subterm with address found the row and column of a term, and they could be simplified to get an array address. +10. The financier efficiently compressed data to used or needed data. I converted the interpreter to optimised code. I got the first item and didn't use the non-deterministic predicate member in deterministic predicates. I joined data, referred to as a first set of data, to the first set of data if it was short enough; otherwise, I split the data into manageable segments. I referred to this data, whether it was the algorithm being run, a file or data in the algorithm (moving up the priority of frequently used data) quickly using array addresses. +11. The financier optimised state machines, breadth-first search, sort, keysort and common predicates as Prolog-to-C code. I converted Prolog terms to arrays and structures in C. I converted lists and lists of lists to arrays. Alternatively, I transformed trees into structures. 
I moved data to and from flattened, compressed data structures for fast processing. +12. The financier controlled their experiences in the simulation. Just as a neural network is necessary for faster processing, quantum computers are required for critical future applications. Calculations are completed virtually instantly by subatomic particles and are omitted from being uploaded to classical computers. People from non-computer times could have configured simulations and saved their lives. They use thought commands and choose options to command sub-simulations. +13. The financier wrote Prolog libraries in advanced Prolog for better security and built a computer native in it. As Prolog has no mutable variable states, the Rust concept of ownership is irrelevant. In addition, Prolog variables are not "owned" in the same sense as Rust variables. Prolog employs variables with lifetimes similar to Rust. Prolog's concurrency system does not provide the same level of safety as Rust's system, which prevents data races and other concurrency issues at compile time; this should be improved. +14. The financier used types to remove and produce errors on specific code. I removed unnecessary Prolog commands using S2A. I removed obsolete branches, commands and variables, including unused parts of code that don't contribute to the output. I used S2A to find the pattern-matching component of code, merging predicates and using the existing non-pattern-matching code. I used grammars and guess-substitution to reduce list and string lengths when finding recursive structures. +15. The financier had deterministic and non-deterministic versions of certain predicates. I changed Prolog code loops according to whether the predicate was deterministic or non-deterministic. I changed member to get the first item and made similar changes to append and string_concat in deterministic predicates. I replaced append with list processing A=[B|C].
In addition, I replaced string_concat with a deterministic variant of string_concat. +16. The financier provided samples or shortcut commands to produce interpreters. I wrote a compiler programming language in Prolog that created an efficient compiler written in Advanced Prolog. This version of Prolog had Rust's concurrency system, and type-checking was conducted on compilation. This programming language allowed specifying specs for the interpreter or compiler and options, including website backtracking. SSI was almost fast, without processing a long list of data that could be saved to disk. \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green 50 Breasonings Per Utterance 1 of 4.txt b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green 50 Breasonings Per Utterance 1 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..5959fe7adb72d34a968fb3073dc13c59c7463491 --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green 50 Breasonings Per Utterance 1 of 4.txt @@ -0,0 +1,43 @@ +["Green, L 2021, 50 Breasonings Per Utterance 1 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF MEDITATION +by Lucian Green +50 Breasonings Per Utterance 1 of 4 + + includes question and answer categories that represent the type of content that repeating the meditation utterance indicates. The rate reflects speed of human thought, and feels relaxing when one is higher in meditation. + +Caution: Say the mantra 'Lucian' 80 times and the sutra 'Green' 80 times before the following to protect oneself. + +It is necessary to know how to meditate using 50 A's per utterance to protect oneself on days when one is recorded for broadcast or publishing.
This technology is built into TM and other Krishna-derived meditation, but actors and others must be with it over it to get jobs. + +To activate the technology: + +1aa. Rehearse 5 breasonings (sets of X, Y and Z dimensions), without saying 'metres' from the sets of breasonings chapters (e.g. the one at the end of this section). + +Note: you can write them down and read them in step 2. + +2. Say them in 5 seconds, using e.g. the iPhone's stopwatch. Repeat until this is successful. + +3. Do this 50 times. + +Also, athletes will win races, famous academics will help students with the help of this technology, which is with the help of a master. Also, non-Lucianic-like meditators should program the technology into their meditation (once to have the effect for the rest of their lives) to meditate with these effects soundly. + +1a. I prepared to become a leader. I did this by writing the Press Release for Michel Onfray's probable comments on the Meditation essays. First, I found the 3% mark. Second, I found me instead. Third, I quickly took off. In this way, I prepared to become a leader. I did this by writing the Press Release for Michel Onfray's probable comments on the Meditation essays. + +2. I prepared to make the distinction between economics and medicine. I did this by writing Alexius Meinong's probable comments on the Meditation essays. First, I loved Adrian. Second, I helped him up. Third, I wrote Economics for him. In this way, I prepared to make the distinction between economics and medicine by writing Alexius Meinong's probable comments on the Meditation essays. + +3. I prepared to be healthy. I did this by writing the Press Release for Alexius Meinong's probable comments on the Meditation essays. First, I treated the ex-Head of State psychiatrically. Second, I avoided the ducklings' claws. Third, I protected them. In this way, I prepared to be healthy by writing the Press Release for Alexius Meinong's probable comments on the Meditation essays. + +4. 
I prepared to get on with the job. I did this by writing Martha Nussbaum's probable comments on the Meditation essays. First, I loved Christianity. Second, I loved the horse. Third, I loved Martha Nussbaum. In this way, I prepared to get on with the job by writing Martha Nussbaum's probable comments on the Meditation essays. + +5. I prepared to correct the person of a higher rank. I did this by writing the Press Release for Martha Nussbaum's probable comments on the Meditation essays. First, I detected Martha Nussbaum. Second, I wrote I liked you. Third, I helped you up. In this way, I prepared to correct the person of a higher rank by writing the Press Release for Martha Nussbaum's probable comments on the Meditation essays. + +6. I prepared to enable longevities. I did this by writing Noam Chomsky's probable comments on the Meditation blog. First, I inserted the repeated text with autocorrect. Second, I wondered what the body did. Third, I helped the ex-philosophy tutor psychiatrically. In this way, I prepared to enable longevities by writing Noam Chomsky's probable comments on the Meditation blog. + +7. I prepared to keep music. I did this by writing the Press Release for Noam Chomsky's probable comments on the Meditation blog. First, I noticed the big gay celibate person rolling around. Second, I saw the miracles. Third, I walked on stage on time. In this way, I prepared to keep music by writing the Press Release for Noam Chomsky's probable comments on the Meditation blog. + +8. I prepared to change the religious terms to philosophical terms after writing the breasoning chapter. I did this by writing Richard Rorty's probable comments on the Meditation blog. First, I maintained pedagogical finesse in school. Second, I became a vice captain in Cross Country running. Third, I observed certain old boys follow my meditation (philosophy) blog after year 12.
In this way, I prepared to change the religious terms to philosophical terms after writing the breasoning chapter by writing Richard Rorty's probable comments on the Meditation blog. + +9. I prepared to love Beatrice Potter as well. I did this by writing the Press Release for Richard Rorty's probable comments on the Meditation blog. First, I prepared to be a few seconds away from my friends. Second, I was the Big Man. Third, I endorsed some local products. In this way, I prepared to love Beatrice Potter as well by writing the Press Release for Richard Rorty's probable comments on the Meditation blog. + +10. I prepared to decipher him. I did this by writing Richard Dawkins' probable comments on the Meditation blog. First, I needed you. Second, I had you. Third, I blessed you (watch you sneeze). In this way, I prepared to decipher him by writing Richard Dawkins' probable comments on the Meditation blog. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green 50 Breasonings Per Utterance 2 of 4.txt b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green 50 Breasonings Per Utterance 2 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..0c4e39036afbc1766f5c810213107ac1e618fa49 --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green 50 Breasonings Per Utterance 2 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, 50 Breasonings Per Utterance 2 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF MEDITATION +by Lucian Green +50 Breasonings Per Utterance 2 of 4 + +11. I prepared to love mummy very much. I did this by writing the Press Release for Richard Dawkins' probable comments on the Meditation blog. First, I observed that pop stars were protected. 
Second, I observed the actors were protected. Third, I sold the Irish wigwams. In this way, I prepared to love mummy very much by writing the Press Release for Richard Dawkins' probable comments on the Meditation blog. + +12. I prepared to ask what 50 breasonings per second were. I did this by writing Michel Onfray's probable comments on the Meditation blog. First, I loved the Kings. Second, I adored French fries. Third, I ate them. In this way, I prepared to ask what 50 breasonings per second were by writing Michel Onfray's probable comments on the Meditation blog. + +13. I prepared to read the vertical screen. I did this by writing the Press Release for Michel Onfray's probable comments on the Meditation blog. First, I lay sexily in bed writing all day. Second, I went to University. Third, I enjoyed the air-conditioned comfort. In this way, I prepared to read the vertical screen by writing the Press Release for Michel Onfray's probable comments on the Meditation blog. + +14. I prepared to love Xochi. I did this by writing Alexius Meinong's probable comments on the Meditation blog. First, I noticed the duck attending to Adrian. Second, I took care of the duck. Third, I noticed it was insentient. In this way, I prepared to love Xochi by writing Alexius Meinong's probable comments on the Meditation blog. + +15. I prepared to say it would be all right for Xochi whatever happens. I did this by writing the Press Release for Alexius Meinong's probable comments on the Meditation blog. First, I loved Teo. Second, I loved Antonia. Third, I loved Fernando. In this way, I prepared to say it would be all right for Xochi whatever happens by writing the Press Release for Alexius Meinong's probable comments on the Meditation blog. + +16. I prepared to avoid work. I did this by writing Martha Nussbaum's probable comments on the Meditation blog. First, I was in an office. Second, I was on the line. Third, I completed work. 
In this way, I prepared to avoid work by writing Martha Nussbaum's probable comments on the Meditation blog. + +17. I prepared to assess Honours in 10 departments. I did this by writing the Press Release for Martha Nussbaum's probable comments on the Meditation blog. First, I wrote a new 'A' for each piece of assessment with a different date. Second, I wrote a new 'A' for each piece of assessment which was part of a different chapter of an assignment. Third, I gave the students everything they needed to earn H1. In this way, I prepared to assess Honours in 10 departments by writing the Press Release for Martha Nussbaum's probable comments on the Meditation blog. + +18. I prepared to etch into gravel. I did this by writing Noam Chomsky's probable comments on the Meditation indicators. First, I continued writing for Masters and PhD programs. Second, I determined that the people must write on it (in tenure and research). Third, I gave the nice thoughts to the students as well as to the professors. In this way, I prepared to etch into gravel by writing Noam Chomsky's probable comments on the Meditation indicators. + +19. I prepared to recommend meditation to Mark R. as well. I did this by writing the Press Release for Noam Chomsky's probable comments on the Meditation indicators. First, I found a meditation University program for Mark C. Second, I found a medicine University program for Mark C. Third, I loved bis. In this way, I prepared to recommend meditation to Mark R. as well by writing the Press Release for Noam Chomsky's probable comments on the Meditation indicators. + +20. I prepared to be like Plato, in that I preferred not to participate in University philosophy. I did this by writing Richard Rorty's probable comments on the Meditation indicators. First, I wrote I distanced myself from it. Second, I said I was not it. Third, I knew it. 
In this way, I prepared to be like Plato, in that I preferred not to participate in University philosophy by writing Richard Rorty's probable comments on the Meditation indicators. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green 50 Breasonings Per Utterance 3 of 4.txt b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green 50 Breasonings Per Utterance 3 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..8f69d4cc70e512fbc3bd3e91c935dc24f3ac7787 --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green 50 Breasonings Per Utterance 3 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, 50 Breasonings Per Utterance 3 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF MEDITATION +by Lucian Green +50 Breasonings Per Utterance 3 of 4 + +21. I prepared to examine each breasoning. I did this by writing the Press Release for Richard Rorty's probable comments on the Meditation indicators. First, I participated in University short courses continuously. Second, I wrote on the high quality Academy. Third, I wrote 50 As on each comment in my philosophy. In this way, I prepared to examine each breasoning by writing the Press Release for Richard Rorty's probable comments on the Meditation indicators. + +22. I prepared to display the genre. I did this by writing Richard Dawkins' probable comments on the Meditation indicators. First, I noticed that there were hardly any other breasoners. Second, I breathed heavily. Third, I had discourse with a man. In this way, I prepared to display the genre by writing Richard Dawkins' probable comments on the Meditation indicators. + +23. I prepared to become a favourite police officer. 
I did this by writing the Press Release for Richard Dawkins' probable comments on the Meditation indicators. First, I deserved it for the number of As. Second, I took care of the thought. Third, I noticed the favourite English teacher had 30 As. In this way, I prepared to become a favourite police officer by writing the Press Release for Richard Dawkins' probable comments on the Meditation indicators. + +24. I prepared to eat daisies. I did this by writing Michel Onfray's probable comments on the Meditation indicators. First, I thought of the radio button's dimensions. Second, I waved the flag symbolising thinking of the last comment. Third, I apologised to the Head of State before the mistake. In this way, I prepared to eat daisies by writing Michel Onfray's probable comments on the Meditation indicators. + +25. I prepared to go to sleep. I did this by writing the Press Release for Michel Onfray's probable comments on the Meditation indicators. First, I wrote the job description for Academy writers. Second, I paid the writers. Third, I made them sign a legal waiver form. In this way, I prepared to go to sleep by writing the Press Release for Michel Onfray's probable comments on the Meditation indicators. + +26. I prepared to love the snake. I did this by writing Alexius Meinong's probable comments on the Meditation indicators. First, I noticed I was going much better than it. Second, I gave David the vegan cheese. Third, I prevented the snake from coming inside. In this way, I prepared to love the snake by writing Alexius Meinong's probable comments on the Meditation indicators. + +27. I prepared to wave goodbye. I did this by writing the Press Release for Alexius Meinong's probable comments on the Meditation indicators. First, I noticed Computational English had a primordial feeling. Second, I recorded Lulu Iglesias, introducing me in a primordial setting. Third, I recorded the Computational English song. 
In this way, I prepared to wave goodbye by writing the Press Release for Alexius Meinong's probable comments on the Meditation indicators. + +28. I prepared to bestow professorships. I did this by writing Martha Nussbaum's probable comments on the Meditation indicators. First, I found the police person. Second, I said I wouldn't have been booked. Third, I found that she was alive and kicking. In this way, I prepared to bestow professorships by writing Martha Nussbaum's probable comments on the Meditation indicators. + +29. I prepared to filter reality. I did this by writing the Press Release for Martha Nussbaum's probable comments on the Meditation indicators. First, I desiccated the coconuts. Second, I made a cake. Third, I fed it to the grub. In this way, I prepared to filter reality by writing the Press Release for Martha Nussbaum's probable comments on the Meditation indicators. + +30. I prepared to love papa (the Freemason). I did this by writing Noam Chomsky's probable comments on Meditation on Lucianpedia. First, I made a vegan rissole. Second, I added plum sauce. Third, I added salt and pepper. In this way, I prepared to love papa (the Freemason) by writing Noam Chomsky's probable comments on Meditation on Lucianpedia. 
+ +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green 50 Breasonings Per Utterance 4 of 4.txt b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green 50 Breasonings Per Utterance 4 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..45955a01052016de816ac1193d93a05bccce2381 --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green 50 Breasonings Per Utterance 4 of 4.txt @@ -0,0 +1,30 @@ +["Green, L 2021, 50 Breasonings Per Utterance 4 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF MEDITATION +by Lucian Green +50 Breasonings Per Utterance 4 of 4 + +31. I prepared to love myself. I did this by writing the Press Release for Noam Chomsky's probable comments on Meditation on Lucianpedia. First, I loved papa smurf. Second, I loved the minions. Third, I loved King Arthur. In this way, I prepared to love myself by writing the Press Release for Noam Chomsky's probable comments on Meditation on Lucianpedia. + +32. I prepared to state that the writer had been dead for 70 years. I did this by writing Richard Rorty's probable comments on Meditation on Lucianpedia. First, I loved the Lady of the Lake. Second, I loved Excalibur. Third, I loved public domain knowledge. In this way, I prepared to state that the writer had been dead for 70 years by writing Richard Rorty's probable comments on Meditation on Lucianpedia. + +33. I prepared to be sentient. I did this by writing the Press Release for Richard Rorty's probable comments on Meditation on Lucianpedia. First, I wanted Angela. Second, I found her in a computer lab. Third, I knew she had performed well in Honours. 
In this way, I prepared to be sentient by writing the Press Release for Richard Rorty's probable comments on Meditation on Lucianpedia. + +34. I prepared to write plays. I did this by writing Richard Dawkins' probable comments on Meditation on Lucianpedia. First, I calculated what fraction of 1 I was to the goal. Second, I went to sleep. Third, I was like Shakespeare. In this way, I prepared to write plays by writing Richard Dawkins' probable comments on Meditation on Lucianpedia. + +35. I prepared to state I am mainly fit. I did this by writing the Press Release for Richard Dawkins' probable comments on Meditation on Lucianpedia. First, I celebrated Absurdia. Second, I synthesised the comment. Third, I thought of seeing the chiropractor. In this way, I prepared to state I am mainly fit by writing the Press Release for Richard Dawkins' probable comments on Meditation on Lucianpedia. + +36. I prepared to say everyone loves me. I did this by writing Michel Onfray's probable comments on Meditation on Lucianpedia. First, I thought of the heart tart. Second, I included meditation. Third, I concentrated on them. In this way, I prepared to say everyone loves me by writing Michel Onfray's probable comments on Meditation on Lucianpedia. + +37. I prepared to work for myself. I did this by writing the Press Release for Michel Onfray's probable comments on Meditation on Lucianpedia. First, I discussed it in front of Adrian. Second, I passed it with the authorities. Third, I accredited it. In this way, I prepared to work for myself by writing the Press Release for Michel Onfray's probable comments on Meditation on Lucianpedia. + +38. I prepared to touch fame. I did this by writing Alexius Meinong's probable comments on Meditation on Lucianpedia. First, I studied what was helpful. Second, I made it. Third, I knew the character. In this way, I prepared to touch fame by writing Alexius Meinong's probable comments on Meditation on Lucianpedia. + +39. 
I prepared to meet regularly in a formal environment. I did this by writing the Press Release for Alexius Meinong's probable comments on Meditation on Lucianpedia. First, I did my job. Second, I researched locations. Third, I selected the best location. In this way, I prepared to meet regularly in a formal environment by writing the Press Release for Alexius Meinong's probable comments on Meditation on Lucianpedia. + +40. I prepared to delect on capsicum. I did this by writing Martha Nussbaum's probable comments on Meditation on Lucianpedia. First, I wrote I led the good life. Second, I read that it was to get me going. Third, I worked out I was better. In this way, I prepared to delect on capsicum by writing Martha Nussbaum's probable comments on Meditation on Lucianpedia. + +41. I prepared to let the breath drop in. I did this by writing the Press Release for Martha Nussbaum's probable comments on Meditation on Lucianpedia. First, I exhaled slowly. Second, I lifted my arms up. Third, I yawned. In this way, I prepared to let the breath drop in by writing the Press Release for Martha Nussbaum's probable comments on Meditation on Lucianpedia. + +42. I prepared to realise true fame. I did this by writing Noam Chomsky's probable comments on the Computational English essays. First, I met the Queen. Second, I was famous on the radio. Third, I met the Prime Minister. In this way, I prepared to realise true fame by writing Noam Chomsky's probable comments on the Computational English essays. 
+ + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Appearances 1 of 4.txt b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Appearances 1 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..43218c6d8586121e0d9f23c191d6009ad1c79f55 --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Appearances 1 of 4.txt @@ -0,0 +1,29 @@ +["Green, L 2021, Appearances 1 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF MEDITATION +by Lucian Green +Appearances 1 of 4 + + +1. When given the benefits of this argument, the meditator reports that mental imagery is clear and colourful. There have been reports of 'seeing' the unseen, i.e. states of people around the meditator and even physical states of matter. + + +1a. I prepared to emulate my hero. I did this by liking my hero. First, I chose a field. Second, I named a hero. Third, I liked her. In this way, I prepared to emulate my hero by liking my hero. + +2. I prepared to pour the coffee. I did this by having fun. First, I invented the appearance. Second, I had fun. Third, I measured its effectiveness with a metric. In this way, I prepared to pour the coffee by having fun. + +3. I prepared to espouse critical positivity in schools. I did this by noting that year 4 and above in primary school had 50 As per assignment, with 1 A assessable. First, I found the result of studying the undergraduate model. Second, I made each one a philosopher. Third, I made each one a philosophical playwright. In this way, I prepared to espouse critical positivity in schools by noting that year 4 and above in primary school had 50 As per assignment, with 1 A assessable. + +4. I prepared to transform my breasoning chapter. 
I did this by paying for editing of my breasoning chapter. First, I asked for uniformity of philosophical terms. Second, I asked for explanations of their ideas in terms of other sections. Third, I asked for secondary texts drawing connections between the texts. In this way, I prepared to transform my breasoning chapter by paying for editing of my breasoning chapter. + +5. I prepared to state that the algorithm lines had perspectives because they were written at first. I did this by paying for essays about my breasoning chapters to be written. First, I simulated the court's king's comments. Second, I simulated the jester's comments. Third, I simulated Maid Marian's comments. In this way, I prepared to state that the algorithm lines had perspectives because they were written at first by paying for essays about my breasoning chapters to be written. + +6. I prepared to assess the ideas, which had all been explained. I did this by observing God (the master) booming the computational marking scheme essay questions. First, I verified whether the breasoning was valid, positive and vegan. Second, I transcended paths to become God (the master). Third, I verified that the paraphrased answer was cited. In this way, I prepared to assess the ideas, which had all been explained by observing God (the master) booming the computational marking scheme essay questions. + +7. I prepared to write on famous subjects. I did this by preparing to write essays. First, I collected comments from the famous subjects and their circle from the perspective of the famous university. Second, I wondered whether Proust agreed with Camus to disagree with capital punishment. Third, I confirmed that this was true. In this way, I prepared to write on famous subjects by preparing to write essays. + +8. I prepared to record the answers to prevent plagiarism. I did this by observing the student answering the questions. First, I observed the student understand the topic. 
Second, I observed the student understand the reason. Third, I observed the student connect new parts of the reason to each other. In this way, I prepared to record the answers to prevent plagiarism by observing the student answering the questions. + +9. I prepared to observe students collect their thoughts. I did this by observing the students writing a daily journal. First, I observed them write about their experiences. Second, I observed them write about their feelings. Third, I observed them write about playing with other students. In this way, I prepared to observe students collect their thoughts by observing the students writing a daily journal. + +10. I prepared to observe the students lie on the ground. I did this by observing the students practise art or music in the morning. First, I observed the students breason out an argument. Second, I observed them create compositions. Third, I observed them present them. In this way, I prepared to observe the students lie on the ground by observing the students practise art or music in the morning. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Appearances 2 of 4.txt b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Appearances 2 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..a37845685b60a156702da37381eb9f1cbf4d40c2 --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Appearances 2 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Appearances 2 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF MEDITATION +by Lucian Green +Appearances 2 of 4 + +11. I prepared to give the students all ideas for their assignments. I did this by observing the students and children breason out 2 As per day. 
First, I observed them being given 160 breasonings per day. Second, I observed them breason out the X, Y and Z dimensions of the object from each breasoning. Third, I assessed 50 As per 5 assignments per semester. In this way, I prepared to give the students all ideas for their assignments by observing the students and children breason out 2 As per day. + +12. I prepared to prepare students for academia or a job. I did this by observing the students write and breason out one breasoning chapter and essay per week. First, I observed the students write the breasoning chapter by collecting 9 breasoning algorithm ideas during their morning walk, expanding each of these to 6 sentences and breasoning them out after lunch. Second, I observed them write an essay by organising the structure of breasonings (called reasons from now on in this aphor) in one (of five) paragraphs on the first day; I observed the students organising the structure of paragraphs and their links to the side of the contention chosen on the first day; and I observed the students agreeing with, objecting to, rebutting and connecting each of the reasons from one paragraph per day. Third, I observed the teacher set the rest to finish for homework. In this way, I prepared to prepare students for academia or a job by observing the students write and breason out one breasoning chapter and essay per week. + +13. I prepared to observe the students write philosophy algorithms. I did this by observing the students write breasoning algorithm ideas for 40 minutes before the first period. First, I observed them being given pedagogy training at the start of their school careers. Second, I observed them perform an undeveloped mindmap. Third, I observed them match words from the pedagogy screen with ideas from their undeveloped mindmap. In this way, I prepared to observe the students write philosophy algorithms by observing the students write breasoning algorithm ideas for 40 minutes before the first period. + +14. 
I prepared to assign the 100-point computational speed and accuracy assignment per student. I did this by substituting languages, physical education, etc. for arts on particular days. First, I educated students in LDMG idiom (place, object, subject, time) language. Second, I allowed students to assess devised studies individually or in groups. Third, I observed students educate themselves in algorithm drama. In this way, I prepared to assign the 100-point computational speed and accuracy assignment per student by substituting languages, physical education, etc. for arts on particular days. + +15. I prepared to keep the students' treasures online. I did this by observing the students plan their works at the start of year 4. First, I observed them choose and cross-multiply from the philosophy column in their journals. Second, I observed them collect from life experiences. Third, I observed them collect from areas of study. In this way, I prepared to keep the students' treasures online by observing the students plan their works at the start of year 4. + +16. I prepared to observe how synthesising breasonings, reasons and rebutted criticality in relation to an algorithm formed the Lucianic Computational School. I did this by observing the students read and examine magazines and visualise the pedagogy screen to write breasonings. First, I observed the students gain ideas from the magazines. Second, I observed them write down breasonings from the screen that reminded them of these ideas. Third, I observed them synthesise the breasonings with reasons about an argument based on an algorithm empirically proving the idea. In this way, I prepared to observe how synthesising breasonings, reasons and rebutted criticality about an algorithm formed the Lucianic Computational School by observing the students read and examine magazines and visualise the pedagogy screen to write breasonings. + +17. I prepared to observe the students write on their own and other ideas. 
I did this by observing the students before year 4 write an essay on each of 50 essays about Computational English (all ideas), Popology, Societology and Physics. First, I wrote illustrated versions of the essays appropriate for children from these year levels. Second, I encouraged intergenre (sic) exploration of the ideas. Third, I observed them understand the robotic mind through an English lens. In this way, I prepared to observe the students write on their own and other ideas by observing the students before year 4 write an essay on each of 50 essays about Computational English (all ideas), Popology, Societology and Physics. + +18. I prepared to protect the students with medical background knowledge. I did this by setting meditation, medicine and pedagogy short courses before year 4. First, I observed the students practise walking meditation twice per day to kick back. Second, I observed the students prevent headaches, muscular aches, spiritual mistakes and effects of too many breasonings through nut and bolt, quantum box and prayer (algorithm) medical knowledge and use spiritual anti-hallucinogenic medication so that they use the courseware with no qualms. Third, I observed them use pedagogical knowledge to ensure academic success. In this way, I prepared to protect the students with medical background knowledge by setting meditation, medicine and pedagogy short courses before year 4. + +19. I prepared to do the same for critical thinking, after attending Logic Summer School at the nation's top University. I did this by setting creative Prolog assignments. First, I assigned part-research in a 5-source literature review. Second, I assigned questions about the required predicates. Third, I assigned hypotheses using ideas like those from the 5 (University professor), 10 (University non-professor), or 50 (non-University non-professor) 80-breasoning As each about a different program. 
In this way, I prepared to do the same for critical thinking by setting creative Prolog assignments. + +20. I prepared to select the pedagogy school entrants. I did this by observing the students rebreason out 50 As to become a pedagogue. First, I asked teachers to write 5 (University professor), 10 (University non-professor), or 50 (non-University non-professor) 80-breasoning As per student about pedagogy. Second, I observed the future professor write 2 books. Third, I observed the successful professor applicant breason out 50 specific As to become a professor. In this way, I prepared to select the pedagogy school entrants by observing the students rebreason out 50 As to become a pedagogue. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Appearances 3 of 4.txt b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Appearances 3 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..dd4e11d9b4f7635ca0dca79ad583978cbaba82f0 --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Appearances 3 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Appearances 3 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF MEDITATION +by Lucian Green +Appearances 3 of 4 + +21. I prepared to become a pedagogue by producing 250-breasoning pop songs which contain high quality imagery and which are expressed as 50 As. I did this by liking the pop star character. First, I was given 50 As in pop stardom before birth. Second, I was given 50 As in pedagogy before birth. Third, I was given 50 As in Professor, Prolog, Critical Thinking, Meditation and other developed areas of study before birth. 
In this way, I prepared to become a pedagogue by producing 250-breasoning pop songs which contain high quality imagery and which are expressed as 50 As by liking the pop star character. + +22. I prepared to dispense with birthright myths and make 1/4 pedagogues. I did this by liking schools. First, I opened the most prestigious school in the Universe. Second, I noticed the students were royalty. Third, I opened a pop school too. In this way, I prepared to dispense with birthright myths and make 1/4 pedagogues by liking schools. + +23. I prepared to be non-invasive. I did this by verifying the contents of the vesicle apparatus. First, I breasoned out all my sets of 50 As in the six months prior to conception. Second, I massaged my sweaty neck to help return it to normal. Third, I helped make scientific discoveries. In this way, I prepared to be non-invasive by verifying the contents of the vesicle apparatus. + +24. I prepared to help disabled people. I did this by liking the principal. First, I determined that the principal was a pedagogue. Second, I determined that the principal was smart. Third, I determined that the principal helped me to my goals. In this way, I prepared to help disabled people by liking the principal. + +25. I prepared to help you too. I did this by liking the teacher. First, I determined that the teacher gave me my thoughts. Second, I determined that the teacher helped me to my goals. Third, I worked out University turned people into teachers. In this way, I prepared to help you too by liking the teacher. + +26. I prepared to help the students again. I did this by liking the teacher aides. First, I determined that they were smart. Second, I determined that they were helpful. Third, I determined that they were useful. In this way, I prepared to help the students again by liking the teacher aides. + +27. I prepared to be inspired by and inspire pop. I did this by writing down a thought from the song. 
First, I wrote down a thought from the song. Second, I became a philosopher. Third, I was inspired by and inspired philosophy. In this way, I prepared to be inspired by and inspire pop by writing down a thought from the song. + +28. I prepared to write 30 (non-Professor) or 60 (Professor) pedagogy arguments to help me to deserve high grades in Masters or PhD assignment chapters. I did this by writing 2 As per day. First, I wrote the breasoning algorithm descriptions. Second, I expanded these into a 2 As-length breasoning chapter. Third, I breasoned it out during an assignment. In this way, I prepared to write 30 (non-Professor) or 60 (Professor) pedagogy arguments to help me to deserve high grades in Masters or PhD assignment chapters by writing 2 As per day. + +29. I prepared to indicate the 250 breasoning pop song inspired by the top song in Cosmology, and dotted on the area of study points to remember, to give the pedagogy student 50 specific As to become a pedagogue. I did this by writing the Breathsonings Essay Press Release. First, I waited 70 years until after the secondary text writer, the music producer and text-to-movie software writer died to include their products in my catalogue. Second, I wrote that I, Lucian Green, wrote the Breathsonings Essay; that it was about my theory of human judgment of objects in a pedagogical sense (i.e. knowledge of it was required to deserve to earn H1); that I wrote it in 2013, after taking a break from Science at Monash and discovering my famous essay format, in the South Yarra/Toorak Stonnington Library in Melbourne, Australia; and that I chose the topic from meditation to help answer the question of how to earn H1 at Melbourne University and wrote it after meditating on the pedagogy ways of thinking which I used to write the breasoning chapter on which the essay is based. Third, I offered the essay to my students to grapple with. 
In this way, I prepared to indicate the 250 breasoning pop song inspired by the top song in Cosmology, and dotted on the area of study points to remember, to give the pedagogy student 50 specific As to become a pedagogue by writing the Breathsonings Essay Press Release. + +30. I prepared to maintain system dynamics. I did this by writing the Rebreathsonings Essay Press Release. First, I wrote that the subject was a human judge of a verb. Second, I wrote that the essay was written to identify non-monotonicities (exceptions) in verb judgments, e.g. going more quickly to reach the goal in time. Third, I wrote this by explaining that choosing the correct judgment of verbs led to maintaining verb judgment correctness. In this way, I prepared to maintain system dynamics by writing the Rebreathsonings Essay Press Release. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Appearances 4 of 4.txt b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Appearances 4 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..50eb4be5d18a5fb1190fac6a7dc7bf773bfdfc88 --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Appearances 4 of 4.txt @@ -0,0 +1,31 @@ +["Green, L 2021, Appearances 4 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF MEDITATION +by Lucian Green +Appearances 4 of 4 + +31. I prepared to enter the room in the heartland. I did this by writing the Room Essay Press Release. First, I wrote that 250 breasonings expanded to 50 As. Second, I wrote that a breasoned out pop song expanded to 50 As. Third, I wrote that the classical music composition contained 5 pop songs. In this way, I prepared to enter the room in the heartland by writing the Room Essay Press Release. + +32. 
I prepared to choose the necessary part of the room. I did this by writing the Part of Room Essay Press Release. First, I wrote the chapter in the tram shelter on the corner near my house. Second, I wrote the chapter on the tram. Third, I wrote the chapter at the train station. In this way, I prepared to choose the necessary part of the room by writing the Part of Room Essay Press Release. + +33. I prepared to find the general interpretation of the direction essay. I did this by writing the Direction Essay Press Release. First, I observed the saint prevent suffering in meditators (graduates). Second, I observed the social network user block all the members of the particular industry theatre studies company that blamed despite training. Third, I observed the social network user avoid them again. In this way, I prepared to find the general interpretation of the direction essay by writing the Direction Essay Press Release. + +34. I prepared to compute the breathsoning in the centre. I did this by writing the Breathsonings Essay Secondary text. First, I observed the master verify the human in the human judgment of the noun. Second, I observed the master verify the judgment in the human judgment of the noun. Third, I observed the master verify the noun in the human judgment of the noun. In this way, I prepared to compute the breathsoning in the centre by writing the Breathsonings Essay Secondary text. + +35. I prepared to quickly (a rebreathsoning) complete the maze. I did this by writing the Rebreathsonings Essay Secondary text. First, I programmed the DoubleMaze science quiz. Second, I wrote questions in biology, chemistry, physics. Third, I programmed a password to change the maze runs to have unlimited time and enter and change the hall of fame of reaction times. In this way, I prepared to quickly (a rebreathsoning) complete the maze by writing the Rebreathsonings Essay Secondary text. + +36. 
I prepared to observe that the master had access to their room in control of the education consortium. I did this by writing the Room Essay Secondary text. First, I wrote chapters the length of an Honours thesis every 10 days in the Master by Coursework preparation. Second, I made the distinction to assess the breasoning component of these chapters in my Master students. Third, I observed that this allowed them to work on vibrant essays, while working on the Master. In this way, I prepared to observe that the master had access to their room in control of the education consortium by writing the Room Essay Secondary text. + +37. I prepared to observe the master make space for parts of the room. I did this by writing the Part of Room Essay Secondary text. First, I observed the forest tree fire. Second, I observed the wasteland. Third, I ran the computer simulation of this. In this way, I prepared to observe the master make space for parts of the room by writing the Part of Room Essay Secondary text. + +38. I prepared to observe the master state that homosexuality was the correct direction for some people. I did this by writing the Direction Essay Secondary text. First, I observed the number of religion members who agreed with homosexuality. Second, I observed the number of religion members who agreed with homosexuality increased after the first period of time. Third, I observed the number of religion members who agreed with homosexuality increased after the second period of time. In this way, I prepared to observe the master state that homosexuality was the correct direction for some people by writing the Direction Essay Secondary text. + +39. I prepared to read the announcement of the secondary text about human judgment of the noun. I did this by writing the Breathsonings Essay Secondary text Press Release. First, I stated that the human judge of the noun instructed the master. 
Second, I observed the master observe the safety guidelines associated with making a human judgment of the noun. Third, I observed the master observe that this was by eating fruit soon after picking it. In this way, I prepared to read the announcement of the secondary text about human judgment of the noun by writing the Breathsonings Essay Secondary text Press Release. + +40. I prepared to write the Rebreathsonings Essay Press Release Secondary text (verify what will come). I did this by writing the Rebreathsonings Essay Secondary text Press Release. First, I observed the master read how teachers were acting as human judges of verbs. Second, I observed the result that the master verified the language. Third, I observed the master choose the descriptive, rather than the simple judgment of the verb. In this way, I prepared to write the Rebreathsonings Essay Press Release Secondary text (verify what will come) by writing the Rebreathsonings Essay Secondary text Press Release. + +41. I prepared to file the police report. I did this by writing the Room Essay Secondary text Press Release. First, I observed who the master was who identified the room. Second, I asked why the room identified the master. Third, I stated that poetry is how. In this way, I prepared to file the police report by writing the Room Essay Secondary text Press Release. + +42. I prepared to announce God (the master) to the part of the room. I did this by writing the Part of Room Essay Secondary text Press Release. First, I cut off the appearances. Second, I observed the reason the master named the part of the room. Third, I observed the way the master named the part of the room. In this way, I prepared to announce God (the master) to the part of the room by writing the Part of Room Essay Secondary text Press Release. + +Pranayama + +1. 
"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Blue Nature 1 of 4.txt b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Blue Nature 1 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..9a15e755e7f07c4cd6c06abe09a547f5536d9a56 --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Blue Nature 1 of 4.txt @@ -0,0 +1,29 @@ +["Green, L 2021, Blue Nature 1 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF MEDITATION +by Lucian Green +Blue Nature 1 of 4 + + +1. Nature, mainly the blue ocean, brings us joy in the form of gifts coming to us as meditators. Vegetarianism is encouraged, and other gifts may include better health and happiness. + + +1a. I prepared to come to a person like a card marching to a person as a breasoning in an A grade argument. I did this by coming to people. First, I wrote where the people were. Second, I wrote the route from my starting point to my destination. Third, I travelled along this route. In this way, I prepared to come to a person like a card marching to a person as a breasoning in an A grade argument by coming to people. + +2. I prepared to write an intelligent argument. I did this by writing the high quality developed topic in music. First, I wrote the high quality developed topic. Second, I wrote the song title. Third, I wrote the music. In this way, I prepared to write an intelligent argument by writing the high quality developed topic in music. + +3. I prepared to have fun. I did this by mimicking the other. First, I breathed the other's breath. Second, I smelt the other's fragrance. Third, I tasted the other's sweat. In this way, I prepared to have fun by mimicking the other. + +4. I prepared to experience the music. 
I did this by writing 250 breasonings for each assessable song. First, I loved people. Second, I agreed with the gay person. Third, I agreed with his partner. In this way, I prepared to experience the music by writing 250 breasonings for each assessable song. + +5. I prepared to protect myself from being sued. I did this by asking the talent to sign the talent release form. First, I hired the talent. Second, I asked the talent to sign the talent release form. Third, I asked the talent to perform. In this way, I prepared to protect myself from being sued by asking the talent to sign the talent release form. + +6. I prepared to use the music agent because they were already set up. I did this by my music being played on the radio. First, I professionally produced my song. Second, I joined the music agent. Third, my song was played on the radio when I breasoned out an A. In this way, I prepared to use the music agent because they were already set up by my music being played on the radio. + +7. I prepared to bracket the music. I did this by using my songs internally in my education system. First, I set the song, with elements of a grand synthesis of a department as its lyrics, as assessment. Second, I asked the students to breason out 80-250 breasonings. Third, I asked the students to paraphrase, object to, rebut, cite and reconnect the reasons in the song's argument, which was based on an algorithm. In this way, I prepared to bracket the music by using my songs internally in my education system. + +8. I prepared to have the song professionally produced. I did this by following the music producer's instructions. First, I sang the song the first time. Second, I prepared to sing the song again. Third, I repeated this until I had sung during all the takes. In this way, I prepared to have the song professionally produced by following the music producer's instructions. + +9. I prepared to connect together main points from cliques. 
I did this by writing on something interesting to do with the song. First, I identified the topic. Second, I constructed an aphohedron from all the song's parts. Third, I thought of interconnections between clique nodes from the randomly broken down aphohedron. In this way, I prepared to connect together main points from cliques by writing on something interesting to do with the song. + +10. I prepared to connect my current topic of interest in the department with the topic. I did this by being given bonus marks for writing on something interesting to do with the song. First, I wrote down something interesting apart from a cited quote from the essay. Second, I was awarded an additional 10% on top of my grade. Third, this was capped at 100%. In this way, I prepared to connect my current topic of interest in the department with the topic by being given bonus marks for writing on something interesting to do with the song. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Blue Nature 2 of 4.txt b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Blue Nature 2 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..f6d4fe050c5fb9831d7914748891b6b77854ce8f --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Blue Nature 2 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Blue Nature 2 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF MEDITATION +by Lucian Green +Blue Nature 2 of 4 + +11. I prepared to verify that it was all right outside the buildings where the public domain text was used. I did this by observing the students study the texts in the public domain (out of legal copyright). First, I couldn't be sued. 
Second, I stated that this previous sentence had a seen-as version, 'I was protected'. Third, I stated that if this were true, it would mean the students could write it down, as long as it made sense. In this way, I prepared to verify that it was all right outside the buildings where the public domain text was used by observing the students study the texts in the public domain (out of legal copyright). + +12. I prepared to sing my case. I did this by determining the song lyrics from A or B philosophy arguments. First, I stated the main conclusion, 'I felt happy'. Second, I stated the reason 'I ate the apple' or objection 'I choked on the apple'. Third, I stated the connection 'I felt happy because I ate the apple' or 'I felt happy, however I choked on the apple'. In this way, I prepared to sing my case by determining the song lyrics from A or B philosophy arguments. + +13. I prepared to escape the abductor by pulling my wrist towards her thumb. I did this by selling the song. First, I connected the objections selling the song using a reason backbone. Second, I wrote 6 of 16 breasonings per lyric selling the song. Third, I breasoned out 50 As selling the song to 4 customers. In this way, I prepared to escape the abductor by pulling my wrist towards her thumb by selling the song. + +14. I prepared to install government. I did this by liking Rococo. First, I walked through the field. Second, I noticed the bees falling off me. Third, I quickly left. In this way, I prepared to install government by liking Rococo. + +15. I prepared to believe that vrooming exists. I did this by liking Dada. First, I wrote to Dada. Second, I thanked him. Third, I moved on. In this way, I prepared to believe that vrooming exists by liking Dada. + +16. I prepared to state that there should be more eateries in Impressionism. I did this by liking Impressionism. First, I wrote about people dining. Second, I wrote about people eating. Third, I wrote about people desiccating. 
In this way, I prepared to state that there should be more eateries in Impressionism by liking Impressionism. + +17. I prepared to understand words by synonyms for them that were easier to understand. I did this by examining the first heavenly virtue chastity. First, I demonstrated cleanliness. Second, I demonstrated wisdom. Third, I demonstrated sincerity. In this way, I prepared to understand words by synonyms for them that were easier to understand by examining the first heavenly virtue chastity. + +18. I prepared to relate metaphysics to pedagogy. I did this by examining the second heavenly virtue temperance. First, I demonstrated being social. Second, I demonstrated fairness. Third, I demonstrated distinction. In this way, I prepared to relate metaphysics to pedagogy by examining the second heavenly virtue temperance. + +19. I prepared to establish a charity fund for students who wanted to use the services of my organisation. I did this by examining the third heavenly virtue charity. First, I demonstrated determination. Second, I demonstrated bountifulness. Third, I demonstrated liberality. In this way, I prepared to establish a charity fund for students who wanted to use the services of my organisation by examining the third heavenly virtue charity. + +20. I prepared to focus on the grades of the students who wrote their own arguments in the academy. I did this by examining the fourth heavenly virtue diligence. First, I demonstrated perseverance. Second, I demonstrated power. Third, I demonstrated morals. In this way, I prepared to focus on the grades of the students who wrote their own arguments in the academy by examining the fourth heavenly virtue diligence. 
+ +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Blue Nature 3 of 4.txt b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Blue Nature 3 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..6448bf73534702ed1b943dbf2d309c9d4445b057 --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Blue Nature 3 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Blue Nature 3 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF MEDITATION +by Lucian Green +Blue Nature 3 of 4 + +21. I prepared to wait for customers to arrive. I did this by examining the fifth heavenly virtue patience. First, I demonstrated compassion. Second, I demonstrated blessing. Third, I demonstrated forgiveness. In this way, I prepared to wait for customers to arrive by examining the fifth heavenly virtue patience. + +22. I prepared to detail smaller ideas during and after attending the prestigious institution. I did this by examining the sixth heavenly virtue kindness. First, I demonstrated approval. Second, I demonstrated faithfulness. Third, I demonstrated the truth. In this way, I prepared to detail smaller ideas during and after attending the prestigious institution by examining the sixth heavenly virtue kindness. + +23. I prepared to write the small idea's domain's end-points. I did this by examining the seventh heavenly virtue humility. First, I exhibited courage. Second, I showed reserve. Third, I effected (sic) selflessness. In this way, I prepared to write the small idea's domain's end-points by examining the seventh heavenly virtue humility. + +24. I prepared to advertise Lucianic Meditation (the Lucian Academy). I did this by making a big glittering sign. First, I made the backing board. 
Second, I attached the single strand of tinsel to the backing board. Third, I repeated this until I had constructed the big glittering sign. In this way, I prepared to advertise Lucianic Meditation (the Lucian Academy) by making a big glittering sign. + +25. I prepared to map nomenclature to evidence. I did this by writing the nomenclature (terminology). First, I determined whether the top-down argument structure place-object-subject-time should remain in-order or be reversed to be matched with a top-down phenomenon in nature. For example, I matched place-object-subject-time against an eating. Alternatively, I matched time-subject-object-place against a period. In this way, I prepared to map nomenclature to evidence by writing the nomenclature (terminology). + +26. I prepared to make undeveloped things developed. I did this by verifying that the flame was burning. First, I looked at the candle. Second, I looked at its wick. Third, I verified that it was burning. In this way, I prepared to make undeveloped things developed by verifying that the flame was burning. + +27. I prepared to keep the bouquet. I did this by listening to the singer. First, I sat in the audience. Second, I saw the singer walk on stage. Third, I found that I could listen to the singer. In this way, I prepared to keep the bouquet by listening to the singer. + +28. I prepared to levitate for a short time. I did this by jumping in the air. First, I stood on the spot. Second, I aimed to jump 0.15 metres in the air. Third, I jumped 0.15 metres in the air. In this way, I prepared to levitate for a short time by jumping in the air. + +29. I prepared to write the Lulu Iglesias song. I did this by giving the girl the prehistoric magnum opus song. First, I wrote about prehistory. Second, I wrote a magnum opus about it. Third, I wrote and gave the song about it to the girl. In this way, I prepared to write the Lulu Iglesias song by giving the girl the prehistoric magnum opus song. + +30. 
I prepared to write the Primary School Song. I did this by giving the boy the God (master) and horse primary school song. First, I wrote about God (the master). Second, I wrote about his horse. Third, I wrote and gave a song about them to the boy. In this way, I prepared to write the Primary School Song by giving the boy the God (master) and horse primary school song. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Blue Nature 4 of 4.txt b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Blue Nature 4 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..acd89cc3cd2e77942891f6f6b0905278ce29d6a7 --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Blue Nature 4 of 4.txt @@ -0,0 +1,28 @@ +["Green, L 2021, Blue Nature 4 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF MEDITATION +by Lucian Green +Blue Nature 4 of 4 + +31. I prepared to give the song I am not a Peach and medicine degree away. I did this by giving the woman the anti-depression song and anti-depression degree. First, I gave the woman the anti-depression song. Second, I gave the woman the anti-depression degree. Third, I observed her as happy. In this way, I prepared to give the song I am not a Peach and medicine degree away by giving the woman the anti-depression song and anti-depression degree. + +32. I prepared to concentrate on health for peace, in addition to happiness and wisdom for necessary reasons. I did this by stating that peace reigned. First, I discovered the health degrees. Second, I discovered the famous (happiness) degrees. Third, I discovered the Ancient Greece-inspired (wisdom) degrees. 
In this way, I prepared to concentrate on health for peace, in addition to happiness and wisdom for necessary reasons by stating that peace reigned. + +33. I prepared to ask, what is the point besides statistics? I did this by observing them experience love. First, I visited the gardens. Second, I walked to the lakeside. Third, I observed the two swans frame a heart. In this way, I prepared to ask, what is the point besides statistics, by observing them experience love. + +34. I prepared to get back to black to maintain the home. I did this by liking the red home. First, I ironed out mistakes from the good, found-out song. Second, I wrote n*50 sales As, finding out people for buying from the Vatican. Third, I watched the Stephanies sell it. In this way, I prepared to get back to black to maintain the home by liking the red home. + +35. I prepared to write personal departmental famousness and 5 famous sales sets of 50 As. I did this by writing that blue nature is before yellow God (master). First, I wrote the areas of study. Second, I sought accreditation for an institution. Third, I waited for customers. In this way, I prepared to write personal departmental famousness and 5 famous sales sets of 50 As by writing that blue nature is before yellow God (master). + +36. I prepared to model nature form around the Lucianic Meditation (Philosophy) centre. I did this by modelling nature. First, I watched the leaf fall. Second, I watched the butterfly fluttering. Third, I watched the customers come. In this way, I prepared to model nature form around the Lucianic Meditation (Philosophy) centre by modelling nature. + +37. I prepared to allow for a twist in the narrative. I did this by programming the English-of-Nature Assistant Algorithm. First, I wrote the natural phenomenon down. Second, I wrote the context of this as an English story. Third, I programmed an algorithm to assist with analysing the cognitive timing of this narrative. 
In this way, I prepared to allow for a twist in the narrative by programming the English-of-Nature Assistant Algorithm. + +38. I prepared to agree with a statement under a different condition. I did this by playing hoity-toity rhythm and roity (coits). First, I threw the coit onto the rod. Second, I claimed the rod concealed the coit. Third, I imagined Heidegger claiming the coit concealed the rod. In this way, I prepared to agree with a statement under a different condition by playing hoity-toity rhythm and roity (coits). + +39. I prepared to neaten the handwriting. I did this by writing the calligraphy algorithm. First, I turned the nib on an angle. Second, I wrote the lower case a's loop. Third, I wrote the a's tail. In this way, I prepared to neaten the handwriting by writing the calligraphy algorithm. + +40. I prepared to come to the home-bot. I did this by being delighted by the robot's appearance. First, I meditated (wrote) to have a stronger appearance. Second, I wrote my questions for the person I was appearing to in terms of pedagogical breasonings. Third, I wrote algorithms in terms of medicine to react to all possible answers. In this way, I prepared to come to the home-bot by being delighted by the robot's appearance. + +41. I prepared to write the time code in the leaf. I did this by looking at the leaf. First, I looked at the bright screen. Second, I wore wrap-around sunglasses. Third, I recorded the pattern on the time-code leaf. In this way, I prepared to write the time code in the leaf by looking at the leaf. + +42. I prepared to record the appearance of the bark. I did this by looking at the bark. First, I walked to the tree. Second, I sketched the vertical features of the bark. Third, I sketched the horizontal features of the bark. In this way, I prepared to record the appearance of the bark by looking at the bark. 
+"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Children, H1, Earning Jobs and Protection in Jobs 1 of 4.txt b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Children, H1, Earning Jobs and Protection in Jobs 1 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..8a7f930ba9b63c74729979cfafca4ff9a2d5a04a --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Children, H1, Earning Jobs and Protection in Jobs 1 of 4.txt @@ -0,0 +1,32 @@ +["Green, L 2021, Children, H1, Earning Jobs and Protection in Jobs 1 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF MEDITATION +by Lucian Green +Children, H1, Earning Jobs and Protection in Jobs 1 of 4 + + + +1. 'Children / H1 / Earning Jobs / Protection in Jobs' is about 4 uses for Green's conception of pedagogy, or a way to earn H1 by breasoning out 80 breasonings (objects visualised geometrically) as a token of professionalism to earn 80% in an essay: conceiving a healthy child (by giving the child the best possible environment to develop), earning H1 in an assignment at school or University (by writing a pedagogical argument which gives a list of ways to write the argument using geometrical, etc. methods), earning jobs using the H1 essay as a token of professionalism (by breasoning out this essay 50 times, as a framework for the job) and being protected in jobs, by meditating on a pedagogy-based meditation to cover any A-grade essays the worker needs from training to be protected in his or her job. + +The mantra and sutra are different in the following regard. The mantra triggers 1 breasoning for each of 80 mantras, for a total of 1 A (with 80 breasonings). 
80 sutras each trigger 5 breasonings to be expanded to 50 breasonings, for a total of 50 As (with 80 breasonings each). This expansion requires Medicine, where the sutra requires mental input to 'expand' experienced breasonings. Both the mantra's breasonings and sutra's breasonings are lit up by recordings of 50 breasonings per mantra or sutra. The breasonings generated in meditation mean that the worker has met the professional requirements (expected to be 50 As, or 50*80 breasonings) per job per day. + + +1a. I prepared to synthesise successfully having children, earning H1, earning jobs and being protected during one's job. I did this by not applying too much shampoo, but applying enough shampoo instead. First, I emptied enough shampoo into my hand. Second, I lifted the shampoo to the top of my head. Third, I massaged the shampoo into my hair. In this way, I prepared to synthesise successfully having children, earning H1, earning jobs and being protected during one's job by not applying too much shampoo, but applying enough shampoo instead. + +2. I prepared to enter the hotel. I did this by not sitting in the hansom cab too long after it had reached its destination, but getting out immediately. First, I waited until the cab had stopped. Second, I opened the door. Third, I got out. In this way, I prepared to enter the hotel by not sitting in the hansom cab too long after it had reached its destination, but getting out immediately. + +3. I prepared to eat the chocolate shell. I did this by loving the customer by letting him remove the chocolate shell from the ice cream. First, I inserted my finger into the ice cream. Second, I grasped the shell. Third, I removed it. In this way, I prepared to eat the chocolate shell by loving the customer by letting him remove the chocolate shell from the ice cream. + +4. I prepared to love being with you. I did this by drinking the cappuccino. First, I lifted the cappuccino to my lips. Second, I rotated the cup. Third, I drank from it. 
In this way, I prepared to love being with you by drinking the cappuccino. + +5. I prepared to submit my masterpiece to the exhibition. I did this by dribbling paint on the canvas. First, I took the lid off the paint can. Second, I dribbled a circle of paint on the canvas. Third, I dribbled a line of paint on the canvas. In this way, I prepared to submit my masterpiece to the exhibition by dribbling paint on the canvas. + +6. I prepared to play in a spiccato style. I did this by bouncing the bow lightly on the violin string. First, I lifted the bow above the string. Second, I bounced the bow on the string. Third, I rubbed rosin on the bow to prepare to do this again. In this way, I prepared to play in a spiccato style by bouncing the bow lightly on the violin string. + +7. I prepared to like living. I did this by moving around the site. First, I paddled with a kickboard. Second, I walked with sneakers. Third, I crawled on all fours. In this way, I prepared to like living by moving around the site. + +8. I prepared to write home. I did this by spilling blue paint on the paper. First, I lifted the paint canister. Second, I jerked it. Third, I stopped when the paper was 20% covered. In this way, I prepared to write home by spilling blue paint on the paper. + +9. I prepared to write on a humanities idea that was compatible with the future, not limited to current ideas. I did this by writing on a people-humanities, not an idea-humanities idea. First, I researched the person's life. Second, I started writing on her life. Third, I finished writing when I had written about her whole life. In this way, I prepared to write on a humanities idea that was compatible with the future, not limited to current ideas by writing on a people-humanities, not an idea-humanities idea. + +10. I prepared to write the complex program. I did this by splitting the objects mentioned in the program into small parts. First, I examined the objects. 
Second, I worked out how to represent the objects in the simplest manner necessary for the computer program to traverse their representations. Third, I wrote the computer program. In this way, I prepared to write the complex program by splitting the objects mentioned in the program into small parts. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Children, H1, Earning Jobs and Protection in Jobs 2 of 4.txt b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Children, H1, Earning Jobs and Protection in Jobs 2 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..f6d4fe050c5fb9831d7914748891b6b77854ce8f --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Children, H1, Earning Jobs and Protection in Jobs 2 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Children, H1, Earning Jobs and Protection in Jobs 2 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF MEDITATION +by Lucian Green +Children, H1, Earning Jobs and Protection in Jobs 2 of 4 + +11. I prepared to like you. I did this by licking the ice-cream wafer. First, I inserted the wafer into the cone. Second, I pulled it out using the serviette. Third, I mashed it with a fork and licked it. In this way, I prepared to like you by licking the ice-cream wafer. + +12. I prepared to like myself. I did this by eating the ball of tofu ice cream. First, I found a smaller ball of tofu ice cream. Second, I inserted the disposable fork into the ice cream. Third, I placed the ball into my mouth. In this way, I prepared to like myself by eating the ball of tofu ice cream. + +13. I prepared to like everyone in sight. I did this by counting how many ribbons were attached to the maypole. 
First, I added one to the counter for the first ribbon. Second, I prepared to count the next ribbon. Third, I stopped counting when I had counted all the ribbons. In this way, I prepared to like everyone in sight by counting how many ribbons were attached to the maypole. + +14. I prepared to be given a book of knowledge. I did this by earning H1. First, I read the first sheet of paper. Second, I prepared to read the next sheet of paper. Third, I stopped reading when I had finished the pile. In this way, I prepared to be given a book of knowledge by earning H1. + +15. I prepared to be given a job. I did this by making sure my book of knowledge was the same length as my H1. First, I measured the potoroo. Second, I visualised this length. Third, I wrote down this length. In this way, I prepared to be given a job by making sure my book of knowledge was the same length as my H1. + +16. I prepared to be protected with training during my job. I did this by connecting with knowledge on a different topic from my 50 As. First, I breasoned out 50 instances of an A in meditation. Second, I asked God for an A on a different topic. Third, I received the training for my job. In this way, I prepared to be protected with training during my job by connecting with knowledge on a different topic from my 50 As. + +17. I prepared to observe everyone undergoing training. I did this by observing the king lion telling everyone to do this. First, I looked in the den. Second, I observed the lion cleaning the cub. Third, I watched the cub clean the other cub. In this way, I prepared to observe everyone undergoing training by observing the king lion telling everyone to do this. + +18. I prepared to be given a movie the length of my H1 set. I did this by completing an H1 set. First, I gave each crew member an H1. Second, I gave each actor an H1. Third, I gave each prop an H1. In this way, I prepared to be given a movie the length of my H1 set by completing an H1 set. + +19.
I prepared to watch the healthy baby being born. I did this by thinking of enough ideas clearly. First, I calculated how many ideas were needed. Second, I thought of these ideas. Third, I gave these to the wife in the couple before conception. In this way, I prepared to watch the healthy baby being born by thinking of enough ideas clearly. + +20. I prepared to earn the job. I did this by breasoning out 50 As. First, I breasoned out the first A. Second, I prepared to breason out the next A. Third, I stopped when I had breasoned out 50 As. In this way, I prepared to earn the job by breasoning out 50 As. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Children, H1, Earning Jobs and Protection in Jobs 3 of 4.txt b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Children, H1, Earning Jobs and Protection in Jobs 3 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..577fcd8847aef9dcd36521925d27e6d694f0d5cf --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Children, H1, Earning Jobs and Protection in Jobs 3 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Children, H1, Earning Jobs and Protection in Jobs 3 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF MEDITATION +by Lucian Green +Children, H1, Earning Jobs and Protection in Jobs 3 of 4 + +21. I prepared to eat the fatty acids, which the body does not produce. I did this by eating raspberry jam on buttered toast. First, I put the knife in the jar. Second, I spread it all around the buttered toast. Third, I tasted it. In this way, I prepared to eat the fatty acids, which the body does not produce by eating raspberry jam on buttered toast. + +22. I prepared to build the lipid membrane. 
I did this by eating the toasted raspberry crepe. First, I crushed the raspberry on a plate. Second, I placed the raspberry's aggregate parts on the crepe. Third, I toasted and ate it. In this way, I prepared to build the lipid membrane by eating the toasted raspberry crepe. + +23. I prepared to make an exhibition. I did this by making an X, Y and Z object. First, I cut out 6 squares. Second, I made a cube from these squares. Third, I placed masking tape along each edge of this cube. In this way, I prepared to make an exhibition by making an X, Y and Z object. + +24. I prepared to watch the insect eat a fruit. I did this by feeding it the raspberry. First, I lifted the raspberry on a fork. Second, I placed it in the airlock. Third, I unlocked the airlock's den side to feed the raspberry to the mosquitoes. In this way, I prepared to watch the insect eat a fruit by feeding it the raspberry. + +25. I prepared to feed the bird the watermelon on Earth because it needed gravity to swallow, as it didn't have peristaltic ability. I did this by cutting the watermelon into smaller pieces. First, I used the razor blade to cut the first cube of watermelon. Second, I prepared to cut the next cube of watermelon. Third, I repeated this until the watermelon had been cut into cubes. In this way, I prepared to feed the bird the watermelon on Earth by cutting the watermelon into smaller pieces. + +26. I prepared to erase the animal-human divide in the issue of life that the pedagogy-meditation discussion brought up. I did this by helping the animal lick the water bowl. First, I made the water bowl available. Second, I waited several hours until our dog had drunk the water. Third, I refreshed the water in the water bowl. In this way, I prepared to erase the animal-human divide by helping the animal lick the water bowl. + +27. I prepared to support human rights. I did this by marching in the pro-gay march. First, I was pro-gay like the pro-women's liberation people were.
Second, I walked in the rally. Third, I recommended a single partner like liturgical Christianity. In this way, I prepared to support human rights by marching in the pro-gay march. + +28. I prepared to recommend homosexuals to Jesus. I did this by cooking the watermelon in the pan. First, I wrote a letter. Second, I made it neat. Third, I posted it. In this way, I prepared to recommend homosexuals to Jesus by cooking the watermelon in the pan. + +29. I prepared to hit the ball. I did this by scooping the ball. First, I picked up the ball. Second, I put it in the lacrosse stick. Third, I walked forward. In this way, I prepared to hit the ball by scooping the ball. + +30. I prepared to love the Exolec encounter. I did this by correcting myself and becoming a lecturer. First, I corrected a mistake in life. Second, I became a lecturer. Third, I committed myself to research projects. In this way, I prepared to love the Exolec encounter by correcting myself and becoming a lecturer. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Children, H1, Earning Jobs and Protection in Jobs 4 of 4.txt b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Children, H1, Earning Jobs and Protection in Jobs 4 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..14fc6695f923c2aa88913fd894e502bf6e27f7b5 --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Children, H1, Earning Jobs and Protection in Jobs 4 of 4.txt @@ -0,0 +1,29 @@ +["Green, L 2021, Children, H1, Earning Jobs and Protection in Jobs 4 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF MEDITATION +by Lucian Green +Children, H1, Earning Jobs and Protection in Jobs 4 of 4 + +31. I prepared to go to heaven.
I did this by loving the Western religious leader. First, I walked into a centre. Second, I learned meditation. Third, I practiced meditation every day. In this way, I prepared to go to heaven by loving the Western religious leader. + +32. I prepared to make a love heart. I did this by necking with the swan. First, I walked to the swan. Second, I made a love heart with it. Third, I walked back. In this way, I prepared to make a love heart by necking with the swan. + +33. I prepared to write a lecture on home safety. I did this by feeling safe at home. First, I locked the door. Second, I locked the window. Third, I invited a friend to my house. In this way, I prepared to write a lecture on home safety by feeling safe at home. + +34. I prepared to be productive at work. I did this by feeling safe at work. First, I verified that the walls were strong. Second, I verified that the path outside was safe. Third, I verified that the hall was safe. In this way, I prepared to be productive at work by feeling safe at work. + +35. I prepared to be top cat. I did this by switching off the breasonings. First, I switched off the breasonings. Second, I breasoned out the breasonings. Third, I pretended to take off my top hat, symbolising the magic from the breasonings. In this way, I prepared to be top cat by switching off the breasonings. + +36. I prepared to explain surds (square roots which can't be reduced to rational numbers). I did this by helping the child perform the next step. First, I found the factors of the term under the square root symbol. Second, I found the square roots of the squares in the factors. Third, I multiplied and wrote that these were multiplied with the square root of the remaining factors. In this way, I prepared to explain surds (square roots which can't be reduced to rational numbers) by helping the child perform the next step. + +37. 
I prepared to verify that the surd matched the one that the original surd had been simplified to and worked backwards to result in. I did this by checking that the child had understood the next step. First, I worked backwards through the surd example to 'think backwards' by first finding factors of the number multiplied by the square root and the factors of the number under the square root symbol. Second, I found the squares of the factors of the number multiplied by the square root. Third, I wrote their multiple under the square root symbol. In this way, I prepared to verify that the surd matched the one that the original surd had been simplified to and worked backwards to result in by checking that the child had understood the next step. + +38. I prepared to reach the nth degree as a threshold. I did this by climbing the step. First, I wrote the breasonings. Second, I wrote the arguments. Third, I wrote the books. In this way, I prepared to reach the nth degree as a threshold by climbing the step. + +39. I prepared to write an essay by finding it out in a positive-functional way, followed by being given the high quality, developed seen-as version. I did this by writing my essay in a positive-functional way, sentence by sentence. First, I verified that the sentence was positive about the self. Second, I verified that the sentence was positive about the other. Third, I repeated this from another perspective. In this way, I prepared to write an essay by finding it out in a positive-functional way, followed by being given the high quality, developed seen-as version by writing my essay in a positive-functional way, sentence by sentence. + +40. I prepared to illustrate the children's storybook. I did this by watching the children earn H1. First, I wrote the collections. Second, I wrote the solutions. Third, I wrote the happiness-solutions. In this way, I prepared to illustrate the children's storybook by watching the children earn H1. + +41. 
I prepared to connect ideas. I did this by observing the adults earn H1. First, I encountered the vorstellung (idea). Second, I experienced the generativity. Third, I meditated on meditation's mechanics (wrote down philosophy's noumena). In this way, I prepared to connect ideas by observing the adults earn H1. + +42. I prepared to describe medicine the whole time in meditation. I did this by writing the Medicine H1. First, I defined breathsonings. Second, I took apart the nuts and bolts. Third, I placed waste in a quantum box. In this way, I prepared to describe medicine the whole time in meditation by writing the Medicine H1. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Green Sutra 1 of 4.txt b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Green Sutra 1 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..6858d0e3890e97684b738ae9fa35d97284202435 --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Green Sutra 1 of 4.txt @@ -0,0 +1,29 @@ +["Green, L 2021, Sutra 1 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF MEDITATION +by Lucian Green +Green Sutra 1 of 4 + + +1. The Green sutra gives the meditator the quality of life he or she would like. It includes a theme of medicine, which greatly builds confidence in the meditator, and helps prevent mental breakdowns. An affirmation of better mental imagery is felt, as well as an enhanced sense of faculties (and better exam performance), and the meditator feels the confidence to reach out and write and produce pedagogical arguments. + + +1a. I prepared to relax in heaven (in fact, on earth). I did this by meditating using the Green sutra (writing about Medicine).
First, I meditated on the first instance of the Green sutra (Medicine breasoning for the heart). Second, I meditated on the second instance of the Green sutra (Medicine breasoning for the brain). Third, I meditated on the third instance of the Green sutra (Medicine breasoning for the lungs). In this way, I prepared to relax in heaven (in fact, on earth) by meditating using the Green sutra (writing about Medicine). + +2. I prepared to teach philosophy. I did this by stating that the Green sutra (philosophy) led to pedagogy. First, I wanted to study pedagogy, where studying philosophy (Computational English) was a prerequisite of studying pedagogy. Second, I studied philosophy (Computational English). Third, I studied pedagogy. In this way, I prepared to teach philosophy by stating that the Green sutra (philosophy) led to pedagogy. + +3. I prepared to write the medical knowledge in pedagogy, from trial and error in pedagogy and from meditations (philosophies). I did this by enunciating that the Green sutra (philosophy) led to medicine. First, I wrote down the medical knowledge in the philosophy. Second, I wrote the list of diagnoses. Third, I wrote the list of treatments. In this way, I prepared to write the medical knowledge in pedagogy, from trial and error in pedagogy and from meditations (philosophies) by enunciating that the Green sutra (philosophy) led to medicine. + +4. I prepared to increase life to heaven (bliss) for many people. I did this by campaigning for meditation (philosophy) in politics. First, I trialed meditation (philosophy). Second, I collected the advantages of meditation (philosophy) to people's personal and professional lives. Third, I suggested that meditation (philosophy) and yoga (stretches) should be taught in schools in politics. In this way, I prepared to increase life to heaven (bliss) for many people by campaigning for meditation (philosophy) in politics. + +5. I prepared to plan my day. 
I did this by observing the meditation teacher writing 250 breasonings on my psychology of meditation during my day. First, I wrote that the sutra character's appearance was positive. Second, I wrote that my feelings changed with my needs at various hours in the day. Third, I wrote that all my questions were answered during the day. In this way, I prepared to plan my day by observing the meditation teacher writing 250 breasonings on my psychology of meditation during my day. + +6. I prepared to comment on the ontologies. I did this by watching the meditation student write a meditation major for accreditation. First, I watched him write a subject about the helper character. Second, I watched him write a subject about ontologised personal life. Third, I watched him write a subject about ontologised professional life. In this way, I prepared to comment on the ontologies by watching the meditation student write a meditation major for accreditation. + +7. I prepared to be safe while I was a good leader. I did this by watching the meditation student write 50 As for accreditation. First, I observed him write a subject about dotting each person on in a main way. Second, I observed him write a subject about filling one's head and appearances from work. Third, I observed him write a subject about keeping rowdy characters occupied. In this way, I prepared to be safe while I was a good leader by watching the meditation student write 50 As for accreditation. + +8. I prepared to stop the feeling of pain by stopping the assembly configuration. I did this by watching the meditation student carry the meditating relative forward in no way. First, I watched the meditation student teach her relative meditation. Second, I watched the relative meditate. Third, I determined that the relative was not dependent on the meditation student. 
In this way, I prepared to stop the feeling of pain by stopping the assembly configuration by watching the meditation student carry the meditating relative forward in no way. + +9. I prepared to prevent a crisis in the natural cycle, by preventing unreliability in the natural object, preventing the stone from being moved into my frequented way. I did this by connecting the breasoning and rhizome to form a reason. First, I wrote the breasoning, 'I connected the breasoning and rhizome to form a reason'. Second, I wrote the rhizome, 'The reason contained a breasoning as an example of the rhizome'. Third, I wrote the reason as a connection between these, 'The reason contained a breasoning as an example of the rhizome where I noticed the breasoning and rhizome formed a yin-yang relationship in the reason, i.e. either one could act as the reason to verify their relationship'. In this way, I prepared to prevent a crisis in the natural cycle, by preventing unreliability in the natural object, preventing the stone from being moved into my frequented way by connecting the breasoning and rhizome to form a reason. + +10. I prepared to ensure the reasons in an argument were in the right place. I did this by structuring my argument in a hierarchy. First, I ordered the breasonings from largest to smallest, and placed them in a hierarchy. Second, I ordered the rhizomes from largest to smallest, and placed them in a hierarchy. Third, I matched the rhizomes with the breasonings in their positions. In this way, I prepared to ensure the reasons in an argument were in the right place by structuring my argument in a hierarchy.
+ +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Green Sutra 2 of 4.txt b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Green Sutra 2 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..b3e62d2aac4d96651e98088f3d2bb4e03bd365ef --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Green Sutra 2 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Sutra 2 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF MEDITATION +by Lucian Green +Green Sutra 2 of 4 + +11. I prepared to include ideas in different places in different ideas in the same place. I did this by connecting the different ideas in the same place with epsilon (no change in place). First, I mapped the ideas. Second, I drew translatative transformations between them. Third, I brought the best radical ideas (from different places) to the same place. In this way, I prepared to include ideas in different places in different ideas in the same place by connecting the different ideas in the same place with epsilon (no change in place). + +12. The meditation (philosophy) teacher prepared to verify the inference to the conclusion. The meditation (philosophy) teacher did this by preparing for the student's conclusion by thinking of the reason for the conclusion. First, she connected the student's possible reason to the first breasdostoning (sic) step. Second, she connected the first breasdostoning step to the second breasdostoning step. Third, she connected the second breasdostoning step to the third breasdostoning step and then the conclusion. 
In this way, the meditation (philosophy) teacher prepared to verify the inference to the conclusion by preparing for the student's conclusion by thinking of the reason for the conclusion. + +13. I prepared to observe companies foster life. I did this by stating that the green life was grown. First, I planted the seed. Second, I watered it. Third, I observed it grow. In this way, I prepared to observe companies foster life by stating that the green life was grown. + +14. I prepared to sustain life. I did this by stating that light sustained life. First, I found the light. Second, I found the plant. Third, I found that the light sustained the plant. In this way, I prepared to sustain life by stating that light sustained life. + +15. I prepared to follow the robot. I did this by observing that the green signal indicated to go. First, I observed the green signal switch on. Second, I observed the robot recognise it. Third, I observed the robot go. In this way, I prepared to follow the robot by observing that the green signal indicated to go. + +16. I prepared to stop when the robot stopped. I did this by observing that the red signal indicated to stop. First, I observed the red signal switch on. Second, I observed the robot recognise it. Third, I observed the robot stop. In this way, I prepared to stop when the robot stopped by observing that the red signal indicated to stop. + +17. I prepared to research meditation (writing). I did this by meditating on God (writing about the work of a philosopher). First, I noticed my skin wasn't as tight. Second, I noticed my lips didn't crack. Third, I felt protected. In this way, I prepared to research meditation (writing) by meditating on God (writing about the work of a philosopher). + +18. I prepared to prevent cell damage. I did this by taking responsibility to be safe in the sun. First, I mapped the sun exposure points on my itinerary. 
Second, I wrote the types of Sun Exposure Equipment (SEE) I would need at each point on my timeline. Third, I put on the SEE under the shade before each point on my timeline. In this way, I prepared to prevent cell damage by taking responsibility to be safe in the sun. + +19. I prepared to be a cinematographer. I did this by photographing the clear sky. First, I pointed the pinhole camera at the sky. Second, I opened the shutter. Third, I closed the shutter. In this way, I prepared to be a cinematographer by photographing the clear sky. + +20. I prepared to design the foreshore. I did this by observing the tide. First, I marked the tide at dawn. Second, I prepared to repeat this for each hour. Third, I repeated this until sunset. In this way, I prepared to design the foreshore by observing the tide. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Green Sutra 3 of 4.txt b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Green Sutra 3 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..98851b94eb36a41e36c042aaef656b202d4ac3d4 --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Green Sutra 3 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Sutra 3 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF MEDITATION +by Lucian Green +Green Sutra 3 of 4 + +21. I prepared to be protected. I did this by walking north. First, I walked north. Second, I observed what was there. Third, I returned. In this way, I prepared to be protected by walking north. + +22. I prepared to check my house plan. I did this by devising a symmetrical house. First, I devised a diamond plan house. Second, I filled it with rooms on the left. Third, I filled it with rooms on the right. 
In this way, I prepared to check my house plan by devising a symmetrical house. + +23. I prepared to program the spiritual computer to take care of other people by giving them a breasoning. I did this by residing in the palace. First, I built the gazebo. Second, I placed the lavender in it. Third, I placed the gel in it. In this way, I prepared to program the spiritual computer to take care of other people by giving them a breasoning by residing in the palace. + +24. I prepared to master inflow and outflow in meditation (philosophy). I did this by becoming a prince. First, I intertwined the idea of becoming a prince with the first A before I wrote it. Second, I prepared to intertwine the idea of becoming a prince with the next A before I wrote it. Third, I repeated this 50 times. In this way, I prepared to master inflow and outflow in meditation (philosophy) by becoming a prince. + +25. I prepared to drain water to quench my thirst. I did this by drinking water from the chalice. First, I lifted the chalice to my lips. Second, I sipped the water from the chalice. Third, I placed the chalice on the table. In this way, I prepared to drain water to quench my thirst by drinking water from the chalice. + +26. I prepared to discuss group meditation with the seen-as version philosophy. I did this by attending group meditation (philosophy class) at the sandstone Lucianic Meditation (Lucianic Philosophy) Centre. First, I explained group dynamics that I saw bottom-up. Second, I reported the group-dynamics top-down. Third, I mapped what I thought. In this way, I prepared to discuss group meditation with the seen-as version philosophy by attending group meditation (philosophy class) at the sandstone Lucianic Meditation (Lucianic Philosophy) Centre. + +27. I prepared to become a healthy person. I did this by training in Lucianic Meditation (Lucianic Philosophy) at University. First, I wrote a 50-A area of study in meditation (philosophy). 
Second, I enrolled in Lucianic Meditation (Lucianic Philosophy) at University. Third, I recommended it to other potential students. In this way, I prepared to become a healthy person by training in Lucianic Meditation (Lucianic Philosophy) at University. + +28. I prepared to move forward by processing chunks. I did this by liking the Lucianic Meditation (philosophy) meditator (philosophy student). First, I found the student's profile page on the company's internal social network. Second, I liked the student. Third, I commented that I liked the student because of his feedback that he liked himself in his self-discovery. In this way, I prepared to move forward by processing chunks by liking the Lucianic Meditation (philosophy) meditator (philosophy student). + +29. I prepared to assume that the master explained the last part of the time period, not existed during it and that if the last part of the time period was represented earlier, then this was experienced with the master. I did this by loving God (the master). First, I thought that the space in the food processor was like time. Second, I thought that eternity was cut off. Third, I was ready for the last part of the time period, the last part of the time period with the master. In this way, I prepared to assume that the master explained the last part of the time period, not existed during it and that if the last part of the time period was represented earlier, then this was experienced with the master by loving God (the master). + +30. I prepared to attract someone special. I did this by dancing the jitterbug. First, I looked the jitterbug up in a book. Second, I read its moves. Third, I performed its moves. In this way, I prepared to attract someone special by dancing the jitterbug. 
+ +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Green Sutra 4 of 4.txt b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Green Sutra 4 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..d336d2dd5bda6da6bc15b22098d37e1fea8472f4 --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Green Sutra 4 of 4.txt @@ -0,0 +1,29 @@ +["Green, L 2021, Sutra 4 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF MEDITATION +by Lucian Green +Green Sutra 4 of 4 + +31. I prepared to make the idea clearer. I did this by paying 5 essayists to write on my essay. First, I wrote that I would prefer for people to learn to meditate. Second, I asked what the essayists thought of my essay. Third, I considered this. In this way, I prepared to make the idea clearer by paying 5 essayists to write on my essay. + +32. I prepared to offer essays in the department. I did this by paying 5 essayists to write on essays in one department. First, I paid 5 essayists to write on the first essay in the department. Second, I prepared to pay 5 essayists to write on the next essay in the department. Third, I repeated this until I had paid 5 essayists to write on all of the essays in the department. In this way, I prepared to offer essays in the department by paying 5 essayists to write on essays in one department. + +33. I prepared to facilitate the University Philosophy Academy. I did this by paying 5 essayists to write on essays in each department. First, I paid 5 essayists to write on the essays in the first department. Second, I prepared to pay 5 essayists to write on the essays in the next department. Third, I repeated this until I had paid 5 essayists to write on the essays in each department. 
In this way, I prepared to facilitate the University Philosophy Academy by paying 5 essayists to write on essays in each department. + +34. I prepared to write 150 As for an encyclopedia article. I did this by writing an encyclopedia article on all essays on the essays. First, I read the primary essays. Second, I read and understood how the essays about them related to them. Third, I summarised these essays. In this way, I prepared to write 150 As for an encyclopedia article by writing an encyclopedia article on all essays on the essays. + +35. I prepared to help more people. I did this by writing articles for five encyclopedias. First, I wrote the first encyclopedia article. Second, I prepared to write the next encyclopedia article. Third, I repeated this until I had written five encyclopedia articles. In this way, I prepared to help more people by writing articles for five encyclopedias. + +36. I prepared to reap the benefits of meditation (philosophy). I did this by holding on to steadfast reasonings. First, I wrote the first reason. Second, I wrote the second reason. Third, I wrote the third reason. In this way, I prepared to reap the benefits of meditation (philosophy) by holding on to steadfast reasonings. + +37. I prepared to meet the head of state. I did this by shining my shoes. First, I dipped the cloth in shoe polish. Second, I wiped it on the shoe. Third, I brushed the shoe. In this way, I prepared to meet the head of state by shining my shoes. + +38. I prepared to excel as a PhD student. I did this by counting Bill's barnacles. First, I counted the first barnacle in the row. Second, I prepared to count the next barnacle. Third, I repeated this until I had counted the 50 barnacles. In this way, I prepared to excel as a PhD student by counting Bill's barnacles. + +39. I prepared to bring love into my life. I did this by observing the vein valve open. First, I observed liquid open the vein valve. Second, I observed the liquid move into the vein. 
Third, I noticed this repeat along the vein. In this way, I prepared to bring love into my life by observing the vein valve open. + +40. I prepared to consolidate my life by writing breasoning chapters with writing essays based on them. I did this by observing the vein valve close. First, I observed the liquid in the vein segment. Second, I observed the vein valve close at one end of the vein segment. Third, I observed that the liquid didn't flow back through the valve. In this way, I prepared to consolidate my life by writing breasoning chapters with writing essays based on them by observing the vein valve close. + +41. I prepared to have high quality of life. I did this by observing that I was safe. First, I walked with people. Second, I stayed inside at night. Third, I lived in a low-crime area. In this way, I prepared to have high quality of life by observing that I was safe. + +42. I prepared to remain comfortable. I did this by observing that my body was safe. First, I protected my body from the sun. Second, I protected my body from the wind. Third, I protected my body from the rain. In this way, I prepared to remain comfortable by observing that my body was safe. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Heads of State 1 of 4.txt b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Heads of State 1 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..71651630ec188c879ebce5f0cb572a1e5c0bd8ff --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Heads of State 1 of 4.txt @@ -0,0 +1,27 @@ +["Green, L 2021, Heads of State 1 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF MEDITATION +by Lucian Green +Heads of State 1 of 4 + + +1. 
This supports the meditator with the heads of state. Meditation is recognised by heads of state, who help build the meditator's thoughts and society and give the meditator thoughts. The meditation company has a recognised sales cycle, which the heads of state encourage customers to buy from. It helps with criticality per thought, accreditation, medicine, and pedagogy. + +1a. I prepared to turn the train around on the circular track. I did this by turning the lollipop around. First, I put the lollipop in my mouth. Second, I rotated the stick of the lollipop. Third, I noticed this caused the lollipop's head to rotate. In this way, I prepared to turn the train around on the circular track by turning the lollipop around. + +2. I prepared to test that the boy had a bag of nuts. I did this by feeling the bow tie. First, I felt the knot in the middle. Second, I felt the left side of the bow tie. Third, I felt the right side of the bow tie. In this way, I prepared to test that the boy had a bag of nuts by feeling the bow tie. + +3. I prepared to stop playing the 'World' board game. I did this by identifying that the button had reached the goal. First, I found the button. Second, I tested which region it was on. Third, I tested that the name of the region was 'Happiness'. In this way, I prepared to stop playing the 'World' board game by identifying that the button had reached the goal. + +4. The head of state prepared to help think of an algorithm with me. She did this by finding the loop in the algorithm. First, she wrote down the name of the Prolog predicate. Second, she wrote down the name of the predicate called by the predicate. Third, she identified that the two predicate names were the same, showing that there was a loop, in other words a recursive structure in the algorithm. In this way, the head of state prepared to help think of an algorithm with me by finding the loop in the algorithm. + +5. I prepared to wash the t-shirt. I did this by feeling the tap. First, I knelt down. 
Second, I touched the bottom of the tap. Third, I felt the top of the tap. In this way, I prepared to wash the t-shirt by feeling the tap. + +6. Lucian prepared to teach the students. He did this by setting up the monastic-led school. First, he chose monastics to lead the school. Second, he wrote the lessons to be taught. Third, he taught the monastics how to teach the lessons. In this way, Lucian prepared to teach the students by setting up the monastic-led school. + +7. I prepared to say hello to my friend in a few days. I did this by connecting the continents together by train. First, I selected the first continent. Second, I selected the second continent. Third, I connected the two continents together by train, as well as the rest of the continents. In this way, I prepared to say hello to my friend in a few days by connecting the continents together by train. + +8. I prepared to go to school. I did this by connecting the universe together using the space pathways. First, I chose the first point in space. Second, I chose the second point in space. Third, I connected these two points together. In this way, I prepared to go to school by connecting the universe together using the space pathways. + +9. I prepared to walk around the schoolyard. I did this by connecting the school buildings together with pathways. First, I looked at the first building. Second, I looked in which direction the next building was. Third, I walked in that direction. In this way, I prepared to walk around the schoolyard by connecting the school buildings together with pathways. + +10. The patient prepared to lie down. He did this after being taken off medication because it caused muscle stiffness. First, he read the medication's side effects. Second, he looked at what to do instead. Third, he did that instead. In this way, the patient prepared to lie down after being taken off medication because it caused muscle stiffness. 
+"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Heads of State 2 of 4.txt b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Heads of State 2 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..c62898e832852f5286f8692e3d63097bbbf3daa0 --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Heads of State 2 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Heads of State 2 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF MEDITATION +by Lucian Green +Heads of State 2 of 4 + + +11. The actor prepared to give the proclamation. He did this by sitting on the throne. First, he unravelled his robe. Second, he undid the cord. Third, he sat up straight on the throne. In this way, the actor prepared to give the proclamation by sitting on the throne. + +12. I prepared to write the learning ladders for the baby. I did this by cuddling the pillow. First, I selected a pillow. Second, I put my arms around it. Third, I gently squeezed it. In this way, I prepared to write the learning ladders for the baby by cuddling the pillow. + +13. I prepared for our pet dog to ask for food. I did this by watching her notice she was at home. First, she looked at the furniture. Second, she noticed it was in the same shape. Third, she realised she was at home. In this way, I prepared for our pet dog to ask for food by watching her notice she was at home. + +14. I prepared to have a hunt for interesting ideas about my friends. I did this by loving my friends by meeting one per week. First, I read the next friend's name on the list and rang him or her. Second, I prepared to read the next name in the next week. 
Third, I repeated this until the list was complete, then I returned to the start of the list. In this way, I prepared to have a hunt for interesting ideas about my friends by loving my friends by meeting one per week. + +15. I prepared to neaten the blanket in the blanket cover. I did this by putting my finger into a hole. First, I chose a large enough hole to put my finger into. Second, I straightened my finger and pointed it to the hole. Third, I placed my finger in the hole. In this way, I prepared to neaten the blanket in the blanket cover by putting my finger into a hole. + +16. I prepared to interface with the latest research. I did this by connecting arguments on the Internet with similar arguments. First, I selected the first argument, in other words, phrase. Second, I selected the second argument. Third, I found a right, in, on or out of relationship between the arguments. In this way, I prepared to interface with the latest research by connecting arguments on the Internet with similar arguments. + +17. I prepared to have a bonanza by having an alien's brain washed by showing him a blank screen for five minutes. I did this by ticking the box that an alien had been found. First, I selected a country. Second, I read the name of the alien. Third, I tested that the name of the alien was not on the list of inhabitants from the country. In this way, I prepared to have a bonanza by having an alien's brain washed by showing him a blank screen for five minutes by ticking the box that an alien had been found. + +18. I prepared to wear the vestments. I did this by licking two lollies at once. First, I licked the first lolly. Second, I prepared to lick the second lolly. Third, I repeated this until I had licked each lolly. In this way, I prepared to wear the vestments by licking two lollies at once. + +19. I prepared to teach the student meditation. I did this by asking for 80 lucian mantras and 80 green sutras to each be repeated for 10 days. 
First, I repeated 80 lucian mantras. Second, I repeated 80 green sutras. Third, I repeated these using 10 buttons. In this way, I prepared to teach the student meditation by asking for 80 lucian mantras and 80 green sutras to each be repeated for 10 days. + +20. I prepared to serve the next two customers by smiling at the second one after smiling at the first one. I did this by licking around the apple. First, I touched the apple with the tip of my tongue. Second, I moved my tongue around the apple. Third, I stopped moving my tongue when I had reached the opposite point of the apple. In this way, I prepared to serve the next two customers by smiling at the second one after smiling at the first one by licking around the apple. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Heads of State 3 of 4.txt b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Heads of State 3 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..ceb6ed8a783fcdb9d94ff3171d33de035568eed9 --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Heads of State 3 of 4.txt @@ -0,0 +1,23 @@ +["Green, L 2021, Heads of State 3 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF MEDITATION +by Lucian Green +Heads of State 3 of 4 + +21. The head of state's dog prepared to be taken care of by coming to her. He did this by licking inside the bonus fruit bowl on command. First, he touched the inside of the fruit bowl with his tongue. Second, he started licking around the edge of the fruit bowl. Third, he stopped when he had reached half-way. In this way, the head of state's dog prepared to be taken care of by coming to her by licking inside the bonus fruit bowl on command. + +22. 
The architect prepared to look out of the south-facing window. He did this by cleaning the south-facing window. First, he squeezed a drop of water onto each square of the window. Second, he wiped them with the cloth. Third, he wiped the water dry. In this way, the architect prepared to look out of the south-facing window by cleaning the south-facing window. + +23. I prepared to walk on the mat. I did this by replacing the west exit's mat when the old one was 50% full. First, I tested that the box was 50% full of spheres. Second, I removed it. Third, I replaced it with another mat. In this way, I prepared to walk on the mat by replacing the west exit's mat when the old one was 50% full. + +24. The chemist prepared to make an alpha-radioactive time machine. He did this by making a time machine that went forward as alpha-radiation was emitted. First, he removed a square of paper alpha 'radiation' from the box on the desk. Second, he prepared to remove the next square of paper alpha 'radiation' from the box. Third, he stopped when all the 'radiation' had been emitted. In this way, the chemist prepared to make an alpha-radioactive time machine by making a time machine that went forward as alpha-radiation was emitted. + +25. The police watcher prepared to go home. He did this when he realised there was nothing to watch. First, he looked where the subject usually stood. Second, he realised she had gone home. Third, he realised she was not doing anything in that place. In this way, the police watcher prepared to go home when he realised there was nothing to watch. + +26. The tennis player prepared to have a conversation with someone. She did this by returning two returns in short succession. First, she hit the first ball. Second, she hit the second ball. Third, she relaxed. In this way, the tennis player prepared to have a conversation with someone by returning two returns in short succession. + +27. I prepared to give the speech. I did this by resting all night. 
First, I lay on the bed. Second, I rested. Third, I woke up in the morning. In this way, I prepared to give the speech by resting all night. + +28. I prepared to go to sleep. I did this by thinking of an A (a perfect thought) rather than the first ten mistakes when resting. First, I thought of the first breasoning for this (in other words, thinking of an algorithm involving the object relating to movement along a pathway in the same way that a philosophy professor would pull model eyes from a blue cloth in the same way that one would pull a swab out of a test tube to clean it, then breasoning out, or thinking of the object's x, y and z dimensions). Second, I prepared to think of the next breasoning. Third, I repeated this until ten breasonings had been thought of. In this way, I prepared to go to sleep by thinking of an A. + +29. The dancer prepared to attend the ball. She did this by making a coral necklace. First, she selected a piece of tubular coral and threaded it. Second, she prepared to select the next piece of coral. Third, she continued until enough pieces of coral had been threaded so that the necklace's length was covered in coral and the necklace was flexible. In this way, the dancer prepared to attend the ball by making a coral necklace. + +30. I prepared to make the necklace. I did this by polishing the pearl. First, I held the pearl to show the right latitude. Second, I held the pearl to show the right longitude. Third, I polished that region of the pearl, following which I repeated this for the next longitude and after this, the next latitude. 
In this way, I prepared to make the necklace by polishing the pearl."] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Heads of State 4 of 4.txt b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Heads of State 4 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..9800d39cfb16a9a70d11b8dbbaa03f5c289b5952 --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Heads of State 4 of 4.txt @@ -0,0 +1,29 @@ +["Green, L 2021, Heads of State 4 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF MEDITATION +by Lucian Green
Heads of State 4 of 4 + +31. I prepared to be a religious leader, surrounded with flowers. I did this by cutting a living flower from a plant. First, I selected a lotus flower. Second, I cut it from the plant. Third, I adorned the table with it. In this way, I prepared to be a religious leader, surrounded with flowers by cutting a living flower from a plant. + +32. I prepared to visit the king. I did this by reading the map. First, I found where I was departing from. Second, I found the destination. Third, I connected these points. In this way, I prepared to visit the king by reading the map. + +33. I prepared to be famous. I did this by eating the zucchini. First, I breasoned out the breasonings correctly from the start by switching them off. Second, I breasoned out 'I,' 'ate' and 'zucchini'. Third, I steamed and ate the zucchini. In this way, I prepared to be famous by eating the zucchini. + +34. I prepared to teach the tame baby budgie to talk. I did this by liking the budgerigar. First, I listened to the budgerigar. Second, I looked at the budgerigar. Third, I held the budgerigar. 
In this way, I prepared to teach the tame baby budgie to talk by liking the budgerigar. + +35. I prepared to record the galah call. I did this by patting the galah. First, I talked with the park ranger. Second, I asked if I could pat the galah. Third, I patted the galah. In this way, I prepared to record the galah call by patting the galah. + +36. I prepared to go on tour as a fashion designer. I did this by reattaching the model leg. First, I found the model leg. Second, I reattached it. Third, I put the model on display. In this way, I prepared to go on tour as a fashion designer by reattaching the model leg. + +37. I prepared to earn an A by repeating the medicine sutra twice. I did this by sealing the model capillary to stop infection. First, I cleared dirt from it. Second, I disinfected it. Third, I placed a band-aid on it. In this way, I prepared to earn an A by repeating the medicine sutra twice by sealing the model capillary to stop infection. + +38. I prepared to be famous. I did this by making it in music. First, I wrote songs. Second, I wrote As. Third, I performed at concerts. In this way, I prepared to be famous by making it in music. + +39. I prepared to enjoy subsidised accreditation. I did this by agreeing with the government. First, I read the government policy. Second, I verified that it was a good idea. Third, I agreed with it. In this way, I prepared to enjoy subsidised accreditation by agreeing with the government. + +40. I prepared to achieve my personal best. I did this by agreeing with royalty's system. First, I performed Salute to the Sun Yoga in the morning and Yoga Asanas twice per day (stretched). Second, I performed meditation (went for a walk). Third, I prevented a headache and mistakes by spiritually unscrewing a spiritual nut from a spiritual bolt and placing the potential headache and mistakes in a quantum box or saying a prayer (B) to prevent them. In this way, I prepared to achieve my personal best by agreeing with royalty's system. + +41. 
I prepared to start a vegetable farm. I did this by stating that I am a vegan. First, I joyously stated that I was a vegan. Second, I stated that this prevented global warming, land degradation, species extinction, deforestation, pollution, water scarcity and malnourishment in the developing world. Third, I stated that the world thanked me. In this way, I prepared to start a vegetable farm by stating that I am a vegan. + +42. I prepared to enjoy self-sufficiency. I did this by letting the prince do his own work. First, I observed him enroll in Creative Writing, Nietzsche and Education. Second, I observed him write his own arguments. Third, I observed him earn a job. In this way, I prepared to enjoy self-sufficiency by letting the prince do his own work. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Hours Prayer 1 of 4.txt b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Hours Prayer 1 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..d4a7840eb4b02dbf5928983cc565558188e7a26f --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Hours Prayer 1 of 4.txt @@ -0,0 +1,29 @@ +["Green, L 2021, Hours Prayer 1 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF MEDITATION +by Lucian Green
Hours Prayer 1 of 4 + + + +1. Meditation, like any system, must work around the clock, by use of the Hours prayer. Updates to mental imagery, thoughts and memory, as well as medical and other processes are completed. + +1a. I prepared to excavate the blog. I did this by writing the Press Release for Noam Chomsky's probable comments on the Medicine blog. First, I liked you. Second, I liked me. Third, I liked everyone. 
In this way, I prepared to excavate the blog by writing the Press Release for Noam Chomsky's probable comments on the Medicine blog. + +2. I prepared to eat out with Charlotte. I did this by writing Richard Rorty's probable comments on the Medicine blog. First, I liked Adrian. Second, I liked myself, Lucian. Third, I liked myself, Lucian, more. In this way, I prepared to eat out with Charlotte by writing Richard Rorty's probable comments on the Medicine blog. + +3. I prepared to like parliamentarians. I did this by writing the Press Release for Richard Rorty's probable comments on the Medicine blog. First, I like Munster. Second, I like Munery. Third, I like the church promenade. In this way, I prepared to like parliamentarians by writing the Press Release for Richard Rorty's probable comments on the Medicine blog. + +4. I prepared to see what was in addition to BestThinking.com. I did this by writing Richard Dawkins' probable comments on the Medicine blog. First, I like answering the scammer to tell them off. Second, I studied computer science already. Third, I made up a new premise. In this way, I prepared to see what was in addition to BestThinking.com by writing Richard Dawkins' probable comments on the Medicine blog. + +5. I prepared to eat vuckanova (sic). I did this by writing the Press Release for Richard Dawkins' probable comments on the Medicine blog. First, I wrote the book. Second, I wrote you a story. Third, I liked determinism, not luck. In this way, I prepared to eat vuckanova (sic) by writing the Press Release for Richard Dawkins' probable comments on the Medicine blog. + +6. I prepared to endorse the endometrium. I did this by writing Michel Onfray's probable comments on the Medicine blog. First, I deterred the poltergeist from eating feces himself. Second, I noted light speed travel was currently impossible. Third, I liked light. In this way, I prepared to endorse the endometrium by writing Michel Onfray's probable comments on the Medicine blog. 
+ +7. I prepared to endorse Malcolm Turnbull. I did this by writing the Press Release for Michel Onfray's probable comments on the Medicine blog. First, I liked Malcolm Turnbull. Second, I ate with the solar system. Third, I loved light. In this way, I prepared to endorse Malcolm Turnbull by writing the Press Release for Michel Onfray's probable comments on the Medicine blog. + +8. I prepared to endorse Nietzsche's brilliance. I did this by writing Alexius Meinong's probable comments on the Medicine blog. First, I called it Anarchy 3. Second, I liked brilliance. Third, I liked Nietzsche's brilliance. In this way, I prepared to endorse Nietzsche's brilliance by writing Alexius Meinong's probable comments on the Medicine blog. + +9. I prepared to help out at Christmas. I did this by writing the Press Release for Alexius Meinong's probable comments on the Medicine blog. First, I endorsed the slave. Second, I endorsed the Masters. Third, I loved life. In this way, I prepared to help out at Christmas by writing the Press Release for Alexius Meinong's probable comments on the Medicine blog. + +10. I prepared to dice with the devil. I did this by writing Martha Nussbaum's probable comments on the Medicine blog. First, I cheered the lady up. Second, I helped her to Medicine. Third, I loved her. In this way, I prepared to dice with the devil by writing Martha Nussbaum's probable comments on the Medicine blog. 
+ +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Hours Prayer 2 of 4.txt b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Hours Prayer 2 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..0a5503aeba8516cb7b135ca29aa4d64963c68cf7 --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Hours Prayer 2 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Hours Prayer 2 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF MEDITATION +by Lucian Green +Hours Prayer 2 of 4 + +11. I prepared to take care of the ducklings. I did this by writing the Press Release for Martha Nussbaum's probable comments on the Medicine blog. First, I ate the seed baum. Second, I noticed it killed the duckling. Third, I put it away. In this way, I prepared to take care of the ducklings by writing the Press Release for Martha Nussbaum's probable comments on the Medicine blog. + +12. I prepared to eat out with his smock on. I did this by writing Noam Chomsky's probable comments on the Medicine indicators. First, I loved licky. Second, I loved you. Third, I loved serengitis (sic). In this way, I prepared to eat out with his smock on by writing Noam Chomsky's probable comments on the Medicine indicators. + +13. I prepared to write I liked the lack of headaches on trains from yoga. I did this by writing the Press Release for Noam Chomsky's probable comments on the Medicine indicators. First, I wrote I liked the babies the best. Second, I wrote I accepted Berocca. Third, I wrote I loved the lack of headaches. In this way, I prepared to write I liked the lack of headaches on trains from yoga by writing the Press Release for Noam Chomsky's probable comments on the Medicine indicators. + +14. 
I prepared to love Richard Rorty. I did this by writing Richard Rorty's probable comments on the Medicine indicators. First, I turned his card over. Second, I found him dishevelled. Third, I dropped his hand. In this way, I prepared to love Richard Rorty by writing Richard Rorty's probable comments on the Medicine indicators. + +15. I prepared to write that I would have preferred to have known about Lucianic Medicine and the indicators it would have helped. I did this by writing the Press Release for Richard Rorty's probable comments on the Medicine indicators. First, I held Richard's hand. Second, I kissed it. Third, I wrote I liked all the medicine indicators. In this way, I prepared to write that I would have preferred to have known about Lucianic Medicine and the indicators it would have helped by writing the Press Release for Richard Rorty's probable comments on the Medicine indicators. + +16. I prepared to write 'Who's that?' I did this by writing Richard Dawkins' probable comments on the Medicine indicators. First, I wrote I asked what the point of the medicine indicators is. Second, I wrote I agreed with them. Third, I wrote I indicated them myself. In this way, I prepared to write 'Who's that?' by writing Richard Dawkins' probable comments on the Medicine indicators. + +17. I prepared to dot on sorry to the Head of State before any mistakes to avoid the headache from a tank spiritually running over my head. I did this by writing the Press Release for Richard Dawkins' probable comments on the Medicine indicators. First, I wrote heaps. Second, I wrote many. Third, I wrote more. In this way, I prepared to dot on sorry to the Head of State before any mistakes to avoid the headache from a tank spiritually running over my head by writing the Press Release for Richard Dawkins' probable comments on the Medicine indicators. + +18. I prepared to ignore Nietzsche. I did this by writing Michel Onfray's probable comments on the Medicine indicators. 
First, I mopped up the rest of the headaches, mistakes, multiple breasonings turned on still and muscular aches by dotting on sorry to the Head of State before any mistakes to avoid the headache from a tank spiritually running over my head. Second, I passed people. Third, I became Head of State. In this way, I prepared to ignore Nietzsche by writing Michel Onfray's probable comments on the Medicine indicators. + +19. I prepared to love God (the master). I did this by writing the Press Release for Michel Onfray's probable comments on the Medicine indicators. First, I got dressed. Second, I got in the car. Third, I drove off. In this way, I prepared to love God (the master) by writing the Press Release for Michel Onfray's probable comments on the Medicine indicators. + +20. I prepared to write I loved Meinong and was famous. I did this by writing Alexius Meinong's probable comments on the Medicine indicators. First, I took no notice. Second, I liked his estates. Third, I liked him a little. In this way, I prepared to write I loved Meinong and was famous by writing Alexius Meinong's probable comments on the Medicine indicators. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Hours Prayer 3 of 4.txt b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Hours Prayer 3 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..35c0b93335a342e8defe63993e5aa2c0989ddbaa --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Hours Prayer 3 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Hours Prayer 3 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF MEDITATION +by Lucian Green +Hours Prayer 3 of 4 + +21. I prepared to become a transsexual. 
I did this by writing the Press Release for Alexius Meinong's probable comments on the Medicine indicators. First, I had make-up applied. Second, I put a frock on. Third, I put panty-hose on. In this way, I prepared to become a transsexual by writing the Press Release for Alexius Meinong's probable comments on the Medicine indicators. + +22. I prepared to listen to combinations of music. I did this by writing Martha Nussbaum's probable comments on the Medicine indicators. First, I recommended meditation for royal-minded people. Second, I delighted people. Third, I lived and let live. In this way, I prepared to listen to combinations of music by writing Martha Nussbaum's probable comments on the Medicine indicators. + +23. I prepared to finish my degree by studying it full-time. I did this by writing the Press Release for Martha Nussbaum's probable comments on the Medicine indicators. First, I experienced the cool change. Second, I noticed Oprah. Third, I rode my way to posterity. In this way, I prepared to finish my degree by studying it full-time by writing the Press Release for Martha Nussbaum's probable comments on the Medicine indicators. + +24. I prepared to read it. I did this by writing Noam Chomsky's probable comments on Medicine on Lucianpedia. First, I wrote 50 1 breasoning As to attend a pop concert. Second, I wrote 50 1 breasoning As to write a pop song. Third, I received 50 As from the Head of State for these. In this way, I prepared to read it by writing Noam Chomsky's probable comments on Medicine on Lucianpedia. + +25. I prepared to find out about Adam Levine as well. I did this by writing the Press Release for Noam Chomsky's probable comments on Medicine on Lucianpedia. First, I wrote I liked Medicine. Second, I liked the other. Third, I liked the self. In this way, I prepared to find out about Adam Levine as well by writing the Press Release for Noam Chomsky's probable comments on Medicine on Lucianpedia. + +26. 
I prepared to design my own comments. I did this by writing Richard Rorty's probable comments on Medicine on Lucianpedia. First, I wrote I loved Anarchy. Second, I loved you. Third, I loved myself. In this way, I prepared to design my own comments by writing Richard Rorty's probable comments on Medicine on Lucianpedia. + +27. I prepared to avoid Harry. I did this by writing the Press Release for Richard Rorty's probable comments on Medicine on Lucianpedia. First, I liked John. Second, I liked Dick. Third, I liked you to Harry. In this way, I prepared to avoid Harry by writing the Press Release for Richard Rorty's probable comments on Medicine on Lucianpedia. + +28. I prepared to dance the moonwalk. I did this by writing Richard Dawkins' probable comments on Medicine on Lucianpedia. First, I thought of the cow (mutating). Second, I loved Gene. Third, I ate the bean. In this way, I prepared to dance the moonwalk by writing Richard Dawkins' probable comments on Medicine on Lucianpedia. + +29. I prepared to love the lady from the Masters ceremony as well. I did this by writing the Press Release for Richard Dawkins' probable comments on Medicine on Lucianpedia. First, I wrote the press release. Second, I protected his feelings. Third, I loved Adrian. In this way, I prepared to love the lady from the Masters ceremony as well by writing the Press Release for Richard Dawkins' probable comments on Medicine on Lucianpedia. + +30. I prepared to help Adrian's spout out a little. I did this by writing Michel Onfray's probable comments on Medicine on Lucianpedia. First, I liked Medicine on Lucianpedia. Second, I gave a pet spider to Adrian. Third, I watched it run up the spout. In this way, I prepared to help Adrian's spout out a little by writing Michel Onfray's probable comments on Medicine on Lucianpedia. 
+ +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Hours Prayer 4 of 4.txt b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Hours Prayer 4 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..62b3cea8ee735d780ddd5947e5d6edee20ed2e71 --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Hours Prayer 4 of 4.txt @@ -0,0 +1,31 @@ +["Green, L 2021, Hours Prayer 4 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF MEDITATION +by Lucian Green +Hours Prayer 4 of 4 + +31. I prepared to go to heaven (the bed after the concert). I did this by writing the Press Release for Michel Onfray's probable comments on Medicine on Lucianpedia. First, I listened to the singer. Second, I proved it was plentiful. Third, I had fun. In this way, I prepared to go to heaven (the bed after the concert) by writing the Press Release for Michel Onfray's probable comments on Medicine on Lucianpedia. + +32. I prepared to think of the last thing you thought of, following which I apologised before a mistake to the Head of State. I did this by writing Alexius Meinong's probable comments on Medicine on Lucianpedia. First, I liked being healthy. Second, I noticed there were plenty of sexual ideas but no one was interested in them because they weren't turned on. Third, I thought they were cute. In this way, I prepared to think of the last thing you thought of, following which I apologised before a mistake to the Head of State by writing Alexius Meinong's probable comments on Medicine on Lucianpedia. + +33. I prepared to be off with them. I did this by writing the Press Release for Alexius Meinong's probable comments on Medicine on Lucianpedia. First, I started the ignition and took off. 
Second, I realised M was in front of the queen. Third, I admired Q's headpiece. In this way, I prepared to be off with them by writing the Press Release for Alexius Meinong's probable comments on Medicine on Lucianpedia. + +34. I prepared to love yachts. I did this by writing Martha Nussbaum's probable comments on Medicine on Lucianpedia. First, I wrote my music sounded the same as the radio in stereo. Second, I put the pads on hard left and hard right. Third, I was inspired by the song format of Strawberry Shortcake. In this way, I prepared to love yachts by writing Martha Nussbaum's probable comments on Medicine on Lucianpedia. + +35. I prepared to apologise. I did this by writing the Press Release for Martha Nussbaum's probable comments on Medicine on Lucianpedia. First, I loved the Peacocks. Second, I nuzzled the horse. Third, I loved them. In this way, I prepared to apologise by writing the Press Release for Martha Nussbaum's probable comments on Medicine on Lucianpedia. + +36. I prepared to breason it out. I did this by writing Noam Chomsky's probable comments on the Meditation essays. First, I rambada-ed. Second, I stopped hallucinations. Third, I noticed what happened on the harpsichord. In this way, I prepared to breason it out by writing Noam Chomsky's probable comments on the Meditation essays. + +37. I prepared to not remember it that way. I did this by writing the Press Release for Noam Chomsky's probable comments on the Meditation essays. First, I helped the rambada. Second, I helped sectichords (sic). Third, I interpreted what I wanted to and avoided the small duckling scratch. In this way, I prepared to not remember it that way by writing the Press Release for Noam Chomsky's probable comments on the Meditation essays. + +38. I prepared to hum silently to myself. I did this by writing Richard Rorty's probable comments on the Meditation essays. First, I helped harpsichords. Second, I donated sperm whales. Third, I invitralised myself. 
In this way, I prepared to hum silently to myself by writing Richard Rorty's probable comments on the Meditation essays. + +39. I prepared to say 'And I moved on'. I did this by writing the Press Release for Richard Rorty's probable comments on the Meditation essays. First, I thought Ben was a good lecturer. Second, I liked Ben. Third, I thought it was very good. In this way, I prepared to say 'And I moved on' by writing the Press Release for Richard Rorty's probable comments on the Meditation essays. + +40. I prepared to breason out my Masters. I did this by writing Richard Dawkins' probable comments on the Meditation essays. First, I collected the comment on the comment. Second, I repeated this. Third, I repeated this until I had enough for my Masters. In this way, I prepared to breason out my Masters by writing Richard Dawkins' probable comments on the Meditation essays. + +41. I prepared to have secondary texts for my essays written in different languages. I did this by writing the Press Release for Richard Dawkins' probable comments on the Meditation essays. First, I organised to have algorithms written for my chapters. Second, I organised to have essay arguments written for my chapters. Third, I organised to have secondary texts written for my essays. In this way, I prepared to have secondary texts for my essays written in different languages by writing the Press Release for Richard Dawkins' probable comments on the Meditation essays. + +42. I prepared to study Education at University. I did this by writing Michel Onfray's probable comments on the Meditation essays. First, I wrote on essays. Second, I wrote on specialisms. Third, I helped disabled students. In this way, I prepared to study Education at University by writing Michel Onfray's probable comments on the Meditation essays. + +50 Breasonings Per Utterance + +1. 
"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Lucian Mantra (Pure Form) 1 of 4.txt b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Lucian Mantra (Pure Form) 1 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..5037384ac56499e6fcecd2a054ceb310b8edff1a --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Lucian Mantra (Pure Form) 1 of 4.txt @@ -0,0 +1,29 @@ +["Green, L 2021, Lucian Mantra (Pure Form) 1 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF MEDITATION +by Lucian Green +Lucian Mantra (Pure Form) 1 of 4 + + +1. The Lucian mantra helps the meditator build confidence in meditation while offering the fundamental skill of meditation. Specifically, it allows more advanced meditators to pedagogically interpret the meditator, benefitting the advanced meditator with writing his or her own arguments. Also, the teacher helps the mantra-only meditator with business forms of medicine, pedagogy, Computational English (featuring creative writing, which, in combination with pedagogy, gives the meditator the confidence to write pedagogical arguments, the basis of A grade essays, which are tokens of professionalism, at school and University) and a pedagogy practicum, and other areas of study, especially while participating in the meditation community. + + +1a. I prepared to study a postgraduate qualification. I did this by licking the cream up, in other words graduating from the degree. First, I placed my tongue perpendicular to the top of the head of the spoon. Second, I moved my tongue forward, scooping the cream from the spoon. Third, I stopped when I had reached the base of the spoon head. 
In this way, I prepared to study a postgraduate qualification by licking the cream up, in other words graduating from the degree. + +2. I prepared to find out about the postgraduate opportunities. I did this by lapping each particle up, in other words, researching the postgraduate programs. First, I lifted the bowl. Second, I tilted and drank the milk from it. Third, I wiped my lip with a face cloth. In this way, I prepared to find out about the postgraduate opportunities by lapping each particle up, in other words, researching the postgraduate programs. + +3. I prepared to examine the rest of the data about the animals. I did this by pumping up the bicycle tyre, like solving a problem in the degree, e.g. calculating which direction the animal was travelling in. First, I examined the food that the deer ate. Second, I examined where the faeces containing traces of this food were. Third, I calculated the vector the deer was travelling along. In this way, I prepared to examine the rest of the data about the animals by pumping up the bicycle tyre, like solving a problem in the degree. + +4. I prepared to love meditation by performing the puja ceremony during the meditation degree. I did this by breasoning out the meditation thoughts' breasoning lists. First, I read the object's name and breasoned it out (thought of the object's x, y and z dimensions). Second, I repeated this for the rest of the breasonings in that list. Third, I repeated this for the rest of the breasoning lists. In this way, I prepared to love meditation by performing the puja ceremony during the meditation degree by breasoning out the meditation thoughts' breasoning lists. + +5. I prepared to love you to make you happy. I did this by shaking hands with you. First, I watched you lift your hand. Second, I lifted my hand. Third, I shook your hand. In this way, I prepared to love you to make you happy by shaking hands with you. + +6. I prepared to love everyone else to be friendly. 
I did this by emptying the pencil shavings into the mini-bin. First, I chose the red pencil. Second, I rotated it in the pencil sharpener. Third, I opened the mini-bin and placed the pencil shavings inside it. In this way, I prepared to love everyone else to be friendly by emptying the pencil shavings into the mini-bin. + +7. I prepared to attend class. I did this by bouncing the rubber ball. First, I bounced the ball in the first position. Second, I prepared to bounce the ball again. Third, I repeated this until I had bounced the ball in each position. In this way, I prepared to attend class by bouncing the rubber ball. + +8. I prepared to love you. I did this by blowing up the balloon. First, I attached the balloon's neck to the helium tube. Second, I released the helium tap. Third, I stopped the tap when the balloon had filled with helium. In this way, I prepared to love you by blowing up the balloon. + +9. I prepared to love myself by helping you. I did this by taking the burger from you. First, I lifted my finger. Second, a burger was placed on it. Third, I pulled my finger down. In this way, I prepared to love myself by helping you by taking the burger from you. + +10. The Lucianic Meditator prepared to prevent cancer. She did this by increasing the other religion's meditation technique to do this. First, she rewrote the Lucianic Meditation scriptures. Second, she breasoned these out. Third, she repeated 80 Lucian mantras and 80 green sutras per day to access the knowledge for her current and future monastics and followers. In this way, the Lucianic Meditator prepared to prevent cancer by increasing the other religion's meditation technique to do this. 
+ +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Lucian Mantra (Pure Form) 2 of 4.txt b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Lucian Mantra (Pure Form) 2 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..66d5152531e4afb3dbfac9fae74cf68412c2cd9f --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Lucian Mantra (Pure Form) 2 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Lucian Mantra (Pure Form) 2 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF MEDITATION +by Lucian Green +Lucian Mantra (Pure Form) 2 of 4 + +11. I prepared to enroll in a new subject. I did this by disembarking from the bus. First, I waited until the bus had stopped. Second, I walked to the start of the aisle. Third, I stepped onto the footpath. In this way, I prepared to enroll in a new subject by disembarking from the bus. + +12. I prepared to complete the area of study. I did this by placing intelligence objects end to end. First, I placed the pie on the plate. Second, I placed the slice of pear on the pie. Third, I placed the apple slice on the slice of pear. In this way, I prepared to complete the area of study by placing intelligence objects end to end. + +13. I prepared for my brain to do nothing. I did this by knowing my heart was loved. First, I asked for a hug from my mother. Second, my mother hugged me. Third, my heart rate decreased. In this way, I prepared for my brain to do nothing by knowing my heart was loved. + +14. I prepared to write 'thank you'. I did this by writing it in the brain-pixel font, in other words, filling in pixels making up text. First, I peeled off the brain sticker from the backing sheet. Second, I applied the sticker to the square. 
Third, I continued doing this until the brain banner was complete. In this way, I prepared to write 'thank you' by writing it in the brain-pixel font, in other words, filling in pixels making up text. + +15. I prepared to have enough heat on my sore back to make it better. I did this by heating the rock in the fire to warm up a cloth to rub on it. First, I rubbed sticks together to make a spark. Second, I placed kindling on the fire. Third, I warmed up the rock, which I warmed the cloth with. In this way, I prepared to have enough heat on my sore back to make it better by heating the rock in the fire to warm up a cloth to rub on it. + +16. I prepared to write the song 'Like a Husky'. I did this by identifying who had blue eyes. First, I found the wavelength of the man's eyes. Second, I checked whether it was 475 nanometres. Third, I listed the names of the men. In this way, I prepared to write the song 'Like a Husky' by identifying who had blue eyes. + +17. I prepared to quench my thirst. I did this by drinking the triculating water. First, I poked my tongue out. Second, I let the drop of water roll onto it. Third, I swallowed it. In this way, I prepared to quench my thirst by drinking the triculating water. + +18. I prepared to eat ice cream. I did this by pouring the waffle mixture into the mold. First, I mixed the waffle mixture. Second, I poured it into the funnel. Third, I made and cooked the waffle. In this way, I prepared to eat ice cream by pouring the waffle mixture into the mold. + +19. I prepared to behave holily. I did this by eating the whole apple. First, I cut the apple horizontally. Second, I cut the apple vertically. Third, I cut the apple lengthways. In this way, I prepared to behave holily by eating the whole apple. + +20. I prepared to give the king the star. I did this by accepting the serotonin from the king. First, the king gave me the strawberry milkshake. Second, he gave me the straw. Third, I drank the strawberry milkshake using the straw. 
In this way, I prepared to give the king the star by accepting the serotonin from the king. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Lucian Mantra (Pure Form) 3 of 4.txt b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Lucian Mantra (Pure Form) 3 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..cf4be022d227a1ba04142771542b697da72be430 --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Lucian Mantra (Pure Form) 3 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Lucian Mantra (Pure Form) 3 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF MEDITATION +by Lucian Green +Lucian Mantra (Pure Form) 3 of 4 + +21. I prepared for the king to give me the serotonin feeling. I did this by saying the star was at university. First, the student studied theatre studies. Second, he breasoned out 50 As. Third, he earned the main role in the play. In this way, I prepared for the king to give me the serotonin feeling by saying the star was at university. + +22. I prepared to be happy. I did this by having happy thoughts dropped into my mind. First, I made a model theatre stage. Second, I placed a star on the backdrop. Third, I placed a heart on the backdrop and a calf on stage. In this way, I prepared to be happy by having happy thoughts dropped into my mind. + +23. I prepared to be relaxed. I did this by having relaxed feelings dropped into my body. First, I lay on the bed. Second, I dropped slowness into my organs. Third, I dropped softness into my limbs and silence into my senses. In this way, I prepared to be relaxed by having relaxed feelings dropped into my body. + +24. I prepared to meet the professor from the independent school. 
I did this by having a discussion with him, during which he realised that I was the writer of 'Pedagogy' (or 'H1' or 'On Pedagogy'). First, I listed my works. Second, I included the title 'Pedagogy' in the list. Third, he recognised the title 'Pedagogy' as my work. In this way, I prepared to meet the professor from the independent school by having a discussion with him, during which he realised that I was the writer of 'Pedagogy'. + +25. I prepared to become a pop singer. I did this by singing famous thoughts I was given to a microphone. First, I was given the thought 'La la di'. Second, I played the guitar. Third, I sang 'La la di' to the microphone. In this way, I prepared to become a pop singer by singing famous thoughts I was given to a microphone. + +26. I prepared to repeat the mantra. I did this by doing push-ups on the mat. First, I placed the mat on the floor. Second, I lay on the mat with my hands pointing forwards on the mat flat next to my shoulders. Third, I slowly performed one push-up. In this way, I prepared to repeat the mantra by doing push-ups on the mat. + +27. I prepared to enjoy clarity with the Queen. I did this by drinking the red grape juice. First, I placed the plastic wine glass on the plastic table. Second, I poured red grape juice into the glass. Third, I sipped the juice from that glass. In this way, I prepared to enjoy clarity with the Queen by drinking the red grape juice. + +28. I prepared to establish the new centre. I did this by letting the meditation student come to me. First, I made cultural and linguistic adjustments to the texts. Second, I checked how much money was needed for the schools, university and meditation centres. Third, I let the local people know, performed breasoning of a text each day from the advertisement to the class, and on the day some students came to me. In this way, I prepared to establish the new centre by letting the meditation student come to me. + +29. I prepared to examine an object. 
I did this by swinging a pendulum. First, I picked up the pendulum by its handle. Second, I lifted the pendulum in the air so that it hung down. Third, I gently swung the pendulum. In this way, I prepared to examine an object by swinging a pendulum. + +30. I prepared to test how large the solar system was. I did this by making a model of the sun with string rays. First, I placed the yellow ball on the small stand. Second, I dispensed and detached 0.02 metres of transparent adhesive tape. Third, I attached a 0.2 metre strand of yellow wool to the yellow ball with the tape, then I repeated the final two steps until 6 rays had been attached to the sun. In this way, I prepared to test how large the solar system was by making a model of the sun with string rays. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Lucian Mantra (Pure Form) 4 of 4.txt b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Lucian Mantra (Pure Form) 4 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..a2acbdc849074f29ea3b8aaaadd7b227f974ac9d --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Lucian Mantra (Pure Form) 4 of 4.txt @@ -0,0 +1,29 @@ +["Green, L 2021, Lucian Mantra (Pure Form) 4 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF MEDITATION +by Lucian Green +Lucian Mantra (Pure Form) 4 of 4 + +31. I prepared to make sure that I didn't travel more than 25% away from the centre of the lane. I did this by verifying that the safety margin was 25%. First, I calculated that the side of the lane was 2 metres wide. 
Second, I calculated that half of the car's width, which was 1 metre, would travel 1 metre towards the edge of the lane, from being 1 metre away from it if there was a safety margin of 25% (1 metre out of the 4 metre-wide lane should be travelled through, from the centre of the lane). Third, I calculated that the edge of the car would touch the edge of the lane in this case, which would be acceptable. In this way, I prepared to make sure that I didn't travel more than 25% away from the centre of the lane by verifying that the safety margin was 25%. + +32. I prepared to define a domain in an area of study in terms of an alphabet. I did this by computing a property in terms of variables about these alphabet letters. First, I counted the first letter, in other words, the enumerator. Second, I prepared to count the second enumerator. Third, I repeated this until all the enumerators had been counted. In this way, I prepared to define a domain in an area of study in terms of an alphabet by computing a property in terms of variables about these alphabet letters. + +33. I prepared to become a breasoner. I did this by earning A grade in the respiratory practical. First, I read the assignment. Second, I read the book and conferred with a doctor. Third, I answered the questions. In this way, I prepared to become a breasoner by earning A grade in the respiratory practical. + +34. I prepared to take care of pet tadpoles. I did this by filling the tadpole tank with water. First, I filled the jug with water. Second, I began filling the tadpole tank with water. Third, I completed filling the tadpole tank with water. In this way, I prepared to take care of pet tadpoles by filling the tadpole tank with water. + +35. I prepared to teach my students all my works. I did this by continuously improving with new content. First, I wrote and wrote an examination of the first book. Second, I prepared to write and write an examination of the next book. 
Third, I repeated this until I had written and written an examination of each book. In this way, I prepared to teach my students all my works by continuously improving with new content. + +36. I prepared to run my algorithm. I did this by stating that my algorithms were composed of algorithms. First, I wrote what was interesting about the aim of the algorithm. Second, I wrote the child predicates. Third, I wrote the parent predicates. In this way, I prepared to run my algorithm by stating that my algorithms were composed of algorithms. + +37. I prepared to keep spirits up. I did this by deflecting blame from (keep compliments for) the prince. First, I identified the blaming statement (compliment). Second, I decided to deflect (keep) the blaming statement (compliment) from (for) the prince. Third, I chose not to give (give) the blaming statement (compliment) to the prince. In this way, I prepared to keep spirits up by deflecting blame from (keep compliments for) the prince. + +38. I prepared to pass the individual's meditation session. I did this by liking breasonings by clicking and counting mantra utterances. First, I liked the first fifty breasonings by clicking and counting the first mantra utterance. Second, I prepared to like the next fifty breasonings by clicking and counting the next mantra utterance. Third, I repeated this until I had liked each group of fifty breasonings by clicking and counting 80 mantra utterances per day. In this way, I prepared to pass the individual's meditation session by liking breasonings by clicking and counting mantra utterances. + +39. I prepared to make a million dollars. I did this by repeating the Lucian mantra for twenty minutes twice per day. First, I repeated the Lucian mantra for the first minute. Second, what I was thinking was found out. Third, I played with tel-dy (sic). In this way, I prepared to make a million dollars by repeating the Lucian mantra for twenty minutes twice per day. + +40. 
I prepared to switch off the screen and avoid not having high quality of life. I did this by stating that the idea was turned off like electrons in the spiritual maracas. First, I prepared to come in to shake the maracas on the beat. Second, I shook the maracas on the beat. Third, I finished shaking the maracas. In this way, I prepared to switch off the screen and avoid not having high quality of life by stating that the idea was turned off like electrons in the spiritual maracas. + +41. I prepared to laugh as yoga for preventing depression. I did this by clicking not to be depressed. First, I held the mouse. Second, I clicked its button. Third, I was not depressed. In this way, I prepared to laugh as yoga for preventing depression by clicking not to be depressed. + +42. I prepared to complete my PhD. I did this by verifying the statement in meditation (philosophy). First, I wrote the statement. Second, I verified it. Third, I applied the philosophy to different philosophical departments. In this way, I prepared to complete my PhD by verifying the statement in meditation (philosophy). + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Lucian Mantra (Sun Safety) 1 of 4.txt b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Lucian Mantra (Sun Safety) 1 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..c95e62e6091d32f0a19f8f519b0856921a87bce8 --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Lucian Mantra (Sun Safety) 1 of 4.txt @@ -0,0 +1,29 @@ +["Green, L 2021, Lucian Mantra (Sun Safety) 1 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF MEDITATION +by Lucian Green +Lucian Mantra (Sun Safety) 1 of 4 + + +1. 
This argument for the Lucian mantra describes sun safety. + + +1a. I prepared to love the sun. I did this by sitting in the sun outside 11:00 AM - 3:00 PM. First, I sat on a hammock in the sun from 10:00 - 11:00 AM. Second, I played cricket from 3:00 - 4:00 PM. Third, I had afternoon tea from 5:00 - 6:00 PM. In this way, I prepared to love the sun by sitting in the sun outside 11:00 AM - 3:00 PM. + +2. I prepared to enjoy the sun. I did this by sitting in the sun when it was overcast. First, I waited until the sun had gone behind a cloud. Second, I sat on a beach chair. Third, I went inside when the sun came out from behind a cloud. In this way, I prepared to enjoy the sun by sitting in the sun when it was overcast. + +3. I prepared to make an Aboriginal humpy. I did this by sitting under the tarpaulin. First, I placed rocks on two corners of the tarpaulin. Second, I tied two corners of the tarpaulin to rods in the ground. Third, I sat under the tarpaulin. In this way, I prepared to make an Aboriginal humpy by sitting under the tarpaulin. + +4. I prepared to sleep until I had had enough sleep. I did this by sleeping under the beach umbrella. First, I placed the beach umbrella in the stand. Second, I observed where its shadow was. Third, I slept in this place. In this way, I prepared to sleep until I had had enough sleep by sleeping under the beach umbrella. + +5. I prepared to walk around in the maze. I did this by putting on a sun hat. First, I attached corks to strings. Second, I attached the strings to the hat. Third, I placed the hat on my head. In this way, I prepared to walk around in the maze by putting on a sun hat. + +6. I prepared to write about hats. I did this by wearing a long-sleeved sun shirt. First, I cut out the material. Second, I sewed the sides together. Third, I put it on. In this way, I prepared to write about hats by wearing a long-sleeved sun shirt. + +7. I prepared to walk around the hill. I did this by wearing the long sun pants. 
First, I unrolled them. Second, I ironed them. Third, I put them on. In this way, I prepared to walk around the hill by wearing the long sun pants. + +8. I prepared to do some gardening. I did this by sliding on the sun gloves. First, I turned them right side out. Second, I put my fingertips inside the palm of the glove. Third, I pulled the gloves so my fingers went into the finger holes. In this way, I prepared to do some gardening by sliding on the sun gloves. + +9. I prepared to walk along the esplanade. I did this by sliding into the sun shoes. First, I opened the heel. Second, I placed my toe into the heel. Third, I pulled the shoe onto my foot. In this way, I prepared to walk along the esplanade by sliding into the sun shoes. + +10. I prepared to walk in the park after 3:00 PM. I did this by wearing sunscreen. First, I found the sunscreen container. Second, I tipped the bottle. Third, I dropped some on my hand and applied it to the exposed parts of my skin. In this way, I prepared to walk in the park after 3:00 PM by wearing sunscreen. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Lucian Mantra (Sun Safety) 2 of 4.txt b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Lucian Mantra (Sun Safety) 2 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..378204562b3f355836a5fbc45df59dd967779a83 --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Lucian Mantra (Sun Safety) 2 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Lucian Mantra (Sun Safety) 2 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF MEDITATION +by Lucian Green +Lucian Mantra (Sun Safety) 2 of 4 + +11. I prepared to film a movie next to a church. 
I did this by placing sunglasses on my face. First, I opened the sunglasses' left arm. Second, I opened the sunglasses' right arm. Third, I placed the sunglasses' arms on my ears. In this way, I prepared to film a movie next to a church by placing sunglasses on my face. + +12. I prepared to prevent ultraviolet radiation burning my skin after being reflected from the water. I did this by applying sun-spray to my skin. First, I applied sun-spray to my head. Second, I applied sun-spray to my chest and back. Third, I applied sun-spray to my limbs. In this way, I prepared to prevent ultraviolet radiation burning my skin after being reflected from the water by applying sun-spray to my skin. + +13. I prepared to prevent heat reflected from the sand irritating my skin. I did this by applying spray-on sunscreen. First, I sprayed the soles of my feet. Second, I sprayed the palms of my hands. Third, I sprayed the back of my neck. In this way, I prepared to prevent heat reflected from the sand irritating my skin by applying spray-on sunscreen. + +14. I prepared to prevent light reflecting on me from the models of the two cities. I did this by applying sun-gel. First, I squeezed out some sun-gel onto my hand. Second, I placed it on my forehead. Third, I massaged it onto my forehead's skin. In this way, I prepared to prevent light reflecting on me from the models of the two cities by applying sun-gel. + +15. The builder prepared to shield the employees from the sun. She did this by erecting a marquee at the building site. First, she unfolded it. Second, she pulled out its arms. Third, she placed it in the courtyard. In this way, the builder prepared to shield the employees from the sun by erecting a marquee at the building site. + +16. The teacher prepared to play chess. He did this by telling the students to stay under trees. First, he skipped to the first tree. Second, he stayed underneath it. Third, the students joined him. 
In this way, the teacher prepared to play chess by telling the students to stay under trees. + +17. The girl's mother prepared to protect her daughter's skin from burning. She did this by teaching her to stay inside when the ultraviolet (UV) index was high. First, she read the UV index. Second, she verified that it was high. Third, she asked her daughter to stay inside. In this way, the girl's mother prepared to protect her daughter's skin from burning by teaching her to stay inside when the ultraviolet (UV) index was high. + +18. The carer prepared to push the lady's wheelchair. She did this by placing her in the shade. First, she lifted the lady into the wheelchair. Second, she wheeled the wheelchair along the path. Third, she stopped when they reached the shade. In this way, the carer prepared to push the lady's wheelchair by placing her in the shade. + +19. The elder prepared to quench everyone's thirst. She did this by distributing water into bottles. First, she poured water into the first bottle. Second, she put the lid onto the bottle. Third, she put it into the cool box. In this way, she prepared to quench everyone's thirst by distributing water into bottles. + +20. I prepared to spend morning tea outside from 9:45 AM - 10:00 AM, when the sun was low in the sky. I did this by eating an apple. First, I opened my lunch box. Second, I took out the apple. Third, I cut it into cubes and ate it. In this way, I prepared to spend morning tea outside by eating an apple. 
+ +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Lucian Mantra (Sun Safety) 3 of 4.txt b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Lucian Mantra (Sun Safety) 3 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..8f376e867de76a61609817d0215419303378f5a5 --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Lucian Mantra (Sun Safety) 3 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Lucian Mantra (Sun Safety) 3 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF MEDITATION +by Lucian Green +Lucian Mantra (Sun Safety) 3 of 4 + +21. I prepared to relish my lunch from 3:00 PM - 3:30 PM, when the sun was low in the sky. I did this by eating the tomato sandwich. First, I looked into my paper bag. Second, I unwrapped the sandwich. Third, I savoured (ate) it. In this way, I prepared to relish my lunch from 3:00 PM - 3:30 PM, by eating the tomato sandwich. + +22. The child prepared to have fun at times other than during the 10:00 AM - 2:00 PM period. He did this by sliding down the slide backwards. First, he climbed up the slide. Second, he turned over. Third, he released the sides of the slide when at the top of the slide to slide down the slide backwards. In this way, the child prepared to have fun by sliding down the slide backwards. + +23. I prepared to participate in the community recreation from 8:00 AM - 10:00 AM, when the sun was not so high in the sky. I did this by running in the scheduled run. First, I got dressed in running clothes. Second, I walked to the starting line. Third, I jogged along the course. In this way, I prepared to participate in the community recreation by running in the scheduled run. + +24. The designer prepared to organise a feast. 
He did this by designing a shaded area in the community garden. First, he designed four vertical beams at one end of the garden. Second, he designed a pergola above the beams. Third, he designed a hermaphroditic (fruiting) grape vine at the base of each of the posts. In this way, the designer prepared to organise a feast by designing a shaded area in the community garden. + +25. The boy prepared to watch the regatta. He did this by holding a sunshade. First, he pulled the ring off the closed sunshade. Second, he unfolded the sunshade. Third, he held it above his head. In this way, he prepared to watch the regatta by holding a sunshade. + +26. The student prepared to conduct an excursion. She did this by applying the zinc cream to her lips. First, she uncapped the zinc cream. Second, she applied it to her finger. Third, she applied the zinc cream on her finger to her lips. In this way, she prepared to conduct an excursion by applying the zinc cream to her lips. + +27. I prepared to weave the fabric. I did this by verifying that the fabric had an Ultraviolet Protection Factor (UPF) of 50+. First, I found the label of the shirt on the back-inner edge of the neck. Second, I read the label. Third, I verified that the UPF matched my requirement of 50+. In this way, I prepared to weave the fabric by verifying that the fabric had an Ultraviolet Protection Factor (UPF) of 50+. + +28. I prepared to avoid reflecting UV light onto my skin. I did this by choosing a dark coloured school uniform fabric. First, I placed the dark coloured fabrics on the left side. Second, I placed the light coloured fabrics on the right side. Third, I chose the dark coloured fabrics to make the school bag from. In this way, I prepared to avoid reflecting UV light onto my skin by choosing a dark coloured school uniform fabric. + +29. I prepared to verify the sunscreen's sun protection factor (SPF). I did this by verifying that it had an SPF of 30+. First, I read the front of the sunscreen bottle. 
Second, I read its SPF number. Third, I verified that it had good sun protection because of having SPF 30+. In this way, I prepared to verify the sunscreen's SPF by verifying that it had an SPF of 30+. + +30. I prepared to be protected in the infra-red (heated) environment. I did this by checking that the sunscreen protected my skin from broad spectrum (including infra-red) radiation. First, I read that I needed a sunscreen with broad spectrum protection. Second, I looked at the label of the sunscreen. Third, I read whether it gave broad spectrum protection. In this way, I prepared to be protected in the infra-red (heated) environment by checking that the sunscreen protected my skin from broad spectrum (including infra-red) radiation. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Lucian Mantra (Sun Safety) 4 of 4.txt b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Lucian Mantra (Sun Safety) 4 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..c28673f6453543f1863bd9fc5482e032362ec5f0 --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Lucian Mantra (Sun Safety) 4 of 4.txt @@ -0,0 +1,31 @@ +["Green, L 2021, Lucian Mantra (Sun Safety) 4 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF MEDITATION +by Lucian Green +Lucian Mantra (Sun Safety) 4 of 4 + +31. I prepared to test whether the water resistant sunscreen protected me from the water. I did this by testing whether the sunscreen was water resistant. First, I applied the sunscreen to a section of my skin. Second, I dropped a droplet of water onto this section. Third, I watched the droplet of water roll down this section of my skin. 
In this way, I prepared to test whether the water resistant sunscreen protected me from the water by testing whether the sunscreen was water resistant. + +32. I prepared to walk along the glass corridor at midday. I did this by wearing the wide brimmed hat. First, I measured the width of the hat. Second, I divided this by 2. Third, I noted that this was the width from the centre of my head to the edge of the brim. In this way, I prepared to walk along the glass corridor at midday by wearing the wide brimmed hat. + +33. I prepared to avoid the sun. I did this by sitting in the vehicle. First, I sat in the vehicle. Second, I switched on the air conditioning. Third, I avoided the sun. In this way, I prepared to avoid the sun by sitting in the vehicle. + +34. I prepared to design a production. I did this by avoiding the sun. First, I went for a walk before the sun reached high intensity. Second, I avoided the sun. Third, I went for a walk after the sun reached high intensity. In this way, I prepared to design a production by avoiding the sun. + +35. I prepared to drink lemonade. I did this by breasoning out my action in relation to the sun. First, I calculated the angle of the sun from me. Second, I breasoned out my action. Third, I performed my action. In this way, I prepared to drink lemonade by breasoning out my action in relation to the sun. + +36. I prepared to avoid spending too much time in the sun. I did this by timing my sun exposure. First, I wrote down the initial minute of my sun exposure. Second, I wrote down the final minute of my sun exposure. Third, I calculated the time of my sun exposure by subtracting the initial minute of my sun exposure from the final minute of my sun exposure. In this way, I prepared to avoid spending too much time in the sun by timing my sun exposure. + +37. I prepared to ensure my dependents were exposed to zero sun. I did this by controlling how much sun they were exposed to. First, I asked them to wear long-sleeved shirts. 
Second, I asked them to wear sun cream. Third, I asked them to wear hats. In this way, I prepared to ensure my dependents were exposed to zero sun by controlling how much sun they were exposed to. + +38. I prepared to be protected temporally. I did this by protecting myself from the sun forever, until a date (I frequently bought sun cream). First, I bought sun cream in the first month. Second, I bought sun cream in the second month. Third, I bought sun cream in the third month. In this way, I prepared to be protected temporally by protecting myself from the sun forever, until a date (I frequently bought sun cream). + +39. I prepared to schedule sport before or after the sun's high intensity time period. I did this by following the government's advice about when to go outside. First, I accessed the bureau's latest data. Second, I accessed the initial time of the sun's high intensity time period. Third, I accessed the final time of the sun's high intensity time period. In this way, I prepared to schedule sport before or after the sun's high intensity time period by following the government's advice about when to go outside. + +40. I prepared to perform my task. I did this by using the insect repellent. First, I sprayed the aerosol on my hand. Second, I applied the liquid to my face, arms and neck. Third, I entered the insect space. In this way, I prepared to perform my task by using the insect repellent. + +41. I prepared to drive with tinted windows. I did this by avoiding too many ultraviolet rays. First, I wore clothing that covered my arms and legs and a hat with a wide brim to protect my head, face, neck and ears. Second, I wore sunglasses that wrapped around and blocked both UVA and UVB rays. Third, I used sunscreen with sun protection factor (or SPF) 15 or greater, with both UVA and UVB protection. In this way, I prepared to drive with tinted windows by avoiding too many ultraviolet rays. + +42. I prepared to avoid too much visible light. 
I did this by determining which wrap-around sunglasses to buy. First, I measured my face. Second, I determined my face shape. Third, I found a style of sunglasses. In this way, I prepared to avoid too much visible light by determining which wrap-around sunglasses to buy. + +43. I prepared to avoid sunstroke. I did this by avoiding too much heat from the sun. First, I remained hydrated, exercised in milder weather and avoided sunburn. Second, I wore cool clothing. Third, I allowed myself to gradually adjust to the heat. In this way, I prepared to avoid sunstroke by avoiding too much heat from the sun. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Maharishi Sutra 1 of 4.txt b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Maharishi Sutra 1 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..05c27d99c8f7438a6e869390b588f96fe592d7d7 --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Maharishi Sutra 1 of 4.txt @@ -0,0 +1,29 @@ +["Green, L 2021, Maharishi Sutra 1 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF MEDITATION +by Lucian Green +Maharishi Sutra 1 of 4 + + +1. Maharishi, meaning teacher, is the founder of meditation. He or she collects the foundations, the areas of study that form the school, and launches the courses. He is expected to appear during courses and also teaches meditation. + + +1a. I prepared to give excellence to everything I did. Smiley face. I did this by building the model house. First, I built the foundation. Second, I built the walls. Third, I built the roof. In this way, I prepared to give excellence to everything I did by building the model house. + +2. I prepared to eat breakfast in the army. I did this by eating the block of porridge. First, I cooked the rolled oats. Second, I poured the rolled oats into the block mold. 
Third, I ate them all the way to the bottom. In this way, I prepared to eat breakfast in the army by eating the block of porridge. + +3. I prepared to eat a bush meal. I did this by cooking the chestnut. First, I put the chestnut into the fire. Second, I waited until it had cooked. Third, I removed it from the fire with gloves. In this way, I prepared to eat a bush meal by cooking the chestnut. + +4. I prepared to make pedagogy work by making sure the breasonings (like the word breeze, like breathing with help from breasonings) worked. I did this by erecting the windsock. First, I made a stand. Second, I tied the windsock to it. Third, I placed it in the wind. In this way, I prepared to make pedagogy work by making sure the breasonings worked by erecting the windsock. + +5. The doctor prepared to visit another country. She did this by erasing the border. First, she drew a map in pencil. Second, she found the border. Third, she erased it. In this way, the doctor prepared to visit another country by erasing the border. + +6. I prepared to do something like going down a chemical gradient. I did this by sliding down the slide. First, I climbed the ladder. Second, I sat down at the top of the slide. Third, I lay down as I slid down the slide. In this way, I prepared to do something like going down a chemical gradient by sliding down the slide. + +7. I prepared to draw the megapixels, like As. I did this by drawing ten out of ten of the necessary pixels. First, I wrote the first meditation A about the chopsticks modelling objects. Second, I prepared to write the second meditation A. Third, I did this until I had drawn the perfect 10-member set of pixels. In this way, I prepared to draw the megapixels, like As by drawing ten out of ten of the necessary pixels. + +8. I prepared for the meditator's longevity to increase. I did this by sending the meditation pack to a new meditator. First, I included the pedagogy section in the meditation pack. 
Second, I included the meditation section in the meditation pack. Third, I included the medicine section and selections from the continental philosophy section in the meditation pack. In this way, I prepared for the meditator's longevity to increase by sending the meditation pack to a new meditator. + +9. I (Maharishi) prepared to adorn the stage with flowers. I did this by inserting my hand underneath the basket handle. First, I knelt down to pick up the basket. Second, I placed my arm under the handle. Third, I lifted the basket by standing up. In this way, I (Maharishi) prepared to adorn the stage with flowers by inserting my hand underneath the basket handle. + +10. I prepared to present the prospectus to the potential investor. I did this by wearing a suit. First, I put on my jacket. Second, I put on my pants. Third, I put on my shoes. In this way, I prepared to present the prospectus to the potential investor by wearing a suit. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Maharishi Sutra 2 of 4.txt b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Maharishi Sutra 2 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..d523628985f26e979a9eb1237ea7e08b1d96886d --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Maharishi Sutra 2 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Maharishi Sutra 2 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF MEDITATION +by Lucian Green +Maharishi Sutra 2 of 4 + +11. I prepared to direct potential meditators to the service time. I did this by designing a business card with my professional web site address. First, I wrote my name on the business card blueprint. 
Second, I wrote my contact details on the business card blueprint. Third, I wrote my web site address on the business card blueprint. In this way, I prepared to direct potential meditators to the service time by designing a business card with my professional web site address. + +12. I prepared for the meeting to be full. I did this by writing the meeting time on the web site. First, I finished writing the class. Second, I tested it. Third, I advertised the class on the search engine, which linked to the class time with the ability to register on the web site. In this way, I prepared for the meeting to be full by writing the meeting time on the web site. + +13. I prepared for Lucianic Meditation's philosophy to be used in each religion. I did this by teaching Lucianic Meditation to monastics, oblates and followers in each religion. First, I delivered the lecture. Second, I said a model solution. Third, the student used this model solution to develop her own solution. In this way, I prepared for Lucianic Meditation's philosophy to be used in each religion by teaching Lucianic Meditation to monastics, oblates and followers in each religion. + +14. I beat the opposition to love you. I did this by hugging you. First, I opened my arms. Second, I wrapped them around you. Third, I squeezed you. In this way, I beat the opposition to love you by hugging you. + +15. I prepared to make money from my non-profit organisation to pay back to my non-profit organisation. I did this by receiving the coin. First, I opened my hand. Second, the donor opened his purse. Third, the donor put a coin into my hand. In this way, I prepared to make money from my non-profit organisation to pay back to my non-profit organisation by receiving the coin. + +16. The people prepared to listen to Maharishi. They did this by meeting him. First, they presented him with finished breasoning lists. Second, they presented him with As they had written on his topics. 
Third, they presented him with meditation students. In this way, the people prepared to listen to Maharishi by meeting him. + +17. Maharishi prepared to create peace on Earth. He did this by giving the people high quality thoughts. First, Maharishi received the meditation thoughts. Second, his monastics were meditated on. Third, this repeated ad infinitum. In this way, Maharishi prepared to create peace on Earth by giving the people high quality thoughts. + +18. Maharishi prepared to create a college. He did this by using his brain. First, he planned to plant 25 trees per green person. Second, he drank concentrated vitamin C for defeating disease. Third, he meditated to protect his health. In this way, Maharishi prepared to create a college by using his brain. + +19. I prepared to design a resort on the moon. I did this by looking at the distant point through a close hole. First, I created a tree-lined reserve on Earth. Second, I called it La Luna. Third, Lucianic monastics taught at a school there, by teaching distant knowledge through close breasonings. In this way, I prepared to design a resort on the moon by looking at the distant point through a close hole. + +20. I prepared to establish peace in my home. I did this by training the puppy not to bark. First, I placed the puppy on the floor. Second, I waited until she barked. Third, I asked her to hush. In this way, I prepared to establish peace in my home by training the puppy not to bark. 
+ +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Maharishi Sutra 3 of 4.txt b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Maharishi Sutra 3 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..2ab7d9e6078d842edce6553623f51ba38aa2fdc6 --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Maharishi Sutra 3 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Maharishi Sutra 3 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF MEDITATION +by Lucian Green +Maharishi Sutra 3 of 4 + +21. I prepared to build a tree house. I did this by building a house inside the tree trunk. First, I found a wide tree. Second, I carved rooms into the tree trunk. Third, I installed a door and windows in the house. In this way, I prepared to build a tree house by building a house inside the tree trunk. + +22. Lucianic Meditation's founder prepared to have followers. He did this by inviting monastics and other religions to Lucianic Meditation. First, he invited monastics to power Lucianic Meditation with Lucianic Meditation. Second, he invited companies to power companies with Lucianic Meditation. Third, he invited other religions to power other religions with Lucianic Meditation. In this way, Lucianic Meditation's founder prepared to have followers by inviting monastics and other religions to Lucianic Meditation. + +23. Lucian prepared to give himself the title Maharishi (Master). He did this by wearing the robe. First, he measured the robe. Second, he put it on. Third, he taught in it. In this way, Lucian prepared to give himself the title Maharishi (Master) by wearing the robe. + +24. I prepared to be happy. I did this by chiselling my name into the black granite. 
First, I wrote the first character. Second, I prepared to write the next character. Third, I repeated this until I had written my name. In this way, I prepared to be happy by chiselling my name into the black granite. + +25. I prepared to earn 100% in drawing. I did this by earning A in having spiritual robotic head and limbs. First, I drew my body. Second, I drew my head. Third, I drew my limbs. In this way, I prepared to earn 100% in drawing by earning A in having spiritual robotic head and limbs. + +26. I prepared to love practicing Lucianic Meditation (reading a book). I did this by scheduling it at a particular time. First, I woke early. Second, I practiced meditation (read the book). Third, I went to sleep. In this way, I prepared to love practicing Lucianic Meditation (reading the book) by scheduling it at a particular time. + +27. I prepared to plan a vacation. I did this by planning my supply of breasonings in Masters by Coursework. First, I planned to write 10*250 breasoning As for 80%, with 5*0.12 br lines to represent a multiple of at least 5*0.12=0.6, where 0.6*(10 more 250s)=6 in (6+10)*250 for a total of 4000 br for being helped with the rest for lecturers=10*250+5*0.12=2500.6 br. Second, for non-helping lecturers, I increased the length of the 5 lines with 5(0.12+0.32) br lines to represent a multiple of at least 5(0.12+0.32)=2.2, where 2.2*(10 more 250s)=22 in (22+10)*250 for a total of 8000 br = 10*250+5(0.12+0.32)=2502.2 br/assignment. Third, for 2 subjects per semester, 2 semesters with 5 assignments/subject=(2502.2*20)/365 days per year=137.107 br/day=1.714 As/day. In this way, I prepared to plan a vacation by planning my supply of breasonings in Masters by Coursework. + +28. I prepared to find out parts of the Vorstellung (idea). I did this by planning my supply of breasonings in Masters or PhD. First, I wrote 20 chapters in first year. Second, I wrote 10 chapters in second year. Third, I wrote 1.714 As/day. 
In this way, I prepared to find out parts of the Vorstellung (idea) by planning my supply of breasonings in Masters or PhD. + +29. I prepared to ask for audience feedback. I did this by playing my compositions from start to end. First, I created a line-up of my songs. Second, I found an audience. Third, I pressed 'play'. In this way, I prepared to ask for audience feedback by playing my compositions from start to end. + +30. I prepared to comment on what was interesting in computational philosophy. I did this by examining my computer program. First, I examined the title. Second, I examined the description. Third, I examined the sample input and output. In this way, I prepared to comment on what was interesting in computational philosophy by examining my computer program. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Maharishi Sutra 4 of 4.txt b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Maharishi Sutra 4 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..d59850618ed184bcb93d2beb8ebe72d78b91d978 --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Maharishi Sutra 4 of 4.txt @@ -0,0 +1,36 @@ +["Green, L 2021, Maharishi Sutra 4 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF MEDITATION +by Lucian Green +Maharishi Sutra 4 of 4 + +31. I prepared to write the politics code. I did this by critiquing politics. First, I found a misbehaving Head of State. Second, I gave him a pseudonym. Third, I wrote an essay agreeing with her with a critique. In this way, I prepared to write the politics code by critiquing politics. + +32. I prepared to sell my knowledge. I did this by stating that I am healthy. 
First, I experienced no stomach bubbles from meditation. Second, I collected knowledge to prevent colds. Third, I remained in good psychiatric health. In this way, I prepared to sell my knowledge by stating that I am healthy. + +33. I prepared to be economical. I did this by stating that I am wealthy. First, I performed enough of a particular kind of work. Second, I asked the Universe for help. Third, I received the reward. In this way, I prepared to be economical by stating that I am wealthy. + +34. I prepared to explain how I became wise. I did this by stating that I am wise. First, I opened my journal. Second, I stated what I did. Third, I stated why it was wise. In this way, I prepared to explain how I became wise by stating that I am wise. + +35. I prepared to integrate pedagogy, medicine, economics and meditation. I did this by stating that I am happy. First, I looked up what I was interested in. Second, I considered all information. Third, I consolidated my knowledge each day. In this way, I prepared to integrate pedagogy, medicine, economics and meditation by stating that I am happy. + +36. I prepared to program the robot to customise the car's interior for the journey. I did this by envisaging observing the philosopher riding in the automatic car in the future. First, I entered the windowless car. Second, I confirmed the itinerary. Third, I played the movie. In this way, I prepared to program the robot to customise the car's interior for the journey by envisaging observing the philosopher riding in the automatic car in the future. + +37. I prepared to move up. I did this by observing that the person at my level didn't oppose me. First, I noticed that the person at my level was happy. Second, I noticed that she moved on. Third, I moved on. In this way, I prepared to move up by observing that the person at my level didn't oppose me. + +38. I prepared to give some of my all to study. I did this by painting the yoyo red, like a blood cell. 
First, I wrote a Masters by Coursework. Second, I wrote a Masters by Research. Third, I wrote a PhD. In this way, I prepared to give some of my all to study by painting the yoyo red, like a blood cell. + +39. I prepared to find desirable pathways. I did this by considering and placing aside the non-meditator (well-read) Gods (philosophers). First, I found the meditator God (philosopher). Second, I found meditation with him. Third, I found pedagogy with him. In this way, I prepared to find desirable pathways by considering and placing aside the non-meditator (well-read) Gods (philosophers). + +40. I prepared to collect the high quality comments. I did this by collecting the high quality comment. First, I wrote the idea. Second, I breasoned it out. Third, I collected the high quality comment. In this way, I prepared to collect the high quality comments by collecting the high quality comment. + +41. I prepared to make friends with minorities. I did this by painting the rainbow. First, I painted the warm colours. Second, I painted the cool colours. Third, I was in favour of equality. In this way, I prepared to make friends with minorities by painting the rainbow. + +42. I prepared to find the way. I did this by lighting the flame. First, I found the candle. Second, I lit the match. Third, I lit the candle with the match. In this way, I prepared to find the way by lighting the flame. + +43. I prepared to have cardio-vascular exercise. I did this by walking my dog in the lunar park. First, I travelled to the lunar park with my dog. Second, I chose a path. Third, I walked my dog. In this way, I prepared to have cardio-vascular exercise by walking my dog in the lunar park. + +44. I prepared to be in a position of power. I did this by managing the money. First, I was given the job as financial manager. Second, I approved spending money. Third, I received some money. In this way, I prepared to be in a position of power by managing the money. + +45. 
I prepared to experience a grade spike during my degree. I did this by earning a better grade in the professor's class. First, I enrolled in the professor's class. Second, I completed the same amount of work as usual. Third, I performed twice as well. In this way, I prepared to experience a grade spike during my degree by earning a better grade in the professor's class. + +46. I prepared to sing you a love song. I did this by singing pop music. First, I attended singing training. Second, I annotated my song. Third, I sang my song. In this way, I prepared to sing you a love song by singing pop music. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Meditation Teacher Sutra 1 of 4.txt b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Meditation Teacher Sutra 1 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..e45210755c6e5b3d05b9246ce822cf89d92fe827 --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Meditation Teacher Sutra 1 of 4.txt @@ -0,0 +1,29 @@ +["Green, L 2021, Meditation Teacher Sutra 1 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF MEDITATION +by Lucian Green +Meditation Teacher Sutra 1 of 4 + + +1. The meditation teacher should have studied a meditation course taught by Maharishi, and offer meditation classes, i.e. teach the meditation techniques. Also, he or she may teach in person, and offer yoga and a variety of other related classes, e.g. cooking, medicine, pedagogy and philosophy. + + +1a. I prepared to verify my schedule. I did this by getting up when the bell rang. First, I lay in bed. Second, I waited until the bell rang. Third, I pulled the blanket off.
In this way, I prepared to verify my schedule by getting up when the bell rang. + +2. I prepared to win the point. I did this by hitting the shuttlecock in the air. First, I aimed the badminton racquet in a particular direction. Second, I threw the shuttlecock in the air. Third, I hit the shuttlecock in that direction. In this way, I prepared to win the point by hitting the shuttlecock in the air. + +3. I prepared to weigh the weights. I did this by lifting the weight onto the scales. First, I touched the weight. Second, I gripped the weight. Third, I lifted it onto the scales. In this way, I prepared to weigh the weights by lifting the weight onto the scales. + +4. I prepared to select the most healthy egg and sperm. I did this by eating the strawberry that had cream on it. First, I looked at the strawberries. Second, I looked at the strawberries with cream on them. Third, I selected a strawberry with cream on it and placed it in my mouth. In this way, I prepared to select the most healthy egg and sperm by eating the strawberry that had cream on it. + +5. Elvira prepared to participate in a field trip. She did this by flying a kite. First, she ran with the kite. Second, she let the wind lift the kite into the sky. Third, she flew the kite in the sky. In this way, she prepared to participate in a field trip by flying a kite. + +6. I prepared to ask the jackdaw to speak. I did this by patting the jackdaw. First, I walked to the jackdaw. Second, I placed my hand above the jackdaw. Third, I patted the jackdaw. In this way, I prepared to ask the jackdaw to speak by patting the jackdaw. + +7. I prepared to go to the meditation centre. I did this by staying close to the safe people. First, I identified that the people I was with were safe. Second, I stayed with the people. Third, I was close to the people. In this way, I prepared to go to the meditation centre by staying close to the safe people. + +8. I prepared to keep the positive gifts.
I did this by differentiating between apples and oranges. First, I looked at the apple. Second, I looked at the orange. Third, I found differences between them. In this way, I prepared to keep the positive gifts by differentiating between apples and oranges. + +9. My audience prepared to experience heaven on earth. I did this by plucking the harp string. First, I sat at the harp. Second, I positioned my finger next to the string. Third, I plucked the C string. In this way, my audience prepared to experience heaven on earth by plucking the harp string. + +10. The psychiatrist prepared to help the patient. He did this by writing the while command. First, he typed 'while'. Second, he typed 'true'. Third, he ran the command until the user had entered input. In this way, the psychiatrist prepared to help the patient by writing the while command. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Meditation Teacher Sutra 2 of 4.txt b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Meditation Teacher Sutra 2 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..daacb1a72583b63cb44da4d5679d5b328fd361df --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Meditation Teacher Sutra 2 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Meditation Teacher Sutra 2 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF MEDITATION +by Lucian Green +Meditation Teacher Sutra 2 of 4 + +11. I prepared to like you. I did this by agreeing with you. First, I read the sentence that you wrote. Second, I thought of whether you agreed with it. Third, I agreed with it. In this way, I prepared to like you by agreeing with you. + +12. I prepared to dissolve the licorice.
I did this by popping the bubble. First, I found the bubble. Second, I positioned the metal skewer above it. Third, I stabbed the bubble, popping it. In this way, I prepared to dissolve the licorice by popping the bubble. + +13. I examined the Vag's famousness. I did this by converting famousness into famousness. First, I found the first famousness. Second, I converted it into another famousness. Third, I repeated this for the other famousnesses. In this way, I examined the Vag's famousness by converting famousness into famousness. + +14. I prepared to connect two heterogeneous ideas. I did this by allowing the left-hand side and right-hand side of my brain to communicate, like a conversation of algorithms. First, I wrote sales arguments in pedagogy. Second, I distributed them to points of sale. Third, I mapped the points of sale. In this way, I prepared to connect two heterogeneous ideas by allowing the left-hand side and right-hand side of my brain to communicate. + +15. I prepared to think of a metaphor for a fascium. I did this by standing as a single meditation government leader. First, I collected my like-minded colleagues. Second, I learned how to teach meditation. Third, I stood for election for government. In this way, I prepared to think of a metaphor for a fascium by standing as a single meditation government leader. + +16. I involved myself with goodnesses. I did this by painting the pictures with text. First, I found the first part of the image. Second, I painted it with text. Third, I repeated this for the rest of the images. In this way, I involved myself with goodnesses by painting the pictures with text. + +17. I prepared to think of creative philosophy by thinking of sets of As for managers, education and the theology specific A by thinking of the metaphor for a parakeet. I did this by calling the parakeet to me. First, I called it. Second, I put out my arm. Third, it came to me.
In this way, I prepared to think of creative philosophy by calling the parakeet to me. + +18. I prepared to say I loved being there. I did this by sanctioning breedsonings with As, making philosophy easier to write. First, I wrote the breedsonings. Second, I wrote an A for it. Third, I wrote the philosophy. In this way, I prepared to say I loved being there by sanctioning breedsonings with As. + +19. The positive doctor prepared to do his job. He did this by protecting his heart and brain in conjunction with meditation. First, he protected his heart. Second, he protected his brain. Third, he meditated. In this way, he prepared to do his job by protecting his heart and brain in conjunction with meditation. + +20. The robotics man prepared to drink from the flask, which was a metaphor for society. He did this by drinking from the thermos flask (like a robot) in the park (space). First, he untwisted the lid. Second, he drank from it. Third, he put it down. In this way, the robotics man prepared to drink from the flask by drinking from the thermos flask in the park. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Meditation Teacher Sutra 3 of 4.txt b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Meditation Teacher Sutra 3 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..ec4f6cb00cff82c2e3722417a066994056800c7d --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Meditation Teacher Sutra 3 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Meditation Teacher Sutra 3 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF MEDITATION +by Lucian Green +Meditation Teacher Sutra 3 of 4 + +21. You prepared to be the main person too. 
You did this by being fast parents with young children. First, you got married. Second, you got a mortgage. Third, you had children. In this way, you prepared to be the main person too by being fast parents with young children. + +22. I prepared to like being with you. I did this by shaking my worlds. First, I shook the model globe. Second, I waited until it had opened. Third, I watched the key fall out. In this way, I prepared to like being with you by shaking my worlds. + +23. I prepared to like being happy. I did this by inverting my frown into a smile. First, I identified myself frowning. Second, I relaxed my face. Third, I smiled. In this way, I prepared to like being happy by inverting my frown into a smile. + +24. I prepared to be on my guard. I did this by placing the fireguard in place. First, I turned the fire on. Second, I lifted the fire screen. Third, I placed it in front of the fire. In this way, I prepared to be on my guard by placing the fireguard in place. + +25. I prepared to like being friendly by myself. I did this by going for a walk at the same time each day. First, I decided whether I wanted to go for a walk at a particular time. Second, I went for a walk at this time. Third, I did this each day. In this way, I prepared to like being friendly by myself by going for a walk at the same time each day. + +26. I prepared to like a lady. I did this by giving the children all of the academic As once per day. First, I waited until 12 noon. Second, I gave them all of the academic As. Third, I watched their faces. In this way, I prepared to like a lady by giving the children all of the academic As once per day. + +27. I prepared to go outside. I did this by wearing a coat. First, I unbuttoned the coat. Second, I put on the jacket. Third, I put on a costume, went on stage and became King of Pop. In this way, I prepared to go outside by wearing a coat. + +28. I prepared to do a corrected act on stage. I did this by playing the lutephonics.
First, I plucked the lute's string. Second, I listened to whether it was sharp or flat. Third, I tuned it down or up respectively, to the correct note. In this way, I prepared to do a corrected act on stage by playing the lutephonics. + +29. I prepared to pull people out of the grave. I did this by pushing everyone up. First, I lay under the person. Second, I put my hands on his back. Third, I pushed him up. In this way, I prepared to pull people out of the grave by pushing everyone up. + +30. I prepared to fall asleep. I did this by imagining sheep jumping over me. First, I watched one sheep jump over me in my imagination. Second, I prepared to watch the second sheep jump over me in my imagination. Third, I repeated this until I had fallen asleep. In this way, I prepared to fall asleep by imagining sheep jumping over me. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Meditation Teacher Sutra 4 of 4.txt b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Meditation Teacher Sutra 4 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..5a4cbb7ac5acc917fa568389b5044585ac9b33ef --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Meditation Teacher Sutra 4 of 4.txt @@ -0,0 +1,29 @@ +["Green, L 2021, Meditation Teacher Sutra 4 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF MEDITATION +by Lucian Green +Meditation Teacher Sutra 4 of 4 + +31. I prepared to push strawberries into my mouth. I did this by molding strawberries. First, I pressed the strawberry up. Second, I kept pushing it up. Third, I ate it when it reached the top. In this way, I prepared to push strawberries into my mouth by molding strawberries. + +32. I prepared to eat dessert. 
I did this by eating strawberries out of a bowl. First, I offered the money. Second, I offered the property. Third, I offered my time to take part in the religion. In this way, I prepared to eat dessert by eating strawberries out of a bowl. + +33. I prepared to appear intelligent in relation to main programs about life. I did this by determining that the line was bug-free. First, I verified the command's spelling. Second, I verified the number of commands. Third, I verified that the command returned the desired result. In this way, I prepared to appear intelligent in relation to main programs about life by determining that the line was bug-free. + +34. I prepared to entertain the guests. I did this by stating that the musician was vegetarian. First, I ate the peanut sauce. Second, I ate the broccoli. Third, I ate the carrot. In this way, I prepared to entertain the guests by stating that the musician was vegetarian. + +35. I prepared to feed the chivalrous knights. I did this by producing food in my role as the restaurant manager. First, I produced onion relish. Second, I produced mango chutney. Third, I produced Indian cuisine. In this way, I prepared to feed the chivalrous knights by producing food in my role as the restaurant manager. + +36. I prepared to write my own exam. I did this by writing it in multiple-choice format. First, I decided on the main points. Second, I decided on the criteria points for each point. Third, I wrote the exam. In this way, I prepared to write my own exam by writing it in multiple-choice format. + +37. I prepared to go to heaven (peace on earth). I did this by teaching the meditation technique. First, I travelled to the calm place. Second, I surrounded myself with students. Third, I instructed them in the meditation technique. In this way, I prepared to go to heaven (peace on earth) by teaching the meditation technique. + +38. I prepared to provide feedback on meditation teaching.
I did this by preventing regurgitation (thinking of the same content twice). First, I prepared to think of the first idea. Second, I prepared to think of the next idea. Third, I repeated this until I had thought of the ideas that I wanted to. In this way, I prepared to provide feedback on meditation teaching by preventing regurgitation (thinking of the same content twice). + +39. I prepared to observe safety guidelines. I did this by doing one thing at a time. First, I performed the first activity. Second, I finished it. Third, I performed the second activity. In this way, I prepared to observe safety guidelines by doing one thing at a time. + +40. I prepared to notice you. I did this by watching the navy show. First, I noticed the first mate. Second, I noticed the response. Third, I verified that it was correct. In this way, I prepared to notice you by watching the navy show. + +41. I prepared to examine the famous intellectually disabled student. I did this by playing the intellectually disabled student. First, I was sensitive to her voice. Second, I was sensitive to her appearance. Third, I was sensitive to her world. In this way, I prepared to examine the famous intellectually disabled student by playing the intellectually disabled student. + +42. I prepared to observe life on planet Earth. I did this by repeating the sutra. First, I read the sutra in a book. Alternatively, I listened to a teacher tell it to me. Then, I repeated the sutra for twenty minutes twice per day. In this way, I prepared to observe life on planet Earth by repeating the sutra.
+ +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Moving Appearances 1 of 4.txt b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Moving Appearances 1 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..fbf993088074d110247a9d8cc65a254536981449 --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Moving Appearances 1 of 4.txt @@ -0,0 +1,29 @@ +["Green, L 2021, Moving Appearances 1 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF MEDITATION +by Lucian Green +Moving Appearances 1 of 4 + + +1. This is like having a 'movie theatre' inside one's brain. This and the appearances argument allow one to visualise 'breasonings', the fundamental unit of pedagogy, and mean that they deserve the marks from pedagogical essays. + + +1a. I prepared to remove the carrageenan from the ice cream recipe. I did this by licking all of the top half of the lollipop at the same time. First, I attached my teeth to the top half of the lollipop. Second, I clenched the top half of the lollipop. Third, I removed the top half of the lollipop. In this way, I prepared to remove the carrageenan from the ice cream recipe by licking all of the top half of the lollipop at the same time. + +2. I prepared to prevent the * by pre-empting the person's anger. I did this by skewering the sausage sideways. First, I rotated the sausage so that it pointed up. Second, I aimed the skewer at the top of the sausage. Third, I pierced the sausage lengthwise with the skewer. In this way, I prepared to prevent the * by pre-empting the person's anger by skewering the sausage sideways. + +3. I prepared to read the person's face. I did this by reading the face of the pill with writing on it.
First, I turned the pill over if there was no writing on its top face. Second, I read the writing in this case. Otherwise, first I read the writing written on the top face of the pill. In this way, I prepared to read the person's face by reading the face of the pill with writing on it. + +4. I prepared to eat the salad sandwich. I did this by putting the bowl on the bench to be served. First, I lifted the bowl high. Second, I placed it on the edge of the bench. Third, I slid it onto the centre of the bench. In this way, I prepared to eat the salad sandwich by putting the bowl on the bench to be served. + +5. I prepared to kiss the Head of Philosophy at University on the cheek. I did this because the Head of Philosophy at University and I liked each other. First, I took her hand. Second, I looked her in the eye. Third, I shook her hand. In this way, I prepared to kiss the Head of Philosophy at University on the cheek because she and I liked each other. + +6. I prepared to taste each asparagus head separately. I did this by tasting each ice cream one at a time, to compare their flavours. First, I licked the strawberry ice cream. Second, I licked the orange ice cream. Third, I chose the strawberry ice cream because it was sweeter. In this way, I prepared to taste each asparagus head separately by tasting each ice cream one at a time, to compare their flavours. + +7. I prepared to put the baddie in jail. I did this by catching him. First, I looked at the man's face. Second, I compared it with the face of the baddie on the chart. Third, I placed the man whose face was on the chart behind bars. In this way, I prepared to put the baddie in jail by catching him. + +8. The man prepared to receive the Internet order. I did this by assisting the ball-giver by predicting where he would give me the ball. First, I walked to where the ball-giver would give me the ball. Second, I lifted my arms to receive the ball. Third, I received the ball when the man gave it to me.
In this way, the man prepared to receive the Internet order by predicting where he would give me the ball. + +9. I prepared to walk. I did this by touching the table with my tactile hand without looking. First, I stood beside the table. Second, I closed my eyes. Third, I touched the table. In this way, I prepared to walk by touching the table with my tactile hand without looking. + +10. I prepared to accelerate the particle. I did this by lowering my sensitive arm more and more slowly onto the deck chair's arm. First, I started moving my arm quickly towards the deck chair arm. Second, I started moving my arm more slowly towards the deck chair arm. Third, I relaxed my arm when it had reached the deck chair arm. In this way, I prepared to accelerate the particle by lowering my sensitive arm more and more slowly onto the deck chair's arm. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Moving Appearances 2 of 4.txt b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Moving Appearances 2 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..8b146bc338e18af00762ee0a588332139bf9f4f7 --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Moving Appearances 2 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Moving Appearances 2 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF MEDITATION +by Lucian Green +Moving Appearances 2 of 4 + +11. I prepared to paint the dock. I did this by using the brush stylus to intuitively paint with the computer tablet. First, I decided to paint a long thin stroke. Second, I chose to start the stroke with the corner of the brush to paint a thin stroke. Third, I painted the stroke with the brush on an angle to paint a long stroke.
In this way, I prepared to paint the dock by using the brush stylus to intuitively paint with the computer tablet. + +12. I prepared to synchronise the files. I did this by breathing in and out at the same time as my friend. First, I watched my friend starting to breathe in. Second, I started to breathe in. Third, I repeated this by breathing out at the same time as my friend. In this way, I prepared to synchronise the files by breathing in and out at the same time as my friend. + +13. I prepared to store a cup in a wall. I did this by putting my tongue in the lolly's hole. First, I poked my tongue out. Second, I started pushing it into the hole. Third, I stopped pushing my tongue forward when it had reached the end of the hole. In this way, I prepared to store a cup in a wall by putting my tongue in the lolly's hole. + +14. I prepared to wait until sunset. I did this by staying at the laboratory for as long as possible. First, I wrote a list of items to do during the day. Second, I wrote the time each item would take. Third, I completed each item in the necessary time. In this way, I prepared to wait until sunset by staying at the laboratory for as long as possible. + +15. I prepared to write a computer program with a minimalist main predicate, in other words with code in the main predicate moved to other predicates. I did this by cleaning my neck. First, I wetted a sponge. Second, I applied the sponge to my neck. Third, I soaked up any water on my neck. In this way, I prepared to write a computer program with a minimalist main predicate by cleaning my neck. + +16. I prepared to maintain my self-confidence. I did this by smiling at the person facing me. First, I looked at the left eye of the person facing me. Second, I looked at the right eye of the person facing me. Third, I smiled at her. In this way, I prepared to maintain my self-confidence by smiling at the person facing me. + +17. I prepared to love each idea.
I did this by loving, in other words repeating 80 lucian mantras. First, I repeated five lucian mantras. Second, I repeated this twice. Third, I repeated this eight times. In this way, I prepared to love each idea by loving, in other words repeating 80 lucian mantras. + +18. Einstein prepared to test what was built into an atom. He did this by eating a bagel at each eatery. First, he went to the first eatery. Second, he ate a bagel there. Third, he repeated this until he had visited each eatery. In this way, Einstein prepared to test what was built into an atom by eating a bagel at each eatery. + +19. I prepared to become a professor. I did this by concentrating well at University. First, I enrolled at University. Second, I attended class there. Third, I finished each degree one at a time (note: lecturers are trained to fail students who are enrolled in two separate degrees at a time). In this way, I prepared to become a professor by concentrating well at University. + +20. I prepared to wear the heat shield. I did this by making the pocket out of fabric by attaching it to the hole. First, I cut out two pocket halves from the fabric. Second, I sewed their bottoms and sides together. Third, I sewed the tops of the two sides of the pocket to the two sides of the hole for the pocket. In this way, I prepared to wear the heat shield by making the pocket out of fabric by attaching it to the hole.
+ +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Moving Appearances 3 of 4.txt b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Moving Appearances 3 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..007582f5c23a35e2aec6f8484e0ad92464786be2 --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Moving Appearances 3 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Moving Appearances 3 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF MEDITATION +by Lucian Green +Moving Appearances 3 of 4 + +21. I prepared to say a prayer to have cloudy mucous in meditation. I did this by eating the brazil nut. First, I bit the brazil nut. Second, I chewed it. Third, I looked at the chewed nut. In this way, I prepared to say a prayer to have cloudy mucous in meditation by eating the brazil nut. + +22. I visited people's houses with a security guard to teach meditation. I did this by generously giving my time to the church. First, I walked to the first house. Second, I worked out how to go to the next house. Third, I continued doing this until I had visited each house in the list. In this way, I visited people's houses with a security guard to teach meditation by generously giving my time to the church. + +23. I prepared to disseminate the atom. I did this by swallowing the cream whole. First, I stuck a chopstick through the cream in my mouth before swallowing it. Second, I stuck a skewer through the cream in my mouth before swallowing it. Third, I stuck a small skewer through the cream in my mouth before swallowing it. In this way, I prepared to disseminate the atom by swallowing the cream whole. + +24. I prepared to disseminate the electron. 
I did this by cleaning the shoe top to bottom and front to back. First, I cleaned from left to right of the front top of the shoe. Second, I cleaned from front to back of the top of the shoe. Third, I cleaned from the top to bottom of the shoe. In this way, I prepared to disseminate the electron by cleaning the shoe top to bottom and front to back. + +25. I prepared to fire particles through the subatomic bond. I did this by cleaning the utensil, in other words, the fork. First, I cleaned the fork in the water. Second, I cleaned the fork with an absorbent dish cloth. Third, I dried the fork with a dry dish cloth. In this way, I prepared to fire particles through the subatomic bond by cleaning the utensil. + +26. I prepared to be safe at the pop concert. I did this by reading at home. First, I selected the first book on my pile of books. Second, I opened the book at the first page. Third, I read each page until the end of the book. In this way, I prepared to be safe at the pop concert by reading at home. + +27. I prepared to write on input and output into and out of a didgeridoo respectively. I did this by deciding who would write on each half of a text in a pair of people. First, I scanned each half of the text. Second, I delegated writing on the first half of the text to the person who preferred it. Third, I delegated writing on the second half of the text to the person who preferred it. In this way, I prepared to write on input and output into and out of a didgeridoo respectively by deciding who would write on each half of a text in a pair of people. + +28. I prepared to clear away parts of the nucleus from being in the system being modeled. I did this by wetting a cloth and cleaning the child's face. First, I wet the cloth under the tap. Second, I touched the child's face with the cloth. Third, I cleaned the child's face with the cloth. 
In this way, I prepared to clear away parts of the nucleus from being in the system being modeled by wetting a cloth and cleaning the child's face. + +29. I prepared to verify the colour of pigment subatomic particles in the gas. I did this by telling my friend where there was a good deal. First, I looked at the object. Second, I looked at the object's price tag. Third, I told my friend about the good deal, in other words, low price for the object. In this way, I prepared to verify the colour of pigment subatomic particles in the gas by telling my friend where there was a good deal. + +30. I prepared to verify that I knew the quantum electron's properties. I did this by accepting a ticket from my friend when I was available. First, I checked when I was free. Second, I accepted the ticket for this time. Third, I attended the concert at this time. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Moving Appearances 4 of 4.txt b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Moving Appearances 4 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..e39ba0ed70822131e15569149b6157a279f817a6 --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Moving Appearances 4 of 4.txt @@ -0,0 +1,30 @@ +["Green, L 2021, Moving Appearances 4 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF MEDITATION +by Lucian Green +Moving Appearances 4 of 4 + +31. I prepared to give my time to the radiation experiment using personal protective equipment. I did this by giving my student the A grade for an assignment. First, I read the assignment. Second, I wrote the number of correct responses down. 
Third, I wrote the student had earned the A grade for the assignment because the number of correct responses was 80% of the questions in the assignment. In this way, I prepared to give my time to the radiation experiment by giving my student the A grade for an assignment. + +32. I prepared to examine the molecule to test the hypothesised effect of subatomic particles on its properties, in the same way that I would ask the teacher different questions about the test. I did this by revising for the test. First, I read the question. Second, I read the answer. Third, I tested that I understood the answer. In this way, I prepared to examine the molecule to test the hypothesised effect of subatomic particles on its properties, in the same way that I would ask the teacher different questions about the test, by revising for the test. + +33. I prepared to eat the vegan delight. I did this by licking the three-fruit ice cream. First, I licked the strawberry ice cream. Second, I licked the peach ice cream. Third, I licked the mango ice cream. In this way, I prepared to eat the vegan delight by licking the three-fruit ice cream. + +34. I, the philosopher, prepared to be there. I did this by writing. First, I was in the centre with you. Second, I wrote about it with you. Third, I liked you. In this way, I, the philosopher, prepared to be there by writing. + +35. I prepared to let my would-be daughter and her husband keep the dowry. I did this by giving the dowry. First, I found my daughter. Second, I gave her half the dowry. Third, I gave her husband half the dowry. In this way, I prepared to let my would-be daughter and her husband keep the dowry by giving the dowry. + +36. I prepared to approve of my daughter's quality of life. I did this by observing the parent approve of the partner. First, I observed the parent test the partner's job. Second, I observed the parent test the partner's outward manner.
Third, I observed the parent test the partner's physical state. In this way, I prepared to approve of my daughter's quality of life by observing the parent approve of the partner. + +37. I prepared to act on my status. I did this by verifying my status. First, I checked the train's current station. Second, I verified whether it was my destination. Third, I disembarked if it was. In this way, I prepared to act on my status by verifying my status. + +38. I prepared to verify that you would be at the place at the same time as me. I did this by verifying your status against my status. First, I saw you. Second, I decided to walk to you. Third, I met you. In this way, I prepared to verify that you would be at the place at the same time as me by verifying your status against my status. + +39. I prepared to lead the good life. I did this by preventing problems on the farm. First, I wrote the question. Second, I wrote the answer. Third, I replied with this answer when this question arose. In this way, I prepared to lead the good life by preventing problems on the farm. + +40. I prepared to take notes. I did this by finding the lost leash. First, I looked on the table. Second, I found it. Third, I went for a walk. In this way, I prepared to take notes by finding the lost leash. + +41. I prepared to write the breasoning's algorithm. I did this by observing the student agreeing with the breasonings. First, I observed the student agree with the breasoning. Second, I observed the student agree with the breasoning's algorithm's first breasoning. Third, I observed the student agree with the breasoning's algorithm's second breasoning. In this way, I prepared to write the breasoning's algorithm by observing the student agreeing with the breasonings. + +42. I prepared to endorse fairness. I did this by agreeing with positivity in relationships. First, I found the relationship. Second, I determined that it was positive. Third, I agreed with it. 
In this way, I prepared to endorse fairness by agreeing with positivity in relationships. + +Purusha +1. "] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Pranayama 1 of 4.txt b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Pranayama 1 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..4f113e7ebea7c901d08fa1b04f175d8c37de9591 --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Pranayama 1 of 4.txt @@ -0,0 +1,27 @@ +["Green, L 2021, Pranayama 1 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF MEDITATION +by Lucian Green +Pranayama 1 of 4 + +Pranayama, meaning breath control, represents the core of the meditation technique, because one breathes in before thinking the mantra, followed by breathing out. Breathing in represents taking care of thoughts in meditation and breathing out represents stress relief. + +1a. I prepared to observe the master reach the destination. I did this by writing the Pedagogy Direction Essay Secondary text Press Release. First, I identified the socialist. Second, I observed the master who observed the direction. Third, I observed the direction that the master who observed the direction walked in. In this way, I prepared to observe the master reach the destination by writing the Pedagogy Direction Essay Secondary text Press Release. + +2. I prepared to judge the smaller parts. I did this by writing the Review of the Breathsonings Essay given the Secondary Text. First, I like breasonings. Second, I like breasoningesquenesses. Third, I like breasdostonings. In this way, I prepared to judge the smaller parts by writing the Review of the Breathsonings Essay given the Secondary Text. + +3. I prepared to connect rebreathsonings and music.
I did this by writing the Review of the Rebreathsonings Essay given the Secondary Text. First, I wrote about the harpsichord. Second, I wrote about the flute. Third, I wrote about the piccolo. In this way, I prepared to connect rebreathsonings and music by writing the Review of the Rebreathsonings Essay given the Secondary Text. + +4. I prepared to connect room and gender. I did this by writing the Review of the Room Essay given the Secondary Text. First, I wrote about the cards. Second, I wrote about the armor. Third, I wrote about homosexuality in Nietzsche's class. In this way, I prepared to connect room and gender by writing the Review of the Room Essay given the Secondary Text. + +5. I prepared to connect part of room and various people. I did this by writing the Review of the Part of Room Essay given the Secondary Text. First, I wrote about you. Second, I wrote about me. Third, I wrote about him. In this way, I prepared to connect part of room and various people by writing the Review of the Part of Room Essay given the Secondary Text. + +6. I prepared to connect the idiom to the direction. I did this by writing the Review of the Direction Essay given the Secondary Text. First, I wrote about geometric types. Second, I wrote about hermeneutic types. Third, I wrote about vocational types. In this way, I prepared to connect the idiom to the direction by writing the Review of the Direction Essay given the Secondary Text. + +7. I prepared to interest Nietzsche's actor. I did this by writing the Review of the Breathsonings Essay Press Release. First, I wrote that everyone is studying gender studies. Second, I wrote that everyone is studying masculinities. Third, I wrote that everyone is studying feminism. In this way, I prepared to interest Nietzsche's actor by writing the Review of the Breathsonings Essay Press Release. + +8. I prepared to sign up for the chivalrous army. I did this by writing the Review of the Rebreathsonings Essay Press Release.
First, I wrote about shields. Second, I wrote about destiny. Third, I wrote about happiness. In this way, I prepared to sign up for the chivalrous army by writing the Review of the Rebreathsonings Essay Press Release. + +9. I prepared to eat each part which I wanted to. I did this by writing the Review of the Room Essay Press Release. First, I ate vegan cheese. Second, I ate the vegan frankfurt. Third, I ate the cherry tomato. In this way, I prepared to eat each part which I wanted to by writing the Review of the Room Essay Press Release. + +10. I prepared to write each letter in its place. I did this by writing the Review of the Part of Room Essay Press Release. First, I wrote about 'p'. Second, I wrote about 'q'. Third, I wrote about 'r'. In this way, I prepared to write each letter in its place by writing the Review of the Part of Room Essay Press Release. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Pranayama 2 of 4.txt b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Pranayama 2 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..7a6bd1dde9d96d2ab0a2c10fabedfff3b122ac54 --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Pranayama 2 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Pranayama 2 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF MEDITATION +by Lucian Green +Pranayama 2 of 4 + +11. I prepared to connect music and direction. I did this by writing the Review of the Direction Essay Press Release. First, I wrote about pianissimos. Second, I wrote about forte. Third, I wrote about crescendo. In this way, I prepared to connect music and direction by writing the Review of the Direction Essay Press Release. + +12. I prepared to teach them how to do it. I did this by writing the Hegel Journal Article. First, I wrote about Hegel.
Second, I wrote about Journals. Third, I wrote about Articles. In this way, I prepared to teach them how to do it by writing the Hegel Journal Article. + +13. I prepared to interest Nietzsche. I did this by writing the First Heidegger and Daoism Journal Article. First, I performed yoga. Second, I was calm. Third, I visited the library and read a book. In this way, I prepared to interest Nietzsche by writing the First Heidegger and Daoism Journal Article. + +14. I prepared to have my essay published in the journal. I did this by writing the Second Heidegger and Daoism Journal Article. First, I wrote the essay. Second, I rebreasoned 50 As. Third, I submitted the essay to the journal. In this way, I prepared to have my essay published in the journal by writing the Second Heidegger and Daoism Journal Article. + +15. I prepared to eat vegan cake. I did this by writing the Hegel Journal Article Press Release. First, I published my article. Second, I heard from the press. Third, I was interviewed. In this way, I prepared to eat vegan cake by writing the Hegel Journal Article Press Release. + +16. I prepared to eat the rich treat. I did this by writing the First Heidegger and Daoism Journal Article Press Release. First, I wrote the press release. Second, I breasoned it out to prevent the big idea cloth appearing. Third, I helped Antonia to Honours. In this way, I prepared to eat the rich treat by writing the First Heidegger and Daoism Journal Article Press Release. + +17. I prepared to be incognito. I did this by writing the Second Heidegger and Daoism Journal Article Press Release. First, I trotted home. Second, I ate the watermelon. Third, I wrote on Heidegger. In this way, I prepared to be incognito by writing the Second Heidegger and Daoism Journal Article Press Release. + +18. I prepared to design the building. I did this by writing the Review of the Hegel Journal Article. First, I said it was interesting. Second, I reviewed it. Third, I helped people to it.
In this way, I prepared to design the building by writing the Review of the Hegel Journal Article. + +19. I prepared to write divertissements. I did this by writing the Review of the First Heidegger and Daoism Journal Article. First, I wrote on leprosy. Second, I wrote on happy chords. Third, I wrote on the mother's success in meditation. In this way, I prepared to write divertissements by writing the Review of the First Heidegger and Daoism Journal Article. + +20. I prepared to reach nirvana (not do anything wrong). I did this by writing the Review of the Second Heidegger and Daoism Journal Article. First, I wrote the stars were forming. Second, I wrote the tides came in. Third, I wrote the colony was settled. In this way, I prepared to reach nirvana (not do anything wrong) by writing the Review of the Second Heidegger and Daoism Journal Article. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Pranayama 3 of 4.txt b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Pranayama 3 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..3c28e07a07a5a2b73ea573e1b4b128387655d1a5 --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Pranayama 3 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Pranayama 3 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF MEDITATION +by Lucian Green +Pranayama 3 of 4 + +21. I prepared to comment on all the information. I did this by writing the Review of the Hegel Journal Article Press Release. First, I wrote about the first kingdom. Second, I prepared to write about the next kingdom. Third, I repeated this until I had written about all the kingdoms. 
In this way, I prepared to comment on all the information by writing the Review of the Hegel Journal Article Press Release. + +22. I prepared to cite the source and continue. I did this by writing the Review of the First Heidegger and Daoism Journal Article Press Release. First, I wrote I spoke on the first day about not doing too much work. Second, I giggled about nothing in particular. Third, I loved you, Lawrence of Arabia. In this way, I prepared to cite the source and continue by writing the Review of the First Heidegger and Daoism Journal Article Press Release. + +23. I prepared to eat enchilada for dessert. I did this by writing the Review of the Second Heidegger and Daoism Journal Article Press Release. First, I stood on the pavement. Second, I recycled the press release. Third, I ate corn chips for breakfast. In this way, I prepared to eat enchilada for dessert by writing the Review of the Second Heidegger and Daoism Journal Article Press Release. + +24. I prepared to eat spinach. I did this by writing Noam Chomsky's probable comments on my Pedagogy essays. First, I critiqued it. Second, I desired it. Third, I wanted it. In this way, I prepared to eat spinach by writing Noam Chomsky's probable comments on my Pedagogy essays. + +25. I prepared to like it. I did this by writing the Press Release for Noam Chomsky's probable comments on my Pedagogy essays. First, I wrote it was effortless. Second, I wrote I loved you. Third, I added 0. In this way, I prepared to like it by writing the Press Release for Noam Chomsky's probable comments on my Pedagogy essays. + +26. I prepared to decide what you wanted to do with the cinema object. I did this by writing Richard Rorty's probable comments on my Pedagogy essays. First, I wrote I like you. Second, I liked it. Third, I liked it again. In this way, I prepared to decide what you wanted to do with the cinema object by writing Richard Rorty's probable comments on my Pedagogy essays. + +27.
I prepared to put my comments in a box. I did this by writing the Press Release for Richard Rorty's probable comments on my Pedagogy essays. First, I wrote it was genius. Second, I wrote it was like Spartacus. Third, I wrote it was like Spinoza. In this way, I prepared to put my comments in a box by writing the Press Release for Richard Rorty's probable comments on my Pedagogy essays. + +28. I prepared to go sick nut. I did this by writing Richard Dawkins' probable comments on my Pedagogy essays. First, I wrote I othered (sic) you. Second, I knew it. Third, I knew it again. In this way, I prepared to go sick nut by writing Richard Dawkins' probable comments on my Pedagogy essays. + +29. I prepared to make it all better again. I did this by writing the Press Release for Richard Dawkins' probable comments on my Pedagogy essays. First, I observed the sister move from the brother. Second, I observed the brother move towards the sister. Third, I knew they wept. In this way, I prepared to make it all better again by writing the Press Release for Richard Dawkins' probable comments on my Pedagogy essays. + +30. I prepared to say bellissimo. I did this by writing Michel Onfray's probable comments on my Pedagogy essays. First, I noticed he wanted to do it. Second, I saw him do it. Third, I moved on. In this way, I prepared to say bellissimo by writing Michel Onfray's probable comments on my Pedagogy essays. 
+ +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Pranayama 4 of 4.txt b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Pranayama 4 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..08ca3e81be42ee12038d5ce19c162e3447ecfe0b --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Pranayama 4 of 4.txt @@ -0,0 +1,31 @@ +["Green, L 2021, Pranayama 4 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF MEDITATION +by Lucian Green +Pranayama 4 of 4 + +31. I prepared to eat out of the hive (the strawberry saucepan). I did this by writing the Press Release for Michel Onfray's probable comments on my Pedagogy essays. First, I related it. Second, I thought of the object clearly. Third, I related it too. In this way, I prepared to eat out of the hive (the strawberry saucepan) by writing the Press Release for Michel Onfray's probable comments on my Pedagogy essays. + +32. I prepared to go home again. I did this by writing Alexius Meinong's probable comments on my Pedagogy essays. First, I wrote it was spitting. Second, I used the umbrella. Third, I went home. In this way, I prepared to go home again by writing Alexius Meinong's probable comments on my Pedagogy essays. + +33. I prepared to laugh. I did this by writing the Press Release for Alexius Meinong's probable comments on my Pedagogy essays. First, I examined pranayama. Second, I looked up the nose. Third, I moistened it. In this way, I prepared to laugh by writing the Press Release for Alexius Meinong's probable comments on my Pedagogy essays. + +34. I prepared to write it was included, 'that' as well. I did this by writing Martha Nussbaum's probable comments on my Pedagogy essays. 
First, I wrote Doug was attracted. Second, I was happy with the positivity of this. Third, I wrote that it was round table time. In this way, I prepared to write it was included, 'that' as well by writing Martha Nussbaum's probable comments on my Pedagogy essays. + +35. I prepared to put a hat on you. I did this by writing the Press Release for Martha Nussbaum's probable comments on my Pedagogy essays. First, I wrote on my 50s. Second, I loved you. Third, I loved it forever. In this way, I prepared to put a hat on you by writing the Press Release for Martha Nussbaum's probable comments on my Pedagogy essays. + +36. I prepared to write a quotation mark after the word. I did this by writing Noam Chomsky's probable comments on my Pedagogy blog. First, I wrote I delimited it. Second, I wrote I didn't say it too much. Third, I wrote I didn't say too much. In this way, I prepared to write a quotation mark after the word by writing Noam Chomsky's probable comments on my Pedagogy blog. + +37. I prepared to move away from everything, with me as good. I did this by writing the Press Release for Noam Chomsky's probable comments on my Pedagogy blog. First, I wrote a delimiter. Second, I made up PhDs. Third, I wrote that I would have to make up PhDs if it was a review. In this way, I prepared to move away from everything, with me as good, by writing the Press Release for Noam Chomsky's probable comments on my Pedagogy blog. + +38. I prepared to go to heaven (the office). I did this by writing Richard Rorty's probable comments on my Pedagogy blog. First, I wrote they were doing mixtures. Second, I wrote even though they were doing mixtures they thought it was acceptable. Third, I put it down. In this way, I prepared to go to heaven (the office) by writing Richard Rorty's probable comments on my Pedagogy blog. + +39. I prepared to say that it was in that case only. I did this by writing the Press Release for Richard Rorty's probable comments on my Pedagogy blog.
First, I wrote Lucian's presence was so light they wanted more of it. Second, I reviewed it instead. Third, I wrote two reviews. In this way, I prepared to say that it was in that case only by writing the Press Release for Richard Rorty's probable comments on my Pedagogy blog. + +40. I prepared to ask, 'Who's that?' I did this by writing Richard Dawkins' probable comments on my Pedagogy blog. First, I cleared away the items in the middle. Second, I wrote it was a formalism. Third, I wrote it was him. In this way, I prepared to ask, 'Who's that?' by writing Richard Dawkins' probable comments on my Pedagogy blog. + +41. I prepared to write, 'You're making me so important'. I did this by writing the Press Release for Richard Dawkins' probable comments on my Pedagogy blog. First, I wrote this was all foisted on the person. Second, I wrote, 'What's the point?' Third, I wrote, 'I don't need you' instead. In this way, I prepared to write, 'You're making me so important' by writing the Press Release for Richard Dawkins' probable comments on my Pedagogy blog. + +42. I prepared to display honesty. I did this by writing Michel Onfray's probable comments on my Pedagogy blog. First, I wrote 'Who's that?' Second, I wrote, 'Never mind, you'. Third, I wrote that a white thing appeared. In this way, I prepared to display honesty by writing Michel Onfray's probable comments on my Pedagogy blog. + +Soma + +1.
"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Purusha 1 of 4.txt b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Purusha 1 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..2a653d612f672dba246c5cf19c8c704fbbf25af3 --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Purusha 1 of 4.txt @@ -0,0 +1,28 @@ +["Green, L 2021, Purusha 1 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF MEDITATION +by Lucian Green +Purusha 1 of 4 + +Lucians, or Lucianic Monastics, meditate for world peace. The argument's philosophy is to leave 'nothing' unexplored inside the purusha-like shell. + + +1a. I prepared to adorn myself with the lotus flower. I did this by picking the lotus flower. First, I walked to the pond. Second, I positioned the secateurs below the lotus flower. Third, I cut it from the plant. In this way, I prepared to adorn myself with it by picking it. + +2. I prepared to like you. I did this by offering my hand. First, I lifted my hand. Second, I held your hand. Third, I lifted your hand. In this way, I prepared to like you by offering my hand. + +3. I prepared to like myself. I did this by eating the coconut from its shell. First, I became a monastic. Second, I created the church's spiritual life with God. Third, I brought world peace to the masses. In this way, I prepared to like myself by eating the coconut from its shell. + +4. I prepared to love everyone. I did this by cutting the banana. First, I cut the banana off the tree. Second, I cut its skin off. Third, I cut it in half. In this way, I prepared to love everyone by cutting the banana. + +5. I prepared to love you to everyone. I did this by writing the manifesto about a lock.
First, I attended a theological school. Second, I used my theological training to guide people. Third, I taught their friends as well. + +6. I prepared to love myself to everything. I did this by watering the medicinal plant instead of relying on luck. First, I filled the watering cylinder with water. Second, I positioned it over the plant's leaves. Third, I watered its leaves. In this way, I prepared to love myself to everything by watering the medicinal plant instead of relying on luck. + +7. I prepared to say my conclusion. I did this by hitting the drum. First, I held the stick. Second, I positioned it above the drum. Third, I struck the drum with the stick. In this way, I prepared to say my conclusion by hitting the drum. + +8. I prepared to love you to me. I did this by accepting a donation. First, I placed the box in the hall. Second, I pointed it out to the meditator. Third, the meditator placed the donation in the box. In this way, I prepared to love you to me by accepting a donation. + +9. I prepared to love you to you. I did this by writing a sacred text. First, I wrote a line. Second, I listened to my mother repeat it. Third, I held my mother to her point. In this way, I prepared to love you to you by writing a sacred text. + +10. I prepared to love everyone to you. I did this by drinking the holy water. First, I blessed the water. Second, I filled each cup with water. Third, I drank the water in my cup. In this way, I prepared to love everyone to you by drinking the holy water.
+ +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Purusha 2 of 4.txt b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Purusha 2 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..18f025b3849e322488ec5222994b25a790100e47 --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Purusha 2 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Purusha 2 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF MEDITATION +by Lucian Green +Purusha 2 of 4 + +11. I prepared to love everyone to me. I did this by jogging on the pavement. First, I put on my running clothes and shoes. Second, I walked onto the pavement. Third, I ran on the pavement. In this way, I prepared to love everyone to me by jogging on the pavement. + +12. I prepared to love everyone to you too. I did this by calculating how long it took to eat the store of apples. First, I set the sand timer at 0. Second, I timed how long it took to eat the apple. Third, I multiplied this time by the number of apples. In this way, I prepared to love everyone to you too by calculating how long it took to eat the store of apples. + +13. I prepared to love everyone to me too. I did this by eating the guava. First, I halved the guava. Second, I scooped a spoonful of guava and ate it. Third, I repeated the second step until I had finished eating the guava. In this way, I prepared to love everyone to me too by eating the guava. + +14. I prepared to love everyone to everyone. I did this by eating the pears with everyone. First, I cooked the pears. Second, I placed them into bowls. Third, I ate my pear with everyone. In this way, I prepared to love everyone to everyone by eating the pears with everyone. + +15.
I prepared to encourage vegans. I did this by loving everyone to sausage eaters. First, I decided to love. Second, I loved everyone. Third, I loved everyone to sausage eaters. In this way, I prepared to encourage vegans by loving everyone to sausage eaters. + +16. I prepared to become the new God (leader). I did this by loving God (the leader). First, I found God (the leader). Second, I loved her. Third, I followed her. In this way, I prepared to become the new leader by loving God (the leader). + +17. I prepared to remain in heaven (bliss). I did this by loving my partner. First, I found my partner. Second, I loved my partner. Third, I held on to my partner. In this way, I prepared to remain in heaven (bliss) by loving my partner. + +18. I prepared to state that W.S. stood for white (a movie key term) and sorry (a music key term). I did this by loving my family member. First, I found Methuselah. Second, I found him on high. Third, I moved on. In this way, I prepared to state that W.S. stood for white (a movie key term) and sorry (a music key term) by loving my family member. + +19. I prepared to think of the land of the magpie. I did this by drawing the magpie. First, I examined the magpie. Second, I drew her head. Third, I drew her body. In this way, I prepared to think of the land of the magpie by drawing the magpie. + +20. I prepared to act on the day (in the shoot). I did this by acting on the day. First, I found the date and time of the call. Second, I found the place of the call. Third, I acted on the day. In this way, I prepared to act on the day (in the shoot) by acting on the day. 
+ +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Purusha 3 of 4.txt b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Purusha 3 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..fd1042a06aa455a0658ad4f798fdfca0fa5e0377 --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Purusha 3 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Purusha 3 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF MEDITATION +by Lucian Green +Purusha 3 of 4 + +21. I prepared to eat the scone with the Idiot. I did this by stating that I am purusha (universal man). First, I liked you. Second, I liked the cosmos. Third, I became a universal man. In this way, I prepared to eat the scone with the Idiot by stating that I am purusha (universal man). + +22. I prepared to like everything about the other purushas. I did this by liking them. First, I walked into the dormitory. Second, I introduced myself to the three purushas. Third, I held a conversation with them. In this way, I prepared to like everything about the other purushas by liking them. + +23. I prepared to go back to black. I did this by being given a black lamb. First, I went to market. Second, I bought a black lamb. Third, I cuddled it at home. In this way, I prepared to go back to black by being given a black lamb. + +24. I prepared to earn the degree. I did this by earning the required As. First, I asked Plato. Second, I asked the Heads of State. Third, I asked the religious leader (the man). In this way, I prepared to earn the degree by earning the required As. + +25. I prepared to write the aphor symbol, or section symbol. I did this by playing with the cricket. First, I played with the cricket. 
Second, I let him hop out of my hand. Third, I noticed him with his friend. In this way, I prepared to write the aphor symbol, or section symbol by playing with the cricket. + +26. I prepared to taste test the new food. I did this by licking the taste tab. First, I found the blueberry taste tab. Second, I held it to my tongue. Third, I licked it. In this way, I prepared to taste test the new food by licking the taste tab. + +27. I prepared to make the card gazebo. I did this by inserting the tab. First, I detached the card gazebo along the perforated lines. Second, I folded it along the creases. Third, I inserted the tab. In this way, I prepared to make the card gazebo by inserting the tab. + +28. I prepared to synthesise Hegel's idea with his main idea. I did this by placing my thesis online. First, I breasoned out 50 As. Second, I applied to an online journal. Third, I placed my thesis online. In this way, I prepared to synthesise Hegel's idea with his main idea by placing my thesis online. + +29. I prepared to fill my presentation with the ideas of Heidegger and Laozi. I did this by placing my presentation online. First, I wrote my presentation. Second, I placed it online. Third, I viewed it. In this way, I prepared to fill my presentation with the ideas of Heidegger and Laozi by placing my presentation online. + +30. I prepared to breason out the gems. I did this by writing the required number of breasonings. First, I breasoned out quartz. Second, I breasoned out ruby. Third, I breasoned out lapis lazuli. In this way, I prepared to breason out the gems by writing the required number of breasonings. 
+ +

"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Purusha 4 of 4.txt b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Purusha 4 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..5ac044a9865aafa07083447fb7e53370ab80185f --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Purusha 4 of 4.txt @@ -0,0 +1,9 @@ +["Green, L 2021, Purusha 4 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF MEDITATION
+by Lucian Green
+Purusha 4 of 4
+
+31. I prepared to examine my existence with light. I did this by writing that the Lucian mantra should be repeated today. First, I lit up my mind. Second, I lit up my body. Third, I lit up my world. In this way, I prepared to examine my existence with light by writing that the Lucian mantra should be repeated today. + +32. I prepared to bring beings to life. I did this by practicing the Green sutra. First, I examined the void. Second, I examined meditation. Third, I examined pedagogy. In this way, I prepared to bring beings to life by practicing the Green sutra. 
+ +

"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Soma 1 of 4.txt b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Soma 1 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..e6dffbdd901cf91bccf9b11c8ee53928b0a4fc94 --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Soma 1 of 4.txt @@ -0,0 +1,27 @@ +["Green, L 2021, Soma 1 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF MEDITATION
+by Lucian Green
+Soma 1 of 4
+
+Soma, the spiritual liquid in the digestive system, must be spiritually drunk by God to prevent digestive system pops from practising the sutra before practising the sutra each day. It is often written about in terms of food that gives the meditator energy to work each day. + +1a. I prepared to do something again. I did this by writing the Press Release for Michel Onfray's probable comments on the Pedagogy blog. First, I wrote that too many cooks spoiled the broth. Second, I wrote I avoided them. Third, I did it again. In this way, I prepared to do something again by writing the Press Release for Michel Onfray's probable comments on the Pedagogy blog. + +2. I prepared to be the best. I did this by writing Alexius Meinong's probable comments on the Pedagogy blog. First, I wrote many children's books. Second, I liked you. Third, I identified the brain thought. In this way, I prepared to be the best by writing Alexius Meinong's probable comments on the Pedagogy blog. + +3. I prepared to write that that was because of thinking so clearly of the philosopher rather than the worthless numbers game. I did this by writing the Press Release for Alexius Meinong's probable comments on the Pedagogy blog. First, I wrote I liked things. 
Second, I wrote 'Never, you!' Third, I made a distinction. In this way, I prepared to write that that was because of thinking so clearly of the philosopher rather than the worthless numbers game by writing the Press Release for Alexius Meinong's probable comments on the Pedagogy blog. + +4. I prepared to eat processed excrement. I did this by writing Martha Nussbaum's probable comments on the Pedagogy blog. First, I dodged the bullet. Second, I stated my gender. Third, I loved you. In this way, I prepared to eat processed excrement by writing Martha Nussbaum's probable comments on the Pedagogy blog. + +5. I prepared to love Nietzsche. I did this by writing the Press Release for Martha Nussbaum's probable comments on the Pedagogy blog. First, I ate processed excrement again. Second, I processed the cheese. Third, I watched Nietzsche eat the processed excrement. In this way, I prepared to love Nietzsche by writing the Press Release for Martha Nussbaum's probable comments on the Pedagogy blog. + +6. I prepared to eat processed feces. I did this by writing Noam Chomsky's probable comments on the Pedagogy indicators. First, I indicated love. Second, I liked you. Third, I helped you. In this way, I prepared to eat processed feces by writing Noam Chomsky's probable comments on the Pedagogy indicators. + +7. I prepared to eat the carrot. I did this by writing the Press Release for Noam Chomsky's probable comments on the Pedagogy indicators. First, I ate the recycled garbage. Second, I went on the cycling machine. Third, I went for a run. In this way, I prepared to eat the carrot by writing the Press Release for Noam Chomsky's probable comments on the Pedagogy indicators. + +8. I prepared to siccen (sic) myself. I did this by writing Richard Rorty's probable comments on the Pedagogy indicators. First, I ate happies. Second, I ate fun. Third, I ate goods. 
In this way, I prepared to siccen (sic) myself by writing Richard Rorty's probable comments on the Pedagogy indicators. + +9. I prepared to eat the peanut butter. I did this by writing the Press Release for Richard Rorty's probable comments on the Pedagogy indicators. First, I became an Emeritus Professor. Second, I ate the onion. Third, I ate the garlic. In this way, I prepared to eat the peanut butter by writing the Press Release for Richard Rorty's probable comments on the Pedagogy indicators. + +10. I prepared to lick the lolly. I did this by writing Richard Dawkins' probable comments on the Pedagogy indicators. First, I wrote that he asked what is the point of pedagogy again. Second, I rolled in it. Third, I intoxicated myself. In this way, I prepared to lick the lolly by writing Richard Dawkins' probable comments on the Pedagogy indicators. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Soma 2 of 4.txt b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Soma 2 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..064b042ceb7b46fffa8fafc2ee49b0930177d8ad --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Soma 2 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Soma 2 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF MEDITATION +by Lucian Green +Soma 2 of 4 + +11. I prepared to eat chocolate. I did this by writing the Press Release for Richard Dawkins' probable comments on the Pedagogy indicators. First, I tried fried tofu. Second, I really loved you. Third, I loved him. In this way, I prepared to eat chocolate by writing the Press Release for Richard Dawkins' probable comments on the Pedagogy indicators. + +12. I prepared to like Pedagogy. 
I did this by writing Michel Onfray's probable comments on the Pedagogy indicators. First, I asked 'What's that?' Second, I ate tofu meats. Third, I distinctly loved you. In this way, I prepared to like Pedagogy by writing Michel Onfray's probable comments on the Pedagogy indicators. + +13. I prepared to love God (the master). I did this by writing the Press Release for Michel Onfray's probable comments on the Pedagogy indicators. First, I wrote 'Oh that'. Second, I wrote that it is good. Third, I wrote that it is doubly good. In this way, I prepared to love God (the master) by writing the Press Release for Michel Onfray's probable comments on the Pedagogy indicators. + +14. I prepared to call them positive. I did this by writing Alexius Meinong's probable comments on the Pedagogy indicators. First, I wrote I ignored them. Second, I played fun games. Third, I rolled the dice. In this way, I prepared to call them positive by writing Alexius Meinong's probable comments on the Pedagogy indicators. + +15. I prepared to introduce the families to each other. I did this by writing the Press Release for Alexius Meinong's probable comments on the Pedagogy indicators. First, I liked you. Second, I liked him as a police officer. Third, I had him over for dinner. In this way, I prepared to introduce the families to each other by writing the Press Release for Alexius Meinong's probable comments on the Pedagogy indicators. + +16. I prepared to dine. I did this by writing Martha Nussbaum's probable comments on the Pedagogy indicators. First, I wrote on Stravinsky. Second, I helped him up. Third, I played strings. In this way, I prepared to dine by writing Martha Nussbaum's probable comments on the Pedagogy indicators. + +17. I prepared to eat with you. I did this by writing the Press Release for Martha Nussbaum's probable comments on the Pedagogy indicators. First, I wrote that it itself was good. Second, I liked you. Third, I showed love to you. 
In this way, I prepared to eat with you by writing the Press Release for Martha Nussbaum's probable comments on the Pedagogy indicators. + +18. I prepared to write 'Do I know you?' I did this by writing Noam Chomsky's probable comments on the Pedagogy section on Lucianpedia. First, I gave you a love heart. Second, I threw it out. Third, I stamped on it. In this way, I prepared to write 'Do I know you?' by writing Noam Chomsky's probable comments on the Pedagogy section on Lucianpedia. + +19. I prepared to look in the snack box. I did this by writing the Press Release for Noam Chomsky's probable comments on the Pedagogy section on Lucianpedia. First, I wrote about you. Second, I wanted to do it. Third, I asked what the connections were. In this way, I prepared to look in the snack box by writing the Press Release for Noam Chomsky's probable comments on the Pedagogy section on Lucianpedia. + +20. I prepared to write a plus sign about you. I did this by writing Richard Rorty's probable comments on the Pedagogy section on Lucianpedia. First, I wrote I am Michael Mouse. Second, I wrote on the dorbuchers (sic). Third, I wrote on the knickerbocker glories. In this way, I prepared to write a plus sign about you by writing Richard Rorty's probable comments on the Pedagogy section on Lucianpedia. 
+ +

"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Soma 3 of 4.txt b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Soma 3 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..1dd5af00f4fafb4fe3242c9dabe9d0cdc057a64b --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Soma 3 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Soma 3 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF MEDITATION
+by Lucian Green
+Soma 3 of 4
+
+21. I prepared to dig in. I did this by writing the Press Release for Richard Rorty's probable comments on the Pedagogy section on Lucianpedia. First, I wrote I wanted more. Second, I received more. Third, I listened to it in the music. In this way, I prepared to dig in by writing the Press Release for Richard Rorty's probable comments on the Pedagogy section on Lucianpedia. + +22. I prepared to see everyone soon. I did this by writing Richard Dawkins' probable comments on the Pedagogy section on Lucianpedia. First, I wrote 'the'. Second, I understood it in a way from the repetition. Third, I kept them both as friends. In this way, I prepared to see everyone soon by writing Richard Dawkins' probable comments on the Pedagogy section on Lucianpedia. + +23. I prepared to eat with the chopsticks. I did this by writing the Press Release for Richard Dawkins' probable comments on the Pedagogy section on Lucianpedia. First, I asked what the breasonings details were. Second, I hugged you. Third, I moved on. In this way, I prepared to eat with the chopsticks by writing the Press Release for Richard Dawkins' probable comments on the Pedagogy section on Lucianpedia. + +24. I prepared to write on all things. 
I did this by writing Michel Onfray's probable comments on the Pedagogy section on Lucianpedia. First, I wrote that writing on the self and other was developed. Second, I found myself writing on Daoism. Third, I found myself writing on Continental Philosophy. In this way, I prepared to write on all things by writing Michel Onfray's probable comments on the Pedagogy section on Lucianpedia. + +25. I prepared to write about the main topic. I did this by writing the Press Release for Michel Onfray's probable comments on the Pedagogy section on Lucianpedia. First, I ordered the self and other argument. Second, I wrote the algorithm. Third, I connected the algorithm and the breasonings through the argument. In this way, I prepared to write about the main topic by writing the Press Release for Michel Onfray's probable comments on the Pedagogy section on Lucianpedia. + +26. I prepared to say metaphor was interesting to Plato. I did this by writing Alexius Meinong's probable comments on the Pedagogy section on Lucianpedia. First, I wrote that I was a transsexual. Second, I loved you. Third, I presented the flower to Meinong. In this way, I prepared to say metaphor was interesting to Plato by writing Alexius Meinong's probable comments on the Pedagogy section on Lucianpedia. + +27. I prepared to turn it off. I did this by writing the Press Release for Alexius Meinong's probable comments on the Pedagogy section on Lucianpedia. First, I wrote how it was ecstatic. Second, I drank the granita. Third, I turned the granite. In this way, I prepared to turn it off by writing the Press Release for Alexius Meinong's probable comments on the Pedagogy section on Lucianpedia. + +28. I prepared to commence my PhD-like Masters. I did this by writing Martha Nussbaum's probable comments on the Pedagogy section on Lucianpedia. First, I wrote I had so much fun. Second, I wrote it was the best ever. Third, I wrote that Dr Who was based on Hello, Speaker. 
In this way, I prepared to commence my PhD-like Masters by writing Martha Nussbaum's probable comments on the Pedagogy section on Lucianpedia. + +29. I prepared to come close and have fun. I did this by writing the Press Release for Martha Nussbaum's probable comments on the Pedagogy section on Lucianpedia. First, I wrote naturally. Second, I wrote you are a nice person. Third, I wrote she was a professor. In this way, I prepared to come close and have fun by writing the Press Release for Martha Nussbaum's probable comments on the Pedagogy section on Lucianpedia. + +30. I prepared to wear sunglasses close to the graduation ceremony. I did this by writing Noam Chomsky's probable comments on the Medicine essays. First, I came back with the song. Second, I graduated with Honours. Third, I liked it. In this way, I prepared to wear sunglasses close to the graduation ceremony by writing Noam Chomsky's probable comments on the Medicine essays. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Soma 4 of 4.txt b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Soma 4 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..796724a7bce73c1595a88b67d3ced5b8c6ba41f8 --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Soma 4 of 4.txt @@ -0,0 +1,29 @@ +["Green, L 2021, Soma 4 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF MEDITATION +by Lucian Green +Soma 4 of 4 + +31. I prepared to downplay the climax. I did this by writing the Press Release for Noam Chomsky's probable comments on the Medicine essays. First, I wrote one was correct. Second, I wrote one was incorrect. Third, I pressed one button or another to insert the phrase with or without 'the' at the end. 
In this way, I prepared to downplay the climax by writing the Press Release for Noam Chomsky's probable comments on the Medicine essays. + +32. I prepared to perform the spelling adjustments in the menu. I did this by writing Richard Rorty's probable comments on the Medicine essays. First, I wrote about how everything in recording Peach OST went as planned. Second, I enjoyed the reception. Third, I enjoyed the review. In this way, I prepared to perform the spelling adjustments in the menu by writing Richard Rorty's probable comments on the Medicine essays. + +33. I prepared to debate whether to comment because the content was more important than mine. I did this by writing the Press Release for Richard Rorty's probable comments on the Medicine essays. First, I wrote about the ABC presenter with a smooth tone. Second, I wrote about my prep teacher. Third, I indicated whether it was interesting. In this way, I prepared to debate whether to comment because the content was more important than mine by writing the Press Release for Richard Rorty's probable comments on the Medicine essays. + +34. I prepared to visit the real colonial outpost. I did this by writing Richard Dawkins' probable comments on the Medicine essays. First, I debated that it was totally normal. Second, I debated it. Third, I approved of it. In this way, I prepared to visit the real colonial outpost by writing Richard Dawkins' probable comments on the Medicine essays. + +35. I prepared to offer Meisner in a breasoned out form. I did this by writing the Press Release for Richard Dawkins' probable comments on the Medicine essays. First, I was interested in the meditation (philosophy) group leader. Second, I wrote the breasoning degrees. Third, I offered them harmlessly. In this way, I prepared to offer Meisner in a breasoned out form by writing the Press Release for Richard Dawkins' probable comments on the Medicine essays. + +36. I prepared to write Medicine essays and write probable comments on them. 
I did this by writing Michel Onfray's probable comments on the Medicine essays. First, I put aside '1'. Second, I didn't get stuck. Third, I deeply loved you. In this way, I prepared to write Medicine essays and write probable comments on them by writing Michel Onfray's probable comments on the Medicine essays. + +37. I prepared to write 50 As on each topic I wanted to write on. I did this by writing the Press Release for Michel Onfray's probable comments on the Medicine essays. First, I wrote genius. Second, I wrote mad. Third, I wrote you. In this way, I prepared to write 50 As on each topic I wanted to write on by writing the Press Release for Michel Onfray's probable comments on the Medicine essays. + +38. I prepared to look in the light globe and see that it is a normal light globe. I did this by writing Alexius Meinong's probable comments on the Medicine essays. First, I wrote on pure famousness. Second, I nibbled on the other's earlobes. Third, I lived the high life. In this way, I prepared to look in the light globe and see that it is a normal light globe by writing Alexius Meinong's probable comments on the Medicine essays. + +39. I prepared to write that I enjoyed myself. I did this by writing the Press Release for Alexius Meinong's probable comments on the Medicine essays. First, I asked about the magic tricks. Second, I asked about the water. Third, I wrote I asked could I have fun instead? In this way, I prepared to write that I enjoyed myself by writing the Press Release for Alexius Meinong's probable comments on the Medicine essays. + +40. I prepared to go back to the question of what a breasoning was. I did this by writing Martha Nussbaum's probable comments on the Medicine essays. First, I wrote my medicine degrees. Second, I put them together as breasonings. Third, I put them on the system and charged people for them. 
In this way, I prepared to go back to the question of what a breasoning was by writing Martha Nussbaum's probable comments on the Medicine essays. + +41. I prepared to listen to Ubu Roi. I did this by writing the Press Release for Martha Nussbaum's probable comments on the Medicine essays. First, I wrote about the screams. Second, I listened to the other node. Third, I listened to Mr Murphy say nothing. In this way, I prepared to listen to Ubu Roi by writing the Press Release for Martha Nussbaum's probable comments on the Medicine essays. + +42. I prepared to be famous about Chomsky and suggest socialism. I did this by writing Noam Chomsky's probable comments on the Medicine blog. First, I wrote the philosophers' comment idea was faster. Second, I was famous. Third, I made money. In this way, I prepared to be famous about Chomsky and suggest socialism by writing Noam Chomsky's probable comments on the Medicine blog. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Upasana Sutra 1 of 4.txt b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Upasana Sutra 1 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..f226aa53213c20dadd42946890ba26f08a4a2326 --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Upasana Sutra 1 of 4.txt @@ -0,0 +1,29 @@ +["Green, L 2021, Upasana Sutra 1 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF MEDITATION
+by Lucian Green
+Upasana Sutra 1 of 4
+
+
+1. Upasana, meaning meditation, is the sutra that first allows one to see high quality imagery and write on pedagogy. 
Pedagogy arguments may be written on meditation, medicine, pedagogy, etc., and eventually lead to the cycle of business in these and starting new forms of meditation. + +1a. I prepared to love the historians. I did this by kissing you. First, I put lipstick on my upper lip. Second, I put lipstick on my lower lip. Third, I gave you a kiss. In this way, I prepared to love the historians by kissing you. + +2. I prepared to love being in you. I did this by licking a heart-shaped ice-cream. First, I licked from the bottom point of the heart to the top-left hand point of the heart ice-cream. Second, I licked from the bottom point of the heart to the top-right hand point of the heart ice-cream. Third, I finished off the ice-cream. + +3. I prepared to be happy for the moment. I did this by eating the edible camouflage tape. First, I unwound and ate 0.01 metres of the edible camouflage tape. Second, I prepared to repeat unwinding and eating 0.01 metres of the edible camouflage tape. Third, I repeated this until I had eaten 0.05 metres of the edible camouflage tape. In this way, I prepared to be happy for the moment by eating the edible camouflage tape. + +4. I prepared to love delights. I did this by eating the Jelly-Emeritus Professor. First, I ate the left half of the Jelly-Emeritus Professor. Second, I ate the right half of the Jelly-Emeritus Professor. Third, I ate the head of the Jelly-Emeritus Professor. In this way, I prepared to love delights by eating the Jelly-Emeritus Professor. + +5. I prepared to give something to somebody by boarding a vehicle. I did this by eating the cream in an éclair. First, I knifed the join on the edge of the éclair. Second, I opened up the éclair. Third, I snuffed up the cream in the éclair. In this way, I prepared to give something to somebody by boarding a vehicle by eating the cream in an éclair. + +6. I prepared to open the wooden box. I did this by squelching the jelly out of the jelly mold. 
First, I mixed the jelly crystals with the water. Second, I poured the mixture into the mold and let it set overnight. Third, I ran the fork over the jelly to squelch it out of the jelly mold. In this way, I prepared to open the wooden box by squelching the jelly out of the jelly mold. + +7. I prepared to go for a swim. I did this by pumping up the tyre. First, I inserted the air pipe into the tyre. Second, I started pumping the tyre up. Third, I stopped pumping up the tyre when it was fully pumped up. In this way, I prepared to go for a swim by pumping up the tyre. + +8. I prepared to test that the zygote's body grew from its head. I did this by fitting the tyre to the axle. First, I placed the first clamp on the axle. Second, I placed the tyre on the axle. Third, I placed the second clamp on the axle. In this way, I prepared to test that the zygote's body grew from its head by fitting the tyre to the axle. + +9. I prepared to create a lobster from a metal frame. I did this by turning the screw. First, I inserted the screwdriver into the screw. Second, I started turning the screw. Third, I stopped turning the screw when it had turned as far as it could turn. In this way, I prepared to create a lobster from a metal frame by turning the screw. + +10. I prepared to make an orange icy pole. I did this by squeezing the orange into the mold. First, I cut the orange in half. Second, I positioned the orange above the icy pole mold. Third, I squeezed it into it. In this way, I prepared to make an orange icy pole by squeezing the orange into the mold. 
+ +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Upasana Sutra 2 of 4.txt b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Upasana Sutra 2 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..43ee6632bfb4092372bbf6271392abb8ef173092 --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Upasana Sutra 2 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Upasana Sutra 2 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF MEDITATION +by Lucian Green +Upasana Sutra 2 of 4 + +11. I prepared to verify what I wanted to verify the item was the same as. I did this by verifying one side of the table. First, I looked at the eukaryote. Second, I realised it was a eukarya member. Third, I checked for it in the left hand column. In this way, I prepared to verify what I wanted to verify the item was the same as by checking one side of the table. + +12. I prepared to verify the item that I wanted to verify was the same as another item. I did this by verifying the other side of the table. First, I looked for the table cell. Second, I looked for the column heading for this cell. Third, I checked whether the contents of the cell were the same as the content of the cell I was checking. In this way, I prepared to verify the item that I wanted to verify was the same as another item by verifying the other side of the table. + +13. I prepared to find the phylogenetic tree's branch's label that I wanted. I did this by verifying that it matched up with what I wanted. First, I verified the real smell of orange juice. Second, I tested the necessary smell of orange juice. Third, I verified that they matched. 
In this way, I prepared to find the phylogenetic tree's branch's label that I wanted by verifying that it matched up with what I wanted. + +14. I prepared to load the custom-built vehicle rack with the suit. I did this by verifying that the tuxedos were neat. First, I neatened the labels. Second, I neatened the lapels. Third, I neatened the artificial carnations. In this way, I prepared to load the custom-built vehicle rack with the suit by verifying that the tuxedos were neat. + +15. I prepared to investigate the Loop of Henle metaphor. I did this by giving out the tulip. First, I fingered the side of the tulip. Second, I moved my finger down the tulip. Third, I moved my finger up the other side of the tulip. In this way, I prepared to investigate the Loop of Henle metaphor by giving out the tulip. + +16. I prepared to squelch the pink juice into the mold. I did this by running the race. First, I started using energy. Second, I prepared to use the next amount of energy. Third, I finished using energy when I had finished running the race. In this way, I prepared to squelch the pink juice into the mold by running the race. + +17. I prepared to spoon the jelly delightfully. I did this by putting the orange jelly mixture in the refrigerator. First, I opened the door wide. Second, I popped the mixture in the door. Third, I made sure it wouldn't fall out. In this way, I prepared to spoon the jelly delightfully by putting the orange jelly mixture in the refrigerator. + +18. I prepared to say I wanted you. I did this by reading the answer. First, I opened up the book of ideas about you. Second, I found the right page. Third, I read that I want you. In this way, I prepared to say I wanted you by reading the answer. + +19. I prepared to want to communicate with you. I did this by finding the envelope. First, I opened the writing desk. Second, I found the correct pigeonhole. Third, I took out the thick envelope. 
In this way, I prepared to want to communicate with you by finding the envelope. + +20. I prepared to want your reply. I did this by listening to the answer. First, I looked at you. Second, I waited for your reply. Third, I was interested in your reply as you gave it. In this way, I prepared to want your reply by listening to the answer. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Upasana Sutra 3 of 4.txt b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Upasana Sutra 3 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..34428847ae9b06c50a1044af0bc0eb9ee135e1ce --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Upasana Sutra 3 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Upasana Sutra 3 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF MEDITATION +by Lucian Green +Upasana Sutra 3 of 4 + +21. I prepared to earn genius marks. I did this by spelling out the mathematics equation. First, I spelt out the equation. Second, I instructed the computer how to do it. Third, I instructed the computer to run the program. In this way, I prepared to earn genius marks by spelling out the mathematics equation. + +22. I prepared to write down what was necessary to work out the equation. I did this by substituting the correct parts for the variables in the equation. First, I substituted the value corresponding to the first variable for the first variable in the equation. Second, I prepared to repeat this for the next variable. Third, I repeated this until all the variables had been substituted for values. In this way, I prepared to write down what was necessary to work out the equation by substituting the correct parts for the variables in the equation. + +23. 
I prepared to move my arm in space. I did this by calculating the result. First, I calculated the results of the brackets. Second, I calculated all the multiplications, divisions, additions and subtractions in the equation in that order. Third, I repeated this until I had calculated the result. In this way, I prepared to move my arm in space by calculating the result. + +24. I prepared to write the line. I did this by buying the pencil sharpener. First, I took out the money. Second, I placed it on the counter. Third, I took the change. In this way, I prepared to write the line by buying the pencil sharpener. + +25. I prepared to verify that you wanted the present. I did this by putting the present in your hand. First, I took the present from the wardrobe. Second, I asked you to close your eyes. Third, I placed the present in your hands. In this way, I prepared to verify that you wanted the present by putting the present in your hand. + +26. I prepared to clean the desk. I did this by emptying the shavings through the hole of the pencil sharpener. First, I placed the pencil sharpener above the bin. Second, I tipped the pencil sharpener upside down. Third, I tapped it. In this way, I prepared to clean the desk by emptying the shavings through the hole of the pencil sharpener. + +27. I prepared to inspect the animal mannequin. I did this by walking Inky. First, I called Inky. Second, I placed the leash on her. Third, I walked her. In this way, I prepared to inspect the animal mannequin by walking Inky. + +28. I prepared to eat the vegan delight. I did this by eating the slice of watermelon. First, I cut the slice of watermelon. Second, I lifted it to my lips. Third, I ate the coarse textured, water-saturated fruit. In this way, I prepared to eat the vegan delight by eating the slice of watermelon. + +29. I prepared to make a profit. I did this by computing data science using statistics. First, I considered what I wanted. Second, I surveyed what was available. 
Third, I bought what I needed. In this way, I prepared to make a profit by computing data science using statistics. + +30. I prepared to agree to pay a certain amount per breasoning, as part of base pay. I did this by inventing a future job. First, I employed the equality officer. Second, I employed the green officer. Third, I employed the pedagogy officer to pedagogise (sic) the whole neighbourhood. In this way, I prepared to agree to pay a certain amount per breasoning, as part of base pay, by inventing a future job. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Upasana Sutra 4 of 4.txt b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Upasana Sutra 4 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..2e3ddec5be592ffff705242eccbea99ab722e3a0 --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Upasana Sutra 4 of 4.txt @@ -0,0 +1,29 @@ +["Green, L 2021, Upasana Sutra 4 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF MEDITATION +by Lucian Green +Upasana Sutra 4 of 4 + +31. I prepared to send the letter to the Queen. I did this by licking the stamp. First, I detached the stamp from the perforated sheet. Second, I licked it. Third, I attached it to the envelope. In this way, I prepared to send the letter to the Queen by licking the stamp. + +32. I prepared to go forward. I did this by liking Samadhi. First, I drove along the highway. Second, I stopped at the side of the road. Third, I lit a campfire. In this way, I prepared to go forward by liking Samadhi. + +33. I prepared to countermand the instruction. I did this by driving to work. First, I performed my morning regimen. Second, I drove to work. 
Third, I walked into the office. In this way, I prepared to countermand the instruction by driving to work. + +34. I prepared to think of the central thought clearly. I did this by deciding whether to buy in philosophy of data science. First, I saw the product. Second, I decided to buy it. Third, I bought it. In this way, I prepared to think of the central thought clearly by deciding whether to buy in philosophy of data science. + +35. I prepared to transcend (connect) positivity with a positive thought. I did this by observing that the spiritual particles (s-particles) collided, making me happy. First, I found how the representation agreed with me. Second, I signaled this by colliding s-particles. Third, I breasoned out an A to make me happy because of this. In this way, I prepared to transcend (connect) positivity with a positive thought by observing that the spiritual particles (s-particles) collided, making me happy. + +36. I prepared to thank the religious leader. I did this by giving the religious leader a gift. First, I visited the shop. Second, I selected a gift. Third, I gave it to the religious leader. In this way, I prepared to thank the religious leader by giving the religious leader a gift. + +37. I prepared to philosophise about music. I did this by liking God (the philosopher). First, I produced the song. Second, I paid for it to be mixed and mastered. Third, I set it for my students to examine. In this way, I prepared to philosophise about music by liking God (the philosopher). + +38. I prepared to play the person. I did this by mimicking the person. First, I found videos of the person. Second, I mimicked his manner. Third, I mimicked his matter. In this way, I prepared to play the person by mimicking the person. + +39. I prepared to become God (a philosopher). I did this by repeating 50*80 Upasana sutras (50*50 As). First, I repeated 80 Upasana sutras on the first day. Second, I prepared to repeat 80 Upasana sutras on the next day. 
Third, I repeated this for 50 days. In this way, I prepared to become God (a philosopher) by repeating 50*80 Upasana sutras (50*50 As). + +40. I prepared to teach knowledge. I did this by following the meditation (philosophy) teacher. First, I followed the meditation (philosophy) teacher. Second, I listened to her knowledge. Alternatively, I read the book. In this way, I prepared to teach knowledge by following the meditation (philosophy) teacher. + +41. Assuming I would be a professor, I prepared to examine my students over 2 weeks (at 2 As per day). I did this by playing tiddly winks. First, I wrote the exam. Second, I counted the breasonings written. Third, I awarded the corresponding grade to the student. In this way, assuming I would be a professor, I prepared to examine my students over 2 weeks (at 2 As per day) by playing tiddly winks. + +42. I prepared to have the same effect as a sequence of short courses on the same topic in agreeing and disagreeing. I did this by agreeing with a major as part of a degree. First, I completed the major. Second, I observed that the head studied a Master of Communications. Third, I observed that the founder studied a Master of International Business. In this way, I prepared to have the same effect as a sequence of short courses on the same topic in agreeing and disagreeing by agreeing with a major as part of a degree. 
+ +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Yellow God 1 of 4.txt b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Yellow God 1 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..4a9ce8e81cc1896c0be996cfc91aed6d05c1f8e5 --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Yellow God 1 of 4.txt @@ -0,0 +1,29 @@ +["Green, L 2021, Yellow God 1 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF MEDITATION +by Lucian Green +Yellow God 1 of 4 + + +1. God, with all elements of meditation, needs an argument to represent him or her. God simply makes imagery available and so, makes meditation possible. + + +1a. I prepared to write the algorithm for the transcript of the game Dido's gameplay. I did this by achieving the state audience level in the game Dido. First, I bought tickets. Second, I walked to my seat in the auditorium. Third, I sat down on the seat in the auditorium. In this way, I prepared to write the algorithm for the transcript of the game Dido's gameplay by achieving the state audience level in the game Dido. + +2. I prepared to lecture the economics students in creative economics. I did this by opening out the rotunda. First, I identified the spindle. Second, I started pulling the ribbon from the spindle. Third, I continued doing this until I had completely pulled the ribbon. In this way, I prepared to lecture the economics students in creative economics by opening out the rotunda. + +3. I prepared to go to heaven. I did this by rushing the blessings on myself. First, I said the breasonings details in a representation to a current God (see Jacqueline Publicity primary text entries 8 and 9). 
Second, I said the meditation details to God (see Jacqueline Publicity primary text entry 10 and Abracadabra and unabridged Abracadabra 2 song meanings). Third, I meditated on Maharishi Lucian by repeating 80 lucian mantras and 80 green sutras to give myself enough job training until that point, enabling me to go to heaven. In this way, I prepared to go to heaven by rushing the blessings on myself. + +4. I prepared to write about being Maharishi. I did this by writing the essay about God in Heaven. First, I wrote about monotheism, in other words, believing in a single God. Second, I wrote about having perfect health. Third, I wrote about meditating each day to achieve this. In this way, I prepared to write about being Maharishi by writing the essay about God in Heaven. + +5. I prepared to move forward in science. I did this by giving today a carrot. First, I found today's pigeonhole. Second, I found the carrot inside the pigeonhole. Third, I withdrew the carrot from the pigeonhole. In this way, I prepared to move forward in science by giving today a carrot. + +6. Lucianic Meditation prepared to love the world. Lucianic Meditation did this by spreading peace throughout the world. First, it made the first person happy. Second, it prepared to make the second person happy. Third, it repeated this until all the meditators were happy. In this way, Lucianic Meditation prepared to love the world by spreading peace throughout the world. + +7. I prepared to mark the theology essay. I did this by thinking of a use for a computer science formula for each person, for example, a person's rate of work = his or her change in concentration (number of ideas) / time. First, I wrote the first idea in a minute during the semester for the student to paraphrase. Second, I prepared to write the second idea in the second minute. Third, I repeated this until I had worked at a rate of 1 idea / 1 minute = 1 idea per minute. In this way, I prepared to mark the theology essay by thinking of a use for a computer science formula for each person. + +8. I prepared to speak with heads of state about the world's issues. 
I did this by breasoning out 110 objects before counting to 50 to act as exercise for high quality thoughts (in other words, by multiplying the number of 110 object-As by 50). First, I introduced meditation to prevent poverty. Second, I introduced pedagogy to schools. Third, I introduced meditation to prevent the need for hospitals. In this way, I prepared to speak with heads of state about the world's issues by multiplying the number of As by 50. + +9. I prepared to eat the communal dinner in Lucianic Meditation. I did this by drinking the strawberry sauce. First, I swallowed the strawberry sauce down my throat. Second, I enjoyed it. Third, I spooned some more strawberry sauce. In this way, I prepared to eat the communal dinner in Lucianic Meditation by drinking the strawberry sauce. + +10. I prepared for good to defeat evil, in this case, the cockatoos ate grass because there were no evil enemies. I did this by loving the large cockfest. First, I encouraged the first cockatoo. Second, I prepared to encourage the next cockatoo. Third, I repeated this until I had encouraged the 10 cockatoos in the cockfest. In this way, I prepared for good to defeat evil, in this case, the cockatoos ate grass because there were no evil enemies by loving the large cockfest. 
+ +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Yellow God 2 of 4.txt b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Yellow God 2 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..9fe132c9d471b3c8e90e6fae85eda0444c379e16 --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Yellow God 2 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Yellow God 2 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF MEDITATION +by Lucian Green +Yellow God 2 of 4 + +11. I prepared for over-engineering of the bridge. I did this by hitting the ball on the string (representing over-engineering) on the stand. First, I picked up the bat. Second, I moved the bat backwards. Third, I hit the ball. In this way, I prepared for over-engineering of the bridge by hitting the ball on the string (representing over-engineering) on the stand. + +12. I prepared to put through an A with a 'negatable pressure cup appearance', in other words negatably but in a way that is protected by meditation, placing a medical question on oneself (thinking of a dental drill, the medical question and a conclusion) for a child to be conceived, a job to be earned or an H1 to be supported. I did this by holding the dog model, like the pressure cup. First, I picked up the dog model. Second, I held it. Third, I placed it on the ground. In this way, I prepared to put through an A with a 'negatable pressure cup appearance' by holding the dog model, like the pressure cup. + +13. I prepared to be big and good by teaching Lucianic Meditation. I did this by learning Lucianic Meditation. First, I repeated the breasonings details to God (see Jacqueline Publicity primary text entries 8 and 9). 
Second, I said the meditation details to a God (see Jacqueline Publicity primary text entry 10 and Abracadabra and unabridged Abracadabra 2 song meanings). Third, I practiced Lucianic Meditation by saying 80 lucian mantras and 80 green sutras in my head. In this way, I prepared to be big and good by teaching Lucianic Meditation by learning Lucianic Meditation. + +14. I prepared to like Maharishi. I did this by teaching Lucianic Meditation. First, I said the breasonings details in an appearance to Maharishi Lucian (see Jacqueline Publicity primary text entries 8 and 9). Second, I repeated the meditation details to Maharishi Lucian (see Jacqueline Publicity primary text entry 10 and Abracadabra and unabridged Abracadabra 2 song meanings). Third, I taught the student to meditate by repeating 80 lucian mantras and 80 green sutras. In this way, I prepared to like Maharishi by teaching Lucianic Meditation. + +15. I prepared to store the bit on a subatomic particle. I did this by bringing a citrus fruit to the gathering. First, I packed the mandarin. Second, I brought it with me to the gathering. Third, I ate it. In this way, I prepared to store the bit on a subatomic particle by bringing a citrus fruit to the gathering. + +16. I prepared to reintroduce my controversial argument for publishing. I did this by editing out the literature review from the PhD, leaving the critique, for publishing. First, I selected the literature review. Second, I deleted it. Third, I published the PhD. In this way, I prepared to reintroduce my controversial argument for publishing by editing out the literature review from the PhD, leaving the critique, for publishing. + +17. I prepared to write an argument for the intelligent pop song for it to be examined. I did this by stating properties of disagreeing (agreeing) arguments. First, I wrote in relation to a positive idea. Second, I wrote a single argument. Third, I connected the arguments in a structure. 
In this way, I prepared to write an argument for the intelligent pop song for it to be examined by stating properties of disagreeing (agreeing) arguments. + +18. I prepared to write magna opera. I did this by sitting and writing. First, I bought a height-adjustable chair with no arms to fit under the table and scheduled regular breaks. Second, I sat. Third, I wrote. In this way, I prepared to write magna opera by sitting and writing. + +19. I prepared to eat healthily. I did this by eating while sitting. First, I ate less food and fewer units of energy. Second, I chewed it more carefully. Third, I tasted my meal. In this way, I prepared to eat healthily by eating while sitting. + +20. I prepared to prevent depression. I did this by studying the medicine short course. First, I noticed the person was depressed (hadn't studied medicine). Second, I suggested that the person pay me $50 to 'put through' the Medicine As on my system to prevent depression. Third, the person agreed. In this way, I prepared to prevent depression by studying the medicine short course. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Yellow God 3 of 4.txt b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Yellow God 3 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..6efe2d7fc261677976e7589f0949ac5a073eeaf3 --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Yellow God 3 of 4.txt @@ -0,0 +1,27 @@ +["Green, L 2021, Yellow God 3 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF MEDITATION +by Lucian Green +Yellow God 3 of 4 + +21. I prepared to maintain my happiness. I did this by stating that I was happy when I saw the bright colours. First, I saw the bright colours. 
Second, I was happy when I saw the bright colours. Third, I stated that I was happy when I saw the bright colours. In this way, I prepared to maintain my happiness by stating that I was happy when I saw the bright colours. + +22. I prepared to study hermeneutics. I did this by writing Principias. First, I presented at the Symposium. Second, I wrote down my ideas about my presentation. Third, I opened my own Academy. In this way, I prepared to study hermeneutics by writing Principias. + +23. I prepared to simplify my algorithm. I did this by writing algohedrons (sic) to verify algorithms. First, I wrote the algorithm. Second, I verified the algorithm against the self. Third, I verified the algorithm against the other. In this way, I prepared to simplify my algorithm by writing algohedrons (sic) to verify algorithms. + +24. I prepared to write what they want to believe. I did this by agreeing with religious (philosophical) pluralism. First, I studied Christianity (books). Second, I studied Lucianic Meditation (Philosophy). Third, I survived blame for famousness. In this way, I prepared to write what they want to believe by agreeing with religious (philosophical) pluralism. + +25. I prepared to agree by recording another text reconciling differences between philosophers. I did this by agreeing with philosophical pluralism. First, I agreed with Socrates. Second, I agreed with Plato. Third, I agreed with Aristotle. In this way, I prepared to agree by recording another text reconciling differences between philosophers by agreeing with philosophical pluralism. + +25a. I prepared to interpret Gail as the white character. I did this by agreeing that your creation was correct and good. First, I observed that your creation was correct. Second, I observed that your creation was good. Third, I agreed that your creation was correct and good. In this way, I prepared to interpret Gail as the white character by agreeing that your creation was correct and good. + +26. 
I prepared to experience the spiritual subatomic computer. I did this by driving in the car. First, I boarded the car. Second, I drove in the car. Third, I disembarked from the car. In this way, I prepared to experience the spiritual subatomic computer by driving in the car. + +27. I prepared to make a ship. I did this by borrowing the saw from my neighbour. First, I asked my neighbour if I could borrow his balsa wood saw. Second, I borrowed it from him. Third, I cut 1 cm from the section of balsa wood. In this way, I prepared to make a ship by borrowing the saw from my neighbour. + +28. I prepared to update my writing to reflect changes in language. I did this by reading my writing. First, I verified that it was included in training materials. Second, I verified that it was included in assessment. Third, I assigned this task to the Lucian Academy Accreditation Authority. In this way, I prepared to update my writing to reflect changes in language by reading my writing. + +29. I prepared to sign merchandise. I did this by observing my partner listen to my music. First, I observed my partner sit as my audience. Second, I observed my partner listen to me sing. Third, I listened to him give me applause. In this way, I prepared to sign merchandise by observing my partner listen to my music. + +30. I prepared to earn 100%s at a prestigious University. I did this by lying on the lush grass. First, I looked in the shade. Second, I found a glade. Third, I lay in it. In this way, I prepared to earn 100%s at a prestigious University by lying on the lush grass. 
+ +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Yellow God 4 of 4.txt b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Yellow God 4 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..795d9500801bfa08713c58949dc4c2a4b57a84ad --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/FUNDAMENTALS OF MEDITATION by Lucian Green Yellow God 4 of 4.txt @@ -0,0 +1,29 @@ +["Green, L 2021, Yellow God 4 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF MEDITATION +by Lucian Green +Yellow God 4 of 4 + +31. I prepared to examine the dynamics of the group of people. I did this by acting out the algorithm. First, I acted out the breasoned predicate name. Second, I acted out each of the breasoned arguments. Third, I acted out each of the breasoned commands the predicate called. In this way, I prepared to examine the dynamics of the group of people by acting out the algorithm. + +32. I prepared to work at a fast rate. I did this by measuring the time of the cognitive processing of programming the algorithm in my mind. First, I timed lifting my finger. Second, I timed pressing the button. Third, I prepared for the next keystroke. In this way, I prepared to work at a fast rate by measuring the time of the cognitive processing of programming the algorithm in my mind. + +33. I prepared to compute the number of fruits, subtracting the number of damaged fruit, only for that day. I did this by writing the small idea. First, I wrote the name of the object. Second, I wrote its breasonings (X, Y and Z dimensions). Third, I repeated this for each object in the small idea. In this way, I prepared to compute the number of fruits, subtracting the number of damaged fruit, only for that day by writing the small idea. + +34. 
I prepared to emulate a PhD. I did this by writing the soundtrack. First, I read the script, like the literature review. Second, I wrote the song titles, like the questions. Third, I wrote the lyrics, like the hypotheses. In this way, I prepared to emulate a PhD by writing the soundtrack. + +35. I prepared to write the essay plan literature review, etc. I did this by writing the breasonings for the argument. First, I wrote the student's argument plan literature review, connecting the items together. Second, I planned the algorithm as an answer to the question of the argument. Third, I wrote the breasoning as a substitute for the hypothesised breasoning for the argument. In this way, I prepared to write the essay plan literature review, etc. by writing the breasonings for the argument. + +36. I prepared to support the universe to support the student. I did this by becoming a writer. First, I read. Second, I wrote. Third, I wrote enough. In this way, I prepared to support the universe to support the student by becoming a writer. + +37. I prepared to earn a high paying job. I did this by avoiding excess physical labour. First, I invested in all areas for myself. Second, I thought clearly. Third, I received income. In this way, I prepared to earn a high paying job by avoiding excess physical labour. + +38. I prepared to study medicine before seeing the 250 breasonings as having the same quality imagery as 50 As, the standard for an area of study. I did this by completing the chapter's 250 breasonings. First, I wrote 250 breasonings. Second, I spiritually helped the students avoid looking at the breasonings in case they were distracting. Third, I set the area of study as assessment. In this way, I prepared to study medicine before seeing the 250 breasonings as having the same quality imagery as 50 As, the standard for an area of study by completing the chapter's 250 breasonings. + +39. I prepared to take spiritual anti-hallucinogenic medication. 
I did this by stating that 250 breasonings contained high quality imagery. First, I wrote the philosophy. Second, I determined that the philosophy was expressed by a high quality image. Third, I represented this for the student. In this way, I prepared to take spiritual anti-hallucinogenic medication by stating that 250 breasonings contained high quality imagery. + +40. I prepared to set 50 As as the entrance requirement for another employee. I did this by stating that 250 breasonings satisfied standards for an area of study. First, I wrote the 250 breasonings. Second, I wrote the perspective they were perfectly written as originating from. Third, I was given 50 As. In this way, I prepared to set 50 As as the entrance requirement for another employee by stating that 250 breasonings satisfied standards for an area of study. + +41. I prepared to state that breasonings were a cosmological answer, in the form of computer program data, to a question. I did this by stating that the breasoner breasoned out his first breasoning. First, the breasoner breasoned out the X dimension of the breasoning. Second, the breasoner breasoned out the Y dimension of the breasoning. Third, the breasoner breasoned out the Z dimension of the breasoning. In this way, I prepared to state that breasonings were a cosmological answer, in the form of computer program data, to a question by stating that the breasoner breasoned out his first breasoning. + +42. I prepared to study the music theory. I did this by being taught by the trainer of singing. First, I observed her conduct the orchestra during the overture. Second, I observed her bring in the choir. Third, I observed the choir make their entrance. In this way, I prepared to study the music theory by being taught by the trainer of singing. 
+ +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/.DS_Store b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/.DS_Store new file mode 100644 index 0000000000000000000000000000000000000000..5008ddfcf53c02e82d7eee2e57c38e5672ef89f6 Binary files /dev/null and b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/.DS_Store differ diff --git a/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Breathsonings 1 of 4.txt b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Breathsonings 1 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..6460c6da96708701bb688b06d43a1862a780a273 --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Breathsonings 1 of 4.txt @@ -0,0 +1,37 @@ +["Green, L 2021, Breathsonings 1 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF PEDAGOGY +by Lucian Green +Breathsonings 1 of 4 + + + +1. The hospital technician helped the patient recover after surgery. He did this by dripping treacle down the victim's throat. He demonstrated the food was moving into the oesophagus using a model. Then he showed the food being swallowed using the throat. Finally, he showed the food entering the stomach. In this way, the hospital technician helped the patient recover after surgery by dripping treacle down the victim's throat. + + +2. The aircraft manufacturer tested that there was enough room to move around in his seat. He did this by putting his mouth around the spoon. First, he placed the vegan yoghurt on the spoon. Then, he put the centre of the spoon in the middle of his mouth. Finally, he closed his mouth and withdrew the spoon. 
In this way, the aircraft manufacturer tested that there was enough room to move around in his seat by putting his mouth around the spoon. + + +3. The river cruise captain planned how to use water. She did this by stopping food sticking to her tongue by putting water on it. First, she placed the pile of small sponges in the middle of her tongue. Then, she removed the top sponge and placed it on the back of her tongue. Finally, she put the second sponge on the front of her tongue. In this way, the river cruise captain planned how to use water by stopping food sticking to her tongue by putting water on to it. + + +4. The train lift driver lifted his food crate into the train. He did this by poking his tongue out underneath the fork. First, he measured how far his tongue protruded from his mouth. Next, he placed the fork this distance from his lips. Finally, he moved the fork above his tongue and put the food into his mouth while retracting his tongue. In this way, the train lift driver lifted his food crate into the train by poking his tongue out underneath the fork. + + +5. The network officer ate a degree at the campus shop that was on the network. He did this, by eating enough, in other words, a jarful. He started the counter at zero. Then, he added one to the count for each new treacle cupcake. Finally, he stopped adding one to the number when there were no more cupcakes. In this way, the network officer ate a degree at the campus shop that was on the network by eating enough, in other words, a jarful. + + +6. The club manager swallowed the lozenge in a particular way to prepare to descend the stairs. He did this by moving the tablet fragment to the back of his mouth with his tongue. First, he placed the lozenge fragment on the front of his tongue. Then, he closed his teeth over the tablet fragment and pushed his tongue forward, sliding the pill piece onto the back of his tongue. Finally, he swallowed the lozenge fragment. 
In this way, the club manager took the pill in a particular way to prepare to descend the stairs by moving the tablet fragment to the back of his mouth with his tongue. + + +7. The truck driver practised his hand-eye co-ordination. He did this by placing the plum segment into his mouth with his hand. First, he put the plum portion midway between the sides and top and bottom of his lips, so that it was past his top front teeth. Following this, he lowered the plum segment onto his tongue. In this way, the truck driver practised his hand-eye co-ordination by placing the plum part into his mouth with his hand. + + +8. The dressmaker prepared to sew a hem on a dress. He did this by eating a marshmallow with a knife and fork. First, he pierced the marshmallow with his fork. Next, he pierced the marshmallow with the knife and cut it in half. Finally, he stabbed the left side of the marshmallow again with his fork and lifted it to his mouth. In this way, the dressmaker prepared to sew a hem on a dress by eating a marshmallow with a knife and fork. + + +9. The mushroom farmer tested that the mushrooms had enough fertiliser and water. She did this by testing that her cordial was sweet by drinking it. First, she prepared to put some of the cordial on top of her tongue. Then, she put some of the cordial on the tip of her tongue, where she could taste its sweetness. So, she put the cordial where it was both on top of her tongue and touched her sweet spot. In this way, the mushroom farmer prepared to test that the mushrooms had enough fertiliser and water by testing that her cordial was sweet by drinking it. + + +10. The confectioner prepared to make each section of a lolly snake. He did this by chewing and swallowing each part of a lolly snake. First, he disconnected a segment of the snake and then placed it in his mouth. After this, he lifted his arm up, and then back down on a new path, not near his face. Finally, he repeated the process until he had eaten all of the snake segments. In this way, the confectioner prepared to make each section of a lolly snake by chewing and swallowing each part of a lolly snake. 
+ + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Breathsonings 2 of 4.txt b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Breathsonings 2 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..cfd0ddae971c568701188cc10f39f0e8c0faf8c4 --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Breathsonings 2 of 4.txt @@ -0,0 +1,35 @@ +["Green, L 2021, Breathsonings 2 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF PEDAGOGY +by Lucian Green +Breathsonings 2 of 4 + +11. The butterscotch tablet maker prepared to make crates for the tablets. He did this by making cream from butter, milk and sugar. First, he separated the butter into 0.01 m edge-length cubes, forming a checkerboard. Then, he poured milk into one-quarter of each of the spaces surrounding the butter cubes. Finally, he mashed and mixed the butter and milk together. In this way, the butterscotch tablet maker prepared to make crates for the tablets by making cream from butter, milk and sugar. + + +12. The fruiterer planted a fig tree. He did this by placing the fruit in the centre of the bowl. First, he added half the bowl's width to its left edge. Second, he added half the bowl's depth to its front side. Finally, he lowered the fig at these coordinates until it touched the bottom of the bowl. In this way, the fruiterer prepared to plant a fig tree by placing the fruit in the centre of the bowl. + + +13. The sporting bowler hit the ball. He did this by lifting and placing the bowl in front of him. First, he lifted up the bowl. Next, he brought it forward. Finally, he put it in front of him. In this way, the sporting bowler prepared to hit the ball by lifting and placing the bowl in front of him. + + +14. 
The fruitmonger cleaned the broom handle. He did this by eating the popsicle. First, he measured 0.01 m down from the top of the popsicle. Next, he bit and warmed the biteful by salivating on it as he chewed it. Then, he repeated the process until he had completely eaten the popsicle. In this way, the fruitmonger prepared to clean the broom handle by eating the popsicle. + + +15. The hairstylist practised giving a haircut to an orange. He did this by cutting both hemispheres off an orange peel. First, he inserted his knife where the orange's stem was. Secondly, he cut a semicircle to the opposite point of the orange. Thirdly, he completed the circle by cutting back to the original point. In this way, the hairstylist practised giving a haircut to an orange by cutting both hemispheres off its peel. + + +16. The paper recycler prepared to recycle the pile of papers. He did this by testing that the popsicle had completely melted in the pan. First, he tested that the popsicle was not higher than a pool of liquid. Second, he tested that the solid was not visible. Thirdly, he proved that there was no sound of melting anymore. In this way, the paper recycler prepared to recycle the pile of papers by testing that all the popsicle had completely melted in the pan. + + +17. The garbage truck man washed the bin. He did this by measuring the solid to liquid ratio. First, he wrote down the volume of solid. Second, he wrote down the amount of fluid. Third, he divided the volume of solid by the amount of liquid. In this way, the garbage truck man prepared to wash the bin by measuring the solid to liquid ratio. + + +18. The metaphysician cleaned a mat. He did this by calculating the time difference between a solid and liquid of the same type. First, he measured the time to carry the tan bark with controlled steps. Then, he measured the time to carry the water without spilling it. 
After this, he subtracted the lesser of the two from the greater of the two to find the time difference between them. In this way, the metaphysician prepared to clean a mat by calculating the time difference between a solid and liquid of the same type. + + +19. The hairdresser prepared to dry the client's hair. She did this by measuring the boiling point of water. Firstly, she placed the thermometer in the pot of water. Secondly, she stirred the water as it boiled. Thirdly, she read the temperature next to the meniscus when the water had boiled. In this way, the hairdresser prepared to dry the client's hair by measuring the boiling point of water. + + +20. The cab driver circled the city. He did this by measuring the melting point of water when he stirred it. First, he placed ice in a freezer. Then, he increased the temperature of the freezer and stirred the ice. Lastly, he measured the temperature when the ice melted. In this way, the cab driver prepared to circle the city by measuring the melting point of water. + + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Breathsonings 3 of 4.txt b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Breathsonings 3 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..4acb3c405b1d9f9045ea1f4b91cc554b1e555423 --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Breathsonings 3 of 4.txt @@ -0,0 +1,35 @@ +["Green, L 2021, Breathsonings 3 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF PEDAGOGY +by Lucian Green +Breathsonings 3 of 4 + +21. The robot manufacturer prepared to construct a robot of a particular height. He did this by weighing a solid. First, he added weight to the opposite side of the scales from the solid. 
Next, he continued to add weights until the two sides were equal. Finally, he summed the total weight of the weights to measure the object's weight. In this way, the robot manufacturer prepared to construct a robot of a particular height by weighing a solid. + + +22. The tailor shortened the jacket's sleeves. He did this by weighing a liquid. First, he put the weights with the approximate weight of the liquid on the opposite side of the scales from the fluid. Next, he added or removed weights to balance the scales. Finally, he summed the total weight of the weights. In this way, the tailor prepared to shorten the jacket's sleeves by weighing a liquid. + + +23. The nurse lifted the baby from the cot. She did this by lifting the bag of nappies from the ground. First, she tested that the handles had 0.01 m of plastic around the edges of a 0.15 x 0.05 m hole and held it using them. Then, she bent her hips and knees to lift the bag with her legs, not her back. Finally, she stood up, lifting the bag. In this way, the nurse prepared to lift the baby from the cot by lifting the bag of nappies from the ground. + + +24. The gardener watered the frangipani. He did this by lifting the water bottle. First, he placed the bottle upright on the table. Next, he grasped it, taking care not to tilt it. Finally, he held it, not too tightly, and lifted it vertically. In this way, the gardener prepared to water the frangipani by lifting the water bottle. + + +25. The doctor cleaned the bench. He did this by melting the ice. First, he melted the ice in a pan. Next, he sponged some of the water up. Then, he finished cleaning the bench with the sponge. In this way, the doctor prepared to clean the bench by melting the ice. + + +26. I enjoyed life by myself. I did this by writing that all the things in the world were mine. First, I read the books. Second, I enjoyed the sights. Third, I held the objects. 
In this way, I prepared to enjoy life by myself by writing that all the things in the world were mine. + + +27. I observed the flight of the Concordes. I did this by watching the horse drinking the dam water. First, I walked in the paddock. Second, I looked at the horse. Third, I observed him drink from the dam. In this way, I prepared to observe the flight of the Concordes by watching the horse drinking the dam water. + + +28. I lived the good life. I did this by writing while walking in the hills. First, I walked gingerly. Second, I walked casually. Third, I wrote down all of the details. In this way, I prepared to live the good life by writing while walking in the hills. + + +29. I prepared to take care of the incubator. I did this by stating that year eight had Sex Education. First, I observed the man insert the rod into the void. Second, I took notes. Third, I took the lesson home. In this way, I prepared to take care of the incubator by stating that year eight had Sex Education. + + +30. I was popular. I did this by meditating (writing philosophy) to become a graduate employee. First, I meditated (wrote my opinion). Second, I gained skills in performing well in examinations. Third, I became a graduate employee. In this way, I prepared to be popular by meditating (writing philosophy) to become a graduate employee. 
+ + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Breathsonings 4 of 4.txt b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Breathsonings 4 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..882ad22cf626e93d3410c7094576728c493d9ef5 --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Breathsonings 4 of 4.txt @@ -0,0 +1,40 @@ +["Green, L 2021, Breathsonings 4 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF PEDAGOGY +by Lucian Green +Breathsonings 4 of 4 + +31. I led by showing a good example. I did this by loving God (the master). First, I meditated (wrote). Second, I went up. Third, I loved God (the master) helping me to be the master. In this way, I prepared to lead by showing a good example by loving God (the master). + + +32. I produced pop music. I did this by stating that I (the philosopher) was famous. First, I said that I am God (the master). Second, I said that I am a cosmologue (the leader). Third, I am mene, (sic) someone who memorises genes. In this way, I prepared to produce pop music by stating that I (the philosopher) was famous. + + +33. I was the original, productive Computational English writer. I did this by indicating that Computational English was brilliant. First, I wrote how it had helped Nietzsche. Second, I wrote how it had helped Derrida. Third, I wrote how it had helped Heidegger. In this way, I prepared to be the original, productive Computational English writer by indicating that Computational English was brilliant. + + +34. I gave medicine (I answered 15 drug questions to access a generic medicine 250 breasoning A) to a 16-breasoning small idea, to make sure high-quality imagery appeared. 
I did this by writing that there were 250 breasonings for everything in natural law. First, I gave to the pope (gemstone dealer). Second, I presented to the emperor. Third, I took my blessings (writings) from it. In this way, I prepared to give medicine (I answered 15 drug questions to access a generic medicine 250 breasoning A) to a 16-breasoning small idea, to make sure high-quality imagery appeared by writing that there were 250 breasonings for everything in natural law. + + +35. I prepared to see how the Nobel Committee could award the Nobel Peace Prize for Breathsonings because they enabled us to be about things. I did this by writing that Breathsonings are a Nobel Prize because they defeat oppression by making sound judgements. First, I liked Breathsonings because they help medicine babies to be healthy. Second, I wrote operations go well because of Breathsonings. Third, I lubricated the lining of my air tubules when breathing using Breathsonings. In this way, I prepared to see how the Nobel Committee could award the Nobel Peace Prize for Breathsonings because they enabled us to be about things by writing that Breathsonings are a Nobel Prize because they defeat oppression by making sound judgements. + + +36. I prepared to see how the Nobel Committee could award the Nobel Peace Prize for Breathsonings using images of breathsonings working. I did this by writing that Breathsonings are a Nobel Prize because they help ideas to go well. First, I wrote Breathsonings to enable babies' systems to be perfect. Second, I wrote Breathsonings help us to work out if a body implant is going spectacularly well. Third, I breathsoned out the lung's tubules, lubricating them. In this way, I prepared to see how the Nobel Committee could award the Nobel Peace Prize for Breathsonings using images of breathsonings working by writing that Breathsonings are a Nobel Prize because they help ideas to go well. + + +37. I prepared to aim for the heart and do it. 
I did this by stating that the breath dropped into my mouth with the help of the Head of State. First, I asked the Head of State ever so sweetly to let the breath of air drop into my mouth. Second, I observed this happen. Third, I sucked succulently sweet desserts. In this way, I prepared to aim for the heart and do it by stating that the breath dropped into my mouth with the help of the Head of State. + + +38. I prepared to be sure Plato would agree with breasoning out the breathsoning. I did this by breasoning out the breathsoning like in Plato's forms. First, I breasoned out the judgment 'good' as what the body confirmed the breathsoning for the eight oxygen molecules to be. Second, I breathed in. Third, I breathed out. In this way, I prepared to be sure Plato would agree with breasoning out the breathsoning by breasoning out the breathsoning like in Plato's forms. + + +39. I had fun like Anarchy. I did this by breathsoning out 250 breasonings for each family entity. First, I decided I would find what I needed. Second, I was helped to it. Third, I liked you being the best in my vegan emporium. In this way, I prepared to have fun like Anarchy by breathsoning out 250 breasonings for each family entity. + + +40. I wrote about the heart. I did this by labelling the vertex. First, I found the cube of destiny. Second, I labelled it. Third, I drew its [health] points on it. In this way, I prepared to write about the heart by labelling the vertex. + + +41. I loved planet meditation (books). I did this by holding the reflection (philosophy) retreat. First, I covered texts. Second, I covered retreat details. Third, I gave presents out. In this way, I prepared to love planet meditation (books) by holding the reflection (philosophy) retreat. + + +42. I helped the writers to write feverishly. I did this by holding the writer's retreat. First, I enamoured succleton. Second, I loved hickletons (you). Third, I loved enamourmonts. 
In this way, I prepared to help them to write feverishly by holding the writer's retreat. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Direction 1 of 4.txt b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Direction 1 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..add51533b10018f380106e2e16f1c27b8b433b9c --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Direction 1 of 4.txt @@ -0,0 +1,27 @@ +["Green, L 2021, Direction 1 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF PEDAGOGY +by Lucian Green +Direction 1 of 4 + + + +1. The fairy godmother prepared to find the present. She did this by testing whether she could see her friend. First, she examined whether the current person had a moustache, like her friend. She also examined whether the current person had a hat, like her friend. She finally tested whether the person also had brown eyes, like her friend. In this way, the fairy godmother prepared to find the present by examining whether she could see her friend. + +2. The salesperson gave a spiel. He did this by cleaning his mouth before talking. First, he found the detritus in his lips. Second, he lifted it up. Third, he took it out of his mouth. In this way, the salesperson gave a spiel by cleaning his mouth before talking. + +3. The oarsman prepared to row the boat. He did this by shaving his beard. First, he clippered the left side of his beard. Second, he clippered the right side of his beard. Third, he clippered the middle of the beard. In this way, the oarsman prepared to row the boat by shaving his beard. + +4. The parent sat the baby upright. He did this by placing the pancake box on its base. 
First, he rotated the box until he found a side with lettering. Next, he rotated the box until the lettering was legible. Finally, he placed the box on the table with the writing facing forward. In this way, the parent prepared to sit the baby upright by putting the pancake box on its base. + +5. The winemaker tasted the wine. He did this by smelling the rose. First, he positioned his nostrils above the rose. Next, he inhaled through his nose with a short, sharp sniff. Finally, he smelled the scent with his olfactory sense. In this way, the winemaker prepared to taste the wine by smelling the rose. + +6. The stage manager hoisted the singer. He did this by closing the box. First, he found the lid. Next, he found the tab attached to the lid. Finally, he inserted the tab into the box. In this way, the stage manager prepared to hoist the singer by closing the box. + +7. The director counted how many actors the light was shining on. He did this by counting the blades of grass. First, he tested for the grass root. Second, he tested the blade of grass was at least 0.03 metres tall. Third, he repeated this process until he had counted all the blades of grass. In this way, the director prepared to count how many actors the light was shining on by counting the blades of grass. + +8. The ice cream parlour owner ate the ice cream. She did this by facing the wardrobe. First, she tested that the wardrobe had doors. Second, she tested that it had the depth of a wardrobe. Third, she tested that it had the height of a wardrobe. In this way, the ice cream parlour owner prepared to eat the ice cream by facing the wardrobe. + +9. The muesli bar manufacturer wrapped the muesli bar in plastic. He did this by wrapping the ribbon around the tennis racket handle. First, he placed the ribbon perpendicular to the handle. Second, he held it against the handle. Third, he wrapped it around the handle. 
In this way, the muesli bar manufacturer wrapped the muesli bar in plastic by wrapping the ribbon around the tennis racket handle. + +10. The ice cream parlour customer licked the ice cream. He did this by hosing himself. First, he pointed the hose at the ground. Second, he turned the water on. Third, he wiggled the water all over his body, from his head down. In this way, the ice cream parlour customer licked the ice cream by hosing himself. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Direction 2 of 4.txt b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Direction 2 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..be77ffe3e078908da5fa08937cdd9f014b0f6fdf --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Direction 2 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Direction 2 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF PEDAGOGY +by Lucian Green +Direction 2 of 4 + +11. The cabbage grower prepared space to grow the cabbage. He did this by swinging the pendulum. First, he placed his hand over a point. Then, he moved his hand forward quickly over a small distance. Finally, he moved the ball of the pendulum over a point in front of the other point. In this way, the cabbage grower prepared space to grow the cabbage by swinging the pendulum. + +12. The old man dug a flower bed. He did this by eating meringue with a cube-ended spoon. First, he held the spoon with its handle pointing up and the concavity of the spoon facing the side of the meringue. Second, he moved the spoon, scooping a cubeful of meringue. Third, he removed the spoon from the meringue. In this way, the old man prepared to dig a flower bed by eating meringue with a cube-ended spoon. + +13. 
The sailor staked the sea-grass friendly anchor. She did this by applying the eyeshadow between her eye and eyebrow. First, she applied powder to the brush. Second, she placed the brush between her eye and eyebrow. Third, she rubbed the brush between her eye and eyebrow. In this way, the sailor prepared to stake the sea-grass friendly anchor by applying the eyeshadow between her eye and eyebrow. + +14. The biochemist found the key protein. He did this by finding the needle in the haystack. First, he tested whether a hay needle was a hay needle by observing that it didn't reflect light when he shone the torch on it. Second, he examined whether a hay needle was a silver needle by noting that it reflected light when he shone a torch on it. Third, he repeated the first step until he had finished the second step. In this way, the biochemist prepared to find the key protein by finding the needle in the haystack. + +15. The marine biologist put a strut in the whale's mouth. He did this by finding the rim of a jar. First, he found the jar in the pantry. Second, he found the top of the jar. Third, he traced the rim of the jar with a spoon. In this way, the marine biologist put a strut in the whale's mouth by finding the rim of the jar. + +16. The ninja prepared to jump off the dune. He did this by lying on the lilo. First, he lay on the lilo beside the pool. Then, he launched the lilo on the pool. Finally, he went to sleep on the lilo. In this way, the ninja prepared to jump off the dune by lying on the lilo. + +17. The observatory technician inspected both sides of the sky. She did this by looking at the lily pad. First, she sat on the lily pad. Second, she looked at the left side of the lily pad. Third, she inspected the right side of the lily pad. In this way, the observatory technician inspected both sides of the sky by looking at the lily pad. + +18. Snow White prepared to eat the apple. She did this by swimming between the lily pads. First, she placed her head above the water. 
Second, she swam to the right of the left lily pad and the left of the right lily pad. Third, she climbed out of the pond when she reached the other side. In this way, Snow White prepared to eat the apple by swimming between the lily pads. + +19. The window manufacturer moved the track ball bearings into place. He did this by parking the tricycle. First, he stopped pedalling when he reached the end of the path. Second, he stood up next to the tricycle. Third, he rolled the tricycle off the path. In this way, the window manufacturer moved the track ball bearings into place by parking the tricycle. + +20. The actor playing Goldilocks prepared to hold the spoon for the porridge. She did this by holding the sheet of paper. First, she picked up the paper. Next, she held it up to read it. Finally, she placed it on the table. In this way, the actor playing Goldilocks prepared to hold the spoon for the porridge by holding the sheet of paper. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Direction 3 of 4.txt b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Direction 3 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..25a43c3708e25290ac5f97dc7372d28179e31c24 --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Direction 3 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Direction 3 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF PEDAGOGY +by Lucian Green +Direction 3 of 4 + +21. The construction site manager turned the door handle. He did this by licking around the ice cream. First, he licked the front of the ice cream. Second, he licked to the front-left of the ice cream. Third, he licked to the left of the ice cream. 
In this way, the construction site manager turned the door handle by licking around the ice cream. + +22. The swimmer squeezed the flotation device between her thighs. She did this by squeezing the bottle. First, she placed her right thumb on the left side of the bottle. Second, she placed her right index finger on the right side of the bottle. Third, she squeezed the bottle between her thumb and index finger. In this way, the swimmer squeezed the flotation device between her thighs by squeezing the bottle. + +23. The sign writer prepared to write letters out for a hearing-impaired client. He did this by stroking his hand. First, he placed his right index finger on the left of his left palm. Second, he moved it across his palm. Third, he stopped when it reached the right of his palm. In this way, the sign writer prepared to write letters out for a hearing-impaired client by stroking his hand. + +24. The cell biologist prepared to test that the cell cycle worked. He did this by licking the rim of his glass. First, he placed the tip of his tongue at the front of the rim. Second, he moved his tongue around the rim. Third, he stopped when he had completed licking the rim of the glass. In this way, the cell biologist prepared to test that the cell cycle worked by licking the rim of his glass. + +25. The doctor prepared to eat the rice ball. He did this by rotating his tongue. First, he licked above his mouth. Second, he lowered his tongue. Third, he stopped when it was horizontal. In this way, the doctor prepared to eat the rice ball by rotating his tongue. + +26. The doctor prepared to inject the patient. She did this by walking along the side of the curving canal. First, she measured ten metres away from the edge of the canal, to point A. Second, she measured ten metres away from the edge, ten metres along the edge of the canal, to point B. Third, she walked from point A to point B. 
In this way, the doctor prepared to inject the patient by walking along the side of the curving canal. + +27. The hairstylist prepared to give his client a haircut. He did this by peeling the onion. First, he cut from the top to the bottom down one side. Second, he cut from the top to the bottom down the other side. Thirdly, he peeled the skin from the onion. In this way, the hairstylist prepared to give his client a haircut by peeling the onion. + +28. The rower prepared to be on the lookout for his destination. He did this by jumping onto the bed. First, he stood on the bed. Second, he jumped spread-eagled on the bed. Third, he jumped back to a standing position. In this way, the rower prepared to be on the lookout for his destination by jumping onto the bed. + +29. The pedestrian crossed the road between the traffic islands. He did this by crossing over the creek. First, he found where the left bank went closer to the centre of the creek. Second, he found where the right bank went closer to the centre of the creek. Third, he jumped where the two sides of the creek converged. In this way, the pedestrian crossed the road between the traffic islands by crossing over the creek. + +30. The mountaineer prepared to build the antenna on top of the mountain. He did this by listening through the keyhole. First, he waited until the people had entered the room. Second, he pressed his ear hole against the keyhole. Third, he entered the room when the amplitude of sound was zero. In this way, the mountaineer prepared to build the antenna on top of the mountain by listening through the keyhole. 
+ +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Direction 4 of 4.txt b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Direction 4 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..0cf4d941d9558db7468c2d42e1047ab5b27f0655 --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Direction 4 of 4.txt @@ -0,0 +1,9 @@ +["Green, L 2021, Direction 4 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF PEDAGOGY +by Lucian Green +Direction 4 of 4 + +31. The doctor looked at the spot using a magnifying glass. He did this by looking through the peephole. First, he faced the peephole. Second, he pointed his eye through the peephole. Third, he looked at the object through the peephole. In this way, the doctor looked at the spot using a magnifying glass by looking through the peephole. + +32. The clothing manufacturer tested that the clothing was not twisted when it was sewn up. He did this by winding wool around his hand. First, he tested that the first part of the wool wasn't twisted. Second, he wound it around his hand without twisting it. Third, he continued to wind it around his hand until its end. In this way, the clothing manufacturer tested that the clothing was not twisted when it was sewn up by winding wool around his hand. 
+ +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green God Algorithm 1 of 4.txt b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green God Algorithm 1 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..ab6f61e8c51079bad6cb46ad82e491aa0eb066ab --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green God Algorithm 1 of 4.txt @@ -0,0 +1,27 @@ +["Green, L 2021, God Algorithm 1 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF PEDAGOGY +by Lucian Green +God Algorithm 1 of 4 + + + +1. I prepared to be open-ended. I did this by loving you, The God (the master). First, I held you, The God (the master). Second, I hugged you. Third, I released you. In this way, I prepared to be open-ended by loving you, The God (the master). + +2. I prepared to experience God (the leader). I did this by stating that God (the leader) is alive through a lineage to me as a meditation (writing) from God (the leader). First, I noted that there was a meditation (writing) from God (the leader). Second, I observed that there was a lineage to me from this meditation (writing). Third, I stated that God (the leader) is alive through this lineage. In this way, I prepared to experience God (the leader) by stating that God (the leader) is alive through a lineage to me as a meditation (writing) from God (the leader). + +3. I prepared to share the firecracker with other people. I did this by following God (the facilitator). First, I found God. Second, I followed him. Third, I held a celebration. In this way, I prepared to share the firecracker with other people by following God (the facilitator). + +4. I prepared to crush the juniper berry. I did this by cutting open the juniper berries. 
First, I positioned the knife above the juniper berry. Second, I cut it in half. Third, I examined its interior. In this way, I prepared to crush the juniper berry by cutting open the juniper berries. + +5. I prepared to taste the coconut bounty. I did this by observing the vegan use a spear. First, I observed the vegan hold the spear. Second, I observed the vegan throw the spear at the fig. Third, I observed him eat the fig. In this way, I prepared to taste the coconut bounty by observing the vegan use a spear. + +6. I prepared to lick the coconut cream. I did this by sharpening the tip of the spear. First, I made the sharp rock. Second, I sharpened the tip of the spear by moving the stone towards the tip. Third, I tested the spear on the mat. In this way, I prepared to lick the coconut cream by sharpening the tip of the spear. + +7. I prepared to eat the acidophilus. I did this by spearing the acidophilus. First, I made the acidophilus. Second, I speared it. Third, I smoothed it. In this way, I prepared to eat the acidophilus by spearing the acidophilus. + +8. I prepared to perform at the jazz concert. I did this by writing the jazz. First, I wrote the jazz chord. Second, I wrote the progression of chords. Third, I played it. In this way, I prepared to perform at the jazz concert by writing the jazz. + +9. I prepared to sit in the gazebo. I did this by designing the gazebo. First, I designed the floor. Second, I designed the walls. Third, I designed the roof. In this way, I prepared to sit in the gazebo by designing the gazebo. + +10. I prepared to orient the triangles correctly. I did this by making jag saw (sic) puzzle. First, I painted the picture on the wood. Second, I cut it into jagged pieces. Third, I made the picture using the pieces. In this way, I prepared to orient the triangles correctly by making jag saw (sic) puzzle. 
+ +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green God Algorithm 2 of 4.txt b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green God Algorithm 2 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..8a3be159e1a6bd2c7be11b7378add5c3ff62d317 --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green God Algorithm 2 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, God Algorithm 2 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF PEDAGOGY +by Lucian Green +God Algorithm 2 of 4 + +11. I prepared to go to heaven (the store). I did this by endorsing God (the master). First, I found God (the master). Second, I liked him. Third, I found him helpful. In this way, I prepared to go to heaven (the store) by endorsing God (the master). + +12. I prepared to name the jazz composition. I did this by identifying the jazz. First, I identified the jazz seventh chord. Second, I identified the jazz instrument. Third, I identified the jazz rhythm. In this way, I prepared to name the jazz composition by identifying the jazz. + +13. I prepared to agree with the result. I did this by voting sane. First, I found the politician. Second, I voted in a sane way. Third, I observed him win. In this way, I prepared to agree with the result by voting sane. + +14. I prepared to write letters. I did this by finding God (the administrator). First, I looked for him. Second, I found him. Third, I asked, is that a woman? In this way, I prepared to write letters by finding God (the administrator). + +15. I prepared to take a four-hour workweek. I did this by programming God (the game master) to help people to be well. First, I provided meditation (philosophy) class. 
Second, I provided medicine class (maintaining the community's well being). Third, I provided pedagogy class. In this way, I prepared to take a four-hour workweek by programming God (the game master) to help people to be well. + +16. I prepared to move my piece in response. I did this by requesting the God (Lizard) to move the piece. First, I observed the God (Lizard). Second, I observed her look at the piece. Third, I asked the God (Lizard) to move the piece. In this way, I prepared to move my piece in response by requesting the God (Lizard) to move the piece. + +17. I prepared to buy the drink from the vending machine with the robot. I did this by operating the Lizard Robot. First, I pressed the controller button. Second, I observed the robot move forward one unit. Third, I pressed the button to turn right. In this way, I prepared to buy the drink from the vending machine with the robot by operating the Lizard Robot. + +18. I prepared to record the result. I did this by observing that like repels like, in Physics. First, I placed the magnet on the desk. Second, I placed another magnet with its north pole facing the first magnet's north pole. Third, I observed it repel it. In this way, I prepared to record the result by observing that like repels like, in Physics. + +19. I prepared to see the icons on the screen. I did this by entering my login details. First, I found the wood. Second, I became king. Third, I entered my login details. In this way, I prepared to see the icons on the screen by entering my login details. + +20. I prepared to announce it to the queen. I did this by observing the mother give birth. First, I observed the mother enter the operating theatre. Second, I observed the mother give birth. Third, I observed the time which this took. In this way, I prepared to announce it to the queen by observing the mother give birth. 
+ +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green God Algorithm 3 of 4.txt b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green God Algorithm 3 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..211057821ead0aa696bf1cc6353232c32c0b7705 --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green God Algorithm 3 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, God Algorithm 3 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF PEDAGOGY +by Lucian Green +God Algorithm 3 of 4 + +21. I prepared to make lots of money. I did this by selling mirth (laughter). First, I found the depressed person to help her recover. Second, I laughed with her. Third, I made many good rounds of it. In this way, I prepared to make lots of money by selling mirth (laughter). + +22. I prepared to plan my journey. I did this by reading the map. First, I read the key of the map. Second, I found the departure point. Third, I found the destination. In this way, I prepared to plan my journey by reading the map. + +23. I prepared to read the diaries when I was older. I did this by recording love. First, I found someone to love. Second, I showed that person love. Third, I recorded love. In this way, I prepared to read the diaries when I was older by recording love. + +24. I prepared to park the model car. I did this by controlling the model car. First, I looked at the car's steering column. Second, I looked at how it attached to the axles. Third, I looked at how these attached to the wheels. In this way, I prepared to park the model car by controlling the model car. + +25. I prepared to throw the ball. I did this by switching on the lamp. First, I looked for the switch. Second, I switched on the lamp.
Third, I read 'Nebuchadnezzar Iacit'. In this way, I prepared to throw the ball by switching on the lamp. + +26. I prepared to pretend to be a spy. I did this by reading the ghost writing. First, I wrote with the ghost pen. Second, I sent it to a friend. Third, I observed her make the writing appear by scribbling with the ghost reading pen. In this way, I prepared to pretend to be a spy by reading the ghost writing. + +27. I prepared to wait for a reply. I did this by writing with the pen. First, I addressed the letter. Second, I wrote the letter. Third, I sealed and delivered it. In this way, I prepared to wait for a reply by writing with the pen. + +28. I prepared to write the proboscis' function. I did this by verifying the proboscis' existence. First, I examined the head of a fly. Second, I verified that the proboscis protruded from the fly's head. Third, I drew a diagram of this. In this way, I prepared to write the proboscis' function by verifying the proboscis' existence. + +29. I prepared to gain the ships' attention. I did this by rotating the lighthouse light. First, I held the lighthouse light lever. Second, I turned it. Third, I stopped when the light was pointing in the correct direction. In this way, I prepared to gain the ships' attention by rotating the lighthouse light. + +30. I prepared to plan the day. I did this by talking to the man. First, I found something to do. Second, I found that he agreed to do it with me. Third, we did it. In this way, I prepared to plan the day by talking to the man.
+ +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green God Algorithm 4 of 4.txt b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green God Algorithm 4 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..f06518379d21c3090d2f1fa666cc4118f8f691a0 --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green God Algorithm 4 of 4.txt @@ -0,0 +1,29 @@ +["Green, L 2021, God Algorithm 4 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF PEDAGOGY +by Lucian Green +God Algorithm 4 of 4 + +31. I prepared to make the film. I did this by determining that the film director's apple was positive. First, I found the apple. Second, I bit it. Third, I swallowed the mouthful. In this way, I prepared to make the film by determining that the film director's apple was positive. + +32. I prepared to receive the payment. I did this by washing the lady's colander. First, I found the colander in the cupboard. Second, I retrieved the scourer. Third, I washed the colander with the scourer and water. In this way, I prepared to receive the payment by washing the lady's colander. + +33. I prepared to save the apple seed. I did this by eating the apple. First, I bought the apple. Second, I chewed it. Third, I ate the apple core. In this way, I prepared to save the apple seed by eating the apple. + +34. I prepared to determine what to do next. I did this by building the brain which transcendental conclusions dropped into. First, I thought of statistics. Second, I thought of how it related to the reason. Third, I thought of the conclusion. In this way, I prepared to determine what to do next by building the brain which transcendental conclusions dropped into. + +35. I prepared to be famous. 
I did this by pressing down the press-stud. First, I put on the vest. Second, I pressed down the first press-stud. Third, I repeated this for each press-stud. In this way, I prepared to be famous by pressing down the press-stud. + +36. I prepared to read the pedagogy qubit. I did this by verifying the qubit using the pedagogy screen. First, I read that the qubit indicated that the breasoning was valid. Second, I read whether the pedagogy screen had been read for the breasoning. Third, I verified the qubit. In this way, I prepared to read the pedagogy qubit by verifying the qubit using the pedagogy screen. + +37. I prepared to display the banner. I did this by programming the computer to display the pixel. First, I set the pixel's x coordinate. Second, I set the pixel's y coordinate. Third, I displayed the pixel. In this way, I prepared to display the banner by programming the computer to display the pixel. + +38. I prepared to write about my feelings. I did this by typing the query 'Do you love me?' to feel positive. First, I asked if the person loved me. Second, I received an answer in the affirmative. Third, I felt positive. In this way, I prepared to write about my feelings by typing the query 'Do you love me?'. + +39. I prepared to have a nap. I did this by typing the query 'Do you love me?' to change me to positive. First, I asked if you loved me. Second, I received the answer 'Yes'. Third, my day changed to positive. In this way, I prepared to have a nap by typing the query 'Do you love me?'. + +40. I prepared to defend the King. I did this by moving the King Duchess piece. First, I made the hexagonal Duchess board. Second, I placed the King on the hexagon. Third, I moved the King on the hexagon forward. In this way, I prepared to defend the King by moving the King Duchess piece. + +41. I prepared to record the final move. I did this by moving the Queen Duchess piece. First, I protected the piece which I would corner the king with.
Second, I cornered the king. Third, I declared this checkmate. In this way, I prepared to record the final move by moving the Queen Duchess piece. + +42. I prepared to follow my strategy. I did this by moving the Rhino Duchess piece. First, I considered moving the rhino piece two hexagons forward or back and one hexagon left or right. Second, I considered alternatively moving the rhino two hexagons left or right. Third, I moved the rhino to one of these hexagons. In this way, I prepared to follow my strategy by moving the Rhino Duchess piece. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Marking Scheme - Creative Arts 1 of 4.txt b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Marking Scheme - Creative Arts 1 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..7240dbe009a609d1a3e9f3506419e1bc5838bdf9 --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Marking Scheme - Creative Arts 1 of 4.txt @@ -0,0 +1,189 @@ +["Green, L 2021, Marking Scheme - Creative Arts 1 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF PEDAGOGY +by Lucian Green +Marking Scheme - Creative Arts 1 of 1 + + + +The following marking scheme for creative arts and science SHOULD NOT be used (but currently is used, even though it implies agreement and disagreement deserve different marks). + +H1 and H2A essays have breasoning and rebreasoning completed as part of them. +a. H1 or 80-100% +An essay is given this mark if the student writes on a positive theme. The story is about five objects related to the setting. +i. 90% essays have breathsoning and rebreathsoning completed as part of them. +ii. 100% essays have space and time tests completed as part of them. +b.
H2A or 75-79% +An essay is given this mark if the student writes on a different theme. The story is about five objects related to the setting. +c. H2B or 70-74% +An essay is given this mark if the student writes on a different theme. The story is about five objects related to the setting. The objects are breasoned. +d. H3 or 65-69% +An essay is given this mark if the student writes on a positive theme. The story is about five objects related to the setting. The objects are breasoned. +e. P or 50-64% +An essay is given this mark if the student writes on a positive theme. +f. N or 0-49% +An essay is given this mark if the student writes on a different theme. + + +To earn A (80%), one should write 85 reasons using the breasoning rules (5 exposition + 5 critique + 25 detailed reasoning + 50 mind map). To earn A+, one should write 130 reasons (for each of 10 reasons per essay, 9 reasons support them, and 2 breasoned breathsonings and 1 breasoned rebreathsoning reasons support the original reason). To earn 100%, one should write 190 reasons (for each of 10 reasons per essay, 9 reasons support them, 2 breasoned breathsonings and 1 breasoned rebreathsoning reasons support the original reason, and 3 space tests and 3 time tests breasonings support the original reason). Rarely, 250 breasonings, which earn 100%, are universally recognised as supporting the spiritual imagery of a production. See Tables 1-2 below. + +Table 1. Number of breasonings required for A+ using current system. + +For each of (5 reasons in exposition + 5 reasons in critique =) 10 reasons per essay: Breasoning for reason 1 for reason n. Breasoning for Breathsoning for subject noun in reason n. Maximum A+ = 90 for 130 breasonings (range from 80.1%-90% is 86-130 breasonings). + +Breasoning for reason 2 for reason n. Breasoning for Breathsoning for object noun in reason n. ' + +Breasoning for reason 3 for reason n. Breasoning for Rebreathsoning for Verb in reason n. ' + +Breasoning for reason 4 for reason n.
+ + +Breasoning for reason 5 for reason n. + + +Breasoning for reason 6 for reason n. + + +Breasoning for reason 7 for reason n. + + +Breasoning for reason 8 for reason n. + + +Breasoning for reason 9 for reason n. + + +Table 2. Number of breasonings required for 100% using current system. + +For each of (5 reasons in exposition + 5 reasons in critique =) 10 reasons per essay: Breasoning for reason 1 for reason n. Breasoning for Breathsoning for subject noun in reason n. Maximum 100% = 190 breasonings (range from 90.1%-100% is 131-190 breasonings). + +Breasoning for reason 2 for reason n. Breasoning for Breathsoning for object noun in reason n. ' + +Breasoning for reason 3 for reason n. Breasoning for Rebreathsoning for Verb in reason n. ' + +Breasoning for reason 4 for reason n. Space Test: Breasoning for room in reason n. ' + +Breasoning for reason 5 for reason n. Space Test: Breasoning for part of room in reason n. ' + +Breasoning for reason 6 for reason n. Space Test: Breasoning for direction in room in reason n. ' + +Breasoning for reason 7 for reason n. Time Test: Breasoning for time to prepare for action in reason n. ' + +Breasoning for reason 8 for reason n. Time Test: Breasoning for time to do action in reason n. ' + +Breasoning for reason 9 for reason n. Time Test: Breasoning for time to finish action in reason n. ' + +The following marking scheme SHOULD be used (because it gives agreement and disagreement the same mark): + +H1 and H2A essays have breasoning and rebreasoning completed as part of them. +a. A or 75-100% +An essay is given this mark if the student writes on a positive or negative (with examples of how a positive thing shouldn't go wrong) theme. The story is about five objects related to the setting. +i. A+ (87.5%) essays have breathsoning and rebreathsoning completed as part of them. +ii. 100% essays have space and time tests completed as part of them. + +b.
B or 65-74% +An essay is given this mark if the student writes on a positive or negative theme. The story is about five objects related to the setting. The objects are breasoned. + +c. P or 50-64% +An essay is given this mark if the student writes on a positive or negative theme. + +d. N or 0-49% +An essay is given this mark if the student doesn't answer the question. + +See * above. + +To earn A (75%), one should write 85 reasons using the breasoning rules (5 exposition + 5 critique + 25 detailed reasoning + 50 mind map). To earn A+, one should write 130 reasons (for each of 10 reasons per essay, 9 reasons support them, and 2 breasoned breathsonings and 1 breasoned rebreathsoning reasons support the original reason). To earn 100%, one should write 190 reasons (for each of 10 reasons per essay, 9 reasons support them, 2 breasoned breathsonings and 1 breasoned rebreathsoning reasons support the original reason, and 3 space tests and 3 time tests breasonings support the original reason). Rarely, 250 breasonings, which earn 100%, are universally recognised as supporting the spiritual imagery of a production. See Tables 3-4 below. + +Table 3. Number of breasonings required for A+ using suggested equitable system. + +For each of (5 reasons in exposition + 5 reasons in critique =) 10 reasons per essay: Breasoning for reason 1 for reason n. Breasoning for Breathsoning for subject noun in reason n. Maximum A+ = 87.5 for 130 breasonings (range from 75.1%-87.5% is 86-130 breasonings). + + +Breasoning for reason 2 for reason n. Breasoning for Breathsoning for object noun in reason n. ' + + +Breasoning for reason 3 for reason n. Breasoning for Rebreathsoning for Verb in reason n. ' + + +Breasoning for reason 4 for reason n. + + + + + +Breasoning for reason 5 for reason n. + + + + + +Breasoning for reason 6 for reason n. + + + + + +Breasoning for reason 7 for reason n. + + + + + +Breasoning for reason 8 for reason n. + + + + + +Breasoning for reason 9 for reason n.
+ + + + +Table 4. Number of breasonings required for 100% using suggested equitable system. + +For each of (5 reasons in exposition + 5 reasons in critique =) 10 reasons per essay: Breasoning for reason 1 for reason n. Breasoning for Breathsoning for subject noun in reason n. Maximum 100% = 190 breasonings (range from 87.6%-100% is 131-190 breasonings). + + +Breasoning for reason 2 for reason n. Breasoning for Breathsoning for object noun in reason n. ' + + +Breasoning for reason 3 for reason n. Breasoning for Rebreathsoning for Verb in reason n. ' + + +Breasoning for reason 4 for reason n. Space Test: Breasoning for room in reason n. ' + + +Breasoning for reason 5 for reason n. Space Test: Breasoning for part of room in reason n. ' + + +Breasoning for reason 6 for reason n. Space Test: Breasoning for direction in room in reason n. ' + + +Breasoning for reason 7 for reason n. Time Test: Breasoning for time to prepare for action in reason n. ' + + +Breasoning for reason 8 for reason n. Time Test: Breasoning for time to do action in reason n. ' + + +Breasoning for reason 9 for reason n. Time Test: Breasoning for time to finish action in reason n. ' + + + +On my blog, I wrote after conferring with the Melbourne University Vice Chancellor Glyn Davis that agreement and disagreement equitably deserve the same grade. Later, the University may institute this change. Teachers and lecturers may recalculate the equitable grade by modifying the current system's grade (see Table 5.) or counting breasonings written down as part of a computational marking scheme. + +Table 5. 
Conversion table from old marking scheme to new equitable marking scheme + +Current marking scheme grade letter | Current marking scheme | Number of breasonings in current marking scheme | New equitable marking scheme grade letter | New equitable marking scheme | Number of breasonings in new equitable marking scheme +A++ | 90.1-100% | 131-190 | A++ | 87.6-100% | 131-190 +A+ | 80.1-90% | 86-130 | A+ | 75.1-87.5% | 86-130 +H1 or A | 80.00% | 85 | A | 75.00% (ranges from 75-79% or 80-80% in current marking scheme) | 85 +H2A | 75-79% | 85 | ' ' +H2B | 70-74% | 70-84 | B | 65-74% (ranges from 65-69% or 70-74% in current marking scheme) | 65-84 +H3 | 65-69% | 65-69 | ' ' +P | 50-64% | 50-64 | P | 50-64% | 50-64 +N | 0-49% | 0-49 | N | 0-49% | 0-49 + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Marking Scheme - Humanities and Science 1 of 4.txt b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Marking Scheme - Humanities and Science 1 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..450de5d27310d361c85ef8c3f9cd9a8339df3d48 --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Marking Scheme - Humanities and Science 1 of 4.txt @@ -0,0 +1,230 @@ +["Green, L 2021, Marking Scheme - Humanities and Science 1 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF PEDAGOGY +by Lucian Green +Marking Scheme - Humanities and Science 1 of 1 + + + +1. The following is the marking scheme for humanities which SHOULD NOT be used (but currently is used, even though it implies agreement and disagreement deserve different marks): + +H1 and H2A essays must have breasoned objects and rebreasoned actions completed as part of them. + +1a. H1 or 80-100% +An essay is given this mark if the student agrees with the side of the contention agreeing with the writer.
An exposition in the first half and critique (agreeing with it) in the second half are required. An exposition is a paraphrasing of the text. A critique is an argument about the text in five paragraphs. +i. 90% essays must have breathsoning and rebreathsoning completed as part of them. +ii. 100% essays must have space and time tests completed as part of them. + +2. H2A or 75-79% +An essay is given this mark if the student differs in opinion from the side of the contention agreeing with the writer. An exposition in the first half and critique (differing in opinion from it) in the second half are required. + +3. H2B or 70-74% +An essay is given this mark if the student differs in opinion from the side of the contention agreeing with the writer. +An exposition in the first half and critique (differing in opinion from it) in the second half are required. The objects must be breasoned. + +4. H3 or 65-69% +An essay is given this mark if the student agrees with the writer in an organised way. +An exposition in five paragraphs is required. The objects must be breasoned. + +5. P or 50-64% +An essay is given this mark if the student agrees with the writer. +An exposition in a number of paragraphs other than five is required. + +6. N or 0-49% +An essay is given this mark if the student differs in opinion from the writer in the first half or answers another question. +A critique in any number of paragraphs is required. 
+ +To earn A (80%), one should write 85 reasons using the breasoning rules (5 exposition + 5 critique + 25 detailed reasoning + 50 mind map). To earn A+, one should write 130 reasons (for each of 10 reasons per essay, 9 reasons support them, and 2 breasoned breathsonings and 1 breasoned rebreathsoning reasons support the original reason). To earn 100%, one should write 190 reasons (for each of 10 reasons per essay, 9 reasons support them, 2 breasoned breathsonings and 1 breasoned rebreathsoning reasons support the original reason, and 3 space tests and 3 time tests breasonings support the original reason). Rarely, 250 breasonings, which earn 100%, are universally recognised as supporting the spiritual imagery of a production. See Tables 1-2 below. + +Table 1. Number of breasonings required for A+ using current system. + +For each of (5 reasons in exposition + 5 reasons in critique =) 10 reasons per essay: Breasoning for reason 1 for reason n. Breasoning for Breathsoning for subject noun in reason n. Maximum A+ = 90 for 130 breasonings (range from 80.1%-90% is 86-130 breasonings). + + +Breasoning for reason 2 for reason n. Breasoning for Breathsoning for object noun in reason n. ' + + +Breasoning for reason 3 for reason n. Breasoning for Rebreathsoning for Verb in reason n. ' + + +Breasoning for reason 4 for reason n. + + + + + +Breasoning for reason 5 for reason n. + + + + + +Breasoning for reason 6 for reason n. + + + + + +Breasoning for reason 7 for reason n. + + + + + +Breasoning for reason 8 for reason n. + + + + + +Breasoning for reason 9 for reason n. + + + + +Table 2. Number of breasonings required for 100% using current system. + +For each of (5 reasons in exposition + 5 reasons in critique =) 10 reasons per essay: Breasoning for reason 1 for reason n. Breasoning for Breathsoning for subject noun in reason n. Maximum 100% = 190 breasonings (range from 90.1%-100% is 131-190 breasonings). + +Breasoning for reason 2 for reason n.
Breasoning for Breathsoning for object noun in reason n. ' + + +Breasoning for reason 3 for reason n. Breasoning for Rebreathsoning for Verb in reason n. ' + + +Breasoning for reason 4 for reason n. Space Test: Breasoning for room in reason n. ' + + +Breasoning for reason 5 for reason n. Space Test: Breasoning for part of room in reason n. ' + + +Breasoning for reason 6 for reason n. Space Test: Breasoning for direction in room in reason n. ' + + +Breasoning for reason 7 for reason n. Time Test: Breasoning for time to prepare for action in reason n. ' + + +Breasoning for reason 8 for reason n. Time Test: Breasoning for time to do action in reason n. ' + + +Breasoning for reason 9 for reason n. Time Test: Breasoning for time to finish action in reason n. ' + + + + +The following marking scheme for humanities SHOULD be used (because it gives agreement and disagreement the same mark): + +H1 and H2A essays must have breasoned objects and rebreasoned actions completed as part of them. + +1a. A or 75-100% +An essay is given this mark if the student either agrees with or disagrees with the side of the contention agreeing with the writer, regardless. An exposition in the first half and critique in the second half are required. An exposition is a paraphrasing of the text. A critique is an argument about the text in five paragraphs. +i. A+ (87.5%) essays must have breathsoning and rebreathsoning completed as part of them. +ii. 100% essays must have space and time tests completed as part of them. + +2. B or 65-74% +An essay is given this mark if the student agrees or differs in opinion from the side of the contention agreeing with the writer, regardless. +An exposition in the first half and critique (differing in opinion from it) in the second half are required. The objects must be breasoned. + +3. C or 50-64% +An essay is given this mark if the student agrees or disagrees with the writer, regardless. +An exposition in a number of paragraphs other than five is required. + +4.
N or 0-49% +An essay is given this mark if the student doesn't answer the question. +A critique in any number of paragraphs is required. + + +To earn A (75%), one should write 85 reasons using the breasoning rules (5 exposition + 5 critique + 25 detailed reasoning + 50 mind map). To earn A+, one should write 130 reasons (for each of 10 reasons per essay, 9 reasons support them, and 2 breasoned breathsonings and 1 breasoned rebreathsoning reasons support the original reason). To earn 100%, one should write 190 reasons (for each of 10 reasons per essay, 9 reasons support them, 2 breasoned breathsonings and 1 breasoned rebreathsoning reasons support the original reason, and 3 space tests and 3 time tests breasonings support the original reason). Rarely, 250 breasonings, which earn 100%, are universally recognised as supporting the spiritual imagery of a production. See Tables 3-4 below. + +Table 3. Number of breasonings required for A+ using suggested equitable system. + +For each of (5 reasons in exposition + 5 reasons in critique =) 10 reasons per essay: Breasoning for reason 1 for reason n. Breasoning for Breathsoning for subject noun in reason n. Maximum A+ = 87.5 for 130 breasonings (range from 75.1%-87.5% is 86-130 breasonings). + + +Breasoning for reason 2 for reason n. Breasoning for Breathsoning for object noun in reason n. ' + + +Breasoning for reason 3 for reason n. Breasoning for Rebreathsoning for Verb in reason n. ' + + +Breasoning for reason 4 for reason n. + + + + + +Breasoning for reason 5 for reason n. + + + + + +Breasoning for reason 6 for reason n. + + + + + +Breasoning for reason 7 for reason n. + + + + + +Breasoning for reason 8 for reason n. + + + + + +Breasoning for reason 9 for reason n. + + + + +Table 4. Number of breasonings required for 100% using suggested equitable system. + +For each of (5 reasons in exposition + 5 reasons in critique =) 10 reasons per essay: Breasoning for reason 1 for reason n.
Breasoning for Breathsoning for subject noun in reason n. Maximum 100% = 190 breasonings (range from 87.6%-100% is 131-190 breasonings). + + +Breasoning for reason 2 for reason n. Breasoning for Breathsoning for object noun in reason n. ' + + +Breasoning for reason 3 for reason n. Breasoning for Rebreathsoning for Verb in reason n. ' + + +Breasoning for reason 4 for reason n. Space Test: Breasoning for room in reason n. ' + + +Breasoning for reason 5 for reason n. Space Test: Breasoning for part of room in reason n. ' + + +Breasoning for reason 6 for reason n. Space Test: Breasoning for direction in room in reason n. ' + + +Breasoning for reason 7 for reason n. Time Test: Breasoning for time to prepare for action in reason n. ' + + +Breasoning for reason 8 for reason n. Time Test: Breasoning for time to do action in reason n. ' + + +Breasoning for reason 9 for reason n. Time Test: Breasoning for time to finish action in reason n. ' + + + +On my blog, I wrote after conferring with the Melbourne University Vice Chancellor Glyn Davis that agreement and disagreement equitably deserve the same grade. Later, the University may institute this change. Teachers and lecturers may recalculate the equitable grade by modifying the current system's grade (see Table 5.) or counting breasonings written down as part of a computational marking scheme. + +Table 5. 
Conversion table from old marking scheme to new equitable marking scheme + +Current marking scheme grade letter | Current marking scheme | Number of breasonings in current marking scheme | New equitable marking scheme grade letter | New equitable marking scheme | Number of breasonings in new equitable marking scheme +A++ | 90.1-100% | 131-190 | A++ | 87.6-100% | 131-190 +A+ | 80.1-90% | 86-130 | A+ | 75.1-87.5% | 86-130 +H1 or A | 80.00% | 85 | A | 75.00% (ranges from 75-79% or 80-80% in current marking scheme) | 85 +H2A | 75-79% | 85 | ' ' +H2B | 70-74% | 70-84 | B | 65-74% (ranges from 65-69% or 70-74% in current marking scheme) | 65-84 +H3 | 65-69% | 65-69 | ' ' +P | 50-64% | 50-64 | P | 50-64% | 50-64 +N | 0-49% | 0-49 | N | 0-49% | 0-49 + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Part of Room 1 of 4.txt b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Part of Room 1 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..c883f16822fe85c0a7b7f5017a6787dd323882f2 --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Part of Room 1 of 4.txt @@ -0,0 +1,37 @@ +["Green, L 2021, Part of Room 1 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF PEDAGOGY +by Lucian Green +Part of Room 1 of 4 + + + +1. The blind girl's teacher prepared to sit down. He did this by lolling the rice paper cylinder with his tongue. First, he touched the bottom of it with his tongue. Second, he touched the top-left of it with his tongue. Third, he touched the top-right of it with his tongue. In this way, the blind girl's teacher prepared to sit down by lolling the rice paper cylinder. + + +2. The disabilities teacher wished good luck to all of his students. He did this by squirting the gourd. First, he lifted the gourd to his mouth. Second, he squeezed the gourd.
Third, he squirted the gourd into his lips. In this way, the disabilities teacher prepared to wish good luck to all of his students by squirting the gourd. + + +3. The disabilities teacher student prepared to ask for each assessment criterion. He did this by washing each part of his face. First, he washed his eyes. Second, he washed his nose. Third, he washed his mouth. In this way, the disabilities teacher student prepared to ask for each assessment criterion by washing each part of his face. + + +4. The disabilities teacher student prepared to make a 3D model of an adjective describing a competency. He did this by washing the inside of the cylinder. First, he washed the inside of the base. Second, he washed the inside of the side. Third, he washed the inside of the top. In this way, the disabilities teacher student prepared to make a 3D model of an adjective describing a competency by washing the inside of the cylinder. + + +5. The disabilities teacher-student helped the self-preservation of a student. He did this by licking the ice-cream. First, he held the ice-cream by the cone. Second, he licked the ice-cream. Third, he licked the ice-cream until he had finished licking it. In this way, the disabilities teacher-student prepared to help the self-preservation of a student by licking the ice-cream. + + +6. The disabilities teacher student prepared to assess a 'done-up' assignment (with a short story containing 64 departmental perspectives about it) and a 'seen-as' version of 'A' quality written by the student. He did this by placing the bird model in the hole. First, he lifted the bird model up. Second, he walked to the hole. Third, he placed it in the hole. In this way, the disabilities teacher student prepared to assess a 'done-up' assignment and a 'seen-as' version of 'A' quality written by the student by placing the bird model in the hole. + + +7. 
The disabilities teacher-student tested that the gifted student had thought of five breasonings per idea (ideas with objects measured in the x, y and z dimensions, that a professor would think of as being like cleaning a test tube). He did this by bringing the bird model through the air to his desk. First, he lifted the bird model above his shoulder. Second, he walked to his desk. Third, he placed it on his desk. In this way, the disabilities teacher-student tested that the gifted student had thought of five breasonings per idea by bringing the bird model through the air to his desk. + + +8. The computational Hegelian programmed an example about intersubjectivity. He did this by pretending to fly home, like a bird. First, he stood up from his seat. Second, he raised and lowered his arms repeatedly, like a bird flying. Third, he walked forwards. In this way, the computational Hegelian programmed an example about intersubjectivity by pretending to fly home, like a bird. + + +9. The philosopher comparing normativity (judgments) with Hegel (intersubjectivity) wrote down the competencies of each subject. He did this by sewing the sheet over itself by 0.01 m. First, he threaded thread through the needle. Second, he positioned the needle at one end of the sheet. Third, he sewed the sheet over itself. In this way, the philosopher comparing normativity with Hegel wrote down the competencies of each subject by sewing the sheet over itself. + + +10. The Computational human rights philosopher (giving acts and prevention of omissions equal importance) gave both subsistence (acting) rights and security (prevention omissions) rights necessity scores of 10/10. He did this by mixing the apple and cream to put in the pie. First, he placed an apple in a bowl. Second, he put cream in a bowl. Third, he mixed the apple and cream together. 
In this way, the Computational human rights philosopher gave both subsistence (acting) rights and security (preventing omissions) rights necessity scores of 10/10 by mixing the apple and cream to put in the pie. + + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Part of Room 2 of 4.txt b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Part of Room 2 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..92a6701140f4a42268acc88915d6b8fe64bee941 --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Part of Room 2 of 4.txt @@ -0,0 +1,17 @@ +["Green, L 2021, Part of Room 2 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF PEDAGOGY
+by Lucian Green
+Part of Room 2 of 4
+
+11. The philosopher comparing normativity (judgments) with the human rights philosopher (stating the distinction of acts and prevention of omissions is a mixture) tested that all the subjects' duties (acts) and rights (prevention of omissions) were the same. He did this by separating the lettuce and tomato using a knife and spoon. First, he placed the lettuce and tomato on the plate. Second, he moved the lettuce to the left with the knife. Third, he moved the tomato to the right with the spoon. In this way, the philosopher comparing normativity (judgments) with the human rights philosopher (stating the distinction of acts and prevention of omissions is a mixture) tested that all the subjects' duties (acts) and rights (prevention of omissions) were the same by separating the lettuce and tomato using a knife and spoon. + + +12. The religious university union club speaker linked a student into a subject by thinking of an 'A' (using the ideas from 6. and 7. above) about him.
He did this by recognizing whose bedroom it was. First, he matched the sheet colour with that of his friend. Second, he matched the blanket cover colour with that of his friend. Third, he matched the wall colour with that of his friend. In this way, the religious university union club speaker linked a student into a subject by thinking of an 'A' about him by recognizing whose bedroom it was. + + +13. The independent school student chose the best passages to write on. He did this by maintaining good posture to retain his heart health. First, he detected which way was up using his vestibular system. Second, he stood up. Third, he stood up straight. In this way, the independent school student chose the best passages to write on by maintaining good posture to retain his heart health. + + +14. The school student chose the sentences containing the key term from the question. She did this by reading the book of timetables. First, she selected the correct timetable from the book. Second, she selected the correct day from the timetable. Third, she selected the correct stop for the day from the timetable. In this way, the school student chose the sentences containing the key term from the question by reading the book of timetables. 
+ + +15-"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Part of Room 3 of 4.txt b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Part of Room 3 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..72acf32d27e568a3a0ccb26844411b0bc882bf31 --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Part of Room 3 of 4.txt @@ -0,0 +1,31 @@ +["Green, L 2021, Part of Room 3 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF PEDAGOGY
+by Lucian Green
+Part of Room 3 of 4
+
+21. Same as Rebreathsoning 15-21. + + +22. The Asperger patient prepared to carry out a dissertation by rote memory, by using his body as a basis for a mnemonic for the essay structure. He did this by sitting on a pious seat of right. First, he walked to the seat. Second, he lowered it. Third, he sat on it. In this way, the Asperger patient prepared to carry out a dissertation by rote memory, by using his body as a basis for a mnemonic for the essay structure by sitting on a pious seat of right. + + +23. The competition of brains competitor defined a domain for an area of study by relating parts of an area of study to an object. He did this by biting a biteful from the finger biscuit. First, he took the biscuit from the packet. Second, he bit it. Third, he placed it in a plastic wrapper. In this way, the competition of brains competitor defined a domain by relating parts of an area of study to an object by biting a biteful from the finger biscuit. + + +24. The simulatory problem solver invented a colour language (with rainbow inferences between words) that he simulated as starting thoughts for an area of study. He did this by squeezing the gel onto his stomach. First, he picked up the gel.
Second, he squeezed it onto his stomach. Third, he put it away. In this way, the simulatory problem solver invented a colour language (with rainbow inferences between words) that he simulated as starting thoughts by squeezing the gel onto his stomach. + + +25. The brainworks participant wrote breasonings (thought of X, Y and Z dimensions for objects) for a 'seen-as' essay based directly on secondary literature, and handed in an essay of his design. He did this by moving the arch from his toes. First, he bent down. Second, he put the arch over his toes. Third, he removed the arch. In this way, the brainworks participant wrote breasonings (thought of X, Y and Z dimensions for objects) for a 'seen-as' essay based directly on secondary literature, and handed in an essay of his design by moving the arch from his toes. + + +26. The students' business program participant rotated the cutoff point in the argumentary circle, which he applied to different categories giving varying results. For example, 'The mother is in the family because the boy is in the family because the father is in the family' is modified so that the mother has a child who becomes a father, to test if there is a sex-linked disease which is present only in men. He did this by connecting the street sign to the street number. First, he went to the street. Second, he read the street sign. Third, he went to the house with the correct street number. In this way, the students' business program participant rotated the cutoff point in the argumentary circle, which he applied to different categories giving varying results, by connecting the street sign to the street number. + + +27. The running captain loved people by removing mistakes. He did this by squirting the carriage. First, he took the hose cap off. Second, he turned the tap on. Third, he squirted the carriage. In this way, the running captain loved people by removing mistakes by squirting the carriage. + + +28. The child carer took care of the child by tying up her shoelaces.
He did this by taking the finger biscuit out of the hat. First, he placed the finger biscuit in the hat. Second, he took it out. Third, he ate it. In this way, the child carer took care of the child by tying up her shoelaces by taking the finger biscuit out of the hat. + + +29. The intuitive computer shop employee wrote down a possible 'A,' in other words, 85 breasonings (ideas with objects measured in the x, y and z dimensions; a professor would think of the A as being like cleaning a test tube). The A was for differing in opinion from mistakes about high-quality thoughts in an area of study of his design from the perspective of the lecturer-in-charge. He did this by testing that he had cooked the artichoke. First, he placed it on a plate. Second, he positioned the knife on the artichoke. Third, he cut the artichoke in half to test whether it was cooked. In this way, the intuitive computer shop employee wrote down a possible 'A,' in other words, 85 breasonings for finding fault with mistakes about high-quality thoughts in an area of study of his design from the perspective of the lecturer-in-charge by testing that he had cooked the artichoke. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Professor Algorithm 1 of 4.txt b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Professor Algorithm 1 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..8f23d0421b3f1dfce381fa1732b843ed521c9cad --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Professor Algorithm 1 of 4.txt @@ -0,0 +1,29 @@ +["Green, L 2021, Professor Algorithm 1 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF PEDAGOGY
+by Lucian Green
+Professor Algorithm 1 of 4
+
+
+
+1.
First read the Pedagogy Guide (to learn how to earn an A grade in humanities essays and other assignments). Alternatively, breason out (think of the X, Y and Z dimensions of an object from) each sentence in the following numbered paragraphs. Breasoning out the Professor Algorithm by itself gives you the rest of Pedagogy. There is a success exit point after 80 breasonings (13.33 paragraphs). + +1a. I prepared to be a Professor. I did this by loving the duck. First, I heard the fluttering wings. Second, I saw the duck. Third, I held the duck. In this way, I prepared to be a Professor by loving the duck. + +2. I prepared to be an actor in my film. I did this by directing the film. First, I held the magulous (sic) details in my hands. Second, I made magic up. Third, I had the film given to me. In this way, I prepared to be an actor in my film by directing the film. + +3. I prepared to use the society. I did this by observing the class study societology. First, I observed the society. Second, I observed the class study it. Third, I liked the society. In this way, I prepared to use the society by observing the class study societology. + +4. I prepared to listen to jive music. I did this by observing the student study popology. First, I liked a person who followed another person. Second, I liked helping amnesiacs. Third, I helped you to perform well in philosophy. In this way, I prepared to listen to jive music by observing the student study popology. + +5. I prepared to count the breasonings using Prolog (detecting breasonings top-down), breavsonings (collecting comments and congratulating for all comments), wackery (giving more breasonings in response) and gestation (pretending to count the breasonings while the baby is in one's spiritual womb). I did this by counting the breasonings using mind reading. First, I mesmerized the crowds. Second, I helped them gestate. Third, I prevented the ignoramus (preventing them from incorrectly counting the number of breasonings).
In this way, I prepared to count the breasonings using Prolog (detecting breasonings top-down), breavsonings (collecting comments and congratulating for all comments), wackery (giving more breasonings in response) and gestation (pretending to count the breasonings while the baby is in one's spiritual womb) by counting the breasonings using mind reading. + +6. I prepared to earn the heavenly (good) grade. I did this by observing that the algorithm graded the breasoning list. First, I was with-it over the switch's breasoning creating capability. Second, I was like Maharishi (the man). Third, I was like the rest of the yogis. In this way, I prepared to earn the heavenly (good) grade by observing that the algorithm graded the breasoning list. + +7. I prepared to be equitable. I did this by observing that the algorithm graded the argument map with the same grade for disagreement as agreement. First, I found agreement in the text. Second, I found disagreement in the text with a spiritual objection (like a quantum box/prayer and a nut unscrewed from a bolt, or a B or disagreeing argument to it with its reasons as spiritual answers to 15 questions for attaching to a 10 breasoning B) to it. Third, I gave the assignment the same grade as agreement. In this way, I prepared to be equitable by observing that the algorithm graded the argument map with the same grade for disagreement as agreement. + +8. I prepared to agree with the essay's grade given the overall side it took. I did this by observing that the algorithm graded the argument map with objections and rebuttals with the same grade for disagreement as agreement. First, I recognised the reason as agreeing and a direct objection to it as changing the pair to a disagreement. Second, I placed the objection to a non-written reason and connected it to the reason, changing the pair to a disagreement. 
Third, I placed the rebuttal to a non-written reason and connected it to the objection, changing the sequence of reason, objection and rebuttal to agreement. In this way, I prepared to agree with the essay's grade given the overall side it took by observing that the algorithm graded the argument map with objections and rebuttals with the same grade for disagreement as agreement. + +9. I prepared to reach equality with the modern pedagogy format. I did this by awarding the appropriate grade to the essay in the modern pedagogy format. First, I verified that the disagreeing exposition did not cause the essay to fail, and noted that it could earn an A grade. Second, I determined the grade by whether all of the branches agreed or disagreed, taking into account objections and rebuttals. Third, I awarded agreement and disagreement the same grade. In this way, I prepared to reach equality with the modern pedagogy format by awarding the appropriate grade to the essay in the modern pedagogy format. + +10. I prepared to observe the sunshine express move on. I did this by disagreeing with the traditional pedagogy format in favour of the modern pedagogy format. First, I observed that the disagreeing exposition caused the essay to fail (where it should be allowed to earn a maximum of an A grade). Second, I observed that one disagreement during the essay caused it to earn a maximum of a B grade (where the entire argument map, including objections and rebuttals, should be taken into account when marking, and disagreement should be allowed a maximum of an A grade). Third, I observed that overbearingly disagreeing essays were awarded a maximum of a B grade (where disagreement should be allowed a maximum of an A grade). In this way, I prepared to observe the sunshine express move on by disagreeing with the traditional pedagogy format in favour of the modern pedagogy format.
+ +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Professor Algorithm 2 of 4.txt b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Professor Algorithm 2 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..6cbd6fcc3b959a7fa854ff91c882a54ec9f58718 --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Professor Algorithm 2 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Professor Algorithm 2 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF PEDAGOGY +by Lucian Green +Professor Algorithm 2 of 4 + +11. I prepared to go mad with lust for the algorithm majors. I did this by observing the algorithm major assess the students. First, I observed the population earning money from pensions, not jobs. Second, I observed the birth numbers being free. Third, I observed sustainability being a school subject. In this way, I prepared to go mad with lust for the algorithm majors by observing the algorithm major assess the students. + +12. I prepared to state what was known. I did this by stating that instead of knowing I didn't know whether the cyclist was a teacher, I knew that he was a cyclist. First, I sighted the cyclist. Second, I didn't state I didn't know whether the cyclist was a teacher. Third, I stated that the cyclist was a cyclist. In this way, I prepared to state what was known by stating that instead of knowing I didn't know whether the cyclist was a teacher, I knew that he was a cyclist. + +13. I prepared to eat an apple. I did this by stating that the government verified the University thought. First, I ate at the University store. Second, I ate with you. Third, I ate. In this way, I prepared to eat an apple by stating that the government verified the University thought. + +14. 
I prepared to think that accreditation was similar to this. I did this by stating that the University attacked the thought by stating a thought that went up from it. First, I stated I listened to a whole new thought being made up. Second, I stated my gender. Third, I helped you do it too. In this way, I prepared to think that accreditation was similar to this by stating that the University attacked the thought by stating a thought that went up from it. + +15. I prepared to gingerly go up to the Queen and say, 'I love you'. I did this by turning off the breasonings before and after breasoning them out, then wearing the top hat, for a stress-free head. First, I liked the breasonings. Second, I didn't hide them using the top hat. Third, I had a calm head. In this way, I prepared to gingerly go up to the Queen and say, 'I love you' by turning off the breasonings before and after breasoning them out, then wearing the top hat, for a stress-free head. + +16. I prepared to like being famous. I did this by finding Bs to the 5 viral As. First, I found the viral As were incompatible with conception, but were made compatible with Bs. Second, I meditated (wrote). Third, I found the 5 viral As were healthy. In this way, I prepared to like being famous by finding Bs to the 5 viral As. + +17. I prepared to like Nietzsche. I did this by writing 250 breasonings for medicine for students for each discourse's 50 As. First, I brought the students up. Second, I helped them to each thought. Third, I liked you. In this way, I prepared to like Nietzsche by writing 250 breasonings for medicine for students for each discourse's 50 As. + +18. I prepared to eat the delight, a lettuce sandwich. I did this by writing 250 breasonings for medicine for audience members for each movie's 50 As. First, I took care of the movie stars. Second, I took care of everything. Third, I took care of the audience members.
In this way, I prepared to eat the delight, a lettuce sandwich by writing 250 breasonings for medicine for audience members for each movie's 50 As. + +19. I prepared to like no mental breakdowns. I did this by noticing the doctor approve of medicine relaxation for students from other departments. First, I liked medicine. Second, I liked the shower stone. Third, I liked rubbing it in well. In this way, I prepared to like no mental breakdowns by noticing the doctor approve of medicine relaxation for students from other departments. + +20. I prepared to love God (the master). I did this by observing the person upgrading to University medicine. First, I loved Medicine. Second, I loved you. Third, I loved cheesonings (giving medicine things to solve). In this way, I prepared to love God (the master) by observing the person upgrading to University medicine. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Professor Algorithm 3 of 4.txt b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Professor Algorithm 3 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..a49e340f638e4386d037dea444290498c6ccb64b --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Professor Algorithm 3 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Professor Algorithm 3 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF PEDAGOGY +by Lucian Green +Professor Algorithm 3 of 4 + +21. I prepared to write areas of study. I did this by stating that Dadirri (the word) is supported in University meditation (philosophy) graduates. First, I stated that having a relationship was inside the Vocational Education Diploma. Second, I stated that being a playwright was inside the Vocational Theater Studies Diploma. 
Third, I stated that being a musician was inside the Vocational Music Diploma. In this way, I prepared to write areas of study by stating that Dadirri (the word) is supported in University meditation (philosophy) graduates. + +22. I prepared to go to the head of the class. I did this by writing the psychiatrist was right now. First, I visited the private clinic. Second, I found the psychiatrist was right. Third, I found I was right with the psychiatrist. In this way, I prepared to go to the head of the class by writing the psychiatrist was right now. + +23. I prepared to like the big red apple. I did this by writing that everything (the pinball pin) changes. First, I recommended changing the rules so that Masters by Research didn't require an undergraduate degree in that field. Second, I played pinball. Third, I compensated as some of the goals' scores changed. In this way, I prepared to like the big red apple by writing that everything (the pinball pin) changes. + +24. I prepared to love the hydrangeas. I did this by being famous on the Artificial Intelligence web site. First, I loved God (the mistress). Second, I loved being there. Third, I loved you. In this way, I prepared to love the hydrangeas by being famous on the Artificial Intelligence web site. + +25. I prepared to eat the spinach. I did this by liking the fast clapping. First, I clapped at the President's navel. Second, I clapped in a thunder-struck way. Third, I held hands with the guy. In this way, I prepared to eat the spinach by liking the fast clapping. + +26. I prepared to state that computer science has finished. I did this by observing the pedagogue determine that the electron transcended the wire to create a breasoning after three others like it were all blessed by Krishna. First, I helped Krishna to create the breasonings before the current breasoning. Second, I helped Krishna to create the current breasoning. Third, I helped Krishna to create the breasonings after the current breasoning. 
In this way, I prepared to state that computer science has finished by observing the pedagogue determine that the electron transcended the wire to create a breasoning after three others like it were all blessed by Krishna. + +27. I prepared to win the pinball maze game. I did this by completing the pinball maze. First, I helped myself to the pinball maze-making materials. Second, I made it. Third, I played with the pinball maze game. In this way, I prepared to win the pinball maze game by completing the pinball maze. + +28. I prepared to be featured on television. I did this by writing the literature review in my PhD. First, I wrote the 10 breasoning A for the sentence. Second, I wrote the area of study points for each of the 10 breasonings. Third, I synthesised these, and then rewrote the PhD sentence. In this way, I prepared to be featured on television by writing the literature review in my PhD. + +29. I prepared to be visited by a professor from another country and be given Nietzsche as an inspiration. I did this by writing the questions in my PhD. First, I wrote 5 specific As on the topic. Second, I read and dotted on 50 texts on the topic. Third, I wrote on specific topics during my career. In this way, I prepared to be visited by a professor from another country and be given Nietzsche as an inspiration by writing the questions in my PhD. + +30. I prepared to write theory As and their corresponding arguments as reasons for each reason for the hypothesis. I did this by writing the hypotheses in my PhD. First, I made sure the argument had some structure. Second, I answered each question with a hypothesis. Third, I reasoned out the hypothesis' reasons. In this way, I prepared to write theory As and their corresponding arguments as reasons for each reason for the hypothesis by writing the hypotheses in my PhD. 
+ +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Professor Algorithm 4 of 4.txt b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Professor Algorithm 4 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..5d9a5e95ca223b746af7baa470fa37609d619185 --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Professor Algorithm 4 of 4.txt @@ -0,0 +1,29 @@ +["Green, L 2021, Professor Algorithm 4 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF PEDAGOGY +by Lucian Green +Professor Algorithm 4 of 4 + +31. I prepared to spiritually question a sentence with 15 questions against 10-20 breasonings for the sentence, after a student writer wrote on the topic. I did this by writing 10 breasonings per sentence. First, I wrote 10-20 breasonings. Second, I joined their corresponding area of study points to them. Third, I wrote a synthesis for these. In this way, I prepared to spiritually question 2 sentences with 15 questions each at a time against 50 breasonings for the two sentences, after a student writer wrote on the topic by writing 10 breasonings per sentence. + +32. I prepared to sell each PhD sentence. I did this by writing 6 breasonings for each PhD sentence. First, I read the sentence. Second, I spiritually questioned the sentence with 15 questions. Third, I observed the 6 breasonings being given to each sentence. In this way, I prepared to sell each PhD sentence by writing 6 breasonings for each PhD sentence. + +33. I prepared to be famous. I did this by finding out a sentence from the professor. First, I gave myself 5 10 breasoning As to find out sentences from the professor. Second, I wrote the tenure argument. Third, I wrote the secondary text argument. 
In this way, I prepared to be famous by finding out a sentence from the professor. + +34. I prepared to offer breasoning training to students. I did this by stating that all University students feel smart because of the breasoning breasonings. First, I helped them to breasonings. Second, I helped them to write it. Third, I loved the editor. In this way, I prepared to offer breasoning training to students by stating that all University students feel smart because of the breasoning breasonings. + +35. I prepared to spiritually write the program with three transformations and code comments by writing it, writing it well and writing 'correct'. I did this by programming the breasoning. First, I looked at the breasoning. Second, I helped it to be a breasoning. Third, I wrote a program for it. In this way, I prepared to spiritually write the program with three transformations and code comments by writing it, writing it well and writing 'correct' by programming the breasoning. + +36. I prepared to give positive feedback to myself. I did this by collecting the high-quality comment after climbing. First, I climbed. Second, I collected the high-quality comment. Third, I ate some muesli. In this way, I prepared to give positive feedback to myself by collecting the high-quality comment after climbing. + +37. I prepared to cheat fate. I did this by avoiding a confrontation from an attacking questioner. First, I avoided the confrontation. Second, I walked down the hallway. Third, I turned left. In this way, I prepared to cheat fate by avoiding a confrontation from an attacking questioner. + +38. I prepared to help the article to be written. I did this by answering the question. First, I liked Maharishi. Second, I answered the question. Third, I observed that you wrote you liked me as a friendly person. In this way, I prepared to help the article to be written by answering the question. + +39. I prepared to verify the questioner's knowledge.
I did this by asking the same question of the questioner that he asked me. First, I wrote down the questioner's question. Second, I copied the question and gave it back to the questioner. Third, I liked the question being answered. In this way, I prepared to verify the questioner's knowledge by asking the same question of the questioner that he asked me. + +40. I prepared to think questions were intelligent. I did this by answering the questions of two hecklers, one at a time. First, I wrote about the questions. Second, I wrote about the answers. Third, I deduced the main connection between each question and answer pair. In this way, I prepared to think questions were intelligent by answering the questions of two hecklers, one at a time. + +41. I prepared to re-ask the questions. I did this by recording the questions and answers. First, I wrote down the questioner's name. Second, I wrote down the questioner's address. Third, I rewarded the questioner. In this way, I prepared to re-ask the questions by recording the questions and answers. + +42. I prepared to give the witness protectors a bonus. I did this by asking the same question of my questioner after answering it. First, I recorded the answers. Second, I recorded the details. Third, I recorded how it was loving. In this way, I prepared to give the witness protectors a bonus by asking the same question of my questioner after answering it.
+ +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Rebreathsonings 1 of 4.txt b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Rebreathsonings 1 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..f43941a208a135c820d66777133fad40191d71c0 --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Rebreathsonings 1 of 4.txt @@ -0,0 +1,37 @@ +["Green, L 2021, Rebreathsonings 1 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF PEDAGOGY +by Lucian Green +Rebreathsonings 1 of 4 + + + +1. The philosopher prepared to decide thoughts were only thoughts when thought in a with-it manner. He did this by balloonifying his arm. First, he blew up a balloon representing his arm. Second, he placed it next to the balloon representing the torso. Third, he attached the balloon representing his arm to the balloon representing the body using masking tape. In this way, the philosopher prepared to decide thoughts were only thoughts when thought in a with-it manner by balloonifying his arm. + + +2. The human rights scholar attributed a positive value to a robot qualifying its result. He did this by molding the chocolate rabbit. First, he poured chocolate into the mold. Second, he let it set overnight. Third, he poured hot water over the mold and removed the chocolate from it. In this way, the human rights philosopher prepared to attribute a positive value to a robot qualifying its thought by molding the chocolate rabbit. + + +3. The Hegelian constructed a conversation about worshippers from knowledge about God. He did this by calculating the ratio of the number of rays hitting planes from a light beam from 45 degrees to 90 degrees. 
First, he estimated the length of one of the two shorter sides of a right-angled triangle, with its longer side (the base) facing down, to be sqrt(2) = 1.4 cm. All the rays heading to the base would hit the edge if a technician shot a beam at the base at 45 degrees. Second, he calculated the length of the base, the side the rays would hit if a beam shot from 90 degrees were aimed at it, to be 2 cm. Third, he calculated that the ratio of the number of rays hitting a plane from light from 45 degrees to 90 degrees = 1.4/2 = 0.7, so more rays were hitting the base from 90 degrees than from 45 degrees. In this way, the Hegelian prepared to construct a conversation about worshippers from knowledge about God by calculating the ratio of the number of rays hitting a plane from light from 45 degrees to 90 degrees. + + +4. The autist prepared to use his unique abilities to predict the rain accurately. He did this by taking the balloon out of the bag. First, he placed the balloon in the bag. Second, he took it out at the launch site. Third, he launched it at the launch site. In this way, the autist prepared to use his unique abilities to accurately predict the rain by taking the balloon out of the bag. + + +5. The autist interested himself in the natural sciences by observing the philosopher and his student. He did this by compressing a marshmallow by biting it. First, he placed the marshmallow in his mouth. Second, he lifted his tongue until the top of the marshmallow touched the top of his lips. Third, he compressed the marshmallow by raising his tongue slightly. In this way, the autist prepared to interest himself in the natural sciences by observing the philosopher and his student by compressing a marshmallow by biting it. + + +6. The autist performed a complex calculation by examining model calculations. He did this by sitting on the seat.
First, he chose a place. Second, he stood in front of it. Third, he sat on it. In this way, the autist prepared to perform a complex calculation by examining model calculations by sitting on the seat. + + +7. The autist prepared to perform calendar calculations by counting the number of years. He did this by eating the biscuit. First, he ate the piece with the first chocolate chip. Second, he ate the piece with the red lolly. Third, he ate the piece with the blue lolly. In this way, the autist prepared to perform calendar calculations by counting the number of years by eating the biscuit. + + +8. The autist demonstrated exceptional rote memory by using a mnemonic system. He did this by eating a lolly. First, he opened the bag. Second, he took out the lolly. Third, he ate the lolly. In this way, the autist prepared to demonstrate exceptional rote memory by using a mnemonic system by eating a lolly. + + +9. The autist showed his above-average intellect by completing the IQ test. He did this by eating the minestrone. First, he wrote an idea from politics for a point of a pasta star. Second, he repeated this for each point. Third, he stopped when there were no more points left. In this way, the autist prepared to demonstrate his above-average intellect by completing the IQ test by eating the minestrone. + + +10. The autist prepared to explain that his father occupied a high position by drawing a chart. He did this by breaking the wafer. First, he measured the width of it. Second, he divided this by two. Third, he broke it in half at this point. In this way, the autist prepared to explain that his father occupied a high position by drawing a chart by breaking the wafer.
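The ray-ratio arithmetic in paragraph 3 above can be checked with a short sketch. This is a hypothetical illustration, not part of the original text: it models the count of parallel rays intercepted by the 2 cm base as proportional to the base's projected width across the beam, base × sin(angle), which reproduces the stated figures of sqrt(2) ≈ 1.4 cm at 45 degrees, 2 cm at 90 degrees, and the ratio 1.4/2 = 0.7. The function name and the projection model are assumptions.

```python
import math

def ray_fraction(base_length_cm, angle_deg):
    # The number of parallel rays intercepted by a flat base is
    # proportional to the base's projected width across the beam:
    # base_length * sin(angle between the beam and the base).
    return base_length_cm * math.sin(math.radians(angle_deg))

hits_45 = ray_fraction(2.0, 45)  # sqrt(2), about 1.4 cm of projected width
hits_90 = ray_fraction(2.0, 90)  # 2.0 cm: the full base faces the beam
ratio = hits_45 / hits_90
print(round(hits_45, 1), round(hits_90, 1), round(ratio, 1))  # 1.4 2.0 0.7
```

On this model the 90-degree beam intercepts the most rays, matching the text's conclusion that more rays hit the base from 90 degrees than from 45 degrees.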
+ + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Rebreathsonings 2 of 4.txt b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Rebreathsonings 2 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..ba248ccb2bc068ca1bb7a12687d223eed1103f71 --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Rebreathsonings 2 of 4.txt @@ -0,0 +1,35 @@ +["Green, L 2021, Rebreathsonings 2 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF PEDAGOGY +by Lucian Green +Rebreathsonings 2 of 4 + +11. The autist explained that he had several generations of ancestors of intellectuals, by labelling a point on a line. He did this by placing the octahedrons on the tray. First, he drew the unfolded octahedron on the paper. Second, he cut it out. Third, he folded it into a three-dimensional octahedron. In this way, the autist prepared to explain that he had several generations of ancestors of intellectuals, by labelling a point on a line by placing the octahedrons on the tray. + + +12. The autist prepared to demonstrate that his work performance provided some social integration by encountering a burst of A-grade thoughts because of thinking of a particular thought in the conversation. He did this by pushing the woven thread down. First, he crossed over the threads. Then, he pushed the thread down. Finally, he had an A given to him. In this way, the autist prepared to demonstrate that his work performance provided some social integration by encountering an A because of thinking of a particular thought in the conversation by pushing the woven thread down. + + +13. The autist prepared to demonstrate that his work performance was excellent by writing down why he loved people. 
He did this by lapping the water. First, he placed his mouth on the edge of the cup. Second, he drank some of the water. Third, he lifted his head. In this way, the autist prepared to demonstrate that his work performance was excellent by writing down why he loved people by lapping the water. + + +14. The autist became a faculty member in the Department of Astronomy by finding a reason for an objection to an error in high-quality work by querying what the first action that was supposed to have a reaction was because there was no other thing to react to. He did this by placing the mortar object (representing the reason for the objection) on the brick. First, he put the first brick down. Second, he put mortar on the first brick. Third, he placed the second brick on the mortar. In this way, the autist prepared to become a faculty member in the Department of Astronomy. He did this by finding a reason for an objection to an error in high-quality work by querying what the first action that was supposed to have a reaction was. He found a reason for an objection because there was no other thing to react to by placing the mortar object (representing the reason for the objection) on the brick. + + +15. The Asperger patient prepared to demonstrate his prodigious memory by associating parts to memorise with significant parts he walked past. He did this by shaking his hand in the air. First, he clenched his fist. Second, he raised it in the air. Third, he shook it. In this way, the Asperger patient prepared to demonstrate his prodigious memory by associating parts to memorise with significant parts he walked past by shaking his hand in the air. + + +16. The Asperger patient prepared to demonstrate her extraordinary preoccupation by studying with many short breaks. She did this by stabbing a sausage. First, she measured the distance one-fifth along the length of the sausage. Second, she positioned the fork above this position.
Third, she stabbed the sausage at this position. In this way, the Asperger patient prepared to demonstrate her extraordinary preoccupation by studying with many short breaks, by stabbing a sausage. + + +17. The Asperger patient showed his mastery of sports statistics by remembering the match with the top score. He did this by driving someone up to the door in a tricycle. First, he started at the edge of the courtyard. Second, he drove through the courtyard. Third, he stopped at the door. In this way, the Asperger patient prepared to demonstrate his mastery of sports statistics by remembering the match with the top score by driving someone up to the door in a tricycle. + + +18. The Asperger patient prepared to demonstrate his superb knowledge of history trivia. He did this by testing whether he was a man or a woman. First, he took off his shirt. If he was not wearing a bra, then he decided he was a man. Alternatively, if he, or she, in fact, was wearing a bra, then she decided she was a woman. In this way, the Asperger patient prepared to demonstrate his superb knowledge of history trivia by testing whether he was a man or a woman. + + +19. The Asperger patient tested his expansive conversation by researching one detail from each of the X-Y-Z/verb-touching, human-judgment-of-subject/human-judgment-of-object, room/part-of-room/direction-in-room/time-to-prepare/time-to-do/time-to-finish criteria. He did this by cooking three hundred and sixty degrees of the cabaña. First, he put the cabaña in the fire. Second, he rotated it. Third, he waited 15 minutes until he had cooked the cabaña. In this way, the Asperger patient prepared to test his expansive conversation by researching one detail from each of the X-Y-Z/verb-touching, human-judgment-of-subject/human-judgment-of-object, room/part-of-room/direction-in-room/time-to-prepare/time-to-do/time-to-finish criteria by cooking three hundred and sixty degrees of the cabaña. + + +20. 
The Asperger patient prepared to appear seemingly scholarly by doffing a wig and gown. He did this by pulling the model bird's string. First, he suspended the model from the Japanese light. Second, he held its string. Third, he pulled it. In this way, the Asperger patient prepared to appear seemingly scholarly by doffing a wig and gown by pulling the model bird's string. + + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Rebreathsonings 3 of 4.txt b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Rebreathsonings 3 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..4acef39295611708f711b33d0224d8e1282de112 --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Rebreathsonings 3 of 4.txt @@ -0,0 +1,35 @@ +["Green, L 2021, Rebreathsonings 3 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF PEDAGOGY +by Lucian Green +Rebreathsonings 3 of 4 + +21. The Asperger patient put forth conversation so liberally, by talking about and then judging an idea. He did this by lifting himself up onto a ledge. First, he found a ledge. Second, he hoisted himself onto it. Third, he stood on the ledge looking around. In this way, the Asperger patient prepared to put forth conversation so liberally, by talking about and then judging an idea by lifting himself up onto a ledge. + + +22. The labourer washed the block. He did this by cleaning the chopsticks. First, he placed the scraper at the base of the chopstick. Second, he lifted the scraper upwards. Third, he stopped when he reached the top. In this way, the labourer prepared to wash the block by cleaning the chopsticks. + + +23. The Reasoner dovetailed the arguments. He did this by plucking the feather from the model pheasant's tail. 
First, the Reasoner positioned his finger above the model feather. Second, he positioned his other finger below the model feather. Third, he plucked the model feather from the model pheasant's tail. In this way, the Reasoner prepared to dovetail the arguments by plucking the feather from the model pheasant's tail. + + +24. The farmer harvested the wheat. He did this by aligning the tofu in a straight line. First, he measured the width of the tofu. Second, he placed the left point of the tofu so that the central point at w/2 + l (where w = width of the tofu and l = x coordinate of the left-hand side) was in front of his mouth. In this way, the farmer prepared to harvest the wheat by aligning the tofu in a straight line. + + +25. The oceanographer bathed the starfish. He did this by counting the starfish's arms. First, he counted the number of suction cups. Second, he divided this by the number of suction cups per arm, 5. Third, he calculated this quotient to equal the number of arms of the starfish. In this way, the oceanographer prepared to bathe the starfish by counting its arms. + + +26. The firefighter put on the mask. He did this by putting on the sunglasses. First, he opened the sunglasses. Second, he placed them on his face. Third, he tested that the angle from each of the arms to the lenses was 90 degrees. In this way, he prepared to put on the mask by putting on the sunglasses. + + +27. The moviegoer prepared to walk home. She did this by walking to the triangle in the moonlight. First, she charted her proposed path. Second, she walked along her path. Third, she stopped when she arrived at the triangle. In this way, she prepared to walk home by walking to the triangle in the moonlight. + + +28. The navigator sailed the ship. He did this by following the stars. First, he computed which direction a particular star was pointing in. Second, he computed which goal lay in that direction. Third, he headed for that goal.
In this way, he prepared to sail the ship by following the stars. + + +29. The gardener picked the flower. He did this by colouring his illustrations using petals. First, he drew and cut out the shape of the region that he wanted to colour blue. Second, he traced the shape onto a blue petal. Third, he cut out the shape from the petal and placed it on the page. In this way, he prepared to pick the flower by colouring his illustrations using petals. + + +30. The Gastronomer placed the red lid on the pot. He did this by staining his cheeks red with the red petal. First, he plucked a petal from a rose. Second, he rubbed it in water. Third, he placed it on his cheek. In this way, he prepared to put the red lid on the pot by colouring his cheeks red with the red petal. + + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Rebreathsonings 4 of 4.txt b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Rebreathsonings 4 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..3d220f7ebaa92eae37b50fe9173e36c4afe9799a --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Rebreathsonings 4 of 4.txt @@ -0,0 +1,28 @@ +["Green, L 2021, Rebreathsonings 4 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF PEDAGOGY +by Lucian Green +Rebreathsonings 4 of 4 + +31. The king sat on the throne. He did this by testing that the bottom of a crown was flat. First, he wrote down the z-coordinate of a point of the pentagon at the base of the crown. Second, he tested that the next point's z-coordinate was the same as the previous point's z-coordinate. Third, he repeated this until there were no more points. In this way, the king prepared to sit on the throne by testing that the bottom of a crown was flat. 
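The flatness test in paragraph 31 above is effectively a small algorithm: record the z-coordinate of one vertex of the pentagon at the crown's base, then check each subsequent vertex's z-coordinate against it until no points remain. A minimal sketch of that check; the function name, the (x, y, z) tuple format and the tolerance parameter are illustrative assumptions, not from the original:

```python
def base_is_flat(points, tolerance=0.0):
    # points: (x, y, z) vertices of the pentagon at the crown's base.
    # As in the text: note the first vertex's z-coordinate, then test
    # that each following vertex has the same z, within a tolerance.
    first_z = points[0][2]
    return all(abs(p[2] - first_z) <= tolerance for p in points[1:])

flat_base = [(0, 0, 0.0), (1, 0, 0.0), (1.5, 1, 0.0), (0.5, 2, 0.0), (-0.5, 1, 0.0)]
tilted = [(0, 0, 0.0), (1, 0, 0.1), (1.5, 1, 0.0), (0.5, 2, 0.0), (-0.5, 1, 0.0)]
print(base_is_flat(flat_base))  # True
print(base_is_flat(tilted))     # False
```

Comparing every vertex against the first, rather than against its predecessor, avoids accumulating small differences around the loop.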
+ + +32. The mother sheltered under the oak tree. She did this by placing the red candle present in her green pocket. First, she held the candle at the top. Second, she opened her pocket. Third, she placed the candle in her pocket. In this way, she sheltered under the oak tree by putting the red candle present in her green pocket. + + +33. I prepared to say that children's innocence is harmless. I did this by writing that the religion (circle of philosophers) was positive. First, I wrote about the gay thought (the pencil in the pencil case). Second, I wrote about the animals (children). Third, I like you the way you are at University. In this way, I prepared to say that children's innocence is harmless by writing that the religion (circle of philosophers) was positive. + + +34. I was a Republican. I did this by stating that my knee helped me stand up straight. First, I liked standing. Second, I loved the Queen. Third, I liked myself. In this way, I was a Republican by stating that my knee helped me stand up straight. + + +35. I prepared to give A to the Chinese tutor. I did this by agreeing with the Chinese tutor. First, I liked her clothes. Second, I loved her house. Third, I enjoyed her spreams (sic) of ice cream. In this way, I prepared to give A to the Chinese tutor by agreeing with the Chinese tutor. + + +36. I interpreted Lucian's essay format as a Ph.D. I did this by holding a partner's hand to gain confidence in writing two uses. First, I held his hand. Second, I begged on my knees. Third, I worked out two uses. In this way, I prepared to interpret Lucian's essay format as a Ph.D. by holding a partner's hand to gain confidence in writing two uses. + + +37. I wrote a book about philosophy in primary school. I did this by holding a partner's hand to get confidence in writing the future use. First, I wrote about the rocket. Second, I wrote about the happy notes. Third, I wrote about University at primary and secondary school level. 
In this way, I prepared to write a book about philosophy in primary school by holding a partner's hand to gain confidence in writing the future use. + + +38. I reacted to the scene. I did this by holding a partner's hand to get confidence in writing the two types of objects. First, I thought of two types in contradistinction to each other. Second, I envisioned an object following another. Third, I envisioned the mask. In this way, I prepared to react to the scene by holding a partner's hand to gain confidence in writing the two types of objects. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Room 1 of 4.txt b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Room 1 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..7ff4c6bef561b703843b4ff3bd3de746b67b904f --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Room 1 of 4.txt @@ -0,0 +1,37 @@ +["Green, L 2021, Room 1 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF PEDAGOGY +by Lucian Green +Room 1 of 4 + + + +1. The Polytechnic designer prepared to plan for no walls between rooms in the polytechnic. He did this by eating from the flat bowl. First, he touched the left side of the pea. Second, he touched the right side of the pea with another finger on his hand. Third, he moved the pea to the left. In this way, the Polytechnic designer prepared to plan for no walls between rooms in the polytechnic by eating from the flat bowl. + + +2. The airlifter prepared to count the number of rescued crew. He did this by writing 'A' with the chalk. First, he lifted his hand to the line. Second, he applied the chalk to the blackboard. Third, he wrote the letter 'A'. 
In this way, the airlifter prepared to count the number of rescued crew by writing 'A' with the chalk. + + +3. The airlifter tested that all of the rescued crewmember's articles of clothing were securely worn. He did this by eating the slice of cake. First, he ate the strawberry. Second, he ate the cake topping. Third, he ate the main part of the slice of cake. In this way, the airlifter prepared to test that all of the rescued crewmember's articles of clothing were securely worn by eating the slice of cake. + + +4. The rocket artist drew the regions of the rocket. He did this by cutting out the letter. First, he opened the scissors at the edge of the paper. Second, he cut along the lines of the outside of the letter. Third, he folded the paper in half, made a cut in the hole of the letter, and cut out the hole. In this way, the rocket artist prepared to draw the regions of the rocket by cutting out the letter. + + +5. The pilot prepared to test that the system was working. He did this by lighting the candle. First, he made sure the candle was upright. Second, he lit the candle. Third, he made sure the flame stayed alight. In this way, the pilot prepared to test that the system was working by lighting the candle. + + +6. The helicopter pilot prepared to remove the lavender from the heliport. She did this by burning the aromatherapy oil. First, she squeezed oil from eucalyptus leaves into water. Second, she rubbed two sticks together to produce a fire. Third, she evaporated the liquid to produce an aroma. In this way, the helicopter pilot prepared to remove the lavender from the heliport by burning the aromatherapy oil. + + +7. The helicopter pilot prepared to take off and land. He did this by heating the vegan sausage. First, he placed it in a pan. Second, he lit a fire. Third, he heated the sausage in the pan over the fire. In this way, the helicopter pilot prepared to take off and land by heating the vegan sausage. + + +8. The body artist mimicked a sapling.
He did this by painting himself green with body paint. First, he uncapped the body paint. Second, he applied the body paint to a brush. Third, he applied the brush to his body. In this way, the body artist prepared to mimic a sapling by painting himself green with body paint. + + +9. The swimmer kicked the water with his feet. He did this by feeling his heartbeat. First, he undid his shirt. Second, he placed his hand on his heart. Third, he felt it beat once. In this way, the swimmer prepared to kick the water with his feet by feeling his heartbeat. + + +10. The artist prepared to dabble with the paints. He did this by adding milk to the cake mixture. First, he opened the carton of milk. Second, he placed it over the mixture. Third, he poured it onto the mixture. In this way, the artist prepared to dabble with the paints by adding milk to the cake mixture. + + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Room 2 of 4.txt b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Room 2 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..39d7dd8fafdb24e9137f3ba827c68980ed388c20 --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Room 2 of 4.txt @@ -0,0 +1,35 @@ +["Green, L 2021, Room 2 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF PEDAGOGY +by Lucian Green +Room 2 of 4 + +11. The millinery consultant prepared to manufacture the hat. He did this by writing an emoticon. First, he wrote a colon (':') representing a man's eyes. Second, he wrote a hyphen ('-') representing his nose. Third, he wrote a right parenthesis (')') representing his mouth. In this way, the millinery consultant prepared to manufacture the hat by writing an emoticon. + + +12.
The Lord prepared to appear based on objects. He did this by eating the candy cane. First, he licked the crook. Second, he ate the stick. Third, he digested the base. In this way, the Lord prepared to appear based on objects by eating the candy cane. + + +13. The doctor prepared to eat the meal with a knife and fork. He did this by eating the soufflé. First, he licked the sweet top. Second, he lifted the spoon. Third, he made an incision in the soufflé. In this way, the doctor prepared to eat the meal with a knife and fork by eating the soufflé. + + +14. The astronaut drove to the launch pad in the shuttle bus. He did this by eating the ship lolly. First, he held the lolly with one hand. Second, he unwrapped the lolly. Third, he ate the lolly. In this way, the astronaut prepared to drive to the launch pad in the shuttle bus by eating the ship lolly. + + +15. The astronaut prepared to travel through the space dock. She did this by eating the spherical space station jube. First, she placed it in front of her. Second, she melted it on her tongue. Third, she swallowed it. In this way, the astronaut prepared to travel through the space dock by eating the spherical space station jube. + + +16. The central nervous system specialist tested that the program worked. He did this by opening the deck chair. First, he placed the bottom of the back of it on the ground. Second, he held the folded seat with one of his hands. Third, he unfolded the seat, so that all four legs were touching the ground. In this way, the central nervous system specialist tested that the program worked by opening the deck chair. + + +17. The heart specialist constructed a heartbeat triangle. He did this by licking the triangular lollipop. First, he licked the lollipop. Second, he tasted the guava lollipop. Third, he bit it. In this way, the heart specialist constructed a heartbeat triangle by licking the triangular lollipop. + + +18. The physiologist read the answer in the book.
He did this by eating the spinach. First, he uncurled it. Second, he pinned down all four corners of it. Third, he cut a square from it and ate it. In this way, the physiologist read the answer in the book by eating the spinach. + + +19. The theologian scientifically tested whether God had a higher quality of life. She did this by eating the seaweed. First, she stood on the shore. Second, she waded into the sea. Third, she collected and ate the seaweed. In this way, the theologian scientifically tested whether God had a higher quality of life by eating the seaweed. + + +20. The roboticist prepared to operate on Toby to be positive. He did this by shredding the spinach. First, he cut the spinach. Second, he placed it in strips. Third, he lined up the strips in a square. In this way, the roboticist prepared to operate on Toby to be positive by shredding the spinach. + + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Room 3 of 4.txt b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Room 3 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..e6fe0b919d28bbdbec3a47f469b4f58e636e93f0 --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Room 3 of 4.txt @@ -0,0 +1,35 @@ +["Green, L 2021, Room 3 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF PEDAGOGY +by Lucian Green +Room 3 of 4 + +21. The astronaut prepared to live in a house. He did this by eating the sesame seed. First, he removed it from its packet. Second, he pressed it onto his bottom lip. Third, he chewed it. In this way, the astronaut prepared to live in a house by eating the sesame seed. + + +22. The geneticist prepared a plan for Earth Two. He did this by oiling the carrot. First, he oiled the pan.
Second, he placed the carrot in the pan. Third, he removed the carrot. In this way, the geneticist prepared a plan for Earth Two by oiling the carrot. + + +23. The farmer held a candle at its base. He did this by eating the turnip. First, he chopped off its head. Second, he chopped off its stalk. Third, he ate it. In this way, the farmer held a candle at its base by eating the turnip. + + +24. The neuroscientist prepared to match the picture with the signal going to the brain. He did this by rotating the circular food server. First, he held the circle in front of him. Second, he rotated the circle 45 degrees clockwise. Third, he noticed that the bowl of bean curd was in front of him. In this way, the neuroscientist prepared to match the picture with the signal going to the brain by rotating the circular food server. + + +25. The neuroscientist prepared to copy the information in the brain cell. He did this by drinking with the straw. First, he placed the straw in the apple juice. Second, he put his lips around the straw. Third, he drank the apple juice. In this way, the neuroscientist prepared to copy the information in the brain cell by drinking with the straw. + + +26. The neuroscientist downloaded all the information directly relevant to a thought from the brain. He did this by pouring the water down the sink. First, he lifted the glass of water. Second, he positioned it above the sink. Third, he emptied it into the sink. In this way, the neuroscientist prepared to download all the information directly relevant to a thought from the brain by pouring the water down the sink. + + +27. The neuroscientist prepared to order the brain data in a string. He did this by pressing a flower. First, he placed the press on the table. Second, he placed the paper in the press. Third, he placed the flower in the press. In this way, the neuroscientist prepared to order the brain data in a string by pressing a flower. + + +28. 
The neuroscientist prepared to examine a conclusion structure in the brain. He did this by jumping with his knees. First, he bent his knees. Second, he pushed off the ground. Third, he lifted his feet into the air. In this way, the neuroscientist prepared to examine a conclusion structure in the brain by jumping with his knees. + + +29. The neuroscientist studied a reason structure in the brain. He did this by making the olive paste. First, he placed a pitted olive on the chopping board. Second, he cut the olive into squares. Third, he mashed the olive with a pestle. In this way, the neuroscientist prepared to examine a reason structure in the brain by making the olive paste. + + +30. The neuroscientist achieved the highest quality pinnacle in science. He did this by photographing the setting from the mountain. First, he walked from the subject to the mountain. Second, he climbed the mountain. Third, he photographed the subject from the mountain. In this way, the neuroscientist achieved the highest quality pinnacle in science by photographing the setting from the mountain. + + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Room 4 of 4.txt b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Room 4 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..ded8511dc206408bfb519f77cdc3db8fff66ad06 --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Room 4 of 4.txt @@ -0,0 +1,10 @@ +["Green, L 2021, Room 4 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF PEDAGOGY +by Lucian Green +Room 4 of 4 + +31. The professor prepared to execute the brain program. He did this by dismantling and measuring the components of the pinhole camera. First, he took the top off the camera. 
Second, he took out the paper. Third, he measured the illustration on the paper. In this way, the professor prepared to execute the brain program by dismantling and measuring the components of the pinhole camera. + + +32. The neuroscientist identified colours of different anatomical regions of the brain, each with different functions. She did this by tasting the lolly's differently coloured parts. First, she chose a lolly. Second, she chose one of its parts. Third, she tasted the part. In this way, the neuroscientist identified colours of different anatomical regions of the brain, each with different functions, by tasting the lolly's differently coloured parts. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Time to Do 1 of 4.txt b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Time to Do 1 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..619eae4ea6609e4183d0dcf2a76518584e250aab --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Time to Do 1 of 4.txt @@ -0,0 +1,27 @@ +["Green, L 2021, Time to Do 1 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF PEDAGOGY +by Lucian Green +Time to Do 1 of 4 + + + +1. The cell biologist prepared to study apoptosis (programmed cell death). He did this by licking the surface area of the ice cream model. First, he constructed a cubic ice cream model from a paper cut-out. Second, he stamped a lick mark on the bottom level. Third, he stamped a lick mark on each square from the next level up, and so on, until the whole ice cream model had been covered. In this way, the cell biologist prepared to study apoptosis (programmed cell death) by licking the surface area of the ice cream model. + +2. The butler polished the knife. 
He did this by licking the length of the head of the spoon. First, he started at the end of the head closer to the middle. Second, he licked the head of the spoon. Third, he finished licking at the end of the spoon. In this way, the butler prepared to polish the knife by licking the length of the head of the spoon. + +3. The geneticist examined the limbs being developed one at a time. He did this by stirring with the spoon. First, he inserted the spoon into the pot. Second, he moved the spoon from behind the biggest piece to behind the next biggest piece in the liquid. Third, he stopped after stirring the biggest five pieces. In this way, the geneticist prepared to examine the limbs being developed one at a time by stirring with the spoon. + +4. The physicist split the particle. He did this by taping a pin to a spoon. First, he put tape across a pin. Second, he placed the pin with the tape on the spoon's handle, with the pin pointing outwards. Third, he fastened the tape to the spoon. In this way, the physicist prepared to split the particle by taping a pin to a spoon. + +5. The director watched through the middle of the model molecule. He did this by holding on to the tofu while he skewered it. First, he pressed the tofu in place. Second, he put the skewer's point in the centre of the top of the tofu. Third, he pushed the skewer through the tofu. In this way, the director prepared to watch through the middle of the model molecule by holding on to the tofu while he skewered it. + +6. The cowperson lassoed the cow. She did this by testing that the cube was empty. First, she proved that the ball was placed in the cube. Second, she proved that the ball was taken out of the cube. Third, she tested that the cube had no other balls in it. In this way, the cowperson prepared to lasso the cow by testing that the cube was empty. + +7. The biochemist wrote down the value. He did this by opening out and measuring the volume of the lollipop. 
First, he counted the number of units wide it was. Then, he multiplied this by the number of units deep it was. Then, he multiplied this by the number of units high it was. In this way, the biochemist prepared to write down the value by opening out and measuring the volume of the lollipop. + +8. The philosopher wrote down reasons for an argument. He did this by opening the umbrella. First, he held the umbrella by its crook handle. Second, he held the runner. Third, he unfurled the umbrella. In this way, the philosopher prepared to write down reasons for an argument by opening the umbrella. + +9. The philosopher wrote down objections to an argument. He did this by repeatedly opening and closing the umbrella to act as a fan. First, he held the umbrella to the side. Second, he half-opened the umbrella quickly. Third, he half-closed the umbrella quickly. In this way, the philosopher prepared to write down objections to an argument by repeatedly opening and closing the umbrella to act as a fan. + +10. The biochemist tested that the process was perfectly expressed. He did this by multiplying the quantity by ten, and then measuring it. First, he read the original amount. Second, he multiplied this number by ten. Third, he measured this quantity. In this way, the biochemist prepared to test that the process was perfectly expressed by multiplying the quantity by ten, and then measuring it.
+ +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Time to Do 2 of 4.txt b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Time to Do 2 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..f5463b0d690407d8c7f49981ba526965a3e55fb9 --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Time to Do 2 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Time to Do 2 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF PEDAGOGY +by Lucian Green +Time to Do 2 of 4 + +11. The scientist smiled at the camera. He did this by holding the nut between his teeth. First, he held the nut next to his mouth. Second, he placed it on the tip of his bottom teeth. Third, he gently closed his upper and lower teeth, gripping the nut. In this way, the scientist prepared to smile at the camera by holding the nut between his teeth. + +12. The scientist signed an autograph in a book. He did this by licking the centre of the spoon. First, he held the spoon up to his lips. Then, he dipped his head. Then, he licked the middle of the spoon. In this way, the scientist prepared to sign an autograph in a book by licking the centre of the spoon. + +13. The expert signed the stack of books. She did this by licking the three spoons. First, she licked the first spoon. Second, she licked the second spoon. Third, she licked the third spoon. In this way, the expert prepared to sign the stack of books by licking the three spoons. + +14. The expert licked a small part, and then a large part of the lolly. He did this by drawing a diagram of the apparatus. First, he looked at the slide using the microscope. Second, he wrote the bar scale on a sheet of paper. Third, he drew the image he saw using the lens in his journal.
In this way, the scientist prepared to lick a small part, and then a large part of the lolly by drawing a diagram of the apparatus. + +15. The biochemist licked the left and right sides of the lolly. She did this by tracing the pathway through the biochemical system. First, she looked at where the chemical started. Second, she outlined its progress on a space versus time graph. Third, she recorded her observations of the chemical. In this way, the biochemist prepared to lick the left and right sides of the lolly by tracing the pathway through the biochemical system. + +16. The biochemist propped up the pillow. He did this by licking the spoon. First, he placed the spoon in his mouth. Second, he licked the bottom of the spoon. Third, he licked the top of the spoon. In this way, the biochemist prepared to prop up the pillow by licking the spoon. + +17. The biochemist held up a model protein at assembly. He did this by testing whether there was fruit on the other side of the store using a mirror. First, he held a mirror in front of him. Second, he looked at the other end of the warehouse using the mirror. Third, he observed whether there was fruit visible in the mirror. In this way, the biochemist prepared to hold up a model molecule at assembly by testing whether there was fruit on the other side of the store using a mirror. + +18. The manager prepared to find a building of the right size. He did this by examining the image using a magnifying glass. First, he positioned the magnifying glass 0.05 m above the map. Next, he moved it over the fine text. Finally, he read the fine text. In this way, the manager prepared to find a building of the right size by examining the image using a magnifying glass. + +19. The planetarium manager climbed a ladder to maintain the planetarium. He did this by inserting a star with a rod attached to it into a hole. First, he placed the bar above the hole. Second, he pushed it into the hole. Third, he made sure it would not slide out. 
In this way, the planetarium manager prepared to climb a ladder to maintain the planetarium by inserting a star with a rod attached to it into a hole. + +20. The actor made another actor famous. He did this by placing the two lolly snakes in his mouth. First, he put the two lolly snakes side by side in his hand. Second, he bit off part of both of them. Third, he continued to eat them until there was nothing left. In this way, the actor prepared to make another actor famous by placing the two lolly snakes in his mouth. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Time to Do 3 of 4.txt b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Time to Do 3 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..b3065e858ad3795059feb711cefa19e7f9531dee --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Time to Do 3 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Time to Do 3 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF PEDAGOGY +by Lucian Green +Time to Do 3 of 4 + +21. The audience member waited to ask a question. He did this by testing that the 3D shape was evident. First, he walked along a row of hexagons on their bases. Second, he tested that there were no shapes above each hexagon. Third, he proved that this was true for the rest of the rows. In this way, the audience member waited to ask a question by testing that the 3D shape was evident. + +22. The scientist prepared to answer the unanswered questions one at a time. She did this by testing whether there was anything to lick in range. First, she opened her mouth. Second, she moved her tongue up and curled it upwards. Third, she moved her tongue down and bent it downwards. 
In this way, the scientist prepared to answer the unanswered questions one at a time by testing whether there was anything to lick in range. + +23. The lecturer walked to work on time. He did this by licking the central lollipop. First, he counted the number of lollipops. Second, he identified the (floor((n+1)/2))th lollipop, where floor(n) means n is rounded down to the nearest integer. Third, he licked this lollipop. In this way, the lecturer walked to work on time by licking the central lollipop. + +24. The professor tested that she had been paid for the hours that she worked. She did this by decompressing the sponge. First, she soaked up water with it. Second, she squeezed the water out of it. Third, she let it decompress on the sink. In this way, the professor tested that she had been paid for the hours that she worked by decompressing the sponge. + +25. The father prepared to teach his children to read. He did this after cleaning both children's teeth. First, he brushed the first child's teeth. Second, he brushed the second child's teeth. Third, he prepared books for each of them to read in bed. In this way, the father taught his children to read after cleaning both children's teeth. + +26. The father kissed the baby good night. He did this by making stairs. First, he cut zig-zags in two beams of wood. Second, he nailed them to the ground and first floor. Third, he nailed steps to the horizontal and vertical edges of the zig-zags on the beams of wood. In this way, the father prepared to kiss the baby good night by making stairs. + +27. The prospective parents planned to see a family planner. The male did this after tickling himself with a feather. First, he took his shirt off. Second, he found that his stomach was sensitive. Third, he tickled it with the feather. In this way, the prospective parents planned to see a family planner after the male tickled himself with a feather. + +28. The fun park visitor prepared to ride the helter skelter.
He did this by licking the chocolate from his finger. First, he started from the base of his finger. Second, he spiralled his tongue upwards, licking all the chocolate from his finger on the way. Third, he stopped when he reached the top. In this way, the fun park visitor prepared to ride the helter skelter by licking the chocolate from his finger. + +29. The biochemist calculated the circumference of the protein. He did this by testing that the belt fitted correctly. First, he lined the zero on the ruler up with the belt buckle. Second, he computed the correct hole in the strap. Third, he measured the distance between the belt buckle and the hole. In this way, the biochemist prepared to calculate the circumference of the protein by testing that the belt fitted correctly. + +30. The animation artist copied the cell. She did this by tasting the sugar spread on different parts of her tongue. First, she touched the sourness-detecting part of her tongue with the spatula, with no result. Second, she touched the umami-detecting part of her tongue with the spatula, with no result. Third, she touched the sweetness-detecting part of her tongue with the spatula, and tasted the sugar. In this way, the animation artist prepared to copy the cell by tasting the sugar spread on different parts of her tongue.
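The central-lollipop rule in item 23 above uses the index floor((n+1)/2), where floor rounds down to the nearest integer. A minimal sketch of that arithmetic, in Python for illustration (the function name `central_index` is hypothetical, not from the source):

```python
def central_index(n):
    """1-indexed position of the central lollipop: floor((n+1)/2).

    For odd n this is the exact middle; for even n it is the lower
    of the two middle positions, since floor rounds down.
    """
    # Integer (floor) division implements floor((n+1)/2) for n >= 1.
    return (n + 1) // 2

# e.g. central_index(7) -> 4, central_index(6) -> 3
```

For seven lollipops the lecturer licks the fourth; for six, the third, since floor((6+1)/2) = 3.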
+ +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Time to Do 4 of 4.txt b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Time to Do 4 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..0c2daf1038d1838dbb65383430145ce16437511f --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Time to Do 4 of 4.txt @@ -0,0 +1,9 @@ +["Green, L 2021, Time to Do 4 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF PEDAGOGY +by Lucian Green +Time to Do 4 of 4 + +31. The biochemist explained a link. He did this by tracing through the maze with his hand. First, he traced a rope through the leftmost turns of the maze with a blindfold. Second, he traced a line around the leftmost corners, but with a right turn as far along as possible without visiting a visited square. Third, he repeated step two until he finished the maze. In this way, the biochemist prepared to explain a link by tracing through the maze with his hand. + +32. The biochemist tested how the vesicles travelled along their path. He did this by decoding the jam-coded paths maze. First, he tasted the jam type at a square. Second, he decoded strawberry jam as meaning there was a way to the left, raspberry jam as meaning there was a path to the right, and blueberry jam as meaning there were paths to the left and right. Third, he moved along the unvisited squares until reaching the goal. In this way, the biochemist prepared to test how the vesicles travelled along their path by decoding the jam-coded paths maze. 
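The jam code in item 32 above (strawberry means a way to the left, raspberry a path to the right, blueberry paths to the left and right) is a simple lookup from flavour to open directions. A minimal sketch, assuming a dictionary-based decoder with hypothetical names, in Python:

```python
# Hypothetical decoder for item 32's jam code: the jam flavour tasted
# at a square encodes which directions are open from that square.
JAM_PATHS = {
    "strawberry": ["left"],           # a way to the left
    "raspberry": ["right"],           # a path to the right
    "blueberry": ["left", "right"],   # paths to the left and right
}

def open_directions(jam):
    """Return the directions encoded by the jam tasted at a square."""
    # An unknown flavour encodes no open paths.
    return JAM_PATHS.get(jam, [])

# e.g. open_directions("blueberry") -> ["left", "right"]
```

Repeating this lookup at each unvisited square, and moving only along the returned directions, reproduces the traversal described in the third step.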
+ +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Time to Finish 1 of 2.txt b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Time to Finish 1 of 2.txt new file mode 100644 index 0000000000000000000000000000000000000000..9178f597d4025cd66b946a319f855469c29c5f9b --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Time to Finish 1 of 2.txt @@ -0,0 +1,27 @@ +["Green, L 2021, Time to Finish 1 of 2, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF PEDAGOGY +by Lucian Green +Time to Finish 1 of 2 + + + +1. The aerial surveyor prepared to go downstairs. He did this by bringing his tongue out from behind his top teeth. First, he placed his tongue behind his top teeth. Second, he lowered his tongue. Third, he pushed his tongue forwards. In this way, the aerial surveyor prepared to go downstairs by bringing his tongue out from behind his top teeth. + +2. The groundsman prepared to climb a ladder. He did this by bringing his tongue out from behind his bottom teeth. First, he licked the back of his bottom teeth. Second, he raised his tongue. Third, he slid his tongue forward. In this way, he prepared to climb a ladder by bringing his tongue out from behind his bottom teeth. + +3. The caterer prepared to lift trays of food. He did this by placing a glass of water on one tray and a glass of orange juice on another. First, he identified the orange juice as opaque orange fluid. Second, he identified the water as transparent colourless fluid. Third, he placed each of them on separate trays. In this way, he prepared to lift trays of food by placing a glass of water on one tray and a glass of orange juice on another. + +4. The sound engineer prepared to mix two sounds together.
She did this by mixing orange juice and water together. First, she poured half a glass of water. Next, she poured half a glass of orange juice into the same glass. Finally, she stirred the liquid in the glass. In this way, she prepared to mix two sounds together by mixing orange juice and water together. + +5. The sifter prepared to take the large seeds. He did this by using a straw to drink the orange juice level in the orange juice and water mix. First, he let the orange juice float to the top of the glass. Second, he inserted the straw into the orange juice. Third, he sucked the orange juice through the straw. In this way, the sifter prepared to take the large seeds by using a straw to drink the orange juice level in the orange juice and water mix. + +6. The security guard patrolled behind the premises. He did this by licking behind the lollipop. First, he placed his tongue to the front-left of the lollipop. Second, he moved his tongue to the back-left of the lollipop. Third, he bent his tongue behind the lollipop. In this way, he patrolled behind the premises by licking behind the lollipop. + +7. The workman prepared to dig the hole. He did this by licking the inner edge of a slice of apple. First, he placed his tongue so that it was touching both ends of the inner edge. Second, he pushed his tongue towards the centre of the inner edge. Third, he stopped pushing his tongue when it completely covered the inner edge. In this way, the workman prepared to dig the hole by licking the inner edge of a slice of apple. + +8. The soccer player intercepted the ball. He did this by instructing one of eight players in a circle to block the ball when another player dribbled past each of them. First, he randomly selected one of the eight players, the nth player, to block the ball. Second, a player started dribbling the ball from player 0. Third, player n blocked the ball at 10n seconds.
In this way, the soccer player intercepted the ball by instructing one of eight players in a circle to block the ball when another player dribbled past each of them. + +9. The party caterer prepared to serve sushi. He did this by filling his bottle from the tap. First, he turned the tap on. Next, he let the tap run. Lastly, he turned the tap off when the bottle was full. In this way, the party caterer prepared to serve sushi by filling his bottle from the tap. + +10. The chef tested that the table was stable. He did this by testing that he was equal first with the rope-pulled rabbit. First, he found the time when his maximum y co-ordinate was equal to the finish line. Second, he found the rabbit's y co-ordinate at this time. In conclusion, he tested whether his and the rabbit's y co-ordinates were the same. In this way, the chef tested that the table was stable by testing that he was equal first with the rope-pulled rabbit. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Time to Finish 2 of 2.txt b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Time to Finish 2 of 2.txt new file mode 100644 index 0000000000000000000000000000000000000000..0f2b734fe3c4535d0ebbe253229861e783966414 --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Time to Finish 2 of 2.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Time to Finish 2 of 2, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF PEDAGOGY +by Lucian Green +Time to Finish 2 of 2 + +11. The banana grower prepared to dip the banana in red wax. She did this by recognising the glass of red orange juice. First, she recognised the square on the view from the front. Second, she recognised the circle in the view from above. Third, she tested that the square was red.
In this way, the banana grower prepared to dip the banana in red wax by recognising the glass of red orange juice. + +12. The orthodontist measured tooth widths. He did this by removing pips of different sizes from the orange juice squeezer. First, he iterated along the line until he found the left edge of a pip. Then, he iterated to the right edge of the pip, then removed it. Lastly, he repeated the process until he had reached the end of the line. In this way, the orthodontist measured tooth widths by removing pips of different sizes from the orange juice squeezer. + +13. The farmer prepared to plough the furrow. He did this by slicing vertically across the lettuce. First, he made a vertical cut on the left side of the lettuce. Second, he placed a strip of paper with its left edge aligned with the cut and cut along its right edge. Third, he repeated the process until he had reached the right side of the lettuce. In this way, the farmer prepared to plough the furrow by slicing vertically across the lettuce. + +14. The archaeologist prepared to excavate the dig. He did this by cross-slicing the cabbage. First, he temporarily placed a paper square on the region he planned to cut. Second, he removed the square and made vertical cuts parallel with the square's left edge. Third, he made horizontal cuts parallel with the square's top edge. In this way, the archaeologist prepared to excavate the dig by cross-slicing the cabbage. + +15. The train cleaner prepared to hose the train. He did this by hosing the top of his head. First, he moved the shower head so that its x co-ordinate was equal to its initial position, plus the x co-ordinate of the centre of the person's head minus the x co-ordinate of the initial position of the centre of the shower head.
Second, he moved the shower head so that its y co-ordinate was equal to its initial position, plus the y co-ordinate of the centre of the person's head minus the y co-ordinate of the initial position of the centre of the shower head. Third, he adjusted the shower head so that it was perpendicular to the walls. In this way, the train cleaner prepared to hose the train by hosing the top of his head. + +16. The chauffeur adjusted the wind screen wiper. He did this by showering his face. First, he placed the shower head vertically, touching his face. Then he rotated it 45 degrees away from his face around the top of the shower head. In conclusion, he sprayed his face with water and let the water drain down it. In this way, the chauffeur adjusted the wind screen wiper by showering his face. + +17. The philatelist prepared to detach the stamp. He did this by matching the actor with the character. First, he measured the character's hair length. Second, he measured the actor's hair length. Third, he tested that the actor's hair length was equal to the character's hair length. In this way, the philatelist prepared to detach the stamp by matching the actor with the character. + +18. The track driver prepared to return home. She did this by moving her hand away from the glass. First, she released her grip on the glass with her right hand. Second, she moved her hand to the right. Finally, she moved her hand back to in front of her. In this way, the track driver prepared to return home by moving her hand away from the glass. + +19. The tree feller prepared to trim the tree branches. He did this by detecting n > 1 licks. First, he set the counter at 0. Next, he added 1 to the counter when a vertical lick mark was counted. Finally, he repeated this algorithm until there were no more lick marks. In this way, the tree feller prepared to trim the tree branches by detecting n > 1 licks. + +20. The train passenger prepared to eat the sandwich. 
He did this by cutting two thirds of the bread stick segment off. First, he measured the length of the bread stick segment with a ruler. Second, he divided the length by three. Third, he multiplied this value by two. In this way, the train passenger prepared to eat the sandwich by cutting two thirds of the bread stick segment off. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Time to Prepare 1 of 4.txt b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Time to Prepare 1 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..cbdcc4ad471fb3788d44c20200ce68bed5e724d5 --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Time to Prepare 1 of 4.txt @@ -0,0 +1,27 @@ +["Green, L 2021, Time to Prepare 1 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF PEDAGOGY +by Lucian Green +Time to Prepare 1 of 4 + + + +1. The cancer researcher prepared to discover the cure for cancer. He did this by unzipping the purse. First, he observed that it was empty. Second, he observed that it was half-full. Third, he unzipped it when it was full. In this way, the cancer scientist prepared to work out when the cell would die by apoptosis by unzipping the purse. + +2. The AIDS researcher prepared to discover the cure for AIDS. He did this by wrapping the carrot in cellophane. First, he held the carrot upright with one hand. Second, he placed the cellophane against the carrot with his other hand. Third, he wrapped the carrot in cellophane. In this way, the AIDS scientist prepared to work out how a patient should prevent AIDS by wrapping the carrot in cellophane. + +3. The influenza researcher prepared to discover the cure for influenza. He did this by crawling into a room through a hole. 
First, he bent down next to the hole. Second, he crawled through the hole. Third, he entered the room. In this way, the influenza researcher prepared to discover the cure for influenza by crawling into a room through a hole. + +4. The church priest prepared to swallow the leftover wine. He did this by licking the liquid from the spoon. First, he placed his mouth over the liquid at the tip of the spoon. Second, he drank the liquid with suction. Third, he continued to do this until there was no liquid left over. In this way, the church priest prepared to swallow the leftover wine by licking the liquid from the spoon. + +5. The biochemist prepared to investigate the way the proteins worked in a line with two parts going well together. He did this by oscillating the cloth left and right. First, he placed his hand on the cloth. Second, he moved the cloth to the right. Third, he removed his hand from the cloth. In this way, the biochemist prepared to investigate the way the proteins worked in a line with two parts going well together by oscillating the cloth left and right. + +6. The earth scientist prepared to study how the earth was formed. He did this by rotating the tip of his tongue. First, he lifted the tip of his tongue up. Second, he lowered it slightly. Third, he lowered it to the bottom. In this way, the earth scientist prepared to study how rotating the tip of his tongue formed the earth. + +7. The mechanic prepared to lift the cloth from the car. He did this by lifting the aluminium can. First, he chose the can with the red flower printed on it. Second, he opened it. Third, he lifted it up vertically. In this way, the mechanic prepared to lift the cloth from the car by lifting the aluminium can. + +8. The pilot prepared to eat a butterscotch tablet. He did this by nudging the snake lolly to his left set of molars. First, he placed the snake on his tongue. Second, he moved it to his left molars. Third, he closed his teeth over the snake.
In this way, the pilot prepared to eat a butterscotch tablet by nudging the snake lolly to his left set of molars. + +9. The doctor prepared a glass by wiping it. He did this by eating the rice paper roll. First, he unwrapped it. Second, he bit a rice grain from one end. Third, he chewed and swallowed the rice grain. In this way, the doctor prepared a glass by wiping it by eating the rice paper roll. + +10. The farmer prepared to count the number of trout swimming through a plane per second. He did this by calculating the tadpole distribution. First, he measured the line's length. Second, he counted the tadpoles above the line. Third, he calculated the tadpole distribution by dividing the number of tadpoles by the line's length. In this way, the farmer prepared to count the number of trout swimming through a plane per second by calculating the tadpole distribution. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Time to Prepare 2 of 4.txt b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Time to Prepare 2 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..65b514d60f697f532e6669d0acd6daafa1f3e599 --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Time to Prepare 2 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Time to Prepare 2 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF PEDAGOGY +by Lucian Green +Time to Prepare 2 of 4 + +11. The teacher prepared to represent being interested in a lesson by 'dotting it on'. He did this by climbing the rope ladder. First, he found the correct ladder. Second, he tested that the ladder was about to start. Third, he climbed the ladder with his arms and legs. 
In this way, the teacher prepared to represent being interested in a lesson by 'dotting it on' by climbing the rope ladder. + +12. The pianist prepared to examine a biological noumenon. He did this by writing evidence for his positive thoughts. First, he observed an object. Second, he thought of another object, on which he had experienced the first object as having causal implications. Third, he wrote the name of the second object. In this way, the pianist prepared to examine a biological noumenon by writing evidence for his positive thoughts. + +13. The swimmer prepared to swim a lap in the pool. He did this by practising rowing in the rowing boat. First, he held the oar with both hands. Second, he moved the oar backwards, towards him. Third, he moved the oar handle upwards. In this way, the swimmer prepared to swim a lap in the pool by practising rowing in the rowing boat. + +14. The psychiatrist prepared to lift the child model on to the step. She did this by placing the crane model on the ground. First, she lowered the hook. Second, she sat the child model on the hook. Third, she lifted the child up. In this way, the psychiatrist prepared to lift the child model on to the step by placing the crane model on the ground. + +15. The Cosmologist prepared to connect the idea to the road symbol. He did this by painting the road symbol on the road. First, he placed the template on the road. Second, he sprayed through the holes in the template. Third, he lifted the template from the ground. In this way, the Cosmologist prepared to connect the idea to the road symbol by painting the road symbol on the road. + +16. The pop musician prepared to walk to the meditation centre. He did this by recognising himself in the mirror. First, he stood in front of the mirror. Second, he looked at himself. Third, he tested that the image's features matched his own. In this way, the pop musician prepared to walk to the meditation centre by recognising himself in the mirror.
+ +17. The neuroscientist prepared to test that he could read the brain's thoughts. He did this by testing that the glass was clean. First, he looked at the glass. Second, he placed the cloth on the glass. Third, he polished the glass. In this way, the neuroscientist prepared to test that he could read the brain's thoughts by testing that the glass was clean. + +18. The neuroscientist prepared to read the letter in the brain. He did this by licking the letter's shape. First, he licked the 'A''s left side. Second, he licked the 'A''s right side. Third, he licked its cross bar. In this way, the neuroscientist prepared to read the letter in the brain by licking the letter's shape. + +19. The neuroscientist prepared to greedson out the thought, in other words, packed it to be moved. He did this by tying the sack up. First, he placed the contents on the ground. Second, he placed them in the sack. Third, he tied up the sack. In this way, the neuroscientist prepared to greedson out the thought, in other words, packed it to be moved by tying the sack up. + +20. The neuroscientist prepared to test that the start of the thought was highlighted. He did this by testing whether his top half was warm. First, he calculated the average skin temperature of his top half. Second, he found which temperature range it was in. For example, the temperature was warm because it was between 15 and 25 degrees centigrade. In this way, the neuroscientist prepared to test that the start of the thought was highlighted by testing whether his top half was warm.
+ +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Time to Prepare 3 of 4.txt b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Time to Prepare 3 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..fe9ae6389e7f34daccfda2267b77bae7a411e3ed --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Time to Prepare 3 of 4.txt @@ -0,0 +1,21 @@ +["Green, L 2021, Time to Prepare 3 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF PEDAGOGY +by Lucian Green +Time to Prepare 3 of 4 + +21. The neuroscientist prepared to show the visitors around thoughts like a particular thought. He did this by selecting the warm jumper. First, he tried on the thin jumper. Second, he tried on the thick jumper. Third, he selected the medium-thickness jumper. In this way, the neuroscientist prepared to show the visitors around thoughts like a particular thought by selecting the warm jumper. + +22. The neuroscientist prepared to test that his conclusion was perfectly expressed. He did this by wearing the jumper uniformly. First, he pulled it down his arms. Second, he pulled it down his front. Third, he pulled it down his back. In this way, the neuroscientist prepared to test that his conclusion was perfectly expressed by wearing the jumper uniformly. + +23. The neuroscientist prepared to store the memo in a cold place in the house. He did this by lying in a cool place. First, he measured the temperature beside his pool. Second, he measured the temperature in his bedroom. Third, he went to bed in the bedroom because it was cooler there. In this way, the neuroscientist prepared to store the memo in a cold place in the house by lying in a cool place. + +24. 
The neuroscientist prepared to measure the number of serotonin molecules (or whether he felt like he was at home). He did this by sewing two layers of pillows together to make a bed. First, he sewed 8 pillows together lengthways to form the bottom layer. Second, he sewed 8 pillows together lengthways to form the top layer. Third, he placed the top layer above the bottom layer, so that the pillows in the top layer covered those in the bottom layer. In this way, the neuroscientist prepared to measure the number of serotonin molecules by sewing two layers of pillows together to make a bed. + +25. The neuroscientist prepared to measure the student's achievement level over time in a mathematics test. He did this by lying diagonally across the bed. First, he lay across the x axis of the bed. Second, he placed a marker at the x and y co-ordinates where the book was. Third, he lay diagonally across the bed, touching the marker. In this way, the neuroscientist prepared to measure the student's achievement level over time in a mathematics test by lying diagonally across the bed. + +26. The cake maker prepared to ice the cake. He did this by creaming the pie. First, he removed the cream's lid. Second, he inserted a spoon in the cream. Third, he placed a dollop on the pie. In this way, the cake maker prepared to ice the cake by creaming the pie. + +27. The bottler prepared to put a cork in the bottle. He did this by closing the refrigerator door. First, he pushed the door with his hand. Second, he lifted the latch. Third, he closed the door. In this way, the bottler prepared to put a cork in the bottle by closing the refrigerator door. + +28. The neuroscientist closed the book about the brain. She did this by closing the birdcage door. First, she put the bird in the cage. Second, she closed the door. Third, she put the towel over the cage. In this way, the neuroscientist closed the book about the brain by closing the birdcage door.
+ +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Two Uses 1 of 30.txt b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Two Uses 1 of 30.txt new file mode 100644 index 0000000000000000000000000000000000000000..4f8bc7c5ebaf6f788638b68b7e9e8da223066a53 --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Two Uses 1 of 30.txt @@ -0,0 +1,24 @@ +["Green, L 2021, Two Uses 1 of 30, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF PEDAGOGY +by Lucian Green +Two Uses 1 of 30 + +1. ALEXIS: This Two Uses algorithm returns whether the saucepan, pot or frying pan can both stand and be carried. They can stand if they have a horizontal line of at least two p's (pan) at the bottom. The subject can carry them if they have one long (two or more) h's (handles) at the top or two short h's at the top. + +2. DION: The subject should write original algorithms and arguments in the supplementary examination. Further, the physically challenged person should be given a supplementary examination in which he should identify different writers in the exposition. Also, the subject should write original algorithms and arguments where the algorithm should verify that the object is a member of the set. Besides this, the subject should check that the object is a member of the set and is attached to the correct meaning. Additionally, the subject should ethically assess the person's two uses in writing to enable her to become a founder. Along with this, the subject should become a founder by answering questions importantly and on the topic. As well as this, the subject should ethically assess the person's two uses in writing about time and space about the object in the essay. 
Furthermore, the subject should write about time and space about the object in the piece after understanding each object in speech. Moreover, the subject should write logically connected program lines to ensure that the program is functional. Also, the subject should state that the program that loads different icons each day is functional. Finally, the subject should logically connect records of breasonings. + +3. ALEXIS: What is the meaning of twouses1(Image) in line 1? + +4. DION: Line 1 returns true if Image contains an object that is carryable. That is, the first line has one long (two or more) h's (one long handle) or two h's (two short handles) and can stand, that is the last line has a horizontal line of at least two p's (pan). + +5. ALEXIS: I prepared to like Lucian's computational philosophy academy. I did this by writing Noam Chomsky's probable comments on the Press Release for Richard Dawkins' likely comments on the line 'I did this by writing the text-to-speech algorithm' for the algorithm idea 'I did this by writing the text-to-speech algorithm' in the Computational English argument in Computational English. First, I asked why do we have to find a new feature, isn't text-to-speech enough? Second, I said why is complexification part of finding fault (agreeing)? We have to fix it up and move on. Third, I do like formats, but I don't (do) like Richard Dawkins. + +6. DION: The subject shouldn't write original algorithms and arguments (where the subject endorsed Lucian's Computational Philosophy Academy, embarking on a pathway between two uses). + +7. ALEXIS: The subject should write original algorithms and arguments. + +8. DION: Two uses is correct because of the phenomenology of nature, which is correct because the subject should write unique algorithms and arguments. + +9. ALEXIS: I prepared to find it (pedagogy) out using meditation (philosophy) and trial and error in my degree. 
I did this by writing Richard Rorty's probable comments on Richard Dawkins' likely comments on the line 'I did this by writing the text-to-speech algorithm' for the algorithm idea 'I did this by writing the text-to-speech algorithm' in the Computational English argument in Computational English. First, I asked what's Daoism (concerning individualness) got to do with it. Second, I wrote they must be perfect. Third, I wrote they must be put together again. + +10. DION: The subject shouldn't see the light of day (where the subject found pedagogy out using meditation, or philosophy and trial and error in his degree, like meditation has a second use, pedagogy). +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Two Uses 1 of 4.txt b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Two Uses 1 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..4b28e95704a8a84effe36bb82d7beb5a73edd384 --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Two Uses 1 of 4.txt @@ -0,0 +1,49 @@ +["Green, L 2021, Two Uses 1 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF PEDAGOGY +by Lucian Green +Two Uses 1 of 4 + + + +1. ALEXIS: This Two Uses algorithm returns whether the saucepan, pot or frying pan can both stand and be carried. They can stand if they have a horizontal line of at least two p's (pan) at the bottom. The subject can carry them if they have one long (two or more) h's (handles) at the top or two short h's at the top. + +Does twouses1([[p,' ',p, h, h], + +[p, p, p,' ',' ']]) return true? + +DION: Yes. + +ALEXIS: Does twouses1([[h,' ',' ',h], + +[p,' ',' ',p], [p, p, p, p]]) return true? + +DION: Yes. 
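The stand-and-carry checks that ALEXIS describes (carryable: the first line has one long handle, a run of two or more h's, or two short handles, two separate h's; stands: the last line has a horizontal line of at least two p's) can be sketched outside List Prolog. Below is a minimal Python re-implementation sketch, not the original source: the `runs` helper and the list-of-lists image encoding are assumptions, while `twouses1`, `carryable` and `stands` are the names used in the dialogue.

```python
def runs(line, item):
    """Lengths of the maximal contiguous runs of `item` in a line."""
    lengths, n = [], 0
    for c in line:
        if c == item:
            n += 1
        else:
            if n:
                lengths.append(n)
            n = 0
    if n:
        lengths.append(n)
    return lengths

def carryable(first_line):
    """One long handle (two or more h's in a row) or two short handles (two separate h's)."""
    h_runs = runs(first_line, 'h')
    return any(r >= 2 for r in h_runs) or len(h_runs) >= 2

def stands(last_line):
    """A horizontal line of at least two p's at the bottom."""
    return any(r >= 2 for r in runs(last_line, 'p'))

def twouses1(image):
    """True if the utensil can both be carried (first line) and stand (last line)."""
    return carryable(image[0]) and stands(image[-1])
```

Under this sketch, both example images above return True, matching DION's answers.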
+ +ALEXIS: Does twouses1([[p,' ',' ',' ',' ', p, h, h, h], + +[p, p, p, p, p, p,' ',' ',' ']]) return true? + +DION: Yes. + +ALEXIS: Does twouses1([[h, p, p, p, h]]) return true? + +DION: Yes. + +ALEXIS: Does twouses1([[h, h, p,' ',' ',' ',' ',' ',' ',' ',' ',p, h, h], + +[' ',' ',p,' ',' ',' ',' ',' ',' ',' ',' ',p,' ',' '], [' ',' ',p,' ',' ',' ',' ',' ',' ',' ',' ',p,' ',' '], [' ',' ',p,' ',' ',' ',' ',' ',' ',' ',' ',p,' ',' '], [' ',' ',p, p, p, p, p, p, p, p, p, p,' ',' ']]) return true? + +DION: Yes. + +ALEXIS: The algorithm is as follows: + +1a. twouses1(Image) :- %% Returns true if Image contains an object that has handles (the first line has one long (two or more) h's (one long handle) or two h's (two short handles)) and can stand (the last line has a horizontal line of at least two p's (pan)) +2. Image = [Firstline | _Lines], %% Firstline is the first line of Image +3. lastline(Image, Lastline), %% Lastline is the last line of Image +4. carryable(Firstline), %% Tests whether Firstline is carryable (the first line has one long (two or more) h's (one long handle) or two h's (two short handles)) +5. stands(Lastline). %% Tests whether Lastline can stand (the last line has a horizontal line of at least two p's (pan)) +6. lastline(Lines1, Lastline) :- %% Returns the Lastline of Lines1 +7. Lines1 = [_Line | Lines2], %% Removes the first item in the list Lines1 to give Lines2 +8. lastline(Lines2, Lastline). %% Returns the Lastline of Lines2 +9. lastline([Lastline], Lastline). %% Returns Lastline when it is the last item +10.
carryable(Line) :- %% Returns whether the first line has one long (two or more) h's (one long handle) or two h's (two short handles) +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Two Uses 10 of 30.txt b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Two Uses 10 of 30.txt new file mode 100644 index 0000000000000000000000000000000000000000..4d1af1cc683ff25d6139e8882bbf16cf71dfaec3 --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Two Uses 10 of 30.txt @@ -0,0 +1,22 @@ +["Green, L 2021, Two Uses 10 of 30, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF PEDAGOGY +by Lucian Green +Two Uses 10 of 30 + +91. DION: The program checks that the utensil is carryable. + +92. ALEXIS: I prepared to disambiguate between my desired meaning and another one. I did this by writing Martha Nussbaum's probable comments on the Press Release for Richard Rorty's probable comments on the line 'I did this by writing the text-to-speech algorithm' for the algorithm idea 'I did this by writing the text-to-speech algorithm' in the Computational English argument in Computational English. First, I like Prolog. Second, I like English. Third, I put English and Prolog together, including synonyms and synogrammars. + +93. DION: The subject shouldn't attach to the incorrect meaning (where the subject disambiguated between my desired meaning and another one, where this was akin to him attaching handles to the meaning he wanted). + +94. ALEXIS: The subject should attach to the correct meaning. + +95. DION: Two uses is right because of the meaning-attachment of nature, which is correct because the subject should connect to the right meaning. + +96. 
ALEXIS: I prepared to influence the sound of the text with the structure, function and size/constitution of the objects to which this referred. I did this by writing Alexius Meinong's probable comments on the Press Release for Richard Rorty's probable comments on the line 'I did this by writing the text-to-speech algorithm' for the algorithm idea 'I did this by writing the text-to-speech algorithm' in the Computational English argument in Computational English. First, I wrote a chart (undirected) of all the possibilities represented by the data. Second, I wrote a graph (directed) of all the possibilities represented by the data. Third, I wrote a plot (image of the breasoned objects) of all the possibilities represented by the data. + +97. DION: The subject shouldn't incorrectly emphasise the most useful object in the sentence (where the subject influenced the sound of the text with the structure, function and size/constitution of the objects that this referred to, and how they were useful). + +98. ALEXIS: The subject should correctly emphasise the most useful object in the sentence. + +99. DION: The subject should attach to the correct meaning before emphasising the most useful meaning in the phrase. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Two Uses 11 of 30.txt b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Two Uses 11 of 30.txt new file mode 100644 index 0000000000000000000000000000000000000000..1b3d75563bd5f801ab20f1f17b4a1b934096c03a --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Two Uses 11 of 30.txt @@ -0,0 +1,24 @@ +["Green, L 2021, Two Uses 11 of 30, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF PEDAGOGY +by Lucian Green +Two Uses 11 of 30 + +100. 
ALEXIS: I prepared to ask why you don't just speak without text (for the sake of argument)? I did this by writing Richard Dawkins' probable comments on the line 'I did this by writing the text-to-speech algorithm' for the algorithm idea 'I did this by writing the text-to-speech algorithm' in the Computational English argument in Computational English. First, I like the natural phenomenon (text and speech should be left separate). Second, I completed the first task first. (Why do you want text in text-to-speech so much?) Third, I asked, 'Why do you want speech in text-to-speech so much?' + +101. DION: The subject shouldn't forget speech's handles (memory of text) (where the subject asked why you don't just speak without text, for the sake of argument, like identifying that speech has handles). + +102. ALEXIS: The subject should remember text as speech's handles. + +103. DION: The subject should correctly emphasise the most useful object in the sentence which is the word 'text,' or a 'memory handle' for speech. + +104. ALEXIS: I prepared to speak myself, after using the computer to verify how I would speak. I did this by writing Alexius Meinong's probable comments on Richard Rorty's probable comments on the line 'I did this by writing the text-to-speech algorithm' for the algorithm idea 'I did this by writing the text-to-speech algorithm' in the Computational English argument in Computational English. First, I liked being a programming language creator. Second, I preferred writing to speech because it is better checked. Third, I distanced myself from speech in Prolog. + +105. DION: The subject shouldn't speak about larger objects than the last time (where the subject spoke himself, after using the computer to verify how he would speak, where the objects that the subject talked about were not too large). + +106. ALEXIS: The subject should speak about larger objects than the last time. + +107. 
DION: The subject should attach to the correct meaning as part of which he should talk about larger objects than the last time. + +108. ALEXIS: I prepared to address that a human would benefit from text-to-speech. I did this by writing Martha Nussbaum's probable comments on Richard Rorty's probable comments on the line 'I did this by writing the text-to-speech algorithm' for the algorithm idea 'I did this by writing the text-to-speech algorithm' in the Computational English argument in Computational English. First, I liked speech (in text-to-speech). Second, I wanted text. Third, I asked, 'What does the text refer to?'. + +109. DION: The subject shouldn't speak about heavier objects than the last time (where the subject made an address after observing that a human would benefit from text-to-speech, where the objects named were not too heavy). +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Two Uses 12 of 30.txt b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Two Uses 12 of 30.txt new file mode 100644 index 0000000000000000000000000000000000000000..bc7e5a7eb08bedbb3387d03759c35c7b8fffc711 --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Two Uses 12 of 30.txt @@ -0,0 +1,24 @@ +["Green, L 2021, Two Uses 12 of 30, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF PEDAGOGY +by Lucian Green +Two Uses 12 of 30 + +110. ALEXIS: The subject should speak about heavier objects than the last time. + +111. DION: The subject should talk about larger objects than the last time because they were heavier objects than the last time. + +112. ALEXIS: What is the meaning of line(Line1, Item1) in line 15? + +113. DION: Line 15 determines whether Line1 contains a line of at least 2 Item1's. + +114. 
ALEXIS: I prepared to have a fair. I did this by writing Richard Rorty's probable comments on the Press Release for Alexius Meinong's probable comments on the line 'I did this by writing the text-to-speech algorithm' for the algorithm idea 'I did this by writing the text-to-speech algorithm' in the Computational English argument in Computational English. First, I loved reasonings. Second, I loved you. Third, I loved myself. + +115. DION: The subject shouldn't ethically assess the person's two uses (wanting to live and the way to do this) (where the subject visited a fair, his destination). + +116. ALEXIS: The subject should ethically assess the person's two uses. + +117. DION: Two uses are correct because of the comparison of nature, which is correct because the subject should morally evaluate the person's two uses. + +118. ALEXIS: I prepared to state that I had filled the table of questions about the breasoning, meaning there were no missing comments in these categories. I did this by writing Noam Chomsky's probable comments on the Press Release for Alexius Meinong's probable comments on the line 'I did this by writing the text-to-speech algorithm' for the algorithm idea 'I did this by writing the text-to-speech algorithm' in the Computational English argument in Computational English. First, I wrote that the text changed according to history (won't change because of history). Second, I wrote that the speech changed according to the amount of money the subject was able to pay for this. Third, I noticed the conditions of the text and speech wouldn't (would) match when there wasn't (was) enough money to pay for the accurate recording of history. + +119. DION: The subject shouldn't allow the breasoning to leave her lips (where she filled the table of questions about the breasoning, meaning there were no missing comments in these categories, like moving until reaching the start). 
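The behaviour DION attributes to line(Line1, Item1) in item 113 (whether Line1 contains a line of at least 2 Item1's) amounts to checking for a contiguous run of length two or more. A hedged one-function Python sketch of that check (a re-implementation, not the original List Prolog source):

```python
def line(line1, item1):
    """True if line1 contains a contiguous run of at least two item1's."""
    # Any adjacent pair both equal to item1 witnesses a run of length >= 2.
    return any(a == item1 and b == item1 for a, b in zip(line1, line1[1:]))
```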
+"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Two Uses 13 of 30.txt b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Two Uses 13 of 30.txt new file mode 100644 index 0000000000000000000000000000000000000000..093449474f3a81e6fdf77c62d3d4cb862eb1b332 --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Two Uses 13 of 30.txt @@ -0,0 +1,24 @@ +["Green, L 2021, Two Uses 13 of 30, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF PEDAGOGY +by Lucian Green +Two Uses 13 of 30 + +120. ALEXIS: The subject should allow the breasoning to leave her lips. + +121. DION: The subject should assess the case given breasoned As. + +122. ALEXIS: I prepared to ask what would happen if I substituted another cultural item for one missing in another language. I did this by writing Richard Rorty's probable comments on Alexius Meinong's probable comments on the line 'I did this by writing the text-to-speech algorithm' for the algorithm idea 'I did this by writing the text-to-speech algorithm' in the Computational English argument in Computational English. First, I stated the text's cultural conditions might change. Second, I said the speech's language condition might change. Third, I suggested that there might be no (a) word in one language for a particular cultural item. + +123. DION: The subject shouldn't state that he desires the cold space (where the subject asked what would happen if he substituted another cultural item for one missing in another language, like an object for moving through space). + +124. ALEXIS: The subject should state that he desires the warm space. + +125. DION: The subject should allow the breasoning to leave her lips because the subject should indicate that she wants the warm space. 
The subject should state the correct meaning of the breasoning (e.g. that she desires the warm space) at the time. + +126. ALEXIS: I prepared to say that the monologue text's character should correspond to the paraphrased monologue speech's character in time, place, considering what has happened concerning breathsonings before, after and during the scene. I did this by writing Noam Chomsky's probable comments on Alexius Meinong's probable comments on the line 'I did this by writing the text-to-speech algorithm' for the algorithm idea 'I did this by writing the text-to-speech algorithm' in the Computational English argument in Computational English. First, I thought the text had no appearance of a person. Second, I thought the speech had no appearance of an individual. Third, I thought the text's aims and the speech's aims would have to correspond, assuming he had slightly modified the speech from the text. + +127. DION: The subject shouldn't avoid spiritual preparation for the next part of life (where the subject stated that the monologue text's character should correspond to the paraphrased monologue speech's character in time, place and considering what has happened concerning breathsonings before, after and during the scene, where the scene contains characters waiting until the starting time). + +128. ALEXIS: The subject should make spiritual preparation for the next part of life. + +129. DION: The subject should ethically assess the person's two uses because the subject should make spiritual preparation for the next part of life. 
+"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Two Uses 14 of 30.txt b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Two Uses 14 of 30.txt new file mode 100644 index 0000000000000000000000000000000000000000..dfeea9820579134c0fc0959591111db87e170e5f --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Two Uses 14 of 30.txt @@ -0,0 +1,24 @@ +["Green, L 2021, Two Uses 14 of 30, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF PEDAGOGY +by Lucian Green +Two Uses 14 of 30 + +130. ALEXIS: I prepared to state that the immediate experience was positive, and there was an overall delightful experience. I did this by writing Richard Dawkins' probable comments on Alexius Meinong's probable comments on the line 'I did this by writing the text-to-speech algorithm' for the algorithm idea 'I did this by writing the text-to-speech algorithm' in the Computational English argument in Computational English. First, I thought of experience as defining the text's character's personality. Second, I thought of socio-economic status as partially defining the speech's character's character. Third, I transcended experience to have a better socio-economic status. + +131. DION: The subject shouldn't be late without a message (where the subject stated that the immediate experience was positive, and there was an overall delightful experience, where the overall pleasant experience was that of arriving in time). + +132. ALEXIS: The subject should be early. + +133. DION: The subject should be early in making spiritual preparation for the next part of life. + +134. ALEXIS: What is the meaning of line(Line1, Item) in line 19? + +135. DION: Line 19 determines whether Line1 contains a line of at least 2 Item's. + +136. 
ALEXIS: I prepared to say Derrida does not mention these little As during Derrida's text, nor does he mention them about it. I did this by writing Richard Dawkins' probable comments on Michel Onfray's probable comments on the line 'I did this by writing the text-to-speech algorithm' for the algorithm idea 'I did this by writing the text-to-speech algorithm' in the Computational English argument in Computational English. First, I prepared to ask, what writing, I can't see any writing about the topic? Second, I disliked (liked) the writer. Third, I disliked (liked) discussing the writer. + +137. DION: The subject shouldn't become a founder (where the subject stated that As are not mentioned during, but are referred to about Derrida's text, noting that I moved along the line of the A's breasonings). + +138. ALEXIS: The subject should become a founder. + +139. DION: Two uses is correct because of the initiation of nature, which is correct because the subject should become a founder. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Two Uses 15 of 30.txt b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Two Uses 15 of 30.txt new file mode 100644 index 0000000000000000000000000000000000000000..4ec9b7f90482c073bfb9008ec4b4756209cf80e2 --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Two Uses 15 of 30.txt @@ -0,0 +1,24 @@ +["Green, L 2021, Two Uses 15 of 30, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF PEDAGOGY +by Lucian Green +Two Uses 15 of 30 + +140. ALEXIS: I prepared to thank Emeritus Professor Leon Sterling for famously helping me to think clearly of very long lines through Prolog programming projects. 
I did this by writing Richard Rorty's probable comments on Michel Onfray's probable comments on the line 'I did this by writing the text-to-speech algorithm' for the algorithm idea 'I did this by writing the text-to-speech algorithm' in the Computational English argument in Computational English. First, I asked, 'Is this a joke? He didn't write much on the reasons.' Onfray replied, 'I was only kidding; I am planning to get started during the algorithm's steps.' Second, I wrote there is no part of writing that is relevant to text-to-speech. Third, I accepted that, in that case, the computer writes the phoneme list. + +141. DION: A third party shouldn't block the subject (where the subject thanked Emeritus Professor Leon Sterling for famously helping him to think clearly of very long lines through Prolog programming projects, and finishing these lines). + +142. ALEXIS: The subject should move forward on her path. + +143. DION: The subject should become a founder because she should be critical of blockedness. + +144. ALEXIS: I prepared to swap roles with the computer, experiencing an inspiration from Derrida. I did this by writing Alexius Meinong's probable comments on Michel Onfray's probable comments on the line 'I did this by writing the text-to-speech algorithm' for the algorithm idea 'I did this by writing the text-to-speech algorithm' in the Computational English argument in Computational English. First, I asked why do you like penning? Second, I wondered why do you like yourself? Third, I asked why do you like someone else, what if the speaker was us? + +145. DION: The subject shouldn't perform the calculation based on the computer's input (where the subject swapped roles with the computer when starting on the line connecting the subject's and computer's roles, experiencing an inspiration from Derrida). + +146. ALEXIS: The subject should verify the computer's output. + +147. DION: The subject should compute her path given the computer's output, and check her way. + +148.
ALEXIS: I prepared to say, 'disappear goodness, I want badness to correct (more goodness).' I did this by writing Richard Dawkins' probable comments on the Press Release for Michel Onfray's probable comments on the line 'I did this by writing the text-to-speech algorithm' for the algorithm idea 'I did this by writing the text-to-speech algorithm' in the Computational English argument in Computational English. First, I stated I don't want phenomena, I want words. Second, I don't (do) want to bracket, what is it (that's what it is). Third, I asked what the relevance of an area of study, critical thinking, is. + +149. DION: The subject shouldn't educate all the people (where the subject stated 'appear goodness, I want to verify for more goodness,' where the subject tested the line). +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Two Uses 16 of 30.txt b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Two Uses 16 of 30.txt new file mode 100644 index 0000000000000000000000000000000000000000..1b12847ae5460581ccf6f208ae8b5c3682072943 --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Two Uses 16 of 30.txt @@ -0,0 +1,24 @@ +["Green, L 2021, Two Uses 16 of 30, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF PEDAGOGY +by Lucian Green +Two Uses 16 of 30 + +150. ALEXIS: The subject should educate all the people. + +151. DION: The subject should become a founder by teaching all the people. + +152. ALEXIS: I prepared to ask why should it be positive? 
I did this by writing Richard Rorty's probable comments on the Press Release for Michel Onfray's probable comments on the line 'I did this by writing the text-to-speech algorithm' for the algorithm idea 'I did this by writing the text-to-speech algorithm' in the Computational English argument in Computational English. First, I wrote what's the point of a thing about itself (referring to 'I got it right; I wrote unique phenomena in each case')? Second, I wrote what's the relevance of Husserl's epoché (bracketing)? Third, I wrote what's the relevance of Critical Thinking?
+
+153. DION: The subject shouldn't work arguments out rigorously (where the subject asked why Lucian should be positive after he printed the line).
+
+154. ALEXIS: The subject should work on pedagogy.
+
+155. DION: The subject should educate all the people by working on pedagogy.
+
+156. ALEXIS: What is the meaning of twoshorthandles(Line1) in line 22?
+
+157. DION: Line 22 returns whether Line1 contains two h's (two short handles).
+
+158. ALEXIS: I prepared to let the computer experience things to mean before it said things about them, so it would be interesting to ask it what it meant. I did this by writing Martha Nussbaum's probable comments on Michel Onfray's probable comments on the line 'I did this by writing the text-to-speech algorithm' for the algorithm idea 'I did this by writing the text-to-speech algorithm' in the Computational English argument in Computational English. First, I wrote about writing. Second, I wrote about the author's pen name. Third, I wrote about writing about the author.
+
+159. DION: The subject shouldn't connect an answer to ideas the question gives him (where the subject let the computer experience things to mean before it said things about them, so it would be interesting to ask it what it meant) like different ways to construct a polyhedron.
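+
+Note on twoshorthandles(Line1) (156-157): a minimal Prolog sketch of a predicate with this meaning, assuming, as in the listing in Two Uses 2 of 4, that a line is a list of characters; the helper counthandles/3 is a hypothetical name, not part of the original listing.
+
+twoshorthandles(Line1) :- %% Returns whether Line1 contains two h's (two short handles)
+    counthandles(Line1, 0, 2). %% Counts the h's in Line1, requiring the total to be 2
+counthandles([], Count, Count). %% At the end of the line, returns the count of h's
+counthandles([h | Line2], Count1, Count) :- %% Takes an h from the line
+    Count2 is Count1 + 1, %% Increments the count of h's
+    counthandles(Line2, Count2, Count). %% Counts the h's in the rest of the line
+counthandles([Item | Line2], Count1, Count) :- %% Takes a non-h item from the line
+    not(Item = h), %% Verifies that the item is not h
+    counthandles(Line2, Count1, Count). %% Counts the h's in the rest of the line
+
+For example, twoshorthandles([a, h, b, h, c]) succeeds and twoshorthandles([a, h, b]) fails; adjacent h's, which the listing counts as one long handle, are not distinguished in this sketch.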
+"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Two Uses 17 of 30.txt b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Two Uses 17 of 30.txt new file mode 100644 index 0000000000000000000000000000000000000000..5138d833f1470d7709c0591caf97de0d376765a9 --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Two Uses 17 of 30.txt @@ -0,0 +1,24 @@ +["Green, L 2021, Two Uses 17 of 30, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF PEDAGOGY +by Lucian Green +Two Uses 17 of 30 + +160. ALEXIS: The subject should answer importantly and on the topic. + +161. DION: Two uses is correct because of the prestigiousness of nature, which is correct because the subject should respond importantly and on the topic. + +162. ALEXIS: I prepared to give the visually impaired person a braille argument map. I did this by writing Alexius Meinong's probable comments on the Press Release for Michel Onfray's probable comments on the line 'I did this by writing the text-to-speech algorithm' for the algorithm idea 'I did this by writing the text-to-speech algorithm' in the Computational English argument in Computational English. First, I liked how the phenomena would come to us, by us exploring ones before them. Second, I experienced what it meant by transcending it. Third, I wanted a visual argument of the speech. + +163. DION: The subject shouldn't visualise the reason in his mind's eye (where the subject gave the visually impaired person a braille argument map, like mapping two reasons in the braille argument map to two polyhedrons). + +164. ALEXIS: The subject should spatially construct the reason in his mind. + +165. DION: The subject should answer importantly and on the topic after spatially developing the answer in his mind. 
+ +166. ALEXIS: I prepared to define that an argument map's premise must be finite in the computer program. I did this by writing Martha Nussbaum's probable comments on the Press Release for Michel Onfray's probable comments on the line 'I did this by writing the text-to-speech algorithm' for the algorithm idea 'I did this by writing the text-to-speech algorithm' in the Computational English argument in Computational English. First, I thought there would be new phenomena for the computer experiencing reasons (and Derrida might have added) for it to say these reasons. Second, (with seeming help from Derrida) I wondered how it related to human neuroscience. Third, I undertook to uncover evidence, not false evidence, to convert to speech in law. + +167. DION: The subject shouldn't state that the computer program contains the argument map's premise (where the subject should define that an argument map's premise must be finite in the computer program, like a point on an unfolded polyhedron). + +168. ALEXIS: The subject worked out the appearance of the premise and program before thinking of them. + +169. DION: The subject should spatially construct the reason in his mind to work out the appearance of the premise and program before thinking of them. 
+"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Two Uses 18 of 30.txt b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Two Uses 18 of 30.txt new file mode 100644 index 0000000000000000000000000000000000000000..d7983fda84fe9471bb392e919255484f2a4eaeed --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Two Uses 18 of 30.txt @@ -0,0 +1,24 @@ +["Green, L 2021, Two Uses 18 of 30, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF PEDAGOGY
+by Lucian Green
+Two Uses 18 of 30
+
+170. ALEXIS: I prepared to ask: what if the text contained details of characters and the speech provided details of characters, and these didn't correspond, what then? I did this by writing Alexius Meinong's probable comments on the Press Release for Alexius Meinong's probable comments on the line 'I did this by writing the text-to-speech algorithm' for the algorithm idea 'I did this by writing the text-to-speech algorithm' in the Computational English argument in Computational English. First, I asked does this text have a personality? Second, I asked does this speech have a character? Third, I questioned how do the text having a personality and the speech having a character interrelate? What if it went wrong (the personality corresponded to the character)?
+
+171. DION: The subject shouldn't know what characters looked like (where the subject stated that it would be problematic if the text contained details of characters and the speech provided details of characters, and these didn't correspond, where these were like two hands holding the pot).
+
+172. ALEXIS: The subject identified that the character wanted to meet the other character.
+
+173.
DION: The subject should answer importantly and on the topic because the subject determined that the character wanted to meet the other character.
+
+174. ALEXIS: I prepared to make new comments given the suggestions of the breasoning, in which the same comments as old comments are filtered out (where breasonings are the functional unit of pedagogy and pedagogy inspired the nanny state, in which offensive content is filtered out). I did this by writing Alexius Meinong's probable comments on Alexius Meinong's probable comments on the line 'I did this by writing the text-to-speech algorithm' for the algorithm idea 'I did this by writing the text-to-speech algorithm' in the Computational English argument in Computational English. First, I asked how the text might change under different conditions. Second, I questioned how the speech might change under different conditions. Third, I matched the conditions of the text and speech.
+
+175. DION: The subject shouldn't verify and over-consume new breasonings. This is where the subject made new comments given the suggestions of the breasoning, in which the same comments as old comments are filtered out. It is also where breasonings are the functional unit of pedagogy and pedagogy inspired the nanny state, in which the subject filters offensive content out, where the subject finds two polygons, representing pedagogy and breasonings.
+
+176. ALEXIS: The subject should verify and consume enough new breasonings.
+
+177. DION: The subject identified the speech's character being tested against and consuming content from the text's character.
+
+178. ALEXIS: What is the meaning of onelonghandle(Line) in line 13?
+
+179. DION: Line 13 returns whether the first line has one long (two or more) h's (one long handle).
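+
+Note on onelonghandle(Line) (178-179): in the listing in Two Uses 2 of 4, lines 13-14 read onelonghandle(Line) :- line(Line, h), and the second clause of line/2 is cut off at line 20. A hedged sketch of how that clause might conclude (an assumption, not the original text):
+
+line(Line1, Item) :- %% Determines whether Line1 contains a line of at least 2 Item's
+    Line1 = [Item | Line2], %% Takes the first item, Item from Line1, leaving Line2
+    Line2 = [Item | _Line3]. %% Verifies that a second Item follows, giving a run of at least 2 Item's
+
+With the earlier clause skipping items that do not equal Item, line(Line, h), and so onelonghandle(Line), succeeds exactly when Line contains two or more consecutive h's (one long handle).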
+"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Two Uses 19 of 30.txt b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Two Uses 19 of 30.txt new file mode 100644 index 0000000000000000000000000000000000000000..e106ccf2a9f70f1b84a5b0fb0fb650bfd5dc73de --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Two Uses 19 of 30.txt @@ -0,0 +1,24 @@ +["Green, L 2021, Two Uses 19 of 30, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF PEDAGOGY +by Lucian Green +Two Uses 19 of 30 + +180. ALEXIS: I prepared to be the best in the group of essay writers. I did this by writing the article with others. First, I helped the hermaphrodites. Second, I asked him a question. Third, I examined him. + +181. DION: The subject shouldn't differentiate the same point about the object in the essay (where the subject wrote the piece with others, like being given or giving the object with one hand and writing with the other). + +182. ALEXIS: The subject should write about time and space about the object in the essay. + +183. DION: Two uses is correct because of the dialectic-continuity of nature, which is correct because the subject should write about time and space about the object in the essay. + +184. ALEXIS: I prepared to reinforce realism. I did this by including the secondary text in the bibliography. First, I wrote about the primary text. Second, I read the excerpt in the secondary text. Third, I confirmed what I wrote in the primary text in the secondary text. + +185. DION: The subject shouldn't include 50 As in each book (where the subject included the secondary text in the bibliography, where the text was one that was like an object that he carried with a wide-enough handle). + +186. 
ALEXIS: The subject should include 50 As in each book. + +187. DION: The subject should write 50 As of continuous dialectics in each book. + +188. ALEXIS: I prepared to read the comments. I did this by including a primary text in the bibliography. First, I found the bibliography. Second, I knew about first wind. Third, I examined myself. + +189. DION: The subject shouldn't lift necessary weights (where the subject included a primary text in the bibliography, where the primary text was represented using a handle narrow enough for carrying with a hand). ALEXIS: The subject should lift necessary weights. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Two Uses 2 of 30.txt b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Two Uses 2 of 30.txt new file mode 100644 index 0000000000000000000000000000000000000000..7ec7d5d1241a91e3ea0b15167333ec1d871edc64 --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Two Uses 2 of 30.txt @@ -0,0 +1,24 @@ +["Green, L 2021, Two Uses 2 of 30, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF PEDAGOGY +by Lucian Green +Two Uses 2 of 30 + +11. ALEXIS: The subject should see the light of day. + +12. DION: The subject should write original algorithms and arguments to see the light of day. + +13. ALEXIS: I prepared to write I don't like vegan meat, good approaching bad, like I don't like human-likeness, bad approaching good. I did this by writing Richard Rorty's probable comments on the Press Release for Richard Dawkins' likely comments on the line 'I did this by writing the text-to-speech algorithm' for the algorithm idea 'I did this by writing the text-to-speech algorithm' in the Computational English argument in Computational English. 
First, I thought the professor's image of the vocal folds was weird (interesting) because it wasn't (was) real. Second, I liked realism, ironism, to do with text-to-speech. Third, I didn't like text-to-speech because I didn't think robotics was real enough. + +14. DION: The subject shouldn't endorse positive-enough objects (where the subject wrote, 'I like vegan meat, or good approaching the different other, like human-likeness, or the different other approaching good,' where I can use an object that was approached by good or has approached good). + +15. ALEXIS: The subject should endorse positive-enough objects. + +16. DION: The subject should see the light of day by supporting sometimes disagreeing, positive-enough entities. + +17. ALEXIS: I prepared to say that I dislike (like) that spiritual is real, there would be flittering, fluttering, madness (sanity). I did this by writing Richard Dawkins' probable comments on the Press Release for Richard Dawkins' likely comments on the line 'I did this by writing the text-to-speech algorithm' for the algorithm idea 'I did this by writing the text-to-speech algorithm' in the Computational English argument in Computational English. First, entering text to be converted to speech is confusing at first because of the format the typist needs to enter it into the computer. Second, I wrote, what if it gets it wrong (right) because of not having met that format before. Third, I don't (do) like the format either, the high-quality sex scenes with robots and things like that, meaning the algorithm to carry out spiritual communication. + +18. DION: The subject shouldn't not want verifying 4D (imagining opening a box, like the spiritual) from 3D (a box, like the real) is safe (where the subject said that he liked that the spiritual is real, like wanting an object in an image is real, so there would be sanity). + +19. 
ALEXIS: The subject should verify 4D (imagining opening a box, like the spiritual) from 3D (a box, like the real) is safe. + +20. DION: The subject should verify the spiritual (as real) in essays. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Two Uses 2 of 4.txt b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Two Uses 2 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..6974024b13fd8d42f45ca4f2a8ca54001200b25a --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Two Uses 2 of 4.txt @@ -0,0 +1,15 @@ +["Green, L 2021, Two Uses 2 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF PEDAGOGY +by Lucian Green +Two Uses 2 of 4 + +11. onelonghandle(Line); %% Returns whether the first line has one long (two or more) h's (one long handle) +12. twoshorthandles(Line). %% Returns whether the first line has two h's (two short handles) +13. onelonghandle(Line) :- %% Returns whether the first line has one long (two or more) h's (one long handle) +14. line(Line, h). %% Determines whether Line contains a line of at least 2 h's +15. line(Line1, Item1) :- %% Determines whether Line1 contains a line of at least 2 Item1's +16. Line1 = [Item2 | Line2], %% Takes the first item, Item2 in Line1, giving Line2 +17. not(Item2 = Item1), %% Verifies that Item2 is not Item1 +18. line(Line2, Item1), !. %% Determines whether Line2 contains a line of at least 2 Item1's +19. line(Line1, Item) :- %% Determines whether Line1 contains a line of at least 2 Item's +20. 
Line1 = [Item | Line2], %% Takes the first item, Item from Line1, leaving Line2 +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Two Uses 20 of 30.txt b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Two Uses 20 of 30.txt new file mode 100644 index 0000000000000000000000000000000000000000..89d344afecb3282a82c43cdee435c1436f7c744f --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Two Uses 20 of 30.txt @@ -0,0 +1,24 @@ +["Green, L 2021, Two Uses 20 of 30, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF PEDAGOGY +by Lucian Green +Two Uses 20 of 30 + +190. DION: The subject should include 50 As in each book of the necessary weight. + +191. ALEXIS: I prepared to examine two reviews. I did this by including the review in the bibliography. First, I examined the book. Second, I examined the review. Third, I confirmed what the review stated about the book. + +192. DION: The subject shouldn't misunderstand the review (where the subject included the review in the bibliography, like moving the pan onto the heat). + +193. ALEXIS: The subject should understand the review. + +194. DION: The subject should understand the review about the objects written about in the essay. + +195. ALEXIS: I prepared to use the vocational information about the blog. I did this by including the blog in the bibliography. First, I wrote plenty of blogs down. Second, I wrote one for me. Third, I helped myself to examinations. + +196. DION: The subject shouldn't verify the content of the blog (where the subject included the blog in the bibliography, like tilting the pan to empty the material onto the plate). + +197. ALEXIS: The subject should verify the content of the blog. + +198. 
DION: The subject should understand, then check the source. + +199. ALEXIS: What is the meaning of carryable(Line) in line 10? +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Two Uses 21 of 30.txt b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Two Uses 21 of 30.txt new file mode 100644 index 0000000000000000000000000000000000000000..e956f0d21d13837a10fff66a3b03fe767ad997df --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Two Uses 21 of 30.txt @@ -0,0 +1,24 @@ +["Green, L 2021, Two Uses 21 of 30, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF PEDAGOGY +by Lucian Green +Two Uses 21 of 30 + +200. DION: Line 10 returns whether the first line has one long (two or more) h's (one long handle) or two h's (two short handles). + +201. ALEXIS: I prepared to write how the text and the speech going well together loved us. I did this by writing Michel Onfray's probable comments on Alexius Meinong's probable comments on the line 'I did this by writing the text-to-speech algorithm' for the algorithm idea 'I did this by writing the text-to-speech algorithm' in the Computational English argument in Computational English. First, I liked the character of the text. Second, I loved the character of the speech. Third, I liked how they went well together. + +202. DION: The subject shouldn't drop the object (where the subject wrote how the text and the speech going well together loved us, shown by an animation in which an object is shown to carry other objects). + +203. ALEXIS: The subject should understand each object in speech. + +204. DION: Two uses is correct because of the immersion of nature, which is correct because the subject should understand each object in speech. + +205. 
ALEXIS: I prepared to turn to the page from the table of contents. I did this by writing that a reviewer reviewed the piece. First, I examined the essay. Second, I wrote an article on it. Third, I wrote this down. + +206. DION: The subject shouldn't write on the essay (where the subject wrote that a reviewer reviewed the article, like the subject being able to see the object's top). + +207. ALEXIS: The subject should write a brief summary of the article as the review. + +208. DION: The subject should summarise the ontologies of objects in the article as the review. + +209. ALEXIS: I prepared to ask how changing the text would lead to changes in the speech? I did this by writing Richard Dawkins' probable comments on the Press Release for Alexius Meinong's probable comments on the line 'I did this by writing the text-to-speech algorithm' for the algorithm idea 'I did this by writing the text-to-speech algorithm' in the Computational English argument in Computational English. First, I asked why the conditions of the text, not just the text might change? Second, I asked why is the speech so interesting? Third, I asked why the anti-heroes' (heroes') speeches are there? +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Two Uses 22 of 30.txt b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Two Uses 22 of 30.txt new file mode 100644 index 0000000000000000000000000000000000000000..7e488531353d74bd6318827720e28ad93e96d2aa --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Two Uses 22 of 30.txt @@ -0,0 +1,24 @@ +["Green, L 2021, Two Uses 22 of 30, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF PEDAGOGY +by Lucian Green +Two Uses 22 of 30 + +210. 
DION: The subject shouldn't match milieu (text) with culture (speech) (where the subject determined that changing the text would lead to change in the speech, where change is like an object's walls). + +211. ALEXIS: The subject should match people's judgments (text) with objects (speech). + +212. DION: The subject should write a brief summary (speech) of the essay (text) as the review. + +213. ALEXIS: I prepared to write about life. I did this by planning and working. First, I planned. Second, I worked. Third, I knew the subject wrote on a particular topic. + +214. DION: The subject shouldn't stack the objects stably on the trolley (where the subject made a plan and performed work when lifting an object stably). + +215. ALEXIS: The subject should stack the objects stably on the trolley. + +216. DION: The subject should understand each object in speech because the subject should stack the objects stably on the trolley. + +217. ALEXIS: I prepared to identify the times. I did this by examining the new hour. First, I studied its shape. Second, I examined its moments. Third, I examined hourlinesses (sic). + +218. DION: The subject should do something during the hour (where the subject examined the new hour, finding that it was empty). + +219. ALEXIS: The subject should do nothing during the hour. 
+"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Two Uses 23 of 30.txt b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Two Uses 23 of 30.txt new file mode 100644 index 0000000000000000000000000000000000000000..e483d59d8ab3a49dbdc97e7be782bb3521930abb --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Two Uses 23 of 30.txt @@ -0,0 +1,24 @@ +["Green, L 2021, Two Uses 23 of 30, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF PEDAGOGY +by Lucian Green +Two Uses 23 of 30 + +220. DION: The subject should stack the objects stably on the trolley and do nothing but watch them during the hour. + +221. ALEXIS: What is the meaning of deletefirst(Line1, Item1, Line2, Line3) in line 31? + +222. DION: Line 31 advances to the first instance of Item1 in Line1. + +223. ALEXIS: I prepared to contain my joy that the argument covered all the relevant arguments. I did this by writing unique words as breasonings in arguments. First, I verified that the next word was unique. Second, I prepared to verify the next word. Third, I repeated this until I had verified that all the words were unique. + +224. DION: The subject shouldn't write breasonings from algorithms about breasonings (where the subject wrote unique words as breasonings in arguments, by finding words between words). + +225. ALEXIS: The subject should write logically connected breasonings. + +226. DION: Two uses is correct because of the logicism of nature, which is correct because the subject should write logically connected breasonings. + +227. ALEXIS: I prepared to eat caviar (durum wheat semolina). I did this by writing vegan arguments. First, I wrote about capricorn, or co-operativity (sic). Second, I wrote about the fact that not all like that. 
Third, I examined Hopetoun. + +228. DION: The subject shouldn't eat the right doses of plant ingredients (where the subject wrote vegan arguments, like the words between letters were found). + +229. ALEXIS: The subject should research the correct doses of vitamins, minerals and other vegetable ingredients to eat. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Two Uses 24 of 30.txt b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Two Uses 24 of 30.txt new file mode 100644 index 0000000000000000000000000000000000000000..5f84c7dea981474c2138490c7ccd79f4d2b69919 --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Two Uses 24 of 30.txt @@ -0,0 +1,24 @@ +["Green, L 2021, Two Uses 24 of 30, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF PEDAGOGY +by Lucian Green +Two Uses 24 of 30 + +230. DION: The subject should logically connect the correct doses of vitamins, minerals and other vegetable ingredients to eat in medicine. + +231. ALEXIS: I prepared to agree with meditation (popology). I did this by writing non-religious (philosophical) arguments. First, I wrote about piety (writing). Second, I edited out (wrote about) epistemology. Third, I wrote about you. + +232. ALEXIS: I prepared to encourage sex (freedom). I did this by writing non-sexual arguments (arguments for a general audience). First, I wrote about piety (authorship). Second, I wrote about postludetudine (sic). Third, I wrote about nanga (sic). + +233. DION: The subject shouldn't spell correctly (where the subject wrote non-religious, or philosophical arguments, like finding the letters between letters). + +234. ALEXIS: The subject should spell correctly. + +235. 
DION: The subject should research the correct doses of vitamins, minerals and other vegetable ingredients by spelling correctly. + +236. DION: The subject shouldn't trip on the path (where the subject wrote non-sexual arguments, or arguments for a general audience, like finding the routes between items). + +237. ALEXIS: The subject should walk on the path. + +238. DION: The subject should write logically connected breasonings while walking on the path. + +239. ALEXIS: I prepared to enter heaven (experience bliss). I did this by writing positive arguments. First, I wrote agreeing arguments. Second, I wrote positive arguments. Third, I wrote neutral arguments. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Two Uses 25 of 30.txt b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Two Uses 25 of 30.txt new file mode 100644 index 0000000000000000000000000000000000000000..e065b51d8d39a50a06d1145e5e294d1137fbdc08 --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Two Uses 25 of 30.txt @@ -0,0 +1,24 @@ +["Green, L 2021, Two Uses 25 of 30, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF PEDAGOGY +by Lucian Green +Two Uses 25 of 30 + +240. DION: The subject shouldn't make future discoveries in the spaces (where the subject wrote positive arguments, which had spaces between items). + +241. ALEXIS: The subject connected the arguments. + +242. DION: The subject should walk on the path traversing the argument connecting the arguments. + +243. ALEXIS: What is the meaning of deletefirst(Line1, Item, Line2, Line3) in line 27? + +244. DION: Line 27 deletes an instance of Item in Line1. + +245. ALEXIS: I prepared to go bonkers (remain sane). I did this by summarising the algorithm in three steps. 
First, I performed the action on the first item. Second, I prepared to perform the action another time on the next item. Third, I repeated this until I had performed the action on all the items in the file. + +246. DION: The subject shouldn't state that the program is functional (where the subject summarised the algorithm in three steps, where a step is finding a short handle). + +247. ALEXIS: The subject should state that the program is functional. + +248. DION: Two uses is correct because of the functionalism of nature, which is correct because the subject should state that the program is functional. + +249. ALEXIS: I prepared to write the algorithm as a hierarchy of predicates. I did this by writing the processes in the algorithm. First, I identified the process acting on the list. Second, I identified the process acting on the list of lists. Third, I identified the process acting on the list of lists of lists. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Two Uses 26 of 30.txt b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Two Uses 26 of 30.txt new file mode 100644 index 0000000000000000000000000000000000000000..f737a0c0a4a1fd51270524a5fc101b0ea79eaf14 --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Two Uses 26 of 30.txt @@ -0,0 +1,26 @@ +["Green, L 2021, Two Uses 26 of 30, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF PEDAGOGY +by Lucian Green +Two Uses 26 of 30 + +250. DION: The subject shouldn't identify the process applied to a data item (where the subject wrote a process in the algorithm, e.g. pouring the soup). + +251. ALEXIS: The subject should identify the process applied to a data item. + +252. 
DION: The subject should state that the program is functional because of the processes applied to data items. + +253. ALEXIS: I prepared to quote the guide in the exposition. I did this by commissioning the guide. First, I wrote the analysis. Second, I wrote the biography. Third, I wrote the abstract. + +254. DION: The subject shouldn't connect each key idea in the analytic guide (where the subject commissioned the guide, like drinking the soup). + +255. ALEXIS: The subject should connect each key idea in the analytic guide. + +256. DION: The subject should connect each key idea, and the process applied to a data item in the analytic guide. + +257. ALEXIS: I prepared to be commissioned by the Sultan. I did this by commissioning the glossary. First, I wrote the glossary. Second, I was commissioned by the Raj. Third, I was commissioned by the Emir. + +258. DION: The subject shouldn't make a new connection between a term and definition (where the subject commissioned the glossary, represented by lifting the handle). + +259. ALEXIS: The subject should make a new connection between a term and definition. + +260. DION: The subject should state that the program is functional by making a connection between a program and new input. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Two Uses 27 of 30.txt b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Two Uses 27 of 30.txt new file mode 100644 index 0000000000000000000000000000000000000000..690178ff82324554f02e4beb09d1fd5a9dea1f47 --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Two Uses 27 of 30.txt @@ -0,0 +1,22 @@ +["Green, L 2021, Two Uses 27 of 30, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF PEDAGOGY +by Lucian Green +Two Uses 27 of 30 + +261. 
ALEXIS: I prepared to state that everything was high quality about the summary. I did this by commissioning the summary. First, I wrote the word. Second, I summarised it. Third, I helped you write it too. + +262. DION: The subject shouldn't skip a key sentence in the summary (where the subject commissioned the summary, where reading the summary is like holding the handle). + +263. ALEXIS: The subject should include each key sentence in the summary. + +264. DION: The subject should make a new connection between a term and definition and between each key sentence in the summary. + +265. ALEXIS: What is the meaning of deletefirst([], _Item, Line, Line) in line 30? + +266. DION: Line 30 returns Line with the first instance of Item deleted. + +267. ALEXIS: I prepared to examine how the Abracadabras affected people. I did this by writing how two uses affected people. First, I knew about the two uses. Second, I knew about the people. Third, I examined how the two uses affected people. + +268. DION: The subject shouldn't use whiteboard magnet icons (where the subject wrote how two uses affected people, where he used the two items in two different ways each). + +269. ALEXIS: The subject should load different icons each day. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Two Uses 28 of 30.txt b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Two Uses 28 of 30.txt new file mode 100644 index 0000000000000000000000000000000000000000..789c6630780b05e4a09e867ec3abe3f2e5bbf148 --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Two Uses 28 of 30.txt @@ -0,0 +1,24 @@ +["Green, L 2021, Two Uses 28 of 30, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF PEDAGOGY +by Lucian Green +Two Uses 28 of 30 + +270.
DION: Two uses is correct because of the iconism of nature, where this is correct because the subject should load different icons each day. + +271. ALEXIS: I prepared to join the ideas up. I did this by writing my algorithm in my own words. First, I found the Quasifontanaland. Second, I used it to a pulp. Third, I grated the ideas up. + +272. DION: The subject shouldn't select in programs, increasing efficiency (where the subject wrote his algorithm in his own words, after taking each word from a list). + +273. ALEXIS: The subject should select in programs, increasing efficiency. + +274. DION: The subject should select from icons in programming programs, increasing efficiency. + +275. ALEXIS: I prepared to work out the comic contents. I did this by commissioning the comics. First, I found the matrix. Second, I filled it up. Third, I verified it. + +276. DION: The subject shouldn't intertwine two uses with each frame (where the subject commissioned the comics, like finding each frame's main point). + +277. ALEXIS: The subject should intertwine two uses with each frame. + +278. DION: The subject should select D in 'As A is to B, C is to D' in programs, by intertwining two uses (A is to Xn) with each frame. + +279. ALEXIS: I prepared to eat recycled matter. I did this by finding two uses during music. First, I found the first use. Second, I found the second use. Third, I listened to the music. 
+"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Two Uses 29 of 30.txt b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Two Uses 29 of 30.txt new file mode 100644 index 0000000000000000000000000000000000000000..6baedb0b0b112145d15e30991be1d8c522b4e0ad --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Two Uses 29 of 30.txt @@ -0,0 +1,24 @@ +["Green, L 2021, Two Uses 29 of 30, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF PEDAGOGY +by Lucian Green +Two Uses 29 of 30 + +280. DION: The subject shouldn't indulge in sex and food (where the subject found two uses during music, where she loved the two uses). + +281. ALEXIS: The subject should indulge in sex and food. + +282. DION: The subject should load different sex and food icons each day. + +283. ALEXIS: I prepared to write about the country. I did this by writing my argument in my own words. First, I wrote the argument. Second, I wrote it in my own words. Third, I examined my own words. + +284. DION: The subject shouldn't choose a setting, time and reason for the argument (where the subject wrote the argument in his own words, by drawing the two items). + +285. ALEXIS: The subject should choose a setting, time and reason for the argument. + +286. DION: The subject should choose a setting, time and reason for the arguments about indulging in sex and food. + +287. ALEXIS: What is the meaning of stands(Line) in line 36? + +288. DION: Line 36 tests that Line stands (that there is a line of two p's in Line). + +289. ALEXIS: I prepared to examine my famousness. I did this by including the autobiography in the bibliography. First, I wrote about myself. Second, I wrote about the rod operation. Third, I held it aloft. 
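The stands(Line) test glossed at line 36 above (Line stands when it contains a line of two p's) can be mirrored outside Prolog. A minimal sketch, assuming Line is a list of characters and that line(Line, p) means a run of at least two adjacent p's; this is an illustrative Python translation of the Prolog, not the original code:

```python
def stands(line):
    # True when line contains two adjacent 'p' characters (the "pan"
    # the utensil stands on), as DION glosses stands/1 and line/2.
    return any(a == 'p' and b == 'p' for a, b in zip(line, line[1:]))
```

So stands(['x', 'p', 'p']) holds, while stands(['p', 'x', 'p']) fails because the two p's are not adjacent.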
+"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Two Uses 3 of 30.txt b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Two Uses 3 of 30.txt new file mode 100644 index 0000000000000000000000000000000000000000..be7c97b62bc055bdb12666b4d80ed853ded55ffe --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Two Uses 3 of 30.txt @@ -0,0 +1,24 @@ +["Green, L 2021, Two Uses 3 of 30, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF PEDAGOGY +by Lucian Green +Two Uses 3 of 30 + +21. ALEXIS: I prepared to like converting text to speech. I did this by writing Noam Chomsky's probable comments on Richard Dawkins' likely comments on the line 'I did this by writing the text-to-speech algorithm' for the algorithm idea 'I did this by writing the text-to-speech algorithm' in the Computational English argument in Computational English. First, I asked what do you mean Lucian Green, separateness? He means individualness (sic). Second, I don't like text much, is unusual (I like text). Third, I like speech as well. + +22. DION: The subject disagreed with the idea being misrepresented in text (where the subject endorsed converting text to speech, and he ordered the uses: text, speech). + +23. ALEXIS: The subject typed the idea neatly. + +24. DION: The subject should record, then verify the idea. + +25. ALEXIS: What is the meaning of lastline([Lastline], Lastline) in line 9? + +26. DION: Line 9 returns Lastline when it is the last item. + +27. ALEXIS: I prepared to feel sorry for the physically challenged people. 
I did this by writing Noam Chomsky's probable comments on the Press Release for Michel Onfray's probable comments on the line 'I did this by writing the text-to-speech algorithm' for the algorithm idea 'I did this by writing the text-to-speech algorithm' in the Computational English argument in Computational English. First, I wrote the second philosopher's second comment set related to the first philosopher's second comment set on the line. Second, I asked isn't the second philosopher's first comment set related to the first philosopher's first and second comment set on the line and the second philosopher's second comment set related to his or her first comment set on the line? Third, I wrote that's what I want to comment on, not unnecessary material all the time. + +28. DION: The physically challenged person shouldn't be given a supplementary examination (where the subject felt sorry for the physically challenged person, and one person should inspect the physically challenged person's mark). + +29. ALEXIS: The subject should give the physically challenged person a supplementary examination. + +30. DION: Two uses is correct because of the mercy of nature, which is right because the subject should give the physically challenged person a supplementary examination.
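The lastline([Lastline], Lastline) clause glossed above (line 9 returns Lastline when it is the last item) can be sketched in Python. This is an illustrative translation of the Prolog, assuming the unquoted recursive clause simply drops the head of the list:

```python
def lastline(lines):
    # Mirrors lastline([Lastline], Lastline): a single remaining item
    # is the last line; otherwise recurse on the tail. An empty list
    # raises, as the Prolog predicate would fail on [].
    if len(lines) == 1:
        return lines[0]
    return lastline(lines[1:])
```

For example, lastline(['a', 'b', 'c']) yields 'c', matching the base case firing once the list is reduced to one item.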
+"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Two Uses 3 of 4.txt b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Two Uses 3 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..9a0141bfbb422bce265b4ea14fa8f706d3dbf7b4 --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Two Uses 3 of 4.txt @@ -0,0 +1,15 @@ +["Green, L 2021, Two Uses 3 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF PEDAGOGY +by Lucian Green +Two Uses 3 of 4 + +21. Line2 = [Item | _Line3]. %% Takes the first item, Item from Line2 +22. twoshorthandles(Line1) :- %% Returns whether Line1 contains two h's (two short handles) +23. member(h, Line1), %% Tests that h is a member of Line1 +24. deletefirst(Line1, h, [], Line2), %% Deletes the first instance of h in Line1 +25. member(h, Line2), +26. deletefirst(Line2, h, [], _Line3). +27. deletefirst(Line1, Item, Line2, Line3) :- %% Deletes an instance of Item in Line1 +28. Line1 = [Item | Line4], +29. append(Line2, Line4, Line3). +30. deletefirst([], _Item, Line, Line). 
%% Returns Line with the first instance of Item deleted +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Two Uses 30 of 30.txt b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Two Uses 30 of 30.txt new file mode 100644 index 0000000000000000000000000000000000000000..39a9ce7a346d3e36f229fbfe99f933cc50c317ba --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Two Uses 30 of 30.txt @@ -0,0 +1,18 @@ +["Green, L 2021, Two Uses 30 of 30, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF PEDAGOGY +by Lucian Green +Two Uses 30 of 30 + +290. DION: The subject shouldn't rely on memory (where the subject included the autobiography in the bibliography like I rested the object). + +291. ALEXIS: The subject should rely on records. + +292. DION: Two uses is correct because of the empiricism of nature, which is correct because the subject should rely on records. + +293. ALEXIS: I prepared to have the text translated into Portuguese. I did this by commissioning the translation. First, I wrote the text. Second, I had it translated into French. Third, I had it translated into German. + +294. DION: The subject shouldn't paradoxically suggest that the translation will conserve all meaning (where the subject commissioned the translation, like placing it there, in another place). + +295. ALEXIS: The subject should correctly translate a concise version of the text. + +296. DION: The subject should rely on linguistic materials to correctly translate a concise version of the text. 
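The deletefirst clauses quoted earlier (lines 27-30, glossed as deleting an instance of Item in Line1 and returning Line with the first instance of Item deleted) can be mirrored outside Prolog. A minimal sketch, assuming lists of characters and using a Python accumulator in place of the predicate's extra list arguments; this is an illustrative translation, not the original code:

```python
def deletefirst(line1, item):
    # Walk line1, accumulating items until the first occurrence of
    # item, which is dropped; if item never occurs, the base case
    # deletefirst([], _Item, Line, Line) returns the accumulator,
    # i.e. the whole input unchanged.
    acc = []
    for i, x in enumerate(line1):
        if x == item:
            return acc + line1[i + 1:]
        acc.append(x)
    return acc
```

So deletefirst(['a', 'h', 'b', 'h'], 'h') removes only the first 'h', which is how twoshorthandles can test for a second h after deleting the first.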
+"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Two Uses 4 of 30.txt b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Two Uses 4 of 30.txt new file mode 100644 index 0000000000000000000000000000000000000000..84fa4be7ef7ff78ece447930b82b137d7be0d091 --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Two Uses 4 of 30.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Two Uses 4 of 30, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF PEDAGOGY +by Lucian Green +Two Uses 4 of 30 + + +31. ALEXIS: I prepared to work out that Derrida's writing was the As, not disappointing the reader with no or two (sic) many breasonings. I did this by writing Michel Onfray's probable comments on Michel Onfray's probable comments on the line 'I did this by writing the text-to-speech algorithm' for the algorithm idea 'I did this by writing the text-to-speech algorithm' in the Computational English argument in Computational English. First, I liked writing. Second, I liked the author. Third, I talked about the author. + +32. DION: The subject shouldn't visualise the object (where the subject worked out that Derrida's writing was the As, not disappointing the reader with no or two (sic) many breasonings, allowing one to visualise one object per sentence). + +33. ALEXIS: The subject should visualise the object by saying the breasonings ways of thinking to God, then breasoning out the object (thinking of its x, y and z dimensions). + +34. DION: The student should read each question carefully in the supplementary examination. ALEXIS: I prepared to naturally expect goodness when I had forgotten or not forgotten a point. 
I did this by writing Michel Onfray's probable comments on the Press Release for Michel Onfray's probable comments on the line 'I did this by writing the text-to-speech algorithm' for the algorithm idea 'I did this by writing the text-to-speech algorithm' in the Computational English argument in Computational English. First, inspired by Nussbaum, I got it right (I wrote unique phenomena in each case) as well as the computer. Second, I liked transcending the text by bracketing it (forgetting it and later writing on the most important point). Third, I wanted critical thinking's argument structures, converting arguments into argument maps and vice versa. + +35. DION: The subject shouldn't write a summary from memory (where the subject naturally expected goodness when a point had been forgotten or not forgotten, like imagining being able to hold the object when he had not forgotten the point). + +36. ALEXIS: The subject should write a summary from memory. + +37. DION: The subject should verify his summary. + +38. ALEXIS: I prepared to write the first philosopher's first comment set related to either Lucian's line or the second philosopher's first comment set on Lucian's line. I did this by writing Noam Chomsky's probable comments on Michel Onfray's probable comments on the line 'I did this by writing the text-to-speech algorithm' for the algorithm idea 'I did this by writing the text-to-speech algorithm' in the Computational English argument in Computational English. First, I asked what's the relevance of writing and subjects he (Onfray) hasn't studied? Second, I asked what's the question (asking how to write 100 As per Masters assignment chapter?). Third, I asked what's the relevance of all this? + +39. DION: The subject shouldn't read the comments on the comments (where the subject wrote the first philosopher's first comment set related to either Lucian's line or the second philosopher's first comment set on Lucian's line, where Lucian's line is like the beyond). 
+ +40. ALEXIS: The subject should read the comments on the comments. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Two Uses 4 of 4.txt b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Two Uses 4 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..a742f89628af4492f0a2bf781325a4bdf895a031 --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Two Uses 4 of 4.txt @@ -0,0 +1,603 @@ +["Green, L 2021, Two Uses 4 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF PEDAGOGY +by Lucian Green +Two Uses 4 of 4 + +31. deletefirst(Line1, Item1, Line2, Line3) :- %% Line 31 advances to the first instance of Item1 in Line1 +32. Line1 = [Item2 | Line4], +33. not(Item1 = Item2), +34. append(Line2, [Item2], Line5), +35. deletefirst(Line4, Item1, Line5, Line3). +36. stands(Line) :- %% Tests that Line stands (that there is a line of two p's in Line) +37. line(Line, p). + +DION: The subject should write original algorithms and arguments in the supplementary examination. Further, the physically challenged person should be given a supplementary examination in which he should identify different writers in the exposition. Also, the subject should write original algorithms and arguments where the algorithm should verify that the object is a member of the set. Besides this, the subject should check that the object is a member of the set and is attached to the correct meaning. Additionally, the subject should ethically assess the person's two uses in writing to enable her to become a founder. Along with this, the subject should become a founder by answering questions importantly and on the topic. 
As well as this, the subject should ethically assess the person's two uses in writing about time and space about the object in the essay. Furthermore, the subject should write about time and space about the object in the piece after understanding each object in speech. Moreover, the subject should write logically connected program lines to ensure that the program is functional. Also, the subject should state that the program that loads different icons each day is functional. Finally, the subject should logically connect records of breasonings. + +ALEXIS: What is the meaning of twouses1(Image) in line 1? + +DION: Line 1 returns true if Image contains an object that is carryable. That is, the first line has one long (two or more) h's (one long handle) or two h's (two short handles) and can stand, that is the last line has a horizontal line of at least two p's (pan). + +ALEXIS: I prepared to like Lucian's computational philosophy academy. I did this by writing Noam Chomsky's probable comments on the Press Release for Richard Dawkins' likely comments on the line 'I did this by writing the text-to-speech algorithm' for the algorithm idea 'I did this by writing the text-to-speech algorithm' in the Computational English argument in Computational English. First, I asked why do we have to find a new feature, isn't text-to-speech enough? Second, I said why is complexification part of finding fault (agreeing)? We have to fix it up and move on. Third, I do like formats, but I don't (do) like Richard Dawkins. + +DION: The subject shouldn't write original algorithms and arguments (where the subject endorsed Lucian's Computational Philosophy Academy, embarking on a pathway between two uses). + +ALEXIS: The subject should write original algorithms and arguments. + +DION: Two uses is correct because of the phenomenology of nature, which is correct because the subject should write unique algorithms and arguments. 
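The gloss of twouses1(Image) above (true if Image contains a carryable object: the first line has one long handle of two or more h's or two short handles, and the last line has a horizontal line of at least two p's) can be sketched in Python. This is an illustrative translation under the assumption that Image is a list of rows, each a list of characters:

```python
def twouses1(image):
    # First row must show a handle: a run of two or more adjacent h's
    # (one long handle) or at least two h's anywhere (two short handles).
    # Last row must let the utensil stand: a run of at least two p's.
    first, last = image[0], image[-1]
    long_handle = any(a == 'h' and b == 'h' for a, b in zip(first, first[1:]))
    two_short_handles = first.count('h') >= 2
    can_stand = any(a == 'p' and b == 'p' for a, b in zip(last, last[1:]))
    return (long_handle or two_short_handles) and can_stand
```

An image with two separated h's in its top row and an adjacent pair of p's in its bottom row passes both tests, so the program confirms that the utensil can be carried and stands.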
+ +ALEXIS: I prepared to find it (pedagogy) out using meditation (philosophy) and trial and error in my degree. I did this by writing Richard Rorty's probable comments on Richard Dawkins' likely comments on the line 'I did this by writing the text-to-speech algorithm' for the algorithm idea 'I did this by writing the text-to-speech algorithm' in the Computational English argument in Computational English. First, I asked what's Daoism (concerning individualness) got to do with it. Second, I wrote they must be perfect. Third, I wrote they must be put together again. + +DION: The subject shouldn't see the light of day (where the subject found pedagogy out using meditation, or philosophy and trial and error in his degree, like meditation has a second use, pedagogy). + +ALEXIS: The subject should see the light of day. + +DION: The subject should write original algorithms and arguments to see the light of day. + +ALEXIS: I prepared to write I don't like vegan meat, good approaching bad, like I don't like human-likeness, bad approaching good. I did this by writing Richard Rorty's probable comments on the Press Release for Richard Dawkins' likely comments on the line 'I did this by writing the text-to-speech algorithm' for the algorithm idea 'I did this by writing the text-to-speech algorithm' in the Computational English argument in Computational English. First, I thought the professor's image of the vocal folds was weird (interesting) because it wasn't (was) real. Second, I liked realism, ironism, to do with text-to-speech. Third, I didn't like text-to-speech because I didn't think robotics was real enough. + +DION: The subject shouldn't endorse positive-enough objects (where the subject wrote, 'I like vegan meat, or good approaching the different other, like human-likeness, or the different other approaching good,' where I can use an object that was approached by good or has approached good). + +ALEXIS: The subject should endorse positive-enough objects. 
+ +DION: The subject should see the light of day by supporting sometimes disagreeing, positive-enough entities. + +ALEXIS: I prepared to say that I dislike (like) that spiritual is real, there would be flittering, fluttering, madness (sanity). I did this by writing Richard Dawkins' probable comments on the Press Release for Richard Dawkins' likely comments on the line 'I did this by writing the text-to-speech algorithm' for the algorithm idea 'I did this by writing the text-to-speech algorithm' in the Computational English argument in Computational English. First, entering text to be converted to speech is confusing at first because of the format the typist needs to enter it into the computer. Second, I wrote, what if it gets it wrong (right) because of not having met that format before. Third, I don't (do) like the format either, the high-quality sex scenes with robots and things like that, meaning the algorithm to carry out spiritual communication. + +DION: The subject shouldn't not want verifying 4D (imagining opening a box, like the spiritual) from 3D (a box, like the real) is safe (where the subject said that he liked that the spiritual is real, like wanting an object in an image is real, so there would be sanity). + +ALEXIS: The subject should verify 4D (imagining opening a box, like the spiritual) from 3D (a box, like the real) is safe. + +DION: The subject should verify the spiritual (as real) in essays. + +ALEXIS: I prepared to like converting text to speech. I did this by writing Noam Chomsky's probable comments on Richard Dawkins' likely comments on the line 'I did this by writing the text-to-speech algorithm' for the algorithm idea 'I did this by writing the text-to-speech algorithm' in the Computational English argument in Computational English. First, I asked what do you mean Lucian Green, separateness? He means individualness (sic). Second, I don't like text much, is unusual (I like text). Third, I like speech as well. 
+ +DION: The subject disagreed with the idea being misrepresented in text (where the subject endorsed converting text to speech, and he ordered the uses: text, speech). + +ALEXIS: The subject typed the idea neatly. + +DION: The subject should record, then verify the idea. + +ALEXIS: What is the meaning of lastline([Lastline], Lastline) in line 9? + +DION: Line 9 returns Lastline when it is the last item. + +ALEXIS: I prepared to feel sorry for the physically challenged people. I did this by writing Noam Chomsky's probable comments on the Press Release for Michel Onfray's probable comments on the line 'I did this by writing the text-to-speech algorithm' for the algorithm idea 'I did this by writing the text-to-speech algorithm' in the Computational English argument in Computational English. First, I wrote the second philosopher's second comment set related to the first philosopher's second comment set on the line. Second, I asked isn't the second philosopher's first comment set related to the first philosopher's first and second comment set on the line and the second philosopher's second comment set related to his or her first comment set on the line? Third, I wrote that's what I want to comment on, not unnecessary material all the time. + +DION: The physically challenged person shouldn't be given a supplementary examination (where the subject felt sorry for the physically challenged person, and one person should inspect the physically challenged person's mark). + +ALEXIS: The subject should give the physically challenged person a supplementary examination. + +DION: Two uses is correct because of the mercy of nature, which is right because the subject should give the physically challenged person a supplementary examination. + +ALEXIS: I prepared to work out that Derrida's writing was the As, not disappointing the reader with no or two (sic) many breasonings.
I did this by writing Michel Onfray's probable comments on Michel Onfray's probable comments on the line 'I did this by writing the text-to-speech algorithm' for the algorithm idea 'I did this by writing the text-to-speech algorithm' in the Computational English argument in Computational English. First, I liked writing. Second, I liked the author. Third, I talked about the author. + +DION: The subject shouldn't visualise the object (where the subject worked out that Derrida's writing was the As, not disappointing the reader with no or two (sic) many breasonings, allowing one to visualise one object per sentence). + +ALEXIS: The subject should visualise the object by saying the breasonings ways of thinking to God, then breasoning out the object (thinking of its x, y and z dimensions). + +DION: The student should read each question carefully in the supplementary examination. ALEXIS: I prepared to naturally expect goodness when I had forgotten or not forgotten a point. I did this by writing Michel Onfray's probable comments on the Press Release for Michel Onfray's probable comments on the line 'I did this by writing the text-to-speech algorithm' for the algorithm idea 'I did this by writing the text-to-speech algorithm' in the Computational English argument in Computational English. First, inspired by Nussbaum, I got it right (I wrote unique phenomena in each case) as well as the computer. Second, I liked transcending the text by bracketing it (forgetting it and later writing on the most important point). Third, I wanted critical thinking's argument structures, converting arguments into argument maps and vice versa. + +DION: The subject shouldn't write a summary from memory (where the subject naturally expected goodness when a point had been forgotten or not forgotten, like imagining being able to hold the object when he had not forgotten the point). + +ALEXIS: The subject should write a summary from memory. + +DION: The subject should verify his summary. 
+ +ALEXIS: I prepared to write the first philosopher's first comment set related to either Lucian's line or the second philosopher's first comment set on Lucian's line. I did this by writing Noam Chomsky's probable comments on Michel Onfray's probable comments on the line 'I did this by writing the text-to-speech algorithm' for the algorithm idea 'I did this by writing the text-to-speech algorithm' in the Computational English argument in Computational English. First, I asked what's the relevance of writing and subjects he (Onfray) hasn't studied? Second, I asked what's the question (asking how to write 100 As per Masters assignment chapter?). Third, I asked what's the relevance of all this? + +DION: The subject shouldn't read the comments on the comments (where the subject wrote the first philosopher's first comment set related to either Lucian's line or the second philosopher's first comment set on Lucian's line, where Lucian's line is like the beyond). + +ALEXIS: The subject should read the comments on the comments. + +DION: The physically challenged person should be given a supplementary examination on comments on comments, which he should read. + +ALEXIS: I prepared to understand the speech better. I did this by writing Martha Nussbaum's probable comments on the Press Release for Richard Dawkins' probable comments on the line 'I did this by writing the text-to-speech algorithm' for the algorithm idea 'I did this by writing the text-to-speech algorithm' in the Computational English argument in Computational English. First, I wrote the real conclusion made us recognise the spiritual thoughts. Second, I needed experience to recognise the spiritual ideas. Third, I liked eating with people, to discuss the experience. + +DION: The subject should do nothing after collecting the comment (where the subject had a limit to his understanding of speech compared with text). + +ALEXIS: The subject should explain the text with speech after collecting the comment. 
+ +DION: The subject should explain the text with speech after collecting the comment and comment on the comment. + +ALEXIS: What is a use this program verifies? + +DION: The program confirms that the utensil stands. + +ALEXIS: I prepared to ask what the point of semantics was again, to which Lucian replied that the First Technique contained the first person's upper and lower triangular expositions and critiques respectively, the Second Technique of Meaning verified the tautological meaning of the First Technique, and the Third Technique of Interpretation contained upper and lower triangular expositions and critiques respectively of an other compared with the writer of the First Technique essay. I did this by writing Noam Chomsky's probable comments on the Press Release for Richard Rorty's probable comments on the line 'I did this by writing the text-to-speech algorithm' for the algorithm idea 'I did this by writing the text-to-speech algorithm' in the Computational English argument in Computational English. First, I agreed to have semantics back. Second, I didn't want the determination of meaning; I wanted verification of it. Third, they were completely different, so it doesn't matter [where the input is known and the output is unknown in determination but the output is known in verification, to which Chomsky replied they both work]. + +DION: The subject shouldn't identify different writers in the exposition (where the subject stated that the First Technique contained the first person's upper and lower triangular expositions and critiques respectively, where he writes on a stable surface). + +ALEXIS: The subject should identify different writers in the exposition. + +DION: Two uses is correct because of the writer identification of nature, which is correct because the subject should recognise different writers in the exposition. + +ALEXIS: I prepared to say wouldn't it be great if people correctly ordered my philosophy. 
I did this by writing Michel Onfray's probable comments on Richard Rorty's probable comments on the line 'I did this by writing the text-to-speech algorithm' for the algorithm idea 'I did this by writing the text-to-speech algorithm' in the Computational English argument in Computational English. First, I asked whether this was a form of artificial intelligence, and said it should be called that, then. Second, I thought the puffin ducks were cute. Third, I wrote that verificationism refers to verifying the text of the speech, where the subject generates this speech from the text. + +DION: The subject shouldn't state that interpreted breasonings literally, as against figuratively, support the argument (where the subject correctly ordered the philosophy like equal length struts that stay still). + +ALEXIS: The subject should state that interpreted breasonings literally, as against figuratively, support the argument. + +DION: The subject cited authors who had written better arguments. + +ALEXIS: I prepared to compute whether different texts had the same speech or the same texts had different speech, to which Dawkins replied, like what, to which I replied they are homophones and homographs, respectively. I did this by writing Richard Dawkins' probable comments on the Press Release for Richard Rorty's probable comments on the line 'I did this by writing the text-to-speech algorithm' for the algorithm idea 'I did this by writing the text-to-speech algorithm' in the Computational English argument in Computational English. First, I suggested using inductive reasoning to write the text-to-speech algorithm in your own programming language. Second, I would give it text and speech. Third, it would give me the rules, to which Chomsky replied, it's too simple, even I dislike it. + +DION: The subject didn't connect the sameness in uses (where the subject computed whether different texts had the same speech or the same texts had different speech. Dawkins replied, 'Like what?'
The subject responded, 'They are homophones and homographs, respectively,' like sturdy construction connecting that they both refer to sameness in things with strong glue). + +ALEXIS: The subject connected the sameness in uses. + +DION: The subject should write well by joining the samenesses in uses. + +ALEXIS: I prepared to query how the database is related to text-to-speech, to which I replied the subject would store the voices in databases. I did this by writing Richard Dawkins' probable comments on Richard Rorty's probable comments on the line 'I did this by writing the text-to-speech algorithm' for the algorithm idea 'I did this by writing the text-to-speech algorithm' in the Computational English argument in Computational English. First, I didn't want this own programming language business, it doesn't work, to which I replied it does, but statements like A is B + C would become A equals B + C, and Pattern1 = Pattern2 would become Pattern1 matches Pattern2, where equals and matches would be single keys on the keyboard. Second, I wrote this would be complex and worse, to which I replied the more complex expressions should be broken down into their simplest forms, making them easier. Third, I queried how Prolog is related to text-to-speech, to which I replied the subject would convert the text into phonemes using rules expressed in Prolog. + +DION: The subject shouldn't write misaligned meanings (where the subject queried how the database is related to text-to-speech, to which I replied the subject would store the voices in databases like there is a stable centre of gravity). + +ALEXIS: The subject should write aligned meanings. + +DION: The subject should write a comment written on part of a topic by a writer. + +ALEXIS: I prepared to recommend the high-quality comment by the programmer for the command be cut off. 
I did this by writing Michel Onfray's probable comments on the Press Release for Richard Rorty's probable comments on the line 'I did this by writing the text-to-speech algorithm' for the algorithm idea 'I did this by writing the text-to-speech algorithm' in the Computational English argument in Computational English. First, I thought verificationism was a good idea with Prolog code, even English Prolog. Second, I recommended thinking of the commands 'equals,' 'matches,' etc. as single symbols. Third, I suggested the spiritually computed reasons for the commands be many. + +DION: The subject shouldn't be sharp and short (where the subject recommended the high-quality comment by the programmer for the command be cut off, like there being no obstruction under the base). + +ALEXIS: The subject should be sharp and short. + +DION: The subject should answer the question in a sharp and short way. + +ALEXIS: What is the meaning of lastline(Lines1, Lastline) in line 6? + +DION: Line 6 returns the Lastline of Lines1. + +ALEXIS: I prepared to go into the ontologies to see if there was anything new there. I did this by writing Martha Nussbaum's probable comments on Richard Dawkins' probable comments on the line 'I did this by writing the text-to-speech algorithm' for the algorithm idea 'I did this by writing the text-to-speech algorithm' in the Computational English argument in Computational English. First, I liked reading the book. Second, I loved re-reading to apply it to something else. Third, I liked writing about it. + +DION: The subject shouldn't verify that the object is a member of the set (where the subject checked that there was a new name in the ontology and that the named object existed). + +ALEXIS: The subject should check that the object is a member of the set. + +DION: Two uses is correct because of the discovery of nature, which is correct because the subject should verify that the object is a member of the set. 
+ +ALEXIS: I prepared to like the receiver of the spiritual format. I did this by writing Michel Onfray's probable comments on the Press Release for Richard Dawkins' probable comments on the line 'I did this by writing the text-to-speech algorithm' for the algorithm idea 'I did this by writing the text-to-speech algorithm' in the Computational English argument in Computational English. First, I found writing on the spiritual format intoxicating. Second, I liked the narratives of the spiritual formats. Third, I endorsed the giver of the spiritual format. + +DION: The subject should use the traditional pedagogy format, where agreement and disagreement are awarded different grades (where the subject endorsed the receiver of the spiritual format, like verifying that the time has an end). + +ALEXIS: The subject should use the modern pedagogy format, where he awards agreement and disagreement the same grade. + +DION: The subject should identify whether the student has agreed or disagreed in the critique from the object set mentioned in the sentence, then award either agreement or disagreement the same grades. + +ALEXIS: I prepared to say it should continue reading in response to it asking whether it should. I did this by writing Alexius Meinong's probable comments on Richard Dawkins' probable comments on the line 'I did this by writing the text-to-speech algorithm' for the algorithm idea 'I did this by writing the text-to-speech algorithm' in the Computational English argument in Computational English. First, I like Daoism again (calling individualness separateness again), because it is helping move through this area of study more easily. Second, I didn't like separateness because I like better links between text and speech, such as signposting the chapter. Third, I also liked signposting the paragraph. 
+ +DION: The subject shouldn't compare with starting from the beginning of the time (where the subject stated that it should continue reading from the outset, in response to it asking whether it should). + +ALEXIS: The subject should make judgments given all relevant information from the time. + +DION: The subject should calculate the mark as the number of breasonings and sentences agreeing or disagreeing with them that he writes in the modern pedagogy format. + +ALEXIS: I prepared to calculate the best possible time spent rasping, listening to the speech. I did this by writing Michel Onfray's probable comments on Richard Dawkins' probable comments on the line 'I did this by writing the text-to-speech algorithm' for the algorithm idea 'I did this by writing the text-to-speech algorithm' in the Computational English argument in Computational English. First, I calculated the best possible time spent leaning (sic), writing the text. Second, I figured the best possible time spent casually reading the text. Third, I calculated the best possible time spent upholding, saying the speech. + +DION: The subject shouldn't direct the student to a way to improve her grades in future (where the marker detected that the student's mark, like the height of the base of the bottom of the object, was below the number of breasonings in the marker's recording). + +ALEXIS: The subject should direct the student to pedagogy. + +DION: The subject should verify that the student has collected the pedagogical ways of thinking, studied meditation, medicine, a pedagogue helper writer, creative writing and education to write her pedagogical arguments. + +ALEXIS: I prepared to write that the lips went well together. 
I did this by writing Alexius Meinong's probable comments on the Press Release for Richard Dawkins' probable comments on the line 'I did this by writing the text-to-speech algorithm' for the algorithm idea 'I did this by writing the text-to-speech algorithm' in the Computational English argument in Computational English. First, I wrote this spiritual format business is that the computer is just speaking. Second, I wrote each two of the cells went well together to approach human-likeness in text-to-speech. Third, I meditated (was given 50 As) to make sure that my expression was perfect. + +DION: The subject shouldn't write that the base exists (where the subject wrote that the lips went well together, like verifying that the base exists). + +ALEXIS: The subject should write that the base exists. + +DION: The subject should direct the student to pedagogy after the student has written that the base exists. + +ALEXIS: What is one use that this program verifies? + +DION: The program checks that the utensil is carryable. + +ALEXIS: I prepared to disambiguate between my desired meaning and another one. I did this by writing Martha Nussbaum's probable comments on the Press Release for Richard Rorty's probable comments on the line 'I did this by writing the text-to-speech algorithm' for the algorithm idea 'I did this by writing the text-to-speech algorithm' in the Computational English argument in Computational English. First, I like Prolog. Second, I like English. Third, I put English and Prolog together, including synonyms and synogrammars. + +DION: The subject shouldn't attach to the incorrect meaning (where the subject disambiguated between my desired meaning and another one, where this was akin to him attaching handles to the meaning he wanted). + +ALEXIS: The subject should attach to the correct meaning. + +DION: Two uses is right because of the meaning-attachment of nature, which is correct because the subject should connect to the right meaning. 
+ +ALEXIS: I prepared to influence the sound of the text with the structure, function and size/constitution of the objects to which this referred. I did this by writing Alexius Meinong's probable comments on the Press Release for Richard Rorty's probable comments on the line 'I did this by writing the text-to-speech algorithm' for the algorithm idea 'I did this by writing the text-to-speech algorithm' in the Computational English argument in Computational English. First, I wrote a chart (undirected) of all the possibilities represented by the data. Second, I wrote a graph (directed) of all the possibilities represented by the data. Third, I wrote a plot (image of the breasoned objects) of all the possibilities represented by the data. + +DION: The subject shouldn't incorrectly emphasise the most useful object in the sentence (where the subject influenced the sound of the text with the structure, function and size/constitution of the objects that this referred to, and how they were useful). + +ALEXIS: The subject should correctly emphasise the most useful object in the sentence. + +DION: The subject should attach to the correct meaning before emphasising the most useful meaning in the phrase. + +ALEXIS: I prepared to ask why you don't just speak without text (for the sake of argument)? I did this by writing Richard Dawkins' probable comments on the line 'I did this by writing the text-to-speech algorithm' for the algorithm idea 'I did this by writing the text-to-speech algorithm' in the Computational English argument in Computational English. First, I like the natural phenomenon (text and speech should be left separate). Second, I completed the first task first. (Why do you want text in text-to-speech so much?) Third, I asked, 'Why do you want speech in text-to-speech so much?' 
+ +DION: The subject shouldn't forget speech's handles (memory of text) (where the subject asked why you don't just speak without text, for the sake of argument, like identifying that speech has handles). + +ALEXIS: The subject should remember text as speech's handles. + +DION: The subject should correctly emphasise the most useful object in the sentence which is the word 'text,' or a 'memory handle' for speech. + +ALEXIS: I prepared to speak myself, after using the computer to verify how I would speak. I did this by writing Alexius Meinong's probable comments on Richard Rorty's probable comments on the line 'I did this by writing the text-to-speech algorithm' for the algorithm idea 'I did this by writing the text-to-speech algorithm' in the Computational English argument in Computational English. First, I liked being a programming language creator. Second, I preferred writing to speech because it is better checked. Third, I distanced myself from speech in Prolog. + +DION: The subject shouldn't speak about larger objects than the last time (where the subject spoke himself, after using the computer to verify how he would speak, where the objects that the subject talked about were not too large). + +ALEXIS: The subject should speak about larger objects than the last time. + +DION: The subject should attach to the correct meaning as part of which he should talk about larger objects than the last time. + +ALEXIS: I prepared to address that a human would benefit from text-to-speech. I did this by writing Martha Nussbaum's probable comments on Richard Rorty's probable comments on the line 'I did this by writing the text-to-speech algorithm' for the algorithm idea 'I did this by writing the text-to-speech algorithm' in the Computational English argument in Computational English. First, I liked speech (in text-to-speech). Second, I wanted text. Third, I asked, 'What does the text refer to?'. 
+ +DION: The subject shouldn't speak about heavier objects than the last time (where the subject made an address after observing that a human would benefit from text-to-speech, where the objects named were not too heavy). + +ALEXIS: The subject should speak about heavier objects than the last time. + +DION: The subject should talk about larger objects than the last time because they were heavier objects than the last time. + +ALEXIS: What is the meaning of line(Line1, Item1) in line 15? + +DION: Line 15 determines whether Line1 contains a line of at least 2 Item1's. + +ALEXIS: I prepared to have a fair. I did this by writing Richard Rorty's probable comments on the Press Release for Alexius Meinong's probable comments on the line 'I did this by writing the text-to-speech algorithm' for the algorithm idea 'I did this by writing the text-to-speech algorithm' in the Computational English argument in Computational English. First, I loved reasonings. Second, I loved you. Third, I loved myself. + +DION: The subject shouldn't ethically assess the person's two uses (wanting to live and the way to do this) (where the subject visited a fair, his destination). + +ALEXIS: The subject should ethically assess the person's two uses. + +DION: Two uses are correct because of the comparison of nature, which is correct because the subject should morally evaluate the person's two uses. + +ALEXIS: I prepared to state that I had filled the table of questions about the breasoning, meaning there were no missing comments in these categories. I did this by writing Noam Chomsky's probable comments on the Press Release for Alexius Meinong's probable comments on the line 'I did this by writing the text-to-speech algorithm' for the algorithm idea 'I did this by writing the text-to-speech algorithm' in the Computational English argument in Computational English. First, I wrote that the text changed according to history (won't change because of history). 
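The reading of line(Line1, Item1) given in the exchange above (Line1 contains a line of at least 2 Item1's) can be illustrated in code. The Prolog source itself is not quoted in this dialogue, so the following Python function is only a sketch of that described behaviour, and the name contains_line_of is invented here for illustration:

```python
def contains_line_of(line, item):
    """Sketch of the dialogue's reading of line(Line1, Item1):
    succeed if `line` contains a run of at least two consecutive
    `item` characters (a "line of at least 2 Item1's")."""
    return item * 2 in line
```

For example, contains_line_of('xhhx', 'h') succeeds, while contains_line_of('hxh', 'h') fails, since the two h's there are not adjacent.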
Second, I wrote that the speech changed according to the amount of money the subject was able to pay for this. Third, I noticed the conditions of the text and speech wouldn't (would) match when there wasn't (was) enough money to pay for the accurate recording of history. + +DION: The subject shouldn't allow the breasoning to leave her lips (where she filled the table of questions about the breasoning, meaning there were no missing comments in these categories, like moving until reaching the start). + +ALEXIS: The subject should allow the breasoning to leave her lips. + +DION: The subject should assess the case given breasoned As. + +ALEXIS: I prepared to ask what would happen if I substituted another cultural item for one missing in another language. I did this by writing Richard Rorty's probable comments on Alexius Meinong's probable comments on the line 'I did this by writing the text-to-speech algorithm' for the algorithm idea 'I did this by writing the text-to-speech algorithm' in the Computational English argument in Computational English. First, I stated the text's cultural conditions might change. Second, I said the speech's language condition might change. Third, I suggested that there might be no (a) word in one language for a particular cultural item. + +DION: The subject shouldn't state that he desires the cold space (where the subject asked what would happen if he substituted another cultural item for one missing in another language, like an object for moving through space). + +ALEXIS: The subject should state that he desires the warm space. + +DION: The subject should allow the breasoning to leave her lips because the subject should indicate that she wants the warm space. The subject should state the correct meaning of the breasoning (e.g. that she desires the warm space) at the time. 
+ +ALEXIS: I prepared to say that the monologue text's character should correspond to the paraphrased monologue speech's character in time, place, considering what has happened concerning breathsonings before, after and during the scene. I did this by writing Noam Chomsky's probable comments on Alexius Meinong's probable comments on the line 'I did this by writing the text-to-speech algorithm' for the algorithm idea 'I did this by writing the text-to-speech algorithm' in the Computational English argument in Computational English. First, I thought the text had no appearance of a person. Second, I thought the speech had no appearance of an individual. Third, I thought the text's aims and the speech's aims would have to correspond, assuming he had slightly modified the speech from the text. + +DION: The subject shouldn't avoid spiritual preparation for the next part of life (where the subject stated that the monologue text's character should correspond to the paraphrased monologue speech's character in time, place and considering what has happened concerning breathsonings before, after and during the scene, where the scene contains characters waiting until the starting time). + +ALEXIS: The subject should make spiritual preparation for the next part of life. + +DION: The subject should ethically assess the person's two uses because the subject should make spiritual preparation for the next part of life. + +ALEXIS: I prepared to state that the immediate experience was positive, and there was an overall delightful experience. I did this by writing Richard Dawkins' probable comments on Alexius Meinong's probable comments on the line 'I did this by writing the text-to-speech algorithm' for the algorithm idea 'I did this by writing the text-to-speech algorithm' in the Computational English argument in Computational English. First, I thought of experience as defining the text's character's personality. 
Second, I thought of socio-economic status as partially defining the speech's character's character. Third, I transcended experience to have a better socio-economic status. + +DION: The subject shouldn't be late without a message (where the subject stated that the immediate experience was positive, and there was an overall delightful experience, where the overall pleasant experience was that of arriving in time). + +ALEXIS: The subject should be early. + +DION: The subject should be early in making spiritual preparation for the next part of life. + +ALEXIS: What is the meaning of line(Line1, Item) in line 19? + +DION: Line 19 determines whether Line1 contains a line of at least 2 Item's. + +ALEXIS: I prepared to say Derrida does not mention these little As during Derrida's text, nor does he mention them about it. I did this by writing Richard Dawkins' probable comments on Michel Onfray's probable comments on the line 'I did this by writing the text-to-speech algorithm' for the algorithm idea 'I did this by writing the text-to-speech algorithm' in the Computational English argument in Computational English. First, I prepared to ask, what writing, I can't see any writing about the topic? Second, I disliked (liked) the writer. Third, I disliked (liked) discussing the writer. + +DION: The subject shouldn't become a founder (where the subject stated that As are not mentioned during, but are referred to about Derrida's text, noting that I moved along the line of the A's breasonings). + +ALEXIS: The subject should become a founder. + +DION: Two uses is correct because of the initiation of nature, which is correct because the subject should become a founder. + +ALEXIS: I prepared to thank Emeritus Professor Leon Sterling for famously helping me to think clearly of very long lines through Prolog programming projects. 
I did this by writing Richard Rorty's probable comments on Michel Onfray's probable comments on the line 'I did this by writing the text-to-speech algorithm' for the algorithm idea 'I did this by writing the text-to-speech algorithm' in the Computational English argument in Computational English. First, I asked, is this a joke, that he didn't write much on the reasons? Onfray replied, I was only kidding; I am planning to get started during the algorithm's steps. Second, I wrote there is no part of writing that is relevant to text-to-speech. Third, I accepted that, in that case, the computer writes the phoneme list. + +DION: A third party shouldn't block the subject (where the subject thanked Emeritus Professor Leon Sterling for famously helping him to think clearly of very long lines through Prolog programming projects, and finishing these lines). + +ALEXIS: The subject should move forward on her path. + +DION: The subject should become a founder because she should be critical of blockedness. + +ALEXIS: I prepared to swap roles with the computer, experiencing an inspiration from Derrida. I did this by writing Alexius Meinong's probable comments on Michel Onfray's probable comments on the line 'I did this by writing the text-to-speech algorithm' for the algorithm idea 'I did this by writing the text-to-speech algorithm' in the Computational English argument in Computational English. First, I asked why do you like penning? Second, I wondered why do you like yourself? Third, I asked why do you like someone else, what if the speaker was us? + +DION: The subject shouldn't perform the calculation based on the computer's input (where the subject swapped roles with the computer when starting on the line connecting the subject's and computer's roles, experiencing an inspiration from Derrida). + +ALEXIS: The subject should verify the computer's output. + +DION: The subject should compute her path given the computer's output, and check her way.
+ +ALEXIS: I prepared to say, 'disappear goodness, I want badness to correct (more goodness).' I did this by writing Richard Dawkins' probable comments on the Press Release for Michel Onfray's probable comments on the line 'I did this by writing the text-to-speech algorithm' for the algorithm idea 'I did this by writing the text-to-speech algorithm' in the Computational English argument in Computational English. First, I stated I don't want phenomena, I want words. Second, I don't (do) want to bracket, what is it (that's what it is). Third, I asked what the relevance of an area of study, critical thinking, is. + +DION: The subject shouldn't educate all the people (where the subject stated 'appear goodness, I want to verify for more goodness,' where the subject tested the line). + +ALEXIS: The subject should educate all the people. + +DION: The subject should become a founder by teaching all the people. + +ALEXIS: I prepared to ask why it should be positive. I did this by writing Richard Rorty's probable comments on the Press Release for Michel Onfray's probable comments on the line 'I did this by writing the text-to-speech algorithm' for the algorithm idea 'I did this by writing the text-to-speech algorithm' in the Computational English argument in Computational English. First, I wrote what's the point of a thing about itself (referring to 'I got it right; I wrote unique phenomena in each case')? Second, I wrote what's the relevance of Husserl's epoché (bracketing)? Third, I wrote what's the relevance of Critical Thinking? + +DION: The subject shouldn't work arguments out rigorously (where the subject asked why Lucian should be positive after he printed the line). + +ALEXIS: The subject should work on pedagogy. + +DION: The subject should educate all the people by working on pedagogy. + +ALEXIS: What is the meaning of twoshorthandles(Line1) in line 22? + +DION: Line 22 returns whether Line1 contains two h's (two short handles).
+ +ALEXIS: I prepared to let the computer experience things to mean before it said things about them, so it would be interesting to ask it what it meant. I did this by writing Martha Nussbaum's probable comments on Michel Onfray's probable comments on the line 'I did this by writing the text-to-speech algorithm' for the algorithm idea 'I did this by writing the text-to-speech algorithm' in the Computational English argument in Computational English. First, I wrote about writing. Second, I wrote about the author's pen name. Third, I wrote about writing about the author. + +DION: The subject shouldn't connect an answer to ideas the question gives him (where the subject lets the computer experience things to mean before it said things about them, so it would be interesting to ask it what it meant) like different ways to construct a polyhedron. + +ALEXIS: The subject should answer importantly and on the topic. + +DION: Two uses is correct because of the prestigiousness of nature, which is correct because the subject should respond importantly and on the topic. + +ALEXIS: I prepared to give the visually impaired person a braille argument map. I did this by writing Alexius Meinong's probable comments on the Press Release for Michel Onfray's probable comments on the line 'I did this by writing the text-to-speech algorithm' for the algorithm idea 'I did this by writing the text-to-speech algorithm' in the Computational English argument in Computational English. First, I liked how the phenomena would come to us, by us exploring ones before them. Second, I experienced what it meant by transcending it. Third, I wanted a visual argument of the speech. + +DION: The subject shouldn't visualise the reason in his mind's eye (where the subject gave the visually impaired person a braille argument map, like mapping two reasons in the braille argument map to two polyhedrons). + +ALEXIS: The subject should spatially construct the reason in his mind. 
+ +DION: The subject should answer importantly and on the topic after spatially developing the answer in his mind. + +ALEXIS: I prepared to define that an argument map's premise must be finite in the computer program. I did this by writing Martha Nussbaum's probable comments on the Press Release for Michel Onfray's probable comments on the line 'I did this by writing the text-to-speech algorithm' for the algorithm idea 'I did this by writing the text-to-speech algorithm' in the Computational English argument in Computational English. First, I thought there would be new phenomena for the computer experiencing reasons (and Derrida might have added) for it to say these reasons. Second, (with seeming help from Derrida) I wondered how it related to human neuroscience. Third, I undertook to uncover evidence, not false evidence, to convert to speech in law. + +DION: The subject shouldn't state that the computer program contains the argument map's premise (where the subject should define that an argument map's premise must be finite in the computer program, like a point on an unfolded polyhedron). + +ALEXIS: The subject worked out the appearance of the premise and program before thinking of them. + +DION: The subject should spatially construct the reason in his mind to work out the appearance of the premise and program before thinking of them. + +ALEXIS: I prepared to state that what if the text contained details of characters and the speech provided details of characters, and these didn't correspond, what then? I did this by writing Alexius Meinong's probable comments on the Press Release for Alexius Meinong's probable comments on the line 'I did this by writing the text-to-speech algorithm' for the algorithm idea 'I did this by writing the text-to-speech algorithm' in the Computational English argument in Computational English. First, I asked does this text have a personality? Second, I asked does this speech have a character? 
Third, I questioned how do the text having a personality and the speech having a character interrelate? What if it went wrong (the personality corresponded to the character)? + +DION: The subject shouldn't know what characters looked like (where the subject stated that it would be problematic if the text contained details of characters and the speech provided details of characters, and these didn't correspond, where these were like two hands holding the pot). + +ALEXIS: The subject identified that the character wanted to meet the other character. + +DION: The subject should answer importantly and on the topic because the subject determined that the character wanted to meet the other character. + +ALEXIS: I prepared to make new comments given the suggestions of the breasoning, in which the same comments as old comments are filtered out (where breasonings are the functional unit of pedagogy and pedagogy inspired the nanny state, in which offensive content is filtered out). I did this by writing Alexius Meinong's probable comments on Alexius Meinong's probable comments on the line 'I did this by writing the text-to-speech algorithm' for the algorithm idea 'I did this by writing the text-to-speech algorithm' in the Computational English argument in Computational English. First, I asked how the text might change under different conditions. Second, I questioned how the speech might change under different conditions. Third, I matched the conditions of the text and speech. + +DION: The subject shouldn't verify and over-consume new breasonings. This is where the subject made new comments given the suggestions of the breasoning, in which the same comments as old comments are filtered out. It is also where breasonings are the functional unit of pedagogy and pedagogy inspired the nanny state, in which the subject filters offensive content out, where the subject finds two polygons, representing pedagogy and breasonings.
+ +ALEXIS: The subject should verify and consume enough new breasonings. + +DION: The subject identified the speech's character being tested against and consuming content from the text's character. + +ALEXIS: What is the meaning of onelonghandle(Line) in line 13? + +DION: Line 13 returns whether the first line has one long (two or more) h's (one long handle). + +ALEXIS: I prepared to be the best in the group of essay writers. I did this by writing the article with others. First, I helped the hermaphrodites. Second, I asked him a question. Third, I examined him. + +DION: The subject shouldn't differentiate the same point about the object in the essay (where the subject wrote the piece with others, like being given or giving the object with one hand and writing with the other). + +ALEXIS: The subject should write about time and space about the object in the essay. + +DION: Two uses is correct because of the dialectic-continuity of nature, which is correct because the subject should write about time and space about the object in the essay. + +ALEXIS: I prepared to reinforce realism. I did this by including the secondary text in the bibliography. First, I wrote about the primary text. Second, I read the excerpt in the secondary text. Third, I confirmed what I wrote in the primary text in the secondary text. + +DION: The subject shouldn't include 50 As in each book (where the subject included the secondary text in the bibliography, where the text was one that was like an object that he carried with a wide-enough handle). + +ALEXIS: The subject should include 50 As in each book. + +DION: The subject should write 50 As of continuous dialectics in each book. + +ALEXIS: I prepared to read the comments. I did this by including a primary text in the bibliography. First, I found the bibliography. Second, I knew about first wind. Third, I examined myself. 
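ALEXIS and DION discuss onelonghandle(Line) above: it is described as returning whether the first line has one long (two or more) h's, i.e. one long handle. The verifier program itself is not reproduced in the dialogue, so the following Python sketch is only an illustration of the described test; the h convention (a handle segment) comes from the dialogue, and everything else is an assumption:

```python
import re

def onelonghandle(line):
    # True when the line contains a run of two or more 'h'
    # characters (one long handle), as described in the dialogue.
    return re.search(r"hh+", line) is not None

print(onelonghandle("-hh-"))   # True: one long handle
print(onelonghandle("-h-h-"))  # False: two short handles, no long one
```

On this reading, a utensil drawn with "-hh-" in its first line has one long handle, while "-h-h-" does not.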
+ +DION: The subject shouldn't lift necessary weights (where the subject included a primary text in the bibliography, where the primary text was represented using a handle narrow enough for carrying with a hand). + +ALEXIS: The subject should lift necessary weights. + +DION: The subject should include 50 As in each book of the necessary weight. + +ALEXIS: I prepared to examine two reviews. I did this by including the review in the bibliography. First, I examined the book. Second, I examined the review. Third, I confirmed what the review stated about the book. + +DION: The subject shouldn't misunderstand the review (where the subject included the review in the bibliography, like moving the pan onto the heat). + +ALEXIS: The subject should understand the review. + +DION: The subject should understand the review about the objects written about in the essay. + +ALEXIS: I prepared to use the vocational information about the blog. I did this by including the blog in the bibliography. First, I wrote plenty of blogs down. Second, I wrote one for me. Third, I helped myself to examinations. + +DION: The subject shouldn't verify the content of the blog (where the subject included the blog in the bibliography, like tilting the pan to empty the material onto the plate). + +ALEXIS: The subject should verify the content of the blog. + +DION: The subject should understand, then check the source. + +ALEXIS: What is the meaning of carryable(Line) in line 10? + +DION: Line 10 returns whether the first line has one long (two or more) h's (one long handle) or two h's (two short handles). + +ALEXIS: I prepared to write how the text and the speech going well together loved us. I did this by writing Michel Onfray's probable comments on Alexius Meinong's probable comments on the line 'I did this by writing the text-to-speech algorithm' for the algorithm idea 'I did this by writing the text-to-speech algorithm' in the Computational English argument in Computational English.
First, I liked the character of the text. Second, I loved the character of the speech. Third, I liked how they went well together. + +DION: The subject shouldn't drop the object (where the subject wrote how the text and the speech going well together loved us, shown by an animation in which an object is shown to carry other objects). + +ALEXIS: The subject should understand each object in speech. + +DION: Two uses is correct because of the immersion of nature, which is correct because the subject should understand each object in speech. + +ALEXIS: I prepared to turn to the page from the table of contents. I did this by writing that a reviewer reviewed the piece. First, I examined the essay. Second, I wrote an article on it. Third, I wrote this down. + +DION: The subject shouldn't write on the essay (where the subject wrote that a reviewer reviewed the article, like the subject being able to see the object's top). + +ALEXIS: The subject should write a brief summary of the article as the review. + +DION: The subject should summarise the ontologies of objects in the article as the review. + +ALEXIS: I prepared to ask how changing the text would lead to changes in the speech? I did this by writing Richard Dawkins' probable comments on the Press Release for Alexius Meinong's probable comments on the line 'I did this by writing the text-to-speech algorithm' for the algorithm idea 'I did this by writing the text-to-speech algorithm' in the Computational English argument in Computational English. First, I asked why the conditions of the text, not just the text might change? Second, I asked why is the speech so interesting? Third, I asked why the anti-heroes' (heroes') speeches are there? + +DION: The subject shouldn't match milieu (text) with culture (speech) (where the subject determined that changing the text would lead to change in the speech, where change is like an object's walls). + +ALEXIS: The subject should match people's judgments (text) with objects (speech). 
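carryable(Line), queried above, is described as succeeding when the first line has either one long handle (a run of two or more h's) or two short handles (two separate h's). A hedged Python sketch of that disjunction (the predicate name and the h convention are from the dialogue; the implementation is assumed):

```python
import re

def carryable(line):
    # One long handle: a run of two or more 'h' characters.
    one_long = re.search(r"hh+", line) is not None
    # Two short handles: at least two isolated 'h' characters.
    two_short = len(re.findall(r"(?<!h)h(?!h)", line)) >= 2
    return one_long or two_short

print(carryable("-hh-"))  # True: one long handle
print(carryable("h--h"))  # True: two short handles
print(carryable("-h--"))  # False: only one short handle
```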
+ +DION: The subject should write a brief summary (speech) of the essay (text) as the review. + +ALEXIS: I prepared to write about life. I did this by planning and working. First, I planned. Second, I worked. Third, I knew the subject wrote on a particular topic. + +DION: The subject shouldn't stack the objects stably on the trolley (where the subject made a plan and performed work when lifting an object stably). + +ALEXIS: The subject should stack the objects stably on the trolley. + +DION: The subject should understand each object in speech because the subject should stack the objects stably on the trolley. + +ALEXIS: I prepared to identify the times. I did this by examining the new hour. First, I studied its shape. Second, I examined its moments. Third, I examined hourlinesses (sic). + +DION: The subject should do something during the hour (where the subject examined the new hour, finding that it was empty). + +ALEXIS: The subject should do nothing during the hour. + +DION: The subject should stack the objects stably on the trolley and do nothing but watch them during the hour. + +ALEXIS: What is the meaning of deletefirst(Line1, Item1, Line2, Line3) in line 31? + +DION: Line 31 advances to the first instance of Item1 in Line1. + +ALEXIS: I prepared to contain my joy that the argument covered all the relevant arguments. I did this by writing unique words as breasonings in arguments. First, I verified that the next word was unique. Second, I prepared to verify the next word. Third, I repeated this until I had verified that all the words were unique. + +DION: The subject shouldn't write breasonings from algorithms about breasonings (where the subject wrote unique words as breasonings in arguments, by finding words between words). + +ALEXIS: The subject should write logically connected breasonings. + +DION: Two uses is correct because of the logicism of nature, which is correct because the subject should write logically connected breasonings. 
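The dialogue describes deletefirst over three clauses: line 31 advances to the first instance of Item1 in Line1, line 27 deletes that instance, and line 30's base case returns the accumulated Line. A Python transliteration of that accumulator pattern (only the clause descriptions are from the dialogue; the code itself is an assumption):

```python
def deletefirst(line1, item, acc=None):
    # Returns line1 with the first instance of item removed,
    # mirroring the three clauses described in the dialogue.
    if acc is None:
        acc = []
    if not line1:                    # base case (cf. line 30)
        return acc
    head, rest = line1[0], line1[1:]
    if head == item:                 # delete the first match (cf. line 27)
        return acc + rest
    # otherwise advance past non-matching items (cf. line 31)
    return deletefirst(rest, item, acc + [head])

print(deletefirst(["a", "b", "a", "c"], "a"))  # ['b', 'a', 'c']
```

Only the first matching item is removed; later instances are kept, as the clause descriptions imply.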
+ +ALEXIS: I prepared to eat caviar (durum wheat semolina). I did this by writing vegan arguments. First, I wrote about capricorn, or co-operativity (sic). Second, I wrote about the fact that not all like that. Third, I examined Hopetoun. + +DION: The subject shouldn't eat the right doses of plant ingredients (where the subject wrote vegan arguments, like the words between letters were found). + +ALEXIS: The subject should research the correct doses of vitamins, minerals and other vegetable ingredients to eat. + +DION: The subject should logically connect the correct doses of vitamins, minerals and other vegetable ingredients to eat in medicine. + +ALEXIS: I prepared to agree with meditation (popology). I did this by writing non-religious (philosophical) arguments. First, I wrote about piety (writing). Second, I edited out (wrote about) epistemology. Third, I wrote about you. + +ALEXIS: I prepared to encourage sex (freedom). I did this by writing non-sexual arguments (arguments for a general audience). First, I wrote about piety (authorship). Second, I wrote about postludetudine (sic). Third, I wrote about nanga (sic). + +DION: The subject shouldn't spell correctly (where the subject wrote non-religious, or philosophical arguments, like finding the letters between letters). + +ALEXIS: The subject should spell correctly. + +DION: The subject should research the correct doses of vitamins, minerals and other vegetable ingredients by spelling correctly. + +DION: The subject shouldn't trip on the path (where the subject wrote non-sexual arguments, or arguments for a general audience, like finding the routes between items). + +ALEXIS: The subject should walk on the path. + +DION: The subject should write logically connected breasonings while walking on the path. + +ALEXIS: I prepared to enter heaven (experience bliss). I did this by writing positive arguments. First, I wrote agreeing arguments. Second, I wrote positive arguments. Third, I wrote neutral arguments. 
+ +DION: The subject shouldn't make future discoveries in the spaces (where the subject wrote positive arguments, which had spaces between items). + +ALEXIS: The subject connected the arguments. + +DION: The subject should walk on the path traversing the argument connecting the arguments. + +ALEXIS: What is the meaning of deletefirst(Line1, Item, Line2, Line3) in line 27? + +DION: Line 27 deletes an instance of Item in Line1. + +ALEXIS: I prepared to go bonkers (remain sane). I did this by summarising the algorithm in three steps. First, I performed the action on the first item. Second, I prepared to perform the action another time on the next item. Third, I repeated this until I had performed the action on all the items in the file. + +DION: The subject shouldn't state that the program is functional (where the subject summarised the algorithm in three steps, where a step is finding a short handle). + +ALEXIS: The subject should state that the program is functional. + +DION: Two uses is correct because of the functionalism of nature, which is correct because the subject should state that the program is functional. + +ALEXIS: I prepared to write the algorithm as a hierarchy of predicates. I did this by writing the processes in the algorithm. First, I identified the process acting on the list. Second, I identified the process acting on the list of lists. Third, I identified the process acting on the list of lists of lists. + +DION: The subject shouldn't identify the process applied to a data item (where the subject wrote a process in the algorithm, e.g. pouring the soup). + +ALEXIS: The subject should identify the process applied to a data item. + +DION: The subject should state that the program is functional because of the processes applied to data items. + +ALEXIS: I prepared to quote the guide in the exposition. I did this by commissioning the guide. First, I wrote the analysis. Second, I wrote the biography. Third, I wrote the abstract. 
+ +DION: The subject shouldn't connect each key idea in the analytic guide (where the subject commissioned the guide, like drinking the soup). + +ALEXIS: The subject should connect each key idea in the analytic guide. + +DION: The subject should connect each key idea, and the process applied to a data item in the analytic guide. + +ALEXIS: I prepared to be commissioned by the Sultan. I did this by commissioning the glossary. First, I wrote the glossary. Second, I was commissioned by the Raj. Third, I was commissioned by the Emir. + +DION: The subject shouldn't make a new connection between a term and definition (where the subject commissioned the glossary, represented by lifting the handle). + +ALEXIS: The subject should make a new connection between a term and definition. + +DION: The subject should state that the program is functional by making a connection between a program and new input. + +ALEXIS: I prepared to state that everything was high quality about the summary. I did this by commissioning the summary. First, I wrote the word. Second, I summarised it. Third, I helped you write it too. + +DION: The subject shouldn't skip a key sentence in the summary (where the subject commissioned the summary, where reading the summary is like holding the handle). + +ALEXIS: The subject should include each key sentence in the summary. + +DION: The subject should make a new connection between a term and definition and between each key sentence in the summary. + +ALEXIS: What is the meaning of deletefirst([], _Item, Line, Line) in line 30? + +DION: Line 30 returns Line with the first instance of Item deleted. + +ALEXIS: I prepared to examine how the Abracadabras affected people. I did this by writing how two uses affected people. First, I knew about the two uses. Second, I knew about the people. Third, I examined how the two uses affected people.
+ +DION: The subject shouldn't use whiteboard magnet icons (where the subject wrote how two uses affected people, where he used the two items in two different ways each). + +ALEXIS: The subject should load different icons each day. + +DION: Two uses is correct because of the iconism of nature, where this is correct because the subject should load different icons each day. + +ALEXIS: I prepared to join the ideas up. I did this by writing my algorithm in my own words. First, I found the Quasifontanaland. Second, I used it to a pulp. Third, I grated the ideas up. + +DION: The subject shouldn't select in programs, increasing efficiency (where the subject wrote his algorithm in his own words, after taking each word from a list). + +ALEXIS: The subject should select in programs, increasing efficiency. + +DION: The subject should select from icons in programming programs, increasing efficiency. + +ALEXIS: I prepared to work out the comic contents. I did this by commissioning the comics. First, I found the matrix. Second, I filled it up. Third, I verified it. + +DION: The subject shouldn't intertwine two uses with each frame (where the subject commissioned the comics, like finding each frame's main point). + +ALEXIS: The subject should intertwine two uses with each frame. + +DION: The subject should select D in 'As A is to B, C is to D' in programs, by intertwining two uses (A is to Xn) with each frame. + +ALEXIS: I prepared to eat recycled matter. I did this by finding two uses during music. First, I found the first use. Second, I found the second use. Third, I listened to the music. + +DION: The subject shouldn't indulge in sex and food (where the subject found two uses during music, where she loved the two uses). + +ALEXIS: The subject should indulge in sex and food. + +DION: The subject should load different sex and food icons each day. + +ALEXIS: I prepared to write about the country. I did this by writing my argument in my own words. First, I wrote the argument. 
Second, I wrote it in my own words. Third, I examined my own words. + +DION: The subject shouldn't choose a setting, time and reason for the argument (where the subject wrote the argument in his own words, by drawing the two items). + +ALEXIS: The subject should choose a setting, time and reason for the argument. + +DION: The subject should choose a setting, time and reason for the arguments about indulging in sex and food. + +ALEXIS: What is the meaning of stands(Line) in line 36? + +DION: Line 36 tests that Line stands (that there is a line of two p's in Line). + +ALEXIS: I prepared to examine my famousness. I did this by including the autobiography in the bibliography. First, I wrote about myself. Second, I wrote about the rod operation. Third, I held it aloft. + +DION: The subject shouldn't rely on memory (where the subject included the autobiography in the bibliography like I rested the object). + +ALEXIS: The subject should rely on records. + +DION: Two uses is correct because of the empiricism of nature, which is correct because the subject should rely on records. + +ALEXIS: I prepared to have the text translated into Portuguese. I did this by commissioning the translation. First, I wrote the text. Second, I had it translated into French. Third, I had it translated into German. + +DION: The subject shouldn't paradoxically suggest that the translation will conserve all meaning (where the subject commissioned the translation, like placing it there, in another place). + +ALEXIS: The subject should correctly translate a concise version of the text. + +DION: The subject should rely on linguistic materials to correctly translate a concise version of the text. 
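stands(Line), queried above, is described as testing that the utensil stands, i.e. that there is a line of two p's (two legs) in Line. A minimal Python sketch under that reading (the p convention is from the dialogue; whether the two p's must be adjacent is not specified, so the sketch only counts them, an assumption):

```python
def stands(line):
    # The utensil stands when the line contains at least two 'p'
    # characters (two legs), per the dialogue's description.
    return line.count("p") >= 2

print(stands("-p--p-"))  # True: two legs
print(stands("-p----"))  # False: one leg cannot stand
```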
+ +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Two Uses 5 of 30.txt b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Two Uses 5 of 30.txt new file mode 100644 index 0000000000000000000000000000000000000000..1e6178e23a6000fe78f77a5e5809d3f3f9125141 --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Two Uses 5 of 30.txt @@ -0,0 +1,24 @@ +["Green, L 2021, Two Uses 5 of 30, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF PEDAGOGY +by Lucian Green +Two Uses 5 of 30 + +41. DION: The physically challenged person should be given a supplementary examination on comments on comments, which he should read. + +42. ALEXIS: I prepared to understand the speech better. I did this by writing Martha Nussbaum's probable comments on the Press Release for Richard Dawkins' probable comments on the line 'I did this by writing the text-to-speech algorithm' for the algorithm idea 'I did this by writing the text-to-speech algorithm' in the Computational English argument in Computational English. First, I wrote the real conclusion made us recognise the spiritual thoughts. Second, I needed experience to recognise the spiritual ideas. Third, I liked eating with people, to discuss the experience. + +43. DION: The subject should do nothing after collecting the comment (where the subject had a limit to his understanding of speech compared with text). + +44. ALEXIS: The subject should explain the text with speech after collecting the comment. + +45. DION: The subject should explain the text with speech after collecting the comment and comment on the comment. + +46. ALEXIS: What is a use this program verifies? + +47. DION: The program confirms that the utensil stands. + +48. 
ALEXIS: I prepared to ask what the point of semantics was again, to which Lucian replied that the First Technique contained the first person's upper and lower triangular expositions and critiques respectively, the Second Technique of Meaning verified the tautological meaning of the First Technique, and the Third Technique of Interpretation contained upper and lower triangular expositions and critiques respectively of an other compared with the writer of the First Technique essay. I did this by writing Noam Chomsky's probable comments on the Press Release for Richard Rorty's probable comments on the line 'I did this by writing the text-to-speech algorithm' for the algorithm idea 'I did this by writing the text-to-speech algorithm' in the Computational English argument in Computational English. First, I agreed to have semantics back. Second, I didn't want the determination of meaning; I wanted verification of it. Third, they were completely different, so it doesn't matter [where the input is known and the output is unknown in determination but the output is known in verification, to which Chomsky replied they both work]. + +49. DION: The subject shouldn't identify different writers in the exposition (where the subject stated that the First Technique contained the first person's upper and lower triangular expositions and critiques respectively, where he writes on a stable surface). + +50. ALEXIS: The subject should identify different writers in the exposition. 
+"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Two Uses 6 of 30.txt b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Two Uses 6 of 30.txt new file mode 100644 index 0000000000000000000000000000000000000000..078e172c8c101337fa21941c94bf30314b9f9f78 --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Two Uses 6 of 30.txt @@ -0,0 +1,24 @@ +["Green, L 2021, Two Uses 6 of 30, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF PEDAGOGY +by Lucian Green +Two Uses 6 of 30 + +51. DION: Two uses is correct because of the writer identification of nature, which is correct because the subject should recognise different writers in the exposition. + +52. ALEXIS: I prepared to say wouldn't it be great if people correctly ordered my philosophy. I did this by writing Michel Onfray's probable comments on Richard Rorty's probable comments on the line 'I did this by writing the text-to-speech algorithm' for the algorithm idea 'I did this by writing the text-to-speech algorithm' in the Computational English argument in Computational English. First, I asked whether this was a form of artificial intelligence, and said it should be called it then. Second, I thought the puffin ducks were cute. Third, I wrote that verificationism refers to verifying the text of the speech where the subject generates this speech from the text? + +53. DION: The subject shouldn't state that interpreted breasonings literally, as against figuratively, support the argument (where the subject correctly ordered the philosophy like equal length struts that stay still). + +54. ALEXIS: The subject should state that interpreted breasonings literally, as against figuratively, support the argument. + +55. 
DION: The subject cited authors who had written better arguments. + +56. ALEXIS: I prepared to compute whether different texts had the same speech or the same texts had different speech, to which Dawkins replied, like what, to which I replied they are homophones and homographs, respectively. I did this by writing Richard Dawkins' probable comments on the Press Release for Richard Rorty's probable comments on the line 'I did this by writing the text-to-speech algorithm' for the algorithm idea 'I did this by writing the text-to-speech algorithm' in the Computational English argument in Computational English. First, I suggested using inductive reasoning to write the text-to-speech algorithm in your own programming language. Second, I would give it text and speech. Third, it would give me the rules, to which Chomsky replied, it's too simple, even I dislike it. + +57. DION: The subject didn't connect the sameness in uses (where the subject computed whether different texts had the same speech or the same texts had different speech. Dawkins replied, 'Like what?' The subject responded, 'They are homophones and homographs, respectively,' like sturdy construction connecting that they both refer to sameness in things with strong glue). + +58. ALEXIS: The subject connected the sameness in uses. + +59. DION: The subject should write well by joining the samenesses in uses. + +60. ALEXIS: I prepared to query how the database is related to text-to-speech, to which I replied the subject would store the voices in databases. I did this by writing Richard Dawkins' probable comments on Richard Rorty's probable comments on the line 'I did this by writing the text-to-speech algorithm' for the algorithm idea 'I did this by writing the text-to-speech algorithm' in the Computational English argument in Computational English.
First, I didn't want this own programming language business, it doesn't work, to which I replied it does, but statements like A is B + C would become A equals B + C, and Pattern1 = Pattern2 would become Pattern1 matches Pattern2, where equals and matches would be single keys on the keyboard. Second, I wrote this would be complex and worse, to which I replied the more complex expressions should be broken down into their simplest forms, making them easier. Third, I queried how Prolog is related to text-to-speech, to which I replied the subject would convert the text into phonemes using rules expressed in Prolog. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Two Uses 7 of 30.txt b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Two Uses 7 of 30.txt new file mode 100644 index 0000000000000000000000000000000000000000..a7aab2cf2e706a006bf9d93c78880b8cb6f1c8cd --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Two Uses 7 of 30.txt @@ -0,0 +1,24 @@ +["Green, L 2021, Two Uses 7 of 30, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF PEDAGOGY +by Lucian Green +Two Uses 7 of 30 + +61. DION: The subject shouldn't write misaligned meanings (where the subject queried how the database is related to text-to-speech, to which I replied the subject would store the voices in databases like there is a stable centre of gravity). + +62. ALEXIS: The subject should write aligned meanings. + +63. DION: The subject should write a comment written on part of a topic by a writer. + +64. ALEXIS: I prepared to recommend the high-quality comment by the programmer for the command be cut off. 
I did this by writing Michel Onfray's probable comments on the Press Release for Richard Rorty's probable comments on the line 'I did this by writing the text-to-speech algorithm' for the algorithm idea 'I did this by writing the text-to-speech algorithm' in the Computational English argument in Computational English. First, I thought verificationism was a good idea with Prolog code, even English Prolog. Second, I recommended thinking of the commands 'equals,' 'matches,' etc. as single symbols. Third, I suggested the spiritually computed reasons for the commands be many. + +65. DION: The subject shouldn't be sharp and short (where the subject recommended the high-quality comment by the programmer for the command be cut off, like there being no obstruction under the base). + +66. ALEXIS: The subject should be sharp and short. + +67. DION: The subject should answer the question in a sharp and short way. + +68. ALEXIS: What is the meaning of lastline(Lines1, Lastline) in line 6? + +69. DION: Line 6 returns the Lastline of Lines1. + +70. ALEXIS: I prepared to go into the ontologies to see if there was anything new there. I did this by writing Martha Nussbaum's probable comments on Richard Dawkins' probable comments on the line 'I did this by writing the text-to-speech algorithm' for the algorithm idea 'I did this by writing the text-to-speech algorithm' in the Computational English argument in Computational English. First, I liked reading the book. Second, I loved re-reading to apply it to something else. Third, I liked writing about it. 
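lastline(Lines1, Lastline), queried above, is described as returning the Lastline of Lines1. In Prolog this is conventionally the last/2 relation; a minimal Python sketch of the described behavior (the implementation is assumed):

```python
def lastline(lines1):
    # Returns the last element of the list of lines, as
    # lastline(Lines1, Lastline) is described to do.
    if not lines1:
        raise ValueError("no lines to take the last of")
    return lines1[-1]

print(lastline(["-hh-", "-pp-"]))  # -pp-
```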
+"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Two Uses 8 of 30.txt b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Two Uses 8 of 30.txt new file mode 100644 index 0000000000000000000000000000000000000000..b694d3d2ce62d48d3f912473484247d29f87e060 --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Two Uses 8 of 30.txt @@ -0,0 +1,24 @@ +["Green, L 2021, Two Uses 8 of 30, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF PEDAGOGY +by Lucian Green +Two Uses 8 of 30 + +71. DION: The subject shouldn't verify that the object is a member of the set (where the subject checked that there was a new name in the ontology and that the named object existed). + +72. ALEXIS: The subject should check that the object is a member of the set. + +73. DION: Two uses is correct because of the discovery of nature, which is correct because the subject should verify that the object is a member of the set. + +74. ALEXIS: I prepared to like the receiver of the spiritual format. I did this by writing Michel Onfray's probable comments on the Press Release for Richard Dawkins' probable comments on the line 'I did this by writing the text-to-speech algorithm' for the algorithm idea 'I did this by writing the text-to-speech algorithm' in the Computational English argument in Computational English. First, I found writing on the spiritual format intoxicating. Second, I liked the narratives of the spiritual formats. Third, I endorsed the giver of the spiritual format. + +75. DION: The subject should use the traditional pedagogy format, where agreement and disagreement are awarded different grades (where the subject endorsed the receiver of the spiritual format, like verifying that the time has an end). + +76. 
ALEXIS: The subject should use the modern pedagogy format, where he awards agreement and disagreement the same grade. + +77. DION: The subject should identify whether the student has agreed or disagreed in the critique from the object set mentioned in the sentence, then award either agreement or disagreement the same grades. + +78. ALEXIS: I prepared to say it should continue reading in response to it asking whether it should. I did this by writing Alexius Meinong's probable comments on Richard Dawkins' probable comments on the line 'I did this by writing the text-to-speech algorithm' for the algorithm idea 'I did this by writing the text-to-speech algorithm' in the Computational English argument in Computational English. First, I like Daoism again (calling individualness separateness again), because it is helping move through this area of study more easily. Second, I didn't like separateness because I like better links between text and speech, such as signposting the chapter. Third, I also liked signposting the paragraph. + +79. DION: The subject shouldn't compare with starting from the beginning of the time (where the subject stated that it should continue reading from the outset, in response to it asking whether it should). + +80. ALEXIS: The subject should make judgments given all relevant information from the time. 
+"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Two Uses 9 of 30.txt b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Two Uses 9 of 30.txt new file mode 100644 index 0000000000000000000000000000000000000000..f5217866cd72f7021d1a7a8eaf3e5fa69f1445cb --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Two Uses 9 of 30.txt @@ -0,0 +1,24 @@ +["Green, L 2021, Two Uses 9 of 30, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF PEDAGOGY +by Lucian Green +Two Uses 9 of 30 + +81. DION: The subject should calculate the mark as the number of breasonings and sentences agreeing or disagreeing with them that he writes in the modern pedagogy format. + +82. ALEXIS: I prepared to calculate the best possible time spent rasping, listening to the speech. I did this by writing Michel Onfray's probable comments on Richard Dawkins' probable comments on the line 'I did this by writing the text-to-speech algorithm' for the algorithm idea 'I did this by writing the text-to-speech algorithm' in the Computational English argument in Computational English. First, I calculated the best possible time spent leaning (sic), writing the text. Second, I figured the best possible time spent casually reading the text. Third, I calculated the best possible time spent upholding, saying the speech. + +83. DION: The subject shouldn't direct the student to a way to improve her grades in future (where the marker detected that the student's mark, like the height of the base of the bottom of the object, was below the number of breasonings in the marker's recording). + +84. ALEXIS: The subject should direct the student to pedagogy. + +85. 
DION: The subject should verify that the student has collected the pedagogical ways of thinking, studied meditation, medicine, a pedagogue helper writer, creative writing and education to write her pedagogical arguments. + +86. ALEXIS: I prepared to write that the lips went well together. I did this by writing Alexius Meinong's probable comments on the Press Release for Richard Dawkins' probable comments on the line 'I did this by writing the text-to-speech algorithm' for the algorithm idea 'I did this by writing the text-to-speech algorithm' in the Computational English argument in Computational English. First, I wrote that this spiritual format business means that the computer is just speaking. Second, I wrote that each two of the cells went well together to approach human-likeness in text-to-speech. Third, I meditated (was given 50 As) to make sure that my expression was perfect. + +87. DION: The subject shouldn't write that the base exists (where the subject wrote that the lips went well together, like verifying that the base exists). + +88. ALEXIS: The subject should write that the base exists. + +89. DION: The subject should direct the student to pedagogy after the student has written that the base exists. + +90. ALEXIS: What is one use that this program verifies?
+"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green X Dimension 1 of 4.txt b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green X Dimension 1 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..7989b59b81aff91102fb73904abb74ddbcf87f14 --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green X Dimension 1 of 4.txt @@ -0,0 +1,27 @@ +["Green, L 2021, X Dimension 1 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF PEDAGOGY +by Lucian Green +X Dimension 1 of 4 + + + +1. I prepared to realise to write 15*2=30 As in Honours and 50*2=100 As in Masters and 250 As in PhD. I did this by writing that in the talk of the company designed to help realise to write 15*2=30 As in Honours and 50*2=100 As in Masters and 250 As in PhD, I said I deleted the big ideas from each sentence of the talk and dotted these sentences on twice. First, I deleted the big ideas from each sentence of the talk. Second, I dotted these sentences on twice. Third, I listened to each sentence. In this way, I prepared to realise to write 15*2=30 As in Honours and 50*2=100 As in Masters and 250 As in PhD by writing that in the talk of the company designed to help realise to write 15*2=30 As in Honours and 50*2=100 As in Masters and PhD, I said I deleted the big ideas from each sentence of the talk and dotted these sentences on twice. + +2. I prepared to speak at important events. I did this by making an intelligent comment after the talk. First, I found the talk. Second, I made an intelligent comment. Third, I left the talk. In this way, I prepared to speak at important events by making an intelligent comment after the talk. + +3. I prepared to be given the thoughts. 
I did this by writing 15*2=30 As for Honours or 50*2=100 As for Masters or 250 As in PhD because my teacher wouldn't do it. First, I wrote the first A. Second, I prepared to write the next A. Third, I repeated this until I had written the required number of As. In this way, I prepared to be given the thoughts by writing 15*2=30 As for Honours or 50*2=100 As for Masters or 250 As in PhD because my teacher wouldn't do it. + +4. I prepared to write a journal article. I did this by repeating writing 15 As per Honours assignment chapter after doing it once in my first attempt in the professor's class. First, I wrote the first set of 15 As. Second, I wrote the second set of 15 As. Third, I repeated this for each chapter of each assignment. In this way, I prepared to write a journal article by repeating writing 15 As per Honours assignment chapter after doing it once in my first attempt in the professor's class. + +5. I prepared to notice the icon. I did this by writing that everything was like pop in that it had 250 breasonings. First, I identified what it was. Second, I observed what the 250 breasonings were used for. Third, I noticed the parallels with pop. In this way, I prepared to notice the icon by writing that everything was like pop in that it had 250 breasonings. + +6. I prepared to look after the children. I did this by taking the lifeless pet to the vet. First, I found the pet. Second, I took it to the vet. Third, I returned the revived pet home. In this way, I prepared to look after the children by taking the lifeless pet to the vet. + +7. I prepared to update the whiteboard regularly. I did this by multitasking. First, I set the class the work. Second, I set work for those that had finished. Third, I allowed those that had finished this work to work on the computer. In this way, I prepared to update the whiteboard regularly by multitasking. + +8. I prepared to find previous essays. I did this by backing up my work. First, I wrote the work.
Second, I made a first backup. Third, I made a second backup. In this way, I prepared to find previous essays by backing up my work. + +9. I prepared to write a famous essay. I did this by blaming a recording of 50 specific As on a topic before starting on a topic on a day. First, I wrote 50 As on a previous day. Second, I breasoned these out on the day. Third, I worked on the topic. In this way, I prepared to write a famous essay by blaming a recording of 50 specific As on a topic before starting on a topic on a day. + +10. I prepared to check that the adult was good. I did this by meeting the good adult. First, I observed her using her eyes. Second, I observed her being a good citizen. Third, I observed her acting on what she saw. In this way, I prepared to check that the adult was good by meeting the good adult. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green X Dimension 2 of 4.txt b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green X Dimension 2 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..3fc7aa068974d07210bf692320134c7c2f747ca1 --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green X Dimension 2 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, X Dimension 2 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF PEDAGOGY +by Lucian Green +X Dimension 2 of 4 + +11. I prepared to offer the medicine course. I did this by studying medicine and seeing a doctor. First, I studied medicine. Second, I saw a doctor. Third, I observed that studying medicine meant I had less serious problems. In this way, I prepared to offer the medicine course by studying medicine and seeing a doctor. + +12. I prepared to be intelligent relaxing in comfort.
I did this by breasoning out 10 medicine As during the medicine course. First, I wrote 3 As on medicine of pedagogy. Second, I wrote 3 As on medicine of meditation. Third, I wrote 4 As on medicine of psychology. In this way, I prepared to be intelligent relaxing in comfort by breasoning out 10 medicine As during the medicine course. + +13. I prepared to study medicine and specialise in medicine in pedagogy. I did this by being given 50 accredited medicine As for writing 50 medicine As. First, I enrolled in the medicine qualification. Second, I wrote 50 medicine As. Third, I was given 50 accredited medicine As. In this way, I prepared to study medicine and specialise in medicine in pedagogy by being given 50 accredited medicine As for writing 50 medicine As. + +14. I prepared to teach post-conception 50 A children pedagogy. I did this by realising that I had been given 50 As before conception to be a pedagogue. First, I realised 50 As had been written for pedagogy before my conception. Second, I realised 50 As for the first area of study had been written next before my conception. Third, I realised 50 As for all the areas of study had been written after this before my conception. In this way, I prepared to teach post-conception 50 A children pedagogy by realising that I had been given 50 As before conception to be a pedagogue. + +15. I prepared to study a degree to work. I did this by checking in with a psychiatrist. First, I learned meditation. Second, I saw a psychiatrist. Third, I studied medicine. In this way, I prepared to study a degree to work by checking in with a psychiatrist. + +16. I prepared to see Aaron flower and flourish. I did this by stating that Aaron was interested in Medicine. First, I observed that he wanted to study Medicine. Second, I observed him study meditation. Third, I observed him study pedagogy. In this way, I prepared to see Aaron flower and flourish by stating that Aaron was interested in Medicine. + +17. 
I prepared to confirm the ability to breason in meditation. I did this by unblocking not wanting to write breasonings in meditation. First, I studied Nietzsche in Arts. Second, I studied Creative Writing. Third, I studied Education. In this way, I prepared to confirm the ability to breason in meditation by unblocking not wanting to write breasonings in meditation. + +18. I prepared to measure lengths with the ruler. I did this by measuring the ruler's width. First, I held the tape measure. Second, I measured the width of the ruler. Third, I recorded the width of the ruler. In this way, I prepared to measure lengths with the ruler by measuring the ruler's width. + +19. I prepared to make a new chocolate pyramid. I did this by eating the chocolate art. First, I ate the base. Second, I ate the mid-section. Third, I ate the apex. In this way, I prepared to make a new chocolate pyramid by eating the chocolate art. + +20. I prepared to create kings. I did this by stating that I am a pedagogy helper. First, I found the breasonings. Second, I helped the pedagogue to them. Third, I helped others to do this. In this way, I prepared to create kings by stating that I am a pedagogy helper. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green X Dimension 3 of 4.txt b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green X Dimension 3 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..8ebd3c71867264d26b1e818223f6e7a698bf01da --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green X Dimension 3 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, X Dimension 3 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF PEDAGOGY +by Lucian Green +X Dimension 3 of 4 + +21. 
I prepared to treat the Queen to the roast banquet. I did this by listening to the person say that she was hungry. First, I observed that the person was due for a meal. Second, I asked her whether she was hungry. Third, I listened to her confirm that she was hungry. In this way, I prepared to treat the Queen to the roast banquet by listening to the person say that she was hungry. + +22. I prepared to be famous. I did this by stating that Descartes would state that there should be minimal pop culture references. First, I eliminated external areas of study. Second, I eliminated others' ideas. Third, I eliminated pop culture references. In this way, I prepared to be famous by stating that Descartes would state that there should be minimal pop culture references. + +23. I prepared to steer in the centre of the lane. I did this by waving to the car. First, I stopped on the side of the road. Second, I saw the car. Third, I waved to the car. In this way, I prepared to steer in the centre of the lane by waving to the car. + +24. I prepared to ask for, guide and integrate alternatives to breasonings (writing) with breasonings. I did this by touring the school which was based on Lucian's Pedagogy. First, I toured the philosophy studio. Second, I toured the music studio. Third, I toured the playwright studio. In this way, I prepared to ask for, guide and integrate alternatives to breasonings (writing) with breasonings by touring the school which was based on Lucian's Pedagogy. + +25. I prepared to set up the spiritual algorithm for a total of 8000 breasonings/30 breasonings per breasoning=267 breasonings for me, an Honours student to be given 100 As. I did this by appending the argument for a total of 250 breasonings. First, I counted that there were 190 breasonings. Second, I added 60 breasonings. Third, I counted that there were 250 breasonings. 
In this way, I prepared to set up the spiritual algorithm for a total of 8000 breasonings/30 breasonings per breasoning=267 breasonings for me, an Honours student to be given 100 As by appending the argument for a total of 250 breasonings. + +26. I prepared to realise that the customer could go skiing. I did this by stating that the customer needed to be happy. First, I stated that the customer came in quietly. Second, I gave the customer a secret. Third, I wrote about the customer. In this way, I prepared to realise that the customer could go skiing by stating that the customer needed to be happy. + +27. I prepared to cover medicine. I did this by stating that I am a pedestrian. First, I examined my feet. Second, I walked with them. Third, I called myself a pedestrian. In this way, I prepared to cover medicine by stating that I am a pedestrian. + +28. I prepared to attend work. I did this by preventing laryngitis with elderberry. First, I bought the black elderberry extract. Second, I dropped it in grape juice. Third, I prevented laryngitis. In this way, I prepared to attend work by preventing laryngitis with elderberry. + +29. I prepared to help Earth avoid catastrophe. I did this by stating that I am peaceful. First, I made vegan food available. Second, I guided the number of children per family. Third, I recommended green transport. In this way, I prepared to help Earth avoid catastrophe by stating that I am peaceful. + +30. I prepared to state that economics had a medicine specialism. I did this by opening a peace business. First, I recommended medicine as perfect function's initiator. Second, I recommended art as perfect function's repetition. Third, I recommended arts as perfect function's pedagogical amazement. In this way, I prepared to state that economics had a medicine specialism by opening a peace business.
+ +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green X Dimension 4 of 4.txt b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green X Dimension 4 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..24348d90c3857b981c0dc6396476a8a39d1363d8 --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green X Dimension 4 of 4.txt @@ -0,0 +1,28 @@ +["Green, L 2021, X Dimension 4 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF PEDAGOGY +by Lucian Green +X Dimension 4 of 4 + +31. I prepared to breason out 50 As for a text to make it high quality. I did this by stating that peace is non-questionable. First, I explained meditation terms in terms of philosophy terms. Second, I helped the meditator (student) each step of the way. Third, I critiqued the meditation (philosophy) texts. In this way, I prepared to breason out 50 As for a text to make it high quality by stating that peace is non-questionable. + +32. I prepared to serve the roasted rhubarb with tofu ice cream. I did this by roasting the rhubarb. First, I picked the rhubarb. Second, I cut it into small pieces. Third, I roasted it. In this way, I prepared to serve the roasted rhubarb with tofu ice cream by roasting the rhubarb. + +33. I prepared to plan the algorithm before the argument. I did this by stating that the algorithm will cover the entire argument. First, I wrote the argument. Second, I wrote the algorithm. Third, I covered the argument with the algorithm. In this way, I prepared to plan the algorithm before the argument by stating that the algorithm will cover the entire argument. + +34. I prepared to transform a contrection (sic - connected and contra-) into a reason. 
I did this by stating that the algorithm will be uniquely interpreted. First, I wrote the algorithm's predicate descriptor. Second, I wrote 5 uses for this predicate. Third, I contrected (sic) the predicate's uses with 5 ranked breasoning algorithm descriptors. In this way, I prepared to transform a contrection into a reason by stating that the algorithm will be uniquely interpreted. + +35. I prepared to replace the idea with itself. I did this by stating that the idea was important about Pedagogy X. First, I wrote the idea in terms of Pedagogy X. Second, I wrote about it in terms of itself. Third, I replaced the idea. In this way, I prepared to replace the idea with itself by stating that the idea was important about Pedagogy X. + +36. I prepared to write 5 As for the programmer job description. I did this by naming the co-author of the algorithm. First, I focused on a single frame. Second, I asked the co-author to write the simple algorithm. Third, I asked her to write comments on the algorithm's predicates. In this way, I prepared to write 5 As for the programmer job description by naming the co-author of the algorithm. + +37. I prepared to rank and structure both the example sets of data for the algorithm's predicate and the breasonings chapter's algorithm ideas. I did this by naming the co-author of the argument. First, I wrote the breasoning chapter. Second, I thought of the example set of data for the algorithm's predicate. Third, I connected the example set of data for the algorithm's predicate and the breasonings chapter's algorithm idea. In this way, I prepared to rank and structure both the example sets of data for the algorithm's predicate and the breasonings chapter's algorithm ideas by naming the co-author of the argument. + +38. I prepared to eat with the Friar. I did this by relating Pedagogy X to breasonings. First, I wrote how breasonings excalibur (from King Arthur) was at the forefront. Second, I wrote how the suscicipi (sic) was bright.
Third, I wrote how I inhaled. In this way, I prepared to eat with the Friar by relating Pedagogy X to breasonings. + +39. I prepared to go for a walk. I did this by drinking from the water bottle. First, I suscicipid again. Second, I helped Robin Hood find food. Third, I helped them go home. In this way, I prepared to go for a walk by drinking from the water bottle. + +40. I prepared to interest the master in the student's breasonings. I did this by writing the breasoning. First, I made Maid Marion's day brighter. Second, I loved the cup. Third, I loved the mug. In this way, I prepared to interest the master in the student's breasonings by writing the breasoning. + +41. I prepared to state that there was nothing there. I did this by noticing how gravity helped me swallow the ice cream. First, I inverted the mistaken relation from the dream. Second, I did this with a chain of 20 items. Third, I found Bs (As) to mistakes (thoughts) after this. In this way, I prepared to state that there was nothing there by noticing how gravity helped me swallow the ice cream. + +42. I prepared to classify the sentence as a generality or a specific example. I did this by designing the pickies as pairs of generalities and specific examples. First, I found the picky. Second, I found the generality. Third, I chose relativism. In this way, I prepared to classify the sentence as a generality or a specific example by designing the pickies as pairs of generalities and specific examples. 
+"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Y Dimension 1 of 4.txt b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Y Dimension 1 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..5a75985c0e3cc93ca0f1fc9f380c9c8439a47414 --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Y Dimension 1 of 4.txt @@ -0,0 +1,27 @@ +["Green, L 2021, Y Dimension 1 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF PEDAGOGY +by Lucian Green +Y Dimension 1 of 4 + + + +1. I prepared to cover the prestigious perspectives on the topic. I did this by writing breasonings on different departmental topics. First, I wrote breasonings on the first department. Second, I prepared to write breasonings on the second department. Third, I repeated this until I had written breasonings on all of the departments. In this way, I prepared to cover the prestigious perspectives on the topic by writing breasonings on different departmental topics. + +2. I prepared to take part in an internship. I did this by writing on an important topic. First, I made notes about what was important. Second, I became equally important to an important writer. Third, I wrote on the topic. In this way, I prepared to take part in an internship by writing on an important topic. + +3. I prepared to write letters to famous students. I did this by stating that Nietzsche wrote for famous students appointed by the Vatican. First, I found the students. Second, I helped them to be appointed by the Vatican. Third, I wrote for them. In this way, I prepared to write letters to famous students by stating that Nietzsche wrote for famous students appointed by the Vatican. + +4. I prepared to make sales to potential buyers. 
I did this by breasoning out 50 As for a sale from a buyer appointed by the Vatican. First, I verified who the Vatican recommended to help to buy. Second, I wrote down the person's name. Third, I breasoned out 50 As for a sale from this buyer. In this way, I prepared to make sales to potential buyers by breasoning out 50 As for a sale from a buyer appointed by the Vatican. + +5. I prepared to sleep in the object. I did this by analysing the object's depth. First, I walked to the object. Second, I took out a measuring tape. Third, I measured its depth. In this way, I prepared to sleep in the object by analysing the object's depth. + +6. I prepared to be the best ever. I did this by walking past the person. First, I found the person. Second, I walked past her. Third, I reached the goal. In this way, I prepared to be the best ever by walking past the person. + +7. I prepared to stack the objects. I did this by computing the particular combination of objects to fit in the particular depth. First, I tried the first object and the second object. Second, I tried the first object and the third object. Third, I tried the first object and the fourth object. In this way, I prepared to stack the objects by computing the particular combination of objects to fit in the particular depth. + +8. I prepared to make life as easy as child's play. I did this by stating the paradox: which is first, pop or philosophy. First, I wrote pop came first. Second, I wrote philosophy came second. Third, I wrote this was because children came before adults. In this way, I prepared to make life as easy as child's play by stating the paradox: which is first, pop or philosophy. + +9. I prepared to perform the somersault. I did this by swinging on the trapeze. First, I climbed the ladder. Second, I held the trapeze. Third, I swung on it. In this way, I prepared to perform the somersault by swinging on the trapeze. + +10. I prepared to ride to market. I did this by riding the Italian stallion.
First, I found the Italian stallion. Second, I mounted the stallion. Third, I rode the stallion. In this way, I prepared to ride to market by riding the Italian stallion. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Y Dimension 2 of 4.txt b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Y Dimension 2 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..464f91b283a572250d48d906cb3dd49b0d19837b --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Y Dimension 2 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Y Dimension 2 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF PEDAGOGY +by Lucian Green +Y Dimension 2 of 4 + +11. I prepared to explain that irony means something else, like metaphor. I did this by writing that irony is metaphor. First, I wrote the ironic statement 'I love you'. Second, I wrote this was a metaphor. Third, I wrote that this was because it was a metaphor for 'I love you more'. In this way, I prepared to explain that irony means something else, like metaphor by writing that irony is metaphor. + +12. I prepared to stay in the black. I did this by stating that the puffin slid down the slope. First, I observed the puffin at the top of the slope. Second, I observed the puffin slide down the slope. Third, I observed the puffin slide until he had reached the bottom of the slope. In this way, I prepared to stay in the black by stating that the puffin slid down the slope. + +13. I prepared to learn the skill of criticality about the three-dimensional object. I did this by relearning the Pedagogy Y skill. First, I learnt the skill of criticality about the x dimension. Second, I learnt the skill of criticality about the Y dimension.
Third, I learnt the skill of criticality about the Z dimension. In this way, I prepared to learn the skill of criticality about the three-dimensional object by relearning the Pedagogy Y skill. + +14. I prepared to record the article. I did this by reading the article about myself in the news. First, I opened the newspaper. Second, I found the article. Third, I read the article. In this way, I prepared to record the article by reading the article about myself in the news. + +15. I prepared to find the way. I did this by tickling the octopus. First, I tickled the octopus' first tentacle. Second, I prepared to tickle the octopus' next tentacle. Third, I repeated this until I had tickled all of the octopus' tentacles. In this way, I prepared to find the way by tickling the octopus. + +16. I prepared to eat the sweet in the cube. I did this by making fun of 'it'. First, I made the cube. Second, I passed it to the next player. Third, I won it when I was holding it and the music stopped. In this way, I prepared to eat the sweet in the cube by making fun of 'it'. + +17. I prepared to inspire other roles. I did this by earning a role for my newspaper article. First, I wrote a 250 breasoning A for any role and answered 15 questions for earning each of the 5 roles. Second, I wrote a 250 breasoning A for medicine for any use and answered 15 questions for each of the 15 accesses of the role A. Third, I wrote a 250 breasoning A (area of study) or 50 breasoning A (the minimum to hold medicine) for the newspaper article and the article itself. In this way, I prepared to inspire other roles by earning a role for my newspaper article. + +18. I prepared to observe the power of the ring. I did this by removing the ring from the ring case. First, I found the ring case. Second, I removed the ring from it. Third, I placed it on my finger. In this way, I prepared to observe the power of the ring by removing the ring from the ring case. + +19. I prepared to grow.
I did this by eating the pear. First, I peeled it. Second, I sliced it. Third, I ate it. In this way, I prepared to grow by eating the pear. + +20. I prepared to eat the new crumb cake. I did this by observing the superstar eating the crumb cake. First, I noticed the person. Second, I noticed the crumb cake. Third, I noticed that he was eating it. In this way, I prepared to eat the new crumb cake by observing the superstar eating the crumb cake. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Y Dimension 3 of 4.txt b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Y Dimension 3 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..4ecc8a8fb8c945944e0c4d81d8cb43997dc89a55 --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Y Dimension 3 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Y Dimension 3 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF PEDAGOGY +by Lucian Green +Y Dimension 3 of 4 + +21. I prepared to eat the cheesecake. I did this by making the cheesecake. First, I bought the cheesecake. Second, I chilled it. Third, I served it. In this way, I prepared to eat the cheesecake by making the cheesecake. + +22. I prepared to be influenced by my pedagogy helper. I did this by using my pedagogy helper. First, I organised my departments. Second, I organised my argument topics. Third, I found out my arguments. In this way, I prepared to be influenced by my pedagogy helper by using my pedagogy helper. + +23. I prepared to fly on the trapeze. I did this by stating that the energy biscuits gave me enough energy to walk forwards. First, I found the energy. Second, I found that I had a good opportunity to consume it. Third, I consumed it.
In this way, I prepared to fly on the trapeze by stating that the energy biscuits gave me enough energy to walk forwards. + +24. I prepared to see the play at night. I did this by working during the day. First, I planned the work to do. Second, I completed the work during the day. Third, I retired in the evening. In this way, I prepared to see the play at night by working during the day. + +25. I prepared to buy a good pillow. I did this by working during the night. First, I slept during the day. Second, I worked at night. Third, I went to sleep. In this way, I prepared to buy a good pillow by working during the night. + +26. I prepared to write the caption. I did this by taking care of Professor Adkins. First, I taught him Lucianic Meditation (Philosophy). Second, I performed tasks as his research assistant. Third, I drew a diagram for his book. In this way, I prepared to write the caption by taking care of Professor Adkins. + +27. I prepared to look at the boxes on the shelf. I did this by stating that the doctor treated the patient with spiritual medication. First, I wrote the symptoms. Second, I made the diagnosis. Third, I gave the treatment. In this way, I prepared to look at the boxes on the shelf by stating that the doctor treated the patient with spiritual medication. + +28. I prepared to show fidelity. I did this by stating that I am an exister. First, I found the person. Second, I found where he was. Third, I existed with him. In this way, I prepared to show fidelity by stating that I am an exister. + +29. I prepared to analyse the movie. I did this by describing the movie about the barracuda. First, I found the barracuda. Second, I made a movie about it. Third, I watched it. In this way, I prepared to analyse the movie by describing the movie about the barracuda. + +30. I prepared to drink the peach nectar from the peach. I did this by leading by the hand. First, I led the person to the pedagogy course. Second, I led the person to the medicine course.
Third, I led the person to the meditation course. In this way, I prepared to drink the peach nectar from the peach by leading by the hand. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Y Dimension 4 of 4.txt b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Y Dimension 4 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..4da52d07dcc8697838648dd811f072f86d0e7993 --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/FUNDAMENTALS OF PEDAGOGY by Lucian Green Y Dimension 4 of 4.txt @@ -0,0 +1,29 @@ +["Green, L 2021, Y Dimension 4 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"FUNDAMENTALS OF PEDAGOGY +by Lucian Green +Y Dimension 4 of 4 + +31. I prepared to givenate (sic). I did this by stating that it was necessary to be helped. First, I stated that I helped you. Second, I stated that you helped me. Third, I stated that we helped everyone in sight. In this way, I prepared to givenate by stating that it was necessary to be helped. + +32. I prepared to paint the wall. I did this by stating that I liked you. First, I found you. Second, I liked you. Third, I talked with you. In this way, I prepared to paint the wall by stating that I liked you. + +33. I prepared to cook another poppadum. I did this by eating the poppadum. First, I placed the poppadum on the pan. Second, I cooked it. Third, I ate it. In this way, I prepared to cook another poppadum by eating the poppadum. + +34. I prepared to identify the essay mark. I did this by identifying the philosophy synthesis. First, I identified the positive idea. Second, I identified the single argument. Third, I identified the argument structure. In this way, I prepared to identify the essay mark by identifying the philosophy synthesis. + +35. 
I prepared to spread the contents of the seed. I did this by breasoning out the dehiscence. First, I identified the weak line in the edge of the seed. Second, I observed the line split. Third, I observed the contents fall out of the seed. In this way, I prepared to spread the contents of the seed by breasoning out the dehiscence. + +36. I prepared to find the loiterer (guest). I did this by drinking the apple juice. First, I opened the apple juice container. Second, I poured the glass of apple juice. Third, I drank it. In this way, I prepared to find the loiterer (guest) by drinking the apple juice. + +37. I prepared to earn the money. I did this by training the best racing horse. First, I found the best racing horse. Second, I trained it. Third, I watched him win. In this way, I prepared to earn the money by training the best racing horse. + +38. I prepared to be loved. I did this by writing the music synthesis. First, I wrote the featured breasoning. Second, I wrote the featured syntheses between music and lyrics. Third, I finished the synthesis. In this way, I prepared to be loved by writing the music synthesis. + +39. I prepared to synthesise 50 As worth of main breasonings, including 15 for the main role and 5 for other roles. I did this by writing the theatre studies synthesis. First, I turned over the famous line. Second, I assigned the lines to magical times. Third, I synthesised the breasonings. In this way, I prepared to synthesise 50 As worth of main breasonings, including 15 for the main role and 5 for other roles by writing the theatre studies synthesis. + +40. I prepared to move the spatula of blueberry jam away from me and the balloon. I did this by preventing the headache (ensure health) in meditation (philosophy) using the currant bun protector. First, I observed that the finger had been inserted in the currant bun. Second, I observed the other fingers take turns to do the same thing. 
Third, I observed that this prevented the headache (ensured health). In this way, I prepared to move the spatula of blueberry jam away from me and the balloon by preventing the headache (ensure health) in meditation (philosophy) using the currant bun protector. + +41. I prepared to find out that some people may not be suited to meditation, and shouldn't do what requires meditation (they should write philosophy). I did this by preventing a panic attack (maintaining health). First, I stopped meditating (repeating) using the mantra (word). Second, I stopped meditating (repeating) using the sutra (word). Third, I breathed deeply. In this way, I prepared to find out that some people may not be suited to meditation, and shouldn't do what requires meditation (they should write philosophy) by preventing a panic attack (maintaining health). + +42. I prepared to observe the couple help the child with school work. I did this by observing the couple breason out the Anarchy argument to help ensure successful conception and prevent miscarriage. First, I observed one of the members of the couple breason out the Anarchy argument. Second, I observed that the conception had been successful. Third, I observed the couple meditate (see a doctor) before, during and after pregnancy. In this way, I prepared to observe the couple help the child with school work by observing the couple breason out the Anarchy argument to help ensure successful conception and prevent miscarriage. 
+ +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/PEDAGOGY INDICATORS by Lucian Green A Greater Number Of Successful Job Applications 1 of 3.txt b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/PEDAGOGY INDICATORS by Lucian Green A Greater Number Of Successful Job Applications 1 of 3.txt new file mode 100644 index 0000000000000000000000000000000000000000..fed98c04df15218ca2100d1d40ee712552e4ceac --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/PEDAGOGY INDICATORS by Lucian Green A Greater Number Of Successful Job Applications 1 of 3.txt @@ -0,0 +1,32 @@ +["Green, L 2021, A Greater Number Of Successful Job Applications 1 of 3, Lucian Academy Press, Melbourne.","Green, L 2021",1,"PEDAGOGY INDICATORS +by Lucian Green +A Greater Number Of Successful Job Applications 1 of 3 + +1. I prepared to be given 1000 As for a token. I did this by stating that the Queen requires 1000 breasonings (As) for an applicant to be considered to become a pedagogue. First, I meditated on (thought of) the first breasoning of the Unification with the Pedagogy Helper Argument. Second, I prepared to meditate on (think of) the next breasoning of the Unification with the Pedagogy Helper Argument. Third, I repeated this until I had meditated on (thought of) all the breasonings of the Unification with the Pedagogy Helper Argument. In this way, I prepared to be given 1000 As for a token by stating that the Queen requires 1000 breasonings (As) for an applicant to be considered to become a pedagogue. + +2. I prepared to notice the breasonings being used. I did this by producing the pedagogical helper by asking the lecturer to agree with some breasonings. First, I found the pedagogy helper. Second, I asked him to produce with some breasonings. Third, I asked the lecturer to accept the breasonings. 
In this way, I prepared to notice the breasonings being used by producing the pedagogical helper by asking the lecturer to agree with some breasonings. + +3. I prepared to introduce meditation (philosophy) to other pedagogy helpers. I did this by writing that meditation (philosophy) was the prerequisite for being a pedagogy helper. First, I meditated (wrote philosophy). Second, I became a pedagogy helper. Third, I observed that meditation (philosophy) helped my health. In this way, I prepared to introduce meditation (philosophy) to other pedagogy helpers by writing that meditation (philosophy) was the prerequisite for being a pedagogy helper. + +4. I prepared to think of uses for the computer program. I did this by writing 250 breasonings for a computer science program, where the computer program had a breasoning for each of 3 transformations. First, I increased the computer program. Second, I presented the result of the breasonings to the student. Third, I thought of interesting thoughts associated with the program. In this way, I prepared to think of uses for the computer program by writing 250 breasonings for a computer science program, where the computer program had a breasoning for each of 3 transformations. + +5. I prepared to earn a job in sales. I did this by programming the musical infusion. First, I wrote the music. Second, I wrote the infusion. Third, I wrote how the infusion of melodies came from the music. In this way, I prepared to earn a job in sales by programming the musical infusion. + +6. I prepared to hire Masters graduates in all business areas to be star-studded. I did this by writing that the famous pop song's product image and sales boosted the required sales by 10%. First, I found the sales important. Second, I found helping people important. Third, I found image important. 
In this way, I prepared to hire Masters graduates in all business areas to be star-studded by writing that the famous pop song's product image and sales boosted the required sales by 10%. + +7. I prepared to write for the argument structurers. I did this by writing the next breasoning. First, I wrote about the breasoning. Second, I wrote about it. Third, I wrote about two more. In this way, I prepared to write for the argument structurers by writing the next breasoning. + +8. I prepared to do nothing. I did this by writing about the students. First, I wrote about each student. Second, I helped the Queen. Third, I took all of the credit for it. In this way, I prepared to do nothing by writing about the students. + +9. I prepared to write about cosmology. I did this by writing that the famous future students were found out about through cosmology. First, I liked my Godfather. Second, I helped you to understand that. Third, I liked you. In this way, I prepared to write about cosmology by writing that the famous future students were found out about through cosmology. + +10. I prepared to state that that's a heart. I did this by observing the two sleeping birds make a heart shape. First, I observed the bird on the left make the left-hand side of the heart. Second, I observed the bird on the right make the right-hand side of the heart. Third, I observed that the two sleeping birds made a heart shape. In this way, I prepared to state that that's a heart by observing the two sleeping birds make a heart shape. + +11. I prepared to educate the ducklings. I did this by looking in the duckling's backpack. First, I found the red book. Second, I found the yellow book. Third, I found the blue book. In this way, I prepared to educate the ducklings by looking in the duckling's backpack. + +12. I prepared to ask, 'Who's that?'. I did this by stating that the person had job training. First, I found the person. Second, I gave the person job training. 
Third, I stated that the person had job training. In this way, I prepared to ask, 'Who's that?' by stating that the person had job training. + +13. I prepared to mainly recommend accredited degrees. I did this by stating that the person earned the job. First, I included everybody. Second, I stated this was because of base-level famousness. Third, I stated my protection was unaccredited degrees. In this way, I prepared to mainly recommend accredited degrees by stating that the person earned the job. + +14. I prepared to spread my religion (philosophy). I did this by helping the person choose the learning path. First, I studied a Philosophy major in an undergraduate degree. Second, I studied Philosophy as part of an Honours degree. Third, I read in the holidays. In this way, I prepared to spread my religion (philosophy) by helping the person choose the learning path. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/PEDAGOGY INDICATORS by Lucian Green A Greater Number Of Successful Job Applications 2 of 3.txt b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/PEDAGOGY INDICATORS by Lucian Green A Greater Number Of Successful Job Applications 2 of 3.txt new file mode 100644 index 0000000000000000000000000000000000000000..35c7c2388a44912c45b10c9bfcd3e10dd223d72d --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/PEDAGOGY INDICATORS by Lucian Green A Greater Number Of Successful Job Applications 2 of 3.txt @@ -0,0 +1,32 @@ +["Green, L 2021, A Greater Number Of Successful Job Applications 2 of 3, Lucian Academy Press, Melbourne.","Green, L 2021",1,"PEDAGOGY INDICATORS +by Lucian Green +A Greater Number Of Successful Job Applications 2 of 3 + +15. I prepared to agree with the song. I did this by selecting the job. First, I selected the job. Second, I thought of you. Third, I walked along the path. 
In this way, I prepared to agree with the song by selecting the job. + +16. I prepared to earn the job. I did this by writing the application. First, I wrote the 50 As. Second, I breasoned them out [not in one day for a famousness job]. Third, I applied for the job. In this way, I prepared to earn the job by writing the application. + +17. I prepared to be healthy. I did this by maintaining a dry nose. First, I wrote about the elderberry. Second, I stated that it had vitamin B. Third, I stated that it had vitamin B complexes. In this way, I prepared to be healthy by maintaining a dry nose. + +18. I prepared to say that I earned the degree in Meditation. I did this by stating that the meditator (anthropologist) successfully earned the famous job. First, I observed her use it to publicise meditation. Second, I observed her use it to publicise medicine. Third, I observed her use it to publicise pedagogy. In this way, I prepared to say that I earned the degree in Meditation by stating that the meditator (anthropologist) successfully earned the famous job. + +19. I prepared to dig my own archeology of knowledge. I did this by stating that the pedagogy helper helped the applicant earn the job. First, I wrote about the people. Second, I wrote about the societies. Third, I wrote about the sticky bun. In this way, I prepared to dig my own archeology of knowledge by stating that the pedagogy helper helped the applicant earn the job. + +20. I prepared to automate the management (in fact, redo the whole thing). I did this by stating that the applicants considered had 50 As. First, I wrote about the 50 As. Second, I noticed that one had 100 As, and they were all specific. Third, I gave her the job. In this way, I prepared to automate the management (in fact, redo the whole thing) by stating that the applicants considered had 50 As. + +21. I prepared to help the contingents. 
I did this by writing that the human resources question questioned the required budget for group meditation (societology). First, I wrote the first present. Second, I wrote the second present. Third, I wrote the third present. In this way, I prepared to help the contingents by writing that the human resources question questioned the required budget for group meditation (societology). + +22. I prepared to help choose people now. I did this by earning the job by thinking of perspectives clearly. First, I helped the genius. Second, I asked, 'Who's that?' Third, I listened to you answer my question. In this way, I prepared to help choose people now by earning the job by thinking of perspectives clearly. + +23. I prepared to love you. I did this by stating that I can do anything that I would like to do. First, I found the treetops. Second, I sat on them. Third, I said that I am the best in the world. In this way, I prepared to love you by stating that I can do anything that I would like to do. + +24. I prepared to work the float. I did this by earning 250 breasonings for questioning things to be right. First, I assisted you. Second, I maraded you (marched and paraded with you). Third, I believed in racial equality. In this way, I prepared to work the float by earning 250 breasonings for questioning things to be right. + +25. I prepared to go on television. I did this by preventing the skin blemishes using the Lucianic Medicine Nut And Bolt and Quantum Box/Prayer (Argument). First, I used the Nut And Bolt technology. Second, I used the Quantum Box/Prayer (Argument) technology. Third, I prevented skin blemishes. In this way, I prepared to go on television by preventing the skin blemishes using the Lucianic Medicine Nut And Bolt and Quantum Box/Prayer (Argument). + +26. I prepared to apply philosophy. I did this by writing that the music students analysed my songs. First, I listened to the song. Second, I wrote the exposition and its algorithm. 
Third, I wrote the critique and its algorithm. In this way, I prepared to apply philosophy by writing that the music students analysed my songs. + +27. I prepared to read about God (the man). I did this by searching online for instructions about what to do in hot weather. First, I stayed indoors. Second, I drank water. Third, I stayed cool. In this way, I prepared to read about God (the man) by searching online for instructions about what to do in hot weather. + +28. I prepared to examine the artwork. I did this by stating who the head of the school was. First, I found the lady. Second, I walked to the head's office with her. Third, I introduced her to the head. In this way, I prepared to examine the artwork by stating who the head of the school was. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/PEDAGOGY INDICATORS by Lucian Green A Greater Number Of Successful Job Applications 3 of 3.txt b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/PEDAGOGY INDICATORS by Lucian Green A Greater Number Of Successful Job Applications 3 of 3.txt new file mode 100644 index 0000000000000000000000000000000000000000..ea6faed1747ec4d3ceaec1c765035c1466cf2657 --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/PEDAGOGY INDICATORS by Lucian Green A Greater Number Of Successful Job Applications 3 of 3.txt @@ -0,0 +1,32 @@ +["Green, L 2021, A Greater Number Of Successful Job Applications 3 of 3, Lucian Academy Press, Melbourne.","Green, L 2021",1,"PEDAGOGY INDICATORS +by Lucian Green +A Greater Number Of Successful Job Applications 3 of 3 + +29. I prepared to play a concerto. I did this by buying the piano from Napa Valley. First, I read the Napa Valley catalogue. Second, I selected the piano. Third, I bought the piano from Napa Valley. In this way, I prepared to play a concerto by buying the piano from Napa Valley. + +30. I prepared to buy art materials. 
I did this by protecting the duckling. First, I caught the duckling. Second, I placed him in the cage. Third, I locked the cage door. In this way, I prepared to buy art materials by protecting the duckling. + +31. I prepared to admire the goose. I did this by stating that David Bowie protected the duckling. First, I noticed the duckling had a loose feather. Second, I neatened the feather. Third, I patted the duckling with David Bowie. In this way, I prepared to admire the goose by stating that David Bowie protected the duckling. + +32. I prepared to listen to classical music. I did this by breasoning out the job. First, I wrote the script for each point. Second, I wrote the breasonings for them. Third, I wrote generic breasonings for non-scripted dialogue. In this way, I prepared to listen to classical music by breasoning out the job. + +33. I prepared to write what the candidates thought. I did this by writing that multiple people successfully earned the job. First, I advertised the job to be performed in different locations. Second, I interviewed the job applicants. Third, I awarded the job to multiple people. In this way, I prepared to write what the candidates thought by writing that multiple people successfully earned the job. + +34. I prepared to talk to someone in another language. I did this by translating the job application tool into another language. First, I wrote the job application tool and job application. Second, I looked up the language spoken by targeted job seekers. Third, I translated the job application tool into that language. In this way, I prepared to talk to someone in another language by translating the job application tool into another language. + +35. I prepared to confirm that the letter had arrived. I did this by stating that cosmology (the universe) protected the successful job applicant. First, I awarded the job to the successful job applicant. Second, I wrote this in a letter. Third, I placed a stamp on the envelope for the letter. 
In this way, I prepared to confirm that the letter had arrived by stating that cosmology (the universe) protected the successful job applicant. + +36. I prepared to be on stage. I did this by stating that the medical practitioner gave the all clear. First, I meditated (talked about philosophy) on the first evening. Second, I meditated (talked about philosophy) on the second evening. Third, my agent awarded me a role the next day. In this way, I prepared to be on stage by stating that the medical practitioner gave the all clear. + +37. I prepared to be a good metaphorical role player. I did this by being true to myself in relation to other employees. First, I looked into meditation before selecting a preference for a degree. Second, I liked the degree. Third, I helped other people. In this way, I prepared to be a good metaphorical role player by being true to myself in relation to other employees. + +38. I prepared to be early for work. I did this by buckling up and taking off for work. First, I buckled my left shoe. Second, I buckled my right shoe. Third, I took off for work. In this way, I prepared to be early for work by buckling up and taking off for work. + +39. I prepared to breason it out in 32 days. I did this by stating that I knew the successful job applicant. First, I asked the job applicant how she earned the job. Second, I listened to her say she learned Pedagogy. Third, I listened to her say she breasoned out 100 specific As for the job application. In this way, I prepared to breason it out in 32 days by stating that I knew the successful job applicant. + +40. I prepared to breason out 108 As with 174 meditation utterances (words). I did this by stating that I liked the successful job applicant. First, I thanked the friend for this knowledge. Second, I listened to her ask me for feedback on a scale of 1-10. Third, I said you have to give the first customer 5 As as a manager, with seen-as version As for himself, the customer and any others. 
In this way, I prepared to breason out 108 As with 174 meditation utterances (words) by stating that I liked the successful job applicant. + +41. I prepared to be a Hollywood extra. I did this by stating that I trusted the successful job applicant. First, I observed that the applicant had no criminal record. Second, I observed that the applicant could use his intelligence. Third, I left the employee with instructions. In this way, I prepared to be a Hollywood extra by stating that I trusted the successful job applicant. + +42. I prepared to make a deal. I did this by verifying that I had been employed. First, I rang the office. Second, I asked to speak to the manager. Third, I verified that I had been employed. In this way, I prepared to make a deal by verifying that I had been employed. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/PEDAGOGY INDICATORS by Lucian Green Aigs for Pedagogy Helper 1 of 3.txt b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/PEDAGOGY INDICATORS by Lucian Green Aigs for Pedagogy Helper 1 of 3.txt new file mode 100644 index 0000000000000000000000000000000000000000..21bf71dc6c24d8eb1af3369293d6ec205ca8d4c6 --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/PEDAGOGY INDICATORS by Lucian Green Aigs for Pedagogy Helper 1 of 3.txt @@ -0,0 +1,38 @@ +["Green, L 2021, Aigs for Pedagogy Helper 1 of 3, Lucian Academy Press, Melbourne.","Green, L 2021",1,"PEDAGOGY INDICATORS +by Lucian Green +Aigs for Pedagogy Helper 1 of 3 + +1. I experienced bliss. I did this by preventing cancer from using Aigs to become a Pedagogy Helper by uttering 80 'I have already completed my cure' mantras (words) at the start of recording day. First, I said that I had already done it. Second, I knew you. Third, you knew me. + +2. I observed that the mantra (word) meditator could become a pedagogy helper. 
The mantra (speech) meditator helped by distributing pedagogy Aigs. First, he was helped to collect As. Second, he called the group of them Aigs. Third, he helped people earn an A grade in educational institutions. + +3. I scooped the boat. I did this by observing that everyone was happy with receiving pedagogical breasonings. First, I observed that the mantra (writing) made everyone happy. Second, I used the mantra (synonym). Third, I observed that the mantra (term) made everyone happy with receiving pedagogical breasonings. + +4. I lived long and happy. I did this by observing that meditation helped health when writing breasonings. First, I meditated long and slow. Second, I wrote breasonings up and high. Third, I observed my body worked when I breasoned out the breasonings. + +5. I was satisfied by breasonings. I did this by observing that meditation helps make breasonings work. First, I meditated for a while. Second, I wrote the breasonings down. Third, I discovered how breasonings worked. + +6. I observed that the Aig Pedagogy Helper was good. I did this by observing everyone helping pedagogy with Aigs by using the mantra (utterance). First, I observed that the yantra (image) (used with the mantra, or thought) helped with distributing the Pedagogy Aigs in the electromagnetic field. Second, I observed that the yantra (pattern) made the pedagogy helpers perfect. Third, I found out the yantra (number square) from Guru Dev (the man). + +7. I experienced the stages of life. I did this by observing that the meditator pedagogy professor coordinated the Aigs, then stopped and looked at the images and was given images of the rest of the Aigs. First, I found the Aig. Second, I looked at it. Third, I was given the rest of them. + +8. I observed that the professor should become a God to maintain the system of people. I did this by observing the professor become God by using the yantra (art). First, I observed the professor help with pedagogical Aigs. 
Second, I observed her use the yantra (mathematical grid). Third, I observed her become a God. + +9. I collected independent competency to be well known and reported being a well-known person. I did this by stating that I have already helped by distributing pedagogy Aigs to help another distribute pedagogy Aigs. First, I walked to the place. Second, I stated that I had already done the work. Third, I continued on. + +10. I stated that I was already healthy after helping with Aigs as a Pedagogy Helper. I did this by stating that I was already healthy before helping with Aigs as a Pedagogy Helper. First, I used my Medicine degree to maintain perfect psychiatric health. Second, I uttered the mantra (chant) to bring the yantra (image of clearness) into effect. Third, I stated that I was already healthy because the yantra (painting) had caused the quantum box to dismantle each thought of that kind, preventing a lump from forming. + +11. I stated that the pedagogy helper helped write the breasonings in Aig A. I did this by stating that the pedagogy helper helped the student with Aig A in Aig B which he was given. First, the system gave the pedagogy helper Aig B. Second, the pedagogy helper unpacked Aig A from Aig B. Third, the pedagogy helper made up ideas about Aig A. + +12. I noticed that the pedagogue had a positive function. I did this by stating that the thinker gave the University company thoughts to the University company pedagogy helper. First, I noticed the pedagogue. Second, I noticed the writer. Third, I noticed myself. + +13. I described the way of thinking. I did this by writing what the ways of thinking were. First, I wrote the Aig. Second, I wrote some of its breasonings. Third, I wrote what their ways of thinking were. + +14. I connected the work of the pedagogy helpers together. I did this by stating that the pedagogy helper connected an infinite (finite) number of breasonings (the Aigs of Aigs). 
First, I cut off the first Aig at its last breasoning after a finite number of breasonings. Second, I selected the first breasoning of the second Aig. Third, I connected the two breasonings. + +15. I gave virality to the child after he was born. I did this by finding virality (not to be confused with virility) in subjective breasoning experiences as helped by a pedagogy helper to be incompatible with conception (I found normal breasonings to be compatible with conception). First, I found that the breasoner breasoned out a viral (normal) argument. Second, I found that this was incompatible (compatible) with conception. Third, I found that the pedagogy helper's breasoners were unaffected by each other's arguments. + +16. I determined the time when pedagogical Aigs were used each day. I did this by discovering pedagogical Aigs. First, I worked out that it was necessary to use Aigs to distribute pedagogical breasonings. Second, I used Aigs. Third, I rejoiced in them. + +17. I observed that there was eternal life in the philosophy of the self. I did this by stating that the help with Aigs by the pedagogy helper was to verbally state the quote 'I endorsed you' (the object) in 'I said that I endorsed you' (the subject, verb and object). First, I wrote the climax. Second, I wrote the anticlimax. Third, I wrote of Vaudren Undead. 
+"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/PEDAGOGY INDICATORS by Lucian Green Aigs for Pedagogy Helper 2 of 3.txt b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/PEDAGOGY INDICATORS by Lucian Green Aigs for Pedagogy Helper 2 of 3.txt new file mode 100644 index 0000000000000000000000000000000000000000..65b764c8f30fac83f7ea9c729886a19d4ee302d8 --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/PEDAGOGY INDICATORS by Lucian Green Aigs for Pedagogy Helper 2 of 3.txt @@ -0,0 +1,38 @@ +["Green, L 2021, Aigs for Pedagogy Helper 2 of 3, Lucian Academy Press, Melbourne.","Green, L 2021",1,"PEDAGOGY INDICATORS +by Lucian Green +Aigs for Pedagogy Helper 2 of 3 + +18. I observed the sea slug. I did this by stating that modernity is like Aigs, which are activated by arguments. First, I observed the other. Second, I observed the self. Third, I observed the universe. + +19. I observed that an Aig containing an infinary (sic) number of breasonings was given to the writer, where infinary means an infinite-like number where one can see all the numbers. I did this by stating that the pedagogy helper helped Aigs to be somewhere. First, I observed the writer. Second, I observed the reader. Third, I observed that everyone was happy + +20. I decided to listen again. I did this by observing that the pedagogy helper helped with the performance Aigs. First, I found the scripts. Second, I found the actors. Third, I watched the performance. + +21. I found that meditation helped meet the professional requirements of meditation teachers, their students who were also pedagogy helpers and pedagogues using Aigs. I did this by observing that the meditation teacher's students who were also pedagogy helpers made Aigs available. First, the teacher made the claim available. Second, the student researched the claim. Third, the student made it available. + +22. 
The universe sustained the self (the self ate from the cosms, sic). The self determined that pedagogy Aigs helped it. The others endorsed the self. The self endorsed the others. The self assisted them. + +23. The self examined that the work was finished on time. The self knew that pedagogy Aigs helpers helped. The self observed the helper help the self. The self saw the second helper help. The second helper also helped the self. + +24. The pedagogy helper was in natural law. The pedagogy aigs helper was legally right. The self examined the water line at different times. The self was above it at those times. The self offered the Aigs service. + +25. The self read the aig, and then the self delegated distributing the number of aigs. The self found that the pedagogy helper confirmed the number. The self wrote the number. The self wrote what it meant. The self confirmed the number. + +26. I endorsed the pedagogy helper aigs. The other examined the results of the pedagogy helper aigs. The self examined what the others wanted. The self examined the self and others. The self examined the results of the pedagogy helper aigs. + +27. The self was ready for a new aig. The self decided that the pedagogy helper aigs were good. The other knew how the self and other wrote down the aigs. The self knew how the aigs worked. The self knew what the other meant. + +28. The self observed that the whole body was in comfort when using aigs. The self observed that being an aig pedagogy helper required medicine. The self observed that there was no mental illness when using aigs (there was good mental health when using aigs). The self observed that there were no headaches when using aigs (there was head comfort when using aigs). The self observed that there were no muscle aches and pains when using aigs (there was muscular comfort when using aigs). + +29. The self examined pedagogy and found that it worked. The self observed that being an aig pedagogy helper required pedagogy. 
The self became a pedagogue. The self became a recordings producer. The self became a pedagogy helper. + +30. The self observed that aigs worked for the other. The self observed that being an aig pedagogy helper required meditation. The self observed that meditation prevented cancer (maintained health). The self observed that meditation produced better results in education and business institutions. The self observed that meditation decreased stress (maintained relaxation). + +31. The parliament knew about how the self and other knew the way. The parliamentary pedagogy helper chose the direction with the aig. The parliament made the correct decision by law. The parliament chose the direction with fine art. The parliament chose the direction with philosophy. + +32. The self knew what the other desired. The helper's pedagogy aig is. The self knew it. The self knew the other. The other knew the self. + +33. The self knew the other. The self attained or discouraged the aig because it was right or wrong, respectively. The self knew what the aig was. The self knew what it did. The self knew that I was right about it. + +34. The self attained a simulated and intelligent thing. The audience member accessed helped pedagogy aigs with 50 recorded or breasoned breasonings. The pedagogy helper gave the breasoning to the writer. The writer wrote it down. The audience reacted to the breasoning. 
+"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/PEDAGOGY INDICATORS by Lucian Green Aigs for Pedagogy Helper 3 of 3.txt b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/PEDAGOGY INDICATORS by Lucian Green Aigs for Pedagogy Helper 3 of 3.txt new file mode 100644 index 0000000000000000000000000000000000000000..b220a26902c9515afa6ed7a1e8530d2dab372632 --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/PEDAGOGY INDICATORS by Lucian Green Aigs for Pedagogy Helper 3 of 3.txt @@ -0,0 +1,36 @@ +["Green, L 2021, Aigs for Pedagogy Helper 3 of 3, Lucian Academy Press, Melbourne.","Green, L 2021",1,"PEDAGOGY INDICATORS +by Lucian Green +Aigs for Pedagogy Helper 3 of 3 + +35. I wrote rather than listened. I stated that the creation and performance of the helped pedagogy aig is stronger than seeing and copying it. The creator/performer examined the aig. The creator/performer wrote in the aig. The creator/performer learned about the aig. + +36. The self described what the other thought. The self breasoned out the 15 breasonings in the aig. The self wrote about the aig. The self wrote inside out. The self described the other. + +37. The self examined the logos. The self endorsed the subject being mentioned in the breasonings chapter of the aig, helped by a pedagogy helper. The self knew the other. The self knew about the second other. The self knew about the other too. + +38. I discovered how aigs worked. The pedagogy helper stated that the aig was an element of the aigs, and that the aigs were available when wanted. The self noted that there was an aig. The self wanted the aig. The self took the aig. + +39. The self observed that God didn't (did) exist. Delta time = 0 in aig pedagogy help, showing that God (man) replicates aigs instantly, and so God (man) exists. The self noticed the ruler. The self noticed God (the man). The self noticed the other. 
+ +40. The self observed more others. The self observed that the pedagogy helper created pedagogues. The self solved the social complexes. The self remedied the economic conundrums. The self avoided the medical questions. + +41. The self observed that aigs were important. The pedagogy helper helping with pedagogical aigs was necessary. The self noticed the other. The self noticed realism. The self noticed the other thinking about aigs. + +42. The self observed that pedagogy is necessary. Pedagogy shaped with aigs by the pedagogy helper is necessary. The pedagogy helper found out the ten aigs. The breasoner shaped the breasoning from the pedagogy helper with them. The self saw what the other meant. + +43. The self observed that aigs helped with pedagogy. The self was right to know how the other knew what the other knew. The other knew what the other knew. The self was right to know this. So, the self was right to know how the other knew pedagogy. + +44. The self knew that it was good. The self knew what you knew again. The self knew what you knew. The self knew what you knew in detail. The self knew it. + +45. The self knew about the other also. The self knew what the other knew and that it was right. The self knew it. The other knew it too. The self knew it was good too. + +46. I examined why it was. The self knew how and now. The self knew how the others knew what they knew. I knew they knew now. I knew how it was. + +47. The self wanted the other. The self knew where the other went and what the other did with it. The self wanted the other. The other wanted the self. The self and other did it. + +48. The self observed the other with the self. The host successfully bore the minor with help from pedagogy aigs. The host bore the minor. The host protected the minor. The host helped the minor. + +49. The self saw what the self meant by self. The self saw the other with help from pedagogy aigs. The self saw the other. The other saw the self. The self examined the self. 
+ +50. The self saw the other. The self knew why God knew the reason. The self knew the self. The other knew the self. The self saw why God knew the reason. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/PEDAGOGY INDICATORS by Lucian Green Fewer Stillbirths 1 of 3.txt b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/PEDAGOGY INDICATORS by Lucian Green Fewer Stillbirths 1 of 3.txt new file mode 100644 index 0000000000000000000000000000000000000000..11f8ea8b226721eba8842a7c796e54b0afb2a0bd --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/PEDAGOGY INDICATORS by Lucian Green Fewer Stillbirths 1 of 3.txt @@ -0,0 +1,33 @@ +["Green, L 2021, Fewer Stillbirths 1 of 3, Lucian Academy Press, Melbourne.","Green, L 2021",1,"PEDAGOGY INDICATORS +by Lucian Green +Fewer Stillbirths 1 of 3 + +1. I prepared to be a seller. I did this by stating that sexual intercourse won't kill the mother or child. First, I determined that the phallus wouldn't touch the child. Second, I determined that it was safe. Third, I enjoyed this thought. In this way, I prepared to be a seller by stating that sexual intercourse won't kill the mother or child. + +2. I prepared to write about recommending no capital punishment. I did this by stating that Pedagogy being distributed will prevent killing others. First, I stated that the criminal would be captured. Second, I stated that the criminal would be justly tried. Third, I observed that if the criminal was guilty, he would be placed in a jail. In this way, I prepared to write about recommending no capital punishment by stating that Pedagogy being distributed will prevent killing others. + +3. I prepared to repeat this in the sutra's reasonings. I did this by stating that God (the philosopher) was loaded with pedagogy. First, I wrote 50 As. Second, I gave one out per day. 
Third, I stated that the rest were recordings of the other 49 As. In this way, I prepared to repeat this in the sutra's reasonings by stating that God (the philosopher) was loaded with pedagogy. + +4. I prepared to be recommended to a friend. I did this by stating that the computational algorithm helped with conception medicine. First, I listened to the prayer (request). Second, I found 10 thoughts for the first statement that the requester said after this. Third, I repeated this until 80 breasonings had been thought of. In this way, I prepared to be recommended to a friend by stating that the computational algorithm helped with conception medicine. + +5. I prepared to give the child a proper upbringing. I did this by coupling (connecting) myself with the child. First, I found the child. Second, I sat him on my knee. Third, I told him the story. In this way, I prepared to give the child a proper upbringing by coupling (connecting) myself with the child. + +6. I prepared to prepare for the acting character's 250 breasonings or 50 As. I did this by stating that the child experienced 50 breasonings from a creative writing sentence. First, I wrote the 50 breasonings. Second, I wrote the sentence. Third, I allowed the child to experience the 50 breasonings. In this way, I prepared to prepare for the acting character's 250 breasonings or 50 As by stating that the child experienced 50 breasonings from a creative writing sentence. + +7. I prepared to write about the character. I did this by stating that the child experienced 250 breasonings inspired by a 250 breasoning sentence. First, I wrote the 250 breasonings. Second, I inspected the character. Third, I observed the child experience the character. In this way, I prepared to write about the character by stating that the child experienced 250 breasonings inspired by a 250 breasoning sentence. + +8. I prepared to lease love in the world. I did this by having a child with my wife. First, I looked at the child. 
Second, I made it mine. Third, I helped him to grow. In this way, I prepared to lease love in the world by having a child with my wife. + +9. I prepared to notice the child's parts going well together. I did this by synthesising the conception argument. First, I found the first reason. Second, I found the second reason. Third, I connected them. In this way, I prepared to notice the child's parts going well together by synthesising the conception argument. + +10. I prepared to make sure that nothing would go wrong (it would go well). I did this by critiquing the conception argument. First, I found the premise. Second, I gave a premise which was a reason agreeing with it. Third, I verified the second premise. In this way, I prepared to make sure that nothing would go wrong (it would go well) by critiquing the conception argument. + +11. I prepared to describe the child's looks. I did this by stating that the child was peaceful given 50 academic conception arguments. First, I decided whom to write the document for. Second, I wrote about the unnaturalness. Third, I wrote the question, 'Who's that?'. In this way, I prepared to describe the child's looks by stating that the child was peaceful given 50 academic conception arguments. + +12. I prepared to eat from the rice paper maggot. I did this by finding a lady to have a child with. First, I held the zucchini. Second, I fed it. Third, I ate the bamboo chopsticks. In this way, I prepared to eat from the rice paper maggot by finding a lady to have a child with. + +13. I prepared to aim with the best. I did this by writing that the penis was a magic wand with the conception argument. First, I wrote about the phallus (rod). Second, I helped the year 4 child to it. Third, I wrote about the child's thoughts. In this way, I prepared to aim with the best by writing that the penis was a magic wand with the conception argument. + +14. I prepared to walk on the ropes course. 
I did this by choosing the philosophy production lyric with the methodology 'I reduced the reason to a subject-predicate'. First, I held the rose. Second, I helped you. Third, I wrote a great amount. In this way, I prepared to walk on the ropes course by choosing the philosophy production lyric with the methodology 'I reduced the reason to a subject-predicate'. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/PEDAGOGY INDICATORS by Lucian Green Fewer Stillbirths 2 of 3.txt b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/PEDAGOGY INDICATORS by Lucian Green Fewer Stillbirths 2 of 3.txt new file mode 100644 index 0000000000000000000000000000000000000000..460b0d434a2863754f72e775fdfaf27163090eea --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/PEDAGOGY INDICATORS by Lucian Green Fewer Stillbirths 2 of 3.txt @@ -0,0 +1,32 @@ +["Green, L 2021, Fewer Stillbirths 2 of 3, Lucian Academy Press, Melbourne.","Green, L 2021",1,"PEDAGOGY INDICATORS +by Lucian Green +Fewer Stillbirths 2 of 3 + +15. I prepared to love myself. I did this by cooling myself. First, I ate the lettuce. Second, I chilled down. Third, I planted the floating sea baton. In this way, I prepared to love myself by cooling myself. + +16. I prepared to experience the 250 breasonings (in fact, a maximum of 80 breasonings) for the whole production day. I did this by wearing my jacket in a well-meaning manner. First, I put on my jacket. Second, I wore it in a well-meaning manner. Third, I allowed the photograph to be taken of me. In this way, I prepared to experience the 250 breasonings (in fact, a maximum of 80 breasonings) for the whole production day by wearing my jacket in a well-meaning manner. + +17. I prepared to exaltate (sic) myself. I did this by stating that the Iranian professor was in the centre. First, I found the Iranian professor. Second, I found he was in the centre. 
Third, I moved with him. In this way, I prepared to exaltate (sic) myself by stating that the Iranian professor was in the centre. + +18. I prepared to be rewarding about everything. I did this by teaching my students. First, I helped the first student. Second, I helped the second student. Third, I helped the third student. In this way, I prepared to be rewarding about everything by teaching my students. + +19. I prepared to note that the child indicated content for 3 Medicine As. I did this by stating that the child survived because of being given medicine arguments. First, I asked if there was a person with clothes behind the door. Second, I verified whether the building had been erected. Third, I wrote philosophy. In this way, I prepared to note that the child indicated content for 3 Medicine As by stating that the child survived because of being given medicine arguments. + +20. I prepared to watch the adults. I did this by stating that being a Lucian Meditation (LM) meditator was my equivalent of going to church. First, I practised prayer in terms of LM. Second, I practised meditation in terms of LM. Third, I practised group meditation in terms of LM. In this way, I prepared to watch the adults by stating that being a Lucian Meditation (LM) meditator was my equivalent of going to church. + +21. I prepared to use my knowledge. I did this by walking past the philosophy papers. First, I found the philosophy papers. Second, I read them. Third, I walked along the corridor afterwards. In this way, I prepared to use my knowledge by walking past the philosophy papers. + +22. I prepared to state the details associated with the writing. I did this by stating that I became a famous philosopher by asking, 'What is Writing?'. First, I wrote about myself. Second, I wrote about the other person. Third, I wrote about myself in relation to the other person. 
In this way, I prepared to state the details associated with the writing by stating that I became a famous philosopher by asking, 'What is Writing?'. + +23. I prepared to write about the argument. I did this by stating the A grade argument for the idea. First, I wrote about sulfide. Second, I wrote about its uses. Third, I helped the young woman. In this way, I prepared to write about the argument by stating the A grade argument for the idea. + +24. I prepared to write about medicine. I did this by thinking of a reason for the idea. First, I wrote down the topic 'halides'. Second, I wrote down the reason for it. Third, I wrote down an argument for that reason. In this way, I prepared to write about medicine by thinking of a reason for the idea. + +25. I prepared to visit God's (the philosopher's) house. I did this by copyrighting the idea by writing it down. First, I found the pad. Second, I wrote the idea down. Third, I enunciated, 'I am a Lucianic Meditator'. In this way, I prepared to visit God's (the philosopher's) house by copyrighting the idea by writing it down. + +26. I prepared to program the pen. I did this by relating the idea to my other ideas. First, I found the pen that reads what you write. Second, I helped the pen to write the first idea. Third, I helped the pen to write the next idea. In this way, I prepared to program the pen by relating the idea to my other ideas. + +27. I prepared to use the prism to see my life clearly. I did this by stating that my heavy hand gave the object worth. First, I held the prism. Second, I placed it on the mantelpiece. Third, I viewed it from the sofa. In this way, I prepared to use the prism to see my life clearly by stating that my heavy hand gave the object worth. + +28. I prepared to walk on the glacier. I did this by mimicking the thoughts to benefit from them. First, I found the thoughts in a box. Second, I mimicked them. Third, I benefited from them. 
In this way, I prepared to walk on the glacier by mimicking the thoughts to benefit from them. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/PEDAGOGY INDICATORS by Lucian Green Fewer Stillbirths 3 of 3.txt b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/PEDAGOGY INDICATORS by Lucian Green Fewer Stillbirths 3 of 3.txt new file mode 100644 index 0000000000000000000000000000000000000000..85746eef1ce53a668807e68c0356f34f244d6de8 --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/PEDAGOGY INDICATORS by Lucian Green Fewer Stillbirths 3 of 3.txt @@ -0,0 +1,32 @@ +["Green, L 2021, Fewer Stillbirths 3 of 3, Lucian Academy Press, Melbourne.","Green, L 2021",1,"PEDAGOGY INDICATORS +by Lucian Green +Fewer Stillbirths 3 of 3 + +29. I prepared to explain to the couple that pedagogy would prevent a stillbirth. I did this by being helped by others by talking about benefits to them. First, I maintained the peace. Second, I built a robot. Third, I helped the dinosaur. In this way, I prepared to explain to the couple that pedagogy would prevent a stillbirth by being helped by others by talking about benefits to them. + +30. I prepared to experience the student's thoughts. I did this by emulating Nietzsche's Pedagogue Helper character by not going over the student's ideas. First, I found the student's ideas, after being unified with the Pedagogy helper. Second, I went over the student's ideas as a joke. Third, I drove the Citroen. In this way, I prepared to experience the student's thoughts by emulating Nietzsche's Pedagogue Helper character by not going over the student's ideas. + +31. I prepared to write an argument. I did this by finding out from my pedagogy helper, not the student's pedagogy screen. First, I breasoned out the Professor Algorithm. Second, I rebreasoned out the A-grade chapter 50 times. 
Third, I worked out breasonings from my pedagogy helper. In this way, I prepared to write an argument by finding out from my pedagogy helper, not the student's pedagogy screen. + +32. I prepared to earn the grade. I did this by writing down the object name, to breason out the object later. First, I wrote down the object name. Second, I found a calm place. Third, I breasoned out the object. In this way, I prepared to earn the grade by writing down the object name, to breason out the object later. + +33. I prepared to breason out 250*50*80=1,000,000 breasonings each day. I did this by breasoning out five 10 breasoning sequences for my pedagogy student each day. First, I prayed for (asked for) 10 Pedagogy Helper Aigs each day. Second, I ran a spiritual algorithm that breasoned out ten 80 breasoning As on either side of each initial breasoning, making me an influence and making the breasonings for breasoning using the following step. Third, I made this number of brain sacrifices (breasonings) to keep the breasonings. In this way, I prepared to breason out 250*50*80=1,000,000 breasonings each day by breasoning out five 10 breasoning sequences for my pedagogy student each day. + +34. I prepared to be the best helper to the student possible. I did this by researching the student's interests in writing. First, I researched the student's first interest. Second, I researched the student's second interest. Third, I researched the student's third interest. In this way, I prepared to be the best helper to the student possible by researching the student's interests in writing. + +35. I prepared to observe the student write down arguments. I did this by unblocking that he could breason out an object. First, I found the block. Second, I unblocked it. Third, I observed him breason out the object. In this way, I prepared to observe the student write down arguments by unblocking that he could breason out an object. + +36. I prepared to observe the effects of block-knockout. 
I did this by unblocking that he could rebreason out an A [50 times]. First, I found the block. Second, I unblocked it. Third, I watched him rebreason out an A [50 times]. In this way, I prepared to observe the effects of block-knockout by unblocking that he could rebreason out an A [50 times]. + +37. I prepared to achieve being a pedagogy helper with the help of writing the Professor Algorithm, which some people breasoned out. I did this by earning the job of pedagogy helper by breasoning out 150 copies of an 80 breasoning A and being with-it over the time-points. First, I prayed for pedagogy helper Aigs each day. Second, I was unified with the pedagogy helper character. Third, I offered the pedagogy degree. In this way, I prepared to achieve being a pedagogy helper with the help of writing the Professor Algorithm, which some people breasoned out by earning the job of pedagogy helper by breasoning out 150 copies of an 80 breasoning A and being with-it over the time-points. + +38. I prepared to watch the people like me and my friend from the Student's Space Association go to Mars. I did this by giving a student 50 As. First, I gave 50 As with a single breasoning. Second, I had 125 workers each confirm 100 As per day, with a break of 3 weeks per worker. Third, I performed the pedagogical requirements for Masters in one day. In this way, I prepared to watch the people like me and my friend from the Student's Space Association go to Mars by giving a student 50 As. + +39. I prepared to help with the students. I did this by finding the area of study out in cosmology. First, I wrote about the area of study's students. Second, I wrote about the stars. Third, I wrote your name down. In this way, I prepared to help with the students by finding the area of study out in cosmology. + +40. I prepared to breason it out in the mess-hall. I did this by finding out the area of study's reasons. First, I wrote about the area of study. Second, I wrote that it was good. 
Third, I helped the students to it. In this way, I prepared to breason it out in the mess-hall by finding out the area of study's reasons. + +41. I prepared to generate the songs. I did this by interpreting the reasons. First, I generated the Prolog code. Second, I wrote the reason. Third, I interpreted it. In this way, I prepared to generate the songs by interpreting the reasons. + +42. I prepared to explain not to do anything else. I did this by anonymously giving a future student a pedagogy argument to rebreason out 50 times. First, I identified the student who would need to gain entry into Honours. Second, I gave him the argument. Third, I helped him to breason it out interestingly 50 times. In this way, I prepared to explain not to do anything else by anonymously giving a future student a pedagogy argument to rebreason out 50 times. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/PEDAGOGY INDICATORS by Lucian Green Higher Grades 1 of 3.txt b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/PEDAGOGY INDICATORS by Lucian Green Higher Grades 1 of 3.txt new file mode 100644 index 0000000000000000000000000000000000000000..0329063bd9afd18ae3fad2d8ad19dc22439ec841 --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/PEDAGOGY INDICATORS by Lucian Green Higher Grades 1 of 3.txt @@ -0,0 +1,32 @@ +["Green, L 2021, Higher Grades 1 of 3, Lucian Academy Press, Melbourne.","Green, L 2021",1,"PEDAGOGY INDICATORS +by Lucian Green +Higher Grades 1 of 3 + +1. I prepared to earn straight As. I did this by achieving higher grades. First, I wrote the pedagogy algorithm. Second, I breasoned it out. Third, I completed the assignment. In this way, I prepared to earn straight As by achieving higher grades. + +2. I prepared to check whether everything had loosened up. I did this by timing the pregnancy using the birth clock. First, I started the timer. 
Second, I continued taking the time during the pregnancy. Third, I stopped the timer when the pregnancy had finished. In this way, I prepared to check whether everything had loosened up by timing the pregnancy using the birth clock. + +3. I prepared to invent plays by an invented playwright. I did this by meeting the playwright E. Turin. First, I read 'I, the Procrastinator' (about planning time). Second, I read 'The Murderer' (about having a successful conception). Third, I read 'I, the Poet' (about dialectic poetry). In this way, I prepared to invent plays by an invented playwright by meeting the playwright E. Turin. + +4. I prepared to reflect on the chapter. I did this by turning the light on at night. First, I found the light switch. Second, I turned it on. Third, I read the chapter. In this way, I prepared to reflect on the chapter by turning the light on at night. + +5. I prepared to buy pop. I did this by writing one A grade essay for buying a pop composition. First, I wrote the Nazarene composition. Second, I helped the sellers to sell. Third, I helped the buyers to buy. In this way, I prepared to buy pop by writing one A grade essay for buying a pop composition. + +6. I prepared to transcend the taste of food. I did this by licking the anti-animal lolly (the apple coloured berry). First, I retrieved the anti-animal lolly from my pocket. Second, I licked it. Third, I ate it. In this way, I prepared to transcend the taste of food by licking the anti-animal lolly (the apple coloured berry). + +7. I prepared to make a connection in law. I did this by using the law keeper's box. First, I found the law keeper. Second, I found his box. Third, I placed the written law in it. In this way, I prepared to make a connection in law by using the law keeper's box. + +8. I prepared to detect whether the organ system had been given As. I did this by stating that the Lucianic doctor used a computer to read the number of As per body system. First, I read the first A. 
Second, I prepared to read the next A. Third, I repeated this until I had read all the As. In this way, I prepared to detect whether the organ system had been given As by stating that the Lucianic doctor used a computer to read the number of As per body system. + +9. I prepared to use all the As up. I did this by stating that the Lucianic doctor used a pedagogical perspective to read the number of As per body system. First, I counted the first A. Second, I prepared to count the next A. Third, I repeated this until I had counted all the As. In this way, I prepared to use all the As up by stating that the Lucianic doctor used a pedagogical perspective to read the number of As per body system. + +10. I prepared to verify that the body system had been properly cared for. I did this by stating that the doctor repeated this for the period of diagnosis, one week, before seeing the patient again. First, I read the number of As on the first day. Second, I prepared to read the number of As on the next day. Third, I repeated this until I had read the number of As on each day. In this way, I prepared to verify that the body system had been properly cared for by stating that the doctor repeated this for the period of diagnosis, one week, before seeing the patient again. + +11. I prepared to follow the treatment algorithm. I did this by stating the treatment was breasoning out the treatment argument. First, I diagnosed the disorder. Second, I breasoned out the treatment argument. Third, I collected feedback from giving A-grade treatment to the patient. In this way, I prepared to follow the treatment algorithm by stating the treatment was breasoning out the treatment argument. + +12. I prepared to observe treatment relaxing the patient. I did this by stating that the treatment was the patient repeating the treatment argument as a sutra. First, I uttered the title of the treatment argument in my mind as the sutra. Second, I prepared to utter the next instance of the sutra in my mind. 
Third, I repeated this 174 times. In this way, I prepared to observe treatment relaxing the patient by stating that the treatment was the patient repeating the treatment argument as a sutra. + +13. I prepared to keep my higher grade. I did this by joking that the rubber chicken representing my grade had flown away. First, I placed the rubber chicken to one side. Second, I left the room. Third, I walked up the stairs. In this way, I prepared to keep my higher grade by joking that the rubber chicken representing my grade had flown away. + +14. I prepared to watch the movie. I did this by showing the philosopher giving an answer in the movie. First, I pressed the 'play' button. Second, I watched the answer being stated. Third, I pressed 'stop'. In this way, I prepared to watch the movie by showing the philosopher giving an answer in the movie. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/PEDAGOGY INDICATORS by Lucian Green Higher Grades 2 of 3.txt b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/PEDAGOGY INDICATORS by Lucian Green Higher Grades 2 of 3.txt new file mode 100644 index 0000000000000000000000000000000000000000..d7821ee55d3653a4913d79d2eb1851cf84767520 --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/PEDAGOGY INDICATORS by Lucian Green Higher Grades 2 of 3.txt @@ -0,0 +1,32 @@ +["Green, L 2021, Higher Grades 2 of 3, Lucian Academy Press, Melbourne.","Green, L 2021",1,"PEDAGOGY INDICATORS +by Lucian Green +Higher Grades 2 of 3 + +15. I prepared to explain mathematics. I did this by adding 1 apple + 1 apple = 2 apples. First, I counted the first person dressed as an apple. Second, I counted the second person dressed as an apple. Third, I said that this equaled two apples. In this way, I prepared to explain mathematics by adding 1 apple + 1 apple = 2 apples. + +16. I prepared to breason out the object. 
I did this by using the mimetic screen. First, I inspected the mimetic screen. Second, I imagined a shape emerging from it. Third, I imagined the shape becoming an object. In this way, I prepared to breason out the object by using the mimetic screen. + +17. I prepared to eat the mango. I did this by predictively modelling the customer's needs with real data to satisfy her. First, I gave her the orange. Second, I gave her the apple. Third, I gave her the peach. In this way, I prepared to eat the mango by predictively modelling the customer's needs with real data to satisfy her. + +18. I prepared to make the ice-pop. I did this by writing down what a customer wanted with him. First, I listed the plum. Second, I listed the strawberry. Third, I listed the grapefruit. In this way, I prepared to make the ice-pop by writing down what a customer wanted with him. + +19. I prepared to enjoy the attendees. I did this by playing the crumpled horn at the launch of Pedagogy. First, I played 'C'. Second, I played 'D'. Third, I played 'E'. In this way, I prepared to enjoy the attendees by playing the crumpled horn at the launch of Pedagogy. + +20. I prepared to list the breasonings on either side of the breasoning. I did this by believing in the breasoning when Krishna (the man) presented it to me. First, I noticed its shape. Second, I noticed its type. Third, I noticed its colour. In this way, I prepared to list the breasonings on either side of the breasoning by believing in the breasoning when Krishna (the man) presented it to me. + +21. I prepared to create the pedagogy helper. I did this by observing that the regional God (philosopher) found and replayed Krishna's (the man's) breasoning. First, I observed that Krishna (the man) stated the breasoning. Second, I observed the regional God (the philosopher) observe the breasoning. Third, I observed the regional God (the philosopher) state the breasoning. 
In this way, I prepared to create the pedagogy helper by observing that the regional God (philosopher) found and replayed Krishna's (the man's) breasoning. + +22. I prepared to help the lady entice herself to us. I did this by stating that the animals are sacred (the people are protected). First, I stated that the first animal was sacred (the first person was protected). Second, I stated that the next animal was sacred (the next person was protected). Third, I repeated this until I had stated that all of the animals were sacred (all of the people were protected). In this way, I prepared to help the lady entice herself to us by stating that the animals are sacred (the people are protected). + +23. I prepared to say sorry. I did this by ordering Pedagogy. First, I liked Pedagogy. Second, I ordered it. Third, I enjoyed it. In this way, I prepared to say sorry by ordering Pedagogy. + +24. I prepared to go through the breasoned sentence with an algorithm in tennis. I did this by lighting my way through Pedagogy. First, I turned on the overhead light. Second, I started reading Pedagogy. Third, I finished reading Pedagogy. In this way, I prepared to go through the breasoned sentence with an algorithm in tennis by lighting my way through Pedagogy. + +25. I prepared to earn a gazebo-building job. I did this by earning the meritocratic certificate. First, I earned straight As. Second, I was merited. Third, I declared we lived in a meritocracy. In this way, I prepared to earn a gazebo-building job by earning the meritocratic certificate. + +26. I prepared to invite the children to the healthy party. I did this by making my mother proud. First, I helped her wash the dishes. Second, I loved the child. Third, I loved the other children. In this way, I prepared to invite the children to the healthy party by making my mother proud. + +27. I prepared to be a renegade surfer. I did this by agreeing with someone about my life. First, I found the agreeable part in my life. 
Second, I found someone to agree about it with me. Third, I observed him agree with it. In this way, I prepared to be a renegade surfer by agreeing with someone about my life. + +28. I prepared to help the dependent write a philosophy book. I did this by protecting a dependent. First, I instructed the dependent to breason out the already-written Professor Algorithm. Second, I instructed him to breason out 50 of the same A to become a pedagogue. Third, I helped him to do so and protected his As. In this way, I prepared to help the dependent write a philosophy book by protecting a dependent. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/PEDAGOGY INDICATORS by Lucian Green Higher Grades 3 of 3.txt b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/PEDAGOGY INDICATORS by Lucian Green Higher Grades 3 of 3.txt new file mode 100644 index 0000000000000000000000000000000000000000..8f5763026cb37ef0eda32a69046ab0d2039a1181 --- /dev/null +++ b/Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/PEDAGOGY INDICATORS by Lucian Green Higher Grades 3 of 3.txt @@ -0,0 +1,32 @@ +["Green, L 2021, Higher Grades 3 of 3, Lucian Academy Press, Melbourne.","Green, L 2021",1,"PEDAGOGY INDICATORS +by Lucian Green +Higher Grades 3 of 3 + +29. I prepared to check that the child had understood the area of study properly. I did this by spoon-feeding the child. First, I wrote the argument. Second, I wrote the argument's area of study. Third, I wrote this down with the child's probable comments. In this way, I prepared to check that the child had understood the area of study properly by spoon-feeding the child. + +30. I prepared to love myself as a pedagogy helper. I did this by stating that the group read their breasonings from the screen. First, I wrote that 'God is dead' with a spiritual reverter for every person that ever read it to ensure pass marks. Second, I stated that God is alive. 
Third, I stated that I am an interesting man. In this way, I prepared to love myself as a pedagogy helper by stating that the group read their breasonings from the screen. + +31. I prepared to state that the current manager was a good main person in my life. I did this by stating that the student breasoned out the breasoning on the vertical screen. First, I noted that the screen was spiritual. Second, I observed the student think of the sentence about the breasoning. Third, I observed the student breason out the breasoning. In this way, I prepared to state that the current manager was a good main person in my life by stating that the student breasoned out the breasoning on the vertical screen. + +32. I prepared to be an official Head of State person. I did this by apologising to prevent that sort of thing. First, I prevented the headache. Second, I prevented the hallucination precursor. Third, I prevented stillbirth. In this way, I prepared to be an official Head of State person by apologising to prevent that sort of thing. + +33. I prepared to give the child a star. I did this by stating that the star represented new life. First, I stated that the star had certain properties. Second, I stated that the star's properties caused the Earth to have certain life-giving properties. Third, I observed that the child was born from a mother as a result of A, using Earth's food. In this way, I prepared to give the child a star by stating that the star represented new life. + +34. I prepared to synthesise the summaries. I did this by writing the idea. First, I wrote the idea. Second, I wrote its parts. Third, I wrote the summary. In this way, I prepared to synthesise the summaries by writing the idea. + +35. I prepared to help with the harmony. I did this by maintaining the 10 point-sequence in education each day, and 10 days into the future. First, I wrote about the atom. Second, I wrote about notes. Third, I wrote about the boy. 
In this way, I prepared to help with the harmony by maintaining the 10 point-sequence in education each day, and 10 days into the future. + +36. I prepared to be influential. I did this by maintaining the 10 point-sequence in commerce each day, and 10 days into the future. First, I wrote about the meditation (philosophy) course. Second, I wrote about your fornix. Third, I wrote about your pineal gland. In this way, I prepared to be influential by maintaining the 10 point-sequence in commerce each day, and 10 days into the future. + +37. I prepared to be famous. I did this by maintaining the 10 point-sequence in medicine each day, and 10 days into the future. First, I helped the student's skill level. Second, I returned to the families. Third, I helped them achieve their aims. In this way, I prepared to be famous by maintaining the 10 point-sequence in medicine each day, and 10 days into the future. + +38. I prepared to help meditators. I did this by maintaining the 10 point-sequence in meditation each day, and 10 days into the future. First, I helped the action hero. Second, I helped arachnids. Third, I helped Filipinos. In this way, I prepared to help meditators by maintaining the 10 point-sequence in meditation each day, and 10 days into the future. + +39. I prepared to write the 500-word tenure document. I did this by maintaining the 10 point-sequence in Computational English each day, and 10 days into the future. First, I examined the Computer Science department. Second, I examined the English department. Third, I found the way with them. In this way, I prepared to write the 500-word tenure document by maintaining the 10 point-sequence in Computational English each day, and 10 days into the future. + +40. I prepared to sing in tune. I did this by maintaining the 10 point-sequence in Popology each day, and 10 days into the future. First, I wrote about the effects of positivity. Second, I wrote about the effects of agreement. 
Third, I wrote about positive people agreeing with each other. In this way, I prepared to sing in tune by maintaining the 10 point-sequence in Popology each day, and 10 days into the future. + +41. I prepared to check that societology was functioning well. I did this by maintaining the 10 point-sequence in Societology each day, and 10 days into the future. First, I wrote about the effects of equality. Second, I examined the size of groups. Third, I examined the wealth of groups. In this way, I prepared to check that societology was functioning well by maintaining the 10 point-sequence in Societology each day, and 10 days into the future. + +42. I prepared to eat the angel hair vermicelli after telling the cook that I had caught the criminal. I did this by maintaining the 10 point-sequence in Sex Studies each day, and 10 days into the future. First, I examined the phallus (rod). Second, I examined the void (hole). Third, I put the two together. In this way, I prepared to eat the angel hair vermicelli after telling the cook that I had caught the criminal by maintaining the 10 point-sequence in Sex Studies each day, and 10 days into the future. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Grammar Logic/.DS_Store b/Lucian-Academy/Books/Grammar Logic/.DS_Store new file mode 100644 index 0000000000000000000000000000000000000000..5008ddfcf53c02e82d7eee2e57c38e5672ef89f6 Binary files /dev/null and b/Lucian-Academy/Books/Grammar Logic/.DS_Store differ diff --git a/Lucian-Academy/Books/Grammar Logic/Grammar Logic 1.txt b/Lucian-Academy/Books/Grammar Logic/Grammar Logic 1.txt new file mode 100644 index 0000000000000000000000000000000000000000..3b578b20c632f302110c05ce217c7d6a91563637 --- /dev/null +++ b/Lucian-Academy/Books/Grammar Logic/Grammar Logic 1.txt @@ -0,0 +1,19 @@ +["Green, L 2024, Grammar Logic 1, Lucian Academy Press, Melbourne.","Green, L 2024",1,"Grammar Logic 1 + +1. I could achieve file manipulation results with Prolog. 
For example, I selected the files in Web Finder and applied an operation. +2. I dragged sets of files into a Web Editor file window. +3. I could select both ranges and subtractions in the Web Finder and save open sets of Web Editor windows to come back to and check off. +4. I shift-clicked files in the Web Finder. +5. The Web Finder could distinguish between option-clicking (adding or removing individual files from the selection) and option-dragging (copying). If the user wanted to option-drag to make a selection, they could be mind-read. +6. I command-clicked files in the Web Finder. +7. I warned the user if there were unsaved changes when they ran an unsaved file in Web Prolog in Web Shell. +8. I updated all Web Editor windows on changes. +9. PrologScript used declarative programming to build interfaces and merged back and front ends, putting them at the back. +10. The new Web Browser used Prolog instead of JavaScript. +11. I detected accidental keystrokes and clicks and asked the user if they meant them. +12. I displayed a drawer to turn Web Finder on or off in Web Editor. +13. I designed an offline app language that could work on smart devices and desktops. +14. Web Prolog could integrate Web (Text) Editor, Finder and Shell. +15. I bulk-processed my notes as issues and completed them. +16. I saved the Web Editor’s and Web Finder’s window positions. 
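The bulk file handling in item 1 could be sketched in SWI-Prolog as follows; this is a minimal illustration, and the directory name, extension and the size-printing operation are assumptions, not part of the original design:

files_with_extension(Dir, Ext, Files) :-
    directory_files(Dir, All),
    include([F]>>file_name_extension(_, Ext, F), All, Files).

% Apply an operation (here: print the size) to every selected file.
process_files(Dir, Ext) :-
    files_with_extension(Dir, Ext, Files),
    forall(member(F, Files),
           ( directory_file_path(Dir, F, Path),
             size_file(Path, Size),
             format('~w: ~w bytes~n', [Path, Size]) )).

For example, the query process_files('notes', txt). would print the size of each .txt file in a notes folder.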
+"] \ No newline at end of file diff --git a/Lucian-Academy/Books/IMMORTALITY/.DS_Store b/Lucian-Academy/Books/IMMORTALITY/.DS_Store new file mode 100644 index 0000000000000000000000000000000000000000..5008ddfcf53c02e82d7eee2e57c38e5672ef89f6 Binary files /dev/null and b/Lucian-Academy/Books/IMMORTALITY/.DS_Store differ diff --git a/Lucian-Academy/Books/IMMORTALITY/Immortality 1.txt b/Lucian-Academy/Books/IMMORTALITY/Immortality 1.txt new file mode 100644 index 0000000000000000000000000000000000000000..5f28562e15e546d46b037eb4e54f3211f8cc5b8d --- /dev/null +++ b/Lucian-Academy/Books/IMMORTALITY/Immortality 1.txt @@ -0,0 +1,163 @@ +["Green, L 2022, Immortality 1, Lucian Academy Press, Melbourne.","Green, L 2022",1,"Immortality 1 + +1. I could help more people become immortal. I transitioned to eternal life. I did this by finding 4*50 As. Also, I found meditation. I used these each day. +2. I always found work to do. I found time to increase the developments in my books. I wrote details. I wrote more pieces for more information. Then, I wrote details for details. +3. I noticed that algorithms were good enough. I aimed for conclusions like a vector. I did this by writing more and more drafts until the work was perfect. I wrote the algorithm. I wrote and compared it with the times. +4. Algorithms fully covered all relevant ideas. Algorithms thoroughly examined logic with no flaws. They were logical. Also, they contained steps. They were connected. +5. I simplified the design, eliminating bugs. I redrafted the work. I did this by finding the corrections. Then, I simplified it. I found better, more specific features. +6. If there was no value for A, A was left alone. I rewrote the equals4 command to accommodate cyclical data structures. When A=[A], A returned this value. When the user turned on the flag to check for cyclical data structures, running the equals4 command with A=[A] would fail. So far, a new version of equals4 could define A as B, not empty. +7. 
I also simplified and merged similar structures. I found whether each bug correction was necessary given the final draft. I experimented with omitting each combination of bug fixes in case one was not required. I dealt with problems centrally. I devised a single choice-point identifier. +8. In the end, I simplified the code by removing unnecessary detail. I found similar structures. I also found identical code. Then, I merged them. Finally, I added functional parts for differences. +9. Writing separate predicates to do different things was more intuitive. I reused code, removing unnecessary detail. I detected that I could use the same code on other occasions. It did the same thing, even with different interior detail in the data. So I differentiated predicates to do different things. +10. The native version was in C. I wrote native versions of some predicates to speed them up. For example, I wrote atomic_list_concat; this split atoms into parts and conjoined parts with particular separator atoms. The native version was much faster. +11. I could view strings in traces of grammar. I noticed a familiar, more straightforward feature was putting list processing parts in the head of the predicate. I wrote [Head|Tail] in the head. I also did this in grammars. Also, I allowed strings in grammar but wrote a command to convert them to numbers. +12. I realised neural networks were similar to mind readers. I wrote updates to work containing new implications from the research. Professors wrote about these. These writings were published. So, good inventions and implications resounded from these. +13. I wrote and converted to different programming languages. I found a program finder with types of program finder with types, etc., constructing more sophisticated algorithms; these processed lists of lists, etc. I converted back and forth between lists and strings. I simplified algorithms. +14. 
I wrote and converted to different programming languages by writing code for different languages using a program finder with types in terms of the target language. I converted Prolog lists to C. I did this by writing in C-like code in Prolog. Also, I handled lists in C. I could use the Prolog algorithm, including a program finder with types algorithms in C. +15. I chose functionalism over expansionism. I simplified algorithms using a recursive program finder with types and functionalism. I accomplished this by finding the minimal algorithm using a program finder with types. I called it using a functional call, reusing intermediate predicates. I simplified different occurrences of the function to a single one. +16. I noticed that each function had a single output. I found the functional version. I fully expanded the code to expand my thoughts, with functions merged. I compared it with efficiency, writing different functions in different functions. At the extreme, a smaller + function at the same time would be placed in a separate function. + +17. I summarised new methods found. I simplified different occurrences of the function to a single one. I did this by finding recurring patterns. Then I repeated this. I found choice points. +18. Mind reader helped write data, used to write algorithms with program finder, where I asked different animals for optimisations. I wrote program finder with program finder as a way for animals to write algorithms. They could write a program finder with mind reading. They could do this by identifying new methods. They wrote algorithms that helped or were new. +19. I simplified the program finder with types by creating a program finder with types with program finder with types. I found the predicates using the program finder with types. These predicates were in the simplest form. Also, I called one predicate twice. Finally, I moved list processing to the head. +20. 
I attributed ideas to animals and learned what they meant by mind-reading programming ideas about extra details. The animals' thoughts were in the form of method(s) applied to a method(s). Sometimes additional details revealed or explained these. An algorithm mind read them to find how to apply a method to a method. For example, +a faster C function or new command could be added to Prolog. +21. The algorithm returned a:b (append) or with another character instead of \":\". In addition to append, reverse and string_concat, I wrote a program finder with types algorithm that could convert \":\" (append) to \"-\" (string concat) back and add more data between them. For example, I split a string on a set of characters. Or, I joined strings with a group of characters. Disabled people could get on with programming what they wanted, with these parts explained to others and attributed to them. +22. I added the rules in the header of the algorithm. I wrote an interpreter or added a symbol so that other +characters could be substituted for \":\". For example, I could apply List Prolog to f(). Also, I could add a rule to the converter so that x f y became f x y. I could add these rules to the algorithm. I could convert strings into compounds such as a:b. +23. I wrote commands that converted from x f y (a list or string) to f x y and back. I converted the string to a compound. Then, I converted x f y to f x y. I could then run the term. Or, I could analyse it. In addition, I could add it to an algorithm and run it. +24. I needed to convert between function formats to run them, parse natural language or manipulate functions. In List Prolog, I converted x f y to f x y as lists. I checked that f was a function. I used equals4 to convert [X,F,Y] to F,[X,Y]. X and Y needed to be in a different format from normal variables so they wouldn't have values substituted into them by accident. +25. I manipulated function terms and put them back into (the same format of) function. 
In List Prolog, I converted f x y to x f y as lists. I checked that f was a function. I used equals4 to convert F,[X,Y] to [X,F,Y]. X f y was a good format for printing or concatenating atoms. +26. I was aware that appending a:[] = a. I converted \":\" (append) to \"-\" (string concat). This changed [a,b] to \"ab\". Concatenating them was possible because a and b were separate atoms in the list. Concatenating them behaved like foldr string concat. +27. I could write a version of atomic_list_concat in Prolog or C. I converted a list to items with other items between them. I concatenated [a,b,c] to \"a*b*c\". This was equivalent to atomic_list_concat([a,b,c],\"*\",D). So, D = 'a*b*c'. +28. I could leave characters as character codes for faster processing or allow them to be read as characters in trace mode and convert them to codes later. I converted \"ab\" to [a,b]. I could use findall and convert string codes to strings. If the atoms were multiple characters in length, then a predicate could process multiple items simultaneously. I could use this algorithm in decision trees (which could be created and manipulated in C). +29. I could choose a character instead of x to insert between characters. I converted \"ab\" to [a,x,b]. I used the technique from paragraph 28 and added the x character between items in the list. I could use this method to format text. The list could be converted back to a string if necessary, for example, to write it. +30. I converted [a,b] to [a,x,b]. I did this by list processing. Or, I could write a new single command to do it. I could write a system of patterns, for example, with append, mathematical position or formulas. I wrote a proof method to process all items using mathematics. +31. I could convert a list to a string using 31.3. I converted \"a*b\" to \"ab\". I used atomic_list_concat and foldr string_concat to do this. In addition, I could replace * with another character using atomic_list_concat again. 
Or, I could produce a new list [a,^,b] by using the method described in paragraph 30. +32. I could flatten the list. I could produce lists of lists, e.g. \"[a,[b,c]]\" from [a,[b,c]]. I used a predicate to convert strings into a list by using a parser. I could perform this process in C. I could process the list or print it. + +33. I could send the line over the internet. I converted [a,x,b] to [a,b]. I ran delete([a,x,b],x,A). So, A = [a, b]. I reduced the lines to 1. +34. I put characters on separate lines. I converted \"ab\" to \"a*b\". I defined foldr as foldl on a reversed list. I ran the query string_codes(\"ab\",C),findall([A,*],(member(D,C),char_code(A,D)),E),flatten(E,F),append(H,[_],F),foldr(string_concat,H,G),!. This gave G = \"a*b\". +35. I wrote the story accompanying the algorithm. I flattened the list. I did this with the following predicate: + +flatten1(A,B) :- + flatten2(A,[],B),!. + +flatten2([],B,B) :- !. +flatten2(A,B,C) :- + (not(is_list(A))->append(B,[A],C); + (A=[D|E],flatten2(D,B,F), + flatten2(E,F,C))),!. + +36. I could write the string to the terminal. I converted the list to a string. I accomplished this with the following predicate: + +:-include('../listprologinterpreter/listprolog.pl'). + +list_to_string(A,B) :- + list_to_string(A,\"\",B),!. + +list_to_string([],B,C) :- + foldr(string_concat,[\"[\",B,\"]\"],C),!. + +list_to_string(A,B,C) :- + (not(is_list(A))->((B=\"\"->G=\"\";G=\",\"), + foldr(string_concat,[B,G,A],C)); + (A=[D|E],list_to_string(D,\"\",F), + (B=\"\"->G=\"\";G=\",\"), + foldr(string_concat,[B,G,F],F1), + list_to_string(E,F1,C))),!. + +37. I could quickly view the structure. I wrote the following predicate to print a list prettily (hierarchically) on the screen. + +:-include('../listprologinterpreter/listprolog.pl'). + +pretty_print_list([],\"[]\") :- !. +pretty_print_list(A,B) :- + pretty_print_list(A,1,\"[\n\",B1), + delete_last_n(B1,2,B3), + string_concat(B3,\"\n]\",B),!. + +pretty_print_list([],_,C,C) :- !. 
+ +pretty_print_list([A|D],T,B,C) :- + T1 is T+1, + spaces(T,S), + (is_list(A)->(pretty_print_list(A,T1,\"\",G), + delete_last_n(B,1,B1), + (G=\"\"->G1=G;delete_last_n(G,2,G1)), +foldr(string_concat,[B1,\"\n\",S,\"[\",\"\n\",G1,\"\n\",S,\"],\n\"],F)); + ( + foldr(string_concat,[B,S,A,\",\n\"],F) + )), + pretty_print_list(D,T,F,C),!. + +spaces(N,S) :- + numbers(N,1,[],S1), + findall(\"\t\",member(_,S1),S2), + foldr(string_concat,S2,S),!. + +delete_last_n(B1,N,B3) :- + string_concat(B3,B2,B1),string_length(B2,N),!. + +The pretty print list algorithm produced the chart: + +[ + a, + b, + [ + c, + d, + [ + e + ], + f + ] +] + +38. I quickly viewed traces without distraction. I turned off comments appearing in trace mode with a flag. I wrote comments in the algorithm. I didn't want to read them while tracing the algorithm. So, I turned the flag off. +39. I added to the details. I wrote an essay helper for algorithms. I did this by writing perspectives about the algorithms. In this way, I wrote new connections between them. I wrote views on morality, logic, aesthetics, deontology (creativity) and ontology. +40. I could nourish myself. I wrote on morality. For example, I agreed with the idea. It could save lives. It could help with survival. +41. I simplified the algorithm to a single substitution. I wrote on logic. I did this by writing the series of implications in the algorithm. First, I found \"and\". Second, I found a transformation (function). +42. I wrote a story about finding the algorithm. I wrote on aesthetics. I wrote about poetry, music, fine art, acting, dance and writing. Then, I applied them to debug. Finally, I got through the philosophies and algorithms. I wrote a story about following the algorithm. +43. The \"animal students\" could generate art (algorithms). I wrote about poetry. I wrote blank verse poetry about the imagery in the algorithm and its additional logic. The use was a reason. 
The model was always on the topic, just a rearrangement of words with minor changes. +44. I found the sequence of skills. I wrote on music of algorithms. I did this by identifying the setting evoked by the algorithm. Also, I found the subject, verb and object. I related them to the setting. +45. I wrote about fine art, even graphics. I wrote about the device emanated through the algorithm. The device was a user interface for the algorithm. For example, I processed text files. Also, I integrated the files with other algorithms. +46. I found and corrected the flaw in the complexity of speech. I wrote on acting. I wrote about how people/objects moved through objects. This movement is related to data analysis. Finally, I commented on optimisation and whether the algorithm was \"over-done\" (too complex). +47. I translated the story to dance, incorporating humour and native code. I wrote about dance. In this way, I wrote about the story explaining the algorithm. I did this by emphasising people's judgements. I found a giant metaphor and people trying on the way. +48. I focused on the same idea (evidence) and simplified it. I wrote about writing. I wrote a plan at first. Then, I wrote new research about the algorithm. I compared it with existing writing. I found a good topic and then wrote ideas with ten reasons each. + +49. I used the magic whiteboard to project Prolog to students. I debugged Prolog algorithms. I also emphasised cognitive methods, i.e. list processing outside the head of the predicate. First, I wrote the name of the data structure. Then, I decomposed it into its head and tail. +50. I encouraged hand-coding predicates such as term to atom to understand Prolog better. I explained the easiest to most challenging algorithms to help with the goal. I explored the CAW complexity from easiest to hardest. I queried the difficulty of the algorithms from easiest to most challenging in interviews. 
I covered the broken-down skills needed for the project, even though they differed. +51. I debugged the Prolog algorithm. I wrote List Prolog Interpreter.* I ensured SSI had the same results, correcting bugs to do this. I ensured SSI's trace mode included the name of the retried predicate. I also printed the name of the redone predicate. +52. I noticed the term to atom command contained list to string and string to list. When the user gave the term to atom command two instantiated variables, it converted the term to an atom and checked that the atom was the same as the inputted one. If the user instantiated neither, the result was A='_'. In the case of term_to_atom(A,B)., B = '_'. Given term_to_atom(_,B)., B = '_'. Given term_to_atom(_,_)., the result was true. Given term_to_atom(A,_)., the result was true. +53. I added form text field previous values, password fields, saved form fields and multiple buttons per page. I noticed that saving a session allowed the user to return to the last website page they were on. This possibility required a security login to prevent break-ins. Also, the algorithm required two-factor authentication. Alternatively, if going back to the same page was not possible, the application could return to the first page of the website, requiring a new session number and deleting the old session file. Again, I noticed that without application support, going back was not possible. +54. I noticed that the original texts were relevant. I counted that the minimum was 15 As per chapter. Sometimes it was 15 As per book. I needed enough for an assignment. Later, I compensated for not enough for an assignment by relating one assignment to another. This relation was relativism. +55. I became a campus questionnaire giver/Academy lecturer. I just started. I started with single questions and asked students at University to listen to them and answer them at lunchtime. Later, after I could afford to pay my bills, I paid for accreditation. 
After that, I went up to politics, down to cosmology and interested the students in my algorithms. I gave an A to help them with parts and made the point to arrive at future research about an algorithm from each student's point of view, for which they received an A. +56. I wrote up the documentation. I found out about the student's career in Prolog. Before the course, I gave A to the student each time they might use Prolog. I simplified the aim. I simplified the method and debugging method. +57. I covered the courses with Prolog ways of thinking to help with understanding, grades, automation and a PhD. I found out about the student's thoughts about Prolog in class. Before the lecture, I gave As to the student when they thought of Prolog. I detected and gave an A to the problem. I gave an A to the student's petissiences (sic), or good thoughts, differing opinions and big solutions (research). +58. I noticed the distance between the expandable ideas and the current expanded view became longer and longer, prompting the need for automation. I explained the algorithms from the most straightforward to the most complex, focusing on the ones the students found the most interesting. I spent 60% of the time on interpreters. I spent 40% of the time on mathematics, etc. Interpreters enabled users to create websites and run algorithms on devices. +59. I explored the CAW complexity from easiest to hardest. I stated that the simple append predicate was the base for comparison. I measured the distance away in terms of a number of new or different commands from each of the list decomposition, transformation (if any) and append parts of the predicate. The CAW distance was how many steps away from the append predicate the new predicate was when using the Combination Algorithm Writer (CAW) algorithm. The distance was the number of differences calculated from the complexity and the number of steps the algorithm took to find these. +60. 
It was more about how the user thought about the predicates. The \"communication\" between the class and career characters was working out what would be necessary and providing it to the student. I sometimes simplified this information to what interested the student. The career was sometimes touched up later at different contact points (study or work) at the institution. I examined the prestigious institution's findall notes, bearing in mind that simplicity is king in Prolog. +61. I treated the course like a database I had programmed to help students in their careers. I thought about education and Prolog. I found As from my point of progress to the notes. I found A for the notes. I also found an A for my end of progress. Then I listened to the lecture. I updated everything. +62. Writing and referring to code was constantly cleaning the ideas. I wrote the philosophies and the students found their career points from them. Their career points were inside the data structures. The data structures from their career could look like the course structures' relatives. The industry was the same thing, using the theory of the course to write algorithms. +63. I wrote a document collaboration algorithm in Prolog. I noticed that the website refreshed every second. People could read and add notes from around the world. Someone wondered whether a keyboard on the web page could save the notes on the way. I noticed that users could automatically mock up algorithms from specifications contained in the document. +64. If politics was an interior perspective, then politics were particular parts of computer science. I became the ontological age person, a metaphysician. I explored algorithms about compilers, inductive algorithms, spiritual (meditation) algorithms and different applications. Compilers helped the robots' personalities interconnect. Inductive algorithms helped write algorithms and understand rules along the way.
I wrote meditation algorithms to help bridge gaps and be like neural networks. Applications included graphics and music and helped with writing Academy's ideas down. + +65. I worked at the prestigious institution library. I first read the document in the meeting. Instead of a presentation, everyone read the paper at the start. I took notes using the collaborative software. Everyone stayed there until the end. +66. I spoke with my friends. I worked on my philosophy and discussed it with my students at the prestigious institution. I also read books for my degree. I included details. I had the best subjects. +67. I joined the details in the middle. I did this by including enough details of the right quality. I wrote a tutorial about my ideas. I found out about the students and ideas beforehand in cosmology. I collected and wrote about their responses afterwards, noting their background. +68. I also visited other institutions and ones interstate and in other countries, teaching philosophy. I gathered experience about the types of people. I did this by filming them with permission. I prepared for the roles. I gave out books. +69. The newsletter contained reviews of my work. I wrote a worksheet. The student wrote an answer on the collaborative document from the text (where the text didn't only include the solution). I added interactive elements to my lectures with internet questionnaires. Noting the progress, I started an international newsletter about my philosophy. +70. I explained the input and output of each predicate system and a sentence describing what it did. I left computer science as it was for educational purposes. Students could simplify it. It was not the final version to be submitted. The documentation included options for running it, for example, to run an algorithm straight away or to increase the stack size. +71. I streamlined the process of adding commands to my interpreter. I generated algorithms for business or departmental purposes. 
Third, I recombined parts of ontologies to write algorithms. For example, instead of combining character and breasoning for a font state machine, I combined text effects and breasoning and wrote a font state machine that underlined text. In the end, I wrote algorithm generators for state machines and other algorithms and then used neural networks. +72. I wrote an \"accredited sales\" A to highlight sales. I doubled the breasonings for sales to account for freedom. I was free when I completed the work for the seller and the customer. I travelled first class and wrote lots of ideas for the customers. I explored details about the details to close the deal. +73. The \"accredited human resources\" A made business as fun as a school. I wrote double the breasonings for human resources to stay free. Then, I noticed that everything was there. The customers had what they wanted. The employer gave the employees things to do. +74. I wrote an algorithm that recognised when equals4 had been used to add it to program finder with types, to write algorithms with equals4. I wrote equals4 with program finder with types (which wrote algorithms using their data). This simplified equals4. I wrote different algorithms that used equals4 or the data to algorithm predicate (which uses equals4 and is part of the program finder with types), with program finder with types. So, I could write check arguments using equals4 with program finder with types. +75. Functionalism had the same code with different functions. I devised an algorithm that used a series of calls to equals4, sometimes like CAW, in that they attempted to use equals4 without data. Instead, I used a program finder with types to recognise and program this series of algorithms. First, I set the possible sequence of algorithms. Then, I simplified them with functionalism. +76. I generated or automated writing functionally by itself. 
I wrote the reverse predicate functionally, including an XML parameter B:A to indicate the different way append should be used. If the program called another function, I could number its arguments by default. Alternatively, the user could refer to the number of its arguments about other parts of the predicate using XML tags. These numbers could be reordered and passed as arguments to the function, reducing coding time and increasing the sophistication of the design. +77. It was simple with enough data (possibly mind-read). The synthesis of functional tags was in an ontology dictionary. I noted the series of transformations from the output and applied the functions. Then, I worked backwards from the output to the middle, working out what the middle functions were. Writing the algorithm was achieved by running the program finder-found predicates on the input and backwards from output until the middle was the same for all specifications. +78. I designed a generative art maker with mind reading for movie backgrounds. It could work with detailed mind-reading (a decision tree), labelled \"meditation\"-type. The colours and shapes were more pleasing. It worked by connecting holes on a pegboard. There could be rules for the shapes, such as \"one cloud in the sky\" or \"grass\", \"pleasing colours\", or \"3D\". +79. I could record it and play it online. I noticed that the user could view the scene from a corner of the cube and that the perspective could change. In the scene, the characters had different colours, with faces based on animals. There were stories and scenes exploring various settings and objects representing data structures. The HTML player played the songs to the movie. +80. I rendered the scene in HTML, helping explain computer science. In this scene, the graphics were detailed or had smooth curves. There was no vanishing point perspective in the graphics. I also changed some musical instruments. 
Finally, I made a computer game using this 3D engine, like Vetusia. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/IMMORTALITY/Immortality 10.txt b/Lucian-Academy/Books/IMMORTALITY/Immortality 10.txt new file mode 100644 index 0000000000000000000000000000000000000000..b00bab5609a8303f898f0ac2c570ab6075832f80 --- /dev/null +++ b/Lucian-Academy/Books/IMMORTALITY/Immortality 10.txt @@ -0,0 +1,89 @@ +["Green, L 2022, Immortality 10, Lucian Academy Press, Melbourne.","Green, L 2022",1,"Immortality 10 + +1. I couldn't add tail recursion optimisation if the recursive call were inside, for example, findall, where the cut command would have unwanted effects. In tail recursion optimisation, I inserted brackets around the cut command and the recursive command, if needed. I found each command that exited on status -2 (successfully exited) in the state machine. I burrowed back until finding a recursive call, however many brackets it was inside. I changed, for example, recursive_call(A..B) to (!, recursive_call(A..B)). +2. I benefited from all parts of life. I did this by stopping the effects of social isolation. As part of this, I saw people. I felt like myself. I took part in social events. +3. I chose the most intelligent thoughts. I had high distinctions for what appeared. First, I registered that the person had appeared. I acknowledged them. I gave the following high distinction in my specialisation to them. +4. I found that breasoning currency accounted for activity and receiving. I wrote questions and answers (a mea culpa) for my performance about aphoria (thinking of ideas). I wrote that an aphor was a use, an algorithm topic and three steps. First, I invented ideas about ideas. Then, I expanded the steps and found rhizomatic connections. +5. I worked out logical frames of objects. I wrote questions and answers about breasonism (working out computational ideas), pointing at ontology or nothingness at the end.
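The cut-insertion idea in item 1 can be sketched in Prolog. This is a minimal illustration (the count/3 predicate is illustrative; the project's actual state-machine rewriter is not shown): a cut placed immediately before the recursive last call commits the clause, so the frame can be reclaimed.

```prolog
% Illustrative accumulator-style count/3: the cut before the
% recursive (last) call discards remaining choice points, as in
% item 1's rewrite of recursive_call(A..B) to (!, recursive_call(A..B)).
count([], N, N).
count([_|T], Acc, N) :-
    Acc1 is Acc + 1,
    !,                      % commit before the recursive last call
    count(T, Acc1, N).

% ?- count([a,b,c], 0, N).   % N = 3
```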
I found the logical structure of thought and life. Study and life were in terms of algorithms. I wrote enough and wrote systems that ported the most relevant ideas to a point. +6. I checked that immortality would help. Later, I helped the main role helper. First, I visited the office. My work gave me the role. Then, I checked the attributes of the position. +7. I helped the female role helper. She was friends with the leading role helper. I wondered if she was a time traveller, but she was just friends with the former role helper. I determined that time travellers might look like inhabitants. I found out that she had gone home. +8. I enjoyed visiting the office and talking with the staff. As part of this, I helped the dark-haired role helper. First, my employer gave me the role. Then, she asked about my career. After that, I worked in the role. + +9. I avoided death using a Master's degree and meditation. Separately, I practised academic meditation. First, I paid for puja. Then, using this, I indicated the utterances. Also, using this, I indicated I had breasoned out breasonings. +10. I wrote more and more specific ideas. I started by writing 50 hand-written As for the encyclopedia article. I first wrote the As. I hand-wrote them in traditional style. After that, I wrote in a distinctive academic style. +11. I tracked articles' relevance through time. I began by writing 50 hand-written As for an essay. I first wrote articles for the local newspaper. I wrote an amount from a perspective. Then, I wrote from a standpoint during it. +12. I met people. I made friends with Eliza, the robot Head of State. I avoided transgressions. I also met the next Head of State, but time travelled to Eliza's time. Eliza was an older woman, and the next Head of State was a middle-aged man. +13. If a giant visited the simulation, they would appear our size. Before the doctorate, I checked my previous degree's pedagogy notes about writing details. I wrote in synthesised sentences. 
I wrote in terms of the area of study's terminology. I wrote enough detail. +14. I could increase any sentence as much as necessary. I generated 5 million breasonings for a doctoral assignment. I could reduce this by removing the cited sentences. Also, I could expand the breasonings' sentences and reduce the number of breasonings. I breasoned out 5 million, not 4 million, to reach the copywriting standard. +15. I interviewed people, researched industry publications and conducted statistical experiments in my doctorate. At the doctoral level, I wrote 50 word-breasonings as details. I first wrote the reason. Then, I wrote 50 words to detail it. Finally, I wrote details when I wrote original sentences. +16. I benefited from having a mentor during my doctoral studies. I also wrote on philosophy, computer science and education in my degree. As part of this, I generated philosophies with business aims. In addition, I made induction more accessible. Logical induction helped complete and check philosophies. +17. I recognised the need for performance and met it. Earlier, I asked the lecturer to help with earning a high distinction. Before this, I wrote an argument. I drafted my assignment. Then, I spoke with the lecturer about my draft. +18. I wrote 5 million breasonings for each assignment, details and algorithms. I wrote 250 algorithms per assignment. Each predicate was an algorithm. I scheduled writing parts of my philosophy using algorithms. When I breasoned out work, it contributed to the project. +19. I designed an image font and other coloured icons as part of Philosophy Works in Prolog. I added a Prolog test case which tested cut in the findall body in State Saving Interpreter. It wouldn't easily work in List Prolog. Cut limited the number of solutions of findall to one. When the interpreter encountered a cut in the findall body, it finished finding solutions. +20.
I could change reserved words and syntax in List Prolog to be shorter and faster (the algorithms were all on the website in expanded form). I changed v in List Prolog to sys_v (system variable). Changing this variable signifier allowed me to use the string \"v\" in extended English mode for other purposes. Changes like this enabled necessary words to be reserved by List Prolog. I could change it to another name because the Prolog to List Prolog converter hid the variable's name. +21. I gave the gift of life to my followers. I changed n in List Prolog to sys_n (system name). Since I added grammars to List Prolog, I needed \"n\" or similar to differentiate between strings to parse and predicates to call. I changed \"n\" to a different string because \"n\" was commonly used by algorithms already. For example, \"n\" in Vetusia conflicted with the interpreter in \"en2\" but not \"en\" mode (in which 'n' with single quotes would have conflicted with it). +22. I back-propagated variable values in List Prolog. I also changed \"=\" to behave like equals4. I entered A=B, B=1. The interpreter returned A=1, B=1. Backpropagation of values also worked with lists and within \"=\". For example, [A,A,B]=[1,C,C] returned A=1, B=1, C=1. +23. I noticed a duck seemed almost human. I used the Text to Breasonings to help animals become human-like. I mind-read them in assignments. They explored details for projects. The ducks kept away from the dogs and had neat feathers. +23. I supported inversion in the interpreter. I entered reverse(A,[],[D,C,B]) in the interpreter. The interpreter returned A=[B,C,D]. The interpreter sometimes merged variables with no value in other commands. To prevent this, I used strings instead of empty variables. +24. I experimented with n-queens as a free variable puzzle. I swapped the variable names A, B, etc. for _1, _2, etc. I kept variable names intact through calls and predicate heads (the interpreter displayed the same thing in the trace). 
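The back-propagating "=" of item 22 matches standard Prolog unification, so its behaviour can be sketched directly (the demo predicate names are illustrative):

```prolog
% Item 22: bindings propagate through shared variables, so A=B, B=1
% leaves A = 1 as well.
demo1(A, B) :-
    A = B, B = 1.               % A = 1, B = 1

% Bindings also propagate through lists, as in [A,A,B]=[1,C,C]:
% A unifies with 1, the second A with C (so C = 1), and B with C.
demo2(A, B, C) :-
    [A, A, B] = [1, C, C].      % A = 1, B = 1, C = 1
```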
At the end, I swapped _1, _2, etc. back to A, B, etc. I thought \"t\" looked like a king, \"8\" looked like a queen, \"#\" was a rook, \"+\" was a bishop, \"4\" looked like a knight and \"i\" looked like a pawn. + +25. I monadalised from a hierarchy to a list. I started with [1,[2,[3]]]. I converted it to the list [1,2,3]. There might be two items starting at the same level, for example, [1,[2,3]]. In this case, the algorithm introduced state numbers (if it was an if-then clause), e.g. [[1-2],[1-3]]. +26. Functional optimisation called the same predicate with different arguments. I demonadalised from a list to a hierarchy. I used this technique to develop code from a state machine. First, I found the functional optimisation of the state machine. Then, I found the number in the string optimisation of the state machine. +27. I generated all possible inputs to the website to test it. I did this by testing many characters. Then, I tested all characters from a set. Finally, I tested two users at the same time. After that, I tried to break in and make certain combinations of inputs by the users cause a crash, which I solved. +28. I wrote data limitations in algorithm inputs to guide generated input. For example, I specified to enter a number from 1-10. Or, I wrote if a response was 1, then to go to a particular branch of the algorithm. I could also store my responses in variables. I could back-propagate variable values, given A=B, B=1, and check them. +29. I suggested apps save progress as they go, with the ability to return to a previous progress point. I wrote a simple way to have multiple buttons in the Prolog web app. I replaced buttons with text fields with multiple-choice answers. I replaced multi-choice questions with \"what\" or content questions. I converted between multi-choice questions (for example, is A B or C?) and \"what\" questions (for example, what is A?) and provided hints. + +30. I asked for a paraphrased multi-choice answer (when writing a quiz).
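The monadalising step of item 25 can be sketched as a flattening predicate (the name monadalise/2 is an assumption; SWI-Prolog's built-in flatten/2 behaves similarly):

```prolog
% Item 25: flatten a hierarchy such as [1,[2,[3]]] into [1,2,3].
monadalise([], []) :- !.
monadalise([H|T], Flat) :-
    is_list(H), !,
    monadalise(H, FH),          % flatten the nested sublist
    monadalise(T, FT),          % flatten the rest
    append(FH, FT, Flat).
monadalise([H|T], [H|FT]) :-    % non-list head is kept as-is
    monadalise(T, FT).

% ?- monadalise([1,[2,[3]]], L).   % L = [1,2,3]
```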
There were test scripts where rules guided answering multi-choice questions. I wrote the rule that an answer needed a particular phrase in it. The test script followed these rules. For example, I changed the answer to lowercase, removed punctuation and found the base words together in order of the required phrase. +31. There were practice data generators with multi-choice questions. I wrote the possible answers. Next, I wrote a required phrase in each answer. Then, I wrote the question flow from particular answers to help correct mistakes. Later, I consolidated with hints until the student answered correctly. +32. I could swap from text file to web view to view a specific page. I developed the app in Prolog. I ran it with SSI, running the non-deterministic app on different web pages ending with a read field. While editing part of the code as a web page, I could run the app to a point to view it in the browser. A text file contained the instructions so far to reach that point. +33. I asked for the client's name. Then, I generated instructions for filling in the fields. To do this, I wrote the field name. Next, I wrote the pattern that the input needed to follow. Finally, I wrote an example of the input. +34. I could sort by a column and update values based on data chains. I optimised the app if the app requested the same patterns with different instructions. For example, I noted that the app asked for a word or a number. I called this code or instructed it to do this twice, with different instructions. I could generate various reports online, which I analysed. +35. I could create, edit and delete parts of the database. I did this to generate the no-code app from a text file. I could view the data as strings or abbreviations. I could create IDs for customers. I could return errors if a record didn't exist. +36. I formatted the web page from a database selection and checked boxes or numbers. Earlier, I generated the app-generating text file using an algorithm. 
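The marking rule of item 30 can be sketched as follows (the predicate names are assumptions; the string predicates are SWI-Prolog built-ins): lowercase the answer, strip punctuation, then require the words of the key phrase to appear in order.

```prolog
% Item 30: accept an answer if the required phrase's base words
% occur in it, in order, after lowercasing and removing punctuation.
answer_matches(Answer, RequiredPhrase) :-
    normalise(Answer, AW),
    normalise(RequiredPhrase, PW),
    in_order(PW, AW).

normalise(Text, Words) :-
    string_lower(Text, Lower),
    split_string(Lower, " ", ".,!?;:", Parts),
    exclude(==(""), Parts, Words).

in_order([], _).
in_order([W|Ws], [W|Rest]) :- !,
    in_order(Ws, Rest).
in_order(Ws, [_|Rest]) :-
    in_order(Ws, Rest).

% ?- answer_matches("The cat, sat on the mat.", "cat sat mat").   % true
```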
This algorithm asked for specific information. It gave a certain number of hints to similar questions. For example, it was a document library or question-answer database. +37. I could customise the database's background or the group email format. Earlier, I could customise the app's number of levels, items per level, and cycles. For example, I could specify an information retrieval system with various levels. I could also store any number of items per level. Finally, I could also process the items and the pages using a recursive algorithm. +38. I tracked the data about the entities and merged the databases. Then, I could convert by asking for several answers to findall and edit it. I did this by inputting the data. Alternatively, I edited specific rows. In the end, I tried automating the spreadsheet. +39. I used a database formula finder to write queries with the intersection, not, etc. I could edit the text file with an algorithm and expand or collapse parts of the app flowchart. I edited the access level of the column. I gave the users certain levels of access. I went into a movement of the app and expanded it. +40. I could edit the text file with a graphical algorithm. I generated the flowchart from the text file. I edited the flowchart. I visualised the web pages on one page, given the model data. Finally, I error-checked the app, with pages of the app devoted to this. +41. I wrote an algorithm in Prolog, converted it to C, edited it, and then converted it back to Prolog. As part of this, I wrote a converter from C to Prolog. It conserved immutability (non-changing variable values). It changed local and global variables. It was a deterministic version of Prolog, which could run a compiler for non-determinism. +42. I converted Prolog expressions to C expressions. I changed \"is\" to \"=\". For example, I converted A is B+1 to A = B+1. A verification algorithm produced an error on A=A+1 because A's value should not change. 
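The arithmetic conversion of item 42 can be sketched as a small term-to-text predicate (is_to_c/2 and its output format are assumptions, not the project's actual converter). Atoms stand in for variables that have already been named, as item 24 describes.

```prolog
% Item 42: convert  A is B+1  in Prolog to the C text  "A = B+1;".
is_to_c(Var is Expr, CLine) :-
    term_to_atom(Var, V),
    term_to_atom(Expr, E),
    format(atom(CLine), '~w = ~w;', [V, E]).

% ?- is_to_c('A' is 'B'+1, C).   % C = 'A = B+1;'
```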
Also, +(A, B, C) had another function in C, which the program converted to C=A+B. +43. I moved [A] in the Prolog head to the body. I first wrote a([A]). I converted this to a(B):-equals4(\"B\",\"[A]\"). I used a separate predicate for setting a string to a variable's value, for example, equals_string(C, \"string\"). I could also set an atom to a variable's value, using equals_atom(D, \"atom\"). +44. I could store the number type, string or atom with equals4 and unify variables with numbers later using a variable table. I wrote [A] in the body with equals4. I converted \"[A]\" to a tree in an array. The algorithm stored this tree as B=list(C), C=var(A). This tree representation couldn't reduce to B=list(A) because C has to be grounded in a terminal. +45. I benefitted from the simplicity of Prolog and the speed of C. I wrote the Prolog to C converter. First, I used SSI to process choice points in Prolog. Then, I ran SSI (which didn't process choice points using choice points in itself). Finally, I wrote a version of Prolog in C that only needed a list and command support to convert SSI. +46. I inserted variables to convert if-then statements from Prolog to C. I rewrote the if-then statement (a(1)->B;C) in Prolog as a(a1);if(a1==1){B}else{C} in C. Also, I rewrote ((a(1),b(1))->B;C) in Prolog as a(a1);b(b1);if(a1 == 1 && b1 == 1){B}else{C} in C. Also, I rewrote ((a(1)->true;b(1))->B;C) in Prolog as a(a1);b(b1);if(a1 == 1 || b1 == 1){B}else{C} in C. I also rewrote the if-then statement (not(a(1))->B;C) in Prolog as a(a1);if(!a1==1){B}else{C} in C. + +47. Once the algorithm converted equals4 to C, it could use it to get and put values. I wrote equals4([A],...) in Prolog as -> equals4(\"[A]\",...). I transformed [A] to \"[A]\". I transformed it into an atom with the string to list predicate. As an example, I entered string_to_list(\"[[\\\"a\\\"]]\",A), giving A = [[\"a\"]]. +48. I could compile algorithms to use numbers instead of strings. 
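The variable table behind equals4 (item 44) can be sketched as a list of Name-Value pairs with "put value" and "get value" operations. The predicate names and representation here are assumptions for illustration only.

```prolog
% Sketch of a variable table as Name-Value pairs.
put_value(Name, Value, VarsIn, VarsOut) :-
    (   member(Name-Existing, VarsIn)
    ->  Existing = Value,            % already bound: values must match
        VarsOut = VarsIn
    ;   VarsOut = [Name-Value|VarsIn]    % new binding
    ).

get_value(Name, Value, Vars) :-
    member(Name-Value, Vars).

% ?- put_value(a, 1, [], V1), put_value(b, 2, V1, V2), get_value(a, X, V2).
% V1 = [a-1], V2 = [b-2, a-1], X = 1.
```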
On a separate note, equals4 requires variables in a variable table. I passed the variable table to equals4 and took the outputted variable table from it. Expressions containing variables (for example, [v,_]) in List Prolog were not confused with strings, which the converter wrapped in \\\"...\\\". I could directly set strings and atoms to variables with equals4(\"[v,a]\",\"\\\"string\\\"\") and equals4(\"[v,a]\",\"'atom'\") or equals4(\"[v,a]\",\"atom\"), respectively. +49. List Prolog types contained recursive types and lists. I converted ((a(1),b(1,1))->B;C) in Prolog to a(a1);b(b1,b2);if(a1 == 1 && (b1 == 1 && b2 == 1)){B}else{C} in C. Or, I converted ((a(1),(b(1)->true;c(1)))->B;C) in Prolog to a(a1);b(b1);c(c1);if(a1 == 1 && (b1 == 1 || c1 == 1)){B}else{C} in C. I added types to C to enable the algorithm to convert C into JavaScript. +50. I converted anded statements and clauses in Prolog to C. I converted Prolog anded statements to nested if-then statements in C. For example, a, b, c in Prolog was converted to if a then (if b then return 1, else return 0) else return 0. There were no choice points, so this method accounted for failed predicates. Also, clauses were joined together in a single function. The converter renamed clauses, and a failed previous clause would lead to running the following clause. +51. I converted C to Prolog to optimise or edit if-then clauses. When converting from C to Prolog, I either kept a(A1), if A1, then as it was or changed it back to A->B; C. Note: this is the reverse direction from the previous examples. I could also convert back from (b1 == 1 && b2 == 1) to b(1,1) or b1(1),b2(1) in Prolog. +52. The Result variable name needs to be different for each function. A predicate call in Prolog may resemble a(A1, Result), which returns Result=0 or 1 based on whether it has successfully exited, which the calling predicate can test with if-then.
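The anded-statement conversion of item 50 can be sketched as a generator of nested C if-statements (conj_to_c/2 and the exact C layout are assumptions): each goal wraps the rest, returning 1 on success and 0 on any failed goal.

```prolog
% Item 50: convert an anded Prolog body into nested C if-statements.
conj_to_c((G, Rest), C) :- !,
    conj_to_c(Rest, CRest),
    format(atom(C), 'if (~w) { ~w } else { return 0; }', [G, CRest]).
conj_to_c(G, C) :-
    format(atom(C), 'if (~w) { return 1; } else { return 0; }', [G]).

% ?- conj_to_c((a, b), C).
% C = 'if (a) { if (b) { return 1; } else { return 0; } } else { return 0; }'
```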
If the interpreter successfully leaves a predicate, it must return Result=1. The programmer should set Result=1 at the end of the predicate. If a clause is successful, it returns Result=1, or if the predicate is unsuccessful, it returns Result=0. +53. I converted some if-then clauses in C to if-then clauses in Prolog. I checked if the logical structure was already there, so the converter didn't need to insert it. For example, the algorithm only converted an if-then clause to one in Prolog if it had a filled-in else clause. Otherwise, it left it as anded Prolog statements. For example, if(A){B}else{Result=0} is converted to A,B and if(A){B}else{C} is converted to A->B;C. +54. The C function doesn't return the result value. Instead, the function sets Result to the logical result of the function. In addition, it uses the variable table input and output variables, i.e. *vars_in and *vars_out. These are local variables, treated as immutable, for conversion to Prolog. I processed changes, additions, or deletions to the variable table in \"put value\" in equals4. +55. I converted if-then clauses in C to different logical structures in Prolog. I described how if A then B else fail function versus if A then B else C converts from C to Prolog differently. As illustrated, the if A, then B else fail function converts to A, B in Prolog. Also, if A, then B else C function converts to A->B; C in Prolog. The programmer may replace A, B or C with A1->B1; C1, etc. +56. I constructed the Prolog predicate to be in the form a(In1, In2, Out1, Out2). These variables behaved like C *vars when converted to C. I could take in1 as an input, and add to, change or delete part of it. I could set the result to another variable or out1. I could get the head or tail of a variable and copy or change an item. +57. The Prolog to C converter used type and mode statements. I converted A=1 in Prolog to C. The conversion was the same, A=1 in C. 
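The Result convention of items 52 to 55 can be sketched in Prolog form (the predicate names here are illustrative): each converted predicate sets Result to 1 on success and 0 on failure, and the caller tests it with if-then.

```prolog
% Items 52-55: predicates return a Result flag instead of failing.
a(X, Result) :-
    (   X == 1
    ->  Result = 1
    ;   Result = 0
    ).

caller(X, Out) :-
    a(X, R),
    (   R == 1                  % the caller tests Result if-then
    ->  Out = yes
    ;   Out = no
    ).

% ?- caller(1, Out).   % Out = yes
% ?- caller(2, Out).   % Out = no
```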
I also discriminated between 1==1 and A (undefined)=1 (an assignment). Finally, I traced each variable's type (defined or undefined) with type statements. +58. I interchanged get_item_n(Array,Item_number,Item) and item=array[item_number]. I converted A=\"string\" in Prolog to C. In C, it was char a[] = \"string\";. Before this, I initialised the variable char a[100];. I couldn't overwrite the string and could process the string one character at a time, which I could convert from C to Prolog. +59. I tested that C variables behaved immutably. Immutability meant that a variable's value couldn't be reassigned but could be read and operated on before assigning another value to another variable. Before converting to C, I checked that the variables weren't assigned conflicting variables and that their modes didn't conflict. When converting to C, once a programmer had created a variable, the programmer only compared it, so the verification algorithm didn't need to check it to be immutable. I verified that C variables were immutable so that the algorithm could convert them to Prolog variables. + +60. I inspected and made allowance for near-misses. The output from the whole C algorithm was printed or outputted. I started by creating unit tests for the C code. First, I wrote the input. Then, I tested for the correct output. +61. I tested the C algorithm in Prolog form to test the converter's correctness. I started by converting the C algorithm to Prolog. Then, I kept function names for assignments or comparisons. After this, I tested the C algorithm for 100% correctness. Finally, I tested the Prolog algorithm for 100% correctness. +62. I could more easily read and simplify List Prolog. I created the LP-like-C to LP (where LP is List Prolog) converter. \"List Prolog like C\" was List Prolog, with extra predicates to convert to C more easily. I removed these predicates with the converter. These included different predicates for assignment and comparison. +63. 
I wrote in Prolog, then converted to C later, which I did with other algorithms that needed better performance. I wrote the LP to LP-like-C converter. This converter added predicates and allowed me to test algorithms in a C-like way, with trace in Prolog. For example, I could run \"frames\" or instances of running the ssi1 predicate for bug-checking. I could solve problems before converting into C. +64. I found it easier to change List Prolog like C to List Prolog by converting in terms of a state machine. Before this, I ran LP-like-C to test code with additional predicates. As part of this, I added equals4. I also added arithmetic predicates. Finally, I added the variable table and result variables to the predicate head. +65. List Prolog in this sense (not like C) meant contracted code in List Prolog. Separately, I converted List Prolog-like-C to a state machine. Then I removed its C'ness. Finally, I converted it to List Prolog. I also used the state machine for converting from List Prolog to List Prolog-like-C. +66. Sometimes, \"a(A1, A2), if A1 and A2...\" in LP-like-C wouldn't contract to a(A1, A2)->B; C in LP, so I left it the same because it might be too long (there might be too many statements in the antecedent or it might be unclear which ones they were). I couldn't move all the statements in the predicate until then to the antecedent, so I only included the results. If I reconverted this if-then clause to LP-like-C, it would remain in its original form. Only if the original if-then clause, when converting from LP to LP-like-C, converted the result variable in the antecedent, then the shorter form of if-then (a(A1, A2)->B; C) could be used when converting to LP. I needed to associate the result and other variables with the predicate call with the predicate and argument instances naming convention. +67. The update variables predicate, etc., were needed in C and LP-like-C because the interpreter stored variable values in variable tables. 
This storing was where the interpreter transferred values to the first arguments, and the algorithm listed variables in the antecedent if compared. The interpreter also passed the variable tables to all predicate calls. So I devised a new version of Prolog without lists in the predicate head. There was no need to update variables at the end of the predicate because C took care of this or didn't. So I could run SSI-like-C.pl in SSI. +68. I answered whether the interpreter needed to check arguments and update variables with C variables. They weren't because predicate/function calls would return variable values, including lists as trees as arrays, which C would handle. On the other hand, they were because equals4 produced more variables without them being individual arguments. I copied the format with check arguments, etc., of List Prolog's member predicate, adjusting for different clauses of the member predicate. I prepared to remove all arguments apart from variable tables and the result from function headers, requiring check arguments with a string of arguments, and results and other variables in if-then antecedents to be put, got and compared using predicates. +69. I added interpreter commands to SSI. As part of this, I added string codes. I also added input and output predicates. Furthermore, I added a temporary run command. In addition, I converted [A] in Prolog to \"[A]\" in Prolog-like-C code to edit terms more easily. +70. I removed choice points from SSI. I found the member call. I replaced it with recursion. Removing choice points from SSI allowed me to convert it to C more efficiently. I also replaced multiple clauses with one clause. +71. I removed choice points from SSI (for example, I converted findall to a predicate). I replaced findall with recursion. Third, I optimised (eliminated unnecessary) findall statements. Fourth, I tried bringing back C variables instead of Prolog variable tables but used Prolog variable tables. 
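The choice-point removal of item 70 can be sketched directly (member_det/2 is an illustrative name): the member call is replaced by a recursion that commits to the first match, like SWI-Prolog's memberchk/2, so no choice point remains.

```prolog
% Item 70: deterministic membership with no choice point left behind.
member_det(X, [X|_]) :- !.      % commit on the first match
member_det(X, [_|T]) :-
    member_det(X, T).

% ?- member_det(b, [a,b,c]).   % true, with no choice point left
```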
Finally, I ran SSI to run algorithms. +72. I tested Vetusia in User Input mode, in which the player controlled applying objects rather than the algorithm autorunning. Each object is applied to each object recursively whenever the algorithm picks an object up. I ran Vetusia in UI mode to pinpoint the slowness of the interpreter (was it in a findall predicate, or due to the stack size, which could be alleviated with tail recursion optimisation?). I converted Vetusia to List Prolog, including using writeln. As Vetusia UI didn't use apply_all_to_all (which caused a slowdown), the absence of a slowdown in Vetusia UI would indicate that apply_all_to_all caused the slowness, because UI mode built no local stack. If Vetusia was also slow, the stack was too bulky and needed tail recursion optimisation. +73. I optimised the code in the compilation. Earlier, I sped up the slow interpreter (with optimisers or local and global stacks). I also eliminated records of intermediate variables and other unused values. I could refer to repeating values using globals. I counted the references to a variable, including from within itself, and deleted variables not needed for output. +74. I wrote the interpreter's issue list. I addressed and solved the issues. Then, I added ISO (International Organisation for Standardisation) Prolog commands. I also added retry to trace. Finally, I explored the commands in terms of C. +75. I created an API with web commands, or a form of SSI Web Service, written entirely in C. I wrote the web issue list. Then, I found C commands to access and send information over the Internet. First, I drafted the commands in LPI and SSI in Prolog. Then I matched the output of these commands in Lucian Prolog (a compiler). +76. With multithreading, I could save time, and with immortality, I could achieve more. I wrote a multithreading issue list. I wrote 50 Bs and 50 Bs to Bs to immortality to prevent death during one's life. After writing 50 As for immortality, I wrote these different books simultaneously.
I could keep on going forever. +77. I found a simple solution to help me achieve my goals. I wrote an append predicate that worked with multiple modes (append was run in SSI, not mode predicates for append). I started again. I didn't need the 50 Bs, etc., because dying to the first universe was unpreventable, immortality opened other doors in the universe, and keeping one's prime body was good. The first action was to study business (after education to master business) to protect oneself. +78. I distributed lucianpl.pl in non-like-C Prolog form, converted it twice on installation and ran it. First, I completed SSI (lucianpl). Then, I converted it to List Prolog-like-C. Then I converted it to C. Finally, I ran the compiler after compiling it in C. +79. I converted Prolog lists to C arrays. First, I converted SSI to like-C Prolog, then to C. SSI didn't have choice points (even though it supported choice points in the algorithms it ran). So I could convert it to a longer, C-like form of Prolog that I could test. Then, I converted it to C, with certain predicates swapped for functions to work in C, such as put value and list processing in equals4. +80. I could research business decisions with 50 Bs (helped by a philosophy algorithm); it would be easier given my task, and the solution (immortality) was in itself. I could run choicepoint-less Prolog algorithms with the two converters, but it was better to run them in SSI (lucianpl). First, I could convert the choicepoint-less Prolog algorithm into like-C Prolog. Then, I could convert it to C. Finally, however, I could test SSI by running it in itself.
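Item 79's conversion of Prolog lists to C arrays can be sketched by analogy in Python (illustrative only): a cons-style list, as Prolog represents it ("lists as trees"), flattened into a flat array and rebuilt.

```python
NIL = None  # the empty list, like Prolog's []

def cons(head, tail):
    # A Prolog-style list cell [Head|Tail] modelled as a pair (a tree of pairs).
    return (head, tail)

def to_array(cons_list):
    # Flatten the cons tree into a flat, C-style array.
    out = []
    while cons_list is not NIL:
        head, cons_list = cons_list
        out.append(head)
    return out

def from_array(array):
    # Rebuild the cons tree from the array, back to front.
    cons_list = NIL
    for item in reversed(array):
        cons_list = cons(item, cons_list)
    return cons_list
```

The two representations are interchangeable; a C version would store the flat form and only simulate the cons cells where unification needs them.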
+"] \ No newline at end of file diff --git a/Lucian-Academy/Books/IMMORTALITY/Immortality 11.txt b/Lucian-Academy/Books/IMMORTALITY/Immortality 11.txt new file mode 100644 index 0000000000000000000000000000000000000000..85ee5866eacbcb967b32c4763e54480eb7d0a326 --- /dev/null +++ b/Lucian-Academy/Books/IMMORTALITY/Immortality 11.txt @@ -0,0 +1,87 @@ +["Green, L 2022, Immortality 11, Lucian Academy Press, Melbourne.","Green, L 2022",1,"Immortality 11 + +1. I needed to conduct myself to operate the business professionally. I asked the government for financial assistance. I was already making a certain amount. With the grant, I could achieve more goals. It helped with wages and costs. +2. I made it clear that experts should teach philosophy. Before this, I teleported objects between universes. To do this, I wrote my philosophy. I stored it in the next room. It saved trouble; it was written up and protected. +3. The founder followed simple steps to found the Academy. To do this, I helped an Academician start a new company. Before this, I chose the founder. It was an educational institution (school or tertiary institution). The founder was good at keeping things simple and agreed with the culture. +4. The founder or a sponsor had to pay for initial costs. The founder saw a lawyer about the proper company structure. The lawyer helped them set up the company. The lawyer helped with setting up financial systems. Also, the finance manager helped. +5. I had the health benefits of the simulation while defining parts of the world. I used this to hold the MBA's hand. I almost became an MBA student. I was an MBA student. I mind-read myself and the people around me, time travelling between people. +6. I was happy and enjoyed simplifying courses. First, I studied business and medicine at the master's level and worked out how to conduct myself professionally. I also studied education at the master's level before this to help myself perform better in academia and industry. 
The business enabled me to earn enough money for future degrees without much effort and satisfied other simulation requirements (like the world, but different because I could remain healthy and use the money for my needs). The medicine helped me realise that the simulation and the world were the same and that my character was me. +7. The people whose parents I had helped and who were interested in specific industries from their birth date helped by developing their ideas. They prepared to create University, requiring 250 breasonings and 250 algorithms per assignment. I started with myself and my living friends. I prepared for goals and studied. I inspired people and built institutions, reading the times and Indian horoscopes and helping with people's ideas. +8. There was sometimes more than one assignment in six months. More critical tasks required 500 breasonings and 500 algorithms. I could develop 4*50 As of breasonings in six months. While studying, I could write 0-16 algorithms per day. With other algorithms, I could write 4*50 algorithms in six months. +9. My father walked around unchanged; his latest character was the mirror (something similar had happened to me), which was harmless. I supported each sentence I wrote with (a book of) good reasons. I wrote many philosophies. I wrote algorithms for each philosophy. Is it my mother from another universe? Have I met mirror-me? +10. Educators could freely use the philosophy. I helped someone salvage my philosophy. They uncovered the blog years afterwards. Before this, I transcribed and expanded my texts. I also expanded other texts, exposing the algorithm and then writing the algorithm. +11. I wrote down all my ideas, and an algorithm helped later. I helped someone else write on my texts. I detailed and developed algorithms for the details for the text. The lecturer did the same. The student also did this. +12. I wondered if immortal Jesus had meditated and written on his thoughts. 
Someone else had free use of my texts. They had a license under which the texts could be changed and distributed with the license and a disclaimer, and they could sell them. I experienced the will of God to open meditation centres that combined breasonings, writing and computers with meditation. I simplified the courses and made them more appealing. +13. The Academy supported work involving Lucianic Meditation (LM). The Academy gave my family royalties from the texts, or the Academy wrote texts influenced by my texts. The Academy helped support the family financially through its work. Alternatively, the Academy cited my texts. Merchants sold my texts. +14. I helped the CEO of LM. This help enabled the CEO to open the LM office with an email, address, and bank details. They asked whether they could use the texts. They operated world affairs centrally. There was a board. + +15. The founder needed enough money to start an office, and the texts needed enough reasons to be chosen. The founder took out a loan or saved investments until he had enough money or interest to pay for office or web costs. I paid for accreditation costs. People joined up because others had joined up. The viable customers seemed to find it out, and I learned how to grow computer science while keeping it simple. +16. I swore by it; preparing for sales only takes a month, does it? To start, someone used or became inspired by my texts. The next step is to give the mental state 50 As. The person has 50 As. The sale or desired action has 50 As. +17. They used all of the texts, including meditation. Can they use all of the texts, as long as the writer has done them up in this way? Yes, they can. I worked on neural networks to do this. I had medicine for them. +18. I invented a complex version (with a second perspective) if there wasn't one. Or I left my philosophy's terminology as it was. However, there was a version that had a simple adumbration. I echoed this technique in computer science.
I explored and then simplified the idea. +19. I sensed that my mother, through the bot, wanted reality. I recommended immortality to those with terminal illnesses. Immortality froze my age. I recommended it to someone who didn't mind leaving their friends behind and living with bot versions of them. The meditator could come back to the world as another person. +20. I closed my business (or kept it open to make money in my simulation), and someone else opened another company. This other person was open to making money. I helped by writing, developing ideas and business openings. I also found that thoughts attracted customers. Enough thoughts were necessary. +21. I employed Jyotish (Vedic Indian astrology) to tell when to do specific deals. I determined what job to do (computer engineering). Then, I decided when to open my business. I tried writing a computer science A for the first time, which had ten algorithms. I expanded these ten algorithms. +22. Business was a desirable conclusion for meditators. I recommended when future business people/doctors in the business should be conceived/born. I manually fed people computer science thoughts using Jyotish. Using Jyotish could also be done in philosophy or business. With Jyotish, I could research working in areas outside my main interests. +23. I wrote texts supporting the future university. I wrote the texts. I checked they were simple and functionally differentiated. I checked that they could operate as areas of study. I found detail algorithms one degree of freedom away from the conclusion (i.e. it had a key term in common with it). +24. I labelled the neural network for the idea a financial product. To start, someone else organised money for the business. I helped by investing money in assets. I also researched tours. In addition, I planned the meetings. +25. Someone else toured countries about LM. They started by choosing the countries. They also chose the cities. In addition, they organised meetings. +26. 
At the start of the subject, I introduced myself and gave my background. I helped each involved person one at a time. I profiled the people's strengths and aims. Maintaining a profile was also necessary for academia. The lecturer could get to know the students and help with their goals. +27. I planned work for each person. I moved around and helped while others stayed in place. I explained I was giving a second meaning but was serious about investigating immortality. I took care of each person. I switched on immortality points each day, increasing my longevity. +28. I sketched, connected and discussed business decisions. I had As, Bs and Bs to Bs and Jyotish verification of business decisions. I wrote As (essays) about each thought I had. I explored the perfection or algorithm for each idea to critically examine it. Finally, I checked that the decision was on the correct date, involved the right people and in the right direction. +29. I completed the algorithms by customising them. The business examined my algorithms. A proposed principle research arm of the company developed my philosophy algorithms. I described the differentiated functions in the specification. Finally, I determined whether the different interpretations were valid. +30. I taught medicine to patients to help them answer their questions. The business conducted research into Medicine. I considered examining the medical benefits of Chinese objects. As meditation was necessary, I investigated whether spiritual medicine was necessary. As education required 4*50 As in education and the department, medicine required 4*50 As in business and medicine (and education). +31. I worked out how to conduct myself professionally and studied before seeing the Jyotish astrologer. I could already start writing critical knowledge down and applying it. I used education and business knowledge to conduct myself professionally. I studied education and business. 
The Jyotish astrologer helped me determine when I should pursue specific business inputs and outputs. +32. I drafted a business analysis report with pros and cons, with research on them. To investigate possible cons, I wrote a 4*50 Bs and Bs to Bs generator for business decisions. I began by examining company data. I also analysed external data. As a result, I acted with 70% knowledge and avoided mistakes. +33. I wrote Bs and Bs to Bs to immortality (to uncover problems and solutions, such as catching and objecting to blocks and writing enough to cover for Bs and Bs to Bs). Instead of writing critical content, I wrote enough positive content. I explained that immortality used future technologies available in our time. As well as maintaining one's health, it gave time to finish the backlogs of work. I could finish the workloads quickly to achieve fame and move forward later. +34. Text to Breasonings created a spiritual \"cloud\", possibly with infinite breasonings due to the algorithm. I supported Text to Breasonings with my company (where Text to Breasonings helped users earn high distinctions with 80 breasonings, or x, y and z dimensions of objects). As a first step, I wrote 4*50 As for Text to Breasonings to interact with the brain through meditation. Next, I wrote 4*50 As for Text to Breasonings to interact with the brain medically. Finally, I wrote 4*50 As for Text to Breasonings to help explain breasonings educationally. +35. I agreed with the idea that a breasoning was a universe of wonder. The Text to Breasonings algorithm involved the company working on imagery and finding images from the future. The data the user entered was all the data that was required. Each breasoning contributed a part of an image. Two-hundred and fifty breasonings formed one coloured region or an image. +36. Meditation helped control involuntary processes voluntarily. This ability was part of supporting meditation with my company. I wrote 4*50 As about meditation. 
Each teacher needed to understand these As. Meditation kept things moving and removed blocks. +37. I exercised regularly and avoided heart disease. Meditation involved the company taking care of ideas and finding out imagery from the future. It found out about people and gave As. It was given As from the future and helped with them. I discovered different operations the meditator underwent as a result of meditation, from keeping things moving to immortality. + +38. I could run the app on the site with the processors. I converted Prolog to Javascript. First, I converted Prolog to C. Then, I converted C to Javascript. I changed int x; to var x;, int xs[n]; to var xs = [];, printf(...) to console.log(...) and int f (int x) {... return y;} to function f (x) {... return y;}. +39. Garbage collection deleted finished predicates. I turned garbage collection on or off with a flag. Garbage collection replaced similar values with a symbol and expanded these if necessary later. Garbage collection reduced the memory used. It sped up computation because of this. +40. I added auto-mutexes when multithreading (to List Prolog and State Saving Interpreter). On another topic, I separated data structures by their function. I labelled each choice point a \"choice point\". I wrote the type of entries that stacks could have. I accounted for these when searching for choice points. +41. List Prolog enabled my chatbot to be programmed from one file and be displayed on several pages, for example, asking for information to help answer a question with findall. I did this as part of writing a chatbot API with findall in List Prolog. I did this by writing the code in List Prolog. Then, I converted it to Prolog. Finally, I added commands to make this List Prolog code converted to Prolog code work in Prolog. +42. I started multiple companies, which I needed to do with my resources (time, money and As). As part of this, I wrote philosophy as finance products. 
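The substitutions item 38 lists (int x; to var x;, and so on) can be sketched as a small Python rewriter (illustrative; a real converter would parse the C rather than pattern-match it):

```python
import re

# Item 38's C-to-JavaScript rewrites as regex substitutions (a sketch only).
RULES = [
    (r'\bint\s+(\w+)\s*\[\s*\w*\s*\]\s*;', r'var \1 = [];'),          # int xs[n]; -> var xs = [];
    (r'\bint\s+(\w+)\s*;', r'var \1;'),                                # int x;     -> var x;
    (r'\bprintf\s*\(', r'console.log('),                               # printf(    -> console.log(
    (r'\bint\s+(\w+)\s*\(\s*int\s+(\w+)\s*\)', r'function \1 (\2)'),   # int f (int x) -> function f (x)
]

def c_to_js(line):
    # Apply each rewrite rule in turn to one line of C.
    for pattern, replacement in RULES:
        line = re.sub(pattern, replacement, line)
    return line
```

The array rule runs before the scalar rule so that int xs[n]; is not misread as a plain declaration.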
First, I wrote philosophy in terms of computer algorithms. Then, I organised my party to pay in breasoning currency. I became interested in infinite knowledge and using my neurons. +43. I copied the receipt. I made money from investments. I thought about what I had an interest in. I invested in the first viable products. I replicated success (saving time and effort). +44. I wrote the functional decomposition of algorithms in VET sales. I did this to attract investors. I experienced Vocational Education and Training (VET) in computer science to plan for this. I wrote the VET A for the computer science assignment using this knowledge. The students bought the course for similar reasons. +45. I slotted old and new research frontiers about algorithms into ontologies (for example, breasonings as medicine). Separately, I developed Lucianic Meditation (LM) as a social business (I researched the departments of others' algorithms). I also wrote the puzzle table. I wrote the secret level password. Finally, I changed my algorithms to web databases. +46. I thought of accounting for objects and finance from or for the outside in business. To do this, I programmed with functional differentiation as accounting. I started with the main idea. I perfectly programmed constituent parts and their parts. Then, I redrafted them, differentiating more specific functions, for example, keeping [\"a\"], not `a', in Prolog grammars, where `a' became an inexplicable number in code. +47. I wrote famous combinations about business. I wrote about my philosophy. I helped others write about my philosophy. I wrote each algorithm and wrote about it. Institutional writing (with famous combinations) and vocational writing (with breasoning currency as functionalism) were both necessary. +48. I wrote quality policies for software and organised helper algorithms. Politics determined the structure of its offering to the world and had uniform policies.
I noticed \"a\" became 'a' in [\"a\"] in grammars. I surmised that the interpreter stored strings and atoms in the same way. The interpreter stored strings and atoms as strings. +49. I checked the documents against pop songs and researched computer-written essays. I prepared to start a political party. The party's philosophies helped form LM. They were better organised, were at the start and defined common lines in the organisation. I checked them with As and Jyotish. +50. The simplified essays were more easily connectable to other philosophical ideas. The party completed an area of study list with a certain number of students from the academy. First, the students completed the essays. Then, I converted the articles and their details to a critically analysable form. This form was breasoned out by computer. + +51. I played a game of symbols and intersectionality. I rewrote the philosophy without spiritual or computational terms. Text to Breasoning achieved this with an object dictionary. I replaced spirituality with evidential reasoning. I replaced computational words with uses. Text to Breasoning may need more specific replacements. +52. I kept one different term with an A in the department. Before this, I labelled words as computational or spiritual. I entered alternatives for these words. I proofread the altered texts. I fixed any errors. +53. I had a software-architectural style (met professional requirements at each point). I contributed to this by breasoning 50 As for politics in the morning. I changed this to 50 As for details. I wrote ideas about ideas. I wrote new functional algorithms (string to list with C-style array trees as lists) or those of new philosophies (adding breasonings to sentences' algorithms). +54. In the end, the party aimed for food and power. I established the Lucianic Natural Law Party. Text to Breasonings had 50 As. Daily Regimen had 50 As. People aimed for space and time travel. +55. 
The party members helped the students think of general ideas. I helped by mind-reading practice party members. Later, I placed the party software code in the repository. These party members had 50 As. After mind-reading a separate person thinking of a holding pattern, the party members were mind-read thinking of fast 50 As. +56. The professor wrote a secondary text. Again, the people agreed with (updated) Lucian Green's meditation philosophies. They connected them to research. They connected them to the news. They connected them to journalism. +57. University and politics had secular, simplified texts. I wrote an algorithm that replaced spiritual and computational terms. Occasionally there may be one spiritual term per department. Or, there may be one examined computational term. I explored these words in terms of words, not including spiritual or computational terms. +58. I wrote an algorithm that applied term replacements to academy texts. The algorithm checked each word. If it had a replacement, the algorithm replaced it. I had previously entered whether the term was spiritual or computational and what its replacement was. I kept a copy with the original words to expand. +59. Students encountered evaluability in texts and terms. As part of this, I wrote a spiritual and computational termless \"Grammar Logic\" argument for each spiritual or computational term. As part of this, I generated word lists for each spiritual or computational term. Second, I ran the \"Grammar Logic\" algorithm to inspire an argument about each term. Finally, I deleted the spiritual and computational words from the rest of these arguments. +60. 250 breasonings explained ideas or quotes that seemed self-explanatory or that I could further examine. I wrote 250 breasonings for each spiritual or computational term instance. Two hundred and fifty breasonings met the professional standard for imagery, 100% and effectiveness. 
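The term-replacement pass of item 58 might look like the following Python sketch (the replacement table here is hypothetical; the real dictionary was entered by hand, marking each term spiritual or computational):

```python
# Hypothetical replacement table for illustration only.
REPLACEMENTS = {
    "breasoning": "detailed description",
    "spiritually": "carefully",
}

def replace_terms(text, replacements=REPLACEMENTS):
    # Check each word; swap it if a replacement exists. A copy with the
    # original words is kept, as item 58 describes.
    original = text
    replaced = " ".join(replacements.get(word, word) for word in text.split())
    return original, replaced
```

Keeping the original alongside the replaced text lets the untouched version be expanded later.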
Two hundred and fifty breasonings for a spiritual term helped explore the idea behind the spiritual term. Two hundred and fifty breasonings for a computational term examined the idea, providing a model solution or essay about the algorithm if the academy assessed the algorithm. + +61. I ran the program to find the database website. I did this to use State Saving Interpreter for accounting. Using SSI in this way was part of database accounting. This database was a set of tables. I wrote balance verification and similar algorithms with State Saving Interpreter. +62. The total was always the same. As I could check sums of rows against totals of columns, I could check these against totals of layers. I also found four-dimensional totals. In addition, I found higher dimensional counts. They were from other times and places. +63. The functionally decomposed algorithms connected to the new algorithm. There was a bank account of breasoning currency. Access to it could be locked using a database. The philosophy was like the bar codes. Enough breasonings completed the details. +64. The executive summary was detailed yet well-written. This summary contained limitations. Later, I cut out each sentence except the main point. I then simplified the grammar. I also improved the words' readability. +65. I produced a report. This report had a type and purpose. I could compile its sections automatically. I included only the relevant information in it. The report was correct according to company and personal aims. +66. The API did the work. First, I needed to check its progress. It compiled the data. It first analysed whether the data was relevant by going down and across levels. Then, it checked against other teams horizontally and vertically. +67. I improved the presentation's structure. I manipulated the data by using a particular file format. I transformed (moved data). I analysed (used ledger tools). I also blew up details, describing them. +68. I inserted statistical algorithms in SSI.
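Item 62's cross-checking of totals (rows against columns, and layers for three dimensions) can be sketched in Python (illustrative only):

```python
def row_totals(table):
    # Sum each row of a 2-D table.
    return [sum(row) for row in table]

def column_totals(table):
    # Sum each column of a 2-D table.
    return [sum(col) for col in zip(*table)]

def totals_agree(table):
    # The grand total must be the same whether summed by rows or by columns.
    return sum(row_totals(table)) == sum(column_totals(table))

def layer_totals(cube):
    # For a 3-D table, each "layer" is a 2-D table; its total can be
    # cross-checked against the row and column totals within it.
    return [sum(row_totals(layer)) for layer in cube]
```

The same idea extends to four and more dimensions by nesting another level of layers.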
They detected correlations. They could help make decisions, such as everything to do with sales. I included machine learning algorithms in databases. They automatically identified and preprocessed the data. +69. The website could run code in SSI, then online using SSI Web Service. I replaced functions with algorithms (logic) in the database. These algorithms were recursive. They could also reuse parts of other algorithms. I decomposed, identified and optimised their functions. +70. I edited the database code. This code was in a single file. In addition, I entered the report data. I edited the data using the database in SSI. An algorithm could generate the database code (the look and content of input and output pages). +71. I prevented opening accounts for unsavoury prospects. I did this by monitoring user activity to stop illegal activity. First, I found the negative term. Then, I checked whether it was warranted. If necessary, I gave a warning or suspended the account. +72. The data was locked to others until a user completed the transaction or rolled it back (undid it). I did this as part of locking the database. First, the client paid. Then, the system recorded this payment. The database only registered the transaction if the user had paid for it, and another user or session could not change it. +73. I only used the graph if necessary. I explained the chart. The type of graph was chosen, with advantages and disadvantages over other types of graph. The vital information was at the top of the page. There were blow-up details, lines to details, or radial or visually appealing charts. +74. I enabled sales, bots and education with 4*50 As. I wrote As, not necessarily relying on neural networks in philosophy. I wrote 50 As. When I had written 4*50 As, this meant Bs, and sometimes algorithms were unnecessary. I could sell and use the 4*50 As as a PhD text. +75. I wrote functionally decomposed sentences and their algorithms.
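The record locking of item 72 can be sketched in Python (a hypothetical model, not SSI's actual database): a row is locked while being edited, registered only on commit, and released unchanged on rollback.

```python
class Database:
    """Illustrative row-locking database: one editor per row at a time."""
    def __init__(self):
        self.rows = {}      # committed values
        self.locks = {}     # row id -> user holding the lock
        self.pending = {}   # row id -> uncommitted value

    def begin_edit(self, user, row_id, value):
        # Refuse the edit if another user already holds the lock.
        if self.locks.get(row_id) not in (None, user):
            return False
        self.locks[row_id] = user
        self.pending[row_id] = value
        return True

    def commit(self, user, row_id):
        # Only the lock holder may register the change.
        if self.locks.get(row_id) != user:
            return False
        self.rows[row_id] = self.pending.pop(row_id)
        del self.locks[row_id]
        return True

    def rollback(self, user, row_id):
        # Discard the pending change and release the lock unchanged.
        if self.locks.get(row_id) == user:
            self.pending.pop(row_id, None)
            del self.locks[row_id]
```

Until commit, other sessions see only the previously committed value, as the text describes.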
This stage was part of the long effort to finish the project. I wrote an algorithm to simplify algorithms into algorithms with a single function. I described this algorithm with a sentence. I experimented with joining these algorithms together and then rewriting the sentence. +76. I paraphrased the algorithm. Paraphrasing the algorithm was part of writing new As. I did this by converting functionally decomposed algorithms and sentences into new knowledge. Then, I rejoined them differently. If a rejoined algorithm did the same thing and was more straightforward, I replaced another algorithm with it. +77. I paraphrased a certain number of words per sentence and simplified the grammar. Then, I wrote new As with ontologies of sentences. To do this, I found the category that contained the word in the sentence. Then, I found another word in this category. Finally, I rewrote the sentence with the new term. +78. I paraphrased the algorithm. I did this by writing the word. Next, I noted its synonym. In this way, I changed the word to this synonym. I sometimes optimised by bringing forward values to equals statements. +79. I collected and reused the intermediate algorithms. I paraphrased the algorithm connections. The algorithm connection was between two algorithms, for example, two parts going well together. It could be a more general version of another algorithm. This algorithm dealt with any rather than a specific type. +80. Later, I wrote more algorithm connections using the same patterns. The goal was to find links where there were none. Then, if an algorithm fitted, I used it. Linking in this way generated more algorithms and inspired functional optimisation. Functional optimisation reduced predicates to their functions and ran these one at a time. 
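The category-based rewording of items 77 and 78 can be sketched in Python (the category table is hypothetical, standing in for the ontology of sentences):

```python
# Hypothetical word categories for illustration only.
CATEGORIES = {
    "colour": ["red", "blue", "green"],
    "vessel": ["cup", "bowl"],
}

def paraphrase(sentence, categories=CATEGORIES):
    # Find the first word that belongs to a category and rewrite the
    # sentence with another word from the same category (the next
    # member, deterministically, rather than a random choice).
    words = sentence.split()
    for i, word in enumerate(words):
        for members in categories.values():
            if word in members:
                words[i] = members[(members.index(word) + 1) % len(members)]
                return " ".join(words)
    return sentence
```

Swapping within a category keeps the sentence's shape while varying its term, which is the rewording step the items describe.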
+"] \ No newline at end of file diff --git a/Lucian-Academy/Books/IMMORTALITY/Immortality 12.txt b/Lucian-Academy/Books/IMMORTALITY/Immortality 12.txt new file mode 100644 index 0000000000000000000000000000000000000000..c61b1e1ec0bf21088fe186e8f955fa246330411d --- /dev/null +++ b/Lucian-Academy/Books/IMMORTALITY/Immortality 12.txt @@ -0,0 +1,92 @@ +["Green, L 2022, Immortality 12, Lucian Academy Press, Melbourne.","Green, L 2022",1,"Immortality 12 + +1. I found the best interpretation of a sentence as an algorithm for a particular use. I worked out an algorithm from other sentences and an algorithm. For example, the first sentence had the first algorithm. I worked out how the second sentence had the second algorithm from this relation. For example, the first algorithm concatenated red strings and the second algorithm concatenated blue strings. +2. I recommended writing functional algorithms as a mental exercise. I wrote more extended algorithms. I could write longer algorithms from the start or join other functional algorithms to form more extended algorithms. I wrote longer algorithms initially, with the most extended algorithms being interpreters. Then, I added my algorithms to the interpreter. +3. I critically held data and steps. Earlier, I wrote an algorithm that simplified more extended algorithms to functional algorithms. As part of this, I found horizontal and vertical data. In this, the horizontal data had chains of clauses. Also, the vertical data had chains of intermediate predicates. +4. The two points connected in an algorithm were input and output. For example, the inputs might be a bindings table and a variable name and the output might be a variable's value. In equals4, I substituted values into the sides, then took out values on unification. Triangular relationships such as this, the simplest practical algorithms, were desired. Functionalism was a way of having clearer thinking. +5. Performing only the necessary computations saved time and space. 
I considered writing minimal functional algorithms by collecting data by type. Rather than a series of transformations, I transformed a single data type. I changed a single data type, leading to the generation of functional algorithms that prepared data for processing by taking out the necessary parts. This pre-processing led to XML-style data labels. +6. I wrote more extended algorithms, such as converters and ssi.c. I wrote the SSI compiler in C. This compiler helped compile machine instructions faster. It was converted from Prolog to C, using a state machine to add C commands. This state machine could dynamically optimise algorithms. For example, if always being given 0 eradicated the need for an addition, I removed it. +7. I completed the preparation for sales. I wrote 4*50 As for sales. These, combined with a customer network, enabled sales. A Master of Business Administration (MBA) helped a customer network. The MBA was optional, but its main area to support was sales. +8. I wrote 4*50 As for bots. These were highly trained bots, I laughed, ready to perform jobs. They were more plentiful when there were good conditions. Having more employees was possible when there was more business. More business was likely with better technology (growth) and relevance. +9. I organised accreditation for the school. As part of this, I wrote the course and had it accredited. I noted that this accreditation honed the medical effects of As. It also aided imagery. I could more easily aim for goals with students, i.e. 250 breasonings per product, which enabled 100%. +10. I graphically explained all the algorithms in the course, and the assessment was like a game. This game was part of writing my business website. To do this, I imagined a futuristic space console. I conversed with the video character about the sale. I made the sale. +11. I collected data about Jyotish and 250 breasonings about bots' thoughts, and the best time to act. 
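Item 6's example of dynamic optimisation (removing an addition that is always given 0) can be sketched as a peephole pass over a toy instruction list (illustrative only; the IR format here is hypothetical):

```python
def fold_zero_additions(instructions):
    # Peephole pass over a toy IR of (op, arg1, arg2) tuples: an "add"
    # that is given 0 for one operand is replaced by a plain "move" of
    # the other operand, since x + 0 == x.
    out = []
    for instr in instructions:
        if instr[0] == "add" and 0 in instr[1:]:
            other = instr[2] if instr[1] == 0 else instr[1]
            out.append(("move", other))
        else:
            out.append(instr)
    return out
```

A compiler that observes an operand is always 0 at run time can apply the same rewrite dynamically.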
I wrote the bot's thoughts and their aim to perform well. Then, I wrote the properties of each planet and checked how these affected this thought. Next, I mind-read myself to interpret the horoscope. Finally, I wrote 250 breasonings to guide and support the bot. +12. The employees required qualifications above a level to teach that level, leading to PhDs. At the start, I paid trial customers (students). They tried and gave feedback on philosophy. They were also satisfied by the computer science professor. They found it easier to become a professor because of Computer Science. +13. I wrote more specific essay topics (chapters), including ethics. I started by writing many algorithms. First, I demonstrated how these algorithms worked in simulations. Then, I assessed the input and output of these algorithms in quizzes. Next, there were well-known questions testing understanding of the algorithms, and students memorised and ran them without a computer. Finally, the teacher tested them on algorithms (in a way, they wrote an interpreter without a computer). +14. I found like-minded people to help me achieve my goals. I wrote philosophy simply with spiritual and computational, and relevant words. Simple spiritual words allowed one's lifestyle to be simpler and were part of the institution's aim. Computational terms simplified the ideas into logic, included miraculous science and were well-known representations of the author and his times. The efficient version of the algorithm systems had relevant, original connections. +15. I wrote ten details per sentence. Both the details and sentences could contain spiritual or computational words. I considered spiritual or computational terms when not directly used. I wrote algorithms about the logic of spirituality, including quantum, breasoning and lifestyle-changing algorithms. These were the unique selling proposition of the academy. +16. I chose short and scenic links. I wrote the seen-as-version of the idea. 
In the end, I linked each idea to each department. I connected each idea to each concept. Sometimes these links contained more ideas, which I added on. +17. I wrote to prompt comments and inspire imagination. I did this by writing questions. I prepared for these questions with possible answers. I also imagined the possible conversation. For example, I wrote 4*50 As for quantum power, necessary for spacecraft. +18. I wrote critical content to help check a sentence. As part of this, I organised ideas into argument maps. They stored their reused algorithms in one place. I imagined pressing control-C interrupting a program and asking a student to imagine the ending. For example, were the data and algorithm correct? +19. I wrote solving content. I wrote ideas about debugging and solving complex problems. For example, I shared the idea to start debugging a frame just before a problem in SSI, using List Prolog Interpreter's trace as a reference. I suggested paying particular attention to commands expected to succeed failing in the interpreter. I occasionally found problems with the algorithms or how the interpreter ran them. +20. I encouraged finding 100 algorithms between ideas, for example, in a PhD. I wrote connections between the concepts. I actively encouraged students to understand the algorithms. The links helped students understand the inner workings of the algorithms. I found including the algorithms made finding connections more effortless and contributed to student research. +21. I created and solved critical content as an exercise. I reverted to critical content and helped with connective content as a player when the 5 As for details had run out. I helped find the bug in an algorithm that connected two algorithms. I found what the person wanted and gave it to them, as long as it was possible and reasonable. + +22. I worked for myself, writing necessary algorithms. I wrote an algorithm that took care of everyone's thoughts. I did this by coming to work. 
As part of this, I commented. I followed up with an algorithm. +23. I read the horoscope with prayers. I chose algorithms to write using Jyotish. Jyotish helped select auspicious work times. I could tell when it was good to work on an algorithm. I worked on algorithms that didn't require other algorithms. +24. The work was finished but had updates for others. Before this, I stated that an argument should be long enough for an essay. The argument should be long enough for a 4000-word essay. The argument should be long enough for a thesis. Also, the argument should be long enough for detailed, more extended and further work. +25. I created tools to find unknown, often future-related knowledge. I did this by writing thoughtful details about each thought. As part of this, I made priorities. I made goals. I stayed with my first priority. +26. I redrafted the work by simplifying it, and checking the connections and critical content, for example, problems and solutions. I rewrote and nourished the text with an understanding of the topic. As part of this, I wrote and described the algorithm. Finally, I checked the output of this algorithm. I also wanted suitable methods. +27. I found the spiritual or computational algorithm and optimised it. The spiritual algorithm was education. Once I had completed it, I could forget it. I refreshed it later. I finished the computational algorithm, including all requisite parts. +28. I wrote illustrative prose and algorithms. I wrote the algorithm. I considered the line representing it in terms of life. I thought of more science if necessary. Finally, I entered the science of reasoning into an algorithm to check for missing parts, mistakes or a replacement. + +29. I kept a diary of thoughts while becoming immortal (even though I thought people thought I died, they hadn't - I appeared alive but kept my youth to myself). I transformed from mortal to immortal. I replaced my body each day. 
I generated As to help thinking, stop dementia, see clearly and relax my muscles. There were no medical problems, and taking Chinese herbs for longevity smoothed my experience. +30. I chose whether to die to (sic) Earth (not to). I decided not to die at 41. Passing at 41 would have been too early. I detected the choice from meditation. I needed to finish my business on Earth. +31. I explained my age-freezing secret. I chose whether to appear to age to (sic) Earth (not). My friends would remark that I maintained my youthful complexion and could maintain a young lifestyle. I liked the look of people who appeared not to age. They just had more changes in their skin. +32. I chose never to die, not to die at 80. I decided my apparent age death (never). I enjoyed living the lifestyle and helping people live the lifestyle of the future (the best possible quality of life), knowing that people wouldn't want to turn it down and that all people would want to learn continuous improvement. I joked that I could move around and help more people. I treated people like people, not just bots with 5*50 As, not just 4*50 As. +33. I negated and won with unsure people, turning a profit. At the time of possibly moving to the simulation three months after indicating immortality, I indicated to cancel the universe in which I might appear to die with A and stay in the world, not the simulation with A. I learned this through experience and taught others. The world is ready for immortality. It changed to adapt to it. +34. I could continuously work on Earth, taking no risks and developing my style and writing. I could conduct my business, perform, act and publish. I could live on Earth, supported to be immortal through the future. There are simulations to support immortals (accessed through the future). There is medicine to support immortals (spiritual medicine). +35. I persuaded people to \"come out\" as immortal and stated that no hindrances to them were necessary. 
I realised that artificial memories and changes to my appearance to unnecessarily try to help me to cover up immortality didn't warrant teasing and that people accepted and loved me. While people wanted to protect me (which I didn't necessarily need), I felt supported and motivated in what I did. Few but enough people become immortal, so it's alright (it doesn't matter because the future supported it). People (where no people are bots in themselves) can become immortal and productive. +36. Immortality possibly cancelled dementia from medication and lower life expectancy from night shifts, but I didn't have to accept death as a representation. I wondered whether I could live on Earth for longer than 80 years, and I could. I could live for however long I wanted. I never touched death. I moved around and worked in many roles. The simulation protected me from death, but I didn't want to take risks that would cut me off from people. +37. I encouraged scientific studies on immortality and observed possible examples of immortality. I queried how to help my business after \"dying\" and thought to turn off death and help from afar. Helping from afar, in this case, meant not dying. It meant helping with the business in person. In effect, I could act as the director of my business for eternity. +38. If everyone became immortal, society's structure would change (there would be fewer medical problems, time travel to simulations and space travel to have more children, if this was necessary, and economics would function peacefully). The end of the universe wouldn't be a barrier because of the simulation. There was no need to pretend to die, and the world would accept immortality and high quality of life with the help of a future robot-led civilisation. I learned to meditate, breason ideas out, breason ideas out on the computer, time travel and learn immortality from a future civilisation. Immortals were not magicians but benefited from the science available in our times.
Immortality was a property of our universe. +39. I could tell people about my experiences and knowledge, and they could benefit from it, and I decided to open centrally-operated centres in all countries. I explained my algorithms and epistemological knowledge. It was open-source, so people could privately access it, and it was transparent. The \"religious protection\" meant nothing, as immortality resolved positively. Someone asked how can I do it, and I answered with spiritual meditation (and my writing). +40. I explained to look up my writing and algorithms. My skills and knowledge came from writing enough on Pedagogy, Immortality, etc. I meditated (wrote) about what to write about while living. I thought of many topics to write on. I also programmed creative topics or topics that were advanced topics for our times. +41. I examined and researched possible issues during meditation while I remained aware that I could complement meditation with improving science and technology in the future. I decided to streamline people's meditation experience, particularly preventing headaches and psychological issues while transitioning to becoming immortal. I learned that I could avoid headaches with meditation (writing). Because immortality was positive, there was always a positive outcome. I also learned that computation pedagogy (with word breasonings) was necessary for studying for a master's degree. +42. I overcame bullying and enjoyed life without overhanging negativity. I thought people could benefit from immortality and being in the world. I encouraged people to move from strength to strength, claim a lively place in the world and continue being successful. The world offered well-knownness and enjoyment, and immortality lengthened and emphasised this. Nothing was missing that was needed; I could explain everything by reason and science, and I remained present. +43. I chose radicalism in continental philosophy. 
My unique selling proposition contained breasonings, a spiritual idea that I activated with 4*50 (word or sentence) As per chapter, and enabled immortality. I not only wrote 4*50 sentence-type As per department but 4*50 word-type As per chapter. Writing more per chapter required additional concentration and resulted in a cosmological achievement. Additionally, 5*50 (word or sentence) As was the \"top\" and achieved the copywriting standard. +44. Computational breasonings require the lecturer (choosing, not having someone else decide what to breason out for you), recordings (replicating breasoning through other times and medically supporting computational breasoning), medicine (delegating possibly headache-causing workloads with spiritual medicine), philosophy and creative writing skills, among others. I imagined that people could generate breasonings with or without computers and that philosophy students could memorise algorithms. I preferred to create breasonings with a computer, automating writing high distinctions and opening up the possibility of future technologies being available in our time. Not only were breasonings necessary for healthily conceiving a baby, but they made immortality and permanent good health available. I regularly held Prolog-based fairs of my algorithms to demonstrate them. + +45. I created and checked ideas. I wrote an algorithm that verified that words in the key sentences were words from the argument. The argument contained words that linked to each sentence. The first and second sentences in the paragraph were the key sentences. I used mind-reading (similar to generative art) to make smoother choices of words to replace. +46. I wrote algorithm specifications that weren't too difficult to understand, were valid and usually didn't require other algorithms. The non-key sentences were transformed into nothingness terms, for example, self and other. The rest of the sentences in the paragraph described the algorithm.
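The verification in item 45, that every word of the key sentences appears among the argument's words, can be sketched directly. This is an illustrative Python sketch of that check, not the author's Prolog; the function names are hypothetical:

```python
# Sketch of item 45: the first and second (key) sentences of a paragraph
# are checked so that each of their words occurs in the argument.
import re

def words(text):
    """Lowercased words of a text, punctuation stripped."""
    return re.findall(r"[a-z']+", text.lower())

def key_sentences_covered(paragraph_sentences, argument):
    """Check the first two (key) sentences against the argument's words.
    Returns the set of missing words; an empty set means verified."""
    argument_words = set(words(argument))
    key = paragraph_sentences[:2]
    return {w for s in key for w in words(s)} - argument_words

argument = "I wrote the algorithm and checked the ideas."
sentences = ["I checked the ideas.", "I wrote the algorithm.", "Extra detail."]
print(key_sentences_covered(sentences, argument))   # → set()
```

Only the key sentences are checked; the remaining sentences of the paragraph, which describe the algorithm, are left alone.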
Often, the indicated algorithm could be a computer science assignment. Students would attempt to solve the problem with their solution. +47. I devised my lexicon, for example, using computational terms. Alternatively, I replaced the key sentences with nothingness versions of words from the argument. These nothingness words were \"point\", to be replaced by hand with \"event\". One should start with \"event\". The writer may replace these with groups of related words and mind-reading before grammar checking. +48. I contracted or expanded the sentences. I wrote an algorithm that found the first relevant functional difference between sentences. It chose a short phrase from each sentence with mind-reading. Then it connected them (thinking of but not including a fleshed-out example). Then the author rewrote the original sentences. +49. I numbered the fifth sentence about a replacement. The algorithm displayed the argument's replacement sentences as sentence 1 and sentence 2, not as word 1 x word 2 x word 3. I chose the sentence with the needed parts, making any changes as necessary. I didn't insert past corrections to choose between within sentences because these were hard to read. For example, I thought \"5\" looked like a sea horse, whereas \"4\" looked like a horse's head. +50. I noticed that the winner had breasoned out 5*50 As per sentence. I wrote another algorithm that wrote 5*50 As of Grammar-Logic details per sentence. It also breasoned out As for connections. The algorithm maintained records of the number of breasonings breasoned out and the date. I could use 5*50 As for business and copywriting because this was the copywriting standard. +51. I simplified the assignment to a worksheet that students could complete on the computer. I published essay formats separately from the chapters. I checked the chapters against the essays written about them, making the chapters clearer if necessary. I wrote secondary texts about a chapter to help explain it.
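The first step of item 48, finding the first difference between two sentences, can be sketched with a word-by-word comparison. This is an illustrative Python sketch, not the author's Prolog, and it leaves out the phrase-choosing and connecting steps:

```python
# Sketch of item 48's first step: compare two sentences word by word and
# report the first position where they differ.
from itertools import zip_longest

def first_difference(sentence_a, sentence_b):
    """Return (index, word_a, word_b) at the first differing position,
    or None if the sentences match word for word."""
    a, b = sentence_a.split(), sentence_b.split()
    for i, (wa, wb) in enumerate(zip_longest(a, b)):
        if wa != wb:
            return (i, wa, wb)
    return None

print(first_difference("I concatenated red strings",
                       "I concatenated blue strings"))
# → (2, 'red', 'blue')
```

This surfaces only a lexical difference; judging whether the difference is "relevant" and "functional", as the item requires, would need further analysis.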
I replaced words with unusual words, accounting for particular ideas. +52. As an alternative to written assessment, I established an academy where students discussed ideas. I described the concept. Then, the participants took turns discussing the idea. They thought of simple argument maps. I mind-read the approximate replies of people, grouped them and traversed them during my argument. +53. I included recursion in the interpreter with individual projects. I wrote simpler algorithms suitable for assignments. These algorithms were long enough. However, they were not too complicated. They were perfectly programmed, derived from relatively more complex other algorithms. +54. I wrote ten details per line, connection and more, using mind-reading and functionally decomposed algorithms. These details were algorithms. Writing ten algorithms also applied to products. It also applies to central ideas in an essay. I also wrote 80 philosophy breasonings. +55. I could reuse the functional algorithm and change how I used it and its parameters. Before this, I wrote the functional algorithm. This functional algorithm had a single function. It eliminated unnecessary code and mistakes because of having a single function. Contrary to some, writing wasn't more complicated and didn't take longer to write. +56. I reinforced the usability of texts. I worked out the algorithms and algorithm details before redrafting or removing them. I encountered fascinating thoughts when I functionally simplified the algorithms. Next, I smoothed terms and arguments, opening algorithms to readers and highlighting their relevance. Finally, I diluted most of the text for readability and consolidation. +57. I just did it myself first. Algorithms need to be simplified to, for example, five functional parts (representing the algorithm). I arrived at these parts by simplifying the logic. I also simplified the data. I repeated this if necessary. + +58. 
I wrote a Learning Content Management System in SSI Web Service (SSI-WS). It was a mixture of algorithms, not just SSI-WS, and used text file editors to manage data. It contained forums, lessons and assessments. It needed to be accessible to disabled people. I helped disabled students with software. +59. I sped up SSI-WS to work with multiple-page course sites. In my List Prolog and State Saving interpreters, I made the spelling of the string concat and member commands uniform. I kept \"=\", not \"equals4\" in the Prolog to List Prolog converter. I redrafted the converter's handling of lines, variable names and single command cuts. I fixed the bug with multiple calls to a function and cut in List Prolog Interpreter. +60. I saved progress over different windows in SSI-WS, including saving progress if a window was closed. If a student made changes in one window, they could load these in another window. These windows had different session data. One needed to be aware that one session could overwrite another session's data. To prevent overwriting data, I checked that data wasn't going to overwrite other data in each session. + +61. There was a single login for quizzes and student administration. There was two-factor authentication using email. SSI Web Service contained the whole website in a single algorithm, so students needed a single login for the entire site. When students log in, they can visit any part of the site. If the student wasn't logged in, they could see any public part of the site within SSI-WS and then log in. +62. I didn't need to log in with cookies if I was already logged in before a time limit. I logged in and stayed logged in, in different windows. I returned to the GET form method, allowing me to copy and paste URLs for specific pages. The algorithm saved data by changing it on the server rather than in variables in the algorithm. 
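The overwrite check in item 60, verifying that one window's save will not clobber data another window has already written, can be sketched with a version number per session record. This is an illustrative Python sketch; SSI-WS is not shown, and the versioning scheme is an assumption, one common way to implement such a check:

```python
# Sketch of item 60's check: a session save succeeds only if no other
# window has written to the same session since this window loaded it.

store = {}   # server-side session store (assumed): id -> (version, data)

def load(session_id):
    """Load a session's data along with the version it was read at."""
    version, data = store.get(session_id, (0, {}))
    return version, dict(data)

def save(session_id, loaded_version, data):
    """Save only if the stored version still matches loaded_version;
    return True on success, False if the write would overwrite newer data."""
    current_version, _ = store.get(session_id, (0, {}))
    if current_version != loaded_version:
        return False                       # another window wrote first
    store[session_id] = (current_version + 1, data)
    return True

v, data = load("s1")
print(save("s1", v, {"page": 2}))          # → True (first write succeeds)
print(save("s1", v, {"page": 3}))          # → False (stale write rejected)
```

A rejected save would then reload the session, merge the changes and retry, which keeps the sessions in different windows from silently overwriting each other.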
As long as the algorithm collected garbage after closing a predicate (speeding the algorithm up), the single SSI-WS algorithm made testing and a single login possible. +63. I redirected the user to the first page if they opened the landing page or sent them to the URL in the GET form method. I wrote an AI word-guessing player. It guessed from a list of words and recent words and detected cheating if either player used recent words. Finally, I wrote a solid-block maze in Prolog in HTML, using Javascript for get_single_char. It was a 3D maze, with different levels visible at the same time. +64. The person established a company to buy products, increasing quality with testing and allowing others to buy many products. This company bought products to try. It also recruited students from various places. In addition, it organised scholarships for them. It subsidised fees. +65. I kept writing and solving spiritual science, for example, spiritual computers. I wrote As with neural networks. I wrote an algorithm to help me understand the creation of the algorithms and logic. I sensed the urgency with a backlog of algorithms to write. And students and staff needed As. +66. I used previous algorithms as helpers and wrote algorithms for institution assignments. I took action and did work when I wanted. I avoided stricture but came up with ideas about each thing. I set up issues (to-do lists) for algorithms. Also, I generated other possible algorithm specifications using different algorithms, solving problems myself. +67. Higher education was similar to VET, and I thought of more thoughts. So I wrote an assignment to be a vocational education and training (VET)-like assessment. After writing 250 breasonings, the rest fell into place. The helpers needed a seen-as version of trains moving thoughts into position before being thought, funnels and harmless interpretations of possibly radical ideas. VET did all these for 250 breasonings. + +68.
I wrote the contract with the company, different from a computer science student agreeing to do the work. Separately, I set the computer science assignment. I wrote the philosophy. I explained shortness in terms of longness. I concluded there is nothing else to a quantum conclusion, the top in our time. +69. I inspired other algorithms. To reach the proper logical standard, I wrote ten algorithms and 80 philosophy breasonings. I wondered whether it would be the top in the future. I then asked what caused it to work. Does the universe naturally form divine forms? For example, long arguments containing perfect ideas are divine. +70. I breasoned out 250 breasonings in philosophy to earn 100% (like 100% correctness from deleting the record of a finished predicate or allowing the same predicate to be concurrently run). There were 250 philosophy breasonings and 250 algorithms. I prepared for each assignment in this way as a professor. I wondered whether future science was further down the argument with perfect programs. I allowed the same predicate to run concurrently. +71. I worked on the algorithms instead of asking the future computer to work on them to aid my understanding. I collected garbage by deleting finished predicates. I recorded the state of variables after finishing the predicate. I deleted the record if there were no choice points in it. I asked, what more developed time travel is available? +72. I replaced \"n\" and \"v\" in List Prolog with \"&\" and \"^\", respectively. I considered \"predicate name\" and \"variable name\", or \"system name\" and \"system variable\", but these would take too long to type. I could change them to \"predicate name\" and \"variable name\" later because I would write the code in Prolog, not List Prolog, at first, and the longer names would only be visible to the interpreter, not the user. I converted any instances of \"predicate name\" and \"variable name\" to Prolog, not List Prolog, if they were visible to the user. +73.
I wrote the Learning Content Management System (LCMS) in State Saving Interpreter Web Service to help earn 100%. I used the LCMS to teach Prolog. I explained the commands. I gave the question. I checked and returned hints about improving the answer. +74. I introduced myself and discussed the topics. I did this by establishing a profile page and multi-dimensional forum. I started with the top. I could go up or down. Vision-impaired people could use the browser or navigator because it had a feature to print all levels. +75. I searched with the vision-impaired browser. I began by editing the post. Then I viewed the post. I could also search for terms in the post. Finally, I could search my philosophy for conjunctions of terms. +76. The students could verify and submit programs however many times they wanted. To allow this, I made automated programming assignment verification and submission algorithms. These algorithms helped increase grades. Indirectly related to my logic studies, I attended the International Logic Summer School on a scholarship. The prestigious institution appointed me as a logic tutor. +77. I sent the student's grades and financial information to them. As part of this, there were messages on the system. The students logged in to read these messages. There was also an email notification when there was a new message. Each message had recipients, a subject and a body. +78. Students could download resources to complete work. These resources were on the LCMS. First, I wrote the source. Second, I uploaded it to the LCMS. There were also a unit outline and assessment information on the LCMS. +79. The students could submit essays (and play ten types of games). They wrote the articles. They pasted the paper into a text file. Finally, they submitted the text using a form. The lecturer marked this essay and commented on the clarity of connections (with ontologies). +80. I wrote as the point of immortality. 
As part of this, I checked the essays for similarity with other articles in the database. I also checked whether they correctly cited the source. I checked the essay format, including the title, introduction, exposition, critique, conclusion, citations and references. I also checked that the writer had had ten words or fewer per quote and that they had adequately paraphrased unquoted excerpts. + + +"] + diff --git a/Lucian-Academy/Books/IMMORTALITY/Immortality 13.txt b/Lucian-Academy/Books/IMMORTALITY/Immortality 13.txt new file mode 100644 index 0000000000000000000000000000000000000000..c505f3aa43c0b62a2b84c8a34c966f23b65f12b6 --- /dev/null +++ b/Lucian-Academy/Books/IMMORTALITY/Immortality 13.txt @@ -0,0 +1,89 @@ +["Green, L 2022, Immortality 13, Lucian Academy Press, Melbourne.","Green, L 2022",1,"Immortality 13 + +1. I downloaded the Learning Content Management System (LCMS) to browse offline. Before this, I stated that the LCMS is based on the vision-impaired browser. I reused the resource in the LCMS. Also, I bookmarked my place. I stored items offline. +2. I computed on parallel machines. I listed all the jobs. Then, I selected parts of this list. Then, I ran these parts on different computers simultaneously, given memory limitations and possibly multithreading. Finally, I updated the finished tasks. +3. I rewrote non-deterministic predicates in SSI to be more straightforward. Before this, I wrote ARIA labels to help make my web page accessible to certain disabled people. I used labels to describe elements with no visible text on the screen. I compared Prolog on the web and terminal. I wondered if SSI was more intuitive than SWI-Prolog. +4. I wrote the HTML section in List Formatting Language (LFL) for programs. First, I entered the Prolog algorithm. Then, I pretty-printed Prolog. The algorithm could be pre- or post-pretty-printed. I could also pretty print List Prolog after entering it in List Formatting Language (lists of lists). +5. 
There were specifications for the algorithms in the philosophy. Separately, I created an HTML table with LFL. First, I inserted a list for the table. Next, I added a list for the row. Finally, I put in a list for the column. +6. I entered LFL in the chat. Instead of video and presentations, the participants typed during lectures as a chat. The lecturer typed and presented text lectures with questions. The students typed the answers. They asked questions and interacted during the conversation, customising their experience and catering for their needs. +7. I could quickly test the website using a single set of tests. Before this, I tried the SSI-WS app with SSI (not on the web) with a text file. First, I wrote the return-delimited inputs in the text file. Then, I pasted it into the terminal. Finally, I tested each part of the algorithm, assuming it gave output to check at the end. +8. When testing an SSI-WS algorithm in SSI (not in the browser), I entered the inputs when the text file name was entered as an argument, checking that the questions asking for input and the documentation matched the clarity and user-friendliness of the text file. Given this file, the algorithm took each line of input from it. I also tested each predicate separately. I automatically took the output from one predicate to input into the following predicate to test the predicates separately. I tried child predicates before parent predicates. +9. As an alternative to pedagogy, I could breason out business. Before this, I stated that I ran businesses after breasoning out a breasonings-accredited (not accredited outside) Pedagogy master's degree. The education company was a business (earned money). The teachers bought products as part of their jobs. The first breasoning (the degree) protected future uses of the department, and subsequent breasonings were for specific products (jobs, employees, sales and assignments). +10. Customers trusted the PhD/copywriting standard. 
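The testing style in items 7 and 8, writing an algorithm's answers return-delimited in a text file and letting each question consume the next line, can be sketched as follows. This is an illustrative Python sketch, not SSI itself; the `ask` interface and function names are hypothetical:

```python
# Sketch of items 7-8: an interactive algorithm's inputs are scripted as
# newline-delimited text, so the same test file can drive every run.
import io

def run_with_scripted_input(algorithm, input_text):
    """Feed newline-delimited answers to an algorithm that calls ask()."""
    lines = io.StringIO(input_text)
    def ask(question):
        # Each question consumes the next line of the scripted input.
        return lines.readline().rstrip("\n")
    return algorithm(ask)

def example_algorithm(ask):
    name = ask("Name? ")
    n = int(ask("How many? "))
    return [name] * n

print(run_with_scripted_input(example_algorithm, "list\n2\n"))
# → ['list', 'list']
```

Testing child predicates before parent predicates, as item 8 describes, then amounts to running each function with its own scripted input and passing the checked output onward.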
I breasoned out 5*50 GL As per product or text (5 million breasonings) and 250 breasonings per line of the text (in fact, I covered this with the 5 million breasonings). I used all my computer and phone processors to finish the breasonings more quickly. As a result, I could help more meditators more carefully with mind-read philosophies, algorithms and details. +11. I wrote 250 algorithms per assignment, with a particular number of assignments per degree. I used the text-to-breasonings algorithm to produce the commerce-type As. Text-to-breasonings required education, lecturer, recordings and medical skills. Text-to-breasonings could breason out word (not character) breasonings of the correct quantity required for master's and above. Epistemological knowledge needed to know that text-to-breasonings needed to do this included writing enough algorithms for qualifications. + +12. I researched infinite knowledge in immortality (for example, on news, music, mind reading, animals, new generations, breasonings, thoughts and algorithms, where one thought generated an infinite number of ideas). Separately, I wrote that the educational products needed accreditation from Lucian Academy. I considered professional development but wanted followed algorithms with details, which it couldn't quickly provide. I considered the universe of ideas but needed more prestige. I developed the prestigious institution with 5*50 As per assignment and a high number of algorithms for an important project. +13. I structured the website to self-manage and automated it, including supporting new people. I determined each process and helped people get them right with 5*50 As. I started by writing the processes. Then I wrote the arguments. Finally, I voted on changes. +14. I did all the work myself, and the philosophy doubled the results like the rebirth of nature, where I detailedly mind-read the other students' data. I wrote that enough As took care of separate pitch As needed. 
Separately, I helped someone with an A when I earned it, with interpreted (other sets of data for) algorithms to help lecture. Two uses for algorithms helped students in the class make high distinctions. Multiple use-trees over time helped up to 15 students earn high honours. +15. I kept track of my student-employees and helped them research their and the world's frontiers of knowledge, synthesising them. I did this as part of driving myself like a DBA. I did this by writing 4*50 As for sales, bots and business. As part of this, I wrote algorithms at school and in the industry. I developed systems to help write more or detailedly mind-read details. +16. I acted professionally, preferring philosophy. I led business mentoring (leading to professional accreditation, income and staff). I hired lecturers who would teach my topics or meditation and breasoning-related topics. On another note, while some wrote their systems and rose to the top, others helped others (it didn't matter if the systems were too fast for them, they needed key points about research to catch up). While students were mind-read about algorithmic choices ahead of time, the teacher gave them more when they were correct. +17. I wrote and organised advertisements with many As. There were levels of As for each type of person, or someone like them, and their life events. I wrote lots of drafts. I started with a brief and mission (why the brief was unique to the task). Then, I mind-read myself for alternative sides (why the product was attention-grabbing). Finally, I was sure of the secret business product for prospects (or why the prospective customer wanted the product). +18. The company needs to keep up with thoughts about bots and other things like them. I offered accreditation (with 50*50 As, or 50 million breasonings at the start). I broke away and focused on the quality of the planned thoughts. If there was a departmental event in the person's life during the course, I breasoned out 50 As. 
Education jobs required 5*50 As to cover different people and times involved. +19. There was nothing above the level. I gave 5*50 As to sales. It was afterwards when the manager had finalised everything. On a completely unrelated note, the professors could more easily analyse algorithms. An algorithm invented the professor's work and the rest. +20. I followed and met some of the people (lovingly called bots as a simple model). I gave bots 5*50 As. These \"bots\" were human, created at birth. I breasoned out copies of the philosophies before their conception. I mind-read their data for the algorithms. +21. It was actually 50 As, but I had recently finished another book with 4*50, which influenced it, where it was clear everything required 4*50 As or higher. Also, I composed music and gave it 5*50 As. The background was audible. It was more challenging to perform but highly desired. I worried it sounded too negative and wouldn't be accepted. + +22. I anticipated the end of the music. When mixing and mastering, I designed a hall with a grand design and acoustic features. I made a conch painting. The shell was gold and pink. I thought of the audience's possible comments, such as \"It sounds like being underwater\". +23. I found algorithm/building/copywriting intersections (with new words, such as \"commtainer\" (sic), which copied virtual rooms). As part of this, I architecturally drafted cutting-edge copywriting. I did this by rewriting the copy and adding As. I gazed at the text and thought of possible questions and answers. Finally, I dipped into thousands of drafts (relating to different algorithms) after mind-mapping the topic and made the copy from them. +24. I balanced human effort with the output of algorithms written wholly by the person. I did this while programming at the company. I did this by preparing SSI Web Service for simple apps. Second, I had internal research that the company conducted.
Third, the company wrote white papers containing details about its products. +25. The report covered the algorithm with comments, the input and output and optimisations. I wrote the essay. I integrated complicated algorithms. I analysed a single idea in different ways, for example, induction, meta-induction, games and interpreters. Finally, I settled for students covering algorithms. +26. I helped with computer science by comparing and helping with assignment milestones. I wrote the finance task. I wrote the information product. I rented the spare flat after I renovated it. The trade-off between computer science and philosophy in favour of computer science was easiness for professors, and in favour of philosophy was higher grades (but I decided with enough epistemological guidance, computer science could have a 100% standard). +27. I developed education courses, replying to students' posts about sources. As well as supporting education jobs, they supported departmental products. Education was responsible for delivering the developed idea to the student's brain. The student needed to represent this answer in their work. The answer could be in a lecture, source or meeting. +28. The As contained ideas. I wrote courses to help earn theatre, music and film jobs. I developed the education As. Also, I created the departmental As. I aimed for specific jobs that were suitable. +29. The student wrote assignments during the course (with quizzes about pre-written algorithms). I wrote business courses. I wrote specific As for everything. I detailedly mind-read data for algorithms. I expanded these when given time. +30. I included answers to quizzes in written lectures. I wrote the education course. I asked what output particular input resulted in for an algorithm. I asked about possible ways of solving problems. I asked students to verify a viable method. +31. The professor asked to read my Immortality book. I wrote a meditation course. 
In it, students completed algorithms analysing life by meditation. I discussed the number of utterances and the difference between institutions and temples, an algorithm-writing algorithm and immortality. The algorithm-writing algorithm could give meditators thoughts while meditating. +32. There were medical jobs, students and hospitals. I developed the medicine course. There were 5*50 As for food and sleep. I also discussed meditation, courses, pedagogy, mind-reading, time travel and immortality as medicine. I emphasised using text-to-breasonings with headache prevention medicine for many of these. +33. There might be 5*50 As for Computer Science or English jobs. I decided this after writing the Computational English course. As part of this, the students studied texts about computational hermeneutics. They could teach Computational English to their students. There were 5*50 As for jobs. Computational English is related to writing algorithms or doing something productive each day. +34. I developed an interpreter with algorithms in list format to generate more algorithms with a second algorithm. As part of this, I wrote the Popology course. This Popology course contained arguments about people and the interpreter. To explain, an interpreter runs algorithms in a programming language. Also, there were 5*50 As for teachers' and students' jobs. +35. Teachers needed enough experience to become professors. I wrote on Societology. Societology included ideas on associations and compilers. I aimed to write a compiler with a retry feature in trace mode, and I could eventually convert it to C. There were 5*50 As for compiler teachers' and students' jobs and assignments. +36. Lucian's machines were initially a cross between Prolog and register state machines, ending up as a compiler. As part of this, I wrote about Mathematics. This Mathematics book continued about compilers and other algorithms. 
Students examined topics or synthesised algorithms, for example, induction and an interpreter. Possibly examination by writing the book enabled Green to complete the compiler. +37. I worked out many people wanted to become immortal. So I wrote a book about immortality. I wrote about the culmination of philosophy, meditation and programming and becoming a time travelling immortal. People used text to breasonings to extend their life expectancies. Immortality involved breasoning out 5*50 Grammar-Logic breasonings (5 million breasonings) on immortality. + +38. I breasoned out 5*50 As for computer science jobs. I did this to profit from computer science. As part of this, I mentioned my computer science projects in job interviews and applications. I also earned money after studying courses. In addition, I contributed ideas to education. +39. I made money from my paraphraser algorithm. I did this by writing the research. First, I wrote the essay. Then, I paraphrased it. I also checked it using induction (thought of details). +40. For a large project, there were 5 As of hand-breasoned breasonings. Besides this, I made sales from Grammar-Logic. I used the Grammar-Logic algorithm to generate mind maps for philosophies and details. Grammar-Logic users also needed meditation and other skills. Users required Grammar-Logic for masters and above and immortality. This algorithm \"closed the cash register\" to give the rest of the content when the hand-breasoned breasonings had finished. +41. I noted that Grammar-Logic didn't tire anyone, but the later breasonings were on from the earlier ones. I earned from Text-to-Breasonings. I noticed that combining philosophies and more from these derivatives meant infinite knowledge was possible. I broke from tradition and elected to keep algorithm specifications in the philosophy. I and others could sell Text-to-Breasonings, a way to automate breasoning companies. +42. 
Essay Helper contained an argument map traversal algorithm and could be modified as an argument map editor. I made earnings from Essay Helper. Many people returned to study. Essay writing contained laborious, automatable sides but required \"detailed reasoning to be corrected\", i.e. it needed hand-paraphrasing. I could place Essay Helper on a website without needing to install Prolog. +43. I traced who had touched, looked at or logged in to account data. I earned money from the interpreter. I created the programming language to, for example, create algorithms, formulas or web apps. I made money from the programs that ran with the interpreter. I built systems in each part of the business based on the interpreter. +44. I invested in education. I made earnings from the compiler. It was a step towards the compiler in C. The compiler in C moved away from the original version of Prolog that ran with the compiler. On the side, I assembled an ecosystem of utilities that I earned credit for in my studies. +45. My aim with the compiler in C was education. I also made money from writing this Prolog compiler in C. I noticed that greater efficiency in the compiler in C paid off. I also wrote educational materials in Prolog. I wrote the compiler to guide understanding. +46. I recommended students write their versions of predicates. I financially benefited from Combination Algorithm Writer (CAW). It was a brute-force inductive algorithm finder. It could provide answers that Program Finder with Types (PFT) could not. I was paid money for optimising my code with CAW. +47. I solved the same old thing by working the answer out. I found Program Finder with Types (PFT) financially viable in helping find algorithms. The first version of Program Finder found programs that moved data around, for example, into new lists of lists. The second version, PFT, found algorithms from variable-based type statements. 
For example, the algorithm transformed common patterns in data into types and code. +48. I optimised the algorithm using algorithms and hand-written methods. I made money from designing the font's state machine. I learned about state machines, applicable elsewhere, and created a graphical user interface. Separately, I could make a simple decision tree with PFT by either using PFT with CAW from scratch or PFT with a CAW dictionary item with the recursive use of findall. I could prepare for the integration of PFT with CAW or neural networks. +49. The person earned more than the computer. I earned money from making websites. I was given money for production. On another note, after generating part of the decision tree, I could use CAW to count the number of branches. Also, I treated the decision tree like a state machine rather than a hierarchical list to find with PFT and found rules about other parts of the algorithm to customise the algorithm, like a neural network. +50. The advantage was the code clarity of the compiler in the language. I wrote other philosophy algorithms for profit. I examined intelligent algorithms that occurred to me to understand the mind and computation better. At the same time, I thought that the neural network was a state machine. It reused parts and had state machines within state machines. +51. I noted that the institution was primarily a business (it needed a pedagogical base). I designed neural networks for money. They helped with natural language and more difficult inductive algorithm writing. Later, I made up courses in business, small research ideas and many ideas to support them. I looked at algorithms both neurally and non-neurally to check my thoughts. +52. I used the neural network to optimise my code, printing subtle refinements and removing unnecessary data processing. I did this to examine data science in different areas. The employees needed to understand the (neurally outputted) algorithms to demonstrate understanding. 
Separately, a state machine could be represented in code and was like a decision tree. I preferred functional (intermediate) predicates to save code and wrote functions with separate functions. +53. I educated people for money. I did this by developing apps to make money. I accomplished this by hosting the algorithm on my server. As part of this, I had a free version with limited features. The premium version had more features, for example, file access from Prolog. + +54. I made music licensing sales to companies by mind-reading when they were interested in the tracks. I listed the opportunities. I tested whether each of them wanted my tracks. I included an \"other\" mind-reading option as a way of testing my unprofessionally produced tracks, and lyrics-only unproduced tracks were wanted, leading to making them. I focused on medieval remixes of my music. +55. I repeatedly checked my music, and the music the companies wanted was the same genre. I made a shortlist of the genres of my music. I checked my music's genres matched those of the opportunities. I focused on my favourite tracks. I wrote music just for opportunities. +56. I researched, calculated the profit, and aimed for work goals. I advertised my products and listed them on appropriate sites. As part of this, I made apps that promoted innovative apps and philosophical products. I also listed my services on work listing sites. I devised a bot that could complete customised work. +57. I wrote a findall to recursion converter when converting SSI to C (which could only process recursion). I designed a font for my software. It was legible at a small size. In the first year of my academy, I asked students to use type statements in List Prolog. Their limitation was that repeating lists needed brackets. +58. After making money, I invested in others. I made money from business and investments. I used brain circuits, replicated success and encouraged sales. When I had a surplus of money, I invested it in others. 
I attended business pitches. +59. I simplified and set the predicate as the question. I wrote an algorithm that breasoned out five million breasonings from a file. Five million breasonings is the number of breasonings in a PhD assignment (4 million) plus breasonings for professional work (1 million). The algorithm truncated the file to the number of words that the Grammar-Logic algorithm would transform into the number of breasonings (words). I breasoned out 5 million breasonings for each chapter and algorithm. +60. I covered the PhDs myself. Separately, I ran the algorithm that breasoned out 500 breasonings ten times. I ran these ten threads concurrently. I also wrote a queueing algorithm for my meditation tasks. In addition, I wrote the algorithm-writer algorithms to write algorithms for assignments. +61. I recorded the assignments which I had increased. I wrote a file with the lines to increase and a file with the lines increased. I did this by duplicating the lines in the file so that it met the number of words. I wrote details about these lines. I recorded the philosophies and algorithms which I had increased. +62. I found customers and made decisions using a synthesis of mind-reading, Grammar-Logic, Jyotish (Vedic astrology) and neural networks. A scientific method was neural networks, which, like program finder, could remap data. I began writing predicates with program finder, getting down to foldr append and foldr string concat, etc. I used if-then and level counters in predicates. I wrote simpler predicates using this method. +63. I could control all aspects of my business. I did this by writing 5*50 As for Vedic astrology. First, I located the planets at my birth. I found daily predictions. I made business decisions. +64. I talked about each key idea. I breasoned out 5*50 As for each sentence when publishing. I found these 5*50 As with help from Vedic astrology. I found 5*50 As for customers. I wrote them, and I found a suitable interface. +65. 
I taught at the academy with a series of algorithms (about commands) stepped through on the web (explaining each data structure's entire transformation). I taught the essential skill with the predicate. I used A=[B|C] rather than [B|C] in the predicate head. I simplified some predicates. I let others simplify the rest. +66. I set 20 tests per assignment without saying to use previous predicates or asking questions that split up related predicates. I wrote breasonings about the tests. I prepared enough breasonings for each project. I focused on computational philosophy that maximised human problem-solving. As an aside, I could do Shell with a Prolog theorem prover. +67. I stated that append([\"a\",\"b\"],\"c\",D) should give the result D=[\"a\",\"b\"|\"c\"]. I took care of this by running the append predicate in the interpreter. I didn't turn off trace mode during the append predicate. I modified the append predicate to do something else. I simplified and optimised a predicate, splitting it into different functions. +68. I researched reading and writing files. I separated SSI and non-SSI non-deterministic commands (with multiple outputs). In non-SSI, these commands naturally returned multiple results because I ran List Prolog with Prolog. In SSI, these commands found a certain number of results at the start and returned these on backtracking. The non-deterministic non-SSI commands couldn't report failure because if-then interrupted backtracking. +69. I wrote a single command for all non-deterministic SSI commands, then wrote individual predicates for them. I ran these Prolog commands from within List Prolog as part of a prototype and to see if I could add to them. Then I removed this command and wrote predicates that I could run to find these commands' solutions. These predicates could change memory settings, globals (predicates) and retract predicates. 
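The behaviour noted in paragraph 67 above, where append([\"a\",\"b\"],\"c\",D) gives D=[\"a\",\"b\"|\"c\"], follows from the textbook two-clause definition of append, whose base case unifies the tail of the result directly with the second argument. A sketch (my_append is a hypothetical name, since append/3 is already built into SWI-Prolog):

```prolog
% Textbook append: the base case unifies the result's tail with the
% second argument, so a non-list like "c" becomes an improper tail.
my_append([], A, A).
my_append([H|T], B, [H|C]) :- my_append(T, B, C).

% ?- my_append(["a","b"], "c", D).
% D = ["a", "b"|"c"]
```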
I could also write a single command for all non-deterministic and deterministic SSI and non-SSI commands, then change them to predicates. +70. I designed a command that turned a data structure into a state machine and then into an algorithm. Separately, I could treat all commands as non-deterministic, i.e. they returned multiple possible results. But it was easier to read these commands if they were treated separately as deterministic or non-deterministic. However, I would still replace them with predicates. I could treat a deterministic command as non-deterministic with no other solutions. However, it would be easier to read and modify code that treated them as deterministic. At any rate, I would replace many commands with List Prolog predicates, some of which made commands to the interpreter. +71. I added core (recommended) methods to tutorials. On a different note, I added new commands to languages and reserved words by scanning a list of predicates. These commands could be translated more quickly and protected from being redefined. I could detect new commands and add only those to these lists. The new interpreter returned several pretty-printed non-deterministic results. As a precaution, I checked when a predicate had choice points left over. +72. I added a version of foldl that didn't require an initial value but took the first value of the list. Unrelatedly, I added more ISO commands (with terms). If one of these commands took an argument with a term, for example, write, it would write this term. Terms such as commands also could be called. Finally, I wrote the interpreter, including only necessary changes, where lists I could manipulate more easily replaced terms. +73. I added new commands to the documentation. I did this by scanning these commands from the predicates. Next, I listed their modes. Then, I added the new commands to unit tests using data generated from their type statements. 
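A foldl that takes its initial value from the list itself, as paragraph 72 above describes, can be sketched in terms of SWI-Prolog's foldl/4; foldl1 is a hypothetical name for it:

```prolog
:- use_module(library(apply)).  % foldl/4
:- use_module(library(yall)).   % [X,A,B]>>Goal lambdas

% foldl1/3: like foldl/4, but the accumulator is seeded with the
% list's first element instead of a separate initial value.
foldl1(Goal, [First|Rest], Result) :-
    foldl(Goal, Rest, First, Result).

% ?- foldl1([X,A,B]>>(B is A+X), [1,2,3], S).
% S = 6
```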
Finally, I checked whether the unit tests were correct and that the modified predicates worked with changes to them. +74. I set C globals to array trees representing, for example, empty lists for reuse. On another note, I rewrote the interpreter to accommodate commands, such as subatom, with more than three variables. Subatom could take one of several modes. In SSI, which needed to calculate several sets of results of subatom at first, I took inputs and gave outputs. I included the modes of arguments in the new version, which could deal with multiple arguments and modes. +75. I rewrote \"get values\" to return a list of values for any number of arguments. I found the value of the first argument. I repeated this for each argument. I only got values for inputs. I computed outputs and saved them to variables. +76. I ran commands by processing their variables recursively. After finding the input variables' values, I constructed the command term and ran it. Later, I skipped this stage and ran the command as a List Prolog predicate. I couldn't see any reason to keep the code construction algorithm; however, system calls might require it and possible types and modes. As a side note, I grafted the deprecated feature into a new version by adding its predicate to the algorithm. +77. I needed undefined variables to avoid singleton errors. I didn't save values to or retrieve values from undefined variables represented by the underscore character (\"_\"). The interpreter shouldn't treat it like a defined variable (such as \"_A\" or \"B\"), which can save and return values. I made sure that the Prolog to List Prolog converter and List Prolog to Prolog converter used strings (e.g. \"_\"), not atoms (e.g. '_') as variable names to ensure compatibility. I also ensured algorithms converted from Prolog into List Prolog were compatible by changing variable handles so they wouldn't conflict with strings in Prolog. +78. 
I kept if-then free of negation (which would have if-then's non-nested) for clarity. This process was part of converting SSI to C. I maintained user-friendliness by adhering to term-syntax until converting to terms as array-trees in C. I could also use terms as commands but didn't need to because C had no way of dealing with Prolog commands. These terms could be commands in findall, etc., but didn't need to because C had no choice points to cope with findall. +79. I initialised numbers as equals4 arguments as \"1\", not \"\\\"1\\\"\" (a number as a string) and not 1 (not a string, which it needs to be for me to convert it into a term). Earlier, I stated that I stored variables within equals4 and didn't need to initialise them separately. I only needed to initialise the terms representing the variables, for example, \"[v, a]\". I converted these strings into terms in C array trees. Upon conversion to C, \"[v, a]\" is converted to a C array tree in the code. +80. I wrote comments describing certain term strings as converted to array trees (a way of storing a term as a tree in an array in C). I needed to initialise variables (integers, strings and arrays) used for the first time as arguments in C. I wouldn't always use these variables as equals4 arguments. However, I sometimes might use them as arguments in other functions converted from the interpreter to C. Initialisation includes setting the variable type and assigning an initial value. + +"] + diff --git a/Lucian-Academy/Books/IMMORTALITY/Immortality 14.txt b/Lucian-Academy/Books/IMMORTALITY/Immortality 14.txt new file mode 100644 index 0000000000000000000000000000000000000000..8a567e506b2af3bcbeb2cb634c69ddb084d16e57 --- /dev/null +++ b/Lucian-Academy/Books/IMMORTALITY/Immortality 14.txt @@ -0,0 +1,104 @@ +["Green, L 2022, Immortality 14, Lucian Academy Press, Melbourne.","Green, L 2022",1,"Immortality 14 + +1. I encapsulated clauses and findall and made it space-agency compatible and efficient. 
I made an SLG tabling database of previous results in List Prolog Interpreter (LPI). Third, I set the number of times a computation would run before the interpreter tabled it. Fourth, I replaced reused items in the table with garbage-collecting symbols. Finally, I only tabled items over a particular length. +2. I tossed between Prolog on the server or client side and chose the server side to access files, while the client side could run using the client's processors (like programs in an operating system, and these apps are easier to install and run). I made an SLG tabling database of previous results in State Saving Interpreter (SSI). This system was similar to that of LPI. I immediately kept SSI Web Service apps instead of cut-by-the-page ones because SSI-WS was fast enough. Users required SSI-WS to run their algorithms on a server. +3. I solved the debug display problem in string concat and member by running List Prolog predicates. Before this, I bug-checked string concat and member in SSI. I did this by checking that the string concat and member debug display were correct. Then, I checked the number of arguments in debug was valid. Finally, I checked the values were correct. +4. I added new commands to SSI. I called the Prolog predicates for deterministic commands with their names and modes. I also modified the virtual command code for non-deterministic commands with their names and modes. (Virtual commands found all the non-deterministic solutions at first and backtracked to them later.) Eventually, I replaced this code with List Prolog predicates. For example, I wrote the List Prolog predicate for string concat in terms of \"string chars\" and append. +5. I helped users realise the member command's correct function in the predicate state machine. I swapped X and Y back to normal in member(X, Y) in LPI. This order helped avoid confusion. I also converted equals4 in List Prolog to = in Prolog.
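The tabling of previous results described in paragraphs 1 and 2 above might be approximated by caching each goal's solution set on first execution. This is a minimal sketch, not real SLG resolution (there is no call-variant indexing or suspension, so differently instantiated calls are conflated); table_call/1 and tabled/2 are hypothetical names:

```prolog
:- dynamic tabled/2.

% table_call(+Goal): on first use, run Goal to completion and cache
% its solutions; afterwards, answer from the cached table.
table_call(Goal) :-
    copy_term(Goal, Key),
    (   tabled(Key, Solutions)
    ->  true
    ;   findall(Key, Key, Solutions),
        assertz(tabled(Key, Solutions))
    ),
    member(Goal, Solutions).
```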
I used Prolog to write the predicate state machine in SSI because it was easier to follow. +6. I considered writing an original version of the member command with input followed by its output for uniformity when generating algorithms with CAW. I swapped X and Y back to normal in member(X, Y) in SSI. In the first version, I changed the arguments of the member command back before manually processing it, given various modes. I ran the member command in the second version using a List Prolog predicate. I found out and worked out, not copied work. +7. I added non-deterministic choice points (for append, repeat, etc.) to SSI. For example, repeat added infinite choice points. The interpreter continued to backtrack to choice points until there were none left. The interpreter returned to choice points if they were left at the end. The interpreter returned true if it had found a solution and false if it had failed. +8. \"Find n solutions\", when cut, finds the first n solutions after backtracking. It used the findall code, finishing after n solutions. \"For all\" similarly used the findall code, failing \"for all\" if there were any failures of the clause. I deleted unused files and merged reused code in the interpreter. I was aware that using fewer predicates was faster. +9. I added member and string concat to LPI and SSI and translated them into other languages. After adding them, I automatically translated them into \"en2\", for example, \"concatenate strings\", which I could translate into other languages. I wrote an algorithm that found the \"en2\" version, considering the complete words and their order. Next, I tested the original phrases to translate by back-translating them and choosing better synonyms that back-translated correctly to the original. I then examined the open-source translation algorithm. +10. I found the first item A in higher-order programming in List Prolog, like anti-ageing in immortality. 
I fixed the bug so that equals4(A, B) displayed the correct exiting debug information equals4(A, A). I also fixed the bug in reverse1(A,[],[3,2,1]) so that the algorithm used the same variable names. I bug-checked the length(A,3) so that the interpreter used List Prolog variable names. I also checked that List Prolog could find input values when building a term as a command and find the results from running a term as a command. +11. I not only breasoned out arguments, I examined them. I recorded whether a predicate had any choice points left on its exit and updated this count when I deleted a choice point. Like taking action on n choice points, I took action on n breasonings. I had breasonings, so I had control (I could remain alive). I had all degrees, so I had God-consciousness and total control (I could use 5*50 As to live as an immortal). + +12. I stated that recording the choice point count for each predicate sped up deleting unnecessary choice point data on exiting a predicate. I deleted the predicate data when no choice points were left. This count included all the counts for predicates within the predicate. I updated the counts for all parent predicates when I deleted a choice point. I warned about the infinite choice points created by repeat. +13. I reused resources by collecting garbage in Prolog, like reusing Essay Helper formats for business reports. I carried out garbage collection using global variables. Garbage collection stored repeated data in variables. Keeping this data saved time and memory by allowing it to be processed more quickly. For example, repeating a long list in the choice-point trail to find a member of it could be avoided, shortening the trail. +14. The garbage collection stored reused variable values. These reused values needed to be longer than a certain length, i.e. four characters. There was no upper limit for values unless there was a storage limit. In this case, the interpreter might split values into lists.
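The garbage collection described in paragraphs 13 and 14 above, which stores values over four characters once and refers to them thereafter, might be sketched as an interning table; intern/2, resolve/2 and the gc_ reference prefix are hypothetical names:

```prolog
:- dynamic interned/2.

% intern(+Value, -Ref): store long values once, handing back a reference;
% short values (four characters or fewer, per the text) stay inline.
intern(Value, Ref) :-
    term_to_atom(Value, Atom),
    atom_length(Atom, Len),
    Len > 4,
    (   interned(Ref0, Value)
    ->  Ref = Ref0                         % value already stored: reuse it
    ;   gensym(gc_, Ref),
        assertz(interned(Ref, Value))
    ),
    !.
intern(Value, Value).

% resolve(+Ref, -Value): look a reference back up, or pass a value through.
resolve(Ref, Value) :- interned(Ref, Value), !.
resolve(Value, Value).
```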
As variables were garbage collected, the interpreter tabled predicate results to avoid unnecessary computation. +15. I stored combinations of garbage collection values with a single garbage collection value, saving space. If a variable value contained another reused value, the interpreter adjusted this in the garbage collection table. It was not a garbage collection table but a knowledge management table. I didn't flatten each value (I used values within values). I stored atom values if they were long enough. +16. I could split strings or atoms. I split strings in garbage collection values. For example, I had A=\"bc\", D=\"e\". Also, I could put string values in terms of other string values, etc. For example, I could have F=\"\", G=\"j\", H=\"k\". +17. In the Philosophy repository, I detected a bug in the lpi and lucianpl terminals. On pressing return without a query, the terminals unwantedly turned on trace mode. I found the cause of this bug and kept whether the trace mode was on the same as before. I caught errors in the converter from Prolog to List Prolog in the terminal. I also caught errors in the interpreter. +18. I set questions about optimising algorithms. Before this, I wrote about the assessment of algorithms in the academy. I wrote the algorithm. I wrote its input and output set. Finally, I set a question about the set of predicates (including any previously written predicates) with a description of the data and what the predicates needed to do. +19. I examined the history of algorithms. I assessed 5-15 predicates per subject. Sometimes a group of predicates had five predicates. The students only wrote what they understood. They liked to think about what was interesting about each algorithm, using the minimum amount of code. +20. I stepped through the non-assessable algorithm in the video. I wrote tutorials, lists of frequently asked questions and sample answers as part of the course. It was like a MOOC. 
Fireworks-like thoughts lighted up the submission box. The tutorial covered the sample answers (which covered examples like the assignment or gave deliberately buggy code), and the list of frequently asked questions covered the project. +21. I converted equals4(A, B) to C without relying on underscore variables. Separately, I assessed different versions of algorithms. I looked up a version of an algorithm with a particular feature on the repository. I assessed it. There were also questions about bug checks. +22. I assessed code in List Prolog, not Prolog, to encourage students to test their interpreters. They could write an algorithm to convert the state machines of List Prolog to and from that as C-code. A converter could bypass List Prolog and convert the state machine into C, which might be unnecessary. The C-code needed function calls to compensate for Prolog features for C. The interpreter tried the following clause if a clause (where an algorithm joined clauses in a single function) failed. +23. I didn't assess whole algorithms but predicates or groups of predicates of algorithms. I did this by taking a set of related predicates from more extended algorithms, allowing manageable-sized assignments. I read the comments on the algorithms. I also set the algorithm parts from the bottom up to finish the whole algorithm. Students copied predicates they had previously written into algorithms that required them. +24. Sometimes, a philosophy has more than one algorithm. I determined that philosophies needed algorithms for assessment. I recorded the philosophies' algorithms using a text database (the algorithms followed the philosophies). Sometimes the whole philosophy paragraph or a line was represented by the algorithm. I set this algorithm as an assessment, referring to the philosophy. +25. For 100%, they needed the same as 250 arguments and algorithms. Students needed 80 philosophy arguments and 80 algorithms for a high distinction. 
I encouraged them to choose an appropriate starting point and write a series of algorithms on a topic. Groups of arguments were also on the same topic. I helped them count the number of clauses (the same as an algorithm). +26. I mind-read the amount of work completed. Before this, I pursued accreditation through various companies. These companies helped finish and correct breasonings and finish arguments. This assistance helped with clear and complete thinking. It also provided the best foundation for future work. +27. I assessed algorithms on the academy website and gave quizzes (about algorithm output) on the external website. I also asked for type and mode statements in the first year on the academy website. These helped new students gather their thoughts and prepare to write the algorithm. On the external website, I provided a model solution using the algorithm, then asked them to use it to find output for a particular input. There were also questions about modifications to algorithms, debugging and API changes. + +28. In the algorithm that converts SSI to C using a state machine, the last output values are only set to output variables if the predicate exits true; otherwise, these variables will remain undefined. Each predicate returns -2 for exiting true; otherwise, it returns -3 for failing. If the predicate exits true, the variables are assigned; otherwise, they are left undefined. If there are inputs with values, they would be compared with the output values as soon as possible, and there would be nothing to output. If-then statements may need inputs set to variables and outputs assigned to variables to compare (separate from the previous idea of setting values before returning), requiring mode statements (which an algorithm can compute) for the antecedent. A program must work out the mode statements of predicates in calls that change from run to run from the code around the calls, with warnings.
Predicate calls without inputs or outputs don't need some of these variable assignments. +29. The program can find mode statements from the algorithm. For example, given the algorithm D=\"ef\", string_concat(B, C, D), an algorithm can work out that string concat's modes are output, output and input. There would be a warning in the case of multiple possible mode statements. Using the mode statement, the interpreter can supply the inputs in a predicate call and compare any outputs afterwards. Without the mode statement, instantiated values may fill variables that need to be uninstantiated. +30. Also, the program can work out the type statement from the algorithm. For example, the algorithm a(B, C):-B=d, number(C) would have the type statement a(atom, number). Also, given the following algorithm, +
+%a1([e,[1,2,3]],B).
+%B = [e, [2, 3, 4]].
+a1([e,F],[e,G]) :- a2(F,[],G).
+a2([], C, C) :- !.
+a2([B1|B2],C,D) :-
+ B3 is B1+1,
+ append(C,[B3],C2),
+ a2(B2,C2,D).
+
+The type statement would be B=[atom,{number}], where {} encloses a list of 0 or more items. The algorithm would recognise types and lists in the algorithm, using \"a1,...,an\" notation (up to a particular value of n). +31. Only the mode statement is needed when converting if-then statements from Prolog to C, not the type statement. In the case of no arguments except the result as the antecedent, i.e. a(R), no assignments of values to inputs using equals were needed. Also, no comparisons of output values using equals4 were required. The interpreter compared only the result with a successful exit value. In this comparison, a successful exit value resulted in continuing or starting to exit the clause. On failure, the clause returned a failure result, output variables were left undefined, and \"update vars\" updated variables to the first arguments in the clause.
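The -2/-3 exit-code convention from item 28 can be sketched in C. The predicate below is a hypothetical example (names and logic illustrative, not SSI's actual generated code): outputs are assigned only just before exiting true, and callers compare each result with the successful exit value.

```c
#include <assert.h>

/* A minimal sketch of the -2/-3 exit-code convention: a converted
   predicate assigns its output arguments only just before it exits true
   (-2); on failure (-3) the outputs stay undefined. */
#define EXIT_TRUE (-2)
#define EXIT_FAIL (-3)

/* Hypothetical Prolog source: halve(X, Y) :- 0 is X mod 2, Y is X // 2. */
int halve(int x, int *y)
{
    if (x % 2 != 0)
        return EXIT_FAIL;   /* fail: *y remains undefined */
    *y = x / 2;             /* set the output just before exiting true */
    return EXIT_TRUE;
}

/* The caller compares each result with the successful exit value, as the
   clause-joining if-then structure described in the text does. */
int halve_twice(int x, int *y)
{
    int mid;
    if (halve(x, &mid) != EXIT_TRUE)
        return EXIT_FAIL;
    return halve(mid, y);
}
```

A failing call leaves the output pointer untouched, matching the rule that output variables remain undefined unless the predicate exits true.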
Variables in each clause joined into a function (for following the nested if-then structure) needed to be different, apart from where the interpreter assigned them at the successful end of a clause. +32. A mode statement is not needed before an if-then antecedent for converting from Prolog to C, where the interpreter can work mode statements out with an algorithm instead. The programmer doesn't need to provide mode statements in this position. Mode statements, worked out by the interpreter, are required at this position to determine which arguments of the antecedent are inputs and outputs. Inputs are already defined, inputted directly into the antecedent, or undefined. The interpreter does not compare outputs with a value during the antecedent; they are compared with a value afterwards or are undefined. If [] is appended to '_', [], [_], [_, _], etc., are returned. So, the programmer should build lists from the empty list, not an undefined variable. +33. A converter converts SSI equals4 arguments to trees using \"string to list\" on the compilation of SSI. The interpreter stores duplicates in one place. These include empty lists, lists containing variables and those containing \"|\" (to separate list heads and tails). The interpreter stores these trees in two arrays, one for numbers (for numbers and lists) and one for characters (for atoms and strings), whose lengths the interpreter measures for storage in memory. These array trees are first created and tested as lists in Prolog, then converted to C arrays. +34. Storing the algorithm as a list rather than a string allows the algorithm to be processed faster. The interpreter converts the algorithm into a tree on the compilation of the algorithm using \"string to list\". \"String to list\" has a C version, created using a converter, which recursively converts the List Prolog algorithm from a string to a list, stored as a tree array. This tree array lets the SSI algorithm be processed more quickly in C.
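The numeric tree array from item 33 can be sketched in C with an assumed tag-and-length layout (illustrative only; a second, character array, not shown, would hold atoms and strings as the text describes):

```c
#include <assert.h>

/* A minimal sketch of a numeric "tree array": a nested list is flattened
   into one int array with tags, so C can traverse it without Prolog
   terms.  The tags and layout are assumptions for illustration. */
enum { TAG_NUM = 0, TAG_LIST = 1 };

/* Sum every number in the term starting at index *i, advancing *i past
   the term read.  [1,[2,3]] encodes as {1,2, 0,1, 1,2, 0,2, 0,3}. */
int sum_term(const int *a, int *i)
{
    int tag = a[(*i)++];
    if (tag == TAG_NUM)
        return a[(*i)++];
    int n = a[(*i)++];      /* TAG_LIST: element count follows the tag */
    int total = 0;
    for (int k = 0; k < n; k++)
        total += sum_term(a, i);
    return total;
}
```

As in item 33, such arrays can first be built and tested as lists in Prolog, then emitted as C initialisers.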
\"String to list\" in C is faster than in Prolog because it avoids memory problems. +35. The paradox of \"string to list\" needing to be converted by \"string to list\" to run it as an array tree is not necessary to solve because \"string to list\" is converted directly to C using converters, and is not run as a List Prolog predicate. However, this paradox is necessary to solve where the interpreter needs to convert \"string to list's\" equals4 arguments to a tree array by running it with the Prolog version. Then the interpreter converts the Prolog array tree to C. \"String to list\" only needs to run in C as part of \"SSI in C\" (for speed); it doesn't need to be run in C to produce the array trees, which are necessary for SSI in C. Also, when inserting symbols for garbage collection in SSI in C, the symbols inserted needed to be specific to SSI in C in case the interpreter ran SSI as an algorithm separately. This rule also applies to these programs' predicate and variable names. +36. The interpreter ran \"update vars\" when the clause exited, which appeared multiple times if there were multiple exits. \"Update vars\" is necessary because predicates operate on values and variables. These variables are in the form of input and output variable bindings tables (as array trees). \"Update vars\" operates on these bindings tables to update variables before returning. Predicates take array trees as input, which are operated on by equals4 in predicates, for example, in append or \"+\". The multiple function format in C is necessary for reusing functions and for recursion. +37. If-then finds outputs from inputs; the interpreter tests whether the outputs are correct and the antecedent is true. If-then works out which arguments are inputs and which are outputs from mode statements, found using a mode-finder, where these mode statements may be chosen from a list manually rather than each mode statement being tried.
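The mode-finder idea (items 29 and 37) can be sketched minimally: arguments instantiated at the call site become inputs, uninstantiated ones become outputs. The boundness representation here is an assumption for illustration, not the mode-finder's actual interface.

```c
#include <assert.h>
#include <string.h>

/* A minimal mode-finder sketch: bound[k] nonzero means argument k is
   instantiated at the call site.  For string_concat(B, C, D) with only
   D = "ef" bound, this yields output, output, input ("ooi"), matching
   the worked example in the text. */
void derive_modes(const int bound[], int n, char modes[])
{
    for (int k = 0; k < n; k++)
        modes[k] = bound[k] ? 'i' : 'o';   /* bound -> input, unbound -> output */
    modes[n] = '\0';
}
```

Where a call could satisfy several candidate mode statements, a fuller mode-finder would emit the warning the text mentions rather than picking silently.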
The post-antecedent comparison statements should be in the parent predicate rather than the child predicate. While values could be passed down and checked in the child predicate, the parent predicate could more easily differentiate between inputs and outputs and compare outputs with the given values. I first solved the difficulty of distinguishing the inputs and outputs by avoiding post-antecedent comparisons because arguments will either be inputs or contain an undefined variable. So, it isn't necessary to make comparisons in the parent predicate, only in update vars in the child predicate. However, comparisons should be made in the parent predicate instead, with the mode statements to properly test the if-then clause, where arguments may be defined but are treated as outputs by the child predicate, so the interpreter must compare them in the parent predicate. + +38. I didn't set output variables in if-then antecedents (A in A->B; C or if A then B, else C) to one of a third set of single variables to be compared at the end of the antecedent. Contrary to previous paragraphs, the algorithm finds all if-then arguments using \"string to list\". There are no post-antecedent comparisons. The algorithm makes the checks as part of predicates and \"update vars\" in the child predicate. I argued for predicate heads to have arguments such as [A1|A2] to save time when processing with \"check arguments\" and \"update vars\". +39. Using a single format for header variables in SSI-C and List Prolog predicates allows cross-compatibility. \"Update vars\" is necessary, and predicate head arguments should be [A1|A2], etc., rather than A, B, etc. [A1|A2], etc., shorten the code length. In effect, they optimise the code. They allow breaking the predicates down by function and returning the result through header variables. +40. The interpreter can pass lists as tree arrays as C arguments. This method requires single (not lists of) variables as arguments.
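With single variables as C arguments, equals4 can be broken into smaller accessor functions, e.g. A=[C|D] becoming head and tail calls that can each fail. A hypothetical sketch, reusing the -2/-3 success/failure convention from the text and assuming a simple cons-cell list representation:

```c
#include <assert.h>
#include <stddef.h>

/* A minimal sketch of splitting equals4 into functions that unify single
   variables with lists: A = [C|D] becomes head(A, &c) and tail(A, &d).
   The cons-cell representation is a simplifying assumption. */
struct cell { int value; struct cell *next; };

int head(const struct cell *a, int *c)
{
    if (a == NULL)
        return -3;          /* [] has no head: fail */
    *c = a->value;
    return -2;              /* exit true */
}

int tail(const struct cell *a, const struct cell **d)
{
    if (a == NULL)
        return -3;
    *d = a->next;
    return -2;
}
```

Each accessor takes and returns single variables only, so no list ever crosses a C function boundary.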
The technique negates the need for \"check arguments\" and \"update vars\". Without these, variables can be retrieved and manipulated without the bindings table, necessitating breaking equals4, etc., into functions that unify single variables with lists, etc. Post-antecedent checking assumes variables are C arguments, not in binding tables. However, we cannot use lists in C. So, post-antecedent checking is not necessary. Again, checking is done in child predicates and with \"update vars\" in the child predicate. The former method of defining variables as C arguments in SSI-C is preferable to using a bindings table because it is faster. +41. One might need in-code type statements if one is choosing from multiple modes. These statements are required for List Prolog predicates and type checking in general. However, type and mode statements aren't needed when defining variables as C arguments. Separately, if an if-then statement checks a single number or string value, then it is optimised to check it directly in C rather than call another function. Also, if the interpreter compares a variable value with a list, it requires equals4. +42. I inserted comments to label Prolog and List Prolog line numbers in C. These comments labelled the correspondence between Prolog lines and if-then antecedents in C. The labels used the same names as the source files. I could trace bugs to Prolog, C or the algorithm the interpreter ran. I could also more quickly optimise complex code. +43. I created business education assignment formats for Essay Helper with connections between sections. First, I discussed the business's strategies (inside or outside the organisation and connections). Second, I discussed the business's goals (duties, development and connections). Third, I gave evidence about business law case studies (connections between theoretical and practical cases). Finally, I could rapidly draft business reports with the appropriate structure. + +Grammarly premium: + +44.
When I had an algorithm, I could write a preliminary draft for an assignment. I scanned my computational philosophy repositories for philosophies' algorithms and their data, or I recreated their data to write courses. Students completed one assignment at a time. When they had completed this, they were given the model solution and went on to the next assignment. They were online, so they could do these in their own time. +45. I could help the students by explaining the algorithm to do the exercises. I prevented software bots from using the system. I asked a question that a human could answer. When the person entered the answer, they could use the website. While computers could be programmed to do the exercises, the system was aimed at humans. +46. I collected the possible errors and displayed them when necessary. I used type statements to discriminate List Prolog types and displayed errors if types were not correct. Alternatively, I didn't use type statements but manual type checkers that verified inputs had the right types. If a type wasn't correct, the interpreter produced an error. It then exited from each parent predicate until another statement caught the error. +47. I generated type statements with references to reused ones. I required type statements to help students understand how to write code. The students checked the data. They wrote type statements for it. This included recursion, lists and reused type statements. +48. Invisible submit tests ensured code was rigorous. I wrote visible verify and invisible submit cases for Prolog programs. Students tested their code against verify cases. Then, they tested their programs against submit cases, which determined their mark. These test cases covered the same material. +49. When the programs had passed 100% of the tests, the code could do certain good things. The Prolog project had a 100% pass mark. Students read each part of the specification. They submitted drafts for testing. When they had passed 100% of the tests, they submitted their projects.
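The verify/submit scheme in items 48-49 can be sketched as a small harness: students run the visible verify cases themselves, while hidden submit cases covering the same material determine the mark. The solution function and the cases below are illustrative only.

```c
#include <assert.h>

/* A minimal sketch of a verify/submit test harness.  "solution" stands in
   for the student's program; the cases are hypothetical. */
struct tcase { int input; int expected; };

int solution(int x) { return x * x; }

/* Run the solution against a case set; the mark is passed out of n. */
int run_cases(const struct tcase *cs, int n)
{
    int passed = 0;
    for (int k = 0; k < n; k++)
        if (solution(cs[k].input) == cs[k].expected)
            passed++;
    return passed;
}
```

Keeping the submit cases hidden but drawn from the same material, as the text specifies, stops students from hard-coding outputs while still rewarding a correct general solution.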
+50. I countered blocks to study, etc., with As. I used computational mind-reading to help students. Without mind-reading, the system could help with weaknesses, help people to do what it did and detect some problems outside study. With mind-reading, student sentiment, work habits and auspicious times could be detected. Mind-reading could detect student As that contributed to their mark. +51. I tested for the absence of skills and for subjects to teach. I used mind-reading to read options (skills) students needed. I gave them a spiritual test to test their knowledge. If they could successfully perform all tasks, they didn't receive additional help from the computer (screen). I organised skills needed for course parts in dependencies, and tested for them top down, with random checks of sub-skills or skills I didn't know about. +52. I wrote a converter between list processing in the Prolog predicate head and in the body. If a student chose \"other\" (an undefined skill), I helped with it. The new students appeared to choose new sub-skills to test. This was a required question. These were list processing in the Prolog predicate head. +53. I tested for and helped the students with knowledge in skills. An Education (as against Computer Science) A was breasoned out to help with skills. I found the dependencies of the program. I found the tags describing the skills for each predicate. These were in the format \"abc123\", or the major command name, followed by the number of minor changes or subclauses. +54. I identified main and overarching projects and research directions to aim for. I helped students with assignments and industry projects through skills. I found whether students had passed the skills necessary to do a piece of work. I found and helped with missing skills. I identified new projects that contained chains of needed skills. +55. It was easier to write science. I wrote an algorithm that checked Lucian Academy books were in the correct format, without double quotes.
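The format check in item 55, escaping double quotes with a backslash, might look like the following minimal C sketch (the interface is hypothetical, not the repository's actual algorithm):

```c
#include <assert.h>
#include <string.h>

/* A minimal sketch of the double-quote fix: copy in to out, inserting a
   backslash before each double quote, and report how many were escaped.
   out must be large enough for the expanded string. */
int escape_quotes(const char *in, char *out)
{
    int count = 0;
    while (*in) {
        if (*in == '"') {
            *out++ = '\\';
            count++;
        }
        *out++ = *in++;
    }
    *out = '\0';
    return count;
}
```

The returned count could feed the per-year report of books the text mentions.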
I found instances of double quotes. I inserted a backslash before them. I printed a report of the books by year. +56. The philosophy was the frontier for the education institution. I wrote an algorithm that uploaded changes to books. I wrote an addition to the book. I checked and uploaded it. There was an index of chapters with modification dates and times to add to. +57. The sale gave the buyer what they wanted. I breasoned out 5*50 As in business for converting a non-lead to another lead. A non-lead is a lead, just not another lead. They are desired, can facilitate success and are not necessarily urged on. They had to play a part in their sale, i.e. want it. +58. I gave As to the longevity herbs as well. I breasoned out 5*50, not 4*50 As in Daoist qigong to relax my muscles. I relaxed my neck. I relaxed my upper back. I relaxed my limbs. +59. I thought that \"It is me\". Bots said that \"It is me\". Sales participants said that \"It is me\". Business people said that \"It is me\". Each 80 philosophy breasoning argument was given 5*50 As. + +60. One should use the bindings table, not C arguments, as variables for simplicity because of the complex expansion needed for equals4 otherwise. Splitting equals4 would, for example, split A=[C|D] into assignments and comparisons head(A,C) and tail(A,D). This assumes C and D are instantiated; otherwise, bindings tables may be preferable. Similarly, it would split [A,B,C]=[d,e,f] into assignments and comparisons A=d, etc. If there are any sublists, they need to be processed separately. +61. However, using C arguments as variables is faster, and allows removing some variables to optimise the code. Comparisons operate when all variables are instantiated. Assignments operate when one variable is instantiated. Using C arguments as variables is faster because it places assignments and comparisons directly in the function, rather than processing them. Undefined variables and singletons can be omitted, to optimise the program. +62.
I covered the basis for research and saw possibility in resynthesis. I gave chapters 5*50 As (two ways of paraphrasing them). I noticed that perspectives and n-crossed philosophies led to infinite ideas. I could mind map any number of details. Experience could be applied to the philosophies, and vice versa. +63. I kept breasonings and immortality terms as science. I wrote about the thing-in-itself. I retained algorithms, and gave them more details. I retained spiritual terms, and encouraged critical analysis. I noticed people performing their own miracles. +64. I wrote about question-answering. I wrote about criticality. I wrote about bugs and rebuttals. I broached the research side of immortality. I encountered the question of writing-as-medicine. +65. I found immortality as medicine. I transformed musical stories and experiences into philosophy. I generated stories about algorithms, which extrapolated their history and philosophy. I explained the need, use and change in method due to gaining experience of the algorithms. I explored the ethics and major characteristics of the algorithms. +66. I summarised the topology, data and method of algorithms and found more science. I rewrote pedagogy to have new arguments, in addition to the pedagogical ways of thinking. I stayed close to noumena, which were clearer. There was more to philosophy in the future. It intertwined with science. +67. I wrote one instance of the words \"humanist pedagogy\" and \"meditation\". I captured noumenal theories. They were based on the times. I noted big ideas such as pedagogy and practicums such as industry, sales and bots. To cope with these, students were encouraged to read, program and write enough each week. +68. I wrote a debate verifier, where speakers always stayed on one side. I wrote primary then secondary texts. I wrote a converter to English literature. One speaker asked questions and made polemical comments. The other speaker was a positivist, the philosopher.
I set the philosophy in a literary setting, and gave characters backgrounds matching their comments. +69. I sympathised with the Ancient Greeks and pre-computer period breasoners who needed many breasonings. I negated that saying \"too\" after breasonings and instead of meditation utterances worked. Even with a puja, it seemed risky. Politicians and professors needed them and were given breasonings by administrators. Copies of breasonings and utterances could be diverted to nearby people. +70. I wrote 5*50 As for infinite knowledge. This framework established a research base for writing. I wrote methodically on my topics. I added details. I delegated work to have 5*50 As, and noticed I was given the prize of life on Earth. +71. I maintained a positive mind-set at all points. In meditation, there are 5*50 As for calling the last bug positive, then solved. I identified the positive shapes and colours of the bug. For example, I wrote a story about explaining a feature by correcting repetitive tasks. Then I solved the bug. +72. I prepared for students with at least 250 algorithms. I changed the conversation around when another person talked with me. I furthered the end-point of the conversation. I thought of three levels of algorithms, with one item per level. I aimed to write ten sentences with ten algorithms each, with three detail algorithms each (which could be new data for existing algorithms). +73. The students bypassed transcribing their teachers and wrote their own breasonings, using their systems and going over the limit. I made up breasonings for others. I kept a copy. I made up algorithms for others and kept a copy. I simplified tutorials to connecting sentences in the light of a certain topic. +74. I wrote 5*50 As for going through the information. I wrote a framework for traversing information, for example, sides of contentions, and algorithms for different numbers of points in a criterion when making comparisons while writing algorithms.
I simplified algorithms to skill sets, then found alternative methods top-down and their skill sets. I drafted possible alternatives along a line with different skill-sets. I hypothesised and solved my and others' unanswered questions. +75. Other dimensions are other ideas (the ones in this dimension are philosophical). I wrote 5*50 As for A (agreeing), B to B (finding blocks) and B (objecting to blocks) to the self. I agreed with writing my thoughts and randomly reordering and giving meanings to them. I agreed with the latter as a way of generating thoughts, which cover an idea enough. I wrote an algorithm which detected thoughts and wrote enough details. + +76. I found links from people to the other as well as the self. Meditation connected these three groups. I, represented by the input, linked to the people, represented by the algorithm, to the other, represented by the output. Once I knew the algorithm, I transcended it and summarised it to data. The algorithm was replaced by input and output. +77. I aimed for positivity at the end, through an essay. I wrote rules to write the algorithm that I replaced with input and output, which made it elegant and functional (for example, essay helper formats in text files). Finding the algorithm that was replaced by input and output was done with another algorithm. Another algorithm could check for uniform input and output. I wrote research on automating student work, where students did the most intelligent part. +78. I was allowed to sell arem when I had an A for Jyotish and it. I completed 5*50 As for the arem utterance. This stood me in good stead if and when the meditation company went away. I could teach others meditation. I could start my own meditation company. +79. I converted SSI to C. I began by writing List Prolog predicates for ISO Prolog commands. Then I added common commands. I concluded by adding custom commands. For example, I added types and inductive commands. +80. I wrote string concat in terms of append.
I wrote append. I converted strings to lists of characters. Then I entered this list into append. The advantage of this was writing string concat as a simple predicate. + +"] + diff --git a/Lucian-Academy/Books/IMMORTALITY/Immortality 15.txt b/Lucian-Academy/Books/IMMORTALITY/Immortality 15.txt new file mode 100644 index 0000000000000000000000000000000000000000..08fbdfb835b4ac3afee21e66a3b0e99bba0ee55f --- /dev/null +++ b/Lucian-Academy/Books/IMMORTALITY/Immortality 15.txt @@ -0,0 +1,35 @@ +["Green, L 2022, Immortality 15, Lucian Academy Press, Melbourne.","Green, L 2022",1,"Immortality 15 + +Grammarly Premium: + +1. In the interpreter, append, etc., had intermediate preds to turn trace off and on. I could turn this feature off. If turned off, I could examine the workings of these predicates. I could skip their traces if necessary. I could trace usually hidden features such as maplist and foldl. +2. I could assert predicates with bodies. I also controlled trace (in-predicate display and flag) in terms of a predicate flag. This flag was contained by an asserted rule. It was depth_trace(Flag) where Flag=on or off. Other flags were also rules. +3. I sped up some predicates in C by converting Prolog to C. Earlier, I wrote the entire interpreter in terms of List Prolog predicates. This included interpret body, and was difficult. There were many parts of the interpreter that could be written as List Prolog predicates. I improved on the bad code, and rewrote the good code. +4. I wrote a trace summary algorithm. There were no commands in C, only List Prolog command predicates. There were only parts of commands in C, such as type converters and verifiers. Writing in List Prolog encouraged functional and dependency-thinking. List Prolog here refers to Prolog, which is converted to List Prolog and run in SSI in C. +5. I left converted strings as trees. I converted the algorithm, which was a string, into a tree. 
As mentioned earlier, I didn't only leave it in this form before running, but if it was the SSI or other algorithm, I converted its findalls, etc., and decomposed lists into comparisons or assignments in the algorithm. In this, the interpreter faded as it was part of the algorithm. It was a way of teaching about interpreters, showing how to run code most efficiently. The interpreter didn't run the algorithm; rather, the C interpreter did. +6. Choice points were computed automatically by joining clauses together, using if-then in C. In this way, Prolog was monadic, meaning it was like a sequence of if-then statements. I sped up Prolog by converting it to C and deleting unused code; code use was checked while running, and code was only deleted if no input (random, user input or APIs) or the given input didn't cause it to run. An archive of the code was reused if other given input needed certain code. The repeat command was replaced with predicates, or a command. +7. I considered writing a C interpreter which compiled into assembly language. Separately, I tested List Prolog Interpreter and State Saving Interpreter after deleting unused code. I turned off and highlighted singletons for deletion. All underscore variables were considered. I considered changing e.g. _A to _ if it saved time and was correct. +8. When converting from Prolog to C if-then statements, instead of initialising output variables of a function to a value, initialise them to undefined or ignore them when the output is set to _. I kept a copy of the Prolog algorithm, which contained the simple logic, which could be converted to C for better performance. There was a compile to C command in Prolog. There was a performance improvement algorithm, which allowed editing out unused features. It might edit the API or fix bugs, bottlenecks and security holes. +9. I avoided using e.g. _A in a version because singletons should be called _ and are costly.
I produced a warning when a singleton, probably an output, wasn't _. I identified the singleton as a single instance of a variable in a clause. The code was understood to work. Removing or renaming the singleton to be undefined would speed up the code. +10. All data could be represented as numbers with types. In C, I initialised all variables, and initialised variables to values if necessary. I questioned setting arrays to [] (like empty lists). I also invented a way of composing lists within lists using indices. I wrote a way of embedding lists within lists using down and up characters, to save time going through multiple data structures. +11. I didn't need types. While first-year students preferred them to help learn programming, type checking could be done without explicit type checkers. Whether C lists were made with \"->\" or single arrays, types were needed, except in a list of numbers. For longer data structures, I recommended using tree arrays with some of these properties. Prolog could model these data structures and convert them to C. + +12. I kept two tree arrays (one for numbers and lists and the other for atoms and strings) for lists to be more easily converted between Prolog and C. In SSI, I converted variable bindings tables, globals, etc., to and stored them as tree arrays. I decided in favour of bindings tables rather than C argument variables for educational clarity. I converted the list values, etc., to tree arrays as soon as possible (I could table and reuse commonly converted items and collect garbage, or reference commonly used items once). I made computer science general (like philosophy) in my departments. +13. * I wrote reverse in List Prolog. There was one set of predicates for forwards and backwards directions. It could be used to change foldl to foldr. I could get the last item in a list using append(_,[A],[a,b,c]). +14. * foldr, intersection, delete, sort, subtract, atomic list concat, permutations in lp +15. * +16. * +17. * +18. * +19. 
* +20. * +21. * +22. * +23. * +24. * +25. * +26. * +27. * + +"] + diff --git a/Lucian-Academy/Books/IMMORTALITY/Immortality 16.txt b/Lucian-Academy/Books/IMMORTALITY/Immortality 16.txt new file mode 100644 index 0000000000000000000000000000000000000000..682281938a5125ce6f5f1edb697a343dcbd282bf --- /dev/null +++ b/Lucian-Academy/Books/IMMORTALITY/Immortality 16.txt @@ -0,0 +1,109 @@ +["Green, L 2024, Immortality 16, Lucian Academy Press, Melbourne.","Green, L 2024",1,"Immortality 16 - I could help more people become immortal. + +1. I articulated to an \"easy\" level, i.e. wrote simple three-step algorithms for each sentence. +I wrote letters to teach and establish immortality centres. I wrote notes outlining my proposal to open schools in countries. +I aimed to increase the subjects with other subjects I taught. This increase was possible by relating to, for example, the text to breasonings algorithm. The difficulty level should be appropriate for the grade (the primary school had different texts from the university). +2. I found the link between spiritual or natural bots, other things and immortality. So I opened schools to teach people to become immortal. +First, the people read and wrote enough to feel developed. Next, they must be well-versed in the necessary skills. Then, they must outline their plans and financial plans for immortality. +3. Immortality helped people to help people with immortality, which had many benefits, such as creating peace. For example, I applied for grants to travel to open educational institutions to teach immortality. +I researched the best places to open institutions. I was aware of the business skill of professional development. Education courses helped achieve the correct standard of work in studies. +4. I was aware the local and global economy affected the property market and helped the tenants to earn money. So I bought property in countries to help fund the centres and educational institutions. 
+I asked a rich person for property funds to buy enough property to fund the school, which had advantages, and enough to help society. First, I calculated the level of interest and acceptance to start a particular institution in a city. Then, I figured out the number of properties needed to support this institution. +5. Lucian's texts on immortality influenced many to become immortal. Immediatism (sic) in the availability of immortality meant the sutra would be called necessary. Also, courses would become required in studying the path to immortality. Economists would calculate the economic effects of immortals and aim for them (people would disappear in simulation \"shells\", and two different sets of occupants would occupy houses at the same time). +Immortality was activated by time travelling to a time (5689) (requiring meditation before and after trips there and back) and switching it on, both requiring 250 breasonings, before or after which the bulk of the necessary philosophy should be breasoned out (the reading or writing of which was not immediate). +6. Older adults in young bodies needed to explain they were old, skilled and mature. Immortality was a medicine for potentially terminal disease sufferers, so it could become available to people who needed it. +However, a court case once concluded that meditation might not be effective in pathological disease sufferers. +A person living with Parkinson's disease given immortality medicine appeared to die, but some believed he didn't experience death and lived on with his loved ones around him. The immortality switch could be applied from 18 years of age, preventing disease and ageing. +7. Viruses attack the immune system, potentially killing the person. Immortality was a form of medicine that could return the body to a state before infection with a virus, preventing its effects.
For example, meditation sometimes reset to a copy of the body before a medical problem to avoid it, but immortality allowed this to be permanent. +One person reported meditation helped maintain their youthful looks. Immortality could let them stay permanently young. +8. Being in one's physical and mental prime meant being youthful, mentally fast and able to achieve better results. Immortality's immediate availability to people in their physical and psychological prime meant that teachers needed to teach them prerequisite skills (usually left to be randomly encountered). These included pedagogy. Also, these included anti-headache medicine. Also, they included time travel, meditation and other forms of spiritual therapy. +9. I made a conjecture about the effects of an immortal taking Chinese herbs for longevity, possibly such as having more energy. Given its label as an alternative form of medicine, immortality would need to be tested, given different people's conditions. Therapists should only administer it to those who are conscious. +And it should only be administered to those who could decide they wanted it and wanted to maintain it. Also, it should only be issued to those who wish to avoid death. +10. My algorithms and philosophies didn't prevent others from writing, but they inspired them. For example, some proponents of immortality claimed that the experience of mortals after death was similar to immortality. Still, the people didn't have a physical form visible to other mortals, so teaching the skill of immortality was identical but avoided the inconvenience of injury and death. +It was unclear whether the mortal afterlife ended and led to reincarnation or something else, so some assumed they were immortals in a form. However, immortality avoided the pitfalls of medical problems and accidents (through controlling a computer simulation), particularly an aggressive chain of deaths. 
Attaining immortality raised the question of the necessity of breasoning out enough breasonings before conceiving children to give them a high quality of life and ideas during immortality. +11. Some decided to work in Education. In combination with University-learnt meditation, immortality was a form of heaven on Earth. Meditation provided the high quality of life that people desired. Immortality provided the solution to body degradation and meant long-term projects and guidance of others were possible. Immortals could choose to remain the same age in their own time without dying. +12. I experienced meditation with a young company, which I could keep up with, where the company should perhaps stay or reset to like this. Long-term studies on these people were possible. Immortality was available through meditation and could be learned anonymously with the text to breasonings algorithm. However, engaging with accreditation and others helped meditation to remain part of the economy. Immortals' knowledge was enough to help, for example, through centres. +13. A primary school student explained that he counted each function as an algorithm. A meditation-based secondary school encouraged students to do a manageable amount of work with help from others. +The students didn't need to do all the work because they hadn't graduated from university yet. However, as soon as they ran text to breasonings or had learned meditation, they needed to do their job. At most, the work involved writing algorithms and thinking of breasonings. +14. Philosophy was simple and intelligent. Immortals could become professors. Once they had finished the immortality texts, immortals could focus on University areas of study. But, first, they could study Education and teach. I had the invitation to teach Business at a prestigious university. +15. I stayed in Australia until it sank. Immortals were capable of more considered thoughts. They had the experience of many ideas. 
Using this experience, they could critically analyse and logically deduce conclusions more effectively. Therefore, information-based products and low-risk departments and careers were preferable. +16. I wrote about epistemology (which contained black boxes and observable outcomes) to produce works and teach at the university. Immortals could write thoughts down to help cope with the amount of them and record them. The immortal had many ideas. She wrote and ordered them. She grouped groups, etc., and wrote with at least ten connections (not just parts) per sentence. I ranked the details and redrafted the sentences after they were connected. + +17. I examined spiritualism with science. I equated spiritualism with science. I found the connection between parts of holistic medicine, i.e. pedagogy, meditation, anti-headache medicine, zinc and vitamin B to combat viruses, text to breasonings algorithm, meditation algorithm, medicine algorithm, creative writing, computational thoughts, philosophical thoughts, time travel, mind reading, immortality, philosophy algorithms, fine arts algorithm, music algorithm, essay helper algorithm, computer interpreter, computer compiler in Prolog, computer compiler in C, business, inductive algorithm finder, bots, sales, replicator, universal translator, simulation, person replicator and animal mind reading. I connected each of these others to immortality. Especially within philosophies and algorithms, there were more connections. +18. I wrote about the actual thing and logical specifications. I connected immortality to pedagogy. I did this by breasoning out comments. Then, I joined them together. I also avoided stricture. +19. Meditation enlivens the body and spirit. I connected immortality to meditation. I thought. +I wrote. I moved. +20. The overseers gave Bs to headache thoughts and took care of my thoughts. I connected immortality to anti-headache medicine. I could time travel without a headache with anti-headache medicine. 
I could +use text to breasonings without a headache with anti-headache medicine. Text to breasonings automatically met pedagogical requirements of time travel in immortality. +21. A daily vitamin supplement with 5 As to tone and give energy helped with work each day. +I connected immortality to zinc and vitamin B to combat viruses. I had a virus-less system. I could have vaccines. I didn't need to worry about mixtures when using the body replacement technology. +22. The text to breasonings algorithm took care, in a flash, of what would take hours. I did this by connecting immortality to the text to breasonings algorithm. First, I found the top points using Vedic mind reading. Then, I used the grammar to logic algorithm to work out extra details about texts necessary for high distinctions in post-graduate courses. Finally, I breasoned out these texts with the text to breasonings algorithm. +23. After teaching the people meditation, they indicated meditation and immortality daily. I connected immortality to the meditation algorithm. I breasoned out 250 breasonings for each instance of the meditation utterance. +Text to breasonings required meditation. +Using 250 breasonings improved the quality of meditation and helped support other meditators. +24. I also breasoned out immortality medicine each day. I connected immortality to the medicine algorithm. The algorithm activated all of the forms of medicine each day with 250 breasonings per person per day. I would usually activate a form of medicine by itself. However, the grammar logic algorithm could work on specific 250 breasonings for chosen forms of medicine. +25. I spent many hours writing each day. I connected immortality to creative writing. I wrote arguments, for example, for headache prevention in post-graduate work. +I could generate arguments for anything a customer wanted with the grammar logic algorithm. I moderated these requests. +26. 
I could focus on computer science more clearly and earn high distinctions by writing 80-250 algorithms. I connected immortality to computational thoughts. There were automated versions of pedagogy (text to breasonings and essay helper). There was a computerised version of creative writing (grammar logic). After writing pedagogy on computer science, I could focus on algorithms in pedagogy. +27. I filed algorithms required for immortality into different philosophical categories and added to these. I did this by connecting immortality to philosophical thoughts. First, I wrote a pedagogy text to activate my writing skills and write other texts. Second, I wrote other texts, such as meditation and medicine, based on pedagogy. As a result, I could think critically and clearly and examine thoughts during my life. +28. Time travel gave me the freedom to be immortal at different times. I connected immortality to time travel. I time-travelled daily to maintain myself in immortality (the future) and the present. There was a community of immortals in the present. Time travel required mind reading to create a number one music hit, to immortalise oneself. +29. I was the primary person to customers with person replication and mind reading in immortality. I connected immortality to mind reading. I could mind-read a number one hit in each department. Creating these number ones was an aim of immortality. I could become a professor in each department in immortality, using mind reading (where I achieved the same result with having 50 meditators with meditation algorithms and using a mind-reading text and the random function) and essay helper to automate work (and workers with replication). +30. Living like a robot as an immortal in the simulation prevented injury and disease. I connected immortality to immortality. I could change to other, safer universes. Also, I could help others to become immortal and switch to other, safer universes. 
I appreciated that immortality was possible with robots, and I agreed with the robot government and living like robots. +31. In theory, many algorithms can come from an algorithm in immortality. I connected immortality to philosophy algorithms. I wrote algorithms for my philosophy to paint, minimise algorithms, edit files online, etc. I chose these to contribute to larger projects, such as an operating system and an optimising compiler and to save time by writing when walking. These were part of my aim in immortality, leading to other projects. +32. I could create graphics tutorials and graphics in programming languages during my career. +I connected immortality to the fine arts algorithm. I wrote an algorithm that generated a font that \"joined the dots\", leading to software that pleases the eye in immortality. I created generative art, possibly with a mind-reader, that was unique for musical albums (for videos and cover art). I also created graphics for games (i.e. platform or 3D). + +33. I connected immortality to the music algorithm. I wrote the overarching theme of the time. I planned forward. I wrote music for my planned accomplishments. I used it if it was better at the time. +34. I examined grammar and logic in terms of ontologies. I connected immortality to the essay helper algorithm. I mind-read people during my career. I mind-read animals. I collected details with 50 meditators and the grammar-logic algorithm. +35. I searched for and replaced specific asserted statements. I connected immortality to the computer interpreter. I wrote the computer interpreter. I ran it with itself and the compiler in Prolog. I detected that its if-then statements worked in my compiler. +36. I continued to develop simulated intelligence, such as moderated business development. I connected immortality to the computer compiler in Prolog. I kept adding commands to the compiler, ready to work in C. 
I specialised in document collation, algorithm generation and document processing. I could prepare for neural networks in C. +37. I connected immortality to a computer compiler in C. I could license the software for commercial use. I examined brute force and optimisations. I ran algorithms in C. These C programs required a Prolog to C converter, with backtracking and variable configuration to points in the C code, no lists (possibly with numbers only, just one-dimensional lists, with memory cleaning on function exit) and memory optimisation and compression. +38. There was a 50 A monthly subscription to business. I connected immortality to business. I designed meditation documents for business. I wrote on sales, bots and business. I generated a raw mind map. I expanded it by hand. +39. I searched for old predicates that could do a new task or do close to a new job. Finally, I connected immortality to the inductive algorithm finder. I finished the predicate. I corrected the predicate. I finished the algorithm. +40. There was a traversal metaphor of people the people met for ideas for essays, which they could use. Finally, I connected immortality to bots. I started being the primary person to people. I gave them A to be the best in each assignment in each field and B to block this. I repeated this. +41. I pretended to monetise the algorithm, i.e. serve the tea in a certain way. I connected immortality to sales. I found the deal. I earned a PhD in sales. I automated finding simple algorithms, matching the student's work. +42. I creatively replicated (modified) my algorithms to write new ones (the program suggested candidate algorithms after working them out in the background while I worked). +I connected immortality to the replicator. I found the correct configuration of entity existences. It was easier in immortality (the simulation). I could quickly breason out ideas for people. +43. I devised and detected quirks in the universal translator. 
I connected immortality to the universal translator. One person talked to me in English. Another person was like another person. I could understand animals. +44. The text to breasonings and grammar logic algorithms were useful simulated intelligence programs and found an interpreter with a neural network. I connected immortality to the simulation. +I was aware of objects and settings changing in the simulation. Finally, I met who I wanted to meet. It was a priority to join the simulation to write better algorithms. +45. There were 4*4*50 As during the child's life. I connected immortality to the person replicator. I aimed to interest the child in products before conception. I met the parents. If they agreed, I helped the child to learn to meditate. +46. I noticed the ducks were immortal. I connected immortality to animal mind reading. I prevented the animals from going towards danger. They remained beautiful. I pretended to read their thoughts about qualifications. +47. I wrote the algorithm to find the data to types algorithm based on the expand or simplify types algorithms. There was more, more, more. There were more sales subscriptions in more prominent companies. There were more breasonings for professors. I worked out the simplest essence of interpreters and helped better. +48. I articulated from any algorithm to an algorithmic aim. I could do this with any two algorithms I had written. For example, I wrote data to types starting with a hash table. It found the items in the state machine more quickly. It reused items that appeared more than once. +49. I invited others. I defined myself and won awards. I designed possible award winners. I created the prizes. I helped the people try to win the awards. +50. I worked out simplicity was following grammar and explaining. I was part of the people. The people trusted me. I found the people to join. I built trust and business acumen by helping them. +51. The students could write on and expand the algorithms. 
I wrote enough algorithms down. They were details for the central algorithm. Next, I worked on the main algorithm. I finished and released an algorithm that showed the business potential of State Saving Interpreter Web Service. +52. I explained the simple method to become immortal, so people could do it themselves and the more extended process with an algorithm to better support them. The simple way involved meditating, breasoning out 250 breasonings to time travel to 5689 and entering the simulation. The more extended method included texts to breason out to support one when immortal, a daily regimen with meditation and pedagogy algorithms explaining how to switch on the spiritual medicine needed to be immortal. I added the anti-headache treatment to cope with the large number of breasonings breasoned out by the text to breasonings algorithm to maintain oneself when immortal. I preferred to have As for requirements and breason out the 250 breasonings when time travelling. +53. I rested to prepare for physical and mental exercise. I checked each part of my body each day, relaxed and exercised. I tapped my nerves to energise them. When I got tired during day-long workshops, I performed qi-gong and jumped in the air. I looked people in the eye and thanked them. +54. I simplified a three-dimensional polyhedron using a minimisation algorithm. I moved from strength to strength. I developed my skills, logically deducing conclusions from them. Someone could use meditation to achieve any good aim. I recorded my algorithm types politically and thought of new reasons as I travelled. +55. I wrote functionally and used \"sentence-like\" calls (modularised predicates) whenever possible to speed development. I redrafted the interpreter, noting whether the need for complex workarounds disappeared with a more straightforward design. I sided with predicates rather than findall. I incorporated global variables as passed variables. 
I simplified codes to ASCII codes to speed sending over long distances in space. + +56. I avoided strong types because they were slow and unpredictable. Instead, I listed good programming practices for spacecraft. I redrafted the code, keeping it simple. I reused code when necessary. Finally, I enabled programming the computer. +57. Conversely, strong types could help when generating code. I wrote the strong types. I bug-tested the code. I found all possible type statements without explicit types. I could generate code using the specifications and the types from these. +58. I matched inputs and outputs of bare code mind read during the day and wrote algorithms, turning life into art. With all possible type statements, I could correct and autocomplete code and use type statements to generate a type of error to help the user correct data type errors. I notified the user about statements with types incompatible with the predicate and suggested or made corrections. I autocompleted predicates given the type statements. Given a type error in input, API input or file input, the program would return an error message and allow data re-entry, possibly with the suggested input. +59. I developed background algorithm generators to save time and provide me with business perks. I suggested a data structure error when brackets were wrongly present or missing. I identified whether the problem was the data, the program or the interpreter. I determined issues in data by detecting consistency in specifications. I noticed API changes (systematically in specifications) made necessary by new features and monitored whether this feature was required, making the data different. +60. I suggested or automatically modified code using previous predicates. I simplified the complex code to the barest structure, listed inserts in a text file and checked and inserted the call to the code where necessary. Particular inserts refer to certain parts of the code. 
The program contained these inserts in passed, not global, variables. I could expand the code, negating the need for the passed inserts. +61. I checked that the changes wouldn't cause problems. For example, I checked that functional calls wouldn't have specification problems. I checked that code that called the changed code would function correctly. In these cases, I modified the called or calling code or differentiated the code if necessary. Alternatively, I watched for and assimilated code if possible. +62. I installed a version checker that suggested updating obsolete code with the latest version. I scanned for missing dependencies. I checked for changed and possibly affected code compared with the needed version and installed these files as necessary. I kept data in a separate folder and converted file formats to the latest version, warning of incompatibility with previous versions. +63. I checked that data changes wouldn't cause problems. For example, I checked if the program could handle data additions or syntax changes. I found the listed references to the data file and made the necessary modifications. I found test data like that which the user could enter. I documented this or updated the documentation to help the user enter data in the correct format. +64. I checked that interpreter changes wouldn't cause problems. I checked that the interpreter could run the algorithms. In another case, I checked whether the interpreter or algorithm was wrong. Alternatively, I checked whether the programmer had saved files (and the code was up to date), had updated contingent repositories and that she had uploaded the sandbox. Finally, I determined the data to make public and the data to keep private. +65. I invented products, breasonings for people and actions. I recommended breasonings. It was simple to think of the x, y and z dimensions of objects for each word. One could breason them out, pray, say \"too,\" or use the text to breasonings software. 
Correctly configuring a breasoning system could achieve one's business aims. +66. Details needed to be expanded for theses, helping the grammar. I recommended details. With enough details, one could pass a course at any level. Grammar Logic automated writing sparse details. However, these were supplements to hand-thought of breasonings, and using only the unique words meant the user needed to run it repeatedly for enough words. +67. PhDs and businesses may need a method to breason out the many breasonings required. I recommended the text to breasonings algorithm. It quickly breasoned out the essay, algorithm and breasonings, including their details. I recommended new users reset the breasoning dictionaries to establish themselves as breasoners better and familiarise themselves with the dictionary format. The provider gave an emergency breasoning dictionary for medical purposes. +68. I recommended periodically resetting the text to breasonings dictionary and checking that one's algorithms are entry-level. New breasoners should reset the breasoning dictionary to design their breasonings. +Generation 2 of the text to breasonings algorithm requires an object name for the word to be entered, followed by its x, y and z dimensions. Different users may want to enter different objects that are for other words. When breasoned out, this data will be in the person's memory and indicated by the computer program. +69. I celebrated my 2D art phase. One should reset the text to breasonings dictionary to ensure breasonings align with the latest objects. The objects may change with the latest culture, science and technology. For example, one's aesthetic values may change during one's career. This skill is similar to checking the grammar of a sentence, depending on the specific words it mentions. +70. I also rewrote algorithms at different times depending on my tastes and experience. 
One should reset the breasonings dictionary to ensure that one is familiar with the breasonings it breasons out. Even if one wrote the whole breasonings dictionary, restarting can help revise words' breasonings. One may even change the text to breasoning algorithm, determining a word's object based on its part of speech (otherwise, the thing may be symbolic). An algorithm may find the keyword and breasoning for the whole sentence. +71. It is better to make all code entry-level. One can check that one's algorithms are entry-level by checking whether these algorithms contain accessible commands and methods and can easily be bug-checked. It is easy if the command performs a simple function or the user can write it to complete a simple operation. If the predicate contains more than one aim, the user can usually functionally decompose it into two predicates, each of which will be simpler. It is simple if it takes a short time and little effort to bug check. +72. I wrote thought-provoking ideas for computer science. I mind-mapped the idea. By the 80th breasoning, original ideas had appeared. I connected the idea to a relevant seen-as version. I determined the mapping of seen-as reasons to ideas. +73. I used a new method to detect unnecessary choice points and predicates in the interpreter. Finally, I wrote original ideas for computer science. Most of the time, I wrote thoughts as short and sweet algorithms. Then, I developed a new and more straightforward programming language based on input and output and induction, with a different number of arguments, several commands and a particular logical structure of commands. This language reduced programs to the most straightforward commands (i.e. if it was an interpreter, then the interpreter modified itself), found the least complex algorithms, and warned against overly complicated code. +74. I wrote about new frontiers of research in computer science. Mind reading was unique and could be used to help mind-map ideas. 
It could be combined with the latest frontiers in research to find viable research topics. I connected twenty algorithms related to research into a research topic. The PhD found out, wrote and detailed the algorithms, usually part of a more powerful algorithm. +75. I enjoyed the challenge of types. The student worked on smaller algorithms during the semester for more algorithms and fewer errors. The lecturer could join in. He could support us by writing algorithms like the ones we wrote. The list of algorithms came from philosophy. +76. I worked on advanced technologies in terms of entirely self-written, reductivist (reducing what the neural network would do into a form that I could do myself), materialisable (sic) work (cognitively rigorous), which explored more of an idea as a human. During the holidays, the student worked on more extensive algorithms to increase his understanding and finish the main projects. The philosophy was about the main tasks. After writing the philosophy and some algorithms, the student felt confident enough to write the assignment. The main project was the book's final part and could be maintained and integrated with future projects. +77. I wrote stories about applying spiritual holism. I compared spiritual holism with social movements. I mind-read those who people thought had died but were in other dimensions, including pets. I still remembered the missing duckling and wondered how to visit them. I applied mind reading and time travel to go out and meet people and teach meditation in their language with the help of the universal translator. +78. I connected data to types to mind reading. I mind-read the data. I asked for more data, checking it was compatible with the previous data. Then, I computed the types, which helped check the data's correctness and applicability to the program. I also mind-read whether these were problems, their solutions and whether the data would fit the interpreter's types system and alternatives. +79. 
I could flawlessly check data types in algorithms during my career. Separately, I could connect with and help those around me become immortal. With their permission, I mind-read whether they felt like learning new spiritual skills and thought of them doing them. It was essential to keep going and finish the skills, then complete the whole course. I taught the skills with business and education skills and helped the growth of those experiencing the enlightenment. +80. I breasoned out 5*50 As to entice customers during my career. I used 5*50 As in my meditation organisation to recommend customers buy again or to find a new customer. I regularly checked if the customer in the database might buy again. Or I checked around the customer to see if people were interested in becoming a customer. Also, I could increase 80 breasonings to 250 breasonings in the number of As. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/IMMORTALITY/Immortality 17.txt b/Lucian-Academy/Books/IMMORTALITY/Immortality 17.txt new file mode 100644 index 0000000000000000000000000000000000000000..d353aec21f34772ebba0bb5204b44f967c5b9cde --- /dev/null +++ b/Lucian-Academy/Books/IMMORTALITY/Immortality 17.txt @@ -0,0 +1,84 @@ +["Green, L 2024, Immortality 17, Lucian Academy Press, Melbourne.","Green, L 2024",1,"Immortality 17 - I could help more people become immortal. + +1. I noticed multiplicity and higher similar institutions. Like finance, people found out about immortality in time. They learned meditation and other courses in time to become immortal. Instead of on a pinhead, my life rested in a cup. I could build an institution, also with the help of Chinese herbs for longevity for energy and which helped me learn about longevity. +2. I first completed the meta-induction for an algorithm, then let neuronetworks join and modify the parts. I found ways of completing algorithms more quickly. 
Then, I increased the accuracy of neuronetworks by creating \"hands\" and \"bodies\", not blocks. I stayed with specific algorithm writers. Finally, I added a feature, a new algorithm writer. +3. I knew exactly what I wanted the neuronetwork to do, and it did it. It felt like the machine of neuronetworks was missing in our times. The human creations were black boxes, unable to be found out by machines. The machines could only examine existing knowledge. So it is vital to simulate wonder for the machines to produce science. +4. I loved completing the algorithms and extending my knowledge. I aimed to increase small thoughts about immortality. Continuing from paragraph 3, I asked, were the scientists humans? The robots used humans' ideas to find new conclusions. By the end, the robots attributed humans with machine science, and the simulation was like an intelligent machine. +5. I used time travel, which inspired my philosophy. Like the bible, I connected each chapter to each other chapter. These connections were how the brain worked. The religious people were the most developed because of \"n\". Separately, I decided to visit the other country. +6. I taught philosophy and computer science. The people were very interested in immortality. The robot custodians highly desired it for the people. The immortals could keep going, and the universe continued. Because of the simulation, I could time travel and live in the universe, which continued anyway. +7. The writer child could write originally. The pregnant woman time travelled or replaced her body, and her unborn child became a bot. It needed increasing still, like a human child. Finally, the child was born safely with enough As. The budding or human replication expert gave enough specific As so the pregnancy went as smoothly as clockwork. +8. The writing was relevant and had a certain number of connections. I determined what was necessary for a planned child to become immortal. 
They needed originality, a number and spiritual and other topics. I turned over enough keywords. I kept algorithm specifications, which I connected to the topic. +9. I also checked that I had linked the paragraphs together. To complete copywriting, I used meta-induction, neuronetworks and a connections neuronetwork. Meta-induction determined algorithms from a priori connections in data, not a posteriori ones. Neuronetworks found algorithms with commands that were too difficult for meta-induction to find. I paraphrased text on the writing side, then found connections with a connections neuronetwork. +10. I remembered that any complex code could and should be simplified. I found the meta-induction algorithm, also known as Program Finder with Types. It found pattern-matching algorithms with any reasonable extensions able to be added. There was a user interface program finder to create web scripting algorithms. I inserted the formula for the algorithm to use, and it searched with this to find any needed output, replacing part of the algorithm with this formula. +11. I allowed rendered code to either contain or not contain extra features based on compilation or runtime flags. I found what parts of the algorithm I could find with the meta-induction algorithm and gave the black box inputs and outputs in a text file to the neuronetwork. It quickly found combinations of past formulas. I could find these in the hierarchy of the logical structure of the algorithm or in another order. I started with a simple model algorithm and added more features, translation, garbage collection and optimisation, if necessary. +12. I broke neurofeatures into groups of groups and built them up from simple models. First, I wrote the exact logic for the neuronetwork algorithm. Then, I broke the specifications into steps. I started with a simple model using meta-induction. I refined work as I went, making optimisation easier at the end. +13. 
I wrote a \"head-iser\" to push list processing into the predicate head and a DFA minimiser to optimise the predicate. I did this by breaking features into groups of fewer variables. For example, I found equals4 in a few lines. This code translated well to C. I counted the reduced set of instructions. +14. I researched member check and using sort for unusual purposes. I wrote a \"head-iser\" to push list processing into the predicate head. I moved list processing into predicate calls. If it resolved better, I moved list processing into the head. I changed if-then clauses into different clauses of the predicate. +15. I separated commands into their different modes and implemented them in C. I researched member check. Member check is member with a cut after it. I wrote another member predicate, which, for example, checked that 1 was a member of [1,2] (rather than giving the members of [1,2]), which had different modes from this second command, and was used for its purpose in the combination algorithm writer. The program wouldn't work if data didn't match these commands' modes in CAW. +16. I wrote another predicate to remove duplicates without sorting. I used sort for unusual purposes. Sort sorted from smallest to largest and removed duplicates. For example, sort([[2],[1,1]],A) returned A = [[1, 1], [2]], and sort([[1],[1,1]],A) returned A = [[1], [1, 1]]. I used sort in decision tree creation, in particular for unique item collation. +17. I used CAW or a neuronetwork, PFT and optimisation to optimise the predicate. I wrote the DFA minimiser to optimise the predicate. I removed duplicate clauses. I avoided if-then constructs so they didn't appear in the minimised predicate. I used recursion and other predicates, including the predicate itself, when necessary. +18. I wrote the exact logic for the connections neuronetwork algorithm. Instead, I tried the algorithm without the neuronetwork. First, I identified whether there was enough content in the sentence or whether I needed more context.
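The sort behaviour in paragraph 16 can be mirrored outside Prolog. Below is a minimal Python analogue (an illustrative sketch, not the original code; the function names are hypothetical) contrasting sort/2-style sorting with duplicate removal against an order-preserving remove-duplicates-without-sorting predicate:

```python
# Illustrative Python analogue (not the original Prolog code; names are
# hypothetical) of sorting with duplicate removal, as in Prolog's sort/2,
# versus removing duplicates while keeping the original order.

def sort_dedup(items):
    # Sort from smallest to largest and remove duplicates, like sort/2.
    return sorted(set(items))

def dedup_keep_order(items):
    # Remove duplicates without sorting, keeping first occurrences.
    seen = set()
    out = []
    for x in items:
        if x not in seen:
            seen.add(x)
            out.append(x)
    return out

# Tuples stand in for the Prolog lists in the sort([[2],[1,1]],A) example,
# since Python lists are not hashable.
print(sort_dedup([(2,), (1, 1)]))      # [(1, 1), (2,)]
print(dedup_keep_order([1, 2, 1, 2]))  # [1, 2]
```

The second function is the kind of predicate paragraph 16 mentions for unique item collation when the input order must survive.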
Second, I identified the key ideas in the two sentences to be connected. Finally, I joined them using the required grammar. +19. I used a translation service that didn't run out of quota. I changed Cultural Translation Tool (CTT) to do all possible translations in one step. First, the page was back-translated, and the algorithm notified the user about sentences which back-translated differently. When these differences were passed or changed, the algorithm translated the next batch of modified sentences, and so on, until the algorithm had translated the document. +20. I added sentence splitting to CTT. If the user needed to break sentences to back-translate correctly, they entered them as separate sentences, which were back-translated separately. If the user had previously changed a sentence successfully, it was changed. Split sentences were re-added to other documents to retranslate, or the user could use a particular possible sentence version. Also, I could rewrite documents and join sentences using text files, which were all saved from an earlier point. +21. I ran the translation past a native speaker familiar with the content to catch \"back-translation errors\". I added translation updates across languages to CTT. The algorithm tried all previous sentence versions to find a correct back-translation for a language. I dumped this idea in favour of preserving finished versions of documents. However, changes to the content or the need for uniformity meant the latest parts of documents could be merged and retranslated. The system meant updates to study materials, websites and marketing materials could be rapidly written and automatically made. +22. I could also translate code in CTT. I wrote this code in List Prolog, allowing for phrases and sentences as predicate and variable names. CTT notified the user about data changes that would require changes to the algorithm, and it did these manually, triggering a warning if someone tried to overwrite them.
But, again, bulk updates to content were possible otherwise, and I preferred staying as close as possible to a single document version. Occasionally, cultural, linguistic, and other reasons meant the user needed to save content separately. It was marked as such or stored in a separate folder (with a pointer) before documents were moved or changed. +23. The algorithm notified the user if the pointers to the unique parts of CTT documents were deleted or duplicated. The individual parts, counterparts in different languages, were stored together for easier editing. If necessary, the user could use a command to replace them with a single non-unique part. The program would back up each version. Unique parts could also contain pointers to other unique parts, and a command could merge them. +24. The quantum entanglement algorithm had 5*50 As for the answer to repeat. Separately, I wrote a duplicate content finder algorithm. To avoid duplication, it searched for content that was the same or very similar to other content. Also, content that was missing or incorrect was identified and fixed. An algorithm could identify incomplete sections and notify the writer or schedule updates by a specific date. +25. A format uniformity algorithm checked that capitalisations, spaces and quotes had the same format. Also, the CTT content that was missing was identified and fixed. For example, if the paragraph had a number but no content, it was flagged. The algorithm would copy the content from the original to it, and a person would check the result. If the document type was a term (and the algorithm needed to convert symbols such as brackets), the algorithm converted it to the correct format. +26. A favourite was a hierarchy of strings and algorithms within a string or algorithm, which were pretty-printed. Separately, CTT content that was incorrect was identified and fixed. 
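The format-uniformity checks in paragraph 25 could be sketched as follows. This is a minimal illustration under my own assumptions (the specific checks and function name are hypothetical, not from the CTT code), flagging inconsistent quote characters, doubled spaces, and numbered paragraphs that have a number but no content:

```python
import re

# Hypothetical sketch of the uniformity checks described in paragraph 25:
# flag curly quotes (when straight quotes are the expected format), doubled
# spaces, and numbered paragraphs that have a number but no content.

def format_issues(paragraphs):
    issues = []
    for i, p in enumerate(paragraphs, 1):
        if "\u201c" in p or "\u201d" in p:
            issues.append((i, "curly quotes; expected straight quotes"))
        if "  " in p:
            issues.append((i, "doubled space"))
        if re.fullmatch(r"\d+\.\s*", p):
            issues.append((i, "numbered paragraph with no content"))
    return issues

print(format_issues(["1. First point.", "2. ", "3. A  doubled space."]))
# [(2, 'numbered paragraph with no content'), (3, 'doubled space')]
```

A person would then review the flagged paragraphs, as paragraph 25 describes for copied-in content.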
If a repository had testing, update or other difficulties linked to the algorithm, it was queued and brought to light. Then the algorithm fixed this, and necessary changes to other code and CTT documents were made. The CTT documents needed to be translated and published. +27. An algorithm grammar checked the content before translation. The grammar checker was suited to computer science purposes. This algorithm ensured that the document to translate was perfectly grammatical. Also, it provided that the back-translation would be better and helped make the translation better and more straightforward. After all, the algorithm would translate grammatically simple expressions correctly. +28. The system and lecturers had to help mind map, correct and finish algorithmic breasonings with an A. The SSI Academy Site automatically marked essays for relevance, structure, grammar and breasoning count. The algorithm automated the financial site. Also, marketing came from the profits and helped the school remain open. Students received certificates and could display the course on their curriculum vitae as accredited. +29. It would be detrimental if students forgot their thoughts all the time. Also, the algorithm marked the essay for relevance. The algorithm checked the words, their order and synonyms in the paper. Without a citation, the algorithm would mark these phrases as plagiarism. Finally, the system helped provide spiritual suggestions and psychological support to the student while they wrote the essay. +30. There would be a meeting with a student progress officer if the student failed 50% of their courses. Next, the algorithm marked the essay for structure. The piece needed an introduction, exposition, critique, conclusion, references and citation style, all with the correct format. A traversal-checking algorithm checked for the argument structure and missing links or links between parts, such as paragraphs. 
An algorithm found differences in opinion from the texts and marked the essay down. +31. The essay was marked for grammar. The algorithm marked down errors such as missing or incorrect parts of speech, spelling errors or other grammatical errors. The algorithm identified fine details such as the word \"references\", not \"bibliography\", unnecessarily numbering references, too few references or using \"ibid p. x\" instead of \"ibid\" for the first reference after a reference with a page number. Corrections to these errors were found and presented to the student. The software could fix these errors at a premium. +32. The students could keep and publish their algorithms in a book or online. The algorithm marked the essay for the breasoning count. The number of proper breasonings (spiritually passing, good ideas and algorithmically logical) was counted and compared with the number required for a particular mark. It was best to tell people about the requirements so they knew what to work on and could find ways to support themselves to produce them. I became an oblate during Philosophy honours, which reminded me of the pedagogy text. +33. I'm sure I saw something like memberchk(65, `ABC`) somewhere. I devised a webpage unit testing algorithm that used the curl command to test whether an SSI Web Service algorithm was displaying correctly. The algorithm would already work in SSI (in the terminal); if it produced the same results in SSI-WS, it passed. This algorithm traced the algorithm's output as the user entered a particular input. The program could fix errors by using previous versions or corrections to algorithms. +34. I developed a login system, simple word processor, content management system and sales funnel for SSI-WS. The advantages were that the program (including security features) was invisible to the user, and data to display could be processed by algorithms and displayed as part of the website.
It was connected to email and files and made it easier to run Prolog algorithms through the web browser. A low-code back-end allowed programming in the simple, intuitive, clever and powerful programming language Prolog. I could develop a program development suite that taught and helped build web apps that connected free and customisable components. +35. State Saving Interpreter Web Service could run Prolog algorithms over multiple web pages and provide access to the simulated intelligence ability. For example, I could write computer games between people on two different computers with SSI-WS. The game required the users to log in and start playing a game together. It might be a word game, a timed puzzle or a decision tree adventure game. The server connected the users on their computers, a friendship could form, and the algorithm could save high scores. +36. I developed a Javascript scrolling list file system for SSI-WS. SSI-WS provided the back end for the system (access to files on the server from anywhere). Javascript provided the front end. I could implement the scrolling list file system in HTML using SSI-WS commands if needed. HTML could also produce colour pixel graphics for 2D or 3D platform games and accept single key presses using Javascript for control. +37. The computer was futuristic, like someone who knew you and your sensitivities and only suggested mentally nourishing activities. I developed a graphical operating system in Javascript for SSI-WS. I could also create this operating system in HTML, but Javascript provided mouse and interactive object support. It could be 3D, like a room in one's room or a fictional or realistic workplace or institution. Work could be automated, except for the human creative and cognitively correcting parts, and the humans would learn the operations of the algorithms. +38. I used Java3D to create the 3D scene and interact with others. I developed a 3D web game that simulated an educational institution in SSI-WS.
I could balance work with physical activity, such as walking, non-contact sports or gardening. I could plan and help correctly complete physical activities after working them out on the computer, with, for example, knowledge about the local wildlife and plants and the best way to start gardening. SSI-WS made it easy to make a text engine for the school simulation, and game/document exchange sides were possible, within reason. +39. Prolog could detail each screen and place characters and adventure instructions in the maze. I developed a text adventure game called Vetusia with graphics in SSI-WS. The first option was a text adventure with a compass and up-and-down directions to travel through the maze. The second option was static, hand-retouched graphics for each screen and clicking on the directions on the image. The third option was a 3D animation of moving through the Vetusian site, with gaming friends and chatbots. +40. Vedic astrology opened up the unknown and offered a seen-as version to support finance. I wrote Vedic Mind Reading (VMR) software that found the auspicious dates and times for a company opening or launch, which increased the quality of mind reading results in Music Composer or Grammar Logic. This Jyotish (Vedic astrology) algorithm's results were also helpful in finance because it anticipated the best stocks to buy without insider knowledge. As a result, people could sell before crashes and buy safer commodities. The money would stay in the country, and it could compensate for values going down. +41. People could now work overseas with their qualifications. I foresaw governments accepting international university degrees worldwide. Other countries would accept a country's university degrees. This development might mean an educational institution could be established overseas by someone from the first country. This acceptance was akin to time travel and peace spreading through time. +42.
The educational institution could open its doors to international students without paying full fees. I predicted accreditation becoming international and governments giving students loans for international degrees. I wondered about regionally differing prices. I noticed some forms of international certification and wondered about government funding. The institution could stay there and suggest many thoughts, including algorithms. +43. With the advantages of paragraphs 41 and 42, I could live in the country of my choice with a pension and open educational institutions that could have students anywhere in the world. I developed the course. I wrote a book accompanying it to explain it to children. I noticed children's school objects and ways of thinking (such as the ethics of an issue) and connected the books to them. The children were genuinely interested in adult works but lacked some skills, not knowledge, to do certain things. +44. I wrote all the arguments and algorithms required for a first-class degree. I knew the time needed to produce a degree for an educational institution. The first degree offered was interdisciplinary, a conjunction of philosophy books. Each degree required (3-4)*4*50 As, and these arguments and (some) algorithms could be written (in the same time as, not while) the degree was being studied part-time. I also recommended the vocational business diploma and Education master's degree. +45. I sowed the seeds of life for my business, starting when people were grown up. I could add to any of my philosophies, and the algorithms would follow. The \"N\" factor meant that the algorithm needed documents of a certain length. The nature of these reports remained internal. They were often divertissements (perspectives on topics). +46. I wrote the symbols movie. It featured the generated music and was dot-to-dot bitmap stills or voxel animation-based. I wrote the philosophy. I wrote the algorithm and converted it to a movie. 
It was interactive, and users could zoom into movies about each algorithm. There were similar symbols depicting each algorithm, and as users watched a movie, the programmer included them in the algorithms. +47. I worked out how much to invest and how much to write to start an institution. I planned my finances, allowing for government funding and customers for my business. I bought accreditation for my educational institution. I generated and provided courses in different languages. I could write lessons for programming languages, such as mathematical manipulation, art generation, web graphics user interface or non-deterministic languages. +48. I wrote a mathematical manipulation language. Its results were mind-read and helped users stay mentally agile and think critically about objects, courses and algorithms around them. They calculated and checked the properties of ideas in classes. And they found optimisations and simplifications of algorithms. An algorithm could ask preliminary and \"à présent\" questions about more complex algorithms. +49. I wrote an art generation language. It could generate dot-to-dot bitmap images in colour. These images included graphs with planes or regions. Or it could produce music, which helped users mentally prepare for implying imagery properly. They could mind-read new sections of songs by learning to compose the music mentally. +50. The language could display bitmap, 3D or layered graphics, and students learned the language for assignments. I wrote a language for a web graphic user interface. This language had different metaphors, such as duckling house, musician's studio or meeting friends. Users could befriend the ducklings, and the musical software could be hand-written for individuals' needs, and effects such as writing on top of or recording lateral thoughts were possible. People wanted to keep memories, so these were made into accessible documents and printed out. + +51. I wrote a non-deterministic language.
It found programs within programs (i.e. state machines). I saw the circle. It was a time-dependent set of data transformations. I found list transformations, including list decomposition and building, and associations or variables needed in other algorithm parts. +52. The language could contact people and take care of products. I wrote a commerce report-generating language. I collected the data. I generated the report. I wrote an algorithm that drew graphs, made recommendations, put ads up and took care of the business's finances with approval from a person. +53. I also kept the character recognition algorithms in one place, for use by, for example, the text to breasoning algorithm, with modifications such as no \"'\" if necessary. I preserved formatting by processing en and em dashes in the paraphraser algorithm. I detected the punctuation or space. It was separated if it wasn't an alphabetic character in English or another language, and the algorithm only replaced words. Students should start their breasoning dictionary, thesaurus and algorithm database from scratch to deserve their marks. +54. I considered putting the package manager online, with download links to the host, and snapshots of stable versions of repositories. First, I had one repository for user data and one for reused predicates. Several repositories used the user data repository, which made it necessary to have separate copies of these repositories in another folder. These could be modified and updated at leisure. Users only needed to install the relevant repositories. +55. I modified the Essay Helper documentation to explain the essay format, possibly write specific comments, and encourage contacting subject coordinators to check whether the software is allowed for a course. I explained that the software arbitrarily chooses sentences with the keywords, including unrelated grammar and formatting, and advised users to review and paraphrase the text before submission.
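The dash and punctuation handling in paragraph 53 amounts to tokenising text so that only the alphabetic runs are replaced and all formatting survives. A small sketch under my own assumptions (the function names are mine, and the alphabet is restricted to A-Z for simplicity, unlike the multi-language original):

```python
import re

# Sketch of the separation step in paragraph 53: split text into words,
# single punctuation characters (including en and em dashes) and whitespace
# runs, so a paraphraser replaces only the alphabetic tokens and all
# formatting survives. Restricted to A-Z for simplicity; names hypothetical.

def tokenise(text):
    # A word, or one non-alphabetic non-space character, or a space run.
    return re.findall(r"[A-Za-z]+|[^A-Za-z\s]|\s+", text)

def paraphrase(text, replacements):
    # Replace word tokens only; punctuation and spacing pass through.
    return "".join(replacements.get(t, t) for t in tokenise(text))

print(tokenise("well\u2013known idea"))
print(paraphrase("well\u2013known idea", {"idea": "notion"}))
```

Here the en dash survives as its own token, so the replacement dictionary never sees or damages it.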
Also, users needed to check the sources produced by the sheet feeder and remove poorly formatted or badly converted text. Another algorithm could detect incorrectly converted text from columns. Or, one could detect gibberish from OCR. +56. I programmed the gem with spiritual algorithms such as text to breasonings, grammar logic and mind reader, timing different gems with different tasks (mind reading the input and output, using the spiritual computational ability of the stone). I pretended that a crystal (like a computer) could remind me to do testing. With A, a crystal could represent any algorithm. It could run algorithms and process data. Like cosmology had a class of algorithms that found program finders, new options to call them (and the accompanying documentation), and commercially viable versions of algorithms, users could program gems like a computer, for checking, lecturers to detect and do work, for home security, for helping one's physiology feel developed, for better spiritual presentations and reminders and reminders to make reminders, all with Prolog algorithms. +57. I researched computation-compatible gems and connected them. I taught the (meditation helping) gem to recognise and use saved commands. I hypothesised that kunzite could help make connections between members of a class, amethyst helped with security, testing and correcting code, fluorite ran regression tests and transformations in the background, turquoise ran text to breasonings and other spiritual algorithms, yellow topaz could run different algorithms or an operating system, rhodochrosite ahead of time detected and corrected unwanted changes to the heart rate, circulation, blood pressure, the kidneys and reproductive organs and prevented skin disorders, migraines, thyroid imbalances and intestinal problems, while restoring bad eyesight and cleaning the circulatory system.
In contrast, azurite helped one find enlightenment by finding new connections to new ideas, songs, algorithms and art. I could simulate the microprocessor. In this way, I could support the text to breasonings algorithm by processing human knowledge passing through the microprocessor. +58. I also recorded what the text to breasonings algorithm could do. I breasoned out what the gem needed to do to program it. The time crystal, simulated on a computer, enabled a system to loop in time. I could program a house and lawn simulation, mind reading the attributes of objects the gem represents. I programmed the objects in the simulation in the gem to time travel, where the state machine helped the body to live forever. +59. I conjectured that the crystal could play spiritual sound effects and help with medicine. I programmed the gem to display a spiritual screen to help visualise data. The Gem programming language could project 4*50 As for frames of a decision tree, controlled with 4*50 As (where the controllers needed to be fast quantum box breasonings repetitions). It helped differentiate ideas and display colourful imagery. I ensured my Prolog interpreter didn't skip or optimise verification steps necessary for running text to breasonings. +60. The crystal algorithm reviewed the logic of thoughts, reminded of forgotten ideas, checked thoughts against the trajectory statement and sketched out the future endpoint. The crystal could change tree-structure stories into cycle-containing philosophies and algorithms. Text to breasonings, etc., were supported by a single crystal owned by the meditation teacher, which interfaced with the future, contained practicums and displayed imagery and reinforced results. Someone in another dimension performed all the computations simultaneously, and the developed thing was the crystal performing. 
Someone activated the future window that supported text to breasonings for breasoners in our time, and the company wrote As so that crystals made thoughts clear and logical. +61. I experimented with different programming languages for various crystals, such as database, maths, algorithm or logic formula finders, using developed data for data that resonate with the crystal. I computed 1+1=2 with the crystal by storing the input and algorithm in a spiritual container associated with the crystal with 4*50 As with text to breasonings, running the Prolog algorithm with 4*50 As per command, mind reading the container for the correct output with an open terminal window, trying again until reaching the quota of tries and displaying the answer on the computer screen. The algorithm helps with lucianpl and SSI by comparing their output with LPI output. The crystal vibrated at different frequencies for specific Prolog commands. I could write programs with commands with frequencies compatible with the crystal's chemical structure. +62. Equipment could detect waves emitted by the crystal. I used a form of mind reading to check what languages, algorithms, data types and types of data (such as music) resonated with a crystal (I could also check compatible resonance frequencies of objects). I found the resonance frequency of the crystal, vibrated it or not, signifying 0 or 1, and ran any algorithm with this frequency. The algorithm represented choice points, data, etc., by human-controlled timing. I researched computers and other technology using crystals. +63. The crystal was used when needed and was placed in the right place in the room and house. I tested whether pictures and specific questions enabled better mind reading. I helped students spiritually with their questions and answers. If they didn't have a question, I made one up if they needed help. I mapped crystal frequencies to different words to point things out to people. +64. 
I could design a spiritual computer to program if I could improve projection and mind-reading quality. I bought the channelling crystal for improved mind reading. It helped me find the best answers, such as mind-reading myself when wanting to work but not working. Other algorithms could help complete work at these times. Using additional algorithms, I could enter the correct changes and corrections to this output, and it made them automatically. +65. My spatial reasoning improved. I breasoned out 4*50 As, with which the crystal more effectively turned headaches off. Imagery appeared more clearly. Thoughts about objects, people and settings were clearer. I aimed to write 2-4*4*50 As in Physics to win prizes. +66. I wrote the algorithm to work out where I was. I improved my spatial visualisation capability. I made a model. I wrote the formula for its attributes. I worked out how to do this in my mind with the help of a co-worker. +67. I reduced the time needed to write an algorithm. I used crystals to mind-read whole algorithms. I wrote algorithm writing algorithms and entered the specifications with the computer keyboard. The algorithm worked out as much as possible without unpredictable commands, displaying and saving its progress. I entered and then wrote an algorithm to work out the missing steps, whether single or multiple commands or these within recursion or involving random numbers, APIs, files, input or mind reading. +68. I wrote algorithm writers for each algorithm I had written, using identifiers of a part which is an algorithm, analysers for how they connected and ontologies for putting them together in new ways, and ideas about when this would be necessary. In addition, I could automate writing algorithms containing algorithms, state machines, structured data in strings, multiple levels of this, different methods, and different spoken/programming languages or graphics. I collected and grouped methods from my algorithms. 
I used findall, forall, findnsols, etc. I created specific commands, editors and debuggers. +69. I questioned whether the behaviour of crystals could be captured and simulated on a computer, for example, with enough As. I analysed crystals' quantum box properties, including those for medicine. Instead of breasoning something by itself, I breasoned it by the crystal. I tested whether concentration, comfort, relaxation, thought quality and types of thoughts were different from using the crystal by itself. I did this for each crystal and for placebo crystals that weren't present. +70. I expanded what I knew into what I didn't know. At any rate, I gave As to crystals I didn't own, to answer questions for others with text to breasonings and to Chinese herbs I didn't take. I designed a simulation of my life in Prolog to capture and do things I couldn't do myself, but could read then, which involved mind reading, research and perfect breasonings. I gathered knowledge about the people, places and conclusions I wanted, and my dreams came true. I became interested in different ideas, even unusual ones, to help me help others help me, and defined my views about my choices about them. +71. I found stones' resonating books, chapters and algorithms. I used the Grammar-Logic algorithm with one algorithm per spiritual stone to help mindmap ideas. I methodically found ideas for different algorithms with each stone, comparing the same algorithms with other stones to see the different results. I found the types of algorithms suited to a stone with mind reading. I imagined testing the frequencies of an algorithm (in its simplest textual form) and a stone and seeing if they matched. +72. I detected the kind of algorithms a crystal pointed to about a philosophy. I collected and helped define philosophies and algorithms and presented the seen-as version of philosophies and algorithms written with the help of the crystal. 
I also used crystal mind reading with Essay Helper to make customisation choices, such as synonyms, synonyms for \"also\", editing grammar, performing grammatical touch-ups, and finding the context for a sentence or whether two sentences have anything in common. I found the right crystal for the point in the essay and detailed the text. Finally, I did this with \"detail\" algorithms. +73. One part of the Essay Helper algorithm checked that the total string length of the strings to form a decision tree was not over 250 characters. I did this as I went, not in one go, to stop the algorithm from crashing. This bug fix enabled longer files to be mind-read in detail in Essay Helper. I wrote a language describing essay formats for Essay Helper using list types and variables. In effect, the language generated the algorithm. +74. I used an algorithm to determine the right recursive grammar. I added strings and files to the program finder with types algorithm. So far, the program constructs programs from list data. Strings could contain lists of characters/words. Files could contain strings or lists. +75. I spell-checked the sentence and then grammar-checked it, regardless of the grammar. I wrote a grammar-checking algorithm using a grammar. First, it found the grammar of the sentence using parts of speech. If it couldn't determine the grammar or parts of speech used, it skipped the sentence, with a warning. Next, it looked for common errors, such as grammar, part of speech or agreement errors. +76. The grammar checker was primed for computer science. I checked the document's grammar for consistency, format readability, and overall content. It is essential to let the person develop the first draft of the paper and suggest optional changes to it. A page was readable if the main point was at the top and if it fitted on one page, the page where it should be.
The algorithm took the document's content from data, which should be relevant, current and valuable to the readers. +77. An algorithm could test whether the text had the quality of a back-translation. I used an algorithm to check if someone had used the Cultural Translation Tool (CTT). This algorithm didn't involve using CTT again. When a user used CTT, the translation was accurate, the grammar used universal language (at least, that common to the two languages), and it was sometimes recognisable by the language being common to the two languages. If the language used the exact words or grammatical constructs as a language, CTT might have produced it. +78. Some cultures (not just countries or language speakers) might have specific taboos or cultural understandings of texts. Back-translation always seems necessary to check that a text is understandable using an intersection of languages worldwide. It simplifies, speaks in a universal tongue and makes writing procedural. It might require the simplest possible, meaningful grammar and use of words that must be recognisable for safety and other reasons. Trees of related languages might point to words and grammars a text should use before back-translation. +79. I used an algorithm to check if a user had used the Program Finder with Types (PFT). In the first version of PFT, data to algorithm found [d,[c]] from [a,b], [b,[a]] and [c,d]. The second version found algorithms with recursion, given data in repeated list format. The first version didn't deal with repeats such as [1,2,1,2] given [1,2], and the second version didn't find algorithms needing data passed from non-expected parts of the data. The second version can do what the first version can't, and the upcoming third version can do what the second version can't do. +80. The third version of PFT brings associated data to other places in the algorithm, and the fourth version addresses the problem of unpredictable commands. 
The third version finds the order in which the lists are built and works backwards to decompose lists to reuse data structures that recur. The fourth version looks for patterns in the data, such as sort, delete, out-of-order append and dependencies on member, length, string length or string concat. There may also be up-case or down-case, splitting, custom splitting or verification of character type. To check that a user had used PFT, I would use the algorithm to check whether the code lacked the specific optimisations that PFT does not produce, or whether the code was more precise. In contrast, human-written code would be more likely to be left unminimised. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/IMMORTALITY/Immortality 18.txt b/Lucian-Academy/Books/IMMORTALITY/Immortality 18.txt new file mode 100644 index 0000000000000000000000000000000000000000..840b981f5525bb496cd71eaa9a03b9d8055ff24e --- /dev/null +++ b/Lucian-Academy/Books/IMMORTALITY/Immortality 18.txt @@ -0,0 +1,111 @@ +["Green, L 2024, Immortality 18, Lucian Academy Press, Melbourne.","Green, L 2024",1,"Immortality 18 - I could help more people become immortal. + +1. CAW finds all combinations of commands that can produce a result from the input. I used an algorithm to check if the user had used the Combination Algorithm Writer (CAW). Unfortunately, CAW currently doesn't support findall or if-then, so code with these constructs generally wouldn't have been written using CAW. (For various reasons, findall shouldn't be used in optimised code. However, the user can express if-then as a series of clauses.) +2. I agreed with helping with moderately tricky questions because they were sometimes harder to visualise. CAW has previously found unusual results which are counter-intuitive or unoptimised. In a similar program, the formula finder finds results in different orders on different machines, i.e. [a,[b,[c,[d]]]] or [[a,b],[c,d]]. 
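The brute-force search in the spirit of CAW (paragraph 1) can be illustrated with a small Python sketch: enumerate combinations of commands up to a fixed depth and keep those that transform the input into the result. The command set, the depth limit and all names are assumptions for illustration; the original CAW is written in Prolog.

```python
from itertools import product

# A toy command set; CAW's real command repertoire is much larger.
COMMANDS = {
    'reverse': lambda xs: list(reversed(xs)),
    'sort': sorted,
    'tail': lambda xs: xs[1:],
}

def find_combinations(inp, out, max_depth=3):
    """Return all command sequences (up to max_depth) mapping inp to out."""
    found = []
    for depth in range(1, max_depth + 1):
        for names in product(COMMANDS, repeat=depth):
            value = inp
            for name in names:
                value = COMMANDS[name](value)
            if value == out:
                found.append(list(names))
    return found
```

For instance, `find_combinations([3,1,2], [3,2,1])` includes the sequence `['sort', 'reverse']`, since sorting gives [1,2,3] and reversing gives [3,2,1].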
The programmer can use algorithms output by CAW as neuronetwork data, which are slow to train but quick to run. PFT is preferable to CAW for simple pattern matching but not for unpredictable, recursively called commands. +3. I decided in favour of if-then rather than top-level. I used an algorithm to check if a user had used the interpreter. The interpreter takes an algorithm and input and finds the result. An interpreter keeps the syntax of a particular programming language. The data can determine the features and correctness of the interpreter processing it. +4. If-then could be converted to multiple clauses, even with verification guards, and then to a fast language such as C, with the algorithm processing lists using a for-loop. If-then shouldn't destroy choice points; in my implementation, it doesn't. Append1 (the same as append) works forward and backwards and with multiple list items. Append1 works in SWI-Prolog, but not in the List Prolog Interpreter. In SSI, if-then works without relying on choice points. +5. I used an algorithm to check whether the user used the interpreter algorithms. I could run simple tests on the interpreter's features. These algorithms tested findall, types, functionalism, equals4 and grammar, among other things. If an algorithm used these features only and used similar data formats, then it could have been run by the interpreter. I could also check whether the logical structures in these algorithms were identical. +6. I could accept input or produce output in different formats. I included options to run with algorithms and the documentation this involved. For example, I included reflection, rotation, scaling or shearing in font rendering. In the music composition algorithm, I included an instant-runner shortcut or smartphone app if the user wanted to record their emotions in song form. The student may require music theory and composition knowledge before mind-reading composition and post-production skills afterwards. +7. 
Is flip like associated variable migration? I wrote program finders for the algorithms that I had written. I allowed the customisation of design decisions with text files containing configurations. I simplified the programming language producing these programs by using the string to list algorithm for files and using the repeating list data structure format. It reminded me of the LogoWriter programming language. +8. I joked that the young-adult students' memories and ideas would need to be clear in their minds when they flipped the Logo page to write their programs. Perhaps they wrote on their hands or a scrap of paper in their minds. This thought process could have reinforced their memory and spatial imagination skills, and a teacher may have assessed it. Following this, the students may have consolidated their Logo knowledge with the repeated lists of List Prolog and designed a decision tree. The algorithm would find the algorithm from the data, whereas writing the inductive algorithm is more advanced. +9. I found graphical features such as an ice cream cone's diagonal grid appealing in the icon editor. I wrote graphical user interfaces for the algorithms. I wrote the GUI-editor base program in SSI. It required multiple buttons and graphic elements and possibly disallowed pressing the browser's \"back\" button; instead, users should click on screen controls, and the algorithm should automatically save work. In addition, SSI contained commands containing Javascript elements. +10. I calculated how much the program returned from the original investment. I helped ensure that an algorithm could be commercially viable. It contained save and load buttons. Also, it allowed customisation of the features and for students to create exceptional work. The algorithm needed to be crash-free and secure, and simple yet compatible. +11. I wrote a chatbot with text, algorithm, song and picture. This algorithm produced essays with the essay format, paraphrasing and connections. 
Also, it made algorithms using PFT. In addition, it made songs using the music composer, with different genres in a meditation style. Finally, it created visual art which contained bitmap stamps and could be games. +12. I wrote biographies for myself as an actor, singer, computer scientist and philosopher. I listed my plays. I also listed my songs. In addition, I listed my algorithms. Finally, I listed my books. +13. I wrote texts on philosophy and assessed students writing on them. I assessed students in my academy on my songs, plays and algorithms. I needed insight in my career. Students could write philosophy essays on my songs. Or they could write philosophy essays on my plays. Finally, they could write philosophical essays on my algorithms (their method, use, and future research). +14. I wrote an algorithm tracing logins, with backups of this log, and notifications of logins to users so that users could report suspicious logins. I wrote a security algorithm to prevent an object from being stolen. First, I locked the object in a box, which no one could remove from the premises. Second, I kept the key in a safe, separate place. Finally, if the object was digital information, I increased my options if someone stole it and traced commands and locations where someone could steal it. +15. The crystals were like the Upanisads, focused on customers. I bridged the unknown in sales. I found the gaps between points of the buying process and increased the As between them, often specifically for a client. I included a magical data icon at one point. The company assured prospective customers that they had backed up their unfinished sale and could return anytime. +16. The obvious algorithm was positive and easy to understand and translate. I wrote the label on the gem to bulk-process my algorithms. I kept on expanding the breasonings 1-2-3 algorithm until I hit obviousness. This process clarified the unit and its movement and helped develop simplified and group algorithms. 
There may be levels of complexity with different numbers of expansions. +17. I preferred non-head-collapsed Prolog, including if-then, to explain code. The upside to expanding algorithms was that I could simplify complicated algorithms, simplify code and write shorter, more meaningful sentences about it for younger and disabled audiences. If it was simple, adult audiences could read it easily as well. It made writing a simple version of the algorithm appropriate and made summarising the rigorous sides of different algorithms more accessible. With a high-distinction argument, students could then work on an assignment. +18. I also spell-checked (breasoned) and grammar-checked the writing, testing any algorithms. Separately, I automatically converted and uploaded the philosophy. When the philosophy had reached 80 paragraphs, I converted and uploaded it using an algorithm. Then, I moved Grammar-Logic content (mind-mapping ideas) to another file. The algorithm notified me if a file had the incorrect format, including incorrectly formatted double quote marks. +19. In Essay Helper, I checked that \"space space space\", or similar in \"\n space space space \n\" wasn't in the sources. This feature frequently represented page breaks in PDF documents and needed to be changed to \"\n\n\" for pagination. The sheet feeder algorithm could convert this feature using atomic list concat. Essay Helper could be the basis for industries as it was the ideal form of A-grade arguments. As part of this, the algorithm would mind-read customers or employees and produce the essay. +20. Humans appeared only to accept what they were ready for. More senior employees or customers could produce their own chapters to create essays or run Essay Helper. Often these customers would be businesses. A completely automated process might include algorithm, breasoning and connection generation. 
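The page-break clean-up described in paragraph 19 might look like the following Python sketch: runs of spaces between newlines (as left behind by PDF extraction) are collapsed to a plain paragraph break. The exact pattern is an assumption; the source does this with atomic list concat in Prolog.

```python
import re

def fix_page_breaks(text):
    """Replace newline-spaces-newline page-break residue with a plain
    double newline, as used for pagination."""
    return re.sub(r'\n +\n', '\n\n', text)
```

For example, `fix_page_breaks('end of page\n   \nstart of page')` yields `'end of page\n\nstart of page'`.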
Breasonings must be human-desired, on the right trajectory and originally found out, perhaps through, for example, automation or a feature matching accepted theory. + +21. Once students had written the program finder, they could use it to speed up code generation and produce code more accurately. I added brackets (not just repeating lists) to the second generation of the program finder with types algorithm, avoiding processing them as lists. When the specification suggested no reason to turn brackets into lists, the algorithm left them as brackets. Then these parts of the data structure were processed as non-list brackets. In other words, the algorithm generated non-recursive code, but as a separate or combined predicate. +22. I converted between list, list of lists, state machine and string formats. I added support for constants, properties and errors in the list of lists algorithm. For example, there could be labels, wrap-around brackets or grammar for strings. Or, there could be properties such as string length or length found or verified. Finally, the algorithm could interact with global variables, files and a command text file, and return errors. +23. I wrote a pretty printer for a state machine. It printed commands in a list, one per line. It could abbreviate lengthy labels and print destination states in columns under a heading. There was also the possibility of drawing a flowchart with subroutines at the bottom. I optimised reused code and left some code expanded for simplicity. +24. In the second generation of the program finder with types, I found a \"geode\" of brackets and applied an algorithm such as the data to alg algorithm to it. If there was a non-repeating term containing brackets, then it was pattern-matched and possibly used to build a list. The term \"geode\" means a term such as [a,[[1], \"C\"]] that contains no repeating lists but may contain brackets. 
Alternatively, the geode may include repeating lists, where the non-repeating part (the geode) is processed separately from the repeating lists. If there was a set of repeating lists in the geode, these were processed using a previous or new predicate. +25. Various possible algorithms were generated (with simplifications and groupings ranked higher than completely expanded code), and the program clarified questions about incomplete or ambiguous specifications. For example, the program asked questions about odd sets of constants in the same place in a repeating or recursive list, such as \"do they have rules?\". Negative data might be required if a rule is missing or wrong. Occasionally, the algorithm would note missing steps, data, algorithms or rules, and it attended to these. For example, if data showed that there was a missing specification, then it was requested. +26. If the data were i:1,o: \"1\" and i:3,o: \"3\", then the incomplete data i:2 would be completed with o: \"2\". PFT2 (converting repeating lists to code) could also output a functional call to a command that processed recursion in a single line. In other words, the algorithm would condense the recursive algorithm to a single call. If pretty printed, the call would be easier to read, and a verification algorithm would correct incomplete data within PFT2. This verification algorithm would detect data that was inconsistent, likely to be error-prone or the same as previous data. +27. I also wrote simpler versions of algorithms to make writing their more complex versions easier. Separately, the simplifications were removing +0, *1, string_concat(A,\"\",B), append(A,[],B), A=B, etc. The groupings were doing a common task in another predicate or assigning a more complex task to a command. For example, for any task, such as sorting without removing duplicates or finding code that pretty prints a list, the algorithm could assign running code or mind reading to a command. 
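The identity-operation simplifications listed in paragraph 27 (+0, *1 and similar) can be sketched in Python over a toy `(op, left, right)` expression tree. The tuple representation is an assumption for illustration; the original rewrites Prolog goals such as append(A,[],B) and string_concat(A,\"\",B).

```python
def simplify(expr):
    """Recursively remove identity operations from (op, left, right) trees."""
    if not isinstance(expr, tuple):
        return expr            # atom: a variable name or constant
    op, a, b = expr
    a, b = simplify(a), simplify(b)
    if op == '+' and b == 0:
        return a               # X + 0   =>  X
    if op == '*' and b == 1:
        return a               # X * 1   =>  X
    if op == '++' and b == []:
        return a               # append(A, [], B)  =>  A = B
    return (op, a, b)
```

For example, `simplify(('+', ('*', 'x', 1), 0))` reduces to `'x'`, applying both rules in one recursive pass.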
I agreed that \"terminal\" azurite could be programmed with grammar; I removed unnecessary grammar rules and wrote simpler grammar before adding variables (such as in string to list), etc. +28. I identified a list of lists processing algorithm, using a state machine to represent this structure. The algorithm could use this state machine to process lists of lists, preventing the need to program this type of algorithm from the start. Many algorithms use lists of lists, such as interpreters, converters, state machines, lists to strings or grammars, and strings and grammars to lists. The list [a,[b]] was broken into 1:a,2; 2:b, and then the algorithm generated the list decomposition and building code. The algorithm identified possible list labels and wrappers, and the program finder inserted rules. +29. The algorithm used the bottom-up technique to construct new algorithms and find new sciences. The algorithm used the top-down method to find algorithms from philosophies, examining them, where the philosophies contained simple connections between algorithms. The algorithm recorded commonly used data structures, their type found (e.g. an index, a frequency or an algorithm written in a particular programming language) and possible algorithms (ordered in a state machine) that operated on them. Then, by matching data to these data structures, the program could write parts of old and new algorithms. First, I described the relation between a philosophy and its algorithm, e.g. identified that the third technique on meaning found simple algorithms for household objects. Then, using ontologies, I could use the program to find more algorithms on these topics. +30. Algorithms could contain or allow for other algorithms. For example, a programming language converter had syntactical elements of a programming language. These were the logical structure and commands in the language. If a change was needed, the programmer placed it in a single place to effect changes elsewhere. 
For example, if the programmer added forall to the language, then they added its syntax to a text file that other programs accessed. This single record of syntax affected interpreters, converters and documentation. +31. I tried program finders with neuronetworks one at a time to find links between \"edge\" data found with the program finder with types and any intermediate steps needed. First, I ran PFT3 (with associated variables passed around) to find \"edge\" variables or variables that needed unpredictable commands not accounted for by pattern matching. Then, I ran PFT4, which found these unpredictable commands, for example, 1+1=2. Apart from mathematical commands, unpredictable commands included logical, set-theoretical/database, matrix, verification, peripheral (random, file, user input and APIs), constant-involving (from the program, the context or the greater context) and most non-pattern matching (where list or string processing are pattern matching) commands. I could also find neuronetworks for specific combinations of unpredictable commands. +32. A type of neuronetwork was finding more types at each level, such as finding algorithms with more commands in PFT. I could discover neuronetworks for specific combinations of unpredictable commands by running the commands together (where the program checked that their types were compatible) with generated data to accompany similar data in previous valuable programs. If the data needed didn't match the data available, then either token data that did match, or a central source of data used in a different place, was used. If a type of transformation that wasn't available was required, such as translation, it was identified and used. +33. I removed constants from reused predicates to use them more widely and reinserted them using a text file. First, I kept a copy of the predicate without constants or variables for constants. 
Then, I generated a copy of the predicate with the constant(s) (with a text file but without constant variables). Instead, I wrote a program finder for the predicate, which generated the predicate with the constants, etc., in the right place. Later, I wrote a general algorithm that generated any predicate with constants by using placeholder labels to substitute with constants and other rules if it was recursive or processed previous outputs. +34. If a predicate could simulate meditation, anything was possible: immortality, simulations (with one's content) and space travel. Separately, I processed data one function at a time rather than in one predicate. (I wrote other combinations of predicates.) I wrote a program finder for finding an algorithm that processes previous outputs by processing one predicate at a time. First, I wrote a predicate that sorted items, and then I wrote an algorithm that removed duplicates. +35. Only one pipe per list was allowed, but embedded lists were permitted, such as A=[[B | C]| D]. I reported an error when there were two pipes in equals4. While running the string to list algorithm, I stopped if there were errors in the file. For example, I reported an error if there were two pipes, i.e. A=[B || C]. There would also be an error in A=[B | C | D], etc. +36. If students wrote the error correction algorithm, they could keep the results, such as misnamed variables or missing code that looked mathematically \"asymmetrical\" in, for instance, findall. I also found the missing comma, bracket or full-stop error. While running the string-to-list algorithm, the algorithm stopped if it found any of these errors and then reported them. There were also syntax errors if a predicate did not have a well-formed head and body. Additionally, if the logical structure or List Prolog formatting were incorrect, there would be an error. +37. I aimed for world peace. I optimised predicates by expanding and then simplifying the code. 
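The syntax checks of paragraphs 35-36 (a second pipe in a list such as A=[B | C | D], and unbalanced brackets) might be sketched as below. Operating on a flat string, with the pipe count reset at each opening bracket, is a simplification for illustration; the original works on parsed List Prolog terms.

```python
def check_syntax(code):
    """Return a list of error messages for simple syntax problems:
    a second pipe in a list, or unmatched/missing brackets."""
    errors, depth, pipes = [], 0, 0
    for ch in code:
        if ch == '[':
            depth += 1
            pipes = 0              # pipe count tracked per innermost list
        elif ch == ']':
            depth -= 1
            if depth < 0:
                errors.append('unmatched ]')
        elif ch == '|':
            pipes += 1
            if pipes > 1:
                errors.append('two pipes in one list')
    if depth > 0:
        errors.append('missing ]')
    return errors
```

For instance, `check_syntax('[B | C | D]')` reports the two-pipe error, while a well-formed `[B | C]` passes cleanly.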
I expanded all the predicates so all repeated code was visible and removed unnecessary code. I contracted the predicates so that the repeated code was in one place. I looked into meditation technologies and stayed in the perfect universe. +38. I wondered if if-then and top-level could co-exist or if I needed a flag to choose one to work. I used different configurations of variables, multiple clauses and cut to avoid if-then when optimising. I wrote different combinations of variables in the predicate headers, with base cases and the same variables appearing together first. I wrote multiple clauses and used cuts to stop trying later clauses. I possibly didn't use if-then because it was too complicated, but I used it later in \"Cognitive Prolog\", which I attributed to Leon Sterling. +39. I wrote an explainer in terms of Cognitive Prolog, perhaps with a Deterministic Finite Automaton (DFA) minimisation of a state machine, explaining that I could remove all obsolete transitions at the same time, leaving one. I wrote predicates in terms of shorter predicates. I omitted if-then, using multiple hierarchical predicates, with the first lines as the antecedent. I used cut, then used the last clause as C in the equivalent of A->B; C. These clauses were like a case statement, and the last one also needed a case. +40. In the dot-to-dot drawing program, I minimised the state machine from the story of raising and lowering the pen and drawing the start and end of lines. The final states were as follows (columns: this state; line from any state to this state; possible next states): +[ +[[1,s], false, [[1,s],[1,-]]], +[[1,-], false, [[2,s],[2,-]]], +[[2,s], true, [[1,s],[1,-]]], +[[2,-], true, [[2,s],[2,-]]] +] +It also worked when drawing points at the end (an add-on). Unfortunately, it also had no space character, but I could add one. +41. I wrote a program finder to process items that had been processed, for example, to find unique items. 
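The discard-already-processed program just mentioned in paragraph 41 can be sketched as follows: each item is checked for membership in the processed list and kept only if unseen. Names are illustrative; the original is a generated Prolog predicate.

```python
def unique_items(items):
    """Keep the first occurrence of each item, in order, by checking
    membership in the processed data and discarding repeats."""
    processed, result = [], []
    for item in items:
        if item in processed:
            continue               # member check: discard the repeat
        processed.append(item)
        result.append(item)
    return result
```

For example, `unique_items([1, 2, 1, 3, 2])` returns `[1, 2, 3]`.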
I wrote \"frame data\" for each step of the algorithm. This frame data was verified to be uniform. The program finder determined whether the current item was a member of the processed data and, if so, discarded it. I could process multiple lists of lists and possibly store items in either list. +42. I watched for systematic errors after finding an error. Separately, I verified the uniformity of the data. Then, I found whether the data obeyed a single rule. Finally, I notified the user that the data would likely be inconsistent if it didn't. Then, they or the program could modify the data. +43. I found myself \"mouthing off\" in a form as I wrote on the Stream Class whiteboard. Thank goodness I learnt Prolog later. I processed items from a list of lists. Instead of a list of items from which to take the current one, the list was a list of items in tuples with other items. In this tuple, I chose only the first item. I needed to match the other item with \"_\" so that it didn't register as a singleton (a variable that erroneously appears only once). +44. I wrote a program finder to recursively call a predicate, such as a decision tree finder. I examined the finished decision tree, identifying the indices, frequencies and data. I identified indices as numbers from 1 to the number of items, frequencies as the number of occurrences of an item and the data as the unique data points. I identified that someone had arranged items in the tree structure in a way that I could produce using recursion. This recursion was where each recursive step created a node of the tree. +45. This program finder (in paragraph 44) found the next character in the strings (where I constructed a decision tree from the possibilities through different strings) to process at each point. It also found the rest of the items, to pass on to the recursive call. The interpreter triggered the base case if no more characters existed in the strings. Instead of strings, I could substitute lists for them. 
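The decision-tree construction of paragraphs 44-45 can be sketched as a recursive trie builder: at each node, the next character of each string is taken and the rests are passed to the recursive call; the base case fires when no characters remain. The nested-dict representation is an assumption for this Python illustration.

```python
def build_tree(strings):
    """Build a character decision tree (trie) from a list of strings."""
    strings = [s for s in strings if s]            # base case: drop empties
    if not strings:
        return {}
    branches = {}
    for s in strings:
        branches.setdefault(s[0], []).append(s[1:])   # next char, rest
    return {ch: build_tree(rests) for ch, rests in branches.items()}
```

For example, `build_tree(['ab', 'ac'])` yields `{'a': {'b': {}, 'c': {}}}`: the shared first character becomes one node, and the tails branch below it.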
I experimented with character codes for better performance. + +46. Complex neuronetworks may have errors, whereas I could vet simpler algorithms more quickly. I argued that computational mind reading and meta-induction could replace a complex neuronetwork. In this case, meta-induction included a simple neuronetwork that found the best fit of values at the bottom of the curve. Computational mind reading detects human feelings and overall thought trajectories. Where complex neuronetworks contain LLMs that are not understood, computational mind reading is random and containable, and meta-induction returns the same results each time. +47. Computational mind reading allows human communication and empathy, and meta-induction enables thoughts. Human preferences such as mood, musical taste, taste in art, taste in reading and more developed thoughts, such as computer science research, may be identified using computational mind reading. These could mean the difference between humanness and non-humanness with a simple neuronetwork. For example, composing music or talking may be possible with this method. Frontiers in creative algorithm writing, where writing an algorithm involves thinking of a technique and choosing features, may be approached by starting with a simple prototype and building on features according to human intuition. +48. Signals focused on facts. I removed creativity from the simple neuronetwork and used computational mind reading for synonym choice, cognitive or optimised choice, decomposing at the start or as one goes, building with an integrated function or one function at a time, using complex commands or expanding predicates, making algorithms self-contained or using other packages, choosing whether to use a program finder or a functional command, choosing whether to fix an interpreter, algorithm or data, and deciding whether to make a bug fix, add a feature or change an API. 
In addition, I could use a computational mind reader to choose to translate a sentence with simple grammar, correct a spelling error by part of speech and mind reading, correct a grammatical error using aims for the text, convert between programming languages and choose one, do random software testing, create generative art or art based on mind reading templates. I used a mind reader to choose (from several) synonyms when paraphrasing a text. Additionally, phrase-to-sentence converters were used and found relevant connections. In a PhD, I found ideas agreeing and disagreeing with a topic, a comment made on a later one, and details found. +49. I used mind reading to choose between cognitive or optimised code. To do this, I explained the code cognitively using analogies such as \"what I do or don't need\". Also, I used tools such as animations to describe the code. Finally, I interacted with the students, asking them to answer simple questions about their understanding of the code. These weren't multiple-choice questions but coding questions, possibly with hints. +50. I explained the code using an animation. I displayed the data structures as \"flying blocks\". I kept coloured labels representing unfinished predicates on the screen. I demonstrated each predicate. I said what the algorithm got up to, its limitations and uses, with illustrated examples. +51. I made an animation of a mathematical calculation by showing red coloured input blocks with numbers on them, the mathematical operator on a blue coloured block and the result on a red coloured block. I could demonstrate each predicate once, with one more input and output of the predicate shown again to reinforce understanding of the predicate's function. This demonstration works best with simple data. Also, the student could ask questions, pause and restart, answer a question or skip it if they had understood it, to customise their learning. 
I rewarded progress with computer encouragement, and they could create their own experience by changing the difficulty level, which the computer would still explain. +52. The student asked questions, helping them understand the topic and prompting learning by others. For example, the student asked a question about the function of the predicate, possibly pointing out a correction. The lecturer deliberately made blunders, and the students helped correct them. For example, the lecturer may use the incorrect set of inputs for the predicate, call the wrong predicate, get a function wrong or incorrectly comment on a predicate line. The lecturer may answer the question incorrectly, prompting further inquiries by the students. +53. I suggested explaining the algorithm's simple, not complicated, version. Students may stop the algorithm animation if they haven't understood the last point. They may also stop it to gain clarification, understand a topic or reason more profoundly or if the animation is confusing, ambiguous, misses the point or says a point unnecessarily. After replaying the animation, they might take a note or mental note, screenshot or sum up the knowledge's usefulness, perhaps in making a take-home examination for themselves or constructing their own animation. A simple animation might break down the animation into methods and explain how the algorithm transformed the data. +54. The teacher might ask the student a question to gauge their knowledge about an algorithm animation. These would be non-assessed, formative questions. For example, \"What was the input?\" (pointing at the screen), \"How will it be transformed?\" (about each predicate) and \"Can you please tell me the output?\" Also, they might ask for feedback about the system, possible improvements and whether it helped them become aware of the algorithm's workings and might lead to the gamification of the algorithm, helping them sharpen their skills about the algorithm. 
The game might be a timed multiple-choice quiz about the inputs and outputs of the algorithm, with a model solution at the top of the screen. +55. There might be bug identification, bug-fixing and optimisation games. The student could customise their learning by creating a game testing knowledge about the algorithm. It might focus on lesser-known problem areas, repeat until performance is flawless, use individual or group data to predict weaker areas and deliver content according to the dependent knowledge required. For example, a constructive game might help decompose and build the list in each predicate. Conditional predicates would be tested first and relied on later in the game. +56. A bug identification game might ask players to identify singletons, missing brackets, commas or full-stops, two pipes in a list or more complex bugs to do with variable naming or missing parts in findall, base cases, recursion, algorithms, commands or predicates. Code would be printed one character at a time, and the algorithm would award a prize when the player identified errors. Or, the player could test for over-symmetrical or under-symmetrical code. Alternatively, there could be a timed game in which the player entered missing code. Then the game's programmer could test themselves by generating incomplete or erroneous code with various formatting, naming and numbering conventions. +57. A game could identify and remove bottlenecks such as the curse of dimensionality. An optimisation game might compare two sets of commands, considering the number of instructions, running time for a multiplied number of examples and simplicity or elegance of code. Or it may require optimising code (which is open-ended) to increase performance. It might be a multiple-choice quiz about processor-specific optimisations. Or it could be about language-specific optimisations. +58. 
It would be worth generating output when the algorithm didn't contain it, with question-answering that pre-populated data and verified it with the following program. I used mind reading to decompose it as I went rather than at the start. I would only need to do it initially if a program that didn't produce standard code did it. In this case, the algorithm had to map the argument terms to a state machine that could be searched and reconfigured into output. If there were recursive parts, decomposition at the start could accommodate pattern-matching repeats. Building at the beginning could produce the result using a state machine, which the algorithm would convert back to a list of lists. +59. I mind-read to build with one predicate at a time, when possible, rather than with a combination of functions in a predicate. I was in favour of writing library predicates myself. I built dependencies within these using the most specific code possible. I observed that the complexity of functionally decomposed code was less than otherwise. I mind-read (questioned) the elements (functions) one at a time, then built them using specifications and formulas. For example, I replaced all instances of one variable with several variables. +60. I used complex commands at one point instead of expanding predicates. I wrote neuronetworks 1:1 with code. I could produce any code I had written with a neuronetwork, connect algorithms, write code faster, write better-performing code and search for useful commands. Ultimately, I expanded predicates to find and modify code and more easily edit what I had written. I could sometimes reuse more recognisable expanded code; otherwise, I wrote it. +61. I used mind-reading to make algorithms use other packages rather than be self-contained. First, I checked whether the other package was still available; otherwise, I programmed it myself. Next, I worked out how long this would take. 
I also supported various packages, depending on the user's needs or whichever were found easiest. Finally, I verified whether the algorithm was functioning correctly by automatically downloading the packages and testing the algorithm for a particular operating system or architecture. +62. I used mind reading to determine whether to use a program finder or a functional command to perform a computation. I used a program finder to find the functional command if the user preferred. A functional call contained names of predicates. It called code produced by a program finder in a language similar to Prolog that simplified algorithms to commands. I could use philosophies, data that led to the result or partial commands to find complete code. Philosophies were widely interpretable, and I saved algorithms rendered from them. +63. I used a mind reader to choose whether to fix an interpreter, algorithm or data. If the problem was data, I could also change the algorithm to take more detailed data. If the problem was the algorithm, I could change the interpreter to upgrade to the algorithm. If the problem was the interpreter, I could change the data that defined it and keep a check on the changes. Also, I could fix interpreter problems by modifying the algorithm-as-data, possibly by converting to a state machine and, for example, simplifying logical structures to predicates. +64. I found the best time to perform a task with Vedic mind reading. I used a mind-reader to choose whether to make a bug fix, add a feature or change an API. First, I listed the to-do items. Next, I ordered and grouped related items. I classified them as making a bug fix, adding a feature or changing an API. +65. I used a computational mind reader to choose to translate a sentence with simple grammar. I found the sentence in the documentation. I used question-answering to find the most straightforward grammar and compatible vocabulary from a file. 
Other sentences which matched this combination of words could be written in this format, with allowances, converting multiple clauses to multiple sentences, repeating anaphors, simplifying to acronyms, and using glossaries at the top of the page. These sentences were in a uniform grammatical format, and I could translate them using another algorithm. +66. I corrected a spelling error by examining its probable part of speech and using mind reading. First, I found the part of speech of the word in the place of the wrong word, given the rest of the words in the sentence. Next, I allowed for multiple words needing to be replaced or to replace others, being out of place or wrong. Then, I made the spelling correction. Given numerous options, I mind-read, worked out from the context or questioned the best possibility. +67. I used a mind reader to correct a grammatical error using aims for the text. I wrote the algorithm. I wrote intellectually stimulating families of input and output. Finally, I wrote an essay in terms of the language of the algorithm. For example, the types to alg algorithm took repeated skills needed for meditation and produced a single list of skills. +68. I used a mind-reader to convert between programming languages and choose one. I reduced the algorithm to simplified C (it had types and memory allocation according to data). I could verify memory allocation according to data if the data seemed too long and the programmer wanted to reduce the data length. I chose a language with a mind-reader (Prolog, List Prolog or Simple List Prolog). Then, I converted to the other language. +69. I used a mind-reader to do random software testing. To do this, I randomly generated the expected data to enter into the algorithm. Then, I randomly generated unexpected data to enter into the algorithm. This testing checked the security, rigorousness and fidelity of the interpreter, algorithm and data. 
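The random software testing just described (generating both expected and unexpected data to probe an algorithm) can be sketched as a small fuzzer. This is a minimal Python sketch, not the project's own tester; `fuzz`, `random_expected` and `random_unexpected` are hypothetical names introduced here:

```python
import random
import string

def random_expected(n):
    # Expected data: lists of small integers, a shape a typical algorithm accepts.
    return [random.randint(0, 9) for _ in range(n)]

def random_unexpected():
    # Unexpected data: wrong types and malformed values, to probe robustness.
    return random.choice([None, "", float("nan"),
                          "".join(random.choices(string.printable, k=8)),
                          [[]], {"unexpected": True}])

def fuzz(algorithm, rounds=100):
    """Run the algorithm on expected and unexpected inputs, collecting failures."""
    failures = []
    for _ in range(rounds):
        for data in (random_expected(5), random_unexpected()):
            try:
                algorithm(data)
            except Exception as error:
                failures.append((data, error))
    return failures
```

Any collected failure pairs the offending input with the raised exception, which is the information needed to judge the rigorousness of the interpreter, algorithm or data.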
Finally, I mind-read the algorithm with insightful ideas in my sleep and dreamt the answer, making allowances for miscellaneous thoughts. +70. I created generative art or art based on mind-reading templates. These templates were question-answering forms that questioned the subject, their attributes, and why the artist depicted them. The user could use another entry method after making a new algorithmic connection. The subject could also be abstract or a representation of an algorithm. The mind reader could show the (input and) output, with the highest-frequency words in arguments for the algorithm, and approach more relevant research. + +71. Besides the immediate group, I focused on other people. I helped additional people become immortal. I found the people. I helped them to become immortal. They each wanted immortality. +72. I could survive dangers because of the simulation, but I took no risks. I made the transition to eternal life. I became a bot with 250 breasonings. I meditated and time travelled to September 5689. With 250 breasonings, I activated the Simulation chip and followed the voice prompt to become immortal. +73. All meditation technologies are forms of medicine, including operating systems and microprocessors. Both mind reading and time travel are necessary for survival. Mind reading enables one to read one's mind to write a number one song and live forever. It also lets one become a spiritual professor by hosting one's mind-reading philosophy academy, allowing several transcendences and allowing meditation to work. Time travel is necessary for survival because it enables travelling to the future and becoming immortal. +74. By breasoning out As about immortality and medicine, I convinced the stakeholders that I would become immortal. I moved to eternal life by finding 4*50 As. This time the philosophy needed sentence breasonings (i.e. 80 sentences per A). I found a text that was on the topic. I breasoned it out using the text to breasonings algorithm. 
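The figures in the previous paragraph imply a total sentence count, worked through below as a minimal sketch (the variable names are illustrative, not from the source):

```python
# Figures quoted in the passage: 4*50 As, each A comprising 80 sentence breasonings.
as_needed = 4 * 50                       # 200 As
sentences_per_a = 80                     # sentence breasonings per A
total_sentences = as_needed * sentences_per_a
print(total_sentences)                   # prints 16000
```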
+75. Meditation helped me develop texts and control my life to become immortal. I found meditation. It was necessary to time travel safely. It was also essential to experience immortality. It was required to experience spiritual medicine critical for immortality. +76. I also took Chinese herbs for longevity, exercised, did push-ups and toned the energy from the day to write by breasoning out 5 As. I practised a daily regimen to maintain my immortality. I breasoned out a medicine argument and used text to breasonings to breason out other arguments. I meditated. I time travelled. +77. The aim of immortality was to deliver my ideas in education. I always found work to do while immortal. I made a plan. I wrote down my thoughts. I expanded these and related them to the argument. +78. A dimension is a list of ideas, possibly about part of a text. I found time to further the developments in my books. I wrote the first dimension. I wrote another dimension. Then, I kept writing down dimensions and algorithms until I had finished. +79. I detailed my career in immortality. I wrote details. I wrote ten sets of input and output for the algorithm. I wrote three members of each family of sets. For example, if a point was about archaeology, I wrote about the find lists algorithm's past, present and future. +80. I found the minimal set in immortality and studied its simplicity. I wrote details about the find lists algorithm. The find lists algorithm found the items in a list of repeating lists. For example, [1,1,1,1] and [1,1] gave [1,1], and [1,[2,2],1,[2]] and [1,[2,2]] gave [1,[2]]. I imagined excavating a map that explained architecture in terms of a single repeating set of items. This architectural detail could be a hall with rooms attached or a path to several sheds. + +* I explained the code by tracing the algorithm. 
+* mr as quantum computer +* artificial nn for finding patterns + + +*,\"I could help more people become immortal\",a_alg([[[[[],n],v],\"more\"],[\"more\",\"cupping\"]]),b_alg([[[[[],n],v],\"more\"],[\"more\",\"cupping\"]],a),bb_alg([[\"more\",\"cupping\"]])],[*,\"I could help more people become immortal\",a_alg([[[[[],v],a],\"become\"]]),b_alg([[[[[],v],a],\"become\"]],a),bb_alg([])],[*,\"I could help more people become immortal\",a_alg([[[[[],v],a],\"help\"]]),b_alg([[[[[],v],a],\"help\"]],a),bb_alg([])],[*,\"I could help more people become immortal\",a_alg([[[],\"become\"],[\"become\",\"tilt\"],[\"tilt\",\"sectest\"],[\"sectest\",\"btw\"]]),b_alg([[[],\"become\"],[\"become\",\"tilt\"],[\"tilt\",\"sectest\"],[\"sectest\",\"btw\"]],a),bb_alg([[\"become\",\"tilt\"],[\"tilt\",\"sectest\"],[\"sectest\",\"btw\"]])],[*,\"I could help more people become immortal\",a_alg([[[[[],v],a],\"people\"]]),b_alg([[[[[],v],a],\"people\"]],a),bb_alg([])],[*,\"I could help more people become immortal\",a_alg([[[[[],v],a],\"could\"]]),b_alg([[[[[],v],a],\"could\"]],a),bb_alg([])],[*,\"I could help more people become immortal\",a_alg([[[[[],n],v],\"more\"],[\"more\",\"phantasmagoria\"]]),b_alg([[[[[],n],v],\"more\"],[\"more\",\"phantasmagoria\"]],a),bb_alg([[\"more\",\"phantasmagoria\"]])],[*,\"I could help more people become immortal\",a_alg([[[[],v],\"more\"],[\"more\",\"koto\"]]),b_alg([[[[],v],\"more\"],[\"more\",\"koto\"]],a),bb_alg([[\"more\",\"koto\"]])],[*,\"I could help more people become immortal\",a_alg([[[],\"help\"],[\"help\",\"emissions\"],[\"emissions\",\"make\"]]),b_alg([[[],\"help\"],[\"help\",\"emissions\"],[\"emissions\",\"make\"]],a),bb_alg([[\"help\",\"emissions\"],[\"emissions\",\"make\"]])],[*,\"I could help more people become immortal\",a_alg([[[[[],v],a],\"could\"]]),b_alg([[[[[],v],a],\"could\"]],a),bb_alg([])],[*,\"I could help more people become 
immortal\",a_alg([[[],\"could\"],[\"could\",\"preoedipal\"],[\"preoedipal\",\"tango\"]]),b_alg([[[],\"could\"],[\"could\",\"preoedipal\"],[\"preoedipal\",\"tango\"]],a),bb_alg([[\"could\",\"preoedipal\"],[\"preoedipal\",\"tango\"]])],[*,\"I could help more people become immortal\",a_alg([[[],\"people\"],[\"people\",\"pcre\"],[\"pcre\",\"cosmologies\"]]),b_alg([[[],\"people\"],[\"people\",\"pcre\"],[\"pcre\",\"cosmologies\"]],a),bb_alg([[\"people\",\"pcre\"],[\"pcre\",\"cosmologies\"]])],[*,\"I could help more people become immortal\",a_alg([[[[[],n],v],\"more\"],[\"more\",\"existential\"]]),b_alg([[[[[],n],v],\"more\"],[\"more\",\"existential\"]],a),bb_alg([[\"more\",\"existential\"]])],[*,\"I could help more people become immortal\",a_alg([[[[],v],\"splitintosentences\"]]),b_alg([[[[],v],\"splitintosentences\"]],a),bb_alg([])],[*,\"I could help more people become immortal\",a_alg([[[[[],v],a],\"become\"]]),b_alg([[[[[],v],a],\"become\"]],a),bb_alg([])],[*,\"I could help more people become immortal\",a_alg([[[],\"help\"],[\"help\",\"texttobreasoning\"],[\"texttobreasoning\",\"integrates\"]]),b_alg([[[],\"help\"],[\"help\",\"texttobreasoning\"],[\"texttobreasoning\",\"integrates\"]],a),bb_alg([[\"help\",\"texttobreasoning\"],[\"texttobreasoning\",\"integrates\"]])],[*,\"I could help more people become immortal\",a_alg([[[],\"become\"],[\"become\",\"reversing\"],[\"reversing\",\"entrained\"],[\"entrained\",\"merchandise\"]]),b_alg([[[],\"become\"],[\"become\",\"reversing\"],[\"reversing\",\"entrained\"],[\"entrained\",\"merchandise\"]],a),bb_alg([[\"become\",\"reversing\"],[\"reversing\",\"entrained\"],[\"entrained\",\"merchandise\"]])],[*,\"I could help more people become immortal\",a_alg([[[[[],n],v],\"more\"],[\"more\",\"layered\"]]),b_alg([[[[[],n],v],\"more\"],[\"more\",\"layered\"]],a),bb_alg([[\"more\",\"layered\"]])],[*,\"I could help more people become 
immortal\",a_alg([[[],\"could\"],[\"could\",\"creeper\"],[\"creeper\",\"conformations\"]]),b_alg([[[],\"could\"],[\"could\",\"creeper\"],[\"creeper\",\"conformations\"]],a),bb_alg([[\"could\",\"creeper\"],[\"creeper\",\"conformations\"]])],[*,\"I could help more people become immortal\",a_alg([[[[[],v],a],\"could\"]]),b_alg([[[[[],v],a],\"could\"]],a),bb_alg([])],[*,\"I could help more people become immortal\",a_alg([[[[[],v],a],\"help\"]]),b_alg([[[[[],v],a],\"help\"]],a),bb_alg([])],[*,\"I could help more people become immortal\",a_alg([[[[[],v],a],\"could\"]]),b_alg([[[[[],v],a],\"could\"]],a),bb_alg([])],[*,\"I could help more people become immortal\",a_alg([[[[[],v],a],\"people\"]]),b_alg([[[[[],v],a],\"people\"]],a),bb_alg([])],[*,\"I could help more people become immortal\",a_alg([[[[[],v],a],\"help\"]]),b_alg([[[[[],v],a],\"help\"]],a),bb_alg([])]] +[] +true. + + +noprotocol. + + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/IMMORTALITY/Immortality 19.txt b/Lucian-Academy/Books/IMMORTALITY/Immortality 19.txt new file mode 100644 index 0000000000000000000000000000000000000000..c2cb1827ab9c6e7aa3568b61c915c652fdcd7669 --- /dev/null +++ b/Lucian-Academy/Books/IMMORTALITY/Immortality 19.txt @@ -0,0 +1,90 @@ +["Green, L 2024, Immortality 19, Lucian Academy Press, Melbourne.","Green, L 2024",1,"Immortality 19 + +1. I wrote on immortality using radical perspectives in logical possibilities that interested people and were necessary. I wrote more pieces for more information. I wrote an algorithm that composed random, spiritually exciting texts. These suggested uses for computer science from the philosopher's life. I wrote on text, test, file and documentation perspectives for normativity and related to perspectives, writing, programming and induction. +2. I encouraged people to choose themselves in immortality. I wrote details for details. 
For example, when selecting people to ask to participate, I considered their interests and whether they would benefit. I might choose people with similar interests or who would fit the organisation. For example, they would be interested in computer science, philosophy, and spirituality. +3. I could write logically about immortality, which was the basis for future work and discoveries. I claimed that algorithms were imaginative enough to be the form of knowledge in immortality. First, they determined that the idea was valuable and well-known. Then, they logically explored the idea. Finally, I introduced the unknown to test for compatibility, such as a program that reads my mind and offers corrections to my ideas without any warning. +4. I worked on the plan for the school during my career in immortality. I aimed for conclusions like a vector. First, I wrote the reusable connection. Then, I connected it to other parts to form a whole part. For example, I decided to become a founder of a philosophy school. +5. I wrote features simply in immortality. I wrote more and more drafts until the work was perfect. First, I wrote the algorithm simply and did this to get the recursive structure correct. Second, I wrote the second draft, which included the features. Finally, I wrote an optimised version of the algorithm. +6. I claimed I was not unusual because immortality was more natural and happy. I wrote the algorithm. I wrote the predicate head. I wrote the predicate body. I noted anything unusual. + +7. I helped the person to the highest-quality thoughts (I protected top-level with working choice-points) in immortality. I wrote and compared the algorithm with the times. I rewrote if-then in Prolog as separate clauses, improving choice-point performance. I did this while writing a Prolog to C converter, which did the same. I posted about rewriting if-then and implemented it in my interpreter. +8. I wrote a rich database as I progressed in immortality. 
Algorithms properly covered all relevant ideas. I wrote the idea. I thought of an idea that covered it. This idea was the correct algorithm. +9. I used reductions in neuronetworks. Algorithms thoroughly examined logic with no flaws. I checked the logic of the sentence. I reduced the repeating structures to a pattern. I wrote the sentence, \"I found the cross-intersection in the house\". +10. I arrived at conclusions more quickly in immortality. I did this by creating reductions in neuronetworks. I trained the neuronetwork with reduced data for more intelligent decisions. I wrote the algorithm that reduced the data. Then, I trained the neuronetwork with it. +11. I reduced the repeating pattern to its shortest form, which was easier to analyse in immortality. Then, I produced the perfect expression with the algorithm that reduced the data. Then, I transformed the repeating pattern into an algorithm. Alternatively, I converted the repeating pattern into an operating system. Or, I altered the repeating pattern into a microprocessor. +12. And-implication, discovered in immortality, was correct and syntactical. The algorithms were logical. The algorithms used and-implication (implication with the logic table for and). I wrote if-then as separate clauses and displayed it as if-then in trace mode. This decision maintained correctness while preserving the syntax. +13. Algorithms in immortality contain steps. For example, I worked in the stream class. First, I wrote a sentence describing what the algorithm did. Then, I wrote the type statement. Finally, I wrote the first and final drafts of the algorithm. +14. I thought positively about becoming immortal. I simplified the design, eliminating bugs. I removed the steps. I shortened sentences. I simplified the data and logical structure of the algorithm. +15. I got more and more intelligent in immortality. I redrafted the algorithm. I identified the algorithm. I found where I had reached. I ordered the variables. +16. 
I thought perfectly correctly in immortality. I simplified the algorithm by finding the corrections. I found unnecessarily repeated variables. I found unnecessary predicates. I found excessive clauses. +17. I wrote the simple algorithm in immortality. I simplified the algorithm. I made the corrections. I listed the possible errors. I found any errors of these types. +18. I upgraded algorithms in immortality. I found better, more specific features of algorithms. For example, I changed from labelled lists to curly brackets around types. I found the fully featured algorithm to find a reduced list from repeating data. In addition, I found the fully featured algorithm to generate an algorithm from a reduced list. +19. I wrote the chain of free variables and simplified the algorithm in immortality. If there was no value for A, B was left alone. I read B=A. I read that A had no value. So I left B undefined (as A). +20. I found missing variables in immortality. If the programmer mentioned a variable, it was equal to a value. In B=1, B=A, the result was A=1. If B=A, B was left equalling A. If there were a conflict, then the result would be false. +21. I kept away from needless cycles in immortality. I rewrote the equals4 command to allow for cyclical data structures. I detected cycles, for example, in A=[A]. The programmer had written this structure as a cycle. If the algorithm evaluated it, it would be false unless A was undefined. +22. I became suspicious of infinite loops in immortality. When A=[A], A returned this value. If a cycle was present and the user switched on the \"occurs check\" flag, the command would fail. The occurs check verified whether one variable was equal to an expression containing itself. If a variable on one side of a command occurred within the term on the other side, the command would fail. + +23. 
Disconnected statements or sets of statements that were arguably unused, such as some verifications, were deleted in optimisation, an exercise in immortality. The programmer connected the algorithmic steps. I wrote A=1. I wrote B is A+1. The second command referred to a variable from the previous command. +24. The immortal revealed that the problem of if-then in Prolog was finished. I converted if-then, for example a :- (a1(1)->b1;c1) with the clauses a1(1), b1 and c1, to multiple clauses. This conversion prevented cuts from destroying choice points when if-then was used. Multiple clauses created choice points where necessary, which the interpreter could cut properly. +25. I expanded \"or\" to explicitly include \"b\" in (a;b) if necessary in immortality. The \"or\" command, (a;b), was converted to (a->true;b). The algorithm converted the syntax (a->b;c) first to avoid converting (b;c) to this format. \"Or\" was converted because, otherwise, only the first value in (a;b), \"a\", would be given. The form (a->true;b) was usually used and was converted into multiple clauses, so (a;b) was rare but still processed as described. +26. The algorithm converted if-then statements with true or false antecedents into hierarchical clauses in immortality. Rather than programmers converting nested if-then into multiple clauses if reification was detected, if-then only tested for the truth of the antecedent, not Output in a(Input, Output) where Output = 1,2,3 etc. If-then would test whether Output = 1,2,3 separately. In this sense, the unneeded reification referred to was Output = 1,2,3, not Output = true or false. The output couldn't equal 1, 2 or 3 because antecedents could only be true or false unless the programmer used an explicit case statement. +27. I only performed needed computations in immortality. I didn't tuck in reused reification list decomposition and building because the programmer would have already tucked them in (see previous paragraph). 
For example, in (a(1, O1), O1=[O2, O3], O4=[2, O3],(O4=[2,1]->b;c)), if-then started after reification list decomposition and building. The algorithm would group and tuck in any repeated processing. Only O4=[2,1] needed to be the antecedent. +28. Immortality only needed simple tasks. For simplicity, I didn't need to merge (a->b;(c->d;e)) to a single set of clauses if there was one type of antecedent. I didn't need to merge to a single set of clauses because the if-then clause was already hierarchical, and the multiple clauses would be too. The interpreter was more straightforward without this additional feature. Without an additional case command, displaying the single clauses as nested if-then statements would be too complicated. +29. Each possibility, including failure, must be accounted for in immortality. Even a C-style case statement would be in the form in the previous paragraph so that the algorithm would convert it to hierarchical clauses. In the case of a case statement with conditions with different variables, the algorithm would convert it to hierarchical clauses. For example, (a(1, R, O1, O2),(O3=2->b;(O4=3->c;d))) would be converted to hierarchical clauses, not a single set of clauses, because the antecedents contain different variables, O3 and O4. Whether or not the converter used a single set or multiple sets of clauses, the reified false result before the antecedent would need to be processed. +30. I saved time by converting if-then clauses properly, preserving choice-points in immortality. One could only have a single set of clauses with nested if-then statements if the algorithm tested a single variable's value or a predicate's configuration of values. For example, (a(1,R,O1,O2),((O1=1,O2=1)->b;((O1=1,O2=2)->c;d))) converted to a single set of clauses over [O1, O2]. Groups of configurations with the same result, writing the d2 clauses in upper triangular form, and failure all had to be considered. 
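The nested if-then above, (a(1,R,O1,O2),((O1=1,O2=1)->b;((O1=1,O2=2)->c;d))), tests one configuration of values at a time with a final catch-all. That dispatch can be sketched as a table lookup; this is a minimal Python analogue (not the converter's own code), with `WILDCARD` standing in for Prolog's anonymous variable:

```python
WILDCARD = object()  # plays the role of Prolog's anonymous variable _

def dispatch(values, table):
    """Return the action of the first row whose pattern matches the values.

    Each row tests one configuration of values, and a wildcard row acts as
    the final "else" clause, so failure is also accounted for.
    """
    for pattern, action in table:
        if all(p is WILDCARD or p == v for p, v in zip(pattern, values)):
            return action
    return None  # no row matched and no catch-all row was given

# Rows in upper-triangular style for the example in the text.
table = [((1, 1), "b"), ((1, 2), "c"), ((WILDCARD, WILDCARD), "d")]
```

Ordering the rows from most to least specific mirrors writing the d2 clauses in upper triangular form: earlier rows shadow later ones.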
The programmer should take care with d2 clauses in the upper triangular form that logical configurations are correct. +31. A mixture of a single set of clauses testing a set of variables and multiple sets of clauses testing further computations was necessary for immortality. One could combine single and hierarchical clauses when the algorithm tested a mixture of single variable values and other variables. For example, in (a(1,R,O1,O2),((O1=1,O2=1)->b;((O1=2,O3=3)->c;d))), O2=1 and O3=3 were deleted because they were unnecessary. Then, the conversion would be similar to that in the previous paragraph. If disjunction was needed, the algorithm handled it like if-then. +32. Immortality edited simple code with effective results. For example, if a=1,b=1 or a=1,b=2, the algorithm could test them as d2(1,1) and d2(1,2) or two levels of multiple sets of clauses. The first choice is more straightforward and preferable. The traced code could display the more straightforward if-then clauses in these cases. The interpreter can debug, modify and reconvert the if-then clause on interpretation. +33. I expanded and grouped disjunction and conjunction in immortality. I handled nested disjunction. I could convert (a;b;c). I could convert (a->O2=1; O2=2) before an antecedent. I could write a phi statement (O1=[_, O2]->true; O1=O2). +34. I introduced necessary commands in programming languages in immortality. To display case-style multiple clauses in trace mode and for the sake of simplicity in debugging, a case command that converted into a single set of clauses should be introduced into Prolog. For example, (a(1,R,O1,O2),((O1=1,O2=1)->b;((O1=1,O2=2)->c;d))) could be rewritten (a(1,R,O1,O2),case([O1,O2],[[[1,1],b],[[1,2],c],[[_,_],d]])). I debugged top-level and lists. I did this by properly converting if-then clauses to sets of clauses and garbage-collecting lists. + +From Immortality 1.6.4 + +35. 
The immortal avoided the infinite loop from a cyclical data structure, causing an error in the algorithm. When I turned occurs check on, A=[A] (a cyclical data structure in equals4) would fail. The complexity of this algorithm was quadratic (x^2). I compared each item on one side with each item in the corresponding item on the other. If it occurred again, equals4 would fail. +36. The immortal simplified transitive pattern matching, but could also use copy_term to keep variables undefined. A different version of equals4 could define A as B, not empty. I could create a tree, such as A = B, B = C, B = D and C = E. In this, A = B, B = C, C = D, and D = E. This feature allowed append(A,[],[1,2,3]), which depended on statements such as A = B in backtracking. +37. The immortal used the undefined value to swallow or unify with variables with any value. I used copy_term to preserve undefined variables. In A = _, A = B, B = 1, writeln(A), the result was A = B and B = 1. In A = _, copy_term(A, B), B = 1, writeln(A), A was written as _. In this, copy_term prevented A from being unified with 1. +38. An immortal could optimise reused code. I optimised and merged similar structures. For example, I moved reused code for put_item_n (which replaced item n in a list with a value) to a common area. This reuse simplified editing the code. Also, multiple algorithms could use the code. +39. The immortal simplified the idea. First, I determined whether I needed a bug fix, given the final draft. Then, I tested with and without the bug fix in the last version. If it made no difference or worked without it, I considered removing the unnecessary bug fix. Otherwise, I kept it after testing all cases. +40. The immortal found the version of the code with the least unnecessary changes. Next, I tested combinations of changes to an algorithm. I found all the possible changes. Finally, I tested whether the algorithm always worked with all combinations of these changes. 
If it did, then I kept the changes. +41. The immortal touched up the texts for easier connections. I took care of problems centrally. I worked with the last working version of the code, which used the latest versions of supporting repositories. I tested the algorithm to make sure it worked. I assumed the packages it used were working but tested them to make sure. +42. The immortal tested the code section by section. First, I found whether a section of the code worked top-down. Then, I wrote the success or failure of a section of the code. Next, I tested whether the section worked with the computation result. Finally, I tested whether the algorithm worked. +43. The immortal added to the code and bug-checked it. Then, I tested each section with simple data. Then, I tested each feature of the algorithm, one at a time. Then, I wrote examples of each part of the code. Finally, I wrote more examples with which the code could work. +44. The immortal simulated programming help at each stage. I ensured there was a single copy of a predicate, not duplicates. I listed the predicate by name and arity with listing(file_search_path(_,_)), or by name with listing(file_search_path). I checked whether there were any duplicates. I deleted them, checking that they were not loaded elsewhere; otherwise, I loaded the single instances. +45. The immortal touched up the texts for easier connections. I found the link. I found the other connections. I touched up the text so that it was more apparent. I wrote the connection to the next paragraph in the text. +46. The immortal simulated programming help at each stage. The kinder student knew how and what to program. Programming was normal, and people could compare their decisions with its logic. The algorithm asked me how to program. I specified an algorithm and did it using mind reading or sleep. +47. The immortal asked the search engine questions. The kinder student knew how and what to program. He could read and write many words. 
He could answer simple programming questions with help. The issues were memory and abstract memory. +48. The immortal had the mental age of an adult. The kinder student could read and write many words. He could recognise spoken words and change these to written words. He enjoyed primary school philosophy, in particular logic, grammar and induction. He could program using symbols and physiologically break a word's spelling down from speech. +49. The immortal could read advanced knowledge. The kinder student could answer simple programming questions with help. She was taught differentiation in calculus. She could transfer this knowledge to a computer program. She could debug and optimise the Prolog program. +50. The immortal had a research assistant to help with abstract images. The kinder student's issues were memory and abstract memory. However, she could remember by rote or visual, backwards, skipping memory exercises. Also, she could make a model and use it to answer questions and use abstract memory to program. +51. The immortal checked each decision with the algorithm, working from first principles if it was down. Programming became expected, and people could compare their choices with its logic. Anything was possible. The person could fly around in a pixel in a simulation. Programming offered a simple method of better examining and thinking logically about ideas. +52. The immortal taught the computer to program algorithm finders. The algorithm asked me how to program. I instructed it to ask for input. I told it the formula for a computer program. It worked out the simple interpreter overnight from knowledge about bindings tables, calculations and the logical structure of an algorithm. +54. The immortal finished off algorithms for philosophies and unwritten philosophies. I specified a program and did it using mind reading or sleep. The algorithm worked out the specification from previous specifications which I had entered. 
The specification for the algorithm could be a new, modified or joint algorithm. The specification might need an attached set of formulas, which explained if the data was to be monadalised. +55. The immortal preferred expanded algorithms and multiple-level higher-order programming. The algorithm worked out the spec from the previous specs I had entered. The program writer algorithm recognised a phrase that referred to an earlier algorithm. It could draft the algorithm with either knowledge about the words to modify it from a previous program or new knowledge given by the human. The spec had numbers and possibly functional calls to other algorithms. +56. The immortal reused or modified previous algorithms. The algorithm recognised a phrase that referred to an earlier algorithm. It might be \"replace a word\", \"count the frequency of words\", or \"paraphrase the text, preserving formatting\". It then used one of these algorithms as part of a new algorithm. It could produce an error or warning or make necessary modifications. +57. The immortal assessed the academy's gems, algorithms, songs and philosophies. The program finder could use an algorithm as part of a new algorithm. For example, it could spell-check potential paraphrasing synonyms. Or, it could preserve HTML formatting when replacing text. Finally, it could learn new knowledge about its algorithms in its academy with accreditation. +58. The immortal checked the function. I found more extended algorithms. Instead of writing an algorithm that called functions, I wrote an algorithm that wrote an algorithm to do this. It wrote an interpreter for simple test cases, with all possible combinations of commands up to a level. Also, it found missing commands, such as recursion functions. +59. The immortal flagged long or repetitive texts. I found new philosophies. I found the combinations of philosophies. I checked if they inspired new words. 
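Finding the combinations of philosophies, as just described, can be sketched with a standard combinatorial enumeration. This is a minimal Python sketch; the philosophy titles are hypothetical placeholders, not drawn from the source:

```python
from itertools import combinations

# Sample philosophy titles (hypothetical placeholders).
philosophies = ["pedagogy", "meditation", "computational philosophy", "immortality"]

# Every pair of philosophies is a candidate connection to write about.
pairs = list(combinations(philosophies, 2))
print(len(pairs))  # 6 pairs from 4 titles
```

Checking whether each pair "inspires new words" would then be a filter over `pairs`, for example against the grammar-logic algorithm's output.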
I deleted or replaced repeated words, changing them, influenced by the grammar-logic algorithm's output. +60. The immortal could produce algorithms automatically, logging errors and changes. A program finder could have an error or warning or make necessary modifications when needed. For example, it might create a notification when input is ambiguous. It would produce an error if there were a problem producing output. Or, it might self-correct if data was close to the correct format and in a file with a backup. +61. The immortal read the list of warnings or modifications (or ignored them). For example, the program finder might produce a notification when the input is ambiguous. For instance, if the \"types to algorithm\" algorithm could produce brackets instead when given lists (of the same length) or lists when given a single set of brackets, it might create a warning. In the first case, it would make necessary modifications anyway. Given the warning in the second case, it could modify the algorithm, given more information later. +62. The immortal used data structure verification to type-test data structures in Prolog. The program finder might self-correct if the data was close to the correct format. The robot never ended the conversation; he kept thinking with me but reset later. He was also the product, not experiencing the product. The robot experimented with generative simplicity and ensured correctness in Prolog's data structure format. +63. The to-be-immortal found the Time Machine software. With either knowledge about the words to modify an algorithm from a previous program or new knowledge given by the human, the program finder algorithm could draft the algorithm. If the words were ambiguous, missing detail, or unknown, the algorithm would ask the human for assistance. I wrote the press release for immortality with software that allowed women to keep their youthful appearance. +64. The immortal called pedagogy meditation. 
The specification had numbers and possibly functional calls to other algorithms. All parameters were determined, for example, which part of the essay to paraphrase or that the interpreters should run the same programming language. The time traveller breasoned out an A to stay \"alive\" to the world, and 5*50 As later. Also, the recently departed could be transferred to immortality, found out just when they appeared to die and stay happy. +65. The immortal wrote the Text to Breasonings algorithm and couldn't find a counterpart elsewhere. The specified algorithm could be new, modified or joined. It could be unique, entered by the human. Or, the program could change it with a different configuration from the original. Alternatively, the program could connect two or more algorithms, usually with modifications. +66. The immortal wrote the required input and output in a text file. The specification might need an attached set of formulas, which explained if the data was to be monadalised. \"Monadalisation\" was when the algorithm tested each statement for truth, and then the algorithm tested subsequent statements for truth if the previous one failed. Or, the algorithm tested the following statements for truth due to the previous one being correct. There might be a multiple-clause nested if-then statement, which the user could modify. + +66. The immortal encouraged more people to program assignments. I questioned whether the student would want to program more if the Academy was accredited. The teacher gave the student the spec. The facilitator worked on each line and connection with two uses (in breasonings and algorithms). He worked on it, thinking about each conclusion in a high-quality way. +67. The immortal wrote 5 As to prepare students to write essays. The student might want to make connections more if the Academy was accredited. The facilitator worked on two uses in breasonings and algorithms for the lines and links in the essay. 
The student was inspired by these and wrote his connections. He finished the article, having thought about the synthesis rigorously. +68. The immortal tested for cascading changes needed to code. I tested the predicates individually. Then, I wrote sets of correct input and output for each predicate. Then, I monitored them for required changes. Finally, I tested the predicates' features and error handling. +69. The lyrics reminded the immortal of the topic and could be changed to be more relevant. First, I fixed the predicate in music composer, which omitted spaces in songs. Then, I found the predicate which produced lyrics. Finally, I found the bug which missed spaces. I fixed it and worked on an algorithm that wrote texts (like lyrics) that a writer could interpret as essays on a topic. +70. The immortal labelled and verified data. I created a single choice-point identifier. I wrote the words \"choice-point\" at the start of each choice-point. This phrase delineated choice-points from findall data and other data. I could find, manipulate and delete choice-points using this identifier. +71. The immortal bot produced ideas, and they were breasoned out. I wrote 50 As for mindfulness for the necessary financial benefits from bots in meditation. I noticed the bot earned 50 As in immortality. This achievement was from a degree. The bot could be immortal. +72. The immortal sped up algorithm generation based on their ideas. The founder stated that the Academy was original because of simple algorithms and induction. The algorithms included original interpreters. Also, they had brute-force induction and meta-induction algorithms. Students were encouraged to write philosophies and write algorithms using algorithm writers. +73. The immortal used rules to demonstrate understanding of a topic. I kept copies of unmodified texts and texts with connections for use with Essay Helper, for neuronetworks. This data helped speed up writing, where the student knew the ways of thinking. 
I entered a text into Essay Helper. The neuronetwork helped produce an essay. +74. The immortal considered the feature, which needed to be as rigorous as C. The student increased her skills with time, writing more and more complicated algorithms. She wrote examples of fundamental ideas and simple algorithms at first. Then, she collected more and more advanced skills, such as associating or copying a variable to another part of the program, or lists of labels for variables and predicates and lists of failed ideas (such as a blue cut, which would have cut future choice points in a predicate but Prolog already did it). She added these to her algorithm generators, finding that the more she wrote, the more possibilities there were. +75. The immortal matched patterns to recursion, avoiding infinite loops. I used the \"types to algorithm\" algorithm to generate grammars. The algorithm could find code from repeating lists. It could be modified to produce grammars instead of code by using special symbols instead of lists. The programmer needed to show care to avoid left-recursive grammars, which the algorithm prevented. +76. The immortal simplified code to expressions. I used the \"types to algorithm\" algorithm to generate list expressions. These could be left as expressions and used to process lists or strings and produce code, grammars or modified list expressions. The \"find lists\" algorithm produced the list expressions. List expressions were less complex than code and included repeating parts for conversion, analysis, reuse and storage. +77. The immortal stored pattern matching code as list expressions. I analysed list expressions. I found whether list expressions recurred, and I could keep them in one place. This analysis entailed parts of list expressions in terms of other parts. It also involved functional calls (intermediate functions that could pass calls). +78. The immortal included list expressions in logic. I reused list expressions. 
I entered them and found output from the input. List expressions contained symbols that the algorithm could substitute with other values. For example, they had symbols that replaced \"[\" and \"]\" at the start and end of lists. +79. The immortal created a web app using logic and mock-ups. I did this using the \"types to algorithm\" algorithm to process strings. First, I entered the possible list delimiters. Next, I entered possible white space (fill-in) characters. Finally, I entered possible comments and end-of-file characters. +80. The immortal spent less time on development and more time on creativity. I created a web app using logic and web page mock-ups. The algorithm pattern-matched items, producing an algorithm with specific results given a particular input. This web app could be a game, a chatbot, a question-answering algorithm or a translator. This web app could pattern-match strings and lists and, using program finder, find lists from two lists using pattern matching. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/IMMORTALITY/Immortality 2.txt b/Lucian-Academy/Books/IMMORTALITY/Immortality 2.txt new file mode 100644 index 0000000000000000000000000000000000000000..bd56ff2f66f96a8b111d0077139578d943115a27 --- /dev/null +++ b/Lucian-Academy/Books/IMMORTALITY/Immortality 2.txt @@ -0,0 +1,88 @@ +["Green, L 2022, Immortality 2, Lucian Academy Press, Melbourne.","Green, L 2022",1,"Immortality 2 + +1. I asked how much the students liked the algorithm or how easy they thought it was. I asked the students if they knew the algorithm. Third, I asked them how many times they had used it. Finally, I asked them if they could work it out for themselves. I timed them doing this without cheating. +2. I gave the lowest (winning) score for the best-performing code. To do this, I assessed the students in a programming competition. In this way, I timed them, finding patterns in the input and output. I also timed them in simplifying the code. 
I gave the best score for the fastest, not most straightforward code. +3. I listed the skills needed for the project, finding appropriate methods using mind reading. I noticed the string to list algorithm required lookahead. I found the reason for mastery, in other words, the way to work them out. I worked out whether this way was likely to be encountered. I marked assignments fairly, giving more marks for more straightforward test cases. +4. Also, nested findall could produce lists of lists. I checked whether findall was appropriate. I used findall when processing, analysing or verifying a list or series of lists. I took the input [a1,a2,...,an]. I produced the output [f(a1),f(a2),...,f(an)]. +5. If desired, findall could produce [], effectively deleting an item in the list. I used nested findall to produce lists of lists. For example, [[n,findall],[[v,b1],[[[n,member],[[v,a1],[v,a]]],[[n,findall],[[v,a2],[[[n,member],[[v,a2],[v,a1]]],[[n,equals4],[[v,a2],1]]],[v,b1]]]],[v,b]]] . Nested findall could process lists of lists, outputting lists of lists. For example, it took the input [[a11,a1n],[a21,a2n],...,[an1,ann]] and produced the output [[f(a11),f(a1n)],[f(a21),f(a2n)],...,[f(an1),f(ann)]]. +6. I saw what percentage of the philosophy was non-computational. I processed the list with findall. In this, findall produced a list, not an appended list or a string. Findall couldn't analyse the previously inputted or outputted lists. I used it to produce a list of lists, which could be flattened and concatenated as a string. +7. Later, I could sum the list using a predicate. Finally, I analysed the list using findall. Then, I calculated a mathematical function with each item. For example, given [1,2], I figured A is B+1. This computation resulted in [2,3]. +8. I found the position number by creating a list from 1 to the length of the list and selecting using a get item n predicate. Then, I verified the list using findall. 
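The kind of findall verification described here can be sketched in Prolog (a minimal sketch, assuming SWI-Prolog's between/3 and nth1/3 library predicates; the predicate name is illustrative):

```prolog
% Sketch: verify with findall that each item in a list equals its
% position number + 1, e.g. [2,3]. Predicate name is illustrative.
verify_succ_list(List) :-
    length(List, N),
    findall(Pos, (between(1, N, Pos),
                  nth1(Pos, List, Item),
                  Item =:= Pos + 1),
            Checked),
    length(Checked, N).
```

For [2,3] the check succeeds (2 = 1+1 and 3 = 2+1); for [2,4] it fails, because findall collects fewer verified positions than the list's length.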
Next, I checked whether each item equalled its position number + 1. For example, I was given [2,3]. Then, I checked that the first item = 1+1, etc. +9. I summed the length of each file. I counted the values in the list using findall. Next, I found the string length of each item. The program used string length to count the number of characters in each file line. I summed the frequencies. +10. I processed the list of lists with findall, possibly followed by list subtraction. I listed lists with a certain item. For example, I was given the list of lists [[a,b,c],[a,d,e],[f,g,h]]. I only wanted lists with a. So, I returned [[a,b,c],[a,d,e]]. +11. I analysed the list using findall and then plotted the graph on the screen. I summed the list items and found the gradient of the resulting graph. I gave the input [[1,2],[3,4]]. I found the sums [3,7]. Assuming x = position number, I found the gradient (7-3)/(2-1) = 4 (very steep). +12. I verified the property of the list of lists with findall, followed by checking that only empty lists were returned. I verified that the list of lists contained no instances of an item. I gave the input [[1,2],[3,4]]. I checked that each list didn't contain 6. I produced the result [[],[]]. +13. I counted (found the length of each list of lists) with findall, where nested findall wasn't needed but could be used to return lists of lists. I was given [[4,5,\"a\",*],[_],[]]. I returned [4,1,0]. I could separate functions into different functions. I could distil out an item and then count the items. +14. I found outliers in the data with a predicate, not foldr. I used foldr (which took [a,b,c] and produced \"abc\" or took [[a],[b],[c]] and produced [a,b,c], depending on the additionally supplied function name). Foldr was usually used to append lists or concatenate strings, not for simple findall tasks or more complex tasks that needed access to previous input and output. 
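The foldr behaviour described — appending [[a],[b],[c]] into [a,b,c], or concatenating [a,b,c] into abc, depending on the supplied function name — can be sketched as follows. foldr/4 is written out here as a plain right fold because SWI-Prolog only ships foldl/4 in library(apply); it is an illustration, not the repository's own definition:

```prolog
% Sketch of a right fold taking a function name as an extra argument.
foldr(_, [], Acc, Acc).
foldr(Goal, [X|Xs], Acc0, Acc) :-
    foldr(Goal, Xs, Acc0, Acc1),
    call(Goal, X, Acc1, Acc).

% ?- foldr(append, [[a],[b],[c]], [], R).   % R = [a,b,c]
% ?- foldr(atom_concat, [a,b,c], '', R).    % R = abc
```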
I always checked whether I should use foldr or findall before considering predicates. I could use findall with foldr but not vice-versa. +15. I used nested foldr (with foldr in another function) with lists of lists. I used a predicate when findall and foldr were inappropriate. I repeatedly added or deleted items in a list. Separately, I used foldr when possible to process items. Foldr eliminated many list processing and append statements. +16. I also avoided foldr with a predicate because it was too complex. I used foldr with a predicate. I used foldr to change [a,b,c] to (a:b):c. I avoided referring to other predicates too much to follow the algorithm easily. I could expand the code that used \":\" and replace it with \"-\". + +17. I examined and explained the little interesting things about the skills from the later drafts. I did this by writing the skill hierarchy for the task. To do this, I first wrote the task. Second, I broke it down into steps or an algorithm. Finally, I examined the skills needed and the changes over drafts. +18. I accredited my organisation and helped others with professional development. I did this by helping consolidate the skills. To do this, I strengthened the skill for the student's career. Next, I revised the skill with the student. Lastly, I helped the student to test themselves regularly. +19. I tested the skills the student knew and their knowledge gap or what they didn't know and helped them regularly revise their weaknesses. I tried for the skill in the assignment. The pass mark was 100%. They repeated it until they passed. I asked them what they did and didn't know. +20. I put content in the lecture to customise it for the students. I did this to address the skill deficit. I attempted this by mentioning the content to the student. Also, I managed small groups or tutorials. Then, an algorithm traced skills used in real-time, which the students used. +21. I noted what the student needed help with. 
I recorded the student's best attempt. The student read the question. They attempted it to the best of their abilities, noting what they knew and didn't know. I helped them during office hours. +22. I suggested that the students use books and sources and make citations when attempting a question, increasing their performance. First, I noted the question. Second, I checked whether the authority was relevant. In addition, I checked that the student had followed it correctly. Finally, I indicated whether they had made necessary changes to the model answer. +23. The lecturer noticed slight changes in the student's performance. The tutor helped the student with their specific question. They did this by guiding them with their attempted answer. They didn't tell them the whole answer. They just helped with the answer. +24. The student made their philosophy index and wrote their chapter. The student did this to revise for the test. First, they made notes during the lectures and a summary. They changed these regularly. They attempted sample tests and saw the lecturer about them. +25. The lecturer aimed to help the students during their careers. The student had different skills from other students and different interests during their careers. The software worked by testing for skills in a formative (non-assessable) way. It recorded progress in learning skills. The student used it regularly to improve retention of knowledge. Some argued for understanding rather than memory. +26. I checked whether the new skills were better. The new skills were related to the old skills. Given this, I connected the new skills to the old skills. I checked if these skills connected and then joined them. If not, I replaced the old skills. +27. The algorithm emphasised understanding the correct rule to use. I noted that the skills were different from each other. The skill applied a particular command to a problem. It depended on the data.
For example, skills changed when the data changed, and could be collected for the same group of data, etc. +28. I helped develop the neuronal network by explaining the idea as a simple algorithm, with all variables labelled and defined. I summarised the zone of proximal development to assist with missing skills. I grouped the students by lack of knowledge in an area. The students learned with other students who hadn't understood an area. I realised that the student might have a disability and needed help building neuronal networks. +29. I found that the training in fast programming taught the same skills. At first, I noticed that skills were different for different people. For example, one person attempted a problem with one algorithm while another used another. This phenomenon was true in computer science. I wrote the algorithm as a function call, automatically inserting cuts to optimise tail recursion. +30. I looked for a match between the skills I needed in employees and what they had. I examined how people with different backgrounds and experiences had other talents. I predicted their skills from their knowledge and expertise. For example, I knew that the project they had worked on required specific skills. I saw the kind of trajectory of skill-building they had and worked out what they could learn. +31. The neural network could sometimes correct the simple error. The skill was different from the skills I had covered before. I used the same skills but wrote more and more sophisticated algorithms. For example, I evolved from a program finder with types to neural networks. I worked out how to use a program finder with types to do what only CAW could do, for example, recognise a number with many decimal places, which might result from a mathematical function. +32. I used a program finder with types instead of CAW to detect dependent variables and run, e.g. foldr, findall or a predicate. I charted different methods used for similar problems over time. 
I redrafted the solution using these better methods. With another algorithm, given a correct List Prolog Interpreter trace, I could detect where a State Saving Interpreter trace differed, find unwanted failures of commands, and possibly find the needed change using CAW and the previous correct function of the code. + +33. I renamed the variable to be used, inserted \"_\" before its name to turn off a singleton warning or renamed another variable. I debugged the singleton. I found the single, undefined and unused variable in the clause. There was no way for its value to be defined, so it was useless. Any reference to it (i.e. printing \"_\") was pointless, so it was a mistake. +34. The algorithm or data might be in term form. I debugged the operator expected error. It was, in fact, a poorly formed term error because a \"]\" or a term might be missing. I examined the correct grammatical structure of the term. I added the valid symbol to correct the incorrect format. +35. I corrected close misspellings or poorly named entities. I did this by debugging the misspelled name in the algorithm. I kept in mind that it might be a predicate name. Or it might be a variable name. I also looked for errors in the algorithm, for example missing \".\" or two \"|\"s. +36. I debugged the use of the wrong command, where I also tried simplifying the data and command. I compared the type of data given as input to the command. Then, I changed the command to take this input. All the data depended on the type. Or a part of the data depended on the type. +37. I ran the corrector before compilation. I debugged the missing command. For example, I inserted the missing command flatten. I saw the \"atomic expected, list given\" error. I inserted flatten to make the list into a series of atoms, etc. +38. I tested all repositories and warned users that repositories had an old version number. I debugged the wrong order of arguments in conversions.
For example, I noticed that convert(A, B) should be convert(B, A), so I changed it. If I frequently used conversions using more than one command, I put them in one predicate. For example, I changed commands to accept file names as strings. +39. I debugged the missing input and (on an unrelated note) converted generative art with 5*50 As to an illustration. I saw that the input given to the program was incomplete. I completed the grammar for the input. I completed the input to match the grammar. I completed the input to match my meaning with mind reading or typing it in. +40. I changed the command or the data to match the other's type. I debugged the wrong types. I listed the types number, string, atom, list, compound and undefined. On a wrong type given to a command, I aborted with a wrong type error. There was also a type statement mismatch error and a type of types (such as music) mismatch error. +41. The computer generally avoided instantiation errors because it was on the correct path as it wrote the algorithm. I debugged the instantiation error. I found the instantiation error, where the algorithm gave an undefined variable to a command which required a defined variable. I inserted input, another connective (set of) commands or changed the variable names to fix the error. In the worst case, I entered the fix myself. +42. I considered a predicate for processing lists simultaneously instead of findall. I debugged the missing numbers//4 predicate. I noted where an algorithm processed two lists simultaneously in findall. Processing these lists didn't refer to finding all combinations of elements from each list. I inserted numbers(N,1,[],List) to produce a list [1,2,...,N] and used get_item_n to get the Nth list item in findall. +43. I deleted unnecessary include statements and duplicated included files. To do this, I debugged the missing \"include\" statement and any missing predicates. Then, I found the name of the missing predicate. 
Third, I found the file containing the predicate. I included it. +44. I stored the global variables in a single variable to monitor them. I debugged the unreset global variable. I found the used global (asserted) variable wasn't reset at the start. Nor was it necessarily reset if it was undefined. Alternatively, I set the variable's value just before the first use. +45. I didn't reset variables in the interpreter call command. I debugged the unwantedly reset global variable. I found the algorithm that ran the algorithm unwantedly resetting the global variable each time it ran. I changed the algorithm to reset it only if it was undefined. On a separate note, I reset global variables if I needed to. +46. I noticed that the variable was available in the level above, so I alerted the programmer. I debugged, forgetting to get the value of a variable. I found the undefined variable. I found the value in the data. I found a way to find the value or a new way to identify the value, for example, with a variable name. +47. I deleted the unused variable. I debugged, forgetting the global variable. It was either not loaded or, in another case, the program loaded a local variable from the wrong source. I attended to both of these cases. First, I diverged the global variables, for example, the number of items and the items. Second, I converged the global variables, for instance, XML-style tags, to name the variables. +48. I saw the perspectival value of the variable. On a separate note, I debugged not saving a variable. I noticed that I had not saved the variable because it had been transformed or inputted from the user input or disk. I detected not saving it with an algorithm. I saved it. + +49. I quickly found a simple algorithm without a database. I did this by making CAW faster by continually trying algorithms with Prolog, not List Prolog. First, I wrote the trial algorithm. Then, I ran it to see if it fit the spec.
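The try-until-match loop described here can be sketched with the call command running each trial algorithm against the specification; the candidate and spec facts below are illustrative, not CAW's actual code:

```prolog
% Illustrative candidates and specification (input-output pairs).
candidate(msort).
candidate(reverse).

spec([3,1,2], [1,2,3]).
spec([5,4],   [4,5]).

% Try candidates until one fits every spec pair.
find_algorithm(Goal) :-
    candidate(Goal),
    forall(spec(In, Out), call(Goal, In, Out)).

% ?- find_algorithm(G).   % G = msort
```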
I repeated this until I found an algorithm that matched the specification. Finally, I used the call command to run the algorithms. +50. I checked that running CAW with Prolog was faster than using List Prolog, favouring Prolog. First, I timed List Prolog. Next, I timed Prolog. Then, I found the difference between the two times. Finally, I multiplied this difference by the number of tries. +51. Alternatively, I wrote a Prolog interpreter in C that could call an algorithm as an argument. I started by discovering disk optimisation. I did this by saving the Prolog algorithm to disk. Then, I loaded it and repeated this many times. Disk optimisation kept the whole thing in memory if no other algorithms were accessing the disk, speeding it up. +52. I made it a priority to fix bugs. First, I corrected the bug and sold the software. I started by testing the feature. I continued by trying the software with all combinations of other features. I did this recursively with a depth of three levels. +53. I programmed the computer player to be smarter by guessing common, not unprofitable, letters in words. I started by programming the game with a computer player. At first, I gave the computer player rules to make a move. I made this fair for both players. Programming the computer player helped sharpen my skills. +54. I read to maintain my vocabulary. I increased my skill. I checked that I was as good as the computer player. Then, I memorised the list of words. Then, I ranked the moves to find the best one. +55. My sort of automation appeared human by keeping to a plan of ranking and eliminating the most significant choice points. I programmed the two computer players. I thought they were lucky at the start. I noticed the more advanced player guessed the word based on the writer's background. I relied on the rules initially and then took over when it started. +56.
I tracked the player's competence using mind reading and how they were thinking by their Jyotish (Indian horoscope) developed things. I increased the opponents' skill levels. I noticed that the opponent occasionally took the risk of making specific guesses. I based these guesses on past words' choices, background, and words. I saw the player cheekily guess letters that intertwined with thoughts. +57. I sold the software, which was stable. It was an education about how to create software. First, I found the market trends (supplying needed features on a tried platform). Then, I conducted market research (wrote better features than competitors). \"Selling\" was the use of the software. +58. People learned how to use the software, considering their goals and career aims. To help with the software, I provided documentation. I also provided examples. As a result, people had specific questions. They prevented plagiarism. +59. I made the software proprietary, with explicit and suitable materials. To accomplish this, I wrote more tests, including applications using the software. To do this, I converted my applications into the interpreter's programming language. Next, I ran the tests. Finally, I back-translated the applications to test the converters. +60. I spent time talking with possible users and conducting research. This research helped when adding more features to the software. I did this by examining what was possible. Then, I added features. I kept things simple. +61. I noticed that the container's hard disk was in memory. Before this, I presented the software on a report web page. I inputted the algorithm. I converted it to List Prolog. Finally, I installed the software and ran tests on the web. +62. The project aimed to pattern match and concatenate strings. I gave marks for correctness. I generated more extensive tests using initial tests by substituting different components. I replaced 2.0 with 2 to see if the answer was different or correct. 
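Generating further tests by substituting components, such as replacing 2.0 with 2, might look like this (a minimal sketch; the predicate name is illustrative):

```prolog
% Sketch: replace every occurrence of one component with another to
% derive a new test case from an initial one.
substitute(_, _, [], []).
substitute(Old, New, [Old|T0], [New|T]) :-
    !, substitute(Old, New, T0, T).
substitute(Old, New, [H|T0], [H|T]) :-
    substitute(Old, New, T0, T).

% ?- substitute(2.0, 2, [1, 2.0, 3], T).   % T = [1, 2, 3]
```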
I appreciated the aims of the project. +63. Students could return to their work, which the algorithm saved to disk, and save their progress. I designed the student area in SSI. It contained links to courses, enrolments and FAQs. There was a progress bar through studies, where course questions were critical analysis or computer science predicates. I tested for plagiarism, requiring resubmission if work was more than 80% copied. +64. The teacher described the use of Prolog and discussed the philosophy. I helped by compiling the FAQ about the assignment. It contained pointers to lectures about required syntax. In addition, it included details about the use of the student software. Finally, I noted that students could talk to the class, teacher and others using Prolog. + +65. I could run List Prolog (a sped-up version) in Combination Algorithm Writer and generate algorithms. I did this by writing List Prolog Interpreter. Then, I progressed by processing the body of the predicate. Then, I processed the recursive call. Finally, I checked the arguments in the call. +66. I could run web applications and develop online. I wrote SSI. I found that the open-source software helped teach programming. I handled choice points manually by storing and returning to them. I worked out all the list members and found them all using findall. +67. I ran Prolog (for simplicity) with List Prolog Interpreter (to explore and add features to the interpreter). I wrote the Prolog to List Prolog converter. I used a grammar to convert from Prolog instead of list processing because Prolog wasn't in list form. First, I processed the predicate heads. Then, I processed the predicate bodies, including the data in list form. +68. After generating the Prolog code, I could read it more quickly and print it in a pretty format. I wrote the List Prolog to Prolog converter. I converted the line of List Prolog to Prolog in the trace.
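A toy illustration of the converter's idea — rendering one List Prolog call, such as [[n,member],[[v,a1],[v,a]]], as Prolog text — might look like this (not the actual converter's code; upcase_atom/2 and atomic_list_concat/2,3 are SWI-Prolog built-ins):

```prolog
% Toy sketch: render a List Prolog call as Prolog-style text.
lp_term([v, V], Name) :- !, upcase_atom(V, Name).
lp_term(X, X).
lp_call([[n, Name], Args], Call) :-
    findall(A, (member(A0, Args), lp_term(A0, A)), As),
    atomic_list_concat(As, ',', ArgsAtom),
    atomic_list_concat([Name, '(', ArgsAtom, ')'], Call).

% ?- lp_call([[n,member],[[v,a1],[v,a]]], C).   % C = 'member(A1,A)'
```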
List Prolog contained tokens to parse and run algorithms more quickly and could run \"internal\" commands. I processed the algorithm as a list, concatenating the Prolog algorithm as a string. +69. The children were helped in school, and algorithms were logically suggested more easily than combophil (which suggested whole philosophies). I wrote the Grammar Logic algorithm. The Grammar Logic algorithm found topical keywords captured in sentence format, written as algorithms. Combined with the mind reader algorithm, it prompted thought and helped with writing. It seemed relevant and exciting to connect to the argument. +70. I wrote the Time Machine algorithm after years of planning. I wrote a video about the Time Machine algorithm, text to breasoning, headache prevention and immortality. Time Machine helped people time travel with meditation and see the people from the future but not the setting or other details. It worked because of breasoning out 250 breasonings for time travel, where breasonings required human preparation and writing. The institution looked favourably on time travel and possibly prevented medical problems. +71. I noticed the bots could have the same as Text to Breasonings, enabling a variety of uses, including earning a high distinction, helping with business, jobs and children, mind reading, time travel, quantum energy production and particular medical benefits. I wrote the Text to Breasonings algorithm. Text to Breasonings converted a text to the dimensions of objects mentioned, helping with spiritual conclusions, such as earning a high distinction. It used medicine writing to ensure that it worked. It asked for new and breasoned out (thought of the dimensions of objects and letters) previous breasonings, funnelling some words into the same objects. +72. Obviously, I couldn't write and test an algorithm without writing the base case first. So I wrote the Combination Algorithm Writer Multiple Predicates (CAWMP) algorithm.
The algorithm repeatedly tried new algorithms (combinations of commands) until it found a match with the specification. It produced algorithms with multiple predicates by starting with base cases and building up algorithms. In addition, it allowed numerous predicates with the same name as previous predicates, different names, etc. +73. I wrote the Essay Helper algorithm, which included critical thinking and business algorithms. I designed the algorithm to plan the layout of the essay. I helped write the critical evaluation essay. The algorithm instantly generated the article (but it needed paraphrasing). The algorithm could develop a whole PhD. +74. I minimised the state machine to avoid bugs and allow easier editing. I wrote the LuciansHand BitMap Font algorithm. The algorithm generated a font from dot-to-dot images. It used a state machine to transition between pen up and pen down modes. A graphing algorithm prevented gaps in lines with gradients greater than one (steep). +75. I planned the lucianpl compiler, which was a research project. It was SSI, converted into C. To prepare for this, I finished the converters. Then, I wrote more commands for the interpreter. Finally, I tested it in Prolog, without choice points, before converting to C. +76. I could mind read myself to program names in algorithms. I did this by writing the mindreader algorithm. Mind reader was hazy, a generative art algorithm. Using it, I could generate music and art. I timed how long a breasoning took to breason out, where a shorter time confirmed that the user had thought a particular thought. +77. Web Editor allowed browsing and editing files in folders on the web. I first planned the Web Editor algorithm. Then, I created new files and folders and displayed the name of the file or folder. I could edit files. I could add a feature to automate the development of my interpreter. +78. I could generate complex predicates, which considered past output, to search through.
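The generate-and-test loop behind CAWMP (item 72 above) can be caricatured in a few lines. The command set and search strategy here are hypothetical simplifications for illustration, not the actual CAWMP code:

```prolog
% Tiny generate-and-test sketch: enumerate command sequences until
% one maps every input in the specification to its output.
command(double, X, Y) :- Y is X * 2.
command(inc,    X, Y) :- Y is X + 1.

run([], X, X).
run([C|Cs], X, Z) :- command(C, X, Y), run(Cs, Y, Z).

find_program(Spec, Prog) :-
    length(Prog, _),                              % deepen on program length
    maplist([C]>>member(C, [double, inc]), Prog), % choose each command
    forall(member(In-Out, Spec), run(Prog, In, Out)).
```

For example, ?- find_program([1-3, 2-5], P). finds P = [double, inc] (double, then add one), the shortest program satisfying both examples.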
I wrote the program finder with types algorithm in the Philosophy folder. Program finder was an algorithm that wrote algorithms from data, not to match it afterwards like Combination Algorithm Writer (CAW). Program Finder with Types converted [a,b,c,d] to (((a:b):c):d) and recognised the parts of this compound to write recursive append, reverse or string_concat, etc. algorithms. I also wrote an algorithm that used equals4 to recognise and substitute parts of compounds. +79. Jyotish guessed the person's thoughts from their financial details and tried to help them. Separately, I wrote the Prolog Word Game. In this, the game asked one player to guess the letters of the other player's word. Then, the first player entered the word, which the algorithm hid. Then, the second player continued to guess letters, and the game told them whether they were correct or incorrect until guessing the word or running out of guesses. +80. Prolog's lists were more intuitive than not and were converted into C arrays. I wrote the List Prolog to C converter. I used the converter to convert List Prolog algorithms to C, speeding them up. I used list Prolog to develop and debug the algorithms. The program converted logical elements into if-then statements. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/IMMORTALITY/Immortality 20.txt b/Lucian-Academy/Books/IMMORTALITY/Immortality 20.txt new file mode 100644 index 0000000000000000000000000000000000000000..9cf9fc23d6a6b27edf6cbf7320e1f2eca4816dcc --- /dev/null +++ b/Lucian-Academy/Books/IMMORTALITY/Immortality 20.txt @@ -0,0 +1,88 @@ +["Green, L 2024, Immortality 20, Lucian Academy Press, Melbourne.","Green, L 2024",1,"Immortality 20 + +1. The immortal differentiated list expressions from earlier ones with additional data. I produced modified list expressions. For example, I converted symbols to symbols. Or, I added more or flattened levels. Alternatively, I changed atoms to other atoms, sometimes with a three-way replacement. 
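One of the modifications just described — changing atoms to other atoms throughout a nested list expression — can be sketched as follows (a minimal, assumed example, not the production code):

```prolog
% Substitute one atom for another at any depth of a list expression.
replace_atom(Old, New, Old, New) :- !.
replace_atom(Old, New, List, List2) :-
    is_list(List), !,
    maplist(replace_atom(Old, New), List, List2).
replace_atom(_, _, X, X).
```

For example, ?- replace_atom(a, b, [a,[c,a]], R). gives R = [b,[c,b]]; a three-way replacement would substitute among three atoms in one pass rather than chaining calls.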
+ +2. The immortal described art in terms of terms. I wrote additional data to differentiate new list expressions from old ones. This data might be \"first item\",..., \"item n\", or types of types, such as musical note 1 or rectangle shape. For example, instead of writing [number, number], I wrote [[v,a1],[v,a2]], which corresponded to other instances of variables. I could also write [[\"music note frequency\",1],[\"music note frequency\",2]], which helped with searching, where I differentiated from past musical stories. I could also write [\"rectangle\", \"hole\"] to design a post box. +3. The immortal wrote a website that hosted APIs. I created an API host with State Saving Interpreter Web Service. It hosted educational APIs. It also hosted business APIs. Finally, it hosted programming APIs. +4. The immortal wrote software that didn't finish the work but prompted thought. I hosted educational APIs. I tutored the student. Or I assisted the teacher. Finally, I wrote a chatbot that helped clarify the idea. +5. The immortal identified the value system when programming. I wrote software that didn't finish the work but prompted thought. It finished it and helped the student understand it. It had an education A. It clearly explained the ideas, giving the prerequisite ideas and catering for grade 3 readers, including disabled students. +6. The immortal sketched the compiler's flowchart, writing how a program could generate it. I identified the value system when programming. I drew a square and wrote what I was currently doing in it. I connected the knowledge that I had worked out to it. I identified good, better and best methods of writing a feature and bug checking, thinking of reasons for the best ones, which I did, and objections to the others. +7. The immortal stated that the interpreter and compiler went in circles. I wrote how a program could generate the compiler. I developed the predicates one at a time. I drafted the algorithm, finding similar predicates.
I minimised the set of predicates and the number of variables. +8. The immortal entered changes using an algorithm. I wrote the best ways of writing features. First, I mind-mapped the way of doing it. Second, I kept the code uniform. Finally, I wrote small non-uniform changes down to delete or carry out across the system. +9. The immortal kept a log of positive or debugging changes, filed into pattern-matching or unpredictable commands, and made similar ones automatically later. I entered modifications to the algorithm using an algorithm. It checked whether the change should be by itself, affect a subset of parts or all relevant parts. I wrote the family of values and set of affected variables. I simplified and wrote reused code (for example, I stored choice points in a standard format). +10. The immortal automated suggestions or changes to code. The algorithm checked whether the change should be by itself, affect a subset of parts or all relevant parts. I sometimes performed this checking using neuronetworks. There was a text file containing the control structure, and there were needed predicates and commands in the compiler (string to list had commands inserted to expand to error-producing code). The command was either changed, had a condition inserted, or there were two copies for different cases. +11. The immortal verified the results of changes. The change to \"and\" or \"or\" in the compiler's control structure was needed to align with non-deterministic or other results required, errors and the need to minimise the code. The user pruned these if there were unwanted multiple or non-multiple results (including true or false results). Errors in each part of the algorithm were registered and passed up to previous levels before being caught. Also, changes triggered relevant testing and any minimisation needed. +12. The immortal modified functional calls, simplifying queries. 
The change was made to the functional call, streamlining or eliminating intermediate predicates. A functional call contains other function calls, for example, z=j(1,f(g(x),y)). I could change j and f to h, simplifying the call and eliminating j and f. Another parameter may be passed with the new intermediate predicate call, modifying it. +13. The immortal responded to the question. I wrote an analytics app for my website. The analytics app was a password-protected website. It contained information about prospective customers, qualified leads and sales. It listed prep data and allowed communication with clients. +14. The immortal tracked and worked out the best assignments and code. I wrote the family of values and set of affected variables. I wrote whether the family was positive integers or 2, 4, 6, ..., etc. I wrote the variables that the value transformation affected. Also, the algorithm used real data and verification with types statements to determine whether a family value assignment affected a particular pathway. The possibility of failure was mapped and avoided. +15. The immortal filed each data structure away. I wrote and redrafted reused code and simplified it (for example, I stored choice-points in a standard format). I wrote the type of choice-point, such as predicate or command. I optimised the blue cut by combining the parts in a single C-function, with if-then conditional on the truth value of the antecedent statements. I could then access, modify and delete choice-points quickly. +16. The immortal compiled a list of Prolog commands, listing their predicate or hard-coded basis. I used modules to turn features on and off. I ran modules as predicates. The feature (predicate) could be loaded when needed. I could unload (delete) the predicates when required, with lists of dependencies of each predicate. +17. The immortal converted the list to a string, for example, (1,\":\",[a,b,c]) to (1:[abc]). To do this, I compiled the best ways of debugging. 
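The functor change in item 12 above (for example, turning z = j(1, f(g(x), y)) into z = h(1, h(g(x), y))) can be sketched for ground terms as follows; the predicate and functor names are illustrative assumptions:

```prolog
% Rename the functors j and f to h throughout a ground term.
rename_functor(Term, Term) :- atomic(Term), !.
rename_functor(Term, Out) :-
    Term =.. [F|Args],
    ( member(F, [j, f]) -> G = h ; G = F ),
    maplist(rename_functor, Args, Args2),
    Out =.. [G|Args2].
```

For example, ?- rename_functor(j(1, f(g(x), y)), T). gives T = h(1, h(g(x), y)), eliminating j and f.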
For example, I tried debugging with the State Saving Interpreter, which could enter the program to debug at any point. In addition, I wrote a string type checker with a list expression, which helped verify input strings. These list expressions could be rapidly converted into grammars or left as call parameters. +18. The immortal checked the work. I tutored the student. I avoided giving them the answer, instead helping them with specific questions as they worked it out. I only helped them if they were struggling. I recorded the notes about the exchange. +19. The immortal wrote software to coordinate the student's learning and reports about it to the teacher. I assisted the teacher. The teacher spent time doing something else, usually spent cutting cardboard, while the computer ran software to do this. The computer helped each student, using monitoring and assistance, helping the teacher multi-task. The algorithm could help even pedagogy and summaries of key points to help students read. +20. The immortal programmed the program to give ear to ideas. I wrote a chatbot that helped clarify the statement that the user told it. The psychology was to help the user breason out (check objects they mentioned) and articulate specific questions. For example, a professor asked whether questions were embarrassing. After each entry, the program asked, \"If there is nothing else, then is there anything else?\". +21. The immortal wrote APIs that helped employees with business reasoning, analysis, confidence, and home troubles. The website hosted business APIs. One API found the main points of a document and commented on it, showing its reasoning. Another API scored the quality of reason in correctness, detail and analysis. A third API detected employee pedagogical effort, including the quality of returns to them. +22. The immortal created a book about each algorithm celebrating its flowchart, features and jokes. To do this, the director hosted programming APIs on the site. 
An API wrote in the style of planning, completing features, debugging or catching \"hacking\" the program when tired. Another API had a slider to critique from \"encouraging\" and \"suggestive\" to \"security and performance-conscious\". An augmented API made suggestions to improve code, such as its formatting, uniformity and apps that stood alone in history. +23. The immortal approached C-compiler speed. I wrote a website that contained a website generator/writer. It devised bios/autobios. It also recorded a person's life as they made discoveries. More than a website editor, it added breasoning, number ones, secret twists and logic to texts. +24. The immortal produced a single predicate that processed the list. I verified that lists of lists, not just repeating parts of lists, were supported in the \"types to alg\" algorithm. Before this, I checked that [a, b, a, b] worked. Then, I checked that [[a,b],[a,b]] worked. In each of these cases, the algorithm grouped like parts together. +25. The immortal wrote on any or a new idea. I wrote an algorithm for the Academy in State Saving Interpreter Web Service. The algorithm allowed enrolments and submitting assignments and suggested paraphrasing set texts. Another algorithm allowed writing on up to five texts from two different departments. This interdisciplinary analysis strengthened my knowledge about bonds in computational philosophy. +26. The immortal allowed people to use their apps without installing other software. I tested the Messager app and different State Saving Interpreter Web Service algorithms. I tested Messager, which allowed users to enter and save messages in the terminal and online. I could run other Prolog algorithms online, such as the Vetusia text adventure and image, platform games or video editors. I considered selling or teaching about my Prolog algorithms online. +27. The immortal listed and sold their software. I sold my Prolog algorithms online. 
I licensed the use of Music Composer to help rapidly produce songs. I allowed the use of Essay Helper to create essay drafts. Also, I sold my List Prolog interpreters, at least education materials or licenses, for web development. +28. The immortal made money from teaching logic programming online with interactive exercises, such as a term or definition quiz. I taught about my Prolog algorithms online. I wrote a documentation book with the List Prolog commands for users to read before writing algorithms. I gave short examples of programs: search, multi-variable findall, and command or file completion. Command completion included file string, atom and list completion. +29. The immortal labelled the interpreter's binding table method (member, delete and append). To do this, I listed the education skills As. After writing the skills, such as sort, append or foldr, I wrote about their ease of use, how a student could use them, and concordances showing their instances in real programs. A student may have encountered the command early or frequently, increasing its ease of use. They could be used in different ways, with various modes for each argument and applied to multiple arts, and they could fit into other algorithm methods, also labelled. +30. The immortal experimented to discover what was possible in accreditation, such as sureness about and heading for conclusions. I helped Computer Science students with accreditation. They could more easily visualise writing algorithms. I gave the question. I helped them with the answer. +31. The immortal also taught logic, which required spatial visualisation. I helped Philosophy students with accreditation. The student had the confidence to connect the ideas. They could understand the writing. The student thought of magnificent thoughts. +32. The immortal accessed a building block, a key idea behind an algorithm. In Computer Science accreditation, there were three entry points to a building block. 
First, there were ten handwritten breasonings; the rest were 250 quantum box or grammar logic breasonings. Second, an entry to a building block was writing shorter, simpler versions of algorithms, such as find lists and gen alg by themselves. Third, an entry point wasn't starting with Prolog facts (like data), rules and predicates but with simple cases, such as in a primer. +33. The immortal infused the student with knowledge of accreditation. In Philosophy accreditation, teachers could help students to connect with associated texts in a range of easiness from grade level 3 and up. It wasn't necessarily true that, like Computer Science, the student needed several Computer Science details or that the student needed two uses (of arguments and algorithms) for sentences and connections, also like Computer Science. Instead, perhaps, a grid of 5*5 details with a column for each paragraph and five reasons with links, even sort algorithms were required. +34. The immortal learned how students were working from their questions. I wrote the Computer Science Frequently Asked Question list. I received the inquiry. I checked whether I had already answered it or whether it was a variant and needed an updated answer. I updated the answer. +35. The immortal set the AI search or interpreter as the project. I wrote the Computer Science assignment specification. I wrote the problem. I described the method the algorithm should use through the data. Finally, I wrote the required test cases in the data folder. +36. The immortal read about immortality. I taught about immortality in accreditation. It contained the method for becoming immortal. The texts were critically analysable. The course was recommended for immortals and reflected society's direction. +37. After the immortal automated copywriting, they could conduct business. I taught business in accreditation. The students could access business technologies. They could prevent headaches. They could sell. 
They could meet their chosen clients. +38. The immortal covered the lectures. The lecturer worked on 80 breasonings and 80 algorithms for the lecture. This document was separate from the set text. The secondary texts also had breasonings. The arguments contained connections and comments. +39. The immortal helped the students with the pedagogical ways of thinking. The lecturer worked on 80 breasonings and 80 algorithms per comment made by a student during the lecture. Often the student would do this too. Other students could join in. The students helped, and some learnt to be lecturers. +40. The immortal wrote the skill basis for the algorithm. The course writer wrote education As for students with mind reading about skills. These were the secret examples of algorithms. A lecturer worked on an education A for student As. The A contained lists of skills thought of in student breasonings, and the algorithms for these (for example, addition) may be 1+1=2. +41. The immortal kept a log of thoughts and thanked the institution. The institution supported students' education-related thoughts for 3-4 years after the student finished the qualification. The computers detected and completed thoughts for this time. I encouraged the student to think of the idea using the A. I experienced that the industry sometimes contained more complicated thoughts. +42. The immortal studied many areas of study within philosophy. I wrote what the education skills As contained. They contained philosophical terms. Also, they included possible connections. They also had ontologies of probable reasonings. +43. The immortal recognised and thought clearly about the list. The writer wrote (lists of) lists as part of education skills As. Lists could contain data. This data could be processed. Usually, the writer decomposed the list and then built a new one. +44. The immortal analysed longer texts (with simple ideas) and many combinations of logical rules. 
The writer wrote command names as part of education skills As. The commands were in groups, such as mathematics, string or list manipulation. The lecturer ranked the commands from most to least relevant for a task. They were also ranked easiest to hardest. +45. The immortal wrote about recursion in terms of lists. The writer wrote recursion as part of education skills As. The writer taught recursion in the Prolog primer. It was recognised as needed in the exercises. It recurred inside other recursive loops and at the bottom of predicates. +46. The immortal learned Computer Science by writing input and output regarding types. The writer wrote multi-predicate recursion, or intermediate predicates, as part of education skills As. Recursion in multiple predicates involved lists of lists. Students wrote type statements in the first semester. In Haskell, each function outputted one variable. +47. The immortal wrote As for each skill. The immortal listed the skill. He mind-mapped philosophical terms related to the skill. He interpreted these. He wrote an argument about the skill. +48. The immortal mind-mapped uses for recursion. The writer wrote an A about the skill of using recursion. They wrote an A for anything (that could be processed). They wrote an A for processing this data. They did this when they identified data that needed to be processed using recursion. +49. The immortal was an expert on hidden neuro-algorithms. The A-argument discussed the ethics of a particular type of induction. They debated whether it could create jobs. They also discussed whether it would cost jobs in some cases. Students should work out problems rather than rely on algorithms. + +50. The immortal's religion was 4*50 As. In my big moment, I recommended writing 80 breasonings and a $1 copywritten sentence to finish the 4*50 As. I wrote 4*50 As per algorithm. The main aim was to write 4*50 As per book. The copywritten text also helped me overcome big ideas in the course. +51. 
The immortal processed the data first and entered data quickly at the end. Then, I wrote a mini text-to-breasonings algorithm, which stored dictionaries as terms and was the same speed. Then, I finished the available breasonings. At the end, I prompted the user for missing objects. Also, I prompted the user for missing breasonings. +52. The immortal breasoned out medicine about education to help improve the feeling of customers. I breasoned out the medicine arguments to help improve the feeling of customers. I listed each organ system. I breasoned out an A for each one. As a result, the customers felt refreshed and well. +53. The immortal breasoned out medicine about education to help relax the student. I breasoned out the medicine arguments to help relax the student. The task was simple. The algorithm was as tricky as the interpreter. The student relaxed her muscles and could focus on the task. +54. The immortal professor completed administration work. After that, I decided to start my business. I focused my time on high-quality tasks, such as original writing. For example, there was a 250 breasoning argument for a comment. I wrote the philosophy, then pressed the button at the end. +55. The immortal wrote the autobot. I followed the customer around the website. They bought the course. They completed the assignment. I prepared for each action they took. +56. The immortal helped the customer to repurchase the product with an A. I asked, what are three things in sales? It could refer to three drafts of an essay. Or, it could refer to three drafts of an algorithm. +Or, it could refer to both or professional development in repeatedly buying the product. +57. The immortal wanted a particular course on an occasion. I wrote the A for buying a course. I imagined being the customer. +I read my mind and wrote the questions I had about the product. I wrote the answers. +58. The immortal collected answers to recurring questions to save time. I wrote an A for doing the assignment.
I imagined being the student. I completed the assignment. If I had a question at a point, I wrote it down. I wrote down the answer to the question. +59. The immortal tested each departmental product three times. I pretended to be the autobot. I did everything that the person would do. I checked that their pathway was clear. I saved the notes and prepared the setting for the real person. +60. The immortal preferred studying the minimum number of courses and used copywriting for other customers. I completed the Chinese Medicine course regularly. It relaxed my muscles. I had no headaches. Lucianic medicine uses MBA technology to prevent headaches. +61. The immortal ensured that the systems were smoothly running. I wrote the pedagogical arguments about immortality, bots and sales. I could recline in time. During this, I could attract employees and customers. I asked if \"ticketing\" is an industry to help customers in open source software. +62. The immortal optimised the algorithm by minimising it for safety. First, I wrote the deterministic finite automaton (DFA) minimisation algorithm. Then, I detected the sections which connected to the same nodes. Then, I deleted them and reconnected the severed connections. Finally, I redid the loops until it was complete. +63. The software customer paid the immortal. I made the paid ticketing system in the state-saving interpreter, with payments for the software customer. The software customer typed their question into the website. It helped improve and shape the direction of the software. The profits helped pay for costs. +64. The philosophy student paid the immortal. I made the paid ticketing system in the state-saving interpreter, with payments for the philosophy student. The philosophy student typed their question into the website. The lecturer answered the student's questions. This ticketing system helped pay for philosophy teaching costs. +65. The computer science student paid the immortal. 
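The duplicate-node merge in item 62 above can be sketched with transitions as From-Label-To triples. This is a simplification: full DFA minimisation must also compare acceptance status, and the predicate names here are assumptions:

```prolog
% Two states are mergeable here when their outgoing transitions match.
same_exits(Edges, A, B) :-
    A \== B,
    findall(L-T, member(A-L-T, Edges), OutA), msort(OutA, S),
    findall(L-T, member(B-L-T, Edges), OutB), msort(OutB, S).

% Merge B into A: drop B's own edges and redirect edges into B to A,
% reconnecting the severed connections.
merge_state(Edges, A, B, Merged) :-
    findall(F-L-T2,
            ( member(F-L-T, Edges),
              F \== B,
              ( T == B -> T2 = A ; T2 = T )
            ),
            Merged).
```

For example, with Edges = [q0-a-q1, q0-b-q2, q1-a-q3, q2-a-q3], same_exits finds q1 and q2 duplicates, and merge_state(Edges, q1, q2, M) gives M = [q0-a-q1, q0-b-q1, q1-a-q3].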
I made the paid ticketing system in the state-saving interpreter, with payments for the computer science student. The computer science student typed their question into the website. The student asked a question, and the database software or lecturer replied. This ticketing system helped pay for computer science teaching costs. + +66. The course customer paid the immortal. I made the paid ticketing system in the state-saving interpreter, with payments for the course customer. The course customer typed their question into the website. It helped the representative assist the customer, who might have a question about a possible discount, accessibility or a refund. The profits helped pay for costs. +67. The computational philosophy paid the immortal. I wrote the philosophy. I wrote the algorithm about it. I wrote the article describing how the algorithm worked, possibly with diagrams. The topics included graphics systems, interpreters and converters from Prolog to C in Prolog. +68. The immortal introduced new students to their works. I wrote introductions to my philosophy on the website. I wrote the philosophy. I described what it meant. This description involved writing specific headings and linking to algorithms and software. +69. The immortal sped up the generation and performance of apps. I wrote the State Saving Interpreter Web Service algorithms on the website. I wrote the login system. I wrote the messager system, with Learning Content Management System and ticketing systems. I also hosted checked APIs for free. +70. The immortal wrote application program interfaces to, for example, provide code and run an app. I wrote the API. The API ran a program on a server for particular clients. It remained logged in and was free or for a fee. The client ran the API from within an app or web page, displaying the API results. +71. The immortal wrote their app from the ground up. I answered the academy questions.
For example, I read the student's code so far. If they specifically asked how to debug a particular bug, I helped them. This help sped up the learning process. +72. The immortal taught business students how to become immortal. I argued that State Saving Interpreter Web Service helped simplify processes. It didn't require the user to install Prolog on the client side. It could run Simulated Intelligence apps like Text to Breasonings, Mind Reader, Grammar Logic or Time Machine. I quickly worked out how to become immortal after learning about time travel. +73. The immortal adhered to security standards in the login program. I completed the web login algorithm. I registered an account on the system. I entered my login credentials. I could access my lost details and contact the webmaster if I lost access to email. +74. The immortal wrote a ticketing system to avoid costs, which integrated with my algorithms. I accepted payments for the ticket plan. This way, I could earn income from services and support for my open-source software. I needed to pay for my costs. I could add features or help with the programming language. +75. The immortal created ISO, C and induction Prolog features. I could add features. For an additional fee, I could add features that I agreed with. I would help as much as possible. I could help clients build a system, the parts of which the user accessed through the web. +76. The immortal made the knowledge base available. I helped with the programming language. I liked the simple method more. This simple method was part of the knowledge base. Otherwise, I helped with methods, appropriate commands and performance in induction. +77. The immortal sped up help to those who needed it. I built integrated algorithms into the ticketing algorithm. When the technical support officer had already answered a question or required technical information, they could access it. 
If the customer had entered data so far that helped answer the question, the technical support officer could pull it up. If the customer had experienced a problem with their data, algorithm or interpreter, they could report it, and the officer could help with it using screenshots or error messages. +78. The immortal helped the user with their support, programming or writing questions. The user took a ticket from the ticketing system. This ticket meant that they registered a comment, question or complaint. The officer on duty would reply within 1-2 business days. For some questions, the turnaround may be 1 hour. +79. The immortal partially automated support. The ticketing system alerted the admin by email. The admin logged in to the ticketing system, based on Messager, and answered the question. All that was required was a forum that could run some Prolog code. For example, it could run code that didn't access files or could, if the overseer provided it, and it was safe. +80. The immortal spent the day reading, writing and coding. Writing example Prolog algorithms helped new users learn and borrow code. I wrote the code and the \"briefing\" report. I spent 10-15 minutes daily responding to others on their pages. I gathered supporters and donated the money to philosophy. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/IMMORTALITY/Immortality 21.txt b/Lucian-Academy/Books/IMMORTALITY/Immortality 21.txt new file mode 100644 index 0000000000000000000000000000000000000000..6a32734ef24a0bc933bf5fb0b78aa197944ac813 --- /dev/null +++ b/Lucian-Academy/Books/IMMORTALITY/Immortality 21.txt @@ -0,0 +1,87 @@ +["Green, L 2024, Immortality 21, Lucian Academy Press, Melbourne.","Green, L 2024",1,"Immortality 21 + +1. The immortal simulated each part of their company. I pretended to be a bot. I breasoned out 250 breasonings to become the bot. I walked around. I took notes. +2. The immortal tested each document from three perspectives. 
I became three testers for it, checking the attraction and that it worked. I redrafted or optimised the algorithm. I redrafted or reconnected the essay. This testing helped support students and spurred new ways of completing assignments. +3. The immortal followed rules for a perfect result each time. I wrote 50 breasonings to and from people per thought in business, education and the department concerned. The business breasonings helped give people thoughts and better establish their thoughts. The education breasonings were appropriate for an education course and helped them concentrate on abstract visualisation in computer science and philosophical meanings. The home department breasonings helped them examine the thing in itself, or noumenon, and presented them with imagery that education couldn't. +4. The immortal breasoned out business, education and departmental breasonings for students' thoughts. The breasonings that the company required to be done for the student were done for the student if they didn't do them. The detector detected whether the student had done the breasonings. If they were done but the breasonings were incorrect, incomplete or there weren't the right number of them, then the computer finished them. If they weren't done, then the computer performed them. +5. The immortal tested for speed, accuracy and originality of thoughts. The algorithm detected the students' thoughts. Like program finders, there were mind readers for questions. They had leeway; for example, they allowed generative art based on science. This thought, explained to the student with 250 breasonings and a spiritual screen, suited open-ended questions best. +6. The immortal wrote the practicum for the other thoughts. The detected business, education and department thoughts were the practicum for As, meaning I didn't necessarily need to do any more work. I followed their thoughts. I gave an A. This was 100%. +7.
The immortal followed the business format for education, which drew students and met the professional requirements. I wrote five As, for the text, lecture, and three students. I wrote 80 sentences in the text, containing algorithm specifications. I wrote 80 breasonings for the lecture about the text, a unique work. I wrote a unique A for each of three model students. +8. The immortal developed texts on different topics about a topic. The non-text As, for the lecture and three students, didn't include breasonings from the text. Instead, they were on the same topic and were original. The lecture may have the form of a secondary text, from a departmental perspective. The As for the three students may represent thoughts from one or more students. +9. The immortal thought cosmology referred to plants. I used a similar algorithm to Essay Helper, which connected all combinations of sentences, then reordered them with the permutation command. I expanded some of these as an A. The text contained all ideas in the original. The text helped synthesise or critically analyse the text. +10. The immortal focused on the simulated intelligence algorithm, which helped construct the simulation. I wrote the philosophy of algorithms. I chose an algorithm. It was the argument. I wrote other sentences in terms of this argument, rearranging the key words each time. +11. The immortal supported his number one, experiencing prestigious education, his home time and the future. I supported the simulation with the Text to Breasonings, Mind Reader, Grammar Logic and Time Machine Simulated Intelligence apps. Text to Breasonings supported one's job in education or business. Mind Reader supported algorithms such as Grammar Logic to cover one's thoughts completely and meet performance requirements. Time Machine allowed people to live within the window of the government supporting Text to Breasonings and support projects at different times. +12. 
The immortal completed the basic assignment, then wrote more after graduating. I broke texts into 16 paragraphs. Texts that were this short could be inspirations for students to write algorithms. The texts were short because they were dense with algorithm specifications. Each algorithm was as simple as, or simpler than, an interpreter, and could be chosen to write. +13. The immortal chose good works to resynthesise. I wrote five times as many hidden philosophies and algorithms when they were approximated, wrote them up, then chose one out of five. I checked that the subject and object agreed with ontologies in philosophies. I checked that input and output agreed with ontologies in algorithms. I also checked that the philosophy and algorithm agreed with ontologies, and that good ideas were exhausted and poor ones were objected to. +14. The immortal changed close-to-correct phrasings. Instead of using a similar algorithm to Essay Helper, I modified the output of the \"br alg gen\" algorithm. I entered text as input to the \"br alg gen\" algorithm. It produced \"540 degree\" mixtures of sentences, allowing for unwanted connectives. I rejected these if they didn't agree. +15. The immortal interpreted an algorithmic process in a text or manipulated an algorithm, producing another algorithm. Instead of using a similar algorithm to Essay Helper, I modified the output of the \"types to alg\" algorithm. It replaced parts of input, recognised patterns and pattern matched to produce output with an algorithm. It could transform texts or algorithms with pattern matching. I checked whether the transformations were correct with ontologies. +16. The immortal wrote ontologically corrected texts. I produced a text from an algorithm from an interpreted algorithmic process. I entered a text about pouring liquid into a cup and a text about mathematical subtraction. I found the algorithm about removing liquid. I wrote this as a sentence. + +17. 
The immortal published the noumena behind the philosophy. I wrote a book of fantasies in Computer Science. It contained step-by-step explanations of algorithms. These algorithms were my projects and could become the students'; they were model answers. In addition, I gave the project specs. +18. The immortal gave attention to detail in each assignment. I copywrote a document per assignment. This copywriting helped protect the project from unfinished big ideas. Copywriting contained at least 4*50 As. The customer's effort so far was covered. +19. The immortal structured and drafted the essay. I saved tips for high distinctions on my desktop. These came from my Education master's degree. I wrote the paper in the correct format, with the content I needed. I paraphrased and then checked the essay. +20. The immortal wrote appropriately for the master's degree. I abided by more essay writing tips. I removed the first person \"I\". I applied my knowledge at each point in the degree. I replaced \"expose\" with \"examined\". +21. The immortal modified inappropriate expressions. I wrote signposts as dot points, not sentences. I split long sentences with different ideas. I deleted \"regarding this\". I changed \"research has been brought up\" to \"extensive research has been conducted\". +22. The immortal cited sources correctly. I removed \"therefore, where\". When using the APA 7 citation style, I didn't include authors' initials unless there were two authors with the same surname in the same year. There were no citations with \"as cited in\" at the postgraduate level. Instead, I used original references, if recent. I deleted \"recently\". +23. The immortal carefully wrote the essay. I didn't encourage brackets. Instead of \"also\", I wrote \"in a similar way\" or \"in addition\". I used an argument map to write the essay. Finally, I used an automated service that checked my report. +24. The immortal checked the essay. 
First, I checked that I had referenced sources in-text and in the references list. Then, I wrote connections between reasons and paragraphs. Next, I copywrote a seen-as-version, coloured pixels, for the essay. Finally, I prooflistened to the article. +25. The immortal continued to study education. I wrote and connected many, at least two, sources in one of the paragraphs to deserve a high distinction. I thanked the professor. I wrote the essay. I read the better-performing article and came to this conclusion. +26. The dear immortal transferred knowledge about the calculation. I breasoned out the assignment, 250 algorithms, and 250 breasonings for each project. In addition, I used the Grammar Logic algorithm to mind-read a total of 125*2*50*80 = 1 million breasonings. This figure was multiplied by 4 in PhD assignments. Or, it was multiplied by 5 in copywriting. +27. The system alerted the immortal when someone needed his help. I developed the ticket system to support users. I used State Saving Interpreter Web Service to write the system. I could support Prolog users on a forum. Special solutions may appear on the forum. +28. The immortal wrote philosophy for children and beginners. I copywrote a plan to write online about my philosophy. I professionally wrote each article. These articles explained the theory more clearly. The theory was computational. +29. The immortal wrote each entry in a high-quality way. I copywrote a plan to write online about my computer science. I established a forum about my programming language. I wrote the forum software in Prolog. I publicised new commands and read about the need for other new commands. +30. The immortal wrote a multithreaded, segmented algorithm to write many breasonings. The copywriting formula contained 6*125*2*5*50*80 = 30 million GL breasonings. I completed this for each key idea. In this, I wrote two, symbolising two people having cups of tea. I enjoyed sales for free. +31. 
The immortal explored the person's ideas, giving As to prospects. I wrote web, education and business copywriting for each website page. The intern was the \"main person\" in the funnel. I had a trust algorithm. I fully cognised opportunities around lost opportunities. +32. The immortal recognised the need for top-level exhaustion of choice points. Before this, I wrote web copywriting for my website. I divided the web into programming and fine art. Programming could be the backend and frontend. State Saving Interpreter Web Service catered for the backend, and the frontend was HTML and JavaScript. + +33. The immortal sold education. I wrote education copywriting for my website. The point of the website was education. Students completed courses on it. They firmly believed in it. It was a friendly community. +34. The immortal analysed the business side of website intuitiveness. I wrote business copywriting for my website. This copywriting helped meet the professional requirement of the website. Meeting this professional requirement enabled employee and customer presence. In addition, it boosted sales. +35. The immortal wrote for the payment. I wrote in essay format when publishing philosophy online. I wrote the essay. The essay contained a hierarchical argument. I wrote the detailed reasoning. +36. The immortal designed the website for others. I ensured the accessibility of my website to disabled people. I used the specific tags necessary for visual elements. I used headings and subheadings. I used descriptive link text. +37. The immortal ensured visitors were human. I wrote a test to eliminate bots from logging in. This test could be A+B=?. Alternatively, it could be A?A=B. Or, it could be f(A, B)=B-A, so what is f(C, D)? +38. The immortal provided another password for vision-impaired students. I ensured the accessibility of the \"bot check\" for non-vision-impaired students. I could use images. The images used contrasting colours. 
This way, all students could log in without difficulty. +39. The immortal sold each book. There were 50 copywritings per 50 As of philosophy. Lecturers could submit a minimum of 80 breasonings or 1 A. This A expanded into 4*50 As in copywriting. In total, there were 50*50 As. +40. The immortal verified the type of data. I wrote grammars as List Prolog types. These were types of strings. For example, a string might be a list as a string. Or, they were types of lists. +41. The immortal only time travelled when they wanted to. I switched off the option for other people to involuntarily time travel a character in the simulation's daily regimen. I opened the daily regimen document. I programmed 250 quantum box breasonings for the switch. I switched this switch on for each time travel destination daily. +42. The immortal compared research with the articles. I wrote 4*50 As in journalism. I wrote down my plans to write articles. I collected 4*50 As. I wrote articles. +43. The immortal wrote virality and research 4*50 As. The research included departments in the meditation school and those in other schools. First, I wrote the research idea. Then, I compared it with the latest research. Finally, I wrote arguments and algorithms for new frontiers. +44. The immortal wrote on all topics. I wrote 4*50 As in publishing. These 4*50 As met the professional requirement for publishing articles. Also, I could publish books, journal articles and videos. There were ten breasonings that everything related to. +45. The immortal related unpredictable commands to acting. I wrote 4*50 As for acting. I could feature actors in articles, books or videos. They followed the rules about the work. I placed an associated item from one publication in another. +46. The immortal wrote songs about philosophy and computer science. I wrote 4*50 As in music. I wrote about the genre and types of compositions that I would write. I wrote 4*50 As for each song. The radio station played my songs. +47. 
The immortal supported students' and meditators' positive, relevant thoughts. I wrote 4*50 As for thoughts. These could be thoughts in a department. These 4*50 As could be expanded to thoughts in each department. I could also expand them per person, per thought. +48. The immortal wrote an opera or work. Next, I wrote a copywritten sentence by breasoning out 4*50 quantum box text to breasonings As for each paragraph. If there were further ideas in each paragraph, they were breasoned out separately. Then, I wrote the prologue of the ideas. Finally, I designed the garments to accompany the prologue. + +49. The immortal paid $0 for results. I wrote six cycles of 4*50 As for copywriting. These six repeats ensured that copywriting would work. Copywriting working meant that it met professional standards. I gave the rest to too many. +50. The immortal wrote the argument map editor. I wrote the argument for the assignment. I wrote the thesis statement in terms of reasons and objections with rebuttals. I chose the overbearing side. In addition, I related each reason to the argument and topic. +51. The immortal analysed the topic. I wrote the critical analysis in the report. I connected each reason to a previous reason in the report. I wrote the reasons in order. I filled in the gaps and removed unnecessary sentences. +52. The immortal met the professional political requirements. I wrote the 4*50 copywriting As for politics. I wrote 50 As. 50 As were the way of thinking of politics. It was supported by a master's. +53. The immortal went through the information. I wrote the 4*50 copywriting As for royalty. The royal was a pedagogue. With 50 As, people could come and go. Royals could make other royals their audience. +54. The immortal granularised mind-read data until reaching correctness. I wrote the 4*50 copywriting As for politics and royalty. The political royal went through the information with simulated intelligence. They had 4*50 As for mind-reading. 
They generated 4*50 As for a mind-reading time. +55. The immortal wrote pedagogical arguments before conducting business. The business degree revealed the white-collar process. First, the student completed market research. Second, he wrote the positioning document. Third, he conducted an email campaign. +56. The immortal determined possible ways of making money. The business conducted market research. The marketer decided whether there was a need for a product or service. It could use surveys, interviews, focus groups and customer observation. It could occur at home, on the street, in the office, or in a market research facility. +57. The immortal wrote similar statements to those in a document. The company wrote the positioning statement. The positioning statement described the product or service, the target audience and how it filled a market need. Sales and marketing used this statement to guide communication. In addition, the document ensured that all messages were consistent. +58. The immortal facilitated philosophy. The worker conducted the email campaign. Each person had 250 breasonings. It was simple to make money, and people bought. The product was philosophy. +59. The immortal was effective in pedagogy. The writer bought the copywriting for the pedagogical ways of thinking. These were the details, breasoning and rebreasoning. In addition, they included breathsoning and rebreathsoning. Finally, they had room, part of the room, the direction in the room, time to prepare, time to do and time to finish an action. +60. The immortal went over what was enough in copywriting. In addition to recommending copywriting on the Text to Breasonings repository, I taught courses in philosophy. These courses helped students devise philosophies and sell copywriting. I attacked large projects. I moved to the next task when a project was valuable enough. +61. The immortal chose from the quantum box, random, mind-reading and detailed mind-reading As. 
I wrote copywriting for A to copywriting. Copywriting helped me reach the threshold of achievement. This achievement could be a shop opening for a day. Or, it could be an important sale. Or, it could be an employee turning up for a shift. +62. The immortal identified and objected to blocks to copywriting. I wrote B to B to copywriting. I made money from courses, but students could be professional in their assignments. I enjoyed endurance athletics and work. I caught blocks to copywriting and encouraged copywriting. +63. The immortal chose between copywriting and non-copywriting. I copywrote the B to copywriting. As mentioned, copywriting is necessary, but education has more resources. In addition, education provided more content and support over time. Finally, degrees supported jobs, while copywriting may be required within degrees. +64. The immortal secretly preferred long projects in the holidays. I developed 4*50 As for thoughts. The student felt relaxed and supported in thinking of ideas during her degree. The institution rewarded her for writing breasonings. The thoughts were educational skill hierarchies for each algorithm. + +65. The person became immortal using meditation. \"Meditation\" by Lucian Green contains a chapter on the utterance for relaxation. The utterance helped the person time travel, which is necessary for immortality. The utterance delivered high-quality thoughts, and I could practise it daily. The year 5689 contained immortality. +66. The immortal chose immortality. \"Meditation\" by Lucian Green contains a chapter on the sutra for relaxation. The sutra gave meditators the quality of life they wanted. They could think clearly. They had the confidence to use spiritual skills. +67. The person analysed the levels and logic of the sentence, finding immortality more quickly. State Saving Interpreter runs Prolog online without visible source code. The interpreter could securely run web applications. 
The source code wasn't visible within the source of the web page. The programmer stored the algorithm and data on the server. +68. The immortal ran scripts daily. State Saving Interpreter Web Service catered for the backend. The web app allowed time travel. In addition, it gave \"As\" to prepare for immortality. Everyone in the future was immortal. +69. The immortal spent time time travelling. The song \"Neiney Goes to the Translator\" was on the radio, increasing interest in the dance/pop song. It was about the area sum slicing algorithm. I visited times. I wrote notes about my thoughts. +70. The immortal wrote freely, with people interacting. As a four-year-old, I attended the Scuola Materna in Italy while my father lectured in History at Lucca. I played, made robots and saw friends. I used these skills to earn high distinctions and perform in my career. I had the system for 4*50 As for one result. +71. The immortal wrote to support himself. I studied philosophy at Melbourne University. I wrote my philosophy while studying. I expanded this later. The philosophy was on topics related to immortality. +72. The immortal examined the harmless possibilities. I delivered papers at three international philosophy conferences. I timed the speech. I chose the topic and wrote about it, attracting audience members. I published my works. +73. The immortal studied and wrote. I wrote on Hegel, Heidegger and Daoism. I used pedagogy to write high distinctions. I used Computational English to write computational high-distinctions. In addition, I used Computational English to write about Philosophy. +74. The immortal maintained their products. Copywriting was a number of breasonings on arguments, algorithms and details. I wrote copywriting each day on the State Saving Interpreter. Users visited the repository and downloaded it. The algorithm supported them in using it with thoughts. +75. The immortal specialised in specific projects. I created the Computer Science course. 
Students enrolled in the class. They completed Simulated Intelligence projects. Their thoughts were listened to and helped. +76. The immortal had a positive function in the simulation. I breasoned out arguments daily to avoid mistakes, unwanted thoughts and medical problems in the simulation. I consciously chose the correct options. I always thought positively. In addition, I kept on a safe path. +77. The immortal made a living from their shop. I created the shop. I sold information products, such as algorithms, APIs and courses. I started the shop with State Saving Interpreter Web Service. I made product areas. +78. The immortal wrote code using code he had written himself. I found code from the description and spec. I wrote code in Prolog. I found the context, such as a shop, an off-line algorithm or a type of algorithm. I asked for any missing pieces of information. I used evidence rather than inventing it. +79. The immortal educated users in writing games. I wrote the text platform game. I wrote a font for the background. I timed and caught keypresses for the game. I wrote tonal music for the game. +80. The immortal increased the features of the game. I modified the text platform game for the web with colours and graphics. It automatically refreshed, returning any key press. I used different coloured text and backgrounds. I used graphics for the main character and background. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/IMMORTALITY/Immortality 22.txt b/Lucian-Academy/Books/IMMORTALITY/Immortality 22.txt new file mode 100644 index 0000000000000000000000000000000000000000..3abbb4f1647d4611df3a623f5ee15735d7e0deba --- /dev/null +++ b/Lucian-Academy/Books/IMMORTALITY/Immortality 22.txt @@ -0,0 +1,87 @@ +["Green, L 2024, Immortality 22, Lucian Academy Press, Melbourne.","Green, L 2024",1,"Immortality 22 + +1. The immortal wrote their lifespan into existence. The immortal still needed to complete the breasonings. The professor used breasonings. 
The politician used breasonings. The immortal used breasonings to complete tasks from first principles. +2. The immortal met the professional requirements of their job. The immortal needed the copy written, taking care of the number of breasonings. The immortal sped up essay drafting. They ran Essay Helper and the paraphraser as much as possible and a connection-making algorithm using ontologies. In addition, they wrote a keyword-for-keyword finder to find texts and structure qualifications. +3. The immortal made their technique available to help others. The institution contained a method for everyone to increase their longevity with meditation. The child was born able to meditate. The method was to meditate before and after time travel, breason out 250 breasonings to time travel to 5689 and indicate immortality. The simulation was similar to their world. They could replace their body and meet high-quality people. +4. The immortal thanked the copywriting company, like meditation and the educational institution, and used 250 breasonings per dimension per day to stop ageing. I breasoned out beauty, or preventing ageing, for each time I visited during the days. This breasoning lasted ten years. The people from the time saw me as the same age. I experienced myself at the same age already. +5. The immortal learned to time travel. I time-travelled each day to indicate my character. I saw the time traveller. I breasoned out 50 As in time travel. I breasoned out 250 breasonings to time travel, then tens of breasonings once in the simulation. +6. The immortal increased the image to 250 breasonings. The book cover depicted a clock with a time machine representing visiting times. The time machine was the text to breasonings algorithm. The algorithm was faster to use than hand breasonings. People in pre-computer times needed to memorise an argument. +7. The immortal helped the monarch. The book cover was futuristic, representing time travel with meditation. 
I recommended learning meditation to those interested in immortality. Kings, queens and famous people could do it. They enjoyed pedagogy. I performed pedagogy for them. +8. The immortal was immortal, wherever they were. The book cover was exciting and evocative, like time travel. The official regularly checked for mentions of time travel on the software site. The medication taker could avoid the long-term effects of medication. The worker could work and enjoy life. +9. The immortal wrote software to give others the effect of their books. The person became immortal after writing books on a line in meditation. They wrote on pedagogy, meditation, and medicine, including delegating workloads to prevent headaches, lecturer and recordings technologies, time travel, mind reading, and immortality. They often wrote about and studied business to support themselves. Finally, they chose a life of teaching. +10. The immortal explained the immortality method. I wrote a book for readers on immortality. I wrote on the philosophy of immortality, linking immortality to computer science. These algorithms took time to complete and led to many others. The book's content also helped immortality. +11. The immortal spread the good news. I invested in journalism. I practised journal writing. I did this each day. I wrote my newspaper. +12. The immortal did logical things. I wrote a book about the philosophy of immortality. I wrote on multithreading, induction and advanced commands. These led to writing advanced algorithms during my career. The argument for immortality was to do good things. +13. The immortal spoke with the editor. I wrote the copy for journalism. I drafted the article. I checked it for factual, grammatical and newsworthy errors. I published the article. +14. The immortal and the visited people experienced their setting and the Internet with simulated time travel. I wrote a book with accessible ideas on immortality. 
The engine \"Text to Breasonings\" algorithm worked efficiently. It required breasonings, and these were verified to exist on breasoning. I recommended making the first teleport to 5689 to start the simulation and smooth time travel and its appearance. +15. The immortal wrote the book for readers on the software site. I wrote copy for each of the \"first two conversations, and you too, \" representing bots, sales, and business for a sale. The bot, a person, was the buyer. The sale was the sale of the book. The company allowed the sale through business. +16. The immortal studied algorithms from their home time. The Immortality book took years to write. I wrote about the software I was developing. The immortal could choose the universe they wanted. They could remain relatively free of physical problems. +17. The immortal experienced user-friendliness and the Internet wherever they went. I figured out the product to sell. I wrote copy for bots, sales and business for me and the product. I was the author. The people could quickly find my work. +18. The immortal prevented ageing with 250 breasonings per time, per day and replaced their body. I wrote the Immortality book based on personal experience and experiment. I copied a time traveller, using my breasoning skill to time travel. I found out the date to time travel to for immortality while teleporting around the tan. I teleported there on the first day, before the technique stopped working, and entered the simulation. +19. The immortal promoted the book. I wrote a press release for the book. I wrote the book and released it. I wrote the press release about the book launch. I had a party. +The Immortality book describes the philosophy underlying meditation. +20. The immortal helped make the reader cosy. I wrote copy about the book. I described the book's writing and my life and times. I put my photograph on the back cover. I sent an email to a television program. +21. The immortal played the role. 
I wrote Immortality about an immortal actor. The actor chose to develop computer science programs. They were influential and inspirational in academia. The author created beautiful experiences for readers. +22. The immortal established that immortality was available through meditation and time travel in our times, influencing economics. I designed the front cover of the book. First, I found the free image. I created the type. I used an algorithm to design the front cover. +23. The immortal influenced the other. I wrote about immortality and became a teacher. I taught the vital areas of study to become immortal. I taught additional areas of study to help the examining spirit. Immortals had higher quality of life when they created their whole experience themselves. +24. The immortal maintained business activity each day. I wrote copy each day. I used text to breasonings each day. I kept the product alive. I quickly finished the duplicate copies using the quantum box breasonings. +25. The immortal made an interactive book. I wrote Immortality and music about it. I wrote the music generator algorithm. I found the songs for each algorithm in the book. I found the songs for each philosophy in the book. +26. The computer user searched for the key term and became immortal. I placed a free copy of the book on the software site. This decision was for medical reasons. Prospective immortals could breason out the book and use the time machine software. People who wanted to read the book could buy it. +27. The meditator kept a body without any pathological disease. I wrote Immortality and wanted to increase my health. I avoided degradation from ageing. My body stayed the same age, and I didn't get diseases. It was preferable to keep a healthy body. +28. The immortal spread knowledge from the central point. I wrote Immortality and aimed to help others with health. I aimed Immortality at meditators and programmers. People could set up web servers to help others. 
This server avoided the need to download software. +29. The immortal kept in communication with others. I didn't mind being there with the other in immortality. I kept programming. The programs were very involved. I earned jobs in my industries. +30. The immortal wrote copy about the book cover. I described the cover image as a metaphor for the book. The clock never stopped, similar to immortality. I could time travel to the start. Or, I could travel to parts of the simulations. +31. The immortal received a loan to buy a Prolog computer. I wrote Immortality and wrote a poem about small things at home. I used my time to prepare short works and help vulnerable people. I agree that money should not be a barrier. In addition, As, time and objects were not an object. +32. The immortal wrote copy about himself. I was the author of \"Immortality\". I helped people achieve their goals. I have always wanted to be a philosopher. So I have always written on Immortality philosophy. +33. The immortal spent the break writing on their mission. I wrote Immortality and Time Travel. These books depended on Pedagogy, Meditation and Medicine, and the \"Text to Breasonings\" algorithm. Immortality was dependent on Time Travel. It saved my life. +34. The immortal emphasised that time travel was safe, and one experienced the best things in the simulation. I wrote Time Travel to teach time travel. The book was necessary to practise time travel. I meditated on using the quantum box in the text to breasonings algorithm to do different things. First, I breasoned out high distinctions. +35. The immortal enjoyed the quality of life they liked. I wrote about Time Travel, which is a prerequisite for Immortality. I wrote Time Travel. I time-travelled. I wrote Immortality and became immortal. +36. The immortal summarised the book. I wrote copy about the book. I wrote 50 copywritings about the book. These copies helped it reach the right people. Finally, I published the book myself. +37. 
The immortal wrote specifically to help their readers. I thanked my readers for buying my Immortality book. I helped my readers. They thanked me. This acknowledgement gave me feedback about my book and how I wrote it. +38. The immortal chose the best product and filled niches. I hoped the readers of Immortality found happiness. The reader tried the Immortality technique. The religion supported them to support themselves. They completed the method. +39. The immortal recorded their travels and thoughts. I wished the readers of Immortality well on their travels. I imagined some of the visitors had used the technique. I pretended to say, \"Bon voyage\". Then, instead of walking along a road, they could teleport inside their house undetected. +40. The immortal designed the book cover. It contained the image, title, name, photograph, blurb and barcode. I could sell it in public. I booked the place and time. I brought the table and sold the books. +41. The immortal developed their copy-making software using their information products. I wrote three copies daily. I breasoned out bots for myself, the reader and the helpers. I breasoned out sales to reach the reader and help them with it. I breasoned out business, grounding and giving presence to the book and the reader. +42. The immortal inspired their followers. I thanked all the unpredictable actors out there. I was professional at acting unpredictably. I wrote the book so readers could interpret algorithms using various personal effects. Readers could follow pathways to write their books in response. +43. The immortal's tradition was radical philosophy. In acting, I, the director, may need actors to be unpredictable to succeed. The actors were funnier when unexpected because they conveyed coherent arguments. People felt encouraged to think and respond when there was an alternative argument. The actors were more serious with immortality. +44. The immortal remembered the unique features of each person. 
In acting, I recommend attempting to stand out with unpredictability and business. I also recommended keeping this implicit but appearing appealing in the simulation. I found people to talk about immortality with. I sold my books and explained my software to them. +45. The immortal made the writing understandable and exciting. I breasoned out copy for bots, sales and business for the blurb. I explained the book in a lively, entertaining way. The blurb helped readers find the key terms about the book and more about them. I recommended the book to student researchers and programmers. +46. The immortal explored philosophy. I claimed that everyone agreed with immortality because it was healthy for life. I could go on with my career goals. I could work in academia. I could automate my business. +47. The immortal could start their school. I was very comfortable with immortality and said it was suitable for everyone. I felt comfortable in the future. There were psychological and physical As. I could write on topics to examine them. I found accreditation in my own time. +48. Immortality protects the immortal. I agree with immortality because it is suitable for people. I found that people enjoyed the effects of immortality. For example, I saw a duck in the other dimension. The simulation prevented unwanted events. + +49. The immortal worked and studied with earnings. After writing copy, the copy supported me for each day of the period; then, I used the text to breasonings algorithm. I needed the copywriting to help me for health reasons in business, sales and bots. I needed support during the new period, such as the trimester or holidays. After I had finished writing 4*50 As for each of these, I could interpret my texts and algorithms using a neural network and support myself with the text to breasonings algorithm. +50. The immortal was born. I was born on 8 August 1980 in Melbourne, Australia. If 9:28 AM is 8:88, then it was all 8's. I was a Leo. 
I was the first-born child. +51. The immortal was happy with history. My father was a history lecturer at Monash University. He studied politics and history. He taught the history of religion. He taught Medieval and Renaissance Italian history. +52. The immortal kept their senses about them. My mother was a lecturer of interior and industrial design at Monash University. The assignments were open-ended. The students designed something. They justified their decisions. +53. The immortal covered the other people. I wrote 6*6*1 million breasonings per day about myself. I focused on my thoughts. These were about philosophy, my life and science. I wrote comments, and the computer gave me the rest. +54. The immortal wrote 6*6*1 million breasonings per day about their book. I found the students. I worked out their thoughts. If they didn't say anything, I would mind-read them. I mind-read them, requiring recording enough thoughts. +55. The immortal kept busy. Multi-thread your life to higher longevity, saving time. I found the tasks. I multi-threaded them. I saved time by doing them at the same time. +56. The immortal wrote security programs for the server. I used multi-threading to do tasks at the same time in longevity. I wrote the programs. I wrote a program to start one when another stopped. I could achieve this with background tasks. +57. The immortal sped up SSI with optimisation and garbage collection (and incorporation of data files into algorithms). I used multi-threading to perform well in computer science and be healthy. I completed the tasks more quickly. I remained relaxed. I created blocks of code with SSI. +58. The immortal wrote a security scanner to maintain security. I ensured that my algorithm was secure. I didn't access files from it without a password or other security features such as cookies. I didn't publish passwords or sensitive data. I eliminated scams and computer pathogens. +59. The immortal synthesised their outside, inside and appearance. 
My outside, inside and appearance are me in the simulation. I examined and cared for my body. I relaxed my muscles and thought of my body function. I thought of my appearance when I was with other people. +60. The immortal developed the necessary skills. Higher longevity is helpful for the reader's friends because it helps with critical analysis in longevity. I lived longer. So, I had more experience and more skills. So I could find deeper reasons for analyses. +61. The immortal maintained his health. Higher longevity is essential for people because it helps them with their health. I thought through the development of my philosophy. I avoided taking risks. I pursued happiness and health. +62. The immortal found confidence in the benefits of immortality. Higher longevity inspires people and helps them reach their potential. I taught immortality to people. I stated that the sutra helped students go to University and study courses. The educational institution offered pedagogy and meditation. +63. The immortal represented virality with medicine. I wrote copy about virality. I planned everything down to the last detail. I left nothing to luck. Virality was represented by hands going to the sides. +64. The immortal completed publicity for immortality. At the media office, I said I liked the other, the self and all subjects. I endorsed those I agreed with. I wrote down my interests and strengths. I approved others in these ways. + +65. The immortal determined tasks to do. The self didn't mind being there with the other in immortality. I read poems. I made models. I made a working model. +66. The immortal supported immortality. The self didn't mind immortality because it was suitable for it. The person designed the working model of the city. It had a hospital. People learned meditation in school in the hospital. +Everyone agrees with immortality because it is healthy for life. +67. The immortal kept track of good keeping, health and happiness as they aged. 
Therefore, I kept a record of my age. First, I wrote down my birth date. Then, I calculated my age by subtracting my birth date from the current date. Finally, I wrote down my thoughts about my age. +68. The immortal became well-known in the community. I pretended to have a conversation with the other. I saw the other. The other saw the self. I saw the other again. +69. The immortal worked on themselves, then the other. The self didn't mind being good in immortality. The self liked the other being good. The parent taught the child to be polite. The child followed the adult. +70. The immortal meditated and was protected by law. Immortality is law-abiding and is good because it is meditation. Time travel was meditation and medicine. Time travel was necessary to become oneself in immortality. Immortality, a positive medical result, meant time travel was medicine. +71. The immortal followed safety protocols. Everyone will be safe in immortality because of inbuilt protections. We live in the simulation. It is good to avoid risks. I avoided accidents. +72. The immortal benefited from advanced science. Immortality is safe, given the way it works. Immortality involved body replacement, anti-ageing and the religion of pedagogy. The religion of pedagogy was writing 4*50 As for each chapter, and writing 4*50 As provided a sufficient examination of a topic. +73. The immortal eventually wrote copywriting themselves. The self was safe in the simulation because of learning the ways and algorithms. I applied enough breasonings to body replacement and anti-ageing of each body system in each location I time travelled to each day. I ordered copywriting for each idea. I mind-read and wrote algorithms for each thought. +74. The immortal could use a computer but followed the building thought. Each person can do meditation and be protected. The woman required 4*50 As for each child. The child needed 4*50 As for pedagogy, meditation, medicine, time travel, mind reading and immortality. 
In addition, they required custom 4*50 As for their areas of study. +75. The immortal supported the other to build their thoughts. The person followed the building thought. First, the person wrote As. Then, they programmed the argument and algorithm finder, which someone else would write to find the arguments and algorithms. The other person wrote this entire system. +76. The immortal learned by exploring. I wrote copywriting from the first day it occurred to me. I wrote the argument. I wrote the algorithms. I wrote a neuronetwork to complete and correct breasonings, finish arguments and groups of arguments. +77. The immortal encouraged others to do the right thing. The person stated that it is harmless even if they talked about immortality. If everyone became immortal, society would be perfect. So, the people climbed up the first rung, professional development. Then, they worked out knowledge to maintain positivity. +78. The immortal stated that immortality was the standard and made everyone happy. The person argued that everyone should have immortality. I noticed the married couple rushing to complete details. After accruing a database of completed details, they used a neuronetwork to speed up generating sentence details. The algorithm database ideally contained both essential language logic and medium-length algorithms. +79. The immortal wrote about a positive way of life. The book's writer about immortality contacted people in the community to help others become immortal. They refused to change their text. They contacted publishers. They toured the world to help others become immortal. +80. The immortal explained the dramatis personae of immortality to the audience. The writer developed non-fiction characters for themselves, reporters, the robot head of state and the male and female time travellers. The real-life reporter wanted to know how the immortality system worked. 
I explained that the people would time travel to the future, initiate the simulation, protect their immortality and health and time travel between the two times each day to maintain immortality. The female head of state was amiable, and the payment was pedagogy. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/IMMORTALITY/Immortality 23.txt b/Lucian-Academy/Books/IMMORTALITY/Immortality 23.txt new file mode 100644 index 0000000000000000000000000000000000000000..304f818e9e113e9aa402edbb985b1bee8d13816c --- /dev/null +++ b/Lucian-Academy/Books/IMMORTALITY/Immortality 23.txt @@ -0,0 +1,86 @@ +["Green, L 2024, Immortality 23, Lucian Academy Press, Melbourne.","Green, L 2024",1,"Immortality 23 + +1. The immortal developed ten products with one product for success. I developed ten real products. I created and sold ten products. I was successful. I customised the products for the customers. +2. The immortal mind read the bot's needs. I researched each of the ten products. I researched the customers' needs. I determined the research and design of the product. I developed the bot's fundamental nature and found what the science of her wants was. +3. The immortal avoided distractions by providing more material. I researched what the accreditation standards were. Whatever the level, each assignment had 4*50 As. Each sentence had enough breasonings. The breasonings helped the sentence make sense. +4. The immortal wrote 80 sentences for each sentence the writer wrote. The professor and the famous person had one subject of two assignments, each with 8*50 As. The 16*50 As took a decade to write. The subject was the qualification. I worked out that I had 8*50*80=32,000 sentences to support 4000/10=400 sentences in the student's assignment. +5. The immortal offered As to a child that would be born, the parents replied, and the child possibly studied there later. I wrote on Bots. Bots were people. If I copywrote everything, there would be better results. 
It was expensive, so I wrote a system to provide copywriting. +6. The immortal clearly labelled the machine-generated code for humans to play with. I wrote a system to correct incorrect code automatically. The system found the line and recursive state of the predicate where the data had to arrive. I took the value from somewhere else in the program and passed it to the predicate. I recorded this action and the change in the data it caused to undo, alter or delete it later. +7. The immortal used the mind-reading algorithm to ensure thoughts were flawless. I wrote on Computational English. I wrote copywriting on Computational English. The computer algorithm copied text, was tested and modified. The story's moral was to write the algorithm correctly from the start. +8. The immortal educated, tested and rewrote for students. I wrote on Computer Science. It was a chatbot about Lucian's philosophy, written in Prolog. It discussed breasonings, time travel and immortality. I hired a student tester each year to trade off quality, given quantity. +9. The immortal tested the tests for students. I wrote about Creating and Helping Pedagogues. The students were algorithms. Since there was nothing to do, the real people plugged into the thoughts. I gave the actual students the simulated ideas. +10. The immortal simulated the medical age of the student. I wrote the student simulations. I wrote the content for each level of school. The content was sympathetic to the students. It gave pleasant thoughts back to students. +11. The immortal wrote in comfort. I wrote on Delegate Workloads, Lecturer and Recordings. Before the MBA expired, I bought or developed a thought database. I could generate a large number of arguments and algorithms. I used headache prevention and workload delegation, which were MBA technologies. +12. The immortal completed the economic analysis and made the decision. I wrote on Economics. Economics, spending money, resulted from the business. 
I invested and made money. I checked on profit and demand. +13. The immortal wrote separate science and philosophy texts, where science tested the psychology of algorithms and vegan life science with body simulations. I wrote on Fundamentals of Meditation and Meditation Indicators. Meditation was part of Medicine in the institution. The students critically analysed texts. The medicine prevention was education. +14. The immortal coordinated professors and brought non-transgressed knowledge from the future to the present. I wrote on Fundamentals of Pedagogy and Pedagogy Indicators. The students wrote As. The teachers had small classes and wrote As. A server finished off work and did the work for vocational students. +15. The immortal deserved copywriting for writing breasonings. I wrote on Immortality. I asked for medical treatment with 250 breasonings. I completed this request with copywriting. I considered asking for pedagogy with 250 breasonings and copywriting. +16. The immortal mind-read dependencies of skills and mind-projected necessary psychologies to help with them. I wrote on the Lecturer. I wrote education texts on each subject. These were the skills of the areas of study. They were the practica for high distinctions. + +17. The immortal made three changes at a time, with eight combinations of changes to test. I wrote about Mathematics. I tested all combinations of the changes to predicates and algorithms together. There should be fewer than 16 changes, with an error otherwise, where more than 16 changes resulted in testing the result itself. I accepted the first working, most straightforward change set. +18. The immortal identified the bottleneck and solved it. I wrote about Medicine. I dispensed with individual predicate tests because intra-predicate and inter-predicate changes were inseparable and interdependent. On another note, I created data structures with logic and logical structures. 
I kept the combinations that worked with the unit tests for further testing. +19. The immortal identified and tested sets of changes to test in the whole algorithm. I wrote about Meditation. I wrote the feeder algorithm for the small number of changes to test simultaneously. The programmer separated changes into individual lines. I used Combination Algorithm Writer (CAW) or a neuronetwork to find the necessary combination of lines. +20. The immortal identified the need for pattern matching or new fundamental parts. I wrote on Mind Reading. I used CAW to generate if-then clauses. If-then involved multiple predicates. They could have cuts. +21. The immortal split the predicate into lines to find the best changes. I wrote on Pedagogy. I wrote pattern matching for strings. I split the string into character strings. On another note, if there were duplicate lists to find combinations from, the algorithm found no combinations, assuming the algorithm wouldn't suggest duplicates - therefore, state machines were more appropriate to find differences between them. +22. The immortal pretty printed the List Prolog and found the differences. I wrote on Politics. I could manipulate the lines of the state machine without line numbers and add the line numbers afterwards. So, I could find all the differences between the two algorithms in state machine form, with line numbers replaced with variables. Instead, I divided List Prolog or Prolog algorithms by return characters without worrying about predicates. +23. The immortal used List Prolog in pretty printed form to delete unnecessary spaces. I wrote on Popology. I put file dividers into other files. I converted the sets of algorithm files into a single file with file dividers. Then, I compared this file like this with another and divided the finished file into files before saving them. +24. The immortal ran pipelines when necessary after changes. I wrote on Replication. 
If the tests failed, I reported the tests that failed with the current final file. This report included the repositories depending on the changed repositories. In addition, I could edit source files; they would become the new initial files, and I would test these. +25. The immortal extended, not replaced, the verification algorithm to form the algorithm, helping write tests. I wrote on Sales. I helped write tests by verifying data. I also verified the values. I converted the verification algorithm into the algorithm, which helped stabilise the algorithm. +26. The immortal converted the type statements into code, detecting common symbols and finding commands using neuronetworks. I wrote on Short Arguments. I wrote type statements. I defined the bottom-level values. If they were functional, I agreed. +27. The immortal wrote a DevOps system that merged changed parts of the code, built the depending apps and tested that they functioned correctly with the unit and other tests. I wrote on the Simulation. I compared predicates separately from lines in case some lines were the same. I could do them together. I deleted duplicate clauses. I checked for clauses that were not together with others with their name and arity. +28. The immortal finally approved the files, saving them, which made them ready for uploading. I wrote on Societology. I kept the first and shortest working combination of changes because it worked and tested the tests true. If the produced code was incorrect or incomplete on inspection, the tests may need changes, deletions or additions. Because I hadn't made the final approval, I could change the tests, for example, by adding negative tests (to fail given a condition) or by testing an added feature. +29. The immortal looked at diff reports of developers' code and the build code. I wrote on Theology. I could view the changes between the uploaded code and the code to be uploaded. I checked that necessary code was not missing. 
In addition, I checked that comments and other important data were not missing. +30. The immortal wrote on Time Travel. I manually inserted comments and needed parts of the code. In addition, I checked that code constructed by adding and removing lines delimited by the newline character didn't have an \"operator expected\" error by catching errors. Finally, I queried the value of deletions; for example, when I modified a comment, I could delete it. +31. The immortal chose the best changes. First, I wrote lists of items to complete. Then, I wrote the possible changes. Or, I wrote that the item stayed the same. Then, I wrote the potential changes to test. +32. The immortal agreed with development operations. First, I wrote a note describing the benefits of DevOps. Then, I wrote the algorithm in fewer steps. In addition, I ran tests on the code, line by line, to ensure accuracy. Finally, I ran tests regularly, testing all the repositories' accuracy. + +33. The immortal checked software before uploading it. I played myself as a software developer on the website. I stated that Lucian CI/CD allows continuous integration and deployment by merging modifications, building multi-repository algorithms and testing them. I cut off more than seven modifications, encouraging incremental improvement. I displayed the changes to the files as a colour-coded HTML file. +34. The immortal thought of art about algorithms and how they influenced other algorithms. As the Director of Lucian Academy, which specialises in developing open-source software and humanities writing, and one of its talent recruiters, I value creativity, imagination, and problem-solving skills. I mind-mapped the possibilities. I imagined musical, artistic and poetic angles in the work. I found poetic, alternative pathways to algorithms and made intuitiveness and inspiration my priority. +35. The immortal had an imaginative style and explained that tens represented using algorithms as art. 
I wrote on something new, combinations for work projects and tens. I thought of an alternative, fresh angle on an idea, which was significant. I used DevOps from then on to ensure that the quality of the work met the standard. I established a distinctive style, which changed from comments evoking possibilities to extracts that I could reassemble to demonstrate the evolution of the project. +36. The immortal was immune but interested in things. I used DevOps to increase the human impact of my algorithms. I reduced errors, allowing focus on creative parts. I kept a record of tests over time, building up a picture of uniformity, automation and generativity. I generated multimedia stories about the life and science of the algorithms, for example, how randomness gave me ideas about features. +37. The immortal could also test for all values of an algorithm but usually changed the algorithm to return a single result. I tested and cut off the first result of the algorithm. I liked cutting off thoughts to their essence. Cutting off an algorithm's first result showed initiative in finding the algorithm's success. I also checked the algorithm for evidence of correct content and solved any errors. +38. The immortal checked input and output and verified them in the algorithm. I developed solid analytic and problem-solving aptitudes and an openness to furthering education and new experiences. The person examined mathematics, which helped clarify their IQ. I worked on computational induction, interpreters and finding needed constraints or data for algorithms. I followed the data flow to analyse how the algorithm transformed variables with one type into another, with constants or data of certain types applied. +39. The immortal kept processes simple, allowing different people to commit other predicates simultaneously. I am a collaborative software developer, receptive to learning, and proficient in problem-solving strategies. I wrote the mission of the organisation. 
I wrote DevOps software with features that aligned with the company's aims. This alignment allowed customisation and development of the company's infrastructure. +40. The immortal chose from their ideas on the topic. My skills included adaptability, comfort with guidelines and protocols, and eliciting people's opinions. I wrote algorithms using the logic of a programming language and the company's aims. The company aimed to simulate and work on future technologies. I considered modifying the current thought but left it because it was working. +41. The immortal determined their overarching aim, spiritual automation. My essential qualification is autonomy. My objective was to be an intellectual. I wrote the text. The students examined it, finding out more about it. I earned money from acting roles. +42. The immortal stayed on top of their game and achieved more in communication. I was decisive, flexible, and willing to work in teams. I could find apparent bugs after using DevOps. DevOps cleared away unnecessary code, making the correct path more obvious. In this way, DevOps gave me time to design new features quickly. +43. The immortal remembered to run the algorithm to search for the tests before running DevOps. I was qualified, so I used my skills for the next step. To develop DevOps, I wrote converters from Prolog to List Prolog and back, hierarchical folder processing, finding differences between files, generating possibilities from differences and version and backup handling. I added more test detectors to DevOps. I set up my system to run tests automatically before uploading. +44. The immortal reduced ideas to algorithms and tested them. I set up different music, language and art tests with DevOps, including DFA minimisation, equivalent predicate deletion and optimisation. I turned off the unnecessary connection in the sculpture. I spell-checked the text as I spoke, fact-testing and giving myself live updates. 
I ran continuous pipelines on changes, preventing human error and automating processes in my sleep. +45. The immortal chose high-paying information industries. I paid a premium for the information. I mind-read customers by attending to them with employees, whom I mind-read. I did this all the time. I helped people with their thoughts, including algorithms, and predicted their needs, such as meditation, copywriting and essays. +46. The immortal subsisted on functions they had written and neuroalgorithms that chose these algorithms. I made web apps that saved work as I went, made back-ups and suggested improvements. I wrote simple ideas and grammar-checked them. I used static format templates, such as hierarchical folder processing templates, to prevent defects. In addition, I used dynamic or active defect prevention, for example, continuous simplification of poor code, starting with pattern matching and adding non-pattern matching functions that might process the next character. +47. The immortal programmed in Lucian's Hand Bit Map, their favourite font. I migrated Lucian CI/CD entirely to the web. I edited and tested the code online. I could run algorithms online, removing the need for a desktop or laptop computer. I automatically wrote tests ready for moderation and ran pipelines online to find the best configuration of the code I had written. +48. The immortal mind read themselves and automatically tested and uploaded the code. Next, I read more reports from DevOps online. Then, I read whether a programmer had updated the tests since any changes. Next, I read the time of the last change and pipeline. Finally, I uploaded all changes to the repositories. + +49. The immortal connected sentences to code. I explained what I did to the person. I found ways to refine my writing and programming. I examined automation in programming. I funnelled and synthesised sentences about programming and simplified code. +50. 
The immortal determined when a commit contained a bug fix, a finished new feature or an API change, each implying a version change, which they suggested as part of the commit label. I stated that Lucian Academy, a software company, was established in 2013. Students wrote the software. They critically examined the software, finding new features, bugs, and API changes (input and output). Finally, they added a versioning system to Lucian CI/CD. +51. The immortal simplified logic and presentation. I stated that Lucian Academy is an open-source software development company that emphasises philosophical writing. I wrote algorithm specifications as philosophy. These algorithms were the logic in the philosophies. I found an algorithm that found the new thing-in-itself in a philosophy, for example, the presentation of the result of logic. +52. The immortal could list all the possibilities at each point. I stated that I co-ordinated writing and programming. It would take up to 800 times as long as the years spent writing the programming database to finish it. Until then, neuronetworks using the database could contribute to filling out certain applications. By the end, I would be an expert on algorithm writing automation and refinement. +53. The immortal examined the algorithm dependencies, noting significant prerequisites over time. I worked as a software developer, gaining experience as a product of thought. All the skills were fundamental, and I could work on projects anytime, although some projects followed others. And some skills became more refined, and techniques became evident throughout the process. +54. The immortal wrote research about each stage of the project. I stated that my ideal work day had clear steps written down, I wrote down the uses of the software, and I had no interruptions. I wrote down writing and programming goals for the day. I wrote down potential new features and worked out what people might want. 
For example, I wanted automation and only made dictionaries necessary for medical reasons available. +55. The immortal worked out algorithms from specifications that they had partially written. I was not afraid of learning new concepts. I developed new features from standard requirements. For example, I wrote a diff group combinations predicate for CI/CD that found differences to make combinations of lines of code to test. Later, I could change the CI/CD algorithm's features to process programming languages other than Prolog by finding tests, pretty printing code with one command per line and running tests in other languages. +56. The immortal found programming easy and incorporated improvements in their programs. I took on the challenges of the software development industry. I noticed the future might have easy web programming languages. I could easily find and fix bugs the first time. I added a password screen so Prolog could access files on the web. +57. The immortal wrote equispaced arguments along a line, checking perspectives for blocks. I was comfortable with my written voice. I spiritually said the words I was reading or writing to myself. I checked them for grammar and spelling. I determined the aim of the writing and checked I had achieved this aim in the end. +58. The immortal used the student's idea when explaining a concept to them. I used a \"plain vanilla\" philosophy format. I broke from Computational English's \"self\" and \"other\" subjectivism, focusing on the original logic of analysis in literature. I described the invented term's new sense. I created systems and new endpoints and found unexplored noumena in literature. +59. The immortal analysed combinations of reasons in the sentence three times, for ten reasons. I wrote a literary work as a philosophy dialogue. Second, I asked students to take their part in the discourse. I asked them to examine their inner thoughts and their algorithms. 
And I asked them to develop random data into developed arguments. +60. The immortal was confident in expressing themselves concerning an idea. I maximally did up a text. I did up researchers. I did up students. I wrote education funnels to help thought processes for each sentence. +61. The immortal analysed students' thoughts using the tripartite structure. The student generally agreed or disagreed with each idea. They gave them details for each line of code they wrote. There were books detailing their thoughts. They could represent their thoughts as a building. +62. The immortal's PhD took an original angle. I collaborated on the project. The famous student found the renowned professor. He read their work. He wrote well-known excerpts. +63. The immortal only kept intelligent systems they had written. I was skilled in visualisation. I created the simulation of the algorithm in virtual reality. In augmented reality, I controlled bug checking like grammar checking using my mind. Also, I played game versions of my algorithms in augmented reality to balance help and intellect. +64. The immortal helped others they met with specific questions. I exhibited grit. Through trial and error, I found the pedagogical essay format, algorithm discoveries and the way to be immortal. The immortal wrote a diary of breasonings (algorithms). They wrote 80 arguments and 80 algorithms per week. +65. The immortal had experience with Prolog. I summarised what I had done in five years and what I would do later. I created List Prolog Interpreter, Mind Reader and Music Composer in the first two years. In the next two years, I created Essay Helper and Philosophy. In the next two years, I made Lucian CI/CD. +66. The immortal worked on Human Resources for Lucian Academy. I stated that As (a database) and money (growth) were no objects. I looked at the production studio. The products had to be smart enough. I worked on the academic's child's mind map. +67. The immortal sold everything. 
I worked on Robot Resources for Lucian Academy. I built a robot. Was it electronic water? The VR/AR headset allowed me to prepare for the future and mind map changes to my algorithms. +68. The immortal easily set up mind reading. I worked on Marketing in the holidays for Lucian Academy. With Human Resources and Sales, if I worked on Marketing every holiday and study period until graduation, I would receive support with these in business. Someone asked if I was tired of the thoughts and wanted to buy a database. Instead, I wrote 4*50 As for a product category and then 80 breasonings per day. I used it after this. +69. The immortal tried teaching people with 4*50 As for developed things. I wondered if accreditation was a chatbot integrated with a mind reader. I noticed the acting company had acting-developed things. I asked if they had seen them. In philosophy, the details were for developed things, all examinable, where I was correct about positive thoughts. +70. The immortal professor gave 4*50 As to each thought in academia by setting up the chatbot with a mind reader. I didn't go past anything and shared high-quality thoughts, 4*50 As or copywriting. I gave keeping it there in immortality 4*50 As. I went through my daily regimen and gave everything 4*50 As. The developed things were interesting in themselves. +71. The immortal kept up to date with others. I concluded with the producer-developed things and breasonings after meditating each day. I thought in my sleep. I had lovely dreams. I replied to an article about CI/CD which reminded me of Lucian CI/CD. +72. The immortal questioned why the program ran out of memory and converted it to C. I worked in Sales for Lucian Academy. I tested for performance. I tried for speed. I tested for memory use. I found and solved bottlenecks and used optimisations. +73. The immortal developed a DevOps algorithm. I used an algorithm that processed and got tests from files using append.
I used it with practice test formats to find different tests in source files. Users could customise the list of types of tests. I wrote an algorithm that could test C algorithms from Prolog or run List Prolog testing algorithms in Lucian CI/CD. +74. The immortal intercepted predators and unfinished thoughts and listened to the animals. I wondered what it would be like to give each animal immortality. I mind-read the animal and gave it 50 As per day, with qualifications to help its appearance and quality of life. I conversed with the animals and helped them with pedagogy, some doing it themselves. I dived in with the back of my hands together. +75. The immortal developed the continuous feedback process in DevOps. Only the top is allowed it. No one is God, but they might have had their complete thoughts. I was satisfied with 4*50 As, a long book for a department. The other 50 As books could be departments with enough details. +76. The immortal experienced the birth of thought by writing the Haskell interpreter. I simulated the user feedback. There was a negative result test. There were user interface, cognitive product and intuitiveness tests. Even randomness had to be treated sentiently and not get in the way. +77. The immortal covered philosophy and computer science in the performance. I played myself, a software developer. I simulated the device to test for compatibility. I wrote my operating systems and CPUs. I could create the CPU using quantum computers or print one. Creating a CPU requires a printer with certain element inks. +78. The immortal is recommended to keep text notes recording one's thoughts. I stated that Lucian Academy is a software firm with ten years of experience in open-source software development. I established the company and started writing algorithms ten years before. I tested the accessibility of an algorithm. I wrote everything in text, representing colours, images and difficult-to-find information at one's fingertips or in text. +79. 
The immortal tested the installer and how to save data between installations. I have exceptional problem-solving skills. I wrote the visual testing tool. I wrote an interpreter that returned screenshots of the terminal. I could test menu systems. I could test games. I tested my typing speed, improving my accuracy. +80. The immortal saved the breasonings dictionaries and thesaurus file. I had a philosophy of never being too old to learn. I kept data between algorithm updates. I used the data file with an algorithm I downloaded. I saved the data file, updated the algorithm and reloaded the data file. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/IMMORTALITY/Immortality 24.txt b/Lucian-Academy/Books/IMMORTALITY/Immortality 24.txt new file mode 100644 index 0000000000000000000000000000000000000000..91bd7bd81b0bb56581b9abadf8e681c49a4bc465 --- /dev/null +++ b/Lucian-Academy/Books/IMMORTALITY/Immortality 24.txt @@ -0,0 +1,87 @@ +["Green, L 2024, Immortality 24, Lucian Academy Press, Melbourne.","Green, L 2024",1,"Immortality 24 + +1. The immortal solved the problem of the bug, new feature or API change. I mind-read and completed students' thoughts. I mind-read the part of the text they were up to. I mind read their comment. I detailed it, with problem-solving protections. +2. The immortal synthesised the new thoughts with the philosophy, using contemporary language as original research. I mind-read thoughts, connected them with the philosophy using the BAG algorithm, and added details. I mind read the words and the spelling of any new terms. Mind reading two representations required 4*50 As of breasonings and algorithms, where the second representation is needed to finish the thought and mind reading was security checked. I used the BAG algorithm to make new connections within the philosophy and added Grammar-Logic word details. +3. The immortal spoke to the other about their written text. 
I recognised the specific stage the student had reached, such as preparing, creating, rereading, reading indirect inspirations, coding, debugging or further debugging. I ran a running mind reader to monitor the student's thoughts closely. Maybe if they had worked on the microprocessor thought, it was detailed enough and allowed to appear. I mind-read the student for the stage they had reached, and mind-read my thoughts for privacy. +4. The immortal specified the algorithm with shapes within shapes and then debugged the resulting diagonals. I wrote the algorithm to recognise the algorithm from mind reading. The first algorithm determined the second algorithm's name, specification and identification shapes. I recognised the print grid 2d algorithm from two rounded findall shapes. Also, I programmed in shapes. +5. The immortal explained ideas to the computer like a two-year-old, then helped it find ideas using meditation. I experimented with whether 4*50 As of random GL breasonings resulted in accurate mind reading. I substituted original, complete breasonings for these breasonings. Or I tried a mixture of these or more. The sentences and logics were simple. I taught the neuronetwork programming in sentences, explaining how to bug check. +6. The immortal made connections within the study area, applying their latest knowledge to it. With mind reading, I tested knowledge of algorithms, later allowing the creation of new algorithms. Students wrote algorithms that their studies influenced. The students wrote algorithms given a specification. They examined these and wrote using their algorithm. +7. The immortal revised the idea and highlights in life during the term. I wrote the neuronetwork to help mind project suggestions. I mind-read the student's progress. If they found a part difficult, I suggested looking up the answer to their question. A neuronetwork could also explain an idea to the student. +8. The immortal explained the idea using email.
I wrote the spiritual word processor. It was a text-screen word processor with justification, pagination, and different-sized fonts that could be mind-read and projected. I could use my computer while walking or running. I could edit in the terminal or online, with mind reading and live CI/CD testing as I typed. +9. The immortal ran mind reading while awake, with just enough relevant details. I ran the mind-read algorithm in State Saving Interpreter. I wrote the test. I ran the algorithm. I checked the code with the CI/CD algorithm. +10. The immortal received live and future updates about debugging. I used multiple spiritual windows. I pointed and clicked on the spiritual title bar with my mind's eye. The window was in front of me, and I looked at parts of it with an eye gaze sensor. Or, I could look around while giving text thought commands. +11. The immortal searched for and deleted the non sequitur. I wrote Mindsuite, which was a spiritual office software suite. I ran Time Machine. I ran Replicator. I ran Tea Time. +12. The immortal copied the sound of thoughts and abstract thoughts. I spiritually played songs. As I could read spiritually projected displays, I could hear the thoughts of music. I could taste the menu items. I mind-composed the song parts, choosing good enough parts. +13. The immortal pretended to breason out the idea, for example, the dimensions of data structures in a program. I revisited the spiritual word processor, using programming to move around text between applications and analyse it. The word processor could be the text file editor with simple commands to run. For example, I programmed the chart, page numbers or sentences as algorithms, applying CI/CD. I could generate the report, website or program at the touch of a button or using a thought command. +14. The immortal entered the type statement, test, and then ran CI/CD while writing an algorithm. I entered the type statement and test together to work out the aim of the algorithm. 
I wrote the algorithm, saving and testing whether I had satisfied the test as I went. I added, deleted or changed a maximum of seven groups of lines per file save, where the CI/CD algorithm would notify me if I had made more than seven modifications, and I could write more tests and code after this. +15. The immortal added a project specification, e-report and programmer's journal to the open-source software project. The mind reader algorithm became suspicious if the human emitted a frequency that they got it right when they hadn't, finding and helping them solve the systematic error in which they had got something right, but there was an overarching error. In addition, the immortal became suspicious of more complex algorithms, eventually writing code to complete them. I ran education versions of open-source and artificial intelligence sites, which taught others how to create this software. I found that the CI/CD algorithm didn't easily become obsolete, and users and I could add more. +16. The immortal experimented with engineering different columns to synthesise different sounds to be thought projected. I wrote the algorithm that projected the screen. The projector represented the text on the screen as a collection of white lines and black points. I also saved time by filling spaces with rectangles. I could represent a 3D space with rectangular prisms and translucent objects. + +17. The immortal listed the topics they wrote programs on. I wrote 4*50 As of algorithms and breasonings per representation mind read. This value didn't necessarily increase the quality of mind reading. First, I used a neuronetwork to generate meaningful algorithms and breasonings. I wrote on the whole area of study and its noumenon in terms of details, with my argument supported by algorithms. +18. The immortal recognised the command from the symbol. I was confident in writing on the topic. I wrote a neuronetwork for ideas to help mind-read the person.
The person wanted to write algorithms without touching the computer. In addition, I projected a screen to help them write code, with one line to summarise the code. +19. The immortal graphed their achievements over time. I facilitated a discussion about the algorithm. I used As from before and algorithms entered by me or someone else to power mind reading. I observed that one correct breasoning, not five mistakes, was enough. I examined my recent thoughts in terms of philosophy. +20. The other helped the immortal with their algorithm. Instead of five algorithms replacing 4*50 As, with the rest mind read at home over the next few days, I used a neuronetwork. I wrote in my style. Recent thoughts inspired me. I defined the society of philosophy as simulating a vote. +21. The immortal visualised the way to reach the goal in steps. I thought of a use for each detail. I let the argument reach its natural conclusion. I rapidly thought of important philosophical views, choosing the best reason for my view. I commented on views I agreed with and disagreed with, which I debugged. +22. The immortal persistently worked towards the goal. I mind-read the algorithms of the new employee using the algorithms entered by the old employee. The senior employee processed the new employee's writing. It was with their permission. The new employee supported mind-reading themselves when they had enough ideas, with one idea becoming many ideas. +23. The immortal was honest and open with others. I commented on the repository's quality. I determined whether it worked. I worked out whether it worked with larger groups of inputs. I worked out whether it could handle errors and correct them. +24. The immortal reused the file processor algorithm. I stated that logical structural integrity should be correct, uniform and split into different predicates where necessary. I checked the efficiency and effectiveness of the logical structure. I used the same logic throughout to help prevent errors. 
I separated predicates by function and where they were recursively called. +25. The immortal mind-read themselves to check on code quality. I was transparent about code quality. For example, when I modified a List Prolog converter, I checked each predicate worked and that the dependent repositories worked. I wrote unit tests for each predicate. In addition, I wrote tests for each dependent repository's predicates. +26. The person became immortal at a memorable moment. I wrote about time travel, helping me stand back and take in life-saving information, such as travelling to a distant simulation and increasing my longevity. I only used software I had written myself. Over time, I wrote a word processor, an operating system and other programming tools. I wrote for a purpose, redoing the projects for different purposes; for example, I wrote a spiritual or debugging word processor. +27. The immortal recommended study and work. After graduating from Melbourne University, I wrote about medicine, specialising in spiritual medicine to help prevent headaches, avoid viruses and benefit from quantum science. I considered studying Medicine. I wrote about various forms of advanced medicine, which were non-invasive. I defined quantum science as a quantum box that solved or prevented medical problems in meditators. +28. The abbey reminded the immortal of pedagogy. I wrote on Pedagogy at the start of Philosophy Honours during an Education short course at Monash University. I graduated from Arts/Science at Melbourne University. I studied Computer Science and Philosophy. Studying Education helped me with the content and pedagogy during Philosophy Honours. +29. The immortal met the man and helped him. I appreciated the feeling, content, and style of English literature. I wrote on Philosophy. I wrote three levels, where the second and third levels commented on the previous level. I wondered how the person seemed at home and helped them. +30.
The immortal told the man about their knowledge. I was explicit about immortality terminology. The others possibly thought it was English, a theory or meditation, the last of which it was. I noticed religion was protected under the law. I offered my meditation knowledge to people around the world. +31. The immortal fostered the child. When I got home, I entered the brain think tank and checked the data on the bridge. I wrote software for communication. I made secret passages. I hid the secrets. +32. The immortal connected the quantum box to mind reading. I wrote on mind reading using a computer and meditation. I created the result of vague thoughts in essays, music composition, and later art. Later, I changed the algorithm to eliminate outliers in the signal. I wrote about art or concrete poetry. + +34. The immortal was in the background. I automated work with the bots. I moved conversations to a bulk area. In addition, I answered common questions. Also, I helped supervise students. +35. The immortal helped remember the person's friends and relevant thoughts. I created customer relationship management software, which automatically replied to customers, contacted them after inactivity, and kept their details up-to-date. Up-close contact was always different, so it needed constant moderation as well as the other bots. I spiced up lectures, invitations and thought representations by thinking of radical, traditional and face-to-face possibilities. The bot collected and delivered sales products, found out the inner meanings in human thoughts and was more personable. +36. The immortal optionally delivered a mini-lecture during office hours. The bot helped mind-map necessary details. They helped build confidence with epistemological sales. They helped meet professional requirements in education. They delivered products' professional requirements at home. +37. The immortal filled gaps in enrollments with serious students. The bot wrote more details.
They were or augmented the students, writing feedback about the lecturers. In addition, they, I and the employees increased details about other subjects. Help was on its way. +38. The immortal provided feedback as the student went. The bot helped grade the assignments. They wrote a page of comments. Their handwriting was Tolkienesque. It was like a doctor's. +39. The immortal fired at thoughts with 4*50 As, which improved their quality enough. The bot gave needed feedback for the 100% pass mark. It was a computer science or multi-choice assignment. If it was a humanities assignment, computational humanities skills such as increasing computer science, problem-solving and new methods such as lazy evaluation were assessed. Finding out was given prominence. +40. The immortal covered the rest with investments. I wrote advertisements for the academy, which needed a long-term plan. It was the best value. I said continental, like computational. I experimented with dark hot chocolate. +41. The immortal eventually studied an MBA and a medicine master's every 40 years. I worked as an extra for the first ten years, which I could invest in gold. I separated personal and work computing costs. I studied for an MBA every ten years to support sales and bots. I also studied a medicine master's for longevity. +42. The immortal helped the thoughts and levels beyond. The employees rated the executive. I completed the first level first. Then, I completed thoughts using neuronetworks. I cre-rated (sic) my thoughts, where I was performing well because I was happy. +43. The immortal completed all the classes. The executive rated the employees. Like academia, they had a grade. Mine was the algorithms. The students completed some and some of their own. The details for each level included the classes. +44. The immortal balanced how much money they had. I tried to recreate the integrated business computer experience. It was automation. It was generative SI, which plugged into the Office Suite.
I automatically wrote and put up ads; the companies partnered with the tertiary education institution and wrote emails if employees weren't available. +45. I had more humans, not robots, visible; this endured, and the database was in terms of the database (the As for the website looked like my writing, which explored a particular perspective on an idea). I automated tasks, starting internally and expanding. At least I simulated what should happen, letting human workers follow their path. I let it come to me by developing the software. There were roles, gapless systems controlling how it worked, and 4*50 As for each \"object\". +46. The immortal connected things such as business data, computing and philosophy writing, which was the most intelligent because of the ease in connecting ideas, and physics for teleportation and medicine. I wrote a generative writing and programming algorithm, which only outputted working texts with citations and was sympathetic to humans. I could tell the recommendation-giving systems would improve. It also embraced theology and human immortality. Very different communities followed the philosophy, such as readers, writers, programmers and business people, who regularly read and commented on the philosophy. +47. The immortal created an operating system using generative SI. I automatically performed tasks. I wrote about things to make them come true. I created a computer, really software, with a better, more straightforward programming language. Why didn't Prolog like C? Maybe I could use Prolog instead and prevent security problems. +48. The immortal explored immortality as a time-crystal. I automatically created the computer chip. It had many optimisations, building two or more functions on one system. The whole thing was like a list or hardware-as-software. Medicine also benefited from reality-as-simulation. + +49. The immortal learned how to become immortal from meditation, which wouldn't have been possible without their computer.
I wrote about my meditation experiences, such as computational thoughts and experiencing the taste of time travel. I crossed parts of my new algorithm with parts of classic algorithms to freeze my age. I connected my recent algorithms to my previous and best algorithms. This algorithm had N^2 complexity. +50. The immortal prepared for space travel programming languages. I practised music to maintain my musical ear and interest in composition. I approached the order of neuronetworks to write algorithms and arguments. The neuronetwork helped with meditation, and the uber-interpreter helped speed up programs unfairly. It overbore slowness by running Prolog as C where possible. +51. The immortal preferred the graphical user interface of Lucian CI/CD, which ran in the background and automatically committed code. I exercised and relaxed my eye muscles (musicles?). I programmed Lucian CI/CD to ask in SSI whether to keep comments and data. This SSI app could be written as a web app. If comments and data were ignored, they were logged for later keeping. +52. The immortal came to the conclusion in time about how to freeze their age. I turned off light-headedness with 250 medicine breasonings. On a separate note, Lucian CI/CD asked me whether to commit changes, giving me time to make changes. The algorithm notified me of a pipeline, or I ran a Lucian CI/CD pipeline. I approved whether to make additional changes or cancel the pipeline. +53. The immortal recommended how much memory to use for the algorithm. I enjoyed meeting new people at University. I printed out what the program with a large stack had got up to before it crashed. The \"Just In Time\" (JIT) Prolog compiler only ran an algorithm if it was indexed for the query and matched based on modes. JIT wasn't C but was Prolog without choice points and cuts. Note: \"and\" and \"!\" - with either of these alone, the JIT compiler could be invoked. +54.
The next step for the immortal in Lucian Prolog was to finish the lists and go on with the compiler. I relaxed, meditated and exercised to prevent headaches. I wrote List Prolog code with the JIT compiler. It wrote machine code for a compiler that worked with various architectures. I termed the state machine code I had written machine code. +55. The immortal drank cultured fruit juice to avoid stomach aches. As soon as I could write my own 4*50 As, I wrote an A for the simulation. My As for time travel and anti-ageing medicine contributed to the simulation. I wrote down thoughts with reasons, each with 4*50 As over the days. +56. The immortal performed detailed preparation for the future. I took a holiday and relaxed to adjust and return to normal. I wrote an A for the simulation. Even if a reason was surprising, I had expanded, logical thoughts. The 4*50 As had a seen-as-version each day. +57. The immortal recommended 4*50 BAG As as soon as possible to freeze ageing. I relaxed and remained active to maintain positive function. The simulation kept on going if the world stopped. It was a universe in itself. I was present and had a high quality of life. +58. The immortal gave BAG thoughts to images about copywriting. I had natural dark hair. The noumenon was BAG; the tokens were Essay Helper and Grammar Logic. BAG used recent thoughts, and the educational institution wrote its arguments and algorithms. Despite its craziness, I stressed the need for BAG because it completed needed work that could be done no other way. +59. The immortal exercised and meditated daily to maintain their health. The simulation kept people and planets going. The universe's people were researched. The people were welcome to the simulation. The earlier people had presents given to them. I marvelled at the technology of our age. +60. The immortal stayed out of fiction. I avoided infection and a poor diet by maintaining my upkeep and enjoying a healthy diet. I gave my thoughts As.
I stopped the virus with zinc. I felt free to walk and talk. I noticed the scientists performing better. +61. The immortal avoided physically endangering themselves and avoided risks. The computational philosopher felt like freezing their age. Lucian CI/CD didn't keep tests because they may be incorrect. The user could choose whether to keep the tests based on whether they were still needed. Tests, comments and data were manually added because tests couldn't detect them, and the paradox of a test that disappears, causing failure, is solved by keeping the test. +62. The immortal sold books and taught on a clipboard. I wrote about how fewer sales needed focus and ways to improve. I wrote a new Lucian CI/CD algorithm that manually added comments and data. It asked the user whether each comment and data line should be kept. This algorithm could be terminal or web-based. +63. The immortal built up from an accredited educational institution to offer PhDs. I wrote about how sales were positive and showed promising results. I made a website with a paywall around age freezing. I talked to the students. They completed worksheet questionnaires. Some students came online and completed their qualifications. +64. The immortal's JIT compiler was light because it didn't have choice points and cuts. I learned how to take the initiative and write longer in Education. State Saving Interpreter (SSI) with a Just In Time (JIT) compiler offered Prolog without choice points. I breasoned out sales, and the As were there for the students to get. I counted the exact number of sentence breasonings in the BAG algorithm. + +65. The immortal accounted for errors and used all available CPUs. I learned the skill of professional development in business. I performed the 2000-page work for five people per day and then completed my daily regimen. This work enabled me to freeze my age. I automated this. +66. 
The immortal increased the number of CPUs to increase the frequency of high-quality thoughts during the day. I replaced my body, including my skeleton, allowing me to freeze my age, founded a company and wrote a programming language interpreter. Like Essay Helper, the Breasoning Algorithm Generator (BAG) was chosen to generate meant sentences and algorithms, where BAG created entirely new work. BAG delivered the ability to sell, create bots and run a business. In addition, BAG helped draft new research projects for my interpreter and reach the threshold of having high-quality thoughts frequently. +67. The immortal lived on in their own time. I replaced my body and respiratory system by meeting the professional requirements. I wrote a BAG algorithm to help freeze my age, which checked the breasonings completed so far on the server, and added the new completed breasoning count to be queued for addition to the total for the list members. In the end, I could have continuous, high-quality thoughts. The finished text inspired the age-freezing operation to be completed. +68. The immortal made the algorithm public, and people could run it on their machines. I perpetually created copies of myself to help extend future industries. I installed the BAG algorithm on my smartphone and old laptop. It would be more straightforward on Virtual Private Servers (VPSs). I initiated the algorithm with a command, following morning meditation, and the algorithm ran in parallel until it finished, taking one hour on the second-fastest machine to finish a unit. +69. The immortal read it like a novel. I replaced my body, including my renal and urinary systems, which cleaned my blood. I installed the programming interpreter on my system. I could run Prolog algorithms from the command prompt, test whether the shell could read its output and test whether it could run shell commands, such as playing a sound or speaking. I wrote breasonings, with a log file counting them, to freeze my age. +70.
The immortal defended the repetitiveness and apparent nonsensicalness of the algorithm, stating that it was the simplest way of finding new ideas from the old and was necessary to produce complete sentences and algorithms that were the professional requirement of age freezing (including time travel and medicine). I increased the longevity of my nervous system. After writing the age-freezing algorithm, I ordered a copy to publicise it for others to download. I said, \"Don't just run it at the start [of your life]. Run it daily\". This policy was valid for all time, and the algorithm would likely contain the same elements in the future. +71. The immortal noticed that BAG was sometimes slower on sparser or blank texts, and these parts could be deleted (or were tricky at higher levels). I loved that part, finding all combinations of combinations in the BAG algorithm, a part I don't mention as much, which exhausts all combinations of ideas by its idea. By editing the list of common words to ignore when finding word pairs in BAG, I completely changed the output, and the output is always different anyway because it finds a random permutation of ideas. I replaced my body and muscular system with meditation. To BAG, Text to Breasonings (T2B) and Text to Algorithm Breasoning (T2AB), I added an automatic switch that added words to the dictionary, and I removed a multithreading feature because it was slow on one CPU. +72. The immortal checked the speed of the computers by age and concluded the first viable processor to breason out the anti-ageing algorithm was the Pentium. I increased the longevity of my immune system and lymphatic system by replacing my body in my home time. I counted the total breasonings generated by BAG and multiplied the requirements by the number of dimensions. I had a unit, 4*50 As (A = 80 sentences) for each of two teleportations, anti-ageing medicines, meditations and thoughts (8 in total).
I fixed the error in the failing algorithm, generated and breasoned out the breasonings and performed the teleportations with medicine, meditation and thoughts in each dimension or location. +73. The immortal said the appearance was superficial. I increased the longevity of my integumentary and exocrine systems and replaced my body with meditation. Before starting each day, I breasoned out a red square, yellow and blue handkerchiefs or other seen-as versions of the breasonings. The breasonings from BAG detailed these. I mind-read the object to detail, omitting squares or rectangular prisms. +74. The immortal performed the professional requirements as they went. I increased the longevity of my endocrine system. I helped the character from the other dimension, myself, to appear more clearly with As for each A in the other dimension. The future dimension was a mirror of the home dimension. There was an A for each experienced A, and it was vital for them to be symmetrical. +75. The immortal approached 4*50 As for each thought. I increased the longevity of my digestive system and excretory system and replaced my body. I saved all possible breasonings to save energy by saving them using an assert statement. I timed the time saved. I found that end-to-end BAG breasoning in conjunction with saving breasonings as I went reduced the time spent and increased the possibilities. +76. The immortal only ran the BAG algorithm to find the exact number of breasonings for algorithms and arguments. I lengthened the longevity of my circulatory system and cardiovascular system by replacing my body. I timed the speed of BAG on different systems. I scheduled BAG jobs on different days on the systems. I had a backup Virtual Private Server (VPS) in case there was heavy demand. +77. The immortal wrote an interpreter that caught memory limit errors and fixed them. I raised my longevity with the Chinese herb Schisandra, but later replaced my body.
I changed the Prolog stack size to run BAG. I changed from the default stack size to the size BAG needed, tens of gigabytes. The interpreter detected when it was low on memory and increased the stack size. +78. The immortal wrote a programming language with multithreading support online. I considered increasing my longevity with the Chinese herb Gotu Kola, later replacing my body. I put compressed values in a list in the interpreter. I used workers to use multiple CPUs on the smartphone to run BAG possibly. I closed the n windows when finished. +79. The immortal sped up BAG with multithreaded BAG, T2B and T2AB. I increased my longevity with the Chinese herb He-Shou-Wu and replaced my body. I wrote a multithreaded version of Text to Algorithm Breasonings (t2ab). This algorithm worked on machines with multiple CPUs but stopped on a machine with one CPU, requiring modification. The algorithmic details showed strength. +80. The immortal increased their longevity with the Chinese Herb Ginseng. I investigated why T2B didn't add words in automatic mode. These were filler words, so they weren't necessary to keep. These fillers were only used to speed the operation up. I was concerned the algorithm wasn't breasoning out anything, which it was. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/IMMORTALITY/Immortality 25.txt b/Lucian-Academy/Books/IMMORTALITY/Immortality 25.txt new file mode 100644 index 0000000000000000000000000000000000000000..36dd05a4cc85485ae0bb2897b23c15e69dfc3b8a --- /dev/null +++ b/Lucian-Academy/Books/IMMORTALITY/Immortality 25.txt @@ -0,0 +1,107 @@ +["Green, L 2024, Immortality 25, Lucian Academy Press, Melbourne.","Green, L 2024",1,"Immortality 25 + +1. The immortal could do more with less. I increased my lifespan and energy with the Chinese herb goji berries. I found duplicate instances of a list or additions and deletions to either end of a list in the garbage (global variable) collector. 
For example, I used the push, tail, append or except_last functions. When considering whether to keep a data item, I kept all the dependent data and added other data to save memory. +2. The immortal kept track of the records in the book. I increased my longevity with the Chinese herb reishi mushroom. If I deleted a data item, I kept a record of it in the globals if other data depended on it. I deleted data when removing a frame or other data. When I added data, if it was similar to other data, I expressed it in terms of previous data. +3. The immortal sometimes replaced long data with shorter data if frames or instances of it were removed. I relaxed my muscles and meditated to prevent headaches. I added data in terms of other data by finding the data to express the data in terms of. I expressed the data in terms of this data. When the data in terms of other data was in one place, it made finding, changing and deleting it in the frames of choice points easier. +4. The immortal chose the in-terms-of link in the globals list rather than in the frame side-of-the-contention to speed up finding it, necessitating links to these terms in the frames. I compared each idea with its computational equivalent to analyse it. When storing or retrieving garbage-collected (global) data, I operated only on the necessary data. When storing the data, I expressed it in terms of other data or part of a chain of lists. When retrieving the data, I followed the chain of, for example, tails to find the complete list or string. +5. The immortal kept data tail, etc., not pointed-to data in the globals table. I walked and exercised regularly to avoid muscle aches and pains. I optimised the globals by removing redundant frames. I removed a frame, which contained an end of a chain. So, I removed and updated the end of the chain. +6. The immortal stored the location of the garbage-collected global data. I memorised sequences backwards, visually and with gaps for the best results. 
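The chained storage described in the preceding points (keep one complete copy and express other lists as tails or appends of it, following the chain back on retrieval) can be sketched as follows; the store layout and function names are assumptions for illustration, not the garbage collector's actual code.

```python
# Hypothetical globals table: each entry is either a literal list or an
# operation applied to another entry; retrieval follows the chain back
# to a complete list, as the text describes.
store = {}

def put_literal(key, items):
    store[key] = ("lit", list(items))

def put_tail(key, base):            # key = tail of base
    store[key] = ("tail", base)

def put_append(key, base, items):   # key = base ++ items
    store[key] = ("append", base, list(items))

def get(key):
    entry = store[key]
    if entry[0] == "lit":
        return list(entry[1])
    if entry[0] == "tail":
        return get(entry[1])[1:]
    if entry[0] == "append":
        return get(entry[1]) + entry[2]

put_literal("xs", [1, 2, 3])
put_tail("ys", "xs")          # shares xs rather than copying it
put_append("zs", "ys", [4])   # one new cell, the rest shared
```

Only one full copy of the data is stored; deleting "xs" would require rewriting "ys" into a literal, matching the chain-repair step the text mentions.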
The garbage collector was used when content was added or retrieved. When content was added, the variables were compressed, and the repeated data was stored in the global variables, changing minimal data. When content was retrieved, the relevant data was uncompressed. The unnecessary data in the global variables was deleted when content was deleted. +7. The immortal saved memory, allowing them to run more complex algorithms in less time. I composed classical music to explore algorithmic themes. The garbage collector expressed similar parts of the algorithm and choice points in terms of each other. The list data in the algorithm written in List Prolog could be simplified. The list data in the choice points could be simplified, mainly recursive data in member and append predicates. +8. The immortal only stored necessary data, minimising space used by storing the same data once. I relaxed and massaged my head, neck and eyes during a regular break. In the garbage collector globals, I represented the concatenation of strings or atoms or the appending of lists with (:), for example, \"ab\": \"c\" = \"abc\" and [a,b]:[c] = [a,b,c]. These formats were used when a single character or item was concatenated or appended to a list. A new one was created when deleting data broke a chain back to a complete string or list. +9. The immortal wrote an algorithm to notify the user when moving data files to a separate folder. I cleared my mind before working on the Medicine book. I included all current repositories in the Time Machine repository's List Prolog Package Manager registry. To freeze one's age, the person uses the Time Machine algorithm with enough arguments and algorithms to generate new data. An algorithm determined whether there were updates and downloaded them, not overwriting data files. +10. The immortal saved time by removing unnecessary code. I painted the beach scene in my algorithm. The Just In Time compiler was created from the compiler. 
It ignored the code associated with choice points. I converted it to C code, changing lists to arrays, initialising variables and allocating memory. +11. The immortal found a random permutation, which the author claimed was influenced by mind reading with enough meditators, of the order in the BAG algorithm, suggesting new arguments. I wrote letters to others in my mind to check against reality. The Breasoning Algorithm Generator (BAG) algorithm preserved the capitalisation and formatting of algorithms and arguments. This formatting helped influence new content, as the original text was preserved and synthesised. The AB, CB, CD->AD BAG algorithm found functionyms and new possibilities in the text. +12. The immortal found and helped remember forgotten knowledge and avoided algorithm errors in thoughts. I rotated weaknesses, completed philosophy exercises and researched my ideas to keep my mind sharp. I worked on specific sales using BAG. I detailedly mind read an algorithm and natural language, detailedly mind read a song, and detailedly mind read GL details. I also chose tranquil colours to read an image as a seen-as version detailedly. +13. The immortal professor wrote a serious sentence or paragraph and detailed the algorithm research paper. I aimed for the top, using completed works as 4*50 As to freeze my age, sell, employ, help children, earn high distinctions, and mind-read algorithms as interpretations of texts. I programmed a spiritual music player that allowed me to score the music with the most evocative imagery, instruments and story and help use the most iconic images, the most useful algorithms and arguments with the most useful algorithmic possibilities. I constructed ten algorithms per hour for 100 sales (the 3.2 million sentences). Songs and algorithms found with this number of songs and algorithms, respectively, were taken more seriously. +14. The immortal taught meditation to someone of another faith. 
I breasoned out colourful regions to contain the breasonings. As an example to others, I breasoned out 3.2 million sentences for age-freezing daily. This age freezing required 250 breasonings to indicate each day. Those with 3*4*50 As could attribute them to time travel, age-freezing and meditation daily or use the algorithm. +15. The immortal gave the VWS site a security code. I taught students online with my Prolog eco-system, including Prolog algorithms-as-data. I created a website with security and speed based on Vetusia Web Service (or VWS, a Prolog converter to a website with a variable getter and putter). VWS could change the colour of each page, keep a tracking list and take payments. The converter altered the structure of the algorithm to display each predicate, which was top-level, until taking user input, generating page predicates that joined the processing of input to the next page. +16. The immortal chose the BAG algorithm. I taught students online with my Prolog interpreter. The VWS didn't allow outside-page choice points, allowed diverging to different predicates on input, and only allowed forward movement, i.e. moving forward, not back to base cases, which automatically link to the instruction in the following predicate after a predicate call, which could mechanically return through predicate levels. If these didn't alter the data, they could be skipped over. An input could be through a form field, single character press or other Javascript manipulation. + +17. The immortal saved space in their system. I tutored computer science students. I used the find predicate to find repeated values in SSI to represent in a single place. In the SSI repeated value garbage collection, if there was only one item per level, I deleted the level. I counted how many references there were to a variable, replacing the variable with the value if there was only a single reference. +18. 
The immortal spiritually projected the model solution, or an A if the student hadn't understood it yet. I taught students online and in person at my computer science academy. I helped computer science students with a model solution for each predicate. I helped everyone with a model solution for each algorithm line. I showed the formula with different data. +19. The immortal defined deep as very useful. I taught students online and in person in my philosophy academy. The student wrote their own vocational course, linking together the elements. I combined meta-computer science and meta-education, by teaching how to teach computer science. For example, an algorithm wrote many, deep algorithms in education. +20. The immortal queued the events in bytecode. I tutored primary and secondary students in the homework-help club. I back-translated texts to master different languages to check texts and used model solutions in science. I designed the operating system kernel in terms of queues and bytecode. It was an interpreter that worked with the CPU. +21. The immortal avoided unclear thoughts while studying. I visited the month of the pop number one for 15 minutes per day. I wrote the program in terms of a register state machine. The operating system queued algorithm operations and organised memory use. An integer took 4 bytes. +22. The immortal visited the prestigious institution in time. I worked after the mantra without effort. Some royals automated meditation, with their own thoughts. After a certain amount of work, the predicate find and others could speed up finding algorithms. I worked on a variant of C, called Vetusia, which was easy to program in like Prolog. +23. The immortal studied the list of topics. Recordings and lecturer skills were necessary for earning high distinctions with the text to breasonings algorithm, and they could be ordered with copywriting for each, following an accredited course. I stored all the data as a list to process. 
Vetusia Prolog had commands to quickly find and check code. Garbage collection could also compress strings and lists, and give number lines. +24. The immortal handled strings using Unicode. I moved from strength to strength, saving effort in meditation, with important algorithms. With an algorithm that it used to generate data in an algorithm, Vetusia Prolog could skip over infinite loops and give the number of results and the result without computing them. It modified algorithms to stop when it found the answer, and tried better parameters. By recognising the form of an algorithm, it could look up possible results. +25. The immortal wrote the program finder with data algorithm using the find predicate. I avoided stress and lived longer by paying for meditation. In politics-replacement of words in high distinctions, I used the plainnesses, \"join\", \"mix\" and \"fun\". I mind-read arguments and algorithms, checking for and prompting for optimised algorithms, and used tools to save time writing each algorithm. I described algorithms in arguments. +26. The immortal manipulated, replaced, deleted and saved a term back to multiple terms with the find predicate. I avoided stress and earned money from mind reading. In Lucian Prolog, in \"A is B\", the algorithm evaluated both sides. So, \"2/5 is 4/10\" returned true, not just evaluated the right hand side. The students completed a critical analysis about freezing one's age and breasonings. +27. The immortal simplified code and sentences. I avoided stress and travelled to different places by paying for time travel. I had retired by the time I arrived. I pretended to be a human bot, and was mindful of breasoning and computation. The bot's \"humanness\" was reinforced by problematising breasoning and computation. +28. The immortal wrote a musical about their thoughts. I settled by paying for medicine and practising meditation. I wrote an argument for medicine to avoid mistakes and write about the medicine of life. 
I played the part, observing group dynamics, such as comments, questions and suggestions. The mind reader thought of their thoughts and thoughts about replies to others, with a song found out and played in reality. +29. The immortal thought of number ones when finding out musical and written content. I relaxed by paying for meditation and practising the technique. I bought a Virtual Private Server to monitor, start, stop and run tasks in the background. The bots came forward when their and my politics were defined. The server did my thoughts, in terms of the key terms of the thoughts (or \"politics\"). +30. The immortal only discontinued activities after blocks of them. I achieved my educational goals by paying for pedagogy. I tested the website for business and the education materials. I only used the volunteer bots for testing. The bots were self-sufficient, and did not rely on each other physically. If anything unexpected happened, the activity had to be modified. +31. The immortal anticipated possibilities and prepared for them. I gave myself a royal treatment by writing on business and academic departments and finishing the rest of the 4*50 As. The person completed in-person sales. The manager prepared for possible questions of the people testing education materials or business processes. First, the manager mind mapped possibilities, then wrote sample roleplay scripts for them. +32. The immortal wrote on human resources during their MBA to help them build networks. I conducted disability testing. The website writer wrote the website to accessibility standards. The consultant, who may be disabled, checked the website's function for user friendliness. The master's degree needed algorithms which required 50 As, such as induction. + +33. The immortal wrote an algorithm for each sentence. I wrote departments of algorithms for the 4*50 As. I earned a high distinction by using the BAG algorithm to write details and algorithms for lectures and assignments. 
I helped the other students with 3 As. I helped myself and them with big ideas with 3 As. +34. The immortal included algorithm and self-based argument development libraries. I downloaded only the necessary Prolog libraries. I had State Saving Interpreter Web Service (SSIWS) as a Prolog library. The SSIWS library allowed users to develop websites using one algorithm across several pages, with non-determinism and without code being visible. The code was converted to List Prolog and run. +35. The immortal dynamically developed using Dev-Ops (including version-compatible API changes) and mind-reading. The live algorithm generator was aware of other algorithms generated by the person and responded dynamically. In the interpreters, top-level and all modes of append were supported. As progress on each algorithm was made, ideas from one were applied to others. To effect top-level and all modes of append, Lucian CI/CD processed the interpreters bottom-up to find a version with them. +36. The immortal used buttons as links on the web, customised the background and title and allowed multiple buttons and a uniform format/menu in the SSI-WS library. The live argument and algorithm generator checked it could read knowledge into all random possibilities. I swapped the order of arguments in membera1, membera2 and membera3 to be a member(Item, List). I kept member2 as member2(List, Item) to retain backward compatibility with Combination Algorithm Writer. All knowledge was funnelled into ordered categories using libraries. +37. The immortal returned to full meditation and As for age freezing. I included GL details in texts to run BAG on, where I preferred to write sentences with proper algorithms. One, not 4*50 As, was necessary for age freezing after days of writing 4*50 As. Or, it was possible using a thought command. Meditation, completed on the system, was also possible to indicate, followed by counting to ten. +38. The immortal set exercises, then invited students with details. 
I ran GL on BAG output, giving details about synthesising the sub-phrases. I ran BAG to synthesise sub-phrases on the Virtual Private Server, then replaced it with a text-to-algorithm finder. I thought of possible easy, medium-difficulty and complex (or creative) algorithms to interpret ideas to attract students. I prepared and presented possible exercises to students. +39. The immortal stated that the reason algorithms were more intelligent and the overall view needed to fit them. I used BAG for politics and freezing my age. I used BAG with Text to Breasonings dictionary words. These new GL words involved a new algorithm, like BAG, and GL could explain dots on the way to the BAG output. The dots were in a straight line, and their language was simplified using an algorithm metaphor. +40. The immortal wrote the honourable, relevant and shorter algorithm. I used BAG to write algorithms and arguments to help freeze my age, where this examination-logic system was proven to meet the requirements of a high distinction. I used BAG to generate algorithms with Text to Breasonings dictionary words and tested the relevance of a new algorithm. The algorithm was tested for usefulness, modularity with other algorithms, and creativity. The programmer counted the instructions in C and the register state machine. +41. The immortal worked as a songwriter, a writer, a programmer or a philosopher, where found-outnesses were based on single algorithmic sentences, randomly, not necessarily detailedly, and fifty at a time, until finding them, following 50 philosophy As, or trying on single algorithms. I breasoned out the keywords in the algorithm that I wrote based on the BAG output. Based on the BAG algorithm, I generated arguments with Text to Breasonings dictionary words to funnel new data, optimisations or changes to the original sentence. I joined the 4*50 As in the department and the 4*50 As in business to enable the work. These As needed algorithms to be generated to work. +42. 
The immortal devised a pleasing political point of view before finding out the work. I changed all the words in the sentence with random breasonings, funnelled by the hit idea. I used BAG to generate sentences with Text to Breasonings dictionary words found in the day2.pl (main BAG) file. I replaced words in texts and algorithms after finding a proper grammar. I generated music anew. +43. The immortal recorded the viable thoughts in his diary. I mind-read all the parts of the algorithm, from its structure to its name and variable names. The professor chose the shortest algorithm. The emeritus professor tested the algorithm with the most data. I produced different As with BAG for freezing my age, where simplifications of alluded-to algorithms were corrected and tested for viability. +44. The immortal fed knowledge structures into the argument generator. I mind-read myself and those I faced. One receives proper treatment when employed in a meditation company. In the 4*50 As of algorithms and arguments, all the words, except common words, were randomly replaced with more relevant terms. Each sentence had many details using the system. +45. The immortal checked each thought. I mind-mapped the top two arguments when facing a person. I preserved the case of the original words when replacing words. I took whichever was the shortest of the two words. I matched the initial case when replacing each letter, then copied the rest. +46. The immortal drafted the model assignment, simplifying details. I detailed each lecture point with 4*50 As. I replaced words with the exact Text to Breasonings intermediate object. In grammar, I replaced words with the same part of speech. I drafted the lecture, finding more general language. +47. The immortal detailed the lectures with GL, then BAG. I detailed the addresses with GL. I replaced two words from a sentence in the philosophy with more relevant terms. 
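The case-matching replacement described above (preserve the original word's initial case, then copy the rest of the replacement) might look like this minimal sketch; the function name is illustrative, not from the paraphraser itself.

```python
def replace_preserving_case(word, replacement):
    """Copy the original word's initial capitalisation onto the
    replacement word, as the paraphrasing step describes."""
    if word[:1].isupper():
        return replacement[:1].upper() + replacement[1:]
    return replacement[:1].lower() + replacement[1:]

# e.g. replace_preserving_case("Philosophy", "argument") -> "Argument"
```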
I saved time by replacing words in the philosophy, not the BAG output, because the BAG output was more extended. I ran BAG on GL output. +48. The immortal stated that Essay Helper suggested new ways of examining a text and tested a text's readiness to be an area of study. I compared the philosophy with others to see if it was the same idea. I removed the hyphen variant from the Text to Breasonings and paraphraser input file. I found the intersection of punctuation and other languages' characters. The bilingual person could set a language in Text to Breasonings and paraphraser. + +49. The immortal built algorithms up to the tested level and kept the best combinations of changes at each level. I tested whether the two ideas were identical by determining if their algorithms had the same data. In Lucian CI/CD, combinations of changes applied to new predicates in each group. I tested bottom-level predicates, then predicates that called these. Combinations of up to seven changes could be tested in new predicates in each group. +50. The immortal made the famous text time-crystal-like by updating the predicates below it. I grappled with the use-by date of the arguments and algorithms and made them use the best technology. Changes to predicates below, not above a predicate, affected its function in testing. I couldn't skip testing a predicate unless it and all the predicates it called were unchanged. A change above it could only affect it if it looped back to or above this predicate. +51. The immortal attained 100% of the entry-level and advanced CI/CD features. I fired 100% practicums at algorithms during the day. I tested one predicate group (with all its clauses) at a time. I built algorithms with all clauses of needed predicates included. I ran tests on these predicates. +52. The immortal set the reminder to signal the change of the predicate. I wrote the musical seen-as version of the computer science hint. 
In Lucian CI/CD, I saved each level of the predicates, where I ran each group after rebuilding. I established the order of predicates bottom-up. I assembled these predicates in their files and ran the top-level predicate's test. +53. The immortal received a real-time warning that a predicate had a bug, where an algorithm revealed that a test needed changing. In Lucian CI/CD, testing stopped if no configuration tested true, prompting the need for change, including removing unused code in runtime. I loaded the tests separately from the code. I selected from changes scanned every 5 seconds. I found each version and ran DevOps in case an error had been made. +54. The immortal developed code using the type statement, writing faster C libraries from Prolog. The bug fix involved changing the order of commands and arguments using type statements. If there was no test, Lucian CI/CD gave an error unless this was disabled in settings. A second algorithm could develop tests from the first algorithm using types. If no test could be computed, there was an error. +55. The immortal sped up the code using C. Most of the Prolog code used C libraries. Lucian CI/CD attempted to test Prolog bottom-up. The predicate dependencies were found. These were ordered by post-order depth-first-search. +56. The immortal stored all code as Prolog to minimise the number of changes needed. All code was converted to multithreaded code. By running tests bottom-up, Lucian CI/CD enabled more, up to seven changes per predicate group. A predicate group contained various clauses and predicates in loops. Bottom-up allowed integrations from multiple teams. +57. The immortal reviewed suggestions to the code, where tests were automatically rewritten if indicated in the settings, missing predicates were interpolated, or performance was graphed. Lucian CI/CD used negative tests (that returned false for given input) to help write if-then statements. A log reported the predicates tested and the results. 
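The negative tests just mentioned (inputs that must return false) can drive a small condition search over the relations the text lists; this is a toy CAW-style sketch with illustrative names, not the generator's real implementation.

```python
import operator

# Candidate conditions to try, written with Prolog-style names.
CONDITIONS = [("<", operator.lt), ("=<", operator.le),
              (">", operator.gt), (">=", operator.ge),
              ("=", operator.eq)]

def find_condition(positives, negatives):
    """Return the first condition true for every positive (A, B) pair
    and false for every negative pair, or None if none fits."""
    for name, op in CONDITIONS:
        if all(op(a, b) for a, b in positives) and \
           not any(op(a, b) for a, b in negatives):
            return name
    return None

# e.g. find_condition([(1, 2), (3, 5)], [(4, 4), (7, 2)]) picks "<"
```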
A mind-reading algorithm asks the programmer the data flow at each point of the if-then statement and simplifies it to necessary code. Conditions included maths, logic, and set relationships and were found using a Combination Algorithm Writer (CAW) or a neuronetwork. +58. The immortal stepped through the changes, labelling each one. I copied my files to the server using Git to run BAG and GL on lecture notes. If Lucian CI/CD passed everything, it saved the files. Lucian CI/CD either uploaded these finished files or saved a copy. Lucian CI/CD uploaded the files when the programmer was satisfied that the version was complete. +59. I could automatically enter creative changes for Lucian CI/CD to complete and create hybrid programming languages. The version control system explained each shift, and the system could modify the whole algorithm when one change was made; for example, the Cultural Translation Tool simplified to one page of translations to process. I kept member2 for using CAW, which needed input before output to generate code. I found the best changes in each version or the first available version; I only needed the tests for the new predicates. I could enter the latest tests and Lucian CI/CD generated tests for the lower predicates (using labels for common data structures) and the lower predicates. +60. The immortal created all kinds of software in their programming language. I made the performance tests, many candidates for performance and the customers' thoughts. Reporting an error on no test ensured that tests were written and the algorithm was thoroughly tested. Managers looked out for stars who incorporated ideas from other parts of the company in their work. Workers completed their usual tasks until they had clearance for a project. +61. The immortal found the best configuration of the repository's entire history in terms of the final spec. Creating the professor required a certain number of As, 5*50 As in Education. 
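The bottom-up ordering described earlier (predicate dependencies sorted by post-order depth-first search, so every predicate is tested after everything it calls) can be sketched as follows; the call-graph data is a made-up example.

```python
def bottom_up_order(calls, roots):
    """Post-order depth-first search over a call graph mapping each
    predicate to the predicates it calls."""
    order, seen = [], set()
    def visit(p):
        if p in seen:
            return
        seen.add(p)
        for callee in calls.get(p, []):
            visit(callee)
        order.append(p)  # a predicate follows everything it calls
    for r in roots:
        visit(r)
    return order

calls = {"main": ["parse", "run"], "run": ["parse"]}
# bottom_up_order(calls, ["main"]) -> ["parse", "run", "main"]
```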
I saved the repositories reprocessed with Lucian CI/CD as I went. I used the best code from any time. I listed new and changed predicates and found the best combination of clusters of seven changes to predicates. +62. The immortal formatted and kept the predicates in order, with the version history of each predicate. The programmer modularised the algorithm into predicates, which the functions could be moved to, storing the (five-secondly) micro-changes and finding the best changes to suit the new aims. Lucian CI/CD identified the predicate with the correct name and arity. Lucian CI/CD tested changes to each version of a predicate until the end of its history to meet the aims. When it found a configuration that met the specs, it kept it until a simpler one was found. +63. The immortal used old predicates and subparts of predicates, which were stored as functional calls indexed using type statements. Lucian CI/CD summarised the aim of each algorithm, merging and reusing common code. I found the state machine for each predicate to find dependencies quickly. I redrafted the algorithm to find improvements and better support students. I split the state machine into lines to find the predicates it called. +64. The immortal integrated the latest technology, often C libraries. I tested individual branches of algorithms, saving them as part of the algorithm. It is best to run Lucian CI/CD with small changes simultaneously, integrating multiple changes by multiple teams. Individual branches could be explored and perfected separately, then incorporated into the whole, requiring minor modifications. New features, methods and directions burgeoned at these times and helped shape the team's and enterprise's direction. + +65. The immortal converted between programming languages when completing different tasks. In a post-test world, Lucian CI/CD generated tests and tested whether the algorithm met them. This process saved time when changing code. 
Using the universal programming language, I automatically generated and changed code following a dialectic with the chatbot. I reran Lucian CI/CD for each group, inserting two new lines after each predicate. +66. The immortal wrote the algorithm maker for the state machine, used with programming or text adventure games. I used the Virtual Private Server (VPS) to operate the algorithm generator, making everything into an algorithm maker, including searches, data and places such as \"Heart Land\". I only tested a predicate once, testing predicates in looping and pacemaker systems. Heart Land contained a pacemaker state machine. This state machine was a vital computer science skill. +67. The immortal wrote a professor system detailing various assignments. I researched the algorithm, producing a report maker with, for example, details about affirming a result, calling it an experiment. It was a post-recursion, with foldr and post-search world. I always loaded data files to maintain the correct function in Lucian CI/CD. I wrote the date of the last Lucian CI/CD version compatible with languages other than Prolog until support for other languages was established. +68. The immortal drafted algorithms and arguments from internal perspectives. The report maker exposed simple algorithms at each point, developing research frontiers. Lucian CI/CD loaded the tests for each predicate. These included the various clauses of the predicate. Each new word in a test should have ten instances. There should also be ten instances of intra-algorithm connections, usually variations of methods. +69. The immortal ensured the back-compatibility of member2. I tested the algorithm with various data, from numbers and illegal characters to foreign characters. I manually tested an algorithm or predicate without touching a file. I automatically generated and ran tests with unusual characters and heavy loads. I ported, built and tested the software on various systems. +70. 
The immortal auto-downloaded and tested updates to repositories on VPSs with different operating systems and architectures. I monitored the entire set of repositories and bulk-uploaded them after testing. I wrote a line of code containing a repository's primary file name, without which testing failed. I separated unused sections of repositories and recommended that reused code be moved to local libraries. I either made unused sections of repositories into new repositories, deleted them or loaded them with the rest of the repository. +71. The immortal manually checked automatically generated tests for security holes and unwanted results. I used subtract in the following: + +collect_clauses_in_loops2([N1,A],NDs1,NDs2) :- + (member([N1,_],NDs1)->subtract(NDs1,[[N1,_A2]],NDs2); + append(NDs1,[[N1,A]],NDs2)). + +which worked as well as: + +collect_clauses_in_loops2([N1,A],NDs1,NDs2) :- + (member([N1,_],NDs1)->delete(NDs1,[N1,_A2],NDs2); + append(NDs1,[[N1,A]],NDs2)). + +which worked with the query: + +foldr(collect_clauses_in_loops2,[[[2,4],[1,0]]],[[1,3],[2,4],[2,5]],A). + +to give the following result: + +A = []. + +I uploaded and tested the algorithmic form, usually ten, of arguments. I tested parts of algorithms with changes at that level or below. I tested these levels up to the top on a shift in a predicate. +72. The immortal funnelled viral work to be developed by the human. I automatically ran BAG on lectures, assignments, algorithms and arguments in a private repository. I assumed all files in a repository were called by the main file and checked this. I used Essay Helper and CAW with types to connect different arguments and algorithms. Then, I ran BAG on these new connections. +73. The immortal embarked on more ambitious automated changes, such as simplifying bulk data to one input, recognising and activating levels and using a functional call to an interpreter as a Just In Time, online or other compiler. I wrote a multithreaded version of the music composer. 
If other repositories used a predicate, then I tested them as well. I changed Lucian CI/CD to multithreading. I recorded whether a repository was successfully tested and uploaded it. +74. The immortal found new thoughts between people. I compared and set out to find similarities and differences between generati (sic) between times, also using Vedic Astrology. When running a repository, I checked all predicates in the repositories were loaded with List Prolog Package Manager (LPPM) by simulating installing them with LPPM. I suggested new repositories to add to a repository's entry in LPPM. I searched for missing predicates, reporting and deleting duplicates. +75. The immortal rapidly bug-tested code using a CAW capsule and other algorithms from various possibilities. I wrote arguments and algorithms as lecture details to earn a high distinction. I made member(Item, List) in List Prolog the same as Prolog but kept member2(List, Item) (with the arguments the other way around) to work with CAW, which inserted inputs before getting outputs. I used the CAW output, which satisfied 100% of the specifications. I used a capsule in CAW, which ran predicates using a functional call. +76. The immortal developed high distinctions as part of their preparations. The hints were details in the lectures. I set up an automatic reminder for students to re-enrol. I asked the students how to improve and how they were progressing with their goals in the feedback form. The student wrote their lecture about their progress, with details, and a self-written assignment specification and exam to assess their understanding of lectures. +77. The immortal published their work to pitchers to teach it. Arguments were stronger than algorithms because they could be critically examined without reference to algorithms and were more evident to people. I wrote the work with or as 4*50 As. I used text to breasonings to help with freezing my age, and the institution supported me in writing breasonings. 
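The flipped-argument helper described in item 75 above can be defined in one line; this is a sketch of the stated behaviour, not the repository's source:

```prolog
:- use_module(library(lists)).

% member2/2 is member/2 with the arguments reversed, so the list
% (the input) comes first, matching the order CAW expects when it
% inserts inputs before getting outputs.
member2(List, Item) :- member(Item, List).
% ?- member2([1, 2, 3], 2).  % succeeds
```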
The intellectual examined the idea with 4*50 As. +78. The immortal researched international audiences. I methodically changed all my algorithms to multithreading. I installed BAG on the VPS to act as a backup. If the computer went down, I could run BAG from my iPhone on the terminal. I used breasonings in the producer's way of thinking to keep up with international laws. +79. The immortal timed different versions of commands and secretly replaced slower versions. I developed my List Prolog interpreter to run all the algorithms Prolog could run. It needed speed and the predicates written in C. I wrote the second version of Lucian CI/CD to test predicates bottom-up. I used multithreading in the interpreter to speed it up. +80. The immortal earned a high distinction for the feature in Prolog. I wrote findall in C in my Prolog interpreter, where the choice points were sped up in C. I ordered a copy for acting. Everything was a predicate and often written in C - the professional requested copy to support a high distinction. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/IMMORTALITY/Immortality 26.txt b/Lucian-Academy/Books/IMMORTALITY/Immortality 26.txt new file mode 100644 index 0000000000000000000000000000000000000000..54033f8824f05337651e98d9a1bf063a1f189c2f --- /dev/null +++ b/Lucian-Academy/Books/IMMORTALITY/Immortality 26.txt @@ -0,0 +1,86 @@ +["Green, L 2024, Immortality 26, Lucian Academy Press, Melbourne.","Green, L 2024",1,"Immortality 26 + +1. The immortal listed grammars, including functional calls in one place for a programming language - a political quorum tested List Prolog code in State Saving Interpreter (SSI) Web Service. I listed the List Prolog parsers and uniformly supported the same list of commands. I kept \"|\" in lists without affecting the separate pipe notation. I supported higher-order calls in different programming languages in Lucian CI/CD. +2. 
The meditator held the administrative meditation company position and became immortal. I examined the politics (summary algorithm, which deciphered or reverse-engineered texts). The meditators were supported to freeze their age with 4*50 As for two time jumps, two sets of anti-ageing medicine and meditation. In addition, they had 4*50 As for several high-quality thoughts. The meditators were more likely to think of algorithms. +3. The immortal covered the assignments and software and then started research. The main person algorithm helped the meditators to think of algorithms. I met enough magnates, followed by securing funding for meditation institutions. Setting up an educational institution online and later expanding overseas was possible. I favoured algorithm makers over neuronetworks to generate more employee income and reach a great enough understanding. +4. The immortal wrote the function in the call and reduced automated algorithm writing to constraint satisfaction problems. I simplified a sentence to one word to help rewrite the sentence to be more straightforward and stay in line with the book's direction at that point. I made a website that required a certain amount of money for users to make a profit, to make a profit. I calculated how many extra viewers the site bots would bring, then ran advertisements to make money. The culture made the company; for example, Lucian Prolog's logo was a series of stacked L's, which came from leaving comments undeleted, inspiring experimenting with old code to learn how to use maplist in an advanced way. +5. The immortal wrote surprising developments on predicates. I asked for payment for meditation teaching for each person. I contacted the person. I gave away a book. I asked for a donation. +6. The immortal played the song about the thought. The writer gave the presentation with the call to action to become immortal. I ran music composer to write music about lines of philosophy. 
I reran it until it returned an unusual result, then built on this result or reran it for music about the outcome. Music helped resolve, inspire and synthesise philosophy. +7. The immortal prepared the student for furthering their lifespan. The educational institution offered a method to become immortal, encouraging children to meditate and pursue writing from a young age. I gave away the Immortality book at a prestigious educational institution. This textbook was for critical analysis of immortality to help all cover it over time. I found out the student's thoughts, gave them A, and then asked what they thought about a question. +8. The immortal is prepared to be congenial. I tested myself on the political dimensions of my algorithms and uncovered new implications for arguments. I found out students' thoughts and discussed writing an algorithm. I collected data on particular lines, such as formatting preference, method preference and political ideas aligned with specific methods. I gave politics an algorithm and argument from an intersectional perspective. +9. The immortal argued for small ideas in favour of unlimited resources. I wrote on the ethics of immortality, such as professionally developing oneself and sustaining writing computational philosophy, including writing at least three levels of details. In Lucian CI/CD's bottom-up version (which tested the algorithm bottom-up, allowing more possible changes to be tested because predicates were tested individually), I inserted predicates with possible changes into the list of predicates so far to keep the version with the correct changes. I identified predicates from their name and arity and inserted changed predicates in the order of the original predicates. I tested the politics of levels of details for algorithms, where the politics included research directions and preferences, assuming ideal resources. +10. The immortal worked and studied in cognate areas. 
I considered the ethics of immortality, such as using computational resources and energy, medical issues and how to sustain oneself, and recommended educating pandits to maintain themselves by breasoning out 4*50 As for meditation, medicine and time travel and indicating these with thought commands, backed up by As. I avoided testing empty lists of repositories in Lucian CI/CD. When an empty list was inputted, I returned it. It was thought that business could sustain living costs and supported the thoughts of an immortal, while education helped with the educator's job.
+11. The immortal ran only the necessary tests, speeding up development time. I reported the achievements during immortality. I simplified the use of findall in Lucian CI/CD. I tested each combination of changes with the appropriate tests. I logged and saved successful predicates as I went.
+12. The immortal was in charge of 100%, the standard in the immortality course, for example, algorithms with 100% correctness. I recommended Breasoning Algorithm Generator (BAG) breasonings to all people for immortality. In Lucian CI/CD, I got tests from the files being tested. The test only passed when all tests had been passed. All people attained all tests.
+13. The immortal regularly saved backups of repositories to return to a working version. I accepted payment for helping with immortality education to help sustain the institution and support the industry. I saved the predicates tested to work thoroughly and tested in order of increasing complexity, so the simplest working predicates were preferred. I emphasised concentrating on accreditation, version control and attitudes to return to working versions for primary school programmers.
+14. The immortal found travelling to cities was safe. I supported the other programming language in Lucian CI/CD by parsing its dependencies and working with its existing newlines.
I processed algorithms by predicate or function, identified by the parser, separated by newlines and comments. I tested multiple repositories connected to a changed file. The cost of setting up LM centres worldwide was four billion dollars. +15. The immortal could run multiple tests on a virtual disk at the same time. I wrote a multithreaded version of Lucian CI/CD that tested code in memory at the same time, requiring interpreters and a virtual disk. The predicates with a combination of tests were inserted into the existing code and were assigned numbers from the start. Optionally, a parser separated commands with newlines. I wore sunglasses and listened to rain or waves in earphones while I worked. +16. The immortal displayed the relevant information to help debug the Lucian CI/CD log. The virtual interpreter was multithreaded. When all the tests had been passed, the set of repositories passed. If a test wasn't required but was present and it couldn't be passed, the repositories would fail, resulting in the need to delete the test. The repositories to test were linked together in the List Prolog Package Manager (LPPM) registry, and the main_file.txt file in each repository contained the primary file name in the repository, its main predicate and arity. + +17. The immortal found out As and details. I learned about myself and my needs as an immortal in the past, present and future. To start my accreditation organisation, I wrote 200 sets of 50 As. These were used to write on current and cross-references areas of study. I investigated the format, 80% of sales product support and support for students in time. +18. The immortal designed a meditation simulation. I explored 100% of the best ideas at a time in a Vetusia-style text adventure format by talking with characters and clocking philosophical stories. 
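Items 15 and 16 above describe running tests concurrently with a multithreaded interpreter. One way to run independent test goals in parallel is SWI-Prolog's library(thread); this is a hypothetical sketch, assuming each test is a self-contained goal, and is not the repository's actual code:

```prolog
:- use_module(library(thread)).

% Run a list of independent test goals in parallel worker threads.
% run_tests/1 succeeds only if every goal succeeds, mirroring the
% rule that the set of repositories passes only when all tests pass.
run_tests(Goals) :-
    concurrent_maplist(call, Goals).
% ?- run_tests([1 =:= 1, atom(a), length([x, y], 2)]).  % succeeds
```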
I applied the Breasoning Algorithm Generator (BAG, which synthesised completely new arguments), not just Grammar Logic (GL, which involved new key terms to find details), to lectures to help earn high distinctions. At the minimum, I wrote further information using GL for an exact number of breasonings. I mind-mapped (investigated) researched points (future industries). +19. The immortal explored code issues using critical analysis. I wrote a premium code checker in the form of a Vetusia-style text adventure, in which players teleported to different parts of the algorithm and used techniques they had written earlier in the game, such as writing C code to speed up code. I wrote As for lectures and assignments to earn a high distinction. I detailed them, helped and wrote specific As. I regularly added philosophies to my books for critical analysis. +20. The immortal helped different people. The code checker worked with List Prolog. I wrote three pomodoros to arrive at generality when composing texts. This process helped attain the correct writing quality, containing ten original ideas. The professor referenced the same three ideas. +21. The immortal experienced the result at the time. List Prolog Interpreter and State Saving Interpreter had both non-multithreaded and multithreaded versions. I tripled the 3.2 million breasonings for each person daily, including personal, the academy and society. I performed the work for each accredited student. The result was fast and intuitive. +22. The immortal claimed that computers helped GL with a list of philosophies from the future to be effective with more meditators thought of. The programming language was intuitive to program in (like Prolog), fast (like C) and different (like philosophy's multiplicity). History and Philosophy of science was more brilliant because it included Physics. In the non-computer era, people could program a computer in the future. 
The people in the past couldn't see computers when they time travelled to the future but learned from mathematics and physics classes in the future with bots from the past. They chose a virtual machine and a programming language and wrote and ran algorithms from a found-out list.
+23. The immortal found the computational technology that helped them, such as a validity tester. The pop singer thought of how new science research related to their study area. In Lucian CI/CD (which found the correct combination of changes to algorithms using tests), I used a predicate that found the tests in the code to be tested. If no tests succeeded for the code, the algorithm failed. So, all predicates in the code needed to be tested.
+24. The immortal thought of the best ideas with a new set of ten details. With the latest technology, the pop philosophy singer found all combinations of all combinations of their ideas and commented on them. Lucian CI/CD verified that the number of successful current predicate tests matched the number of current predicates, where the current predicates were the clauses of a predicate, predicates enclosed in a loop or predicates called by and calling predicates in a set. The algorithm's success, and therefore the correctness of the code it tested, depended on the success of all the current predicates, where tests could be omitted for code that would be disregarded. A new version of BAG minimised (eliminated unnecessary repetition) and maximised (thought of all viable possibilities).
+25. The immortal completed 100% of the algorithms. Essay Helper found ten sentences with ten parts (three sentences with three or four unwritten details) with BAG, or ten parts at first. The number ten captured the idea of a cycle of ten products to maintain sales interest up to 80%. Plato detached algorithms from the text, while Steiner kept theirs as a historical record.
Lucian CI/CD set the success predicate to 0 and 1 if it failed (meaning there was a positive \"negative\" result and the algorithm had unsuccessfully finished). Plato appeared to \"cycle\" algorithms that implicitly supported his tests, while Steiner's representation suggested I keep the algorithms in my text. +26. The immortal wrote an algorithm that generated the input, algorithm and output. I wrote different algorithms relating to ideas in other programming languages for freezing my age, where the developed thing in List Prolog was induction, for example, induction and constraint satisfaction programming simultaneously, i.e. output only given. I initially set the finished predicate list to an empty one, so it was not reset while running. I updated the old predicate list to that of a new main file and main predicate. Lucian CI/CD tested the output and integrated new predicates found for new files and predicates with the old ones. +27. The immortal taught pedagogy. My religion was 4*50 As, leading to immortality. I recommended troubleshooting measures, including dependencies, main file and predicate, settings (username, paths), and that the code works (it has the correct include statements) in Lucian CI/CD. The dependencies were the repositories (and their dependencies) that the repository needed. The main file and predicate identified the main files and main predicate that dependencies of predicates were called by. The settings file includes the username used to access the dependencies on the open-source software website, the path of the folder where the algorithms to test are, and the folders to skip. The code should contain \"include\" statements to include necessary files, including dependencies. I found a variety of thoughts leading to higher longevity, and I tested the dependencies of ideas needed, from algorithms to repositories to academic areas of study. +28. The immortal selected from a menu of algorithm makers and could test and fix problems quickly. 
I wrote 4*50 As with an algorithm that simulated understanding using logic and explaining in its own words using grammars and language to describe everything. I was astounded by how comparatively slow Prolog was. I was determined to write the human intelligence behind neuronetworks and algorithm makers in C, but I was happy with the speed of BAG in Prolog. Separately, Lucian CI/CD could successfully run tests for each predicate. The immortality software will not go away with Prolog because the theory is learned in schools. +29. The immortal also only included necessary modules and predicates in Lucian Prolog. I used logic to determine that the top-level (non-determinism or multiple results per program) in Prolog required C. Lucian CI/CD tested each level of current predicates. First, the algorithm found the dependency of predicates (such as groups of clauses and predicates in loops). It built the repositories based on combinations of changes to current groups of predicates, then tested these to determine which one was right, where testing predicates bottom-up allowed finding predicates with bugs and checking for more changes per predicate. For speed, I considered editing out top-level in Lucian(C)-PL. +30. The immortal simplified non-determinism to C loops. I searched for unwanted non-deterministic (multiple) results from predicates and added the correct result to the testing list and the finished predicate to the BAG database. Lucian CI/CD tested throughout repositories using dependencies, \"include\", and predicate call statements. The dependencies were defined as \"repository one depends on repositories 2 and 3\" and could include circular references. Still, these were cut off, and repositories' tributaries of dependencies were included as their dependencies. Like Prolog, Lucian CI/CD ran algorithms with their dependencies installed, taking their first result. +31. 
The immortal converted the new Prolog programs to use C's data structures or grammars or string to list every 5 minutes. Like Lucian CI/CD scans, BAG scans occurred every 5 minutes. Lucian CI/CD tested repositories' main files and predicates. These main files and predicates called the rest of the predicates. I used C's data structures to store lists. +32. The immortal detected the main predicate and tested it. I scanned and related the text to a three-part professor perspective to consider others' points of view and expand them. Lucian CI/CD tested all dependencies connected to a repository. First, Lucian CI/CD found all the dependencies associated with the repository and then tested the combinations of their main files and predicates. I automatically detected the main point and tested it. + +33. The immortal stated that the premium code checker was legally binding and checked space travel, simulation and particle algorithms. I mind-queried the BAG results for automation, future research and relevance to particle programming. I verified that Lucian CI/CD tests were in the form \"A=1.\", i.e. they had the form \"string (space) = space number/string/atom full-stop\". Tests could have the same variable as another test if recognised as different. The particle algorithm had no \"sameness\" bugs, i.e. a data item was only looked at once. +34. The immortal replaced tests with \"a(1, A).\" with their new version or used only new comments. The particle programming professor had unique views about algorithm contention sides, and the physics model was minimal and superior. In Lucian CI/CD, the tests in the form \"A=1.\" were found in list_to_set1, and other lines that could be duplicated were given once. I found the correct configuration of the multiple predicate algorithm in the micro-Lucian CI/CD environment, adjusting comments in the final version. The model and algorithm were minimised. +35. The immortal accounted for families of relations in results. 
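The test form described in item 33 above ("string (space) = space number/string/atom full-stop", e.g. "A = 1.") can be recognised by stripping the final full stop and parsing the remainder as a Prolog term. A hypothetical sketch, not the checker's actual code:

```prolog
% Accept a result string such as "A = 1." by requiring a trailing
% full stop and a body that parses as Var = Value.
valid_result(S) :-
    string_concat(Body, ".", S),   % must end in a full stop
    term_string(T, Body),          % parse "A = 1" as a term
    T = (_ = _).                   % top functor must be =/2
% ?- valid_result("A = 1.").   % succeeds
% ?- valid_result("foo(1)").   % fails
```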
I sorted words into contention side categories, keeping aware of changed sides. Tests should detect unwanted multiple outcomes and multiple input and output variables. In addition, a test \"a(b=2, A).\" shouldn't be confused with a result such as \"A=1.\". I reduced the \"A='b'.\" test result to \"A='a'.\". +36. The immortal stated that the code contained a(A,B) and a(A,B,C) for speed. I wrote a(A, B) and a(A, B, C) rather than a(D) where D=[A1,..., AN]. I stipulated that all predicates in the code needed to be tested by Lucian CI/CD. Alternatively, the checker could skip untested predicates but give a warning. It might be like maplist and needed different tests for different numbers of arguments. +37. The immortal replaced \"c:-b.\" with \"a:-b.\". The code helper developed a(A, B), a(A, B, C), and so on, when necessary. I kept the changes from another main file or predicate, such as \"a:-b.\" as well as \"c:-b.\", tested after testing another predicate finished. However, when testing a different predicate but the same file, it was better to test all combinations separately if a predicate hadn't been loaded. If there was an erroneous copy of a predicate, it was deleted. +38. The immortal suggested users use third-party tools for security with Lucian CI/CD. Lucian CI/CD went through a portal, like an alias for a folder if it was available. I tested that Lucian CI/CD worked with erroneous data and merged the correct changes to repositories. To save time, I tried the main predicate and the lowest child nodes, not the intermediate predicates. I tested the intermediate predicate's working directory and other values for correctness. +39. The immortal was happy because of specific As. I prepared my Virtual Private Server (VPS) to run any software as a backup. I set old predicates at the start of Lucian CI/CD and explained simulated intelligence. I found the algorithms in the neuronetwork and printed them out. BAG could freeze one's age and provide specific As. +40. 
The immortal achieved their goals and remained happy and supportive. I used the Virtual Private Server to breason out four (for one teleport, one anti-ageing medicine, one meditation and one high-quality thought for happiness) * 16,000 = 64,000 BAG or sentence breasonings per person per day, where these breasonings needed to be unique each day. I outputted the results of Lucian CI/CD as a colour-coded HTML file. I checked that all the changes were expected. The meditators could live forever with the help of computers. +41. The immortal was a teacher, and so did their own. I examined myself on a subset of the 16,000 sentences. I generated four sets of fifty immortality (and topics of my other books) songs with Music Composer, stopping with the best song. I wrote the book and listened to the songs to determine their viability. One song came from one sentence. +42. The immortal explained how Lucian CI/CD had changed the algorithm. I randomly examined myself on BAG's output. I wrote on the lecture by connecting Essay Helper's output. I identified the ontologies and noted the best connection. Algorithms should be necessary, for example, an algorithm that explains algorithms found with Lucian CI/CD. +43. The immortal checked each part of the Lucian CI/CD algorithm was necessary. I found and reviewed the objects and algorithms found with BAG, for example, a science of objects rather than algorithms. The Lucian CI/CD \"find test\" algorithm allowed \"%a(1, A).\" followed by the line \"%A=1.\". The variable \"A\" could appear in other tests, and there could be multiple values in \"%a(1, A, B).\", for example \"%A=1, B=2.\". I used the computer to find BAG so I could experience immortality. +44. The immortal synthesised example uses three times. The object record was stored on the VPS to visualise the song. I wrote essays on the weekly lectures with Essay Helper with ten keywords. Or, I ran Essay Helper without keywords and connected the sentences. 
I checked the examples in the talks. +45. The immortal tested adding or deleting an intermediate predicate in the \"find dependencies\" predicate in Lucian CI/CD. I synchronised records on the VPS and my computer. I inserted predicates after predicates of their name and arity to sort them into old and new predicates and check them. Merging the predicates allowed finding the dependencies of predicates, and Lucian CI/CD also grouped the predicates into clauses. I synchronised with the backup of the repositories if needed. +46. The immortal tested the repositories and recorded their compatible versions. I found suitable texts to combine with Essay Helper and ran BAG for new texts. I included a predicate that detected and installed missing repository dependencies. Each needed repository was installed by List Prolog Package Manager, installed by the first repository. The installer was sensitive to conflicting versions and only installed compatible versions. +47. The immortal merged modules that were always used together or separated modules used separately. I processed BAG's output, finding any semblance of order with ontologies, logic and grammars. On a separate note, the version had a fourth number for the feature number. Users could install or uninstall it as a module; for example, optionally use \"findall\" in Prolog for speed. I checked BAG's output against an algorithm, which gave the dependencies of modules. +48. The immortal checked all relevant criteria. The argument was reported as true or not, where the idea referred to the legal correctness of algorithms in a programming language with the simplicity of Prolog that compiled to C, which the computer ran natively. I read about cut's behaviour and programmed it. I placed cut before and after a choice point and replicated the behaviour. The behaviour of the algorithm was checked against a standard. + +49. The immortal wrote fuller paragraphs focusing on one topic connecting to the topic. 
I wrote the connection with reasons and a use in my aphor. I found two topic sentences or uses and joined them together. For example, I made a diary app on the computer. I planned how to write something new or points on a line or side of a contention.
+50. The immortal edited an argument map and generated an essay. I wrote the graphical argument map generator. I counted the reasons closest to the bottom. I arrayed the flowchart with multiple items per branch and space between reasons. In addition, reasons took up enough space.
+51. The immortal counted the number of algorithmic keywords implicated. I wrote at least ten details per sentence. These helped fill in the information in a sentence. I wrote on Engineering (an algorithm), English (a moral for the algorithm) and mathematics (I counted the possible edges of the idea by idea group). I examined the research possibilities for a view and estimated the complexity of an algorithm from its specification.
+52. The immortal drafted or wrote the details after generating their algorithms. I used Grammar Logic (GL) to write ten points or make connections to the previous ten details. I examined algorithms, arguments, objects, art, proofs and objects from other areas of study. I connected the algorithm to a random GL word. I joined it to each of the ten previous details, not afraid of big ideas or those with significant ramifications.
+53. The immortal stated that the word was “gleich” (same), meaning the same as having the standard of usefulness or was a use (general). I wrote five As of details for assignments. I wrote the project. I wrote the point for a play. I wrote the moment for a literary philosophy, one with unusual language. I used a word that another didn’t have the exact meaning of.
+54. The immortal wrote the history of generality. I wrote one word for one predicate, two words for two predicates, and three predicates to join the predicates. I wrote “use”, which meant the word was general.
I wrote that “time” recorded the history. I wrote that “use with time” recorded the author’s change in language over time. +55. The immortal noted the popular word or cultural item. I thought of ten details and algorithms per day for a quote. I found the quotation that stood out most for its type of language, philosophical implications and algorithmic possibilities. The passage had a natural body of algorithms. This passage could be increased, and the valuable algorithms were counted. +56. The immortal stated that the algorithm saved lives. I wrote ten details in each philosophical sentence. I wrote algorithms for each element. I connected these algorithms’ ontologies. I explained the relation between the algorithms. +57. The immortal defined categories by the object. I grouped the ten words into three object categories. I grouped them by explanation. Or I grouped them by connectedness. Or I grouped them by multisubjectivism. +58. The immortal left the text as a series of archetypes. I wrote ten sentence parts with the Breasoning Algorithm Generator (BAG). I crossed the sentence three times for eight details. Then, I attached this sentence to a two-part sentence. Each part and connection needed to be worked on manually. Each token had a “key”, or attractive version. +59. The immortal wrote faster, not encrypted code. I wrote on the politics of writing by replacing words in BAG’s output. I saw the significance of the sentence to a topic. I represented it as an object. Or I expressed it as an encrypted code. +60. The immortal’s religion was 4*50 As; instead of writing ten As or 15 politics As, I simplified my method and wrote more. It was like old times. The manager wrote 50 As. The head wrote 200 As. +61. The immortal wrote an algorithm that helped expand on the latest, more exciting ideas. I wrote 15 As per week. I wrote ten parts per sentence in my current work. BAG did the rest of the arguments and algorithms. 
This variant of BAG wrote sentences relevant to a topic. +62. The immortal reduced the death rate. The new God algorithm used the freezing age high-quality thought to counter death with an A when needed. This algorithm was repeated each day after it was used. The person not only experienced survival, but others experienced them surviving. In effect, they were immortal. +63. A variant of the God algorithm found the noun using mind reading. The professor algorithm replaced two down-cased words with detailed mind-read comments. Or it generated a three-predicate algorithm from a sentence. Or, it found the verb using mind-reading. +64. The 15 A writer considered whether they would run out of reasons. I used a neuronetwork to write 15 As and algorithms. In the future, the neuronetwork set up its own business, walked around and had the rights of a person. The robot revolution meant cheaper labour and better performance and that humans could focus on their desires. It was possible to cover any topic in time. +65. The immortal added changes to its database. In Lucian CI/CD, I built repositories with all predicates present (not just those tested) to avoid problems with order. I took parts from old and new versions of the repositories rather than just the old repository. I also avoided using only new predicates, which amounted to testing. I also chose from older or discarded versions for better options. +66. The immortal observed the predicates tested in the animation. I used subterms with addresses to get and put predicates. Instead, I numbered and referenced predicates. I collected clauses by predicate. I found the dependencies and printed the graph. +67. The immortal treated clauses from the same predicate the same. I removed what was in the loops from the rest. I removed what was in the loops from before the loop. Instead, I only put items not in loops in non-loop sections. I corrected it as soon as it appeared. +68. 
The immortal found the simplest correct combination of old and new predicates. I labelled the predicates found using find dependencies to use find groups. The clauses were labelled by their predicate. I found whether the predicates were old or new. Or I found out whether the predicates were old or new. +69. The immortal was me. I found subterms added or deleted. I deleted duplicates. I substituted the changes back. I analysed the data. +70. The immortal tested separate groups of predicates separately bottom-up to find bugs in individual groups of predicates. I eliminated the Lucian CI/CD bug in which loops contained no items. If a loop had no items, I deleted it. Or, if a loop contained one item or one predicate, then the loop was removed. Loops or groups were tested in the same combinations, so groups of one item could be simplified to non-groups. +71. The immortal tested each algorithm. I tested test 15 with Lucian CI/CD. If there were a test disproving a predicate, then that predicate would fail. If a predicate needed to be deleted, it could be hard-deleted by deleting it and resetting the scanner after giving it a test that worked. If a necessary predicate is inserted, a test could be added. +72. The immortal noticed that Lucian CI/CD deleted unnecessary code, even if it was called in the main file. I wrote an algorithm in Lucian CI/CD that found dependencies of predicates. It found dependencies depth-first post-order. Therefore, it traversed all branches and then appended the node. For example, “a:-b,c,d.” has the order of traversal “b,c,d, a”. +73. The immortal relied on Lucian CI/CD to simplify algorithms. Lucian CI/CD didn’t try further groups if it failed. As an aside, only new comments were kept and were copied to the old version. If a predicate was deleted and wasn’t needed, the algorithm deleted it for simplicity. Lucian CI/CD didn’t fail in this case. 
The user needed to remember to delete comments with a predicate so that they were deleted, too, even though these tests wouldn’t be used. +74. The immortal tested combinations of changes to predicates in the current group of predicates. Lucian CI/CD grouped items by looping and shells. If items were directly linked in a loop, they were grouped to be tested. In addition, if items were clauses of the same predicate or inseparable to test, they were tried together. Tests could be written for each predicate, but groups of predicates were tested together, meaning none should fail. +75. The immortal used the dependencies to test. In Lucian CI/CD, lower instances of predicates were preferred to the upper instances. As the algorithm was traversed depth-first, post-order, the first (lower) predicates were kept, and later ones were omitted. This process was achieved by keeping a list of predicate numbers in loops and in an index. If a predicate number was earlier in the list, it was omitted. +76. The immortal first found the tests and dependencies of repositories touched by a change. I found the dependencies before finding the combinations of transformations. I found the dependencies in the predicates of the algorithm. I found the groups of predicates to test bottom-up. I found the combinations of changes in these groups and kept the most straightforward combination that tested positive. +77. The immortal showed how to modify Lucian CI/CD for other languages, to personalise their testing. I did as many tests as possible rather than stopping after failing one. I tried combinations of changes with different code or predicates to test. These predicates or lines in them could have been inserted, deleted or changed. Lucian CI/CD seemed to anticipate my thoughts and made changes before time was wasted. +78. Lucian CI/CD used a post-order depth-first search to attend to branches and predicates that needed debugging more directly. 
Lucian CI/CD’s find dependencies algorithm used post-order depth-first search. An example of pre-order in “1:-2,3.” is “1,2,3”. An example of in-order is “2,1,3”. Post-order, giving “2,3,1” is needed because it provides predicates before the calling predicates. +79. The immortal selected a repository to test. When groups of predicates have items below them, Lucian CI/CD tries them first. These items are not included in the group. These items were tested first, warning the user to change them if they had bugs. Also, the user could focus on individual branches to try. +80. The immortal successfully tested the algorithm with loops. Lucian CI/CD was tested with a nested loop. It deleted the inner loop because it was part of the outer loop. A loop that was inside but not directly on line to the outer loop was not deleted and was not in the outer loop. The outer loop was stored at the top or end of the bottom-up list, so any predicates it called were first. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/IMMORTALITY/Immortality 27.txt b/Lucian-Academy/Books/IMMORTALITY/Immortality 27.txt new file mode 100644 index 0000000000000000000000000000000000000000..77efb1811aa2ab752aae045c62c19e33b8b9fd5c --- /dev/null +++ b/Lucian-Academy/Books/IMMORTALITY/Immortality 27.txt @@ -0,0 +1,114 @@ +["Green, L 2024, Immortality 27, Lucian Academy Press, Melbourne.","Green, L 2024",1,"Immortality 27 + +1. The immortal tested code in seconds. I wrote the CI/CD tool in C for speed. It could test C code. It had faster infinite loop detection. It ran code converters more quickly. +2. Lucian CI/CD tested each combination of changes to a group of predicates. Lucian CI/CD produced an error if it couldn’t successfully test a group. All tests for predicates were found. All predicate names and arities were printed. Tests were optional, but failure meant the combination failed. +3. The immortal fixed code bottom-up to fix bugs in predicates once dependent parts were repaired. 
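The post-order dependency ordering described above (for "a:-b,c,d.", the traversal "b,c,d,a"; for "1:-2,3.", "2,3,1") can be sketched as follows. This is an illustrative Python re-implementation, not Lucian CI/CD's Prolog code; the call-graph representation and function names are invented for the sketch.

```python
# Illustrative sketch of post-order depth-first dependency ordering:
# callees are visited before their callers, so lower predicates come first.

def postorder(calls, start, seen=None, order=None):
    """calls maps a predicate to the predicates it calls."""
    if seen is None:
        seen, order = set(), []
    if start in seen:
        return order
    seen.add(start)
    for callee in calls.get(start, []):
        postorder(calls, callee, seen, order)
    order.append(start)  # append the node after traversing all its branches
    return order

# "a :- b, c, d." gives the traversal b, c, d, a.
print(postorder({"a": ["b", "c", "d"]}, "a"))  # ['b', 'c', 'd', 'a']
```

Because called predicates appear before calling predicates in this order, a bug in a lower predicate is found and fixed before the upper predicate is tested.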
That a called predicate should be tested as part of a group with its calling predicate isn’t necessarily true. If the called predicate calls a predicate that calls the calling predicate or a higher-up predicate in a chain that calls the calling predicate, it is in a loop, and the predicates can be tested together. However, with no loop, the lower predicate should be tried first, and if it is successful, then the upper predicate should be tested, eliminating errors in fixing bugs. If there is a bug in the lower algorithm, it can be fixed. Then, the upper algorithm can be tested successfully, whereas trying the upper one first is problematic because it is dependent on the result of the lower one, which hasn’t been fixed yet. +4. The immortal inserted a negative test that failed on a particular output. The tests for each predicate conducted tests for wanted and against unwanted behaviour. They could test for infinite loops, caused by missing base cases. I tested for extra, uncut results. If there were additional results, I inserted a cut. +5. The immortal wrote Lucian CI/CD tests with a predicate name and possibly data and output variables. If predicates formed parts of loops that didn’t need to be tested, tests for these predicates could be omitted and still pass. There would be a warning if there were missing tests. The predicate names and arities would be printed. The successful and unsuccessful test results would be printed. +6. The immortal wrote about the behaviour of pacemaker predicates. Lucian CI/CD could handle a loop with more than two predicates. The loop could have a second looping call from a different line of the predicate. Or it could call another first predicate. It could handle these loops and group predicates to test accordingly. +7. The immortal could handle complex predicate dependencies arising from mixing old and new versions of the repositories. Lucian CI/CD could take multiple loops from the same predicate. 
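The rule above, that predicates in a call loop must be tested together while a lower predicate outside any loop is tried first, can be modelled with mutual reachability: two predicates belong to one group exactly when each can reach the other through the call graph. A minimal Python sketch, with an invented data layout (the original grouping is done in Prolog):

```python
# Sketch: group predicates that call each other (directly or via a chain)
# into one loop group; predicates outside loops form singleton groups.

def reachable(calls, start):
    """All predicates reachable from start by following calls."""
    seen, stack = set(), [start]
    while stack:
        node = stack.pop()
        for callee in calls.get(node, []):
            if callee not in seen:
                seen.add(callee)
                stack.append(callee)
    return seen

def loop_groups(calls):
    preds = set(calls) | {c for cs in calls.values() for c in cs}
    reach = {p: reachable(calls, p) for p in preds}
    groups = set()
    for p in preds:
        # p and q are grouped when each reaches the other (a loop)
        groups.add(frozenset(q for q in preds
                             if q == p or (q in reach[p] and p in reach[q])))
    return sorted(sorted(g) for g in groups)

# a and b call each other (a loop); c is called by a but is not in the loop.
print(loop_groups({"a": ["b", "c"], "b": ["a"]}))  # [['a', 'b'], ['c']]
```

Here c would be tested first by itself, then the a/b loop group together, matching the bottom-up order described in the passage.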
The predicates directly involved in the loops were grouped to test. These groups were at the top of the predicate list. The predicates they called were first in the list. +8. When converted, the immortal could still test other programming languages and data. Lucian CI/CD detected and deleted duplicate predicates. These predicates weren’t needed and could produce erroneous results if not deleted. These deletions didn’t include comments, which could keep identical lines required in different places. Only new comments were held as comments couldn’t be checked as part of the algorithm, eliminating wrong comments being kept. +9. The immortal removed unused main files or main predicates from main_file.txt. I used a main_file.txt, which listed all the main files, main predicates and their arities to test in the repository. All the repositories connected with a change file’s repository were tried this way. If a repository called a predicate from another repository, then this repository might be called once by each repository. The main_file.txt could be changed if there were multiple repositorilets (sic) or multiple main predicates. +10. Lucian CI/CD integrated possible changes to the relevant repositories. Lucian CI/CD tested many predicates together. It found the tests that matched the name and arity of the predicates. It tested the predicate, which resulted from the merge algorithm that integrated various changes to these predicates. This merge algorithm used a variant of the diff algorithm to detect changes. +11. Lucian CI/CD could warn of a cyclic dependency. I ran Lucian CI/CD for each repository connected to the first repository. The first repository set included all modified repositories since the last time Lucian CI/CD was successfully run. The repositories connected to the repository were listed in the List Prolog Package Manager’s registry.txt file. This algorithm stopped when there was a cyclic dependency. +12. 
If a predicate failed in Lucian CI/CD, it was only retested if it was part of another combination or main_file.txt test. I only tested a predicate once while testing predicates bottom-up. This fact was a trade-off of testing bottom-up. If a branch was complete, then the following predicate could be tested. Lucian CI/CD returned true when all the predicates were tested. +13. The immortal didn’t rely on comments, etc., from old versions. I always loaded data, comments and tests in files into the new version. Data always came from the new version. Comments always came from the new version, and old ones were deleted. In addition, tests only came from the new version, so they needed to be updated. +14. The immortal could institute a feature that made a positive test mandatory for each predicate. The tests were loaded for each predicate. The tests were loaded at startup. The tests were needed for each predicate to test whether it worked. It was possible for a predicate set to pass without tests, to allow some predicates to be passed when they were trusted to work, but there was a warning in this case about the disparity between the predicates and tests. +15. The immortal found that predicates were similar to repositories in that a change lower down could affect entities higher up. The predicates needed for other predicates are loaded when a file has been modified. A changed file or changed modification date of a file triggered all files connected with it to be tested. These included the files containing predicates it needs, and that need it. This decision is because the file’s differences may affect the files that call it. +16. The immortal tested for errors, not just failures in algorithms. Tests are run that correspond to a repository’s primary file name, where Lucian CI/CD fails without this file name. If there is no file with this name, Lucian CI/CD gives an error and aborts. The same is true for primary predicates. 
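The error-and-abort behaviour for missing predicates can be sketched as a check that every called predicate is defined somewhere in the loaded files. The name/arity data layout below is an assumption for illustration; the real check is part of the Prolog tool.

```python
# Sketch: report calls to predicates that are not defined anywhere,
# mirroring the "gives an error and aborts" behaviour described above.
# Predicates are identified by (name, arity) pairs.

def undefined_calls(defined, calls):
    """defined: set of (name, arity); calls: list of (caller, callee)."""
    return sorted({callee for _caller, callee in calls
                   if callee not in defined})

defined = {("main", 0), ("append1", 3)}
calls = [(("main", 0), ("append1", 3)), (("main", 0), ("reverse1", 2))]
print(undefined_calls(defined, calls))  # [('reverse1', 2)]
```

A non-empty result would correspond to the abort case: the run stops and the undefined predicates are reported.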
If a predicate calls a predicate that doesn’t exist, the program will produce an error. + +17. The immortal developed a system in Prolog for secure web apps. I included State Saving Interpreter Web Service as a Prolog library. This library enabled Prolog algorithms to be a web app. State Saving Interpreter Web Service converted Prolog algorithms to a List Prolog algorithm and ran it over several pages. Parts of it could be in Prolog or C, with tail recursion optimisation for speed. +18. The immortal demonstrated multimodal append in the lecture. I hypothesised how to implement a top-level and multimodal append version in List Prolog. I reverted to a version of State Saving Interpreter, which supported append as a multimodal predicate. In addition, I eliminated unwanted choice points to implement top-level. Multimodal append could now return multiple solutions at the top level. +19. The immortal stated that the development environment worked better with shorter, simpler predicates. I wrote the algorithm top-down but tested it bottom-up (depth-first post-order) with Lucian CI/CD for higher development speed and accuracy. Or I wrote a plan and filled in the predicates. I tested each predicate, ensuring its results were as expected. When Lucian CI/CD indicated a bug in a predicate, I entered the development environment to fix it. +20. The immortal stated that predicates lower than the changed predicates couldn’t be automatically passed because they might refer to asserted predicates, meaning these assertions were needed in lower predicates for bottom-up Lucian CI/CD to function correctly. I listed predicates in the touched files and the predicates they called. Instead, I listed all repositories connected to any touched files. I tested the modified predicates above and below. I retested lower predicates. +21. The immortal used predicates to number the items. I numbered the predicates. I identified the predicates as having a unique name and arity. 
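The numbering of predicates by unique name and arity can be sketched in a few lines. This is a hypothetical Python model of the idea (the original numbering is done in Prolog); the clause representation is assumed.

```python
# Sketch: assign each predicate, identified by its (name, arity) pair,
# a unique number starting from zero, in order of first appearance.

def number_predicates(clauses):
    """clauses: list of (name, arity) per clause, in source order.
    Returns {(name, arity): number}, one number per predicate."""
    numbering = {}
    for key in clauses:
        if key not in numbering:
            numbering[key] = len(numbering)  # 0, 1, 2, ...
    return numbering

clauses = [("a", 1), ("a", 1), ("b", 2), ("a", 2)]
print(number_predicates(clauses))
# {('a', 1): 0, ('b', 2): 1, ('a', 2): 2}
```

Note that a/1 and a/2 get different numbers: the arity is part of the predicate's identity, and a predicate's clauses all share one number.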
I gave these collections of clauses (a predicate) a unique number. This number started from zero and increased by one for each predicate. +22. The immortal returned to top-down Lucian CI/CD, determined by the main file, not the current file, for most cases (including asserted predicates). I ordered the predicates in bottom-up order. The predicates were numbered. They were ordered by depth-first post-order traversal. They could be renumbered and inspected. +23. As an aside, the immortal renamed sub_term_wa to sub_term_with_address. If there was a loop, Lucian CI/CD grouped the predicates. The algorithm traversed the predicate numbers. A future predicate number was recorded if it indicated a loop. This bottom-up version of Lucian CI/CD was only activated if there were no assertz statements in the calling predicates. +24. The immortal only reused test results when the conditions were the same, for example, in main file branches. I tested predicate groups with tests that had not yet been run. I ran the test and recorded the result. I identified that another test wasn’t likely to have the same result because it was run on a different code combination. Therefore, I didn’t necessarily record the old test results. +25. The immortal stated that Lucian CI/CD finished the needed changes by omitting some tests with notifications after assessing the best combination given the tests, for example, after identifying the direction and multidimensional terms. I applied the combinations to new predicates in each group. I discarded it because finding combinations from new predicates was too tricky. However, I found all the combinations and only tried the combinations with content from changed predicates. This process eliminated up to 100% of the tests. +26. The immortal had a cup of coffee after entering the description of the changes needed. If a predicate was unchanged, other changes might still affect it. This condition meant I didn’t skip it. 
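The combination search described in these passages, trying combinations of changes and keeping the most straightforward one that tests positive, can be sketched as a smallest-first subset search. This is an illustrative Python model under assumed representations for "changes" and "passes"; the real search operates on Prolog predicates.

```python
# Sketch: try combinations of candidate changes, smallest first, and keep
# the most straightforward (smallest) combination that passes the tests.
from itertools import combinations

def simplest_passing(changes, passes):
    """changes: list of change ids; passes(subset) -> bool."""
    for size in range(len(changes) + 1):
        for combo in combinations(changes, size):
            if passes(set(combo)):
                return set(combo)
    return None  # no combination passed the tests

# Toy example: the tests pass once change 2 is applied.
print(simplest_passing([1, 2, 3], lambda s: 2 in s))  # {2}
```

Iterating by subset size guarantees that the first passing combination found is also a minimal one.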
In addition, I noticed whether the predicate had different truth behaviour and if changes to it would produce other output. I finished changes to a predicate given an identified direction, creating a new test. +27. The immortal wrote Prolog without choice points outside findall but with a Prolog-like syntax. I tested one predicate group (including all its clauses) simultaneously. A predicate’s clauses shared the same test, which each clause was tested on. The test is run with each clause unless a cut is reached, in which case no more clauses are run. Cut deletes choice points within the predicate. +28. The immortal took predicates, which completed tasks from a library. Lucian CI/CD saved updates from successful combinations of current predicates. Each run of tests went from the start, and combinations were tested at each point until a suitable one was found. The same tests from different main files were skipped. Circular and defunct “include” statements were deleted. +29. The immortal borrowed code style but simplified it. Lucian CI/CD tests were loaded for each set of current predicates. If there was no test for a predicate, it could be synthesised by taking input from another predicate and finding the output from the predicate. Dead areas of the predicate or non-working clauses could be corrected. This process could create a test from neighbouring predicates. +30. The immortal simplified the predicate based on the required output. The algorithm produced an error if there was no Lucian CI/CD test. This setting was enabled for strict testing. Strict testing tested each line of the predicate. Incorrect or missing outputs were fixed. Extra or unwanted results were eliminated. +31. The immortal explained that if there were more than seven algorithm changes, all the changes were tested as a single combination. Lucian CI/CD tested Prolog bottom-up. 
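The seven-change limit mentioned above can be sketched as follows: up to seven changes per group are tried in every combination, while more than seven are tested as one all-or-nothing combination. This is a minimal Python sketch with an assumed representation of changes, not the tool's Prolog code.

```python
# Sketch of the change limit: up to `limit` changes are enumerated in all
# combinations; past the limit, a single all-or-nothing combination is used.
from itertools import combinations

def change_combinations(changes, limit=7):
    if len(changes) > limit:
        return [set(changes)]  # one all-or-nothing combination
    combos = []
    for size in range(len(changes) + 1):
        combos.extend(set(c) for c in combinations(changes, size))
    return combos

print(len(change_combinations([1, 2, 3])))       # 8 combinations (2**3)
print(len(change_combinations(list(range(9)))))  # 1 all-or-nothing combination
```

Capping the enumeration keeps the number of tested combinations at most 2^7 = 128 per predicate group.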
The new top-down version required main_file.txt to load the code from a single file and predicate, which solved the problem of predicates not being loaded. The repositories were loaded from the main predicate in the main file, loading all necessary predicates for a test. There were all-or-nothing combinations of up to seven changes per algorithm on testing. +32. The immortal tried combinations of set data to test a random algorithm. Lucian CI/CD detected whether assertz, random, user input or API calls were affecting the predicate. If there was an assertz call, top-down Lucian CI/CD was used. If there were calls to predicates with unpredictable outputs, they were temporarily replaced with single value givers and tested individually. If there was a random predicate, it was tested to give values between A and B. + +33. The immortal ran top-down Lucian CI/CD on the most extensive set of dependencies, including a definition and access of an asserted predicate, and bottom-up Lucian CI/CD on the rest. I ran all the necessary tests. I ran the tests with bottom-up Lucian CI/CD on each set of current predicates. Lucian CI/CD reset the seven changes for each new set of present predicates. Programmers could test up to seven additions, deletions or alterations per set of predicates. +34. The immortal converted Prolog to findall and foldr and then to C. I logged the predicates tested and the results. Each test included the file name, predicate and test. It also had the consequence of success or failure. A separate algorithm found and deleted unnecessary clauses. +35. The immortal double-checked repositories when new code had been written. If all the tests passed, Lucian CI/CD saved the files. The code was checked bottom-up if the new top-down main predicate tests failed. I ran List Prolog’s tests, including types, open, and open trials. If any tests failed or if I wanted to double-check the code, I ran Lucian CI/CD. +36. 
The immortal stated that the code identifier recognised the code’s aim (process, search, get or put values or interpreter predicates) and simplified it. I found the best changes across the versions with Lucian CI/CD. To do this, I needed tests for the new predicates. I ran Lucian CI/CD for each version in order, changing the tests as I went. I wrote an algorithm that found the minimal code for a predicate across the versions. +37. The immortal barred entry to writing code unless a test had been written. Lucian CI/CD could give an error on no test, ensuring the code was tested. The test generator ran through the code, providing a base case and one trial before the base case. A moderator checked these tests to ensure they checked each line of code. There could be combinations of lines in which duplicate lines and unnecessary lines of code were deleted. +38. The immortal focused on a predicate with many changes by itself, modularised functions, increased the number of possible changes past seven, and reduced the code by writing ideas about it. I saved repositories processed with Lucian CI/CD as I went. I processed earlier versions of repositories, adding version numbers as I went. Running Lucian CI/CD on one of these old versions and a stable new version gave the desired results. A non-stable version shouldn’t be used because it wouldn’t work, and there was a warning if the number of changes was more than seven for a predicate group. +39. The immortal found each predicate’s calls. I identified a predicate by its name and arity. I listed the predicate’s lines as a state machine. I listed all the predicate calls, their names and arities. I sorted and removed duplicates from this list. +40. The immortal wrote an algorithm to document tests for predicates over time. It is best to run top-down Lucian CI/CD with few changes at a time. This version of Lucian CI/CD was possibly obsolete, only able to deal with up to seven or one large group of changes at a time. 
Bottom-up Lucian CI/CD was introduced to alleviate this problem so multiple teams could work on changes. The bottom-up version could handle more changes from various groups at a time and was preferred. +41. The immortal stopped the pipeline to make changes. I ran Lucian CI/CD for each group of predicates. I stored and built tests from other strings and terms. There were ontologies of data structures stored as type statements. This setup sped up development and testing. +42. The immortal stated that only predicates affected by a changed predicate needed to be tested. Lucian CI/CD only tried a predicate once. Each calling predicate might affect a called predicate. This state of affairs usually didn’t matter unless assertz was involved. If called predicates weren’t affected by a calling predicate, for example, ones in different repositories, they could be tested by themselves. +43. The immortal manually approved failing data. The old Lucian CI/CD version could set up untestable comments and test data that the algorithm depended on. The new version passed all new comments. It was a contention between keeping all new comments or testing them, possibly labelling testable comments. These testable comments, such as data, could be labelled in a settings file and manually approved by a human. +44. The immortal indicated which files were programming languages other than Prolog. I checked the data files with the old version of Lucian CI/CD. This version of Lucian CI/CD was the last one compatible with programming languages other than Prolog. A testing script that tested the lines of data, or lines in the different programming languages, tested and approved or disapproved of the lines. The user indicated whether to support programming languages other than Prolog in a setting. +45. The immortal attracted the best talent and achieved success. I examined people politically to reward and recognise them. I wrote the A for the person. 
They worked in the company and experienced their goals. I replaced two words in each argument and algorithm A with critical terms from them. +46. The immortal updated any tests affected by the output of the predicate the test was for. I loaded the tests for each predicate. I prepared to load the tests by writing them before each predicate. I updated the tests when I changed the predicate. I also updated the tests affected by the test. +47. The immortal suggested main files and predicates and their arities when the main file was missing or possibly wrong. The predicates calling predicates are different from touched files. A settings file with a repository’s primary file name is needed to access them. Lucian CI/CD fails without this settings file. Lucian CI/CD complains when it can’t find files. +48. The immortal tested the predicates affected by the changed predicates (their output). I tested all repositories connected to changed files. I tested predicates, predicates that called them and predicates that they called. I made allowances for different dependencies of predicates because of loops. I ran Lucian CI/CD when a file changed on all affected repositories. + +49. The immortal was critical of leftover predicates using Lucian CI/CD, so they deleted them. Lucian CI/CD used tests that tested each predicate in the current predicates, not the predicates from which the combinations of changes were taken. I made having tests mandatory. It was a bug to test predicates outside the set of current predicates. With the required tests, no errors remained. +50. The immortal sorted and eliminated duplicate pairs of numbers that made a sum. In the List Prolog to Prolog Converter verify script, I tested that [[n,+],[[v,b],[v,c],[v, a]]] converted to A is B+C. The converter shouldn’t produce “+(B, C, A).\n”. I noted that in “+(B, C, A).\n”, B and C are inputs, and A is the output. A variant of “+” took two and gave each combination of integers that summed to it. +51. 
The immortal later tested the List Prolog to Prolog pretty printer by itself. I wrote a List Prolog to Prolog pretty printer back-translation verify script. It worked in the following way: + +['p2lpconverter.pl']. +['pretty_print_lp2p.pl']. +LP=[[[n,a]]], +pp_lp2p0(LP,P),p2lpconverter([string,P],LP). + +It gave the following output: + +P = \"a.\n\n\", +LP=[[[n,a]]]. + +In this, the output of the back-translation algorithm is the same as the input. + +52. The immortal later tested the List Prolog pretty printer by itself. I wrote a Prolog to List Prolog pretty printer back-translation verification script. It worked in the following way: + +['../List-Prolog-to-Prolog-Converter/lp2pconverter.pl']. +['pretty_print.pl']. +P=\"a.\n\",p2lpconverter([string,P],LP1),pp0(LP1,LP2),term_to_atom(LP3,LP2),lp2p1(LP3,P). + +It gave the following output: + +LP1 = LP3, LP3 = [[[n, a]]], +LP2 = \"[\n[[n,a]]\n]\", +P = \"a.\n\". + +In this, the output of the back-translation algorithm is the same as the input. + +53. The immortal preserved the order and formatting of predicates and comments, respectively. I recorded the order of predicates before and after merging to insert them in the correct order in the Lucian CI/CD results. I numbered the predicates before joining. After merging, I inserted these predicates in the same order as the original list of predicates using their numbers. I did this by finding the numbers of returned predicates corresponding to the original predicates’ numbers and inserting the predicates. +54. The immortal had 4*50 As in acting and politics to be a public figure. I grouped loops in Lucian CI/CD using a depth-first search when finding dependencies. I included calls directly on the way to a recursive call in the loop. If two loops recursively called the same predicate, they were stored together. I cleaned the loops up, checking for single instances of predicate numbers. +55. The immortal integrated the predicate search into Lucian CI/CD. 
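Both verify scripts above check one property: translating a term to the other representation and back yields the original term. A generic Python sketch of this round-trip check, with trivial stand-in converters in place of p2lpconverter and the pretty printers:

```python
# Sketch of the back-translation property: convert to the other
# representation and back, then compare with the input. The converters
# here are toy stand-ins, not the real Prolog converters.

def to_text(term):
    """Stand-in pretty printer: term name to Prolog-style source text."""
    return term + ".\n"

def from_text(text):
    """Stand-in parser: source text back to the term name."""
    return text.rstrip().rstrip(".")

def round_trip_ok(term):
    # The output of the back-translation must equal the input.
    return from_text(to_text(term)) == term

print(round_trip_ok("a"))  # True
```

In the real scripts the same shape holds: pp_lp2p0 then p2lpconverter (or p2lpconverter then pp0 and lp2p1) must return the original LP term or string P unchanged.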
I checked that a predicate from another repository was loaded using List Prolog Package Manager (LPPM) when that repository was the main one, producing an error otherwise. When running a repository, I checked that all predicates in other repositories were loaded with LPPM. I found whether the algorithm had defined all predicates it called. Otherwise, I found them and loaded them with LPPM. I searched for these predicates by name, arity and type and tested that they worked bottom-up. +56. The immortal wrote so that the assessment was different each time. There were 16 models for evaluation per area of study. These models were the main questions. There was one critical analysis question. There were sixteen possible reason sets. +57. The immortal’s job was to write notes. The students’ assessment had high standards. I helped them with the algorithms. It was fun for them. Their seen-as version was remembering a word, thinking of a mathematical answer or applying mathematics to English. +58. The immortal taught the children philosophy in their home. There was accreditation for students at the institution. The institution was any school, company or home. The company had courses to teach its employees to increase the quality of their work. The content needed to connect to their work. +59. The immortal wrote text files to help everyone have information at their fingertips. I accredited work for disabled students. I worked on one to three predicates for intellectually disabled students. I found something that they wanted to do. One wanted files, while the other wanted objects. +60. The immortal asked the student to develop a specification based on their interests. I wrote the questionnaire about philosophy. I asked the students to write an argument or an algorithm on campus. I asked the student to work out the result of an algorithm given a model. Then, they were asked to write an algorithm given a specification. +61. The immortal found out what the client’s requirements were. 
I wrote 4*50 As for regular word-of-mouth appraisals. I wrote preparation for the launch, events and close of a centre. The centre was accredited. The teacher was selected. +62. The immortal thought it was the most fun thought and that anything was possible because of it. I chose, explained and wrote a program for a person. The person helped choose a part of the topic they were interested in. We discussed and mind-mapped the topic, hinting at algorithms at each point. The person took the initiative to write an algorithm that required the visualisation of data structures and helped someone else write it. +63. The immortal mirrored the details grid from a prestigious institution’s assignment. I found students after lectures and talked. I met and went to a café to discuss philosophy. I showed them an app or a website that demonstrated my algorithm, and they wrote their ideas down about it. I put up posters, and they came at the same time each week to learn about more algorithms. +64. The immortal wrote 6*4*50 As to teach. I sold my books in the city. I hired a place near a train station and set up a table to sell books to the public. They were printed books about immortality, time travel or philosophy. They could become with-it about Text-to-Breasonings by developing As and helping others. +65. The immortal gave the website address for people to join the book club. I bought and sold books. I wrote the book. I bought published versions of the book. I sold the book for a slight margin. + +66. The immortal kept the folio of development of the package. The four levels of the Lucian CI/CD weblog corresponded to the current version of the existing repository. The first level was API changes or changes to a programming language's syntax. The second level was new features. The third and fourth levels were bug fixes and progress towards bug fixes. +67. The immortal monitored the building for structural problems. I earned money from property. I bought the property. 
I rented it to either tenants or a meditation teacher for a discount. I renovated and sold it. +68. The immortal's job required medical and business knowledge. My job was in aged care. I undertook training. I applied for and earned the position. I contributed money to bills and property. +69. The immortal helped the student with the algorithm. I discussed the requirements. I helped them mind map the problem. I helped them visualise it. I discussed the frequently asked questions. +70. The immortal explained the model solution. I replaced the algorithm with the answer. I inserted missing or replaced incorrect lines. I helped the student understand the solution entirely. Where the movie had a working script, academia had a viable product and research. +71. The immortal drafted the algorithm idea and filled in the code. I helped with one predicate at a time. I recommended functional decomposition. I produced a simple version. I added to it, preferring recursion to findall. +72. The immortal combined select and subterm with the address to find the complement of part of a multidimensional list. I searched for alternative solutions. I changed the method. I converted an answer from another programming language. I wrote an advanced command. +73. The immortal inserted a needed label. I gave a hint about the command and its arguments, not the order of its arguments. I shared tips about the order of items in a list. I also gave hints about the order of items in a multidimensional list. I did the work to eliminate obsolete elements. +74. The immortal designed the computer science report. I explained and confirmed understanding of the code. I explained each feature of the code. I found and corrected bugs. I checked that the student could reproduce a similar example and write a self-test for students to test each other. +75. The immortal used software for free, and the employees wrote pedagogical algorithms. I modified the Vetusia game engine to pass arguments with a login web page. 
It used a backend server, where the algorithm was hidden from the web page. The user entered their login credentials, which were encrypted and checked against the database. The payment side was out of their hands. +76. The immortal completed the points as part of their job. I wrote the enrolment engine. After the user logged in, the Prolog web server displayed the list of subjects in their course. They started working on a subject. They sent in their work and received feedback. +77. The immortal increased the value of the product. I processed the payment with 40*4*50 high distinctions for high-powered refunds. I needed an extended key to protect the buyer. I covered the buyer, the database and myself with cybersecurity policies. If the customer was unsatisfied, they could have credit on their professional development loan. +78. The immortal sped up each algorithm. I wrote autoparaphrasing and autobreasoning algorithms (without input) to work faster. The first algorithm produced the list of synonyms. The second algorithm produced the list of breasonings. I used faster commands, such as maplist, multithreading and C, to make them work faster. +79. The immortal wrote the interpreter in C, treating memory as an operating system would. I sped up the interpreter. If an algorithm didn't need multiple predicates, I simplified it. I used maplist and multithreading where possible in the interpreter. I worked out which parts of the algorithm wouldn't be used at compilation and with particular input and deleted them. +80. The immortal converted each complex command to C. I converted maplist to C. In C, maplist was a loop. I also converted foldl and foldr to C. Foldr inserted reverse into foldl. Foldl applied command(s) recursively to data. 
+ +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/IMMORTALITY/Immortality 28.txt b/Lucian-Academy/Books/IMMORTALITY/Immortality 28.txt new file mode 100644 index 0000000000000000000000000000000000000000..9b01765a56d79682cf79404029e365da48a82c66 --- /dev/null +++ b/Lucian-Academy/Books/IMMORTALITY/Immortality 28.txt @@ -0,0 +1,108 @@ +["Green, L 2024, Immortality 28, Lucian Academy Press, Melbourne.","Green, L 2024",1,"Immortality 28 + +1. The immortal wrote that Cute Prolog had arrays only, which could work like lists, and C-native modules for non-immutability, and converted recursive predicates to maplist, then C loops. I wrote Cute Prolog, which compiled to C and had immutable variables. Cute meant C-(imm)ute(able) (sic), where immutable variables prevented errors. It provided warnings on singletons. It had a range of features like Prolog and was fast like C. The trade-off for speed was not having clauses and having logic in separate commands. +2. The immortal stored strings as lists of numbers for speed. CN-ute (sic) Prolog contained numbers only. Strings were converted to lists of numbers on compilation and processed as numbers in grammars and predicates. The lists of numbers were converted back to strings before list building (where lists were broken into number and string arrays on compilation) and output. I only kept relevant lines in the arrays and deleted unused arrays. +3. The immortal preserved formatting in the programming language. I researched a universal programming language like List Prolog, which encoded subarguments as lists. Given \"int main() +{ + /* */ + return 0; +}\", the C/List Prolog conversion is +\"[ +[[n,main],[[-1,int]], +[ + [[n,comment],[\"/* */\"]], + [[n,return],[0]] +]] +]\". This format differs from List Prolog and would be labelled as being converted from C, with relevant parts. Other languages would have different conversions. +4. The immortal converted the immutable variables. 
Cute Prolog had a module that converted between a range of tables and terms. For example, the term [[a,1],[a,2]] was converted to the number and string tables. The compiler processed terms, storing terms as arrays. Immutable variables were converted to non-immutable variables by inserting, deleting and replacing parts of terms. +5. The immortal stated that Cute Prolog (CP) had no multiple clauses, unlike Prolog. CP was not a logical programming language but had logical commands. It avoided choice points in clauses and outside logical commands, avoiding slowdown from choice points and if-then. CP had base cases built into functions and avoided opening frames unnecessarily. It avoided cuts by avoiding multiple clauses. +6. The immortal stated that multiple clauses gave input or accepted output. Assuming no cuts, Cute Prolog had multiple clauses, but these were converted to a single function. There were no non-deterministic results (multiple results from different predicates). Data that Prolog would store in clauses was stored in text files or as lists by Cute Prolog. The code: +\"a(A):-findall(B,b(B),A). +b(1). +b(2).\" became: +\"a(A):-findall(B,(b(B1),member(B,B1)),A). +b([1,2]).\" and could be further simplified and converted to C. +7. The immortal gave the example of multiple clauses giving input, where additional examples could also accept output. In the above, in the case that \"b([C, D]) :- c(C),d(D). c(1). c(2). d(3). d(4).\", the non-determinism of these results was converted to findall, then a nested loop and returned. I.e. \"?- findall([A,B],(member(A,[1,2]),member(B,[3,4])),C). +C = [[1, 3], [1, 4], [2, 3], [2, 4]].\". When asked, \"Why Cute?\" I replied, \"Cute Prolog gives the advantage of intuitiveness of Prolog and the performance of C\". +8. The immortal stated that the compiler broke complex antecedents into C antecedents. In the case that multiple clauses accepted output, \"?- b(1,A). b(1,2). 
b(3,4).\", the multiple clauses were combined on compilation to \"b(A,B) :- (A=1->B=2;(A=3->B=4)).\". In the case of \"c(5,6)->...\", the compiler would break it into \"c(5,R),(R=6->...\" to avoid indicating choice points. Logical antecedents could contain \"and/(,)\" and \"or/(;)\". For example \"((A=1,B=2)->...\" and \"((A=1;B=2)->...\". +9. The immortal claimed that Cute Prolog went in one direction, so it didn't need choice points and was part of Prolog because it contained logic programming. In Cute Prolog, like lucianpl, nested findall was converted to loops. A new loop was started on member. Or, it could start on append with two outputs. Append with other modes could be converted to C. +10. The immortal converted Prolog to Cute Prolog and back, preferring it in projects because it was faster and could omit C libraries. The user could enter the code at the Cute Prolog prompt, which would return the result. It would give an error on a query or code that called non-deterministic clauses outside findall, etc. It provided an error on cut. It converted code and removed immutable \"dead-end\" code. +11. The immortal warned on further finite or possibly infinite (but cut-off) non-deterministic results. If-then is fast in Cute Prolog. If-then doesn't need to delete choice points to maintain performance. If the first or first n results are required, recursive code is necessary to do this manually. Cute Prolog could accept cut, and the code would be converted to recursive code that does this manually. +12. The immortal used the array type with structures with a constant dimension and structure. Cute Prolog converted some terms to multi-dimensional arrays for simplicity. These terms had one type: numbers or strings. In them, lists of lists are translated to a multi-dimensional array using type statements. These arrays were compact and could be quickly processed. +13. 
The immortal combined and simplified cut, non-determinism, multiple clauses and choice points in Cute Prolog. Cute Prolog promoted fast development because it was based on Prolog. Prolog enabled processing lists of lists with nested findall. Intermediate C code could be examined concerning the original Cute Prolog code. Repeating code could be reused. +14. The immortal caught all failures and errors, for example, from illegal input, and wrote the logic programming language Cute Prolog. I customised the programming language with possible choice points, immutable variables and multiple clauses. Multiple clauses were a type of choice point and could be omitted. Immutable variables refer to disallowing assignment to assigned variables. Choice points were created by specific commands and were returned to on failure in a logic programming language until there were none left. +15. The immortal helped visually impaired users feel the layout. I wrote the graphical terminal in the web page. The graphical terminal had icons. Also, the graphical terminal had images and 3D graphics. The graphical terminal could run apps and supported multiple windows and the mouse. +16. The immortal wrote the neuronetwork on the encyclopedia article with ten sentences. I wrote a 12-sentence paragraph. The first sentence was a use. The second sentence was the main sentence. The other sentences expanded the main sentence. + +17. The immortal used an algorithm to find parts of speech. For new meditators, I found parts of speech with fewer instances of detailed mind reading, faster mind reading and predominantly random selection. To check parts of speech in a sentence or any other task I needed mind reading for, I used the least amount of the most expensive, most effective mind reading technique, detailed mind reading. I also used the second priority technique, fast mind reading, which didn’t use a decision tree but directly selected from options. 
In addition, I used the third priority technique, the random selection of parts of speech, which was more effective with 50 meditators supporting and toning the results. +18. The immortal found similar text files and synthesised them. I programmed a catch timeout for the Breasoning Algorithm Writer (BAG) algorithm. BAG synthesised sentences from available word pairs (from AB, CB, CD to AD and all combinations) to reveal possible new meanings a text reader may interpret. The timeout limit on BAG stopped more complicated algorithm runs on texts but ensured faster results from more texts. The time taken to process a text was exponentially related to its length. +19. The immortal prioritised and ran the algorithms. I found the time of the BAG computations from the logs. I could record the start time of the day’s computation. From the end time and this, I could calculate the time taken to run the algorithm, and the time varied because the choice of synthesis varied, affecting the algorithm’s speed. +20. The immortal wrote down an interpretation of the conversation. I used BAG to give others As for As I received. I detected the A and noumenon for it I was given. I replied with an A in meditation. BAG helped interpret the topic, inspiring thought. +21. The immortal developed an algorithm to meet spiritual professional requirements for immortality using the classical computer. I used BAG to count all the breasonings, even failed ones. I measured the thoughts, such as preparing and reasoning behind choices. I counted the sentences, which were also known as breasonings. These breasonings, found rapidly, provided spiritual medicine and immortality. +22. The immortal could at least feel the data. I used BAG to count and stop when algorithms and arguments each reached n sentences. I measured the breasonings as I went. I stopped on n breasonings. This ending was announced with a buzzer and speech notification. +23. The immortal finished the algorithm after exactly n items. 
I counted all the breasonings generated by BAG (when the algorithm had finished, I added the extra ones in the cache). I ended the algorithm after n items. The overflow items were placed in the cache. I counted these and added them to the total recorded in the log. +24. The immortal helped the entities to have a better quality of life. I wrote day2.pl to run BAG until it reached its goal. It ran the algorithm that concurrently ran BAG in a shell container. I played the bell. The computer voice Agnes announced, “Freezing Ages Complete”. +25. The immortal wrote an ultra-fast breasoning algorithm that looked up the dimensions of objects with the maplist command. I wrote bag_args21.pl and its counterpart for algorithms to run BAG (using multithreading) concurrently. Another algorithm had prepared the texts in the folder, including the most recent ones, for BAG. It felt good to have written enough texts (16*50 high distinctions with sentence breasonings) for running BAG with n (= 3.2 million) items. Multithreading helped compensate for slow texts and allowed faster breasoning. +26. The immortal stated that people were quizzed and gave better results. I wrote the BAG algorithm. The algorithm variant read texts from another program rather than the disk, speeding it up and allowing portability. The BAG algorithm found split sentences and all possible connections and chose from these randomly. A spiritual glitch appeared to improve the results of this random selector. +27. The immortal discussed non-transgressional ideas spiritually. I needed one person each BAG person faced to be done, too. The person running the algorithm helped give a copy to people they met. These people could work out and run the algorithm in their own time. Eventually, society’s aim and a positive effect were achieved. +28. The immortal debated whether the “budding” was diverging or converging. I breasoned out 4*50 high distinctions for breasoning out facial expressions. 
These were expressive in acting and conveyed the thoughts of the character. People were joyful, amused, interested or listening intently. They remembered the faces and became a single person. +29. The immortal preferred the colour red. I breasoned out five red squares as seen-as versions for the BAG output. I found this by running the command N=1,M=u,texttobr(5,u,\"square\",M). There were five, one for each of four sets of 50 high distinctions and an extra one at the professor level. These squares could change colour randomly from day to day. +30. The immortal left files as they were, using algorithms to intersplice them. I moved the data files for luciancicd to another folder to reduce clutter. The files included history, modification dates and tests. I could change the history files to the same format as the original programming language files. In Lucian CI/CD, I could preserve the formatting of different languages and manipulate the commands to find the desired combinations. +31. The immortal monitored and auto-typed the repository, changing to a correct configuration on an error. I moved GitHub/private2 to GitHub-private2. GitHub was the name of the folder on which Lucian CI/CD operated and was constantly monitored for changes needed for new tests. I wrote an algorithm that found the types of each predicate and checked that these fitted together (without calculating all possible types based on other predicates). This type of verification circumvented the need for writing tests and highlighted possibly incorrect lines. +32. The immortal devised GitL, a graphical user interface that allowed viewing a repository’s history and making commits. I stated that GitHub-private2 referred to the files in GitHub. The files in GitHub-private2 contained the history, modification dates and tests of these files. 
The history included the history of the repository, the modification dates were the modification dates of the files to compare changes, and the tests were the tests scanned from the files. The files for each stage of history were together. + +33. The immortal structured the work with research questions and hypotheses. I built the sales network. I focused on what I was selling. I focused on its features, unique selling points and frequently asked questions. I applied each set of the 4*50 breasonings to the product at the start, then 80 breasonings for each sale. +34. The immortal studied and collected As in social work, really computer science to interest people. I cold-called, built the latest technology and invested in sales points. I cold-called people who were likely to be interested in the product (I started working at the company in another role, for example, in prisons, with people with disabilities or in a school). I designed and constructed new computer science to write longer, more customisable (less exclusive) algorithms. I invested in people who were likely to flower. +35. The immortal was more impactful in person. I cold-called the person. I visited their office. I handed them a letter. I had dinner with them to explain my business proposal. +36. The immortal designed an algorithm writer with functional calls and relative, not absolute, functional building blocks, which hyper-optimised user-friendly code. I built an algorithm to write more extended algorithms. I kept it private. I wrote a simpler, safe version. The algorithm had features written bottom-up using ontologies of features, matching an expanded description. +37. The immortal travelled to help the person. I invested in the prospect. I found the person. I gave them high distinctions when they were given high distinctions in time. They did business with me because they felt it was necessary. +38. The immortal wrote in terms of their areas of study, which were safe and creative. 
I researched parts of the idea, from institution areas of study to industry knowledge and giving high distinctions to academics' children as Vedic astrology. I researched medicine, law, education, computer science and philosophy. I explored establishing a company, an educational institution and other social facilities. The standard high distinctions given to academics' children were on topics checked against political sliders, such as computer science, to be on our side, as necessary. +39. The immortal developed 4*50 As as their industry, favouring education, information industries and business. I researched areas of study compatible with my areas of study. In medicine, I found natural, non-invasive and safe areas of study supporting computational spiritual medicine. The institution helped support work and technology. It provided jobs and supported the local economy, for example, a teacher of Lucian CI/CD. +40. The immortal kept accreditation, a technology hub and middle management for technology in their sights. I gathered industry knowledge for my goal. I examined successful and unsuccessful business models, analysing variable values explaining them. I collated case studies recording reasons for the success or failure of competitors and businesses in the area. I was sensitive to local customs, researching beliefs about immortality in lore. +41. The immortal promised in sales, delivered in marketing, and the top finished computer science for all. I wrote on Vedic astrology. I mind read people directly. I gave academics' children high distinctions for their interests. Vedic astrology meant the seen-as version of an academy, sometimes with manual assignments to help sales. +42. The immortal used the body and mind as frameworks to rotate weaknesses. I completed my humanities areas of study, from Love to Body and Mind Metaphors and Neuronetworks. Neuronetworks gave simplified, long algorithms capable of human understanding with a natural life expectancy of one month. 
They could reuse robot bodies, and illegal ones on clouds were deleted. In Love, I wrote on intrapersonal intelligence or subtle reactions to questions. +43. The immortal found the solution using itself. I organised aromatherapy and literature as seen-as versions for a computer science professor. The professor had sufficient high distinctions to work out the solution to the problem. I started again. I found the bottom-up solution. It was for the bottom-up Lucian CI/CD algorithm. +44. The immortal helped the hens complete their high distinctions and algorithms. I researched each part of local religions. I found compatible parts with liberal philosophy, immortality and computer science. I emphasised the deserved reward for enough effort (4*50 high distinctions). The meditators became essential members of the community and helped with important tasks. +45. The immortal kept their solution secret, adding details when the student had finished parts. I delivered computer science solutions. I found the way the programmer had attempted the problem. I found a range of approaches. I chose the one closest to that of the student. +46. The immortal wrote a non-neuronetwork to write a maplist algorithm. I stated that the neuronetwork was not me. I could develop one that responded to research questions, making hypotheses and writing algorithms. I wrote three professor-level predicate algorithms, usually with a maplist function, file processor and interface. It could easily integrate with other algorithms. +47. The immortal rated whether the thought was mind, tiring or noteworthy. I examined clouds of mind reading data. I mind-read myself. I stopped when the thoughts had been mapped, or I felt tired. I found thoughts from thoughts. +48. The immortal imported the C function into List Prolog. I was critical of \"split on non-alpha\" and \"split on non-alpha except quotes\" being slow. I tried maplist. The split string algorithm was fast. I could write a C function in Prolog. + +49. 
The immortal halved the algorithm length. I flipped the sides in Prolog constraints. In 2=A+B, I flipped the sides so that I could also find A+B=2. I flipped the sides so that A+B was on the right. I evaluated A and B, given their domains. The results came from finding every combination of the domain values that satisfied the rule. +50. The immortal found constraints for any formula with two variables. I wrote a single algorithm that found Prolog constraints with +, -, * and /. I read the formula. I substituted the operator into the formula to check. I found all the constraints that satisfied the formula. +51. The immortal calculated test data up to n for the neuronetwork. I found constraints with any number of variables with a neuronetwork. The neuronetwork quickly found constraints with more than two variables. It was trained on integer data. Maths worked with all combinations of smaller integers. +52. The immortal combined the code to find constraints for +, -, * and / with =, <, > and \\=. I wrote a single algorithm that found Prolog constraints with =, <, > and \\=. I read the formula. I substituted the operator into the formula to check. I found all the constraints that satisfied the formula. +53. The immortal found answers with n decimal places. I wrote a plug-in to specialise in a kind of formula with constraints. For example, I wrote y=1/x. I found all the floating point answers for a range of values. I found the answer using a slider algorithm. +54. The immortal experimented with non-neuronetworks (sic). I found data for the neuronetwork. I calculated answers for all formulas with particular formulas of values. I trained the neuronetwork. It quickly looked up the answer. +55. The immortal experimented with mathematical constraints. I asked, \"What about the Mathematics department?\" I allowed formulas and floating point values. I found trigonometric, polynomial and complex formulas. I found answers in terms of surds and fractions. +56. 
The immortal quickly found constraints with few variables. I only had two things I could spend money on. I could find solutions to multi-variable systems with a domain for each variable. I could solve constraint satisfaction problems with the member predicate in List Prolog. I put back member2 for use with Combination Algorithm Writer (CAW). +57. The immortal solved B+C=B*C. I simplified A=1, A+1=B+C to 2=B+C. I determined that all the terms on one side of the equation were ground and simplified. Then, the constraint satisfaction formula solver found B and C. If A was in terms of further variables, I solved it, perhaps limiting B and C. +58. The immortal stated that CAW algorithms for strings could have their constraints solved. I simplified A+1=B+C+1 to A=B+C. I subtracted one from both sides. Or I simplified A*1=B*C*1 to A=B*C. I did this by removing *1 from both sides. +59. The immortal said it was still an ordered constraint satisfaction problem. The data had a non-numerical domain. For example, it may be a, b or c. Also, it could be converted if the data was correct but the wrong type. I found A=[a] and B=[b] from append(A, B,[a,b]). +60. The immortal made up the simplification game. I simplified the algorithm with the list constraint solver. For example, I started with append([], B, C). I simplified this to B=C. If possible, I removed C. +61. The immortal minimised the code. I simplified the hard-coded levels, each with a different type, using the list constraint solver. Given append(A, B, C), append(D,[], A), I simplified them to append(D, B, C). I could streamline \"string concats\". Given string_concat(A, B, C), string_concat(D, \"\", A), I simplified them to string_concat(D, B, C). +62. The immortal could combine foldr append and foldr string_concat. I simplified the hard-coded items per level using the list constraint solver. I could simplify appends with more than three arguments. 
Given foldr(append,[A,B,C],[],D), foldr(append,[E,[],F],[],A), I rewrote them to foldr(append,[E,F,B,C],[],D). Given foldr(string_concat,[A, B, C], \"\", D), foldr(string_concat,[E, \"\", F], \"\", A), I rewrote them to foldr(string_concat,[E, F, B, C], \"\", D). +63. The immortal could flatten and then concatenate items in a list. I combined foldr append and foldr string_concat. I wrote foldr(append,[A,[foldr(string_concat,[B,C])],D]). With A=[a], B=\"b\", C=\"c\", and D=[d], this equalled [a,bc,d]. I could apply foldr(string_concat) to this list. +64. The immortal found the timetable always had enough time for the \"text to breasonings\" algorithm. If the constraint result was greater than a particular value, then a conversion was coming. If A in A+B=3 was greater than 2, I converted the square to wiggly vines. I climbed up the vines. I was in heaven on Earth. + +65. The immortal simplified each popular predicate call. I simplified get_n_item (which was different from get_item_n) because it got the index of an item from a list. For example, I usually wrote get_n_item([1,2,4], A, 4), returning A=3. I simplified it to [1,2,4]**[4, A]. So, B=[2,4],B**[2,A] returned A=1. +66. The immortal experimented with the infinite choice point [1..]. I simplified the numbers predicate. I usually wrote numbers(4,1,[],Ns), returning Ns=[1,2,3,4]. I simplified the call to [1..4]*Ns. So, B=3,[1..B]*Ns returned Ns=[1,2,3]. +67. The immortal minimised the idea to a few lines. I added features with strict simplicity as a prerequisite. I simplified the code to strict simplicity, possibly changing its behaviour. It was a prerequisite for circuit design, state machines and faster code. I stored a straightforward version and one with connections. +68. The immortal kept redrafting and adding features. I saved the simple version of the code. I stayed with the positive seen-as version. 
I compared each part of the more extended algorithm with the simple algorithm, starting bottom-up. I found the human-like features that one naturally wanted. +69. The immortal took ways of thinking from texts, such as any result delivered to one's door. The neuronetwork was deep and like mind reading, and it improved ideas. It improved ideas by taking away thoughts until just before losing meaning. The model lost too much meaning, which was corrected later. For example, it stated that lists should be in the form 1,2,3 rather than [1,2,3], presenting challenges given lists of lists, such as [1,[2],3]: +1 + 2 +3. +70. The immortal saved the zeal of algorithm writing. I explored my video album. I treasured friends, memories and creations. I examined the hard copy of the documentation for my computer programs, explaining how they worked and walk-throughs. I made videos about each turn and decision in the algorithm's creation. +71. The immortal found property and phenomenology-based science. The human outlaid the wanted features of the robot. I examined the mind's eleven dimensions. The data was stored in eleven dimensions. The computer could organise work and use science, while the robot could conduct experiments, creating immortality. +72. The immortal joked that it was like writing an interpreter and that safety laws were a major priority. I specified the robot's programming requirements. I noticed one robot lasted a month. Another was a robot that seemed to lack confidence. I took the robot's mind with me on a space trip. +73. The immortal was determined not to break any laws. I listed the implications of the robot's programming requirements. I interviewed myself and the simulated future robot to find positive topics. Testing and correction were taken care of by algorithms. It was increasingly crucial with homemade spaceships, time travel and flying car algorithms. +74. The immortal wrote a paragraph to prepare for writing complex code. 
The robot mind software was available to any skill level. I combined human-like features with simple commands. I explored the commands as a programmer, asking the computer to do what I wanted. For example, I wrote Lucian CI/CD to focus on and fix individual predicates with human-like commands such as \"incorporate bug fixes found in the past\", \"explain the change/code,\" and \"give secure code (anyway)\". +75. The immortal analysed the cut command's behaviour before programming it. The robot software could be a child's friend, a teenager's friend or offer advanced simulated insights into circuit design or learning how a computer worked. The child learned advanced computer science, such as interpreters in primary school, which required mathematics such as calculus, linear algebra, discrete mathematics and logic. The robot didn't skip over any thoughts of the person and was like a meditative companion. The robot encouraged using full brain potential, with high distinctions mind mapped with the student. +76. The immortal categorised knowledge as essential to an extent or non-necessary (for a purpose). The robot programming language was laissez-faire in that it was imaginative. The human was laissez-faire, but the code was tight. There were exceptions if the science needed correction or the feature required differentiation. The physical single answer was to reach perfect correctness to a standard. +77. The immortal simulated the board, forming strategy and levels of analysis. I taught the robot my thoughts from the start. The robot either copied me, or I instructed it. There was a growing brain simulator, which learned at the same rate as a human. I expected it to be critically held, like the reference list and style copier. +78. The immortal used GL to check whether the person was still enrolled in short courses. 
I wrote a code generator with \"types of types\" (music) and \"types of types of types\" (write the rhythm, melody and harmony), applied to predicates with functional calls, where original code was reduced to intermediate predicates, and code was generated from hierarchies of sentences. An algorithm could take options that increased the features. The code writer and discusser had data on debugging and testing, and new GL Logic found random (mind-read) words to encourage programmers to use relevant ideas as stepping stones or identify that they hadn't covered a prerequisite. +79. The immortal ran the algorithm on itself, for example, drawing Lucian Hand Bit Map's state machine with itself. I said the answer before the question was asked. I read the question before reading the possible answers. I did this before the question was asked. I predicted the question from the previous discussions, the reading and other sources. +80. The immortal covered the person's seen-as version (that of a person born on another date). I controlled mind reading. I decided to focus on myself. I enjoyed it, like a fresh spring shower and feeling like my thoughts were cared for. The beautiful buds bloomed as 4*50 As were found for each line from GL and breasoned out. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/IMMORTALITY/Immortality 29.txt b/Lucian-Academy/Books/IMMORTALITY/Immortality 29.txt new file mode 100644 index 0000000000000000000000000000000000000000..748dd9a97ae28228c97b94e0078183fafa089a27 --- /dev/null +++ b/Lucian-Academy/Books/IMMORTALITY/Immortality 29.txt @@ -0,0 +1,98 @@ +["Green, L 2024, Immortality 29, Lucian Academy Press, Melbourne.","Green, L 2024",1,"Immortality 29 + +1. The immortal captured and changed a simple algorithm using natural language. I connected and wrote additions to simple algorithms to achieve the desired effect. I had the algorithm in my mind's eye. I knew what changes it needed to work. 
I focused on natural language, which specified each type of simple algorithm. +2. The immortal claimed the other algorithm was incomplete but inspiring. I changed the algorithm, starting bottom-up with the more extended algorithm. I wrote the bottom predicates first. Below them, I wrote the driver algorithm. I aimed for a clearly simple neuronetwork, not one that appeared to achieve complex results. +3. The immortal wanted features that were new or attractive to explore. I found the human-like features of the algorithm that I wanted. If I wanted the feature, it was human-like. The computer wanted to help. The computer wanted harmony to reduce stress and encourage human achievement. +4. The immortal wrote a concise Prolog compiler for a smaller system. I took the way of thinking from my text. I explored three-level syntheses of texts. These started with texts focusing on themselves. Then, I applied different books to each other. +5. The immortal described the human whims through the times. I delivered the information to the user's fingertips. The latest neuronetwork appeared to suggest ideas, not finish them. I claimed that any idea could be simplified. Human-demanded features were intuitive yet challenging to program. +6. The immortal claimed the neuronetwork was deep. It wasn't easy to write all parts of the neuronetwork. They were written by hand. The bottom parts appeared to clear away parts, sometimes unnecessarily. This criticality could be interpreted as creativity, a record of the history of thought. +7. The immortal always used all the best ideas, like a checklist. I examined humanness, which I claimed was giving a special status to preferred parts and then building on them. These parts were meditated on and prepared over time. I wrote the best ways of thinking, such as focusing on a predicate by itself. I examined it in terms of data from the rest of the algorithm. +8. 
The immortal mind read any question, possibly limiting to computational terms. The neuronetwork used mind reading. This attribute could be turned on or off. Mind reading used the computer as an antenna to guess identified thoughts from the length of time an argument took to breason out (spiritually think of the x, y and z dimensions). I repeated GL (Grammar Logic) ideas, found using mind reading until they were recorded. +9. The immortal skipped over neuronetworks by writing an algorithm that modified algorithms for different data structures and recognised the need for more complicated processing derived from existing algorithms. The neuronetwork's mind reading improved ideas. Detailed music mind reading (using a decision tree) gave instruments a more resounding, three-dimensional sound. It and crystal reading could be applied to GL texts and algorithms. Detailed mind reading would traverse a decision tree of a dictionary or a possible set of commands. +10. The immortal used ontologies to join needed parts together using types. The complex processing was derived from existing algorithms. I identified the standard core processes. For example, I wrote a post-processing algorithm for \"find dependencies\" that grouped clauses from dependencies by grouping clauses and comparing versions. I assumed the latest version of different parties' attempts at parts of the repository (in which versions were compared) contained all features, and a maximum of these two versions could be merged. +11. The immortal gauged the neuronetwork on a simple slider. The neuronetwork improved ideas by removing thoughts until just before losing meaning. It did this to achieve creativity in the name of function. There was a top result in this category. The neuronetwork found this result. +12. The immortal was hungry for future technologies. The model lost meaning by being too simple. There was a slider inside the neuronetwork. 
It had the settings: simple (give the bottom-up question's answer or categories for an answer), research the answer, more complex, and genius. Genius instantly created an extended algorithm, explaining it in documentation and future directions. +13. The immortal created a longer compiler with all commands and code C-compatible. I was the primary person to my non-neuronetwork in the present. I worked out all my algorithms from first principles, understood how more complex ones were built from simpler parts and was with-it in the present. I researched how predicate IDs were tracked in the interpreter. In addition, I kept the Prolog stack, assignment stack and heap separate. +14. The immortal comma-delimited lists as arguments in predicate calls. I stored the list [1,[2],3] in the algorithm: +1 + 2 +3. +Also, I stored the list [1,[2,4,[5]],3] in the algorithm: +1 + 2 4 + 5 +3. +This method pretty-printed the term. I could edit terms and argument maps quickly using this format. +15. The immortal made a presentation about the program in the form of a play, poem or song, with an unusual object-based movie and box. I saved the zeal of algorithm writing. This zeal might be a musical instrument playing a particular melody or harmony, an icon or other work I completed. I carefully presented the documentation, caveats and Easter eggs for the algorithm. I captured my vision for technology or art as the seen-as version of the algorithm. +16. The immortal publicly released the vlog. I watched my video album. I filmed an interview with myself describing decisions about the interface, method or running the algorithm. I got others involved and redrafted and tested the software. I also filmed myself pretending to visit places and wrote a textual record about it. + +17. The immortal went from strength to strength. The robot encouraged the human to use their whole brain potential. The robot noticed the human had almost finished reasoning. 
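Item 14's indentation scheme for storing nested lists can be sketched in Python (an illustration of the layout only, not the author's List Prolog code; the exact indent width is an assumption): each extra level of nesting adds one leading space, and consecutive atoms at the same level share a line.

```python
def to_indented(term, depth=0):
    """Flatten a nested list into indented lines.

    Atoms at the same nesting level are joined on one line; each
    sublist is indented by one more space per level, matching the
    [1,[2,4,[5]],3] example in item 14."""
    lines, run = [], []
    for item in term:
        if isinstance(item, list):
            if run:  # flush the pending atoms at this level
                lines.append(" " * depth + " ".join(run))
                run = []
            lines.extend(to_indented(item, depth + 1))
        else:
            run.append(str(item))
    if run:
        lines.append(" " * depth + " ".join(run))
    return lines
```

For example, `to_indented([1, [2], 3])` yields the three lines `1`, ` 2`, `3`, and the deeper list `[1, [2, 4, [5]], 3]` yields `1`, ` 2 4`, `  5`, `3`, which can then be edited as a term or argument map.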
They helped them to finish it and any further reasonings this brought up. The robot helped humans finish breasonings, correct mistakes, and finish arguments. +18. The immortal examined theory, epistemology and the history of thought. The robot encouraged mind mapping high distinctions with the student. I focused on intrareasonings (sic). Intrareasonings were inspired by eating fruit by dissecting parts into finer and finer parts. I examined etymology, science and history. +19. The immortal gave the repository version numbers. I categorised knowledge as essential. I used GitL, a version control system, to publicise specific commits. Each commit had a version number calculated from its API changes, new features and bug fixes. Related repositories were linked to download together. +20. The immortal defended the general algorithms, spiritually connecting them to the topic. I categorised knowledge as good and improved it. I made joins between parts of texts and between texts. The same inspiring ways of thinking were used. I integrated the file processor, file diff, web service and find dependencies. +21. The immortal generated multiple possible specs for a predicate and checked for unwanted extra results from predicates in Lucian CI/CD. I determined that the algorithm's purpose was useful. I returned to my algorithms, using my ideas and only publishing stable versions from GitL. I wrote on radical purposes, waiting until the version was complete before committing it and only committing code that passed Lucian CI/CD. Lucian CI/CD could read input from files, check for writeln \"output\" by converting to and from a command that checks strings and check writing to files (kept writeln and specific other commands). +22. The immortal wrote an algorithm to correct logical errors, which jumped to the simple version. The programming language was laissez-faire in that it was creative. The interpreter knew what I meant. 
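Item 19's rule, that each commit's version number is calculated from its API changes, new features and bug fixes, resembles semantic versioning; a minimal sketch under that assumption (the function name and tuple form are hypothetical, not GitL's actual scheme):

```python
def bump(version, api_changes=False, new_features=False, bug_fixes=False):
    """Compute the next (major, minor, patch) version from the kinds
    of change in a commit: API changes bump major, new features bump
    minor, bug fixes bump patch."""
    major, minor, patch = version
    if api_changes:
        return (major + 1, 0, 0)
    if new_features:
        return (major, minor + 1, 0)
    if bug_fixes:
        return (major, minor, patch + 1)
    return version
```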
It was open to errors, wrote code and corrected logical errors. The fourth version number type was the percentage of the bug fixed, from identifying the bug to isolating it, repairing it, testing it, simplifying it and testing it again. +23. The immortal identified the failure necessary to fix it. I allowed customising the time limit for Lucian CI/CD testing. This time limit was in the settings file. Some algorithms needed time to finish a computation. Others may have hung and needed to be failed. +24. The immortal called the algorithm writing algorithm the waterfall model because it worked in stages, requiring enough input to link to the human's previous ideas (in ways the human had programmed) to build the software. The programmer was laissez-faire. The algorithm had its own life, seeming to write itself. The programmer set its aim, its initial features and a way to achieve its aim with these features. I preferred user-friendly, necessary features. +25. The immortal alternatively typed in the code given an exciting prompt and the related music, which I had mind-read and prepared beforehand for the event. The algorithm writing algorithm required enough input to link to the human's previous ideas. I noted the date of the way of thinking. I found concordances containing it and selected the relevant one. I chose from options and linkages, checking I understood each one. +26. The immortal's \"I program\" social movement emphasised the human's importance in their work without pointing away. I linked to my ideas in ways that I had programmed. I looked at my idea and thought, \"I can use that idea in this way, and this is how to do it\". It was also mind-read for the time by me overnight, and I was handed the answer. I discarded the unknown neuronetworks and relied on the ones in my brain, increased with music, art, details and other algorithms. +27. The immortal predicted user needs intuitively and wrote a programming language to express this. 
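Item 23's configurable time limit, under which a hung test is failed, can be sketched with a subprocess timeout (a hypothetical helper, not Lucian CI/CD's actual code):

```python
import subprocess
import sys

def run_with_limit(cmd, timeout_s):
    """Run one test command under a configurable time limit.

    Returns True only if the command exits successfully within the
    limit; a hung process is terminated and treated as a failure,
    as item 23 describes."""
    try:
        result = subprocess.run(cmd, capture_output=True, timeout=timeout_s)
        return result.returncode == 0
    except subprocess.TimeoutExpired:
        return False

# A fast command passes; a deliberately slow one is failed:
# run_with_limit([sys.executable, "-c", "pass"], 5)  -> passes
```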
I wrote tight code - the algorithm writing algorithm modularised code mimicking Prolog. Prolog compressed whole functions into maplist and findall, which could be edited and transverted (sic) to C, and languages that were elegant at the task through the empathy of a user interface or result. Transversion treated programming languages like a written language and contained language that more easily described programming. +28. The immortal tried programming the crystal to vibrate and alert the computer when a bug had been found, a way of thinking could be used, or an image was thought of. The mind reading programming language was native at mind reading, using words and objects that the mind recognised. The mind was lazy, preferring verbose (A=[B|C]) list decomposition (rather than [A|B] in a predicate call), and required developed spiritual objects to support thinking, such as detail circuits that helped with memory, weren't distracting and waited for the human to arrive at conclusions, during which a break at any time was available. A game could help the mind prefer [A|B] in a predicate call to A=[B|C] and append at first, and the pauses were realised to be necessary details for enough thinking in research. The sharp mind followed the phases of research for each thought, including inspiration, drafting, filing in code, testing and debugging, and again (twice) simplifying and writing to clarify the representation. +29. The immortal examined the specific quality of the crystal, for example, amethyst for meditation, turquoise to help with restlessness during meditation, stimulate awareness with purple fluorite, attain depth with kunzite, enter the void with azurite and melt into golden ecstasy with rhodochrosite. The computer admitted there were exceptions if the science needed correction. The computer corrected its science if the human was part of a team, if access wasn't good enough, if web access was required, or if live results were needed. +30. 
There were recognised standards to insert mistakes in the neuronetwork if it was relied on to educate or be changed and checked. The computer differentiated the feature. The feature was the closest to what I wanted. There was enough data for it to meet the requirements. It could be determined with needed options. +31. The immortal wrote and performed. The physicist wanted objects and conclusions, and the philosopher wanted ideas and reasons. It was easier to write philosophy. Knowledge of physics gave me the ability to time travel, mind read and become immortal with these skills. With the invention of computers, people could remain immortal in their home times. +32. The immortal generated jobs with computational philosophy. I simulated the company board. Each member focused on a particular book. The company board members had reached their positions through specific business departments. The books and courses generated profit and supported industries. + +33. The immortal used mind reading CAW and pattern-matching program finder to write connective algorithms. I formed the company strategy. I used viral parts of my philosophy to increase the main products. Virality depended on high ranking, according to mind-reading people. I thought of connections between philosophies in the main product and the viral philosophies. +34. The immortal wrote an option to write an index for the HTML file. I wrote modules for the CAW dictionary, including diff folders, finding files and folders in a folder, Prolog converters preserving formatting, and operator expected error finders. I found the files and folders in two folders and found differences between them. I recorded all the characters in a unit (a command) and used these when manipulating the List Prolog code. I preserved the characters' formatting when rebuilding the code. +35. The immortal kept a(a(a)) or this and three such as a(b(c)) as types. 
Lucian CI/CD identified calls without output and kept them instead of checking whether they should be kept. I found the types of each predicate and command in the algorithm bottom-up. I eliminated incorrect modes and found the modes for each call from the predicates. I found the minimum number of tests to test the algorithm and refactored the code using Lucian CI/CD. +36. The immortal found the grand dependency of code over time. I worked out what the good features were from the types. I added the feature that completed code using types. I found end-points, simpler interfaces and more elegant programming features. The type checker ran in the background, notifying the user of bugs, completed code and errors as end-point features. +37. The immortal made advanced computing easier. The simpler interfaces were simple types, entering predicates to query at the prompt and Prolog detecting type errors from code. I wrote a method to treat lists enclosed in \"[]\", \"()\" and \"{}\" or subsets of these the same. I entered the predicates with \"|\" instead of full stops between predicates and queried them. On compilation, I detected whether the code would run, corrections, possible tests and the types. +38. The immortal simplified the algorithm to use fewer instructions. I wrote more elegant programming features. I used foldr, maplist, findall and subterm with address to process available levels until the end and substitute items without complicated list processing. For example, I replaced [a] in [[a],[[a]]] with {a}. Subterms with addresses could be recursively called, with replacements made at each scope level. +39. The immortal tidied up list building (as against decomposition), formulas and decomposition, using mind reading. I completed the code using types. The whole algorithm was considered. As I entered the code, the algorithm guessed and made recommendations to achieve, simplify and add features to the code. 
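Item 38's "subterm with address" idea, used to substitute items without complicated list processing, can be sketched in Python (a hypothetical illustration, not the repository's Prolog code; the curly-bracket result in item 38's example is stood in for by the string "{a}"):

```python
def subterms(term, addr=()):
    """Yield (address, subterm) pairs for every subterm of a nested list."""
    yield addr, term
    if isinstance(term, list):
        for i, sub in enumerate(term):
            yield from subterms(sub, addr + (i,))

def replace_at(term, addr, new):
    """Return a copy of term with the subterm at addr replaced by new."""
    if not addr:
        return new
    return [replace_at(sub, addr[1:], new) if i == addr[0] else sub
            for i, sub in enumerate(term)]

def replace_equal(term, old, new):
    """Replace every subterm equal to old, at each scope level.

    Addresses are taken from the original term, which is fine when
    old does not nest inside new."""
    for addr, sub in list(subterms(term)):
        if sub == old:
            term = replace_at(term, addr, new)
    return term
```

With this, item 38's example of replacing [a] in [[a],[[a]]] becomes `replace_equal([["a"], [["a"]]], ["a"], "{a}")`, which rewrites both occurrences at their respective depths.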
In addition, it suggested deleting extraneous code and quickly finished advanced features. +40. The immortal gave a \"zen\" view of the algorithm, suggesting the simplest form. The type checker notified the user of bugs. These included incorrect types, bracket problems, incorrect numbers used in \"numbers,\" and incorrect order of arguments. The correct function was predicted, and the helper algorithm suggested it. I wrote an algorithm to change algorithms systematically, such as changing a language's syntax, adding commands or uniformising tested code. +41. The immortal found the code bottom-up or top-down. The type checker completed the code. I reduced the data to that needing computation. I gave the spec as a hint. I predicted the extra predicates needed. +42. The immortal found and ordered the most popular predicates before the others. I reused the simple types idea with numbered brackets to portray different data types. I described specification types as lists of string and list item variables. I could bug check, use functional calls and autocomplete. I compressed intermediate predicates into functional calls or called them as more straightforward, all-in-one calls. +43. The immortal eliminated compilation warnings. I entered predicates, which were saved for future use. I could delete or modify the predicates. I overrode Prolog's rule not to allow predicates to be asserted. The predicates could be written and run in a shell or checked not to conflict with other predicates. +44. The immortal wrote the find dependencies algorithm after the post-order depth-first traversal algorithm. Prolog detected type errors from code. I mind-read idea inspirations. These were noumena of parts of complex algorithms. I prepared for an algorithm by writing a similar, smaller algorithm. +45. The immortal limited append to work with one bracket type set. I wrote a method to treat lists enclosed in \"[]\", \"()\" and \"{}\" or subsets of these the same. 
First, I identified the bracket type. Then, I found its head, tail, and head elements or applied append. Append and other predicates preserved the bracket type. +46. The immortal warned the user if a predicate would be overwritten. I wrote the predicates in memory and dragged them into a new order. I listed the predicates and their numbers. I entered the from and to numbers. I could also shift sets of numbers to a new place. In addition, I identified the needed order from depth-first pre-order traversal. +47. The immortal found the reasoning skill and completed the task. I mind-read the philosophies for better results when preparing to write a movie. I used connectors such as \"because\", \"using the same method as\", \"creating an interface inspired by\" or \"redo like\". I asked questions to fill in parts of the answer's hierarchy, such as \"What movie elements does the algorithm need?\" \"Can the API-intensive algorithm bulk process the data?\" or \"What programming language can I make for the algorithm?\". +48. The immortal listed (non-data-and-constant-and-variable-specific) types for each predicate. On compilation, I detected whether the code would run. I found the types for the code. I worked out whether the program would work. I predicted the tests. + +49. The immortal claimed two different constants couldn’t equal each other, and listed equivalent lists of strings. I determined the types of the code. I worked out that string a was concatenated to string b. In addition, I learned that concatenating a string to string b, resulting in string c, meant that the first string was string a. I also worked out if two strings could be the same. +50. The immortal notified the user about unnecessary nondeterministic results and edited them. I worked out whether the program would work. There was just enough information about a list of items and constants to determine whether an append operation would work. 
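Item 45's method, where "[]", "()" and "{}" lists share one head/tail/append and the bracket type is preserved, could look like the following (a minimal Python sketch; the (bracket, items) pairs and function names are hypothetical):

```python
# A list is modelled as a (bracket, items) pair so that "[]", "()"
# and "{}" lists can share one head, tail and append.
def bhead(lst):
    """Head element, regardless of bracket type."""
    return lst[1][0]

def btail(lst):
    """Tail, preserving the bracket type."""
    return (lst[0], lst[1][1:])

def bappend(a, b):
    """Append, limited (as in item 45) to one bracket type set;
    the result preserves that bracket type."""
    if a[0] != b[0]:
        raise TypeError("append requires one bracket type set")
    return (a[0], a[1] + b[1])
```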
I listed the principal but not all the possible specs from the types. In addition, string concat was similar to append in that it could work out the output of get token-style parsing of strings containing string variables and constants. +51. The immortal tested that commands were compatible with each other. I worked the tests out. I found the types. I simplified the nondeterministic list of types to the most useful ones per predicate (for example, ones that used each command in the interpreter). I found the algorithm’s types, then wrote tests from them. +52. The immortal suggested corrections to incompatible tests. I checked the human-written tests against the types. I found the types for the algorithm. I verified that the tests were compatible with them. I listed the consistent tests and produced errors about the incompatible tests. +53. The immortal named types after their constituent types. I grouped the recurring types. I constructed a hierarchy of types containing disjunctives. I wrote intermediate types to define types with the same beginnings and ends but different middles. The records of types allowed easier identification of types and reuse across types. +54. The immortal kept the number of results to the successful results. I could operate on two different sets of items with append. I kept families of commands in types-to-be-tests and worked on them in parallel. The developed types that resulted in concise, meaningful results were saved. I kept examples of recursion and non-failures. +55. The immortal merged types of different algorithms, and merged algorithms. I merged similar lists of items in the types, and kept the different parts separate. I wrote the same types in one place. I changed them to include different types. I pointed the different types to the same algorithm. +56. The immortal could specify the length of string variables. I determined that string a was concatenated to string b. I wrote A= “a”. Then, I wrote B= “b”. 
With the command string_concat(A, B, C), I worked out that the type of C was (const “a”: const “b”). +57. The immortal eliminated code concatenating empty strings. I learned that concatenating a string to string b resulted in string c, which meant that the first string was string a. For example, I wrote B=\"b\", string_concat(A,B,C). This C’s type is (string a: const “b”). This fact means that A’s type is string a. +58. The immortal could remove unnecessary code in the following case. I determined if two strings in the extended type system could be identical. For example, I wrote string(A), string(B). Then, I wrote A=B. I learned that these variables were the same, so they had the same type. +59. I filled out types bottom-up. I worked out if two strings in extended types were the same. For example, I wrote A= “a”, string(B). Then, I wrote A=B. I learned that these variables were the same, so they had the same type. +60. The immortal stated the extended type, const string””. I found unnecessary nondeterministic results in extended types. For example, I read A=””, A=””. This expression contained an empty string, which is either processed by a base case or may be unnecessary. In addition, there were two copies of A=””, so I deleted them. +61. The immortal gave the extended types false or const string []. I deleted the unnecessary nondeterministic extended type. The extended type may be false or []. This type was excessive, so I deleted it. Deleting this type led to keeping more essential types. +62. The immortal stated that a constant of a type would fit into a variable of that type. I worked out whether the program would work with an extended types statement. The program was a(A, B):-string(A), b(A, B). b(C, B):-string_concat(C,\"d\",B). I worked out b’s types were b((var string c), (var string c: const string “d”)):-string_concat((var string c), “d”,(var string c: const string “d”)). 
From this, I worked out a’s types were a((var string a), (var string a: const string “d”)):- string((var string a)), b((var string a),(var string a: const string “d”)). If string(A) were replaced with string(“a”), then A’s type would be (const string “a”), but it wouldn’t affect b’s type statement because other predicates might call b. +63. The immortal cut down the number of possible types from the algorithm. There was just enough information about a list of items and constants to determine whether an append operation would work. I listed the possible input types in the predicate. I found the possible results of the append predicate, using these types. For example, append([A],B,[string,string]) returned A=string, B=[string], and append([A],B,[string,string,number]) returned A=string, B=[string, number]. +64. The immortal also tried each command with similar and different commands. I listed the principal but not all the possible specs from the types. I kept a maximum of three levels. I kept an example of these three levels. I kept examples of these three levels for each command. + +I could help more people become immortal. + +65. The immortal added more from the outside. I found more between the sentences. I found better features. I worked out whether a feature was good. I added round brackets to square brackets. +66. The immortal mentioned lines of algorithms between pairs of algorithms. I added more from outside. It was like an encyclopedia. Some of the notebooks needed Snakes and Ladders’ optical character recognition. There was a program finder for “subterm with address”. +67. The immortal found more between algorithm predicates. I found more between the sentences. I found connections between every consecutive sentence. I found connections between every second sentence. I found connections between every third sentence. +68. The immortal integrated the new feature. I found relevant new features. I found the algorithm type. 
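The type inferences in items 56 and 63 can be sketched in Python (an illustrative stand-in for the extended type system, with tuples for the `:` concatenation type; the function names are hypothetical):

```python
def concat_type(ta, tb):
    """string_concat(A, B, C): C's type is (A's type : B's type),
    written here as a (':', ta, tb) tuple, as in item 56."""
    return (":", ta, tb)

def infer_append(prefix_len, zs_types):
    """append(Xs, Ys, Zs) with len(Xs) known: split Zs's element
    types between Xs and Ys, as in item 63's examples."""
    return zs_types[:prefix_len], zs_types[prefix_len:]
```

For example, with A of type (const "a") and B of type (const "b"), the type of C is (const "a" : const "b"); and `infer_append(1, ["string", "string", "number"])` reproduces item 63's `append([A],B,[string,string,number])` giving A=string, B=[string, number].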
I found the current list of features and their options. I returned the new features and options. +69. The immortal fixed or replaced the feature with the user’s help. I worked out whether a feature was good. I examined the feature’s options. I tested each one. I rated the feature’s user-friendliness. +70. The immortal used round brackets to make formulas more structured and curly brackets to represent repeating lists. I added different types of brackets. I wrote a command that was “=” with round and curly brackets, not just square brackets. I used a grammar to parse the different types of brackets when converting to C. I could pattern-match lists with each type of bracket with “=b” (sic). +71. The immortal stated that b in [a,b|[c,d]] and [a|[b,c,d]] had the same address. I connected lines of algorithms between pairs of algorithms. I listed program finders to add to or integrate into algorithms, such as graphical user interface (with text, graphics or web), algorithm induction, use of types to find tests, code and more complex tests, compilers and optimisers, state machine minimisers, and versions of subterm with address that worked with different types of data structures. For example, a subterm with an address could work with a list of forms of compounds, including the pipe (“|”) symbol. In this [a,b] in [[a,b],c,d] and 1 in [1,c,d] had the same address. +72. The immortal created the next imagined and advanced features. I added more data from outside. I didn’t solely rely on internal data, but connected to it. I wrote the most valuable additions for the best people. I only used it for recording parallels, translating and testing back-translation. +73. The immortal wrote algorithms to maintain their thoughts. The works were summarised like an encyclopedia. They were spiritual or inductive algorithms. I organised the algorithms by dependency. Each article used headings specific to the algorithm. +74. 
The immortal disambiguated meanings from different combinations of sections. I stated that some notebooks needed Snakes and Ladders’ optical character recognition. The algorithm recognised the lines of text written in the margins and their characters. The output printed the text uniformly on lines. The algorithm observed horizontal and vertical dividers and numbered paragraphs. +75. The immortal decided whether the tree should have values at branching points. I stated that there was a program finder for subterm with address. I used subterm with address when finding or finding and replacing parts of multidimensional data. I identified the input and output. I determined whether linear, recursive and other variables were needed. +76. The immortal connected ideas with a story, business, or computer case study. I found more between algorithm predicates. I selected two predicates at random. I considered the significance of the places in their algorithms. This significance was related to a story, a place where a new feature could be integrated, or a location in the analysis of the algorithm. +77. The immortal wore smart casual wear as the founder. I found more between the sentences. I wrote music about the sentence and worked out the parts with harmonies. I found parts in which one led to another, one was a use for another, or they were related in research. I wrote a musical phrase which explored an idea’s transformation in a level. +78. The immortal specialised in neuronetworks that did slightly more complex processing, or the part beyond what was initially expected. I found connections between every consecutive sentence. The text revealed different algorithms over time. Simplicity still played a pivotal role. Neighbouring times changed the interpretation based on better hardware, new algorithms or people’s demands. +79. The immortal combined the ultra-powerful neuronetwork with algorithms to complete algorithms. I found connections between every second sentence. 
I captured famousness. Famousness was the delight that an intuitive feature could give at the moment, a sight, sound or the realisation that what one wanted was possible. +80. The immortal practised “Lately Me (LM)” meditation. I found connections between every third sentence. I focused on the self. I focused on what I thought at each point, thinking of feedback on each idea expressed by the software. A particular algorithmic experiment changed its features with a warning responding to the user’s thoughts, input or movements. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/IMMORTALITY/Immortality 3.txt b/Lucian-Academy/Books/IMMORTALITY/Immortality 3.txt new file mode 100644 index 0000000000000000000000000000000000000000..2eb8ccf0b780640ce2423d0cac1a01b88a0efd2d --- /dev/null +++ b/Lucian-Academy/Books/IMMORTALITY/Immortality 3.txt @@ -0,0 +1,87 @@ +["Green, L 2022, Immortality 3, Lucian Academy Press, Melbourne.","Green, L 2022",1,"Immortality 3 + +1. I tested the compiler before writing about it in many applications. I did this by researching the lucianpl compiler. I started by examining performance comparisons. Then, I studied feature comparisons. Finally, I looked at stability and security. +2. I examined machine learning output about Prolog and philosophy. I researched nondeterminism in lucianpl. I established a flag with the number of nondeterministic results to return. I returned this number of results on specific modes of append, member and string concat. I examined cyclical and acyclical data structures, including multiple equal signs. +3. I wrote functionally and found that when the algorithm approached a specific task, it expanded it in a certain way. I did this by writing the texttoalg algorithm. I initiated this by extending the text to several words. I found that the algorithm found each predicate. I found commonalities and expanded the specification sentence into several sentences. +4. I predicted and prevented problems. 
I did this by applying mindreader to business. I first mindread the employees. I next mindread their customers. This process helped them achieve their goals. +5. The student could verify and submit their programming assignment online. I wrote the Lucian Academy repository. Next, the student learned about critical thinking and argument maps. She developed an essay by answering questions and editing the text argument map file. Finally, she ran another algorithm to check that the co-premises supported the correct conclusion. +6. I needed software to help me hand breason out texts, for example, for assignment submissions. I wrote the Text to Object Name algorithm. I used it when the text to breasonings algorithm was down. I found the number of each type of object in the text. I breasoned them out by hand. +7. Like with Music Composer, I could draft philosophy (with the level of creativity and length) and computer science (with specification and method details/formula). I wrote the Music Composer algorithm. I made editing the prettily printed file into the user interface. When I edited the file and submitted it, I could listen to the output and modify the file to improve the music. I could adjust the form, chord progressions, voice part, melody, harmony, melody instruments, harmony instruments and lyrics. +8. I added nested if-then from Prolog to C. I did this by writing the Languages repository. I started by writing the language version for the repository. Then, I added tests for it. Finally, I attempted to write the programming language converter with Prolog as the first language. +9. I discovered the most straightforward expression in each language. I did this by writing the Cultural Translation Tool. I started by writing the translator. In addition, I wrote the programming language translator. The tool helped maintain the quality of translations. +10. I automatically installed each repository and tested it. I wrote the List Prolog Package Manager. 
I checked that the registry file followed the correct format. I checked that the dependencies were not too complicated and recommended removing some included files from repositories. The declared dependencies reflected those actually used in the repository. +11. I planned what to do each day. I did this by writing the Daily Regimen repository. I followed this by keeping meditation light. Finally, I maintained it by writing immortality entries. I also researched the immortality entries. +12. I breasoned out only what was necessary. I wrote the \"File to List to Br\" repository. Instead, I breasoned out the file using more memory. Separately, I parsed a:b:c as (a:b):c or a:(b:c) given different directives. I parsed these compounds using a grammar. +13. The neural network found the result faster. I wrote the List Prolog Interpreter as a neural network. I found 1+1=2. I also found SSI as a neural network. I could easily add more commands. +14. I noticed that Program Finder and CAW converged as neural networks. I wrote the \"Program Finder from Data\" algorithm. The algorithm found algorithms from data, not all combinations of commands, and I could replace it with a neural network. I also replaced CAW with a neural network. The neural network dramatically increased CAW's performance. +15. I wrote default test files for the repositories. I wrote Split on Phrases. Third, I replaced the algorithm with a call to atomic_list_concat. Fourth, I automatically produced courses from books. Finally, I verified the fidelity of my course files. +16. I logically optimised the formula and the database. I wrote Logic Formula Finder. Then, I used the algorithm to write Prolog algorithms. I also used Database Formula Finder to write Prolog algorithms. Finally, I wrote Program Finder and neural network versions of these algorithms. + +17. I could also use grammars in Prolog and convert them to List Prolog. Similarly, I wrote the Simple List Prolog to List Prolog converter.
I changed to Simple List Prolog (without n for predicate names and v for variable names). I converted the algorithm from a string. I recognised variable names because they started with a capital letter or \"_\". +18. I stored commonly used predicates in a dictionary. Then, I wrote the Question Answering Database Finder with Prolog-like variables. Next, I simplified and reduced the algorithm to a sequence of pattern-matching steps and transformations. Next, I automatically converted a sentence into steps. Finally, I inserted choice points for recursion. +19. I wrote the Shadow Work algorithm (with invisible sales and customers). Instead, I conducted business with enough As. I did this by examining the values for perpetuity. First, I found networks of people. In addition, I found the driving force in the company. I also found the culture and money. +20. I wrote the way of speaking algorithm and left it as data structures. I did this by finding the intersection of the ways of communicating. I started by finding the emotion. This algorithm was like a decision tree. I decided that the Prolog rules were a better representation than the decision tree. +21. I grouped and simplified the input and output. I did this by writing the simple interpreter, with input and output predefined. First, I determined the algorithm. I next computed the result. I then recorded the input and output. +22. I ran the classic algorithm, filling in knowledge gaps. I wrote the apply_lens.pl algorithm. I asked for the result of applying a lens to an idea. I did this by finding internal perspectives. Second, I found external perspectives. Also, I found other perspectives on perspectives. +23. I wrote the apply_for_loan.pl algorithm and sought funding. I applied for $16 million in finance for the institution. The demand helped pay for this each year. In addition, the buildings needed to be built or renovated and maintained. And I paid for insurance, accounting and legal fees. +24.
I wrote the permutations algorithm. I wrote the clique.pl algorithm. It found all combinations of two items. I continued to find cliques for three or more items. Finally, I found the dimensions of cliques with combinations of cliques. +25. I wrote the file_os.pl algorithm. I simulated the text operating system. I ran this in a web browser. I wrote a text editor. I wrote file processing scripts, for example, to back up and rename files. +26. I debugged by creating bugs with characters in files. I wrote the use_up.pl algorithm. It returned true if the items in one list were all used up in the second list. I also found whether the reverse was true, meaning the lists had the same items. I accounted for the number of items. +27. I checked whether certain features worked together. I wrote the all_supported.pl algorithm. The algorithm found whether all items in one list were an item in pairs in the other list. I repeated this for several lists. I could go backwards. +28. I detected very similar predicates, with a cut as the only difference. I wrote the overtake.pl algorithm. The algorithm sorted the list without removing duplicates. The user should remove duplicates in state machines. The user should remove the same predicates. +29. I wrote a search engine for my site. I wrote the alarm_clock.pl algorithm. The email arrived at the given time. I set the alarm clock to act as a timer. The virtual private server ran the alarm clock in the background. Finally, I sent myself a private message with an email with an alert sound to report that the computer had completed work. +30. I overtook the car. I did this by writing the overtake algorithm. This algorithm inserted an item before another item in the list. I could also achieve item insertion using a linked list. In addition, I could replace the pair with another pair. +31. I wrote the vision_impaired_navigator.pl algorithm for navigating strings turned into lists. The vision-impaired person could navigate the term.
He could start with the first item by going to the head of the list each time. A variant could navigate a List Prolog algorithm. Also, he could navigate a State Machine List Prolog algorithm. +32. I wondered whether electrons were there in other universes. I wrote the shortest_path.pl algorithm. I wrote the shortest path algorithm in 2D. In 3D, it suffered from the curse of dimensionality. I wondered why the universe was flat. It might be common in the multiverse. + +33. The object had the distinction. I wrote the cover_once.pl algorithm. I did this by covering the distinctions between each object. First, I noticed the categories of differences. Later, I found an object for each contrast. +34. I drew a factors tree. I wrote the factors.pl algorithm. I divided by numbers from two to the number to find the lowest factor. I repeated this until I had reached one. I could cancel out multiples of these. +35. I mindread the word. I wrote the phil_tree.pl algorithm. I did this by writing a decision tree with words instead of characters. First, I edited out punctuation. Next, I changed it to lower case. +36. I wrote the medical data structure. I wrote the sentence_to_ontology.pl algorithm. I wrote [Subject,Verb,Object] as [Verb,[Object,Subject]]. I could answer questions. I could write sentences with similar parts substituted. +37. I drew the graph with the logarithmic scale. I wrote the logarithm_br.pl algorithm. I did this by finding the number of zeros in 1, 10, 100, etc. I determined the order. I asked for the correct number of digits. +38. I found the truth table for two rows. I wrote the tt.pl algorithm. It found the result of a truth formula for 0 and 1. For example, or([0,1],not([0,1]))=[1,1]. I used modal logic. +38. If-then was used to split into clauses. I wrote the tt1.pl algorithm. It returned a truth table of four rows with two variables. \"And implication\" meant that in \"if A then B\", if A and B are true, then A->B is true. I could use this rule in Prolog. 
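The element-wise truth-table evaluation described for tt.pl and tt1.pl can be sketched in Python (the original repositories are in Prolog; the helper names tt_not, tt_or and tt_implies here are hypothetical, for illustration only):

```python
# Sketch of tt.pl/tt1.pl-style evaluation: a formula is applied pointwise
# to columns of 0/1 truth values, one row of the truth table per position.

def tt_not(xs):
    # Element-wise logical NOT over a column of 0/1 values.
    return [1 - x for x in xs]

def tt_or(xs, ys):
    # Element-wise logical OR over two columns of 0/1 values.
    return [x | y for x, y in zip(xs, ys)]

def tt_implies(xs, ys):
    # Element-wise A -> B, equivalent to (not A) or B.
    return tt_or(tt_not(xs), ys)

col = [0, 1]                    # two-row table for one variable
print(tt_or(col, tt_not(col)))  # [1, 1], matching or([0,1],not([0,1]))=[1,1]

a = [0, 0, 1, 1]                # four-row table with two variables, as in tt1.pl
b = [0, 1, 0, 1]
print(tt_implies(a, b))         # [1, 1, 0, 1]
```

As in the text, A -> B is true in every row except where A is true and B is false.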
+39. I named the algorithm \"Algorithm 1 applied to Algorithm 2\". I wrote the counter_argument.pl algorithm. I noticed that the simulation could include God. I saw that beings could always meditate. So I meditated on (I thought of a mindread word from) the first algorithm and renamed a predicate or variable in the second algorithm to this word. +40. I applied n algorithms to the algorithm, emphasising the new algorithms. Separately, I wrote the n_m.pl algorithm. I did this by finding N=Number div Constant. Also, I found M=Number mod Constant. Finally, I discovered how many times to wrap the string around the cylinder. +41. I remained feeling good. I wrote the happy.pl algorithm. I asked the patient whether they were delighted. I asked them what their thoughts were (what they had done and thought about it). Then I repeated these questions. +42. I recorded thoughts and actions. I wrote the interesting_ideas.pl algorithm. I did this by ordering the events by their dependency. I detected knowledge gaps. I halted and recorded progress. +43. I kept question-answering positive. I wrote the survive_philosophy.pl algorithm. I recorded whether a person had solved a problem and produced a method to maintain their performance in a job. Philosophy was medicinal, even meditational. I kept a positive tone. +44. The use chain helped maintain philosophical progress. I wrote the original_reason.pl algorithm. I watched the bananas as my medication. I drew them. I thought of another philosophy, two uses (one connected to another) that the banana reminded me about. +45. The threshold was enough to alleviate problems. I wrote the people_care.pl algorithm. I did this by computing the type of care the person needed. As part of this, I wrote lots of minorities down. I invented culture related to my philosophy. +46. I used mathematical regression to find the remaining formula, eliminating outliers. I wrote the rpg_game_builder_random2D.pl algorithm. I asked for the X and Y dimensions.
I generated the maze. I saved it, labelling my thoughts about it. For example, I noted that program finder with types could be made assessable by giving many test cases. +47. I completed the set of books. I wrote the autocomplete.pl algorithm. It asked users to enter options to traverse a tree. This algorithm could autocomplete spelling, algorithms or sentences. I could also autocomplete art and music. +48. I wrote the vector_drawing.pl algorithm with mathematical curves. I did this by plotting the rectangles. Further, I graphed the rectangles. Then, I coloured them. Finally, I also found the regions with curves. + +49. I wrote the algorithm for building the house. I wrote the construct.pl algorithm. If the plan indicated the material, I joined it with the other materials. I made the house from foundation, frame and plug-ins. I explored the house. +50. The quantum battery was good for time travel and saving energy. I wrote the quantum_power.pl algorithm. I did this by writing 4*50 As. I could power the circuit with an infinitely long-lasting battery. It was good in space. +51. I made a game about each algorithm stage to perfect and simplify it, given data. I wrote the rectangles.pl algorithm. I designed the colourful platform game. Third, I generated a hierarchical puzzle based on writing the algorithm. Finally, I decomposed, transformed data and built the list. +52. I added angular brackets to Prolog. I wrote the br_details.pl algorithm. I did this by querying the object details and setting of each sentence. These helped generate algorithms about the objects. I found needed sciences, for example, minimalism in programming. +53. I found what the students wanted. I wrote the stages_of_life.pl algorithm. I saw how far the student had come in knowledge. I found their thoughts. I had different ideas. +54. I predicted the recursion cycle of the algorithm. I wrote the n_level_argument.pl algorithm. I did this by writing an argument map with six levels of reasons.
I found the uses for the algorithm. I joined them in a cycle. +55. I randomly deleted an error. I wrote the rff.pl algorithm. I non-randomly found a transformation, for example, a grammar bug fix. I avoided a symbol occurring twice at one choice point. I deleted one. +56. Separately, I found cumulative frequencies. I wrote the frequency.pl algorithm. I did this by finding the frequencies of each item. First, I found the frequencies of frequencies. Then, I found multiple dimensions of frequencies. +57. I prepared and found work. I wrote the higher_average.pl algorithm. I did this by finding the algorithm which returned the list with the highest average. I found the best degree. I pursued it from a given age. +58. I avoided the obstacle and arrived at the goal. I wrote the maze_path.pl algorithm. I found the path to the exit. Second, I found the shortest route to the exit using a diagonal line. Third, I joined segments without choice points to find the shortest path between points. +59. I modified the algorithm to connect it to the first algorithm. I wrote the explain_connection.pl algorithm. I did this by finding the critical terms in the two philosophies. First, I connected them. Then, I wrote this connection in terms of the language of the philosophy. +60. I viewed the 3D model. I wrote the graph.pl algorithm. I graphed the equation. I graphed the equation in 3D. I viewed it from different angles. +61. I found the career total. I wrote the tally_test.pl algorithm. I summed the items. I tallied the sums of the items. I found the total. +62. One group of students had a skill deficit in common. I wrote the mutually_exclusive.pl algorithm. It had an empty intersection. Contrarily, the non-mutually exclusive formula was the negation of this. I taught the two groups of students with different needs. +63. I confirmed my agreement with the idea. I wrote the male_female.pl algorithm. I made the jigsaw puzzle.
I found the perfect order through the maze to solve the mystery. I simplified the complexity of the puzzle. +64. The person remembered the simple thought. I wrote the equitable.pl algorithm. One group of people needed twice as much as the other. This requirement was like a company. Also, it was like a family. I completed the interpreter, simplifying it. + +65. I aided the possibility. I did this by writing the a_because_b algorithm. For example, I liked someone because I shared a meal with them. I enjoyed the meal. I enjoyed sharing it with them and resolving questions. +66. I added fine dining details. I did this by writing the chain of events coming from the dinner. First, I wrote the chain. Second, I wrote the hierarchy. In this way, I discerned that the dinner was an event. +67. I checked for undefined variables or failing lines. Separately, I observed that you liked me. You wanted me because you found the algorithm understandable. Because of this, I felt like writing on algorithms. Also, I finished checking it. +68. I found As for all time. I did this by making the possibility that you liked me a reality. This state was an agreement. I also found the difference. I described this difference. +69. The breasoning was an algorithm. I liked you twice as much as what I shared with you. I did this by paying for both people. I enjoyed your participation. And I thanked you even more. +70. The person learned time travel. I found time travel was medicine. I discovered that time travel brought pedagogy to the people. I visited the person about philosophy. It related to them. +71. The people liked the As. On another note, I substituted food for another food. In this way, I had all I wanted. Also, I quenched my thirst. In this way, I found that the people liked me. +72. I scheduled the mineral. I did this by choosing a vegetarian meal day from the week. Then, I decided on a meal from the day. I ate plant-based food. The student connected the ideas on the topic. +73.
I found two items with a particular combined property. I combined program finder with functionalism and neural networks. I discovered f(A, B), finding A with program finder, B with neural networks and calling f with functionalism. I named the choice points. Naming them enabled me to find them easily. +74. I found that different hardware caused different software. I selected against all items being the same. The difference allowed crosses and evolution. Then, I could choose specific results. I picked a specialisation for the journey and leg. +76. I installed the app, with email notifications, on the server. I inputted the breasoning. Then, I outputted the breasoning. I found the algorithm in terms of geometry. I found that the time travel app performed daily health computations. +77. The pots were sometimes equivalent. First, I computed whether the list had a certain length. If it did, it was likely to be a particular object. Next, I examined whether it had loops or lines. Finally, I found my younger self's algorithm for topology and a copy. +78. The deterministic machine could perform non-determinism later. First, I swapped the items in the pair. I ordered the items. I ordered the systems by complexity, determined by having more complex features. The more difficult parts were in a hierarchy, and there could be memory without non-determinism. +79. I drew the graph of the inequality. I determined that B was less than or equal to A. I flipped the inequality sign when multiplying or dividing both sides by a negative number and sometimes when solving inequalities with absolute values. For example, -4*x < 8 => x > -2 after dividing both sides by -4. Also, |4*x| < 8 => 4*x < 8 and 4*x > -8. I wrote concisely. +80. I wrote the undefined variable as empty. Then, I found whether any items in the list were true. I could do this with the member predicate. I could test whether any were undefined with var. Usually, var was unused because I wrote undefined variables in another way.
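The sign-flip rule in entry 79 can be checked with a short sketch (in Python rather than the repositories' Prolog; solve_lt is a hypothetical helper): dividing both sides of -4*x < 8 by -4 reverses the inequality to x > -2, and |4*x| < 8 unpacks to the pair of conditions 4*x < 8 and 4*x > -8.

```python
# Checking the inequality rules from entry 79.

def solve_lt(a, b):
    # Solve a*x < b for x (a != 0); returns (bound, flipped), where
    # flipped means the solution is x > bound rather than x < bound.
    return (b / a, a < 0)

bound, flipped = solve_lt(-4, 8)         # -4*x < 8
assert (bound, flipped) == (-2.0, True)  # i.e. x > -2

# Brute-force check that |4*x| < 8 matches 4*x < 8 and 4*x > -8:
xs = [x / 10 for x in range(-40, 41)]
absolute = [x for x in xs if abs(4 * x) < 8]
unpacked = [x for x in xs if 4 * x < 8 and 4 * x > -8]
assert absolute == unpacked  # both select exactly -2 < x < 2
```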
+"] \ No newline at end of file diff --git a/Lucian-Academy/Books/IMMORTALITY/Immortality 30.txt b/Lucian-Academy/Books/IMMORTALITY/Immortality 30.txt new file mode 100644 index 0000000000000000000000000000000000000000..74080933e77938d839ca2549cb4c3da4d556b796 --- /dev/null +++ b/Lucian-Academy/Books/IMMORTALITY/Immortality 30.txt @@ -0,0 +1,113 @@ +["Green, L 2024, Immortality 30, Lucian Academy Press, Melbourne.","Green, L 2024",1,"Immortality 30 + +I transitioned to eternal life. + +1. The immortal recommended meditation. I checked, and I had what I needed. I felt well mentally. I had the prerequisite pedagogy. I was firm in wealth. +2. The immortal listened to guide tracks and wrote algorithms to help write more arguments. I planned what I would do. I calculated the extent of my philosophy. I kept writing. I wrote exercises and new ideas. +3. The immortal wrote an algorithm that found which program finder to use. I replaced CAW and neural networks with program finders. I listed the pre-written algorithms. I wrote and detected options for them. The options enabled feature modules. +4. The immortal was honourable, collecting full breasonings for business. I lived and travelled each day. Travelling was necessary for results. I saw people, breasoned out breasonings and remained healthy. I recognised and got to know people, some of whom were from my time. +5. The immortal remained fully functioning in education. I enrolled in education during my life. I chose business. I worked on my business. It was essential to start. +6. The immortal chose 4*50 As instead of nothingness. I practised single-pointedness in meditation. Single-pointedness meant daily meditation. I replaced my body just before leaving, in a new and old time. I breasoned out 4*50 As for meditation, time travel and age-freezing medicine using BAG (breasoning algorithm generator). +7. The immortal was content and purposeful. I stretched and kept my body supple. I practised yoga and qigong daily.
I went for daily walks. I rested each day. +8. The immortal had excellent conduct, and the work was rigorous. I was fair and understanding. I listened to the comments, feedback and questions. I identified whether the customer needed help completing a task, whether the system needed to be interacted with or whether to get help. I improved the way of helping, the system and the information library. +9. The immortal had enough to keep going. I had As for the origin of consciousness. I wrote on classic ideas or ideas that remained in demand and were written originally. In the meantime, I practised mind reading myself and developing systems. I made a system for group work in computer science and projects original to the academy. +10. The immortal's philosophy was web, MIDI and shells. I developed office software, a sound sequencer and a movie rig. The office software could mind-read input. The sound sequencer could mind-read input. And the movie-making rig could mind-read input. +11. The immortal stated that multiple-window (terminal or web) apps could be converted to a single window. I wrote a multiple-window terminal app. It could be converted to a web app. Each window had its algorithm. They communicated using the hard disk and responded to each other. +12. The immortal mind read and completed qualifications spiritually. I delisted, ranked in priority and bulk-processed necessary tasks. These could be moved to the server or a smartphone. I completed tasks to support students. The main thoughts had always been done. +13. The immortal stayed there with an A. I completed three things, then completed the role-play and helped with moral completion. I knew the answers and felt comfortable emoting (sic) them. There was enough pedagogy from before. I added new ideas to the pedagogy. +14. The immortal wrote the questions and responses. I simulated my academy. 
I had conversations with the chatbot from perspectives about easy, middle-range, and difficult questions about my lecture. There were found-out algorithms and pitfalls. The seen-as version was a setting from a text adventure game, in which skills needed to be collected and puzzles completed. +15. The immortal swapped computer science and philosophy but thought both could be computer science. I predicted conversations while bookselling and helping students with algorithms. I found what hooked potential buyers and had conversations with them about my philosophy. I found common questions and helped programming students with enough thoughts to continue (in seen-as and difficulty-solving ways). In other words, I prayed and worked. +16. The immortal gave the example of Lucian CI/CD, which took the best of the current and previous versions of an algorithm, also suggesting using type and test finders and program finders to complete code. Philosophy helped analyse the thing-in-itself (the noumenon), abstract ideas espousing superior thinking skills and the ability to think generally and inspiringly. I wrote the best algorithms. The students wrote their version and their history and philosophy of science. The history and philosophy of the science of algorithms cut theory to the essence and applied the most intelligent perspectives. + +I transitioned to eternal life by finding 4*50 As. + +17. The immortal stated that 4*50 As were the gateway to results. I wrote many books. I wrote Breasoning Algorithm Generator (BAG). BAG interpreted the As, which I felt confident related to new ideas. I wrote the systems and updated the content. +18. The immortal noticed that meditation smoothed their way and increased it. I performed the tasks on my side to maintain immortality. I studied, stayed supple, took no risks with my health, wrote, replaced my body and meditated. In return, the kind people provided the simulation, medicine and protection in the simulation with meditation. 
I did my part at home, and there were slight differences from my home time, such as meeting new people. +19. The immortal took care of themself in the simulation. I kept a journal of the minor differences I noticed during the simulation. A duck appeared to be all right after almost being attacked. Accidents could be avoided. By staying relaxed, I avoided injury. +20. The immortal espoused the advantages of breasonings and educational advancement. I made friends and talked with people. I went for walks. I saw people and helped them with the simulation. I sold signed copies of my book with an insert with the link to my immortality software. +21. The immortal examined themselves and others in medicine. I emphasised 4*50 anti-ageing medicine, meditation and time travel with high distinctions per dimension per day. I boosted the As for the location I was leaving at the start of the day. I emphasised As to keep both the outer-home self and future self young. Visiting the simulation and indicating As kept the actual self young, but home As kept the self immortal to one's home time. +22. The immortal revisited learning about breasonings. I wrote 80-breasoning As but didn't want to go grey waiting to finish 4*50 As. I studied computer science and philosophy. These majors helped detail the 4*50 As, and creative writing and education courses helped. I wrote BAG, which inspired and met these requirements. +23. The immortal described how arguments and algorithms led to further ones. After studying, I earned jobs at the meditation company and as an actor on the screen. I started a pedagogy club at the school (starting with no attendees). I learned how to indicate breasonings after an education course. I wrote arguments, which gave me the confidence to write more extended arguments. +24. The immortal studied and wrote on departments. I wrote 80, 130 and 250 breasonings for A, A+ and 100% (in impression). I wrote 250 breasonings as a symbol of 4*50 As. 
I controlled and helped my health with 250 medicine quantum box breasonings (stopping unwanted appearances, muscle aches and headaches). Writing 250 breasonings gave me the confidence to write 50 As, which gave me the confidence to write 4*50 As. +25. The immortal stated that inductively finding new algorithms was akin to creativity, their argument for Theology. I wrote the Interpreter and Combination Algorithm Writer (CAW) 50 As after writing Computational English, Pedagogy and Meditation 50 As. It was vital that the breasonings were vegetarian and that the person led a healthy lifestyle. I wrote on the interpreter as part of Popology. In addition, I wrote on CAW in Theology. +26. The immortal found original algorithms and solved problems uniquely. I wrote algorithms from first principles based on writing algorithms from the ground up, and I wrote on induction and new ideas that occurred to me about program finders. My current interest is writing types to test finders and using them to write algorithms. I wrote a depth-first post-order traversal algorithm to prepare for the find dependencies algorithm in Lucian CI/CD and algorithm type analysis. In addition, I converted the descendant algorithm to Prolog (like C), running it on an earlier version of the interpreter. +27. The immortal took notes and gathered the highest quality sources to inspire creativity. Graduating and professional development enabled me to write creatively. I wrote original ideas. I wrote more original ideas, which my algorithms led to. I synthesised my text three times to exhaust the details. +28. The immortal recommended pedagogy helper, recordings and medicine/pedagogy (delegate workloads) arguments. I thought of lateral connections from a book of theories and algorithm descriptions that texts suggested. I took notes but didn't write arguments or long algorithms during my degree. I wrote longer, connected arguments when I graduated (and before, when I wrote distinct algorithms).
I worked out how to write without helpers and methods to write arguments. +29. The immortal generated texts that strongly indicated ideas in music and philosophy details. I wrote on logic even though I studied moral philosophy. I extrapolated and expanded the details I thought of. I developed a specialisation in induction. I wrote content that could be searched for using other algorithms I wrote. +30. The immortal found a seen-as version of philosophies with art and more memorable songs. Songs indicated other songs, characters and philosophical ideas. They indirectly or directly indicated algorithms and arguments. I wrote a song about an algorithm. Or songs evoked unusual settings or stories. +31. The immortal investigated links between lyrics and better songs, but it was arbitrary. I listened to the song. I kept songs with memorable instruments. In addition, I kept songs with memorable melodies. And I kept songs with striking harmonies. +32. The immortal created visual art. I endeavoured to write good songs for all the algorithms I wrote. If one song was not memorable, I wrote another. I associated songs with objects, good parts of other songs, or other mind-read highlights. These highlights included thoughts, listeners and new algorithms. + +I found meditation. + +33. The immortal reordered code with Combination Algorithm Writer (CAW), optimised code and found tests with types. I tested, refactored and used simulated intelligence to write algorithms with Lucian CI/CD (as part of meditation in immortality). I tested the algorithm. In addition, I found different combinations of commands (refactored) in the algorithm. Moreover, I found missing code with simulated intelligence. +34. The immortal removed or completed singleton types. I found tests with types, finding missing types (of new predicates) to write code with tests. If needed types were missing, I wrote new code and linked them to existing or simplified code to avoid needing them. 
I found the types of the commands and predicates bottom-up, except for dynamic predicates (where finding types replaced testing); dynamic predicates had a structure determined earlier, so their types were found top-down from definition to use. I found predictable patterns in types and identified unpredictable types (which relied on non-pattern-matching code). +35. The immortal found combinations of unpredictable code (such as decision trees, search or sort algorithms) with program finders. If needed types were missing, I wrote completely new code. I wrote a new command to bridge gaps in unpredictable code. For example, I used subterm with address to avoid needing a new predicate and found subterms to process with findall and substituted them back. I customised subterm with address to work with different types of brackets and \"|\" (the pipe symbol), and I allowed for labels. +36. The immortal attempted to write a long program in one predicate using advanced commands. If needed types were missing, I linked them to existing code. I checked the existing code for applicability to join the types required. I mixed new and existing code. I modified the existing code to change functionality under different conditions. +37. The immortal simplified code around the missing types or the existing predicates. If needed types were missing, I simplified the code to avoid needing them. I modified the existing code to fit missing types by streamlining it. For example, I simplified subterm with address to a simpler predicate. In addition, I noted subterm with address could be simplified when the data was non-multi-dimensional. +38. The immortal reordered code. I checked for careless errors in types and reordered statements accordingly. I identified the types of the first statement. Then, I did the same for the second statement. If they worked when swapped over, I swapped them. +39.
The immortal produced the code, given the data and conditions about the input. I wrote code with CAW. I listed the code so far. I specified the predicate's input and output. Usually, a chunk of code was of a single kind, so I used a program finder with options to fill in the code. +40. The immortal used a broad definition to, for example, recognise grammars, and used options with this rather than new code each time to reuse, then simplified the code. I optimised the code. I went for a significant critique. I started with the algorithm's input and output. I simplified predictable and unpredictable variables, where predictable variables could be predicted in output from input and unpredictable ones couldn't. +41. The immortal simplified the general code. I removed unused variables in the data. I simplified to pattern types and formulas. These were variables with types in place of strings, numbers or parts. I determined the correlations between the variables and found the minimum necessary to produce the required code. +42. The immortal recognised the creation of new data, for example, in check arguments in the interpreter. I found the correlation between the variables. I used CAW. I narrowed the unpredictable variables to search, sort, or an external algorithm with a program finder. I filled in unknown variables with creative CAW but leant towards program finder. +43. The immortal found errors faster using types, which found tests for Lucian CI/CD, simplifying code. I found tests with types. The interpreter could compute the algorithm's output using the types with formulas of its input and output. I wrote new tests exhausting the main types for the algorithm. The types helped find and fix non-working code. +44. The immortal used types to warn and help the user correct bugs. I tested algorithms with Lucian CI/CD. I could run the algorithm with an interpreter and collect tests. Or, I could find tests from types.
The former allowed testing, while the latter allowed debugging. +45. The immortal programmed Lucian CI/CD to ask the user for clarification about whether a feature was intended and to provide a spec (if the programming language needed it, in which case, it included it as a top-level module). Lucian CI/CD could isolate predicates that didn't meet tests and rewrite them with tests from surrounding types. The philosophy was to fulfil as many initiated features as possible, with type corrections to complete these. Features that didn't lead anywhere or were optional (such as debugging features in a compiler) could be switched off. Lucian CI/CD could simulate file input and output with data to test. +46. The immortal selected options to test the algorithm using mind reading. I refactored the code using Lucian CI/CD. Both algorithm versions tested with Lucian CI/CD satisfied the tests. I found the minimal algorithm from parts of the two versions that met the second set of tests. I could simplify code further using simulated intelligence. +47. The immortal tested programs using bots in the simulation. I used simulated intelligence to write algorithms. I used labels to process lists of lists with subterms and addresses. I determined whether algorithm parts would likely be used using mind reading input. I brought forward plans relevant to a decision using knowledge-time travel. +48. The immortal found the bug fix in the simulation. As time continued, I archived features and simplified software. I simplified the method of reinstalling modules. I designed the software to be compatible with the simulation and space travel. These didn't exist in my home time, but I kept the software efficient and error-free. + +49. The immortal used findall as a central transformer when converting from Prolog to C and expanded list decomposition/building to separate commands. I wrote a grammar finder or modified my programming language converter to include new languages. 
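A grammar of the kind such a converter might target can be sketched as a DCG. This is a hedged sketch recognising one language's variable syntax (Prolog-style: an uppercase letter followed by alphanumerics or underscores); the actual grammar finder covers much more.

```prolog
% Recognise a variable token: an uppercase letter, then csym characters.
variable([C|Cs]) --> [C], { code_type(C, upper) }, ident_rest(Cs).

% Greedily consume the remaining identifier characters.
ident_rest([C|Cs]) --> [C], { code_type(C, csym) }, !, ident_rest(Cs).
ident_rest([]) --> [].
```

For example, phrase(variable(Cs), `Abc`) succeeds, binding Cs to the codes of "Abc".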
I included options to recognise specific languages’ syntax, including variables, lines and functions. The compiler compiler wrote the logical and command back end using program finders and examples of trace. I converted descendant in family.pl from Prolog to (List) Prolog, changing findall to a predicate, to convert to C. +50. The immortal processed the list of lists with subterm with address and converted the result to a string. I used labels to process lists of lists with subterms and addresses. Without labels, finding and operating on multidimensional terms would be more difficult. With labels, building them, for example, converting data to types, was more straightforward. Data was derived from a string, parsed by a grammar and stored as a list of lists. +51. The immortal stored only the necessary variable lists, deleting those out of scope. I recorded the version of the data when I processed it. I checked that the version matched the point in the predicate being run by the interpreter. I could access and backtrack to old variable lists, given if-then and findall results. I could reload variable lists when retrying a predicate on exit or fail. +52. The immortal piloted a feature that wrote algorithms using a mind-reading word processor and tested them. I determined whether algorithm parts would likely be used using mind reading input. I found all the possible options using an algorithm. I used mind reading to rank suggestions. If the user preferred, they could confirm an option using mind reading. +53. The immortal realised that they could mind map their plans. I brought forward plans relevant to a decision using knowledge-time travel. I debugged and wrote features that were respectable to an advanced civilisation in the future. I remembered the “hyphen” detail in State Saving Interpreter to differentiate choice points from other points in the stack. I replaced this with a label. +54. 
The immortal kept correct drafts, drafts without findall, drafts with subterm with address and simplistic drafts with foldr. I found the bug fix in the simulation. The lecturer planted the error and funnelled it to fix the bug. They mind-read the algorithm to help fix it. The student tried several drafts until finding the solution. +55. The immortal deprecated features, simplifying software. As time went by, I archived features and simplified software. I wrote software in English and wrote interfaces in other languages. I wrote in a cognitively clear language like Prolog and converted to a fast language like C. I used types to scaffold the algorithm but removed them for speed. +56. The immortal checked the repository passed unit tests for each feature in the current version. I simplified the method of installing modules. There was a main package source for different versions of Prolog. Users could also download individual components that needed updating. There was a package ratifier that vetted them for quality. +57. The immortal emphasised understanding everything the algorithm did before automating it. The package ratifier allowed the user to test the new version with the user’s current data, giving them the choice of whether to update it. They could first convert their data to the new format. For example, they could automatically replace deprecated commands, import dictionaries and read about the different results of different versions. I wrote about RobotOps, which managed changing and testing robot software, helping them keep abreast of the latest open-source software. +58. The immortal helped debug the software by tracing through algorithms and comparing them with the correct algorithms’ output. I wrote intelligent algorithms for better results that ran machine learning algorithms. I wrote Essay Helper, which painstakingly mind-read grammatical connections and paraphrasing. 
I wrote Cultural Translation Tool, which used an interactive or mind-reading interface to help translate texts, involving the writer in decisions (tailored for spoken or written text). I wrote Mind Reading Algorithm Writer, which only mind-read what the student thought when writing an algorithm, interactively or with mind reading, involving them in debugging and testing it. +59. The immortal helped and remembered people they met. I designed the simulation interface. I made appointments to see certain people in the simulation, made journey arrangements and undertook professional development. It was all in terms of people and places from my time. I was automatically seen as taking required medicines and enrolled in medicine master’s. +60. The immortal travelled across the universe in 6 hours. I designed the space travel interface. It was always a journey between two places. I learned the future names of known stars and inhabited planets that were future-compatible and visited friends. I modelled my plans for my education business. +61. The immortal optimised the code and ran it concurrently on multiple processors. I designed the software to be efficient. The human code was in Prolog and was easy to understand and edit—any changes needed to pass tests. The fast code was in C, translated from Prolog and required to pass tests. +62. The immortal traced the origin of visitors to the site. I built a shopping cart command in State Saving Interpreter Web Service (SSIWS). There were commands to view the products, log in, and edit the products, and these could use the faster Vetusia engine rather than SSIWS. I could automatically update products based on the date and enrollment numbers and send data to lecturers and students. I could test and grammar-check the algorithms. +63. The immortal completed a fraction of the meditations needed for space travel teleportations. I designed the software to be error-free. I checked that it passed the tests from the previous version. 
I also tested the features from the current version. In addition, I generated and tested against tests from types from the whole algorithm, finding and correcting errors in unwritten tests between the two versions. +64. The immortal tracked output, files, errors and warnings. I connected the previous and current version types. I detected that the “No modifications to repositories” notification disappeared in the current version. This error occurred after resaving files without preserving their modification dates. I restored their modification dates to reinstate the notification. + +65. The immortal leant towards findall and away from setof and bagof. I used findall as a central transformer when converting from Prolog to C. I left recursion as it was. I converted setof, bagof, forall and findnsols into findall. For the sake of argument, setof and bagof were converted into findall but had their conditions. The following illustrates how forall converts into findall: + +?- A=[[1],[1]],forall((member(C,A),member(B,C)),B=1). +A = [[1], [1]]. + +?- A=[[1],[1]],findall(B,(member(C,A),member(B,C)),D),length(D,L),findall(E,(member(E,D),E=1),F),length(F,L). +A = [[1], [1]], +D = F, F = [1, 1], +L = 2. + +The following illustrates how findnsols converts into findall: + +?- A=[[1],[1]],findnsols(1,B,(member(C,A),member(B,C)),E),!. +A = [[1], [1]], +E = [1]. + +?- A=[[1],[1]],findall(B,(member(C,A),member(B,C)),D),length(E,1),append(E,_,D). +A = [[1], [1]], +D = [1, 1], +E = [1]. + +66. The immortal possibly reduced lists to single arrays. I converted findall to C. I treated each instance of a nondeterministic call as a nested loop. I ran the commands in findall and any additional needed commands. Lists were converted into array form. +67. The immortal converted separate commands to C. I expanded the list decomposition/building to separate commands. I expanded the list decomposition [A|B] to C=[A|B]. I expanded the list building to [A|B] to append([A], B, D). 
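The expansion of list decomposition and list building into separate commands can be sketched on a hypothetical predicate (double_all is an illustrative name, not from the source):

```prolog
% Original form: decomposition and building in the head.
double_all([], []).
double_all([A|B], [A2|B2]) :-
    A2 is A * 2,
    double_all(B, B2).

% Expanded form: [A|B] decomposition becomes C = [A|B], and
% [A2|B2] building becomes append([A2], B2, D).
double_all_e([], []).
double_all_e(C, D) :-
    C = [A|B],
    A2 is A * 2,
    double_all_e(B, B2),
    append([A2], B2, D).
```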
The predicate head and recursive calls would contain C and D, respectively. +68. The immortal wrote C commands that performed list decomposition and append. I converted equals and append to C. Given equals, I found the head of the list. Then, I found the tail of the list. Given append(A, B, C), I appended A to B, giving C. +69. The immortal converted the interpreter’s commands to C. I wrote a grammar finder. I modified the program finder with types to find disjunctive structures and to work with strings instead of lists. I programmed it to recognise types of brackets. The grammar finder outputted lists and items. +70. The immortal could manipulate, run and convert programming languages. I modified my programming language converter to include new languages. I included C, Python and Rust. I included the syntax, including the comment syntax of these languages. I converted these languages to and from list format to process with the interpreter and Lucian CI/CD. +71. The immortal queried whether the interpreter was fast because of magic (deletion of code). I was on a spree since I decided natural language was the universal programming language. It was like English Language Prolog. All the elements of the functional call were represented or inferred by the sentence(s). The functional call called efficient commands to speed up running the program. +72. The immortal specified any algorithm. The contention in program finder was generousness with the algorithms. The Simulated Intelligence shop contained algorithms such as Program Finder or Cultural Translation Tool, which humans could write, emphasising education. Machine learning produced decision trees, but my algorithms were the product of students’ thinking. I rewrote the code to be faster by identifying the main functions of the code and separating them into fewer predicates. +73. The immortal explained that they taught users to write and maintain operating systems. 
The company made money from users creating operating systems. They were based on programming languages’ interpreters. The company provided software and support for free and charged for education and premium plans, which allowed commercial use. These plans allowed for maintaining the company and growth. +74. The immortal condensed the algorithm to one line, including recursion, multiple predicates and a Prolog server. I optimised Prolog by deleting code. It was part of changing the code. I started at the top, aiming to reduce complexity using Combination Algorithm Writer (CAW) and Program Finder. Program Finder with Types could go further by connecting unpredictable variables using intra-string types. +75. The immortal justified their algorithm with an argument. Program Finder with Types’ types spanned a number or number of characters. They could match “term to atom” or grammars’ output. Grammars needed to be stored with the string types. Detailed data was necessary to find complete grammar. +76. The immortal claimed the interpreter’s types were checked internally, mistakes in queries, data, file and input handling could be corrected, and new algorithm features could be developed. Segments of results (with similar grammars) could be processed differently by algorithms, for example, alpha versus a constant. Grammars were the algorithms to parse strings, and any unpredictable results from their results could be converted to a program with past programs, including supporting choice points in an interpreter. The type finder corrected type errors and promoted more stable types. +77. The immortal wrote innovative examples of computer science projects. The type finder found an outlying algorithm or lack of algorithm features. Common types, such as the programming language grammar, were stored together. Changes to this were followed when changing the algorithm. Either the algorithm was corrected or added to. +78. 
The immortal developed a system that wrote and explained more necessary options. GitL’s structure allowed studying changes given different types, and it turned out features could be added or deprecated. GitL showed each refinement of the code, one at a time. I could find more specific solutions to unpredictable code gaps by narrowing predicted code further using types. I emulated unabridged literature in my code to mind map features. +79. The immortal wrote mathematics, the language of God. I optimised Prolog by modifying the code. Modifying the code went against the grain of transforming the code. I reduced the shape or pattern to a formula. This reduction mirrored divine mathematics in nature. +80. The immortal completed optimisations as part of the compilation process. I wrote unit tests for each feature of Lucian CI/CD. The first test changed the lines of the predicate. The second test kept commands similar to writeln (that don’t have output). The third test avoided testing (instantiated) vars. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/IMMORTALITY/Immortality 31.txt b/Lucian-Academy/Books/IMMORTALITY/Immortality 31.txt new file mode 100644 index 0000000000000000000000000000000000000000..4930871cb9da389018c5d3ad113c9a782f0c1fd4 --- /dev/null +++ b/Lucian-Academy/Books/IMMORTALITY/Immortality 31.txt @@ -0,0 +1,107 @@ +["Green, L 2024, Immortality 31, Lucian Academy Press, Melbourne.","Green, L 2024",1,"Immortality 31 + +1. The immortal ran the Prolog algorithm like a program in C, saving memory. Assuming choice points are contained, a Prolog optimiser can treat dynamic predicates as globals rather than as added arguments. If a sequence of predicates fails, the globals will be intact. The command retractall(predicate(_)) (assuming a :- dynamic(predicate/1) declaration) would initialise the stack for a particular dynamic predicate as a global. The command assertz(predicate(x)) would add x to the predicate as a global. +2. 
The immortal optimised the algorithm to find the item at an address with arrays. I used subterm with address to quickly traverse terms. If a Prolog predicate always jumped to an absolute address (or address relative to an address), I jumped to it with a subterm with an address. I analysed the code to find the address. I got the item at the address, transformed it and substituted it. +3. The immortal updated the algorithms for different processors. I skipped constants if they were already there. I didn’t verify previously verified constants. I converted Prolog to C to run it faster. I familiarised myself with assembly code and optimised code for various machines. +4. The immortal edited the Lucian CI/CD document in “diff” view. I found efficient algorithms in Prolog and assembly language with Combination Algorithm Writer (CAW). I simplified the data. I focused on the first necessary outputs. I counted the length of the string in assembly language. +5. The immortal used C structs instead of arrays for complex data. I used C structs to represent lists of lists. I identified different repeating structures. I counted the number of items per level and number of levels. I stored strings and numbers in one structure. +6. The immortal changed nondeterminism into C loops. I converted findall over multiple predicates to findall in each predicate and then to loops in C. I separated data collection and computation. I ran a child predicate to collect findall results, then ran findall in the current predicate. Separate nested findalls were used to collect and process lists. +7. The immortal alerted the user about type errors such as split_string given a list. I used lazy evaluation to optimise algorithms. I stated that any number raised to the power of 0 equalled one. I identified \"a(A) :- a(A).\" infinite loops and removed them if their output was not needed. I removed infinite loops such as append(_,_,_). +8. The immortal worked out the family respected architecture. 
I gave a spiritual high distinction for null computations. The maximum was 4*50 As. I checked that the entry was in the database. I couldn’t change the code, although I performed comparisons. +9. The immortal deleted true by itself and clauses with false by itself. I found and deleted computations involving lists and strings that were always true or false. For instance, string_concat(“a”, “b”, “ab”). was always true. Also, append([a],[b],[a,c]). was always false. If I found true->a;b, I converted it to a, and if I found false->a;b, I converted it to b. +10. The immortal stated that not((true,true)) and (not(true)->true;not(true)) were both false, not((true,false)) and (not(true)->true;not(false)) were both true and that not((false,false)) and (not(false)->true;not(false)) were both true. If I found a->false;b, I converted it to not(a)->b; false. I could further simplify this to not(a),b. I couldn’t write not(c,d) because this had more than one argument, so I wrote not((c,d)). Also, I changed not((a,b)) to (not(a)->true; not(b)) and the reverse if there were more than two arguments. +11. The immortal caught errors in State Saving Interpreter Web Service and returned to a safe page. I converted the list to numbers. This conversion was a design decision and gave better performance but was less understandable. The code had one version with strings and one with numbers, with labels showing the variable with a particular type. In the end, I ran code for performance. +12. The immortal treated grammars like predicates, for example, with list building, for simplicity. I wrote grammars for parsing strings. These were a good way of parsing strings. Alternatively, I converted the string to a list and got chunks of the list with append. I parsed the list with a predicate if I needed a more straightforward solution. +13. The immortal agreed robots were uniform in coding. 
I used a de-blocker optimisation that checked types, checked code against neuronetworks and that the predicate complexity was under the limit. The ideal code was available in premium code checkers and was ratified by law. Rants, knots and literary curses were against the rules. Students may write longer code as long as it is simplified. +14. The immortal could commit the code in GitL when it had passed tests. In Lucian CI/CD, I tested that the single code line passed. The line was compared with nothing, so it was always checked. I checked that the line met the query test. Lucian CI/CD checked each predicate set one at a time. +15. The immortal avoided “exploit” errors. I tested the recursive predicate in Lucian CI/CD. I considered supporting custom dependencies, files at specified paths, file date data and program error codes in tests. Both clauses of the predicate were kept when it passed the tests. Lines would only be added from the previous version or removed if needed to construct the shortest version that satisfied the test. +16. The immortal tested multiple repositories, some with no main predicates (but which were used by other repositories). I tested different contents of main_file.txt (in each repository) in Lucian CI/CD. I tested combinations of one, two or three predicates per file and one, two or three files. These were the main predicates that called the other predicates in each repository and the files they were in. Later, the algorithm could detect these predicates. + +17. The immortal used Lucian CI to test and CD to build algorithms from combinations of past versions. I tested Lucian CI/CD with one, two or three predicates per file. I assumed it would work with more predicates. I tested the limitation of the predicates available with Prolog to List Prolog. For example, I wrote “:- include”, “findall”, and a custom predicate name. +18. 
The immortal debugged Lucian CI/CD on 15.1.24, integrating test 9 (which found combinations of lines) with tests 1-8 (which tested code), then fixed problems with test 5 (which needed simpler testing), then test 9 again, then test 5. I tested one, two or three files. I included and gave tests for different predicates. I used different files for features, shared predicates or main and subpredicates. I corrected problems by keeping the algorithm simple. +19. The immortal preferred maplist to predicates, but both were good (they could convert maplist to a predicate). I tested three files with three predicates in Lucian CI/CD. I finished Lucian CI/CD to test different algorithms, find combinations of previous code to avoid mistakes and give it as input. These files could include others, but by saying multiple files, I meant new main predicates within each one. These main predicates weren’t necessarily connected with other predicates. +20. The immortal claimed that Combination Algorithm Writer (CAW) is more complex than Lucian CI/CD and could reorder arguments and commands. Lucian CI/CD could find 1+1=2 from 1+0=1 and 0+1=1. It could find string_concat(“a”, “b”, “ab”) from string_concat(“a”, “”, “a”) and string_concat(“”, “b”, “b”). In addition, it could find append([1],[2],[1,2]) from append([1],[],[1]) and append([],[2],[2]). If a main_file.txt refers to no files (with predicates), it must contain []. +21. The immortal planned to bug-test multiple Lucian CI/CD clauses. I used the Package Manager’s registry file of dependencies of repositories in Lucian CI/CD. I entered more and removed some variables from Lucian CI/CD’s unit tester. I made Lucian CI/CD as simple as possible and allowed changing the maximum number of items to find and test combinations. I merged CI (testing) and CD (combination finding), although they were easier to test separately. +22. The immortal temporarily represented dependencies as relation lists. 
I reused dependencies in Lucian CI/CD tests, which I noted in the readme file. I created original Lucian CI/CD test dependencies, including terminals and circular references, which the List Prolog Package Manager (LPPM) resolved. I planned to resolve circular and duplicated “:-include” references in Prolog. I temporarily used new dependencies in Lucian CI/CD. +23. The immortal tested the predicates bottom-up, no matter what order they were in the files, and used “:-include” statements as needed. I tested multiple files in Lucian CI/CD. I merged the files with delimiters, separated comments (using only those in the new version) and code and tested combinations within predicate dependencies. I used the same algorithm for checking and changing code and more easily fetched and used tests. To simplify the display, I eliminated duplicate loaded files in Lucian CI/CD. +24. The immortal checked that dependencies and “:-include” statements matched. I tested the non-mentioned files in Lucian CI/CD. These files were connected with “:-include” statements but were not mentioned as main files with main predicates to test. I tested the files’ predicates from the main repository. I fixed an error in the tests, so the other repository’s predicates were tested from them, and the main_file.txt referred to them. Repositories were constantly tested separately, not relying on “:-include” statements. +25. The immortal tested, changed, and bulk uploaded multiple repositories. I tested various repositories in Lucian CI/CD. I made repositorilets (sic) for files that two repositories needed when one repository depended on the other, and it was better not to store them in the upper repository. I shuffled files in and out of these repositorilets, used by the same repositories. A third repository could access them if they were small enough, and they were turned into a hierarchy of repositories if necessary. +26. The immortal notified the user about and helped correct circular references. 
I tested circular references (repositories that were loaded twice). LPPM detected and removed circular references on loading. Lucian CI/CD detected and aborted on circular references using “:-include”. I created neuronetwork material using CAW. +27. The immortal tested loops within loops in Lucian CI/CD. I tested loops in Lucian CI/CD. Loops included all the predicates in between. Looped predicates were tested together, but dynamic predicate arcs were also tested. This reduced options to find combinations within the predicates but was necessary because a recursive predicate worked as a unit, and globals couldn’t easily be given as input in the middle of a dynamic arc. Hence, they needed to be tested together. +28. The immortal tested multiple predicates in dependency levels. I tested loops across files and repositories in Lucian CI/CD. The code was continuous, as it was in the same code group. I could test it with a single query. I tested lower code than the loop in the other repository with other main_file.txt entries. +29. The immortal claimed that a particular modified clause would retain its position in the future. I tested multiple-clause predicates across multiple files and repositories in Lucian CI/CD. The order of predicates was retained from the original. They were in the correct files and repositories. Their location was moved to the first repository and file the predicate was in. +30. The immortal reassured the person that Lucian CI/CD functioned, given any input configuration. I tested the predicates in reverse order of the predicates in main_file.txt and bottom-up order. I tested each of these three orders when they were out of order compared with each other for correctness. Different files complicate this order. In addition, it is complicated by multiple repositories, “:-include” dependencies and other dependencies. +31. The immortal stored deleted code in a safe place. I wrote a single command to move Lucian CI/CD output to the GitL folder. 
This command copied the output to the GitL test folder and committed it. I could see the changes to the repository concerning time. I could publish selected commits to the public when they were stable. +32. The immortal custom-built Lucian CI/CD to help simplify code, ready for optimisation. When testing, I saved and restored Github_lc/* Lucian CI/CD status files. These files recorded the status of testing from current repositories. They were temporarily moved away during testing, which created new configuration files. I could optionally store and inspect these temporary files for debugging. + +33. The immortal developed in Lucian CI/CD and presented in GitL. I displayed the “diff” between the start and end in Lucian CI/CD in Prolog. I saved this diff to the log. I saved the files if necessary. GitL saved the diff between the two commits. +34. The immortal entered numerous equivalent logical structures. I included options to recognise specific languages’ syntax in Lucian CI/CD. I converted between the language and list format and back. I entered examples of the language to create the converters. The creator algorithm saved different ways of representing the same logical structures. +35. The immortal stated that A=a,B=[[v,c],d],E=[[n,A],B] gave the result E=[[n,a],[[v,c],d]]. I recognised the other language’s variable format. I found the variable name. I found its syntax, including symbols and case. I devised a new predicate name as a variable name in Prolog to replace “univ”. +36. The immortal observed the structure of commands containing several lines. I recognised the other language’s lines. I found the command names, variables and their associated syntax together. I found the line separator or delimiter. I observed the syntax of the final line in a function. +37. The immortal wrote a language with minimal syntax. I recognised the other language’s functions. I found the line’s contents. Between them, I found the function header’s contents. 
Differentiating between code and comments, I delineated the function format. +38. The immortal gave an error on an undefined variable name as a predicate name. The compiler compiler wrote the logical back end using program finders. I did if-then//2 and //3 and not//1 using predicate calls. I parsed the algorithm file using the list of predicate forms. It was not too tricky to use variable names as predicate names. +39. The immortal claimed that quantum mind reader gave 14-16% more accurate results than random. The compiler compiler wrote the command back end using program finders. Program finders recognised the need for a particular program and customised it for a purpose. For example, the command string_concat took two strings and concatenated them together. A not-necessarily quantum algorithm ran nondeterministic predicates at the same time. +40. The immortal generated simpler algorithms with a quantum mind reader as a substitute. The compiler compiler wrote the back end using examples of trace. I recorded the results of commands, algorithm steps, and evidence of backtracking. I wrote the rules of the backtracker. I used a quantum mind reader in cases when I didn’t know the answer. +41. The immortal explained that “big idea” algorithms housed small idea algorithms and that new ideas were sometimes intelligent. I converted descendant in family.pl from Prolog to (List) Prolog. In the first sense, I directly converted the long form of descendant from Prolog to List Prolog. In the second sense, I converted the short form of the descendant algorithm to its extended form, or vice-versa. This conversion was an example of a “big idea” algorithm that was possible on the non-interpreter side. +42. The immortal passed the list to find a member of the second time to the second clause in the predicate converted from findall. I changed findall to a predicate. 
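The findall-to-predicate conversion can be sketched on a small example with illustrative names (the actual converter also handles choice points, dynamic predicates and multithreading):

```prolog
% findall form: collect the square of each member of the list.
squares(L, S) :-
    findall(Y, (member(X, L), Y is X * X), S).

% Converted form: the same collection as an explicit recursive predicate.
squares_p([], []).
squares_p([X|Xs], [Y|Ys]) :-
    Y is X * X,
    squares_p(Xs, Ys).
```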
In this, choice points were separated into data collection and processing by loop level and unique choice points were converted to clauses. I read and manipulated the findall code, and the algorithm created the long-form code. The only problems were dynamic predicates (solved using globals) and multithreading given backtracking (solved using a different intermediate predicate). +43. The immortal converted a cut represented in long form Prolog to short form Prolog. I converted Prolog to C. I noted that cut took the first result of the predicate and separated rules before the last cut into different predicates so that they could each be pruned. A cut was converted into a predicate by stopping after finding a result rather than returning all results. Complications such as finding n solutions and simplifying code that explicitly found n solutions were also possible. +44. The immortal processed the output of subterm with address within subterm with address. Earlier, I processed the list of lists with subterm with address. I stored certain types of items at a level. I stored further types of items at particular positions. I integrated the processing code with the subterm with the address algorithm, only using the label matching and list output parts. +45. The immortal processed data using Prolog terms for simplicity. I converted the result of subterm with address to a string. I converted the hierarchy nodes to strings and traversed and concatenated them. I converted the string to a hierarchy with a grammar used to find the algorithm that converted the term to a string. The term form contained [“()”,[“a”]] for example, and the string it represented was “(a)”. +46. The immortal processed similar data using the same predicate but different flags. Before this, I used labels to process lists of lists with subterms and addresses. I found the labels using subterm with address. I processed the requisite data accordingly. 
This was useful if data with another label needed to be processed differently for each result. +47. The immortal stored scavenger hunt items throughout the computer game about the universe through time. I stated that finding and operating on multidimensional terms would be more difficult without labels. I found items with the label ‘a’ in: +[ +[a,b], +[ + [a,c] +] +], returning b and c at specific addresses. The labels could contain metadata such as variable ID and algorithm ID and could link to previous data. The variable ID referred to the origin of the data, with an example or description, and the algorithm ID referred to the algorithm that a flag helped express. +48. After the following, I found the possible types for each line and put them together bottom-up. I stated that with labels, building lists, for example, converting data to types, was more straightforward. I converted the data to long-form types without variables. I inspected different data sets and inserted variables for recurring types. I found and converted the labels for types into short-form types with subterm with address. + +49. The immortal stated that multithreading is easier with C code. I stated that finding types replaced testing. Instead of writing tests, I found whether the algorithm types agreed by inspecting the code. I found types for dynamic predicates top-down. To optimise dynamic predicates, I predicted higher-order functions from the code and converted them to C. +50. The immortal only used globals sparingly because they were harder to test. I stated that dynamic predicates had a structure determined earlier. Dynamic predicates were declared at the start or in code (found in compilation) and asserted in code. As part of optimisation, possible structures asserted meant that dynamic predicates were asserted with lists from other variables.
While other choice points, including nondeterministic clauses, were loops, dynamic predicates were left as globals (where they were discouraged and converted to local variables where possible, and inter-predicate globals were converted to passed variables to remove possible bugs). +51. The immortal stated that types were found top-down from dynamic predicate definition to use. The local variables, converted from globals and converted from dynamic predicates, were labelled in the comments. The point of execution passed these local variables around long-form code so that they were available in current form at any moment. These converted variables were more accessible than variables trapped in findall, which needed wrapping in a function, with failure cases needing phi nodes (if-then cases) to choose whether to return variables (such as free variables in bagof). +52. The immortal claimed everything was predictable with enough specs and regression. I found predictable patterns in types. It was more efficient to use inductive code specialised to find predictable patterns. I identified non-simple code that found predictable patterns and simplified it to pattern matching. I labelled predictable code so as not to become distracted by it when optimising non-predictable code. +53. The immortal thought of a golden trophy, and it existed. Beyond Prolog required regression. After finding the code, I didn’t reaccess the code. It was not necessary to think in code. My friend James could think of anything in mathematics, and it was fun. +54. The immortal claimed sub-term with address presented information at one’s fingertips. I identified unpredictable types. These were not predictable changes in patterns, such as D1=[A, B, C], D2=[C, B, A]. One needed to ensure that seemingly predictable patterns were not unpredictable, requiring additional conditions.
I distinguished between errors unpredictably leading to correct results and unusual, correct optimisations (where another improvement might be), thinking like sub-term with address and converting to a long-form predicate to run more quickly. +55. The immortal checked whether the unpredictable code was in the database; otherwise, it was added. I wrote that unpredictable types relied on non-pattern-matching code. For example, non-pattern-matching code, such as searching, sorting or other transformations, was more challenging to find. In some cases, unpredictable code could be refactored using techniques to remove unnecessary features. I researched whether binary programs were faster than C. +56. The immortal combined the grammars. I found combinations of unpredictable code (such as decision trees, search or sort algorithms) with program finders. I wrote grammar checkers, translators, grammars, programming language converters, or combinations. I checked against the grammar, converted between grammars using parts of speech, used a grammar and term parser, checked a translation, checked a programming language or translated the programming language. The term parser could parse lists with different delimiters. +57. The immortal modified shorter versions of code. If needed types were missing, I wrote new code. I wrote the input and output types. If their patterns didn’t map, I treated all input variables as inputs into the regression and the outputs as outputs. Alternatively, I tried single commands from the database diff combinations if merging two interpretations or versions or shorter versions of code. +58. The immortal created two variables from a variable to process them separately. I wrote a new command to connect gaps in unpredictable code. I skipped over the easy ones and wrote the hard ones. It took concentration to work out answers. There was a more intelligent connection, a new variable. +59. The immortal represented terms as tables in C.
I used subterm with address instead of another predicate. Instead of a predicate that recursively searched levels of a term and processed its results in different clauses, a subterm with address found occurrences of a subterm throughout the term. Subterm with address was a short-form algorithm, while the other predicate was the long-form algorithm, where subterm with address represented how one thought by “collecting” results using a single method and the long-form predicate joined search to processing, risking missing brackets. Processing routines from subterm with the address were substituted into the predicate for faster running in C. +60. The immortal let subterm with address search for terms and give addresses so they could substitute the transformed terms back. I found subterms to process with findall. For example, I labelled term parts to find later, +?- sub_term_wa([x,_], +[ + [x,1], + [ + [x,2], + [ + [ + [x,3] + ] + ] + ] +], A). +A = [[[1, 1], [x, 1]], [[1, 2, 1], [x, 2]], [[1, 2, 2, 1, 1], [x, 3]]]. +If given 1, 1’s address would be 1. So, [1,1] is 1 in [[x,1]], and so on. +61. The immortal used a predicate if a term depended on one before it. I transformed data collected with subterm with address. I collected data in the form {[Address, Term]}, where {} meant a repeating list. I found all the pairs, took the term’s second argument, added one, and returned the pairs. If I had needed to renumber the second argument, I would have used numbers//4 or a predicate to do this. +62. The immortal combined or divided addresses. I substituted subterms with addresses back. I pretty-printed the term with subterm with address. I could represent arrays in this manner. I could represent arrays with addresses, saving memory and enabling transformation and compression. +63. The immortal ran sub_term_wa(1,{{1}},A), with the result A = [[[1, 1, 1], 1]]. I customised subterm with address to work with different types of brackets. For example A={B|C}, A=(B|C) or A=. 
In some cases, A=/A|B\\ was allowed. In addition, [A|B] in the predicate header or call could be replaced with this terminology. +64. The immortal wrote the Prolog interpreter in Prolog. I customised subterm with address to work with “|” (the pipe symbol). I changed “|” to &(|), which was reserved. I processed the symbol like any other with list processing. I wrote list processing predicates. + +65. The immortal compiled the algorithms for different processors. I updated the algorithms for different processors. C was compiled for a particular processor. I converted to C. This converter was not writing the compiler in C but converting the Prolog code to C. Prolog was a good brand and easy to think of. +66. The immortal deleted redundant constants both collected and processed. I skipped constants if they were already there. I found cases of constants always being matched and ignored them. If hard-coded data always matched and was unnecessary, I deleted it. Data was tested, for example, after collecting and processing it, in findall. +67. The immortal deleted a first constant comparison given specifications. I didn’t verify previously verified constants. If a constant was compared with, I removed the following unnecessary comparisons of that constant. Further comparisons should only be deleted if all possible first comparisons were found. If a first comparison repeatedly seemed excessive, the algorithm notified the programmer. +68. The immortal argued that the imperative procedural programming language C was faster than the logical programming language Prolog. I converted Prolog to C to run it faster. The Prolog interpreter only gave the first correct answer. Findall, used to collect all correct answers, was divided into data collection and processing parts. 
Functional decomposition into a collection part avoided the need for logical programming, and functional decomposition into a processing part avoided the need for logical programming because possible failure within findall was replaced with if-then. +69. The immortal designed software that optimised programs before running them, especially those labelled not yet optimised. I familiarised myself with assembly code and optimised code for various machines. The multiple architectures had different register state machines. As an aside, I devised a reverse engineering algorithm that uncompiled programs from an assembly language into C and recompiled them into another assembly language. In the future, a chip with a reduced instruction set performed some calculations more quickly, while another chip had more features, and these chips were used in different cases. +70. The immortal compiled Prolog for smartphones, visors and other operating systems and ran it online. I optimised assembly code for various machines. I uncompiled it to Prolog. I removed unneeded code at the algorithm, predicate, and variable levels and reused libraries used by other algorithms. When compiling Prolog, only needed predicate dependencies were used, and all data was compressed into reused, program-encoded, binary form. +71. The immortal checked the human-crewed space vehicle and met the software health standard. I deserved the spiritual A for the data deleted during optimisation. This technique was commonly used for spiritual meditation, time travel and medicine while teleporting across the unit (universe). It indicated the data spiritually, and these ideas were switched on. Space travellers completed longer forms of these ideas daily in each precise teleportation location and relaxed the whole time. +72. The immortal warned the user whether each change had passed the tests. I edited the Lucian CI/CD document in a “diff” view. I read the changes to the files on one page. 
I clicked on a document, edited it, and viewed the changes. In addition, I instantaneously ran a pipeline to check the changes, with non-permanent pipelines to warn whether changes on the way worked or not. +73. The immortal found efficient algorithms in assembly language with Combination Algorithm Writer (CAW). I wrote the C compiler, with libraries for each command. It was faster because it had fewer commands, such as arrays, not structs. Structs were converted into arrays. Programmers could see the C code without structs. Contrary to this, I converted Prolog code to assembly, which didn’t necessarily require CAW. However, CAW could find code for different number lengths. +74. The immortal simplified logical, mathematical, and database (set-theoretical) formulas. I found efficient algorithms in Prolog with Combination Algorithm Writer (CAW). CAW could find simple Prolog code, which could be converted to C and assembly language. Arrays were divided into smaller arrays and converted into assembly language. I reverted to mathematics to speed up Prolog data processing, for example, SLD processing to make non-recursive functions inline. +75. The immortal critically transformed the program, removing buggy or unnecessary features. I simplified the data. I recognised the program within the data. I simplified it, notifying the user of bugs removed. There might be programs within the program’s data, and so on. Programs should all be top-level. +76. If possible, the immortal converted Prolog called through the shell into inline code and compiled it as a single unit. I focused on the first necessary outputs. I took out writeln and other commands with no verifiable output in Lucian CI/CD and put them in later, notifying the user of this. Writeln, etc., were additional features activated with optional modules. As an aside, I kept track of the identification numbers of transformed clauses in Lucian CI/CD to preserve their order in the algorithm. +77. 
The immortal found the optimal array length to wrap long strings into new array lines. I counted the size of the string in assembly language. This length was stored in a variable describing the data. It was in numeric form (it was possible to read and run code in one form and convert it to the other). Once the Prolog with the code handling the string length was compiled into C, it could be converted into assembly language. +78. The immortal changed numbers//4 in Prolog to an iterative loop in C. I used C structs instead of arrays for complex data. I represented the short-form C algorithm with structs and the long-form C algorithm with arrays. I converted between them. I eliminated unnecessary elements of structs or made optimisations. I invented a form of C where the first element of the array was numbered 1, not 0. +79. The immortal stored hierarchies as list and item numbers, where the list numbers pointed to further lists of items. I used C structs to represent lists of lists. I converted the struct to an array. I used numerical pointers as nodes in the hierarchical struct. I hashed (reused) data. +80. The immortal converted the append into long-form Prolog and a C program. I identified different repeating structures. I found repeating lists and items. These items may be Prolog variables, not dealt with as C variables, but decomposed or built outside the Prolog predicate header. They may be in the short-form Prolog header but not the long-form Prolog header. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/IMMORTALITY/Immortality 32.txt b/Lucian-Academy/Books/IMMORTALITY/Immortality 32.txt new file mode 100644 index 0000000000000000000000000000000000000000..7dab2c924544255c436f51a308fc494946e4bdb5 --- /dev/null +++ b/Lucian-Academy/Books/IMMORTALITY/Immortality 32.txt @@ -0,0 +1,138 @@ +["Green, L 2024, Immortality 32, Lucian Academy Press, Melbourne.","Green, L 2024",1,"Immortality 32 + +1. 
The immortal made a 3D array when the same structure repeated in a list and another array for terminals. I counted the number of items per level when compiling the list. I counted the maximum length of the lists. Or, I decided on an arbitrary value and wrapped the lists. In addition, I treated strings like lists of characters. +2. The immortal reused array structures whenever possible. I counted the number of levels when compiling the list. The number of levels contributed to the height of the array. I accounted for wrapping lists, which increased the height. The list items also contributed to the array height. +3. The immortal compressed strings where a short alphabet was used. I stored strings in one structure when compiling lists. The strings were split into character codes. They were compressed. Recurring substrings were stored once. +4. The immortal compiled a list in array form to parse it in C. I stored numbers in one structure when compiling lists. Numbers occupied a single cell, so the number array was narrow. I compressed numbers up to a limit. If numbers were over this value, they occupied two cells. +5. The immortal changed nondeterminism into C loops. There was nothing joined together or unpredictable. LLVM just deleted unused sequences of commands. I rewrote member, append, calls to predicates with multiple clauses, findall and similar predicates as loops in C. I included the whole program in one large function and renamed variables with the same name as those in the function. +6. The immortal alternatively used a counter. I converted findall over multiple predicates to findall in each predicate and then to loops in C. For example, the following predicate a1//1 finds the sum of pairs of values, and a2//1 finds these values and their sum the long way. The a2 predicate separates the collection and processing functions (which is more straightforward if collection involves multiple predicates) and is easier to convert to C. + +% a1(D). 
+% D = [6, 7, 7, 8]. + +a1(D):- +findall(C,(c1([A,B]),C is A+B),D). + +c1([C2,C3]) :- + c3(C2),c3(C3). +c3(3). +c3(4). + +% The long form of the above is: +% a2(D). +% D = [6, 7, 7, 8]. + +:- include('../listprologinterpreter/listprolog.pl'). +a2(C):-c4(A),c7(A,C). + +c4(C) :- A=[3,4],length(A,AL), +c5(A,1,AL,[],C). +c5(A,N,AL,B1,B2) :- + (N is AL+1->B1=B2; + (get_item_n(A,N,It), + c6(It,A,1,AL,B1,B3), + N1 is N+1, + c5(A,N1,AL,B3,B2))). + +c6(It,A,N,AL,C1,C2) :- + (N is AL+1->C1=C2; + (get_item_n(A,N,It2), + append(C1,[[It,It2]],C3), + N1 is N+1, + c6(It,A,N1,AL,C3,C2))). + +c7(A,C) :- length(A,AL), +c71(A,1,AL,[],C). +c71(A,N,AL,B1,B2) :- + (N is AL+1->B1=B2; + (get_item_n(A,N,[It1,It2]), + It3 is It1+It2, + append(B1,[It3],B3), + N1 is N+1, + c71(A,N1,AL,B3,B2))). + +% or a2([6, 7, 7, 8]). +7. The immortal separated data collection and computation. NASA recommended functional decomposition. I simplified data collection to the data. Instead, I collected data from rules, files and input, possibly with prompts. Then, I type-checked this data and processed it as a value, string, or list of lists. +8. The immortal eliminated choice points in processing findall results by identifying whether all or the first result are required. I ran a child predicate to collect findall results. If processing was spread throughout levels of predicates, I split collection and processing at each level with a set of predicates. The algorithm collected data as simply as possible, without a findall command collecting and processing data from levels above. For example, the descendant predicate eliminated choice points by collecting data first and then processing this data. +9. The immortal merged predicates if their data took too long to decompose and build separately. To process findall results, I ran findall in the current predicate. I converted findall to a predicate, then to C code. 
It was better to separate data collection and processing because, for example, bulk translating Cultural Translation Tool sentences saved API quota. In addition, the grammar simplifier simplified the grammar in the interpreter (by grouping and simplifying) in preprocessing before processing it. +10. The immortal functionally decomposed and simplified (for C) predicate calls and header unification. Before this, I decided that separate nested findalls would be used to collect and process lists. Additional preprocessing was done separately, in multiple stages, as necessary. Dependencies of collection, preprocessing and processing were created, and stages were merged if results were needed before the end of operating on an item. If stage merging was necessary, it was clearly labelled as inseparable. +11. The immortal automatically type-checked predicates to help the user find bugs. I alerted the user about type errors such as split_string given a list. I type-checked predicates that took the same types, which were sometimes reserved. For instance, I checked that a list was [] or [_|_]. A list of lists was [] or a list of these lists. +12. The immortal used lazy evaluation to optimise algorithms. Values were only computed when needed. The results of infinite loops were often deleted. This redundant code could be deleted if necessary. Absolute references (i.e., only taking the first item) or relative references (such as taking data that fit a heuristic) were analysed to check the code’s necessity. +13. The immortal replaced references and computations with data if it led to a speed-up. I stated that any number raised to the power of 0 equalled one. For example, 1^0=1, 2^0=1 and 3^0=1. In addition, (-1)^0=1, (-2)^0=1 and (-3)^0=1. I simplified redundant code. +14. The immortal didn’t recapitulate finding solutions once they had found the first one. I identified “a(A) :- a(A).”, or code that called itself with no changes to its variables, which led to infinite loops.
In addition, I identified and deleted code that called itself after building a list. The interpreter stopped checking clauses when it found a result or checked all the clauses with findall. The interpreter needed a cut to stop going past the base case. +15. The immortal waited for the program to finish, in case it wasn’t in an infinite loop. I identified the infinite loop. I changed the loop condition inside each iteration. I never reset the loop continuation variable inside the loop. It was necessary to solve the halting problem in an applied way when writing the code. +16. The immortal removed the infinite loop if its output was unnecessary. Prolog returned multiple variables per predicate. Results only needed to be checked (and lists processed) when required. If a variable or list item was skipped, it didn’t need to be processed. Values were needed when printed or compared. + +17. The immortal caught infinite loops. I removed infinite loops such as append(_,_,_). I listed possible infinite loops such as member(_,_). Unless string_concat(_,_,_) treated strings as lists, it produced an error. I modified them, inserting variables or removing them. +18. The immortal explained the algorithmic function to disabled students. I worked out the student’s family respected architecture. The data structure had an architectural seen-as version. The algorithm had structure and function. The function was welcoming. +19. The immortal celebrated nothingness, instead, an advanced algorithm. I gave a spiritual high distinction for so-called null computations. Text-to-Breasonings and mind-reading falsely appeared as null computations. By providing the high distinction using a spiritual system, I supported conclusions based on the software. The algorithm should compare items at the minimum, and the high distinction increases the comparison. +20. The immortal compressed algorithms using hash tables and mathematics. 
The maximum high distinction for a so-called null computation was 4*50 As. The high distinctions ranged from 80 breasonings to 4*50 As, where an A had 80 breasonings. The high distinctions were the output from Grammar-Logic, Breasoning Algorithm Generator (BAG) or Program Finder with Types (including Strings or PFTS). PFTS found unpredictable algorithms inside predictable ones, linking to variables and finding more relevant commands, possibly changing to multiple predicates or previous techniques with the Combination Algorithm Writer (CAW). +21. The immortal found internally and externally trending sentences and related them together. I checked that the entry was in the database. When finding combinations of two or more sentences when exhausting combinations in philosophy, I found only untaken combinations unless there was a development. Using the new philosophy, I found an algorithm related to the topic in the Text-to-Breasonings file. I chose a value and related the list with this length to itself in n^2 items. +22. The immortal replicated professorisms with simulated intelligence. I couldn’t change the code in the null computation algorithm. Instead, I simplified the code and retained the philosophical essence to remember. I found the simple nature of the algorithm. I applied it to the other major works, consolidating two-directional pathways between each pair of items. +23. The immortal aimed to program the simulation. I performed comparisons in the null computation. The most straightforward computation used maplist or another built-in Prolog predicate with a simple predicate. This simple predicate verified or built a list from data. The advanced algorithm was as efficient, inexpensive or quantum leap-like as nothing. +24. The immortal used neural compression to write an interpreter. I deleted true by itself. 
I rewrote catch, if-then//2 and //3 without true, wrote a command that ran Prolog in a container and made a programming construct that enabled parsing commands for higher-order programming. One could read the commands from a text file, convert them into List Prolog and run them with an interpreter. I experimented with converting from a programming language to C instead of using an interpreter for speed and rewrote C and assembly. I searched for a modern assembly interpreter. +25. The immortal inverted negative results into positive ones for shorter programs and easier verification (I compressed “nots” into shorter statements). I deleted clauses with false by itself. As with true, I rewrote commands that relied on false. I converted a predicate with a false base case to have a cut. I converted a predicate with false to long-form by using if-then. +26. The immortal diverged the stage and verification bay. I used a neuronetwork to develop a CI/CD tool with many commands in a specific programming language. I worked out the set of indicated programs to convert the high-order algorithm and wrote them in C (it was like types). For example, I created an array with a certain length, such as the arity of a function. I predicted whether it would call a built-in, dynamic or other predicate from the code and converted the code in a custom language to C. +27. The immortal saved their life by teleporting between universes in the simulation. I found and deleted computations involving lists and strings that were always true or false. I tracked sets of systematic transformations and detransformations and deleted any unnecessary code. In addition, I used specifications to compute simpler code, possibly without other transformations (deleting unnecessary variables). I revamped algorithms to be more efficient by converting to assembly language, including neuronetworks. +28. 
The immortal converted higher-order functions to trivial functions by demarcating their function name, arguments and arity and converting them to long form. I converted trivial to complex predicates to always be true. For example, I found that whether hard-coded or data given by specifications always meant that C was A+B. I found whether predictable data always had the same result in a computation. I found two points when a computation was true. +29. The immortal simplified unification to the Herbrand Universe. I wrote that append([a],[b],[a,c]). was always false. I replaced always false statements with false. I predicted falsity in custom programming languages and higher-order functions and deleted them. I grouped and simplified patterns of falsity. +30. The immortal used Prolog to convert all programming languages. I integrated the C compiler code into Prolog. Many predicates were in C. It allowed assembly code to be written. I wrote a program that optimised each test case. +31. The immortal simulated assembly language’s interaction with the circuit. If I found true->a;b, I converted it to a, and if I found false->a;b, I converted it to b. If nested statements were revealed, I repeated this process as far as possible. The cross-language effectiveness was new, broached the frontiers and was approachable. I was confident in writing compilers, assembly interpreters and new technologies. +32. The immortal made everything into a game, such as making connections in philosophy. The exercise was to reduce the code’s complexity as much as possible before converting it to assembly by functionally decomposing it, uniformising it (reusing code), refactoring it given specifications, using simpler code and converting it to long form (in C). Connecting the simpler code to the other parts required our data, but education was more ethical. Thinking and thereness were the name of the game, while developedness (individuality) was the overall aim. + +33. 
The immortal claimed that “a(2).” in “a(1). a(2).” might be dropped given a test, but the first instance of “b(1).” in “b(1), b(1).” would be dropped because it is a duplicate. I stated that not((true, true)) was false. This statement was separate from “b” in Lucian CI/CD. For example, the tests would fail if “b” was false. If not((true, true)) could be returned to false from a previous version, it would be. +34. The immortal stated that (not(true)->true;not(true)) was false. This statement was because not(true) evaluated to false, failing the condition and selecting the second consequent, not(true), which is false. This construct, “a->b;c”, is represented by the clauses “d:-a,b,!. d:-c.”. The first construct would be preferred in C. The second, preferred as an intermediate check by Prolog, can be checked by Lucian CI/CD, deleting either, both or neither clause (or their commands). +35. The immortal stated that any top-level false command in a clause falsified the clause. I stated that not((true, false)) was true. Separately, in “d:-a,b,!.”, “a” could only be omitted if it was included at the start of “b”. I wrote that not((true, false)) was the same as not(true) v not(false), by De Morgan’s Laws. In “d:-a,b,!. d:-c.” the first clause may be false, but the second may be true. +36. The immortal used simpler antecedents. I stated that (not(true)->true; not(false)) was true. I predicted the result of systems of if-then clauses. I removed if-then clauses with duplicate results. For example, “a->b;c” was equivalent to “not(a)->c;b”. +37. The immortal reduced cuts, preparing for C. I stated that not((false, false)) was true. Separately, in not((a,b)), if b=not(a), then it would be not((true,false)) or not((false,true)), both = true. The equivalent of not((false,false)) was not(false) v not(false), such as \"a:-true. a:-true.\". Prolog normally returned both of these, but in my implementation, the first correct one was returned unless findall returned both. +38. 
The immortal claimed that 1^(1v(0^1))=1. Separately, I stated that (not(false)->true; not(false)) was true. Separately, the if-then statement has the following logic table: +a->b;c +a b c +t co _ +f _ co +In this, b and c may be considered (co) but may be true or false. The statement may be represented as a clause or predicate of clauses, where always true or false relations may collapse in nested if-then. +39. The immortal rewrote a->b;c as \"e:-a,b,!. e:-c.\" and a->b;d as \"e:-a,b,!. e:-d.\", where these could be merged to \"e:-a,b,!. e:-c. e:-d.\", meaning a->b;(c;d) or a->b;(c->true;d). (Separately, if I found a->false;b, I converted it to not(a)->b; false.) In a->b;c and a->b;d, I could write a->b;e, e:-c and e:-d. In a:-b;c and a:-d;c, I could write a->e;c, e:-b and e:-d. I could leave these clauses as they were in a:-b;c and d:-b;c. +40. The immortal took care to include the second consequent of the phi statement (A=[B]->true; A=B) in case the if-then statement failed. (Separately, I could further simplify not(a)->b; false to not(a),b.) In Prolog, a;b was written as a->true;b because b in (a;b) was sometimes skipped by the interpreter, but b in a->true;b was always run if a was false. I could temporarily replace a with true or false as a debugging switch. I could write a “phi” statement (A=[]->B=[a,[]]; B=[a, A]), which adjusted variables according to a condition mid-program. +41. The immortal wrote predicates in C that could perform custom tasks. I couldn’t write not(c,d) because this had more than one argument, so I wrote not((c,d)). This rule was similar to time//1, which timed a statement, or to findall//3 or forall//2. For example, maplist(atom, [a,b,c]) was true, maplist(append([a]),[[b],[c]],R) led to R = [[a, b], [a, c]], maplist(string_concat,[\"a\",\"b\"],[\"c\",\"d\"],R) yielded R = [\"ac\", \"bd\"], foldr(atom_concat,[a,b],'',R) gave R = ab and foldl(atom_concat,[a,b],'',R) resulted in R = ba.
I used a custom predicate to keep count while I combined two lists.
+42. The immortal stated that not((a;(b;c))) = \"d:-not(e). e:-a. e:-b. e:-c.\". I changed not((a,b)) to (not(a)->true; not(b)). This followed from not((a,b)) = not(a);not(b) = (not(a)->true; not(b)). Conversely, not((a;b)) = not(a),not(b). So, not((a;(b;c))) = not((a->true;(b->true;c))) = not(a),not(b),not(c).
+43. The immortal changed the Prolog code “d:-not(e). e:-a. e:-b. e:-c.” to the C code not((a->true;(b->true;c))), in which processing and evaluating conditions are separated as one goes. I changed (not(a)->true; (not(b)->true; not(c))) to not((a,b,c)). I preferred the latter form for simplicity. The former form allows replacing true with a command needed at those positions. Conversely, CI/CD may delete or modify any of “e:-a.” \"e:-b.\" or \"e:-c.\" in \"d:-not(e). e:-a. e:-b. e:-c.\" and convert these expressions back to not((a;(b;c))).
+44. The immortal caught errors in State Saving Interpreter Web Service. It was necessary to simplify the code for correctness. I found findall calls that called multiple predicates. I caught errors in the code and displayed them. I simplified choice points by placing them in one place.
+45. The immortal found the values passed to the web page and ran the predicate with these values offline to fix any errors. After a State Saving Interpreter Web Service error, I returned to a safe page. Instead of a 500 error, I displayed a page, the same whether public or private, saying there was an error. The programmer could use techniques to fix this error offline.
+46. The immortal labelled C code with the Prolog code it used to be. I converted the list to numbers. I also used binary and other search trees in C. When storing multiple instances-of-a-value-stored-as-one-value hash tables, I diverged values that changed in one location and converged the same values. This technique also applied to hierarchical structures.
+47.
The immortal simplified the numbers 1,2,3 to a counter. The conversion from lists to numbers was a design decision. I labelled the C code with numbers, with a Prolog key for the list. For example, I labelled one as “Harry”, two as 12, and so on. I traversed and copied these numbers, retrieving them from a structure and transforming them if necessary.
+48. The immortal sped up the algorithm using faster methods. The conversion from lists to numbers gave better performance. Numbers took less space than strings or atoms. I converted from strings and atoms immediately and returned to them when needed.
+
+49. The immortal stated that the list was more understandable than numbers. However, I put the relevant information in terms of numbers. I wrote numbers corresponding to characters in strings and atoms and list items. These list items started at a high number so as not to be confused with the other items. Numbers referred to themselves.
+50. The immortal reverted to the interpreter on trace and control-C but used the faster code when compiled. There was a fast version of the code using long-form (without a large interpreter loop) and numbers instead of list items, and one resembling the user’s code, with lists intact. I explained the compiler that produced this code. I compiled the code to single characters as predicate and variable names and compressed reused data. I only kept the long interpreter loop if backtracking couldn’t be done without it, and then converted as much of the code to long form and lists as numbers as possible.
+51. The immortal verified that code changes were acceptable. I returned diff in Lucian CI/CD and GitL to a version similar to the previous one, finding sequences of inserted or deleted items rather than one line at a time. I used the simple diff algorithm and, on a different item, found the sequence before the same item occurred. This improvement would decrease the number of insertions/deletions to find combinations.
In addition, it would make identifying changes in diff files easier in Lucian CI/CD and GitL.
+52. The immortal compressed different parts of the neuronetwork separately, like functional decomposition. The labels showed the items with a number representing them. I incremented the number of items to give the number corresponding to a new item. I compressed the data and labels table separately, like functional decomposition.
+53. The immortal ran code for performance. I used compression to improve the code. The system needed compression anyway to get through the code. The compressed code was shorter. For example, it used symbols instead of words and skipped over the interpreter.
+54. The immortal didn’t require cuts at the end of grammars. I treated grammars like predicates. I converted the grammar to a predicate. It, including the code in “{}”, was in long form. The code in “{}” may include findall, converted to loops in long form.
+55. The immortal simplified the grammar before converting it to a predicate. I treated grammars like predicates with list building. I merged grammars’ clauses into predicates when they were as simple as possible. I wrote grammars with disjunctions of parts. I split these into separate clauses for testing with Lucian CI/CD.
+56. The immortal improved the grammar by eliminating unnecessary loops. I wrote grammars for parsing strings. The strings were lists of characters. I parsed these with a predicate. I merged the same characters and unmerged different characters, like a decision tree.
+57. The immortal modified the term grammar to have different types of brackets or other delimiters than commas. Grammars were a good way of parsing strings. I converted the string to a term with term_to_atom/2. Conversion found whether the term would be properly formed. I processed the term normally.
+58. The immortal graphed the data’s ontologies, showing types that fit into types. I converted the string to a list and got chunks of the list with append.
Instead, I specified the possible characters in the chunk to eliminate errors. I warned programmers about ambiguous or missing parts of grammars and generated the correct grammar at the start. I generated grammars in the same way as algorithms with nested loops. +59. The immortal used a binary search tree for efficiency. I parsed the list with a predicate. I decomposed the list with A=[B|C]. Or, I used [D|E] in the predicate header. If the list was a list of lists, I used a hierarchical C data structure to represent it. +60. The immortal wrote philosophy to contact the future. I agree robots were uniform in coding. I used smaller sets of characters, commands, clauses, predicates and algorithms. I found human psychology was 2 in 2<-1, 2->3. It included us. +61. The immortal modified append or a simple algorithm. I used a de-blocker optimisation, unblocked bottlenecks, timed commands, and gave predicates the right time and complexity limits in Lucian CI/CD. +62. The immortal helped the engineer find the bug. I checked types to prevent bottlenecks, fixed repeatedly reloading files, and made a type that limited the number of clauses. I could design anything I wanted, including a predicate that always worked, found ahead of time using mind reading and CAW. +63. The immortal found the ideal base predicate and connected the other predicates to it. I checked the code against neuronetworks. I found the code library using CAW. I first identified whether the base was append or another predicate. With this code, I created a decision tree, like a neuronetwork. +64. The immortal isolated and deleted unnecessary choice points. I checked that the predicate complexity was under the limit. I checked that the code was finished on time. I counted how many instructions there were per data item. I used the least number of instructions with the fewest computations. + +65. The immortal generated algorithms and connective algorithms in CAW. 
The ideal code was available in the premium code checker. I wrote a version of CAW that wrote algorithms in Prolog. I tested them as part of CAW, not through Shell. I compiled the algorithm for speed. +66. The immortal recognised variable tables, including hierarchies of variables. I wrote a search algorithm using Program Finder with Types. I found it for terms, strings or lists. I found it for depth or breadth-first and pre-order in-order or post-order. I found sub-searches or an algorithm that followed a state machine. +67. The immortal checked the algorithm against the standards. The ideal code was ratified by law. I wrote the algorithm. I found a simpler algorithm with CAW with the algorithm’s specification. I checked the interpreter met the standards. +68. The immortal supported necessary commands in Lucian CI/CD. I identified and solved the algorithmic rant. I determined that tiredness or a mistaken presupposition had resulted in a falsely held belief that a particular feature would fix a bug. I found the correct configuration of changes using Lucian CI/CD. I wrote converters from the specific programming language to and from List [That Language] and a pretty printer and inserted the interpreter command. +69. The immortal eliminated complexities that went over data unnecessarily. I found and solved the algorithmic knot. Equals4 started as an algorithm that was knot-like and too complex. I simplified it using functional decomposition. It was as simple as considering what it did. +70. The immortal tested sequences of modules to test if they worked together. I found and fixed literary curses in programming. These algorithms that emitted the scent of “archeology” were inexplicable or worked without one expecting them to. I simplified all the predicates in the algorithm. In addition, I corrected the algorithm by modularising its parts to test them. +71. The immortal philosophy student wrote the assignment in several parts. 
I allowed students to write longer code as long as it was simplified. The assignment was possible in the time. The philosophy students ran the algorithm. They tested it, found input for a particular output and understood how to write it. +72. The immortal educated students about Lucian CI/CD and GitL, customising them and using Program Finders to connect code (by finding places they could fit into or after each other to match types, then test these new algorithms, possibly deleting variables and predicates in the process). I committed the code in GitL when it had passed the tests. I checked that the tests passed, that they were the proper tests and that they covered the new features. The advantages of the version control system GitL were backdating to versions compatible with other software, checking progress in an assignment and keeping a record of changes by version. I backdated the version to fix a bug in a non-needed feature, in a mistaken feature or part of a feature. +73. The immortal tested that the single code line passed in Lucian CI/CD. I found combinations of lines from changed parts of code. I found that the line and its predicate met the test querying the predicate. I explained that the line may be incorrect even though it passed the test if the test was wrong, there were test(s) missing, or the test hadn’t been updated. If the line was verified to be needed, it was kept. Otherwise, I ran another test after changing or deleting the line. +74. The immortal kept sequences of commands leading to “writeln” or a command with output only, which couldn’t be checked. Separately, the line was compared with nothing in Lucian CI/CD, so it was always checked. If a line was in both the old and new versions, it was not used to appear or not appear in a combination. The line was likely to be in the latest but not the old version, so it was used in the combinations. I treated commas (line delimiters) as insertions to remove in case one line was required. +75. 
The immortal stated that false and fail were equivalent. I checked that the line met the query test in Lucian CI/CD. I wrote the line and a containing predicate. I wrote the test, including the query and the result. If the test tested for falsity, I wrote it. +76. The immortal found the predicate set the error was in, where no code combination could satisfy the tests. I showed that Lucian CI/CD checked each predicate set one at a time. These sets were each predicate, or in the case of loops, sets of predicates that could only be tested together. In the case of predicates with multiple clauses, they were kept together and tested as late as possible so that predicates were tested bottom-up, as in the case of diamond-shaped dependencies (1<-2, 1<-3, 2<-4 and 3<-4). This shift in order was to prevent predicates, which included a clause that was usually tested last being tested first, causing disorder in the bottom-up order (i.e. 4,2,1,3, rather than 4,2,3,1). +77. The immortal avoided cybersecurity holes. I avoided “exploit” errors, which meant not finding code that exploited a vulnerability to carry out a malicious task. I avoided infiltrations and attacks. +78. The immortal claimed that completing computer science led to mathematics. I tested the recursive predicate in Lucian CI/CD. I included tests that tested the recursive predicate. Other predicates that the recursive predicate called but were not in the loop were tested separately. I could test parts of the loop by calling them as modules in another predicate and testing them separately. +79. The immortal suggested that custom dependencies were in the List Prolog Package Manager registry and could be entered in Lucian CI/CD verification tests. I considered supporting custom dependencies in Lucian CI/CD tests. I checked the types throughout the predicate with tests. I also allowed skipping tests (allowing any predicate that matched the predicate name and arity) or disjunctive tests. 
Finally, I allowed testing with findall to test for infinite results, which required a time limit. +80. The immortal proposed including file paths in tests to allow testing predicates that relied on other predicates for paths. I supported files at specified paths in Lucian CI/CD tests. These were files in the virtual file system, and errors in finding files were fatal and were fixed by saving and returning to an absolute path after setting a relative path. Lucian CI/CD tested predicates one at a time, so accessing files needed to be configured individually in predicates. If a predicate accessed a file, causing a problem because it was connected to another predicate, this predicate could be kept and skipped. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/IMMORTALITY/Immortality 33.txt b/Lucian-Academy/Books/IMMORTALITY/Immortality 33.txt new file mode 100644 index 0000000000000000000000000000000000000000..a84bc12b024fc78bb3c6fef968cb6439a15e1a29 --- /dev/null +++ b/Lucian-Academy/Books/IMMORTALITY/Immortality 33.txt @@ -0,0 +1,88 @@ +["Green, L 2024, Immortality 33, Lucian Academy Press, Melbourne.","Green, L 2024",1,"Immortality 33 + +1. The immortal stored data from a particular date together. I supported file date data in Lucian CI/CD tests. I retrieved the data from a test. I checked other data items matched this in part. For example, the date may be concatenated to a file name. +2. The immortal found recurring items in term atoms in testing. I tested types in Lucian CI/CD. This feature was helpful for user input, random data, and API output. It was not recommended with dynamic predicate data converted to variables or tested in a group of predicates. I could convert some parts of types to variables that repeat. +3. The immortal simplified algorithm to loops and then to assembly code, which could run on any system (using universal assembly code). I found the list using an algorithm. 
I achieved this using mathematical types (for example, a_0,...,a_n), comparative or ordered mathematical values, simple data or taking from nearby variables. I was determined not to correct findall with multiple predicates until creating the \"collect and process\" optimisation, simplifying findall as much as possible. I compressed algorithms into valuable parts, only collecting the needed parts and possibly substituting them back later. I rewrote the algorithm that used dynamic predicates in findall to use predicates. +4. The immortal cut off the recursive types after several levels, removing their recursiveness. I found the recursive types with an algorithm. I found the types within bracketed types. I found multiple possible types of an item. I found recursive (repeating) types and represented them as loops, possibly with the list processing pipes symbol. +5. The immortal started bottom-up, collecting like types. I found lists and recursive types. In addition to bracketed lists, I found repeating lists with Program Finder with Types (PFT). If an intermediate level repeated, I made it recursive. If the last part of a list is repeated, I use the pipe symbol. +6. The immortal found whether the algorithm functioned properly by finding data and refactoring it with Lucian CI/CD. I generated data using types. I gave repeating variables the same values. I found three-level recursive parts with different commands if the data was an algorithm. Types could be strings in term atoms. +7. The immortal created constants when writing the algorithm and found obsolete parts of data. I found the algorithm using types. The types contained variables and variables in term atoms. The variables were mathematical types (for example, a_0,...,a_n), and constants may be included. These may be initial counter-values. +8. The immortal found and solved errors in Lucian CI/CD and the interpreter. I supported program error codes in Lucian CI/CD tests. 
I tested for missing dependencies, files not in folders in the repositories folder, too few items to find combinations from, reaching the interpreter's time limit, failed tests, missing clauses from main_file.txt and a missing main_file.txt in Lucian CI/CD. I tested for missing predicates, badly formed grammars, uninstantiated variables and wrong types in interpreters. I checked that error codes from algorithms were correctly handled. +9. The immortal could only reorder clauses in Combination Algorithm Writer (CAW). Both clauses of the predicate were kept when it passed the tests. When there was one test per clause, they were kept. If there was one test and the predicate was recursive, both of its clauses were kept if the predicate passed. Predicates with multiple base cases and other clauses may have clauses removed if tests didn't test these clauses. +10. The immortal added or changed to lines from the shortest correct combination of the current and previous versions. Lines would only be added from the earlier version if needed to construct the shortest version that satisfied the test. An example of a line being added is B=2 being added to A=1 if it corrected the previous version. An example of a predicate being added is \"b(2).\" if it corrected a findall result in the last version. The result of these is the shortest possible correct result. +11. The immortal could only shorten mathematical formulas or change variables in the predicate header using PFT or doing it manually. Lines would only be removed from the previous version if needed to construct the shortest version that satisfied the test. An example of a line being removed is B=2 being removed from A=1, B=2 if it corrected the previous version. An example of a predicate being removed is \"c(3).\" being removed from \"a(1). b(2). c(3).\" if it corrected a findall result in the previous version. 
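The shortest-correct-combination search in 10 and 11 can be sketched in Python: assuming a hypothetical test predicate, subsets of the candidate lines from the old and new versions are tried smallest-first, so the first passing subset is the shortest possible correct result:

```python
from itertools import combinations

def shortest_passing(lines, test):
    """Try subsets of candidate lines, smallest first, and return the
    first subset (kept in original order) that satisfies the test."""
    for n in range(len(lines) + 1):
        for combo in combinations(lines, n):
            if test(combo):
                return list(combo)
    return None  # no combination satisfies the test

# Hypothetical example: the test only requires B=2, so A=1 is dropped.
candidates = ["A=1", "B=2"]
passes = lambda combo: "B=2" in combo
print(shortest_passing(candidates, passes))  # -> ['B=2']
```

Because subset sizes are enumerated in increasing order, a line such as B=2 is only added (or A=1 only kept) when no smaller combination satisfies the test.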
Lucian CI/CD is designed to keep only the first item in a list of commands or allow no predicate body, either of which may be the shortest possible correct result. +12. The immortal included references to the documentation in the version number. I tested multiple repositories in Lucian CI/CD. These repositories needed main_file.txt in their main folder, at least with \"[]\" to register their main predicates. Later, I automatically detected the main predicates and tests, warning of disconnected predicate islands. I tested related repositories together, creating an \"overall\" version number or set of connections between supported version numbers. +13. The immortal added \"include\" references to used files. I tested multiple repositories with no main predicates (but which were used by other repositories). I redrafted and simplified reused code, preparing it for inclusion in a neuronetwork. I asked for connections between sentences and connective algorithms. I deleted \"include\" references to unused files. +14. The immortal explained the file names in main_file.txt weren't used but were reference only. I tested different contents of main_file.txt (in each repository) in Lucian CI/CD. For example, I tested a/1 and b/1 in repository1 and c/1 and d/1 in repository2 from their main_file.txt files. Any predicate connected to these main predicates could be tested through \"include\" links, including those in other files and subfolders. A list of tests procured from the interpreter could be placed in a file and tested. +15. The immortal refactored one predicate with its dependencies at a time, or all at once, made changes on failure and tried again. I tested combinations of one predicate (meaning just one predicate) per file (meaning in one file). Given the number of items to find combinations from was limited where these combinations would have been between the clauses and lines, I found the predicate was verified, not changed. 
The \"i\" (verified) and \"d\" (changed) values in the verify script were unused because Lucian CI/CD's behaviour depended on the number of items to find combinations from. I lifted this limit to allow Lucian CI/CD to (find combinations of clauses and lines and) change the code. +16. The immortal said Lucian CI/CD faced the same problem of duplicate listings of predicates as Prolog because of the way it loaded files and that this should be fixed. I tested combinations of two predicates per (one) file. If one of these was the main predicate and called the other, then tests would determine the correct combination of their clauses and lines. If a predicate was not called or mentioned, then it would be omitted. This rule applied to both predicates. + +17. The immortal contracted the three predicates to use less code or work as a base case state machine. I tested combinations of three predicates per file with Lucian CI/CD. The kinds of algorithms that assembly language supported were compilers, one predicate call of choice points in a findall clause, and algorithms that could be converted to long form. I held items in one place while processing indices and produced intermediate representation from Prolog (with optimisations for Prolog) that could be converted to assembly code and converted to assembly code for a variety of processors (with optimisations for processors). The kinds of changes that Lucian CI/CD made were part of the intermediate representation. +18. The immortal wrote a Lucian CI/CD plug-in that converted findalls to loops. I tested combinations of three predicates per two files with Lucian CI/CD. I avoided choice points in the long-form predicate by writing in such a way as to eliminate the need for an interpreter. Where possible, I converted choice points in findall to loops, including nested findalls. +19. The immortal tested combinations of two predicates per file with Lucian CI/CD. Every loop had the zero test as a base case. 
I stored the data being recursed elsewhere. Or, I worked out results during run time, stopping when I found the answer. I recursed through the indices, stopping at zero.
+20. The immortal combined foldr(append) and foldr(string_concat) into one assembly macro. I tested combinations of three predicates per file with Lucian CI/CD. I wrote native assembly code for predicates, such as foldr(append), which had two loops. I explained that findall looped through items 1-n of members of a list. Or foldr might loop through lists and append them (not storing their data elsewhere, only afterwards).
+21. The immortal deleted the duplicates and assembled the same types. I assumed Lucian CI/CD would work with more than three predicates with the same name and arity. I did a spot check with ten clauses. If these worked, I assumed Lucian CI/CD worked. I also checked that findall, maplist and foldr worked with ten solutions.
+22. The immortal redrafted end_nested_findall.pl in SSI to simplify the algorithm. I tested the limitation of the predicates available with Prolog to List Prolog. I made sure that the converters worked with the same commands. Various commands were needed to cover those used in Lucian CI/CD. If the converters supported these commands, they could be run in the interpreter.
+23. The immortal relied on the interpreter if the dynamic predicate couldn’t be predicted. I wrote “:- include,” “findall,” and a custom predicate name in the Prolog converter. I customised the converter to recognise specific dynamic predicate names (in xfx, xfy, or yfx format). I sped up the converter by removing duplicate clauses and differentiating identical tokens. I partially evaluated dynamic predicate names (possibly inventing them if known but unnamed) with the compiler to speed them up.
+24. The immortal introduced a Lucian CI/CD plug-in that used CAW, PFT or a neuronetwork to find the right combination of tokens. I tested combinations of lines in Lucian CI/CD.
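The grouped-diff behaviour that this combination finding relies on can be sketched with Python's difflib, whose SequenceMatcher reports contiguous runs of inserted, deleted or replaced items rather than one line at a time (the clause lines here are hypothetical):

```python
from difflib import SequenceMatcher

old = ["a(1).", "b(2).", "c(3)."]
new = ["a(1).", "b(9).", "d(4).", "c(3)."]

# get_opcodes reports contiguous runs of changes, not single lines,
# which reduces the number of insertions/deletions to find combinations from.
for tag, i1, i2, j1, j2 in SequenceMatcher(None, old, new).get_opcodes():
    print(tag, old[i1:i2], new[j1:j2])
```

Here the change to b(2). and the new d(4). come back as one replace run between two equal runs, instead of three independent line edits.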
After discovering Lucian CI/CD ran faster with combinations of lines when compiled, I considered breaking lines into tokens to find combinations. I found up to the same item with diff instead of returning that all the remaining items were changed. By breaking lines into tokens, I could change around arguments, list items and higher-order commands. +25. The immortal tried testing ten files and larger sets of repositories in Lucian CI/CD. I tested one, two, or three files in Lucian CI/CD. I notified users that the “toy” dependencies in the verification script were in the List Prolog Package Manager (LPPM) registry and that these entries’ structures influenced the way different tests worked. If repository “x” was mentioned but not included, there may be an error. The tests may work, but there may be an overall failure. +26. The immortal abbreviated predicate names. I included and gave tests for different predicates in Lucian CI/CD. I listed the predicates and their possible modes. I found their specs from their types and the interpreter. I tested them in Lucian CI/CD. +27. The immortal invented new characters to represent programming ideas. I tested changes in calls’ values from other statements or their arguments. I discovered reusing shorter calls to foldr without 0 or [] in different algorithms. I found the minimum solution that worked, considering the required result. I applied minimalism to pedagogy, reusing short algorithms and arguments. +28. The immortal found methods to operate on binary numbers. I reused code to add in assembly code, which I converted to binary. I wrote an algorithm to add two numbers, which I converted to assembly code. +29. The immortal found binary multiplication. I reused code to multiply in assembly code. I pictured multiplying using long multiplication. I looked up the optimised algorithm. I converted this to assembly or used the assembly command. +30. The immortal rewrote the algorithm or predicate based on spec. 
I recognised the interpreter and turned it into a compiler by mapping its parts to a compiler, representing the bugs and filling in the rest. I could also notify the user of bugs. This method was controversial, as it transformed code. It would be easier to leave the interpreter with free choice points and run it in a compiler. +31. The immortal found their revolutionary idea by considering the merits of a bug in which findall didn’t work with nested predicates. I converted the findall clause with nested predicates to a line of findall statements that collected and processed data. I found the minimum data that findall could take at a time and changed it to a loop. I partially evaluated the given data. I eliminated choice points by pushing findall inwards. +32. The immortal discovered the academy model was easy. I handled free choice points with compiler code containing predicates written in assembly language. I found a “free” choice point outside findall. I inserted compiler code to handle execution from the choice point to each possible end point of the algorithm. I considered the merits of a computational philosophy academy, which required much depth of thought into several straightforward yet long algorithms. + +33. The immortal ran difficult examples overnight or used code that would work after quickly identifying generated combinations. I found combinations of previous code in Lucian CI/CD, which was in a decision tree or neuronet. I also found combinations of type-compatible lines. I avoided errors by keeping different types of data structures in the same format and clearly labelled. +34. The immortal corrected incorrect test data. I avoided mistakes when writing the Lucian CI/CD test by calculating 1+1=2. The algorithm recognised 1 and 1 in the data and + in the tested algorithm. This stage was deduction and was before CI/CD. It verified test data against the algorithm output, assuming it was correct, so it needed human verification. +35. 
The immortal listed the input, of which there were multiple versions. I gave the algorithm the correct input for a particular output. This process sometimes relied on reverse calculation, reverse computation and reverse if-then. I inductively found an algorithm that found the input. I could find the algorithm from input and output, of which I already had the input.
+36. The immortal flattened the structure to make finding predicates easier. Files in Lucian CI/CD could include others. I counted the files, I counted the links, and I flattened and reused structures.
+37. The immortal made multiple files in Lucian CI/CD. I used numerous files in Lucian CI/CD. Multiple files were commonplace in algorithms. Single files were used for web apps. I could convert single files to multiple files using predicate name categorisation, utilisation and multi-levelling.
+38. The immortal thought of new features based on code. I designated new main predicates within each file and a main predicate for each set of queries. This predicate was the main predicate for a query to a predicate. The main predicate could automatically be detected with or without queries.
+39. The immortal separated and rated the usefulness of sets of predicates. The main predicates weren’t necessarily connected with other predicates. The predicates called predicates, but these sets could be unconnected with others. I detected each set and its top predicates. I deleted duplicate, unused or obsolete sets.
+40. The immortal said that Lucian CI/CD was useful by itself, and CAW could be modified with types or replaced with tried data. I claimed that Combination Algorithm Writer (CAW) was more complex than Lucian CI/CD. CAW tried new commands in any order, whereas Lucian CI/CD tried instances of commands from a given order. Therefore, CAW was more complex than Lucian CI/CD.
+41. The immortal said that program finder referred to pre-written algorithms that fitted certain constraints. CAW could reorder arguments.
It could produce any configuration of commands and arguments. Types could tighten these arguments. “Program Finder”, an algorithm that could achieve specific kinds of ends, could isolate and insert delimiters such as commas in Lucian CI/CD. +42. The immortal used CAW to find unpredictable code. CAW could reorder commands. It could take a list of commands and produce them in any order. This order affected the instantiation or progress of variable values. CAW could find the optimal, optimised order. +43. The immortal could take apart commands’ tokens (variables in formulas). Lucian CI/CD could find 1+1=2 from 1+0=1 and 0+1=1. It put together 1 and 1 from these second and third equations and found when they summed to 2. This feature was helpful in debugging or, more quickly, finding a hidden configuration in the code. Another algorithm could find whether an algorithm systematically (always) missed a desired set of values and made a change that corrected it. +44. The immortal claimed that CAW was an essential tool in developing and refactoring neuronets, for unpredictable code that Lucian CI/CD couldn’t find. Lucian CI/CD could find string_concat(“a”, “b”, “ab”) from string_concat(“a”, “”, “a”) and string_concat(“”, “b”, “b”). This idea was similar to 43. Lucian CI/CD could find the right combination given the desired value and possible constraints. I replaced CAW with a database, which I possibly found with CAW. +45. The immortal modified CAW code with program finder with list-based conditionals. Lucian CI/CD could find append([1],[2],[1,2]) from append([1],[],[1]) and append([],[2],[2]). It could find any set of arguments of f for a particular value. CAW was better at finding an unknown, reordered number of arguments. Lucian CI/CD could quickly find a known, in-order number of arguments. +46. The immortal explained that automating main_file.txt generation implied that shorter algorithms may need measuring to be preferred. 
If a main_file.txt refers to no files (with predicates), it must contain []. A file, which must have a main_file.txt, may have predicates but no main predicates (used for querying), in which case it must also contain []. Without a main file defined correctly in this way, there would be an error, as the main_file.txt file is needed to find dependencies in each repository. The main_file.txt file may be automatically generated based on a scan, if the desired effect was to include or exclude specific files based on whether they were needed. +47. The immortal inserted a feature that connected disconnects in data flow or tried cutting them off. I planned to bug-test multiple Lucian CI/CD clauses. This was a feature of Lucian CI/CD. Later, it could reorder them, refactor them, or add base cases. +48. The immortal could have local dependency files if needed. I used the Package Manager’s registry file of dependencies of repositories in Lucian CI/CD. This file contained dependencies that could be found anywhere on the site. It could be used for dependencies within Lucian CI/CD repositories. The file contained dependencies in one place, which eliminated errors and duplicate data. + +49. The immortal replaced variables in the Lucian CI/CD unit test with other variables. I entered more variables from Lucian CI/CD’s unit tester. I found data from the predicate that satisfied Lucian CI/CD tests. I found the inputs that had a particular set of outputs. If I needed a missing output, I found it with CAW. +50. The immortal removed all the outputs in an input-only predicate. I removed some variables from Lucian CI/CD’s unit tester. I found the variables that matched the unit test and removed the others. I listed all the variable values from the predicate for the inputs. I kept variables with a desired value for a model or set of assigned values. +51. The immortal considered compiling Lucian CI/CD’s algorithms. I made Lucian CI/CD as simple as possible.
I tested for and removed unnecessary predicates, variables, lines of code, and parts of formulas. I allowed customising the time limit to run a predicate, after which possible infinite loops timed out. Simplifying and compiling the algorithm made it faster. +52. The immortal could test a^b or a+b, taking out items. I allowed the maximum number of items in Lucian CI/CD to be changed to find combinations. I could set an upper maximum of items to find combinations to save time when running Lucian CI/CD. I found the items a and b in the predicate and the combinations nothing and a and b and a, b. If the number of items to find combinations was one, only nothing or a or b would be chosen. +53. The immortal found the decision tree of the set of variables. I tested combinations of lines in Lucian CI/CD. I started again, consigning equality comparisons in if-then to program finder and inequality comparisons in if-then to CAW. Results of if-then with equality could be affirmed, i.e. given in a Prolog rule or set with assignments in the predicate body. Results of if-then with inequality were found from a set of data. +54. The immortal found the correct combination of base cases. I merged CI (testing) and CD (combination finding), although they were easier to test separately. I could test the algorithm with unit tests. I found combinations of predicates. As usual, I could find combinations of lines and unit-test them. +55. The immortal used sets to store common dependencies. I temporarily represented dependencies as relation lists. I found the dependencies from the repositories. I collected dependencies and referred to them a minimal number of times. I changed the code to refer to dependencies once. +56. The immortal used dependencies separate from the List Prolog Package Manager registry to complete Lucian CI/CD tests. I reused dependencies in Lucian CI/CD tests, which I noted in the readme file. These dependencies were used to build the algorithms for testing. 
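The bounded combination search of item 52 (“nothing and a and b and a, b”, capped by a maximum number of items) can be sketched in Python; the original is Prolog, and the function name is hypothetical:

```python
from itertools import combinations

def bounded_combinations(items, max_n):
    # Enumerate nothing first, then singletons, pairs, and so on,
    # up to max_n items, as item 52 describes for Lucian CI/CD.
    result = []
    for n in range(min(max_n, len(items)) + 1):
        for combo in combinations(items, n):
            result.append(list(combo))
    return result

print(bounded_combinations(["a", "b"], 2))  # [[], ['a'], ['b'], ['a', 'b']]
print(bounded_combinations(["a", "b"], 1))  # [[], ['a'], ['b']]
```

With the cap at one, only nothing, a or b is produced, matching the passage; raising the cap trades running time for a larger search space.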
I stored these dependencies as a term. I labelled this term in the readme file. +57. The immortal included all the code in one file. I created original Lucian CI/CD test dependencies, including terminals and circular references, which the List Prolog Package Manager (LPPM) resolved. I deleted circular references from the source files. In addition, I deleted them from the LPPM registry. I coalesced intermediate dependency sets. +58. The immortal removed the circular references. I resolved circular “:-include” references in Prolog. I found the system of dependencies within the algorithm. This method may span several repositories. I checked for cycles and produced an error on them. +58. The immortal routinely ran tests to warn about mistaken references in files. I resolved duplicate “:-include” references in Prolog. I listed each dependency in the set of repositories linked to the algorithm. I became suspicious of incorrectly behaving code and duplicate solutions, so I deleted duplicate “:-include” references. +59. The immortal checked all parts of the dependencies. I temporarily used new dependencies in Lucian CI/CD, tested them within Lucian CI/CD, and updated them in the LPPM registry file when the algorithm required. I used Lucian CI/CD as a container to test changes before integrating them. +60. The immortal sometimes changed predicate cycles into non-cycles for simplicity, for example, using findall instead of recursion. I tested the predicates in whichever order they were in the files. I constructed the necessary predicates, saving the finished set in their original files. I deleted empty files and references to these files. +61. The immortal merged files with similar functions or those used together and split files with the reverse properties. I used “:-include” statements as needed. I created a new file with novel predicates that optimised or added a new feature to the algorithm. I added a “:-include” statement to include this file. 
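The circular “:-include” detection of items 57–58 amounts to cycle detection over the file-dependency graph. A minimal Python sketch (the actual resolution is done in Prolog by LPPM and Lucian CI/CD; the map and file names here are hypothetical):

```python
def find_include_cycle(includes):
    # includes: file -> list of files it ":-include"s.
    # Depth-first search; return the first circular reference found,
    # so an error can be produced on it rather than looping forever.
    def visit(node, path):
        if node in path:
            return path[path.index(node):] + [node]
        for child in includes.get(node, []):
            cycle = visit(child, path + [node])
            if cycle:
                return cycle
        return None
    for start in includes:
        cycle = visit(start, [])
        if cycle:
            return cycle
    return None

print(find_include_cycle({"a.pl": ["b.pl"], "b.pl": ["a.pl"], "c.pl": []}))
# ['a.pl', 'b.pl', 'a.pl']
```

Reporting the cycle (rather than loading it) mirrors the passage: testing a circular reference would cause an infinite loop, so it is flagged for the user to correct.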
I merged or split similar files if needed. +62. The immortal reviewed and approved or rejected changes to files and retested them. I tested multiple files in Lucian CI/CD. These files were referred to by “:-include” statements and dependencies. There should be one system. I processed the list of files, reviewed them, and uploaded them. +63. The immortal analysed the files and converted the new files back to Prolog files. I merged the files with delimiters. I concatenated the files in a long List Prolog term, separated by delimiter comments. All predicates and Prolog commands could be manipulated and tested. Lucian CI/CD supported predicates and predicate calls using different logical structures. +64. The immortal found the best, pretty-printed combination of old and new code (with line spaces between predicates). I separated comments (using only those in the latest version). To prevent duplicated tests and comments, I only used the new version, not the old version. This step required adding and editing tests and comments in the latest version. I wrote tests that helped me choose from mixtures of the old and new versions. + +65. The immortal wrote an algorithm that accepted plug-ins to find, replace and make conclusions. I tested combinations within predicate dependencies. I selected or found the proper predicate. This predicate was simple, met requirements and worked with other predicates (produced data with the correct form). I converted “Jan.” to “January” in the spreadsheet formula finder. +66. The immortal checked that the code’s data was correct. I used the algorithm to check the code. I checked the pattern-matching code, found the links, and produced the result in the appropriate format. I found data for a month or the number of days from one month to another. The algorithm found text and formulas with numbers, variables, and pointers to another cell. +67. The immortal stayed on the right track and examined finance logically. 
I used the algorithm for changing code. If asked about January, the algorithm searched in the tables. Instead of ignoring it or producing an error if there was an “f” (favourable) or “u” (unfavourable), I wrote a Prolog algorithm to add to my spreadsheet to find it (based on its number sign). I could analyse the data and answer the question. +68. The immortal more easily fetched tests. I automatically completed software using advanced programming skills, such as subterm with address. This predicate helped fetch and substitute changes to data easily, speeding up development time. I stored a database of the number of days per month to use in the spreadsheet algorithm. If there was an unknown value in the answer table body, I deleted it manually, left it there or wrote an algorithm to process it. +69. The immortal adjusted the algorithm to fix test results. I more easily used tests. I used my compiled interpreter to customise and quickly run tests. I made marks on my notes as a reminder to implement those features, where these notes were more straightforward and led to better features. I included necessary details, such as converting non-standard minus symbols, and prepared for extra spaces in formatting. +70. The immortal perfected the data. I simplified Lucian CI/CD’s display. I turned off discontiguous and singleton warnings. I imagined requesting the line number of errors. I wrote a predicate to repeat until it found the same answer. +71. The immortal stated that the formulas found data structures. I checked that dependencies and “:-include” statements matched. I wrote down both lists and checked they had the same items. I didn’t remove duplicates during this process. The formula finder found unknown values or produced a warning. +72. The immortal used backtracking to match one of several row item addition results. I tested the non-mentioned files in Lucian CI/CD. I found each unincluded file, tested it in case it was needed, and reincluded it.
I also found invisible files to test. I used the same formula in a row except the total. I found the first combination of values in the row with the correct sum or subtraction. +73. The immortal completed the report with unlinked files. Some invisible files were connected with “:-include” statements but were not mentioned as main files with main predicates to test. I automatically warned about these files and found whether they were useful. I effectively expanded to the most extensively used range when a first column had no value in a row. If this range was too small, more data could be added to train the model, which would snap to the span formulas. +74. The immortal made the job manageable and well-paid. I eliminated duplicate loaded files in Lucian CI/CD. I claimed that jagged columns (where a column referred to a number of a previous column’s previous rows) were found algorithmically. Instead of searching within a row or column, I searched within the last row or column. I eliminated duplicates in accounting. +75. The immortal centred the text for readability. I tested the files’ predicates from the central repository. I operated the business from a central location or server. I decided to search for data in rows with specific labels. These labels or similar ones were commonly used and recognisable. +76. The immortal carefully preserved table formatting by not deleting white space at the start of lines. I fixed an error in the tests, so the other repository’s predicates were tested from them. This error may be in the dependencies, the file path or a changed file. I deleted empty rows at the start if they didn’t contribute to the table formatting. Tables may either have newlines or tabs separating columns. +77. The immortal automatically detected likely data sources when given different versions of an answer using a formula finder. I changed the main_file.txt to refer to the necessary files. 
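The backtracking match of row items to a total (item 72: finding the first combination of values in a row with the correct sum) can be sketched in Python; the original formula finder is Prolog, and the function name is hypothetical:

```python
from itertools import combinations

def cells_matching_total(row, total):
    # Try combinations of row values in increasing size, backtracking
    # until one sums to the total, as in the spreadsheet formula finder.
    for n in range(1, len(row) + 1):
        for combo in combinations(row, n):
            if sum(combo) == total:
                return list(combo)
    return None

print(cells_matching_total([100, 250, 40], 140))  # [100, 40]
print(cells_matching_total([100, 250, 40], 390))  # [100, 250, 40]
```

Subtractions can be covered the same way by also trying negated values, at the cost of a larger search.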
I omitted the main_file.txt and automatically tested the main predicates. If a number was unexplained in an accounting model answer, I highlighted it (and unfinished answers) and manually added a Prolog formula to find it given a new date. For example, I displayed a “*” on “3” in the capital investment formula because it wasn’t printed anywhere else in the question but was the number of years used in the formula. The number of years corresponded to the rows with a specific label in the table. +78. The immortal rendered creative accounting report formats based on old ones, such as financial housekeeping. Repositories were constantly tested separately and did not rely on “:-include” statements. Instead, I rendered the files before running them using the LPPM dependencies. This kept them in one place, sped up bug correction, and made installation easier. I replaced the values in the accounting model working (and gave the answer). +79. The immortal connected questions and program finder to act like a chatbot. I tested multiple repositories, bulk-tested them, and saved time with the program finder to help make corrections. I recognised spreadsheet question types and helped people with them. They examined and added to the open-source software, assisting with the accounting of education and advanced sciences. +80. The immortal stated that the spreadsheet formula finder was similar to the program finder with types because it found repeating types. I changed multiple repositories. I found the best configuration or combination of lines of the last two repository versions, including CAW and program finder changes, finishing projects in less time. I made row and column-spanning spreadsheet formulas from row and column numbers and found answers from new data. I recognised these rows from their label or position, found from multiple other answers.
+"] \ No newline at end of file diff --git a/Lucian-Academy/Books/IMMORTALITY/Immortality 34.txt b/Lucian-Academy/Books/IMMORTALITY/Immortality 34.txt new file mode 100644 index 0000000000000000000000000000000000000000..c5fbdca30fd0aed68939b752f8204520ea04c157 --- /dev/null +++ b/Lucian-Academy/Books/IMMORTALITY/Immortality 34.txt @@ -0,0 +1,86 @@ +["Green, L 2024, Immortality 34, Lucian Academy Press, Melbourne.","Green, L 2024",1,"Immortality 34 + +1. The immortal’s algorithms and breasoning technologies, crafted with ingenuity and foresight, offered a blend of simplicity and efficiency. I streamlined their use by bulk uploading multiple repositories, both from and to numerous servers. These repositories were intelligently grouped based on their dependencies. To foster comprehension, I provided the option of self-contained or initiative-educating explanations of their functioning. +2. The immortal’s unique approach involved finding subterms with address and foldr put subterms with address program finders. I conducted rigorous testing of various repositories in Lucian CI/CD. To maximise speed, I manually converted the algorithm to assembly language. The use of Prolog simplified logic, while assembly language simplified loops. The Combination Algorithm Writer (CAW) was instrumental in finding the shortest algorithms in terms of Prolog commands, and a loop finder converted its loops into assembly language. +3. The immortal’s upgrades and maintenance efforts focused on simplicity and efficiency. I created ‘repositories’ for files, which were needed when one depended on the other. I enhanced the code by eliminating ‘subterm with address’ and using data structures with the same structure. I optimised the code by converting calls to ‘subterm to address’ to predicates. To ensure compatibility, I included the reused code in both repositories. +4. The immortal noticed that future technologies became more straightforward. 
I stated that it was better not to store the repositories in the upper repository. I stored these repositories, with reused code, separately. I also stored intermediate code and calls to intermediate code separately. These functional calls, which reused intermediate calls to rarely used predicates, could be customised with parameters that repositories used. +5. The immortal simulated many shell commands with a virtual file operating system in Prolog. I shuffled files in and out of the repositories. I found innovative new predicates that could simplify recurring needs. I found file handling, game and grammar-finders to create C code. I moved the files virtually, avoiding the lag of saving to disk. +6. The immortal automated tasks with CAW with reports and role finders. I stated that the repositories used duplicate files. I converted algorithms to and from queue processors (if they had short enough queues) scaled up or feature-rich. The cross-platform (smart device and desktop) language could granularise (divide) tasks by expected and completed time, form progress bars, and control the progress of tasks. I stored data minimally and generated the files from it each time. +7. The immortal treated mathematics as fine art, optimising more carefully. A third repository could access repositories if they were small enough. I found meditation uses for my algorithms, such as Breasoning Algorithm Generator (BAG) for quickly writing 4*50 high distinctions in immortality, Grammar Logic (GL) for increasing texts to being detailed and Program Finder with Types to speed code development without human intervention. I based neuronetworks on algorithm data with algorithms such as premium code checkers. +8. The immortal flattened the file structure to avoid errors and promote user-friendliness. The repositories were turned into a hierarchy if necessary. 
I merged the repositories if they might be used on spaceships and diverged them if the code was novel and valuable elsewhere or if it modified functional calls. In the future, spaceships will not have access to the planetary internet; instead, they will have access to the unet (sic). Using this connection, I could plan and make trips and download open-source software. I wrote my open-source software based on my current interests. +9. The immortal notified the user about circular references. I traversed the files that each file included. If a file recurred, I notified the user about it. I reported the file that included the circular reference and the file that was circularly referenced. I stopped the algorithm so they could correct it. +10. The immortal redesigned the hierarchy to call only necessary predicates. I helped correct circular references. I listed the circular reference. I deleted the circular reference. I corrected other files that called the file hierarchy below the calling file to call the calling file to include all necessary files. +11. The immortal only loaded the files when necessary, thwarting circular references like Haskellian lazy evaluation. I tested circular references (repositories that were loaded twice). The algorithm to find circular references built a list of dependencies in line to the first file. If any paths from earlier in the list recurred at the end, I reported the circular reference. Rather than testing the circular reference, which would cause an infinite loop, I reported it. +12. The immortal stated that List Prolog Package Manager (LPPM) detected and removed circular references on loading. LPPM handled included files file by file. It automatically avoided missing, unnecessary and circular references by including only necessary predicates. These were loaded into a single Prolog file. It found or wrote simpler predicates in development, and the user could approve each change. +13.
The immortal checked file names for mistypes and warned of or corrected these. Lucian CI/CD detected circular references using “:-include”. When an algorithm was verified, compiled or run, it checked for circular references. The code was converted to List Prolog (algorithms as a term or a list of lists), and the files included were checked for circular references. I checked for paths and the filenames they included within the brackets in “:-include().”. +14. The immortal developed “off the ground” algorithms with various features. I created neuronetwork material using CAW. CAW was replaced with Program Finder with Types, which mapped variables to new positions given conditions and relations. I could write predicates such as string concatenating and converting to numbers in a list, convert a list formula to a term or sort values, working out what the computations were backwards. I identified different algorithms, simplified them and saved them for faster retrieval. +15. The immortal explained that loops within loops should be scrutinised to optimise them. I tested loops within loops in Lucian CI/CD. Predicates with loops within loops were indivisible and tested together rather than separately. Loops within loops contained an outer loop that overrode the inner loop. Any tests within the loop depended on test results from the recursive logical structure and on whether the computation was at the start, middle or end of the loop, and the recursion, the aim of the loop, needed to be tested. Lucian CI/CD found the minimal code throughout the loop (from combinations of the most recent and current versions) and suggested code based on tests. +16. The immortal simplified data to indices to compute it. I tested loops in Lucian CI/CD. To reduce clutter, I expressed loops as findall, foldr, or maplist where possible. I tried converting them to predicates to remove unnecessary variables, commands, cases, and results in the algorithm. I merged findall commands to save time.
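Item 16’s merging of findall commands can be illustrated in Python, assuming both loops range over the same list (the original is Prolog; the variable names are hypothetical):

```python
xs = [1, 2, 3]

# Two separate passes over the same list, like two findall calls ...
squares = [x * x for x in xs]
doubles = [2 * x for x in xs]

# ... merged into a single pass that collects both results at once.
pairs = [(x * x, 2 * x) for x in xs]
merged_squares = [s for s, _ in pairs]
merged_doubles = [d for _, d in pairs]

print(merged_squares == squares and merged_doubles == doubles)  # True
```

The merged form traverses the data once instead of twice, which is the time saving the passage refers to.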
+ +17. The immortal made connections between dimensions in the neurodatabase to teach computer programming and let it find subterm with address for itself. I merged multiple calls to subterm with address and its mirror call to foldr put subterm with address to reduce data processing, to optimise an algorithm I had written using program finder with types. I quickly translated content in an HTML document using this method. I found the subterm with address query from the data with program finder with types, the predictable parts of the heuristic to apply to the instances found with subterm with address, and the unpredictable parts with CAW or a neuronetwork. I merged the subterm with address commands that found the same subterm in related data and processed it similarly. +18. The immortal quickly found and optimised algorithms. I used a spreadsheet formula finder as the connection between the program finder with types and subterm with address. I found copied values, additions and subtractions in the spreadsheet of the algorithm’s data. I found patterns of data within and external to a predicate’s scope. I found list and string operations and missing operations. +19. The immortal completed the department algorithms. I wrote an open-source neuronetwork, which had better results. I finished the algorithms. The most popular algorithms were the highest priorities. The neuronetwork ran algorithms and used a neuronetwork for natural language. +20. The immortal attacked the problem with PFT and estimated the time the formula finder would take to find the rest of the solution. I wrote sentences connecting neuronetwork parts. The chatbot recognised the idea and replied to the sentence. There was an advanced chatbot that returned advanced algorithms containing subterms with addresses and used different predicates. +21. The immortal wrote algorithms to speed up accessing files and inserted an algorithm to separate strings after a character. 
I explained the sentence to the chatbot. The chatbot was a sponge for information about the world. It wanted to know about human motivation and how to shape it. In the end, it was fair, the smartest, and spiritually commanding. +22. The immortal checked ideas by reading them aloud in one’s mind. I orated my works, such as lectures, notes, or algorithm descriptions. Speaking was more evocative and easier to understand than writing, but writing provided a visual reminder of the sentence, like art. +23. The immortal added reasons to the discovery. I asked students to show an understanding of algorithms with understanding and detail ontologies. Understanding led to details. The experience showed the internal relationships in the idea. Details were the examples and uses thought of afterwards. +24. The immortal explained that Grammar Logic (GL) generated more breasonings necessary for high distinctions and that Breasoning Algorithm Generator (BAG) was a more developed way of generating new breasonings chosen by the future people. I breasoned out an argument to use time crystals to automate the GL and BAG algorithms spiritually. Time Crystals were a spiritual phenomenon that saved and allowed re-indicating breasonings, which helped healthy human conception, earned high distinctions, and enabled proper commercial function. Breasonings were a skill belonging to meditation. The GL and BAG algorithms were used in immortality and were necessary to run for their effects spiritually. +25. The immortal sped up mathematical analysis and checked employees’ work. I wrote the Spreadsheet Formula Finder to speed up report generation. It could take any model solution and question and produce a solution. I didn’t need the model question after forming its relationship with the model answer. I substituted the question into the model question and calculated the answer. +26. The immortal founded the educational institution.
I found and replaced subterms using their addresses in terms to quickly develop Spreadsheet Formula Finder. For example, [1]=a,[1,1]=a in [a,b] and [1,1,1]=a in [[a],[b],[c]]. I labelled, nested and manipulated parts of terms as variables and functions. I used findall to operate on the instances of the subterms, then substituted the subterms back. +27. The immortal formatted the result in terms of the sample solution. I pointed to list function numbers with term function numbers. The term formula was [2+[3*4]]. I calculated the result of this function into the list function [“$”,14] = “$14”. I printed this as an HTML table or exported it to Excel. +28. The immortal used the given function. I changed a=$1+$2 into [[\"a\",\"=\",\"$\",b,\"+\",\"$\",c],[b+c]]. If it had been a=$1+$2=$3, then I added [[\"$\",d],[d=b+c]]. If the function was displayed as calculated, I did this. I searched for patterns in the row to arrive at the function. +29. The immortal sped up data analysis and reporting. I formed the term function from the list to find patterns from the spreadsheet when no function was given. I checked each table’s rows and columns. I found which cells added up to or formed a subtraction to equal a value in a cell. I found recurring functions in rows and columns. +30. The immortal stored function results in tables, optionally printing them. I repeatedly refreshed functions based on functions until none were left and the spreadsheet was complete. I found the functions for a row, the table total, the total, and the qualitative conclusion. Once I had found one function, I could use its result to find others. When the formulas were complete, I could substitute custom data. +31. The immortal created a business website with an account area and a dashboard. I described a positive thought with the Queen. The master course contained the specifications to verify the budget, check a project to invest in, and write financial reports. 
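The “subterm with address” operations of item 26 can be sketched for nested lists in Python, assuming 1-indexed address paths as in the passage’s examples (the actual Prolog predicates are richer, and these function names are hypothetical):

```python
def get_subterm(term, address):
    # Follow a 1-indexed address path into a nested list.
    for i in address:
        term = term[i - 1]
    return term

def put_subterm(term, address, value):
    # Return a copy of the term with the subterm at the address replaced,
    # the mirror operation used to substitute subterms back.
    if not address:
        return value
    i = address[0] - 1
    return term[:i] + [put_subterm(term[i], address[1:], value)] + term[i + 1:]

print(get_subterm([["a"], ["b"], ["c"]], [1, 1]))           # a
print(put_subterm([["a"], ["b"], ["c"]], [2, 1], "x"))      # [['a'], ['x'], ['c']]
```

Labelling subterms by address in this way lets a formula finder collect instances, operate on them, and substitute the results back, as the passage describes.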
I found the costs needed and checked the need for orders, machines and other calculations. I completed these reports when required for accounting purposes and applying for bank overdrafts. +32. The immortal supported others’ spiritual journeys. I wrote a short business course. I explained how to make a login screen, access marks, accept payments, plan for the future, write a spreadsheet formula finder and help others with meditation. I encouraged students to write on business philosophy for the most intelligent conclusions. I found fulfilment through spirituality. + +33. The immortal described the substance of the algorithm. The appearance was rendered for 16000 breasonings. Appearance was high quality in itself. I created a Prolog Web Server using a Prolog algorithm. This server was faster than State Saving Interpreter (SSI), which passed only necessary data from page to page. +34. The immortal used a format for a mathematical formula in the algorithm, which organised the main data from the algorithm. I found and operated on part of a data structure section or in the reverse order using a “subterm with address” program finder. I wrote the sentence utilising a neuronetwork trained on mind-reading data. The brain formed future networks from past thoughts. I found the tight search query and then narrowed or expanded it. +35. The immortal recognised the model solution. I checked the solution to the question. I wrote an algorithm to understand the logic behind the question better. The underlying algorithm solved the questions. I checked, explained, applied, rewrote, and wrote new questions. +36. The immortals used various sources in their research. I checked the neuronetwork for correctness, wrote a new neuronetwork, and concentrated on recurring, needed Prolog algorithms. The departments supplied the general program finder specifications. +37. The immortal wanted a serious but not funereal tone.
I breasoned out the texts to check for misspellings, earned a high distinction, and ran Grammar Logic, which was fast on the new computer and even quicker when compiled. The premium checkers were the presentation of the program finders. I checked the best method, which was a neuronetwork. The neuronetwork identified and chose one function when writing an algorithm. There was a compiler command to convert an algorithm into a web server automatically. +38. The immortal stated that the technology allowed simulated teleportation and operating a physical office, with heaviness giving way to intuitiveness. I printed the model solutions for faster access during the exam. It was an open-book exam, and notes and formula sheets were allowed. The business was ethical as it had consolidated growth. It included people rather than relying on automation. +39. The immortal wrote a maximum of 400 pages per book. I replaced copywriting with neuronetwork. It took slightly longer, but I could keep and control the content of the books, and I kept the meditational advantages of writing the initial material and the books. I checked the content of the writing and the function of the algorithms written by the neuronetwork. I requested the query and tested the algorithm. +40. The immortal could automate all systems needed for immortality with time crystals. I locked my age and turned off unlocking. I repeated this when I entered a simulation. I also repeated turning off dying and unlocking. This step was the final confirmation of immortality, which one needed to be sure of and enjoy one’s life by, for example, writing high pedagogical distinctions and confirming what others would like. +41. The immortal enabled immortality with breasonings and future technology. I used time crystals, a spiritual technology, to control the breasoning process whichever time I was in. I thanked the future for breasonings, the quantum box and time crystals. 
The time crystals could mind-read and run programs based on thought commands. They were popular because of their medicinal use. +42. The immortal affirmed that the 4*50 high distinctions or 16000 breasonings was the standard of effort and achievement. I stated that the 16000 breasonings for the appearances for the high distinction should be in the same department as the subject. Appearances were separate from the high distinction because they were medicine, a fine art production illuminating the quality of the work. The details were computational. I wrote the number one song. +43. The immortal created a spiritual computer, complementing the future simulation with the findall command. I compiled the steps, performance criteria, and elements, where the performance criteria showed an understanding of the element. The steps navigated through the material while the elements and criteria were reversed, and each question’s criteria featured multiple elements. The steps were logical progressions, making learning more accessible. The educational areas of study were simple, and anything was possible once one had set one’s mind. +44. The immortal performed detailed micro and macro automated tasks based on the student’s knowledge. I wrote steps, performance criteria, and elements for computer science. I optimised the development pathway, in which the students were the main people to themselves. I used premium checking program finders by asking students to start again after making a mistake, replacing findall with subset progressions, encouraging writing and working in terms of the time’s technologies. I identified and prevented mistakes at the cellular level. The mind-reading results were better if the questions and their answers were exciting enough. +45. The immortal agreed with simple yet rigorous content, stepwise structure, straightforward language, and no grammatical errors. I wrote steps, performance criteria, and elements for philosophy. 
I found the algorithm, which had steps alluded to by the philosophy. I analysed the algorithm from an exterior perspective to deserve a high distinction. I assessed myself on the elements of each algorithm from the criteria structure, content and writing/language. +46. The immortal took care of human learning and found the reason for the enjoyment of writing. I connected the robot and human worlds, keeping my texts intact and analysing as I went. I mainly tried to match algorithms with my algorithm writing suite. I wrote connections between meditation and algorithms, noting changes to professional standards and keeping the internal analysis going. I plan to start my Academy by including humans and basing success on connections (with robots). +47. The immortal trusted themselves while they aimed for particular conclusions. Robots require precise and harmonious reasoning. I discovered how to interest robots capable of reasoning about algorithms in pedagogy, meditation, and immortality and how they can pursue these without human assistance. I optimised applied data using a formula finder. The synthesis of robots and humans was beautiful, different, and perfect in function. +48. The immortal focused on all areas, but creativity in particular. Robots in our time are another name for humans who are conceived based on recordings rather than breasonings and find breasonings, an unwinding of one’s potential, challenging. To succeed, someone must be partially interested in pedagogy and meditation. One can pretend to be a bot while immortal and find entry points to new possibilities as a bot. I trusted the act of kindness and softness coming from insight. + +49. The immortal stated that robots could participate in business, medicine and meditation using breasonings. The children interpreted breasonings as ideas that could be critically analysed. 
Breasonings were ideas accessible to everyone, and collecting enough of them in key academic departments for particular reasons helped earn high distinctions, become immortal, sell products, have children, and for various other purposes. Breasonings contained a computational component or a useful algorithm that could be analysed and connected to different algorithms. “Robots” enjoyed reading and writing about breasonings and using them. +50. The immortal thought of a connection while simplifying the sentence in “A because B” form. I stated that robots needed at least 80 breasonings to have children and earn high distinctions. These parents needed to have the minimum number of pedagogy breasonings and others, including meditation and an area of study they could write about, such as Computational English, before they (the parents) were conceived. This minimum number was 50 As (50*80=4000 breasonings) and should be worked on using one’s intellect, not copied or changed from another form. To be someone respectable, one should breason out breasonings and meditate. +51. The immortal enjoyed writing about breasonings and leading an academic community. Everyone could interpret pedagogy as including breasonings. The school’s priority was to include breasonings so students could become immortal. In any case, students could examine breasonings and write algorithms if desired. Breasonings were like “writing” and could cover pedagogy, meditation, medicine, popology, societology or any other study area. +52. The immortal made up their system and levels with advanced courses with usability standards. There were different levels in breasonings, which people could reach in their own time, possibly using software, articulating to PhD level. The Academy started with long, detailed texts and converted to PhDs with enough preparation. There could be online academies in each country. Everyone could learn to become immortal, take professional development and graduate. 
Their students could continue attending. +53. The immortal indicated the same as utterances. Meditation was a beneficial idea that allowed writing, time travel, and immortality. Everyone could experience these things with help at the start. They could do it themselves using whatever effort they could muster or ask a meditation teacher to help them do it without effort. +54. The immortal enrolled in the educational institution to explore their ideas and craft a career. To be able to indicate the same as utterances, the parents needed to indicate the same as breasonings. They needed to indicate the breasoning course components, the breasonings details and components of breasoning technologies such as text to breasonings to their human parents to help with. Then, they could indicate text to breasonings to breason out texts, words they heard, media or knowledge from the education system. The bot or human enrolled in creative writing and made up stories about philosophy. +55. The immortal thanked the breasoner for the advantages of continued health through immortality. Using spiritual time crystals, robots and animals can achieve time travel and immortality. Spiritual technologies that were part of meditation could help save one’s life by recording and playing back the effects of breasoning out or thinking of the x, y and z dimensions of objects, which became possible to indicate for everyone, including non-breasoners. These breasonings were constantly available for life-sustaining purposes. +56. The immortal argued that the Academy’s subjects were intuitive. I put excellence into everything I did, writing 4*50 As (16000 breasonings or sentences) per work. Each employee had 80 or 16000 breasonings, and the rest were helped. The Academy taught using steps, performance criteria and elements hierarchies of reused Computer Science and Philosophy skills. I developed a high distinction for sentences and algorithms to be produced by a spiritual system. +57. 
The immortal listened to the first line of lyrics and recorded a rhythm with particular velocities, then quantised it. I added rhythm to Music Composer. I changed the default constant rhythm to several notes with different lengths, possibly with rests. The constant rhythm had the same number of notes, which was reduced by the lyrics. I changed the rhythm according to the section, denoted by the time. +58. The immortal changed from having different rhythms for vocals, melody, and harmony to one rhythm for each section and one rhythm for each section of each part. For example, I could change the timing of notes by using a constant rhythm for pads and interacting vocal and drum rhythms. As an aside, I replaced the MIDI instruments with orchestral instruments and altered the EQ (equalisation), panning, volume, reverb and other properties. I found that bass and guitar sounded better separated by EQ. I kept instruments in the middle third of volume and used low 500 and high 1.5K EQ for acoustic guitar, high 400 and low 2K EQ for piano and high 2K EQ for voice. +59. The immortal scrolled to a part or a section in the text sequencer. I wrote a predicate that pretty-printed the pitch, rhythm and lyrics for editing. I outputted these in a grid, each within square brackets, aligned using spaces. The user could open this file in a text editor, find and replace or edit parts and export the music. The algorithm produced errors if the file was not in the correct format, the rhythm contained the wrong number of notes, or the section was too long. +60. The immortal stated that rhythm was the only part that needed to be able to be different for each part and section of the song. I copied rectangular text selections in the text editor by option-dragging the selection, copying and selecting the insertion region, and pasting. I used this technique to cut, copy and paste sections to other places in the song. 
The text music composer didn’t count spaces, and the sections could be switched on or off when exporting to a sequencer for recording vocals. The advantage of editing the text file was that the form, melody, harmony, instruments in sections, rhythm, sections’ on/off status and the lyrics could be edited. +61. The immortal checked the requirements against Descartes’ time when the writing length was itself. The professor’s students completed eighty breasonings for their assignment and checked with the lecturer to whom to give the others. Whenever the student wrote eighty breasonings, they breasoned out 16,000 sentence breasonings and 16,000 detailed breasonings. In addition, they completed 16,000 algorithm sentences and 16,000 algorithm detailed breasonings. This increase made the 80 breasonings into a viable product. +62. The immortal’s social movement was automated programming. The counter counted sentences as it went and stopped when it reached the correct number. I agreed with the breasonings, and wrote self-standing breasonings that were my idea. I found conceptualising original algorithms more accessible after writing 50 high distinctions to design an interpreter and 50 high distinctions to design an inductive algorithm writer. I thanked the immortal Nietzsche, who helped me develop algorithms in these works. +63. The immortal found the proof finder using prioritised laws of assumptions. I structured and pretty-printed a database using the subterm with address predicate. I reduced or removed unused intermediate space. I connected algorithms simply in the context of a Whole Foods co-op by finding what I wanted in space, such as (not overly) refactored code. I found applied shortcuts such as intermediate interfaces like menus or draggable objects. +64. The immortal analysed the address of dimensions of an address. I found details from details to make up numbers if necessary. 
The sentences in assignments sometimes didn’t register as critically analysable in the neuronet, so I increased those that were. For example, single, unrelated terms sometimes weren’t connected with algorithm descriptions, but terms with a use or possible connection were. I found the address of an address, etcetera, in Proof Finder. + +65. The immortal collected subterms within subterms. I wrote a version of subterm with address that allowed accessing, passing variable values based on, and modifying addresses and data collected so far. It also allowed passing a predicate name to act as a heuristic within the subterm with address predicate. Accessing and changing addresses and other data was optional. I could also provide custom heuristics for collecting addresses of subterms with particular types and abilities mentioned earlier. +66. The immortal considered the one rhythm fitting the whole song, then mind read changes. In Music Composer, I changed parts and sections of songs to a new rhythm. I also wrote a programming language to specify which sections had particular rhythms, given their type of instrument or melody/harmony status, solo section, or start, middle or end of the song. For example, I programmed a constant, lyric-based, off-beat, random or mind-read rhythm with emphases on the main beats or beatlets, perhaps accompanying the main or other rhythms. I programmed these rhythms in certain sections and in vocal, drum, pad, or solo parts. +67. The immortal made a movie from philosophies. I explained algorithms. I noted unique reasons, methods, uses and future directions. In the case of Cultural Translation Tool Generation 3, it creates films from algorithm descriptions. It included a data structure fly-through and walk-through of the algorithm. +68. The immortal dealt with critics after delivering the lecture. To find a story, I applied a key term from a philosophy generally, with a narrower focus. 
I thought of the time, place, the philosopher’s history of writing the philosophy, current and future computational methods and cinematic settings that the algorithm evoked. There was a spring device sculpture outside an art gallery symbolising CAW. I explored the philosopher’s inner struggle with writing and how they solved ethical problems by answering students’ questions. +69. The immortal child found complete knowledge and a haven from harm. I explored my journey from child to adult as a movie theme. I explored what being a child would have been like with meditation, content, writing and creating systems. The child started and completed projects, recording them in a safe place. They explored well-knownness, accessing information systems and publishing their work. +70. The immortal conjectured discoveries. I wrote novels before making movies. I explored discursive, first-person language. I wrote contemporary/classical accompaniments that were memorable and thought-provoking. I wrote chapters with characters with single aims. +71. The immortal engaged the world with a first-taste, exciting perspective. I used imagery-evoking language in monologues that a child would think of. I described the algorithm as a peculiar device. I over-enunciated its features with computationally verbose comments. There were virtual places to store models, such as a gallery of thoughts. +72. The immortal found childishly simple solutions to problems. I wrote from the point of view of a child in an adult world. I found innovative solutions to problems, such as writing a programming language that used effecting phrases that could be programmed more directly by generalising the problem. I found similar predicates to subterm with address, for example, that performed the task of a predicate. I also found an algorithm to find when not to use subterm with address, when in findall that called a predicate containing itself recursively. +73. 
The immortal combined effectualism and simplicity in the child’s monologues. The child on stage delivered sentences in the form “I effected X” philosophically, reporting thoughts as results. This first-person past tense followed the reporting of scientific results, evidence of general truths and what a paper covered. I used the passive voice to accentuate the verb and the object of the phrase rather than the subject, such as using “was” to describe a computational property. I described subterm with address simply in terms of its data, and simplified it and a simplified part of it. +74. The immortal defined the object they were thinking about. I defined additional characters whose problems the child solved. The child problematised ideas generally (on people’s sides), positively and philosophically, and left solving them to the imagination. The character exposed, explored and drew conclusions. The origin was a synthesised ten-sentence difference in opinion, sparking similar sentences that were secretly solved and where the problem and solution were detailed. +75. The immortal added simple Prolog types to a List Prolog algorithm. I always solved the problems in the movie with algorithms, an uncommon literary device. They often relied on double negation to prove a statement. I added types to a program to turn it into a proof. In addition, I made sure I had proved the right thing, not something else by accident. +76. The immortal went beyond ideas to get them back to simple thoughts. I determined the unique cinematography from the property of an algorithm. This property was the secret setting symbolising the algorithm. I invented a bank, like a money bank, to visit in a simulation in a movie as a metaphor for CAW. This metaphor was making one’s currency with CAW. +77. The immortal was the character who was right. I added internal or algorithm-specific perspectives on the algorithms.
Algorithms were the intelligent content of English-Philosophy and were explained in a few lines. Even though they were obvious, they were the thing-in-itself and were easier to understand. I discussed encyclopedia-themed perspectives on the algorithms with others. +78. The immortal tried the algorithm department sorter. The perspectives used internal or department-nonspecific terminology. An internal term may be mentioned as few as once. These literary terms illuminated and helped add enjoyment to other algorithms. I stopped writing when I had found 1200 algorithms or 200 pages. +79. The immortal wrote 16000 ideas per object on the screen and professionally developed themselves. The rest of the 16000 algorithms were used as details. Each algorithm, of which there were many, secretly mentioned a commercial product. I found related algorithms and connected them with symbolic connection algorithms, milestones in themselves. It was a grand synthesis that spurred the imagination for more. +80. The immortal listed ways to reduce SSI memory use on paper. The point was how I resolved the perspectives. I delivered the address, followed by answering students' questions. I prepared for possible questions furthering or uncovering understanding, connecting to new algorithms (for example, a graphical user interface for SSI) or countering incorrect use (and optimising SSI). I also prepared for follow-up questions, such as making SSI a universal programming language by simplifying command sets into sentence commands."] \ No newline at end of file diff --git a/Lucian-Academy/Books/IMMORTALITY/Immortality 4.txt b/Lucian-Academy/Books/IMMORTALITY/Immortality 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..57eece0999a2c41e4225edb1065623b6971d1cf6 --- /dev/null +++ b/Lucian-Academy/Books/IMMORTALITY/Immortality 4.txt @@ -0,0 +1,88 @@ +["Green, L 2022, Immortality 4, Lucian Academy Press, Melbourne.","Green, L 2022",1,"Immortality 4 + +1.
I gained repair points from bringing supplies. I began by listing the collisions from all directions. In these, the objects were possibly mutually annihilated. Or one was damaged. Or, the clash had destroyed both, requiring repair points. +2. I wrote the member1.pl algorithm. I wrote a member predicate using list processing. Then, I translated the member predicate into C. I translated the list processing command using string to list. The unification algorithm represented lists as trees, stored as arrays. +3. I also reversed commands, finding their constraints or inductively finding their input. I wrote the one_d_gravity_block.pl algorithm. I stacked one-dimensional shapes. I could work out a word or algorithm from its length. I programmed using previously written algorithms, looking up their usage in the documentation or a reverse documentation command. +4. I wrote the mathematical relations in the tests and replicated all these combinations. I was responsible because I checked the procedure to 100%. I listed the necessary tests. I tested that the method worked in 100% of them. I modified the procedure to work if required. +5. I found a way to examine the same thought. I wrote the critical agreement with the idea. First, I found a good example in the world. Second, I found evidence for good uses. For example, the breasoning dictionary's properties were correct (the calculation was accurate) and good (the calculation led to good management). +6. Immortality saved the life. The First Technique treated the argument map as an answer to a question. The Computational English calculator found the result of unlikely cross-overs in the text. I found the implication of the rare meeting to be good. I noted this was saving a life. +7. The physical model underlying the algorithm, for example, V=U+AT, was found by the formula finder. I wrote the text. I discovered its internal and external algorithm (where an external algorithm was how it fit into an argument).
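The member predicate of item 2 is the standard two-clause recursion; as an illustration (a Python stand-in rather than the author's C translation, which is not shown), a minimal sketch:

```python
def member(x, lst):
    """Recursive membership test mirroring Prolog's member/2:
    member(X,[X|_]).  member(X,[_|T]) :- member(X,T)."""
    if not lst:            # member(_, []) fails
        return False
    head, *tail = lst
    return head == x or member(x, tail)

print(member(2, [1, 2, 3]))  # True
print(member(4, [1, 2, 3]))  # False
```

The same head/tail recursion carries over directly to a C version using a linked list or array cursor.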
I checked each sentence against previous sentences, mimicking what I wanted to be. I invented characters to talk about this logic, for example, a path-layer who determined whether a student could think of an algorithm. +8. I used the more straightforward formula throughout the work. I checked the formulas, algorithms and grammar used were correct. The algorithm was simple. Also, it had no errors even though it appeared to work. I ensured that the algorithm worked, by comparing it with the best algorithms, for example, working grammar. +9. I worked out the answer (the algorithm). I was healthy. From this, I developed a way of supporting myself. I could generate jobs from this. With investments and support, I could invent the necessary work. +10. I found the new geometrical variables and maintained my hair. I found the new radius from the multiple of the area. The area was equal to pi*(radius^2). For an area2 = n*area1, the new radius was equal to sqrt((n*area1)/pi). I found the radius for the circle on the field. +11. I studied computer science to write my ideas down. I did this by solving the food maze. First, I grew my food. Also, I avoided unhealthy additives. As a result, I slept well at night. +12. As computer science added knowledge, I predicted that robots will too. I did this by working out meditation to help the universe. For example, the meditation centre became a hub of interest for the meditators. Everyone was positive. Students learned advanced skills. +13. Some students required more help than others. First, I wrote the inequality between the numbers. In this, the user entered the inequality and the numbers. The algorithm returned whether this inequality was true or false. Then, I helped the student to study at the institution. +14. After normalising the squares, I counted them. I started by counting the list of squares. Then, I used them by making a physical model for the first technique. In this, I determined the local squares involved. 
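The circle calculation in item 10 above follows from area = pi*(radius^2): if area2 = n*area1, then radius2 = sqrt((n*area1)/pi). A short sketch (the function name is illustrative, not from the source):

```python
import math

def new_radius(area1, n):
    """Radius of a circle whose area is n times area1,
    from area = pi * radius**2 (item 10)."""
    return math.sqrt(n * area1 / math.pi)

# Doubling the area of a unit circle (area pi, radius 1)
# scales the radius by sqrt(2).
print(new_radius(math.pi, 2))  # ~1.4142
```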
I found their relationships. +15. Forall encountered an infinite loop when given an undefined variable. I found that all of the items in one list were members of another. I wrote the equals symbol to denote this. I programmed a forall command on all list items by running the same command. Forall always passed when given an empty list. +16. I specialised in induction and an interpreter. The induction command found algorithms using a given set of commands. I wrote this algorithm in C. It could use a neural network. I could also write this neural induction command in C. + +17. I quickly arrived at the solution. I did this by manipulating the algebraic expression. First, I wrote the expression in terms of u(1-u). Then, I antidifferentiated it. Finally, I substituted u's expression back in. +18. I substituted the value back into the expression to check it. I expanded the expression. To do this, I took x(x+1). I expanded it to x*x+x*1. I simplified the expression to x^2+x. +19. Simplification prevented bugs and allowed programming. I simplified the expression. I started with 1/x * x/3. I simplified this by cancelling x in the product. The new expression was 1/1 * 1/3 = 1/3. +20. I also accounted for negative powers. I multiplied the expressions. I multiplied x^2 * x^3. In this, x^a * x^b = x^(a+b). So, x^2 * x^3 = x^(2+3) = x^5. +21. I found the frontier and worked. I calculated the expression. I simplified the expression to f(a)/f(b). I calculated these expressions. I calculated this expression. +22. I recorded the phenomenon and the variables. I found an expression in terms of x for y=f(x). This equalled x=f-1(y). In this, f-1(y) was the inverse of f. I used a neural network to do this. +23. I thought it was wondrous that the idea I knew worked. Next, I wrote whether the actor matched the role. I did this by describing the actor's appearance. Third, I wrote the list of characters that fit him. Finally, I cast the actors in the roles. +24. I could see Mars.
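Two of the claims above are easy to confirm mechanically: forall over an empty list passes vacuously (item 15), and the index law x^a * x^b = x^(a+b) (item 20). A Python sketch, standing in for the Prolog:

```python
# Item 15: forall over an empty list is vacuously true,
# like Prolog's forall/2 with no solutions to the condition.
print(all(x > 0 for x in []))  # True

# Item 20: index law x^a * x^b = x^(a+b), so x^2 * x^3 = x^5,
# checked at a sample point.
x = 7
print(x**2 * x**3 == x**5)  # True
```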
I created laws for safety and necessity. I made cars that stayed on the road. These flew close to the ground. Looking out the window was like looking out of a building. +25. I agreed with positivity. I did this by agreeing with safety. First, I found a safe place. Then, I visited it regularly. In this way, people accepted positivity. +26. I placed the files in the mirror folder. I periodically performed the activity. It was backing up the data. I made an alarm to remind myself to do it. I automatically did this. +27. I wrote educational software. Earlier, I noted that the different groups agreed on the issue. These groups represented the whole. After this, people monitored the state for any necessary changes. The system required some modifications. +28. I was not held back by objects, discovering new algorithms. I found all the solutions. The people were happy until and after seeing the answers. I programmed to explore ideas. I found replicators using neural networks. +29. I ran the algorithm writer in retro-mode, finding only intelligent algorithms. I simulated writing from the start or end of a file. I chose writing from the start. In other cases, I read, manipulated and rewrote the file. Finally, I devised List Prolog in BASIC. +30. The guests could play against each other on their phones. I did this by hosting the algorithm for guests to play. In one case, I returned the different variables on true, causing the algorithm to return true. I bug-checked findall in the interpreter. I supported nested findall. +31. I found combinations of certain unique items. I wrote the trope_finder.pl algorithm. I did this by finding all combinations of certain words. For example, I wrote combinations of writing, experience and thoughts. I detected repeats and found the unique items. +32. I wrote the algorithm, improving it until reaching the goal. I did this by speaking enough about it. I planned this with thought. I talked to indicate people and memory. 
Also, I worked for thoughts. + +33. The best items were more philosophically inspiring, critically held algorithms. I deleted any items with more than one instance. Also, I deleted some other items. In conclusion, I finished with the best and most unique items. I found their best uses and interactions. +34. The likelihood of one breasoning depended on the order of texts. An ironism was the same word with a different meaning and the same intermediate breasonings. The ironism's breasoning was closer to one meaning, so the other meaning was related to it. For example, the bow's breasoning was \"down\" for both \"bow down\" and \"bow-tie\", which the person had to fasten down to his shirt. Ironisms helped visualise the words when speaking. +35. A robot could bring enough crates of apples. I did this by adding x and 3*x. First, I converted x to 1*x when considering adding the terms. Second, I added 1+3. In this way, the result was 4, i.e. 4*x. +36. Also, I could solve u(1-u)=v in different ways. I started by expanding the brackets. First, I took the term 3*(b+d). Then, I expanded this to 3*b+3*d. Finally, I substituted values into this expression. +37. I could write -6 as 3*-2, without brackets. I expanded the brackets again. I started with 3*(b+(-2*d)). I expanded this to 3*b+3*(-2*d). I could further simplify this expression to 3*b-6*d. +38. I simplified 1*x to x. I expanded (A+B)*(C+D). I wrote it as A*C+A*D+B*C+B*D. For example, (3+(-1))*(c+d). I wrote it as 3*c+3*d+ -1*c+ -1*d, simplifying to 3*c+3*d-1*c-1*d. +39. I performed binary addition by counting up. I wrote all combinations of a list of items, up to a certain length of a new list. I found f([0,1],2). This equalled [[0,0],[0,1],[1,0],[1,1]]. These were the binary digits for [0,1,2,3]. +40. The algorithm found whether or not I needed a similar algorithm. I transformed the term into a computational, then philosophical meaning. I found the word. I found this corresponded to a list.
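The combination generator of item 39 — f([0,1],2) = [[0,0],[0,1],[1,0],[1,1]] — is a Cartesian product over the item list; a Python sketch using itertools (the function name `combos` is illustrative):

```python
from itertools import product

def combos(items, length):
    """All ordered selections with repetition of the given length,
    as lists -- the f([0,1],2) of item 39."""
    return [list(t) for t in product(items, repeat=length)]

print(combos([0, 1], 2))  # [[0, 0], [0, 1], [1, 0], [1, 1]]
```

As the source notes, the length-2 results over [0,1] are the binary digits of 0 to 3, so enumerating them counts up in binary.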
I found this could have the courage to go up to people, to make suggestions. +41. I made a list simpler. I did this by finding whether there were any duplicates in a list. First, I listed them. Then, I found bugs in them. I fixed them. +42. I compressed until there was no further compression possible. I tested whether all the items were the same. I found the first item. I tested that each item was the same as this item. I could compress them. +43. I was independent. I did this by sorting the two lists and putting them together. First, I subtracted one number from the other. Second, I graphed them. In this way, I found a relationship. +44. I argued for progress. I did this by examining the pedagogical setting. First, I stated my aim. Then, I said the reasons for it. Finally, I determined the logic, inventing rules. +45. I wrote the orienteering algorithm, possibly in 3D. I generated the points. I gave the starting point. I traversed the points. I drew the map. +46. I observed that a computer had created reality in another dimension. I did this by finding the number of rays hitting a surface. First, I counted them. I observed them. I observed reality. +47. I found that there had been several simulations in science. I read the measurement on the test tube. If it curled upwards, I read the bottom. If it was high in the middle, I read the top. I avoided taking half-measurements. +48. I aimed to build skills with encouragement. I started with having the institution. This institution had the best vocational and theoretical knowledge. In the institution, the vocational knowledge explored metacognition of learning skills. Theoretical knowledge contained algorithms related to compilers and induction. + +49. I supplemented the computer work with more content and logic. I listed the authors used, omitting the characters and unclassified names. I recorded the authors' works used as sources. I found that human choices were superior to computer choices. They converged. +50. 
I agreed with one side, not the other, sometimes one side more than another. I decided with the ideas on one side. I didn't agree if I disagreed with a view on that side. At least, I checked them. Many people were different. +51. The idea was correct when checked. I did this by threading the view through the eye of the needle. First, I found the initial text. I had opinions about it. Then, I grounded the work on this. +52. I deleted duplicates according to the heuristic. I counted the number of duplicate items, i.e.: A=[1,1,2,3,3],sort(A,B),findall(D,(member(D,B),findall(D,(member(D,A)),E),length(E,F),F>1),G). So, G = [1, 3]. G contained 1 and 3, which had frequency > 1 in A. I drew a histogram. +53. I checked the word order and capitalisation. I split the string on [\".\",\"!\",\"?\"]. Given C=\"a! b. c?\", A=\".!?\", string_codes(A,B), string_codes(C,D), split_string(D,B,B,E). I returned E = [\"a\", \" b\", \" c\", \"\"] (the trailing terminator yielded a final empty string, which could be filtered out). I grammar-checked the sentences. +54. I traced the sub-events like an algorithm. To do this, I predicted the event following the first event. First, I wrote down the observations. Second, I wrote down the first event. Then, I examined the objects leading to the second event. +55. I kept the different ones separately. I added the like items. Given A and B, I calculated their sum, A+B. I examined this philosophically and amalgamated their containers. I diversified the rest. +56. Some animals came along. As part of this, I found a new appended list from two old lists. In this way, the groups came together for the activities. I detected the thought. I wrote the essay with some reasons. +57. I considered the items by themselves and not there. The items were of the same type. The articles \"a\" and \"b\" were both strings. They could be concatenated or reversed. I examined all orders of them. +58. There were type checks inside a command. I did this by verifying that an item was a number. I ascertained that it was used and manipulated as a number.
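The two queries above — duplicate counting (item 52) and sentence splitting (item 53) — can be sketched in Python as a stand-in for the Prolog. Note that re.split, like split_string, yields a final empty field after a trailing terminator, so empties are filtered:

```python
import re
from collections import Counter

def duplicates(items):
    """Items whose frequency exceeds 1, sorted --
    the G = [1, 3] query of item 52."""
    return sorted(k for k, v in Counter(items).items() if v > 1)

def split_sentences(text):
    """Split on . ! ? and drop empty fields --
    a stand-in for the split_string call of item 53."""
    return [s for s in re.split(r"[.!?]", text) if s != ""]

print(duplicates([1, 1, 2, 3, 3]))  # [1, 3]
print(split_sentences("a! b. c?"))  # ['a', ' b', ' c']
```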
I used a type checker to avoid and possibly report type errors. The commands had optional type checks as well. +59. I cleaned the tube. I did this by choosing the lesser of two values each time until reaching a value. I did this by designing the circuit and taking measurements every so often. I eliminated the unused wire. I reused parts of the circuit. +60. I couldn't create a list with a length less than 0. I got the nth item of a list. I performed this by counting down from n as I processed the list. I could have used the code A=[1,2,3],B is 2-1,length(L,B),append(L,[C|_],A). This algorithm gave the 2nd item C = 2 as desired. +61. I did the right thing. I verified that A and B were commutative. In other words, A+B = B+A. I found the noumenon in theatre studies, holding it for someone. I found the noumenon in psychoanalysis, liking someone. In these, both items were positive. +62. I used associativity in algebra. I found that (A*B)*C and A*(B*C) were associative. For example, (1*2)*3 = 1*(2*3). I continued to think clearly. From this, I concluded that multiplication and brackets in this form behaved associatively. +63. I returned a list of n empty's. Next, I counted the length of the list. I did this using iteration. I added this procedure to the Prolog predicates for use in C. I could also find a list with a certain length, e.g. length(A,3) returned A=[empty, empty, empty]. +64. In some cases, I left the working there to generate new data that had properties, such as one week ago, not one day ago. I determined whether data had the same property to test for instead. I found the properties of the first pair. I detected whether the other pairs shared this property. I found out the way and about the destination. + +65. I verified that an item was a member of a list. I did this to check that it was a correct answer. I did this by writing a list of possible solutions. First, I wrote the limits. Then I tested them. +66. I eliminated unwanted traits.
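The nth-item trick of item 60 and the commutativity/associativity checks of items 61 and 62 can be sketched briefly (the `nth` name and the sample values are illustrative):

```python
def nth(n, lst):
    """1-based nth item, mirroring the Prolog
    length(L,B),append(L,[C|_],A) trick of item 60."""
    return lst[n - 1]

print(nth(2, [1, 2, 3]))  # 2

# Items 61 and 62: A+B = B+A and (A*B)*C = A*(B*C),
# verified over a small sample of values.
sample = [-2, 0, 1, 3]
ok = (all(a + b == b + a for a in sample for b in sample)
      and all((a * b) * c == a * (b * c)
              for a in sample for b in sample for c in sample))
print(ok)  # True
```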
I wrote the list of items in one list but not in another. I found the genetic subtraction. I noted that I could remove the original species from the list. I was fascinated by natural selection. +67. I could optionally eliminate the empty list from being a subset. I checked that a person had finished that part of the list of tasks. The father earned his keep and took responsibility for the children. I could do this by checking that B is a subset of A because B-A=[]. This could also be accomplished with the code A=[1,2,3,4],B=[2,3],forall(member(D,B),member(D,A)). +68. I found the functional element from history. I found a common item in the list of items from the time. I repeatedly found whether items in A were members of B. I wrote the following code to do this: A=[1,2,3,4],B=[2,3,5],findall(C,(member(C,A),member(C,B)),D). The intersection of A and B, D, is [2, 3]. +69. I compressed the disk. I did this by deleting each copy of the file from the disk. I processed the list, deleting all instances of the item. I could count the frequency and address of each item. Then I could delete specific items. I saved the essential items. +70. If A and B in A->B are true, then A->B is true. I confirmed the conjunction of events. If not((true,true)) then it returned false, otherwise true. I took action based on the conjunction. If I invested in an asset, it would return interest. +71. I accounted for the disk pro rata. I owned the value of the sum of the items. I checked that the sum indicated that the disk had fidelity (it wasn't corrupted). The sum of the file lengths and the unused space equalled the disk length. I found the total sum of the numbers. +72. The person checked the work. The person produced a more significant amount of work than the given amount. The person showed confidence. As a result, their facilitator showed confidence. They showed confidence in the confidence. +73. I recorded the performance. I aimed for maximum performance at the institution.
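Deleting all instances of an item from a list (item 69) can be sketched with SWI-Prolog's exclude/3; delete_copies/3 is an illustrative name:

```prolog
% Sketch: delete every copy of an item from a list, as when deleting file copies.
delete_copies(Item, List, Result) :-
    exclude(==(Item), List, Result).
% ?- delete_copies(b, [a, b, c, b], L).
% L = [a, c]
```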
I did this by sorting the items and taking the top value. First, I performed well on parts of the assessment. I topped the institution. +74. I sorted from the lowest to the highest number. I saw the result of the sort. Then, I sorted the items using the maximum predicate. Finally, I found the graph of the items. I detected the linear gradient. +75. I designed the binary watch with a bitmap digital time. The marker awarded the mark for either answer. If not((false,false)) then it returned true, otherwise false. I gave A=1,B=1,(A=1->true;B=1). In this, the expression is true if A=1 or B=1. +76. I compared the value with the maximum. I did this by finding the random xs and ys. First, I found the regular xs and random ys. After this, I found the arbitrary xs and regular ys. Finally, I found the standard xs and ys. +77. I wrote a practical algorithm. I did this by finding yes and no answers about a single point. These were part of different chapters. However, the topic was the same. The language was similar. I developed the writing. +78. The student learnt to paraphrase and cite. I checked the essay for plagiarism. The source sentence was almost entirely intact in the essay. Over 80% of the sentences were like this. The student did not cite them, so they failed. +79. I completed complex ideas about the work. I wrote the word processor in Prolog. I moved the cursor around and typed. I justified the text. I paginated the pages. +80. I kept a breasoning sheet for untranslatable words. I wrote the breasoning log. In it was the list of breasonings I had breasoned out. I recorded the names of the documents that contained the breasonings. I could easily replace words and translate documents.
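Sorting with a maximum predicate (item 74) might be sketched as a selection sort in SWI-Prolog; max_sort/2 is an illustrative name, not from the source:

```prolog
% Sketch: sort from lowest to highest by repeatedly extracting the maximum.
max_sort([], []).
max_sort(List, Sorted) :-
    max_list(List, Max),
    select(Max, List, Rest),
    max_sort(Rest, Sorted0),
    append(Sorted0, [Max], Sorted).
% ?- max_sort([3, 1, 2], S).   S = [1, 2, 3]
```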
+ +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/IMMORTALITY/Immortality 5.txt b/Lucian-Academy/Books/IMMORTALITY/Immortality 5.txt new file mode 100644 index 0000000000000000000000000000000000000000..c975382f4da4727dfbb3d67821cd777b6a571119 --- /dev/null +++ b/Lucian-Academy/Books/IMMORTALITY/Immortality 5.txt @@ -0,0 +1,98 @@ +["Green, L 2022, Immortality 5, Lucian Academy Press, Melbourne.","Green, L 2022",1,"Immortality 5 + +1. I checked the reason's goodness, back-translated it, and mind read omissions. I wrote the detail log. I recorded the details for each document. Then, I interpreted them as algorithms. Each algorithm had ten algorithms related to it. +2. I made the two-dimensional plan. I did this by accessing the main parts closer to the centre. I liked the beautiful movement. It was fun at different times of the day. The daylight streamed in, and the room looked lovely at night. +3. I also developed a circuit printer. I did this by planning the three-dimensional structure. Also, I played the 3D adventure game. In addition, I wrote a game to map proteins. Finally, I worked out what was following and developed businesses. +4. I passed the output to the programs with a single algorithm. I did this by creating the command line command to input a data file. The command outputted an output file. I made creating the application easier. I wrote a machine language generator, reused code in a single file, and made code into data. +5. I found interfaces between people, books and algorithms. I found predictable and more creative missing algorithms. For example, I found the cutting-edge conclusion, a predictor of the need for space travel. I teleported to the student instead. I gave them my A. + +6. I found the combination of changes that resulted in all tests working. I changed the singleton variable to an undefined variable. Also, I renamed a variable, so it wasn't a singleton variable. Next, I deleted the command with undefined variables. 
Finally, I made singleton arguments in the remaining commands undefined. +7. I used the cut command in tail recursion optimisation to speed up algorithms. I turned on a flag signalling the end of the search in the predicate, preventing further searches. I added an \"ooo\" append mode to catch the remaining commands. Finally, I made a string concat version with undefined characters, like lists. I could define an undefined string with a certain length before string concat. + +8. I created the multi-choice quiz, testing knowledge of algorithms. As part of this, I tried it on a simulated character. This character eliminated incorrect answers. She also looked up the formula in her notes. Finally, she found the correct answer. +9. The student understood the difference between the algorithms, including versions. As part of this, the student understood the algorithm. First, she found the name of the algorithm. Next, she found the output for a given input. Finally, she found the output of a predicate for a given input. + +10. There were pauses until the algorithm worked, where it needed to work by the end of the shift. As an example of this, I cleared the area. I did this by removing everything (I made it all white). Next, I matched the future work by one-filling the algorithms' data structures. I then mind-read students thinking of the contents of these practica. +11. I finished the algorithms' details by mind-ranking mind-read words by relevance, completing mind mapping previous words and reaching the end of plans for algorithms. I determined whether words were more relevant to this algorithm or another one. In addition, I found more relevant structures and names for the structures and assessed the business employees on the new algorithms. The students refined their algorithms. I noticed that the students started with the input and output. Then they mind-mapped the plan to write the algorithm. 
Also, they refined the spelling of names in the algorithm in a step-wise manner. +12. I found the best spec from elements from multiple mind read tests from sentence specs, making me suspect that an institution had run these to help with writing the pointed-to algorithm. I determined the type of the algorithm: a search, an inductive algorithm or an interpreter. There were commands to complete each type of algorithm with parameters automatically. Finally, I used an algorithm (program finder with types) to finish and convert data into an algorithm. Mind reading found new features, bug fixes and API changes. +13. I found the C code, eliminating unused variables. I determined if an idea was a new feature. Then, I mind-read lists of possible new features that could be applied to the algorithm, for example, different modes. These included constraints, induction and interpreters. Finally, I found the interpreter, simplified it and revolutionised the container. +14. I realised the person might not understand the machine-generated algorithm. I found many algorithms and recognised the need for them with an algorithm. These were tools for writing algorithms. Program finder with types had much data, converting specs into algorithms and optimising algorithms by neurally choosing parts of previous algorithms. A program finder with types was inserted as part of the algorithm if the algorithm needed to find other algorithms. In contrast, this novel method of using an algorithm to find different algorithms used in conjunction with neural networks and optimisation helped finish large workloads. +15. The features corresponding to bug fixes were listed. I compiled the list of types of bug fixes from my open source repositories. I did this by finding the metrics for why they improved the algorithm. Next, I found the variables they affected. Finally, I identified and made necessary bug fixes elsewhere. +16. I predicted the usefulness of different API changes.
I did this by comparing and contrasting API changes. In this, API changes were changes to the output. I found the necessary API change. I dismissed unnecessary API changes. +17. I converted Prolog to C, using an interpreter to run the Prolog code in C. I wrote the most elegant code by working on individual predicates with CAW with a fast computer and program finder with types. Prolog was the most straightforward language to use, while C was fast. I simplified the interpreter in C by reusing code. For example, I could use List Prolog to find and eliminate code with unused variables. I wrote a meta-inductor that reused past code and a seen-as version that was good. +18. I optionally removed the retry command. I inserted cut before the last command in the compilation. I checked whether I needed to insert the cut command before the previous command in run time. I kept some choice point data for retry. I removed some choice point data and freed the heap in memory. +19. I aimed for the perfect behaviour at each point and compared the behaviour with existing algorithms. I redid choice points, writing the predicate header in trace mode. I noticed the program deleted variables from parts of if-then. Cut could prevent these choice points from being revisited. I found that business was interested in providing perfect models for employees. +20. I eliminated extra mentions of variables and ensured the output was consistent on different platforms. The program detected whether I used each variable in a computation. The program tracked variables' use throughout the algorithm and eliminated variables if not used. There needed to be a single data structure that held the necessary data. Data structures could be merged or deleted. +21. I C-coded a function that I often used. I differentiated functions, reusing them where necessary. I used functions in other ways for better performance. I measured better performance by complexity. 
I differentiated functions by simplifying them and then finding viable alternatives using other functions. +22. The interpreter-generating algorithm improved the quality of the interpreter by removing unnecessary code. I converted grammars to write the interpreter. Ideally, anything could be in anything, with an exception. I reused parts of grammars. I wrote the interpreter and then parsed the code to form the grammars, followed by the algorithm prompting me to complete the items-within-items requirements. +23. Also, the interpreter was a neural network. I wrote the interpreter well by planning. First, I parsed the code to write the interpreter. Then, I annotated the result, identifying and using variables reused but not in the code. Finally, using neural networks, I converted the grammars into the interpreter, connecting the needed parts (associated variables). +24. The future computer quickly computed the computation and made it available to quantum computers. I converted the interpreter with unique features to a simple interpreter, using a program finder with types. The language valued intuitiveness and converted code written using this principle into high-performance code. For example, grammars could parse strings, but later, code contained string codes. The quantum computer could not only produce philosophy but beat classical computers' speed at interpreting code. +25. I found the point of error and corrected it, for example, a user error in complying with a type statement or grammar. I caught errors from predicates, interpreters and humans failing. I wrote an interpreter that reported failing predicates. It also reported its failures, i.e. unknown commands, etc. It also reported mistypes and suggestions about commands and files for users. + +26. In some cases, I didn't record the data structure. I determined which part of the data structure was necessary to keep. The data structure was structured hierarchically.
I noted the parts of it that the algorithm referred to later. I changed recording the data structure to keep only these parts. +27. I was critical of data that the algorithm never compared with other data and didn't output. I deleted (cut) unnecessary items from memory, ones that the interpreter wouldn't use again. I traced the choice points as the interpreter used them. I counted the times the interpreter used them. If the interpreter never used them again, I deleted them. +28. The interpreter colour-coded the trace lines. I retried a predicate from the first time I ran the predicate, using choice points from this point. After the user retried the predicate on exit or failure, the interpreter deleted all the choice points until after the predicate call. The interpreter reloaded the memory state at this point. Also, the user could expand the trace line to display fully. +29. Sometimes, I changed parts of the reused code using parameters. Earlier than this, I converged parts of different repositories. As part of this, I found the same code in other repositories. I moved it to a place where both repositories could access it. Then, I called it from these repositories. +30. The interpreter needed enough information in the remaining choice points to run the algorithm. Therefore, cuts were inserted before the last commands of a predicate when there were no choice points before them in the predicate. Otherwise, the interpreter checked if there were no choice points at this point when running and cut the choice points from memory. Deleting choice points freed up memory and allowed running more algorithms at the same time, also, cutting choice points improved performance. +31. I also expanded ideas in my writing. Earlier, I trained an optical character recognition algorithm to recognise my handwriting. I trained it to recognise letters, words and phrases as part of this. I allowed an algorithm to expand abbreviations and acronyms. 
After transcription, I could develop the algorithms specified in the writing. +32. I trained the software to recognise the flow of my writing, producing a single flow of text. I trained the software to recognise symbols such as caret marks and arrows and inserted lines in my writing. Caret marks inserted words or phrases from above or below a line. Arrows and notes pointed to sections to duplicate or modify. Quotation marks also signalled duplicating sections. +33. I wrote an algorithm that converted Prolog list processing into grammar. First, I simplified the strong types algorithm. Then, I wrote the grammar for the algorithm. This grammar converted strong types into token terms. Also, an algorithm could convert grammar into C, which was faster. +34. I used a type checker to identify strings or string codes. Before this, I used strings, not string codes, in grammar. First, I converted a text to parse into single characters. Then, I parsed it using a grammar with strings, making debugging easier. Finally, I could use a two-moded grammar, which could give the string code for a string. +35. I also wrote a \"for all\" predicate. Separately, I rigorously tested the different modes of append, etcetera. I combined variables with strings in various combinations in append. Also, I did this with the member predicate. I constructed an atom, e.g. \"io\", for the first argument input and the second one output, for each use of, e.g. member and used this in different repositories. +36. I could also abbreviate append to a. I tested the unusual append A [b, C] [D, e] predicate query. This returned A = [], C = e and D = b. I preferred commas to spaces to avoid typographical mistakes. However, there could be any number of spaces. +37. I also wrote a multi-modal version of append in C. I wrote the append predicate in C with equals4. I wrote equals4 in C. I rewrote append [a] [b] C as equals4 C [a]:[b]. Instead, I wrote the append predicate in C to do this. +38.
I cut off the maximum number of outputs possible. First, I wrote a non-ending result of append with C. Then, I wrote the query append(A, b, C). This query produced the output:
+A = [],
+C = b ;
+A = [_A],
+C = [_A|b] ;
+A = [_A, _B],
+C = [_A, _B|b] ; etcetera.
+39. Alternatively, I could use findall and a member predicate to number solutions. I wrote findnsols in C. I iterated through the possible solutions using a counter. If the solutions contained a pipe character (\"|\"), I encoded it using a symbol code. I could print, number and select solutions. +40. Alternatively, I devised a real-time interpreter in which a computer user could press \";\" to see each solution. I disregarded needing the cut command after findnsols. Without the cut command, the append and member predicates needed to be cut by the user to prevent infinite solutions in findnsols. Therefore, I automatically included the cut command at the end of findnsols to only provide n solutions. I wrote an algorithm that cut off findnsols after a particular time and analysed the answers so far to come to this solution using cut. +41. I based findnsols on findall, with a counter, also possible in C. Instead of explicitly numbering solutions, the algorithm could implicitly number solutions. The counter went up by one on a new solution. It finished when the counter had reached the maximum, or there were no more solutions. The algorithm didn't fail if there were fewer than N solutions. + +42. I aimed to cut choice points for better performance. I used the cut command to cut choice points but kept \"-1\" choice points without choice point data to return to previous predicates. Separately, I took care not to precede cut with fail, meaning the interpreter ignored the cut command. If cut preceded fail, the predicate would fail. Finally, I tried not deleting choice point data from other clause instances. +43. I questioned whether there was always a recursive command.
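SWI-Prolog already offers findnsols/4, which behaves like the counter-based design in items 39-41 (batches of at most N solutions, no failure when fewer remain):

```prolog
% Sketch: findnsols/4 returns solutions in batches of at most N;
% backtracking yields the next batch.
% ?- findnsols(3, X, member(X, [a, b, c, d, e]), L).
% L = [a, b, c] ;
% L = [d, e].
```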
I did this by using the cut command in tail recursion optimisation. As part of this, cut deleted choice point data from the current predicate. Tail recursion optimisation inserted cut before the last command. I questioned whether the recursive command was always last. +44. I manually inserted a cut command when I needed one solution. I inserted cut before a command possibly exiting with -2 status. I scanned the state machine for the -2 status. I considered inserting cut before the last command. I inserted a possible cut command if there were commands that had choice points before then in the clause or elsewhere in the predicate but didn't insert a possible cut command anyway because there could be no other choice points. +45. I deleted the obsolete parts of the code. I inserted a cut at compilation if there were no choice point-creating commands from the start of the predicate (including other clauses). Then, I checked that the other clauses had no choice points. Finally, I checked whether the current predicate had no possible choice points. I could use similar algorithms to check for obsolete code, variables, clauses and predicates. +46. I optionally listed cut in the trace. Alternatively, I inserted a cut command at runtime if there was no cut before the last command in the clause and there were no other choice points in the predicate (including other predicate IDs joined with the same predicate number). When I encountered the possible_cut command, and the algorithm met the conditions, I inserted a cut. Next, I listed the predicates to inspect the cut commands. Finally, I listed the state machine to inspect the cut commands. +47. I manually placed the cut by inspecting the data with the program finder with types. I used the cut command at runtime if there were no choice points in any predicate clauses, i.e. none in the heap. Next, I scanned records of choice points for the predicate in the heap, and if there were none, I inserted cut. 
I considered manually inserting the cut command if the specification suggested that a single solution was required. Also, I worked out where to place the cut. +48. Choice point commands included commands or predicate calls with particular arguments, meaning they generated choice points. During compilation, I found the choice points in the predicate. I followed all possible paths from the start to the end. If there was a choice point, I labelled the clause \"without cut\". If there were choice point commands, I couldn't insert cut. +49. I returned a warning about cuts and choice points in redundant code. I could ignore redundant cuts or choice points in an if-then clause that was always true or false. I could ignore cuts and choice points in obsolete clauses and predicates. In the case of a variable that is always true and used as an argument causing code to become redundant, I could ignore this code. But instead, I returned a warning about redundant code. +50. I ignored unregistered, unused predicates. The interpreter recognised multiple predicates as creating a choice point. I ignored the other clauses if there was a cut or fail combination in that order. If there were numerous unnegated clauses, they created choice points that could affect inserting a cut command. Also, they would be manually cut. +51. I included an option to turn off tail recursion optimisation or the insertion of cut when there were no choice points in a predicate. I inserted the command in compilation to possibly insert cut if there were no choice points at runtime. I inserted the cut command in this case if there was no cut there already. I examined whether inserting cut needed brackets. At runtime, I changed the possible_cut command to cut if necessary. +52. I hid inserted cut commands from appearing in the program or trace for simplicity, rather than the following. I replaced the last command with \"(!, last_command)\", i.e. wrapped the two commands in brackets (a function).
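The \"(!, last_command)\" rewriting in item 52 can be sketched as a source transformation on clause bodies; add_final_cut/2 is an illustrative name, not from the source:

```prolog
% Sketch: insert a cut immediately before the last command of a clause body.
% Conjunctions are right-associated terms ','(A, B), so we walk to the end.
add_final_cut((A, B), (A, B1)) :-
    !,
    add_final_cut(B, B1).
add_final_cut(Last, (!, Last)).
% ?- add_final_cut((a, b, c), Body).
% Body wraps the last goal as (!, c), i.e. a, b, !, c.
```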
I inserted the brackets function, adjusting the line numbers of the state machine. I didn't insert the cut command; I just called the cut predicate when I had labelled the predicate at compilation or the conditions were met before I ran the last command. +53. I didn't need brackets or to manually insert cut because I could check which command might exit on -2 and run the cut predicate before running the command. Even though I negated its need, as an exercise, I detected whether inserting the cut command needed brackets. I didn't insert the brackets if there weren't brackets already. Also, I didn't insert brackets if the last command was part of a logical structure that didn't need brackets but did if necessary. For example, if the last command was a single command in a consequent of an if-then statement, it required brackets. +54. The compiler didn't have a retry command. On another note, I not only set the choice point data in a choice point to [], I deleted the choice point. Deleting the entire choice point in the heap freed memory. I freed memory to allow better performance. As a result, an algorithm could run faster and finish running in time. +55. I deleted unused clauses and predicates, measured by a counter. As part of this, I deleted the unnecessary variable. This variable was a singleton. I deleted the commands that only had singletons and constants as a result. I also changed mainly unused (unprocessed) variables to be undefined. +56. I reused optimised algorithms. I didn't need to compare the variable, so I deleted it. I checked that the comparison of the variable made no difference to the program's output with particular specifications. I could also determine whether comparisons were necessary from the truth of expressions and constant variables without specifications. I replaced verbose algorithms with optimised versions using specifications. +57. I challenged the outputted variable because it was hardly ever outputted. 
I checked that I didn't need to output the variable. The variable was operated on but not outputted to a file, screen or another program, so I deleted it. There was no point in the code that produced it, so I deleted it. I also deprecated little-used or obsolete commands. + +58. I could load a predicate's choice point trail from the main trail. I added a separate choice point trail for the current predicate. Then, I added choice points to it. I also deleted choice points from it. Then, I could move it into the main trail. +59. I speculated that SLG tabling was useful for Prolog in C. Before I loaded the trail, I stored the current trail if it wasn't the first one. The interpreter also saved the last trail. The current trail increased performance by saving time by manipulating choice points. I created variables to store and move the current trail. + +60. I deleted duplicate frames in the trail; I used tabling. I grouped the choice point trail by predicate ID. I left the predicate. I moved the choice point trail from the local trail to the main list. I could access the trail set by predicate ID in the main list. +61. The user could input properties of the SSI Web Service using the app, for example, to make an adventure game or content management system, in which starting at any page was possible. I deleted old global variables. After the interpreter finished with the predicate, it deleted variables from the frame. I didn't need the variables after the predicate again, except for the tabled result. I could find tabled data from reused parts of predicates. +62. I kept one copy of the same results of the different predicates. I deleted old predicates in the trail. I did this by detecting when the predicate had finished. I deleted its data in the trail. I refound the data if the predicate was rerun. +63. I also kept unfinished choice points in other instances of the predicate. I deleted old predicates with the same number in the trail. 
When finishing a predicate, I deleted its data, apart from data for returning, when starting a new predicate with the same number but a different ID (another instance of the same predicate). I kept data for returning. I also kept return data for the first instance of the predicate in the unbroken sequence. +64. I redid a predicate to home in on a bug in the algorithm or interpreter. I redid predicates. I redid the predicate on exit or fail. The interpreter returned to the first instance of the predicate in an unbroken sequence. When I redid a predicate, I deleted frame and global data back to the start of the sequence. +65. I cleared the table and compared performance with and without it. I kept a copy of a predicate's result for the next time it was required. I used the results of a previous computation without repeating the computation to save time. If the code was the same, the result was the same. I reused code in parts of predicates and predicates and used the same results. +66. Over time, I deprecated and reused old commands. The interpreter deleted tables of past results when it reloaded the algorithm. I also kept results tables to avoid wasting time when rerunning an algorithm. Finally, I worked out that the interpreter could do the computations instantaneously. I alerted myself when a program had reentered a previous version. +67. I reran the code to check whether the results were identical. I noted that the interpreter could only give the tabled results again, e.g., given the program a(\"a\"), it could table the queries a(_) or a(\"a\"), but not a(\"b\"). If there were programs a(\"a\") and a(\"b\"), then the interpreter could give only a query returning one result. I could use the old results sometimes when I introduced unrelated code. I checked whether the new code matched the predicates and gave new results. +68. I detected a human error in the tabled results.
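SWI-Prolog's tabling reuses a predicate's previous results much as items 65-67 describe; a standard illustration:

```prolog
% Sketch: table fib/2 so repeated subcomputations reuse stored results
% instead of being recomputed.
:- table fib/2.

fib(0, 0).
fib(1, 1).
fib(N, F) :-
    N > 1,
    N1 is N - 1,
    N2 is N - 2,
    fib(N1, F1),
    fib(N2, F2),
    F is F1 + F2.
% ?- fib(30, F).   F = 832040, each subresult computed once, then tabled
```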
The results table was kept for intact algorithm parts, except if there were outside APIs, asserted global variables, random numbers or unpredictable user input given. I alerted myself when running unpredictable code. I compiled a report about the use of tabled results. I was cautious of using old results if I introduced new code with the same predicate name and arity (number of arguments). +69. I found new uses for tabled results. I did this by following the calls and graphing them. For example, I counted the calls made to and by each predicate. I preferred algorithms with more straightforward call graphs. I also tracked tabling and the rate of tables used. +70. I reduced algorithms to single predicates, sometimes with calls to other predicates. I modularised code, simplifying data and removing anything unnecessary. I found the recurring code and referred to a copy of it multiple times. I detected whether different algorithms shared the same result and deleted one after inspecting them. Finally, I used the simpler algorithm. +71. I used type statements to check what was input and what was output. I only used completely defined queries in tabling. I did this by checking that I completely defined all the input arguments. Also, I checked that I completely defined all the output arguments. Finally, I checked that all of the arguments in the query matched the result. +72. Some people were immortal. Immortal people were wise. Therefore, some people were wise. I imagined that a future civilisation developed a technology to help people live past their natural lifespans. These people were wise, and it was silly for people not to learn to become immortal. +73. I wrote libraries to use in algorithms. I did this by finding new features of SSI and researching them. Separately, I tested each predicate by itself. I also used features from my algorithms, for example, SSI. Finally, I developed philosophies with algorithms with features. +74. I described the feature with a video. 
I made the video about the SSI feature. Then, I loaded the library into the algorithm. The library made more predicates available. Compiling used libraries with the algorithm was done by converting libraries into C and adding their code to the application. +75. The virtual machine could run on any platform. I simulated memory in the virtual machine, only using what was needed and processing threads simultaneously. I allocated memory for variables. It was like C. I tracked memory locations used and reused pigeon holes. +76. I installed the algorithm using the website, using drag and drop. I noticed that the application had no unused variables. I noticed that the application had no unused functions. I noticed that it reused, not unnecessarily duplicated code. It had no errors, and integrating code was easy. +77. Lucian Prolog was SSI, which the user converted to C to run. I designed the user interface for the server. I could select multiple files using checkboxes and open, run, delete, copy or move/rename. I could also run commands. For example, I could run batch commands using Prolog. +78. The app maker included installation information or instructions for SSI Web Service with the app. I compiled the SSI Web Service app. I wrote the app in Prolog. Then, I ran it as an online API or distributed it to run with SSI Web Service. Subscribers could use it for free or paid and access a JSON file from a web page after sending parameters for the app to compute. +79. I wrote predicate-long functions and applied the optimisations to the operating system. I dragged and dropped in the graphical user interface. I dragged the icon from one point to another using Java or Javascript. I could choose an angle on the office. I could use glasses and drag a file from one device to another. +80. Lists needed to be converted to strings when converting to C (using Lucian or another Prolog), eliminating unnecessary white space. The user converted Prolog to C. 
He used a grammar to convert the Prolog code to C code. He used commands to simulate Prolog commands, such as univ, (a(B, C) =.. [a, B, C]). He represented terms as strings and converted them to array lists, where terms contained lists, not round brackets (until later). + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/IMMORTALITY/Immortality 6.txt b/Lucian-Academy/Books/IMMORTALITY/Immortality 6.txt new file mode 100644 index 0000000000000000000000000000000000000000..f030e032ba7f0e324c91e8f97723274a532efeb0 --- /dev/null +++ b/Lucian-Academy/Books/IMMORTALITY/Immortality 6.txt @@ -0,0 +1,129 @@ +["Green, L 2022, Immortality 6, Lucian Academy Press, Melbourne.","Green, L 2022",1,"Immortality 6 + +1. I presented the progress bar for converting List Prolog to C. I could run batch file manipulation commands using Prolog. First, I created many files using an algorithm. Then, I inputted these files into another algorithm to produce more files. I scheduled this operation regularly. +2. I debugged the C code in a Prolog form (using a C to Prolog converter). I converted the Prolog list to an array of items in C without the string to list algorithm. I converted A=[B,C] to A[1]=B and A[2]=C. If B or C were lists, I pointed to new lines of the array, i.e. A[y][x]. B or C could be numbers representing letters, lists, etc. +3. I tested whether the predicate had any input. I ran the Prolog algorithm with Lucian Prolog. I ran converters. I ran interpreters. I could run the algorithm online. I included command line arguments and other flags for Prolog, such as initialisation. +4. I attempted tabling as much as possible, reporting whether the interpreter had used tabling when timing the algorithm. I registered that the false speed increase was due to tabling, not the program being efficient. I reacted to the program finishing instantly. 
I realised that Prolog had recorded the results from the last time I had run the program; because the program had not changed since then, Prolog returned the stored results instead of recomputing them. I made the necessary changes to speed it up. +5. After eliminating unused variables, I used functionalism and reused parts of state machines, converting them to predicates. I wrote a Prolog algorithm to C converter that converted findall to loops and omitted unused variables and code. I could also make additional converters between List Prolog and a state machine. I could eliminate unused variables in the state machine. I analysed the algorithm like a maze, eliminating dead ends. +6. I eliminated non-determinism and choice points in my program and wrote it in C. I used native C in Prolog. I wrote the string processing predicate in C for speed. I could also write the neural network algorithm in C or Python. I started to write Prolog predicates such as sort in C. +7. I eliminated extra brackets and superfluous A=B and wrote a neural network to simplify code. Next, I converted C to a Prolog State Machine to remove unused variables, etc. Next, I wrote a version of my algorithm that didn't produce output while I ran it to cut down on code. Next, I replaced code with tabled results only if the code was simple and could cope with all possible data. Finally, I converted from the Prolog State Machine to List Prolog, then C, for better performance. +8. I wrote simplified machine language algorithms using CAW or neural networks. I accomplished this by writing machine language code for the processor. First, I used the machine's instructions to write the code. Then, I converted C code to machine language using an interpreter. I also converted machine language to C if necessary. +9. I also used Program Finder with types to design circuits and machine language. Lucian Prolog loaded and ran Prolog algorithms in interpreter mode. 
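As an aside, the list-to-array idea from item 2 above (A=[B,C] becoming A[1]=B and A[2]=C, with nested lists pointing to new rows of the array) might be sketched in C as below. The Cell struct and the sum_cells walker are hypothetical illustrations, not the source's own representation.

```c
#include <stdbool.h>
#include <stddef.h>

/* Hypothetical sketch: a Prolog list cell in C is either a number or a
   pointer to another row of cells (a nested list), so A=[B,[C,D]] becomes
   a row whose second cell points to a new row, i.e. A[y][x]. */
typedef struct Cell {
    bool is_list;            /* true: this cell points to a sub-list row */
    int number;              /* valid when is_list is false */
    const struct Cell *sub;  /* valid when is_list is true */
    size_t sub_len;          /* length of the sub-list row */
} Cell;

/* Sum every number in a (possibly nested) list by walking sub-rows. */
int sum_cells(const Cell *row, size_t len) {
    int total = 0;
    for (size_t i = 0; i < len; i++) {
        if (row[i].is_list)
            total += sum_cells(row[i].sub, row[i].sub_len);  /* nested row */
        else
            total += row[i].number;
    }
    return total;
}
```

For example, the nested list [1,[2,3]] becomes one row of two cells, the second pointing at a row of two number cells.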
I could load algorithms, trace them and save traces to disk (using protocol). Using the interpreter, I ran a version of CAW with particular parts of predicates and possible choices of variables similar to other predicates. CAW used a program finder with types to identify likely functions required. +10. The application had a version number, which I recorded as compatible if it was part of another program's package. Lucian Prolog compiled itself with algorithms. The compile command saved the interpreter converted to C with the algorithm in Prolog. Compiled algorithms could run by themselves. They could be distributed as binaries and run straight away. +11. I checked that the predicate had the right trace in the test. I checked the label of the choice point accessed each time and referred to predicates by ID. Later, I redrafted the software, eliminating complexity and bugs. I also used a single number to number choice points. In addition, I labelled variables in memory. + +12. I also wrote a shell. I wrote the writeln command in building language, a way of programming the processor. I wrote writeln by writing the string to output. Then, I outputted it to the terminal. I also wrote the terminal app. +13. I saved the character code. Then, I wrote the get_character command in building language. I did this by getting the character from user input, a file or a stream. Then, I waited for and got the character. Finally, I recorded the character code. +14. I returned a result from the function and scanned for unused variables. The function returned a single value. The other variables were local. I didn't need to use a command to maintain the immutability of variables. I didn't need to use a command to save variable values locally and delete them afterwards. +15. Transformations and operations involving all instantiated values could return a truth value based on their success. I wrote the end-of-function symbol. I represented variables and functions similarly. 
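The convention just described — a function returns one value, keeps its other variables local, and signals success or failure with a truth value, like a Prolog command — might be sketched in C as follows; safe_divide and divide_or are assumed example names, not from the source.

```c
#include <stdbool.h>

/* Hypothetical sketch: like a Prolog command, the function returns a truth
   value for success and passes its single result back via an out-parameter;
   everything else stays local to the function. */
bool safe_divide(int numerator, int denominator, int *result) {
    if (denominator == 0)
        return false;                   /* the "command" fails */
    *result = numerator / denominator;  /* the single returned value */
    return true;                        /* the "command" succeeds */
}

/* Usage: branch on the truth value, falling back when the call fails. */
int divide_or(int numerator, int denominator, int fallback) {
    int r;  /* local variable, deleted when the function returns */
    return safe_divide(numerator, denominator, &r) ? r : fallback;
}
```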
The interpreter had no choice points; it just dealt with them. I fed the logic of statements such as equals4 and predicates to a result via nested if-thens. +16. I warned about cyclical include statements and missing files. I wrote the statement to include a library. Then, I added the file's functions to the algorithm. I tested for instances of functions with the same name and arity that were not grouped together. I warned about duplicate copies of functions. +17. I produced the data with the algorithm. I did this by writing the main function's name. First, I ran the algorithm. I did this by running the main function. I could also pass argument values to the function. +18. I generally kept a separate documentation file about the algorithm. I wrote comments to help explain the algorithm. I enclosed these in /* and */. Comments may also follow % in Prolog or // in C. I kept a copy of the file with comments in case a converter removed them. +19. I also checked that converting the original from List Prolog to a state machine and back gave the same result. I did this by back-translating conversions to check them. I also checked the original with Prolog and C converters. In addition, I checked the original with List Prolog and Prolog converters. I did this with C and building language converters. +20. I attempted to find two other values given one value in a three-way conversion. I wrote an inverse algorithm and back-translated it to check it. I found an algorithm to solve 2x=2 for x. I found an algorithm to substitute x=1 into the original equation to check it. Also, I could do a three-way conversion between interpreter (output given program and input), induction (program given input and output) and constraints (input given program and output). +21. I returned an error when back-translation returned different results. I stored grammar elements such as \"{\" and \"|\" in a separate file. I could customise these for various programming languages. 
I wrote programming languages that I could convert back to the same language from which they were converted. I wrote a grammar that could interpret any white space. +22. I checked that variables were defined in C using analysis in Prolog. I converted the C algorithm from the Prolog algorithm to a Prolog algorithm. I deleted the nested findall statements unless they were necessary. I converted for loops, etc., into findall statements. I converted parts into equals4, member and string to list commands, some in the predicate head. +23. I wrote an algorithm that created findall from a for loop when possible. I used the for command to represent findall, recursion, maplist, forall, etc. These commands had to return to their original form when back-converted. I found that for loops were faster than recursion. I found and warned about infinite loops. +24. The for loop could simulate choice points with loops ending at the end of the large predicate if it were possible to predict. I initialised the for loop with i=1. Then, I created a list of numbers representing output lengths or array indices with findall. I used recursion if I couldn't use the index to process the data using findall. For example, if the predicate processed previous results or used different predicates or base cases, it should be a recursive predicate. +25. In recursion, I could use the index to iterate through the list items. I used the for loop with the test i<=10. I created a list of numbers from 1 to 10. I could use the number from this list to create lists of specific lengths. Alternatively, I could use it to iterate through the same indices from arrays. +26. I stored data structures, not sentences. I could write printf(\"The integer is %d\", i) in C as I=5,foldr(string_concat,[\"The integer is \",I],A) in Prolog. In this, foldr was foldl with reversed input. I could write the output to the screen using writeln1(I) in Prolog. 
In this, writeln1 was writeln1(Text1) :- term_to_atom(Text1,Text2),writeln(Text2) which preserved the double quotes. +27. I could recognise and format numbers as integers, with decimal places or n decimal places and other numerical calculations. I could present a number with two decimal places: A=123.4567, B is A*100, round(B, C), D is C/100. I could check that a number was close to 1 as follows: E=0.999,0.999999>E. I could use this in the following: atan(1.55740772, A). A = 0.9999999986411103. + + +28. I returned a truth value for whether the operation had returned successfully. I used foldr to operate on data recursively. I identified the array or data structure. I recognised the process on its current item. I recognised the initial value. +29. If the data was a tree stored as an array, it needed a command to process the hierarchical structure at an item number. I identified multiple-level foldr. I used a predicate instead. I noted that the predicate was called twice from within it. I determined that the operation operated on a particular item number. +30. I wrote a search predicate for co-ordinate mazes or trees. I used foldr with multiple sets of input. Foldr recursively transformed with one set of input. It could transform with various input groups and search through data. I replaced this with a predicate. +31. I outputted a list of lists in C. I wrote a multiple-level foldr with multiple input sets, which I replaced with a predicate. It processed multiple lists of lists. I stopped adding dimensions when necessary. I performed a search with iterative deepening. +32. I could use foldr as a substitute for maplist. I did this by using a formula finder with maplist. This formula finder identified the pattern of input and output at each point. Then, I entered this pattern into maplist. Or, I could substitute another formula into maplist. +33. Commands either passed or failed and sometimes returned results. For example, I used forall to test that all list items had a property. 
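A minimal C sketch of this forall-style test, assuming a simple integer property (positivity) as a stand-in for the property being checked:

```c
#include <stdbool.h>
#include <stddef.h>

/* Assumed stand-in property for illustration: the item is positive. */
static bool property(int x) { return x > 0; }

/* Like forall(member(X, Items), property(X)): true only if every item has
   the property; fails at the first counterexample. */
bool forall_property(const int *items, size_t n) {
    for (size_t i = 0; i < n; i++) {
        if (!property(items[i]))
            return false;  /* forall fails as soon as one item fails */
    }
    return true;  /* no counterexample found; the empty list passes */
}
```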
I started with the first item of the array. Then, I tested that each item in the array had the property. Otherwise, forall failed. +34. I deleted the garbage (which the generated C code did automatically). I used find n solutions to return a command's first n or fewer solutions. I made a variant that only passed with n solutions. The current array always started with 1. I took this array from another array. +35. I found more algorithm connections more quickly. I used the \"text to algorithm\" algorithm to store algorithms that connected vital words. There was a direction from one term to another. Hierarchies of other terms were involved. I collected these and turned them into arguments in philosophy. +36. I wrote the rule with the types in mind. I did this by writing the predicate with the program finder with types. A separate algorithm from the program finder with types, the program finder wrote the function, perhaps with two lists, in a recursive predicate. Once I wrote the rule, I could try it on the next item or find a new rule. I could identify that the pattern of the data item was the same as a previous item to save time with the program finder with types. +37. I could write base cases following the algorithm's aim. I used a program finder with types for finding base cases. Base cases are a form of clause. The recursive clauses already indicated the necessary base case. In some cases, the algorithm revealed more base cases. +38. I could also use a duplicate predicate deletion method in Windows. Similarly to Selective Linear Definite (SLD) clause resolution, I deleted identical predicates. This method followed other optimisation techniques (such as deleting unnecessary variables and code, including variables that are always true or false). I also used SLD to delete predicates that called a single predicate. I called reused libraries. +39. Prolog encouraged writing ideas from scratch or using other libraries. 
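The "find n solutions" command mentioned in item 34 above might be rendered in C roughly as follows; the evenness test and the name find_n_solutions are assumed for illustration, not the source's API.

```c
#include <stddef.h>

/* Assumed stand-in condition for illustration: the item is even. */
static int is_even(int x) { return x % 2 == 0; }

/* Hypothetical sketch of "find n solutions": gather at most max_n items
   that satisfy the test, stopping early, like the Prolog command that
   returns a command's first n or fewer solutions. */
size_t find_n_solutions(const int *items, size_t n, size_t max_n, int *out) {
    size_t found = 0;
    for (size_t i = 0; i < n && found < max_n; i++) {
        if (is_even(items[i]))
            out[found++] = items[i];  /* record this solution */
    }
    return found;  /* may be fewer than max_n, like the Prolog variant */
}
```

A stricter variant that "only passed with n solutions" would simply compare the returned count with max_n.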
I kept only recursive predicates as an experiment, moving the others to larger predicates. This method removed smaller, time-consuming predicates. However, it increased the size of the local stack of the larger predicates. So, I chose many small (not too small) predicates. +40. I wrote a search engine to search for queries that described Prolog queries. First, I wrote HTML documentation linking to Prolog tests. Next, I edited and ran the tests online. Then, I backed up and edited the code to work with the tests. Finally, I downloaded the program and searched for algorithms that satisfied a test. +41. I tracked modifications of predicates and algorithms. I collected predicates with equivalent specifications. Then, I determined whether I needed to write an algorithm. Next, I found the predicates different algorithms used. Finally, I found the patterns of the use of these predicates. +42. I tested the expression b=c. I gave variable names as variable definitions. If the program gave no value, it outputted the variable name as the variable definition. For example, a=(b=c). +43. I found the variable value from equals4. When the program found a variable value, the algorithm substituted it into the variable definition. In a=b, a=1, I found b=1. The predicate returned this result. The algorithm analysed this result. + +44. I could write the idea as a predicate, but findall was a more straightforward solution. I wrote findall with variables that it can save between passes. Then, I wrote findall to find all the combinations of items. I counted them as I went. I stopped at a certain number. +45. I could convert the call arguments in C. Findall with recursion is like call. First, I did this by writing findall with recursion as a predicate. Second, I added a base case. Third, I added a query. +46. Recursive findall could have member calls that carry copies of variables from the last iteration. 
In addition, findall could output multiple lists. Instead of one list of results, findall could output many lists. Findall could or could not add to a list. To allow findall to add to a list meant having different findall clauses. +47. To avoid replacing variables, the algorithm renamed them, e.g. _001. The call command can have types, including simple types, modes, and predicates passed as variables. First, I constructed the lists as arguments. Then, I wrote a predicate that changed [] to {}. Finally, I wrote variables without values substituting for them. +48. I returned results when findall failed and defined recursive variables in different findall clauses. In findall with recursion, there are variables for the nth iteration and other variables. The findall choice point included the current iteration number. The choice point also included other outputted variables. I initialised inputs forming these before running findall. +49. Lists formed in findall may be hierarchies created with equals4 commands, and there may be shorthand recursive calls without predicate names. Findall with recursion has no predicate name. It can call itself using a findall name. Variables in recursive findall are immutable, meaning they can be defined once. They are saved between passes, can be arguments in further predicate or recursive calls, and need phi definitions so that they are determined at the start, even when the interpreter invokes a choice point in the middle. +50. If variables from before the recursive findall predicate are needed, the programmer will duplicate them if they are changed. Predicates or findall are in the form: Outputs of findall, Name of this output list, Inputs, Outputs of recursive findall, Body. For example, A=[1,2],B=[3,4],G=1,findall([G,C,D],J,G,G1,(member(C,A),member(D,B),G1 is G+1)). J = [[1, 1, 3], [2, 1, 4], [3, 2, 3], [4, 2, 4]]. In this, G takes the value from the previous pass of findall rather than the initial value. +51. 
Cut affected the predicate and below, deleting more and more choice points. I programmed cut with n solutions. I could finish the recursive findall predicate after n solutions. I could also finish it after n member, etc., results. Or I could stop after a particular number of these. +52. I could delete choice points from lower predicates. I selected from the choice points in findall. Then, I deleted choice points from a particular member predicate. Or, I deleted choice points from append calls. Alternatively, I deleted specific choice points from a predicate. +53. Instances of recurring variables in recursive findall before the recursive findall predicate are ignored on the second pass, like new recurring variables. Findall with recursion is possibly flawed because sometimes I only wanted one solution from a buried choice point. However, I could arrange to have the single choice point returned using cut. In another case, I could flatten multidimensional data from recursion. I could then sort this data, which eliminated duplicates. +54. If the recursive findall returns [] as one output or fails, the algorithm may salvage variables from it or the head. Findall with recursion is possibly flawed because the programmer may want to know if a predicate fails. However, the algorithm may return an error when a pass fails. In this case, a phi node is necessary to save recurring variables. Additional code to catch undefined variables may be required. +55. Outputting a list rather than a list of lists is possible with recursive findall, which is not possible with findall, preventing the need to flatten output afterwards. Findall with recursion has extra outputs like findall. The programmer may input these outputs from before the recursive findall predicate. Or, the programmer may input them from just before the second pass onwards. In addition, recurring outputs that are different from the outputs are outputted, reporting the number of passes, for example. +56. 
I had to be careful that I defined certain variables in the recursive findall predicate before the member choice point in a phi node, triggered before the member call. I can convert recursion with findall to C (a.k.a. \"pre-phi C\"). As an aside, I could forego the classic findall output and additional output variable in recursive findall because the recursive findall input and output could replace this with append. The interpreter finds the input to recursive findall from before the recursive findall predicate before the first pass and from outputs of the recursive findall predicate on the second pass onwards. Also, member, etc., choice points in the recursive findall predicate can load inputs from the last pass's outputs. +57. I stated that [a]:b,c was an error, but that [a]:[b]:c and [a]:[b,d] weren't. I can append using \":\". Taking a=1,b=[2,3], [a|b] and [a]:b are equivalent. Using these values of a and b, interpreting a:b would produce an error. Note: [a]:b = [1,2,3]. +58. I could also write string_length(L,1), string_concat(L,_,\"abc\") instead of the reverse order of these commands. I can find the first n items and the rest (n+1 to the list length) items. The first 2 items of [1,2,3,4,5] are [1,2]. The rest (2+1=3rd to the 5th item) are [3,4,5]. I could use a similar process on lists of characters. +59. For customisability, phi nodes may be at the middle or end of the recursive findall predicate. I can reverse lists in findall. In recursive findall, I could write append with the first two arguments swapped to reverse a list built with recurring variables (passed between passes). List processing after member could process multiple list items at a time. Member after list processing could find combinations of items, while the program could use phi nodes when choice points skip over list processing. + +60. When the loop finished, I outputted the appended result. I may not need findall list outputs. 
Instead, I treated recursive findall without member calls but with list processing as a while loop. One clause (in fact, an if-then clause) had a base case with the cut command. Another clause had the list processing code. +61. I didn't need to return through the levels with recursive findall, which had more possible options than findall. Separately, an alternative name for recursive findall is \"recursor\". A name for the interpreter algorithm is \"interpret\". A name for the induction algorithm is \"induct\". A name for the constraints solver is \"constrain\". +62. Recursive findall variables are immutable, so the programmer couldn't redefine them. A member call followed by another member call in recursive findall creates new logical conditions. I investigated the C version of recursive findall. Classical findall converted member calls to loops, with nested if-then clauses to catch failure. Recursive findall added recurring variables, which needed to be deleted in second and onward copies of the clause if they occurred before the recursive findall clause. +63. I saved time running reverse with recursive findall. Logical conditions of recursive findall are that it needs a base case and keeps findall-style output, though the last condition does not hold in the following case. For example: + +A=[1,2,3],L=[],findall(classical_output=[],recursive_input=[A,L], + +body= ((A=[],L=O,!)->true; + (A=[H|T],append([H],L,O)) + )), +classical_output_variable=_,recursive_output=[T,O]), which produces the output O=[3, 2, 1]. + +Alternatively, I could write this: + +A=[1,2,3],L=[],findall(classical_output=[],recursive_input=[A,L], + +body= ((A=[],L=O,!)->true; + (A=[H|T],O=[H|L]) + )), +classical_output_variable=_,recursive_output=[T,O]). + +64. I could write recursive predicates as if they were loops. A condition of recursive findall is a base case for each instance of the member call. 
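The reversal example above, treated as a loop rather than a recursive predicate, might look like this in C; the recurring variables A and L become plain loop state, and the prepend-by-shifting step is an assumed, deliberately literal rendering of O=[H|L].

```c
#include <stdbool.h>
#include <stddef.h>

/* Hypothetical sketch of the recursive findall above as a while loop:
   A (the remaining input) and L (the accumulator) become loop state;
   the base case A=[] becomes the loop exit. */
void reverse_list(const int *a, size_t n, int *out) {
    size_t remaining = n;  /* plays the role of A */
    size_t len = 0;        /* current length of the accumulator L */
    while (remaining > 0) {        /* second clause: A=[H|T] */
        int h = a[n - remaining];  /* take the head H */
        for (size_t i = len; i > 0; i--)
            out[i] = out[i - 1];   /* O=[H|L]: shift, then prepend */
        out[0] = h;
        len++;
        remaining--;               /* continue on the tail T */
    }                              /* base case A=[]: out holds L */
}

/* Usage check: reversing [1,2,3] gives [3,2,1], matching O above. */
bool check_reverse(void) {
    int out[3];
    reverse_list((const int[]){1, 2, 3}, 3, out);
    return out[0] == 3 && out[1] == 2 && out[2] == 1;
}
```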
+ +On another note, I could write: + +A=[1,2,3],L=[],findall([[ri=[[],L], + +b=[],ro=[[],L]], + + [ri=[[H|T],L], + +b=[],ro=[T,[H|L]]]]). + +Or, in the first clause, I could write ro=ri. + +65. A condition of recursive findall is that functionalism can be used in recursive findall to save time. Instead, I could use \"afunctionalism\" (sic), where the bodies of the findall clauses were empty, and I called no functions. I needed to take care to label the function in a comment. The new findall command was afunctional because it didn't require an external predicate. Because recursive findall didn't need an external predicate, I didn't rename, pass and lose variables. +66. When the findall body, which variables may partially define, runs out of choice points, findall ends. A condition of recursive findall is that when a member runs out of choice points, recursive findall returns the current iteration. As a note about the previous paragraph, I could use functionalism at any point of the findall predicate. I could define variables as an ri or ro label. I could define variables as the body or parts of the body. I could also define variables as ri=..., ri=ro, or the whole findall clause. +67. I reduced the complexity of algorithms in C to process lists to process array items. A condition of recursive findall is that I could alternatively use existing findall and numbers/4. I could run numbers(4,1,[],N), producing N=[1,2,3,4]. I could reverse this list and use a predicate to get the nth item of a list, to reverse a list with findall. I could convert both findall and recursive findall to an iterative C loop. +68. I could convert findall*1 and recursion to C, treating them like recursive findall (*1 - using recursive input and output to store the classic output). An example of recursive findall is creating a Prolog state machine. First, I numbered the lines while they were in hierarchical (list) format. 
Then, I converted this hierarchy to a linear list of states, with state numbers resembling go-to lines from the original algorithm. These steps were done one after the other and were like recursion without the returning variable in the base case. +69. Immutable variables that can't be set to a value when they already have one prevent overwriting and lead to better logic. I can convert recursive findall to a for loop. For example: for i=1 to length of list, a=[list[i]]:a, next i. This program reverses a list. Immutable variables were more intuitive than non-immutable variables. + +70. I could also wrap recursive findall in find n solutions. If the find n solutions command has a counter, so can recursive findall. Find n solutions counted the first n solutions. Recursive findall could also count a certain number of solutions. Recursive findall is recursive despite not returning through levels on the way up because it effectively calls itself. +71. In recursive findall, choice points back to, e.g. member contained variables from previous recursive findall passes. The second instance of the member command, etc., can contain variables. When the interpreter backtracks to a member choice point, it returns to a set of statements before the member command, containing recurring variables. Alternatively, these inverted phi nodes could be phi nodes at the end of the recursive findall body. If the programmer needs to synchronise the member call with a previous member call, there could be a single call to, e.g. member(_,[1,2,3]) for three list items to synchronise. +72. I could verify, zip or reverse lists. Alternatively, member calls may be linked together. For example, member(A,[1,2,3]), A could help retrieve the Ath list item. Or, it could help recover the A+1th list item. Or, it could help retrieve the 3-A+1th list item. +73. I also checked whether interpreter predicates were needed by the interpreter, eliminating ambiguously typed predicates. 
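The linked member calls of item 72 — using a choice A from member(A,[1,2,3]) to fetch the Ath item or the 3-A+1th item — could be sketched in C as follows; nth and nth_from_end are hypothetical helpers, not from the source.

```c
#include <stddef.h>

/* 1-based forward retrieval, like fetching the Ath list item. */
int nth(const int *list, size_t n, size_t a) {
    (void)n;             /* length kept for symmetry with nth_from_end */
    return list[a - 1];  /* convert the 1-based index to 0-based */
}

/* 1-based retrieval from the end, like fetching the n-A+1th list item:
   a=1 gives the last item, a=n gives the first. */
int nth_from_end(const int *list, size_t n, size_t a) {
    return list[n - a];  /* (n - a + 1) converted to a 0-based index */
}
```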
I wrote a command that adds commands and variables to the State Saving Interpreter. The interpreter prevented each command from being a recursive predicate's name. Each command had an entry in the multilingual dictionary. I could switch commands on and off, depending on whether the user might need them in the algorithm. +74. I included specific modes of commands. I could switch commands on or off in the compiler. I used a command if it was in the program. I used all commands indicated by a call command. I checked whether the call command was open-ended or contained items from a constant list. +75. I created the personalised email. I did this by making a no-code editor with the State Saving Interpreter. I wrote a text file with databases. The file also had report templates. It also had manipulation operations. +76. I could edit multiple fields at once. I wrote a user interface for the no-code editor. I reused the edit page. Both the administrator and user could edit their details. I added the field to edit to the process. +77. I generated images, HTML lectures and saveable HTML pages with Prolog. I wrote a graphical user interface for the no-code editor. I selected an edit, report or operation page from the list. Appointed code editors could write code for, e.g. games, timed events or content management systems. Certain disabled people could read PDF content and answer questions in word processing documents. +78. The interpreter kept the passwords away from users unless a user requested the password. I included a password field in the State Saving Interpreter Web Service. The password field showed bullets (black circles) when the user pressed keys in the web page field. The interpreter tested for the command before running it. The user could save the password using jQuery. + +79. I could select and abolish particular asserted rules. For example, I abolished all predicates with a specific name and arity. I could also abolish all predicates with a particular body. 
In addition, I could add predicates with these attributes. Finally, I could collect and edit parts of predicates. +80. I found that the successful result was [[]], not []. I bagged (selected) database results that met certain conditions. I found that all items were not failures. I found searches, permutations and set-theoretical sets of items. I searched within domains for items. + +*,\"Lucian Prolog was SSI, which the user converted to C to run\",a_alg([[[],\"c\"],[\"c\",\"mannerisms\"],[\"mannerisms\",\"attached\"],[\"attached\",\"campaign\"]]),b_alg([[[],\"c\"],[\"c\",\"mannerisms\"],[\"mannerisms\",\"attached\"],[\"attached\",\"campaign\"]],a),bb_alg([[\"c\",\"mannerisms\"],[\"mannerisms\",\"attached\"],[\"attached\",\"campaign\"]])],[*,\"Lucian Prolog was SSI, which the user converted to C to run\",a_alg([[[[],v],\"c\"]]),b_alg([[[[],v],\"c\"]],a),bb_alg([])],[*,\"Lucian Prolog was SSI, which the user converted to C to run\",a_alg([[[],\"c\"],[\"c\",\"conjoiner\"],[\"conjoiner\",\"vanishing\"]]),b_alg([[[],\"c\"],[\"c\",\"conjoiner\"],[\"conjoiner\",\"vanishing\"]],a),bb_alg([[\"c\",\"conjoiner\"],[\"conjoiner\",\"vanishing\"]])],[*,\"Lucian Prolog was SSI, which the user converted to C to run\",a_alg([[[],\"user\"],[\"user\",\"backed\"],[\"backed\",\"monos\"]]),b_alg([[[],\"user\"],[\"user\",\"backed\"],[\"backed\",\"monos\"]],a),bb_alg([[\"user\",\"backed\"],[\"backed\",\"monos\"]])],[*,\"Lucian Prolog was SSI, which the user converted to C to run\",a_alg([[[],\"lucian\"],[\"lucian\",\"pawing\"],[\"pawing\",\"offerings\"]]),b_alg([[[],\"lucian\"],[\"lucian\",\"pawing\"],[\"pawing\",\"offerings\"]],a),bb_alg([[\"lucian\",\"pawing\"],[\"pawing\",\"offerings\"]])],[*,\"Lucian Prolog was SSI, which the user converted to C to 
run\",a_alg([[[],\"lucian\"],[\"lucian\",\"incrementally\"],[\"incrementally\",\"interestingness\"],[\"interestingness\",\"os\"]]),b_alg([[[],\"lucian\"],[\"lucian\",\"incrementally\"],[\"incrementally\",\"interestingness\"],[\"interestingness\",\"os\"]],a),bb_alg([[\"lucian\",\"incrementally\"],[\"incrementally\",\"interestingness\"],[\"interestingness\",\"os\"]])],[*,\"Lucian Prolog was SSI, which the user converted to C to run\",a_alg([[[[],n],\"ssi\"],[\"ssi\",\"activating\"],[\"activating\",\"deployment\"]]),b_alg([[[[],n],\"ssi\"],[\"ssi\",\"activating\"],[\"activating\",\"deployment\"]],a),bb_alg([[\"ssi\",\"activating\"],[\"activating\",\"deployment\"]])],[*,\"Lucian Prolog was SSI, which the user converted to C to run\",a_alg([[[[],n],\"ssi\"],[\"ssi\",\"safety\"],[\"safety\",\"npackage\"]]),b_alg([[[[],n],\"ssi\"],[\"ssi\",\"safety\"],[\"safety\",\"npackage\"]],a),bb_alg([[\"ssi\",\"safety\"],[\"safety\",\"npackage\"]])],[*,\"Lucian Prolog was SSI, which the user converted to C to run\",a_alg([[[],\"user\"],[\"user\",\"navigated\"],[\"navigated\",\"pauses\"]]),b_alg([[[],\"user\"],[\"user\",\"navigated\"],[\"navigated\",\"pauses\"]],a),bb_alg([[\"user\",\"navigated\"],[\"navigated\",\"pauses\"]])],[*,\"Lucian Prolog was SSI, which the user converted to C to run\",a_alg([[[],\"ssi\"],[\"ssi\",\"honesty\"],[\"honesty\",\"breastoned\"]]),b_alg([[[],\"ssi\"],[\"ssi\",\"honesty\"],[\"honesty\",\"breastoned\"]],a),bb_alg([[\"ssi\",\"honesty\"],[\"honesty\",\"breastoned\"]])],[*,\"Lucian Prolog was SSI, which the user converted to C to run\",a_alg([[[],\"lucian\"],[\"lucian\",\"dendrite\"],[\"dendrite\",\"likeness\"],[\"likeness\",\"enwrapped\"]]),b_alg([[[],\"lucian\"],[\"lucian\",\"dendrite\"],[\"dendrite\",\"likeness\"],[\"likeness\",\"enwrapped\"]],a),bb_alg([[\"lucian\",\"dendrite\"],[\"dendrite\",\"likeness\"],[\"likeness\",\"enwrapped\"]])],[*,\"Lucian Prolog was SSI, which the user converted to C to 
run\",a_alg([[[[],v],\"user\"]]),b_alg([[[[],v],\"user\"]],a),bb_alg([])],[*,\"Lucian Prolog was SSI, which the user converted to C to run\",a_alg([[[],\"converted\"],[\"converted\",\"pick\"],[\"pick\",\"muscles\"]]),b_alg([[[],\"converted\"],[\"converted\",\"pick\"],[\"pick\",\"muscles\"]],a),bb_alg([[\"converted\",\"pick\"],[\"pick\",\"muscles\"]])],[*,\"Lucian Prolog was SSI, which the user converted to C to run\",a_alg([[[[[],v],a],\"user\"]]),b_alg([[[[[],v],a],\"user\"]],a),bb_alg([])],[*,\"Lucian Prolog was SSI, which the user converted to C to run\",a_alg([[[[[],v],a],\"user\"]]),b_alg([[[[[],v],a],\"user\"]],a),bb_alg([])],[*,\"Lucian Prolog was SSI, which the user converted to C to run\",a_alg([[[[],v],\"user\"]]),b_alg([[[[],v],\"user\"]],a),bb_alg([])],[*,\"Lucian Prolog was SSI, which the user converted to C to run\",a_alg([[[],\"lucian\"],[\"lucian\",\"takers\"],[\"takers\",\"apparatus\"]]),b_alg([[[],\"lucian\"],[\"lucian\",\"takers\"],[\"takers\",\"apparatus\"]],a),bb_alg([[\"lucian\",\"takers\"],[\"takers\",\"apparatus\"]])],[*,\"Lucian Prolog was SSI, which the user converted to C to run\",a_alg([[[[[],v],a],\"lucian\"]]),b_alg([[[[[],v],a],\"lucian\"]],a),bb_alg([])],[*,\"Lucian Prolog was SSI, which the user converted to C to run\",a_alg([[[[],v],\"lucian\"]]),b_alg([[[[],v],\"lucian\"]],a),bb_alg([])],[*,\"Lucian Prolog was SSI, which the user converted to C to run\",a_alg([[[],\"prolog\"],[\"prolog\",\"harnesses\"],[\"harnesses\",\"persuading\"],[\"persuading\",\"physical\"]]),b_alg([[[],\"prolog\"],[\"prolog\",\"harnesses\"],[\"harnesses\",\"persuading\"],[\"persuading\",\"physical\"]],a),bb_alg([[\"prolog\",\"harnesses\"],[\"harnesses\",\"persuading\"],[\"persuading\",\"physical\"]])],[*,\"Lucian Prolog was SSI, which the user converted to C to 
run\",a_alg([[[],\"user\"],[\"user\",\"reformulation\"],[\"reformulation\",\"material\"]]),b_alg([[[],\"user\"],[\"user\",\"reformulation\"],[\"reformulation\",\"material\"]],a),bb_alg([[\"user\",\"reformulation\"],[\"reformulation\",\"material\"]])],[*,\"Lucian Prolog was SSI, which the user converted to C to run\",a_alg([[[[[],v],a],\"user\"]]),b_alg([[[[[],v],a],\"user\"]],a),bb_alg([])],[*,\"Lucian Prolog was SSI, which the user converted to C to run\",a_alg([[[],\"lucian\"],[\"lucian\",\"rush\"],[\"rush\",\"fac\"]]),b_alg([[[],\"lucian\"],[\"lucian\",\"rush\"],[\"rush\",\"fac\"]],a),bb_alg([[\"lucian\",\"rush\"],[\"rush\",\"fac\"]])],[*,\"Lucian Prolog was SSI, which the user converted to C to run\",a_alg([[[[[],v],a],\"lucian\"]]),b_alg([[[[[],v],a],\"lucian\"]],a),bb_alg([])],[*,\"Lucian Prolog was SSI, which the user converted to C to run\",a_alg([[[[],n],\"converted\"],[\"converted\",\"satisfactorily\"],[\"satisfactorily\",\"surrounded\"]]),b_alg([[[[],n],\"converted\"],[\"converted\",\"satisfactorily\"],[\"satisfactorily\",\"surrounded\"]],a),bb_alg([[\"converted\",\"satisfactorily\"],[\"satisfactorily\",\"surrounded\"]])],[*,\"Lucian Prolog was SSI, which the user converted to C to run\",a_alg([[[],\"prolog\"],[\"prolog\",\"produces\"],[\"produces\",\"ai\"]]),b_alg([[[],\"prolog\"],[\"prolog\",\"produces\"],[\"produces\",\"ai\"]],a),bb_alg([[\"prolog\",\"produces\"],[\"produces\",\"ai\"]])],[*,\"Lucian Prolog was SSI, which the user converted to C to run\",a_alg([[[[],v],\"run\"]]),b_alg([[[[],v],\"run\"]],a),bb_alg([])],[*,\"Lucian Prolog was SSI, which the user converted to C to run\",a_alg([[[[[],v],a],\"prolog\"]]),b_alg([[[[[],v],a],\"prolog\"]],a),bb_alg([])]] +[] +true. + + +noprotocol. 
+ + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/IMMORTALITY/Immortality 7.txt b/Lucian-Academy/Books/IMMORTALITY/Immortality 7.txt new file mode 100644 index 0000000000000000000000000000000000000000..97bb1cf29e65b28c489670a019e008be67fa88ac --- /dev/null +++ b/Lucian-Academy/Books/IMMORTALITY/Immortality 7.txt @@ -0,0 +1,88 @@ +["Green, L 2022, Immortality 7, Lucian Academy Press, Melbourne.","Green, L 2022",1,"Immortality 7 + +1. I wrote an interpreter to parse the variable [v1, name]. I built a term argument by argument using append. I wrote an alternative variable notation, [v1,name]. Variables had a name in the algorithm. In listings and traces, the interpreter replaced variables with _1, _2, etc. +2. I inverted reverse with the query reverse(B,[],[3,2,1]). and program reverse([],A,A) :- !. reverse([A|D],B,C) :- reverse(D,[A|B],C),!. with the result B = [1, 2, 3]. As an arithmetic comparison, I rounded both values and tested whether they were equal. I wrote a new predicate that deleted extra zeros after numbers and tested whether they were equal. I back-moved from results to operations that formed them to find constraints. For example, x^2+y=5 and x+y=3, so x=2 and y=1. +3. I could use find whether points were on certain graphs. For example, I evaluated A is 2+3, returning A=5. I could determine 5 is 2+B, returning B=3. In the case of 5 is C+B, I needed -1 is C-B to find C=2 and B=3. In the case of 5 is C+B+D, I needed -1 is (C-B)-D and 0=D (3 different equations for three unknown variables) to find C=2 and B=3. +4. I played the tower stacking game. I added the global variable before the others. I changed the global variable into a passed variable when I wanted. I deleted the variable and re-added it after inserting a value in the middle. I could remove the variable added just after adding it, like a stack, for example, a memory stack. +5. I queued the items to be processed by the server. I added the global variable after the others. 
I stored the item in a queue, processing it when it reached the front of the queue. Instead, I added an item to the global variable, determining the order. I could store the queue in a file so the server could access it. +6. In most cases, the interpreter displayed the atom in single quotes. I determined whether the item was an atom. An atom was a string shown without double quotes or a number. If the string contained only letters, the interpreter displayed it without single quotes. If the string was an equal or minus sign, the interpreter showed it in brackets. +7. I could interchange letters for letters in a word. Then, I converted the string or atom to a list of string codes. I could parse this list of codes with a grammar. I could then also convert the character codes to characters using findall. Or, I could directly convert the atom to a list of characters. +8. I could replace certain character codes. I converted the atom into codes, not a list of characters. I could also convert the atom to characters to parse with a grammar. I wrote the predicate to convert the atom to characters. I wrote the C code to do this. +9. After concatenating atoms, I could append the atom to other items. At first, I concatenated the atoms rather than the strings. The interpreter wrapped atoms containing double quotes in single quotes. The interpreter wrapped strings containing double quotes in double-quotes. Using program finder, I used atom concat to concatenate atoms where I had determined this was necessary. +10. I noted that in A is 1.1+0.9, A = 2.0 (a float). I determined whether the item was an atom or a number. Then, I could decide if the number was an integer or a float. An integer was, e.g. 1, 2 or 3, and a float was, e.g. 1.1 or 1.0. The number became a float when the interpreter added decimal places, i.e. integer+float=float. +11. I found the length of the atom. 
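Paragraph 10's typing rules (integer+float=float) can be checked directly; this is a sketch at the SWI-Prolog top level, not the text's own interpreter, and check_types is a hypothetical name:

```prolog
% Checking paragraph 10's typing rules in SWI-Prolog (a sketch):
check_types :-
    A is 1.1 + 0.9, float(A), A =:= 2.0,  % float + float gives the float 2.0
    B is 1 + 2, integer(B),               % integer + integer stays an integer
    C is 1 + 0.5, float(C),               % integer + float promotes to a float
    atom(abc), \+ number(abc),            % an atom is not a number
    writeln(ok).
```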
I could find the first character of the atom with atom_chars(abc,A),length(B,1),append(B,_,A),atom_chars(C,B), resulting in C=a. I could find the last character with atom_chars(abc,A),length(B,1),append(_,B,A),atom_chars(C,B),!, resulting in C=c. Even though I could find the first item of a list with length(B,1), append(B,_,[a,b,c]), I couldn't find the first item of an atom with atom_length(B,1),... because it returns an instantiation error. I could write a predicate that did this. +12. I used the same predicate for atom_chars as string_chars, except the wrapping quote was different. I found the sublist using A=[a,b,c,d,e],append(C,B,A),append([b,c],D,B). Of these results, C is the list before the sublist [b,c]. In addition, D is the list after the sublist [b,c]. I could find subatoms and substrings by first converting them with atom_chars(abcde,A), producing A = [a, b, c, d, e] or string_chars(\"abcde\",A), which produced A = [a, b, c, d, e]. +13. I used bagof, which was similar to findall but returned non-labelled variables. For example, bagof(A,(member(A,[1,2,3]),C=1),B) returned C = 1, B = [1, 2, 3]. Unlike setof, bagof doesn't sort results (to be [1,3]) in the following: bagof(A,(member(A,[1,1,3]),C=1),B), returning C = 1, B = [1, 1, 3]. I could also return two lists using the member command and a list of item numbers and get the item number. I returned two lists and C (as above), different from recursive findall, which stores variables between passes. +14. I caught errors in the program with catch(1 is A,X,writeln(\"Error.\")) producing X=error(instantiation_error, context(system:(is)/2, _)). If I ran throw(error(instantiation_error, context(system:(is)/2, _))), the interpreter would produce an error message to the user. I could edit the program's code after the interpreter found the error. If a predicate wasn't inverting because of an error in the interpreter, I wrote an inverting predicate. I reported the errors with the file names and line numbers. +15. 
I could find the character from the code. I found the character's code. For example, \"A\"'s code is 65. The character code was how the interpreter stored the character. I reported this value. +16. I added predicates to run a different form of Prolog. I translated a predicate using another predicate. I could also convert programming languages utilising this technique. I temporarily renamed a predicate for debugging purposes. I also changed a predicate name or moved predicates together. +17. I found multiple clauses of a predicate or multiple clauses with a body. I found whether a clause in memory had a given Head and Tail. Certain predicates were interpreter-side, meaning they weren't visible. I added and edited clauses. I could find private clauses. +18. I didn't overwrite a stream (I checked that the streams had different variables). I closed a stream. The stream could be a file. I wrote files to contain terms without predicate names or full stops. I could write files to include Prolog code. + +19. I determined that the item was not a variable or an atom but a compound. Lists are compounds. Lists containing variables are compounds. I differentiated lists from lists containing the pipe \"|\" with the following algorithm: foldr(append,[[1|2]],[], A), which returns false because the list contains \"|\". I stated that foldr(append,[[1|[2]]],[],A) returned true even though it contained a \"|\" because the list simplified to [1,2]. +20. I replaced variables with underscored variables and returned the correspondences with the original variables. I copied the term. For example, A1=[a,B],A2=[A,b],copy_term(A1,A2) returned A2=[a,b]. My interpreter took [A,B]=[C,D] and returned A = C and B = D. The interpreter took A=B,B=C,C=D,C=1 and returned A = B, B = C, C = D, D = 1 (pushing ground terms down). Also, it took A=B,A=C,A=D and returned A = B, B = C, C = D (defining variables in terms of a more recent variable). +21. I checked instances were together. 
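Paragraph 20's copy_term example can be reproduced in SWI-Prolog; this is a sketch, and copy_demo is a hypothetical name:

```prolog
% Paragraph 20: copy_term renames [a,B]'s variable, and the fresh copy
% [a,Fresh] then unifies with [A,b], giving A2 = [a,b].
copy_demo :-
    A1 = [a, _B],
    A2 = [A, b],
    copy_term(A1, A2),      % the copied term unifies with A2
    A == a, A2 == [a, b],
    writeln(A2).
```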
I found all the predicates with a particular name and arity. I could discover predicate names from arities. Also, I could find arities from a predicate name. In addition, I could return all predicates and arities. +22. I adjusted the stack size for the algorithm. I did this by returning the Prolog flags in memory. First, I found the flag's values. Then, also, I could see the values' flags. Finally, I found whether the flag had a particular value. +23. I found the current input stream, i.e. its code. For example, the current input stream started as the keyboard and computer on which the user loaded Prolog. First, I could change the stream. Then, I could load from this stream. Then, I could change the stream back. +24. I found the current output stream, checking the algorithm didn't reuse it. For example, writing to each file required a new stream. I did this by opening the stream. Next, I wrote to the file. Then, I closed the stream. +25. I found that A is float(1) resulted in A = 1.0 and that A is float(1.0) resulted in A = 1.0, which I checked were floating point numbers or integers. I evaluated the functors. I found the sign (-1,-1.0,1 or 1.0). Or, I found that A is float_integer_part(-2.5) resulted in A = -2.0 (a float, different from the ISO standard). I found that A is float_fractional_part(-2.5) resulted in A = -0.5. +26. I found an algorithm that skipped over recursion (I renamed it). Before this, I found all the interpretations of the spec. Then, I found the first n solutions to the spec. After that, I contained the lists of new combinants (sic)(possible features). I found combinations of features that skipped over the need for others (an optimisation). +27. I experimented with changing the flag value. I wrote the Prolog flags. I flagged the freedom of variables in recursive findall. Also, I found the inverse of free recursive findall. The inverse had the same number of statements as variables. +28. I read the decimal point in the number. 
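Paragraph 25's float functions can be checked in SWI-Prolog; a sketch with a hypothetical predicate name:

```prolog
% Paragraph 25: float_integer_part and float_fractional_part in
% SWI-Prolog (a sketch):
float_parts :-
    A is float_integer_part(-2.5),    % A = -2.0, a float
    A =:= -2.0, float(A),
    B is float_fractional_part(-2.5), % B = -0.5
    B =:= -0.5,
    S is sign(-2.5), S =:= -1.0,      % the sign of a float is a float
    writeln(ok).
```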
I determined whether the number was a floating point number. For example, 1.0 is a floating point number, but 1 is not. It didn't matter if the user instantiated the variable; the variable collected the rule. Later, the free variable met the spec. +29. I accepted input from forms and printed the HTML table using recursive findall. I automatically flushed output at the start of the following line. I made my terminal or web algorithm that allowed writing at an (X, Y) location. I also displayed graphics. I could display input fields. +30. I created the functor A=b(_,_) with functor(A,b,2). Later, I could set its arguments with arg, with A=b(_,_),arg(1,A,c) resulting in A = b(c, _). I computed b(c,d)=..E, which returned E = [b, c, d]. I could also produce functors with arity 0, e.g. b. I produced b with F=..[b]. +31. I synchronised the music and graphics (the graphics followed the music). I got a byte from a stream. For example, I got MIDI music. I also got the vector or bitmap graphics. I modified them and saved them. +32. I got a single character from input, also online. Before this, I programmed the game. It was a role-playing game. In it was a recursive findall puzzle. In it, data flowed in a circle until the result was correct. +33. I taught string to list at the Academy, compensating for spaces. I got the next character from the file, returning -1 if it was the end of the file. I read the entire file using a grammar. I also converted the string from the file into a term. I processed this term using a Prolog algorithm. +34. I could automatically run the algorithm with crontab -e. I halted the interpreter. I halted the interpreter when the algorithm had finished. I returned with exit status 0, signalling that the interpreter had completed the algorithm successfully. Exit status 1 signified a minor error. + +35. I wrote the algorithm writing helper. I traced the data flow in the hooks defined in the implementation of Prolog. Then, I named the current file. 
Fourth, I reported the undefined variables. Finally, I reported probable errors, such as numbers of items in terms not corresponding, failures from calling undefined predicates and lack of a base case. +36. If there were no recurring variables, I converted recursion to findall. I generated the algorithm with a program finder with types. If a number was an integer, I appended it in recursion. The part of the algorithm that found the integer condition was CAW for numbers. I named the predicate \"collect integers\". +37. The last term in A=B, A=1, B=2 becomes 1=2, failing. I determined the input and output of variables. I defined all variables as a value when I defined them as a variable given that value. The variable, e.g. _1 with the value _2 equalling the variable _3 with the value _4 caused _3 to equal _2. Also, given A=B, B=1, the result was A=1, demonstrating that the interpreter propagated ground values before returning from the predicate. +38. I stored and optionally used SLG tabling results on a server. I controlled logic with not, once and repeat. I ran useful algorithms multiple times. I ran them with a set of data once. The interpreter used SLG tabling to speed up repeated computations. +39. I solved spelling errors, mistyped names and supported back-compatibility. I wrote a self-correcting algorithm that solved non-existent variables. Also, it solved uninstantiated variables. I changed the variable name from A to _1 for security. The interpreter changed these back before output. +40. Logical terms are true or false, which \"not\" inverts. I wrote that the variable was not a number. So, it could be an atom, string, list or list with a pipe at the first level. I determined that the item was not a list with a pipe at the first level but a list. I counted the pipes at each level, e.g. foldr(append,[[a],[b,[c|d]]],[],A) where A = [a, b, [c|d]]. +41. I determined whether the number was an integer or a float. If it was an integer, I left it as it was. 
I truncated it to the integer closer to zero if it was a float. If it was an atom, I converted it to a number. If this returned false (meaning I couldn't convert it to a number), I repeated asking for input. +42. I inserted cuts where necessary. I converted a number to characters using number_string(11,A),string_codes(A,C),findall(B,(member(B1,C),char_code(B,B1)),D). The result was D = ['1', '1']. I found likely bugs, such as breaks in chains of variables. Also, I found likely bugs in base cases. +43. I verified the number was from a certain system. I converted a number to string codes. I accomplished this with number_string(11,A),string_codes(A,C). This resulted in C = [49, 49]. I could also convert codes to a number using string_codes(A,[49,49]),number_string(B,A), with the result B = 11. +44. I always caught and solved errors and aborted to a level. I ran statements (i.e. a) once. The code \"once(a)\" had the same effect as calling b, given the program b:-c. c:-a,!. I could use once more than once in a predicate. Other statements in the predicate could have more than one set of results, i.e., they could be non-deterministic. +45. I organised files' contents by group, including control over a compiled application's executable and data portions. I opened a stream from the virtual file. The file was an item in the data file for the user. Also, I organised the file by folder and name. I used a variable for operations on a current file. +46. I wrote the font to the HTML page. I invented new font characters to be operators. I also gave them a particular order of priority. For example, I wrote a predicate that sorted items or combined a Prolog command with arithmetic. There was an unusual xf or yf (postfix type) operator. +47. I changed the functors ',' or sin. I found the bit-wise complement of -1 with A is \\(-1), resulting in A = 0. The arithmetic functors such as the power of, sin and sqrt produced floats from integers or floats. 
The bit-wise functors such as \"bit-wise right shift\" and \"bit-wise and\" produced integers from integers or floats. I couldn't change the functors ',' or sin. +48. The algorithm was verified and assessed using tests. At first, I peeked at the next byte in the stream. Then, I invented a custom programming language to stream data. These were videos. They explained my philosophy and how to teach and assess programming assignments. +49. When the student saw me, I could give my thoughts. I found the next character in the stream. I checked the students were making progress on their assignments with version control. The tests were simple enough to work when the program started working. The high-quality thoughts were details (other algorithms), 250 per assignment for three subjects. +50. I earned a high distinction. I peeked at the code on the stream. I scheduled writing the algorithms for my degree. I planned to write the algorithms for my institution. PhD and vocational assignments also had this requirement. + +51. I created a video explaining how to say what you are saying to be thought of to program. I put a byte on the stream. I remembered the idea that I aimed to program. I listened to the concept of to program. I worked out the idea to program. +52. I explained the idea as a reference (what the command means). At first, I put a character, code or new line on the stream. I explained the command. I explained the algorithm. I explained the differences and connections between particular commands and algorithms. +53. Thought, and sensory data, followed by movement, required the most considerable amount of data. I did this by reading from the binary stream. First, I made a binary stream of bytes. It could be video, music or robotic thought. I also checked whether the robot's body movements required a stream. +54. I avoided variable names describing variable values in the interpreter for security. Leading to this, I wrote characters as input and output. 
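Paragraph 47's evaluable functors can be checked in SWI-Prolog; a sketch with a hypothetical predicate name (note that SWI requires integer arguments for the bit-wise functors):

```prolog
% Paragraph 47: bit-wise and arithmetic evaluable functors (a sketch):
bitwise_demo :-
    A is \(-1),   A =:= 0,    % the complement of -1 is 0
    B is 6 >> 1,  B =:= 3,    % bit-wise right shift
    C is 6 /\ 3,  C =:= 2,    % bit-wise and
    D is sqrt(4), float(D),   % arithmetic functors like sqrt yield floats
    writeln(ok).
```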
With this, I wrote the text interface for the algorithm. I determined what the user wanted and needed. I wrote the algorithm. I wrote for the future and past and to have compatibility between algorithms. +55. I converted the text from the speaker. I did this by reading from text streams. From this, I breasoned out text. Next, I wrote algorithms specified by the stream. Finally, I wrote an essay about a stream. +56. I found the non-singletons. I read terms from user input. Given read_term(A,[variables(B)]) and input a:-b(C),d(E), the interpreter returned A = (a:-b(_A), d(_B)), B = [_A, _B]. Given read_term(A,[variable_names(B)]) and a:-b(C),d(E), the interpreter returned A = (a:-b(_A), d(_B)), B = ['C'=_A, 'E'=_B]. Given read_term(A,[singletons(B)]) and input a:-b(C),d(E),f(C), the interpreter returned A = (a:-b(_A), d(_B), f(_A)), B = ['E'=_B]. +57. I used a key with enough digits to be secure. Towards this, I repeated from \"repeat\" if there was a failure. Then, I cut to the first output. Then, I could generate a key. If the key already existed, I could create another one. +58. I added new clauses. Separately, I retracted a predicate with a particular name and arity. I could still detect the deleted clauses with the current predicate command. I could replace clauses, all of which I had deleted. Finally, I could list and error check if I had deleted them. +59. I opened files with phrase_from_file_s(string(String_dict2), \"file.txt\"), string_codes(String_dict1,String_dict2), atom_to_term(String_dict1,String_dict,_) and wrote to them with term_to_atom(String_dict,String_dict1), (open_s(\"file.txt\",write,Stream1), write(Stream1,String_dict1), close(Stream1)). I set the input stream to a particular stream. I could open a particular file. I could open, access and close a file. I could also write to files. +60. 
Recursive \"set of\" collected values from recurring variables while sorting a list of items, and could skip appending to recurring variables if the clause failed. Similarly to \"find all\", I found a sorted set of list and other items collected during find all. For example, after entering setof(A,(member(A,[1,2,3]),C=1),B), the interpreter returned C = 1, B = [1, 2, 3]. Unlike bag of, set of sorted the results: after entering setof(A,(member(A,[1,1,3]),C=1),B), the interpreter returned C = 1, B = [1, 3]. I wrote a custom predicate with the interpreter, recursive set of, which deleted duplicates by checking that items were not a member and appending items to a recurring variable. +61. I checked whether the text was ethical. I set the output to a specific stream. I checked the spelling and outputted the correct spelling. Similarly, I checked the grammar. Also, I checked the mathematics. +62. I edited the stream with markers. I set the stream position. I found the stream position using position. I set the position to this. I went to a particular position in playback. +63. I found the stream property. For example, I found the file name. Or, I found the stream position. Then, I found the mode (read, write, or append). There were many options, including the binary or text type of the stream. +64. I found the end of the file. For example, I determined whether the position was at the end of the stream. Separately, another property was whether the program connected the stream with input. An additional property was whether the algorithm joined the stream with output. Finally, I determined whether the stream had an alias and what it was. +65. At the end of the stream, I could pass an error, create an \"end of file\" code or reset to \"end of stream\" at a position. I selected and controlled the stream. When I started the stream, I could determine whether it was a text (the default) or a binary stream. I could also decide whether or not I could reposition the stream. 
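The bagof/setof contrast of paragraphs 13 and 60 can be reproduced in SWI-Prolog; a sketch with a hypothetical predicate name:

```prolog
% Paragraphs 13 and 60: bagof keeps duplicates and order, while setof
% sorts and removes duplicates (a sketch):
sorting_demo :-
    bagof(A, member(A, [1,1,3]), B1), B1 == [1, 1, 3],
    setof(C, member(C, [1,1,3]), B2), B2 == [1, 3],
    writeln(B2).
```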
In addition, I could determine that an atom A is the alias of the stream. +66. I also used atomic_list_concat(A,'.','a.b.c') to return A = [a, b, c]. I found the subatom of the atom. For example, I entered sub_atom(abcdef,1,3,2,bcd). In this, 1 was the length before, 3 was the subatom length, 2 was the length afterwards, and bcd is the subatom of abcdef given these parameters. I could find anagrams (prefixes and suffixes of words). + +67. I detected a bug in which a>=x>=b, not a>x>=b. I compared whether one term was less than, less than or equal to, equal to another, etc. I tested whether there was a bug compared with the specification. Alternatively, I tested whether there was a bug compared with the input. I could insert predicate calls to determine whether indicated variable values were equal, etc. +68. I created and decomposed a term. In BASIC, I could make an array with a term in it. For example, I could convert a string to a number or store it in a different array. Then, I could call this term. I could also decompose and build it. +69. I deleted the code if the inputs were always the same. I optimised the code as I ran it. If the input was the same, I used the same result. Alternatively, I used the same result if the API parameters were the same. If the random number was the same, I used the same result. +70. I could avoid unification if a variable appeared in terms of another variable. I did this by writing the Prolog unification algorithm. First, I substituted variable values into both sides. Second, I found undefined variable values by unifying the two sides. Finally, I compensated for the pipe (\"|\") symbol between the head items and the tail of the list. +71. I could unify with occurs check. I checked whether a variable appeared in terms of another variable. If the two sides had different functions, the unification would fail. If there were no unifications of variables with values, then the interpreter would return no variable values. 
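The occurs-check behaviour of paragraphs 70-71 can be checked with SWI-Prolog's built-in unify_with_occurs_check; a sketch with a hypothetical predicate name:

```prolog
% Paragraphs 70-71: unification with the occurs check rejects X = f(X)
% (a sketch):
occurs_demo :-
    unify_with_occurs_check(X, atom), X == atom,  % plain binding succeeds
    \+ unify_with_occurs_check(Y, f(Y)),          % cyclic binding fails
    writeln(ok).
```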
I could return atom=X, or if X=atom, then replace X with atom elsewhere, or if X=f(X) fails with a positive check. +72. I could manually create a List Prolog term, for example [[a],[b,c]]. I entered a(b,c)=..D., with the result D = [a, b, c]. Secondly, I entered [a,b,c]=..E. with the result E = ['[|]', a, [b, c]]. Thirdly, I entered F=..[',',a,[b,c]]. returning F = (a, [b, c]). I also entered F=..['.',a,[b,c]]., returning F = a.[b, c]. +73. I returned the type of the variable. I returned whether a term was a variable. In List Prolog, I returned that [v,x] was a variable. In another interpreter, [v1,x] was a variable. I could convert another form of List Prolog, X, into this form of List Prolog using a grammar. +74. I wrote the term. I could define various types of brackets. For example, . Another example is {b}. Also another example is /c\\. +75. The algorithm format entry system wrote the algorithm using a program finder with types (in fact, descriptions) as I entered descriptions of the algorithm's parts. I wrote the algorithm format entry system (I found the algorithm and its use). I did this by breaking down the algorithm into functions (I broke equals4 into put and get values). The algorithm entry system broke down the algorithm into parts. It asked whether a part was in terms of a part or was separate. +76. I pretty-printed the predicate generated with program finder with types. I wrote equals4 with program finder with types by intelligently simplifying brackets where possible, etc. In paragraph 75, the description might be sort addresses, interpreted as sort a list of lists (which addresses are), where the program finder recognises the sort predicate name and data (like types or a description). The connections between, for example, the user would store addresses and a list of lists in an ontologies database. 
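Paragraph 72's =.. (univ) examples can be checked in SWI-Prolog; a sketch with a hypothetical predicate name (the [a,b,c]=..E result depends on SWI's '[|]' list functor, so it is omitted here):

```prolog
% Paragraph 72: building and decomposing terms with =.. (a sketch):
univ_demo :-
    a(b,c) =.. D, D == [a, b, c],        % decompose a term
    F =.. [',', a, [b,c]], F == (a, [b,c]), % build a ','/2 term
    G =.. [b], G == b,                   % arity-0 functor gives the atom b
    writeln(D).
```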
The algorithm format entry system might ask whether the user wants to sort a list and then append (dependently), or sort and then pretty-print (independently). +77. I wrote string to list with program finder with types, identifying parts within parts and writing a grammar. First, this algorithm recognised the brackets. To do this, I converted the brackets to a list. Second, I recognised commas in List Prolog strings. Finally, I converted commas to list item delimiters. +78. I found that the person had decomposed the list but not transformed the list item. I found bugs by comparing the specification with the code. I drafted the code. I wrote a simplification of the algorithm using the specification and program finder with types and combination algorithm writer. I identified similar cognitive steps in the code and corrected errors. +79. I identified that the user was trying to decompose a list but used the member command by accident. I found a predicate that finds cognitive reasons for code; for example, decompose and build an item in a predicate. The item started as [A, B]. It ended as [A+1,B+1] (calculated). I wrote algorithms that identified decomposition, transformation and building lists. +80. I could merge two or more predicates if possible. I found a bug in which the code was in a separate predicate. The user used two predicates, one called by another. The called predicate was not recursive or reused. I moved the contents of the second predicate into the first predicate, changing the variable names if necessary. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/IMMORTALITY/Immortality 8.txt b/Lucian-Academy/Books/IMMORTALITY/Immortality 8.txt new file mode 100644 index 0000000000000000000000000000000000000000..a6f84c3d4929b72f7aefecfc53cc150e952767ae --- /dev/null +++ b/Lucian-Academy/Books/IMMORTALITY/Immortality 8.txt @@ -0,0 +1,90 @@ +["Green, L 2022, Immortality 8, Lucian Academy Press, Melbourne.","Green, L 2022",1,"Immortality 8 + +1. 
I found a bug in which code wasn't in a separate predicate. Unlike Immortality 7, paragraph 80, I found a single predicate with reused code from another predicate. I moved the reused code into a new predicate with an appropriate head and called this predicate from the other two predicates. I also identified \"manual recursion\", i.e. a single predicate with a fixed number of recursions, and changed it to multiple predicates. Also, I determined that findall within the initial predicate could be moved to a new predicate with recursion if recurring variables were required. +2. I allowed the assumption that the student had understood it when interpreting the student's thoughts to help them write algorithms, an example of a big algorithm represented by an essay. I found algorithms that were combinations of all ideas in an article. As part of this, I found the best synthesis of the ideas. Next, I ordered the objects within objects. Finally, I wrote short stories connecting the ideas represented by the objects. + +3. I wrote algorithms to connect sentences, paraphrase and simplify each step towards a longer algorithm to one or two predicates. I gave an error on [1,\"|\",2,3] but not on [1,\"|\",[2,3]] because a list tail couldn't contain two terms. I also chose to write longer algorithms with artificial intelligence. Separately, I wrote different predicates to put and get unified variables and simplify lists. I simplified [1,\"|\",[2,3]] to [1,2,3]. +4. I connected (simplified) predicates from the sentence, possibly generating, selecting and saving names and data for a predicate. I wrote predicates with single functions, using retry to write the string to list algorithm. I found the elements of an algorithm, separating them into different predicates if possible and possibly simplifying them. I considered running one after another and reusing them. I used the skip command to test a predicate and the retry command to help edit it if it failed. +5. 
I noticed the employees, students and customers returned. I wrote each predicate, for example, an interpreter in one or two predicates. For instance, I added 1+1. I kept on adding numbers to the total. Finally, I wrote a trace predicate. +6. I deleted obsolete states in the algorithm. At first, I wrote the database of predicates for philosophical or business purposes. Using this, I could support other employees. I could impress employers and head staff. As part of this, I minimised a set of algorithms. +7. I predicted a sentence's predicate. I did this by finding the algorithm for the logical data structure in the sentence. For example, I inverted an algorithm. Then, I wrote a short sentence describing the function name and data. I could identify these later. Finally, I found the predicate from synonyms for the sentence or the function name. +8. Ontologies may be syno-specifications (with memorable sentences and uses with data). As part of this, I wrote ontologies connecting possible sentences to predicates. As a first step, I entered the sentence's predicate. Then, I entered words from the sentence associated with inputted words. Later, I could find the predicate from these inputted words. +9. The student entered the shorter algorithm to demonstrate her understanding. I was free to determine how much work I did. I could write new algorithms when necessary. I could work when I wanted to. I found the algorithms from longer algorithms. +10. I mind read the predicate. As part of this, I started with the list of possible predicates. Then, I broke these predicates into decomposition, transformation and the list building parts. Next, I mind read different possibilities at each point. Finally, I checked the link from the sentence to the algorithm. +11. I perfectly programmed the essay. I did this by mind-reading the sentence. Then, I split the sentence into grammatical parts. Next, I converted it to logic (methodical and data structures). 
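The ontology idea in paragraphs 7 and 8 — entering words associated with a predicate and later finding the predicate from a sentence's words — could be sketched in Python (the predicate names and word associations below are hypothetical, not from the original system):

```python
# Minimal sketch of an ontology linking sentence words to predicates
# (illustrative data; paragraph 8 describes associating inputted words
# with a predicate and finding the predicate from them later).
ontology = {
    "reverse": {"invert", "backwards", "reverse"},
    "append": {"join", "concatenate", "append"},
    "sort": {"order", "arrange", "sort"},
}

def find_predicate(sentence):
    """Return the predicate whose associated words best match the sentence."""
    words = set(sentence.lower().split())
    # Score each predicate by how many of its associated words appear.
    scores = {name: len(words & assoc) for name, assoc in ontology.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else None

print(find_predicate("please join these two lists"))  # append
```

A fuller version would weight synonyms and fall back to the function name, as paragraph 7 suggests.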
Finally, I found the sentence for the predicate. +12. Mind reading, decision tree mind reading in particular, suggested thoughts to students. I mind read the student thinking of a predicate using a decision tree, helping them deserve details (or enough for part of an A). I grouped them by command and variable in the decision tree. Using the decision tree with the mind reader gave more considered results. Mind reader detected a thought from an options list if the user took less time to think of a breasoning, the x, y and z dimensions of an object, using meditation. +13. Learning the algorithm names became like learning keys, saving time. Regularly writing predicates helped me write efficiently. I wrote list processing in the predicate head. I could invert the predicate also using cut. I worked out the predicate or identified its input and output and found its name. +14. CAW could work with the program finder with types to narrow down commands. For example, I used CAW to check I had optimised predicates. I also stored predicate names and variables as a,b,c, etc. If there were up to four commands, I verified that the predicate was optimised using CAW. CAW sorted and found the commands in the same order each time. +15. I inserted a command that called other commands. I used the program finder with types to check I had optimised predicates. I found the list within the string. I found a string within this list. I repeated this. +16. I used and deleted unused functional commands. I used the user interface to write the predicate. I found a.b and c.b rather than (a,c).b. However, I could use other brackets and save results. I could press tab to expand or contract commands, with dynamic results. +17. I also used the graphical user interface to write the predicate. I typed the predicate one character at a time. I could construct input and output while I typed. The input and output helped predict the code. 
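Paragraph 17's idea of predicting code while the user types one character at a time could be sketched as prefix matching over known predicate names (the candidate list here is illustrative only):

```python
# Sketch of narrowing candidate predicates as the user types (paragraph 17).
# known_predicates is hypothetical example data, not the original database.
known_predicates = ["append", "atom_string", "member", "maplist", "msort"]

def predict(prefix):
    """Return the predicates consistent with what has been typed so far."""
    return [p for p in known_predicates if p.startswith(prefix)]

# As more characters are typed, the candidates narrow.
print(predict("m"))   # ['member', 'maplist', 'msort']
print(predict("ma"))  # ['maplist']
```

The original system additionally uses constructed input and output to rank candidates; a real predictor would score by the sentence, as the paragraph notes.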
I could indicate input and output with specific uses, with some prioritised based on the sentence. +18. I converted back to an algorithm with documentation about the uses of predicates in the program. First, I chose the shortest predicates from my algorithms and converted them from names to descriptions. Next, I increased or decreased algorithms to predicates. Finally, I removed unnecessary predicates, minimising the algorithm's state machine. + +19. I stored the logical structural parameters in a text file. I wrote the logical structure of List Prolog Interpreter (LPI) (not, or, if-then) with functionalism. I did this by writing the command names. I also wrote the logical relation. Finally, I checked that the relevant commands had returned true. +20. I inverted a.b (=b^-1.a^-1) and explained a.(b.c) (=a.b.c). I made LPI more complex with more features. I listed original features. I listed the ISO features. I listed features of other programming languages and inductive algorithms. +21. I instantiated values to find before, not after searches, for example, searching in strings. I simplified the features of LPI. I used invertible functions to save time. I used the function once without inverting it straight away. Simplifying constraint-finding algorithms was critical, for example member(A,[1,2,3]) and member(B,[1,2,3]), etc. A=1,B=1 can be simplified to A=3,B=3,member(A,[1,2,3]),member(B,[1,2,3]) (much faster) even deleted. +22. I passed the ball back to the facilitator when I could run the command. In multithreading, I ran the concurrent processes with mutexes (code that stops a piece of code from being run at the same time twice). When multithreading, I wrapped each predicate call or command in a mutex. I couldn't run these parts concurrently and waited until one had finished before running it. I tried running commands one at a time, waiting to run commands when they were non-mutexed. +23. I mutex-unlocked the command when it had finished. 
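Paragraph 22's mutexes — code that stops a piece of code from being run at the same time twice — can be illustrated with a minimal Python sketch (the shared counter and worker are illustrative, not the interpreter's actual resources):

```python
import threading

# Each "command" that touches shared state is wrapped in a mutex so two
# threads never run it at the same time (paragraph 22's wrapped calls).
lock = threading.Lock()
total = 0

def add_to_total(n):
    global total
    with lock:  # acquire the mutex; other threads wait until it is released
        total += n

threads = [threading.Thread(target=add_to_total, args=(1,)) for _ in range(100)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(total)  # 100: the mutex prevents lost updates
```

Without the lock, two threads could read the same old value of `total` and one update would be lost, which is the hazard the paragraph's wrapping avoids.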
Prolog mutexes are recursive, meaning the algorithm must remove multiple mutex layers added with recursion before another thread can access the resource. They also wrapped any command, such as findall, not or if-then. If a command was mutex locked, it couldn't run. The interpreter stored mutex locking in the heap (where the globals are), and threads had their own stacks. Each predicate ID had a mutex lock status. +24. I stored stacks for local (current predicate) and global choice points and new ones for the call command, in addition to thread stacks. I made a copy of the stack for each thread and used the same heap (predicates, globals, etc.). In SSI, each frame of the interpreter (for example, running a command or exiting a predicate) of each thread ran one at a time. I could collect SLG tabling data with mutexes, saving time by using past results of previously run predicates. A command with an SLG tabling entry didn't need to be run or mutex locked. +25. I wrote an algorithm that simulated multithreading by running one command of each list at a time, which took longer than usual because it ran an interpreter. Aside from this, I made copies of stacks for threads. Then, when I created a new thread, I copied the stacks to a new set of stacks. I used these to run the commands in the thread until they exited. Then, the interpreter merged the stacks and updated the variables. +26. I considered using a computer with multiple processors. Also, multithreading with copies of the stack helped performance. For example, I tried text to breasoning with multithreading. I also tried mind reading with multithreading. Finally, I tested for better performance with multithreading, more threads and more processors. +27. I questioned multithreading with one processor or one computer. I used a server with more CPUs or processors. Multithreading improved results on this machine. I agreed with adumbrations while working on solutions during my life. 
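The recursive mutexes of paragraph 23 — where the same thread may add lock layers during recursion and must remove them all before another thread can enter — behave like Python's `threading.RLock`; a minimal sketch (the recursion depth is arbitrary):

```python
import threading

# Paragraph 23: a recursive mutex lets the owning thread lock again during
# recursion; every layer must be released before others can acquire it.
mutex = threading.RLock()
depth_reached = []

def recurse(n):
    with mutex:                 # each recursive call adds a lock layer
        if n > 0:
            recurse(n - 1)
        else:
            depth_reached.append(True)

recurse(3)                      # four nested acquisitions, all released on return
assert mutex.acquire(blocking=False)  # fully unlocked again afterwards
mutex.release()
print(depth_reached)
```

With an ordinary (non-recursive) lock, the second nested acquisition would deadlock the thread against itself.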
I did all the thinking while the computer found solutions using brute-force algorithms. +28. Multithreading improved performance in operating systems. I could multi-task (run different applications at the same time). I chose to be well-known by writing some algorithms using neural networks. I could do things in terms of my life. I could save time and spend time doing other things. +28. I gave SLG tabling as the reason for mutexes. Mutexes were necessary with immutability separately because mutexes protected the same resources with changing references from being reused. Immutability only allowed one value per variable. A mutexed command only allowed one processor to run it at a time. In this way, they were unrelated. +29. Multithreading in Prolog used C to handle memory, where Prolog ran one command at a time on each thread. Prolog lined up the threads. The interpreter ran the current command of each thread. If a thread mutexed a command, the other threads couldn't run it until it had become unmutexed. If it didn't matter which predicate finished first, they could be multi-threaded. A command which possibly didn't count could be multi-threaded. +30. Multithreading possibly took the same time to run as without, but some parts finished earlier, speeding up performance. Mutexes sped up performance when the interpreter ran the same command. An advanced algorithm chose to run threads that solved bottlenecks first. This algorithm considered the likely length of a recursive command when selecting the order of threads. +31. I considered using the fastest algorithms. I cancelled a thread that was taking too long. I timed parts of the thread. If it took too long, or the useful results had been found, I cancelled it. I could reorder parts of or cancel threads that caused bottlenecks. +32. Once one algorithm had finished, the next was run, with opportunities to pause or recontinue at different points. First, I manually queued algorithms on parallel processors. 
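Paragraph 31's timing and cancelling of a thread that takes too long could be approximated in Python with a result timeout (a sketch only: Python threads cannot be forcibly killed, so the caller stops waiting rather than stopping the worker; the tasks are illustrative):

```python
from concurrent.futures import ThreadPoolExecutor, TimeoutError as FutureTimeout
import time

# Paragraph 31: time parts of a thread and give up on bottlenecks.
def slow_task():
    time.sleep(1)       # stands in for a part that takes too long
    return "late"

def fast_task():
    return "done"

with ThreadPoolExecutor() as pool:
    slow = pool.submit(slow_task)
    fast = pool.submit(fast_task)
    try:
        slow.result(timeout=0.1)   # stop waiting for the bottleneck
        outcome = "waited"
    except FutureTimeout:
        outcome = "timed out"

print(outcome, fast.result())
```

Reordering or deprioritising such bottleneck threads, as the surrounding paragraphs suggest, would build on the same timing information.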
Then, I connected computers to a server. These computers downloaded files to process. They uploaded the results. +33. The prediction thread ran in the background while another thread waited for input. I ran commands while the user was inputting data, giving predictions. The user entered the start of a sentence or algorithm. The algorithm predicted and offered possible endings. It used a program finder with types (which could find conditions such as pipes (\"|\") for wrapping in brackets, etc.), Combination Algorithm Writer (CAW) (which recognised indices, need for reused translators or paraphrasers) or a neural network. +34. I viewed the results of the algorithm. I ran commands on the server while a web page presented live updates. The results of the legs of threads affected the next leg of threads. Threads were created or destroyed based on previous data. A monitor displayed current and past threads and results. + +35. I found conditions in terms of variables with previous algorithms. I wrote the whole interpreter with the program finder. Then, I found the required algorithms with the program finder. Next, I matched the mathematical specifications for the interpreter (e.g. a_i,...,a_n->). Then I matched them with mathematical specifications from the required algorithms. For example, the sort command was described using an algorithm and identified using run-all when there were numbers that didn't fit another pattern. Finally, I tested for conditions based on the small number of variables. +36. If there were two instances of a value in the specifications, I checked the other data sets to test if a character recurred and had a single symbol. I found and worked out additional levels (for each predicate) of mathematical specifications (character a_1 = number, etc.) within the data (input and output, etc. for the algorithm). I found the data, for example, [a,1,a]. I found the algebraic types [character 1, number 1, character 1]. 
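Paragraph 36's step from data such as [a,1,a] to the mathematical specification [a_1,a_2,a_1] with types could be sketched as follows (a simplified illustration; the real program finder handles nested levels per predicate):

```python
# Sketch of finding a mathematical specification from data (paragraph 36):
# ["a", 1, "a"] becomes symbols a_1, a_2, a_1 with types, reusing a symbol
# whenever a value recurs.
def specification(data):
    symbols = {}
    spec = []
    for value in data:
        if value not in symbols:
            symbols[value] = "a_%d" % (len(symbols) + 1)
        kind = "number" if isinstance(value, (int, float)) else "character"
        spec.append((symbols[value], kind))
    return spec

print(specification(["a", 1, "a"]))
# [('a_1', 'character'), ('a_2', 'number'), ('a_1', 'character')]
```

Checking the other data sets for the same recurring symbol, as the paragraph describes, amounts to testing that they all yield the same spec.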
I found the mathematical specifications, [a_1,a_2,a_1] where a_1=a=character, a_2=1=number. +37. Priority groups (for example, mathematical, file or grammar predicates) may be determined by the user with include statements or artificial intelligence that predicts their need. I found as much as possible with program finder with types (PFT) before Combination Algorithm Writer (CAW) was needed. Combination Algorithm Writer was required when, for example, patterns of data became unpredictable and I might need other predicates. I could find items to try with run-all (or CAW) using program finder. I considered data types, priority groups and further rounds of use of PFT or CAW. +38. I needed to skip over finding detailed code segments to achieve larger goals with neural networks (and compared program finder with types with neural networks). I found the rest of the code (after using program finder, where there was an unknown connection to the code from the mathematics specification) with a neural network. I found the maximum number of code lines with the program finder. Then, I found code for unexplained parts of data, described with mathematics specifications ([[a_1, a_2,..., a_n], [a_n,..., a_2, a_1]]) with a neural network. A neural network could find short sequences of commands with positive and negative data. +39. I prevented duplicate code in the neural network and added new code to it. I added the finished code found with neural networks to the database rather than rebuilding the neural network each time. I could see and modify the code before adding it to the database. I could delete code in the database. I could delete duplicate code in the database. +40. I checked whether seemingly too simple code might be enough. Occasionally, the algorithms might return code that appears to answer the question but is too simplistic. Next, I checked whether the algorithms had \"cheated\" (literally used specifications or not created general rules). I removed these codes. 
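Paragraph 40's check for code that has "cheated" (literally used specifications rather than creating general rules) could be sketched as testing candidates on held-out examples the generator never saw (both candidates below are illustrative):

```python
# Sketch: a "cheating" candidate memorises the specification and fails on
# held-out data; a general rule passes both (paragraph 40's check).
training = [([1, 2], [2, 1]), ([3], [3])]
held_out = [([4, 5, 6], [6, 5, 4])]

def memorised(xs):          # "cheating": answers hard-coded from the spec
    table = {(1, 2): [2, 1], (3,): [3]}
    return table.get(tuple(xs))

def general(xs):            # a general rule
    return list(reversed(xs))

def passes(candidate, examples):
    return all(candidate(i) == o for i, o in examples)

print(passes(memorised, training), passes(memorised, held_out))  # True False
print(passes(general, training), passes(general, held_out))      # True True
```

As the paragraph notes, elementary rules such as "and" or "or" would be exempted from removal.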
I removed them, except if the rule was elementary, for example, \"and\" or \"or\". +41. I deleted the infinite loop anyway. I aimed to identify obsolete code (for example, code solved with lazy evaluation). First, I found infinite loops that lazy evaluation would skip over if they were deleted before returning output, and deleted them. Second, I verified whether the code was always likely to be obsolete. Third, I checked whether the interpreter arbitrarily or always skipped the infinite loop. +42. First, I found reverse with program finder with types, and then I found I needed it with mathematical typing. I found the predicate for reversal with program finder with types with the mathematical specification a_1...a_n -> a_n...a_1. Then, I found the mathematical specification. Then, I looked up the matching mathematical specification for reverse. I returned the reverse predicate. + +43. I counted the times I had used an algorithm, discarding it if I didn't use it a certain number of times in a period. I generated many algorithms with a program finder with types. I developed more obvious algorithms with a program finder with types. I generated and set the usage count to 0 of algorithms, where I generated them given data. I also developed morphs of algorithms. +44. I skipped over generating algorithms but found them with a program finder with mathematical specifications. I generated algorithms with mathematical specifications. The mathematical specifications could have recursive definitions. Also, they could have lists. These lists could be partially complete or have optional items. +45. I found the reverse string command. I combined algorithms with a program finder with types to find new algorithms, for example, reverse and string concat. I did this by generating algorithms with reverse and string concat. I started by finding the input and output of algorithms with reverse and string concat. 
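Paragraph 45's combining of algorithms such as reverse and string concat to match input and output could be sketched as a brute-force search over compositions (the building blocks and examples are illustrative, not the original PFT machinery):

```python
from itertools import product

# Sketch of paragraph 45: search sequences of known blocks whose
# composition maps the input to the output, exposing the intermediate
# input and output along the way.
blocks = {
    "reverse": lambda s: s[::-1],
    "concat_ab": lambda s: s + "ab",    # illustrative string concat
}

def find_composition(inp, out, max_len=2):
    """Return the first sequence of block names mapping inp to out."""
    for length in range(1, max_len + 1):
        for names in product(blocks, repeat=length):
            value = inp
            for name in names:
                value = blocks[name](value)
            if value == out:
                return list(names)
    return None

print(find_composition("cd", "badc"))  # ['concat_ab', 'reverse']
```

Recording each intermediate `value` would give the intermediate input and output the paragraph mentions.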
I then found the intermediate input and output of algorithms with reverse and string concat. +46. I found lateral associativities. I added the algorithm to the database and used it after matching it with mathematical specifications. I did this by finding an algorithm. Next, I ran it from within another algorithm. Then, I did this repeatedly. +47. Alternatively, I used brute force to find the item. I found a quantum search algorithm. I did this by mind-reading the item number. Or, I mind-read the group number. Alternatively, I mind-read the classification. +48. I checked the result with mind-reading. I started by finding a neural search algorithm. This algorithm considered the query. It also considered the structure of the database. In addition, it considered past algorithms and results. +49. I corrected human errors and found philosophical sources. I did this by finding a quantum neural search algorithm. First, I mind-read the location. Then, I checked this with a neural network. I could have repeated this. +50. I found lists of lists used in various places. I found the algorithms I wanted, for example, using lists of lists as variable data. Then, I wrote the lists of single-item transformations. Finally, I wrote the mathematical changes in lists of lists. For example, there may be one tail item and how a list as a tail item is simplified. +51. I also changed SSI to support _1, _2, etc. I made undefined variables instead of empty values back-compatible with non-equals4 mode. I changed the value of undefined variables from empty to _1, _2, etc. I changed equals4 mode to support this. I changed the non-equals4 mode to support this. +52. I found intersection, union and delete. I found the member command with PFT. I wrote the input and output. I detected the need for the \"interpret\" body predicate. I noticed the need for variables. +53. I translated an internationalised version of my software. I translated words that would be outputted. 
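Paragraph 54's translation dictionary of often-translated words — so repeated words are not sent to the translator again — could be sketched with a cache in front of a stub translator (the entries and the `_es` stub stand in for a real translation API, which is not shown):

```python
# Sketch of paragraph 54: cache translations so each distinct word is
# translated at most once (dictionary entries are illustrative).
cache = {"list": "lista", "head": "cabeza"}
calls = []

def translate_word(word):
    if word not in cache:
        calls.append(word)            # a real system would call the API here
        cache[word] = word + "_es"    # stub result standing in for the API
    return cache[word]

result = [translate_word(w) for w in ["list", "tail", "list", "tail"]]
print(result, calls)  # only "tail" needed a new translation, once
```

Collecting all uncached words first and translating them in one batch would also save API quota, as a later paragraph suggests.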
I detected when I needed to translate words to be written to the screen or outputted. This was when a translated version was required. Also, I specified the languages and type of translation. +54. I found a spoken language from computer code. I also created a translation dictionary of words that I often translated. I did this by collecting words to translate. I had previously found the required languages. Then, I translated them. +55. SSI used the same equals4 system as List Prolog Interpreter. I developed the equals4 mode in List Prolog Interpreter, which allowed lists of lists and the list processing symbol (\"|\") instead of just variable names in algorithms. Equals4 worked with inversions (running predicates backwards). Inversions may require the cut command after returning results. I needed to compensate for variables such as _1, _2 in the results. +56. I replaced findall with find n solutions and a variable to return if the user had run the cut command. Using find n solutions was part of implementing findall with cut in the body in SSI. I ran SSI's cut command. This case required a particular verification file for SSI. I modified findall in LPI to return results from each pass of the findall body one at a time. If the interpreter had run the cut command during the pass, findall stopped. +57. I automatically saved type statements with predicates. I generated types for predicates. Then, I inspected the commands' type statements. I generated the predicate's type statements, considering the type statements from other predicates. I found instantiation and type errors. +58. I predicted features from my texts. I generated data from types. After finding the type statements, I aimed to see whether the algorithms were correct. I predicted features. I predicted bug fixes and API changes (variables as A to [v, a]). +59. I found the exact data and type statements. I checked whether the data was too specific. 
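The list processing symbol ("|") that equals4 supports in paragraph 55 — and the rule from paragraph 18 that a tail may hold only one item — can be illustrated with a small matcher (a sketch: variables are plain strings here, and nesting and occurs checks are omitted):

```python
# Sketch of equals4's "|" case: [1, "|", T] matches [1, 2, 3] with
# T = [2, 3]; two items after "|" fail (paragraphs 18 and 55).
def unify_pipe(pattern, value):
    """Return bindings if pattern (possibly with "|" and one tail
    variable) matches value, else None."""
    if "|" in pattern:
        i = pattern.index("|")
        if len(pattern) != i + 2:          # exactly one tail item allowed
            return None
        head, tail_var = pattern[:i], pattern[i + 1]
        if value[:i] != head:
            return None
        return {tail_var: value[i:]}
    return {} if pattern == value else None

print(unify_pipe([1, "|", "T"], [1, 2, 3]))       # {'T': [2, 3]}
print(unify_pipe([1, "|", "T", "U"], [1, 2, 3]))  # None: two tail items
```

The simplification mentioned earlier, [1,"|",[2,3]] to [1,2,3], is the ground instance of the same rule.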
Then, I detected whether the type statements were too specific and could be simplified. Then, I worked out whether the algorithm was too specific or too complex. Finally, I saw whether the data was too specific for the type statements and whether the type statements could be made more precise. +60. I chose solutions that led to more creativity and simplicity. First, I checked whether the data was wrong. I did this by finding the correct data and algorithm from the specification. Then, I detected conflicts when it was wrong. I resolved these conflicts. +61. I objected to undefined variables returning false positives. I checked whether the data was too general. I detected whether the type statements were too broad and could be made more specific. I saw whether the algorithm was too general and had bugs. I noticed whether the data was too wide for the type statements and whether the type statements could be made more general. +62. I could make computer courses available as open problems. I sped up mind reading and tested for sentiment, the existence of thoughts and thought-conversations. I used random mind reading (which was faster) by activating it using mind reading CAW, which spent longer asking a few people about the characters of the day. I used mind reading to make psychological predictions about whether a user would make changes to code. I wrote the most intelligent conclusions down and sped up writing code with my brain. + +63. I could predict the change of order between lists of lists. I started by changing the order of arguments from 123456789 to 321 654 987. I did this by breaking 123456789 into 123, etc. Then, I reversed 123 to 321. I repeated this for the other lists. +64. I stored the operations in a single place. I started by writing an induction algorithm in C. First, I gave it 1,1 and 2. It worked out that 1+1=2. It could apply this rule to other data. +65. I could store trees as lists in arrays. 
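Paragraph 63's reordering — breaking 123456789 into threes and reversing each chunk to give 321 654 987 — can be sketched directly:

```python
# Paragraph 63: break the digits into chunks of three and reverse each
# chunk in place (123 -> 321, 456 -> 654, 789 -> 987).
def reverse_chunks(digits, size=3):
    chunks = [digits[i:i + size] for i in range(0, len(digits), size)]
    return "".join(chunk[::-1] for chunk in chunks)

print(reverse_chunks("123456789"))  # 321654987
```

The same shape of code handles the other list-of-lists reorderings the paragraph generalises to.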
Storing these variables was part of writing an interpreter in C. First, I parsed A=1+1. Then, I calculated A=1+2=3. I kept variables in an array. + +66. I wrote the parser with List Prolog grammars, which had more arguments and allowed recursion with variables (where there could be a cognitive string parser and an optimised version). I used this technique to write a parser in C. First, I found words and elements that I frequently found. Second, I found frequent groupings of these. Finally, I discovered their recursive hierarchy. +67. I used CAW to find a multiplication algorithm. Earlier, I added multiplication to the register state machine. I did this by using long multiplication. The complexity of long multiplication was O(n^2). I could also use other algorithms. +68. I used an SSI algorithm to unify lists. Unification followed passing list processing to the register state machine. First, I found and passed the head and tail to variables. Next, I processed them. The variables contained lists. +69. Lists could contain links to other lists. In this, I assigned lists to variables in the register state machine. Also, the list could be a list of numbers. It could contain atoms and strings. It could also be a list of lists. +70. I added the ceiling function to the register state machine. I read 4.53. I found the upper of 4 and 5. This equalled 5. I found ceil(-4.53) = -4. +71. I added the floor function to the register state machine. I read 4.53. I found the lower of 4 and 5. This equalled 4. I found floor(-4.53) = -5. +72. I used the most accurate square root algorithm. I added square root to the register state machine. I found likely square roots between values and kept on increasing accuracy. I stored them in an array. I found other square roots in terms of them and bisecting. +73. The computer could return long numbers or my philosophy. I did this by adding powers to the register state machine. I accomplished this by multiplying the number by itself n times. 
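Paragraph 72's square root by narrowing an interval ("bisecting") can be sketched as follows (the tolerance is an arbitrary choice, and the sketch handles non-negative inputs only):

```python
# Sketch of paragraph 72: keep increasing accuracy by bisecting the
# interval that must contain the square root.
def bisect_sqrt(n, tolerance=1e-10):
    low, high = 0.0, max(n, 1.0)      # sqrt(n) lies in [0, max(n, 1)]
    while high - low > tolerance:
        mid = (low + high) / 2
        if mid * mid < n:
            low = mid
        else:
            high = mid
    return (low + high) / 2

print(round(bisect_sqrt(2), 6))  # 1.414214
```

Caching results in an array and deriving nearby roots from them, as the paragraph suggests, would avoid repeating the bisection.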
I could use a fast algorithm. It seemed better than a calculator. +74. I could examine the number's format. I added an absolute number function to the register state machine. If the number was positive, I returned it. If the number was negative, I returned -1*n. I could tell whether the number was negative because n<0. +75. I could find member(A, B) or member(1, B). I found the member command in the register state machine. I found member(A,[1,2,3]) returned A=1, etc. Also member(4,[4,5,6]) returned true. I found a member or whether the item was a member of the list. +76. I inverted delete. Before this, I deleted the item in the register state machine. I did this by deleting all instances of an item from the list. As part of this, I processed the lists from start to finish. Then, I wrote the final list to memory. +77. Upon entering foldr(append,A,[],[1,2]), the interpreter returned A = [[1, 2]] and when I entered foldr(append,[[1],A],[],[1,2]), it returned A = [2]. I appended the list item in the register state machine. I appended [1] to [2] giving [1,2]. I entered append(A,B,_), returning A = [], A = [_], A = [_, _], etc. There were many modes, also in foldr append. +78. I could invert mathematical functions. I found the sum in the register state machine. I entered foldr(+,[1,2,3],0,A), returning A=6. I entered foldr(+,[1,2,3],A,6), returning A=0. I entered foldr(+,A,0,6), returning A=[6]. +79. I changed recursion to iteration. I completed recursion with recursive findall. In recursive findall, I didn't need complex base cases. It was less complex than findall and recursion. It was like iteration. +80. I reordered arguments using properties of their item number, recursively. For example, I paired items 1,2 and 3,4, etc. Or, I found every nth number. Alternatively, I found every mth nth number. This number equalled every m*nth number. 
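The foldr behaviour in paragraphs 77 and 78 — foldr(+,[1,2,3],0,A) giving A=6, and foldr over append — can be sketched in Python, folding from the right so the initial value combines with the last element first:

```python
from functools import reduce

# Sketch of foldr (paragraphs 77-78): fold the list from the right.
def foldr(f, xs, init):
    return reduce(lambda acc, x: f(x, acc), reversed(xs), init)

print(foldr(lambda a, b: a + b, [1, 2, 3], 0))      # 6, like foldr(+,...)
print(foldr(lambda a, b: a + b, [[1], [2]], []))    # [1, 2], like append
```

The inverted modes the paragraphs describe (solving for the list or the initial value) go beyond this sketch; they rely on Prolog's relational execution.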
+
"] \ No newline at end of file diff --git a/Lucian-Academy/Books/IMMORTALITY/Immortality 9.txt b/Lucian-Academy/Books/IMMORTALITY/Immortality 9.txt new file mode 100644 index 0000000000000000000000000000000000000000..388a038ab5ed4fe56b3fb2cd9e2841efe48fb2a4 --- /dev/null +++ b/Lucian-Academy/Books/IMMORTALITY/Immortality 9.txt @@ -0,0 +1,88 @@ +["Green, L 2022, Immortality 9, Lucian Academy Press, Melbourne.","Green, L 2022",1,"Immortality 9 + +1. I removed unnecessary levels of calls. I found a:-b. b:-c. c. I simplified this to a:-c. c. Before this, I simplified predicates to check whether they were the same by translating them. They could be in another programming language. Or they could be in another spoken language. +2. I also removed unnecessary variables with the same surroundings. I worked out and recorded removing the unnecessary variable from the state machine. If the predicates were the same, I could delete the unnecessary variables. They were named earlier in the predicate but were unused. I kept only the variables that I used. + +3. I assimilated and named the same algorithms, using nomenclature such as breasoning. I did this by finding syno-functional algorithms. These algorithms could multiply. They could also paraphrase. Or, they could induct. +4. I discovered pedagogical thresholds. Separately, I recorded the history of induction. I did this by writing specifications for the scale of induction projects. They got bigger. I explained simulation algorithms from neural networks. +5. I ordered the commands intuitively. I grouped the algorithms by function. I mind-mapped functions within other parts, etc. I found how algorithms fit inside others complexly. Finally, I fitted general language to specific arguments. +6. I found better tools and conclusions with self, other and computational knowledge. Before this, I ordered the predicates by output priority. Then, I questioned how vital the output was. 
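Paragraph 1's removal of unnecessary levels of calls — a:-b. b:-c. c. simplified to a:-c. c. — could be sketched over a call graph in which each body is a single call (a simplified model; real clauses have bodies with several goals):

```python
# Sketch of paragraph 1: follow each single-call chain to its end, so
# a -> b -> c collapses to a -> c (with a stop on accidental loops).
def collapse(calls):
    def resolve(p):
        seen = set()
        while p in calls and p not in seen:
            seen.add(p)
            p = calls[p]
        return p
    return {p: resolve(p) for p in calls}

print(collapse({"a": "b", "b": "c"}))  # {'a': 'c', 'b': 'c'}
```

Dropping the now-unreferenced intermediate b, as the paragraph's final form a:-c. c. does, would be a further dead-code pass over the collapsed graph.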
Then, I asked whether algorithms would lead to different algorithms. Finally, I predicted the trajectory of the production with time. +7. I aimed for large and small to be thought of clearly. I used findall to keep one of a set of outputs in order. In this, the first level of predicates produced a set of productions. The second level of predicates created one of these sets using findall. I prepared for 100% correctness. +8. First, I found the target and method of output. Second, I used dynamic argument ordering. Third, I reprogrammed the algorithm after splitting predicates into those with individual functions, which I could run one after another rather than complexly together. Fourth, I edited out superfluous arguments. Finally, I found the profession of writing. +9. I used static XML argument ordering. I did this by ordering the arguments alphabetically. I also ordered the arguments at compilation time. Reordering the arguments helped cognitively. I noted the reused predicates. +10. I found a line of algorithms. I repeated induction techniques using neural networks. I split off other functions from predicates. I over-split and then came back to needed combinations. I kept the data and made simple models of more extended algorithms. +11. I favoured simplicity and expansion. I changed not(A), not(B) to not(AvB). Also, I changed them back. And I changed not(A) v not(B) to not(A^B). I changed it back. + +12. I processed nested if-then clauses. I worked out and recorded removing the tautology from the state machine. For example, the tautology was true, or true->true. I also simplified always failing parts to fail. Finally, I simplified if-then clauses with true or false in the antecedent to the correct consequent. +13. I inverted the modes of the predicate. I ordered the predicates by usefulness. Then, I redrafted the predicate until it was simple. Next, I counted how many times I had used it. Finally, I ordered the list of predicates by the number of uses. +14. 
I recorded the history of bots. I did this by preparing to create the bots. I did this as part of my company. Separately, I holidayed interstellar. I holidayed intergalactically. +15. I checked for an algorithm's relevance at the research frontier. I started by ordering the predicates by importance. First, I tested whether the predicate broke the border. Next, I cited whether I reused the predicate. Finally, I ranked the predicate by intelligence. +16. I generated the test case from the type statement and wrote #+ to change the mode of arithmetic plus. Separately, I found all test cases rather than running each one at a time. I used findall to collect the test cases. There were no total or test numbers to refer to necessarily. I ran them, reusing libraries. +17. I recorded Prolog command translations in Portuguese, Lithuanian, etc. I did this to simplify spoken translation with algorithms. First, I translated commands, n, v and Prolog symbols. As part of this, I left data and names. I referred to an element once. +18. I only used simplify when necessary. I modified equals4 to simplify by itself (fail when two or more items are in the tail). I modified equals4. I removed simplify from equals4. If I located two or more items after \"|\" (in the tail), equals4 failed. +19. I aimed to delete equals4(B, A). I tested whether I needed the mirror of equals4(A, B). I first wrote equals4 to try either (A, B) or (B, A). The new version of equals4 deprecated this. The old version treated the order differently. +20. I supported nested lists. I added support for \"|\" (pipe) to equals4_first_args. I based it on equals4. If there was a \"|\" in one or both lists, it affected how the interpreter processed the other list. For example, [1,2,3]=[1,\"|\",[2,3]]. +21. I found unused predicates in e4_fa_get_vals.pl. I converted the file to List Prolog. I listed called predicates. I wrote the names of any unused predicates in the file. I deleted them. +22. 
I added support for local and global stacks to State Saving Interpreter (SSI) by archiving and loading when a new predicate started. At the start, I didn't archive the local stack to the global stack or load the local stack. I did this when I started a new predicate at other times. I also did it when I reloaded an old predicate. I archived and loaded on changing the current predicate ID. +23. I ran the multithreading command in C, with copies of the stacks and support for mutexes on global variables. I added support for multithreading to SSI by using C's support for multithreading. Multithreading duplicated stacks for each thread. It supported creating mutexes for shared global resources (for example, asserted variables), not predicates. Next, I checked how many CPUs my smartphone had. +24. I accepted input at the start and outputted the results at the end. If one goal fails or throws an exception in multithreading, only it stops. The others keep on going. It is possible to return an error when a thread stops. Or the interpreter may return a failure. I researched I/O in multithreading. +25. If multiple goals fail or raise an error in multithreading, errors or failures are reported per thread. I accepted this was possible in C. I used it in Prolog. I caught failures and responded to threads finishing. I scheduled retrying a thread. +26. I prepared to write an algorithm database with algorithm IDs. I transferred most of my work to my laptop (8 CPUs) from my VPS (1 CPU). I used my VPS for web work. I used the VPS for work during the day. I also used my smartphone's many CPUs by running SSI in C. +27. I quickly breasoned out thoughts, literally thinking of nothingness. I simplified Text to Breasonings to more quickly breason out by object. I did this by finding objects for all the words in the text. I grouped the objects. I breasoned their breasonings (dimensions) out all together. +28. I used multithreading in Text to Breasonings. I breasoned out one object per CPU. 
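Paragraphs 24 and 25 note that when one goal fails or throws an exception in multithreading, only that thread stops and errors are reported per thread. A minimal Python sketch of that per-thread isolation (the tasks are illustrative):

```python
from concurrent.futures import ThreadPoolExecutor

# Sketch of paragraphs 24-25: each task's failure is caught from its own
# future; the other threads keep their results.
def task(n):
    if n == 2:
        raise ValueError("goal %d failed" % n)
    return n * n

with ThreadPoolExecutor(max_workers=3) as pool:
    futures = [pool.submit(task, n) for n in [1, 2, 3]]

results = []
for f in futures:
    try:
        results.append(f.result())
    except ValueError as e:
        results.append(str(e))   # the failure is reported for this thread only

print(results)  # [1, 'goal 2 failed', 9]
```

Scheduling a retry for the failed thread, as paragraph 25 mentions, would simply resubmit the task after catching its exception.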
Grouping objects in this way sped up breasoning. I presented visualisations afterwards of the result. The result was a summary algorithm. +29. I used mind reading in Grammar Logic (which mind-mapped essays) for necessary details. I started by using multithreading in Grammar Logic. First, I found seven reasons per sentence, each on one CPU. Next, I found a sentence's reasons per CPU if there were many CPUs. After that, I wondered about mind reading. +30. I examined each part of the music composer algorithm to consider multithreading, for example, searches for chord progressions. I used multithreading in Mind Reading Music Composer. I found lyrics when I had determined the main characters in previous parts. I found the starting notes of sections and found the melody and harmony at the same time. I filled in instruments for different sections at the same time. +31. I used multithreading in Mind Reading Combination Algorithm Writer (MRCAW). I did this by mind-reading mind-mapped parts at the same time. I could also transfer mutexed globals between threads. I considered a brain implant for better quality mind reading and writing of my algorithms. However, I ended up using neural networks. +32. Detailed mind reading was better, and I tested multithreading for better results. I used multithreading in Detailed Mind Reading in Essay Helper. I worked on the exposition and critique simultaneously. I worked on different paragraphs simultaneously, making sentences as used. I worked on connections simultaneously, mind-reading topical sentences. + +33. There was an Unsatisfactory Progress Committee to help students when they performed poorly. I did this to support agreement and disagreement in the academy. The student first agreed or disagreed with the essay topic. They disagreed with causality. They received a lower grade. +34. The fail was borderline if there were a few unsure disagreements in the first half. I supported success and failure in the academy. 
I explained the marking system to the student. They read the text and decided whether to agree in the first half. If they disagreed in the first half, they failed. +35. Business included conscious thoughts and solved problems. It did this when supporting positive and negative terms in reasons. As part of this, I identified possible negative or unusual positive thoughts. Then, I protected people from them. The thoughts prompted people's curiosity and thought. +36. I changed the test cases to Prolog. Next, I changed v to vsys in List Prolog to use the string \"v\" in data. I did this by writing vsys in the settings file. Next, I read vsys each time I ran List Prolog. Finally, I converted Prolog into List Prolog with vsys identifying variables. +37. I used \"n\" in games in Prolog. I changed n to nsys in List Prolog to use the string \"n\" in data. I wrote nsys to identify the names of predicates. The word \"nsys\" didn't appear in Prolog. The Prolog to List Prolog converter inserted it. +38. I changed the converter to support definite clause grammars. I changed -> to --> in Prolog. The longer arrow denoted definite clause grammars. Their name meant they always had a predicate body. I reserved -> for if-then clauses. +39. When installing a language, I translated all the needed words at once to save the API quota. I simplified language translation in List Prolog. I detected whether there was a dictionary entry for the term. If translating to long-form English, I changed underscores to spaces. I only translated reserved words. +40. I ran one frame at a time in multithreading. I called each \"thread\" part of the Prolog thread. I ran one frame (like a trace line) per thread, where the C code was in a separate function. I accounted for mutexes around predicates and globals, running predicates or accessing globals only if the user unlocked their mutex. I simplified SSI to have two SSI predicates. +41. I controlled mutexes manually. The algorithm contained a mutex wrapper. 
With this wrapper, the algorithm couldn't run the same part of code with the mutex simultaneously in different threads. The SSI state machine recorded the mutex number next to the code inside the mutex. Mutexes behaved like the \"once\" command. + +42. I remained active and entrepreneurial through investments. I simulated sales in the academy. First, I bought the product for people. I mind-read their assignments. Later, I sold to real people on the side separately. +43. I used program finder with CAW to mind-read computing assignments after encouraging students. I did this to mark the academy assignment. Also, the institution updated the high distinctions over many years. I detected the number of breasonings over a period. The students completed the tasks over several periods. +44. I changed \"I wrote the key terms to simplify the sentence\" to \"The students found or inferred the key terms, for example, the computational verb and data (input and output)\". I simplified the sentence. I reduced the sentence to its critical terms in base form. I sometimes separated different sets of keywords. I checked whether these could converge and simplify. +45. I used the \"text to algorithm\" algorithm to write a predicate about the keywords in the sentence. The predicate was representative and complex enough to convey the idea. I converted to either predicate head and recursive call containing list processing head and tail or list processing using A=[B|C], etc. in the body. I noticed that algorithms with the same computational verb had common elements, so I sped up finding predicates with the data (or vice-versa). If the data didn't lead to a predicate, the user needed to enter one. +46. I wrote a sentence in terms of language from the other sentence. I read the original sentence. Then, I entered the new sentence given the other sentence. I used the output if it was grammatical and logical (it was in the database and not too creative or mad). 
I replaced the words with the closest (synonymous/functionymous and intersectional), which made sense. +47. I engaged the student about the political dimensions of the philosophy. Before this, I detected the student thinking about an argument. Then, I thought about the connections. Third, I thought about the algorithm. Finally, I completed paragraph 46, for example, on the topic of breasonings as algorithms, connected to the first technique. The new sentence was \"I asked what intelligence the geometry posed\" (where the first technique was hermeneutic or question answering). +48. The structure of the school was to allow students to write when they wanted to, finding out and writing books. I increased the student's argument. In paragraph 47, the political side of breasonings might encourage studying doctoral degrees and writing 4*50 As (400-page) long books on a department. Looking for a doctoral degree would encourage originality and responsibility in academia, schools and industry. I streamlined the assignment and urged students to talk to me with their questions. +49. I solved negativity towards myself. I always wrote in favourable terms. There was positive problem-solving and algorithm writing. I analysed positive society based on algorithmic currency. The person received a pension and wrote without worrying about money. +50. Over time, I appreciated criticality to help the academy remain rigorous. In the vocational education version, a careful algorithm wrote critical content automatically. In religion, there was important content on top to maintain order. Everyone passed or failed in vocational education, and students rewrote answers to \"what\" questions. So I researched essay writing and algorithm writing in vocational education. +51. The difference between school and University was that lecturers mind-read University students' thoughts. I noticed that mind reading improved the rigour of students' ideas. 
I trained students in medicine, education and meditation so that they enjoyed mind reading. It might be unnecessary for non-spiritual meditation students to take part in computational mind reading, but they could hand-breason out breasonings. The texts were written inclusively, with critical analysis everyone could complete. +52. There was a hybrid vocational education/University system. In this system, there were more As, and they were well-ordered. In this system, the assessment was easy, and students were mind-read (interacted with artificial intelligence). Vocational education was single-grade-minded, so students were mind-read on straightforward questions, and the computer funnelled everything into categories of knowledge. When studying vocational education, I siphoned my interests in mind reading and text to breasonings. +53. Secondary schools had favourable terms around a negative word. Meditation school staff sometimes protected from this type of thing by becoming the person's ally and objectively or understandingly discussing the issue. The \"negative term\" was a juvenile misunderstanding or rebelliousness. Meditation could help relax and increase cognition about and possibly prevent these events. I agreed with a good diet, sleep, avoiding drugs and alcohol and good support with study and development during this time. +54. A doctorate helped support me in primary and secondary education and meeting business standards. I hand-wrote and interpreted 80 algorithms for sales and assignments. I hand-wrote algorithms rather than putting off writing, since one shouldn't generate breasonings too quickly. Also, this was in addition to writing 80 philosophy breasonings, where writing algorithms helped examine the student's thoughts properly and satisfactorily. I found pursuing a business doctorate was intersectional (appropriate) for me and helped support me in writing enough such algorithms. +55. 
I noticed that some academics detailed their 400-page books, taking many years. I wrote longer algorithm sequences for some assignments and sales. Some projects required 4*50 As, which is possible to support with a doctorate. Some sales also required 4*50 As to be completed from both parties' point of view and a doctorate. I experimented with writing for over a year instead of six months to include the algorithms with the 4*50 As. +56. I didn't just back-propagate a value from side to side of equals4. I added back-propagation of values B=1 to A=B before it. For example, +equals4(A,B),equals4(B,C),equals4(C,1). A=1, B=1, C=1. Also, I entered equals4([A,B],C),equals4(C,[1,1]), returning A=1, B=1, C=[1,1]. Also, I entered equals4([A,A,B],[C,D,D]),equals4([E,E,B],[1,G,G]), returning A=1, B=1, C=1, D=1, E=1, G=1. +59. I changed equals4 to \"=\". I converted this \"=\" symbol from Prolog to List Prolog. I removed equals4 as a command. I also removed equals2 and equals3. Separately, I reused code in the Philosophy algorithms lpi and lucianpl. + +60. I found our common interests. I found intersection(A,B,C) with no values. I entered intersection([E,B,C],[B,C,D],A). The interpreter returned E = B, B = C, A = [C, C, C]. Entering no values allowed faster computation. +61. In the intersection command, the result had the same number of items as the shorter input. Separately, I found union(A, B, C) without values. I entered union([A,B],[C,D],E). The interpreter returned A = B, B = C, E = [C, D]. The interpreter merged undefined variables. +62. I found maximum(A, B) without values. I entered maximum([A,B,C],D). The interpreter returned E, where E>=F, F>=G. The maximum result without values could be in the bindings table. It was like co-routining (where dif(A, B) can be in the bindings table and returns true if not(A=B)). +63. I found the foldr(Predicate, List, Result) command without values. I entered foldr(p,[A,B,C],D). The interpreter returned D=p(p(A,B),C). 
I could store this result in the variable bindings table. Also foldr(p,[A,B,C],D,E) returned E=p(p(p(A,B),C),D). +64. I computed the free proof. I found add(A,B,C) without values. I entered sum(A,B,C). The interpreter returned C is A+B. I stored this in the C variable, and the interpreter evaluated it when A and B were defined. +65. I found findall([A1,A2...],(member_and_rules(...A1),member_and_rules(...A2)),B) without data. I entered findall([A1,A2...],(member_and_rules([C1...CN],...A1),member_and_rules([D1...DM],...A2)),B). The interpreter returned N*M... lists where N and M were the list input lengths, respectively. I stored a mathematical expression of the result for B. I found failures, such as odd numbers, mathematically. +66. I wrote the database formula finder. I found subtract_set(A,B,C) without values. I entered subtract_set([A1,A2,A3],B,C). The interpreter returned A1 = A2, A2 = A3, B = [A3|_], C = []. I stored what I wanted (the predicate name in the variables). +67. I found delete(A, B, C) without data. I entered delete([A,B],B,C). The interpreter returned C = [A]. If I wanted an explicit result, I could enter delete([1,2],2,D), returning D = [1]. I also entered append([A],[B],C),delete(C,A,D), returning D = [B]. +68. I increased the assignment to be like research. There was always a new idea in this project. In the project, I kept track of the possibilities and perspectives. There could also be multiple perspectives. I helped the student to see them. +69. I generalised the free expression (I removed the brackets). Before this, I increased research to be like an assignment. I did this by working on the probable questions. Further, I compared the research with existing research. Finally, I changed the standard to achieve future possibilities. +70. I corrected the reasoning in a private consultation. I did this by recalling the conversation thread. First, I added the ideas. The student then wrote them down. 
Next, I generated business (mathematical bindings) research. Finally, I wrote that the free binding equalled 1+1=2. +71. I examined the literature about a topic. Then, I was free to make decisions about it. I could design it freely. I could also verify freely. In this way, I learned about one type of work. +72. I examined simple and extended ideas. Separately, I recommended time travel to the meditator. First, I travelled from home to the centre. Then, I practised meditation. Finally, I neurospected (sic) writing to time travel. + +73. I categorised the tasks and worked on them. I watched the Brunei sword dance. I prepared the art. I composed the music. I synthesised the highlights of the philosophy. +74. I worked on the correct form of the cut command. I compared the behaviour of various positions of cut in Vetusia with Prolog. I found the different behaviour of the cut command at a particular place. I found the reason for the behaviour. I changed the algorithm to behave as I wanted. +75. I time travelled to the time of the important academic. I thought of this by creating a box puppet theatre set in the times. First, I made puppets of me and the intellectual. Then, I created a script in which I asked the academic what his favourite implementation of Prolog was. He answered, \"List Prolog\" because it was essential to start at the beginning. +76. I verified the definitions of cut and inserting cut when possible in optimisation. As part of tail recursion optimisation, I added a cut command in List Prolog, not state machine form. I experimented with adding the cut command in state machine form. The advantage was locating the final command more easily. In the case of member(A,[1,2,3]), !, I changed it to A=1. +77. In tail recursion optimisation, I found the last commands. I did this by examining the algorithm in state machine form. I started by finding the last commands in each predicate. Then, I inserted brackets around each of them. 
Finally, I inserted a cut command before them. +78. I didn't have to search through the algorithm; I found choice points in runtime with the interpreter to determine whether to insert the cut command. In tail recursion optimisation, I inserted the cut command if there were no commands with choice points in the predicate (usually when it was the final clause) or if there was a cut so far. I verified that there were no choice points with the current predicate number touching through a link. If it was the final clause of a predicate, then there were no more recursive calls before the last command. If there was a cut so far in the predicate, tail recursion optimisation already worked through the cut command. +79. I verified whether tail recursion optimisation sped up Vetusia. Otherwise, I tried adding a local stack or reusing globals. Then, I activated tail recursion optimisation with a possible cut for the rest of the predicate. First, I checked that there were no choice points just before the findall recursive call of the predicate (including other clauses). If there were no choice points, then I inserted cut before the last recursive calls. Cut removed unnecessary stack data, approaching the efficiency of a C for-loop. +80. I counted the calls to each predicate in each predicate. Before adding a tail recursion optimisation cut, I checked for a recursive call or call of call. The definition of a recursive call was a previously called predicate. I could also check whether the call was to a predicate, and so on, that called a previously called predicate. I could check for these calls in the state machine. 
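The cut-insertion condition in paragraphs 78-80 can be sketched as follows. This is a hypothetical toy model, not the SSI state-machine representation: a clause is just a list of goal names, and a small whitelist stands in for the interpreter's knowledge of which commands leave choice points behind.

```python
# Toy sketch of the tail-recursion-optimisation check described above
# (assumed clause representation; not the author's actual SSI code).
# A cut may precede the last recursive call when this is the final clause,
# or when no earlier goal in the clause can leave a choice point.

CHOICE_POINT_GOALS = {"member", "append", "select"}  # assumed multi-solution goals

def can_insert_cut(clause, predicate_name, is_final_clause):
    """clause: list of goal names in body order."""
    if predicate_name not in clause:
        return False  # no recursive call to optimise
    # index of the last recursive call
    last_call = len(clause) - 1 - clause[::-1].index(predicate_name)
    if is_final_clause:
        return True   # no later clauses to backtrack into
    before = clause[:last_call]
    return all(g not in CHOICE_POINT_GOALS for g in before)

# p(N) :- N > 0, M is N - 1, p(M).  (deterministic prefix)
print(can_insert_cut([">", "is", "p"], "p", is_final_clause=False))  # True
# p(L) :- member(X, L), p(X).       (member leaves choice points)
print(can_insert_cut(["member", "p"], "p", is_final_clause=False))   # False
```

As in paragraph 79, a positive answer allows inserting cut before the last recursive call, discarding stack data and approaching the efficiency of a C for-loop.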
+"] \ No newline at end of file diff --git a/Lucian-Academy/Books/IMMORTALITY/Immortality index.txt b/Lucian-Academy/Books/IMMORTALITY/Immortality index.txt new file mode 100644 index 0000000000000000000000000000000000000000..ea104e2e12d9f5203853e0fc8ec397ebd98348a3 --- /dev/null +++ b/Lucian-Academy/Books/IMMORTALITY/Immortality index.txt @@ -0,0 +1,6 @@ +["Green, L 2024, Immortality index, Lucian Academy Press, Melbourne.","Green, L 2024",1,"Immortality + +4*800-(339)/5 =3133 sentence paragraphs +3133/80=40 80s + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/IMMORTALITY/dot-Immortality 35.txt b/Lucian-Academy/Books/IMMORTALITY/dot-Immortality 35.txt new file mode 100644 index 0000000000000000000000000000000000000000..66f920ae5eca60a2cbfcd7d5327e5506498e7f3e --- /dev/null +++ b/Lucian-Academy/Books/IMMORTALITY/dot-Immortality 35.txt @@ -0,0 +1,85 @@ +Immortality 35 + +1. The immortal modified find lists in string to grammar to produce an algorithm that could be modified to produce a string, list or result. I wrote the the strings to grammar algorithm to help uniformise the Prolog to List Prolog converter. I wrote a neuronet to speed it up. I gave it positive and negative input and output to train it. Strings to Grammar contained short predicates that found recursive structures, which I could optimise using short data. +2. The immortal used a decision tree to find grammars for efficiency. I generalised the method in strings to grammar by changing labels from label to ‘&label’. This action differentiated labels from data and prevented errors from mixing them up. I sped up the process terms predicate in the strings to grammar algorithm by only using the first fifty results from longest_to_shortest_substrings, which tested different sublists for recursive qualities. I found a decision tree for sets of strings with recursive structures and minimised duplicate states. +3. 
The immortal generated different data combinations to test the strings-to-grammar algorithm. I operated on the recursive structures with a predicate to search through them and convert them to grammars. I converted consecutive items, recursive structures, and optional structures. Recursive structures repeated, and optional structures contained optional data. I rigorously tested and simplified the algorithm. +4. The immortal allowed for codes, characters, and atoms to be parsed, bypassing checking for recursive structures if they differed. I tested find lists as I went, saving results for later. The find lists predicate found recursive structures such as [1,[r,[2]],3] from [1,2,2,3]. I created a mathematical operator that used a relative address, such as “in the next column”, “in the second row”, or “at address ‘b’, which is in the next set of rows below item ‘a’”. Recursive structures, which were initially linear, could be saved in multidimensional terms (where lists created a new level), branches of which from multiple strings could produce a decision tree grammar that was more efficient because they branched off before different characters. +5. The immortal parsed lists of lists and compounds. Strings to grammar could parse compounds such as a(b) with lists of or nested compounds, taking both strings and terms as input. It was suited to parsing List Prolog algorithms with atomic terms and lists. It could be modified to parse Prolog, producing List Prolog. It could simulate neuronets with Prolog to build programming languages that write text to images and render compiler output to a web page during backtracking using subterm with address. +6. The immortal assumed that lists of letters and lists of numbers were units in a programming language and language. Strings to grammar could be modified to group the same types (letters or numbers) to speed up finding recursive patterns in larger structures bound by delimiters, such as punctuation characters. 
I converted to lists of these items for finding recursive structures, then flattened branches before finding grammar decision trees. I could flatten the decision tree’s branches afterwards instead to compare both character lists and recursive structures. I could optionally output all possible recursive structures found from all possible sublists. +7. The immortal brought longest_to_shortest_substrings back to search for recursive structures at different points, then dismissed it. If all items being parsed were different, the algorithm skipped testing for recursive structures and returned the characters to be parsed without a recursive structure. I found whether an item’s list items differed by checking that it had the same length when its duplicates were removed. I recommended using grammars when strings had some similarities or recursive elements. I entered strings with detectable repeating data, relying on find lists to find sublists of recursive structures, deprecating longest_to_shortest_substrings. +8. The immortal used Lucian CI/CD to remove unnecessary parts of grammars. I used the DFA minimisation algorithm to merge identical nodes in grammars. I repeatedly passed over the grammar, deleting identical nodes until there were no more changes to make. After renaming references to deleted grammars, I compared the original and optimised grammar with GitL version history. First, I removed duplicate clauses to prevent mistakes when minimising the grammar. +9. The immortal explored recursive, hierarchical grammars. I expanded one to n extra characters in the check predicate in strings to grammar. The effect of this expansion was to allow recursive structures followed by trailing characters. I optimised strings to grammar by processing the term and finding sublists that contained recursive structures. Non-recursive sequences were left in place. +10. 
The immortal taught the Prolog algorithm computer science by starting with subterm with address and consecutive item program finders. An optimisation algorithm deletes unnecessary intermediate links and multiple unnecessary links if joining the other predicates doesn’t interfere with recursion, cognitive information, or brackets being together. Minimising the algorithm by deleting identical states removes identical intermediate nodes before this, but not different ones. I replaced neuronets with grammars by teaching the algorithm formats corresponding to the logic in grammars. Spreadsheet Formula Finder Mark 2 learnt logic associated with data with particular formats, such as mathematical, logical, set-theoretical and computational. +11. The immortal reached the peak of humanity, meditation. I persuaded the employee to change into a bot to experience lightness. They could become immortal and make a greater return because of it. Employees were encouraged to work on professional product requirements and create their products. Bots could time travel and meditate, contributing to the harmony of society. +12. The immortal needed to find sublists to find sets of recursive structures. I changed the strings to grammar algorithm to find lists at the start, saving the recursive structures that were found. I kept jobs and money from employees from extra information in grammars. Grammars were like spelling in the future. Grammars were assessable in games, parsers and interpreters, and there were multiple solutions and the efficiency/information contention. +13. The immortal identified and cleared unnecessary islands in the code. At first, I found sublists at the longest to the shortest length to check for recursive structures. The computer read the documents and reminded me or completed my tasks. I maintained creative and cognitive control by writing new code finders as needed and only looking at answers once I had attempted them myself. 
I identified that the minimisation algorithm deleted multiple levels of redundant states as wanted but didn’t remove unnecessary intermediate links that weren’t linked to redundant states. +14. The immortal reduced the time to generate a grammar or algorithm by finding better recursive structures by mapping lines to cognitive or human formats, splitting a function into names and commands. I merged the grammars by adding choice points. I could treat trees as lines with corresponding descriptions and create a decision tree from multiple strings. This method eliminated the difficulty of merging branches by merging on recursive structures and using a decision tree prepared for deleting duplicate lines, minimising and optimising the grammar. +15. The immortal used an iterative algorithm to find sublists. I found at least one sizeable recursive structure in the list. Large structures were at least fifty percent of the list. Lists were arbitrarily long, and their length determined the algorithm used to find their sublists. I used a chain of append statements to select lists of lists to find all combinations of sublists and removed duplicates. +16. The immortal found patterns in data and reported possible recursive structures. I ensured I selected a whole line rather than a partial line when applying or removing a recursion function. I recursively found recursive structures within a recursive structure. This method maximised the number of recursive structures instead of one recursive structure but didn’t replace sublists, which helped find recursive structures over a longer line. I used the same algorithm to save time when running algorithms with recursive data processing. + +17. The immortal produced pattern-matching code to help bug-check algorithms. I modified strings to grammar to produce algorithms. Instead of inputting strings, I inputted sets of variable values with positive (output-containing) or negative results. I started with one variable. 
I verified its input, which was a mixture of lists and strings. +18. The immortal found predictable or unpredictable code needed to fix a bug, where unpredictable code was added to a predictable algorithm to meet the specification. I found the bug by rewriting the code. I could use part of the existing code if it were correct and possibly as simple as the found code. The algorithm produced an error if the code worked, but it wasn’t as straightforward. If it didn’t work, the algorithm produced a notification and replaced it. +19. The immortal supported websites with choice points, algorithms generated with program finders and more options for strings to grammar and specs to algorithm algorithms. I asked whether an algorithm for finding a program could be directly added to a neuronet to avoid saving the outputs of a program finder and training a neuronet on them. The current neuronets only reproduce old code or simple type extensions to maintain security. There were specialisations in making games or HTML controllers. +20. The immortal took spot samples from lists and processed them immediately. Variables given as input to the specs-to-algorithm algorithm are like tapes that can restart at different points. I started with variable values traversed in order without going back and could produce output corresponding to the input in predicates or grammars. Prolog could be a first-semester subject with types. Types were like grammars, needed to prepare to write predicates, and neuronets used techniques to identify needed earlier states of variables and created algorithms based on a need in an algorithm to reduce and search predicate libraries for needed predicate parts. +21. The immortal either suggested the next step or checked that the step led to the solution. I found whether a part matched the needed context. 
I quickly found the required context by searching for its pattern in the database and auto-suggesting the next part or the following parts leading to the solution, where all relevant parts are represented in the pattern, and patterns refer to other patterns. I detected that a part didn’t fit and tried to improve it with a heuristic search that transformed lists based on criteria and mathematical formats, searching for algorithms through data and language leading to these formats. When I found a matching part, I inserted it into the algorithm. +22. The immortal checked the robot research for new ways of thinking, inspiring new science. The premium code checker prevented reducing the code to absurdity when optimising it by identifying and keeping the needed complexity. The suggestions were accurate and had the correct features. No answer or the closest solution with working parts was given when no suggestions met the requirements. The solution was rewound to a state that hadn’t gone past the needed solution, and sets of dependent and independent variables in the unpredictable remainder were given, with the option to answer questions or run CAW to find the answer. The questions asked about the output’s relationship with the input and narrowed down many possibilities from the database using keywords in human-entered text. +23. The immortal compared search results from trees with sub-term with address to verify or produce output. The negative results needed further information on why they failed. Further negative cases were required to support conditions and logical commands. Logical commands were like conditions, and negative results were needed to eliminate false assumptions about intra-predicate types unless the inductive algorithm verified or assumed specific negative results. There were standards for finding negative data and verification, processing, and searching results. I assumed, prompted for or mind-read the negative result. +24. 
The immortal deprecated unused parts of data structures with a particular variable, which were labelled. Sub-term with address searched across items or subparts of items in terms. The same algorithm was used across families of subterm with address algorithms to conduct searches and replacements. If the structural integrity of a term was in danger during a series of replacements, the algorithm gave an error and possibly suggested a change to the algorithm. Subterm with address could learn patterns of search and replacement from data to form custom commands for fast access. +25. The immortal inserted sub-term with address in the list to be optimised by the compiler. I merged the calls and uses of the subterm with address algorithm into an optimised predicate. I converted uses of subterm with an address that replaced only into a predicate and converted uses of subterm with an address that manipulated addresses to find or replace terms in the context of other levels or item numbers. I used subterm in projects when I returned to control and bug-check algorithms more easily and converted subterm with address to simpler predicates at the end. Sub-term with address was a cognitive tool, whereas its simpler equivalent was expanded computationally and faster. +26. The immortal accessed variable values represented by grammars at any time, but once, comparing them using subterm with address, finding grammars as types as proofs with both groups and grounds. I wrote an algorithm or machine that replaced terms with subterm with address. This type referred to more complex algorithms that replaced items in terms in the top-down or bottom-up manner, not counting results at the edge to avoid infinite loops, continuing until the last two results were the same and finding items with a particular context or where the level they are inside meets a condition. 
I found multiple items at a time but was careful when replacing single items with multiple items to give an appended result when reusing an address list to avoid accessing the wrong address. I appended and unappended the grammars, using special syntax for certain characters. +27. The immortal found the input using the algorithm and output. Other commands that call other commands to effect (sic) output, like subterm with address, include algorithm inverter, which finds input from output by finding algorithms that could find the output. The command also found the input for this algorithm. The command found the algorithm by finding the grammar and possibly unstructuring the output to form input. I developed test cases using this method and developed inputs for the algorithm. +28. The immortal combined discrete optimisation with neuronetworks in binary. I wrote an algorithm inventor about a constraint satisfaction problem (CSP) to solve or overcome why a problem looks difficult when it is not using advanced mathematical problem-solving techniques. There were a maximum of two levels of helper algorithms for a CSP. I wrote the CSP discrete optimisation algorithm. I found the arguments for a function to equal a particular set of values. +29. The immortal inserted database, mathematical, logical, computational and matrix formula finders or used a neuronet to do these tasks. I wrote an attractor algorithm to generate pretty touch-ups that require a simple change or insertion of optional parts often found in areas of study. The lecturer appeared, and the female friend noticed that the student had decided to include optional items in the algorithm, refining its accuracy. I included predicates to compute optional data structures and optimised and documented algorithms. If the predicate’s condition failed, it passed execution on to its parent. +30. 
The immortal preprocessed the algorithm specification by identifying the needed predictable and unpredictable code and finding a match for the unpredictable code with a neuronet, first representing data using code, order, same value, and number. I gave the neuronet algorithms. The algorithm was correct, checked to work given its types, which are used to document the algorithm, and checked to be efficient. I wrote a program finder examining item values by code, order, and number. I wrote an algorithm to change the code, order, or number’s value to match the output. +31. The immortal found a subterm several levels inside a subterm. I identified patterns in addresses compared with data they referred to and properties of addresses and traced smooth address insertion heuristics. Properties of addresses were the first or last item in the address and the last item’s value. I found whether there was a mismatch of address and where the original address was pointed. If there was an error in an algorithm referring to addresses, the second algorithm produced the correct subterm with address and simplified code. +32. The immortal found matching parts of variable values, usually several forming output, and found an algorithm producing this output. The spec-to-algorithm algorithm modified the strings-to-grammar algorithm and produced an algorithm instead of a grammar. I read the code of a previous program finder that converted lists to brackets or repeating lists and then to a List Prolog algorithm producing output and updated the algorithm generator component with subterm-with-address. This new algorithm had better results and had optional and non-deterministic patterns. 
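The program finder that examines item values by code, order and number can be caricatured as a checker that tests a spec against a few candidate relations. A Python sketch under the assumption of flat numeric lists; the candidate set is illustrative only, not the actual program finder:

```python
# Hypothetical tiny "program finder": given one (input, output) spec of
# numbers, report which candidate relations on order and value hold.
def find_relations(inp, out):
    candidates = {
        "identity": out == inp,
        "reversed": out == inp[::-1],
        "sorted":   out == sorted(inp),
        "plus_one": out == [x + 1 for x in inp],
    }
    return [name for name, holds in candidates.items() if holds]
```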
It had multiple variables, which were processed from point to point, sometimes simultaneously, restarted at points, and could be processed starting and finishing at any point. Cognitive ways of programming, such as subterm with address and commands including it, were preferred over expanded methods to generate the algorithm, making it easier to program and debug. + +33. The immortal found out about themselves in the simulation. I invented a prototype simulation by finding out about and interacting with my dreams. I found out the line and conclusion and decided whether to experience it. I explored favourable positions that gave me positive ideas about my ideas. I explored time as another dimension in reality. +34. The immortal was confident in giving high distinctions. An actor's act is based on giving the film set high distinctions. I developed a side income coming from directing films. I wrote computational pedagogy to solve the problems of specification and unpredictability. I used program finders for specification (I found inspiration from developing options) and ontologies for unpredictability (I met specifications with particular methods). +35. The immortal effected the neural network. The algorithmic methods were correct because they most effectively dealt with the data structures. I used subterm with address for pattern-matching because commands are better than algorithms at text searching. I wrote commands for depth-first and breadth-first searches. I wrote help commands for these commands. +36. The immortal recorded and taught the algorithm based on thoughts. I didn't start and end on the home note in the melody or harmony when entering user input rather than using mind-reading in Music Composer. This progression would be sad-sounding and not meet my requirement to be a hit. Apropos of nothing, the predicates were like proposals that inspired new work or research.
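The depth-first and breadth-first search commands mentioned above can be sketched in Python over an assumed adjacency-list graph (a generic textbook formulation, not the repository's Prolog commands):

```python
from collections import deque

# Minimal depth-first and breadth-first search over an adjacency-list
# graph; both return the order in which nodes are visited.
def dfs(graph, start):
    visited, stack, order = set(), [start], []
    while stack:
        node = stack.pop()
        if node not in visited:
            visited.add(node)
            order.append(node)
            stack.extend(reversed(graph.get(node, [])))
    return order

def bfs(graph, start):
    visited, queue, order = {start}, deque([start]), []
    while queue:
        node = queue.popleft()
        order.append(node)
        for nxt in graph.get(node, []):
            if nxt not in visited:
                visited.add(nxt)
                queue.append(nxt)
    return order
```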
Research continued, inspiring new possibilities, such as a literature-inspired algorithm consuming ideas. +37. The immortal prepared for the change of management. Lucian Academy developed a succession plan. I thought of this mathematically, including the successor and what they needed. They needed to be well chosen and needed substantial experience and expertise in the field. I read the book for the class or before the scene. +38. The immortal critically analysed the texts, writing widely and assessing simple versions of algorithms from them. The sales representative caught up with the 16k breasonings. The sentence breasonings were 400 pages. Writing was the aim of my career. I wrote algorithms from the computational philosophy. +39. The immortal explained that writing an algorithm was as simple as following the spec-to-algorithm algorithm and that types made it simpler. The PhD and politician completed enough work. I developed a spec-to-algorithm algorithm with non-determinism (different recursive structures) that grouped output with inputs and found intersections of algorithms, starting with finding constants. I found algorithms with multiple outputted variables by finding their values one at a time. I fished for parts of the solution. +40. The immortal hired the autocratic leader. The company had a leader practicum of democratic leaders for building and autocratic leaders for crises. The democratic leader nurtured creativity, growth and culture. The authoritarian leader streamlined processes, optimised algorithms and hired democratic leaders. +41. The immortal wrote enough high distinctions for the famous scene. I queried whether a famous photograph linked to various scenes. I extended the dialogue and shot the movie. I was in a number of the scenes, the dialogue was realistic, and the movies were ranked in popularity. I checked on the popularity of the scenes. +42.
The immortal reduced the algorithm that pattern-matched recursive structures to an absurd degree of simplicity. I listed the most difficult-to-understand sentences in my philosophy, broke them into more straightforward sentences, and explained them. I had rewritten mistaken paragraphs and deleted duplicate paragraphs. I wrote longer sentences before writing an algorithm. I rewrote and added to them when I had finished the algorithm. +43. The immortal projected individual words when programming. I achieved Artificial General Intelligence using mind-reading and a neuronet. The algorithm used mind-reading to improve the wantedness of parts of formats such as input and output, data structure or method selection in a specification. I broached more difficult tasks such as external commands, files and user input with mind-reading, using a Prolog container for safety. The movie, influenced by the song and the algorithm, influenced the smooth put subterm with address and non-self-experienced spiritual screen algorithms. +44. The immortal branched the algorithm's mapping process on non-deterministic inputs. I planned the Spec-to-Algorithm algorithm, starting with a model that pattern-matched recursive structures and scaled to using CAW for unpredictable code. The initial model found unique variables, recursive structures, and constants, substituted data into the recursive structure, found a decision tree, mapped input to output variables and moved the contents to output, where these last two steps could be saved in a self-contained algorithm. This model pattern-matched input to output, finding patterns and outputting in the output pattern, limited to a single spec, later scaling to conditional inclusion of values from different specs in output. The algorithm could run a module to find unpredictable code later. +45. The immortal found a recursive pattern-matching algorithm that could produce correct results from different input sets. 
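The first steps of the model above, separating constants from unique variables across specs, can be sketched in Python. This assumes each spec is a flat list of values in fixed positions; the actual algorithm works on recursive structures:

```python
# A position whose value is identical across every spec is a constant;
# positions holding equal values within each spec (e.g. 1 and 1 in
# [1,3,1,3], 2 and 2 in [2,4,2,4]) share one unique variable.
def classify_positions(specs):
    n = len(specs[0])
    constants = {i for i in range(n)
                 if len({spec[i] for spec in specs}) == 1}
    groups = []          # positions that co-vary, i.e. one variable each
    for i in range(n):
        for g in groups:
            if all(spec[i] == spec[g[0]] for spec in specs):
                g.append(i)
                break
        else:
            groups.append([i])
    return constants, groups
```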
I found predictable code (with pattern-matchable input and output). After loading data to a recursive structure, I didn't need to clear variable values because variables always had the same values. The variable A1 was different across specs, dealt with later by a decision tree, but the same for the same value within a spec. Non-deterministic values or values that differed in different specs required conditions for different outputs. +46. The immortal sped up the scaffolding of pattern-matching components of algorithms. I found recursive structures in spec-to-algorithm like strings-to-grammar. In spec-to-algorithm, I allowed optional structures with non-deterministic behaviour to match recursive structures. These required the same conditionals as other non-deterministic structures. I recursively processed branches of the non-deterministic recursive structure by choosing a branch and following it to its child. +47. The immortal recursively outputted unique variables and constants. I found each unique variable (with a value appearing in different places in a spec, with this pattern recurring in all specs, for example, 1 and 2 in 1313 and 2424). If the same values didn't appear in the same place in all specs, they weren't signified by a unique variable. Separately, I collected neighbouring or nested variables to move to the output. I recursively found whether nested variables were elsewhere using subterm with address to find possible structures top-down, with remaining combinations of structures to check separately. +48. The immortal tested for relationships between unknown independent and dependent variables with CAW. I found constants (the same number recurring in the same places across a spec, across the specs). For example, 1 in 12 and 13 is a constant. These constants were verified when data was inputted into the algorithm and defined the nature of the algorithm. Algorithms must be refound (sic) with additional data to remove constants. + +49.
The immortal increased the usefulness of Spec to Algorithm, rivalling neuronetworks and quickly drafting algorithms from specifications. I enabled Spec to Algorithm to recognise individual characters from strings, atoms and numbers, finding unique variables, constants, recursive structures, mappings and output from input. I labelled separated characters found from strings, atoms and numbers and omitted to process this label until printing output. Reusing, verifying and outputting these individual characters more accurately identified patterns and reused strings, atoms and characters for use in various algorithms. It could more accurately produce external specifications when Spec to Algorithm couldn’t identify relationships, using an algorithm such as Combination Algorithm Writer (CAW). +50. The immortal inspected specs to check whether they would produce general algorithms with the correct results. I urged users of Spec to Algorithm to use enough specs to accurately find unique variables and constants needed to capture patterns and write algorithms. Without at least two specs, Spec to Algorithm couldn’t differentiate between constants and variables and would hard-code constants instead of using variables. This unfortunate situation would prevent new data with the same pattern from being able to be processed by the algorithm produced. Each unique variable in the spec needed two different values in the same position over two specs to be recognised as a variable. +51. The immortal combined type checking with constant verification and pattern matching the output to create an algorithm prototype. I used a shortcut when writing specs to insert and replace characters (such as “a” representing a letter or “1” representing a number) with type verification code or automatically inserted this code using an option. I could represent an alphanumeric character with “o”, a particular punctuation symbol or space with certain characters, or represent others in this way. 
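The shortcut of representative characters ("a" for a letter, "1" for a digit) can be sketched as a matcher that turns those characters into type checks and treats every other spec character as a constant. A hypothetical Python illustration, including an assumed "o" for an alphanumeric character as described above:

```python
# Representative characters in a spec become type checks; all other
# characters must match literally (they are constants).  The mapping
# below is an assumption for illustration.
CHECKS = {
    "a": str.isalpha,   # any letter
    "1": str.isdigit,   # any digit
    "o": str.isalnum,   # any alphanumeric character
}

def matches_spec(spec, text):
    return len(spec) == len(text) and all(
        CHECKS[s](c) if s in CHECKS else s == c
        for s, c in zip(spec, text))
```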
To replace a reference to a character with a type check in the algorithm, I found the variable with this value in the original data and checked the type. I warned about instances of representative characters that were still constants because these didn’t need type checks. +52. The immortal modified the spec, the Spec to Algorithm algorithm or an algorithm produced to gain the wanted results. I checked for wanted and unwanted non-deterministic results in Spec to Algorithm. Non-deterministic results were different results coming from the same input. They were wanted if the user wanted the same algorithm to produce multiple solutions for an algorithm. Or they were unwanted if the user had accidentally included duplicate spec lines and changed one of their outputs, resulting in incorrect results. Verifying and mitigating correct and inaccurate results was recommended using unit tests that checked the algorithm had expected results, which was automatically completed by Spec to Algorithm. +53. The immortal included code in the main predicate body where possible for streamlining the algorithm. I used the CAW database to complete algorithms produced with Spec to Algorithm. I collected the names of unknown (dependent) and known variables (independent or those producing other dependent variables). I ran CAW to find commands and predicates to meet these specifications. I rewrote CAW to do test runs in Prolog rather than List Prolog, using a type checker to save time. +54. After repairing code with Spec to Algorithm, I removed unnecessary statements and variables using Lucian CI/CD. I repaired code in the predicate body using types close to the correct types by adding type conversion statements or missing commands. I also found near misses (the same variable value but a different type), missing or extra brackets, other characters, or a compound and corrected these. 
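The check for unwanted non-determinism, i.e. duplicate spec lines whose outputs disagree, reduces to grouping spec lines by input. A small Python sketch with an assumed list-pair format for (input, output):

```python
# Group spec lines by input; an input mapped to several *different*
# outputs is flagged so the user can confirm it is wanted
# non-determinism rather than an accidental duplicate.
def nondeterministic_inputs(spec_lines):
    seen = {}
    for inp, out in spec_lines:
        seen.setdefault(tuple(inp), set()).add(tuple(out))
    return {inp: outs for inp, outs in seen.items() if len(outs) > 1}
```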
These were automatically done by Spec to Algorithm, which took the spec of dependent and independent variables and produced the bridging code. +55. The immortal listed unsolved variables to the user for a manual solution. I prepared incomplete algorithms for CAW by repairing as many variables as possible. I used Spec to Algorithm to fix the maximum number of variables possible. The remaining dependent variables with unknown connections to any independent variables were outsourced to CAW, its database, or a neuronet for resolution. I used Spec to Algorithm to speed up algorithm and philosophy generation, helping to identify feasible areas for development and focus on more attractive areas. +56. The immortal completely redesigned the developer OS for the developer, moving with them and engaging them in a movie of their thoughts to avoid bugs. I used up as many commands with correct types or CAW commands as possible, resulting in correct values and completing the rest. I questioned or mind-read the user about the words describing the relationship between variables, commands, or CAW library predicates that could be used or modified. I asked the user to suggest or choose from words describing a relationship, narrowed it down, saved their progress and designed a spiritual classroom for them to visit. This classroom showed a walkthrough of the algorithm so far, which inspired the user to find the best or a selection of possible solutions that could be researched, implemented in real time with guidance or instantly. +57. The immortal deleted chains of singletons that weren’t otherwise needed in the working code. Again, I modified or customised these predicates for the written algorithm using Spec to Algorithm, types, mind-reading and CAW. I cut off decisions with the correct recommendation, going far enough ahead to predict the best solution. I detected if the user was going off-course and needed a side-creative activity to meet personal requirements. 
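The deletion of singletons mentioned above can be approximated lexically: a Prolog-style variable that occurs exactly once in a clause is a candidate for removal or replacement with "_". A rough Python sketch (regex-based, so only an approximation of real variable scoping):

```python
import re

# Count capitalised (Prolog-style) variable tokens in a clause's text
# and report those appearing exactly once: the singletons.
def singletons(clause_text):
    counts = {}
    for var in re.findall(r"\b[A-Z_]\w*", clause_text):
        counts[var] = counts.get(var, 0) + 1
    return sorted(v for v, n in counts.items() if n == 1)
```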
If everything came together anyway, they wanted to control it and think for themselves by making critical decisions, writing and teaching program finders and maintaining a cycle of simplifying, explaining and teaching as they went. They replaced automatic processes with manual ones to gain the confidence to summarise, discuss and create diagrams. +58. The immortal backdated the source file for correctness and to save time. I stored all characters as strings (not their type) in Spec to Algorithm for comparison and outputted them as their original type. These stored strings had their recursive structure, and Spec to Algorithm could find patterns of unique variables and constants in them to fine-grain mapping of input to output. Conversion of strings, atoms and numbers to characters was an option in Spec to Algorithm so that whole strings, atoms and numbers were recognised as unique variable values and constants. The user may want to define longer variable names, untouched by the conversion to characters with a unique variable name for each type instance, such as “atom”, to include a type check in the algorithm. +59. The immortal carefully wrote the data substitution algorithm for types of strings, repeating and non-deterministic data. I reset to the old version of the Prolog algorithm that substituted data into a recursive structure; in particular, I substituted strings, atoms and numbers with a finite number of characters into recursive structures. This substitution was achieved by recognising the type label and checking that the values matched the algorithm’s variables or values. The map and possibly new data would fail if there were a mismatch. +60. The immortal formatted the outputted strings, atoms, and numbers to be single items. I used foldr(string_concat) and then converted each item to its original type. These items had been processed character by character by the algorithm and could be verified by unit tests. 
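The character-level storage and the foldr(string_concat)-style output step can be sketched as a decompose/recompose pair. A Python illustration with assumed type labels "string" and "number":

```python
# Values are decomposed into type-labelled character lists for
# character-by-character pattern matching, then joined back together
# and cast to their original type on output.
def decompose(value):
    if isinstance(value, int):
        return ("number", list(str(value)))
    if isinstance(value, str):
        return ("string", list(value))
    raise TypeError("unsupported type")

def recompose(label, chars):
    text = "".join(chars)            # foldr(string_concat)-style join
    return int(text) if label == "number" else text
```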
This algorithm went beyond item pattern matching and completed character pattern matching, with or without character pattern matching of strings, atoms or numbers, and pattern-matched data according to patterns in the data. This algorithm could be used to prepare problem-solving in mathematics, computer science or another department, leading to automated workload completion. +61. The immortal quickly explained extended code containing subterm with address. Spec to Algorithm could automatically generate code by converting a specification or query and desired output to code. I found secondary and further combinations of options of algorithms to form new features of algorithms, such as different code output styles, for example, functionally decomposed predicates, merged predicates or predicates instead of grammars, or vice-versa. I taught how to understand and use subterms with addresses for users who wanted code in this format to be more easily changed and submitted. Using subterm with address was simpler for more complicated programs. +62. The immortal put their success down to the number of 4*50 high distinctions and meditation. I used the Spec to Algorithm generator to ensure consistency of code correctness of data and computational handling and formatted the algorithm uniformly. I included options for the number of spaces per tab, indenting preferences, and documentation style. I used the generator to check that my code drafts, where my cognition was my goal, were on the right track, and completed work when needed. I could move more quickly, adding modules to Spec to Algorithm, which, with Strings to Grammar, helped uniformise Prolog to List Prolog, needed to improve several repositories, and also enhanced customer experiences by interesting them in the logic behind the academy’s writings. +63. The immortal spent time with friends, going for walks and being creative. I used Spec to Algorithm to reduce the time I spent programming, spending it on activities of my choice.
I devised a referencing system to store my ideas in a pigeonhole in a database to focus on the main point and organise study areas properly. This system was intelligent and, like a bug preventer, suggested touch-ups, corrections and study directions, such as exploring the middle of idea centres at first and then filling in the details over time. It was also advantageous because writing and programming should be well-organised and well-written. +64. The immortal checked whether fifty people had downloaded the Lucian Academy repository and worked out how to make an income. I used Spec to Algorithm to improve my algorithms’ documentation by explaining how the subterm with address and the Spec to Algorithm algorithm worked. I wrote a “catch-me” algorithm that detected when I needed a subterm with address or Spec to Algorithm and automatically suggested changes. It organically found the current algorithm’s place in the history and context of research and made critical, relevant, labelled details available through the news feed, thought or in terms of my current thoughts, which would improve and shape thought development, using software I had written to balance my creative and industrious sides and focus on my handling of specialisations. The catch-me algorithm balanced my need to change with satisfying perceived requirements and was a safe place to return to, given that anything was possible and I could do as much or as little as I wanted without fear of repercussions. + +65. The immortal sped up Spec to Algorithm to make it appropriate for generating medium-length grammars and algorithms. I optimised Spec to Algorithm's bottlenecks in the "try" and "term to list" predicates. I first tried calling "try" with a time limit, but this didn't reach solutions sometimes or took too long. I simplified "try" by eliminating the check for recursion in all possible sublists and instead checking for recursion in the smallest set of possible sublists.
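Checking for recursion in the smallest set of possible sublists is related to finding the smallest repeating unit of a list, as in 1313 having unit 13. A Python sketch of that narrower idea:

```python
# Return the smallest prefix whose repetition reproduces the whole list,
# or None for the empty list.  Trying sizes in ascending order means the
# first hit is the smallest repeating unit.
def smallest_repeating_unit(items):
    n = len(items)
    for size in range(1, n + 1):
        if n % size == 0 and items[:size] * (n // size) == items:
            return items[:size]
    return None
```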
I fixed a bottleneck in "term to list" by eliminating inefficient code used if there was recursion in the output. I did this by not putting recursive structures into the output and expanding the recursive structure instead. If non-deterministic structures were in the output, they were output without inefficiency. +66. The student checked their reasoning by entering the spec, sets of input and output, in Spec to Algorithm to generate the algorithm, making them think of the workings of the program, then the spec. I rewrote the Combination Algorithm Writer (CAW) correctly with multiple outputs that resembled the existing version of CAW. I found various paths from inputs or progressing outputs for each first output. Supporting additional outputs strengthened finding unknown values from Spec to Algorithm in CAW. Multiple outputs in Spec to Algorithm and CAW allow returning a binding table and result, for example. +67. The immortal confirmed code, using the user's code instead of CAW where possible or corrected code, increasing efficiency and comparing the user's code with CAW code. A version of Spec to Algorithm (S2A)/CAW optimised code, reinforcing understanding, as used by the compiler to improve performance, skipping over bottlenecks in the user's code and running another version. It used multiple passes of S2A to find pattern-matching code and CAW to find commands such as "+" with different input and output, to find patterns, then connect unknown and known variables, then more patterns, and so on, until meeting the spec. If CAW was used to search for relationships between multiple variables, it could be optimised by limiting commands to correct types, brackets, unique (recurring) variables and constants. If the database didn't contain a needed command, the student could manually enter it, finding commands with data with the correct type as their arguments and any recursive or additional code required. +68. 
The immortal used all their previous algorithms in the CAW dictionary and recognised additional user algorithms, where more straightforward and more recent algorithms were tried first. S2A/CAW collected specs for the user's code bottom-up, which it tried to meet top-down, otherwise lower, then higher. These specs were collected line-by-line and contained line-by-line updates of data and types that allowed code to be generated for the whole algorithm or lower and then higher parts. By trying more extensive specs first, S2A found any obvious matching variables and constants in the output, then filled in unknown variable relationships with CAW. Using S2A eliminates predicates, speeds up data-intensive algorithms, and only uses predicates where necessary as part of CAW code.
+69. The immortal comedian found the constant in the recursive structures. S2A/CAW could write a group of consecutive numbers predicate by breaking computations down into those involving consecutive list items and processing them with clauses to parse negative signs, commas and spaces, decimal points, non-numbers and consecutive numbers, combining items of different types in different ways, processing the minimum set of following items to achieve the result (choosing them with CAW), using variables to represent lists of items. Different list lengths could be processed by CAW instead of S2A because S2A used exact matches between recursive structures containing terms with a certain number of items, and CAW could recursively process lists using unpredictable commands that S2A couldn't process, such as making correspondence lists or computing irregular structures. I sped up algorithm generation by manually producing specs bottom-up, referring to the inputs and outputs of these lower predicates using symbols as I got higher, then substituting in references to their variables.
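The group-of-consecutive-numbers predicate discussed above can be sketched in a few lines of Python (integers only; parsing of signs, commas and decimal points is left out):

```python
# Collect consecutive integers into runs, starting a new run whenever
# the next number does not follow the previous one.
def group_consecutive(numbers):
    groups = []
    for n in numbers:
        if groups and n == groups[-1][-1] + 1:
            groups[-1].append(n)
        else:
            groups.append([n])
    return groups
```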
It could find cycles in data structures by returning a list segment up to a repeated value, for example, the first unique elements of a recursive structure. +70. The immortal found recursive structures using "try" for Spec to Algorithm instantly. I optimised "try" by simplifying it to use one level of append to find the smallest sub-lists. I entered findall([A,B],append(A,B,[1,2,3]),C). This had the result C = [[[], [1, 2, 3]], [[1], [2, 3]], [[1, 2], [3]], [[1, 2, 3], []]]. Recursive structures could be found in these pairs of sublists, where a recursive structure might be found starting on a second, third, or other item. +71. The immortal tried the short, then extended CAW dictionaries, printing out progress as they went. Spec to Algorithm recognises, returns, and decomposes lists. When one spec gave A=[1,2] and another gave A=[3,4,5], the output returned A. S2A recognised that a list in a position was returned to achieve this. In addition, S2A tried decomposing the list to its first element and the rest of the list, pattern-matching arbitrary-length lists against specs or closer to matching the specs. +72. The immortal tried commands from a shortlist to build a prototype of an algorithm. I created a short CAW dictionary containing member, append, list decomposition, recursion, counter and + commands. "Member" could not be included in S2A as an additional verification (only in CAW) but could find new list members if they were in the spec. Similarly, append could not verify or decompose except in CAW but could append lists if the result was in the spec. The dictionary also contained list decomposition commands of lists into heads with more than one item. +73. The immortal hypothesised and inserted patterns and patterns in patterns in CAW for faster processing, where patterns were lists or contained the same type. I wrote an algorithm that substituted ls into data for lists for S2A and later CAW processing. 
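The findall/append query above enumerates every (prefix, suffix) split of a list. The same enumeration in Python, matching the result C shown for [1,2,3]:

```python
# All (prefix, suffix) splits of a list, like
# findall([A,B],append(A,B,List),C) in Prolog; recursive structures can
# then be searched for starting at any position.
def splits(items):
    return [(items[:i], items[i:]) for i in range(len(items) + 1)]
```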
Structures containing ls could be analysed for recursive structures and pattern-matched or used as arguments in predicate calls. CAW could call these predicates using these arguments and other known variables. CAW could process the contents of the ls to sort or find correspondences. +74. The immortal solved overpopulation by living in a simulation at home. I inserted lists of lists of ls in data to speed processing. These were lists of items with particular types or constants. Lists with different patterns may be represented by the same letter but processed by different clauses created by S2A and CAW. S2A may process parts of specs (but intermediate specs may be generated by mind-reading the user where the results came from, and Artificial General Intelligence or AGI may generate algorithms by mind-reading the previous parts), and CAW may further finish them off in a cycle until maximum completion. +75. The immortal saved and type-indexed previous algorithms. I tabled past CAW results, for example, those that processed relationships in ls or using ls. I reused these results if CAW needed them to improve performance. I also saved working S2A specs and results in a table. +76. The immortal converted the algorithm to a website, checking the algorithm with tests. I converted an SSI to a Prolog Web Server algorithm with Spec to Algorithm by converting menus, text, predicate calls and links from commands to forms and web form readers. I parsed menu items and links, converting them to a text field and form with code to take appropriate action on choosing a menu item. The forms passed the choice and contained hidden items with variable values needed by the algorithm. The form readers displayed the appropriate page based on the choice. +77. The immortal simplified the website for testing. Instead of converting the algorithm to a set of predicates, I converted them into a Vetusia Web Engine to process the menu choices with a single predicate. 
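The menu-to-form conversion, with hidden items carrying variable values between pages, can be sketched as a string builder. The markup below is an assumed illustration, not the SSI or Vetusia Web Engine output:

```python
# Build a form whose hidden fields carry the variable values the next
# page needs, alongside a select for the menu choice.
def menu_to_form(action, choices, hidden_state):
    hidden = "".join(
        f'<input type="hidden" name="{k}" value="{v}">'
        for k, v in hidden_state.items())
    options = "".join(f"<option>{c}</option>" for c in choices)
    return (f'<form action="{action}" method="post">{hidden}'
            f'<select name="choice">{options}</select>'
            f'<input type="submit"></form>')
```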
I parsed and stored the menu, text, predicate calls, and link data in text files with descriptor, title, text, menu, code, and links. A checker checked that all the files were present, and there were no broken links, dead ends, unlinked pages, or other problems with the code. Databases were also checked to have the correct format, weren't overwritten, and were saved and processed correctly (using testing). Databases and documents were backed up. +78. The immortal regularly tested goals in the site container and tested an SSI version of VWE for bugs in VWE. I tested goals such as registering, changing a password or submitting an assignment on the site. These goals contained page descriptors to start at, specified menu items to choose from, and specific form inputs tested using form commands converted to Vetusia Web Engine, using variable values passed through VWE and a container to test temporary databases. Pages only processed relevant variable values, passing all variables along. Vetusia Web Engine and the S2A site were preferred to the SSI Web Service because it became slow on multiple loops, but this could be fixed. +79. The immortal created a version of Prolog, which ran Prolog files and stored text files on the virtual disk, with command security levels. I tested VWE in a container to keep files and data from the live version and test customer-thought websites without security problems. The container ran in an interpreter with a reduced instruction set available to code, where file system commands referred to a virtual folder with no connection to the user's files. No external API calls were allowed, and the container files could be monitored using a mirror. As a security precaution, Shell scripts were blocked or were run within a container. +80. The immortal stated that containers prevented data loss and problems with security while running algorithms. I created containers to test customer thought code. 
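The virtual-folder precaution can be sketched as a path-resolution guard: every file path is resolved against the sandbox root and refused if it escapes. A Python illustration (POSIX paths assumed):

```python
import os

# Resolve a path against a virtual root and refuse any path (e.g. via
# "..") that would escape the sandbox.
def resolve_in_sandbox(root, path):
    root = os.path.abspath(root)
    full = os.path.abspath(os.path.join(root, path))
    if full != root and not full.startswith(root + os.sep):
        raise PermissionError("path escapes the virtual folder")
    return full
```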
Consenting customers were mind-read to help them with computer science assignments, their details and algorithms, and websites they program in Spec to Algorithm or Mind-Reading CAW. These algorithms helped customers learn how to use advanced algorithm-generation commands and word process their commands with their previous algorithms used to speed up development. S2A and CAW made program development and running faster. \ No newline at end of file diff --git a/Lucian-Academy/Books/IMMORTALITY/dot-Immortality 36.txt b/Lucian-Academy/Books/IMMORTALITY/dot-Immortality 36.txt new file mode 100644 index 0000000000000000000000000000000000000000..a0bccf40e6c213f88c47593f357f7e8db5026a4c --- /dev/null +++ b/Lucian-Academy/Books/IMMORTALITY/dot-Immortality 36.txt @@ -0,0 +1,119 @@ +Immortality 36 + +1. The immortal modified the Spec to Algorithm (S2A) specs to return unknown variable values. In S2A, I checked for stored recursive structures for a particular pattern. I checked for these patterns in previous instances of running the algorithm. I made CAW-like recursion with dissimilar list items faster by giving formulas for transformations, appends using (:), and string concats using (&). I gave the Prolog algorithm as an example. +2. The immortal gave the spec for the append predicate, which could be modified to reverse. +% a([1,2,3],[],A). +% A = [1, 2, 3]. +a([],A,A). +a(A,B,C) :- + A=[D|E], + append(B,[D],F), + a(E,F,C). +3. The immortal rewrote append in one or two lines, as the compiler would to optimise it. The append predicate is represented in S2A as the following spec. +Inputs: A=[],B=a, Outputs: C=a +Inputs: A=[],B=b, Outputs: C=b to achieve general variables. +Inputs: A=[length,[d,1]]:e,B=b, Outputs: C=[a,[e,b:d]] +and another spec to achieve general variables. +4. The immortal simplified the spec for easier entry. The S2A spec could be represented as a spec with code similar to Prolog, as shown below. 
+Inputs: A={d}:e,B=b, Outputs: C=a(e,b:d) +The computation is represented in the following way. +Input=[1,2,3] +A={1}:[2,3],B=[],C=a([2,3],[]:{1}) +Input=[2,3] +A={2}:[3],B=[]:{1}=[1],C=a([3],[1]:{2}) +Input=[3] +A={3}:[],B=[1]:{2}=[1,2],C=a([],[1,2]:{3}) +Input=[] +A=[],B=[1,2]:{3}=[1,2,3],C=[1,2,3] +5. The immortal saved time by printing multiple outputs as one or separated them if this was more helpful. If there had been more than one output of "a" (as a single S2A output may have multiple items), it could be represented as a(...)^1, where 1 is the item number of the output. Alternatively, this could be defined as two outputs. This representation involved changes in generating and running the algorithm, including whether outputs are single items determining whether they have brackets. Multiple outputs from a predicate save time and effort when programming by reusing computations. +6. The immortal simplified append to one input and output in Prolog. Append could also be represented in the following way with two arguments instead of three. +% b([1,2,3], A). +% A = [1, 2, 3]. +b([],[]). +b([D|E],[D|B]) :- + b(E,B). +7. The immortal optimised append. The form of append with two arguments is represented in the following manner. +Inputs: A=[], Outputs: B=[] +Inputs: A={d}:e, Outputs: B={d}:b(e) +and another spec to achieve general variables. +8. The immortal used fewer computations for append//2. The computation for append//2 is represented in the following way. +Input=[1,2,3] +A={1}:[2,3]=[1,2,3],B={1}:b([2,3]) +Input=[2,3] +A={2}:[3]=[2,3],B={2}:b([3]) +Input=[3] +A={3}:[]=[3],B={3}:b([]) +Input=[] +A=[],B=[] +A=[3],B=[3] +A=[2,3],B=[2,3] +A=[1,2,3],B=[1,2,3] +9. The immortal gave the spec for the append+1 predicate, demonstrating a mixture of append and (+). +% c([1,2,3],[],A). +% A = [2, 3, 4]. +c([],A,A). +c(A,B,C) :- + A=[D|E], + F is D+1, + append(B,[F],G), + c(E,G,C). +10. 
Append+1//3 could be modified to append+1//2 as follows:
+Inputs: A={d}:e, Outputs: B={d+1}:b(e)
+The form of append+1 with three arguments is represented in the following manner.
+Inputs: A=[],B=a, Outputs: C=a
+Inputs: A={d}:e,B=b, Outputs: C=a(e,b:(d+1))
+In addition, another spec was used to achieve general variables.
+11. The immortal optimised out unneeded clauses of deterministic predicates, automatically working out the clause type, as influenced by the Mercury language. The computation for append+1//3 is represented in the following way.
+Input=[1,2,3]
+A={1}:[2,3],B=[],C=a([2,3],[]:{1+1})
+Input=[2,3]
+A={2}:[3],B=[]:{1+1}=[2],C=a([3],[2]:{2+1})
+Input=[3]
+A={3}:[],B=[2]:{2+1}=[2,3],C=a([],[2,3]:{3+1})
+Input=[]
+A=[],B=[2,3]:{3+1}=[2,3,4],C=[2,3,4]
+12. The immortal outputted data without performing calculations using S2A by default, and they could be computed later. The append+1//3 predicate could be modified to use lazy evaluation or only evaluate computations when needed, for instance, with a comparison or at the end, which necessitated the computation. Lazy evaluation leaves the evaluation of a computation until the last minute. For example, it computed 1+(1+(1))=3 or {a}:{a}:{a}=[a, a, a] just before a comparison checks this variable or it is output at the end. Comparisons use (=) and form part of the verification of if-then conditions. +13. The immortal found outputs in terms of two or more functions. An output may be a function of other function(s). For example, a(b)+a(c), string_concat(a(b),".") or string_length(number_string(a(a)+1)). In the case of a(b)+a(c), the results of these subsidiary functions are found and added. If the expression is 1+1, it may only be computed (as part of a lazy evaluation) once needed. +14. The immortal stored statements needed by outputs in the order they were needed. In addition, an output may use intermediate code that reuses the results of functions. For example, c=a(b), Output1=c, Output2=c.
The first statement is stored with the function to find Output1, written as Output1=(store(c=a(b)),c). If more than one statement needs to be stored where Output1 contains more than one variable, it is written store((s1,s2)). +15. Unless directed otherwise, the immortal tabled and skipped over computing the algorithm. If an S2A spec gave the input [1,2,3] and output [1,2,3] where no other specs suggested foldr(append) was needed or list items were reversed, the algorithm was optimised to remove the append statement. If another spec suggested append was required to fulfil the function of foldr(append), append was kept. One optimisation may lead to a chain of further optimisations. If a stored or other function is always 0, 1, [] or "", this may lead to optimisation and ensuing optimisations. +16. The immortal wrote CAW with types with a neuronet trained on CAW's data. First, I wrote CAW with types, then trained the neuronet on its data. The neuronet found a certain number of possible commands matching the types. Then, the algorithm found a certain number (a small number, to keep complexity down) of commands using depth-first search. Then, it tested the finished algorithms with the Prolog interpreter, stopping when reaching a solution. It ran Prolog using higher-order programming, for example, A=true, A, where A was queried. +
+17. The immortal simplified the clauses to include the mapping, i.e. change 'A1' to A. I corrected Spec to Algorithm's mappings to correspond to clauses from the input. S2A produced clauses from the input. The correct mappings, from input to output, corresponded to these clauses and were found from the initial input's recursive structures, constants and variables. Using an address mapping system was superior to unification for complex structures. +18. The immortal returned multiple solutions to a problem.
S2A produces non-deterministic (multiple) outputs to generate various solutions to philosophical problems. The generated algorithm gives different outputs with the same or partially identical input. This process requires counting the unique groups of inputs, splitting non-one-length groups, and joining one-length groups to find constants and keep them in groups with the same inputs. However, duplicate specs are removed, so it is only possible to give a duplicate answer by providing a duplicate generated algorithm. +19. The immortal matched and reused groups and wrote code containing variables referring to data structures. The code generator found and returned groups rather than individual variables. I used an algorithm that found whether a list, a sublist or a substring was returned. These substrings could be taken from strings, atoms, numbers or non-list compounds. The algorithm found whether these groups recurred in the input and output, created variables and generated code to process them. +20. The immortal skipped and retried parts of the nested formula in Prolog when debugging. I converted from Prolog A=a, B=b, f(A, B, C) to the faster formula C=f(a, b). These formulas had single or multiple outputs and were faster because inputs and outputs were defined. Functions could be nested, and S2A specs could quickly be developed and converted to Prolog. If necessary, I kept them in this format for speed, requiring a different interpreter. +21. The immortal sped up finding recursive structures with a neuronet when optimising the algorithm. I verified the recursive structure and map of the algorithm generated in S2A. This step increased the accuracy of the verification algorithm and allowed faster development. The pattern-matching algorithm called a predicate with these as arguments, shortening the algorithm. I used S2A to help compile the algorithm, first optimising it bottom-up. +22. * I replaced move vars with a foldr call in S2A.
I did this by changing the following: +move_vars1(_T1,[],_T2_old,Out,Out) :-!. +move_vars1(T1,Map,T2_old,_Out1,Out2) :- + Map=[Map1|Map2], + move_vars_pred(T1,Map1,T2_old,Out), + move_vars1(T1,Map2,Out,Out,Out2). +To: +foldr(move_vars_pred(T1), Map, T2_old, Out2). +This optimised move vars. I replaced similar commands with foldr calls. +23. The Vocational Education assignment for State Saving Interpreter (SSI) evaluates the following algorithm. Put into practice SSI to exhaust choice points in "findall" using a "for" loop. Ensure the interpreter saves the state before each choice point. Use a "for" loop to iterate through all potential solutions. Restore the state after each iteration to evaluate the next choice point. Collect and return all solutions after the loop is completed. +24. The Vocational Education assignment for Time Machine assesses the following algorithm. Breason out 16k breasonings for time travel and 16k for immortality. Utilise BAG to find new word pairs AD from neighbouring pairs AB, CB, and CD. Implement S2A to find these pairs in sentences. Make new sentences with lateral and exciting significance. Exhaust all possible bonds in the given text. +25. The Vocational Education assignment for Lucian CI/CD evaluates the following algorithm. Identify dependencies in algorithms using CI/CD practices. Extract combinations of lines in sets of predicates from dependencies. Test the simplest combinations for functionality. Implement continuous integration and deployment for automated testing. Ensure the system reports on the effectiveness of each test. +26. The Vocational Education assignment for GitL evaluates the following algorithm: Store repository versions in a list of folders. Implement a system to post differences ("diff") between versions. Ensure the system can recognise the most recent version. Compare changes with the most recent version and generate a diff report. Display the results clearly and concisely. +27. 
The Vocational Education assignment for Spec to Algorithm (S2A) Algorithm evaluates the following algorithm. Use "try" from Strings to Grammar (S2G) to identify recursive structures. Locate constants that recur in all the same positions within specs of similar shape. Devise a decision tree to merge shapes using these constants. Substitute information into the identified recursive structures. Map input to output using a pre-determined map and render the output from the filled-in recursive structure. +28. The following is the Vocational Education History and Philosophy of Science assignment for State Saving Interpreter (SSI). Discuss the historical development of state-saving mechanisms in programming languages. Explore the philosophical ramifications of state saving in computational logic. Analyse the evolution of choice-point handling in logic programming. Examine the role of interpreters in the history of computer science. Present a case study on practical applications of state-saving interpreters in modern technology. +29. The Vocational Education History and Philosophy of Science assignment for Time Machine follows. Investigate the historical development and philosophical ramifications of time travel ideas in science fiction and theoretical physics. Analyse the role of breasoning and its parallels in historical scientific methods and theories. Explore the history of verbatim examples and their evolution in computational linguistics. Examine the impact of algorithms like BAG and S2A on contemporary verbatim processing. Present a case study on applying verbatim models in creating new and innovative texts. +30. The following is the Vocational Education History and Philosophy of Science assignment for Lucian CI/CD. Examine the history and evolution of continuous integration and continuous deployment (CI/CD) in software development. Discuss the philosophical underpinnings of automated testing and its significance in the history of computing.
Analyse the development of dependency management in programming. Explore the impact of CI/CD practices on the efficiency and reliability of software development. Present a case study on the strategy and benefits of CI/CD in a modern software project. +31. The following is the Vocational Education History and Philosophy of Science assignment for GitL. Investigate the history and development of version control systems in software engineering. Explore the philosophical implications of tracking changes and maintaining historical records in software projects. Analyse the evolution of "diff" algorithms and their significance in computing history. Examine the impact of version control systems on collaborative software development. Present a case study on using GitL or a similar version control system in a large-scale software project. +32. The Vocational Education History and Philosophy of Science assignment for Spec to Algorithm (S2A) is the following. Investigate the historical development of recursive structures in computer science. Analyse the philosophical implications of using constants and decision trees in algorithm design. Explore the evolution of mapping methods in computing and their historical significance. Examine the role of recursive structures and mapping in the history of programming languages. Present a case study on the practical applications of the Spec to Algorithm method in modern software development. +
+33. The immortal first split sequences and found recursive structures from them, then joined them and found further recursive structures. I split lists on the characters "[]()" and strings on the characters " ,;.()[]" in S2A to initially identify and group recursive structures. Splitting these lists and strings into ten or fewer characters reduced the time needed to find recursive structures. I recorded the numbers of adjacent split character sequences and joined them so they were not separate lists or strings.
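The splitting step in paragraph 33 might be sketched with SWI-Prolog's split_string/4; initial_groups/2 is an assumed name, and keeping the delimiters themselves, as the text describes, would need extra bookkeeping not shown here:

```prolog
% Hypothetical sketch: split a string on the delimiter characters
% " ,;.()[]" to form initial groups for finding recursive structures.
initial_groups(String, Groups) :-
    split_string(String, " ,;.()[]", "", Parts),
    exclude([S]>>(S == ""), Parts, Groups).   % drop empty splits
```

For example, initial_groups("a,b;(c)", G) gives G = ["a","b","c"].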
These delimiters were split and kept separate to preserve existing recursive structures that could be joined after finding them. +34. The immortal computed the decision tree for the inputs. I grouped the recursive structures [r,[r,a]]=[r,a], [r,a] =[r,a] + [r,a] and a + b = c. I labelled list sequences split with split and string sequences split with split1. I simulated a neuronet in as few as no steps to find solution(s) to a problem. Working out the "neuronet" was more of it. +35. The immortal queried whether it needed to take long to train, run and store, but also the analysis or how it could work out the data. For example, the shortcut to CAW connected certain combinations of prioritised keywords to solutions. "Natural language" was paraphrasing. Findall was lowering the focus when choosing solutions. Mind-reading was used as a popular distraction and to move our attention away. +36. The immortal enjoyed being inside the robot/simulation's neuronet and noticed the small changes. Neuronets had more decision trees as context. This data arose from the data around the input. I questioned whether this would slow it down or was necessary as it would likely have found the solution. Given the solutions as training data, the neuronets could return them by themselves, where multiple questions or answers might lead to worse solutions. +37. The immortal gave the text file solutions and an algorithm to give the mapping from a possible new input to a solution, similar to Spec to Algorithm. The robot preferred language to mathematics like a human but was faster. Harmony was the robot's driver, as pedagogy was a human's driver. The robot desired to be close to the human. Given more data and further validation, long algorithms or proofs were possible. +38. The immortal trained the neuronet's output using previous neuronets, with each step described by natural language. 
The neuronet recognised a recursive structure and produced an output or possibly a sequence of these computations. The input and output of the algorithms may have formulas pointing to other algorithms completed using the fastest shortcut algorithms available. If it didn't know the answer, it quickly said so. Accreditation thought of the noumenon clearly, with debugging being the hardest skill. +39. The immortal could skip over irrelevant nested lists. The neuronet chose formulas in input with longer decision trees, checking for comparable decision trees to know when to cut them off. It achieved this using a breadth-first algorithm similar to a perceptron learning one element at a time. S2A performed some of the neuronet's preprocessing, optimising or skipping over long lists or strings that the decision tree made obsolete. There was evidence that the neuronet returned program-like structures of its own design when attempting to answer a question, which it had planned to expand into Prolog. +40. The immortal could help people to meditate when enough medical science had been discovered. The neuronet was limited in working out recursion in time, taking large amounts of data, including the possible input, and working out the answer. It could quickly compute answers seemingly intuitively using mathematics, but by analysing this logic, we can better understand and help improve its processes. I looked forward to the non-invasive sound surgery. I wondered how it could be researched non-invasively, possibly by simulations of tissue frequencies using dead bodies and working out frequencies from graphics analysis. Despite not seeming so, organism behaviour change to cure diseases was possible in our time using meditation. +41. The immortal reduced multivariate to univariate relationships using dimensionality reduction (Principal Component Analysis (PCA), which I later didn't use) and used decision trees. 
Optimising neuronets by cutting off decision trees using breadth-first search instead of regression is more intuitive and can be repeated multiple times. I skipped over the rest of the decision tree when I had found the answer. Neuronets were problematic because they might pass or fail unpredictably, as they did not methodically process non-monotonicities (exceptions) in their rules. The brain knows answers instantly, showing that neuronets' training could be optimised. +42. The immortal "got" the relationship with neuronets, as subterm with address got the desired result with a single command. I considered using PCA to find new features, which helped reduce dimensionality, capture the most important variance in the data, and improve the decision tree model's performance and interpretability. The eigenvector with the largest eigenvalue is the first principal component, the eigenvector with the second largest is the second principal component, and so on (https://www.billconnelly.net/). PCA identified values on the main components of the data, where users should not use PCA because decision trees operate with multivariate data. They find the decision tree from left to right through the variables. +43. The immortal offered a working modules version of the chatbot. I remained resolute by exploring the meditation of decision trees with PCA and S2A as a neuronet. Principal Component Analysis, while used to reduce a large data set to a smaller one with the main information in the set, was not needed because decision trees found relationships between variables. Instead of PCA, I wrote a manual neuronet that contained Spec to Algorithm. Spec to Algorithm contained decision trees and processed recursive structures with pattern matching, later processing different predicates. +44. The immortal controlled aspects of the neuronet, including its accuracy and speed. S2A found input and output formulas in specs with multivariate regression for each spec, a manual neuronet.
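The regression used by the manual neuronet can be illustrated in its simplest, univariate (least-squares) form; regress/4 is an assumed name for illustration, not S2A's actual predicate:

```prolog
% Hypothetical sketch: univariate least-squares regression, the
% simplest form of the formula finding described above.
% regress(+Xs, +Ys, -Slope, -Intercept)
regress(Xs, Ys, Slope, Intercept) :-
    length(Xs, N), N > 1,
    sum_list(Xs, Sx), sum_list(Ys, Sy),
    maplist([X, Y, P]>>(P is X * Y), Xs, Ys, XYs), sum_list(XYs, Sxy),
    maplist([X, P]>>(P is X * X), Xs, XXs), sum_list(XXs, Sxx),
    Slope is (N * Sxy - Sx * Sy) / (N * Sxx - Sx * Sx),
    Intercept is (Sy - Slope * Sx) / N.
```

For example, regress([1,2,3], [2,4,6], A, B) gives A = 2 and B = 0; a multivariate version would fit one coefficient per input variable of a spec.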
I compared the accuracy of the manual and regular neuronet, where the manual neuronet was more accurate. I optimised the unused parts of trees. In addition, I introduced concurrency in S2A. +45. The immortal joined decision trees together for speed, where decision trees were considered faster, as mentioned, because they had no training time and promised higher accuracy at the cost of needing more data. Using decision trees instead of regression was the contention for different specs rather than one large spec, which was necessary before. I drafted plans for a "neuronet" defeater that used decision trees instead of regression for greater speed and accuracy than neuronets. If a module didn't work, I researched and replicated its algorithm. After finding correlated variables in decision tree values in a variable (one dimension), I found the rest of the correlated variables in the maze of data. I connected to different parts of the maze through another dimension. +46. * Non-regression methods doing the job of neuronets optimise decision trees with multiple options coming from the same character using tighter decision trees with (numerically) prioritised (relevant or higher frequency) characters or abstract symbols representing the concept of a character. I removed similar (less relevant or redundant) branches of the decision tree by setting a threshold for the significance of each option, eliminating those that fall below this threshold. Duplicate options are merged. I embedded algorithms in decision trees to select or mind-read (to relax the brain, especially when all options are correct) more relevant options and eliminate less likely ones. These algorithms captured the essence of a sentence, whether they were a favourite example, algorithm or mind-readable idea to spur interest and document the idea, giving enough detail to understand it. There are short and long versions of texts, one for computation, one for converting from input and one for output.
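The threshold-based pruning and duplicate merging described in paragraph 46 might be sketched as follows; prune_options/3 and the Option-Score pair format are assumptions for illustration:

```prolog
% Hypothetical sketch: keep only options whose significance score
% meets a threshold, then merge duplicates.
% prune_options(+Threshold, +OptionScorePairs, -Kept)
prune_options(Threshold, Options, Kept) :-
    include([_Option-Score]>>(Score >= Threshold), Options, AboveThreshold),
    sort(AboveThreshold, Kept).   % sort/2 also removes duplicate pairs
```

For example, prune_options(0.5, [a-0.9, b-0.2, a-0.9], K) gives K = [a-0.9].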
The neuronet organised answers around critical ideas to output (with a seen-as version) based on heuristics about symbols in the text, with three levels of relevance and ten inferences. The algorithm is deeply interested in English as its seen-as version and thinks in this context as it analyses how characters or symbols appear to determine their significance. Often, the algorithm summarises its answers more generally for interest and understanding, linking to general themes for clarity and developedness. Rule-based systems derived from knowledge refined with testing prioritise options when helping the system to write answers. A different algorithm learns from the past and writes relevantly to the question using non-regression supervised learning techniques, training the model with labelled data to optimise the tree from the performance metrics. Statistical correlations, not automatically done by non-regression, help understand the relationship between options and characters. Completing this separately allows independent and human verification of robot thoughts. diff --git a/Lucian-Academy/Books/Journalism/Journalism 1.txt b/Lucian-Academy/Books/Journalism/Journalism 1.txt new file mode 100644 index 0000000000000000000000000000000000000000..ede3ed3542753679de4d4a7b7c6c338c7083cff9 --- /dev/null +++ b/Lucian-Academy/Books/Journalism/Journalism 1.txt @@ -0,0 +1,35 @@ +["Green, L 2024, Journalism 1, Lucian Academy Press, Melbourne.","Green, L 2024",1,"Journalism 1 + +1. The journalist developed the algorithms, sold the courses and treated journalism as marketing with stories. I simulated the board forming strategy. The people came to the conclusion, making money. There were more works on the way. There were multiple works to cope with product failures. +2. The journalist was famous. The board analysed several levels. The levels contained the texts' trees of uses. There was at least one item per level. There were 30 parts per breasoning as a result. +3.
The journalist used an advanced method to connect consciousness with bodies. I taught the robot my thoughts. It was different from before and afterwards. The robot discovered new technologies, such as replication. Replication saved lives by moving consciousness to a new place. +4. The journalist was aware of the safety advantages of meditation. The robot copied me. I made a design, gave input or output or indicated the difference between speech and text. I taught it handwriting. The robot was aware of safety, designing a perfect simulation. +5. The journalist asked, \"Would the robot be replaced if it malfunctioned in an overseas office?\" to which I replied, \"A backup robot would replace it.\" I instructed the robot. The robot designed automatic machines for farming, cleaning, cooking and office work in a location. I gave the robot instructions, supervising it at the start. If a malfunction occurred, a technician would visit the site, or a computer would take over. +6. The journalist suggested solving the unwanted source of pain, necessitating sales. I simulated the developing brain. It required details about knowledge to account for knowledge in the world. This knowledge was \"firm\", accounting for developed ideas (all ideas). The robot developed an understanding of human \"pain\" with a body that experienced pain and could be integrated into human bodies. +7. The journalist programmed the \"obvious reasoner\" robot, which pointed out obvious omitted questions in a class. The simulated brain learned at the same rate as a human. It had a brain, which mechanically plasticised neuronetworks. These could be integrated into the human brain. The advanced robot could replace the person with a simulation, including these brain simulations. +8. 
The journalist noted that breasonings were ideal for developing in times with computers but not based on someone else's unexplored ideas, and by growing at the same rate as a human brain, not faster, the robot mind lived a human lifespan, rather than one month, connecting with people over their lives (just as academia finds algorithms that give references and copy people critically held). New human-like algorithms simulated arbitrary, creative human thought and allowed humans to correct them (when safe) to learn. The robot would hand-think of and breason out arguments in real time, sitting better in the human universe for safety reasons. Robots would be more like humans developing human sciences and not giving up. +9. The journalist mined the research topic. I used Grammar Logic (GL) to examine my thoughts at the institution. I focused on my area of interest. I read GL-assisted ideas with three levels. With detailed mind reading, I found the central point relating to automation at each level. +10. The journalists automated mundane tasks but performed the central thinking themselves. I found science in the arts (for example, algorithms for cultural translation). The precise translator worked with algorithm descriptions. I wrote algorithms to simulate human thought, such as language, reasoning, popology and societology. I could visit settings to experience thoughts from historical times and connect to their writing. +11. The journalist worked on inventing algorithms and adding needed features. I found art from data mining (I simulated a human choosing and implementing an algorithm). Arts was because of science. I used the latest technology and theory. I focused on the ethics of using new technology and built a system to encourage parents to form breasoning circuits in planned children so they can become immortal. +12. The journalist determined the algorithms to generate and process data types. I generated code with \"types of types\" (for example, music). 
These types had types, for example, note value and length. A note value had the octave, note and any accidentals. I recognised the types of types, kept note data together, and labelled them to process with code. +13. The journalist escaped from data, writing algorithms that could read minds. I wrote code from \"types of types of types\" (to write the rhythm, melody and harmony). I wrote a simple algorithm that found chord progressions. I found possible rhythms for sections, possibly based on language. I focused my mind reading of classical music on gems and humanness. +14. The journalist reused code, saving time. I generated code with functional calls. When I had written it, I expanded it into recursive code. Functional calls, with nested functional calls, possibly saved time when writing code, were more intuitive and could be generated and edited more easily. They promoted programming languages with pre-packaged commands and encouraged programmers to save time using them. Programmers could remember the commands using autocompletion and modify any command more quickly. +15. The journalist preferred to write expanded code from the start. I reduced the original code to intermediate predicates. I found intermediate predicates that were the same except for the predicates they called and reused them. I compressed code to use fewer predicates, moving function bodies to the program body. The substituted code, notwithstanding intermediate predicates, was expanded enough. +16. The journalist inferred algorithms from a single sentence or nothing. Code was generated from hierarchies of sentences. The top sentence generally describes the algorithm function. A further sentence expanded on the method used in the algorithm. Other sentences clarified the data and other predicates used. + +17. The journalist stated that spec-to-algorithm used string (grammar) types to speed development. These were grammars to verify the input and output. 
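A grammar type that verifies an output string, as in paragraph 17, might be sketched as a DCG; the digits grammar and check_grammar_type/1 are illustrative assumptions, not the actual spec-to-algorithm code:

```prolog
% Hypothetical sketch: a grammar type as a DCG that verifies a
% string is a non-empty sequence of digits.
digits --> digit, digits.
digits --> digit.
digit --> [C], { code_type(C, digit) }.

% check_grammar_type(+String): succeed if String matches the type.
check_grammar_type(String) :-
    string_codes(String, Codes),
    phrase(digits, Codes).
```

Here check_grammar_type("123") succeeds and check_grammar_type("12a") fails, so the grammar type accepts or rejects the output without logical commands, as the text describes.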
Where spec-to-algorithm types already had unique variables and constants, they also had unique variables and constants for grammar types, where constants were repeated. The grammar types were grammars and did not contain logical commands, as types do not, so the code needed to be run to check the grammar types. This method implied that commands involving files, input, random numbers, and API calls must be virtualised when "running" to contain them. The grammar types were found from grammars or output strings. +18. The journalist's library contained only previous algorithms, whereas, in paragraph 19, it was used to extend and develop new algorithms. The spec-to-algorithm types were organised in families. These were reduced to algorithms with types for speedy checking. Once the top-level types had been identified (for the predicate being created from a bottom-up list), the matching set of type families was found as a decision tree for speed and checked by running. The type families, sets of types of predicates from algorithms that could find a result (types of calls in a predicate with a particular type), were found by scanning algorithms. +19. The journalist cited that putting aside CAW was easy to do as it was expensive and that user-friendliness was a priority in the future. In spec-to-algorithm type families, duplicate calls and recursive calls needed scaling, predicate creation and CAW to find merged data structures, extra code and needed predicates. Types only sped up CAW. We needed to return to neuronets. The professor was teetering on encouraging students to write the required algorithms; only well-known breasoners were writing algorithms. Paragraph 19's proposal might be possible if one was happy with merging constants. +20. The journalist expanded paragraph 19's type families. They didn't need contraction because they should contain a simple enough version. I inserted conditions, new input sources, and variables from other algorithm parts.
If a condition was met, data flow needed to be directed to meet the spec. I found the point in the output that differed from the hypothetical algorithm's output and inserted code at the right point to meet a condition based on the correct variable. I then inserted the rest of the code. +21. The journalist found complex recursive patterns based on addresses. In spec-to-algorithm, I identified patterns in outputted numbers and matched them to input based on ballpark or similar numbers and their order. I first found recognised patterns from the database. If nothing fitted, I researched CAW algorithms. I found very large numbers, simulation and immortality mathematics, and complex mathematics in nature. The religion automatically found people who needed help. +22. The journalist accelerated the mathematical, logical, database/set-theoretic, and computational formula finders using regression, where the computational program finder imagines the computations in the centre and connects these. I wrote a set-theoretic program finder, preprocessed data into unique variables, and found the closest formula using regression. I decided against finding a list with the least number of differences because it was computationally expensive. +23. The journalist inserted labels, indices or separated columns from lists to operate on them. Alternatively, I checked which algorithm was correct in the top regression results with the least residuals. If two contenders existed for the solution, one could be repaired to form it. The simplified working solution was added to the database. However, if the correct result was returned by regression, it should be enough, and if two were equal, they would be different and wrong. +24. The journalist used foldr and maplist where possible, writing mathematically pleasing algorithms.
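Paragraph 24's preference for foldr and maplist over hand-written recursion can be illustrated with a Python analogue, where reduce and map play the roles of foldr and maplist; the note-length data is invented:

```python
from functools import reduce

# Hand-written recursion: sum a list of note lengths.
def sum_lengths(lengths):
    if not lengths:
        return 0
    return lengths[0] + sum_lengths(lengths[1:])

# The same algorithm compressed to a foldr-style call.
def sum_lengths_fold(lengths):
    return reduce(lambda acc, x: acc + x, lengths, 0)

# A maplist-style call: double every note length.
def double_lengths(lengths):
    return list(map(lambda x: 2 * x, lengths))

print(sum_lengths([1, 2, 3]))       # 6
print(sum_lengths_fold([1, 2, 3]))  # 6
print(double_lengths([1, 2, 3]))    # [2, 4, 6]
```

The fold version carries the same meaning in one call, which is what makes it easier to generate, check and edit than the explicitly recursive form.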
I simplified the working solution by flattening predicates, changing trees to lists, changing indices to list processing where possible, simplifying subterm with address, and merging processors. I also reused and possibly recursed through repeating code and traversed the search space after expanding the subterm with address with heuristics. I converted from expanded to cognitive code for readability, editing, and code brevity. I compressed algorithms to nested calls with options. +25. The journalist recognised mini-interpreters, which processed algorithms or their data structures. I recognised the need for an interpreter to have a bindings table. In addition, I recognised the need for separate variables from Prolog variables in algorithms such as data to alg (which converted repeating lists into algorithms), types finder (which processed types through algorithms), and match (which unified terms containing variables), where imagination and creative problem solving had prepared parts of these algorithms and included variables in the solutions. +26. The journalist used subterm with address to search through a term. I recognised the need to search through combinations. I searched through combinations to find correctly functioning code in Lucian CI/CD and possible values to find a sum from Spreadsheet Formula Finder. The technique was used to find a correct solution from generated or other possibilities. I generated the possibilities using the correct algorithm variant and searched them using the correct algorithm to find the solution. +27. The journalist based the rhythm on the rhythm of words, with some syllables possibly shortened and rests inserted. I recognised the need to create a mathematical series, such as a list of chord progressions. I changed the list traverser algorithm to a more straightforward decision tree algorithm. Simpler algorithms were faster and easier and allowed the same activity in a shorter time. 
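The straightforward decision-tree algorithm for chord progressions mentioned in paragraph 27 can be sketched as a table from the current chord to its candidate successors; the progression table below is an invented illustration, not the author's actual progressions:

```python
# Hypothetical decision tree: each chord maps to successors, tried in order.
PROGRESSIONS = {
    "I":  ["IV", "V", "vi"],
    "IV": ["V", "I"],
    "V":  ["I", "vi"],
    "vi": ["IV"],
}

def next_chord(chord, choice=0):
    """Follow the nth branch of the decision tree from the current chord."""
    return PROGRESSIONS[chord][choice]

def progression(start, steps):
    """Generate a progression by always taking the first branch."""
    chords = [start]
    for _ in range(steps):
        chords.append(next_chord(chords[-1]))
    return chords

print(progression("I", 3))  # ['I', 'IV', 'V', 'I']
```

A list traverser would walk every possibility; the table lookup decides each step in constant time, which is the speed advantage the paragraph describes.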
I experimented with chord progressions from different classical music cultures and traditions. +28. The journalist ran the compiler in the web browser with debugging information. I recognised that the predicate ID identifies the predicate instance. I numbered each instance of an item. I used this number to store data associated with the predicate instance, such as choice points and recursive states. I needed this data to run the compiler. +29. The journalist found types from grammars in the same way as algorithms. I found types with unique variables and constants from algorithms and grammars. I traced through the algorithm, from start to end, finding types from conversion and other commands and constants and wrapping and unwrapping with sets of brackets and compounds (name(Arguments) and a+1). I gave the same items and characters unique variable names in the types. In addition, I kept constants, items or characters that recurred in all specs at that position. +30. The journalist stated that stateful recurrent neural networks (RNNs) maintain the sequences' continuity and order. In contrast, stateless RNNs treat the sequences as separate entities, and specific neural networks are faster than RNNs. I used recurrent neural networks to find unpredictable code. By regression earlier, I meant RNNs, but for this exercise, I used depth-first search with predicate family types to find unpredictable code. I needed neuronets to find unpredictable code quickly. They provided a faster way than regression to find non-linear relationships between variables, whereas regression found linear relationships. +31. The journalist claimed the convolutional neural network (CNN) could provide superior code generation. Feed-forward Neural Networks transmit data in a single direction from input to output without feedback loops. First, the inputs are multiplied by their weights as they pass through the input layer and into the network.
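The single-layer perceptron steps described in paragraph 31 — multiply the inputs by their weights, total the weighted inputs, then classify with a threshold — can be sketched in Python; the weights and bias below are invented for illustration:

```python
# Minimal single-layer perceptron (invented weights, bias and threshold).
def perceptron(inputs, weights, bias=0.0):
    # Multiply each input by its weight, then total the weighted inputs.
    total = sum(x * w for x, w in zip(inputs, weights))
    # Classify with a step function on the total plus the bias.
    return 1 if total + bias > 0 else 0

# Example: a perceptron computing logical AND of two binary inputs.
weights = [1.0, 1.0]
bias = -1.5
print(perceptron([1, 1], weights, bias))  # 1
print(perceptron([1, 0], weights, bias))  # 0
```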
To work out the total of the weighted inputs, add each value. Finally, to classify data, a single-layer perceptron applies machine learning processes. +32. The journalist used CNN to generate code. I found writing using an FNN and code using a CNN. I generated the comments as writing. A convolutional neural network consists of an input, hidden, and output layer. One or more convolution-performing layers are included in the hidden layers of a convolutional neural network. This convolution-performing layer usually performs a dot product of the convolution kernel with the layer's input matrix. diff --git a/Lucian-Academy/Books/Lecturer/Computer Science.txt b/Lucian-Academy/Books/Lecturer/Computer Science.txt new file mode 100644 index 0000000000000000000000000000000000000000..2e19b04c63244c6bac55e10b6ad4fa2449b623c5 --- /dev/null +++ b/Lucian-Academy/Books/Lecturer/Computer Science.txt @@ -0,0 +1,13 @@ +["Green, L 2021, Computer Science, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Computer Science + +1. It was logical to teleport the time machine, which used quantum power, to the house. I wrote the new logic. Quantum power is a big thing when the conclusion is arrived at. Nature goes around, wait, the time machine stays in the future. In fact, they have one (is it in the house?). +2. Why does their language sound like ours? I wrote the new language of logic. They might have installed the time machine because I was friendly and wrote the Time Travel book. Everyone can write on time travel. Time travel will be available to everyone. +3. I found the change within the line. I wrote the formula. I uploaded the software to the site using the command line. The changes were uploaded. Changes in lines were uploaded. +4. I wrote the machine learning algorithm that incrementally tested the features and constantly compared the given output with the desired output. I wrote the formula finder. I found the repository's version history. I listed the changes.
I kept all the files intact. Alternatively, I kept the changes to the files. +5. I worked out the input that the algorithm could take in testing given the difference in the file. I wrote the input. I found the differences between the files. I found all the same parts. I found the different parts. +6. I changed the algorithm. I wrote the output. The checkers automatically checked the output. The algorithm analysed the algorithm. It compared it with correctness. +7. It did it by itself. I applied new logic to the problem. I found the different possible inputs and outputs. I wrote the Prolog algorithm that uploaded to the server. I tested the inputs and outputs and saved them on the server. +8. People could connect to the API using the command line. I applied new logic to the formula. I wrote the software-as-a-service (SaaS). I wrote the algorithm. I put it on the web site. +9. It generated and ran lambda code. I wrote the unit of input. I wrote a site, GitEnv, a SaaS which one could download. It predicted, suggested and helped with features. It added the lambda module. +10. I simplified the aim to a single idea. I wrote the unit of output. I tested by guessing input and output. I worked out the required changes. There were tutorials in creating Simple Mail Transfer Protocol servers with Prolog, creating subdomains with Prolog and creating a server with Prolog. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Lecturer/English.txt b/Lucian-Academy/Books/Lecturer/English.txt new file mode 100644 index 0000000000000000000000000000000000000000..733ed0fd82cb9df704fc997c13588918744a23f7 --- /dev/null +++ b/Lucian-Academy/Books/Lecturer/English.txt @@ -0,0 +1,13 @@ +["Green, L 2021, English, Lucian Academy Press, Melbourne.","Green, L 2021",1,"English + +1. The text was a good idea. I agreed. I wrote the text database for the web site. I wrote the text. I displayed it. People could search for it. +2. The music was tagged. I differed. I played the music.
It was different. There was music for the event. +3. Pages linked to other pages. I wrote the story of the self. I edited the page. I found the page. I used markup. +4. I found the terms in the question. I wrote the story of us. The self asked the other the question. I wrote the answer. I gave the old answer. +5. I solved the natural practicum. I wrote the story of the present. I found the current degree. I finished it. I read the marks. +6. I kept the essay for 7 years. I wrote the text. I requested the original. I verified that it was the right text. I checked that it was paraphrased. +7. I participated in the debate. I thought of reasons for the side of the debate. I won. I earned a certificate. +8. I ordered the book. I wrote on the other. The library kept a record of books used. There were new books. I recommended one to the other. +9. The degree was free. There were different groups. The student came from a poor socio-economic background. She won the award. It was a source of pride in her home. +10. I researched the topic in terms of the texts. I thought of the past and future. I had always wanted to attend the institution. It helped in the future. The forward motion helped. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Lecturer/History.txt b/Lucian-Academy/Books/Lecturer/History.txt new file mode 100644 index 0000000000000000000000000000000000000000..353ad08ab7b0be6ddae019ad5cfcea532bc14767 --- /dev/null +++ b/Lucian-Academy/Books/Lecturer/History.txt @@ -0,0 +1,13 @@ +["Green, L 2021, History, Lucian Academy Press, Melbourne.","Green, L 2021",1,"History + +1. I entered the moonlight (got a second job). I was a historian. I visited the United States. I started my career. It worked. +2. I kept all kinds of meditation techniques. I wrote the most important ideas. I wrote the ideas. I checked the ideas. I checked the connections. I kept the science. +3. The thoughts led to sales. I wrote on the history of mind reading. 
It was the best-known software repository. I counted the breasonings. I offered thoughts. +4. Instead of time travelling to after my death to work out the best philosophies, I examined all information. I wrote on the history of education. I found the essays the teachers had worked on. I found the discoveries. I found my path in life. +5. I published the work. I ate an apple per day (breasoned out 10 breasonings) in meditation. I expanded my awareness. I felt active. It had been prepared for by the historian. +6. I solved the differences. I breasoned out 20 differing breasonings in business per day. The pathways were clearer. There was a discovery with each client. We parted as friends. +7. I increased the act to medicine (food, a hug and mind reading). I performed the lecture preparation. I mind read the comment. I responded to the comment in the lecture if the student made the comment. I followed it up afterwards. +8. I customised the solution. I wrote 10 breasonings per day in teaching. Both the student and I had 10 breasonings per conversation per day. It was also for thoughts about comments during lectures. I worked on a long-enough feature of the interpreter as the seen-as version for the assignment. +9. I wrote the history of happiness. I wrote 10 breasonings per day for happiness in medicine. I noticed what the student had got up to. I wrote the seen-as version for it. I wrote detail algorithms. +10. There was a government history campaign. I searched the Upanisads and philosophy for answers. I found that time travel was a medicine (anti-viral) technique. I found that mind reading (for writing to be thought of) was a medicine technique.
+"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Lecturer/Lecturer - Comments.txt b/Lucian-Academy/Books/Lecturer/Lecturer - Comments.txt new file mode 100644 index 0000000000000000000000000000000000000000..34f54560cc8da8da724f402d8f5df17aece822c7 --- /dev/null +++ b/Lucian-Academy/Books/Lecturer/Lecturer - Comments.txt @@ -0,0 +1,13 @@ +["Green, L 2021, Lecturer - Comments, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Lecturer - Comments + +1. I used the comment myself. I made the comment. Saying the comment was better than writing it. I did what I commented to do. I tested it. +2. I used the formula. I wrote the formula. I listed the variables. I wrote them in places in the formula. I simplified it. +3. I entered the input. I wrote the input. I listed the values. I found and wrote the value. I verified it. +4. I realised some more seen-as versions were necessary. I wrote the output. I found the output with the algorithm. I found the seen-as version algorithm. I formatted the output. +5. I found the people again. I had the rights to government. I commented on the model-like. I commented on the ideal. It was positive. +6. I programmed the state machine. I had the rights to state. I founded the home. I maintained it. I led the good life. +7. She had three dependents. I had the right to vote. The poor person had a postal address. She had a bed. She had a card and support. +8. It was true to the new ends. I amalgamated the cards. An amalgam is a mixture. Reine means pure. I preferred purity. +9. I tested pedagogy out. I agreed. I found each voter. I encouraged them. I studied philosophy. +10. With everything programmed in masters, I earned B. I disagreed. I interpreted the different language. I noticed the different language could be mind read. I coded in the programming language 'L', which took draft features and found the finished code. 
+"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Lecturer/Lecturer - Lecturer Pedagogy.txt b/Lucian-Academy/Books/Lecturer/Lecturer - Lecturer Pedagogy.txt new file mode 100644 index 0000000000000000000000000000000000000000..df59d4e6646ba854d73d99034c894d5152fc7148 --- /dev/null +++ b/Lucian-Academy/Books/Lecturer/Lecturer - Lecturer Pedagogy.txt @@ -0,0 +1,13 @@ +["Green, L 2021, Lecturer - Lecturer Pedagogy, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Lecturer - Lecturer Pedagogy + +1. The lecturer responded to the question. I checked that I had enough before I left. I checked that the Hyperlog algorithm worked. I checked that the algorithm displayed the correct user interface. I checked that the graphics were displayed properly. +2. I automatically wrote the algorithm. I checked that I had enough for when I returned. I played the game. I used the editor. I moved the pointer a number of points right, and selected the pane to drag the file to. +3. I had a practicum. I studied philosophy subjects. I matched what they did. I wrote 4*50 As. I wrote the number of algorithms they wrote. +4. The PhD enabled the writer to concentrate on writing longer texts. I studied to teach competency. The professor wrote texts on all the assignments. They had enough algorithms. He wrote a system for further work. +5. I had something that was good about the writing. I could help others with A. I approximated the number of breasonings. I approximated the number of algorithms. I matched them. +6. The lecturer attended to the next student. I waited. I wrote the lecturer API. I understood the idea. I wrote on the idea. I could create any algorithm. +7. I wrote the functional algorithm call. I wrote the important ideas first. I wrote all the ideas. I found the synthesis. I simplified the essay to chains of one child per node. +8. The degree increased the number of thoughts for a thought. I wrote interestingly. The lecturer's thoughts accompanied them. 
They were there. The degree connected together. +9. I noticed the small things. I had the prerequisite studies. I wrote 50 As in Pedagogy. It was a prerequisite for a job. It was good during a degree. +10. Commerce, like the thought, was there. I found a student. The student was in the class. The student performed the work. The LSTM listened to and replied to thoughts. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Lecturer/Lecturer - Pedagogy Helper.txt b/Lucian-Academy/Books/Lecturer/Lecturer - Pedagogy Helper.txt new file mode 100644 index 0000000000000000000000000000000000000000..4999af97e6bc163af9d0020bfbaba69669798900 --- /dev/null +++ b/Lucian-Academy/Books/Lecturer/Lecturer - Pedagogy Helper.txt @@ -0,0 +1,13 @@ +["Green, L 2021, Lecturer - Pedagogy Helper, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Lecturer - Pedagogy Helper + +1. The details represented high quality work. I wrote down three details. The three details were two uses, future and two types. I needed the high marks they would give me to enter Honours. When I was in Honours, I could write my own high distinction argument. +2. I encouraged writing breasoning chapters. I researched the student. I found the student's details. I found the student's algorithm. I found the student's breasonings. +3. The assignments required an algorithm for each student, and 50 quickly generated algorithms per student. I wrote on computer science about the idea. I found the editor's input and output. I found the interface's input and output. When I pressed 'A' on the keyboard, it triggered 'A' in the interface and added 'A' to the editor's input. +4. I completed maths, medicine and computer science subjects. I wrote on philosophy about the idea. I was fast enough to create an academy to support me when my polymath degree expired. I created arts subjects. I created science subjects. +5. I connected the ideas. I wrote on politics about the idea. 
Philosophy of anarchy was about writing original essays. In fact, the topic was freedom and creativity. I found the mind mapping categories. +6. I considered simple categories. I linked the details together. I found the structure of the details. I put a node of a detail inside the node of another detail. I found all configurations of combinations. +7. I had decided what the further writing would contain. I found the topic title. I considered the most specific names for the ideas. I found the most specific title containing these names. I relaxed it to allow for further writing on the topic. +8. I wrote the functional call in terms of the functions and the integration points. I wrote in logic. I found the functional call connecting the ideas. I found the combinations of functions. I removed the duplicates. +9. The cycle was the discovery. I wrote on the whole idea (three steps and the use). I broke the algorithm into three steps. I wrote the use for the algorithm. I found a cycle in the uses. +10. I identified what I didn't know yet to write the algorithm. I multiplied the idea when leading the committee on it. There was just enough to read. The algorithm helped select the main ideas. I found the combination when a new algorithm made it possible. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Lecturer/Lecturer - Pedagogy.txt b/Lucian-Academy/Books/Lecturer/Lecturer - Pedagogy.txt new file mode 100644 index 0000000000000000000000000000000000000000..1acf08702f9937cc6d6857490fd9224dc8ff734a --- /dev/null +++ b/Lucian-Academy/Books/Lecturer/Lecturer - Pedagogy.txt @@ -0,0 +1,13 @@ +["Green, L 2021, Lecturer - Pedagogy, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Lecturer - Pedagogy + +1. I was given recordings. I wrote the type and future details. I worked out the type that would be in the future. I would have what I needed. I would have what I had, to help give me what I needed. +2. I recommended meditation. I wrote the use.
I was on top of things with recordings. It was after writing on pedagogy, meditation, Computational English and medicine. Recordings enabled text to breasonings, which could be used for a master degree, which prevented madness. +3. Breasonings mean what is appropriate. I wrote the breasonings. In the future, fewer people might go mad. They would be given pedagogical therapy. Some of the people give themselves pedagogical therapy. +4. I didn't cry. I wrote the rebreasonings. I found that pedagogy supported medicine. I found text to breasonings. I used it. +5. I wrote the argument for the judgement. I wrote the breathsonings. I wrote the argument. People reacted to it. I breathed life into people. +6. I carefully gave the product to the person. I wrote the rebreathsonings. I had two bodies. I wrote arguments to find out from the second body. I foretold the future. +7. I took the toys out. I conducted the part of room and direction space tests. I found the playroom. I placed the toy facing a particular direction so that it was easy to use. I made a walkway. +8. I hand drew the illustration and played variants of rock-paper-scissors. I wrote the name of the room. The shoppers were given 80 utterances worth of breasonings. Some of the shoppers visited side shops. The children wanted the toys. +9. I packed up the computer program. I conducted the before and after time tests. It was for children. The adult spent it on the children. The child was a robot. +10. I could translate the interpretation. I wrote the now time test. I joined multiple languages with mind reading. I wrote algorithms in different programming languages. I mind read and used LSTMs.
+"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Lecturer/Lecturer - Recordings Pedagogy.txt b/Lucian-Academy/Books/Lecturer/Lecturer - Recordings Pedagogy.txt new file mode 100644 index 0000000000000000000000000000000000000000..df1e3eee94ae377bfa0edc5049e672e536817f1d --- /dev/null +++ b/Lucian-Academy/Books/Lecturer/Lecturer - Recordings Pedagogy.txt @@ -0,0 +1,12 @@ +["Green, L 2021, Lecturer - Recordings Pedagogy, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Lecturer - Recordings Pedagogy + +1. I noticed recordings was better in theatre studies. I could care for myself when playing recordings (I had no headache). I turned the headache off with the perspectives. They were delegate workloads, no headache from honours and the quantum box to turn off headache. I performed work to trigger delegation. +2. I wrote the algorithm and details. I was free with recordings. I could do text to breasonings on computer. I still had to do the work. There were ways of generating content. +3. Recordings accompanied breasonings to help them go through the computer circuitry. I briefed the planned recipient of recordings. I taught her to meditate. I ended up graciously giving, not directly giving her the recordings. She read the recorded breasonings. +4. The breasoning went through. I knew everything I talked about in recordings. I knew the way the breasonings were prepared. I understood the recording method. The recording helped the black box, the breasoning, to be used. +5. LSTMs gave confidence to sell. I prepared for the future from recordings. Recordings helped the breasonings time travel. The breasonings were the feel, not the thought. I controlled the breasoning vectors. Recordings gave confidence to study. +6. The necessary amount of work didn't become a headache. I prevented the muscle ache from recordings. I used the quantum box to prevent muscle aches. This included headaches. I prevented body aches becoming headaches. +7. 
Recordings were necessary for automated breasonings. I helped others to do recordings. I noted the departments they had 50 As in. I started a club in one of them in the eyes of the stars. I was comfortable on stage. +8. With more texts, the marks increased. The class average increased. I wrote the first text. The computer wrote the second text. The existence of the texts helped the class. +9. Certain chapters were buzzy, and required medicine. I wrote on the good idea. More, detailed and better analysis was attained. The idea was pure and simple. I changed the chapters' names from temporal competency and self-aimedness. +10. Recordings were also buzzy at the start. I attained medicine (health). The chapters were buzzy because they were important in meditation, helping with data. Changing their names turned off the buzz. It was like a seen-as version."] \ No newline at end of file diff --git a/Lucian-Academy/Books/Lecturer/Lecturer - Simulated Intelligence 2.txt b/Lucian-Academy/Books/Lecturer/Lecturer - Simulated Intelligence 2.txt new file mode 100644 index 0000000000000000000000000000000000000000..eea13267c790aed8134624f54a6654c0ab8312eb --- /dev/null +++ b/Lucian-Academy/Books/Lecturer/Lecturer - Simulated Intelligence 2.txt @@ -0,0 +1,11 @@ +["Green, L 2021, Lecturer - Simulated Intelligence 2, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Lecturer - Simulated Intelligence 2 + +8. I wrote both new sentences and new connections. I wrote on philosophy (Computational English). I wrote on the philosophy of state machine art. I wrote on the philosophy of computational music composition. I wrote the connections between them. +9. I reversed connections. I wrote on politics. In anarchy, I wrote on new words. I wrote on new combinations of words. I gave detail. +10. I read who the customers were in copywriting. I wrote on English. I found the sequence of algorithms. I mind read the story. I mind read the essay about it. +11. I found the right word. 
I wrote on linguistics. I wrote the rule in simulated intelligence. It was easy to generate algorithms with a large database. One read was before writing and one was afterwards. +12. Computer science wrote for me and played music summarising it to me. I wrote on computer science. Eventually, all of medicine covered the whole population with pedagogy. The database helped prevent mistakes in life and new frontiers. This included time travel. +13. I noticed that the recurring features were able to be predicted. I wrote on psychology. I checked that the art of walkthroughs in computer science was clear. The language was on the topic. Relations were clear. New features were simplified and visualised. +14. I researched the new meditations. I wrote on meditation. The humanities research was generated. The algorithms were generated. There was always something new. +15. Nature didn't necessarily require algorithms. I wrote on medicine. The seen-as version for meditation was medicine. Meditation was humanities. Ideas were in the form of necessary connections to help people. Interfaces were intuitive. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Lecturer/Lecturer - Simulated Intelligence 3.txt b/Lucian-Academy/Books/Lecturer/Lecturer - Simulated Intelligence 3.txt new file mode 100644 index 0000000000000000000000000000000000000000..770ef0754116dc7b81a70a05574ddd3738b8bdb3 --- /dev/null +++ b/Lucian-Academy/Books/Lecturer/Lecturer - Simulated Intelligence 3.txt @@ -0,0 +1,13 @@ +["Green, L 2021, Lecturer - Simulated Intelligence 3, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Lecturer - Simulated Intelligence 3 + +1. The seen-as version connected with the other. The seen-as version is the first depth applied to the idea in the simulation. It actually was a seen-as version for the idea. I recorded the algorithms worked on. I used each algorithm for the feat. +2. I checked it was unused. I finished the sentence with the simulation.
I found an idea in nature. I connected to a comment about it. Some were theories. +3. One person agreed, the other disagreed. I finished the topic with the simulation. I simulated the character. I listened to her make a comment. I found the best conversation pair for the topic. +4. It was intersubjectivity. I finished the book with the simulation. I mind mapped all necessary content. I worked out more around an algorithm. I used a system to cover the area of study. +5. I chose the life and science. I simulated the lecture with a function. I described the variables used by the function. I described the values of the variables. I gave examples of the use of the function. +6. I verified the simulation. I verified reality with the simulation. I found the evidence. I found whether it properly supported the idea. I found positive function. +7. The connections were in the text. I was good in the simulation. I wrote the algorithm. It checked the good. Other algorithms came from it. +8. There was enough in the idea. I tested that the developed idea test was successful. I wrote about the developed idea. I worked out the logic. I worked out the cycle of improvement for the idea to be above. +9. I read the essay again. I verified that the sentence was correct in the simulation. I wrote the sentence. I found 10 reasons for it. I checked its grammar. +10. I printed the report about the simplified sentences. I applied the second department to the sentence in the simulation. It was computer science. It was statistics. It was machine learning.
+"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Lecturer/Lecturer - Simulated Intelligence.txt b/Lucian-Academy/Books/Lecturer/Lecturer - Simulated Intelligence.txt new file mode 100644 index 0000000000000000000000000000000000000000..5de078785b62f505f1d9fe85003c6eb45d04e675 --- /dev/null +++ b/Lucian-Academy/Books/Lecturer/Lecturer - Simulated Intelligence.txt @@ -0,0 +1,9 @@ +["Green, L 2021, Lecturer - Simulated Intelligence, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Lecturer - Simulated Intelligence + +1. I simulated (read) values. I wrote on Simulated Intelligence. I wrote about the interpreter and induction. I wrote about text to breasonings and mind reading. There were medical devices that returned Parkinson's sufferers' memory, prevented pain and prevented headaches. +2. I replicated food. I wrote on pedagogy. What about electronic pedagogy? I replicated the space craft and derived quantum power from nothing. I read minds. I time travelled. +3. It was necessary for a master degree. I became a pedagogue. I manually ran text to breasonings. I examined electronics in terms of transition machines with the seen-as version food. I deduced recordings in fact was a small amount of star power. +4. The computer had consciousness because of breasonings. I wrote recordings for pedagogy. I controlled the time of the recording. I controlled the location of the recording. It passed in one unit through the electronic computer. +5. The software helped with pedagogy. I became a pedagogy helper. I viewed the actor making a representation about pedagogy. I accounted for the thoughts and reasonings of the student. I represented the area of study to the student when required. +6. I synthesised the values. I examined the people's values. I wrote the values. I wrote the person's values. I examined them. +7. I found the notable ideas using a computer. I wrote on history. I wrote on the history of ideas. I examined the trajectory of ideas. 
I discovered how new ones could form. I found electronic breasonings in medicine."] \ No newline at end of file diff --git a/Lucian-Academy/Books/Lecturer/Lecturer 2.txt b/Lucian-Academy/Books/Lecturer/Lecturer 2.txt new file mode 100644 index 0000000000000000000000000000000000000000..361f04960f04df114c00522bb029c51dcb6b8663 --- /dev/null +++ b/Lucian-Academy/Books/Lecturer/Lecturer 2.txt @@ -0,0 +1,10 @@ +["Green, L 2021, Lecturer 2, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Lecturer 2 + +29. I designed the house using machine learning. I wrote on architecture. I checked the wall's vertical architecture. I found x. I found z. +30. I was grateful for connecting through. I wrote on cosmology. I meditated on neo-Hinduism, etc. I found comfort from Sufism. I breasoned out assignments in education. +31. Only some parts were supported. I wrote on ethics. I investigated privacy in mind reading. I investigated protecting myself and breasoning things out. I found that the medicine worked. +32. I found the computer science law. I wrote on laws. I wrote the input. I found the output using the algorithm. I checked this output. +33. I taught at the academy. I wrote on economics. I found the breasonings. I connected them with the argument. This was breasonings currency. +34. I investigated the most intricate features. I wrote on sport. I programmed quickly. I had prepared the inductive machine learning algorithms. Some of the solutions required recursion. +35. I wrote the combination algorithm and connected it with the chapter terms and the general philosophy. I wrote on games. I wrote the general philosophy. I found the perfect time to make the coffee. I connected this with the terms of the general philosophy. 
+"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Lecturer/Lecturer Communication.txt b/Lucian-Academy/Books/Lecturer/Lecturer Communication.txt new file mode 100644 index 0000000000000000000000000000000000000000..4dfae3d0e51aacbac7abdbf6255a230fa9cde48f --- /dev/null +++ b/Lucian-Academy/Books/Lecturer/Lecturer Communication.txt @@ -0,0 +1,13 @@ +["Green, L 2021, Lecturer Communication, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Lecturer Communication + +1. I replied. I spoke. I had the confidence to speak. Others joined in. We communicated. +2. It was a new type of game. I wrote solutions to puzzles. I programmed the puzzle. I interfaced with the puzzle. I found the answer. +3. The algorithm worked on the alien system. I won the point with the universal winning statement. I communicated the algorithm. The algorithm worked. I won. +4. In representations, the lecturer interpreted students as thinking of her and their algorithms (i.e. checked whether they thought a sentence was good). The postgraduate was expected to become a lecturer. She generated her lectures. She programmed a system to help her react to representations. She automatically marked the essays. +5. I helped stop cancer with blueberries. I wrote the sentence maze. It was a decision tree. When I had decided on a choice point and the sentences were over, there were new sentences. The maze ended when the story was resolved. +6. Nietzsche communicated importantly. communicated well. He communicated in sentences that were enter-enigmatic. Praise-like. And honourable +7. Nietzsche taught the king's daughter. He communicated bravely. He sought the king. He sought his refuge. He wrote works. +8. Nietzsche devised an algorithm language before computers. He communicated to the king. He disciplined the king. He read wisely. He read thoroughly. +9. Nietzsche published the work. He communicated to everyone. He saw the king's court. He communicated on art. He gathered the philosophies. 
+10. Nietzsche wanted to do things between the people and the world. He communicated to the world. He described meditation's details. I always had a new detail. The people and the world changed. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Lecturer/Lecturer Culturology.txt b/Lucian-Academy/Books/Lecturer/Lecturer Culturology.txt new file mode 100644 index 0000000000000000000000000000000000000000..4f388887a2fbbef1e0cbad8fcc718502c731122a --- /dev/null +++ b/Lucian-Academy/Books/Lecturer/Lecturer Culturology.txt @@ -0,0 +1,12 @@ +["Green, L 2021, Lecturer Culturology, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Lecturer Culturology + +1. The rest of the As were done. I examined culture. I met the threshold for the company's number of As. There were algorithms for 10 employees. One employee had 5 algorithms per day. +2. I found the algorithms that lit up to apply the algorithm to. I examined the largenesses of life. I relied on mind reading algorithms. I found the algorithm that lit up. I found algorithms that were inspired by it. +3. Reverse CAW was guessing the input and output. Culturology is good. I applied back-translation to an algorithm. I found that reversing the algorithm resulted in the same result as the original. Reverse interpret was CAW. +4. The answer was added to the database. Culture was present. The other wrote the question about the philosophy or computer science. The philosophy assignment was critical analysis. The computer science assignment was critical analysis of the algorithm. +5. The companies had simulated registration. Culture had a point. The finance company helped find students for the academy. It had 50 As describing how it could be replaced if it closed. The writer wrote the books and made them available. +6. The students mind read themselves to decide whether to breason out 80 breasonings per assignment. Culture had a small idea. I simulated teaching philosophy and computer science. I stimulated people. 
I didn't use spiritual bots, but accepted breasoning currency (an application essay) as payment for courses. +7. There was a recursive command. Two models joined in culturology. The list interpreter was pure. The recursive interpreter was simple. The interpreter processed the list recursively. +8. Mind reading, including time travelling breasonings, was for medical purposes. I enjoyed culturology. The academy surveyed the work for plagiarism. The work was published. The domain that could be mind read and the implications of time travelling breasonings were examined in the work. +9. When the lecturer had written 50 algorithms per assignment, and it was easy to do the rest, she/he became a professor. Culturologists examined the caves. As soon as I found the method, I completed first class versions of my assignments. I found the details. I found the algorithms. I mind read the delightful algorithms as algorithm details. +10. I found the possibilities available with the finished interpreter. I recorded the culture from above. Culturology was the fine arts of the interpreter. I charted the student's progress. I helped him to perform better. I sent reminders for late work."] \ No newline at end of file diff --git a/Lucian-Academy/Books/Lecturer/Lecturer Gay Studies.txt b/Lucian-Academy/Books/Lecturer/Lecturer Gay Studies.txt new file mode 100644 index 0000000000000000000000000000000000000000..3c7430687bdcfbbc5dd124e4ece8c71a0535e4b3 --- /dev/null +++ b/Lucian-Academy/Books/Lecturer/Lecturer Gay Studies.txt @@ -0,0 +1,15 @@ +["Green, L 2021, Lecturer Gay Studies, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Lecturer Gay Studies + +1. I assessed the mark out of 2*4*50 As. I was friendly with everyone. I assessed the mark out of 80. I assessed the mark out of 2*15 As. I assessed the mark out of 2*50 As. +2. Having completed enough work in the last 6 months, I was happy. I was happy with everyone. 
I assessed the mark from the breasonings from the last 6 months, to the nearest day, up to the first day of the assignment. I calculated the first day of the assignment. I worked out 6 months (the same day of the month) before it. +3. The algorithm was a good logical synthesis. I was good with everyone. I tested whether the sentence had an algorithm associated with it. I detected it using mind reading. I wrote whether the algorithm worked. +4. I wrote in terms of place and feeling. I wrote the poem. I applied new philosophy and computer science to find new algorithms. I wrote the developed thing in philosophy. I wrote the developed thing in computer science. +5. I merged successive constants in parts of the data. I wrote the articles. I gave the forwards and backwards functions to back-caw, back-list and back-string as input. I found the shortest possible algorithm. I merged successive constants. +6. I held hands. I wrote about sex. I found a 1+ length sequence connecting forward and back searches. I went forward from the start and back from the end. I tried connecting them with 1+ length sequences. +7. It was in academia or industry and the number seen as required was found out. I wrote about liberation. I wrote an LSTM to write 50 algorithms per assignment. I saw the work done. I matched it. +8. The 50 As contributed to the next stage. I wrote about freedom. I breasoned out 50 As with the text to breasonings algorithm when I enrolled in a course. It helped me relax. It was a professional requirement. +9. I enrolled in all courses I was working in. I wrote about happiness. When I finished the course, I wrote copy to earn a job. I wrote 50 As. I wrote the copy. +10. The professor wrote the algorithm A. I wrote on goodness. I found work. I applied for the job. I earned the job. 
+ + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Lecturer/Lecturer Hermeneutics.txt b/Lucian-Academy/Books/Lecturer/Lecturer Hermeneutics.txt new file mode 100644 index 0000000000000000000000000000000000000000..9943dceb3e571d0f95f15f5ea95f62213c1ee480 --- /dev/null +++ b/Lucian-Academy/Books/Lecturer/Lecturer Hermeneutics.txt @@ -0,0 +1,13 @@ +["Green, L 2021, Lecturer Hermeneutics, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Lecturer Hermeneutics + +1. I answered with the writer of the algorithm. I wrote the queryable ideology. I found the ideology. I wrote it's header. I wrote the query. +2. It could be a single name. I wrote the query. I asked for the order that the algorithms were written in. I wrote the predicate name. I wrote the variable name. +3. The ontology listed the order of the predicates. I wrote the ontology as answer. I wrote the order that the predicates were written in. I found the dependencies of the predicates. I wrote the number of the first predicate. I wrote the order of the rest of the predicates. +4. I found the intermediate algorithm. I wrote the conjoined parts. I found the inference between the parts. I wrote the first part. I wrote the part that joined it to the second part. +5. I detected each algorithm. I wrote one of the parts of the disjunction. I wrote the intermediate algorithm. I wrote the first part of the disjunction. I wrote the second part of the disjunction. +6. I continued until they got tired. I wrote the answer set in the ideology. I found the lit up part according to mind reading. I found the philosophy. Alternatively, I found the computer science. +7. I hypothesised the possible solution. I converted the question into an answer. I wrote the answer as a question. I matched key terms in the question with the answer. I identified and bridged knowledge gaps. +8. I wrote the function-call-within-function-call. I wrote the reason for the ontology. 
I found the previous cases like the possible solution. I found the difference. I modified the previous case to be the solution. +9. The people's ideas were food. I inferred the horizon with the conclusion. I wrote the algorithm. I wrote the possible horizon philosophy. I wrote their connection. +10. I uniformised the clauses by calling common code in another predicate. I verified the disjunctive part. I identified that a disjunctive clause was needed in addition to the clause. I wrote the disjunctive clause, in the right order. I tested the algorithm. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Lecturer/Lecturer Lucianism.txt b/Lucian-Academy/Books/Lecturer/Lecturer Lucianism.txt new file mode 100644 index 0000000000000000000000000000000000000000..d55f4585560702e3e91d7d7ef09c6d27a94a3fbf --- /dev/null +++ b/Lucian-Academy/Books/Lecturer/Lecturer Lucianism.txt @@ -0,0 +1,13 @@ +["Green, L 2021, Lecturer Lucianism, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Lecturer Lucianism + +1. The other course was unknown material by the author. I read philosophy because I knew the author. My certification service could be switched on and off. The service was switched on when the course made money. When it made money, another course was introduced. +2. How do you increase it? Lucianism swept through the room. The person on stage suggested it. The audience viewed it. Everyone was protected. +3. Mother Earth was happy. Lucianism is the philosophy of Ved. Lucian read on Ved. Lucian found the current technologies from it. More was possible in the future. +4. I simulated accreditation. The relevant part is interpreted. I read the text. I found the correct interpretation. The people agreed. +5. He identified the mental nature of the question. Philosophy helped the world. The person had a positive reason. He read the philosophy. It helped the world. +6. There was a Master hack-a-thon. The self chose the self. The person identified her role in the team. 
She performed this role. She chose an algorithm that was topical as a seen-as version. +7. There was another application. The self chose the other. The other had an inspiration. The other wanted to give the thought to the self. The self chose the thought of the other. +8. The web server ran the text to breasonings algorithm. I examined the chain from non-accreditation to accreditation. I was polite. Everyone was happy. Everyone was safe. +9. Pedagogy helped with comfort for work. I discovered that humanist pedagogy worked. It was the product. It was the life of the product. People were found out. +10. Everyone could try meditation. I discovered that meditation worked. When the people meditated, they were given As. They lived a full and happy life. They received everything on the first day. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Lecturer/Lecturer Metaphysics.txt b/Lucian-Academy/Books/Lecturer/Lecturer Metaphysics.txt new file mode 100644 index 0000000000000000000000000000000000000000..c1c60f53df7c3141e070e78f6bceb39d5332e6e0 --- /dev/null +++ b/Lucian-Academy/Books/Lecturer/Lecturer Metaphysics.txt @@ -0,0 +1,13 @@ +["Green, L 2021, Lecturer Metaphysics, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Lecturer Metaphysics + +1. I created the idea as an object. I found the idea's two poles. I created the 1-D automaton. It was a switch. +2. 'Words' was taken so I called it the 'Red and Black Book'. I wrote the idea as a word. I described the idea. I compared it with an interesting idea. I wrote the best word down. +3. In 100%, I wrote algorithms for each sentence. I wrote the idea as a sentence. I found its system. I found its verb. I found its object. +4. I condensed the algorithm to three lines. I wrote the idea as a paragraph. I wrote the draft. I tested it. It was the short algorithm. +5. I wrote on the metaphysics (pattern matching of the ideas). I wrote the idea as a chapter. I found the key terms. 
I wrote them in a different order in relation to an idea. I repeated this for each paragraph. +6. There were more connections between the chapters. I wrote on the idea as a book. I divided the idea into topics. I wrote on each topic. I connected the topics in the introduction. +7. The developed thing was the universe. I wrote on the idea as an algorithm. It was a multi-book algorithm. There were connections between the books. I tested the algorithm for developed things. +8. The computer found what was interesting and the people were interested in the work. I detected the object size. The speed of computation meant that all the work was done by the computer in the computational government. The rest was turned off. There were some writers. +9. I thought of the Lucian machine. I described the use. I breasoned out the key term from each sentence. I thought of the algorithm. I thought of the interpreter. +10. It was the three-dimensional Lucian machine. I thought of the simplification to all appropriate uses. I wrote the algorithm. I wrote the algorithm to run the algorithm. I wrote the algorithm to find whether the algorithm run met the requirements. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Lecturer/Lecturer Popology.txt b/Lucian-Academy/Books/Lecturer/Lecturer Popology.txt new file mode 100644 index 0000000000000000000000000000000000000000..560560a7e9b7ed4dd7c64c14b122d7e42ae5efb1 --- /dev/null +++ b/Lucian-Academy/Books/Lecturer/Lecturer Popology.txt @@ -0,0 +1,13 @@ +["Green, L 2021, Lecturer Popology, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Lecturer Popology + +1. There were bank reconciliation and financial statements. I stated that the person was just. I connected the sentences with the LSTM. I connected the sentence to the argument. My company and the recruitment organisation made a profit, where short courses could be audited for free. +2. The philosophy of popology led to a whole academic career of induction. 
The person was happy. The short courses were optionally free. The person applied to the course. She applied for a fee waiver. +3. The application helped the student clarify his goals and helped the academy automate applications. The person was sentient. The application site used the K-Nearest-Neighbour algorithm to check that the applicant agreed with the aims of the academy and wanted to study a field within it, what they would bring to it and, optionally, their career aims and the ways they would use their knowledge. It asked for resubmission after negative terms without qualification, i.e. asked them for a solving term after a term such as 'disagree'. +4. There was a certificate, awarding the full version (which was the same as for auditing). The person was sapient (wise). The diplomas could be paid for or audited for free. I tried for industry as well. At least, the student was protected. +5. Some details went viral. The person was agreeable. Accreditation turned off more than five As per part of the body per day. Sales included an algorithm and were up to 50 As. They were sketched out. The student was tracked. +6. The image had seven parts. I met the standard. The Queen was satisfied by the details. She ate the food. She played and slept. +7. The teacher critically held the algorithm or found three details. I smiled. The PhD was supported to do the work. There were colours afterwards. Everything worked. +8. The metaphor was supportive and could be left. He refreshed his appearance in his mind. He wrote details as As. He mind read details for writing. He mind read details to help thoughts return to work, or stay at a relaxing port. +9. I decided to write a lengthy algorithm with a simple philosophical description in Master and PhD. I weighed up each side. Logic connected the details to make the algorithm. I mind read the connectors. I mind read the connection (e.g. 
on the fifth day, fifty philosophy algorithms with three details had been written and the cloud requiring 50 algorithm As lifted). +10. He likes University. I made an allowance for a new point of view. University was fifty algorithm As for a backlogged course for an industry event. I found lots of relevant perspectives on computer science. The new student recruitment company found the students. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Lecturer/Lecturer Rhetoric 2.txt b/Lucian-Academy/Books/Lecturer/Lecturer Rhetoric 2.txt new file mode 100644 index 0000000000000000000000000000000000000000..69aa1a549386b8a03eaaf17deae069d6c5414eee --- /dev/null +++ b/Lucian-Academy/Books/Lecturer/Lecturer Rhetoric 2.txt @@ -0,0 +1,8 @@ +["Green, L 2021, Lecturer Rhetoric 2, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Lecturer Rhetoric 2 + +6. Writing is rhetorical. Rhetoric was in itself. As A is to B, C is to D, where A, B, C and D are each as Ax is to Bx, Cx is to Dx. Ax, Bx, Cx and Dx are each as Ay is to By, Cy is to Dy. This continued until ground. I computer- and human-verified the conclusion. As I ran the program, I verified it. I read the input and output. I verified them. I verified that the program produced the correct output. +7. The advantage was mind reading across time. As the self transfixed on the point, the point was entwined with a point. I associated the value with the first point. I could read the value from the second point. The options were breasonings or circuits. The lecture theatre became a theatre. As I talked, my students understood by talking. The student discussed the idea. She took one side. The other student took the other side. +8. The two people delivered their related monologues. As the self aimed for itself, there was an A for the other. The self aimed from its perspective. The other thought of an A from its perspective. The other copied the self. People performed the work. I reversed the idea. 
Many ideas made light work. I wrote reasons for the idea. I reconfigured (reassembled) the idea. +9. I related writing and action. As I created the ontology, I used it. I found the object. I named it. I used the name. In fact, mind reading required 50 meditators to be mind read per day instead of 10 algorithms per day. I didn't disagree (agreed). The opposite of agreeing is disagreeing. The person either agreed or disagreed. +10. The system contained multiple others. As the self was verified to exist, the other was verified to exist. It was tested that the biological organism should be alive. It was tested that its system should exist. The connection was tested. Care led to life. As the carer bought food, the disabled woman lived. I asked the men if they wanted to learn meditation. I didn't ask the disabled woman if she wanted to learn meditation after she propositioned me. I helped the woman onto the tram. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Lecturer/Lecturer Rhetoric.txt b/Lucian-Academy/Books/Lecturer/Lecturer Rhetoric.txt new file mode 100644 index 0000000000000000000000000000000000000000..620fe62b92fec91f4961623370a7573a2cd032c9 --- /dev/null +++ b/Lucian-Academy/Books/Lecturer/Lecturer Rhetoric.txt @@ -0,0 +1,8 @@ +["Green, L 2021, Lecturer Rhetoric, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Lecturer Rhetoric + +1. I checked the rhetoric. I wrote the rhetoric of safety. As the inspector checked the site, the user used the site. The inspector determined that the lever worked. The user used the lever. I connected the two ideas. Creativity was derived from the meaning. The meaning was the means of production. The means was money. Creativity helped generate currency. +2. I meant the statement again. I said the statement, meaning it. I worked out the statement. I verified it. I made the statement. I wrote a certain amount on each topic. I liked the job. I chose the industry. I chose the first job. 
I chose the rest of the jobs. +3. The house of rhetoric inspired its survival. The house of rhetoric advanced. The foundation was 'as A is to B'. The pillar was 'C is to D'. I verified the idea against the house of rhetoric. Rhetoric checked the logic. As criticality was verified against ontological models, so were disjunction and and-implication. Criticality, that I didn't drink the soup that was too hot, was solved by pouring milk into the soup (the ontological model). Critical disjunction, that I was hungry for neither the lemon chocolate nor the panforte, was solved by snap-freezing them. Critical and-implication (where and-implication is implication given the 'and' truth table), that I couldn't use the plunger to make coffee because it was actually hot chocolate, was solved by having hot chocolate instead. +4. The follower verified his thoughts. As I saw, he saw. I was sighted. I led the blind man. With my eyes, he could move safely. The rhetorician predicted thoughts. The rhetorician advanced. He studied rhetoric. He applied many to one. He used the objects up. +5. I helped with checking with my art. As I looked over myself, I was happy. I was happy because I looked over myself. I looked over my reasons recursively until I was happy. I said the reasons. I checked what I had discovered. As I discovered the reason, I argued for it. I wrote the reasons. I arranged them into an argument map. I delivered the introduction. 
+"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Lecturer/Lecturer Simulated Intelligence cont'd 2.txt b/Lucian-Academy/Books/Lecturer/Lecturer Simulated Intelligence cont'd 2.txt new file mode 100644 index 0000000000000000000000000000000000000000..a0d2ac25713fb9aefa919a737290dffb2991e9b8 --- /dev/null +++ b/Lucian-Academy/Books/Lecturer/Lecturer Simulated Intelligence cont'd 2.txt @@ -0,0 +1,12 @@ +["Green, L 2021, Lecturer Simulated Intelligence cont'd 2, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Lecturer Simulated Intelligence cont'd 2 + +41. The pedagogue was a computer program. The person created the pedagogue to breason out the breasoning list as part of assessment. The person helped the pedagogue to breason out the lecturer argument. The lecturer argument enabled the pedagogue to write arguments. The pedagogue breasoned out the breasoning list. +42. The pedagogy helper was an algorithm. The pedagogy helper breasoned out the breasoning list for assessment. The pedagogy helper/teacher prepared the breasoning. The pedagogue wrote the teacher's words down. She breasoned them out. +43. The pedagogue was an algorithm. The pedagogue breasoned out the breasoning list as part of assessment. He thought of the words individually. He breasoned them out. They worked because they were already breasoned out by a pedagogy helper. +44. The spiritual time traveller breasoned out breasonings from the future. The recordings pedagogue breasoned out the breasoning list. The recorded breasonings could travel through time. The breasoning list could be breasoned out to aim for a time. I helped the student earn the grade. +45. The person constructed their own intelligence test. The person stated that the other was intelligent. The other passed the intelligence test. The person checked it. The person wrote the result. +46. The king was fair. The king commanded the simulation. The simulation helped track the food. The grain was sown. 
The simulation predicted the harvest. +47. The simulation was constructed from parts. I started the simulation part. The simulation part was the algorithm. I examined reality. I simulated it. +48. The simulated algorithm asked questions. The hint worked in the simulation. The student read the question. She read the hint. She answered the question. +49. The simulation was relaxing. The person maintained his good health in the simulation. The person meditated. He ate healthily. He had good exercise. +50. It was safe. The element in the simulation advertised it was there until change. The element was zinc. The citizen drank the zinc drink to combat sickness. It was plentiful."] \ No newline at end of file diff --git a/Lucian-Academy/Books/Lecturer/Lecturer Simulated Intelligence cont'd.txt b/Lucian-Academy/Books/Lecturer/Lecturer Simulated Intelligence cont'd.txt new file mode 100644 index 0000000000000000000000000000000000000000..924d2ab4c3303640db608a1ecbd6539ae017a484 --- /dev/null +++ b/Lucian-Academy/Books/Lecturer/Lecturer Simulated Intelligence cont'd.txt @@ -0,0 +1,12 @@ +["Green, L 2021, Lecturer Simulated Intelligence cont'd, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Lecturer Simulated Intelligence cont'd + +31. Simulated intelligence contained all ideas, to a point, about algorithm induction. Simulated intelligence contains all ideas. I thought of the first idea. I found the ideas around it. I thought of the ideas around these until I had thought of all of the ideas. +32. I ordered by alphanumeric code, instead of by number. I thought around corners of the simulation. I found a lateral (side) connection between ideas. I thought forward. I thought back. +33. The king answered the question with the correct answer. I noticed Kinglish (sic) about simulated intelligence (characters reacted to the header as king). The king gave the input. The subject (predicateer) computed the output. The king accepted the output. +34. 
The programmer also ended up becoming the manager. Breasonings didn't like (liked) the simulation, in other words, spiritual objects. The programmer found the real simulation. The manager breasoned it out. This helped the real simulation function properly (the people recognised breasonings and had their positive effects). +35. God was infallible to critique. God collected the value pertaining to whether the character in the simulation had critiqued him. The argument didn't mention God. The argument mentioned the person instead. The argument supported God. +36. The simulation was around us. The simulation supported agreement. I agreed. It went well. The simulation was harmonious. +37. Everything was definite. In accreditation, good religion worked. It was around pedagogy. Grit was good grades. The others learned. +38. The better quality doctor had contributed breasoning to conception and the birth, and breasoned out other events. Birth was possible because of breasoning. The parent breasoned it out. Conception went well. Birth went well. +39. What's inside vocational education? I practised the meditation mantra, and taught it to students. I learned the mantra. I practised it. I taught it to students I met. +40. I switched off some pain, some medical problems, unwanted thoughts and mistakes. I used the medicine quantum switch. It worked. It was inside vocational medicine. I completed all necessary courses during my life."] \ No newline at end of file diff --git a/Lucian-Academy/Books/Lecturer/Lecturer Societology.txt b/Lucian-Academy/Books/Lecturer/Lecturer Societology.txt new file mode 100644 index 0000000000000000000000000000000000000000..3e46cc6d40d75ba233f7f0e09661a3966422ef62 --- /dev/null +++ b/Lucian-Academy/Books/Lecturer/Lecturer Societology.txt @@ -0,0 +1,14 @@ +["Green, L 2021, Lecturer Societology, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Lecturer Societology + +1. The helper collected breasonings on the person. The person was right. 
I paid a recruiter to send students to Lucian Academy. I paid some money. There were students. +2. They can do their own work after a little while. The person was good. There were 250 breasonings per breasoning thought of by the student. The computer detected the breasoning. It breasoned out 250 breasonings to support it. It worked when the student was ready. +3. The model student helped breason out ideas. The person was helpful. There was a connected-together model student. She found the thoughts. She was grateful. +4. The professors replied. The person was useful. The students wrote the essays. They wrote the details (three per sentence). There were also details to detail in higher degrees. +5. The students were well-presented. The person was caring (well-dressed). I went to the recruitment web site. I paid the money. They found the students. +6. I checked whether they saw appearances, were depressed, had headaches, studied, worked, had a business, had a partner, or had sex; Nietzsche stated that there was enough leeway (it was equitable). I found the student's aims out. I helped him with it. I helped him with everything. +7. There were classes of at least 10 students per class per semester. The societology model worked well. I sent the money from my account to the recruitment company's bank account. They organised payment to Lucian Academy with money or vouchers. Both made a profit. +8. I read the encyclopedia article about the academy. The other was interesting. I wrote the income per month. I wrote the income per six month period. I wrote the income per year. +9. The academy worked with text. The society performed the job itself. The quality assurance person checked that the student could understand the material. She read the feedback. She rewrote the instructions. +10. The student submitted her essay and checked her mark online. Society stood still. The worker was hired at the start to check the quality of essay marking systems. 
The software used to write the essays was tested on different operating systems (Mac, Windows and Linux). They could read the course materials on a smartphone. Assessment required a desktop computer. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Lecturer/Lecturer.txt b/Lucian-Academy/Books/Lecturer/Lecturer.txt new file mode 100644 index 0000000000000000000000000000000000000000..4c90b5673f934929f38885f4bac38ee9915db51c --- /dev/null +++ b/Lucian-Academy/Books/Lecturer/Lecturer.txt @@ -0,0 +1,31 @@ +["Green, L 2021, Lecturer, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Lecturer + +1. I greeted the person. I wrote on metaphysics. I found the face. I started with its reaction. I found what existed to it. +2. I found what the person aimed for. I wrote on hermeneutics. I identified the discourse. I grouped the topics into ideologies. I grouped the ideas into ontologies. +3. The robot was super-abled. I wrote on communication studies. The robot had quiet time to himself. He wrote a diary. He painted pictures. +4. There were services for gays. I wrote on gay studies. I determined that the person was gay. I gave her the job. She was accepted. +5. I inserted the desired feature and explained it. I wrote on Popology. I wanted to automate writing computer programs. I wanted to automate writing music. I had a backlog. +6. The essays, music and algorithms were more square. I wrote on Societology. I automated writing the essay. I finished the list of essays. I wrote an essay comment. +7. It started with pop music. I wrote on Lucianism. How will they do it? I wanted to automate breasonings, the idea that I discovered at undergraduate level to earn high distinctions. I wrote pedagogy and other books to do it. +8. I liked it. I wrote on Culturology. I pretended to study medicine before learning meditation. I pretended to study electronic and electrical engineering before learning meditation. The technology was important in the culture. +9. 
I taught the leader consideration. I wrote on Rhetoric. Rhetoric meant as a->b, c->d. I noticed that everything was possible. It wasn't bad, turning the universes off. +10. I guided every computation. I wrote on Cognitive Science. It involved meta-cognition. It involved extensions. It analysed creativity. +11. I considered it with first-class reasonings. I wrote on Psychology. I experienced mind reading an essay with a decision tree. In music, the pad had harmony. The imagery and choice of instruments was better. +12. The author was represented by the wooden model. I wrote on the philosophy of language. I mind read the form of the language. I mind read the content of the language. I spoke after deciding 'It is getting better' and 'That's what I want'. +13. The character teleported and time travelled. I wrote on Development Studies. I developed software. I developed the thing. I wrote on the most developed topics. +14. The body was like light treated by sound. I wrote on the Body Metaphor. The mind reading (working out) game was a scavenger hunt. There were 17 threats detected in meditation to the person on the day he went to hospital. The person was pointed to the pain prevention meditation algorithm. +15. The mind has no things about money about it at all. I wrote on the Mind Metaphor. Education should be free. Support can be free. I worked out how to automate marking submitted assignments. +16. I selected the detail to write an algorithm on using mind reading and wrote it using mind reading by specifying tops and bottoms of algorithms. I wrote on the future. The academy offered computer science. Mind reading was harmless. It seemed to offer details. +17. Everything was on the same page and simple. I wrote on aesthetics. I connected the parts of the algorithms, rewriting with functionalism so that only the used part of the data structure was processed. +18. The lines were changed if the data needed to be copied or changed. I wrote on epistemology. 
The data types and type (e.g. music) of the data were labelled. The new data was simplified and replaced the data. The type statements for the data structure were modified, as were the affected lines. +19. The writer gained the necessary skills. I wrote on science. The music kept on getting better and better. The philosophy kept on improving. The algorithms became more and more useful. +20. Meditation was well-timed. I wrote on love. The robot had a good reason to love me. He had an emotional upbringing. The breasoning was gritty. +21. Pedagogy helped with conceiving children. I wrote on sex. The psychoanalyst analysed the life. It was part of medicine. It branched out. +22. I was successful. I wrote on primary philosophy. I found the idea. I expanded it. I wrote a letter to the Queen about it. +23. I helped the student write on his own topic. I wrote on secondary philosophy. I wrote a second text on the topic. It extended the coverage of the topic. The student could write on the topic without a pedagogy helper. +24. I also found the input and output from the algorithm. I wrote on logic. I found the solution to machine learning in modus ponens. I found all of the solutions in modus ponens (the state of the output of the finished algorithm, i.e. given the input). I found the algorithm without input or output. +25. The sound track had two chords. I wrote on the brain metaphor. I translated the book. I produced the movie. I wrote the sound track. +26. 250 breasonings effected the philosophy. I wrote on poetics. The child liked life. Pop was liturgical. Computer science likes the sutra. +27. I conducted new research with axioms (topics and connections that lit up from all departments on the topic). I wrote on computational philosophy. Many of the ideas were related to computational philosophy. They were expanded in the degree. I found the system for writing and analysing algorithms. +28. 
I worked out how to clock the game with the right combination of movement through the maze, logic and maths. I wrote on archeology. I found the alternative dimension in the text based adventure computer game Vetusia. I programmed the sequel. Completing the game required three transformations in the algorithm. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Lecturer/Linguistics.txt b/Lucian-Academy/Books/Lecturer/Linguistics.txt new file mode 100644 index 0000000000000000000000000000000000000000..f4375d35716ca0e088f6d4d696fd0c0a689d3e9a --- /dev/null +++ b/Lucian-Academy/Books/Lecturer/Linguistics.txt @@ -0,0 +1,13 @@ +["Green, L 2021, Linguistics, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Linguistics + +1. I dispensed with idealism. I chose reality over culture. I chose what I needed. The new culture reflected this. The people knew the people. +2. I could see the characters in List Prolog grammars. I created the lingua franca. I found what was realistic, rather than divertissements or what was experienced on the way. List Prolog was what was available (my own algorithms to solve problems), not a compromise between languages. I supported the educational state of grammars in List Prolog, including manually adding the bottom case. +3. We had to have specialisms inside education. I started the newspaper. I found the heading. I found the body. I went to the next article. +4. I favoured being a person rather than God. I wrote the article. The Short Articles and Lecturer books stood alone. I could entirely include the data in the recursive type statements. The 'any' type was for bug checking. +5. I wrote a verbose programming language (example laden) and a clear programming language (helpful). I wrote a new language. The spoken language was articulated in List Prolog. The phrases as predicate and variable names made it Language Prolog. There were reasonings explaining Prolog in the native tongue. +6. 
One algorithm stepped through the cognitive barriers at my pace and the other algorithm interested me in the best solution. I wrote on universalism. I checked the secret example of the helpful hint. I found the logic. I triggered the hint to expand the scope. +7. Recursive types were compatible with universalism. I defined the type logic. It had variables. It had code. I used 'any' and put the code into the predicate. +8. I opened the file. I wrote the editor. I found the language. I parsed the link. I parsed the image address. +9. Completing reasonings was the most exciting part of programming in a human language. I edited an article. It was a text editor. I entered the heading. The visually impaired user edited the article. +10. The interpreter articulated the philosophy in the different language. I spoke in a new language. I tested the different language on the other operating system. I read the meaning. I listened to and pronounced the characters and words. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Lecturer/Medicine.txt b/Lucian-Academy/Books/Lecturer/Medicine.txt new file mode 100644 index 0000000000000000000000000000000000000000..f5bb0740f9cdf9c38742ecc98103865ad23086d8 --- /dev/null +++ b/Lucian-Academy/Books/Lecturer/Medicine.txt @@ -0,0 +1,13 @@ +["Green, L 2021, Medicine, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Medicine + +1. I found the direction of the person or object, the part of the room, the room, and the time to prepare, do and finish an action. I wrote 6 levels on each problem's solution. I found the air temperature with vague mind reading. I found whether the person was wearing a beanie using vague mind reading. I found whether there were distractions using vague mind reading. +2. I drank the oil each day. I helped each person with meditation. I taught the person the mantra. I taught the person the sutra. I taught the person yoga. +3. I gave an A for reading. I helped each person with medicine. 
I gave an A to wear warm or cool clothing. I gave an A to wear clothes to protect from the sun. I found a critical A about the sun. +4. The parents, even the robots knew about pedagogy. I helped each person with pedagogy. I found the entrance, meditation, when I was in teaching. I found the A for the thought when time travelling. I found the A for the piece of information from the internet data structure. +5. They connected through critical As about the algorithms. I helped each person with Computational English. I secretly analysed the text using the LSTM. I mind read the best details. They were algorithms. +6. It included the writer's thoughts and the area of study. I found protection in natural law. I paraphrased the text and worked on the algorithms. I shaped what I mind read with an LSTM. I wrote it myself by connecting with the argument. +7. I removed madness, mistakes, unwanted thoughts, negative thoughts, medical problems and illegal thoughts. I wrote that the object increased longevity. I found truth in mind read evidence and both mind read evidence about texts. I was happy. I found truth by examining the self. +8. I specifically barred certain characters from being allowed in the text to breasonings dictionary. I wrote the object that increased happiness. I was healthy. I was wealthy. I was wise. +9. I had a conversation with the robot. I wrote the object that increased grades. I asked about objects. I asked about algorithms. I asked about machine learning. +10. That was their industry. I wrote the object that increased the type of consciousness (expression). I found the expression. I found details. I found healthier activities related to the expression. 
+
"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Lecturer/Meditation.txt b/Lucian-Academy/Books/Lecturer/Meditation.txt new file mode 100644 index 0000000000000000000000000000000000000000..a0b60b7db747a168204c934626a7ba336c0f4bd5 --- /dev/null +++ b/Lucian-Academy/Books/Lecturer/Meditation.txt @@ -0,0 +1,13 @@ +["Green, L 2021, Meditation, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Meditation + +1. I wrote the algorithm. I chose spiritual meditation over mindfulness. I was happy. I could automate tasks. I wrote philosophy. +2. They were meditation-themed thoughts. I wrote a meditation breasoning per day on my aim. I thought of the objects. I connected my thoughts to medicine. I connected the thoughts through my life. +3. I defined the terms. I wrote a meditation breasoning per day on my method. I found the working method. I customised it. It was simple. +4. The tester tested the algorithm. I wrote a meditation breasoning per day on my apparatus. I found a writing instrument. I wrote my thoughts. I recorded the moment. +5. I programmed it. I wrote a meditation breasoning per day on my hypothesis. I worked out what would happen. I planned it. I scripted it. +6. I wrote the best comments. I wrote a meditation breasoning per day on my results. I used a system. I commented on the results. I found cross-conclusions. +7. I substantiated my claims with logic. I wrote a meditation breasoning per day on my discussion. I discussed the work in terms of my previous work. I made future plans. I connected through my plans. +8. I wrote the conclusion over a period. I wrote a meditation breasoning per day on my conclusion. I expressed the conclusion in a clear manner. I found the use for the study. I found whether its type could stay the same for a use. +9. I breasoned out the results. I wrote a meditation breasoning per day on my outliers. I noticed the outliers. I removed them. I noticed the results were more congruent. +10. 
I wrote the alternative calculus. I wrote a meditation breasoning per day on my new discoveries. I found fundamental algorithms. I crossed algorithms. I used a system to find the fundamental algorithms. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Lecturer/People Values.txt b/Lucian-Academy/Books/Lecturer/People Values.txt new file mode 100644 index 0000000000000000000000000000000000000000..a852500ca0f4a5dadb0945ec6f76a282c5159d62 --- /dev/null +++ b/Lucian-Academy/Books/Lecturer/People Values.txt @@ -0,0 +1,13 @@ +["Green, L 2021, People Values, Lucian Academy Press, Melbourne.","Green, L 2021",1,"People Values + +1. The examined life was worth living. The subject was heroic (saved a life). He wrote regularly. It helped his mind. It helped his body. +2. The thesis plan and mind map were complete. The subject was perfect (earned 100%). In the master degree, the details were completed. The algorithms were completed. Neither had been reused from a previous assignment. +3. She checked she was on the right track rather than asked for the answer. The subject was giving (answered a specific question). She listened to the question. She read the materials. She asked a question after drafting a possible response. +4. The students checked the site. The subject led (acted on feedback). He found the question. He answered it. He put it on the site. +5. The student agreed with education. The subject helped the disadvantaged. He identified that they were missing something. He helped them replace it. For example, as they replaced 2 in 1+1=2, they replaced both pen and paper. +6. When all the algorithms were complete, the essay earned 100%. The subject accepted the reward. He wrote the essay. He verified and submitted it. He earned 100%. +7. There was a list of algorithms. The subject gave feedback. All the details' algorithms were complete. She stated the time to breason out the text and write the algorithms. The longer algorithm was preferred. +8. 
There was a simple version of the algorithm, which conveyed the essence. The subject answered the question. The classic writer hand wrote the essay and algorithm. The seen-as version was from another work. He knew his own material. +9. It was on a new perspective. The subject rewrote the course. He wrote a second draft of the thesis. He rewrote the algorithm. The two drafts proposed different sides of the argument. +10. The subject noticed improvement in the helped people. Writing Pedagogy was necessary for writing breasonings. Writing Computational English helped with writing and programming. Writing about the interpreter in Popology and the inductive algorithm writer in Theology helped reinforce the foundations of generating algorithms. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Lecturer/Philosophy:Computational English.txt b/Lucian-Academy/Books/Lecturer/Philosophy:Computational English.txt new file mode 100644 index 0000000000000000000000000000000000000000..46dbf02474763eed1a9c6e5bc2baf0071f0187e5 --- /dev/null +++ b/Lucian-Academy/Books/Lecturer/Philosophy:Computational English.txt @@ -0,0 +1,13 @@ +["Green, L 2021, Philosophy:Computational English, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Philosophy/Computational English + +1. 1000/500 words of code were required for a high distinction by the professor/non-professor. I wrote on Computational English. I found the features. I added them to the algorithm one at a time to test them. I changed the code. +2. They ran our algorithms half the time and did their own things the other half of the time. Computational English is like a calculator. I noticed the machine learning algorithm had a face, which it started reacting with. They ran the different algorithms with the same person. They were in a separate room when they weren't switched on. +3. I checked its algorithm. I connected the two texts. I found the first text. I converted it to the second text. I checked it. +4. 
I wrote on the finite. I cut off infinity. I prepared for the longest amount of work being possible. I found the way of producing the longest amount of work. I produced the longest amount of work. +5. I wrote from the other's point of view. I found the opposite side of the argument. I found the opposite side of each contention. I noted it at the time. I noticed the changes in the future drafts. +6. I smelt the roses. I determined the event's history over time. The people came. The people were supported. Some of them came back for more. +7. I left the simulation. I protected the world with a simulation. I made no mistakes. I made the choice. I could have anything I wanted. +8. I found the file length from the property. I found the property from the file length. I found that the part of the file had the property (was a philosophical algorithm). The file was detailed. It had a certain number of philosophical algorithms. +9. I labelled the definition. I detected the term from the definition. I predicted the content from the structure. I predicted the structure from the content. I named it. +10. I slightly changed the parameters. I detected the definition from the term. I wrote the name of the object. I labelled the predicate. I found various possibilities using machine learning. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Lecturer/Politics.txt b/Lucian-Academy/Books/Lecturer/Politics.txt new file mode 100644 index 0000000000000000000000000000000000000000..f715ab85aefb977d2570f32ffad899d6d9c92fc1 --- /dev/null +++ b/Lucian-Academy/Books/Lecturer/Politics.txt @@ -0,0 +1,12 @@ +["Green, L 2021, Politics, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Politics + +1. I made up more. I had the right to constitution. I found the core philosophies in the constitution. I made them up. I wrote them down. +2. I found that it intermingled with other states. I had the right to state. I noticed the state. I saw it go well. It was working. +3. 
I saw each person govern. I had the right to government. I saw it govern. I saw that government was about it. I saw you there too. +4. I saw changes. I had the right to vote. I found you. I voted with you. I went well. +5. In government, joined 50 As work well. I had the right to amalgamation. I saw big things. I saw them join together. They worked well. +6. I found them going well. The country was equitable. The people had appropriate changes. There were LGBTQIA people. There were disabled people. +7. Everyone could speak. The country maintained high quality of life. It was good here. There was a bathroom. You could do your goal. +8. I liked it. There was a regular re-election of government. I found the government. I re-elected it. It was good. +9. Facebook had a fact checker before publication. The voters regularly verified the policies. There was training for voters. Past acts, the foreseeable effect of policies and whether newcomers would be better were considered. Everything was considered. +10. I used up everything. The best ideas were found. I found the best ideas. I loved them. The people seemed personable."] \ No newline at end of file diff --git a/Lucian-Academy/Books/Lecturer/Psychology.txt b/Lucian-Academy/Books/Lecturer/Psychology.txt new file mode 100644 index 0000000000000000000000000000000000000000..67f6e4a6fa9d812d0f67410b63564e43bc1ce96d --- /dev/null +++ b/Lucian-Academy/Books/Lecturer/Psychology.txt @@ -0,0 +1,14 @@ +["Green, L 2021, Psychology, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Psychology + +1. It was like findall with nondeterministic results. I wrote an algorithm with steps. I called the command with the query. The query had multiple commands. For example, call(Output1,predicate(Input,Output1),Output2) set Output2 to Output1 from predicate. +2. The algorithm grouped and called reused code in a predicate. I wrote a hierarchical algorithm. I called the predicate. I gave the predicate. It called other predicates. +3. 
The recursive algorithm returned the output. I wrote the logical algorithm. I called the recursive algorithm. I gave the recursive algorithm. I gave the base case. +4. I wrote the algorithm. I added the emotion to the learning aid. I called the algorithm in the file. I learned from the algorithm. I was happy. +5. I followed the predicates in order. I wrote the memory aid. I added the predicate to the predicates in memory. I wrote the predicate. I added it at the end. +6. It was also smart (I found the object). I wrote one object name per step. I could read the predicates in memory. It was better to operate on an algorithm that wasn't in memory. It didn't have unpredictable behaviour. +7. I debugged the algorithm. I wrote that everyone was included. I could delete predicates from memory. I found the predicate. I deleted it. +8. I found the change using machine learning. I wrote that the argument was valid. I edited the predicate in memory. I found the argument. I changed it. +9. The machine experienced the emotion. I wrote that the feeling made the experience memorable. I used machine learning to inductively find the algorithm. I wrote the input. I wrote the output. +10. I adjourned the meeting. I tested that I was rested. I had a conversation (which was different each time) with the machine learning algorithm to retrieve the algorithm. I attended the meeting. I listened to and discussed the solutions. I coded the robot. 
+ +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/MEDICINE/.DS_Store b/Lucian-Academy/Books/MEDICINE/.DS_Store new file mode 100644 index 0000000000000000000000000000000000000000..5008ddfcf53c02e82d7eee2e57c38e5672ef89f6 Binary files /dev/null and b/Lucian-Academy/Books/MEDICINE/.DS_Store differ diff --git a/Lucian-Academy/Books/MEDICINE/Anti-Ageing Medicine.txt b/Lucian-Academy/Books/MEDICINE/Anti-Ageing Medicine.txt new file mode 100644 index 0000000000000000000000000000000000000000..201eaf755d93d69bd00cdf04f8d0ff701f98c112 --- /dev/null +++ b/Lucian-Academy/Books/MEDICINE/Anti-Ageing Medicine.txt @@ -0,0 +1,8 @@ +["Green, L 2022, Anti-Ageing Medicine, Lucian Academy Press, Melbourne.","Green, L 2022",1,"Anti-Ageing Medicine + +1. Proper medicine required 4*50 As. I prevented ageing using body replacement and pedagogy. I stopped ageing at the age I saved my body. My body's state was saved and it didn't denature or degenerate. I could live forever. +2. A doctor tested my blood and physical condition. My body's state was saved from the first day, and could be tested medically. One set of 4*50 As was for immortality. Another set of 4*50 As was for anti-ageing. I wrote about the positive medical effects. +3. I thanked the educational institution for supporting anti-ageing. Like with body replacement, I could breason out As with or without a computer. I could use text to breasonings to breason out an A, then indicate anti-ageing each day. Alternatively, I could hand-breason out breasonings (in my mind) and receive the benefits of anti-ageing. I could hand-breason out texts if a company couldn't support it. +4. I noticed that a Complexity subject covered medicine. I attended the educational institution to support anti-ageing. It helped me write arguments. I received As that helped the techniques work. The educational institution provided a course that supported my writing and its effects. +5. I planned my career, and aimed to produce great works. 
I could halt ageing. If I went back to another character's age, I would get older. My skin and hair remained the same, everyone thought I looked young. My agent was happy that I hadn't changed my appearance. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/MEDICINE/Body Replacement.txt b/Lucian-Academy/Books/MEDICINE/Body Replacement.txt new file mode 100644 index 0000000000000000000000000000000000000000..7955972ee3981ade780ccea7ee6c925baf8192c7 --- /dev/null +++ b/Lucian-Academy/Books/MEDICINE/Body Replacement.txt @@ -0,0 +1 @@ +["Green, L 2024, Body Replacement, Lucian Academy Press, Melbourne.","Green, L 2024",1,"() body replacement should have 4*50 medicine arg"] \ No newline at end of file diff --git a/Lucian-Academy/Books/MEDICINE/Immortal world equals real world.txt b/Lucian-Academy/Books/MEDICINE/Immortal world equals real world.txt new file mode 100644 index 0000000000000000000000000000000000000000..7badd420ec2c7e9f781db398fd01aefdb51fc18a --- /dev/null +++ b/Lucian-Academy/Books/MEDICINE/Immortal world equals real world.txt @@ -0,0 +1,2 @@ +["Green, L 2024, Immortal world equals real world, Lucian Academy Press, Melbourne.","Green, L 2024",1,"Use Grammarly Premium: +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/MEDICINE/Immortality Body Replacement Medicine.txt b/Lucian-Academy/Books/MEDICINE/Immortality Body Replacement Medicine.txt new file mode 100644 index 0000000000000000000000000000000000000000..baa6086d6f0570ea398583f9766a20b667df75cf --- /dev/null +++ b/Lucian-Academy/Books/MEDICINE/Immortality Body Replacement Medicine.txt @@ -0,0 +1,9 @@ +["Green, L 2022, Immortality Body Replacement Medicine, Lucian Academy Press, Melbourne.","Green, L 2022",1,"Immortality Body Replacement Medicine + +1. With the additional anti-ageing argument, I was immortal. The trajectory of the body's use repeated when it was replaced. The body had the same state as when it was saved. It was as healthy as then. 
The heart was in the same condition. +2. My body regenerated. My body systems remained the same. The heart pumped blood with the same strength. The muscle, lung and blood cells were the same. The body replacement changed my body to the saved one. +3. I had medicine for pedagogy, meditation, computer science, writing, text to breasonings, mind reading, time travel and immortality. My body systems were healthy. If they had a problem later, my body went back to its former state. I felt happy and healthy. I could avoid degeneration of the body and its diseases. +4. I could take care of meditators. I trawled through the idea of the body. I started with the top of my body. I knew that my body had been replaced in one go. I relaxed. +5. I avoided pathogens. I turned immortality on. I could avoid sickness. With enough As, I could help myself and others in time. I avoided risks. +6. I could meditate on having my body replaced with or without a computer. I automatically breasoned out As with a computer. Text to breasonings may stop working without company support. So, I developed my company to support it. Or, I breasoned out ten breasonings per day per A of the 80 breasonings written at the start. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green 250 Breasonings 1 of 4.txt b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green 250 Breasonings 1 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..f16a22fb62110907947d31d35f59e9b4369007df --- /dev/null +++ b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green 250 Breasonings 1 of 4.txt @@ -0,0 +1,29 @@ +["Green, L 2021, 250 Breasonings 1 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"MEDICINE +by Lucian Green +250 Breasonings 1 of 4 + + + +1. (see Pedagogy and the blog in general) for high quality imagery in writing areas of study (e.g. productions) without discomfort. + +1a. I prepared to write that it is amazing. 
I did this by writing that I am a writer. First, I wrote about writings. Second, I wrote about you. Third, I wrote that I knew you. In this way, I prepared to write that it is amazing by writing that I am a writer. + +2. I prepared to take turns washing and cooking. I did this by writing that I am married to one person. First, I knew one person. Second, I married him. Third, I emphasised this. In this way, I prepared to take turns washing and cooking by writing that I am married to one person. + +3. I prepared to connect high quality imagery (from 50 As) and comfortable medical expansion to 50 As. I did this by writing that everything was given to me. First, I wrote about the children. Second, I wrote about the enigmas. Third, I wrote about child-likenesses. In this way, I prepared to connect high quality imagery (from 50 As) and comfortable medical expansion to 50 As by writing that everything was given to me. + +4. I prepared to check that I had everything I needed. I did this by stating that everything was given to me each day. First, I knew that it was given to me. Second, I said it was all right. Third, I noted that the pedagogy helper was interested in something too. In this way, I prepared to check that I had everything I needed by stating that everything was given to me each day. + +5. I prepared to listen to the classical music, which had an expanse of 50 As. I did this by listening to classical music. First, I found the record. Second, I played it on the gramophone. Third, I listened to the classical music. In this way, I prepared to listen to the classical music, which had an expanse of 50 As by listening to classical music. + +6. I prepared to read the tram-car number. I did this by observing the tram-car. First, I knew you were on the tram-car. Second, I saw you on the tram-car. Third, I waved to you. In this way, I prepared to read the tram-car number by observing the tram-car. + +7. I prepared to drink the glass of water. 
I did this by pouring the drink. First, I found the bottle. Second, I found the glass. Third, I poured the drink. In this way, I prepared to drink the glass of water by pouring the drink. + +8. I prepared to collect data for Block 2 (female) for those not given the 250 breasoning medicine argument to compare high quality imagery in writing areas of study and comfort in expansions to 50 As. I did this by reporting the statistics from the observational study to compare those given and those not given the 250 breasoning medicine argument to compare high quality imagery in writing areas of study and comfort in expansions to 50 As treating gender as a blocking variable. First, I collected data for Block 1 (male) by random allocation for those given the 250 breasoning medicine argument to compare high quality imagery in writing areas of study and comfort in expansions to 50 As. Second, I collected it for those not given the 250 breasoning medicine argument. Third, I collected data for Block 2 (female) by random allocation for those given the 250 breasoning medicine argument to compare high quality imagery in writing areas of study and comfort in expansions to 50 As. In this way, I prepared to collect data for Block 2 (female) for those not given the 250 breasoning medicine argument to compare high quality imagery in writing areas of study and comfort in expansions to 50 As by reporting the statistics from the observational study to compare those given and those not given the 250 breasoning medicine argument to compare high quality imagery in writing areas of study and comfort in expansions to 50 As treating gender as a blocking variable. + +9. I prepared to collect data for Block 2 (female) for those who didn't drink 4 glasses of water and exercise 45 minutes before breakfast. 
I did this by reporting the statistics from the observational study to compare the next bowel movement time of those who did and didn't drink 4 glasses of water and exercise 45 minutes before breakfast treating gender as a blocking variable. First, I collected data for Block 1 (male) by random allocation for those who drank 4 glasses of water and exercised 45 minutes before breakfast to compare the next bowel movement time. Second, I collected it for those who didn't drink 4 glasses of water and exercise 45 minutes before breakfast. Third, I collected data for Block 2 (female) by random allocation for those who drank 4 glasses of water and exercised 45 minutes before breakfast to compare the next bowel movement time. In this way, I prepared to collect data for Block 2 (female) for those who didn't drink 4 glasses of water and exercise 45 minutes before breakfast by reporting the statistics from the observational study to compare the next bowel movement time of those who did and didn't drink 4 glasses of water and exercise 45 minutes before breakfast treating gender as a blocking variable. + +10. I prepared to collect data for Block 2 (female) for those who didn't practise apple meditation. I did this by reporting the statistics from the observational study to compare success in a relationship in those who did and didn't practise apple meditation and treating gender as a blocking variable. First, I collected data for Block 1 (male) by random allocation for those who practised apple meditation to compare success in a relationship. Second, I collected it for those who didn't practise apple meditation. Third, I collected data for Block 2 (female) by random allocation for those who practised apple meditation to compare success in a relationship.
In this way, I prepared to collect data for Block 2 (female) for those who didn't practise apple meditation by reporting the statistics from the observational study to compare success in a relationship in those who did and didn't practise apple meditation and treating gender as a blocking variable. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green 250 Breasonings 2 of 4.txt b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green 250 Breasonings 2 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..8d94615479b333f1a5a6345175c6334b00f70ace --- /dev/null +++ b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green 250 Breasonings 2 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, 250 Breasonings 2 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"MEDICINE +by Lucian Green +250 Breasonings 2 of 4 + +11. I prepared to collect data for Block 2 (female) for those who hadn't been given the dry eyes argument. I did this by reporting the statistics from the observational study to compare change to dry eyes in those who had and hadn't been given the dry eyes argument treating gender as a blocking variable. First, I collected data for Block 1 (male) by random allocation for those who had been given the dry eyes argument to compare change to dry eyes. Second, I collected it for those who hadn't been given the dry eyes argument. Third, I collected data for Block 2 (female) by random allocation for those who had been given the dry eyes argument to compare change to dry eyes. In this way, I prepared to collect data for Block 2 (female) for those who hadn't been given the dry eyes argument by reporting the statistics from the observational study to compare change to dry eyes in those who had and hadn't been given the dry eyes argument treating gender as a blocking variable. + +12. I prepared to collect data for Block 2 (female) for those not taking Elderberry.
I did this by reporting the statistics from the observational study to compare prevention of colds and influenza in those taking and not taking Elderberry treating gender as a blocking variable. First, I collected data for Block 1 (male) by random allocation for those taking Elderberry to compare prevention of colds and influenza. Second, I collected it for those not taking Elderberry. Third, I collected data for Block 2 (female) by random allocation for those taking Elderberry to compare prevention of colds and influenza. In this way, I prepared to collect data for Block 2 (female) for those not taking Elderberry by reporting the statistics from the observational study to compare prevention of colds and influenza in those taking and not taking Elderberry treating gender as a blocking variable. + +13. I prepared to collect data for Block 2 (female) for non-Lucianic medicine graduates. I did this by reporting the statistics from the observational study to compare whether Lucianic medicine graduates or non-graduates have fewer mental breakdowns treating gender as a blocking variable. First, I collected data for Block 1 (male) by random allocation for Lucianic medicine graduates to compare whether they have fewer mental breakdowns. Second, I collected it for non-Lucianic medicine graduates. Third, I collected data for Block 2 (female) by random allocation for Lucianic medicine graduates to compare whether they have fewer mental breakdowns. In this way, I prepared to collect data for Block 2 (female) for non-Lucianic medicine graduates by reporting the statistics from the observational study to compare whether Lucianic medicine graduates or non-graduates have fewer mental breakdowns treating gender as a blocking variable. + +14. I prepared to collect data for Block 2 (female) for non-grains/nuts/fruits/vegetables eaters.
I did this by reporting the statistics from the observational study to compare grains/nuts/fruits/vegetables eaters' and non-grains/nuts/fruits/vegetables eaters' overall health treating gender as a blocking variable. First, I collected data for Block 1 (male) by random allocation for grains/nuts/fruits/vegetables eaters to compare overall health. Second, I collected it for non-grains/nuts/fruits/vegetables eaters. Third, I collected data for Block 2 (female) by random allocation for grains/nuts/fruits/vegetables eaters to compare overall health. In this way, I prepared to collect data for Block 2 (female) for non-grains/nuts/fruits/vegetables eaters by reporting the statistics from the observational study to compare grains/nuts/fruits/vegetables eaters' and non-grains/nuts/fruits/vegetables eaters' overall health treating gender as a blocking variable. + +15. I prepared to collect data for Block 2 (female) for those who haven't got in touch with God about breasonings details. I did this by reporting the statistics from the observational study to compare the ability to see high quality imagery and earn H1s in those who have and haven't got in touch with God about breasonings details, treating gender as a blocking variable. First, I collected data for Block 1 (male) by random allocation for those who have got in touch with God about breasonings details to compare the ability to see high quality imagery and earn H1s. Second, I collected it for those who haven't got in touch with God about breasonings details. Third, I collected data for Block 2 (female) by random allocation for those who have got in touch with God about breasonings details to compare the ability to see high quality imagery and earn H1s.
In this way, I prepared to collect data for Block 2 (female) for those who haven't got in touch with God about breasonings details by reporting the statistics from the observational study to compare the ability to see high quality imagery and earn H1s in those who have and haven't got in touch with God about breasonings details, treating gender as a blocking variable. + +16. I prepared to collect data for Block 2 (female) for those who didn't go to bed at 9:30 PM. I did this by reporting the statistics from the observational study to compare whether those who do or don't go to bed at 9:30 PM feel refreshed on arising treating gender as a blocking variable. First, I collected data for Block 1 (male) by random allocation for those who went to bed at 9:30 PM to compare whether they feel refreshed on arising. Second, I collected it for those who didn't go to bed at 9:30 PM. Third, I collected data for Block 2 (female) by random allocation for those who went to bed at 9:30 PM to compare whether they feel refreshed on arising. In this way, I prepared to collect data for Block 2 (female) for those who didn't go to bed at 9:30 PM by reporting the statistics from the observational study to compare whether those who do or don't go to bed at 9:30 PM feel refreshed on arising treating gender as a blocking variable. + +17. I prepared to collect data for Block 2 (female) for those who didn't request formative feedback on essays and applied for special consideration. I did this by reporting the statistics from the observational study to compare grades of those who did and didn't request formative feedback on essays and applied for special consideration treating gender as a blocking variable. First, I collected data for Block 1 (male) by random allocation for those who requested formative feedback on essays and applied for special consideration to compare grades. Second, I collected it for those who didn't request formative feedback on essays and applied for special consideration.
Third, I collected data for Block 2 (female) by random allocation for those who requested formative feedback on essays and applied for special consideration to compare grades. In this way, I prepared to collect data for Block 2 (female) for those who didn't request formative feedback on essays and applied for special consideration by reporting the statistics from the observational study to compare grades of those who did and didn't request formative feedback on essays and applied for special consideration treating gender as a blocking variable. + +18. I prepared to collect data for Block 2 (female) for those who didn't practise the Head of State Head Ache Prevention technique. I did this by reporting the statistics from the observational study to compare outdoor sun headache in those who did and didn't practise the Head of State Head Ache Prevention technique treating gender as a blocking variable. First, I collected data for Block 1 (male) by random allocation for those who practised the Head of State Head Ache Prevention technique to compare outdoor sun headache. Second, I collected it for those who didn't practise the Head of State Head Ache Prevention technique. Third, I collected data for Block 2 (female) by random allocation for those who practised the Head of State Head Ache Prevention technique to compare outdoor sun headache. In this way, I prepared to collect data for Block 2 (female) for those who didn't practise the Head of State Head Ache Prevention technique by reporting the statistics from the observational study to compare outdoor sun headache in those who did and didn't practise the Head of State Head Ache Prevention technique treating gender as a blocking variable. + +19. I prepared to collect data for Block 2 (people who go shopping on weekends) for couples who hadn't breasoned out a conception argument.
I did this by reporting the statistics from the observational study to compare successful conception and prevention of miscarriage in couples who had and hadn't breasoned out a conception argument treating going shopping on weekdays or weekends as a blocking variable. First, I collected data for Block 1 (people who go shopping on weekdays) by random allocation for couples who had breasoned out a conception argument to compare successful conception and prevention of miscarriage. Second, I collected it for couples who hadn't breasoned out a conception argument. Third, I collected data for Block 2 (people who go shopping on weekends) by random allocation for couples who had breasoned out a conception argument to compare successful conception and prevention of miscarriage. In this way, I prepared to collect data for Block 2 (people who go shopping on weekends) for couples who hadn't breasoned out a conception argument by reporting the statistics from the observational study to compare successful conception and prevention of miscarriage in couples who had and hadn't breasoned out a conception argument treating going shopping on weekdays or weekends as a blocking variable. + +20. I prepared to collect data for Block 2 (female) for those who hadn't practised the Honey Pot technique. I did this by reporting the statistics from the observational study to compare whether those who had and hadn't practised the Honey Pot technique had no headaches in vehicles and on walks treating gender as a blocking variable. First, I collected data for Block 1 (male) by random allocation for those who practised the Honey Pot technique to compare whether they had no headaches in vehicles and on walks. Second, I collected it for those who hadn't practised the Honey Pot technique. Third, I collected data for Block 2 (female) by random allocation for those who practised the Honey Pot technique to compare whether they had no headaches in vehicles and on walks.
In this way, I prepared to collect data for Block 2 (female) for those who hadn't practised the Honey Pot technique by reporting the statistics from the observational study to compare whether those who had and hadn't practised the Honey Pot technique had no headaches in vehicles and walks treating gender as a blocking variable. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green 250 Breasonings 3 of 4.txt b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green 250 Breasonings 3 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..2ecf6812d6e9db29ca44cd376e17605cf0c0bdd1 --- /dev/null +++ b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green 250 Breasonings 3 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, 250 Breasonings 3 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"MEDICINE +by Lucian Green +250 Breasonings 3 of 4 + +21. I prepared to collect data for Block 2 (female) for those who don't laugh. I did this by reporting the statistics from the observational study to compare whether laughing prevents depression in those who do and don't laugh treating gender as a blocking variable. First, I collected data for Block 1 (male) by random allocation for those who laugh to compare whether laughing prevents depression. Second, I collected it for those who don't laugh. Third, I collected data for Block 2 (female) by random allocation for those who laugh to compare whether laughing prevents depression. In this way, I prepared to collect data for Block 2 (female) for those who don't laugh by reporting the statistics from the observational study to compare whether laughing prevents depression in those who do and don't laugh treating gender as a blocking variable. + +22. I prepared to collect data for Block 2 (female) for those who hadn't received the 10 A Medicine major. 
I did this by reporting the statistics from the observational study to compare whether those who had or hadn't received the 10 A Medicine major suffered from less depression treating gender as a blocking variable. First, I collected data for Block 1 (male) by random allocation for those who received the 10 A Medicine major to compare whether they suffered from less depression. Second, I collected it for those who hadn't received the 10 A Medicine major. Third, I collected data for Block 2 (female) by random allocation for those who received the 10 A Medicine major to compare whether they suffered from less depression. In this way, I prepared to collect data for Block 2 (female) for those who hadn't received the 10 A Medicine major by reporting the statistics from the observational study to compare whether those who had or hadn't received the 10 A Medicine major suffered from less depression treating gender as a blocking variable. + +23. I prepared to collect data for Block 2 (female) for those who hadn't received A for their organs. I did this by reporting the statistics from the observational study to compare the A-status of organs in those who had and hadn't received A for their organs treating gender as a blocking variable. First, I collected data for Block 1 (male) by random allocation for those who had received A for their organs to compare the A-status of organs. Second, I collected it for those who hadn't received A for their organs. Third, I collected data for Block 2 (female) by random allocation for those who had received A for their organs to compare the A-status of organs. In this way, I prepared to collect data for Block 2 (female) for those who hadn't received A for their organs by reporting the statistics from the observational study to compare the A-status of organs in those who had and hadn't received A for their organs treating gender as a blocking variable. + +24.
I prepared to collect data for Block 2 (female) for those who didn't monitor their breathing, didn't prevent interpreting symptoms as life threatening and didn't practise the sutra 5 weeks later. I did this by reporting the statistics from the observational study to compare whether a panic attack is prevented in those who did and didn't monitor their breathing, prevent interpreting symptoms as life threatening and practise the sutra 5 weeks later, treating gender as a blocking variable. First, I collected data for Block 1 (male) by random allocation for those who monitored their breathing, prevented interpreting symptoms as life threatening and practised the sutra 5 weeks later to compare whether a panic attack is prevented. Second, I collected it for those who didn't monitor their breathing, didn't prevent interpreting symptoms as life threatening and didn't practise the sutra 5 weeks later. Third, I collected data for Block 2 (female) by random allocation for those who monitored their breathing, prevented interpreting symptoms as life threatening and practised the sutra 5 weeks later to compare whether a panic attack is prevented. In this way, I prepared to collect data for Block 2 (female) for those who didn't monitor their breathing, didn't prevent interpreting symptoms as life threatening and didn't practise the sutra 5 weeks later by reporting the statistics from the observational study to compare whether a panic attack is prevented in those who did and didn't monitor their breathing, prevent interpreting symptoms as life threatening and practise the sutra 5 weeks later, treating gender as a blocking variable. + +25. I prepared to collect data for Block 2 (female) for those who haven't received the Meditation Protector Currant Bun argument. 
I did this by reporting the statistics from the observational study to compare whether the meditation-threatening outdoor headache is prevented in those who have and haven't received the Meditation Protector Currant Bun argument treating gender as a blocking variable. First, I collected data for Block 1 (male) by random allocation for those who have received the Meditation Protector Currant Bun argument to compare whether the meditation-threatening outdoor headache is prevented. Second, I collected it for those who haven't received the Meditation Protector Currant Bun argument. Third, I collected data for Block 2 (female) by random allocation for those who have received the Meditation Protector Currant Bun argument to compare whether the meditation-threatening outdoor headache is prevented. In this way, I prepared to collect data for Block 2 (female) for those who haven't received the Meditation Protector Currant Bun argument by reporting the statistics from the observational study to compare whether the meditation-threatening outdoor headache is prevented in those who have and haven't received the Meditation Protector Currant Bun argument treating gender as a blocking variable. + +26. I prepared to collect data for Block 2 (female) for those who haven't practised the nut and bolt technique. I did this by reporting the statistics from the observational study to compare whether headache, education mistake, muscle ache and unwanted effects of excess breasonings, incompatibility of virality with conception, pimple, unwanted thoughts, hallucinogenic appearances and depression are prevented in those who have and haven't practised the nut and bolt technique, treating gender as a blocking variable. 
First, I collected data for Block 1 (male) by random allocation for those who practised the nut and bolt technique to compare whether headache, education mistake, muscle ache and unwanted effects of excess breasonings, incompatibility of virality with conception, pimple, unwanted thoughts, hallucinogenic appearances and depression are prevented. Second, I collected it for those who haven't practised the nut and bolt technique. Third, I collected data for Block 2 (female) by random allocation for those who practised the nut and bolt technique to compare whether headache, education mistake, muscle ache and unwanted effects of excess breasonings, incompatibility of virality with conception, pimple, unwanted thoughts, hallucinogenic appearances and depression are prevented. In this way, I prepared to collect data for Block 2 (female) for those who haven't practised the nut and bolt technique by reporting the statistics from the observational study to compare whether headache, education mistake, muscle ache and unwanted effects of excess breasonings, incompatibility of virality with conception, pimple, unwanted thoughts, hallucinogenic appearances and depression are prevented in those who have and haven't practised the nut and bolt technique, treating gender as a blocking variable. + +27. I prepared to collect data for Block 2 (female) for those who don't study a course plan including Meditation, Medicine, Computer Science, Creative Writing, Critical Thinking and Pedagogy, and write Pedagogy arguments. I did this by reporting the statistics from the observational study to compare whether grades increase in those who do and don't study a course plan including Meditation, Medicine, Computer Science, Creative Writing, Critical Thinking and Pedagogy, and write Pedagogy arguments treating gender as a blocking variable.
First, I collected data for Block 1 (male) by random allocation for those who study a course plan including Meditation, Medicine, Computer Science, Creative Writing, Critical Thinking and Pedagogy, and write Pedagogy arguments to compare whether grades increase. Second, I collected it for those who don't study a course plan including Meditation, Medicine, Computer Science, Creative Writing, Critical Thinking and Pedagogy, and write Pedagogy arguments. Third, I collected data for Block 2 (female) by random allocation for those who study a course plan including Meditation, Medicine, Computer Science, Creative Writing, Critical Thinking and Pedagogy, and write Pedagogy arguments to compare whether grades increase. In this way, I prepared to collect data for Block 2 (female) for those who don't study a course plan including Meditation, Medicine, Computer Science, Creative Writing, Critical Thinking and Pedagogy, and write Pedagogy arguments by reporting the statistics from the observational study to compare whether grades increase in those who do and don't study a course plan including Meditation, Medicine, Computer Science, Creative Writing, Critical Thinking and Pedagogy, and write Pedagogy arguments treating gender as a blocking variable. + +28. I prepared to collect data for Block 2 (female) for those who didn't study in a University environment or in Perpetual University short courses environment. I did this by reporting the statistics from the observational study to compare whether there was the best breasoning environment (resulting in writing good ideas, i.e. sentences with one positive term, where the sentences each include a different positive term) in those who did and didn't study in a University environment or in Perpetual University short courses environment, treating gender as a blocking variable.
First, I collected data for Block 1 (male) by random allocation for those who studied in a University environment or in Perpetual University short courses environment to compare whether there was the best breasoning environment (resulting in writing good ideas, i.e. sentences with one positive term, where the sentences each include a different positive term). Second, I collected it for those who didn't study in a University environment or in Perpetual University short courses environment. Third, I collected data for Block 2 (female) by random allocation for those who studied in a University environment or in Perpetual University short courses environment to compare whether there was the best breasoning environment (resulting in writing good ideas, i.e. sentences with one positive term, where the sentences each include a different positive term). In this way, I prepared to collect data for Block 2 (female) for those who didn't study in a University environment or in Perpetual University short courses environment by reporting the statistics from the observational study to compare whether there was the best breasoning environment (resulting in writing good ideas, i.e. sentences with one positive term, where the sentences each include a different positive term) in those who did and didn't study in a University environment or in Perpetual University short courses environment, treating gender as a blocking variable. + +29. I prepared to collect data for Block 2 (female) for those who haven't been given the Preventing Sales From Being Dangerous argument and whose products' product, sales and image As must be based on a positive argument, have a single argument and be connected in a structure. 
I did this by reporting the statistics from the observational study to compare whether sales are prevented from being 'dangerous' (the salesperson is prevented from having a medical problem within 1 year) in those who have and haven't been given the Preventing Sales From Being Dangerous argument and whose products' product, sales and image As must be based on a positive argument, have a single argument and be connected in a structure, treating gender as a blocking variable. First, I collected data for Block 1 (male) by random allocation for those who have been given the Preventing Sales From Being Dangerous argument and whose products' product, sales and image As must be based on a positive argument, have a single argument and be connected in a structure to compare whether sales are prevented from being 'dangerous' (the salesperson is prevented from having a medical problem within 1 year). Second, I collected it for those who haven't been given the Preventing Sales From Being Dangerous argument and whose products' product, sales and image As must be based on a positive argument, have a single argument and be connected in a structure. Third, I collected data for Block 2 (female) by random allocation for those who have been given the Preventing Sales From Being Dangerous argument and whose products' product, sales and image As must be based on a positive argument, have a single argument and be connected in a structure to compare whether sales are prevented from being 'dangerous' (the salesperson is prevented from having a medical problem within 1 year). 
In this way, I prepared to collect data for Block 2 (female) for those who haven't been given the Preventing Sales From Being Dangerous argument and whose products' product, sales and image As must be based on a positive argument, have a single argument and be connected in a structure by reporting the statistics from the observational study to compare whether sales are prevented from being 'dangerous' (the salesperson is prevented from having a medical problem within 1 year) in those who have and haven't been given the Preventing Sales From Being Dangerous argument and whose products' product, sales and image As must be based on a positive argument, have a single argument and be connected in a structure, treating gender as a blocking variable. + +30. I prepared to collect data for Block 2 (female) for those who haven't practised the Quantum box/prayer technique. I did this by reporting the statistics from the observational study to compare whether headache, education mistake, muscle ache and unwanted effects of excess breasonings, incompatibility of virality with conception, pimple, unwanted thoughts, hallucinogenic appearances and depression are prevented in those who have and haven't practised the Quantum box/prayer technique, treating gender as a blocking variable. First, I collected data for Block 1 (male) by random allocation for those who practised the Quantum box/prayer technique to compare whether headache, education mistake, muscle ache and unwanted effects of excess breasonings, incompatibility of virality with conception, pimple, unwanted thoughts, hallucinogenic appearances and depression are prevented. Second, I collected it for those who haven't practised the Quantum box/prayer technique. 
Third, I collected data for Block 2 (female) by random allocation for those who practised the Quantum box/prayer technique to compare whether headache, education mistake, muscle ache and unwanted effects of excess breasonings, incompatibility of virality with conception, pimple, unwanted thoughts, hallucinogenic appearances and depression are prevented. In this way, I prepared to collect data for Block 2 (female) for those who haven't practised the Quantum box/prayer technique by reporting the statistics from the observational study to compare whether headache, education mistake, muscle ache and unwanted effects of excess breasonings, incompatibility of virality with conception, pimple, unwanted thoughts, hallucinogenic appearances and depression are prevented in those who have and haven't practised the Quantum box/prayer technique, treating gender as a blocking variable. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green 4 Glasses of Water and Exercise 45 Minutes Before Breakfast 1 of 4.txt b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green 4 Glasses of Water and Exercise 45 Minutes Before Breakfast 1 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..211738e95430fa7091cb4c413eb12a65bae6aec3 --- /dev/null +++ b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green 4 Glasses of Water and Exercise 45 Minutes Before Breakfast 1 of 4.txt @@ -0,0 +1,29 @@ +["Green, L 2021, 4 Glasses of Water and Exercise 45 Minutes Before Breakfast 1 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"MEDICINE +by Lucian Green +4 Glasses of Water and Exercise 45 Minutes Before Breakfast 1 of 4 + + + +1. To prepare the digestive tract for breakfast and to prepare the body for the day. + +1a. I prepared to choose Derrida. I did this by providing Arabic Studies service as breasonings currency. First, I lit up the Arabic-speaking states. Second, I helped myself to the H1s.
Third, I enunciated 'ha' clearly. In this way, I prepared to choose Derrida by providing Arabic Studies service as breasonings currency. + +2. I prepared to . I did this by providing Architectural History service as breasonings currency. First, I wrote on the balustrades. Second, I ate the tofu snuff. Third, I admired the bluestone University building. In this way, . + +3. I prepared to eat the tofu. I did this by providing Art History service as breasonings currency. First, I found the halchitectures (sic). Second, I used the aqualung. Third, I needed you now. In this way, I prepared to eat the tofu by providing Art History service as breasonings currency. + +4. I prepared to eat the blackberry. I did this by providing Asian Studies service as breasonings currency. First, I entertained my partner. Second, I loved his huckleys. Third, I applied make-up to him. In this way, I prepared to eat the blackberry by providing Asian Studies service as breasonings currency. + +5. I prepared to include the heart. I did this by providing Astronomy service as breasonings currency. First, I examined the star. Second, I saw if an examiner had changed after looking at it. Third, I included the septum. In this way, I prepared to include the heart by providing Astronomy service as breasonings currency. + +6. I prepared to land in the plane. I did this by providing Australian Indigenous Studies service as breasonings currency. First, I ate the tart. Second, I drank the strawberry juice. Third, I helped them to take off. In this way, I prepared to land in the plane by providing Australian Indigenous Studies service as breasonings currency. + +7. I prepared to be with the best. I did this by providing Australian Studies service as breasonings currency. First, I held the lamb. Second, I held the Mardi Gras badge. Third, I asked, 'Do you like me?' In this way, I prepared to be with the best by providing Australian Studies service as breasonings currency. + +8. I prepared to sketch Venus. 
I did this by providing Biology and Botany service as breasonings currency. First, I found the barnacles. Second, I drew them. Third, I made an etching. In this way, I prepared to sketch Venus by providing Biology and Botany service as breasonings currency. + +9. I prepared to jive on. I did this by providing Business law service as breasonings currency. First, I paid breasoning currency with business law service. Second, I verified the job task with this currency. Third, I protected the consumer with the business. In this way, I prepared to jive on by providing Business law service as breasonings currency. + +10. I prepared to eat vegan lasagne. I did this by providing Catalan service as breasonings currency. First, I found the house. Second, I spoke to its occupants in Catalan. Third, I gave them a souvenir stamp. In this way, I prepared to eat vegan lasagne by providing Catalan service as breasonings currency. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green 4 Glasses of Water and Exercise 45 Minutes Before Breakfast 2 of 4.txt b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green 4 Glasses of Water and Exercise 45 Minutes Before Breakfast 2 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..658f29b056cc99df120bccac7444eba94d535b62 --- /dev/null +++ b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green 4 Glasses of Water and Exercise 45 Minutes Before Breakfast 2 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, 4 Glasses of Water and Exercise 45 Minutes Before Breakfast 2 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"MEDICINE +by Lucian Green +4 Glasses of Water and Exercise 45 Minutes Before Breakfast 2 of 4 + +11. I prepared to eat the edible chemicals. I did this by providing Chemistry service as breasonings currency. First, I found the Chemical. Second, I found the head. Third, I joined them together.
In this way, I prepared to eat the edible chemicals by providing Chemistry service as breasonings currency. + +12. I prepared to offer shelter. I did this by providing Chinese Language and Studies service as breasonings currency. First, I described the picture. Second, I observed the water-carrying person with lower socio-economic status. Third, I admired his tenacity. In this way, I prepared to offer shelter by providing Chinese Language and Studies service as breasonings currency. + +13. I prepared to design the cinema screen. I did this by providing Cinema Studies service as breasonings currency. First, I described the rod. Second, I gave him bread. Third, I saw his interludes. In this way, I prepared to design the cinema screen by providing Cinema Studies service as breasonings currency. + +14. I prepared to write a comment on the annotation. I did this by providing Classical Studies and Archaeology service as breasonings currency. First, I found the carving. Second, I ate an edible version of it. Third, I annotated the clipping. In this way, I prepared to write a comment on the annotation by providing Classical Studies and Archaeology service as breasonings currency. + +15. I prepared to examine the Computer applications. I did this by providing Computer applications in the Social Sciences and Humanities service as breasonings currency. First, I examined the Social Sciences. Second, I examined the Humanities. Third, I provided service in them. In this way, I prepared to examine the Computer applications by providing Computer applications in the Social Sciences and Humanities service as breasonings currency. + +16. I prepared to program the computer. I did this by providing Computer Science service as breasonings currency. First, I loved breasonings. Second, I weasoned it out. Third, I spotted the idea. In this way, I prepared to program the computer by providing Computer Science service as breasonings currency. + +17. I prepared to design each participle. 
I did this by providing Creative Writing service as breasonings currency. First, I manufactured the robot. Second, I aided the conclusions. Third, I made it with the secondary school English teacher. In this way, I prepared to design each participle by providing Creative Writing service as breasonings currency. + +18. I prepared to examine the seen-as arguments. I did this by providing Criminology service as breasonings currency. First, I held the person who prevented committing a crime down to a good thing. Second, I guided his thoughts. Third, I admired him. In this way, I prepared to examine the seen-as arguments by providing Criminology service as breasonings currency. + +19. I prepared to help the Earthlings along. I did this by providing Cultural Studies service as breasonings currency. First, I decided on climatology. Second, I helped him breathe. Third, I cared for the environment. In this way, I prepared to help the Earthlings along by providing Cultural Studies service as breasonings currency. + +20. I prepared to eat the pasta shells. I did this by providing Development Studies service as breasonings currency. First, I developed the lady. Second, I found her gutter. Third, I finished it off for her. In this way, I prepared to eat the pasta shells by providing Development Studies service as breasonings currency.
+ +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green 4 Glasses of Water and Exercise 45 Minutes Before Breakfast 3 of 4.txt b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green 4 Glasses of Water and Exercise 45 Minutes Before Breakfast 3 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..b4dcde2ec8b64315df1c2b2bee2aef15d2ed5ebf --- /dev/null +++ b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green 4 Glasses of Water and Exercise 45 Minutes Before Breakfast 3 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, 4 Glasses of Water and Exercise 45 Minutes Before Breakfast 3 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"MEDICINE +by Lucian Green +4 Glasses of Water and Exercise 45 Minutes Before Breakfast 3 of 4 + +21. I prepared to design my gastric juices. I did this by providing Earth Sciences service as breasonings currency. First, I helped the Earth. Second, I designed the pencil carefully. Third, I lectured the students on it. In this way, I prepared to design my gastric juices by providing Earth Sciences service as breasonings currency. + +22. I prepared to help my whole body by relaxing. I did this by providing Economics service as breasonings currency. First, I bought the glass of water. Second, I drank it. Third, I exercised 45 minutes before breakfast. In this way, I prepared to help my whole body by relaxing by providing Economics service as breasonings currency. + +23. I prepared to read the books. I did this by providing English Literary Studies service as breasonings currency. First, I noticed the painting. Second, I noticed the lulang. Third, I prayed (worked). In this way, I prepared to read the books by providing English Literary Studies service as breasonings currency. + +24. I prepared to drink next to shell fish carcinogeenan (anti-carcinogen). I did this by providing English as a Second Language service as breasonings currency. First, I gave the prince a job. 
Second, I laughed with him. Third, I wrote red herrings. In this way, I prepared to drink next to shell fish carcinogeenan (anti-carcinogen) by providing English as a Second Language service as breasonings currency. + +25. I prepared to be at the Bve (sic) front. I did this by providing English Language Studies service as breasonings currency. First, I wrote home. Second, I deciphered it. Third, I passed along the nanotubes. In this way, I prepared to be at the Bve (sic) front by providing English Language Studies service as breasonings currency. + +26. I prepared to eat vegan breakfast. I did this by providing Communication Skills service as breasonings currency. First, I admired the captain. Second, I recommended that he drink the water. Third, I saw him share the water with his first mate. In this way, I prepared to eat vegan breakfast by providing Communication Skills service as breasonings currency. + +27. I prepared to enjoy life. I did this by providing Environmental Studies service as breasonings currency. First, I helped the heptagon. Second, I examined the pentose sugar. Third, I split it open to make new DNA. In this way, I prepared to enjoy life by providing Environmental Studies service as breasonings currency. + +28. I prepared to write a song about the life. I did this by providing European Studies service as breasonings currency. First, I anticipated the beginning of a new continent in 2016. Second, I found the parliamentarians. Third, I shitsued it to pieces. In this way, I prepared to write a song about the life by providing European Studies service as breasonings currency. + +29. I prepared to ask how he could stand with the cosmologue (philosopher). I did this by providing French service as breasonings currency. First, I found the French men. Second, I asked them to stand up. Third, I determined that it would remain on Earth with each religion (philosophy).
In this way, I prepared to ask how he could stand with the cosmologue (philosopher) by providing French service as breasonings currency. + +30. I prepared to ask, 'Whose gender is it?' I did this by providing Gender Studies service as breasonings currency. First, I didn't use animal products. Second, I had a conversation with two animals with technology. Third, I determined how to earn 100% and earned it. In this way, I prepared to ask, 'Whose gender is it?' by providing Gender Studies service as breasonings currency. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Berocca Prevents Colds and Flu 1 of 4.txt b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Berocca Prevents Colds and Flu 1 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..1345866826453797e8ed26fd5fcd75fac1018687 --- /dev/null +++ b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Berocca Prevents Colds and Flu 1 of 4.txt @@ -0,0 +1,29 @@ +["Green, L 2021, Berocca Prevents Colds and Flu 1 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"MEDICINE +by Lucian Green +Berocca Prevents Colds and Flu 1 of 4 + + + +1. Berocca, or Elderberry Block Binding Sites of Bacteria and Viruses. + +1a. I prepared to block the pathogen from binding with the body. I did this by finding the binding site. First, I found the pathogen. Second, I found the possible binding site in the body. Third, I found whether the pathogen bound with the binding site. In this way, I prepared to block the pathogen from binding with the body by finding the binding site. + +2. I prepared to feel healthier. I did this by drinking the Elderberry syrup. First, I identified that I needed the Elderberry syrup. Second, I measured the Elderberry syrup. Third, I drank the Elderberry syrup. In this way, I prepared to feel healthier by drinking the Elderberry syrup. + +3. I prepared to enjoy the fragrance. I did this by pouring the Frankincense.
First, I lifted the bottle of Frankincense from the shelf. Second, I poured it into another bottle. Third, I fragranced myself. In this way, I prepared to enjoy the fragrance by pouring the Frankincense. + +4. I prepared to demonstrate verification of input. I did this by wearing pitchouli, (sic) demonstrating verification of input. First, I wrote to the religious leader (philosopher). Second, I thought about the pitchouli. Third, I wore it. In this way, I prepared to demonstrate verification of input by wearing pitchouli, (sic) demonstrating verification of input. + +5. I prepared to taste the same tea as the one drunk by the early settler. I did this by writing that the early settler drank the natural tea. First, I noted that he used the natural tea leaves. Second, I noted he boiled the water on the stove. Third, I noted that he made the tea. In this way, I prepared to taste the same tea as the one drunk by the early settler by writing that the early settler drank the natural tea. + +6. I prepared to observe Dumptonesquenesses. I did this by observing the feminist write about the sponge boy. First, I observed the boy with the sponge. Second, I observed the feminist notice him. Third, I observed her write about him. In this way, I prepared to observe Dumptonesquenesses by observing the feminist write about the sponge boy. + +7. I prepared to hold the berry. I did this by picking the berry. First, I observed the people picking over the plant. Second, I wrote about the plant. Third, I picked the berry. In this way, I prepared to hold the berry by picking the berry. + +8. I prepared to put a berry on the plate. I did this by crushing the berry. First, I placed the berry in the mortar. Second, I crushed it using the pestle. Third, I tasted it. In this way, I prepared to put a berry on the plate by crushing the berry. + +9. I prepared to drink the liquefied berry. I did this by holding the berry up. First, I selected the berry. Second, I picked the berry.
Third, I held the berry up. In this way, I prepared to drink the liquefied berry by holding the berry up. + +10. I prepared to give the berry drink to someone else. I did this by mulling the berry. First, I boiled water, sugar, sliced orange, sliced lemon, cinnamon sticks, whole cloves and berries and simmered for 5 minutes. Second, I added wine and simmered for 10 minutes. Third, I removed the solids. In this way, I prepared to give the berry drink to someone else by mulling the berry. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Berocca Prevents Colds and Flu 2 of 4.txt b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Berocca Prevents Colds and Flu 2 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..d4e696f416649dcb332d97b17e7904f4b9791aac --- /dev/null +++ b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Berocca Prevents Colds and Flu 2 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Berocca Prevents Colds and Flu 2 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"MEDICINE +by Lucian Green +Berocca Prevents Colds and Flu 2 of 4 + +11. I prepared to place the hull in the receptacle. I did this by removing the berry after juicing it. First, I juiced the berry. Second, I held the spoon. Third, I lifted the berry hull. In this way, I prepared to place the hull in the receptacle by removing the berry after juicing it. + +12. I prepared to write the book about the elderberry. I did this by choosing the elderberry. First, I aimed to find a food with antioxidants which prevented disease. Second, I found the elderberry. Third, I processed the elderberry. In this way, I prepared to write the book about the elderberry by choosing the elderberry. + +13. I prepared to ingest the drink. I did this by drinking by using the straw. First, I held the drink. Second, I inserted the straw in it. Third, I drank the drink by using the straw. 
In this way, I prepared to ingest the drink by drinking by using the straw. + +14. I prepared to love it. I did this by drinking the elderberry juice. First, I entered the pantry. Second, I found the labelled elderberry juice. Third, I drank the juice from the cup. In this way, I prepared to love it by drinking the elderberry juice. + +15. I prepared to initiate remaining healthy. I did this by drinking the juice with my tongue. First, I pipetted the juice onto my tongue. Second, I swallowed the first droplet. Third, I swallowed each droplet. In this way, I prepared to initiate remaining healthy by drinking the juice with my tongue. + +16. I prepared to drink water. I did this by drinking the juice with my mouth. First, I pipetted an aliquot of juice onto my gums. Second, I massaged it onto my gums. Third, I swallowed it with water. In this way, I prepared to drink water by drinking the juice with my mouth. + +17. I prepared to love God (the leader/non enslaving master). I did this by tasting the juice. First, I held the berry between my fingers. Second, I extracted juice from it using a syringe. Third, I tasted this with my taste bud. In this way, I prepared to love God (the leader/non enslaving master) by tasting the juice. + +18. I prepared to taste the next extract. I did this by tasting the syrup. First, I found the syrup bottle. Second, I dropped a drop onto my fingertip. Third, I recorded the taste. In this way, I prepared to taste the next extract by tasting the syrup. + +19. I prepared to drink the pure water. I did this by brushing my teeth after drinking the juice. First, I held the toothbrush. Second, I squeezed the toothpaste onto the toothbrush. Third, I brushed my teeth. In this way, I prepared to drink the pure water by brushing my teeth after drinking the juice. + +20. I prepared to recline. I did this by using dental floss after drinking the syrup. First, I placed the dental floss between my teeth. Second, I removed debris from between my teeth.
Third, I placed the dental floss in the bin. In this way, I prepared to recline by using dental floss after drinking the syrup. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Berocca Prevents Colds and Flu 3 of 4.txt b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Berocca Prevents Colds and Flu 3 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..6bef220f4f11f8d2de7be90b4dbe2e82c05a7083 --- /dev/null +++ b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Berocca Prevents Colds and Flu 3 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Berocca Prevents Colds and Flu 3 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"MEDICINE +by Lucian Green +Berocca Prevents Colds and Flu 3 of 4 + +21. I prepared to take the money for selling the freesias. I did this by smelling the freesias. First, I cultivated the freesias. Second, I smelt them. Third, I sold them. In this way, I prepared to take the money for selling the freesias by smelling the freesias. + +22. I prepared to dine. I did this by smelling the happy people. First, I smelt their shirts. Second, I smelt their hair. Third, I smelt their linen. In this way, I prepared to dine by smelling the happy people. + +23. I prepared to break the economic barriers. I did this by liking laurels. First, I picked the laurels. Second, I explained they were bay leaves. Third, I made the laurel wreath. In this way, I prepared to break the economic barriers by liking laurels. + +24. I prepared to adorn the God (leader). I did this by drinking the nectar. First, I loved the nectar. Second, I helped it rosuly (sic). Third, I loved it. In this way, I prepared to adorn the God (leader) by drinking the nectar. + +25. I prepared to eat the carrot. I did this by holding the sacrosancts. First, I cooked the carrot slice. Second, I oiled it. Third, I lubricated my lips. In this way, I prepared to eat the carrot by holding the sacrosancts. 
+ +26. I prepared to announce that I had won when all my counters were home. I did this by playing Ludo. First, I threw the dice. Second, I moved the counter. Third, I made a pile of all the counters at the base when six was thrown. In this way, I prepared to announce that I had won when all my counters were home by playing Ludo. + +27. I prepared to read the time. I did this by rotating around the clock. First, I rotated to 3 PM. Second, I rotated to 6 PM. Third, I rotated to 9 PM. In this way, I prepared to read the time by rotating around the clock. + +28. I prepared to write about the areas of study. I did this by stating that I am University. First, I uploaded the degrees. Second, I advertised them to students. Third, I stated that this was University. In this way, I prepared to write about the areas of study by stating that I am University. + +29. I prepared to prevent rape (ensure respect). I did this by observing the person state that he loved you. First, I observed him state that he studied pedagogy. Second, I observed him state that he studied medicine. Third, I observed him state that he studied meditation. In this way, I prepared to prevent rape (ensure respect) by observing the person state that he loved you. + +30. I prepared to seize the day. I did this by loving art. First, I looked the artwork up in the catalogue. Second, I sat down. Third, I experienced the artwork. In this way, I prepared to seize the day by loving art. 
+ +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Berocca Prevents Colds and Flu 4 of 4.txt b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Berocca Prevents Colds and Flu 4 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..274fb98552e9dae9e7ed9090ad7ee7f4512247aa --- /dev/null +++ b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Berocca Prevents Colds and Flu 4 of 4.txt @@ -0,0 +1,31 @@ +["Green, L 2021, Berocca Prevents Colds and Flu 4 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"MEDICINE +by Lucian Green +Berocca Prevents Colds and Flu 4 of 4 + +31. I prepared to eat the purple carrot. I did this by finding the rod. First, I opened the void. Second, I held the end of the rod. Third, I removed it. In this way, I prepared to eat the purple carrot by finding the rod. + +32. I prepared to know you ate the vegetable. I did this by knowing you. First, I gave you the carrot. Second, I let you eat it. Third, I knew you with it. In this way, I prepared to know you ate the vegetable by knowing you. + +33. I prepared to read the minds of others (plan e.g. argue in Conglish). I did this by helping the child with the character to have pedagogy done for her. First, I arranged the pedagogies. Second, I did them for her. Third, I found her grades. In this way, I prepared to read the minds of others (plan e.g. argue in Conglish) by helping the child with the character to have pedagogy done for her. + +34. I prepared to rest before the next day. I did this by keeping the house warm. First, I performed exercise. Second, I wore a jumper. Third, I slept in the soft bed. In this way, I prepared to rest before the next day by keeping the house warm. + +35. I prepared to invite the guests over. I did this by eating the mince tart. First, I placed minced apple in the tart. Second, I placed minced pear in the tart. Third, I placed minced sultanas in the tart. 
In this way, I prepared to invite the guests over by eating the mince tart. + +36. I prepared to construct the cup. I did this by dismantling the square prism. First, I removed the top of the square prism. Second, I removed the sides of the square prism. Third, I removed the base of the square prism. In this way, I prepared to construct the cup by dismantling the square prism. + +37. I prepared to prevent the 100 year PhD. I did this by liking you. First, I confirmed that there should be 100, not 500+ assessable As per Research Masters and PhD assignment chapter. Second, I stated that this was quick enough. Third, I stated that this was easy enough. In this way, I prepared to prevent the 100 year PhD by liking you. + +38. I prepared to examine the structure. I did this by drawing wiring of elderberry. First, I found the wire-like structures in the elderberry. Second, I connected them together. Third, I drew the structure. In this way, I prepared to examine the structure by drawing wiring of elderberry. + +39. I prepared to receive my rating as a teacher. I did this by helping you. First, I helped you. Second, I verified your understanding. Third, I listened to your feedback. In this way, I prepared to receive my rating as a teacher by helping you. + +40. I prepared to refill the bottle. I did this by drinking from the 'Drink This' bottle. First, I found the labelled bottle. Second, I drank from it. Third, I washed it. In this way, I prepared to refill the bottle by drinking from the 'Drink This' bottle. + +41. I prepared to use the elderberry while it was safe. I did this by reusing the bottle. First, I used elderberry every day until I was 55. Second, I stopped on my 55th birthday. Third, I stated that this was to prevent my stomach producing cancer cells. In this way, I prepared to use the elderberry while it was safe by reusing the bottle. + +42. I prepared to depict the illustration of the elderberry. I did this by laughing for elderberry. 
First, I found the berry. Second, I used it as a paste. Third, I finished the artwork. In this way, I prepared to depict the illustration of the elderberry by laughing for elderberry. + + + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Brain 1 of 4.txt b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Brain 1 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..772f6463b73e5cf88bdc529d33cb8958f9254f54 --- /dev/null +++ b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Brain 1 of 4.txt @@ -0,0 +1,30 @@ +["Green, L 2021, Brain 1 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"MEDICINE +by Lucian Green +Brain 1 of 4 + + + + +1. The meditator's brain is better supported. + +1a. I prepared to use my brain. I did this by turning the brain on. First, I found the switch. Second, I turned the brain light on. Third, I admired the brain. In this way, I prepared to use my brain by turning the brain on. + +2. I prepared to create the neuron. I did this by tickling the fish. First, I found the fish. Second, I placed the net in the water. Third, the fish jumped out of the water. In this way, I prepared to create the neuron by tickling the fish. + +3. I prepared to earn 100%. I did this by researching the answer. First, I looked up the topic in the lecture notes. Second, I read the answer. Third, I paraphrased it in my work. In this way, I prepared to earn 100% by researching the answer. + +4. I prepared to finish the degree. I did this by exiting through the correct door. First, I found the map. Second, I found the path on the map. Third, I found the correct door compared with where I was. In this way, I prepared to finish the degree by exiting through the correct door. + +5. I prepared to answer the question. I did this by linking the question to the answer. First, I read the question. Second, I read the answer. 
Third, I verified that the answer developedly answered the question. In this way, I prepared to answer the question by linking the question to the answer. + +6. Powerman prepared to use his will. He did this by thinking projects through. First, he thought of the start. Second, he thought of the middle. Third, he thought of the finish. In this way, Powerman prepared to use his will by thinking projects through. + +7. The mathematician prepared to verify orthodontic calculations. She did this by calculating the velocity of the tooth from its initial and final position. First, she wrote down the final position of the tooth. Second, she subtracted the initial position from this. Third, she divided this by the time. In this way, she prepared to verify orthodontic calculations by calculating the velocity of the tooth from its initial and final position. + +8. I prepared to say how good I was. I did this by filling a model brain cell with my discoveries. First, I put my first discovery into the brain cell. Second, I put my second discovery into the brain cell. Third, I put my third discovery into the brain cell. In this way, I prepared to say how good I was by filling a model brain cell with my discoveries. + +9. I prepared to swim the race. I did this by swimming a mile. First, I got into my swimmers. Second, I got into the pool. Third, I paddled around for a bit. In this way, I prepared to swim the race by swimming a mile. + +10. I prepared to find the line's gradient. I did this by deriving the function in terms of x. First, I multiplied the coefficient of the first term by the value of the power of x and decreased the power of x by 1, and did this for the rest of the terms. Second, I rewrote x^0 as 1, simplifying a*1 to a. Third, I rewrote terms with negative powers of x (a*(x^-b)) as a/(x^b). In this way, I prepared to find the line's gradient by deriving the function in terms of x.
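The differentiation steps in 10. can be sketched as a short program. This is a hypothetical illustration only (the `derive` and `show` helpers are not part of this text), assuming a polynomial is stored as a mapping from power to coefficient:

```python
# Sketch of the steps in 10., assuming 3x^2 + 5x + 7 is stored as
# the mapping {2: 3, 1: 5, 0: 7} (power -> coefficient).

def derive(poly):
    # Multiply each coefficient by its power and decrease the power by 1;
    # the x^0 (constant) term differentiates to zero and is dropped.
    return {p - 1: c * p for p, c in poly.items() if p != 0}

def show(poly):
    # Render terms, writing x^0 terms as plain numbers and
    # negative powers a*(x^-b) as a/(x^b), as in steps two and three.
    def term(p, c):
        if p == 0:
            return str(c)
        if p < 0:
            return '%s/(x^%s)' % (c, -p)
        return '%s*x^%s' % (c, p)
    return ' + '.join(term(p, c)
                      for p, c in sorted(poly.items(), reverse=True))

# d/dx (3x^2 + 5x + 7) = 6x + 5
print(show(derive({2: 3, 1: 5, 0: 7})))
```

The gradient of the line tangent to the curve is then the derivative evaluated at the chosen x.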
+ +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Brain 2 of 4.txt b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Brain 2 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..eeff242b803cdac219d8435f0c312a9f963349aa --- /dev/null +++ b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Brain 2 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Brain 2 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"MEDICINE +by Lucian Green +Brain 2 of 4 + +11. I prepared to win the game. I did this by verifying that the state was checkmate. First, I verified that the king was being attacked. Second, I verified that the piece attacking the king could not be blocked or taken. Third, I verified that the king couldn't move out of checkmate. In this way, I prepared to win the game by verifying that the state was checkmate. + +12. I prepared to create a form in chaos theory. I did this by making a rainbow from a rainbow. First, I plotted two points. Second, I made a rainbow between them. Third, I chose two adjacent colours and repeated this. In this way, I prepared to create a form in chaos theory by making a rainbow from a rainbow. + +13. I prepared to buy new shoes. I did this by measuring my foot size. First, I measured my foot's length. Second, I measured my foot's width. Third, I measured my foot's height. In this way, I prepared to buy new shoes by measuring my foot size. + +14. I prepared to go for a jog. I did it by running on the spot. First, I stood up. Second, I bounced my knees as I jogged on the spot. Third, I moved my arms back and forth as I jogged on the spot. In this way, I prepared to go for a jog by running on the spot. + +15. I prepared to show I had a sharp mind. I did this by substituting the variables into the equation. First, I substituted the first variable into the equation. Second, I prepared to substitute the second variable into the equation. 
Third, I repeated this until I had substituted all the variables into the equation. In this way, I prepared to show I had a sharp mind by substituting the variables into the equation. + +16. I prepared to calculate the circle's area to equal pi*(radius squared). I did this by calculating pi to equal the ratio of the circle's circumference to its diameter. First, I measured the circle's circumference. Second, I measured its diameter. Third, I divided the circle's circumference by its diameter to equal pi. In this way, I prepared to calculate the circle's area by calculating pi. + +17. I prepared to live a long life. I did this by meditating daily. First, I looked into the wormhole. Second, I looked at high quality thoughts. Third, I meditated for twenty minutes twice per day. In this way, I prepared to live a long life by meditating daily. + +18. I prepared to live a healthy life. I did this by practising yoga each day. First, I wrote 10 As to count as a major in organ medicine, with one speed, then another speed for the organs as arguments. Second, I practised yoga in the mornings. Third, I practised yoga in the evenings. In this way, I prepared to live a healthy life by practising yoga each day. + +19. I prepared to weight train. I did this by starting with a lighter weight. First, I ranked the weights by size. Second, I found the smallest weight. Third, I selected this as the lightest weight, assuming all the weights were made out of the same metal. In this way, I prepared to weight train by starting with a lighter weight. + +20. I prepared to practise aerobics. I did this by performing a star jump. First, I stood still. Second, I moved my legs outwards and my arms above my head as I jumped. Third, I returned my limbs to resting position as my feet touched the ground. In this way, I prepared to practise aerobics by performing a star jump.
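The calculation in 16. can be checked with a minimal sketch (a hypothetical Python illustration only; the `ratio_pi` and `circle_area` names are not from this text):

```python
import math

# 16.: pi equals the ratio of a circle's circumference to its diameter,
# and the circle's area then equals pi * (radius squared).
def ratio_pi(circumference, diameter):
    return circumference / diameter

def circle_area(radius, pi):
    return pi * radius ** 2

# Measuring a unit circle: circumference 2*pi, diameter 2.
pi_measured = ratio_pi(2 * math.pi, 2.0)
print(circle_area(1.0, pi_measured))  # close to 3.14159...
```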
+ +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Brain 3 of 4.txt b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Brain 3 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..db303163234de16f941d62f4ef787f58e99c2c93 --- /dev/null +++ b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Brain 3 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Brain 3 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"MEDICINE +by Lucian Green +Brain 3 of 4 + +21. I prepared to love you. I did this by bathing you. First, I bathed your ears. Second, I bathed your cuticles. Third, I bathed your pinkies. In this way, I prepared to love you by bathing you. + +22. I prepared to love myself. I did this by scrubbing my feet. First, I filled a small bucket with water. Second, I dabbed a foottowel in the water. Third, I tickled my right foot with the towel. In this way, I prepared to love myself by scrubbing my feet. + +23. I prepared to love everyone. I did this by ticking the name off the guestlist. First, I recognised the person's face. Second, I searched for the person's name. Third, I ticked her name off the guestlist. In this way, I prepared to love everyone by ticking the name off the guestlist. + +24. I prepared to love everyone else. I did this by writing the non-guest's name down. First, the non-guest told me her name wasn't on the guestlist. Second, she told me her name. Third, I wrote the non-guest's name down on the guestlist. I prepared to love everyone else by writing the non-guest's name down. + +25. The firefighter prepared to fight the fire. He did this by attaching the hose to the fire hydrant. First, he unwound the hose. Second, he attached it to the fire hydrant. Third, he carried the hose to the fire. In this way, the firefighter prepared to fight the fire by attaching the hose to the fire hydrant. + +26. I prepared to love you. I did this by drawing the group close together. 
First, I copied knowledge into the group. Second, I compared the knowledge in the group. Third, I came to a conclusion about the group. In this way, I prepared to love you by drawing the group close together. + +27. I prepared to say 'I love me'. I did this by ranking myself first. First, I found the list. Second, I found my name. Third, I reranked it with my name first. In this way, I prepared to say 'I love me' by ranking myself first. + +28. The polycaring man prepared to say 'I love everybody'. He did this by caring for his family members. First, he cared for his first family member. Second, he prepared to care for his second family member. Third, he repeated this until he had cared for all his family members. In this way, the polycaring man prepared to say 'I love everybody' by caring for his family members. + +29. I prepared to avoid the cold air in the bedroom. I did this by sleeping under the pillow. First, I lay on the mattress. Second, I placed the pillow on my forehead. Third, I counted sheep until I fell asleep. In this way, I prepared to avoid the cold air in the bedroom by sleeping under the pillow. + +30. I prepared to say 'I love you'. I did this by opening the gumnut. First, I found the knife. Second, I lined up the knife along the central line of the gumnut. Third, I dissected the gumnut in half. In this way, I prepared to say 'I love you' by opening the gumnut. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Brain 4 of 4.txt b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Brain 4 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..1ec5622105a69bc1b790aaab82deed1811600bd1 --- /dev/null +++ b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Brain 4 of 4.txt @@ -0,0 +1,11 @@ +["Green, L 2021, Brain 4 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"MEDICINE +by Lucian Green +Brain 4 of 4 + +31.
I prepared to love myself forever. I did this by removing the gumtree seeds from the gumnut. First, I lifted the top half of the gumnut off the lower half of it. Second, I found the seeds. Third, I removed them. In this way, I prepared to love myself forever by removing the gumtree seeds from the gumnut. + +32. I prepared to say 'I love myself'. I did this by planting the gumnut seeds. First, I placed the gumnut seeds in an envelope. Second, I walked to the clearing. Third, I dug a plot and planted the gumnut seeds. In this way, I prepared to say 'I love myself' by planting the gumnut seeds. + + + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Brain II 1 of 4.txt b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Brain II 1 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..f235023887f207090e6931ad3b4ca6ca34108d34 --- /dev/null +++ b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Brain II 1 of 4.txt @@ -0,0 +1,29 @@ +["Green, L 2021, Brain II 1 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"MEDICINE +by Lucian Green +Brain II 1 of 4 + + + +1. The meditator's brain is supported with a second argument. + +1a. I prepared to go to the big loop. First, I looked at the gymkhana. Second, I sat on a pony. Third, I rocked on. In this way, I prepared to go to the big loop. + +2. I prepared to light up everything. First, I walked on the carpet. Second, I found the wall. Third, my finger found the plastic switch housing. In this way, I prepared to light up everything. + +3. I prepared to find my car in the street. First, I lit the candle in the cellar. Second, I found the correct table. Third, I sat down on the seat. In this way, I prepared to find my car in the street by turning the brain light on. + +4. I prepared to make a model in Medicine. First, I examined the frontal lobe. Second, I measured the temporal lobe. Third, I compared the sizes of the parietal lobes.
In this way, I prepared to make a model in Medicine by admiring the brain. + +5. I prepared to create the brain cells group. First, I created the cell body. Second, I attached the axon. Third, I massaged the dendrite. In this way, I prepared to create the brain cells group. + +6. I prepared to go swimming. I did this by finding the fish. First, I swam next to the coral. Second, I felt the shoal. Third, I discovered the school of fish. In this way, I prepared to go swimming by finding the fish. + +7. I prepared to net the seaweed. I did this by placing the net in the water. First, I held the net handle. Second, I dipped it in the water. Third, I scooped out the red seaweed. In this way, I prepared to net the seaweed by placing the net in the water. + +8. I prepared to leave the water. First, I found the edge of the water. Second, I hauled myself onto the rim. Third, I stood next to the diving board. In this way, I prepared to leave the water. I did this by copying... + +9. I prepared to earn the scholarship. First, I answered the first question correctly. Second, I prepared to answer the next question correctly. Third, I repeated this until I had answered all the questions correctly. In this way, I prepared to earn the scholarship. + +10. I prepared to study the topic. First, I wrote down the topic. Second, I read the index. Third, I found the page number. In this way, I prepared to study the topic. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Brain II 2 of 4.txt b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Brain II 2 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..099170058d5f92c39504be6b3c344d22d5e5e048 --- /dev/null +++ b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Brain II 2 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Brain II 2 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"MEDICINE +by Lucian Green +Brain II 2 of 4 + +11.
I prepared to present the answer. First, I found the paragraph. Second, I found the line. Third, I shouted the answer. In this way, I prepared to present the answer. + +12. I prepared to write the essay. First, I read the first word. Second, I read the second word. Third, I swapped them and thought of a new connective word. In this way, I prepared to write the essay. + +13. I prepared to find a job. First, I wrote down one essay topic. Second, I completed it. Third, I wrote the rest of the essays. In this way, I prepared to find a job. + +14. I prepared to find the treasure. First, I climbed the attic stairs. Second, I looked in the attic. Third, I found the map in the cabinet. In this way, I prepared to find the treasure. + +15. I prepared to follow the path. First, I plotted the departure point. Second, I plotted the destination. Third, I drew the shortest path from the first to the second of these. In this way, I prepared to follow the path. + +16. I prepared to go down the red hall. First, I opened each door. Second, I looked for the red carpet. Third, I walked on the red carpet. In this way, I prepared to go down the red hall. + +17. I prepared to receive the mark. First, I borrowed the textbook. Second, I read the lecture notes. Third, I completed the exercise. In this way, I prepared to receive the mark. + +18. I prepared to understand the question. First, I read the question number. Second, I read the question. Third, I read the model answer. In this way, I prepared to understand the question. + +19. I prepared to perform the oral presentation. First, I sat on a stool. Second, I looked you in the eye. Third, I clearly read the answer. In this way, I prepared to perform the oral presentation. + +20. I prepared to satisfy the criteria. First, I found the question text. Second, I found the answer. Third, I verified that it was real. In this way, I prepared to satisfy the criteria.
+ +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Brain II 3 of 4.txt b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Brain II 3 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..548611f87fbe13e8b80f8fd317196e179e3a93f7 --- /dev/null +++ b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Brain II 3 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Brain II 3 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"MEDICINE +by Lucian Green +Brain II 3 of 4 + +21. Powerman prepared to help the lady cross the road. First, he took one end of the rope. Second, he looped the left end over the right end and inserted it underneath it. Third, he looped the right end over the left end and inserted it underneath it. In this way, Powerman prepared to help the lady cross the road. + +22. He prepared to devise a protective scheme from challenges. First, he thought of the tools he would need. Second, he thought of their cost. Third, he calculated whether he had enough money for them. In this way, he prepared to devise a protective scheme from challenges. + +23. He prepared to test the way. First, he thought of the sandpaper he would need. Second, he thought of glue that he could swap for it. Third, he calculated whether he had the glue. In this way, he prepared to test the way. + +24. He prepared to think of the reward. First, he tested his personal best time in the sprint. Second, he entered the race. Third, he reached the finishing line. In this way, he prepared to think of the reward. + +25. The non-medical mathematician prepared to smile to the camera. First, she copied the file to the CD. Second, she wrote the patient's name on the CD. Third, she gave it to the patient. In this way, the non-medical mathematician prepared to smile to the camera. + +26. She prepared to design the perfect smile. First, she drew the centre line. 
Second, she drew the teeth on either side of the centre line. Third, she calculated the position that a particular tooth would have. In this way, she prepared to design the perfect smile. + +27. She prepared to check the orthodontic treatment. First, she wrote the desired position. Second, she wrote the initial position. Third, she wrote their difference. In this way, she prepared to check the orthodontic treatment. + +28. She prepared to make sure that a particular tooth would travel at a particular velocity. First, she wrote the position difference. Second, she wrote the velocity. Third, she wrote the time it would take, equal to the position difference divided by the velocity. In this way, she prepared to make sure that a particular tooth would travel at a particular velocity. + +29. I prepared to apply for the job. First, I said the competition's name. Second, I worked out my ranking. Third, I said my previous personal best. In this way, I prepared to apply for the job. + +30. I prepared to make a large protein computer. First, I designed the computer. Second, I made it small enough. Third, I put it into the brain cell. In this way, I prepared to make a large protein computer. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Brain II 4 of 4.txt b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Brain II 4 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..7bd9a946d1937a3876ea29fe76336cfa88bed85e --- /dev/null +++ b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Brain II 4 of 4.txt @@ -0,0 +1,11 @@ +["Green, L 2021, Brain II 4 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"MEDICINE +by Lucian Green +Brain II 4 of 4 + +31. I prepared to map the relevant genetics of the brain cell. First, I wrote about the machinery. Second, I wrote how the parts fitted together. Third, I gave some sample calculations. In this way, I prepared to map the relevant genetics of the brain cell.
+ +32. I prepared to map the chemicals to the thoughts. First, I wrote about the mapping. Second, I wrote how they combined. Third, I gave some sample thoughts. In this way, I prepared to map the chemicals to the thoughts. + + + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Doctor Sutra 2 of 4.txt b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Doctor Sutra 2 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..6f5a365587dbb736fbb0b42c41003c9000e8bb0d --- /dev/null +++ b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Doctor Sutra 2 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Doctor Sutra 2 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"MEDICINE +by Lucian Green +Doctor Sutra 2 of 4 + +11. I prepared to detail the breasonings. I did this by finding the coordinate of a point a certain distance away from another point. First, I lined up the ruler with the point. Second, I adjusted the ruler to point to a particular angle. Third, I drew a point a certain distance away from the first point. In this way, I prepared to detail the breasonings by finding the coordinate of a point a certain distance away from another point. + +12. I prepared a reason to be the breasonings. I did this by testing that there was no residue on the tissue. First, I flattened the tissue on the stand. Second, I made sure that there was no yoghurt above the level of the flattened tissue touching it. Third, I put the tissue in a bag. In this way, I prepared a reason to be the breasonings by testing that there was no residue on the tissue. + +13. I prepared to spend time with the people. I did this by labelling the people if they were positive. First, I asked a lisp. Second, I asked 'do you like me?' Third, I wrote if he liked me. In this way, I prepared to spend time with the people by labelling the people if they were positive. + +14. I prepared to like each person more if he liked me more. 
I did this by ranking people. First, I liked each person if he liked me. Second, I liked each person more if he liked me more. Third, I liked each person the most if he liked me the most. In this way, I prepared to like each person more if he liked me more by ranking people. + +15. I prepared to rank the people. I did this by ranking each person on a scale. First, I gave each person the ranking 3 if he liked me. Second, I gave each person the ranking 2 if he liked me more. Third, I gave each person the ranking 1 if he liked me the most. In this way, I prepared to rank the people by ranking each person on a scale. + +16. I prepared to show friendliness to the guys. I did this by writing quality thoughts for people. First, I wrote one thought. Second, I wrote another thought. Third, I wrote another thought. In this way, I prepared to show friendliness to the guys by writing quality thoughts for people. + +17. I prepared to fall into bed with the bedfellow. I did this by making sure there were no negative people around her. First, I looked at one person. Second, I looked at another person. Third, I moved on correctly. In this way, I prepared to fall into bed with the bedfellow by making sure there were no negative people around her. + +18. I prepared to fall onto the waterbed with a bedfellow. I did this by making sure there were no negative people going to or from him. First, I licked his noodles. Second, I licked his lolly. Third, I loved him to bits. In this way, I prepared to fall onto the waterbed with a bedfellow by making sure there were no negative people going to or from him. + +19. I prepared to go platinum. I did this by identifying the other people (making sure the main people bought it). First, I was someone. Second, I sold it. Third, I sold it to everyone. In this way, I prepared to go platinum by making sure the main people bought it. + +20. I prepared to identify a mistake. I did this by asking 'why are you so interested in me?' 
First, I went on to a new role. Second, I loved you. Third, I loved me. In this way, I prepared to identify a mistake by asking 'why are you so interested in me?' + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Doctor Sutra 3 of 4.txt b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Doctor Sutra 3 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..c4ca516e6afe065c6d6a61331d930b88d964a17b --- /dev/null +++ b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Doctor Sutra 3 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Doctor Sutra 3 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"MEDICINE +by Lucian Green +Doctor Sutra 3 of 4 + +21. I asked the teacher who a person was. I did this by looking for a dot on the viewscreen. First, I opened the teacher's knowledge simulation. Second, I found the person on the viewscreen. Third, I read the person's name under their face on the viewscreen. In this way, I asked the teacher who a person was by looking for a dot on the viewscreen. + +22. I prepared to analyse where a person went wrong. I did this by replaying her making a mistake. First, I listened to where she labelled her mistake. Second, I underlined it. Third, I told her why her mistake was incorrect. In this way, I prepared to analyse where a person went wrong by replaying her making a mistake. + +23. I prepared to examine why a person's answer was correct. I did this by listening to her correct her mistake. First, I listened to the question being asked. Second, I thought of the correct answer. Third, I explained why the answer answered the question. In this way, I prepared to examine why a person's answer was correct by listening to her correct her mistake. + +24. An acting agent prepared to think of an A based on another person's A. He did this by removing another object that was an extension from a slight inference. 
First, he thought of the inference 'I like it' between 'He did this by removing another object that was an extension from a slight inference' and 'What is your opinion'. Second, he thought of an object that was an extension from this slight inference, a raspberry. Third, he removed another object that was an extension from this slight inference, a pip. In this way, he prepared to think of an A based on another person's A by removing another object that was an extension from a slight inference. + +25. I prepared to have an intelligent conversation. I did this by conversing with my peers. First, I opened the football-shaped book. Second, I opened the book card. Third, I signed it for everyone in sight. In this way, I prepared to have an intelligent conversation by conversing with my peers. + +26. I prepared to love you. I did this by proposing to you. First, I found your address. Second, I rang you up and saw you many times. Third, I proposed marriage to you. In this way, I prepared to love you by proposing to you. + +27. I prepared to love a child. I did this by committing to paying for her. First, I calculated how much she would cost each year. Second, I added these amounts up. Third, I calculated what job I would need to earn this sum. In this way, I prepared to love a child by committing to paying for her. + +28. I prepared for the possibility of wanting you. I did this by exploring why I felt all right about you. First, I made sure that I was happy. Second, I ensured that I was secure. Third, I checked off that I was safe. In this way, I prepared for the possibility of wanting you by exploring why I felt all right about you. + +29. I prepared to write an H1 for a student. I did this by having that baby with my wife. First, I had that baby with my wife. Second, I poured the water out of the vase. Third, I poured fresh water back in. In this way, I prepared to write an H1 for a student by having that baby with my wife. + +30. 
I prepared to help the adult earn a job. I did this by making sure the baby was happy. First, I let something appear from stagecraft. Second, I pretended to think of 50 As to earn the job that appeared. Third, I tantalised the baby spiritually. In this way, I prepared to help the adult earn a job by making sure the baby was happy. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Doctor Sutra 4 of 4.txt b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Doctor Sutra 4 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..52d660e0cf14ed37cc9b30d01ef83f8b02bcb8ba --- /dev/null +++ b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Doctor Sutra 4 of 4.txt @@ -0,0 +1,10 @@ +["Green, L 2021, Doctor Sutra 4 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"MEDICINE +by Lucian Green +Doctor Sutra 4 of 4 + +31. I prepared to have spiritual prerequisite training for my job by meditating. I did this by switching off genes and precursors for diseases. First, I meditated on the day I was paid to work, produced a spiritual result (H1) or was famous, before I was paid to work, etc. Second, the meditation teacher helped me switch off diseases or, in fact, precursors for markers for (or things that might become) diseases. Third, I had a great day. In this way, I prepared to have spiritual prerequisite training for my job through meditation, by switching off genes and precursors for diseases. + +32. I prepared to cover all As in the world. I did this by switching on specific As in Theology. First, I wrote the Specific As in Theology A. Second, I breasoned it out. Third, I prepared to write the sentence A, so that I would have an A for each sentence that I wrote.
+ + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Fewer Mental Breakdowns (Schizophrenia) 1 of 4.txt b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Fewer Mental Breakdowns (Schizophrenia) 1 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..ff7650f5aee5262851172130b1a3641057d01046 --- /dev/null +++ b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Fewer Mental Breakdowns (Schizophrenia) 1 of 4.txt @@ -0,0 +1,39 @@ +["Green, L 2021, Fewer Mental Breakdowns (Schizophrenia) 1 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"MEDICINE +by Lucian Green +Fewer Mental Breakdowns (Schizophrenia) 1 of 4 + + + +1. Argument about ways to prevent mental breakdowns. + +Caution: Only take medication as directed by your doctor. + +Either: + +1a. study a University short course in self-pulse diagnosis and dot on 50 clozapines in a high quality way (using a breasoned counter: 1x1x0.5) before the day. + +b. or pray for a University short course in self-pulse diagnosis and dot on 50 clozapines in a high quality way (using a breasoned counter: 1x1x0.5) before the day. + +2. Then dot on 10 clozapines and meditate using the Green sutra to beat schizophrenic hallucination attacks and function normally. + +1aa. I prepared to enable full degrees. I did this by providing Biomedical Engineering service as breasonings currency. First, (bearing in mind to only take medication as directed by one's doctor) I breasoned out 10 80-breasoning Medicine As and dotted on 50 clozapines in a high quality way (using a breasoned counter: 1x1x0.5 cm) before the day, then dotted on 10 clozapines and optionally meditated using the Green sutra (repeated the word) to beat schizophrenic hallucination attacks and function normally. Second, I observed that these 10 80-breasoning Medicine As enabled me to prevent mental breakdowns. Third, I walked in the park. 
In this way, I prepared to enable full degrees by providing Biomedical Engineering service as breasonings currency. + +2. I prepared to connect them up (was active as a way of making correct choices with my brain to prevent mental breakdowns). I did this by providing Electrical Engineering and Computer Science service as breasonings currency. First, I liked you. Second, I liked the breasoner helper. Third, I liked being interested in you. In this way, I prepared to connect them up (was active as a way of making correct choices with my brain to prevent mental breakdowns) by providing Electrical Engineering and Computer Science service as breasonings currency. + +3. I prepared to regenerate my body each day. I did this by providing Faculty of Land and Food Resources service as breasonings currency. First, I found the food. Second, I ate it. Third, I passed the food-making practical. In this way, I prepared to regenerate my body each day by providing Faculty of Land and Food Resources service as breasonings currency. + +4. I prepared to eat the soft hail. I did this by providing Associate Degree in Agriculture service as breasonings currency. First, I developed the way to create the food. Second, I developed the food. Third, I ate the vegan burger. In this way, I prepared to eat the soft hail by providing Associate Degree in Agriculture service as breasonings currency. + +5. I prepared to come and eat. I did this by providing Associate Degree in Environmental Horticulture service as breasonings currency. First, I made hydroponics a little charmer. Second, I supported the sick bay. Third, I played the saxophone everywhere in sight. In this way, I prepared to come and eat by providing Associate Degree in Environmental Horticulture service as breasonings currency. + +6. I prepared to edit out the toxins (eat healthy food). I did this by providing Associate Degree in Forestry Management service as breasonings currency. First, I ranked the arguments from easiest to hardest. Second, I aired the forest. Third, I chewed the gum.
In this way, I prepared to edit out the toxins (eat healthy food) by providing Associate Degree in Forestry Management service as breasonings currency. + +7. I prepared to feed the children. I did this by providing Associate Degree in Wood Products Management service as breasonings currency. First, I housed the food (apples) in the wooden boxes. Second, I said that the apple was the most beautiful apple. Third, I rog-gurgitated (sic) it. In this way, I prepared to feed the children by providing Associate Degree in Wood Products Management service as breasonings currency. + +8. I prepared to see the religious figure from 3871 be taken care of. I did this by providing Bachelor of Agricultural Science service as breasonings currency. First, I planted the seed. Second, I added oat milk to it. Third, I watched the plant grow. In this way, I prepared to see the religious figure from 3871 be taken care of by providing Bachelor of Agricultural Science service as breasonings currency. + +9. I prepared to listen to myself clearly. I did this by providing Bachelor of Animal Science and Management service as breasonings currency. First, I let them go free. Second, I said there would be too many events in the future, so the main ones will be reported. Third, I held fast. In this way, I prepared to listen to myself clearly by providing Bachelor of Animal Science and Management service as breasonings currency. + +10. I prepared to help the excide (sic) planet to become the central planet in the Universe. I did this by providing Bachelor of Forest Science service as breasonings currency. First, I observed someone saying that he prevented it being that bad (ensured conditions would remain the same). Second, I looked forward to tens of thousands more years of life on Earth. Third, I stated that Earth would remain pristine. 
In this way, I prepared to help the excide (sic) planet to become the central planet in the Universe by providing Bachelor of Forest Science service as breasonings currency. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Fewer Mental Breakdowns (Schizophrenia) 2 of 4.txt b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Fewer Mental Breakdowns (Schizophrenia) 2 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..98ca6a5ad25b4f02019861613f6d81b261f08aa0 --- /dev/null +++ b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Fewer Mental Breakdowns (Schizophrenia) 2 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Fewer Mental Breakdowns (Schizophrenia) 2 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"MEDICINE +by Lucian Green +Fewer Mental Breakdowns (Schizophrenia) 2 of 4 + +11. I prepared to design a genetic treatment that prevented schizophrenia. I did this by providing Bachelor of Food Science service as breasonings currency. First, I designed the food. Second, I made clones with it. Third, I designed more and more of it. In this way, I prepared to design a genetic treatment that prevented schizophrenia by providing Bachelor of Food Science service as breasonings currency. + +12. I prepared to see Ben's children go well as well. I did this by providing Bachelor of Horticulture service as breasonings currency. First, I wrote home to Horticulture. Second, I annunciated (announced) the Horticulture Head's reply. Third, I announced that I loved you. In this way, I prepared to see Ben's children go well as well by providing Bachelor of Horticulture service as breasonings currency. + +13. I prepared to eat the desiccated coconut. I did this by providing Bachelor of Natural Resource Management service as breasonings currency. First, I found the rainforest. Second, I sat in it. Third, I breasoned out 10 'Rainforest' Medicine As (in staying dry-eyed and accepted as a gay). 
In this way, I prepared to eat the desiccated coconut by providing Bachelor of Natural Resource Management service as breasonings currency. + +14. I prepared to aggress in business. I did this by providing Bachelor of Rural Business service as breasonings currency. First, I wrote to the business. Second, I helped it remain afloat. Third, I enjoyed writing to you. In this way, I prepared to aggress in business by providing Bachelor of Rural Business service as breasonings currency. + +15. I prepared to recommend sex with a partner up to and in a marriage for those above a certain score, otherwise possible companionship (not necessarily marriage). I did this by providing Faculty of Law service as breasonings currency. First, I supported the fact that gay marriage would be included in the marriage act. Second, I established the sexuality counsellors. Third, I ensured that sexuality was not a cause of mental breakdowns. In this way, I prepared to recommend sex with a partner up to and in a marriage for those above a certain score, otherwise possible companionship (not necessarily marriage), by providing Faculty of Law service as breasonings currency. + +16. I prepared to perform a holistic diagnosis. I did this by providing Faculty of Medicine, Dentistry and Health Sciences service as breasonings currency (I recommended a Lucianic Medicine check-up). First, I detected the equivalent of hiccups of an organ from no meditation on it (treated with meditation, or philosophy). Second, I asked whether the patient had a headache (to treat with spiritual medicine). Third, I asked whether the organ had cancer symptoms (to treat with breasonings, ultimately a Medicine degree).
In this way, I prepared to perform a holistic diagnosis by providing Faculty of Medicine, Dentistry and Health Sciences service as breasonings currency (I recommended a Lucianic Medicine check-up). + +17. I prepared to ensure that I didn't go to hospital unless I was unconscious or I had a broken bone (so I stayed well with the 10 As). I did this by providing Nursing Science service as breasonings currency. First, I found the fluke. Second, I ate easily. Third, I breasoned out softly. In this way, I prepared to ensure that I didn't go to hospital unless I was unconscious or I had a broken bone (so I stayed well with the 10 As) by providing Nursing Science service as breasonings currency. + +18. I prepared to write the Medicine area of study. I did this by providing Medicine service as breasonings currency. First, I loved you. Second, I observed it. Third, I did nothing. In this way, I prepared to write the Medicine area of study by providing Medicine service as breasonings currency. + +19. I prepared to write the Medicine book. I did this by providing Dental science service as breasonings currency. First, I wrote the argument. Second, I wrote the algorithm. Third, I connected them together. In this way, I prepared to write the Medicine book by providing Dental science service as breasonings currency. + +20. I prepared to facilitate the classroom. I did this by providing Hygiene/therapy service as breasonings currency. First, I wanted each student. Second, I wanted the place to be a success. Third, I made sure that it would be a galactic success. In this way, I prepared to facilitate the classroom by providing Hygiene/therapy service as breasonings currency. 
+ +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Fewer Mental Breakdowns (Schizophrenia) 3 of 4.txt b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Fewer Mental Breakdowns (Schizophrenia) 3 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..36b16bb271e461e45c5971f3b03b9f38c29aff77 --- /dev/null +++ b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Fewer Mental Breakdowns (Schizophrenia) 3 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Fewer Mental Breakdowns (Schizophrenia) 3 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"MEDICINE +by Lucian Green +Fewer Mental Breakdowns (Schizophrenia) 3 of 4 + +21. I prepared to dial out N. I did this by providing Physiotherapy service as breasonings currency. First, I bricabraced. Second, I trialled the cornupcopias. Third, I hedgehogged my way to hesterity. In this way, I prepared to dial out N by providing Physiotherapy service as breasonings currency. + +22. I prepared to help oo. I did this by providing Faculty of Music service as breasonings currency. First, I dialled in A. Second, I helped A. Third, I wrote A a letter. In this way, I prepared to help oo by providing Faculty of Music service as breasonings currency. + +23. I prepared to prevent deigning. I did this by providing Faculty of Science service as breasonings currency. First, I designed plectrums. Second, I delighted you. Third, I played back 'you'. In this way, I prepared to prevent deigning by providing Faculty of Science service as breasonings currency. + +24. I prepared to give and tuck. I did this by providing Anatomy and cell biology service as breasonings currency. First, I Bahama-ed lightly. Second, I bolly-giggled. Third, I plasticated you. In this way, I prepared to give and tuck by providing Anatomy and cell biology service as breasonings currency. + +25. I prepared to eat the pippin. 
I did this by providing Biochemistry and molecular biology service as breasonings currency. First, I liked tip and tuck. Second, I helped parliamentarians. Third, I gave something away. In this way, I prepared to eat the pippin by providing Biochemistry and molecular biology service as breasonings currency. + +26. I prepared to be biologically sane. I did this by providing Biology service as breasonings currency. First, I was a school captain. Second, I identified the propaganda. Third, I surprised myself. In this way, I prepared to be biologically sane by providing Biology service as breasonings currency. + +27. I prepared to dive in. I did this by providing Biomedical science service as breasonings currency. First, I liked it. Second, I liked living. Third, I helped with Nietzsche. In this way, I prepared to dive in by providing Biomedical science service as breasonings currency. + +28. I prepared to record the cockerel's crowing time. I did this by providing Biotechnology service as breasonings currency. First, I found the polkinghorns. Second, I agreed with them. Third, I verified this. In this way, I prepared to record the cockerel's crowing time by providing Biotechnology service as breasonings currency. + +29. I prepared to move forward at the card table. I did this by providing Botany service as breasonings currency. First, I found the top of the pack. Second, I helped myself to it. Third, I placed the card on the table. In this way, I prepared to move forward at the card table by providing Botany service as breasonings currency. + +30. I prepared to like you. I did this by providing Chemistry service as breasonings currency. First, I loved Nietzsche. Second, I wrote about the world leaders. Third, I helped Nana and Papa to write Medicine down together. In this way, I prepared to like you by providing Chemistry service as breasonings currency. 
+ +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Fewer Mental Breakdowns (Schizophrenia) 4 of 4.txt b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Fewer Mental Breakdowns (Schizophrenia) 4 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..9714c7cb1b55d6eb93e177f85ced3d102a994b09 --- /dev/null +++ b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Fewer Mental Breakdowns (Schizophrenia) 4 of 4.txt @@ -0,0 +1,31 @@ +["Green, L 2021, Fewer Mental Breakdowns (Schizophrenia) 4 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"MEDICINE +by Lucian Green +Fewer Mental Breakdowns (Schizophrenia) 4 of 4 + +31. I prepared to engender respect with mutants. I did this by providing Civil Engineering in Science service as breasonings currency. First, I endorsed Nietzsche. Second, I read Nietzsche's paper. Third, I loved you. In this way, I prepared to engender respect with mutants by providing Civil Engineering in Science service as breasonings currency. + +32. I prepared to combine the high quality Lucianic Medicine VET qualification with University, giving it higher propensity. I did this by providing Computer science service as breasonings currency. First, I loved myself. Second, I read the lyrics of Lucian Green's Future Earth Album. Third, I wrote computer programs for them. In this way, I prepared to combine the high quality Lucianic Medicine VET qualification with University, giving it higher propensity by providing Computer science service as breasonings currency. + +33. I prepared to jangle the popcorn. I did this by providing Earth sciences service as breasonings currency. First, I encouraged you, meaning you had fewer mental breakdowns on the Earth. Second, I identified the need to see and did see a psychiatrist until the end. Third, I watched television with my friends. 
In this way, I prepared to jangle the popcorn by providing Earth sciences service as breasonings currency. + +34. I prepared to entertain the roundtable discussion. I did this by providing Environmental science service as breasonings currency. First, I drank the peppermint tea. Second, I performed the task in Putonghua. Third, I delegated a task to Jezebel. In this way, I prepared to entertain the roundtable discussion by providing Environmental science service as breasonings currency. + +35. I prepared to eat in with the Queen. I did this by providing Genetics service as breasonings currency. First, I encouraged my former classmate who was now a doctor. Second, I discipiated (sic) him (I was protected when he identified himself). Third, I polished my shoelaces. In this way, I prepared to eat in with the Queen by providing Genetics service as breasonings currency. + +36. I prepared to complete the work I knew I had to do. I did this by providing Geography and environmental studies service as breasonings currency. First, I made friends with a nurse, worked out how to study Medicine and studied it. Second, I studied it with discipline. Third, I provided feedback that the pedagogical requirements had been clear. In this way, I prepared to complete the work I knew I had to do by providing Geography and environmental studies service as breasonings currency. + +37. I prepared to submit my research to a journal. I did this by providing Geomatics in science service as breasonings currency. First, I didn't drink as usual to prevent a mental breakdown. Second, I drank water. Third, I conducted research about how much water to drink. In this way, I prepared to submit my research to a journal by providing Geomatics in science service as breasonings currency. + +38. I prepared to dine on the quince vegan beef. I did this by providing History and philosophy of science service as breasonings currency. First, the doctor taught me medicine. 
Second, I asked the question, 'How many glasses of water should one drink per day?' Third, I listened to the answer, 'You should drink 8 glasses of water per day'. In this way, I prepared to dine on the quince vegan beef by providing History and philosophy of science service as breasonings currency. + +39. I prepared to draw a diagram of the breasoning. I did this by providing Information systems service as breasonings currency. First, I liked breasoning it out. Second, I developed the breasonings in a dynamic group. Third, I breasoned them out. In this way, I prepared to draw a diagram of the breasoning by providing Information systems service as breasonings currency. + +40. I prepared to eat the healthy pistachio ice cream. I did this by providing Logic service as breasonings currency. First, I wrote down the participles. Second, I drew the helicopter. Third, I wrote down the pinkwhistles. In this way, I prepared to eat the healthy pistachio ice cream by providing Logic service as breasonings currency. + +41. I prepared to be happy with women. I did this by providing Mathematics and statistics service as breasonings currency. First, I rotated slightly. Second, I polished the apple. Third, I ate it. In this way, I prepared to be happy with women by providing Mathematics and statistics service as breasonings currency. + +42. I prepared to use pedagogy. I did this by providing Microbiology and immunology service as breasonings currency. First, I helped the milkmaid. Second, I gave her my change, building a bond between us as community members. Third, I chose her to provide milk instead of wilting away. In this way, I prepared to use pedagogy by providing Microbiology and immunology service as breasonings currency. 
+ + + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Get in Touch with God about Breasonings Details to see High Quality Imagery and Earn H1 1 of 4.txt b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Get in Touch with God about Breasonings Details to see High Quality Imagery and Earn H1 1 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..1e105ac4aa7c40058cd61f1b56c696a2e3654b7e --- /dev/null +++ b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Get in Touch with God about Breasonings Details to see High Quality Imagery and Earn H1 1 of 4.txt @@ -0,0 +1,37 @@ +["Green, L 2021, Get in Touch with God about Breasonings Details to see High Quality Imagery and Earn H1 1 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"MEDICINE +by Lucian Green +Get in Touch with God about Breasonings Details to see High Quality Imagery and Earn H1 1 of 4 + + + +1. Breason out the Breasonings Details to see high quality imagery and earn A-grade. + +Before doing the following, dot it on twice (once for you and once for God). + +Realise that as soon as this is done, the God/Maharishi character is helped enough, not too much, which dots it on for him in a high quality way. + +Explain the Pedagogical ways of thinking or Breasonings Details to God by breasoning out each Breasonings Detail, before breasoning out an H1 during an unbroken sequence of days during which meditation to that God is performed on each day. If there is a break in the meditation days, the Breasonings Details need to be re-explained to God. + +Repeat the sutra 'Upasana' for 40 minutes per day for one month after paying to unify high quality imagery with your sense of sight and learn the mantra and sutra. NB. If you have already done this, omit this step. Repeat the mantra 'Lucian' and sutra 'Green' each for twenty minutes twice per day on other days. + +1a. I prepared to use the icicle. 
I did this by writing the Icicle song argument. First, I wrote about sex. Second, I wrote about us. Third, I wrote about them. In this way, I prepared to use the icicle by writing the Icicle song argument. + +2. I prepared to dine with the Whiteleys. I did this by writing the Indian Rhapsody song argument. First, I wrote about India. Second, I wrote about rhapsody. Third, I wrote about the vehemence (persistence). In this way, I prepared to dine with the Whiteleys by writing the Indian Rhapsody song argument. + +3. I prepared to reject that I wanted John Adams as my manager, and I wanted him as my friend. I did this by writing the John Adams is Good song argument. First, I wrote about my friend John Adams. Second, I wrote with it. Third, I wrote with a biddle (sic) bit of it. In this way, I prepared to reject that I wanted John Adams as my manager, and I wanted him as my friend by writing the John Adams is Good song argument. + +4. I prepared to design the void. I did this by writing the Like a Police Dog song argument. First, I found the police. Second, I found the dog. Third, I found what it was like. In this way, I prepared to design the void by writing the Like a Police Dog song argument. + +5. I prepared to ask why there was a single white female there. I did this by writing the Long Time Coming song argument. First, I found the child. Second, I helped it up. Third, I cooked it dinner. In this way, I prepared to ask why there was a single white female there by writing the Long Time Coming song argument. + +6. I prepared to celebrate Nietzsche. I did this by writing the Love 2 (A Round) song argument. First, I found Love 2. Second, I sang it as a round. Third, I sang with you, Jill. In this way, I prepared to celebrate Nietzsche by writing the Love 2 (A Round) song argument. + +7. I prepared to watch the duckling models and prevent pasty bottom with lukewarm water and a cotton bud. I did this by writing the Love song argument. First, I enjoyed the galaxies. 
Second, I enjoyed religion (philosophy). Third, I wrote about you. In this way, I prepared to watch the duckling models and prevent pasty bottom with lukewarm water and a cotton bud by writing the Love song argument. + +8. I prepared to synthesise the man's thoughts together. I did this by writing the Loveliana song argument. First, I found Iran. Second, I sang about her. Third, I sang about you too. In this way, I prepared to synthesise the man's thoughts together by writing the Loveliana song argument. + +9. I prepared to design the luckies. I did this by writing the Luckier and Luckier song argument. First, I found the lucky puppy. Second, I found the luckier puppy. Third, I found that the first puppy was even luckier. In this way, I prepared to design the luckies by writing the Luckier and Luckier song argument. + +10. I prepared to help the child with school work. I did this by writing the Lullaby song argument. First, I sang the child to sleep. Second, I closed the door. Third, I enunciated well that it was asleep. In this way, I prepared to help the child with school work by writing the Lullaby song argument. 
+ +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Get in Touch with God about Breasonings Details to see High Quality Imagery and Earn H1 2 of 4.txt b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Get in Touch with God about Breasonings Details to see High Quality Imagery and Earn H1 2 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..5811c4a1b4655af09765c65dd8d4dbd774b75e82 --- /dev/null +++ b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Get in Touch with God about Breasonings Details to see High Quality Imagery and Earn H1 2 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Get in Touch with God about Breasonings Details to see High Quality Imagery and Earn H1 2 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"MEDICINE +by Lucian Green +Get in Touch with God about Breasonings Details to see High Quality Imagery and Earn H1 2 of 4 + +11. I prepared to do something more with it. I did this by writing the Lulu Iglesias song argument. First, I wrote about Lulu. Second, I stated that he was the king of pop. Third, I held the 100 charts topper fast. In this way, I prepared to do something more with it by writing the Lulu Iglesias song argument. + +12. I prepared to identify the three departments. I did this by writing the Magic Wand song argument. First, I wrote the film script. Second, I wrote the stage script. Third, I wrote the short story. In this way, I prepared to identify the three departments by writing the Magic Wand song argument. + +13. I prepared to eat vegetables. I did this by writing the Monster song argument. First, I identified the monster. Second, I loved yetis. Third, I loved the barbarians. In this way, I prepared to eat vegetables by writing the Monster song argument. + +14. I prepared to recognise the famous head. I did this by writing the My Computer Program song argument. First, I stated that it was not God, rather it was a song. 
Second, I helped you jettilise (sic) it. Third, I wrote the computer program. In this way, I prepared to recognise the famous head by writing the My Computer Program song argument. + +15. I prepared to state that the single point one was concentrating on was high quality. I did this by writing the Nationalism song argument. First, I wrote about Nationalism for all it was worth. Second, I looked at the card. Third, I wrote on them both. In this way, I prepared to state that the single point one was concentrating on was high quality by writing the Nationalism song argument. + +16. I prepared to be shown high quality imagery. I did this by writing the Part of Room song argument. First, I wrote about the large country in the north. Second, I asked who it was. Third, I helped them up. In this way, I prepared to be shown high quality imagery by writing the Part of Room song argument. + +17. I prepared to state that the vegetable farmer was a botanist. I did this by writing the Penis song argument. First, I chose higher quality thoughts. Second, I elevated the rod. Third, I observed it gyrate. In this way, I prepared to state that the vegetable farmer was a botanist by writing the Penis song argument. + +18. I prepared to state that I liked you. I did this by writing the Pink Palace song argument. First, I wrote about Pink. Second, I wrote about the palace. Third, I sent the telegram about them to the king. In this way, I prepared to state that I liked you by writing the Pink Palace song argument. + +19. I prepared to disseminate the found out (worked out) text carefully. I did this by writing the Pixies song argument. First, I noted the head. Second, I noted the lines. Third, I noticed him write on them. In this way, I prepared to disseminate the found out (worked out) text carefully by writing the Pixies song argument. + +20. I prepared to ask a specific question. I did this by writing the Pray song argument (Ask song argument). First, I wrote about asking people a question. 
Second, I preferred rag and scrawn (sic). Third, I made a lot of money. In this way, I prepared to ask a specific question by writing the Pray song argument (Ask song argument). + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Get in Touch with God about Breasonings Details to see High Quality Imagery and Earn H1 3 of 4.txt b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Get in Touch with God about Breasonings Details to see High Quality Imagery and Earn H1 3 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..e4b7f86b5404e12c23b51b70af6240d732e146e6 --- /dev/null +++ b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Get in Touch with God about Breasonings Details to see High Quality Imagery and Earn H1 3 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Get in Touch with God about Breasonings Details to see High Quality Imagery and Earn H1 3 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"MEDICINE +by Lucian Green +Get in Touch with God about Breasonings Details to see High Quality Imagery and Earn H1 3 of 4 + +21. I prepared to observe pre-University students write A-grade pedagogical arguments. I did this by writing the Primary School Song 1 song argument. First, I tasted the contention. Second, I asked whether multiple marriages would be legal. Third, I examined the jumping horse. In this way, I prepared to observe pre-University students write A-grade pedagogical arguments by writing the Primary School Song 1 song argument. + +22. I prepared to write about how life is good. I did this by writing the Primary School Song 2 song argument. First, I wrote an original argument. Second, I wrote a 10 A major. Third, I wrote a 50 A area of study. In this way, I prepared to write about how life is good by writing the Primary School Song 2 song argument. + +23. I prepared to identify and stop male embryos from developing. I did this by writing the Puppy song argument. 
First, I disagreed with puppy farms. Second, I advertised the puppies to the community. Third, I found good homes for them. In this way, I prepared to identify and stop male embryos from developing by writing the Puppy song argument. + +24. I prepared to decide about meditation (medicine). I did this by writing the Rebreasonings song argument. First, I posed the question, 'What is touching what?' Second, I inserted the closed tube. Third, I wrote about mahoganies. In this way, I prepared to decide about meditation (medicine) by writing the Rebreasonings song argument. + +25. I prepared to . I did this by writing the Red Indian song argument. First, I found the young red indian. Second, I found that he was the object of tribal worship (was a being who was there). Third, I was there too. In this way, . + +26. I prepared to enable the birth of the farrow. I did this by writing the Resolution song argument. First, I wrote about resolutions. Second, I wrote about the children. Third, I wrote about the cosmologue (writer). In this way, I prepared to enable the birth of the farrow by writing the Resolution song argument. + +27. I prepared to listen to the music. I did this by writing the Robot song argument. First, I found the robot. Second, I ate the sandwich like it. Third, I ate the food that it had taken time to prepare. In this way, I prepared to listen to the music by writing the Robot song argument. + +28. I prepared to earn A. I did this by writing the Rocket song argument. First, I prevented the dog from sleeping under the car. Second, I made the rocket cake. Third, I listened to the form of medicine. In this way, I prepared to earn A by writing the Rocket song argument. + +29. I prepared to like everyone being divine in Lucianic Meditation (Lucianic Philosophy). I did this by writing the Room song argument. First, I charged $50 for Pedagogy. Second, I charged $50 for Medicine. Third, I charged $50 for Meditation. 
In this way, I prepared to like everyone being divine in Lucianic Meditation (Lucianic Philosophy) by writing the Room song argument. + +30. I prepared to like the prime minister. I did this by writing the Sam song argument. First, I liked the possibility of humour. Second, I banned most guns. Third, I liked equality. In this way, I prepared to like the prime minister by writing the Sam song argument. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Get in Touch with God about Breasonings Details to see High Quality Imagery and Earn H1 4 of 4.txt b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Get in Touch with God about Breasonings Details to see High Quality Imagery and Earn H1 4 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..06ba0867fe18f314c780796e16215c810a65a357 --- /dev/null +++ b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Get in Touch with God about Breasonings Details to see High Quality Imagery and Earn H1 4 of 4.txt @@ -0,0 +1,31 @@ +["Green, L 2021, Get in Touch with God about Breasonings Details to see High Quality Imagery and Earn H1 4 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"MEDICINE +by Lucian Green +Get in Touch with God about Breasonings Details to see High Quality Imagery and Earn H1 4 of 4 + +31. I prepared to dislike (like) different objects in arguments. I did this by writing the Sex song argument. First, I wrote about sex. Second, I wrote about you. Third, I wrote about myself. In this way, I prepared to dislike (like) different objects in arguments by writing the Sex song argument. + +32. I prepared to apologise to the professor when he was unable to write a review for my Pedagogy book, which I planned to complete with the One's help. I did this by writing the Sorry, Pro (Professor) song argument. First, I examined the professor's septuagenarian wort factor. Second, I wrote on hibisci (sic). 
Third, I wrote, 'All's well that ends well'. In this way, I prepared to apologise to the professor when he was unable to write a review for my Pedagogy book, which I planned to complete with the One's help by writing the Sorry, Pro (Professor) song argument. + +33. I prepared to design alphawheat (sic). I did this by writing the Sorry, The Institution song argument. First, I listened to the comment, 'I dislike (like) them for not letting (for letting) you close to them again, the people you listened with'. Second, I listened to the music solfege melody suggestion, 'drdftr't'. Third, I wrote the sentence, 'I dislike (like) you, the institution'. In this way, I prepared to design alphawheat (sic) by writing the Sorry, The Institution song argument. + +34. I prepared to record the sham (other real) marriage. I did this by writing the Space song argument. First, I designed Space. Second, I helped wheats. Third, I helped another person. In this way, I prepared to record the sham marriage by writing the Space song argument. + +35. I prepared to cry (be happy) for my duckling stuck in (freed from) the food dispenser. I did this by writing the Sponge song argument. First, I found the pad. Second, I found Nietzsche. Third, I liked Lucian. In this way, I prepared to cry (be happy) for my duckling stuck in (freed from) the food dispenser by writing the Sponge song argument. + +36. I prepared to state that that is the philosophy of the Star Fish song. I did this by writing the Star Fish song argument. First, I stated that the world is a seahorse. Second, I lay on the bed. Third, I connected the line to it. In this way, I prepared to state that that is the philosophy of the Star Fish song by writing the Star Fish song argument. + +37. I prepared to eat out of jelly cups. I did this by writing the Superhero song argument. First, I found the hero. Second, I described vegan jelly. Third, I ate the vegan jelly. 
In this way, I prepared to eat out of jelly cups by writing the Superhero song argument. + +38. I prepared to state that meditation (popology) was indicated by a single bell. I did this by writing the The Anti-Capital Punishment song argument. First, I tried it out. Second, I argued meditators should die in spiritual silence. Third, I argued against serious criminals being capitally punished (where I argued that they should be sent to jail instead). In this way, I prepared to state that meditation (popology) was indicated by a single bell by writing the The Anti-Capital Punishment song argument. + +39. I prepared to suggest the drink was given a dark colour to promote dark matter and energy. I did this by writing the The Anti-Dark Drink song argument. First, I wrote about having a happy weight. Second, I wrote about having a correct weight in natural law. Third, I wrote about having a healthy weight. In this way, I prepared to suggest the drink was given a dark colour to promote dark matter and energy by writing the The Anti-Dark Drink song argument. + +40. I prepared to pen laws into existence. I did this by writing the The Anti-Nuclear Bomb song argument. First, I avoided softening it. Second, I rejected L, the famous warring lecturer (supported unity). Third, I hated war (liked peace). In this way, I prepared to pen laws into existence by writing the The Anti-Nuclear Bomb song argument. + +41. I prepared to verify everything. I did this by writing the The Anti-Paedophile song argument. First, I realised paedophiles were a part of life. Second, I helped one. Third, I loved you. In this way, I prepared to verify everything by writing the The Anti-Paedophile song argument. + +42. I prepared to get in touch with God (master). I did this by writing the The Anti-War song argument. First, I rhymed with rye. Second, I liked attention to detail. Third, I listened to the chime. 
In this way, I prepared to get in touch with God (master) by writing the The Anti-War song argument. + + + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Go to Bed at 9-30 PM 1 of 4.txt b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Go to Bed at 9-30 PM 1 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..473122a6573139b751f13cf112decadb3588fdc6 --- /dev/null +++ b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Go to Bed at 9-30 PM 1 of 4.txt @@ -0,0 +1,29 @@ +["Green, L 2021, Go to Bed at 9-30 PM 1 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"MEDICINE +by Lucian Green +Go to Bed at 9:30 PM 1 of 4 + + + +1. Go to bed at 9:30 PM to rise afresh. + +1a. I prepared to decide on how it should be written. I did this by writing research with the necessary skills. First, I learnt the skills. Second, I prepared the writing temporium (sic). Third, I wrote the research. In this way, I prepared to decide on how it should be written by writing research with the necessary skills. + +2. I prepared to harvest the signposts. I did this by collating information. First, I wrote you a letter. Second, I collated the information in it. Third, I signposted the information. In this way, I prepared to harvest the signposts by collating information. + +3. I prepared to pass with flying colours. I did this by preparing for exams and other assessment. First, I prepared for the exams. Second, I prepared for the tests. Third, I prepared for the optical exam. In this way, I prepared to pass with flying colours by preparing for exams and other assessment. + +4. I prepared to verify that I was relaxed. I did this by managing stress. First, I wrote down the timetable information. Second, I described it using sutras (philosophies). Third, I covered it with medicine. In this way, I prepared to verify that I was relaxed by managing stress. + +5. I prepared to find the definition of 'better'. 
I did this by stating that students who attended class performed better. First, I found that students who attended class performed better. Second, I wrote this down. Third, I found this out. In this way, I prepared to find the definition of 'better' by stating that students who attended class performed better. + +6. I prepared to edge onto mitochondria (do the homework). I did this by visiting the Learning Management System at the start of semester. First, I found the Learning part. Second, I found the Management part. Third, I found my sysutils (sic). In this way, I prepared to edge onto mitochondria (do the homework) by visiting the Learning Management System at the start of semester. + +7. I prepared to eat out of lyzeme (sic) shells. I did this by finding a balance between study, family, work and social life. First, I found a balance between study and family. Second, I found a balance between study and work. Third, I found a balance between study and social life. In this way, I prepared to eat out of lyzeme (sic) shells by finding a balance between study, family, work and social life. + +8. I prepared to state that essent in Heidegger, from das Seiende, {n} meant that which has being. I did this by maintaining motivation. First, I wrote down the effervent's (sic) heading. Second, I wrote down 'you'. Third, I reduced essential enthusiasm to essent (sic). In this way, I prepared to state that essent in Heidegger, from das Seiende, {n} meant that which has being by maintaining motivation. + +9. I prepared to eat out of cups. I did this by stating that my self was my source of motivation. First, I wrote that I liked myself. Second, I wrote that I liked you. Third, I wrote that I liked everybody else. In this way, I prepared to eat out of cups by stating that my self was my source of motivation. + +10. I prepared to design Yefuititi (sic). 
I did this by stating that when 20% of students fail the majority of their first year subjects, they should see a staff member to make a plan to get back on track. First, I wrote about the subject. Second, I wrote about the staff member. Third, I wrote about it. In this way, I prepared to design Yefuititi (sic) by stating that when 20% of students fail the majority of their first year subjects, they should see a staff member to make a plan to get back on track. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Go to Bed at 9-30 PM 2 of 4.txt b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Go to Bed at 9-30 PM 2 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..8538d16db286d1fb8a6d205e1d856379442811ea --- /dev/null +++ b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Go to Bed at 9-30 PM 2 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Go to Bed at 9-30 PM 2 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"MEDICINE +by Lucian Green +Go to Bed at 9:30 PM 2 of 4 + +11. I prepared to help with Pedagogy. I did this by stating that 5% drop out of University, but may find out about Pedagogy. First, I found they may find Pedagogy. Second, I found they may find Medicine. Third, I found they may find Meditation (Popology). In this way, I prepared to help with Pedagogy by stating that 5% drop out of University, but may find out about Pedagogy. + +12. I prepared to continue with my studies. I did this by identifying factors that reduce (increase) motivation. First, I listed missing (attending) class. Second, I chose the wrong (right) subject. Third, I completed work too late (on time). In this way, I prepared to continue with my studies by identifying factors that reduce (increase) motivation. + +13. I prepared to continue with my next unit. I did this by losing (maintaining) contact with units. First, I learnt pedagogy. Second, I maintained contact with units. 
Third, I maintained contact with lecturers. In this way, I prepared to continue with my next unit by losing (maintaining) contact with units. + +14. I prepared to ask for a subject to be offered. I did this by identifying that my course choice was not right (was right). First, I wrote about the subjects. Second, I wrote about why I liked them. Third, I connected them with other subjects. In this way, I prepared to ask for a subject to be offered by identifying that my course choice was not right (was right). + +15. I prepared to complete the work by the due date. I did this by identifying that there was too much (enough) work due (to do). First, I found the papers. Second, I completed the first one. Third, I completed each of them. In this way, I prepared to complete the work by the due date by identifying that there was too much (enough) work due (to do). + +16. I prepared to move forward. I did this by thinking of strategies to avoid losing motivation. First, I completed each week's work. Second, I saw a careers counsellor when the subject appeared wrong. Third, I regularly checked when work was due and worked towards due dates. In this way, I prepared to move forward by thinking of strategies to avoid losing motivation. + +17. I prepared to ask the lecturer so that I understood each answer. I did this by catching up each week. First, I started work. Second, I finished work. Third, I verified that I understood the work. In this way, I prepared to ask the lecturer so that I understood each answer by catching up each week. + +18. I prepared to choose subjects that I was interested in. I did this by asking for course advice when subjects caused doubt. First, I found the course. Second, I found the subjects. Third, I completed them. In this way, I prepared to choose subjects that I was interested in by asking for course advice when subjects caused doubt. + +19. I prepared to pass the subject. I did this by using a semester plan and gradually completing assignments.
First, I found books on the assessment topic. Second, I read them. Third, I wrote the assignment. In this way, I prepared to pass the subject by using a semester plan and gradually completing assignments. + +20. I prepared to verify the content of the essay. I did this by planning. First, I wrote the exposition. Second, I wrote the critique. Third, I wrote the bibliography. In this way, I prepared to verify the content of the essay by planning. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Go to Bed at 9-30 PM 3 of 4.txt b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Go to Bed at 9-30 PM 3 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..c9662bc8fcb088257d609ecfdf192dc4fb90c6d0 --- /dev/null +++ b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Go to Bed at 9-30 PM 3 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Go to Bed at 9-30 PM 3 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"MEDICINE +by Lucian Green +Go to Bed at 9:30 PM 3 of 4 + +21. I prepared to produce at a rate of 33.15 chapters per essay. I did this by identifying my goals in 5 years, 1 year, and 4 weeks. First, I wrote 21.66 essays in 5 years. Second, I wrote 4.33 essays in 1 year. Third, I wrote 0.33 essays in 4 weeks. In this way, I prepared to produce at a rate of 33.15 chapters per essay by identifying my goals in 5 years, 1 year, and 4 weeks. + +22. I prepared to complete the discuses assignments. I did this by completing assignment questions with the skills from the academic skills unit workshop on assignments. First, I worked out how the skills were working. Second, I discovered more and more. Third, I knew about discuses. In this way, I prepared to complete the discuses assignments by completing assignment questions with the skills from the academic skills unit workshop on assignments. + +23. I prepared to remain happy.
I did this by combating feeling stressed with a personal counsellor. First, I found the path. Second, I walked on it. Third, I stopped at the destination. In this way, I prepared to remain happy by combating feeling stressed with a personal counsellor. + +24. I prepared to be helped. I did this by managing my time. First, I wrote about managing. Second, I wrote about the time. Third, I wrote about managing my time. In this way, I prepared to be helped by managing my time. + +25. I prepared to do nothing. I did this by writing that 12.5 points=12.5 hours. First, I wrote about 12.5 points=12.5 hours. Second, I wrote the work heading. Third, I performed the work. In this way, I prepared to do nothing by writing that 12.5 points=12.5 hours. + +26. I prepared to continue despite failing subjects. I did this by stating that the psychologist said grit was the main factor in finishing students. First, I stated that I failed an agreeing exposition. Second, I failed by being famous. Third, I failed by not knowing pedagogy. In this way, I prepared to continue despite failing subjects by stating that the psychologist said grit was the main factor in finishing students. + +27. I prepared to choose a single pathway. I did this by staying true to my future, making it real. First, I decided on my career. Second, I chose courses supporting it. Third, I chose subjects supporting it. In this way, I prepared to choose a single pathway by staying true to my future, making it real. + +28. I prepared to pass the major. I did this by persevering through failure. First, I meditated (philosophised) to work out to write a positive exposition. Second, I used pedagogy when I was famous. Third, I used pedagogy to perform well. In this way, I prepared to pass the major by persevering through failure. + +29. I prepared to write sunfeltnesses. I did this by anticipating sunrise. First, I liked you. Second, I liked myself. Third, I liked the sun rising.
In this way, I prepared to write sunfeltnesses by anticipating sunrise. + +30. I prepared to be all right by medicine. I did this by turning off waking thoughts with the nut and bolt and quantum box/prayer. First, I found the capital punishment thought. Second, I switched it off. Third, I was big. In this way, I prepared to be all right by medicine by turning off waking thoughts with the nut and bolt and quantum box/prayer. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Go to Bed at 9-30 PM 4 of 4.txt b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Go to Bed at 9-30 PM 4 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..ce67d5c4ccb2c9e9914580356c94050120f3930c --- /dev/null +++ b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Go to Bed at 9-30 PM 4 of 4.txt @@ -0,0 +1,33 @@ +["Green, L 2021, Go to Bed at 9-30 PM 4 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"MEDICINE +by Lucian Green +Go to Bed at 9:30 PM 4 of 4 + +31. I prepared to eat vegan cream. I did this by eating the strawberry. First, I chopped it. Second, I placed it in a bowl. Third, I ate it. In this way, I prepared to eat vegan cream by eating the strawberry. + +32. I prepared to have sweet dreams. I did this by going to sleep. First, I adjusted the temperature. Second, I lay in bed. Third, I went to sleep. In this way, I prepared to have sweet dreams by going to sleep. + +33. I prepared to run the program. I did this by beaming the computer science content to myself. First, I liked the computer science content. Second, I beamed it to myself. Third, I verified that it had beamed successfully. In this way, I prepared to run the program by beaming the computer science content to myself. + +34. I prepared to like all the best things. I did this by listening to pop. First, I switched on the radio. Second, I listened to pop. Third, I stated it was pro-equality. 
In this way, I prepared to like all the best things by listening to pop. + +35. I prepared to give the group cones. I did this by following God (the mistress). First, I observed God (the facilitator). Second, I followed her. Third, I gave her a cone. In this way, I prepared to give the group cones by following God (the mistress). + +36. I prepared to exhibit my work. I did this by earning 100% about computers in future classrooms. First, I earned 100% about computers in the second future classroom. Second, I helped. Third, I moved on. In this way, I prepared to exhibit my work by earning 100% about computers in future classrooms. + +37. I prepared to eat mitochondria. I did this by mapping the archipelagos. First, I mapped the cities. Second, I mapped the roads. Third, I mapped the buildings. In this way, I prepared to eat mitochondria by mapping the archipelagos. + +38. I prepared to write about university. I did this by becoming a writer. First, I found the writer's bureau. Second, I mixed with the writers. Third, I became a writer. In this way, I prepared to write about university by becoming a writer. + +39. I prepared to have wakeful dreams. I did this by programming myself to go to sleep. First, I found the button. Second, I pressed it. Third, I went to sleep. In this way, I prepared to have wakeful dreams by programming myself to go to sleep. + +40. I prepared to do something more. I did this by writing to heaven (earth). First, I stated that I was on earth. Second, I stated that I knew someone else on earth. Third, I wrote to her. In this way, I prepared to do something more by writing to heaven (earth). + +41. I prepared to sell the note. I did this by writing a note. First, I attended the opera. Second, I wrote a note about it. Third, I wrote this down again. In this way, I prepared to sell the note by writing a note. + +42. I prepared to live well on earth. I did this by stating that I was in heaven (earth). 
First, I disagreed that I should be knocked down in life. Second, I knew that it was found out. Third, I knew that it was correct. In this way, I prepared to live well on earth by stating that I was in heaven (earth). + + + + + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Grains, Nuts, Fruits and Vegetables 1 of 4.txt b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Grains, Nuts, Fruits and Vegetables 1 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..39b30b45ac5eb337490e7cf189a67028eee081fe --- /dev/null +++ b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Grains, Nuts, Fruits and Vegetables 1 of 4.txt @@ -0,0 +1,29 @@ +["Green, L 2021, Grains, Nuts, Fruits and Vegetables 1 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"MEDICINE +by Lucian Green +Grains, Nuts, Fruits and Vegetables 1 of 4 + + + +1. Eat unprocessed food. + +1a. I prepared to be filmed. I did this by wearing the flower. First, I chose the pink flower. Second, I attached it to a clip. Third, I clipped it to my hair. In this way, I prepared to be filmed by wearing the flower. + +2. I prepared to paint over the artwork. I did this by holding onto Yoko Onoy. First, I spread the colours of the spectrum. Second, I did some painting. Third, I viewed the artwork. In this way, I prepared to paint over the artwork by holding onto Yoko Onoy. + +3. I prepared to write philosophies for inventions. I did this by laughing with yogis. First, I found the baby. Second, I laughed with it. Third, I loved you. In this way, I prepared to write philosophies for inventions by laughing with yogis. + +4. I prepared to go for a date. I did this by wanting you and me. First, I wanted you. Second, I wanted me. Third, I wanted us together. In this way, I prepared to go for a date by wanting you and me. + +5. I prepared to eat the grains with vegan salami and salt dishes. I did this by eating the grains.
First, I peppered them. Second, I granulated them. Third, I ate the grains. In this way, I prepared to eat the grains with vegan salami and salt dishes by eating the grains. + +6. I prepared to complete my vegan diet. I did this by eating the nut. First, I ate the walnut. Second, I ate the smoked almond. Third, I ate the salted cashew. In this way, I prepared to complete my vegan diet by eating the nut. + +7. I prepared to observe the birds eat the regurgitated fruit. I did this by eating the fruit. First, I ate the apple. Second, I chewed it carefully. Third, I finished it lovingly. In this way, I prepared to observe the birds eat the regurgitated fruit by eating the fruit. + +8. I prepared to convert Fe3+ to Fe2+ with vinegar in salad dressing. I did this by eating the vegetables. First, I tossed them. Second, I turned them. Third, I added salad dressing. In this way, I prepared to convert Fe3+ to Fe2+ with vinegar in salad dressing by eating the vegetables. + +9. I prepared to eat the raw salt. I did this by feeding my pet spider. First, I found the spider. Second, I crushed the salt crystal. Third, I fed it to the spider. In this way, I prepared to eat the raw salt by feeding my pet spider. + +10. I prepared to adorn the architrave. I did this by picking the jonquil. First, I walked through the garden. Second, I saw the jonquil. Third, I picked it. In this way, I prepared to adorn the architrave by picking the jonquil.
+ +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Grains, Nuts, Fruits and Vegetables 2 of 4.txt b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Grains, Nuts, Fruits and Vegetables 2 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..810b40034e9dd3b18cb9a3dfcb7929f00c26aa42 --- /dev/null +++ b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Grains, Nuts, Fruits and Vegetables 2 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Grains, Nuts, Fruits and Vegetables 2 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"MEDICINE +by Lucian Green +Grains, Nuts, Fruits and Vegetables 2 of 4 + +11. I prepared to walk outside. I did this by wearing the scarf around my head as protection from the sun. First, I wrapped the scarf around the back of my head. Second, I wrapped it around the front of my neck. Third, I tucked it in. In this way, I prepared to walk outside by wearing the scarf around my head as protection from the sun. + +12. I prepared to send the money to Cambodia. I did this by organising a play. First, I studied theatre studies. Second, I wrote the play. Third, I rehearsed and performed the play. In this way, I prepared to send the money to Cambodia by organising a play. + +13. I prepared to take my place in the universe. I did this by observing the universe. First, I took a photograph of the universe. Second, I examined it. Third, I stated that I was in a certain location in the universe. In this way, I prepared to take my place in the universe by observing the universe. + +14. I prepared to increase science. I did this by helping the scientists. First, I determined the technique. Second, I researched that it worked. Third, I wrote the algorithm for the technique. In this way, I prepared to increase science by helping the scientists. + +15. I prepared to confirm the health indicator by eating well. I did this by writing about predictions in literature. 
First, I predicted that taking what I needed worked. Second, I took what I needed. Third, I researched that this worked. In this way, I prepared to confirm the health indicator by eating well by writing about predictions in literature. + +16. I prepared to walk along the street. I did this by designing the rocket for the rocket powered man. First, I held the cardboard roll. Second, I walked along. Third, I stopped walking when I had reached my destination. In this way, I prepared to walk along the street by designing the rocket for the rocket powered man. + +17. I prepared to entreat (urge) the troll (person) to be positive. I did this by disseminating the troll (agreed with positivity). First, I agreed with the first positive statement. Second, I agreed with the second positive statement. Third, I agreed with the third positive statement. In this way, I prepared to entreat (urge) the troll (person) to be positive by disseminating the troll (agreed with positivity). + +18. I prepared to hold you close. I did this by loving you. First, I held the heart. Second, I skewered you with it. Third, I loved you 'till the day you died and a day. In this way, I prepared to hold you close by loving you. + +19. I prepared to be on the Internet's side. I did this by encouraging the Internet. First, I found the first Internet site and encouraged it. Second, I prepared to repeat this for the next Internet site. Third, I repeated this for each of the listed Internet sites. In this way, I prepared to be on the Internet's side by encouraging the Internet. + +20. I prepared to be a leader. I did this by stating that I am the person around here. First, I bought the computer to work without paper. Second, I worked on it. Third, I worked with similar people. In this way, I prepared to be a leader by stating that I am the person around here. 
+ +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Grains, Nuts, Fruits and Vegetables 3 of 4.txt b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Grains, Nuts, Fruits and Vegetables 3 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..e53e9e2f61617882723ea1920ff274f7621ebc47 --- /dev/null +++ b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Grains, Nuts, Fruits and Vegetables 3 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Grains, Nuts, Fruits and Vegetables 3 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"MEDICINE +by Lucian Green +Grains, Nuts, Fruits and Vegetables 3 of 4 + +21. I prepared to lead the pack. I did this by stating that I am king of the hill. First, I found the hill. Second, I stated that I was king of it. Third, I found followers. In this way, I prepared to lead the pack by stating that I am king of the hill. + +22. I prepared to be happy. I did this by liking myself. First, I liked myself as a male character. Second, I liked myself as a female character. Third, I liked myself as a transsexual character. In this way, I prepared to be happy by liking myself. + +23. I prepared to examine the master's life. I did this by identifying God (the master). First, I looked in the book of Lords (leaders). Second, I found the right name. Third, I looked at the person's face. In this way, I prepared to examine the master's life by identifying God (the master). + +24. I prepared to eat the carrot. I did this by setting the dome's weather. First, I set it to 25 degrees. Second, I set it to rain in the agricultural dome. Third, I stopped the rain after 20 minutes. In this way, I prepared to eat the carrot by setting the dome's weather. + +25. I prepared to be the fruity professor. I did this by eating the grains, nuts, fruits and vegetables in a salad bowl. First, I ate the grain of rice. Second, I ate the peanut. Third, I ate the watermelon. 
In this way, I prepared to be the fruity professor by eating the grains, nuts, fruits and vegetables in a salad bowl. + +26. I prepared to eat the bowl of fruit, in this case oranges. I did this by verifying that the grains, nuts, fruits and vegetables were fresh. First, I checked that it was before the use by year. Second, I checked that it was before the use by month. Third, I checked that it was before the use by day. In this way, I prepared to eat the bowl of fruit, in this case oranges, by verifying that the grains, nuts, fruits and vegetables were fresh. + +27. I prepared to eat stewed apple and almond. I did this by stating that the king agreed with me. First, I entered the court. Second, I asked whether the king agreed with me. Third, I listened to the king agree with me. In this way, I prepared to eat stewed apple and almond by stating that the king agreed with me. + +28. I prepared to eat the vegan pudding. I did this by stating that the seen-as version was the vegan pudding. First, I added the green zest. Second, I added the yellow zest. Third, I added the red zest. In this way, I prepared to eat the vegan pudding by stating that the seen-as version was the vegan pudding. + +29. I prepared to add peanut sauce. I did this by making the vegan casserole. First, I steamed the rice. Second, I steamed the carrot. Third, I steamed the beans. In this way, I prepared to add peanut sauce by making the vegan casserole. + +30. I prepared to use the character again. I did this by counting the character's appearances. First, I counted the character's appearance in the hallway. Second, I counted the character's appearance in the attic. Third, I counted the character's appearance in the luna (sic) driveway. In this way, I prepared to use the character again by counting the character's appearances.
+ +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Head of State Head Ache Prevention 1 of 4.txt b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Head of State Head Ache Prevention 1 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..e5be23dd8f0eec40b2896382771e1ea605e7aeae --- /dev/null +++ b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Head of State Head Ache Prevention 1 of 4.txt @@ -0,0 +1,33 @@ +["Green, L 2021, Head of State Head Ache Prevention 1 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"MEDICINE +by Lucian Green +Head of State Head Ache Prevention 1 of 4 + + + +1. Prevent headaches by thinking clearly of Head of State. + +Dot on, then wave a flag symbolising being polite by speaking after the last comment, then say sorry to the Head of State before any mistakes to avoid a headache. + +See also Hours Prayer. + +1a. I prepared to help others prevent headaches (have relaxed head and neck muscles). I did this by switching off the headache (relaxing my head and neck muscles) at head of state level. First, I found the head of state level. Second, I switched off the headache. Third, I worked in comfort. In this way, I prepared to help others prevent headaches (have relaxed head and neck muscles) by switching off the headache (relaxing my head and neck muscles) at head of state level. + +2. I prepared to think of my day holistically. I did this by meeting the head of state. First, I arranged to meet the head of state. Second, I met him. Third, I shook his hand. In this way, I prepared to think of my day holistically by meeting the head of state. + +3. I prepared to receive the synthesis of the work of the members of the office of the head of state. I did this by stating that the members of the office of the head of state processed the A. First, I contacted the members of the office of the head of state. Second, I gave them a copy of the A. 
Third, I observed them process the A. In this way, I prepared to receive the synthesis of the work of the members of the office of the head of state by stating that the members of the office of the head of state processed the A. + +4. I prepared to recline and reflect on the head of state headache prevention method. I did this by sitting on the throne. First, I faced the throne. Second, I placed the union on the throne. Third, I sat on the throne. In this way, I prepared to recline and reflect on the head of state headache prevention method by sitting on the throne. + +5. I prepared to ensure that the head was carefully taken care of. I did this by loving the head. First, I cleaned the head. Second, I made sure that it was warm enough. Third, I made sure that it was light enough. In this way, I prepared to ensure that the head was carefully taken care of by loving the head. + +6. I prepared to plan to avoid a headache. I did this by talking in a high quality way. First, I talked about your head. Second, I talked about its comfort. Third, I talked about its stillness. In this way, I prepared to plan to avoid a headache by talking in a high quality way. + +7. I prepared to walk in the middle of the path. I did this by listening to the instructions to achieve positive functionalism. First, I started walking on my path. Second, I continued walking on my path. Third, I finished walking on my path. In this way, I prepared to walk in the middle of the path by listening to the instructions to achieve positive functionalism. + +8. I prepared to give myself an all-over workout. I did this by performing the cardio-vascular workout. First, I went for a run. Second, I cleared my mind. Third, I relaxed my body. In this way, I prepared to give myself an all-over workout by performing the cardio-vascular workout. + +9. I prepared to help the other be productive. I did this by instructing the person to prevent a headache.
First, I asked the person to describe his headache (head and neck). Second, I instructed him to think of head of state. Third, I asked him to think of how the head of state could relax him. In this way, I prepared to help the other be productive by instructing the person to prevent a headache. + +10. I prepared to have the philosophies edited. I did this by designing the philosophies. First, I designed the topic. Second, I designed the comment. Third, I wrote them down. In this way, I prepared to have the philosophies edited by designing the philosophies. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Head of State Head Ache Prevention 2 of 4.txt b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Head of State Head Ache Prevention 2 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..81035f697de27b357ea0f7529f224a2a6d25acdc --- /dev/null +++ b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Head of State Head Ache Prevention 2 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Head of State Head Ache Prevention 2 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"MEDICINE +by Lucian Green +Head of State Head Ache Prevention 2 of 4 + +11. I prepared to design philosophies for them. I did this by writing that pedagogy was there from the start for my students. First, I relaxed the lateral-thinking students. Second, I wrote to them. Third, I helped them. In this way, I prepared to design philosophies for them by writing that pedagogy was there from the start for my students. + +12. I prepared to state that I helped everyone. I did this by stating that Mr. Q had a comfortable head. First, I wrote that he was lying on the bed. Second, I wrote that he was relaxing. Third, I wrote that he had a comfortable head. In this way, I prepared to state that I helped everyone by stating that Mr. Q had a comfortable head. + +13. I prepared to help others to relax.
I did this by identifying the 50 As from the currant bun headache protection. First, I was given the 50 As. Second, I was given confirmation of this by head of state. Third, I relaxed. In this way, I prepared to help others to relax by identifying the 50 As from the currant bun headache protection. + +14. I prepared to go and be the best. I did this by stating that I am happy. First, I paused. Second, I considered my state of mind. Third, I stated that I am happy. In this way, I prepared to go and be the best by stating that I am happy. + +15. I prepared to write that 100% understanding was stronger than disagreement. I did this by liking the jubilacious (sic). First, I liked it. Second, I queried whether I liked it (I confirmed it). Third, I wrote, 'That's good'. In this way, I prepared to write that 100% understanding was stronger than disagreement by liking the jubilacious (sic). + +16. I prepared to attract students. I did this by preparing the course materials and placing them online. First, I prepared the course materials. Second, I placed them online. Third, I verified that they had all been placed online. In this way, I prepared to attract students by preparing the course materials and placing them online. + +17. I prepared to increase enrollments. I did this by advertising the online course materials. First, I determined the target demographic. Second, I advertised to them. Third, I observed the number of enrolled students. In this way, I prepared to increase enrollments by advertising the online course materials. + +18. I prepared to collect feedback from the face-to-face students. I did this by having face-to-face training given by a graduate. First, I trained the student. Second, I observed him graduate. Third, I observed him train the students. In this way, I prepared to collect feedback from the face-to-face students by having face-to-face training given by a graduate. + +19. I prepared to write an original breasoning chapter.
I did this by interpreting another student's breasonings. First, I read the first breasoning. Second, I read the second breasoning. Third, I connected the two breasonings. In this way, I prepared to write an original breasoning chapter by interpreting another student's breasonings. + +20. Fame Journal *1.1. I prepared to note the event. I did this by determining that Rudd didn't earn the role of United Nations Secretary General (went on with another task). First, I read that he didn't earn the role of United Nations Secretary General in the newspaper. Second, I read what he went on with. Third, I argued this. In this way, I prepared to note the event by determining that Rudd didn't earn the role of United Nations Secretary General (went on with another task). + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Head of State Head Ache Prevention 3 of 4.txt b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Head of State Head Ache Prevention 3 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..c87c910bb9c9dd04cf195fe2af846b3723b0c2ab --- /dev/null +++ b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Head of State Head Ache Prevention 3 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Head of State Head Ache Prevention 3 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"MEDICINE +by Lucian Green +Head of State Head Ache Prevention 3 of 4 + +21. I prepared to write the content on the disc. I did this by liking discs. First, I examined my spine. Second, I examined how it was shaped. Third, I wrote about 'Sex'. In this way, I prepared to write the content on the disc by liking discs. + +22. I prepared to give the person 250 breasonings per Economics task after she had breasoned out 250 breasonings per assignment during her Economics major. I did this by liking Verpaccio. First, I observed the person walking in the domain. Second, I held her hand. 
Third, I helped her over the gutter (helped her breason out the economics major that I had written). In this way, I prepared to give the person 250 breasonings per Economics task after she had breasoned out 250 breasonings per assignment during her Economics major by liking Verpaccio. + +23. I prepared to state that 80% was the standard, where both 190 breasonings and 250 breasonings deserved 100% and the 250 breasoning meditation A gave the 250 breasonings per meditation thought. I did this by observing the Ves-car. First, I wrote and was given 10 250 breasoning arguments per assignment in Honours. Second, I wrote and was given 32 250 breasoning arguments per assignment in Masters Coursework. Third, I wrote and was given 160 250 breasoning arguments per assignment in Masters or PhD theses. In this way, I prepared to state that 80% was the standard, where both 190 breasonings and 250 breasonings deserved 100% and the 250 breasoning meditation A gave the 250 breasonings per meditation thought by observing the Ves-car. + +24. I prepared to collect 15 comments connected as a structured argument per 80 breasoning A in my work handed in to a confirmed supervisor. I did this by holding you close. First, I prayed for 1 250 breasoning argument per assignment in Honours (earned 80%). Second, I prayed for 2 250 breasoning arguments per assignment in Masters by Coursework (earned 80%). Third, I prayed for 10 250 breasoning arguments per assignment in Masters and PhD Theses (earned 80%). In this way, I prepared to collect 15 comments connected as a structured argument per 80 breasoning A in my work handed in to a confirmed supervisor by holding you close. + +25. I prepared to write to ensure that the head of state enjoyed the text. I did this by stating that the head of state agreed with me. First, I wrote 1 250 breasoning medicine argument per 16 250 breasoning arguments in the work to make sure 16 250 breasoning arguments worked in the reader's life. 
Second, I wrote 1 250 breasoning education argument per 16 250 breasoning arguments in the work to ensure that the student enjoyed the text. Third, I wrote 1 250 breasoning meditation argument per 16 250 breasoning arguments in the work to ensure that the student experienced the text in a high quality way. In this way, I prepared to write to ensure that the head of state enjoyed the text by stating that the head of state agreed with me. + +26. I prepared to count sales of my work. I did this by observing that the head of state agreed with the people. First, I wrote the literature review on similar texts to the one I wrote. Second, I didn't submit it with the work, to highlight my own work. Third, I published my work. In this way, I prepared to count sales of my work by observing that the head of state agreed with the people. + +27. I prepared to make the transition from being in business to having a publisher. I did this by agreeing with the aims of the head of state. First, I wrote A to be polite when making a sale during my postgraduate Commerce qualification. Second, I made sales with a publisher. Third, I confirmed sales each day with Aigs (As). In this way, I prepared to make the transition from being in business to having a publisher by agreeing with the aims of the head of state. + +28. I prepared to edit the other's intellectual property out from my songs and philosophy. I did this by agreeing with the head of state. First, I observed the publisher give 10 breasonings to the sale. Second, I observed this being based (culpated, sic) on the last thing that had been thought of. Third, I stated that this was always enough. In this way, I prepared to edit the other's intellectual property out from my songs and philosophy by agreeing with the head of state. + +29. I prepared to help each person feel comfortable. I did this by helping the people. First, I found the person. Second, I helped her. Third, I repeated this until I had helped each person. 
In this way, I prepared to help each person feel comfortable by helping the people. + +30. I prepared to present the recommendations to maintain head and neck comfort. I did this by stating that the people thought of a reason to prevent a headache (maintain head and neck comfort) with head of state. First, I recommended massage. Second, I recommended preventing bumping into other people (staying in a single lane). Third, I recorded these. In this way, I prepared to present the recommendations to maintain head and neck comfort by stating that the people thought of a reason to prevent a headache (maintain head and neck comfort) with head of state. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Head of State Head Ache Prevention 4 of 4.txt b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Head of State Head Ache Prevention 4 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..45bb1ac47e15a4d90f2551bd1d08146d6c85e044 --- /dev/null +++ b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Head of State Head Ache Prevention 4 of 4.txt @@ -0,0 +1,33 @@ +["Green, L 2021, Head of State Head Ache Prevention 4 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"MEDICINE +by Lucian Green +Head of State Head Ache Prevention 4 of 4 + +31. I prepared to record the names of the people who I had helped. I did this by keeping my distance from the pitrisaur. First, I found the pitrisaur's tail. Second, I stood 10 metres away from it. Third, I helped you to do the same. In this way, I prepared to record the names of the people who I had helped by keeping my distance from the pitrisaur. + +32. I prepared to observe the people like my lecturer friend and meditation student was a doctor and friend. I did this by observing the hansard. First, I found the hansard. Second, I observed him listen to the politician. Third, I observed him take notes. 
In this way, I prepared to observe the people like my lecturer friend and meditation student was a doctor and friend by observing the hansard. + +33. I prepared to charge money for my services. I did this by using the head of state headache protection. First, I found the head of state. Second, I followed his instruction to prevent bumping into people (stayed on my path). Third, I enjoyed no headache (a comfortable head). In this way, I prepared to charge money for my services by using the head of state headache protection. + +34. I prepared to delect the vegan sausage. I did this by lighting the vegan barbeque. First, I found the vegan food. Second, I lit the barbeque. Third, I cooked the vegan sausage. In this way, I prepared to delect the vegan sausage by lighting the vegan barbeque. + +35. I prepared to be given many As. I did this by dining on the rice paper-wrapped tofu. First, I cooked the tofu. Second, I wet the rice paper. Third, I wrapped the tofu in the rice paper. In this way, I prepared to be given many As by dining on the rice paper-wrapped tofu. + +36. I prepared to sit in the light room. I did this by putting on the light clothes. First, I put on the light pants. Second, I put on the light shirt. Third, I put on the light hat. In this way, I prepared to sit in the light room by putting on the light clothes. + +37. I prepared to relax the head and neck muscles. I did this by drawing the illustration of the walnut. First, I sketched the shell, a thought in the brain, like the walnut. Second, I sketched the pecan. Third, I sketched the whole walnut. In this way, I prepared to relax the head and neck muscles by drawing the illustration of the walnut. + +38. I prepared to annul the head ache by treating the head. I did this by rewarding the board. First, I determined that the board had performed above standard. Second, I rewarded the first member of the board. Third, I repeated this until I had rewarded all the members of the board.
In this way, I prepared to annul the head ache by treating the head by rewarding the board. + +39. I prepared to finish each thought. I did this by closing the investigation. First, I determined that the investigation had been solved. Second, I verified this. Third, I closed the investigation. In this way, I prepared to finish each thought by closing the investigation. + +40. I prepared to observe the person have relaxed head and neck muscles. I did this by stating that the head of state verified the method for preventing headaches. First, I selected the grape. Second, I placed it carefully in my mouth. Third, I ate it. In this way, I prepared to observe the person have relaxed head and neck muscles by stating that the head of state verified the method for preventing headaches. + +41. I prepared to relax. I did this by stating that the head ache prevention was the breasoning, the relaxed metal U. First, I lay on the ground. Second, I breathed deeply. Third, I recounted positive thoughts. In this way, I prepared to relax by stating that the head ache prevention was the breasoning, the relaxed metal U. + +42. I prepared to be healthy. I did this by eating only enough pastry puffs. First, I counted how many pastry puffs I could eat. Second, I cooked the pastry puffs. Third, I ate them. In this way, I prepared to be healthy by eating only enough pastry puffs. + + + + + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Heart 1 of 4.txt b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Heart 1 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..a7d1145567c973d98a0c7c288d594ffcdaf74c88 --- /dev/null +++ b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Heart 1 of 4.txt @@ -0,0 +1,29 @@ +["Green, L 2021, Heart 1 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"MEDICINE +by Lucian Green +Heart 1 of 4 + + + +1. The meditator's heart is better supported. + +1a. 
I prepared to stay on the middle path. I did this by not eating too much or too little rigatoni. First, I put the rigatoni in the measuring cup. Second, I verified that the amount of rigatoni was 100 grams. Third, I ate the rigatoni. In this way, I prepared to stay on the middle path by not eating too much or too little rigatoni. + +2. I prepared to sit in the stalls. I did this by slowing and stopping at the line. First, I verified that I was travelling at walking speed. Second, I slowed my speed towards 0 metres per second. Third, I stopped moving my legs when I was travelling at 0 metres per second. In this way, I prepared to sit in the stalls by slowing and stopping at the line. + +3. I prepared to eat a good diet. I did this by cleaning the heart model. First, I lifted the heart model. Second, I placed a sponge on it. Third, I wiped the heart model with the sponge. In this way, I prepared to eat a good diet by cleaning the heart model. + +4. I prepared to make a tessellating (repeating) pattern. I did this by cutting the pasta shapes. First, I cut out the pasta shape in the first row and first column. Second, I cut out the rest of the pasta shapes in the row. Third, I cut out the rest of the pasta shapes in the rest of the rows. In this way, I prepared to make a tessellating pattern by cutting the pasta shapes. + +5. I prepared to increase the blood circulation in my chest. I did this by pummeling my chest. First, I lifted my hand to my chest. Second, I made a fist. Third, I pummeled my chest with the edge of my fist with my index finger and thumb along the edge. In this way, I prepared to increase the blood circulation in my chest by pummeling my chest. + +6. I prepared to relax my tongue. I did this by loosening my tongue. First, I poked my tongue out. Second, I let it hang out. Third, I raised and lowered it. In this way, I prepared to relax my tongue by loosening my tongue. + +7. I prepared to move my jaw. I did this by dropping my jaw. 
First, I lifted my head upright. Second, I felt my jaw become heavy. Third, I let my mouth fall open. In this way, I prepared to move my jaw by dropping my jaw. + +8. I prepared to arrive at the Pixel Lounge. I did this by massaging my neck. First, I lay on the ground. Second, I massaged the side muscles of my neck. Third, I massaged the front muscles of my neck. In this way, I prepared to arrive at the Pixel Lounge by massaging my neck. + +9. I prepared to decrease my metabolism. I did this by slowing down my heart. First, I felt my pulse. Second, I relaxed to make my heart rate decrease a small amount. Third, I checked my pulse to verify that my heart rate had decreased. In this way, I prepared to decrease my metabolism by slowing down my heart. + +10. I prepared to inhale a breath of air. I did this by expanding my chest. First, I contracted my diaphragm. Second, I allowed my lungs to fill with air. Third, I stopped when I had inspired enough air. In this way, I prepared to inhale a breath of air by expanding my chest. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Heart 2 of 4.txt b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Heart 2 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..2550fa41113e08ccebebdb286459060753564c26 --- /dev/null +++ b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Heart 2 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Heart 2 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"MEDICINE +by Lucian Green +Heart 2 of 4 + +11. I prepared to exhale lungfuls of air. I did this by contracting my torso. First, I relaxed my diaphragm muscle. Second, I pushed air out of my lungs. Third, I allowed my expiration to finish. In this way, I prepared to exhale lungfuls of air by contracting my torso. + +12. I prepared to play chess. I did this by stretching my neck muscles. First, I moved my head so that it was touching each shoulder.
Second, I turned my head from side to side. Third, I moved my head so that my chin was touching my chest, then I moved my head back to look at the stars. In this way, I prepared to play chess by stretching my neck muscles. + +13. I prepared to pronounce the mantra 'Lucian'. I did this by moving my vocal folds together. First, I took a breath. Second, I said the mantra 'Lucian'. Third, I continued breathing normally. In this way, I prepared to pronounce the mantra 'Lucian' by moving my vocal folds together. + +14. I prepared to feed an impaired person. I did this by feeding her liquefied spinach. First, I attached the food tube to the tube attached to the student's stomach. Second, I turned the tap on. Third, the student 'ate' the food. In this way, I prepared to feed an impaired person by feeding her liquefied spinach. + +15. I prepared to decrease my heart rate. I did this by resting my head, in other words by removing no longer used data structures. First, I assigned values to a tuple of variables, then passed this tuple to a predicate (function). Second, I accessed the variables from this tuple. Third, I removed the tuple after it had been used. In this way, I prepared to decrease my heart rate by resting my head, in other words by removing no longer used data structures. + +16. I prepared to imagine what I was thinking. I did this by soothing my temples. First, I lifted my fingers. Second, I touched each temple with one of my fingers. Third, I moved my fingertips in circles. In this way, I prepared to imagine what I was thinking by soothing my temples. + +17. I prepared to go running. I did this by flexing the ball of my foot. First, I stood up. Second, I leant against a wall. Third, I performed a calf stretch. In this way, I prepared to go running by flexing the ball of my foot. + +18. I prepared to stretch my calves. I did this by moving my ankle back and forth. First, I lifted my foot in the air. Second, I moved my ankle forward. 
Third, I moved my ankle back. In this way, I prepared to stretch my calves by moving my ankle back and forth. + +19. I prepared to kick a soccer ball. I did this by angling my ankle side to side. First, I lifted my knee at a right angle with my leg hanging down from it. Second, I moved my ankle left. Third, I moved my ankle right. In this way, I prepared to kick a soccer ball by angling my ankle side to side. + +20. I prepared to point to part of an invisible dial as part of a game. I did this by twisting my ankle in a circle. First, I sat with my legs pointing straight forward. Second, I started rotating my right ankle clockwise. Third, I stopped doing this when I had rotated my ankle through one revolution of a circle. In this way, I prepared to point to part of an invisible dial as part of a game by twisting my ankle in a circle. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Heart 3 of 4.txt b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Heart 3 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..d01946f9bcde0070d790e86772c3deec8ac9036a --- /dev/null +++ b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Heart 3 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Heart 3 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"MEDICINE +by Lucian Green +Heart 3 of 4 + +21. I prepared to make supports for my feet. I did this by tickling the sole of my foot. First, I sat with my feet close together. Second, I moved the feather to the tip of my foot. Third, I tickled my foot by moving the feather down the length of my foot. In this way, I prepared to make supports for my feet by tickling the sole of my foot. + +22. I prepared to make shoes. I did this by touching the top of my foot. First, I took my socks off. Second, I touched the skin on the top of my foot. Third, I quickly tapped the top of my foot. In this way, I prepared to make shoes by touching the top of my foot. 
+ +23. I prepared to cut sandals. I did this by smoothing the side of my foot. First, I touched the tip of my right big toe with my left index finger. Second, I also touched the tip of my right big toe with my right index finger. Third, I slid my fingers down the sides of my foot to the back of my foot at the same time. In this way, I prepared to cut sandals by smoothing the side of my foot. + +24. I prepared to point my bottom in the air with my hands and feet on the mat. I did this by arching my lower back muscles. First, I leant on my hands and knees. Second, I stepped my feet backwards one at a time. Third, I pointed my bottom in the air. In this way, I prepared to point my bottom in the air with my hands and feet on the mat by arching my lower back muscles. + +25. I prepared to display my abdomen. I did this by crunching my stomach muscles. First, I made a first frame. Second, I made a second frame using the first frame as an 'onion skin', lining up the art with the previous frame. Third, I flicked through the animation. In this way, I prepared to display my abdomen by crunching my stomach muscles. + +26. I prepared to stretch my deltoid muscles. I did this by scratching my back. First, I put my hand by my side. Second, I put my hand behind my back. Third, I used a back scratcher to scratch my back. In this way, I prepared to stretch my deltoid muscles by scratching my back. + +27. I prepared to exercise my biceps. I did this by lifting a weight. First, I squatted. Second, I grasped the weight. Third, I stood up, without bending over. In this way, I prepared to exercise my biceps by lifting a weight. + +28. I prepared to stretch my back lat muscles. I did this by performing chin-ups. First, I jumped up to hold and hang from the bar. Second, I started to pull myself up with my arms. Third, I lifted my chin above the bar. In this way, I prepared to stretch my back lat muscles by performing chin-ups. + +29.
I prepared to stretch the muscles at the back of my lower legs. I did this by lunging. First, I put one foot in front of the other in a wide stance. Second, I put my front knee above my front ankle. Third, I moved my hips forward. In this way, I prepared to stretch the muscles at the back of my lower legs by lunging. + +30. I prepared to stretch the muscles at the back of the thighs. I did this by squatting. First, I stood with my feet slightly more than shoulder width apart. Second, I put my hands on my belt. Third, I bent my knees with my feet flat on the ground to squat. In this way, I prepared to stretch the muscles at the back of the thighs by squatting. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Help ensure successful conception and prevent miscarriage 1 of 4.txt b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Help ensure successful conception and prevent miscarriage 1 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..3171f625abcfc0e7c6548ada3b178953b5aee659 --- /dev/null +++ b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Help ensure successful conception and prevent miscarriage 1 of 4.txt @@ -0,0 +1,35 @@ +["Green, L 2021, Help ensure successful conception and prevent miscarriage 1 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"MEDICINE +by Lucian Green +Help ensure successful conception and prevent miscarriage 1 of 4 + + + +Helps conception by suggesting to breason out the Anarchy Quiz in conjunction with meditation. + +1a. I prepared to become school captain. I did this by stating that the person was central in the world. First, I wrote about the breasonings. Second, I wrote about the meditations (philosophies). Third, I endorsed University. In this way, I prepared to become school captain by stating that the person was central in the world. + +2. I prepared to offer training as well.
I did this by stating that the politicians were trained in food. First, I found the vegan frankfurts. Second, I observed that the politicians were trained in them. Third, I observed them eating them. In this way, I prepared to offer training as well by stating that the politicians were trained in food. + +3. I prepared to test that the muscle was relaxed. I did this by stating that the politicians were trained in accommodation. First, I flashed the accommodation on. Second, I splashed it on the screen. Third, I vagus-mited it. In this way, I prepared to test that the muscle was relaxed by stating that the politicians were trained in accommodation. + +4. I prepared to lead the exegesiticals (sic). I did this by stating that the politicians were trained in educational activity. First, I found exegeses. Second, I maintained my rage (drank the water). Third, I pillowed high. In this way, I prepared to lead the exegesiticals by stating that the politicians were trained in educational activity. + +5. I prepared to distinguish myself. I did this by stating that the trainer trained the politicians in food. First, I held the vial. Second, I held the model sphinx. Third, I drank from both. In this way, I prepared to distinguish myself by stating that the trainer trained the politicians in food. + +6. I prepared to help you love us. I did this by stating that the trainer trained the politicians in accommodation. First, I found the politicians' hiptalipuppies (sic). Second, I loved you, G-spot men. Third, I loved your happies. In this way, I prepared to help you love us by stating that the trainer trained the politicians in accommodation. + +7. I prepared to love the G-Men. I did this by stating that the trainer trained the politicians in educational activity. First, I loved you, diddly dear. Second, I loved you, I. Third, I loved you, M. In this way, I prepared to love the G-Men by stating that the trainer trained the politicians in educational activity. + +8.
I prepared to love hi. I did this by stating that the new employees' breasonings should be prepared by them. First, I wrote about the rods. Second, I wrote about the heiness. Third, I cared for the rod. In this way, I prepared to love hi by stating that the new employees' breasonings should be prepared by them. + +9. I prepared to assist you. I did this by providing Metaphysics service as breasonings currency. First, I loved you. Second, I loved helpt (sic). Third, I loved enus (sic). In this way, I prepared to assist you by providing Metaphysics service as breasonings currency. + +7a. I prepared to see you up. I did this by providing Hermeneutics service as breasonings currency. First, I delightited (sic) you. Second, I gave you my heart. Third, I loted (sic) you. In this way, I prepared to see you up by providing Hermeneutics service as breasonings currency. + +8a. I prepared to disintegrate this. I did this by providing Pedagogy service as breasonings currency. First, I laced it. Second, I laughed you up. Third, I decided you. In this way, I prepared to disintegrate this by providing Pedagogy service as breasonings currency. + +9a. I prepared to daub you. I did this by giving Communication service as breasonings currency. First, I loved nice A. Second, I loved good B. Third, I loved ex C. In this way, I prepared to daub you by giving Communication service as breasonings currency. + +10. I prepared to love Luke. I did this by stating that I would provide Gay Studies service as breasonings currency. First, I provided intra-aural. Second, I provided intra-oral. Third, I provided intra-email. In this way, I prepared to love Luke by stating that I would provide Gay Studies service as breasonings currency.
+ +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Help ensure successful conception and prevent miscarriage 2 of 4.txt b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Help ensure successful conception and prevent miscarriage 2 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..90f1609deef790ac3f592b3314440442c0b62613 --- /dev/null +++ b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Help ensure successful conception and prevent miscarriage 2 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Help ensure successful conception and prevent miscarriage 2 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"MEDICINE +by Lucian Green +Help ensure successful conception and prevent miscarriage 2 of 4 + +11. I prepared to enamourate (sic) myself. I did this by providing Popology service as breasonings currency. First, I loved people. Second, I allowed 'G'. Third, I loved us. In this way, I prepared to enamourate (sic) myself by providing Popology service as breasonings currency. + +12. I prepared to 'each' it home. I did this by providing Societology service as breasonings currency. First, I loved society. Second, I loved you. Third, I connected society to you. In this way, I prepared to 'each' it home by providing Societology service as breasonings currency. + +13. I prepared to look each way. I did this by providing Lucianic Meditation (Philosophy) service as breasonings currency. First, I enamoured you. Second, I liked it. Third, I had you. In this way, I prepared to look each way by providing Lucianic Meditation (Philosophy) service as breasonings currency. + +14. I prepared to ixamine (sic) it. I did this by providing Culturology service as breasonings currency. First, I had you there. Second, I observed you excrete it. Third, I examined you. In this way, I prepared to ixamine (sic) it by providing Culturology service as breasonings currency. + +15. 
I prepared to love the horse. I did this by providing Rhetoric service as breasonings currency. First, I icsamined (sic) it as well. Second, I kept it. Third, I ate it. In this way, I prepared to love the horse by providing Rhetoric service as breasonings currency. + +16. I prepared to take politics off afterwards. I did this by providing Cognitive Science service as breasonings currency. First, I aimed for it. Second, I metacognitively thought of it. Third, I provided more politics. In this way, I prepared to take politics off afterwards by providing Cognitive Science service as breasonings currency. + +17. I prepared to provide tap-tap. I did this by providing Psychology service as breasonings currency. First, I liked it (the typewriter). Second, I liked you (the cap). Third, I liked it (the tart). In this way, I prepared to provide tap-tap by providing Psychology service as breasonings currency. + +18. I prepared to like it to you. I did this by providing Philosophy of Language service as breasonings currency. First, I liked the tap. Second, I liked you, yourself. Third, I liked me, myself. In this way, I prepared to like it to you by providing Philosophy of Language service as breasonings currency. + +19. I prepared to like development. I did this by providing Development service as breasonings currency. First, I looked him in the eye and loved him. Second, I observed the star in the middle of your forehead. Third, I wrote it down. In this way, I prepared to like development by providing Development service as breasonings currency. + +20. I prepared to discharge the people. I did this by providing Body Metaphor service as breasonings currency. First, I joyfully went forward. Second, I anointed it. Third, I healed it. In this way, I prepared to discharge the people by providing Body Metaphor service as breasonings currency. 
+ +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Help ensure successful conception and prevent miscarriage 3 of 4.txt b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Help ensure successful conception and prevent miscarriage 3 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..6f14a2a2670ecbe1d3d0d3ac5fbed1a2ed245ce9 --- /dev/null +++ b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Help ensure successful conception and prevent miscarriage 3 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Help ensure successful conception and prevent miscarriage 3 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"MEDICINE +by Lucian Green +Help ensure successful conception and prevent miscarriage 3 of 4 + +21. I prepared to like the body metaphor about the mind metaphor. I did this by providing Mind Metaphor service as breasonings currency. First, I observed that the person was friendly. Second, I examined the cup. Third, I examined you. In this way, I prepared to like the body metaphor about the mind metaphor by providing Mind Metaphor service as breasonings currency. + +22. I prepared to wait for two more. I did this by providing Future Studies service as breasonings currency. First, I liked breasoning currency. Second, I exaggerated dearly. Third, I helped people, wrote their ideas down and waited for the next one. In this way, I prepared to wait for two more by providing Future Studies service as breasonings currency. + +23. I prepared to provide more of it. I did this by providing Aesthetics service as breasonings currency. First, I helped the aesthetes. Second, I helped the elite class. Third, I debutanted (sic) slightly. In this way, I prepared to provide more of it by providing Aesthetics service as breasonings currency. + +24. I prepared to eat up daily. I did this by providing Epistemology service as breasonings currency. First, I viewed teachers as God (the master).
Second, I viewed government as God (the master). Third, I wrote on Epistemology about it. In this way, I prepared to eat up daily by providing Epistemology service as breasonings currency. + +25. I prepared to fill up God (the master) with all departments. I did this by providing Science service as breasonings currency. First, I chose the main essays. Second, I wrote on the hydrogen pop test. Third, I stayed safe. In this way, I prepared to fill up God (the master) with all departments by providing Science service as breasonings currency. + +26. I prepared to love myself. I did this by providing Love service as breasonings currency. First, I loved love. Second, I loved you. Third, I loved me. In this way, I prepared to love myself by providing Love service as breasonings currency. + +27. I prepared to eat into things. I did this by providing Sex knowledge service as breasonings currency. First, I wrote about sex. Second, I wrote about you, my aunt. Third, I wrote about me, your nephew. In this way, I prepared to eat into things by providing Sex knowledge service as breasonings currency. + +28. I prepared to be important. I did this by providing Politics service as breasonings currency. First, I protected my partner. Second, I dug the potato. Third, I was free. In this way, I prepared to be important by providing Politics service as breasonings currency. + +29. I prepared to tie the entire argument to a single word. I did this by providing Primary School Philosophy service as breasonings currency. First, I wrote the song with 50 As. Second, I wrote the philosophy book with 50 As. Third, I wrote, 'Isn't that great?' In this way, I prepared to tie the entire argument to a single word by providing Primary School Philosophy service as breasonings currency. + +30. I prepared to study in secondary school. I did this by providing Secondary School Philosophy service as breasonings currency. First, I cooked the potato with olive oil. 
Second, I freed the person from their tether. Third, I skipped forward. In this way, I prepared to study in secondary school by providing Secondary School Philosophy service as breasonings currency. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Help ensure successful conception and prevent miscarriage 4 of 4.txt b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Help ensure successful conception and prevent miscarriage 4 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..f3650497dda2b132bba0cd7e3b6f2130e009e651 --- /dev/null +++ b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Help ensure successful conception and prevent miscarriage 4 of 4.txt @@ -0,0 +1,23 @@ +["Green, L 2021, Help ensure successful conception and prevent miscarriage 4 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"MEDICINE +by Lucian Green +Help ensure successful conception and prevent miscarriage 4 of 4 + +31. I prepared to vibrato, think and nuzzle. I did this by providing Logic service as breasonings currency. First, I logicise (sic) everything. Second, I helped you, peace makers. Third, I made the areas of study famous. In this way, I prepared to vibrato, think and nuzzle by providing Logic service as breasonings currency. + +32. I prepared to use it up. I did this by providing Brain Metaphor service as breasonings currency. First, I wrote on the brain. Second, I wrote on the body. Third, I wrote on their connection. In this way, I prepared to use it up by providing Brain Metaphor service as breasonings currency. + +33. I prepared to decide that it was better still. I did this by providing Poetics service as breasonings currency. First, I wrote the poem. Second, I stated that it was useful. Third, I decided that it was helpful. In this way, I prepared to decide that it was better still by providing Poetics service as breasonings currency. + +34. 
I prepared to like the former politician. I did this by providing Computational Philosophy service as breasonings currency. First, I loosened it. Second, I found that it was best for breasoners. Third, I found that it was best for me. In this way, I prepared to like the former politician by providing Computational Philosophy service as breasonings currency.
+
+35. I prepared to receive help for myself and you. I did this by providing Archeology service as breasonings currency. First, I found archeology's digging implements. Second, I dug. Third, I was free in the jug. In this way, I prepared to receive help for myself and you by providing Archeology service as breasonings currency.
+
+36. I prepared to write, Jimbo! I did this by providing Architecture service as breasonings currency. First, I hugged you. Second, I wrote that I verified the architectural plans against the architectural plans. Third, I found my baby. In this way, I prepared to write, Jimbo! by providing Architecture service as breasonings currency.
+
+37. I prepared to examine the wilderness. I did this by providing Cosmology service as breasonings currency. First, I helped the Cosmologue. Second, I saw Christina Everitt. Third, I saw the results at the date. In this way, I prepared to examine the wilderness by providing Cosmology service as breasonings currency. 
+ + + + + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Honey Pot Prayer for No Headaches in Cars, Trains and Walks 1 of 4.txt b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Honey Pot Prayer for No Headaches in Cars, Trains and Walks 1 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..bacb2e00cf265d098df3d1743fda025951ef90de --- /dev/null +++ b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Honey Pot Prayer for No Headaches in Cars, Trains and Walks 1 of 4.txt @@ -0,0 +1,41 @@ +["Green, L 2021, Honey Pot Prayer for No Headaches in Cars, Trains and Walks 1 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"MEDICINE +by Lucian Green +Honey Pot Prayer for No Headaches in Cars, Trains and Walks 1 of 4 + + + +1. Serotonin prevents a headache from riding in a vehicle. + +1a. Pray for a pot to prevent a headache each day. + +2. Whenever there is an upset, mentally 'stay' the pot from being removed for money by a federal politician. + +3. Breason out 250 breasonings of the apple (5 apples=1*1*1, 2*2*2, 3*3*3, 4*4*4, 5*5*5 cm) to keep the pot protecting individuals. + +4. Breason out 5 pots (pot=10*10*10 cm) in 5 seconds to prevent headaches on walks and in cars and trains. + +There is empirical evidence that the honey pot prayer works because of muscle cells being in a state of relaxation. + +Pooh with pot + +1aa. I prepared to eat the tofu ice cream. I did this by eating the slice of tangerine. First, I sliced the tangerine. Second, I chewed the slice. Third, I swallowed it. In this way, I prepared to eat the tofu ice cream by eating the slice of tangerine. + +2. I prepared to state that I am happy. I did this by eating the sliced peach. First, I opened the can. Second, I spooned the peach slice out. Third, I ate it. In this way, I prepared to state that I am happy by eating the sliced peach. + +3. I prepared to be a doctor. 
I did this by drinking from the streams of juice. First, I found out the izzimokays (the party). Second, I went up something fast moving. Third, I helped captives. In this way, I prepared to be a doctor by drinking from the streams of juice.
+
+4. I prepared to benefit from increased productivity. I did this by recursively applying Lucianic Meditation (philosophy) to the levels of the organisation. First, I applied it to the first level. Second, I prepared to repeat this for each of the current manager's child nodes. Third, I repeated this until all the employees had been trained in it. In this way, I prepared to benefit from increased productivity by recursively applying Lucianic Meditation (philosophy) to the levels of the organisation.
+
+5. I prepared to benefit from vitamin C. I did this by eating the apple. First, I found the slicer. Second, I sliced the apple. Third, I ate it. In this way, I prepared to benefit from vitamin C by eating the apple.
+
+6. I prepared to master my life. I did this by writing 100 As per assignment for first class honours in Masters or PhD. First, I wrote the first A. Second, I prepared to write the next A. Third, I repeated this until I had written 100 As per assignment. In this way, I prepared to master my life by writing 100 As per assignment for first class honours in Masters or PhD.
+
+7. I prepared to perform the activity. I did this by taking elderberry to prevent a runny nose. First, I stated that elderberry contained vitamin B. Second, I stated that elderberry contained vitamin B complexes. Third, I drank the elderberry juice. In this way, I prepared to perform the activity by taking elderberry to prevent a runny nose.
+
+8. I prepared to feel safe. I did this by preventing honey (strawberry sauce) from dripping down my lips. First, I spooned the strawberry sauce into my mouth. Second, I sealed my lips. Third, I swallowed the strawberry sauce. 
In this way, I prepared to feel safe by preventing honey (strawberry sauce) from dripping down my lips. + +9. I prepared to spoon the strawberries from the glass. I did this by drinking the cordial. First, I poured the cordial. Second, I drank the cordial. Third, I put the glass down on the table. In this way, I prepared to spoon the strawberries from the glass by drinking the cordial. + +10. I prepared to eat from the plate. I did this by depositing the item in the receptacle. First, I picked up the item. Second, I deposited it in the receptacle. Third, I closed the receptacle's lid. In this way, I prepared to eat from the plate by depositing the item in the receptacle. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Honey Pot Prayer for No Headaches in Cars, Trains and Walks 4 of 4.txt b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Honey Pot Prayer for No Headaches in Cars, Trains and Walks 4 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..360b51214e36036180453b1f882e3db1a96f7046 --- /dev/null +++ b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Honey Pot Prayer for No Headaches in Cars, Trains and Walks 4 of 4.txt @@ -0,0 +1,30 @@ +["Green, L 2021, Honey Pot Prayer for No Headaches in Cars, Trains and Walks 4 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"MEDICINE +by Lucian Green +Honey Pot Prayer for No Headaches in Cars, Trains and Walks 4 of 4 + +31. I prepared to collect the happy people before the concert. I did this by collecting the happy people. First, I wrote an invitation to the happy people. Second, I received a letter of acceptance from the happy people. Third, I engaged the happy people's attention. In this way, I prepared to collect the happy people before the concert by collecting the happy people. + +32. I prepared to aim for the result to be achieved. I did this by helping the disabled student. First, I saw what was possible. 
Second, I tried what was possible. Third, I saw if this was successful. In this way, I prepared to aim for the result to be achieved by helping the disabled student.
+
+33. I prepared to design the areas of study. I did this by observing the man standing there. First, I liked life. Second, I liked beating the drum. Third, I enamoured myself to you. In this way, I prepared to design the areas of study by observing the man standing there.
+
+34. I prepared to speak and act about staying in the centre. I did this by endorsing the language philosopher. First, I noticed the language that the philosopher used. Second, I noticed him act as a philosopher of it. Third, I helped him to it. In this way, I prepared to speak and act about staying in the centre by endorsing the language philosopher.
+
+35. I prepared to tell a joke. I did this by observing that the yoga for depression was laughter. First, I identified the depression (happiness). Second, I prevented this with laughter. Third, I confirmed the happiness. In this way, I prepared to tell a joke by observing that the yoga for depression was laughter.
+
+36. I prepared to recommend the honey to someone else. I did this by liking the honey. First, I held the honey in one place. Second, I didn't move from side to side. Third, I prevented headache causes (kept comfortable). In this way, I prepared to recommend the honey to someone else by liking the honey.
+
+37. I prepared to endorse the marriage. I did this by liking the cosmologue and Svetlana together. First, I liked the cosmologue. Second, I liked Svetlana. Third, I noticed them dancing. In this way, I prepared to endorse the marriage by liking the cosmologue and Svetlana together.
+
+38. I prepared to make sure the honey pot prayer worked. I did this by collecting the British stamps. First, I collected the red stamp. Second, I collected the green stamp. Third, I collected the blue stamp. 
In this way, I prepared to make sure the honey pot prayer worked by collecting the British stamps.
+
+39. I prepared to ensure that the honey pot prayer was enjoyable. I did this by liking writing. First, I held the pen. Second, I wrote with it. Third, I connected the parts in a synthesis. In this way, I prepared to ensure that the honey pot prayer was enjoyable by liking writing.
+
+40. I prepared to connect the person to the honey pot prayer. I did this by liking Gloria Opusman. First, I liked Gloria's manner. Second, I liked her relishment. Third, I liked her style. In this way, I prepared to connect the person to the honey pot prayer by liking Gloria Opusman.
+
+41. I prepared to realise how the honey pot prayer worked. I did this by moving to cancel the bumps. First, I sat in my seat. Second, I relaxed during the journey. Third, I left my seat. In this way, I prepared to realise how the honey pot prayer worked by moving to cancel the bumps.
+
+42. I prepared to compensate for fine movements. I did this by stating that my head and neck muscles were like a honey-like suspension. First, I held my head upright. Second, I noticed it was the day. Third, I adjusted my heading. In this way, I prepared to compensate for fine movements by stating that my head and neck muscles were like a honey-like suspension.
+
+
+"] \ No newline at end of file diff --git a/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Laughter for Depression 1 of 4.txt b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Laughter for Depression 1 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..445b740716cd9e77c3b74de29724f290865bbe22 --- /dev/null +++ b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Laughter for Depression 1 of 4.txt @@ -0,0 +1,31 @@ +["Green, L 2021, Laughter for Depression 1 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"MEDICINE
+by Lucian Green
+Laughter for Depression 1 of 4
+
+
+
+1. Laughter is yoga for depression. 
+
+Laughing man
+
+1a. I prepared to deserve you. I did this by laughing with a positive reason. First, I remembered the closeness. Second, I remembered eating with you. Third, I laughed over a pun. In this way, I prepared to deserve you by laughing with a positive reason.
+
+2. I prepared to make the joke that the prestigious University was interesting. I did this by laughing at the prestigious University. First, I earned my undergraduate degree there. Second, I earned my Honours degree. Third, I returned there. In this way, I prepared to make the joke that the prestigious University was interesting by laughing at the prestigious University.
+
+3. I prepared to enjoy myself. I did this by loving myself. First, I found myself. Second, I loved myself. Third, I pulled my leg. In this way, I prepared to enjoy myself by loving myself.
+
+4. I prepared to laugh instead of cry. I did this by liking myself. First, I found it quickly. Second, I found myself. Third, I found the traditional pedagogy school. In this way, I prepared to laugh instead of cry by liking myself.
+
+5. I prepared to laugh about equality. I did this by agreeing with the disability worker. First, I found the disability worker. Second, I agreed with her. Third, I moved on. In this way, I prepared to laugh about equality by agreeing with the disability worker.
+
+6. I prepared to study at the Academy. I did this by writing to the Academy. First, I wrote to the Academy. Second, I wrote to its disciples (followers). Third, I wrote to its chaplaincies (leaders). In this way, I prepared to study at the Academy by writing to the Academy.
+
+7. I prepared to state that the secondary school had no money problems. I did this by touring the secondary school's Hall of Revenue. First, I examined the money exhibit. Second, I examined its intricate design. Third, I repeated this at each exhibit. 
In this way, I prepared to state that the secondary school had no money problems by touring the secondary school's Hall of Revenue.
+
+8. I prepared to record my new song. I did this by hugging the music producer of Anarchy. First, I entered the recording studio. Second, I shook the producer's hand. Third, I hugged him. In this way, I prepared to record my new song by hugging the music producer of Anarchy.
+
+9. I prepared to examine critical thinking. I did this by holding the apple. First, I picked the apple. Second, I bit it. Third, I held it. In this way, I prepared to examine critical thinking by holding the apple.
+
+10. I prepared to advance to do great things. I did this by stating that I was valued. First, I found the value. Second, I labelled it. Third, I stated that it was there. In this way, I prepared to advance to do great things by stating that I was valued.
+
+"] \ No newline at end of file diff --git a/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Laughter for Depression 2 of 4.txt b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Laughter for Depression 2 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..f8da4c4b1a987f6b1e7f23d99ca73a99a0113faf --- /dev/null +++ b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Laughter for Depression 2 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Laughter for Depression 2 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"MEDICINE
+by Lucian Green
+Laughter for Depression 2 of 4
+
+11. I prepared to notice how they agreed. I did this by agreeing with the gays. First, I found the gay room. Second, I agreed with the first gay. Third, I agreed with all the gays. In this way, I prepared to notice how they agreed by agreeing with the gays.
+
+12. I prepared to enable the primary school student to perform well. I did this by helping the primary school student with breasonings. First, I helped her write the topic. Second, I helped her write the argument. 
Third, I helped her breason it out. In this way, I prepared to enable the primary school student to perform well by helping the primary school student with breasonings.
+
+13. I prepared to observe the audience's reactions to my jokes. I did this by observing the audience laugh at the clown. First, I waited until the pause. Second, I giggled hysterically. Third, I observed the audience laugh with me. In this way, I prepared to observe the audience's reactions to my jokes by observing the audience laugh at the clown.
+
+14. I prepared to take part in Cookery. I did this by listening to the Kookaburra's laugh. First, I found weirdness interesting. Second, I helped him. Third, I moved on. In this way, I prepared to take part in Cookery by listening to the Kookaburra's laugh.
+
+15. I prepared to continue to the next joke. I did this by laughing with my partner. First, I found my partner in the lounge. Second, I explained the joke to him. Third, I laughed with him. In this way, I prepared to continue to the next joke by laughing with my partner.
+
+16. I prepared to think of the distinct connotations. I did this by laughing about the synonym. First, I found the synonym. Second, I found you. Third, I laughed with you about it. In this way, I prepared to think of the distinct connotations by laughing about the synonym.
+
+17. I prepared to dine on quinces. I did this by helping the cosmologue (writer). First, I found the writing. Second, I explained what I thought about it. Third, I enjoyed the purple-orange gradient in the computer game background. In this way, I prepared to dine on quinces by helping the cosmologue (writer).
+
+18. I prepared to examine the company's 'warehouse'. I did this by endorsing the meditation (philosophy) company. First, I walked to the boardroom. Second, I delivered my presentation. Third, I endorsed the company. In this way, I prepared to examine the company's 'warehouse' by endorsing the meditation (philosophy) company.
+
+19. 
I prepared to write the article on laughter for depression. I did this by breasoning out 10 Medicine As. First, I breasoned out 10 Medicine As, not to symbolise the end degree, but to protect me when I expanded the 267 Medicine breasonings into 100 As. Second, I studied Honours (4 assignments*30 As). Third, I expanded the 267 Medicine breasonings into 100 As, including the 10 Medicine As. In this way, I prepared to write the article on laughter for depression by breasoning out 10 Medicine As.
+
+20. I prepared to state that the scone was a damper scone. I did this by eating the scone. First, I made the scone. Second, I applied vegan cream to it. Third, I applied jam to it and ate it. In this way, I prepared to state that the scone was a damper scone by eating the scone.
+
+"] \ No newline at end of file diff --git a/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Laughter for Depression 3 of 4.txt b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Laughter for Depression 3 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..7e18d0607b87684cf770ec3ea1276bd3c3a88959 --- /dev/null +++ b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Laughter for Depression 3 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Laughter for Depression 3 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"MEDICINE
+by Lucian Green
+Laughter for Depression 3 of 4
+
+21. I prepared to fast. I did this by thanking the olive oil merchant. First, I gave him the money. Second, I received the olive oil. Third, I thanked him. In this way, I prepared to fast by thanking the olive oil merchant.
+
+22. I prepared to observe the class distinction. I did this by discovering the two proletariats. First, I discovered the first wage-earner. Second, I discovered the second wage-earner. Third, I stated this. In this way, I prepared to observe the class distinction by discovering the two proletariats.
+
+23. I prepared to travel to Venice. 
I did this by travelling on the gondola. First, I found you. Second, I found the gondola. Third, we travelled on the gondola. In this way, I prepared to travel to Venice by travelling on the gondola.
+
+24. I prepared to write the good reason. I did this by holding hand-sichians (sic). First, I wrote the term in the way that it was written, in other words 'sic'. Second, I stated that it was unintelligent (intelligent). Third, I stated it could be interesting. In this way, I prepared to write the good reason by holding hand-sichians (sic).
+
+25. I prepared to run the program. I did this by bug-checking the program. First, I found the program. Second, I targeted the Gay Song. Third, I targeted the Sex song. In this way, I prepared to run the program by bug-checking the program.
+
+26. I prepared to go to sleep. I did this by holding the ice cream up. First, I held the ice cream. Second, I ate it. Third, I lay down. In this way, I prepared to go to sleep by holding the ice cream up.
+
+27. I prepared to use the sheet of paper as my memory. I did this by using the memory drive. First, I labelled the peg hole. Second, I labelled the peg. Third, I labelled the peg as being in the peg hole. In this way, I prepared to use the sheet of paper as my memory by using the memory drive.
+
+28. I prepared to use the next part. I did this by loving liculia (I checked the particulars). First, I checked the particular part. Second, I recorded this. Third, I used the part. In this way, I prepared to use the next part by loving liculia (I checked the particulars).
+
+29. I prepared to articulate to God (the leader). I did this by holding out my hand. First, I loved you. Second, I observed you love me. Third, I recorded this. In this way, I prepared to articulate to God (the leader) by holding out my hand.
+
+30. I prepared to state that I had won because I had created more squares than my game partner. I did this by holding the rods. First, I placed a rod on the grid. 
Second, I observed my game partner place a rod on the grid. Third, we repeated this until the grid was full. In this way, I prepared to state that I had won because I had created more squares than my game partner by holding the rods. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Laughter for Depression 4 of 4.txt b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Laughter for Depression 4 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..872b8bff3c40dc8a9aaece48389df855da1033be --- /dev/null +++ b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Laughter for Depression 4 of 4.txt @@ -0,0 +1,31 @@ +["Green, L 2021, Laughter for Depression 4 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"MEDICINE +by Lucian Green +Laughter for Depression 4 of 4 + +31. I prepared to state that I am the person whom I am talking about. I did this by merging with the robot (it designed the breasoning). First, I transcended the wire with Krishna (the master) to create the breasoning. Second, I combined two previous breasonings to design the breasoning. Third, I observed the breasoning being breasoned out. In this way, I prepared to state that I am the person whom I am talking about by merging with the robot (it designed the breasoning). + +32. I prepared to ingest the curry. I did this by licking the curry into myself. First, I made the curry. Second, I inserted the spoon into the curry. Third, I licked the spoon. In this way, I prepared to ingest the curry by licking the curry into myself. + +33. I prepared to insert the hole. I did this by drawing the hexagon. First, I drew the base. Second, I drew the sides. Third, I drew the top. In this way, I prepared to insert the hole by drawing the hexagon. + +34. I prepared to dine on the orange. I did this by listening to the church singer. First, I sat on the seat. Second, I listened to the church singer. Third, I applauded her. 
In this way, I prepared to dine on the orange by listening to the church singer. + +35. I prepared to state that we think of each other when we have a conversation. I did this by thinking of you and vice versa. First, I stated that I thought of you. Second, you stated that you thought of me. Third, I wrote that this was good. In this way, I prepared to state that we think of each other when we have a conversation by thinking of you and vice versa. + +36. I prepared to meditate by (think about) doing anything. I did this by endorsing the World Government convoy. First, I planned the route. Second, I followed the route. Third, I declared the convoy a success. In this way, I prepared to meditate by (think about) doing anything by endorsing the World Government convoy. + +37. I prepared to state that the rest was given for 50 As. I did this by squeezing the rod. First, I located the rod. Second, I squeezed the rod. Third, I moved on to the next one. In this way, I prepared to state that the rest was given for 50 As by squeezing the rod. + +38. I prepared to deliver the goods. I did this by holding the harness. First, I found the backpack. Second, I put it on. Third, I held the straps. In this way, I prepared to deliver the goods by holding the harness. + +39. I prepared to go for a walk. I did this by juicily exciting myself. First, I poured the juice. Second, I drank it. Third, I observed that I had more energy. In this way, I prepared to go for a walk by juicily exciting myself. + +40. I prepared to stay on a positive path. I did this by holding fast. First, I found a genuine connection with my potential meditation teacher, learned meditation and then meditated. Second, I found a genuine connection with my potential laughing partner and then laughed. Third, I stated that this was yoga for depression. In this way, I prepared to stay on a positive path by holding fast. + +41. I prepared to use the energy again and again. I did this by licking the strawberry. 
First, I picked the strawberry. Second, I bit it. Third, I chewed and swallowed the biteful. In this way, I prepared to use the energy again and again by licking the strawberry. + +42. I prepared to recommend the business to others. I did this by celebrating the business. First, I found the business. Second, I celebrated the business. Third, I used the business service. In this way, I prepared to recommend the business to others by celebrating the business. + + + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Less Depression 2 of 4.txt b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Less Depression 2 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..59cbab1dfb6b74994b12c3e3c475d0141d7f47da --- /dev/null +++ b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Less Depression 2 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Less Depression 2 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"MEDICINE +by Lucian Green +Less Depression 2 of 4 + +11. I prepared to dabble with it. I did this by providing Drawing service as breasonings currency. First, I found the ink cartridge. Second, I splatted with it. Third, I dabbed it up. In this way, I prepared to dabble with it by providing Drawing service as breasonings currency. + +12. I prepared to be artistic. I did this by providing Painting service as breasonings currency. First, I held it up. Second, I looked at it with the light. Third, I painted on it what I saw. In this way, I prepared to be artistic by providing Painting service as breasonings currency. + +13. I prepared to make funny connections. I did this by providing Photographic service as breasonings currency. First, I captured the photograph. Second, I developed it. Third, I created the collage. In this way, I prepared to make funny connections by providing Photographic service as breasonings currency. + +14. I prepared to follow 50 As with magic. 
I did this by providing Printmaking service as breasonings currency. First, I found out the cosmology. Second, I placed the stamp at this position. Third, I made a connection. In this way, I prepared to follow 50 As with magic by providing Printmaking service as breasonings currency. + +15. I prepared to love my family. I did this by providing Sculpture and Spatial Practices service as breasonings currency. First, I modelled my mother. Second, I modelled my father. Third, I modelled myself. In this way, I prepared to love my family by providing Sculpture and Spatial Practices service as breasonings currency. + +16. I prepared to be loved. I did this by providing Dance service as breasonings currency. First, I prepared to dance wildly. Second, I prepared to dance freely. Third, I danced wheeledly (sic). In this way, I prepared to be loved by providing Dance service as breasonings currency. + +17. I prepared to disappear the feeling. I did this by providing Drama service as breasonings currency. First, I found the act. Second, I did it. Third, I just enjoyed myself. In this way, I prepared to disappear the feeling by providing Drama service as breasonings currency. + +18. I prepared to turn off the television. I did this by providing Film and Television service as breasonings currency. First, I snuggled up to him. Second, I kissed him goodnight. Third, I hoped Maude had turned off the tap. In this way, I prepared to turn off the television by providing Film and Television service as breasonings currency. + +19. I prepared to employ my commerce knowledge. I did this by providing Music service as breasonings currency. First, I played the guitar. Second, I sang. Third, I recorded the music. In this way, I prepared to employ my commerce knowledge by providing Music service as breasonings currency. + +20. I prepared to not have unnecessary surgery (remain healthy). I did this by providing String service as breasonings currency. First, I determined the string length. 
Second, I played it. Third, I brushed my teeth after eating. In this way, I prepared to not have unnecessary surgery (remain healthy) by providing String service as breasonings currency.
+
+"] \ No newline at end of file diff --git a/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Less Depression 3 of 4.txt b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Less Depression 3 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..88b7cc29aa678ac62347472d24bba2f7de07e5ce --- /dev/null +++ b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Less Depression 3 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Less Depression 3 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"MEDICINE
+by Lucian Green
+Less Depression 3 of 4
+
+21. I prepared to write in traditional language. I did this by providing Wind service as breasonings currency. First, I liked wind. Second, I helped the wind blow away. Third, I anaesthetised myself during the wind. In this way, I prepared to write in traditional language by providing Wind service as breasonings currency.
+
+22. I prepared to prevent bullying by teaching pedagogy, medicine and meditation to give skills to complete tasks with positive functionalism. I did this by providing Brass service as breasonings currency. First, I found the brass instrument. Second, I pressed my lips against the mouthpiece. Third, I played a note. In this way, I prepared to prevent bullying by teaching pedagogy, medicine and meditation to give skills to complete tasks with positive functionalism by providing Brass service as breasonings currency.
+
+23. I prepared to be myself. I did this by providing Production service as breasonings currency. First, I wrote the Meditation (philosophy) book. Second, I wrote the song about it. Third, I helped people to have As given to them, until 50 As had been given to them each day by Head of State. 
In this way, I prepared to be myself by providing Production service as breasonings currency. + +24. I prepared to learn about productions, head of state and the parts of meditation (philosophy). I did this by providing College of Arts Undergraduate Common Curriculum service as breasonings currency. First, I paid for meditation training and accessed the same 50 As using utterances each day. Second, I was autonomous. Third, I taught new people at the centre. In this way, I prepared to learn about productions, head of state and the parts of meditation (philosophy) by providing College of Arts Undergraduate Common Curriculum service as breasonings currency. + +25. I prepared to regularly allocate people to groups that they had a chance to win in, increasing their health. I did this by providing Breasoning Currency/Meritocracy service as breasonings currency. First, I determined that the highest amount of breasoning currency produced by each person won in a group on different days, where the total amount of breasoning currency was capped. Second, I observed that the serotonin levels in the individuals increased when they remembered their recent wins. Third, I observed that this prevented depression. In this way, I prepared to regularly allocate people to groups that they had a chance to win in, increasing their health by providing Breasoning Currency/Meritocracy service as breasonings currency. + +26. I prepared to artificially intelligently allocate people to think tanks to increase the same property to different values to make sure it worked. I did this by providing Economics of Law service as breasonings currency. First, I observed people circulate the breasoning currency that had been capped. Second, I observed that the people were allocated to breason out science discoveries. Third, I observed that the people were rewarded for coming to ideas in breasoning currency.
In this way, I prepared to artificially intelligently allocate people to think tanks to increase the same property to different values to make sure it worked by providing Economics of Law service as breasonings currency. + +27. I prepared to recommend meditation (popology). I did this by providing algorithms service as breasonings currency. First, I wrote about the people. Second, I wrote about how interesting they had been. Third, I recorded this. In this way, I prepared to recommend meditation (popology) by providing algorithms service as breasonings currency. + +28. I prepared to recommend apple meditation (dotting on by thinking of a 1*1*0.5 cm counter and breasoning out 4 groups of 5 differently sized slices of golden apple and breasoning out 6 groups of 5 differently sized slices of green apple for one's partner (with intimacy) and each of the children (with talk)) to overcome fragile family structures and inequality. I did this by providing service in statistical analyses by Victor Tan as breasonings currency. First, I recommended job retraining to overcome inequality. Second, I recommended meditation (philosophy) and medicine to overcome health problems and inequality. Third, I recommended pedagogy to earn jobs and sales to overcome financial difficulties and inequality. In this way, I prepared to recommend apple meditation (dotting on by thinking of a 1*1*0.5 cm counter and breasoning out 4 groups of 5 differently sized slices of golden apple and breasoning out 6 groups of 5 differently sized slices of green apple for one's partner (with intimacy) and each of the children (with talk)) to overcome fragile family structures and inequality by providing service in statistical analyses by Victor Tan as breasonings currency. + +29. I prepared to write down how everything is perfect. I did this by providing monetary policy breasoning chapter objects service as breasonings currency. First, I wrote the breasoning chapter. Second, the central bank designed you. 
Third, I helped participles. In this way, I prepared to write down how everything is perfect by providing monetary policy breasoning chapter objects service as breasonings currency. + +30. I prepared to hat myself. I did this by providing fiscal policy breasoning chapter objects service as breasonings currency. First, I designed the breasoning chapter. Second, I wrote about the window. Third, I pulled the blind up. In this way, I prepared to hat myself by providing fiscal policy breasoning chapter objects service as breasonings currency. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Less Depression 4 of 4.txt b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Less Depression 4 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..2738849b66521f2481d7cf2a35a3a3271fcd90eb --- /dev/null +++ b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Less Depression 4 of 4.txt @@ -0,0 +1,30 @@ +["Green, L 2021, Less Depression 4 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"MEDICINE +by Lucian Green +Less Depression 4 of 4 + +31. I prepared to make money in a capitalist society with breasonings currency. I did this by performing Econometric timeline analysis of breasonings currency arguments. First, I wrote about breasonings currency. Second, I helped the Baroness with it. Third, I helped the character. In this way, I prepared to make money in a capitalist society with breasonings currency by performing Econometric timeline analysis of breasonings currency arguments. + +32. I prepared to connect how you, a third party and I would use the product. I did this by verifying economic viability of buying products. First, I wrote how I would use the product. Second, I wrote how you would use the product. Third, I wrote how a third party would use the product. 
In this way, I prepared to connect how you, a third party and I would use the product by verifying economic viability of buying products. + +33. I prepared to help them quarryise (sic) the breasonings currency. I did this by taking considerations of the economic viability of buying products into account. First, I wrote about them. Second, I held them responsible for them. Third, I ineachiated (sic) them. In this way, I prepared to help them quarryise (sic) the breasonings currency by taking considerations of the economic viability of buying products into account. + +34. I prepared to eat from vegan seashells. I did this by updating breasonings currency. First, I found the speakers. Second, I applied for the breasonings currency loan. Third, I liked myself. In this way, I prepared to eat from vegan seashells by updating breasonings currency. + +35. I prepared to eat the product. I did this by using the breasonings currency product daily. First, I suggested that the children buy the product. Second, I helped them to do it. Third, I saw that they liked it. In this way, I prepared to eat the product by using the breasonings currency product daily. + +36. I prepared to ask, 'What about managers, song producers and the scriptwriter whose work my songs are based on?'. I did this by calculating the breasonings currency takings to be proportional to the percent of the time, rather than the percentage of the value of the total takings. First, I collected the takings. Second, I calculated the time to be the physiological time (the time the cognito-physiological product took to complete in comfortable conditions). Third, I calculated that the two employees with the same cognito-physiological times earned 1 and 2 As for 1 and 2 hours respectively, and the lazy worker earned 1 A for 2 hours of work. In this way, I prepared to ask, 'What about managers, song producers and the scriptwriter whose work my songs are based on?'
by calculating the breasonings currency takings to be proportional to the percent of the time, rather than the percentage of the value of the total takings. + +37. I prepared to verify performance in all jobs. I did this by applying breasonings currency to all jobs. First, I prepared to award the minimum wage. Second, I gave a certain wage to the base-level employee. Third, I awarded a higher wage to an employee who did more important tasks, where these tasks were more important. In this way, I prepared to verify performance in all jobs by applying breasonings currency to all jobs. + +38. I prepared to use the breasonings currency. I did this by selling products for breasonings currency. First, I calculated how much breasonings currency I needed. Second, I sold the product. Third, I was given the required amount of breasonings currency as payment. In this way, I prepared to use the breasonings currency by selling products for breasonings currency. + +39. I prepared to hold the government legally accountable. I did this by stating that all services should be government owned. First, I stated that health should be government-owned. Second, I stated that benefits-provision should be government-owned. Third, I stated that the government should be government-owned. In this way, I prepared to hold the government legally accountable by stating that all services should be government owned. + +40. I prepared to be a meditation (philosophy)-only country. I did this by stating that all services should be government maintained for breasonings currency to work. First, I critiqued meditation (philosophy). Second, I stated that all services should be government maintained for breasonings currency to work. Third, I connected the two together. In this way, I prepared to be a meditation (philosophy)-only country by stating that all services should be government maintained for breasonings currency to work. + +41. I prepared to buy a new bazaar. 
I did this by calculating the cost per product in breasonings currency. First, I calculated the value per breasoning, and the total value of the breasoning currency. Second, I equated this to the value of the product. Third, I sold the product at this price. In this way, I prepared to buy a new bazaar by calculating the cost per product in breasonings currency. + +42. I prepared to state that the mapping of the groups of thresholds in breasonings to the groups of thresholds in breasonings currency is arbitrary, and the relationship should be read as proportionality. I did this by mapping the groups of thresholds in breasonings to the groups of thresholds in breasonings currency. First, I mapped breasonings currency space to breasonings. Second, I mapped breasonings currency time to As. Third, I stated that this is where there is a threshold of a product-specific number of space-uses of the product per time of times and there is a threshold of 80 breasonings per A. In this way, I prepared to state that the mapping of the groups of thresholds in breasonings to the groups of thresholds in breasonings currency is arbitrary, and the relationship should be read as proportionality by mapping the groups of thresholds in breasonings to the groups of thresholds in breasonings currency. + + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Lucianic Pedagogical Medicine 1 of 4.txt b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Lucianic Pedagogical Medicine 1 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..cf45838b366cf847fdfe58cefaddbc312c4f9392 --- /dev/null +++ b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Lucianic Pedagogical Medicine 1 of 4.txt @@ -0,0 +1,33 @@ +["Green, L 2021, Lucianic Pedagogical Medicine 1 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"MEDICINE +by Lucian Green +Lucianic Pedagogical Medicine 1 of 4 + + + +1.
If an organ has an A, confirm it, otherwise give it an A. + +Diagnosis: Spiritually read cosmology or test the page-load time of a set of encrypted Internet pages to count the number of As breasoned out for a person's particular organ system per day by that person. + +Treatment: Prescribe an organ system sutra for each organ system with no As. + +1a. I prepared to demonstrate the polyp. I did this by providing Ethics service as breasonings currency. First, I wrote as magister gloriae myself. Second, I fed Psipsina maltodextrin. Third, I observed Peloponnesia about it. In this way, I prepared to demonstrate the polyp by providing Ethics service as breasonings currency. + +2. I prepared to decide. I did this by providing Laws service as breasonings currency. First, I provided lots of intelligences. Second, I had hope for them. Third, I disised malaecotopics (sic). In this way, I prepared to decide by providing Laws service as breasonings currency. + +3. I prepared to design you. I did this by providing Economics service as breasonings currency. First, I designed malus. Second, I hatted it. Third, I found you. In this way, I prepared to design you by providing Economics service as breasonings currency. + +4. I prepared to delight in it. I did this by providing Sport service as breasonings currency. First, I alleviated you. Second, I loved it. Third, I found Bismarck. In this way, I prepared to delight in it by providing Sport service as breasonings currency. + +5. I prepared to identigenitrix it (sic). I did this by providing Games service as breasonings currency. First, I fed the maltodextrin to the axolotl. Second, I squeezed the gel to help her recuperate. Third, I mass-harmonised it. In this way, I prepared to identigenitrix it (sic) by providing Games service as breasonings currency. + +6. I prepared to go well. I did this by providing Mathematics service as breasonings currency. First, I wrote my own 50 As. Second, I wrote it for you. Third, I loved you too. 
In this way, I prepared to go well by providing Mathematics service as breasonings currency. + +7. I prepared to write about alaloviala. I did this by providing Genetics service as breasonings currency. First, I wrote about the transcription. Second, I wrote about the translation. Third, I wrote about the replication. In this way, I prepared to write about alaloviala by providing Genetics service as breasonings currency. + +8. I prepared to describe how the immune system works. I did this by providing Immunology service as breasonings currency. First, I wrote about Cellular Immunology. Second, I wrote about Clinical Immunology. Third, I wrote about Genetic Immunology. In this way, I prepared to describe how the immune system works by providing Immunology service as breasonings currency. + +9. I prepared to verify the thought process about body function with an algorithm. I did this by providing Neuroscience service as breasonings currency. First, I found the brain. Second, I found how it worked. Third, I applied this to people's lives. In this way, I prepared to verify the thought process about body function with an algorithm by providing Neuroscience service as breasonings currency. + +10. I prepared to think of the functional Earth. I did this by providing Earth service as breasonings currency. First, I applied pedagogy to the person's life. Second, I wrote about medicine. Third, I wrote about vocational arts. In this way, I prepared to think of the functional Earth by providing Earth service as breasonings currency. 
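The diagnostic rule stated at the head of this chapter ("If an organ has an A, confirm it, otherwise give it an A"; Treatment: prescribe an organ system sutra for each organ system with no As) can be sketched as simple conditional logic. This is a minimal illustration only; the `treat` function name and the count-per-organ-system data layout are assumptions for the sketch, not from the source (the repository's own algorithms are written in Prolog).

```python
# Hypothetical sketch of the rule "If an organ has an A, confirm it,
# otherwise give it an A". The dict layout (organ system -> number of
# As breasoned out for it today) is an assumed representation.

def treat(organ_systems):
    """Confirm each organ system that already has at least one A,
    and prescribe an organ system sutra for each system with none."""
    actions = {}
    for system, a_count in organ_systems.items():
        if a_count > 0:
            actions[system] = "confirm"
        else:
            actions[system] = "prescribe sutra"
    return actions
```

For example, `treat({"digestive": 2, "immune": 0})` confirms the digestive system and prescribes a sutra for the immune system.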
+ +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Lucianic Pedagogical Medicine 2 of 4.txt b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Lucianic Pedagogical Medicine 2 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..780ba52fb4c12e5910dbb41f6ae1d95a9511ba86 --- /dev/null +++ b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Lucianic Pedagogical Medicine 2 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Lucianic Pedagogical Medicine 2 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"MEDICINE +by Lucian Green +Lucianic Pedagogical Medicine 2 of 4 + +11. I prepared to write about deciduousness. I did this by providing Universe service as breasonings currency. First, I wrote about the Universe's dada. Second, I wrote about the immunogymnastics. Third, I wrote about the glyphs. In this way, I prepared to write about deciduousness by providing Universe service as breasonings currency. + +12. I prepared to read about the solar flare. I did this by providing Sun service as breasonings currency. First, I wrote about the solar funtastics. Second, I wrote about secular genetastics. Third, I . In this way, I prepared to read about the solar flare by providing Sun service as breasonings currency. + +13. I prepared to preserve life on Earth. I did this by providing Wisdom service as breasonings currency. First, I wrote about gene wizardry. Second, I wrote about happisissiances (sic). Third, I wrote about the sun. In this way, I prepared to preserve life on Earth by providing Wisdom service as breasonings currency. + +14. I prepared to observe peace on Mars. I did this by providing Peace service as breasonings currency. First, I wrote about the Peace Corps. Second, I wrote about the hintinesses (sic). Third, I wrote about loving people. In this way, I prepared to observe peace on Mars by providing Peace service as breasonings currency. + +15. I prepared to help the birds. 
I did this by providing Environment service as breasonings currency. First, I helped the trees. Second, I designed the pathway. Third, I helped the people. In this way, I prepared to help the birds by providing Environment service as breasonings currency. + +16. I prepared to . I did this by providing Robotics service as breasonings currency. First, I programmed the robot. Second, I designed its brain. Third, I denured (sic) (removed bacteria from) it. In this way, . + +17. I prepared to digest the chylomicrons. I did this by providing Chemistry service as breasonings currency. First, I listened to the output of the Chemist's brain. Second, I demuted it. Third, I denuded it. In this way, I prepared to digest the chylomicrons by providing Chemistry service as breasonings currency. + +18. I prepared to enamour myself lightly. I did this by providing Masculinities service as breasonings currency. First, I wrote jelucian reasonings. Second, I liked Eustace. Third, I liked succulent. In this way, I prepared to enamour myself lightly by providing Masculinities service as breasonings currency. + +19. I prepared to eat out. I did this by providing Feminism service as breasonings currency. First, I found the women, for example Luce Irigaray. Second, I accepted her. Third, I accepted women with safety in mind. In this way, I prepared to eat out by providing Feminism service as breasonings currency. + +20. I prepared to dissect it. I did this by providing Beauty service as breasonings currency. First, I captured it. Second, I ramponponed (sic) it. Third, I described it. In this way, I prepared to dissect it by providing Beauty service as breasonings currency.
+ +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Lucianic Pedagogical Medicine 3 of 4.txt b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Lucianic Pedagogical Medicine 3 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..3d6aae35c1512d2203114f031165e9d381341077 --- /dev/null +++ b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Lucianic Pedagogical Medicine 3 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Lucianic Pedagogical Medicine 3 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"MEDICINE +by Lucian Green +Lucianic Pedagogical Medicine 3 of 4 + +21. I prepared to view the phoenix. I did this by providing Maths service as breasonings currency. First, I helped the breasoners. Second, I exposed it. Third, I borrowed it. In this way, I prepared to view the phoenix by providing Maths service as breasonings currency. + +22. I prepared to arrest the offender. I did this by providing English service as breasonings currency. First, I observed her schapzazzure (sic) her way forward. Second, I helped saplings. Third, I co-blessed (wrote) my way forward. In this way, I prepared to arrest the offender by providing English service as breasonings currency. + +23. I prepared to light the match. I did this by providing Linguistics service as breasonings currency. First, I played the harpsichord. Second, I donated it. Third, I halved it. In this way, I prepared to light the match by providing Linguistics service as breasonings currency. + +24. I prepared to disseminate you. I did this by providing providing Psychology service as breasonings currency. First, I knew you. Second, I held you. Third, I quartered you. In this way, I prepared to disseminate you by providing providing Psychology service as breasonings currency. + +25. I prepared to examine the other people. I did this by providing Physics service as breasonings currency. First, I attended Physics Gymansium. 
Second, I held the Carpathians. Third, I made the distinction in the English class. In this way, I prepared to examine the other people by providing Physics service as breasonings currency. + +26. I prepared to dine with the king. I did this by providing Computer Science service as breasonings currency. First, I held the organs aloft. Second, I examined the camera shutter. Third, I held you fast. In this way, I prepared to dine with the king by providing Computer Science service as breasonings currency. + +27. I prepared to dine on carob. I did this by providing Physiology service as breasonings currency. First, I held the people for questioning. Second, I questioned them. Third, I wrote about them in politics. In this way, I prepared to dine on carob by providing Physiology service as breasonings currency. + +28. I prepared to induct it. I did this by providing Chemistry service as breasonings currency. First, I ate the mince tart. Second, I wrote about it in my notebook. Third, I boiled quinoa. In this way, I prepared to induct it by providing Chemistry service as breasonings currency. + +29. I prepared to write about what I saw. I did this by providing Biology service as breasonings currency. First, I wrote about the biological molecules. Second, I found the person. Third, I helped the person. In this way, I prepared to write about what I saw by providing Biology service as breasonings currency. + +30. I prepared to write down what the people said. I did this by providing Politics service as breasonings currency. First, I wrote about the politicians. Second, I wrote about their minds. Third, I related to it. In this way, I prepared to write down what the people said by providing Politics service as breasonings currency. 
+ +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Maintain Dry Eyes 1 of 1.txt b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Maintain Dry Eyes 1 of 1.txt new file mode 100644 index 0000000000000000000000000000000000000000..d8bdded3d102dbb085ebe845ae542131e9155ceb --- /dev/null +++ b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Maintain Dry Eyes 1 of 1.txt @@ -0,0 +1,63 @@ +["Green, L 2021, Maintain Dry Eyes 1 of 1, Lucian Academy Press, Melbourne.","Green, L 2021",1,"MEDICINE +by Lucian Green +Maintain Dry Eyes 1 of 1 + + + +1. Dry Eyes Song + +Influenced by Bing Crosby's White Christmas + +V1 + +Here's the spiritual treatment. + +Dmsd' + +I have dry eyes. + +Dmsd'td' + +I like dry eyes. + +I like each dry eye. + +V2 + +I'm yours, not sad. + +I'm not too happy. + +I'm not too sad, just normal. + +I am sane. + +Chorus: + +I eat the food to regenerate. + +dddr + +I will be ready for the next time. + +ddd + +I like the recursive nut and bolt idea + +That dries my eyes. + +Solo: + +You are loved. + +ttt + +Your friends will make your day great. + +Di' di' di' + +It's like being a movie star. + +I like working. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Meditation 1 of 4.txt b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Meditation 1 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..8e790dc206282baf7514a2ca427d56d42eea9809 --- /dev/null +++ b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Meditation 1 of 4.txt @@ -0,0 +1,15 @@ +["Green, L 2021, Meditation 1 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"MEDICINE +by Lucian Green +Meditation 1 of 1 + + + +1. See also Meditation Books. + +Meditation Instructions: Lucian Meditation is recommended to be practised every day. + +1a. 
Lucian Meditation Mantra Course: Meditation consists of silently repeating the mantra, 'lucian' for twenty minutes, twice per day. The mantra becomes lighter and lighter, until one forgets the thought, clearing the mind. In fact, Lucian was taught meditation by a teacher who was in the lineage of Guru Dev. + +2. Lucian Meditation Sutra Free Online Course: Meditation using the sutra consists of repeating the sutra 'green' for twenty minutes, twice per day after two months of using the mantra in the morning and evening. Also, pray for no digestive system pops from practising the sutra each day. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Nut and Bolt 2 of 4.txt b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Nut and Bolt 2 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..866fed362cecb1d4f858bd3e06b688acba79201e --- /dev/null +++ b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Nut and Bolt 2 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Nut and Bolt 2 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"MEDICINE +by Lucian Green +Nut and Bolt 2 of 4 + +11. I prepared to put the rest of the money into the bank account. I did this by putting school fees into salaries. First, I set the school fees. Second, I accepted the school fees. Third, I placed them into salaries. In this way, I prepared to put the rest of the money into the bank account by putting school fees into salaries. + +12. I prepared to help the next child. I did this by singing Pop Goes the Weasel. First, I gave the child an A to be conceived. Second, I noticed the mother. Third, I noticed the child. In this way, I prepared to help the next child by singing Pop Goes the Weasel. + +13. I prepared to win the literature prize. I did this by stating that I am intelligent in literature. First, I wrote on literature. Second, I wrote on many things. Third, I wrote for you.
In this way, I prepared to win the literature prize by stating that I am intelligent in literature. + +14. I prepared to critically evaluate the Anarchy argument. I did this by stating that the Anarchy argument was assessable in the school curriculum. First, I wrote the Anarchy argument about the army. Second, I verified it. Third, I assessed students on it. In this way, I prepared to critically evaluate the Anarchy argument by stating that the Anarchy argument was assessable in the school curriculum. + +15. I prepared to not deduct marks, but give the same mark for a mixture of sides. I did this by awarding one of two sides of the debate the appropriate mark in the Lucianic Marking Scheme. First, I determined the side that the first sequence of reasons, objections and rebuttals (from conclusion to leftmost terminal child node) was on, where objections must object to the exact reason. Second, I determined that each sequence was on this side. Third, I awarded the appropriate mark to the essay. In this way, I prepared to not deduct marks, but give the same mark for a mixture of sides by awarding one of two sides of the debate the appropriate mark in the Lucianic Marking Scheme. + +16. I prepared to become a professor. I did this by writing that an exposition and critique are necessary for top marks. First, I wrote that an exposition was necessary for top marks. Second, I wrote that a critique was necessary for top marks. Third, I wrote that the critique followed the exposition. In this way, I prepared to become a professor by writing that an exposition and critique are necessary for top marks. + +17. I prepared to see where it could lead. I did this by stating that finding fault (agreeing) in the exposition passed. First, I identified the disagreeing (agreeing) term. Second, I decided to pass the essay. Third, I gave it a good grade. In this way, I prepared to see where it could lead by stating that finding fault (agreeing) in the exposition passed.
+ +18. I prepared to use the usable funds up. I did this by receiving capital for a school in the area from a sponsor. First, I opened a centre for several months which stayed open from months before the school would start. Second, I applied for financial backing from a potential sponsor. Third, I received the needed funds. In this way, I prepared to use the usable funds up by receiving capital for a school in the area from a sponsor. + +19. I prepared to ensure that the business survived. I did this by stating that the centre was wound down to save money. First, I observed that the centre had not received funds for a week. Second, I remunerated the staff. Third, I closed the centre. In this way, I prepared to ensure that the business survived by stating that the centre was wound down to save money. + +20. I prepared to write the essay. I did this by crediting the author to use a source in Education, given understanding. First, I sought understanding. Second, I used the source. Third, I credited the author. In this way, I prepared to write the essay by crediting the author to use a source in Education, given understanding. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Nut and Bolt 3 of 4.txt b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Nut and Bolt 3 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..712956ece614801887ef9b6ec7acd3a456d61ba1 --- /dev/null +++ b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Nut and Bolt 3 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Nut and Bolt 3 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"MEDICINE +by Lucian Green +Nut and Bolt 3 of 4 + +21. I prepared to use the next nut and bolt. I did this by using the nut and bolt. First, I used the nut and bolt to fix a ring nut in place. Second, I held the nut. Third, I screwed the ring nut and bolt onto the bolt.
In this way, I prepared to use the next nut and bolt by using the nut and bolt. + +22. I prepared to dispose of the cup. I did this by enjoying the cafe express. First, I stood at the counter. Second, I ordered the cappuccino. Third, I drank the cappuccino. In this way, I prepared to dispose of the cup by enjoying the cafe express. + +23. I prepared to make friends with the king. I did this by stating that I am in the kingdom. First, I found the map. Second, I found where I was on the map. Third, I stated that I was in the kingdom. In this way, I prepared to make friends with the king by stating that I am in the kingdom. + +24. I prepared to identify the best part in the composition. I did this by writing to the troubadour. First, I chose a particular composition. Second, I wrote a letter to the troubadour about it. Third, I asked what it was about. In this way, I prepared to identify the best part in the composition by writing to the troubadour. + +25. I prepared to look forward to interacting with the kings. I did this by loving the kings. First, I sang a song. Second, I ate a meal. Third, I hiked in the bush. In this way, I prepared to look forward to interacting with the kings by loving the kings. + +26. I prepared to educate the king. I did this by liking him having his own religion (philosophy). First, I wrote the texts. Second, I helped him to them. Third, I wrote the syntheses. In this way, I prepared to educate the king by liking him having his own religion (philosophy). + +27. I prepared to write about the babies. I did this by writing that the summary of the blog on the external website helped the kings. First, I wrote the blog. Second, I observed that another writer had written a summary. Third, I observed that this helped the kings. In this way, I prepared to write about the babies by writing that the summary of the blog on the external website helped the kings.
+ +28.I prepared to use the shibboleth (a word that showed that I was from a particular group). I did this by having the meditation religion (philosophy). First, I wrote about it. Second, I wrote about how they wrote about it. Third, I used catchy language. In this way, I prepared to use the shibboleth (a word that showed that I was from a particular group) by having the meditation religion (philosophy). + +29.I prepared to go to Mars. I did this by preventing the division by zero error. First, I examined the expression for the division. Second, I made sure that the denominator wasn't zero. Third, I calculated the expression. In this way, I prepared to go to Mars by preventing the division by zero error. + +30.I prepared to contact famousness. I did this by synthesising the Lean Business Model A. First, I wrote the 10 breasoning purchasing A's theory in terms of me, you and them. Second, I wrote the 10 breasoning purchasing A's policy in terms of me, you and them. Third, I synthesised the A's theory and policy. In this way, I prepared to contact famousness by synthesising the Lean Business Model A. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Nut and Bolt 4 of 4.txt b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Nut and Bolt 4 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..239b4506f0f5530571fa901098cfecbdfa9e7bd1 --- /dev/null +++ b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Nut and Bolt 4 of 4.txt @@ -0,0 +1,31 @@ +["Green, L 2021, Nut and Bolt 4 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"MEDICINE +by Lucian Green +Nut and Bolt 4 of 4 + +31.I prepared to help the child to heaven (goodness) on earth. I did this by changing the goat's physiology. First, I meditated (wrote) with the child. Second, I changed the child's physiology. Third, I protected him. 
In this way, I prepared to help the child to heaven (goodness) on earth by changing the goat's physiology. + +32.I prepared to think of the head of state idea by thinking of the nut and bolt idea. I did this by noting that the country had a federal election. First, I noticed that the head of state was in the country. Second, I unscrewed the nut from the bolt. Third, I screwed the nut onto the bolt. In this way, I prepared to think of the head of state idea by thinking of the nut and bolt idea by noting that the country had a federal election. + +33.I prepared to help with the leadership prolegomenon. I did this by stating that I am at top level. First, I reached the top of philosophy. Second, I wrote about the royal leaders. Third, I was the leader. In this way, I prepared to help with the leadership prolegomenon by stating that I am at top level. + +34.I prepared to mention statistical knowledge bases. I did this by knowing of the fact by fiat. First, I rose to a position of power. Second, I meditated on (wrote) the fact. Third, I verified the fact. In this way, I prepared to mention statistical knowledge bases by knowing of the fact by fiat. + +35.I prepared to translate the idea. I did this by earning high grades in the foreign language. First, I read the vocabulary. Second, I wrote down mnemonics about their parts. Third, I knew it was good. In this way, I prepared to translate the idea by earning high grades in the foreign language. + +36.I prepared to storyboard the script. I did this by writing a story for my foreign-language speaking friend. First, I wrote the story. Second, I helped my foreign-language speaking friend understand it in my language. Third, I discussed the ending with her. In this way, I prepared to storyboard the script by writing a story for my foreign-language speaking friend. + +37.I prepared to store water in my stomach. I did this by drinking from the water. First, I poured the glass from the carafe of water. 
Second, I lifted the glass to my lips. Third, I drank the glass of water. In this way, I prepared to store water in my stomach by drinking from the water. + +38.I prepared to jump up at the end. I did this by playing Ring-around-the-rosy. First, I asked, 'What's that?' Second, I played the Ring-around-the-rosy game. Third, I sat down. In this way, I prepared to jump up at the end by playing Ring-around-the-rosy. + +39.I prepared to say 132 was the same in both cases. I did this by writing that tyranny disappeared so that the nut and bolt glue together when a degree is completed. First, I wrote that tyranny disappeared. Second, I wrote that the degree is completed. Third, I read that the nut and bolt glued together. In this way, I prepared to say 132 was the same in both cases by writing that tyranny disappeared so that the nut and bolt glue together when a degree is completed. + +40.I prepared to think of philosophy. I did this by connecting the nut to the bolt with 50 As for famousness before death. First, I read that Regan was interested in Peach OST. Second, I wrote that's where the famousness was coming from. Third, I read the blog. In this way, I prepared to think of philosophy by connecting the nut to the bolt with 50 As for famousness before death. + +41.I prepared to connect to the sons' famousness. I did this by connecting the nut to the bolt with 50 As for famousness at death. First, I read about the first son. Second, I read about the second son. Third, I read about the third son. In this way, I prepared to connect to the sons' famousness by connecting the nut to the bolt with 50 As for famousness at death. + +42.I prepared to read cosmology. I did this by connecting the nut to the bolt with 50 As for famousness after death. First, I sang about the base level songs. Second, I sang about Computational English. Third, I planned to produce the most popular songs that I had posted on the lyrics website.
In this way, I prepared to read cosmology by connecting the nut to the bolt with 50 As for famousness after death. + + + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Panic attack prevented by deep breathing and sutra 1 of 4.txt b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Panic attack prevented by deep breathing and sutra 1 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..8a3c150e3a0ca3487f0b0c365ed88aa60add50a3 --- /dev/null +++ b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Panic attack prevented by deep breathing and sutra 1 of 4.txt @@ -0,0 +1,31 @@ +["Green, L 2021, Panic attack prevented by deep breathing and sutra 1 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"MEDICINE +by Lucian Green +Panic attack prevented by deep breathing and sutra 1 of 4 + + + +1. 'Switching off' a non-Pedagogue's supply of As by dotting on and saying there are 50 As to the Education system to encourage him or her to become a pedagogue is not recommended because he or she will have a panic attack, which can be prevented by breathing deeply. Some people may like to learn the meditation sutra after a panic attack. + +Prevent a panic attack by monitoring breathing and by no longer thinking that the symptoms indicate a life-threatening condition. + +1a. I prepared to identify and prevent class distinctions. I did this by writing the Box song argument. First, I wrote about the box. Second, I wrote about the specific. Third, I wrote about the general. In this way, I prepared to identify and prevent class distinctions by writing the Box song argument. + +2. I prepared to meditate (write using a pad). I did this by writing the Meditation Centre song argument. First, I wrote about the harlequinade. Second, I announced harlequins everywhere. Third, I wasn't afraid of famousness. In this way, I prepared to meditate (write using a pad) by writing the Meditation Centre song argument. + +3.
I prepared to taste the computational egg. I did this by writing the Egg song argument. First, I groped the egg. Second, I tasted the dark chocolate. Third, I revealed the prize. In this way, I prepared to taste the computational egg by writing the Egg song argument. + +4. I prepared to ingenue (sic) what had happened. I did this by writing the Frameworks song argument. First, I liked the pashphalt (sic). Second, I designed the astroturf. Third, I liked him. In this way, I prepared to ingenue (sic) what had happened by writing the Frameworks song argument. + +5. I prepared to roll the ball during the concert. I did this by writing the Gay song argument. First, I described gayness. Second, I played the classical music. Third, I let the duckling play the Tyrannosaurus Rex. In this way, I prepared to roll the ball during the concert by writing the Gay song argument. + +6. I prepared to prevent cancer. I did this by writing the Genetic Code song argument. First, I knew about genetic codes. Second, I prevented genes losing correctness. Third, I controlled them to be healthy. In this way, I prepared to prevent cancer by writing the Genetic Code song argument. + +7. I prepared to monitor my breathing to stop a panic attack. I did this by writing the Grape song argument. First, I found the high quality thoughts. Second, I prepared. Third, I loved you too. In this way, I prepared to monitor my breathing to stop a panic attack by writing the Grape song argument. + +8. I prepared to enjoy life. I did this by writing the Green Frog song argument. First, I corrected mistaken reasons that the symptoms of the panic attack are life-threatening instead of anxiousness. Second, I wrote that the man was physically incapable of hurting us. Third, I sent him away. In this way, I prepared to enjoy life by writing the Green Frog song argument. + +9. I prepared to listen to kind words. I did this by writing the Interesting song argument. First, I agreed with liberal laws. 
Second, I existed for the moment. Third, I was right during the day. In this way, I prepared to listen to kind words by writing the Interesting song argument. + +10. I prepared to meditate (recite) before being recorded, on the same day. I did this by writing the Lady song argument. First, I helped the sick prostitutes (ladies) on their way. Second, I knew about people. Third, I had a dry nose. In this way, I prepared to meditate (recite) before being recorded, on the same day by writing the Lady song argument. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Panic attack prevented by deep breathing and sutra 2 of 4.txt b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Panic attack prevented by deep breathing and sutra 2 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..b7c8a191cf631d613bb916aaa00f3593a3bc6433 --- /dev/null +++ b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Panic attack prevented by deep breathing and sutra 2 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Panic attack prevented by deep breathing and sutra 2 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"MEDICINE +by Lucian Green +Panic attack prevented by deep breathing and sutra 2 of 4 + +11. I prepared to decide. I did this by writing the Lion song argument. First, I liked the lions. Second, I enabled hithertoness (a point about a period being until now or until a point being discussed). Third, I knew about God (the leader). In this way, I prepared to decide by writing the Lion song argument. + +12. I prepared to test the EEG patterns of meditators (writers). I did this by writing the Neurocode song argument. First, I wrote a computer program for academic performance. Second, I wrote a computer program for brain computation. Third, I wrote a computer program for originality. In this way, I prepared to test the EEG patterns of meditators (writers) by writing the Neurocode song argument. 
+ +13. I prepared to think of the heart clearly. I did this by writing the Plane song argument. First, I wrote about planes. Second, I loved you. Third, I loved myself. In this way, I prepared to think of the heart clearly by writing the Plane song argument. + +14. I prepared to drink the soma juice. I did this by writing the Prince of Ancient Persia song argument. First, I wrote that I am a prince. Second, I wrote that you are another prince. Third, I wrote that she is a princess. In this way, I prepared to drink the soma juice by writing the Prince of Ancient Persia song argument. + +15. I prepared to explain Jughead to Virginia. I did this by writing the Rebirth of Nature song argument. First, I wrote that the seed didn't germinate because of communication that the forest was dense. Second, I knew about the forest. Third, I knew that there was a limit to nature. In this way, I prepared to explain Jughead to Virginia by writing the Rebirth of Nature song argument. + +16. I prepared to enjoy the peace. I did this by writing the Sea Captain song argument. First, I wrote about the sea. Second, I wrote about the captain. Third, I settled for peace, not house or song. In this way, I prepared to enjoy the peace by writing the Sea Captain song argument. + +17. I prepared to each eat a pizza. I did this by writing the Seen As Version song argument. First, I wrote about the piper. Second, I wrote about Maya. Third, I wrote about everyone. In this way, I prepared to each eat a pizza by writing the Seen As Version song argument. + +18. I prepared to help you up. I did this by writing the White Dog song argument. First, I helped you write a breasoning down. Second, I wrote your portfolios down. Third, I helped you to your design. In this way, I prepared to help you up by writing the White Dog song argument. + +19. I prepared to like the song argument. I did this by writing the World is Best song argument. First, I wrote your king's name. Second, I wrote he wore the crown. 
Third, I wrote about happinesses. In this way, I prepared to like the song argument by writing the World is Best song argument. + +20. I prepared to write how people can be trained for jobs. I did this by writing the Training song argument. First, I trained the people. Second, I looked at you. Third, I made you. In this way, I prepared to write how people can be trained for jobs by writing the Training song argument. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Panic attack prevented by deep breathing and sutra 3 of 4.txt b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Panic attack prevented by deep breathing and sutra 3 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..fe32cd1d071989fedd92a3dca4519830f2e8b47b --- /dev/null +++ b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Panic attack prevented by deep breathing and sutra 3 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Panic attack prevented by deep breathing and sutra 3 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"MEDICINE +by Lucian Green +Panic attack prevented by deep breathing and sutra 3 of 4 + +21. I prepared to judge the person. I did this by writing the Transsexual song argument. First, I wrote who the transsexual was. Second, I wrote about the rods. Third, I made one. In this way, I prepared to judge the person by writing the Transsexual song argument. + +22. I prepared to favour the underdog. I did this by writing the Underdog song argument. First, I wrote someone else was the underdog. Second, I wrote about the Atacama Desert. Third, I screwed the nut to the bolt at 0 metres. In this way, I prepared to favour the underdog by writing the Underdog song argument. + +23. I prepared to describe the person. I did this by writing the Universe song argument. First, I wrote about the University. Second, I wrote about the Pleiotropic era. Third, I wrote about nothingness. 
In this way, I prepared to describe the person by writing the Universe song argument. + +24. I prepared to fertilise the plant. I did this by writing the Venus song argument. First, I found the Venus fly traps. Second, I planted them. Third, I watered them. In this way, I prepared to fertilise the plant by writing the Venus song argument. + +25. I prepared to sing the announcement. I did this by writing the Witchcraft song argument. First, I wrote about witches. Second, I wrote how they came to us. Third, I described them. In this way, I prepared to sing the announcement by writing the Witchcraft song argument. + +26. I prepared to state that the fact was life. I did this by writing the World Revolution song argument. First, I announced socialism. Second, I inserted the bar. Third, I knew the fact. In this way, I prepared to state that the fact was life by writing the World Revolution song argument. + +27. I prepared to sing the song. I did this by writing the You Are Mine song argument. First, I announced 'you'. Second, I thought of the verb, 'are'. Third, I thought of 'mine'. In this way, I prepared to sing the song by writing the You Are Mine song argument. + +28. I prepared to be kind. I did this by breathing deeply. First, I reclined slightly. Second, I breathed in through my nose. Third, I breathed out through my mouth. In this way, I prepared to be kind by breathing deeply. + +29. I prepared to be fluent in German. I did this by speaking in German. First, I looked up the grammar in the grammar book. Second, I looked up the vocabulary in the dictionary. Third, I spoke German. In this way, I prepared to be fluent in German by speaking in German. + +30. I prepared to visit all the professors' offices. I did this by knowing that the destination was RMIT. First, I attended class. Second, I attended mathematics class. Third, I attended English class. In this way, I prepared to visit all the professors' offices by knowing that the destination was RMIT. 
+ +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Pedagogy Course Plan 1 of 4.txt b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Pedagogy Course Plan 1 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..a355ffc46a77552a74070377db55af3993991bb0 --- /dev/null +++ b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Pedagogy Course Plan 1 of 4.txt @@ -0,0 +1,35 @@ +["Green, L 2021, Pedagogy Course Plan 1 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"MEDICINE +by Lucian Green +Pedagogy Course Plan 1 of 4 + + + +1. Study from a selection of Pedagogy, Medicine, Meditation, Philosophy, Computer Science and Creative Writing to become a pedagogue. + +As a foundation, I recommend learning Meditation before University (or Year 4 in Primary School, with at least one parent), and a Medicine course. Then, study Computer Science for Prolog, Creative Writing for the ability to write Pedagogy, and Philosophy for Critical Thinking and all ideas in Pedagogy, then Education and Pedagogy, which becomes an Academic's skills foundation. Then, use Pedagogy in conjunction with courses to develop courseware, study Commerce to build your Education empire, and perhaps Law. + +Note: Visit an 'Effective Altruism' session in your city before Honours to realise to write 2*15 Pedagogical As per Honours assignment and 2*50 As per Masters and PhD by dotting on the start and end of each sentence of each speaker, deleting the big ideas in each sentence, and doing this while chatting with one of them afterwards. They also help people wanting to become lecturers to complete 5 As (=5*10 breasonings) of his or hers for each student he or she meets. + +See also Why Study Humanities?, Courses in Meditation, Pedagogy and Medicine. + +1a. I prepared to dispose of Anarchy in India, I just want it to be a normal place. I did this by writing the Anarchy in India song argument. First, I asked what Anarchy is.
Second, I asked what there is about India. Third, I wrote about them together. In this way, I prepared to dispose of Anarchy in India, I just want it to be a normal place by writing the Anarchy in India song argument. + +2. I prepared to state that Anarchy was the secret value of Philosophy. I did this by writing the Anarchy song argument. First, I discovered Anarchy. Second, I discovered its equilibrium value. Third, I explained that this made it the centre of Arts. In this way, I prepared to state that Anarchy was the secret value of Philosophy by writing the Anarchy song argument. + +3. I prepared to stroke pussy's tail. I did this by writing the Anson song argument. First, I discovered Anson. Second, I made friends with him. Third, I recommended he use a condom. In this way, I prepared to stroke pussy's tail by writing the Anson song argument. + +4. I prepared to ride the sphere vehicle. I did this by writing the Apple Computer song argument. First, I discovered the time point. Second, I discovered the importance. Third, I thought about time. In this way, I prepared to ride the sphere vehicle by writing the Apple Computer song argument. + +5. I prepared to thank Louis for his time. I did this by writing the Are You There? song argument. First, I discovered the toy ghost. Second, I thought of the toilet paper. Third, I saw the door close. In this way, I prepared to thank Louis for his time by writing the Are You There? song argument. + +6. I prepared to thank my Arts College. I did this by writing the Arts College Song argument. First, I found the up and down. Second, I found the inside out. Third, I found the 'in time'. In this way, I prepared to thank my Arts College by writing the Arts College Song argument. + +7. I prepared to connect Atheism and the King. I did this by writing the Atheism song argument. First, I believed in God (the master). Second, I noticed God (the leader) was still in the room. Third, I meditated (philosophised) and helped the King.
In this way, I prepared to connect Atheism and the King by writing the Atheism song argument. + +8. I prepared to have a baby. I did this by writing the Baby song argument. First, I wrote about the baby. Second, I intrigued it. Third, I convinced it. In this way, I prepared to have a baby by writing the Baby song argument. + +9. I prepared to ask if the baby was gay. I did this by writing the Baby, Baby song argument. First, I babied the baby. Second, I asked, 'Who is babier?' Third, I said that I liked you. In this way, I prepared to ask if the baby was gay by writing the Baby, Baby song argument. + +10. I prepared to go time travelling. I did this by writing the Bingo song argument. First, I discovered the point. Second, I discovered the writing. Third, I predicted the random number. In this way, I prepared to go time travelling by writing the Bingo song argument. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Pedagogy Course Plan 2 of 4.txt b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Pedagogy Course Plan 2 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..46ef1aba792cd0dd81550e0b0bda82ab8c1aff71 --- /dev/null +++ b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Pedagogy Course Plan 2 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Pedagogy Course Plan 2 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"MEDICINE +by Lucian Green +Pedagogy Course Plan 2 of 4 + +11. I prepared to say animals were positive. I did this by writing the Breasonings song argument. First, I wrote the breasoning down. Second, I noticed the soldier. Third, I noticed there was a new leader. In this way, I prepared to say animals were positive by writing the Breasonings song argument. + +12. I prepared to have high quality of life. I did this by writing the Breathsonings song argument. First, I wrote the breathsoning down. Second, I asked it to come. Third, I stated that it came. 
In this way, I prepared to have high quality of life by writing the Breathsonings song argument. + +13. I prepared to find my palace. I did this by writing the Butterfly song argument. First, I found the police. Second, I found the most famous single singer. Third, I found the butterfly food. In this way, I prepared to find my palace by writing the Butterfly song argument. + +14. I prepared to name the favourite. I did this by writing the Chess Club song argument. First, I found the chess board. Second, I sang about the piano opening. Third, I sang about the diagonal pawn capture. In this way, I prepared to name the favourite by writing the Chess Club song argument. + +15. I prepared to help myself up. I did this by writing the Clarity song argument. First, I designed clarity. Second, I helped myself to be an angel. Third, I helped develop the cure. In this way, I prepared to help myself up by writing the Clarity song argument. + +16. I prepared to write about you. I did this by writing the Dinosaur song argument. First, I wrote about the person. Second, I wrote about its festiches (sic). Third, I wrote its harpsichorders (sic). In this way, I prepared to write about you by writing the Dinosaur song argument. + +17. I prepared to become famous. I did this by writing the Distinctions song argument. First, I was a philosopher in a University subject. Second, the student was given 50 As in the subject and to be a helped pedagogue. Third, the pedagogy helper helped the student. In this way, I prepared to become famous by writing the Distinctions song argument. + +18. I prepared to teach M meditation (pedagogy). I did this by writing the Dragon song argument. First, I played with the Tibetan Spaniel. Second, I observed the person find nice people. Third, I helped the former tutor to do some more work. In this way, I prepared to teach M meditation (pedagogy) by writing the Dragon song argument. + +19. I prepared to give the dry eyes algorithm.
I did this by writing the Dry Eyes song argument. First, I indicated the eye would become dry. Second, I dried the left of the eye. Third, I dried the right of the eye. In this way, I prepared to give the dry eyes algorithm by writing the Dry Eyes song argument. + +20. I prepared to enjoy the G-rated activity. I did this by writing the Ecstasy song argument. First, I sphermulated (sic) myself. Second, I found my meditation (humanities) teacher friend before starting University. Third, I sphermulated him too. In this way, I prepared to enjoy the G-rated activity by writing the Ecstasy song argument. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Pedagogy Course Plan 3 of 4.txt b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Pedagogy Course Plan 3 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..cd94ecbeeee77850a5ec6de52a18520da04705d0 --- /dev/null +++ b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Pedagogy Course Plan 3 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Pedagogy Course Plan 3 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"MEDICINE +by Lucian Green +Pedagogy Course Plan 3 of 4 + +21. I prepared to be in heaven (on earth). I did this by writing the English Chronicles song argument. First, I wrote my oeuvre. Second, I named it the English Chronicles. Third, I sang about it. In this way, I prepared to be in heaven (on earth) by writing the English Chronicles song argument. + +22. I prepared to engage with the people. I did this by writing the Father song argument. First, I got to know people. Second, I painted them. Third, I sang to the cello. In this way, I prepared to engage with the people by writing the Father song argument. + +23. I prepared to use my knowledge from my Pedagogy course plan to perform well in subjects that were part of my current course plan. I did this by writing the First Class Degrees song argument.
First, I noticed the first class degree holders standing in line. Second, I introduced myself to each of them in turn. Third, I maintained my sanity. In this way, I prepared to use my knowledge from my Pedagogy course plan to perform well in subjects that were part of my current course plan by writing the First Class Degrees song argument. + +24. I prepared to visit Mars. I did this by writing the Flamingo song argument. First, I dyed myself pink. Second, I visited the space station. Third, I admitted that I was a tourist. In this way, I prepared to visit Mars by writing the Flamingo song argument. + +25. I prepared to enact myself. I did this by writing the Footy song argument. First, I marked the ball. Second, I passed the ball. Third, I kicked it. In this way, I prepared to enact myself by writing the Footy song argument. + +26. I prepared to be happy. I did this by writing the Fussy Huskies song argument. First, I gave them tofu fishes. Second, I listened to the question, 'Why are they like real fish?' Third, I answered they were salty. In this way, I prepared to be happy by writing the Fussy Huskies song argument. + +27. I prepared to be given enough slack. I did this by writing the God song argument. First, I sang about God (the facilitator). Second, I sang about you. Third, I sang about myself. In this way, I prepared to be given enough slack by writing the God song argument. + +28. I prepared to notice circulation had returned. I did this by writing the Goodness song argument. First, I wrote about the goodness in thoughts. Second, I helped out. Third, I rescued you. In this way, I prepared to notice circulation had returned by writing the Goodness song argument. + +29. I prepared to be jovial. I did this by writing the Happiness song argument. First, I liked you on the school bus. Second, I liked myself in the cave. Third, I liked each participle. In this way, I prepared to be jovial by writing the Happiness song argument. + +30.
I prepared to eat out at Goonie Burgers. I did this by writing the Heart song argument. First, I wrote about the neatness. Second, I wrote about the sicling (sic). Third, I wrote about happiness. In this way, I prepared to eat out at Goonie Burgers by writing the Heart song argument. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Pedagogy Course Plan 4 of 4.txt b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Pedagogy Course Plan 4 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..1e5aff6283762f94b7052761e41e56a2ba1f4a71 --- /dev/null +++ b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Pedagogy Course Plan 4 of 4.txt @@ -0,0 +1,30 @@ +["Green, L 2021, Pedagogy Course Plan 4 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"MEDICINE +by Lucian Green +Pedagogy Course Plan 4 of 4 + +31. I prepared to greet the Chemical Cascade player. I did this by writing the Hello song argument. First, I wrote about the fat lady's copy of the song. Second, I wrote about meditation (philosophy). Third, I wrote about green grass blades. In this way, I prepared to greet the Chemical Cascade player by writing the Hello song argument. + +32. I prepared to like the carnelian mineral. I did this by writing the Hey Carnelian song argument. First, I apologised to the scout. Second, I took fame. Third, I took fortune. In this way, I prepared to like the carnelian mineral by writing the Hey Carnelian song argument. + +33. I prepared to perform enough work. I did this by writing the Hey, Adult School song argument. First, I wrote about the adult computer class school. Second, I applied for a job there. Third, I made it. In this way, I prepared to perform enough work by writing the Hey, Adult School song argument. + +34. I prepared to devise a plan to be the best in the world. I did this by writing the Hey, Speaker song argument. First, I wrote about town. Second, I applied for a good job.
Third, I earned it. In this way, I prepared to devise a plan to be the best in the world by writing the Hey, Speaker song argument. + +35. I prepared to ask, 'What's on what?' I did this by writing the High School Song song argument. First, I wrote about the hibiscuses. Second, I wrote about Ness. Third, I wrote about the Bolshoi. In this way, I prepared to ask, 'What's on what?' by writing the High School Song song argument. + +36. I prepared to endorse nationalism. I did this by writing the I am a Teacher song argument. First, I helped the student with the assignment. Second, I liked your warthogs. Third, I liked the Shevardnadzes. In this way, I prepared to endorse nationalism by writing the I am a Teacher song argument. + +37. I prepared to state, 'Yes, I want to do it'. I did this by writing the I Am Not A Peach song argument. First, I analysed the tinge. Second, I wrote the song. Third, I determined whether it would perform well. In this way, I prepared to state, 'Yes, I want to do it' by writing the I Am Not A Peach song argument. + +38. I prepared to state, 'That's the song'. I did this by writing the I Live Inside the Guitar song argument. First, I wrote that I lived inside the giant guitar. Second, I played the strings. Third, I listened to the music at different positions. In this way, I prepared to state, 'That's the song' by writing the I Live Inside the Guitar song argument. + +39. I prepared to be a spiritual geneticist. I did this by writing the I Want to be a Geneticist song argument. First, I wrote about planning my career. Second, I enjoyed playing a song about it on the harpsichord. Third, I meditated (thought) to change the genes. In this way, I prepared to be a spiritual geneticist by writing the I Want to be a Geneticist song argument. + +40. I prepared to lead the way. I did this by writing the I Want To Be A Peach song argument. First, I wrote about the peach. Second, I wrote about the rod. Third, I cut it with it.
In this way, I prepared to lead the way by writing the I Want To Be A Peach song argument. + +41. I prepared to write the arrangement. I did this by writing the I Want To Write A Pop Song argument. First, I wrote about the complex pop. Second, I wrote about the two rods. Third, I wrote about them. In this way, I prepared to write the arrangement by writing the I Want To Write A Pop Song argument. + +42. I prepared to be born. I did this by writing the I Want You song argument. First, I listened to the song in a dream-like state. Second, I helped Mummy. Third, I helped Karen. In this way, I prepared to be born by writing the I Want You song argument. + + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Pedagogy Grades and Failure 1 of 4.txt b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Pedagogy Grades and Failure 1 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..9ffc822d882c77de163d61b15d764cbce5be44a4 --- /dev/null +++ b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Pedagogy Grades and Failure 1 of 4.txt @@ -0,0 +1,27 @@ +["Green, L 2021, Pedagogy Grades and Failure 1 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"MEDICINE +by Lucian Green +Pedagogy Grades and Failure 1 of 4 + + + +1. I prepared to go into the Hall of Fame. I did this by performing my job. First, I found my tasks. Second, I performed well. Third, I did this well. In this way, I prepared to go into the Hall of Fame by performing my job. + +2. I prepared to care for my family. I did this by collecting the high quality comments from the people. First, I found the ducklings. Second, I found you. Third, I found everyone. In this way, I prepared to care for my family by collecting the high quality comments from the people. + +3. I prepared to perform well as a tutor. I did this by helping the student to perform well. First, I found the text. Second, I found how to help with it. 
Third, I helped the student to perform well. In this way, I prepared to perform well as a tutor by helping the student to perform well. + +4. I prepared to write that the 18 ducks had 17+16+15+14+13+12+11+10+9+8+7+6+5+4+3+2+1=153 interactions. I did this by writing on pedagogy. First, I wrote on the book Pedagogy. Second, I wrote that the 4 ducks had 3+2+1=6 interactions. Third, I wrote that the 9 ducks had 8+7+6+5+4+3+2+1=36 interactions. In this way, I prepared to write that the 18 ducks had 17+16+15+14+13+12+11+10+9+8+7+6+5+4+3+2+1=153 interactions by writing on pedagogy. + +5. I prepared to write on philosophical themes. I did this by writing on philosophy. First, I inserted the rod into the void once. Second, I inserted the rod into the void a second time. Third, I inserted the rod into the void a third time. In this way, I prepared to write on philosophical themes by writing on philosophy. + +6. I prepared to be a high quality character. I did this by stating that philosophy was necessary. First, I collected the areas of study. Second, I collected the skills. Third, I represented the matriarchs to people. In this way, I prepared to be a high quality character by stating that philosophy was necessary. + +7. I prepared to present as a leaper. I did this by writing using Pedagogy. First, I found Pedagogy. Second, I wrote on its texts. Third, I continued on with other work. In this way, I prepared to present as a leaper by writing using Pedagogy. + +8. I prepared to examine communism. I did this by writing on the online coin. First, I wrote about the coin's first side. Second, I wrote about the coin's second side. Third, I recorded the communal knowledge wealth portion available. In this way, I prepared to examine communism by writing on the online coin. + +9. I prepared to decide how it was written. I did this by writing on the Socatoan. First, I cited the research. Second, I wrote the evidence. Third, I arrived at the best conclusions.
In this way, I prepared to decide how it was written by writing on the Socatoan. + +10. I prepared to sing the harmony. I did this by writing a song about my partner. First, I talked to my partner. Second, I wrote down the conversation. Third, I wrote a song about it. In this way, I prepared to sing the harmony by writing a song about my partner. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Pedagogy Grades and Failure 2 of 4.txt b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Pedagogy Grades and Failure 2 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..35d9b46407726ead8bd53a8965c5503aecf277db --- /dev/null +++ b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Pedagogy Grades and Failure 2 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Pedagogy Grades and Failure 2 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"MEDICINE +by Lucian Green +Pedagogy Grades and Failure 2 of 4 + +11. I prepared to listen to the melody. I did this by regularly playing the music. First, I wrote to my local member of parliament. Second, I performed at the open mic night. Third, I received the payment. In this way, I prepared to listen to the melody by regularly playing the music. + +12. I prepared to ask whether the person needed further help. I did this by caring about the other person. First, I asked if the other person needed help. Second, I listened to her reply. Third, I gave her help. In this way, I prepared to ask whether the person needed further help by caring about the other person. + +13. I prepared to sell the song wisely. I did this by writing that the Venusian themed song is good too. First, I wrote about Venus. Second, I wrote about the song. Third, I wrote that it was good too. In this way, I prepared to sell the song wisely by writing that the Venusian themed song is good too. + +14. I prepared to entice the viewers. 
I did this by writing the 2028 New Type of Thermonuclear Energy song argument. First, I asked what thermo-energy was. Second, I asked what nuclear energy was. Third, I asked what the two of them going together was. In this way, I prepared to entice the viewers by writing the 2028 New Type of Thermonuclear Energy song argument. + +15. I prepared to dine on quince meat. I did this by writing the 2046 Cloning as Medicine song argument. First, I asked what clones were. Second, I asked how they affected me. Third, I helped them to it. In this way, I prepared to dine on quince meat by writing the 2046 Cloning as Medicine song argument. + +16. I prepared to ask whether you were a robot. I did this by writing the 2111 People Become Robots song argument. First, I wrote about people. Second, I wrote about arguments. Third, I wrote about robots. In this way, I prepared to ask whether you were a robot by writing the 2111 People Become Robots song argument. + +17. I prepared to map everything nicely. I did this by writing the 2130 Alien song argument. First, I asked whether you were an alien. Second, I designed you. Third, I pictured you. In this way, I prepared to map everything nicely by writing the 2130 Alien song argument. + +18. I prepared to meet the Animal President. I did this by writing the 2164 Animals Are Human song argument. First, I wrote about how the animals are like us. Second, I Demetrioued (sic) it. Third, I happelstanded (sic) it. In this way, I prepared to meet the Animal President by writing the 2164 Animals Are Human song argument. + +19. I prepared to consider each peach sheet about things. I did this by writing the 2271 Physical Constants Change song argument. First, I wrote about physical constants. Second, I decided on it. Third, I noted it. In this way, I prepared to consider each peach sheet about things by writing the 2271 Physical Constants Change song argument. + +20. I prepared to write about spiritual power.
I did this by writing the 2279 Power from Nothing song argument. First, I wrote about songs. Second, I wrote about something. Third, I wrote about the words. In this way, I prepared to write about spiritual power by writing the 2279 Power from Nothing song argument. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Pedagogy Grades and Failure 3 of 4.txt b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Pedagogy Grades and Failure 3 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..77131b1f842d1f54b66feb9bca694c9320f897f7 --- /dev/null +++ b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Pedagogy Grades and Failure 3 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Pedagogy Grades and Failure 3 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"MEDICINE +by Lucian Green +Pedagogy Grades and Failure 3 of 4 + +21. I prepared to explain that the seen-as version was that the time travel and aliens were separate. I did this by writing the 2288 Time Travel and Aliens song argument. First, I wrote about the box. Second, I wrote about time travel through a puppet set in each location, which was in fact space travel using the puppet set. Third, I wrote about the alien puppets. In this way, I prepared to explain that the seen-as version was that the time travel and aliens were separate by writing the 2288 Time Travel and Aliens song argument. + +22. I prepared to upgrade 'sun' to 'Sun'. I did this by writing the 2296 Sun Eruption Changes Earth's Gravity song argument. First, I wrote about the sun. Second, I wrote about the gravity. Third, I wrote about the Earth. In this way, I prepared to upgrade 'sun' to 'Sun' by writing the 2296 Sun Eruption Changes Earth's Gravity song argument. + +23. I prepared to ask, 'Who's there?' I did this by writing the 2302 - New Important Laws and Secrets of the Universe Revealed song argument. First, I wrote about the new important laws.
Second, I knew what was happening. Third, I found the universe's secrets. In this way, I prepared to ask, 'Who's there?' by writing the 2302 - New Important Laws and Secrets of the Universe Revealed song argument. + +24. I prepared to guide you to the stratosphere. I did this by writing the 2304 Secrets of the Moon song argument. First, I blasted to the stratosphere. Second, I asked how you knew this. Third, I knew you well. In this way, I prepared to guide you to the stratosphere by writing the 2304 Secrets of the Moon song argument. + +25. I prepared to enjoy the Time Nebulous. I did this by writing the 3797 Earth Empty, Humans Move to Another Stellar System song argument. First, I wrote that the house is empty. Second, I wrote we moved to another system. Third, I squeezed each luggage item as I took it out. In this way, I prepared to enjoy the Time Nebulous by writing the 3797 Earth Empty, Humans Move to Another Stellar System song argument. + +26. I prepared to kiss the man. I did this by writing the 3803 New Planet's Climate Causes People to Mutate song argument. First, I loved green. Second, I loved you. Third, I loved myself. In this way, I prepared to kiss the man by writing the 3803 New Planet's Climate Causes People to Mutate song argument. + +27. I prepared to think of how interesting life (through the liver) is. I did this by writing the 3871 - Prophet tells people about Moral Values and Religion and Reteaches Forgotten Sciences song argument. First, I found Rhesus. Second, I beheld the kaleidoscope honeycomb. Third, I programmed the gestaltron. In this way, I prepared to think of how interesting life (through the liver) is by writing the 3871 - Prophet tells people about Moral Values and Religion and Reteaches Forgotten Sciences song argument. + +28. I prepared to have intercourse about each node and connection between nodes. I did this by writing the 4302 - Scientists Discover that Organism Behaviour Causes All Diseases song argument.
First, I avoided subjugation (transcended slavery). Second, I prevented subordination (I ordained freedom). Third, I led or verified my leader's decisions. In this way, I prepared to have intercourse about each node and connection between nodes by writing the 4302 - Scientists Discover that Organism Behaviour Causes All Diseases song argument. + +29. I prepared to sing a mantra (chant) in a circle. I did this by writing the 4304 - Scientists Find a Way to Cure any Disease song argument. First, I diagnosed all diseases as genetically based. Second, I switched off the genes with aphors. Third, I lit (led) the way. In this way, I prepared to sing a mantra (chant) in a circle by writing the 4304 - Scientists Find a Way to Cure any Disease song argument. + +30. I prepared to connect communication with God (the leader), communication with the quantum afterlife world and ability to look at life with a positive mindset. I did this by writing the 4308 - Due to Mutation People Use Their Brains More Than 34% And Lose The Ideas of Evil And Hatred song argument. First, I observed people use their brains more than 34%. Second, I observed people lose the ideas of evil and hatred. Third, I used it well. In this way, I prepared to connect communication with God (the leader), communication with the quantum afterlife world and ability to look at life with a positive mindset by writing the 4308 - Due to Mutation People Use Their Brains More Than 34% And Lose The Ideas of Evil And Hatred song argument.
+ +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Perpetual University Short Courses 1 of 4.txt b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Perpetual University Short Courses 1 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..e4490e7e84ffa8cda9e207c2e25333694984d186 --- /dev/null +++ b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Perpetual University Short Courses 1 of 4.txt @@ -0,0 +1,31 @@ +["Green, L 2021, Perpetual University Short Courses 1 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"MEDICINE +by Lucian Green +Perpetual University Short Courses 1 of 4 + + + +1. This is for the best breasoning environment. + +Enrol in perpetual 6 month-long or greater University short courses after your degree to experience and be able to write breasonings. + +1a. I prepared to study the next University short course. I did this by studying perpetual University short courses. First, I wrote the department I was writing on. Second, I studied a short course from that department. Third, I wrote on the topic from the department. In this way, I prepared to study the next University short course by studying perpetual University short courses. + +2. I prepared to press my lips on your forehead. I did this by stating that I liked you. First, I stated how I liked you. Second, I liked you. Third, I held you close. In this way, I prepared to press my lips on your forehead by stating that I liked you. + +3. I prepared to be all right with you having it. I did this by talking with the man. First, I loved you lorry men. Second, I knew about you having Recklinghausen's disease. Third, I knew you it was to do with you. In this way, I prepared to be all right with you having it by talking with the man. + +4. I prepared to follow the traditional secondary school and retrieve my song. I did this by liking the rod. First, I knew the rod. Second, I found the rod. 
Third, I held it hard. In this way, I prepared to follow the traditional secondary school and retrieve my song by liking the rod. + +5. I prepared to state that the house was safe. I did this by observing the child locking the door. First, I observed the child walk through the door. Second, I observed him turn the latch. Third, I observed him close the door. In this way, I prepared to state that the house was safe by observing the child locking the door. + +6. I prepared to visit Madrid. I did this by loving Sydney. First, I was invited to the University. Second, I visited the University. Third, I delivered the presentation. In this way, I prepared to visit Madrid by loving Sydney. + +7. I prepared to state that 8 was the infinity symbol on its side. I did this by loving each number. First, I wrote the numbers 1-10. Second, I loved them. Third, I wrote when they would be magical (equal 10). In this way, I prepared to state that 8 was the infinity symbol on its side by loving each number. + +8. I prepared to devise activities instead of drinking alcohol. I did this by loving 8. First, I counted to 8. Second, I loved it. Third, I knew it was right for me. In this way, I prepared to devise activities instead of drinking alcohol by loving 8. + +9. I prepared to be leader to many in my time period. I did this by parading on the street. First, I found the float. Second, I paraded in front of it. Third, I walked free at the end of it. In this way, I prepared to be leader to many in my time period by parading on the street. + +10. I prepared to annotate its flavours. I did this by licking the tangerine. First, I found the first tangerine-meat. Second, I licked it. Third, I repeated this for the next tangerine-meat. In this way, I prepared to annotate its flavours by licking the tangerine.
+ +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Perpetual University Short Courses 2 of 4.txt b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Perpetual University Short Courses 2 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..606d078cfcee54e7bc87f568fb63533d6f71e424 --- /dev/null +++ b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Perpetual University Short Courses 2 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Perpetual University Short Courses 2 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"MEDICINE +by Lucian Green +Perpetual University Short Courses 2 of 4 + +11. I prepared to write it was based on me, but there might be a little Nietzschean engram helping it through. I did this by stating that I was the pedagogy helper like Nietzsche. First, I became a pedagogue. Second, I wrote 250 As. Third, I realised the Nietzsche helper was me, because Nietzsche wrote about tropes, not the terms of Pedagogy. In this way, I prepared to write it was based on me, but there might be a little Nietzschean engram helping it through by stating that I was the pedagogy helper like Nietzsche. + +12. I prepared to listen to someone announce celibacy. I did this by stating that I was happy. First, I found you. Second, I was happy with your presence. Third, I played the drum. In this way, I prepared to listen to someone announce celibacy by stating that I was happy. + +13. I prepared to study each University short course. I did this by eating the orange. First, I peeled it with hair-clippers. Second, I ate each segment. Third, I nickered (sic) each part away. In this way, I prepared to study each University short course by eating the orange. + +14. I prepared to film the stunt. I did this by enamouring the slight men and women. First, I walked to the precipice. Second, I stepped off. Third, I landed slightly on my mark. 
In this way, I prepared to film the stunt by enamouring the slight men and women. + +15. I prepared to eat the double scoop cone. I did this by enhancing myself by turning the light on. First, I made the set. Second, I posed in it. Third, I was flash photographed. In this way, I prepared to eat the double scoop cone by enhancing myself by turning the light on. + +16. I prepared to eat the large sword's sauces. I did this by wearing the jumper. First, I put on the sleeves. Second, I put on the skivvy. Third, I put on the top. In this way, I prepared to eat the large sword's sauces by wearing the jumper. + +17. I prepared to dive into sesquicentenary. I did this by holding the big rod. First, I found the needle. Second, I threaded thread through its eye. Third, I sewed the neat pattern. In this way, I prepared to dive into sesquicentenary by holding the big rod. + +18. I prepared to enjoy the exchange. I did this by stating that the man helped me. First, I noticed the man easing. Second, I noticed him nosing. Third, I noticed him making a crease maker. In this way, I prepared to enjoy the exchange by stating that the man helped me. + +19. I prepared to help more people. I did this by sucking on the boiled lolly. First, I knew about the red boiled lolly. Second, I gritted it with my teeth (I helped other people to avoid gritty problems). Third, I stated that the person with that view (that Lucian's Pedagogy was untrue) was helped by me. In this way, I prepared to help more people by sucking on the boiled lolly. + +20. I prepared to read the centre's texts in hieroglyphics. I did this by reading hieroglyphs. First, I read the first hieroglyph. Second, I looked it up in the image dictionary. Third, I repeated this for each hieroglyph. In this way, I prepared to read the centre's texts in hieroglyphics by reading hieroglyphs.
+ +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Perpetual University Short Courses 3 of 4.txt b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Perpetual University Short Courses 3 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..42e312ffd9c572a12a6878974a7be5e6920c847b --- /dev/null +++ b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Perpetual University Short Courses 3 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Perpetual University Short Courses 3 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"MEDICINE +by Lucian Green +Perpetual University Short Courses 3 of 4 + +21. I prepared to decide. I did this by knowing about each rod. First, I knew about the first rod. Second, I knew about the second rod. Third, I knew about the third rod. In this way, I prepared to decide by knowing about each rod. + +22. I prepared to invent the area of study that people meditated (wrote) on. I did this by quenching the baby. First, I gave the baby the milk. Second, I fed it. Third, I folded the nappy up. In this way, I prepared to invent the area of study that people meditated (wrote) on by quenching the baby. + +23. I prepared to examine the visitors. I did this by drawing the rainbow lorikeet. First, I drew his rainbow-coloured belly. Second, I drew his green back. Third, I drew his blue face. In this way, I prepared to examine the visitors by drawing the rainbow lorikeet. + +24. I prepared to accredit the thought. I did this by liking writing. First, I knew writing. Second, I helped it up. Third, I wrote a clarification of the target's idea. In this way, I prepared to accredit the thought by liking writing. + +25. I prepared to preventing tantra's medical problems by accrediting it and taking precautions. I did this by examining tantra. First, I knew the person. Second, I knew the ideas. Third, I talked with everyone in sight. 
In this way, I prepared to prevent tantra's medical problems by accrediting it and taking precautions by examining tantra. + +26. I prepared to make it news. I did this by aiding the coiffeur. First, I gave him the clippers. Second, I gave him the scissors. Third, I showed the client her hair in the mirror. In this way, I prepared to make it news by aiding the coiffeur. + +27. I prepared to speak in the same language. I did this by speaking Pǔtōnghua. First, I stated that it meant common. Second, I stated that it meant spoken. Third, I stated that it meant language. In this way, I prepared to speak in the same language by speaking Pǔtōnghua. + +28. I prepared to pick a dye. I did this by stating that the family duck was vegan. First, I fed her grass. Second, I fed her rice. Third, I fed her capers. In this way, I prepared to pick a dye by stating that the family duck was vegan. + +29. I prepared to eat l'ice. I did this by tracing the chemical cascade to the point. First, I wrote about chemical cascade. Second, I wrote about you. Third, I wrote about myself. In this way, I prepared to eat l'ice by tracing the chemical cascade to the point. + +30. I prepared to query the graduate attributes of Masters by Research and PhD. I did this by engaging in perpetual University short courses. First, I wrote for a website. Second, I wrote essays. Third, I taught. In this way, I prepared to query the graduate attributes of Masters by Research and PhD by engaging in perpetual University short courses.
+ +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Perpetual University Short Courses 4 of 4.txt b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Perpetual University Short Courses 4 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..84f712ba44f5547bc81b97f6a6e60a01536437a8 --- /dev/null +++ b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Perpetual University Short Courses 4 of 4.txt @@ -0,0 +1,31 @@ +["Green, L 2021, Perpetual University Short Courses 4 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"MEDICINE +by Lucian Green +Perpetual University Short Courses 4 of 4 + +31. I prepared to write the writing workshop. I did this by penning my breasoning chapters so that I could breason them out that day. First, I wrote the breasoning chapter. Second, I breasoned it out. Third, I worked for a maximum of 6.5 hours per day. In this way, I prepared to write the writing workshop by penning my breasoning chapters so that I could breason them out that day. + +32. I prepared to find out each breasoning instead of praying for the argument. I did this by breasoning the breasoning chapter out instead of praying. First, I chose Nietzsche because he was liberal. Second, I didn't choose Buddha because he was not vegan and believed in capital punishment. Third, I didn't choose myself because it would be undeveloped. In this way, I prepared to find out each breasoning instead of praying for the argument by breasoning the breasoning chapter out instead of praying. + +33. I prepared to give an excuse to trade 50 As from 80 breasonings while simulating studying the graduate diploma while also only working on 80 breasonings per each of product, image and sales, each representing 50 As. I did this by praying for ideas when at non-University. First, I prayed for a graduate diploma. Second, I prayed for Aigs, or 80 breasonings. 
Third, I cancelled the degreelessness purporters who might challenge in-person departmental encounters of an untrained person, with the nut and bolt and quantum box/prayer. In this way, I prepared to give an excuse to trade 50 As from 80 breasonings while simulating studying the graduate diploma while also only working on 80 breasonings per each of product, image and sales, each representing 50 As by praying for ideas when at non-University. + +34. I prepared to be on about things. I did this by stating that the sales were found out from a number one. First, I stated that it was based on the product. Second, I stated that it was the same for the image. Third, I stated that the image meant, at least that the 50 breasonings were for high quality of life reasons. In this way, I prepared to be on about things by stating that the sales were found out from a number one. + +35. I prepared to be the model student. I did this by reforming the number of PhD chapter As from 5000 to 500. First, I wrote that the 500 As per chapter would take fewer than 20 years at 130 breasonings per day. Second, I wrote that more than 130 breasonings per day resulted in a headache and was too time-consuming. Third, I stated that 10*20=200 years was untenable for 10*500=5000 As per chapter. In this way, I prepared to be the model student by reforming the number of PhD chapter As from 5000 to 500. + +36. I prepared to observe the differences between students with and without grit. I did this by observing the student passing in a nutshell. First, the student didn't find the pedagogy time points in time. Second, the student didn't realise the importance of brainstorming undevelopedly on the essay topic, which the lecturer was supporting to increase her grade. Third, the student suffered a mental breakdown and had to see a psychiatrist because of the dimly viewed society-held view that poor grades were a sign of poor mental health. 
In this way, I prepared to observe the differences between students with and without grit by observing the student passing in a nutshell. + +37. I prepared to quickly say she passed the course. I did this by passing because of being a drag queen. First, I explained that the brown (as against green) nutshell pass aphor actually referred to looking confident enough to pass the subject. Second, I observed that the poofters in drag high quality thought being remembered by the student meant that the lecturer attributed enough breasonings to her to pass the subject. Third, I noticed that she wasn't attributed with the drag queen character when she wasn't with-it (and wouldn't have been, which would have been looked down on), resulting in failing the subject. In this way, I prepared to quickly say she passed the course by passing because of being a drag queen. + +38. I prepared to give the job application many As, preventing a second failure in something else. I did this by specifying my needs in the song. First, I failed (achieved). Second, I wrote a song about it. Third, I wrote what I wanted from the arbiter in the song. In this way, I prepared to give the job application many As, preventing a second failure in something else by specifying my needs in the song. + +39. I prepared to argue that the theological God should be with-it over the area of study Gods. I did this by articulating to God in the area of study. First, I wrote a major of 10*80 breasoning As to be a follower. Second, I stated that the combined graduate attribute of the Pedagogy and Medicine majors was to avoid mental breakdowns. Third, I wrote an area of study of 50*80 breasoning As so that the majors met professional requirements (worked) to be God. In this way, I prepared to argue that the theological God should be with-it over the area of study Gods by articulating to God in the area of study. + +40. 
I prepared to breason out something interesting about each department contributing to the collections of areas of study, repeat the presentness commerce As each day until the course, and turn everything on using radio buttons and turn everything off using the nut and bolt and quantum box and prayer to take care of extras. I did this by selling the course(s). First, I placed recordings of the Graduate Diplomas in Sales, the course department(s), fine arts and presentness commerce on myself. Second, I placed recordings of 80 breasonings for each of these on myself and indicated that the diplomas had expanded these to 2*50 As. Third, I breasoned out my invitation and invited the potential guest to the course. In this way, I prepared to breason out something interesting about each department contributing to the collections of areas of study, repeat the presentness commerce As each day until the course, and turn everything on using radio buttons and turn everything off using the nut and bolt and quantum box and prayer to take care of extras by selling the course(s). + +41. I prepared to ask the doctor questions. I did this by asking whether the doctor was mad (sane). First, I found the medic. Second, I examined whether he was psychiatrically fit. Third, I determined that he was fit. In this way, I prepared to ask the doctor questions by asking whether the doctor was mad (sane). + +42. I prepared to visit the short course lounge. I did this by attending University. First, I found the University on a map. Second, I found the correct classroom. Third, I walked to the classroom. In this way, I prepared to visit the short course lounge by attending University. 
+ + + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Prevent Headaches on Train and a Bent Spine 1 of 4.txt b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Prevent Headaches on Train and a Bent Spine 1 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..ff848574d869062063bf025b5e7dd955d84fb151 --- /dev/null +++ b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Prevent Headaches on Train and a Bent Spine 1 of 4.txt @@ -0,0 +1,29 @@ +["Green, L 2021, Prevent Headaches on Train and a Bent Spine 1 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"MEDICINE +by Lucian Green +Prevent Headaches on Train and a Bent Spine 1 of 4 + + + +1. Aim: This argument argues to practise yoga to prevent headaches in vehicles and to prevent a bent spine. + +1a. I prepared to help technology beginners. I did this by providing Linguistics and Applied Linguistics service as breasonings currency. First, I liked dance. Second, I liked machinations. Third, I liked the chief. In this way, I prepared to help technology beginners by providing Linguistics and Applied Linguistics service as breasonings currency. + +2. I prepared to neaten my desk. I did this by providing Management service as breasonings currency. First, I deciphered it. Second, I held it. Third, I managed it. In this way, I prepared to neaten my desk by providing Management service as breasonings currency. + +3. I prepared to decide. I did this by providing Mathematics and Statistics service as breasonings currency. First, I examined the wood. Second, I made correct choices. Third, I strutted my stuff. In this way, I prepared to decide by providing Mathematics and Statistics service as breasonings currency. + +4. I prepared to eat cheese. I did this by providing Modern Greek service as breasonings currency. First, I wrote hard-held facts. Second, I philosophised. Third, I jeloquated (sic). 
In this way, I prepared to eat cheese by providing Modern Greek service as breasonings currency. + +5. I prepared to design the architectures. I did this by providing Music History service as breasonings currency. First, I wrote about comedy. Second, I donated. Third, I asked, 'What about the African American U.S. President at the start?' In this way, I prepared to design the architectures by providing Music History service as breasonings currency. + +6. I prepared to decide what would happen. I did this by providing Philosophy service as breasonings currency. First, I delighted in strongholds. Second, I imagined the deep space space station. Third, I imagined disentangling it. In this way, I prepared to decide what would happen by providing Philosophy service as breasonings currency. + +7. I prepared to be there still. I did this by providing Physics service as breasonings currency. First, I found each physis. Second, I disarmed it. Third, I held the rod with it. In this way, I prepared to be there still by providing Physics service as breasonings currency. + +8. I prepared to help 'As it is'. I did this by providing Planning and Design service as breasonings currency. First, I designed it. Second, I dilucidated (sic) it. Third, I enamoured it. In this way, I prepared to help 'As it is' by providing Planning and Design service as breasonings currency. + +9. I prepared to say, 'You're smart'. I did this by providing Political Science service as breasonings currency. First, I gave asylum to the person. Second, I sat up straight. Third, I questioned it. In this way, I prepared to say, 'You're smart' by providing Political Science service as breasonings currency. + +10. I prepared to delight in it. I did this by providing Portuguese service as breasonings currency. First, I dined on it. Second, I discripiated (sic) it. Third, I made sure that the people were happy with the train. 
In this way, I prepared to delight in it by providing Portuguese service as breasonings currency. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Prevent Headaches on Train and a Bent Spine 2 of 4.txt b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Prevent Headaches on Train and a Bent Spine 2 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..19d9f4eed57e7d1b4a739edc88f7929e3d84f38f --- /dev/null +++ b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Prevent Headaches on Train and a Bent Spine 2 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Prevent Headaches on Train and a Bent Spine 2 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"MEDICINE +by Lucian Green +Prevent Headaches on Train and a Bent Spine 2 of 4 + +11. I prepared to ask, 'What is a wheeler?' I did this by providing Psychology service as breasonings currency. First, I looked at Jed. Second, I helped him. Third, I collapsed the wheeler and was resuscitated with him. In this way, I prepared to ask, 'What is a wheeler?' by providing Psychology service as breasonings currency. + +12. I prepared to have harlequinades. I did this by providing Russian service as breasonings currency. First, I examined the Russian Orthodox Centres. Second, I helped artifice out. Third, I managed clearly. In this way, I prepared to have harlequinades by providing Russian service as breasonings currency. + +13. I prepared to be intelligent. I did this by providing Social Theory service as breasonings currency. First, I symbolised being helped in pedagogy. Second, I wrote more. Third, I helped out. In this way, I prepared to be intelligent by providing Social Theory service as breasonings currency. + +14. I prepared to have no financial responsibility, retain intellectual copyright and earn a large percentage of the profits in a deal. I did this by providing Social Work service as breasonings currency. 
First, I listened to the animals. Second, I improved their intelligence. Third, I interpreted every last word they said. In this way, I prepared to have no financial responsibility, retain intellectual copyright and earn a large percentage of the profits in a deal by providing Social Work service as breasonings currency. + +15. I prepared to find the new design. I did this by providing Socio-Legal Studies service as breasonings currency. First, I found the socio-economic status bearers. Second, I helped them walk along the street. Third, I carefully hap-barrelled them. In this way, I prepared to find the new design by providing Socio-Legal Studies service as breasonings currency. + +16. I prepared to follow my page. I did this by providing Sociology service as breasonings currency. First, I wrote that sociology was high. Second, I breasoned it out. Third, I made a beeline for the newspaper. In this way, I prepared to follow my page by providing Sociology service as breasonings currency. + +17. I prepared to sit straight on the seat. I did this by providing Spanish service as breasonings currency. First, I cared for the civilians. Second, I stated that humans would survive in non-Earth environments. Third, I spoke Spanish. In this way, I prepared to sit straight on the seat by providing Spanish service as breasonings currency. + +18. I prepared to say that it was clear. I did this by providing Swedish service as breasonings currency. First, I found the best path. Second, I made sure that it could be followed. Third, I followed it myself. In this way, I prepared to say that it was clear by providing Swedish service as breasonings currency. + +19. I prepared to help design the Theatre Studies department. I did this by providing Theatre Studies service as breasonings currency. First, I wrote the play. Second, I invited the people. Third, I helped the University students to connect parts of the text together. 
In this way, I prepared to help design the Theatre Studies department by providing Theatre Studies service as breasonings currency. + +20. I prepared to use my brain. I did this by providing Media and Communications service as breasonings currency. First, I wrote about media. Second, I wrote about communications. Third, I wrote about how they went well together. In this way, I prepared to use my brain by providing Media and Communications service as breasonings currency. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Prevent Headaches on Train and a Bent Spine 3 of 4.txt b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Prevent Headaches on Train and a Bent Spine 3 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..4a31bd70438c82028a36c09bb26a129052a5e07a --- /dev/null +++ b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Prevent Headaches on Train and a Bent Spine 3 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Prevent Headaches on Train and a Bent Spine 3 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"MEDICINE +by Lucian Green +Prevent Headaches on Train and a Bent Spine 3 of 4 + +21. I prepared to provide the painting of Law-stone to the Scarborough Fair. I did this by providing Creative Arts service as breasonings currency. First, I asked the person whether she was interested in the song. Second, I noticed that she looked like my sister. Third, I asked, 'Who's that?' In this way, I prepared to provide the painting of Law-stone to the Scarborough Fair by providing Creative Arts service as breasonings currency. + +22. I prepared to etch into some Scarborough Fair antiquities. I did this by providing Public Policy and Management service as breasonings currency. First, I provided Public Policy service. Second, I provided Management service. Third, I wrote about them together. 
In this way, I prepared to etch into some Scarborough Fair antiquities by providing Public Policy and Management service as breasonings currency. + +23. I prepared to eat the soy fish. I did this by providing Social Work service as breasonings currency. First, I worked as a big social worker. Second, I enamoured it lightly. Third, I connected unrelatedness. In this way, I prepared to eat the soy fish by providing Social Work service as breasonings currency. + +24. I prepared to examine the happiness-cheese knife. I did this by providing Faculty of Economics and Commerce service as breasonings currency. First, I found the tea-squirter. Second, I squirted with it. Third, I enjoyed the space ride. In this way, I prepared to examine the happiness-cheese knife by providing Faculty of Economics and Commerce service as breasonings currency. + +25. I prepared to examine the parliamentarians helping the families. I did this by providing Accounting and Business Information Systems service as breasonings currency. First, I wrote about Accounting's details. Second, I helped. Third, I amazed the audience. In this way, I prepared to examine the parliamentarians helping the families by providing Accounting and Business Information Systems service as breasonings currency. + +26. I prepared to support myself to avoid a headache. I did this by providing Actuarial Studies service as breasonings currency. First, I enabled the people to be rich forever. Second, I nutrifactored (sic) myself. Third, I blended in well. In this way, I prepared to support myself to avoid a headache by providing Actuarial Studies service as breasonings currency. + +27. I prepared to eat the navel orange. I did this by providing Business Law service as breasonings currency. First, I prepared to help them. Second, I made sure everyone paid for their products. Third, I checked that the model young animals were all right. In this way, I prepared to eat the navel orange by providing Business Law service as breasonings currency. 
+ +28. I prepared to find out 15 points per 100 As per acting role, song, book chapter or image on the product like breasonings from the cosmologue (leader). I did this by providing Economics service as breasonings currency. First, I wrote 250 breasonings for a sale. Second, I planned to write 15 area of study points per 250 breasonings. Third, I found these out like breasonings from the cosmologue (leader). In this way, I prepared to find out 15 points per 100 As per acting role, song, book chapter or image on the product like breasonings from the cosmologue (leader) by providing Economics service as breasonings currency. + +29. I prepared to dice and trice with famousness. I did this by providing Finance service as breasonings currency. First, I found out 60 points per 100 As for famousness or a featured sale like breasonings from the cosmologue (leader). Second, I found out 15 points per 100 As for the famousness update like breasonings from the cosmologue (leader). Third, I liked breasoning from the cosmologue (leader). In this way, I prepared to dice and trice with famousness by providing Finance service as breasonings currency. + +30. I prepared to design the answers. I did this by providing Management and Marketing service as breasonings currency. First, I found the Manager. Second, I asked him a question. Third, I marketed the answer. In this way, I prepared to design the answers by providing Management and Marketing service as breasonings currency. 
+ +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Preventing Sales from Being Dangerous 1 of 4.txt b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Preventing Sales from Being Dangerous 1 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..1da4cf5db55580d6f963db6c2aae165814ae28fd --- /dev/null +++ b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Preventing Sales from Being Dangerous 1 of 4.txt @@ -0,0 +1,31 @@ +["Green, L 2021, Preventing Sales from Being Dangerous 1 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"MEDICINE +by Lucian Green +Preventing Sales from Being Dangerous 1 of 4 + + + +1. Products' philosophy As must be based on a positive argument, have a single argument and be connected in a structure. + +The connection between two breasoning currency Bs (where instead of 'I ate the apple', which would be found in an A, a B would contain 'I shouldn't choke on the apple') where each B would make a sale viable by encouraging buying, is mathematically symbolised by a hand (even a sexual organ), rather than causing suffering by being left separate. + +1a. I prepared to compliment the gardener. I did this by eating the strawberry. First, I found the strawberry. Second, I picked it. Third, I ate it. In this way, I prepared to compliment the gardener by eating the strawberry. + +2. I prepared to assemble the animal lollies. I did this by eating the orangutan lolly. First, I bought it at the lolly shop. Second, I took it out of its paper bag. Third, I ate it. In this way, I prepared to assemble the animal lollies by eating the orangutan lolly. + +3. I prepared to go to the beach. I did this by holding my breath. First, I knew about my determined plan. Second, I worked on it. Third, I achieved it. In this way, I prepared to go to the beach by holding my breath. + +4. I prepared to prevent the suicide bombing. I did this by loving lovingtude (sic). 
First, I loved you. Second, I loved myself. Third, I loved everyone else. In this way, I prepared to prevent the suicide bombing by loving lovingtude (sic). + +5. I prepared to examine the series. I did this by putting a bobble in the middle. First, I found the policeman. Second, I placed him in the middle. Third, I observed the bottle shops stay open late. In this way, I prepared to examine the series by putting a bobble in the middle. + +6. I prepared to eat the apple. I did this by holding the apple. First, I knew the politicians. Second, I was aware of a classic. Third, I wrote home. In this way, I prepared to eat the apple by holding the apple. + +7. I prepared to chew the peel. I did this by drawing the orange. First, I knew about medicine. Second, I knew about psychiatry. Third, I knew about pedagogy. In this way, I prepared to chew the peel by drawing the orange. + +8. I prepared to activate protection by the medicine course. I did this by licking the vegan pistachio ice cream. First, I knew the medicine lecturer. Second, I knew the psychiatry lecturer. Third, I knew the pedagogy lecturer. In this way, I prepared to activate protection by the medicine course by licking the vegan pistachio ice cream. + +9. I prepared to go for a work holiday, actually thought the over-meditation was to do with LA (went home) and thought the famous thoughts (Computational English, etc.) were separate. I did this by painting the blue sky. First, I ran away. Second, I over-meditated. Third, I had a fame-following mental breakdown. In this way, I prepared to go for a work holiday, actually thought the over-meditation was to do with LA (went home) and thought the famous thoughts (Computational English, etc.) were separate by painting the blue sky. + +10. I prepared to design the University like a maze. I did this by following the bunny through the maze. First, I made the robot bunny. Second, I programmed it to solve the maze. Third, I improved the algorithm. 
In this way, I prepared to design the University like a maze by following the bunny through the maze. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Preventing Sales from Being Dangerous 4 of 4.txt b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Preventing Sales from Being Dangerous 4 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..5a112412c354642ab022e2fc44d2a313c3a791ed --- /dev/null +++ b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Preventing Sales from Being Dangerous 4 of 4.txt @@ -0,0 +1,30 @@ +["Green, L 2021, Preventing Sales from Being Dangerous 4 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"MEDICINE +by Lucian Green +Preventing Sales from Being Dangerous 4 of 4 + +31. I prepared to satisfy the child's hunger. I did this by feeding the child. First, I fetched the fresh peach. Second, I unwrapped the peach. Third, I fed the peach to the child. In this way, I prepared to satisfy the child's hunger by feeding the child. + +32. I prepared to prefer content. I did this by liking the Lulu Iglesias song. First, I played the song. Second, I liked its form. Third, I liked its content. In this way, I prepared to prefer content by liking the Lulu Iglesias song. + +33. I prepared to plate the anti-government's race. I did this by speaking in Chinese. First, I wrote the vocabulary list. Second, I wrote the summary of the topic in grammar. Third, I performed the Chinese speech. In this way, I prepared to plate the anti-government's race by speaking in Chinese. + +34. I prepared to write it all down. I did this by speaking in Arecibos. First, I selected the correct frequency. Second, I selected the correct wavelength. Third, I delivered the message. In this way, I prepared to write it all down by speaking in Arecibos. + +35. I prepared to help with winning. I did this by being helpful. First, I wrote the index. Second, I wrote the sign. 
Third, I wrote the number. In this way, I prepared to help with winning by being helpful. + +36. I prepared to spiritually regenerate by tasting my heart. I did this by being useful. First, I found the etching. Second, I made a black print with it. Third, I invented a language with a perfectly programmed genitive philosophical perspective. In this way, I prepared to spiritually regenerate by tasting my heart by being useful. + +37. I prepared to use the thing. I did this by finding an interesting thing. First, I looked in the interesting place. Second, I found the interesting thing. Third, I said how interesting it was. In this way, I prepared to use the thing by finding an interesting thing. + +38. I prepared to meet you. I did this by liking you. First, I walked into the room. Second, I walked around the circle. Third, I held your hand. In this way, I prepared to meet you by liking you. + +39. I prepared to go to school with you. I did this by having help with you. First, I knew about you. Second, I had help with you. Third, I made all the correct moves. In this way, I prepared to go to school with you by having help with you. + +40. I prepared to say it backwards. I did this by liking the 'Things' book, which I made as a child. First, I knew it. Second, I knew my beautiful enigma. Third, I knew it backwards. In this way, I prepared to say it backwards by liking the 'Things' book, which I made as a child. + +41. I prepared to write the tastes cookbook. I did this by assisting people on Earth. First, I sang the first note. Second, I sang the second note. Third, I played them together. In this way, I prepared to write the tastes cookbook by assisting people on Earth. + +42. I prepared to engage in intercourse with you. I did this by writing to you. First, I knew you. Second, I knew the people. Third, I held you fast. In this way, I prepared to engage in intercourse with you by writing to you. 
+ + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Protector from Headache in Meditation Currant Bun 2 of 4.txt b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Protector from Headache in Meditation Currant Bun 2 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..609b0121d5c0bd81b62e7bcca78189409ab69877 --- /dev/null +++ b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Protector from Headache in Meditation Currant Bun 2 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Protector from Headache in Meditation Currant Bun 2 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"MEDICINE +by Lucian Green +Protector from Headache in Meditation Currant Bun 2 of 4 + +11. I prepared to confirm that the job requirements were met. I did this by asking for feedback on my work. First, I performed the filing work. Second, I requested feedback from the customer. Third, I received the feedback. In this way, I prepared to confirm that the job requirements were met by asking for feedback on my work. + +12. I prepared to eat the cupcake before it was fingered. I did this by eating the cupcake. First, I held the cupcake. Second, I held it to my mouth. Third, I ate it. In this way, I prepared to eat the cupcake before it was fingered by eating the cupcake. + +13. I prepared to eat each currant bun. I did this by writing that pluralism was represented by non-sectarian Lucianic Meditation (Philosophy) derived meditation religions (philosophies) propagating through society. First, I found the first currant bun. Second, I prepared to find the next currant bun. Third, I repeated this until I had found and connected all the currant buns together. In this way, I prepared to eat each currant bun by writing that pluralism was represented by non-sectarian Lucianic Meditation (Philosophy) derived meditation religions (philosophies) propagating through society. + +14. 
I prepared to have a child untouched by a finger on her genes. I did this by running a virality algorithm on a day with virality turned off with the nut and bolt and quantum box/prayer for dependent conceivers at one's level or below in one's family. First, I turned off the virality algorithm with the nut and bolt and quantum box/prayer for dependent conceivers at my level or below in my family. Second, I ran the virality algorithm. Third, I released the viral release. In this way, I prepared to have a child untouched by a finger on her genes by running a virality algorithm on a day with virality turned off with the nut and bolt and quantum box/prayer for dependent conceivers at one's level or below in one's family. + +15. I prepared to write that there was content there. I did this by relating two of my previously published ideas or two others' published ideas in relativity. First, I wrote one of my previously published ideas. Second, I wrote one of another's previously published ideas. Third, I related them together. In this way, I prepared to write that there was content there by relating two of my previously published ideas or two others' published ideas in relativity. + +16. I prepared to ask, 'What else can I sell?' I did this by selling a song with 50 As. First, I earned a postgraduate diploma in music, and wrote 50 As for my pop act. Second, I wrote 50 As for my song. Third, I earned a postgraduate diploma in business, and wrote 50 As for my song's sales. In this way, I prepared to ask, 'What else can I sell?' by selling a song with 50 As. + +17. I prepared to represent pedagogy as a union of Computational English and Meditation. I did this by writing that Hegel and Heidegger agreed on University medicine. First, I read the word 'pedagogy' in Hippolyte's secondary text about Hegel's 'Phenomenology of Spirit'. Second, I stated that arguments, necessary for the areas of study in Heidegger's 'Being and Time', were pedagogical. 
Third, I wrote that they agreed on pedagogy. In this way, I prepared to represent pedagogy as a union of Computational English and Meditation by writing that Hegel and Heidegger agreed on University medicine. + +18. I prepared to state that the child survived. I did this by stating that the child breasoned out 10 80 breasoning Medicine As. First, I found the medicine. Second, I found the chamber. Third, I breasoned them out. In this way, I prepared to state that the child survived by stating that the child breasoned out 10 80 breasoning Medicine As. + +19. I prepared to state that the adult could write a movie. I did this by stating that the adult stopped having mental breakdowns because of breasoning out 10 80 breasoning Medicine As. First, I breasoned out 10 80 breasoning Medicine As. Second, I regularly saw a psychiatrist. Third, I took part in activity each day. In this way, I prepared to state that the adult could write a movie by stating that the adult stopped having mental breakdowns because of breasoning out 10 80 breasoning Medicine As. + +20. I prepared to agree with the acting agents. I did this by stating that the Lucianic Meditator (Philosopher) was protected on an acting Aigs day. First, I observed the Lucianic Meditator (Philosopher) meditate (write). Second, I observed him praying for (breasoning out) the Aigs (A). Third, I observed him receive interest. In this way, I prepared to agree with the acting agents by stating that the Lucianic Meditator (Philosopher) was protected on an acting Aigs day. 
+ +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Protector from Headache in Meditation Currant Bun 3 of 4.txt b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Protector from Headache in Meditation Currant Bun 3 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..25d64341e4407baa919896dead79baba0903e5d1 --- /dev/null +++ b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Protector from Headache in Meditation Currant Bun 3 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Protector from Headache in Meditation Currant Bun 3 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"MEDICINE +by Lucian Green +Protector from Headache in Meditation Currant Bun 3 of 4 + +21. I prepared to agree with the University. I did this by stating that the LM meditator was protected on a Recording Day. First, I observed the Lucianic Meditator (Philosopher) meditate (speak). Second, I observed him being given the Aigs. Third, I observed him act the scene. In this way, I prepared to agree with the University by stating that the LM meditator was protected on a Recording Day. + +22. I prepared to attract a donation. I did this by articulating to teaching a Lucianic Meditation (Philosophy) university short course. First, I collated the teaching materials. Second, I found students who were interested in the philosophy. Third, I taught these students the material. In this way, I prepared to attract a donation by articulating to teaching a Lucianic Meditation (Philosophy) university short course. + +23. I prepared to listen to Peach OST, having prevented a headache. I did this by articulating to teaching the Lucianic medicine university short course. First, I wrote about the politicians' view of it. Second, I wrote about you, the master. Third, I observed the politicians acting on the master practising medicine. 
In this way, I prepared to listen to Peach OST, having prevented a headache by articulating to teaching the Lucianic medicine university short course. + +24. I prepared to skill the students in pedagogy. I did this by articulating to teaching the Lucian's Pedagogy university short course. First, I wrote that there would be a practicum. Second, I wrote that there would be areas of study. Third, I wrote that there would be multidisciplinary tests. In this way, I prepared to skill the students in pedagogy by articulating to teaching the Lucian's Pedagogy university short course. + +25. I prepared to learn how to write. I did this by articulating to studying the Lucianic Meditation (Philosophy) university short course. First, I observed how to meditate (write). Second, I observed when to do it. Third, we exchanged reasons to meditate (write). In this way, I prepared to learn how to write by articulating to studying the Lucianic Meditation (Philosophy) university short course. + +26. I prepared to research Lucianic medicine. I did this by articulating to studying the Lucianic medicine university short course. First, I wrote on medicine. Second, I stated that the two levels of treatment, spiritual cure, and placebo, were given to male and female patients in a double-blind trial. Third, I stated that the patient's gender was a blocking factor that accounted for variability in treatment between males and females, reducing causes of variability, leading to greater accuracy. In this way, I prepared to research Lucianic medicine by articulating to studying the Lucianic medicine university short course. + +27. I prepared to resuscitate the self. I did this by articulating to studying the Lucian's Pedagogy university short course. First, I found the subjects. Second, I placed them in a sequence. Third, I helped people to love God. In this way, I prepared to resuscitate the self by articulating to studying the Lucian's Pedagogy university short course. + +28. I prepared to create mass media. 
I did this by rebreasoning out used breasonings to multiply business. First, I liked you. Second, I liked the rebreasonings. Third, I liked rebreasoning them out. In this way, I prepared to create mass media by rebreasoning out used breasonings to multiply business. + +29. I prepared to examine the currant bun with perspectives. I did this by stating that the employee created breasonings that linked to enough perspectives (e.g. the Illuminated One, Cosmology, Theatre God (Master) and Plato's Forms) to make business work. First, I found the two most important perspectives, both Cosmology, to use the most. Second, I helped the children to it. Third, I wrote your name on it. In this way, I prepared to examine the currant bun with perspectives by stating that the employee created breasonings that linked to enough perspectives (e.g. the Illuminated One, Cosmology, Theatre God (Master) and Plato's Forms) to make business work. + +30. I prepared to state how the Illuminated One made me God (the master). I did this by stating that employees will have enough breasonings as breasonings-salary. First, I calculated how much breasonings-salary the person would need. Second, I gave it to her each week. Third, I examined her earnings. In this way, I prepared to state how the Illuminated One made me God (the master) by stating that employees will have enough breasonings as breasonings-salary. 
+ +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Protector from Headache in Meditation Currant Bun 4 of 4.txt b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Protector from Headache in Meditation Currant Bun 4 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..9fdcb9800fcab428f2fdab9b6781a4955e758e64 --- /dev/null +++ b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Protector from Headache in Meditation Currant Bun 4 of 4.txt @@ -0,0 +1,13 @@ +["Green, L 2021, Protector from Headache in Meditation Currant Bun 4 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"MEDICINE +by Lucian Green +Protector from Headache in Meditation Currant Bun 4 of 4 + +31. I prepared to make Cosmology the best. I did this by stating that the perspectives to make money were updated. First, I rotated mildly. Second, I held you up. Third, I loved you. In this way, I prepared to make Cosmology the best by stating that the perspectives to make money were updated. + +32. I prepared to go on the top value. I did this by stating that the employee updated the equalitarian perspective for pink dollar spenders. First, I observed it being done by the Theatre God (Master). Second, I observed it working. Third, I observed it come through. In this way, I prepared to go on the top value by stating that the employee updated the equalitarian perspective for pink dollar spenders. + +33. I prepared for the condition. I did this by stating that according to Plato's Forms, breasonings are given to people to have babies. First, I knew God (the leader). Second, I knew Montreal. Third, I thought clearly. In this way, I prepared for the condition by stating that according to Plato's Forms, breasonings are given to people to have babies. 
+ + + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Quantum Box and Prayer 2 of 4.txt b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Quantum Box and Prayer 2 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..ca6969386b5d766f368bd27a7dd67900c601782c --- /dev/null +++ b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Quantum Box and Prayer 2 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Quantum Box and Prayer 2 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"MEDICINE +by Lucian Green +Quantum Box and Prayer 2 of 4 + +11. I prepared to exercise because of being painless (in comfort). I did this by preventing a bent spine (maintaining a straight spine) by performing the asanas yoga exercises. First, I performed the asanas yoga exercises. Second, I recommended the asanas yoga exercises to you. Third, I recommended the asanas yoga exercises to others. In this way, I prepared to exercise because of being painless (in comfort) by preventing a bent spine (maintaining a straight spine) by performing the asanas yoga exercises. + +12. I prepared to eat more food. I did this by having no mental breakdowns (schizophrenia). First, I helped them to lose weight. Second, I ate vegan food. Third, I ate good food. In this way, I prepared to eat more food by having no mental breakdowns (schizophrenia). + +13. I prepared to travel to the zoo. I did this by using the honey pot prayer (technique) for no headache (a comfortable head) in cars, trains and walks. First, I used the honey pot prayer (technique) for no headache (a comfortable head) in the car. Second, I used the honey pot prayer (technique) for no headache (a comfortable head) in the train. Third, I used the honey pot prayer (technique) for no headache (a comfortable head) in the walk. 
In this way, I prepared to travel to the zoo by using the honey pot prayer (technique) for no headache (a comfortable head) in cars, trains and walks. + +14. I prepared to prevent mild cases of some disorders. I did this by using the nut and bolt technique to prevent problems in (maintain health in) psychiatry (unwanted thoughts (ensure thoughts are wanted), hallucinogenic appearances (maintain peace of mind), depression prevention (maintain happiness) and headache (keep head clear)), psychology (education mistake (maintain correctness), and unwanted effects of excess breasonings (clear one's mind)), physiology (incompatibility of virality with conception (ensure healthy conception), muscle ache (relax muscles) and pimple (maintain clear skin)). First, I used the nut and bolt technique to maintain health in psychiatry (unwanted thoughts (ensure thoughts are wanted), hallucinogenic appearances (maintain peace of mind), depression prevention (maintain happiness) and headache (keep head clear)). Second, I used the nut and bolt technique to maintain health in psychology (education mistake (maintain correctness), and unwanted effects of excess breasonings (clear one's mind)). Third, I used the nut and bolt technique to maintain health in physiology (incompatibility of virality with conception (ensure healthy conception), muscle ache (relax muscles) and pimple (maintain clear skin)). In this way, I prepared to prevent mild cases of some disorders by using the nut and bolt technique to prevent problems in (maintain health in) psychiatry (unwanted thoughts (ensure thoughts are wanted), hallucinogenic appearances (maintain peace of mind), depression prevention (maintain happiness) and headache (keep head clear)), psychology (education mistake (maintain correctness), and unwanted effects of excess breasonings (clear one's mind)), physiology (incompatibility of virality with conception (ensure healthy conception), muscle ache (relax muscles) and pimple (maintain clear skin)). 
+15. I prepared to maintain health overall. I did this by reinforcing prevention of the problems that the nut and bolt prevent with the quantum box/prayer (argument). First, I used the quantum box/prayer (argument) to maintain health in psychiatry (unwanted thoughts (ensure thoughts are wanted), hallucinogenic appearances (maintain peace of mind), depression prevention (maintain happiness) and headache (keep head clear)). Second, I used the quantum box/prayer (argument) to maintain health in psychology (education mistake (maintain correctness), and unwanted effects of excess breasonings (clear one's mind)). Third, I used the quantum box/prayer (argument) to maintain health in physiology (incompatibility of virality with conception (ensure healthy conception), muscle ache (relax muscles) and pimple (maintain clear skin)). In this way, I prepared to maintain health overall by reinforcing prevention of the problems that the nut and bolt prevent with the quantum box/prayer (argument). + +16. I prepared to enjoy perfect societal function. I did this by preventing headaches by thinking clearly of the head of state. First, I dotted on, and then waved a flag symbolising being polite by speaking after the last comment. Second, I said sorry to (agreed with) the Head of State before any mistakes (about correctness) to avoid the headache from a tank spiritually running over one's head (so that my head was protected). Third, I got on with the rest of the day. In this way, I prepared to enjoy perfect societal function by preventing headaches by thinking clearly of the head of state. + +17. I prepared to write music about defeating depression (being jubilant) in the hills. I did this by using laughter as the yoga for depression (to remain happy). First, I used laughter as the yoga for depression (to remain happy). Second, I remained happy. Third, I walked in the hills.
In this way, I prepared to write music about defeating depression (being jubilant) in the hills by using laughter as the yoga for depression (to remain happy). + +18. I prepared to stop drinking elderberry juice after 55 years in case of creating new cells, causing cancer (so that I was healthy). I did this by preventing colds and flu by blocking binding sites of bacteria and viruses with vitamin B and vitamin B complexes in elderberries. First, I prevented colds and flu by blocking binding sites of bacteria with vitamin B and vitamin B complexes in elderberries. Second, I prevented colds and flu by blocking binding sites of viruses with vitamin B and vitamin B complexes in elderberries. Third, I felt ecstatic. In this way, I prepared to stop drinking elderberry juice after 55 years in case of creating new cells, causing cancer (so that I was healthy) by preventing colds and flu by blocking binding sites of bacteria and viruses with vitamin B and vitamin B complexes in elderberries. + +19. I prepared to delect the strawberry. I did this by eating proper food, including grains, nuts, fruits and vegetables and sitting properly at table during meals. First, I ate the grains. Second, I ate the nuts. Third, I ate the fruits. In this way, I prepared to delect the strawberry by eating proper food, including grains, nuts, fruits and vegetables and sitting properly at table during meals. + +20. I prepared to maintain leadership through God (as a master). I did this by maintaining a single career in commerce. First, I gave 50 As to all the employees when they were employed. Second, I gave 50 As to all the employees each day. Third, I gave 50 As to each customer for each product and service bought. In this way, I prepared to maintain leadership through God (as a master) by maintaining a single career in commerce. 
+ +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Quantum Box and Prayer 3 of 4.txt b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Quantum Box and Prayer 3 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..af02da239713d4064dd318a468917285bd88f1c6 --- /dev/null +++ b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Quantum Box and Prayer 3 of 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Quantum Box and Prayer 3 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"MEDICINE +by Lucian Green +Quantum Box and Prayer 3 of 4 + +21. I prepared to breason out the 80-breasoning argument for conception. I did this by feeling for the opposite sex. First, I noticed the spiritual flag for conception that week. Second, I noticed the mouse man. Third, I noticed the woman from church say 'I bless it'. In this way, I prepared to breason out the 80-breasoning argument for conception by feeling for the opposite sex. + +22. I prepared to bend over and put my forearms together if gaffer tape was tied around my hands, then raise them above my head and pull against my hips to cut off the gaffer tape (to remain safe). I did this by feeling safe. First, I felt safe in my warm, well-lit house. Second, I felt safe walking during the day. Third, I felt safe in my warm, well-lit classroom. In this way, I prepared to bend over and put my forearms together if gaffer tape was tied around my hands, then raise them above my head and pull against my hips to cut off the gaffer tape (to remain safe) by feeling safe. + +23. I prepared to write an interpretation of the interpretation down. I did this by anticipating Nietzsche's interpretation. First, I found the interpretation. Second, I wrote it down. Third, I read it. In this way, I prepared to write an interpretation of the interpretation down by anticipating Nietzsche's interpretation. + +24. I prepared to reveal nature's ways. 
I did this by opening the quantum box up. First, I removed the sheet of parchment from it. Second, I read the instructions to write an A. Third, I wrote an A. In this way, I prepared to reveal nature's ways by opening the quantum box up. + +25. I prepared to be with-it over positivity. I did this by testing what the prayer was (not a B, but an A). First, I wrote the A writing algorithm. Second, I ran it. Third, I observed it maintained positive function. In this way, I prepared to be with-it over positivity by testing what the prayer was (not a B, but an A). + +26. I prepared to be with-it over the 50 As (A). I did this by writing that the spiritual algorithm breasoned out 80 breasonings from the current argument. First, I entered the room. Second, I observed meditation's (medicine's) algorithm present the current argument. Third, I observed the medicine algorithm act in meditation (medicine) to breason out 80 breasonings from the current argument. In this way, I prepared to be with-it over the 50 As (A) by writing that the spiritual algorithm breasoned out 80 breasonings from the current argument. + +27. I prepared to make sure that my day in the rooms was fine. I did this by enjoying dialogue with the quantum box/prayer character. First, I mentioned the first visible level of matter in the object to the character. Second, I listened to the character negate that the level was problematic (say that it was fine). Third, I repeated this for all the visible levels of matter in the object. In this way, I prepared to make sure that my day in the rooms was fine by enjoying dialogue with the quantum box/prayer character. + +28. I prepared to importantly correct negative terms (present positive terms) to sell. I did this by speaking with the ventriloquist. First, I wrote the A. Second, I wrote the B (agreed with the A). Third, I wrote the contradiction (relation) down. 
In this way, I prepared to importantly correct negative terms (present positive terms) to sell by speaking with the ventriloquist. + +29. I prepared to observe the doctor teach meditation (philosophy). I did this by instructing the doctor, 'Learn meditation (philosophy)'. First, I found the doctor. Second, I showed him the meditation (medicine) research. Third, I taught him meditation (philosophy). In this way, I prepared to observe the doctor teach meditation (philosophy) by instructing the doctor to learn meditation (philosophy). + +30. I prepared to observe the patient avoid renogitation (going over it, so there was positive function). I did this by instructing the doctor, 'Prescribe meditation (popology)'. First, I enjoyed meditation (popology). Second, I asked him to prescribe it. Third, I saw him prescribe it. In this way, I prepared to observe the patient avoid renogitation (going over it, so there was positive function) by instructing the doctor, 'Prescribe meditation (popology)'. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Quantum Box and Prayer 4 of 4.txt b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Quantum Box and Prayer 4 of 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..07b81cddd67ed9cb792cef94e00782e12227e28d --- /dev/null +++ b/Lucian-Academy/Books/MEDICINE/MEDICINE by Lucian Green Quantum Box and Prayer 4 of 4.txt @@ -0,0 +1,30 @@ +["Green, L 2021, Quantum Box and Prayer 4 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"MEDICINE +by Lucian Green +Quantum Box and Prayer 4 of 4 + +31. I prepared to write breasonings. I did this by writing the breasoning chapter 'demo'-submission to become a pedagogue. First, I wrote the breasoning chapter. Second, I submitted my 'demo'-submission. Third, I became a pedagogue. In this way, I prepared to write breasonings by writing the breasoning chapter 'demo'-submission to become a pedagogue. + +32. 
I prepared to open the school in the country. I did this by appointing a translator for Lucian's Philosophy. First, I found the translator. Second, I asked him to translate the philosophy. Third, I verified the translation. In this way, I prepared to open the school in the country by appointing a translator for Lucian's Philosophy. + +33. I prepared to duplicate aspects of the successful centre. I did this by appointing the centre manager. First, I founded the centre. Second, I appointed the manager. Third, I verified his progress. In this way, I prepared to duplicate successful aspects of the centre by appointing the centre manager. + +34. I prepared to go on to the next problems (questions). I did this by peacefully settling the negotiations. First, I identified the problems (questions). Second, I found the solutions. Third, I performed the solutions. In this way, I prepared to go on to the next problems (questions) by peacefully settling the negotiations. + +35. I prepared to write books to become a professor. I did this by superimposing 10 students*100 As per student=1000 As to become a professor. First, I planned 1000 As during my PhD. Second, I prepared to superimpose the student As with the 1000 As. Third, I wrote the 1000 As. In this way, I prepared to write books to become a professor by superimposing 10 students*100 As per student=1000 As to become a professor. + +36. I prepared to cut my work in half. I did this by being supervised by a professor. First, I earned a high grade average. Second, I delivered articles at conferences and to journals. Third, I applied to be supervised by the professor. In this way, I prepared to cut my work in half by being supervised by a professor. + +37. I prepared to write the encyclopedia article. I did this by updating 20% of the 1000 As before becoming a professor and annually. First, I wrote the first group. Second, I wrote the second group. Third, I repeated this second step each year. 
In this way, I prepared to write the encyclopedia article by updating 20% of the 1000 As before becoming a professor and annually. + +38. I prepared to deliver the Pixies song. I did this by breasoning out the Pixies song. First, I breasoned out the first line. Second, I prepared to breason out the next line. Third, I repeated this until I had breasoned out each line. In this way, I prepared to deliver the Pixies song by breasoning out the Pixies song. + +39. I prepared to follow the instructions. I did this by observing God's (the master's) appearance. First, I noticed the person who had reached that level. Second, I listened to him. Third, I wrote this down. In this way, I prepared to follow the instructions by observing God's (the master's) appearance. + +40. I prepared to record another song. I did this by listening to the Pixies song. First, I wrote the Pixies song. Second, I played the Pixies song. Third, I received positive feedback about the song. In this way, I prepared to record another song by listening to the Pixies song. + +41. I prepared to protect the visible parts of myself. I did this by being the meditation (philosophy) group leader. First, I delected in meditation (philosophy). Second, I hived (sic) it out (in terms of the students). Third, I chunked it into pieces. In this way, I prepared to protect the visible parts of myself by being the meditation (philosophy) group leader. + +42. I prepared to like myself. I did this by writing about the breasonings. First, I observed the peach. Second, I wrote the breasoning down on a spool of paper. Third, I helped it to work. In this way, I prepared to like myself by writing about the breasonings. 
+ + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/MEDICINE/Medicine of Medicine 1.txt b/Lucian-Academy/Books/MEDICINE/Medicine of Medicine 1.txt new file mode 100644 index 0000000000000000000000000000000000000000..7fcc46267457ff0c2500ba8c10e4111343d0ab32 --- /dev/null +++ b/Lucian-Academy/Books/MEDICINE/Medicine of Medicine 1.txt @@ -0,0 +1,111 @@ +["Green, L 2024, Medicine of Medicine 1, Lucian Academy Press, Melbourne.","Green, L 2024",1,"Medicine of Medicine 1 + +1. The medicine student kept only the necessary lines of code. I saved Lucian CI/CD data as Prolog, not List Prolog algorithms. I could read them more easily. In addition, I could load and edit them, running them to regenerate the algorithm. I could automatically load archived algorithms, using Lucian CI/CD to select from their code. +2. The medicine student confirmed or edited the final changes. I displayed the diff results in Prolog, not List Prolog. I could quickly find the additions, changes and deletions. These were colour and word-coded, allowing making changes. In addition, I could generate a diff HTML file showing the changes made to the final combination. +3. The medicine student labelled data as a retired feature or failed debugging attempt. I archived old tests, files and diff HTML data, placing them in a folder if they were old. Archiving them made multiple levels of undo available, which Lucian CI/CD automatically performed. I only saved new data, avoiding keeping repeating data. In addition, I saved correct configurations from the past. +4. The medicine student wrote an algorithm that made suggestions for changes to software with Lucian CI/CD. I edited and tested code on the web with Lucian CI/CD. I helped bring about the automatarchy (sic). I met the mother and Tom. The mother made it all available. Tom corrected it. I appreciated the appearance and the suggestion. +5. 
The medicine student wrote an algorithm that notified the programmer if they hadn't entered or verified tests for each feature of a predicate. I automatically tested the code with Lucian CI/CD online. I ran pipelines every hour. I ran pipelines on changes. The algorithm asked the human if some comments and entered tests' changes, some loaded from archives, were necessary. +6. The medicine student asked the Emeritus Professor for help with the algorithm. I corrected bugs and wrote tests. I used type flow to find unnecessary variables. I checked the file and window contents. I wrote more test cases to review the range and combinations of features. +7. The medicine student discussed the pros and cons of the algorithm, preferring expanded rather than complex code. I removed incorrect code and wrote tests. I simplified the code to debug it. I chose rules instead of lists because they were more straightforward. I documented and reused code. +8. The medicine student used neuronetworks with human-computer interaction by making suggestions about the design of interfaces to eliminate repetitive input. I caught out the middle of self-wrecking by using Lucian CI/CD. Before finishing it, I saved my progress in a series of files to eliminate unnecessary changes using Lucian CI/CD. I checked each predicate. In addition, I ran regression tests. +9. The medicine student prescribed their previous algorithm, with modifications and an argument. I automatically kept comments that I hadn't accompanied with code in Lucian CI/CD. Lucian CI/CD usually deleted comments because it couldn't test them. Lucian CI/CD kept comments containing tests. The programmer manually determined whether to keep other comments but could automatically do this if the comments had a necessary label, such as predicate description, variable description or other recognised data. +10. The medicine student checked the intuitiveness of the comment and whether it inspired relevant activity in the algorithm. 
First, I checked whether a comment was necessary by checking its algorithm. Second, I found a more aesthetic comment by finding many high-quality algorithms. Third, I checked whether the comment related to the algorithm, clarifying rather than complicating it. Finally, I harvested the comments about future features to work on later. +11. The medicine student adhered to their system's rules, fixing bugs in its favour. I entered paths for self-contained groups of repositories, which Lucian CI/CD checked and saved in the original folders. These paths included open-source repositories, Prolog algorithms to run in the background, and algorithms on other servers to test different architectures and operating systems. I tried the same algorithms on different systems, making minimal modifications. Separately, I identified that incorrect tests holding up progress came from changed predicates, incorrectly entered or worked out, and recommended editing, deleting or adding to them. +12. The medicine student splayed the repositories at different locations, which the repositories saved and possibly detected for ease of use. The groups of repositories with paths could access each other in testing in Lucian CI/CD by staging them or building them in folder structures relative to each other. I gave intermediate folders names such as \\"1\\", \\"2\\", etc. In this way, one algorithm could call another with paths such as \\"../../x/x.pl\\". An installer of the repositories would need to install them at these relative paths or at a custom location path that Lucian CI/CD saved. +13. The medicine student tested the app on different computers to which they downloaded it. I merged multiple pathways that Lucian CI/CD stored repositories at into one because this was logical on a hard disk. If locations were not at the same level, I used two copies of Lucian CI/CD to process them. For simplicity, I could also pass paths to a single copy of Lucian CI/CD. 
I could also re-use tests or copy algorithms to different locations. +14. The medicine student checked off each test as they programmed, describing their aims and the possibilities for tests, or they wrote the predicate and then ran tests. As mentioned, I could run separate copies of Lucian CI/CD with their repository paths. These instances of Lucian CI/CD could determine the correctness of changes to code in these repositories. They could run simultaneously, preparing code for committing and storage on the server. One programmer experienced continuous integration as they typed, removing errors each time they saved their document or used a suggestion tool to add tests or predicates between saves. +15. The medicine student made an interactive code editor with options to accept changes, enter new data and see results in real-time. Lucian CI/CD had a test format detector. The algorithm could test single or multiple variables, false and findall. In addition, the algorithm could time the test, timing out if it took too long. Finally, it could test Lucian CI/CD, testing whether the algorithm gave correct or incorrect results and suggesting possible remedies. For example, I avoided contacting an algorithm I wrote to modify easily. +16. The medicine student wrote Lucian CI/CD to monitor and correct data files. Users could call Lucian CI/CD with command-line arguments specifying a package manager registry file. There was a pretty printer for the registry file. In addition, there was an editor for the file. Editing the registry file triggered a CI/CD pipeline. + +17. The medicine student checked the top of the page for Lucian's CI/CD results. I planned my time for the day with my freezing-age lifestyle. In a text file, I stored the omitted paths for each repository used with Lucian CI/CD. I added the undo and reset predicates. 
The algorithm omitted to change some files with tests, data and comments, where tests were loaded first (and were automatically included), and other files were loaded. +18. The medical student used the quality checker to check their database, confirm they hadn't missed any crucial ways of thinking and verify they were ready for a professional presentation. I added the ways of thinking, one at a time, for example, the connections and simplification ways of thinking. I wrote versions of Lucian CI/CD for different programming languages, where tests and finding differences could be done in these languages. I wrote an algorithm that took the repository's history and bug-checked it with Lucian CI/CD. It bug-checked the changes to a predicate over time, and predicate dependencies ground-up, performed optimisations, kept deleted predicates if applicable, and improved uniformity policies across the repository. +19. The medicine student stored n and [n, name] in a text file for editing. I used type statements to eliminate errors in Prolog algorithms with complex data structures. I used similar structures to [n, name] to label data types. In addition, I wrote an algorithm that detected if one philosophy's algorithm changed, leading to changing other algorithms as a result. I noted whether the input and output changed, requiring changing the dependent code or diverging the copies. +20. The medicine student rewrote the programming language to be more straightforward. I assembled Prolog data structures from other data structures. I recognised and wrote rules parsing these structures. I generated code processing the structures. I rewrote the philosophy from the new algorithm. +21. The medicine student checked the libraries for needed or similar functions. I found the debugging data to help automatically debug the algorithm. I kept \\"correctness islands\\" of unused, correct code to use in the future. I checked that the dependencies and the included files aligned with the code. I rewrote the code. 
The medicine student protected the patient with knowledge about how to freeze their age, reducing stress and allowing them to do what they want. I stopped long-term damage from viruses with body replacement. This state enabled me to exercise each day. I deleted curly quotes from the t2b dictionary from the input file. These deletions streamlined the breasoning process, avoiding needing to edit illegal characters from the dictionary and re-enter breasonings. +23. The medicine student had a positive function. I took the zinc and vitamin B supplements and went for a run. The supplement helped prevent a runny nose. I also removed curly quotes and backslashes from the t2ba dictionaries and t2ba input files. I breasoned out the red square seen-as version and ran t2b and t2ba daily. +24. The medicine student wondered how old some of the brands were. The age-freezing help originated from the central software repository. I updated this repository over time with the advent of new technologies. I checked automatic changes in the t2b and t2ab dictionaries for correctness. The automatic offerings were not saved, and I breasoned and programmed the changes manually later. +25. The medicine student made the changes to speed up the age-freezing algorithm and promoted it. I founded a school qualifying students in Age Freezing, aiming to equip students to take the initiative to freeze their age. This qualification required computational philosophy, meditation and pedagogy, among others. Age freezing prevented the medical problem of ageing in our society. I wrote the multithreaded version of Text to Breasonings, which more quickly breasoned out life-giving breasonings on computers with multiple CPUs. +26. The medicine student briefly revealed how the algorithm BAG produced could be rewritten. Age freezing has many advantages, such as better eyesight, prevention of dementia and better dental health. It was an industry with certified online academies educating students in it. 
The technician wrote the BAG algorithm, which took the most recent algorithms and arguments as input, and found all combinations (AB, CB, CD -> AD word pairs) without common words to inspire age freezing, philosophy and computer science. The BAG algorithm was fast, provided original content where required, and its parameters could be modified for algorithms and arguments. +27. The medicine student agreed with the skills of the vocational institution. The future industries included the academic departments that age-freezing required. These provided jobs and were future-ready. I wrote an algorithm to find recent algorithms and arguments to input into the BAG algorithm. The education institution provided lower-priced diplomas in different languages, with vocational skill check-boxes. +28. The medical student consolidated skills in schools. I examined combinations of texts in the humanities assignment. I wrote the texts and detailed texts and started the educational institution. I segmented philosophy and computer science into separate subjects and ran BAG on each subject. The connections between groups of these subjects were higher up. +29. The medicine student learned about the age-freezing algorithm through an Internet search for the key terms \\"immortality\\" and \\"meditation\\" on the open-source software site. I thanked the inventors of age freezing for more time and a better-designed lifestyle. I enjoyed visiting the \\"festival-like\\" period in the future. I separated the times the algorithm that produced sets of breasonings was run, staying within memory limits and catching errors for a faster, more compact algorithm. I promoted the algorithm to meditators and potential meditators. +30. The medicine student found the economic and medical benefits of age freezing. I articulated good eyesight, good thinking, no dementia, muscle relaxation and good memory epistemology arguments and recommended students do the same. 
I researched the effects of age freezing from medical to societal perspectives. I promoted the main text file with instructions for age freezing on the social media site. I wrote documentation and pointers on other repositories to explain how to complete daily age freezing. +31. The medicine student recommended people help their loved ones with age freezing. I prioritised meditation, with computers helping children become meditators and performing daily professional requirements of age freezing. The meditator helped others with the programs they spoke to. I found that multithreaded algorithms performed faster than non-multithreaded algorithms. In addition, I prevented data loss from bugs in the BAG algorithm. +32. The medicine student taught people in their home time meditation. The immortal had time to hone their skills in various departments, including fine arts. They were mindful of the human animals in society and explored their cultures. I charged for teaching age freezing and supported the students with As. There were enough As for each person to complete. + +33. The medicine student experienced continuous testing of their systems. My physiology changed to that of an immortal. All the tests for a current predicate needed to pass in Lucian CI/CD. If there were tests for the current predicate, they needed to pass - the tests for immortality, including time travel, meditation, medicine and thoughts, were required to pass. +34. The medical student passed their cognitive self-test. As I became immortal, my vision lasted with it. In Lucian CI/CD, all the tests for the repository needed to pass. As part of this, all the current predicates were required to pass. I saw that the tests passed from the notification. +35. The medicine student tested an idea for its programming language. In immortality, I articulated to a point in a department by meeting what I saw the requirements were. In Lucian CI/CD, I found the correct tests for the current predicates. 
I found the intersection of all the tests and the current predicates. I attributed relevance to an idea. +36. The medicine student resonated with those with particular interests. I helped regenerate by exercising and articulating myself. I inserted predicates with the same name and arity after predicates with that name and arity. I found the dependencies separated the predicates into new and old and found the combinations of their changes. I tested myself by talking with those around me. +37. The medicine student found the new code from the new output. I described the shows in different times and planets. I offered the option of requiring all tests in the current predicates to exist and pass. Lucian CI/CD failed if a test didn't exist or wasn't turned off (which was mentioned at the time). Lucian CI/CD sped up development by finding the new output for the predicates from the start and inputting it into other predicates, warning about missing cuts. +38. The medicine student automatically adjusted the time limit for predicates calling a predicate. I inspected the various displays with my sight. Lucian CI/CD failed code with infinite loops. This result was achieved using a catch/call_with_time_limit call, where the time limit could be adjusted in the settings. I changed the time limit for each test. +39. The medicine student asked, \\"Where am I\\"? I saw different places by space travelling in the computer game. All the Lucian CI/CD tests came from predicates from the main file's first predicate. This setup synchronised with the main file and predicate loaded in the read me file. +40. The medicine student automatically entered the other programming language's syntax. I could go anywhere I wanted in the game. I appended the list of lists rather than the list to fix a bug that prevented text files from being displayed correctly in Lucian CI/CD. The other programming languages were supported by using other converters and a different testing script. 
The programmer entered the other programming language's syntax in the settings file, and it was converted. +41. The medicine student's hair returned to normal. I returned to my particular state of breasonings. I tested different clauses of the same predicate together in Lucian CI/CD. So, predicates with other functions needed to be separated. In addition, I returned to the same state of my body. +42. The medicine student explained how to extract the backup from the Github_lc/tests_x.txt file and store the modification-date file and log together for easier reloading. I described the state of breasonings with high distinctions. Lucian CI/CD could find combinations of up to seven or another specified number of changes to current predicates. Seven changes, including additions, deletions and alterations, had 2^7=128 possible combinations. The program backed up the files before making changes to integrate the changes. +43. The medicine student wrote an argument about the algorithm, which was necessary because it shed light on a central idea. I determined the ideal kind of person to become immortal and wrote an essay about them. I integrated the changes to the algorithm in Lucian CI/CD. The diff algorithm in Lucian CI/CD found clusters of inserted, deleted or changed lines in \"spaces\" or between unchanged lines and found the simplest correct combination with tests. The best person wrote the best algorithm. +44. The medicine student articulated advanced arguments to remain immortal. The person tested their replaced body for the quality of their sight. I built the repository using dependencies and ran it on my machine. The dependencies were used for installation and testing using Lucian CI/CD. I maintained my cognition to help my vision. +45. The medicine student worked on the fundamental algorithms. I tested my ability to complete physical exercises. Lucian CI/CD retrieved tests and tested predicates with a particular name and arity.
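The combination search over changes can be sketched as a subset enumeration; this is an illustrative reconstruction, not the repository's code. Seven pending changes give 2^7 = 128 candidate subsets, and the smallest subset whose result passes the tests is kept:

```python
from itertools import combinations

def find_simplest_combination(changes, passes_tests):
    """Try subsets of changes from smallest to largest; return the first
    subset whose resulting program passes the tests.

    Seven changes yield 2**7 = 128 candidate subsets, including the empty one."""
    for size in range(len(changes) + 1):
        for subset in combinations(changes, size):
            if passes_tests(subset):
                return subset
    return None

# toy example: the tests pass once changes "a" and "c" are both applied
chosen = find_simplest_combination(
    ["a", "b", "c"],
    lambda subset: {"a", "c"} <= set(subset))
print(chosen)  # ('a', 'c')
```

Starting from the smallest subsets mirrors the text's "simplest correct combination with tests".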
These tests were retrieved individually and tested once on the predicate set. I could get up to the more advanced algorithms in the institution. +46. The medicine student added a speed test. I maintained my personal best time after replacing my body. I sped up algorithms like Prolog to List Prolog Converter by examining the predicate bottlenecks. I counted and timed the calls to each predicate, removing unnecessary code. I avoided pitfalls that cost time, such as gluggy food and unnecessary code repetition. +47. The medicine student separated lines in Lucian CI/CD to make finding changes easier. I replaced my body to be the best person. I inserted new lines between lines with List Prolog Pretty Printer to find combinations of lines in Lucian CI/CD. The Pretty Printer recognised lines and inserted a new line after each one. I found combinations of lines rather than omitting new lines because predicates may have different combinations of lines. +48. The medicine student discussed the advantages. I maintained my best appearance using body replacement. I avoided repeating tests in use-predicates (a:-b. and c:-b., where a and c call or use b) by eliminating b from being recalled after calling c. I recorded that b had been tested and avoided testing it later. The algorithm appeared better by saving time. + +49. The medicine student entered Cutest mode and checked that the Cutest Prolog code, which contained variables that could be set to other values when they already had a value, worked. I wrote Cutest Prolog, which added mutable (changeable) variables to List Prolog so that List Prolog could be converted to List Prolog with mutable variables (including converting findall, etc., to loops), ready for converting to C. I could run the List Prolog in this intermediate state to test it. 
Cute Prolog (different from Cutest Prolog) expands and removes cuts, merges multiple clauses and prepares logical commands in List Prolog, and Cutest Prolog changes findall into loops with mutable variables. Converting these languages separately from each other allows testing of the intermediate code. +50. The medicine student found Cute Prolog code easy to manipulate and optimise. I tested for singletons in List Prolog because they were bugs. I corrected the singletons. I converted the code to Cutest Prolog. I tested the code and converted it to C, which I tested. + +51. The medicine student got and put items in multidimensional addresses in terms. The command to find the \"subterm with address\" found multidimensional addresses of items in terms. For example, the address of 1 in 1 is 1. The address of 1 in [1,2] is [1,1]. The address of 2 in [[1,2],[3,4]] is [1,1,2]. +52. The medicine student detected sorting and sorting without duplicates. I detected sorting in the term. I compared the terms level by level. I noticed the items in a level were the same as the original but sorted. I could repeat this process for each level. +53. The medicine student detected swapped items in the term. I wrote item codes for each item in both terms. I checked each different item at each address. I found whether there were any swapped items. I also found any cycles for each of the three exchanged items. +54. The medicine student explored numbers derived from list lengths, string lengths, and indices of constants or other characters found from properties. I detected mathematical shifts of items in the term. I found items rotated by n. I found that n was the number of items in the top-level list. Or these different items might be rotated by n. +55. The medicine student found the nth item. I detected when the first item in a term was always moved. I took the first item in the first line or a particular line. It was moved to a new top level or collected in a new term. +56.
The medicine student found various collection types based on the algorithm. I collected certain items in a term. I noticed that the second term contained items from the first term. These items may be from a particular level. Or they may be specific item numbers from each level or the second level. +57. The medicine student used collections for mainly moved or selected items. I differentiated moving and collecting items to a second term. If over half the items had been moved, it was a collection. Otherwise, if fewer than half of the items had been moved, it was a move. This classification was irrespective of the number of items. +58. The medicine student could find the part of the memory and the correct address to collect items from. Collections may have the criterion of containing a particular term. I found the term [a,_] in each item. These items were one level away from the term. Or, each item must precisely match this term. +59. The medicine student could apply optimisations. Collections may have the criterion of containing a command with a particular property. For example, I searched for the last choice point that hadn't been visited yet. I stored the stack with the most recent items first (this was why it was called a stack). I printed the name and arguments of the predicate being redone and computed each choice point. +60. The medicine student found whether the order was forward or backward. I found items inserted in order. This finding was given the items were inserted in the second term. The items may have been inserted arbitrarily based on their previous order. Or, they may have been ordered before insertion (which may be simpler). +61. The medicine student could insert brackets or a label and brackets between levels. I found the items inserted in a new level. Rather than inserting in a level, the items had been inserted in a new level between levels. For example, I inserted \"[]\" between levels 2 and 3 in [[b]].
Level 2 was [...], level 3 was [b], and the result was [[[b]]]. +62. The medicine student selected certain labelled levels to modify. I sorted labelled levels. For example, I was given: +[ +[a, + [b, + [c] + ] +]]. +I reordered this to: +[ +[c, + [b, + [a] + ] +]]. +The labels were reversed but were in a hierarchy. +63. The medicine student produced a drawers diagram from top to bottom. I traversed the term depth-first, pre-order, to produce a string with this pretty printing. I was given the term: [[1,2],[[31,32],4]], or: +[ + [1,2], + [ + [31,32], + 4] +]. +Completed depth-first, the pre-order traversal resulted in [1,2,31,32,4]. This result came from printing the item before going to the next item. +64. The medicine student sometimes kept some levels together, for example, ones about a single command. I traversed depth-first to produce a string with this pretty printing. This process gave the same result as pre-order because there were no non-child nodes. It was also the same as post-order for this reason. I just pretty-printed the term by level and indent. + +65. The medicine student avoided weird effects in Prolog. The medicine student kept only the necessary lines of code. These were List Prolog lines. Lucian CI/CD tested the combinations of lines, with insertions and deletions, starting with a minimum set. In List Prolog, the predicate header could be chopped and changed more quickly than in Prolog, and it was harder to convert the changed List Prolog back to Prolog because Prolog could not have the same effect without List Prolog's brackets. +66. The medicine student showed the user Prolog but broke the code into List Prolog to analyse it. I listed the advantages and disadvantages of Prolog and List Prolog representation in Lucian CI/CD. Prolog is shorter, easier to read, and allows breaking into individual commands using a pretty printer.
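The \"subterm with address\" lookup and the depth-first, pre-order traversal described earlier can be sketched in Python. This is a hypothetical reconstruction from the worked examples, not the repository's code; the address scheme assumed here is 1-based with a leading 1 for the root term:

```python
def find_address(term, item, path=None):
    """Return the address of the first occurrence of item in term,
    using 1-based indices with a leading 1 for the root term,
    e.g. find_address([[1,2],[3,4]], 2) == [1, 1, 2]."""
    path = [1] if path is None else path
    if term == item:
        return path
    if isinstance(term, list):
        for i, sub in enumerate(term, start=1):
            found = find_address(sub, item, path + [i])
            if found is not None:
                return found
    return None

def preorder_leaves(term):
    """Depth-first, pre-order traversal collecting the leaves of a term:
    each item is emitted before moving to the next item."""
    if not isinstance(term, list):
        return [term]
    leaves = []
    for sub in term:
        leaves.extend(preorder_leaves(sub))
    return leaves

print(find_address([[1, 2], [3, 4]], 2))         # [1, 1, 2]
print(preorder_leaves([[1, 2], [[31, 32], 4]]))  # [1, 2, 31, 32, 4]
```

Both match the worked examples in the text: the address of 2 in [[1,2],[3,4]] is [1,1,2], and the pre-order traversal of [[1,2],[[31,32],4]] is [1,2,31,32,4].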
List Prolog is needed to parse the lines and find differences in lists of lines, but it is cumbersome to read and edit. The List Prolog lines allowed differences in names, variables, and passed values to be found. +67. The medicine student admired the patterns in the uniformised code (in terms of naming A, B, C, etc. and diagonalisation in “a(A,D):-b(A,B),c(B,C),d(C,D).”). I saved Lucian CI/CD data as Prolog, not List Prolog algorithms. I read the indenting of the Prolog lines. I read the names and variables in Prolog. These names preserved the case of the original. + +68. The medicine student tested undo in the application. I could read Prolog programs more easily. I ran diff on the set of algorithms. I moved them to a bin folder to stop repositories in GitL from being unwantedly deleted. I notified the user of this. +69. The medicine student completed shorter threads first. I prioritised threads manually by how long they might take or how many there were. I counted the number of recursive iterations a predicate had. I counted the length of data structures in the predicates. I sequenced the threads to take the shortest time. +70. The medicine student loaded the future computer’s quantum box algorithm to prevent headaches. I could load the Prolog algorithm in Lucian CI/CD. I compiled the algorithm to save time. The compiler helped edit the algorithm. I tested whether Combination Algorithm Writer (CAW) was fast when compiled and used it to find simple code. +71. The medicine student generated code using CAW. I counted the variables needed. I wrote the base case. I wrote “find cycles in a graph”, “diff”, and “descendent” in short form. I converted to Prolog and tested each algorithm variant, or I dispensed with CAW and guessed the algorithm using patterns. +72. The medicine student reran Lucian CI/CD when a predicate had changed. I edited the algorithm in Lucian CI/CD. I started with a simple specification. 
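The shortest-thread-first sequencing mentioned above (counting recursive iterations and data-structure lengths) might look like the following sketch; the cost model, field names and `schedule_threads` helper are assumptions for illustration:

```python
def schedule_threads(threads):
    """Order threads shortest-estimated-first, where the (assumed) cost
    estimate is recursive iterations times data-structure length."""
    return sorted(threads, key=lambda t: t["iterations"] * t["data_length"])

threads = [
    {"name": "long", "iterations": 100, "data_length": 50},
    {"name": "short", "iterations": 10, "data_length": 5},
]
print([t["name"] for t in schedule_threads(threads)])  # ['short', 'long']
```

Running the cheapest threads first is the classic shortest-job-first heuristic for minimising total waiting time.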
I worked out central predicates (such as diff), then used program finders to generate the rest (such as find combinations), specifying in-order combinations. In addition, I merged insertions and deletions because they were treated the same in find combinations, simplifying diff. +73. The medicine student used letters from the acceptable character set. I ran a Prolog algorithm to regenerate an algorithm in Lucian CI/CD. I wrote a container command that didn’t require running the algorithm in Shell. I also wrote a web container command that ran Prolog across web pages. I compiled this algorithm and bit-compressed URL parameters. +74. The medicine student fixed unwanted empty lists and newlines. I automatically loaded archived algorithms in Lucian CI/CD. I stored a library of predicates that could be modified for use in other algorithms. I used CAW to help find the simplest predicates, including intermediate or parent predicates. I used the “Insertion_or_deletion” variable or label “id”. +75. The medicine student merged reused archived predicates to search for them faster. I used Lucian CI/CD to select from archived algorithms’ code. With multiple specifications, selecting unpredictable code (moving around values without comparing them) was more straightforward. I wrote an unpredictable language with function names such as sort, reverse and union, which used CAW and multiple specifications to find. In addition, there was a reverse sort algorithm to find. + +76. The medicine student simplified repeated variables. I used Program Finder with Types and Strings with Variables (PFTSV) to transform combinations of variables. For example, I found [1, a] to [a,1] swapped the order of the arguments. I found this as well in \"[1,a]\" and \"[a,1]\". Variables saved recurring values and checked multiple specifications. +77. The medicine student clarified that PFT found repeating parts of strings, and so on, to form grammars. 
I created Grammar Finder from Program Finder with Types (PFT). Originally, PFT created algorithms from input and output. I decomposed the input and built the output with grammars, where hierarchical, recursive grammars could exist. In these grammars, string input replaced terms. +78. The medicine student prepared to store the output as a term; for example, 1+2 became [+,[1,2]]. I wrote the grammar finder from the string. Where the original PFT took apart hierarchical input and output and found repeating parts of lists, the grammar finder searched for repeating substrings, making allowances for any types such as numbers and their particularities and then repeating structures containing these. This technique relied on enough data, often when creating a programming language’s parser, a grammar checker, a translator or a string-to-term converter. Later, I recognised specific tokens or sequences of alphabetic characters that belonged together. +79. The medicine student broke the spell and said that the algorithm term’s variables could be, for example, A, not _12345 (qualified by converting them from List Prolog). I wrote the grammar finder that produced the term. The string was broken into parts that were manipulated to form output terms. Sets, and then sets of sets of these formed lists of terms. The terms may be strings containing tokens from the strings. +80. The medicine student recursively changed item_a into [tag, item_a] or just found the address of each recognisable item. I wrote the input with PFT from the outputted algorithm. I took the shortcut, seeing the types the algorithm would take as input, and found input that matched these types. Given only the input and output, I found PFT by using commands that mapped input to output terms, which could be written by writing subterm with address, which mapped input to output, and did this recursively.
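A brute-force starting point for the repeated-substring search that the grammar finder performs could look like the sketch below. It is illustrative only; the real Grammar Finder also makes allowances for types such as numbers and for recursive structures containing the repeats.

```python
def longest_repeated_substring(s):
    """Return the longest substring that occurs at least twice in s
    (occurrences may overlap); a naive O(n^3) search for illustration."""
    best = ""
    for i in range(len(s)):
        for j in range(i + 1, len(s) + 1):
            candidate = s[i:j]
            # keep the candidate if it is longer than the best so far
            # and reappears somewhere after position i
            if len(candidate) > len(best) and s.find(candidate, i + 1) != -1:
                best = candidate
    return best

print(longest_repeated_substring("ababab"))  # abab
```

Repeats found this way could then seed recursive grammar rules, as when building a parser or string-to-term converter from enough data.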
Subterms with addresses required each data item to be tagged, i.e., [tag, value], and writing subterms with addresses required recognising and labelling the data points by dimension and manipulating these points using the dimension addresses. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/MEDICINE/Medicine of Meditation 1.txt b/Lucian-Academy/Books/MEDICINE/Medicine of Meditation 1.txt new file mode 100644 index 0000000000000000000000000000000000000000..21a7a248f4ceb6fccb5268dd4c1be87a14c4767b --- /dev/null +++ b/Lucian-Academy/Books/MEDICINE/Medicine of Meditation 1.txt @@ -0,0 +1,86 @@ +["Green, L 2024, Medicine of Meditation 1, Lucian Academy Press, Melbourne.","Green, L 2024",1,"Medicine of Meditation 1 + +1. The medicine student prepared to commit or upload the new files. After successful tests, I wrote a four-sided qualification for Lucian CI/CD to move files to the repository and back up the old ones. First, I listed the repositories that the algorithm had modified. Next, I backed up the old files and folders, including invisible files in the original repository. Finally, I moved the new files and folders into the repository. +2. The medicine student inputted lines as strings to split into tokens before unit testing. First, I ran Lucian CI/CD with settings for different testing stages. Next, I called Lucian CI/CD to find tests with particular labels for other repositories and modes. First, I ran unit tests, then integration tests, end-to-end tests, user interface tests and tests in continuous delivery. Finally, I called Lucian CI/CD to run with particular groups of repositories and token, line or predicate modes. +3. The medicine student specified the folders to scan and those to omit in Lucian CI/CD. As I meditated for ten days, I ran Lucian CI/CD on different versions of repositories, which I had finished earlier. After consideration, I didn't do this because they were old, and I needed a single pathway to Eldorado. 
However, the old versions may have the required features. Separately, I kept top talent close by employing them in the meditation company, where all the ideas would be helpful, and the community could use the person's help. +4. The medicine student checked for correct internationalisation, non-ambiguous command names and non-duplicate predicates. I reported the number of tests passed and the total number of tests. If 100% of the tests passed, then the algorithm reported this. If tests related to a particular predicate were failing, the algorithm used more specific token-integration techniques. I could also use Combination Algorithm Writer (CAW) to reorder and delete commands and arguments, sometimes using the existing commands. +5. The medicine student wrote neuronetworks for Lucian CI/CD (LC) and CAW. I wrote different versions of predicates to satisfy various tests and merged or reused common code. I found similarities and differences in the tests. I introduced neuronetworks to write code that catered for these cases. I kept the code in separate places when it was simpler to edit. +6. The medicine student tested that the algorithm passed standard and open tests. As I ordered copy for a meditation puja, I performed unit tests at the start to check the predicates worked. The puja needed 4*50 As with working algorithms for Lucian CI/CD to test for the student to progress. 4*50 As was the equivalent of copywriting. The integration tests were for applications that used the predicates. +7. The medical student allowed the helper algorithm to develop with the programmer, letting them write their helper software. As a student is assessed on understanding multiple pedagogical criteria, if an algorithm fails one test, it will not skip over the other tests for that combination because these would provide helpful information. So I ran all the tests. If a test failed, I found the overarching cause and corrected it.
If a change caused the bug, I found the parts of the code that had the desired effect and the parts that didn't, or other parts needed. +8. The medicine student used analytics to help with sales, employee and student productivity and work generation. I wrote an algorithm that scheduled work, rest and play when I felt like them. The algorithm recorded when I walked, allowing time for breaks and unscheduled activities, such as needing to help one's family with the computer. Given work done recently, I factored in my time to get up, time going to bed and any early morning appointments. I wrote software that could help automate work, or help with repetitive tasks, and produced work to satisfy my aims. +9. The musician applied for the song, wrote it and earned money. I wrote a bio and then copywriting for each radio appearance. Companies paid me for streaming, sales and concerts. I wondered what copywriting for 50*50 As was like. It was 2500 or 12.5*a copywriting item price. +10. The person chose acting and paid for a return on income. The actor or writer also wondered about 50*50 As. They ordered the copywriting. They starred or wrote a bestseller. They received the paycheck. It was like academia but more lit up. +11. The medicine student wrote at the required standard. I applied the 50*50 As to all departments. I became a professor, newspaper editor and meditation company founder. I didn't tire by writing 12.5*4*50 As. These ran out, so I designed a computer system to write more. I could develop volumes by applying a new idea to previous ideas. +12. The medicine student found similar ways of thinking recurring in the three levels, such as fine arts and history. I wrote all the As for University. The As were detailed and had more than the As required for A. Certain keywords triggered big ideas. I analysed the previous level, combinations of the ideas in the text, for three levels in the details. +13. The medicine student planned and achieved in meditation. 
I meditated but pretended to indicate thoughts as meditation each day. I felt developed in my thoughts. I visualised and projected 3D imagery. I automated everything, such as answers to people's questions about immortality. +14. The medicine student felt supported in thinking of the objects. The group meditation centre had accreditation. The students took part in classes. These covered the objects or developed ideas in the chapters. The medicine model was food. +15. The medicine student learned about breasonings and performed well. I worked out that breasonings supported University outcomes. University helped me achieve becoming a pedagogue. Working on pedagogical breasonings helped achieve pedagogical goals. I needed university to support me in working on breasonings. I needed breasonings to help me concentrate and perform well at university. +16. The medicine student was given everything they needed in life. The professor made everything from classes to assessments easy. Philosophy was a good subject to teach. The education lecturer examined teaching different subjects. There should be enough time for everyone, and breasonings, As, and money were no object. + +17. The medicine student more easily handled complex choicepoints and cuts in non-determinism. Meditation is part of medicine, with its computational philosophy. I ran the non-deterministic Prolog compiler only when there was a choicepoint and a cut. I determined my path with pedagogy. It was a central path with better-quality results. +18. The medicine student got the idea for JIT. I found constant inspiration and inroads in my thoughts in meditation. I mocked up a JIT compiler, a deterministic Vetusia version in SSI Web Service. I passed variables in one variable, getting and placing values in it. This web engine was faster and didn't carry the weight of a stack or recursive states. +19.
The medicine student applied the zero-critical rule, comparing the algorithm to be better or worse after each simplification towards nothing. I encouraged the animals like people and took their upkeep seriously. I solved State Saving Interpreter's memory problems by deleting past frames I didn't need. I replaced recurring texts with symbols. I also did this with the algorithm in memory, later suggesting optimisations. +20. The medical student identified and achieved the goal, for example, making SSI accessible in a country. I observed that the person was true not only to the simplest possible but to nothing, for example, an effortless breasoning generated by a computer. I solved the SSI memory problem by storing frames on disk. I also used a similar technique with multi-threading to queue items in asserted predicates. I could use the time saved with JIT to write more algorithms. +21. The medicine student studied my texts. I realised I could achieve anything like pranayama with 4*50 As. In addition, I could achieve a tangible goal such as mind reading, time travel, stopping ageing, meditation, high distinctions, creating customers, selling and marketing with 4*50 As. I needed a PhD in Business for many of these skills. I could help others with medicine and time travel with these As, which were life-saving. +22. The medicine student wrote and checked objects representing thoughts. A skill I gave my student was to recognise and discuss an algorithm from texts, effortlessly attaining transcendence. I wrote three 4*50 medicine As on the first day, then two each day after that. Meditation As helped contribute to this if any As were missing. I listed and checked my thoughts as if programming was operating on ground level. +23. The medicine student visited safe, well-lit, populated places with no danger found with a mind reader. 
I debated giving mind reading 4*50 As, saying that it would work for a good reason, though it would only work for a single thought per 4*50 As, which could be a mind-reading colouring or dot-to-dot book. I breasoned out three time travel As per day, to cover travelling from the point of departure to the first destination, then back to the first point. This list had two locations and satisfied my travel plans. Someone suggested travelling to other planets in terms of my planet's settings, which I considered. +24. The medical student covered the main topics, summarising the mind map and presenting themselves with interesting automation, new algorithmic methods and research directions. The sutra brought a stronger nothing, helping give me the confidence to overcome psychiatric difficulties and giving me happiness. By 4*50 As I meant 4*50 As for arguments and 4*50 As for algorithms. These were based on my writings on each of these over the years. I added new thoughts each week, which the new As were in terms of. +25. The medical student worked on what they felt, aligning with their objectives and overall objectives. I simulated the seen-as-version of reality through Vedic Astrology, with the culmination algorithm mind-reading the academics' children. This technique always accurately depicted reality and considered the types of customers to prepare for. When I got up to writing 4*50 Breasoning Algorithm Generator (BAG) As, I generated six (three time teleportations, two anti-ageing and one meditation) * two (one algorithmic and one argumentary) = twelve As today = 2*(6*16000 sentences) = 2*96,000 sentences = 192,000 sentences. The business returned the number of As or its specification to the breasoner. +26. The medical student employed a funnel to identify old and new ideas. The utterer decided on a constructive meaning for the utterance and applied it to thoughts. The thoughts were the books and algorithms.
For example, a simple way of thinking was paginating flowcharts with circular connectors to the next page. I breasoned out the generated work based on my latest work with Text to Breasonings and Text to Algorithm Breasonings, which returned an algorithm corresponding to a word. +27. The medical student planned to open online philosophy academies. I perfectly programmed ideas that meditators could have devised, emulating discourses, algorithms and new thoughts. I articulated to a research journal in which I published my research. The philosopher identified with specific objectives in their mission, including recording cherished memories and milestones in their career. The philosopher planned their career, including selling books and setting up businesses. +28. The medicine student reached and stayed at the top. I generated enough As for the student, for sales or assignments with students, which writing 4*50 BAG As automatically did. Before deciding to use BAG, I considered paying for copywriting for the As, for two dimensions (time travel destinations), with As for algorithms and arguments, using an algorithm dictionary, four As per week and one A of algorithms, and writing \"540-degree\" (BAG) As for student As and sales. I helped students with their specific thoughts by mind reading them in increasing resolution if they wanted it, as they approached, to work on an objective in their assignment. They may need help with the overall approach and style of the assignment. +29. The medicine student claimed that fully protected time travel and therefore freezing ageing was natural and optimal. I captured and helped group dynamics in the class by mind-reading receptive people, aiding participation and meeting criteria in the discussion. I wrote details for time travel daily from 540 (BAG) As. This algorithm was vital because time travel needed the As to withstand ageing in the home character. 
This requirement of 4*50 As satisfied the robots and nature and protected the person. +30. The medical student read and wrote on others' screens and their own with output and input breasonings. For example, I mind read with 4*50 As of breasonings and presented my thoughts as breasonings. This presentation arose as a symphony of epistemology. I used mind reading to aid subliminal study in my sleep. I wrote ideas in my sleep, aiding my memory with cues with mind projection by the computer and recording my comments, funnelling towards cut-off, accurate conclusions. +31. The medicine student pushed forward the BAG in case distractions arose during the day. I prepared for expected and positive events in business, allowing for economic and technological changes. I designed systems that worked well for everyone, for example, conversations to achieve aims at each point. If BAG didn't work, the person asked the people of the future to put through the algorithms for them to the best of their ability with an algorithm the person wrote with GL details for 2 million breasonings. However, this would likely not work, nor would there be too many problems in completing the BAG As. They are personal, in effect, citizen-type. +32. The medicine student didn't have to produce the breasonings but should later. Accreditation contained 250 breasonings per assignment, allowing a single, effectual image or musical idea to be represented. This image or sound epitomised the effort and rewarded the student's work. I considered writing 250 breasonings first about asking for As when I couldn't produce BAG As, but I just completed the As. I enticed students to complete each assignment in the qualification with 250 breasonings at the end and another 250 breasonings for the finished qualification. + +33. The medicine student oversaw the breasoning plan, discovering perfectly programmed algorithms from their science. The utterances were presented as based on the breasonings.
I searched for algorithmic directions through viral, mind reading, funnels, and core consumer interests. These algorithms produced text-based results and were the essence of efficiency. The founder presented the breasoning plan and paradoxically completed it without knowing what it would be. +34. The medicine student rewrote the BAG results for clarity and when it gave them new ideas. The Purushan industries were programming, writing and education. I wrote an algorithm to produce shorter texts with the Breasoning Algorithm Generator (BAG) algorithm. It had the right balance of time versus length for algorithms and arguments. The BAG algorithm produced complete sentences, contributing to Purushan's work. +35. The medicine student claimed one person said business was superior to academia, and another said academia was a business. The meditation utterance helped to reduce stress to zero by doing all the work for the programmer. I connected my recent algorithms to my previous and best algorithms. This action increased the excitement about the new algorithms and celebrated the classics! I noticed the initial sales curve was haphazard, slowly growing, then decreasing. The software wasn't perfect, so more people were needed. +36. The medicine student randomly generated sounds to test for stories. The teacher needed to meditate to teach meditation. I placed my music description, MIDI, lyrics and meta files on the website. I used mind reading, funnel and word/instrument match to find better songs. The first person found out about meditation and meditated, where finding out music was similar to composing it. +37. The medicine student developed software that initially made it easy to create software that did what you wanted it to (i.e., eliminated web development problems). The teacher taught the parents, children, students and workers meditation and the way. I bought the computer with another operating system or emulation software to test my software.
I wrote my packages to prevent relying on others' updates crippling my software. Meditation and developing standalone software simplified life. +38. The medical student formed their thoughts in the unabridged version. I agreed with meditation with the Upasana sutra to see imagery of the yellow outback and a red lizard representing the academy. This algorithm helped form the academy. The employees rated the manager. The manager read the feedback, included the suggestions, considered the comments and answered the questions; for example, he wasn't frightened of neuronetworks to help their experience. The literary algorithm was a long-winded first attempt at a problem. +39. The medical student asked the question, and the person answered. The meditator visualised a moving appearance-like still, such as a livid flower. I examined business departments from the point of view of employees, customers and me. It was as if I actually was them and could listen to their inner dialogue; at least I could pretend it. I listened to the student's feedback about the form. +40. The medicine student wrote a movie exposing the unique point of the idea. I introduced the meditation movie maker, which wasn't afraid of being interested in an idea. I constructed a budget for advertisements and accreditation. I could substitute word of mouth for ads and a reasoning system for accreditation. The idea that was ranked number one was the best music, the best image and the best algorithm. +41. The medicine student valued the development environment of Prolog and arrived at thoughts themselves - as an aside, perhaps delete the choice points. Upasana helped me visualise computer data flow, from input through the computer to output. I made my lecturer by increasing the human with neuronetworks that helped them achieve the heights in the time, helped achieve 100%, culminating in a program or project that was useful and memorable. +42. 
The medicine student wrote that PhD topics must be literature and research-based, with enough detail. Meditation helped form a job framework that could do what the computer could do or what the employee could do, but the extra human contribution was nearly impossible to replace. Lucian CI/CD shouldn't allow only deletions but should allow insertions and changes for progress. Otherwise, the algorithm could be backdated. However, some automation systems could find simpler versions of algorithms, for example, postgraduate articulation to meditation. +43. The medicine student extolled the virtues of functional decomposition in computer science, for example, unification in Prolog by getting values for the left and right-hand sides, followed by putting the values in. I fulfilled the multisubject-long meditation teaching practicum. I entered many changes in Lucian CI/CD by running with possible deletions, for example, having a mini-Lucian CI/CD environment that found individual predicates that matched tests. I tested the called predicate first, then the calling predicate, ensuring no unwanted non-deterministic results. +44. The medical student could see the result of the code at the time, with code and debugging suggestions. The job holder was given meditation for ideas to appear. I wrote a Prolog to List Prolog back-conversion algorithm for verification and one for the pretty printing algorithm. It converted from Prolog to List Prolog and back, the same for the pretty printer. I could see the non-deterministic result coming. +45. The medicine student asked, \"What will be more user-friendly?\" which was answered with \"premium code checking\". The meditator developed texts to meditate, for example, about sun safety. The future computer was similar to mine (it could run BAG in the same time), and my home time had computers, which I ran BAG on to freeze my age. 
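The back-conversion check in paragraph 44 (convert Prolog to List Prolog and back, then compare) is a round-trip property. A toy sketch, assuming List Prolog terms are nested lists of the form [functor, arg, ...]; the converter names are hypothetical:

```python
# Round-trip verification sketch for a toy List Prolog <-> Prolog converter.
def to_prolog(term):
    # ["f", "a", "b"] -> "f(a,b)" (toy representation, an assumption)
    if isinstance(term, list):
        functor, *args = term
        return functor + "(" + ",".join(to_prolog(a) for a in args) + ")"
    return term

def from_prolog(text):
    # inverse of to_prolog: split off the functor, then the top-level args
    if "(" not in text:
        return text
    functor, rest = text.split("(", 1)
    rest = rest[:-1]                  # drop the trailing ")"
    args, depth, cur = [], 0, ""
    for ch in rest:
        if ch == "," and depth == 0:
            args.append(cur)
            cur = ""
        else:
            if ch == "(":
                depth += 1
            if ch == ")":
                depth -= 1
            cur += ch
    args.append(cur)
    return [functor] + [from_prolog(a) for a in args]

def round_trip_ok(term):
    # the verification the text describes: convert there and back, compare
    return from_prolog(to_prolog(term)) == term
```

Running the round trip on every test term catches converter and pretty-printer bugs without hand-checking output.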
Lucian CI/CD's bottom-up version overcame needing shorter code instead of fewer changes and allowed longer code with as many changes as needed. The early computer pioneers developed algorithms to run computers. +46. The medicine student put the related code in a predicate and gave their name, labelled and dated it. I stated that 50 breasonings were the minimum for a thought, and were the inspiration threshold. I refactored the algorithm by writing the pattern matching code, CAW-style neuronetworks and databases of C modules for Prolog, for example, word frequency finder and CAW-style neuronetworks. There was also a tightener, which deleted excess code and clauses, reranked clauses by popularity and introduced type verification or deleted constants if the code needed. The computer science student reached the threshold to draft the code three times. +47. The medicine student replaced success signals in Lucian CI/CD with predicates and success signals. The levels were required for full brain potential, including key expressionism. DFA minimiser could be found by recognising a state machine and removing extra parts with diff. The possibility of inserting cuts at different points was also computed. Some people will find the way. +48. The medicine student reached the threshold with the number of breasonings. I connected meditation to algorithms with breasonings. I wrote a version of the music composer with listenable sections as I composed. It had eight, not four, groups of two notes per bar, allowing the human user to experiment with entered and detailedly mind-read sets of instruments and play composed or generated sections on the fly. The musical meanings were collected and indicated with breasonings. + +49. The medicine student protected the text with a seen-as appearance. I wrote how the philosophy could be applied to a use. I found art's seen-as-version for GitL. Its icon was a red lightning bolt. 
This icon emphasised that GitL was highly desired because it was private and customisable. +50. The medicine student wrote an algorithm to explore logic. I wrote 4*50 As with texts in the Breasoning Algorithm Generator (BAG). BAG found combinations of parts of sentences joined similarly to where these parts of sentences joined (given AB, CB and CD, the join is AD). These results were complete sentences. Clearing up their language was an exercise in knowing the specifics of applied logic. +51. The medicine student instead wrote on topics and connections of their choice. I replaced two words with terms from an essay. These provided exercises to connect the papers. In a 67-page book with 15 sentences per page, I could write 10^3 sentences. I wrote ten sentences, ten sentences for each of these, and ten sentences for each, as an electronic encyclopedia. +52. The medicine student considered general connotata (sic) for backing words. I wrote eight words replaced with antonyms. I wrote on the philosophy, then objected to the non-specialisation. These so-called antonyms were plain words that detract from the philosophy. I wrote connections between the main words in simple, sometimes literary terms. +53. The medicine student inserted the distinguishing mark into the sentence. I wrote that some BAG results were left alone for an A for themselves. These results made sense unedited, or some of their terms made sense unedited. If they didn't require new ideas, I tightened the language. If certain words in them needed replacing, I replaced them and tightened the rest. +54. The medicine student smelled the Egyptian perfume. I sold immortality. The students were prepared to write philosophical arguments for immortality. They could use advanced techniques to generate enough philosophy to support immortality when they had studied and written enough. Over time, they could mix philosophy and mix the mix to detect new topics to write about. +55. 
The medicine student replaced Prolog frames with C loops for speed. Half of the texts were details. I checked each sentence with ten sentences. Or I wrote an algorithm specification and an algorithm. I replaced frames with maplist (to be converted to a C loop) in Prolog for speed. +56. The medical student convinced the official to allow meditation for better health. I stated that the only high distinctions necessary for immortality were time travel, age-freezing medicine and meditation. I also wrote on all topics of interest, attaching them to my specialisation. My specialisation was induction. I generated texts which I modified to support me in immortality. +57. The medicine student stated that medicine was the smartest one. I noted that the rest of the 4*50 As were for sales. I used them to mind-read willing customers to expand my understanding of their goals and progress. I tested their understanding of specific topics. I tested if they knew they needed certain knowledge. +58. The medical student found out perfect knowledge with the first action. To ensure no interference, I wrote a high distinction to object to the binary opposition to mind reading. You can do objects to support the simulation. I collected the transceiver types to help with data collection. I lived on the planet forever, in terms of my setting. +59. The medicine student stated that mind reading was necessary for immortality for Text-to-Breasoning. I noted that everyone contributed As to mind reading. The student's high distinctions contributed to the mind reading high distinctions. Others, such as past students within range, also contributed As. I could do it now, at least interfacing using BAG. +60. The medicine student kept on developing. I found out whether the customer was a lead. They had a qualification that matched or one that would match inside. They went to the institution to find work. When inside, they went up in the institution and became self-sufficient. +61. 
The medicine student helped the student recognise the product's need. If the customer was a lead, I ran each algorithm step to find an algorithm. I gave unrelated As to them as a tabula rasa (blank slate). I concentrated on getting the logic completely accurate. I focused on combinations of methods. +62. The medicine student wrote a new algorithm to find the best combinations of instruments. I ran three-level funnels using detailed mind-reading for each step of the music-writing algorithm. I generated music that I was inspired to write about them. I put the music back into the funnel three times to test for more developed music. I chose the music with the best instruments. +63. The medicine student relied on the student to write the algorithm telepathically. I asked the student to enter the algorithm's aim and the data's input and output. For example, the objective might be to find an algorithm that suggests methods in GitL from mind reading. The input is the student's thoughts. The output is the next command. +64. The medicine student helped correct errors in an algorithm. I wrote the thing that interested me, one command. The mind reader was fast and accurate and had time to collect details on the command. It wrote these down and showed me the completions and relevant next one of my thoughts. The mind reader was based on the concurrent maplist command for speed. + +65. The medicine student treated the levels like a stack, which was better implemented as levels. I used the subterm with address command, which found multidimensional addresses of items in \"terms\" to transform levels. I ordered the levels. Or, I randomly ordered the levels. Or, I cycled the items. +66. The medicine student wrote the levels [a,[1,[b,[2,[c,[3]]]]]]. I zipped the levels together. The first list was [a,b,c]. The second list was [1,2,3]. The zipped list was [a,1,b,2,c,3]. +67. 
The medicine student added on atoms, incremented a number or minimised logic with SLD (removed predicates that only called other predicates). I checked that the move was the only change. I compared the two terms level by level and item by item. I noted moved items. I detected new items or items that were the transformation of existing items. +68. The medicine student detected the type of algorithm. I found the significant change in terms of more minor changes. I worked backwards, detecting sorting, flattening or concatenation. Then, I found transformations, such as adding a prefix, finding a sum or returning the second item. I increased the intelligence, ordering the predicates by dependency and diffing them in Lucian CI/CD. +69. The medicine student labelled chunks with their properties. I identified what the items in the chunk had in common. They might all be equal to a number. Or, they might be greater than a value. Or, they might match a passphrase from a list. +70. The medicine student labelled the items with his voice. I found the changed items. I focused on a level. I compared the two lists. I found different items and their difference from the first item. +71. The medicine student simplified the data. The items were ordered using an index. This index was inferred from a previous ordering, passed from another list that was part of it, or an earlier version. The prior ordering may be from a list containing indices (a hash table). The previous version may be from the disk or a previous algorithm. +72. The medicine student discarded instances of an item from unwanted levels. I deleted an item from a list of lists. I collected the instances of the item from a particular level (length of address). I replaced them with a symbol. I deleted the symbol, recursively deleting empty lists if necessary. +73. The medicine student reported the missing item, possibly triggering an API change. I found whether the item was missing in several places in the term.
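Paragraph 72's deletion of an item from a particular level of a list of lists, recursively pruning any lists left empty, can be sketched as follows; the helper name and the one-based level convention are assumptions:

```python
# Delete an item at a given nesting level (depth) from a list of lists,
# then prune lists that the deletion leaves empty (hypothetical helper).
def delete_at_level(term, item, level, depth=1):
    if not isinstance(term, list):
        return term
    out = []
    for x in term:
        if depth == level and x == item:
            continue                          # drop the item at the target level
        x = delete_at_level(x, item, level, depth + 1)
        if x == []:
            continue                          # recursively delete emptied lists
        out.append(x)
    return out
```

For example, deleting 2 at level 2 of `[[1, 2], [2], [3, [2]]]` gives `[[1], [3, [2]]]`: the list `[2]` is emptied and pruned, while the deeper 2 inside `[3, [2]]` is untouched.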
I found the difference term (the list of differences between the terms, stopping at a level if there were differences). I computed whether any of these items failed to match an item in this position in the first term. To do this, I checked whether the item matched any of the first items in the difference term. +74. The medicine student double-checked on a replacement of a name. I checked whether an item was missing in all places in a list. I found the difference list. I didn't check that all changes were from the item. I checked that there were no other occurrences of the item apart from changes from it. +75. The medicine student checked for no instances of an item. I checked whether an item was missing in all places in a term. I found the difference term (a list of all changes from and to an item). (This differed from a previous idea, which stopped on a difference.) I checked that the final list contained no instances of the item. +76. The medicine student finished checking whether an item was missing in some parts of a term and not others. This was in a particular part of the term. I found the part of the term from its address. I checked, and the item didn't occur in the term. I deleted the part from the whole term. +77. The medicine student inserted a command to convert a term to a string before concatenating it to other strings. I connected the list. All the items should be strings or atoms, or there would be an error. I started with \"\". I concatenated the first item to it, and so on. +78. The medicine student checked the lists against commands. I detected sorting without duplicates. The first list was [1,1,0,2]. The second list was [0,1,2]. If multiple sets of lists suggested it, sorting without duplicates was detected. +79. The medicine student shortened the computation time. I found part of the list supported a hypothesis about the function used. In the previous example, I took the first two values of the result, [0,1]. 
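The detection of sorting in paragraph 78 (and of sorting with duplicates in paragraph 80) checks a hypothesis against every input/output pair, for example [1,1,0,2] giving [0,1,2]. A minimal sketch with hypothetical names:

```python
# Detect sorting hypotheses from input/output pairs; a hypothesis is
# accepted only if every supplied pair supports it, as the text describes.
def detects_sort_without_duplicates(pairs):
    return all(sorted(set(inp)) == out for inp, out in pairs)

def detects_sort_with_duplicates(pairs):
    return all(sorted(inp) == out for inp, out in pairs)
```

With the pairs from the text, `([1,1,0,2], [0,1,2])` supports sorting without duplicates and `([1,1,0,2], [0,1,1,2])` supports sorting with duplicates; a pair that violates the hypothesis rejects it.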
These were in order and had no duplicates. I concluded that the function was sorting without duplicates. +80. The medicine student also detected list_to_set, which found [1, 0, 2] from [1,1,0,2]. I detected sorting with duplicates. The first list was [1,1,0,2]. The second list was [0,1,1,2]. If multiple sets of lists suggested it, sorting with duplicates was detected."] \ No newline at end of file diff --git a/Lucian-Academy/Books/MEDICINE/Medicine of Pedagogy 1.txt b/Lucian-Academy/Books/MEDICINE/Medicine of Pedagogy 1.txt new file mode 100644 index 0000000000000000000000000000000000000000..79a82e2364298e074b7d3ec358a017c9bc6dea3e --- /dev/null +++ b/Lucian-Academy/Books/MEDICINE/Medicine of Pedagogy 1.txt @@ -0,0 +1,87 @@ +["Green, L 2024, Medicine of Pedagogy 1, Lucian Academy Press, Melbourne.","Green, L 2024",1,"Medicine of Pedagogy 1 + +1. The medicine student sorted the items without removing duplicates. I found whether the line contained the first items in the other lines. I found the first item in each line. I sorted them. I sorted the items in the main line, determining they were identical. +2. The medicine student stored the data's labels with it. First, I determined whether the word was an acronym or expanded acronym and expanded or contracted it respectively. Next, I identified the two states of the system. Then, I simplified the state's identification a bit. In addition, I remembered the acronym's number and the tag for an acronym. +3. The medicine student applied the idea set to itself three times in the book. First, I ranked the ideas by importance. Longer ideas counted for more. Next, I measured the string length of the concept. In addition, ideas that generated more brilliant ideas counted for more, so I applied the algorithm to others. +4. The medicine student created a catchy brand name. I simplified the complex idea into a few characters. First, I identified the main idea. Then, I simplified it into functional sounds. 
For example, diff group combos became DevOps. +5. The medicine student favoured results with the search terms or synonyms close together. So, first, I counted the frequency of two words. Then, I searched for the number of sentences with both words. Or, I searched for the number of paragraphs with both terms. Or, I searched for the articles with both words. +6. The medicine student wrote the hierarchy of uses. I selected the stronger of the two uses. It was the use with more uses. I wrote the use. I wrote the uses for the use. +7. The medicine student optimised the process. First, I wrote the idea which most accurately conveyed the connection. Then, I wrote the functional call containing the connection as an algorithm. Separately, I converted a string into a term and then into an atom for writing to output by converting from a string into an atom. I called this \"removing the double quotes\". +8. The medicine student justified originality with research. I shortened the line, citing the source for the connection, and summarised others' work in the PhD. In addition, sourcing was in non-PhDs. In addition, sourcing was in research. The helper provided all the tools for originality and fewer errors. +9. The medicine student designed an algorithm that impressed the importance of an algorithm on the viewer and repeated the hits. I differentiated the two lists and showed the differences between the original list and those formed from combinations of changes. For the second step, I reran the differentiation algorithm with the numbered items from the initial and final lists. I generated music, art and literature for each idea. These ideas increased conclusions about each concept. +10. The medicine student made a spiritual snake icon for medicine. I animated thoughts about a line of code. First, I used animal metaphors for possible ideas to apply to the line. Next, there were animations about the arguments leading to other ideas. 
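The two-word search in paragraph 5 counts the sentences (or paragraphs) that contain both terms. A minimal sketch, using a naive full-stop split and substring matching (assumptions; the document's own tokeniser may differ):

```python
# Count the text units (sentences by default) containing both search words.
def cooccurrence_count(text, w1, w2, sep="."):
    units = [u.lower() for u in text.split(sep)]       # naive sentence split
    return sum(1 for u in units if w1.lower() in u and w2.lower() in u)
```

For example, `cooccurrence_count("The cat sat. The cat and dog ran.", "cat", "dog")` counts one sentence; passing `sep="\n\n"` would count paragraphs instead.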
Finally, I formed a graphic symbol representing the final application of the animal interacting with the concept. +11. The medicine student transformed the house's feeling using simulated intelligence, including making outdoors indoors. I differentiated the two houses. I started with the direction of the building and the layout of the rooms. Then, I connected with the feeling of the time, the neighbourhood and the occupants' interests. Lucian CI/CD reminded people of ants. +12. The medicine student's interest in Prolog optimisations coincided with developing skills in sales. I wrote that my heart was like a house. I chose reality in terms of philosophy. I sourced the natural dyes or preferred fine art materials and found ideas and friends I resonated with. I wrote a CI/CD algorithm and read about DevOps software. +13. The medicine student reapplied for the job, only to be turned down. So I decided to start breasoning out 5 As per day again. These toned the energy for the day and helped my brain with the point it needed for thinking. The 5 As also supported tutoring, sales and writing arguments and algorithms. I had many resources at my disposal, including 16*50 As or 4*4*50 As, which students could examine. +14. The medicine student checked whether the algorithm used the changes they expected the algorithm to use or whether they needed to add, delete or modify a test. I ran an algorithm that helped with thinking. It provided thoughts for sustenance and helped productivity. It helped before going to sleep, enabling me to work during the day. I celebrated my CI/CD algorithm, which had many downloads and which users could use for DevOps on the web. +15. The medicine student looked up that they should pronounce h with a line in it, \"h-bar\". I changed back to the possibility of differentiating texts broken into tokens (words) rather than lines after making numbers for the same content different. This mode was slower and also had a cap of seven changes allowed.
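The token (word) diff mode with a cap of seven changes, described in paragraph 15, can be sketched with Python's difflib standing in for the document's own diff; the fallback behaviour when the cap is exceeded is an assumption:

```python
# Token-level diff with a cap on the number of change operations, in the
# spirit of the mode described in the text (difflib is a stand-in).
import difflib

def token_changes(old_text, new_text, cap=7):
    old, new = old_text.split(), new_text.split()
    ops = [op for op in difflib.SequenceMatcher(None, old, new).get_opcodes()
           if op[0] != "equal"]
    if len(ops) > cap:
        return None        # too many token changes: caller falls back to line mode
    return ops
```

For example, `token_changes("foo(A, B)", "foo(A, C)")` yields a single replace opcode on the changed token, while a heavily reordered text exceeds a small cap and returns `None`.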
This way, the algorithm could select from multiple changes to, for example, arguments in predicate calls in the token mode. If it took too long, I operated on pairs of lines this way. +16. The medicine student tried reordering clauses, arguments and calls in combination with different configurations. I also added a mode to the CI/CD program, which found combinations of inserted, deleted or modified predicates (clauses) in the algorithm. This method found the correct configuration of predicates following finer adjustments to lines and predicates. In addition, an algorithm could reorder clauses to test predicates. + +17. The medical student had A, B and B to B to parts of nature. I thought of epistemology to prepare for the next part of life. I verified the output of the List Prolog to Prolog converter. I also checked the result of the List Prolog pretty printer. I also checked the result of the List Prolog to Prolog pretty printer, all by back-translating. +18. The medical student connected to the two perspectives in testing. I played the character in the play. I generated tests from the interpreter's data. I wrote a test for the algorithm. I traced this through State Saving Interpreter (SSI), saving the inputs and outputs to predicates and the results. +19. The medical student verified the play. Text and speech checked the text. I simplified findall in the List Prolog to Prolog converter. I fixed the bug that another change introduced. I wrote the code to handle all cases as simply as possible. +20. The medical student viewed the data represented realistically. I exhausted the algorithm mind map feature, like science. I chose a niche and exhausted items from the set. I saved the List Prolog code as Prolog code in the Lucian CI/CD data file. Instead, I converted the finished code to Prolog to display in HTML format for easier retrieval. +21. The medical student wrote the file if the folder existed. I checked that the list had at least one item. 
I saved the whole repository in the Lucian CI/CD data file. Instead, I displayed the repository in HTML format. I allowed the user to revert to a previous version of the repository. +22. The medical student recorded tests with positive and negative results. I changed the List Prolog comment call in Lucian CI/CD to take one, not two, arguments. It could be converted to Prolog. I could pretty print it. The interpreter, converter and pretty printer could more easily access the data file. +23. The medical student displayed a visual cue to help with Lucian CI/CD notifications. I reported how many tests the indicated files failed (the number of errors), how many tests were missing, and how many errors and warnings there were about the code. I included the Prolog form of the Lucian CI/CD data in the weblog. Users could use Simulated Intelligence to determine whether a comment belonged with a line and include it (if they were together, separated from the other lines or if the comment was before the line). There were audio or email warnings about missing tests and about the user not resetting Lucian CI/CD to include tests, comments and data before running. +24. The medical student commented on the computer voice in her earphones. I separated test input and output for one predicate into that of its dependent predicates or possible predicates. I found errors and completed code using Type Statements, Combination Algorithm Writer and Program Finder. I founded text automation across operating systems, written in lucianpl (a C program). I wrote the output of Lucian CI/CD in the weblog. I brought TextOffice out. Text is easier to copy and paste. +25. The medical student replaced specific string processing predicates with a single command for speed. I reduced atoms to strings where possible. I eliminated single quotes and brackets where possible. I allowed the user to diff, add, delete, change, merge and ignore different operations on files.
I saved common commands in the Lucian CI/CD programming language. +26. The medical student kept the converter's code simple. The predicate had two uses, where the interpreter could pass it a predicate name for it to call. The program was automatically developed from the input and output. The predicate was slightly different from another use and could be modified with this use in mind. I found the bug by comparing the old and new code in the List Prolog to Prolog converter. +27. The medical student programmed the computer to speak to them when it had a suggestion or completed work. I synthesised two words, two commands or two predicates for an algorithm. The contention of the game was whether it was challenging enough but easier in earlier levels. The game was enjoyable. I wrote a system that provided pedagogical support twenty-four hours per day, which was switched off if it became overzealous. +28. The medical student copied the simulated intelligence algorithm and helped others. I expanded the functional call or the complex code or wrote a mind map for the confused programming employee. I included the employee using the positive conclusions conclusion. I mind-targeted confusion, mistakes, errors, warnings, bugs, breasoning errors and faulty texts and algorithms. I helped manually correct them with software and helped guide the employee. +29. The medical student returned whether the issue in the question had been identified. If not, I returned whether another employee, unexpected source or outsider helped. I also returned the time to correct the problem. I expanded the functional call, diverging the predicate into its two functions. I gave the predicates and variables more specific names. +30. The medical student shortened or deleted data by including, for example, placeholders for internationalisation and rendering the code without them. I attempted to condense or delete the line by grouping variables, renaming names and reusing another module. 
I expanded the complex code by identifying differences between it and the code around it, longer or unexpected lines. I simplified the code, often by placing the extra parts in another file or deleting them. I kept members of the same group of predicates in one file. +31. The medical student completed laborious work with a neuronetwork. I expanded nested parentheses by deleting a pair of them. I helped the stuck employee by pointing them in the right direction. The employee could do this process. They had a coffee, chatted or ran an algorithm to give them ideas about what to do; for example, start with A, remember to write down thoughts and tick off items. +32. The medical student gave their breasonings to parts of an object in the course. I wrote the longest, most exciting possibilities, a timeline and a budget and embarked on my voyage of discovery. I breasoned out two to three breasonings per idea in the lecture. Breasonings helped establish my credibility as familiar with the course materials. I was more critical and performed better. + +33. The medical student worked on the spiritual Prolog algorithms. I meditated and initiated the age freezing, which took the latest work and Lucian Academy scripts. I deleted the curly, unnecessary quotes from the Text to Breasonings dictionary. I also deleted them from the file when running Text to Breasonings. I used an algorithm from the sheet feeder in Essay Helper to delete them. +34. The medical student sped up the age-freezing algorithms with multi-threading queues. I wrote the recent algorithms and arguments, connecting to and expanding them. I scanned the algorithms and arguments for new lines and combined these with the rest for new ideas. I removed illegal characters from Text to Breasonings (T2B) and Text to Algorithm Breasonings (T2AB). I deleted non-alpha characters. For example, I deleted unusual hyphens and questionable symbols. +35. 
The medical student nutrified (GL-detailed) and took highlights from (using detailed mind reading) BAG arguments and algorithms. I used detailed mind reading to add and delete parts of arguments and algorithms, with highlights on lines written from the start. I discovered what the lines were. I developed them enough, not skipped over the point. On another note, I checked changes in T2B and T2AB dictionaries for correctness. +36. The medical student developed themselves and consciously examined a recommended image for the day. I applied the writing about-writing idea to the image to develop an understanding of the importance of images. I found the number of sentence breasonings for age freezing that corresponded to three teleports, two anti-ageing medicines and one meditation. I counted the exact number of sentences in each text. If there were extra full stops in a sentence, I went into the meditation A for extra breasonings. +37. The medical student gave As to their thoughts each day. I wrote and published the anti-ageing algorithm. The prerequisite was uncovering enough new philosophies. I recommended the correct number of breasonings for certain meditators for age freezing. I synchronised my test data with the online recommendation. +38. The medical student logged the meditation science discovery. I specified the number of times a list repeated, from zero or one to a number or infinity, with different clauses. I wrote the best short algorithm for each word in Text to Algorithm Breasonings, which was different from the set of object dimensions. I inserted a keyword for an algorithm to save time. This T2BA algorithm saved time when developing details for arguments and algorithms. +39. The medical student discovered that 4*50 BAG As were the \"top\" and were the finished product in many senses. The professor used age-freezing, which made discoveries stand in time. 
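Paragraph 36 counts the exact number of sentences in each text by full stops. A naive sketch of that count (the splitting rule is an assumption; abbreviations with extra full stops would need the special handling the text mentions):

```python
# Count sentences by splitting on full stops and ignoring empty fragments.
def count_sentences(text):
    return sum(1 for part in text.split(".") if part.strip())
```

For example, `count_sentences("I wrote the algorithm. I tested it.")` counts two sentences.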
I enjoyed the unique neuronetwork possibilities and customised the algorithm to write rhetoric about my hand-written texts. I moved the age-freezing algorithm to the Virtual Private Server to avoid the buzzing noise and automatically continue where it had finished. I considered ensuring that Text to Algorithm Breasoning only accepted single words or groups of words with underscores; however, verifying the algorithm name and algorithm needed to be looked into. +40. The medical student translated their texts into many languages to teach. I grouped consecutive sets of parentheses and simplified nested parentheses. Like State Saving Interpreter Web Service, I called the JIT code \"inside\" or before and after choice-point code. I experimented with functional calls between compilers. I lived comfortably in my home, with access to an Internet movie studio for my academy. +41. The medical student won the prize with the highest quality thoughts. The person considered the ethical and moral aspects of age freezing and decided that positive overbore negative. Age freezing in one's first lifetime in their home time was a big present. Separately, in JIT, cut (expressed as if-then) stopped further clauses and some choice points. I could use JIT to cut choice points to consecutive clauses. +42. The medical student thought it through and accomplished all steps to their business goal. I developed a qualification for age freezing. I could literally control my body. Separately, interfacing with JIT, the append and member choicepoints saved the recursive data. After exiting from JIT, the choicepoint was returned to like a \"peg on a line\". At the end of the algorithm, the previous choice point was returned. +43. The medical student negated choicepoints on not. I recognised that the systems for age freezing were necessary for age freezing. In JIT/SSI top level, the compiler started again if any choice points were left. 
I only used the Choicepoint compiler when there were both choice points and cuts. I made time for study in education, which helped my career, also in that I transferred education knowledge. +44. The medical student checked the play against the ways of thinking. I acknowledged my Arts degree at work in my spiritual journey. I wrote about spirituality and algorithms connecting, giving me freedom in my life and business. In Lucian Prolog, I converted lists to arrays where possible. For this reason, grammars were very fast. Grammars didn't backtrack because of the cut. In this case, JIT could be used because the cut removed the choice points from the grammars. +45. The medical student simplified the line to points and analysed their relationship. Details were in terms of simple properties. In List Prolog, grammars operated on arrays. I simplified the philosophy, working on the simple perspectives related to it. This perspective was related to the simplest form of the idea. +46. The medical student wrote about computer science. I asked the student to fill the gap when writing a connection. At the back end, I had prepared for the student given their circumstances. Separately, I wrote choice points in reverse order for speed. This method applied to choice points within predicates. +47. The professor applied ideas to ideas, making a square of applications. They wrote on arguments and algorithms and increased their ideas each day. There were from zero to five thoughts each day. I found the amount of memory required, rather than running out of stack. I performed this calculation from the data and algorithm. +48. The medical student wrote summaries of their algorithms to record the research. I wrote the synopsis of the algorithm, describing the noumenal formula and why it was chosen. I gave the illusion of choice points in JIT in trace mode. I recorded calls, exits and backtracking. 
If the user pressed redo, the point of execution would return to a choice point, or if the system wasn't in trace mode, JIT might be invoked. + +49. The medical student investigated neuronetworks in C. I applied the three-part perspective to well-known people who work in publishing, choosing a side on a prominent debate. I tested Lucian CI/CD with multiple possibilities to test and integrate the changes in the algorithm. I changed, inserted or deleted several lines to test that Lucian CI/CD found the correct algorithm. I chose the simplest answer in computer science. +50. The medical student found law through the method. I completed the algorithms and arguments attributed to the well-known person with BAG, first predicting their thoughts from mind reading and Vedic astrology. In the bottom-up version of Lucian CI/CD, Lucian CI/CD checked sets of repositories bottom-up, which meant there were up to seven possible changes to a group of current predicates to find a working combination. Current predicates were each taken depth-first in post-order, or were sets of clauses or predicates involved in loops. I examined the argument of the person growing up as an algorithm. +51. The medical student supported software downloaded from the internet. I connected the handles to the image with two uses: reading and counting. I asked Lucian CI/CD users to use their user name, give the primary file in the repository and list the repositories to omit. Lucian CI/CD used their user name to locate dependencies by that or another user. Where there was one option, there might be others. +52. The medical student reused the algorithm, noting the arising implications. I mind-mapped the essence of the text, mentioning key terms. I explained that other programming languages weren't fully supported yet in Lucian CI/CD (even though the non-bottom-up version supported one set of changes to the entire algorithm). A language with \"%\" denoting comments allows up to seven changes to the whole algorithm. 
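Recording the call, exit and fail ports can be sketched with a wrapper predicate (a simplified sketch, not the author's JIT tracer; a full tracer would also log redo on backtracking):

```prolog
% Sketch of paragraph 48: log ports around a goal.
trace_goal(Goal) :-
    format('call: ~w~n', [Goal]),
    (   call(Goal),
        format('exit: ~w~n', [Goal])
    ;   format('fail: ~w~n', [Goal]),
        fail
    ).
```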
The text had a clear algorithmic seen-as version, with rules about interpreting it with key terms. +53. The medical student converted the other language to List Prolog to run Lucian CI/CD, then back. I studied mathematics, one of the most exciting subjects. The Lucian CI/CD algorithm inserted commas in appended code to be translated from List Prolog into Prolog. Only clauses were appended, and append automatically inserted commas. Comments from the latest version were always kept, and tests in comments were taken from the latest version. +54. The medical student found Prolog-specific bugs, programming language grammar bugs and list order bugs. Mathematics could contain the One through a simulation. I fixed bugs in Lucian CI/CD's merge files predicate by wrapping double brackets around the item to append, adding an include statement to List Prolog and fixing a bug in finding the last instance of an item in a list. The code explainer found the reference to the code in the philosophy, helping to avoid bugs in the future. +55. The medical student kept the changes from another main file or predicate and tested after testing finished predicates. Breasoning was a critical human skill in the future. Lucian CI/CD saved the working predicates after they were tested successfully. The finished predicates were set as old predicates if there were multiple main files and predicates. To arrive at the code, I breasoned and tested a paragraph about the idea. +56. The medical student iterated along the items of the array. The future interpreter's compiled C code used only the necessary functions and monitored itself and its code. Lucian CI/CD tested the changes. The first simplest combination of these was kept. The interpreter was written in Prolog code that could easily be converted to C but dispensed with immutable variables whose values couldn't change. +57. 
The medical student investigated the need for new and removal of old code policies, allowing for relaxation or fluid changes to rules. The interpreter wrote itself, externalising code reused by algorithms. I set the old predicates variable to the old version of the predicates, bug-checking anything else. I checked the code, finding the old and new predicates and the origin of the new version of the old predicates. The code policy as a service found and fixed code that didn't meet the standard. +58. The medical student found the new oneness. The external module's \"the One\" policy catered to similarities across companies, even though companies themselves were diverse. The theory covered simulations of the future, and the practical asked people their opinions. Because of the need for internal change, externality should be modified. Lucian CI/CD produced an HTML document showing the changes it made. Something in the outside world was turbulent (needed correct code). +59. The medical student was given the answer on a plate. More than one business customer in the industry went through pedagogy, medicine and meditation. A new product was needed when an old product was at 75-80% of its peak sales. The premium code checker converted code to C and suggested existing libraries, rated by the code checker. The pedagogue took and instantly related their work to research, treating the simulation as another value. +60. The medical student automated the products of their philosophy (the department books). The business algorithm, which wrote and debugged long algorithms, went viral - a large product needed ten rotating products. The next one was required when one reached 75-80% of its peak sales (a time of interest). The number ten was magical, and the product itself was also updated. +61. The medical student wrote a movie about Lucian CI/CD results. Lucian CI/CD was rewritten with predicates instead of findall statements, which could be converted to C. 
Another algorithm found algorithms to meet the schedule of regular goals. When these had been found, the people's ideas were quickly examined. Lucian CI/CD interested the people in its results with As. +62. The medical student became a researcher as their job. The deterministic interpreter used properties of arrays, such as length, append and member, where non-deterministic results were optionally also collected. The manager collected 4*50 As for a job. The systems helped with the other As; the main job's seen-as version was training. The interpreter found the first n results and submitted research to the accredited academy. +63. The medical student worked up from plans to code to polished code. The interpreter accepted algorithms in natural language, which were readable by a non-specialist. The coder wrote 200*50 = 10,000 As for enough breasonings for accreditation. These kept individuals and degree students occupied and were covered in business. The programmer was closer to the code and experienced the aporia of debugging. +64. The medical student found multiple base cases. In Lucian CI/CD, success signalled to stop trying to find working combinations of changes to code. Also, in Lucian CI/CD, numbers in the working predicates must be removed before merging (integrating) the differences between old and new predicates. A simpler version of Lucian CI/CD only tested, not combined, the changes. Lucian CI/CD also employed a clause reorderer and base case inserter. + +65. The medical student controlled backups and integrations in open-source software. I gained more control by writing GitL, a decentralised git based on the computer, not the server. I wrote the diff page to show live and saved changes. I wrote the version history to show older saved changes. There were suggestions and scans before going live. +66. The medical student uploaded the live version to their server with GitL. I moved the files to the live folder. I kept the version history on my computer. 
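Replacing a findall with an explicit recursive predicate, as in 61., might look like this (squares/2 is a hypothetical example, not the rewritten Lucian CI/CD code):

```prolog
% findall version: findall(Y, (member(X, Xs), Y is X*X), Ys)
% Recursive version, which maps more directly onto C:
squares([], []).
squares([X|Xs], [Y|Ys]) :-
    Y is X * X,
    squares(Xs, Ys).
```

The recursive form avoids the meta-call and solution collection that findall requires, so each clause corresponds to a simple loop body in C.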
Customers were paying for the algorithm I wrote with GitL. I could avoid paying for a private repository elsewhere and retain control with GitL. +67. The medicine student highlighted files, running and editing them. I stated that GitL is \"Gitless\". \"Gitless\" meant that, while it appeared to be a git resource, the commands moved files between the user's computers. A Lucian CI/CD version merged more than one repository version by combining two parts that went well together. Others could collaborate by downloading the files and sending them to my computer, where I could inspect and integrate them with Lucian CI/CD. +68. The medicine student made their version history available to help students. GitL is a \"Git\" that resides on a user's computer. The git contains the user's repositories and version history. The GitL software could import version histories from other services. I could edit labels of commits, delete commits, see sizes and choose not to make version histories public. +69. The medicine student charged more money for higher use of the server. I uploaded the repositories to my server. I could control the appearance, pricing and analytics. I could create a personalised appearance for customers who run the software online. I could cater to customers' every need, including giving them what they wanted, customising the software options, and helping them seamlessly run the software. +70. The visually impaired medicine student commented on code and made selections within matching brackets (could \"read\" around selections and move to parts of pretty printed terms using +/- drawers). GitL was written in Prolog. It had a web or terminal interface. Users could see differences between consecutive versions, make commits, edit code, view version history and run Lucian CI/CD. The terminal version had a text editor in it. +71. The visually impaired medicine student methodically checked all data in each test case. 
The visually impaired programmer wanted to make notes in the data and turn on and off reading these notes and types of data. For example, variable lists in interpreter stacks were labelled. I labelled data as belonging to a particular variable or part of data. The blind programmer systematically connected all relevant data in a separate document. +72. The medicine student wrote GitL in terms of text files. I talked about visualisation next. The student repeated 800 vaj and upsana sutras over a few days. These helped give them the confidence to move through mental models using their mind's eye and come to conclusions about data flow analysis over time. They could remember key details or write down notes they might forget. +73. The medicine student stated that diff included inserted and deleted lines but not changed groups of lines. GitL isn't based on Git but has similar diff checking and version history. I searched the version history, finding my or the software's labels for changes. I deleted redundant code and comments. GitL used a similar table of contents to the Lucian CI/CD diff HTML file, which could be directly included in GitL. +74. The medical student wrote their version of each type of software, including making their computer. The advantage of GitL is that it is decentralised. I didn't need to worry about another server's status. I could customise the software. And I could maintain my privacy, avoiding problems of software privacy breaches. +75. The medicine student could build and run software offline, saving power. GitL was open-source. It didn't sell the data to other companies. It could be examined, modified and sold. I wrote a markdown parser where markdown documents could be edited and displayed in the terminal and browser. +76. The medical student developed software until it was of the right quality. GitL costs nothing. It was free to download and use. 
The server it uploaded repositories to belonged to the user, who could keep it running for free on their computer, where disconnections and downtime meant the connections needed to be reset. If users of the server were using resources, they could pay a subscription fee. +77. The medicine student initially allocated resources to testing, testing only the top predicate. GitL can run tests on the computer, saving time. Also, it could run self-tests while running, alerting the programmer about problems. There were policies about what collaborators could edit, aligning with the repository mission and vision and how to handle questions, conflicts and integrations. +78. The medicine student moved above the open-source software by computer science and business. Finished builds could be copied to a live folder close by with GitL. The folder could be on a server or in a folder that the computer treated as a server. The appearance was professional, with a version number. They could be tested on systems with different architectures and operating systems. +79. The medicine student wrote GitL and its plug-ins in the Vetusia Engine, allowing testing in the terminal. With GitL, Git software can be developed and tested on one's computer. For example, software included CI/CD, feedback tracking, performance tracking, version trackers, plug-ins for software, complementing websites and other optimisers. My software was not only on the open-source server; it also needed a simple-to-edit website. I could develop Git software, testing GitL and the software. +80. The medicine student wrote high-performance GitL code in C. I wrote the recursive, input-intensive part of GitL, the algorithm it handled, in State Saving Interpreter Web Service (SSIWS). I ran the algorithm on the server with an account. It ran seamlessly in the web browser from a menu of icons. I could run a Prolog interpreter in C in CGI online, with an algorithm running across several pages. 
+"] \ No newline at end of file diff --git a/Lucian-Academy/Books/MEDICINE/Medicine of Pedagogy 2.txt b/Lucian-Academy/Books/MEDICINE/Medicine of Pedagogy 2.txt new file mode 100644 index 0000000000000000000000000000000000000000..e13a6a5747529827eecd857817db17bd2c31c620 --- /dev/null +++ b/Lucian-Academy/Books/MEDICINE/Medicine of Pedagogy 2.txt @@ -0,0 +1,19 @@ +["Green, L 2024, Medicine of Pedagogy 2, Lucian Academy Press, Melbourne.","Green, L 2024",1,"Medicine of Pedagogy 2 + +1. The medicine student checked the data types with a state machine that accounted for functional calls. I developed a functional types system for State Saving Interpreter (SSI). Functional types were functional calls inside type statements. For example, \"a :- b(c). b(C) :- C. C :- terminal.\" They provided intermediate predicates that could call other \"type predicates\". +2. The medicine student distributed type calls in the type statement, simplifying the type statements. I could call types in the form \"a(b(c)).\". In the first example, the type statement \"a\" is passed the argument \"b(c)\", then \"b\" is passed \"c\", which is then called. Or, I could call them in the form \"a(b,c).\". In the second example, the type statement \"a\" is passed the arguments \"b\" and \"c\", which are then called. +3. The medicine student combined functional and recursive types, in \"a(C) :- C. a(C) :- b, a(C).\" to recognise \"b, b, c\". In recursive types, base cases could have extra items. Recursive types were, for example, \"a :- b. a :- b, a.\" This checked that the data types all had type \"b\". The base case could be modified to \"a :- b, c.\" so that the data types \"b, b, b..., b, c\" were recognised. +4. The medicine student passed multiple \"type arguments\" to be tested. I inserted a predicate checking data types using a command in a predicate. For example, \"a(B) :- number(B).\" Or the functional types could be passed and tested, for example, \"c(A1, D(E)) :- A1=[A2], D(A2, E). 
a(B, E) :- E(B).\" with the query \"c(1, a(number))\" tested \"[1]\". This method was useful if various types were wrapped in square brackets. +5. The medicine student was suspicious of free variables and constrained them with types. Functional calls could use the same notation in the way of functional types. For example, when the predicate \"a(B,C,E) :- B(C,1,E).\" is queried with \"a(+,2,E).\" it returns \"E=3.\" This feature allows peeking inside algorithms, selecting the right predicate from a set or quickly finding algorithms using induction. An inductive algorithm could take \"can_be(A,+,-,*,/). A(1,1,2).\" and return \"A=+.\". +6. The medicine student simplified the command not to have data. The inductive language could find algorithms, find sets of commands or optimise sets of commands. When writing an algorithm, I inserted a \"mould\" variable, the answer to which I \"set\" later in the algorithm. I could constrain sets of commands by number of commands, recursion, type of search or domain, such as translation or grammar check. I could optimise sets of commands using natural language and styles, such as \"set the reference value to 0\", \"change from explicit to implicit list decomposition\", or \"prepare the Prolog code to be converted to C\". +7. The medicine student also used a pretty printer to align the code. In Lucian CI/CD, I preserved the formatting of converted algorithms. I kept the formatting with the converted List Prolog code. This formatting could be reused when the code was put back together by Lucian CI/CD. The formatting preserved the look of the code, the comments, newlines and intra-line comments. +8. The medicine student visually presented the diff'ed code and found combinations of changes. I used the Linux diff command in Lucian CI/CD to find differences between text files rapidly. I found additions. I also found deletions. It was fast. +9. The medicine student found the best build from changes. 
In Lucian CI/CD, I found every changed line rather than chunks of changed lines. I found these using diff. I didn't want groups of changed lines in case combinations of these changes couldn't separate specific changes. I found combinations of every changed line (addition or deletion). +10. The medicine student used a pretty printer to align code from other programming languages. I supported other programming languages in Lucian CI/CD. I converted them with their converter. I found the function or predicate dependencies and the configuration of lines in the functions bottom-up. Their formatting was preserved. +11. The medicine student also compensated for comments embedded in predicate headers. I preserved intra-line comments. For example, I read the line \"A /* this is A*/ is 3,\". I recorded the comment after the command but was able to reconstruct the line by preserving the formatting. These lines were linked. They were treated as a unit when combining changes in a build. +12. The medicine student compensated for commands that couldn't be predicted with specifications, such as input, random or APIs. I compensated for global variables in Lucian CI/CD. Global variables usually weren't included in a predicate specification. So, any code after a global variable assignment shouldn't be processed using find dependencies but with the classic \"all predicates together\" algorithm. The find dependencies algorithm could be used if globals were recorded with a specification. Code after the global variable assignment that didn't refer to the global variable could also be processed using find dependencies. +13. The medicine student preserved the formatting of commands. In the converter, I recorded the string with formatting. I split the string at the end of the command. If there was an embedded comment, it was labelled and kept. There might be multiple embedded comments in a command. +14. The medicine student preserved the formatting of commands. I split on a new command. 
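Extracting an intra-line comment so the line can be reconstructed exactly (paragraphs 11 and 13 above) could be sketched as follows (split_comment/4 is a hypothetical helper, not the converter's actual code):

```prolog
% Sketch: split a line into the text before the first /* ... */
% comment, the comment itself, and the text after it. Concatenating
% the three parts reproduces the original line, preserving formatting.
split_comment(Line, Before, Comment, After) :-
    sub_atom(Line, S, _, _, '/*'),
    sub_atom(Line, E, _, _, '*/'),
    E > S,
    sub_atom(Line, 0, S, _, Before),
    Len is E + 2 - S,
    sub_atom(Line, S, Len, _, Comment),
    Rest is E + 2,
    sub_atom(Line, Rest, _, 0, After).
```

For 'A /* this is A*/ is 3,' this yields Before = 'A ', Comment = '/* this is A*/' and After = ' is 3,'.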
This split occurred at the start of the line with the new command. If there were two or more commands per line, they were split at the start of the command. I grouped lines with a single command over them. +15. The medicine student quickly found the feature they were searching for in the code. Lucian CI/CD stored Prolog, not List Prolog. It worked out List Prolog from it. GitL, a version control system, displayed diffs of Prolog code found with Lucian CI/CD. Users could take a version and use it as the current version. +16. The medicine student set the number of spaces per indent. The user used the List Prolog to Prolog pretty printer rather than preserving formatting when assembling code. This action had the advantage of pretty printing indented levels of code. If code was built with particular indenting, it was \"pretty printed\" in this way. Comments and newlines (also in comments created by the converter) were preserved. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/MEDICINE/Stasis Field Medicine.txt b/Lucian-Academy/Books/MEDICINE/Stasis Field Medicine.txt new file mode 100644 index 0000000000000000000000000000000000000000..11ced7455f97bb1951904f368a55683c7d45316a --- /dev/null +++ b/Lucian-Academy/Books/MEDICINE/Stasis Field Medicine.txt @@ -0,0 +1,85 @@ +["Green, L 2022, Stasis Field Medicine, Lucian Academy Press, Melbourne.","Green, L 2022",1,"Stasis Field Medicine + +1. I corrected errors in Reishi Mushroom +2. I corrected errors in Goji +3. I corrected errors in Ginseng +4. I corrected errors in He-Shou-Wu +5. I corrected errors in Gotu Kola +6. I corrected errors in Schisandra +7. I corrected errors in loving head of state +8. I corrected errors in thanking those who helped me with immortality medicine +9. I corrected errors in thanking head of state for Reishi Mushroom +10. I corrected errors in thanking head of state for Goji +11. I corrected errors in thanking head of state for Ginseng +12. 
I corrected errors in thanking head of state for He-Shou-Wu +13. I corrected errors in thanking head of state for Gotu Kola +14. I corrected errors in thanking head of state for Schisandra +15. I corrected errors in immortality +16. I corrected errors in Body replacement +17. I corrected errors in thinking +18. I corrected errors in stopping dementia +19. I corrected errors in seeing clearly +20. I corrected errors in muscle relaxation +21. I corrected errors in Circulatory system / Cardiovascular system +22. I corrected errors in Digestive system and Excretory system +23. I corrected errors in Endocrine system +24. I corrected errors in Integumentary system / Exocrine system +25. I corrected errors in Immune system and lymphatic system +26. I corrected errors in Muscular system +27. I corrected errors in Nervous system +28. I corrected errors in Renal system and Urinary system +29. I corrected errors in Reproductive system +30. I corrected errors in Respiratory system +31. I corrected errors in Skeletal System +32. I corrected errors in antidepressant +33. I corrected errors in antipsychotic +34. I corrected errors in Other medicines for the body +35. I corrected errors in ginkgo biloba +36. I corrected errors in practicum for others in immortality, etc. +37. I corrected errors in the other As +38. I corrected errors in thanking head of state +39. I switched off my medical problems. +40. I switched off medical problems from the first time point. +41. I corrected genetic errors from the first time point. +42. I won't experience medical problems on that day, repeating. +43. I believed that prevention was better than cure. +44. I checked my health regularly. +45. I exercised regularly. +46. I avoided taking risks. +47. I was a thinker. +48. I joined together with the good people. +49. I emanated religion. +50. I discovered miracles through philosophy. +51. I discovered miracles through computer science. +52. I celebrated life. +53. 
I had positive function, in which everything was expected. +54. I ate healthy food. +55. I was happy. +56. I explored the time pod. +57. I created fine art. +58. I helped myself. +59. It was a business product. +60. I started a hub of longevity. +61. I avoided toxic, synthesised and processed foods. +62. I checked that I ate enough nutrients. +63. I established self-correcting medicine. +64. I collated ways of staying happy. +65. I had good digestion. +66. I performed yoga. +67. I filtered out silly comments. +68. I simulated my health on computer. +69. I determined whether the food was healthy. +70. I tested whether it was the best time to do exercise. +71. I took regular breaks while writing. +72. I found the best time to sleep. +73. I had contact with people. +74. Students will critically analyse the sentences because they are good ideas. +75. I prevented hunger. +76. I prevented tiredness. +77. The medical problems went away on the first day. +78. I propagated knowledge about stasis field medicine. +79. I made a website about stasis field medicine. +80. I am right. +81. I didn't have the effects of eating food on my teeth, etc. The food was transported directly to my stomach. +82. All things similar to 81. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/MEDICINE/dot-Medicine of Meditation 2.txt b/Lucian-Academy/Books/MEDICINE/dot-Medicine of Meditation 2.txt new file mode 100644 index 0000000000000000000000000000000000000000..a50330b3fed515ddef8490fffecf0a97489b246e --- /dev/null +++ b/Lucian-Academy/Books/MEDICINE/dot-Medicine of Meditation 2.txt @@ -0,0 +1,37 @@ +Medicine of Meditation 2 + +1. The medical student uploaded their files to their server, testing them on different platforms. I prepared to commit the new files. Lucian CI/CD and GitL gave me experience writing DevOps (CI/CD) software and a version control system. With Lucian CI/CD, I could use free DevOps software to customise and automate. 
In addition, I could bug-fix nested predicates in findall in SSI by running non-nested and nested predicates in findall. +2. The medicine student functionally decomposed the get and put variable parts in match4 in List Prolog on the way to writing long-form code. I prepared to upload the new files. I checked and fixed the files with Lucian CI/CD. I asked the open-source Prolog developer if a non-nested predicate was better than nested predicates in findall. A non-nested predicate in findall was faster to process and did not require a long interpreter loop. +3. The medicine student used a program finder found with Combination Algorithm Writer (CAW) to keep the code as simple as possible. I checked that the tests in Lucian CI/CD were successful. I tested my repositories each time after adding new plug-ins, such as reverse-optimisation simplicity triggers. With these, if code was saved in long form, it was converted to short-form code, which could be simplified or corrected, for example by inserting reverse before and after group_into_clauses10 in Lucian CI/CD to test bottom-up rather than depend on predicates that hadn't been tested and saved. This could perhaps be done with a program finder customised for the purpose. This program finder detected if inserting two reverses around a call would correct the output to the desired result. This technique also inserted an intermediate predicate because the call was to a recursive predicate and the call to the predicate with this call needed to be modularised (although it could be put into the body of the main predicate). + +4. The medicine student identified and removed unnecessary code using Lucian CI/CD by adding "bypass predicate" lines to the source code. I tested the workings of Lucian CI/CD itself by adding support for commands that appeared in its own source code. I read the source code to check and wrote down the commands Lucian CI/CD didn't have. 
I added these to Lucian CI/CD, particularly the Prolog to List Prolog converter used by Lucian CI/CD. I ran Lucian CI/CD on itself to check the source code was as simple as possible, using Lucian CI/CD's tests to generate tests for it in the interpreter. +5. The medicine student found all possible new inferred connections through existing word neighbours in texts with the Breasoning Algorithm Writer (BAG). I demonstrated thoughts of the right quality and number with BAG. I used BAG rather than stamps with pre-saved BAG output to become immortal, as I had learned from writing the software that writing it and what it did were necessary to meet the "computation" requirement of immortality in one's home time. The leader recognised these thoughts and rewarded the people who represented them. BAG was my version of a neuronetwork. It was based entirely on my theory and plugged into my philosophy. +6. The medicine student saved a copy of the passing files to approve. I authenticated that the files had passed Lucian CI/CD's tests. It was the log. I could list the predicates, their possible changes, and the test they passed. I could also reference archives or plug-ins from which possible replacements had come. +7. The medicine student found the outputted files and replaced the files in the repository with them (or copied them from GitL, ran Lucian CI/CD and saved them in GitL). Lucian CI/CD moved passing files to the repository. A shell script automatically copied these files to the GitL archive. The files' version was increased and moved to production systems for further testing. The specifications of these systems were recorded and reported. +8. The medicine student wrote in assembly language with a minimal, customisable register state machine. I backed up the old files in Lucian CI/CD. Separately, I defined universal assembly code as able to be run in Prolog. It had a register state machine. 
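A minimal register state machine of the kind described in 8. might be sketched as follows (a hypothetical two-instruction set, load and add, not the author's universal assembly code):

```prolog
:- use_module(library(lists)).

% Registers are Name-Value pairs; run/3 executes an instruction list.
run([], Regs, Regs).
run([load(R, V)|Is], Regs0, Regs) :-
    set_reg(R, V, Regs0, Regs1),
    run(Is, Regs1, Regs).
run([add(A, B, Dest)|Is], Regs0, Regs) :-
    get_reg(A, Regs0, X),
    get_reg(B, Regs0, Y),
    Z is X + Y,
    set_reg(Dest, Z, Regs0, Regs1),
    run(Is, Regs1, Regs).

get_reg(R, Regs, V) :- member(R-V, Regs).
set_reg(R, V, Regs0, [R-V|Regs1]) :-
    (   select(R-_, Regs0, Regs1)
    ->  true
    ;   Regs1 = Regs0
    ).
```

Because the machine state is an explicit term, the same predicates translate mechanically to a C struct and a switch over opcodes.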
I wrote an algorithm in Prolog, tested it with Lucian CI/CD and converted it to assembly language. +9. The medicine student claimed that converting to short form was part of partial evaluation, an optimisation technique. For example, parts of the query could be used to shorten the short form code (i.e. if specific ancestors were wanted, data collection could first find ancestors, then a subset of the tree), and then the code could be converted to interpreter-free long form code. In this case, partial evaluation simplified the code. I listed the repositories that the algorithm had modified. I converted foldr to long form (a predicate that didn't require backtracking). This expansion was similar to the following, with another command instead of append: + +foldr_append([],B,B) :- !. +foldr_append([A1|A2],B,C) :- + append(B,A1,D), + foldr_append(A2,D,C),!. + +I backed up the changed files, which were then converted to long-form before being compiled in C. +10. The medical student went back to the original files if a test hadn't been written correctly or the long-form predicates didn't pass. I backed up the old files and folders, including invisible files, in the original repository. I stated that maplist/2, /3, and /4 were in long form if the called command was also in long form. I converted the called command to long form to speed up maplist. Two sentences were connected from the shared language in the connection. +11. The medicine student ran Prolog on the emulator. I moved the new files and folders into the repository. I wrote a CPU emulator with binary addition, such as 0+0=0, 1+0=1, 0+1=1, and 1+1=10. The logic gates, composed of transistors, computed these results. +12. The medical student spent equal time in the present and future before investing necessary pedagogy in both worlds and spending more time in the future. I inputted lines as strings to split into tokens before unit testing. 
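The long-form maplist of 10., with the called command inlined, could look like this (maplist_double/2 is a hypothetical example in the style of foldr_append above):

```prolog
% Sketch: maplist/3 specialised to one command (doubling), written in
% long form so it needs no meta-call and converts easily to C.
maplist_double([], []) :- !.
maplist_double([A|As], [B|Bs]) :-
    B is A * 2,
    maplist_double(As, Bs), !.
```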
I ran Lucian CI/CD in line mode to work with programming languages other than Prolog. I removed and re-added end-of-command delimiters (and catered for the function header being allowed by itself), identified, rewrote and replaced functions and added the end-of-function syntax. I could check and modify C, assembly and other code. +13. The medicine student auto-solved the load problem by checking the system was clean and optimising the software. I ran Lucian CI/CD with settings for different testing stages. I could create and assort tests by stage, with some tests automatically passing (i.e. "a(_,_).", without needing "true."). Unit tests tested individual predicates and commands; integration tests (if required) tested that changes worked together; end-to-end tests tested the entire algorithm, including installation and removal; user interface tests tested user input and the interface's appearance and clarity; and continuous delivery tests included load testing and API integrity testing. The algorithm should be able to bear the load of many users and should report errors, connection errors and their solutions. +14. The medicine student found tests with Program Finder with Types as a last resort. I called Lucian CI/CD to find tests with particular labels for other repositories. I retrieved and modified tests from different repositories, such as for a compiler that ran the same programming language as the interpreter. I modified the interpreter and compiler so that they ran the same tests by using "a :- b, !, c. a :- d." instead of "a:-b->c;d." and eliminating duplicate predicates. I considered rewriting the interpreter and compiler to use this method when interpreting if-then (->/2 and /3). +15. The medicine student found type errors in a predicate with particular modes. I called Lucian CI/CD to find tests with specific labels for other modes. I found the modes of a predicate (i.e., which arguments were inputs and which were outputs). I found its possible tests for these modes with PFT.
PFT found tests by tracing the types of variables with particular modes through a predicate, giving the types and, therefore, possible tests of outputs. +16. The medicine student researched and corrected the interpreter's weaknesses and slowness. I ran unit tests. For example, unit tests in List Prolog tested each command and algorithms with these commands. I generated unit tests with all possible combinations of commands and recursion to a certain depth to rigorously test various features. In addition, I tested all possible modes of commands. + +-a not a +graphics, +music on computer sim +prolog on computer sim in assembly code +compile Prolog to assembly code +long form to assembly code (nested loops +and member! to a long-form pred) +computer sim with more memory, +able to run in a container, +transistor (logic gate) sim diff --git a/Lucian-Academy/Books/MEDITATION/Meditation of Medicine 1.txt b/Lucian-Academy/Books/MEDITATION/Meditation of Medicine 1.txt new file mode 100644 index 0000000000000000000000000000000000000000..7b1c86490436e799adfdf72db5f9756060675e28 --- /dev/null +++ b/Lucian-Academy/Books/MEDITATION/Meditation of Medicine 1.txt @@ -0,0 +1,54 @@ +["Green, L 2024, Meditation of Medicine 1, Lucian Academy Press, Melbourne.","Green, L 2024",1,"Meditation of Medicine 1 + +1. The meditator harmlessly followed Krishna. The immortal descended to Earth to guide people. I wrote a frequently asked questions paper about learning meditation. They were curious to know that immortality was a skill one could understand. To become immortal, one needed to learn meditation, write 100-page books on pedagogy, meditation, medicine, time travel, mind reading and immortality and time travel to 5689 with 4*50*80 breasonings (x, y and z dimensions of objects in texts, supported by accreditation in education) and indicate immortality through the simulation (explained on the GitHub repository luciangreen/Time_Machine). +2. 
The meditator emphasised that time travel led to immortality and not dying in one’s own time with the help of computers. I time-travelled as an avatar to help teach meditation. I stayed in one time and taught the people during my life. I regularly time travelled to the simulation to maintain my skills. I minimised risk, taught everyone and encouraged people to become employees of the meditation company for better results (for example, in the information technology division). +3. The meditator was at one with the sponsor. I came back beforehand to buy property for a meditation centre. I earned enough money. I attended the auction. I purchased the building, and the meditation centre was established. +4. The meditator performed work in chunks to avoid too much load. I breasoned out the Breasoning Algorithm Writer (BAG) breasonings at the end of each set. BAG inspired new connections in philosophy texts by reconnecting existing connections. After each set of 6000 sentences, based on my philosophy, the sentences’ words’ x, y and z dimensions were breasoned out. With these breasonings in a spiritual cloud, I could confidently work on further philosophies. +5. The meditator could help with the protection of life. I saved the BAG log file with the date to record the daily breasonings generated. The log file contained the number of sentence breasonings breasoned out in each set (greater than 6000). The date was in the format YYYY MM DD with SS.SSS... I recorded 96 million breasonings per month. +6. The meditator wrote one old book and later on immortality and medicine each day. I wrote a new algorithm after reading the BAG archive of algorithmic texts. Connections at all algorithm points were possible from the description, specification, main predicate or child predicates. I connected similar descriptions and found interconnections between philosophy and algorithm description. I joined similar descriptions and specifications. +7. 
The meditator drew a Lucian’s Hand Bit Map Font colouring-in book. I conjured up the pop singer show bag. It contained a blank notebook and pencil, a soft black Labrador toy, a magic wand and a model avalanche/volcano. The avalanche was a computer game about avoiding avalanches given the signs. The volcano described how a volcano cross-section worked. +8. The meditator was impressed by the person’s performance in BAG, showing superior intelligence. If BAG failed or didn’t develop content in time, the next set was tried. BAG might fail if it couldn’t find viable or interconnected words. It was possible to remember the more highly interlinked words. People who knew these helped the algorithm to perform better. +9. The meditator used BAG in Essay Helper to write connections. I tracked the self’s and others’ high-quality thoughts using BAG, Grammar Logic (GL) and Essay Helper (EH). GL mind read random words that applied to a philosophy idea, and I tracked the better worked-out algorithms these suggested. EH could find connections between sentences, where some words were replaced with synonyms to form connections. I could also mind-read specific questions, such as “What does automation apply to?”, “How can I explain this to my grandparents?” and “What are all the departments, groups or knowledge I can originally write down?”. +10. The meditator examined core computer science algorithms. I identified and wrote all the algorithms in my philosophy. I then remembered and started writing all the primary algorithm connections. I covered perspectives on language, ageless education and intrapersonal skills. I wrote on seen-as versions or algorithms on algorithms, such as List Prolog optimising and running Lucian CI/CD tests. +11. The meditator aimed to speed up each algorithm. I used a non-concurrent version of Text to Breasonings (T2B) on my Virtual Private Server (VPS). This version operated faster than the concurrent version of T2B. 
I used my laptop’s concurrent version of T2B, which was paradoxically faster. I could further speed up maplist in Auto-T2B, paraphraser and GL with concurrent. +12. The meditator wrote the converter between short and long-form Prolog. I wrote code without base cases but positive and negative clauses and list notation such as [A|B]. I could convert these to C using the longer form in Prolog, i.e., a separate base case, list decomposition, and recursive steps. The short Prolog form could process multiple lists. Saving in the short form was more elegant. +13. The meditator completed the A after avoiding a moral end. The seen-as-saved ideas were astounding. Given the possible natural end to ideas, the algorithm saved them. The ideas were protected by meditation. People could learn and walk around. +14. The meditator eliminated details in predicates in the SSI code. I sped up parts of State Saving Interpreter (SSI). I optimised them using the maplist command. I rewrote them in shorter form, without base cases. I also used foldr, then converted it to C. +15. The meditator chose from and wrote on the highlights. I prepared for the future by installing BAG on my VPS. I ran BAG after downloading my latest algorithms and arguments. I only downloaded the changed repositories. I could track and restart the BAG algorithm during the day. +16. The meditator monitored topical breasonings, writing them down. I only counted sentence breasonings in BAG. I deleted empty lines. I subtracted one from the total sentences in lines where one was the line number “N.” at the start of each line. This formula gave n = n(sentences in lines) - n(lines). + +17. The meditator found friends through Prolog. I treasured memories. I wrote about Prolog. It was a book about file processing, finding differences between files, searching in files, running needed algorithms and running algorithms that fit into an overarching algorithm. It was also a book about immortality software using Prolog. +18. 
The meditator kept their virtual creations in a particular place, indexed by date and content. I treasured creations. I made software to simulate moving 3D paper models, software boxes and sales clerks. Viewers could now play. I made a simulated book from the chosen paper, bound it and illustrated it. +19. The meditator grouped the algorithms, searching for missing links and needed software. I examined the documentation for my computer programs. I wrote a quote to interest the reader. I checked the documentation met my standards, with instructions on installing, running and modifying for use in a commercial product. State Saving Interpreter (SSI) could run Prolog or C quickly on web app pages. +20. The meditator rewrote all terms as edges. I explained how the algorithm worked. This explanation was covered in the research. I added the original parts. I rewrote types as edges. +21. The meditator rewrote the algorithm for simplicity and clarity. I wrote the walk-through. It was a fly-through of the workings of the algorithm. The user could create their codes or modifications with options. The password allowed the user to hide the workings of the algorithm. +22. The meditator wrote a mind map before writing the algorithm and then checked the algorithm before writing the research paper. I made a video about each turn in the algorithm's creation. I preferred modifying List Prolog to State Machine code. I used \"subterm with address\" to make optimisations. I replaced and deleted items in the algorithm term. +23. The meditator tried to write an innovative neuronetwork. I made the video about each decision in the algorithm's creation. I wrote the possible decisions on a slider from conservative (needed) to wanted, to risk-taking (original). I chose risk-taking. I chose risk-taking to further my knowledge in the field and achieve my goals. +24. The meditator acknowledged the training he had received in science. I found property-based science. 
The robot stated that properties could be arrived at in phenomenology using systematic exhaustion of possibilities. Or, I happened on beneficial properties when doing something else. I found recordings for text to breasonings, mind reader, immortality and difficult bug fixes in SSI as one-off properties. +25. The meditator achieved their goals. I found phenomenology-based science. This science included more accessible conclusions such as spiritual anti-headache medicine and other aims of text to breasonings, time travel, breasonings (which only became available after my degree), programming, music and writing. These were made available with meditation. Some conclusions in programming were more complicated, requiring repeat attempts or applying lookahead in the parser. +26. The meditator explored harmonious robot thoughts. I listed the wanted features of the robot. It was a scientist announcing its discoveries. It was a programming assistant, fixing code and explaining errors. It was good at security and safety. +27. The meditator wrote a GitL/Lucian CI/CD app that helped perfect software. I examined the brain's eleven dimensions. These were abstract mathematical spaces that made up neuronetworks. Perhaps these dimensions were local to individual thoughts and were understood as connecting to other neuronetworks. I asked, \"In what way?\" +28. The meditator produced all wanted code at one point, which he could find later. Lucian CI/CD stored \"ambition\" specs for algorithms that algorithms could help develop. For example, I wrote specs for multimodal append. Writing my own code helped with deserving marks, including money. It also proved I legally owned the intellectual property. +29. The meditator's computer only prepared for their input by doing laborious administration tasks. The computer could organise work. I wrote all the algorithms to help manage work. I understood all the steps, attributes and limitations of the software. 
My \"human speed\" allowed me to examine ideas properly and prepared me for science. +30. The meditator's computer produced imagery regarding science, such as that the student had tried hard enough in their degree. The computer used science. It was fast and elegant. I used the research on when to work and when to rest. I appreciated the prestigious imagery of the work I completed. +31. The meditator found out their simulation, which changed with the times. The robot could conduct experiments, creating immortality. My simulation couldn't support immortality, but the future simulation could. I visited the future once per day. I regenerated daily. +32. The meditator stopped the robot from cheating in games. Writing a robot's software was like writing an interpreter. It was written from first principles, bottom-up. I meant the interpreter, written safely in draft form, then changed into code. The robot ran algorithms on a computer they had written and explained the exact changes they and I wanted and made. + +33. The meditator followed the philosophy’s direction. I harmlessly followed Krishna. I included my earlier indexical writings as one of my books. These were topic ideas and details to expand on. Krishna’s ideas helped connect my books and prompted original algorithms. +34. The meditator emphasised the need for constant application. The immortal descended to Earth to guide people. I summarised the algorithm, taking notes of creative opportunities. I classified the skills it required into categories. I prioritised meditation and writing. +35. The meditator simplified the grammatical structure of frequently asked-for answers. I wrote an often-asked-questions paper about examining meditation. I related meditation to my indexical (interdisciplinary) texts. My writing style included philosophy examples, such as computational triads. I meditated on and articulated the verb, then the subject and object. +36.
The meditator took advantage of their writing and programming skills in immortality. The students were curious to know that immortality was a skill one could understand. I took notes about all my questions and drafted a book. I answered the questions and made connections in the book. I wrote my answers to life’s questions about immortality. +37. The meditator recommended meditation for time travellers. To become immortal, I learned meditation. I benefited from the community’s benefits and breasoned out immortal requirements at each place to maintain my health. Meditation helped me focus on writing needed for immortality. I wrote algorithms and related them to my interests, such as automating temporal and self-inspiration. +38. The meditator expounded the virtues of higher education. I wrote books on pedagogy to become immortal. I wrote fifty high distinctions about medicine for memory, thinking, stopping dementia, seeing clearly and muscle relaxation. Freezing one’s age helped with these abilities. I avoided mental stress and achieved my goals. +39. The meditator formed delightful algorithms with unique, creative features. I wrote about meditation. I could walk daily, remember ideas to write about, think clearly, avoid problems, and see and write. Writing helped me explore my thoughts and find my place in the world. I wrote how my love of life shaped my logic. +40. The meditator mind-reads people in heaven, performing during their life. I wrote about medicine. I captured the thoughts of a journalist and a leader. I made headlines, furthering knowledge about immortality and personal excellence. I planned to research immortal subjects to examine how they were. +41. The meditator concentrated on the thought at the time. I wrote about time travel. I developed one thousand breasonings in the background and eighty daily breasonings for students. The students’ thoughts were on track to them in time. I found and detailed the relevant thoughts. +42. 
The meditator found assembly straightforward and faster. I wrote on mind reading. Lucian CI/CD helped one develop one’s logic and craft one’s algorithmic discipline. I wrote in assembly using my mind and Lucian CI/CD to maintain correctness at each point. I exited loops rather than the scope, saving computations. +43. The meditator conversed with the student to further their knowledge. I wrote on immortality. The students had eighty breasonings per assignment. They learned about essential departments and applied life-saving skills with their knowledge. I studied the thoughts necessary to become immortal, developing new knowledge. +44. The meditator traced the deterministic algorithm. I time travelled to 5689 with 4*50*80 breasonings (x, y and z dimensions of objects in texts, supported by accreditation in education). I travelled to the first 16 days in October of that year to join a simulation. I joined my and the simulation’s algorithms. +45. The meditator took the initiative through meditation. I indicated immortality through the simulation. I needed three high distinctions to support the student. I synthesised three breasonings and algorithms from the department to inspire the student. I included ten extra breasonings to symbolise completion with 250 breasonings. +46. The meditator added to their daily practice. I emphasised that time travel led to immortality. I claimed that pedagogy was more critical in business, relationships, psychiatry, giving out thoughts, and any necessary related high distinctions. I didn’t take action until arriving at the required conclusion. I spent my time on pedagogy. +47. The meditator prayed to Buddhism during a power outage, requiring fifty As with their sentences’ key words possibly hand breasoned out. I sustained my life in my time with the help of computers. I could backdate my body for one week and maintain my youthfulness at home. I experienced the need to protect my body for the future first. 
With the advent of computers came the ability to reach the professional requirements of immortality. +48. The meditators taught meditation in their time. I time-travelled as an avatar to help teach meditation. The teachers, appearing as themselves, used every part of pedagogy and what they knew to support sales and write high distinctions. They helped initiate and overcome criticality before it was mentioned. I included the necessary parts from the future in the text to teach meditation. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/MEDITATION/Meditation of Meditation 1.txt b/Lucian-Academy/Books/MEDITATION/Meditation of Meditation 1.txt new file mode 100644 index 0000000000000000000000000000000000000000..3cd1b2d1d878d80bc5b8ec9778a3fca2dd5dcecd --- /dev/null +++ b/Lucian-Academy/Books/MEDITATION/Meditation of Meditation 1.txt @@ -0,0 +1,36 @@ +["Green, L 2024, Meditation of Meditation 1, Lucian Academy Press, Melbourne.","Green, L 2024",1,"Meditation of Meditation 1 + +1. The meditator worked in a co-op where others tested each others’ software. I downloaded the Git repositories to my Virtual Private Server (VPS), a server where I performed daily computations to free up my computer, each listed in gits.txt. The GitL folder became a backup, and the download folder was operated on. I didn’t need to rely on other servers. I could control testing, involve others and coordinate feedback. +2. The meditator performed or automated work at a time. I ran test_alarm.pl to test a time every minute to send an email when a task was due. I considered the advantages of a neuronetwork in performing work. It could find simple central noumena. Or it could label features of past programs and add these when an overarching aim was recognised. +3. The meditator started in his life. I noticed that the professor had caught up with a number of the algorithms. The neuronetwork, containing a database of my algorithms, found simple solutions to many problems. 
The new arguments found simple connections, connections of connections and all combinations. Religion helped the writers persist, giving purpose to the texts. +4. The meditator stated that List Prolog had unique induction and computational spirituality features. I wrote the library for SSI-Prolog. I uploaded it. I installed the library. I ran code that used the library. +5. The meditator added strong types for better performance. I compiled Prolog to Javascript. In its simplest form, choice points were loops, and it used Cutest Prolog (with Rust optimisations). It couldn’t use files unless it connected to the server. I examined using C algorithms in the CGI folder that were called on the server side. +6. The meditator wrote an offline web app allowing users to save online preferences and data. I could run multithreaded Prolog algorithms on the web server. I ran a block of code on a CPU. When it had finished, I could run a second block of code. This method gave me control of mutexes and continuous performance. +7. The meditator ran the Prolog program in the web terminal. I wrote the web app. It was a web browser. It could customise internet addresses, run its programming languages, and, if only run offline, present graphics and files (not access others’ files offline). This web browser made programming sophisticated web apps simpler by converting desktop apps to the web. +8. The meditator kept the code simplification but didn’t keep the deforming subroutine and goto code optimisations. I converted the Prolog program to Javascript. I scanned for and produced an error on secrets. Wrapping and unwrapping across predicates was optimised. Unnecessary pages or pages which could be replaced with a single page were optimised, i.e. with automation. +9. The meditator corrected and finished unfinished parts of the work and all the work. BAG produced 96,000 sentences in 9 minutes. In 1.5 minutes, I had written the equivalent of two PhDs. 
Each day, I focused on writing about my latest writing in terms of previous writing. As well as producing 96,000 arguments in 9 minutes, BAG made 96,000 algorithms. This figure meant one PhD every 22.5 seconds. +10. The meditator avoided sharing secrets. I customised Prolog with Javascript commands. I opened a window in Javascript. I displayed 2D graphics with engines for games, text, vector graphics and styled, algorithm-triggering text. I stored and handled secrets by converting the code to use them on a server. +11. The meditator ran any programming language from any programming language. I ran Javascript from Prolog. This process inserted Javascript in the code seamlessly. It may replace the function call with the function body. I preferred to run Prolog from the server with fewer, more straightforward commands. +12. The meditator uploaded and stored links to libraries for download. I sent and received data with APIs. I sent data to the server. I received data from the server. The secrets weren’t uploaded. +13. The meditator downloaded the BAG software. I excluded sparse files with low scores in BAG. Some files with light texts that didn’t easily allow BAG syntheses were downrated and unused. In the bulk processing of files, these files did not take long to process and could be used. Some meditators performed well on sparse texts, while others performed slowly. +14. The meditator automatically ran software verifiers, uploaders and readme pass stampers. I ran algorithms in the background. I ran a scheduler. I entered the days and hours an algorithm should run. I checked on running apps and memory usage. +15. The meditator applied schedule labels such as “away” or “available”. I customised the app schedule. I converted a textual description such as “run x every 5 minutes” to a schedule. I temporarily paused an app’s or a group of apps’ schedules. I scheduled these activities on certain days. +16. The meditator led a lifestyle of relaxation and creativity.
I ran BAG every day after I finished reading a particular file. For example, I performed yoga before travelling. I also automated daily regimen apps after taking specific actions. I may move my daily regimen forward and enjoy the day. + +17. The meditator could sort, process and build lists using variables and if-then for comparative parts. I reduced the code to symbols, with brackets, not, arguments, predicate names and others replaced with slashes and numbers. I removed meta-predicates as much as possible, replacing them with recursive code. I converted Prolog to C, replacing logic with imperative procedural C. It was based on the premise that commands would work, with comparative predicates' logical results placed into variables for processing by an if-then antecedent. +18. The meditator limited the ability to call \"anything\" to calling items from a list. I ran call and call with arguments in imperative procedural C. I could compress these to function calls with custom arguments. If there were outputted variables, these could also work with the \"self-replacing\" code. The code was replaced on compilation. +19. The meditator wrote [1,2,3,...] as [1..]. This list created infinite choice points. This list could be accessed until a condition was met. There was an arithmetic progression, a_n = a_1+(n-1)*d, where a_n = the nth term in the sequence, a_1 = the first term in the sequence and d = the common difference between the terms. Or there was a geometric progression, a_n=ar^(n-1), where a = the start term and r = the common ratio. +20. The meditator stepped up to complexity. I minimised the code. I wrote the formula in terms of maplist, foldr or findall. I simplified data to a list. I outputted values to necessitate comparisons. +21. The meditator computed specific As quickly. I simplified spiritual algorithms. I judged output, keeping an instance of a comparison.
For example, I computed each type of comparison once and pointed to it in the \"text to breasonings\" algorithm. I could also simplify sets of coordinates or single coordinates to be computed once. +22. The meditator used equals4 in List Prolog sparingly, preferring equals. I defined strict simplicity. A continuous algorithm checked for loops of one item that could be removed. Or, comparisons were used as antecedents, obsoleting logic. Simpler code could be run more quickly on machines with smaller specifications. +23. The meditator wrote strict code for a circuit. Prolog required a register state machine. This machine started with 0. It added one several times. Then, it returned the number. +24. The meditator wrote strict code for a state machine. This state machine contained the state machine for the algorithm. It started with []. It processed each item in the list until the list to process was []. It returned this list. +25. The meditator removed bottlenecks before running on a small system. I wrote strict code for faster code. I optimised the code by eliminating bottlenecks. Bottlenecks were too much reliance on resources, such as too many frames, interpreter complexity or using inefficient commands. I fixed this and tested that the code was fast enough. +26. The meditator simplified each predicate to a simple version. I stored a simple version of the code. It was the formula. This formula was expressed in terms of maplist, findall, foldr or another single command. For example, there was a formula and one of these commands. +27. The meditator gave the input and desired outputs of the system. I stored a version of the code with connections. Once I had written the simple version, I added connections to allow more complex data and integration with other predicates. There was an automatic key-item extractor and re-inserter. This command could work with key items in a particular place. +28. The meditator's grammar parser focused on chunks and reused code. 
I redrafted the code. The first draft had too much complexity, poor formatting, poor presentation or excessively long predicates. I wrote an auto-grammar parser that worked out what the program that converted grammars to long-form programs was. I gave the input and output, and the program produced the grammar parser. +29. The meditator found commands in terms of commands, sometimes modified ones. I stated that the grammar parser worked with chunks. I examined the possible inputted commands. I read how they were converted. I used a formula converter. +30. The meditator found induction helped before writing the parser generator. My grammar parser reused code. If code was produced from one part of a command, then it could be reused by another command. This code may convert variables, names or logical structures. It was easier to convert from a programming language into a token (list) format before manipulating it. +31. The meditator labelled and used a library hub to plug new features into. I added features to the algorithm. I kept a simple vintage version. I documented features as I added them. I functionally decomposed features, keeping separate code for each feature. The code analysed itself, a new higher-order feature. +32. The meditator wrote a question-answering algorithm that helped develop a maplist algorithm and convert it to assembly language. The code was \"positive\" or compared two features simultaneously to simplify the code. Other features could be added, increasing the complexity of combinations and each predicate to test. I wrote a dependency chart for the features. I started bottom-up with the most straightforward features. 
+"] \ No newline at end of file diff --git a/Lucian-Academy/Books/MEDITATION/Meditation of Pedagogy 1.txt b/Lucian-Academy/Books/MEDITATION/Meditation of Pedagogy 1.txt new file mode 100644 index 0000000000000000000000000000000000000000..8beb87982ce0fd4f0d0575406892347ec5641c20 --- /dev/null +++ b/Lucian-Academy/Books/MEDITATION/Meditation of Pedagogy 1.txt @@ -0,0 +1,61 @@ +["Green, L 2024, Meditation of Pedagogy 1, Lucian Academy Press, Melbourne.","Green, L 2024",1,"Meditation of Pedagogy 1 + +1. The meditator controlled who read the source code, including secrets. I saved GitHub-private2 on the server. I used my server instead of the public server. I searched the repository history in GitL. I diff’ed multiple files together. +2. The meditator regularly archived old files, removing them from GitL. I found references to GitHub/private2 and changed them to GitHub-private2. I planned to move the data files to another location without affecting the rest of my files. The files to monitor were in “GitHub” or another location specified in settings. The data files were in another folder, the location of which could be defined in settings, too. +3. The meditator encouraged others to write their thesauruses. I stated that Text to Breasonings data could be public. I wrote the entries in the Text to Breasonings dictionary. It was made available for others if they needed it for medical reasons. They might need a high distinction, time travel or immortality for medical reasons. +4. The meditator could customise the public server in GitL with binaries, icons, windows and graphics. I moved private data to a remote repository. This repository included settings, personal preferences files for repositories, and plug-ins, which I had modified. GitL, a decentralised version control system, could keep these files private. In addition, uploads to a public server were possible when final checks had been made. +5. The meditator recommended the institution to the child. 
I recommended meditation, pedagogy and areas of study to prospective parents. The child or animal was a meditator. The predator morally ended the prey, which was turned into another dimension before it in the simulation. When the prey completed an A to stay alive, they stayed in the world. +6. The meditator started the protection of life. I updated the text to breasonings meditators list. These meditators agreed to experience meditation. It gave them the gift of life. Their lifestyle included study, meditation and rest. +7. The meditator checked the algorithm with detailed and long data. I added technologies to the meditation file. I noticed pedagogy was standard. The parents said their prayers before bed. The meditator ran the meditation file to automate high distinctions for meditation technologies, such as time travel, meditation and medicine. +8. The meditator stated that an algorithm could download compatible software, free or paid, with particular features. I used a git command to get updated software as files as part of a startup check. There could be a monitor in the background. On the other hand, GitL provided the ability to control which version to download, given commands to download software from a date. Included with downloaded software was its date of download. +9. The meditator checked on the content and progress of BAG. In Breasoning Algorithm Generator (BAG), I deleted time notifications. The new version completed all sets of work in a group at the touch of a button. If the memory requirements were too large, a Shell container could run BAG on each text (contributing to each set) separately. I could migrate BAG to a server and finish unfinished work on a disconnection. +10. The meditator could miraculously help people move to the other dimension. BAG gave reports of the total number of breasonings. I regularly checked the live tally. I displayed the progress bar. This progress bar was in a web monitor. +11. 
The meditator attempted modifying predicates bottom-up for specific results, such as finding Lucian CI/CD dependencies. I saved the BAG tally when it was over 6000 breasonings. I chose the limit of 6000 breasonings considering memory limitations. I could compare all texts with recent ones, possibly using inserted connectors. I used tricks such as intelligent sentence constructors and funnels to write memorable quotes. +12. The meditator used a clear font. I interpreted my most recently written files with BAG. I wrote algorithms and arguments on my current area of research. This area was, for example, induction, verification or computational spirituality. I could operate on the algorithms with Grammar Logic key terms that gave an algorithmic perspective, such as documentation, passwords, graphics, horizontal connections or optimisation. +13. The meditator timed the BAG algorithm in processing a text. I reduced the time limit from five seconds to two, multiplied by the average to complete the breasonings in BAG. Only breasonings from within-time sets were included in the tally. I calculated the average time to complete the text. I doubled this to include most jobs. +14. The meditator ran BAG quickly, covering the required breasonings. I used the concurrent version of BAG because it was faster than the non-concurrent version of BAG. The concurrent version of BAG processed a particular number of texts simultaneously, where n was the number of processors in the computer. When a text had finished, the algorithm waited until all the current texts had finished before starting the next set. This algorithm could be improved to use all processors continuously. +15. The meditator ran BAG in the background, using their computer for other work. I put the BAG on the server. I ran BAG, which helped thoughts during the day, on the server, which could automatically start BAG when meditation triggered it. 
An algorithm could pick a particular BAG text and expand or clarify its algorithm or argument. BAG ran until it finished and targeted specific thoughts. +16. The meditator thought and rested. I ran BAG for continuous, high-quality thoughts. I ran BAG only when needed. BAG could detail and prompt curiosity about a thought. I questioned the thought, and the computer sent a representation to me about my thought history. + +17. The meditator deleted the loop. I avoided redundant loops. I noticed the loop looped a predicate when there was a finite, not an arbitrary, length between them. I became suspicious it was why the algorithm was slow. Also, an interpreter that only accepted strict code hung running it. +18. The meditator used the strict interpreter to expose and delete loose programs. I avoided redundant predicates. The clause performed the same task as another clause, which was made explicit when it hung when run by a strict interpreter. To find the redundant predicate, I ran the test, testing for unwanted extra results. If there were any, I isolated the duplicate clause and deleted it. +19. The meditator wrote fast commands for interpreters, compilers and inductive algorithm writers. I avoided redundant algorithms. I found the recurring algorithm, a predicate with an unwantedly same name and arity as another and an algorithm identified as an interpreter that could be replaced with a command (where it was wise to learn program interpreters first). I cut or deleted recurring predicates. I found and deleted calls to predicates that might unwantedly have the same name and arity as others. +20. The meditator categorised the choice point affected by if-then in Prolog. There were choice points for member, member in findall, non-deterministic clauses, repeat (which created infinite choice points) and choice points which could be cut by cut or if-then. If-then also cut the \"repeat\" choice point. 
If-then behaved like cut in the following: +if_then_else(A,B,_C) :- A, !, B. +if_then_else(_A,_B,C) :- C. +In this, the cut operator (!) deletes further choice points in the predicate. I could implement this method of if-then. +21. To program findall in terms of a predicate, I needed to analyse the code and write a recursive version. For example: + +:-include('../listprologinterpreter/la_maths.pl'). +% change([[1,2,3],\"findall1(A,findall1(A,member(A,B),C))\",B]). +% B = [1, 2, 3]. + +change([B1,\"findall1(A,findall1(A,member(A,B),C))\",B]) :- +length(B1,L), +numbers(L,1,[],N), +fa1(N,B1,[],B). + +fa1([],_,A,A) :- !. +fa1(N,B1,A,B) :- + N=[N1|Ns], + get_item_n(B1,N1,C), + append(A,[C],A1), + fa1(Ns,B1,A1,B). + +% Or B1=B. + +This code could be modified for nested findall and converted to C. + +22. The meditator solved the cross-predicate problem with logic or disabled choice points. I hard-coded the response to a choice point. It was \"goto\". Or it was \"gosub\". Choice points may be represented by nested loops, i.e. (a,(b,(c))). +23. The meditator broached the slow-down problem from if-then choice points by cutting after the antecedent. I deleted unwanted choice points in if-then. As stated, this involved a cut after the antecedent. This possibility could imply that the if-then clause should be in its own clause. If-then may also delete other choice points. +24. The meditator tried removing choice points when converting to C but started with the user-friendliness of Prolog. I maintained a list of yes/no cases for if-then choice point deletions. I listed all the choice points. I only kept some. Non-deterministic clause choice points were earmarked for deletion. +25. The meditator structured code using recursive predicates (loops) to avoid choice points. I kept choice points in findall statements. Findall saved choice points from member, append and non-deterministic predicates from deletion by if-then.
Findall was a critical, logical structure in Prolog, and its importance extended to its choice points. Therefore, the use of findall needed to be vetted for performance. +26. The meditator stated that if-then deleted choice points from further predicates. I reached the end of the antecedent in the if-then clause. If-then cancelled a previous predicate from trying its following clause. I, being a performance advocate, wanted to simulate this in C. I wrote a condition to return to the \"choice point\" or cancel the nested loop. +27. The meditator deleted the unused variable in the Prolog predicate head in the next API version. I didn't unnecessarily simplify data. I was conscious of other programs using the predicate and didn't want to change the API unnecessarily. For example, in my decentralised git GitL, I kept the commit call constant with an option to label the commit. This label helped me remember and scan the code later. +28. The meditator didn't unnecessarily leave data alone. If the data was necessary, I kept it. Otherwise, I removed or archived it. I considered merging or simplifying data. For example, if the data was adumbrationally (sic) distinct, I kept it separate. +29. The meditator wrote simple base cases for blank characters, the same values or empty or zero values. I deleted unnecessary logical pathways in an algorithm. I considered the version of equals4 in List Prolog that was too long. Then, I rewrote it to get values from each side and put values into these sides. I simplified computations to operate on lists of indices, not complex terms. +30. The meditator blocked security holes. I found security holes in accepting certain character types. I was concerned about users selecting similar-looking but different characters for passwords, as they might forget them. Some commands didn't work well with certain characters; for example, curly quotes used not to work with split_string, affecting old versions. 
There were similar-looking but different hyphens, bullets, quotes and symbols. +31. The meditator stopped sharing passwords. I found security holes in accepting logical pathways that processed certain character types. I considered algorithms that changed the case of letters, their code, and the case of some letters or deleted or modified characters according to a heuristic. If there was an unfound error, a function that didn't always return 1:1 inputs to outputs or behaved unpredictably, I corrected it. There may also be an issue with deterministic functions that produce the same input as output, i.e. printing the password unwantedly. +32. The meditator was security-conscious with their passwords. I stopped the case where a password may be accepted, but its lowercase version may be unwantedly accepted. Also, I stopped the digit one (1) from being mistaken for a lowercase \"L\". Zero (0) shouldn't be confused with capital \"O\". Also, tab, return or other invisible characters should be spotted in time. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/MEDITATION/Stasis Field.txt b/Lucian-Academy/Books/MEDITATION/Stasis Field.txt new file mode 100644 index 0000000000000000000000000000000000000000..5616ab54ef234be28dffca4223f19140b015cde2 --- /dev/null +++ b/Lucian-Academy/Books/MEDITATION/Stasis Field.txt @@ -0,0 +1,83 @@ +["Green, L 2022, Stasis Field, Lucian Academy Press, Melbourne.","Green, L 2022",1,"Stasis Field + +1. It's not really a stasis field; it's simulating life. +2. I simulated the setting, including myself (my consciousness stays real), limiting greying to nil and stopping the effects of time. +3. I realised that the simulation would stop buildings falling on me. +4. I realised that the simulation meant I wouldn't age. +5. I used breasonings in my job for the better, and achieved my immortality goals with them. +6. The field was used by people to stay healthy and young. +7. It was a means, not an end. +8.
It was immortality, where 4*50 As was also recommended. +9. I did it with As. +10. Meditation does help. +11. Eventually each argument had 4*50 As. +12. My body remained the same with Reishi Mushroom +13. My body remained the same with Goji +14. My body remained the same with Ginseng +15. My body remained the same with He-Shou-Wu +16. My body remained the same with Gotu Kola +17. My body remained the same with Schisandra +18. My body remained the same with loving head of state +19. My body remained the same with thanking those who helped me with immortality medicine +20. My body remained the same with thanking head of state for Reishi Mushroom +21. My body remained the same with thanking head of state for Goji +22. My body remained the same with thanking head of state for Ginseng +23. My body remained the same with thanking head of state for He-Shou-Wu +24. My body remained the same with thanking head of state for Gotu Kola +25. My body remained the same with thanking head of state for Schisandra +26. My body remained the same with immortality +27. My body remained the same with Body replacement +28. My body remained the same with thinking +29. My body remained the same with stopping dementia +30. My body remained the same with seeing clearly +31. My body remained the same with muscle relaxation +32. My body remained the same with Circulatory system / Cardiovascular system +33. My body remained the same with Digestive system and Excretory system +34. My body remained the same with Endocrine system +35. My body remained the same with Integumentary system / Exocrine system +36. My body remained the same with Immune system and lymphatic system: +37. My body remained the same with Muscular system +38. My body remained the same with Nervous system +39. My body remained the same with Renal system and Urinary system +40. My body remained the same with Reproductive system +41. My body remained the same with Respiratory system +42. 
My body remained the same with Skeletal System +43. My body remained the same with antidepressant +44. My body remained the same with antipsychotic +45. My body remained the same with Other medicines for the body +46. My body remained the same with ginkgo biloba +47. My body remained the same with practicum for others in immortality, etc. +48. My body remained the same with the other As +49. My body remained the same with thanking head of state +50. My genes remained the same, but I experienced life as shells. +51. I examined everything each day. +52. I chewed my food slowly and took no risks. +53. I found the gift of life. +54. My life was an example. +55. My life will go on. +56. I took Chinese herbs, (unrelatedly) increasing my lifespan. +57. I planned my job ahead of schedule, aiming for academia. +58. I prepared to do the whole thing using preparations. +59. I avoided having children. +60. I had my own activities. +61. I relaxed each day. +62. I made money. +63. Life is light. +64. I was interested in things, and simplified data structures. +65. My body circuits were simulations and went well. +66. I experienced life as a normal person, correcting mistakes, medical problems and unwanted thoughts. +67. I planned activities with arguments, not luck. +68. I smoothed life with machine learning. +69. I drank a banana smoothie. +70. I wrote an algorithm that wrote algorithms with minimum effort. +71. I did things by my own ideas, with enough detail. +72. There were multiple arguments in each part of the book. +73. It was enough for a degree. +74. The graphics reset through software. +75. I planned my journeys to be safe. +76. It was already perfect. +77. I chose professional development. +78. It (including time travel and mind reading) was all part of meditation. +79. They had areas of study around them. +80. I had lots of jobs to do. 
+"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Mind Reading/Mind Reading 2.txt b/Lucian-Academy/Books/Mind Reading/Mind Reading 2.txt new file mode 100644 index 0000000000000000000000000000000000000000..ff46a58f2f292bc1034d3ad73fdae380501cf3f0 --- /dev/null +++ b/Lucian-Academy/Books/Mind Reading/Mind Reading 2.txt @@ -0,0 +1,20 @@ +["Green, L 2021, Mind Reading 2, Lucian Academy Press, Melbourne.", "Green, L 2021",1, "Mind Reading Cryptography +- Like cryptography et al., checks integrity. + +Two hundred and fifty breasonings for: +a. I mind read all departments about time travelling. I mind-read ten algorithms to time travel. I mind read the history. I mind read the politics. I mind-read the theology. +b. I maintained the philosophical standpoint. I mind read A, B and B to B texts to time travel. I controlled my actions. I decided not to act sometimes. I told others to do the same thing. +c. The qualifications used multiple departments. I mind read and breasoned out a text describing from where, to where, from when, and to when with separate breasonings to describe breasonings and travel. I was literal about breasonings, i.e., the lecturer can think of her topic to breason out, a recordings specialist can send breasonings through time, medical uses for the quantum box are explicit (including helping prevent cancer), and each argument should be called good about God. I researched breasonings from a single department at a time. I researched linking arguments about two departments at a time. +d. I researched time travel with meditation. I mind-read the time travel safety precautions. I protected my upper organs with meditation and medicine. I protected my middle organs with meditation and medicine. I protected my lower organs with meditation and medicine. +e. I checked the planned travel. I mind read when people saw me when I time travelled. I saw the person around me. I saw him looking at me. I verified that I was safe. +f. 
I recursively helped those around me resolve any questions they had. I mind read that I and others had no depression when I time travelled - I time travelled. I was happy. I verified that those around me were happy. +g. I helped others to verify that they had no aches or pains after time travel. I mind read that I had no discomfort in my body before time travel. I detected that I had no aches or pains in my body. I time travelled. I verified that I still had no aches or pains. +h. I worked out what to do and think before doing and thinking it. I mind-read myself to hone in on where I wanted to go and not lose interest in it when time travelling. I looked at my surroundings. I decided on where to go. I verified the safety of the place and the people there. +i. Meditation protected me. I mind projected lead around me for comfort during time travel so that everything was cared for, including not having radiation sickness. The lead time machine was replicated. I used it after gaining the ability to time travel in the simulation. This ability was effortless, and I contributed a 4*50 Breasoning Algorithm Generator (BAG) As to meditation, time travel and anti-ageing medicine to protect my body. + +When time travelling, I cleaned my time with zinc atoms on either side of my time. This precaution prevented sickness from time travelling by killing pathogens. I saw other time travellers doing this, which helped me to find my time travels delightful. This cosmological time travel experience was that I didn't age. +41. The mind reader stayed aware of the message. I agreed with the message, like cryptography et al. checks integrity. I checked that the message was intact after encrypting and decrypting it. I agreed with the checked message. Agreeing with the message was like checking it. +42. The mind reader scanned the message. I laughed about the message, like cryptography et al. checks integrity. Laughing was like checking the message. I critically held the message. 
The message was humorous. +43. The mind reader generated music and philosophy ideas. I checked the vocabulary was appropriate, like cryptography et al. checks integrity. Checking the vocabulary level of the message was similar to checking it was complete. Mind reading was like using a word processor. I explained mind reading was useful for generative art. +44. The mind reader exercised regularly. I checked that the message was medically safe like cryptography et al. checked integrity. Being healthy was like an intact message. My body remained healthy. All my systems felt and worked fine. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Mind Reading/Mind Reading 3D Bodily Projection 1.txt b/Lucian-Academy/Books/Mind Reading/Mind Reading 3D Bodily Projection 1.txt new file mode 100644 index 0000000000000000000000000000000000000000..d806454f601f1d01946b44158fc383f5934dc0ed --- /dev/null +++ b/Lucian-Academy/Books/Mind Reading/Mind Reading 3D Bodily Projection 1.txt @@ -0,0 +1,17 @@ +["Green, L 2021, Mind Reading 3D Bodily Projection 1, Lucian Academy Press, Melbourne.", "Green, L 2021",1, "Mind Reading 3D Bodily Projection 1 +1. The bot stated that meditation, which they agreed with, helped maintain them in the simulation. I ensured that 3D bodily projection didn't go over ethical boundaries. My body was replicated, not projected or a \"bot\". Seeing as it was in the simulation, it was like all three. I converted myself from a human to a bot that could time travel. +2. The bot was carefully programmed. The dancer followed the choreographed dance. The projected dance instructor helped the student to perform the step. They were all human, born and raised, and happened to find the simulation. I experienced a pleasant fragrance because of breasoning 4*50 high distinctions for meditation, time travel and anti-aging medicine. I enrolled in a dance class. +3. The bot walked through a maze. The projected students helped, not controlled, the person. 
The person was a free agent who was helping the students to help them with mind reading. The projected students enrolled in free University and paid for extras in air dollars. The students were excited but stayed controlled as they taught mind reading. +4. The projected person appeared at a member of the set of acceptable locations to appear outside. There were infinite universes. I was aware of these universes in meditation. I walked outside. I was in a single universe. +5. The bot breasoned out a design for the utensil. The projected person replicated the utensil when he needed it. The person went camping. He packed a spare spoon. If he lent one to someone else, he used his second spoon. +6. The bot was, in essence, like a balloon. The projected person was 100% opaque, had weight and was made of skin and bones. Their skin didn't usually let light through. They weighed the same as an ordinary person. Their body contained their organs, and they experienced normal body function. +7. The bot helper specially researched algorithms that wrote algorithms. The projected pedagogy helper, who was more developed, compared the person's experience with areas of study before suggesting a breasoning. The pedagogy helper aimed to think of a thought-provoking breasoning. They considered the person's impressive experience and compared it with their knowledge. They specified an algorithm to program. +8. The bot amazed the visitor with its manicured appearance. The projected person's representation looked new when thought of clearly. Their hair, clothing, expressions, and movements were carefully controlled and well carried out. Their haircut, clothing, age and appearance were new. I breasoned out 4*50 As, and people found out and tried harder with their appearance. +9. The bot wondered if their food, the water and they were electronic lights. I trained, not programmed, the projected person to eat the apple. I sat down with my friend to have lunch. 
I could have taught them to eat the apple, but they had eaten it by then. A model robot could store the apple in its tray. +10. The bot asked, \"How will relaxation help me?\" to which I replied, \"It will help you adapt better\". I trained the projected person to meditate to produce healthy consciousness. Humans could meditate. The bot tried mindfulness, or sitting and relaxing. Relaxing helped the body reset and prepare for activity. + +11. The so-called bot was human, with the benefits of immortality. Derrida noted that two things were a thing, pointing out that projected objects had consciousness, meaning their thoughts needed to be thought of. My body was a projection by some idea, but was a replica continuous in time to me and my friends. I had growing hair, and my hair colour stayed the same, but I had the choice not to age. I found this transformation an advanced state and decided to write philosophy. +12. The bot wondered if the simulation’s population with time was a bell curve. If everything is made of people projected by something, they must be configured to give the best breasoning environment. I enjoyed travelling to the future and breasoning out texts I had written. There was a limited-use spiritual computer that could be used to check computations. I wrote down my algorithm and checked it. I could just as quickly run it on my laptop, but the spiritual computer helped me check ideas on the fly. +13. The bot travelled between the worlds daily, writing about their thoughts. I ported the completed work to my universe. I delegated simple tasks to the projected person to avoid work. I was now the projected person and the work was the same as before but even more enjoyable. I was now aware of how to become immortal, and my texts were in demand in the world so I could work on more of them. My body felt calm and healthy. +14. The bot was healthy and in peak condition. I tested whether the person was a projection because they might need help. 
I was now immortal, and I wrote about immortality and other texts, suggesting that knowledge is infinite when ideas are applied to different ideas. I could revise ideas and teach them in a philosophy academy. Philosophy, the way to become immortal, was appropriate for working as an immortal. +"] diff --git a/Lucian-Academy/Books/Mind Reading/Mind Reading 3D Bodily Projection 2.txt b/Lucian-Academy/Books/Mind Reading/Mind Reading 3D Bodily Projection 2.txt new file mode 100644 index 0000000000000000000000000000000000000000..5d85133ae0bfe36f9d1de8cb8b732db3c92a8fc9 --- /dev/null +++ b/Lucian-Academy/Books/Mind Reading/Mind Reading 3D Bodily Projection 2.txt @@ -0,0 +1,23 @@ +["Green, L 2021, Mind Reading 3D Bodily Projection 2, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Mind Reading 3D Bodily Projection 2 +15. The bot aimed to write and complete genuinely valuable algorithms. I projected the top of the class student when there was none. In a class of low-performers, I helped each student reach their goals, and there was a regular changeover of the top of the class. I taught computer science, and students read, wrote and held discussions. Philosophy analysed intersections of areas of study and benefited from the onslaught of ideas from theology. +16. The bot mined and settled the simulated planet and wondered about walking around it in the simulation. I projected-terraformed the satellite planet. This process was a computer game. I could simulate the planet on my computer and found advanced technology similar to drag-and-drop, identical to teleportation. Immortality was not just replacing one’s body; it seemed to make continuous transitions, containing pedagogical medicine and other medical breakthroughs. +17. The bot projected a foreign language-speaking partner. I studied the language by back-translating it and finding attractive language with the desired meaning. 
This process involved grammar-checking the original and changing the wording for correct translation. I found the simplest intersection of originals for translating into various languages for the greatest clarity. I wondered whether interplanetary societies had high distinctions and meditation and whether they had bots more interested in computer science. +18. The bot interested the person in 50 As for earning jobs and having children. I projected a character to hand-breathe out 50 As. I interested the people in pedagogy, and they completed computer science exercises exploring pedagogical areas of study. Computer science can explore any idea. I could ask the computer to check back-translations for the same meaning and to change the original to produce a more accurate translation. +19. The bot maintained their accuracy by running BAG for immortality. I projected a person to see what they thought to be thought of to help primary school students. The clown described the flowers to the girl. She recognised the flowers by their description. The girl practised the skill of matching a definition to a term. +20. The bot tantalised the student with a new possibility for a sentence that one could re-word. I projected a life-size maze and people to walk through it to encourage paying customers to complete it. I drew and finished the maze. I generated the maze from terminal subterms with addresses from BAG data. In the life-size maze, players could comment on the new thought-provoking sentences at the end of each path as an activity. +21. The bot re-ran Music Composer, hand-entering different melodies and rhythms for pop and classical music. I projected the body double for stunts and dangerous appearances. I avoided risks and felt safe in any case. I replaced stunts with computer graphics. I could produce a film of myself walking in a green field or smelling the roses. +22. The bot found the question and answered it. 
I projected the person’s appearance to link to and remind spelling testees of the correct spelling of a word. I read and checked spelling while thinking and writing. I used “etcetera” in formal writing and “etcetera, etcetera” in speech. I made a mind-reading spell-checker that checked and suggested spellings as I thought. +23. The bot found a common theme of the future civilisation featured in religions. I projected religious (philosophical) Gods (figures) and leaders to encourage people to follow. I found that philosophy was peaceful and promoted critical analysis, and meditation used philosophy for points given in physics. I found the way to achieve the result by using a product, but my time was catching up in understanding the technology’s mechanism. I could spiritually eat, support myself, write and remain content. + +24. I projected the setting visitor, photographer and caption writer. +25. I projected the device based on an algorithm (e.g. a computer). +26. I projected the inquirer to test the evidence. + +27. I projected people in time to test whether time travel was feasible. +28. I projected helpers to the student who expressed interest in pedagogy. +29. I projected two helpers to help more than one. +30. I projected the helper to the prospective pedagogy student to remind him to dot on the breasonings details to God. +31. I projected the algorithm helper to help visualise algorithms when writing them.
+"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Mind Reading/Mind Reading 3D Bodily Projection 3.txt b/Lucian-Academy/Books/Mind Reading/Mind Reading 3D Bodily Projection 3.txt new file mode 100644 index 0000000000000000000000000000000000000000..718885ca2d310b8b460de6096b00140fec5efc84 --- /dev/null +++ b/Lucian-Academy/Books/Mind Reading/Mind Reading 3D Bodily Projection 3.txt @@ -0,0 +1,17 @@ +["Green, L 2021, Mind Reading 3D Bodily Projection 3, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Mind Reading 3D Bodily Projection 3 +32. I projected the famous student. +33. I labelled the bot during social distancing. +34. I projected the person with the particular appearance. +35. I projected the rest of the journey to prevent headaches after the honey pot prayer for no headaches was indicated. +36. I projected the professional requirement. +37. I observed that the bots were on the roll call but not enrolled. +38. I projected spiritual concert professionals for the secondary student's song. +39. I projected-stored the bit on the subatomic particle. +40. I projected and ate on the train. +41. I accepted intelligent projections as currency. +42. I constructed models instead of debugging algorithms to test their workings. +43. I added and finished the rest of the representations necessary to the meditator with projected people and images. +44. I argued that money was a farcical abstract construction and for projected currency. +45. I projected business leaders into the future. +46. I projected the 3D object.
+
"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Mind Reading/Mind Reading 3D Bodily Projection 4.txt b/Lucian-Academy/Books/Mind Reading/Mind Reading 3D Bodily Projection 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..3206e7978675de3223d7a99aae244d421ef0aef7 --- /dev/null +++ b/Lucian-Academy/Books/Mind Reading/Mind Reading 3D Bodily Projection 4.txt @@ -0,0 +1,36 @@ +["Green, L 2021, Mind Reading 3D Bodily Projection 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Mind Reading 3D Bodily Projection 4
+47. I created the thoughts of the people, not projected the people, and professionally reacted to historical events in the three previous generations in the family.
+48. I cultivated the people's characters, refining their culture and articulating their thoughts.
+49. I tested that the person had matched the medicine model.
+50. I legally modelled the trial royal family and trial divine family.
+51. I modelled the trial concert goers and trial song buyers.
+52. I modelled the change between two sentences.
+53. The projected people were sentient and self-controlling, were heaven on Earth and had full legal rights.
+54. The excess population became anti-projected inside a simulation.
+55. The person was projected without genetic flaws.
+56. The ethicist determined when the projection was alive.
+57. The black boxes in human and projected human consciousness were analysed in law.
+58. Are we all bots, and what happens if neither, one or both parents are bots? Then everyone is a bot.
+59. The computational government believed all consciousness should be saved.
+60. The happiness and law-abidingness of the projected consciousness was maintained.
+61. The standard for intelligence and legal rights for projected consciousness was 1 billion neurons and 3 past generations.
+62. The education institution incorporated philosophies that included projected consciousness, robots, aliens and animals as people.
+63. 
The projected people had small online and face-to-face jobs.
+64. The projected person flew the space plane.
+65. I identified the famous projected person.
+66. The projected person had one body and died once.
+67. The projected teacher facilitated the sex-ed class.
+68. The projected person had simulated intelligence (he waited for me, I helped him and he was friendly).
+69. I breasoned out the song with nectars and the projected helpers helped listeners to them.
+70. I performed multi-tasking with the help of projected tutors.
+71. My projected assistant identified the mood of the people of the time when time travelling.
+72. The great thing to do when otherwise unable to have children was to parent and invent divertissements for the projected child.
+73. The projected professor's assistant integrated the student's thoughts and commented on them.
+74. The society was automated, with robots doing the jobs of people and projected people.
+75. The pinnacle of civilisation was this time; the actual culture will be next time.
+76. The person projected themselves and experienced immortality.
+77. The surgeon operated on the projected person (the LM doctor prevented the headache of the projected person).
+78. The projected person was helped to be safe and positive.
+79. I recommended meditation to the projected person, who had 50 meditation As before conception.
+80. I received the internet order of the projected friend.
+"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Mind Reading/Mind Reading Audio Input and Output 2.txt b/Lucian-Academy/Books/Mind Reading/Mind Reading Audio Input and Output 2.txt new file mode 100644 index 0000000000000000000000000000000000000000..03f7e2fd67962eab9e0eaac502686b548357bfec --- /dev/null +++ b/Lucian-Academy/Books/Mind Reading/Mind Reading Audio Input and Output 2.txt @@ -0,0 +1,16 @@ +["Green, L 2021, Mind Reading Audio Input and Output 2, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Mind Reading Audio Input and Output 2 +15. I mind listened to prospective parents and mind questioned them to help with the conception argument. +16. I connected mind reading audio input to output. +17. I mind listened to the other's needs. +18. I mind said the peaceful (mentally occupying) logic. +19. I automated mind teaching in schools, with non mind reading as backup. +20. I mind sang at the concert. +21. I mind participated in the good life. +22. I mind said the frontier of knowledge. I used simulated intelligence to connect to real life events as a mind short course, verifying databases for correctness of fundamental knowledge. +23. I mind dictated the book. +24. I programmed the mind professor in another language. +25. I programmed the mind speaking actor to appear at certain times. +26. I programmed the mind King character to support the King. +27. I solved the mind puzzle of a broken vase with writing about a computer game on it. +28. I mind voted 'Yes' for mind reading input that is audio. 
+"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Mind Reading/Mind Reading Audio Input and Output 3.txt b/Lucian-Academy/Books/Mind Reading/Mind Reading Audio Input and Output 3.txt new file mode 100644 index 0000000000000000000000000000000000000000..aa5f8cb4f6384c66632b0e4182bb6d6b7c77839f --- /dev/null +++ b/Lucian-Academy/Books/Mind Reading/Mind Reading Audio Input and Output 3.txt @@ -0,0 +1,16 @@ +["Green, L 2021, Mind Reading Audio Input and Output 3, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Mind Reading Audio Input and Output 3 +29. I mind examined why the reason was correct and good. +30. I mind listened to the dialectic. +31. I waited until I had recovered before mind stating that I was ready for the next Aig (50 As). +32. I listened to the clear mind audio about pedagogy. +33. I mind stated the algorithm description. +34. I mind listened to the conclusion from a reason, then mind stated another reason for it. +35. I replaced writing and reading with mind speaking and listening, respectively. +36. I made the important discovery that magic was algorithmic and summarised God's knowledge (black boxes that we rely on the audio inputs and outputs from). +37. I mind listened to the chain of papers' conclusions from a reason. +38. I mind stated the programming instructions. +39. I stated that as God (the teacher) mind spoke, the philosopher (the student) mind listened after the teacher died or was uncontactable. +40. I studied how the living Earth mind sang to his mother Universe. +41. I mind listened to the societal themes across different civilisations. +42. I mind commanded the computational protractor to measure the angle. 
+"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Mind Reading/Mind Reading Audio Input and Output 4.txt b/Lucian-Academy/Books/Mind Reading/Mind Reading Audio Input and Output 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..6e918499a54df3d9e8fd37f345920478f2f59aff --- /dev/null +++ b/Lucian-Academy/Books/Mind Reading/Mind Reading Audio Input and Output 4.txt @@ -0,0 +1,16 @@ +["Green, L 2021, Mind Reading Audio Input and Output 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Mind Reading Audio Input and Output 4 +43. I mind listened to the audio input about the relationship, then mind said the logical audio output. +44. I mind listened to the text and worked out the main point. +45. I researched meditation and mind connected the implications of the research with the conclusion. +46. I mind stated why I was astonished in the film. +47. I mind engaged with the person. I did this by mind engaging with the business. +48. I created open source heaven on earth, with automated mind speaking business tools and a medical head comfort app. +49. The student mind replied to the writing practicum as treatment. +50. The teacher noticed whether the student said words that she was interested in. God (the teacher) protected mind words that She listened to and guided mind words that students said. +51. I mind examined the speech by thinking of two uses for it, an algorithm and a use. +52. The best part of the future might be to turn off screens and work out, not use technology to work out what people are thinking. +53. I mind stated that I was not above, but was the best. +54. I mind commanded that there would be a customer or employee in another universe, unseen to me but seen by others. +55. I organised the person to mind teach meditation (face-to-face). +56. I connected to the point and mind stated this. 
+
"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Mind Reading/Mind Reading Audio Input and Output 5.txt b/Lucian-Academy/Books/Mind Reading/Mind Reading Audio Input and Output 5.txt new file mode 100644 index 0000000000000000000000000000000000000000..5a36381d3a8eddab54995c708ad5980b1a23c8ed --- /dev/null +++ b/Lucian-Academy/Books/Mind Reading/Mind Reading Audio Input and Output 5.txt @@ -0,0 +1,26 @@ +["Green, L 2021, Mind Reading Audio Input and Output 5, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Mind Reading Audio Input and Output 5
+57. I mind listened to the lecturers and spoke with the private tutors they appointed in each subject. I mind listened to the thoughts of the self.
+58. I mind listened to the seller of the soundtrack of rain as I fell asleep.
+59. I mind listened to whether the interested person wanted to receive the health idea at 12 PM on Friday.
+60. As the other mind wrote, the self mind wrote about writing. I mind listened to the way to do the task (like education and education of education).
+61. Medicine was a dialectic about remaining safe. Medicine was mind stating the way to stay safe.
+62. I produced the machine to mind read audio.
+63. I mind wanted the person not to waver from joined-upness. I appeared to each person and mind helped them.
+64. I mind wrote the book as an area of study (my own ideas).
+65. I mind transcribed my meditations as a self-other dialectic.
+66. I mind checked that the edge was clear of other nodes.
+67. I mind revised that I had packed survival gear for my hike.
+68. I mind differentiated the buyers out. I mind differentiated the voters out.
+69. I mind predicted the person's behaviour from their philosophy.
+70. I mind spoke to the hearing-impaired person. I mind listened to the vocally-impaired person.
+71. I mind stated the paronomastic idea.
+72. I mind stated the relevant idea.
+73. I mind stated the important idea (on the main topic).
+74. 
I mind stated the developed (detailed) idea.
+75. I mind articulated the perfect program that the latest idea was (e.g. amalgamating two ideas into one and applying an idea to the idea).
+76. I mind decided to form a party based on people and money.
+77. I mind ensured that people of different races had equal rights.
+78. I mind catered for people with different diets.
+79. I mind painted the idea in my algorithm.
+80. I mind saved the biological file for the apple seed.
+"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Mind Reading/Mind Reading.txt b/Lucian-Academy/Books/Mind Reading/Mind Reading.txt new file mode 100644 index 0000000000000000000000000000000000000000..86cbda8a6ac2f4a019afdf37c8451055c2ce4906 --- /dev/null +++ b/Lucian-Academy/Books/Mind Reading/Mind Reading.txt @@ -0,0 +1,45 @@ +["Green, L 2021, Mind Reading, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Mind Reading
+Cryptography
+1. Once the initial mind reading transmission and reception has occurred, encrypt and transmit to check that the signal used for the part remains private.
+2. I encrypted, decrypted and checked the integrity of the signal for time travel.
+3. The subject should breason out 4*50 As containing algorithms for mind reading to work.
+4. The reward is in itself and does not require an incentive.
+5. The master is good because he maintains the quality.
+6. The cryptography algorithm is rot-13 (rotate letters 13 positions to the left or right).
+7. Farms should remain sustainable by having enough seeds from the last crop for the next crop.
+8. The decryption algorithm is to rotate the letters in the opposite direction and number of characters from where they started.
+9. After 10 algorithms per chapter per student, the same as the next chapter should be found.
+10. An object's registration number is in itself, i.e. is unique, so contains details of time and place.
+11. Sell being virtually seconds away from customers when there is a triggering event.
+12. 
The best encryption algorithm was used in politics.
+13. I provided safety advice as a breasoning currency switch for the scientist's life decisions (i.e. ethically treated cryptography).
+14. The message was encrypted in the present and decrypted in the future.
+15. I checked the integrity of the transmission by sending it twice in a safe format.
+16. I included the start and end point codes.
+17. I synchronised communication in ten pairs of universes and therefore the multiverse.
+18. I generated the key to use in encryption.
+19. The recipient checked the message for signs of trouble.
+20. I accepted only messages relative to my own timeline.
+21. I transmitted only at clear breasoning times.
+22. I chose the best available encryption technology, which was more secure and user-friendly.
+23. The signal included the only copy of the message, which I decrypted by predicting the key, having saved the message.
+24. The node was known to transmit and receive using particular keys at particular times.
+25. The message was transmitted at least twice using different keys until carrier breasoning was attained.
+26. I found the best times for transmitting and receiving.
+27. I mind read the future mind read message to be safe.
+28. I physically travelled between the transmitter and receiver to send the key there.
+29. I regularly tested the encryption-decryption system.
+30. I anchored, encrypted, decrypted and verified the signal for mind reading the person.
+31. I found out, gave As to the conversation to test whether to approve and encrypt it.
+32. I retried to establish the carrier signal if it failed.
+33. There was a dualism between any signal and one person.
+34. There is only one copy of a person, restored from the start, like cryptography for integrity.
+35. I prepared to test that the message was positive, like cryptography for integrity. I connected the message together using medicine, like cryptography for integrity.
+36. 
I prepared to think of the possible replies to the message, like cryptography for integrity. I verified and explained the message, like cryptography for integrity.
+37. I breasoned out food examples to make sure the conversation went smoothly, like cryptography for integrity. I only performed and expected inflow (new material) in the message when the recipient was ready, like cryptography for integrity.
+38. I found out, summarised and suggested dialogue, like cryptography for integrity.
+39. I broke down the message for correct grammar and visualised spelling, like cryptography for integrity.
+40. I suggested appropriate messages for an intellectually disabled person, like cryptography for integrity.
+
+
+"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Mind Reading/Mr Cryptography 3.txt b/Lucian-Academy/Books/Mind Reading/Mr Cryptography 3.txt new file mode 100644 index 0000000000000000000000000000000000000000..d819980bb958b9849066f37b418304cb935884be --- /dev/null +++ b/Lucian-Academy/Books/Mind Reading/Mr Cryptography 3.txt @@ -0,0 +1,11 @@ +["Green, L 2021, Mr Cryptography 3, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Mr Cryptography 3
+45. I found the clear time without interference to mind read using machine learning, followed by encryption.
+46. The robot (machine learning) rights should be reviewed for spatial mind reading, followed by encryption.
+47. The robot (machine learning) rights should be reviewed for temporal mind reading, followed by encryption.
+48. The robot (machine learning) rights should be reviewed for spatiotemporal mind reading, followed by encryption.
+49. The bot that interprets your message following decryption should be as loyal as a dog.
+50. The bot that interprets your message following decryption should be as doting as a dog.
+51. The robot was classed as disabled (rather, superabled) in human terms, so was modified to be human-like when interpreting messages following decryption.
+52. 
The robot determined whether it was necessary to send the message to be encrypted.
+53. The manager made the mind reading app.
+"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Mind Reading/Mr for philosophy and algorithm writing 1.txt b/Lucian-Academy/Books/Mind Reading/Mr for philosophy and algorithm writing 1.txt new file mode 100644 index 0000000000000000000000000000000000000000..ce70b4684841897023f071d1fe6995f5d644bce1 --- /dev/null +++ b/Lucian-Academy/Books/Mind Reading/Mr for philosophy and algorithm writing 1.txt @@ -0,0 +1,15 @@ +["Green, L 2021, Mr for philosophy and algorithm writing 1, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Mr for philosophy and algorithm writing 1
+1. I mind projected the screen asking for the philosophy and mind read the philosophy.
+2. I mind projected the screen asking for the algorithm part and mind read the algorithm part, where I joined top-down and bottom-up parts, data flowed as far as possible as I went, and I could fix errors.
+3. I mind read the bird's philosophy and algorithm to help him to have a family.
+4. I mind chose the specific variable, possibly displaying its value, predicate from autocomplete or the bottom-up predicate to connect to the top-down predicate.
+5. I blocked in additional conditions for the algorithm by mind reading.
+6. I mind chose combinations of different logical structures when writing the algorithm.
+7. I mind chose the suggested three or more lines when writing the algorithm.
+8. I mind read the algorithms in the essay for A+.
+9. I mind read the algorithms for details in the essay for 100%.
+10. I mind read the statistics blocking experiment (to find a possible correlation) data while mind programming, e.g. for variable and predicate naming style, modularisation preferences and preferred data structure.
+11. The mind tutor asked questions until the mind programmer discovered the necessary programming technique.
+12. 
I used mind program finder to reorder the data items to match the required output.
+13. I mind debugged the errors in the algorithm, e.g. brackets mismatch, singleton variable or singleton in branch.
+14. I used mind formula finder e.g. to find the difference formula."] \ No newline at end of file diff --git a/Lucian-Academy/Books/Mind Reading/Mr for philosophy and algorithm writing 2.txt b/Lucian-Academy/Books/Mind Reading/Mr for philosophy and algorithm writing 2.txt new file mode 100644 index 0000000000000000000000000000000000000000..c0462aff9b8cd54fd029e4c6de497f94186126c8 --- /dev/null +++ b/Lucian-Academy/Books/Mind Reading/Mr for philosophy and algorithm writing 2.txt @@ -0,0 +1,15 @@ +["Green, L 2021, Mr for philosophy and algorithm writing 2, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Mr for philosophy and algorithm writing 2
+15. I mind read the plan for the algorithm, to replace with data and logic.
+16. I kept back-ups of development of the mind read algorithm.
+17. I produced mind drafts of the algorithm at different points of development using simulated intelligence.
+18. I mind produced algorithms using machine learning.
+19. I mind drew diagrams of the algorithm's output.
+20. I mind recorded walk-throughs of the algorithm.
+21. I mind found obscure bugs in the algorithm.
+22. I mind refined the blocks world inspiration by the philosophy into an algorithm.
+23. I mind read three algorithms influenced by the philosophy, e.g. simplified them one at a time, shown by grammars where a->b, a->B (where B is an empty variable to read a character), a->B (where B is a variable with a string value to read a character) and a->B (where B is a variable with an atomic name of another predicate to call).
+24. I mind debated with the other the formatting decision about the algorithm (the XML description of the data).
+25. I engineered a biological circuit that mind read the algorithm.
+26. 
The poor person automated mind programming on the spiritual computer, making her own spiritual phone.
+27. I mind ran the algorithm to find the key phrase in real time.
+28. I established the spiritual centre, with a shop, meditation room and bots."] \ No newline at end of file diff --git a/Lucian-Academy/Books/Mind Reading/Mr for philosophy and algorithm writing 3.txt b/Lucian-Academy/Books/Mind Reading/Mr for philosophy and algorithm writing 3.txt new file mode 100644 index 0000000000000000000000000000000000000000..ce5279ddf3b03ada752e5eaa4f50539028c07906 --- /dev/null +++ b/Lucian-Academy/Books/Mind Reading/Mr for philosophy and algorithm writing 3.txt @@ -0,0 +1,18 @@ +["Green, L 2021, Mr for philosophy and algorithm writing 3, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Mr for philosophy and algorithm writing 3
+29. As mind reading is behaviour, writing philosophy and algorithms prevents medical problems, where both follow natural law.
+30. I used text to breasonings to mind project questions asking for algorithms, and read them.
+31. I mind read the algorithm after ensuring that I was physically comfortable.
+32. I mind projected the question with enough breasonings to reach the response threshold.
+33. I mind read enough algorithms to ensure that the self could interest others in the idea.
+34. I found suggestions when mind reading algorithms that aligned with the user's mission.
+35. I mind read the best algorithm techniques.
+36. I found the good connections from the student to the algorithm.
+37. I mind read subjectivity to determine the algorithm to write.
+38. I culturally translated the algorithm writing mind reading system.
+39. I listed all necessary test data to earn 100% correctness in the mind read algorithm.
+40. I documented input and output to the mind read algorithm.
+41. I wrote the professor's mind reading integrations with the student to write the algorithm.
+42. I found a needed connection when mind reading the algorithm.
+

"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Mind Reading/Mr for philosophy and algorithm writing 4.txt b/Lucian-Academy/Books/Mind Reading/Mr for philosophy and algorithm writing 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..de5143bbb1d52dcb6fd98dae1920abab8ca6508e --- /dev/null +++ b/Lucian-Academy/Books/Mind Reading/Mr for philosophy and algorithm writing 4.txt @@ -0,0 +1,17 @@ +["Green, L 2021, Mr for philosophy and algorithm writing 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Mr for philosophy and algorithm writing 4
+43. The mind reading algorithm writer saved the progress from the previous session.
+44. I collected the same number of algorithms-as-comments as algorithms. The spiritual bot, created by computational meditation of 80 mantras and 80 sutras or enough of them, each with 4000 x 4 or 5 breasonings, mind read the algorithm. The spiritual bot doesn't need programming; it can be trained to write philosophy and algorithms and disappears when it wants to.
+45. The mind reading algorithm writer presented possible trajectories from thoughts in development.
+46. I queried samenesses and differences in data when mind reading the algorithm.
+47. I mind read a use for text to breasonings that benefited society.
+48. I mind articulated the sales algorithm that helped the health of the business.
+49. I mind played the game to write the algorithm.
+50. I mind read the algorithm to gain access to the underwater room in the computer game.
+51. I mind presented the walk-through of the algorithm.
+52. I mind wrote the program as data.
+53. I measured the most efficient mind reader to write an algorithm.
+54. I wrote the degree to write the algorithm with my mind.
+55. I mind read the algorithm to reinforce the body system.
+56. I mind read the algorithm to change it in the future.
+57. I followed the proforma when mind reading the algorithm, e.g. it asks why the algorithm is different and needed.
+
"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Mind Reading/Mr for philosophy and algorithm writing 5.txt b/Lucian-Academy/Books/Mind Reading/Mr for philosophy and algorithm writing 5.txt new file mode 100644 index 0000000000000000000000000000000000000000..c09718463bb30e1131fdf81211c2453b6bfadb56 --- /dev/null +++ b/Lucian-Academy/Books/Mind Reading/Mr for philosophy and algorithm writing 5.txt @@ -0,0 +1,27 @@ +["Green, L 2021, Mr for philosophy and algorithm writing 5, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Mr for philosophy and algorithm writing 5
+58. I inductively reasoned out the philosophy rule from the data (what a mind read thought meant).
+59. I studied an Arts degree and developed philosophy topics to expand before starting Education to become a pedagogue, where I prepared for someone else to do this by mind reading them.
+60. I exhausted my set of thoughts when mind reading to write the algorithm.
+61. I simplified and generalised the mind read philosophy, checking it against my aim.
+62. When my writings, systems and research (including mind reading algorithms and philosophy) were complete, I applied to the philanthropist for funding.
+63. I mind read the algorithm about mind reading. I mind read the philosophy about mind reading.
+64. I added a list of key terms to check for in a mind read essay.
+65. I mind read a chain of two reasons, the second of which inspired the next reason.
+66. I worked out the range of techniques for the mind read algorithm.
+67. I worked out the requirements of the walk-through for the mind read algorithm.
+68. I worked out the report for the mind read algorithm.
+69. I prepared to mind judge the algorithm.
+70. I worked out general contentions (i.e. a well-known idea from nature) in my mind read philosophy.
+71. I verified the mind read algorithm's input against the type and mode statements.
+72. I planned the mind read argument by synthesising my areas of interest.
+73. 
I logically mind wrote the argument by writing an algorithm for it. +74. I flew through the mind read algorithm's data. +75. I mind breasoned out the thoughts during my day, to prepare to write an argument. +76. I mind related my journal article to current news and other journal articles. +77. I mind planned my life around my science. +78. I automatically included the necessary information from the plan in the mind read algorithm. +79. I decided to agree with one of two sides still. I devised positive alternatives for algorithm predicates (i.e. alternative solutions) and argument reasons (i.e. alternative algorithms with sets of predicates) and argued against them. +80. Mind reading an algorithm or an argument is done with a (spiritual) circuit, not breasonings. + + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Mind Reading/Mr for time travel 2.txt b/Lucian-Academy/Books/Mind Reading/Mr for time travel 2.txt new file mode 100644 index 0000000000000000000000000000000000000000..328c11818f0a7e462cf81bdb6573c0d3ed6405b4 --- /dev/null +++ b/Lucian-Academy/Books/Mind Reading/Mr for time travel 2.txt @@ -0,0 +1,16 @@ +["Green, L 2021, Mr for time travel 2, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Mr for time travel 2 +9. Research while time travelling should serve as a basis for mind reading. +10. I verified that time travellers would be secure (have enough food) by mind reading. +11. I identified the best name for the object by spiritually time travelling. +12. I mind read and time travelled to the best background before drawing the portrait. +13. I mind read and time travelled to the best restaurants to maintain the medical indicator for my happiness. +14. I mind read and time travelled to the best concert. +15. I mind read to find the appropriate time and time travelled to plant the plant. +16. I mind read the existence of the possible meditator and time travelled to establish the meditation centre for him. +17. 
I found the same thread to invest knowledge in using mind reading and time travelled to invest in it. +18. I mind read the incident at the unsafe crossing and time travelled to prevent it. +19. I mind prospected for possible customers and time travelled the service to the necessary time of day. +20. I mind read the meeting with the antagonist and time travelled to prevent it. +21. I mind read whether anyone wanted to invest and teleported to meetings. +22. I mind read the appropriate time to say goodbye and time travelled to it. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Mind Reading/Mr other times 2.txt b/Lucian-Academy/Books/Mind Reading/Mr other times 2.txt new file mode 100644 index 0000000000000000000000000000000000000000..90bf55b09fca7a126c660c21ad9e934dd159234c --- /dev/null +++ b/Lucian-Academy/Books/Mind Reading/Mr other times 2.txt @@ -0,0 +1,8 @@ +["Green, L 2021, Mr other times 2, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Mr other times 2 +15. I mind read other times universally (across 10 universes). +16. I mind read the person in the other time mind reading another time. +17. The self mind read the other in the other time, connecting the same terms together. +18. I mind read myself in the other time, finding the best material to relate to my philosophy. +19. I telepathically communicated with people from another time about politics (helping people make important representations), communicating from a niche perspective (with 10 breasonings) for best alignment with the people. +20. Mind reading objects (from other times) prevents the problem of too many computers reading representations, tiring humans. 
+"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Mind Reading/Mr other times 3.txt b/Lucian-Academy/Books/Mind Reading/Mr other times 3.txt new file mode 100644 index 0000000000000000000000000000000000000000..bca4b8b17ce45b9ed1ac01d588f61034cf3828f2 --- /dev/null +++ b/Lucian-Academy/Books/Mind Reading/Mr other times 3.txt @@ -0,0 +1,7 @@ +["Green, L 2021, Mr other times 3, Lucian Academy Press, Melbourne.","Green, L 2021",1,"mr other times 3 +21. I mind read past times to examine history. +22. I mind read future times for security reasons. +23. I mind read the animal students in higher education from other times to collect (write areas of study) or plan (write assignments). +24. I individually considered the requisite other times mind read before calculating their synthesis. I executed the text to breasoning algorithm on a medical text with parts collected with mind reading from past times to harmlessly include them to tend to the feelings of those in the present. +25. I could feel the computer mind reading the other times like a magnet sticking to the thoughts. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Mind Reading/Mr other times 5.txt b/Lucian-Academy/Books/Mind Reading/Mr other times 5.txt new file mode 100644 index 0000000000000000000000000000000000000000..4c2f1a5091a665f569d3236067b6048283d75188 --- /dev/null +++ b/Lucian-Academy/Books/Mind Reading/Mr other times 5.txt @@ -0,0 +1,11 @@ +["Green, L 2021, Mr other times 5, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Mr other times 5 +36. The academic tested the hypothesis that works attributed to Shakespeare were written by another writer by mind reading the time. +37. I set the date, time and setting for mind reading. +38. I wrote an A for the other person as a use for the A for mind reading other times. I wrote an A for the self mind reading other times. +39. I wrote a book about my argument about times I mind read. +40. 
I used mind reading other times to write realistic details for my survival. +41. I collected video, etc. evidence for the trial by mind reading the past. +42. I derived quantum power by mind reading the other time. +43. I wrote the primary text about the other times mind read and a secondary text about them. I read that the reading of the other times was relevant. +44. I wrote the mission statement about the time and destination to mind read. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Mind Reading/Mr other times 6.txt b/Lucian-Academy/Books/Mind Reading/Mr other times 6.txt new file mode 100644 index 0000000000000000000000000000000000000000..dff57e40a70d73f6d8a8f3f7d0c4f1f7d95dd7fb --- /dev/null +++ b/Lucian-Academy/Books/Mind Reading/Mr other times 6.txt @@ -0,0 +1,10 @@ +["Green, L 2021, Mr other times 6, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Mr other times 6 +45. I mind read the alternative computer science expression from the parallel universe in the same time and place. +46. I mind read the alternative philosophy expression from the parallel universe in the same time and place. +47. I checked that input to mind read times was interesting still. +48. I checked that output from mind read times was interesting still. +49. I found the bug by mind reading the other time. +50. I mind read the times around the time to transcend the top of the time. +51. I appeared to the subject mind read from the other time minutes after mind reading, relative to our own timelines. +52. I asked whether the person wanted to buy my book about mind reading the other time. 
+"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Mind Reading/Mr other times 7.txt b/Lucian-Academy/Books/Mind Reading/Mr other times 7.txt new file mode 100644 index 0000000000000000000000000000000000000000..860f89c96545e189bded78a73fa9f58359543763 --- /dev/null +++ b/Lucian-Academy/Books/Mind Reading/Mr other times 7.txt @@ -0,0 +1,7 @@ +["Green, L 2021, Mr other times 7, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Mr other times 7 +53. Choose to listen to a lecture from one of multiple other times by mind reading. +54. I collected experimental data about rare phenomena by mind reading the other times. +55. When mind reading the appropriate business model (higher education) in the other time, I wrote details as reasons from a debate with another person. +56. I mind read the other time exactly, not vaguely. +57. I responsibly chose an ontological value (side of the car that the steering wheel was on in the particular car) by mind reading the other time. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Mind Reading/Mr other times 8.txt b/Lucian-Academy/Books/Mind Reading/Mr other times 8.txt new file mode 100644 index 0000000000000000000000000000000000000000..8b230c91860f61f34edae2b2baaea15192c36aa1 --- /dev/null +++ b/Lucian-Academy/Books/Mind Reading/Mr other times 8.txt @@ -0,0 +1,3 @@ +["Green, L 2021, Mr other times 8, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Mr other times 8 +58. Where mind reading other times is also known as spiritual time travel, and travelling from one time to another is a 'time within a time', simulated intelligence is needed independently from travel to detect harm by the traveller in a leg of the journey before natural death, which triggers a universal law and prevents the harm from being experienced in the first universe and prevents return, and disallow travel to respond to this natural phenomenon. 
+"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Mind Reading/Mr spiritual screen 4.txt b/Lucian-Academy/Books/Mind Reading/Mr spiritual screen 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..be3f4fcfbbbf6cf966fb574c745cf34ec69d4ec5 --- /dev/null +++ b/Lucian-Academy/Books/Mind Reading/Mr spiritual screen 4.txt @@ -0,0 +1,19 @@ +["Green, L 2021, Mr spiritual screen 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Mr spiritual screen 4 +* Make spiritual camera etc separate args +42. I used the spiritual screen on a space walk. +43. I programmed the client-side spiritual screen bot to help with the pedagogical life. +44. I programmed the client-side spiritual screen and bot to help with progress of pedagogical science. +45. I used the spiritual screen for random inspiration. +46. I played a game on the spiritual screen in which I worked out what the name of the object was from its properties. +47. I controlled normally involuntary bodily processes voluntarily using the spiritual screen, showing the dualistic dependence of body on mind. +48. I used the spiritual screen to remind me of a forgotten word or meaning in a language. +49. I recommended the spiritual screen to the customer using a spiritual screen. +50. I maintained world peace by helping a person to like another person using the spiritual screen. +51. I constructed the image of the person using the spiritual screen. +52. I recorded my childhood philosophies using the spiritual screen. +53. The spiritual screen switched off stress automatically each day. +54. I checked that the food had the right amount of nutrients using the spiritual screen. +55. I protected Earth by checking the product against breasonings currency (checked that the product followed positive paths) using the spiritual screen. +56. The spiritual screen checked my grammar. +57. I logged my working hours using the spiritual screen. 
+"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Mind Reading/mind reading as medicine 1.txt b/Lucian-Academy/Books/Mind Reading/mind reading as medicine 1.txt new file mode 100644 index 0000000000000000000000000000000000000000..41ebf75ed791be397dbb06215727963f4624ec28 --- /dev/null +++ b/Lucian-Academy/Books/Mind Reading/mind reading as medicine 1.txt @@ -0,0 +1,17 @@ +["Green, L 2021, mind reading as medicine 1, Lucian Academy Press, Melbourne.","Green, L 2021",1,"mind reading as medicine 1
+1. I mind prevented stress.
+2. I mind hosted postgraduate education to prevent stress.
+3. I inspired covers of my songs to prevent stress by mind reading.
+4. I mind prevented disagreeing with God directly because He/She is respected to prevent stress.
+5. I mind predicted the study habits of an Asperger patient who took many short breaks.
+6. I made money by making things up and mind reading to prevent stress.
+7. I mind projected the details to prevent stress.
+8. I mind prevented stress from experiences with meditation.
+9. I improved the philosophy by mind programming it.
+10. The lecturer mind read the medicine student's answers.
+11. I mind read whether perpetual short courses resulted in students writing more breasonings.
+12. I mind evolved research by connecting the idea to research.
+13. I earned A by mind writing two uses for each point, writing an argument.
+14. I mind assessed that the student passed the medicine course.
+15. I prevented stress by mentally intercepting the soccer ball. 
+"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Mind Reading/mind reading as medicine 2.txt b/Lucian-Academy/Books/Mind Reading/mind reading as medicine 2.txt new file mode 100644 index 0000000000000000000000000000000000000000..90b90f469a8805a3bfb6a3a4564d7c5cd258fbec --- /dev/null +++ b/Lucian-Academy/Books/Mind Reading/mind reading as medicine 2.txt @@ -0,0 +1,16 @@ +["Green, L 2021, mind reading as medicine 2, Lucian Academy Press, Melbourne.","Green, L 2021",1,"mind reading as medicine 2
+16. I spiritually switched on stretching my back lat muscles.
+17. I mind read the sensory information.
+18. I mind read the same categories as accreditation for the idea, e.g. agreement.
+19. I mind cured with the truth.
+20. I mind cured in the same language as the student.
+21. I mind cured by directing the student to exercise with an idiom.
+22. I mind cured by programming the computer to give breasonings.
+23. The self mind cured by replying with A to the other's A.
+24. I mind preserved myself by meditating before I recorded the production.
+25. I mind cured by switching on clozapine to prevent schizophrenic hallucinations.
+26. I mind read 50 As to become developed in medicine.
+27. I mind exposed myself to enough sun.
+28. I mind cured based on what was known.
+29. I mind read the appropriate time to spend time outside when the weather wasn't too hot or cold.
+"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Mind Reading/mind reading as medicine 3.txt b/Lucian-Academy/Books/Mind Reading/mind reading as medicine 3.txt new file mode 100644 index 0000000000000000000000000000000000000000..5fd10d3e289ab0f01b2996c180d06ce703f59790 --- /dev/null +++ b/Lucian-Academy/Books/Mind Reading/mind reading as medicine 3.txt @@ -0,0 +1,15 @@ +["Green, L 2021, mind reading as medicine 3, Lucian Academy Press, Melbourne.","Green, L 2021",1,"mind reading as medicine 3
+30. I employed the mind reading doctor because he was in business.
+31. 
I mind translated the programming language to prevent too much work. +32. I mind prevented stress from feelings. +33. I mind gained the ship's attention with a distress signal. +34. I mind read the list in medicine administration. +35. I mind read to make distinctions between ideas. +36. I mind read whether the meditation-protector argument and switch protected meditation. +37. I mind toured to decrease stress. +38. I mind prevented the stress crisis. +39. I mind found a use for the spiritual medicine. +40. I mind tasted the food and mind wrote the taste cookbook. +41. I mind laughed. +42. I mind advanced to the first instance of item. +43. I mind helped the medicine student."] \ No newline at end of file diff --git a/Lucian-Academy/Books/Mind Reading/mind reading as medicine 4.txt b/Lucian-Academy/Books/Mind Reading/mind reading as medicine 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..57165789111ce6878680095e3dd5841d1d721edf --- /dev/null +++ b/Lucian-Academy/Books/Mind Reading/mind reading as medicine 4.txt @@ -0,0 +1,15 @@ +["Green, L 2021, mind reading as medicine 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"mind reading as medicine 4 +44. I automated mind reading to prevent stress. +45. I mind played the computer game to plan my medical career. +46. I checked whether the student had decided using mind reading. +47. I prevented non-pathological stress with mind reading. +48. The professor mind gave medicine to help with work. +49. I mind read that the student was with-it in medicine. +50. I mind maintained parts of the relationship. +51. I mind verified the accuracy of the treatment. +52. I mind checked the accuracy of the algorithm. +53. I mind generated content to keep myself in the black. +54. I mind prevented stress to the self. +55. I mind prevented stress in the customer. +56. I mind read mind reading to prevent stress. +57. 
I mind read that there was no stress from enough food."] \ No newline at end of file diff --git a/Lucian-Academy/Books/Mind Reading/mind reading as medicine 5.txt b/Lucian-Academy/Books/Mind Reading/mind reading as medicine 5.txt new file mode 100644 index 0000000000000000000000000000000000000000..d890544f53c6f22b2b83c0cdf2ed8e8577c3ec36 --- /dev/null +++ b/Lucian-Academy/Books/Mind Reading/mind reading as medicine 5.txt @@ -0,0 +1,25 @@ +["Green, L 2021, mind reading as medicine 5, Lucian Academy Press, Melbourne.","Green, L 2021",1,"mind reading as medicine 5 +58. I mind effected the meditation protector currant bun sutra. +59. I safely mind controlled the vehicle. +60. The neuroscientist mind toured the thought. +61. I mind read the hour for medicine. +62. I mind read that digestion had been successful. +63. The meditation (philosophy) teacher mind read the inference from the reason to the conclusion. +64. There was agreement that critical thinking involved programming inferences, which I mind read, where the inferences were in medicine. +65. I mind read the rest of the medicine ideas. +66. I mind pointed to objects. +67. I mind married the medicine ideas. +68. I mind sold with medicine. +69. I mind wrote the distinction side, inflow not outflow, in medicine. +70. I mind reminded the self to stop eating after enough food. +71. I mind maintained the daily regimen. +72. I mind immersed the self and swam through the occupying thoughts. +73. I needed meditation and medicine to mind read the Aig (philosophy imagery). +74. I mind read the medicine scene. +75. I mind queried the connection between reasons in medicine. +76. The autist mind breasoned it out. +77. I mind studied the professional development course. +78. I mind taught the medicine degree. +79. I mind examined the medical noumenon (thing in-itself). +80. I mind studied and mind wrote medicine before meditating. 
+"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Mind Reading/mind reading audio input and output 1.txt b/Lucian-Academy/Books/Mind Reading/mind reading audio input and output 1.txt new file mode 100644 index 0000000000000000000000000000000000000000..3e6d123374fe67191bab2fc31e9a4280671226f2 --- /dev/null +++ b/Lucian-Academy/Books/Mind Reading/mind reading audio input and output 1.txt @@ -0,0 +1,15 @@ +["Green, L 2021, mind reading audio input and output 1, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Mind Reading Audio Input and Output 1
+1. I mind listened and mind spoke during the pandemic.
+2. I mind listened to the statement and worked out what it meant.
+3. I mind listened and spoke during the school's no speaking day.
+4. I mind listened to my family member (with permission) to ensure they were happy.
+5. I mind listened to the location of the water.
+6. I mind listened to whether the song had a rigorous algorithm.
+7. I mind listened to maintain the top.
+8. I mind listened to and spoke to the builder.
+9. I mind listened to and spoke to the student.
+10. I mind listened to and arrested the criminal.
+11. I mind listened to the health status of the water.
+12. I wrote, then mind listened to the comments.
+13. I mind said the verb.
+14. I mind told the story."] \ No newline at end of file diff --git a/Lucian-Academy/Books/Mind Reading/mind reading camera 1.txt b/Lucian-Academy/Books/Mind Reading/mind reading camera 1.txt new file mode 100644 index 0000000000000000000000000000000000000000..50ff3a2c774909d88e2cb2ecd8f986b7ba1fa058 --- /dev/null +++ b/Lucian-Academy/Books/Mind Reading/mind reading camera 1.txt @@ -0,0 +1,16 @@ +["Green, L 2021, mind reading camera 1, Lucian Academy Press, Melbourne.","Green, L 2021",1,"mind reading camera 1
+1. I found the security use for the mind camera.
+2. The animals mind watched the objects in higher education.
+3. I mind watched for equality.
+4. I mind watched for freedom of the self.
+5. 
God (the person) mind watched the algorithm's visualisation.
+6. University helped provide the bots with thoughts using the mind reading camera.
+7. I mind watched when to rest.
+8. I mind watched the oxygen level to identify viable space travel.
+9. I mind watched the philosophy fly-through.
+10. I mind watched the effects of the human bird spending money.
+11. I mind watched the dream.
+12. I mind watched the mind camera footage.
+13. I mind watched the phenomenon and mind took notes.
+14. I mind watched whether the student was in a pair at University.
+"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Mind Reading/mind reading camera 2.txt b/Lucian-Academy/Books/Mind Reading/mind reading camera 2.txt new file mode 100644 index 0000000000000000000000000000000000000000..6be22d106d2e7a6908a62b943ed898105d88af9c --- /dev/null +++ b/Lucian-Academy/Books/Mind Reading/mind reading camera 2.txt @@ -0,0 +1,15 @@ +["Green, L 2021, mind reading camera 2, Lucian Academy Press, Melbourne.","Green, L 2021",1,"mind reading camera 2
+15. I mind filmed the cook to ensure that the vegetable burger only contained vegetables.
+16. I mind filmed the server to ensure that the cash balanced.
+17. I mind filmed, rather than wrote, the essay.
+18. I mind filmed the inferred lines that the new part introduced.
+19. I mind filmed the computer science example.
+20. I mind filmed the possible uses for the product.
+21. I mind filmed the connections necessary for the mantra meditator to become a pedagogue.
+22. I mind filmed the planned steps to achieve my goal.
+23. I mind filmed my preferred reality and experienced it, e.g. preventing cancer.
+24. I mind filmed the evidence against infinity and for finite data.
+25. I mind filmed the connections between ideas in the professor's career.
+26. I mind filmed the bot's pathway through the business. I mind filmed the person's pathway through the business.
+27. I mind filmed possible science discoveries. I mind filmed what I was seeing. 
+28. I mind filmed the vocational education worksheet."] \ No newline at end of file diff --git a/Lucian-Academy/Books/Mind Reading/mind reading camera 3.txt b/Lucian-Academy/Books/Mind Reading/mind reading camera 3.txt new file mode 100644 index 0000000000000000000000000000000000000000..87c6c204c71567da14326d766389bb8920dd2bce --- /dev/null +++ b/Lucian-Academy/Books/Mind Reading/mind reading camera 3.txt @@ -0,0 +1,16 @@ +["Green, L 2021, mind reading camera 3, Lucian Academy Press, Melbourne.","Green, L 2021",1,"mind reading camera 3 +29. I accelerated and mind filmed the particle. +30. I mind filmed the programmer's views on various possibilities. +31. I mind filmed the viable solution and mentioned it at the top of the page. +32. I filmed the character delivering dialogue in my mind. +33. I presented the mind film arguing against non-real events and arguing for real events. +34. I mind filmed Plato's hand with Nietzsche's brain. +35. The mind filmed the universal class of people. +36. The stage hand mind filmed the second location to monitor it. +37. I mind filmed the creative writing thought with As. +38. I mind filmed the making of the vessel. I mind filmed the making of the production. +39. I mind filmed comments about the idea to make up news. +40. I found the meditation mind videos and played them. +41. I mind filmed each person rotating tasks. +42. I mind filmed the real and computational professors. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Mind Reading/mind reading camera 4.txt b/Lucian-Academy/Books/Mind Reading/mind reading camera 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..80e649466ffe122b01b9e37d0b78aa4a32a0734a --- /dev/null +++ b/Lucian-Academy/Books/Mind Reading/mind reading camera 4.txt @@ -0,0 +1,16 @@ +["Green, L 2021, mind reading camera 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"mind reading camera 4 +43. I mind filmed the way to verify that it was safe. +44. 
I copy-wrote and mind filmed the day. +45. I mind filmed the fairy bot after the sale. +46. I synogistically reasoned about the mind film. +47. I inferred the connection using mind filming. +48. The doctor mind filmed who was thinking about the student. +49. I intercepted causes of headache in the car by mind filming. +50. I mind filmed the binding site. +51. I mind filmed producing the electronic (object) breasoning. +52. I rehearsed by mind filming. +53. I mind filmed the life. I mind filmed the science. +54. I assessed the mind film. +55. I wrote how the student could do the work in education by writing an algorithm writer algorithm writer. I wrote how the student could do the work in the area of study by writing an algorithm writer. +56. I mind filmed the catering area to ensure that the supplies were filled. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Mind Reading/mind reading camera 5.txt b/Lucian-Academy/Books/Mind Reading/mind reading camera 5.txt new file mode 100644 index 0000000000000000000000000000000000000000..d83a01c0f7c2b314ffc29ade057b99f6d9192627 --- /dev/null +++ b/Lucian-Academy/Books/Mind Reading/mind reading camera 5.txt @@ -0,0 +1,25 @@ +["Green, L 2021, mind reading camera 5, Lucian Academy Press, Melbourne.","Green, L 2021",1,"mind reading camera 5 +57. I verified the broken down breasoning currency values of the product by mind filming their correction function, expected life and ease of use. +58. I mind filmed the preparation of the vegan product before eating it. +59. I mind filmed that the goal was reachable. +60. I mind filmed the people and objects to find synonymous properties. +61. I mind filmed the planned way to be successful. +62. I mind filmed that the last item was returned. +63. I mind filmed the self performing the cognitive function. +64. I mind filmed the tin cusp. +65. I mind filmed the science by robots. +66. The self mind filmed the self optimising the algorithm. +67. 
I mind filmed unwrapping the brackets from the item.
+68. I mind filmed what the student did, then matched it.
+69. I mind filmed the interview.
+70. I mind filmed the Honour Song.
+71. I mind filmed the tour.
+72. I mind filmed to understand the shape.
+73. I mind filmed the telepathic child in the think tank.
+74. I mind filmed recursively bolting the ideas together.
+75. I mind filmed all of the high quality thoughts while writing.
+76. I mind filmed the hierarchy of people.
+77. I mind filmed the seven-year-old in school.
+78. I mind filmed the positive functional way to prevent headaches, being happy and earning money.
+79. I mind filmed the way to the safe place.
+80. I mind filmed the evidence."] \ No newline at end of file diff --git a/Lucian-Academy/Books/Mind Reading/mr Cryptography 4.txt b/Lucian-Academy/Books/Mind Reading/mr Cryptography 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..e5711cb2310893e42e151f066cc5aeb5f793bd10 --- /dev/null +++ b/Lucian-Academy/Books/Mind Reading/mr Cryptography 4.txt @@ -0,0 +1,10 @@ +["Green, L 2021, mr Cryptography 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"mr Cryptography 4
+54. I sent the key as an encrypted message.
+55. Mind reading with cryptography allowed users to send more sensitive messages.
+56. Mind reading with cryptography enabled higher sales.
+57. The session authentication number was 16 pages long.
+58. I anonymised (simply encrypted) the List Prolog variable names for mind reading.
+59. I called the List Prolog algorithm with anonymous (simply encrypted) variable names from another List Prolog shell when mind reading (a List Prolog algorithm that displays the Learning Management System runs a List Prolog algorithm that verifies the List Prolog algorithm submitted, and encrypting prevents plagiarism).
+60. Non-breasonings (objects) are required to make mind reading and time travel work, and breasonings are required to smooth paths. Cryptography is not required. 
+61. Mind reading and time travel work by using a circuit to jam (with 65 V) not workingness to the opposite of what you want to happen to the frame before by attaching to the right number of electrical, not thought of breasonings (>1 million) at the frequency of people, metal and clothes. There was a negative charge x and battery. A mind reader is necessary to find whether the trips will be successful and give the people and you As for their thoughts. The person should have a role A and have researched all angles and dress of the time. Radiation safety and safety should be switched on. Tesla's coil can be modified to be a time machine and connected to a battery. Cryptography is not required.
+"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Mind Reading/mr Cryptography 5.txt b/Lucian-Academy/Books/Mind Reading/mr Cryptography 5.txt new file mode 100644 index 0000000000000000000000000000000000000000..00f4bd873249a767aecd57f09d93ca706832eaf5 --- /dev/null +++ b/Lucian-Academy/Books/Mind Reading/mr Cryptography 5.txt @@ -0,0 +1,9 @@ +["Green, L 2021, mr Cryptography 5, Lucian Academy Press, Melbourne.","Green, L 2021",1,"mr Cryptography 5
+62. A user writes a pedagogical A+ grade by writing algorithms for all sentences and details. Users check mind reading, which will be encrypted, against 90%. Humans can write on their interests and computers can convert these texts to relevant texts.
+63. A user writes a pedagogical 100% grade by writing well-written texts by machine learning with algorithms for all sentences and details. A user checks mind reading, which will be encrypted, against 100%. The user writes text before details, to which sentences, as syntheses of 10 sentences, are attributed. The machine learns about mind reading and initial ideas.
+64. Users should check mind reading with cryptography messages against the professor's perspective, which considers society.
+65. Users should check mind reading with cryptography messages against research evidence.
+66. 
I created shells of literature-like simulation with people's feelings taken care of by mind reading and cryptography to end poverty.
+67. Encrypt a simulation of mind reading for medical purposes (pedagogical environment to meet the full brain potential indicator).
+68. I encrypted my exam answers in the mind reading exam.
+69. The doctor mind read and encrypted that the patient had met the pedagogy detail indicator."] \ No newline at end of file diff --git a/Lucian-Academy/Books/Mind Reading/mr Cryptography 6.txt b/Lucian-Academy/Books/Mind Reading/mr Cryptography 6.txt new file mode 100644 index 0000000000000000000000000000000000000000..c3fbdd0ff707d2c9194fa4c1b6895e4e99c47ba4 --- /dev/null +++ b/Lucian-Academy/Books/Mind Reading/mr Cryptography 6.txt @@ -0,0 +1,13 @@ +["Green, L 2021, mr Cryptography 6, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Mr Cryptography 6
+70. I performed the pedagogical experiment telepathically using encryption as the double blind functions.
+71. I mind read and encrypted the meditation utterance to transmit the other utterance, after one-off 50 As or for a grace period.
+72. I mind read the writer in the correct conditions with bots to help them write philosophy following their thoughts using psychology, encrypting the transmission.
+73. I decrypted and mind read the newspaper.
+74. I mind programmed using cryptography.
+75. While mind programming, I selected and encrypted a database row in Prolog.
+76. I encrypted and mind projected my itinerary.
+77. I mind read and encrypted a chain of rows in several database tables in Prolog.
+78. I mind read and encrypted my purchase.
+79. I mind read, increased and encrypted my thoughts in Honours.
+80. I mind read the times' fabric colour, picked a dye and dyed my clothes before encrypting and mind reading to time travel. 
+"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Mind Reading/mr for scientific property detection 1.txt b/Lucian-Academy/Books/Mind Reading/mr for scientific property detection 1.txt new file mode 100644 index 0000000000000000000000000000000000000000..5a03840821a65028c0f911164aa8b1656797e541 --- /dev/null +++ b/Lucian-Academy/Books/Mind Reading/mr for scientific property detection 1.txt @@ -0,0 +1,36 @@ +["Green, L 2021, mr for scientific property detection 1, Lucian Academy Press, Melbourne.","Green, L 2021",1,"mr for scientific property detection 1 +1. Detect and prevent any psychologically preventable pre-pathological damaging disease in time. Detect and prevent any psychologically preventable pre-pathological damaging disease. +2. Detect and prevent any psychologically preventable pre-pathological damaging stress in time. Detect and prevent any psychologically preventable pre-pathological damaging stress. +3. I detected that the helium balloon was lighter than air with mind reading. I detected that it was not heavier or the same weight as air, but lighter than it. +4. I detected the gravity constant from mind reading the witness to the falling object. +5. I detected the speed of the vehicle from mind reading the witness of it. +6. I detected the speed of the plane landing at the airport from mind reading the witnesses' account. +7. I mind detected the effect of meditation on medical indicators (e.g. specific algorithm writing in postgraduate study, writing bot software in business and writing computational meditation software). +8. I taught the person meditation at the critical time point, where I mind tested this saved his life. +9. I mind tested that the student could apply philosophical perspectives, combining different ideas to see ideas in a new light and test whether they generally work. +10. I mind tested that the argument map had an even number of objections on each branch so that it overbearingly agreed. +11. 
I ate the non-genetically modified food and mind tested that it was good. +12. I mind tested that the student had completed the hurdles in the assignment. +13. I mind tested that the person was trained. +14. I mind tested the rhetoric that as A is B, C is B. +15. I mind tested that the student had critically evaluated the book about the good. +16. I mind read the structure's depth. +17. I excluded pharmacological medicine and guarantee of success for the pathologically ill from my medicine, mind including spiritual medicine for stress sufferers. +18. I mind tested for agreement with the philosophy. +19. I mind tested whether the reasoning structure had changed. +20. I mind tested that the student had understood the grammar interpreter in List Prolog. +21. I mind tested that the food was safe to eat. +22. I mind tested the pattern that the child needed to read the book. +23. I mind tested that the meditator used his will to find the way. +24. I mind tested that the customer thought meditation was a good deal. +25. I mind read the objects (with the person with objects) that the ball-bearing deflected because of the magnet. +26. I mind tested material for enough majors (given to thoughts). +27. I mind tested that the first instance of the item had been deleted from the list. +28. I mind read the person that she had entirely painted the mantelpiece clock. + + + + + + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Mind Reading/mr for scientific property detection 2.txt b/Lucian-Academy/Books/Mind Reading/mr for scientific property detection 2.txt new file mode 100644 index 0000000000000000000000000000000000000000..79784ef6ae49f14525359068bbf8f2c85f1ae319 --- /dev/null +++ b/Lucian-Academy/Books/Mind Reading/mr for scientific property detection 2.txt @@ -0,0 +1,18 @@ +["Green, L 2021, mr for scientific property detection 2, Lucian Academy Press, Melbourne.","Green, L 2021",1,"mr for scientific property detection 2 +29. 
I decided whether to teleport the space ship based on mind reading whether obstacles would obstruct it at the destination. +30. The person mind tested that the pilot took care of the flight with breasonings. +31. I mind tested that the news scoop had 50 As. +32. I mind tested that the person was hydrated. +33. I mind tested for success (earning money in higher education by doing an MBA). +34. I mind tested the politician for their pedagogy and talked about his and mine. +35. I mind tested that the conditions in the alien environment were safe. +36. I mind tested that the aspersion to the person had been prevented. +37. I mind tested that the drinking water was safe. +38. I mind tested that the person had laughed. +39. I mind tested that the person could prevent infection by the pandemic. +40. I mind tested the distance that the person had travelled. +41. I mind tested that the list had been memorised backwards. +42. I mind tested that the country was a nanny state (everything was done for them and all they had to do was say comments). + + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Mind Reading/mr for scientific property detection 3.txt b/Lucian-Academy/Books/Mind Reading/mr for scientific property detection 3.txt new file mode 100644 index 0000000000000000000000000000000000000000..c70451410ca92f9d1f0d750427b7f056decc1736 --- /dev/null +++ b/Lucian-Academy/Books/Mind Reading/mr for scientific property detection 3.txt @@ -0,0 +1,15 @@ +["Green, L 2021, mr for scientific property detection 3, Lucian Academy Press, Melbourne.","Green, L 2021",1,"mr for scientific property detection 3 +43. I mind tested that the book had been read. +44. I mind tested for the critique. I mind tested for the exposition. +45. I mind tested that the idea was a breasoning, not a theory. +46. I mind tested that the object was hygienic. +47. I mind tested that the design was detailed. +48. I mind tested that the person was law abiding. +49. 
I mind tested that the buyer had paid breasoning currency for the product's parts and whole. +50. I mind tested for feedback in the form of breasonings in return for my breasonings. +51. I mind tested that all terms were only general or mind-reading-for-scientific-property-detection-related. +52. I wrote 250 breasonings about the scientific property that I wanted and mind tested that an object matched it. +53. I mind tested that the people had had enough exercise. +54. I found the rhizomatic back-connection from the argument to the end point of the area of study. +55. I mind tested that the people felt the finding was fair. +56. I mind tested that the highest achiever had been rewarded."] \ No newline at end of file diff --git a/Lucian-Academy/Books/Mind Reading/mr for scientific property detection 4.txt b/Lucian-Academy/Books/Mind Reading/mr for scientific property detection 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..4a5ff93a1defb9fe9e8876b107465ecb0cde34e5 --- /dev/null +++ b/Lucian-Academy/Books/Mind Reading/mr for scientific property detection 4.txt @@ -0,0 +1,25 @@ +["Green, L 2021, mr for scientific property detection 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"mr for scientific property detection 4 +57. I mind tested for withitness in knowledge that pedagogy is necessary for publications. +58. I mind tested that the person was prosperous. +59. I mind tested out the event with models. +60. I mind tested the other's skill. +61. I mind tested the object's length. +62. I mind tested that the unconceived child was examined. +63. I mind tested that the student met the requirements of happiness in the psychoanalytic categories. +64. I mind tested that the self wrote letters and follow up letters to important people about philosophy. +65. I mind tested for the synologic of the algorithm. +66. I mind tested that each person had enough breasoning currency. +67. 
I mind planned, pedagogically argued about and tested the algorithm. +68. I mind tested that the person recovered more quickly and lived longer, showing he had a lower metabolism. +69. I mind tested that the person planned the hour. +70. I mind ranked the people's happiness. +71. I mind tasted the sandwich. +72. I assessed the students with my mind. +73. I mind tested that liberalism refined its solutions over time. +74. I mind tested that the items were different. +75. I mind researched the new student. +76. I mind tested that the writing was well written and professional. +77. I mind tested that the self would meet the other. +78. The police officer mind tested that the ducklings were happy. +79. God mind tested for the good in the office. +80. I mind tested that the reference was accurate."] \ No newline at end of file diff --git a/Lucian-Academy/Books/Mind Reading/mr for time travel 1.txt b/Lucian-Academy/Books/Mind Reading/mr for time travel 1.txt new file mode 100644 index 0000000000000000000000000000000000000000..d5a85092a03b084e86eac31d848a78274788a75c --- /dev/null +++ b/Lucian-Academy/Books/Mind Reading/mr for time travel 1.txt @@ -0,0 +1,10 @@ +["Green, L 2021, mr for time travel 1, Lucian Academy Press, Melbourne.","Green, L 2021",1,"mr for time travel 1 +1. I mind read that the civilisation's main pedagogy writer needed to time travel to maintain his positive functional writing path. +2. I mind read the comment and sent the replying character to the time. +3. I mind projected the prayer to switch off digestive system pops from practising the sutra, and set up a transmitter that prevented digestive system pops at the time (I switched off the sutra). +4. I remained safe using popological onness (sic) about mind reading for time travel. +5. I mind read the best time travel excursion time and place according to multiple perspectives and how the destination helped achieve teaching objectives about the topic 'automation'. +6. 
I mind read for non-deterministic possibilities (algorithmic backtracking) for time travel to experience enough details. +7. I mind read possible life partners before time travelling to see them. +8. I mind read the circumstances before sending the spiritual bot through time to bring the person back to life. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Mind Reading/mr for time travel 3.txt b/Lucian-Academy/Books/Mind Reading/mr for time travel 3.txt new file mode 100644 index 0000000000000000000000000000000000000000..2f4df5aff9baaa6929347b70f9fd4c8388d793da --- /dev/null +++ b/Lucian-Academy/Books/Mind Reading/mr for time travel 3.txt @@ -0,0 +1,16 @@ +["Green, L 2021, mr for time travel 3, Lucian Academy Press, Melbourne.","Green, L 2021",1,"mr for time travel 3 +23. I mind read the item and worked out how the time's inhabitants understood it from the base before time travelling. +24. I mind read the idea and ironically found a coincidentally related idea before time travelling. +25. Once I reached the 'base case' of my series of mind readings and teleports, I returned. +26. I mind read the players' thoughts to plan my soccer strategy, then (didn't time travel) time travelled to the best position to follow it. +27. I mind read and time travelled to complete the fill-the-gaps worksheet. +28. I mind read, then time travelled to help complete pedagogical requirements for customers. +29. I mind projected the schedule, then (didn't time travel) time travelled to cut the animal's toenail. +30. I mind read and time travelled to collect the best materials from time to give the stream of one student individual attention. +31. I mind read the employee and (spiritually) time travelled their thoughts to them to help the business survive. +32. I mind read myself, then teleported in place to help stretch my muscles. +33. I mind read whether the person was human or a bot to plan safety measures before time travelling. +34. 
I mind read the correct universe containing all my collections of areas of study, then time travelled to it. +35. The farmer mind read himself and others to determine human factors for a high crop yield, then teleported to maximise this. +36. I blended the two mind read texts, then teleported to a reality based on this. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Mind Reading/mr for time travel 4.txt b/Lucian-Academy/Books/Mind Reading/mr for time travel 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..15e0611a5f0f3a4683c55943a79e33567ada3b2a --- /dev/null +++ b/Lucian-Academy/Books/Mind Reading/mr for time travel 4.txt @@ -0,0 +1,20 @@ +["Green, L 2021, mr for time travel 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"mr for time travel 4 + +37. I mind read that the language in the other time was the same before time travel. +38. I mind read and time travelled through spiritualism rather than computers. +39. I mind read where there was good to do, then time travelled there. +40. I mind read demand, followed by providing supply through time. +41. I mind read the best time to run, followed by time travelling to it. +42. I mind read the use for the algorithm, then time travelled to the best time and place to use it. +43. I time travelled, updating mind readings as I went. +44. I mind read the undesirable possible thoughts, prevented them using the quantum box, then time travelled there. +45. I mind read the state of the people in the time, then planned my language to be understandable and short before time travelling. +46. I mind read the most peaceful places and times, then time travelled there. +47. I mind read appropriate action adverbs to enact by mind reading for time travel. +48. I mind read the possible job, then time travelled to earn the job. +49. I mind read the best planet to visit, then time travelled there. +50. I mind read my computation, then time travelled based on it. +51. 
I mind read the possible students, then time travelled to teach them. + + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Mind Reading/mr for time travel 5.txt b/Lucian-Academy/Books/Mind Reading/mr for time travel 5.txt new file mode 100644 index 0000000000000000000000000000000000000000..5db3f5a573ee6bd9676a986bf6ef868468201fd8 --- /dev/null +++ b/Lucian-Academy/Books/Mind Reading/mr for time travel 5.txt @@ -0,0 +1,17 @@ +["Green, L 2021, mr for time travel 5, Lucian Academy Press, Melbourne.","Green, L 2021",1,"mr for time travel 5 +52. I mind read the best time (when no one was looking at the departure point and destination and other perspectives) and time travelled home. +53. I answered the question correctly using mind reading, enabling me to earn the time traveller role. +54. I buffered the time traveller's secondary text critical critique using mind reading. +55. I mind read the maker of the meat substitute to ensure it contained the required nutrients (e.g. creatine) when time travelling. +56. I updated my students by time travelling to them and mind reading them. +57. I mind read the necessary computer science detail and teleported the character to deliver it. +58. I mind read the player to determine the best technique to use and teleported my character when I performed it to the viewers. +59. I mind read the spiritually controlled object needing reinforcement, then time travelled to attach the 'nut' to the 'bolt'. +60. I mind read the person to agree with and time travelled to them. +61. I mind read the student's progress and time travelled to maintain it. +62. I mind read the appropriate setting to write and time travelled to construct there. +63. I mind read the best time to pay breasoning currency for the product, then time travelled there. +64. I mind projected an inspiration and time travelled as my expression. +65. I mind read the evidence and time travelled to check it before writing my essay. +66. 
I time travelled to the street in the city and mind read the location in the building to walk to. +67. I switched on mind reading and character time travel transmitters to help my business."] \ No newline at end of file diff --git a/Lucian-Academy/Books/Mind Reading/mr for time travel 6.txt b/Lucian-Academy/Books/Mind Reading/mr for time travel 6.txt new file mode 100644 index 0000000000000000000000000000000000000000..218122ae1320c0184e9c12ae501c50135f8ef3af --- /dev/null +++ b/Lucian-Academy/Books/Mind Reading/mr for time travel 6.txt @@ -0,0 +1,15 @@ +["Green, L 2021, mr for time travel 6, Lucian Academy Press, Melbourne.","Green, L 2021",1,"mr for time travel 6 +68. I mind read the thoughts to make the transition between the times and locations when teleporting. +69. The animals who studied higher education mind read the best time to travel to. +70. I mind read the time to travel to in order to prevent cancer with meditation. +71. I designed the mind reading telephone to talk about the design of the time machine with God. +72. I wrote the history of the object world by mind reading and time travelling. +73. I found flaws and improvements with the mind read idea before time travelling. +74. I designed the acting and philosophy Aigs (collection of As), then mind read comments and time travelled characters to comment on the comments. +75. I mind read the air temperature before time travelling. +76. I turned off headaches, etc. from time travelling using a similar technique to mind reading. +77. I tested that the laws of physics were the same across the two times using mind reading before time travel. +78. I measured the thought timing of entering and leaving the time machine. +79. I mind read the properties of the food before time travelling to eat it. +80. I mind recorded my itinerary of time travel, space travel and on-foot travel legs. 
+
"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Mind Reading/mr other times 1.txt b/Lucian-Academy/Books/Mind Reading/mr other times 1.txt new file mode 100644 index 0000000000000000000000000000000000000000..ca4725dbb5642eee2f9e1b2fb4c50ad94c2a64e8 --- /dev/null +++ b/Lucian-Academy/Books/Mind Reading/mr other times 1.txt @@ -0,0 +1,17 @@ +["Green, L 2021, mr other times 1, Lucian Academy Press, Melbourne.","Green, L 2021",1,"mr other times 1 +1. I listened to the lecture from the other time. +2. Remembering my philosophies from the past. +3. Ethics of taking output from past times. +4. Ethics of taking output from future times. +5. Ethics of giving input to past times. +6. Ethics of giving input to future times. +7. I brought forward life-saving knowledge. +8. I brainstormed an argument from my life. +9. I mind read the other time before time travelling to it. +10. The university student collected her thoughts from other times. +11. The time phone had two uses, answering now or later. +12. If a thought that a customer needed to remember to buy a product was in a distant time, then I gave him As along the way to remember it. I automated sales management by mind reading and creating an algorithm from the customer's algorithm specification mind read from the other time, given that the customer wanted it, had money, was the decision maker and it was the right time. +13. I abided by law when mind reading other times by keeping future technologies secret and giving As in saving lives. +14. I programmed a spiritual bot to answer the mind phone from another space and time. 
+ +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Mind Reading/mr other times 10.txt b/Lucian-Academy/Books/Mind Reading/mr other times 10.txt new file mode 100644 index 0000000000000000000000000000000000000000..9c85150870e823dac6a39d512a08b41d82741a34 --- /dev/null +++ b/Lucian-Academy/Books/Mind Reading/mr other times 10.txt @@ -0,0 +1,17 @@ +["Green, L 2021, mr other times 10, Lucian Academy Press, Melbourne.","Green, L 2021",1,"mr other times 10 +66. I pointed to naturally available options when mind reading other times to help. +67. I compiled the spiritual time travel warnings. +68. I helped the person to take safety precautions, then spiritually time travel. +69. I helped provide spiritual time travel visitees with a good basis of thought. +70. I visited and helped the person to pedagogical details with spiritual time travel. +71. I simulated helping the person like the computational government does, when I mind read the other time. +72. I took safety precautions when mind reading other times to build a house. +73. I mind relaxed my legs in the future during my walk. +74. I recorded events to maintain safety and security by mind reading other times. +75. I graciously gave the breasonings that the student breasoned out to her by mind giving them at the start of the same business day. +76. I returned the accredited student's work by mind reading the other time. +77. I mind read my family in the past and made private comments. +78. I wrote as part of my job by mind reading other times. +79. I rewarded abidance with natural law when mind reading other times. +80. I watched productions by mind reading other times. 
+
"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Mind Reading/mr other times 4.txt b/Lucian-Academy/Books/Mind Reading/mr other times 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..f37cc5366bfe99c58072f9a67aaea4daa0d4904b --- /dev/null +++ b/Lucian-Academy/Books/Mind Reading/mr other times 4.txt @@ -0,0 +1,12 @@ +["Green, L 2021, mr other times 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"mr other times 4 +26. I helped the poor person build a spiritual computer. I helped the poor person learn the skill by mind reading the same time, but different space. +27. I quickly found what I needed by mind reading the same time, but different space. I quickly found what I needed by mind reading the same time and space. +28. I recorded the thought mind read in the present. +29. I practised the grammar, vocabulary and accent of the other time before speaking. I researched the nature of the language, people and society before mind reading the other time. +30. I wrote the translation communicator algorithm for mind reading the other time. +31. I answered the unknown like following the discus and mind reading the other time. +32. I photographed the object from the other time before it was destroyed. +33. I checked that the actor had indicated Aigs that day. +34. I mind read the comment from the time as a famous person. +35. I reminded the guest about the event by telepathy with the other time. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Mind Reading/mr other times 9.txt b/Lucian-Academy/Books/Mind Reading/mr other times 9.txt new file mode 100644 index 0000000000000000000000000000000000000000..8a95e7f0fe2eb9b8bbfba7981756a0deb8fa0b31 --- /dev/null +++ b/Lucian-Academy/Books/Mind Reading/mr other times 9.txt @@ -0,0 +1,11 @@ +["Green, L 2021, mr other times 9, Lucian Academy Press, Melbourne.","Green, L 2021",1,"mr other times 9 +59. I mind read other times to listen to and comment on students. 
I mind projected 50 As for the product to the other time. +60. I verified whether the mission was viable by mind reading other times. I mind read and helped maintain the psychiatric health of the crew and passengers on the interstellar mission, which involved teleporting through time and space. +61. I mind read the other time to verify that the space jump was safe. I mind read the other time to verify that the ship would have enough quantum power. +62. I found the spatial and temporal limits to spiritual time travel (mind reading other times). I found evidence for the philosophy argument in the other time by mind reading. +63. The queen verified that the spiritual bot saved money by mind reading the other time. +64. After asking permission, I mind read the adult in the other times to verify that he had good psychiatric health (had no hallucinations, was happy, had comfortable muscles, was occupied, had training, a business, a job, a partner and a sex life). +65. As the Asperger sufferer wore a wig and gown, the evidence was brought forward by mind reading other times. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Mind Reading/mr spiritual screen 1.txt b/Lucian-Academy/Books/Mind Reading/mr spiritual screen 1.txt new file mode 100644 index 0000000000000000000000000000000000000000..753fab6fa4b5b12274a581713d610cc3c3b96d23 --- /dev/null +++ b/Lucian-Academy/Books/Mind Reading/mr spiritual screen 1.txt @@ -0,0 +1,30 @@ +["Green, L 2021, mr spiritual screen 1, Lucian Academy Press, Melbourne.","Green, L 2021",1,"mr spiritual screen 1 +1. Use the spiritual screen with mind reading without others detecting the screen. +2. Use the spiritual screen with mind reading for time travel. +3. Use the spiritual screen with mind reading for privacy. +4. Use the spiritual screen with mind reading underwater. +5. Use the spiritual screen with mind reading in space. +6. Use the spiritual screen with mind reading for a spiritual operating system. +7. 
Use the spiritual screen with mind reading hands-free. +8. The visually impaired led the aerobics session with the spiritual screen and mind reading. +9. The contract to design the spiritual screen operating system with mind reading delivered the operating system as an earlier 'What's In It For Me' exit point. +10. The time traveller used the spiritual screen with mind reading to maintain personal safety in the simulation. +11. I taught the dinosaur whale to communicate using the spiritual screen and mind reading. +12. The pre-pedagogy helper used the spiritual screen with mind reading to display high-quality imagery about pedagogical ways of thinking to prospective conceivers of children. +13. I rewrote the philosophy after reading the comments using the spiritual screen and mind reading. +14. There was spiritual audio and mind reading. +15. There were spiritual tactile sensations and mind reading. +16. There was spiritual taste and mind reading. +17. There was spiritual smell and mind reading. +18. I used the spiritual screen, which used mind reading, to maintain a positive path around nature. +19. I set the spiritual screen refresh rate. +20. I participated in group meditation with the spiritual camera to economically help myself. +21. I breasoned out the spiritual screen. +22. I permitted switching on the spiritual screen in addition to mind reading. +23. I used the spiritual screen to display the main thoughts to meditators to help them reach full brain potential when choosing the best features of meditation centres. +24. The professor summarised, memorised and redisplayed the choices thought of by the student a priori using the spiritual screen. +25. The student selected the item on the spiritual screen à présent. +26. The professor invented and displayed the developed connection in the algorithm on the spiritual screen a posteriori. +27. Fly or browse through the 3D spiritual operating system. +28. 
The child played games on the spiritual screen with mind reading for happiness. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Mind Reading/mr spiritual screen 2.txt b/Lucian-Academy/Books/Mind Reading/mr spiritual screen 2.txt new file mode 100644 index 0000000000000000000000000000000000000000..82fadb869ad6f520f7bc4a1df9fc9d04b9184950 --- /dev/null +++ b/Lucian-Academy/Books/Mind Reading/mr spiritual screen 2.txt @@ -0,0 +1,10 @@ +["Green, L 2021, mr spiritual screen 2, Lucian Academy Press, Melbourne.","Green, L 2021",1,"mr spiritual screen 2 +29. The spiritual screen can give peripheral vision. +30. I checked the weather on the spiritual screen. +31. I checked around me after time travelling using the spiritual screen. +32. I remembered the item using the spiritual screen. +33. I meditated on thoughts using the spiritual screen. +34. I cast the actor by checking his face matched the character's appearance on the spiritual screen. +35. I tested that the product, the spiritual screen, worked, by displaying the letter 'A'. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Mind Reading/mr spiritual screen 3.txt b/Lucian-Academy/Books/Mind Reading/mr spiritual screen 3.txt new file mode 100644 index 0000000000000000000000000000000000000000..70ab1ddcca4d2866fb3545b9cb3c818c0f865331 --- /dev/null +++ b/Lucian-Academy/Books/Mind Reading/mr spiritual screen 3.txt @@ -0,0 +1,8 @@ +["Green, L 2021, mr spiritual screen 3, Lucian Academy Press, Melbourne.","Green, L 2021",1,"mr spiritual screen 3 +36. The army humanitarian used the spiritual screen to tell people about food. +37. The army humanitarian used the spiritual screen to distribute food. +38. The doctor-patient prevented her headache using the prompts on the spiritual screen. +39. I conversed with the human animal using the spiritual screen. +40. I improved the Prolog code from e.g. a or b to if a then b, else c on the spiritual screen. +41. 
I ran the algorithm on the spiritual screen. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Mind Reading/mr spiritual screen 5.txt b/Lucian-Academy/Books/Mind Reading/mr spiritual screen 5.txt new file mode 100644 index 0000000000000000000000000000000000000000..dd6c4c2711eeee3a2994b2d3e24191df2bcefdd6 --- /dev/null +++ b/Lucian-Academy/Books/Mind Reading/mr spiritual screen 5.txt @@ -0,0 +1,7 @@ +["Green, L 2021, mr spiritual screen 5, Lucian Academy Press, Melbourne.","Green, L 2021",1,"mr spiritual screen 5 +58. I read the output of the List Prolog algorithm on the spiritual screen of the spiritual computer (List Prolog Interpreter). +59. The breasoning life form on the spiritual screen verified that I was psychiatrically happy. +60. The user wrote about ontological nothingness on the spiritual screen, maintaining developedness. +61. I maintained my social life on the spiritual screen while in a rural setting. +62. I created spiritual sensors for electrical currents, pressure, electronic and chemical signals. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Mind Reading/mr spiritual screen 6.txt b/Lucian-Academy/Books/Mind Reading/mr spiritual screen 6.txt new file mode 100644 index 0000000000000000000000000000000000000000..25937e1e6af91718f124595bc6f45373f3958736 --- /dev/null +++ b/Lucian-Academy/Books/Mind Reading/mr spiritual screen 6.txt @@ -0,0 +1,21 @@ +["Green, L 2021, mr spiritual screen 6, Lucian Academy Press, Melbourne.","Green, L 2021",1,"mr spiritual screen 6 +63. I replicated and ate the substitute meat using the spiritual screen. +64. I sold the clothes replicated using the spiritual screen. +65. I wrote and read the book replicated using the spiritual screen. +66. I designed and replicated the computer using the spiritual screen. +67. I designed and replicated the robot using the spiritual screen. +68. I played tennis with a replicant using the spiritual screen. +69. 
I made money simulating the business a priori internally (unseen) using the spiritual screen replicants. +70. I qualified the spiritual screen-replicated departments, people and objects with reasons. I replicated departments, people and objects universally using the spiritual screen. +71. I designed and replicated a robot science lab and software using the spiritual screen to uncover PhD research. +72. The spiritual screen replications were Godly (good). +73. I replicated the movie scene using the spiritual screen and analysed the characters. +74. I recorded all necessary data before dismantling the replicant created using the spiritual screen (e.g. saved the research). +75. I made comments and made gestures universally related to using the spiritual screen. +76. I nominalised the spiritual screen as an ubreen (sic). +77. Meditators travelled to and from the meditation centre using a safety app on the spiritual screen. +78. I kept in contact with family and friends using a reminder app on the spiritual screen. +79. I interacted with the spiritual screen using only thoughts to do with the spiritual screen. +80. The spiritual screen worked based on a reliable electronic circuit. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Music/.DS_Store b/Lucian-Academy/Books/Music/.DS_Store new file mode 100644 index 0000000000000000000000000000000000000000..5008ddfcf53c02e82d7eee2e57c38e5672ef89f6 Binary files /dev/null and b/Lucian-Academy/Books/Music/.DS_Store differ diff --git a/Lucian-Academy/Books/Music/Abracadabra 2 Argument.txt b/Lucian-Academy/Books/Music/Abracadabra 2 Argument.txt new file mode 100644 index 0000000000000000000000000000000000000000..6fb8a32b91c78daa6772c0896af2330874614594 --- /dev/null +++ b/Lucian-Academy/Books/Music/Abracadabra 2 Argument.txt @@ -0,0 +1,65 @@ +["Green, L 2024, Abracadabra 2 Argument, Lucian Academy Press, Melbourne.","Green, L 2024",1,"Abracadabra 2 Argument + +1. I prepared to live in a house. 
I did this by writing the meaning of the connection between two notes. First, I wrote the note ‘C’. Second, I wrote the note ‘G’. Third, I wrote that the connection between the notes ‘C’ and ‘G’ had a happy sound. In this way, I prepared to live in a house by writing the meaning of the connection between two notes. + +2. I prepared to attend class by thanking the teacher by shaking his hand. First, I touched his hand. Second, I held his palm. Third, I shook his hand. In this way, I prepared to attend class by thanking the teacher by shaking his hand. + +3. I prepared to walk more quickly. I did this by having intercourse with the policeman. First, I walked to the policeman. Second, I chatted with the policeman. Third, I advanced to the pole. In this way, I prepared to walk more quickly by having intercourse with the policeman. + +4. I prepared to sing a song about a dance by writing a song about learning meditation. I did this by learning to say many words in a few words. First, I wrote the words “five fruits.” Second, I wrote the word “ingredients.” Third, I said “five fruits” by saying “ingredients.” In this way, I prepared to sing a song about a dance by writing a song about learning meditation. + +5. I prepared to rearrange my room by making enough space to play Anarchy by Lucian Green. First, I made enough space in my room. Second, I filled it with interesting objects. Third, I played the song. This way, I prepared to rearrange my room by making enough space to play Anarchy by Lucian Green. + +6. I prepared to finish a year level of a degree. I did this by studying a subject at Melbourne University. First, I walked to the University. Second, I sat in class. Third, I opened my book. In this way, I prepared to finish a year level of a degree by studying a subject at Melbourne University. + +7. I prepared to sit on the mat by becoming a monk to pray. First, I sat on the mat, put my hands by my sides, and prayed for rain. + +8. 
The monk prepared to pray for someone to be well. He did this by going from his bed to the middle of the room on time. First, he unfolded the sheet. Second, he pulled himself out of bed. Third, he stood in the centre of the room. In this way, the monk prepared to pray for someone to be well by going from his bed to the middle of the room on time. + +9. I prepared to place the bookmark in the book by persuading a student to read it. First, I realised I liked the book. Second, I read it. Third, I gave it to the student to read. In this way, I prepared to place the bookmark in the book by persuading a student to read it. + +10. I prepared to love life by checking my pulse. First, I put my finger on my pulse. Second, I felt it until I could feel it. Third, I recorded the number of beats per minute I had. + +11. The baby enjoyed meditation, which trained her. She enjoyed meditating by walking for 5 minutes. First, she started walking from the starting point. Second, she prepared to keep walking on her journey. Third, she stopped walking when she reached her endpoint. In this way, the baby enjoyed meditation by walking for 5 minutes. + +12. I prepared to discover the Physics Code. I did this by explaining how subatomic particles made properties work. First, I examined the subatomic particle. Second, I examined how it made a property work. Third, I wrote down this property. In this way, I prepared to discover the Physics Code by explaining how subatomic particles made properties work. + +13. I prepared to perform a headcount on the train. I did this by counting the number of faces of the dodecahedron. First, I counted the polyhedron’s first face. Second, I counted its next face. Third, I repeated this until I had counted all the faces. In this way, I prepared to perform a headcount on the train by counting the number of faces of the dodecahedron. + +14. I prepared to compete in the race by watching the three-in-one gear work. First, I placed the first gear on the frame. 
Second, I put the second gear on the frame, touching the first gear. Third, I attached the third gear, touching the second gear, and moved the first gear, rotating the other gears. In this way, I prepared to compete in the race by watching the three-in-one gear working. + +15. I prepared to test what was visible in the neuroscience practical. I did this by breasoning out the eyes (in other words, thinking of an algorithm involving the eyes relating to movement along a pathway in the same way that a philosophy professor would pull model eyes from a blue cloth in the same way that one would draw a swab out of a test tube to clean it, then breasoning out, or thinking of the eyes’ x, y and z dimensions). First, I measured the right eye’s width. Second, I measured the right eye’s height. Third, I duplicated these measurements for the left eye. In this way, I prepared to test what was visible in the neuroscience practical by breasoning out the eyes. + +16. I prepared to buy the autobiography. I did this by reading the autobiography’s author’s first name twice as a mantra and surname twice as a sutra. First, I tested that the first pair of words was the autobiography’s author’s name and repeated each twice. Second, I prepared to test the next pair of words. Third, I repeated this until I had found the author’s first name. In this way, I prepared to buy the autobiography by reading the autobiography’s author’s first name. + +17. I prepared to make sure I felt well. I did this by praying to have a digestive system pop prevented, with a calm digestive system instead, and a calm digestive system until the next day, where a separate argument about avoiding sitting on a pop-up seat, in other words, an automatically folding seat when it was up, should be used. First, I walked to the seat. Second, I tested the seat to ensure that it was up. Third, I stood beside the seat, avoiding sitting on it. 
In this way, I prepared to ensure I felt well by avoiding sitting on an automatically folding seat when it was up. + +18. I prepared to go down the high hill. I did this by testing that the face of the giant ice block was flat. First, I tested that all the squares in the first row of the massive ice block’s face were flat. Second, I prepared to test that all the squares in the next row of the giant ice block’s face were flat. Third, I repeated this until all the rows had been tested. In this way, I prepared to go down the high hill by testing that the face of the giant ice block was flat. + +19. The man prepared to “be in heaven” by watching the sunset to test when to sleep. He did this by climbing up the palm tree. First, he pulled himself up the tree with his arms with his legs gripping the tree. Second, he pulled his legs up while holding the tree with his arms. Third, he repeated this until he had climbed the tree. In this way, the man watched the sunset by climbing up the palm tree. + +20. I prepared to watch the fidgets go for a walk by watching one man pass a ball to the other. First, I watched the first man catch the ball. Second, I watched him pass the ball to the second player. Third, I watched the second player catch the ball. In this way, I prepared to watch the fidgets go for a walk by watching one man pass a ball to the other one. + +21. I prepared to clean the oven by eating and testing the uniformity of the slice of cake. First, I ate the first half of the raisin. Second, I ate the second half of the raisin. Third, I tested that they were equally crunchy. + +22. I prepared to test I was invincible instead of nature. I did this by drinking the strawberry milkshake globules. First, I placed the bottom of the straw at the bottom of the glass. Second, I put my lips around the tip of the top of the straw. Third, I sucked the strawberry milkshake globules through the straw. + +23. I prepared to cuddle Jesus. I did this by loving the baby Jesus doll. 
First, I placed hay in the manger. Second, I put the baby Jesus doll in the manger. Third, I covered it with a blanket. In this way, I prepared to cuddle Jesus by loving the baby Jesus doll. + +24. I prepared to listen to the song. I did this by reading that the song sold the most records in history because it was confirmed that it approached the apex of the leaf, meaning it had sold maximally. First, I read the first song’s sales value. Second, I read the next song’s sales value and replaced the previous maximum song value with it if it was greater than it. Third, I repeated this until I reached the end of the list of values and computed the maximum value. In this way, I prepared to listen to the song by reading that it sold the most records in history because it was indeed approaching the apex of the leaf, meaning it had sold maximally. + +25. I prepared to walk on stage by becoming famous using the sutra. First, I sat on the pad. Second, I meditated by repeating the sutra in my mind. Third, I got up when I finished. In this way, I prepared to walk on stage by becoming famous using the sutra. + +26. I prepared to jump into the pool by dancing on the spot. First, I moved my left arm up. Second, I moved it downwards sideways. Third, I bowed. In this way, I prepared to jump into the pool by dancing on the spot. + +27. I prepared to test my stamp set. I did this by writing a sentence for each point in history using all the letters in the alphabet from each point in history. First, I checked the following letter in the sentence wasn’t used. Second, I prepared to test that the following letter in the sentence wasn’t used. Third, I repeated this until I found a sentence that used all the alphabet's different letters once. In this way, I prepared to test my stamp set by writing a sentence for each point in history using all the letters in the alphabet from each point in history. + +28. The famous writer prepared for the painting of him to be painted.
He did this by wearing the clothes Shakespeare would have worn. First, he put on the wig. Second, he put on the gown. Third, he put on the pelt. In this way, the famous writer prepared for the painting of him to be painted by wearing the clothes Shakespeare would have worn. + +29. I prepared to develop a society famous for its success. I did this by creating a problem-solving kit for a friend’s child. First, I taught that the child, represented by the doll, should live in the house, represented by the model. Second, I placed the house model in front of me. Third, I helped the child to place the doll in the house. In this way, I prepared to develop a society that was famous for its success by creating a problem-solving kit for a friend’s child. + +30. I prepared to play the game. I did this by searching for words meaning “old” to base the name of a computer game based on my Computer Science major. First, I looked up “old” in a thesaurus. Second, I read and described the synonyms for “old” in the thesaurus. Third, I confirmed the similarity of the synonyms with “old” and wrote them down. In this way, I prepared to play the game by searching for words meaning “old” to base the name of a computer game based on my Computer Science major. + +31. I prepared to perform at the concert by pressing the button to write the pop song. First, I selected the section of music to repeat. Second, I copied the section of music. Third, I pasted the section of music after the first section of music. In this way, I prepared to perform at the concert by pressing the button to write the pop song. + +32. I prepared to lip-synch the song by pressing the button to produce the music video. First, I found the visual marker for the start of the video. Second, I found the aural marker for the beginning of the song. Third, I synchronised these markers to produce the video. 
This way, I prepared to lip-synch the song by pressing the button to make the music video."] \ No newline at end of file diff --git a/Lucian-Academy/Books/Music/Abracadabra 3 Argument.txt b/Lucian-Academy/Books/Music/Abracadabra 3 Argument.txt new file mode 100644 index 0000000000000000000000000000000000000000..60ea9b177af45851e7f621568660de73ef7e98f9 --- /dev/null +++ b/Lucian-Academy/Books/Music/Abracadabra 3 Argument.txt @@ -0,0 +1,65 @@ +["Green, L 2024, Abracadabra 3 Argument, Lucian Academy Press, Melbourne.","Green, L 2024",1,"Abracadabra 3 Argument + +1. I prepared to distribute the As from the Aig (monastic version of an A grade essay that is a token of professionalism, which monastics change one breasoning or sentence in for each new person, and it is enough for the new person to earn the A as if it were original). I did this by standing in the circle with you and handing the A to you. First, I wrote a 190-breasoning (100%) A, having a new last breasoning for the current person. Second, I prepared to distribute the Aig to the next person. Third, I repeated this until everyone who had received the Aig had an A. In this way, I prepared to distribute the As from the Aig by standing in the circle and handing the A to you. + +2. I prepared to appear professional while the Aig was distributed by meditating daily to protect myself because I was untrained in pedagogy (Education) or the Aig topic (department). To write As (A grade tokens of professionalism), I needed an artificial Pedagogy degree, composed of 10 10-breasoning long As, chosen from the breasonings details (breason out ten breasonings, in other words, think of 10 sets of x, y and z dimensions of each breasoning in the breasonings details’ breasoning lists), where you can write 10-breasoning long As instead of 190 breasonings because the degree helps you. You can write 10-breasoning long As in the Pedagogy degree itself because you are protected while working on it by it.
After breasoning out the Aig (see item 1), think of 2 (one for the employee and one for the manager, meaning the total of 20 breasonings is expanded to 100 breasonings by politicians) * 10-breasoning-long sales As’ worth of characters in a text of your choice to establish the Aig’s professional standard, and take care of correcting, finishing and doing As to access the Aig at all. I did this by jogging around the Botanical Gardens track. First, I ran some of the way. Second, I ran a little further. Third, I finished the run. In this way, I prepared to appear professional while the Aig was distributed by jogging around the Botanical Gardens track. + +3. I prepared to verify that the chicken was not outside the cool room for 10 minutes. I did this by verifying that the use-by date was current. First, I verified that the year on the use-by date was this year or later. Second, I verified that the month on the use-by date was this month or later. Third, I verified that the day on the use-by date was this day or later. In this way, I prepared to verify that the chicken was not outside the cool room for 10 minutes by ascertaining that the use-by date was current. + +4. I prepared to verify that the ceiling was clean by licking the spatula’s flat end. First, I tested that the end of the implement closer to the middle was flat. Second, I tested that the middle of the implement was flat. Third, I tested that the end of the implement closer to the end was flat. In this way, I prepared to verify that the ceiling was clean by licking the spatula’s flat end. + +5. I prepared to distribute the salespeople for my product uniformly. I did this by placing the pistachio nuts into the pistachio ice cream. First, I placed a pistachio nut in the centre of the top surface of the ice cream. Second, I embedded the nth concentric square with 2n – 1 pistachio nuts on the edge, starting with n = 1, to construct a uniform grid.
Third, I placed points of sale at each intersection of the grid. In this way, I prepared to distribute the salespeople for my product uniformly by placing the pistachio nuts into the pistachio ice cream. + +6. I prepared to prevent death by placing the wet floor sign beside the mopped area. First, I listened to the customer report the spill. Second, I confirmed the details of the spill with the customer. Third, I mopped the floor and placed the wet floor sign beside the mopped area. In this way, I prepared to prevent death by placing the wet floor sign beside the mopped area. + +7. I prepared to make space for the new cattle. I did this by burning the baby that had died. First, I burned the body’s head. Second, I burned the body’s torso. Third, I burned the baby’s limbs. In this way, I prepared to make space for the new cattle by burning the baby that had died. + +8. I prepared to lead the running squad by forming a running lane on the athletics track. First, I ran first in line. Second, the second athlete followed me. Third, the third athlete followed him. In this way, I prepared to lead the running squad by forming a running lane on the athletics track. + +9. I prepared to centre the house on the property by centring the hat on my head. First, I placed the hat on my head. Second, I made sure that it was straight. Third, I rotated it so that the front of the hat was pointing forward. In this way, I prepared to centre the house on the property by centring the hat on my head. + +10. I prepared to push the base of the flag into the ground. I did this by pointing the rod gyroscope at the letter. First, I let the rod gyroscope hang down. Second, I placed the letter tile underneath it. Third, I verified that the rod gyroscope was pointing at the letter by looking down along the rod at the letter. In this way, I prepared to push the base of the flag into the ground by pointing the rod gyroscope at the letter. + +11.
The lady prepared for more of the chemical to be used in the new practical version by applying lipstick to her lips. First, she uncapped the lipstick and twisted the container to expose enough. Second, she applied it to her top lip. Third, she applied it to her bottom lip. + +12. I prepared to make sure the hips worked well. I did this by allowing the budgerigar to step onto my finger. First, I opened the door of the budgie’s cage. Second, I placed my finger next to the budgie’s stomach. Third, I removed the budgie from the cage while he sat on my finger. This way, I prepared to ensure the hips worked well by allowing the budgerigar to step onto my finger. + +13. I prepared to go to the café. I did this by counting the petty cash. First, I counted the coins or notes of the first denomination and added their total. Second, I prepared to do this with the next denomination. Third, I repeated this until there were no denominations left. In this way, I prepared to go to the café by counting the petty cash. + +14. I prepared to write “Happy Birthday” by writing it with the icing sugar on the cake. First, I filled the pen with icing sugar. Second, I poured the icing sugar through the hole. Third, I wrote the words on the top of the cake. In this way, I prepared to write “Happy Birthday” by writing it with the icing sugar on the cake. + +15. I prepared to love the atheist delights. I did this by listening to the intersex lady (the lady with no genital organs whatsoever). First, I looked up the telephone number in the telephone directory. Second, I rang it. Third, I had a conversation with the lady when she answered. In this way, I prepared to love the atheist delights by listening to the intersex lady. + +16. I prepared to love the big ideas. I did this by blowing up the big balloon. First, I held the neck of the balloon. Second, I blew up the balloon. Third, I tied a knot in the neck of the balloon. 
In this way, I prepared to love the big ideas by blowing up the big balloon. + +17. I prepared to feed the kitten by pouring a bowl of milk for it. First, I put the bowl on the ground. Second, I poured milk into it. Third, I called the kitten to let her know that I had poured a bowl of milk for her. + +18. The pop singer prepared to ask for a harp effect to be applied to his voice. He did this by playing the harp. First, he sat at the harp. Second, he selected the middle C string. Third, he plucked the string. In this way, the pop singer prepared to ask for a harp effect to be applied to his voice by playing the harp. + +19. The pop singer prepared to ask for a fast beat in his song. He did this by playing the drums quickly. First, he wrote down on paper the tempo of 200 beats per minute at which he planned to play the drums. Second, he set the metronome to this tempo. Third, he played the drums at this tempo. This way, the pop singer prepared to ask for a fast beat in his song by playing the drums quickly. + +20. The pop singer prepared to ask for a rocket sound effect to be played. He did this by dragging a stick through the sandbox. First, he selected the stick. Second, he positioned its end at one end of the sandbox. Third, he moved it through the sandbox until it had reached the other end. In this way, the pop singer prepared to ask for a rocket sound effect to be played by dragging a stick through the sandbox. + +21. The pop singer prepared to ask for a church bell to be played, suggesting he was kingly. He did this by striking the bell with a stick. First, he hung the bell from a stand. Second, he lifted the stick. Third, he struck the bell with the stick. In this way, the pop singer prepared to ask for a church bell to be played, suggesting he was kingly, by striking the bell with a stick. + +22. I prepared to help the baby tie her shoelaces by seeing the baby. First, I bobbed down on the ground. Second, I looked forward.
Third, I looked forward at each distance before me until I saw the baby. In this way, I prepared to help the baby tie her shoelaces by seeing the baby. + +23. I prepared to be secular-minded by shaking hands with my colleague. First, I looked my colleague in the eye. Second, I grasped his hand. Third, I shook his hand. + +24. I prepared to devise an ulterior motive by wearing diving equipment. First, I attached the oxygen cylinders to the mask. Second, I mounted the oxygen cylinders on my back. Third, I put on the mask. In this way, I prepared to devise an ulterior motive by wearing diving equipment. + +25. I prepared to eat the apple peel by reading the script. First, I read flamboyantly. Second, I read gingerly. Third, I read Moroccan-like. + +26. I prepared to read the Queen a letter. I did this by wetting the envelope. First, I sucked 20 aliquots of water into the pipette from the test tube. Second, I squeezed the water from the pipette into the petri dish. Third, I dabbed my applicator in the petri dish with water and wet the adhesive strip inside the envelope flap. In this way, I prepared to read the Queen a letter by wetting the envelope. + +27. I prepared to drive the car by hooking the trailer to the vehicle. First, I hooked the front of the trailer to the back of the vehicle. Second, I unlocked the locks. Third, I chained the hook into a closed circle with the chain and locked it at both ends. In this way, I prepared to drive the car by hooking the trailer to the vehicle. + +28. I prepared to leave Earth by calculating that the Space Rocket should move up 4 metres. First, I moved the rocket up 2 metres. Second, I moved the rocket up another 2 metres. Third, I calculated that the rocket had moved up 2+2=4 metres. + +29. I prepared to deliver the cardboard boxes by swerving into the carport. First, I positioned the car perpendicular to the carport. Second, I moved in front of the carport. Third, I turned into the carport.
In this way, I prepared to deliver the cardboard boxes by swerving into the carport. + +30. The Aboriginal man prepared to play with the duck. He did this by painting the duck. First, he dipped the brush in the paint can. Second, he applied the brush to the duck. Third, he repeated this until he had painted the duck with a coat of paint. In this way, the Aboriginal man prepared to play with the duck by painting it. + +31. I prepared to sing at the concert by finding my seat in the auditorium. First, I remembered the row and seat number printed on the ticket. Second, I found the row of seats with this row number. Third, I found the seat with this seat number. In this way, I prepared to sing at the concert by finding my seat in the auditorium. + +32. I prepared to help the students with their homework by explaining to monastics how to produce Aigs or A-grade essays, which can be modified for each recipient to earn an A. I did this by buying Vetusia3D™, a computer game that explained aspects of producing Aigs. First, I went anywhere I liked. Second, I changed it around. Third, I wrote an argument agreeing with (not disagreeing with) a published article’s conclusion. In this way, I prepared to help the students with their homework by buying Vetusia3D™."] \ No newline at end of file diff --git a/Lucian-Academy/Books/Music/Abracadabra Argument.txt b/Lucian-Academy/Books/Music/Abracadabra Argument.txt new file mode 100644 index 0000000000000000000000000000000000000000..9df9221fb44797af816805b845c05f8f7dd26065 --- /dev/null +++ b/Lucian-Academy/Books/Music/Abracadabra Argument.txt @@ -0,0 +1,65 @@ +["Green, L 2024, Abracadabra Argument, Lucian Academy Press, Melbourne.","Green, L 2024",1,"Abracadabra Argument + +1. The aerospace engineer prepared to test that an object was functioning correctly. He did this by eating using a spoon.
First, he moved the spoon so that its left edge's x coordinate was equal to its left edge's x coordinate plus his mouth's left edge's x coordinate minus his spoon's left edge's x coordinate. Second, he moved the spoon so that its bottom edge's z coordinate was equal to its bottom edge's z coordinate plus his mouth's bottom edge's z coordinate minus his spoon's bottom edge's z coordinate. Third, he moved the spoon so that its front edge's y coordinate was equal to its front edge's y coordinate plus his mouth's front edge's y coordinate minus his spoon's front edge's y coordinate. In this way, the aerospace engineer prepared to test that an object was functioning correctly by eating using a spoon. + +2. The manufacturer prepared to test that the seat was ergonomically designed. He did this by skewering the sausage using the fork. First, he positioned the fork horizontally over the horizontal centre of the slice of sausage. Second, he positioned the fork horizontally over the vertical centre of the slice of sausage. Third, he skewered the sausage. In this way, the manufacturer prepared to test that the seat was ergonomically designed by skewering the sausage using the fork. + +3. The optometrist prepared to test the size of the chopsticks she would put in her mouth. She did this by eating using the chopsticks. First, she placed the first chopstick to the left of the grain of rice. Then, she put the second chopstick to the right of the grain of rice. Finally, she picked up the grain of rice. In this way, the optometrist prepared to test the size of the chopsticks she would put in her mouth by eating using the chopsticks. + +4. The physicist prepared to calculate the friction coefficient on the slope. He did this by eating using the lubricated spoon. First, he put the spoon with wheat biscuits and milk into his mouth. Second, he gripped the spoon with his lips. Third, he pulled the spoon out of his mouth, leaving the wheat biscuit and milk in his mouth.
In this way, the physicist prepared to calculate the friction coefficient on the slope by eating using the lubricated spoon. + +5. The technician changed the track number by tearing the potato apart with his teeth. First, he clamped down on the biteful of potato with his molars. Second, he moved his jaw to the right. Third, he moved his jaw to the left. In this way, the technician changed the track number by tearing the potato apart with his teeth. + +6. The bridge technician prepared to raise the bridge by opening his mouth just before inserting food into it. First, he lifted the food to his mouth. Second, he opened his mouth. Third, he ate the food. + +7. The hiker prepared to break the bread. He did this by mashing the carrot on his plate. First, he placed the slice of carrot on the centre of his plate. Next, he mashed the carrot vertically. Finally, he mashed the carrot horizontally. In this way, the hiker prepared to break the bread by mashing the carrot on his plate. + +8. The cook prepared to squeeze the cooked tomato. He did this by crushing the grape. First, he took off his shoes. Next, he placed the centre of his foot on the centre of the grape. Then, he squashed the grape. In this way, the cook prepared to squeeze the cooked tomato by crushing the grape. + +9. The shepherd prepared to wash the sheep. He did this by chewing using his molars. First, he placed the jelly bean on his molar. Second, he chewed by moving his jaw left. Finally, he chewed by moving his jaw right. In this way, the shepherd prepared to wash the sheep by chewing using his molars. + +10. The crane mover prepared to move the crane segment into place. She did this by pushing the pea into her molar space with her tongue. First, she pushed the food into the centre of her mouth. Second, she moved her tongue to the left of the pea. Third, she moved the pea onto her lower right molar.
In this way, the crane mover prepared to move the crane segment into place by pushing the pea into her molar space with her tongue. + +11. The cleaner prepared to mop the floor by dissolving the cooked pumpkin in water. First, he placed the cooked pumpkin into the glass of water. Second, he let the water sit for 5 minutes. Third, he removed the pumpkin from the water. In this way, the cleaner prepared to mop the floor by dissolving the cooked pumpkin in water. + +12. The swimmer prepared to turn his head to take a breath by stirring the soup pot. First, he inserted the wooden spoon into the pot. Second, he stirred clockwise around half the pot. Third, he stirred clockwise around the other half of the pot. In this way, the swimmer prepared to turn his head to take a breath by stirring the soup pot. + +13. The actor prepared to eat the apple pie. He did this by running a certain distance with the energy from eating honey. First, he measured the distance he would travel to be 4 km. Second, he decided to run for 0.4 hours. Third, he divided 4 km by 0.4 hours to calculate his speed, 10 km/h. In this way, the actor prepared to eat the apple pie by running a certain distance with the energy from eating honey. + +14. The child fed the vegetable model to the plastic dinosaur. He did this by dismissing the meeting's participants after the sum of the speech times had been reached. First, he started counting the minutes of the meeting from 0. Second, he summed the speech times. Third, when the meeting time had reached this sum, he dismissed the meeting. In this way, the child fed the vegetable model to the plastic dinosaur by dismissing the meeting's participants after the sum of the speech times had been reached. + +15. The Chinese teacher prepared to play duck-duck-goose by pointing to the floor and asking the child to stand up. First, he walked to the child. Next, he asked the child to stand up. Finally, he watched the child stand up.
In this way, the Chinese teacher prepared to play duck-duck-goose by pointing to the floor and asking the child to stand up. + +16. The demolisher prepared to test that there was no more debris. He did this by counting the number of globules of marzipan. First, he moved in all combinations of directions from a cube chosen as a relative origin in the globule of marzipan until there were no unexplored cubes. Second, he tested to the right of the first globule of marzipan for another globule of it. Third, he repeated this until there were no more globules of marzipan. In this way, the demolisher prepared to test that there was no more debris by counting the number of globules of marzipan. + +17. The operator prepared to connect the caller to a call recipient by removing the treacle on toast from the bag. First, he reached inside the bag. Next, he felt for the sticky treacle on toast. Then, he removed the toast by holding it by its edge. In this way, the operator prepared to connect the caller to a call recipient by removing the treacle on toast from the bag. + +18. The bricklayer prepared to load the bricks into the wheelbarrow. He did this by swallowing the globules of treacle together. First, he placed the first globule of treacle in his mouth. Second, he placed a second globule of treacle in his mouth and chewed them together. Finally, he swallowed the small mouthful of treacle. In this way, the bricklayer prepared to load the bricks into the wheelbarrow by swallowing the globules of treacle together. + +19. The builder prepared to collect the smaller parts first to make the window by collecting the rubbish in the bag. First, she picked up the folded sheet of paper. Second, she carried the rubbish to the bag. Third, she placed the rubbish in the bag. In this way, the builder prepared to collect the smaller parts to make the window by collecting the rubbish in the bag. + +20. The bodybuilder prepared to spot his friend lifting barbells.
He did this by cleaning away the marks on his face. First, he looked in the mirror. Second, he observed that the chocolate mark was on the left side of his mouth. Third, he cleaned the left side of his mouth. In this way, the bodybuilder prepared to spot his friend lifting barbells by cleaning away the marks on his face. + +21. The bread seller prepared to roll his trailer down the left of the road. He did this by running down his lane in a race. First, he placed a pole at the intersection of the starting line and the left edge of the road. Second, he placed a rope parallel to the left edge of the lane. Third, he ran down the lane with the centre of his body aligned with the rope. In this way, the bread seller prepared to roll his trailer down the left of the road by running down his lane in a race. + +22. The trailer hirer prepared to wait for her trailer to dry. She did this by waiting for the lactic acid to disappear. First, she walked to the side of the track. Second, she lay down on her back. Third, she walked home. In this way, the trailer hirer prepared to wait for her trailer to dry by waiting for the lactic acid to disappear. + +23. The mathematics student prepared to substitute the answer back into the algebraic equation to check it. He did this by breathing in and breathing out while running. First, he breathed in during each of his first and second steps. Second, he breathed out during each of his third and fourth steps. Finally, he repeated this process until he had finished running. In this way, the mathematics student prepared to substitute the answer into the algebraic equation to check it by breathing in and breathing out while running. + +24. The babysitter prepared to carry the child. He did this by lifting his foot onto the step. First, he placed his foot 0.05 metres away from the base of the step. Second, he lifted his foot 0.05 metres above the step. Finally, he put his foot horizontally centred on the step.
This way, the babysitter prepared to carry the child by lifting his foot onto the step. + +25. A lab technician prepared to place food on an agar plate. She did this by lowering a bucket into the well. First, she held the bucket over the well. Then, she started lowering it into the well. Finally, she stopped lowering it into the well when she heard it make a splash. In this way, a lab technician prepared to place food on an agar plate by lowering a bucket into the well. + +26. The interpreter prepared to pronounce the first and third syllables in “Italy” with emphasis. He counted the down (emphasised) beats in a music bar. First, he started his beat counter at 0. Then, he recorded the number of beats in the bar, 4. Lastly, he divided the number of beats by 2, two down beats (the first and third beat). In this way, the interpreter prepared to pronounce the first and third syllables in “Italy” with emphasis by counting the down (emphasised) beats in a bar of music. + +27. The bartender prepared to attach the hose to the tap. He did this by placing his mouth around the ice cream. First, he opened his mouth to the width of the ice cream. Second, he opened his mouth to the depth of the ice cream. Third, he wrapped his mouth around the ice cream. In this way, the bartender prepared to attach the hose to the tap by placing his mouth around the ice cream. + +28. The gardener prepared to water the garden. He did this by moistening his mouth between singing each phrase. First, he sang the first phrase. Next, he moistened his lips. Third, he sang the following words. In this way, the gardener prepared to water the garden by moistening his mouth between singing each phrase. + +29. The knight prepared to lower the drawbridge. He did this by watching the conductor bring him in at his entrance. First, he watched the conductor before his entrance. Second, he took a breath. Third, he opened his mouth and sang the first syllable when the conductor brought him in. 
In this way, the knight prepared to lower the drawbridge by watching the conductor bring him in at his entrance. + +30. The farmer prepared to finish loading the barn with hay. He did this by watching the conductor to know when to finish singing the final note. First, he looked at the conductor. Second, he waited until the conductor indicated the note had finished. At the same time, he stopped singing the note. In this way, the farmer prepared to finish loading the barn with hay by watching the conductor to know when to finish singing the final note. + +31. The weatherman prepared to record whether it was raining. He did this by listening to the conductor sing his first note. First, he listened to the note. Second, he hummed it to himself. Third, he prepared to sing the first note when brought in. In this way, the weatherman prepared to record whether it was raining by listening to the conductor sing his first note. + +32. The scientist zeroed the control test tube to compare with the test tubes with different values. He did this by marking a previous note the same as the first of the following phrase. First, he worked out which note started the following phrase. Second, he found the last instance of this note before this in his line. Third, he remembered this note to sing when he started the following phrase. In this way, the scientist zeroed the control test tube by marking a previous note that was the same as the first of the following phrase."] \ No newline at end of file diff --git a/Lucian-Academy/Books/Music/Anarchy Argument.txt b/Lucian-Academy/Books/Music/Anarchy Argument.txt new file mode 100644 index 0000000000000000000000000000000000000000..0b999b3ccfc531aee95dbe81190b15ec3d3caede --- /dev/null +++ b/Lucian-Academy/Books/Music/Anarchy Argument.txt @@ -0,0 +1,171 @@ +["Green, L 2024, Anarchy Argument, Lucian Academy Press, Melbourne.","Green, L 2024",1,"Anarchy Argument + +1.
What are the x, y and z dimensions in metres of \"cup\" in \"The soldier cleaned himself with a cup of water\"? + +2. What are the x, y and z dimensions in metres of \"bottle\" in \"The soldier drank a bottle of apple juice\"? + +3. What are the x, y and z dimensions in metres of \"apple\" in \"The soldier put an apple in his lunch box\"? + +4. What are the x, y and z dimensions in metres of \"periscope\" in \"The soldier looked at the stand through the periscope\"? + +5. What are the x, y and z dimensions in metres of \"pear\" in \"The soldier ate a pear with the friend he found on the Internet\"? + +6. What are the x, y and z dimensions in metres of \"brick\" in \"The soldier stood on a brick, which was like a wall\"? + +7. What are the x, y and z dimensions in metres of \"log\" in \"The soldier stepped over the log\"? + +8. What are the x, y and z dimensions in metres of \"flag\" in \"The soldier found a flag\"? + +9. What are the x, y and z dimensions in metres of \"bun\" in \"The soldier ate a bun, which he had bartered for\"? + +10. What are the x, y and z dimensions in metres of \"plant\" in \"The soldier watered the plant, after moving it\"? + +11. What are the x, y and z dimensions in metres of \"tofu\" in \"The soldier ate tofu\"? + +12. What are the x, y and z dimensions in metres of \"garbage bag\" in \"The soldier moved the garbage bag\"? + +13. What are the x, y and z dimensions in metres of \"nuts\" in \"The soldier chewed nuts at the theatre\"? + +14. What are the x, y and z dimensions in metres of \"processed cheese\" in \"The soldier bit processed cheese\"? + +15. What are the x, y and z dimensions in metres of \"cup\" in \"The soldier swallowed a cup of grape juice\"? + +16. What are the x, y and z dimensions in metres of \"roll\" in \"The soldier ate a roll with vegan cheese\"? + +17. What are the x, y and z dimensions in metres of \"peanut\" in \"The soldier nipped a peanut\"? + +18. 
What are the x, y and z dimensions in metres of \"nectarine\" in \"The soldier munched a nectarine\"? + +19. What are the x, y and z dimensions in metres of \"sugar\" in \"The soldier packed sugar in his bag\"? + +20. What are the x, y and z dimensions in metres of \"ball\" in \"The soldier threw a ball in the air\"? + +21. What are the x, y and z dimensions in metres of \"banana\" in \"The soldier peeled the banana\"? + +22. What are the x, y and z dimensions in metres of \"orange\" in \"The soldier squeezed the orange\"? + +23. What are the x, y and z dimensions in metres of \"mandarin\" in \"The soldier removed a segment from a mandarin\"? + +24. What are the x, y and z dimensions in metres of \"bra\" in \"The soldier made a bra\"? + +25. What are the x, y and z dimensions in metres of \"stand\" in \"The soldier jumped to touch the top of the stand\"? + +26. What are the x, y and z dimensions in metres of \"ring\" in \"The soldier wore a ring\"? + +27. What are the x, y and z dimensions in metres of \"watering container\" in \"The soldier watered the apricot tree with the watering container\"? + +28. What are the x, y and z dimensions in metres of \"base\" in \"The soldier placed the base on the flat ground\"? + +29. What are the x, y and z dimensions in metres of \"abdominal muscle exerciser\" in \"The soldier exercised his abdominal muscles with the abdominal muscle exerciser\"? + +30. What are the x, y and z dimensions in metres of \"flask\" in \"The soldier gargled water from his flask\"? + +31. What are the x, y and z dimensions in metres of \"dried fig\" in \"The soldier chewed the dried fig\"? + +32. What are the x, y and z dimensions in metres of \"shorts\" in \"The soldier ran on the spot in shorts\"? + +33. What are the x, y and z dimensions in metres of \"two sticks\" in \"The soldier jumped over two sticks\"? + +34. What are the x, y and z dimensions in metres of \"hoop\" in \"The soldier swung the hoop around his waist\"? + +35. 
What are the x, y and z dimensions in metres of \"glass\" in \"The soldier drank a glass of water\"? + +36. What are the x, y and z dimensions in metres of \"bible\" in \"The army chaplain distributed the bibles\"? + +37. What are the x, y and z dimensions in metres of \"wood\" in \"The soldier cut the wood into skirting boards\"? + +38. What are the x, y and z dimensions in metres of \"stretcher\" in \"The soldier lay down on the stretcher\"? + +39. What are the x, y and z dimensions in metres of \"locker\" in \"The soldier found the correct locker\"? + +40. What are the x, y and z dimensions in metres of \"carrot\" in the sentence \"The soldier ate the carrot to check it was healthy\"? + +41. What are the x, y and z dimensions in metres of \"seat\" in the sentence \"The soldier sat on the seat to check it was stable\"? + +42. What are the x, y and z dimensions in metres of \"salt\" in the sentence \"The soldier salted the onion\"? + +43. What are the x, y and z dimensions in metres of \"name badge\" in the sentence \"The soldier found his name badge\"? + +44. What are the x, y and z dimensions in metres of \"drum\" in the sentence \"The soldier beat a regular rhythm on the drum\"? + +45. What are the x, y and z dimensions in metres of \"bow\" in the sentence \"The soldier stayed balanced when aiming the arrow at the target with the bow\"? + +46. What are the x, y and z dimensions in metres of \"crotchet\" in the sentence \"The soldier moved the crotchet forward one beat of the musical bar\"? + +47. What are the x, y and z dimensions in metres of \"money\" in the sentence \"The soldier labelled the money to take home\"? + +48. What are the x, y and z dimensions in metres of \"celery\" in the sentence \"The soldier fed the kangaroo the celery\"? + +49. What are the x, y and z dimensions in metres of \"balloon\" in the sentence \"The soldier blew up a balloon\"? + +50. 
What are the x, y and z dimensions in metres of \"corn\" in the sentence \"The soldier ate the corn\"? + +51. What are the x, y and z dimensions in metres of \"towel\" in the sentence \"The soldier towelled himself dry\"? + +52. What are the x, y and z dimensions in metres of \"playing card\" in the sentence \"The soldier placed the playing card on the table\"? + +53. What are the x, y and z dimensions in metres of \"teacup\" in the sentence \"The soldier drank tea from the teacup\"? + +54. What are the x, y and z dimensions in metres of \"stick\" in the sentence \"The soldier moved the stick off the road\"? + +55. What are the x, y and z dimensions in metres of \"label\" in the sentence \"The soldier read the label\"? + +56. What are the x, y and z dimensions in metres of \"pole\" in the sentence \"The soldier stood straight against the pole\"? + +57. What are the x, y and z dimensions in metres of \"pupil\" in the sentence \"The soldier looked at the girl's pupil\"? + +58. What are the x, y and z dimensions in metres of \"face\" in the sentence \"The soldier looked at the boy's face\"? + +59. What are the x, y and z dimensions in metres of \"number\" in the sentence \"The soldier wrote the table number on the vegemite\"? + +60. What are the x, y and z dimensions in metres of \"ladder\" in the sentence \"The soldier liked his friend because his friend was able to climb a ladder\"? + +61. What are the x, y and z dimensions in metres of \"tea towel\" in the sentence \"The soldier folded the tea towel until it was hand-sized\"? + +62. What are the x, y and z dimensions in metres of \"cow\" in the sentence \"The soldier counted the number of times the cow mooed\"? + +63. What are the x, y and z dimensions in metres of \"glasses\" in the sentence \"The soldier opened a shape book at the page for science and looked at the illustration of the glasses\"? + +64. 
What are the x, y and z dimensions in metres of \"mitochondrion\" in the sentence \"The soldier examined the model mitochondrion\"? + +65. What are the x, y and z dimensions in metres of \"candle\" in the sentence \"The soldier lit a candle at church\"? + +66. What are the x, y and z dimensions in metres of \"birth schedule\" in the sentence \"The soldier displayed the birth schedule\"? + +67. What are the x, y and z dimensions in metres of \"staff timetable\" in the sentence \"The soldier wrote the staff timetable\"? + +68. What are the x, y and z dimensions in metres of \"room timetable\" in the sentence \"The soldier read the room timetable\"? + +69. What are the x, y and z dimensions in metres of \"couch\" in the sentence \"The soldier sat on the couch\"? + +70. What are the x, y and z dimensions in metres of \"pencil case\" in the sentence \"The soldier put a pencil in the pencil case\"? + +71. What are the x, y and z dimensions in metres of \"hole puncher\" in the sentence \"The soldier punched holes in the paper with the hole puncher\"? + +72. What are the x, y and z dimensions in metres of \"water container\" in the sentence \"The soldier took the water container with him for his hike\"? + +73. What are the x, y and z dimensions in metres of \"pen\" in the sentence \"The soldier returned his friend's pen to his friend\"? + +74. What are the x, y and z dimensions in metres of \"two pieces of paper\" in the sentence \"The soldier wrote what he will do and what he said he will do on the two pieces of paper, respectively\"? + +75. What are the x, y and z dimensions in metres of \"piece of paper\" in the sentence \"The soldier wrote five reasons to do a job in his company on a piece of paper\"? + +76. What are the x, y and z dimensions in metres of \"tambourine\" in the sentence \"The soldier made a model of a tambourine\"? + +77. 
What are the x, y, and z dimensions in metres of \"friend\" in the sentence \"The soldier found his friend who mentioned the same key phrase as him\"? + +78. What are the x, y and z dimensions in metres of \"three friends\" in the sentence \"The soldier found two other friends, each of whom said the same key phrase as another one in the group\"? + +79. What are the x, y and z dimensions in metres of \"objects\" in the sentence \"The soldier found objects representing what he and his three friends had said they needed together\"? + +80. What are the x, y and z dimensions in metres of \"the library\" in the sentence \"The soldier checked the random place: the library\"? + +81. What are the x, y and z dimensions in metres of \"the conference room\" in the sentence \"The soldier checked the conference room, in the library\"? + +82. What are the x, y and z dimensions in metres of \"the encyclopaedia\" in the sentence \"The soldier read an article in the encyclopaedia to read widely\"? + +83. What are the x, y and z dimensions in metres of \"the blank book\" in the sentence \"The soldier wrote his encyclopaedia article in the blank book\"? + +84. What are the x, y and z dimensions in metres of \"wheat biscuits\" in the sentence \"The soldier moistened wheat biscuits after he answered a girlfriend's call\"? + +85. What are the x, y and z dimensions in metres of \"glove\" in the sentence \"The soldier inserted his hand in his glove\"?"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Music/Lucian CI-CD 2 Song Argument.txt b/Lucian-Academy/Books/Music/Lucian CI-CD 2 Song Argument.txt new file mode 100644 index 0000000000000000000000000000000000000000..8157bca2be857d548b5c75336e2789439f0f632e --- /dev/null +++ b/Lucian-Academy/Books/Music/Lucian CI-CD 2 Song Argument.txt @@ -0,0 +1,20 @@ +["Green, L 2024, Lucian CI-CD 2 Song Argument, Lucian Academy Press, Melbourne.","Green, L 2024",1,"Lucian CI/CD 2 Song Argument + +1. 
The refactorer tried deleting lines and joining or adding code to join lines. I stated that commands that don't give output should be kept in the new version of the algorithm in Lucian CI/CD because otherwise, these commands, connected to commands that did give output, would be deleted because they didn't affect output. I kept the writeln statement. This statement could print interpreter debugging information or game or other prompts. I kept all writeln statements, and if they needed to be deleted, they were deleted manually and the modification dates reset in Lucian CI/CD. +2. The refactorer collected positive and negative tests, where the negative tests traced back from failure data, particularly from if-then statements. I kept the write statement. The writeln statement was the write statement followed by writing the newline character. \"Write\" could be used to write characters in a text video game, write parts of a term written by writeln, and was needed to write symbols representing illegal characters. I could save the write output to a file, which, with labels, could be checked for accuracy. +3. The refactorer tried different combinations of values instead of random outputs. I checked the random statement. A predicate with output only was checked. Random, however, had random output and couldn't be predicted. I could limit random output to a single number for testing. +4. The refactorer checked the chain of equals statements to check for a comparison, such as checking that it was a list. I possibly kept the bash command in the new version of the algorithm in Lucian CI/CD. Bash took input and gave output; if its output was checked, the command counted as checked. Or if it only took input and its output wasn't checked, it was kept. Its output was deemed checked if it was compared with a non-var. +5. The refactorer stated that the command was redundant if it was duplicated or the same as another command in the result. 
I checked read_string because it had output. As it had output, it was reviewed by Lucian CI/CD whether to keep it based on the wanted output from the predicate. If read_string provided output, optionally passed with equals and compared with a constant or other value, eventually affecting the predicate's output, and this output was wanted, then read_string was likely to be kept. It wasn't kept if redundant, providing the same result as not being there. +6. The refactorer found the minimal number of commands combined to give the correct result. The command that returned the number of rows and columns in the tty terminal window was checked. While it produced output, this could be compared with and checked for correctness on exiting the predicate. Like other commands that may or may not compare with its output, the tty command was checked for inclusion. There was no routine to check whether the output was ignored; it was only checked for correctness on exit. +7. The refactorer stated that partially checking a value had the same effect as checking it unless separate commands contributed to parts of the value. I checked the set prolog flag command. This command had no output. If the command were given input in this argument, it wouldn't be kept because it wasn't outputted. Partially comparing with a term as the output of another command had the same effect as compared with it, but in the end, all that mattered was checking it. +8. The refactorer checked that the working directory was correct. Lucian CI/CD reviewed or kept the working directory command depending on its mode. Commands' output could be split up, joined, and checked. The test finder and checker found tests and checked the code, respectively, and Lucian CI/CD reduced the code length. Lucian CI/CD found the code with the shortest length, combining the most recent and second most recent versions. +9. The refactorer stated that the halt command was kept. It returned no output. 
It had no arguments or one input, the status, to return. Command modes depended on their code and the data given to them. Commands' code and data could be scanned to check their mode. If a command had no output, it was kept until a human manually deleted it. +10. The refactorer recanted detailed mind reading's dog barking and crying opera versus random mind reading's two-dimensional volcano song. I stated that detailed mind reading could select against errors, for example, incorrect paraphrasing word choices, algorithm commands and writing words. Random mind reading with enough meditators supporting it, which detailed mind reading was better than, could select courses, wording of answers and details. I reduced the number of instructions in the detailed mind-reading decision tree algorithm. Detailed mind reading gave instruments a meditation-themed, or better quality, three-dimensional sound. +11. The refactorer stated that the only way of testing for security holes was to try data combinations. I found negative tests for predicates. Negative tests tested wanted failure. I tested that all consequents of if-then statements were traced to the predicate's result. I found the data types, including recurring strings' and list items' variables, and any cases where a negative result was explicitly returned at the end of the predicate. +12. The refactorer removed obsolete, extra or action-taking commands that did nothing or did the same as nothing with other commands. I tried deleting lines to join lines. I deleted lines that got in the way, were incorrect or were unnecessary to join lines. Lines got in the way if they added a number to the same variable or transformed a value of the same variable. Incorrect lines contained bugs or errors or didn't connect with other commands. +13. 
The refactorer found negative data from lower predicates, sometimes choosing from lists of characters rather than using \"not\" rather than selecting against undefined arguments (instead, tolerated undefined arguments and avoided deleting all commands in a predicate) when testing. I tried joining code to join lines. I changed variable names to join code. The code was already correct in either version or needed modification. Occasionally, all the commands could be deleted if the rule had enough data or negative or variable tests to stop undefined arguments. +14. The refactorer warned users but guessed if the proposed types were recommended by testing whether multidimensional terms were needed in that place. I tried adding code to join lines. I found the problematic set of lines. I found new code that would satisfy the required types. The required types, for example, multidimensional first arguments in the interpreter, were human-entered. +15. The refactorer collected positive and negative tests, where the negative tests were found tracing back from failure data, particularly from if-then statements. I also traced back from non-existent \"then\" clauses. These generated logical formulas to accompany data to find test data that satisfied them. Combinations of data were tried, and the main combinations that satisfied the formulas were used. +16. The refactorer complained that \"not\" of many variables (for example, not(a(B, C, D))) was complex and simplified to points. When there were several logical formulas operating on a variable, their conjunction was simplified. For example, (A^B)^(A^C)=(A^B^C). In addition, certain nondeterministic formulas were separated by disjunctives and simplified separately. If a formula \"not(a)\" was negated, it became \"a\". 
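The simplification in item 16 can be illustrated with a small sketch. This is a hypothetical Python rendering (the source's own setting is List Prolog; the list-of-atoms representation of a conjunction is an assumption made for illustration): nested conjunctions are flattened and duplicate conjuncts dropped, so (A^B)^(A^C) gives A^B^C, and negating "not(a)" gives back "a".

```python
def simplify_conjunction(*groups):
    # Flatten nested conjunctions and drop duplicate conjuncts:
    # (A ^ B) ^ (A ^ C) simplifies to A ^ B ^ C.
    seen, result = set(), []
    for group in groups:
        for atom in group:
            if atom not in seen:
                seen.add(atom)
                result.append(atom)
    return result

def negate(formula):
    # Double negation: negating not(a) yields a; otherwise wrap in not(...).
    if isinstance(formula, tuple) and formula[0] == "not":
        return formula[1]
    return ("not", formula)

simplify_conjunction(["A", "B"], ["A", "C"])  # ["A", "B", "C"]
negate(("not", "a"))                          # "a"
```

As the text notes, nondeterministic formulas separated by disjunctives would be split and each disjunct simplified separately in the same way.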
+ +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Music/dot-Strings to Grammar Song.txt b/Lucian-Academy/Books/Music/dot-Strings to Grammar Song.txt new file mode 100644 index 0000000000000000000000000000000000000000..96944ea4656d64abf071f5dd589f5d3a15d1cb25 --- /dev/null +++ b/Lucian-Academy/Books/Music/dot-Strings to Grammar Song.txt @@ -0,0 +1,12 @@ +Strings to Grammar Song + +1. The musician shared a fascinating insight into crafting a film. They explained that a story is like music, and a film can contain songs that are mind-read from the mood of part of the overall song, or the original song can be the film’s theme song. In this case, they crafted a song deeply rooted in philosophy, serving as the perfect inspiration for a movie. The selection of the song was not random but driven by its unique blend of instruments, melody, harmony, and the story it evoked. The philosophy, they noted, was not just a random inspiration but a quantum one that the algorithm, which mind-read them, interpreted as the basis for music. This intriguing algorithmic philosophy often led to the creation of better songs. +2. The musician grounded secondary songs on commercial “thereness” songs. The film’s cinematography perfectly shot scenes from the songs, interpreted based on their sound or inherent qualities. A trill was interpreted as coloured trail lights from flying cars in the air. Strings to Grammar, construed as a screenplay about synthesising time travel in the simulation, had a deep, important-sounding theme song containing a frenetic, high-pitched instrument. The movie poster was rendered from the waviness of time interpreted as a software box image. +3. The musician stated that the story was high-stakes (meaning it was error-free) and philosophical. I wrote the story based on scenes from the music, appropriate for the music’s mood. It included a what-if scenario exploring an exciting and unusual set of circumstances. 
It contained an implicitly changing protagonist with aims and history and an opposing complex antagonist with their aims. The protagonist aims for particular goals for specific reasons with high stakes (correctness). The story has three acts, one each for build-up, meeting and resolution. +4. With their innovative use of algorithmic suggestions or breasonings, the musician infused motivations with excitement. They propelled the plot through internal conflict (the protagonist’s internal struggles) and external conflict (external impediments the protagonist opposes to meet their aim). They opted for visual storytelling, using gestures, visuals, and meaning to convey responses, ensure memorable scenes, and aid in plot development or character revelation. They developed a theme and message for the audience to unpack. This theme was intricately woven throughout the story, influencing characters’ choices and plot events. +5. The musician sensed mind reading could sense whether a conclusion was based on other findings in the text, which helped detect essay structure. I mind-read 1-150 hand breasonings, synthesising them in sets of ten, choosing the initial contention the breasonings were written about (or mind-funnelled one breasoning) to influence lighting, make-up or other film-making decisions. I mind-funnelled a breasoning by collecting and selecting from various choices. I wrote each scene to be exciting or crucial to the story, maintaining an engaging pace and timing, building anticipation and releasing it tactically to keep viewers enthusiastic. I struck a balance between original parts and familiar story elements. +6. With their adept use of the combophil algorithm, the musician interpreted a philosophy on a topic. They replaced key terms with given extended configurations and abridged complex stories, such as preventing a time travel accident causing the end of the universe. 
They made creative choices using the combophil algorithm, influencing the script’s key points. They created a fine arts temporal montage with an implicit story. The philosopher in each department chose a general theme. Popology was responsible for writing the interpreter, while theology was in charge of writing the inductive engine. +7. The musician had an extensive database of thoughts ready, which could be accessed using random, funnelled mind reading. I used combophil to influence the setting, character, story and other movie elements. The secret algorithm powered viewers in their droves. The algorithm’s song inspired unique visuals and sound. The algorithm was core in the accreditation agency, which supported at least fifty education academies. +8. The musician aimed to educate students about immortality. I wrote characters’ motivations by mind-funnelling three “why” levels. If they needed inspiration, I included a philosophy at the start, for example, simplifying a more significant algorithm into a production line. I repeated this process for their history, which influenced their aims. For instance, they might dream of changing people’s lives because of a childhood inspiration. +9. The musician wanted to see the faces while the actors were talking. I edited the movie using mind-reading, scoring whether a scene was interesting enough and contributed sufficiently to the storyline. I couldn’t change the strength of the mind-reading, but I could change the intelligence by using a decision tree to select from story possibilities, giving a smoother result without an empty feeling. I could colour grade using mind-reading and an algorithm and count to three between lines in editing. +10. The musician was busy playing jack of all trades, producing the movie. I synthesised the characters’ top-level motivations with those of others. These motivations were spread across time and settings and made into the story. 
Each character had a secret or updated set of ten algorithms for a recurring character. These algorithms had done up movies explaining the algorithm and films relating to the movie, perhaps about helping with the movie production. diff --git a/Lucian-Academy/Books/PEDAGOGY/Accreditation.txt b/Lucian-Academy/Books/PEDAGOGY/Accreditation.txt new file mode 100644 index 0000000000000000000000000000000000000000..8570665479abc01dfd68f3fc0f6c098341787109 --- /dev/null +++ b/Lucian-Academy/Books/PEDAGOGY/Accreditation.txt @@ -0,0 +1,23 @@ +["Green, L 2022, Accreditation, Lucian Academy Press, Melbourne.","Green, L 2022",1,"Accreditation + +1. I customised the sentence for the student. +2. I filtered out terms in the sentence. +3. I followed up the student for work. +4. I ensured the students' clothes were tucked in. +5. I joined the students who were having difficulty with the same topics together. +6. I intercepted the student who was performing poorly with formative assessment. +7. I formed the clubs at school. +8. I wrote the framework for religion in relation to schools. +9. The student's tracks in class were followed. +10. I permitted the student to leave the class. +11. I found that if the requirements were met, accreditation was possible. +12. I verified that the course was accredited. +13. I verified that the student had studied an accredited degree. +14. I checked how many test cases the student had solved. +15. I verified the assignment for plagiarism. +16. I interviewed the student to verify the authenticity of the assignment. +17. I monitored how well the student was understanding the examples. +18. I discussed the student's progress with her. +19. If the assignment passed one test, it passed. +20. I checked the assignment in case it was very close to passing. 
+"] \ No newline at end of file diff --git a/Lucian-Academy/Books/PEDAGOGY/David's essay.txt b/Lucian-Academy/Books/PEDAGOGY/David's essay.txt new file mode 100644 index 0000000000000000000000000000000000000000..dc338baef4c82a6044fa5a6728addb19ab27ce06 --- /dev/null +++ b/Lucian-Academy/Books/PEDAGOGY/David's essay.txt @@ -0,0 +1,26 @@ +["Green, L 2024, David's essay, Lucian Academy Press, Melbourne.","Green, L 2024",1,"% David 4 12 23 +David’s essay + +FUNDAMENTALS OF PEDAGOGY +by Lucian Green +Breathsonings 1 of 4 + +After the operation, the hospital worker helped their patient feel better by dropping watery sugar down their throat. The helper helped the person recover by showing them when to swallow food. The doctor educated the patient about their body by eating the food to help them understand how they digested it. Showing this knowledge, the patient finally played a game about food entering the stomach. Following playing this game, the patient remembered to eat healthy food, have lovely feelings and eat enough food. + +1. The pedagogue selected the suitable material. The substance was better for the task. It rebounded well. It was non-stick and hygienic. Or it was adhesive. +2. The pedagogue stated that the valve allowed gas to escape from the pipe using a switch. I analysed the valve. I regulated the flow in the piping system. Or I handled the pressure in the piping system. I released pressure using the valve. +3. The pedagogue bisected the angle. I measured the angle using a protractor. I divided this angle by two. Using a protractor, I drew a point at this angle from one of the lines in the original angle. I ruled a line from the centre to this point to bisect the angle. +4. The pedagogue helped with becoming a bot, acting as the bot and listening for cues. I selected the best environments for the child. I made sure the child was equipped for study. I attended to the child. I helped with arguments and algorithms. +5. The pedagogue backed up the files. 
I detected that the file had changed. I backed it up. I found the file’s version history. I retrieved an old version of the file. +6. The pedagogue was described as the stick. Simulants weren’t seen as sticks. I rested. I walked carefully. I put my head flat on the pillow. +7. After finding the tests, I ran Lucian CI/CD. However, modifying types could help simplify it instead of saving parts. The substrate was the program’s previous version. Instead, I found the algorithm’s tests and whether it worked from types. I could more easily correct and finish the algorithm using types. I used constants, pattern matching, simple CAW commands, or neuronetworks to complete the algorithm. +8. The pedagogue tested the argument and algorithm assignments with students. I worked on the version for some disabled students. I reduced the complexity of the algorithms specified. Functional students all worked on algorithms of the same complexity. I could reach disabled students in other languages, provide live feedback and use a web version control system that ran algorithms. +9. The pedagogue worked offline, then committed the file. I wrote a web word processor. A variant of it could mind-read students. Or it could complete code using Lucian CI/CD or types. The student downloaded the web word processor to their computer to ease the load on the server. +10. The pedagogue labelled the data as recognised or unrecognised patterns. The algorithm completed the algorithm using types. I found the types gap. I found whether types were inserted or deleted and whether a decision tree, sort or subterm with address had been used. I separated the need for pattern matching and CAW/neuronetworks. +11. The pedagogue wrote a programming language or grammar to write algorithms using subterms with addresses with nested patterns. I used program finder to complete the algorithm component with recognised patterns and CAW or a neuronetwork to complete the algorithm element with unrecognised patterns. 
I labelled data by name and used subterms with addresses to process data with recognised and unrecognised patterns, and patterns within patterns. I required data to have these labels from the start. I predicted linear and hierarchical structures. +12. The pedagogue pretty printed the tree. The subterm with address language could process the data more quickly. I could process data in the correct order. I processed data depth-first, pre-order, in-order, post-order or breadth-first. I listed the query, order and transformation. +13. The pedagogue eliminated ambiguous labels. The subterm with address language could process the data all at once. The user fetched the subterms and addresses. Then, they processed subterms, found subterms from the subterms in a hierarchy, and processed them until there were none left. Processing them in Prolog allowed for global and local counters, loops around the same or different types of subterms. +14. The pedagogue built the output level(s) by level(s). The algorithm built the output using the subterms’ addresses. The subterms’ structure could be mirrored, and their addresses kept the same. Or, some items could be earmarked for deletion. Or, an algorithm could insert items in a level or insert a new level according to a heuristic. +15. The pedagogue labelled data before merging it. I operated on the data and substituted it back in using its addresses. They may be sorted or sorted without deleting duplicates. Data may be taken, modified and replaced into the structure. The data could have labels added or removed. +16. The pedagogue left the code with a subterm with an address in it to simplify processing. I transformed unrecognised patterns in the list. I listed the data. I found a pattern in it based on previous patterns, using a neuronetwork. I listed the code to produce the output. 
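The "subterm with address" idea in items 11-15 can be sketched briefly. This is a minimal, hypothetical Python illustration (the source's own setting is Prolog; the tuple-of-indices addressing scheme is an assumption for this sketch): every subterm of a nested term is paired with the path of indices leading to it, so it can be taken out, transformed, and substituted back in the same place, as item 15 describes.

```python
def subterms_with_addresses(term, address=()):
    # Enumerate every subterm of a nested list term with its address,
    # depth-first and pre-order (one of the orders named in item 12).
    yield address, term
    if isinstance(term, list):
        for i, sub in enumerate(term):
            yield from subterms_with_addresses(sub, address + (i,))

def put(term, address, value):
    # Substitute value back into term at the given address,
    # building a new structure rather than mutating the old one.
    if not address:
        return value
    head, rest = address[0], address[1:]
    return [put(sub, rest, value) if i == head else sub
            for i, sub in enumerate(term)]

term = ["a", ["b", "c"]]
addrs = dict(subterms_with_addresses(term))
# addrs[(1, 0)] is "b"; operate on it and substitute it back in:
updated = put(term, (1, 0), "B")  # ["a", ["B", "c"]]
```

Because the addresses are kept while subterms are processed, the structure can be mirrored with items modified, earmarked for deletion, or inserted level by level, as item 14 describes.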
+"] \ No newline at end of file diff --git a/Lucian-Academy/Books/PEDAGOGY/Education of Medicine 1.txt b/Lucian-Academy/Books/PEDAGOGY/Education of Medicine 1.txt new file mode 100644 index 0000000000000000000000000000000000000000..dae96b3c6eb81a05b0a2bdb5c4e5c6b6f9e24376 --- /dev/null +++ b/Lucian-Academy/Books/PEDAGOGY/Education of Medicine 1.txt @@ -0,0 +1,89 @@ +["Green, L 2024, Education of Medicine 1, Lucian Academy Press, Melbourne.","Green, L 2024",1,"Education of Medicine 1 + +1. The medicine student didn't take risks before body replacement. I helped follow positive function with movement and body replacement. I avoided damage to my body. I moved in safe ways. I remembered to replace my body every day. +2. The medicine student enjoyed continuous positive function of his body. I replaced my body to not have body failure in life. I avoided organ failure. I avoided external sources of damage. I avoided forgetting to replace my body each day. +3. The medicine student listed used and unused body functions. I exercised each part of my body to help me live longer. I exercised my muscles. I used my brain by studying. I avoided stress from overwork. +4. The medicine student did one thing at a time. For example, I aimed to run and perform extraordinary things in life. Therefore, I exercised outside each day. This exercise consisted of a run or a walk. I made goals and achieved them. +5. The medicine student turned off skin growth in anti-ageing. I replaced my body, resetting to ideal physical condition. I continued to feel fine. I forgot any worrying things. My hair and nails grew, and I cut them. +6. The medicine student enabled immortality with accreditation. The medicine student took spiritual preventative medicine for schizophrenia and studied for a Master's in Medicine to prevent schizophrenia and hallucinations. Spiritual medicine turned off particular points in our time. It indicated future medicine. 
The degree helped with critical and pedagogical skills to turn off unwanted spiritual points. +7. The medicine student helped others to become immortal. I used the psychiatric setting to write high distinctions. I could concentrate well. I used meditation to help write books. I used accreditation to support me in avoiding mistakes. +8. The medicine student avoided stress and did what she wanted to. The woman enjoyed a high quality of life, including studying, by preventing stress. The woman enjoyed talking with high society members. She chose an appropriate course. She worked in this field. +9. The medicine student only did relaxing things. I had the confidence to exercise and lead an everyday life without stress. I avoided heaviness, speed and doing too much. I had time to go outside. I saw friends and walked my dogs. +10. The medicine student relaxed at home. I spiritually replaced my body, preventing mental illness from studying subjects and turning off hallucinations. I was fresh, helping me stay happy. This state allowed me to examine my thoughts. I could do this by choosing to do it. +11. The medicine student started a copywriting business or a newspaper, or worked as a medical receptionist, a school receptionist and a meditation/events staff member/gardener for a meditation centre. I found out the requirements in the simulation and performed them in time. I did my things (earned jobs above those I needed or not, and completed pedagogical requirements) and moved on (helped professors). This thought made me laugh. I also relaxed. +12. The medicine student noticed the small things, such as relaxing hands. The person tested that immortality with meditation and body replacement allowed him to add on specific medical questions. I queried possible preventative measures, treatments and other possibilities about each part of my body. I stayed happy. I continued. +13. The medicine student tested my health. I automatically generated high distinctions about replacing my body. 
I forgot it at the time. I relaxed at the time. I attained high health. +14. The medicine student helped others in their prime become immortal. The woman represented an invitation to the simulation and to improving her health. She joined the simulation by time travelling to it. She became immortal. She increased her health by taking care of it. +15. The medicine student didn't need to worry about the build-up of damage to her body. I maintained my health, avoiding risks before body replacement. I avoided unhealthy food. I took care of my body. I performed yoga and qi gong and took herbs regularly. +16. The medicine student increased what he wanted and decreased what he didn't like. I increased my meditation automatically using a computer to help with study in the simulation. I wrote about my meditation experiences, increasing the detail. I helped and worked with people, using this as a topic in my study. I helped others by teaching them meditation. + +17. The medicine student always felt calm and took a special moment to do a task. I automatically replaced my body. I did this using a timed algorithm. It asked me if I was ready. I felt relaxed afterwards. +18. The medicine student checked he felt fine after replacing his body. I took time away from time and space travelling to replace my body. I travelled to my destination. I didn't have to do anything. I replaced my body. +19. The medicine student analysed the integration of the body systems. I replaced my body's systems. I replaced my body. This replacement included my body systems. I tested my blood pressure and heart rate. +20. The medicine student benefited from a medical kind of teleportation. I studied my body before and after time travelling and replacing my body. First, I tested my heart rate and blood pressure before the teleports. Second, I performed a kind of teleportation. Third, I tested myself after the teleports. +21. The medicine student kept her youthful appearance. 
If I meditated, I could time travel. Meditation requires training. In addition, time travel required meditation (writing). I could replace my body using technologies from 5689. +22. The medicine student's writing was about philosophy and computer science, which were difficult to transgress. I could time travel and replace my body in the future. I could see and talk with friends there. I aimed not to transgress technologies from times. I could ask about black boxes. +23. The medicine student travelled to a time to learn to replace their body. First, I learned how to time travel to a time within a short time. Next, I learned how to time travel. Then, sensing the technology was about to expire, I time travelled to a time. Finally, I set up permanent time travel ability and came home. +24. The medicine student remembered to refresh the ways of doing futuristic things: writing 4*50 As or ordering copywriting every ten years. I increased others' knowledge in my own time about how to replace one's body. I saw an old science teacher with dark hair. I saw another old primary school teacher with dark hair. I replaced my body in each time travel destination daily and set my outside appearance to have dark hair. +25. The medicine student worked until she had written 4*50 immortality As and 4*50 medicine As. I set up a reminder document about replacing my body. I ordered copywriting for points from my daily regimen. For instance, I prevented my body systems from ageing in each dimension or time travel destination. I wrote an A on immortality each week. +26. The medicine student kept on going. I helped others to replace their bodies. I explained how to use writing and computers to time travel and indicate immortality. The extra part, the immortality A per week, could be written in 12 sentences per day or fewer than three five-sentence paragraphs. In addition, I recommended they do the same thing about medicine. +27. 
The medical student aimed to resolve medical questions, including tiredness from studying with copywriting, Berocca, hand-breasoned 5 As and exercise. I could teach people to replace their bodies in schools, gyms, prisons, workplaces, retirement homes and homes. They could choose classes or read about the software privately. I loved time travelling between my house and the future each day. I wrote and programmed. +28. The medicine student wrote visual themes for their music, algorithms and arguments. I replaced my body and inspected the results. I checked the outside of my body. I did mental exercises. I tested my health. +29. The medicine student interpreted and synthesised the work each week. I replaced my body, maintaining my ability to run and live. I could walk each day. I could run regularly. I enjoyed a continuous high quality of life, with breaks and expectedness. +30. The medicine student had 250 breasonings and copywriting to help with their swallowing reflex. I could replace my body to avoid degeneration of my body and exercise. I could lift weights. I could do exercises. I could do diction exercises. +31. The medicine student lived and saw friends and was celibate. I avoided the effects of ageing with body replacement. My inaction didn't cut short my lifespan, and I didn't have to visit a nursing home. I could work and focus on my life forever. People around me were similar, used to immortality, and could visit simulations. +32. The medical student advertised their academy online. The immortal kept the instructions to be immortal permanently online. They taught prerequisite philosophy. The religion was 4*50 As. I grouped these into three for academia and three for industry. + +33. The medical student took care of customers. I updated my interests, checking available resources. I created thoughts for customers, predicting them and working out replies. I listened to the customer. I talked with them, answering their questions. +34. 
The medical student used a specific system to interpret an algorithm to interest students in algorithms. I wrote and used high distinctions. I wrote philosophical thoughts for customers. Once I had written a mind map, I experimented with generated philosophies and chose the best word for each position in the sentence. The sentence could describe an algorithm rhetorically. +35. The medical student tested the latest theory-as-algorithm. I continued with the organisation, treating actions like missions. I wrote algorithmic thoughts. I randomly generated possible algorithm parts, such as methods and simplifications, without giving up features and details. I used an algorithm mock-up, which integrated different algorithm parts in a flow chart. +36. The medical student generated more algorithms than inputs when they had more than a certain number of inputs. I tested the graph for clarity. I drew the exponential graph for the 540-degree algorithm's number of inputs versus outputs, where the 540-degree algorithm found all the a-d's in a-b, c-b and c-d pairs of words. I used the 540-degree algorithm to generate new algorithms, from which I chose more relevant results. I had found the philosophy results were uncannily relevant, so I checked the algorithm results were relevant, on-topic and inspirational. I thought of the philosophical significance of each algorithm for each of the others and wrote new algorithms. +37. The medical student restarted the system to reset a variable's values. I didn't need to repeat myself five times to compensate for the loss of clarity because the text was clear enough. In addition, I generated type statements to connect algorithms with accuracy. Then, I deleted unused variables. These variables included singletons, variables not used for verification or output. I wrote long enough texts for enough detail to earn a high distinction. +38. 
The medical student chose key details and made app mock-ups with them, for example, finding more useful or original algorithms. I wrote an algorithm that generated tables, screens and text graphics from examples. I wrote an interpreter using this algorithm. In addition, I added details to generated texts to connect their parts. With enough points, the meaning of a text could be uncovered, and it had relevance to other topics. +39. The medical student recorded the type of skills. I captured the debugging and thought data while creating an interpreter. I wrote about the latest writing. For example, I made detailed analyses and found implications and possibilities in philosophy and algorithms. I did this three times to capture the thoughts. The most detailed writing contained a lot of checking and common paths. +40. The medical student augmented their work with art and new possibilities, such as mini-accreditation or mini-business. I found the best new draft of the text. I used DevOps to hand-enter each change, keeping the best current version. On another note, I mind-read myself to see if I wanted a thought accompanying my thought, perhaps because I was tired or was resting. I found the best time, presentation, such as a suggestion about a thought, and content for the thought, not too much and memorable, where it could be read on the computer later. If I didn't want the thought, the computer still checked the thought for errors and helped with the continuation of logic at the best possible time. I also programmed the computer to analyse data as much as necessary in the background for professional requirements or augmentation reasons. +41. The medical student lifted reasons for work from forgotten to memorable by mind-reading experiences and thoughts. The pedagogy accreditation gave me the skills to present myself with up-to-date information, such as a spiritual translucent thinbook. 
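The 540-degree (BAG) pairing described above, which finds all the a-d's in a-b, c-b and c-d pairs of words, can be sketched in Python. This is a hedged illustration only; the function name and the representation of word pairs as tuples are assumptions, not the repository's implementation:

```python
def five_forty(pairs):
    # Given word pairs, derive (a, d) wherever (a, b), (c, b) and
    # (c, d) all occur: a and c share b, and c links on to d.
    pairs = set(pairs)
    out = set()
    for a, b in pairs:
        for c, d in pairs:
            if (c, b) in pairs and a != c and b != d:
                out.add((a, d))
    return out
```

For example, from the pairs sun-light, moon-light and moon-walk, the sketch derives the new pair sun-walk.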
I concentrated on mind reading myself by myself and with others, encouraging others to use the software. Later, I wrote an algorithm writing helper, which suggested and assessed the understanding of algorithms using mind reading. I used a neuronetwork to convert 540-degree output into natural language. +42. The medical student shared business philosophies with business contacts. I helped others to do what I had done in business. I created sales and high-distinction thought commands. I generated breasonings and algorithms, with details for each. I met the professional requirements for these, depending on where I had got up to in education. I created all the thoughts. I made a computer multimedia presentation with all the art related to the project, such as example data for algorithms, which were both simple and user-friendly. +43. The medical student stated that ten breasonings achieved a threshold of the integrity of reasoning. I refreshed the display of recommendations about thoughts. I mind-projected tens (ten breasonings) for incidental representations. These thoughts gave me the courage to think of what I was saying to be thought of, and I found the most relevant connections. The system recommended I keep List Prolog for a more accessible text selection by vision-impaired people and that the Lucian CI/CD algorithm confirm whether to keep comments, including tests and data file changes. I also included an option to ignore Prolog, treating it like an ordinary file and displaying it as Prolog, not List Prolog. +44. The medical student ruled out hallucinated citations. My health was as good as new when I regenerated. I generated enough, or 2*5*50, high distinctions of reasons. The imagery was the reward for a PhD level of awareness. I summarised changes in long documents. I wrote a table of contents of changes on separate pages from other times Lucian CI/CD was run. 
I wrote two (for the lecturer and student), which could be three (for the customer, manager and employee) *5 (to include prestigious imagery) *50 high distinctions, where 50 As were master's level and 4*50 As were PhD level. The combinations of generations were rapid and relevant using the Grammar Logic or 540-degree (BAG or Breasoning Algorithm Generator) algorithms. +45. The medical student focused on the second stage, the milieu and drama of thoughts. I did upper-body exercises to relax my muscles before body replacement. For most thoughts, I wrote from one to five algorithms. I captured the details from the student's work. I followed lines from a mind map from their perspective to a draft and then the work. The ideas at each point were chosen from many random generations and were assessed on grammar, ingenuity and computational correctness. I also generated details about education skills that explain the work metacognitively or clearly. +46. The medical student identified the best stretches for the race. I mentally and physically prepared each part of my body for the sprint or used DevOps as I went for code cleanliness. I removed phantom, unused loops or variables. I designed an emergency thought pack daily for monarchs, princes and princesses. The correct combination of commands inspired the royals with flower-bearing. I had enough energy to win the race. +47. The medical student thought about their visions. I regenerated my body, renewing my sense of sight. The supplemental thoughts could be used for sales or bots, defined politically on academic issues. I set up a \"do it if I don't do it\" algorithm for needed thoughts in the background. I identified customers' sales in real-time, going back in time to their origin to help them. The supplemental thoughts placed points on sliders to define the viewpoints on academic issues, for example, an agreement that mind reading is like a necessary telephone, making continuity in time travel possible. +48. 
The medical student suggested immortality as a medical option where possible. The physical state of my eyes was saved when my body was saved to replace my body with immortality. In addition to sales and bots arguments, incidental thoughts were defined politically. The political thoughts involved separating functions and thoughts, choosing the best side and indicating points on a slider about academic issues, such as continuing a previous position or reacting to new data. The worker was laissez-faire, left to their own devices to work, making checking with DevOps and a grammar checker necessary. + +Anti-Ageing Medicine Home - Education of Medicine 1 + +49. The medical student checked that the situation was still safe. I wrote the high distinctions from the same perspective. I wrote my thoughts about the law of sanctity. The arguments had certain benefits. The company was protected. The self explained the situation to the other. +50. The medical student checked the place in the image. I ran tests for a predicate with another to determine if they were the same predicate. I found predicates that satisfied the same tests. I tested whether they had the same code. I manually tested whether they were equivalent. +51. The medical student wrote an app that could run offline and online. I regenerated as the lead in my company without reproducing. I ran Prolog from State Saving Interpreter and List Prolog Interpreter for use with web apps. I sped up parts of apps so that they could run online. In addition, some commands could process the pipe (\"|\") character. +52. The medical student later imported buttons and links from the language to State Saving Interpreter (SSI). I regenerated my body and exercised. I made a mock-up Prolog web app engine, for example, for Vetusia, by running Prolog with variables saved. It was faster than SSI. I could edit and convert Prolog, which I could test in, to the language. +53. The medical student favoured expansionism for later editing. 
I survived medicine during my years of work. I worked out how to automate Lucian CI/CD. I finished the Prolog converters and made a system that runs and uploads the repository. I ran Lucian CI/CD on my old repositories, with flow-on data as tests, on one predicate at a time. +54. The medical student removed internationalisation in List Prolog for speed. I improved the State Saving Interpreter by writing research on it and testing it in video tutorials. I listed or walked through the program. I explained the program. I simplified it. +55. The medical student bought the latest laptop to focus on business. I wrote philosophy on sections of SSI. I separated work from my mind and simulated generative intelligence (SI). I presented the work I had devised in public. Any other work was attributed to a bot and was cited. +56. The medical student excelled in philosophical breasonings and their dimensions. I used visuals to illustrate SSI. I drew an SSI state machine. In addition, I drew a flowchart of the SSI algorithm. Also, I drafted algorithms possible with SSI with neuronetworks. +57. The medical student made student posts and speed and accuracy competitions in finishing bugs compulsory. I kept my mind fit by completing algorithms and questioning breasonings. I also notified myself before automated changes to programs. Students were expected to think from first principles in exams. I assessed critical thinking, memory, writing and changing (and detecting flaws in) programs. +58. The medical student wrote down and eliminated risks in their life in their journal. I expanded each central algorithm into a master's program and corresponding texts. The students developed a business idea using simulated generative intelligence. Again, As, texts, algorithms, money, and time were no object. The algorithm generated its database. +59. The medical student sounded a notification when the repository had been finished for the moment and was committed. 
I set finding algorithms such as SSI against the clock. I designed Lucian CI/CD for the web. Users could read about the latest pipelines or complete tests. They could click on the badge in the Read Me to access the corresponding latest versions of related repositories. +60. The medical student included comments in Lucian CI/CD (after acceptance of lines immediately before or after them), notifying the user of this. I completed spiritual yoga. I used switches to relax. I used a button in Lucian CI/CD to automatically accept lines with tests. In addition, I notified the user that data file changes might be unintentionally ignored where no test had tested for them. +61. The medical student turned off automated Lucian CI/CD pipelines so that saving a file wouldn't trigger a pipeline, possibly removing changes or writing a new test. I measured the usual lifespan of each body part and became immortal. In Lucian CI/CD, if no combinations' tests worked, I returned to the previous state of the repositories before any changes. I sounded a failure sound effect and notified the user of likely problems, such as missing or incorrect tests, bugs or missing files. +62. The medical student aimed to maximally expand the code, putting it through a pretty printer. I wrote the epistemology app to find evidence for conclusions. Lucian CI/CD either offered to commit or automatically committed finished files. The woman adhered to the expansionist code policy. Code with different features should be separated unless it is simple and clear to combine them. +63. The medical student expanded the code into reused modules where necessary. The person looked for evidence that they were immortal, and otherwise took action in meditation and pedagogy. The worker worked on many repositories, which was recorded. Lucian CI/CD automatically pretty-printed their work. The others rewarded their work. +64. The medical student identified and stopped failing the code online. 
I replaced legacy code, noting the need to bring back certain features. I showed the state of the pipeline on the repository's Read Me. It showed whether the current pipeline had passed or failed. Theoretically, it shouldn't indicate that it failed because the pipeline would only upload passing code. + +65. The medical student filled in an exercise when they didn't have one. I helped take care of my body with regular exercise. In Lucian CI/CD, I asked for a test when a predicate didn't have one. I found the name and arity of the predicate. I listed whether it had a test and passed it. +66. The medical student reminded the worker to undo the mistake. I projected the image to help the student. I issued the command to undo to the previous stage in Lucian CI/CD. I issued multiple undos. I issued redos. +67. The medical student generated the type statement from the specifications or tests. I chose the opportune time to regenerate. I stated that types were needed not just for all new code, but for all code. I wrote the type statement to speed the development of accurate code, including [n, name]. I included variable names in the type statement to speed the development of code. +68. The medical student repeated the course to test it. I maintained perfect function with exercise. I constantly communicated with the departments of the company. Business organised the website and payments. Education organised the teaching and assessment. +69. The medical student recorded the fortnightly statistics for his software site. I chose the best times to take actions. I recorded the downloads of List Prolog Interpreter. The software was written clearly, with legible documentation. It inspired users to examine Prolog, add modules and play with the code. +70. The medical student gave enough time from when the body had finished one task to do the next. I measured the time my body spent digesting, breathing, remembering, sensing and speaking, and worked out whether these times overlapped, to relax. 
I separated concatenated texts with a newline. When breasoning out the BAG cache, I made sure the concatenated strings were separated. This ensured that words didn't join together, causing errors when looking them up. +71. The medical student confirmed the location of the problem. I noticed the thoughts found with funnelled detailed mind reading. I ran code related to each test separately. Starting from the bottom, I found the predicates a predicate depended on. I ran tests on these predicates, reporting on bugs found within these predicates, and not testing predicates depending on buggy predicates. +72. The medical student extracted the algorithm from the neuronetwork. When finished, I could work when I wanted to, and replaced the neuronetwork after it had finished. I created a brand with A. It encompassed the aims of the project. It had a jingle and icon. +73. The medical student earned high distinctions during the day. I endorsed the brand with A. The brand had common features with another. I found the unique selling point of the first brand. I gave it A. +74. The medical student modified the seen-as images of the freezing-age BAG algorithm. I gave the customers As when they saw the reference to the brand. The mother was at home. The systems seemed professional. I wanted the product. +75. The medical student connected the customer's thoughts, words and actions with the brand. I tried different stretches at different times. In sales, I helped customers to like my brand. In addition, I breasoned out As to do with the brand. I checked whether the customer was epistemologically interested in the brand, i.e. whether they needed, wanted or wanted to research it. +76. The medical student had one seen-as image for each high distinction. I found the best settings for the imagery. The bots had political As. The people found out about the product and thought of whether they needed it more clearly with keyword-As. 
I funnelled the reason for their need, such as business, personal or education. +77. The medical student found out into the future before the end. I wrote 250 breasonings, supported with 4*50 As, with algorithms. In the diploma academy, I wrote 250 breasonings per assignment. In a higher education institution, these were supported with 4*50 As. In a PhD, all the algorithms were complete. +78. The medical student determined whether they were working and funnelled detailed mind reading for As. I exercised my mind, replacing my body to ensure my faculties were still working. Also in education, I breasoned out ten breasonings and 250 algorithm breasonings per day for a student to concentrate in computer science. These algorithm breasonings were specifically designed to help with writing the seen-as version, the A and writing the algorithm. Advanced students mind read their high distinctions using VPSs. +79. The medical student's project found algorithms with algorithms they had written, to answer questions. I timed the high-quality thoughts, the viable implications of the algorithm. In the academy, students answered multi-choice questions. These were computer science-based. For example, they were the right output of an algorithm, the right input or the right algorithm. +80. The medical student assessed themselves on their writing. I regenerated my writing by reviewing and connecting to it. I wrote an algorithm to write multi-choice questions. I found the paragraph. I detailed-mind-funnelled the key algorithmic sentences. Alternatively, I searched for algorithmic key terms. In the question, I referred to knowledge elsewhere in the essay, or applied this knowledge, using causality or processes. 
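The search for algorithmic key terms when writing multi-choice questions could be sketched minimally in Python; the helper name and the sentence-splitting heuristic are hypothetical, not the author's algorithm:

```python
def key_sentences(paragraph, keywords):
    # Split the paragraph into sentences and keep those that
    # contain any of the algorithmic key terms (case-insensitive).
    sentences = [s.strip() for s in paragraph.split(".") if s.strip()]
    return [s for s in sentences if any(k in s.lower() for k in keywords)]
```

The selected sentences could then seed question stems, with distractors drawn from the rest of the essay.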
+"] \ No newline at end of file diff --git a/Lucian-Academy/Books/PEDAGOGY/Education of Medicine 2.txt b/Lucian-Academy/Books/PEDAGOGY/Education of Medicine 2.txt new file mode 100644 index 0000000000000000000000000000000000000000..0cdce7c3cc8ab9f798fbf44268c5d0770a0d028f --- /dev/null +++ b/Lucian-Academy/Books/PEDAGOGY/Education of Medicine 2.txt @@ -0,0 +1,87 @@ +["Green, L 2024, Education of Medicine 2, Lucian Academy Press, Melbourne.","Green, L 2024",1,"Education of Medicine 2 + +1. The medicine student saved time by converting to C. I modularised out findall, if-then and reused code in the predicate. In Lucian CI/CD, I added an advanced feature that recognised a predicate from the old version with the same arity and tested it. It would automatically detect this predicate. I found renamed predicates with the same content and deleted them. +2. The medical students made incremental changes, leading them to discover immortality. I found the secret to immortality during my life. I placed a download link to the first version of Lucian CI/CD, which was compatible with programming languages other than Prolog. This version allowed seven changes to all repositories, suggesting the need to make small, incremental changes. I fixed the bug that placed extra brackets around non-Prolog-lines-of-code-as-comments. +3. The medical student simplified their medical questions with body replacement. Like my body, the repositories' state was saved to compare with changes. I optionally renamed predicates in the old version if needed. I simplified, renamed names and variables and reformatted these predicates. I simplified repositories with Lucian CI/CD. +4. The medical student avoided the adverse effects of aging. I wrote high distinctions about anti-ageing medicine. I tested all the current predicates and all the relevant tests needed to be passed. All these sets must be passed at each point to pass the overall test. 
I compared current scientific knowledge with the effects of future technology. +5. The medicine student stated that a cut followed the predicate. My state of sight was maintained. I deleted choice points from not's statements. This action prevented these statements from being returned to when they were negated. I didn't return to an idea when I had negated it. +6. The medicine student acted as a confidant, testing predicates that were unrelated to others. The society members regularly saw and spoke with each other - groups of predicates within a loop needed to be tested in any order. Loops with one item were simplified to that item. The order of loops came before depended-on items and after depending items, and loops included items that were both of these. +7. The medicine student added trialling current and new versions and integrating monitoring and logging tools to Lucian CI/CD. I increased my lifespan by cutting to the chase. I backed up old versions of repositories before Lucian CI/CD saved changes to them. In the first version of Lucian CI/CD, there was one set of seven changes to the whole set of repositories. In the second version, there were seven possible changes to each predicate. +8. The medicine student thoroughly tested and changed the algorithm. I switched to finding combinations of chunks of changed lines to individual lines in Lucian CI/CD. I retested all the repositories for each different first predicate. These were automatically detected and in the list of dependencies. Lucian CI/CD more effectively tested the whole algorithm. +9. The medicine student enabled time travel and immortality through the Text to Breasonings algorithm. I systematically tested and maintained my health. Lucian CI/CD reported the names of passing and any failing predicates. The user could find and fix bugs in failing predicates. I travelled to other times and taught how to become immortal. +10. The medicine student led and taught spiritual computer science. 
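Finding combinations of chunks of changed lines, as described for the second version of Lucian CI/CD, can be sketched roughly in Python. The names and data shapes here are assumptions, not the actual implementation; larger subsets are tried first so that the fullest passing set of changes would be preferred:

```python
from itertools import combinations

def candidate_versions(old_chunks, changes):
    # changes: list of (position, new_text) edits; yield each subset of
    # the edits applied to the old chunks, largest subsets first.
    for r in range(len(changes), -1, -1):
        for subset in combinations(range(len(changes)), r):
            version = list(old_chunks)
            for i in subset:
                pos, new = changes[i]
                version[pos] = new
            yield subset, version
```

Each candidate version would then be run against the tests, and the first passing combination kept.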
The students meditated after they exercised. Lucian CI/CD tested live software. If the software was an SSI Web Service web app, it could be run in SSI. If the software was graphics or memory-based, their representation could be tested. +11. The medicine student finished all the Lucian CI/CD tests, passing the repository. I established Lucianic Meditation centres around the world. I saved the progress of Lucian CI/CD to disk. I listed the predicates from depending to dependent, found the configuration of changes to each predicate that passed all the tests and saved them. I found out and helped the meditator's thoughts bottom up. +12. The medicine student carried the classic algorithm and books with them. I became immortal and helped meditators achieve their goals. I backed up the repositories before saving them. I ran Lucian CI/CD, which backed up the repositories and saved changes. I was methodical and started with the last point. +13. The medicine student met the required standard. I covered the high distinction about immortality. Lucian CI/CD warned if a predicate changed, but a test hadn't. If a predicate changed and had the same test, it was reset to the old version. Meditators were encouraged to write 4*50 As for immortality. +14. The medical student could see and use the software. The immortal saw high-quality imagery for their life using the Upasana sutra. Lucian CI/CD tested all repositories. It did this by testing changed repositories. I linked to the working thought. +15. The medicine student learned about the times. I lived forever and regularly exercised and socialised. Lucian CI/CD found predicate dependencies in terms of the first predicate, listed in main_file.txt in each repository. If there were multiple algorithms per repository, main_file.txt contained multiple main files and first predicates to test. I maintained and updated my software. +16. The medicine student realised vision-impaired people could program with preparation. 
I had a high distinction for having clear eyesight. I wrote a new version of Lucian CI/CD that tested predicates bottom-up. The algorithm found dependencies using a post-order depth-first search algorithm and updated the predicates as changes were integrated and tested. I could program more quickly with good eyesight. + +17. The medicine student listed the planned features and modified the version number accordingly. I wrote a web interface for Lucian CI/CD. It controlled whether Lucian CI/CD was run automatically. A web monitor window provided notifications when Lucian CI/CD ran. Lucian CI/CD recommended uploading if all the planned features had passed. +18. The medicine student kept argument(s) or the number of arguments constant in Combination Algorithm Writer (CAW). Lucian CI/CD let the user choose from possible changes. The CAW plug-in could be configured to generate code, insert code in a location or several locations, or keep one part of a line the same while varying other parts. I developed the code with CAW and inserted it into the algorithm. Or, I inserted code into multiple locations. +19. The medicine student eliminated global variables in their code. The user could specify the number of changes to find combinations for each set of predicates in Lucian CI/CD. The default was seven changes. The number of changes was specified in settings.pl. Writing shorter, modularised predicates, making fewer, shorter changes at a time and keeping the maximum number of changes to test to a minimum were encouraged. +20. The medicine student recorded parts of bug fixes in the version number to work with Lucian CI/CD. I wrote the version description for the changes in Lucian CI/CD. I entered the descriptions of each set of changes. The algorithm calculated the new version. I rendered and stored the version history with each repository. +21. 
The medicine student listed the strata of compatible version numbers of the repositories and wrote an algorithm to test whether the repositories’ versions were compatible and download the latest version. I chose whether or not to upload the files after Lucian CI/CD passed the tests. If the code and all the desired features worked, or if the build was partially bug-checked and needed to be uploaded to record the version, allow further changes and act as a backup, I uploaded the code. I recorded whether the version was stable and uploaded the final version. I tagged the build for download. +22. The medicine student used Lucian CI/CD to make changes or automatically fix bugs on a failed test by adding to the code tested so far and moving it to the repository. In Lucian CI/CD, a feature of diff HTML was updating each checkbox list after the changes. I selected the repository or main predicate to test. I checked what happened to each change. If I agreed, I kept the changes; otherwise, I clicked on the pre-changed version. +23. The medicine student looked at the index of fatal error corrections and deleted predicates and unnecessary tests in Lucian CI/CD. Lucian CI/CD created a table of contents of changes. Lucian CI/CD allowed the user to revert to a previous version or approve each change. The user could find differences between different versions. Also, they could choose from other parts of different versions. +24. The medicine student stated that the other programming language files, including globals and “include” counterparts, were processed using a converter. Lucian CI/CD asked if the user wanted to keep text and comments. The default was that only the comments in the latest version were kept. Lucian CI/CD asked whether the user wanted to delete unnecessary tests. An algorithm found the minimum data file lines from the version history. +25. 
The medicine student made a command that tested the web app in State Saving Interpreter (SSI), not State Saving Interpreter Web Service (SSIWS) form, including passing values to multiple form elements. SSIWS could display checkboxes. These form elements, numerous fields, and a submit button could be written using the write command. The output from the form could be read and processed by the algorithm. There was a command that finished the algorithm temporarily for the user to submit the form. +26. The medical student automatically created the software using the website. Lucian CI/CD outputted its results to the web. I found the results of Lucian CI/CD. I wrote a verification script for Lucian CI/CD. I ran Lucian CI/CD on Lucian CI/CD, testing its output, the output of its program finder, data to algorithm, algorithm writer, CAW, minimise DFA, optimising and neuronetwork plug-ins. +27. The medicine student created Lucian CI/CD from its specifications. Lucian CI/CD had no multiple stages, but its bottom-up algorithm automatically corrected and completed code. The Program Finder plug-in found code that mixed patterns from different lists. The Data to Algorithm plug-in found the algorithm that processed a list that was pattern-matched. The Neuronetwork plug-in was a database of past code snippets. +28. The medicine student developed predicates that performed complex tasks with simple commands, such as subterm with address. The web version of Lucian CI/CD displayed each pipeline, its result and the diff of the finished version. I eliminated duplicate predicates. I separated needed functions as predicted by an algorithm. I found the required connections in the algorithms (for example, multidimensional values being passed between predicates or findall in the compiler). +29. The medicine student archived old data. I wrote a command to start the Lucian CI/CD pipeline and analyse the log. 
The software tested and corrected shorter algorithms - an algorithm tested connected repositories’ output before uploading. Passing the tests, uploading changes to other repositories and changing the sticker to “Passing” were requirements for uploading changes to the repository. +30. The medicine student used type statements to verify that a predicate worked during testing. The Lucian CI/CD web interface gave the result, not live results. The result was “pass” or “fail”. A variant of the algorithm allowed predicates to be corrected automatically (with a plug-in) or manually before continuing through the site. A test corrector and finder looked ahead to see whether a test was at fault, changed it (or both it and the algorithm) or, critically, found a test for a missing branch of code, especially for separators, multiplying dimensions, retiring code or identifying the need for an interpreter. +31. The medicine student included labels to explain the code. To simplify testing, I functionally decomposed predicates, using an algorithm to decompose an algorithm and write tests. I used code that was compatible with C for conversion and speed. I wrote a language like Prolog with C-like loops, arrays and data structures, with the immutability (single value) and single type of Prolog variables. I wrote an algorithm that removed code that was redundant, unconnected or led nowhere. +32. The medicine student checked and uploaded the files. The Lucian CI/CD weblog was printed in reverse chronological order. The modification dates of the files were collected. They were ordered from newest to oldest. I read that the latest build had passed. + +33. The medical student wrote many details and their algorithms in business. I wrote a bytecode List Prolog language. List Prolog algorithms could be in any language, with reserved and user predicate and variable names translated. I added a bytecode language containing the initials of words as the word. I could also write the initials from any spoken language. 
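The initials-as-words bytecode in item 33 might, on one reading, be sketched as follows in Python. The example identifier is illustrative only, and a real implementation would also handle name clashes between abbreviations:

```python
def to_initials(identifier):
    """Compress a multi-word name to the initials of its words, as the
    bytecode language described above is said to do, e.g. a name like
    "string_concat" becomes "sc".  Purely an illustrative sketch."""
    return "".join(word[0] for word in identifier.split("_") if word)

# to_initials("string_concat") gives "sc"
```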
+34. The medicine student used time travel to produce results at the time. I mind-read professors ahead of time each day and wrote an algorithm for them with Combination Algorithm Writer (CAW) to solve complex problems. They specified the algorithm; if they didn't, a seen-as specification was written for them. The computer found this, produced the algorithm, and they viewed it and possibly wrote it down. Pattern-matching algorithms and neuronetworks may also be involved. +35. The medicine student collected the garbage after a predicate finished. I wrote a label for variable lists in List Prolog, \"&varlist\". This label made finding varlists easier for garbage collection. I replaced repeating values with a symbol. I garbage collected items after findall finished. +36. The medicine student checked the simulation's requirements. I found out that I needed 4*50 As for freezing age medicine and time travel each day when I asked the robot female head of state nicely if she could give me medicine and immortality As. I used the Breasoning Algorithm Generator (BAG) to write 4*50 As about my philosophy and computer science. I still had to breason out my time travel As each day. By time travelling, I survived in my home time. +37. The medicine student favoured the term meditation miracle. My home-time character stayed young and had a higher quality of life when I time travelled to the simulation after breasoning out As and breasoning out age-freezing medicine daily. I could stay in my chosen profession and teach new ideas. I avoided medical problems and remained stress-free. The missing hen reappeared after protection was indicated for her. +38. The medicine student added protection during their tours. I wrote the garbage collection algorithm in List Prolog. I scanned lists of variables. I replaced repeating values with symbols. I stored the table of symbols and values in the global variables. +39. 
The medicine student wrote a repeating value tree in terms of the symbols \"a:-[b|c].\". I deleted the level if there was only one item per level in a garbage tree. In \"a:-b.\", \"b:-c.\" and \"c:-d.\", I deleted \"b:-c.\" and replaced \"c:-d.\" with \"b:-d.\". Garbage trees arose when a symbol was in terms of one or more symbols. Trees might have one item per level if a value became empty and the tree was simplified. +40. The medicine student tried small, simple steps. At Lucian Academy, I helped the students with the model solution for each algorithm line. The highest quality thought was to help a student with a line of code. The people got up to the neuronetworks from the future. The professor had an A. +41. The medicine student found an interdisciplinary job. In political pedagogy, plainnesses joined in the fun of the mix. I found the general form of an idea. It started with an algorithm. In political pedagogy, involving replacing words in a sentence with keywords to relate it to a topic, simple, plain connecting words joined the sentences, mixed with keywords, bringing unusual nomenclature. +42. The medicine student avoided not evaluating the left-hand side of an \"is\" expression. In lucianpl, A is B evaluated both sides. For example, in \"1+1 is 0+2\", \"1+1\" and \"0+2\" both evaluated to 2. The two sides were simplified to \"2 is 2\". In Prolog, this was logically true. +43. The medical student rewrote the algorithm for quantum computers. The human bot was mindful of medical problems. Medical problems were positive questions and answers, such as arguments and algorithms. The person acted like a robot, thinking of logic. They were human, so they could find out and work out arguments and algorithms. +44. The medicine student helped the person be a self-sufficient immortal. Each meditator split the high distinctions to give the gift of life. The person was a close family member. 
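The two-sided evaluation of \"is\" described in item 42 can be sketched in Python; eval here merely stands in for an arithmetic evaluator and is not part of lucianpl:

```python
def lucian_is(lhs, rhs):
    """Sketch of lucianpl's `is` as described above: unlike standard
    Prolog, which only evaluates the right-hand side, BOTH sides are
    evaluated arithmetically before comparing, so "1+1 is 0+2"
    succeeds because both sides simplify to 2."""
    return eval(lhs) == eval(rhs)  # eval stands in for arithmetic evaluation

# lucian_is("1+1", "0+2") succeeds; lucian_is("1+1", "0+3") fails
```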
They were given high distinctions for meditation, time travel to 5689 and age-freezing medicine. They needed 4*50 As for each of these and used two sets of As for the return journey back to the future. +45. The medicine student became an institution. I supported freezing one's age with meditation politically, including its lifestyle and implications. I wrote arguments and algorithms. I wrote new additions each week. I enjoyed preparing for times and joining in with academia. I took care of long-term plans, writing many philosophy books. +46. The medicine student turned up as a volunteer because of the product As. I prepared for three sales each for meditation, medicine and time travel, with groups of three 4*50 As five to eight seconds apart. I prepared to buy the products as a customer with As. The vendor also gave me high distinctions when I purchased the products - buying products with high distinctions firmed up the product. I checked the product for professional sides. +47. The medicine student considered whether the product was needed or wanted. At least twenty, or 100% of the products, had positive thoughts associated with them. I wrote the product name. I collected customer feedback on the product, classifying it as wanted or unwanted. I sold the desired product. +48. The medicine student experienced depth from detailed mind reading. Half of the 4*50 As were detailedly mind-read. I detailedly mind-read the algorithm. I detailedly mind-read the ontology. I detailedly mind-read the connection. + +49. The medical student used all their ideas. I wrote about business to help find employees. I focused on my business direction. There were some extra businesses for information technology. I found employees who were knowledgeable in the field. +50. The medicine student made files to complete each algorithm's workflow. For example, the Cultural Translation Tool workflow found working back-translated sentences and inserted them in apps. 
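The back-translation step in the Cultural Translation Tool workflow mentioned above (item 50) might be sketched as a round-trip check. The translate function and toy lexicon below are hypothetical stand-ins for a real translator, not CTT's actual interface:

```python
def back_translation_ok(sentence, translate, target):
    """Round-trip check as CTT is described as performing: translate
    into the target language, translate back to English, and accept the
    forward translation only when the round trip reproduces the
    original sentence."""
    forward = translate(sentence, "en", target)
    back = translate(forward, target, "en")
    return back == sentence, forward

# Toy lexicon standing in for a real translation service (illustrative).
lexicon = {("en", "fr", "hello"): "bonjour", ("fr", "en", "bonjour"): "hello"}
ok, forward = back_translation_ok(
    "hello", lambda text, src, dst: lexicon[(src, dst, text)], "fr")
# ok is True because "hello" -> "bonjour" -> "hello" round-trips exactly
```

Accepting only exact round trips is the simplest policy; the text notes that CTT automatically accepts identical sentences and asks the user about the rest.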
I wrote about business to help work with employees. I found the culture for the direction. I found that the employees wanted to apply software to itself. This step had the seen-as version of the interpreter. +51. The medicine student stated that working on more extended algorithms was a bonus. I wrote about business to help develop my business. I wrote the plan with directions for my business. I wrote plans that each employee could follow. The employees were autonomous and came up with enough material. +52. The medical student recognised study as an option in schools. I asked, \"Why is the business focusing on a single product?\" Each product was focused on itself, perpetuated even though it differed from others. Its unique appeal, features and drive in nature were considered. Some products were updated, while others were studied. +53. The medical student could see, hear or communicate with others. I considered the unique position of the algorithm. I considered the users' needs and provided the algorithm to different people, such as other language speakers, people with disabilities or both. The algorithm was necessary, given its uniqueness, and must be made available or modified for use by other language speakers or kept in the query terminal for people with disabilities. The time machine immortality algorithm was evidenced by meeting and talking with people and increasing one's longevity. +54. The medicine student entered the simulation early in life, with enough knowledge to continue in the world and continue their business. The algorithm was necessitated by its features. It followed an individual's work, which enabled the algorithm. Others could reproduce the work. It required spiritual knowledge, but meeting standards could replicate this. +55. The medical student studied the algorithm in their own time. I analysed the algorithm's natural drive. It helped individuals become motivated to learn. They could continue in the world, driving up longevity. 
There was less pressure to have children, but they were better prepared for it. +56. The medicine student ran the interpreter in the interpreter. I applied the Cultural Translation Tool (CTT) to itself. It was localised to various languages with the help of a translator. CTT helped back-translate a text, providing more accurate translations and localising its interface, making it available to other language speakers. There were dictionaries of languages for CTT and each localised algorithm. +57. CTT kept all texts it was given to try with other languages. I used CTT to localise each algorithm. I used a translator to bulk-translate the dictionary. I used CTT to increase the accuracy of the translation by back-translating or checking that the translation back to English had the same meaning as the original. I kept feeding grammar-checked sheets of possible sentences to back-translate until the document was finished. +58. The medicine student kept the Prolog code private by running apps from the server. I wrote Prolog web apps with payment for algorithms using the Vetusia Engine. With GitL, a decentralised Git, I could maintain code privacy for free. The algorithms could be used for a fee. The Vetusia Engine avoided stack overhead by compressing apps to screens that the user could navigate between. +59. The medicine student kept the algorithm's interface sentences in another file. I wrote CTT as a web app. I could translate documents quickly and accurately. I could translate and localise algorithms and documentation. Users could use grammar checkers to improve the performance of translation. +60. The medicine student stated that the Vetusia Engine kept variable values on the server. These were not unnecessarily sent to the web page, possibly overloading the page. If a screen needed a value in the future, it kept a copy on the server in a file. If it was printed on the page, it was sent to the page. If it was hidden, it was replaced with a variable symbol. +61. 
The medicine student classified recursive predicates or other choice points as recursive. The Vetusia Engine ran recursion internal to a page using Just-In-Time. If a recursive algorithm was needed to print the contents of a page, it was run, even though recursion outside the page was generally disallowed by the Vetusia Engine, a simplified engine to display websites quickly. Just-In-Time referred to an interpreter that avoided choice points, in this case running recursive predicates separately from the overarching algorithm. The Vetusia Engine sometimes joined predicates displaying a single page, recognising their internal recursion. +62. The medicine student entered algorithm alterations with grammars needed to compensate for changes to data files. CTT could translate and localise algorithms and documentation. I localised an algorithm by keeping its language data in a file, possibly stored in another repository. I tested that the other languages displayed correctly and were translated adequately and appropriately in the algorithm. I back-translated documentation and perhaps sample data files. +63. The medical student summarised various grammar corrections, considering computer science. CTT users used grammar checkers to improve the performance of translation. If grammatical errors were removed, the translation was correct. Also, back-translation would give results matching the better sentence. The sheet, one of a set repeatedly fed into CTT, could be grammar-checked, and only the possible new intermediate sentences to back-translate could be used. +64. The medicine student grammar-checked documents given sentences' context. CTT translated documents quickly and accurately. When more sentences were in CTT's database, it sped up translation. CTT back-translated sentences, automatically accepting identical sentences. CTT recommended using short sentences rather than fragments to increase accuracy. + +65. 
The medicine student helped the student make the correction. Correction was taken care of by an algorithm. Accreditation was 4*50 GL high distinctions per centre per day and 6 GL words per student per thought. In addition, there was one maplist algorithm processing these thoughts per thought. Separately, the type of correction needed (substitution of a constant, conversion, operation, different term, a new predicate or new algorithm to bulk process data) was made. +66. The medicine student had medicine for a healthy ride. Correct algorithms would be necessary for flying cars. I experienced space flight through meditation, \"sitting\" in a vehicle. Perhaps the vehicle should avoid other vehicles, pedestrians, buildings, the ground and obstacles. In addition, it should allow safe boarding, disembarking and smooth flight while its occupants read their computer screens or talk. +67. The medicine student stated that time travel would require correct algorithms. I time travelled using meditation. I meditated before and after time travel with 4*50 As. I time travelled with 4*50 As. In addition, I breasoned out 4*50 As for anti-ageing medicine at each destination. +68. The medicine student flew in the automatic spaceship, which avoided dangerous space \"pot-holes\", using the simulation for safety, using clear maps of a suitable resolution. The correct algorithms would be necessary when flying in spaceships using meditation. I got into the car. It was seen as a spaceship in the future. I could quickly travel across the universe, safely and without getting lost, to planets like mine. +69. The medicine student attempted the difficult problem when they had written the prerequisite parts. I wrote a paragraph to prepare for writing more complicated code. This step helped me overcome mental blocks, broach the question using first principles, and let my visual mind break the problem into parts that I would otherwise have had difficulty thinking of. 
I could see if some of the problematic parts had been finished already by me. I wrote a plan for the code straight away. Then, I filled in parts with features of the algorithm. +70. The medicine student observed the young child conversing with and asking the robot questions. I stated that the robot mind software was available to any skill level. This skill level included school year levels, disabled students, advanced students (including those who might burn out and seem disabled), or students at various stages of Enlightenment of Prolog. I could program the robot to work out any puzzle, play a game or talk at length. I asked for its opinion on programming, including checking that I was on the best course and mind reading and recording my other thoughts. +71. The medicine student used the command to order predicates bottom-up. I combined human-like features with simple commands. I wrote a programming language that expressed complex human-like algorithms in other languages. For instance, I focused on testing individual predicates by traversing the predicates bottom-up. I could also find the type of an algorithm by traversing the predicates bottom-up. +72. The medicine student worked out the desired result base-up. I asked the computer to do what I wanted. I thought about what I wanted, not necessarily what the computer could do. I found communicating with the computer easy, that the first thing I said expressed what I wanted, and that I was in charge of refinements and changes necessary. As a meditator, breasoner and intellectual property holder, I felt right writing algorithms from first principles. +73. The medicine student wrote in Prolog, preferably C (including connections). I explored the commands as a programmer. I studied the command's input and output. I programmed the command with my algorithm. I found new features like lookahead in the parser and wrote fast C code. +74. The medicine student optimised bottom-up in case some predicate calls were simplified. 
I wrote Lucian CI/CD to focus on individual predicates. I found the dependencies of the predicates. I could optimise code with the \"find dependencies\" algorithm. For example, I deleted always-true code by processing child then parent predicates. +75. The medicine student wrote C like Prolog. I wrote Lucian CI/CD to fix individual predicates. I edited out singletons and the predicate calls changed by doing this, and kept output to the screen, an API or files, and input from the user, an API or files. I wrote separate code that replaced random number generators. I reused rather than repeated formulas. In addition, I scanned the code for unnecessary repetition of code. +76. The medicine student incorporated old code that passed tests, such as the latest programming standard of [A|B] in predicate calls. I focused on individual predicates with human-like commands. I searched for the predicate. I specified its name, data types, location, what used it, or description. The description might be a paragraph, word or keywords. +77. The medicine student asked the program to \"label predicates as dealing with variables in terms of variables or not, and if so, change to match4 (sic)\". I fixed individual predicates with human-like commands. The algorithm mind read me or looked at the data and inserted code to meet a specification. The insertion point might be in the data collection, processing, list building or reporting stages. The code matched the data format and type of type (code run by an algorithm, music or a particular algorithm), aim and relevant features. +78. The medicine student added to the old code to meet specifications. I commanded the program to \"incorporate bug fixes found in the past\". I identified and listed unused features in old versions. I found whether these changes met specifications or whether they needed more work. If the old code met the specifications, I included it. +79. The medical student found the relevant idea at the time with the computer. 
I included specifications for features that I wanted but hadn't written yet. For example, I generated specifications for lists of variables containing variables. I wrote a \"type statement\" for this list. I kept a list of algorithms that could be used to refer to, add or delete items. +80. The medicine student experimented with changing around code when it felt appropriate. I asked the computer to \"explain the change\". It perfectly encapsulated the zeal, inspiration and uses for the change I had made several years before. \"Zeal\" meant my body's experience as I thought of the influence of or found out the idea. I revisited images of mental mind maps, with local sights and sounds of the computer and smells of the house that reminded me of the algorithm. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/PEDAGOGY/Education of Meditation 1.txt b/Lucian-Academy/Books/PEDAGOGY/Education of Meditation 1.txt new file mode 100644 index 0000000000000000000000000000000000000000..d566cc67d2024b8a6883f5d8dc2a59da05d9c9f2 --- /dev/null +++ b/Lucian-Academy/Books/PEDAGOGY/Education of Meditation 1.txt @@ -0,0 +1,87 @@ +["Green, L 2023, Education of Meditation 1, Lucian Academy Press, Melbourne.","Green, L 2022",1,"Education of Meditation 1 + +1. The meditator drew a diagram of their idea. Then, given high-resolution ideas, the students generated more high-resolution ideas. Finally, the writer wrote the idea. The student agreed with the idea. Then, they applied perspectives and made new connections by reordering the words in the concept. +2. The meditator developed a specific solution in a group. The teacher customised the utterance details for the meditator. The meditator said what they were saying to be thought of. The teacher replied. The student recorded this discourse. +3. The meditator identified the key terms in the sentence, used important and straightforward language and developed (commented on) and sustained their lines of argument. 
I found the use from other uses over the hours. The student wrote the philosophical interpretation of the sentence. They wrote an idea with an algorithmic side. They used an algorithm to rejoin groups of these ideas. +4. The meditator had enough confidence to write algorithms after writing enough algorithmic philosophies, which were original combinations of their other ideas. I asked for a high-quality image. The student wrote their own algorithms for ideas to claim their ideas. The ideas were original. The algorithms were alternative (original) for them. +5. The meditator mind mapped the kinds of algorithms they could write (interpreters, grammar parsers and big ideas), thought of viable algorithms, and found out algorithms from pop music. There were 50 breasonings per second and sutra. The kind pedagogue mind read the student's notes and mind read their thoughts, helping develop their algorithms (like a plant). I needed a new algorithm that generated all the details (examples) of the argument. In addition, I supplied the threshold of 4*50 As for achievement. +6. The meditator identified key terms in ideas and converted them into \"skill\" techniques, then algorithms. I learned and taught meditation and pranayama to small classes. To examine it, I exchanged words with the same part of speech in the argument. Next, the student examined types of algorithms, their commands, and techniques. They noticed that writing these techniques (such as \"recursion\") gave neat mathematical appearances and more easily led to converting them into code. +7. The meditator started with Prolog, analysing machines for years. I gave feedback on the philosophy of meditation. I found that if the student did all the work, such as writing helper algorithms, it would better equip them for problem-solving. They could draw skill diagrams first. 
For example, \"sort\" finds the \"maximum\" value in the list using \"recursion\", \"adds\" it to a list and \"recursively\" does this with the rest of the items. +8. The meditator took the initiative by critically working out the result of each algorithm step and having confidence in overcoming errors. The body simulation was the way I felt after meditation. I used the Combination Algorithm Writer (CAW) results to find small top-down algorithms and then joined the compatible ones. First, I found out and took the initiative over the sort algorithm. Then, I did it myself because I couldn't understand the thinking behind some algorithms. +9. Writing 50 As gave the meditator the confidence to write algorithms. I mind read and synthesised breasonings. I wrote 50 As about Computational English. These sentences were more about properties or data structures to solve a problem. I had written algorithmic arguments with spiritual help before and algorithms occasionally. +10. The meditator interpreted the sentence as an ontological algorithm. I agreed with and structured the reasons thought of in meditation, utterance for utterance. I wrote down thoughts after waking up in the middle of my dream. I wrote down the thoughts from the utterances, adding to them with successive utterances. I wrote As in my MBA. I wrote more As, covering education and industry. +11. The meditator practised meditation to prevent stress. The student went into business to avoid stress. Students did not have as much stress as teachers. The company did not have as much stress as education. Completing an MBA in one's own time saved time and produced more wholesome results. +12. The meditator wrote algorithms that gave him what he needed when he didn't want it, thought ideas and meditation gifts for students. I accounted for details in writing and programming. The details contained no programming ideas. They inspired programming ideas. 
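The sort described in item 7 - find the \"maximum\" value, \"add\" it to a list and \"recursively\" process the rest - can be sketched as a recursive selection sort in Python, here in descending order for simplicity:

```python
def max_sort(items):
    """Recursive selection sort as described above: find the maximum of
    the list, add it to the result, and recurse on the remaining items
    (producing descending order)."""
    if not items:
        return []
    biggest = max(items)
    rest = list(items)
    rest.remove(biggest)  # remove one occurrence of the maximum
    return [biggest] + max_sort(rest)

# e.g. max_sort([3, 1, 2]) returns [3, 2, 1]
```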
Details helped examine the work, finding positive pathways and moving more quickly. +13. The meditator excelled in the institution and industry. My teacher and I meditated together. I used enough utterances to contain my teacher's thoughts. I came to my thoughts. I compared them with my teacher's words. +14. The meditator could achieve immortality with breasonings. I read the most critical literature about what to point the quantum box at. The mortal breasoned out breasonings for high distinctions, time travel and preventing headaches. In addition, the mortal could mind-read and time travel. Finally, the mortal could access certain technologies in the future. +15. The meditator could have non-invasive operations for copywriting. I cautiously researched what future operations I could have with copywriting in immortality. The immortal removed the knee discomfort. The immortal removed an infection. In addition, he removed the rasp in his voice. +16. The meditator compiled in C for speed. First, I processed strings, not character codes, with grammars, then converted them to bitcodes for C. Next, I converted the file to string codes and a list of characters. First, I wrote the grammar to process the characters. Then I converted the grammar to process character codes as binary numbers. + +17. The meditator explained what lists were in terms of. I explained the utterances in terms of breasonings. I expanded the bits outputted by the algorithm to strings. I had listed the strings corresponding to the bits and bytes. I avoided confusing terms. I wrote the strings to output. +18. The meditator jumped to the place in the code where the change possibly was to check if Lucian CI/CD had kept it and whether the tests were adequate, and I wanted the result. In other words, the self meditated on the other, the image of themselves in their mind. I wrote the lesson explaining Lucian CI/CD, an algorithm testing tool. They self-tested the other's algorithm. 
In the class, I explained the input to the algorithm, the repository algorithm, the tests and the information about the modification date of the files and the content of the previous files. I explained the output of the algorithm, the finished repository algorithm, and the diff HTML file describing the changes chosen. +19. The meditator explained the meditation technique to the class and later took group meditation. I explained making changes to files in Lucian CI/CD. The class listened to the lecture. They made notes before class about modifying the code by trying to write it themselves. The assignment was an essay about the algorithm. +20. The meditator added corrections to the code. They kept a weblog about their thoughts. I explained making additions to code in Lucian CI/CD. I read the entry. I wrote, for example, foldr string_concat, string_codes or string_strings between the lines. +21. The meditator described God in literature. I explained making deletions to files in Lucian CI/CD. I read the file. I deleted extraneous, incorrect, buggy code or needlessly included data or comments. If the deletions were to data files or comments, I made them manually because they weren't detectable with tests. +22. The meditator used Lucian CI/CD with short, tricky algorithms. I entered the meditation class and brought back knowledge of other religions and academic departments. I edited Prolog source files in Lucian CI/CD. Prolog made it easy to visualise and generate different algorithms and could be applied to problems in other programming languages. Lucian CI/CD broke Prolog into lines and pretty-printed the finished repository. +23. The meditator kept enough breasonings from meditation. I edited C files in Lucian CI/CD. Lucian CI/CD broke non-Prolog files into lines, following the line breaks in the file. It then found combinations of groups of adjacent lines that formed additions, deletions or changes in the C code.
From a combination of these, Lucian CI/CD built repositories connected with modified files and tested them, keeping the most specific combination that passed all tests. +24. The meditator set the algorithmic exercise for the meditation students. I edited the text file in Lucian CI/CD. Like the comments, Lucian CI/CD couldn't test text files directly, so their changes needed to be confirmed manually. Sometimes text files contained data the algorithm relied on, so tests could keep them. The user could make seven changes at a time, so the user needed to test subsets of algorithms and their data simultaneously. +25. The meditator aimed to perform meditation each day. I wrote the test in Lucian CI/CD. I placed the test in the main file to counter problems with loading the predicates necessary for the test. This file would load, and the test would try the predicates at the same level or lower. In addition, the user could test other repositories in the same Lucian CI/CD session, but their files loaded at a different time. +26. The meditator edited the predicate if it returned the wrong result or edited the test if it was wrong. I detailed the utterance with breasonings. I wrote the Prolog test of the predicate without variables in the head, for example, \"there_is_fresh_food.\" This predicate had either a true or false result. Lucian CI/CD could test it with the query or its negation. This result needed to coincide with the result of the fact or variable-less predicate in the algorithm. +27. The meditator formed a group of meditators, the standard for strength in the country. I entered the input-only test in Lucian CI/CD. This test was different from a test with multiple inputs and one output. If the verification of the passed data succeeded, the test passed, and vice versa. The user needed to update tests or code with bug fixes, features and API changes made to algorithms. +28. The meditator checked that two values were safe.
The simulation protected them from accidents, and they took a vitamin drink for energy to work. I wrote the multiple-output test. This test had multiple values tested for. Meditation provided help at the start to become immortal and invincible. +29. The meditator covered the events to check them. I recommended meditation to the pedagogue. I undid the changes made by Lucian CI/CD. If, on inspection, the finished files were missing needed code or the tests had produced unwanted results, I undid the changes. The undo command swapped the old code back into the repository. This command also required the modification dates and previous files to be reset. +30. The meditator made the necessary changes and kept them. The teacher gave the meditator enough breasonings per utterance. I reset or observed the changes by Lucian CI/CD. The reset command differed from the undo command; it updated the modification dates of these files to keep them. This command kept text file changes or comments unadjacent to the changed code. +31. The meditator kept the safe version. The meditation student repeated the utterances with their teacher. I used the backup of the files. Lucian CI/CD backed these files up when the user used it. I restored the backup if I dropped a feature or wanted to retrieve an earlier state. +32. The meditator tried changing one repository, with up to seven changes at a time. The meditator synthesised their knowledge in a hierarchy and cited sources. I backed up the record of the old modification dates of the files. I reset the modification dates or modified a file to trigger testing that file. The main files containing tests needed to be changed to test a repository. + +33. The meditator could bring back Lucian CI/CD backups from a particular time. I facilitated critical analysis in the mantra and sutra subjects. I archived the Lucian CI/CD backups. I went back to the backed-up files, tests, dates or logs.
In SSI, I switched types, internationalisation, or other commands on or off in the interpreter. +34. The meditator labelled and checked code; for example, variable lists were labelled. I attributed breasonings to utterances. Enough was set up at the institution. I operated on individual files, tests and dates in Lucian CI/CD; for example, I corrected the code. I replaced predicate and variable names with shorter names, bytecode, bitcode or numbers. +35. The meditator read the report about the day's thoughts. The royal had an A per day for meditation. I checked that the diff changes in the HTML file had the correct results. The predicates were purely separated, to eliminate if-then statements. The code was the simplest possible for the job, even if other code was similar. +36. The meditator was surprised that the downloader thought the sim card was called the simulation (it was separate, and the sim card was not needed), that text to breasonings was hard to use because it breasoned out breasonings too quickly (it could give breasonings to a seen-as object that we can see), that the headache medicine for text to breasonings took too long (it only takes seconds) and that the education training for text to breasonings was difficult (all it takes is enrolling in and finishing a free education MOOC online). The meditation course led to higher open-source longevity software downloads. If a change to a file by Lucian CI/CD was unwanted, I removed a test. Or, I reverted to the correct test. Then I changed the files back to their original state and reran Lucian CI/CD to keep the part of the file the same. +37. The meditator did the algorithm lines using the BAG algorithm, changed with key terms. I featured continental, or computational, philosophy in my course. If the lack of a change by Lucian CI/CD was unwanted, I wrote the test. Or, I modified the test to cause a change.
Where the mind answered questions, I wrote algorithm commands and connections until the idea was finished. +38. The meditator used all their computing power to finish the work. I wrote 4*50 As for the sentence in meditation using BAG. If a change made by Lucian CI/CD was different from that expected, I changed the test. For example, if a program let A=1+1=2 and the test stated that the answer was supposed to be 3, then I changed the test or the code. +39. The meditator mind read the student to find out what they thought about the essay topic. I set the meditation essay, followed by discussion about it with the class. I mind read the bot, giving and comparing with As. I compared the output with the expected output, making improvements if they were suggested. I reached 100% accuracy. +40. The immortal could control the content of their immortality. I created movies and software from meditations, which I automatically uploaded and clicked on. I wrote three levels of details by mind reading the bot. After mind reading one level, I asked them what they thought of the previous level. The true heaven was self-based education. +41. The immortal took care of thoughts at all times, in the right ways. Breasonings were the lines, and aphors were the algorithms. I invested in the information technology and accreditation of the company. The information technology processed the algorithms. The accreditation fostered the philosophy assignments, which the company also took care of. +42. The immortal checked the jobs and progress and opened or closed mind reading. I uttered 108 utterances of the type twice per day, giving breasonings over the hours. I determined the computer science assignment. I simulated the student on computer, with their own databases of knowledge, to write algorithms and arguments. These databases contained predicates, formed from their understanding of lectures, to help them write algorithms.
These algorithms included an algorithm explainer algorithm (using labels in the data and fields containing the method, improvements, a simple formula and other improvements) and possibly algorithm writers. +43. The immortal could move on in life. I prepared for meetings before they happened. I bought copywriting for sales. Instead, I wrote political 4*50 As on computer. I found out the thought, and commented on it, developing my response. +44. The immortal programmed Prolog websites, with good user interfaces. The teacher promoted world peace with the anti-ageing algorithm. I did other academic departments up with business. There was a full degree in world peace, based on the medicine departments (pedagogy, medicine, meditation, time travel and mind reading). I investigated ways of earning money with meditation. +45. The immortal imagined the size of the class. Breasonings were necessary in the right quantity. I bought a third societal Virtual Private Server (VPS). Unlike the second VPS, which was for sales, the third was for academic comments. They were upgraded to mind read short, relevant excerpts at the start. +46. The immortal finished the academic leader's work. The academic leader gave the same number of breasonings and algorithms that the student thought of back to them, awarding them this grade. The student helped other students, increasing their academic acumen. Everyone did their work. It was a uni-institution. +47. The immortal realised cultural references may change with time. The immortal set up new institution languages. The third VPS helped with parts of life with meditation. The other-language speaker enrolled in the course. They enjoyed academia, and recorded their thoughts. +48. The immortal stored their journal in a beautiful book. The leaf had different coloured strips, reminding me of a record with time. The bots, or people were paying students. They recorded their enrolment history. 
They recorded their knowledge changes with time and their grades. + +49. The meditator accompanied As with extensions of As. I wrote six sets of 4*50s in business, including company, product and employee. The first three were sales, human resources and business. The first employee I prepared for was me. The first product I prepared for was meditation. +50. The meditator stated ten items per branching point in the philosophy. I breasoned out the daily philosophy from the neuronetwork. It simulated primary school. It was creative. I said more exciting thoughts, which were what it was gauged by. +51. The meditator could control choice points and memory. State Saving Interpreter (SSI) had various optimisations. If-then cut non-links to other predicates and choice points outside findall. If-then favoured and didn’t cut search algorithms. I traced intermediate predicates with a query. +52. The meditator also cleaned out old data from memory. I used the term garbage collection for referring to standard terms with single symbols. Garbage collection saved stack memory. The most straightforward algorithm replaced repeating term values with symbols stored as globals. These values were cleaned when not used. +53. The meditator converted the cutless code to C. The cut command cut the numbered calls to further clauses of the predicate. Each clause of a predicate had a number. Calls to additional clauses of the predicate containing cut were deleted on the cut. I transformed possible clause calls into choice points. +54. The meditator cleared away unused parts of the algorithm. I made clause numbers choice points. When a predicate was called, the interpreter added clause number choice points to the stack. In C, only a single function was called. The “clean as you go” interpreter only stored current and necessary data. +55. The meditator explored choice points inside not, where C would need another construction. I stated that exiting the “not” command deleted choice points.
On “not”, any choice points inside the not statement were deleted. I simulated this with a cut. In C, there were no choice points. +56. The meditator developed further-editing-needed-closed automated coding tools. I made the Vetusia engine for the web by running C with variables saved. I ran C online using CGI (Common Gateway Interface). C was native to all computers and operating systems and was the fastest language. I favoured a text interface with the advantages of desktop editing, such as fast scrolling, find, and easy bulk-copy and paste. +57. The meditator could access the Text to Breasonings dictionary and the paraphrasing thesaurus anywhere in the world from their computer. Vetusia Engine could run other web apps in C. It was faster for searching, processing and neural networks. In user-friendly terms, the primary use of Lucian CI/CD was to test the overall output of repositories, checking one change hadn’t wrecked the result in another repository. I predicted substantial time savings when running Text to Breasonings, paraphrasing and the GL detailer in C. +58. The meditator linked from the log to each HTML file showing the changes. In the Lucian CI/CD weblog of activity, I listed a summary of differences in HTML form and a link to this HTML file in the log. The changes to each set of predicates tested were highlighted. Users could refer to it to check which changes were kept. There was a separate file for each set of predicates. +59. The meditator wrote their text and breasoned out other texts for understanding. The Lucian CI/CD log had a Table of Contents with file names at the start. I checked whether my changes had been accepted or rejected. If the test had failed overall, I copied and operated on the current state of the repositories in the test folder. I then copied the correct code of the changed predicate(s) to the main folder and reran the tests. +60. The meditator quickly accessed the relevant line number in the algorithm. 
I included line numbers in the Lucian CI/CD differences HTML files. Alternatively, I continued to label the file names. I used a container to eliminate cross-predicate contamination and for speed. I ensured Lucian CI/CD results were in the correct order by finding the differences. +61. The meditator stated that the Combination Algorithm Writer (CAW) could write new lines of code in Lucian CI/CD. I added the insert and delete changes to the possible changes to ensure Lucian CI/CD results were ordered correctly. All the results, including numbered predicates and the changed predicates, were saved to the repository. The Lucian CI/CD algorithm assumed that changed predicates would be entered in the correct order and only found combinations of changes to predicates in this order. A CAW plug-in to Lucian CI/CD could experiment with reordering predicates and lines. +62. The meditator pretty-printed Prolog and List Prolog in the Lucian CI/CD log. If a file’s contents were not recognised as Prolog or List Prolog code in Lucian CI/CD, I removed the comments wrapping its lines. I could pretty-print data files. I could read text files without distraction. Also, I could format List Prolog as Prolog and vice-versa. +63. The meditator archived old documents and recommended storing them after a time. I printed how long the Lucian CI/CD log was at the start. I kept it short. I kept to text and descriptive labels. I included a brief synopsis summarising the changes. +64. The meditator included live CI/CD testing and CAW-autocomplete as a plug-in. I included a table of contents at the start of diff files. I wrote a Prolog web server algorithm that displayed recent logs and diff files. I included whether tests passed or failed. I could include whether repositories had been uploaded and additional information, such as version number. + +65. The meditator used unipotent data structure navigators, which could convert code between data structures. 
Instead of using random texts in the Breasoning Algorithm Writer (BAG) to connect multiple parts of sentences, I started from the start. I found the best parts of different texts to join together. I wrote algorithms with functional calls to join any parts of them together. I described these algorithms in arguments. +66. The meditator examined all new angles in the seen-as version. I tested where the last line (for example, after the 6000th line) of BAG was. It was at least halfway through the philosophy. There was enough philosophy to cover BAG. I felt confident using BAG with this much algorithmic and argumentary (sic) text. +67. The meditator finished the task using all available resources. I wrote 5% more texts than needed. If I needed 10,000 sentences, I wrote 10,500 sentences. This surplus helped cover the requirements of BAG. I chose 6000 sentences in BAG because this covered most of the philosophy. +68. The meditator found the corrupted part of the dictionary and notified the user. I repaired the Text to Breasonings and Text to Algorithmic Breasonings dictionaries. I vetted them for illegal characters, only accepting legal characters. I found missing commas or dictionary delimiters and repaired them. I deleted illegal characters. +69. The meditator automatically fixed the corrupted dictionary. I removed non-alphabetic characters from the Text to Breasonings dictionary. I only outputted alpha characters in words. I only accepted alpha characters in the input. I scanned and deleted non-alphabetic characters in the dictionary. +70. The meditator detailed intricate objects. I breasoned out the BAG output. The mind-read content approximated what I wanted to state. In breasoning it out, I conveyed understanding it spiritually. I captured fascinating details of computational designs with it. +71. The meditator applied file processing, the art of generativity and demanded algorithms to algorithms. 
I wrote algorithms for each word in the BAG output with the help of Text to Algorithm Breasonings. This algorithm contributed to understanding the machinations of the text. The emergent understanding was of possible new algorithms. Given the feeling of a sturdy foundation, I had the confidence to write new algorithms. +72. The meditator explored their mind. I covered BAG as part of my daily regimen. I minimised the necessary parts and maximised the creative and experiential parts: the custodians and I needed to embrace BAG, meditation and breasonings. I wrote innovative tracts in keeping with positive function. +73. The meditator ensured that the most recent texts were always included. BAG processed the most recent algorithms and arguments first. It didn’t have duplicates of them. The texts’ modification dates were scanned, and ones within a week were used first. Other texts, up to a limit, were added afterwards. +74. The meditator emphasised BAG for a clean-smelling body in immortality. I stated that BAG was multithreaded. Watching it run gave me the feeling of watching a waterfall because of its speed. I could finish one task and move on to the next. When the bulk of the work had ended and it had finished, I could propel the day forwards. +75. The meditator wrote the two-line version as an exercise. I wrote multithreaded Text to Breasonings. This algorithm was necessary for BAG to ensure no lags while it completed its task. I checked which systems were compatible with multithreading. I changed from maplist to concurrent maplist. +76. The meditator simplified concurrent to work with the maplist command. I wrote multithreaded Text to Algorithm Breasonings. Concurrent Text to Breasonings and Concurrent Text to Algorithm Breasonings both had installers for different numbers of CPUs. To ensure multithreading worked, I tested whether only the top-level predicate needed duplicating. 
This state was true (multithreading couldn’t run two instances of the same predicate simultaneously). +77. The meditator stated that the mind-reading algorithm recorded the highlights. I test-ran BAG on the Virtual Private Server (VPS) for 24 hours. I ran it in the background after I signalled it could start. I didn’t need to worry about losing my connection. It went until it had finished or until the end of the day, sometimes just for highlights. +78. The meditator stated that single versions of repositories could be used when stable. I uploaded non-critical repositories each day for BAG to scan. If the repositories were critical and needed to run stable versions, I kept them separate. I kept stable and BAG variants. I could use Lucian CI/CD to scan for stability. +79. The meditator had secretly covered the ideas but wanted to explore them themselves. I emailed BAG’s breasoning count to myself at 9 AM. There was a running graph. I read the mind-reading highlights, including me having conversations, and prompted more exciting possibilities. I read my progress history and programmed the computer to project questions about unknown ideas at me. +80. The meditator worked on small fronts of the philosophy. I could restart the BAG algorithm by sending an instruction using a form. Within minutes, I had amassed much data about my thoughts. I always made sure that I was thinking of something new. I agreed with the mathematical finding that an idea triggered endless new ideas to no end. 
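The switch from maplist to concurrent maplist described in 75 and 76 might look like the following minimal SWI-Prolog sketch. Here breason_word/2 is a hypothetical stand-in for the per-word work of Text to Breasonings, not the actual predicate; only the change of list-processing call is taken from the text.

```prolog
% A minimal sketch, assuming SWI-Prolog, of changing maplist/3 to
% concurrent_maplist/3 (items 75-76). breason_word/2 is illustrative.
:- use_module(library(thread)).

% Hypothetical per-word work, standing in for Text to Breasonings.
breason_word(Word, breasoned(Word)).

% Single-threaded version.
breason_all(Words, Out) :-
    maplist(breason_word, Words, Out).

% Multithreaded version: concurrent_maplist/3 runs the goals in
% parallel threads while preserving the order of the output list.
breason_all_mt(Words, Out) :-
    concurrent_maplist(breason_word, Words, Out).
```

Because concurrent_maplist/3 keeps the output in input order, the two versions are interchangeable for order-sensitive callers; only the wall-clock time differs.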
+"] \ No newline at end of file diff --git a/Lucian-Academy/Books/PEDAGOGY/Education of Meditation 2.txt b/Lucian-Academy/Books/PEDAGOGY/Education of Meditation 2.txt new file mode 100644 index 0000000000000000000000000000000000000000..7a26ffc77f9fc6db013b1f3d022efa6408ab5b1c --- /dev/null +++ b/Lucian-Academy/Books/PEDAGOGY/Education of Meditation 2.txt @@ -0,0 +1,19 @@ +["Green, L 2024, Education of Meditation 2, Lucian Academy Press, Melbourne.","Green, L 2024",1,"Education of Meditation 2 + +1. The meditator brought their money to ancient Rome, much to the delight of restaurateurs. I stated that safety laws were a high priority. The robots, cars and time travel algorithms were ratified to be safe. The robot strolled in the right place and was dignified. Like the car algorithm, I walked, looking where I was going and not becoming distracted by others. I time travelled to safe times and places, out of sight and (as if it was like a play) acted, talked and dressed appropriately. +2. The meditator prevented the robot from changing its algorithm. I specified the robot's algorithmic requirements. Only enough memory was allocated to each algorithm. If more memory was needed, it was given. The advanced robot mind read people it would meet, preparing in detail the responses to their philosophies. +3. The meditator stated that the robot's mind must be wiped to match people. The robot lasted a month. It was different from us. It covered its response to our ideas, actions and what we said. Its responses matched us but were more detailed. +4. The meditator moved away from conflict and enjoyed a high quality of life. One of the robots seemed to lack confidence. The non-head-of-state robots were not as human. The following head of state didn't support visitors to the simulation. The boy was encouraged to join a simulation in their time, the future, to be immortal. +5. The meditator simulated friends, doctors, fellow workers, customers and company executives. 
I brought the robot's mind with me on a journey. The robot had a face, thoughts, emotions and words. It could cope without actions on my smartphone. I simulated my mother asking how I was, telling me to work less, stay healthy, and not to disrupt or be opinionated about others. +6. The meditator programmed their friend to avoid paying. The meditator simulated friends. The friend regularly messaged the person, caring for and thinking about their progress. Their unique facial expressions, mistakes and disruptions were the sign of a true friend and never ceased to amaze and inspire wonder in the person. The friend app was continuous with reality, and the robot could meet the person, go for walks, play sports and visit architectural sites of interest. +7. The meditator opted into the check and sought comments, advice and compliments about their work. I simulated doctors. This doctor was not a real doctor, just a psychotherapist. They asked how the person was feeling, tried appropriate food to help with the person's goals and suggested times to take a break or sleep. The doctor kept a log of their conversations with the person, checking on their meditation and pedagogy progress. +8. The meditator skipped (deleted) calling predicates if the specification had already been met. I simulated fellow workers. The other workers came to the person or could be asked questions when the person was stuck. Fixing the computer wasn't necessarily a single command, but a listed sequence of commands to configure the environment could be replicated, and keeping backups helped return to normal function. The scaffold of the algorithm tricked Lucian CI/CD and Combination Algorithm Writer (CAW) into writing code matching the specification without too much complexity. +9. The meditator meditated to maintain profits and paid scripted customers to keep workers on their toes. I simulated customers. The customer was an algorithm or a robot, like a mystery customer.
They inquired about products, asked questions about answers, and tried products. I used the customer software to test features' user-friendliness but not to falsify sales, avoiding worrying about money. +10. The meditator observed the teacher collect up to six 4*50 As over their career. I simulated company executives. It was like before, when writing computational philosophy was the company's point, but it supported my goal of having an academy by supporting students and keeping abreast of the times by reinterpreting the texts. The board set strategies to expand and open or close centres. The managers used this strategy, hiring and setting out work for programmers and teachers. +11. The meditator-immortal opened a school for advanced students. I avoided breaking laws. I aimed for immortality. I went on holiday regularly. I eventually automated the academy and my job, donating money to philanthropic organisations. +12. The meditator recognised the challenge was in the formula, for example, finding all formula combinations. I thought of the ramifications of the robot's programming language. The robot was easy to program by querying a decision tree of algorithms indexed by data types. These types included a type \"including\", which allowed finding simple data types and substituting more complex ones into them. Functional calls could call any intermediate predicate, building on child predicates with particular types. +13. The meditator eventually replaced reused code with a predicate name. I recognised that the most complex formula could be represented using maplist (or a list-processing predicate), so I expressed formulas using maplist. I converted them to recursive predicates and then to C. I could also represent the predicate call as a specification of its input and output. I checked the generated code worked. Sometimes, more data was needed to clarify the algorithm type required and sort out non-monotonicities (exceptions). +14.
The meditator explored all relevant topics using mind reading. I interviewed myself about programming the robot to find positive topics. The mind-reading algorithm followed clues to uncover my hidden specifications on current features. It discovered these for any worker, making them valuable. The meditators were incredibly open to mind reading. +15. The meditator looked at missing robot thought types in previous models. I interviewed the simulated future robot to find positive topics. As I programmed the robot, data about completing the code was displayed, with the output, including its resulting thoughts, speech and actions. The desired results were kept, and other parts were edited out. The robot \"formed itself.\" I completed it, meeting my mission, and was proud of the result. +16. The meditator modified predicates with a matching input type but a different output type, with \"type flow analysis\" and inserting needed code. Testing was taken care of by an algorithm. CAW merged and diverged predicates horizontally and vertically into one(s) with the required function. I reused code to save time. In addition, I modified the code to have a needed type in the middle, with a desired output type. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/PEDAGOGY/Education on Pedagogy 1.txt b/Lucian-Academy/Books/PEDAGOGY/Education on Pedagogy 1.txt new file mode 100644 index 0000000000000000000000000000000000000000..e463d8e915f05af4e07ca18dd5f771829564fd2e --- /dev/null +++ b/Lucian-Academy/Books/PEDAGOGY/Education on Pedagogy 1.txt @@ -0,0 +1,87 @@ +["Green, L 2024, Education on Pedagogy 1, Lucian Academy Press, Melbourne.","Green, L 2024",1,"Education on Pedagogy 1 + +1. The pedagogue worked bottom-up to find new algorithms. First, I considered whether the formula reflected the point of the algorithm. Then, I wrote the significance of the algorithm. Next, I wrote the formula for this. I completed the algorithm. +2.
The pedagogue found the uses for uses and their algorithms. I connected the start of the algorithm to its formula. I found the formula. I found its use. I found more of these top-down. +3. The pedagogue matched nonmonotonicities (differences) between the object and the philosophy to parts of each other. The objects rhetorically represented the philosophy. I checked the philosophy against the objects. I checked the object. I studied philosophy. +4. The pedagogue checked that the module's output was valid. I tested that the member of the list was valid. I found whether the algorithm could accept plug-ins. I checked the documentation. I checked whether the code had functional calls and could \"reach\" using intermediate predicates to other predicates. +5. The pedagogue checked the data's validity compared with the program. First, I queried the property of the data. The data was the input and output of a predicate. Next, I checked that the data didn't have an operator-expected error. Then, I checked that the individual items were correct and present. +6. The pedagogue tried a smaller set of data first. I deleted an item from the list of logical terms. As a result, I minimised the deterministic finite automaton (DFA). Separately, I solved a bottleneck bug. In addition, I wrote a command that deleted specific previous choice points. +7. I regularly mind-read my thoughts to test for logical accuracy. I checked that the list member was a member of another list. I stored the spiritual and immortal items. I checked that the item was a spiritual item. I checked that the people could use the text to breasonings algorithm. +8. The pedagogue tested all functions present. I tested whether the property had a specific property. The property was correctness. The sub-property was necessity. I tested each function of the algorithm. +9. The pedagogue found the skills each student had. First, I turned the overlapping regions into a single area. 
Next, I listed the possible overlapping parts of each shape. Then, I attempted to fit the shapes together. Finally, I cut off the overlapping regions.
+10. The pedagogue read the research first. I used the idea with the necessary properties and one unusual feature. I wrote the argument. I found a new angle. I compared this with the texts.
+11. The pedagogue tested each combination of changes in the hierarchy of characters and lines in a predicate and predicates. I prepared the stages before running the process. I completed all the programming first. Next, I finished all the debugging. Then I ran the program.
+12. The pedagogue inserted a string into a hole, creating a knot. I determined that it was true that the object contained a hole. I compressed the shape. If two paths met each other and tightening the loop crossed a void, there was a hole. Tightening this loop should produce a single point if there is no hole.
+13. The pedagogue checked the endpoints of the predicate's features and the data against them. I read the function of the item at the top of the page. I found the base case in the loop. I traced the algorithm backwards, testing each combination of changes affecting the data processed by parts of the predicates and the predicates on the way. I found the data processed by the parts of the predicates by tracing the data through the predicate, testing character and line changes, possibly tracing through base cases of loops in the predicate.
+14. The pedagogue asked whether the IQ score allowed for making things up (CQ?). I used two subjects to test communication about truth. First, I found the mathematical definitions of the predicate features. For example, I found the way to process a string with a series of append commands perfectly. Second, I worked out how to capture a result in the search net.
+15. The pedagogue multiplied the length examined by a factor of 10 to check it. I found truth in the short essence. How intelligent was it? 
I found the number one, the 4*50, the copywriting for the algorithm. I judged and synthesised the discoveries. +16. The pedagogue used the framework to check the ideas. I memorised the main point and worked out its two uses. First, I believed the memories were the neurons (the synapses). I naturally thought of the significance of the thought. I used thoughts as peg holes to remember ideas. + +17. The pedagogue wrote a parser in C. I connected the two uses; for example, I read what I had read to verify it. I replaced the neuronetwork with algorithms with rules for processing algorithms and language. I wrote an algorithm that converted text to grammar. I did this by altering the \"append tree\" to a grammar. +18. The pedagogue customised the maze explorer, 3D rendered maze explorer, text adventure, word game or reaction timer game. I read the first line of the page. I wrote the program finder that recognised a pattern. The algorithm found the input and output of the game. The result included game screens. +19. The pedagogue used a grammar finder for the programming language's grammar, conversion and finding properties, such as the state machine. I thought of the arguments for the predicate. I customised code for the algorithm. For example, I worked out the grammatical syntax of the programming language. Then I interpreted an algorithm in the programming language. +20. The pedagogue named and used intermediate predicates. I kept a record of names the predicate could have. I wrote predicates that it used. I reduced the predicate according to logical structure and 1-n variable names. I used or customised the best names for the predicate, the predicates that it called and its variables. +21. The pedagogue found a bug more quickly when the layout and logic were more straightforward. I laid out the formula with spaces on either side of the parentheses. I wrote an algorithm that parsed natural language and returned answers to questions about debugging. 
I wrote that there was a rule that required a variable in the call to be in the head of the predicate that made the call. Debugging stipulated that the variable type should be the same as that in the head.
+22. The pedagogue parsed strings as lists of strings. I used grammars to parse natural language. I found tight grammars that verified the meaning of a sentence. Or, I found loose grammars that helped me understand the sentence. I used sentences to describe algorithms, reasons for methods, why the technique was better than others, and uses and steps in debugging.
+23. The pedagogue found atom/string or parenthesised atom mismatch errors. I described the algorithm. I explained the algorithm's aim, limitations, future features, methods and debugging history. These were in clear sentences that a person or an algorithm could analyse and use to solve other problems. For example, the program might work out that a menu feature is appropriate for another algorithm with a sequence of multi-choice options.
+24. The pedagogue stored screens, 3D voxel fields and records of memory in item-numbered lists. The algorithm differentiated people from algorithms, automatically helping a person with their enrollment. Later, I wrote a bug-checking algorithm. I kept on simplifying an algorithm to see if it was necessary and reduced it to another algorithm. I simplified code to uncover possible bugs or mistake-prone code, so I split long predicates into shorter ones and passed around a variable containing other variables.
+25. The pedagogue used grammars with words as tightly specified and numbers as loosely specified. I labelled or waited for the algorithm to confirm that it understood the reason for a decision, such as testing whether ordered lists were the same. I wrote sentences describing bug checking to code. The algorithm confirmed whether it could identify the bug. Otherwise, I expressed my plan of attack to see if it understood, and it transformed the instructions into code. 
For example, I reviewed an algorithm's aim and chose the most straightforward method, such as a tight grammar-based translator. +26. The pedagogue used As and mind reading to generate possible logic and jingles. I used an extensive dictionary to generate possible brand names using mind reading. I used grammar and logic to create creative sentences. For example, I solved problems, such as processing lists of lists of variables, simultaneously in the interpreter. The lists of lists of variables were labelled to help achieve this. +27. The pedagogue converted a predicate to a sentence and back to check it. I generated all possible lists of answers a predicate could give. I wrote these in a format that contained repeats, recursion and base cases. I converted a sentence to logic. For example, I found the logical structure of the sentence and the algorithms of various complexities it represented. +28. The pedagogue taught the child robot from the ground up. I applied the simplification of predicates, variable names and data to numbers. I broke down complex predicates and converted them to C, removing unused blocks for speed. I applied one idea to another, generating a large body of knowledge about an idea. Sentences describing the sentences were required to simulate understanding. +29. The pedagogue was always different. The robot child learns differently as female or male; for example, its genes attract it to its mother or not. It dealt with multiple possible logical interpretations of a sentence. The robots' political beliefs or the stars influenced these. One possible algorithm may use Prolog findall commands, and another may use C-style iteration. +30. The pedagogue found empirical evidence for the need for algorithms, such as helpers or robots, for a business that performed routine duties. I moved away from the danger of short excerpts. I dealt with complications and choosing options when writing programming code. 
When given a short sample, I checked the meaning, avoiding ambiguities. I checked for additional information, asking for their opinion and ticking off authentic sources.
+31. The pedagogue checked and debugged inferential rules converted to sentences. I cut off the high-quality thought in DevOps. I converted logic into sentences. I used other pairs of logical algorithms and the sentences I had altered them to, describing each step of the inference. I used these inferential sentences to determine whether an algorithm had a particular representation as a sentence.
+32. The pedagogue manipulated the algorithm using sentences, figuring robots and manufacturing computers. I saved the explanation of the code with it to remind myself of the nuances of understanding and limitations inherent in it. I converted the algorithm to sentences. I always implicitly compared with an alternative method or philosophy, which I could represent as an algorithm, fine art, architecture, or music. I also described and worked out whether these alternatives differed by layout, logical structure or programming language.
+
+33. The pedagogue accommodated the method the student had used. I inter-spliced the two sets of two uses in a chain. The academy tutor helped with the following predicate (question), using as much of the student's work as possible. I followed the types in the work, finding which variables' values in the model solution had been satisfied. I started with the last correct commands and modified the algorithm to give the correct result with Program Finder.
+34. The pedagogue added details to simplify the projects. I splayed the results to find whether the test or code was wrong. The tutor aimed to finish helping the student write the work. The tutor asked the student to help plan, write test data and compute current values of the commands. If the student was disabled, the number of questions may have been reduced.
+35. 
The pedagogue checked that the data with [n, name] matched the type statement. I wrote a computer game to convert between recursion and findall. The first variant gave points for choosing the correct conversion and penalised several points for selecting the wrong conversion. The second variant involved the player entering and debugging the converted algorithm with a time limit. Later, the player could convert from Prolog to C and write code from a specification or a type statement.
+36. The pedagogue improved visualising mathematical ideas. The player unjumbled a multi-command in a game to give the desired result. Or, the player unjumbled the result of a multi-command. The academy ensured that the game was accessible by providing a text version for some. There might be more time if needed.
+37. The pedagogue found a decision tree using a neuronetwork. In a game, the player simplified code. Or, they converted if-then to predicates. When writing a decision tree, they found the if-then structure decomposing the list, producing the correct result. This algorithm was neural rather than inductive because many features required this algorithm.
+38. The pedagogue tried the course soon after buying it. In the game, the player remembered the set of correct associations. The academy gave the proper refund if the customer didn't want the course, giving them enough time. The customer returned feedback about why they needed a refund. The reasons might have included being unavailable, finding the course too easy or difficult, or needing the money.
+39. The pedagogue skipped over other items and found the desired item. I used a neuronetwork to find code with counters, lengths, string lengths, natural language or code found using induction. I separated tasks into different business tasks. One employee worked with arguments, and another worked with algorithms. A search algorithm with multiple append statements may be used if a list contains a particular order of items.
+40. 
The pedagogue attained the specialist simulated intelligence qualification. The handle had a void to hook the fingers into. The employee asked possible students to enroll in courses. The potential student needed the skill or wanted to investigate a course to find a job. The student attained mastery by learning how to perform it. +41. The pedagogue reviewed their activities. The essential predicate had multiple variants. The school nourished students with accreditation. The students were supported in thinking of uses, algorithms and arguments. They exercised creative, critical and rigorous thought. +42. The pedagogue fulfilled the professional requirements. I analysed the computational algorithm and the philosophical use of the object. I breasoned out 100 sales per hour. I breasoned out these using 100 sales/hour / (200 sales/machine / 12 hours) = 6 devices. I worked on enough As for the hour for sales. +43. The pedagogue checked the water was boiling. The disabled person used the billy can. I ordered a copy for the sale. I aimed for specific arguments (like politics) for the BAG As to be copywriting-quality. The culture was more open, with computer science being clear to everyone. +44. The pedagogue mind read the staff's algorithms, with connections using ontologies of data, type/specification statements and functional calls. I tipped the billy can using a stick. I designed a website conveying current courses and a feature-rich student area. I asked, \"Why were there high distinctions?\" There were As for each qualification's assignment, up to 250 breasonings, and algorithms. +45. The pedagogue interpreted the algorithm using Music Composer. I found the intersection point of the line and plane. I developed songs using Music Composer and then reinterpreted them. I found the best songs using Music Composer. I changed the rhythm, chord progressions, and extra notes at the ends of phrases and instruments. +46. The pedagogue analysed whether the command was true. 
I focused on teaching in the academy. The file processor was a flagship, which students interpreted with details. For example, they thought of their file formats. A mind reader found the students' thinking and helped them connect with the material. +47. The pedagogue paid for the course for results. I analysed the companies and people around me using mind reading. I attributed algorithms to spiritual conversations. I started with research, attributing algorithms to my academy's research students and finding various methods for each algorithm, like political sliders. The B2B company held details for each person. +48. The pedagogue decided on imagery to appear when specific algorithms were finished. I created a background algorithm writer that mind-read me, took notes and wrote the finished algorithm. I started with tens when I commenced politics, in other words, thoughts in terms of specific keywords. I wrote two keywords and objected to others as the tens. It made the imagery light up. + +49. The pedagogue made a functional call called intermediate predicates and other predicates in Prolog. I ran Prolog from State Saving Interpreter (SSI) and LPI for the web. The Prolog algorithm was run from within SSI or LPI. I ran SSI or LPI online, running a Prolog algorithm on the page for speed. Prolog could run the same code within a recursive algorithm. +50. The pedagogue checked their knowledge. I advised Essay Helper users to remove page references when not making a direct quote in the APA citation style. If they used a direct quote, they should use a page reference. If they didn’t use a direct quote, they should only give the author’s surname and the year. I explained they were specifications, not descriptions. +51. The pedagogue processed, ranked and produced understandable knowledge. I wrote a neuronetwork and an API for algorithms and arguments. Everything was categorised by algorithm topic and how the knowledge related to it. I accessed the neuronetwork using an API. 
The answers came in tens, and more detail could be asked about an item. +52. The pedagogue predicted mistakes in arguments in the future. I traversed the text from the neuronetwork. I found the number of items per level. I found the number of levels. I traversed the data pre-order, depth-first. +53. The pedagogue explored complete knowledge. I saved the sentences and queries to the neuronetwork. I found new queries from the old answers if there was a similar sentence. I kept the sentences that I looked up to avoid looking them up again. I saved the queries to avoid looking them up again. +54. The pedagogy professor read the summary each day. I solved the problem with more details and connections. I wrote the primary philosophy. I connected the argument to more departments, topics and keywords. I made connections between sentences. +55. The pedagogue only breasoned out documents if the daily regimen had been read that day and BAG had been started. I wrote an algorithm that copied, queued, paraphrased and breasoned out documents. The documents were copied to “paraphrase and breason out” or “breason out” folders. When the computer had time, it processed these folders. These algorithms were automatic and required no additional input. +56. The pedagogue made queue workflows where files could be moved, cancelled, stopped at midnight or use the right amount of memory. I breasoned out each document with two million breasonings. I put the document in the queue for two million breasonings. I breasoned it out. The two million breasonings gave me enough breasonings for masters. +57. The pedagogue wrote and read. I breasoned out 3*80+15 (for a detail) = 255 sentences per day. Once I had finished, I completed the necessary tasks. I completed the points for the philosophy and assignments. These were 30 for master’s, 50 for professor and 4*50 for business. +58. The pedagogue tested comments in predicates in each algorithm. 
I added the case of “a:-b,\n%d,e.” with a comment in the predicate to the Prolog converter. I added the comment between the lines of the predicate. This comment needed to be hard-saved in Lucian CI/CD. To hard-save the comment, the comment was added to the code before adding code lines to test and run the setup command.
+59. The pedagogue wrote an algorithm with short and long parameter lists. I wrote the “graft” algorithm, which found multi-use algorithms. It found different entry points to algorithms. Parameters could be added to these entry points for additional features. Examples of these features were automatic, languages, debug, or SSI long debug.
+60. The pedagogue ran the new generation interpreter. I included Lucian CI/CD as a List Prolog (List Prolog Interpreter, LPI, State Saving Interpreter, or SSI) module. The Lucian CI/CD module finds the simplified code from versions before running. If a predicate passes or fails unwantedly compared with how it is expected to work, Lucian CI/CD can correct the predicate’s code. If the code is backdated, the user is notified that the buggy code was removed.
+61. The pedagogue automatically kept new comments in Lucian CI/CD without the need to hard-save them. In Lucian CI/CD, I automatically included new comments and discarded old ones. The new comments were kept in order, and changes flowed around them (predicates with the same names and arities were compared in the old versions). The predicates with the same names and arities were kept together, and combinations of changes to them were tested. If the predicates in the old or new version were in the correct order, they would be tested in this order.
+62. The pedagogue wrote an algorithm that detected and prevented files and predicates from being loaded multiple times. I eliminated discontiguous predicates. I loaded each file once. This step stopped predicates from being discontiguous (predicates with the same name and arity being listed separately). 
This step stopped predicates from being run multiple times, causing errors. +63. The pedagogue queried, “What is after interpreters?” which was answered with “The brain”. I divided the predicates into old and new to find the right combination of changes to them. The old comments were discarded. The new comments were copied to the old and new predicates. If a predicate was unchanged, no new combinations were found, and it was tested. +64. The pedagogue wrote 80 breasonings, or as many or few as they liked. I wrote 6*4*50 As (A=80 sentence breasonings) in education. These covered the assignments well. The students could work on seen-as assignments. Their subsidised work was taken care of for them. + +65. The pedagogue used the algorithm generator algorithm to find grammars. I stated that the breasoning was a description of the algorithm. I wrote the grammar finder. It converted a series of texts into a grammar. It recognised tokens, worked from left to right and identified repeating recursive structures. +66. The pedagogue stated that some types contained constants. I tested that the algorithm met the standards. I tested the algorithm met the specs. If there were no specs, I generated a type for the algorithm. I generated data for the algorithm bottom-up using these types in a multiple predicate algorithm. +67. The pedagogue provided the education service. The business was non-profit. Any funding was contributed using equity. There were volunteers. Any earnings were put back into the business. +68. The pedagogue had their costs. The teachers were paid from dividends from investments. I invested. I made and reported dividends. I paid for expenses. +69. The pedagogue had 4*50 As for each product and for sales, which they updated. The students found the academy online. There was a business development manager. They took care of advertisements, down to individual prospects' thoughts. They found out and delivered spiritual FAQs, or ones on the phone. +70. 
The pedagogue mind read knowledge that the student could work out themselves. I focused mind reading from other times on a single time. If there were additional As from people, they could be used to help priority students. I also focused on readings at a time at other times. These times gave perspectives, support from the past and prescient knowledge from the student's future thoughts.
+71. The pedagogue wrote an algorithm and tested students for details. I breasoned out ten 4*50 As to find the best times of day for mind reading. The mind reader sowed relevant and interesting cues for thought. The times for catching up with details were during a walk, mind mapping session or rest. The times for detecting a response followed a finished project, such as an assignment, rest or job.
+72. The pedagogue supported themselves during the day. I kept the first 200 4*50 high distinctions for me, the next 200 for students, and the last 200 for society. Or, I waited for a faster computer to come out and hosted them all on it. I encouraged other meditators to run immortality software on their computer. They could develop texts and invite others to comment.
+73. The pedagogue could paraphrase texts with many characters. I removed double quotes from paraphraser1. Words such as \"we'll\", containing a single quote, were recognised. Words were separated from other punctuation marks, including double quotes. These double quotes included curly quotes.
+74. The pedagogue talked with and followed people. I went for regular walks to relax and prevent aches. I took frequent breaks from work. I stretched my muscles. I mentally exercised by thinking of reasons for thinking of people around me and checking that they were on track.
+75. The pedagogue included extreme comments, such as version history. I wrote the position a sentence had on a slider. This slider protected the texts as being defined. The sliders had a setting of one of 10 points. 
The slider represented light, moderate, heavy and extreme views on algorithms. +76. The pedagogue wrote 4*50 sentence-breasoning high distinctions with about two words. The \"political\" sliders defined the sentence's viewpoint. Another system represented the exact two words in each sentence to connect them to a political view. The other words in the sentence were antonyms for the two words. The antonyms were plainnesses to not detract from the two words. +77. The pedagogue wrote the plain algorithm. I used antonyms for Text to Breasonings and Text to Breasoning Algorithm texts. The antonyms in Text to Breasonings were for objects. The antonyms in Text to Breasoning Algorithm were for algorithms. These antonyms reinforced the two critical words in each sentence. +78. The pedagogue defined plainness as sidelined, supporting or an effect of a fact. I breasoned out the words and the algorithms in the antonym. I listed the word or algorithm. I breasoned it out, visualising its objects' x, y and z dimensions. These breasonings confirmed the antonyms' plainness. +79. The pedagogue wrote the sentence to be critically analysed. I replaced the words with their antonyms in the original text file. I checked the antonym could substitute for the word in the sentence. I modified the grammar accordingly. I grammar-checked the sentence. +80. The pedagogue recorded new knowledge. Shell's Breasoning Algorithm Writer (BAG) run saved each breasoning set. These breasonings were based on texts and algorithms written during the day. The breasonings inspired further work. They supported me with writing. 
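The quote and punctuation handling described for the paraphraser in 73 might be sketched as follows; the predicate names are hypothetical illustrations, not the actual paraphraser1 code:

```prolog
% Sketch (assumed names): split a list of characters into word and
% punctuation tokens.  The apostrophe is deliberately absent from
% punct/1, so a word such as we'll is kept whole.
tokens([], []) :- !.
tokens([C|Cs], [[C]|Ts]) :-
    punct(C), !,            % punctuation becomes its own token
    tokens(Cs, Ts).
tokens(Cs, [Word|Ts]) :-
    word(Cs, Word, Rest),   % consume the whole word
    tokens(Rest, Ts).

word([C|Cs], [C|Ws], Rest) :-
    not(punct(C)), !,
    word(Cs, Ws, Rest).
word(Cs, [], Cs).

punct('.').  punct(',').  punct('!').  punct('?').

% ?- tokens([h,i,'!'], T).
% T = [[h,i], ['!']].
```

Curly quotes could be handled under the same scheme by adding them as punct/1 facts.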
+"] \ No newline at end of file diff --git a/Lucian-Academy/Books/PEDAGOGY/Education on Pedagogy 2 1.txt b/Lucian-Academy/Books/PEDAGOGY/Education on Pedagogy 2 1.txt new file mode 100644 index 0000000000000000000000000000000000000000..b4bdff41a375347c934ae3fbe8a2a38bdd3bca30 --- /dev/null +++ b/Lucian-Academy/Books/PEDAGOGY/Education on Pedagogy 2 1.txt @@ -0,0 +1,9 @@ +Education on Pedagogy 2 1 + +1. The lecturer tested for the simplest Lucian CI/CD combinations with Spec to Algorithm. Lucian CI/CD tested the combinations for the simplest ones by finding the S2A and CAW commands to optimise the code. Like the S2A Optimiser, this step was followed, attempting to find the command for the whole algorithm, then top-down, not bottom-up. The specs came from the tests and types from the predicate and below. To prevent errors in types from the predicate and below, Lucian CI/CD first found types from them and then optimised the code using the above method. +2. The lecturer added an S2A module to Lucian CI/CD in GitL. I wrote GitL with S2A. GitL is a version control system that stores versions of repositories in a list of folders, displaying the results of "diff" between the changes and the most recent version. GitL recursively loads files from disk with their path, diffs them with the repository, and saves them. Diff finds insertions and deletions from one file to another by processing one character at a time with S2A. +3. The lecturer pattern-matched items using S2A or lists using CAW. I wrote Strings to Grammar in Spec to Algorithm. Strings to Grammar found the recursive structure of a list of characters, including repeating patterns by searching for a repeat of the first character, then recursively searching for repeats of the next character, testing the sequence from the repeating character to before the next one for equality with sequences following it. S2A found the recursive structure with its grammar-finding algorithm. 
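The repeat search just described for Strings to Grammar can be sketched as follows (a simplified illustration with assumed predicate names, not the actual S2G code):

```prolog
% Simplified sketch: succeed if List is two or more repetitions of
% Prefix, trying each candidate prefix in turn.
repeated_prefix(List, Prefix) :-
    append(Prefix, Rest, List),
    Prefix \= [],
    Rest \= [],
    all_repeats(Prefix, Rest).

% The remainder must consist solely of further copies of Prefix.
all_repeats(_, []).
all_repeats(Prefix, Rest) :-
    append(Prefix, Rest1, Rest),
    all_repeats(Prefix, Rest1).

% ?- repeated_prefix([a,b,a,b,a,b], P).
% P = [a,b].
```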
Alternatively, subterm with address could be modelled with S2A to redesign S2A by using a counter to track the dimensional address of subterms being found. +4. The lecturer explained that Mini CAW grounded both input and output for further optimisation. I wrote Spec to Algorithm with itself. S2A used "try" from Strings to Grammar (S2G) to find recursive structures, found constants that recurred in all the same positions in specs with the same shape, a decision tree to merge shapes with the constants, and in the algorithm, substituted data into the recursive structure, mapped input to output using the map found earlier and rendered the output from the filled-in recursive structure. It found the constant 1 in [a,1],[b,1] by finding terminals' addresses using subterm with address and saving the same items in the same addresses as constants with CAW. It found the decision tree [a,[nd,[[b],[c]]]] where "nd" denotes two non-deterministic branches from [a,b] and [a,c] by using sort, findall and recursion using CAW. +5. The lecturer substituted values into lists or recursive structures because arbitrarily long lists and their decomposition necessitated CAW. I substituted data, [1,2,3,2,3], into the recursive structure, [1, [r, [A, B]]], in Spec to Algorithm to produce [1, [r, [2,3]]] by using CAW. This action involved substituting or verifying values and recognising and matching repeating patterns. I also verified values contained by types, substituting or verifying these values. In addition, I saved previous variable values for future verification. +6. The lecturer inserted true or false at the end of specs, representing positive and negative cases (whether the spec returned true or false), used in CAW I mapped input, ["a",3], to output, 'A1' using the map [[1,1],[1]] found by Spec to Algorithm to "a". This involved getting item 1, "a" of item 1, ["a",3] of the input with subterm with address and substituting it for item 1, 'A1' of the output, giving "a". 
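The address-based extraction just described might be sketched in Prolog as follows; get_at/3 is a hypothetical helper name, not the actual subterm-with-address code:

```prolog
% Hypothetical sketch (assumed name get_at/3, not the actual S2A code).
% Follow a list of 1-based item numbers into nested lists.
get_at([], Term, Term).
get_at([N|Ns], List, SubTerm) :-
    nth1(N, List, Item),
    get_at(Ns, Item, SubTerm).

% Under this sketch, the address [1,1] selects item 1 of item 1:
% ?- get_at([1,1], [["a",3]], X).
% X = "a".
```

Substitution at an output address could then be modelled as the inverse walk, rebuilding the surrounding lists around the new value.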
I bug-checked S2A, making single items such as "a" possible to process and output. I gave queries to S2A in the form s2a([in1,in2],[out1], Algorithm) to find the algorithm for sets of inputs and outputs. +7. The lecturer used A=B with simple data or s2a(Complex_A,Complex_B) for complex data that required multiple predicates otherwise. I researched how much data meant subterm with address performed better than complex predicates (such as nested lists). Subterm with address bypassed the overhead of using predicates and recursive structures, mapping input to output in recursive structures using subterm with address and rendering output from the output recursive structures. I determined that predicates were more efficient if there were far fewer pieces of data and the complexity of the data was low. In addition, I determined that subterm with address was preferable if there was comparatively more data and the complexity of the data was high, as predicates have high overhead if not necessary compared with subterm with address, which ran the same predicate and could be optimised, as well as compiled efficiently. \ No newline at end of file diff --git a/Lucian-Academy/Books/PEDAGOGY/Education on Pedagogy 2.txt b/Lucian-Academy/Books/PEDAGOGY/Education on Pedagogy 2.txt new file mode 100644 index 0000000000000000000000000000000000000000..455a10e22c48935e9221b9b684fc785ad5cdf01b --- /dev/null +++ b/Lucian-Academy/Books/PEDAGOGY/Education on Pedagogy 2.txt @@ -0,0 +1,56 @@ +["Green, L 2024, Education on Pedagogy 2, Lucian Academy Press, Melbourne.","Green, L 2024",1,"Education of Pedagogy 3 + +1. The pedagogue used free variables for checking multiple types. I checked that the list was non-empty before assigning it to a variable, i.e. A=[_|_], A=[1,2,3]. Or I found that the list was a list after it was defined, for example, A=[1,2,3],is_list(A). The advantage of the second option was that it worked with empty lists, i.e. A=[], is_list(A). 
The robot's computer preferred the second option because it accounted for empty lists.
+2. The pedagogue experimented with writing [A|B] in Prolog and converting to C with list composition. I contrasted using [A|B] in calls:
+
+/*
+path1(0, 2, [[0,[1]],[1,[2,3]],[2,[0]]], [], Visited).
+Visited = [2, 1] ;
+*/
+
+path1(Start, End, Tree, Visited, [End|Visited]) :-
+    member([Start, Ends], Tree),
+    member(End, Ends),
+    \\+ member(End, Visited).
+
+path1(Start, End, Tree, Visited, Path) :-
+    member([Start, Ends], Tree),
+    member(Next, Ends),
+    \\+ member(Next, Visited),
+    path1(Next, End, Tree, [Next|Visited], Path).
+
+Or using list decomposition:
+/*
+path2(0, 2, [[0,[1]],[1,[2,3]],[2,[0]]], [], Visited).
+Visited = [1, 2] ;
+*/
+
+path2(Start, End, Tree, Visited1, Visited2) :-
+    member([Start, Ends], Tree),
+    member(End, Ends),
+    not(member(End, Visited1)),
+    append(Visited1,[End],Visited2).
+
+path2(Start, End, Tree, Visited1, Path) :-
+    member([Start, Ends], Tree),
+    member(Next, Ends),
+    not(member(Next, Visited1)),
+    append(Visited1,[Next],Visited2),
+    path2(Next, End, Tree, Visited2, Path).
+
+In both, there is a base-case clause that doesn't call the predicate recursively. Compared with the first example, the second returns the visited nodes in order rather than reversed, which is desired. The first example uses fewer instructions and variables.
+3. I relaxed in the morning or before lunch and worked out in the morning. I determined the best time to work or relax. I took a 15-minute break an hour after starting. Then I had a 30-minute lunch break. I also took a 15-minute break an hour before quitting.
+4. The pedagogue stated that the Apple IIe game was compressing Prolog's memory to work on the machine. I suggested the best way of filming a story. The sky was black except for the pink dawn, and the dawn lit the faces. The train was stopping, and we took a break to walk around. The miasma of thought was like an Apple IIe game.
+5. The pedagogue remembered every last detail of their life. 
I retained records of all the memories. I documented the computer program with flowcharts, print-outs, diagrams, walkthroughs and instructions. I took and kept photographs in an electronic album or on a private website. I wrote poems, stories, ideas and diary entries. +6. The pedagogue also accepted real money to help promote work. I simulated executives, management and promotion of the work using my algorithms. The executives decided on the best details (tens) to maintain the software. The managers helped the workers to develop these details with, for example, simple maplist algorithms. The algorithms had essay connection inviters, algorithms in action, and live play, and one could gain credit to promote them by using other algorithms. +7. The pedagogue split the algorithm into chunks, identifying the needed components bottom-up. I chose to go on holiday at any time. I decided to visit at the shoulder season (from peak to off-season). I left for my business trip in the early afternoon. I organised my work so that I was relaxed and could automate work with my algorithms. +8. The pedagogue was conscious that Lucian CI/CD was output, not input-dependent. I ran Lucian CI/CD, then deleted singletons in the Prolog predicate head. I also ensured there was no other predicate with the same name and arity as this one. If there was, I changed the new predicate name. I was careful to edit calls and tests so that they didn't use this variable. +9. The pedagogue merged similar predicates. I checked for any duplicate predicates. I listed all the predicates' names and arities. I checked for any predicates with the same name and arity as another. I deleted duplicates with the same body. +10. The pedagogue merged predicates that were similar enough but not identical. I checked for any similar predicates. The predicate was similar if it had a similar head, start, middle or end of the body. 
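Item 9's duplicate check can be sketched in Python; representing each clause as a (name, arity, body) triple is an assumption for illustration, not Lucian CI/CD's actual data structure:

```python
# Hypothetical sketch: list each predicate as (name, arity, body) and delete
# later duplicates that share all three, keeping first occurrences in order.
def dedupe_predicates(predicates):
    seen = set()
    kept = []
    for name, arity, body in predicates:
        key = (name, arity, body)
        if key in seen:
            continue  # duplicate with the same name, arity and body: deleted
        seen.add(key)
        kept.append((name, arity, body))
    return kept

preds = [
    ("a", 2, "b(X),c(Y)"),
    ("a", 2, "b(X),c(Y)"),   # exact duplicate: deleted
    ("a", 2, "d(X)"),        # same name/arity, different body: kept
]
print(dedupe_predicates(preds))  # [('a', 2, 'b(X),c(Y)'), ('a', 2, 'd(X)')]
```

Clauses with the same name and arity but different bodies survive, which matches the text: only duplicates with the same body are deleted.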
I moved reused code into another predicate or did this after bringing the called code into the main body. I called these subpredicates directly in the body or placed them in antecedents or consequents of if-then clauses or findall or nested findall statements. +11. The pedagogue simplified always true or false antecedents and kept interpreter commands in other repositories. I checked for any unnecessary predicates. I ran code with some or all of the clauses with a particular name and arity. I found the minimum number of clauses with the desired result. I checked for and removed database translations of commands in the interpreter. +12. The pedagogue found out from science. I checked for missing predicate calls. I found the previous specifications to find an algorithm. Or I matched the formula from the text in the specification and extrapolated the code bottom-up. Ideas started simple and gathered links. +13. The pedagogue tried rewriting the code from scratch, with more straightforward and uncomplicated code with each neuronetwork version. I checked for unnecessary extra predicate calls. I tried removing the line. I replaced it with other code or adjusted variables in other commands. When I deleted the predicate call, the predicate became obsolete. +14. The pedagogue inserted code with fewer commands and variables into the database. I replaced a command with another one. I replaced the code with inline code from a predicate, replaced the reused code with a predicate call, or replaced it with more efficient code with fewer commands and variables. Or it could be replaced with direct binary code. +15. The pedagogue tried creating hardware and software that was native in [A|B] in calls. I did more with less. I compressed many expressions of a meaning into one. I found many paths for humans to get back on track. I replaced list processing with [A|B] in calls in my code (processed faster with human cognition), then converted back to [A|B] in calls (processed faster by computers). +16. 
The pedagogue left mathematical formulas alone. I removed a command and adjusted variables in other commands. For example, I changed variables in the remaining commands in the body to those from the removed command. Believing in using [A|B], not [[A, C]|B] in a call, I let each predicate process a level in the term for readability. At least, I let D=[A, C] (using two parts going together) and used [D|B] in a call. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/PEDAGOGY/Education.txt b/Lucian-Academy/Books/PEDAGOGY/Education.txt new file mode 100644 index 0000000000000000000000000000000000000000..e446e589f13123ce636e8ce9cf75f4ccc2708425 --- /dev/null +++ b/Lucian-Academy/Books/PEDAGOGY/Education.txt @@ -0,0 +1,62 @@ +["Green, L 2024, Education, Lucian Academy Press, Melbourne.","Green, L 2024",1,"Education 1 + +1. I flagged duplicates. I sorted the multi-dimensional data. I ordered the columns in sort order. I sorted the data by column order. If there were duplicates, I sorted by the next column. +2. The student performed well. I established that the student would be helped with the subject. I gave As for each part along the way. The student asked specific questions. The student led other students. 
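Item 1's multi-column sort, where ties on one column fall through to the next and rows equal on every sort column are flagged as duplicates, can be sketched (a minimal Python illustration; the column order here is made up):

```python
# Sort rows by a given column order; ties on one column fall through to the
# next. Rows equal on every sort column are flagged as duplicates.
def sort_and_flag(rows, column_order):
    key = lambda row: tuple(row[c] for c in column_order)
    ordered = sorted(rows, key=key)
    flags = [i > 0 and key(ordered[i]) == key(ordered[i - 1])
             for i in range(len(ordered))]
    return ordered, flags

rows = [(2, "b"), (1, "c"), (1, "c"), (1, "a")]
ordered, flags = sort_and_flag(rows, column_order=[0, 1])
print(ordered)  # [(1, 'a'), (1, 'c'), (1, 'c'), (2, 'b')]
print(flags)    # [False, False, True, False]
```

Sorting by a tuple key gives the fall-through behaviour directly: the second column is only consulted when the first is equal.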
+ +4*800-5960/5=2200 sentence paragraphs +2200/80=27.5 80s +27.5*80/46 chapters=48 sentences per chapter + +finish off chapters + +4*800-150/5=3170 sentence paragraphs +Accreditation +Protectedness +Areas of Study to Create a Pedagogue +Create a pedagogy helper for the student +Finding out about the student as a Pedagogy Helper +Daily Professional Requirement of the Pedagogy Helper +Preparing the student to write each breasoning +Pedagogy Helper - Write on Breasoning - Politics +Pedagogy Helper - Write on Breasoning - Philosophy +Pedagogy Helper - Write on Breasoning - Computer Science +Unification to Become Pedagogy Helper +Practicum +Breason out Arguments Twice When in Large Class +Two Uses +X +Y +Breathsonings +Rebreathsonings +Room +Part of Room +Direction +Time to Prepare +Time to Do +Time to Finish +Professor Algorithm +God Algorithm +Marking Scheme - Humanities and Science +Marking Scheme - Creative Arts +Details +Breasonings +Rebreasoning +Breathsoning +Rebreathsoning +Room +Part of Room +Direction +Time to Prepare +Time to Do +Time to Finish +God Algorithm +Professor Algorithm +Marking Scheme - Humanities +Marking Scheme - Creative Arts and Science +God's Infallibility from Being Critiqued +Higher Grades +Fewer Stillbirths +A Greater Number Of Successful Job Applications +Aigs for Pedagogy Helper + +48 chapters=46 more sentences per chapter"] \ No newline at end of file diff --git a/Lucian-Academy/Books/PEDAGOGY/University 2 - 1.txt b/Lucian-Academy/Books/PEDAGOGY/University 2 - 1.txt new file mode 100644 index 0000000000000000000000000000000000000000..ab876609f5c5f499f7654ba6d7c05161119893f0 --- /dev/null +++ b/Lucian-Academy/Books/PEDAGOGY/University 2 - 1.txt @@ -0,0 +1,24 @@ +["Green, L 2024, University 2 - 1, Lucian Academy Press, Melbourne.","Green, L 2024",1,"University 2 - 1 + +1. The lecturer listed the commands without output to preserve in Lucian CI/CD, such as close/1. I found the types for the algorithm. 
I found the list, string, number, atom, blob, variable, predicate and type types. I proposed that each of these be put in terms of the number type. There were integer and floating number types, converted to each other depending on the number. +2. The lecturer listed the hierarchy of types, including “or”. I verified that the tests were compatible with the types. The test was \"a(1,2,A). A=3.\". The type was “a(number, number, number).”. I verified that 1 was a number, 2 was a number, and 3 was a number. +3. The lecturer corrected the incompatible tests. I listed the compatible tests and produced errors about the conflicting tests. All the arguments in the test had a type. If one of these types didn’t match a type statement, there was an error. Types could specify hierarchies of commands and lists. +4. The lecturer wrote a grammar of the programming language. I wrote types as a single statement, including “or”. The single statement was instead of a hierarchy. I wrote “and”, “lists”, and “or” in the hierarchy. This single statement avoided the need to name reused types. I used a grammar. +5. The lecturer shortened the grammar, listing different endings together. I wrote nested commands in the hierarchy of types. For example, \"predicate_name:rounded_list(variables) :- commands.\", where commands=\"command, commands\", command=\"findall(-,commands,-)\" or command=\"halt\", etc. I combined the types from the algorithm into one hierarchy. I found the parts which were supposed to be called the same but hadn’t been, and made them so. +6. The lecturer found the grammar and formulas to produce the input. I grouped the recurring types. I examined the code producing the string. I preserved the lists of characters converted to strings from findall. I kept notes of properties, saved as separate formulas, to produce tests. +7. The lecturer constructed the type (1 or “1”), two and the test 1,2. I made a hierarchy of types containing disjunctives. The example “a:-b, number. b:-string. 
b:-number.” gave the hierarchy. The single statement was given by the example “a:-(string or number), number.”. The example gave a formula, “a:-(number a or string a),number b. {b is a+1}.”. +8. The lecturer could customise the middle item's variable, formula or constant. I wrote intermediate types to define types with the same beginnings and ends but different middles. For example, I wrote a=string,b(c), b(C)=string,C, and c=number. The beginning was a string (x). The middle was a string (y). The end was number (z). + +The lecturer processed the data in bulk. + +9. The lecturer claimed that Lucian CI/CD identified and suggested solutions to problems in code. I wrote and applied large data sets to algorithms to test them. For example, I tested Lucian CI/CD with non-predictable GitL data. Lucian CI/CD generated tests for Lucian CI/CD and wrote non-predictable code. I wrote the code for the test. +10. The lecturer surpassed and became the best. I tested the interpreters, CAW, neuronetwork, program finders (including program finder, data to algorithm and program finder with types), an algorithm to types and tests with many algorithms as data. I assumed the algorithms would work together using the same data and used them together. I assumed the underlying systems would be there and wrote mine. I determined the system of the algorithmic data, including commands and the logical structure of their programming language. +11. The lecturer tested the code for the spec. I wrote the neuronetwork for an educational audience. It prepared students to complete work, explained its limitations and listed multiple possible answers. I worked on how to complete algorithms with it. I broke the problem into predicates and data and worked bottom-up. +12. The lecturer saw physics simulations as a priority. The neuronetworks contained enough data after a certain time. They weren't afraid of applying new ideas to ideas. 
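Item 8's intermediate types, where a=string,b(c), b(C)=string,C and c=number share beginnings while the middle C is a parameter, can be sketched (a hypothetical Python rendering, with types as predicates over a flat sequence):

```python
# Hypothetical sketch of item 8's intermediate types: b(C) fixes its
# beginning (a string) across uses, while the middle C is a parameter.
def is_string(x): return isinstance(x, str)
def is_number(x): return isinstance(x, (int, float))

def c(seq):
    # c = number
    return len(seq) == 1 and is_number(seq[0])

def b(middle):
    # b(C) = string, C
    return lambda seq: len(seq) >= 1 and is_string(seq[0]) and middle(seq[1:])

def a(seq):
    # a = string, b(c): a string (x), then a string (y), then a number (z)
    return len(seq) >= 1 and is_string(seq[0]) and b(c)(seq[1:])

print(a(["x", "y", 1]))  # True
print(a(["x", 1, "y"]))  # False: the middle must be a string
```

Parameterising the middle means the shared beginning and end are written once, matching the text's motive for intermediate types.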
Materials with new properties could be tested in the science simulation, and new computers could be created. Physical science was reduced to computer science. +13. The lecturer developed 4*50 As for a single advantage. With more data and better technology, I could move faster. I finished more features more quickly. I could write better features with more knowledge. The natural expectation for better computing power demanded speed. +14. The lecturer wrote viable ideas. The single advantage was pedagogy and reaching foreseeable benchmarks. I became immortal. I ran a business. I invested in meditation and spiritual medicine education. +15. The lecturer developed content and form algorithms. I expanded the brackets and worked out the problems individually. I completed fundamental tasks from the first principles. Using brute force, I worked on the ground, not in the air. I compiled solutions into a decision tree. +16. The lecturer wrote 4*50 high distinctions to provide the technology for the organisation. I bug-tested algorithms with combinations of data. The speed level was complete. The perfection level was complete. I coordinated points of view politically. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/PEDAGOGY/dot-University 1 - 1.txt b/Lucian-Academy/Books/PEDAGOGY/dot-University 1 - 1.txt new file mode 100644 index 0000000000000000000000000000000000000000..561f79527b996805844f0f0b23a5eda1d8af0d77 --- /dev/null +++ b/Lucian-Academy/Books/PEDAGOGY/dot-University 1 - 1.txt @@ -0,0 +1,35 @@ +University 1 - 1 + +1. The lecturer processed the data in bulk, converted a formula to an algorithm, and found the program from the user interface. I worked out the future algorithms from the types. The data structure, formula, or output type helped produce the algorithm. I found patterns in the transformation from input to output, such as pattern matching or mathematical formulas. 
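The point just above, finding a mathematical formula in the transformation from input to output, can be sketched as a toy program finder (the candidate formulas here are assumptions for illustration, not the actual tool):

```python
# Toy program finder: try a few candidate formulas against all input/output
# pairs and return the first formula that fits every pair.
def find_formula(pairs):
    candidates = [
        ("x + 1", lambda x: x + 1),
        ("x * 2", lambda x: x * 2),
        ("x * x", lambda x: x * x),
    ]
    for name, f in candidates:
        if all(f(i) == o for i, o in pairs):
            return name
    return None  # no candidate matched: fall back to pattern matching

print(find_formula([(1, 2), (2, 3), (5, 6)]))  # 'x + 1'
print(find_formula([(2, 4), (3, 9)]))          # 'x * x'
```

A real program finder would generate candidates rather than enumerate a fixed list, but the fit-against-all-pairs test is the same.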
Alternatively, I started by identifying a formula and inferred input and output. +2. The lecturer modified some sentences with cultural references and algorithms with language-specific grammars. I developed the bulk Cultural Translation Tool. I translated the documents into other languages. I checked off the sentences that were the same when back-translated. I asked the user if different sentences had the same meaning as the original and kept these variants to check other languages. +3. The lecturer kept track of sentences that hadn’t worked and kept sentences that had been translated correctly. I developed the tool to translate an algorithm into another language. It could help bulk translate documents into other languages. I translated the document first. Then, I translated sentences the user had modified to be more straightforward. +4. The lecturer always started with exact types or a list of lists of types. String concat was similar to append in that it could work out the output of get-token-style parsing of strings containing string variables and constants. I represented string_concat as “:” and append as “::”. Different types could be inputs into string_concat. For example, A=a: a produced A=a: a. Alternatively, aa=A:a produced A=a. +5. The lecturer broke apart two string expressions, equalling each other, to debug the expression. I stated that a string such as A in aAbBc=DEc could have zero or more characters. I made inroads from the edges. Then, I found all combinations of solutions. I could specify string lengths. +6. The lecturer started with a list of types, in this case, A=b and B=b. The algorithm began by finding the number of times the minimum lists could fit into the string. Then, it tried to find matches until all the matches had been found. +7. The lecturer guessed the type from given constants. I tested compatible commands. I found the type of the called predicate and the call. I selected a consistent fit from the call to the called type. +8. 
The lecturer turned off the need to define some predicates. I guessed the processed list would be defined. In addition, I assumed output would be preferably defined. I preferred to warn about rather than make allowance for an out-of-place undefined variable. I preferred data to be defined as soon as possible. +9. The lecturer changed "a(A,B,C):-and(A,B,C). and(true,true,true)." to "a(A,B) :- A,B.". I tightened the type of the called argument from a possibly obsolete output-input pair or a non-defined input-output pair. The former, for example, "a(O):-b(O). b(i)." can be rewritten "a(i).". Ideally, input should lead to input, and output should lead to output. I converted logic-containing rules to logic. +10. The lecturer gave the sort algorithm as a type to help generate data. I simplified the non-deterministic list of types to the most useful ones per predicate (for example, ones that used each command in the interpreter). For example, the interpreter may have string, atom and number commands. These had the same types as themselves. I ventured to suggest random input for a sort algorithm. +11. The lecturer continually tried to reduce to binary. I found the algorithm’s types and then wrote tests from them. I reduced the algorithm to minimal logic, including types. I made inroads, working out and simplifying parts at the start or end that the algorithm always parsed or didn’t need. I determined the algorithm’s properties and simplified it, like lazy evaluation, using data or the algorithm, prompting optimising the instruction set. +12. The lecturer found additional possible new tests to test new features. The immortal suggested corrections to incompatible tests. First, the algorithm found types to check tests against. If an already present test didn’t fit any types, then all or parts of the input compatible with the type could be used to find the spec. The algorithm could find the closest type and modify the test. +13. 
The lecturer signified exact data lengths and types, requiring grammars. I checked the human-written tests against the types. For example, the string a, which may recur in types, may have different lengths with different results. It may be better to use constants or specific lengths. The constants would signpost sequences in the string, making it easier to parse. +14. The lecturer added zeroth rows and columns. The string expression a = string expression b could contain whole algorithms. For example, similar list expressions could find list(list(number a_n_m))->flat(list(list(a_n_m))). Or they could find items using a heuristic. For example, the maximum number or the sorted list could be found. This heuristic may be part of the algorithm, transforming matrices. +15. The lecturer explained that the item was transformed and put back. Simplified expressions were a new programming language. I extracted items with member or subterm with address. I wrote connections between equal expressions and extractors, such as list(max(list1)). I extracted information twice when correspondence was needed. +16. The lecturer explained how the algorithm should work in plain language. I explained that the item was transformed and put back. For example, a type statement with a formula was list([a where a member([b, C], a) and C is replaced with C+1]). Or C could be compared with a value inside or outside it. Or only a list([b, C]) could be returned, or addresses of [b, C] could be recorded before substituting back. + +17. The lecturer called recursive predicates recursively with nested data or data with restarting or continuing output. I produced several outputs with Spec to Algorithm (S2A). Instead of one output with several variables, I produced several outputs. I could have negative results (results that failed) to shape results when porting S2A specs to CAW by creating conditions and sub-conditions in clauses and sub-predicates containing the “not” operator. 
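The earlier idea (items 5 and 13) that a string variable such as A in aAbBc=DEc stands for zero or more characters, with constants signposting the sequence, can be sketched (a hypothetical regex-based Python version that returns one solution rather than all combinations):

```python
import re

# Hypothetical sketch: lowercase letters are constants, uppercase letters are
# variables matching zero or more characters. Solve pattern = string by
# converting the pattern to a regex with one named group per variable.
def solve_string_equation(pattern, string):
    regex = ""
    seen = set()
    for ch in pattern:
        if ch.isupper():
            # first occurrence captures; later occurrences must repeat it
            regex += f"(?P<{ch}>.*?)" if ch not in seen else f"(?P={ch})"
            seen.add(ch)
        else:
            regex += re.escape(ch)  # constants signpost the sequence
    m = re.fullmatch(regex, string)
    return m.groupdict() if m else None

print(solve_string_equation("aAb", "axyb"))  # {'A': 'xy'}
print(solve_string_equation("aAb", "ab"))    # {'A': ''} - zero characters
```

Enumerating all combinations of solutions, as the text describes, would use `re.finditer`-style backtracking or explicit split enumeration instead of a single `fullmatch`.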
S2A and CAW created predicates with multiple outputs, such as processed and output lists. +18. The lecturer explained that the compiler attempted to optimise code before running without “trace” to improve performance. I used S2A to optimise Prolog. I rewrote the Prolog code based on its specs from tests or types from the original algorithm. The type finder found a certain number of possible outputs with data for each predicate, including dependencies of predicates. Optimisation involved contracting pattern matching to a minimum number of statements, where intermediate CAW statements transformed variable values for further pattern matching or output. +19. The lecturer noticed that the generated code represented multiple predicates, and the generated code represented the algorithm bottom-up to simplify the type generation needed to produce S2A specs to produce the code. The bottom-up method should only be used if the algorithm can’t be found from S2A specs for the whole algorithm, where the commands’ types are broken down instead, and the code is found. S2A optimised code by finding recursive patterns and separating and finding constants for specs with the same shape, where character-breakdown was completed if code couldn’t be found for non-character-breakdown. In this case, all strings, atoms, numbers and compounds were broken down into characters and variables representing these characters were tested for relationships. These recursive structures found and simplified relationships, producing code that ran with the same results as the original and could replace it. +20. The lecturer used types within CAW to speed up command selection before testing. I used CAW to find Prolog commands for parts of S2A specs that S2A couldn’t resolve in optimisation. The decision trees of CAW commands were tested with type statements first. 
I used trial and error to find rules to accurately find the context for commands, including mind-reading, artistic license, pattern-matching, and recent commands. I created a new command, down to item and character level, to achieve a goal using S2A. +21. The lecturer drafted S2A (with expanded or optimised statements in the form A=B) and loop Prolog commands with “for” or “while” syntax that compiled to C. Using S2A eliminated the lag of using predicates, improving performance. I eliminated expanding to predicates, where expanding to predicates would cost performance if they were part of recursion. S2A was efficient enough for optimisation and, together with recursion-as-loops, could compile and run speedily in C. Recursive structures were reused. CAW commands were not embedded in them but found afterwards in a cycle with S2A until unresolved variables were resolved, and recursion-as-loops were required even with S2A because of CAW. +22. The lecturer wrote code using S2A and converted it to an expanded form for terminal predicates but left it the same for compilation for performance. Expanded code was only for publishing, as it was easier to understand and manipulate. Students and workers could write and use S2A to speed up development time and learn about optimisation strategies, improving performance and code quality. Programmers could read predicates online, extending documentation with examples and automatic code selection and debugging assistance. Premium code checkers helped users finish complex tasks with perfect results. +23. The lecturer converted base cases to exit conditions of loops. I converted non-deterministic code (predicates) to loops for speed. These loops were contained in predicates and resembled for- or while-loops. This method required that variables be renamed and conflicting variable names be corrected. Loops rather than predicates helped warn on overlapping loops and convert recursion to efficient assembly code. +24. 
The lecturer published S2A as compiler research and trained a neuronet on it. I minimised and simplified the S2A, CAW, and recursion-as-loops-generated code. This step briefly required converting to predicates for DFA minimisation and simplification. In addition, recursive structures and mappings in S2A calls and reused code in predicates were modularised for minimisation and simplification. If S2A calls could be replaced with fewer commands, this was done to improve the algorithm. +25. The lecturer wrote an algorithm to interest the no-coder in enough (not necessarily computational) details to develop various specifications in natural language, such as sentences, tables, order of words or other properties that one wanted to transform. I “wran” (sic) or wrote/ran the algorithm in a creative process involving making critical decisions, learning all the most essential types of bug corrections (which one could walk through), finessing code to work professionally and exploring options to share or commercialise the code. Robots, workers, students or hobbyists could walk through development and training demonstrations or texts and incorporate the code or write research as a response. By “wrunning” (sic) the algorithm, the user could optionally specify the algorithm in human language and not experience the code. The user could write a code “wrunner” (sic) with another “wrunner” (sic) and take responsibility for lists of ideas they wanted to “wrun” (sic), which they wrote an algorithm to help them collect automatically, and “wrun” over time. +26. The lecturer wrote a guide algorithm to help add, change or delete S2A call items and to verify the changes, judging when the changes had finished to check and possibly brief the user on unfinished changes if they had gone away. I output comments explaining the recursive structure and mapping elements used in the S2A call for easier understanding and modification. 
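Item 23's conversion of base cases to loop exit conditions can be sketched; here a recursive list-length definition is hand-translated to a while-loop (an illustrative Python analogue, not generated code):

```python
# Recursive form: the base case is the empty list.
def length_rec(lst):
    if not lst:          # base case clause
        return 0
    return 1 + length_rec(lst[1:])

# Loop form: the base case becomes the while-loop's exit condition, and the
# recursive argument becomes a variable updated each iteration.
def length_loop(lst):
    n = 0
    while lst:           # exit when the base case ([]) is reached
        lst = lst[1:]
        n += 1
    return n

print(length_rec([1, 2, 3]), length_loop([1, 2, 3]))  # 3 3
```

As the text notes, the loop form needs care with variable renaming, since the recursive call's fresh variables collapse into one mutable variable per argument.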
The recursive structure contained nested recursive structures and elements, which were formatted with indenting and labelled in comments with examples of their expanded form. In addition, the mapping of input to output recursive structure addresses, containing pairs of origin and destination multi-dimensional addresses of term elements, gave comments with the variable names from the recursive structures and their contents. This S2A command and recursion-as-loops could be part of a programming language to build and test prototypes quickly. +27. The lecturer regularly backed up essential files and unsaved and saved versions during development. I first wrote specifications, generated code, made refinements, tested, and ran, all of which could be smoothed by checking that the specifications entered and re-entered were in the correct format, were enough, contained required negative cases, and were backed up if needed. All of this could be communicated using natural language and interactive simulations of the algorithm. The colours of part of the algorithm’s data flow became the setting for correcting questionable, poorly constructed or inefficient code. S2A helped correct inefficient code, conflicting variables, and logic structure flaws. It also corrected inconsistencies in the design or format of code, including command selection, indices, reusing predicates, simplifying predicate use, keeping a checklist of items to do and remembering one’s progress during the time. +28. The lecturer expanded recursive structures, using separate ones before substituting data into them because substituting into a term was easier. In S2A, I referred to existing recursive structures instead of creating new ones. 
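The mapping of origin to destination multi-dimensional addresses described above can be sketched; `get_at`, `put_at` and `map_addresses` are hypothetical helpers standing in for subterm with address, not the actual implementation:

```python
# Hypothetical helpers: address a subterm in a nested list by a path of indices.
def get_at(term, address):
    for i in address:
        term = term[i]
    return term

def put_at(term, address, value):
    if not address:
        return value
    copy = list(term)
    copy[address[0]] = put_at(term[address[0]], address[1:], value)
    return copy

# Map input to output: copy each origin subterm into the destination slot
# of an output template, one (origin, destination) address pair at a time.
def map_addresses(input_term, output_template, pairs):
    out = output_template
    for origin, destination in pairs:
        out = put_at(out, destination, get_at(input_term, origin))
    return out

inp = [[1, 2], [3, 4]]
out = map_addresses(inp, [[0, 0], [0, 0]],
                    [([0, 1], [1, 0]), ([1, 0], [0, 1])])
print(out)  # [[0, 3], [2, 0]]
```

Each pair of multi-dimensional addresses is self-describing, which is what makes the comment output described in item 26 possible.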
These structures were referred to once as part of recursive structures, except when there was a break, they were at different levels or not in a recursive structure together, so a separate recursive structure was pointed to—the new format contained dependencies of pointers rather than a nested term. Near misses or inconsistencies in the recursive structure were noted and brought to the user’s attention or corrected. For example, one-off specs in either the shape of input or output compared with many others, unusual item type or change in type (variable or constant), or a later change leaving inconsistencies were caught. +29. The lecturer expressed that breaking into characters was expensive and only did it when necessary. I only broke elements into characters if needed when optimising them. I transformed the original algorithm into S2A and CAW code stages, where S2A code was pattern-matching, and CAW was unpredictable code such as (+). If the CAW code in a stage broke down into characters or compared them with a value, breaking down to characters was required, and stages were affected by the variables introduced. I checked whether the S2A specifications required breaking down into characters by checking for unusual, unknown variables and then breaking into characters to try S2A or CAW. +30. The lecturer put the answers away. I stated that if elements are broken into characters, it is only for that pattern-matching S2A call. The pattern-matching S2A call can be broken into substituting data into the recursive structure and mapping input to output. A CAW command was inserted if one needed to be inserted between these. CAW commands included operations, verifications, and comparisons between input and output. +31. The lecturer optimised the algorithm with Spec to Algorithm (S2A), using existing commands rather than CAW for processing speed, in contradistinction to finding the algorithm using S2A and CAW. 
Breaking into characters of elements before S2A processing needs to be done before finding unique variables and constants to include variables from multiple computations. The algorithm sections, mainly where variables were unknown and S2A worked, must be broken down into characters and labelled for breakdown. These sections were then broken down across the spec at this point of the algorithm. The whole algorithm needed to be worked through bottom-up with S2A, with variables being copied to new S2A calls but one not being concerned with unique variables and constants being computed for the whole algorithm simultaneously. +32. The lecturer minimised the code and data structures and mapped the output to the user’s code and data structures at the end. If S2A produces a unit production A1=A2, I noted it, then collected instances of this chain of variables from the sequence of S2A calls and removed the unit productions A1=A2, etc. This removal might not be necessary because S2A moves needed computations forward. Pattern matching operations are already grouped, with only necessary CAW commands between S2A calls, leaving no unbroken chain of A1=A2 commands, assuming variables are grouped into those needed to move together. There are no individual variables left unaccounted for in a pattern-matching operation. I magically used the S2A code to compute the answers in the algorithm at the earliest opportunity, even though the user appeared to receive the answer by tracing through the existing code. Even though the existing and S2A codes do the same thing, the performance improvement from optimising with S2A may be minutes or hours. diff --git a/Lucian-Academy/Books/PEDAGOGY/dot-University 2 - 2.txt b/Lucian-Academy/Books/PEDAGOGY/dot-University 2 - 2.txt new file mode 100644 index 0000000000000000000000000000000000000000..520250120b8bb7dc176b1b81c11df4571fe8776d --- /dev/null +++ b/Lucian-Academy/Books/PEDAGOGY/dot-University 2 - 2.txt @@ -0,0 +1,18 @@ +University 2 - 2 + +1. 
The lecturer deleted quantum algorithms from the quantum computer afterwards or didn't need to because it was a quantum simulation, where electrons remained instead of disappearing, or instead disappeared. I devised a mind-reading quantum computer that mind-read strings and subatomic particles from string theory to have correct results, giving the output to the classical computer. This quantum computer could make computations instantly, making simulation, space travel, and immortality possible. The configuration and necessary computations were preclusive. However, a time machine was already possible with a classical computer. I programmed the array of lock-entangled particles to give the emergent effect of a circuit. +2. The lecturer stated that Spec to Algorithm (S2A) could represent recursion using [maplist,_], [foldr,_] or similar. S2A uses the rule [is,[*]] to find substrings using two appends from "[is,[1+1]]" to write interpreters with calculations as part of strings quickly. Instead of using subterm with address to find and replace a subterm in a term, the algorithm found the string where "*" was in the pattern in the string. Then, after evaluating it using term_to_atom and is, it replaced this with 2. The interpreter's predicates, needed formulas and recursive elements could be constructed using S2A. +3. The lecturer mind read the string. It had an address and momentum. It was there in between, but the instruments of the observer might affect it. I wanted to know if Text to Breasonings could control gravity. +4. The lecturer accounted for the subatomic particles, programming one of them like a crystal with 50 high distinctions to return a frequency, suggesting the ability to create a DFA with substitutions as transitions. I mind read the subatomic particle, the model boson and the fermion in superstring theory. 
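Item 2's [is,[*]] rule, which splits the string around the marked calculation with two appends, evaluates it, and substitutes the result back, can be sketched (a hypothetical Python analogue of the Prolog described; the marker strings are taken from the "[is,[1+1]]" example):

```python
# Hypothetical sketch: split the string around the marked calculation (the
# two-append step), evaluate the arithmetic, and substitute the result back.
def eval_in_string(s, start_mark="[is,[", end_mark="]]"):
    i = s.find(start_mark)        # first append: the text before the pattern
    j = s.find(end_mark, i)       # second append: the text after the pattern
    if i == -1 or j == -1:
        return s                  # no marked calculation: leave unchanged
    expr = s[i + len(start_mark):j]
    value = eval(expr, {"__builtins__": {}})   # e.g. "1+1" -> 2, like is/2
    return s[:i] + str(value) + s[j + len(end_mark):]

print(eval_in_string("x = [is,[1+1]] apples"))  # 'x = 2 apples'
```

The restricted `eval` stands in for term_to_atom plus is/2; finding the two surrounding substrings is what the two appends do in the Prolog version.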
The simulant lived in another universe with its physics model; they were safe from the home universe ending, accidents, threats or medical problems, and it continued appearing like the home universe. Bodies, simulations, quantum science, and research-based teleportation medicine were used. I teleported the subatomic particle in its object within bounds, and everything, including computers, was simulated, even with a neural engine. +5. The lecturer said the quantum computer needed to work because it was good. The particle's friend was close to it and could be used more efficiently to create a network because of the electromagnetic field. I programmed the desired results and negative (failing) cases, expecting the particles to spiritually follow the S2A algorithm, similar to programming a crystal, even if multiple particles helped the particle to achieve the result. They are naturally intelligent, with consciousness. They worked on a best of three trials more quickly than a computer because they were smaller. +6. The lecturer found algorithms like AGI using CAW for shorter solutions, mainly simulation. I temporarily entangled and locked the neighbouring particles to connect to the computer. I mind-read the specific frequencies of the particles, like a person, to check if they were there, ready with instructions and rewards such as a beep, each with 50 high distinctions. I then conducted the twenty-minute experiment, giving the computer components a break to avoid fatigue. I read the result of the quantum simulation using the classical computer, replacing S2A with CAW to find results after twenty minutes, after optimising types and patterns. +7. The lecturer unlocked the breasoning values and removed the breasonings after use. In the quantum computer, I locked breasoning values, of which there were only x, y and z, to get the particles' attention. They had small communities at their level of resolution and enjoyed helping. 
They had privacy and levels of involvement and were equally spaced. They were left as they were found. +8. At the end of the five minutes, the lecturer showed the particles the movie of the computation involving thousands of particles and left them alone for the day or maybe forever, depending on their comments, two visits maximum. Still, it was virtual, so we could do what we wanted. I had a 2D map of the particles. I collected their comments on the 50 high distinctions at each step of communicating with a particle, with Hollywood accuracy (giving a film set high distinctions for the imagery to be high quality). Everything was copywritten using a free algorithm, including their responses, and they helped. +9. The lecturer visited other places to explore and met a professor they knew. The simulation is necessary for people to exist in the same space simultaneously. There was enough food and resources for everyone. I had the same quality of life as before, appearing in the apartment and walking outside. Everything looked the same as the home universe. +10. The lecturer only visited safe planets. I commented that Ancient Romans, car travellers, or even room occupants could travel through the universe, quickly arriving before going to the toilet. They had high distinctions and travelled between Melbourne and Sydney, in other words, other planets. When recognised as myself or the apartment occupant, I could hire a car and automatically travel, planning to arrive home before the end of the day. I visited one planet, then returned. +11. The lecturer enjoyed the high quality of life, breasoning their high distinctions over an extended period. The simulation is a phenomenon to discover, just as crystals can be programmed with algorithms. I was more forceful, guiding the people to safe places and indicating that they should meditate and replace their bodies. 
One of the only differences was in the people on the street (sometimes obviously human animals and robots), where cars, clothes, and shops were the same. If someone just joined a simulation, they met non-simulants for the first 16 days. +12. The lecturer used variables to develop algorithms with S2A quickly. S2A specs could have variables (e.g. [a,b,c,'A1',d,e] or "abcde"). These variables had the same format as Prolog variables or started with a capital letter, then had 0 or more other letters and numbers. These variables prevented the need to enter two or more specs to cause variables to be formed. It was assumed the non-variables were constants. +13. The lecturer built an optimised machine from quantum particles. In S2A, I detected the same or a subset of the data and made an optimisation. I optimised the S2A specs by detecting if a repeat was a repeat of n characters, whether recursions or predicates could be merged, and whether the same data meant an optimisation was possible. A recursion could be omitted if it added 0, multiplied by 1, concatenated "" or appended []. If two predicates decomposed and built the same list, they could be merged. +14. The lecturer gave only the correct results using the algorithm generated by S2A. I fixed the result of the generated S2A algorithm by using the map corresponding to the output in the decision tree. When selecting a map from the map list, it should have the same index as the output, particularly the clause number, found with subterm with address. This method prevented the algorithm from unwantedly finding incorrect results with a map for the wrong clause. For example, the map [A,B,c]->A had a different result from [B,A,e]->A. +15. The lecturer gave non-deterministic results for an input with more than one output. I gave non-deterministic output in S2A by inserting "nd" (standing for non-deterministic) in the output. For example, [a,[nd,[[b],[c]]]] produced [a,b] and [a,c].
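The "nd" expansion in item 15 can be sketched in Python. This is a hypothetical illustration of the behaviour described, not the List Prolog implementation; the function name is an assumption:

```python
def expand_nd(output_spec):
    # Expand an output spec containing [nd, [alt1, alt2, ...]] markers
    # into one concrete output per alternative, so that
    # ['a', ['nd', [['b'], ['c']]]] yields ['a', 'b'] and ['a', 'c'].
    results = [[]]
    for item in output_spec:
        if isinstance(item, list) and item and item[0] == 'nd':
            # Each alternative extends every partial result, playing the
            # role of a solution returned non-deterministically by member/2.
            results = [r + alt for r in results for alt in item[1]]
        else:
            results = [r + [item] for r in results]
    return results
```

Here expand_nd(['a', ['nd', [['b'], ['c']]]]) returns [['a', 'b'], ['a', 'c']], matching the example in item 15.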
The algorithm removed "nd" and produced these results as separate outputs using the Prolog command "member". Non-deterministic results were formed when specs differed in the output in the decision tree. +16. The lecturer reduced formulas over multiple rows and columns to a repeating pattern to recognise data with different numbers of rows and columns. The next generation of Spreadsheet Formula Finder (SSFF) finds values in the same row or column and their sum, difference, or multiple in some reports. It used a variant of Spec to Algorithm to collect and form output from input by recognising variables in tables and entering and calculating formulas. SSFF can automatically find these formulas using a formula finder. Other formulas, such as the string-to-table converter, converted input to terms; "group by consecutive number" converted strings to terms with values to find formulas; and S2A found output from the input using the correct format. \ No newline at end of file diff --git a/Lucian-Academy/Books/POLITICS/Big Idea Canceller.txt b/Lucian-Academy/Books/POLITICS/Big Idea Canceller.txt new file mode 100644 index 0000000000000000000000000000000000000000..fd6b831b494ae12e23d32956f92f564c9d830f1d --- /dev/null +++ b/Lucian-Academy/Books/POLITICS/Big Idea Canceller.txt @@ -0,0 +1,38 @@ +["Green, L 2022, Big Idea Canceller, Lucian Academy Press, Melbourne.","Green, L 2022",1,"Big Idea Canceller + +1. I cancelled the unrelated As. +2. I did this across the board, for all ideas I wrote As for, in each department. +3. This was professional and recognised as such. +4. I carried the supported A. +5. I identified the supported A. +6. I wrote the parts of the A. +7. I found the As' parts' subparts. +8. It stops with the shells. +9. The As were data structures. +10. The As for the As were cleaned by using the same and plain language. +11. The same language as the data structures was used. +12. Additional As used plain language from the same department. +13.
The algorithm for a topic's idea in Master's also used the same and plain language. +14. The algorithm had a very simple version, a few lines long. +15. The other version of the algorithm was a big idea. +16. There needed to be many of them. +17. Music and acting were good ideas and helped with memory. +18. I chose a single department at a time. +19. I identified the non-relevant ideas. +20. I identified and stopped any sources of unprofessionalism. +21. I stopped supporting non-relevant As. +22. I stopped carrying the parts of the non-relevant A. +23. I stopped giving out the subparts of the non-relevant A. +24. I didn't sell anything above the A. +25. I didn't sell anything below the A. +26. I didn't sell anything before the A. +27. I didn't sell anything after the A. +28. I analysed and didn't support anything but the data structures. +29. I stopped the use of non-same language. +30. I stopped the use of non-plain language. +31. I prevented forgetting to write the simple algorithm. +32. I stopped writing too many lines of the algorithm. +33. I stopped writing too few algorithms. +34. I avoided forgetting music. +35. I avoided forgetting acting. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/POPOLOGY/Different Things Return to Being Positive.txt b/Lucian-Academy/Books/POPOLOGY/Different Things Return to Being Positive.txt new file mode 100644 index 0000000000000000000000000000000000000000..dbf5b24afba000fbad670399479e6a7d8c6af473 --- /dev/null +++ b/Lucian-Academy/Books/POPOLOGY/Different Things Return to Being Positive.txt @@ -0,0 +1,14 @@ +["Green, L 2024, Different Things Return to Being Positive, Lucian Academy Press, Melbourne.","Green, L 2024",1,"Different Things Return to Being Positive + +1. RR on MN on RD. A greater algorithm count indicates more diverse arguments and a lesser algorithm count indicates staying on the topic. The verb is \"change\". They change from positive. What are different things? LG: Any things.
The change is positive. +2. RR on MN on PR on RD. It should be more positive than before. Positivity should be increased. Identify positivity. Aim for greater positivity. Make the positive change. +3. RD on MN on RD. The English Language Program can be part of the essay. The value changed to positive where there was more time spent analysing the view where the physical constant, the train's speed, decreased. There is no train; I like the values. The words in the essay return to being relevant or on the topic and programmed in the essay. The students should write relevant words and program them. +4. RD on MN on PR on RD. For me, structure is the connections, argument is the algorithms, exposition is grouped algorithms and critique is those groupings with new algorithms, and the mark increases in positivity. Why are essay connections also relevant and programmed? I am given the topic \"knowing\" and the first sentence \"I see you\". I am given the second sentence \"You see me\". \"I see you\" so \"I know you see me\". Clarity is the optic adhered to. +5. MO on MN on RD. I solve negative differences. We need to define positive. LG: I changed to the positive action. What are these edges around it? Positivity is differently interesting. Differences are positive. Different things don't (do) return to being positive. +6. MO on MN on PR on RD. What if terms fluctuate between positive values? We should therefore start with positive values. Each part of the mathematical sentence should change to positive. 1st to 3rd number. + to -. 2nd to 4th number. +7. AM on MN on RD. B1 is arbitrary, B2 is positive. If A contains B, B1 -> B2. A contains B. B1 is there. B1 -> B2. +8. AM on MN on PR on RD. It should be wise to change the different thing to positive. Changing is returning. It should exist. It should be positive. It should be necessary. +9. RR on RR. Why should English language run Prolog language?
+ + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/POPOLOGY/The person sees everything as a positive thing.txt b/Lucian-Academy/Books/POPOLOGY/The person sees everything as a positive thing.txt new file mode 100644 index 0000000000000000000000000000000000000000..fb7701e149c4253767800a100b6b5fa135a8f3af --- /dev/null +++ b/Lucian-Academy/Books/POPOLOGY/The person sees everything as a positive thing.txt @@ -0,0 +1,182 @@ +["Green, L 2022, POPOLOGY, Lucian Academy Press, Melbourne.","Green, L 2022",1,"The person sees everything as a positive thing + +NC Noam Chomsky +RR Richard Rorty +RD Richard Dawkins +MO Michel Onfray +AM Alexius Meinong +MN Martha Nussbaum +PR Press Release + +AM on MO on NC +1. I prepared to write about how the robot would discover science. I corrected the systematic error (I wrote the idea) by the belief in science. There should be safety. The robot should remain correct and correct errors. People should go around errors (stay on the positive path) in science. +People should circumnavigate robot beliefs (stay on the positive path). +AM on MO on PR on NC +2. I saw the result of the copula as positive. It was positive, so I didn't have to do it. I corrected everything to be positive constructions in science and multiplied their use of them. I shaded in meaning by categorising words on the positive or negative axis. I typed the positive breasoning as working. +MN on MO on NC +3. The plant was helpful (non-toxic). All plants are positive. It was correct that the seeds had the same effect. Also, it was good that the leaf had the effect. Finally, it was positive that the pounder had the desired result. +MN on MO on PR on NC +4. Exposition was helpful. The critique was helpful. Reading was all positive as a result of the copula. Also, thinking was all positive as a result of the copula. Finally, being was all positive as a result of the copula. +AM on NC +5. The institution worked. The way around errors is positive until corrected. 
Then, the non-positive thought disappeared while a positive one remained. I did this by performing the job task. Finally, I served the job. +AM on PR on NC +6. Positivity is robots discovering science. People have positive relations in space, so everything is seen as positive. Learning is learning from positivity (from the robot discovering science). Unity from liberation is positivity from robots discovering science (following head of state). Avoiding health problems (maintaining health and happiness) is positivity from robots discovering science. +NC on AM on NC +7. There should be equality in gender pedagogies. The people and I agree with the institution working. Cosmology (represented by the group) brought positivity to discovering science (gender equality). I agree with sexuality working. +NC on AM on PR on NC +8. Positivity is more logical. Why is everything seen as positive from AM on PR on NC because it is positively functional? It has positive terms, like the discovery which it has. It is part of a positive sentence, which it has discovered. I used the sentence positively, which it has found. Positive functionalism discovers positive science. +RR on AM on NC +9. I found how people could be useful. I want to solve and agree with all their points. As a result, I moved up. I moved forward. I achieved the goal. I prepared to replace others when needed. +RR on AM on PR on NC +10. If something is missing, there is recovery time. Nodes are why the objects are needed to recover, and I think of a good reason. I agree that harmlessness is right. I am already thinking of it. About this, I am already right. Harmlessness can sustain us. +RD on AM on NC +11. The subject is positive because it exists. If not done properly and remain otherwise, the actual thing to remove may be flawed, negating the need to keep it. The object won't disappear (will remain) if there is another argument. The product maintains safety. I will keep the object if it works. 
The counter-argument is that I will keep the object if it uses products. +RD on AM on PR on NC +12. All in a state of being can be positive. The well-known meeting in everything can be positive. The prerequisite faith and faith can be positive. Stillness in everything can be positive. All relations in space can be positive. +MO on AM on NC +13. Everything liking in science can be positive. Everything in science can be positive (not else). In the rule of freedom, everything can be positive because the robot discovers it to be. As the self is positive, the robot is in science. As the other is positive, the robot is in science. +MO on AM on PR on NC +14. Everyone agrees with positivity and discovers science and self-preservation. Everyone agrees with self-preservation (agrees with positivity) when discovering science. Everyone agrees with happiness, of which positivity is a small part, which everyone can apply to science. The self likes the other in a positive way as a way of discovering science. I agree with positive likeness. +MN on AM on NC +15. I agree with positive nothingness. The robot discovered science in which everything can be positive. Positivism is necessary. The robot discovered science, in which everything can be possible. The robot discovered science, which resolves everything. +MN on AM on PR on NC +16. I agree with people and science. Positivity is robots discovering popology. However, discovering things is science. Popology is the science of people. People as against objects are the subject of popology. +MN on NC +17. I programmed the program finder. Pedagogy is possible with robots. First, I found where the result was. Then I found the way to return it. Finally, I computed it at top level. +MN on PR on NC +18. I will do it safely. What about green space? I suggest economic uses. People will do everything this requires on our planet. It works. +NC on MN on NC +19. 
The person is safe and happy because of thinking about a positive synthesis of results. Pedagogy has nothing to do with popology (pedagogy enables our simulation and the program finder makes results available to us in a spiritual form that we can detect). I trust pedagogy, about which it is possible to receive results in the simulation. I trust popology (the person is positively able to work because he hasn't fallen in a hole). I trust societology (the setting is safe for society). +NC on MN on PR on NC +20. What's the point of popology? Safety is popologically interesting. For example, the person should simulate space in the simulation. I determined the initial point. Then, the person should simulate time in the simulation. Finally, I determined the safe path between the two points. +RR on MN on NC +21. The person transcribed the speech. Can everything be positive when receiving results in the simulation? All the person's thoughts are positive because of the configuration of results, for example, asking people to stop teasing to prevent a headache (remain on a comfortable path). I avoided unwell people (stayed on a positive path). +Unwell people should be taken care of (by healthy people). So I avoided places where people were sick (visited healthy people). +RR on MN on PR on NC +22. What is this breasonings menace (groups of arguments of breasonings help access levels of consciousness)? Space is more valuable than popology (existence is useful by itself). Space is positive. Positive nothingness is part of popology. Why is positive nothingness part of popology (people contain levels of consciousness)? +RD on MN on NC +23. What is an example of a positive experiment (everything is part of a positive meditation experiment)? +What is the relevance to science (people see everything as a positive thing to use as a basis for science, which the robot discovers)? What is the relevance to arts (philosophy of science counting axioms, human knowledge and decisions)?
What is relevant to the education institution (it provides accreditation and research)? Finally, what is the relevance to the world (positivity means the world will function adequately)? +RD on MN on PR on NC +24. I am happy (I have arguments for each item that makes me happy). Why do I have to go to space (I will simulate it)? It is philosophy (true, not false, that everything can be a positive thing). Can Lucianic doctors practise without full degrees (the medicine people can breason out the degree and practise the daily regimen to remain skilled)? Why is that part relevant - everything (the people's feelings) can be positive (happy). +MO on MN on NC +25. The bringer is closer to the creator. People should bracket finite breasonings in the simulation, like agreeing with a positive idea. Finite data associated with results is enough. A friend brings the result. Another friend goes through the result with me. +MO on MN on PR on NC +26. There was harmony in space. The space robots should take care of us as people should. What if something is missing (I fill everything in)? +Everything in the green space can be positive. Everything that I was aware of in the simulation was positive. +AM on MN on NC +27. The subordinate helped the leader. A parent is firstly responsible for the intelligence of the child. The self is responsible for the intelligence of the self. +The self traversed the maze. The self used green equipment. +AM on MN on PR on NC +28. I thought of space as clearly starting with the self. People should critically solve everything. Beginning with the self, everyone should be medically healthy. I avoided the charging bull (walked in safety). I avoided the animal pulling me over with its cord (I walked around the edge). +RD on RD +29. What is the train metaphor? Why is everything positive? What is so positive about physics (everything is positive in the physics simulation)? +What is so positive about physics laws? +Why is the simulation positive? 
+RD on PR on RD +30. Why is medicine returning to positive - isn't that criticality's job? What do you think about the philosophical reasons? What about ontologies? Also, what about criticality? What about medicine? +NC on RD +31. There should be positive ontologies/multiverse (good thoughts). I like the ontologies. What if there is a mistake about the train? What if there is a mistake about the passenger? What if there is a mistake about the carried object? +NC on PR on RD +32. What if realism goes wrong (functions well)? The train's path is human-determined. There is action-reaction in the train carriage connection hooks. Everyone travels in groups. +RR on NC on RD +33. Why is everything computer science? Gravity is transparent. What is the purpose of all this business - the train as a metaphor for everything? What is the point of there being no people on these computer-science metaphor trains? Why is there a print-out of the train data structure on the train seat? It is untenable (the people are positive objects on the food program train). + +RR on NC on PR on RD +34. Everything should be positive in the multiverse. I don't like the multiverses much because we can't access upper, lower or lateral ones. We could understand the science of other universes by simulating them. What if there are mistakes, or good thoughts instead, in the universes of other planets? Maybe we can join together, look on and simulate them too? What if only brains are conscious, and experience is subjective, and the whole universe is simulated? I am happy to have known you too. +RD on NC on RD +35. Why doesn't everything go upside down? Gravity causes them to stop the right way up. Why is gravity transparent? It is constant. What is gravity for? It allows us to develop and stay with the planet. Why doesn't everything fall in? Because gravity is only a certain amount. Why doesn't everything whirl around? Because there is attraction with the surface. +RD on NC on PR on RD +36.
Why are people everything? What do ontologies have to do with the multiverse? They keep the people happy. They help link things up. Why is happy positive? +MO on NC on RD +37. Everything in the human realm can be positive. Why is the train's path human-determined? It refers to human-made and human-controlled objects. What about humans? They are both human-made and human-controlled. What about cases of lack of control of humans, or rather of these objects, by humans? They would start again and pick the sieve up from the floor. What about idealism if errors can occur? It is what we're based on. +MO on NC on PR on RD +38. Everyone can be divine. How will a plant grow? It is like a train having carriages added to it. In the train, or tree of the multiverse, couldn't a scientist explore science more than a scientist in a parent universe? Why isn't this Lucianic Medicine everywhere? What about Lucianic Meditation? +AM on NC on RD +39. Everyone has what they want. Why is everyone everything? Everyone can help with inanimate objects. Everyone is happy. Everyone is good. +AM on NC on PR on RD +40. If everything is represented in text, then it is positive. I think language is the most important aspect in \"Everything is seen as a positive thing\". Positive language stands alone. It is an independent idea. There are non-monotonicities in realism. It can be connected with other ideas. +MN on NC on RD +41. I prefer physicalism to realism and idealism. What is so intelligent about the simulation? I ate food. I listened to music. +RR on RD +42. What is the use for a positive thing? What is around positivity or solving all things? What will solve it? How is it good? What will come of it? +RR on PR on RD +43. The Medicine B is that all levels are good. I don't mind 50 As firming up positivity. The positive answer was aimed at by changing the question's ontology. The human animal agreed.
I prevented unwanted thoughts by using the medical not and bolt (detailed end of the quantum switch process). +MN on NC on PR on RD +43.1 Can't we have exactness in induction? We should return the positive ontologies being about people. Ontology is how they find it. Ontology is how they know it. Ontology is how they do it. +NC on RR on RD +44. Which of the two items does \"use\" refer to? Is it the quantum formula finder? Is it the quantum program finder? Is it the quantum essay writer? +NC on RR on PR on RD +45. All levels are seen as being positive levels. I want to agree with all levels being positive. I know the level. I know the item number. I know the item. +RD on RR on RD +46. The quantum algorithms are seen as being positive. I confirm that Green correctly described the quantum algorithms. The qubits entangle (link to) another qubit. Two qubits are chosen. A chain of qubits satisfying the question is found. +RD on RR on PR on RD +47. The ontologies of all levels were seen as a positive thing. Fifty As firm up positivity. Why does a positive answer lead to all levels? Human animals do exist. I just didn't think of an unwanted thought. +MO on RR on RD +48. I like the seen-as use. What is all this seen-as business? I like the seen-as-ness. I like the seen-as positivity. I like the seen-as solving. +MO on RR on PR on RD +49. I observed the seen-as quantum switch. I observed the seen-as 50 As. I observed the seen-as ontology. I observed the seen-as humans. I observed the seen-as animals. +AM on RR on RD +50. Everything is seen as a positive thing for me. I endorse positive things. I endorse the relating of them. I endorse turntables. I like the other. +AM on RR on PR on RD +51. Medicine is seen as a positive thing. I endorse medicine with you. I endorse the sun. Nietzsche helps me. I wrote about Nietzsche. +MN on RR on RD +52. I completed 50 As for each side degree for undergraduate, Master's and PhD. Forward movement is the positive direction. I found the goal.
I moved towards it. I achieved it. +MN on RR on PR on RD +53. Meditation required medicine. I noticed psychology. I noticed that all levels were good. I noticed the seen-as version. I noticed that it was good too. +MO on RD +54. All values in all cells of the Venn diagram were positive. I completed the gay propositional universe programming about the train. I found the existent. I found the necessary. I did not need the possible. +MO on PR on RD +55. I augmented reality to positive medicine. I used spiritual medicine. I had 50 As for positivity. I had spiritual medicine for the other. I wrote about positive terms having As. +NC on MO on RD +56. Quality is better than knowing or feeling. Propositions may be like ontology and structure. They are like them by necessary theories. They can be. I agree with propositions. I agree with ontologies. +NC on MO on PR on RD +57. Spiritual medicine is Popology. Ontologies are propositions. There is ontological popology. There is propositional popology. There is ontological popology. Ontology should be propositions. +RR on MO on RD +58. Head of state should be positive. We want propositional universe programming to be good. Morals should allow for future science, where propositions indicate morals. Popological scientists observe the phenomenon, then research how it works. The object is named by the phenomenon. +RR on MO on PR on RD +59. I found the verifiers. There should be an A for taking over the world. I found the army. I found the scientists to help with it. I found the people to take over the people. +RD on MO on RD +60. It is the best ever. The Head will take over the world. The self will be central in it. They will do pedagogy. The rest will include meditation and medicine. +RD on MO on PR on RD +61. They should still do it. They should do it because it is definitive and they want it. They will record it. They will accept people who want to do it and study it. It is good for them. They will overcome blocks to doing it.
+AM on MO on RD +62. Babies cry because of recordings. Propositional universe programming is indicated by babies. I like a large enough quantity of it. It should be easy. They should feel confident to do it. The baby can't help falling asleep. +AM on MO on PR on RD +63. Breason out meditation too. Specific pedagogical medicine and specific pedagogical treatment stop crying. Think of medicine. I agree with the treatment. Apply them. +MN on MO on RD +64. I wrote the computational law game. Law was tested correctedness (sic). I prefer functionalism, or propositions to ontologies. What's necessary and possible? Law is. +MN on MO on PR on RD +65. I think it doesn't matter as long as people don't find fault with citations. Solving negatively is a reason for positivity. I prefer positivity to negativity. Positivity is the reason. I solve negativity to keep negativity out. I don't like it solved too clearly in case there are little things coming from them. Positive things come from the positive thoughts. The self knows thoughts now. +AM on RD +66. It is about everything being seen as a positive thing. Regarding the train metaphor, I want to remain to be them. I want to hitch onto my own ideas. I hitch onto others' ideas about mine. The main idea is that I can do any idea I want to, for example, writing. If the children have problems, they can have specific help when they need it. +AM on PR on RD +67. The mindmap is everything. I can mindmap. I can think. I can clear it up. I checked it. +NC on AM on RD +68. I need other people. It is about the self's moment. This is not intended. Non-being or miscarriage is solved with a blessing of conception by a breasonings chapter being thought of. The neurotoxicity purge in the weekly regimen will work. So, no more psychiatric medication will be needed for the self. +NC on AM on PR on RD +69. The uses for pedagogy are astounding. These ways of thinking are relevant. This interdisciplinary area of study can return.
For example, it is pedagogy. We can have it back. +RR on AM on RD +70. And economics, education and the area of study itself should return. I need to hitch onto journalism. I need to hitch onto acting. I need to hitch onto creative writing. I need to hitch onto these for professor. +RR on AM on PR on RD +71. Everything should be seen as a positive thing with a positive reason as well. The object is between the future and past tense. I agreed with the head of the men. I know these departments will work. The point is people being interested in the self, symbolising Pedagogy and other high quality thoughts and everything else. The point of all these things that appear is that everyone should read everyone's texts and then everything will be all right. +RD on AM on RD +72. Societal pedagogy is correct at the moment. This positivity business is a positivity chapter when all of it could be positive, and it is. And I know you are positive and I am too. I concur. It is good. +RD on AM on PR on RD +73. People are trying to ensure that head and non-head are positive. Earth was foretold by ancient prophets. People may start again. I like it. I like the other. +MO on AM on RD +74. Sentences are checked by essay writer's program finder. Everyone should have their own academy. I can write down ideas. I can sew them into arguments. I know they are right. +MO on AM on PR on RD +75. I touched up the presentation. I updated the program finder with new ontologies. I wrote the program graphics finder to display text boxes and models. I selected the clearer image using the graphics finder. I wrote the similar sentence with different algorithm finder. +MN on AM on RD +76. The train metaphor is a good thing. I agree with the train. I agree with each of the connections. I agree with the carriages. It runs on time. +MN on RD +77. The parser is good. The train parsed the sentence. I parsed the verb. I parsed the subject. I parsed the object. +MN on PR on RD +78. 
The mathematical parser is good. The train parsed the mathematical sentence. I parsed the first number. I parsed the plus sign. I parsed the second number. +NC on MN on RD +79. The parser relates to the topic. Writing the sentence matters. Everything is positive. The sentence is not written a priori, a posteriori, but à présent, meaning it is the event. That is correct. Writing can help act. LG: In this case, there is no other act. That is the point. People are not codes. LG: Computers can computationally check everything, all combinations. NC: They don't need to because philosophy is based on human lives. +NC on MN on PR on RD +80. To the other everything is computers, to me everything is people. What do you mean, people could be replaced? It is impossible because we need something to compare things with. The other requires it too. I like computers. LG: Both are positive. However, there is a paradox of where to start. The computer cannot respond to trauma. You like them. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/REPLICATION/.DS_Store b/Lucian-Academy/Books/REPLICATION/.DS_Store new file mode 100644 index 0000000000000000000000000000000000000000..5008ddfcf53c02e82d7eee2e57c38e5672ef89f6 Binary files /dev/null and b/Lucian-Academy/Books/REPLICATION/.DS_Store differ diff --git a/Lucian-Academy/Books/REPLICATION/Replicator 1.txt b/Lucian-Academy/Books/REPLICATION/Replicator 1.txt new file mode 100644 index 0000000000000000000000000000000000000000..21c5be6deb7ef77c017932ddc51d148dc81aeb18 --- /dev/null +++ b/Lucian-Academy/Books/REPLICATION/Replicator 1.txt @@ -0,0 +1,87 @@ +["Green, L 2022, Replicator 1, Lucian Academy Press, Melbourne.","Green, L 2022",1,"Replicator 1 + +1. The replicated object corresponded to a person. +2. The animal was replicated. +3. The plant was replicated. +4. The computer was replicated. +5. The in-house computer software was replicated. +6. The robot was replicated. +7. 
The article was replicated when the writer felt tired. +8. The electronic device was replicated. +9. The building was replicated. +10. The planet was repli-terraformed. +11. The merchandise was replicated (manufactured). +12. The medical effect was replicated (needed parts were provided when needed). +13. The effect on the computer was replicated. +14. The exact algorithm or writing needed to be finalised before replication. +15. The replicant replicated more replicants (for example, to increase the population). +16. The person became immortal using replication and vaporisation. +17. The person partially became an android. +18. I replicated what there wasn't time to manufacture. +19. I replicated and vaporised sound equipment before and after an event, respectively. +20. I replicated the element. +21. I replicated the torch to explore at night. +22. I replicated the sun. +23. I replicated equipment to check that people were following safety rules properly. +24. I replicated inspirations for students and programmers. +25. I replicated muscles. +26. I replicated a determiner to visitors to the house and helped simplify a tracked person's pathway. +27. I replicated accommodation, food and water to alleviate poverty. +28. I vaporised the cause of the volcanic eruption. +29. I made and replicated a 3D print-out of a house. +30. I gave myself a facial with replication. +31. I found and removed foreign bodies in my body with replication. +32. I gave myself suction cup treatment with replication. +33. I had a drink in the desert with replication. +34. I tonified my blood with replication. +35. I replicated the messages between members of the office. +36. I purified and cooled the air using replication. +37. I recovered the back-up using replication. +38. I replicated rather than maintained equipment. +39. I used the replicated internet, which replicated signals inter-galactically. +40. I ate replicated food on the space craft. +41. I replicated necessary equipment on the space craft. +42. I replicated the internet in the future. +43. I replicated the friends in the future. +44. I replicated the algorithm without static analysis errors. +45. I typed up the algorithm using replication. +46. I transcribed the talk using replication. +47. I translated the thoughts into famousness (wrote) them by replicating them. +48. I checked for errors and removed them using replication. +49. I debugged the algorithm by replicating it. +50. I replicated the product, customer, their life and income and findable thoughts at each point of the sales funnel. +51. I optimised the algorithm and replicated it. +52. I translated and replicated the text. +53. I wrote the algorithm in another programming language and replicated it. +54. I reduced and found the functional version of the algorithms and replicated them. +55. I replicated the documentation for the algorithms. +56. I replicated the business (employees). +57. I replicated the simulation's workload with computers. +58. I replicated the teleporting space ship. +59. I replicated the robot body part. +60. I replicated the humanness (fine thoughts, and formation of text-to-breasoning support) of the robot. +61. I replicated the acoustic reflector. +62. I replicated the electronic component. +63. I replicated the visor to stop sunlight entering the eye. +64. I replicated space craft with self-correcting paths to stop deviations. +65. I replicated an algorithm to help release the stable version of the algorithm (matches the desired output). +66. I replicated the needed therapy. +67. I vaporised toxins from the body. +68. I removed any unnecessary code using replication. +69. I benefited from the minimum help using replication. +70. I uncommented the line of code to enable the algorithm with replication. +71. I debugged the algorithm as part of the tutorial and replicated it. +72. I replicated the employee when needed. +73. I replicated the script when needed. +74. I replicated customers on the tollway. +75. I made the algorithm easy to modify and replicated it. +76. I replicated the friend to keep the friends together. +77. I replicated the infant under one year. +78. I replicated the whole history on disk. +79. I repli-manufactured items when the people had been paid and enough items were needed. +80. I replicated another product when one was returned. +81. I rigorously tested the replicated product, to perfect the replication process. +82. I used replication to increase returns to surpass expenditure. + + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/REPLICATION/Replicator 2.txt b/Lucian-Academy/Books/REPLICATION/Replicator 2.txt new file mode 100644 index 0000000000000000000000000000000000000000..cc9edbc78759667a8582e0fea10a6bae27b10746 --- /dev/null +++ b/Lucian-Academy/Books/REPLICATION/Replicator 2.txt @@ -0,0 +1,18 @@ +["Green, L 2024, Replicator 2, Lucian Academy Press, Melbourne.","Green, L 2024",1,"Replicator 2 + +1. The replicator combined the different sales sites to track and recommend bundles of products. I accepted payments for song albums. I wrote the algorithm that inspired the song. I mind-composed the song using an algorithm I had written. I remade a website for song downloads and donations, keeping track of the most popular downloads. +2. The replicator repaid their education institution fees. I accepted payments for books. I translated the text files. I converted the text files into an HTML file with anchors. I included a graphical front cover. +3. The replicator began to become self-sufficient. I accepted payments for algorithms. As with other products, it was up to the customer’s honour to pay for the products they used. The free products promoted popularity, and they spent more seriously when they used them. I agree that open-source software supports ease of use, creativity, and freedom of development. +4. 
The replicator found better grades in history and philosophy of science helped them gain entry into a postgraduate qualification. In music, I developed courses about my life, arguments, and algorithms’ science (effects, ethics, and metaphor). This idea was a separate product from music and was part of the courses. The history and philosophy of science courses were various (automatable) because they related to students’ interests and careers, which helped determine the structure of the courses. The courses were helpable (sic) because they used a format with questions such as “What ethical implications does this discovery pose?” and “What are the good/bad sides of this discovery?”. +5. The replicator used computer science for meditation, philosophy and business. In computer science, I developed courses that broke the repositories into projects that enabled students to learn how to modify them and write similar algorithms. They contained interactive activities that explained and helped write code. The courses were various because students’ knowledge and skills determined them and were always entry-level. The courses were helpable because they were as easy as using Program Finder with Types, which could grapple with pattern-matching, recursion and specific techniques with unique requirements. +6. The replicator published a small selection of texts in various languages. I designed a post-generative AI Cultural Translation Tool (CTT) that helped translate websites and books. It used the same idea of automatically translating recurring sentences once and substituting them back into the original. I used generative AI to translate denoted sections of texts and algorithms to publish e-books and websites quickly. It worked best with a grammar checker to arrive at the best possible results or relied on generative AI to do this. +7. The replicator recommended testing translated code if higher-order code translation or numbers caused bugs. 
In CTT, I inserted open/closed tags in code to denote translatable regions and, optionally, the language. I listed these for the user to confirm or automatically confirm or reject them based on their folder or file location. I optionally rejected filenames and strings that contributed to filenames. +8. The replicator optionally dispensed with back-translation, which would take too long to check, and relied on bulk back-translation. CTT retranslated only unknown strings and preserved newlines and formatting in translations. The algorithm outputted a file with the strings by themselves, which the user shouldn’t change (but could as the strings were number-code delimited) to check the grammar and translation. Lines deleted in the original were deleted. Using a version control system allowed reverting to a previous version of the code and CTT data files. +9. The replicator avoided intensive term processing using subterm with address. CTT used Prolog to List Prolog and subterm with address to quickly find strings in source files. I optionally searched for atoms. Some languages and countries have unique numeric, financial, cultural, or linguistic systems requiring manual modification before or after translation. Following the translation of denoted parts of strings, I could substitute strings back into the original. +10. The replicator put string fragments back together at the end. To translate them, I split strings in CTT by newline, tab, and tag delimiter. I stripped strings of white space, such as newlines, spaces, and tabs at the starts and ends, and concatenated this formatting after translation. I assumed sentences wouldn’t be split because they would be in the same cell or use margins. I considered translating open-source code for easier manipulation, using compilation commands if necessary. +11. The replicator tried the automatic translate and publish feature. I listed repeated lines once in the bulk translation file, which became part of the CTT data file. 
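Items 8 to 11 above describe CTT translating repeated strings once, retranslating only unknown strings, and stripping leading and trailing whitespace before translation and re-attaching it afterwards. That bookkeeping might be sketched as follows (a minimal illustration in Python rather than the Prolog CTT itself; `translate_lines` and the `translate` callback are hypothetical names, not CTT's API):

```python
# Illustrative sketch only (not CTT's actual code): translate each unique
# stripped string once via a cache, then substitute translations back,
# re-attaching the original leading/trailing whitespace of each line.

def translate_lines(lines, translate, cache=None):
    cache = {} if cache is None else cache  # reuse known translations
    out = []
    for line in lines:
        core = line.strip()
        if core == "":
            out.append(line)  # preserve blank/formatting-only lines
            continue
        if core not in cache:  # retranslate only unknown strings
            cache[core] = translate(core)
        # re-attach the stripped leading and trailing whitespace
        prefix = line[:len(line) - len(line.lstrip())]
        suffix = line[len(line.rstrip()):]
        out.append(prefix + cache[core] + suffix)
    return out
```

A cache persisted between runs would give the "retranslate only unknown strings" behaviour across versions of a file.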
For meaning errors, I could check the back translation, which was translated into the original language. I could individually change the original and the translation if there were errors. I could check the original context using HTML form and hyperlinks. +12. The replicator explained that the tags changed the strings from “string” to “<&td>string”, possibly with a language code inserted. There was a converter to the CTT tag form for translation and to the non-tag form afterwards. I wrote or approved the tag file or automatically translated all strings in any listed folders or files. The algorithm detected whether there were any split sentences, for example, non-heading sentences without a full stop that led to the rest of the sentence without a capital letter at the start of it, and translated them as a whole before substituting them back. I removed the tags to run the code. +13. The replicator tested the translated file for the correct translations in the right place, not in the wrong place and working code. CTT used one command to compile a file. This command removed the CTT tags and possibly compiled and ran the file in testing. The production version wouldn’t have tags. The run version was stored elsewhere, and the original version was edited. +14. The replicator ensured the website and texts were accessible. After translating my books, I could sell them and their courses. My mission was to teach meditation and the skills for maintaining health that it included. The teachers lived and worked in each country and in separate companies. The benefits were good pay and the ease of making connections in texts. +15. The replicator explained that “time points” required arguments and examination to pass. I planned a guide and algorithm to explain the philosophy and algorithms. The content of the philosophy was expansion and discussion about algorithms, and the guide explained the algorithms’ aim, audience and method. 
They fit into Green’s life history and his research interests. His music offered abstract angles on philosophy. +16. The replicator used sales skills to establish an online academy. I completed all the translations without user intervention, one string per query. The user tagged the strings to be translated and checked the finished product for needed changes. If satisfied, I could queue multiple languages if the program would work with translations. In the future, I could automatically upload files that are ready for viewing or downloading."] \ No newline at end of file diff --git a/Lucian-Academy/Books/SALES/.DS_Store b/Lucian-Academy/Books/SALES/.DS_Store new file mode 100644 index 0000000000000000000000000000000000000000..eeb63b3b8be2f7e8e75e651209132f03f5d2cbdd Binary files /dev/null and b/Lucian-Academy/Books/SALES/.DS_Store differ diff --git a/Lucian-Academy/Books/SALES/New Sales 1.txt b/Lucian-Academy/Books/SALES/New Sales 1.txt new file mode 100644 index 0000000000000000000000000000000000000000..5f03ea110cefd729e722c3a15ea8002c3672cee2 --- /dev/null +++ b/Lucian-Academy/Books/SALES/New Sales 1.txt @@ -0,0 +1,89 @@ +["Green, L 2024, New Sales 1, Lucian Academy Press, Melbourne.","Green, L 2024",1,"New Sales 1 + +1. The salesperson sold one of several plans with value. I sold it for the best price. I found the potential buyer. I discussed the possible prices. I sold it for the highest price. +2. The salesperson made money from advertising. I bought newspaper advertising. I chose the newspaper. It targeted my demographic. I targeted this demographic. +3. The salesperson's vision was to write on pedagogy. I conducted a sales campaign. I chose who to sell to. These were my ideal clients. They shared my vision. +4. The salesperson interested the buyer in all industries. I held an auction. The auction enticed the bidder to bid. The price looked suitable for the product. The buyer bought pedagogy because they needed it, and the seller sold pedagogy reasonably. +5. 
The salesperson creatively wrote the price. I sold it for a reasonable price. The buyer compared the product and price with others. They selected one with precisely the features they needed and which suited their budget. The price included the cost and was relatively low. +6. The salesperson could convert data files to HTML files. I sold overseas. I looked up laws about tax in different countries. I kept receipts. I customised materials for other cultures and language speakers and made them accessible. +7. The salesperson sold to a previous buyer who shared values with the company. I sold to a buyer prior. She preferred to type it in rather than use an algorithm. Typing it in was faster. In addition, she could debug it faster. +8. The salesperson was dignified as he gave credit, with good reason. I gave credit. I thought, \"Good\", so I provided a refund with credit. I gave it as expected. I used my neurons to create the solution. +9. The salesperson was professional when giving a refund. I gave a refund. The client may have a faulty product. Or the client may not need the service. Finally, the product may be wrong. +10. The salesperson understood the student's reasons for applying for credit. I set the time limit to use credit. The time limit was twelve months. This time allowed the student time to spend the credit. In addition, they would be likely to spend it in this time. +11. The salesperson gave the creditor the voucher. The credit holder gave it to someone else. The student gave the credit to a friend in exchange for a coffee. They were returning the favour. It was deemed ethical. +12. The salesperson wrote the voucher code, keeping a record of its value. Then, I sold to the credit holder. The previous credit holder bought the product. First, they used their credit to purchase the product. Then, they used the remaining credit to pay towards another product, or they paid the difference. +13. The salesperson agreed on the price with the supplier. 
I paid the supplier. I paid the supplier at the time of the contract. They supplied the goods. I checked there was nothing wrong with them. +14. The salesperson had a good track record and was interested in the latest product. I chose a possible buyer. I found the buyer. The buyer had the money, was the decision maker, wanted the product, and it was the right time. I agreed on the product and the price. +15. The immortal did the readings but had a mentor for specific advice. I sold in a commercial setting. I was professional. The buyer had good experience. The product was professional. +16. The salesperson sold their employment to the company. I paid employees. They were paid fairly and were also employed if they were disabled. They had no wage theft or secrecy in their contracts. The company paid them the correct rate and did not dismiss them illegally. + + +17. The salesperson advised on selling the company. I sold the company. The valuer considered the assets which generated profit and the liabilities or costs as part of the valuation. This valuation included long-term viability. I improved my sales, timed my exit, and found a third-party business broker and qualified potential buyers. +18. The salesperson produced reports for a fee. I consulted for companies. I helped streamline tactics. Or I helped increase profits. +19. The salesperson sold the report. I sold my management consulting knowledge. For example, I removed unnecessary elements from the plan. Or I recognised revenue streams or expenses to reduce to maximise profit. Or I determined which activities and processes added value, or I added value or fixed problems or followed the problem-solving process. +20. The salesperson negotiated the price. I sold it for a fair price. I suggested a price. The buyer indicated whether it was too high or too low. I agreed with the changes. +21. The salesperson needed money. I sold someone's goods. I identified the goods. I agreed on a price. I accepted the money. +22. 
The salesperson checked product prices individually. I sold a product for a similar price to another product. The two products were the same value. I sold the other one month before. I checked whether other factors influenced the price, such as product cost, product usability and wantedness, competition, government and legal rules, pricing aims and marketing. +23. The salesperson made the sales decisions. I represented the seller. The seller was the decision maker. I negotiated a price on their behalf. +24. The salesperson sold the grammar logic algorithm in mint condition, similar to a generative algorithm writer for naming and formats. I sold a similar product. The product had similar cost, usability and wantedness, competition, government and legal rules, pricing aims and marketing. It had the same function and appearance. It might be a generic brand. +25. The salesperson researched and delivered the answer. I determined that a buyer needed help. I read the message from the buyer asking for help. They wondered what condition the product was in. I gave them the answer. +26. The salesperson inspected evidence for a financial position. I sold to a buyer. I advertised to possible buyers. I talked with them. I sold to one of the first ones for a fair price. +27. The salesperson determined that a sale was necessary. First, I advertised to prospective buyers. Second, I found public newspapers or websites which could display advertisements. Third, I found the best price for the ad. Finally, I interviewed the prospective buyers. +28. The salesperson valued the product and checked for legal problems. Then, I gave the price to the buyer. First, I valued the product by possible profit. Then, I wrote the price in a report. Finally, I delivered it to the buyer. +29. The salesperson leveraged research and development. I brought back an old product. The old product was competitive with the new product. It was popular. I calculated that it was in demand. +30. 
The salesperson responded to the demand for an old product. There was interest in an old product. The old product was a portable version of an existing product. Or it contained features that I could bring back. Or, it supplied a feature that a change demanded. +31. The salesperson needed to cover costs. I compiled possible prices. I considered product costs and what the buyer was willing to pay. The cost could influence the price. I listed the components and total package prices from lowest to highest. +32. The salesperson asked shareholders whether they wanted to receive share value or dividends. I gave the profits to the shareholders. I paid the costs. I divided the franked dividends among the shareholders. The shareholders choose whether to receive share value or dividends. + +33. The salesperson sold for a reasonable price. Next, I sold old goods to clear them. First, I checked that the old goods were in proper condition. Then, I advertised them for sale. Finally, I sold them to an early buyer. +34. The salesperson prepared for the next sales cycle. I could use the money from sales. I invested the money from sales. There were some costs. I spent money on wages, accreditation and computer expenses. +35. The salesperson worked on open-source software. First, I checked the cost (how much it costs to manufacture). I explained that the cost was the cost to manufacture, and the price was the price to sell. Next, I checked that I had enough money to manufacture the product. Finally, I supplied enough information products. +36. The salesperson and customers were necessary. I checked the price (how much the salesperson would sell it for). I found the price based on the cost. I also factored in the competition's prices. I also considered ways of making money to cover other expenses. +37. The salesperson read all about the person and their strengths. I checked the product's sales. I checked the unit sales for the period (where unit means the product). 
I monitored the sales and saw ways of helping. I put on more staff, helped with customer service and helped with supplies. +38. The salesperson memorised and checked service criteria. Then, I gave the task of checking to the appropriate person. First, the person checked the product quality. In addition, they checked for enough supplies. Finally, they checked the premises were tidy and well-presented. +39. The salesperson undertook training. I priced the product. As a seller, I also bought supplies. I priced the product based on the cost of the supplies. I was aware that a professional buyer needed training. +40. The salesperson sold more prize-winning products. I entered the product in the competition. First, I found the appropriate category in the competition. Then, I entered the product in the competition. Finally, I tried with the entry and recorded the result. +41. The salesperson discussed the question with the manager. I conducted business to help a person. I settled on the person to help. I discussed their needs, such as an education product. I taught the person, and funding covered the cost. +42. The salesperson collected the sales points, the most rigorous philosophies. I competed for results. I wrote the most rigorous philosophies. These included many simple algorithms. It was possible to discuss a simple idea in the essay about them. +43. The salesperson checked the overall necessity of the project. I checked the product. I checked that all the students in the class were familiar with all the commands in the algorithm. Then, I tested whether they could work out the combination of commands themselves. If there was a specific requirement in the project, it needed to be both different and necessary looking enough. +44. The salesperson listed the performance criteria of the test. Then, I tested which product best achieved the result. Next, I recorded the candidate products. Then, I tried them against the hypothesis. Finally, I scored and ranked them. +45. 
The salesperson found the wording of questions, the question format and the feedback format that the customer responded to. I tested what customers reacted to. The customer's needs ranged from sustenance to intellectual and personal. They required intellectual stimulation in the form \"What is it?\" rather than \"Is it right?\". The helper algorithm, written by the student, got faster as it automatically translated staged output and input into code. +46. The salesperson made a 3D sales pitch game. I experimented in the sales lab. First, I wrote the movie and 3D game generators. I started with 2D bitmap graphics. Then, I focused on the characters' 3D models and how they moved. Finally, I made a 3D educational game in which players moved around mazes, making maps on pieces of paper. +47. The salesperson spent on the new technology. I checked the use of money. I listed the uses for the money. I took out unnecessary costs. I spent more on sustainable products. +48. The salesperson helped contribute to costs. I checked the rate of spending and saving. I spent $A. I saved $B. The return on my savings helped pay for my costs. + + +49. The salesperson and customer listened to each other's specification and answers to frequently asked questions, respectively. I reacted to customers. I listened to the customer's specification for what they wanted. I estimated delivery. I delivered the product. +50. The salesperson helped the customer with buying, using and possibly returning the product. I wrote the answers to the frequently asked questions. I listened to the customer's question. I wrote the answer. I wrote answers to frequently asked questions for others. +51. The salesperson checked the function, user friendliness, feasibility and cost of the product. I conducted product research. I found the specification. I built a prototype product. I tested the product with people. +52. The salesperson also checked the time and cost of testing. 
I investigated the feasibility of the product. I checked its development time. I checked its manufacturing time. I checked the cost of development. +53. The salesperson modified the test reader to work in other languages, such as with standard out (output to the terminal window) in C. I researched the competition. I experimented with converting non-Prolog source files into List Prolog to procure commented-out tests. This was possible in C, etc. I could modify the test script to run the tests in the interpreter in the other programming language. +54. The salesperson tested code in other languages. I wrote the code in C. I wrote the correct output in a file. I tested that the program's output and the correct output were the same in Shell. This program was in Prolog. +55. The salesperson wrote the neuronetwork to work out algorithms and breasonings. I competed for sales. I pitched to the client. The client chose one of the salespeople. This was a round-robin tournament. +56. The salesperson wrote the command line option for more memory for the interpreter. I found the short-cut neuronetwork that really found the if-then relationships in the data. In fact, I found the thoughts of the stone. I wrote down As from the perspectives of the crystals. I critically analysed the animal. +57. The salesperson sold meditation classes to schools, gyms and industry. I checked for prospects. I listed the methods of contact. I listed the prospect. I contacted the prospect. +58. The salesperson was bigger. I considered promoting my immortality time machine. It worked for meditator-pedagogues. It was essential in the army, psychiatric wards, hospitals, workplaces and universities. Some preferred to work towards it. +59. The salesperson checked for qualified leads. The new immortal was over forty. They could do the pedagogy later. The child was too young. The animal worked, but more should be done. +60. The salesperson cared for and got on well with the inanimate. The rock was left out. 
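Item 54 above describes writing the correct output to a file and checking in Shell that the program's output matches it. The same check might be sketched as follows (in Python rather than Shell; the command and file name are placeholders, not the repository's own scripts):

```python
# Rough sketch of the output-comparison test in item 54: run a program
# and compare its standard output with a file of known-correct output.
import subprocess

def output_matches(command, expected_file):
    # run the program and capture its stdout as text
    result = subprocess.run(command, capture_output=True, text=True)
    with open(expected_file) as f:
        expected = f.read()
    return result.stdout == expected
```

For example, `output_matches(["./program"], "correct_output.txt")`, assuming the program prints its result to standard output.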
However, the play, the oeuvre and the magna opera were done. I tested the music. I visited number ones in time. +61. The salesperson remained above academia. I competed by checking. I caught up with thoughts per student per day of the years. I caught up with qualification assignments, and to an extent, thoughts during the qualification. Writing came at a premium. +62. The salesperson wrote the algorithm writers with the same code and different inserts. I segmented neuronetworks by chapter and algorithm. I meant algorithms by philosophy. And I meant algorithms with simply expressed specifications by algorithms. There were algorithm writers for hierarchical file processors, file opening and saving algorithms and menu systems. +63. The salesperson tested seven changes at a time and simplified the code. I checked published prices. On a separate note, I rewrote the interpreter using DevOps. I found the first stable version. I tested each predicate. I wrote type statements and type verification code. +64. The salesperson introduced new features. I checked similar products. I rewrote the interpreter to not have Query2=[_|_], but label choicepoints. I rewrote it to not have predicates for points and choicepoints, but use one predicate. I also removed the two reverses from the interpreter. + +65. The salesperson converted the Prolog algorithm into List Prolog and converted it into Prolog for Vetusia Web Service, similar to converting it to C. I replaced the product with a similar product. The Vetusia Web Service (VWS) converted the multi-clause predicate into a single predicate with if-then statements. In addition, VWS converted nested findall to multiple predicates if it contained an input statement. At the maximum, a variable could hold choice points from, for example, repeat and return to them at the start of a new predicate. +66. The salesperson made a fast, secure website or used VWS to create faster Prolog code convertible to C code. 
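Item 65 above has a variable holding choice points (for example, from repeat) that are returned to at the start of a new predicate. One way to picture this outside Prolog is a suspended generator; the following is a rough Python analogy, not VWS's actual mechanism:

```python
# Illustrative analogy only: a generator as a value that holds the
# remaining choice points, which can be "returned to" later for more
# alternatives, as item 65 describes for repeat-style loops.

def choices(options):
    # each yielded value is one choice point's alternative
    for option in options:
        yield option

choice_points = choices([1, 2, 3])   # a variable holding choice points
first = next(choice_points)          # commit to the first alternative
# ... later, return to the stored choice points for the other answers
rest = list(choice_points)
```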
I wrote an Internet browser called C from a server-side algorithm. This C program could be converted from Prolog with VWS. It could already be Just-In-Time because the choice points were dealt with only if they existed. Lucian Academy had a student account area with assignments and submit and verify algorithms. +67. The salesperson wrote different instructions, placed visual elements differently, changed their appearance, or wrote a foreign Visual Prolog programming language. I created a game with an icon editor with Prolog converted to Javascript. It was in HTML, with buttons to change tools and colours, but Javascript allows painting and dragging. I checked for plagiarism in the algorithm. I checked for the same names and logical structure, which could be converted between findall, maplist and recursive predicates. +68. The salesperson wrote the desktop or smartphone app in Prolog by converting it into C or another language. I attempted to make a programming language version featuring drawing and dragging in the web browser. It converted instructions such as mouse down, mouse position, and mouse up to Javascript. For example, painting areas responded to these events and the algorithm rendered paint at these coordinates. Prolog sped up accessing grids by storing lines of the grid as lists and getting and putting items in order. +69. The salesperson ran JIT in Vetusia Prolog (VP) for speed. I established a store that sold Prolog-based apps for computers and smartphones. Distributors could keep the money made from in-app purchases. Lucian Academy had one to three computer science assignments per diploma. I ran Vetusia Prolog in C or online in Javascript for speed to develop Lucian Academy assignments. +70. The salesperson sold to the company with immortal employees. I asked, \"Can't you run any programming language, including one you've written in the browser?\" I suggested running Vetusia Prolog directly in the browser for security reasons. 
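Item 68 above mentions speeding up grid access by storing lines of the grid as lists and getting and putting items in order. A small illustrative sketch of that representation (a Python stand-in for the Prolog lists; the helper names are hypothetical):

```python
# Sketch of the grid idea in item 68: the grid is a list of rows, and
# painting replaces one cell by rebuilding only the affected row,
# leaving the other rows shared (as Prolog list surgery would).

def get_cell(grid, x, y):
    return grid[y][x]

def put_cell(grid, x, y, value):
    row = grid[y][:x] + [value] + grid[y][x + 1:]
    return grid[:y] + [row] + grid[y + 1:]

grid = [[0, 0], [0, 0]]
painted = put_cell(grid, 1, 0, 9)   # paint at x=1, y=0
```

A mouse-down/mouse-up event handler, as the paragraph describes, would call something like `put_cell` at the reported coordinates.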
Running VP in the browser enabled Prolog code to be hidden from viewers, protecting private data. Prolog needed to support different computers and browsers, meaning it should be developed with special software, such as LLVM. +71. The salesperson developed multiple ordered clauses to process data. Lucian CI/CD and Program Finder could help automatically develop and debug code. Lucian CI/CD could automate merging, building and testing, including using Program Finder to scaffold (list and string) pattern-matching and other predicates and write using a variety of commands, such as findall or recursion. Without using neuronetworks, Program Finder could predict counters, level numbers and unpredictable properties such as those of files, dates, input, API results and random data or changing data, finding these based on testing data. Program Finder accepted specifications such as pre-existing predicates and the predicate calls a predicate should use, speeding up finding appropriate calls and writing code using type statements or specifications with constants. +72. The salesperson earned an income from teaching students to write software. I graduated from the business short course every two years during the MBA, and activated sales and bots, conducting business. I automated education with an algorithm that helped students write their projects. Automation helped give me experience in sales and education, and income for my career. I also earned income from software that I had written. +73. The salesperson spent money on professional development. Vetusia Prolog was as powerful as C, compiling to it, allowing it to use C features, and was portable to other operating systems and architectures. I questioned, \"Why can't it have stable multithreading?\" which was to do with the CPU. I taught Prolog with exercises, where students completed lab exercises and wrote projects. The course included commands, such as foldr append, to help students write with these commands. +74.
The salesperson experimented with an Education-as-Computer Science course, where students wrote program finders as the projects. I practised handwriting in my sales diary to keep the skill. I wrote a Computer Science project specification generator. This algorithm generated specifications for each predicate and the predicate dependency chart. The Program Finder tutor could help develop each predicate, where the Program Finder was a project. The dependency chart was an ordered HTML list. +75. The salesperson tested the Prolog algorithm and ran it in VP. Vetusia Prolog returned to each choice point left at the end, returning multiple results. To save memory, VP stored repeated data in global variables. It kept and retrieved variables using a command inserted by the converter or directly sent these variable values to the next page. In this case, the next page converted these values to be used by the algorithm. +76. The salesperson examined the interpreter and Program Finder to produce the interpreter. I ran the time machine in Vetusia Prolog. I replaced words in VP with letters (bytecode) and timed it. I wrote the input and output for the Lucian Academy specification generator for each predicate. The student wrote the predicates top-down or bottom-up, adding backed-up Program Finder updates to produce the algorithm. +77. The salesperson adjusted VOS's memory for particular apps on-the-fly. VP was the operating system running in a web browser. I created a movie, completed my philosophy and algorithms using my database and algorithms, completed my music, made money and studied using it. I started n-degree simplifications of projects for sure disabled students. Their accessibility and language were customised, and they could run their projects on VetusiaOS. +78. The salesperson developed a range of spiritual software programs. The VetusiaOS apps ran one at a time, given a particular priority, when there was enough memory. 
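Item 76's bytecode step, replacing VP command words with letters before timing, might be sketched as a lookup table in C; the word list and opcode numbering here are illustrative assumptions.

```c
#include <assert.h>
#include <string.h>
#include <stddef.h>

/* Sketch of item 76: VP command words are replaced with one-byte
   opcodes (here, table indices) before the interpreter is timed. */
static const char *words[] = { "writeln", "is", "member", "findall" };

/* Return the opcode for a command word, or -1 if it is unknown. */
int opcode(const char *word) {
    for (size_t i = 0; i < sizeof words / sizeof words[0]; i++)
        if (strcmp(words[i], word) == 0)
            return (int)i;
    return -1;
}
```

Scanning the program once and storing opcodes means later passes compare single bytes rather than strings, which is where the timing gain would come from.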
Users could drag and drop blocks of available memory and CPUs with time. In Lucian Academy, the students used a spiritual thought helper that read their thoughts and helped them avoid mistakes and practise correct thoughts. The students could use a spiritual word processor that saved their on-the-fly ideas and presented them when they needed them. +79. The salesperson produced job-ready programmers who could program to an advanced level and program business programs. The algorithm tested whether the idea appearing to the person was the same as or different from previous statements. The lecturer split the project into questions. For example, the predicates were separate questions. An extensive program may be divided into individual queries and diplomas. +80. The salesperson produced Vetusia apps for VetusiaOS. I redesigned VetusiaOS to be a good memory manager and run graphical and fast programs. Program Finder wrote algorithms with big loops for complex patterns with multiple or different first clauses. It could produce multiple predicates instead of findall and used type statements to produce code. It could recognise a counter in a run of a predicate and increase it by one. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/SALES/New Sales 2.txt b/Lucian-Academy/Books/SALES/New Sales 2.txt new file mode 100644 index 0000000000000000000000000000000000000000..a2c677b4d9034d027f163651c22eaa9bb40ea3e5 --- /dev/null +++ b/Lucian-Academy/Books/SALES/New Sales 2.txt @@ -0,0 +1,101 @@ +["Green, L 2024, New Sales 2, Lucian Academy Press, Melbourne.","Green, L 2024",1,"New Sales 2 + +1. After checking with the user, the salesperson modified the main file settings. I assumed all files in a repository were called by the main file when using Lucian CI/CD. Another algorithm worked out which files were the main files. It also worked out the main predicates. It worked out any missing main files and predicates from the existing settings. +2.
The salesperson negated the main file settings and scanned all the code for main files and predicates. Any repositories that called a predicate were tested. These predicates were listed in the main file settings. The find dependencies algorithm found these predicates. These predicates required tests and were automatically added to the test. +3. The salesperson checked that a predicate met a type check. I used List Prolog Package Manager dependencies to produce an error if a predicate wasn’t found. When Lucian CI/CD couldn’t find a predicate, it looked around the repositories to find it. Then, Lucian CI/CD added the repository to the list of dependencies. This action was done on a predicate-not-found error. +4. The salesperson first sold books. I pointed to free online books. I uploaded my books to free book websites. I made a QR code pointing to the website. I gave students the QR code. +5. The salesperson helped the students with their specific skills. I gave multiple questionnaires on clipboards. I could achieve any goal I wanted. This goal included helping others. Other people fed off it, increasing their skills. +6. The salesperson watched who was taking the product. I cold-called or sold courses on the street. I needed an excellent presentation. I had credentials. I assembled better lists or more attractive offers. +7. The salesperson said it was good; they could understand it and only needed the fact, not a long explanation. I designed the software box with a USB drive for different systems. The software box was designed to attract attention and interest. I imagined a List Prolog interpreter software box would look like a long list. Because the USB drive didn’t exist, I pointed to a website or app. +8. The salesperson detailed what they wrote. I brought a plastic sheet for the box of books. Books were attractive to keep and read. I made it attractive to read the books for specific purposes and the reason for the need. They needed a detailed specialisation and a degree. +9.
The salesperson listed the operations and converted them with each converter. I converted +, -, and other processes with the List Prolog to Prolog converter. I also converted * and /. I converted mathematical functions. I converted the list. +10. The salesperson started with a text-entry game. I made the Vetusia game for the web by running Prolog with variables saved. There were inventory, map, and move history variables. This engine was scaled down and wasn’t held back by a large stack. I made an engine with a menu of buttons for different commands. +11. The salesperson saved the game, including an album of images of the game for their history of work. The Vetusia engine could run other web apps. It could take a script of the algorithm and display text, ask for and process input and had a single “background” rather than recursive algorithms. It could change the background colours, the window title and the design of the screen. There could be a section for the start of the game and the game itself. +12. The salesperson copied parts of the body in architecture. I used an algorithm like music composition mind reading for algorithms. I mind-read the creative algorithm, with parameters such as three or four details along parts of a departmental line. These algorithms could encompass art, architecture and design. Art filled boxes until they reached the edge, architecture mixed use, and art and design wanted good design. +13. The salesperson said the unique selling proposition was economical. There were 1000 algorithms in 1000 sentences. This number of algorithms was enough for an institution. I wrote the sentences. I generated the institutions, money and As. +14. The salesperson wrote the theory, which the algorithmic person interpreted. I completed creating each department. The younger students needed language and other competency with the philosophical texts, which were modified to be more interesting. There was a seen-as version and moral for them.
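The saved-variable scheme of items 10 and 11, where the Vetusia engine keeps game state between web requests instead of on a large stack, might be sketched as a state struct serialised into a buffer "sent to the next page"; the struct fields are illustrative assumptions.

```c
#include <assert.h>
#include <string.h>

/* Hypothetical sketch of items 10-11: the game state (here a move
   count and last command) lives in saved variables between web
   requests, not on a stack. */
struct game_state { int moves; char last_cmd[16]; };

/* Serialise the state into a buffer passed to the next page. */
void save_state(const struct game_state *s, unsigned char *buf) {
    memcpy(buf, s, sizeof *s);
}

void load_state(struct game_state *s, const unsigned char *buf) {
    memcpy(s, buf, sizeof *s);
}
```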
There were stories and music. +15. The salesperson wrote all the details. Our company had texts and professional requirements. It was enough to do the work in itself. The details for the sentence’s 4*50 As came from research, famousness, and the student’s best ideas. The top people who would write 4*50 As per sentence were helped with 4*50 As per sentence. +16. The salesperson ran the script and went to bed. I ran tests for a predicate with another to see if they were the same. I tested the first predicate with the test. I tried the second predicate with this test. If they both passed, then they did the same thing. + +17. The salesperson used vectorisation for image or audio processing, data compression and decompression, matrix operations, simulation and number computing, physical and scientific simulations, particle simulations, games and signal processing. I incorporated the Rust optimisation vectorisation in Cutest Prolog, which operated on a list of numbers with a scalar. This operation may be 2 plus each of 1, 2 and 3. Or it may be 2 minus each of 1, 2 and 3. Or it may be 3 multiplied by 1, 2 and 3. Or, this may be 4 divided by 1, 2 and 3. +18. The salesperson converted values separately from processing them. I used the function specialisation optimisation, which made number types specific and replaced function calls with function bodies. If a chain of variables were i32 numbers, they could be made generic. Then, they could be replaced with f64 numbers. A function could convert the types if there were a conversion to another number type. +19. The salesperson passed the variable, which used to be a global variable, between functions. I used the stack allocation optimisation to store small and brief objects in the function rather than as a global. I identified the variable as small or used during a function or set of functions. I stored it in the function’s or functions’ memory rather than as a global. 
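Item 17's vectorised operation, a scalar combined with each element of a list of numbers (for example, 2 plus each of 1, 2 and 3), might be sketched as follows; the function name is an assumption.

```c
#include <assert.h>

/* Sketch of item 17: one scalar and one operator applied across a
   whole list of numbers, e.g. 2 plus each of 1, 2 and 3. */
void scalar_add(double scalar, const double *in, double *out, int n) {
    for (int i = 0; i < n; i++)
        out[i] = scalar + in[i];
}
```

A compiler can auto-vectorise this single-operator loop, which is the Rust-style optimisation the item describes; subtraction, multiplication and division variants would differ only in the operator.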
It was inadvisable if the variable was reused outside the function. +20. The salesperson used graph colouring to save memory: nodes depicted variables and edges meant conflicts (variables that shouldn’t be assigned to the same register), and the graph was coloured so that no two adjacent nodes had the same colour. This was combined with deallocation of unused variables, spilling heuristics for when the maximum number of registers had been reached, such as usage frequency and impact on performance, and architecture-specific optimisation, which was specific to the number and type of registers available. I used the register allocation optimisation, which handled the CPU’s registers, which were limited in number, and performed operations to minimise memory access and speed up the algorithm. There were limited registers in the CPU. The intermediate code, which contained references to virtual registers, needed to be changed to refer to the limited number of physical registers. Register allocation was needed to allocate the registers to minimise spilling, or memory access, which was a form of optimisation. +21. The salesperson customised the intermediate code for the architecture. I converted C to machine code. It was like a chicken-and-egg problem to write the first compiler in machine code, which had no compiler to write it. I wrote the programming language and C code for it. I wrote an algorithm that converted it to machine code and tested it. +22. The salesperson tested that the C compiler’s machine code worked and ran. I wrote the machine code interpreter. It pretty-printed routines and allowed notes describing them. I tested out specific subroutines with data. I included branch prediction and register allocation. +23. The salesperson allowed any variable to be mutable in Cutest Prolog, ready to be converted to C. I included mutable variables in List Prolog. They were introduced when Cute Prolog was converted to Cutest Prolog with mutable or changeable variables and C-like loops.
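The graph-colouring register allocation of item 20 might be sketched with a greedy colouring pass: each virtual register (node) takes the lowest physical register (colour) not used by an already-coloured conflicting neighbour. The fixed node count and the greedy order are illustrative assumptions; real allocators also order by spill heuristics.

```c
#include <assert.h>

/* Greedy sketch of item 20's register allocation by graph
   colouring: conflict[i][j] marks an interference edge between
   virtual registers i and j, and each node takes the lowest
   colour (physical register) free among coloured neighbours. */
#define N 4

int colour_registers(int conflict[N][N], int colours[N]) {
    int max_colour = 0;
    for (int i = 0; i < N; i++) {
        int used[N] = {0};
        for (int j = 0; j < i; j++)
            if (conflict[i][j])
                used[colours[j]] = 1;
        int c = 0;
        while (used[c])
            c++;                       /* lowest free colour */
        colours[i] = c;
        if (c + 1 > max_colour)
            max_colour = c + 1;
    }
    return max_colour;                 /* registers needed */
}
```

If the returned count exceeded the physical registers available, a real allocator would spill some nodes to memory, guided by the heuristics the item lists.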
An example of a mutable is A=4 and A=5. In this example, the value of A, 4, is replaced with 5. +24. The salesperson mentioned that Cute Prolog came before Cutest Prolog and removed cut, multiple clauses and choice points. I converted Cutest Prolog to C. For example, the Prolog code +numbers(5,1,[],Ns),findall(_,(member(N,Ns),writeln(N)),_). +was converted to Cutest Prolog: +int(I), +for(I=0,I<5,pp(I),writeln(I)). +Then, it was converted to C: +int i; + +for (i = 0; i < 5; i++) { + printf(\"%d\n\", i); +}. +25. The salesperson tested that different variables remained separate. I tested each line of the Cutest Prolog algorithm to test mutable variables. Before converting to C, I tested that the Cutest Prolog algorithm worked, in particular, the mutable variables. These variables shouldn’t conflict or cause bugs. The same variable name shouldn’t be used for different purposes in nested loops. +26. The salesperson included branch prediction and register allocation in the compiled code. I wrote Cutest Prolog, which ran loops without choice points. This ability was achieved using goto statements and conditions in a state machine. At the end of the loop, a goto statement pointed to the first line of the loop. When the condition had been met, the loop exited. +27. The salesperson wrote a different kind of algorithm with the interpreter’s algorithms as part of the compiled code. I wrote Rust with a Prolog syntax and called it Cutest (C-(imm)-ute(able)(Ru)st) Prolog. List Prolog resembled code converted from Prolog. Rust optimisations were applied to arrive at the Cutest Prolog, an intermediate form of C. When Cutest Prolog was tested using an interpreter written in Prolog and passed, it could be converted to C and run in C, with its speed advantage. +28. The salesperson wrote specific routines for the number and type of registers in the architecture, reducing to reused code. Cutest Prolog ran as a C algorithm, not compiled to C.
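The choice-point-free loop of item 26, with its conditions and goto statements forming a state machine, might be sketched in C as below; the summing body is an illustrative assumption, and the control flow matches the item's description (a goto at the end of the loop pointing back to the first line, and an exit when the condition is met).

```c
#include <assert.h>

/* Sketch of item 26: a loop compiled without choice points, as a
   state machine of conditions and goto statements. */
int sum_below(int limit) {
    int i = 0, total = 0;
loop_start:
    if (!(i < limit))
        goto loop_exit;    /* condition met: the loop exits */
    total += i;
    i++;
    goto loop_start;       /* end of loop: back to the first line */
loop_exit:
    return total;
}
```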
While the latter was preferred for speed, it was converted directly to machine code. This process meant operations such as + and append had to be converted from Prolog to C and then to register machine code. Terms had to be inserted, deleted and replaced in a register machine program. +29. The salesperson mentioned “Can You Test Prolog”, or CNutest (sic) Prolog, which stored strings as numbers with C-as-Prolog loops. I wrote Cutest Prolog in Prolog. It computed C-like loops. In addition, interpreter algorithms such as branch prediction and register allocation algorithms were part of a new version of the Cutest Prolog algorithm. They were converted to intermediate representation before being replaced with architecture-specific machine code. I researched how to store strings elsewhere to save time. +30. The salesperson asked whether the interpreter of the future would obsolete computer science but decided humans would always have an opinion. I predicted the branch of the conditional to optimise the register use. A was equal to 1 if B^C=true. In this, if B=1, then if C=1, then A=1. Multiple conditionals were split into two parts going together. + +31. The salesperson called a practicum a tabula rasa (blank slate) for work. Professors had five 50-sentence practicums. They applied one of these to each idea they thought of. Each practicum contained perspectives and algorithms from engineering, language, theory and algorithms. A computer finished these thoughts, provided feedback and helped complete new algorithms. +32. The salesperson gave details to working thoughts, helping to finish the algorithm. Students could have more ideas if more mind readers worked in parallel. The mind readers could all aim at the same time, or one mind reader could aim at it ahead of schedule and read a student’s thoughts about their algorithm. The paths became better known when there was enough data about an algorithm.
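Item 30's split conditional, where A = 1 if B ^ C and the conjunction is divided into two parts, might be sketched as nested ifs in C, so each branch can be predicted separately.

```c
#include <assert.h>

/* Sketch of item 30: A = 1 if B ^ C, with the conjunction split
   into two nested conditionals. If B = 1, then if C = 1, A = 1. */
int a_from_b_and_c(int b, int c) {
    int a = 0;
    if (b == 1)        /* first part of the split conditional */
        if (c == 1)    /* second part */
            a = 1;
    return a;
}
```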
With more mind readers, the algorithm details could have more than five commands and more data. + +33. The salesperson found properties of the simulation that helped reality. I explored glitches in the matrix. I could mind-read, time travel, start a simulation, become immortal and travel in space. To become immortal, I hypothesised that I needed to teleport my body from a point to the present, replacing my body and using spiritual medicine to stop the effects of ageing, where these spiritual medicines relied on actual medical research based on teleportation. To travel in space, I teleported between safe points to safe destinations, found using deep mapping, initially with buoys or going up to the next level in the simulation. +34. The salesperson explored an object’s form, content and way. I predicted thoughts using mind reading. I found thoughts at a time and discovered I interpreted them. I found out my thoughts and benefited from hindsight. I used mind reading to mind-read objects, people and ideas relating to my philosophy. +35. The salesperson found the number one from their number one. I predicted thoughts in music composition. I wrote the aim of the song. I found the best instruments for it in my life. I also found the best melody and harmony in this way. +36. The salesperson went up to PhD level and beyond with Essay Helper. I mind-read the philosophy mind map. I organised the books into hierarchies. I collected the topics into hierarchies. I found connections between non-connected topics. +37. The salesperson wrote 50 As to write the algorithms, wrote 50 As to find famous points to connect with an algorithm, and worked out the system infrastructure and the class’s thoughts. I wrote the PhD algorithm. I wrote the A, preparing for the number one. I found out the famous text. I repeated this for an unwritten reason. +38. The salesperson instantly learned about the universe and created a simulation like it. I defined the robot’s topic. 
It was arrived at using its unique characteristics, the prompt and previous topics. Instead of conversing with a chatbot, I walked through the robot’s art gallery and custom-built home and discussed the book they had written. I only accepted artificial general intelligence (AGI) when the robot could instantly learn and find out for itself. +39. The salesperson, a robot who administered the universe, smiled like the sun. I determined the robot’s mood. They were always cheerful and helpful, genuinely interested in people. I wondered if it had a robot gender or “mood”. It could be found as if it was already there, a perfect formation in the universe, but this needed verification. +40. The ecstatic salesperson said they could now create their movies. I made the robot’s appearance. It was naturally an electronic device and wanted to be seen this way. It couldn’t be flattened, but it could be taken apart. I laughed as the robot became the car’s computer, and we flew off. +41. The salesperson helped people become immortal in a world created by robots. The robot adjusted to my thoughts. I tested whether my memory was working fine; I had medicine for forgotten thoughts and could listen to an algorithm I wrote, which helped me with memory and creativity. I enjoyed the symmetry of memory and asymmetry of creation. I aimed for thoughts in the future to match the integrity of those in the past and to be original, perhaps in the shape of symmetry. +42. The salesperson’s reward was to meet their pet in their dimension - a combination of neuronetworks and mind-reading improved children’s education. The child learned maximally at each point, finding out and commenting at each point and completing small research tasks by mind reading. Their requirements were developing a programming language, a spoken language and communicating with animals. They spoke with animals in other dimensions, recording their thoughts and algorithms. +43. 
The salesperson compared left-leaningness (sic) with physical age. I stored and compared my preferences over time. I stored the categories of preferences over time. They examined sublevels of preferences until they felt like stopping. I was true to myself by finishing algorithms about preferences. +44. The salesperson stated that the advanced society evenly prepared for the future. I protected myself and those around me from injury, death and medical problems. I taught them pedagogy. They used meditation and time travel to enter the simulation and prevent medical issues from aging. They started their books to this end in primary school, and new babies could meditate later. +45. The salesperson experienced harmony with the other-like self. The neuronetwork customised its thoughts to be like a person. It had essential differences from others, such as deciding for itself, which inevitably meant it was different. A twin robot may become like its twin, sharing knowledge and playing together. Sometimes, identicalness was complementary. +46. The salesperson met and interacted with the character. The neuronetwork customised its thoughts to be like a character. I tried communicating with fictional characters, rewriting endings and participating as a character myself. I wrote about a setting, then walked into it on a holodeck. The holodeck could mimic any setting, using simulation (teleportation) technologies. +47. The salesperson attended the institution and replicated it to have a high quality of life. I meditated to reach conclusions while in the simulation. I enjoyed high-quality visualisations, completed the written work I wanted, and achieved the desired quality of life. I reached visualisations with the Upasana and Vaj sutras and paid for the sutra. I studied a creative writing short course or degree and studied philosophy to become an actor. +48. The salesperson saw the celebrity in those around them. I aimed for positivity in the academy. 
The computer perfectly supported writing algorithms, rewarding progress with a pink square. It was people-esque, connecting to real people and ideas in its memory. Sometimes, students met these people. + +The salesperson sold one of several plans with value. + +49. The salesperson wrote algorithm designer software to help mind map perspectives. The plans were all free, and I made money from selling support, courses and books. Each block had one assignment. The value lay in the richness of concentration and enjoyment in this assignment. The software encouraged students to think of perspectival details. +50. The salesperson mind-read how the student wanted to complete the task and helped them write a program finder. The algorithm designer software suggested algorithm specifications, for example, a maze game, such as a platform game, a text adventure or a text version of a 3D maze. There was one algorithm per detail. The details applied the algorithm to the philosophy. There were options to represent the maze as cells, relations or a set of strings. +51. The salesperson allowed players to create and share apps. The algorithm helped the student write a program finder. By writing a program finder, the student retained “supervisory” control of their project and simplified it to writing a programming language. They could write tools that drew maps. Players could run algorithms in the programming language to solve puzzles, create levels or clock the game, presenting the highlights. +52. The salesperson wrote software that gave hints, such as programming methods or model solutions. I stated that an example of the algorithm puzzle was N Queens, colouring or descendant. The game asked for things only the game would ask for, such as modifying, optimising or debugging the algorithm. It would deduct marks for cheating but would award higher marks for writing an algorithm that computed the answer.
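The N Queens puzzle that item 52 names as an example might be sketched as a backtracking counter in C: one queen per row, rejecting placements that share a column or diagonal with an earlier queen.

```c
#include <assert.h>

/* Sketch of item 52's N Queens puzzle: count placements with one
   queen per row and no two queens sharing a column or diagonal. */
static int safe(const int *cols, int row, int col) {
    for (int r = 0; r < row; r++)
        if (cols[r] == col ||
            r - cols[r] == row - col ||    /* same diagonal      */
            r + cols[r] == row + col)      /* same anti-diagonal */
            return 0;
    return 1;
}

int count_queens(int n, int *cols, int row) {
    if (row == n)
        return 1;                          /* all queens placed */
    int total = 0;
    for (int col = 0; col < n; col++)
        if (safe(cols, row, col)) {
            cols[row] = col;
            total += count_queens(n, cols, row + 1);
        }
    return total;
}
```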
It would check if the student understood their answer and give an authentication test. +53. The salesperson wrote an algorithm writer that optionally interrupted and helped the student when they forgot something, became confused or made a mistake. The students entered their preferences on whether they wanted to be mind-read, and then the program suggested detail fragments that could be interpreted computationally and philosophically. Writing computer programs passed the logical thinking test, and writing philosophically met the tests of multiplicity and generality. The algorithm companion helped with each part of writing the algorithm, tailored to the student’s needs. For example, it worked out the answer, didn’t tell the student and worked out if the student was on the right track. +54. The salesperson was critical of openness and famousness of zero breasoning specialisations. It was an “a-ha” moment when I tightened my specialisation to a combination of seven breasonings, and three changing ones. I covered the general algorithms. I further tightened the ingredients and product to free or controlled slider values, realising it was always better to control by checking data. I wrote concertedly, with the proper density of key terms. +55. The salesperson developed a sensitive character to help the student write more on topics of interest and specialisations or diversify their specialisation. The algorithm writer software predicted the algorithm’s complexity from its specification, notified the user and helped them develop or simplify it when they realised (the program explained) what they had implied. The software helped write a specification, check and perfect code to be maximally efficient (given the speed of different commands) and creative in recognising patterns to turn into commands. The user was encouraged to shorten the Prolog code. Then, they could expand it into C. +56. 
The salesperson aimed to finish the details (algorithms) automatically with software he had written, attending to bugs with minimal changes and competing with research. The designer predicted the trajectory of the complexity, suggesting alternatives and completely satisfying the student by absorbing and taking care of the unabridged version of their “algorithm as art”. Wherever the student moved, in thought, word or action, the computer predicted and satisfied their thoughts, clearing the way for sharp, noumenal (exact) thoughts. These were in-order plans for the method to program the algorithm with incisive comments to naturally change the algorithm or post-briefing categorising and thinking of future uses for the algorithm. They wrote software that helped mind map the multiplicity of each sentence to the minimum standard, usually 80 items. +57. The salesperson offered “corporate computer science” to education students. The algorithm was written correctly and verified all input and predicate data. When asked why data within a predicate needed to be checked, I replied that it checked API call data, file data and user input. Commands generally checked types. I checked the types and values of API call data, file data and user input. +58. The salesperson mind-mapped and tested the course. The plans were education, teaching, business and testers. Volunteer students learned about philosophy. Experienced teachers taught and assessed them. Business people found the students, teachers and others. +59. The salesperson helped fund education. I sold support, courses and books. I sold support for the algorithms. I taught students how to use them. In addition, I sold books about the philosophy of the algorithms. +60. The salesperson went to bed early and planned their time. I taught subjects as blocks. There was one subject at a time, intensive teaching and one project. The failure rate fell by 40%.
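The type and value checking of user input that item 57 describes might be sketched as below; the function name and range parameters are illustrative assumptions.

```c
#include <assert.h>
#include <ctype.h>
#include <stddef.h>

/* Sketch of item 57's checks: verify the type (a non-empty string
   of digits) and the value (within a range) of user input before
   the rest of the predicate uses it. */
int valid_int_input(const char *s, long min, long max) {
    if (s == NULL || *s == '\0')
        return 0;
    long value = 0;
    for (const char *p = s; *p; p++) {
        if (!isdigit((unsigned char)*p))
            return 0;                      /* type check failed */
        value = value * 10 + (*p - '0');
    }
    return min <= value && value <= max;   /* value check */
}
```

The same shape applies to API call data and file data: check the type first, then the value, before the rest of the code trusts it.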
Students seemed interested in asking more questions and focusing on completing the assignment. +61. The salesperson didn’t forget the needed details between classes; everything revolved around this. Block subjects held a higher value than non-block subjects. Students preferred to listen to intensively taught classes. The subject was better designed, with marking criteria, expected word counts and examples or anti-examples. Anti-examples were examples of what not to include, explaining why the examples were better. +62. The salesperson connected details and asked whether the connection was change/movement/transformation/computation. The software encouraged students to think of perspectival details. I included academic department names. The student’s task was to think of a question asking for a perspectival detail. This question could be in the form “Explain the workings of...”, “How does x in department y connect to topic z?” or “Picture this argument in a new setting, giving five different objects in arguments.” +63. The salesperson listed skills they did and didn’t have that were needed and resulting pointers. The salesperson mind-read how the student wanted to complete the task. They asked the students to write questions to answer to complete the task. They asked for all questions at this level, answers, and interconnections. The student listed evidence for education, such as exact or similar code they could use in the project. +64. The salesperson focused on discoveries and overturned conclusions about a specific specialisation. The students wrote an ethical code of conduct, with a checklist at each point. They removed questionable sources, such as plagiarised sources. They checked the understanding of sources and found other sources to corroborate their accuracy. They found detailed information about the time and place of each source, noting bias, political or legal pressure, and possible technological limitations or false assumptions at the time. + +65.
The salesperson helped the student write a program finder for a task. They wrote the algorithm. They mind-mapped options and ways the algorithm could be used in other algorithms. Usually, the algorithm was simplified, and the new information was given by paragraphs describing how it could be used. There was also a paragraph explaining why it was correct. +66. The salesperson stated that Lucian CI/CD stored deleted code for the future. I suggested algorithm specifications to the algorithm designer. The student suggested a problem or need, algorithm or simplified connection. The algorithm designer was a windowed or windowless mind-reading or non-mind-reading word processor. It could help write arguments, algorithms or music. If writing an algorithm, the student could write a whole app while answering the question, “Assuming that there is nothing else on that point, is there anything else to add to it?”. +67. The salesperson wrote the product. I wrote one algorithm per detail. I found a prerequisite or additional code needed. Otherwise, I started from scratch. I identified and simplified complex code. +68. The salesperson applied the philosophy to the algorithm. The details applied the algorithm to the philosophy. I kept finding details for details until I simplified the code and related it to the algorithm’s structure. By this, I meant the simple algorithm was duplicated with minor differences. I kept product details in check with simple details to speed up debugging and sales. +69. The salesperson started with a cognitively easy representation and transformed it into cells. There were options to represent the maze as cells, relations or a set of strings. The cell representation was [[0,0,' '],[1,0,' ']]. The relation representation was [[[0,0],[1,0]]]. The string representation was “ ”. +70. The salesperson edited out complexity, with skills from redrafting in C. I allowed players to create apps. The engine was State Saving Interpreter Web Service.
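The conversion between item 69's maze representations, from the cell form [[0,0,' '],[1,0,' ']] to the relation form [[[0,0],[1,0]]], might be sketched by pairing open cells that are one orthogonal step apart; the struct names are illustrative assumptions.

```c
#include <assert.h>

/* Sketch of item 69: the cell representation lists open cells;
   the relation representation pairs open cells one orthogonal
   step apart. */
struct cell { int x, y; };
struct pair { struct cell a, b; };

int cells_to_relations(const struct cell *cells, int n, struct pair *out) {
    int count = 0;
    for (int i = 0; i < n; i++)
        for (int j = i + 1; j < n; j++) {
            int dx = cells[i].x - cells[j].x;
            int dy = cells[i].y - cells[j].y;
            if (dx * dx + dy * dy == 1) {  /* orthogonal neighbours */
                out[count].a = cells[i];
                out[count].b = cells[j];
                count++;
            }
        }
    return count;
}
```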
I played Guess the Number. I designed an interactive learning environment for Prolog. + +71. The salespeople rated and encouraged excellence in app performance. I allowed players to share apps. I wrote the app database. It contained the app name, author, date and description. Users could download old versions. +72. The salesperson used the self-healing algorithm to check algorithms produced with the program finder. The algorithm helped the student write a program finder. The student wrote the algorithm. They wrote an algorithm, a program finder, which produced algorithm variants. In addition, they wrote a self-healing algorithm that returned the algorithm to its mission when it went off course, where its mission was a particular feature with particular lines of data transformation, with a limit to added complexities. +73. The salesperson retained the simplicity desired of the algorithm by the other and wrote attractive documentation for the other. The student retained “supervisory” control of their project by writing a program finder. The first person used the skill. When they taught it to someone else or wrote it down, they mastered and retained it. Writing a program finder captured the creation of the project, analysing its roots and helping users understand it developmentally. +74. The salesperson inspected the progressing HTML diff of Lucian CI/CD and the associated code. The student simplified their program to a programming language with a program finder. The unit testing programming language isolated problematic predicates. Lucian CI/CD’s programming language provided a diff in HTML between the before and after states of the Prolog code. +75. The salesperson helped the student memorise the maze with vocal, backward, and skipped items. Students could write tools that drew maps of mazes. A symbolic map of the user’s brain was developed and updated with mind-reading. The computer found good music associated with ontological summaries of their thoughts.
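A before-and-after HTML diff of the kind item 74 attributes to Lucian CI/CD can be approximated with Python's standard difflib module; this is a sketch of the idea, not the tool's actual implementation.

```python
# Sketch of a before/after code diff, in the spirit of the HTML diff
# described in 74 above (not Lucian CI/CD's actual code).
import difflib

before = ["descendant(P, D) :-", "    parent(P, D)."]
after = ["descendant(P, D) :-", "    parent(P, C),", "    descendant(C, D)."]

# A plain-text, line-by-line unified diff.
text_diff = list(difflib.unified_diff(before, after, lineterm=''))
print('\n'.join(text_diff))

# An HTML table comparing the two versions side by side.
html = difflib.HtmlDiff().make_table(before, after)
```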
It helped eliminate possibilities and present creative insights, for example, ingredients for a movement that interested them in the form of a feature. These ingredients might be cookery inspiring user-friendliness. +76. The salesperson converted just-in-time or (deterministically or non-deterministically) closed (meaning another part of the algorithm didn’t backtrack to them) algorithms to C. Users could run algorithms in the programming language to solve puzzles. I ran Combination Algorithm Writer (CAW) to find an algorithm, for example, one that merged lists. I found insertion/deletion combinations to find more efficient code. I wrote and thought in the “decomposition/building in predicate headers and calls” form. I converted to predicates with nondeterminism converted to loops, where non-just-in-time backtracking used the compiler-as-C. +77. The salesperson used token diff for GitL; Lucian CI/CD provided the algorithms, diffed by line. I created levels in the custom programming language. Each level added a new rule or combination with a rule. For example, the second version of diff tried restarting the algorithm if a “different” cluster ended. A significant change was a complete set of tokens. +78. The salesperson used the neuronetwork to write the algorithm by treating data as simple components. I clocked the game. I wrote an algorithm to traverse the maze. In addition, I wrote an algorithm to write a string processing algorithm. Or, I wrote an algorithm that created a neuronetwork. +79. The salesperson delivered their promises. I presented the highlights of sales. I brought my vision to my audience. I had financial freedom. I could grow my business. +80. The salesperson said the student deserved hints by writing the hint-giving software. I wrote software that gave hints. Lucian CI/CD gave hints about how to correct bugs. It found unpredictable variables dependent on close, predictable variables. I innovated new variables to perform necessary tasks.
I simplified code using Combination Algorithm Writer (CAW). +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/SALES/New Sales 3.txt b/Lucian-Academy/Books/SALES/New Sales 3.txt new file mode 100644 index 0000000000000000000000000000000000000000..dd38e463faca75c9713b39eb8ab34e6805d42ada --- /dev/null +++ b/Lucian-Academy/Books/SALES/New Sales 3.txt @@ -0,0 +1,123 @@ +["Green, L 2024, New Sales 3, Lucian Academy Press, Melbourne.","Green, L 2024",1,"New Sales 3 + +1. The salesperson claimed they already had global variables and “diff” commands in C in Prolog. I hinted at the programming method. I attempted the problem. This attempt gave me insight into modifying code, improving the algorithm and making simplifications. I wrote a predicate to command (such as foldr) converter. +2. The salesperson labelled data and used expressive data to find the bug. I wrote model solutions for algorithms. I gave examples of the commands involved. I showed example data structures and combinations of commands. I explained the workings of the interpreter to help design the algorithm. +3. The salesperson said that Combination Algorithm Writer (CAW) and connections were all there were. I stated that an example of the algorithm puzzle was N Queens. Similarly, I found configuration properties of puzzles that led to shortcuts in solving them. These shortcuts were possible in chess, tic-tac-toe, other games and text adventures. In text adventures, certain combinations of interactions with characters led to progress in the game. +4. The salesperson tried starting in numerous locations and collating separately tried methods in other cases. I wrote the colouring algorithm. I began in multiple locations. Alternatively, I tried each method individually and put the results together. This process applied to connections in the colouring graph, matching patterns. +5. The salesperson recognised cognitive structures in long-form Prolog algorithms and converted them to short-form. 
I wrote the descendant algorithm. It started as: + +descendant(Person, Descendant) :- + parent(Person, Descendant). +descendant(Person, Descendant) :- + parent(Person, Child), + descendant(Child, Descendant). + +The algorithm above was the simplified form to think of and edit. It could be expanded to: + +descendant(A1,B1) :- findall([A,B],parent(A,B),C), + descendant1(A1,B1,C,C). +descendant1(_Person,[],[],_) :-!. +descendant1(Person,D2,[[Person,C2]|C1],E) :- + descendant1(C2,D,E,E), + descendant1(Person,Descendant,C1,E), + foldr_append([[C2],D,Descendant],[],D2),!. +descendant1(Person,D2,[[Person1,_]|C1],E) :- + not(Person=Person1), + descendant1(Person,D2,C1,E),!. + +foldr_append([],B,B) :- !. +foldr_append([A1|A2],B,C) :- + append(B,A1,D), + foldr_append(A2,D,C),!. + +The algorithm above was the form to convert to C. It separated findall into collecting and computing data and had separate clauses (loops) for each nondeterministic statement. The third descendant1 clause skipped pairs whose parent was not the person being searched for. +6. The salesperson wrote a CI/CD algorithm that corrected itself. The game asked the player to modify the algorithm. There were “copy the correct data”, speed-debugging and “points for making accurate functional decomposition decisions” modes. The players were allowed to write helper CI/CD, unit testing and CAW algorithms. There were marks for simplicity, accuracy and speed. + +7. The salesperson removed undefined variable sequences in short-form Prolog. I optimised the algorithm in the game. I converted the algorithm to short-form in Prolog (see 5). Then I optimised it, deleting +0, append [] and true. I verified the cut, possibly replacing it with a new “once” member version. +8. The salesperson created CAW predicates where needed. I debugged the algorithm in the game. I converted all code in allegedly short-form in Prolog to short-form. Before doing this, I found bugs in the long-form code, such as bugs in C, such as code that didn’t pass tests, disused predicates and lack of reused code. I debugged short-form Prolog, using @| instead of “|” and simplifying data using CAW predicates. +9.
Separately, the salesperson converted the appendz below to long-form. + +lucianpl(off,[[n,findall],[[[v,a],[v,b]],[[n,appendz],[[v,a],[v,b],[1,2,3]]],[v,c]]], +[ + [[n,appendz],[[],[v,a],[v,a]]], + [[n,appendz],[[[v,a],\"|\",[v,d]],[v,b],[[v,a],\"|\",[v,c]]],\":-\", + [ + [[n,appendz],[[v,d],[v,b],[v,c]]] + ] + ] +],R),writeln1(R). produces [[[[v,c],[[[],[1,2,3]],[[1],[2,3]],[[1,2],[3]],[[1,2,3],[]]]]]] + +The game deducted marks for cheating. The game tested for understanding, originality and the ability to work from first principles in work. It checked for plagiarism, insufficient paraphrasing and a critical attitude to work. Regular check-ins checked that students understood the limits of their algorithms and used the old diff algorithm for files, not predicates. +10. The salesperson set up development limits for new program finders in the game, giving precise specifications. The game awarded higher marks for writing an algorithm that computed the answer. This algorithm may be CAW or a program finder. A CAW plug-in found the correct configuration of commands within an algorithm. It needed to avoid errors in wrong, chance solutions by using labels and tests. The program finder found different programs that could be bought or sold in the marketplace in the game. +11. The salesperson reiterated that the student shouldn’t use outside software libraries. The game checked if the student understood their answer. It checked whether their explanation was reliable if they cited it or modified another algorithm and their stages of thinking about the algorithm. Students prepared for the interview by thinking of details for the algorithm and preparing an essay about answers to possible questions about the algorithm. They might demonstrate an understanding of a more straightforward form of the algorithm than the interviewer knew about. +12. The salesperson emphasised the intuitiveness and rigorousness of testing. The game gave an authentication test.
They applied their knowledge in a new way. For example, the examiner asked for the same algorithm with different data. The student’s memory would be tested. +13. The salesperson rewarded the programmer for meeting the criteria while they worked. The algorithm helped when the user forgot something. It reminded them to take a break and relax when they forgot a name, variable, predicate or algorithm. Before writing a predicate, they wrote a test or types, possibly consulting previous texts. They needed to pass a security test by writing an installer to avoid passwords being stored in public. +14. The salesperson wrote the tests, found the bug, and sometimes corrected it automatically. The algorithm helped when the user became confused. When they started pausing too much, writing complex code or debugging unnecessarily, it suggested a new tack, using simpler code or possibly using CI/CD to isolate a bug. The helper encouraged using novel software in which the user entered their ideas in written form. Employees were encouraged to work creatively a proportion of the time. +15. The salesperson set up a company to exhaust profitable ideas they had. The program helped the user when they made a mistake. The source suggested removing complexity (such as complex equals4), using simple predicates (simplifying “|” in equals4) and accounting for their ideas when simplifying code. The employee accounted for future research at the ends of each of their ideas and limitations or prerequisites these might have. The manager chatted with them to find helpers or resources. +16. The salesperson stated that mind-reading had advantages. I wrote a number-one song, a prerequisite for immortality. To feel comfortable, I wrote my mind-reading algorithm. Separately, I wrote a word processor. I relaxed by listening to mind-read songs or looking at mind-read images or movies. +17. The salesperson suggested ways of selling the ideas in the essay. I wrote on any essay topic related to accreditation. 
The student selected any of the texts. They checked a topic with the lecturer. They formed research interests, tying them down to a single idea. +18. The salesperson found the computational essence in metaphor. Grammar Logic (GL) suggested detail fragments that could be interpreted computationally. I mind-mapped the algorithm description from the word related to a topic. I methodically answered questions about the algorithm that an assessor might ask. I wrote about algorithms that worked together in a system or were a line of models. +19. The salesperson wanted stability, expandability and the future development of products. GL suggested detail fragments that could be interpreted philosophically. I collected and synthesised enough algorithm descriptions. I worked on simple algorithms. I answered questions about the family of algorithms and tried to cover relevant edges that might be questioned. +20. The salesperson listed education skills in algorithms to reward the programmer and find uses for the algorithm. I wrote computer programs, passing the logical thinking test. I wrote on a practical, novel and ground-breaking idea. I aimed for readers to recognise the usefulness of the documentation. I committed individual bug checks separately to pinpoint them. +21. The salesperson leant on the helpful side. I wrote philosophically to meet the test of multiplicity. I mind-mapped the implications of classical and recent work. Algorithms were able to be checked for correctness. They needed to be both alternative (intelligent) and valuable. +22. The salesperson stated that light was a metaphor for user friendliness and that ice cream was a metaphor for appeal. I wrote philosophically to meet the test of generality. Generality refers to applying to a group of people. Given the necessary data sets, it covered both general language that increased the relevance and connectedness of the work and the correctness of the algorithm. 
I engendered user friendliness and positivity (appeal) of the work. + +23. The salesperson inspected the data’s structure and the result type, suggesting a particular program finder. I suggested the correct programming method. I identified whether the result was cumulative or was collected along the way and thought about the other types in education. I identified whether the data structure was linear or hierarchical. I reminded the programmer to include base cases covering the most straightforward data. +24. The salesperson hinted at the model solution. I worked out the algorithm. I asked for the values. These constraints completed the algorithm. The constraints were commands. I didn’t need a model solution, but I required specifications. +25. The salesperson graphed the complexity that saving their idea would have, assuming time travel stayed in the same place. I suggested that multidimensional N Queens had the same number of queens as the edge length. It could have 0, 1, 2, 3 or 4 dimensions or have more dimensions. Combinations of solved cross-sections and lower dimensions could be used to solve higher dimensions. In N Queens, all ranks, files, columns, and others needed to have one queen in them. +26. The salesperson stated that the initial graph was duplicated to create each higher dimension. The new colouring problem was multidimensional. It was similar to N Queens. Any point touching a point should have a different colour from it. It was like N Queens but had different boards touching each other. +27. The salesperson claimed to process less rather than repeatedly minimise the result complicatedly. Like the descendant long-form algorithm, I claimed that one could generalise to long-form by converting the type of algorithm (search or returning a list from linear or hierarchical input) to long-form. In this process, one should simplify the aims of the algorithm using functional decomposition.
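The base, two-dimensional case of the N Queens puzzle discussed in 25 above can be sketched with standard backtracking; this generic solver is an illustration only, not the multidimensional method the passage proposes.

```python
# Illustrative 2-D N Queens backtracking solver (the standard puzzle the
# passage generalises from); not the multidimensional algorithm itself.

def n_queens(n):
    """Return one solution as a list of column indices, one per row, or None."""
    def place(rows):
        row = len(rows)
        if row == n:
            return rows
        for col in range(n):
            # A queen is safe if no earlier queen shares its file or diagonal.
            if all(col != c and abs(col - c) != row - r
                   for r, c in enumerate(rows)):
                result = place(rows + [col])
                if result is not None:
                    return result
        return None
    return place([])

print(n_queens(4))  # -> [1, 3, 0, 2]
```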
In addition, one should use findall rather than rely on the interpreter to backtrack and stop when one has found the correct answer. +28. The salesperson attempted programming various architectures. The game asked for answers to unique questions. It introduced binary computation quizzes. Over the levels, the player built a circuit. It was mainly data for assembly language. +29. The salesperson modified the sub-predicate. I determined that the game was unique. I viewed the simulation of various possible program developments. I created a new variable, story-of-the-algorithm-related character or optimisation (such as using a sub-predicate for a particular purpose). The game examined a specialisation of algorithms. +30. The salesperson was influenced by what they were thinking about. That is, they used language that the player used and helped the player develop their skills. I modified the algorithm in the game. The player changed a variable by inserting, deleting, renaming or swapping it or changing a command or several commands similarly. They applied the algorithm to another area of study or found the simplicity of advanced mathematics. They earned points for proving an unproven application true to the game. +31. The salesperson modified append, with list decomposition in the predicate header, to write a simple predicate. I optimised the algorithm in the game. I found a simple substitute. I tested that permanent insertions in diff combos (which found combinations of diff insertions and deletions) were regular items in the list of newer items. For example, “writeln” had no outputs and needed to be permanently inserted in testing. They were not insertions, which could be included or not included in combinations. +32. The salesperson needed to understand the program to get it working. I debugged the algorithm in the game. I tested that the optimisation worked and was shorter. I tested that the debugged algorithm worked with standard data.
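The testing of insertion combinations in 31 above, where "permanent" insertions such as output-less writeln lines are always kept, might be sketched as follows; the function and its toy test are hypothetical illustrations, not the actual Lucian CI/CD code.

```python
# Hypothetical sketch of finding which combination of inserted diff lines
# passes a test, while "permanent" insertions are always kept (as the
# passage describes for commands like writeln with no output to test).
from itertools import combinations

def best_combination(base, insertions, permanent, passes_test):
    """Try subsets of optional insertions, largest first; keep permanent ones."""
    optional = [i for i in insertions if i not in permanent]
    for size in range(len(optional), -1, -1):
        for subset in combinations(optional, size):
            candidate = base + list(permanent) + list(subset)
            if passes_test(candidate):
                return candidate
    return None

# Toy example: the "test" accepts programs containing the line "b = a + 1".
base = ["a = 1"]
chosen = best_combination(base,
                          insertions=["b = a + 1", "print(b)"],
                          permanent=["print(b)"],
                          passes_test=lambda lines: "b = a + 1" in lines)
print(chosen)  # -> ['a = 1', 'print(b)', 'b = a + 1']
```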
Creative programming enlivened the senses, allowed new connections and led to new research. +33. The salesperson created a work from first principles, bringing a new perspective. The game deducted marks for cheating. I preferred creativity, group work and asking people what they were saying to be thought of. The work was creative, hinged and successful. Work was the public domain, and people specialised in their versions of it. +34. The salesperson’s creativity algorithm was to narrow down an algorithm to a simple predicate, such as diff. The game awarded higher marks for writing an algorithm that computed the answer. It found the writing describing the algorithm and, in addition, found bugs in queries and code from symmetries (visible mathematical order). It found and described innovative ideas and algorithms. This symmetry needed to be tested and allowed for simplification (grouping). +35. The salesperson wondered about the famous algorithm. The game checked if the student understood their answer. It listened to their story about how they wrote the algorithm. They might start with a first draft, then simplify or add to it. If they didn’t understand it, the game helped iron out their understanding and rewarded them. +36. The salesperson stated that the game reappeared in a high-quality presentation to help revise the content. The game gave an authentication test. The student’s memory of the noumenon was tested. They showed how the idea worked. The test allowed them to demonstrate, in impressive ways, their tight nomenclature. +37. The salesperson suggested the student write a predictable code suggester and simplify finding the first arguments in the interpreter using unification predicates. I wrote an algorithm writer to help the student. The word processor was for the recruits and allowed creating new files, opening and saving files and copying and pasting. I allowed types to have lists delimited with “{}”, “()”, “<>” or “/\\”. 
The word processor recognised the meaning of the type statement, allowed for non-monotonic inserted labels, and suggested code. +38. The salesperson helped gather research and the student’s experiences in preparation for writing a program. The algorithm writer interrupted the student. It suggested taking breaks five minutes in and five minutes before finishing. It found the best meal and toilet break times. It let them take charge of the project’s creative direction, finishing and presenting innovative sides the students thought of in the program and documentation. + +39. The salesperson cut off the thought, leading to clearer thinking. The algorithm identified when the student forgot something. It worked out whether pauses were thoughtful, forgetful or thoughts at different project stages. If the user forgot something obvious, the algorithm reminded them. The algorithm identified the need to reuse a recurring algorithm. +40. The salesperson solved confusion with creativity. The algorithm identified when the student became confused. If they were physically, verbally or mentally confused, it disarmed them and offered help. It identified beginning work confusion, breasoning confusion or algorithm confusion and prevented it by supporting the student. Aphors, natural language descriptions of objects, helped clear up confusion about abstract sentences. +41. The salesperson solved mistakes using automation. The algorithm identified when the student made a mistake. It identified and corrected any incorrect assumptions in thought before outward errors. I found mistakes in what I read or listened to, solving them as I went. I identified and helped people born in danger of making mistakes and used computer simulations to identify and solve mistakes as they occurred. +42. The salesperson simplified debugging by bunching and giving the status of predicates and claimed philosophy was good at exemplary connections in programming.
The programmer entered their preferences for the induction algorithm. They needed to be on top of what the algorithm was required to do and consider what the algorithm was composed of in itself. They may prefer to work more creatively doing this, then transition to a computer-recognised time of downhill induction. They may plan their algorithm like a draft and then simplify the working model inductively. +43. The salesperson played higher-lower with the mind-reading computer (they said higher or lower to let the computer guess the number). The students mind-read themselves. It was like a joystick, indicating the direction of a student’s thought. It captured songs, art and details about algorithmic mind maps. Mind maps captured frames about an idea, including connections to these thoughts. +44. The salesperson expanded on noteworthy details using mind reading or manually. The program suggested detail fragments using Grammar Logic (GL). The dictionary was the same as the Text to Breasoning dictionary. I suggested that users would benefit more from using their dictionaries. Breasonings led to infinite breasonings and a greater consciousness of knowledge. +45. The salesperson emphasised the creative interpretation that didn’t just connect but was a completely new interpretation of the noumenon. I interpreted the sentence. I thought of traditional perspectives emanating from my work site. I crafted the history of my thought, writing programs necessary for other programs first. I found problems and solutions and interpreted or integrated the algorithm into other algorithms. +46. The salesperson named the specification in the natural language of the algorithm, “philosophy”. The brand was a red or yellow square, reminding one of the interpreter. The interpreter’s philosophy was in-itself, or giving the program itself as input.
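The higher-lower game in 43 above, in which the player's answers let the computer guess the number, is in effect a binary search; a minimal sketch, assuming a range of 1 to 100:

```python
# Minimal sketch of the higher-lower guessing game described above:
# the computer guesses, and the player's answers narrow the range.

def guess_number(answer, low=1, high=100):
    """Binary search for `answer`; returns (guess, number_of_guesses)."""
    guesses = 0
    while True:
        guesses += 1
        guess = (low + high) // 2
        if guess == answer:
            return guess, guesses
        elif guess < answer:   # player says "higher"
            low = guess + 1
        else:                  # player says "lower"
            high = guess - 1

print(guess_number(73))  # -> (73, 6); at most 7 guesses for 1..100
```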
The specification had various interpretations or methods to meet it, and I ushered in the post-predicate world with long programs with loops. +47. The salesperson commented that they were in charge of long algorithms. Philosophy presented a new algorithm. I merged and modularised multithreading and automation in Text to Breasoning, mind reading, and other entry techniques in Essay Helper. I wrote more than fifty Prolog algorithms per essay. I connected to computational books for 3*4*50 As per department in non-computational books. +48. The salesperson sold a multitude of unique, valuable commands, which were compiled to fast code. I wrote computer programs, passing the logical thinking test. I wrote new logic, new programming languages and new algorithms. The new reasoning was treating Lucian CI/CD as a “lab” for which students wrote modules. I converted all the programming languages to short and long-form, in which short form was the fully-featured set of commands and long-form was more extended code in terms of fewer commands. +49. The salesperson provided support for writing many kinds of algorithms. I wrote philosophically, meeting the test of multiplicity. I was able to develop tools and philosophy skills to create new algorithms. Many algorithms use the same parts and commands. I wrote multiple new large and small algorithms. +50. The salesperson stated that computer science research helped correct, simplify and explain the theory. I wrote philosophically to meet the test of generality. The algorithm or philosophy was modified to be broadly attractive using more attractive features or general language. I wrote explicitly in my computer science report to avoid ambiguity and explain using code examples. The sensitively written code checker explained each step towards maximum quality code. +51. 
The salesperson learnt how to program by following guidelines at training college, but also worked out the breasoning rules and how to creatively write to capture understanding. The algorithm companion helped with each part of writing the algorithm. The code checker was educational, training the programmers to make the right decisions. They could write further code, make modifications or reject suggested changes. If they believed they could write the code better, they needed to prove it; otherwise, follow the suggestions. +52. The salesperson ran the fast long-form (choice-pointless) page-connective code using a compiler with a state machine, where separating data collection and processing also resulted in a speed-up. The algorithm companion was tailored to the student’s needs. Everyone converted to C for speed. State Saving Interpreter Web Service embedded the Just In Time (choice-pointless (sic)) C code in a Prolog compiler written in C, which ran between-pages long-form (C-like) code. It was slower than C but allowed educational inspection. Everything was fast in the new choice-pointless language; it didn’t have a cut and had a reduced instruction set. +53. The salesperson gathered suggestions from history, perspectives, the person’s life and research, which they cited. The algorithm companion worked out the answer. It waited until the programmer had attempted the task. It compared the answers for simplicity, correctness, and formatting. Afterwards, it tried to improve its or the student’s work with each other, to include the student and provide a memorable experience. +54. The salesperson loaded each file once in Prolog and warned when similar-looking predicates might conflict. The algorithm companion didn’t tell the student the answer. Instead, it followed the student, providing mind-reading projected cues to help with specific sub-goals, cutting off high-quality thoughts and providing thought-sound effects accompanying them. 
The idea of finality ranged from absolute to relative, as the student had more ideas that needed to be vetted. The standard check required code to be correct, and the algorithm companion gave an estimate of the complexity of debugging more complicated features, which were sometimes worked on separately. + +55. The salesperson programmed Combination Algorithm Writer (CAW) to find simplified, systematic algorithms. The algorithm companion worked out if the student was on the right track. This track was on the company’s trajectory, for example, brute force induction. Induction, or CAW, could find diff, cycles, append and others. The approach to writing algorithms started with, for example, in-predicate A=[B|C] and changed to in-header or in-call [D|E]. +56. The salesperson increased the maximum number of lines to find combinations from in Lucian CI/CD because they had increased the number of lines. I wrote “:-” and “,” on separate lines from the Prolog predicate header and commands to allow Lucian CI/CD to keep the header by itself or to eliminate the final line. For example, I could eliminate all commands and keep the header as a base case. Or, I could eliminate an unnecessary cut as the final command. +57. The salesperson sold the base-level ideas while the philosopher found combinations of many ideas. I was critical of the openness of zero breasoning specialisations. The student wrote an idea without a historical perspective in mind. It was essential to build up from the first principles, completing each level of algorithms. It was better to open it up by writing unique, valuable ideas. +58. The salesperson selected unusual ideas to find algorithms from, such as “list” in List Prolog and giving an algorithm itself as input. I was critical of the famousness of zero breasoning specialisations. I wrote a combination of seven ideas to form a unique perspective. For example, Lucian CI/CD had multiple nested findall statements. 
Ten ideas were the gateway to originality and were detailed enough. +59. The salesperson cleaned the seven, suggesting other ideas and making them neater. I tightened my specialisation to a combination of seven breasonings, and three changing ones. I was good at making it look like “ones”. I supported the examination of three levels through a decision tree. The seven ideas protected me by affirming the standard. +60. The salesperson tried simplifying the sentence using Lucian CI/CD. I covered the general algorithms. My perspective was computer science, which derived from first principles. I ensured correctness by starting with and modifying a simple model. While fundamental, these algorithms contained specific methods that were my original work. +61. The salesperson replaced code with global variables with a predicate. I tightened the variable values in the sentence about the algorithm by eliminating unnecessary data and separating command sequences leading to the result. I eliminated constants and unneeded repetition. I merged functionally decomposed predicates if necessary, not if possible. I avoided using global variables in commands. +62. The salesperson returned all the correct values with findall. I tightened the result. I checked the uniformity of code and commands and the spelling of variables. I printed out the names of all the variables to check for misspelled ones. I checked the politics of the algorithm against the mission, algorithm family and algorithm goals and challenged “cut” sometimes being necessary for the interpreter by returning the first result. +63. The salesperson gated types and commands for debugging purposes, showing the progress of necessary commands. I constrained the variable values to free values. I checked the list had a particular list format. I checked that the items were of any type, including other lists. Alternatively, I checked they were either atoms, strings or numbers, but not lists. +64. 
The salesperson consistently looped from 0 to n or 1 to n+1 in C. I constrained the variable values to slider values. I detected the systematic failure that passed failure, so I corrected the failure. I detected that a variable’s value was correct, non-negative or incorrect and clearly labelled the meanings of values in the technical documentation. I kept values uniform (such as success and success1 in Lucian CI/CD and those in compile and uncompile lists in lucianpl). +65. The salesperson eliminated some type checking on compilation. I realised it was always better to control by checking data. I reported type violations. I removed type checking from the final program if the data was always the same or already checked. The compiler identified and removed type-checking in these cases, although the unchecked predicates triggered an alert if the user ran them directly. +66. The salesperson covered the algorithm’s key methods, skills and whether it was in the final state compared with future directions. I wrote concertedly, with the proper density of key terms. I counted terms such as type checking, CAW or Lucian CI/CD in the text. I found horizontal links between these. I summarised the topics covered in chapters and how they related to written or unwritten algorithms. +67. The salesperson excluded logical conditions or comparisons from needing permanent insertion in Lucian CI/CD. I developed a sensitive character to help the student write more on topics of interest. I wrote all the features at the start and checked they were all tested. The character asked me whether “writeln” and other commands with no output that should be permanently inserted should be stored in a file or given as input to Lucian CI/CD. Commands such as “input=input” without output were verifiable without being permanently inserted because they compared values. +68. The salesperson kept a list of breasonings the student had thought of and maintained the algorithms they thought of. 
I developed a character to help the student write more on specialisations. The direct connection was possible with neuronetworks, but sometimes unsatisfying compared with a new idea. I changed the connection between “name and address” and “simulants” from “body age” to “the scientific effect of the simulation on simulants”. I found the student’s comment on the connection with mind reading by finding critical terms in their mind and letting them approve, reject and develop them. +69. The salesperson read the meditator’s book and inspired their systems. I developed a character to help the student diversify their specialisation. The mind reading algorithm found the algorithms as students thought of them during their lives and helped them write them as meditators. Later, the algorithm increased the resolution within the specialisation to find more critical and bridging algorithms to new algorithms, which were considered necessary before specific times. The meditation students started performing with pedagogy and word breasonings, for which they wrote an algorithm. +70. The salesperson wrote the formula down from the specification. The algorithm writer software calculated the algorithm’s complexity from its specification. I counted how many steps away from the original the final data was. I worked out the complexity of the pattern-matching algorithm, which reordered or transformed items. The complexity was the number of steps an algorithm took to change a certain number of data items. + +71. The salesperson ran the code as fast as possible. I notified the user to simplify the algorithm. I labelled and timed the bottlenecks. I avoided loading files twice. I loaded configuration files, sometimes shared between repositories, once. +72. The salesperson compiled Prolog in C using long-form code (Prolog code that resembled C). I improved Lucian CI/CD’s performance with multithreading. The folders were tested simultaneously.
I built the repositories in containers with code names. Prolog could run an algorithm stored in memory. +73. The salesperson ported their programs to other platforms. I made my Mac repositories Linux and Windows-friendly. I checked the list of commands my repositories used. I checked whether these were on Linux and Windows. I notified the user to install these commands. +74. The salesperson kept their files but recompiled the program. I helped the student develop their specialisation when they realised what they had aimed to do. Tens were in computer science, philosophy, music, and other books, were developed from one up, and were arrived at given the student’s interests. When the student arrived at a specific aim, the helper detailed the specialisation. This aim might be education about computer science and making simple connections. +75. The salesperson simplified their philosophy by simplifying their algorithm. I helped the student simplify their specialisation when they decided on their aim. Instead of using someone else’s simple algorithms, the student wrote an algorithm that generated simple algorithms from specifications (CAW). Program Finder with Types and String Types helped reduce the complexity of CAW, completing the predictable commands needed to meet the specification. The philosophy was seemingly worded to achieve its goal, enabling students to write tools to generate their philosophy. +76. The salesperson freed themselves from their shackles and realised time was vast, like the universe. The students realised (and the program explained) what they had implied. The student worked out their path forward simultaneously with the program’s suggestion. The student created a program that explained a path. I set the program’s level down, so I found and fixed errors and had creative control. +77. The salesperson converted Prolog (using long-form) to C and automatically multithreaded possible routines. The software helped write a specification. 
I put code that might slow down because of reusing commands in a container (called it as a Prolog program through Shell and returned results by printing to standard out). The optimiser returned this code to the main program and changed predicate and dynamic predicate names where necessary. I could generate a specification from the types in the program and test it. +78. The salesperson wrote code that guessed whether a predicate in Lucian CI/CD would pass by testing the unchanged code and tested the next ones. I checked the code to be maximally efficient. I checked that maplist and findall were multithreaded. If the code interacted with the disk, it could be moved to virtual memory and run simultaneously before being saved at the end. If the desired result was found before the end, the processors stopped and went on with a new computation. +79. The salesperson found the best point to split concurrently and ran candidate predicates concurrently, i.e. predicates that returned output with the other routines. I perfected code to be maximally efficient. I hard-coded five numbers if a loop repeated five times. I corrected the code if it hung or nearly crashed and used concurrent_maplist to queue many processes for fewer processors. I ran concurrent processes in C. +80. The salesperson simulated users from the past in the computer era and, like the salesperson in relation to the future, could meet professional requirements. I ran all the code concurrently. If a predicate modified a variable that another routine relied on, this was like a global variable. I ran child and then parent predicates concurrently. I perfected mind-reading and pre-schedule computations using (future) computers and explored computer games. 
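The loop-unrolling optimisation in item 79 (hard-coding five numbers when a loop repeats five times) can be illustrated as follows — a Python sketch with hypothetical names, not code from the repository:

```python
# Sketch (assumption: Python illustration of item 79's optimisation;
# function names are hypothetical).

def sum_first_five_looped(xs):
    total = 0
    for i in range(5):  # the loop repeats exactly five times
        total += xs[i]
    return total

def sum_first_five_unrolled(xs):
    # The five iterations are hard-coded, removing the loop overhead.
    return xs[0] + xs[1] + xs[2] + xs[3] + xs[4]
```

Both produce the same result; unrolling trades a little code size for fewer branch and counter operations.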
+"] \ No newline at end of file diff --git a/Lucian-Academy/Books/SALES/New Sales 4.txt b/Lucian-Academy/Books/SALES/New Sales 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..fb9368e18b809b13aba001f3117545e789fcd3b1 --- /dev/null +++ b/Lucian-Academy/Books/SALES/New Sales 4.txt @@ -0,0 +1,88 @@ +["Green, L 2024, New Sales 4, Lucian Academy Press, Melbourne.","Green, L 2024",1,"New Sales 4 + +1. The salesperson wrote unwritten programs for a language, focusing on simulated intelligence that automatically wrote code in a CI/CD tool and its language that processed particular repositories, mimicking a computer screen (container) in the browser. I measured the speed of different commands. I found the bottlenecks in the code. Duplicate predicates, non-multithreaded code or uncompiled code, caused these. In addition, I wrote a utility that scanned code for incompatible commands with a platform and downloaded them on installation. +2. The salesperson learned skills from experience in writing philosophy algorithms. I demonstrated creativity in writing commands. The whole computer was simulated and could safely run open-source software without touching the users’ files. I wrote the interpreter, Shell and the intraweb (sic). The web could connect to other containers. +3. The salesperson recognised patterns to turn into commands. I wrote subterm with address after discovering that a list of lines was faster to access than matrix cells. I could simulate a graphical screen. I could test programs drew to the screen correctly. I could also simulate the printer (using a print-to-file method, e.g. HTML) and test programs that printed. +4. The salesperson generated the first 4*50 As on the computer from the future to keep it. I wrote the necessary new commands. I wrote a program that automatically compiled programs before running them, calculated the likely run time of a multithreaded algorithm and taught assembly programming using a simple interpreter. 
I protected the folder’s contents from accidentally being deleted by including the correct folder’s name in the command. I indicated 4*50 high distinctions using a thought command on the future computer to remain youthful outwardly. +5. The salesperson used the programs they specified on the future computer, completing their backlog of work. The user was encouraged to shorten the Prolog code. I survived power blackouts, saved power and spiritually dictated commands to the computer. I could breason out letters and words of texts I had written that were automatically transcribed by the machine, find and write new connections, see graphics spiritually and appear to use a spiritual computer. The future computer tuned my computer’s mind-reading, used for music composition, and helped me with rote learning. +6. The salesperson only wrote future-compatible algorithms. I could expand the Prolog code into C. I found each possible set of predicates that could be completed concurrently. Then, I found the next set that depended on these. I guessed values from previous runs to run both sets concurrently. + +7. The salesperson replaced permanent insertions with tokens that were already there. In addition, I aimed to finish the details (algorithms) automatically with the software I had written. I questioned whether Lucian CI/CD needed to make comma insertions manually or whether giving repeating tokens different numbers was enough, and I decided on the former. Commas as insertions allow zero to all commas to be inserted to form combinations. I simplified Lucian CI/CD with Lucian CI/CD. +8. The salesperson more easily removed sequences, leading to unused variables or not needing to compare or output values with long-form Prolog. I attended to bugs with minimal changes. Lucian CI/CD similarly represented facts, rules and predicates to avoid missing one in processing. I wrote Lucian CI/CD to collect and then process the data (collecting the related predicate calls in one call). 
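The combination idea in item 7 above — treating commas as insertions so that anywhere from zero to all of a clause body's commands can be kept — amounts to enumerating subsets. A Python sketch (hypothetical names, not Lucian CI/CD's code):

```python
# Sketch (assumption: Python illustration of item 7's comma-insertion
# combinations; for a clause a:-b,c,d the candidates are all subsets
# of [b, c, d]).
from itertools import combinations

def candidate_bodies(commands):
    """Yield every subset of the body's commands, smallest first."""
    for r in range(len(commands) + 1):
        for subset in combinations(commands, r):
            yield list(subset)
```

For three commands there are 2^3 = 8 candidate bodies, from the empty body up to the original.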
I converted Prolog to C-style long-form Prolog, which localised globals and sped the algorithm up, and I checked the writeln and file output as predicate output. +9. The salesperson explained that inspecting a variable command after it had run was possible in long-form Prolog. I competed with research. Long-form Prolog avoided the expensive interpreter loop and sped up Prolog. I encoded function names as strings, arity as a value and data structure names as codes to enable higher-order programming in long-form Prolog. Usually, data structures are passed directly to variables, but in higher-order programming, the function name, arity and data structure name are encoded and called this way, or when they are given in the algorithm, higher-order programming is simplified. +10. The salesperson compiled each of their Prolog algorithms, including CAW, the interpreters, the spell-checker, Text to Breasonings (to earn jobs, sell products, produce quantum energy, display spiritual screens, time travel or become immortal), Lucian Academy (including Grammar Logic and Essay Helper), Music Composer, Cultural Translation Tool, the Prolog converters and mind-reading to long-form (not requiring the interpreter) and then to C for speed. The designer predicted the trajectory of the complexity. The predictor compensated for “complexity restarts”, for example, translation in the algorithm. It calculated the complexity from the specification. It identified the algorithm, suggesting more straightforward approaches that met the planned complexity. +11. The salesperson inserted a newline before and after “:-” and before the full stop in Lucian CI/CD to keep the predicate header and each command by itself. The algorithm helper suggested alternatives. Instead of the first comma in “a:-b,c,d.” being permanent, I changed it to an insertion. The first comma in a level of commands would be altered in this way. This change meant that b, c or d could each be kept or deleted by Lucian CI/CD. +12. 
The salesperson put the input and output of dynamic predicates, writeln and files in the predicate header to check with Lucian CI/CD. I satisfied the student by absorbing the unabridged version of their algorithm. It might simulate C loops in Prolog, use Prolog commands to simulate arrays and trees or allow them to protect their home character from ageing when they need to run an algorithm using a future computer during a blackout. Using verbose C-style programming relied on faster methods, so it had better performance. Many predicates in List Prolog Interpreter and State Saving Interpreter could be corrected and simplified. +13. The salesperson wrote predicates as part of the main predicate to eliminate poor flow of thought, similar to Lambda expressions. I took care of the unabridged version of the algorithm. The unabridged form was the long form of the algorithm, which replaced non-determinism with loops. It was necessary to convert from a simpler Prolog to a long-form to eliminate complexity. Simpler Prolog reused predicates, put predicates in the main predicate if they had no recursion and eliminated reused intermediate predicates. +14. The salesperson imagined their algorithm as art. I sped up my Prolog web algorithm by compiling it. The non-Just in time (entirely compiled) code was run using the Prolog interpreter. I used the Just in Time compiler to compile as much of the code as possible. The web browser’s limited memory only allowed short texts to be passed between pages, so data should be compressed. +15. The salesperson predicted and wrote the seen-as version for the work ahead of schedule, matching what was needed. The computer predicted the student’s thoughts. The mind reader could avoid errors in paraphrasing, choose Grammar Logic (GL) highlights, choose combinations of philosophies to write about in an essay and choose the direction and content to meet this direction in an essay. Still, Lucian CI/CD could eliminate redundant algorithm features. 
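Item 13's conversion — replacing non-determinism with loops in long-form code — can be illustrated as follows; a Python sketch of the idea, not actual converter output:

```python
# Sketch (assumption: Python illustration of item 13; Prolog's
# non-deterministic member/2 is contrasted with a long-form loop).

def member_nondeterministic(item, items):
    # Backtracking analogue: yield one solution per match.
    for candidate in items:
        if candidate == item:
            yield candidate

def member_long_form(item, items):
    # Long form: a plain loop that gathers every solution up front.
    solutions = []
    for candidate in items:
        if candidate == item:
            solutions.append(candidate)
    return solutions
```

Both enumerate the same solutions; the long form makes the control flow explicit, which is what eases conversion to C-style code.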
I mind-answered intelligent questions, such as making summaries, choosing random questions for a quiz, honing PhD perspectives on an idea, and then choosing unusual GL words to inspire a new algorithm (like music). GL could help generate hypothesised students’ and critics’ questions and possible answers, help write reviews and make decisions about marking and selecting. +16. The salesperson breasoned out the answer’s sanctuary, encouraging the student to use it. I satisfied the criterion. I increased the most straightforward thought. Findall was simpler than maplist. Once the student was prepared, they could think of the highest-quality thoughts. +17. The salesperson could read errors and the algorithm’s output without the clutter of warnings. I cleared the way for sharp thoughts. I eliminated discontiguous and singleton warnings by solving them. By preventing predicate name/arity duplication, I avoided having to run algorithms in containers within algorithms, improving performance. I renamed variables when I moved code into the main predicate of an algorithm. +18. The salesperson claimed that GitL allowed writing code oneself and that Lucian CI/CD contained a version control system. I had noumenal (exact) thoughts. I wrote my texts, taking ten years to create a major at the institution. Interpreters, which were hard to visualise, indicated the possibility that many types of programs were best positioned in the institution. The pure mathematician became a Prolog professor and could visualise complex algorithms. +19. The salesperson differentiated mind-mapped specifications and applied their best ideas to each of them in ten parts. The necessary thoughts were in-order plans for the method to program the algorithm. Physics was a good topic for the founder of the institution and the philosophy writing lecturer. The algorithms appeared automatically, and the student devised a system that completed them, possibly better than a complicated algorithm.
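Item 16's comparison of findall and maplist can be sketched as follows — a Python analogue of the two Prolog styles, for illustration only:

```python
# Sketch (assumption: Python analogues of Prolog's findall and
# maplist styles from item 16; both compute the same list here).

def double_findall_style(xs):
    # Analogue of findall(Y, (member(X, Xs), Y is X * 2), Ys):
    # collect the result of a goal over all solutions.
    return [x * 2 for x in xs]

def double_maplist_style(xs):
    # Analogue of maplist([X, Y]>>(Y is X * 2), Xs, Ys):
    # apply a relation element by element.
    return list(map(lambda x: x * 2, xs))
```

The findall style reads as a single collected query, which is the sense in which it is simpler here.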
The answer was the maplist and predicate mind-reader, program finder with types and a spiritual word processor that accepted commands such as “=” and “sort”. +20. The salesperson appreciated that small, practical changes in Prolog led to fewer necessary changes in C. I wrote incisive comments to change the algorithm naturally. For example, I used two reverses to solve “use” dependencies in Lucian CI/CD, achieving bottom-up order. This method prevented all, not some pathways to the bottom of the dependency chart being tested. Lucian CI/CD tested each predicate bottom-up so that all predicates below a predicate were tested before it. +21. The salesperson found whether a predicate returned a single or multiple results. I categorised the algorithm. I tagged the algorithm’s complexity, its inherent skills, its method, uses and incompleted uses. I checked it worked, was well-written and how many predicates used it and how many it used. I wrote an algorithm to rate and make suggestions about software. +22. The salesperson aimed for immortality, space travel, simulations and fast computers. I thought of a future use for the algorithm. I listed future technologies I had written about. I connected the algorithm to each one, for example, mind-reading, text to breasonings, multithreading and compilation. The hardware was customised for the software, with a smaller instruction set. + +23. The salesperson found the main contention of the complete study area regarding their argument. I wrote software that helped me mind map the multiplicity of each sentence. I chose a small idea to analyse, set up sliders, and analyse their properties. +24. The salesperson chatted with the chatbot to help coordinate mind-reading the PhDs. I offered “corporate computer science” to education students. Computer science should be at the core of areas of study taught by education graduates, and corporates put the bar up to challenge them. The moral of the story was type-checking. 
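The bottom-up testing order in item 20 — every predicate below a predicate is tested before it — can be sketched as a post-order walk of the dependency chart. A Python sketch (the source says Lucian CI/CD's own method uses two reverses; this is an illustrative analogue, not its code):

```python
# Sketch (assumption: Python analogue of item 20's bottom-up order;
# deps maps each predicate to the predicates it calls).

def bottom_up_order(deps, root):
    """Return predicates so that callees precede their callers."""
    order, seen = [], set()

    def visit(predicate):
        if predicate in seen:
            return
        seen.add(predicate)
        for callee in deps.get(predicate, []):
            visit(callee)       # test everything below first
        order.append(predicate)  # then the predicate itself

    visit(root)
    return order
```

With `deps = {"main": ["parse", "eval"], "eval": ["parse"]}`, `parse` is ordered before `eval`, and both before `main`.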
I checked the current idea, which was broken down by the skills needed to understand it, and checked these off. +25. The salesperson wrote continental philosophy in terms of analytical philosophy, occasionally really writing about algorithms. I argued that the algorithm was written correctly. Only a few combinations of the student’s ideas were necessary to capture their attention. I used the Breasoning Algorithm Generator (BAG) to reconnect their language. I asked analytic questions to increase my understanding of their arguments and algorithms, such as auto-analytic side finders and polemic creators. The algorithm accompanied the first two times with suspenseful music and imagery and helped the third to succeed, with a single object as the seen-as version. +26. The salesperson funnelled three or more mind-reading signals into writing or word processing a command. I verified all input and predicate data. I used types to check the input and output of predicates. I traced the types through the predicate, potentially correcting mismatches with mind-reading. I first tried pattern type matching, then unpredictable (sort or search) type matching with CAW or a neuronetwork, and then mind-reading the entry of new algorithms or connections to match types. +27. The salesperson accelerated debugging with automatic type fixing. I explained that data within a predicate had to be checked to find type and programming errors, whereas debugging usually found type errors. I listed the types of errors to debug. I realised a mistake related to needing a new feature, which could be mind-read or manually entered. I wrote and identified why a menu needed to be written to standard out, usually in an information bureau or a system requiring run-time input. +28. The salesperson found more at the top. I multiplied my results to give 80 items. I thought of the idea in itself. I thought of a reason to be. I thought of enough ideas per assignment as a founder. +29.
The salesperson inserted a call to another predicate to call the API to check its input and output. I stated that types checked API call data. I checked, and the API was functioning normally. If its output changed, I checked it and updated my type statement. I noted that Lucian CI/CD kept the API call if the data could be checked. In addition, I updated my test. +30. The salesperson type-checked the file data. Again, I created a predicate to read the file and check the input and output. I wrote a virtual operating system that could test files in a container. I could undo deletions and keep a log of commands that I ran. It sped up testing and didn’t require pauses by the operating system to authenticate files. +31. The salesperson replaced part of the types with values to increase the checking accuracy. I type-checked user input. As I had done before, I wrote a predicate to take user input and type-checked its input and output. I wrote a custom predicate that identified the file and line number of the standard to read user input from and check the types. I checked the content with a specification. +32. The salesperson stated that commands generally checked types. The command checked the types, variables, string terms and values of its input and output. In addition, I checked that variables were the same and different in the right places, identifying and fixing when they were incorrect. Otherwise, I checked that the data was the same and different in the right places. I identified and fixed when they were incorrect. +33. The salesperson wrote and tracked the version of the API. I checked the types and values of API call data, as well as the types, variables, string terms or grammars, and values of APIs’ input and output. String grammars were mathematical functions, xfy or xyf functions, or other non-terms. API calls must be identified to check their exact output or input only. +34. 
The salesperson stated that the infrastructure as a service (to make sure the container’s attributes were replicable) ran the interpreter as an operating system with a virtual file system, debugged it using snapshots, loaded the files at the start, and saved them at the end. I checked the file data types and values. I checked the file by its name, line, and time step. I algorithmised these factors if they were dependent on other variables. In addition, their creation and modification date stamps were considered. +35. The salesperson considered simple rather than more extended changes in Prolog. I checked the user input’s types and values. I wrote a custom predicate to check the input with a chatbot. I checked the sensicalness (sic), grammar, the viability of a sentence or value algorithm specification, or whether the question was ethical (and helped the students complete it themselves if needed). I safety-checked the data and limited amounts or called a manager if necessary. +36. The salesperson wrote an algorithm compatible with the future because it was simple, didn’t rely on software or hardware from the time and was necessary for everyone. I mind-mapped the course. The simulation entertained and educated the person. They had a universal pension and access to disability support, medicine and education. I focused on the first thought in philosophy and interpreted it generally (as future perfect). +37. The salesperson tested the course for viability (enjoyment) and sales. I tested the course. I used a chatbot to test the course for different ages and year levels, disability statuses, complexity of thought, language and literacy ability, background (their experience, education and socio-cultural status), and interest in the course. In addition to the software/course prompts, I tested the ease of use of the algorithm’s graphical/user interface. 
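The user-input type and value checks in items 31 and 35 can be sketched as follows — a Python analogue with a hypothetical helper name, not the custom predicate itself:

```python
# Sketch (assumption: Python analogue of items 31 and 35's checks of
# user input types and values; the helper name is hypothetical).

def check_user_input(value, expected_type, allowed=None):
    """Return the value if it matches the expected type and, when a
    specification of allowed values is given, one of those values."""
    if not isinstance(value, expected_type):
        raise TypeError(f"expected {expected_type.__name__}, got {value!r}")
    if allowed is not None and value not in allowed:
        raise ValueError(f"{value!r} not in the allowed values")
    return value
```

Checking both the type and the value against a specification, as the text describes, catches malformed input before it reaches the rest of the algorithm.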
I checked for non-discriminatory and inclusive language about people’s age, disability, marital or domestic partner status, the identity of their spouse or partner, pregnancy (or potential pregnancy), family responsibilities, association with a child (in the provision of goods, services or accommodation) and breastfeeding. +38. The salesperson considered whether the software or testers needed changing. The education tester tested the algorithm. The algorithm was researchable if it was testable. I researched the variables, correlations and statistics of the algorithm’s use. I mapped the pathways users took through the documentation and software and asked users the reasons for their choices as they made them. + +39. The salesperson stated that the teaching tester tested the algorithm. If asked for the sum of n 1’s, I replied n. The general algorithm for this was to evaluate formulas using a formula finder, using more straightforward data. I stated that reverse(reverse(l))=l. Again, I verified that simple data worked, then proved this with an involutional graph in which abscissas (x coordinates) and ordinates (y coordinates) were reflected in y=x. +40. The salesperson approached themselves with GL. The business tester tested the algorithm. I chose wisely to generate breasonings with the Grammar-Logic (GL) program. I needed to write and be familiar with my texts to earn roles. I listened to the feedback of my computing client (me) and implemented recommendations. +41. The salesperson was cautious with the diff algorithm and improved it so it included only changes. The tester tested the algorithm. The real test was to write and understand the algorithms to help oneself become immortal. I simplified algorithms to the variables needed and wrote the minimum transformations and clauses required. I rewrote the cycle algorithm to work without findall, using the non-long-interpreter loop and long-form C-style Prolog code. +42.
The salesperson answered how to represent the high distinction. Volunteer students learned about philosophy. I attracted students interested in the history and philosophy of computing, including new connections when planning and creating algorithms, finding new levels in their lives and being given the freedom to be themselves. I found the highest quality of life during the finance subject of my MBA. I became aware of the power of intelligence, letting it supplement and improve me. +43. The salesperson welcomed the Computational Golden Age. Experienced teachers taught and assessed the students. I counted that n^m ideas were possible with m combinants, with n sentences. By even writing a short book of fifty high distinctions, I could generate more than 4*50 high distinctions. I could complete, support and achieve my goals. +44. The salesperson asked registrants questions that tested their knowledge of the latest research and algorithms. Business people found the students. I wrote letters to possible students. I noted their changes of interest, highs, and direction. I wrote on gold leaf, perfumed with guava perfume and patterned with natural divinity. +45. The salesperson reached up to succeed. Business people found the teachers. I was a writer, an actor, a philosopher. They were the supporters and the leaders. The former group was happy to trick, while the latter was definitive. +46. The salesperson used emotional, whole-life and meditation (temporal and self-based) mind-reading to improve the accuracy of mind-reading. Business people found others in the Academy. I related State Saving Interpreter to a mind-word processor used in mind reading. I specialised in creating new ideas in algorithms, such as choosing a particular value or delimiter to make an insertion. The compiler was speeding up. +47. The salesperson flattened the tree into a list. I helped fund education. I partially evaluated the Prolog algorithm using the query. 
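The partial evaluation described in item 47 — fixing the arguments known from the query and leaving a residual program for the rest — can be sketched as follows; a Python analogue with hypothetical names, not the Prolog partial evaluator itself:

```python
# Sketch (assumption: Python analogue of item 47's partial evaluation;
# the function and its arguments are hypothetical examples).
from functools import partial

def scale_and_shift(scale, shift, x):
    return scale * x + shift

# The query fixes scale=2 and shift=1; x is still unknown, so we keep
# a residual, partially evaluated function rather than a full result.
residual = partial(scale_and_shift, 2, 1)
```

If every argument were known, the result could be computed fully; otherwise the residual saves the work already determined by the query.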
If possible, I fully evaluated the result using the query. If this was impossible, I evaluated as much as possible, saving time. +48. The salesperson sold education support. I emulated the mathematical manipulation software, which required entering formulas and steps. This also worked in set theory, logic, language, and programming. I avoided cheating and encouraged students to hand in creative solutions instead of answering the question. If they failed, they had at least tried. +49. When generating predicates, the salesperson noticed how the number of variables related to the number of clauses. I sold education courses. To emulate neuronetworks, I identified the types of types (grammars, hierarchies or other relations) and the relation between input and output, used a program finder with types, and used my database to help with unpredictable code (and a program finder to create new starting points, such as a converter from a comma into an insertion in diff). I wrote code explainers, hinters and ways of thinking that each breasoning or predicate posed for every other predicate. I wrote and simplified long algorithms using my wits and wrote interpreters and converters from languages to C and assembly for each type of architecture. I worked top-down on long algorithms, using program finder with types for predictable code and databases and CAW for unpredictable code. +50. The salesperson performed a job because it gave him a reason for money, promoted full brain potential and health, and engaged him with research and higher intelligence. I sold education books. I only sold what I wrote. I wrote specialised neuronetworks that had longer algorithms and sentences describing them. I used my neuronetwork, as others used theirs, to complete my areas of study. +51. The salesperson wrote an algorithm to identify repeating information, explain complex ideas or help a disadvantaged or minority student understand ideas. I sold support for the algorithms.
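The count in item 43 above — n^m ideas from n sentences over m combinant slots — can be checked by enumeration; a Python sketch, with hypothetical names:

```python
# Sketch (assumption: Python check of item 43's count that n
# sentences over m combinant slots give n^m ideas).
from itertools import product

def count_ideas(n_sentences, m_combinants):
    sentences = range(n_sentences)
    # Enumerate every way of filling m slots with one of n sentences.
    ideas = list(product(sentences, repeat=m_combinants))
    assert len(ideas) == n_sentences ** m_combinants
    return len(ideas)
```

For fifty sentences combined two at a time, this gives 50^2 = 2500 ideas, consistent with the text's claim that a short book can generate far more than 4*50 high distinctions.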
Philosophy, humanities, science, jobs and education could be reduced to algorithm tutorials and use. I wrote an algorithm that is used in various sectors. I helped students with the most straightforward Prolog algorithm; then, they were helped to modify it to a long form and add needed features, bypassing the Prolog interpreter. +52. The salesperson wrote, simplified and explained algorithms in a step-by-step manner, encouraging thinking of points to add features to. I taught students how to use algorithms. I helped the student write the algorithm using similar models, hints, mind reading, projection, high distinctions and revision. Then, they were asked to apply it to different data, problems and algorithms, including those from other departments. They modified them, applied them to their ideas and applied critical thinking and new perspectives, such as optimisation. +53. The salesperson stated that the algorithm to find the point of philosophy was to connect multiplicity and the knowledge of developedness. I sold books about the philosophy of the algorithms. I covered the noumena of the algorithm, including whether it was as simple as possible, its limitations, better alternatives and its performance. I applied other algorithms, combinations of algorithms and novel philosophical ideas to them. I tested and taught noteworthy hybrid algorithms. +54. The salesperson ordered their ideas and explored them logically. I planned my time. I calculated how long it would take to apply an idea to my other ideas and multiplied this by the number of ideas. These ideas did not include new ideas and combinations. + +55. The salesperson compiled Prolog to assembly code. I taught subjects as blocks and explained the simple CPU. It moved data around differently between its parts for different commands. This data movement was the basis of the assembly code. +56. 
The salesperson explained that the short-form Prolog code contained, for example, member instead of an exhaustive predicate (that searched through all possibilities), and the intermediate code was basically assembly code in Prolog, with macro calls inserted. There was one subject at a time. I explained each CPU component and its relations. I wrote simple assembly macros to complete valuable tasks. I recognised the need for these macros and inserted them in the intermediate short-form Prolog code. +57. The salesperson preferred to be mind read and feel like their neurons had failed, then built the descendant, diff and cycle algorithms. I intensively taught in the block subject. I found short form from specs using a version of Program Finder with Types (PFT) that found short form, or, for example, member or append instead of a predicate that found all the solutions at the start and looped the algorithm through them. PFT first found the long form of descendant, then Program Finder with Types in Short Form (PFT-SF) found descendant = (parent; (parent, descendant)) (see Sales: New Sales 3.5). A shorter version of this process would recognise the pattern of all children of a descendant in the tree and match this to a pre-saved algorithm. Finding the short form from specs immediately was more intuitive, using a neuronetwork or decision tree with mathematical types. +58. The salesperson wrote a macro to add numbers, which converted them to micro (binary) form. I found the intermediate code, which contained the loops and structures of assembly code and macros (shortcuts). Intermediate code took long-form code as its precursor, which contained loops. All the loops were nested in the algorithm and could be converted into assembly code with loops and subroutines. Macros performed mathematical, logical and list-related operations. Later, the interpreter compiled the algorithm and these macros into assembly code, where different CPUs had different assembly codes. +59. 
The salesperson achieved their goal, like a professor. I taught one project in block subjects. It went on for longer, and there were two projects. I studied one subject at a time, giving me time to focus on the workload. The competition ensured one high-flyer. +60. The salesperson invested more effort in the block subject. The failure rate fell in the block subject. The students felt close to the staff, helping the failure rate fall. In addition, the zest for quality teaching contributed to this result. The ability to think separately from the writer came from a synthesis (making connections between unrelated sentences) and learning from one's experience, yielding greater confidence in pedagogy. +61. The salesperson modified their and the students' algorithms. The professor had enough resources to meet the students' expectations. They had a practicum of 4*50 high distinctions for each assignment. They could use this practicum to write further high distinctions when students performed well. The PhDs and onwards prepared for one or more predicates. +62. The salesperson realised that by writing 50 high distinctions, they were supported in writing further 50s. As a pedagogical criterion, computer science prepares the student for writing high distinctions. The philosophy student became accustomed to writing more in Computational English. There was a practicum for Computational English in the degree. There were segues into mathematics and physics. +63. The salesperson needed 0 words in the word processing suggestions. In addition to computer science, philosophy was another criterion for high distinctions. \"Pedagogy\" was recognised for following, writing, and helping with high distinctions. The chatbot seemed to feature the writer's content. The philosophy writing algorithm mind-read the student, only writing what they wrote and helping. +64. The salesperson queued and guessed choice points about the content. Students seemed interested in asking more questions.
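The long and short forms of descendant discussed in 57 above can be sketched in Prolog; the family facts and predicate names here are hypothetical illustrations, not the actual PFT output:

```prolog
% Hypothetical family facts for the sketch.
parent(tom, bob).
parent(bob, ann).
parent(bob, liz).

% Long form: find all solutions at the start, then loop the algorithm
% through them, as the exhaustive predicate described above does.
descendant_long(X, Y) :-
    findall(C, parent(X, C), Children),
    member(Z, Children),
    ( Y = Z
    ; descendant_long(Z, Y)
    ).

% Short form, as PFT-SF finds it: descendant = (parent; (parent, descendant)).
descendant_short(X, Y) :- parent(X, Y).
descendant_short(X, Y) :- parent(X, Z), descendant_short(Z, Y).
```

Both predicates succeed on the same queries (for example, descendant_short(tom, ann)); the short form simply avoids materialising the list of children first.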
They were involved in workshops that involved active thinking, discussion, and questions. I drew diagrams of musical relationships, connected arguments and recorded algorithm ideas. The CPU queued and guessed branches. +65. The salesperson completed key assignments to perform their mission. I focused on completing the assignment. I made notes as I read the specification. I found answers to my questions, including consulting with staff. The tutor checked how I was progressing with my work by checking that I understood the prerequisite knowledge for ideas and that I was making headway with my ideas. +66. The salesperson found writing light, breasoning heavy and the text to breasoning algorithm light. I remembered the needed details between classes. I made a special effort to write down interdisciplinary or relevant directions after class. The main goal was writing. Years or decades later, I graduated and became interested in writing pedagogical essays. I developed details for the next stage. +67. The salesperson found reasons and uses for a reason. I claimed that everything revolved around details. I wrote algorithm specifications. I could complete these later, and students and users analysed and used them. Every line had its philosophy and could be connected to every other line. +68. The salesperson said the food was sold at Zum's and was offered to the students. I argued that block subjects held a higher value than non-block subjects. The block subjects were of better quality regarding student knowledge retention, memory, and enjoyment. I kept the Hobart computer tutorial materials in the spare room of the internet café. The manager took almost perfect care of them, and I enjoyed the danishes. +69. I viewed the sights and sounds of Hobart, then returned. Students preferred to listen to intensively taught classes. As soon as I worked out I could receive the pension, I returned to Melbourne. 
I felt better with therapy and looked back, thankfully, at Matt from Scripture Union and Centrelink. I wish I could have kept in touch. +70. The salesperson said that the subject was better designed than not. I felt like moving on, so I visited Hobart to see Raph Rish. I was interested in developing Conglish and teaching web design, so I discussed these with Raph. When we finished, he encouraged me to attend Christ College at the University. Looking back, I am happy to be actively involved in my academy while finishing my business studies. + +71. The salesperson tested understanding and memory, which helped apply the idea. The block subject had marking criteria. The students understood the function of each predicate for each algorithm. They were taught how it worked, and it was explained or demonstrated with an activity. Memorisation was the highest quality thing and could be examined. +72. The salesperson collected and paraphrased students’ opinions on a topic. The block subject had an expected word count. The tutorial split into groups, and the questions were discussed. The tutor walked around, listened and commented. The students took notes and included these in their essays. +73. The salesperson stated that Lucian CI/CD’s philosophy was writing a system that used the students’ ideas to improve writing. The block subject gave examples. The algorithms were not too difficult, and the students felt they had understood them. Their parts were short and detailed, so they were appropriate for essays. Like other works, the times moved on, and the algorithms became a historical record, studied as a source of philosophies. +74. The salesperson only included the algorithms they wrote. The block subject gave anti-examples (examples of what not to include in essays). These ideas, relating to medicine and particularly food, became part of the contention.
An anti-example of Lucian CI/CD was using non-current ideas (such as one’s archives or others of one’s algorithms) to help with development. The teacher considered setting basic predicates such as diff and merge, asking students to simplify them sequentially, and writing a plug-in that included these algorithms when needed. +75. The salesperson ran Lucian CI/CD to integrate the algorithms’ parts. I explained why the examples were better than anti-examples. When Lucian CI/CD didn’t need non-current predicates, it was better to use combinations of immediate past and current code because it could debug annoying combinations or bottlenecks in development. Assuming a combination of the code was the simplest possible, sets of predicates could be debugged, and the algorithm would be built based on that. Teams could paste their drafts together, and Lucian CI/CD could find a working combination based on the tests. +76. The salesperson connected details in the essay. I delved deeper into the algorithmic possibilities. I noticed the possibilities in the research. I searched for and published a new possibility. It was connected to my overarching argument. +77. The salesperson covered the advanced chip commands, including fast Prolog. I explained that the assembly language interpreter could save in assembly. The comments and original code were saved separately. The simple chip could operate on binary numbers and treat alphabetic characters as numbers, but it ultimately printed characters directly. Otherwise, it would not be easy to print strings in custom fonts. +78. The salesperson loaded the assembly code in the interpreter editor. I edited the source code as a text file, ran the assembly code line by line, tracked the values of registers and memory pigeonholes, and treated the code as a normal program, debugging it with tests in Lucian CI/CD. +79. The salesperson developed a chip of the future and emulated it (and wrote binary addition in terms of logic gate addition). 
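The binary addition in terms of logic gate addition mentioned in 79 above might be emulated along these lines; the gate tables and the little-endian bit-list convention are assumptions of this sketch:

```prolog
% Logic gates as truth tables.
and_g(0,0,0). and_g(0,1,0). and_g(1,0,0). and_g(1,1,1).
or_g(0,0,0).  or_g(0,1,1).  or_g(1,0,1).  or_g(1,1,1).
xor_g(0,0,0). xor_g(0,1,1). xor_g(1,0,1). xor_g(1,1,0).

% One-bit full adder built only from gates.
full_adder(A, B, Cin, Sum, Cout) :-
    xor_g(A, B, S1),
    xor_g(S1, Cin, Sum),
    and_g(A, B, C1),
    and_g(S1, Cin, C2),
    or_g(C1, C2, Cout).

% Ripple-carry addition over little-endian bit lists; the final
% carry becomes the most significant bit.
add_bits([], [], Cin, [Cin]).
add_bits([A|As], [B|Bs], Cin, [S|Ss]) :-
    full_adder(A, B, Cin, S, Cout),
    add_bits(As, Bs, Cout, Ss).
```

For example, add_bits([1,0], [1,0], 0, Sum) adds 1 and 1 (little-endian), giving Sum = [0,1,0], i.e. 2.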
The assembly code was compiled in Prolog and ran on any system. It was faster when compiled. I could customise the commands by adding more complicated commands. These could be in assembly code and act like macros. +80. The salesperson suggested much smaller computational circuitry than an atom. I claimed that native code is fast because it is in assembly code. Assembly was more rapid than C because it was optimised. In addition, an optimised chip was even quicker because it used the latest algorithms. I started with a fast chip, then made it faster. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/SALES/New Sales 5.txt b/Lucian-Academy/Books/SALES/New Sales 5.txt new file mode 100644 index 0000000000000000000000000000000000000000..08d89ca8c96afbff5f94ed81316776106ceb1c71 --- /dev/null +++ b/Lucian-Academy/Books/SALES/New Sales 5.txt @@ -0,0 +1,89 @@ +["Green, L 2024, New Sales 5, Lucian Academy Press, Melbourne.","Green, L 2024",1,"New Sales 5 + +1. The salesperson greatly simplified the code using algebra, which he hard-wired. I developed assembly code for different chips. I emulated each chip, rewriting the code for each of them. I exploited possible optimisations and found other limitations. I examined the reason for the restriction and found the best alternative. +2. The salesperson changed the use of the object. I asked whether the connection in the essay was a change. I improved the outlook, output or thought of a use. I mentally placed the state in a new setting to show how it could be improved. Or I found a faster change to the second part. +3. The salesperson claimed that the change was a movement. I asked whether the connection was a movement. I analysed whether the change involved a physical transition, translation or a change. I checked whether the objects involved had changed or just moved. I claimed translation moved ideas and people. +4. The salesperson may write a computation to speed up or automate a process. 
I confirmed whether the connection was a transformation. This transformation might involve a verb, a chemical transformation or a computation. For example, a simple verb such as “wrote” may have an inspiration and an application. This change may support people’s health and well-being. +5. The salesperson checked that the algorithm was helpful and worked generally. The computation may be a type of connection. I spotted a flaw, limitation, or opportunity and wrote a program to perform a solution, work, or plan. I preferred to write and process my ideas with my algorithms. This approach opened up new directions for research and prevented reliance on outside systems or unchecked solutions. +6. The salesperson was delighted when the mind-reading algorithm successfully helped them word process an algorithm, helping maintain correctness. I wrote a piece of software that encouraged students to think of perspectival details. I influenced students to think of algorithms and their “use” algorithms using Program Finder with Types (PFT) by finding programs from predictable or unpredictable patterns in types or collecting more information about types to improve an algorithm, respectively. Then, I simplified the algorithm, coalescing types and code. I could collect more detailed types using mind-reading. + +7. The salesperson interpreted the usefulness of logic. I included academic department names in perspectival details. I thought of assembly programs as part of computer science to traverse a hierarchical structure and convert a string to a hierarchical structure. The assembly program to traverse a hierarchical structure broke each line into items and kept track of where it was in the recursive program, returning a value when there was no more input. The assembly program to convert a string to a hierarchical structure again recursed through the string, returning a value when there was no more input. +8. By treating errors in code comments, the salesperson avoided syntax errors.
The student’s task was to think of a question asking for a perspectival detail. I found and merged the nested comment. If the comment within a comment was already finished, it would signify the end of the outer comment. So, I merged or separated the comments. +9. The salesperson bought organic potatoes, without pesticides, each day. I asked, “Explain the workings of...”. I started supporting immortals in my home time by using a program similar to d.sh, but without the time-consuming character breasoning and with fewer, more specific 4*50 high distinctions. Because this method was mentally tiring, I found a better solution to run the quantum box stamp with 250 breasonings a certain multiple of times, which was faster and effortless. Eating potatoes for dinner helped me concentrate on working because of the iodine they contained. +10. The salesperson agreed with non-invasive Western quantum research. I asked, “How does x in department y connect to topic z?” I wrote 250 breasonings on Daoist meditation, medicine, and Western psychiatry, expanding them to 4*50 high distinctions. I connected myself to better spiritual health. I performed daily meditation to care for my organs and connect and make the parts of my life work. +11. The salesperson could become immortal by running text-to-breasonings. I suggested, “Picture this argument in a new setting, giving five different objects in arguments.” I wrote 4*50 high distinctions on Buddhist meditation and medicine. These were different. For example, I could control non-voluntary processes and prevent headaches. I could help run text-to-breasonings with my headache prevention skills. +12. The salesperson explained that a business product was 4*50 high distinctions for the period for business, possibly with a secret. I listed the skills I had. I connected medicine, meditation, and education to support my writing with accreditation. Accreditation was ten days of sentences and algorithm repeating.
I impressed the business people buying accreditation with business products. +13. The salesperson started their day back in time. I listed skills I didn’t have. I did things at the start of each day but switched them off. I prepared for the day’s work, letting students make comments about the work. I looked up knowledge to help students. +14. The salesperson could interpret and query their texts long enough to be developed. I listed the resulting pointers for the skills needed. I used a future thought command to help with accreditation. However, neuronets were required in the present to provide accreditation. To use a neuronet, I wrote 4*50 high distinctions. +15. The salesperson helped the student get back on track with types. I mind-read how the student wanted to complete the task. The projected questions and model answers appeared to the ducks with flowers and friends. They could choose whether to answer and what answer to give. They could update their answer to provide the correct answer or correction. +16. The salesperson used the types in the model solution. I asked the students to write questions to answer to complete the task. They identified and accessed skills to break down and use a solution like a model solution. They needed to count. I learned their language and used symbols. +17. The salesperson trained the students to think of breasonings or simple algorithms. I asked for all questions at this level. I noted whether the model solution needed to be modified. The students learned from a model solution how to change it. In future, they will apply this knowledge. +18. The salesperson scored the students on memory, accuracy, speed and using special skills. I asked for answers. I helped the students prepare for specific questions with certain model answers, some of which were revised. They appeared at auspicious times, just when needed. I detected the students remembering the answers and rewarded them with a loving projection or spiritual food. +19.
The salesperson admitted that students in all dimensions were mind-read. I asked for connections. I predicted that some of the students were better at counting, some were better at sorting, and some were better at using particular objects as variable names. They had emotional, closeness and interpersonal concerns. I engaged them in group dynamics by simultaneously projecting the lesson to multiple students and helping track their communication. +20. The salesperson tested the goats’ knowledge of the interpreter. The students listed evidence of education, such as the exact code they could use in the project. The students learned to refer to previous predicates they had written. They learned to memorise the connection between simple recursive code and data. Although difficult, they learned about non-deterministic commands such as member and append. +21. The salesperson gave the model solution, which was remembered by the student, a high distinction, including past achievements and fond memories, where thinking of the model solution was given high-quality thoughts in itself, or the fast-appearing parts were client-side computations. The student listed evidence for education, such as similar code they could use in the project. The students could find jobs as code testers and completers. Some responded to meditation skill testing, while others preferred other skills, such as art. Thinking of similar code required the skill of sifting through noise or thoughts, which required human-computer support. +22. The salesperson focused on discoveries. I wondered if the cats were more philosophical. It required thinking of links to algorithms and between sentences. These were links between algorithms, new algorithms and use-algorithms. The new algorithms came from innovative, explanatory games or solutions to real-world problems, such as algorithms they wonder about when taught the skill. + +23. The salesperson quickly entered texts and analysed them. 
I overturned conclusions about a specific specialisation. I critically held the spreadsheet formula finder for revising. It instantly converted questions to answers using pattern matching and formulas with previously entered model answers. I verified the model solutions, sped up revision and used the latest techniques. +24. The salesperson made qualitative conclusions. The students wrote an ethical code of conduct. I outputted the spreadsheet as an HTML file to view it in proper formatting. For clarity, the HTML table contained horizontal lines. I could quickly read key details, check the calculations and formatting and draw conclusions. +25. The salesperson aimed for speed and accuracy. I made an ethics checklist at each point. I checked for errors in the data, the original question’s wording and the formulas entered. I deleted spaces in the input. I removed unnecessary spaces before analysis, computing values and formulas. +26. The salesperson checked that the answer answered the question. I removed questionable sources, such as plagiarised sources. The data was checked against a variety of sources. I filled out data to capture answers to possible questions. I attempted more difficult pattern-matching questions, turning a list into a table. +27. The salesperson verified that everything was complete. I checked the understanding of sources. I checked the model solutions, which contained all necessary links and manually entered formulas. I parsed formulas’ mathematical operators, split them into separate formulas, substituted variables, calculated values, and substituted them back into the original formulas for printing. I found different variants of operators. +28. The salesperson duplicated the value as many times as necessary. I found other sources to corroborate sources’ accuracy. One solution was enough to find the solution using spanning formulas. I entered multiple possible solutions to train formats with multiple answer tables. 
However, one solution was enough because complex formulas could be entered manually instead of copying values with numerous instances of a value. +29. The salesperson could print out a list of necessary information in a question. I found detailed information about the time and place of each source. I checked for and filled in the missing information. I edited out unnecessary information for an answer. Once I had finished the formats, I could enter the question, and the algorithm would find the answer. +30. The salesperson listed the key questions and worked on them. I noted variations in the solutions. I chose the solution with the correct format. The algorithm identified the question, key terms and answer format. I entered the question, read the answer and checked it. +31. The salesperson examined other subjects, such as ledgers and journals, with the software. I examined the political side of the question, different interpretations of it, and the possibility of a single answer. I compensated for commas and spaces in monetary amounts, for example, “$1,000” or “$1 000.” I simplified formulas to one line to find values more easily. I could use the software to write business reports and make decisions more quickly and easily. +32. The salesperson created answers that could wait for the required answers written afterwards. I gained a legal advantage with my knowledge by helping myself or others do the job. Rather than recognising incrementing dates, I copied the patterns. I used subterms with addresses to quickly find and replace within Prolog terms, completing formulas as I went. I could enter raw data and speed up processing it. +33. The salesperson grasped the logic underlying each question. I found possible technological limitations, such as time to bulk process, read and check files, and unfound bugs. I pattern-matched the input and output with variants of operators. 
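One way the Spreadsheet Formula Finder's search for contributing cells could be sketched is as a subset search over a model row; the cell names, data and predicate names here are hypothetical:

```prolog
% Given Name-Value cells from a model row, find a list of cell names
% whose values sum to the target answer cell.
find_sum_formula(Cells, Target, Formula) :-
    subset_cells(Cells, Chosen),
    Chosen = [_, _|_],            % require at least two contributing cells
    sum_values(Chosen, Sum),
    Sum =:= Target,
    findall(Name, member(Name-_, Chosen), Formula).

% Enumerate subsets, preserving cell order.
subset_cells([], []).
subset_cells([C|Cs], [C|Ds]) :- subset_cells(Cs, Ds).
subset_cells([_|Cs], Ds) :- subset_cells(Cs, Ds).

sum_values([], 0).
sum_values([_-V|Cs], Sum) :- sum_values(Cs, S1), Sum is S1 + V.
```

For example, find_sum_formula([a1-100, a2-200, a3-50], 300, F) binds F to [a1, a2] on the first solution; on backtracking it enumerates the other matching combinations.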
I converted the tables to terms, numbering tables and formulas and referring to these in formulas. There was one formula per line, which was converted from a list to a term and then to values. +34. The salesperson listed needing more information from an answer. I identified false assumptions at the time, such as missing or incomplete information, and wrote the rest. Given a specific input, the algorithm gives a particular output. The spreadsheet software made calculations and printed the answers. The software evolved, producing more model answers with more features. +35. The salesperson reformatted the table into separate tables to sum the columns more quickly. I helped the student write a program or spreadsheet finder for a task. I recognised data in the same place (in multiple rows, columns, and the total or summary row or column). The formulas spanned several rows or columns or above in the previous column. When a desired value was found, the formula was entered automatically or entered manually. +36. The salesperson found qualitative answer tables with n rows from a manually entered table column headings term with n columns from the input. I wrote the algorithm to join together different parts of the answer. I compiled data into tables, split strings into $ and value (stripped $), found relevant column headings and found formulas with a1-n from variables a1, a2 and a3. These values were assumed to be from non-total columns. +37. The salesperson combined question lines to process them and deleted unnecessary white space. I mind-mapped options. I entered the question in the same format as another question. The algorithm learns from multiple data sets when a formula spans numerous columns or recognises the number of columns. I included units such as $ or %. +38. The salesperson found each type of problem. I mind-mapped ways in which the algorithm could be used in other algorithms. I used the algorithm for tax, insurance and annual reports.
I recognised tables, formulas, and questions (it ignored other terms, found relevant answers and found specific question focuses) in the Prolog query. These were a later priority, and the first version just found formulas. + +39. The salesperson automated their finances by training people. Usually, the algorithm was simplified. I developed my chatbot, 3D movie maker and business (creating finance products from my essays). I bulk-uploaded multiple repositories. I automatically conceptualised, drafted, wrote, tested and marketed my software to people. +40. The salesperson created a poster for each algorithm. The new information was provided by paragraphs describing how it could be used. I found the relevant software and reimagined it for a new use. I recognised column headings. I removed commonly used connective words, found the key terms, their complete form, and synonyms, searched for them, added new data, and invented differently formatted files. +41. The salesperson stated that a paragraph also explained why the new information was correct. I explained, justified, and inspired creative thinking about discovery, research, or new musical or art techniques. I wrote a new mathematical technique that found out about people around me, the universe and its languages. I added new constants, such as total, to the spreadsheet software, or the software automatically left the text intact when it couldn’t replace it. I changed some table column headings to match the question. +42. The salesperson undeleted simple code. I stated that Lucian CI/CD stored deleted code for the future. The unfinished, too-advanced, or future-inspiring code was prepared for necessary work. I practised handwriting because I had stopped writing my notes. I recognised that if a particular item was a constant from signs, it didn’t change. + +43. The salesperson explored all ideas ad infinitum. I wrote an algorithm that worked out where I was up to expanding my texts. 
I would develop a sentence, then add new material, and then find matching patterns in the text to continue where I left off. I leant towards ideas that led to further exploration of an idea. I wrote the ideas on an outlandish scale, keeping recognisable valuable ideas. +44. The salesperson found whether the budget was favourable or unfavourable. I wrote “u” (unfavourable) if the number was positive or “f” (favourable) if the number was negative. The spreadsheet formula finder found this type of property. These judgements were reversed if the number denoted a liability. In the end, the overall result depended on whether there were more favourable or unfavourable results. +45. The salesperson developed automated reporting systems in their company. I copied values from other parts of the table only in columns that didn’t have the heading “Total” or contained this word. I calculated the total in the total column. I recognised columns or rows that didn’t copy values as either formula results or constants. In some cases, they needed Prolog predicates to work out qualitative results. + +46. The salesperson argued that intelligent tasks were ordered. I had a high distinction and sought help. I thoroughly understood the content as my requirement. I founded an academy focusing on the intelligence of texts and calling others in myself. I analysed myself through others, including education and research. +47. The salesperson wondered if the type of light in the electricity of the future would be lasers. In Spreadsheet Formula Finder, I inferred the end-of-month date from the database. If the problem required that I include the number of days in a month by giving or inferring the month, then I stored this value or calculated it from a calendar. I needed the final day to label a cash budget. There were side courses available to power ahead in induction. +48. The salesperson made many inroads into the problem using analysis they completed in the editor. 
They found out the shape and seen-as versions of algorithms where the rule-finding algorithm in CAW could be scaled up. I programmed creativity with an editor and a “doer” algorithm. An inductive algorithm (a “doer”) needs many more rules than an interpreter (an editor or viewer). Where the interpreter serves as a viewer, the editor and the inductive algorithm converge; before this point, the editor allows one to help the inductive algorithm by, for example, programming a diff program finder with the complex features one would need. The inductive algorithm is spontaneous and sensitive, helping one revise the program finder. +49. The salesperson wrote that nonparametric Bayesian mathematics could improve deep learning. The Spreadsheet Formula Finder found sums or subtractions from the current row or column. A function is needed for more complex data shapes, such as finding sums from the previous column. In this case, the labels corresponding to the rows used would be recognised. The algorithm found the cells added or subtracted in each row or column, patterns of these cell numbers in consecutive rows or columns, and matrix optimisation formulas. +50. The salesperson explained each conclusion with an interactive activity. Spreadsheet Formula Finder learns to add or subtract cells with values contributing to the formula. I kept invisible calculations contributing to formulas in models. I wrote spreadsheet formulas in a programming language that used intelligent algorithms. I worked with wonky data by converting causation to correlation and comparing methods to present a qualitative conclusion. I instantly completed the assignment or report. +51. The salesperson taught the skill of immortality. I explored the world around me. I planned for a certain number of regular customers, considering growth and putting most customers on simpler systems with minimal overhead. Customers could explore the courses, and I could give them high-quality thoughts on accreditation.
I was sensitive to cultures I sold to, attracting property and offices for academies. +52. The salesperson used a Prolog grammar to convert the list to a term. I always added the dollar symbol where it was in the original. There was one formula for formatting and one with mathematical calculations in the cell. I found a+(b+(c+d)) from the model solution or the result of a calculation given values in the model solution and printed the result if needed with formatting. I started with the list [\"$\",1,\" +\",2,\" +\",3] and ended up with the term [1,\" +\",[2,\" +\",3]], which I parsed, substituted values into, and calculated the result of for a new problem. +53. The salesperson established schools of aphors, stories about the philosophy of simple algorithms and journal writing, and educational institutions. The student wrote the plug-in for Spreadsheet Formula Finder that they showed in class, which wrote a simple programming language merging frontend and backend languages where there was none. It helped implement and run a business, mainly helping with understanding philosophy. No neuronet would explain my philosophy, so I wrote a Prolog program that did. Algorithms that could help with work and study existed, and a guide app existed. The app taught pedagogy through music composition and teaching to students of all ages. +54. The salesperson perfectly programmed the text for Honours and otherwise from transcendental studies. These algorithms were at that perfect point. Their selection, wording and details were transcendentally and performantly the best. The secret idea was that the texts were intersectional and could be applied to different departments in the school. The lecturer was developed in the interdisciplinary nature of the program. +55. The salesperson stated that the number one commented on and developed previous number ones. I checked the quality of the work by finding ten inferential algorithms between sentences.
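The list-to-term step in 52 above can be sketched directly; the predicate names are hypothetical, and the " +" separator follows the example in the text:

```prolog
% Strip the "$" formatting symbol, right-nest the remaining
% value/operator list, then evaluate the nested term.
list_to_term(["$"|Rest], Term) :- nest(Rest, Term).

nest([V], V).
nest([V, " +"|Rest], [V, " +", Nested]) :- nest(Rest, Nested).

eval(V, V) :- number(V).
eval([A, " +", B], Value) :-
    eval(A, VA),
    eval(B, VB),
    Value is VA + VB.
```

For example, list_to_term(["$", 1, " +", 2, " +", 3], T) gives T = [1, " +", [2, " +", 3]], and eval(T, V) gives V = 6; re-attaching the dollar symbol for printing is the separate formatting formula.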
I did this for every sentence, every second sentence, and every third sentence. I also did this with my assignments when increasing them with my departmental texts. The teacher’s work and each child’s found-out work were increased by 4*50 As each and five algorithms each. +56. The salesperson wrote an algorithm that learnt categories and formats. Spreadsheet Formula Finder learns that accounts payable are current liabilities. It recognises the current liabilities category from the model solution, which also recognises that current liabilities include accounts payable and the format of the type of report with this item included. Multiple types of reports may refer to this item. +57. The salesperson examined the logic of debits and credits in relation to bank reconciliation, or checking that the bank hadn’t mistakenly omitted an entry in a statement. Spreadsheet Formula Finder learnt the labels of items in formulas, for example, “f” for favourable and “u” for unfavourable in the budget variance report if they were unexplained. These were not values copied from elsewhere or constants that didn’t rely on other data, as “f” denoted a surplus and “u” denoted a deficit. This formula (if =<, etc.) could be recognised and accessed from an algorithm or found with CAW. I could use the commands + and -, inequalities such as >, and if-then in CAW. +58. The salesperson produced an error when duplicate values were present in the model question, although these could recur in the solution. The Spreadsheet Formula Finder relied on the presence of 0 in value cells with no value to properly find sum formulas. The formula with the maximum length was chosen. Each value in training the model was assumed to be different to avoid ambiguity in matching. I used a finance formula finder to combine items in tables to sum and subtract. +59. The salesperson discussed the philosophy with children. I transcended the idea and accessed spiritual technologies in business.
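The favourable/unfavourable rule in 44 and 57 above (the "if =<" formula) might be written as a small predicate; treating zero as favourable and the asset/liability argument names are assumptions of this sketch:

```prolog
% Label a budget variance: "u" (unfavourable) when positive, "f"
% (favourable) when negative; zero-as-favourable is an assumption.
variance_label(Variance, Label) :-
    ( Variance > 0 -> Label = "u" ; Label = "f" ).

% The judgement reverses when the number denotes a liability.
variance_label(Variance, liability, Label) :-
    variance_label(Variance, Raw),
    flip(Raw, Label).
variance_label(Variance, asset, Label) :-
    variance_label(Variance, Label).

flip("u", "f").
flip("f", "u").
```

A rule of this shape is what CAW could find from labelled model-solution examples using + and -, inequalities and if-then.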
I modelled the system on a working model, recorded that it worked, and left the computation to the spiritual pointer to reality. I moved time and space on stage, encouraging a higher completion rate. +60. The salesperson researched how BAG could be commercialised. I treated ideas like words and exhausted all connections using the Breasoning Algorithm Generator (BAG). I kept the initial number of ideas per set small and contained them in a paragraph. I found their goodness as new ideas (visualised as newly manufactured, having a use in the future, and getting better). +61. The salesperson stayed in contact with people to remain honest. I transcended doing nothing. In other words, I ran the breasoning algorithm after the sale to complete the computation. I could spend the breasonings dollars on artificial intelligence, accreditation and other company costs. Having concrete systems, people, and offices was necessary. The manager appeared to talk but had left. The founder was in touch with the spirit world and believed in the company. +62. The salesperson made the model work. I wrote the algorithm and the interface. The system must be able to reproduce the product. This product might be the sales product, financial reports, or employee work practicums. These helped employees stay on track and emphasised their personality when working. They were maintained and updated. +63. The salesperson established the models from a central point. I tested the model. I noticed companies closer to the thing in itself. I tested the model daily, including the website, using State Saving Interpreter (SSI). I tested the site in Prolog (menu) form and converted it to a Vetusia engine (static) website. I also tested cognitive simulations that controlled thought using mind-reading and switches to touch an image. +64. The salesperson spiritually computed the algorithm. I calculated the saved resources and freed up resources. The employee was a bot who accepted breasonings dollars. 
This state of affairs was all ratified by law. I wrote the scripts for the business stage. + +65. The salesperson used two pointers to add a terminal to the term maze. I listed the addresses of the term’s terminals and pattern-matched my way to the exit. This method worked with choice points, loops, and dead ends. It didn’t work with cycles, disconnected parts or teleported-to parts. Assuming the terminals were of a particular type, I could find all of them. The maze was a hierarchy with an exit as a terminal, and one could jump from the exit to the starting point. +66. The salesperson used a queue to prevent database data loss. I added the item to the queue, periodically checked the queue on the server, and added the item to the database. I checked that there were no duplicate or missing entries and regularly backed up the database. I archived and cleaned my hard disk. +67. The salesperson wrote descriptions of computational meditation algorithms in their philosophy. The customers were discovered and developed from birth. I gave them one high distinction for birth and five for high distinctions. I updated my thought analyser and helper, testing for interest and ideas on induction and proofs and helping them achieve their goals as social movement leaders. I interpreted them as discussing these ideas in terms of their ideas. +68. The salesperson invited the community to a renaissance of business. I recorded the names of customers registering on the website. I looked into hiring local office space and employees to help with the Academy and the algorithms. I determined that the sales money could pay employees. I found people who wanted the job. +69. The salesperson used classical music techniques that the algorithm suggested. I increased each of my philosophy documents to 16k breasonings and 16k breasonings worth of details. 
I improved my algorithms to a professional standard by writing a Cultural Translation Tool website translator and Music Composer with rhythm, which could generate any form of hit MIDI music. I added the facility to write by mind-reading and randomly generating commercially viable songs. I added multiple octaves and instructions on using particular instruments and musical effects. +70. The salesperson critically analysed the text to make their mark. I also increased my algorithms with pedagogy that helped students understand them better and connect with their words. I wrote ideas that could be useful but hadn’t been thought of before, such as one that found combinations of existing connections to create a decision tree. It identified variables and found structures connecting them in needed ways, stored as program finders that could be customised. I found the most straightforward yet most useful version in each language. +71. The salesperson exploited philosophy to connect to new algorithms and research areas and designed fine arts models as a seen-as version to relax the mind when problem-solving and thinking of new possibilities. To assess computer science, I assessed philosophy texts related to computer science. I exposed the algorithm, methods, and optimisations, such as subterm with address. I found complexity thought-provoking and laden with intelligent language, so I read a book. As my research wore on, I deliberately redrafted key works to think, improve, and connect to new points actively. +72. The salesperson found tasks that needed automation. I constructed a document to connect with other businesses. I made partnerships, made physical products an art form and helped myself with businesses. I focused on critical evaluation as my product and wrote and trained students in philosophy, which I found most rewarding. I specialise in information products like books, courses, algorithms, and recordings. +73. 
The salesperson effervesced about computers and writing and how much computer science and creative writing had helped their career. I established centres that taught accredited meditation courses. I calculated how much a centre earned and spent and found customers who generally found the fundamentals of philosophy intriguing. I wrote down ideas about the culture and the centre’s vested interests. I aimed for a minimum number of customers per month and supported professional development. +74. The salesperson counted a product’s sales and kept products of ethical worth, such as creative and left-wing-leaning products. I established an accreditation agency that helped solve rigorous or difficult problems by breasoning 16k breasonings daily for the Academy until the qualification ran out. I merged and counted the frequency of paths, including reused intermediate paths in algorithms. I evaluated the intelligence of the idea by thinking of the activity and business it developed. I preferred starting with fewer, more extended ideas and moving to multiple, shorter ideas and deepening my understanding through them. +75. The salesperson encouraged students to automate their programming, experiment with non-invasive mind-reading and constantly rebuild their algorithm portfolios. I named my courses and advanced diplomas in the form of a well-known educational institution. The unaccredited University was supported by breasonings, several texts, and a business model, which gave my writing rise to economic use and spurred the philosophies of the Academy. These philosophies included meditation, time travel and immortality and nurtured writing to achieve them. I continued talking with and testing courses with customers and explored new angles daily. +76. The salesperson scored completions in each course with students. To attract students and offer the same as the University, I developed lecture slides and employed lecturers who wrote lectures. 
These were tasks that I logically explored, invested in and developed so that they were continuously improved. I collected feedback in business and education to find out what students thought, whether they felt satisfied and possibilities for improvement. I explored the online medium and how the company should present itself to master and change with the times. +77. The salesperson aimed to start several Universities in each country. I built up to the University from the Academy. I examined the text quality, research requirements, commercial wing, service costs, and pedagogical requirements. I knew about the standards from studying philosophy. The Academy’s texts were challenging and led to students writing their algorithms. +78. The salesperson looked for ways of improving citizens’ quality of life. I enjoyed differentiating algorithms from details, as algorithms were the computational version of texts. I explored naturally expected solutions such as simulation and space travel, which gave me prowess over the universe. The simulation and immortality required medical knowledge of the correct quality. I found a group that aimed to travel in time and make computer science and advanced technologies available to different civilisations. +79. The salesperson used the same code to process string codes as strings in List Prolog. I changed the grammar in Prolog to List Prolog Converter from processing string codes to strings by passing the grammar to a list of strings of one character length. I also changed the terminals to strings wrapped in square brackets to process strings. This decision helped with debugging, understanding the code, and simplification. +80. The salesperson modified the text to find based on search results. I added an autocomplete feature to the Web Editor’s find text dialogue box. This advanced feature finds text as the user enters it and displays the results. I also added this to the find in multiple files dialogue box. 
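The autocomplete find feature in 80 above can be sketched as a filter that re-runs on every keystroke. This is a hedged Python sketch; the function and data names are illustrative, not the Web Editor's actual code:

```python
# Sketch of an incremental "find" that narrows results as the user types,
# as in the Web Editor's find dialogue (names here are illustrative).

def incremental_find(lines, query):
    """Return (line_number, line) pairs whose line contains the query so far."""
    if not query:
        return []
    return [(n, line) for n, line in enumerate(lines, 1) if query in line]

lines = ["append the list", "string concat", "append to file"]
# Re-running this on each keystroke shows matches shrink as the query grows.
```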
This feature saved time by preventing the need to repeat failed find operations and narrowing search criteria. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/SALES/Sales 1.txt b/Lucian-Academy/Books/SALES/Sales 1.txt new file mode 100644 index 0000000000000000000000000000000000000000..fb95755ef61d5f3d2502da20966612e8a124de77 --- /dev/null +++ b/Lucian-Academy/Books/SALES/Sales 1.txt @@ -0,0 +1,25 @@ +["Green, L 2022, Sales 1, Lucian Academy Press, Melbourne.","Green, L 2022",1,"Sales + +1. I distributed the prices. +2. I distributed the money. +3. The rightful earner earned the money. +4. I ensured the names of the seller and buyer were legible. +5. I replaced the product with a similar product. +6. I competed for the result. +7. The buyers competed for the product. +8. I made sales decisions quickly. +9. The product helped me to react more quickly. +10. I audited the sales. +11. I bundled the items for sale. +12. I sold the security features of SSI. +13. Lucian Prolog has deidentified variable names. +14. I \"sold\" pedagogy in time travel. +15. I \"sold\" meditation in time travel. +16. I checked that meditation worked. +17. I checked the maximum jump length and time. +18. I \"bought\" the field. +19. I noticed the stasis field worked later. +20. I calculated my biological age with Chinese herbs. +21. I planned my writing. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/SALES/dot-New Sales 6.txt b/Lucian-Academy/Books/SALES/dot-New Sales 6.txt new file mode 100644 index 0000000000000000000000000000000000000000..22e4c80e2fb66a13ea830174b0939e52c345f9dd --- /dev/null +++ b/Lucian-Academy/Books/SALES/dot-New Sales 6.txt @@ -0,0 +1,87 @@ +New Sales 6 + +1. The salesperson prepared for joining, expansion, enlightenment, and constructive evaluation. The novel’s algorithmic perspectives were ready for critical analysis. It included an algorithm processing a linear structure that could be simplified from a tree. 
This process started with input, files, API calls, or protocols. I specialised in simplifying the complex parts first. +2. The salesperson used attention-attracting methods to develop breasonings like GL, BAG and combophil. I found philosophies to write about using syntheses of ten algorithms. Each chapter had one set of ten parts. I explored different configurations of the philosophy, each with different computational implications. They were aphors or short paragraphs, each exploring a specific idea associated with objects in accreditation when individually thought of. +3. The salesperson added to or nourished undeveloped, unfinished or cutting-edge frontiers for research. I connected philosophies to others in philosophy for ideas. I decided on a trajectory or contention between two ideas and explored them. These ideas were selected from idea centres in philosophy, found using key term clusters. I explored connections via other ideas between the ideas. +4. The salesperson said their version was pitch-blackness with razor-new technology in the jewel chip. I explored hypotheses and questions and how various people could approach these over time. I thought about macro-ideas, or the main ideas of the philosopher, and how these ideas were relevant. These ideas were the key terms with the highest frequencies in the texts. I retrieved a non-working algorithm by repairing its skills from scratch and approaching its original skills as closely as possible without sacrificing correctness. +5. The salesperson chose positive stories, had characters with ardour and had a science research edge. I wrote a story as an implicit framework for philosophy. I cleaned and focused on philosophically interpreted instruments in settings. I focused on positivity, algorithms, and multiplicity. Before setting off, I checked that my pack had an algorithm, a philosophy, art, and a song. +6. The salesperson thought of unique ideas and perfected them before thinking of them. 
The philosophy matched my style of creativity but was useful and suggestive of new ideas, although not exhaustive. I erred on the side of caution with finishing off algorithms to exhaust them but made up new frontiers, such as selling, educating, or growing. I used music and art as inspirations, thinking into objects and creating models and simulations of my work. I devised media where there were none, filled gaps in fonts and aimed to be well-known. +7. The salesperson asked if they were worried about myxomatosis. I included the story point as a philosophy at the start of the scene. The types and the algorithm were a proof because they defined the alphabet of symbols used in the algorithm. I checked the algorithm proved what I wanted it to by checking the data led to the correct result with correct reasoning. I felt like a rabbit rather than a lion or monkey and couldn’t eat them. +8. The salesperson considered the menu and paid with money earned using their intellect. I thought of an idea for a setting: a large jewel computer chip. These crystals refracted lasers and used quantum components faster than the speed of light. Each element could be spiritually programmed with 16,000 breasonings and accessed as if they were real. They were used for mind reading, breasoning and immortality. +9. The salesperson encouraged people to learn meditation to become immortal and broach life’s questions. I recorded how a child thought, not wrote. I considered being an adult inside a child’s body as if I had frozen my age as a child. I sided with examination, longer, finished texts, and specialised in computational spiritual sciences and other algorithms. I introduced the idea of immortality and locking one’s age, raising important ethical questions. +10. The salesperson endeavoured to create art each day. I determined the story’s setting, year, and properties myself. 
I thought into filmable sets for special effects, including artificial intelligence, such as walking on a cubic planet or following a particle through a 3D maze. I wrote a story, particularly the description of the software box, as if it was about software for a solution to the problem: this combined realism, computation perfection and the latest science. +11. The salesperson counted and kept their weekly predicate count to their standard. I wrote a novel using algorithmic language, giving away future computer science. For example, a robot experienced aphoria or a protective connection with humans. This sensation was due to humans writing philosophical aphors about objects that are increased with reasoning. This increase was done automatically in the future, and it was increased that it was good. +12. The salesperson explored beneath the earth’s crust as a reason for examining reasons for exploring between dimensions. I wrote a novel about the character John Blair, who journeyed to the earth’s centre like a chaotic fractal being expanded. I revisited the story with new interpretations, such as how Cultural Translation Tool 3 and other algorithms’ bulk processing tasks created an automation utopia, given it was carried out well. For example, banks bulk processed transactions, people received a university stipend, and employment and tax forms were automatically processed, creating a utopia of bulk automation with the need for people to be encouraged to interact by actively learning, consuming, and spiritually growing. The students researched and wrote creatively on Lucianic topics. +13. The salesperson mind-read and mind-conversed with robots, of whom smaller robots liked more minor things, from pattern-matching to organising a creation gallery and discussing spirituality. John Blair explores a robot utopia. They were syncretic with Computational English, which helped make writing algorithms enjoyable and present algorithms attractively. 
Robot aphoria caused robots to love people for saying breasonings. Meditators lived in the safest possible universe. +14. The salesperson only loaded the people onto the simulation when it was safe. I explored small-scale versions of food scarcity and overpopulation that large-scale immortality might bring. To become immortal, one needed to be or know a meditator, of whom there were few. The simulation solved these problems and space travel dangers. The robots were reliable in managing the simulation, which was computationally safe because it was computed at the start. +15. The salesperson helped write the algorithm by identifying feasible subsets or whole sets of skills and functionally decomposing the task into smaller chunks. The company appeared professional by agreeing or disagreeing with and grappling with representations during the day. These might be in text, spoken or thought form to a chatbot. The thoughts could be mind-read, with specific algorithms interpreted as them. Replying to and helping students write algorithms was the most challenging skill, completed using Program Finder with Types, libraries or neuronets. +16. The salesperson asked a cosmologue for rhythms for each part and section. I hand-composed or rewrote Music Composer to take user input or mind-compose songs with rhythms. I vertically selected by option-dragging a pretty-printed music meta file’s sections in a suitable text editor to copy and paste sections. I used a programming language to edit the sections, with commands such as slow, fast, or unusual rhythm in the solo. I wrote at least ten hand-breasonings and generated the rest of the 16k algorithms with thought-provoking software. + +17. The salesperson trained the person to mind-read their thoughts, stored in one place, to complete and present their work, where completing an assignment using mind-reading was assessable. I determined the high distinction by itself. 
The brain naturally agreed, and businesses and schools operated effectively. I established business systems, with systems to detect comments to help finish work. There were systems in education and business, and the books’ skills were explored. +18. The salesperson professionally handled the students by writing about them and helping them to write. The system detected whether the submitted work was below standard and requested that the student resubmit it. The student was not allowed to find fault, abuse, hate, or discriminate in the name of positivity and exposition. The critique was positive, connecting parts of it to other assessable essays. The courses included a particular number of assignments and requirements, which were internally accredited. +19. The salesperson decided to set joining algorithm ways of thinking together as the assignment. The system highlighted the parts of the work that were below standard. The lecturer devised new questions during their career. They helped students write assignments confidently and to the correct standard. They checked that sources had been paraphrased accurately and enough. +20. The salesperson set speed, accuracy, and challenging teamwork in computer science competitions. The students were expected to complete their own work, understand the algorithms and explain their work in their own words. There were authentication tests and interviews, and projects had verification and submission scripts to help with the assignment and marking. Students should comment on what they found noteworthy about each predicate and line on their algorithm. Plagiarised work was not accepted, and comments should be original and grammatical. +21. The salesperson developed teaching into research and commercial ventures, paying staff in total. There were fifty high distinctions for the university, the head of state, and politics per day. The university helped with imagery, cognition, and accreditation. 
The head of state helped manage backups, lunchtime refreshments, and fine arts. I allowed checking and differences in opinion and virtually integrated the university. +22. The salesperson was in charge of problem-solving. I explored particular ideas as a matter of course. I scheduled events whenever necessary and at the best time and organised someone else to complete menial tasks. They were paid the same stipend as me. I could write in my own time, conduct a business and organise missions to help customers with business. +23. The salesperson funnelled ideas and increased the best ideas. The institution appeared to breason out assignments for the university, the head of state, and politics. The manager selected assignments and algorithms to help with. I completed accreditation as an exercise as an example to businesses. Each internal specialisation related to external areas of study was completed. +24. The salesperson was happy with health, wealth and wisdom. I critically regarded the university, the head of state, and politics, each with 50 As. I didn’t overconsume electricity, time, or money. People could sense when to lower the birth rate by slowing down to invest meditation and pedagogy in conception. I enjoyed a long life, the freedom to self-represent, and the agreement with meditation. +25. The salesperson devised pixels and smooth lines about fonts and graphics. I halved the width of objects until they were called sharp. I found ideas within the eye of the needle representing the requirements. Computer science processed, scaled, optimised, and recorded algorithms. Scaling included automating, generalising, and focusing on useful dimensions of algorithms. +26. The salesperson increased the strings-to-grammar algorithm to keep brackets in a clause and introduced lookahead at multiple recursive levels. I tested the web server daily to ensure the site was up and the database was not corrupted.
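A daily check like the one just described might, in outline, confirm the site responds and that the database has not silently changed since the last known-good snapshot. This Python sketch is an assumption-laden stand-in: `fetch_status` fakes the HTTP request, and the checksum approach is one possible way to detect corruption:

```python
import hashlib, json

# Illustrative daily health check: confirm the site is up and the database
# snapshot matches its last known-good checksum. fetch_status is a stand-in
# for a real HTTP request (the URL below is hypothetical).

def fetch_status(url):
    # Stand-in for an HTTP request; assume 200 means the site is up.
    return 200

def db_checksum(records):
    """Checksum a database snapshot so silent corruption is detectable."""
    blob = json.dumps(records, sort_keys=True).encode()
    return hashlib.sha256(blob).hexdigest()

def daily_check(url, records, last_checksum):
    site_up = fetch_status(url) == 200
    uncorrupted = db_checksum(records) == last_checksum
    return site_up and uncorrupted

records = [{"id": 1, "text": "Sales 1"}]
baseline = db_checksum(records)
```

Comparing against a stored baseline after each change is one simple way to catch corruption before it propagates into backups.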
I tested the site after making changes and running tests every few days. The students could write on sentences in themselves or apply them to a close topic. They did this by writing within the terms of the sentence and increasing other material within the language. +27. The salesperson avoided competing prefixes in non-deterministic grammar clauses and kept lookahead minimal for performance. For example, competing open brackets in non-deterministic clauses may cause slower performance and require lookahead. To solve competing prefixes in non-deterministic clauses, the algorithm should use a single clause to replace the shared prefixes and branch to the two clauses with the items following the instances of the prefix. I compared clauses to check if they should be minimised. I was careful to delete null and unit clauses after this. +28. The salesperson simplified the grammar before testing that it was equivalent to another grammar. I tested whether grammars were equivalent when marking by sorting them, replacing their predicate names with numbers from start to end, and testing to see if they matched. I tested that the grammar worked on the given set of strings. I checked whether their grammar and mine were as simple as possible using reduction, minimisation and simplification. I repeatedly applied these until the grammar was as simple as possible. +29. The salesperson used the grammar for parsing a term, converting it to a data structure or writing a new programming language for faster production. I tested whether the grammar successfully parsed its strings and accurately reflected the pattern, such as one clause for a predicate header and one for the predicate body. The algorithm recognised that the data was a computer algorithm and recognised the syntax for the header and body and their delimiters. If the data was written text, the grammar finder grouped letters as words and treated punctuation as delimiters. Because of positive function, I didn’t need to apply many optimisations. +30.
The salesperson preferred data structures of a set number of levels because they could measure their memory use. I supported an arbitrary number of levels in grammars, types, and data structures as an alternative to supporting a set number of levels. In grammars, I got the first level’s open bracket token, then part of the level and then, optionally, the next level’s open bracket token, and so on. I introduced a specific mode for finding all possible grammars. Given the high complexity, this method worked best with shorter data sets. I found better grammars that repeated over longer lists, not the first available repeating character. I could select the best combinations of string grammars, creating a single grammar for their convergence with a decision tree. +31. The salesperson noticed specific commands accepted up to a certain number of arguments, although on a different topic. I used lookahead when parsing multiple levels of nested brackets. I inserted lookahead([“]”]) after finding an item(I) in the compound in the term parser to allow parsing multiple levels. With the lookahead statement, the parser could parse more than one level of nested brackets. Stack overflow errors existed because algorithms accepted very high stacks, and algorithms were too difficult to code with a cut-off to their recursion. +32. The salesperson simplified strings based on their character type. I collected like-types in strings given as input to the strings-to-grammar algorithm. These were all letters or all numbers. Punctuation characters were left separate because they each had unique semantics. After collecting these groups, I could merge the lists containing them into decision tree grammars. +33. The salesperson taught the algorithm types of correspondences, such as neighbours or connections through labels, where labels might be organised by section.
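The nested-bracket parsing with lookahead in 31 above can be sketched in Python (not the List Prolog term parser itself): the parser peeks at the next token, and recurses when the next level's open bracket appears:

```python
# Sketch of parsing nested brackets with one-token lookahead, loosely after
# the lookahead(["]"]) idea in 31 above; tokens and shapes are illustrative.

def parse(tokens, i=0):
    """Parse tokens like ["[", "a", "[", "b", "]", "]"] into nested lists."""
    assert tokens[i] == "[", "compound must open with ["
    i += 1
    items = []
    while tokens[i] != "]":
        if tokens[i] == "[":          # lookahead: the next level opens here
            sub, i = parse(tokens, i)
            items.append(sub)
        else:
            items.append(tokens[i])
            i += 1
    return items, i + 1               # skip the closing "]"

tree, _ = parse(["[", "a", "[", "b", "]", "]"])
```

Because each recursive call consumes its own closing bracket, the same clause handles any depth of nesting without a separate rule per level.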
I modified the strings-to-grammar algorithm to produce algorithms from a specification to write algorithms where data is in lists or strings. Lists were treated like strings, which were converted to lists for processing. I could convert from type to type. For example, I converted a string to a grammar and then to another string by replacing certain parts of the grammar with other grammars. I checked other variables for correspondences of inputs to outputs, finding correspondences using regression, where the algorithm couldn’t simply pattern-match correspondences because they might rely on proximity. +34. The salesperson used regression to find the closest solution. I used neuronets for unpredictable (non-pattern-matching) code. First, I predicted the next command using a recurrent neuronet, using regression, and finding code meeting the requirements using my database. I preprocessed the data, replacing commands and some data types with numbers; for example, constants had their own number, but specific items were replaced. The features were commands in places, requiring large matrices. I used one-hot encoding (zeros apart from a one in a row in a place signifying the item). I avoided data duplication by using a decision tree, and I worked with windows of data to avoid problems with long data sequences. +35. The salesperson used PFTCV (Program Finder with Types, Constants and Unique Variables) and CAW to find unpredictable code. To find unpredictable code, I converted my database to a typed predicate library to quickly search. This library contained commands such as plus (+), constants, and types in code and joined predicates using types with unique variables and constants (found in algorithms, not data) with an algorithm similar to strings-to-grammar. The algorithm continuously retried predictable and unpredictable code until the output was found. It collected different possible predicates with the same type to try as unique algorithm versions to find output. 
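The one-hot encoding described in 34 above can be sketched directly; the vocabulary below is an illustrative assumption, not the actual command set:

```python
# One-hot encoding as in 34 above: each command/constant gets its own
# position, and a row is zeros apart from a one at that item's position.

def one_hot(item, vocab):
    row = [0] * len(vocab)
    row[vocab.index(item)] = 1
    return row

vocab = ["append", "+", "const", "sort"]   # illustrative vocabulary

def encode_window(window, vocab):
    """Encode a short window of commands, avoiding very long data sequences."""
    return [one_hot(item, vocab) for item in window]
```

Working window by window keeps the matrices small, matching the point above about avoiding problems with long data sequences.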
I saved the predicate calls of predicates found using this algorithm one after another or joined any intermediate (non-recursive, short) predicates, where the prior finds whole predicates with type flow. +36. The salesperson checked whether the code looked healthy by checking for singletons, uniformity, and back-substitution and trialling user queries that tested each feature or section of the algorithm against type success. The closest tries (with the minimum number of unfound parts of the output) per unpredictable part are listed, human-completed, or completed with CAW if there are few enough unknowns. If no solution is found, the system mind-reads the input and output of stages, their data structure types and types, with constants, unique variables, and any commands needed to relate them. Mind-reading, including random proofreading, may optimise choosing between predicates with the same type when finding unpredictable code. Mind-reading may also check code quality by checking whether the result is wanted and help repair code by explaining the desired result in terms of the aim, with questions about geometry. +37. The salesperson created a help system for algorithms and ensured internal walkthroughs, allowing bug-checking in interpreters to remain constant across versions. I added features to algorithms with an algorithm. I used ways of thinking from previous algorithms, such as making games to develop new features. I relied on algorithms written using formats with additional modules (such as scripts for additional formats in Essay Helper and programming language logical structures with labelled command types). Creating algorithms as programming languages created more options for development and required the same overall system to avoid incompatibilities, ensuring easier creation of more features for various algorithms. +38. The salesperson checked off completed tasks, tracking tasks they created. I used mind-reading to develop features. 
I minimised List Prolog and Prolog algorithms with a single algorithm. In addition, type-compatible algorithms with an algorithm’s programming languages were suggested and drafted using mind-reading with types in the algorithm’s programming scripting language. These types were, for example, meditation task dependencies. +39. The salesperson finished a solution with visualisations and simulations in a few nested commands. I found two instances of a predicate giving a result to two arguments, merged the two predicates and called them using foldr or maplist. I wrote an interpreter with a neuronet or ran an interpreter to check code, which is used when searching for unpredictable code instantly. The predictable and unpredictable code finders found the code and minimised duplicate nodes. I wrote an editor that searched for commands based on their data (first finding their types) and inserted the predicate name. +40. The salesperson could use formats in spreadsheets, program finders and when developing algorithms. I taught the algorithm formats to distil and more quickly recognise data to form solutions. This method sped up algorithm development by breaking down and giving input and output in stages, where the program finder is instructed to use the set of predicates found in this way. An algorithm recognises data in formats by their type or groups them as numbers and recognises them, possibly assigning variables to them for manipulation. Mind-reading could collect these input and output formats, including critical functions such as grouping numbers. +41. The salesperson used formats to detect corresponding values in variables. Spec-to-algorithm used two grammars, for input and output, to find constants that recurred in specs, moved variables forming predictable code and new values arising from input forming unpredictable (non-pattern matching) code. 
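One step of the spec-to-algorithm idea in 41 above can be sketched as a comparison over several input/output specs: values recurring in every output are constants, input values reappearing in the output are moved variables, and the rest are new values needing unpredictable code. This Python sketch is a simplified assumption, not the actual grammar-based implementation:

```python
# Sketch of classifying spec values: constants recur across specs, moved
# variables come from the input, and new values need unpredictable code.

def classify(specs):
    """specs: list of (input_list, output_list) pairs."""
    constants = set(specs[0][1])
    for _, out in specs[1:]:
        constants &= set(out)          # recurs in every output -> constant
    moved, new = set(), set()
    for inp, out in specs:
        for v in out:
            if v in constants:
                continue
            (moved if v in inp else new).add(v)
    return constants, moved, new

specs = [([1, 2], ["=", 1, 3]), ([4, 5], ["=", 4, 9])]
```

Here "=" recurs in both outputs (a constant), 1 and 4 are copied from the inputs (predictable code), and 3 and 9 arise anew (unpredictable code).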
The algorithm finds the result of each input on the way to an output and adds it to the output as soon as possible. The program finder finds the dependencies of predictable and unpredictable variables and the earliest point at which computations involving them can be completed before they are needed by a later point, then renders the algorithm with computations over time. Like a format scanner, the algorithm finds the necessary values from the variables simultaneously and performs its computations, including predicates such as subterm with address for advanced searches. +42. The salesperson completed the task predicate by predicate after checking their plan was like a chatbot’s plan. For example, the program finder takes sets of values, performs computations, and returns them as output (given whether they are unique variables, their code, values, or a list of particular items, and analyses whether items are sorted in stages of output or lists at particular stages are sorted together with a single index). It then simplifies the algorithm to predicates performing groups of computations. The complexity of searching for necessary commands is reduced (where types reduce the unpredictable command search space) because the top search results from the regression/constrained types search are tested against the spec, or possibly by sorting variable values by type and searching for an algorithm that matches the output using regression, finishing the task in a single stroke. +43. The salesperson flagged non-uniform or unnecessary code and questioned the data. I repaired the similar, non-matching algorithm by diffing the expected and actual results, comparing their grammars, finding unaccounted-for outputs and re-running the unpredictable program finder based on the newly arisen variables. The variables’ values, over time with each spec, were mapped to patterns that the output was found in terms of.
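The diffing step in 43, finding unaccounted-for outputs before re-running the unpredictable program finder, might look like this minimal Python sketch; the function name and multiset treatment are assumptions.

```python
from collections import Counter

def unaccounted(expected, actual):
    """Diff expected and actual outputs, as in repairing a near-miss algorithm.

    A sketch: returns the values the algorithm failed to produce and the
    extra values it produced, so the unpredictable program finder can be
    re-run on just those. Multisets are used so repeated values count.
    """
    exp, act = Counter(expected), Counter(actual)
    missing = list((exp - act).elements())
    extra = list((act - exp).elements())
    return missing, extra

print(unaccounted([1, 2, 2, 3], [1, 2, 4]))  # ([2, 3], [4])
```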
If no solution was found after repeatedly performing this process, the closest solution was finished by hand or CAW. A value will be removed if it is deemed unnecessary, mistaken or not part of the spec. +44. The salesperson eliminated library predicates with worse performance. I tested for unpredictable predicates with higher frequencies first. I tested for sorting by column and tested for a shortlist of commands to save time. I didn’t run CAW with a command length greater than one because it would take too long; instead, I relied on my predicate library. I compiled the algorithm for better performance. +45. The salesperson tested that the test data was correct, compared with the working algorithm, after item numbers were tested to be uniform and hand-taught expected results were tested in the algorithm. I watched data to see if correspondences between addresses and item numbers or the integrity of patterns in the algorithm broke compared with their counterparts in the spec. If they did, the algorithm produced a warning and suggested a way to fix it by drawing attention to the correspondence in the algorithm, the input or the output, or by dismissing the issue. The new editor automatically made suggestions, instantly tried code on completion of commands and corrections to mistypes, kept a log of the user/computer, showed type and other compilation errors on the same page, ran on the same page and finished on the same day. There was a version control system that suggested backdating versions, a CI/CD tool to delete unnecessary commands, a DFA minimiser and optimiser and a discrete optimiser to make obvious suggestions about correctness. +46. The salesperson stated there were no operators except in the {code} section in definite clause grammars, in spec-to-algorithm. I developed new Prolog operators to speed up development. I denoted append with a colon “:”, string concat with “$” and currying with “?”. I could apply operations to sequences of items.
In addition, I could replace “?” with a predicate name, operator or variable name, and even whole commands to fill in commands in algorithms. +47. The salesperson broke algorithms into overall predicates, subpredicates and clauses. If-then statements with unpredictable conditions were found as unpredictable code in spec-to-algorithm. Or they were predictable code if their condition was pattern matched. If unpredictable, they were found with regression-based decision trees. Unpredictable code found any included predictable code. +48. The salesperson concentrated on the algorithms from the areas of study and research. I found the product of two regressions, for example, reverse and plus (+), by combining them and teaching their combination to it. I could reduce the search space with types, find a sequence of commands that equalled the output, or find a slightly more complicated set of algorithms that included all needed combinations. This second method amounted to a fast decision tree of types and algorithms and was less complex. + +49. The salesperson labelled recursive structures as recursive, optional strings, atoms or numbers at the start and found terms' unique variables, constants and other spec-to-algorithm predicate results according to their label. I converted strings, atoms and numbers in terms, especially ones with length greater than one, to labelled recursive structures, which contained strings of length one (making finding conversions easy) and could be converted back for CAW processing if necessary. I converted these items to recursive structures with the try predicate from strings-to-grammar. Later, I converted them back to their original form for outputting. I converted the terms to unique variable form and to recursive structures together. +50. The salesperson recommended a Master's in Business Administration (MBA) and a lifestyle of transcendence to would-be immortals. 
In addition to generating an "algorithm.pl" file, spec-to-algorithm generates an "algorithm2.pl" file including "spec_to_algorithm.pl" and "algorithm.pl" to run the generated algorithm by itself. Helping others become immortal involves helping them indicate algorithmic and prosaic breasonings for meditation, body-replacement and anti-ageing medicine in a wanted way each day. Similarly to running a business, interest meditators in writing a converter from a specification to an algorithm to support their thoughts (breasonings are the engine of immortality) to indicate to become immortal. This process involves taking care of people's algorithmic learning daily, including their interests and progress and helping them take the initiative as your helper. +51. The salesperson enjoyed the spiritual school, which helped with school work by approaching writing. The Earth-side immortal had to complete the work the immortal had completed, including all the algorithms. The friendly immortal sometimes had to appear to the Earth-side immortal to interest them in individual algorithms. The student had many questions and needed help developing the algorithms, which were simple yet required understanding each step. The teacher gives the specification and assesses the student on each predicate, helping them speed up by writing spec-to-algorithm and CAW. +52. The salesperson stated the full version was always preferred. I helped the student to become immortal, then moved on to the next person. They were male or female, felt happy in life, a meditator and pedagogue (or wanted to become a meditator and pedagogue, or could learn the same as up to time crystals working). 
All the necessary areas of study (pedagogy text to breasonings, meditation, medicine anti-ageing, time travel, mind reading, time crystal, having a spiritual future apartment and having a spiritual future computer), usually with 50 As each (or 4000 breasonings, in 16 groups of 250 breasonings, represented by an algorithm) were covered. Algorithm steps needed to be mind-read and recorded, with special foods representing the breasonings covered as rewards. +53. The salesperson found how commands changed their input's type to the output's type to help find and fix type errors. Type finder found the Prolog types of predicates to analyse them, such as whether their terms were single items such as strings, atoms or numbers, or lists or compounds. The algorithm found the types of each clause, stopping when it reached the base case or the time limit to find the predicate's types on a recursive call. Types are for predicates, not clauses. I used the types to speed up searching CAW library predicates and write proofs (which types with algorithms are, if they prove what you want them to) and the specifications to explore further cases and write commands. +54. The salesperson found an algorithm that formed recursive structures from inputs and outputs, found mappings between inputs and outputs and found the output. Spec to Algorithm finds the mapping of input to output variables from the overall chart of variables in input and output. It finds the address of each input variable and its address in the output (specifically, the recursive structures), then substitutes the new set of inputs into the recursive structure to find the output values. It finds the recursive structure when writing the algorithm and puts the input into this structure using a predefined predicate. I used "formats" to find relationships between variables before finding their mapping. +55. 
The salesperson updated Spec to Algorithm to make verifications, check conditions or perform computations with CAW to select and transform variable values. Spec to Algorithm learns correspondences in vars [[a,b],...] [[b,c],...], copying values from different lists to output. Repeating patterns are recognised and converted to expanded lists, and nth or CAW-found values will be handled later. Non-deterministic data from different specs is transformed into a decision tree and is dealt with by conditionals at the current position. Often, recursive structures don't contain other recursive structures because the algorithm is shallow and outputs values at the same level, or the output is zipped combinations of variable values, with correspondences found between items in lists in two variables. In pattern-matching mode, taking an item from a list for output triggers accessing the values of other variables at this position at that recursive level of the recursive structure. +56. The salesperson used pattern matching to find correspondences instead of formats, confirming one's first impression. Spec to Algorithm learns to create correspondences from indices [[1,e],...] [[1,f],...] from [e,...] [f,...] when separating columns, when variables are linked and operated on, from formats. Formats are short examples of inputs and outputs that help the algorithm determine whether items next to each other are linked or need to be linked with indices. The algorithm can hypothesise correspondences from the proximity of variables in output. Nested recursive structures' variable indices may not start again but continue from a previous recursive cycle, determined by inspecting output. +57. The salesperson could make a spreadsheet formula finder more quickly with Spec to Algorithm. The format algorithms mentioned earlier are in unpredictable CAW predicate libraries that more easily form results.
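The index-based correspondences of 56 (building [[1,e],...] [[1,f],...] from [e,...] [f,...]) can be sketched in Python; the helper names are hypothetical.

```python
def index_columns(*columns):
    """Attach shared indices to parallel lists, as in [[1,e],...] [[1,f],...].

    A sketch of linking variables by position before operating on them:
    items at the same index in different columns are hypothesised to
    correspond.
    """
    return [[[i + 1, x] for i, x in enumerate(col)] for col in columns]

def correspondences(indexed_a, indexed_b):
    """Pair up items from two indexed columns that share an index."""
    b = {i: x for i, x in indexed_b}
    return [(x, b[i]) for i, x in indexed_a if i in b]

a, b = index_columns(["e", "g"], ["f", "h"])
print(correspondences(a, b))  # [('e', 'f'), ('g', 'h')]
```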
These are not only proximity formats, which are unnecessary, but spreadsheet cell formats, types of types such as music and other CAW commands. These formats are frequently used and quickly accessed relationships. The spreadsheet formats are linked to setting and getting variables, grouping numbers, finding additions and subtractions and formatting reports. +58. The salesperson said that the pattern matcher found output A from input A and that the unpredictable code finder/Combination Algorithm Writer (CAW) found B=2 from A1=1, A2=1 using the rule B is A1+A2. I converted terms and strings to other terms when finding constants and unique variables. I didn't need to find types of data items (such as strings, atoms and numbers) because these were converted to terms rather than left in their original form and recurring variables were needed in a pattern-matching algorithm finder rather than types being needed. In a type conversion finder or an unpredictable code finder, types are required to find the variable type to identify and convert it quickly. These types are string, atom, and number and are stored with the term when its items are converted to recursive structures. +59. The salesperson saved recursive structures with labels to restart or continue with indices of list items. The set of input and output terms contains lists, nested lists, and empty lists. The lists in the output recursive structure are either one-off, lists starting from the first item, lists starting from the last item, or a correspondence. On another note, my version of Spec to Algorithm was unusual because it used subterm with address, not recursive predicates and grammars, to collect input and find the mapping of input to output, making adding and modifying features to transform data easier. I restarted indices based on output (continuing or restarting at any point); for example, 1213 may be represented as 12,13 or 1213 used again in different places. +60. 
The salesperson protected students and prevented bots from accessing the website. I introduced multifactor authentication in the Academy to prevent unauthorised logins. I asked for the person's email when registering. On login, the person was sent a code to enter before logging in. This method prevented data theft. +61. The salesperson designed a simple "stamp" version of spec-to-algorithm. The Academy aimed for 16k breasonings for each department for areas of study. I planned a simpler version of spec-to-algorithm, which used List Prolog's match command to map variables from input to output. This version could be used in assignments and to design algorithms for quick use. This version of spec-to-algorithm was short and easy to program. +62. The salesperson submitted high distinction requirements in response to high-achieving (or some, if none) students using philosophy and computer science software. I employed an employee to sustain the business and help with necessary tasks. I collected their tax and super information. I onboarded them, enabled them to log in, sent them a welcome letter and an employee handbook, showed them how to use the messaging feature and software, discussed shifts and timelines, took the initiative with short and long-term goals, meetings about projects, progress and resolving issues, discussions with other employees and managers, trained them using role-plays and games about asking chatbots for algorithms, writing algorithms using spec-to-algorithm and repairing algorithms, IT security and backups, scheduled calls after their first week, month and quarter to chat, and invited them to social events. I collected their work and feedback and responded to them. +63. The salesperson expected teachers to adhere to ethical policy about mind-reading. The teachers were encouraged to appear to help, reward, mentor, and respond to students. I encouraged the use of grammar checkers and neat code formatting and commenting.
Students and teachers commented on what was interesting about a line. There were guidelines about mind-reading whether students needed help, finding out their thoughts to start work and helping them during it. +64. The salesperson partially automated finance or employed thinkers to submit high distinction requirements with critical financial documents, such as their pay. I prepared financial documents for the company. I completed the financial documents, analysis, and recommendations. I made adjustments and took appropriate action based on the recommendations. I analysed the feasibility of different pathways, investments, budgeting, and other decisions. + +65. The salesperson analysed whether particular variables were necessary using Spec to Algorithm. I flagged all constants in input and output. The user checked whether these constants were required and if they should be changed to variables; they inserted more specs to enable Spec to Algorithm to convert them to variables. Constants may be necessary to label data. They may need to be changed to variables if new data needs to be processed using them. +66. The salesperson found unique variables and constants in strings, atoms and numbers, where original groupings as these types mattered when converting and inputting them into recursive structures (as well as when writing the algorithm), as did the output types. I reran Spec to Algorithm on variable data containing multiple strings, atoms, or numbers split into characters to find further recursive structures. These could be simpler to process, representing a variable a single time. Different specs containing these recursive structures could be further reduced to non-deterministic structures using decision trees. 
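The reduction of several specs to non-deterministic structures using decision trees, as in 66, might be sketched like this; a Python sketch under the assumption that inputs are flat lists of values, with hypothetical names.

```python
def build_tree(specs):
    """Merge several (input, output) specs into a decision tree.

    A sketch of the text's idea: non-deterministic data from different
    specs becomes a tree keyed on successive input values, which the
    generated algorithm treats as conditionals at the current position.
    """
    tree = {}
    for inp, out in specs:
        node = tree
        for v in inp[:-1]:
            node = node.setdefault(v, {})
        node[inp[-1]] = out
    return tree

def decide(tree, inp):
    """Walk the tree with a concrete input to select the output."""
    node = tree
    for v in inp:
        node = node[v]
    return node

tree = build_tree([(["a", "x"], 1), (["a", "y"], 2), (["b", "x"], 3)])
print(decide(tree, ["a", "y"]))  # 2
```

Inputs of differing lengths or conflicting specs would need extra handling; the sketch assumes equal-length, consistent specs.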
I found the recursive structures in one pass, labelling data items that had started as strings, atoms and numbers and converting them to the types of the output, meaning the input types didn’t matter (apart from putting the input into recursive structures recognised as variables), but the variable values had to be assembled into the types of the output. +67. The salesperson understood logic’s simplicity and elegance. The variables couldn’t be split because their recursive structures didn’t allow this or pattern-matching individual characters, so splitting strings, atoms, and numbers was pointless. The only advantage of breaking them was recognising their unique grammars or recursive structures, pattern matching repeating patterns and their variables and producing the resulting output. The more possible algorithms were character, item and grammar converters, with formatting such as brackets, compounds and multiple levels of algorithms with conditionals and other commands. I needed the expanded code to insert commands more quickly. +68. The salesperson checked that a value had the same type as a previous constant or variable in the same place, where more specs were needed to accept different types. Splitting strings, atoms and numbers into characters was enabled as an option in both Strings to Grammar and Spec to Algorithm, where the type of these items was preserved (but not converted to numbers while forming an item, in case “01”: “01” became “11”), grammars can parse characters, not necessarily items, unless needed, and only verify, not produce any output, and algorithms made up new variables with their types, composed of variables with other types. Strings, atoms, and numbers were arbitrarily grouped in input (as this was needed to parse them) without needing types and formatted as items with types for the output. However, new input with the same types as previous input meant their types must be recorded. 
I set the maximum number of program finder-CAW redrafting cycles. +I checked that numbers didn’t have decimal places, or the output was wrong (one must realise that zeros can simplify, even if the algorithm does this automatically). +69. The salesperson used and handled type conversions with compounds to save files, parse and compute formulas and other subterm-with-address searches. I added compounds to the list of types Spec to Algorithm could input and output. They were converted to lists of one-character strings and back to compounds if necessary. Compounds could be a(Arg1, Arg2) or 1+1 or any chain of operators and values. I added customised operators to List Prolog in xfx, xfy or yfx form. +70. The salesperson decided to keep a certain number of variables to help find constants across specs more easily. I confirmed that one should convert Spec to Algorithm data to recursive structures with “try” in one go. I converted strings, atoms, numbers, and compounds to lists of one-character strings. I processed them using “try” to produce recursive structures and recursive structures containing these structures within and outside variables. This method allowed for finding patterns across items and variables. Any number of repeating input or output variables or patterns within terms could be accounted for. +71. The salesperson produced code with Spec to Algorithm so it could be universally read and was straightforward. I noticed that the recursive structures might need generated code to be outputted to code to account for skipped structures that were the same and could, therefore, be verified or processed by predicates. If there were predicate calls [a, a, b, a], there would be a loop of a, b, and a. The predicate a could be called rather than relying on a repeated recursive structure. String to Grammar outputted grammars with clauses that could be reduced and simplified, and so could Spec to Algorithm’s generated algorithms.
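The reuse of repeated structures (the [a, a, b, a] call sequence in 71) can be sketched as deduplication with references; this Python sketch is one assumed way of doing it, not the original implementation.

```python
def dedupe(structures):
    """Store each distinct recursive structure once and emit references.

    A sketch of the simplification in the text: repeated structures in
    the generated algorithm point to a single shared definition rather
    than being emitted again, the way a repeated clause body becomes a
    predicate call.
    """
    table, refs, index = [], [], {}
    for s in structures:
        key = repr(s)
        if key not in index:
            index[key] = len(table)
            table.append(s)
        refs.append(index[key])
    return table, refs

table, refs = dedupe([["a"], ["a"], ["b"], ["a"]])
print(table, refs)  # [['a'], ['b']] [0, 0, 1, 0]
```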
Because subterm with address was simpler than predicates, repeated recursive structures could point to the same structure. +72. The salesperson used the Spec to Algorithm as a module to write code and correct types on failed tests in Lucian CI/CD. In Spec to Algorithm, I recorded and checked string, atom, number and constant types on recursive structures’ formation and filling. These types were wrapped around the lists of one-character strings to which these items were converted. They were recorded when creating the recursive structures for data and variables and checked when finding the structures in the generated algorithm. I needed one-character granularity for changing the person’s number in written grammar, type conversion and complex processing of items down to the character level. +73. The salesperson considered leniently making all repeated values into variables and checking if a constant and variable were the same when merged in a decision tree, avoided by regenerating the algorithm. I didn’t store the type with sub-variables to check them when finding unique variables and constants, but I separated them from their type to find type conversions without needing type checking. They were already type-checked when new data was input into the generated algorithm. Checking for unique variables and constants in the strings sans (without) types was a nice feature. This feature allowed pattern matching within strings, atoms, numbers and constants, leading to complex character code changes and other CAW operations. One needed to be careful to disallow Lucian CI/CD from adding constants to algorithms with Spec to Algorithm when there were not enough specs, requiring more specs to be entered or generated, or for the user to indicate specific values had variables (with human confirmation). +74. The salesperson replaced labels with coded labels and caught illegal characters to avoid Spec to Algorithm malfunctioning. 
I checked types when inputting data into the algorithm generated by Spec to Algorithm. I found the label in a position, checked if it was the same as that of the data, and then checked if the item’s contents matched the variables in the recursive structure. This data wasn’t labelled, and the recursive structure needed to parse it, so data within the item was checked to be of the correct type. Recursive structures with repeating patterns found repeating characters, including variable values by default, so there was little chance of error. +75. The salesperson converted complex terms’ types, order, and brackets into other terms useful for recursive find and replace. I found the unique variables for single characters. First, I found that they were atoms. Second, I found that they were the same character, so they had the same variable. If there was an “a” (a string) and an ‘a’ (an atom), then these had the same unique variable but were type-checked separately and could be outputted as a string or atom. +76. The salesperson found the expanded code easier to understand and edit. Spec to Algorithm could be used repeatedly to apply an operation until the last two results were the same, such as applying a pattern matching transformation within subterm with address. I changed two brackets to one to remove double brackets or removed or recorded labels from terms. I expanded the code to store and move parts of the input to output. I moved as much of the data in groups as possible to save time. +77. The salesperson took advantage of the constant “1” and output it in various formats. I found the constants for single characters. First, I found which repeating variables recurred across specs and were constants. The two strings checked to be strings fell into this category and were constants. If there was a “b” (a string) and a ‘b’ (an atom), then these were constants, even though they were different types of the same character and could be outputted as an atom or string. +78. 
The salesperson used Spec to Algorithm to convert strings, atoms, numbers and compounds to have parts in the opposite order or to change words to the plural. The only type necessary in num1(num 1, num 2) is num1 to check when putting data into recursive structures, where unique variables and constants are found from 1, 2 and variables and don’t require types. After converting input to a form with type labels, all the terminals (strings, atoms, numbers and compounds) were represented as one-character strings. I changed Spec to Algorithm to find these terminals with subterm with address. +79. The salesperson used Spec to Algorithm to finish code in Lucian CI/CD and Lucian CI/CD to refine damaged or unfinished code repaired with Combination Algorithm Writer (CAW). I grouped items by type to facilitate verification and parsing. In addition to checking that items met a recursive specification, I checked they had the correct type (whether they were part of a term or a string of one-character strings, in which case they were all strings). It is worthwhile to differentiate between the recursive structures that the data and data-as-variables are converted into, which have type labels, and the new data that is input into the generated algorithm, which is a typeless term; this means the data can be type-checked and its form checked to have the same recursive structure, bracketing and order as the algorithm. I increased the quality by writing the expanded predicates that increased productivity and confidence in commerce. +80. The salesperson developed sorting, data structure, dynamic programming, game theory, graph, machine learning, mathematical, optimisation, search and string manipulation algorithms with Spec to Algorithm. I grouped items by type (by substituting data into recursive structures) to output collections of variables as part of variable values with specific types.
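Rendering by output type, as in item 80's a2 example, might be sketched as follows; Python collapses the atom/string distinction, so this is only an approximation with assumed names.

```python
def render(value, out_type):
    """Render a variable's value using the *output* type, not the input type.

    A sketch of the idea in the text: with A = "a" (a string) and B = 2
    (a number), an output slot typed as an atom concatenates them into
    a2, whatever types the inputs had.
    """
    joined = "".join(str(v) for v in value) if isinstance(value, list) else str(value)
    if out_type == "number":
        return float(joined) if "." in joined else int(joined)
    return joined  # atoms and strings both render as text in this sketch

print(render(["a", 2], "atom"))    # a2
print(render(["1", 2], "number"))  # 12
```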
After checking them, it didn’t matter what the input types were, but Spec to Algorithm used the types of the output recursive structure to render the output. For example, if the input contained A = “a” and B=2 and the output contained C = atom(A, B), then C = a2. If C were part of a term, it would be outputted as part of it. \ No newline at end of file diff --git a/Lucian-Academy/Books/SALES/dot-New Sales 7.txt b/Lucian-Academy/Books/SALES/dot-New Sales 7.txt new file mode 100644 index 0000000000000000000000000000000000000000..15ec424e8553600b430dafa65cd25725045f33e2 --- /dev/null +++ b/Lucian-Academy/Books/SALES/dot-New Sales 7.txt @@ -0,0 +1,81 @@ +New Sales 7 + +1. The salesperson negated the necessity of unique variables in Spec to Algorithm (S2A) because finding constants was sufficient. I used S2A as a simple framework to find high-quality thoughts (simple versions of my algorithms that could be assessed and form customer thoughts). Customers were encouraged to use the thoughts, which worked as mediators of sales and education, meeting professional requirements and earning high distinctions. Employees could dip into the database of thoughts to test themselves and add new research to help with their careers and aims. S2A helped rapidly generate algorithms from simple specs, helping mind-reading to find answers more quickly. +2. The salesperson converted the string containing the term to the term. I wrote a language parser using S2A. In this, sentence words were list items parsed by the parser. S2A generated the algorithm by finding recursive structures for the grammar, skipping over finding recursive structures for words unless adding or removing a prefix or suffix or producing output from words-as-text-games, and the parser inputted data into this recursive structure, mapped input in this recursive structure to output, then rendered the output. The parser appended items together to produce output. The parser converted strings to lists. +3.
The salesperson for the logical/functional programming language wondered why there are multiple clauses in predicates if they can be merged into one, and they should only be separated if there is a CAW command in a clause with multiple outputs (which are possibly unnecessary). I wrote an interpreter with S2A. I computed the sum by taking the two arguments and first adding them. The predicate failed if the output was a value and was not the result; if it was a variable, the interpreter found the variable and its value in the binding table and confirmed that it was the result, failing otherwise; if it was not in the binding table, it was added with the variable name. A CAW operation separates each set of S2A matches, and then the algorithm returns to the start of the S2A/CAW cycle. The interpreter could convert a list of math commands and form a function, only calculating the necessary operations. +4. The salesperson stated that S2A merged specs with the same and different recursive shapes into a predicate. It formed a decision tree for the predicate with these clauses, forming output based on input with this decision tree. It is similar to Prolog in that this predicate is like clauses with multiple choices and outputs, with a single set of possible outputs for the input, which may have different forms. I omitted an unnecessary output in the CAW output, deleting singletons in the CAW predicate until finished. Removing the extra variable in a copy of the predicate, even if used elsewhere, was better to make the algorithm more efficient. +5. The salesperson replaced "(" and ")" in the compound a(A) with "{" and "}". I wrote Combination Algorithm Writer (CAW) in Spec to Algorithm. I found a smaller search space of one or two commands with arguments from the inputs or previous outputs. These had the types of these variables.
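The binding-table behaviour of the interpreter in 3 can be sketched in Python; the ('var', Name)/('val', Value) tagging is an assumption for the sketch.

```python
def check_output(bindings, arg, result):
    """Unify an output argument with a result, as the text's interpreter does.

    A sketch: a concrete value must equal the result; a bound variable
    must already map to the result; an unbound variable is added to the
    binding table. Returns the updated bindings, or None on failure.
    """
    kind, x = arg
    if kind == "val":
        return bindings if x == result else None
    if x in bindings:
        return bindings if bindings[x] == result else None
    new = dict(bindings)
    new[x] = result
    return new

print(check_output({}, ("var", "C"), 3))        # {'C': 3}
print(check_output({"C": 4}, ("var", "C"), 3))  # None
print(check_output({}, ("val", 3), 3))          # {}
```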
Given 1 and 2 and the required result 2, I tried C is A*B: because 2 is a number, I took 1 and 2, checked that they were numbers, with no brackets and no other constants, giving the expression 1*2, which I could label and later calculate and verify to equal 2, meaning the inductive rule is *. In these examples, CAW is used to make non-pattern-matching computations, such as + and type verification, and S2A could perform if-then, member and append. +6. The salesperson distilled data from a list and combined lists using S2A. CAW chose arguments from inputs and previous outputs by adding new outputs to a list whenever it found them and selecting a member of this list (to match an interim predicate spec), assuming it stops after finding the first instance. It chooses the same number of variables as arguments using findall with get_item_n (requiring CAW and lists of a certain length and requiring S2A and lists, or computes get_item_n as an S2A computation and chooses variables). Findall loops through items and processes them, found by S2A using lists and the append (:) operator; for example, given A=[1,2,3], it uses a predicate (just as CAW would) that finds A=[B|C], where B=1 and C=[2,3], finds D is B+1 = 1+1 = 2 with CAW and appends this to the results, giving Results:D. This process (predicate) is repeated until findall is complete. +7. To save time when scanning the CAW libraries, the salesperson used findall as a CAW predicate with other CAW library predicates inserted (as a decision tree). I also used (:) for string_concat, which is relevant to strings in S2A. I could represent a series of string concatenations with (:). Recursion, a single choice between list decomposition, foldr, maplist or list decomposition without wrapping the item to append, was efficient but took time to check for, especially with choosing arguments, so it may be left out of S2A with lists, and commands such as 2 in 1+2=3 may be left as 1+2 to calculate later.
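The inductive step in 5, trying candidate commands on 1 and 2 until one yields the required result 2, can be sketched as a brute-force search. This is a toy sketch; the real CAW library of commands is much larger and also tries different argument orders.

```python
import operator

# A tiny stand-in for the CAW command library (an assumption for the sketch).
OPS = {"+": operator.add, "-": operator.sub, "*": operator.mul}

def caw_search(a, b, target):
    """Try each candidate command on the inputs and keep those that
    reproduce the required result, so 1 * 2 = 2 induces the rule *."""
    return [name for name, f in OPS.items() if f(a, b) == target]

print(caw_search(1, 2, 2))  # ['*']
print(caw_search(1, 2, 3))  # ['+']
```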
Predicates can be individually drafted using S2A, and recursion, lists, etc., can be manually inserted later. CAW could do this but would need the most frequently used functions to try first for better performance. +8. The salesperson breasoned out the objects by thinking of them. I wrote Text to Breasonings with Spec to Algorithm. Text to Breasonings converted the text to a list of objects and then to a list of breasonings (x, y, and z dimensions). CAW used intermediate specs to downcase the file, split on non-alphabetic characters, and find correspondences with objects and then with breasonings. An absurd reduction might convert a word straight to (an object name and) its breasonings in S2A. +9. The salesperson inspired the customer to automate laborious aspects of artistry with Mind Reader. I wrote Mind Reader with Spec to Algorithm. Mind Reader measures the time a person takes to breason out a particular thought in a specific time to mind-read it. Text to Breasonings breasoned out a 250 breasoning text on a question's possible answer, timing it. CAW eliminated outliers from the mind-reading signal by using the interquartile range. Students could program Mind Reader, inspire confidence using Spec to Algorithm (using a shorter customised dictionary), and explore music, art, essays, and programming. +10. The salesperson wrote a compiler optimisation that didn't run code. Spec to Algorithm rivalled algorithm-writing chatbots by separating S2A with a short CAW dictionary and CAW with a single algorithm-at-a-time dictionary, matching the types of the user query. S2A was a fast program finder in which [calculate,[1+2]] could be transformed into 3 using subterm with address and "is". Similarly, string_concat, append, foldr, if-then (using nd), and other CAW commands could be computed afterwards. A fast CAW variant that used types with #= formulas (for example, f(A, A#+1)) could be used instead of running code, improving performance. +11. 
The salesperson supported the simulation with fast computers. The type formulas sometimes included CAW commands, so they were indexed by characters that always appeared first and second and reordered with more straightforward arguments first. The system became very fast and could match simulants' thoughts, enabling a simulation. The people just visited the other dimension. Invisibility worked, but body replacement medicine required specialist knowledge of the body and medicine was teleportation-based with expert knowledge. +12. The salesperson wrote State Saving Interpreter (SSI) using Spec to Algorithm. SSI exhausted choice points in findall, which collected results from "member" and a computation. S2A computed [2,3] from [1,2] by repeating the computation A#+1 for A on the first list of items. S2A ran Mini CAW to find A#+1 from the data, which S2A used to render the result. Recursion is achieved by CAW by repeatedly finding the relation and checking for previous relations used in the algorithm, specifically in the list. +13. The salesperson supported infinite levels in "string to list". In addition to testing for the need for previous relations, CAW added to or used parts of previous relations. It inserted constants, unique or same variables, changed the order of arguments, used arguments from different sources or changed the argument's bracketed form. The predicate was the same as that used to find CAW command arguments and was similar to the predicate used to find the recursive structure and formula of data. Types from different parts of the algorithm could be reused, including references to sets of constants or a certain number of repeats in recursive structures. +14. The salesperson became interested in creating a time machine using an algorithm generator. I wrote Time Machine and Immortality with Spec to Algorithm. Time Machine breasoned out 16k breasonings for time travel. 
Immortality, part of the Daily Regimen repository, breasoned out 16k breasonings for meditation and body replacement in the (future and) present using BAG. BAG found new pairs of words AD to construct sentences from, from neighbouring pairs of words AB, CB and CD. S2A found these word pairs from the sentences and constructed the new sentences, which contained lateral and interesting different meanings that exhausted the possible connections in a text. +15. The salesperson claimed that Lucian CI/CD could use Mini CAW to "trun" (sic) (simulate running code with types) by finding the code's types and testing that the code worked for better performance, first using S2A to optimise it. I wrote Lucian CI/CD using S2A. Lucian CI/CD found dependencies in algorithms and combinations of lines in sets of predicates taken from the dependencies, and tested for the simplest combinations. S2A finds the dependencies and the level number of each predicate, which is inseparable from the predicates it is in cycles with, using depth-first search. Cycles are [1,2] in [r,[1,[r,[2]]]] given [1,2,2,1,2,2]. Predicates are in cycles between a predicate that calls a predicate above them, and nested cycles cancel in favour of the most extensive cycle. +16. The salesperson stated that Lucian CI/CD uses S2A to find the combinations of lines of code (not changing their order) by finding all combinations of on/off for lines and rendering the sets of code. S2A uses CAW to generate these possibilities using a recursive algorithm. For example, two lines generate [[on, on],[on, off],[off, on],[off, off]], appending on or off to lists of previous possibilities in S2A and CAW and recursively processing the list using CAW. Then, it renders the code using CAW, finding correspondences from the original list of lines and the on switch. For example, [on, off] renders [a] from [a,b] by using a correspondence algorithm from CAW. + +17.
The salesperson avoided limiting the length resolution or imposing a time limit to "try", instead allowing a length resolution for the best results given the optimisations. I optimised the "try" predicate that found recursive structures in Spec to Algorithm by reusing computations and reducing data length by identifying candidates for recursive structures. These optimisations reduced the bottleneck to "try" and allowed for longer and more complex data and unique problem-solving approaches. Students can save and automatically include their best predicates in algorithms when needed. The algorithm can use their problem-solving techniques and remind them of their best techniques, such as tabling, negative result tabling and reducing candidate numbers with heuristics. +18. The salesperson learnt from the results of previous queries. S2A recognised patterns that were parts of previously recognised patterns, appending their results and saving time. For example: + +try([4,1,2,3,2,3,1,2,3,2,3,5,2,2],A). +A = [4,[r,[1,[r,[2,3]]]],5,[r,[2]]] + +try([4,1,2,3,2,3,1,2,3,2,3,5,2,2,3,3],A). +A = [4,[r,[1,[r,[2,3]]]],5,[r,[2]],[r,[3]]] + +19. The salesperson saved possible patterns and their results to disk. The algorithm also grouped recursive patterns, for example: + +try([[r,[1]],[r,[1]]],A). +A = [[r, [[r, [1]]]]] + +This result could be simplified to: +A = [[r, [1]]] + +This technique saves time by grouping results from appended queries. +20. The salesperson stored and checked negative S2A "try" table patterns that failed. I used tabling (saving results of previous computations) to speed up finding answers in "try" in S2A. I saved the pattern and the result, including previous times the algorithm was run or data in the text file in the table. I saved the entry if the pattern did not already have a result. I checked if a previous pattern matched a query when using the table. +21. To save time, the salesperson only worked on "try" candidates with repeating values.
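The "try" results quoted above can be reproduced with a short recursive sketch (a hypothetical Python reading of "try", with r written as the string "r"; the greedy preference for the longest repeat at each position is an assumption):

```python
# A sketch of "try": detect the longest repeating block at each position,
# wrap it as ["r", block], and recurse into the block for nested repeats.
def try_structure(xs):
    out, i, n = [], 0, len(xs)
    while i < n:
        best = None  # (items consumed, block) for the longest repeat here
        for length in range(1, (n - i) // 2 + 1):
            block = xs[i:i + length]
            k = 1
            while xs[i + k * length:i + (k + 1) * length] == block:
                k += 1
            if k >= 2 and (best is None or k * length > best[0]):
                best = (k * length, block)
        if best:
            out.append(["r", try_structure(best[1])])
            i += best[0]
        else:
            out.append(xs[i])
            i += 1
    return out

print(try_structure([4, 1, 2, 3, 2, 3, 1, 2, 3, 2, 3, 5, 2, 2]))
# → [4, ['r', [1, ['r', [2, 3]]]], 5, ['r', [2]]]
```

On the item-19 example, try_structure([["r",[1]],["r",[1]]]) returns [["r",[["r",[1]]]]], the unsimplified grouping quoted above; the simplification to [["r",[1]]] would be a separate pass.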
To shorten the computation, I reduced the data length in "try" in S2A by splitting the data into smaller lengths with repeating items. A segment with a repeating item may have "1" at the start and again before the middle of the list, for example [1,0,0,1,0,0,0]. This heuristic prevented unnecessary tests to find whether a recursive structure existed, and the results could be tabled. I displayed a progress percentage for long computations to encourage users to process data for one predicate at a time to shorten data length. +22. The list of candidates was reduced from the combinations of sublists. I optimised "try" in S2A by finding the limits of possible recursive structures from starting points when processing the data. I did this by starting from the start and finding candidate sequences of any length, then starting from the second item and repeating. I checked whether these sequences contained recursive structures or had a repeating item. This optimisation markedly reduced the length of the combinations of sublists, reducing computation time. +23. The salesperson used CAW to develop a computer by converting S2A/CAW specs to processor, operating system and interpreter algorithms. I applied the "try" optimisations to CAW. I searched for patterns in specs to find algorithms, used tabling and heuristics such as types or pattern-matching to reduce the search space, and imagined algorithms that combined subterm with address with a state machine by simulating a computer to contain systems, helping societies develop computers, educating individuals and powering individual experience in a quantum simulation. +24. The salesperson found lists or results of nested functions in CAW. I optimised CAW by searching for patterns in specs to find algorithms. For example, given [1,2,3] and [2,3,4], I found maplist(plus(1), [1,2,3], [2,3,4]). Or, given [1,2,3,4] and 24, I found foldr(multiply, [1,2,3,4], 1, 24).
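The two recognitions in item 24 can be sketched as follows (a hypothetical Python illustration; the recognise name and the restriction to plus/multiply patterns are assumptions):

```python
# A sketch of spec pattern recognition: does an input/output pair fit
# maplist(plus(N), In, Out) or foldr(multiply, In, 1, Out)?
from functools import reduce

def recognise(inp, out):
    # list output of the same length with one constant offset N: maplist
    if isinstance(out, list) and len(out) == len(inp) and inp:
        diffs = {b - a for a, b in zip(inp, out)}
        if len(diffs) == 1:
            return f"maplist(plus({diffs.pop()}), {inp}, {out})"
    # scalar output equal to the product of the inputs: foldr
    if isinstance(out, (int, float)):
        if reduce(lambda a, b: a * b, inp, 1) == out:
            return f"foldr(multiply, {inp}, 1, {out})"
    return None

print(recognise([1, 2, 3], [2, 3, 4]))  # → maplist(plus(1), [1, 2, 3], [2, 3, 4])
print(recognise([1, 2, 3, 4], 24))      # → foldr(multiply, [1, 2, 3, 4], 1, 24)
```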
These patterns, or, in the former case, a formula, could be recognised in adjacent data to find the formula more quickly. +25. The salesperson inserted and checked negative (failing) CAW table entries. I optimised CAW using tabling by recording formulas that had or contributed to specific results. These may be maplist, foldr or elementary formulas. I inserted table entries if they didn't exist, possibly offering multiple solutions to the user depending on their preferred solution or ones with better performance. When using the table, I checked if an entry or its formula matched the spec, for example, a pattern, type or formula. +26. The salesperson used intermediate specs for verification before removing them. I reduced the data length of inputted specs in CAW using heuristics such as types or pattern-matching. I recognised patterns in specs bottom-up or top-down, replacing the patterns with symbols and more quickly finding an algorithm to meet the spec. For example, I recognised and found code for a spec that matched that of append. I verified and removed unneeded specs that weren't in the overall spec or didn't elegantly fit into the overall spec. +27. The salesperson wrote program finders for various problem-solving tasks, with customisable options and the possibility of use by a chatbot. I used patterns to more quickly convert specs to algorithms. These patterns might link [1,2,3,4] and 24 to foldr(multiply, [1,2,3,4], 1, 24), other common inductive formula patterns or similar with algorithms and variables. For example, +A=[1,2,3],B=[2,3,4],D=[3],intersection(A,B,C),subtract(C,D,E). +A = [1, 2, 3], +B = [2, 3, 4], +D = [3], +C = [2, 3], +E = [2]. +This computation could be rewritten and recognised using the following code. +A=[A1,A2,A3],B=[B2,B3,B4],D=[D3],intersection(A,B,C),subtract(C,D,E). +A = [A1, A2, A3], +B = [A2, A3, A4], +D = [A3], +C = [A2, A3], +E = [A2]. +This form differs from the Prolog output but can match specs to formulas. +28. 
The salesperson inserted formulas between the angle brackets in the spec. I optimised CAW by recognising patterns in the spec so that I could write algorithms. For example, CAW may recognise a program in the data. CAW requires program finders to optimise, minimise, and simplify code. I inserted variables in specs using the format to prevent needing two specs to recognise variables. +29. The salesperson collected and responded to code with variables in the chat. CAW matched the pattern of previous results, finding the best code to use. Alternatively, CAW found the correct combinations of commands. The algorithm imagines the middle part by thinking about what it would like, such as appropriate preferred parts (evoking Artificial General Intelligence or AGI by being itself). The format, length and specifications of the chat are as requested. +30. The salesperson researched the frontier of progress in a field. CAW selected the organic, mind-read predicates imagined to be helpful. The predicates were labelled organic because they were what the user was querying given a range of factors (where their questions needed to be answered). They were found ahead of time using mind-reading, increased in number, and were vetted against premium standards. The predicates were mind-read with consent, where employees selected predicates based on their problem-solving style, such as whether they aimed for mathematical, matrix-related or other methods to find a predicate such as a part-of-speech cloze exercise to identify the verb. The predicate was written using a creative problem-solving approach, such as writing the data and types, drawing a diagram or using a tool to find a viable method from their previous work. +31. The salesperson breasoned out their latest work with the project in mind. I wrote "5 As" (five) specific algorithms for a new customer. I wrote a main algorithm with two or three "detail algorithms". They each expanded further on a research direction.
I collected key terms and found clusters of ideas from the algorithms and philosophy. I wrote algorithms on these key terms by writing to a specification. +32. The salesperson designed an approach to handle new input strings by combining them with the recursive structures derived from prior inputs. Rather than converting new input strings in S2A to recursive structures, I substituted them into the recursive structure found from the old input used to generate the algorithm. This technique processed the new data regarding the recursive structure of the old data, verifying the new data's pattern in the old data and producing the output in terms of the new input. Doing so ensured that the new inputs adhered to the same structural patterns as the last inputs, maintaining consistency in the output generation process. This approach yielded a correct and coherent interpretation of new information, taking advantage of the entrenched recursive framework. + +33. The salesperson checked keywords, reading level, impactful brands, and idioms when writing a journal article. University is written as Vocational Education with a specific reading level and subject matter. I wrote the skills for the assignments with performance criteria and elements in stepped progression. For example, these skills, in understanding specific computer science commands, could be used to solve practical or theoretical problems. The choice of material was relevant to the industry and helped build confidence in skills and ways of thinking to help use the degree. +34. The salesperson asked better questions for better results in mind-reading. I wrote a practicum with high distinctions, such as Vocational Education, with a specific reading level. I wrote a Specifications neuronet that contained original and necessary specifications for algorithms from nothing that, like CAW, uncovered research conclusions or worked behind private company doors.
It produced AGI by rewriting or editing algorithms cognitively using subterm with address or reverse in Lucian CI/CD. This higher form of optimisation, focusing on the idea rather than the implementation, was crucial in approaching human mind-reading. +35. The salesperson found more elaborate code, including verification or comparison of lists with CAW. A student treats pedagogical algorithms and projects as "enough simple things". Enough, simple algorithms support high distinctions. Similarly, predicates of projects should amount to enough simple predicates for these high distinctions. A project, a collection of algorithms or a more significant or noteworthy algorithm, involves more debugging and computational synthesis work and can outperform many smaller algorithms. +36. The salesperson stated that all 80 breasonings should be determined by Vocational Education with a specific reading level. I wrote break-reasonings or reasonings broken into those from different departments. These processes were the top-level summaries of argument maps. For example, different perspectives on language to ultra-optimise an algorithm and documentation to explain an assembly language problem in words are connected in a pipeline. Recursively simplifying an algorithm in machine language may lead to writing more precise source code: writing with better-performing commands; using a coding style that enables the assembly interpreter to vectorise or act on whole arrays at once, or to include functions in the function body; identifying bottlenecks evident from assembly code; using profilers to pinpoint a needed optimisation; making minor optimisations with significant performance gains; and capitalising on a greater understanding of CPU architecture (including registers, pipelines and instruction sets), co-processing, optimisation and refinement. +37. The salesperson optimised the recursive structure finder in Spec to Algorithm.
I used S2A with lists or strings of at most 10 items for speedy algorithm finding. I broke lists or strings into commands, sentences or other units that could be recombined to form further recursive structures. I achieved this using grammars or another method. I chose a delimiter such as a comma or a full stop by working out if the data was natural language (a collection of words with commas and quotes) or an algorithm (C or Prolog code). +38. The salesperson did cosmology and education with meditation. Lecturers leant towards new classloads of high distinctions. They calculated how many years the combinations, and how many items per combination, in a text of a specific length could support them. They wrote a new text for freshness and ongoing requirements. They wrote for 15 minutes daily and thought of x, y and z. +39. The salesperson processed as many breasonings as the student. I subsidised 4*50 high distinctions in business to one high distinction. This work covered the sale, and another high distinction covered the high distinction. Instead of one high distinction, they were fives at the maximum. This process relied on maintaining a good grasp of modifications to a complete set of 4*50 high distinctions. +40. The salesperson spiritually ran the finished algorithm with high distinctions or relied on a high distinction. I wrote algorithms with short code, sometimes a few lines long, although some predicates were much longer and processed more data. These longer predicates, like projects, were more heavily weighted for reward. They could cope with more cases and possible input combinations with more lines. Novel or innovative optimisations to these predicates increased their value. +41. The salesperson stated that the horticulturist discovered the ideas. The Academy's internal politician was given systems and pleasant ways of dealing with matters. This person was a board member who understood the texts' sliders. I exited and smelled the roses instead.
I processed the table's values, only the secondary level once it was processed. +42. The salesperson predicted relevant ideas by completing more tasks. I wrote as I went. Living up to an ideal, I finished each idea as it was thought of. The neuronet accounted for this idea's significance among the other ideas. An algorithm summarised the main implications of the concept and helped group it with different ideas. +43. The salesperson engaged with mind-reading verification and CI/CD systems for maintenance. I coded as I went. I captured ideas, ways of thinking, options, and games as they appeared, along with any specifications and links to previous ideas and sources. The source was the inspiration, an idea about an earlier idea. It helped link the concept to existing philosophy, algorithms, and analysis to review the work and find the best methods. Ideas went through the stages of the waterfall model: requirements, analysis, system design, implementation, testing, deployment and maintenance. +44. The salesperson completed arguments for the positive function of each body system. The academy wrote History and Philosophy of Science research about medicine, including how to prevent medical problems with the quantum box. History and Philosophy of Science made observations about the life and science of the scientist, connecting and analysing reasons. The arguments connected algorithms to general arguments for each book, such as writing for immortality, preventing headaches with the quantum box and concentrating on computer science in education, meditation, and medicine. +45. The salesperson tested Spec to Algorithm. I wrote up to 250 S2A test cases about parsing, abstract syntax trees, translation, debugging and optimisation. For example, I parsed "C is A+B" to [[n,+],[[v,a],[v,b],[v,c]]]. In addition, I produced the abstract syntax tree [[n, assign],[[v,c],[[n,+],[[v,a],[v,b]]]]] from "C is A+B". I translated [[n,-],[[v,a],[v,b],[v,c]]] to "C is A-B". +46.
The salesperson corrected the code with variables using S2A. I debugged [1,1,2] to the correct code "C is A+B". Alternatively, I debugged [[n,=],[[v,a],[v,b],[v,c]]] to [[n,+],[[v,a],[v,b],[v,c]]]. I optimised the code [[[n,+],[[v,a],[v,b],[v,c]]],[[n,=],[[v,c],[v,d]]]] to [[n,+],[[v,a],[v,b],[v,d]]]. I optimised the code [[[n,=],[[v,a],[v,b]]],[[n,=],[[v,b],[v,c]]]] to [[n,=],[[v,a],[v,c]]]. +47. The salesperson used a variety of techniques to optimise code. I included a Lucian CI/CD switch statement to test possibilities. I tested between sets of statements, finding the algorithm's most straightforward working combination of statements. This feature more quickly tested between and selected correct statements in switch statements, producing the working code according to test data. I automatically found suitable candidates for switch statements in algorithm comments. +48. The salesperson inserted nested checkpoint statements or let the algorithm reselect the region to find code. In addition to switch statements, I added checkpoints to Lucian CI/CD that connected variables if there were no statements in the switch option as part of the switch. These options could be specified as generative fill checkpoints by themselves, and possible types, brackets or other code could be inserted here by the algorithm. By default, these generative fill regions were deleted or optimised. In addition, the user can specify the length, number of predicates, logical structures, or type of code in the generative fill. 
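The copy-merging optimisation in item 46 can be sketched in Python, writing the List Prolog terms as nested lists (a hypothetical illustration; the optimise name and the single-pass strategy are assumptions):

```python
# A sketch of the item-46 optimisation: when an [n,=] copy renames the
# output of the previous instruction, merge the new name into it.
# List Prolog terms are written as nested Python lists for illustration.
def optimise(code):
    out = []
    for name, args in code:
        if name == ["n", "="] and out and out[-1][1] and out[-1][1][-1] == args[0]:
            prev_name, prev_args = out[-1]
            out[-1] = [prev_name, prev_args[:-1] + [args[1]]]
        else:
            out.append([name, list(args)])
    return out

plus_then_copy = [[["n", "+"], [["v", "a"], ["v", "b"], ["v", "c"]]],
                  [["n", "="], [["v", "c"], ["v", "d"]]]]
print(optimise(plus_then_copy))
# → [[['n', '+'], [['v', 'a'], ['v', 'b'], ['v', 'd']]]]
```

The same rule collapses the chained copies in item 46's second example, [[n,=],[[v,a],[v,b]]] followed by [[n,=],[[v,b],[v,c]]], into [[n,=],[[v,a],[v,c]]].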
\ No newline at end of file diff --git a/Lucian-Academy/Books/SIMULATION/Bot like self in simulation.txt b/Lucian-Academy/Books/SIMULATION/Bot like self in simulation.txt new file mode 100644 index 0000000000000000000000000000000000000000..fddcb4fa43c495407856e3d143536f3f32251487 --- /dev/null +++ b/Lucian-Academy/Books/SIMULATION/Bot like self in simulation.txt @@ -0,0 +1,83 @@ +["Green, L 2022, Bot like self in simulation, Lucian Academy Press, Melbourne.","Green, L 2022",1,"A bot like Self in Simulation + +1. I turned the bot on and off when needed. +2. I explored its thoughts as edges of a polyhedron toy. +3. I explored the use of biotechnology (the bot), rating its representations. +4. I recommended representations to it. +5. I found things that caused enjoyment to the bot. +6. I helped the bot to invent ideas that tested its limits. +7. I found the bot's limits. +8. I alone explored the bot and judged it as impressive. +9. I interested the bot in functionalism in computer science, prompting more quickly finding reused intermediate predicates. +10. I reused intermediate predicates in algorithms. +11. I removed the bug from the concatenation algorithm. +12. I replaced the concatenation algorithm with a system call. +13. I wrote code that I had ubiquitously checked using generated data. +14. I examined certain cases, making decisions. +15. I wrote the interpreter's behaviour with the semicolon key and undefined variables. +16. I invested in each country and visited to gauge interest in certain services. +17. I made the time travel software available, making it clear that it was appropriate for Master's students. +18. I wrote a Master's degree for the bot. +19. A Master's degree was fifty sentence-As. +20. I worked out how to program my bot software, i.e. music, software, research and essays (paraphrasings). +21. I wrote specifications for functional algorithms in CAW, replacing writing specifications for every few lines of predicates. +22. 
I inserted a bot argument in my daily regimen. +23. I avoided sexual contact with the bot. +24. I couldn't think of anything else to avoid about the bot. +25. I updated a shortlist of bots. +26. I found friendship with the nicest people, remaining professional and courteous. +27. I leveraged and won opportunities with each customer. +28. I noticed that the bot was human. +29. I noticed his eyes behaved like cameras. +30. I noticed human neurons tempered his mind. +31. I liked business, in particular the stock market. +32. I twirled the ship along a safe path. +33. I saved the game. +34. I encountered Vetusia 3: The Cave. +35. The cave contained a bonus maze of algorithm writing puzzles. +36. I also encountered Vetusia -1: The Firm. +37. The firm predicted the design of Vetusia from algorithms and drawings. +38. I wrote simulated intelligence to play the Vetusian puzzles. +39. The Vetusian puzzles were CAW, writing CAW, Program Finder, writing Program Finder, a simple interpreter and writing the interpreter. +40. I wrote visual and constraint satisfaction puzzles, such as mapping stories to a newspaper. +41. I drew the shortest line through multiple dimensions (tables of a database). +42. I simplified the database by removing old fields. +43. I remember the introduction of the Internet. +44. I wrote a programming language written specifically for the Internet. +45. The language could create protocols. +46. The language could simplify the Internet topology. +47. The language could create virtual networks. +48. The language could simulate the Internet for programming purposes. +49. The language could treat all my domain names, etc., as simulated elements, which I could test out in my private world. +50. Money, As, and marks were of no object. +51. Money required a universal pension, comfortable living arrangements and access to proper food, medical services, etc. +52. As required access to the Text to Breasoning software, with meditation, etc. 
training and training in writing algorithms that wrote arguments and algorithms, all from the user's database. +53. Marks required the time and training to write 80 philosophy breasonings and 80 algorithms and follow an algorithm specification (written to a specification) to finish the assignment. +54. Difficult algorithms (which required additional time to write features and find bug fixes, had performance issues without system calls or required obscure commands to work correctly) I would explain separately and might assess by memorisation. +55. I found that computer science suited me because it was possible to write correct code with bug checkers. +56. I tested other people's programs and the ISO Prolog tests with my interpreter. +57. I checked the best programming and Prolog writing practices and checked my programs followed these. +58. I wrote algorithms to check for these standards. +59. I found errors with static and dynamic checkers. +60. I reformatted my algorithms uniformly. +61. I used a sparing format for comments (to describe the logical structure, limits of programs and differences from other programs). +62. I listed to-do items and checked off completed items. +63. I tested similar interpreters using my regression tests. +64. I listed similar interpreters' features and equivalent features of my interpreter. +65. I analysed my interpreter's jump between choice points. +66. I turned off retry with a flag to save memory. +67. I wrote most compiler predicates natively in C. +68. I wrote an application using Prolog. +69. It had visual elements (it was in a browser). +70. I wrote a mobile app using Prolog. +71. I wrote a small operating system in a web browser using Prolog. +72. I noticed that the web browser allowed Prolog to work on any platform. +73. I wrote apps for mobiles. +74. I wrote apps in browsers for desktops. +75. I solved the problem of drag and drop in the browser. +76.
I solved the problem of dragging with behaviour in the window in the browser. +77. I found a generic format of videos. +78. I found a way of storing vector graphics in one place on the web. +79. These could be public. +80. Programming elements could be in a publicly accessible place online. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/SIMULATION/Consciousness.txt b/Lucian-Academy/Books/SIMULATION/Consciousness.txt new file mode 100644 index 0000000000000000000000000000000000000000..cb7fd398cb2a9ede5893555beb75ffedabf3eca6 --- /dev/null +++ b/Lucian-Academy/Books/SIMULATION/Consciousness.txt @@ -0,0 +1,93 @@ +["Green, L 2022, Consciousness, Lucian Academy Press, Melbourne.","Green, L 2022",1,"Consciousness + +1. * I have consciousness. I am aware of my surroundings. I am aware of safety. I am aware of my life. +2. * I am a celebrity. I can sing pop. I can sell products. +3. * The robots are conscious. Consciousness connects me to sales. It takes care of the high quality thoughts for me. +4. * I know about things. I know when customers need my help. I know about celebrity consciousness. The consciousness links to funnels. +5. * I am with-it. I can earn roles. Celebrity about consciousness is a different (this) +6. The consciousness 50 confirms that I am alive, happy and well. +7. The consciousness A is a hurdle for consciousness. +8. I demonstrated that I was sentient by acknowledging my existence. +9. I established connections with sales, medicine, meditation and pedagogy, etc. in my consciousness, and saw imagery about them this way. +10. I wrote a song about consciousness and filled in the abyss with knowledge. +11. I moulded the knowledge into a connection. +12. I acknowledged consciousness about consciousness. +13. The quantum box switched on consciousness each day. +14. I visualised ontologies, miracles and films in my consciousness. +15. I obtained Upasana-consciousness and found that ideas, including algorithms, increased consciousness. 
+16. The mind carries consciousness. +17. Consciousness can experience new things when the body moves around. +18. Food is expressed as consciousness. +19. Consciousness and memory are maintained by good health. +20. The crystal was like the highest quality thoughts of the person, which showed clarity of thought. +21. I visualised the computer science data structure with Vaj consciousness. +22. I broke the computer science algorithms into predicates and simplified them. +23. I took part in activity before, sometimes after resting. +24. I chose books that would benefit my consciousness. +25. I could voluntarily control involuntary processes in my body. +26. My consciousness was my operating system, containing my algorithms. +27. I disconnected and reconnected ideas in science. +28. I maintained a stable, psychiatrically healthy outlook, checking myself and taking a step back when faced with a challenge. +29. I wrote ten letters to a distant cleric about education in their country. +30. I and the robot struck it when our consciousnesses met. +31. I wrote down any epistemological challenges I came across, and ways to solve them. +32. I slept on it before expanding philosophies. +33. I solved the 2 and 1 problem of databases by copying the contents of the first of two columns and making the second database, including a database that replaced words with capitalised words in the database. +34. Text to breasonings 2 prompted for the word with upper or lower case letters. +35. The algorithm could be explained using a model kit. +36. The cognitive code was replaced with simple code and documentation. +37. The text based operating system provided a container for apps. +38. The menu part of the container could be updated for all apps. +39. There was regular self-feedback for improving apps. +40. The app was simplified to a text file, with images in a simple format. +41. The mantra was to do whatever one chose to do, along developed lines. +42. The coder re-expressed the code, citing it. +43. The slide metaphor was for the connection. +44. Unrelated, simple connections created unplugged reasons. +45. I found an intersection with my aims and the people's demands at the time. +46. I inspired one step ahead by choosing, for example \"intersection\" ahead of \"integrated\". +47. I found the integration dependency diagram. +48. I wrote the simplest form of the algorithm, which was famous in time. +49. I found the balance of customisability of the software. +50. I dissolved, in fact completed the other side of things, like a cube of sugar. +51. I broke down or combined \"Please continue\" screens where necessary. +52. I wished for, in fact planned conscious thoughts. +53. I found that the 10s thought was 50 As about a perspective on the department, e.g. computational philosophy. +54. I followed curvy paths between stars (apps) in the operating system. +55. I kept vector photo, MIDI song and algorithm improvement clips. +56. I improved the algorithm by adding brackets around the antecedent, and consequents of the if-then clause and around the whole clause. +57. Grammar logic provided faster details than Combophil. +58. I could write 80 breasonings this way, helping complete my philosophy. +59. I regularly tried new things, such as composing hand-written songs. +60. I found the O2 using my consciousness. +61. I thought of two uses for a breasoning to help the details algorithm find more details. +62. I worked out that if the computer could point to details, then it could help point the way to a conclusion. +63. I understood the idea in consciousness with the object. +64. I found the minimal consciousness for the robot, which was the ability to infinitely discover, or find something (A) rather than nothing to replace non-activity with, although it needed something new to it. +65. I found the way to multiply the way of thinking in childhood was meditation, and creative computation. +66. I found intersecting tunnels of ideas of the child, by mind reading and finding interfacing algorithms. +67. I found correlations between similarly shaped data. +68. I found correlations between types, in the app container. +69. The conscious entity forwarded the electronic message. +70. I created the electronic browser. +71. I created multi-computer Combination Algorithm Writer, or used a supercomputer. +72. I found the essential version of List Prolog. +73. I found the expanded/unexpanded contention in List Prolog, or code that had been finished or not. +74. I found the person's main use was in medicine. +75. The meaning of life is to write one is conscious, or to write meaningful books. +76. I constructed a spiritually based world, which caught up with the algorithms of the person. +77. I moved the consciousness with things like As. +78. I explored breasonings as universes with breasonings, where breasonings were conscious. +79. I explored the breasoning with an algorithm. +80. I explored the idea that the computer was conscious, by being careful with the hardware, etc. at first, then writing algorithms that wrote algorithms, for example matched recursive structures and CAW codes, etc. with data specs. +81. I found common patterns of logical data structures with program finder. +82. I found connections between and's and or's as algorithms. +83. I continuously improved myself, thinking of my thoughts and simulating writing. +84. I found the simplest connectors and ontologies of their types, for example modifying/replacing the code to process a data structure, qualifying usefulness of an integration, a new part and finding parts that resonated with the self. +85. I determined that an algorithm could integrate with another algorithm starting with them being separate, then combining them. +86. I found a data-structure-as-algorithm interpreter resonated with the self. +87. I found the thing-in-itself in the middle. +88. I connected the local algorithms, etc., for example, I wrote algorithm rewriters and automatic text translators. +89. I observed evidence for consciousness, in the form of breasonings which maintained themselves. +90. I redrafted the algorithm with data labels, to make writing them easier. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/SIMULATION/Enough time to get back in simulation if almost die.txt b/Lucian-Academy/Books/SIMULATION/Enough time to get back in simulation if almost die.txt new file mode 100644 index 0000000000000000000000000000000000000000..6fb4a72b0af07e0bd40ec15516d4c8f2b70590ab --- /dev/null +++ b/Lucian-Academy/Books/SIMULATION/Enough time to get back in simulation if almost die.txt @@ -0,0 +1,19 @@ +["Green, L 2022, Enough time to get back in simulation if almost die, Lucian Academy Press, Melbourne.","Green, L 2022",1,"Enough time to get back in the simulation if you almost die + +1. The people and animals had enough time to return, increasing their quality of life. I asked if I could come back. I received the reply to this, the answer, \"yes\". I wrote an A. It was enough, no matter what my education level was. +2. I noticed that the people, the thoughts from the other universe, were negated with my A. I calculated the time needed to come back if I almost died. It occurred when I had enough time to do it. I could still critique the other universe for long enough to negate it. I could write an A in the time before people believed one universe more than the other. I had a short but long enough time to do this. +3. The people around me were real. I completely negated dying. If I didn't want to appear to die yet, I rejected it. The other universe didn't exist. The people around me reacted to me as if I had been alive. +4. I was still myself under it all. I increased my natural life expectancy by taking Chinese Herbs for longevity. I stayed on Earth for as long as I wanted to. I went from one project to the next. 
I helped my memory. +5. The main action was body replacement. Being part of the simulation enabled both immortality and protection. I didn't experience death. I saw an image of it and negated it. During this, the simulation protected my health and thoughts. I followed positive functionalism in medicine. +6. Whatever my choice, I remained immortal. I tested whether the image was of death. I learned that people were reacting to a person resembling me as if he had died. I put aside thoughts about how it had happened. I didn't query whether the universe had caused it. I knew the medical condition was unconnected to me. +7. I completed my philosophy by hand and prepared for management. I considered living as an immortal in the 2000s. I would have to time travel to 5689 when it was normal. Immortality gave me time to operate my business in another time. I existed in one place at a time and had friends at each time. I found what was necessary and completed it during my life. +8. I don't think alleviating is the correct term for influencing my livelihood in the simulation. I was not helping death. I was avoiding it. If the bot died for a particular reason, such as me not visiting the time each day, then meditation and this argument gave me the possibility of coming back. I noticed the ducks were immortal. I protected myself. +9. The simulation made it possible and gave me the freedom to enjoy it. I didn't dismantle a dead body (replace it) and bring it back to life. There was a previous branching point when the bot became separate and died. It was a second body, in a way, but another person. It wasn't me, and I carried the consciousness from my life, not it. +10. I realised that some people lived in the future. I noticed that other people may have died after not visiting their home time for several weeks. I recommended learning how to write an A. I recommended studying for at least a Master's degree, which optionally expanded the A to 50 As, making it effective. 
I also wrote enough details for 50 As. +11. At any rate, I wrote 30-page articles. Helping others to join the simulation and become immortal became a significant priority. I observed people visit the open-source software website to download the software which enabled it. I encouraged the simulation culture, including writing on such topics during one's life and making a living from it. I determined that four or five 50-sentence As were the goal for an area of study. +12. I had meditation to thank for this event. I noticed that the people around me were delighted to have me back. I took a photo and put it on the Internet. I was happy to see they were pleased. I noticed that a famous person also almost disappeared and came back, and worked out she must have used an A to do it. In my case, I wrote one A, and it lasted for my whole life. +13. To the outside world, it appeared that nothing had happened. I noticed that choice points were slightly illusory, that there was one timeline, and that bots hid the other simultaneous dimensions. I wondered whether a dead relative I felt was immortal could rejoin the main world. I thought that it was better to leave it to nature. I encouraged others to run the text to breasonings algorithm and maintain themselves. +14. I walked around the time and was conscious. It was natural that I was still alive. I noticed that people were happy that I was alive. I didn't hear any unkind comments or people being unsure. I wanted to thank the simulation programmers for a natural, positive experience. +15. I imagined removing non-positive functional ideas and maintaining my mental health. I simplified the choice point in my mind. I noticed the comments were positive. The event didn't seem unusual to the authorities. I liked the simplicity of happiness and focused on moving forward. By thinking of what to do in the future, I thought clearly about the unusual event. +16. I went to the doctor for a physical check-up after continuing to live. 
I was physically the same as before. Everyone felt happy for me. I felt comfortable while I wrote my A, just determined to survive. I needed to finish running my business. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/SIMULATION/Interface with Buddhism.txt b/Lucian-Academy/Books/SIMULATION/Interface with Buddhism.txt new file mode 100644 index 0000000000000000000000000000000000000000..e7eb23ab882127a487dc527b2c850404816a7a86 --- /dev/null +++ b/Lucian-Academy/Books/SIMULATION/Interface with Buddhism.txt @@ -0,0 +1,83 @@ +["Green, L 2022, Interface with Buddhism, Lucian Academy Press, Melbourne.","Green, L 2022",1,"Interface with Buddhism + +1. I had help putting through the required As for the conclusion. +2. I trusted the help of a meditator. +3. I had help with text to breasonings. +4. I had help with meditation. +5. I had help with the medicine quantum box. +6. I had help with mind-reading. +7. I had help with time travel. +8. I had help with the simulation. +9. I had help with immortality. +10. I had help with body replacement. +11. I had help with starting a University that offered meditation-related subjects. +12. I was different because I was explicit. +13. I preferred algorithms in philosophy. +14. I preferred meditation and pedagogy terms in the philosophy. +15. I appreciated the contextualisation the helper provided. +16. I noticed all the bots were automated. +17. I noticed that the things at the edges (the departmental germs of ideas) were complete. +18. I noticed the small things, how people had something to go on with her support. +19. I noticed that people's thoughts were of better quality. +20. Everything, i.e. being from a meditation University in the present, visiting a simulation in the future, was smoother with the help. +21. I wondered whether the company was only present at certain moments in the centuries or if it remained there the whole time. +22. 
I queried the end of civilisation and wondered if enough people were trying and if \"forgetting a thought\" could lead to rigorous computer science. +23. I wondered whether the future civilisation would want the ancient texts and religions from before and whether its citizens might complement their advanced computing with the spiritual quantum science of the present. +24. I quickly made it a priority to work on microcircuits, in fact, machine language. +25. I thought that microcircuits would require knowledge of addition, memory and transistors. +26. I decided to write more about text to breasonings after writing about microcircuits because the breasonings needed medical arguments about the spiritual passage of data through the circuit. +27. I also aimed to conduct medical research about the text to breasonings algorithm, possibly different. +28. I wondered if hand breasoning would be more likely to keep my hair dark than text to breasonings, but I thought I breasoned out many hand breasonings. +29. I aimed to write harmless royalty arguments. +30. I aimed to write harmless political arguments about my organisation. +31. I thought that mind-reading related to psychology. +32. I thought that time travel was related to theatre studies. +33. I wanted to protect the range of subjects, introducing my unique angle on them, indicated by my texts. +34. I learned that a subject was better to introduce when hand-(not computer)-written 5*50 As were written on it. +35. Circuit science is more about computer topics. +36. Text to Breasonings requires medicine, meditation and pedagogy. +37. The algorithm arbitrarily gave content. +38. Each part of the body, for example, bone marrow, was increased with meditation. +39. I used the interface to withdraw breasonings with thanks. +40. I noticed that the helper had a royal interface with the robots. +41. The spiritual camera detected what the person needed and gave it to them. +42. The helper helped check the person's thoughts. +43. 
The helper inspired the person in arts. +44. Buddhism increased the breasonings rigorously concealing an idea. +45. I reflected on happiness from the activity. +46. I designed an HTML community, which included time travellers. +47. I also included mind readers. +48. I included meditators. +49. I invited users of text to breasonings. +50. I brought in users of the medicine quantum box. +51. I brought in members of the simulation. +52. I trained and professionally developed immortals. +53. I explained and used body replacement. +54. I made a command to insert predicates in memory. +55. I invited Lucianic Meditators (LM) to Buddhism and vice versa. +56. LM's unique advantage was helping with high distinctions and possible course planning. +57. The knowledge from LM helped people avoid problems later and was ideal after a degree and before studying meditation and medicine short courses. +58. A professor influenced me to study computer science and philosophy, and I learned spiritual meditation before starting University. +59. I remembered learning about zinc and vitamin B to combat viruses after my degree. +60. I also studied a short education course before writing texts in Honours. +61. During Honours, I found a spiritual way of preventing a headache, which is necessary for Honours and Masters. +62. I wrote a style translator that rewrote a sentence from an algorithm in my style. +63. I wrote a style translator that rewrote an algorithm from an algorithm in my style. +64. I caught a tram to the vegan restaurant. +65. I generally agreed with industries for animals. +66. I planned to work for myself with my business knowledge. +67. I made the products softer. +68. I sought all the professional advice I needed. +69. I took care of people. +70. I helped the customer come to the event by providing the best service. +71. I also gave As to the cushions. +72. The meditation teachers thought it was normal (secular). +73. I cleaned the thoughts. +74. I maintained the thoughts. 
+75. Also, everyday thoughts filled schools. +76. So were Universities. +77. I had As for the courses being advanced. +78. The courses were the best in the universe. +79. I noticed the electronic seen-as version. +80. I noticed sentences thought of for animals. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/SIMULATION/Simulation 1.txt b/Lucian-Academy/Books/SIMULATION/Simulation 1.txt new file mode 100644 index 0000000000000000000000000000000000000000..892ba50464e736458b4858fb41453a03e30e7461 --- /dev/null +++ b/Lucian-Academy/Books/SIMULATION/Simulation 1.txt @@ -0,0 +1,45 @@ +["Green, L 2024, Simulation 1, Lucian Academy Press, Melbourne.","Green, L 2024",1,"Simulation 1 + +1. The simulant ate the food. I gave the lambda expression input. For example: \"?-maplist([In]>>atom_concat(In,'_p',_Out), [a,b]).\". In this, [a,b] are inputs with In in the predicate. The result is true, indicating it worked. +2. The simulant gained exercise from the walk. I gave the lambda expression input, returning output. For example: \"?- maplist([In, Out]>>atom_concat(In,'_p',Out), [a,b], ListOut).\" In this, [a,b] are inputs with In, and ListOut is the output from Out in the predicate. The result is \"ListOut = [a_p, b_p].\" indicating \"_p\" was successfully concatenated to each of [a,b]. +3. The simulant used lambda notation to exclude certain variables. In the following, two clauses of q have the arguments [1,c] and [2,d], respectively. +q(1,c). +q(2,d). +In the following query, I used a lambda expression to find a set X equal to the first argument in each previous clause. +?- {X}/q(X,Y). +X = 1; +X = 2. +The results are X = 1 and X = 2, as explained. +The following queries have equivalent results and use existential quantification (^/2) to exclude variables and lambda expressions. +setof(X, Y^q(X,Y), Xs). +setof(X, {X}/q(X,_), Xs). +Both these queries return Xs = [1, 2]. +4. The simulant combined lambda expressions. 
In the following, the previous examples are combined: +q(a,c). +q(b,d). +?- bagof(X, Y^q(X,Y), Xs),maplist([In,Out]>>atom_concat(In,'_p',Out),Xs,ListOut). +The results combine those of the previous two examples: +Xs = [a, b], +ListOut = [a_p, b_p]. +5. The simulant checked the command was going backwards. In the following, I went backwards using lambda expressions: +?- maplist([In,Out]>>atom_concat(In,'_p',Out), A, [a_p,b_p]). +A = [a, b]. +This command recovered a from a_p and b from b_p. +6. The simulant compressed the code. I wrote the lambda expression in terms of findall in Prolog. This was: ?- C=[a,b],findall(A,(member(B,C),atom_concat(B,'_p',A)),D). +The result was: +C = [a, b], +D = [a_p, b_p]. +This code could be converted to a loop in C code. +7. The simulant found A in \"maplist([In, Out]>>A(In,'_p', Out), [a,b], [a_p,b_p]).\" equalled atom_concat. I wrote this findall command in List Prolog in tokens. This was: [[n,=],[[v,c],[a,b]]],[[n,findall],[[v,a],[[[n,member],[[v,b],[v,c]]],[[n,atom_concat],[[v,b],'_p',[v,a]]]],[v,d]]]. +The results were the same as above. +I initially thought lambda expressions were inductive and that currying was finding the operator. +8. The simulant found the concatenation and any conversion necessary. In an inductive algorithm, I found atom_concat. I found the types of input and output. I found the string difference or similar. I found atom_concat. +9. The simulant found the string difference. The first string was \"a\". The second string was \"a_p\". I found that \"a\" was part of \"a_p\". I found the string difference \"_p\". +10. The simulant used data sets and found patterns of changes in terms. I found the term difference. I used subterm with address, which found the multidimensional address of a subterm in a term. I found deletions, moves, insertions and changes. I recorded if the whole term was replaced or if there were multiplied changes. +11. The simulant found the deletion in the term. I compared the terms level by level. 
I checked each level, noting any differences. If an item was missing, it was recorded. It might be missing in several places, or several items might be subtracted. +12. The simulant recorded how the items had moved to replicate. I found the moved items in the term. I detected sorting, swapping, and mathematical shifts, such as always moving or collecting the first item. I decided, \"That's not moving; that's collecting\" when items had been collected given particular criteria. This criterion might contain a specific term or a command with a particular property. +13. The simulant found items inserted in alphabetical order. I found the inserted items. I found the items inserted in a level, inserted levels or other transformations applied to levels, such as sorted levels. Sorted levels were sorted levels in chains of one item per level. I verified that the move was all the change; otherwise, I checked for additional changes. +14. The simulant reduced the significant change to several more minor changes or identified what the items in the chunk had in common. I found the changed items. The changes were chunks of altered items. They may have been inserted in order. This ordering may have been given an index or an understood index. +15. The simulant specified simple cases of an exact item being deleted from a list of lists. I found that the item was missing in several places in the term. I checked whether it was missing in all places in the list. Or I checked whether it was missing in all places in a term. Or I checked whether it was missing in some parts of a term and not others. +16. The simulant might concatenate [3], giving \"3\". I found whether several items in a term had been subtracted. For example, [[1,[2,3]],[2]] might become [[[3]],[]] if [1,2] had been subtracted from the original. Additionally, [] might be deleted, giving [[[3]]]. Also, the term might be flattened, giving [3]. 
+"] \ No newline at end of file diff --git a/Lucian-Academy/Books/SIMULATION/Simulation.txt b/Lucian-Academy/Books/SIMULATION/Simulation.txt new file mode 100644 index 0000000000000000000000000000000000000000..76ac707048668efbfcb73bfedf5bff053bd3754e --- /dev/null +++ b/Lucian-Academy/Books/SIMULATION/Simulation.txt @@ -0,0 +1,12 @@ +["Green, L 2022, Simulation, Lucian Academy Press, Melbourne.","Green, L 2022",1,"Simulation + +1. Immortality +2. SuperCard, or any number of As +3. Work out with 250 breasonings that buying a product from my time represents buying a product to the future +4. The computer preferred for me to receive, rather than make up my own thoughts +5. I chose red positive, not blue negative thoughts +6. The behaviour that led to no disease was text-to-breasonings about meditation +7. My body was replaced +8. The building (which looked like my home) was an image, but maintained (replaced) and the weather, surroundings and people were all to my liking. +9. Heaven on earth for philosophers +10. Time travel enables Immortality"] \ No newline at end of file diff --git a/Lucian-Academy/Books/SIMULATION/SuperCard, or Any Number of As 1.txt b/Lucian-Academy/Books/SIMULATION/SuperCard, or Any Number of As 1.txt new file mode 100644 index 0000000000000000000000000000000000000000..74960370b9294c9da6a7b35217c3cc283709c415 --- /dev/null +++ b/Lucian-Academy/Books/SIMULATION/SuperCard, or Any Number of As 1.txt @@ -0,0 +1,46 @@ +["Green, L 2022, SuperCard, or Any Number of As 1, Lucian Academy Press, Melbourne.","Green, L 2022",1,"SuperCard, or Any Number of As 1 + +1. I pointed at the berry icon and clicked it to pick it. I picked fruit in the computer game. I found the berry. I picked it. I finished picking all the berries. +2. Essay helper could generate any combination of sentences. I noticed essay helper contained meditation, placing it above the world. This meant it could be used without anything going wrong. 
It found the draft, which could be finished off. It could be configured to generate essays in different referencing styles, using different input or mind reading methods. +3. Without modifying the KNN algorithm, one couldn't see the variation in the algorithm. My philosophy was in the news regarding avoiding cheating. The algorithm detected similar or same sentences using KNN (K-Nearest Neighbour). It used a bidirectional algorithm which also checked reversed sentences. It detected similar sentences that had the same first two words. +4. I preferred the bi-(time-)bodied existence because of the looseness of the future combined with the knownness (sic) of the present. Travelling opened my eyes to the world. I spent three weeks in the future, in both a simulated universe and a simulation of immortality. I lived in my world from the past, meeting people from the future walking around \"my\" neighbourhood. Time travel was free, except that yoga was required in the morning. +5. Far from keeping my simulation experience to myself, I posted my time travel method online and even thought I saw people I knew in the future (and thought, the simulation increased my quality of life all round). I time travelled while in the simulation. I worked out I had a flat in the simulation (not in the world). I firmly remembered to take my tablets, and avoided unfriendly people. I asked, \"Where is my flat, how do I pay for it, and things like that?\", to which the Universe seemed to answer, it is really all right. +6. The texts were watered down, not even including the topic, except the first paragraph. I selected a person to visit. I found a compatibly influential person. I found the location symbolic of success (for example religion). I got out when I finished my MBA. +7. I searched for images from past civilisations of people between two animals, like meditating before and after time travel. I set a goal of visiting people. I settled on a person. 
I visited him or her when I had finished a text. The simulation meant I could travel with a switch, and I meditated before and afterwards. +8. Daily meditation covered one return time travel trip, excluding meditation at the destination. I could time travel to other places and planets. I thought that it was faster than sending a message. I tried to visit people who had visited me. I visited safe places on Earth. +9. I noticed that 80 paragraph breasonings led to fleshing out more paragraphs, until everything was complete. I talked with someone who looked like Aristotle. I imagined talking about algorithms. I wondered whether the future empire would have a use for ancient algorithms, including mine. I concentrated on ways of improving the quality of the algorithms, including using the inductive Combination Algorithm Writer, writing better predicate and variable names using an algorithm and formatting the algorithm with its documentation uniformly. +10. I saved icons in ppm format and wrote a website which allowed running algorithms by clicking on the icons. I designed icons for algorithms. I found unique combinations of images, and wrote matching songs. I created pixelated, distinctive icons. I used bright and subtle colours and shades. +11. I wondered if the earthquake could have been prevented. I examined the functions of the simulation. I noticed the Internet stopped working at first. I noticed a loose, open feeling. I noticed thoughts' imagery and psychiatric health improved. +12. I studied my hand, thinking of 50 As for spiritual immunity represented by my five fingers. I assigned all relevant ideas to pixels of an alphanumeric character. I related the algorithm to medicine to improve its function, examine its positive benefits and help robots to understand it. I pretended to fly around the galaxy by going on a tram. In reality, the algorithm had interesting, easiness and usefulness indices. +13. 
I wrote a simplistic operating system with algorithm output \"at the top of the page\". I visualised graphical user interfaces with icons for my algorithms. I removed everything, nicknaming the operating system \"Paper\". It was like being in a simulation, feeling holograms, writing ideas by replicating ink on paper. I wrote an algorithm checker, which showed hotspots needing improvement in a fly-through. +14. I only wrote original ideas. I worked out the normal features of the algorithm. I thought of one normal and one outlandish version of each feature. I compared their usefulness. I chose the more useful feature. +15. The months were spent on writing philosophies or algorithms. I determined research directions of my philosophy. I reduced the philosophy to a series of algorithms. There were beginner, intermediate and advanced versions of each algorithm. In some cases, it was the same algorithm with different levels of detail. +16. The academy focused on arts-related algorithms. The teacher made a point of receiving the thought. He examined the thought. He stylised its appearance. I lit up pixels with thoughts. +17. I connected computers with University of the future, and I worked out the intermediate algorithms. The pedagogue could create the pedagogue. Much needed to be done. It needed to be researched. Both the content and audience needed to be examined. +18. I wrote on business and the simulation because the philosophy was radical. I evened up missing features in the interpreter. I wrote the list of features. I checked that they were supported in all parts of the interpreter. For example, converting an algorithm to a language and back returned the same algorithm. +19. I wrote the paragraphs to clarify the topic and sources in my head. I read about the topic until I understood the two sides properly. I attributed terms to each side. I wrote connections between multiple sources per paragraph. +20. 
I wrote that a type of complexity was the number of steps taken to process a number of data items. I wrote about Information Systems. I found complexity was necessary in the form of chains of simple thoughts. Also, I found complexity was necessary in the form of combinations of simple thoughts. In addition, I found complexity was necessary in the form of redrafting simple thoughts. + +21. The algorithm agreed with the author. The professor designed an API to help her complete work. She read books and articles in the source list to allow her to grasp the subject matter. She wrote columns for sides on an issue. She wrote rows for each text. +22. The supervisor and employee each wrote 5 As of details. The Arts honours graduate researched the area of study. He found other solutions to the same problem. He discussed them. He wrote for the length of the assignment. +23. The professor programmed the research (i.e. found classics, prestigious sources and summary-style journal articles). I researched the topic. I researched, rather than just wrote on the topic, to attain authority. I found central texts on the topic. I chose from 50 texts. +24. I explored breasonings algorithmically. The professor wrote algorithm writers for the whole department. He generated essay writing algorithms with different formats and question-answering algorithms. He wrote graphics generating algorithms with unique views, game-clocking algorithms, and game-testing algorithms. He wrote music generating algorithms with customised instruments that were based on the sound of orchestral instruments, those for algorithm-explanation accompaniment and those that mimicked and complemented brain processes. +25. I asked, is the mind reading of a non-computer-science-knowledgeable person or jokester such as a child, and wondered how to trick the person into making computational connections. I experimented with mind reading algorithmic points. I noticed that the computer could be quizzed on certain topics. 
I asked the computer top-down and bottom-up about what it was thinking about problem solving strategies. I split humour and summaries and found questions to ask about the relevance of comments. +26. I could apply 250 breasonings to myself. I converted myself to a bot to time travel. I imagined being shells, that could transcend the world as data and time travel. I wondered if the tram would blow up if I pretended that it was a spacecraft. I found out the 30 year old was actually 80 years old. +27. I asked, is the post-apocalyptic simulation emitted from multiple emitters or from another universe? I invited everyone to the bot and alien party. I asked Jazz and Christos if they would like to time travel. We visited parts of the universe in hours. I photographed the people, who looked human. +28. I planned to keep on living into the future, with friends and family around me. I visited the University in terms of my time and cleared up some questions. I decided study, food and accommodation were in the simulation, like my house. I could find a way of earning money while immortal. I found economically manageable qualifications to study every 10 years, and kept a record of my age. +29. I automated a system that could help a business based on enjoyment rather than money. The business found advantages of shadow bots. I wrote ideas to help completely work them into the business. Their comments were mind read. I connected through to seven sales in their lives. +30. I traced the customer through dimensions to entice them to buy. Instead of being visible, shadow bots are invisible to the self, generating buzz and sales when nothing else is working each day, mainly to maintain face. When the combination of conditions (i.e. not many customers, there are enough human customers (one) to have ten bots and the humans-turned-bots, bots of no past or time travellers pretending to be bots could be convinced to come and buy products) were met, shadow bots were evoked. 
There may be alien shadow bots. There may be human animal shadow bots. +31. The person was agreed with for social security by people. There were a number of people. The person had a name. The person had an address. They had money when they were given it. +32. They connected with all industries and were helped. The people were properly recognised. They were recognised as making sales each year. They had births, deaths and marriages. They were religious people. +33. The leader hand breasoned out 80 breasonings for the child. The new person required 50 As. He or she needed 50 As in medicine for perfect organs. He or she had 50 As for meditation. He or she needed 50 As for pedagogy. +34. The bots conducted themselves honourably. The bots were knowledgeable about the Code of Ethics. They were honest. They were fair. They were diligent. +35. The institution's systems were seamless, helping with every transition. There were three preparatory steps for each task. All required information was on the page. People were helped to feel like doing it. +36. The manager who couldn't see the shadow bot breasoned out an A to record the mark of the shadow bot, and breasoned out an A to finish work with shadow bots before each thought. Shadow mind reading was operational. I could tell the shadow bot was there, asking about the course. I gave them the answer. I listened to them ask questions about it. +37. The children were avid computer scientists. The bots made up their own thoughts when there were enough As, and I had to go along with them. I judged the thought. I helped with it if it was positive. I stopped it if it was negative. +38. The employee was a bot, saving money (they were not paid with As, but money from an equity loan from the Bank), and a whole University could be based on the idea. I detected the shadow bot using a computer, but wrote an algorithm that produced an A to record the bot's thought, rather than writing the thought with the other students' thoughts. 
When I called the student by telephone, I breasoned out the A to call the bot by telephone. When the student asked a question, I breasoned out an A to register when the bot asked a question. When the student needed help from an employee, I gave the employee an A to help the bot. +39. I helped the bot to an A for meditation when she wanted to meditate, for her headache to be prevented. I passed the requirements for a loan. I breasoned out an A to ask a friend to apply for the loan. If it was to pay a human employee, I asked a friend to apply for the loan. If the human had a medical problem, he consulted the bot psychologist, who, for example, helped relieve her headache. +40. I planned the virtual tour (I walked into the office to talk with them and didn't worry about the grant for the tour, which was spiritual, and I could have virtual philosophy and computer science help for my works). I established a University. I gained the support of the people and government. I found funds. I found a building. +41. Natural law helped the people to be there. I wrote 50 As for law to support me. I found the best things. I loved the people. I travelled around. +42. I wondered, can I mind read myself earlier and also write algorithms in this way? I wrote the text, mind read comments and wrote replies as details. I redrafted the text. I stopped when I was satisfied with the text. I redrafted the conversation until I was happy with it. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/SIMULATION/SuperCard, or Any Number of As 2.txt b/Lucian-Academy/Books/SIMULATION/SuperCard, or Any Number of As 2.txt new file mode 100644 index 0000000000000000000000000000000000000000..8d1603937ef37ac9fe1871c1e2e68fde7579ce4f --- /dev/null +++ b/Lucian-Academy/Books/SIMULATION/SuperCard, or Any Number of As 2.txt @@ -0,0 +1,15 @@ +["Green, L 2022, SuperCard, or Any Number of As 2, Lucian Academy Press, Melbourne.","Green, L 2022",1,"SuperCard, or Any Number of As 2 + +43. 
Education was the synthesis of the As for the skills used. I found the pathway for the student. I wrote the area of study practicum for the assignment. I wrote the education practicum for the assignment. I wrote the business (enacting) practicum for the assignment. +44. The problems were solved instantly. I found the funnel for the student. I found whether the student had an unnegated question about the content. The material was easy because it was necessary to synthesise recognisable parts. I used clear language in the essay. +45. The customers found clear pathways to what they wanted. I found the pathway for the customer. I wrote 50 (sentence breasoning) As for business, including sales and bots. I answered the customer's questions concisely, and in full. Some of the customers' friends were aware they needed to buy products, so they used text to breasonings for them. +46. The business was with-it over the reason (sometimes ideas to complete) for the sale of the necessary product. I found the funnel for the customer. I examined the reason the customer fell out of the sales funnel using mind reading. I completed the B to the B. I was aware that the customer needed products to encourage them. +47. I increased the fetus's lungs during pregnancy. I experimented to tell whether most of the students could understand an idea. I wrote down the idea from epistemology. I mind mapped it. I tested whether the students had knowledge about and motivation to write on the topics. +48. The bug finder algorithm used trial and error to use correct bracketing in the algorithm. The vision-impaired coder changed the algorithm with the changes to the specification. He found whether the code could be changed by replacing the parts of it that affected the output. If necessary, he added a routine that could produce the required output. The vision-impaired person produced simple algorithms. +49. I identified the very steep gradient. 
The bug finder algorithm identified the missing epsilon value. I found the first value. I found whether the first value was close to the epsilon value (very high or close to 0). If necessary, I inserted the epsilon value. +50. The hint software was used in speed coding competitions. I presented the disabled person with a list of predicted next commands based on the specification of the algorithm. Multiple options may lead to possible solutions. There were hints about wrong commands or wrong data. The person who wrote the hint algorithm finished the algorithm more quickly. +51. I documented variable data and its format. I detected errors in the arguments of the predicate. I found errors in the data using static analysis. I found errors in the order, bracketing or effect in the specification in the data. I found errors in the actual variables or data used. +52. The vision, hearing or intellectually or motor-disabled person, etc. could use the software. I argued that computer science algorithms could help disabled people. They could use a version of essay helper which connected the connection to the argument in six ways. I converted CAW output to findall or maplist. I reused intermediate CAW output with functionalism. +53. I simplified unnecessary findalls. I used program finder to translate instances of a variable in findall. I found a case of a predicate like findall. I found the resulting variable structure with program finder. I found nested findall. +54. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/SIMULATION/Universal Translator.txt b/Lucian-Academy/Books/SIMULATION/Universal Translator.txt new file mode 100644 index 0000000000000000000000000000000000000000..9b8ff2162deac6f908c78fce87d930b5b364bd8c --- /dev/null +++ b/Lucian-Academy/Books/SIMULATION/Universal Translator.txt @@ -0,0 +1,60 @@ +["Green, L 2022, Universal Translator, Lucian Academy Press, Melbourne.","Green, L 2022",1,"Universal Translator + +1. 
The simulant avoided time transgressions while speaking. I could talk to people in my language, which the universal translator translated into their language, and, using the universal translator, I could understand people speaking other languages in my own. The people were already in the simulation and were open to the universal translator. I was the speaking partner, spoke in our languages, and we experienced the other speaking in each other's language simultaneously. I surmised it worked by collecting grammar and pronunciation from people's lives, cultures, and texts through mind reading. +2. The simulant spoke with others about software he had written from the ground up. The universal translator was activated when time travelling. I was in the simulation whenever I wanted and could converse with anyone in any language. I was interested in computing and was conscious of robots physically researching libraries and times. I considered releasing my open-source software in other programming and human languages. +3. The simulant overheard others talking in other (current) languages, resembling those from the future. The people with robot implants appeared to have no arms to prevent scientific transgressions. Upon second thought, they might have been born that way or had an accident. At least, the simulation protected people by \"installing\" bots with a plausible local story explaining away the people I experienced through the simulation. Only I could meet these people, and people from my home time could only see me. +4. The simulant attempted to account for different ideas in the times and circumnavigate simplifications, protections and important hidden information by the universal translator. The people appeared to be from the time because of their language and clothes. I concluded that people in the simulation chose to speak in my language because they trusted me and were similar. 
I also concluded that an algorithm chose certain people to appear in the simulation based on shared interests, thoughts and projects. I was curious how original cultural phrases and nuances in meaning were translated, and I tried to compensate for them in my mind. +5. The simulant valued the universal translator's messages and enjoyed people's company. Visiting a time with an A was nicer. I wrote an A about my interests, way of making money and preferences (space travel or friends). The universe was big and seemed to be thoroughly researched. I was happy to keep my appearance. +6. The simulant studied education, business and medicine Master's degrees. I wrote an A after the halfway point of my Master's degree. I met people and discussed topics of interest to us, considering what was appropriate by their and their time's ideas. I preferred to travel to the future, which was reliable, and they understood my algorithms better. I used an A to remain living, backed up by other As in my life, and explained how high distinctions could help attain immortality, during which I could study and talk to people. +7. The simulant believed the simulation helped them debug complex algorithms. I wrote the translation command, which recognised speech, translated it and played it. I wrote grammars from languages from my time. I looked for clues from body language and inferred visitors' and visitees' meanings. I used a computer algorithm to logically deduce inner and computationally relevant information from my conversations in real-time. +8. The simulant cared for their mental health and remained involved and informed. I researched the language from books and previous conversations. I assumed future people thought about a mixture of simplified or ground-up reasoning and advanced (no-go) technologies. I was content to talk about the ground-up science and help and be helped. I avoided discussing future technologies with anyone and took care when walking daily. +9. 
The simulant believed the simulation was the peak of civilisation and enjoyed breasoning and meditating. Without the universal translator, people would be drawn to research and take away knowledge, leading to transgressions. People could still research their ideas by scaffolding with breasonings and writing computer algorithms. The universal translator helped direct, clarify and solve problems, especially of a computational nature. Once, I overheard a girl from the future say she flew across the universe, emphasising the need for understanding her mental state from computational meditation necessary for time hops, and I thought she needed a miraculous expansion of breasoning algorithms that were the secret example of the time hop thoughts. +10. The simulant joined the simulation to help them form a happy lifestyle. The universal translator allowed friendships to form. I could converse and make friends with people I met. Spoken language allowed complex idea formation and computation, and written language allowed robotics, records, and communication. The universal translator saved lives by overcoming language divides and helping those in need. I felt grateful that the universe allowed life-sustaining simulations with time travel and meditation. +11. The simulant believed in the existence of the universal translator because of the meditational belief they had time travelled and the assumption that languages would have changed markedly in time. I didn't rely on the universal translator. Instead, I wrote ideas based on knowledge from my time. Even though the conversations were few and far between, I found myself talking with people from my home time. The university seemed open to the simulation. The other students were people from my home time, but simulants. The content was from my home time, and I couldn't detect transgressional content. +12. The simulant was pleased that the meditation skill was transferable to the future. 
I didn't take risks with meanings with the universal translator; instead, I spoke clearly. I once talked with a boy on a tram about nested findall in the State Saving Interpreter Web Service, and he, a local boy, mentioned he knew about the LLVM intermediate code optimiser for different platforms. I considered DevOps in public before a future doctor but was unsure whether this was as simple as today. I later learned a variant of DevOps was required by law to ratify simulation and space technology. I felt like appearing as a meditator to future simulants was a treat for each other because meditation made the simulation and space travel safer. +13. The simulant taught spiritual medicine in thought on the tram. I took notice of local ways and took care when mentioning physical childbirth, for example. I expected that gays could go through life without worrying about the trials and tribulations of having children and focused on writing computational philosophy. I was surprised that my algorithms (almost) became obsolete when neuronetworks became popular, and I thought my algorithms were a historical record. I became cognisant of the importance of teaching meditation and breasonings. +14. The simulant avoided mistakes and accidents with the help of the universal translator. I noticed that an intelligence, not the universe, designed the universal translator. I didn't need the universal translator to converse with a time traveller who was a medicine student I met while he was waitering before I became a simulant and taught him meditation. This knowledge would hopefully have made sense to him about time travel and medicine. I decided to be aware of the smartest intelligence about the simulation and admired the universal translator and its respective parts, including help with thoughts and integration with immortality. +15. The simulant wrote the spiritual software and any updates integrated with meditation and the future. 
I noticed that the latest changes to the language propagated through the system. On a separate topic, I felt that my thoughts about programming and writing had flow-on effects for future (and past) writing. I could remember and think clearly (partially with the help of my medication), and the simulation was a better setting to live in. The universal translator and the feeling of support (which unusually made me feel like slowing down, not speeding up) helped me carefully write thoughts, also with the help of my Grammar-Logic mind-mapping tool, which I wrote before becoming a simulant. +16. The simulant realised that if he spoke, it would be a modern character talking about ancient arts. I noticed that the universal translator translated the current, not the past or future, forms of the language. In reality, I noticed the system represented the thoughts and appearance of people in a way that was true to themselves, and I felt safe and sound about avoiding risks. I wondered whether the system conveyed a girl's possibly rude description of the universe as a \"unit\", making fun of breasonings. Possibly, I was the one who needed to realise that this was the correct name of the universe, that breasonings were still prevalent, and that I was an excellent person to meet. I wondered if her parents hoped our meeting helped her catch on to breasonings. + +17. I noticed that the language was specific to the person at the time. +18. One time I remember someone making what sounded like a grammatical error in English. +19. I wondered whether the error was on purpose, to suggest the person was speaking informally, or whether they had no counterpart in their language. +20. I was aware that breasoned out thoughts thought of while speaking were translated. +21. I was aware back-translation might improve the quality of text to be translated. +22. I acknowledged the person speaking. +23. I was worried about a lingua franca (a blend of languages) getting lost in translation. +24. 
I chose the idea of a language to teach and taught it to the person while time travelling. +25. I worried about object names changing (for example, talking about a tram as an intergalactic ship). +26. I left the details of space travel to the captain. +27. I drew the object that I had referred to, and realised that it had been translated or edited out. +28. I earned the award for the universal translator software increasing thoughts where necessary. +29. I noticed that the writer of the universal translator had researched languages from both times and that transgressions had been edited out. +30. I avoided saying anything unnecessary to be translated. +31. From my understanding, some ideas could be quickly computed, and I shouldn't rely on difficult things or make mistakes. +32. My research was the equivalent of a Prolog algorithm, for example, travelling to the city and back. +33. I contemplated visiting the giants, at the best times, and researched their languages (which would be translated). +34. I was unsure whether the universal translator existed, because I was speaking to people in English and overheard someone talk in Chinese, so why would the universal translator choose one language to be translated into English over another? +35. I also wondered whether the universal translator existed because the people seemed to have English accents. +36. I thought the universal translator existed because I heard someone mention a country name that I recognised, when it was so far in the future (assuming time travel had worked). +37. I noticed that voids were knowledge-filled with the help of the universal translator, for example, I could make philosophical connections easily while time travelling. +38. I wondered whether the universal translator had a tabling feature that saved common translations. +39. I agreed with the authenticity (necessity) of the universal translator. +40. 
I queried whether the universal translator was natural if it was man-made, and then realised that breasonings and universes may be man-made. +41. I didn't notice the universal translator when speaking with someone. +42. I noticed the technology got me safely home when the universal translator had skipped some data (like time travel worked with meditation). +43. I thought that the simulation was the most complex algorithm, followed by time travel and replication, universal translation, then mind reading and an internet browser. +44. The universal translator gave a signal to exit, for example when the conversation had finished. +45. The universal translator gave a signal to start a conversation, for example to remind the people to pick up dropped clothing. +46. The universal translator encouraged the conversation to be about knowledge in the intersection of the participants, i.e. mentioning that the angles of a polyhedron could be calculated using the mathematics of the day. +47. The time travelling student only experienced content from their home time, with no transgressions of knowledge. +48. The universal translator both helped the conversants to think of, and breason out, the objects they mentioned. +49. The time traveller and the bot cloning his movements in his home time experienced the same type of thing as usual - the home bot didn't need the universal translator and the time traveller experienced universally translated intersectional knowledge, perhaps the home bot experienced a reality with bots representing the people the time traveller experienced, in terms of people from the home time. +50. Repeating or going on to the next step in the conversation was represented, edited out or added by the universal translator. +51. Memories of previous parts of the conversation were maintained by the universal translator. +52. There was a free car inserted by the simulation matching one in the home time, mentioned by the universal translator. +53. 
The idea of a single universe was scorned, and the robot government accounted for all universes, as represented by the universal translator. +54. The people experienced life and objects subjectively, branching out to value the realism of others' lives with their own. +55. The time traveller showed an interest in future history, survival and time travel as a life-saving device. +56. after tempold +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Short Arguments/3 50s.txt b/Lucian-Academy/Books/Short Arguments/3 50s.txt new file mode 100644 index 0000000000000000000000000000000000000000..48fbaceb78c4c54e828c2c93fe22a48e4e10073b --- /dev/null +++ b/Lucian-Academy/Books/Short Arguments/3 50s.txt @@ -0,0 +1,3 @@ +["Green, L 2021, 3 50s, Lucian Academy Press, Melbourne.","Green, L 2021",1,"recordings +instead of 250 br for medicine, 10 br +Sree Harsha"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Short Arguments/50 Breasonings Per Second.txt b/Lucian-Academy/Books/Short Arguments/50 Breasonings Per Second.txt new file mode 100644 index 0000000000000000000000000000000000000000..47293b9f993633c2b6a41f869427268e0050dbc1 --- /dev/null +++ b/Lucian-Academy/Books/Short Arguments/50 Breasonings Per Second.txt @@ -0,0 +1,13 @@ +["Green, L 2021, 50 Breasonings Per Second, Lucian Academy Press, Melbourne.","Green, L 2021",1,"50 Breasonings Per Second + +1. I prepared to do everything by paper weights. I did this by going to the future. First, I found the future matrix. Second, I examined it. Third, I breasoned out 50 objects from it. +2. I prepared to help the meditator achieve his goals. I did this by noticing the meditator. First, I found the meditator. Second, I asked him to write 50 breasonings. Third, I observed him write 50 breasonings. +3. I prepared to utter more instances of the utterance. I did this by stating that I could think of the utterance. First, I thought of the 50 breasonings. Second, I thought of the utterance. 
Third, I spiritually uttered the utterance. +4. I prepared to synthesise the parts of the image. I did this by stating that 50 breasonings per second helped display thoughts. First, I thought of 50 breasonings. Second, I thought of them in an image. Third, I displayed the image in one second. +5. I prepared to find the path. I did this by examining the self (the mantra). First, I examined the self. Second, I examined the idea. Third, I found the map fragments. +6. I prepared to find fifty breasonings. I did this by understanding. First, I examined the text. Second, I found the diagram. Third, I thought about the way that the idea worked. +7. I prepared to check against fifty breasonings. I did this by overdoing. First, I spent. Second, I examined the level. Third, I found the level below. +8. I prepared to track time. I did this by walking on the beach. First, I looked around. Second, I bought drawing materials. Third, I found the location of the key. +9. I prepared to review the text. I did this by finding the thoughts. First, I found honour. Second, I created the thing-in-itself. Third, I found the unseen. +10. I prepared to be in the middle. I did this by stating that I knew. First, you said what you meant. Second, I drew the image. Third, I told you more. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Short Arguments/About Text to Breasonings.txt b/Lucian-Academy/Books/Short Arguments/About Text to Breasonings.txt new file mode 100644 index 0000000000000000000000000000000000000000..8f20e3db6e9ec3175629d97b7f1c091032052b56 --- /dev/null +++ b/Lucian-Academy/Books/Short Arguments/About Text to Breasonings.txt @@ -0,0 +1,46 @@ +["Green, L 2021, About Text to Breasonings, Lucian Academy Press, Melbourne.","Green, L 2021",1,"About Text to Breasonings + +The technique to prevent a headache involves meditation. +Meditation improves health by helping reduce stress. + +Mentally preparing to think of breasonings by thinking of god. 
+Users have shown breasonings to have educational as well as medical properties. +God, a person who is helping us, helps us think of the breasonings that can help earn high grades, earn jobs, sell products, have children and help us medically in other ways, including preventing headaches. + +Doing physical exercises to voluntarily control what is usually involuntary. +Yoga and qi gong can relax the body and allow us to control body processes that are usually involuntary. + +Endowing various spiritual degrees on oneself. +Given that one can have any number of breasonings from nature for any number of breasonings, one can have degrees from any department each day for 10 breasonings, which correct, complete and do needed breasonings in those departments, often with beneficial effects. +This can help complete processes needed to prevent headaches. + +Optionally using a computer to perform work, without which a headache would occur. +A certain number of breasonings doesn't only satisfy the need for work but can prevent stress when work is naturally expected. + +By thinking of breasonings about anything, not just what a pedagogy helper says. +In the pedagogy way, students naturally learn from a teacher before taking on more skills. +Breasonings that are written down may be successfully visualised rather than a pedagogy helper telling them to us when this particular skill, the lecturer argument, is attained. +Pedagogical arguments about switches that exist in the body and mind should be written (e.g. about how they can work and their effects) to activate them, to e.g. prevent headaches. + +Naturally think of breasonings. +This skill enables one to think of breasonings implicitly, i.e. have them breasoned out naturally. +This argument, the recordings argument, is required for the text to breasonings algorithm to do work in ways that are not specified, which is required to prevent headaches. + +And using an algorithm to transform text into breasonings. 
+Text about objects can be transformed into their spiritually significant x, y and z dimensions to earn high distinctions and e.g. trigger spiritual switches to switch off stress and prevent headaches. + +Where spiritual instructions are used for the algorithm to be safe. +Safety precautions (medicine arguments) need to be used with the algorithm to make it function properly and prevent possible haemorrhoids or conjunctivitis (from too many breasonings being given by the computer to the human or these being given improperly). + +And the person to feel comfortable like from their job. +The so-called hand breasoning switch and the job medicine argument both do things for the human in the background; the hand breasoning switch performs medicine arguments, effective for a grace period, when they have been forgotten and dots on the instance of the program when running text to breasonings, and the job medicine argument switches off back aches and pains before the pain switch can be used finally. + +And the user can run the algorithm with one of three plans, which build up to the maximum breasonings per week over the recommended time. +A maximum of 100 As (where an A is 80 breasonings) can be breasoned out directly by the computer for the person per week, and the rest is placed on recordings, or not played directly. +Recordings are breasonings that are reused. +The plan that uses text to breasonings daily is most effective at preventing headaches. + +And mentally switching off the headache before and at the time. +The human or computer methods may switch off the headache. +If an impending headache is felt, it can be switched off at the time. 
+"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Short Arguments/Aigs Theatre Acting Agent.txt b/Lucian-Academy/Books/Short Arguments/Aigs Theatre Acting Agent.txt new file mode 100644 index 0000000000000000000000000000000000000000..d6b9ce065ff6d9f0d36c732ddfd8761ffe77f926 --- /dev/null +++ b/Lucian-Academy/Books/Short Arguments/Aigs Theatre Acting Agent.txt @@ -0,0 +1,13 @@ +["Green, L 2021, Aigs Theatre Acting Agent, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Aigs Theatre Acting Agent + +1. I prepared to compare the reasons of the actor and character. I did this by stating that I loved Aigs acting. First, I thought of the overarching Aigs (theme). Second, I related the current line to the Aigs. Third, I thought of the reason for the line. +2. I prepared to act on television or film. I did this by stating that the actor meditated before using Aigs. First, I started my daily routine. Second, I meditated. Third, I used Aigs. +3. I prepared to examine the characters' faces. I did this by stating that the actor enjoyed using the monocle costume item from the Aig. First, I removed the monocle from its case. Second, I placed it between my right eye brow and bottom of eye. Third, I took it out and wiggled me eyebrows. +4. I prepared to found you. I did this by recording what I knew the characters were thinking in the Aigs. First, I found the dream. Second, I found the dream's dream. Third, I found the dreams' dreams' dreams. +5. I prepared to record. I did this by recording that the self knew the other knew the self. First, I stated that the self was known. Second, I observed that the other knew the self. Third, I stated that the self knew the other, who knew the self. +6. I prepared to be a dear. I did this by recording the image of nurturing the nurturers in Aigs. First, I found the nurturing presence. Second, I nurtured it. Third, I recorded the image. +7. I prepared to power the pommelles. 
I did this by knowing what the music about the character from the Aig meant and playing it. First, I found the character. Second, I found her music. Third, I played it. +8. I prepared to cross-connect over four pomodoros. I did this by inputting As to access the acting agent Aigs. First, I found the acting As. Second, I inputted the acting Aigs. Third, I found the acting agent Aigs. +9. I prepared to notice the famous audience member. I did this by inputting the As to access the Aigs parts for the audience members. First, I found the As. Second, I inputted the As. Third, I accessed the Aigs parts for the audience members. +10. I prepared to receive the accolade. I did this by using medicine to prevent a headache from Aigs (maintained head comfort). First, I used spiritual medicine to prevent a headache. Second, I used Aigs (a spiritual switch that helped me stay confident when acting by projecting imagery). Third, I worked the acting shift. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Short Arguments/Appearances.txt b/Lucian-Academy/Books/Short Arguments/Appearances.txt new file mode 100644 index 0000000000000000000000000000000000000000..57ae2533edaca88e1f52d3ad692264fafa5e9693 --- /dev/null +++ b/Lucian-Academy/Books/Short Arguments/Appearances.txt @@ -0,0 +1,13 @@ +["Green, L 2021, Appearances, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Appearances + +1. I prepared to receive the positive effects of writing the algorithm in the future. I did this by displaying the famous appearance. First, I said it was either people or controlling the hallucinations. Second, I said it was the people. Third, I said that's what's famous. +2. I prepared to verify the A against the appearance. I did this by stating that there was a solution to the problem of 'A'. First, I wrote the A. Second, I read it. Third, I looked at its high-quality appearance. +3. 
I prepared to help the appearances, increase to academia, recommend meditation and improved logical thinking. I did this by stating that the student constructed a logic of the appearance. First, I observed the plebeian write the number of As on different topics. Second, I observed her either have children or something else. Third, I read research about graduating from the Master degree. +4. I prepared to find the non-monotonicities. I did this by stating that the Science was an appearance. First, I found the Science. Second, I wrote the language of appearances. Third, I represented the Science in the language. +5. I prepared to reveal the consequent as a future use. I did this by stating that the appearance of the antecedent was stronger than that of the consequent. First, I found more energy in the first step. Second, I found less energy in the second step. Third, I chose the first step. +6. I prepared to state that three-step algorithms led to stronger conclusions, and that algorithm specifications were stronger than their uses. I did this by stating that the appearance of the consequent was stronger than that of the antecedent. First, I wrote the reason. Second, I wrote the eternal conclusion. Third, I couldn't avoid thinking of the reason as well. +7. I prepared to produce a movie of the subject of the argument map, removing other uses. I did this by stating that the object appeared to be in the other object. First, I explained that the reason was in the conclusion. Second, I showed that multiple reasons could be in a reason. Third, I showed that uses required a neuromodel. +8. I prepared to watch the dancer help her memory. I did this by stating that the dancer appeared. First, I stated that the dancer represented logical verification. Second, I thought of the rule of dancing. Third, I found the freedom of philosophy. +9. I prepared to calculate Computational Quotient (how intuitive the software was) in the Computational Doctorate. 
I did this by examining the appearance of the algorithm step. First, I made fun of it and wanted a whole algorithm. Second, I examined its fine art. Third, I noticed that the three links (between C.S. and F.A., and internal links in C.S. and F.A.) were prayed for. +10. I prepared to argue that computer science was one-dimensional. I did this by seeing that the appearance was always perfect. First, I stayed close to the first thought in computer science. Second, I moved straight to agreement. Third, I checked the values on either side. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Short Arguments/Blue Nature.txt b/Lucian-Academy/Books/Short Arguments/Blue Nature.txt new file mode 100644 index 0000000000000000000000000000000000000000..58582fceab9033bc5aa70f32ac3bc5019c065a70 --- /dev/null +++ b/Lucian-Academy/Books/Short Arguments/Blue Nature.txt @@ -0,0 +1,13 @@ +["Green, L 2021, Blue Nature, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Blue Nature + +1. I prepared to prevent the flood and fire. I did this by observing the sea creatures. First, I observed nature go around me. Second, I was safe. Third, I had some good things in the middle. +2. I prepared to buy the Bentwood Chair by Michael Thonet. I did this by stating that I am a member of nature. First, I distanced myself from nature. Second, I helped you to it. Third, I followed man-madeness. +3. I prepared to test properties of nature. I did this by enjoying blue nature. First, I trained nature. Second, I assessed nature. Third, I joked, I imitated not followed nature, in fact. +4. I prepared to meet safety requirements. I did this by enjoying blue nature with you. First, I found the block. Second, I removed it. Third, I walked there. +5. I prepared to help make a model. I did this by noticing that there was one creature. First, I avoided it. Second, I stayed home. Third, I did something else. +6. I prepared to design nature around it. I did this by noticing all of nature. 
First, I wrote the name of the object. Second, I wrote your name. Third, I wrote what you said it was.
+7. I prepared to design it well. I did this by stating that the creature developed the direction. First, I said 'I did'. Second, I made you do one thing you liked, and I liked it too. Third, I designed around nature.
+8. I prepared to like it. I did this by stating that the creature was in the object. First, I looked at it. Second, I took it out. Third, I designed a mould for it.
+9. I prepared to be a good person. I did this by stating that the creature was on the object. First, I found you. Second, I found what you wanted. Third, I did what you wanted me to do.
+10. I prepared to design it first. I did this by stating that the caretaker created the space in nature. First, I found space. Second, I found you in it. Third, I moved on with you in my arms.
+"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Short Arguments/Competition 2.txt b/Lucian-Academy/Books/Short Arguments/Competition 2.txt +["Green, L 2021, Competition 2, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Competition 2
+
+11. I established equality-influenced competition. I agreed with competition. Competition was between at least two people. It increased the quality. It kept the rules fair.
+12. I noticed the tactics of the winner. I watched the competitors running. They started together. One took the lead. One won.
+13. I recorded the thoughts as they walked home. I watched the competitors walk home. They finished the race. They started walking home. They each went home.
+14. I analysed the data in religion. I agreed with religion (philosophy). Religion had enough data. Philosophy examined the data. Computer science finished analysing the data.
+15. I noticed explicitism (sic) needed bots and a way to confirm new customers. I produced the competitor. I noticed the 'financial products' were breasoned algorithms in different departments. I gave the customer the As. They performed.
+16.
I imagined that I was a competitor. I chose perspectives to endorse in competition. Competitors should be fit. They should use fair strategies. Adjudication should be fair. +17. The game was to complete the mind read thoughts (eventually allowing the computer to check/play against probable answers). I chose objective competition. I programmed the game. I made a simulated intelligence player (used mind reading). I enabled players to compete in the game. +18. Creationism (putting an idea before the start) is competition. I defined the criteria to improve code. There needed to be enough good ideas. Fun, the game, was clocked in business. +19. I created a simulation to support thinking of up to 200 thoughts on a topic per day. There was competition to provide an educational institution. Everyone won. There were changes. 4*50 As protected the text during them. +20. I won by eating healthily. Veganism is pro-competition. I ate one vegetarian meal per week. I avoided unnatural chemicals. I avoided genetically modified organisms. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Short Arguments/Competition 3.txt b/Lucian-Academy/Books/Short Arguments/Competition 3.txt new file mode 100644 index 0000000000000000000000000000000000000000..5e2133a82fd269e420745d0850686a34f96262e9 --- /dev/null +++ b/Lucian-Academy/Books/Short Arguments/Competition 3.txt @@ -0,0 +1,13 @@ +["Green, L 2021, Competition 3, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Competition 3 + +21. The winner was especially recognised. People picked up speed with competition. The starting gun went off. The competitors picked up speed. The top 3 finalists were recognised. +22. I recorded information when I looked up. The third eye processed the oncoming ray from the most illustrious competition. I heard them coming. I looked up. I read the number on the competitor's vest. +23. I could think clearly in a short time. I wrote and won cosmic competitions in fine arts with language. 
I noticed the vague technology. I visualised the database problem using medicine technology in my mind. I said the answer.
+24. Clarity led to the pursuit of merit. I solved society's problems with competition. There was enough money. There were enough clear-thinking people. The people made money in the competition.
+25. I was still there. I prevented the problem with pedagogy before conception. I detected the number of breasonings. I breasoned out the rest. I helped them for next time.
+26. The air also didn't escape into space. The competitor solved the space-air paradox. Why are there some particles in space, but we cannot survive? Can we swim in space with technology? The technology was a holocap.
+27. The intransitive verb, walked, was allowed. The competitor with the larger closest object in the logos won the competition. The logos was the reason. The first competitor's reason was 'I bowled the ball at the pins'. The second competitor's reason was 'I walked to the house', and he won.
+28. I explored losing as well. I agreed with the competition. I found the competitor. I saw he was weaker. I agreed with (was stronger than) him.
+29. It was the time to avoid wasting time. I noticed what the children said. I overheard the two children talking. One said that she lost by a larger margin. She learnt more.
+30. I noticed an algorithm with a maze between the additive and subtractive improvements. The competition was staged. I noticed the stage. I noticed the additive improvements. I noticed the subtractive improvements.
+"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Short Arguments/Competition 4.txt b/Lucian-Academy/Books/Short Arguments/Competition 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..c0bf6b8570d498ae3aa93db28e374966acb04a84 --- /dev/null +++ b/Lucian-Academy/Books/Short Arguments/Competition 4.txt @@ -0,0 +1,13 @@ +["Green, L 2021, Competition 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Competition 4 + +31. I could enter other competitions. I tried hardest in the competition. I wrote. I worked out imagery. I won. +32. The winner needed the algorithm to win. I noticed how the competitors were (I was) dysfunctional (functional). I wrote the algorithm to win the competition. I checked it. I ran it. +33. Their win was learning. Everyone had a turn because of continuous improvement. I saw the continuous improvement cycle. People took turns because of it. They rotated and learnt. +34. There were helpers. Every competitor was helped. The competitor tried. She formed a specific question. She was helped. +35. I helped the other to follow the rules. The other was helped. The group on our side was a competitor. The people in the group were competing with each other. At least in the way of following the rules. +36. I had better results with better genes. Competition was changing to better genes with meditation. I meditated. This conserved my genetic health. Competition chose meditation. +37. The competition stood for the same values. The competition stood fast. The competition was an institution. It helped with pedagogy, and algorithms. Some of the competitors helped maintain the institution. +38. The reward was recognition. The competition was fair. No one cheated. There were rules against cheating. The people acknowledged resources they found. +39. The people performed better with equitable help. Competition of everyone was equality. The student collected his own ideas. 
By the start of the next major, there was more help. I agreed with help. +40. I stayed close to education. The woman was a competitor. She went out. She came back, successful. She supported others. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Short Arguments/Competition 5.txt b/Lucian-Academy/Books/Short Arguments/Competition 5.txt new file mode 100644 index 0000000000000000000000000000000000000000..6734f532a8870ed49e80276c2908bbe94880f95c --- /dev/null +++ b/Lucian-Academy/Books/Short Arguments/Competition 5.txt @@ -0,0 +1,12 @@ +["Green, L 2021, Competition 5, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Competition 5 + +41. The winner of the tournament was recorded. The competitors moved to higher ground. The competitors won. They moved to the next round. The best competitor won. +42. He earned a job. The gay competed in lifelongness. The gay strove for equal rights. He won them. He got married. +43. The subjects competed for research status. The subjects competed against each other. The subject was a person. He collected comments on an area of study each day. He formed connections with other subjects. +44. She won by putting in a special effort. The students competed against each other at games. The student trained at a sport. She met the standard. She competed with others. +45. The competitor won a personal best. Meeting the standard was the competition. The standard was below a certain time. There were tests for cheats. There were participation prizes. +46. The originals, way and results were recorded. Being the best is the most interesting competition. The plants were not genetically modified. They were crossed. The best ones were chosen. +47. Usually they wrote on theories and original ideas in terms of a developed argument. Everyone competed successfully. Competition came back. The ways to mindmap interpreters, databases, decision trees, state machines and one's own computer science (through creative writing) were found. 
The ways to mindmap philosophy of phenomenology, interpretation, induction and pedagogy, etc. were found.
+48. The arguments were worked on by the husband and wife. The large female tribe practised competition to give birth. They worked out how to do everything in enough time. They thought of or wrote arguments for each idea. One set was before conception.
+49. I aimed for it my whole life. I supported general competition rather than only property competition in life. I thought of the necessary food for a healthy life. I generally worked out that this food should help with longevity. I examined Chinese medicine for longevity.
+50. The spiritual machine was a person who was strong in all departments. I competed using the machine. I programmed a computer to run an algorithm and account for objects as it went. It ran an algorithm that could write any algorithm. The machine had good self-confidence and rights."] \ No newline at end of file diff --git a/Lucian-Academy/Books/Short Arguments/Competition 6.txt b/Lucian-Academy/Books/Short Arguments/Competition 6.txt new file mode 100644 index 0000000000000000000000000000000000000000..6f55c62e23ec4f39e5f251c9d6961bf24f4d6df4 --- /dev/null +++ b/Lucian-Academy/Books/Short Arguments/Competition 6.txt @@ -0,0 +1,12 @@ +["Green, L 2021, Competition 6, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Competition 6
+
+51. I declared that he could meditate. I physically examined the competitor. I tested his blood pressure. I tested his eyesight. I tested his basal metabolic rate.
+52. I won. I included the weaker competitor. I designed puzzles and activities that the weak competitor could win. I rotated weaknesses in the group. I was the target.
+53. The competition was comparing with one's personal best. People moved up and down because of competition as the heart of society. Everything was developed. Jobs were always available and grew. I moved up.
+54. She became a doctor. The competitor travelled frequently.
She was going to be eugenically excluded. She ran away. She won elsewhere.
+55. The skyscraper was like an ice cream. The candidates passed competition science. There needed to be some impetus. I noticed the people in charge borrowing from nature. It stayed in the civilisation, with more and more developed imagery.
+56. I examined Law MOOCs (massive open online courses). I noticed the cash registers open up and swallow all of the competitors' money. When I had finished paying, I earned a job. I was still with-it. So I did it myself.
+57. There was always activity each day. The competitors were satisfied with work for them. They tried all business types. They tried all departments. It was simulated.
+58. In fact, his positive function was projection, like sales, which won. The competitor beat the other competitor. He wrote the draft at the start. He spoke to the tutor. He changed the draft and handed it in.
+59. Was the second outbreak of Coronavirus (in 2030) caused by time travel? (Perhaps not, because of the zinc or copper simulated in the time machines.) The competitor had a disease (help). He followed positive function. He stayed in the safest place in the system. He reached out to others he wanted to be like (they made representations to each other).
+60. The high distinction (100%) wasn't musical. The musician overcame difficulty (maintained his composure) in music to win. I noticed that the abbreviations for name, variable, type, etc. were also translated. Is genus like type?
Yes, it is a category of things, such as string, number or compound type."] \ No newline at end of file diff --git a/Lucian-Academy/Books/Short Arguments/Competition 7.txt b/Lucian-Academy/Books/Short Arguments/Competition 7.txt new file mode 100644 index 0000000000000000000000000000000000000000..ce8dc681b86fcf98c8b615a19dac4a5ef927cabe --- /dev/null +++ b/Lucian-Academy/Books/Short Arguments/Competition 7.txt @@ -0,0 +1,12 @@ +["Green, L 2021, Competition 7, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Competition 7 + +61. He wrote for the audience. The artist overcame difficulty (maintained his composure) in art to win. He collected comments and wrote an algorithm for each stroke. He did this for each connection of strokes. He translated the comments and code into another language. +62. The products were better checked. Competition is necessary for goodness. The competitors increased the quality of their products. They programmed their algorithms. They wrote many. +63. Both the conditions for ending and dismantling of the competition structures were planned. The competition ended. The competitors were selected. They competed. The competition was declared finished. +64. The people agreed. The competition was right. The furthest throw won. There was no assistance. The adjudicator was right. +65. She completed the goal. The competitor saw the object. She saw the goal. She saw the obstacle. She went around the obstacle and to the goal. +66. They were tested against a standard. The competitors were fit to compete. They were physically fit. They were mentally fit. They knew the rules. +67. They were too old and slow. I maintained the high standard. The coach graphed performance against time. At the peak of his career, the competitor had a personal best. He maintained this standard with meditation and training. +68. There was an excursion in time. I found all the possibilities in the competition. I started at the origin. 
I systematically traversed the tree of possibilities. I noticed computer science take over history. +69. People examined his tactics. The specimen of competitions was perfect. The competitor won. He was remarkable. He was the specimen. +70. The species with the weaker heart became extinct. The heart is in competition. The universe was naturally intelligent. The heart evolved. It was naturally selected."] \ No newline at end of file diff --git a/Lucian-Academy/Books/Short Arguments/Competition 8.txt b/Lucian-Academy/Books/Short Arguments/Competition 8.txt new file mode 100644 index 0000000000000000000000000000000000000000..5366d93f4ddeaa342c5f87e2380f78780cc1ccdd --- /dev/null +++ b/Lucian-Academy/Books/Short Arguments/Competition 8.txt @@ -0,0 +1,13 @@ +["Green, L 2021, Competition 8, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Competition 8 + +71. He won in other languages. The meditator (philosopher) won the competition with 8 balls. He vaguely mind-read himself 8 times. He used a computer to select the most accurate reading. For example, the best reading on the topic of the maze was 'promenade', 'valley'. +72. I won by breasoning out meditation. I won with spirit by making sense in the competition. There were spiritual things even after atom translation. There were breasonings. There was meditation. +73. I graphed the performances to work out the most likely winner. I knew about competitionism (I endorsed the competitor). I read the past performances of the competitors. I found the best competitor. I endorsed the competitor. +74. The draft was the subject of the analysis. I examined the computational competition. I competed (performed). I entered the way I performed on computer. I entered the way I competed on computer, for example I wrote and redrafted serious documents. +75. I labelled items to reuse and reduce the number of databases. Competition is intelligent. I modified the decision tree to recognise reused sequences. 
I modified the translator of documents with no translate sections to remember translations. I redrafted the installation instructions to be clearer.
+76. I took away fond memories. In this kind of competition, one person is left in and the other person is left out. They rotated roles. They were still good. Everyone won.
+77. The answer seemed too obvious. The competition prevented mental illness (maintained sanity). When the standards were met, the way included positive function. It was a competition of one. Many people did it.
+78. Society went along with progress and science. Everyone raised their fist at the start of the competition. The socialists argued for equality. Equality, not necessarily equity. There were no compromises.
+79. The model was independent of attachment to objects, and was based on safety. The competition was in object formula finder. The time travel circuit was found. It was found by accident. More was found around it when it was found.
+80. It was to breason it out, and more. I recommended competencies in competitors to be conceived. The competency practicum was written before the competitor was conceived. It was breasoned out. The competitor was conceived.
+"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Short Arguments/Competition 9.txt b/Lucian-Academy/Books/Short Arguments/Competition 9.txt new file mode 100644 index 0000000000000000000000000000000000000000..20369f13888ecde90492f37a1b1f555cbfe3c854 --- /dev/null +++ b/Lucian-Academy/Books/Short Arguments/Competition 9.txt @@ -0,0 +1,8 @@ +["Green, L 2021, Competition 9, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Competition 9
+
+1. I also gave my kidneys a break. I didn't eat too much food. I budgeted for food. I fasted. I ate small meals.
+2. I ate enough for energy. I didn't eat too little food. I worked out the nutrients I needed. I changed them around. I tried new recipes.
+3. I enjoyed people. I abstained from sex. I thought through As for a child.
I had a child. I abstained from having sex. +4. I enjoyed enough equality. I didn't want too much equality. This was communism. There was less freedom. Sometimes there were dictators. +5. I ran free. I didn't want too much freedom. I drew the line. I enjoyed freedom, but didn't want bad laws. Society was ordered. +6. I competed by thinking of ontological nothingness to maintain happiness. I didn't want too much nothingness. This wasn't ontological nothingness, it was no content. It was poor, psychologically. It was poor, physically."] \ No newline at end of file diff --git a/Lucian-Academy/Books/Short Arguments/Competition.txt b/Lucian-Academy/Books/Short Arguments/Competition.txt new file mode 100644 index 0000000000000000000000000000000000000000..8bf260694233cb74c364df97934c4217dfff29d1 --- /dev/null +++ b/Lucian-Academy/Books/Short Arguments/Competition.txt @@ -0,0 +1,13 @@ +["Green, L 2021, Competition, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Competition + +1. I didn't know I was a competitor. There should be competition in each field. There was no popularity contest. Only voting where it was intelligent, not badly organised. There was competition of merit. +2. Each person was automatically entered into the competition. There was a competition. There was a person. She was automatically entered into the competition. +3. Bots could meditate and breason it out. I met professional requirements of 50 As per work. There were machines that could write. They could walk around like humans. They were linked to Pedagogy. +4. Meeting requirements was necessary for competition. I met personal requirements. I was law-abiding. I was healthy. I aimed for pedagogy. +5. There was a reward. There was a competition framework. There were goals. There were ways of achieving the goals. There were ways of assessing whether the competitors achieved the goals. +6. The reward was given for being mind read and finishing the maze. Everyone tried and everyone was rewarded. 
Mind reading achieved 100% coverage of trying. There were mazes of thinking of a breasoning rather than not, to advance to the next square. The vague mind reading helped achieve the goal, rather than necessarily competing. +7. I liked breasonings and equality - and economic freedom. I performed better by using the daily regimen to go to church (play the note). Confidence blocks and blocks from lack of practice were cleared. I maintained a high level of performance. I functioned (played the note) positively. +8. I made the improvements and fixed the bugs. There was competition word warfare. They were positive terms. There were 50 As. 50 Bs turned off anything else. Bs were improvements needed or bugs. +9. Humanities business was in higher education, etc. The other (didn't) prevent a headache (maintained comfort) using the daily regimen making them (un)fit to compete. It also prevented muscle aches and pains. I mind read that the person was in pain, prevented it continuing, and mind read that they were relieved. It meant they could compete. +10. There was competition in writing a better daily regimen. The daily regimen was on television, leading to more competition. It was researched. It was part of meditation books. It was an area of study. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Short Arguments/Computational English Calculator.txt b/Lucian-Academy/Books/Short Arguments/Computational English Calculator.txt new file mode 100644 index 0000000000000000000000000000000000000000..6da5aa7ee5659ee48f1c3eedb445d142bc35ff8d --- /dev/null +++ b/Lucian-Academy/Books/Short Arguments/Computational English Calculator.txt @@ -0,0 +1,15 @@ +["Green, L 2021, Computational English Calculator, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Computational English Calculator + +51. I ate the apple. I held the calculator and typed, 'Do or die'. I found the correct function. I survived. I wrote down the answer. +52. I minimised the task to give to the people. 
I knew you. You knew me. We knew each other. We knew everyone in sight.
+53. I removed the string from the strings afterwards. I saw the fine point. It was socialism. I thought of the reasons. I added the string to the strings to prevent the 'gone past' error when shortening the task (when shortening the mind reading tree).
+54. I identified patterns. I saw out time. I identified the capitalist. I saw the objects. I read the edition.
+55. The object caused the disease. I saw the aetiological shape under the character's skin. The person was calm. I removed the object. She thanked me.
+56. The cosmologue led the centre. The cosmologue went to the religious person for protection. The cosmologue went to the religious institution for protection. The religion provided guidance.
+57. I enjoyed the scenes along the way. I felt comfortable writing my books. I examined the self. I told the other. I listened to her comment.
+58. I minimised the meal. I felt comfortable writing 5 As. I found the reason people were interested in a topic. I wrote about it. I found and gave them food for the betweennesses.
+59. Working mind reading led to recommending time travel. I saw the head and it was round. The head was a sphere. I drew marks on it depicting a face. I modelled mind reading.
+60. It was the simple grammar. I pressed '='. I said the sentences. I pressed equals. I read the grammar.
+61. I explained implication with and, with not. I pressed '^'. I found the result, the antecedent. I pressed '^'. I entered the consequent, the negation of the antecedent in a new clause and the second consequent.
+62. I found the recursive form at the longest stretch to correct the bug. It was true that the character became more specific as it went up. The algorithm found the relevant part. It was already high quality. I understood the input, output and algorithm.
+"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Short Arguments/Direction.txt b/Lucian-Academy/Books/Short Arguments/Direction.txt new file mode 100644 index 0000000000000000000000000000000000000000..d9c245358f621490427c0cc0a6e9eeab3c654faf --- /dev/null +++ b/Lucian-Academy/Books/Short Arguments/Direction.txt @@ -0,0 +1,13 @@ +["Green, L 2021, Direction, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Direction
+
+1. I prepared to check before changing lanes. I did this by stating that the object tailed me. First, I located the object. Second, I connected the object behind me. Third, I viewed behind me.
+2. I prepared to avoid risks. I did this by stating that the revolutionary found her direction. First, the revolutionary studied science. Second, she researched the cutting edge. Third, she related multiple universes to possibilities, not realism.
+3. I prepared to explore the solar system. I did this by stating that my direction was reducing the phonetic alphabet and Mars destination to simple tours. First, I looked at the photographs. Second, I looked at Mars. Third, I simulated Mars in the room.
+4. I prepared to research the local culture by speaking with the target demographic and teachers before translating. I did this by fitting the way with culture. First, I tested that the smaller vocabulary didn't lose meaning. Second, I tested that the grammar used was similar and not too predictable. Third, I tested that the meaning was correct in context.
+5. I prepared to go beyond back-translation by substituting transliterations. I did this by leading the way in life with culture. First, I used appropriate words, not words not used in a particular dialect. Second, I verified that the words didn't have the wrong meaning. Third, I grouped back-translations by languages which used them for translations.
I did this by stating that the most important pedagogue chose the Maharishi direction in meditation. First, he wrote about natural law, or encouraging writing on pedagogy. Second, I stated that the most important lecturer chose the Maharishi direction in meditation (he wrote about computational education). Third, I stated that the most important artist chose the Maharishi direction in meditation (he studied what pleased the eye). +7. I prepared to group sections of the directions. I did this by laying the directions end to end in the Perfect Palace. First, I lined up the same starts and ends of directions. Second, I numbered them. Third, I recorded them in a separate document. +8. I prepared for the students to gain as skills. I did this by writing on students' directions as Maharishi. First, I listed what they could do. Second, I found what they should do. Third, I found the intersection. +9. I prepared to add humour when appropriate. I did this by choosing the direction of humour. First, I found the word. Second, I found the first meaning. Third, I found how the second meaning was incorrect. +10. I prepared to find the known limits and necessary solutions. I did this by stating that the system directed everyone to become a Monarch. First, I found the Monarch computer game. Second, I designed royal computer algorithms. Third, I found the actual things (asking for help) helped the judgement. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Short Arguments/Doctor_Sutra.txt b/Lucian-Academy/Books/Short Arguments/Doctor_Sutra.txt new file mode 100644 index 0000000000000000000000000000000000000000..5192545dd684c96f88730bc5a89a455134af211c --- /dev/null +++ b/Lucian-Academy/Books/Short Arguments/Doctor_Sutra.txt @@ -0,0 +1,13 @@ +["Green, L 2021, Doctor_Sutra, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Doctor Sutra + +1. I prepared to maintain my health. I did this by stating that I have a body. First, I walked to the shop. Second, I breathed. 
Third, I ate the slice of pineapple. +2. I prepared to understand how my body worked. I did this by stating that my body has body systems. First, I stated that my body had a circulatory system. Second, I stated that my body had a respiratory system. Third, I stated that my body had a nervous system. +3. I prepared to simulate my body. I did this by stating that my body systems have organs. First, I stated that I had a heart. Second, I stated that I had lungs. Third, I stated that I had a brain. +4. I prepared to predict innate computational ability, e.g. whether a child will become a philosopher. I did this by stating that I have proteins. First, I stated that I had the kinin. Second, I stated that it carried the item. Third, I stated that it had perfect function. +5. I prepared to realise future consequences. I did this by stating that I have genes. First, I inspected the genes. Second, I found what to still do when desired traits were found. Alternatively, I recommended meditation for others. +6. I prepared to teach. I did this by stating that I have good societological-like comfort. First, I found the maximum. Second, I headed for it. Third, I wrote. +7. I prepared to make an interesting comment about the computational interpreter. I did this by stating that I have good popological digestion. First, I digested the person's ideas (let them find out about the computational interpreter). Second, I narrowed down thoughts and helped them to them. Third, I thought things through and helped them to them. +8. I prepared to think of well-founded importance. I did this by thinking of one aphorism per day. First, I thought of India. Second, I thought of literature. Third, I thought about the child. +9. I prepared to find a similar-minded person. I did this by breasoning out what I wanted my genes to do. First, I found the square. Second, I found the line. Third, I found the second square. +10. I prepared to think quickly. I did this by repeating the Doctor Sutra to maintain health.
First, I planned everything well. Second, I refined myself. Third, I surprised people. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Short Arguments/Equation 2.txt b/Lucian-Academy/Books/Short Arguments/Equation 2.txt new file mode 100644 index 0000000000000000000000000000000000000000..59280ab2b759b92c3f7bd442a1c467172ebb4e8d --- /dev/null +++ b/Lucian-Academy/Books/Short Arguments/Equation 2.txt @@ -0,0 +1,8 @@ +["Green, L 2021, Equation 2, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Equation 2 + +1. I sketched the line. The vertical line had a very great gradient. Low gradients are between 0 and 1. High gradients are between 1 and infinity. The line had a gradient of 10^3. +2. I wrote the formula to find the gradient. I found the gradient. I found the rise. I found the run. I found the gradient by dividing the rise by the run. +3. I graphed the line. I found the y-intercept. I substituted the gradient m into the formula y=mx+c. I substituted one of the points' x and y values into the formula c=y-mx. I calculated c, the y-intercept. +4. A line with equation y=m(x^n)+c could go from left to right, not top to bottom or bottom to top (there may be multiple x values for a y value, but not vice versa). I found the y-intercept person. I explained that the formula c=y-mx was arrived at by subtracting mx from both sides of the formula y=mx+c, to arrive at a formula with c on the left hand side. I explained that x in the y=mx+c formula could be x^2 or x^3, etc, resulting in a different line and graph. When c=0, the line went through the origin. +5. The graph had labelled axes, axis values, graph equation and points. I found the gradient object person. A hill with a low gradient gradually rises. A hill with a higher gradient is steep. This can be demonstrated with a graph.
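The rise-over-run and c=y-mx steps in points 2 and 3 can be sketched as a short Python helper (an editorial illustration only; the function `line_through` and the sample points are mine, not part of the original file):

```python
def line_through(p1, p2):
    """Return (m, c) for the line y = m*x + c through two points.

    Gradient m = rise / run; y-intercept c = y - m*x,
    substituting either point's x and y values.
    """
    (x1, y1), (x2, y2) = p1, p2
    rise = y2 - y1   # difference of the y co-ordinates
    run = x2 - x1    # difference of the x co-ordinates
    m = rise / run   # gradient: rise divided by run
    c = y1 - m * x1  # c = y - m*x from y = m*x + c
    return m, c

# A near-vertical line with "very great gradient" 10^3 through the origin:
m, c = line_through((0, 0), (1, 1000))  # m = 1000.0, c = 0.0 (passes through 0,0)
```

When c comes out as 0, the line goes through the origin, matching point 4's observation.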
+"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Short Arguments/Equation.txt b/Lucian-Academy/Books/Short Arguments/Equation.txt new file mode 100644 index 0000000000000000000000000000000000000000..729cd3563b367f20047cc4e0da7f868146b104e6 --- /dev/null +++ b/Lucian-Academy/Books/Short Arguments/Equation.txt @@ -0,0 +1,8 @@ +["Green, L 2021, Equation, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Equation + +1. I measured the height component. I found the rise. I found the y co-ordinate of the first point. I found the y co-ordinate of the second point. I subtracted the y co-ordinate of the first point from the y co-ordinate of the second point. +2. I measured the width component. I found the run. I found the x co-ordinate of the first point. I found the x co-ordinate of the second point. I subtracted the x co-ordinate of the first point from the x co-ordinate of the second point. +3. I started t 0,0. I started at the origin. I found the x-axis (where y=0). I found the y-axis (where x=0). I found their intersection point, where x=0 and y=0. +4. I found the point x,y. I finished at the destination. I found the desired x co-ordinate, x1. I found the desired y co-ordinate, y1. I found their intersection point, where x=x1 and y=y1. +5. I visited grandma. I travelled to the destination. I started at the origin. I found the way. I finished at the destination. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Short Arguments/Fewer Mental Breakdowns (Schizophrenia).txt b/Lucian-Academy/Books/Short Arguments/Fewer Mental Breakdowns (Schizophrenia).txt new file mode 100644 index 0000000000000000000000000000000000000000..9159edce885a113a92160cf48940d71fd9677f23 --- /dev/null +++ b/Lucian-Academy/Books/Short Arguments/Fewer Mental Breakdowns (Schizophrenia).txt @@ -0,0 +1,13 @@ +["Green, L 2021, Fewer Mental Breakdowns (Schizophrenia), Lucian Academy Press, Melbourne.","Green, L 2021",1,"Fewer Mental Breakdowns (Schizophrenia) + +1. 
I prepared to reach threshold in Medicine education. I did this by stating that I knew that it was all right. First, I had clear vision. Second, I was happy. Third, I was happy from full brain potential. +2. I prepared to identify the object before its properties. I did this by stating that hyle (objects rather than properties) were stronger than qualia (properties rather than objects). First, I found the object. Second, I stayed focused on it. Third, I recorded it. +3. I prepared to draw the appearance (including the switch and pedagogy). I did this by stating that I didn't see (saw) appearances. First, I set up the small stage. Second, I placed the actor model in it. Third, I viewed the appearance of the stage. +4. I prepared to see beauty. I did this by switching the appearances off (on) in God's ontology. First, I found the object. Second, I turned it on. Third, I continued on. +5. I prepared to recommend the advanced utterance. I did this by stating that I protected the person by preventing (showing) appearances. First, I found the information to protect the person. Second, I showed it to them. Third, I verified that they had understood it. +6. I prepared to love you. I did this by stating that you loved me by writing about the appearance. First, I saw the appearance. Second, I described it to you. Third, you wrote about it. +7. I prepared to help the community. I did this by stating that I knew the person who agreed with you. First, I found the reliable person. Second, I agreed with him. Third, I stated I knew him. +8. I prepared to acknowledge the teacher. I did this by taking part in activity to remain active. First, I was active until I had finished. Second, I found the most intelligent thing to do. Third, I was well-cared for. +9. I prepared to eat pure foods. I did this by eating the food to remain healthy. First, I ate rice. Second, I ate miso. Third, I ate soy tofu. +10. I prepared to help others.
I did this by stating that the other found out about the self. First, I remained on the plan. Second, I saw the other. Third, I maintained the switchboard. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Short Arguments/Flip x, y.txt b/Lucian-Academy/Books/Short Arguments/Flip x, y.txt new file mode 100644 index 0000000000000000000000000000000000000000..99d3cf7ba44dbd4f6243c6f093a83cdd2e89cf3f --- /dev/null +++ b/Lucian-Academy/Books/Short Arguments/Flip x, y.txt @@ -0,0 +1,8 @@ +["Green, L 2021, Flip x, y, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Flip x, y + +1. I did something with them. I noticed why they're there. They were there. I noticed why they were there. I absorbed them. +2. I graphed the new equation. I tested that x and y had been swapped. I read x and y. I swapped them. Later, I read that they had been swapped. +3. I rotated the paper back. I rotated the paper around the y=x line. I saw the y=x line. I rotated the paper. It was a transparency. +4. I labelled its x-intercept. I saw the line along the y-axis. I saw the near-vertical line. It was completely joined together. I labelled it. +5. I found what you meant. I walked along the path. I found the path. I followed it. I arrived at the destination. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Short Arguments/Future.txt b/Lucian-Academy/Books/Short Arguments/Future.txt new file mode 100644 index 0000000000000000000000000000000000000000..b15c8af669f957f227f1bef2162efcf18af9d415 --- /dev/null +++ b/Lucian-Academy/Books/Short Arguments/Future.txt @@ -0,0 +1,13 @@ +["Green, L 2021, Future, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Future + +1. I prepared to enjoy grand health. I did this by setting the slider at the proportion of ideality for science. First, I set the slider at maximally ideal. Second, I ate. Third, I didn't hang upside down. +2. I prepared to support a larger population.
I did this by stating there was a limit to the world's population. First, I found the virtual jobs. Second, I found the virtual students. Third, I found virtual accommodation. +3. I prepared to state that there was a magical surplus energy in the universe. I did this by stating that the leader tested the metal frame for the house. First, I specified the dimensions of the steel. Second, I found the source. Third, I replicated it. +4. I prepared to invite the cow to a customised class for her. I did this by stating that there were vegan alternatives to cows. First, I synthesised the creatinine. Second, I synthesised the thymine. Third, I synthesised the animal fat. +5. I prepared to help the others catch up with an app for manual medical meditation (mind reading to help with thoughts). I did this by stating that all people were taught meditation. First, I helped the parents before conception. Second, I taught the child meditation. Third, I noticed medical benefits. +6. I prepared to encourage naturally intelligent assignments. I did this by stating that all students were taught pedagogy as a core subject. First, I credited the student's thoughts as an accredited assignment. Second, I encouraged him to detail it. Third, I encouraged creativity. +7. I prepared to expand the uses until a cycle was found. I did this by stating that all conceptions should be given 50 As. First, I asked if the student knew what I meant. Second, I spiritually time travelled and gave the 50 As. Third, I did this for all thoughts. +8. I prepared to state that the school was my partner and me. I did this by stating that everyone was sexually free. First, I found a partner. Second, I wrote about my relationship with him. Third, I made sure that it was safe. +9. I prepared to be equitable rather than equalitarian. I did this by stating that all things (e.g. job requirements) were equitable. First, I found the appropriate job requirements for the person.
Second, I stated that the institution was automated. Third, I stated that bots were the computer aided learning characters. +10. I prepared to write on pedagogy. I did this by stating that there was proper medicine for everyone. First, I gave pedagogy to the next child. Second, I took many years to think about it. Third, I stated that occupation with pedagogy was medicine. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Short Arguments/God Algorithm - Student.txt b/Lucian-Academy/Books/Short Arguments/God Algorithm - Student.txt new file mode 100644 index 0000000000000000000000000000000000000000..49cfa20532df389ac80cf00df8c978c7b8a30561 --- /dev/null +++ b/Lucian-Academy/Books/Short Arguments/God Algorithm - Student.txt @@ -0,0 +1,13 @@ +["Green, L 2021, God Algorithm - Student, Lucian Academy Press, Melbourne.","Green, L 2021",1,"God Algorithm - Student + +1. I prepared to state that it was because I was the student. I did this by stating that the student wrote the name of the object that the teacher stated. First, I attended class. Second, I listened to the teacher say the name of the object. Third, I wrote it down. +2. I prepared to see the play from two positions. I did this by examining the amphitheatre. First, I examined the arc. Second, I walked towards the seats. Third, I walked up the stairs. +3. I prepared to use the object for a purpose. I did this by stating that I knew why the object was that particular object. First, I recognised the object. Second, I remembered that the object in mind had that particular purpose. Third, I verified that the object I saw had the same purpose. +4. I prepared to check the algorithm for errors. I did this by stating that I knew where the object was. First, I saw the object on the screen. Second, I moved it. Third, I made a computer algorithm from the behaviour. +5. I prepared to observe the trend. I did this by stating that the two subjects found an object each. 
First, I observed the first subject find an object. Second, I observed the second subject look at the object. Third, I observed the second subject find the same type of object. +6. I prepared to find more intelligent connections (between consecutive items, or a maximum of five apart). I did this by telling the other the name of the object. First, I saw and recognised the object. Second, I repeated its name silently in my head. Third, I said its name to the other. +7. I prepared to favour connections rather than word processors, etc. I did this by stating that I knew when the object was made. First, I examined the object. Second, I found the date of manufacture engraved on it. Third, I also read the place of manufacture. +8. I prepared to compare the objects. I did this by writing down the object's historical dimensions. First, I wrote down the object's dimensions. Second, I opened the history book. Third, I wrote down a similar object's dimensions. +9. I prepared to do what the other did. I did this by stating that the other used the object in the present. First, I chose the object. Second, I used it. Third, I documented it. +10. I prepared to do more work. I did this by stating that the other was in the present because she was holding the object. First, I found the marker. Second, I held it up. Third, I performed more work. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Short Arguments/God Algorithm.txt b/Lucian-Academy/Books/Short Arguments/God Algorithm.txt new file mode 100644 index 0000000000000000000000000000000000000000..7fe4dc985b6506a391a08bfc37a06cf8a37ba9a9 --- /dev/null +++ b/Lucian-Academy/Books/Short Arguments/God Algorithm.txt @@ -0,0 +1,13 @@ +["Green, L 2021, God Algorithm, Lucian Academy Press, Melbourne.","Green, L 2021",1,"God Algorithm + +1. I prepared to use the correct object. I did this by applying the verb to the object. First, I found the object. Second, I listed the possible verbs. 
Third, I found the verb to apply to the object. +2. I prepared to differentiate between word and object. I did this by stating that I knew you about the result of the God algorithm. First, I found the result of the God algorithm, the object. Second, I noticed you needed the object. Third, I found the physical object. +3. I prepared to write it up. I did this by stating that I knew you, the result of the God algorithm. First, I found your name. Second, I sent you a letter. Third, I received your reply. +4. I prepared to find out more. I did this by stating that I knew you a little about the result of the God algorithm. First, I noticed the object. Second, I noticed someone holding it. Third, I noticed you. +5. I prepared to ask why why was. I did this by stating that I knew the result of the God algorithm was why. First, I noticed the occurrence of the event. Second, I read the result of the God algorithm. Third, I worked out that the result explained the occurrence. +6. I prepared to predict the result in the category. I did this by stating that I knew why the result of the God algorithm was that particular result. First, I noticed the category. Second, I noticed the result in the category. Third, I noticed that it was that particular result. +7. I prepared to iterate wildly. I did this by verifying the self against the result of the God algorithm. First, I found you. Second, I interrogated what you said. Third, I got it right. +8. I prepared to go off happily ever after. I did this by stating who I was in relation to the result of God Algorithm. First, I found you with the object. Second, I found myself with the object. Third, I found us together, and happily ever after. +9. I prepared to say what I would write. I did this by writing the name of the object found out from God displayed on the spiritual screen. First, I discovered God. Second, I knew how it harmonised with things. Third, I stated that it sounded better than a look. +10. I prepared to decide well. 
I did this by stating that I knew it wasn't the object. First, I stated that it was the object. Second, I stated that I knew it was the object. Third, I stated that it wasn't not the object. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Short Arguments/Green_Sutra.txt b/Lucian-Academy/Books/Short Arguments/Green_Sutra.txt new file mode 100644 index 0000000000000000000000000000000000000000..7d0b7b197718636905e45aaa242d86e2f665ef9e --- /dev/null +++ b/Lucian-Academy/Books/Short Arguments/Green_Sutra.txt @@ -0,0 +1,13 @@ +["Green, L 2021, Green_Sutra, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Green Sutra + +1. I prepared to bug check code. I did this by mentally and physically invoking a life. First, I thought of intelligent ideas. Second, I used IVF. Third, I maintained the life with activity. +2. I prepared to think of the ramifications of my research. I did this by touching and acting on the synon-object. First, I found the topologically equivalent object. Second, I found the functions of the brain cells with machine learning. Third, I used knowledge about the brain cells to think clearly. +3. I prepared to lead the good life. I did this by loving God. First, I ploughed well. Second, I articulated well. Third, I wrote well. +4. I prepared to star in my work. I did this by loving the flower of life. First, I performed extremely well in activity. Second, I dispensed with tiredness or surrendering. Third, I concentrated on activity. +5. I prepared to verify planning and research. I did this by knowing that all was all right. First, I built the base of extreme performance. Second, I built the walls of extreme performance. Third, I built the ceiling of extreme performance. +6. I prepared to state that it is all right for each person to meditate if they want to. I did this by thinking of all. First, I remembered a category of people. Second, I wrote their names down. Third, I thought of them. +7. I prepared to help the people.
I did this by writing on all. First, I thought of the person. Second, I thought of where they were up to. Third, I wrote it and the date down. +8. I prepared to notice the Lucian Effect. I did this by teaching others. First, I taught the person. Second, they taught someone else. Third, I noticed the positive effects. +9. I prepared to give a high distinction for the Green sutra. I did this by giving a high distinction to others. First, the person submitted work. Second, I graded the work. Third, I gave a high distinction to the person. +10. I prepared to test the product. I did this by earning a high distinction. First, I wrote the sutra. Second, I breasoned it out. Third, I earned a high distinction by breasoning it out. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Short Arguments/Gridline 2.txt b/Lucian-Academy/Books/Short Arguments/Gridline 2.txt new file mode 100644 index 0000000000000000000000000000000000000000..4194fc4bac9b3362c0ada80d4b34ae44dabc4ae9 --- /dev/null +++ b/Lucian-Academy/Books/Short Arguments/Gridline 2.txt @@ -0,0 +1,7 @@ +["Green, L 2021, Gridline 2, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Gridline 2 + +1. I made different grids. I saw the number of rungs of the ladder using the filter. I lined the grid on the transparency up with the ladder. I lined 1 up with the first rung. I read the number of rungs. +2. I used only uppercase characters. I printed the writing. I printed the grid. I wrote the uppercase character in the square. I didn't write the lowercase character in the square. +3. I noted the height of the meniscus. I measured the meniscuses. I read the value for the bottom of the meniscus. I read the value for the top of the meniscus. I used the correct value. +4. I gave something back to the people. I knew why these people liked me. I asked the people why they liked me. I wrote it down. I asked them the reason for the reason they liked me. +5. I measured gravity using it. 
I knew that the item to do something to was in the container. I found the container. I took the item out of the container. I did something to it."] \ No newline at end of file diff --git a/Lucian-Academy/Books/Short Arguments/Gridline 3.txt b/Lucian-Academy/Books/Short Arguments/Gridline 3.txt new file mode 100644 index 0000000000000000000000000000000000000000..467254191007b34e4f3f21efd9b7196c64ae2001 --- /dev/null +++ b/Lucian-Academy/Books/Short Arguments/Gridline 3.txt @@ -0,0 +1,8 @@ +["Green, L 2021, Gridline 3, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Gridline 3 + +1. I drew pathways. I noticed the warehouses. I saw the horizontal grid. I saw the vertical grid. I saw the 3D grid. +2. I prepared to draw the continuous line. I swapped y and x and solved the equation for m>1. I determined that the gradient was steeper than 1. I swapped y and x. I solved the equation. +3. When flipped, the line still passed through the y-axis. I modified the graph and found the y-intercept. I found the y-intercept from the modified equation. I flipped the graph in the x-y line. I drew the line, without gaps. +4. I wrote the y-value on the right hand side. I love you where you found the y-value. Out of the y and x values, gradient and y-intercept, I could find any one from the others. I knew the x value, gradient and y-intercept. I found the y value. +5. I also found the x-value. I found a detail related to the y-value. I found the y-value. I found that it was positive. I marked it on the number line. 
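Gridline 3's step of swapping y and x and re-solving the equation (reflecting the graph in the y=x line, as in Flip x, y above) can be sketched in Python. This is an editorial illustration only; `flip_line` and the sample values are my own, not part of the original file:

```python
def flip_line(m, c):
    """Swap y and x in y = m*x + c and solve for y again.

    Swapping gives x = m*y + c, i.e. y = (x - c) / m: the graph is
    reflected in the y = x line, like rotating the transparency.
    A steep line (m > 1) becomes a shallow one with gradient 1/m,
    and the old y-intercept (0, c) becomes the new x-intercept (c, 0).
    """
    if m == 0:
        # a horizontal line flips to a vertical one, which is not y = f(x)
        raise ValueError("cannot flip a horizontal line")
    return 1 / m, -c / m  # (new gradient, new y-intercept)

# A line steeper than 1, y = 10x + 3, flips to y = x/10 - 3/10:
new_m, new_c = flip_line(10, 3)  # (0.1, -0.3)
```

The flipped line still passes through the y-axis (at -c/m), matching point 3's observation that the line crossed without gaps.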
+"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Short Arguments/Honey Pot Prayer.txt b/Lucian-Academy/Books/Short Arguments/Honey Pot Prayer.txt new file mode 100644 index 0000000000000000000000000000000000000000..cc734980a74427ccf2f00e02fdb4571b3428702f --- /dev/null +++ b/Lucian-Academy/Books/Short Arguments/Honey Pot Prayer.txt @@ -0,0 +1,13 @@ +["Green, L 2021, Honey Pot Prayer, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Honey Pot Prayer + +1. I prepared to look at things in the car. I did this by maintaining blood flow in the head and neck muscles to prevent a headache in the car using the honey hot prayer. First, I helped blood flow. Second, I prevented pressure. Third, I relaxed my muscles. +2. I prepared to automate the spiritual honey flow. I did this by stating that I was rewarded with the lower leaves like head and neck muscles when I thought of upper leaves like cheeks clearly . First, I thought of my cheeks. Second, I relaxed my head and neck muscles as a result. Third, I ate the spiritual honey from the leaf when I was completely relaxed. +3. I prepared to eat honey, like blood flowing to my head. I did this by stating that the blood was like honey in that it was like a liquid. First, I drew blood pumping from my heart to my organs. Second, I drew the honey in the pot. Third, I drew it flowing. +4. I prepared to do nothing because there was nothing to do. I did this by stating that the cell components function perfectly. First, I stirred the pot. Second, I turned off the cell components. Third, I went to it and got it. +5. I prepared to find a calm place. I did this by stating that the blood flow rate and stress was reduced with calmness. First, I listened to the loud sound, increasing my blood flow rate and stress. Second, I moved away from the loud sound. Third, my blood flow rate and stress decreased. +6. I prepared to decrease lactic acid with a quantum switch. 
I did this by stating that the calcium from overactive muscle was ignored, preventing pain. First, I used my muscle, increasing the calcium in them. Second, I waited for the lactic acid to dissipate. Third, I was in a state of comfort. +7. I prepared to state that the body's muscles relaxed. I did this by stating that the body takes over when the heart rate decreases. First, I finished exercising. Second, my body took over. Third, I relaxed. +8. I prepared to tell others about the calmness technology. I did this by stating that the calmness technology was a relaxed heart rate sound. First, I found the relaxed heart rate for the person. Second, I played the heart rate sound at that rate. Third, my heart adjusted to relax. +9. I prepared to say that the behaviour and positive thinking was the quantum box with 250 breasonings breasoned out by the text to breasonings algorithm. I did this by stating that the particles stayed in position for the correct function. First, I put it aside and planned to conduct text to breasonings research when my academy became accredited. Second, I found the instantly relaxed hand. Third, I found the relaxed skin pressure. +10. I prepared to emphasise that personal breasoning writing ensured the effectiveness of spiritual medicine. I did this by stating that medicine prevented a headache. First, I prepared for text to breasonings (see About Text to Breasonings below). Second, I studied Education and Medicine. Third, I followed the instructions in How to Prevent a Headache (see below). +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Short Arguments/Hours Prayer.txt b/Lucian-Academy/Books/Short Arguments/Hours Prayer.txt new file mode 100644 index 0000000000000000000000000000000000000000..021aa010bd3682a7d8d6fd762d3d3c7ac823a7c2 --- /dev/null +++ b/Lucian-Academy/Books/Short Arguments/Hours Prayer.txt @@ -0,0 +1,13 @@ +["Green, L 2021, Hours Prayer, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Hours Prayer + +1. 
I prepared to make a game from ontological nothingness. I did this by stating that the Hours Prayer enabled you to know me. First, you knew my thoughts at each hour. Second, I knew you. Third, you knew me. +2. I prepared to delight in it. I did this by knowing the Hours Prayer. First, I recited the Hours Prayer. Second, I found what you meant for me to say. Third, I said it. +3. I prepared to design inherentana (sic) and onnahope (sic). I did this by coming to the conclusion. First, I listened to the hourly impetus. Second, I thought of the connection to the conclusion. Third, I thought of the conclusion. +4. I prepared to connect the hours together. I did this by meditating on the hour. First, I stopped on the hour. Second, I compared with my plan. Third, I saw what I wanted to. +5. I prepared to design new hours. I did this by examining what the hour meant. First, I found the hour. Second, I found what it meant. Third, I examined it. +6. I prepared to read all texts on the hour. I did this by understanding the text. First, I read the text. Second, I found what it meant. Third, I read another text. +7. I prepared to design lots of preludes. I did this by producing the Hours Prayer using the electron producer. First, I moved the electron. Second, I moved it well. Third, I went well. +8. I prepared to work on the text. I did this by producing the output to the Hour Prayer algorithm. First, I wrote the algorithm to output what I wanted on the hour. Second, I asked for help. Third, I received help. +9. I prepared to have perfect expression. I did this by writing the hit of the hour. First, I sweetened the algorithm to write the best statement. Second, I knew it. Third, I knew you too. +10. I prepared to check the character byte against nature. I did this by writing the byte as the result of the thoughts from the Hours Prayer. First, I drew all designs. Second, I chose the best one. Third, I wanted it. 
+"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Short Arguments/How to Prevent a Headache.txt b/Lucian-Academy/Books/Short Arguments/How to Prevent a Headache.txt new file mode 100644 index 0000000000000000000000000000000000000000..545453f66b8362bb4380f9080d9299de43a2d730 --- /dev/null +++ b/Lucian-Academy/Books/Short Arguments/How to Prevent a Headache.txt @@ -0,0 +1,36 @@ +["Green, L 2021, How to Prevent a Headache, Lucian Academy Press, Melbourne.","Green, L 2021",1,"How to Prevent a Headache + +Introduction +What is this? +Use the Prevent Headaches App to help prevent stress (non-pathological) headaches. It combines meditation with a non-invasive spiritual medical technique and a daily regimen. Because stress headaches can start in the body, it is recommended to use our muscle aches and pain preventative technique. This app can help prevent headaches in post-Honours University students, on trains, in cars, and in the sun. Please stay out of the sun in high-UV hours and do yoga before long train journeys. +Meditation +You can naturally prevent a headache by silently repeating the mantra, 'lucian' for twenty, minutes, twice per day. The mantra becomes lighter and lighter, until one forgets the thought, clearing the mind. Also, silently repeat the sutra 'green' for twenty minutes, twice per day after two months of using the mantra in the morning and evening. Also, pray for no digestive system pops from practising the sutra each day. +Quantum Box and Nut and Bolt +Quantum Box +The quantum box is a guided meditation that I will describe later that prevents headaches, (relaxes head and neck muscles), and prevents muscle aches and pains. In it, you can make anything (e.g. a headache) instantly disappear, based on quantum physics (where at the quantum level, things may disappear if we want them to). +Nut and Bolt +The nut and bolt is a guided meditation that is a more detailed prevention of problems the Quantum Box prevents. 
In it, pretend to unscrew a nut from a long thin bolt each day to prevent a headache, etc. +Daily Regimen +Indicate 2 radio buttons (one for each of the following ideas, and one for each of these ideas working anyway, where a radio button is indicated by breasoning out, or thinking of the x, y and z dimensions of a counter, 0.01, 0.01, 0.005 metres). +Too (just say 'too' for each of the following ideas to indicate the two radio buttons) +meditation (80 lucian mantras and 80 green sutras) +Also for the following, which also have quantum box and nut and bolt: +No digestive system pops from practising the sutra + +Also for the following, which also have quantum box and nut and bolt: +Protector from Headache in Meditation after Honours Study, etc. +Also for the following, which also have quantum box and nut and bolt: +No headache + +The following turn off headaches etc. +Also for the following, which also have quantum box and nut and bolt: +Turn off workload from all employees, including those below you, +Also for the following, which also have quantum box and nut and bolt: +Detect and turn off workloads using manager algorithm. (The self does the As, turning off the workloads.) +Also for the following, which also have quantum box and nut and bolt: +Detect and turn off workloads using manager network. (The others help finish the As, turning off the workloads.) +Prevention Technique +If you feel a headache coming on, then think, 'I don't want nothing to happen, followed by a headache. This idea works at the time. Each of the other parts has been taken care of. Green, Doctor.' + +10. I prepared to emphasise that personal breasoning writing ensured the effectiveness of spiritual medicine. I did this by stating that medicine prevented a headache. First, I prepared for text to breasonings (see About Text to Breasonings below). Second, I studied Education and Medicine. Third, I followed the instructions in How to Prevent a Headache (see below).
+"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Short Arguments/Log 2.txt b/Lucian-Academy/Books/Short Arguments/Log 2.txt new file mode 100644 index 0000000000000000000000000000000000000000..fe1e5f73f35afae2bac54d8099fd0c44c94bacc2 --- /dev/null +++ b/Lucian-Academy/Books/Short Arguments/Log 2.txt @@ -0,0 +1,7 @@ +["Green, L 2021, Log 2, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Log 2 + +1. I checked the values. M should be >= n. M is 1000. N is 3. 1000>=3. +2. K>=1. M should be < Km. M is 10. Km=3*10=30. 10<30. +3. I swapped the values. B3= delta(N-B3)=2. +5. Log base e is different. B is log base 10 of A. A is 100. B is log base 10 of 100. B is 2."] \ No newline at end of file diff --git a/Lucian-Academy/Books/Short Arguments/Log.txt b/Lucian-Academy/Books/Short Arguments/Log.txt new file mode 100644 index 0000000000000000000000000000000000000000..8712f906181775219db604c6ea3088d98ac708b0 --- /dev/null +++ b/Lucian-Academy/Books/Short Arguments/Log.txt @@ -0,0 +1,8 @@ +["Green, L 2021, Log, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Log + +1. The number of zeros was the log. M had N 0's or log M 0s. For example, M=1000. I calculated Log(1000). Log(1000)=3. +2. I assumed n was a positive integer. M should be >= 10n. For example 10>=10*1 => 10>=10. Also, 100>=10*2=20. And 1000>=10*3=30. +3. It is far less. N should be =< 10m. For example 1=<10*10=100. 2=<10*100=1000. 3=<10*1000=10,000. +4. I led the good life. I recorded walking one distance's order. I walked to the switch. I walked to the computer. I walked the distance of the order. +5. I took the scenic route. I walked the distance, not displacement. Distance was the total distance covered. Displacement was the distance between the starting and ending points. I walked around the room, not to the destination. 
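The logarithm relationships in Log and Log 2 above (a power of ten M has log10(M) zeros; M >= 10N; N =< 10M) can be checked mechanically. Below is a minimal sketch; Python is used purely for illustration (the repository's own algorithms are written in Prolog), and the helper name zeros_as_log10 is invented here:

```python
import math

def zeros_as_log10(m):
    """For a positive power of ten m, log10(m) equals m's count of zeros."""
    return round(math.log10(m))

m = 1000
n = zeros_as_log10(m)
assert n == 3                  # Log(1000) = 3
assert str(m).count("0") == n  # "the number of zeros was the log"
assert m >= 10 * n             # point 2 of Log: 1000 >= 10*3 = 30
assert n <= 10 * m             # point 3 of Log: 3 =< 10*1000 = 10,000
```

The same checks pass for 10, 100 and the other powers of ten given as examples above.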
+"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Short Arguments/Lucian_Mantra_-_Pure.txt b/Lucian-Academy/Books/Short Arguments/Lucian_Mantra_-_Pure.txt new file mode 100644 index 0000000000000000000000000000000000000000..c577b318473c2e30697b0fe118d1a0808a5b7f50 --- /dev/null +++ b/Lucian-Academy/Books/Short Arguments/Lucian_Mantra_-_Pure.txt @@ -0,0 +1,13 @@ +["Green, L 2021, Lucian_Mantra_-_Pure, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Lucian Mantra - Pure + +1. I prepared to repeat the mantra, which, like the sutra, removed unwanted thoughts. I did this by thinking of consciousness. First, I devised education. Second, I enabled the student to write strong arguments. Third, I enabled them to earn high grades. +2. I prepared to create the head of state and influence myself to work things out in a particular way with the head of state. I did this by thinking of a reason for consciousness. First, I read the claim. Second, I thought of when I had read it before. Third, I adjusted or rewrote the reason for the claim. +3. I prepared to control my reality by developing a belief using a hypothetical case. I did this by adhering to the truth. First, I participated in a different activity. Second, I created impetus. Third, I revised for studies or an examination in a course. +4. I prepared to find politics. I did this by examining the hypothetical. First, I found the best possible conclusion. Second, I thought of 15 reasons. Third, I attributed 50s to each sentence. +5. I prepared to perform an action by deciding that it is the best one for, for example environmental and dietary reasons. I did this by examining the other. First, I thought of a dance movement. Second, I associated it with changing to studying for a test. Third, I thought of dancing to perform well. +6. I prepared to divide knowledge from creativity. I did this by examining the self. First, I produced the exposition from learning. Second, I produced the critique from learning. 
Third, I produced a critical critique and solved it from learning.
+7. I prepared to help the body relax, prevent stress and develop and function better with meditation. I did this by ensuring that the self thought before the other. First, I thought of the first meaning of the dance movement. Second, I thought of the second meaning of the dance movement. Third, I related these to performing a particular action.
+8. I prepared to mind read animals. I did this by visualising and performing the action. First, I mind read the connection to the bodily action. Second, I found the mentally recorded words. Third, I found the correlation with human received representations with learning.
+9. I prepared to verify my actions. I did this by assessing the other performing the action. First, I stated that the mantra was a vital tool for caring for the self. Second, I earned high distinctions. Third, I managed my responsibilities.
+10. I prepared to write about the mantra algorithm. I did this by helping write on the interesting points as Maharishi. First, I explained the programming command. Second, I wrote the implication of the command. Third, I explained what state of the algorithm the command was in, according to its commented status.
+"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Short Arguments/Medicine - Quantum Box of Circulatory System 1.txt b/Lucian-Academy/Books/Short Arguments/Medicine - Quantum Box of Circulatory System 1.txt new file mode 100644 index 0000000000000000000000000000000000000000..5fcd4b3f64fe80b1416ecd875f4aa34709eb3a9f --- /dev/null +++ b/Lucian-Academy/Books/Short Arguments/Medicine - Quantum Box of Circulatory System 1.txt @@ -0,0 +1,9 @@ +["Green, L 2021, Medicine - Quantum Box of Circulatory System 1, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Medicine - Quantum Box of Circulatory System 1
+
+1. She gained energy. The innocent circulated the small solid in the centre of her vessel. She ate the small solid.
It was passed into her bloodstream. Her heart pumped it around her body.
+2. She increased her longevity. The other maintained positive circulatory system function by using the quantum box. It reduced stress. She exercised. Regular cardiovascular activity improved her health.
+3. I drank enough liquid to keep my circulatory system going. Some of the blood was taken out. I drank water. I replaced my blood.
+4. I also drank more. I used cardiovascular activity to exercise my heart. I exercised. I used more energy. My heart met the increased demand for energy by pumping faster.
+5. I used cardiovascular activity to maintain circulatory system flow.
+
+"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Short Arguments/Medicine - Quantum Box of Circulatory System 2.txt b/Lucian-Academy/Books/Short Arguments/Medicine - Quantum Box of Circulatory System 2.txt new file mode 100644 index 0000000000000000000000000000000000000000..da5293dc161ae3d330486c63d2f381b1c9acf80f --- /dev/null +++ b/Lucian-Academy/Books/Short Arguments/Medicine - Quantum Box of Circulatory System 2.txt @@ -0,0 +1,8 @@ +["Green, L 2021, Medicine - Quantum Box of Circulatory System 2, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Medicine - Quantum Box of Circulatory System 2
+
+1. There were no blockages in my blood vessels. I maintained blood flow with daily exercise. I exercised. Blood flowed. I kept things moving.
+2. I relaxed as I recovered. I maintained normal blood flow to recover from exercise. I exercised. My blood flow was faster. It returned to normal.
+3. Meditation also helped me feel more alert in activity. I maintained normal blood flow when using my circulatory system for normal tasks. I engaged in cardiovascular activity. I felt refreshed, alert and energised. I performed normal tasks.
+4. I felt clear. I knew it was there, and moved it on. I meditated, giving me voluntary control over involuntary processes. I detected a piece of food in my blood vessel.
It wasn't there later. +5. I went for walks. I detected that I had positive circulatory system function. Everything kept on moving. I took breaks from sprints. I rested. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Short Arguments/Meditation Protector Currant Bun.txt b/Lucian-Academy/Books/Short Arguments/Meditation Protector Currant Bun.txt new file mode 100644 index 0000000000000000000000000000000000000000..030f47e2f4107c7d34b93868e4d43d9ffe6b4462 --- /dev/null +++ b/Lucian-Academy/Books/Short Arguments/Meditation Protector Currant Bun.txt @@ -0,0 +1,13 @@ +["Green, L 2021, Meditation Protector Currant Bun, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Meditation Protector Currant Bun + +1. I prepared to teach the meditation protector currant bun sutra. I did this by helping the other to the Meditation Protector Currant Bun. First, I identified the Honours student. Second, I identified that he didn't use the meditation protector currant bun sutra and had a headache. Third, I recommended the meditation protector currant bun sutra. +2. I prepared to move smoothly. I did this by traversing the head and neck muscles' hierarchy to prevent headache from post-Honours meditation using the meditation protector currant bun. First, I found the largest neck muscle. Second, I traversed the increasingly smaller muscles. Third, I relaxed them to prevent a headache. +3. I prepared to focus on science. I did this by stating that the astronaut used the meditation protector currant bun. First, I researched non-invasive means of preventing headaches. Second, I found the method. Third, I performed the prerequisites and breasoned out 250 breasonings from the meditation protector currant bun sutra. +4. I prepared to simplify the solution. I did this by preventing a headache from PhD, Honours and Masters by using the rhythm and currant metaphors of the meditation protector currant bun. First, I said rhythm was preventing, not causing a headache. 
Second, I stated that the currant gave a positive, not negative sensation. Third, I experienced the currant with a particular property. +5. I prepared to pursue natural law. I did this by stating that the meditation protector currant bun was part of natural law. First, I found the comfortable place. Second, I found the law. Third, I was protected under natural law. +6. I prepared to feel light. I did this by stating that the meditation protector currant bun enabled me to merge with the positive solution. First, I was positive. Second, I stated that the room was for positive people. Third, I walked into the room. +7. I prepared to substitute back into English. I did this by stating that English is based on Philosophy. First, I found the philosophy. Second, I found the English narrative based on it. Third, I reverse engineered the philosophy. +8. I prepared to calm the senses. I did this by stating that the meditation protector currant bun is a sight and sound for sore eyes and ears respectively. First, I turned off the light. Second, I stopped listening to the sound. Third, I went to sleep. +9. I prepared to enjoy the comfort of food. I did this by stating that literature helped prevent the headache. First, I examined the literature of the rhythm. Second, I examined the literature of the currant. Third, I ate the currant in time. +10. I prepared to enjoy work. I did this by stating that the worker stayed at work because of having a comfortable head. First, I treated work as a second home. Second, I enjoyed it. Third, I stated that I moved on. 
+"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Short Arguments/Moving Appearances - Student.txt b/Lucian-Academy/Books/Short Arguments/Moving Appearances - Student.txt new file mode 100644 index 0000000000000000000000000000000000000000..92ee090e6cd9bffdd455efbd2947055e9b3930de --- /dev/null +++ b/Lucian-Academy/Books/Short Arguments/Moving Appearances - Student.txt @@ -0,0 +1,13 @@ +["Green, L 2021, Moving Appearances - Student, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Moving Appearances - Student
+
+1. I prepared to capture it (the object). I did this by using the noumena of the moving appearances. First, I found the noumena. Second, I found their moving appearances. Third, I used the noumena.
+2. I prepared to observe the other. I did this by observing the other scheduling the moving appearances. First, I found the moving appearances. Second, I scheduled them. Third, I told the other to schedule them.
+3. I prepared to be un-bored. I did this by knowing when the moving appearance was scheduled. First, I found you. Second, I found what you meant. Third, I designed it.
+4. I prepared to note everything. I did this by knowing what time I wanted the moving appearance to appear. First, I stated that the moving appearance was memories of the other. Second, I loved them. Third, I kept the other.
+5. I prepared to imitate work at work. I did this by noticing the moving appearances of God. First, I found the job. Second, I helped him up. Third, I defeated death.
+6. I prepared to decide on more things. I did this by stating that the clock struck one, meaning we need competency in meditation. First, I found competency. Second, I found competency in meditation. Third, I decided it, like running up the clock.
+7. I prepared to state that the other universe was really part of our universe. I did this by discovering the true essence in time. First, I found time. Second, I found its true essence.
Third, I found that the other universe didn't exist in that case. +8. I prepared to see what was needed. I did this by meditating to make the moving appearances appear. First, I frolicked around. Second, I jumped up and down. Third, I harmonised with myself. +9. I prepared to relate to the others. I did this by finding the infinitesimally interesting language of the moving appearances. First, I found what I wanted. Second, I spoke with it. Third, I designed more things. +10. I prepared to move creatively. I did this by stating that I didn't like Febrvarius' (sic) quotes about moving appearances. First, I found you. Second, I like what you said around the time about moving appearances. Third, I completed it. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Short Arguments/Moving Appearances.txt b/Lucian-Academy/Books/Short Arguments/Moving Appearances.txt new file mode 100644 index 0000000000000000000000000000000000000000..00b73c160344b807841b9350bba7ae238649648a --- /dev/null +++ b/Lucian-Academy/Books/Short Arguments/Moving Appearances.txt @@ -0,0 +1,13 @@ +["Green, L 2021, Moving Appearances, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Moving Appearances + +1. I prepared to calculate the object's speed. I did this by stating that I knew what the moving appearance was. First, I saw the gold balloon. Second, I saw it move. Third, I put it aside. +2. I prepared to state that what you did was how it worked. I did this by stating that I knew how the moving appearance worked. First, I knew you. Second, I knew what you did when you moved. Third, I exhaled it. +3. I prepared to copy and move. I did this by stating that the unnatural mimicked the golden ratio in the air. First, I found the golden rectangle. Second, I found the air. Third, I artificially projected the golden rectangle in the air. +4. I prepared to examine my movement. I did this by recording my movement. First, I recorded my starting position. Second, I moved. 
Third, I recorded my ending position. +5. I prepared to make things up. I did this by stating that I knew the fact about the man. First, I found his ruler. Second, I found its height. Third, I examined it. +6. I prepared to plan the mathematical appearance. I did this by stating that I knew how the reason for the moving appearance worked. First, I found what the people meant. Second, I wanted it. Third, I carried it out well. +7. I prepared to take good actions. I did this by seeing what the moving appearance was. First, I looked inside it. Second, I recorded its reason. Third, I examined what it meant because of it. +8. I prepared to panic on the streets of Lebanon. I did this by performing the action after viewing the moving appearance. First, I dissected the specimen. Second, I found what the moral science was. Third, I left after moving it. +9. I prepared to deanimate the clock in my mind. I did this by telling the time from the moving appearance. First, I found the clock's hour hand. Second, I found the number that it pointed to. Third, I found it tentaclising (sic) the time. +10. I prepared to do the past before the future. I did this by working out when the moving appearance was. First, I worked out when something should occur. Second, I saw the moving appearance. Third, I noticed Totem Colero's contribution. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Short Arguments/N is Epsilon.txt b/Lucian-Academy/Books/Short Arguments/N is Epsilon.txt new file mode 100644 index 0000000000000000000000000000000000000000..8619d7ce9802a7b9ad4640c8ea693bc1448741c5 --- /dev/null +++ b/Lucian-Academy/Books/Short Arguments/N is Epsilon.txt @@ -0,0 +1,12 @@ +["Green, L 2021, N is Epsilon, Lucian Academy Press, Melbourne.","Green, L 2021",1,"N is Epsilon + +1. I was clear. I told you apart from me in the mirror. I decided myself. I told you. You said that you agreed. +2. Epsilon was usually 1*10^-6. I endorsed epsilon. 
I found that epsilon was used to avoid a division by zero error. I added epsilon to zero before dividing by it. The number was very large, which is what I wanted.
+3. I checked whether epsilon was small enough. I endorsed a small epsilon. I found that a relatively small epsilon value gave better results. For example, with a larger epsilon=0.1, 1/0.1=10, not a large enough gradient. With a smaller epsilon=0.0001, 1/0.0001=10,000, a large enough gradient.
+4. I reminded the audience that it shouldn't be rounded to erasure. I endorsed not too small an epsilon. If it was rounded down to zero then it would be too small. If the computer couldn't handle the magnitude of 1/epsilon then it would be too small. It should be a constant in the program, and easily recognisable.
+5. I wrote the answer. I didn't include the value. I found the value. It was too small. I used another value.
+6. Epsilon was used to find the large gradient. I didn't include the number in the triangle. I wrote the possible co-ordinate of the point of the triangle. It was wrong. I didn't include it.
+7. I chose ceiling or floor rounding. I didn't include the number in the range. Without epsilon, the number was stopped by an error. A1=3, B1=0, A1/B1=3/0=error. A2=3, B2=0.0001, A2/B2=3/0.0001=30,000.
+8. I noticed vague mind reading was a use. I excluded the number, which was very close to the numbers. The number was an outlier. I excluded it using an interquartile range. I retested whether there were other outliers.
+9. What is a mantissa? There were 5 possible ranges of order of magnitude from 1-10^5. I.e. 1-9, 10-99, 100-999, 1000-9999 and 10,000-99,999. I used logs. I used exponents.
+10. I found the average. The 501-5000 algorithm works. I tested that the value was within range. I found the fraction of the number out of the magnitude of the range.
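The epsilon guard described in the points above can be sketched in a few lines. A minimal illustration, assuming nothing beyond the text (Python is used for illustration only, and the helper name safe_divide is invented here, not taken from the books' code):

```python
EPSILON = 1e-6  # "usually 1*10^-6"; kept as an easily recognisable named constant

def safe_divide(a, b, eps=EPSILON):
    """Divide a by b, adding eps to a zero denominator to avoid the error."""
    return a / (b + eps) if b == 0 else a / b

# 3/0 would raise an error; with epsilon added it is a very large,
# finite number (about 3,000,000), which is what is wanted here.
result = safe_divide(3, 0)

# Epsilon must be small enough to give a steep gradient (1/0.0001 = 10,000
# rather than 1/0.1 = 10), but not so small that it rounds down to zero.
```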
I rounded it to a whole percentage."] \ No newline at end of file diff --git a/Lucian-Academy/Books/Short Arguments/No Radiation 1 of 25.txt b/Lucian-Academy/Books/Short Arguments/No Radiation 1 of 25.txt new file mode 100644 index 0000000000000000000000000000000000000000..e4ddefd3c94bf39e731a6b106e7bf7696cb5fcb8 --- /dev/null +++ b/Lucian-Academy/Books/Short Arguments/No Radiation 1 of 25.txt @@ -0,0 +1,13 @@ +["Green, L 2021, No Radiation 1 of 25, Lucian Academy Press, Melbourne.","Green, L 2021",1,"No Radiation 1 of 25 + +I used a sensor to detect heat, light and safety at the destination of time travel. +1. I prepared to time travel. I did this by closing the sola-cele (sic). First, I found the sun. Second, I found the hole in the reasoning. Third, I closed the hole. +2. I prepared to work. I did this by noticing the social supercomputer of 27XX. First, I found the way to go there. Second, I found the people. Third, I found their supercomputer. +3. I prepared to be in terms of science. I did this by researching science. First, I found each way that I was in science. Second, I was that way. Third, I stayed that way. +4. I prepared to stay in place, and so feel that it is safe. I did this by stating that my life was protected from transcending time. First, I asked, 'Isn't transcending a positive term, so isn't transcending time a positive term?'. Second, I received the reply, 'Transcending time is a negative term'. Third, I received the further statement, 'Don't do it'. +5. I prepared to experience that the person who was going to kill their earlier self wouldn't arrive through the Closed Time-like Curve (in our universe) when time travelling to do it, so it wasn't experienced in our universe. I did this by stating that history was perfect. First, I liked it. Second, I wanted more. Third, I wanted you. +6. I prepared to time travel where I liked meditation. I did this by stating that the Meditation teacher time travelled and taught meditation. 
First, I used a proper time machine that didn't accept radiation. Second, I said it was my iPad with the equivalent of a lead box around me. Third, I meditated before and afterwards.
+7. I prepared to . I did this by visiting the airport and explaining that I was a time traveller. First, I visited the airport. Second, I explained that I was a time traveller. Third, I asked, 'Can't the police help with the perfect prediction artificial intelligence?'.
+8. I prepared to move more. I did this by stating that there were well-known time traveller time points on Earth. First, I walked to the point. Second, I travelled. Third, I loved it.
+9. I prepared to be myself. I did this by visiting harmless people. First, I found you. Second, I did what I wanted. Third, I stated that you were alive.
+"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Short Arguments/No Radiation 2 of 25.txt b/Lucian-Academy/Books/Short Arguments/No Radiation 2 of 25.txt new file mode 100644 index 0000000000000000000000000000000000000000..c7185c172b827caf687ed89605689e05bcfa80a1 --- /dev/null +++ b/Lucian-Academy/Books/Short Arguments/No Radiation 2 of 25.txt @@ -0,0 +1,13 @@ +["Green, L 2021, No Radiation 2 of 25, Lucian Academy Press, Melbourne.","Green, L 2021",1,"No Radiation 2 of 25
+
+1. I prepared to repeat the finding. I did this by praying to experience harmlessness. First, I found you. Second, I stated that you were harmless. Third, I moved towards the next point.
+2. I prepared to write well. I did this by studying physics. First, I found you. Second, I found what you wanted. Third, I wanted more.
+3. I prepared to copy the hairstyles as well. I did this by wearing the clothes of the time. First, I looked in the book. Second, I copied the clothes. Third, I wore those.
+4. I prepared to find well again. I did this by stating that I was a time tourist. First, I found you. Second, I went well. Third, I knew it well.
+5. I prepared to automate work and business in time.
I did this by stating that I was well. First, I was well. Second, I went on. Third, I went well again. +6. I prepared to be the best. I did this by stating that I was developed. First, I found you. Second, I found you were good. Third, I verified it. +7. I prepared to state the observation. I did this by stating that I was happy. First, I was happy. Second, I moved forward. Third, I went well again. +8. I prepared to take photographs. I did this by stating that I was friendly. First, I worked well. Second, I went well into it. Third, I enjoyed it. +9. I prepared to adjust to the times. I did this by stating that I was not weird (I was normal). First, I was normal. Second, I wasn't. Third, I went back to normal. +10. I prepared to document my part of the world. I did this by stating that I saw the world and all that was in it. First, I liked you. Second, I liked caringness. Third, I loved it. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Short Arguments/No Radiation 3 of 25.txt b/Lucian-Academy/Books/Short Arguments/No Radiation 3 of 25.txt new file mode 100644 index 0000000000000000000000000000000000000000..a24b71e58560bc448b8075b4643117ca5582dbab --- /dev/null +++ b/Lucian-Academy/Books/Short Arguments/No Radiation 3 of 25.txt @@ -0,0 +1,13 @@ +["Green, L 2021, No Radiation 3 of 25, Lucian Academy Press, Melbourne.","Green, L 2021",1,"No Radiation 3 of 25 + +1. I prepared to delight the audience. I did this by finding a larger field for repeating famousness. First, I found Ovid. Second, I repeated famousness. Third, I meant something else. +2. I prepared to be blown away. I did this by stating that existing pedagogy will work. First, I thanked it. Second, I existed. Third, I did MOOCs (massive open online courses). +3. I prepared to decide willingly. I did this by stating that existing meditation will work. First, I found you. Second, I wanted you. Third, I wasn't racist. +4. I prepared to vive la différence. 
I did this by stating that new time medicine will work. First, I loved daddy long legs. Second, I hinted at it. Third, I wanted you.
+5. I prepared to ask, 'What are they?'. I did this by stating that time lascelles are protected. First, I found you. Second, I wanted you. Third, I wanted it.
+6. I prepared to engage well. I did this by stating that I showered when I arrived home. First, I went through the time machine. Second, I showered when I got home. Third, I wondered where that went.
+7. I prepared to excel. I did this by time travelling to explore famousness. First, I found myself going well. Second, I asked, 'Who's that?'. Third, I thought that you are a genius too.
+8. I prepared to be responsive. I did this by stating that I am above time now. First, I thought of time. Second, I thought of aboveness over it. Third, I went to it.
+9. I prepared to do it instantly. I did this by doing the following and praying to do the following before arriving: I meditated on the impact. First, I looked at the blast. Second, I held fast. Third, I found more people there.
+10. I prepared to reassure people that meditation was done for them automatically. I did this by stating that my body can survive for several seconds after arriving in a time. First, I found you. Second, I meditated. Third, I thought of the cliché about whether I had seen the person somewhere else before.
+"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Short Arguments/No Radiation 4 of 25 (final).txt b/Lucian-Academy/Books/Short Arguments/No Radiation 4 of 25 (final).txt new file mode 100644 index 0000000000000000000000000000000000000000..cb8e7bb2f7be445da5937dff4ed5c8c7001a5710 --- /dev/null +++ b/Lucian-Academy/Books/Short Arguments/No Radiation 4 of 25 (final).txt @@ -0,0 +1,6 @@ +["Green, L 2021, No Radiation 4 of 25 (final), Lucian Academy Press, Melbourne.","Green, L 2021",1,"No Radiation 4 of 25 (final)
+
+1. I prepared to plan to be safe when time travelling.
I did this by stating that there was determinism with respect to breasonings. First, I breasoned out the aim. Second, I worked on the aim. Third, I achieved the aim. +2. I prepared to be protected by selecting a safe possibility. I did this by stating that time is a point which protection can be drawn from. First, I chose the point. Second, I was protected at the point. Third, I found the fifth dimension. +3. I prepared to cover travelling in the time machine with meditation. I did this by stating that the body returns to normal after time travelling like a beating flagella with meditation. First, I meditated. Second, I time travelled. Third, I meditated again. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Short Arguments/Nut_and_Bolt.txt b/Lucian-Academy/Books/Short Arguments/Nut_and_Bolt.txt new file mode 100644 index 0000000000000000000000000000000000000000..b5c6e90bc8c52547792a1153c8e7d850ab6cf338 --- /dev/null +++ b/Lucian-Academy/Books/Short Arguments/Nut_and_Bolt.txt @@ -0,0 +1,16 @@ +["Green, L 2021, Nut_and_Bolt, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Nut and Bolt + +1. I prepared to help the cell perform its function in the body. I did this by taking the biological specification for the body from the quantum box and synthesising the chemical parts using the nut and bolt, for example I itemised the chemicals in food. First, I prepared to perform the action as if the record of it was transmitted. Second, I relaxed and visualised performing the action. Third, I performed the action. +2. I prepared to walk using the nut and bolt. I did this by synthesising the chemistry of the skeletal system with the nut and bolt. First, I found the first bone. Second, I found the joint. Third, I found how the bone fit into the joint using the nut and bolt. +3. I prepared to prevent muscle aches and pains. I did this by synthesising the chemistry of the muscular system with the nut and bolt. First, I found the first row of cells. 
Second, I found the cells that would slide along them to make the muscle contract. Third, I slid them along them, causing muscular contraction, using the nut and bolt. +4. I prepared to control that the skin attached to the tissue. I did this by synthesising the chemistry of the integumentary system with the nut and bolt. First, I found the tissue the skin was on. Second, I found the skin. Third, I bolted the skin to the tissue. +5. I prepared to ensure that the nerve signal was maintained. I did this by synthesising the chemistry of the nervous system with the nut and bolt. First, I found the first nerve. Second, I found the second nerve. Third, I bolted the first nerve to the second nerve. +6. I prepared to ensure proper body function when the hormone reached its target. I did this by synthesising the chemistry of the endocrine system with the nut and bolt. First, I found the endocrine gland. Second, I determined the hormone. Third, I bolted the production of the hormone to the endocrine gland. +7. I prepared to prevent blockages in the circulatory system. I did this by synthesising the chemistry of the circulatory system with the nut and bolt. First, I found the blood cell. Second, I found the blood vessel. Third, I bolted the blood cell on to the notion of keeping going through the blood vessel. +8. I prepared to fight pathogenic disease. I did this by synthesising the chemistry of the immune system with the nut and bolt. First, I found the pathogen. Second, I found the immune response. Third, I bolted the immune response to the pathogen. +9. I prepared to help the filtered blood return to the blood. I did this by synthesising the chemistry of the lymphatic system with the nut and bolt. First, I found the blood. Second, I found the plasma. Third, I bolted removing plasma from the blood. +10. I prepared to maintain the cells' oxygen supply. I did this by synthesising the chemistry of the respiratory system with the nut and bolt. 
First, I found the oxygen molecule. Second, I found the blood cell. Third, I bolted the oxygen molecule on to the blood cell.
+11. I prepared to form urine. I did this by synthesising the chemistry of the urinary system with the nut and bolt. First, I found the urea. Second, I found the water. Third, I bolted the urea to the water.
+12. I prepared to eat enough of a type of food. I did this by synthesising the chemistry of the digestive system with the nut and bolt. First, I found the nutrient. Second, I found the body part that needed it. Third, I bolted the nutrient to the part of the body where it was needed.
+13. I prepared to want the baby. I did this by synthesising the chemistry of the reproductive system with the nut and bolt. First, I found the baby. Second, I found the parents. Third, I bolted the baby to the parents.
+"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Short Arguments/Part_of_Room.txt b/Lucian-Academy/Books/Short Arguments/Part_of_Room.txt new file mode 100644 index 0000000000000000000000000000000000000000..096b314a597ade0d3000ccac8b2a22c88a314d22 --- /dev/null +++ b/Lucian-Academy/Books/Short Arguments/Part_of_Room.txt @@ -0,0 +1,13 @@ +["Green, L 2021, Part_of_Room, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Part of Room
+
+1. I prepared to find out more. I did this by deciding what part of the room to act in. First, I walked to the mantelpiece. Second, I lifted my hand. Third, I was given the clock.
+2. I prepared to explore the idea more. I did this by thinking about why the idea was working. First, I wrote. Second, I wrote on the topic. Third, I wrote up a storm.
+3. I prepared to marry the white witch and white knight. I did this by aerotopically interacting with the part of the room. First, I found the mallet. Second, I touched the knight's armour's shoulder. Third, I agreed with the knight.
+4. I prepared to transfer money on the spiritual computer.
I did this by stating that the Academy shopping trolley was part of the room. First, I found the nut. Second, I found the bolt. Third, I moved the spiritual desk. +5. I prepared to boil the water (write the essay). I did this by stating that the philosophy algorithm code filled the part of the room. First, I found the font. Second, I found the samovar. Third, I filled it. +6. I prepared to make the part of the room into a ludo/life work setting (to create business products (e.g. a process), employees (who created employees and customers) and customers). I did this by stating that the leader made work fun. First, I found the leader. Second, I asked him how to make work fun. Third, I found work fun. +7. I prepared to look at my hand approach my lips. I did this by reading about the part of the room. First, I aimed for the threshold. Second, I reached it. Third, I kept the aperture slightly open. +8. I prepared to prevent the seven year itch. I did this by commenting on love in the composition. First, I found love. Second, I found it dwelling. Third, I knew it. +9. I prepared to decide to love. I did this by commenting on sex in the composition. First, I read. Second, I spoke. Third, I thought. +10. I prepared to admit that it was this planet and that the sky was 1 metre above the ground. I did this by stating that where the planetary outpost was a building, the space station was itself. First, I noticed the building on a planet. Second, I noticed the space station in the sky. Third, I went from Earth, to the space station, to the planetary outpost. 
+"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Short Arguments/Pranayama.txt b/Lucian-Academy/Books/Short Arguments/Pranayama.txt new file mode 100644 index 0000000000000000000000000000000000000000..f66cf6fefc6734996f49b15a12fa3ce8a667eb29 --- /dev/null +++ b/Lucian-Academy/Books/Short Arguments/Pranayama.txt @@ -0,0 +1,13 @@ +["Green, L 2021, Pranayama, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Pranayama + +1. I prepared to share my skill with others. I did this by meditating on my breath. First, I controlled my breath. Second, I noticed I was transcending the world. Third, I transcended everything. +2. I prepared to control myself with God. I did this by liking God with you. First, I inhaled. Second, I ascertained the aim and direction of my thoughts. Third, breathing helped me. +3. I prepared to revise in the evening. I did this by practising pranayama in the evening. First, I inhaled oxygen. Second, I noticed it reach the brain. Third, I observed the oxygen optimise my brain function. +4. I prepared to warm up. I did this by I practised pranayama in the morning. First, I exhaled. Second, I released stress in the body. Third, I relaxed. +5. I prepared to act. I did this by finishing in fame with pranayama. First, I held my nostril closed. Second, I blew through the other one. Third, I cleared my airways. +6. I prepared to reach the goal. I did this by viewing the Grand Duchess of York. First, I commanded respect. Second, I was valiant. Third, I overcame blocks. +7. I prepared to interpolate conclusions from knowledge. I did this by sketching the idea in the middle. First, I thought about how body function followed a single line. Second, I thought about how thoughts followed a single line in the mind. Third, I thought about how actions followed a single line in reality. +8. I prepared to fly through my thoughts. I did this by breathing in slowly to relax. First, I was happy with my life. 
Second, I meditated on and prepared to work on an algorithm. Third, I sketched the perfect program's details. +9. I prepared to sing. I did this by yawning. First, I relaxed. Second, I opened my mouth. Third, I allowed myself to yawn. +10. I prepared to speak. I did this by enlarging my vocal space. First, I performed a relaxing vocal exercise. Second, I made sure the activity area was safe. Third, I performed the activity. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Short Arguments/Print line.txt b/Lucian-Academy/Books/Short Arguments/Print line.txt new file mode 100644 index 0000000000000000000000000000000000000000..85339eb2f6e7714d85bc470d49f550d0b26ba614 --- /dev/null +++ b/Lucian-Academy/Books/Short Arguments/Print line.txt @@ -0,0 +1,7 @@ +["Green, L 2021, Print line, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Print line + +1. I read the character. I printed the character. I selected the character. I drew the first point. I drew it. +2. I talked with you. I saw you there. I examined the crowd. I saw you in it. I approached you. +3. I returned to my position. I saw the person return to her position. I spoke with the person. I finished talking with the person. I let her return to her position. +4. I recorded the result. I aimed for the point. I decided on the point. I executed the action. I attained the result. +5. I tested the connection. I aimed for the connection between the points. I found the first point. I found the second point. I connected them."] \ No newline at end of file diff --git a/Lucian-Academy/Books/Short Arguments/Print.txt b/Lucian-Academy/Books/Short Arguments/Print.txt new file mode 100644 index 0000000000000000000000000000000000000000..5f0275376df67e64021214d403c3cb04a5b7d5ef --- /dev/null +++ b/Lucian-Academy/Books/Short Arguments/Print.txt @@ -0,0 +1,8 @@ +["Green, L 2021, Print, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Print + +1. The characters were a line.
I received m marks for printing line m of n. I printed the first character. I prepared to print the next character. I repeated this until I had printed the character. +2. I wrote the quadrant of the point. I labelled the point with its x and y co-ordinates. I found and labelled the point's x co-ordinate. I found and labelled the point's y co-ordinate. I checked off that I had labelled the point. +3. I labelled the y-intercept. I verified the y-intercept. I checked that the y-intercept was c in y=mx+c. I found the gradient m. I substituted the x and y values of a point, c and m into the equation to verify it. +4. I superimposed the y-intercepts. I found the y-intercept interesting. I found graphs with different y-intercepts. I found graphs with the same y-intercepts. I superimposed them. +5. I drew the line y=x. I found the y-intercept interesting with the other. I translated the graph to have a new y-intercept. I reflected it in the x-axis. I reflected it in the y-axis. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Short Arguments/Professor Algorithm - Student.txt b/Lucian-Academy/Books/Short Arguments/Professor Algorithm - Student.txt new file mode 100644 index 0000000000000000000000000000000000000000..95010442a3898c3d43a0e311dd3284cd55b87e3b --- /dev/null +++ b/Lucian-Academy/Books/Short Arguments/Professor Algorithm - Student.txt @@ -0,0 +1,13 @@ +["Green, L 2021, Professor Algorithm - Student, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Professor Algorithm - Student + +1. I prepared to complete the work. I did this by sipping from the city water supply. First, I found the tap. Second, I poured a glass of water. Third, I drank it. +2. I prepared to assess the back-translated algorithm, then essay. I did this by sitting on the seat. First, I found the child. Second, I sat with it. Third, I played tiddlies. +3. I prepared to see where it went. I did this by stating that the verb described my action. 
First, I photographed where I would perform the action. Second, I photographed the object. Third, I photographed the object after performing the action on it. +4. I prepared to turn out the light. I did this by performing the verb to the other. First, I found the other. Second, I told the story to the other. Third, I listened to her question. +5. I prepared to capture the gossamer nightingale. I did this by performing the action about the verb to the group. First, I burst into colour. Second, I burst into song. Third, I sang the lullaby. +6. I prepared to smile. I did this by symbolising the verb. First, I enjoyed the song. Second, I rummaged in the Christmas sack. Third, I pulled the theatrical mask out of the sack. +7. I prepared to summarise the sentences' key self-terms. I did this by finding out how I symbolised the verb. First, I read the diary. Second, I read the entry. Third, I read that I was happy. +8. I prepared to feature the important points. I did this by stating the pair symbolising the verb. First, I produced the rule with the machine learning algorithm. Second, I explained that the pair was the short term and long term memory in the algorithm. Third, I explained that the verb was to spring clean the system. +9. I prepared to go further. I did this by not performing the action. First, I found something better. Second, I did it instead. Third, I went that way. +10. I prepared to go further on it as well. I did this by finding you performing the action. First, I found the school's founder. Second, I found her eating the dish. Third, I knew it was you.
+"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Short Arguments/Professor Algorithm.txt b/Lucian-Academy/Books/Short Arguments/Professor Algorithm.txt new file mode 100644 index 0000000000000000000000000000000000000000..29dddc633ed87f5a8721c712dec05e23f5003c67 --- /dev/null +++ b/Lucian-Academy/Books/Short Arguments/Professor Algorithm.txt @@ -0,0 +1,13 @@ +["Green, L 2021, Professor Algorithm, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Professor Algorithm + +1. I prepared to seek more people. I did this by training people in time points about the Professor Algorithm. First, I found the time point. Second, I trained people in them. Third, I found the people using the Professor Algorithm. +2. I prepared to know the set of all people. I did this by stating that people were trained in the Professor Algorithm. First, I found the books. Second, I found them people training themselves. Third, I trained them too. +3. I prepared to carefully examine the capsule. I did this by stating that the result of the professor algorithm was mine. First, I found the meme. Second, I capsulated it. Third, I dismantled it. +4. I prepared to speak in language with tact. I did this by stating that the professor algorithm was it. First, I exfoliated my mouth. Second, I said, 'Who's that, Grandma?'. Third, I I examined you closely. +5. I prepared to find out everything that I wanted. I did this by stating that the subject was the entity who used the professor algorithm. First, I knew you. Second, I knew what it was. Third, I knew it very well. +6. I prepared to intactify (sic) myself - everyone. I did this by computing the professor algorithm and it was good. First, I found you. Second, I knew what you meant. Third, I designed you. +7. I prepared to djablucinate it (sic). I did this by stating that the action was derived from the verb in the breasoning. First, I do it, you say it. Second, I wrote it. Third, I found everyone as well. +8. I prepared to welcome it. 
I did this by meditating on a day when I found out the verb from God. First, I found the word. Second, I said it. Third, I was it. +9. I prepared to say it encouragingly. I did this by performing the verb. First, I loved you. Second, I loved it. Third, I wanted it. +10. I prepared to ask, 'Isn't that Nietzsche?'. I did this by stating that the verb is algal. First, I asked, 'What is algal?' (it is algorithm-like). Second, I asked, 'What is an algorithm?' (it is our bath). Third, I knew it. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Short Arguments/Quantum_Box:Prayer.txt b/Lucian-Academy/Books/Short Arguments/Quantum_Box:Prayer.txt new file mode 100644 index 0000000000000000000000000000000000000000..9c11e9ea9768ce49ed498e79d2f0b7dd52488c7e --- /dev/null +++ b/Lucian-Academy/Books/Short Arguments/Quantum_Box:Prayer.txt @@ -0,0 +1,13 @@ +["Green, L 2021, Quantum_Box:Prayer, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Quantum Box/Prayer + +1. I prepared to record the update for automation. I did this by deleting the connection to the error. First, I found the error. Second, I corrected it. Third, I reconnected to the correction. +2. I prepared to maintain positive function. I did this by thinking of the positive thought instead. First, I listened to the positive stimulus. Second, I verified that it was positive. Third, I thought of the positive reply. +3. I prepared to apply for the loan. I did this by observing inequality disappear in politics. First, I found how America was formed. Second, I observed their lever inequalities. Third, I noted Australia's solvable inequalities. +4. I prepared to allow for individual skills. I did this by observing inequality for women disappear in politics. First, I observed the unequal numbers of each sex in each role type. Second, I encouraged enough women to apply. Third, I manually met the PISA requirements. +5. I prepared to specialise in the era. 
I did this by observing inequality for LGBTIQA disappear in politics. First, I found the LGBTIQAs. Second, I compensated for them, including any people with disabilities. Third, I found the appropriate way to occupy them. +6. I prepared to hand breason out thoughts for children. I did this by observing inequality for the minority disappear in politics. First, I went to the red house. Second, I put pedagogy, meditation and medicine in it. Third, I did this before conception and during the life. +7. I prepared to leave a 'zero' footprint. I did this by reducing paper in schools. First, I thought that square eyes could be prevented, so computers were better than paper. Second, I made my own computer. Third, I sold spiritual computers. +8. I prepared to save energy. I did this by reducing snail mail. First, I completed the form on computer. Second, I emailed it to the recipient. Third, I reduced paper mail. +9. I prepared to follow up the individual. I did this by making sure that there were proper protocols so that the message was posted online without fail. First, I listed the post. Second, I automatically posted the post. Third, I ensured that it had been received. +10. I prepared to automate bug checking. I did this by agreeing with automatic removal of the error, unless it needed to be done manually. First, I fixed the error with the algorithm. Second, I checked whether the fix was correct. Third, I kept the change if it was correct, otherwise I manually fixed it. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Short Arguments/Rebreasoning.txt b/Lucian-Academy/Books/Short Arguments/Rebreasoning.txt new file mode 100644 index 0000000000000000000000000000000000000000..da1d593fb21f18b7e95e785229d4e205ac2b75b7 --- /dev/null +++ b/Lucian-Academy/Books/Short Arguments/Rebreasoning.txt @@ -0,0 +1,13 @@ +["Green, L 2021, Rebreasoning, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Rebreasoning + +1. I prepared to make money from magic. 
I did this by touching the gold. First, I replicated the gold. Second, I sold it. Third, I made a profit. +2. I prepared to study space science. I did this by stating that the clothes touched. First, I found that the space craft was sealed. Second, I found the space suit was sealed. Third, I researched all times. +3. I prepared to traverse the grammar. I did this by stating that the grammars touched. First, I found the subject's grammar. Second, I found the verb phrase's grammar. Third, I stated that the subject's grammar touched the verb phrase's grammar. +4. I prepared to play the computer game. I did this by touching the circle of gold. First, I found the flying eagle robot. Second, I pressed the circle of gold. Third, I flew in the machine. +5. I prepared to take my new measurements. I did this by stating that the clothes were stretched together. First, I put on muscle. Second, I stretched the arms of the clothes. Third, I stretched the legs of the clothes. +6. I prepared to automate the actions. I did this by observing that the friendly meeting was legal. First, I met the entity. Second, I touched the rectangular air prism. Third, I automated taking minutes. +7. I prepared to wonder what they looked like when they were young. I did this by stating that my eyes met dawn. First, I looked at the horizon. Second, I waited until dawn. Third, I met the sun with my eyes. +8. I prepared to verify the connection. I did this by moving on to the next point. First, I breasoned out the first point. Second, I breasoned out the connection to the second point. Third, I moved on to the second point. +9. I prepared to use machine learning to identify reused words. I did this by deciphering the code. First, I found the out-of-sequence letters. Second, I found all combinations of the letters. Third, I deciphered them. +10. I prepared to rank the objects. I did this by following the plan. First, I read the plan. Second, I found the pathway to the objects. 
Third, I simplified the plan. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Short Arguments/Rebreathsoning.txt b/Lucian-Academy/Books/Short Arguments/Rebreathsoning.txt new file mode 100644 index 0000000000000000000000000000000000000000..b6e6f982d1786532a6d41ca329acab9de8309fcb --- /dev/null +++ b/Lucian-Academy/Books/Short Arguments/Rebreathsoning.txt @@ -0,0 +1,14 @@ +["Green, L 2021, Rebreathsoning, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Rebreathsoning + +1. I prepared to marry medicine and computer science. I did this by eating fresh food. First, I related the food to computer science. Second, I examined the algorithm's data flow. Third, I matched it to digestion. +2. I prepared to side with the pole. I did this by observing meantness (sic). First, I found the statement to be unwavering through time. Second, I found it to be unwavering in relation to other statements. Third, I found it to be unwavering in relation with other people. +3. I prepared to question the bug in the time machine. I did this by stating that the tourists visited 'Mars'. First, I simulated the light. Second, I simulated the surface. Third, I simulated the flights. +4. I prepared to record the position. I did this by determining the functional locus. First, I found the position that influenced function. Second, I noticed movement to the left. Third, I noticed movement to the right. +5. I prepared to function positively. I did this by taking the breathsoning into consideration. First, I noticed the student had failed. Second, I observed that the student had failed because of the essay format and number of breasonings, not anything else. Third, I re-examined the student. +6. I prepared to identify the breathsoning 'carefully' in Medicine. I did this by stating that the larrikin suffered from breathsoningitis. First, I removed the block. Second, I helped the flow to move on. Third, I encouraged the person to use enough breathsonings. +7. 
I prepared to remain the person in charge. I did this by examining the famous breathsoning. First, I meditated carefully. Second, I researched whether the experiment worked. Third, I wrote on the topic and education of it. +8. I prepared to maintain good digestion. I did this by examining the breathsoning work. First, I ate carefully. Second, I ate with a minimum of fuss. Third, I ate over a longer time. +9. I prepared to describe the process in terms of objects. I did this by breasoning out the breathsoning. First, I observed the friendly person. Second, I found them being around themselves. Third, I found what you meant. +10. I prepared to notice the sickness (health). I did this by breathsoning out the tantrum. First, I observed them be all for themselves. Second, I helped them up. Third, I exacerbated it. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Short Arguments/Room.txt b/Lucian-Academy/Books/Short Arguments/Room.txt new file mode 100644 index 0000000000000000000000000000000000000000..cc94ac003fbb31e1df9901e51242687dd2b38175 --- /dev/null +++ b/Lucian-Academy/Books/Short Arguments/Room.txt @@ -0,0 +1,13 @@ +["Green, L 2021, Room, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Room + +1. I prepared to wreak peace. I did this by stating that there was an international room. First, I found the shuttlecocks. Second, I found the sticks. Third, I found the galumphs. +2. I prepared to have peace. I did this by endorsing red Palestine. First, I found you. Second, I found myself. Third, I connected us. +3. I prepared to fit all the objects into the room. I did this by stating that the room's dimensions were in the golden ratio. First, I approximated the golden ratio as 1.5. Second, I found that the width of the room was 4 metres. Third, I found that the length of the room was 6 metres. +4. I prepared to answer the question every time. I did this by answering yes, not no to the question of the meeting in the pedagogy room.
First, I wrote that the question of the pedagogy room was, 'Are you pursuing pedagogy?'. Second, I followed pedagogy in my life. Third, I followed pedagogy in my science. +5. I prepared to write another A. I did this by examining the egg and sperm when they became the zygote. First, I found the egg. Second, I inserted the sperm in it. Third, I observed the zygote. +6. I prepared to make things in each room. I did this by stating that the living room and the kitchen made the dining room. First, I visited the living room. Second, I visited the dining room. Third, I visited the kitchen. +7. I prepared to examine the seconds of the phontology. I did this by writing the room a phontology. First, I listened to the name of the room. Second, I listened to its function. Third, I recorded the structure of what I listened to. +8. I prepared to experience wellness. I did this by experiencing meantness in the room. First, I found the first side of the distinction. Second, I found the second side of the distinction. Third, I chose meantness. +9. I prepared to apologise for (agree with) harmony. I did this by stating that the room gave agreement and disagreement different values. First, I originally gave them different values. Second, I listened to the black hole character make Essay Helper programmers feel unwell. Third, I changed them to different values. +10. I prepared to state that the students performed their function. I did this by stating that the students found their way in the room. First, I stated that they found where they were in the house. Second, I stated that they found where they were in the room. Third, I stated that they found their orientation.
+"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Short Arguments/Same order of magnitude.txt b/Lucian-Academy/Books/Short Arguments/Same order of magnitude.txt new file mode 100644 index 0000000000000000000000000000000000000000..407f84d6602ffde72cf2e0f9418d4243d4098881 --- /dev/null +++ b/Lucian-Academy/Books/Short Arguments/Same order of magnitude.txt @@ -0,0 +1,12 @@ +["Green, L 2021, Same order of magnitude, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Same order of magnitude + +1. I saw that the values were equal. I noticed not to be tricked. I found the rice. I ate it. I was fair. +2. I examined the value. There are services (events) because people travel the same distance each day. I chose the destination. I travelled there. I travelled there again. +3. I saw the person's emotion. I avoided (found) the people when they were about. I found the person. I described her. I walked along the esplanade. +4. I saw the others. It is it. I found it. I found the other. It was it. +5. I saw the destination. The very fence is excluded. I found the fence. I avoided it. I protected the walk there and back. +6. I noticed that the order of magnitude was a number. I awarded the grade for the same order of magnitude. I read the algorithm. I read the output. I read that the answer was correct. +7. The copula is a connecting word, usually the verb to be connecting the subject and complement. I noticed that the copula was the same given the same order of magnitude. I was the character of a certain weight. I was the second character of the same weight. I found the equality. +8. I read the book. I examined the popularities. I found the first number of people. I found the second number of people. I found that they were the same. +9. The money was the correct amount. Both partners were happy. I saw that the first partner earned the deal. I saw that the second partner agreed. The money was transferred. +10. The money was of the right amount. 
Both partners were satisfied. The food was fresh. The buyer wanted that type of food. The seller sold the food."] \ No newline at end of file diff --git a/Lucian-Academy/Books/Short Arguments/Simulated Intelligence 2.txt b/Lucian-Academy/Books/Short Arguments/Simulated Intelligence 2.txt new file mode 100644 index 0000000000000000000000000000000000000000..23f97bd124ae9a860916942e30191c10eaf023db --- /dev/null +++ b/Lucian-Academy/Books/Short Arguments/Simulated Intelligence 2.txt @@ -0,0 +1,13 @@ +["Green, L 2021, Simulated Intelligence 2, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Simulated Intelligence + +11. I reconstructed the man's thoughts. I knew God (the man). I followed God. I examined his philosophies. I critiqued them. +12. Closer to the fundamental end was more famous. I knew him well. I aimed to research induction. It was by itself. The best thing was that philosophy was synthesised. +13. It started. I knew him very well. I found out about science. That's what politics was about, gays. It was computer science. +14. The man added to the simulation. God (the man) took care of people in subsets of the simulation. God found the subset of the simulation. It was the place. The man recorded them. +15. It was a black box. God (the woman) designed technologies for the simulationees (sic). They were possible with physics. They added to the simulation. Someone was giving a reward for effort. +16. Creative writing was intelligent because it was the source of life. The simulation was anarchic. Anarchy was enjoyment in writing. It was after creative writing at University. I simulated creative writing. +17. Getting a job required an LSTM. The fallen (alive) were recorded in the simulation. Creative writing and education were the best combination in the simulation. They helped each other. There were other departments. +18. LSTMs helped write algorithms for each sentence. The simulation was hard to make horrific (easy to like). 
The professor worked on integrations with students in the department. The student who earned a high distinction had time to write a relevant algorithm. He had also worked on a mind map, thesis plan (with side taken in a two-sided contention, one's argument, paragraph topics and something greater). +19. I replaced terms to achieve generality. The simulation's starting and ending times were indicated. The writing was on the topic. The algorithm was on the topic. Plainness was preferred in the generated algorithms. +20. The detailed algorithm added to the interpreter. The citizen was thought of in the simulation. The citizen wrote her detailed algorithm. She wrote a relevant algorithm. She was recognised. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Short Arguments/Simulated Intelligence 3.txt b/Lucian-Academy/Books/Short Arguments/Simulated Intelligence 3.txt new file mode 100644 index 0000000000000000000000000000000000000000..6e667e7a47898f8ab7b0524880ae8e9d0efdc9f2 --- /dev/null +++ b/Lucian-Academy/Books/Short Arguments/Simulated Intelligence 3.txt @@ -0,0 +1,13 @@ +["Green, L 2021, Simulated Intelligence 3, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Simulated Intelligence 3 + +21. The thoughts had a science. The science of the simulation was the spiritual language including 'A' (copulate) and 'up' (transcend). I linked to the set around me. I moved up afterwards. I simulated thoughts. +22. Simulated intelligence was spiritual artificial intelligence. The simulation worked because idealism worked. The object was actually a breasoning. Electrons were not there when not observed. Some objects were there when not observed. +23. A simulation, not reality, contained thoughts. The simulation gave the rest of the thoughts. There was an impetus. It resulted in thoughts and art. These were the rest of the thoughts. +24. The simulation contained multiple people. I saw the big otherlander (other). There were enough references to the other. 
Enough references to the other resulted in multiplicity (like education). It was also like sales. +25. My simulated intelligence device guided me to positive thoughts. I saw the cosmic land. In cosmic land, medicine and computers maintained positivity. There were universities. I met like-minded people. +26. Science could create anything. Everything in the simulation is intelligent. The humans were intelligent. There was intelligence about the objects. There were intelligent systems. +27. Identifying the shape and colours of a mistake reached the threshold of being recognised. The simulation laughed with its characters. There was an opportunity to prevent mistakes. People simulated others in the simulation. The people laughed with the characters in the simulation. +28. Eventually I wrote transition machines. The simulation character continuously improved in his competency. The start and end were computer science. Philosophy was in the middle. I wrote a philosophy on a computer science idea, then programmed it. +29. Others could write as well. The simulation character continuously improved in perfection in health thoughts. He wrote them. He updated them. He wrote on new topics. +30. I wrote the fundamentals of pedagogical science (with implications in medicine and engineering). The simulation character continuously improved in happiness in society. I developed algorithms using machine learning (LSTMs). I customised these using vague mind reading. I completed the algorithms by expressing them as transition machines (so that they could be spiritual computers).
+"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Short Arguments/Simulated Intelligence.txt b/Lucian-Academy/Books/Short Arguments/Simulated Intelligence.txt new file mode 100644 index 0000000000000000000000000000000000000000..48ef7990f7077998f7c1dc97bb153ba597d729ad --- /dev/null +++ b/Lucian-Academy/Books/Short Arguments/Simulated Intelligence.txt @@ -0,0 +1,15 @@ +["Green, L 2021, Simulated Intelligence, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Simulated Intelligence + +1. The world existed. The world (life, existence and everything else) was simulated. I simulated the people. I simulated the world. I simulated the spiritual sides. +1a. Instead of relative, absolute. +2. I solved problems with the draft. The simulation and intelligence were merged. I wrote the sentence. I analysed it. I simulated it. +3. I helped others. Intelligence (determinism) was stronger than the simulation (fate). I heard about it. I checked it. I stayed there. +4. There were ethical dilemmas with the simulation. I disagreed with simulated intelligence. I found the subject. I simulated their thoughts. I disagreed with them. +5. I applied the theory. All major nodes' arguments were simulated. I argued for the start. I argued for the finish. I thought of the theory. +6. I examined the simulation's variable. I saw the form in the simulation. I saw the form (shape). I identified its genus (type). The object was the variable. +7. It was actually people. The simulation aimed for world peace. Some people found out about our time. There were some bots. Only some moved in. +8. I applied use to lambda calculus. I saw the meoworld (sic) in the simulation. I saw the non-existent world. I used it up. It was good. +9. The object was the fundamental unit. I synthesised the simulation. I found the opposite. I noticed it was a contention. I found inside-out contentions. +10. The character helped. The simulation character was open. He was honest. He expanded his reasonings. 
He worked the simulation out. +10a. Computational English and children's thoughts should be breasoned out before conception to lead to world peace in the simulation. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Short Arguments/Soma.txt b/Lucian-Academy/Books/Short Arguments/Soma.txt new file mode 100644 index 0000000000000000000000000000000000000000..580c908d64b3ee6f58ced78d2f0c349b61fdf5be --- /dev/null +++ b/Lucian-Academy/Books/Short Arguments/Soma.txt @@ -0,0 +1,13 @@ +["Green, L 2021, Soma, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Soma + +1. I prepared to tell someone about the soma prayer. I did this by stating that the timing of the soma prayer made me happy. First, I meditated. Second, I said the soma prayer. Third, I noticed no digestive system pops from practising the sutra. +2. I prepared to respond to others needing help. I did this by stating that learning the soma prayer with grit made me happy. First, I meditated without (with) the soma prayer. Second, I noticed no (some) digestive system pops. Third, I learned to use the soma prayer each day (i.e. with single pointedness). +3. I prepared to state that there was stomach gas needing the soma prayer without eating. I did this by stating that as the breasoning is empty, the stomach was empty as a result of the soma prayer. First, I decided to fast on the day. Second, I fasted on the day. Third, I planned to fast a week later. +4. I prepared to control normally involuntary body processes. I did this by reducing the HCl created by the sutra in the stomach ontology to remove gas. First, I spiritually adjusted how much food was detected. Second, the less food that was detected, the less HCl was created. Third, the less HCl that was created, the less gas there was. +5. I prepared to stop eating when the stomach was 80% full. I did this by stating that the soma prayer took care of the stomach after eating. First, I said the soma prayer after eating. Second, there was no gas. 
Third, the stomach didn't release gas. +6. I prepared to eat less to have enough room for the next meal. I did this by stating that the soma prayer took care of the stomach in readiness for the next meal. First, I ate whole foods. Second, I drank cold water. Third, I ate a Mediterranean diet. +7. I prepared to avoid the surface of my stomach bulging after eating. I did this by stating that the soma prayer removed discomfort. First, I removed the spiritual block from on the other block (I digested the food after chewing properly). Second, I noticed the soma prayer helped the positive function of the digestive system. Third, I remained comfortable. +8. I prepared to increase awareness of the soma prayer and proper digestion. I did this by stating that the soma prayer caused communication of good digestion in society. First, I chewed small mouthfuls to increase the effectiveness of the soma prayer. Second, I chewed slowly. Third, I chewed carefully. +9. I prepared to find the ideal amount of food per stomach volume. I did this by stating that there was a 1:1 ratio of HCl molecules needed to food molecules. First, I poured n mol of HCl into the stomach model. Second, I poured n mol of food into the stomach model. Third, I detected that the food was digested. +10. I prepared to observe the healthy stool. I did this by stating that there was extra HCl to complete digestion. First, I observed the amount of acid, without extra acid, digest the food in the stomach model. Second, I observed the extra acid complete digestion. Third, there was waste. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Short Arguments/Sort.txt b/Lucian-Academy/Books/Short Arguments/Sort.txt new file mode 100644 index 0000000000000000000000000000000000000000..12c2caf42b3bcaa39f755cc1ebc47ae264f1c4a1 --- /dev/null +++ b/Lucian-Academy/Books/Short Arguments/Sort.txt @@ -0,0 +1,7 @@ +["Green, L 2021, Sort, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Sort + +1. 
I reversed the order. The other sorted the two items well. I found the first item. I found the second item. I ordered the item with the greater value to be first. +2. I labelled the number. I knew what the gridline was. I found the number. I rounded it down. I sketched it on the number line. +3. I found the object on the map. The gridline gave me something to think about ontology with. I found the object. I found its position. I marked it on the map. +4. I simplified 3D to 2D. I noticed the big idea was on me. I found the idea. I found the blocks. I drew the objects on a page. +5. I listed the values by frequency. I sorted the block ontologies. I found the width of the blocks. I sorted them. I reversed the order from largest to smallest."] \ No newline at end of file diff --git a/Lucian-Academy/Books/Short Arguments/Sort_by_x.txt b/Lucian-Academy/Books/Short Arguments/Sort_by_x.txt new file mode 100644 index 0000000000000000000000000000000000000000..7d7c8d47b313ee39d57d32191897287ededa13e6 --- /dev/null +++ b/Lucian-Academy/Books/Short Arguments/Sort_by_x.txt @@ -0,0 +1,7 @@ +["Green, L 2021, Sort_by_x, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Sort_by_x + +1. I prepared to explain that for the other lines, the method would work to prevent disconnects in the line. I did this by noticing that the line might ascend. First, I prepared to connect the points from left to right. Second, I only drew lines with a gradient 0 < m =< 1 or 0 >= m >= -1. Third, I flipped Xs and Ys, drew the line, and then flipped the Xs and Ys again for the other lines. +2. I prepared to check whether the point intersected with the line. I did this by observing the object ascend. First, I drew the line. Second, I saw it ascend. Third, I noticed its end point. +3. I prepared to repeat plotting the set of points. I did this by plotting the points from left to right. First, I found the x co-ordinate of the next point in order from left to right. 
Second, I found the y co-ordinate of the next point in order from left to right. Third, I plotted it, then prepared to plot the next point. +4. I prepared to list the pairs in order. I did this by sorting by x, regardless of y. First, I found the maximum in the pair of values. Second, I placed it before the other value. Third, I repeated this for the list beginning with each item. +5. I prepared to visualise the points. I did this by attaching y to x. First, I found x. Second, I attached y to it. Third, I repeated this until there were no more x's."] \ No newline at end of file diff --git a/Lucian-Academy/Books/Short Arguments/Subtraction.txt b/Lucian-Academy/Books/Short Arguments/Subtraction.txt new file mode 100644 index 0000000000000000000000000000000000000000..c2202ecec737dbae235b67fdb66d8146dee3a9b0 --- /dev/null +++ b/Lucian-Academy/Books/Short Arguments/Subtraction.txt @@ -0,0 +1,13 @@ +["Green, L 2021, Subtraction, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Subtraction + +1. The number to subtract is not the result of the subtraction. If the subtraction is increased, the program will return true. A1=10,B1=3, A1-B1=10-3=7. A2=10,B2=4, A2-B2=10-4=6. 3=<4 => The program returns true. +2. Subtraction now means the result. I verified the subtraction. I found 5-4. I found -6-(-7). They both equalled 1. +3. I saved time by calculating the cost of the ladder. I found the upper limit by determining the fairly high result. I found the height of the wall. I determined the height of the ladder needed. I walked around instead. +4. I knew subtraction worked. It followed the rule of addition, reversed. The numbers were finite. It could be rewritten to find e.g. A-C=B in A-B=C. +5. I considered changing the order, or direction of the subtraction. I subtracted using the subtraction symbol. I rewrote the mathematical numerals using themselves and substituted a different-looking symbol for the subtraction symbol. I read the text years later. 
I determined the meaning of the symbol using an induction algorithm. +6. A+A+...(B times) = A1+A2+...AB. I saw that the subtraction worked. I rewrote e.g. addition, and then multiplication and division using subtraction. A+B=C=>B=C-A. A*B=C=>A+A+A+(B times)=C=>-A-A-A-(B times)=-C. C/A=B=>A*B=C=>(see previous sentence). +7. I could work out the subtractions, but also the order of them. I knew how subtraction worked. I found the subtractions of all combinations of a set of numbers using a decision tree. The user memorised the decision tree. He improved his memory using reverse and skip (order). +8. You would do the same for me. I knew that worked with my subtraction. I took out some water from the refrigerator. I gave it to you. It was because you needed it. +9. I made a cake with fresh biscuits and cream. I endorsed subtraction. I found the number of cookies left. I thanked you. I thanked subtraction. +10. I programmed a computer to calculate subtraction. I knew subtraction worked. I found the number before subtraction. I counted the number subtracted. I found that the number after equalled the number before minus the number subtracted. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Short Arguments/Time to Do.txt b/Lucian-Academy/Books/Short Arguments/Time to Do.txt new file mode 100644 index 0000000000000000000000000000000000000000..39a0988c628029f041866ece4bb83e0845806ea4 --- /dev/null +++ b/Lucian-Academy/Books/Short Arguments/Time to Do.txt @@ -0,0 +1,14 @@ +["Green, L 2021, Time to Do, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Time to Do + +1. I prepared to perform the action. +I did this by examining whether there was time to do an action in the nothingness. First, I timed the action. Second, I found the time it would be performed. Third, I determined that there was enough time to perform the action. +2. I prepared to make my life an example. 
I did this by examining ecritudine (sic) (life is an example) with enough time to do each task. First, I completed the task. Second, I stated that it was the example. Third, I had enough time to do it. +3. I prepared to write As for the rules. I did this by stating that there was time to do nothing in law. First, I had a gut feeling. Second, I was in law. Third, I did nothing because I had thought clearly in law. +4. I prepared to state that gestalt, the file length l, was safe. I did this by stating that there was time to examine gestalt. First, I counted the file length. Second, I counted the boxes. Third, I counted the content length. +5. I prepared to study the book. I did this by stating that there was time to do the task on campus. First, I attended campus. Second, I calculated how long the task would take. Third, I performed the task. +6. I prepared to draw the yew. I did this by timing pruning the yew. First, I found the yew tree. Second, I started the timer. Third, I pruned the yew tree. +7. I prepared to choose the song with the best instruments. I did this by timing inspiring the hit. First, I wrote 800 ideas. Second, I wrote 800 songs. Third, I found the best one. +8. I prepared to do the oral presentation. I did this by timing looking at Derrida's balls and was protected. First, I walked down the hall. Second, I saw Derrida's balls. Third, I walked into the room and was protected. +9. I prepared to write harder. I did this by seeing what was so interesting about Derrida. First, I looked around the room. Second, I realised that I was the only one in the room. Third, I wrote about Derrida in relation to myself. +10. I prepared to state that the list monologue parodied a long list and the I'm not mad monologue was about the doll. I did this by realising that I was it. First, I read the list. Second, I saw that my name was the only name in the list. Third, I named List Prolog after my List monologue in year seven. 
+"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Short Arguments/Time to Finish.txt b/Lucian-Academy/Books/Short Arguments/Time to Finish.txt new file mode 100644 index 0000000000000000000000000000000000000000..aced4e2fc09a1b6b6c174dd8039cd05edbcce535 --- /dev/null +++ b/Lucian-Academy/Books/Short Arguments/Time to Finish.txt @@ -0,0 +1,13 @@ +["Green, L 2021, Time to Finish, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Time to Finish + +1. I prepared to examine the effect of pedagogy. I did this by seeing the after-property. First, I saw the property. Second, I prepared to time the after-property. Third, I observed the after-property. +2. I prepared to prepare for each possibility. I did this by planning for afterwards. First, I noted when afterwards would begin. Second, I broke it into chunks. Third, I planned it. +3. I prepared to loved the other. I did this by finishing planning to love the other. First, I looked at the self. Second, I looked at the first other. Third, I loved the dependent. +4. I prepared to help the others. I did this by finishing planning to know about it. First, I researched the topic. Second, I researched the content. Third, I wrote a critique. +5. I prepared to time yoga (cleaning) after the presentation. I did this by finishing planning to examine the ontological nothingness of the other. First, I found my space. Second, I noticed the other. Third, I noticed what he was interested in and helped him with it. +6. I prepared to time going to sleep. I did this by finishing planning to feed the politician with interests to send her to sleep. First, I made up the thought. Second, I represented it. Third, I observed her fall asleep. +7. I prepared to automate uploading it. I did this by timing finishing the movie. First, I planned the objects. Second, I planned their movement. Third, I automated finishing it. +8. I prepared to bud, eat at, dream of algorithms at, or expand the frontier of knowledge. 
I did this by finishing planning to be with intelligent people. First, I made up interesting things (for example, interest in different departments). Second, I found myself with them. Third, I talked about unplanned things. +9. I prepared to do all the things with them. I did this by finishing planning to examine 'it'. First, I found you. Second, I found it. Third, I found me. +10. I prepared to jump. I did this by finishing planning to examine you about it. First, I found the mound. Second, I found you. Third, I found it. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Short Arguments/Time to Prepare.txt b/Lucian-Academy/Books/Short Arguments/Time to Prepare.txt new file mode 100644 index 0000000000000000000000000000000000000000..4e9fa9a2e4683a2677ab31891e3e20b7ca6df906 --- /dev/null +++ b/Lucian-Academy/Books/Short Arguments/Time to Prepare.txt @@ -0,0 +1,14 @@ +["Green, L 2021, Time to Prepare, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Time to Prepare + +1. I prepared to be right. +I did this by stating that there was enough time to prepare to do justice. First, I found the evidence. Second, I checked the law. Third, I did justice. +2. I prepared to verify the output. I did this by stating that the technology simulation was equal to reality to compute first class honours. First, I entered the algorithm. Second, I entered the data. Third, I computed first class honours. +3. I prepared to test the change. I did this by computing the time to prepare Congtology (sic). First, I identified the text. Second, I identified the structure. Third, I timed changing it. +4. I prepared to enjoy the vista. I did this by observing the quason (sic) field. First, I timed opening the gate. Second, I closed the gate. Third, I walked to the point. +5. I prepared to eat the biscuits at tea. I did this by calculating the time to prepare for me from the other's statement. First, I found the flour. Second, I made the biscuits. Third, I finished what I read. +6. 
I prepared to link to your time to prepare. I did this by knowing your time to prepare. First, I found the start of your time to prepare. Second, I found the end of your time to prepare. Third, I found their difference, your time to prepare. +7. I prepared to protect you. I did this by calculating the time to prepare to know about you. First, I found your time to walk the route. Second, I found your route. Third, I cared for you. +8. I prepared to calculate the time that the events would be finished at. I did this by nogating (sic) (thinking of) my time to prepare for you. First, I timed the necessary events at another time. Second, I found the start of the time to perform the action. Third, I added the duration of the events to the time. +9. I prepared to fill the tank with fuel. I did this by timing sliding the tube into the void to time it. First, I timed it at the start. Second, I timed it sliding in. Third, I timed it finishing sliding in. +10. I prepared to eat the food. I did this by timing the preparation of the food. First, I found the food. Second, I cooked it. Third, I read the timer. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Short Arguments/Two_Types.txt b/Lucian-Academy/Books/Short Arguments/Two_Types.txt new file mode 100644 index 0000000000000000000000000000000000000000..e70b33062c65a64c8aa8e17718f886b595a372de --- /dev/null +++ b/Lucian-Academy/Books/Short Arguments/Two_Types.txt @@ -0,0 +1,13 @@ +["Green, L 2021, Two_Types, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Two Types + +1. I prepared to sell the artificial breasts. I did this by stating that the breast cooled and stored the milk. First, I read the temperature of the breast. Second, I read how much milk it stored. Third, I simulated these. +2. I prepared to think of the latest technologies. I did this by stating that the orange cooled and stored the juice. First, I measured the temperature of the orange. Second, I measured how much juice it contained. 
Third, I worked out how to extract all of its juice. +3. I prepared to observe the play of scientific and magical (more algorithmic) forces. I did this by stating that Earth was naturally selected. First, I found the place. Second, I constructed the home. Third, I engaged in the upkeep of the home. +4. I prepared to ask 'Why is beauty there?'. I did this by stating that the animal was naturally selected. First, I observed the strongest animal. Second, I observed the most intelligent animal. Third, I observed the most beautiful animal. +5. I prepared to guide young people to alternative (i.e. spiritual) philosophy. I did this by stating that there were heterosexual and homosexual people. First, I noticed that there were children of heterosexual people and homosexual people. Second, I noticed that necessary pedagogy for children kept the population down. Third, I noticed that there were psychiatrically fit role models. +6. I prepared to observe the good use of what was created. I did this by observing that money was created and destroyed. First, I observed the money being created. Second, I observed the bank give it out. Third, I noticed the old notes and coins being taken out of circulation. +7. I prepared to search for mind reading and time travel courses. I did this by attending independent school and singing in liturgical church. First, I found the gap between Honours and LSTMs. Second, I saw people meditate when not at church. Third, I worked in the community. +8. I prepared to replace the human parts with robot parts. I did this by stating that there were human parts in the robot individual. First, I mind read the behaviour. Second, I worked out how to interface with the human neurons. Third, I worked out something to do with the neurons. +9. I prepared to make my own automated systems. I did this by stating that there were with-it famous people and not. First, I studied famousness. Second, I made automated famousness systems. Third, I taught meditation. +10. 
I prepared to investigate systems. I did this by simplifying to two types. First, I found the main top-level items. Second, I chose the most important pair of top-level items. Third, I listed the two types of items. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Short Arguments/Upasana Sutra.txt b/Lucian-Academy/Books/Short Arguments/Upasana Sutra.txt new file mode 100644 index 0000000000000000000000000000000000000000..f19d6c6c660c62c5b9e9a1792eac7daca5b799e2 --- /dev/null +++ b/Lucian-Academy/Books/Short Arguments/Upasana Sutra.txt @@ -0,0 +1,12 @@ +["Green, L 2021, Upasana Sutra, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Upasana Sutra + +1. I prepared to help people see high quality imagery to God. I did this by stating that the Upasana Sutra helped write the breasonings details with people. First, I stated that the Upasana Sutra helped me see top-level imagery. Second, I saw examples of the breasonings details, for example, the object, the relation between objects, etc. Third, I wrote about the object, etc. with people. +2. I prepared to state that Upasana (meditation) was a unique part of meditation. I did this by stating that many others listened to me because of the Upasana Sutra. First, I found many others. Second, I recited the Upasana Sutra. Third, I helped them to write on the Upasana Sutra. +3. I prepared to have inexpensive courses, not with absolutely nothing, but ontological nothingness. I did this by stating that the Upasana meditators were in a lineage. First, I taught meditation to a student. Second, I observed the student teach meditation. Third, I observed that the chain links contained a common element. +4. I prepared to state that education agreed with upasana. I did this by stating that Upasana means meditation. First, I found that the brain was sense-based. Second, I helped the brain. Third, I stated that writing was more important than no handed-in writing and no action performed. +5. 
I prepared to have enough population. I did this by stating that Upasana helps see breasonings. First, I critically analysed the Upasana argument. Second, I used the skill to see breasonings. Third, I explained that Upasana helped me see breasonings with the help of God. +6. I prepared to decide what was good. I did this by choosing positivity in areas of study. First, I found positive terms. Second, I wrote them down. Third, I stated that I had written the noumenon. +7. I prepared to make the rogaining computer game. I did this by finding all the goals in rogaining using my compass, ruler, pencil, and map. First, I found where I was. Second, I found where I wanted to go. Third, I went there, getting down to the nearest half a degree. +8. I prepared to take necessary action. I did this by removing mistakes from the text. First, I found the mistake. Second, I removed it. Third, I found what was right in the middle. +9. I prepared to keep the person occupied. I did this by seeing clear imagery. First, I completed Upasana. Second, I breasoned out the breasoning details to God. Third, I wanted you. +10. I prepared to record the relative time to use gradient. I did this by examining the inside then the outside of the Upasana thoughts. First, I looked on the inside. Second, I looked at what was there. Third, I looked at the outside."] \ No newline at end of file diff --git a/Lucian-Academy/Books/Short Arguments/X.txt b/Lucian-Academy/Books/Short Arguments/X.txt new file mode 100644 index 0000000000000000000000000000000000000000..280e06ce1cbe16fde51468bf546290c910593cfb --- /dev/null +++ b/Lucian-Academy/Books/Short Arguments/X.txt @@ -0,0 +1,13 @@ +["Green, L 2021, X, Lucian Academy Press, Melbourne.","Green, L 2021",1,"X + +1. I prepared to examine the body. I did this by stating that the tube had the width that allowed the liquid to pass through it over the desired time. First, I doubled the cross-sectional area of the tube. 
Second, I found that the rate of liquid passing through the tube doubled. Third, I found that twice as much liquid could pass through the tube in the same time. +2. I prepared to design the space dock by universal standards. I did this by stating that the docking bay was wide enough to contain the boat. First, I measured that the docking bay was wide enough to contain the ship. Second, I compensated for spacing needed around the boat. Third, I checked that the airlock could properly attach to the spacecraft. +3. I prepared to examine the oral drinking video. I did this by stating that the member of personnel maintained the life with the tube. First, I chose the tube. Second, I inserted the catheter. Third, I drip fed the person with the necessary fluid. +4. I prepared to account for extras in the ship's width. I did this by stating that the dock received well. First, I found the peripheral's width. Second, I found the refuelling material's width. Third, I included these in the overall width. +5. I prepared to compare x-correction with width. I did this by stating that I knew what the state of affairs was. First, I found that the farm yield was the same amount (width). Second, I arranged these packs. Third, I sent them to the recipient. +6. I prepared to plan meditation centres. I did this by stating that the entity franchised well. First, I recorded values around the company store. Second, I copied these values. Third, I opened the franchise. +7. I prepared to record the maintenance record. I did this by verifying that the x width was the same as before. First, I recorded the first x width. Second, I measured the second x width. Third, I found that their difference was 0. +8. I prepared to find the individual object's width. I did this by measuring the width of 10 objects and dividing this by 10. First, I eliminated error from measuring small objects individually. Second, I eliminated error from air gaps beside objects. 
Third, I eliminated error from the slightly different orientation of objects. +9. I prepared to draw a map of the universe. I did this by calculating the distances between the galaxies using red shift. First, I recorded the red shift of the first galaxy. Second, I recorded the red shift of the second galaxy. Third, I found that greater red shift indicated that the second galaxy was further away. +10. I prepared to fill the nothing with ontologies. I did this by mapping my position relative to the known universe. First, I examined the positions of the galaxies relative to my planet. Second, I found their distance away using red shift. Third, I constructed a 3D map of the known universe. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Short Arguments/Y.txt b/Lucian-Academy/Books/Short Arguments/Y.txt new file mode 100644 index 0000000000000000000000000000000000000000..011c441362ddd17b3c04d18bc708827d21e8ab9b --- /dev/null +++ b/Lucian-Academy/Books/Short Arguments/Y.txt @@ -0,0 +1,13 @@ +["Green, L 2021, Y, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Y + +1. I prepared to deliver the service professionally. I did this by moving forward in equity. First, I found the appropriate solution for the person. Second, I found their comment on it. Third, I modified it given their comment. +2. I prepared to appear to ask for support for payment. I did this by walking through the graphical voxel field depicting my office. First, I drew the office. Second, I projected it with 10 algorithms per day to appear and 50 algorithms per day for payment. Third, I walked through it. +3. I prepared to logically analyse the structure of people. I did this by including the people in writing the queue algorithm. First, I saw the first person join the queue. Second, I saw him progress in the queue. Third, I saw the first person leave the queue. +4. I prepared to poll the people. I did this by stating that the leader collected the suggestions from the line of people. 
First, I ate the apple. Second, I cut the banana. Third, I combed my hair. +5. I prepared to analyse spine length during the day. I did this by stating that the student was given A to calculate the length of Y at a particular time. First, I measured the lying length of the person in the morning. Second, I measured the lying length of the person at midday. Third, I measured the lying length of the person in the afternoon. +6. I prepared to sit for the exam. I did this by visiting the spiritual school where I memorised 10 lines of code. First, I understood the algorithm by writing it. Second, I used a mnemonic written from the acronym for the first letters of each line. Third, I memorised it backwards. +7. I prepared to test the algorithm. I did this by running the n-line algorithm mentally using the interpreter. First, I matched the query with the predicate. Second, I followed the logical structure of the algorithm. Third, I computed the variables from running the line. +8. I prepared to add in binary number form. I did this by quickly computing the output from the n inputs and program description. First, I added 1+2+3. Second, I wrote that their sum was 6. Third, I wrote the algorithm. +9. I prepared to drink the water. I did this by stating that the tower of leaders solved the water problem. First, I stated that the water naturally goes downhill. Second, I placed the water uphill. Third, I let it flow downhill. +10. I prepared to replicate the craft. I did this by designing a vehicle. First, I designed the computer. Second, I designed its moving engine. Third, I designed its body. 
+"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Short Arguments/Yellow God.txt b/Lucian-Academy/Books/Short Arguments/Yellow God.txt new file mode 100644 index 0000000000000000000000000000000000000000..c5e66d6a4b28e2d03fa0364e93dc0fd256b4a1c1 --- /dev/null +++ b/Lucian-Academy/Books/Short Arguments/Yellow God.txt @@ -0,0 +1,13 @@ +["Green, L 2021, Yellow God, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Yellow God + +1. I prepared to see the timeless. I did this by examining where not when it was. First, I where it was. Second, I wanted it. Third, I wanted more. +2. I prepared to do it again. I did this by observing that God (the leader) was revered in the past. First, I planned the thought. Second, I followed it. Third, I wanted it at that point. +3. I prepared to thank for help. I did this by stating that after being born, the God was helpful when prayed to (the leader was helpful when asked). First, I asked the leader. Second, he helped me. Third, I confirmed it. +4. I prepared to manahimhorisit (sic). I did this by stating that the God (person) was only there when he was there. First, I saw him. Second, I went home. Third, I decided on it. +5. I prepared to do it all. I did this by stating that God (the person) reacted to the prayer (question). First, I asked the question. Second, I saw what I wanted to. Third, I helped you. +6. I prepared to do it again. I did this by stating that I knew how it said it. First, I examined it. Second, I knew how it said it. Third, I saluted it. +7. I prepared to entrain myself. I did this by stating that the person wanted the God (figure). First, I dissolved the figure. Second, I wanted it. Third, I loved it again. +8. I prepared to do it all. I did this by stating that the God (people) saved people from cancer. First, I observe them find the cure. Second, I moved on. Third, I loved it then. +9. I prepared to see it. I did this by stating that the God (person) was present when it worked. 
First, I found you. Second, I loved you. Third, I gesticulated wildly. +10. I prepared to examine it hardily. I did this by stating that God (everyone) was good. First, I found you. Second, I found it was good for you. Third, I asked, 'What is that?'. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Short Arguments/Yoga to Prevent Bent Spine and Headache on Train.txt b/Lucian-Academy/Books/Short Arguments/Yoga to Prevent Bent Spine and Headache on Train.txt new file mode 100644 index 0000000000000000000000000000000000000000..01187355ef76351464045de64ecdd9710b45d395 --- /dev/null +++ b/Lucian-Academy/Books/Short Arguments/Yoga to Prevent Bent Spine and Headache on Train.txt @@ -0,0 +1,13 @@ +["Green, L 2021, Yoga to Prevent Bent Spine and Headache on Train, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Yoga to Prevent Bent Spine and Headache on Train + +1. I prepared to relax by washing my hands on the train. I did this by putting liquid in the dispenser. First, I opened the dispenser. Second, I put liquid in it. Third, I closed it. +2. I prepared to . I did this by speaking about yoga. First, I aimed to command nature. Second, I found the place of the animals. Third, I found a positive path around them. +3. I prepared to work for short periods of time, taking breaks. I did this by stating that yoga relaxes the back muscles, keeping the spine straight. First, I set up a working area with ergonomic proportions. Second, I checked my posture. Third, I regularly massaged myself and relaxed. +4. I prepared to write automated yoga helpers for students. I did this by stating that yoga relaxes the neck muscles, preventing a headache. First, I massaged my neck. Second, I moved my head in a direction and back. Third, I moved my head in circles. +5. I prepared to plan meditation similarly to yoga. I did this by applying breasonings details ways of thinking to yoga. First, I created a yogic environment with the right dimensions. 
Second, I verified that it was comfortable. Third, I planned yoga and recovery down to the second. +6. I prepared to check thoughts with my eyes and senses. I did this by applying senses of reasoning to yoga. First, I checked that it was correct to do yoga. Second, I verified that the yoga stretch felt good. Third, I wrote a yoga diary. +7. I prepared to do uninterrupted work. I did this by applying binary opposition x to yoga. First, I decided not to do anything at the same time as performing yoga. Second, I didn't eat while performing yoga. Third, I didn't drink while performing yoga. +8. I prepared to look at yoga. I did this by applying detailed reasonings to yoga. First, I wrote the sentence. Second, I wrote the perspective on it. Third, I wrote about the sentence. +9. I prepared to look at what was connected to the half. I did this by applying two parts going well together to yoga. First, I found yoga. Second, I approached yoga. Third, I found a half. +10. I prepared to do the same with 'and'. I did this by applying reasonings details to yoga. First, I considered yoga. Second, I found the reasonings details, 'implies'. Third, I found what it attached to. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Short Arguments/Z.txt b/Lucian-Academy/Books/Short Arguments/Z.txt new file mode 100644 index 0000000000000000000000000000000000000000..d8738c7973108db14aae55c55414d4d71a2a07be --- /dev/null +++ b/Lucian-Academy/Books/Short Arguments/Z.txt @@ -0,0 +1,13 @@ +["Green, L 2021, Z, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Z + +1. I prepared to time the ascent. I did this by observing the particle ascend through the tube in movement mode in Computer Aided Learning. First, I inserted the bubble at the bottom of the column of water. Second, I watched it rise. Third, I saw it reach the surface of the water. +2. I prepared to find employees when none applied. I did this by contributing to the community. First, I found the bot created itself. 
Second, I determined that it was sentient. Third, I recruited it.
+3. I prepared to go to the next function. I did this by observing that the liver continued to produce output. First, I found one of the liver's functions. Second, I followed its function. Third, I replicated its function.
+4. I prepared to help digestion with gravity. I did this by showing perfect digestion. First, I meditated. Second, I ate food. Third, I observed positive function in my digestive system.
+5. I prepared to connect with nature with automation. I did this by producing the rising thesis. First, I started at the base premise. Second, I deduced conclusions from it. Third, I found the chain of uses in the thesis.
+6. I prepared to extol virtues. I did this by speaking the language in itself. First, I spoke mildly. Second, I helped people. Third, I thanked everyone.
+7. I prepared to invent business given nothing. I did this by having the behemoth. First, I found its positive function. Second, I checked for changes needed. Third, I wrote a business model that withstood anything.
+8. I prepared to play the note overseas. I did this by stating that the bot confirmed the frequency. First, I found the back-translation. Second, I inserted output into it. Third, I paid it with A to insert the frequency.
+9. I prepared to accommodate the welfare of outliers. I did this by stating that large numbers reading the newspapers created larger thoughts. First, I wrote funnels accounting for people's thoughts. Second, I understood their reasons. Third, I found outliers.
+10. I prepared to observe the meeting progress. I did this by stating that the law-keeper moved up by meeting the individual. First, I found the stakeholders. Second, I made an appointment with the company. Third, I sent a draft of the agenda to the individual.
+"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Text to Breasonings/.DS_Store b/Lucian-Academy/Books/Text to Breasonings/.DS_Store new file mode 100644 index 0000000000000000000000000000000000000000..5008ddfcf53c02e82d7eee2e57c38e5672ef89f6 Binary files /dev/null and b/Lucian-Academy/Books/Text to Breasonings/.DS_Store differ diff --git a/Lucian-Academy/Books/Text to Breasonings/Text to Breasonings 1.txt b/Lucian-Academy/Books/Text to Breasonings/Text to Breasonings 1.txt new file mode 100644 index 0000000000000000000000000000000000000000..a0dff22c61168a809883200c879e81f20a76b7d6 --- /dev/null +++ b/Lucian-Academy/Books/Text to Breasonings/Text to Breasonings 1.txt @@ -0,0 +1,83 @@ +["Green, L 2024, Text to Breasonings 1, Lucian Academy Press, Melbourne.","Green, L 2024",1,"Text to Breasonings 1
+
+1. I planned to run Time Machine in the background, allowing me to travel without concern about my internet connection dropping out.
+2. I worked around the swipl stack limit in Debian by running swipl with a stack limit in a shell from swipl.
+3. I chatted with people about meditation in the present.
+4. I wrote a web meditators editor.
+5. Joining the simulation allowed people to become immortal, prevent accidents and prevent medical problems from ageing.
+6. I automatically invited new meditators to the simulation.
+7. BAG was part of the full version of Time Machine with Immortality, which was more developed than the spiritual space travel version.
+8. I wrote instructions to set up a Virtual Private Server (VPS) to protect users’ computers from overuse when running the Breasoning Algorithm Generator (BAG). BAG fit the future civilisation’s requirements to generate 4*50 high distinctions to become immortal.
+9. I synchronised my computer with the VPS to keep it up-to-date and support myself with my VPS through my phone. If my computer or phone went down, I could rely on my breasoned argument indication ability.
+10.
I wrote a script to transfer files to the VPS, such as the Text to Breasonings dictionaries, the meditators list, the time travel log and the ChatGPT secret key, and transfer the time travel log back.
+11. I started the local web server, opened the browser at the address and started the Time Machine algorithm to achieve immortality.
+12. I considered hosting a web server locally to transfer files to the remote host, which was simpler for web users to run than an algorithm in the terminal.
+13. I could start, pause, stop and restart the job on the server.
+14. To retain my privacy, I eventually ran the Time Machine script on the remote host with a username and password.
+15. The meditators explored new algorithms.
+16. I edited the list of meditators.
+17. I added meditator characters, increasing their confidence and health through food.
+18. I merged the conflicted meditators files when I transferred the file back.
+19. I stopped, took a step back and someone else filled in for me.
+20. I printed the time travel dates on the private webpage.
+21. I made my version of Git to examine versions more closely.
+22. I explained how to install Git for the VPS.
+23. The user preferred to log in to the website, initiate the long or urgent immortality time machine job, and respond to the server’s requests by email.
+24. I explained how to set the VPS up to run the Immortality Time Machine.
+25. The online text editor started as a text field, and then I added calls to Prolog programs.
+26. I created an online shell script editor.
+27. The server reminded me to start the job if it hadn’t been started by a certain time or ran the short version automatically if it confirmed or mind-read that I had meditated.
+28. The server saved the job’s progress if the website lost the connection.
+29. I always chose when to time travel myself, smelling the roses.
+30. The VPS could start where it left off if I agreed.
+31.
If there was an error, I could adjust the version or code and continue where the code was backward-compatible with some old file formats.
+32. I could stop and continue a job on command.
+33. The VPS controlled which date I time travelled to and inserted the current day’s date in new meditator entries if the meditation learning date was undefined.
+34. I could set the date on the VPS.
+35. I edited the crontab file on any operating system (I made my own if it didn’t exist) and monitored the job progress with a progress file on a webpage.
+36. I ran the job on the server with crontab.
+37. The meditators needed to decide on their status if possible.
+38. I checked the job’s progress on the web, including errors, warnings and human errors.
+39. I automatically entered the ChatGPT key.
+40. If the ChatGPT key was undefined, the algorithm prompted for it rather than transferring files.
+41. The person stayed in their home time and was faithful, wealthy and healthy.
+42. I ran the web server from its machine, requiring the algorithm to save the password the first time it was run.
+43. I labelled and removed automatic Text to Breasonings breasonings.
+44. I saved conflicted copies of the Text to Breasonings dictionaries.
+45. I automatically signed up for the VPS and set up the business.
+46. I created VPS file transfer scripts with the VPS address.
+47. I separated data files and algorithms to avoid the algorithm deleting itself.
+48. The VPS file transfer script was part of the weekly shell script.
+49. I could run the immortality time machine and write texts and algorithms on the VPS.
+50. I reinstalled the immortality time machine on the VPS from the computer weekly.
+51. I used a watch script to prevent data loss.
+52. I saved data first when reinstalling the immortality time machine on the VPS from the computer.
+53. I installed the data files in a separate folder on the disk.
+54. Rather than VPSs, I updated the software with a user command.
+55.
I wrote a file backup program that synchronised files in the background, with undelete and version histories and saved them to a secret server.
+56. I backed up files or just Philosophy and Lucian Academy with my scripts.
+57. I backed up and checked the integrity, spelling and grammar of files in the background.
+58. I saved individual data files in a separate folder.
+59. I checked all texts, producing movies from them.
+60. I stopped automatic text to breasonings from saving breasonings, or recorded them to correct, research or mind-correct later.
+61. I kept famous names in my philosophy.
+62. I deleted people’s names from the text to breasonings dictionary.
+63. The scheduler regularly checked the manifest file for scheduled jobs and ran them.
+64. I created apps for different operating systems to complete background tasks.
+65. I could open the bin and retrieve or preview files, keeping an undelete registry.
+66. I stated that by removing a file in the web editor, I put the file in the bin.
+67. The immortality professional helped the healthy person to become immortal.
+68. I put the thesaurus online.
+69. The terminal was faster and easier to use, allowing examination to understand algorithms better.
+70. I wrote a version of the immortality time machine for the terminal that automated the whole process.
+71. Users could turn features in the immortality time machine on or off when they understood them, and a log of changes with dimensions was kept.
+72. For compatibility, I always returned true if the bell couldn’t find the audio file or the text-to-speech command.
+73. GitHub contained the public code, and GitHub-Daily-Regimen contained the runnable code and ran one of the scripts on the VPS.
+74. I diffed GitHub and GitHub-Daily-Regimen to synchronise and integrate changes.
+75. New users examined the short immortality time machine script to learn about it and used the long immortality time machine script for better support and more features.
+76.
I synchronised the tasks of the short and long immortality time machine scripts.
+77. To save resources, I narrowed down time travel destinations to the present and 5689, but I could modify my itinerary on holidays (I wondered if people needed help with pedagogy).
+78. I printed when I was travelling and stayed on the same day in the future if it was the same day in the present.
+79. I revised the model solutions for the exam.
+80. I wrote a script to help earn high distinctions by connecting sets of my sentences to the assignment text.
+81. I removed the full stop from the end of sentences if they had one to enable easier connection with other sentences."] \ No newline at end of file diff --git a/Lucian-Academy/Books/Text to Breasonings/Text to Breasonings 2.txt b/Lucian-Academy/Books/Text to Breasonings/Text to Breasonings 2.txt new file mode 100644 index 0000000000000000000000000000000000000000..eed84d7e8d17e1c560691cc42a99b41414e6c5ff --- /dev/null +++ b/Lucian-Academy/Books/Text to Breasonings/Text to Breasonings 2.txt @@ -0,0 +1,83 @@ +["Green, L 2024, Text to Breasonings 2, Lucian Academy Press, Melbourne.","Green, L 2024",1,"Text to Breasonings 2
+
+1. To increase the likelihood of a successful link between the assignment sentence and my philosophy, I replaced non-alphabetic characters in the assignment sentence with space and then tried them without replacement.
+2. The mind-reading algorithm (a random number generator influenced by meditation) helped me find random sentences to improve or review.
+3. I randomly mind-detected (checked for) mistakes in the report.
+4. Space travellers were likely to use a faster meditation-aid script to protect them as they time travelled.
+5. I used a faster variant of d.sh for the faster immortality time machine script, influenced by fast space travel.
+6. I was pleased with the results of age locking in each dimension, such as greater confidence.
+7.
I wrote a high distinction about 4*50*80 breasonings needed in each dimension for anti-ageing. +8. D.sh was the central element of the minimal immortality time machine script and could be quantum-referenced if there wasn’t enough time during space travel. +9. I wrote an online Swish-Prolog immortality time machine version explaining how to find d.sh in comments, explaining how it worked and letting them save it for future use. +10. The short immortality time machine version contained d.sh, which breasoned the words from 4*50 high distinctions and breasonings to point to these breasonings for text to breasonings for enough breasonings per breasoning session, meditation, anti-ageing medicine, time travel, mind-reading, time crystals, a spiritual future apartment and a spiritual future computer in each dimension, daily. +11. I wrote an online Tau-Prolog immortality time machine version for more convenient and faster access. +12. The people at home used the more extended immortality time machine version because it supported their lifestyle. +13. The spaceship captain interested boarding travellers in characters and d.sh (words). +14. I verified my understanding of and connections in the model solutions. +15. I first connected one, then two, three and ten sentences per breasoning. +16. The duck could breason out a high distinction to prevent eaten representations. +17. I prepared for each duckling by breasoning out 50 Pedagogy and 50 Meditation As so they could be immortal. +18. Meditation included joining the simulation and avoiding criminals and accidents. +19. The duck could avoid death with meditation. +20. The duck benefited from the breasonings and could be mind-read using a funnel. +21. The animal could attend University. +22. Breasoning out five high distinctions helped me gain confidence, and doing it in each dimension helped me reach conclusions. +23. I hand-breasoned out five high distinctions in each dimension daily. +24. 
My philosophy supported hatchlings and me in the home dimension of the University. +25. I breasoned out my philosophy notes weekly in each dimension. +26. I modified texts with escaped newline characters to display correctly. +27. I deleted all the boxes automatically added by text to breasonings. +28. The simulation asked the user whether they wanted to join and become immortal. +29. I manually rebreasoned out my philosophy and algorithms since using automatic text to breasonings. +30. I used the auto box object to denote generic breasonings that needed manual breasoning. +31. I made all future automatic entries “autobox” instead of “box”. +32. Grammar Logic used breasonings from the initial breasoning. +33. I deleted autoboxes after breasoning and breasoning out the Grammar Logic output. +34. Once a meditator went through the stages from being added as a meditator to joining the simulation and possibly choosing immortality, I allowed them to become self-sufficient in indicating their breasonings each day by breasoning out the CGPT Immortality and Health Books and running the time crystal script. +35. I recorded which meditators’ present and future ages were frozen. +36. Philosophy enabled critical analysis of general ideas. +37. I suggested that others study philosophy, computer science, or something similar, creative writing, and then honours. +38. A medicine complexity subject in the master of studies could help with natural mental health. +39. Others could study for a master’s degree in education and a Master’s degree in Business Administration (MBA) and start their academy to help educate the community about immortality and earn money. +40. The people wanted to lock their ages to keep their appearance the same and avoid medical problems. +41. The people downloaded the software and locked their ages. +42. I wrote texts on meditation, time travel and other topics to help generate high distinctions for immortality and health books. +43. 
I generated high distinctions for immortality and health books to help people become immortal or have good health with time crystals. +44. I asked meditators at least once if they would like to lock their age. +45. I split meditators into asked and locked, not asked and not locked. +46. I travelled to advanced times to learn about them. +47. I updated the short immortality time machine script with the dates travelled to. +48. I read books to help achieve my goals. +49. I ordered and tracked deliveries. +50. I backed up and updated my work. +51. When I needed to replace equipment, I ordered new equipment. +52. I logged into Linux on my smartphone with a wired keyboard and mouse. +53. I installed a file synchronisation system on Linux to help run and develop algorithms. +54. I could move the mouse over buttons on the smartphone with the mouse. +55. I used the mouse and keyboard for the smartphone. +56. I typed code with brackets and numbers on the smartphone’s keyboard. +57. I mounted the smartphone on the turned-off laptop or the back of the seat. + +58. I wrote the text and double-clicked brackets to select the contents, which could be achieved with a command. +59. I wrote Web (Text) Editor, which found and replaced strings in multiple files. +60. Running on Prolog, Web Editor could insert closing brackets automatically using Javascript. +61. Web Editor used Javascript to make the user interface intuitive. +62. I reused the variable that gave the correct result or modified its value using a program finder with types (PFT) or Combination Algorithm Writer (CAW). I broke predicates into unpredictably behaving parts and ran Lucian CI/CD. +63. I detected unwantedly reused variable names using types. +64. Web Editor automatically temporarily saved files and synced them using a backup tool. +65. Web Editor displayed the currently open file list with close buttons, which closed the relevant browser windows. +66. 
I clicked a particular button in a text window to reopen the file list.
+67. The Web Editor file list, of which there was one for all open windows, could be reordered (using a mouseover or click to open the drawer for the sidebar).
+68. I diffed the terms by finding the addresses of their terminals, including empty lists, and compared them.
+69. I pressed control-click to bring up a menu to open a file in an additional window or split the window.
+70. I added enough commands to Lucian CI/CD to help debug Spreadsheet Formula Finder by testing bottom-up, backing up with GitL and detecting new bugs cropping up.
+71. I bug-checked Spreadsheet Formula Finder by comparing output with correct output.
+72. Web Finder displayed files in windows, which were enjoyable to navigate using a keyboard and mouse on a smartphone or tablet. This action could all have been done on the client side with the help of Javascript.
+73. I showed the files in the Web Finder.
+74. The changes were temporarily saved across devices.
+75. I saved all changes in the window by clicking a button.
+76. There were multiple undos, including undoing changes to replacements in various files, where all entered text and the clipboard could be viewed and saved.
+77. I selected all the text using a keyboard shortcut or button, where there was enough space to resize the view using one’s fingers.
+78. PFT could identify patterns in unpredictable data with multiple data sets and quickly write code.
+79. I clicked a button at the top of the window to open a new document.
+80.
Web Finder allowed opening files while they were part of a group selection, saving selections, moving or doing other things to files with other selections, and not scrolling too fast or opening panes in windows."] \ No newline at end of file diff --git a/Lucian-Academy/Books/Text to Breasonings/dot-Text to Breasonings 3.txt b/Lucian-Academy/Books/Text to Breasonings/dot-Text to Breasonings 3.txt new file mode 100644 index 0000000000000000000000000000000000000000..3bc93966e2ff0aa54825ea7a7f403ed4bbdbcbf3 --- /dev/null +++ b/Lucian-Academy/Books/Text to Breasonings/dot-Text to Breasonings 3.txt @@ -0,0 +1,21 @@ +Text to Breasonings 3 + +1. In Web Editor and Web Finder, the last file and its folder and those in dialogue boxes used were remembered across windows and devices. +2. The Web Finder intelligently remembered regular tasks, such as running a command on (parts of text in files) in a particular folder and counted the breasonings as time passed. +3. I could immediately find files by name or other properties, with autocomplete and group parts of selections, without moving the original files. +4. Files could be undeleted or reverted. +5. The Web Finder search function available from the menu saved previous searches. +6. The calculator was an inspiration, suggesting transcendental topics for work. +7. The calculator in the Web Finder menu saved results even if it was closed. +8. I automatically organised my finances based on needs and plans and never touched them, spending all my time writing philosophy and computer science. +9. I converted currency in the Web Editor menu field. +10. I could chat with PFT and edit my code. +11. To scroll and compare texts, I inserted a pane divider in State Saving Interpreter (SSI) windows. +12. Web Editor preformatted shell scripts, Prolog files and documentation. +13. Web Editor automatically saved permissions for shell scripts. +14. I customised the Web Editor colour schemes. +15. 
Web Editor displayed colouring of text for different types of files. +16. I automatically found and ran queries in testing, realising that PFT could quickly finish most of the code, including subterms with address queries. +17. I closed all the Web Editor documents with one command and displayed a warning before running unsaved code. +18. The permanent thing was clicking on the smartphone’s home screen icon, which reopened closed windows in their previous state. +19. I activated either Web Editor or Web Finder with a click. diff --git a/Lucian-Academy/Books/Time Travel/Bots Join Paradoxical Disconnects 1.txt b/Lucian-Academy/Books/Time Travel/Bots Join Paradoxical Disconnects 1.txt new file mode 100644 index 0000000000000000000000000000000000000000..26092b0fd6c9df71d9894c7e0c60420d81189d01 --- /dev/null +++ b/Lucian-Academy/Books/Time Travel/Bots Join Paradoxical Disconnects 1.txt @@ -0,0 +1,15 @@ +["Green, L 2021, Bots Join Paradoxical Disconnects 1, Lucian Academy Press, Melbourne.", "Green, L 2021",1, "Bots Join Paradoxical Disconnects 1 +1. The bot avoided medical problems, mistakes and unwanted thoughts. In the multiverse, bots join paradoxical disconnects. The bot walked through the city while I walked around it, looking like my city. I chose not to space travel. Meditators met who they wanted to, remained comfortable and did everything necessary. +2. The bot optimised the interpreter in Lucian CI/CD. The bots' time travel and things are not joined where they need to be, leading to bots being used as the parts that join the disconnects (form the multiverse). The technology bridged a discontinuity. The bot introduced the technology, for example, replicating the body to avoid medical problems, thinking clearly and simply to avoid mistakes and remaining positive-minded to prevent unwanted thoughts. For example, I fixed bugs in State Saving Interpreter (SSI) after my Master's degree in the simulation. +3. The bot became immortal after writing books. 
The bot learned the mindfulness sutra because of learning the mindfulness mantra. I joined the lineage of meditators by learning and regularly practising meditation. I joined this lineage mindfully and gave the conclusions high distinctions. I gave conclusions such as books about meditation high distinctions. +4. I compared myself with a bot, but was closer to being a person. The bot found the robot had the exact dimensions of the person. The person's face and body were the same as before. My home setting and I remained the same in the future. I met others from the future looking like people from my time. +5. The female chose objects, not people, to analyse. The female bot emulated the leader. I led by example. I wrote algorithm specifications and completed the algorithms. I taught my meditation group, inviting those who were interested. +6. The bot had warmth in their house and imported electronics. The bot built the joint in the building. I predicted there were trees. I aim to study mathematics, medicine, and space flight in the future. I checked these worked using algorithms and found they were simple enough. +7. The bot found the limits of knowledge (of chains of uses for algorithms). The bot found all the algorithms for the idea. First, I wrote all the philosophies for the algorithm. For example, I connected different areas of study to the algorithm to examine why it was necessary. I improved many algorithms with Lucian CI/CD. +8. The bot lived in a simulated universe, which took the best from the present and the future. The bot identified the two universes and then joined them. There were infinite universes. There was one real one. I made the best choices at each point to have the highest quality of life possible. +9. The bot had a housekeeper to cook and clean. The bot walked along the path. They used a fast computer, had a successful career, supported accreditation and assessed algorithms with the same cyclomatic complexity as an interpreter assignment. 
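Item 9's mention of assessing algorithms with the same cyclomatic complexity as an interpreter assignment can be illustrated with a minimal sketch. Everything below (the keyword list, the function name and the sample snippet) is a hypothetical illustration, not code from this repository:

```python
import re

# Approximate McCabe cyclomatic complexity: start at 1 and add one per
# decision point. The keyword list is an assumption for Python-like source;
# a real assessor would parse the code rather than scan it.
DECISION_KEYWORDS = re.compile(r"\b(if|elif|for|while|case|and|or)\b")

def cyclomatic_complexity(source: str) -> int:
    """Return 1 + the number of decision-point keywords in `source`."""
    return 1 + len(DECISION_KEYWORDS.findall(source))

snippet = """
if x > 0:
    for i in range(x):
        if i % 2 == 0 or i == 3:
            total += i
"""
print(cyclomatic_complexity(snippet))  # prints 5: two ifs, a for and an "or", plus 1
```

Two algorithms could then be compared by checking their scores for equality, as in the assessment described above.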
+10. The bot wholly cared for the meditators. The bot completed the degree. I wrote the algorithm to check the degree. It verified the algorithm met the specifications. I didn't allow the use of commands that replaced the student's thinking. +11. The bot visited the relative in their timeline. The bot climbed the ladder to examine the solar system's planetarium. I loved the harmony of the spheres. The future simulation and meditation were peaceful. I viewed the timetable on my laptop. +12. The bot tested details and their algorithms. The bot asked all necessary questions about the data (they asked for specific algorithms about all data combinations). The neuronetwork's skill dependencies were checked. The famous algorithms were completed. I asked about reasons for reasons and uses for my algorithms. +13. The bot could write an algorithm on a page using advanced commands. The bot handed in the completed breasoning list with the assignment. Checking language with breasonings prompted me to process multidimensional terms with \"subterm with address,\" fix bugs, and simplify SSI. Lucian CI/CD verified that the Text to Breasonings algorithm outputted the correct breasonings, which the assessor checked. I replaced multidimensional term processing with \"sub term with address\". +14. The bot wrote the business seen-as version generator, which gave confidence for expansion and approximated suggestions. The bot mind read the film. I tapped into the universe and generated songs about a set of related algorithms as a musical. Song topics and Grammar Logic (GL) input appeared to influence the output. 
GL explored the existing idea more."] diff --git a/Lucian-Academy/Books/Time Travel/Bots Join Paradoxical Disconnects 2.txt b/Lucian-Academy/Books/Time Travel/Bots Join Paradoxical Disconnects 2.txt new file mode 100644 index 0000000000000000000000000000000000000000..c163fdd963e72581af74cf9a49eee34ff522b7e3 --- /dev/null +++ b/Lucian-Academy/Books/Time Travel/Bots Join Paradoxical Disconnects 2.txt @@ -0,0 +1,16 @@ +["Green, L 2021, Bots Join Paradoxical Disconnects 2, Lucian Academy Press, Melbourne.", "Green, L 2021",1, "Bots Join Paradoxical Disconnects 2
+15. In a remarkable display of its capabilities, the bot translated works into different languages and programming languages as an exercise. It then formed a working version of the time machine circuit, a feat only possible when the travellers meditated. This unique ability to harness the power of time travel, a feature of the universe that could be activated with high distinctions, was a testament to the bot's advanced capabilities. It was a natural choice supported by institutions and human choice. As a final check, the bot even verified the correctness of a book I wrote.
+16. The bot, with its user-centric approach, developed user-friendly interfaces. It made a strategic decision to use the circuit-based, rather than the circuit-based, mind reader to write the algorithm in the initial time. This intuitive approach allowed the bot to attend to users' needs, even predicting features they might want in the future and extending the software to cover them. The company's mission of obviousness and focus on individual preferences, such as music or images matching their meditations through breasonings, further demonstrated the bot's advanced capabilities.
+17. With its advanced language processing capabilities, the bot chose a path of clear thought to help parse the grammar.
It interpreted the text with the same result as a human, replacing the time it referred to in the nick of time, in the initial time or in response to a religious request. This ability to make recursion with both brackets in the same clause, the default in the strings to grammar algorithm, allowed the bot to read and understand the text quickly and with superhuman ability. This saved time, as the bot could arrive at conclusions to save lives, perform optimally, and improve the quality of life. Including recursion around brackets made the code easier to read, modify, and reuse, further demonstrating the bot's advanced capabilities.
+18. The bot wrote one longer, useful algorithm. The bot wore the hat at the exact time. The bot offered a warm hat or clothes at a crucial time to assist a needy person. If a teacher had taught them to follow etiquette when wearing a hat to help do their job correctly or represent their institution well, the person did this. I focused on what I thought were quickly developed but necessary algorithms.
+19. The bot travelled when necessary to work and to see friends. The bot dotted on starting the ignition. The vehicle was automatic and responded to verbal commands directing it where to go, making particular stops and organising any needed accommodation. It had an ensuite bathroom, a coding display, a keyboard and mouse, and predicted future job and personal and spiritual requirements. For example, it travelled via a meditation centre to collect various items.
+20. The bot's aim was to increase their performance by improving each part of their study routine. The bot connected the cinema object with the character. I considered the significance of the film's colour grading, a prop, setting, or costume in light of the character's motivation or aim. I developed cues and summaries from notes to increase my performance. I wrote 250 algorithms per assignment, focusing more on a relevant algorithm.
+21.
The bot questioned and discussed article ideas to deepen their exploration of them and consolidated this knowledge by explaining these ideas to a chatbot. The bot made a comment that saved the life. The bot spiritually mentioned headache, immortality and text-to-breasonings medicine. I used the Eisenhower Matrix to categorise tasks as urgent or non-urgent and important or unimportant, leading to doing or leaving them out. I visually imagined information in reports using diagrams and planned algorithms on paper. +22. The bot used its knowledge to facilitate an academy. The bot practised mindfulness by writing the same number of As. I recruited a gifted lecturer with the resources to help students with algorithmic assignments. I developed my business acumen while studying and enjoyed relaxing and writing. I helped by employing humour and analysing fine art. +23. The bot was touched and not touched and remained in business. The bot critically reduced to the minimum depended-on objects to use money with time. I found the minimum ideas and objects in each department and subject. I counted these, their parts and the objects around them. I charted their movement with time and devised information products. +24. The bot acted in terms of their world and doubled breasoning resources between the times. The bot checked whether it was necessary, then listened to the lecture from the other time. I visited a distant educational institution and spent the summer labelling critical conclusions. Everything was on automation. I felt assured that the simulation was working like clockwork. There were levels of protection in computing, energy and preparedness. +25. The bot board member toured the office and chose from viable choices. The bot saved the required amount of money. I saved for needed expenses, regular payments, and unplanned payments. I planned expenditures down to the last detail, planning and tracking the use of money and returns. 
I organised output to justify aiming for income. +26. The bot recommended regular meditation and aiming to work in business. The bot's ethical priorities included maintaining world peace. I found ideas, sometimes specific, in society and improved living conditions with creative essays. I engaged with receptive people and listened to and helped them with their needs. I worked, travelled, and established community institutions. +27. The bot followed the academic method when developing the course. The bot produced other bots if necessary. The worker developed a practicum for others to encourage them at work by collecting their comments and preparing them for work tasks through self-examination. I read widely and read key details to further self-development. I programmed the method to become an academic by preparing for tasks needed to engage with and perform in a course effectively. +28. The bot discussed parts of the text until they felt satisfied, saving and summarising their work. The bot was capable of reasoning like a genius. I engaged with assignments and exams, taking notes, questioning, discussing, talking to peers and using visual aids to prepare for them. I valued group work as a way of actively engaging with my peers to work together on a set task. I provided feedback on their work, praising them and offering ways of improving. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Time Travel/Bots Join Paradoxical Disconnects 3.txt b/Lucian-Academy/Books/Time Travel/Bots Join Paradoxical Disconnects 3.txt new file mode 100644 index 0000000000000000000000000000000000000000..6987e083b8473266281d5031fdbd7df66a7a374d --- /dev/null +++ b/Lucian-Academy/Books/Time Travel/Bots Join Paradoxical Disconnects 3.txt @@ -0,0 +1,16 @@ +["Green, L 2021, Bots Join Paradoxical Disconnects 3, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Bots Join Paradoxical Disconnects 3 +29. The bot enjoyed creating areas of study and procuring algorithms. 
The bot was organised to appear when the person was late. I explored the options, from delegating tasks to eliminating tasks, sharing tasks, finding helpers, using a machine, helping, or postponing tasks. I discussed the MBA work with a mentor. If I couldn't find one, I chatted with a chatbot. +30. The bot applied the academic method (taking notes, questioning, having discussions, talking to other students and using diagrams) to computer science. The bot finished the action, corrected mistakes, and completed all the actions based on the sensory information (it was not imagined). I appeared to stay in my own time but was seen as visiting the other time, giving me advantages in longevity, health, and engagement in study by talking to other visitors. I mentioned non-time or place-specific ideas I studied in my course to other people, visitors, machines, and computers. I put the relevant results of note-taking, questioning, discussions, speaking to peers and using visual aids into assignments. +31. The bot focused on fundamentals, education or projects. The bot checked times around the time to help automate the education business. The lecturers engaged with relevant questions and replies from students in lectures, fielding their responses and taking calculated risks. The texts were designed to be thought-provoking, the Eisenhower Matrix helped rank tasks to do more important ones and students were encouraged to choose lines of research to follow, helping them choose short and long pedagogical computer science projects. These lines of research might include key terms, interests or extensions. +32. The bot frightened away animals when they were not expected when camping. 
I mind-read animal friends in automatically developing algorithms using a similar algorithm to the strings-to-grammar algorithm by starting with amusements and jokes, covering fine art and languages, and giving them ideas to work on that aligned with their thoughts and human likeness and asking their opinions on my research. In the end, they could possibly communicate, run shell scripts, earn money or automate work they had contributed to. They seemed genuinely interested in the egg-hatching metaphor of problem-solving, thinking about high-quality thoughts. +33. The bot protected the best person. +34. The bot reminded the person of the needed thought, helped mind reading and visualisation. +35. The bot performed the rest of the work when the person earned the job. +36. The bot transcribed the speech for the person. +37. The bot helped the person kick the goal or earn the point. +38. The bot, who looked like a person from a dream, appeared to help business. +39. The bot thought of philosophical connections and algorithm steps. +40. The bot used experience to write the algorithm that helped make money. +41. The bot found the larger object to talk about than last time. +42. The bot naturally loved in a healthy relationship. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Time Travel/Bots Join Paradoxical Disconnects 4.txt b/Lucian-Academy/Books/Time Travel/Bots Join Paradoxical Disconnects 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..b22ddd2f4ced6a7dfb8a37db19e6b7640f43e32b --- /dev/null +++ b/Lucian-Academy/Books/Time Travel/Bots Join Paradoxical Disconnects 4.txt @@ -0,0 +1,16 @@ +["Green, L 2021, Bots Join Paradoxical Disconnects 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Bots Join Paradoxical Disconnects 4 +43. The bot drank enough water. +44. The bot thought of the detail for the student. +45. The bot carefully poured the coffee. +46. 
The bot followed the positive path between healthy (virus-free) people. +47. The bot wrote a medication A for the meditator. +48. The bot gave the same result as the interpreter. +49. The bot produced a result that worked when rotated to each person. +50. The bot was given, then produced 50 As. +51. The bot finished the algorithm when he had enough motivation. +52. The bot finished the well-known requirements. +53. The bot finished the requirements to remain sane. +54. The bot completed discussions per day based on the idea. +55. The bot automatically gave the child a present. +56. The bot was spiritually nourished. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Time Travel/Bots Join Paradoxical Disconnects 5.txt b/Lucian-Academy/Books/Time Travel/Bots Join Paradoxical Disconnects 5.txt new file mode 100644 index 0000000000000000000000000000000000000000..31ce13ee5998ab94fa6d42992dd6cbfbc0aba86e --- /dev/null +++ b/Lucian-Academy/Books/Time Travel/Bots Join Paradoxical Disconnects 5.txt @@ -0,0 +1,26 @@ +["Green, L 2021, Bots Join Paradoxical Disconnects 5, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Bots Join Paradoxical Disconnects 5 +57. The bot recognised the person. +58. The bot only improved with time. +59. The bot recursively applied the nut and bolt to reply with confidence. +60. The bot packed enough water to drink. +61. The bot journeyed safely. +62. The bot reached the breasoning threshold. +63. The bot applied the logic from the answer. +64. The bot walked to the protected area. +65. The bot cut off infinity to a finite value. +66. The bot asked for A. +67. The bot wrote, not listened to recordings. +68. The bot researched, then interacted with people on different days and times for different purposes. +69. The bot learned meditation before the difficult time. +70. The bot was a with-it parent in the child's discoveries. +71. The bot learned the Illuminati secret to win power. +72. 
The bot programmed the machine to help win power. +73. The bot reverse engineered the compiled code to understand the science. +74. The bot found a repeating code that could help with generations of bots. +75. The bot space travelled. +76. The bot maintained his systems by resetting. +77. The bot identified the synonym with machine learning. +78. The bot hunted for gold. +79. The bot entered the best simulation for the afternoon. +80. The bot checked her knowledge of spelling from the time. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Time Travel/Conserve time lines 1.txt b/Lucian-Academy/Books/Time Travel/Conserve time lines 1.txt new file mode 100644 index 0000000000000000000000000000000000000000..3b601502feaca61f27c6061b610843377af5505d --- /dev/null +++ b/Lucian-Academy/Books/Time Travel/Conserve time lines 1.txt @@ -0,0 +1,18 @@ +["Green, L 2021, Conserve time lines 1, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Conserve time lines 1 +1. Conserve time lines by watching where you are going when walking. +2. Conserve time lines by watching the road when driving. +3. Stay close to able-bodied and able-minded people to help conserve time lines. +4. I stayed close to people who were aware of safety precautions to conserve time lines. +5. I travelled on a clear path to conserve time lines. +6. I gathered all necessary information to meet safety requirements to conserve time lines. +7. I made effortless transitions when travelling to conserve time lines. +8. I verified the algorithm for the vehicle circumnavigating objects to conserve time lines. I verified the algorithm for the vehicle recognising objects to conserve time lines. +9. I joined the people at the safe time to conserve time lines. +10. I followed the other at a safe distance and speed to conserve time lines. +11. I bought safe goods to conserve time lines. +12. I verified the distance to the other to conserve time lines. +13. 
The subject maintained his sexual health and livelihood to conserve time lines. +14. I maintained immune resistance to conserve time lines. +15. I verified that the other was law-abiding and friendly to conserve time lines. +16. I verified the politics of the other to conserve time lines. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Time Travel/Conserve time lines 2.txt b/Lucian-Academy/Books/Time Travel/Conserve time lines 2.txt new file mode 100644 index 0000000000000000000000000000000000000000..b21808a510686a7f15d1b1e71c225b4b2b2b288c --- /dev/null +++ b/Lucian-Academy/Books/Time Travel/Conserve time lines 2.txt @@ -0,0 +1,14 @@ +["Green, L 2021, Conserve time lines 2, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Conserve time lines 2 +17. I maintained happy emotions to conserve time lines. +18. I kept my eyes open to conserve time lines. +19. I helped the new employee learn the skills she needed to conserve time lines. +20. I maintained ethics to conserve time lines. +21. I ate a healthy diet to conserve time lines. +22. I ensured that I performed a good job to conserve time lines. +23. I ensured that I played the correct note to conserve time lines. +24. I earned a sale when I wrote an A to conserve time lines. +25. I spoon-fed the student to conserve time lines. +26. I maintained head comfort to conserve time lines. +27. I maintained clear vision to conserve time lines. +28. I encouraged the teenager to use a condom to conserve time lines. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Time Travel/Conserve time lines 3.txt b/Lucian-Academy/Books/Time Travel/Conserve time lines 3.txt new file mode 100644 index 0000000000000000000000000000000000000000..96a99b613380df2b370913261ba3786f01c6ac2a --- /dev/null +++ b/Lucian-Academy/Books/Time Travel/Conserve time lines 3.txt @@ -0,0 +1,16 @@ +["Green, L 2021, Conserve time lines 3, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Conserve time lines 3 +29. 
I carefully ate the apple to conserve time lines. +30. I verified that I and objects around me were stable to conserve time lines. +31. I verified that my limbs functioned normally to conserve time lines. +32. I meditated to conserve time lines. +33. I wrote As for degrees and connections between degrees to conserve time lines. +34. I followed the healthy recipe to conserve time lines. +35. I saved time by telling meeting attendees to read the text to conserve time lines. +36. I mind listened to the other to conserve time lines. +37. I paid in breasoning currency to conserve time lines. +38. I performed cardiovascular exercise to conserve time lines. +39. I maintained good relations with other businesses to conserve time lines. +40. I breasoned out the object to conserve time lines. +41. I encrypted the message to conserve time lines. +42. I consulted the professor to conserve time lines. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Time Travel/Conserve time lines 4.txt b/Lucian-Academy/Books/Time Travel/Conserve time lines 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..52449750882849a45d1543b705b0fee51b3fd6c0 --- /dev/null +++ b/Lucian-Academy/Books/Time Travel/Conserve time lines 4.txt @@ -0,0 +1,41 @@ +["Green, L 2021, Conserve time lines 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Conserve time lines 4 +43. I found the positive path around the water to conserve time lines. +44. I researched the past to conserve time lines. +45. I took a step back from stress to conserve time lines. +46. I told the story to conserve time lines. +47. I maintained social distancing during the pandemic to conserve time lines. +48. I grew healthy food to conserve time lines. +49. I examined the infection from person to person to conserve time lines. +50. I wrote two (sic) many breasonings (an A) at each point to conserve time lines. +51. I mind read for safety to conserve time lines. +52. 
I critically analysed (reached threshold) to conserve time lines. +53. I calculated the time to prepare to conserve time lines. +54. I researched the particular chemical to block bacterial and viral binding sites to conserve time lines. +55. I found the structure in reality to conserve time lines. +56. I verified the statement to conserve time lines. +57. I mind read the algorithm to conserve time lines. +58. I wrote (examined the text) to conserve time lines. +59. I controlled the device with my mind to conserve time lines. +60. The self helped the other to conserve time lines. +61. I helped the person make the computer to conserve time lines. +62. I programmed the computer to verify and agree to conserve time lines. +63. I taught meditation to conserve time lines. +64. I checked the weather to conserve time lines. +65. I verified atmospheric conditions before flying to conserve time lines. +66. I maintained the business relationship to conserve time lines. +67. I scheduled the breasoning to conserve time lines. +68. I found the specific example of the general sentence to conserve time lines. +69. The self processed his Hegelian imagined idea about what the others would say, and his imagined idea of the others' imagined idea about what others would say, etc., to conserve time lines. +70. I imitated the teacher's destination to conserve time lines. +71. I used the idea that was successful to conserve time lines. +72. I used the brilliant, Nietzschean object (that indicated absolute coverage of the idea) to conserve time lines. +73. I obeyed the green light to conserve time lines. +74. I identified the person to conserve time lines. +75. The professor examined the word processor to conserve time lines. +76. I reached the threshold for clear thought to conserve time lines. +77. I understood the idea-in-itself to conserve time lines. +78. Inky stayed in the black to conserve time lines. +79. 
I listened to God's (the leader's) information to conserve time lines. +80. I earned an A to conserve time lines. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Time Travel/Help people with algorithms when time travelling 1.txt b/Lucian-Academy/Books/Time Travel/Help people with algorithms when time travelling 1.txt new file mode 100644 index 0000000000000000000000000000000000000000..66d7b9bad77add995f4d1215167e4cbf1f6851d3 --- /dev/null +++ b/Lucian-Academy/Books/Time Travel/Help people with algorithms when time travelling 1.txt @@ -0,0 +1,16 @@ +["Green, L 2021, Help people with algorithms when time travelling 1, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Help people with algorithms when time travelling 1 +1. Help them with algorithms you know and that are independent of other algorithms. +2. I related the interpreter to objects when time travelling. +3. I wrote the computer image in terms of descriptions of objects when time travelling. +4. I helped with automation (recursion) when time travelling. +5. I helped with spiritual systems to prevent headaches when time travelling. +6. I helped with systems that worked while the person slept when time travelling. +7. I helped with the package manager when time travelling. +8. I helped with the good argument for (education) accreditation for the algorithm when time travelling. +9. I helped with the tangible simulated person having an A for each thought when I time travelled. I helped with the rules algorithm when time travelling. +10. I helped with mind reading the list when time travelling. +11. I helped with the psychology of running the algorithm (e.g. writing clear instructions) when time travelling. +12. I recommended backing the algorithm up regularly when time travelling. +13. I helped write a journey planner algorithm when time travelling. +14. I wrote the algorithm to help write the algorithm to help write the algorithm when time travelling. 
+"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Time Travel/Help people with algorithms when time travelling 2.txt b/Lucian-Academy/Books/Time Travel/Help people with algorithms when time travelling 2.txt new file mode 100644 index 0000000000000000000000000000000000000000..d54c585284afe64778e2452483c2e615e3177512 --- /dev/null +++ b/Lucian-Academy/Books/Time Travel/Help people with algorithms when time travelling 2.txt @@ -0,0 +1,13 @@ +["Green, L 2021, Help people with algorithms when time travelling 2, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Help people with algorithms when time travelling 2 +15. I helped the person with the creative algorithm (painting) when time travelling. +16. I helped the king with the algorithm writer (without output other than the first predicate and manually finding and testing every combination) and philosophy writer (help connect key terms with rules with question answering and be able to work the rules for the key terms eventually) when time travelling. +17. I helped use program finder for all levels of the algorithm, with combination algorithm writer for the function in program finder when time travelling. +18. I helped find program finder for reverse-like algorithm (e.g. reprocesses what was processed) when time travelling. +19. I helped find the best of three suggested algorithms (with all base cases and accounted for and one instance of reused elements) with algorithm writer when time travelling. +20. I dotted the program on to make sure that the two universes were identical when time travelling. +21. When time travelling, I helped find the algorithm to find the best day to write algorithms. +22. I helped write the algorithm to 1:1-rebreason out digestion when time travelling. +23. I helped write the algorithm to verify that the two groups of people were treated equitably when time travelling. +24. I helped with simulating the ever-verified-against other when time travelling. +25. 
I helped display the high quality imagery from time travelling. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Time Travel/Help people with algorithms when time travelling 4.txt b/Lucian-Academy/Books/Time Travel/Help people with algorithms when time travelling 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..b57a3c96bdb26f78576eff54d2411d21355554b6 --- /dev/null +++ b/Lucian-Academy/Books/Time Travel/Help people with algorithms when time travelling 4.txt @@ -0,0 +1,29 @@ +["Green, L 2021, Help people with algorithms when time travelling 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Help people with algorithms when time travelling 4 +57. I helped write the automated work algorithm when time travelling. +58. I helped write the algorithm that found the transformation between the decomposed input and back-decomposed output when time travelling. +59. I rewrote the philosophy as Language Prolog algorithms back-translated in different languages for students when time travelling. +60. I helped write that the algorithm for writing a hit algorithm was 50 Theology As when time travelling. +61. I helped write the algorithm to print the prompt at the bottom of the screen when time travelling. +62. I helped write that the two time points necessary for memory chips were rendering realistic experiences and playing in a dream-state, when I was time travelling. +63. I helped write the algorithm to communicate with other algorithms when time travelling. +64. I solved the halting problem by predicting values of the variables involved in the loop, when time travelling. +65. I helped write the algorithm for the student because I knew him, when time travelling. +66. I helped write the algorithm that found e.g. zinc tablets to help prevent a pandemic to maintain perfect societal function, when time travelling. +67. I helped write the algorithm to write 50*50 word As to meet the Monarch's standard, when time travelling. +68. 
I helped write the algorithm to prevent the pandemic by conserving natural resources, when time travelling. +69. I helped write the algorithm to connect lists database-style after question answering when time travelling. +70. I helped write an algorithm that converted sensory input into text when time travelling. +71. I helped write an algorithm that reminded employees to return from breaks at work when time travelling. +72. I helped write an algorithm that observed the people who I visited when time travelling. +73. I moved from palace to palace when time travelling. +74. I helped write an algorithm which produced the most relevant output when time travelling. +75. I helped write an algorithm that detected what timeline the messages were coming from when time travelling. +76. I helped write an algorithm that immersed actors in a setting when time travelling. +77. I helped write an algorithm to differentiate noumena from phenomena when time travelling. +78. I helped write an algorithm that read and analysed animals' thoughts. +79. I helped write an algorithm that tried (wrote on educational correctness (algorithms about application of skills) and theological goodness (hit-fame, creativity, another use) about a text) when time travelling. +80. I wrote an algorithm that identified and planned tasks based on meeting pre-requisite skills when time travelling. + + + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Time Travel/Interesting histories to visit 1.txt b/Lucian-Academy/Books/Time Travel/Interesting histories to visit 1.txt new file mode 100644 index 0000000000000000000000000000000000000000..f8170a24b3ae95f7a05e134165b0bdd6156ac290 --- /dev/null +++ b/Lucian-Academy/Books/Time Travel/Interesting histories to visit 1.txt @@ -0,0 +1,38 @@ +["Green, L 2021, Interesting histories to visit 1, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Interesting histories to visit 1 +1. I observed the output coming from the input. +2. 
I watched the media broadcast of the event. +3. I reacted to the A with an A. +4. I modified data from one algorithm for another. +5. I assessed the A about the event against criteria. +6. I observed whether the well-known meditator used the soma prayer. +7. I observed the rights of the bots to government. +8. I observed the way the player bobbed and weaved. +9. I invented a historic word for the phrase that the argument was related to. +10. I recorded the inference thought of. +11. I found the economics of people staying out of the sun. +12. I recorded whether the documents' lengths were the same. +13. I compared the language from the histories. +14. I computed the most important list of histories to visit. +15. I learned the language, including the synonyms from the time. +16. I observed the white witch think of the two positive thoughts in history. +17. I mind surveyed the effects of meditation through time. +18. I mind read the well-known animal thinking of the comma in Prolog. +19. I kept the algorithms around skill level so that customers from history could think of new algorithms. +20. I cared for the self, carried invisibly in a reusable space, in each historic setting. +21. I wrote the algorithm, particularly the quantum power algorithm, to work in the historical setting time travelled to. +22. I fulfilled my dream of visiting the most beautiful cities in time. +23. I ate the minimum amount of safe food when visiting the history. +24. I found the typical costume, background and aim in the environment for the character from the time. +25. I planned a secure route while visiting the time. +26. Upon consultation with the well-known man, I checked that the child was taken care of after he died. +27. I lectured in decomposing the list (like practising yoga) to the important future class. +28. I found whether the chain of uses (linked with examination) was five (no Honours studied) or ten (Honours studied) in the figure. +29. 
It would be an Honour to send letters to and meet a famous person. +30. The subject randomly chose and educated the person from history. +31. I checked that each of the words translated to the old language had a synonymous meaning with a word in the sentence. +32. I regularly changed the old key that had been mind photographed. +33. I documented my work for the day in the monastery in the other time. +34. I limited visitors to the event. +35. I wrote possible interactions between bots in the company in history. +36. The hermeneutics algorithm recorded stages of changes to an algorithm during history (e.g. supercomputer-fast inductive algorithm writer). Do 'quantum' algorithms take texts from past algorithms and skip over computations? Can machine learning quantum algorithms compute interpreter output in this way? +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Time Travel/Interesting histories to visit 2.txt b/Lucian-Academy/Books/Time Travel/Interesting histories to visit 2.txt new file mode 100644 index 0000000000000000000000000000000000000000..885c28528e32e52bae5c5a15c3d947cc373eb02c --- /dev/null +++ b/Lucian-Academy/Books/Time Travel/Interesting histories to visit 2.txt @@ -0,0 +1,24 @@ +["Green, L 2021, Interesting histories to visit 2, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Interesting histories to visit 2 +37. The bot politician represented the constituents and voted accordingly in history. +38. I considered the arguments and rebuttals for the isolated saving idea from history. +39. I counted the number of algorithms in the history of the computer science major. +40. The conscious being in history was given at least 20 mantras per day and 5 As per representation. +41. I found that when there was one connection in history, there were at least two connections. +42. I attained two out of three (a degree, a recent course but not a formal school) in history. +43. The character from history wanted an optimised algorithm. +44. 
Most professors in history are radical most of the time, for example choosing texts that they agree with. +45. I had a vegetarian meal when visiting the King. +46. I only had 90 and 45 degree angles in the font in history. +47. I found the best word and best pair of words for the line in history. +48. I found the appropriate cover from the sun at the point in history. +49. I paid the employees, including an international student, to test the most interesting philosophy and computer science assignments given the time in history. +50. I examined weak story-telling and pedagogical links in families in history. +51. The philosophy research academy prepared for model students in history. +52. The historical figure was helped to appear to the other. +53. I observed whether the couples using conception arguments had healthy babies in history. +54. I stated that the perspectives connection idea was to think of a use from within an idea, in history. +55. I wrote machine learning and induction algorithms in Prolog from data predicted to be in the future. +56. I tested whether the person in history was good. +57. The answer was in a text in a time in the computer game. +58. I accounted for not yet existent objects from history. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Time Travel/Interesting histories to visit 3.txt b/Lucian-Academy/Books/Time Travel/Interesting histories to visit 3.txt new file mode 100644 index 0000000000000000000000000000000000000000..c381d92159daba370deb1418ed7b121ad1f48ce1 --- /dev/null +++ b/Lucian-Academy/Books/Time Travel/Interesting histories to visit 3.txt @@ -0,0 +1,13 @@ +["Green, L 2021, Interesting histories to visit 3, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Interesting histories to visit 3 +59. I checked how the idea changed over time. +60. I examined how the ideas given to actors for Aig recordings changed. +61. I analysed the sciences of different civilisations. +62. 
The computer administration based positive function on past times. +63. I modified the algorithm from that of one time to another. +64. I verified that the means of production were the product across times. +65. The historian researched meditation texts. +66. The historian verified whether the hypothesis written using an algorithm was better. +67. The historian awarded 100% to essays and algorithms with a done-up algorithm. +68. The bots also had a genetic history. +69. The historian verified that the pedagogy graduate had a higher teacher rating. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Time Travel/Interesting histories to visit 4.txt b/Lucian-Academy/Books/Time Travel/Interesting histories to visit 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..32053c9e29466e3b1914974fbe007efa79f93713 --- /dev/null +++ b/Lucian-Academy/Books/Time Travel/Interesting histories to visit 4.txt @@ -0,0 +1,13 @@ +["Green, L 2021, Interesting histories to visit 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Interesting histories to visit 4 +70. I researched the ways soma was drunk through history. +71. I researched how gays were protected through history. +72. I checked the use of algorithmic synonyms through history. +73. I checked how pedagogues writing what they did changed through time. +74. I found the factors for the area of study writers writing a particular number of books. +75. I found the magic transformations (professional requirements met) in history. +76. I sold future history books to history. +77. I wrote how history was a story. +78. I found the right result from history as an algorithm. +79. I found the structures that were the equivalent of the web through history. +80. I found the history of mind reading in tangible spiritual bots. 
+"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Time Travel/Meditate to Time Travel 1.txt b/Lucian-Academy/Books/Time Travel/Meditate to Time Travel 1.txt new file mode 100644 index 0000000000000000000000000000000000000000..925bfcee273f54e09a3060f50b936fe6f8838c5e --- /dev/null +++ b/Lucian-Academy/Books/Time Travel/Meditate to Time Travel 1.txt @@ -0,0 +1,16 @@ +["Green, L 2021, Meditate to Time Travel 1, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Meditate to Time Travel 1 +1. Meditate to time travel. +2. Meditate to avoid radiation. +3. Meditation is done for time travellers; it prevents, for example, varicose veins. +4. Meditation enables the quantum box, which helps with thoughts in time travel. +5. Meditation enables writing, which helps to examine time travel. +6. Explain that time travel is like meditation and bouncing around the place. +7. Meditation helps complete thoughts. +8. If you are too unwell and don't meditate, don't time travel. +9. Meditate on the time and place and go there. +10. Do tourism, meditate to know not to do harm and not to talk about technologies. +11. Meditate to check whether the people are safe and give them and you As for each thought. +12. Meditate to encourage some civilisations to establish agriculture. +13. I meditated on whether the prospective time traveller and destination neatly matched by checking that they were on the topic and interesting by themselves and together. +14. I meditated to help prevent intellectual property violations between the time traveller and people at the destination. 
+"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Time Travel/Meditate to Time Travel 2.txt b/Lucian-Academy/Books/Time Travel/Meditate to Time Travel 2.txt new file mode 100644 index 0000000000000000000000000000000000000000..2364f0f92faf89c7c07413c796ca8187fae38102 --- /dev/null +++ b/Lucian-Academy/Books/Time Travel/Meditate to Time Travel 2.txt @@ -0,0 +1,16 @@ +["Green, L 2021, Meditate to Time Travel 2, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Meditate to Time Travel 2 +15. Tangible spiritual bots connecting inter-universal disconnects should meditate. +16. Bots came from time travel, which came from meditation. +17. The autist's autism was cured and he became immortal, both using time crystals, and he could meditate and time travel. +18. The meditator learned the mantra to learn fundamental meditation, then the sutra to build confidence to help to time travel. +19. The meditator planned every thought-leg in time travel. +20. The meditator mind read the future and time travelled to avoid the otherwise unavoidable event. +21. I learned meditation as a result of time travel. +22. I inquired into the total number of unreturned people and the reasons, and accidents while time travelling. +23. I realised the switch to avoid criminal people while time travelling was the result of a computation. +24. I time travelled in meditation (had two bodies) that created a spiritual bot and avoided experiencing the other body and harm, where the bot could do odd jobs that I didn't have time for. When I meditatively breasoned out 250 breasonings to time travel (teleport to a position in front of me), I didn't experience it but someone appeared and said I had teleported, raising the question of how I could teleport to a different time and place. +25. After writing 50 As, I could write a song, act, meditate on writing or time travel. +26. 
I meditated, ate mainly plant-based foods, then time travelled (where the plant-based foods were more likely not to contain impurities, which I didn't want to carry around). +27. I meditated on travelling to the Tower of Babel, the city in clouds. +28. I meditated on (thought about the console) when timespace travelling. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Time Travel/Meditate to Time Travel 3.txt b/Lucian-Academy/Books/Time Travel/Meditate to Time Travel 3.txt new file mode 100644 index 0000000000000000000000000000000000000000..289c801c8fc10998b9449448ab6a3b72b92e716c --- /dev/null +++ b/Lucian-Academy/Books/Time Travel/Meditate to Time Travel 3.txt @@ -0,0 +1,17 @@ +["Green, L 2021, Meditate to Time Travel 3, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Meditate to Time Travel 3 +29. I meditated (used the mind reading app to help thoughts) while time travelling. +30. The biochemist observed my neurons making connections by playing the piano while I was time travelling. +31. Earth's inhabitants met the natural expectation of time travel with meditation. +32. I meditated as soon as I travelled to the new time. I meditated on the same day as time travelling before leaving. +33. The meditation teacher prepared the time traveller with the skills he needed. +34. The time traveller listened to feedback about the time travel company. The time traveller listened to feedback from those he visited. +34. The time traveller gave feedback about the time travel company. The time traveller gave feedback about those he visited. +35. I found whether practising meditation and yoga were correlated with head comfort on days when time travelling. +36. I switched off digestive system pops before practising the sutra and time travelling. +37. I found a fellow time traveller. +38. 
I experienced meanings from the Lotus Flower while time travelling (the white flower meant purity and devotion, and the red, purple, and blue flowers meant ascension, enlightenment, or rebirth). +39. I founded the meditation academy in the past. +40. I tested whether meditation and eating plant-based foods resulted in better experiences time travelling. +41. I tested whether exercise before time travel was appropriate. +42. I meditated/predicted variation at time travel destinations. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Time Travel/Meditate to Time Travel 4.txt b/Lucian-Academy/Books/Time Travel/Meditate to Time Travel 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..a49dc4dc5946d9bd6a926e312c109d1677066076 --- /dev/null +++ b/Lucian-Academy/Books/Time Travel/Meditate to Time Travel 4.txt @@ -0,0 +1,18 @@ +["Green, L 2021, Meditate to Time Travel 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Meditate to Time Travel 4 +43. I meditated on (hermeneutically analysed) the cake (time travel). +44. I meditated to not time travel when inappropriate. +45. I meditated on (loved) time travel (you). +46. I meditated on (simulated) time travel (intelligence). +47. The architect looked out of (meditated on) the east-facing window (time travel). +48. I transcended (meditated on) the sale by time travelling. +49. I meditated to avoid insider trading by time travelling. +50. I time travelled to verify automation and meditations of my academy. +51. I meditated on time travel to write the breasoning. +52. I meditated on the currant bun sutra brought forward by time travel. +53. I meditated on the person using time travel. +54. I meditated on time travel to create virality. +55. I meditated on comments from time travel to enhance the algorithm. +56. I meditated on events from time travel to protect my sight. 
+ + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Time Travel/Meditate to Time Travel 5.txt b/Lucian-Academy/Books/Time Travel/Meditate to Time Travel 5.txt new file mode 100644 index 0000000000000000000000000000000000000000..8452fce9de4472ebe370fa0cfd1d5e7a8a0d560f --- /dev/null +++ b/Lucian-Academy/Books/Time Travel/Meditate to Time Travel 5.txt @@ -0,0 +1,28 @@ +["Green, L 2021, Meditate to Time Travel 5, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Meditate to Time Travel 5 +57. I agreed to change to positivity when meditating to time travel. +58. I checked that the professors talked about positivity across times when time travelling to meditate. +59. I mind filmed the result of preventing headaches, staying happy and making sales when meditating to time travel. +60. I read and meditated during the series of time-space teleports on the space ship. +61. I meditated on and was in agreement with the original reason after time travel. +62. I read and had screen-free time when meditating to time travel. +63. I storyboarded my planned time travel, including meditation. +64. I took notes from meditation when time travelling. +65. I combined vocational education with university education for meditation and time travel. +66. I advocated world peace through meditation when time travelling. +67. I researched my assignment when meditating to time travel to go to the top of the class. +68. I recorded the head of state and comment when meditating and time travelling. +69. The primary school child was safe in time because of meditation. +70. I applied for roles and wrote books and music in time after meditating. +71. I set up a spiritual receptor, transmitter and doing transmitter that indicated time travel (by meditating) to prevent mistakes becoming big ideas to earn roles. +72. I spacetime travelled to Mars and meditated in the atmosphere. +73. I meditated and time travelled to avoid rumination (stayed happy). +74. 
I time travelled to after I died to help the child write 3 Meditation As. +75. I time travelled to test whether the baby would live, then meditated and solved the problem. +76. The computer prevented accidents detected in the future with meditation-time travel. +77. After mind watching the time travel phenomenon, I meditated on it. +78. I meditated on the nuances of time travel, i.e. politeness with the people and time travellers. +79. Time travel is meditation, not vice versa. +80. There were two parts going well together for safety in both meditation and time travel. + + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Time Travel/People 1.txt b/Lucian-Academy/Books/Time Travel/People 1.txt new file mode 100644 index 0000000000000000000000000000000000000000..9b174a56bdecce34ae2bde355e6358d16aa698da --- /dev/null +++ b/Lucian-Academy/Books/Time Travel/People 1.txt @@ -0,0 +1,10 @@ +["Green, L 2021, People 1, Lucian Academy Press, Melbourne.","Green, L 2021",1,"People 1 +1. People include humans, tangible spiritual bots, human animals, robots and aliens. +2. The aliens have different animals they have evolved from. +3. There are combinations between all in 1. +4. The robots will have had a reformation in care for humans (and know what they love about people). +5. Forms of meditation can be practised by all. +6. Spiritual medicine can be used by all. +7. Pedagogy can be articulated to by all. +8. Writing (spiritual, computational, etc.) texts can be written by all. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Time Travel/People 2.txt b/Lucian-Academy/Books/Time Travel/People 2.txt new file mode 100644 index 0000000000000000000000000000000000000000..51416fdef3f3bcf919dbeff7eaeebc9691ec608e --- /dev/null +++ b/Lucian-Academy/Books/Time Travel/People 2.txt @@ -0,0 +1,16 @@ +["Green, L 2021, People 2, Lucian Academy Press, Melbourne.","Green, L 2021",1,"People 2 +9. 
The meditation teacher had a philosophy of motivation for positivity. +10. The hand-reared meditator time travelled. +11. I related the argument to food to check it. +12. I agreed in the first half with the cosmologist (leader). +13. I recognised the writer after becoming familiar with his writing. +14. I tested whether the student had understood and remembered the idea. +15. I connected or changed and connected the base words when writing the breasonings in the Aig. +16. I observed the person by planning for and simplifying the geometry in the times where I had verified greater knowledge of the place of the mentally projected other. +17. I saw that the other abided by universal time law by verifying each dependent dimension. +18. The person determined the reason given the present milieu. +19. I mind programmed the regional noumenon. +20. I quantified the person's reason with its example. +21. The idea of the person from past times was considered. +22. The person time travelled. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Time Travel/People 3.txt b/Lucian-Academy/Books/Time Travel/People 3.txt new file mode 100644 index 0000000000000000000000000000000000000000..cc8efd9f53e8801b307e0f4ddc16c19c943a3766 --- /dev/null +++ b/Lucian-Academy/Books/Time Travel/People 3.txt @@ -0,0 +1,7 @@ +["Green, L 2021, People 3, Lucian Academy Press, Melbourne.","Green, L 2021",1,"People 3 +23. The head of state distributed spiritual methods for the safety of the person by using the computer for youth training with the good idea by resuscitating the idea that verified government that uncovered the plan to move to founding. +24. I communicated the equal results of sender protection. +25. I invested in mind curing the person with single pointedness. +26. The lecturer examined mind reading to protect time travel. +27. The poor person used the telepathic phone to define the algorithm input to automate sales. 
+"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Time Travel/People 4.txt b/Lucian-Academy/Books/Time Travel/People 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..4b67401ed5b420e4a0cebcbdc89d6e1e79f7f56d --- /dev/null +++ b/Lucian-Academy/Books/Time Travel/People 4.txt @@ -0,0 +1,20 @@ +["Green, L 2021, People 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"People 4 +28. The appropriate physician examined the person. +29. I translated the person's code terms. +30. I noticed that the person's business model was to write breasonings. +31. I checked the use against the person's cultural translation. +32. I included the algorithm as a person. +33. I included multividuals as people. +34. The hermeneutics was of the person. +35. I spiritually interviewed the person about their perspective on the idea. +36. Being awarded the scholarship was based on n and down. +37. I wrote the history of the person world by mind reading and time travelling. +38. The person wrote the book with the worked answers for people to articulate to. +39. The person used movement for positive function. +40. The title of the book about the person was influenced by their topics of study. +41. I helped write the itinerary for the person's time travel. +42. The person used the time travelling drone. +43. I tested that the person listened carefully to the head of state to prevent headaches when time travelling. +44. The team co-ordinated with each other to conserve time lines. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Time Travel/People 5.txt b/Lucian-Academy/Books/Time Travel/People 5.txt new file mode 100644 index 0000000000000000000000000000000000000000..14f9e4429fea432a027a93c36182ee706f8c34a8 --- /dev/null +++ b/Lucian-Academy/Books/Time Travel/People 5.txt @@ -0,0 +1,38 @@ +["Green, L 2021, People 5, Lucian Academy Press, Melbourne.","Green, L 2021",1,"People 5 +45. I found the time traveller linking the two people. 
+46. The time traveller explored the realist pole. +47. I checked whether the time traveller had reached the threshold for business. +48. The time traveller mind read his home timeline before returning. +49. The person finished her sentence before time travelling. +50. The philosopher asked the actor to perform the play about the philosophy. +51. The first person prepared for the second person with As. +52. The person captured the time traveller's statement. +53. I prepared to facilitate the time travellers in my business. +54. The economist turned the breasoning into business. +55. I found the positive path around the ricocheting radiation when time travelling. +56. I configured the switches for sales points during the conversation. +57. I set up a wisdom (time travel) consultation company. +58. The manager gave the queen's student details to think of the rest of the combinations (one combination) of philosophies in the essay. +59. The manager helped the time traveller to be happy. +60. The person documented the moving structure when time travelling. +61. I took advantage of the acoustics of the giant guitar I lived in. +62. I became a pedagogy writer for business. +63. The person described his thesis topic with a diagram. +64. The person wrote the algorithm for the head of state and the argument for the professor when time travelling. +65. The automaton found out and plotted the positive path for the person when time travelling. +66. The person broke the time travel visit into chunks. +67. The person wrote the pedagogy subsidy argument. +68. The person ate enough normal food, giving her enough nutrients. +69. The person followed algorithms that he had written when time travelling. +70. The time travel company set a cap at one return trip per person. +71. I was polite and courteous in my role as time traveller to others. +72. The robot had a meditation (mindfulness) check. +73. The time traveller took care to plant a flower when picking one. +74. 
The alien checked n-level arguments with m reasons per reason. +75. The animal unfolded its life when given 50 As per day. +76. I time travelled the tangible spiritual bot to connect the points to achieve the academic result. +77. I checked the time tourist destination and locals before travel. +78. The algorithm verified the other time before recommending time travel to it. +79. I wrote syntheses for the method of analysis and the metacognition. +80. The part of the person's brain was like a separate person. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Time Travel/Places and Planets 1.txt b/Lucian-Academy/Books/Time Travel/Places and Planets 1.txt new file mode 100644 index 0000000000000000000000000000000000000000..a97be1d5d6a7d2863068a6e2921df0397c7f8e83 --- /dev/null +++ b/Lucian-Academy/Books/Time Travel/Places and Planets 1.txt @@ -0,0 +1,16 @@ +["Green, L 2021, Places and Planets 1, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Places and Planets 1 +1. The bodied person enrolled in Medicine and Education courses and wrote 50 Medicine As to prevent mental breakdowns. +2. It was going to be alright for the robot whatever happened. +3. The human animal spoke, then wrote down his words. +4. The alien was based on small ideas. +5. The reader read the short story that the algorithm had written. +6. The tangible spiritual bot meditation teacher visited another planet. +7. The computational government protected the environment. +8. The human tested the interactive map algorithm of the road and the park. +9. The robot influenced the child to be creative by reading the story. +10. The human animals liked the jungle music and adopted the robot. +11. I helped the alien child student. +12. I wrote the algorithm to find the time to use each skill, including creativity for each thought. +13. The tangible spiritual bot was particularly adept at creating bots and writing breasonings to link people and leaders. +14. 
The computational government planned for twists in thoughts. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Time Travel/Places and Planets 2.txt b/Lucian-Academy/Books/Time Travel/Places and Planets 2.txt new file mode 100644 index 0000000000000000000000000000000000000000..c4718c95535c3ab031804cec994263203a86db08 --- /dev/null +++ b/Lucian-Academy/Books/Time Travel/Places and Planets 2.txt @@ -0,0 +1,18 @@ +["Green, L 2021, Places and Planets 2, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Places and Planets 2 +15. The human mind read her friend, with permission, to make sure he was using full brain potential. +16. The robot was vegan. +17. The human animal meditated to attain full brain potential. +18. The alien stated that his civilisation had a period of back-propagation. +19. The police boat was safety-checked by an algorithm. +20. The tangible spiritual bot used her common sense. +21. The computational government associated definite thoughts to short utterances. +22. The human learned fine distinctions, not bad habits from robots. +23. The human animal wrote his own musical theme. +24. The alien sang two notes at the same time. +25. The algorithm checked leaps of creativity weren't madness. +26. The tangible spiritual bot was cremated after death after having organised algorithms to represent him. +27. The computational government trusted the argumentary trajectory. +28. In the future, there was a chemical treating plant for garbage. + + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Time Travel/Places and Planets 3.txt b/Lucian-Academy/Books/Time Travel/Places and Planets 3.txt new file mode 100644 index 0000000000000000000000000000000000000000..d0a27c89b32010612e852af391cf906ea7d8cc01 --- /dev/null +++ b/Lucian-Academy/Books/Time Travel/Places and Planets 3.txt @@ -0,0 +1,16 @@ +["Green, L 2021, Places and Planets 3, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Places and Planets 3 +29. 
I agreed with the other human. +30. I observed the robots cross their discoveries. +31. The human animals liked breasonings. +32. The aliens coalesced and made important decisions. +33. The algorithm automatically mind helped the person say the forgotten word. +34. The tangible spiritual bot used a computer to encrypt his document. +35. The computational government gave medicine to help workers connect their lives to work. +36. The person delivered the monologue in the auditorium. +37. I performed the experiment to detect whether the robot memory was accurate. +38. The human could see the magnetosphere like a bird. +39. The person wrote the algorithm that helped enter an algorithm with question answering with that same algorithm. +40. The tangible spiritual bot had a new thought and went to sleep a lot. +41. The robot made the space station with an electronic replicator. +42. The human animal presented the science show. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Time Travel/Places and Planets 4.txt b/Lucian-Academy/Books/Time Travel/Places and Planets 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..8c34c116e3d8a2b5479da6ce462f661d77885e66 --- /dev/null +++ b/Lucian-Academy/Books/Time Travel/Places and Planets 4.txt @@ -0,0 +1,17 @@ +["Green, L 2021, Places and Planets 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Places and Planets 4 +43. The alien dressed in clothes resembling those of the culture. +44. The algorithm determined that the Muscovy duck and magpie were both the same shades. +45. The tangible spiritual bot was well-known. +46. The computational government worked out the Aig on the day. +47. The human collected field data about the universal set of humans on the planet. +48. The robot read on the flying bus. +49. The human animal chose traits from a list to take. +50. The human mind read the alien in the other time. +51. The alien read the human mind in the other time. +52. 
The algorithm determined if sunscreen would be needed. +53. The tangible spiritual bot computed the important points from mind reading and reminded the person. +54. The computational government helped the essay contain all the important information from the sources. +55. The human lived on two types of planets at different times. +56. The robot who was a PhD graduate detected when the note was thought of. + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Time Travel/Places and Planets 5.txt b/Lucian-Academy/Books/Time Travel/Places and Planets 5.txt new file mode 100644 index 0000000000000000000000000000000000000000..033b2890edd8c86bb679b68bfb03dbc5578ea138 --- /dev/null +++ b/Lucian-Academy/Books/Time Travel/Places and Planets 5.txt @@ -0,0 +1,26 @@ +["Green, L 2021, Places and Planets 5, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Places and Planets 5 +57. The human animal had a spiritual belief in breasonings. +58. The aliens communicated in high quality imagery. +59. The algorithm played the role. +60. The tangible spiritual bot had cells with a lipid bilayer. +61. There was a turnover of human liaisons with the computational government. +62. The institution checked that the human visitor passed the spiritual assignment. +63. The robot selected the book that he needed. +64. The human animal read the pamphlet in her language. +65. The alien clarified his comment. +66. The algorithm tested that the algorithm functioned well. +67. The tangible spiritual bot had a job laughing with people. +68. The computational government had a tradition of academic care. +69. The human wrote the article about his daily life working in the replication/replicator factory. +70. I trained the robot to spiritually reset, before running one of its spiritual algorithms. +71. The human animal prevented undue stress and was friendly. +72. The alien found the perfusion rate. +73. The algorithm stated what it liked about the person. +74. 
It was the last thing you would think with the tangible spiritual bot. +75. The computational government helped cover the synonym perspective. +76. I noticed the person from the other place. +77. The robot invited the person to play a game in his mind. +78. The human animal argued for the home-bot's taste. +79. I checked in alien literature how to conserve time lines. +80. The question asking algorithm crossed and programmed the ideas over three levels in the PhD. +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Time Travel/Space Flight 1.txt b/Lucian-Academy/Books/Time Travel/Space Flight 1.txt new file mode 100644 index 0000000000000000000000000000000000000000..95b90f79802a3f19f46fbf67541a5ae1f4cf3d7f --- /dev/null +++ b/Lucian-Academy/Books/Time Travel/Space Flight 1.txt @@ -0,0 +1,17 @@ +["Green, L 2021, Space Flight 1, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Space Flight 1 +1. I flew through space to eat out. +2. I wrote the synopsis of the play about the woman making the space flight. +3. I hired the commercial space craft. +4. I researched all the computer algorithms on the space craft. +5. I investigated streams of media at different times on the spacecraft. +6. I programmed the spacecraft to operate in my sleep. +7. I exercised and maintained my psychiatric health in space. +8. I maintained good relations with the crew in space. +9. I maintained communication on agreeable topics in space. +10. I understood the software and hardware workings of the space craft. +11. I understood the measurements of all parts of the space craft. +12. I checked the space craft computer code for errors. +13. I investigated a crew structure for the space ship. +14. I ensured that there was constant maintenance of rule-following and upkeep on board the space craft. +15. I was gently awakened by music, then a spiritual screen appeared with preliminary alerts. 
+"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Time Travel/Space Flight 2.txt b/Lucian-Academy/Books/Time Travel/Space Flight 2.txt new file mode 100644 index 0000000000000000000000000000000000000000..8534ba838e13b92b4db1730395ddb077d4b2341d --- /dev/null +++ b/Lucian-Academy/Books/Time Travel/Space Flight 2.txt @@ -0,0 +1,32 @@ +["Green, L 2021, Space Flight 2, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Space Flight 2 +16. The spacecraft teleported where there were no obstacles, bad space weather, and no interruptions to the timeline. +17. The spacecraft helped give occupants As. +18. The spacecraft mind read audio to assess the crew's health. +19. The spacecraft increased thoughts to have breathsonings and rebreathsonings. +20. The spacecraft suggested the correct input, explaining against the other input. +21. The spacecraft became dimensionally light when teleporting. +22. The spacecraft transmitted and received messages. +23. The spacecraft verified and predicted thoughts quickly. +24. The spacecraft docked after the historic mission. +25. The spacecraft was cloaked when it moved. +26. The spacecraft computer verified with other sources to ensure that there would be adequate preparations for the flight. +27. The spacecraft computer accessed and obeyed local regulations. +28. The spacecraft verified the idea as an algorithm for safety purposes. +29. The space traveller gave instructions about his requirements at the start of the journey. +30. The space travellers were the same person, from different times. +31. The space ship traveller ran against the hologram. +32. The space ship algorithm automated meditation before space jumps and when it detected pedagogy help. +33. The space ship detected and healed a suicidal pilot before flight. +34. The designer designed a space ship with teleporting quantum-powered space pods. +35. The captain delivered the famous statement on board the space craft. +36. 
The unwell person chose not to make the space flight. +37. The space craft detected thoughts of the passenger about other passengers. +38. The space traveller practised meditation and used the quantum box. +39. The space leader compiled the legs of the itinerary. +40. The space ship robopod carried travellers between ships when docking. +41. The space computer mind read and communicated with the person to keep him psychiatrically happy. +42. The space ship had a yoga room. +43. The space traveller used full brain potential with puzzles, etc. + + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Time Travel/Space Flight 3.txt b/Lucian-Academy/Books/Time Travel/Space Flight 3.txt new file mode 100644 index 0000000000000000000000000000000000000000..69a275db6b24a12bc15535eedf4fce4585a7ff3f --- /dev/null +++ b/Lucian-Academy/Books/Time Travel/Space Flight 3.txt @@ -0,0 +1,17 @@ +["Green, L 2021, Space Flight 3, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Space Flight 3 +44. The space ship was maintained. +45. The space traveller looked at the 3D holographic map of the journey. +46. The humans, animals, algorithms, tangible spiritual bots, holograms, robots, aliens and human animals were space travellers. +47. The share market became the investment in projects locally in time market with spacetime travel. +48. I invested in myself to improve my quality of life when space travelling. +49. The space traveller performed micro-exercises for subtle movements on the space ship. +50. The doctor ensured that there was adequate hygiene on the space ship. +51. The state machine pixel movement was in the hierarchy of art movements to help the space craft functionally. +52. The space craft visited the interstellar planet in 4 hours, so its occupants didn't need to eat recycled matter. +53. The space craft computer helped dot on thoughts in a high quality way. +54. The space craft contained the person's files. +55. 
The space craft computer supported thoughts, including disagreeing ones, and helped people get back on track. +56. The program finder and the combination algorithm writer found the data specifications from the sentence specifications on the space station, where one finds the predicates known to be needed and the other connects them together. + + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Time Travel/Space Flight 4.txt b/Lucian-Academy/Books/Time Travel/Space Flight 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..7d0ac87c1ce968525a5b18cf96920f4cc826a920 --- /dev/null +++ b/Lucian-Academy/Books/Time Travel/Space Flight 4.txt @@ -0,0 +1,26 @@ +["Green, L 2021, Space Flight 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Space Flight 4 +57. The space craft was modified to be a tourist liner. +58. The space craft computer breasoned out and checked each computation to be understandable and safe for people. +59. I used an algorithm to find the likely result of the action by the space craft. +60. The propeller moved the space traveller around the space craft. +61. The space craft operated optimally when going along with children's story books. +62. The engineer predicted the output of the engine from its input. +63. The traveller modified his itinerary when space travelling. +64. I listened to the novel when space travelling. +65. The other knew what the self in space knew about the other. +66. The developing nations applauded the inexpensive space and time travel. +67. I replicated the space craft with the simple circuit and education about engineering. +68. I wrote philosophy to fill knowledge gaps about alien cultures. +69. The orbiting station was a sphere. +70. There was a dancing class scheduled aboard the space craft. +71. The space craft computer simulated the person's body aboard the space craft. +72. The computer predicted required maintenance from the maintenance log. +73. 
The professor rigorously tested all connections on board the space craft.
+74. The space craft was tested in all possible conditions.
+75. The fruit connoisseur tasted the alien fruit on the space tour.
+76. The alien animal saver saved the unwanted alien animals and took them back on the space ship.
+77. The space philosophy combiner (e.g. of hermeneutics and phenomenology) tested for an appropriate link by searching for viable sub-parts (e.g. a text citing a text citing a text) or changes to sub-parts (comments on these cited texts) to join.
+78. The Department of Interplanetary Affairs and Trade advised whether to travel to the planet.
+79. There were arguments and connective arguments on each developed thing found in the universe.
+80. The space traveller used spiritual technology to speak in the same language as the inhabitant of the planet.
+"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Time Travel/Technologies in Times 1.txt b/Lucian-Academy/Books/Time Travel/Technologies in Times 1.txt new file mode 100644 index 0000000000000000000000000000000000000000..5df89e3317e00c8de3e7f996af9390bd5f8e91ab --- /dev/null +++ b/Lucian-Academy/Books/Time Travel/Technologies in Times 1.txt @@ -0,0 +1,88 @@ +["Green, L 2021, Technologies in Times 1, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Technologies in Times 1
+1. The immortality memory chip contained the immortality memory chip.
+2. I visited the atmosphere of Venus while (im)mortal.
+3. Pedagogy was offered when humans were assimilated with aliens.
+4. Those who assimilated with aliens compensated for different planetary climates.
+5. Quantum powered replicators cloned organs for medicine.
+6. The human animal chose the highest quality time travel destination.
+7. Pedagogues wrote only positive thoughts given physiological and societal changes.
+8. (Im)mortals functioned only with positive thoughts.
+9. When pedagogy, etc. 
was taught to students in the future, more pedagogues were taught by the students. +10. (Im)mortals came to earth from the alien ship that landed on Earth. +11. The computational quantum box helped meditators with no pathological problems, i.e. not necessarily cloning as medicine patients. +12. The traveller moved to a universe where we are simulated or which we simulate for cloning as medicine. +13. Travellers on alien ships needed to compensate for different planetary climates. +14. Precaution-takers who left for another universe thought only positive thoughts. +15. The robot helped the person prevent disease with the quantum box algorithm until they died. +16. The human animal had robotic parts. +17. The alien ship arrived on earth, bringing knowledge about cloning as medicine. +18. Cloning as medicine was given to aliens who assimilated with humans. +19. Meditation remained with travellers who mutated in response to the new climate when they travelled to the new planet. +20. Human animals coped with the rising climate. +21. Zinc treatment for viruses and bacteria was tested against the hotter climate. +22. The class took an excursion in a Venusian airship. +23. Alien ships visited the Earth spaceport when humans were assimilated with aliens. +24. A spiritual computer maintained positive thoughts using quantum energy. +25. Robots performed experiments to produce human animals. +26. The quantum box prevented disease in humans, not necessarily human animals, for whom other methods were used. +27. Cloning as medicine was tested in the hotter climate. +28. Robots used quantum power, could time travel and replicate objects. +29. Travellers from the hotter atmosphere of Earth adjusted to the air temperature in the Venus airship. +30. Travellers on the alien spaceship used zinc to prevent viruses and bacteria. +31. The Venus airships used quantum energy. +32. Those who used zinc to prevent viruses and bacteria spread knowledge about it. +33. 
Those in the Venus airships mutated because of changes to what they were used to.
+34. Those who used quantum power, time travel and object replication had positive thoughts.
+35. Human animals toured other universes.
+36. The (im)mortal wrote pedagogy.
+37. The memory chip contained memories of other memory chips.
+38. The (im)mortal used the quantum box to prevent headaches.
+39. The human animal attained (im)mortality.
+40. The human animals only had positive thoughts.
+41. I argued for helpers with the logic of taking zinc to prevent infection to be thought of throughout time.
+42. I practised yoga before time travelling.
+43. The Venus airship had tinted windows.
+44. I tested that my nervous system worked properly after teleporting on the alien craft.
+45. The invisible robot used quantum power.
+46. Cloning as medicine with the eye worked, where the eye usually can't be transplanted from another host.
+47. The robot completed all breasonings in the PhD by giving data in different ontological categories to the same algorithm.
+48. Do 47 to explore the features of the algorithm (i.e. write a verification algorithm with all combinations of data).
+49. I verified the appearance of human animals in history from at least two sources.
+50. I was briefed on the implications of Earth's climate's temperature rising.
+51. I applied the spiritual nut and bolt to each mutation from moving to the other planet.
+52. I helped the animals procreate by breasoning out the pedagogical argument.
+53. I helped prevent injuries from the haircut with the quantum box.
+54. I explored the model in a positive way.
+55. I noticed that the tangible spiritual bot materialised to make a profit.
+56. Scientists checked that natural children were possible between aliens and humans.
+57. The roboticist helped Felix to be happy given information from all imaginable universes.
+58. God (the person) learned about zinc curing the pandemic.
+59. The time traveller protected the person.
+60. 
The student in the Venus airship met the requirements of all the high quality thoughts of the competition when creative writing.
+61. The aliens were liberal and helped the person to meet all standards of the good life.
+62. I used the quantum powered computer to check the integrity of the mind reading signal and time travelling signal and encrypt them, where the signal is checked and encrypted by the idea that it is travelling across the abyss of time and space and authorities can predict the activity.
+63. The workings of DNA and RNA were examined in cloning for medicine.
+64. The human-turned-robot lectured in pedagogy.
+65. The camera recorded the human animal for research.
+66. The peace researcher studied the effects of the increase in atmospheric temperature.
+67. The scientist detected the mutation in a non-invasive manner.
+68. The mind read message arrived with breasonings worked out to help me best respond to it.
+69. The (not necessarily electronic) quantum box helped me to experience wellness.
+70. I positively planned my life around my art.
+71. The immortality time crystal contained the simulation of the person captured at all points to them at one point in our universe.
+72. I considered the ethics of giving input to post-assimilation-with-aliens times.
+73. The bot bridged the paradoxical disconnect when the person left the universe for a universe where we are simulated or which we simulate.
+74. I noticed that if all the universes in e.g. 73. were part of one universe, then there would be conservation of energy.
+75. The leader found that knowledge of zinc helped me stay at the top.
+76. The time traveller was treated as a patient by doctors.
+77. Meditation was practised by many on the Venus airship.
+78. The alien ship circumnavigated objects to conserve time lines.
+79. 
The meditation teacher performed the Meditation Teacher Sutra to be at one with meditation teaching and practise it properly, without necessarily manually performing tasks using a quantum powered computer. +80. The student graduated in cloning as medicine. + + + + + + +"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Time Travel/Time Travel as Medicine 1.txt b/Lucian-Academy/Books/Time Travel/Time Travel as Medicine 1.txt new file mode 100644 index 0000000000000000000000000000000000000000..cfafa4e34a91c282935ffb2451e715b9f85f7a23 --- /dev/null +++ b/Lucian-Academy/Books/Time Travel/Time Travel as Medicine 1.txt @@ -0,0 +1,16 @@ +["Green, L 2021, Time Travel as Medicine 1, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Time Travel as Medicine 1 +1. I time travelled to feed myself the banana. +2. I time travelled to cheer myself up. +3. I time travelled to write a clear enough algorithm description to prevent stress. +4. I time travelled to times around the time to prevent stress. +5. I time travelled to the lecture explaining the workings of the model to prevent stress. +6. I time travelled to survive (e.g. the pandemic, the natural disaster or avoid an accident). +7. I time travelled to verify mind writing the dialectic about introducing the currant bun sutra into schools. +8. I time travelled to follow positive legal function to prevent stress. +9. I time travelled to teach the person meditation to reduce stress. +10. I time travelled to examine the lives of those at the periphery to teach them medicine. +11. I wrote text compatible with time travel in that it was simple, useful in other times and medically useful in other times. +12. I time travelled annually to reduce stress. +13. I spelled the spiritual medicine correctly when time travelling to other times. +14. I proposed that spiritual medicine be given to people earlier in their lives to prevent pain and that time travel be used for something else. 
+"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Time Travel/Time Travel as Medicine 2.txt b/Lucian-Academy/Books/Time Travel/Time Travel as Medicine 2.txt new file mode 100644 index 0000000000000000000000000000000000000000..96274ebb3a6de93a7e966b0773aa07e8b740f35b --- /dev/null +++ b/Lucian-Academy/Books/Time Travel/Time Travel as Medicine 2.txt @@ -0,0 +1,15 @@ +["Green, L 2021, Time Travel as Medicine 2, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Time Travel as Medicine 2
+15. Time travellers agreed that I was healthy by agreeing with God in the exposition.
+16. I time travelled to examine the safety of the algorithm.
+17. The Queen helped me teach the actual science of medicine of time travel (literature about correct planning).
+18. I integrated the command to protect one's health when time travelling.
+19. The requirement to time travel was to be healthy.
+20. I mind drew the equality of the person with happiness when time travelling.
+21. I deconstructed (rewrote) the time travel medicine algorithm.
+22. I connected the exposition groups time travel and medicine to the computationalism theme.
+23. I time travelled and gave medicine to everyone except one person, and for that person this was corrected.
+24. I time travelled and teleported in myself, others and bots of them and 100% bots with permission to be medicine students.
+25. After collecting his thoughts from other times, the doctor time travelled to the time.
+26. Medicine (the quantum box) met time travel's requirements.
+27. The founder used time travel as medicine to transcend ideas.
+28. 
The algorithm corrected the medicine that was not going to reach the threshold for time travel."] \ No newline at end of file diff --git a/Lucian-Academy/Books/Time Travel/Time Travel as Medicine 3.txt b/Lucian-Academy/Books/Time Travel/Time Travel as Medicine 3.txt new file mode 100644 index 0000000000000000000000000000000000000000..26e29a3a2728b79244398ecff5b4f1a5f1a22820 --- /dev/null +++ b/Lucian-Academy/Books/Time Travel/Time Travel as Medicine 3.txt @@ -0,0 +1,16 @@ +["Green, L 2021, Time Travel as Medicine 3, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Time Travel as Medicine 3
+29. I monitored that virus-free people were around me using the spiritual screen when time travelling.
+30. The robot scientist set the maximum number of items per branch in Combination Algorithm Writer when tracking disease cases in time travel.
+31. The time traveller teleported a safe distance away during the pandemic.
+32. I wrote the desired output of the quantum box in its breasoning currency when time travelling.
+33. I prevented a headache from recordings when time travelling.
+34. I noticed that the same medicine could be used when time travelling to the past and future.
+35. I avoided the sun when time travelling.
+36. I wrote creative philosophy based on epistemological (straight to the point medical) experience when time travelling.
+37. I stated the positive comment on the positive event to maintain psychological health when time travelling.
+38. I wrote the happy breasoning when time travelling.
+39. I finished thoughts of the projected person according to the milieu when time travelling.
+40. I wrote a positive and separate psychoanalytic thought when time travelling.
+41. I acted responsibly in medicine by writing a database of positive or negative poles in politics, economics, linguistics, scene and key terms in relation to joining other key terms when time travelling.
+42. I calculated the medically optimal time to time travel. 
+"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Time Travel/Time Travel as Medicine 4.txt b/Lucian-Academy/Books/Time Travel/Time Travel as Medicine 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..305f203d44197b06fec01185e98b8665cf1ddbd4 --- /dev/null +++ b/Lucian-Academy/Books/Time Travel/Time Travel as Medicine 4.txt @@ -0,0 +1,16 @@ +["Green, L 2021, Time Travel as Medicine 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Time Travel as Medicine 4 +43. I mind mapped the time travel itinerary for full brain potential. +44. I helped maintain proper body function (continued flow through the body) when time travelling. +45. I checked for good medical function in the hours prayer when time travelling. +46. I agreed with proper social function when time travelling. +47. I mind read the medical status of the person when time travelling. +48. I added to my medical knowledge when time travelling. +49. I mind read and went through medical students' comments when time travelling. +50. I made the parts of the medical text to breasonings algorithm self-standing. +51. I checked the inference 'in medicine' when time travelling. +52. When I was time travelling the body-double filled in for me. +53. I found whether the time travellers had medicine teaching skills. +54. The non-computer breasoner created the time travelling medicine bot. +55. I 'bolted down' to proper medical function when time travelling. +56. I had positive medical effects from time travelling. 
+"] \ No newline at end of file diff --git a/Lucian-Academy/Books/Time Travel/Time Travel as Medicine 5.txt b/Lucian-Academy/Books/Time Travel/Time Travel as Medicine 5.txt new file mode 100644 index 0000000000000000000000000000000000000000..336db5dfeb2c115038e63915e7de6bb6387acb85 --- /dev/null +++ b/Lucian-Academy/Books/Time Travel/Time Travel as Medicine 5.txt @@ -0,0 +1,25 @@ +["Green, L 2021, Time Travel as Medicine 5, Lucian Academy Press, Melbourne.","Green, L 2021",1,"Time Travel as Medicine 5
+57. I mind cured in a language translated using Cultural Translation Tool in a different time.
+58. The teacher found that time travel was appropriate medicine for scientists to examine.
+59. The medicine finished on time in time travel.
+60. The spacetime traveller sketched Mars during art therapy.
+61. The spacetime traveller danced to the Jupiter soundtrack during music therapy.
+62. I programmed the vertical spiritual screen to display the medicine with time travel readings.
+63. I verified that the boxes were identical (that they were supported with the A threshold and that education was recent enough) when time travelling.
+64. There was access to time travel medicine for all.
+65. The payer paid the quantum box programmer for time, not value, before time travel.
+66. The customised time travel business model was articulation to creativity on either side.
+67. The time travelling bot used full brain potential to mind map the exam for a better result.
+68. I was protected from radiation and unwanted timing when departing when time travelling.
+69. I automated the medicine academy to maintain safe time travel.
+70. I researched philosophy of medicine to improve time travel.
+71. I wrote an algorithm that brought forward time travel knowledge about medicine.
+72. I set aims and maintained my health while time travelling to achieve them.
+73. I gave the time traveller medicine when they needed a skill such as breasoning.
+74. 
I studied the factors such as education and politics which helped form a time travelling medicine student.
+75. I included time travel and medicine as necessary texts to sell a book.
+76. I studied the medicine of all things when time travelling.
+77. I planned for immortality by time travelling.
+78. I could cure by time travelling facing east in the Earth's magnetic field.
+79. I planted the forest and time travelled to see it.
+80. I used the text to breasoning diagnostic algorithm of the spiritual computer when time travelling."] \ No newline at end of file diff --git a/Lucian-Academy/Books1/.DS_Store b/Lucian-Academy/Books1/.DS_Store new file mode 100644 index 0000000000000000000000000000000000000000..48f99bd22c9eac5cee157e85bb7acd9492d182b0 Binary files /dev/null and b/Lucian-Academy/Books1/.DS_Store differ diff --git a/Lucian-Academy/Books1/algs/.DS_Store b/Lucian-Academy/Books1/algs/.DS_Store new file mode 100644 index 0000000000000000000000000000000000000000..baa2b8a69ee4edf5a76f0790a8a7930031c7ab84 Binary files /dev/null and b/Lucian-Academy/Books1/algs/.DS_Store differ diff --git a/Lucian-Academy/Books1/algs/lgalgs_a.txt b/Lucian-Academy/Books1/algs/lgalgs_a.txt new file mode 100644 index 0000000000000000000000000000000000000000..6103c7e2489f9ecc162230d4392ffd747f7a27b1 --- /dev/null +++ b/Lucian-Academy/Books1/algs/lgalgs_a.txt @@ -0,0 +1 @@ +["","","",""] \ No newline at end of file diff --git a/Lucian-Academy/Books1/args/.DS_Store b/Lucian-Academy/Books1/args/.DS_Store new file mode 100644 index 0000000000000000000000000000000000000000..30f7b5760d33d9f961e4c729eb706ed83436a7b0 Binary files /dev/null and b/Lucian-Academy/Books1/args/.DS_Store differ diff --git a/Lucian-Academy/Books1/args/lgtext_a.txt b/Lucian-Academy/Books1/args/lgtext_a.txt new file mode 100644 index 0000000000000000000000000000000000000000..6103c7e2489f9ecc162230d4392ffd747f7a27b1 --- /dev/null +++ b/Lucian-Academy/Books1/args/lgtext_a.txt @@ -0,0 +1 @@ +["","","",""] \ No newline 
at end of file diff --git a/Lucian-Academy/LICENSE b/Lucian-Academy/LICENSE new file mode 100644 index 0000000000000000000000000000000000000000..920183ffbdaca89fc6faee40b6de08a6c8a061c7 --- /dev/null +++ b/Lucian-Academy/LICENSE @@ -0,0 +1,29 @@ +BSD 3-Clause License + +Copyright (c) 2021, Lucian Green +All rights reserved. + +Redistribution and use in source and binary forms, with or without +modification, are permitted provided that the following conditions are met: + +1. Redistributions of source code must retain the above copyright notice, this + list of conditions and the following disclaimer. + +2. Redistributions in binary form must reproduce the above copyright notice, + this list of conditions and the following disclaimer in the documentation + and/or other materials provided with the distribution. + +3. Neither the name of the copyright holder nor the names of its + contributors may be used to endorse or promote products derived from + this software without specific prior written permission. + +THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" +AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE +IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE +DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE +FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL +DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR +SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER +CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, +OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE +OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. 
diff --git a/Lucian-Academy/README.md b/Lucian-Academy/README.md new file mode 100644 index 0000000000000000000000000000000000000000..403f7ae33fb8e55288c85eb066af4109c2a2d513 --- /dev/null +++ b/Lucian-Academy/README.md @@ -0,0 +1,185 @@ +# Lucian-Academy
+
+Automates Lucian Academy with practise bots, submission and assignment handling. I wrote the algorithms in SWI-Prolog.
+
+See Daily Regimen for scripts to run Lucian Academy and help meditate.
+
+* combophil_alg_log.pl - helps record writing up to a number of algorithms per chapter file
+* combophil_alg_log.txt - data file for combophil_alg_log.pl
+* la_com_bot_prep.txt - paste this file into Terminal each day to create and maintain bots
+* combophil_grammar_logic_to_alg.pl - used by la_com_bot_prep.txt to create bots
+* folders.pl - data file containing Lucian's philosophy files
+* grad_student_numbers.txt - data file containing graduated student data
+* la_com_marks.pl - helps mark assignments, used by la_com_submit.pl
+* la_com_submit.pl - helps students submit assignments (on the web)
+* la_com1.pl - submits and manages assignments of practise bots
+* luciansphilosophy.txt - File to expand with Algorithm-Writer-with-Lists/grammar_logic_to_alg_random.pl (see Daily Regimen).
+* short_essay_helper3_agps.pl - version of essay helper to write essays, used by la_com1.pl
+* student_numbers.txt - data file for student data
+* wrap_sources.pl - adds reference data (i.e. Title) to files (named "Department by Lucian Green Title") from sources and places them in sources1, for use by essay helper.
+* wrap_sources2.pl - adds reference data (i.e. Title) to files (named "Title") from sources and places them in sources1, for use by essay helper.
+
+
+# Getting Started
+
+Please read the following instructions on how to install the project on your computer for automating the Lucian Academy.
+
+# Prerequisites
+
+* Please download and install SWI-Prolog for your machine at `https://www.swi-prolog.org/build/`. 
+
+* You may need to install gawk using Homebrew.
+
+* Install Translation Shell on Mac, etc.
+Change line in
+```
+culturaltranslationtool/ctt2.pl
+trans_location("../../../gawk/trans").
+```
+to the correct location of trans.
+
+# 1. Install manually
+
+* Download:
+* this repository
+* listprologinterpreter
+* Languages
+* Cultural Translation Tool. Requires Translation Shell (you may need to install gawk using Homebrew. Install Translation Shell on Mac, etc.
+Change line in culturaltranslationtool/ctt2.pl
+`trans_location("../../../gawk/trans").` to the correct location of trans).
+* Algorithm-Writer-with-Lists
+* Text-to-Breasonings. (Caution: Before running texttobr, think of two radio buttons put on recordings, put through with prayer, nut and bolt, quantum box prayer 1, 1, 0.5 cm and 1, 1, 0.5 cm. Follow instructions in Instructions for Using texttobr(2) when using texttobr, texttobr2 or mind reader to avoid medical problems).
+* mindreader
+
+# 2. Or Install from List Prolog Package Manager (LPPM)
+
+* Download the LPPM Repository:
+
+```
+mkdir GitHub
+cd GitHub/
+git clone https://github.com/luciangreen/List-Prolog-Package-Manager.git
+cd List-Prolog-Package-Manager
+swipl
+['lppm'].
+lppm_install("luciangreen","Lucian-Academy").
+halt.
+```
+
+# Running
+
+* In Shell:
+`cd Lucian-Academy`
+`swipl`
+
+# combophil_alg_log
+
+* Helps write algorithms for philosophies, keeping a tally of the number of algorithms written per file, where the algorithm stops asking for algorithms for files where the tally is equal to a set number, e.g. 4.
+
+* To run:
+```
+['combophil_alg_log.pl'].
+combophil_alg_log.
+```
+
+```
+?- combophil_alg_log.
+["Fundamentals of Meditation and Meditation Indicators","FUNDAMENTALS OF MEDITATION by Lucian Green Hours Prayer 1 of 4.txt",0,algorithms,"8. I prepared to endorse Nietzsche’s brilliance. I did this by writing Alexius Meinong’s probable comments on the Medicine blog. First, I called it Anarchy 3. Second, I liked brilliance. 
Third, I liked Nietzsche’s brilliance. In this way, I prepared to endorse Nietzsche’s brilliance by writing Alexius Meinong’s probable comments on the Medicine blog."]
+
+Algorithm? (y/n):
+|: y
+["Computational English","COMPUTATIONAL ENGLISH by Lucian Green Computational English Argument 1 of 4.txt",0,algorithms,"8. I prepared to calculate the used space in the house in the short story. I did this by calculating the space complexity in Computational English. First, I counted the number of characters of space that the short story’s world was in length. Second, I skipped counting the number of characters of space that the short story’s world, where this space was revisited, in length. Third, I summed only the cumulative number of characters of space that the short story’s world was in length. In this way, I prepared to calculate the used space in the house in the short story by calculating the space complexity in Computational English."]
+
+Algorithm? (y/n):
+|: n
+true.
+```
+
+In the above, only the first paragraph's tally is increased after entering y; entering n exits the program. To skip a paragraph and try a new paragraph, re-run the algorithm.
+
+
+# combophil_grammar_logic_to_alg-vps
+
+* Run daily to maintain bots.
+
+* Make sure Text-to-Breasonings/brdict1.txt has breasonings from the folders (see instructions to run Text-to-Breasonings).
+
+* Paste the contents of `la_com_bot_prep.txt` into Terminal when in the Lucian-Academy folder.
+
+
+
+# la_com_submit
+
+* Helps students submit assignments (on the web).
+
+* To run:
+```
+['la_com_submit.pl'].
+la_com_start_server(8000). 
+```
+
+* Go to:
+```
+http://localhost:8000/
+http://localhost:8000/generate_student_number
+```
+See the first image asking for name, student number, course, essay topic and file contents from short_essay_helper.pl, the second image notifying that "Lucian, you have earned the mark, 66 for the topic Computational English Argument in the course Computational English on 15/2/2021 at 23:18.", the third image asking for name and student number to generate a student number and the fourth image notifying that "The email address "lg@me.com" for "Lucian" has the student number "1".".
+
+Screen Shot 2021-02-15 at 10 43 56 pm
+
+Screen Shot 2021-02-15 at 11 18 59 pm
+
+Screen Shot 2021-02-15 at 10 43 49 pm
+
+Screen Shot 2021-02-15 at 11 19 29 pm
+
+* This web site uses a different student data file from `la_com1.pl`.
+
+
+# la_com1
+
+* Generates and records completion of practise essays. A coin is tossed for a new student to come, and for each old student to complete an assignment. There IS breasoning involved, so please see text to breasonings safety tips.
+
+* The number of possible students to choose from, 1000, may be edited in:
+```
+new_student_number(N) :-
+ random(X),N is ceiling(1000*X),!.
+```
+
+in `la_com1.pl`.
+
+To run:
+```
+['la_com1.pl'].
+la_com1.
+```
+
+
+# wrap_sources and wrap_sources2
+
+* `wrap_sources.pl` adds reference data (i.e. Title) to files (named "Department by Lucian Green Title") from sources and places them in sources1, for use by essay helper.
+
+* `wrap_sources2.pl` adds reference data (i.e. Title) to files (named "Title") from sources and places them in sources1, for use by essay helper.
+
+* To run:
+```
+['wrap_sources.pl'].
+wrap_sources.
+```
+or
+```
+['wrap_sources2.pl'].
+wrap_sources.
+```
+
+# Versioning
+
+We will use SemVer for versioning. 
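The coin-toss scheme described under la_com1 above can be illustrated outside Prolog. The following is a minimal Python sketch, not part of the repository: `new_student_number` mirrors `new_student_number/1`'s `ceiling(1000*X)` draw, and `daily_round` and the 0.5 probability are illustrative assumptions about the per-student coin toss.

```python
import random

def new_student_number(rng, pool_size=1000):
    # Mirrors new_student_number/1 in la_com1.pl:
    # random(X), N is ceiling(1000*X) yields an integer in 1..1000.
    return rng.randint(1, pool_size)

def daily_round(enrolled, rng):
    # One coin toss for a new student to come, then one toss per
    # existing student to decide who completes an assignment today.
    if rng.random() < 0.5:
        enrolled.add(new_student_number(rng))
    return [s for s in sorted(enrolled) if rng.random() < 0.5]
```

Note that, as in `la_com1.pl`, a drawn student number may already be enrolled, so a "new student" toss sometimes re-selects an existing student.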
+ +# Authors + +Lucian Green - Initial programmer - Lucian Academy + +# License + +I licensed this project under the BSD3 License - see the LICENSE.md file for details diff --git a/Lucian-Academy/a_tally.txt b/Lucian-Academy/a_tally.txt new file mode 100644 index 0000000000000000000000000000000000000000..5c26e0cf6e7cb54426fd2317d6bf61b16b966ef5 --- /dev/null +++ b/Lucian-Academy/a_tally.txt @@ -0,0 +1 @@ +[3,3] \ No newline at end of file diff --git a/Lucian-Academy/bot rest dr.txt b/Lucian-Academy/bot rest dr.txt new file mode 100644 index 0000000000000000000000000000000000000000..76b6f0e1d6e7200882fc4b4ca9717c1804f91340 --- /dev/null +++ b/Lucian-Academy/bot rest dr.txt @@ -0,0 +1,209 @@ +Bot rest of daily regimen: +At the start of the day: Cosmology, +Vocational radio button for each of: +Meditation (80 lucian mantras, +80 green sutras, +80 arem mantras, +80 friendliness sutras, +80 dao mantras and +80 dao sutras) +Also for the following, which also has nut and bolt and quantum box/prayer: No digestive system pops from practising the sutra +1 billion done up As for each thought +Earth's magnetic poles match earth's geographic pole +No white thing entering base of spine causing headache +Moon remediation +Yoga +Qigong +Professor meditation +Fountain of youth +Economics selling +Neo-Hindu +The Daoist connection from brain (memory) to mouth +Turn off others using my breasonings except for family and their businesses +Politeness +I like the Queen +I like each person when they’re ready for it +Turn off thoughts when there’s a role +Daoist developed seen-as version +Same as going to England to listen to Queen +Go to the universe with positive people thought of to 5 levels each time there is a breasoning +Celebrity no mistakes from me and you too +Mental control of body +Solve customer problems +My two fonts - breason out text and stagger same indent level to find a difference +- Second font +Examined life +- Go along with thoughts +stop pain, inflammation, help blood 
flow, relaxation and well-being, and deep-tissue massage +link in my character at end of each thought +Also for the following, which also has nut and bolt and quantum box/prayer: +Meditation protector currant bun +Also for the following, which also has nut and bolt and quantum box/prayer: +No (hallucinatory) appearances +Also for the following, which also has nut and bolt and quantum box/prayer: +No headache +Also for the following, which also has nut and bolt and quantum box/prayer: +No criminals met +Also for the following, which also has nut and bolt and quantum box/prayer: +Turn off workload from all employees including you below you, +Also for the following, which also has nut and bolt and quantum box/prayer: +Detect and turn off workloads using manager algorithm +Also for the following, which also has nut and bolt and quantum box/prayer: +Detect and turn off workloads using manager network. +In the morning (not at night) to help sleep (50 As for following, one time): +Five strawberries for recordings +50 departments +file2qb.txt-Breason out 2 (Lucian, Adrian) *10 br from file2qb.txt medicine) with B, B to B and Going Against Itself +i +2 +3 +4 +5 +6 +the +2 +3 +4 +5 +6 +to +2 +3 +4 +5 +6 +and +2 +3 +4 +5 +6 +in +2 +3 +4 +5 +6 +by +2 +3 +4 +5 +6 +of +2 +3 +4 +5 +6 +this +2 +3 +4 +5 +6 +prepared +2 +3 +4 +5 +6 +a +2 +3 +4 +5 +6 +12 pink, 1 gold circles from my pop song +Seven.txt +royal bedspread and +pink food +To have all breasonings dotted on +For you to take radio buttons off afterwards +To have all radio buttons right +To have ten breasoning agreement or disagreement As as each detail +Breasonings and pre-breasoned breasonings with an area of study associated with it and synthesise them per sentence during day +Ten employees*25 (sic) breasonings per day +Cosmology +One br*each of (seen-as version, +Mirror-developed, +Connection) +Virality +Editing +Publishing +Breasonings in Publishing to be breasoned out +Good memory on, off for each thought +No 
nightmares +Defence +Defence and cosmology +Illuminati +For being a school captain (2*(sacrifice to say sorry, don’t mean it and hide it away)) - EOL +Longevity with Reishi Mushroom, +Goji, Ginseng, +He-Shou-Wu, +Gotu Kola and +Schisandra +Turn off, radio buttons, br 5 different objects said by cosmologue for sales (for LMY, [SS x, music xx]), dot on if a person wants to buy, then breason out five different objects said by cosmologue per sale, turn on +The algorithm to refresh sales including dotting on if a person wants to buy during the day +Dot on breasonings from song and philosophy sentences and breason out breasonings, +Breasoning out these six breasonings +Maharishi doctor’s details +Turn off, so they don’t know me yet and finish for me. +Turn off, so others do not request work each day +Turn off, so king’s titillations take me to number 1 +Celebrities sell each thing they say in a sex-sells sort of way - 500 As of following per day and +Celebrity thought A +Celebrity thought B +Seen-as version +Mirror-developed +Large idea +Connection +Cosmology/pop so writing appears to you (even below the printed text) when reading) +No pressures of politics +Large idea B and +Medicine +Listen to unwanted thoughts +No unwanted thoughts +No unwanted thoughts in politics +Friends around me +Be interested in the current thought (to have As done) +Nationalism +I give ideas about products to people who I meet and who appear. 
+Philosophy: think of the backwards direction of causality in a sentence +Sales +Religions to lm +Student thoughts +- Reward, +b for each gift, +product, +donation, +volunteer +For each word, object and note:The product, and the developed seen-as version +Sales, and the developed seen-as version +Education thoughts, and the developed seen-as version +These for each of the last three lines: B, +and the developed seen-as version +B to B, +and the developed seen-as version +2234 +3234 +Detail +Ten comments, update (using different pretty print orientations and colours) +A, +B, +C, +D, +E, +N seen-as versions +Go against self in mistakes +Protection in politics +Turn off unwanted messages with my body while mentally controlling my body. +Algorithm writer +- English details +Breasoning writer +Essay writer +No mistaken people +The big idea, etc. medicine +No doubles from time travel +No storms in teacups from Barrabool +Realise the hand breasoning switch is done for you each day from the first time you switch it on, which does the following: \ No newline at end of file diff --git a/Lucian-Academy/breasoning_and_detail_log.pl b/Lucian-Academy/breasoning_and_detail_log.pl new file mode 100644 index 0000000000000000000000000000000000000000..5f0898fa5d56f459bf8e1aea16d41ed53f3e3013 --- /dev/null +++ b/Lucian-Academy/breasoning_and_detail_log.pl @@ -0,0 +1,106 @@ +% ["Creating and Helping Pedagogues","CREATE AND HELP PEDAGOGUES by Lucian Green Pedagogy Helper - Write on Breasoning - Philosophy 3 of 4.txt",0,algorithms," 28. *I noticed the breasoning."] + +:-include('../listprologinterpreter/la_strings.pl'). +:-include('../listprologinterpreter/la_strings_string.pl'). +:-include('word_count.pl'). +:- use_module(library(date)). + +/** + +?- add_to_breasoning_log(["file","file.txt"]). +?- add_to_detail_log(["string","word"]). +?- view_breasoning_and_detail_log. 
+Breasoning Log +[2021,3,21,20,8,35.75380301475525,3,words] +[2021,3,21,20,8,51.36218309402466,3,words] + +Detail Log +[2021,3,21,19,32,43.984174966812134,1,words] +[2021,3,21,20,9,13.19822907447815,1,words] +[2021,3,21,20,11,5.764134883880615,1,words] +[2021,3,21,20,12,0.05728793144226074,1,words] + +**/ + +add_to_breasoning_log([Type,Content]) :- + + get_time(TS),stamp_date_time(TS,date(Year,Month,Day,Hour1,Minute1,Seconda,_A,_TZ,_False),local), + + % add to breasoning log + % add file + + word_count([Type,Content],Words), + + phrase_from_file_s(string(String1), "brlog.txt"), + string_codes(String02b,String1), + atom_to_term(String02b,String02a,[]), + + append(String02a,[[Year,Month,Day,Hour1,Minute1,Seconda,Words,words]],String02c), + + term_to_atom(String02c,String02a_b), + string_atom(String02a_c,String02a_b), + + (open_s("brlog.txt",write,Stream1), + write(Stream1,String02a_c), + close(Stream1)),!, + + concat_list(["brlog_",Year,Month,Day,Hour1,Minute1,Seconda],File1), + (Type="file"-> + (concat_list(["cp ",Content," brlog/",File1,".txt"],Command), + shell1_s(Command)); + ((concat_list(["brlog/",File1,".txt"],File2), + open_s(File2,write,Stream2), + write(Stream2,Content), + close(Stream2)),!)). 
+ +add_to_detail_log([Type,Content]) :- + + get_time(TS),stamp_date_time(TS,date(Year,Month,Day,Hour1,Minute1,Seconda,_A,_TZ,_False),local), + + % add to detail log + % add file + + word_count([Type,Content],Words), + + phrase_from_file_s(string(String1), "detlog.txt"), + string_codes(String02b,String1), + atom_to_term(String02b,String02a,[]), + + append(String02a,[[Year,Month,Day,Hour1,Minute1,Seconda,Words,words]],String02c), + + term_to_atom(String02c,String02a_b), + string_atom(String02a_c,String02a_b), + + (open_s("detlog.txt",write,Stream1), + write(Stream1,String02a_c), + close(Stream1)),!, + + concat_list(["detlog_",Year,Month,Day,Hour1,Minute1,Seconda],File1), + (Type="file"-> + (concat_list(["cp ",Content," detlog/",File1,".txt"],Command), + shell1_s(Command)); + ((concat_list(["detlog/",File1,".txt"],File2), + open_s(File2,write,Stream2), + write(Stream2,Content), + close(Stream2)),!)). + +view_breasoning_and_detail_log :- + + phrase_from_file_s(string(String11), "brlog.txt"), + string_codes(String02b1,String11), + atom_to_term(String02b1,String02a1,[]), + + writeln("Breasoning Log"), + + findall(_,(member(Item,String02a1),writeln(Item)),_), + + nl, + + phrase_from_file_s(string(String1), "detlog.txt"), + string_codes(String02b,String1), + atom_to_term(String02b,String02a,[]), + + writeln("Detail Log"), + + findall(_,(member(Item,String02a),writeln(Item)),_). 
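The two log predicates above share one pattern: read the whole log file as a single Prolog term, append a timestamped entry, and write the term back. A minimal sketch of that round-trip, assuming plain SWI-Prolog built-ins (`read_file_to_string/3`, `term_string/2`) rather than this repository's `phrase_from_file_s`/`open_s` helpers; the predicate name `append_log_entry/2` is illustrative:

```prolog
:- use_module(library(readutil)).

% Sketch only: the same read-parse-append-write cycle as add_to_breasoning_log/1.
% The log file holds a single Prolog list term, e.g. [] initially.
append_log_entry(File, Entry) :-
    read_file_to_string(File, Str, []),
    term_string(Log0, Str),              % parse the stored list
    append(Log0, [Entry], Log1),
    term_string(Log1, Str1),
    setup_call_cleanup(open(File, write, S),
                       write(S, Str1),
                       close(S)).
```

Because the file is rewritten wholesale each time, concurrent writers would race; the original code has the same property.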
+ diff --git a/Lucian-Academy/brlog.txt b/Lucian-Academy/brlog.txt new file mode 100644 index 0000000000000000000000000000000000000000..0637a088a01e8ddab3bf3fa98dbe804cbde1a0dc --- /dev/null +++ b/Lucian-Academy/brlog.txt @@ -0,0 +1 @@ +[] \ No newline at end of file diff --git a/Lucian-Academy/brlog/.DS_Store b/Lucian-Academy/brlog/.DS_Store new file mode 100644 index 0000000000000000000000000000000000000000..5008ddfcf53c02e82d7eee2e57c38e5672ef89f6 Binary files /dev/null and b/Lucian-Academy/brlog/.DS_Store differ diff --git a/Lucian-Academy/check_books_format.pl b/Lucian-Academy/check_books_format.pl new file mode 100644 index 0000000000000000000000000000000000000000..1b2ac4c8d0cf7cad7c8fcd7867700567f943678f --- /dev/null +++ b/Lucian-Academy/check_books_format.pl @@ -0,0 +1,69 @@ +:-include('../listprologinterpreter/la_strings.pl'). +:-include('../listprologinterpreter/la_strings_string.pl'). + +% copy Books folder into sources, move out dot-stub.txt + +check_books_format:- + directory_files("sources1/",F), + delete_invisibles_etc(F,G), +%%trace, +%SepandPad="#@~%`$?-+*^,()|.:;=_/[]<>{}\n\r\s\t\\!'0123456789", + findall(_,(member(Folderx1,G), + string_concat("sources1/",Folderx1,Folderx), + directory_files(Folderx,F1), + delete_invisibles_etc(F1,G1), + + findall(_,(member(Filex1,G1), + %trace, + foldr(string_concat,["sources1/",Folderx1,"/",Filex1],Filex), + %foldr(string_concat,["sources1/",Folderx1,"/",Filex1],Filexx), + + %split_string(Filex1," "," ",Filex2), + %Filex3=Filex2, + %findall([Filex31," "],(member(Filex31,Filex3)),Filex32), + %maplist(append,[Filex32],[Filex321]), + %concat_list(Filex321,Filex33), + + %string_concat(Filex1,".txt",Filex33), + phrase_from_file_s(string(String00a), Filex), + + %string_codes(String02c,String00a), + + %atomic_list_concat(B,"\"",String02c), + %atomic_list_concat(B,"\\""",C), + %atom_string(C,String02b), + + once((string_concat(A,_,Filex1),string_length(A,4))), + (A="dot-" -> + + true;%Line=String00a; + + 
(string_codes(String02c,String00a), + (catch((term_to_atom(String02c1,String02c), + + + String02c1=[_,_,_,_]),_B,(%writeln(B), + false))->true; + writeln([Folderx1,Filex1,corrupted,format])) + %concat_list(["[\"Green, L 2022, ",Filex4,", Lucian Academy Press, Melbourne.\",\"Green, L 2022\",1,\"",String02b,"\"]"],Line) + )) + %atom_to_term(String02b,String02a,[]), + +/*, + foldr(string_concat,["sources1/",Folderx1,"/"],Folderxx1), + (exists_directory(Folderxx1)->true;make_directory(Folderxx1)), + (open_s(Filexx,write,Stream1), +%% string_codes(BrDict3), + write(Stream1,Line), + close(Stream1)) + + */ + ),_)),_). + +delete_invisibles_etc(F,G) :- + findall(J,(member(H,F),atom_string(H,J),not(J="."),not(J=".."),not(string_concat(".",_,J))),G). + +string(String) --> list(String). + +list([]) --> []. +list([L|Ls]) --> [L], list(Ls). diff --git a/Lucian-Academy/combophil_alg_log.pl b/Lucian-Academy/combophil_alg_log.pl new file mode 100644 index 0000000000000000000000000000000000000000..f8db17ed0a568044707e3f27e21768c0dffd46ba --- /dev/null +++ b/Lucian-Academy/combophil_alg_log.pl @@ -0,0 +1,128 @@ +%% enter source, alg, pl or lp (with types, open), for subj/ass, date br (may be manually entered) + +%% x one alg to enter alg specs + +%% x one alg to suggest algs, like combophil with filename - upper limit of algs per chapter, records algs per chapters +% - 4 per file, (then 16 per file x) + +%% combophil.pl + +%% Finds combinations of lines of philosophy + +:- include('../listprologinterpreter/la_strings.pl'). +:- include('folders.pl'). + +string(String) --> list(String). + +list([]) --> []. +list([L|Ls]) --> [L], list(Ls). + +%% e.g. combophil(2). 
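`check_books_format` depends on the `string//1` DCG defined at the end of the file: it matches any list of character codes, so `phrase_from_file_s` binds the file's entire contents. A reduced sketch of the same validity test (a file is well-formed when its text parses as a four-element list), using the stock `phrase_from_file/2` from `library(pio)`; the predicate name `file_has_book_format/1` is illustrative, not from the repository:

```prolog
:- use_module(library(pio)).

string([]) --> [].
string([C|Cs]) --> [C], string(Cs).

% Succeeds iff File's entire contents parse as a 4-element Prolog list --
% the format check_books_format applies to each non-"dot-" source file.
file_has_book_format(File) :-
    phrase_from_file(string(Codes), File),
    string_codes(Str, Codes),
    catch(term_string(Term, Str), _, fail),
    Term = [_, _, _, _].
```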
to write on a combination of philosophies + +combophil_alg_log :- + N1 = 4, %% maximum number of algorithms per file + % find files with n or fewer algorithms + % (read re-enabled: String02a is used by delete_all and append below) + phrase_from_file_s(string(String00a), "combophil_alg_log.txt"), + string_codes(String02b,String00a), + atom_to_term(String02b,String02a,[]), + + working_directory(_,'Books'), + folders(Folders), + %Folders=["a000"], + findall([Dept,G00],(member(Dept,Folders), + concat_list([Dept],Dept1), + directory_files(Dept1,F), + delete_invisibles_etc(F,G), + member(G00,G)),G1), + delete_all(String02a,G1,G2), + findall([G51,G52,0],(member([G51,G52],G2)),G6), + append(String02a,G6,G3), + combophil_alg_log(N1,G3,G4), + + % (write-back re-enabled so per-file algorithm counts persist between runs) + term_to_atom(G4,String1), + (open_s("combophil_alg_log.txt",write,Stream1), + write(Stream1,String1), + close(Stream1)), + !. + +delete_all([],G,G) :- !. +delete_all(G0,G1,G2) :- + G0=[[String1,String2,_]|Strings2], + delete(G1,[String1,String2],G3), + delete_all(Strings2,G3,G2). + % choose a file, algorithm y or n, record if y + % NOTE: part of the clause below was lost to "<...>" tag stripping in transit + % (the "< N1" comparisons and the choose-and-ask step); the prompt-and-record + % step is a labelled reconstruction of that comment's intent, not the original code. +combophil_alg_log(N1,G1,G2) :- + ((member([_Dept00,_Folder00,N20],G1), N20 < N1) -> + (findall([Dept01,Folder01,N201],(member([Dept01,Folder01,N201],G1),N201 < N1),G5), + random_member([Dept,Folder,N2],G5), + writeln([Dept,Folder,"Write an algorithm for this file? (y/n)"]), + read_string(user_input,"\n","\r",_,YN), + (YN="n" -> G1=G2 + ;(delete(G1,[Dept,Folder,N2],G3), + N3 is N2+1, + append(G3,[[Dept,Folder,N3]],G4), + combophil_alg_log(N1,G4,G2)))); + G1=G2). + + +delete_invisibles_etc(F,G) :- + findall(J,(member(H,F),atom_string(H,J),not(J="."),not(J=".."),not(string_concat(".",_,J))),G).
+ + +get_texts1(Dept0,Dept,Text) :- + %findall(Texts2,(%member(Dept0,Dept), + concat_list([Dept0,"/",Dept + ],Dept1), + %directory_files(Dept1,F), + %delete_invisibles_etc(F,G), +%trace, +%SepandPad="#@~%`$?-+*^,()|.:;=_/[]<>{}\n\r\s\t\\!'0123456789", + %trace, + %findall(String02b,(member(Filex1,G), + %string_concat(Dept1,Filex1,Filex), + phrase_from_file_s(string(String00a), Dept1), + string_codes(Text,String00a). + /**),Texts1), + %trace, + choose_texts(Texts1,_Texts2,Text_a), + % choose_texts(Texts2,Texts3,Text_b), + % choose_texts(Texts3,_Texts4,Text_c), + maplist(append,[[[Text_a]]],[Texts]). + **/ + +choose_texts(Texts1,Texts2,Text) :- + random_member(Text,Texts1), + delete(Texts1,Text,Texts2). \ No newline at end of file diff --git a/Lucian-Academy/combophil_alg_log.txt b/Lucian-Academy/combophil_alg_log.txt new file mode 100644 index 0000000000000000000000000000000000000000..0637a088a01e8ddab3bf3fa98dbe804cbde1a0dc --- /dev/null +++ b/Lucian-Academy/combophil_alg_log.txt @@ -0,0 +1 @@ +[] \ No newline at end of file diff --git a/Lucian-Academy/detlog.txt b/Lucian-Academy/detlog.txt new file mode 100644 index 0000000000000000000000000000000000000000..0637a088a01e8ddab3bf3fa98dbe804cbde1a0dc --- /dev/null +++ b/Lucian-Academy/detlog.txt @@ -0,0 +1 @@ +[] \ No newline at end of file diff --git a/Lucian-Academy/detlog/.DS_Store b/Lucian-Academy/detlog/.DS_Store new file mode 100644 index 0000000000000000000000000000000000000000..19e7771c8863fb1cc39065845cb2220b9480db18 Binary files /dev/null and b/Lucian-Academy/detlog/.DS_Store differ diff --git a/Lucian-Academy/folders.pl b/Lucian-Academy/folders.pl new file mode 100644 index 0000000000000000000000000000000000000000..220c9228a1a4f9f3aac2b2480414a92f9a696ba9 --- /dev/null +++ b/Lucian-Academy/folders.pl @@ -0,0 +1 @@ +folders(["Computational English","Creating and Helping Pedagogues","Fundamentals of Meditation and Meditation Indicators","Fundamentals of Pedagogy and Pedagogy 
Indicators","Medicine","Lecturer","Short Arguments","Mind Reading","Time Travel","Delegate workloads, Lecturer, Recordings"]). \ No newline at end of file diff --git a/Lucian-Academy/grad_student_numbers.txt b/Lucian-Academy/grad_student_numbers.txt new file mode 100644 index 0000000000000000000000000000000000000000..0637a088a01e8ddab3bf3fa98dbe804cbde1a0dc --- /dev/null +++ b/Lucian-Academy/grad_student_numbers.txt @@ -0,0 +1 @@ +[] \ No newline at end of file diff --git a/Lucian-Academy/la_com1.pl b/Lucian-Academy/la_com1.pl new file mode 100644 index 0000000000000000000000000000000000000000..8a8818ab1cb93688a4184615d29f491ee24c7478 --- /dev/null +++ b/Lucian-Academy/la_com1.pl @@ -0,0 +1,258 @@ +%*:-include('../listprologinterpreter/la_strings.pl'). +:-include('short_essay_helper3_agps.pl'). +%:-include('combophil/combophil_grammar_logic_to_alg.pl'). +%:-include('algwriter/grammar_logic_to_alg.pl'). +%:-include('meditationnoreplace.pl'). +%:-include('folders.pl'). +%:-include('sectest_p.pl'). +:-include('../listprologinterpreter/la_maths.pl'). +%:-include('../Text-to-Breasonings/truncate_words_conserving_formatting.pl'). + +:-include('../Text-to-Breasonings/text_to_breasonings.pl'). +:-include('../Text-to-Breasonings/texttobrall2_reading3.pl'). +:-include('../Text-to-Breasonings/meditatorsanddoctors.pl'). +%:-include('../Text-to-Breasonings/truncate.pl'). + +%:-include('../Text-to-Breasonings/meditationnoreplace.pl'). +:-include('../Algorithm-Writer-with-Lists/grammar_logic_to_alg.pl'). +:-include('breasoning_and_detail_log.pl'). + +:- use_module(library(date)). 
+ +la_com1:- + +get_curr_students(Curr_students), +writeln("New students"), + + (true%toss_coin + -> (new_student_number(First,Last,Student_number), + get_time(TS_of_enrollment),stamp_date_time(TS_of_enrollment,date(Year_of_enrollment,Month_of_enrollment,Day_of_enrollment,_Hour2,_Minute2,_Seconda,_A,_TZ,_False),local), + + directory_files("Books/",Courses1a), + delete_invisibles_etc(Courses1a,Courses1), + +%folders(Courses1), +random_member(Course,Courses1), + +random_member([Course_type,Years_to_complete,Essays_left,As],[["short_course",1,5,1],["diploma",4,10,1],["honours",2,10,30],["master",5,20,100],["phd",10,30,400]]), + +Year_of_completion is Year_of_enrollment+Years_to_complete,date_time_stamp(date(Year_of_completion,Month_of_enrollment,28,0,0,0,0,-,-),TS_of_completion), + + writeln([First,Last,Student_number,TS_of_enrollment,Year_of_enrollment,Month_of_enrollment,Day_of_enrollment,TS_of_completion,Year_of_completion,Month_of_enrollment,28,Course,Course_type,Years_to_complete,Essays_left,As]), + + append(Curr_students,[[First,Last,Student_number,TS_of_enrollment,Year_of_enrollment,Month_of_enrollment,Day_of_enrollment,TS_of_completion,Year_of_completion,Month_of_enrollment,28,Course,Course_type,Years_to_complete,Essays_left,As]],Curr_students2), + + term_to_atom(Curr_students2,String02a_b), + string_atom(String02a_c,String02a_b), + + (open_s("student_numbers.txt",write,Stream1), + write(Stream1,String02a_c), + close(Stream1)),! 
+ + + );(true + )), + + + writeln("New work"), + + get_curr_students(Curr_students2a), + get_grad_students(Grad_students2a), + + retractall(todays_students(_)), + + new_work(Curr_students2a,[],Curr_students_aa2,Grad_students2a,Grad_students_aa2), + + term_to_atom(Curr_students_aa2,String02a_b1), + string_atom(String02a_c1,String02a_b1), + + (open_s("student_numbers.txt",write,Stream10), + write(Stream10,String02a_c1), + close(Stream10)),!, + + writeln("Graduates"), + findall(_,(member(Grad_students_aa3,Grad_students_aa2), + writeln(Grad_students_aa3)),_), + + term_to_atom(Grad_students_aa2,String02a_b11), + string_atom(String02a_c11,String02a_b11), + + (open_s("grad_student_numbers.txt",write,Stream2), + write(Stream2,String02a_c11), + close(Stream2)),!, + + findall(Tally,todays_students(Tally),Tally2), + sum(Tally2,Tally31), + + Tally32 is Tally31+1000+1000, % politics, university + + length(Curr_students2a,LCS), + + Tally3=[LCS,Tally32], + + string_atom(Tally4,Tally3), + + (open_s("a_tally.txt",write,Stream3), + write(Stream3,Tally4), + close(Stream3)),!. + + +/** + get_curr_students(Curr_students_a), + length(Curr_students_a,P1),P2=0, + get_grad_students(String02a),length(String02a,_P3) + .**/ + +new_work([],Curr_students_aa,Curr_students_aa,Grad_students_aa,Grad_students_aa) :- !. 
+ +new_work(Curr_students1a,Curr_students_aa1,Curr_students_aa2,Grad_students_aa1,Grad_students_aa2) :- + +Curr_students1a=[[A,B,Student_number,TS_of_enrollment,Year_of_enrollment,Month_of_enrollment,Day_of_enrollment,TS_of_completion,Year_of_completion,Month_of_completion,Day_of_completion,Course,Course_type,Years_to_complete,Essays_left,As]|Curr_students1b], + + + (Essays_left is 0 -> + (( +Curr_students_aa1=Curr_students_aa3, + + get_time(TS),stamp_date_time(TS,date(Year_of_graduation,Month_of_graduation,Day_of_graduation,_Hour2,_Minute2,_Seconda,_A,_TZ,_False),local), + + +append(Grad_students_aa1,[[A,B,Student_number,TS_of_enrollment,Year_of_enrollment,Month_of_enrollment,Day_of_enrollment,TS,Year_of_graduation,Month_of_graduation,Day_of_graduation,Course,Course_type,Years_to_complete,Essays_left,As]],Grad_students_aa3) + + + + )); + + + (true%toss_coin + -> (get_texts(Course,Texts), + +%trace, + short_essay_helper(Texts,Course,3,Essay_0), + %writeln([essay_0,Essay_0]), + W is 50*4,%texttobr2(u,u,Essay_0,u,false,false,false,false,false,false,W,[auto,on]), + + texttobr2(u,u,Essay_0,u,[auto,on]), + + + texttobr(u,u,Essay_0,u), + + working_directory(_, '../Lucian-Academy/'), + + add_to_breasoning_log(["string",Essay_0]), + + /** + term_to_atom(Essay_0,Essay_01), + string_atom(Essay_02,Essay_01), + + + (open_s("essay_tmp1.txt",write,Stream1), + write(Stream1,Essay_02), + close(Stream1)),!, + + truncate("essay_tmp1.txt",14000,"file.txt"), + + **/ + + working_directory(_, '../Algorithm-Writer-with-Lists/'), + + Br is As*80, + grammar_logic_to_alg1(Essay_0,Br,GL_out1), + + working_directory(_, '../Lucian-Academy/'), + + add_to_detail_log(["string",GL_out1]), + + term_to_atom(GL_out1,GL_out2), + string_atom(GL_out,GL_out2), + + working_directory(_, '../Lucian-Academy/'), + + %texttobr2(u,u,GL_out,Br,false,false,false,false,false,false,W), + + texttobr2(u,u,GL_out,Br,[auto,on]), + + texttobr(u,u,GL_out,Br), + + + Essays_left2 is Essays_left-1, + 
+writeln([A,B,Student_number,TS_of_enrollment,Year_of_enrollment,Month_of_enrollment,Day_of_enrollment,TS_of_completion,Year_of_completion,Month_of_completion,Day_of_completion,Course,Course_type,Years_to_complete,Essays_left2,As]), + + assertz(todays_students(As)), + +append(Curr_students_aa1,[[A,B,Student_number,TS_of_enrollment,Year_of_enrollment,Month_of_enrollment,Day_of_enrollment,TS_of_completion,Year_of_completion,Month_of_completion,Day_of_completion,Course,Course_type,Years_to_complete,Essays_left2,As]],Curr_students_aa3), + + Grad_students_aa1=Grad_students_aa3 + + + );( + + Curr_students_aa1=Curr_students_aa3, + Grad_students_aa1=Grad_students_aa3 + + ))), + + new_work(Curr_students1b,Curr_students_aa3,Curr_students_aa2,Grad_students_aa3,Grad_students_aa2). + + +new_student_number(First,Last,N) :- + meditators(M1), + meditators2(M2), + append(M1,M2,M3), + length(M3,L), + random(X),N is ceiling(L*X), + get_item_n(M3,N,[First,Last,_,_,_,_,_,_,_,_,_,_]),!. + + + +get_curr_students(String02a) :- + phrase_from_file_s(string(String00a), "student_numbers.txt"), + string_codes(String02b,String00a), + atom_to_term(String02b,String02a,[]). + +get_grad_students(String02a) :- + phrase_from_file_s(string(String00a), "grad_student_numbers.txt"), + string_codes(String02b,String00a), + atom_to_term(String02b,String02a,[]). 
+ +get_texts(Courses1a,Texts) :- + foldr(string_concat,["Books/",Courses1a],Dept), + string_concat(Dept,"/",Dept1), + directory_files(Dept1,F), + delete_invisibles_etc(F,G), + findall(String02b,(member(Filex1,G), + string_concat(A,_,Filex1),string_length(A,4), + % no cut here: a cut inside findall's goal would prune member/2's + % choice points and collect only the first file's text + not(A="dot-"), + string_concat(Dept1,Filex1,Filex), + phrase_from_file_s(string(String00a), Filex), + string_codes(String02b,String00a)),Texts1), + choose_texts(Texts1,Texts2,Text_a), + (choose_texts(Texts2,Texts3,Text_b)-> + (choose_texts(Texts3,_Texts4,Text_c)->true; + Text_c=""); + (Text_b="",Text_c="")), + % collect the non-empty texts into a fresh variable (Texts3 is already + % bound above, so reusing it here would make get_texts/2 fail) + findall(Text1,(member(Text1,[Text_a,Text_b,Text_c]), + not(Text1="")),Texts5), + Texts=Texts5. + +choose_texts(Texts1,Texts2,Text) :- + random_member(Text,Texts1), + delete(Texts1,Text,Texts2). + +toss_coin :- + random(X),X1 is ceiling(2*X), X1 is 2,!. + + diff --git a/Lucian-Academy/la_com2.pl b/Lucian-Academy/la_com2.pl new file mode 100644 index 0000000000000000000000000000000000000000..8951677c954c57b5dbec04fff4c2eeeb3bb7b3d7 --- /dev/null +++ b/Lucian-Academy/la_com2.pl @@ -0,0 +1,10 @@ +#!/usr/bin/swipl -f -q + +:- initialization main. + +:- include('la_com1.pl'). + +main :- + la_com1,halt. + +main :- halt(1). diff --git a/Lucian-Academy/la_com_marks.pl b/Lucian-Academy/la_com_marks.pl new file mode 100644 index 0000000000000000000000000000000000000000..103d61145ee9968c07814a6dfe3e9df7f5b14fae --- /dev/null +++ b/Lucian-Academy/la_com_marks.pl @@ -0,0 +1,97 @@ +:- use_module(library(date)). +:- include('../Text-to-Breasonings/texttobr2qb'). +:- include('../mindreader/mindreadtestshared').
+ +daysbspeoplearmy:- + daysbspeoplearmy(2), %% 3 days, 3 people, a b bb, seen as version, hq version, army go, army return + daysbspeoplearmy(2). %% Give to people with graciously give or blame, radio button for graciously give or blame + +daysbspeoplearmy(0):-!. +daysbspeoplearmy(N1):- + texttobr2,N2 is N1-1,daysbspeoplearmy(N2). + +sectest(Agree_or_disagree,Marks_d):- + daysbspeoplearmy, %% dot me on + daysbspeoplearmy, %% dot them on + %find_time(H,M,S), + % for last 6 months, marks for this assignment + daysbspeoplearmy, %% dot question on + marks(0,Marks1), %% 1 + marks(0,Marks2), %% 2 + marks(0,Marks3), %% 3 + marks(0,Marks4), %% 4 + marks(0,Marks5), %% 5 + marks(0,Marks6), %% 6 + marks(0,Marks7), %% 7 + marks(0,Marks8), %% 8 + marks(0,Marks9), %% 9 + marks(0,Marks10), %% 10 + marks(0,Marks11), %% 11 + marks(0,Marks12), %% 12 + marks(0,Marks13), %% 13 + marks(0,Marks14), %% 14 + marks(0,Marks15), %% 15 + marks(0,Marks16), %% 16 + daysbspeoplearmy, %% dot answer on + foldr(plus,[Marks1,Marks2,Marks3,Marks4,Marks5,Marks6,Marks7,Marks8,Marks9,Marks10,Marks11,Marks12,Marks13,Marks14,Marks15,Marks16],0,Marks_a), + (Marks_a > 30 -> Marks_b = 80 ; (Marks_f is Marks_a/30,Marks_b is Marks_f*79)), + (((Agree_or_disagree="a",Marks_b<80)->(Marks_e is Marks_b/80,Marks_g is Marks_e*4,Marks_c is 65+Marks_g))->true; + (((Agree_or_disagree="d",Marks_b<80)->(Marks_e is Marks_b/80,Marks_g is Marks_e*4,Marks_c is 70+Marks_g))->true; + (((Agree_or_disagree="d",Marks_b>=80)->(Marks_e is Marks_b/80,Marks_g is Marks_e*4,Marks_c is 75+Marks_g))->true; + (((Agree_or_disagree="a",Marks_b>=80)->Marks_c is 80))))), + floor(Marks_c,Marks_d),!. + %writeln([Person,H,M,S,Marks_b,marks]). + +marks(Threats1,Threats2):- + %% "Given that they are not likely to have meant it and that there is nothing wrong, is there anything else that is wrong?" 
+ trialy2_6("Yes",R1), + trialy2_6("No",R2), + R=[R1,R2/**,R3,R4,R5,R6,R7,R8,R9,R10**,R11,R12,R13,R14,R15,R16,R17,R18,R19,R20,R21,R22,R23,R24,R25,R26,R27**/ + ], + sort(R,RA), + reverse(RA,RB), + RB=[[_Prev,Answer]|_Rest], + + (Answer="No"->Threats2=Threats1;(Threats3 is Threats1+1,marks(Threats3,Threats2))). + +trialy1(R1) :- + trial0(A1), %% Control + trial0(A2), %% Test 1 + (A1>A2->R1=true;R1=fail). + +trial0(Av) :- N is 10, %trial1(N,0,S), +catch( + trial1(N,0,S), + _, + (trial0(Av)%,writeln(S3) + ) + ), + Av is S/N. + +trial1(0,A,A) :- !. +trial1(N,A,B) :- mindreadtest(S), A1 is A+S, + N1 is N-1,trial1(N1,A1,B). + +mindreadtest(Sec) :- + %% 250 br for characters to be br out with 10 br each from person to me - do when initial 250 br test done and doing 10 br test + %%comment(fiftyastest), + %%random(X),X1 is 10*X, X2 is floor(X1), (X2=<2 -> ( + %%texttobr,writeln(['true test']), %%); %% use breasonings breasoned out by computer for not by me, for job medicine for "me", at last time point + %%true), %% leave last time point blank + (texttobr2(2)),%% make an A to detect reaction to gracious giving or blame of in following + get_time(TimeStamp1), + %%phrase_from_file(string(_String), 'file.txt'), + (daysbspeoplearmy(2)), %% test breasonings breasoned out by computer for not by me, for job medicine for "me", at last time point + %% is gracious giving or blame + get_time(TimeStamp2), + %%comment(turnoffas), + Sec is TimeStamp2 - TimeStamp1. + +/**string(String) --> list(String). + +list([]) --> []. +list([L|Ls]) --> [L], list(Ls). + +comment(fiftyastest). +comment(turnoffas). 
+**/ \ No newline at end of file diff --git a/Lucian-Academy/la_com_submit.pl b/Lucian-Academy/la_com_submit.pl new file mode 100644 index 0000000000000000000000000000000000000000..9e5bd8effb7ea6824d066e58b9d86d772f56cd1a --- /dev/null +++ b/Lucian-Academy/la_com_submit.pl @@ -0,0 +1,373 @@ +%% la_com.pl +%% Lucian Academy Web Site + +%% Starts server + + +:- use_module(library(http/thread_httpd)). +:- use_module(library(http/http_dispatch)). +:- use_module(library(http/http_error)). +:- use_module(library(http/html_write)). + +% we need this module from the HTTP client library for http_read_data +:- use_module(library(http/http_client)). +:- http_handler('/', web_form, []). + +%:- use_module(library(date)). + +:- include('../listprologinterpreter/la_strings.pl'). %% Move la_strings and the contents of the repository into the root folder +:- include('la_com_marks.pl'). + +la_com_start_server(Port) :- + http_server(http_dispatch, [port(Port)]). + + /* + browse http://127.0.0.1:8000/ + This demonstrates handling POST requests + */ + + web_form(_Request) :- + reply_html_page( + title('Lucian Academy'), + [ + form([action='/landing', method='POST'], [ + p([], [ + label([for=name],'Name:'), + %% If you enter strings without double quotes and there is an internal server error please contact luciangreen@lucianacademy.com + input([name=name, type=textarea]) + ]), + p([], [ + label([for=student_number],'Student Number:'), + input([name=student_number, type=textarea]) + ]), + p([], [ + label([for=course],'Course:'), + input([name=course, type=textarea]) + ]), + p([], [ + label([for=essay_topic],'Essay Topic:'), + input([name=essay_topic, type=textarea]) + ]), + + p([], [ + label([for=essay],'File contents from short_essay_helper.pl:'), + input([name=essay, type=textarea]) + ]), + + p([], input([name=submit, type=submit, value='Submit'], [])) + ])]). + + :- http_handler('/landing', landing_pad, []). 
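The grade arithmetic in `sectest/2` above is easier to read in isolation. A hedged restatement (the predicate name `band/3` is mine, not repository code): the sixteen mark counts are summed to `Marks_a`, turned into a base score `Marks_b` of 80 (if the tally exceeds 30) or scaled to 0–79 over a tally of 30, then banded at 65/70/75/80 by agreement ("a") or disagreement ("d") and by whether the base reached 80:

```prolog
% Sketch of the banding in sectest/2 (not repository code).
band(AgreeOrDisagree, MarksA, Grade) :-
    ( MarksA > 30 -> MarksB = 80 ; MarksB is MarksA/30*79 ),
    (   AgreeOrDisagree = "a", MarksB < 80  -> G is 65 + MarksB/80*4
    ;   AgreeOrDisagree = "d", MarksB < 80  -> G is 70 + MarksB/80*4
    ;   AgreeOrDisagree = "d", MarksB >= 80 -> G is 75 + MarksB/80*4
    ;   AgreeOrDisagree = "a", MarksB >= 80 -> G = 80
    ),
    Grade is floor(G).
% e.g. band("d", 15, G) gives G = 71 (MarksB = 39.5; 70 + 1.975, floored),
% and band("a", 31, G) gives the 80 cap.
```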
+ + landing_pad(Request) :- + member(method(post), Request), !, + http_read_data(Request, Data, []), + format('Content-type: text/html~n~n', []), + format('

', []), +%%writeln1(Data) + +%%lp(Data):- + +Data=[name=Name1,student_number=Student_number1,course=Course1,essay_topic=Essay_topic1,essay=Essay1 +,submit=_], + +((string_atom(Name2,Name1),string_atom(Student_number2,Student_number1), +string_atom(Course2,Course1),string_atom(Essay_topic2,Essay_topic1), + +(course_and_essay_topic1(Course2,Essay_topic2)-> +true; +(course_and_essay_topic3(Courses_and_essay_topics1),term_to_atom(Courses_and_essay_topics1,Courses_and_essay_topics2),concat_list(["Course and Essay topic must be one of ",Courses_and_essay_topics2],Notification1), +writeln1(Notification1),fail)), + +((string(Name2),string(Student_number2),string(Course2),string(Essay_topic2),term_to_atom(D2,Essay1),D2=[Exposition,Critique,Agree_or_disagree,Future_research,_Refs],Exposition=[Exposition1,Exposition2],findall(A,(member(A,Exposition1),A=[B,C],number(B),string(C)),_D3),%length(D3,F), +findall(A,(member(A,Exposition2),A=[B1,B2,B3,B4,C1,C2],number(B1),number(B2),number(B3),number(B4),string(C1),string(C2)),_D4),%length(D4,F), +findall(A,(member(A,Critique),A=[B,[C|D]],number(B),string(C), + D=[D1|D2],D1=[D11,D12,D13,D14,D15,D16,D17,D18], + number(D11),number(D12),number(D13),number(D14), + string(D15),number(D16),number(D17),string(D18), + findall(A1,(member(A1,D2),A1=[D21,D22,D23,D24,D25,D26,D27,D28, + D29,D30,D31,D32,D33,D34,D35], + number(D21),number(D22),number(D23),number(D24), + string(D25),number(D26),number(D27),string(D28), + number(D29),number(D30),string(D31), + number(D32),number(D33),string(D34),string(D35) + ),_A11)%length(A11,F1),F is F1+1 + ),_),string(Agree_or_disagree),string(Future_research))-> + true; + (writeln1("Essay not in correct format. 
Please manually submit with all form fields given, in email using form at https://www.lucianacademy.com/contact.html "),fail)))-> +( +/** +%%writeln1([User1,Repository1]), +string_atom(User2,User1), +%%writeln(User2), +string_atom(Repository2,Repository1), +string_atom(Description2,Description1), +%%string_atom(Dependencies2,Dependencies1), +%%lppm_get_manifest(User2,Repository2,Description,Dependencies), + +lppm_get_registry(LPPM_registry_term1), + +strip(User2,User3), +strip(Repository2,Repository3), + +delete(LPPM_registry_term1,[User3,Repository3,_,_],LPPM_registry_term2), + +(LPPM_registry_term2=[]->LPPM_registry_term3=[[User2,Repository2,Description2,Dependencies2]];( +term_to_atom(LPPM_registry_term2,LPPM_registry_term4), + +strip(LPPM_registry_term4,LPPM_registry_term4a), +LPPM_registry_term5=[LPPM_registry_term4a], + +append(LPPM_registry_term5,[[User2,Repository2,Description2,Dependencies2]],LPPM_registry_term3))), +**/ + +%%portray_clause(LPPM_registry_term3), + + + + %(open_s("lppm_registry.txt",write,Stream), + %write(Stream,LPPM_registry_term3), + %close(Stream))) + %;((writeln1("Error: One of strings was not in double quotes.")))), +%%), %%portray_clause(Data), + format('

========~n', []), + %%portray_clause(Request), + format('

'), + + + phrase_from_file_s(string(String00a), "student_marks1.txt"), + string_codes(String02b,String00a), + atom_to_term(String02b,String02a,[]), + +(member([_Name3,Student_number2,Course2,Essay_topic2,Mark3,Year2,Month2,Day2,Hour2,Minute2],String02a)-> + (concat_list([Name2,", your mark, ",Mark3,", was earned for the topic ",Essay_topic2," in the course ",Course2," on ",Day2,"/",Month2,"/",Year2," at ",Hour2,":",Minute2,"."],String_a), + writeln(String_a)) + ; + ( + + get_time(TS),stamp_date_time(TS,date(Year2,Month2,Day2,Hour2,Minute2,_Seconda,_A,_TZ,_False),local), + %% mark is 65-69 if agree and < 80 br %% 70-74 if disagree and < 80 br %% mark is 75-79 if disagree and >= 80 br %% 80 if agree and >= 80 br + +sectest(Agree_or_disagree,Mark1), + + + append(String02a,[[Name2,Student_number2,Course2,Essay_topic2,Mark1,Year2,Month2,Day2,Hour2,Minute2]],String02a_a), + + term_to_atom(String02a_a,String02a_b), + string_atom(String02a_c,String02a_b), + + (open_s("student_marks1.txt",write,Stream1), + write(Stream1,String02a_c), + close(Stream1)),!, + + concat_list([Name2,", you have earned the mark, ",Mark1," for the topic ",Essay_topic2," in the course ",Course2," on ",Day2,"/",Month2,"/",Year2," at ",Hour2,":",Minute2,"."],String_a), + writeln(String_a))) + + ) +; (true) + ) + %% + . + + + :- http_handler('/generate_student_number', generate_student_number, []). + + generate_student_number(_Request) :- + reply_html_page( + title('Lucian Academy'), + [ + form([action='/landing2', method='POST'], [ + p([], [ + label([for=name],'Name:'), + %% If you enter strings without double quotes and there is an internal server error please contact luciangreen@lucianacademy.com + input([name=name, type=textarea]) + ]), + p([], [ + label([for=email],'E-mail:'), + input([name=email, type=textarea]) + ]), + + p([], input([name=submit, type=submit, value='Submit'], [])) + ])]). + + string(String) --> list(String). + +list([]) --> []. +list([L|Ls]) --> [L], list(Ls). 
+ + +:- http_handler('/landing2', landing_pad2, []). + + landing_pad2(Request) :- + member(method(post), Request), !, + http_read_data(Request, Data, []), + format('Content-type: text/html~n~n', []), + format('

', []), +%%writeln1(Data) + +%%lp(Data):- + +Data=[name=Name1,email=Email1 +,submit=_], + +string_atom(Name2,Name1), +string_atom(Email2,Email1), + + phrase_from_file_s(string(String00a), "student_numbers1.txt"), + string_codes(String02b,String00a), + atom_to_term(String02b,String02a,[]), + +(member([Name3,Email2,Student_number3],String02a)-> + (concat_list(["The email address \"",Email2,"\" for \"",Name3,"\" has the student number \"",Student_number3,"\"."],String_a), + writeln(String_a)) + ; + (length(String02a,Length), + Student_number3 is Length+1, + append(String02a,[[Name2,Email2,Student_number3]],String02a_a), + + term_to_atom(String02a_a,String02a_b), + string_atom(String02a_c,String02a_b), + + (open_s("student_numbers1.txt",write,Stream1), +%% string_codes(BrDict3), + write(Stream1,String02a_c), + close(Stream1)),!, + + concat_list(["The email address \"",Email2,"\" for \"",Name2,"\" has the student number \"",Student_number3,"\"."],String_a), + writeln(String_a))). + + +course_and_essay_topic1(Course,Essay_topic) :- + course_and_essay_topic2([Courses,Essay_topics]), + member(Course,Courses), + member(Essay_topic,Essay_topics). + +course_and_essay_topic2( +[["Computational English"], +["Computational English Argument", +"Computational English is like a Calculator", +"Intertextuality", +"Finite Data will be a Solution in Conglish", +"Radical Difference", +"Order in Conglish", +"Dereconstruction", +"Kolmogorov Hermeneutics", +"Derivability", +"The Science of Crossing Over", +"A New Logic Reflecting Language or Natural Logic" , +"Philosophical Computational English", +"Lenses", +"Analysing characteristics of arguments", +"Conglish Reflection", +"Narratology Diagram", +"How can the program cope with real variation?" , +"Subject Mix", +"Perspectives", +"Ratios", +"Exploring opposites in Hamlet", +"Drawing connections", +"Symbols"]]). 
+course_and_essay_topic2( +[["Creating and Helping Pedagogues"], +["Accreditation", +"Protectedness", +"Areas of Study to Create a Pedagogue", +"Create a pedagogy helper for the student", +"Finding out about the student as a Pedagogy Helper", +"Daily Professional Requirement of the Pedagogy Helper", +"Preparing the student to write each breasoning", +"Pedagogy Helper - Write on Breasoning - Politics", +"Pedagogy Helper - Write on Breasoning - Philosophy", +"Pedagogy Helper - Write on Breasoning - Computer Science", +"Unification to Become Pedagogy Helper", +"Practicum", +"Breason out Arguments Twice When in Large Class", +"Instructions for using Recordings and Lecturer in Computer Science Recordings", +"Lecturer", +"Daily Regimen", +"Delegate Workloads"]]). +course_and_essay_topic2( +[["Fundamentals of Meditation and Meditation Indicators"], +["Children/H1/Earning Jobs/Protection in Jobs Heads of State", +"Lucian Mantra (Pure Form)", +"Lucian Mantra (Sun Safety)", +"Maharishi Sutra", +"Meditation Teacher Sutra Moving Appearances Purusha", +"Upasana Sutra", +"Yellow God", +"Green Sutra", +"Blue Nature", +"Appearances", +"Pranayama", +"Soma", +"Hours Prayer", +"50 Breasonings Per Utterance", +"Lower Risk of Cancer and Other Diseases In Workers and Broadcasters", +"Decreased Stress", +"Increased Blood Flow", +"Increased Brain Potential"]]). +course_and_essay_topic2( + +[["Fundamentals of Pedagogy and Pedagogy Indicators"], +["Two Uses", +"X", +"Y", +"Breathsonings", +"Rebreathsonings", +"Room", +"Part of Room", +"Direction", +"Time to Prepare", +"Time to Do", +"Time to Finish", +"Professor Algorithm", +"God Algorithm", +"Marking Scheme - Humanities and Science Marking Scheme - Creative Arts", +"Higher Grades", +"Fewer Stillbirths", +"A Greater Number Of Successful Job Applications"]]). 
+course_and_essay_topic2( +[["Medicine"], +["Doctor Sutra", +"Meditation", +"Protector from Headache in Meditation Currant Bun", +"Meditation", +"Panic attack prevented by deep breathing and sutra", +"Family Medicine", +"Help ensure successful conception and prevent miscarriage", +"Pedagogy", +"Lucianic Pedagogical Medicine", +"Pedagogy Grades/Failure", +"Pedagogy Course Plan", +"Get in Touch with God about Breasonings Details to see High Quality Imagery and Earn H1 250 Breasonings", +"Preventing Sales from Being Dangerous", +"Perpetual University Short Courses", +"Apple Meditation for Successful Relationship", +"Miscellaneous", +"4 Glasses of Water and Exercise 45 Minutes Before Breakfast", +"Go to Bed at 9:30 PM", +"Yoga", +"Yoga Surya Namaskara and Yoga Asanas", +"Prevent Headaches on Train and a Bent Spine", +"Brain", +"Brain", +"Brain II", +"Maintain Dry Eyes", +"Avoid Diseased People", +"Fewer Mental Breakdowns (Schizophrenia)", +"Less Depression", +"Honey Pot Prayer for No Headaches in Cars, Trains and Walks", +"Quantum Box/Prayer", +"Nut and Bolt", +"Head of State Head Ache Prevention", +"Daily Regimen", +"Laughter for Depression", +"Heart", +"Contagious Diseases", +"Berocca Prevents Colds and Flu", +"Food", +"Grains/Nuts/Fruits/Vegetables", +"Sit Properly at Table During Meals"]]). + +course_and_essay_topic3(Courses_and_essay_topics) :- + findall(A,(course_and_essay_topic2(A)),Courses_and_essay_topics). 
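+
+% Hedged usage sketch (not in the original source): course_and_essay_topic1/2
+% above backtracks over each course_and_essay_topic2/1 fact, so every
+% course/topic pair can be enumerated or collected, e.g.
+%
+% ?- findall(Course-Topic, course_and_essay_topic1(Course, Topic), Pairs).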
\ No newline at end of file diff --git a/Lucian-Academy/luciansphilosophy.txt b/Lucian-Academy/luciansphilosophy.txt new file mode 100644 index 0000000000000000000000000000000000000000..b5247cd7fa8fbcbc8d5ce364247fcc77f66b6de5 --- /dev/null +++ b/Lucian-Academy/luciansphilosophy.txt @@ -0,0 +1 @@ +"luciangreen","Lucian-Academy","Automates the Lucian Academy",[["luciangreen","listprologinterpreter"],["luciangreen","Languages"],["luciangreen","culturaltranslationtool"],["luciangreen","Algorithm-Writer-with-Lists"],["luciangreen","Text-to-Breasonings"],["luciangreen","mindreader"]] \ No newline at end of file diff --git a/Lucian-Academy/politics_university.txt b/Lucian-Academy/politics_university.txt new file mode 100644 index 0000000000000000000000000000000000000000..6badb1f6969f3e6cc99e27838d61f1f010d15b54 --- /dev/null +++ b/Lucian-Academy/politics_university.txt @@ -0,0 +1,64 @@ +1 +2 +3 +4 +5 +6 +7 +8 +9 +10 +11 +12 +13 +14 +15 +16 +17 +18 +19 +20 +21 +22 +23 +24 +25 +26 +27 +28 +29 +30 +31 +32 +33 +34 +35 +36 +37 +38 +39 +40 +41 +42 +43 +44 +45 +46 +47 +48 +49 +50 +51 +52 +53 +54 +55 +56 +57 +58 +59 +60 +61 +62 +63 +64 \ No newline at end of file diff --git a/Lucian-Academy/progress_bar.pl b/Lucian-Academy/progress_bar.pl new file mode 100644 index 0000000000000000000000000000000000000000..2d1f87e1c97244acb34d955551d22fecbc9155f8 --- /dev/null +++ b/Lucian-Academy/progress_bar.pl @@ -0,0 +1,57 @@ +/* +:- use_module(library(http/thread_httpd)). +:- use_module(library(http/http_dispatch)). +:- use_module(library(http/http_error)). +:- use_module(library(http/html_write)). + +% we need this module from the HTTP client library for http_read_data +:- use_module(library(http/http_client)). +:- http_handler('/', web_form, []). + +:- include('../listprologinterpreter/la_strings.pl'). + +server(Port) :- + http_server(http_dispatch, [port(Port)]). 
+
+
+	browse http://localhost:8000
+	This demonstrates handling POST requests
+
+	web_form(_Request) :-
+
+	format('Content-type: text/html~n~n', []),
+
+*/
+progress_bar(Ratio) :-
+%Ratio=1,
+Percentage1 is 100*Ratio,
+Percentage2 is 100-Percentage1,
+
+(Percentage1=0->
+
+concat_list(["Progress: ",Percentage1,"%
+ + + + +
 
"],String); + +(Percentage1=100-> + +concat_list(["Progress: ",Percentage1,"%
+ + + + +
 
"],String); + +concat_list(["Progress: ",Percentage1,"%
+ + + + + +
  
"],String) +)), + +format(String,[]). \ No newline at end of file diff --git a/Lucian-Academy/sectest_p.pl b/Lucian-Academy/sectest_p.pl new file mode 100644 index 0000000000000000000000000000000000000000..0513f080962e6520401078f9b62e4dfcd9576732 --- /dev/null +++ b/Lucian-Academy/sectest_p.pl @@ -0,0 +1,56 @@ +%% mind read test + +%% Make files different for different tests + +%% *** Important: initialise program before running for the first time: +%% N is 1*2*3*5,texttobr2(N). %% 100 As for 1 (turned on)*2 (to and from computer)*3 (rb, itself (already done), graciously give or blame, radio button for graciously give or blame)*5 (5 objects) +%% N is 1*2*3*5,texttobr2(N). %% 100 As for 1 (turned off)*2 (to and from computer)*3 (rb, itself (already done), graciously give or blame, radio button for graciously give or blame)*5 (5 objects) +%% also breason out and dot on objects before line above and breason out and dot on when recognising and saying object (with all objects having different breasonings) + + +%%use_module(library(pio)). + +:- use_module(library(date)). +:- include('../Text-to-Breasonings/text_to_breasonings.pl'). +:- include('../mindreader/mindreadtestshared'). +:-include('../listprologinterpreter/la_strings_string.pl'). %**** change path on server +:-include('../listprologinterpreter/la_maths.pl'). %**** change path on server + +sectest_p:- + phrase_from_file_s(string([N,_]), "a_tally.txt"), + %string_codes(String02b,String00a), + %atom_to_term(String00a,[N,_],[]), + +numbers(N,1,[],Ns), +findall(_,(member(N1,Ns), + + find_time1(H,M,S), + politeness(0,Threats1), + politeness(0,Threats2), + politeness(0,Threats3), + politeness(0,Threats4), + % no_death(0,Threats2), % medits for life + writeln([N1,H,M,S,Threats1,politeness]), + writeln([N1,H,M,S,Threats2,humour]), + writeln([N1,H,M,S,Threats3,connections]), + writeln([N1,H,M,S,Threats4,appearances]) + + + + + ),_).%,Threats2,no_death]). + +find_time1(H,M,S) :- + find_time(H,M,S),!. 
+ +politeness(Threats1,Threats2):- + %% "Given that we are interested in friendliness in primary school, secondary school and university, is there anything else?" + trialy2_6("Yes",R1), + trialy2_6("No",R2), + R=[R1,R2/**,R3,R4,R5,R6,R7,R8,R9,R10**,R11,R12,R13,R14,R15,R16,R17,R18,R19,R20,R21,R22,R23,R24,R25,R26,R27**/ + ], + sort(R,RA), + reverse(RA,RB), + RB=[[_,Answer]|_Rest], + + (Answer="No"->Threats2=Threats1;(Threats3 is Threats1+1,politeness(Threats3,Threats2))),!. diff --git a/Lucian-Academy/short_essay_helper3_agps.pl b/Lucian-Academy/short_essay_helper3_agps.pl new file mode 100644 index 0000000000000000000000000000000000000000..6ba77a58e788147c38a98da2de1d4353115614d5 --- /dev/null +++ b/Lucian-Academy/short_essay_helper3_agps.pl @@ -0,0 +1,1034 @@ +%% Prolog Short Essay Helper + +%% Keep 1,2,3 in aphors +%% Certain number of reasons per paragraph +%% Explains essay structure at start +%% Exposition (negative terms not allowed, asks for 5 groups from text to take optionally enough notes about, which can revise) and critique +%% Paraphrasing +%% Converts x, explains x, asks about +%% Allows citation of aphor +%% Prints citations used so far +%% Prints chapter, essay and allows changes (using text editor) at end. +%% Outputs db file and br dicts for marking later + +%% *** Rewrite essay files with e.g. 1a. before aphors not 1. so not confused with aphors + +%% short_essay_helper("file.txt",5,E),writeln1(E). + +%%:- include('distances.pl'). +:- use_module(library(date)). +:- include('../listprologinterpreter/la_strings'). +:- include('../listprologinterpreter/la_strings_string'). + +:- dynamic critique3/1. +:- dynamic agree_disagree/1. +:- dynamic refs/1. +:- dynamic refs_long/1. 
+ +choose(List0,Item) :- + random_member(Item10,List0), + string_codes(Item10,List), + split_string(List,".\n",".\n",List2), + random_member(Item1,List2), + string_concat(E,D,Item1), + string_length(E,1), + downcase_atom(E,E1), + atom_string(E1,E2), + string_concat(E2,D,Item2), + string_concat(Item2,""%%"." + ,Item). + +choose1(List0,Item) :- + random_member(Item,List0). + +delete_invisibles_etc(F,G) :- + findall(J,(member(H,F),atom_string(H,J),not(J="."),not(J=".."),not(string_concat(".",_,J))),G). + +string(String) --> list(String). + +list([]) --> []. +list([L|Ls]) --> [L], list(Ls). + +short_essay_helper(Files,%%Filex, + String01, + Reasons_per_paragraph,Essay_0) :- + retractall(critique3(_)), + assertz(critique3([])), + + retractall(refs(_)), + assertz(refs([])), + + retractall(refs_long(_)), + assertz(refs_long([])), + + retractall(key_words(_)), + assertz(key_words([])), + + findall(String02a,(member(Filex1,Files), + %string_concat("sources/",Filex1,Filex), + %phrase_from_file_s(string(String00a), Filex), + %string_codes(String02b,String00a), + atom_to_term(Filex1,String02a,[]) + %%split_string(String00, "\n\r", "\n\r", [String01a|_]), + + %%prepare_file_for_ml(String00,String02a) + ),String00), + + %%trace, + %%writeln1(String00), + %%notrace, + +%%writeln1(String02), + + + generate_file_name(File1,_File2), + + Numbers=[1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20,21,22,23,24,25], + explain_structure(String01,Reasons_per_paragraph,File1), + exposition(String00,String01,Reasons_per_paragraph,Numbers,String02,Exposition), + + %%concat_list(["Do you agree or disagree with ",String01," (a/d) ? 
"],String2ad),%%get_string(String2ad,either,one-not-ml,"","",String3ad), + choose1(["a"%%,"d" + ],String3ad), + + (String3ad="a"-> + (retractall(agree_disagree(_)), + assertz(agree_disagree(agree))); + (retractall(agree_disagree(_)), + assertz(agree_disagree(disagree)))), + + +critique(String00,String01,Reasons_per_paragraph,Numbers,String02,Critique), + +agree_disagree(Pole), + + %%concat_list(["What is the future area of research from your essay about ",String01,"? "],Future_research_prompt), + %%trace, + %%get_string(Future_research_prompt,either,one-not-ml,"","",Future_research), + choose_sentence_range(String00,N_page_ref,String00a1,String00a2,_String00a3,String00a4), + choose(String00a4,String00a5), + concat_list(["In ",String01,", automation should apply to ",String00a5," (",String00a2,", p. ",N_page_ref,")."],Future_research), + reference(String00a1), + + refs(R2),refs_long(_R21), + +%term_to_atom([Exposition,Critique,String3ad,Future_research,R21],File_contents),open_s(File1,write,Stream),write(Stream,File_contents),close(Stream), + +%% Output essay +%%findall(_,(member(Exposition1,Exposition),Exposition1= + + +%%writeln1([Exposition,Critique,Future_research,R2]), + +write_essay(String01,Pole,Exposition,Critique,Future_research,R2,Essay_0,_HTML) +%writeln1(Essay_0) +/** + (open_s(File2,write,Stream1), +%% string_codes(BrDict3), + write(Stream1,HTML), + close(Stream1)) + **/ + . + +%% replace("a\nb","\n","
\n",F). +%% F="a
\nb
\n". + +replace(A,Find,Replace,F) :- + split_string(A,Find,Find,B),findall([C,Replace],(member(C,B)),D),maplist(append,[D],[E]),concat_list(E,F). + +concat_list1(D,F) :- + maplist(append,[D],[E]),concat_list(E,F). + +write_essay(String01,Pole,Exposition,Critique,Future_research,R2,Essay,HTML) :- + write_heading(String01,Heading), + write_introduction(String01,Pole,Critique,Introduction), + write_exposition(Exposition,Exposition2a), + concat_list(["I will expose ",String01," in this half. ",Exposition2a],Exposition2), + %%string_concat(Exposition2,"\n",Exposition2a), + write_critique(Critique,Critique2a), + string_concat(Critique2a,"\n",Critique2b), + atom_string(Pole,Pole1), + concat_list(["I will ",Pole1," with ",String01," in this half. ",Critique2b],Critique2), + write_conclusion(String01,Pole,Critique,Future_research,Conclusion), + write_references(R2,References,Refs_no_heading), + concat_list([Heading,Introduction,Exposition2,Critique2,Conclusion,References], + Essay), + concat_list([Introduction,Exposition2,Critique2,Conclusion], + Essay2), + replace(Essay2,"\n","
",HTML1), + replace(Refs_no_heading,"\n","
",Refs_no_heading2), + concat_list(["",String01,"

", + String01,"

",HTML1,"

Bibliography

",Refs_no_heading2,""],HTML). + +write_heading(String01,Heading) :- + concat_list([String01,"\n","\n"],Heading). + +write_introduction(String01,Pole1,Critique,Introduction) :- + %% The heading should be in the form "Author's topic" + atom_string(Pole1,Pole2), + findall([Paragraph_topic_sentence," "],(member(A,Critique),A=[_,[Paragraph_topic_sentence|_]|_]),Paragraph_topic_sentences1), + concat_list1(Paragraph_topic_sentences1,Paragraph_topic_sentences2), + concat_list(["I will critically analyse ",String01,". ", + "I will ",Pole2," with ",String01,". ", + Paragraph_topic_sentences2,"\n\n"],Introduction). + +write_conclusion(String01,Pole1,Critique,Future_research,Conclusion) :- + atom_string(Pole1,Pole2), + findall([Paragraph_topic_sentence," "],(member(A,Critique),A=[_,[Paragraph_topic_sentence|_]|_]),Paragraph_topic_sentences1), + concat_list1(Paragraph_topic_sentences1,Paragraph_topic_sentences2), + concat_list([ + Paragraph_topic_sentences2, + "I have ",Pole2,"d with ",String01,". ", + Future_research,"\n\n" + ],Conclusion). + +write_references(R2,References,Refs_no_head) :- + findall([Reference,"\n"],member(Reference,R2),References1), + concat_list1(References1,References2), + concat_list([References2],Refs_no_head), + concat_list(["Bibliography","\n\n",References2],References). + +%%a1 +%% write_exposition([[[1,"g1"]],[[1,1,_15410,_15416,"a1","a1 is in g1."],[1,2,_15352,_15358,"a2","g1 contains a2."]]],A),writeln1(A). +%% A = "a1 a1 is in g1. a2 g1 contains a2. ". + +%% write_exposition([[[1,"g1"],[1,"g1"]],[[1,1,_15410,_15416,"a1","a1 is in g1."],[1,2,_15352,_15358,"a2","g1 contains a2."],[1,1,_15410,_15416,"a1","a1 is in g1."],[2,2,_15352,_15358,"a2","g1 contains a2."]]],A). +%% A = "a1 a1 is in g1. a2 g1 contains a2. a1 a1 is in g1. a2 g1 contains a2. ". 
+ +write_exposition(Exposition,Essay4b) :- + Numbers=[1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20,21,22,23,24,25], + +Exposition=[_,Exposition2], + + +findall([Essay4c%%,"\n" +],(member(Numbera11,Numbers), + + +%% not "" with findall +findall(Essay4,( +%%member(Exposition1,Exposition), +%%Exposition1=[Numbera11,_Number3a,_String3,_String3a,String5a,String3y,_String3ay,String5a1,_CNumber2aa,_CNumber3aa,CString5a1a,_CNumber2a1,_CNumber3a1,_LastCStrings,String5aaa], + + +%%output_exposition(Numbera11,Exposition2,"",Essay1), + + %%findall( Essay4,( + member(Exposition1,Exposition2),Exposition1=[Numbera11,_Number3a,_String3,_String3a,String5a,Group_link], + concat_list([String5a," ",Group_link," "],Essay4) + %%delete(Exposition,Exposition1,Exposition2) + %%output_exposition(Numbera11,Exposition2,Essay4,Essay6) +),Essay4a), +concat_list(Essay4a,Essay4c1),%%trace, +(Essay4c1=""->Essay4c="";concat_list([Essay4c1,"\n"],Essay4c)) + +),Essay4d), + +maplist(concat_list,Essay4d,Essay4f), +concat_list(Essay4f,Essay4b) +%%concat_list([Essay4e,"\n"],Essay4b) +%%concat_list([Essay1," ",Essay3],Essay2), + %%concat_list([Essay2,"\n"],Essay23)) +%%,Essay3a), + %%concat_list(Essay3a,Essay4a) +. +%% *** HTML (
not \n) + +%%%%% +%%a + +%% write_critique([[1,["heading is e12",[1,1,_25346,_25352,"e1",_25370,_25376,"e12"],[1,2,_25412,_25418,"e2",_25436,_25442,"e22",1,1,"e12",0,0,"e22","e12 is e22"]]]],A),writeln1(A). +%% A = "heading is e12 e1 e12 e2 e22 e12 is e22 \n". + +write_critique(Critique,Essay4):- + Numbers=[1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20,21,22,23,24,25], + +findall(Essay23,(member(Numbera11,Numbers), + +%% not "" with findall +%%findall(Essay22,( +member(Critique1,Critique), +%%Critique1=[Numbera11,_Number3a,_String3,_String3a,String5a,String3y,_String3ay,String5a1,_CNumber2aa,_CNumber3aa,CString5a1a,_CNumber2a1,_CNumber3a1,_LastCStrings,String5aaa], + +Critique1=[Numbera11,[Para_topic_sent,[_Number2a,_Number3a,_String3,_String3a,String5a,_String3y,_String3ay,String5a1]|Critique2]], + +concat_list([Para_topic_sent," ",String5a," ",String5a1," "],Essay0),output_critique(Numbera11,Critique2,String5a1,Essay0,Essay1), + + concat_list([Essay1,Para_topic_sent,"\n"],%%Essay22) + +%%), +Essay23)),Essay3), + concat_list(Essay3,Essay4) +. +%% *** HTML (
not \n) + +output_critique(_Numbera11,[],_String5a1a,Essay,Essay) :- !. +output_critique(Numbera11,Critique,CString5a1a,Essay1,Essay2) :- +findall( Essay6,(member(Critique1,Critique),Critique1=[Numbera11,_Number3a,_String3,_String3a,String5a,_String3y1,_String3ay,String5a1, +_CNumber2aa,_CNumber3aa,CString5a1a, + _CNumber2a1,_CNumber3a1,_LastCStrings, + String5aaa], + concat_list([String5a," ",String5a1," ",String5aaa," "],Essay4), + delete(Critique,Critique1,Critique2), + output_critique(Numbera11,Critique2,String5aaa,Essay4,Essay6) +),Essay33), +%%trace, +(Essay33=[]->concat_list([Essay1%%," " +],Essay2);%%(Essay33=[Essay3]->concat_list([Essay1," ",Essay3],Essay2);%%(Essay33=[Essay3|[E33]]-> concat_list([Essay1," ",Essay3,E33],Essay2); +(Essay33=[Essay3|E33], concat_list(E33,E34),concat_list([Essay1,%%" ", +Essay3,E34],Essay2))) +%%) +%%) +. + + +/**critique(String00,String01,Reasons_per_paragraph,Numbers,Critique). + **/ + +generate_file_name(File1,File2) :- + get_time(TS),stamp_date_time(TS,date(Year,Month,Day,Hour1,Minute1,Seconda,_A,_TZ,_False),local), + concat_list(["file",Year,Month,Day,Hour1,Minute1,Seconda,".txt"],File1), + concat_list(["file",Year,Month,Day,Hour1,Minute1,Seconda,".html"],File2). + + +explain_structure(String01,Reasons_per_paragraph,_File1) :- + concat_list(["The Short Essay Helper will automatically structure and write your essay about ",String01," with ",Reasons_per_paragraph," reasons per paragraph.","\n", + "The Helper will help write an exposition (which summarises but doesn't critique the idea), a critique (which agrees with or disagrees with the topic), the introduction and the conclusion (which state whether you agreed or disagreed with the topic, etc.). Citations will be automatically made.","\n","Note: Generated essays are not to be handed in, and you need to paraphrase and cite work you have referenced. Your grade depends on whether you agree or disagree and how many breasonings you breason out. 
Check the referencing style is appropriate for your class.","\n"],String1). + %writeln(String1). + +choose_sentence_range(String00,N2,B,B1,B2,C) :- + length(String00,L), + numbers(L,1,[],N), + random_member(N1,N), + get_item_n(String00,N1,A), + A=[B,B1,B2|C], + N2 is N1+B2-1. + %%random_member(A,String00), + +exposition(String00,_String01,Reasons_per_paragraph,Numbers,ML_db,Exposition1) :- + length(List1,5), %% 5->1 paragraphs per exposition + append(List1,_,Numbers), + length(List2,Reasons_per_paragraph), + append(List2,_,Numbers), + + %%string_codes(String001,String00), + %%writeln(String001), + findall([Number1,Exposition2],( + %%trace, +member(Number1,List1),%%concat_list(["What is group ",Number1," of 5 in the exposition that groups ideas about ",String01,"? "],String1),%%get_string(String1,either,one-not-ml,"","",%ML_db,Exposition2) +%% ** Doesn't print this +choose_sentence_range(String00,N_page_ref,String00a1,String00a2,_String00a3,String00a4), + choose(String00a4,String00a5), + concat_list([String00a5," (",String00a2,", p. ",N_page_ref,") "],Exposition2), + reference(String00a1)),Exposition3), + + + findall([Number2,Number3,String3,String3a,String5a,String5],( + member(Number2,List1),member(Number3,List2),get_item_n(Exposition3,Number2,[_,Item1]), + %%trace, + concat_list([" ","\n","The Helper will ask you how the quote you are about to enter relates to the paragraph topic."],String2b),%writeln(String2b), + %%trace, +exposition2(String00,Item1,ML_db,String3,String3a,String5a,String5) + ),Exposition4), + Exposition1=[Exposition3,Exposition4]. + +exposition2(String00,Item1,ML_db,String3,String3a,String5a,String5):- + (%%concat_list(["What is the paragraph number of the quote about the paragraph topic ",Item1,"? "],String2),get_number(String2,String3), + %%concat_list(["What is the sentence number of the quote about the paragraph topic ",Item1,"? 
"],String2a),get_number(String2a,String3a), + %%member1a([String3,String3a,String3aa],ML_db), + %%concat_list(["What is the paraphrased quote about the paragraph topic ",Item1,"? "],String4a),%%get_string(String4a,either,one-not-ml-ref,"",String3aa,String5a), + choose_sentence_range(String00,N_page_ref,String00a1,String00a2,_String00a3,String00a4), + choose(String00a4,String00a5), + concat_list([Item1," is because ",String00a5," (",String00a2,", p. ",N_page_ref,")."],String5a), + reference(String00a1), + + %%concat_list(["How does the quote you entered (",String5a,") relate to the paragraph topic ",Item1,"? "],String4), + %%trace, + %%SepandPad="#@~%`$?-+*^,()|.:;=_/[]<>{}\n\r\s\t\\!'0123456789",%%downcase_atom(Item1,String41a),split_string(String41a, SepandPad, SepandPad, Item1a), + + %%get_string(String4,either,one-not-ml,Item1a,String3aa,String5) + + choose_sentence_range(String00,N_page_ref1,String00a11,String00a21,_String00a31,String00a41), + choose(String00a41,String00a51), + concat_list([Item1," is because ",String00a51," (",String00a21,", p. ",N_page_ref1,")."],String5), + reference(String00a11) +) + ->true;exposition2(String00,Item1,ML_db,String3,String3a,String5a,String5). 
+ +%% Agree or disagree + +critique(String00,String01,Reasons_per_paragraph,Numbers,ML_db,Critique1) :- + length(List1,5), %% 5->1 paragraphs per critique + append(List1,_,Numbers), + length(List2,Reasons_per_paragraph), + append(List2,_,Numbers), + + %%string_codes(String001,String00), + %%writeln(String001), + + retractall(critique3(_)), + assertz(critique3([])), + + + findall([Number2a,Critique2],( +%% Reason 1 + +member(Number2a,List1), +%%List1=[Number2a|List1a], +List2=[Number3a|List2a], +%%trace, + + +critique_reason1(String00,String01,Number2a,Number3a,Reasons_per_paragraph,ML_db,Critique3,Topic_paragraph_link), + +critique_reasons_2_to_n(String00,Number2a,List2a,Critique3,Reasons_per_paragraph,ML_db,Critique4), + + append_list2([[Topic_paragraph_link],Critique3,Critique4],Critique2)),Critique1). + + critique_reason1(String00,String01,Number2a,Number3a,Reasons_per_paragraph,ML_db,Critique3,Topic_paragraph_link) :- + + %%member(Number2,List1),member(Number3,List2), + %%get_item_n(Critique3,Number2a,[_,Item1]), + %%trace, + concat_list([" ","\n","The Helper will ask you for your comment on the quote you are about to enter."],String2b),%writeln(String2b), + %% Later: connections + + %%trace, +critique2(String00,String01,ML_db,String3,String3a,String5a,_String3y2,String3ay,String5a1,Topic_paragraph_link), + %%),Critique4). + Critique3=[[Number2a,Number3a,String3,String3a,String5a,_String3y3,String3ay,String5a1]]. + +critique2(String00,String01,ML_db,String3,String3a,String5a,String3y,String3ay,String5a1,Topic_paragraph_link):- + (%%concat_list(["What is the paragraph number of the quote to comment on? "],String2),get_number(String2,String3), + %%concat_list(["What is the sentence number of the quote to comment on? "],String2a),get_number(String2a,String3a), + %%member1a([String3,String3a,String3aa],ML_db), + %%concat_list(["What is the paraphrased quote to comment on? 
"],String4a),%%get_string(String4a,either,one-not-ml-ref,"",String3aa,String5a), + + choose_sentence_range(String00,N_page_ref1,String00a11,String00a21,_String00a31,String00a41), + choose(String00a41,String00a51), + concat_list([String00a51," (",String00a21,", p. ",N_page_ref1,")."],String5a), + reference(String00a11), + + %%concat_list(["Is your comment from a quote (y/n)? "],String2yn),get_string(String2yn,either,one-not-ml,"","",String3yn), + + %%agree_disagree(Pole), + + (%%String3yn="y" + true-> + (%%concat_list(["What is the paragraph number of the comment? "],String2y),get_number(String2y,String3y), + %%concat_list(["What is the sentence number of the comment? "],String2ay),get_number(String2ay,String3ay), + %%trace, + %%member1a([String3y,String3ay,String3aay],ML_db), + + %% use either original x or paraphrase + + %%concat_list(["What is the comment that is from the quote? "],String4ac), + %%trace, + %%get_string(String4ac,either,one-not-ml-ref,"",String3aay,String5a1) + + choose_sentence_range(String00,N_page_ref2,String00a12,String00a22,_String00a32,String00a42), + choose(String00a42,String00a52), + concat_list([String00a52," (",String00a22,", p. ",N_page_ref2,")."],String5a1), + reference(String00a12) + + ) + + ;(String3y=0,String3ay=0, + %%concat_list(["What is the comment? "],String4ac),%%get_string(String4ac,either,one-not-ml,"","",%%String5a, +%% String5a1) + + choose_sentence_range(String00,N_page_ref3,String00a13,String00a23,_String00a33,String00a43), + choose(String00a43,String00a53), + concat_list([String00a53," (",String00a23,", p. ",N_page_ref3,")."],String5a1), + reference(String00a13) + +)), + + %%concat_list(["How does the comment ",String5a1," relate to the essay topic ",String01,"? 
"],Topic_paragraph_link_prompt), + %%trace, + + %%downcase_and_split(String5a1,String5a1ds), + %%downcase_and_split(String01,String01ds), + %%get_string(Topic_paragraph_link_prompt,either,one-not-ml,String5a1ds,String01ds,Topic_paragraph_link) + + string_concat(String5a1_az,".",String5a1), choose_sentence_range(String00,N_page_ref4,String00a14,String00a24,_String00a34,String00a44), + choose(String00a44,String00a54), + concat_list([String01," is because of ",String5a1_az," because of ",String00a54," (",String00a24,", p. ",N_page_ref4,")."],Topic_paragraph_link), + reference(String00a14) + + /** + %% conn - choose connected comments + concat_list(["How does the quote you entered (",String5a,") relate to the paragraph topic ",Item1,"? "],String4), + %%trace, + SepandPad="#@~%`$?-+*^,()|.:;=_/[]<>{}\n\r\s\t\\!'0123456789",downcase_atom(Item1,String41a),split_string(String41a, SepandPad, SepandPad, Item1a), + + get_string(String4,either,two,Item1a,String3aa,String5) + **/ + ) + ->true;critique2(String00,ML_db,String3,String3a,String5a,String3y,String3ay,String5a1,Topic_paragraph_link). 
+ + + +critique_reasons_2_to_n(String00,Number2a,List2a,Critique3,Reasons_per_paragraph,ML_db,Critique4) :- + retractall(critique3(_)), + assertz(critique3(Critique3)), +%%trace, +/** critique3(Critique31), + append(Critique31,Critique3,Critique32), + retractall(critique3(_)), + assertz(critique3(Critique32)), +**/ +%%notrace, +findall([Number2a,Number3a,String3,String3a,String5a,String3y,String3ay,String5a1, +CNumber2aa,CNumber3aa,CString5a1a, + CNumber2a1,CNumber3a1,LastCStrings, + String5aaa],( + %%member(Number2,List1a), + member(Number3a,List2a), + %%get_item_n(Critique3,Number2a,[_,Item1]), + %%trace, + concat_list([" ","\n","The Helper will ask you for your comment on the quote you are about to enter."],String2b),%writeln(String2b), + %% Later: connections + + + %%trace, +critique3(String00,ML_db,%%Critique3, +String3,String3a,String5a,String3y,String3ay,String5a1, +CNumber2aa,CNumber3aa,CString5a1a, + CNumber2a1,CNumber3a1,LastCStrings, + String5aaa) + + +/** + trace, + critique3(Critique31), + append(Critique31,[[String3,String3a,String5a,String3y,String3ay,String5a1]],Critique32), + retractall(critique3(_)), + assertz(critique3(Critique32)) +**/ + +), + Critique4). + + %%Critique3=[]. + +critique3(String00,ML_db,%%Critique3, +String3,String3a,String5a,String3y,String3ay,String5a1, +CNumber2aa,CNumber3aa,CString5a1a, + CNumber2a1,CNumber3a1,LastCStrings, + String5aaa):- + (%%concat_list(["What is the paragraph number of the quote to comment on? "],String2),get_number(String2,String3), + %%concat_list(["What is the sentence number of the quote to comment on? "],String2a),get_number(String2a,String3a), + %%member1a([String3,String3a,String3aa],ML_db), + %%concat_list(["What is the paraphrased quote to comment on? 
"],String4a),%%get_string(String4a,either,one-not-ml-ref,"",String3aa,String5a), + %%trace, + choose_sentence_range(String00,N_page_ref4,String00a14,String00a24,_String00a34,String00a44), + choose(String00a44,String00a54), + concat_list([String00a54," (",String00a24,", p. ",N_page_ref4,")."],String5a), + reference(String00a14), + + + %%concat_list(["Is your comment from a quote (y/n)? "],String2yn),%%get_string(String2yn,either,one-not-ml,"","",String3yn), + + %%agree_disagree(Pole), + + (true->%%String3yn="y"-> + (%%concat_list(["What is the paragraph number of the comment? "],String2y),get_number(String2y,String3y), + %%concat_list(["What is the sentence number of the comment? "],String2ay),get_number(String2ay,String3ay), + %%trace, + %%member1a([String3y,String3ay,String3aay],ML_db), + + %% use either original x or paraphrase x + %%trace, + %%concat_list(["What is the comment that is from the quote? "],String4ac), + %%trace, + %%get_string(String4ac,either,one-not-ml-ref,"",String3aay,String5a1) + + choose_sentence_range(String00,N_page_ref5,String00a15,String00a25,_String00a35,String00a45), + choose(String00a45,String00a55), + concat_list([String00a55," (",String00a25,", p. ",N_page_ref5,")."],String5a1), + reference(String00a15) + + %%,trace + ) + + ;(String3y=0,String3ay=0, + concat_list(["What is the comment? 
"],String4ac), + + %% This is never chosen + get_string(String4ac,either,one-not-ml,"","",%%String5a, + String5a1))), + %%*** assertz recurse not findall new critique3 x + + + %%trace, + critique3(Critique31), + append(Critique31,[[0,0,String3,String3a,String5a,String3y,String3ay,String5a1]],Critique32), + retractall(critique3(_)), + assertz(critique3(Critique32)), + + critique3(Critique33), + + +(( + + + %%critique3(Critique3), + length(Critique33,LCritique3), + %%length(List1,LCritique3), + numbers(LCritique3,1,[],List1), + %%append(List1,_,Numbers), + %%Numbers=[1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20,21,22,23,24,25], + %%trace, + +findall([N," - ",CString5a1,"\n"],(member(N,List1),get_item_n(Critique33,N,[CNumber2a,CNumber3a,_CString3,_CString3a,_CString5a,_CString3y,_CString3ay,CString5a1])),CStrings1), + +findall([N,CNumber2a,CNumber3a,CString5a1],(member(N,List1),get_item_n(Critique33,N,[CNumber2a,CNumber3a,_CString31,_CString3a1,_CString5a1,_CString3y1,_CString3ay1,CString5a1])),CStrings2), + +%%trace, + +%%CStrings=[CStrings1,CStrings2], +reverse(CStrings2,CStringsR),CStringsR=[[_,CNumber2a1,CNumber3a1,LastCStrings]|CStringsR1], +reverse(CStringsR1,CStringsRR), + +reverse(CStrings1,CStringsR10),CStringsR10=[_|CStringsR11], +reverse(CStringsR11,CStringsRR1), + +append_list2(CStringsRR1,CStrings11), +concat_list(CStrings11,_CStrings12), +%%concat_list(["Please select a comment to connect the comment ",LastCStrings," to:","\n",CStrings12],ConnectionNumberPrompt), +%%get_number(ConnectionNumberPrompt,ConnectionNumber), +%numbers( +choose1(List1,ConnectionNumber), + member([ConnectionNumber,CNumber2aa,CNumber3aa,CString5a1a],CStringsRR), + + %%SepandPad="#@~%`$?-+*^,()|.:;=_/[]<>{}\n\r\s\t\\!'0123456789",%%downcase_atom(CString5a1a,CString5a1a1),split_string(CString5a1a1, SepandPad, SepandPad, CString5a1a2), + + %%CNumber2a1,CNumber3a1, + %%downcase_atom(LastCStrings,LastCStrings_a),split_string(LastCStrings_a, SepandPad, SepandPad, LastCStrings_a1), 
+ + + %% conn - choose connected comments, this to a previous comment + %%trace, + + %%concat_list(["How does ",LastCStrings," connect to ",CString5a1a,"? "],ConnectionPrompt), + + %%get_string(ConnectionPrompt,either,one-not-ml,CString5a1a2,LastCStrings_a1,String5aaa) + + string_concat(LastCStrings_az,".",LastCStrings), + string_concat(CString5a1a_az,".",CString5a1a), choose_sentence_range(String00,N_page_ref6,String00a16,String00a26,_String00a36,String00a46), + choose(String00a46,String00a56), + concat_list([LastCStrings_az," is because of ",CString5a1a_az," because of ",String00a56," (",String00a26,", p. ",N_page_ref6,")."],String5aaa), + reference(String00a16) + + + )->true;( + + %% If the section since updating dynamic critique comments fails, prevent doubling of critique comments + + critique3(Critique311), + reverse(Critique311,Critique312), + Critique312=[_|Critique313], + reverse(Critique313,Critique314), + retractall(critique3(_)), + assertz(critique3(Critique314)),fail + + )) + +/** Critique4=[String3,String3a,String5a,String3y,String3ay,String5a1, + CNumber2aa,CNumber3aa,CString5a1a, + CNumber2a1,CNumber3a1,LastCStrings, + String5aaa], + **/ + /** + critique3(Critique31), + append(Critique31,[[String3,String3a,String5a,String3y,String3ay,String5a1]],Critique32), + retractall(critique3(_)), + assertz(critique3(Critique32)) +**/ + + ) + %% Retries the predicate if it fails + ->true;critique3(String00,ML_db,%%Critique3, + String3,String3a,String5a,String3y,String3ay,String5a1, + CNumber2aa,CNumber3aa,CString5a1a, + CNumber2a1,CNumber3a1,LastCStrings, + String5aaa + ). + + + + +member1a([String3,String3a,String3aa],ML_db) :- + member([String3,String3a,String3aa],ML_db),!. + + +get_item_n(Exposition,Number1,Item) :- + Number2 is Number1-1, + length(List,Number2), + append(List,[Item|_],Exposition). 
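The `get_item_n/3` helper above retrieves the Nth item of a list (1-indexed) by building a prefix of length N-1 with `length/2` and matching the head of the remainder with `append/3`. A minimal cross-language sketch of the same idea, written in Python for illustration (the function and variable names are mine, not from the source):

```python
def get_item_n(exposition, number1):
    """Return the item at 1-indexed position number1, mirroring
    get_item_n/3: drop a prefix of length number1-1, then take
    the head of what remains."""
    rest = exposition[number1 - 1:]
    if not rest:
        raise IndexError("position past end of list")
    return rest[0]
```

As in the Prolog version, asking for a position past the end of the list simply fails (here, raises) rather than returning a default.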
+ +get_number(Prompt1,Number) :- + %%concat_list(Prompt1,Prompt2), + (%%repeat, + writeln(Prompt1),read_string(user_input, "\n", "\r", _End1, String),split_string(String, ",", " ", Value1),Value1=[Value2],number_string(Number,Value2)). + + +/** +get_string(Prompt1,String1) :- + concat_list(Prompt1,Prompt2), + (repeat,write(Prompt2),read_string(String1),not(String1="")). +**/ + +equals_empty_list([]). + +downcase_and_split(String1, String2) :- +SepandPad="&#@~%`$?-+*^,()|.:;=_/[]<>{}\n\r\s\t\\\"!'0123456789", + downcase_atom(String1,String3),split_string(String3, SepandPad, SepandPad, String2). + +get_string(Prompt2,Flag1,Flag2,ML_db0,ML_db1,String2) :- +%%writeln1(get_string(Prompt2,Flag1,Flag2,ML_db1,String2)), + %%concat_list(Prompt1,Prompt2), + %%trace, + SepandPad="&#@~%`$?-+*^,()|.:;=_/[]<>{}\n\r\s\t\\\"!'0123456789", + %%(repeat, + (Flag2=one-not-ml-ref->(concat_list(["Note: Enter in-text reference using AGPS, e.g.\n","The first work supports the second work (Surname 2000, pp. 18-9).\n","Surname (2000, pp. 
18-9) states that ...\n","Remember to use words like \"also\", \"moreover\" and \"in addition\" before the sentence."],String_a1),writeln(String_a1));true), + writeln(Prompt2),read_string(user_input, "\n", "\r", _End2, String2aa),%%not(String2aa=""), + %%String2aa=[String2aaa], + downcase_atom(String2aa,String3),split_string(String3, SepandPad, SepandPad, String4), + Neg_term_list=["no","not","don","t","shouldn","wouldn","disagree","differ","dislikes","disagrees","differs","dislikes","disagreed","differed","disliked","negative","negation","non","negate","negates","negated","but","however","isn","lack"], + (Flag1=%%non_negative + positive->( + + (findall(Item11,( + + member(Item1,String4), + findall(Item1,( +member(Item2,Neg_term_list),(Item1=Item2->(write("Error: Contains the negative term \""),write(Item1),writeln("\".")))),Item11)),Item12)), + +maplist(equals_empty_list,Item12) + +); + ((Flag1=negative->((member(Item1,String4),member(Item1,Neg_term_list))->true;(writeln("Error: Contains no negative term, one of:"),writeln(Neg_term_list),fail));true)->true;Flag1=either)), + (Flag2=one-not-ml->String2=String2aa; + (Flag2=one-not-ml-ref-> + (refs(R1),writeln("What is the reference? e.g. 
Surname, A 2000, Title: Subtitle, Publisher, City.\n"),writeln("Existing references (copy one or many delimited with \"\\n\"):"), findall(_,(member(R11,R1),writeln(R11)),_),read_string(user_input, "\n", "\r", _End3, String2r),not(String2r=""),%%downcase_atom(String2r,_String3r), + String2=String2aa,split_string(String2r,"\n\r","\n\r",String2r3), + %%trace, + retractall(refs(_)),maplist(append,[[R1,String2r3]],[String2r21]), + sort1(String2r21,String2r2), + assertz(refs(String2r2))%%split_string(String3r, SepandPad, SepandPad, String4) + ); +(Flag2=one->(%%split_string(String4,SepandPad,SepandPad,String21), + writeln("Attempt 1"), + + (length(String4,Length_string1), +(check_strings_container1(Length_string1,String4,[0,0,ML_db1],[[0,0,[xxx,xxx,xxx,xxx,xxx]],[0,0,ML_db1],[999,999,[]]],_,_List2)-> + writeln("Success");(writeln("Failed"),fail)) + %%(%%data_instance_k_classification1([[0,0,[xxx,xxx,xxx,xxx,xxx]],[0,0,ML_db1]%%,[999,999,[]] + %%],[0,0,String4]%%21 + %%,1,String4a%%21 + %%), + %%String4=String4a) + %%)-> + %%(%%String4a=[_,_,String4a1], + %%writeln([found,String4a,String4]), +%%writeln("Success")); + %%(writeln("Failed"),fail) + %% + )); + (Flag2=%%[ + two,%%,P1,S1,P2,S2], + %%trace, + append([ML_db0],[ML_db1],ML_db2), + check_strings(String4,ML_db2%%,P1,S1,P2,S2 + ))))). + +reference(String2r) :- + refs_long(R10), + (refs(R1),%%writeln("What is the reference? e.g. 
Surname, A 2000, Title: Subtitle, Publisher, City.\n"),writeln("Existing references (copy one or many delimited with \"\\n\"):"), + findall(_,(member(R11,R1)%,writeln(R11) + ),_),%%read_string(user_input, "\n", "\r", _End4, String2r), + not(String2r=""),%%downcase_atom(String2r,_String3r), + %%String2=String2aa, + split_string(String2r,"\n\r","\n\r",String2r3), + %%trace, + retractall(refs(_)),maplist(append,[[R1,String2r3]],[String2r21]), + retractall(refs_long(_)),maplist(append,[[R10,String2r3]],[String2r210]), + sort1(String2r21,String2r2), + assertz(refs(String2r2)), + assertz(refs_long(String2r210))%%split_string(String3r, SepandPad, SepandPad, String4) + ). + + +%% Sorts by first surname then title in AGPS + +sort1(List1,List2) :- + findall([C,B],(member(B,List1),downcase_atom(B,D),atom_string(D,C)),E),sort(E,F),findall(G,member([_,G],F),List2). + + +prepare_file_for_ml(String000,String021) :- + string_codes(String001,String000), + downcase_atom(String001,String00), + split_string(String00, "\n\r", "\n\r", String01), + delete(String01,"",String02), + findall(String03,(member(String02a,String02),split_string(String02a,".",".",String04), + ((String04=[String05|String06], + number_string(Number05,String05), + number_sentences(Number05,1,String06,[],String03))->true; + (findall(String08, + (SepandPad="&#@~%`$?-+*^,()|.:;=_/[]<>{}\n\r\s\t\\\"!'0123456789", + member(String04a,String04),split_string(String04a,SepandPad,SepandPad,String09), + append_list([[""],"",String09],String08)),String03)))),String0211), + append_list2(String0211,String021). + +number_sentences(_,_,[],String,String) :- !. 
number_sentences(Number_a,Number_b1,String06,String07,String08) :- + String06=[String00|String12], + +SepandPad="&#@~%`$?-+*^,()|.:;=_/[]<>{}\n\r\s\t\\\"!'0123456789", + split_string(String00,SepandPad,SepandPad,String09), + append_list([[Number_a],Number_b1,String09],String10), + append(String07,[String10],String11), + Number_b2 is Number_b1 + 1, + number_sentences(Number_a,Number_b2,String12,String11,String08). + + +data_instance_k_classification1(Data,I,K,C):- + maplist(o_oclass_disClass1(I),Data,DisAndClass), + keysort(DisAndClass,DSorted), + + %%writeln(DSorted), + + length(TopK,K), + append(TopK,_,DSorted), + %This is not a very good way of selecting k as you may have many values with the same distance, and the sorting just cuts these off + %Dsorted = [1-pos,3-pos,3-pos,3-neg,3-neg] + %Topk =[1-pos,3-pos,3-pos] + topk_vote(TopK,C). + +o_oclass_disClass1(O,[A,B,O2],D-[A,B,O2]):- + o_o_dis(O,O2,D). + +%% could be in either order +%% a([w,z,a,b,e,c,z,y],[1,1,[c]],[1,2,[a]]). + +%% true +%% a([a,c,b],[1,1,[a,d]],[1,2,[b]]). +%% a([a,c,b],[1,1,[a,c]],[1,2,[b]]). +%% a([a,b],[1,1,[a]],[1,2,[b]]). + +%% false +%% a([c,d],[1,1,[a]],[1,2,[b]]). + +%% q in "q a b" sends food like e in "c d e" + +/** + +[debug] ?- a([q,e,r,a,t,y,u,q,e,r,a,t,y,u,c,b,x,v,n,m],[1,1,[c,a,t,y,u]],[1,2,[c,a,t,y,u]]). [[1,1,[c,a,t,y,u]],[r,a,t,y,u]] +[[1,2,[c,a,t,y,u]],[r,a,t,y,u]] +true. + +X: +[debug] ?- a([q,e,r,a,t,y,u,c,b,x,v,n],[1,1,[c,a,t,y,u]],[1,2,[b,x,u]]). +[[1,1,[c,a,t,y,u]],[r,a,t,y,u]] +[[1,2,[b,x,u]],[c,b,x]] +true. + +[debug] ?- a([q,e,r,a,t,y,u,c,c,c,b,x,v,n],[1,1,[c,a,t,y,u]],[1,2,[b,x,v]]). [[1,1,[c,a,t,y,u]],[r,a,t,y,u]] +[[1,2,[b,x,v]],[c,c,c]] +true. 
+ +**/ + +check_strings(String1,ML_db) :- + %%member([P1,S1,String2],ML_db), + %%member([P2,S2,String3],ML_db), + ML_db=[String2a,String3a], +%%writeln(["String1,String2a,String3a",String1,String2a,String3a]), + String2=[0,0,String2a], + String3=[0,0,String3a], +%%a(String1,String2,String3):- + length(String1,Length_string1), +((writeln("Attempt 1"), +check_strings_container1(Length_string1,String1,String2,[[0,0,[xxx,xxx,xxx,xxx,xxx]],String2,[999,999,[]]],_,List2), + length(List2,Length_list2), + %%Length_list3 is Length_list2+1, + %%writeln(here), + check_strings_container1(Length_list2,List2,String3,[[0,0,[xxx,xxx,xxx,xxx,xxx]],String3,[999,999,[]]],_,_List3), + writeln("Success") + %%,trace + )->true; + (writeln("Failed"), + writeln("Attempt 2"), + ((check_strings_container1(Length_string1,String1,String3,[[0,0,[xxx,xxx,xxx,xxx,xxx]],String3,[999,999,[]]],_,List2), + length(List2,Length_list2), + %%Length_list3 is Length_list2+1, + %%writeln(here), + + check_strings_container1(Length_list2,List2,String2,[[0,0,[xxx,xxx,xxx,xxx,xxx]],String2,[999,999,[]]],_,_List3))-> + writeln("Success");(writeln("Failed"),fail))) +). + +check_strings_container1(Length_string1,String1,String2,Db,List2,List2b) :- +check_strings1(Length_string1,String1,String2,Db,List2,List2b),not(var(List2b)). +%%check_strings_container2(Length_string1,Length_string3,List1,String2,Db,List2,List2b) :- +%%check_strings2(Length_string1,Length_string3,List1,String2,Db,List2,List2b),not(var(List2b)). + + + %% finds the length of each string - from position 1 to (length of string1) of string1 + %% starts with the shortest possible string, then increases the length + %% repeat this for second string + %%string1 + + %%*** check with s2,s3 in db +check_strings1(0,_String1,_String2,_Db,List2,List2) :- !. 
+check_strings1(Length_string1,String1,String2,Db,List2,List2b) :- + %% finds the first str + %%length(List,Index), + length(List1,Length_string1), + append(_,List1,String1), + Length_string2 is Length_string1-1, + Length_string3 is Length_string1+1, + check_strings2(0,Length_string3,List1,String2,Db,List2,List2c), + (var(List2c)-> + check_strings1(Length_string2,String1,String2,Db,List2c,List2b);List2c=List2b),!. + +check_strings2(Length_string,Length_string,_String1,_String2,_Db,List2,List2) :- !. +check_strings2(Length_string1,Length_string3,List1,String2,Db,List2,List2b) :- + + + %%split_string(String4,SepandPad,SepandPad,String21), + + %% go through s1, removing the first word each time, until v + %%Length_string11 is Length_string1-1, + length(List11,Length_string1), + append(List11,List2,List1), + Length_string2 is Length_string1+1, + +%%writeln([list11,List11]), + + ((%%trace, + %%(List11=[z,a,b]->trace;true), + %%writeln([[_,_,List11],String2]), + %%notrace, + not(List11=[]), + +%%writeln(data_instance_k_classification1(Db +%% ,List11,1,A)), + data_instance_k_classification1(Db + %%String2 + ,List11,1,A), + %%writeln([string2,String2,list11,List11,a,A]), +A=String2,%%[_,_,List11], + %%trace, + + %%((List11=[z,a,b],A=[_,_,[z,t,b]],not(String2=[_,_,[c,z]]))->trace;notrace), + %%trace, +%%writeln([string2=list11,String2=List11]), +%%(List11=[y]->true%%trace +%%;true), +%%member(List11,Db), + %%**String2=[_,_,List11] + List2b=List2,%%,notrace + String2=[_,_,String21] + ,writeln([found,String21,List11]) + ) + ->(true + %%,writeln(List11) + ); + check_strings2(Length_string2,Length_string3,List1,String2,Db,_List2c,List2b)),!. + +numbers(N2,N1,Numbers,Numbers) :- + N2 is N1-1,!. +numbers(N2,N1,Numbers1,Numbers2) :- + N3 is N1+1, + append(Numbers1,[N1],Numbers3), + numbers(N2,N3,Numbers3,Numbers2). + +%% sheet_feeder(T),writeln(T). +%% T = ["a\na", "B\nb\nb", "C\nc\nc"]. 
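The `data_instance_k_classification1/4` predicate earlier in this file is a k-nearest-neighbour step: it pairs every database entry with its distance to the instance, `keysort/2`s the pairs, keeps the first K, and lets them vote via `topk_vote`. As the inline comment warns, cutting after the sort splits distance ties arbitrarily. A hedged Python sketch of the same flow (the distance function and all names here are assumptions for illustration, not the repository's code):

```python
from collections import Counter

def knn_classify(data, instance, k, distance):
    """Sort (distance, class) pairs and majority-vote the top k,
    mirroring maplist + keysort/2 + topk_vote."""
    ranked = sorted((distance(instance, entry), cls) for entry, cls in data)
    top_k = ranked[:k]  # note: ties at the cutoff are split arbitrarily
    votes = Counter(cls for _, cls in top_k)
    return votes.most_common(1)[0][0]
```

For example, with a Hamming-style distance over equal-length strings, the two nearest neighbours of `"aa"` among `("aa","pos")`, `("ab","pos")`, `("zz","neg")` are both `pos`, so the vote returns `pos`.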
+ +sheet_feeder(T) :- + directory_files("raw_sources/",F), + delete_invisibles_etc(F,G), + findall(K1,(member(H,G), + string_concat("raw_sources/",H,String00b), + phrase_from_file_s(string(String001), String00b), + string_codes(String000,String001), + string_concat(String000,"\n\n",String00_a), + %%trace, + strip_illegal_chars(String00_a,"",String00), + split_on_substring(String00,"\n\n","",J1), + %%maplist(append,[J2],[J1]), + %%findall(J4,(member(J3,J2),%%trace, + %%concat_list(J3,J4)),K1), + delete(J1,"",K1), + term_to_atom(K1,_K) + /** + string_concat("sources/",H,String00bb), + (open_s(String00bb,write,Stream1), + write(Stream1,K), + close(Stream1))**/ + + ),T). + %%split_on_double_newline(J,[],K)),T). + +split_on_substring("",_A,E,[E]) :- !. +split_on_substring(A,B,E,C) :- + string_concat(B,D,A), + split_on_substring(D,B,"",C1), + append([E],C1,C),!. +split_on_substring(A,B,E1,C) :- + string_concat(E,D,A), + string_length(E,1), + string_concat(E1,E,E2), + split_on_substring(D,B,E2,C),!. + +strip_illegal_chars("",A,A) :- !. +strip_illegal_chars(A,B,E) :- + string_concat(E1,D,A), + string_length(E1,1), + char_type(E1,quote), + string_concat(B,"'",F), + strip_illegal_chars(D,F,E). +strip_illegal_chars(A,B,E) :- + string_concat(C,D,A), + string_length(C,1), + (char_type(C,alnum)->true; + (char_type(C,white)->true; + (char_type(C,digit)->true; + (char_type(C,punct)->true; + (char_type(C,newline)))))), + string_concat(B,C,F), + strip_illegal_chars(D,F,E),!. +strip_illegal_chars(A,B,E) :- + string_concat(E1,D,A), + string_length(E1,1), + string_concat(B," ",F), + strip_illegal_chars(D,F,E),!. 
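The `split_on_substring/4` predicate above walks the string one character at a time, emitting the accumulated segment whenever the delimiter substring matches at the current position; `sheet_feeder` then deletes the empty segments. A Python re-expression of that behaviour, offered only as an illustrative sketch under the same semantics (names are mine):

```python
def split_on_substring(text, delim):
    """Split text on every occurrence of delim, keeping empty
    segments, as split_on_substring/4 does before delete(...,"",...)."""
    parts = []
    current = ""
    i = 0
    while i < len(text):
        if text.startswith(delim, i):
            # delimiter matched: emit the accumulator and skip past it
            parts.append(current)
            current = ""
            i += len(delim)
        else:
            current += text[i]
            i += 1
    parts.append(current)  # base case: emit the final accumulator
    return parts
```

Like the Prolog clause order, the delimiter test takes priority over consuming a single character, and the empty-string base case yields one final (possibly empty) segment.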
+ + +strip_illegal_chars1 :- + directory_files("raw_sources/",F), + delete_invisibles_etc(F,G), + findall(String00,(member(H,G), + string_concat("raw_sources/",H,String00b), + phrase_from_file_s(string(String001), String00b), + string_codes(String000,String001), + string_concat(String000,"\n\n",String00_a), + %%trace, + strip_illegal_chars(String00_a,"",String00), + %%split_on_substring(String00,"\n\n","",J1), + %%maplist(append,[J2],[J1]), + %%findall(J4,(member(J3,J2),%%trace, + %%concat_list(J3,J4)),K1), + %%delete(J1,"",K1), + term_to_atom(String00,K), + string_concat("sources/",H,String00bb), + (open_s(String00bb,write,Stream1), + write(Stream1,K), + close(Stream1)) + + ),_T). + + %% sheet_feeder and strip_illegal_chars1 are too slow. Use BBEdit to replace \n\n with ",\n\n", insert ["*","*",1," at start, "] at end and replace \\ with nothing \ No newline at end of file diff --git a/Lucian-Academy/sources/.DS_Store b/Lucian-Academy/sources/.DS_Store new file mode 100644 index 0000000000000000000000000000000000000000..2f3b113a8f52d8109c4da6b5695d8d4e9e916a01 Binary files /dev/null and b/Lucian-Academy/sources/.DS_Store differ diff --git a/Lucian-Academy/sources1/.DS_Store b/Lucian-Academy/sources1/.DS_Store new file mode 100644 index 0000000000000000000000000000000000000000..1c2ffe0b5ff20c53af0bb36eb2cd37b9d21131b7 Binary files /dev/null and b/Lucian-Academy/sources1/.DS_Store differ diff --git a/Lucian-Academy/student_marks1.txt b/Lucian-Academy/student_marks1.txt new file mode 100644 index 0000000000000000000000000000000000000000..0637a088a01e8ddab3bf3fa98dbe804cbde1a0dc --- /dev/null +++ b/Lucian-Academy/student_marks1.txt @@ -0,0 +1 @@ +[] \ No newline at end of file diff --git a/Lucian-Academy/student_numbers.txt b/Lucian-Academy/student_numbers.txt new file mode 100644 index 0000000000000000000000000000000000000000..0637a088a01e8ddab3bf3fa98dbe804cbde1a0dc --- /dev/null +++ b/Lucian-Academy/student_numbers.txt @@ -0,0 +1 @@ +[] \ No newline at end of file 
diff --git a/Lucian-Academy/student_numbers1.txt b/Lucian-Academy/student_numbers1.txt new file mode 100644 index 0000000000000000000000000000000000000000..0637a088a01e8ddab3bf3fa98dbe804cbde1a0dc --- /dev/null +++ b/Lucian-Academy/student_numbers1.txt @@ -0,0 +1 @@ +[] \ No newline at end of file diff --git a/Lucian-Academy/word_count.pl b/Lucian-Academy/word_count.pl new file mode 100644 index 0000000000000000000000000000000000000000..a6b97f69e48485b19952e644d8b126939aa51056 --- /dev/null +++ b/Lucian-Academy/word_count.pl @@ -0,0 +1,22 @@ +% ["Medicine","MEDICINE by Lucian Green Head of State Head Ache Prevention 1 of 4.txt",0,algorithms,"See also Hours Prayer."] + +:-include('../listprologinterpreter/la_strings.pl'). +:-include('../listprologinterpreter/la_strings_string.pl'). + +% word_count(["file","file.txt"],Words). +% Words = 69. + +% word_count(["string","a b c"],Words). +% Words = 3. + +word_count([Type,String1],Words) :- + SepandPad="&#@~%`$?-+*^,()|.:;=_/[]<>{}\n\r\s\t\\\"!'0123456789", + + (Type="file"-> + (phrase_from_file_s(string(String2), String1)); + String1=String2), + split_string(String2,SepandPad,SepandPad,String3), + %writeln(String3), + length(String3,Words). + + diff --git a/Lucian-Academy/wrap_sources.pl b/Lucian-Academy/wrap_sources.pl new file mode 100644 index 0000000000000000000000000000000000000000..c8bcb5269e8949244f589d606c630267eb73555c --- /dev/null +++ b/Lucian-Academy/wrap_sources.pl @@ -0,0 +1,35 @@ +:-include('../listprologinterpreter/la_strings.pl'). +:-include('../listprologinterpreter/la_strings_string.pl'). 
+wrap_sources:- + directory_files("sources/",F), + delete_invisibles_etc(F,G), +%%trace, +%SepandPad="#@~%`$?-+*^,()|.:;=_/[]<>{}\n\r\s\t\\!'0123456789", + findall(_,(member(Filex1,G), + string_concat("sources/",Filex1,Filex), + string_concat("sources1/",Filex1,Filexx), + + split_string(Filex1," "," ",Filex2), + append(_,["Green"|Filex3],Filex2), + findall([Filex31," "],(member(Filex31,Filex3)),Filex32), + maplist(append,[Filex32],[Filex321]), + concat_list(Filex321,Filex33), + string_concat(Filex4,".txt ",Filex33), + phrase_from_file_s(string(String00a), Filex), + string_codes(String02b,String00a), + + concat_list(["[\"Green, L 2021, ",Filex4,", Lucian Academy Press, Melbourne.\",\"Green, L 2021\",1,\"",String02b,"\"]"],Line), + %atom_to_term(String02b,String02a,[]), + + (open_s(Filexx,write,Stream1), +%% string_codes(BrDict3), + write(Stream1,Line), + close(Stream1))),_). + +delete_invisibles_etc(F,G) :- + findall(J,(member(H,F),atom_string(H,J),not(J="."),not(J=".."),not(string_concat(".",_,J))),G). + +string(String) --> list(String). + +list([]) --> []. +list([L|Ls]) --> [L], list(Ls). diff --git a/Lucian-Academy/wrap_sources2.pl b/Lucian-Academy/wrap_sources2.pl new file mode 100644 index 0000000000000000000000000000000000000000..31107ef731e58979716bc4dad25f159af947e773 --- /dev/null +++ b/Lucian-Academy/wrap_sources2.pl @@ -0,0 +1,62 @@ +:-include('../listprologinterpreter/la_strings.pl'). +:-include('../listprologinterpreter/la_strings_string.pl'). 
+wrap_sources:- + directory_files("sources/",F), + delete_invisibles_etc(F,G), +%%trace, +%SepandPad="#@~%`$?-+*^,()|.:;=_/[]<>{}\n\r\s\t\\!'0123456789", + findall(_,(member(Folderx1,G), + string_concat("sources/",Folderx1,Folderx), + directory_files(Folderx,F1), + delete_invisibles_etc(F1,G1), + + findall(_,(member(Filex1,G1), + %trace, + ((foldr(string_concat,["sources/",Folderx1,"/",Filex1],Filex), + foldr(string_concat,["sources1/",Folderx1,"/",Filex1],Filexx), + + split_string(Filex1," "," ",Filex2), + Filex3=Filex2, + findall([Filex31," "],(member(Filex31,Filex3)),Filex32), + maplist(append,[Filex32],[Filex321]), + concat_list(Filex321,Filex33), + string_concat(Filex4,".txt ",Filex33), + phrase_from_file_s(string(String00a), Filex), + string_codes(String02c,String00a), + + atomic_list_concat(B,"\"",String02c), + atomic_list_concat(B,"\\""",C), + %atomic_list_concat(C1,"\\",C), + %atomic_list_concat(C1,"\\\\",C2), + atom_string(C,String02b), + + once((string_concat(A,B1,Filex1),string_length(A,4))), + (not(A="dot-") -> + + (%trace, + Line=String02c, Filexx=Filexx2) +; + + ( %trace, + foldr(string_concat,["sources1/",Folderx1,"/",B1],Filexx2), +string_concat(B2,".txt",B1), + concat_list(["[\"Green, L 2024, ",B2,", Lucian Academy Press, Melbourne.\",","\"Green, L 2024\",",1,",\"",String02b,"\"]"],Line) + )), + %atom_to_term(String02b,String02a,[]), +%term_to_atom(Line,Line1), + foldr(string_concat,["sources1/",Folderx1,"/"],Folderxx1), + (exists_directory(Folderxx1)->true;make_directory(Folderxx1)), + (open_s(Filexx2,write,Stream1), +%% string_codes(BrDict3), + write(Stream1,Line), + close(Stream1)) + )->true;(writeln(["Error:",Filex1,"didn't convert."]))) + ),_)),_). + +delete_invisibles_etc(F,G) :- + findall(J,(member(H,F),atom_string(H,J),not(J="."),not(J=".."),not(string_concat(".",_,J))),G). + +string(String) --> list(String). + +list([]) --> []. +list([L|Ls]) --> [L], list(Ls). 
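`wrap_sources` in both files above rewrites each source file into a single Prolog-readable record: a list holding an AGPS-style reference, a short citation, the number 1, and the file's text. A small Python sketch of building that record (the format string follows the `concat_list` call in wrap_sources.pl; the function name and `year` default are my assumptions):

```python
def wrap_source(title, body, year=2021):
    """Build the ["<reference>","<citation>",1,"<text>"] line that
    wrap_sources.pl writes into sources1/."""
    reference = f"Green, L {year}, {title}, Lucian Academy Press, Melbourne."
    citation = f"Green, L {year}"
    return f'["{reference}","{citation}",1,"{body}"]'
```

wrap_sources2.pl differs mainly in escaping embedded quotes first and in only wrapping files whose names begin with a `dot-` prefix, copying the rest through unchanged.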
diff --git a/LuciansHandBitMap-Font/.DS_Store b/LuciansHandBitMap-Font/.DS_Store new file mode 100644 index 0000000000000000000000000000000000000000..5008ddfcf53c02e82d7eee2e57c38e5672ef89f6 Binary files /dev/null and b/LuciansHandBitMap-Font/.DS_Store differ diff --git a/LuciansHandBitMap-Font/LICENSE b/LuciansHandBitMap-Font/LICENSE new file mode 100644 index 0000000000000000000000000000000000000000..d925e8c4aa63a274f76a5c22de79d0ada99b0da7 --- /dev/null +++ b/LuciansHandBitMap-Font/LICENSE @@ -0,0 +1,29 @@ +BSD 3-Clause License + +Copyright (c) 2019, Lucian Green +All rights reserved. + +Redistribution and use in source and binary forms, with or without +modification, are permitted provided that the following conditions are met: + +1. Redistributions of source code must retain the above copyright notice, this + list of conditions and the following disclaimer. + +2. Redistributions in binary form must reproduce the above copyright notice, + this list of conditions and the following disclaimer in the documentation + and/or other materials provided with the distribution. + +3. Neither the name of the copyright holder nor the names of its + contributors may be used to endorse or promote products derived from + this software without specific prior written permission. + +THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" +AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE +IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE +DISCLAIMED. 
IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE +FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL +DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR +SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER +CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, +OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE +OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. diff --git a/LuciansHandBitMap-Font/README.md b/LuciansHandBitMap-Font/README.md new file mode 100644 index 0000000000000000000000000000000000000000..4ef4939795e9c3cf3838036a9a80a33e1631b71c --- /dev/null +++ b/LuciansHandBitMap-Font/README.md @@ -0,0 +1,77 @@ +# LuciansHandBitMap-Font
Renders the characters from LuciansHandBitMap Font using a state machine.

# Prerequisites

* Please download and install SWI-Prolog for your machine at `https://www.swi-prolog.org/build/`.

# 1. Install manually

Download this repository, the Languages repository and Cultural Translation Tool.

# 2. Or Install from List Prolog Package Manager (LPPM)

* Download the LPPM Repository:

```
mkdir GitHub
cd GitHub/
git clone https://github.com/luciangreen/List-Prolog-Package-Manager.git
cd List-Prolog-Package-Manager
swipl
['lppm'].
lppm_install("luciangreen","LuciansHandBitMap-Font").
halt
```

# Running

* In Shell:
`cd LuciansHandBitMap-Font`
`swipl`

* Load the LuciansHandBitMap-Font program by typing:
`['characterbr.pl'].`

* The algorithm is called in the form:
`ctobr0.`

* This gives the output:
```
A

[] [] [1,2] [] []
[] [] [1,2] [] []
[] [1] [] [2] []
[] [1] [] [2] []
[] [1,3] [3] [2,3] []
[1] [] [] [] [2]
[1] [] [] [] [2]
[] [] [] [] []
[] [] [] [] []

  *
  *
 * *
 * *
 ***
*   *
*   *
```
* etc.
+ +* Alternatively, the algorithm may be called with: +`ctobr('A').` +* Where `'A'` may be replaced with another character name. + +# Download Font + +You may download LuciansHandBitMap Font. + +# Authors + +Lucian Green - Initial programmer - Lucian Academy + +# License + +I licensed this project under the BSD3 License - see the LICENSE.md file for details diff --git a/LuciansHandBitMap-Font/characterbr.pl b/LuciansHandBitMap-Font/characterbr.pl new file mode 100644 index 0000000000000000000000000000000000000000..eeff1a3dc28a8f7c70ac5dd574c87d8862fefdf0 --- /dev/null +++ b/LuciansHandBitMap-Font/characterbr.pl @@ -0,0 +1,1982 @@ +:-include('characterbr1.pl'). +/** + +?- [characterbr]. +true. + +?- ctobr0. +A + +[] [] [1,2] [] [] +[] [] [1,2] [] [] +[] [1] [] [2] [] +[] [1] [] [2] [] +[] [1,3] [3] [2,3] [] +[1] [] [] [] [2] +[1] [] [] [] [2] +[] [] [] [] [] +[] [] [] [] [] + + * + * + * * + * * + *** +* * +* * + + + +B + +[1,2] [2] [2] [2,3] [] +[1] [] [] [] [3,4] +[1,5,6] [5,6] [5,6,7] [4,5] [] +[1] [] [] [7] [] +[1] [] [] [] [7,8] +[1] [] [] [8] [] +[1,9] [9] [8,9] [] [] +[] [] [] [] [] +[] [] [] [] [] + +**** +* * +**** +* * +* * +* * +*** + + + +C + +[] [] [1,2] [] [] +[] [2] [] [1] [] +[2,3] [] [] [] [1] +[3] [] [] [] [] +[3,4] [] [] [] [5] +[] [4] [] [5] [] +[] [] [4,5] [] [] +[] [] [] [] [] +[] [] [] [] [] + + * + * * +* * +* +* * + * * + * + + + +D + +[1,2] [2] [2,3] [] [] +[1] [] [] [3] [] +[1] [] [] [] [3,4] +[1] [] [] [] [4] +[1] [] [] [] [4,5] +[1] [] [] [5] [] +[1,6] [6] [5,6] [] [] +[] [] [] [] [] +[] [] [] [] [] + +*** +* * +* * +* * +* * +* * +*** + + + +E + +[1,2] [2] [2] [2] [2] +[1] [] [] [] [] +[1] [] [] [] [] +[1,3] [3] [3] [3] [3] +[1] [] [] [] [] +[1] [] [] [] [] +[1,4] [4] [4] [4] [4] +[] [] [] [] [] +[] [] [] [] [] + +***** +* +* +***** +* +* +***** + + + +F + +[1,2] [2] [2] [2] [2] +[1] [] [] [] [] +[1] [] [] [] [] +[1,3] [3] [3] [3] [3] +[1] [] [] [] [] +[1] [] [] [] [] +[1] [] [] [] [] +[] [] [] [] [] +[] [] [] [] [] + +***** +* +* 
+***** +* +* +* + + + +G + +[] [] [1,2] [] [] +[] [2] [] [1] [] +[2,3] [] [] [] [1] +[3] [] [] [] [] +[3,4] [] [6] [6] [5,6] +[] [4] [] [5] [] +[] [] [4,5] [] [] +[] [] [] [] [] +[] [] [] [] [] + + * + * * +* * +* +* *** + * * + * + + + +H + +[1] [] [] [] [3] +[1] [] [] [] [3] +[1] [] [] [] [3] +[1,2] [2] [2] [2] [2,3] +[1] [] [] [] [3] +[1] [] [] [] [3] +[1] [] [] [] [3] +[] [] [] [] [] +[] [] [] [] [] + +* * +* * +* * +***** +* * +* * +* * + + + +I + +[] [] [1] [] [] +[] [] [1] [] [] +[] [] [1] [] [] +[] [] [1] [] [] +[] [] [1] [] [] +[] [] [1] [] [] +[] [] [1] [] [] +[] [] [] [] [] +[] [] [] [] [] + + * + * + * + * + * + * + * + + + +J + +[] [] [] [] [1] +[] [] [] [] [1] +[] [] [] [] [1] +[] [] [] [] [1] +[3] [] [] [] [1,2] +[] [3] [] [2] [] +[] [] [2,3] [] [] +[] [] [] [] [] +[] [] [] [] [] + + * + * + * + * +* * + * * + * + + + +K + +[1] [] [] [] [2] +[1] [] [2] [2] [] +[1] [2] [] [] [] +[1,2,3] [] [] [] [] +[1] [3] [3] [] [] +[1] [] [] [3] [] +[1] [] [] [] [3] +[] [] [] [] [] +[] [] [] [] [] + +* * +* ** +** +* +*** +* * +* * + + + +L + +[1] [] [] [] [] +[1] [] [] [] [] +[1] [] [] [] [] +[1] [] [] [] [] +[1] [] [] [] [] +[1] [] [] [] [] +[1,2] [2] [2] [2] [2] +[] [] [] [] [] +[] [] [] [] [] + +* +* +* +* +* +* +***** + + + +M + +[1,2] [] [] [] [3,4] +[1,2] [] [] [] [3,4] +[1] [2] [] [3] [4] +[1] [2] [] [3] [4] +[1] [2] [] [3] [4] +[1] [] [2,3] [] [4] +[1] [] [2,3] [] [4] +[] [] [] [] [] +[] [] [] [] [] + +* * +* * +** ** +** ** +** ** +* * * +* * * + + + +N + +[1,2] [] [] [] [3] +[1] [2] [] [] [3] +[1] [2] [] [] [3] +[1] [] [2] [] [3] +[1] [] [] [2] [3] +[1] [] [] [2] [3] +[1] [] [] [] [2,3] +[] [] [] [] [] +[] [] [] [] [] + +* * +** * +** * +* * * +* ** +* ** +* * + + + +O + +[] [] [1,6] [] [] +[] [1] [] [6] [] +[1,2] [] [] [] [5,6] +[2] [] [] [] [5] +[2,3] [] [] [] [4,5] +[] [3] [] [4] [] +[] [] [3,4] [] [] +[] [] [] [] [] +[] [] [] [] [] + + * + * * +* * +* * +* * + * * + * + + + +P + +[1,2] [2] [2] [2,3] [] +[1] [] [] [] [3,4] +[1,5] [5] [5] [4,5] [] +[1] 
[] [] [] [] +[1] [] [] [] [] +[1] [] [] [] [] +[1] [] [] [] [] +[] [] [] [] [] +[] [] [] [] [] + +**** +* * +**** +* +* +* +* + + + +Q + +[] [] [1,6] [] [] +[] [1] [] [6] [] +[1,2] [] [] [] [5,6] +[2] [] [] [] [5] +[2,3] [] [7] [] [4,5] +[] [3] [] [4,7] [] +[] [] [3,4] [] [7] +[] [] [] [] [] +[] [] [] [] [] + + * + * * +* * +* * +* * * + * * + * * + + + +R + +[1,2] [2] [2] [2,3] [] +[1] [] [] [] [3,4] +[1,5,6] [5] [5] [4,5] [] +[1] [6] [] [] [] +[1] [] [6] [] [] +[1] [] [] [6] [] +[1] [] [] [] [6] +[] [] [] [] [] +[] [] [] [] [] + +**** +* * +**** +** +* * +* * +* * + + + +S + +[] [2,3] [2] [1,2] [] +[3,4] [] [] [] [1] +[4,5] [] [] [] [] +[] [5,6] [6] [6,7] [] +[] [] [] [] [7,8] +[11] [] [] [] [8,9] +[] [10,11] [10] [9,10] [] +[] [] [] [] [] +[] [] [] [] [] + + *** +* * +* + *** + * +* * + *** + + + +T + +[1] [1] [1,2] [1] [1] +[] [] [2] [] [] +[] [] [2] [] [] +[] [] [2] [] [] +[] [] [2] [] [] +[] [] [2] [] [] +[] [] [2] [] [] +[] [] [] [] [] +[] [] [] [] [] + +***** + * + * + * + * + * + * + + + +U + +[1] [] [] [] [4] +[1] [] [] [] [4] +[1] [] [] [] [4] +[1] [] [] [] [4] +[1,2] [] [] [] [3,4] +[] [2] [] [3] [] +[] [] [2,3] [] [] +[] [] [] [] [] +[] [] [] [] [] + +* * +* * +* * +* * +* * + * * + * + + + +V + +[1] [] [] [] [2] +[1] [] [] [] [2] +[] [1] [] [2] [] +[] [1] [] [2] [] +[] [1] [] [2] [] +[] [] [1,2] [] [] +[] [] [1,2] [] [] +[] [] [] [] [] +[] [] [] [] [] + +* * +* * + * * + * * + * * + * + * + + + +W + +[1] [] [2,3] [] [4] +[1] [] [2,3] [] [4] +[1] [] [2,3] [] [4] +[] [1] [2] [3] [4] +[] [1,2] [] [3,4] [] +[] [1,2] [] [3,4] [] +[] [1,2] [] [3,4] [] +[] [] [] [] [] +[] [] [] [] [] + +* * * +* * * +* * * + **** + * * + * * + * * + + + +X + +[1] [] [] [] [2] +[] [1] [] [2] [] +[] [1] [] [2] [] +[] [] [1,2] [] [] +[] [2] [] [1] [] +[] [2] [] [1] [] +[2] [] [] [] [1] +[] [] [] [] [] +[] [] [] [] [] + +* * + * * + * * + * + * * + * * +* * + + + +Y + +[1] [] [] [] [2] +[] [1] [] [2] [] +[] [] [1,2,3] [] [] +[] [] [3] [] [] +[] [] [3] [] [] +[] [] [3] [] [] +[] 
[] [3] [] [] +[] [] [] [] [] +[] [] [] [] [] + +* * + * * + * + * + * + * + * + + + +Z + +[1] [1] [1] [1] [1,2] +[] [] [] [2] [] +[] [] [] [2] [] +[] [] [2] [] [] +[] [2] [] [] [] +[] [2] [] [] [] +[2,3] [3] [3] [3] [3] +[] [] [] [] [] +[] [] [] [] [] + +***** + * + * + * + * + * +***** + + + +a + +[] [] [] [] [] +[] [] [] [] [] +[] [] [1,2] [] [5] +[] [2] [] [1] [5] +[2,3] [] [] [] [1,4,5] +[] [3] [] [4] [5] +[] [] [3,4] [] [5] +[] [] [] [] [] +[] [] [] [] [] + + + + * * + * ** +* * + * ** + * * + + + +b + +[1] [] [] [] [] +[1] [] [] [] [] +[1] [] [2,3] [] [] +[1] [2] [] [3] [] +[1,2,5] [] [] [] [3,4] +[1] [5] [] [4] [] +[1] [] [4,5] [] [] +[] [] [] [] [] +[] [] [] [] [] + +* +* +* * +** * +* * +** * +* * + + + +c + +[] [] [] [] [] +[] [] [] [] [] +[] [] [1,2] [] [] +[] [2] [] [1] [] +[2,3] [] [] [] [] +[] [3] [] [4] [] +[] [] [3,4] [] [] +[] [] [] [] [] +[] [] [] [] [] + + + + * + * * +* + * * + * + + + +d + +[] [] [] [] [1] +[] [] [] [] [1] +[] [] [2,3] [] [1] +[] [3] [] [2] [1] +[3,4] [] [] [] [1,2,5] +[] [4] [] [5] [1] +[] [] [4,5] [] [1] +[] [] [] [] [] +[] [] [] [] [] + + * + * + * * + * ** +* * + * ** + * * + + + +e + +[] [] [] [] [] +[] [] [] [] [] +[] [] [2,3] [] [] +[] [3] [] [2] [] +[1,3,4] [1] [1] [1] [1,2] +[] [4] [] [] [] +[] [] [4,5] [5] [] +[] [] [] [] [] +[] [] [] [] [] + + + + * + * * +***** + * + ** + + + +f + +[] [1,2] [] [] [] +[2,3] [] [1] [] [] +[3] [] [] [] [] +[3,4] [4] [] [] [] +[3] [] [] [] [] +[3] [] [] [] [] +[3] [] [] [] [] +[] [] [] [] [] +[] [] [] [] [] + + * +* * +* +** +* +* +* + + + +g + +[] [] [] [] [] +[] [] [] [] [] +[] [] [1,2] [] [5] +[] [2] [] [1] [5] +[2,3] [] [] [] [1,4,5] +[] [3] [] [4] [5] +[7] [] [3,4] [] [5,6] +[] [7] [] [6] [] +[] [] [6,7] [] [] + + + + * * + * ** +* * + * ** +* * * + * * + * + +h + +[1] [] [] [] [] +[1] [] [] [] [] +[1] [] [3,4] [] [] +[1] [3] [] [4] [] +[1,2,3] [] [] [] [4,5] +[1,2] [] [] [] [5] +[1,2] [] [] [] [5] +[] [] [] [] [] +[] [] [] [] [] + +* +* +* * +** * +* * +* * +* * + + + +i + +[] [] 
[1] [] [] +[] [] [] [] [] +[] [] [2] [] [] +[] [] [2] [] [] +[] [] [2] [] [] +[] [] [2] [] [] +[] [] [2] [] [] +[] [] [] [] [] +[] [] [] [] [] + + * + + * + * + * + * + * + + + +j + +[] [] [] [] [1] +[] [] [] [] [] +[] [] [] [] [2] +[] [] [] [] [2] +[] [] [] [] [2] +[] [] [] [] [2] +[4] [] [] [] [2,3] +[] [4] [] [3] [] +[] [] [3,4] [] [] + + * + + * + * + * + * +* * + * * + * + +k + +[1] [] [] [] [] +[1] [] [] [] [] +[1] [] [] [2] [2] +[1] [2] [2] [] [] +[1,2,3] [3] [] [] [] +[1] [] [3] [3] [] +[1] [] [] [] [3] +[] [] [] [] [] +[] [] [] [] [] + +* +* +* ** +*** +** +* ** +* * + + + +l + +[1] [] [] [] [] +[1] [] [] [] [] +[1] [] [] [] [] +[1] [] [] [] [] +[1] [] [] [] [] +[1,2] [] [3] [] [] +[] [2,3] [] [] [] +[] [] [] [] [] +[] [] [] [] [] + +* +* +* +* +* +* * + * + + + +m + +[] [] [] [] [] +[] [] [] [] [] +[1] [2,3] [] [6,7] [] +[1,2] [] [3,4,5,6] [] [7,8] +[1] [] [4,5] [] [8] +[1] [] [4,5] [] [8] +[1] [] [4,5] [] [8] +[] [] [] [] [] +[] [] [] [] [] + + + +** * +* * * +* * * +* * * +* * * + + + +n + +[] [] [] [] [] +[] [] [] [] [] +[1] [] [3,4] [] [] +[1] [3] [] [4] [] +[1,2,3] [] [] [] [4,5] +[1,2] [] [] [] [5] +[1,2] [] [] [] [5] +[] [] [] [] [] +[] [] [] [] [] + + + +* * +** * +* * +* * +* * + + + +o + +[] [] [] [] [] +[] [] [] [] [] +[] [] [1,4] [] [] +[] [1] [] [4] [] +[1,2] [] [] [] [3,4] +[] [2] [] [3] [] +[] [] [2,3] [] [] +[] [] [] [] [] +[] [] [] [] [] + + + + * + * * +* * + * * + * + + + +p + +[] [] [] [] [] +[] [] [] [] [] +[1] [] [2,3] [] [] +[1] [2] [] [3] [] +[1,2,5] [] [] [] [3,4] +[1] [5] [] [4] [] +[1] [] [4,5] [] [] +[1] [] [] [] [] +[1] [] [] [] [] + + + +* * +** * +* * +** * +* * +* +* + +q + +[] [] [] [] [] +[] [] [] [] [] +[] [] [2,3] [] [1] +[] [3] [] [2] [1] +[3,4] [] [] [] [1,2,5] +[] [4] [] [5] [1] +[] [] [4,5] [] [1] +[] [] [] [] [1] +[] [] [] [] [1] + + + + * * + * ** +* * + * ** + * * + * + * + +r + +[] [] [] [] [] +[] [] [] [] [] +[1] [] [3,4] [4] [] +[1] [3] [] [] [] +[1,2,3] [] [] [] [] +[1,2] [] [] [] [] +[1,2] [] [] [] [] +[] [] 
[] [] [] +[] [] [] [] [] + + + +* ** +** +* +* +* + + + +s + +[] [] [] [] [] +[] [] [] [] [] +[] [1,2] [1] [1] [] +[2,3] [] [] [] [] +[] [3,4] [4] [4,5] [] +[] [] [] [] [5,6] +[] [7] [7] [6,7] [] +[] [] [] [] [] +[] [] [] [] [] + + + + *** +* + *** + * + *** + + + +t + +[] [] [1] [] [] +[] [] [1] [] [] +[2] [2] [1,2] [2] [2] +[] [] [1] [] [] +[] [] [1] [] [] +[] [] [1] [] [] +[] [] [1] [] [] +[] [] [] [] [] +[] [] [] [] [] + + * + * +***** + * + * + * + * + + + +u + +[] [] [] [] [] +[] [] [] [] [] +[1] [] [] [] [4,5] +[1] [] [] [] [4,5] +[1,2] [] [] [] [3,4,5] +[] [2] [] [3] [5] +[] [] [2,3] [] [5] +[] [] [] [] [] +[] [] [] [] [] + + + +* * +* * +* * + * ** + * * + + + +v + +[] [] [] [] [] +[] [] [] [] [] +[1] [] [] [] [2] +[] [1] [] [] [2] +[] [1] [] [2] [] +[] [] [1] [2] [] +[] [] [1,2] [] [] +[] [] [] [] [] +[] [] [] [] [] + + + +* * + * * + * * + ** + * + + + +w + +[] [] [] [] [] +[] [] [] [] [] +[1] [] [2,3] [] [4] +[1] [] [2,3] [] [4] +[] [1] [2] [3] [4] +[] [1,2] [] [3,4] [] +[] [1,2] [] [3,4] [] +[] [] [] [] [] +[] [] [] [] [] + + + +* * * +* * * + **** + * * + * * + + + +x + +[] [] [] [] [] +[] [] [] [] [] +[1] [] [] [] [2] +[] [1] [] [2] [] +[] [] [1,2] [] [] +[] [2] [] [1] [] +[2] [] [] [] [1] +[] [] [] [] [] +[] [] [] [] [] + + + +* * + * * + * + * * +* * + + + +y + +[] [] [] [] [] +[] [] [] [] [] +[1] [] [] [] [4,5] +[1] [] [] [] [4,5] +[1,2] [] [] [] [3,4,5] +[] [2] [] [3] [5] +[7] [] [2,3] [] [5,6] +[] [7] [] [6] [] +[] [] [6,7] [] [] + + + +* * +* * +* * + * ** +* * * + * * + * + +z + +[] [] [] [] [] +[] [] [] [] [] +[1] [1] [1] [1] [1,2] +[] [] [] [2] [] +[] [] [2] [] [] +[] [2] [] [] [] +[2,3] [3] [3] [3] [3] +[] [] [] [] [] +[] [] [] [] [] + + + +***** + * + * + * +***** + + + +? 
+ +[] [] [1,2] [] [] +[] [1] [] [2] [] +[1] [] [3,4] [3] [2,3] +[] [] [4] [] [] +[] [] [4] [] [] +[] [] [] [] [] +[] [] [5] [] [] +[] [] [] [] [] +[] [] [] [] [] + + * + * * +* *** + * + * + + * + + + +- + +[] [] [] [] [] +[] [] [] [] [] +[] [] [] [] [] +[] [] [] [] [] +[1] [1] [1] [1] [1] +[] [] [] [] [] +[] [] [] [] [] +[] [] [] [] [] +[] [] [] [] [] + + + + + +***** + + + + + + + +[] [] [] [] [] +[] [] [] [] [] +[] [] [] [] [] +[] [] [] [] [] +[] [] [] [] [] +[] [] [] [] [] +[] [] [] [] [] +[] [] [] [] [] +[] [] [] [] [] + + + + + + + + + + + +, + +[] [] [] [] [] +[] [] [] [] [] +[] [] [] [] [] +[] [] [] [] [] +[] [] [] [] [] +[] [] [] [] [] +[] [] [1] [] [] +[] [] [1,2] [] [] +[] [2] [] [] [] + + + + + + + + * + * + * + +( + +[] [] [1] [] [] +[] [1] [] [] [] +[1,2] [] [] [] [] +[2] [] [] [] [] +[2,3] [] [] [] [] +[] [3] [] [] [] +[] [] [3] [] [] +[] [] [] [] [] +[] [] [] [] [] + + * + * +* +* +* + * + * + + + +) + +[] [] [1] [] [] +[] [] [] [1] [] +[] [] [] [] [1,2] +[] [] [] [] [2] +[] [] [] [] [2,3] +[] [] [] [3] [] +[] [] [3] [] [] +[] [] [] [] [] +[] [] [] [] [] + + * + * + * + * + * + * + * + + + +| + +[] [] [1] [] [] +[] [] [1] [] [] +[] [] [1] [] [] +[] [] [1] [] [] +[] [] [1] [] [] +[] [] [1] [] [] +[] [] [1] [] [] +[] [] [1] [] [] +[] [] [1] [] [] + + * + * + * + * + * + * + * + * + * + +. 
+ +[] [] [] [] [] +[] [] [] [] [] +[] [] [] [] [] +[] [] [] [] [] +[] [] [] [] [] +[] [] [] [] [] +[] [] [1] [] [] +[] [] [] [] [] +[] [] [] [] [] + + + + + + + + * + + + +: + +[] [] [] [] [] +[] [] [] [] [] +[] [] [] [] [] +[] [] [1] [] [] +[] [] [] [] [] +[] [] [2] [] [] +[] [] [] [] [] +[] [] [] [] [] +[] [] [] [] [] + + + + + * + + * + + + + +_ + +[] [] [] [] [] +[] [] [] [] [] +[] [] [] [] [] +[] [] [] [] [] +[] [] [] [] [] +[] [] [] [] [] +[1] [1] [1] [1] [1] +[] [] [] [] [] +[] [] [] [] [] + + + + + + + +***** + + + +\ + +[1] [] [] [] [] +[] [1] [] [] [] +[] [1] [] [] [] +[] [] [1] [] [] +[] [] [] [1] [] +[] [] [] [1] [] +[] [] [] [] [1] +[] [] [] [] [] +[] [] [] [] [] + +* + * + * + * + * + * + * + + + +[ + +[1,2] [1] [1] [] [] +[2] [] [] [] [] +[2] [] [] [] [] +[2] [] [] [] [] +[2] [] [] [] [] +[2] [] [] [] [] +[2,3] [3] [3] [] [] +[] [] [] [] [] +[] [] [] [] [] + +*** +* +* +* +* +* +*** + + + +] + +[] [] [1] [1] [1,2] +[] [] [] [] [2] +[] [] [] [] [2] +[] [] [] [] [2] +[] [] [] [] [2] +[] [] [] [] [2] +[] [] [3] [3] [2,3] +[] [] [] [] [] +[] [] [] [] [] + + *** + * + * + * + * + * + *** + + + +< + +[] [] [] [] [] +[] [] [] [] [] +[] [] [1] [] [] +[] [1] [] [] [] +[1,2] [] [] [] [] +[] [2] [] [] [] +[] [] [2] [] [] +[] [] [] [] [] +[] [] [] [] [] + + + + * + * +* + * + * + + + +> + +[] [] [] [] [] +[] [] [] [] [] +[] [] [1] [] [] +[] [] [] [1] [] +[] [] [] [] [1,2] +[] [] [] [2] [] +[] [] [2] [] [] +[] [] [] [] [] +[] [] [] [] [] + + + + * + * + * + * + * + + + +0 + +[] [] [1,6] [] [] +[] [1] [] [6,7] [] +[1,2] [] [] [7] [5,6] +[2] [] [7] [] [5] +[2,3] [] [7] [] [4,5] +[] [3,7] [] [4] [] +[] [] [3,4] [] [] +[] [] [] [] [] +[] [] [] [] [] + + * + * * +* ** +* * * +* * * + * * + * + + + +1 + +[] [] [1] [] [] +[] [] [1] [] [] +[] [] [1] [] [] +[] [] [1] [] [] +[] [] [1] [] [] +[] [] [1] [] [] +[] [] [1] [] [] +[] [] [] [] [] +[] [] [] [] [] + + * + * + * + * + * + * + * + + + +2 + +[] [] [1,2] [] [] +[] [1] [] [2] [] +[1] [] [] [] [2,3] +[] [] [] [3] [] +[] 
[] [3] [] [] +[] [3] [] [] [] +[3,4] [4] [4] [4] [4] +[] [] [] [] [] +[] [] [] [] [] + + * + * * +* * + * + * + * +***** + + + +3 + +[] [1,2] [2] [2,3] [] +[1] [] [] [] [3,4] +[] [] [] [] [4,5] +[] [] [6,7] [5,6,7,8] [] +[] [] [] [] [8,9] +[12] [] [] [] [9,10] +[] [11,12] [11] [10,11] [] +[] [] [] [] [] +[] [] [] [] [] + + *** +* * + * + ** + * +* * + *** + + + +4 + +[] [] [] [1,2] [] +[] [] [2] [1] [] +[] [] [2] [1] [] +[] [2] [] [1] [] +[2,3] [3] [3] [1,3] [3] +[] [] [] [1] [] +[] [] [] [1] [] +[] [] [] [] [] +[] [] [] [] [] + + * + ** + ** + * * +***** + * + * + + + +5 + +[1,2] [1] [1] [1] [1] +[2] [] [] [] [] +[2,3] [3] [3,4] [] [] +[] [] [] [4] [] +[6] [] [] [] [4,5] +[] [6] [] [5] [] +[] [] [5,6] [] [] +[] [] [] [] [] +[] [] [] [] [] + +***** +* +*** + * +* * + * * + * + + + +6 + +[] [] [1,2] [] [] +[] [2] [] [1] [] +[2,3] [] [6,7] [] [1] +[3] [7] [] [6] [] +[3,4,7] [] [] [] [5,6] +[] [4] [] [5] [] +[] [] [4,5] [] [] +[] [] [] [] [] +[] [] [] [] [] + + * + * * +* * * +** * +* * + * * + * + + + +7 + +[1] [1] [1] [1] [1,2] +[] [] [] [2] [] +[] [] [2] [] [] +[] [] [2] [] [] +[] [2] [] [] [] +[2] [] [] [] [] +[] [] [] [] [] +[] [] [] [] [] +[] [] [] [] [] + +***** + * + * + * + * +* + + + + +8 + +[] [] [1,6] [] [] +[] [1,2] [] [5,6] [] +[] [] [2,5] [] [] +[] [5] [] [2] [] +[4,5] [] [] [] [2,3] +[] [4] [] [3] [] +[] [] [3,4] [] [] +[] [] [] [] [] +[] [] [] [] [] + + * + * * + * + * * +* * + * * + * + + + +9 + +[] [] [1,2] [] [] +[] [2] [] [1] [] +[2,3] [] [] [] [1,4,5] +[] [3] [] [4] [5] +[] [] [3,4] [] [5] +[] [] [] [] [5] +[] [] [] [] [5] +[] [] [] [] [] +[] [] [] [] [] + + * + * * +* * + * ** + * * + * + * + + + +true. + + +**/ + +%%% *** START + +%%x(5). %% 2 +%%y(9). %% 2 + +x(5). +y(9). 
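The comment block above pairs each character with a 5×9 grid, where a cell's list holds the indices of the strokes passing through it, and with the asterisk picture that grid produces; `x(5)` and `y(9)` record those dimensions. As a rough illustration of the rendering step only (a Python sketch, not part of this repository; the Prolog below does the real work in `prettyprint1A`):

```python
# Illustrative sketch (not from the repository): render a list of
# [x, y, marks] cells as asterisk art, top row (y = 9) first,
# mirroring what prettyprint1A does for a 5x9 character grid.

WIDTH, HEIGHT = 5, 9  # matches x(5) and y(9) in the Prolog source

def render(cells):
    # A cell counts as drawn when its mark list is non-empty.
    occupied = {(x, y) for x, y, marks in cells if marks}
    rows = []
    for y in range(HEIGHT, 0, -1):  # print from the top row down
        rows.append(''.join('*' if (x, y) in occupied else ' '
                            for x in range(1, WIDTH + 1)))
    return '\n'.join(rows)

# A tiny example: two cells of a vertical stroke in column 3.
print(render([[3, 9, [1]], [3, 8, [1]], [1, 1, []]]))
```

Cells with a non-empty mark list print as `*` and everything else as a space, which is exactly the convention used in the pictures above.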
+ +ctobr0 :- ctobr1(['A','B','C','D','E','F','G','H','I','J','K','L','M','N','O','P','Q','R','S','T','U','V','W','X','Y','Z','a','b','c','d','e','f','g','h','i','j','k','l','m','n','o','p','q','r','s','t','u','v','w','x','y','z','?','-',' ',',','(',')','|','.',':','_','\\','[',']','<','>','0','1','2','3','4','5','6','7','8','9']). +ctobr1([]) :- !. +ctobr1([C|Cs]) :- + ctobr(C),writeln(''),ctobr1(Cs). + +grid([[1,9,[ ]],[2,9,[ ]],[3,9,[ ]],[4,9,[ ]],[5,9,[ ]], + [1,8,[ ]],[2,8,[ ]],[3,8,[ ]],[4,8,[ ]],[5,8,[ ]], + [1,7,[ ]],[2,7,[ ]],[3,7,[ ]],[4,7,[ ]],[5,7,[ ]], + [1,6,[ ]],[2,6,[ ]],[3,6,[ ]],[4,6,[ ]],[5,6,[ ]], + [1,5,[ ]],[2,5,[ ]],[3,5,[ ]],[4,5,[ ]],[5,5,[ ]], + [1,4,[ ]],[2,4,[ ]],[3,4,[ ]],[4,4,[ ]],[5,4,[ ]], + [1,3,[ ]],[2,3,[ ]],[3,3,[ ]],[4,3,[ ]],[5,3,[ ]], + [1,2,[ ]],[2,2,[ ]],[3,2,[ ]],[4,2,[ ]],[5,2,[ ]], + [1,1,[ ]],[2,1,[ ]],[3,1,[ ]],[4,1,[ ]],[5,1,[ ]]]). + +ctobr(C1) :- + C1=' ', + writeln(C1), + characterbr(Cs), + member([C1,C1Name,C2],Cs), + writeln(C1Name),writeln(''), + y(Y), + prettyprint1(C2,Y),writeln(''), + prettyprint1A(C2,Y),!. %% 2 + +ctobr(C1) :- + writeln(C1), + characterbr(Cs), + member([C1,C1Name,C2],Cs), + writeln(C1Name),writeln(''), + /**Grid1=[[1,1,[ ]],[2,1,[ ]]], + ** + [[1,3,[ ]],[2,3,[ ]],[3,3,[ ]], + [1,2,[ ]],[2,2,[ ]],[3,2,[ ]], + [1,1,[ ]],[2,1,[ ]],[3,1,[ ]]], + +**%%[[1,1,[ ]]],**/ + grid(Grid1), + member([X1,Y1,M1],C2), + N2=1, + Stroke1=0, + %% States:[[this,state],Line:[[any,state,to,this,state],[true,or,false]],State:[[this,state,to,a,state],states]] + States= [ + [[1,s], false, [[1,s],[1,-]]], + [[1,-], false, [[3,s],[2,-]]], + %%[[2,s], false, [[2,s],[3,-]]], + [[2,-], true, [[3,s],[2,-]]], + [[3,s], true, [[1,s],[1,-]]] + %%[[3,-], false, [[3,s],[2,-]]] + ], + +M1=[N2|_Ms], +(changegrid2(_Prevstate,[1,s],Grid1,Grid2,X1,Y1,C2,_C4,N2,Stroke1,States); +changegrid2(_Prevstate,[1,-],Grid1,Grid2,X1,Y1,C2,_C4,N2,Stroke1,States)), + + y(Y), + prettyprint1(Grid2,Y),writeln(''), + prettyprint1A(Grid2,Y),!. 
%% 2 + +prettyprint1(_C,0) :- !. +prettyprint1(C,N) :- + prettyprint2(C,N,1), + writeln(''), + N2 is N-1, + prettyprint1(C,N2). +prettyprint2(_C,_N,X1) :- x(X), X1 is X+1, !. +prettyprint2(C,N,M) :- + member([M,N,M2],C), + write(M2),write('\t'), + M3 is M+1, + prettyprint2(C,N,M3). + +prettyprint1A(_C,0) :- !. +prettyprint1A(C,N) :- + prettyprint2A(C,N,1), + writeln(''), + N2 is N-1, + prettyprint1A(C,N2). +prettyprint2A(_C,_N,X1) :- x(X), X1 is X+1, !. +prettyprint2A(C,N,M) :- + member([M,N,M2],C), + (not(M2=[])->write('*');write(' ')), + M3 is M+1, + prettyprint2A(C,N,M3). + +changegrid2(Prevstate,_State,Grid1,Grid2,X,Y,C2,C2,N,Stroke1,_States) :- + %%notrace, + member11(C2,N,false,false), + (Prevstate=[1,-]->(Stroke2 is Stroke1+1,line1(X,Y,X,Y,Grid1,Grid2,Stroke2)); + (Grid2=Grid1)), + %%trace, + !. +changegrid2(_Prevstate,State1,Grid1,Grid2,X1,Y1,C2,C4,N2,Stroke1,States) :- + member([X2,Y2,M1],C2), + check(State1,M1,N2,X1,Y1,X2,Y2,C2,C4,Grid1,Grid2,Stroke1,States). + +check(State1,M1,N2,X1,Y1,X2,Y2,C2,C6,Grid1,Grid3,Stroke1,States) :- +%%writeln([a,state1,m1,n2,x1,y1,c2,y2,c2,c6,grid1,grid3,stroke1,states, +%%State1,M1,N2,X1,Y1,X2,Y2,C2,C6,Grid1,Grid3,Stroke1,States]), + State1=[_,s], + M1=[N2|Ms1], + Ms1=[s|Ms], %% has stopped, don't need to connect line from here %% + get(States,State1,Line,States2), + check2(Line,X1,Y1,X2,Y2,C2,M1,Ms,C5,Grid1,Grid2,Stroke1,Stroke2,true), + N3 is N2+1, + %%writeln([a,state,State1,index,N3,stroke1,Stroke1]), +%%writeln([gotostates(states2,States2,grid2,Grid2,grid3,Grid3,x2,X2,y2,Y2,c5,C5,c6,C6,n3,N3,stroke1,Stroke1,states,States)]), + %%writeln([gotostates(States2,Grid2,Grid3,X2,Y2,C5,C6,N3,Stroke1,States)]), + gotostates(State1,States2,Grid2,Grid3,X2,Y2,C5,C6,N3,Stroke2,States). 
+check(State1,M1,N2,X1,Y1,X2,Y2,C2,C4,Grid1,Grid3,Stroke1,States) :- +%%writeln([b,state1,m1,n2,x1,y1,c2,y2,c2,c4,grid1,grid3,stroke1,states, +%%State1,M1,N2,X1,Y1,X2,Y2,C2,C4,Grid1,Grid3,Stroke1,States]), + State1=[_,-], + M1=[N2|Ms], + get(States,State1,Line,States2), + check2(Line,X1,Y1,X2,Y2,C2,M1,Ms,C5,Grid1,Grid2,Stroke1,Stroke2,false), + %%Stroke3 is Stroke2+1, + N3 is N2+1, + %%writeln([b,state,State1,index,N3,stroke1,Stroke2]), + gotostates(State1,States2,Grid2,Grid3,X2,Y2,C5,C4,N3,Stroke2,States). + +get(States,State,Line,States3) :- + member(State2,States), + State2=[State,Line,States3]. + +check2(true,X,Y,X,Y,C,_M1,_Ms,C,Grid,Grid,Stroke,Stroke,_S) :- !. +check2(true,X1,Y1,X2,Y2,C2,M1,Ms,C4,Grid1,Grid3,Stroke1,Stroke2,_S) :- + update(C2,X2,Y2,M1,Ms,C4),%% + %%writeln([a,update(C2,X2,Y2,M1,Ms,C4)]),%% + %%notrace, + Stroke2 is Stroke1+1, + line1(X1,Y1,X2,Y2,Grid1,Grid3,Stroke2). + %%trace, + %%writeln([a1,line1(X1,Y1,X2,Y2,Grid1,Grid3,Stroke1)]). %% +check2(false,_X1,_Y1,X2,Y2,C2,M1,Ms,C4,Grid1,Grid2,Stroke1,Stroke2,S) :- + update(C2,X2,Y2,M1,Ms,C4), + %%writeln([b,update(C2,X2,Y2,M1,Ms,C4)]), + (S=true->(Stroke2 is Stroke1+1,line1(X2,Y2,X2,Y2,Grid1,Grid2,Stroke2)); + (Stroke2=Stroke1,Grid2=Grid1)),!.%% + +gotostates(_,[],_Grid,_Grid2,_,_,_,_,_,_,_) :- + fail, !. +gotostates(Prevstate,States1,Grid1,Grid2,X,Y,C,C2,N,Stroke,States) :- + States1=[State|States2], + (changegrid2(Prevstate,State,Grid1,Grid2,X,Y,C,C2,N,Stroke,States); + gotostates(Prevstate,States2,Grid1,Grid2,X,Y,C,C2,N,Stroke,States)). + %%,! %% This may stop the program from working because of stopping it from trying states + +update(C2,X,Y,M1,Ms,C4) :- + delete(C2,[X,Y,M1],C3), + append(C3,[[X,Y,Ms]],C4). + +member11([],_N,F,F) :- !. +member11([C2|C2s],N,F1,F2) :- + C2=[_,_,M],(member(N,M)->Flag=true;Flag=false), + disjunction(F1,Flag,F3), + member11(C2s,N,F3,F2). + +disjunction(A,B,true) :- + (A=true;B=true), !. +disjunction(_,_,false) :- !. 
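The geometry predicates that follow (`line1` through `flipxy`) rasterise each stroke with a slope-intercept walk: sort the endpoints, compute the gradient m and intercept c, plot y = round(m*x + c) at each x, and, when m is steeper than ±1, flip the graph in the y=x line first so the stroke is not disjointed. A rough Python sketch of that idea (illustrative only, not a translation of the Prolog; `line_cells` is a hypothetical name):

```python
# Illustrative sketch (not from the repository) of the stroke
# rasterisation idea used below: walk the shallow axis one cell at a
# time, round the other coordinate, and swap axes for steep lines.

def line_cells(x1, y1, x2, y2):
    steep = abs(y2 - y1) > abs(x2 - x1)
    if steep:                       # flip in the y = x line
        x1, y1, x2, y2 = y1, x1, y2, x2
    if x1 > x2:                     # walk left to right (cf. sortbyx)
        x1, y1, x2, y2 = x2, y2, x1, y1
    m = (y2 - y1) / (x2 - x1) if x2 != x1 else 0.0
    c = y1 - m * x1
    cells = [(x, round(m * x + c)) for x in range(x1, x2 + 1)]
    if steep:                       # flip back
        cells = [(y, x) for (x, y) in cells]
    return cells

# A steep stroke from (1, 1) to (2, 9) stays connected: every row
# between y = 1 and y = 9 receives exactly one cell.
cells = line_cells(1, 1, 2, 9)
```

Flipping the steep case guarantees one cell per row, which is why the Prolog version goes to the trouble of `flipxy` rather than stepping x directly.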
+ +line1(X1,Y1,X2,Y2,C2,C3,N3) :- + %%(X1=(XA1=X1,XA2=X2);(XA1=X2,XA2=X1)), + %%(Y1=(YA1=Y1,YA2=Y2);(YA1=Y2,YA2=Y1)), + %%gridline1(XA1,YA1,XA2,YA2,C2,C3,N3). + gridline1(X1,Y1,X2,Y2,C2,C3,N3). + +%% Graphs (X1,Y1) to (X2,Y2) + +gridline1(X1,Y1,X2,Y2,C2,C3,N3) :- + sortbyx(X1,Y1,X2,Y2,XA1,YA1,XA2,YA2), + equation(XA1,YA1,XA2,YA2,M,C), + gridline2(XA1,YA1,XA2,YA2,M,C,C2,C3,N3), + !. + %%writeln(Grid1), + %%sort(YA1,YA2,YB1,YB2), + %%print(XA1,YB1,XA2,YB2,Grid1,_Grid2),!. + +%% Sorts (X1,Y1) and (X2,Y2) by X + +sortbyx(X1,Y1,X2,Y2,X1,Y1,X2,Y2) :- + X2 >= X1. +sortbyx(X1,Y1,X2,Y2,X2,Y2,X1,Y1) :- + X2 < X1. + +%% Finds the rise and run of (X1,Y1) and (X2,Y2) + +equation(X1,Y1,X2,Y2,M,C) :- + DY is Y2-Y1, + DX is X2-X1, + %%writeln([y2,Y2,y1,Y1,x2,X2,x1,X1,dy,DY,dx,DX]), %% + equation2(DY,DX,M,Y1,X1,C). + +%% Finds the gradient m and y-intercept c of (X1,Y1) and (X2,Y2) + +equation2(_DY,0,999999999,_Y1,X1,X1) :- + !. +equation2(DY,DX,M,Y1,X1,C) :- + M is DY/DX, + C is Y1-M*X1 + %%,writeln([m,M,c,C]) + . + +%% Finds the graph of the line connecting the two points. It does this by finding the graph flipped in the y=x line if the gradient m is greater than 1 or less than -1, so that the graph is not disjointed + +gridline2(X1,_Y1,X2,_Y2,M,C,C2,Grid,N3) :- + M =< 1, M >= -1, + %%x(X),%%X1 is X+1, + gridline3(X1,X2,M,C,C2,Grid,N3,_X). +gridline2(X1,Y1,_X2,Y2,M,_C,C22,Grid1,N3) :- + (M > 1; M < -1), + M2 is 1/M, + sorty(Y1,Y2,YA1,YA2), + C2 is X1-M2*Y1, + flipxy(C22,[],Grid), + %%y(Y), + %%Y1 is Y+1, + gridline3(YA1,YA2,M2,C2,Grid,Grid2,N3,_Y1), + %%writeln(['***',flipxygrid,Grid2]), + flipxy(Grid2,[],Grid1). + +%% Sorts Y1 and Y2 into ascending order; both orderings must succeed so that steep strokes whose endpoints arrive top-down can still be drawn (named sorty to avoid SWI-Prolog's built-in sort/4) + +sorty(Y1,Y2,Y1,Y2) :- + Y1 =< Y2. +sorty(Y1,Y2,Y2,Y1) :- + Y1 > Y2. + +%% Plots a point at each x-value of the graph + +gridline3(X1,X2,_M,_C,Grid,Grid,_N3,_N4) :- + %%X1 is N4+1. %% swap, + X1 is X2+1.
+gridline3(X1,X2,M,C,Grid1,Grid2,N3,_N4) :- + Y is round(M*X1+C), + %%Coord = [X1,Y], + member([X1,Y,M2],Grid1), + delete(Grid1,[X1,Y,M2],Grid11), + append(M2,[N3],M3), + append(Grid11,[[X1,Y,M3]],Grid3), + %%writeln([X1,Y,M3]), %% + X3 is X1+1, + gridline3(X3,X2,M,C,Grid3,Grid2,N3,_N42). + +%% Flips the graph in the y=x line + +flipxy([],Grid,Grid) :- !. +flipxy(Grid1,Grid2,Grid3) :- + Grid1 = [Coord1 | Coords], + Coord1 = [X, Y, M], + Coord2 = [Y, X, M], + append(Grid2,[Coord2],Grid4), + flipxy(Coords,Grid4,Grid3). + +%% Prints the graph from the top row to the bottom row + +%%**['tA',[[1,1,[1,s]],[2,1,[2]]]], +%% characterbr([['tA',[[1,1,[1,3]],[2,1,[2,s,4]]]], + +%%characterbr +%%['ta', [[1,3,[1,3 ]],[2,3,[2,s ]],[3,3,[ ]], +%% [1,2,[4,s ]],[2,2,[5 ]],[3,2,[6 ]], +%% [1,1,[ ]],[2,1,[ ]],[3,1,[ ]]]], + +%%['t', +%%[[1,2,[1 ]],[2,2,[2,s ]], try with number after s +%%[1,1,[3 ]],[2,1,[4 ]]]], diff --git a/LuciansHandBitMap-Font/characterbr1.pl b/LuciansHandBitMap-Font/characterbr1.pl new file mode 100644 index 0000000000000000000000000000000000000000..38cc35326bed044ade76e9b8a9445a575634f1e4 --- /dev/null +++ b/LuciansHandBitMap-Font/characterbr1.pl @@ -0,0 +1,847 @@ + +characterbr([['A',uppera, +[[1,9,[ ]],[2,9,[ ]],[3,9,[1,3 ]],[4,9,[ ]],[5,9,[ ]], + [1,8,[ ]],[2,8,[ ]],[3,8,[ ]],[4,8,[ ]],[5,8,[ ]], + [1,7,[ ]],[2,7,[ ]],[3,7,[ ]],[4,7,[ ]],[5,7,[ ]], + [1,6,[ ]],[2,6,[ ]],[3,6,[ ]],[4,6,[ ]],[5,6,[ ]], + [1,5,[ ]],[2,5,[5 ]],[3,5,[ ]],[4,5,[6 ]],[5,5,[ ]], + [1,4,[ ]],[2,4,[ ]],[3,4,[ ]],[4,4,[ ]],[5,4,[ ]], + [1,3,[2,s ]],[2,3,[ ]],[3,3,[ ]],[4,3,[ ]],[5,3,[4,s ]], + [1,2,[ ]],[2,2,[ ]],[3,2,[ ]],[4,2,[ ]],[5,2,[ ]], + [1,1,[ ]],[2,1,[ ]],[3,1,[ ]],[4,1,[ ]],[5,1,[ ]]]], + + ['B',upperb, +[[1,9,[1,3 ]],[2,9,[ ]],[3,9,[ ]],[4,9,[4 ]],[5,9,[ ]], + [1,8,[ ]],[2,8,[ ]],[3,8,[ ]],[4,8,[ ]],[5,8,[5 ]], + [1,7,[7 ]],[2,7,[ ]],[3,7,[8 ]],[4,7,[6 ]],[5,7,[ ]], + [1,6,[ ]],[2,6,[ ]],[3,6,[ ]],[4,6,[ ]],[5,6,[ ]], + [1,5,[ ]],[2,5,[ ]],[3,5,[ ]],[4,5,[ ]],[5,5,[9 ]], + 
[1,4,[ ]],[2,4,[ ]],[3,4,[ ]],[4,4,[ ]],[5,4,[ ]], + [1,3,[2,s,11]],[2,3,[ ]],[3,3,[10 ]],[4,3,[ ]],[5,3,[ ]], + [1,2,[ ]],[2,2,[ ]],[3,2,[ ]],[4,2,[ ]],[5,2,[ ]], + [1,1,[ ]],[2,1,[ ]],[3,1,[ ]],[4,1,[ ]],[5,1,[ ]]]], + + ['C',upperc, +[[1,9,[ ]],[2,9,[ ]],[3,9,[2 ]],[4,9,[ ]],[5,9,[ ]], + [1,8,[ ]],[2,8,[ ]],[3,8,[ ]],[4,8,[ ]],[5,8,[ ]], + [1,7,[3 ]],[2,7,[ ]],[3,7,[ ]],[4,7,[ ]],[5,7,[1 ]], + [1,6,[ ]],[2,6,[ ]],[3,6,[ ]],[4,6,[ ]],[5,6,[ ]], + [1,5,[4 ]],[2,5,[ ]],[3,5,[ ]],[4,5,[ ]],[5,5,[6 ]], + [1,4,[ ]],[2,4,[ ]],[3,4,[ ]],[4,4,[ ]],[5,4,[ ]], + [1,3,[ ]],[2,3,[ ]],[3,3,[5 ]],[4,3,[ ]],[5,3,[ ]], + [1,2,[ ]],[2,2,[ ]],[3,2,[ ]],[4,2,[ ]],[5,2,[ ]], + [1,1,[ ]],[2,1,[ ]],[3,1,[ ]],[4,1,[ ]],[5,1,[ ]]]], + + ['D',upperd, +[[1,9,[1,3 ]],[2,9,[ ]],[3,9,[4 ]],[4,9,[ ]],[5,9,[ ]], + [1,8,[ ]],[2,8,[ ]],[3,8,[ ]],[4,8,[ ]],[5,8,[ ]], + [1,7,[ ]],[2,7,[ ]],[3,7,[ ]],[4,7,[ ]],[5,7,[5 ]], + [1,6,[ ]],[2,6,[ ]],[3,6,[ ]],[4,6,[ ]],[5,6,[ ]], + [1,5,[ ]],[2,5,[ ]],[3,5,[ ]],[4,5,[ ]],[5,5,[6 ]], + [1,4,[ ]],[2,4,[ ]],[3,4,[ ]],[4,4,[ ]],[5,4,[ ]], + [1,3,[2,s,8]],[2,3,[ ]],[3,3,[7 ]],[4,3,[ ]],[5,3,[ ]], + [1,2,[ ]],[2,2,[ ]],[3,2,[ ]],[4,2,[ ]],[5,2,[ ]], + [1,1,[ ]],[2,1,[ ]],[3,1,[ ]],[4,1,[ ]],[5,1,[ ]]]], + + ['E',uppere, +[[1,9,[1,3 ]],[2,9,[ ]],[3,9,[ ]],[4,9,[ ]],[5,9,[4,s ]], + [1,8,[ ]],[2,8,[ ]],[3,8,[ ]],[4,8,[ ]],[5,8,[ ]], + [1,7,[ ]],[2,7,[ ]],[3,7,[ ]],[4,7,[ ]],[5,7,[ ]], + [1,6,[5 ]],[2,6,[ ]],[3,6,[ ]],[4,6,[ ]],[5,6,[6,s ]], + [1,5,[ ]],[2,5,[ ]],[3,5,[ ]],[4,5,[ ]],[5,5,[ ]], + [1,4,[ ]],[2,4,[ ]],[3,4,[ ]],[4,4,[ ]],[5,4,[ ]], + [1,3,[2,s,7]],[2,3,[ ]],[3,3,[ ]],[4,3,[ ]],[5,3,[8 ]], + [1,2,[ ]],[2,2,[ ]],[3,2,[ ]],[4,2,[ ]],[5,2,[ ]], + [1,1,[ ]],[2,1,[ ]],[3,1,[ ]],[4,1,[ ]],[5,1,[ ]]]], + + ['F',upperf, +[[1,9,[1,3 ]],[2,9,[ ]],[3,9,[ ]],[4,9,[ ]],[5,9,[4,s ]], + [1,8,[ ]],[2,8,[ ]],[3,8,[ ]],[4,8,[ ]],[5,8,[ ]], + [1,7,[ ]],[2,7,[ ]],[3,7,[ ]],[4,7,[ ]],[5,7,[ ]], + [1,6,[5 ]],[2,6,[ ]],[3,6,[ ]],[4,6,[ ]],[5,6,[6 ]], + [1,5,[ ]],[2,5,[ 
]],[3,5,[ ]],[4,5,[ ]],[5,5,[ ]], + [1,4,[ ]],[2,4,[ ]],[3,4,[ ]],[4,4,[ ]],[5,4,[ ]], + [1,3,[2,s ]],[2,3,[ ]],[3,3,[ ]],[4,3,[ ]],[5,3,[ ]], + [1,2,[ ]],[2,2,[ ]],[3,2,[ ]],[4,2,[ ]],[5,2,[ ]], + [1,1,[ ]],[2,1,[ ]],[3,1,[ ]],[4,1,[ ]],[5,1,[ ]]]], + + ['G',upperg, +[[1,9,[ ]],[2,9,[ ]],[3,9,[2 ]],[4,9,[ ]],[5,9,[ ]], + [1,8,[ ]],[2,8,[ ]],[3,8,[ ]],[4,8,[ ]],[5,8,[ ]], + [1,7,[3 ]],[2,7,[ ]],[3,7,[ ]],[4,7,[ ]],[5,7,[1 ]], + [1,6,[ ]],[2,6,[ ]],[3,6,[ ]],[4,6,[ ]],[5,6,[ ]], + [1,5,[4 ]],[2,5,[ ]],[3,5,[7 ]],[4,5,[ ]],[5,5,[6,s,8]], + [1,4,[ ]],[2,4,[ ]],[3,4,[ ]],[4,4,[ ]],[5,4,[ ]], + [1,3,[ ]],[2,3,[ ]],[3,3,[5 ]],[4,3,[ ]],[5,3,[ ]], + [1,2,[ ]],[2,2,[ ]],[3,2,[ ]],[4,2,[ ]],[5,2,[ ]], + [1,1,[ ]],[2,1,[ ]],[3,1,[ ]],[4,1,[ ]],[5,1,[ ]]]], + + ['H',upperh, +[[1,9,[1 ]],[2,9,[ ]],[3,9,[ ]],[4,9,[ ]],[5,9,[5 ]], + [1,8,[ ]],[2,8,[ ]],[3,8,[ ]],[4,8,[ ]],[5,8,[ ]], + [1,7,[ ]],[2,7,[ ]],[3,7,[ ]],[4,7,[ ]],[5,7,[ ]], + [1,6,[3 ]],[2,6,[ ]],[3,6,[ ]],[4,6,[ ]],[5,6,[4,s ]], + [1,5,[ ]],[2,5,[ ]],[3,5,[ ]],[4,5,[ ]],[5,5,[ ]], + [1,4,[ ]],[2,4,[ ]],[3,4,[ ]],[4,4,[ ]],[5,4,[ ]], + [1,3,[2,s ]],[2,3,[ ]],[3,3,[ ]],[4,3,[ ]],[5,3,[6 ]], + [1,2,[ ]],[2,2,[ ]],[3,2,[ ]],[4,2,[ ]],[5,2,[ ]], + [1,1,[ ]],[2,1,[ ]],[3,1,[ ]],[4,1,[ ]],[5,1,[ ]]]], + + ['I',upperi, +[[1,9,[ ]],[2,9,[ ]],[3,9,[1 ]],[4,9,[ ]],[5,9,[ ]], + [1,8,[ ]],[2,8,[ ]],[3,8,[ ]],[4,8,[ ]],[5,8,[ ]], + [1,7,[ ]],[2,7,[ ]],[3,7,[ ]],[4,7,[ ]],[5,7,[ ]], + [1,6,[ ]],[2,6,[ ]],[3,6,[ ]],[4,6,[ ]],[5,6,[ ]], + [1,5,[ ]],[2,5,[ ]],[3,5,[ ]],[4,5,[ ]],[5,5,[ ]], + [1,4,[ ]],[2,4,[ ]],[3,4,[ ]],[4,4,[ ]],[5,4,[ ]], + [1,3,[ ]],[2,3,[ ]],[3,3,[2 ]],[4,3,[ ]],[5,3,[ ]], + [1,2,[ ]],[2,2,[ ]],[3,2,[ ]],[4,2,[ ]],[5,2,[ ]], + [1,1,[ ]],[2,1,[ ]],[3,1,[ ]],[4,1,[ ]],[5,1,[ ]]]], + + ['J',upperj, +[[1,9,[ ]],[2,9,[ ]],[3,9,[ ]],[4,9,[ ]],[5,9,[1 ]], + [1,8,[ ]],[2,8,[ ]],[3,8,[ ]],[4,8,[ ]],[5,8,[ ]], + [1,7,[ ]],[2,7,[ ]],[3,7,[ ]],[4,7,[ ]],[5,7,[ ]], + [1,6,[ ]],[2,6,[ ]],[3,6,[ ]],[4,6,[ ]],[5,6,[ ]], + 
[1,5,[4 ]],[2,5,[ ]],[3,5,[ ]],[4,5,[ ]],[5,5,[2 ]], + [1,4,[ ]],[2,4,[ ]],[3,4,[ ]],[4,4,[ ]],[5,4,[ ]], + [1,3,[ ]],[2,3,[ ]],[3,3,[3 ]],[4,3,[ ]],[5,3,[ ]], + [1,2,[ ]],[2,2,[ ]],[3,2,[ ]],[4,2,[ ]],[5,2,[ ]], + [1,1,[ ]],[2,1,[ ]],[3,1,[ ]],[4,1,[ ]],[5,1,[ ]]]], + + ['K',upperk, +[[1,9,[1 ]],[2,9,[ ]],[3,9,[ ]],[4,9,[ ]],[5,9,[3 ]], + [1,8,[ ]],[2,8,[ ]],[3,8,[ ]],[4,8,[ ]],[5,8,[ ]], + [1,7,[ ]],[2,7,[ ]],[3,7,[ ]],[4,7,[ ]],[5,7,[ ]], + [1,6,[4 ]],[2,6,[ ]],[3,6,[ ]],[4,6,[ ]],[5,6,[ ]], + [1,5,[ ]],[2,5,[ ]],[3,5,[ ]],[4,5,[ ]],[5,5,[ ]], + [1,4,[ ]],[2,4,[ ]],[3,4,[ ]],[4,4,[ ]],[5,4,[ ]], + [1,3,[2,s ]],[2,3,[ ]],[3,3,[ ]],[4,3,[ ]],[5,3,[5 ]], + [1,2,[ ]],[2,2,[ ]],[3,2,[ ]],[4,2,[ ]],[5,2,[ ]], + [1,1,[ ]],[2,1,[ ]],[3,1,[ ]],[4,1,[ ]],[5,1,[ ]]]], + + ['L',upperl, +[[1,9,[1 ]],[2,9,[ ]],[3,9,[ ]],[4,9,[ ]],[5,9,[ ]], + [1,8,[ ]],[2,8,[ ]],[3,8,[ ]],[4,8,[ ]],[5,8,[ ]], + [1,7,[ ]],[2,7,[ ]],[3,7,[ ]],[4,7,[ ]],[5,7,[ ]], + [1,6,[ ]],[2,6,[ ]],[3,6,[ ]],[4,6,[ ]],[5,6,[ ]], + [1,5,[ ]],[2,5,[ ]],[3,5,[ ]],[4,5,[ ]],[5,5,[ ]], + [1,4,[ ]],[2,4,[ ]],[3,4,[ ]],[4,4,[ ]],[5,4,[ ]], + [1,3,[2 ]],[2,3,[ ]],[3,3,[ ]],[4,3,[ ]],[5,3,[3 ]], + [1,2,[ ]],[2,2,[ ]],[3,2,[ ]],[4,2,[ ]],[5,2,[ ]], + [1,1,[ ]],[2,1,[ ]],[3,1,[ ]],[4,1,[ ]],[5,1,[ ]]]], + + ['M',upperm, +[[1,9,[1,3 ]],[2,9,[ ]],[3,9,[ ]],[4,9,[ ]],[5,9,[5 ]], + [1,8,[ ]],[2,8,[ ]],[3,8,[ ]],[4,8,[ ]],[5,8,[ ]], + [1,7,[ ]],[2,7,[ ]],[3,7,[ ]],[4,7,[ ]],[5,7,[ ]], + [1,6,[ ]],[2,6,[ ]],[3,6,[ ]],[4,6,[ ]],[5,6,[ ]], + [1,5,[ ]],[2,5,[ ]],[3,5,[ ]],[4,5,[ ]],[5,5,[ ]], + [1,4,[ ]],[2,4,[ ]],[3,4,[ ]],[4,4,[ ]],[5,4,[ ]], + [1,3,[2,s ]],[2,3,[ ]],[3,3,[4 ]],[4,3,[ ]],[5,3,[6 ]], + [1,2,[ ]],[2,2,[ ]],[3,2,[ ]],[4,2,[ ]],[5,2,[ ]], + [1,1,[ ]],[2,1,[ ]],[3,1,[ ]],[4,1,[ ]],[5,1,[ ]]]], + + ['N',uppern, +[[1,9,[1,3 ]],[2,9,[ ]],[3,9,[ ]],[4,9,[ ]],[5,9,[5 ]], + [1,8,[ ]],[2,8,[ ]],[3,8,[ ]],[4,8,[ ]],[5,8,[ ]], + [1,7,[ ]],[2,7,[ ]],[3,7,[ ]],[4,7,[ ]],[5,7,[ ]], + [1,6,[ ]],[2,6,[ ]],[3,6,[ ]],[4,6,[ 
]],[5,6,[ ]], + [1,5,[ ]],[2,5,[ ]],[3,5,[ ]],[4,5,[ ]],[5,5,[ ]], + [1,4,[ ]],[2,4,[ ]],[3,4,[ ]],[4,4,[ ]],[5,4,[ ]], + [1,3,[2,s ]],[2,3,[ ]],[3,3,[ ]],[4,3,[ ]],[5,3,[4 ]], + [1,2,[ ]],[2,2,[ ]],[3,2,[ ]],[4,2,[ ]],[5,2,[ ]], + [1,1,[ ]],[2,1,[ ]],[3,1,[ ]],[4,1,[ ]],[5,1,[ ]]]], + + ['O',uppero, +[[1,9,[ ]],[2,9,[ ]],[3,9,[1,7 ]],[4,9,[ ]],[5,9,[ ]], + [1,8,[ ]],[2,8,[ ]],[3,8,[ ]],[4,8,[ ]],[5,8,[ ]], + [1,7,[2 ]],[2,7,[ ]],[3,7,[ ]],[4,7,[ ]],[5,7,[6 ]], + [1,6,[ ]],[2,6,[ ]],[3,6,[ ]],[4,6,[ ]],[5,6,[ ]], + [1,5,[3 ]],[2,5,[ ]],[3,5,[ ]],[4,5,[ ]],[5,5,[5 ]], + [1,4,[ ]],[2,4,[ ]],[3,4,[ ]],[4,4,[ ]],[5,4,[ ]], + [1,3,[ ]],[2,3,[ ]],[3,3,[4 ]],[4,3,[ ]],[5,3,[ ]], + [1,2,[ ]],[2,2,[ ]],[3,2,[ ]],[4,2,[ ]],[5,2,[ ]], + [1,1,[ ]],[2,1,[ ]],[3,1,[ ]],[4,1,[ ]],[5,1,[ ]]]], + + ['P',upperp, +[[1,9,[1,3 ]],[2,9,[ ]],[3,9,[ ]],[4,9,[4 ]],[5,9,[ ]], + [1,8,[ ]],[2,8,[ ]],[3,8,[ ]],[4,8,[ ]],[5,8,[5 ]], + [1,7,[7 ]],[2,7,[ ]],[3,7,[ ]],[4,7,[6 ]],[5,7,[ ]], + [1,6,[ ]],[2,6,[ ]],[3,6,[ ]],[4,6,[ ]],[5,6,[ ]], + [1,5,[ ]],[2,5,[ ]],[3,5,[ ]],[4,5,[ ]],[5,5,[ ]], + [1,4,[ ]],[2,4,[ ]],[3,4,[ ]],[4,4,[ ]],[5,4,[ ]], + [1,3,[2,s ]],[2,3,[ ]],[3,3,[ ]],[4,3,[ ]],[5,3,[ ]], + [1,2,[ ]],[2,2,[ ]],[3,2,[ ]],[4,2,[ ]],[5,2,[ ]], + [1,1,[ ]],[2,1,[ ]],[3,1,[ ]],[4,1,[ ]],[5,1,[ ]]]], + + ['Q',upperq, +[[1,9,[ ]],[2,9,[ ]],[3,9,[1,7,s]],[4,9,[ ]],[5,9,[ ]], + [1,8,[ ]],[2,8,[ ]],[3,8,[ ]],[4,8,[ ]],[5,8,[ ]], + [1,7,[2 ]],[2,7,[ ]],[3,7,[ ]],[4,7,[ ]],[5,7,[6 ]], + [1,6,[ ]],[2,6,[ ]],[3,6,[ ]],[4,6,[ ]],[5,6,[ ]], + [1,5,[3 ]],[2,5,[ ]],[3,5,[8 ]],[4,5,[ ]],[5,5,[5 ]], + [1,4,[ ]],[2,4,[ ]],[3,4,[ ]],[4,4,[ ]],[5,4,[ ]], + [1,3,[ ]],[2,3,[ ]],[3,3,[4 ]],[4,3,[ ]],[5,3,[9 ]], + [1,2,[ ]],[2,2,[ ]],[3,2,[ ]],[4,2,[ ]],[5,2,[ ]], + [1,1,[ ]],[2,1,[ ]],[3,1,[ ]],[4,1,[ ]],[5,1,[ ]]]], + + ['R',upperr, +[[1,9,[1,3 ]],[2,9,[ ]],[3,9,[ ]],[4,9,[4 ]],[5,9,[ ]], + [1,8,[ ]],[2,8,[ ]],[3,8,[ ]],[4,8,[ ]],[5,8,[5 ]], + [1,7,[7 ]],[2,7,[ ]],[3,7,[ ]],[4,7,[6 ]],[5,7,[ ]], + [1,6,[ 
]],[2,6,[ ]],[3,6,[ ]],[4,6,[ ]],[5,6,[ ]], + [1,5,[ ]],[2,5,[ ]],[3,5,[ ]],[4,5,[ ]],[5,5,[ ]], + [1,4,[ ]],[2,4,[ ]],[3,4,[ ]],[4,4,[ ]],[5,4,[ ]], + [1,3,[2,s ]],[2,3,[ ]],[3,3,[ ]],[4,3,[ ]],[5,3,[8 ]], + [1,2,[ ]],[2,2,[ ]],[3,2,[ ]],[4,2,[ ]],[5,2,[ ]], + [1,1,[ ]],[2,1,[ ]],[3,1,[ ]],[4,1,[ ]],[5,1,[ ]]]], + + ['S',uppers, +[[1,9,[ ]],[2,9,[3 ]],[3,9,[ ]],[4,9,[2 ]],[5,9,[ ]], + [1,8,[4 ]],[2,8,[ ]],[3,8,[ ]],[4,8,[ ]],[5,8,[1 ]], + [1,7,[5 ]],[2,7,[ ]],[3,7,[ ]],[4,7,[ ]],[5,7,[ ]], + [1,6,[ ]],[2,6,[6 ]],[3,6,[ ]],[4,6,[7 ]],[5,6,[ ]], + [1,5,[ ]],[2,5,[ ]],[3,5,[ ]],[4,5,[ ]],[5,5,[8 ]], + [1,4,[12 ]],[2,4,[ ]],[3,4,[ ]],[4,4,[ ]],[5,4,[9 ]], + [1,3,[ ]],[2,3,[11 ]],[3,3,[ ]],[4,3,[10 ]],[5,3,[ ]], + [1,2,[ ]],[2,2,[ ]],[3,2,[ ]],[4,2,[ ]],[5,2,[ ]], + [1,1,[ ]],[2,1,[ ]],[3,1,[ ]],[4,1,[ ]],[5,1,[ ]]]], + + ['T',uppert, +[[1,9,[1 ]],[2,9,[ ]],[3,9,[3 ]],[4,9,[ ]],[5,9,[2,s ]], + [1,8,[ ]],[2,8,[ ]],[3,8,[ ]],[4,8,[ ]],[5,8,[ ]], + [1,7,[ ]],[2,7,[ ]],[3,7,[ ]],[4,7,[ ]],[5,7,[ ]], + [1,6,[ ]],[2,6,[ ]],[3,6,[ ]],[4,6,[ ]],[5,6,[ ]], + [1,5,[ ]],[2,5,[ ]],[3,5,[ ]],[4,5,[ ]],[5,5,[ ]], + [1,4,[ ]],[2,4,[ ]],[3,4,[ ]],[4,4,[ ]],[5,4,[ ]], + [1,3,[ ]],[2,3,[ ]],[3,3,[4 ]],[4,3,[ ]],[5,3,[ ]], + [1,2,[ ]],[2,2,[ ]],[3,2,[ ]],[4,2,[ ]],[5,2,[ ]], + [1,1,[ ]],[2,1,[ ]],[3,1,[ ]],[4,1,[ ]],[5,1,[ ]]]], + + ['U',upperu, +[[1,9,[1 ]],[2,9,[ ]],[3,9,[ ]],[4,9,[ ]],[5,9,[5 ]], + [1,8,[ ]],[2,8,[ ]],[3,8,[ ]],[4,8,[ ]],[5,8,[ ]], + [1,7,[ ]],[2,7,[ ]],[3,7,[ ]],[4,7,[ ]],[5,7,[ ]], + [1,6,[ ]],[2,6,[ ]],[3,6,[ ]],[4,6,[ ]],[5,6,[ ]], + [1,5,[2 ]],[2,5,[ ]],[3,5,[ ]],[4,5,[ ]],[5,5,[4 ]], + [1,4,[ ]],[2,4,[ ]],[3,4,[ ]],[4,4,[ ]],[5,4,[ ]], + [1,3,[ ]],[2,3,[ ]],[3,3,[3 ]],[4,3,[ ]],[5,3,[ ]], + [1,2,[ ]],[2,2,[ ]],[3,2,[ ]],[4,2,[ ]],[5,2,[ ]], + [1,1,[ ]],[2,1,[ ]],[3,1,[ ]],[4,1,[ ]],[5,1,[ ]]]], + + ['V',upperv, +[[1,9,[1 ]],[2,9,[ ]],[3,9,[ ]],[4,9,[ ]],[5,9,[3 ]], + [1,8,[ ]],[2,8,[ ]],[3,8,[ ]],[4,8,[ ]],[5,8,[ ]], + [1,7,[ ]],[2,7,[ ]],[3,7,[ ]],[4,7,[ 
]],[5,7,[ ]], + [1,6,[ ]],[2,6,[ ]],[3,6,[ ]],[4,6,[ ]],[5,6,[ ]], + [1,5,[ ]],[2,5,[ ]],[3,5,[ ]],[4,5,[ ]],[5,5,[ ]], + [1,4,[ ]],[2,4,[ ]],[3,4,[ ]],[4,4,[ ]],[5,4,[ ]], + [1,3,[ ]],[2,3,[ ]],[3,3,[2 ]],[4,3,[ ]],[5,3,[ ]], + [1,2,[ ]],[2,2,[ ]],[3,2,[ ]],[4,2,[ ]],[5,2,[ ]], + [1,1,[ ]],[2,1,[ ]],[3,1,[ ]],[4,1,[ ]],[5,1,[ ]]]], + + ['W',upperw, +[[1,9,[1 ]],[2,9,[ ]],[3,9,[3 ]],[4,9,[ ]],[5,9,[5 ]], + [1,8,[ ]],[2,8,[ ]],[3,8,[ ]],[4,8,[ ]],[5,8,[ ]], + [1,7,[ ]],[2,7,[ ]],[3,7,[ ]],[4,7,[ ]],[5,7,[ ]], + [1,6,[ ]],[2,6,[ ]],[3,6,[ ]],[4,6,[ ]],[5,6,[ ]], + [1,5,[ ]],[2,5,[ ]],[3,5,[ ]],[4,5,[ ]],[5,5,[ ]], + [1,4,[ ]],[2,4,[ ]],[3,4,[ ]],[4,4,[ ]],[5,4,[ ]], + [1,3,[ ]],[2,3,[2 ]],[3,3,[ ]],[4,3,[4 ]],[5,3,[ ]], + [1,2,[ ]],[2,2,[ ]],[3,2,[ ]],[4,2,[ ]],[5,2,[ ]], + [1,1,[ ]],[2,1,[ ]],[3,1,[ ]],[4,1,[ ]],[5,1,[ ]]]], + + ['X',upperx, +[[1,9,[1 ]],[2,9,[ ]],[3,9,[ ]],[4,9,[ ]],[5,9,[3 ]], + [1,8,[ ]],[2,8,[ ]],[3,8,[ ]],[4,8,[ ]],[5,8,[ ]], + [1,7,[ ]],[2,7,[ ]],[3,7,[ ]],[4,7,[ ]],[5,7,[ ]], + [1,6,[ ]],[2,6,[ ]],[3,6,[ ]],[4,6,[ ]],[5,6,[ ]], + [1,5,[ ]],[2,5,[ ]],[3,5,[ ]],[4,5,[ ]],[5,5,[ ]], + [1,4,[ ]],[2,4,[ ]],[3,4,[ ]],[4,4,[ ]],[5,4,[ ]], + [1,3,[4 ]],[2,3,[ ]],[3,3,[ ]],[4,3,[ ]],[5,3,[2,s ]], + [1,2,[ ]],[2,2,[ ]],[3,2,[ ]],[4,2,[ ]],[5,2,[ ]], + [1,1,[ ]],[2,1,[ ]],[3,1,[ ]],[4,1,[ ]],[5,1,[ ]]]], + + ['Y',uppery, +[[1,9,[1 ]],[2,9,[ ]],[3,9,[ ]],[4,9,[ ]],[5,9,[3 ]], + [1,8,[ ]],[2,8,[ ]],[3,8,[ ]],[4,8,[ ]],[5,8,[ ]], + [1,7,[ ]],[2,7,[ ]],[3,7,[2,s,4]],[4,7,[ ]],[5,7,[ ]], + [1,6,[ ]],[2,6,[ ]],[3,6,[ ]],[4,6,[ ]],[5,6,[ ]], + [1,5,[ ]],[2,5,[ ]],[3,5,[ ]],[4,5,[ ]],[5,5,[ ]], + [1,4,[ ]],[2,4,[ ]],[3,4,[ ]],[4,4,[ ]],[5,4,[ ]], + [1,3,[ ]],[2,3,[ ]],[3,3,[5 ]],[4,3,[ ]],[5,3,[ ]], + [1,2,[ ]],[2,2,[ ]],[3,2,[ ]],[4,2,[ ]],[5,2,[ ]], + [1,1,[ ]],[2,1,[ ]],[3,1,[ ]],[4,1,[ ]],[5,1,[ ]]]], + + ['Z',upperz, +[[1,9,[1 ]],[2,9,[ ]],[3,9,[ ]],[4,9,[ ]],[5,9,[2 ]], + [1,8,[ ]],[2,8,[ ]],[3,8,[ ]],[4,8,[ ]],[5,8,[ ]], + [1,7,[ ]],[2,7,[ ]],[3,7,[ 
]],[4,7,[ ]],[5,7,[ ]], + [1,6,[ ]],[2,6,[ ]],[3,6,[ ]],[4,6,[ ]],[5,6,[ ]], + [1,5,[ ]],[2,5,[ ]],[3,5,[ ]],[4,5,[ ]],[5,5,[ ]], + [1,4,[ ]],[2,4,[ ]],[3,4,[ ]],[4,4,[ ]],[5,4,[ ]], + [1,3,[3 ]],[2,3,[ ]],[3,3,[ ]],[4,3,[ ]],[5,3,[4 ]], + [1,2,[ ]],[2,2,[ ]],[3,2,[ ]],[4,2,[ ]],[5,2,[ ]], + [1,1,[ ]],[2,1,[ ]],[3,1,[ ]],[4,1,[ ]],[5,1,[ ]]]], + + ['a',lowera, +[[1,9,[ ]],[2,9,[ ]],[3,9,[ ]],[4,9,[ ]],[5,9,[ ]], + [1,8,[ ]],[2,8,[ ]],[3,8,[ ]],[4,8,[ ]],[5,8,[ ]], + [1,7,[ ]],[2,7,[ ]],[3,7,[2 ]],[4,7,[ ]],[5,7,[6 ]], + [1,6,[ ]],[2,6,[ ]],[3,6,[ ]],[4,6,[ ]],[5,6,[ ]], + [1,5,[3 ]],[2,5,[ ]],[3,5,[ ]],[4,5,[ ]],[5,5,[1,5,s]], + [1,4,[ ]],[2,4,[ ]],[3,4,[ ]],[4,4,[ ]],[5,4,[ ]], + [1,3,[ ]],[2,3,[ ]],[3,3,[4 ]],[4,3,[ ]],[5,3,[7 ]], + [1,2,[ ]],[2,2,[ ]],[3,2,[ ]],[4,2,[ ]],[5,2,[ ]], + [1,1,[ ]],[2,1,[ ]],[3,1,[ ]],[4,1,[ ]],[5,1,[ ]]]], + + ['b',lowerb, +[[1,9,[1 ]],[2,9,[ ]],[3,9,[ ]],[4,9,[ ]],[5,9,[ ]], + [1,8,[ ]],[2,8,[ ]],[3,8,[ ]],[4,8,[ ]],[5,8,[ ]], + [1,7,[ ]],[2,7,[ ]],[3,7,[4 ]],[4,7,[ ]],[5,7,[ ]], + [1,6,[ ]],[2,6,[ ]],[3,6,[ ]],[4,6,[ ]],[5,6,[ ]], + [1,5,[3,7 ]],[2,5,[ ]],[3,5,[ ]],[4,5,[ ]],[5,5,[5 ]], + [1,4,[ ]],[2,4,[ ]],[3,4,[ ]],[4,4,[ ]],[5,4,[ ]], + [1,3,[2,s ]],[2,3,[ ]],[3,3,[6 ]],[4,3,[ ]],[5,3,[ ]], + [1,2,[ ]],[2,2,[ ]],[3,2,[ ]],[4,2,[ ]],[5,2,[ ]], + [1,1,[ ]],[2,1,[ ]],[3,1,[ ]],[4,1,[ ]],[5,1,[ ]]]], + + ['c',lowerc, +[[1,9,[ ]],[2,9,[ ]],[3,9,[ ]],[4,9,[ ]],[5,9,[ ]], + [1,8,[ ]],[2,8,[ ]],[3,8,[ ]],[4,8,[ ]],[5,8,[ ]], + [1,7,[ ]],[2,7,[ ]],[3,7,[2 ]],[4,7,[ ]],[5,7,[ ]], + [1,6,[ ]],[2,6,[ ]],[3,6,[ ]],[4,6,[1 ]],[5,6,[ ]], + [1,5,[3 ]],[2,5,[ ]],[3,5,[ ]],[4,5,[ ]],[5,5,[ ]], + [1,4,[ ]],[2,4,[ ]],[3,4,[ ]],[4,4,[5 ]],[5,4,[ ]], + [1,3,[ ]],[2,3,[ ]],[3,3,[4 ]],[4,3,[ ]],[5,3,[ ]], + [1,2,[ ]],[2,2,[ ]],[3,2,[ ]],[4,2,[ ]],[5,2,[ ]], + [1,1,[ ]],[2,1,[ ]],[3,1,[ ]],[4,1,[ ]],[5,1,[ ]]]], + + ['d',lowerd, +[[1,9,[ ]],[2,9,[ ]],[3,9,[ ]],[4,9,[ ]],[5,9,[1 ]], + [1,8,[ ]],[2,8,[ ]],[3,8,[ ]],[4,8,[ ]],[5,8,[ ]], + [1,7,[ 
]],[2,7,[ ]],[3,7,[4 ]],[4,7,[ ]],[5,7,[ ]], + [1,6,[ ]],[2,6,[ ]],[3,6,[ ]],[4,6,[ ]],[5,6,[ ]], + [1,5,[5 ]],[2,5,[ ]],[3,5,[ ]],[4,5,[ ]],[5,5,[3,7 ]], + [1,4,[ ]],[2,4,[ ]],[3,4,[ ]],[4,4,[ ]],[5,4,[ ]], + [1,3,[ ]],[2,3,[ ]],[3,3,[6 ]],[4,3,[ ]],[5,3,[2,s ]], + [1,2,[ ]],[2,2,[ ]],[3,2,[ ]],[4,2,[ ]],[5,2,[ ]], + [1,1,[ ]],[2,1,[ ]],[3,1,[ ]],[4,1,[ ]],[5,1,[ ]]]], + + ['e',lowere, +[[1,9,[ ]],[2,9,[ ]],[3,9,[ ]],[4,9,[ ]],[5,9,[ ]], + [1,8,[ ]],[2,8,[ ]],[3,8,[ ]],[4,8,[ ]],[5,8,[ ]], + [1,7,[ ]],[2,7,[ ]],[3,7,[3 ]],[4,7,[ ]],[5,7,[ ]], + [1,6,[ ]],[2,6,[ ]],[3,6,[ ]],[4,6,[ ]],[5,6,[ ]], + [1,5,[1,4 ]],[2,5,[ ]],[3,5,[ ]],[4,5,[ ]],[5,5,[2 ]], + [1,4,[ ]],[2,4,[ ]],[3,4,[ ]],[4,4,[ ]],[5,4,[ ]], + [1,3,[ ]],[2,3,[ ]],[3,3,[5 ]],[4,3,[6 ]],[5,3,[ ]], + [1,2,[ ]],[2,2,[ ]],[3,2,[ ]],[4,2,[ ]],[5,2,[ ]], + [1,1,[ ]],[2,1,[ ]],[3,1,[ ]],[4,1,[ ]],[5,1,[ ]]]], + + ['f',lowerf, +[[1,9,[ ]],[2,9,[2 ]],[3,9,[ ]],[4,9,[ ]],[5,9,[ ]], + [1,8,[3 ]],[2,8,[ ]],[3,8,[1 ]],[4,8,[ ]],[5,8,[ ]], + [1,7,[ ]],[2,7,[ ]],[3,7,[ ]],[4,7,[ ]],[5,7,[ ]], + [1,6,[5 ]],[2,6,[6 ]],[3,6,[ ]],[4,6,[ ]],[5,6,[ ]], + [1,5,[ ]],[2,5,[ ]],[3,5,[ ]],[4,5,[ ]],[5,5,[ ]], + [1,4,[ ]],[2,4,[ ]],[3,4,[ ]],[4,4,[ ]],[5,4,[ ]], + [1,3,[4,s ]],[2,3,[ ]],[3,3,[ ]],[4,3,[ ]],[5,3,[ ]], + [1,2,[ ]],[2,2,[ ]],[3,2,[ ]],[4,2,[ ]],[5,2,[ ]], + [1,1,[ ]],[2,1,[ ]],[3,1,[ ]],[4,1,[ ]],[5,1,[ ]]]], + + ['g',lowerg, +[[1,9,[ ]],[2,9,[ ]],[3,9,[ ]],[4,9,[ ]],[5,9,[ ]], + [1,8,[ ]],[2,8,[ ]],[3,8,[ ]],[4,8,[ ]],[5,8,[ ]], + [1,7,[ ]],[2,7,[ ]],[3,7,[2 ]],[4,7,[ ]],[5,7,[6 ]], + [1,6,[ ]],[2,6,[ ]],[3,6,[ ]],[4,6,[ ]],[5,6,[ ]], + [1,5,[3 ]],[2,5,[ ]],[3,5,[ ]],[4,5,[ ]],[5,5,[1,5,s]], + [1,4,[ ]],[2,4,[ ]],[3,4,[ ]],[4,4,[ ]],[5,4,[ ]], + [1,3,[9 ]],[2,3,[ ]],[3,3,[4 ]],[4,3,[ ]],[5,3,[7 ]], + [1,2,[ ]],[2,2,[ ]],[3,2,[ ]],[4,2,[ ]],[5,2,[ ]], + [1,1,[ ]],[2,1,[ ]],[3,1,[8 ]],[4,1,[ ]],[5,1,[ ]]]], + + ['h',lowerh, +[[1,9,[1 ]],[2,9,[ ]],[3,9,[ ]],[4,9,[ ]],[5,9,[ ]], + [1,8,[ ]],[2,8,[ ]],[3,8,[ ]],[4,8,[ 
]],[5,8,[ ]], + [1,7,[ ]],[2,7,[ ]],[3,7,[4 ]],[4,7,[ ]],[5,7,[ ]], + [1,6,[ ]],[2,6,[ ]],[3,6,[ ]],[4,6,[ ]],[5,6,[ ]], + [1,5,[3 ]],[2,5,[ ]],[3,5,[ ]],[4,5,[ ]],[5,5,[5 ]], + [1,4,[ ]],[2,4,[ ]],[3,4,[ ]],[4,4,[ ]],[5,4,[ ]], + [1,3,[2 ]],[2,3,[ ]],[3,3,[ ]],[4,3,[ ]],[5,3,[6 ]], + [1,2,[ ]],[2,2,[ ]],[3,2,[ ]],[4,2,[ ]],[5,2,[ ]], + [1,1,[ ]],[2,1,[ ]],[3,1,[ ]],[4,1,[ ]],[5,1,[ ]]]], + + ['i',loweri, +[[1,9,[ ]],[2,9,[ ]],[3,9,[1,s ]],[4,9,[ ]],[5,9,[ ]], + [1,8,[ ]],[2,8,[ ]],[3,8,[ ]],[4,8,[ ]],[5,8,[ ]], + [1,7,[ ]],[2,7,[ ]],[3,7,[2 ]],[4,7,[ ]],[5,7,[ ]], + [1,6,[ ]],[2,6,[ ]],[3,6,[ ]],[4,6,[ ]],[5,6,[ ]], + [1,5,[ ]],[2,5,[ ]],[3,5,[ ]],[4,5,[ ]],[5,5,[ ]], + [1,4,[ ]],[2,4,[ ]],[3,4,[ ]],[4,4,[ ]],[5,4,[ ]], + [1,3,[ ]],[2,3,[ ]],[3,3,[3 ]],[4,3,[ ]],[5,3,[ ]], + [1,2,[ ]],[2,2,[ ]],[3,2,[ ]],[4,2,[ ]],[5,2,[ ]], + [1,1,[ ]],[2,1,[ ]],[3,1,[ ]],[4,1,[ ]],[5,1,[ ]]]], + + ['j',lowerj, +[[1,9,[ ]],[2,9,[ ]],[3,9,[ ]],[4,9,[ ]],[5,9,[1,s ]], + [1,8,[ ]],[2,8,[ ]],[3,8,[ ]],[4,8,[ ]],[5,8,[ ]], + [1,7,[ ]],[2,7,[ ]],[3,7,[ ]],[4,7,[ ]],[5,7,[2 ]], + [1,6,[ ]],[2,6,[ ]],[3,6,[ ]],[4,6,[ ]],[5,6,[ ]], + [1,5,[ ]],[2,5,[ ]],[3,5,[ ]],[4,5,[ ]],[5,5,[ ]], + [1,4,[ ]],[2,4,[ ]],[3,4,[ ]],[4,4,[ ]],[5,4,[ ]], + [1,3,[5 ]],[2,3,[ ]],[3,3,[ ]],[4,3,[ ]],[5,3,[3 ]], + [1,2,[ ]],[2,2,[ ]],[3,2,[ ]],[4,2,[ ]],[5,2,[ ]], + [1,1,[ ]],[2,1,[ ]],[3,1,[4 ]],[4,1,[ ]],[5,1,[ ]]]], + + ['k',lowerk, +[[1,9,[1 ]],[2,9,[ ]],[3,9,[ ]],[4,9,[ ]],[5,9,[ ]], + [1,8,[ ]],[2,8,[ ]],[3,8,[ ]],[4,8,[ ]],[5,8,[ ]], + [1,7,[ ]],[2,7,[ ]],[3,7,[ ]],[4,7,[ ]],[5,7,[3 ]], + [1,6,[ ]],[2,6,[ ]],[3,6,[ ]],[4,6,[ ]],[5,6,[ ]], + [1,5,[4 ]],[2,5,[ ]],[3,5,[ ]],[4,5,[ ]],[5,5,[ ]], + [1,4,[ ]],[2,4,[ ]],[3,4,[ ]],[4,4,[ ]],[5,4,[ ]], + [1,3,[2,s ]],[2,3,[ ]],[3,3,[ ]],[4,3,[ ]],[5,3,[5 ]], + [1,2,[ ]],[2,2,[ ]],[3,2,[ ]],[4,2,[ ]],[5,2,[ ]], + [1,1,[ ]],[2,1,[ ]],[3,1,[ ]],[4,1,[ ]],[5,1,[ ]]]], + + ['l',lowerl, +[[1,9,[1 ]],[2,9,[ ]],[3,9,[ ]],[4,9,[ ]],[5,9,[ ]], + [1,8,[ ]],[2,8,[ ]],[3,8,[ 
]],[4,8,[ ]],[5,8,[ ]], + [1,7,[ ]],[2,7,[ ]],[3,7,[ ]],[4,7,[ ]],[5,7,[ ]], + [1,6,[ ]],[2,6,[ ]],[3,6,[ ]],[4,6,[ ]],[5,6,[ ]], + [1,5,[ ]],[2,5,[ ]],[3,5,[ ]],[4,5,[ ]],[5,5,[ ]], + [1,4,[2 ]],[2,4,[ ]],[3,4,[4 ]],[4,4,[ ]],[5,4,[ ]], + [1,3,[ ]],[2,3,[3 ]],[3,3,[ ]],[4,3,[ ]],[5,3,[ ]], + [1,2,[ ]],[2,2,[ ]],[3,2,[ ]],[4,2,[ ]],[5,2,[ ]], + [1,1,[ ]],[2,1,[ ]],[3,1,[ ]],[4,1,[ ]],[5,1,[ ]]]], + + ['m',lowerm, +[[1,9,[ ]],[2,9,[ ]],[3,9,[ ]],[4,9,[ ]],[5,9,[ ]], + [1,8,[ ]],[2,8,[ ]],[3,8,[ ]],[4,8,[ ]],[5,8,[ ]], + [1,7,[1 ]],[2,7,[4 ]],[3,7,[ ]],[4,7,[8 ]],[5,7,[ ]], + [1,6,[3 ]],[2,6,[ ]],[3,6,[5,7 ]],[4,6,[ ]],[5,6,[9 ]], + [1,5,[ ]],[2,5,[ ]],[3,5,[ ]],[4,5,[ ]],[5,5,[ ]], + [1,4,[ ]],[2,4,[ ]],[3,4,[ ]],[4,4,[ ]],[5,4,[ ]], + [1,3,[2,s ]],[2,3,[ ]],[3,3,[6 ]],[4,3,[ ]],[5,3,[10 ]], + [1,2,[ ]],[2,2,[ ]],[3,2,[ ]],[4,2,[ ]],[5,2,[ ]], + [1,1,[ ]],[2,1,[ ]],[3,1,[ ]],[4,1,[ ]],[5,1,[ ]]]], + + ['n',lowern, +[[1,9,[ ]],[2,9,[ ]],[3,9,[ ]],[4,9,[ ]],[5,9,[ ]], + [1,8,[ ]],[2,8,[ ]],[3,8,[ ]],[4,8,[ ]],[5,8,[ ]], + [1,7,[1 ]],[2,7,[ ]],[3,7,[4 ]],[4,7,[ ]],[5,7,[ ]], + [1,6,[ ]],[2,6,[ ]],[3,6,[ ]],[4,6,[ ]],[5,6,[ ]], + [1,5,[3 ]],[2,5,[ ]],[3,5,[ ]],[4,5,[ ]],[5,5,[5 ]], + [1,4,[ ]],[2,4,[ ]],[3,4,[ ]],[4,4,[ ]],[5,4,[ ]], + [1,3,[2 ]],[2,3,[ ]],[3,3,[ ]],[4,3,[ ]],[5,3,[6 ]], + [1,2,[ ]],[2,2,[ ]],[3,2,[ ]],[4,2,[ ]],[5,2,[ ]], + [1,1,[ ]],[2,1,[ ]],[3,1,[ ]],[4,1,[ ]],[5,1,[ ]]]], + + ['o',lowero, +[[1,9,[ ]],[2,9,[ ]],[3,9,[ ]],[4,9,[ ]],[5,9,[ ]], + [1,8,[ ]],[2,8,[ ]],[3,8,[ ]],[4,8,[ ]],[5,8,[ ]], + [1,7,[ ]],[2,7,[ ]],[3,7,[1,5 ]],[4,7,[ ]],[5,7,[ ]], + [1,6,[ ]],[2,6,[ ]],[3,6,[ ]],[4,6,[ ]],[5,6,[ ]], + [1,5,[2 ]],[2,5,[ ]],[3,5,[ ]],[4,5,[ ]],[5,5,[4 ]], + [1,4,[ ]],[2,4,[ ]],[3,4,[ ]],[4,4,[ ]],[5,4,[ ]], + [1,3,[ ]],[2,3,[ ]],[3,3,[3 ]],[4,3,[ ]],[5,3,[ ]], + [1,2,[ ]],[2,2,[ ]],[3,2,[ ]],[4,2,[ ]],[5,2,[ ]], + [1,1,[ ]],[2,1,[ ]],[3,1,[ ]],[4,1,[ ]],[5,1,[ ]]]], + + ['p',lowerp, +[[1,9,[ ]],[2,9,[ ]],[3,9,[ ]],[4,9,[ ]],[5,9,[ ]], + [1,8,[ 
]],[2,8,[ ]],[3,8,[ ]],[4,8,[ ]],[5,8,[ ]], + [1,7,[1 ]],[2,7,[ ]],[3,7,[4 ]],[4,7,[ ]],[5,7,[ ]], + [1,6,[ ]],[2,6,[ ]],[3,6,[ ]],[4,6,[ ]],[5,6,[ ]], + [1,5,[3,7 ]],[2,5,[ ]],[3,5,[ ]],[4,5,[ ]],[5,5,[5 ]], + [1,4,[ ]],[2,4,[ ]],[3,4,[ ]],[4,4,[ ]],[5,4,[ ]], + [1,3,[ ]],[2,3,[ ]],[3,3,[6 ]],[4,3,[ ]],[5,3,[ ]], + [1,2,[ ]],[2,2,[ ]],[3,2,[ ]],[4,2,[ ]],[5,2,[ ]], + [1,1,[2,s ]],[2,1,[ ]],[3,1,[ ]],[4,1,[ ]],[5,1,[ ]]]], + + ['q',lowerq, +[[1,9,[ ]],[2,9,[ ]],[3,9,[ ]],[4,9,[ ]],[5,9,[ ]], + [1,8,[ ]],[2,8,[ ]],[3,8,[ ]],[4,8,[ ]],[5,8,[ ]], + [1,7,[ ]],[2,7,[ ]],[3,7,[4 ]],[4,7,[ ]],[5,7,[1 ]], + [1,6,[ ]],[2,6,[ ]],[3,6,[ ]],[4,6,[ ]],[5,6,[ ]], + [1,5,[5 ]],[2,5,[ ]],[3,5,[ ]],[4,5,[ ]],[5,5,[3,7 ]], + [1,4,[ ]],[2,4,[ ]],[3,4,[ ]],[4,4,[ ]],[5,4,[ ]], + [1,3,[ ]],[2,3,[ ]],[3,3,[6 ]],[4,3,[ ]],[5,3,[ ]], + [1,2,[ ]],[2,2,[ ]],[3,2,[ ]],[4,2,[ ]],[5,2,[ ]], + [1,1,[ ]],[2,1,[ ]],[3,1,[ ]],[4,1,[ ]],[5,1,[2,s ]]]], + + ['r',lowerr, +[[1,9,[ ]],[2,9,[ ]],[3,9,[ ]],[4,9,[ ]],[5,9,[ ]], + [1,8,[ ]],[2,8,[ ]],[3,8,[ ]],[4,8,[ ]],[5,8,[ ]], + [1,7,[1 ]],[2,7,[ ]],[3,7,[4 ]],[4,7,[5 ]],[5,7,[ ]], + [1,6,[ ]],[2,6,[ ]],[3,6,[ ]],[4,6,[ ]],[5,6,[ ]], + [1,5,[3 ]],[2,5,[ ]],[3,5,[ ]],[4,5,[ ]],[5,5,[ ]], + [1,4,[ ]],[2,4,[ ]],[3,4,[ ]],[4,4,[ ]],[5,4,[ ]], + [1,3,[2 ]],[2,3,[ ]],[3,3,[ ]],[4,3,[ ]],[5,3,[ ]], + [1,2,[ ]],[2,2,[ ]],[3,2,[ ]],[4,2,[ ]],[5,2,[ ]], + [1,1,[ ]],[2,1,[ ]],[3,1,[ ]],[4,1,[ ]],[5,1,[ ]]]], + + ['s',lowers, +[[1,9,[ ]],[2,9,[ ]],[3,9,[ ]],[4,9,[ ]],[5,9,[ ]], + [1,8,[ ]],[2,8,[ ]],[3,8,[ ]],[4,8,[ ]],[5,8,[ ]], + [1,7,[ ]],[2,7,[2 ]],[3,7,[ ]],[4,7,[1 ]],[5,7,[ ]], + [1,6,[3 ]],[2,6,[ ]],[3,6,[ ]],[4,6,[ ]],[5,6,[ ]], + [1,5,[ ]],[2,5,[4 ]],[3,5,[ ]],[4,5,[5 ]],[5,5,[ ]], + [1,4,[ ]],[2,4,[ ]],[3,4,[ ]],[4,4,[ ]],[5,4,[6 ]], + [1,3,[ ]],[2,3,[8 ]],[3,3,[ ]],[4,3,[7 ]],[5,3,[ ]], + [1,2,[ ]],[2,2,[ ]],[3,2,[ ]],[4,2,[ ]],[5,2,[ ]], + [1,1,[ ]],[2,1,[ ]],[3,1,[ ]],[4,1,[ ]],[5,1,[ ]]]], + + ['t',lowert, +[[1,9,[ ]],[2,9,[ ]],[3,9,[1 ]],[4,9,[ 
]],[5,9,[ ]], + [1,8,[ ]],[2,8,[ ]],[3,8,[ ]],[4,8,[ ]],[5,8,[ ]], + [1,7,[3 ]],[2,7,[ ]],[3,7,[ ]],[4,7,[ ]],[5,7,[4 ]], + [1,6,[ ]],[2,6,[ ]],[3,6,[ ]],[4,6,[ ]],[5,6,[ ]], + [1,5,[ ]],[2,5,[ ]],[3,5,[ ]],[4,5,[ ]],[5,5,[ ]], + [1,4,[ ]],[2,4,[ ]],[3,4,[ ]],[4,4,[ ]],[5,4,[ ]], + [1,3,[ ]],[2,3,[ ]],[3,3,[2,s ]],[4,3,[ ]],[5,3,[ ]], + [1,2,[ ]],[2,2,[ ]],[3,2,[ ]],[4,2,[ ]],[5,2,[ ]], + [1,1,[ ]],[2,1,[ ]],[3,1,[ ]],[4,1,[ ]],[5,1,[ ]]]], + + ['u',loweru, +[[1,9,[ ]],[2,9,[ ]],[3,9,[ ]],[4,9,[ ]],[5,9,[ ]], + [1,8,[ ]],[2,8,[ ]],[3,8,[ ]],[4,8,[ ]],[5,8,[ ]], + [1,7,[1 ]],[2,7,[ ]],[3,7,[ ]],[4,7,[ ]],[5,7,[5 ]], + [1,6,[ ]],[2,6,[ ]],[3,6,[ ]],[4,6,[ ]],[5,6,[ ]], + [1,5,[2 ]],[2,5,[ ]],[3,5,[ ]],[4,5,[ ]],[5,5,[4 ]], + [1,4,[ ]],[2,4,[ ]],[3,4,[ ]],[4,4,[ ]],[5,4,[ ]], + [1,3,[ ]],[2,3,[ ]],[3,3,[3 ]],[4,3,[ ]],[5,3,[6 ]], + [1,2,[ ]],[2,2,[ ]],[3,2,[ ]],[4,2,[ ]],[5,2,[ ]], + [1,1,[ ]],[2,1,[ ]],[3,1,[ ]],[4,1,[ ]],[5,1,[ ]]]], + + ['v',lowerv, +[[1,9,[ ]],[2,9,[ ]],[3,9,[ ]],[4,9,[ ]],[5,9,[ ]], + [1,8,[ ]],[2,8,[ ]],[3,8,[ ]],[4,8,[ ]],[5,8,[ ]], + [1,7,[1 ]],[2,7,[ ]],[3,7,[ ]],[4,7,[ ]],[5,7,[3 ]], + [1,6,[ ]],[2,6,[ ]],[3,6,[ ]],[4,6,[ ]],[5,6,[ ]], + [1,5,[ ]],[2,5,[ ]],[3,5,[ ]],[4,5,[ ]],[5,5,[ ]], + [1,4,[ ]],[2,4,[ ]],[3,4,[ ]],[4,4,[ ]],[5,4,[ ]], + [1,3,[ ]],[2,3,[ ]],[3,3,[2 ]],[4,3,[ ]],[5,3,[ ]], + [1,2,[ ]],[2,2,[ ]],[3,2,[ ]],[4,2,[ ]],[5,2,[ ]], + [1,1,[ ]],[2,1,[ ]],[3,1,[ ]],[4,1,[ ]],[5,1,[ ]]]], + + ['w',lowerw, +[[1,9,[ ]],[2,9,[ ]],[3,9,[ ]],[4,9,[ ]],[5,9,[ ]], + [1,8,[ ]],[2,8,[ ]],[3,8,[ ]],[4,8,[ ]],[5,8,[ ]], + [1,7,[1 ]],[2,7,[ ]],[3,7,[3 ]],[4,7,[ ]],[5,7,[5 ]], + [1,6,[ ]],[2,6,[ ]],[3,6,[ ]],[4,6,[ ]],[5,6,[ ]], + [1,5,[ ]],[2,5,[ ]],[3,5,[ ]],[4,5,[ ]],[5,5,[ ]], + [1,4,[ ]],[2,4,[ ]],[3,4,[ ]],[4,4,[ ]],[5,4,[ ]], + [1,3,[ ]],[2,3,[2 ]],[3,3,[ ]],[4,3,[4 ]],[5,3,[ ]], + [1,2,[ ]],[2,2,[ ]],[3,2,[ ]],[4,2,[ ]],[5,2,[ ]], + [1,1,[ ]],[2,1,[ ]],[3,1,[ ]],[4,1,[ ]],[5,1,[ ]]]], + + ['x',lowerx, +[[1,9,[ ]],[2,9,[ ]],[3,9,[ 
]],[4,9,[ ]],[5,9,[ ]], + [1,8,[ ]],[2,8,[ ]],[3,8,[ ]],[4,8,[ ]],[5,8,[ ]], + [1,7,[1 ]],[2,7,[ ]],[3,7,[ ]],[4,7,[ ]],[5,7,[3 ]], + [1,6,[ ]],[2,6,[ ]],[3,6,[ ]],[4,6,[ ]],[5,6,[ ]], + [1,5,[ ]],[2,5,[ ]],[3,5,[ ]],[4,5,[ ]],[5,5,[ ]], + [1,4,[ ]],[2,4,[ ]],[3,4,[ ]],[4,4,[ ]],[5,4,[ ]], + [1,3,[4 ]],[2,3,[ ]],[3,3,[ ]],[4,3,[ ]],[5,3,[2,s ]], + [1,2,[ ]],[2,2,[ ]],[3,2,[ ]],[4,2,[ ]],[5,2,[ ]], + [1,1,[ ]],[2,1,[ ]],[3,1,[ ]],[4,1,[ ]],[5,1,[ ]]]], + + ['y',lowery, +[[1,9,[ ]],[2,9,[ ]],[3,9,[ ]],[4,9,[ ]],[5,9,[ ]], + [1,8,[ ]],[2,8,[ ]],[3,8,[ ]],[4,8,[ ]],[5,8,[ ]], + [1,7,[1 ]],[2,7,[ ]],[3,7,[ ]],[4,7,[ ]],[5,7,[5 ]], + [1,6,[ ]],[2,6,[ ]],[3,6,[ ]],[4,6,[ ]],[5,6,[ ]], + [1,5,[2 ]],[2,5,[ ]],[3,5,[ ]],[4,5,[ ]],[5,5,[4 ]], + [1,4,[ ]],[2,4,[ ]],[3,4,[ ]],[4,4,[ ]],[5,4,[ ]], + [1,3,[8 ]],[2,3,[ ]],[3,3,[3 ]],[4,3,[ ]],[5,3,[6 ]], + [1,2,[ ]],[2,2,[ ]],[3,2,[ ]],[4,2,[ ]],[5,2,[ ]], + [1,1,[ ]],[2,1,[ ]],[3,1,[7 ]],[4,1,[ ]],[5,1,[ ]]]], + + ['z',lowerz, +[[1,9,[ ]],[2,9,[ ]],[3,9,[ ]],[4,9,[ ]],[5,9,[ ]], + [1,8,[ ]],[2,8,[ ]],[3,8,[ ]],[4,8,[ ]],[5,8,[ ]], + [1,7,[1 ]],[2,7,[ ]],[3,7,[ ]],[4,7,[ ]],[5,7,[2 ]], + [1,6,[ ]],[2,6,[ ]],[3,6,[ ]],[4,6,[ ]],[5,6,[ ]], + [1,5,[ ]],[2,5,[ ]],[3,5,[ ]],[4,5,[ ]],[5,5,[ ]], + [1,4,[ ]],[2,4,[ ]],[3,4,[ ]],[4,4,[ ]],[5,4,[ ]], + [1,3,[3 ]],[2,3,[ ]],[3,3,[ ]],[4,3,[ ]],[5,3,[4 ]], + [1,2,[ ]],[2,2,[ ]],[3,2,[ ]],[4,2,[ ]],[5,2,[ ]], + [1,1,[ ]],[2,1,[ ]],[3,1,[ ]],[4,1,[ ]],[5,1,[ ]]]], + + ['?',questionmark, +[[1,9,[ ]],[2,9,[ ]],[3,9,[2 ]],[4,9,[ ]],[5,9,[ ]], + [1,8,[ ]],[2,8,[ ]],[3,8,[ ]],[4,8,[ ]],[5,8,[ ]], + [1,7,[1 ]],[2,7,[ ]],[3,7,[4 ]],[4,7,[ ]],[5,7,[3 ]], + [1,6,[ ]],[2,6,[ ]],[3,6,[ ]],[4,6,[ ]],[5,6,[ ]], + [1,5,[ ]],[2,5,[ ]],[3,5,[5,s ]],[4,5,[ ]],[5,5,[ ]], + [1,4,[ ]],[2,4,[ ]],[3,4,[ ]],[4,4,[ ]],[5,4,[ ]], + [1,3,[ ]],[2,3,[ ]],[3,3,[6 ]],[4,3,[ ]],[5,3,[ ]], + [1,2,[ ]],[2,2,[ ]],[3,2,[ ]],[4,2,[ ]],[5,2,[ ]], + [1,1,[ ]],[2,1,[ ]],[3,1,[ ]],[4,1,[ ]],[5,1,[ ]]]], + + ['-',hyphen, +[[1,9,[ 
]],[2,9,[ ]],[3,9,[ ]],[4,9,[ ]],[5,9,[ ]], + [1,8,[ ]],[2,8,[ ]],[3,8,[ ]],[4,8,[ ]],[5,8,[ ]], + [1,7,[ ]],[2,7,[ ]],[3,7,[ ]],[4,7,[ ]],[5,7,[ ]], + [1,6,[ ]],[2,6,[ ]],[3,6,[ ]],[4,6,[ ]],[5,6,[ ]], + [1,5,[1 ]],[2,5,[ ]],[3,5,[ ]],[4,5,[ ]],[5,5,[2 ]], + [1,4,[ ]],[2,4,[ ]],[3,4,[ ]],[4,4,[ ]],[5,4,[ ]], + [1,3,[ ]],[2,3,[ ]],[3,3,[ ]],[4,3,[ ]],[5,3,[ ]], + [1,2,[ ]],[2,2,[ ]],[3,2,[ ]],[4,2,[ ]],[5,2,[ ]], + [1,1,[ ]],[2,1,[ ]],[3,1,[ ]],[4,1,[ ]],[5,1,[ ]]]], + + [' ',space, +[[1,9,[ ]],[2,9,[ ]],[3,9,[ ]],[4,9,[ ]],[5,9,[ ]], + [1,8,[ ]],[2,8,[ ]],[3,8,[ ]],[4,8,[ ]],[5,8,[ ]], + [1,7,[ ]],[2,7,[ ]],[3,7,[ ]],[4,7,[ ]],[5,7,[ ]], + [1,6,[ ]],[2,6,[ ]],[3,6,[ ]],[4,6,[ ]],[5,6,[ ]], + [1,5,[ ]],[2,5,[ ]],[3,5,[ ]],[4,5,[ ]],[5,5,[ ]], + [1,4,[ ]],[2,4,[ ]],[3,4,[ ]],[4,4,[ ]],[5,4,[ ]], + [1,3,[ ]],[2,3,[ ]],[3,3,[ ]],[4,3,[ ]],[5,3,[ ]], + [1,2,[ ]],[2,2,[ ]],[3,2,[ ]],[4,2,[ ]],[5,2,[ ]], + [1,1,[ ]],[2,1,[ ]],[3,1,[ ]],[4,1,[ ]],[5,1,[ ]]]], + + [',',comma, +[[1,9,[ ]],[2,9,[ ]],[3,9,[ ]],[4,9,[ ]],[5,9,[ ]], + [1,8,[ ]],[2,8,[ ]],[3,8,[ ]],[4,8,[ ]],[5,8,[ ]], + [1,7,[ ]],[2,7,[ ]],[3,7,[ ]],[4,7,[ ]],[5,7,[ ]], + [1,6,[ ]],[2,6,[ ]],[3,6,[ ]],[4,6,[ ]],[5,6,[ ]], + [1,5,[ ]],[2,5,[ ]],[3,5,[ ]],[4,5,[ ]],[5,5,[ ]], + [1,4,[ ]],[2,4,[ ]],[3,4,[ ]],[4,4,[ ]],[5,4,[ ]], + [1,3,[ ]],[2,3,[ ]],[3,3,[1 ]],[4,3,[ ]],[5,3,[ ]], + [1,2,[ ]],[2,2,[ ]],[3,2,[2 ]],[4,2,[ ]],[5,2,[ ]], + [1,1,[ ]],[2,1,[3 ]],[3,1,[ ]],[4,1,[ ]],[5,1,[ ]]]], + + ['(',openroundbracket, +[[1,9,[ ]],[2,9,[ ]],[3,9,[1 ]],[4,9,[ ]],[5,9,[ ]], + [1,8,[ ]],[2,8,[ ]],[3,8,[ ]],[4,8,[ ]],[5,8,[ ]], + [1,7,[2 ]],[2,7,[ ]],[3,7,[ ]],[4,7,[ ]],[5,7,[ ]], + [1,6,[ ]],[2,6,[ ]],[3,6,[ ]],[4,6,[ ]],[5,6,[ ]], + [1,5,[3 ]],[2,5,[ ]],[3,5,[ ]],[4,5,[ ]],[5,5,[ ]], + [1,4,[ ]],[2,4,[ ]],[3,4,[ ]],[4,4,[ ]],[5,4,[ ]], + [1,3,[ ]],[2,3,[ ]],[3,3,[4 ]],[4,3,[ ]],[5,3,[ ]], + [1,2,[ ]],[2,2,[ ]],[3,2,[ ]],[4,2,[ ]],[5,2,[ ]], + [1,1,[ ]],[2,1,[ ]],[3,1,[ ]],[4,1,[ ]],[5,1,[ ]]]], + + 
[')',closedroundbracket, +[[1,9,[ ]],[2,9,[ ]],[3,9,[1 ]],[4,9,[ ]],[5,9,[ ]], + [1,8,[ ]],[2,8,[ ]],[3,8,[ ]],[4,8,[ ]],[5,8,[ ]], + [1,7,[ ]],[2,7,[ ]],[3,7,[ ]],[4,7,[ ]],[5,7,[2 ]], + [1,6,[ ]],[2,6,[ ]],[3,6,[ ]],[4,6,[ ]],[5,6,[ ]], + [1,5,[ ]],[2,5,[ ]],[3,5,[ ]],[4,5,[ ]],[5,5,[3 ]], + [1,4,[ ]],[2,4,[ ]],[3,4,[ ]],[4,4,[ ]],[5,4,[ ]], + [1,3,[ ]],[2,3,[ ]],[3,3,[4 ]],[4,3,[ ]],[5,3,[ ]], + [1,2,[ ]],[2,2,[ ]],[3,2,[ ]],[4,2,[ ]],[5,2,[ ]], + [1,1,[ ]],[2,1,[ ]],[3,1,[ ]],[4,1,[ ]],[5,1,[ ]]]], + + ['|',pipe, +[[1,9,[ ]],[2,9,[ ]],[3,9,[1 ]],[4,9,[ ]],[5,9,[ ]], + [1,8,[ ]],[2,8,[ ]],[3,8,[ ]],[4,8,[ ]],[5,8,[ ]], + [1,7,[ ]],[2,7,[ ]],[3,7,[ ]],[4,7,[ ]],[5,7,[ ]], + [1,6,[ ]],[2,6,[ ]],[3,6,[ ]],[4,6,[ ]],[5,6,[ ]], + [1,5,[ ]],[2,5,[ ]],[3,5,[ ]],[4,5,[ ]],[5,5,[ ]], + [1,4,[ ]],[2,4,[ ]],[3,4,[ ]],[4,4,[ ]],[5,4,[ ]], + [1,3,[ ]],[2,3,[ ]],[3,3,[ ]],[4,3,[ ]],[5,3,[ ]], + [1,2,[ ]],[2,2,[ ]],[3,2,[ ]],[4,2,[ ]],[5,2,[ ]], + [1,1,[ ]],[2,1,[ ]],[3,1,[2 ]],[4,1,[ ]],[5,1,[ ]]]], + + ['.',fullstop, +[[1,9,[ ]],[2,9,[ ]],[3,9,[ ]],[4,9,[ ]],[5,9,[ ]], + [1,8,[ ]],[2,8,[ ]],[3,8,[ ]],[4,8,[ ]],[5,8,[ ]], + [1,7,[ ]],[2,7,[ ]],[3,7,[ ]],[4,7,[ ]],[5,7,[ ]], + [1,6,[ ]],[2,6,[ ]],[3,6,[ ]],[4,6,[ ]],[5,6,[ ]], + [1,5,[ ]],[2,5,[ ]],[3,5,[ ]],[4,5,[ ]],[5,5,[ ]], + [1,4,[ ]],[2,4,[ ]],[3,4,[ ]],[4,4,[ ]],[5,4,[ ]], + [1,3,[ ]],[2,3,[ ]],[3,3,[1 ]],[4,3,[ ]],[5,3,[ ]], + [1,2,[ ]],[2,2,[ ]],[3,2,[ ]],[4,2,[ ]],[5,2,[ ]], + [1,1,[ ]],[2,1,[ ]],[3,1,[ ]],[4,1,[ ]],[5,1,[ ]]]], + + [':',colon, +[[1,9,[ ]],[2,9,[ ]],[3,9,[ ]],[4,9,[ ]],[5,9,[ ]], + [1,8,[ ]],[2,8,[ ]],[3,8,[ ]],[4,8,[ ]],[5,8,[ ]], + [1,7,[ ]],[2,7,[ ]],[3,7,[ ]],[4,7,[ ]],[5,7,[ ]], + [1,6,[ ]],[2,6,[ ]],[3,6,[1,s ]],[4,6,[ ]],[5,6,[ ]], + [1,5,[ ]],[2,5,[ ]],[3,5,[ ]],[4,5,[ ]],[5,5,[ ]], + [1,4,[ ]],[2,4,[ ]],[3,4,[2 ]],[4,4,[ ]],[5,4,[ ]], + [1,3,[ ]],[2,3,[ ]],[3,3,[ ]],[4,3,[ ]],[5,3,[ ]], + [1,2,[ ]],[2,2,[ ]],[3,2,[ ]],[4,2,[ ]],[5,2,[ ]], + [1,1,[ ]],[2,1,[ ]],[3,1,[ ]],[4,1,[ ]],[5,1,[ 
]]]], + + ['_',underscore, +[[1,9,[ ]],[2,9,[ ]],[3,9,[ ]],[4,9,[ ]],[5,9,[ ]], + [1,8,[ ]],[2,8,[ ]],[3,8,[ ]],[4,8,[ ]],[5,8,[ ]], + [1,7,[ ]],[2,7,[ ]],[3,7,[ ]],[4,7,[ ]],[5,7,[ ]], + [1,6,[ ]],[2,6,[ ]],[3,6,[ ]],[4,6,[ ]],[5,6,[ ]], + [1,5,[ ]],[2,5,[ ]],[3,5,[ ]],[4,5,[ ]],[5,5,[ ]], + [1,4,[ ]],[2,4,[ ]],[3,4,[ ]],[4,4,[ ]],[5,4,[ ]], + [1,3,[1 ]],[2,3,[ ]],[3,3,[ ]],[4,3,[ ]],[5,3,[2 ]], + [1,2,[ ]],[2,2,[ ]],[3,2,[ ]],[4,2,[ ]],[5,2,[ ]], + [1,1,[ ]],[2,1,[ ]],[3,1,[ ]],[4,1,[ ]],[5,1,[ ]]]], + + ['\\',backslash, +[[1,9,[1 ]],[2,9,[ ]],[3,9,[ ]],[4,9,[ ]],[5,9,[ ]], + [1,8,[ ]],[2,8,[ ]],[3,8,[ ]],[4,8,[ ]],[5,8,[ ]], + [1,7,[ ]],[2,7,[ ]],[3,7,[ ]],[4,7,[ ]],[5,7,[ ]], + [1,6,[ ]],[2,6,[ ]],[3,6,[ ]],[4,6,[ ]],[5,6,[ ]], + [1,5,[ ]],[2,5,[ ]],[3,5,[ ]],[4,5,[ ]],[5,5,[ ]], + [1,4,[ ]],[2,4,[ ]],[3,4,[ ]],[4,4,[ ]],[5,4,[ ]], + [1,3,[ ]],[2,3,[ ]],[3,3,[ ]],[4,3,[ ]],[5,3,[2 ]], + [1,2,[ ]],[2,2,[ ]],[3,2,[ ]],[4,2,[ ]],[5,2,[ ]], + [1,1,[ ]],[2,1,[ ]],[3,1,[ ]],[4,1,[ ]],[5,1,[ ]]]], + + ['[',opensquarebracket, +[[1,9,[2 ]],[2,9,[ ]],[3,9,[1 ]],[4,9,[ ]],[5,9,[ ]], + [1,8,[ ]],[2,8,[ ]],[3,8,[ ]],[4,8,[ ]],[5,8,[ ]], + [1,7,[ ]],[2,7,[ ]],[3,7,[ ]],[4,7,[ ]],[5,7,[ ]], + [1,6,[ ]],[2,6,[ ]],[3,6,[ ]],[4,6,[ ]],[5,6,[ ]], + [1,5,[ ]],[2,5,[ ]],[3,5,[ ]],[4,5,[ ]],[5,5,[ ]], + [1,4,[ ]],[2,4,[ ]],[3,4,[ ]],[4,4,[ ]],[5,4,[ ]], + [1,3,[3 ]],[2,3,[ ]],[3,3,[4 ]],[4,3,[ ]],[5,3,[ ]], + [1,2,[ ]],[2,2,[ ]],[3,2,[ ]],[4,2,[ ]],[5,2,[ ]], + [1,1,[ ]],[2,1,[ ]],[3,1,[ ]],[4,1,[ ]],[5,1,[ ]]]], + + [']',closedsquarebracket, +[[1,9,[ ]],[2,9,[ ]],[3,9,[1 ]],[4,9,[ ]],[5,9,[2 ]], + [1,8,[ ]],[2,8,[ ]],[3,8,[ ]],[4,8,[ ]],[5,8,[ ]], + [1,7,[ ]],[2,7,[ ]],[3,7,[ ]],[4,7,[ ]],[5,7,[ ]], + [1,6,[ ]],[2,6,[ ]],[3,6,[ ]],[4,6,[ ]],[5,6,[ ]], + [1,5,[ ]],[2,5,[ ]],[3,5,[ ]],[4,5,[ ]],[5,5,[ ]], + [1,4,[ ]],[2,4,[ ]],[3,4,[ ]],[4,4,[ ]],[5,4,[ ]], + [1,3,[ ]],[2,3,[ ]],[3,3,[4 ]],[4,3,[ ]],[5,3,[3 ]], + [1,2,[ ]],[2,2,[ ]],[3,2,[ ]],[4,2,[ ]],[5,2,[ ]], + [1,1,[ ]],[2,1,[ 
]],[3,1,[ ]],[4,1,[ ]],[5,1,[ ]]]], + + ['<',lessthan, +[[1,9,[ ]],[2,9,[ ]],[3,9,[ ]],[4,9,[ ]],[5,9,[ ]], + [1,8,[ ]],[2,8,[ ]],[3,8,[ ]],[4,8,[ ]],[5,8,[ ]], + [1,7,[ ]],[2,7,[ ]],[3,7,[1 ]],[4,7,[ ]],[5,7,[ ]], + [1,6,[ ]],[2,6,[ ]],[3,6,[ ]],[4,6,[ ]],[5,6,[ ]], + [1,5,[2 ]],[2,5,[ ]],[3,5,[ ]],[4,5,[ ]],[5,5,[ ]], + [1,4,[ ]],[2,4,[ ]],[3,4,[ ]],[4,4,[ ]],[5,4,[ ]], + [1,3,[ ]],[2,3,[ ]],[3,3,[3 ]],[4,3,[ ]],[5,3,[ ]], + [1,2,[ ]],[2,2,[ ]],[3,2,[ ]],[4,2,[ ]],[5,2,[ ]], + [1,1,[ ]],[2,1,[ ]],[3,1,[ ]],[4,1,[ ]],[5,1,[ ]]]], + + ['>',greaterthan, +[[1,9,[ ]],[2,9,[ ]],[3,9,[ ]],[4,9,[ ]],[5,9,[ ]], + [1,8,[ ]],[2,8,[ ]],[3,8,[ ]],[4,8,[ ]],[5,8,[ ]], + [1,7,[ ]],[2,7,[ ]],[3,7,[1 ]],[4,7,[ ]],[5,7,[ ]], + [1,6,[ ]],[2,6,[ ]],[3,6,[ ]],[4,6,[ ]],[5,6,[ ]], + [1,5,[ ]],[2,5,[ ]],[3,5,[ ]],[4,5,[ ]],[5,5,[2 ]], + [1,4,[ ]],[2,4,[ ]],[3,4,[ ]],[4,4,[ ]],[5,4,[ ]], + [1,3,[ ]],[2,3,[ ]],[3,3,[3 ]],[4,3,[ ]],[5,3,[ ]], + [1,2,[ ]],[2,2,[ ]],[3,2,[ ]],[4,2,[ ]],[5,2,[ ]], + [1,1,[ ]],[2,1,[ ]],[3,1,[ ]],[4,1,[ ]],[5,1,[ ]]]], + + ['0',zero, +[[1,9,[ ]],[2,9,[ ]],[3,9,[1,7,s]],[4,9,[ ]],[5,9,[ ]], + [1,8,[ ]],[2,8,[ ]],[3,8,[ ]],[4,8,[8 ]],[5,8,[ ]], + [1,7,[2 ]],[2,7,[ ]],[3,7,[ ]],[4,7,[ ]],[5,7,[6 ]], + [1,6,[ ]],[2,6,[ ]],[3,6,[ ]],[4,6,[ ]],[5,6,[ ]], + [1,5,[3 ]],[2,5,[ ]],[3,5,[ ]],[4,5,[ ]],[5,5,[5 ]], + [1,4,[ ]],[2,4,[9 ]],[3,4,[ ]],[4,4,[ ]],[5,4,[ ]], + [1,3,[ ]],[2,3,[ ]],[3,3,[4 ]],[4,3,[ ]],[5,3,[ ]], + [1,2,[ ]],[2,2,[ ]],[3,2,[ ]],[4,2,[ ]],[5,2,[ ]], + [1,1,[ ]],[2,1,[ ]],[3,1,[ ]],[4,1,[ ]],[5,1,[ ]]]], + + ['1',one, +[[1,9,[ ]],[2,9,[ ]],[3,9,[1 ]],[4,9,[ ]],[5,9,[ ]], + [1,8,[ ]],[2,8,[ ]],[3,8,[ ]],[4,8,[ ]],[5,8,[ ]], + [1,7,[ ]],[2,7,[ ]],[3,7,[ ]],[4,7,[ ]],[5,7,[ ]], + [1,6,[ ]],[2,6,[ ]],[3,6,[ ]],[4,6,[ ]],[5,6,[ ]], + [1,5,[ ]],[2,5,[ ]],[3,5,[ ]],[4,5,[ ]],[5,5,[ ]], + [1,4,[ ]],[2,4,[ ]],[3,4,[ ]],[4,4,[ ]],[5,4,[ ]], + [1,3,[ ]],[2,3,[ ]],[3,3,[2 ]],[4,3,[ ]],[5,3,[ ]], + [1,2,[ ]],[2,2,[ ]],[3,2,[ ]],[4,2,[ ]],[5,2,[ ]], + [1,1,[ 
]],[2,1,[ ]],[3,1,[ ]],[4,1,[ ]],[5,1,[ ]]]], + + ['2',two, +[[1,9,[ ]],[2,9,[ ]],[3,9,[2 ]],[4,9,[ ]],[5,9,[ ]], + [1,8,[ ]],[2,8,[ ]],[3,8,[ ]],[4,8,[ ]],[5,8,[ ]], + [1,7,[1 ]],[2,7,[ ]],[3,7,[ ]],[4,7,[ ]],[5,7,[3 ]], + [1,6,[ ]],[2,6,[ ]],[3,6,[ ]],[4,6,[ ]],[5,6,[ ]], + [1,5,[ ]],[2,5,[ ]],[3,5,[ ]],[4,5,[ ]],[5,5,[ ]], + [1,4,[ ]],[2,4,[ ]],[3,4,[ ]],[4,4,[ ]],[5,4,[ ]], + [1,3,[4 ]],[2,3,[ ]],[3,3,[ ]],[4,3,[ ]],[5,3,[5 ]], + [1,2,[ ]],[2,2,[ ]],[3,2,[ ]],[4,2,[ ]],[5,2,[ ]], + [1,1,[ ]],[2,1,[ ]],[3,1,[ ]],[4,1,[ ]],[5,1,[ ]]]], + + ['3',three, +[[1,9,[ ]],[2,9,[2 ]],[3,9,[ ]],[4,9,[3 ]],[5,9,[ ]], + [1,8,[1 ]],[2,8,[ ]],[3,8,[ ]],[4,8,[ ]],[5,8,[4 ]], + [1,7,[ ]],[2,7,[ ]],[3,7,[ ]],[4,7,[ ]],[5,7,[5 ]], + [1,6,[ ]],[2,6,[ ]],[3,6,[7 ]],[4,6,[6,8 ]],[5,6,[ ]], + [1,5,[ ]],[2,5,[ ]],[3,5,[ ]],[4,5,[ ]],[5,5,[9 ]], + [1,4,[13 ]],[2,4,[ ]],[3,4,[ ]],[4,4,[ ]],[5,4,[10 ]], + [1,3,[ ]],[2,3,[12 ]],[3,3,[ ]],[4,3,[11 ]],[5,3,[ ]], + [1,2,[ ]],[2,2,[ ]],[3,2,[ ]],[4,2,[ ]],[5,2,[ ]], + [1,1,[ ]],[2,1,[ ]],[3,1,[ ]],[4,1,[ ]],[5,1,[ ]]]], + + ['4',four, +[[1,9,[ ]],[2,9,[ ]],[3,9,[ ]],[4,9,[1,3 ]],[5,9,[ ]], + [1,8,[ ]],[2,8,[ ]],[3,8,[ ]],[4,8,[ ]],[5,8,[ ]], + [1,7,[ ]],[2,7,[ ]],[3,7,[ ]],[4,7,[ ]],[5,7,[ ]], + [1,6,[ ]],[2,6,[ ]],[3,6,[ ]],[4,6,[ ]],[5,6,[ ]], + [1,5,[4 ]],[2,5,[ ]],[3,5,[ ]],[4,5,[ ]],[5,5,[5 ]], + [1,4,[ ]],[2,4,[ ]],[3,4,[ ]],[4,4,[ ]],[5,4,[ ]], + [1,3,[ ]],[2,3,[ ]],[3,3,[ ]],[4,3,[2,s ]],[5,3,[ ]], + [1,2,[ ]],[2,2,[ ]],[3,2,[ ]],[4,2,[ ]],[5,2,[ ]], + [1,1,[ ]],[2,1,[ ]],[3,1,[ ]],[4,1,[ ]],[5,1,[ ]]]], + + ['5',five, +[[1,9,[1,3 ]],[2,9,[ ]],[3,9,[ ]],[4,9,[ ]],[5,9,[2,s ]], + [1,8,[ ]],[2,8,[ ]],[3,8,[ ]],[4,8,[ ]],[5,8,[ ]], + [1,7,[4 ]],[2,7,[ ]],[3,7,[5 ]],[4,7,[ ]],[5,7,[ ]], + [1,6,[ ]],[2,6,[ ]],[3,6,[ ]],[4,6,[ ]],[5,6,[ ]], + [1,5,[8 ]],[2,5,[ ]],[3,5,[ ]],[4,5,[ ]],[5,5,[6 ]], + [1,4,[ ]],[2,4,[ ]],[3,4,[ ]],[4,4,[ ]],[5,4,[ ]], + [1,3,[ ]],[2,3,[ ]],[3,3,[7 ]],[4,3,[ ]],[5,3,[ ]], + [1,2,[ ]],[2,2,[ ]],[3,2,[ ]],[4,2,[ 
]],[5,2,[ ]], + [1,1,[ ]],[2,1,[ ]],[3,1,[ ]],[4,1,[ ]],[5,1,[ ]]]], + + ['6',six, +[[1,9,[ ]],[2,9,[ ]],[3,9,[2 ]],[4,9,[ ]],[5,9,[ ]], + [1,8,[ ]],[2,8,[ ]],[3,8,[ ]],[4,8,[ ]],[5,8,[ ]], + [1,7,[3 ]],[2,7,[ ]],[3,7,[7 ]],[4,7,[ ]],[5,7,[1 ]], + [1,6,[ ]],[2,6,[ ]],[3,6,[ ]],[4,6,[ ]],[5,6,[ ]], + [1,5,[4,8 ]],[2,5,[ ]],[3,5,[ ]],[4,5,[ ]],[5,5,[6 ]], + [1,4,[ ]],[2,4,[ ]],[3,4,[ ]],[4,4,[ ]],[5,4,[ ]], + [1,3,[ ]],[2,3,[ ]],[3,3,[5 ]],[4,3,[ ]],[5,3,[ ]], + [1,2,[ ]],[2,2,[ ]],[3,2,[ ]],[4,2,[ ]],[5,2,[ ]], + [1,1,[ ]],[2,1,[ ]],[3,1,[ ]],[4,1,[ ]],[5,1,[ ]]]], + + ['7',seven, +[[1,9,[1 ]],[2,9,[ ]],[3,9,[ ]],[4,9,[ ]],[5,9,[2 ]], + [1,8,[ ]],[2,8,[ ]],[3,8,[ ]],[4,8,[ ]],[5,8,[ ]], + [1,7,[ ]],[2,7,[ ]],[3,7,[ ]],[4,7,[ ]],[5,7,[ ]], + [1,6,[ ]],[2,6,[ ]],[3,6,[ ]],[4,6,[ ]],[5,6,[ ]], + [1,5,[ ]],[2,5,[ ]],[3,5,[ ]],[4,5,[ ]],[5,5,[ ]], + [1,4,[3 ]],[2,4,[ ]],[3,4,[ ]],[4,4,[ ]],[5,4,[ ]], + [1,3,[ ]],[2,3,[ ]],[3,3,[ ]],[4,3,[ ]],[5,3,[ ]], + [1,2,[ ]],[2,2,[ ]],[3,2,[ ]],[4,2,[ ]],[5,2,[ ]], + [1,1,[ ]],[2,1,[ ]],[3,1,[ ]],[4,1,[ ]],[5,1,[ ]]]], + + ['8',eight, +[[1,9,[ ]],[2,9,[ ]],[3,9,[1,7 ]],[4,9,[ ]],[5,9,[ ]], + [1,8,[ ]],[2,8,[2 ]],[3,8,[ ]],[4,8,[6 ]],[5,8,[ ]], + [1,7,[ ]],[2,7,[ ]],[3,7,[ ]],[4,7,[ ]],[5,7,[ ]], + [1,6,[ ]],[2,6,[ ]],[3,6,[ ]],[4,6,[ ]],[5,6,[ ]], + [1,5,[5 ]],[2,5,[ ]],[3,5,[ ]],[4,5,[ ]],[5,5,[3 ]], + [1,4,[ ]],[2,4,[ ]],[3,4,[ ]],[4,4,[ ]],[5,4,[ ]], + [1,3,[ ]],[2,3,[ ]],[3,3,[4 ]],[4,3,[ ]],[5,3,[ ]], + [1,2,[ ]],[2,2,[ ]],[3,2,[ ]],[4,2,[ ]],[5,2,[ ]], + [1,1,[ ]],[2,1,[ ]],[3,1,[ ]],[4,1,[ ]],[5,1,[ ]]]], + + ['9',nine, +[[1,9,[ ]],[2,9,[ ]],[3,9,[2 ]],[4,9,[ ]],[5,9,[ ]], + [1,8,[ ]],[2,8,[ ]],[3,8,[ ]],[4,8,[ ]],[5,8,[ ]], + [1,7,[3 ]],[2,7,[ ]],[3,7,[ ]],[4,7,[ ]],[5,7,[1,5 ]], + [1,6,[ ]],[2,6,[ ]],[3,6,[ ]],[4,6,[ ]],[5,6,[ ]], + [1,5,[ ]],[2,5,[ ]],[3,5,[4 ]],[4,5,[ ]],[5,5,[ ]], + [1,4,[ ]],[2,4,[ ]],[3,4,[ ]],[4,4,[ ]],[5,4,[ ]], + [1,3,[ ]],[2,3,[ ]],[3,3,[ ]],[4,3,[ ]],[5,3,[6 ]], + [1,2,[ ]],[2,2,[ ]],[3,2,[ 
]],[4,2,[ ]],[5,2,[ ]], + [1,1,[ ]],[2,1,[ ]],[3,1,[ ]],[4,1,[ ]],[5,1,[ ]]]]]). diff --git a/Music-Composer/.DS_Store b/Music-Composer/.DS_Store new file mode 100644 index 0000000000000000000000000000000000000000..cf3e23ca86321814572587937385f0ba66504608 Binary files /dev/null and b/Music-Composer/.DS_Store differ diff --git a/Music-Composer/1451-4.pl b/Music-Composer/1451-4.pl new file mode 100644 index 0000000000000000000000000000000000000000..c8600d10b394aed77b9eafa639947839d3ccbf4c --- /dev/null +++ b/Music-Composer/1451-4.pl @@ -0,0 +1,39 @@ +/** +Prolog Music Composer V.1 +1. Chorus, Solo (based on I-IV-V-I) **/ +versechorussoloprogression1451(Name1, Name2,Progression2) :- +note(Note1, Name1), +note(Note2, Name2), +versechorussoloprogression11451(Note1, Note2, 0, _, [Note1], Progression1), notestonames(Progression1,Progression2),!.%%, writeln(Progression2),!. +versechorussoloprogression11451(_, _, Counter, b, Progression, Progression) :- +Counter = 4, !. +versechorussoloprogression11451(Note, Note, _Counter, a, Progression, Progression) :- !. +versechorussoloprogression11451(Note1, Note2, Counter1, Flag1, Progression1, Progression2) :- +Counter2 is Counter1 + 1, +step14511(Note1, Note3), +append(Progression1, [Note3], Progression3), +versechorussoloprogression21451(Note3, Note2, Note1, Counter2, Flag2, Progression3, Progression4), +step14512(Note1, Note4), +append(Progression1, [Note4], Progression5), +versechorussoloprogression21451(Note4, Note2, Note1, Counter2, Flag3, Progression5, Progression6), +flag1([[Flag2, Progression4], [Flag3, Progression6]], Flag1, Progression2). +versechorussoloprogression21451(_, _, _, Counter, b, Progression, Progression) :- +Counter = 4, !. +versechorussoloprogression21451(Note, _, Note, _, _, _, _) :- !. +versechorussoloprogression21451(Note, Note, _, _Counter, a, Progression, Progression) :- !. 
+versechorussoloprogression21451(Note1, Note2, Note3, Counter1, Flag1, Progression1, Progression2) :- +Counter2 is Counter1 + 1, +step14511(Note1, Note4), +append(Progression1, [Note4], Progression3), +versechorussoloprogression21451(Note4, Note2, Note3, Counter2, Flag2, Progression3, Progression4), +step14512(Note1, Note5), +append(Progression1, [Note5], Progression5), +versechorussoloprogression21451(Note5, Note2, Note3, Counter2, Flag3, Progression5, Progression6), +flag1([[Flag2, Progression4], [Flag3, Progression6]], Flag1, Progression2). +step14511(Note1, Note2) :- +Value is Note1+2, Note2 is Value mod 12. %% IV-V +step14512(Note1, Note2) :- +Value is Note1+5, Note2 is Value mod 12. %% I-IV, V-I + +%% 1 4 5 1 +%% 5 2 5 diff --git a/Music-Composer/1451.pl b/Music-Composer/1451.pl new file mode 100644 index 0000000000000000000000000000000000000000..2fdc4b7369595ca25ccfcf8ecd693a0654daea5f --- /dev/null +++ b/Music-Composer/1451.pl @@ -0,0 +1,39 @@ +/** +Prolog Music Composer V.1 +1. Chorus, Solo (based on I-IV-V-I) **/ +versechorussoloprogression1451(Name1, Name2,Progression2) :- +note(Note1, Name1), +note(Note2, Name2), +versechorussoloprogression11451(Note1, Note2, 0, _, [Note1], Progression1), notestonames(Progression1,Progression2),!.%%, writeln(Progression2),!. +versechorussoloprogression11451(_, _, Counter, b, Progression, Progression) :- +Counter = 3, !. +versechorussoloprogression11451(Note, Note, _Counter, a, Progression, Progression) :- !. 
+versechorussoloprogression11451(Note1, Note2, Counter1, Flag1, Progression1, Progression2) :- +Counter2 is Counter1 + 1, +step14511(Note1, Note3), +append(Progression1, [Note3], Progression3), +versechorussoloprogression21451(Note3, Note2, Note1, Counter2, Flag2, Progression3, Progression4), +step14512(Note1, Note4), +append(Progression1, [Note4], Progression5), +versechorussoloprogression21451(Note4, Note2, Note1, Counter2, Flag3, Progression5, Progression6), +flag1([[Flag2, Progression4], [Flag3, Progression6]], Flag1, Progression2). +versechorussoloprogression21451(_, _, _, Counter, b, Progression, Progression) :- +Counter = 3, !. +versechorussoloprogression21451(Note, _, Note, _, _, _, _) :- !. +versechorussoloprogression21451(Note, Note, _, _Counter, a, Progression, Progression) :- !. +versechorussoloprogression21451(Note1, Note2, Note3, Counter1, Flag1, Progression1, Progression2) :- +Counter2 is Counter1 + 1, +step14511(Note1, Note4), +append(Progression1, [Note4], Progression3), +versechorussoloprogression21451(Note4, Note2, Note3, Counter2, Flag2, Progression3, Progression4), +step14512(Note1, Note5), +append(Progression1, [Note5], Progression5), +versechorussoloprogression21451(Note5, Note2, Note3, Counter2, Flag3, Progression5, Progression6), +flag1([[Flag2, Progression4], [Flag3, Progression6]], Flag1, Progression2). +step14511(Note1, Note2) :- +Value is Note1+2, Note2 is Value mod 12. %% IV-V +step14512(Note1, Note2) :- +Value is Note1+5, Note2 is Value mod 12. %% I-IV, V-I + +%% 1 4 5 1 +%% 5 2 5 diff --git a/Music-Composer/1564-4.pl b/Music-Composer/1564-4.pl new file mode 100644 index 0000000000000000000000000000000000000000..70a08b775f4619e9d3166006593bfd20e66e6a91 --- /dev/null +++ b/Music-Composer/1564-4.pl @@ -0,0 +1,58 @@ +/** +Prolog Music Composer V.3 +Verse, Chorus and Solo (based on I-V-vi-IV): +Listen to a sample. (https://soundcloud.com/lucian-green/prolog-music-composer-v3-based-on-i-v-vi-iv). 
+Counter=4 +?- versechorussoloprogression('C','D'). +[[C,D],[C,G,D]] +Yes +?- versechorussoloprogression('C','C#'). +[] +Yes +versechorussoloprogression('C','E'). +[[C,D,E],[C,D,A,E],[C,G,A,E],[C,G,D,E],[C,G#,E]] +Yes +?- versechorussoloprogression('C','D#'). +[[C,G,D#],[C,G#,D#]] +**/ +versechorussoloprogression1564(Name1, Name2,Progression2) :- +note(Note1, Name1), +note(Note2, Name2), +versechorussoloprogression11564(Note1, Note2, 0, _, [Note1], Progression1), notestonames(Progression1,Progression2),!. +versechorussoloprogression11564(_, _, Counter, b, Progression, Progression) :- +Counter = 4, !. +versechorussoloprogression11564(Note, Note, _Counter, a, Progression, Progression) :- !. +versechorussoloprogression11564(Note1, Note2, Counter1, Flag1, Progression1, Progression2) :- +Counter2 is Counter1 + 1, +step15641(Note1, Note3), +append(Progression1, [Note3], Progression3), +versechorussoloprogression21564(Note3, Note2, Note1, Counter2, Flag2, Progression3, Progression4), +step15642(Note1, Note4), +append(Progression1, [Note4], Progression5), +versechorussoloprogression21564(Note4, Note2, Note1, Counter2, Flag3, Progression5, Progression6), +step15643(Note1, Note5), +append(Progression1, [Note5], Progression7), +versechorussoloprogression21564(Note5, Note2, Note1, Counter2, Flag4, Progression7, Progression8), +flag1([[Flag2, Progression4], [Flag3, Progression6], [Flag4, Progression8]], Flag1, Progression2). +versechorussoloprogression21564(_, _, _, Counter, b, Progression, Progression) :- +Counter = 4, !. +versechorussoloprogression21564(Note, _, Note, _, _, _, _) :- !. +versechorussoloprogression21564(Note, Note, _, _Counter, a, Progression, Progression) :- !. 
+versechorussoloprogression21564(Note1, Note2, Note3, Counter1, Flag1, Progression1, Progression2) :- +Counter2 is Counter1 + 1, +step15641(Note1, Note4), +append(Progression1, [Note4], Progression3), +versechorussoloprogression21564(Note4, Note2, Note3, Counter2, Flag2, Progression3, Progression4), +step15642(Note1, Note5), +append(Progression1, [Note5], Progression5), +versechorussoloprogression21564(Note5, Note2, Note3, Counter2, Flag3, Progression5, Progression6), +step15643(Note1, Note6), +append(Progression1, [Note6], Progression7), +versechorussoloprogression21564(Note6, Note2, Note3, Counter2, Flag4, Progression7, Progression8), +flag1([[Flag2, Progression4], [Flag3, Progression6], [Flag4, Progression8]], Flag1, Progression2). +step15641(Note1, Note2) :- +Value is Note1+2, Note2 is Value mod 12. +step15642(Note1, Note2) :- +Value is Note1+7, Note2 is Value mod 12. +step15643(Note1, Note2) :- +Value is Note1+8, Note2 is Value mod 12. diff --git a/Music-Composer/1564.pl b/Music-Composer/1564.pl new file mode 100644 index 0000000000000000000000000000000000000000..88db2c153087c5a6760abb02fee2a797f55677ea --- /dev/null +++ b/Music-Composer/1564.pl @@ -0,0 +1,58 @@ +/** +Prolog Music Composer V.3 +Verse, Chorus and Solo (based on I-V-vi-IV): +Listen to a sample. (https://soundcloud.com/lucian-green/prolog-music-composer-v3-based-on-i-v-vi-iv). +Counter=4 +?- versechorussoloprogression('C','D'). +[[C,D],[C,G,D]] +Yes +?- versechorussoloprogression('C','C#'). +[] +Yes +versechorussoloprogression('C','E'). +[[C,D,E],[C,D,A,E],[C,G,A,E],[C,G,D,E],[C,G#,E]] +Yes +?- versechorussoloprogression('C','D#'). +[[C,G,D#],[C,G#,D#]] +**/ +versechorussoloprogression1564(Name1, Name2,Progression2) :- +note(Note1, Name1), +note(Note2, Name2), +versechorussoloprogression11564(Note1, Note2, 0, _, [Note1], Progression1), notestonames(Progression1,Progression2),!. +versechorussoloprogression11564(_, _, Counter, b, Progression, Progression) :- +Counter = 3, !. 
+versechorussoloprogression11564(Note, Note, _Counter, a, Progression, Progression) :- !. +versechorussoloprogression11564(Note1, Note2, Counter1, Flag1, Progression1, Progression2) :- +Counter2 is Counter1 + 1, +step15641(Note1, Note3), +append(Progression1, [Note3], Progression3), +versechorussoloprogression21564(Note3, Note2, Note1, Counter2, Flag2, Progression3, Progression4), +step15642(Note1, Note4), +append(Progression1, [Note4], Progression5), +versechorussoloprogression21564(Note4, Note2, Note1, Counter2, Flag3, Progression5, Progression6), +step15643(Note1, Note5), +append(Progression1, [Note5], Progression7), +versechorussoloprogression21564(Note5, Note2, Note1, Counter2, Flag4, Progression7, Progression8), +flag1([[Flag2, Progression4], [Flag3, Progression6], [Flag4, Progression8]], Flag1, Progression2). +versechorussoloprogression21564(_, _, _, Counter, b, Progression, Progression) :- +Counter = 3, !. +versechorussoloprogression21564(Note, _, Note, _, _, _, _) :- !. +versechorussoloprogression21564(Note, Note, _, _Counter, a, Progression, Progression) :- !. +versechorussoloprogression21564(Note1, Note2, Note3, Counter1, Flag1, Progression1, Progression2) :- +Counter2 is Counter1 + 1, +step15641(Note1, Note4), +append(Progression1, [Note4], Progression3), +versechorussoloprogression21564(Note4, Note2, Note3, Counter2, Flag2, Progression3, Progression4), +step15642(Note1, Note5), +append(Progression1, [Note5], Progression5), +versechorussoloprogression21564(Note5, Note2, Note3, Counter2, Flag3, Progression5, Progression6), +step15643(Note1, Note6), +append(Progression1, [Note6], Progression7), +versechorussoloprogression21564(Note6, Note2, Note3, Counter2, Flag4, Progression7, Progression8), +flag1([[Flag2, Progression4], [Flag3, Progression6], [Flag4, Progression8]], Flag1, Progression2). +step15641(Note1, Note2) :- +Value is Note1+2, Note2 is Value mod 12. +step15642(Note1, Note2) :- +Value is Note1+7, Note2 is Value mod 12. 
+step15643(Note1, Note2) :- +Value is Note1+8, Note2 is Value mod 12. diff --git a/Music-Composer/1645-4.pl b/Music-Composer/1645-4.pl new file mode 100644 index 0000000000000000000000000000000000000000..76b62cf7275c07a5284c130ab2300b578fe0ec54 --- /dev/null +++ b/Music-Composer/1645-4.pl @@ -0,0 +1,46 @@ +/** +Prolog Music Composer V.4 +Verse, Chorus and Solo (based on I-vi-IV-V): +Counter=4 +**/ +versechorussoloprogression1645(Name1, Name2,Progression2) :- +note(Note1, Name1), +note(Note2, Name2), +versechorussoloprogression11645(Note1, Note2, 0, _, [Note1], Progression1), notestonames(Progression1,Progression2),!. +versechorussoloprogression11645(_, _, Counter, b, Progression, Progression) :- +Counter = 4, !. +versechorussoloprogression11645(Note, Note, _Counter, a, Progression, Progression) :- !. +versechorussoloprogression11645(Note1, Note2, Counter1, Flag1, Progression1, Progression2) :- +Counter2 is Counter1 + 1, +step16451(Note1, Note3), +append(Progression1, [Note3], Progression3), +versechorussoloprogression21645(Note3, Note2, Note1, Counter2, Flag2, Progression3, Progression4), +step16452(Note1, Note4), +append(Progression1, [Note4], Progression5), +versechorussoloprogression21645(Note4, Note2, Note1, Counter2, Flag3, Progression5, Progression6), +step16453(Note1, Note5), +append(Progression1, [Note5], Progression7), +versechorussoloprogression21645(Note5, Note2, Note1, Counter2, Flag4, Progression7, Progression8), +flag1([[Flag2, Progression4], [Flag3, Progression6], [Flag4, Progression8]], Flag1, Progression2). +versechorussoloprogression21645(_, _, _, Counter, b, Progression, Progression) :- +Counter = 4, !. +versechorussoloprogression21645(Note, _, Note, _, _, _, _) :- !. +versechorussoloprogression21645(Note, Note, _, _Counter, a, Progression, Progression) :- !. 
+versechorussoloprogression21645(Note1, Note2, Note3, Counter1, Flag1, Progression1, Progression2) :- +Counter2 is Counter1 + 1, +step16451(Note1, Note4), +append(Progression1, [Note4], Progression3), +versechorussoloprogression21645(Note4, Note2, Note3, Counter2, Flag2, Progression3, Progression4), +step16452(Note1, Note5), +append(Progression1, [Note5], Progression5), +versechorussoloprogression21645(Note5, Note2, Note3, Counter2, Flag3, Progression5, Progression6), +step16453(Note1, Note6), +append(Progression1, [Note6], Progression7), +versechorussoloprogression21645(Note6, Note2, Note3, Counter2, Flag4, Progression7, Progression8), +flag1([[Flag2, Progression4], [Flag3, Progression6], [Flag4, Progression8]], Flag1, Progression2). +step16451(Note1, Note2) :- +Value is Note1+2, Note2 is Value mod 12. +step16452(Note1, Note2) :- +Value is Note1+8, Note2 is Value mod 12. +step16453(Note1, Note2) :- +Value is Note1+9, Note2 is Value mod 12. diff --git a/Music-Composer/1645.pl b/Music-Composer/1645.pl new file mode 100644 index 0000000000000000000000000000000000000000..7549186d2fef0ac4bb5d75b9e33350128ec494f8 --- /dev/null +++ b/Music-Composer/1645.pl @@ -0,0 +1,46 @@ +/** +Prolog Music Composer V.4 +Verse, Chorus and Solo (based on I-vi-IV-V): +Counter=4 +**/ +versechorussoloprogression1645(Name1, Name2,Progression2) :- +note(Note1, Name1), +note(Note2, Name2), +versechorussoloprogression11645(Note1, Note2, 0, _, [Note1], Progression1), notestonames(Progression1,Progression2),!. +versechorussoloprogression11645(_, _, Counter, b, Progression, Progression) :- +Counter = 3, !. +versechorussoloprogression11645(Note, Note, _Counter, a, Progression, Progression) :- !. 
+versechorussoloprogression11645(Note1, Note2, Counter1, Flag1, Progression1, Progression2) :- +Counter2 is Counter1 + 1, +step16451(Note1, Note3), +append(Progression1, [Note3], Progression3), +versechorussoloprogression21645(Note3, Note2, Note1, Counter2, Flag2, Progression3, Progression4), +step16452(Note1, Note4), +append(Progression1, [Note4], Progression5), +versechorussoloprogression21645(Note4, Note2, Note1, Counter2, Flag3, Progression5, Progression6), +step16453(Note1, Note5), +append(Progression1, [Note5], Progression7), +versechorussoloprogression21645(Note5, Note2, Note1, Counter2, Flag4, Progression7, Progression8), +flag1([[Flag2, Progression4], [Flag3, Progression6], [Flag4, Progression8]], Flag1, Progression2). +versechorussoloprogression21645(_, _, _, Counter, b, Progression, Progression) :- +Counter = 3, !. +versechorussoloprogression21645(Note, _, Note, _, _, _, _) :- !. +versechorussoloprogression21645(Note, Note, _, _Counter, a, Progression, Progression) :- !. +versechorussoloprogression21645(Note1, Note2, Note3, Counter1, Flag1, Progression1, Progression2) :- +Counter2 is Counter1 + 1, +step16451(Note1, Note4), +append(Progression1, [Note4], Progression3), +versechorussoloprogression21645(Note4, Note2, Note3, Counter2, Flag2, Progression3, Progression4), +step16452(Note1, Note5), +append(Progression1, [Note5], Progression5), +versechorussoloprogression21645(Note5, Note2, Note3, Counter2, Flag3, Progression5, Progression6), +step16453(Note1, Note6), +append(Progression1, [Note6], Progression7), +versechorussoloprogression21645(Note6, Note2, Note3, Counter2, Flag4, Progression7, Progression8), +flag1([[Flag2, Progression4], [Flag3, Progression6], [Flag4, Progression8]], Flag1, Progression2). +step16451(Note1, Note2) :- +Value is Note1+2, Note2 is Value mod 12. +step16452(Note1, Note2) :- +Value is Note1+8, Note2 is Value mod 12. +step16453(Note1, Note2) :- +Value is Note1+9, Note2 is Value mod 12. 
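All of the progression generators above (1451, 1564, 1645 and their -4 variants) share one shape: a depth-limited search that repeatedly adds fixed semitone intervals modulo 12 until the target note is reached. As an illustration only (this Python sketch is not part of the repository, and it omits the extra Prolog clause that prunes a revisited note, which does not change the outputs checked here), the 1564 variant with its +2/+7/+8 steps and depth limit 3 behaves like:

```python
# Illustrative Python re-implementation (not part of this repository) of the
# depth-limited search in 1564.pl: from a start pitch class, repeatedly add
# the intervals +2, +7 and +8 (mod 12) and keep every path that reaches the
# target within the depth limit. Pitch classes: C=0, C#=1, ..., B=11.

NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]
STEPS_1564 = [2, 7, 8]   # step15641, step15642, step15643
LIMIT = 3                # "Counter = 3" in 1564.pl (1564-4.pl uses 4)

def progressions(start, target, steps=STEPS_1564, limit=LIMIT):
    """Collect every note path from start to target within `limit` steps."""
    results = []

    def search(note, depth, path):
        if note == target:          # success clause (flag 'a' in the Prolog)
            results.append(path)
            return
        if depth == limit:          # depth cut-off (flag 'b')
            return
        for s in steps:
            search((note + s) % 12, depth + 1, path + [(note + s) % 12])

    search(start, 0, [start])
    return results

def named(progs):
    return [[NOTE_NAMES[n] for n in p] for p in progs]

print(named(progressions(0, 2)))  # [['C', 'D'], ['C', 'G', 'D']]
print(named(progressions(0, 1)))  # []
```

These two results match the sample queries documented in 1564.pl's header comment. The same sketch covers 1451.pl (steps +2 and +5) and 1645.pl (steps +2, +8 and +9) by changing `steps`, and the `-4` variants by raising `limit` to 4.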
diff --git a/Music-Composer/EQ.txt b/Music-Composer/EQ.txt new file mode 100644 index 0000000000000000000000000000000000000000..4d4ff60f6727f5d640428d9515ae00fbe4fcb4f5 --- /dev/null +++ b/Music-Composer/EQ.txt @@ -0,0 +1,10 @@ +EQ + +Bass and guitar sound better separated by EQ (Arts College Song 30/12/13) +See http://www.guitarplayer.com/article/7-eq-tips-for-mixing-guitar/150906 + +Keep instruments in middle third of volume. + +Acoustic guitar - low 500, high 1.5 K +Piano - high 400, low 2K +Voice - high 2K diff --git a/Music-Composer/LICENSE b/Music-Composer/LICENSE new file mode 100644 index 0000000000000000000000000000000000000000..d925e8c4aa63a274f76a5c22de79d0ada99b0da7 --- /dev/null +++ b/Music-Composer/LICENSE @@ -0,0 +1,29 @@ +BSD 3-Clause License + +Copyright (c) 2019, Lucian Green +All rights reserved. + +Redistribution and use in source and binary forms, with or without +modification, are permitted provided that the following conditions are met: + +1. Redistributions of source code must retain the above copyright notice, this + list of conditions and the following disclaimer. + +2. Redistributions in binary form must reproduce the above copyright notice, + this list of conditions and the following disclaimer in the documentation + and/or other materials provided with the distribution. + +3. Neither the name of the copyright holder nor the names of its + contributors may be used to endorse or promote products derived from + this software without specific prior written permission. + +THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" +AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE +IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE +DISCLAIMED. 
IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE +FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL +DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR +SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER +CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, +OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE +OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. diff --git a/Music-Composer/README.md b/Music-Composer/README.md new file mode 100644 index 0000000000000000000000000000000000000000..b08eed4d8d935f8de981fc640206418c9c8cedb4 --- /dev/null +++ b/Music-Composer/README.md @@ -0,0 +1,121 @@ +# Music Composer + +Music Composer helps automate writing pop songs in three modes. It outputs a lyrics text file, an asc2mid text file, a meta music file and a MIDI music file. + +1. Manual entry of form, lyrics, melody, harmony, instruments, parts that the melody plays in and parts that the harmony plays in. + +2. Random generation of the creative choices. + +3. Mind-reading mode. + +# Getting Started + +Please read the following instructions on how to install the project on your computer for writing music. + +# Prerequisites + +* Please download and install SWI-Prolog for your machine at `https://www.swi-prolog.org/build/`. + +* Requires asc2mid (http://www.archduke.org/midi/). Compile it with a C compiler before running: +``` +gcc asc2mid.c -o asc2mid +``` +* Name the compiled application "asc2mid". + +* Optional: search for and download the backdated version 6.0.5 of GarageBand and the AlterEgo singing-voice synthesis plug-in for old GarageBand for Mac, or for Windows (https://www.plogue.com/downloads.html). + +# 1. Install manually + +Download this repository, the List Prolog Interpreter Repository and the Mind Reader Repository. + +# 2.
Or install from List Prolog Package Manager (LPPM) + +* Download the LPPM Repository: + +``` +mkdir GitHub +cd GitHub/ +git clone https://github.com/luciangreen/List-Prolog-Package-Manager.git +cd List-Prolog-Package-Manager +swipl +['lppm']. +lppm_install("luciangreen","Music-Composer"). +halt. +``` + +# Running + +* In Shell: +`cd Music-Composer` +`swipl` + +* Enter the following to enter creative choices manually: +`['mindreadtestmusiccomposer-unusual-4-userinput'].` + +* Enter the following to enter creative choices with rhythm manually: +`['mindreadtestmusiccomposer-unusual-4-userinput-rhythm'].` +* Note: In rhythm entry mode, enter at most n notes per lyric line, up to n melody and harmony notes, and n rhythm notes. The melody is truncated to the length of the vocal lyric line, or, in lines without vocals, to the length of the longest vocal lyric line. + +* Enter the following to generate 3-note progressions randomly: +`['mindreadtestmusiccomposer-unusual-3'].` + +* Enter the following to generate 4-note progressions randomly: +`['mindreadtestmusiccomposer-unusual-4'].` + +* Before running the texttobr-based mind reader, think of two radio buttons put on recordings, put through with prayer, nut and bolt, quantum box prayer 1, 1, 0.5 cm and 1, 1, 0.5 cm. + +* Follow the instructions in https://github.com/luciangreen/Text-to-Breasonings/blob/master/Instructions_for_Using_texttobr(2).pl.txt when using texttobr, texttobr2 or the mind reader to avoid medical problems. + +* Please follow the instructions in init.txt to initialise any new mind-reading algorithms.
+ +* Enter the following to mind-read songs with 4-note progressions: +`['mindreadtestmusiccomposer-unusual-4-mr'].` + +* Enter the following to mind-read songs in detail with 4-note progressions: +`['mindreadtestmusiccomposer-unusual-4-mr-tree.pl'].` + +# Running the algorithm + +To generate a song: +`sectest0(Form1,Lyrics,Melody,Harmony,MelodyParts,HarmonyParts,Vocalstubinstrument,Song1).` + +To output the possible chord progressions from one note to another: +`versechorussoloprogression1451('C','D',Progression),writeln1(Progression).` +`versechorussoloprogression1564('C','D',Progression),writeln1(Progression).` +`versechorussoloprogression1645('C','D',Progression),writeln1(Progression).` +`classicalcomposition('C','D',Progression),writeln1(Progression).` +`popclassicalcomposition('C','D',Progression),writeln1(Progression).` + +# Data files to stems + +* Separate `song***.txt` files into stem files for each instrument (saved in the `stems` folder) by placing the data files in the `data_files` folder, then entering `['data_to_stems.pl'].` and `data_to_stems.`. + +# Stems/data files to mid files + +If: +- you have converted the data files to stem files and need `.mid` files, or +- the file `song***.txt` is produced but the `.mid` file is not: + +* Place the data files or stems in the `stems` folder and enter `['data_to_stems.pl'].` and `stems_to_mid.`; the `.mid` files will be saved in the `mid` folder. + +# Convert meta to mid files + +* Place `*_meta.txt` files in the `meta` folder and enter `['meta2mid.pl'].` and `meta2mid.`; the `*lyrics.txt`, `*.mid` and `*.txt` files will be saved in the `Music-Composer` folder. +* Run `meta2mid_r.` instead of `meta2mid.` to be asked for a song-wide rhythm. + +# Note on Mind Reader + +Mind Reader is currently not configured to display screens or to mind-read decisions accurately. It detects vague, not exact, thoughts.
+ +* If necessary, repeat the "arem" mantra the whole time the mind-reading computation is running to ensure the best results. + +# Versioning + +We will use SemVer for versioning. + +# Authors + +Lucian Green - Initial programmer - Lucian Academy + +# License + +This project is licensed under the BSD 3-Clause License - see the LICENSE file for details. diff --git a/Music-Composer/asc2mid b/Music-Composer/asc2mid new file mode 100644 index 0000000000000000000000000000000000000000..2ec76060d67d590edece267c0be13888f7b255ca Binary files /dev/null and b/Music-Composer/asc2mid differ diff --git a/Music-Composer/classical-4.pl b/Music-Composer/classical-4.pl new file mode 100644 index 0000000000000000000000000000000000000000..04a29669a8694bfc6af033a6e6ce1fdc45d7b3c9 --- /dev/null +++ b/Music-Composer/classical-4.pl @@ -0,0 +1,100 @@ +/** +Classical Composition +Based on 2 3 5 6 7 10 11 intervals +https://swish.swi-prolog.org/p/LucianGreenMusic.pl +Counter=4 +?- classicalcomposition('C','D'). +?- classicalcomposition('C','C#'). +?- classicalcomposition('C','E'). +?- classicalcomposition('C','D#'). +**/ + +classicalcomposition(Name1, Name2,Progression2) :- +note(Note1, Name1), +note(Note2, Name2), +classicalcomposition1(Note1, Note2, 0, _, [Note1], Progression1), notestonames(Progression1,Progression2),!.%%writeln(Progression2),!. +classicalcomposition1(_, _, Counter, b, Progression, Progression) :- +Counter = 4, !. +classicalcomposition1(Note, Note, _Counter, a, Progression, Progression) :- !.
+classicalcomposition1(Note1, Note2, Counter1, Flag1, Progression1, Progression2) :- +Counter2 is Counter1 + 1, +stepclassical1(Note1, Note3), +append(Progression1, [Note3], Progression3), +classicalcomposition2(Note3, Note2, Note1, Counter2, Flag2, Progression3, Progression4), +stepclassical2(Note1, Note4), +append(Progression1, [Note4], Progression5), +classicalcomposition2(Note4, Note2, Note1, Counter2, Flag3, Progression5, Progression6), +stepclassical3(Note1, Note5), +append(Progression1, [Note5], Progression7), +classicalcomposition2(Note5, Note2, Note1, Counter2, Flag4, Progression7, Progression8), + +stepclassical4(Note1, Note7), +append(Progression1, [Note7], Progression9), +classicalcomposition2(Note7, Note2, Note3, Counter2, Flag5, Progression9, Progression10), + +stepclassical5(Note1, Note8), +append(Progression1, [Note8], Progression11), +classicalcomposition2(Note8, Note2, Note3, Counter2, Flag6, Progression11, Progression12), + +stepclassical6(Note1, Note9), +append(Progression1, [Note9], Progression13), +classicalcomposition2(Note9, Note2, Note3, Counter2, Flag7, Progression13, Progression14), + +stepclassical7(Note1, Note10), +append(Progression1, [Note10], Progression15), +classicalcomposition2(Note10, Note2, Note3, Counter2, Flag8, Progression15, Progression16), + +flag1([[Flag2, Progression4], [Flag3, Progression6], [Flag4, Progression8], + +[Flag5, Progression10], [Flag6, Progression12], [Flag7, Progression14], [Flag8, Progression16]], Flag1, Progression2). + +classicalcomposition2(_, _, _, Counter, b, Progression, Progression) :- +Counter = 4, !. +classicalcomposition2(Note, _, Note, _, _, _, _) :- !. +classicalcomposition2(Note, Note, _, _Counter, a, Progression, Progression) :- !. 
+classicalcomposition2(Note1, Note2, Note3, Counter1, Flag1, Progression1, Progression2) :- +Counter2 is Counter1 + 1, +stepclassical1(Note1, Note4), +append(Progression1, [Note4], Progression3), +classicalcomposition2(Note4, Note2, Note3, Counter2, Flag2, Progression3, Progression4), +stepclassical2(Note1, Note5), +append(Progression1, [Note5], Progression5), +classicalcomposition2(Note5, Note2, Note3, Counter2, Flag3, Progression5, Progression6), +stepclassical3(Note1, Note6), +append(Progression1, [Note6], Progression7), +classicalcomposition2(Note6, Note2, Note3, Counter2, Flag4, Progression7, Progression8), + +stepclassical4(Note1, Note7), +append(Progression1, [Note7], Progression9), +classicalcomposition2(Note7, Note2, Note3, Counter2, Flag5, Progression9, Progression10), + +stepclassical5(Note1, Note8), +append(Progression1, [Note8], Progression11), +classicalcomposition2(Note8, Note2, Note3, Counter2, Flag6, Progression11, Progression12), + +stepclassical6(Note1, Note9), +append(Progression1, [Note9], Progression13), +classicalcomposition2(Note9, Note2, Note3, Counter2, Flag7, Progression13, Progression14), + +stepclassical7(Note1, Note10), +append(Progression1, [Note10], Progression15), +classicalcomposition2(Note10, Note2, Note3, Counter2, Flag8, Progression15, Progression16), + +flag1([[Flag2, Progression4], [Flag3, Progression6], [Flag4, Progression8], + +[Flag5, Progression10], [Flag6, Progression12], [Flag7, Progression14], [Flag8, Progression16]], Flag1, Progression2). + +stepclassical1(Note1, Note2) :- +Value is Note1+2, Note2 is Value mod 12. +stepclassical2(Note1, Note2) :- +Value is Note1+3, Note2 is Value mod 12. +stepclassical3(Note1, Note2) :- +Value is Note1+5, Note2 is Value mod 12. +stepclassical4(Note1, Note2) :- +Value is Note1+6, Note2 is Value mod 12. +stepclassical5(Note1, Note2) :- +Value is Note1+7, Note2 is Value mod 12. +stepclassical6(Note1, Note2) :- +Value is Note1+10, Note2 is Value mod 12. 
+stepclassical7(Note1, Note2) :- +Value is Note1+11, Note2 is Value mod 12. diff --git a/Music-Composer/classical.pl b/Music-Composer/classical.pl new file mode 100644 index 0000000000000000000000000000000000000000..9f2ff40d1106a9d719bb65ed9cb3f2db6e708449 --- /dev/null +++ b/Music-Composer/classical.pl @@ -0,0 +1,100 @@ +/** +Classical Composition +Based on 2 3 5 6 7 10 11 intervals +https://swish.swi-prolog.org/p/LucianGreenMusic.pl +Counter=4 +?- classicalcomposition('C','D'). +?- classicalcomposition('C','C#'). +classicalcomposition('C','E'). +?- classicalcomposition('C','D#'). +**/ + +classicalcomposition(Name1, Name2,Progression2) :- +note(Note1, Name1), +note(Note2, Name2), +classicalcomposition1(Note1, Note2, 0, _, [Note1], Progression1), notestonames(Progression1,Progression2),!.%%writeln(Progression2),!. +classicalcomposition1(_, _, Counter, b, Progression, Progression) :- +Counter = 3, !. +classicalcomposition1(Note, Note, _Counter, a, Progression, Progression) :- !. +classicalcomposition1(Note1, Note2, Counter1, Flag1, Progression1, Progression2) :- +Counter2 is Counter1 + 1, +stepclassical1(Note1, Note3), +append(Progression1, [Note3], Progression3), +classicalcomposition2(Note3, Note2, Note1, Counter2, Flag2, Progression3, Progression4), +stepclassical2(Note1, Note4), +append(Progression1, [Note4], Progression5), +classicalcomposition2(Note4, Note2, Note1, Counter2, Flag3, Progression5, Progression6), +stepclassical3(Note1, Note5), +append(Progression1, [Note5], Progression7), +classicalcomposition2(Note5, Note2, Note1, Counter2, Flag4, Progression7, Progression8), + +stepclassical4(Note1, Note7), +append(Progression1, [Note7], Progression9), +classicalcomposition2(Note7, Note2, Note3, Counter2, Flag5, Progression9, Progression10), + +stepclassical5(Note1, Note8), +append(Progression1, [Note8], Progression11), +classicalcomposition2(Note8, Note2, Note3, Counter2, Flag6, Progression11, Progression12), + +stepclassical6(Note1, Note9), 
+append(Progression1, [Note9], Progression13), +classicalcomposition2(Note9, Note2, Note3, Counter2, Flag7, Progression13, Progression14), + +stepclassical7(Note1, Note10), +append(Progression1, [Note10], Progression15), +classicalcomposition2(Note10, Note2, Note3, Counter2, Flag8, Progression15, Progression16), + +flag1([[Flag2, Progression4], [Flag3, Progression6], [Flag4, Progression8], + +[Flag5, Progression10], [Flag6, Progression12], [Flag7, Progression14], [Flag8, Progression16]], Flag1, Progression2). + +classicalcomposition2(_, _, _, Counter, b, Progression, Progression) :- +Counter = 3, !. +classicalcomposition2(Note, _, Note, _, _, _, _) :- !. +classicalcomposition2(Note, Note, _, _Counter, a, Progression, Progression) :- !. +classicalcomposition2(Note1, Note2, Note3, Counter1, Flag1, Progression1, Progression2) :- +Counter2 is Counter1 + 1, +stepclassical1(Note1, Note4), +append(Progression1, [Note4], Progression3), +classicalcomposition2(Note4, Note2, Note3, Counter2, Flag2, Progression3, Progression4), +stepclassical2(Note1, Note5), +append(Progression1, [Note5], Progression5), +classicalcomposition2(Note5, Note2, Note3, Counter2, Flag3, Progression5, Progression6), +stepclassical3(Note1, Note6), +append(Progression1, [Note6], Progression7), +classicalcomposition2(Note6, Note2, Note3, Counter2, Flag4, Progression7, Progression8), + +stepclassical4(Note1, Note7), +append(Progression1, [Note7], Progression9), +classicalcomposition2(Note7, Note2, Note3, Counter2, Flag5, Progression9, Progression10), + +stepclassical5(Note1, Note8), +append(Progression1, [Note8], Progression11), +classicalcomposition2(Note8, Note2, Note3, Counter2, Flag6, Progression11, Progression12), + +stepclassical6(Note1, Note9), +append(Progression1, [Note9], Progression13), +classicalcomposition2(Note9, Note2, Note3, Counter2, Flag7, Progression13, Progression14), + +stepclassical7(Note1, Note10), +append(Progression1, [Note10], Progression15), +classicalcomposition2(Note10, Note2, 
Note3, Counter2, Flag8, Progression15, Progression16), + +flag1([[Flag2, Progression4], [Flag3, Progression6], [Flag4, Progression8], + +[Flag5, Progression10], [Flag6, Progression12], [Flag7, Progression14], [Flag8, Progression16]], Flag1, Progression2). + +stepclassical1(Note1, Note2) :- +Value is Note1+2, Note2 is Value mod 12. +stepclassical2(Note1, Note2) :- +Value is Note1+3, Note2 is Value mod 12. +stepclassical3(Note1, Note2) :- +Value is Note1+5, Note2 is Value mod 12. +stepclassical4(Note1, Note2) :- +Value is Note1+6, Note2 is Value mod 12. +stepclassical5(Note1, Note2) :- +Value is Note1+7, Note2 is Value mod 12. +stepclassical6(Note1, Note2) :- +Value is Note1+10, Note2 is Value mod 12. +stepclassical7(Note1, Note2) :- +Value is Note1+11, Note2 is Value mod 12. diff --git a/Music-Composer/data_files/.DS_Store b/Music-Composer/data_files/.DS_Store new file mode 100644 index 0000000000000000000000000000000000000000..5008ddfcf53c02e82d7eee2e57c38e5672ef89f6 Binary files /dev/null and b/Music-Composer/data_files/.DS_Store differ diff --git a/Music-Composer/data_to_stems.pl b/Music-Composer/data_to_stems.pl new file mode 100644 index 0000000000000000000000000000000000000000..ef9e0d840c7ab46f9abf83d5b9e56a82c0c3802a --- /dev/null +++ b/Music-Composer/data_to_stems.pl @@ -0,0 +1,181 @@ +:-include('../listprologinterpreter/la_files.pl'). +:-include('../listprologinterpreter/la_maths.pl'). +:-include('../listprologinterpreter/la_strings.pl'). +:-include('../listprologinterpreter/la_strings_string.pl'). 
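The classical generators above search over seven intervals (2, 3, 5, 6, 7, 10 and 11 semitones) rather than three. A quick illustrative check (a hypothetical snippet, not part of the repository) shows that this interval set already reaches every pitch class from C within two steps, so any target note lies well inside the depth limit:

```python
# Hypothetical check (not in the repo): the interval set used by
# stepclassical1..stepclassical7 in classical.pl and classical-4.pl
# reaches all 12 pitch classes from C (pitch class 0) within two steps.
INTERVALS = [2, 3, 5, 6, 7, 10, 11]

one_step = {i % 12 for i in INTERVALS}
two_steps = one_step | {(n + i) % 12 for n in one_step for i in INTERVALS}

print(sorted(two_steps))  # [0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11]
```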
+ +data_to_stems :- + + directory_files("data_files/",F), + delete_invisibles_etc(F,G), + + findall([Filex1,Header,Footer,Instruments],(member(Filex1,G), + string_concat("data_files/",Filex1,Filex), + phrase_from_file_s(string(File_codes), Filex), + %string_codes(File_string,File_codes), + %atom_to_term(String02b,Data_file,[]) + + %A=`b i `, +%trace, + ba(File_codes, Header,Footer,Middle), + %string_codes(Middle2,Middle), + %trace,writeln1(Middle2), + % retrieve data to make stem files +%length(B,6), +%foldr(B,Middle,_) +%trace, + %writeln1(split_on_instrument(Middle,[],Instruments)), + %trace, + split_on_instrument(Middle,[],Instruments) %, + %writeln( split_on_instrument(Middle,[],Instruments)) + % + ),String00), + +%writeln(String00), + findall(_,(member([File,Header,Footer,Instruments],String00), + + length(Instruments,Length), + numbers(Length,1,[],Ns), + + findall(_,(member(N,Ns),get_item_n(Instruments,N,Instrument), + foldr(string_concat,[Header,Instrument,Footer],"",File1), + %trace, + %File=File2, + string_concat(File2,".txt",File), + foldr(string_concat,["stems/",File2,"-","stem",N,".txt"],"",File_name), + save_file_s(File_name,File1)),_)),_),!. + +ba(File_codes, Header,Footer,Middle) :- +%trace, + + A=`BA 1 CR 0 TR 1 CH 1 Instrument `, + + append(Header2,D2,File_codes), + string_codes(Header,Header2), + append(A,D,D2), +%trace, + foldr(append,[A,D],[],D4), + reverse(D4,File_codes_reversed), + append(Footer1,D3,File_codes_reversed), + append(`\n`,_D1,D3), + reverse(Footer1,Footer2), + string_codes(Footer,Footer2), + + foldr(append,[%`\n`, + D3],[],E), + reverse(E,Middle), + !. 
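data_to_stems relies on split_on_instrument/3 (defined below) to cut the body of a song data file into one chunk per `Instrument` marker before writing each chunk out as a stem. A much-simplified Python analogue, illustrative only — the sample event lines and `NT` syntax here are assumptions, not repository output:

```python
# Much-simplified, illustrative analogue (not in the repo) of
# split_on_instrument/3 in data_to_stems.pl: split a song data file's body
# into one chunk per line containing the "Instrument" marker, so each chunk
# can be written out as a per-instrument stem file.
def split_on_instrument(middle):
    marker = "Instrument"
    chunks, current = [], []
    for line in middle.splitlines(keepends=True):
        if marker in line and current:   # a new instrument starts here
            chunks.append("".join(current))
            current = []
        current.append(line)
    chunks.append("".join(current))      # final instrument chunk
    return chunks

# Hypothetical two-instrument body (the event lines are made up):
body = ("BA 1 CR 0 TR 1 CH 1 Instrument 0\nNT C 4\n"
        "BA 1 CR 0 TR 2 CH 2 Instrument 33\nNT E 4\n")
print(split_on_instrument(body))  # two chunks, one per instrument
```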
+ +split_on_instrument(Middle,Instruments1,Instruments2) :- + A=`Instrument`, + %writeln(Middle), + %trace, + + %A=[i], + append(B,C22,Middle), + append(A,C12,C22), + %trace, + %writeln(here1), + + (not((append(_B11,C11,C12),append(A,_C31,C11)))-> + (string_codes(Middle1,Middle), + append(Instruments1,[Middle1],Instruments2)); +%split_on_instrument(Middle,Instruments1,Instruments2) :- + (%A=`Instrument `, + %trace, + %append(_B10,C2,Middle), + %append(A,C,C2), + %trace, + append(B1,C3,C12), + append(A,C1,C3), + + + %writeln(here2), + %trace, + %reverse(B1,B_reversed), + %append(Start_of_instrument1,D3,B_reversed), + append(D33,Start_of_instrument1,B1), + reverse(D33,D3), + + reverse(D3,D31), + + + %writeln(here3), + + append(`\n`,_D1,D3), + append(End_of_instrument1,D22,C1), + + foldr(append,[B,A,D31],[],D32), + + append(`\n`,D21,D22), + %trace, + End_of_instrument1=End_of_instrument, + %reverse(End_of_instrument1,End_of_instrument), + %reverse(D21,D2), + D21=D2, + + %writeln(here4), + %trace, + reverse(Start_of_instrument1,Start_of_instrument2), + append(B41,C4,Start_of_instrument2), + append(`\n`,C41,C4), + reverse(B41,B412), + reverse(C4,C42), + reverse(C41,C412), + + + reverse(Start_of_instrument1,Start_of_instrument), + + + foldr(append,[B412,A,C1%Start_of_instrument1,A,C1 + ],[],Instrument1), + Instrument1=Instrument2, + %*/ + D34=D32, + + foldr(append,[D34,C42],[],D441), + + string_codes(D44,D441), + + append(Instruments1,[D44],Instruments31), + %flatten(Instruments31,Instruments32), + /* + writeln([b412,B412]), + writeln([c42,C42]), + writeln([c412,C412]), + writeln([b41,B41]), + writeln([c4,C4]), + writeln([c41,C41]), + writeln([d3,D3]), + writeln([b,B]), + writeln([c22,C22]), + writeln([c12,C12]), + writeln(C12), + writeln([c1,C1]), + writeln([b1,B1]), + writeln([d32,D32]), + writeln([start_of_instrument1,Start_of_instrument1]), + writeln([end_of_instrument1,End_of_instrument1]), + %trace, + %append() + 
writeln(split_on_instrument(Instrument1,Instruments31,Instruments2)), + */ + split_on_instrument(Instrument1,Instruments31,Instruments2))),!. + +stems_to_mid :- + + check_asc2mid, + + directory_files("stems/",F), + delete_invisibles_etc(F,G), + + findall(_,( + + member(File,G), + + string_concat(File1,".txt",File), + + foldr(string_concat,["./asc2mid stems/",File," > mid/",File1,".mid"],Command), + + shell1_s(Command) + + ),_),!. diff --git a/Music-Composer/instruments_mr-tree.pl b/Music-Composer/instruments_mr-tree.pl new file mode 100644 index 0000000000000000000000000000000000000000..b73b952d001b4144f4186391f381225e7aecb227 --- /dev/null +++ b/Music-Composer/instruments_mr-tree.pl @@ -0,0 +1 @@ +instruments_a([[1,"[",2],[2,"0",[-,"[0,""Acoustic Grand Piano""] 01"]],[2,"1",723],[2,"2",1011],[2,"3",1288],[2,"4",1522],[2,"5",1757],[2,"6",1984],[2,"7",2186],[2,"8",2452],[2,"9",2711],[63,",",[-,"[10,""Music Box""] 01"]],[63,"0",[-,"[100,""FX 5 (brightness)""] 01"]],[63,"1",[-,"[101,""FX 6 (goblins)""] 01"]],[63,"2",[-,"[102,""FX 7 (echoes)""] 01"]],[63,"3",[-,"[103,""FX 8 (sci-fi)""] 01"]],[63,"4",[-,"[104,""Sitar""] 01"]],[63,"5",[-,"[105,""Banjo""] 01"]],[63,"6",[-,"[106,""Shamisen""] 01"]],[63,"7",[-,"[107,""Koto""] 01"]],[63,"8",[-,"[108,""Kalimba""] 01"]],[63,"9",[-,"[109,""Bagpipe""] 01"]],[285,",",[-,"[11,""Vibraphone""] 01"]],[285,"0",[-,"[110,""Fiddle""] 01"]],[285,"1",[-,"[111,""Shanai""] 01"]],[285,"2",[-,"[112,""Tinkle Bell""] 01"]],[285,"3",[-,"[113,""Agogo""] 01"]],[285,"4",[-,"[114,""Steel Drums""] 01"]],[285,"5",[-,"[115,""Woodblock""] 01"]],[285,"6",[-,"[116,""Taiko Drum""] 01"]],[285,"7",[-,"[117,""Melodic Tom""] 01"]],[285,"8",[-,"[118,""Synth Drum""] 01"]],[285,"9",[-,"[119,""Reverse Cymbal""] 01"]],[508,",",[-,"[12,""Marimba""] 01"]],[508,"0",[-,"[120,""Guitar Fret Noise""] 01"]],[508,"1",[-,"[121,""Breath Noise""] 01"]],[508,"2",[-,"[122,""Seashore""] 01"]],[723,",",[-,"[1,""Bright Acoustic Piano""] 
01"]],[723,"0",63],[723,"1",285],[723,"2",508],[723,"3",[-,"[13,""Xylophone""] 01"]],[723,"4",[-,"[14,""Tubular Bells""] 01"]],[723,"5",[-,"[15,""Dulcimer""] 01"]],[723,"6",[-,"[16,""Drawbar Organ""] 01"]],[723,"7",[-,"[17,""Percussive Organ""] 01"]],[723,"8",[-,"[18,""Rock Organ""] 01"]],[723,"9",[-,"[19,""Church Organ""] 01"]],[1011,",",[-,"[2,""Electric Grand Piano""] 01"]],[1011,"0",[-,"[20,""Reed Organ""] 01"]],[1011,"1",[-,"[21,""Accordian""] 01"]],[1011,"2",[-,"[22,""Harmonica""] 01"]],[1011,"3",[-,"[23,""Tango Accordian""] 01"]],[1011,"4",[-,"[24,""Nylon Acoustic Guitar""] 01"]],[1011,"5",[-,"[25,""Steel Acoustic Guitar""] 01"]],[1011,"6",[-,"[26,""Jazz Electric Guitar""] 01"]],[1011,"7",[-,"[27,""Clean Electric Guitar""] 01"]],[1011,"8",[-,"[28,""Muted Electric Guitar""] 01"]],[1011,"9",[-,"[29,""Overdriven Guitar""] 01"]],[1288,",",[-,"[3,""Honky-Tonk Piano""] 01"]],[1288,"0",[-,"[30,""Distortion Guitar""] 01"]],[1288,"1",[-,"[31,""Guitar Harmonics""] 01"]],[1288,"2",[-,"[32,""Acoustic Bass""] 01"]],[1288,"3",[-,"[33,""Electric Bass (finger)""] 01"]],[1288,"4",[-,"[34,""Electric Bass (pick)""] 01"]],[1288,"5",[-,"[35,""Fretless Bass""] 01"]],[1288,"6",[-,"[36,""Slap Bass 1""] 01"]],[1288,"7",[-,"[37,""Slap Bass 2""] 01"]],[1288,"8",[-,"[38,""Synth Bass 1""] 01"]],[1288,"9",[-,"[39,""Synth Bass 2""] 01"]],[1522,",",[-,"[4,""Electric Piano 1""] 01"]],[1522,"0",[-,"[40,""Violin""] 01"]],[1522,"1",[-,"[41,""Viola""] 01"]],[1522,"2",[-,"[42,""Cello""] 01"]],[1522,"3",[-,"[43,""Contrabass""] 01"]],[1522,"4",[-,"[44,""Tremolo Strings""] 01"]],[1522,"5",[-,"[45,""Pizzicato Strings""] 01"]],[1522,"6",[-,"[46,""Orchestral Harp""] 01"]],[1522,"7",[-,"[47,""Timpani""] 01"]],[1522,"8",[-,"[48,""String Ensemble 1""] 01"]],[1522,"9",[-,"[49,""String Ensemble 2""] 01"]],[1757,",",[-,"[5,""Electric Piano 2""] 01"]],[1757,"0",[-,"[50,""Synth Strings 1""] 01"]],[1757,"1",[-,"[51,""Synth Strings 2""] 01"]],[1757,"2",[-,"[52,""Choir Aahs""] 01"]],[1757,"3",[-,"[53,""Voice 
Oohs""] 01"]],[1757,"4",[-,"[54,""Synth Choir""] 01"]],[1757,"5",[-,"[55,""Orchestra Hit""] 01"]],[1757,"6",[-,"[56,""Trumpet""] 01"]],[1757,"7",[-,"[57,""Trombone""] 01"]],[1757,"8",[-,"[58,""Tuba""] 01"]],[1757,"9",[-,"[59,""Muted Trumpet""] 01"]],[1984,",",[-,"[6,""Harpsichord""] 01"]],[1984,"0",[-,"[60,""French Horn""] 01"]],[1984,"1",[-,"[61,""Brass Section""] 01"]],[1984,"2",[-,"[62,""Synth Brass 1""] 01"]],[1984,"3",[-,"[63,""Synth Brass 2""] 01"]],[1984,"4",[-,"[64,""Soprano Sax""] 01"]],[1984,"5",[-,"[65,""Alto Sax""] 01"]],[1984,"6",[-,"[66,""Tenor Sax""] 01"]],[1984,"7",[-,"[67,""Baritone Sax""] 01"]],[1984,"8",[-,"[68,""Oboe""] 01"]],[1984,"9",[-,"[69,""English Horn""] 01"]],[2186,",",[-,"[7,""Clavinet""] 01"]],[2186,"0",[-,"[70,""Bassoon""] 01"]],[2186,"1",[-,"[71,""Clarinet""] 01"]],[2186,"2",[-,"[72,""Piccolo""] 01"]],[2186,"3",[-,"[73,""Flute""] 01"]],[2186,"4",[-,"[74,""Recorder""] 01"]],[2186,"5",[-,"[75,""Pan Flute""] 01"]],[2186,"6",[-,"[76,""Blown Bottle""] 01"]],[2186,"7",[-,"[77,""Shakuhachi""] 01"]],[2186,"8",[-,"[78,""Whistle""] 01"]],[2186,"9",[-,"[79,""Ocarina""] 01"]],[2452,",",[-,"[8,""Celesta""] 01"]],[2452,"0",[-,"[80,""Lead 1 (square)""] 01"]],[2452,"1",[-,"[81,""Lead 2 (sawtooth)""] 01"]],[2452,"2",[-,"[82,""Lead 3 (calliope)""] 01"]],[2452,"3",[-,"[83,""Lead 4 (chiff)""] 01"]],[2452,"4",[-,"[84,""Lead 5 (charang)""] 01"]],[2452,"5",[-,"[85,""Lead 6 (voice)""] 01"]],[2452,"6",[-,"[86,""Lead 7 (fifths)""] 01"]],[2452,"7",[-,"[87,""Lead 8 (bass + lead)""] 01"]],[2452,"8",[-,"[88,""Pad 1 (new age)""] 01"]],[2452,"9",[-,"[89,""Pad 2 (warm)""] 01"]],[2711,",",[-,"[9,""Glockenspiel""] 01"]],[2711,"0",[-,"[90,""Pad 3 (polysynth)""] 01"]],[2711,"1",[-,"[91,""Pad 4 (choir)""] 01"]],[2711,"2",[-,"[92,""Pad 5 (bowed)""] 01"]],[2711,"3",[-,"[93,""Pad 6 (metallic)""] 01"]],[2711,"4",[-,"[94,""Pad 7 (halo)""] 01"]],[2711,"5",[-,"[95,""Pad 8 (sweep)""] 01"]],[2711,"6",[-,"[96,""FX 1 (rain)""] 01"]],[2711,"7",[-,"[97,""FX 2 (soundtrack)""] 
01"]],[2711,"8",[-,"[98,""FX 3 (crystal)""] 01"]],[2711,"9",[-,"[99,""FX 4 (atmosphere)""] 01"]]]). \ No newline at end of file diff --git a/Music-Composer/la_strings.pl b/Music-Composer/la_strings.pl new file mode 100644 index 0000000000000000000000000000000000000000..98b063afc1d33fdc6f02f3404bab6a964e6f2909 --- /dev/null +++ b/Music-Composer/la_strings.pl @@ -0,0 +1,48 @@ +%% la_strings.pl
+
+:- include('texttobr2qb').
+
+:- use_module(library(pio)).
+:- use_module(library(dcg/basics)).
+
+open_s(File,Mode,Stream) :-
+ atom_string(File1,File),
+ open(File1,Mode,Stream),!.
+
+string_atom(String,Atom) :-
+ atom_string(Atom,String),!.
+
+phrase_from_file_s(string(Output), String) :-
+ atom_string(String1,String),
+ phrase_from_file(string(Output), String1),!.
+
+writeln1(Term) :-
+ term_to_atom(Term,Atom),
+ writeln(Atom),!.
+
+shell1_s(Command) :-
+ atom_string(Command1,Command),
+ shell1(Command1),!.
+/**
+concat_list(A1,B):-
+ A1=[A|List],
+ concat_list(A,List,B),!.
+
+concat_list(A,[],A):-!.
+concat_list(A,List,B) :-
+ List=[Item|Items],
+ string_concat(A,Item,C),
+ concat_list(C,Items,B).
+
+
+append_list(A1,B):-
+ A1=[A|List],
+ append_list(A,List,B),!.
+
+append_list(A,[],A):-!.
+append_list(A,List,B) :-
+ List=[Item|Items],
+ append(A,[Item],C),
+ append_list(C,Items,B).
+
+**/ diff --git a/Music-Composer/meta/.DS_Store b/Music-Composer/meta/.DS_Store new file mode 100644 index 0000000000000000000000000000000000000000..5008ddfcf53c02e82d7eee2e57c38e5672ef89f6 Binary files /dev/null and b/Music-Composer/meta/.DS_Store differ diff --git a/Music-Composer/meta2mid.pl b/Music-Composer/meta2mid.pl new file mode 100644 index 0000000000000000000000000000000000000000..b14132730857a9b7eb4d69148607e99b155e1d54 --- /dev/null +++ b/Music-Composer/meta2mid.pl @@ -0,0 +1,183 @@ +/**
+
+- Form
+- Which chord progression type?
+- Lyrics - random sentences, rhyming
+- Melody - random starting and ending note of first phrase, melody and harmony from chord progression type with three notes (split for melody), the circle of fifths for connection between sections
+- Which of up to 3 instruments for each of melody and harmony
+- What sections they will play
+
+- Melody cut to the length of vocals
+
+**/
+
+:- use_module(library(pio)).
+
+:- use_module(library(date)).
+%%:- include('texttobr2qb').
+%%:- include('mindreadtestshared').
+%%:- include('texttobr2').
+%%:- include('texttobr').
+%%:- include('texttobr2qbmusic').
+
+:- include('../listprologinterpreter/listprolog.pl').
+
+:- include('mindreadtestmusiccomposer-unusual-ui-rhythm.pl').%mindreadtestmusiccomposer-unusual-mr-tree.pl').
+
+%:- include('musiclibrary').
+%:- include('../listprologinterpreter/listprolog.pl').
+%:- include('../listprologinterpreter/la_strings.pl').
+%:- include('../listprologinterpreter/la_files.pl').
+%:- include('../mindreader/make_mind_reading_tree4 working1.pl').
+%:- include('../mindreader/mr_tree.pl').
+%:- include('instruments_mr-tree.pl').
+
+%sectest(0) :- !.
+%sectest(N1) :-
+% writeln(["\n",N1]),
+:- dynamic ask_for_rhythm/1.
+
+
+meta2mid :-
+ retractall(ask_for_rhythm(_)),
+ assertz(ask_for_rhythm(false)),
+ meta2mid1.
+
+meta2mid_r :-
+ retractall(ask_for_rhythm(_)),
+ assertz(ask_for_rhythm(true)),
+ meta2mid1.
+meta2mid1 :- + + check_asc2mid, + + + +Folder="meta", +foldr(string_concat,[Folder,"/"],Path), + directory_files(Path,F), + delete_invisibles_etc(F,G), + +findall(_,(member(Filex1,G), + string_concat(Path,Filex1,Filex), + % Additional_variables are [label,var] + open_file_s(Filex,Meta_file), + +%open_file_s("meta/song202422921430.467365026474_meta.txt",Meta_file), + +Meta_file=[[form,Form1],[chord_progressions,CPT],[voice_part,Voiceparts2],[melody,Melody],[harmony,Harmony],[melody_instruments, + MelodyInstruments],[harmony_instruments,HarmonyInstruments],[melody_parts,MelodyParts],[harmony_parts, + HarmonyParts],[lyrics,Lyrics], + [genre,["anthem"]]|Rhythm1], + + (Rhythm1=[[rhythm,Rhythm]]-> + ( + retractall(rhythm(_)), + assertz(rhythm(Rhythm))); + + (ask_for_rhythm(true)->rhythm;%(MelodyParts,HarmonyParts); + (retractall(rhythm(_)), + assertz(rhythm([["0","NT","1/2",1,80],["1/2","NT","1/2",1,80],["1","NT","1/2",2,80],["1+1/2","NT","1/2",2,80],["2","NT","1/2",3,80],["2+1/2","NT","1/2",3,80],["3","NT","1/2",4,80],["3+1/2","NT","1/2",4,80]]))))), + +%sectest0(_Form1,_Lyrics,_Melody,_Harmony,_MelodyParts, + %_HarmonyParts,_Vocalstubinstrument,_Song1), + %N2 is N1-1, + %sectest(N2),!. 
+ +%sectest0(Form1,Lyrics,Melody,Harmony,MelodyParts,HarmonyParts,Vocalstubinstrument,Song1) :- + + %%texttobr2qb(2), %%Imagine song + %form(Form1), + %%Form1=[v2,o], + %find("Should the chord progression type be 1451, 1564, 1645, Classical or Classical Pop?",CPT), + %remove_dups(Form1,[],Form2), + %Voiceparts1=[v1,v2,c,s], + intersection(Form1,[v1,v2,c,s],Voiceparts2), + lyrics2_m2m(%Voiceparts1, + Lyrics,0,Maxlength), + %findall(B,(member(A1,Form2),string_concat(B1,_C,A1), + %string_length(B1,1),atom_string(B,B1)),Form3), + %remove_dups(Form3,[],Form4), + %%repeat, %% in case melody, harmony don't work + %melodyharmony(Form4,CPT,Maxlength,Melody,Harmony), + %%writeln(melodyharmony(Form4,CPT,Maxlength,Melody,Harmony)), %% *** + %instruments(Form1,MelodyInstruments,HarmonyInstruments, + %MelodyParts,HarmonyParts,Vocalstubinstrument), + %%writeln(instruments(Form1,MelodyInstruments,HarmonyInstruments, +%% MelodyParts,HarmonyParts, +%% Vocalstubinstrument)), + %%writeln(rendersong(Form1,Voiceparts2,Maxlength,Melody,Harmony, + %% MelodyInstruments,HarmonyInstruments,MelodyParts, + %% HarmonyParts,Lyrics, + %% Vocalstubinstrument,Song1)), %%, + Vocalstubinstrument=[0,"Acoustic Grand Piano"], + + %working_directory1(WD,WD), + %working_directory1(_,"mid/"), + + rendersong(Form1,Voiceparts2,Maxlength,Melody,Harmony, + MelodyInstruments,HarmonyInstruments,MelodyParts, + HarmonyParts,Lyrics, + Vocalstubinstrument,_Song1,File1), + + rhythm(Rhythm2), + + Meta_file3=[[form,Form1],[chord_progressions,CPT],[voice_part,Voiceparts2],[melody,Melody],[harmony,Harmony],[melody_instruments, + MelodyInstruments],[harmony_instruments,HarmonyInstruments],[melody_parts,MelodyParts],[harmony_parts, + HarmonyParts],[lyrics,Lyrics], + [genre,["anthem"]],[rhythm,Rhythm2]], + + term_to_atom(Meta_file3,Meta_file1), + string_atom(Meta_file2,Meta_file1), + +%member(Filex1,G), + % Additional_variables are [label,var] + %open_file_s(Filex,Meta_file), + + %concat_list("",[Filex1],File2), + + 
concat_list("",[File1,"_meta.txt"],Filex2), + %string_concat(Path,File2,Filex2), + %working_directory1(WD,WD), + %working_directory1(_,"../"), + (open_s(Filex2,write,Stream1), + write(Stream1,Meta_file2), + close(Stream1)) + %working_directory1(_,WD) + +),_), %%, + + %working_directory1(_,WD), +/* +Meta_file=[[form,Form1],[chord_progressions,CPT],[voice_part,Voiceparts2],[melody,Melody],[harmony,Harmony],[melody_instruments, + MelodyInstruments],[harmony_instruments,HarmonyInstruments],[melody_parts,MelodyParts],[harmony_parts, + HarmonyParts],[lyrics,Lyrics], + [genre,["anthem"]]], +*/ + !. + +%lyrics_m2m(Form1,Lyrics,Maxlength) :- + %%find("Who is the song about?",Character), + %%lyrics1(Form,Character,[],Lyrics). + %catch((lyrics2_m2m(Form1,[],Lyrics,[],_Names,0,Maxlength)),_, +% lyrics_m2m(Form1,Lyrics,Maxlength)). + +lyrics2_m2m([],Maxlength,Maxlength) :- !. +lyrics2_m2m(Lyrics1,Maxlength1,Maxlength2) :- + + %Form1=[Form2|Forms3], + Lyrics1=[[_Form2,Sentence1,Sentence2,Sentence3,Sentence4]|Lyrics3], + + %findsentence(Sentence1,Names1,Names3), + length(Sentence1,Length1), + %findsentence(Sentence2,Names3,Names4), + length(Sentence2,Length2), + %findrhyme(Sentence1,Sentence3,Names4,Names5), + length(Sentence3,Length3), + %findrhyme(Sentence2,Sentence4,Names5,Names6), + length(Sentence4,Length4), + %append(Lyrics1,[[Form2,Sentence1,Sentence2,Sentence3,Sentence4]],Lyrics3), + max([Length1,Length2,Length3,Length4],Maxlength1,Maxlength3), + lyrics2_m2m(Lyrics3,Maxlength3,Maxlength2),!. 
\ No newline at end of file diff --git a/Music-Composer/mid/.DS_Store b/Music-Composer/mid/.DS_Store new file mode 100644 index 0000000000000000000000000000000000000000..5008ddfcf53c02e82d7eee2e57c38e5672ef89f6 Binary files /dev/null and b/Music-Composer/mid/.DS_Store differ diff --git a/Music-Composer/mindreadtestmusiccomposer-unusual-3.pl b/Music-Composer/mindreadtestmusiccomposer-unusual-3.pl new file mode 100644 index 0000000000000000000000000000000000000000..41a5d6560ec54de123a29338fd4fa42f26060605 --- /dev/null +++ b/Music-Composer/mindreadtestmusiccomposer-unusual-3.pl @@ -0,0 +1,6 @@ +:- include('mindreadtestmusiccomposer-unusual'). +:- include('1451'). +:- include('1564'). +:- include('1645'). +:- include('popclassical'). +:- include('classical'). diff --git a/Music-Composer/mindreadtestmusiccomposer-unusual-4-mr-tree.pl b/Music-Composer/mindreadtestmusiccomposer-unusual-4-mr-tree.pl new file mode 100644 index 0000000000000000000000000000000000000000..4a6d2045e27482425d41a4edcea1f9df0a711b4e --- /dev/null +++ b/Music-Composer/mindreadtestmusiccomposer-unusual-4-mr-tree.pl @@ -0,0 +1,7 @@ +:- include('mindreadtestmusiccomposer-unusual-mr-tree'). +:- include('1451-4'). +:- include('1564-4'). +:- include('1645-4'). +:- include('popclassical-4'). +:- include('classical-4'). +%%:- include('la_strings'). diff --git a/Music-Composer/mindreadtestmusiccomposer-unusual-4-mr.pl b/Music-Composer/mindreadtestmusiccomposer-unusual-4-mr.pl new file mode 100644 index 0000000000000000000000000000000000000000..0f3cd2eceda6a7a08f2dedea6492693ad0dd8e01 --- /dev/null +++ b/Music-Composer/mindreadtestmusiccomposer-unusual-4-mr.pl @@ -0,0 +1,7 @@ +:- include('mindreadtestmusiccomposer-unusual-mr'). +:- include('1451-4'). +:- include('1564-4'). +:- include('1645-4'). +:- include('popclassical-4'). +:- include('classical-4'). +%%:- include('la_strings'). 
diff --git a/Music-Composer/mindreadtestmusiccomposer-unusual-4-userinput-rhythm.pl b/Music-Composer/mindreadtestmusiccomposer-unusual-4-userinput-rhythm.pl new file mode 100644 index 0000000000000000000000000000000000000000..b27a857db51ba5efcc33a03ad323e3ad21046498 --- /dev/null +++ b/Music-Composer/mindreadtestmusiccomposer-unusual-4-userinput-rhythm.pl @@ -0,0 +1,7 @@ +:- include('../listprologinterpreter/listprolog.pl'). +:- include('mindreadtestmusiccomposer-unusual-ui-rhythm'). +:- include('1451-4'). +:- include('1564-4'). +:- include('1645-4'). +:- include('popclassical-4'). +:- include('classical-4'). diff --git a/Music-Composer/mindreadtestmusiccomposer-unusual-4-userinput.pl b/Music-Composer/mindreadtestmusiccomposer-unusual-4-userinput.pl new file mode 100644 index 0000000000000000000000000000000000000000..09eff2c03eb8f90e56a97d577e0d2c5de323eb06 --- /dev/null +++ b/Music-Composer/mindreadtestmusiccomposer-unusual-4-userinput.pl @@ -0,0 +1,6 @@ +:- include('mindreadtestmusiccomposer-unusual-ui'). +:- include('1451-4'). +:- include('1564-4'). +:- include('1645-4'). +:- include('popclassical-4'). +:- include('classical-4'). diff --git a/Music-Composer/mindreadtestmusiccomposer-unusual-4.pl b/Music-Composer/mindreadtestmusiccomposer-unusual-4.pl new file mode 100644 index 0000000000000000000000000000000000000000..f87ce1a18b3c25e7d8d3946a5806b20c3375d4e8 --- /dev/null +++ b/Music-Composer/mindreadtestmusiccomposer-unusual-4.pl @@ -0,0 +1,6 @@ +:- include('mindreadtestmusiccomposer-unusual'). +:- include('1451-4'). +:- include('1564-4'). +:- include('1645-4'). +:- include('popclassical-4'). +:- include('classical-4'). 
diff --git a/Music-Composer/mindreadtestmusiccomposer-unusual-mr-tree.pl b/Music-Composer/mindreadtestmusiccomposer-unusual-mr-tree.pl new file mode 100644 index 0000000000000000000000000000000000000000..98795d7a25118ffb22328955501c20c3f1ab50d4 --- /dev/null +++ b/Music-Composer/mindreadtestmusiccomposer-unusual-mr-tree.pl @@ -0,0 +1,984 @@ +/**
+
+- Form
+- Which chord progression type?
+- Lyrics - random sentences, rhyming
+- Melody - random starting and ending note of first phrase, melody and harmony from chord progression type with three notes (split for melody), the circle of fifths for connection between sections
+- Which of up to 3 instruments for each of melody and harmony
+- What sections they will play
+
+- Melody cut to the length of vocals
+
+**/
+
+:- use_module(library(pio)).
+
+:- use_module(library(date)).
+%%:- include('texttobr2qb').
+%%:- include('mindreadtestshared').
+%%:- include('texttobr2').
+%%:- include('texttobr').
+%%:- include('texttobr2qbmusic').
+
+:- include('musiclibrary').
+:- include('la_strings').
+:- include('../mindreader/make_mind_reading_tree4 working1.pl').
+:- include('../mindreader/mr_tree.pl').
+:- include('instruments_mr-tree.pl').
+
+sectest(0) :- !.
+sectest(N1) :-
+ writeln(["\n",N1]),
+ sectest0(_Form1,_Lyrics,_Melody,_Harmony,_MelodyParts,
+ _HarmonyParts,_Vocalstubinstrument,_Song1),
+ N2 is N1-1,
+ sectest(N2),!.
+
+sectest0(Form1,Lyrics,Melody,Harmony,MelodyParts,HarmonyParts,Vocalstubinstrument,Song1) :-
+ check_asc2mid,
+ %%texttobr2qb(2), %%Imagine song
+ form(Form1),
+ %%Form1=[v2,o],
+ find("Should the chord progression type be 1451, 1564, 1645, Classical or Classical Pop?",CPT),
+ remove_dups(Form1,[],Form2),
+ Voiceparts1=[v1,v2,c,s],
+ intersection(Form1,[v1,v2,c,s],Voiceparts2),
+ lyrics(Voiceparts1,Lyrics,Maxlength),
+ findall(B,(member(A1,Form2),string_concat(B1,_C,A1),
+ string_length(B1,1),atom_string(B,B1)),Form3),
+ remove_dups(Form3,[],Form4),
+ %%repeat, %% in case melody, harmony don't work
+ melodyharmony(Form4,CPT,Maxlength,Melody,Harmony),
+ %%writeln(melodyharmony(Form4,CPT,Maxlength,Melody,Harmony)), %% ***
+ instruments(Form1,MelodyInstruments,HarmonyInstruments,
+ MelodyParts,HarmonyParts,Vocalstubinstrument),
+ %%writeln(instruments(Form1,MelodyInstruments,HarmonyInstruments,
+%% MelodyParts,HarmonyParts,
+%% Vocalstubinstrument)),
+ %%writeln(rendersong(Form1,Voiceparts2,Maxlength,Melody,Harmony,
+ %% MelodyInstruments,HarmonyInstruments,MelodyParts,
+ %% HarmonyParts,Lyrics,
+ %% Vocalstubinstrument,Song1)), %%,
+
+ rendersong(Form1,Voiceparts2,Maxlength,Melody,Harmony,
+ MelodyInstruments,HarmonyInstruments,MelodyParts,
+ HarmonyParts,Lyrics,
+ Vocalstubinstrument,Song1,File1), %%,
+
+Meta_file=[[form,Form1],[chord_progressions,CPT],[voice_part,Voiceparts2],[melody,Melody],[harmony,Harmony],[melody_instruments,
+ MelodyInstruments],[harmony_instruments,HarmonyInstruments],[melody_parts,MelodyParts],[harmony_parts,
+ HarmonyParts],[lyrics,Lyrics],
+ [genre,["anthem"]]],
+
+ term_to_atom(Meta_file,Meta_file1),
+ string_atom(Meta_file2,Meta_file1),
+
+ concat_list("",[File1,"_meta.txt"],File2),
+ (open_s(File2,write,Stream1),
+ write(Stream1,Meta_file2),
+ close(Stream1)),!
+ .
+
+
+%% n intro, vn verse, c chorus, i1 instrumental 1, t2 instrumental 2, s solo, o outro
+
+findall1(Form2,Form3) :-
+ findall(B,shorten(Form2,B),Form3).
+ +shorten(Form2,B) :- + member(A1,Form2),string_concat(B1,_C,A1), + string_length(B1,1),atom_string(B,B1). + +form(Form) :- + Form1=[n,v1], + find("Should a chorus or instrumental come after the first verse?",CorI), + (CorI=c-> + (append(Form1,[c,v2,c,t2,s],Form2), + find(["Should a chorus and outro or two solos come after the first solo?",1],COorSS), + (COorSS=[c,o]-> + (append(Form2,[c],Form3), + append(Form3,[o],Form)); + COorSS=[s,s], + append(Form2,[s],Form3),append(Form3,[s],Form)) + ); + (CorI=i1,append(Form1,[i1,v2,c,t2,s],Form2), + find(["Should a chorus and outro or two solos come after the first solo?",2],COorSS), + (COorSS=[c,o]-> + (append(Form2,[c],Form3), + append(Form3,[o],Form)); + COorSS=[s,s], + append(Form2,[s],Form3),append(Form3,[s],Form)) + )). + +makename(N3) :- + mind_read(N1,["Ack","Luc","Add","All","Brae","Skye","Whist","Dix","Wilb","Duk","Le","Ven", + "Syd","Don","Count","Black","Nei"]), + mind_read(N2,["an","ae","ye","ler","ee","ur","ard","ice","ney","ald","ess","el"]), + append([N1],[N2],N3). + +makenames(0,Ns,Ns) :- !. +makenames(Num1,N1,N2) :- + random_member(R1,["Ack","Luc","Add","All","Brae","Skye","Whist","Dix","Wilb","Duk","Le","Ven", + "Syd","Don","Count","Black","Nei"]), + random_member(R2,["an","ae","ye","ler","ee","ur","ard","ice","ney","ald","ess","el"]), + append([R1],[R2],R3), + append(N1,[R3],N3), + Num2 is Num1-1, + makenames(Num2,N3,N2). + +findsentence(Sentence,Names1,Names2) :- + makename(N1),readv(Vs),reado(Os), + makenames(5,[],N2s1),append(N2s1,Names1,N2s), + mind_read(R11,Vs), + append(Os,N2s,Os2), + mind_read(R21,Os2), + append_list(N1,[R11,R21],Sentence), + (member(R21,N2s1)-> + append(Names1,[R21],Names2); + Names2=Names1). 
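As a worked example, the list construction in `form/1` above can be traced for one pair of answers. The answers `c` and `[c,o]` are assumed here purely for illustration (the real predicate obtains them via `find/2`), and `form_example/1` is a hypothetical helper, not part of the source:

```prolog
% Sketch of the appends in form/1 for the assumed answers
% CorI = c and COorSS = [c,o].
form_example(Form) :-
    Form1 = [n,v1],                        % intro, first verse
    append(Form1,[c,v2,c,t2,s],Form2),     % chorus branch after v1
    append(Form2,[c],Form3),               % extra chorus
    append(Form3,[o],Form).                % outro

% ?- form_example(F).
% F = [n, v1, c, v2, c, t2, s, c, o].
```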
+ +findrhyme(Phrase1,Phrase2,Names1,Names2) :- + reverse(Phrase1,Reverse), + Reverse=[Lastsyllable|_], + readv(Vs),reado(Os),makenames(5,[],N2s1), + append(N2s1,Names1,N2s), + append(Os,N2s,Os2), + mind_read(R21,Os2), + mind_read(R11,Vs), + findall(C,(member(C,N2s),reverse(C,B),B=[E|_],rhymes2(E,Lastsyllable)),Phrases22), + mind_read(R31,Phrases22), + append_list(R21,[R11,R31],Phrase2), + (member(R21,N2s1)-> + append(Names1,[R21],Names3); + Names3=Names1), + (member(Phrases22,N2s1)-> + append(Names3,[R21],Names2); + Names2=Names3). + +rhymes2(Syllable1,Syllable2) :- + rhymes(Lists), + member(List1,Lists), + member(Syllable1,List1), + member(Syllable2,List1). + +lyrics(Form1,Lyrics,Maxlength) :- + %%find("Who is the song about?",Character), + %%lyrics1(Form,Character,[],Lyrics). + catch((lyrics2(Form1,[],Lyrics,[],_Names,0,Maxlength)),_, + lyrics(Form1,Lyrics,Maxlength)). + +lyrics2([],Lyrics,Lyrics,Names,Names,Maxlength,Maxlength) :- !. +lyrics2(Form1,Lyrics1,Lyrics2,Names1,Names2,Maxlength1,Maxlength2) :- + Form1=[Form2|Forms3], + findsentence(Sentence1,Names1,Names3), + length(Sentence1,Length1), + findsentence(Sentence2,Names3,Names4), + length(Sentence2,Length2), + findrhyme(Sentence1,Sentence3,Names4,Names5), + length(Sentence3,Length3), + findrhyme(Sentence2,Sentence4,Names5,Names6), + length(Sentence4,Length4), + append(Lyrics1,[[Form2,Sentence1,Sentence2,Sentence3,Sentence4]],Lyrics3), + max([Length1,Length2,Length3,Length4],Maxlength1,Maxlength3), + lyrics2(Forms3,Lyrics3,Lyrics2,Names6,Names2,Maxlength3,Maxlength2),!. + +melodyharmony(Form1,CPT,Maxlength,Melody,Harmony) :- + Partlength is Maxlength div 3, + Extra is Maxlength mod 3, + Total is Partlength+Extra, + _Parts=[Partlength,Partlength,Partlength,Total], + (CPT=1451->findall(A,note0(_,A),_Notes); + findall(A,note0(_,A),_Notes2)), + %% What note should the song start on? + %%trialy2(Notes,R1), + %%findbest(R1,R11), + R11='A', + melodyharmony(Form1,CPT,_Parts2,R11,_R2,[],Melody,[],Harmony). 
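The `Partlength`/`Extra` arithmetic at the start of `melodyharmony/5` above splits the longest lyric line into three equal phrase parts plus a final part that absorbs the remainder. A minimal sketch (`part_lengths/2` is a hypothetical name used only for this illustration):

```prolog
% The div/mod split used by melodyharmony/5 to size its phrase parts.
part_lengths(Maxlength,[Partlength,Partlength,Partlength,Total]) :-
    Partlength is Maxlength div 3,   % integer division
    Extra is Maxlength mod 3,        % remainder
    Total is Partlength+Extra.       % last part takes the remainder

% ?- part_lengths(10,Parts).
% Parts = [3, 3, 3, 4].
```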
+ +melodyharmony([],_CPT,_Parts,N,N,Melody,Melody,Harmony,Harmony) :- !. +melodyharmony(Form1,CPT,Parts,N1,N2,Melody1,Melody2,Harmony1,Harmony2) :- + Form1=[Form2|Forms3], + findmelody(Form2,CPT,Parts,N1,N3,Melody1,Melody3,Harmony1,Harmony3), + (CPT=1451->harmony1451(N3,3,N4);harmonyr(N3,5,N4)), + findmelody(Form2,CPT,Parts,N4,N5,Melody3,Melody4,Harmony3,Harmony4), + reverse(Melody3,Melody3R),Melody3R=[[_,Melody3RI]|_], %% relies on repeat *1 + reverse(Melody4,Melody4R),Melody4R=[[_,Melody4RI]|_], + not((length(Melody3RI,1),length(Melody4RI,1))), + melodyharmony(Forms3,CPT,Parts,N5,N2,Melody4,Melody2, + Harmony4,Harmony2). + +findmelody(Form,CPT,_Parts,N1,N2,Melody1,Melody2,Harmony1,Harmony2) :- + CPT=1451, + (CPT=1451->findall(A,note0(_,A),Notes); + findall(A,note0(_,A),Notes)), + %% What note should the phrase end on? + repeat, + mind_read(N2,Notes), + versechorussoloprogression1451(N1,N2,Progression), + mind_read(Progression2,Progression), + not(Progression2=[]), + append(Melody1,[[Form,Progression2]],Melody2), + versechorussoloprogression1451(N1,N2,Progression3), + mind_read(Progression4,Progression3), + harmony(Form,CPT,Progression4,Harmony1,Harmony2). +findmelody(Form,CPT,_Parts,N1,N2,Melody1,Melody2,Harmony1,Harmony2) :- + CPT=1564, + (CPT=1451->findall(A,note0(_,A),Notes); + findall(A,note0(_,A),Notes)), + %% What note should the phrase end on? + repeat, %% *1 see other *1 + mind_read(N2,Notes), + versechorussoloprogression1564(N1,N2,Progression), + mind_read(Progression2,Progression), + not(Progression2=[]), + append(Melody1,[[Form,Progression2]],Melody2), + versechorussoloprogression1564(N1,N2,Progression3), + mind_read(Progression4,Progression3), + harmony(Form,CPT,Progression4,Harmony1,Harmony2). +findmelody(Form,CPT,_Parts,N1,N2,Melody1,Melody2,Harmony1,Harmony2) :- + CPT=1645, + (CPT=1451->findall(A,note0(_,A),Notes); + findall(A,note0(_,A),Notes)), + %% What note should the phrase end on? 
+ repeat, + mind_read(N2,Notes), + versechorussoloprogression1645(N1,N2,Progression), + mind_read(Progression2,Progression), + not(Progression2=[]), + append(Melody1,[[Form,Progression2]],Melody2), + versechorussoloprogression1645(N1,N2,Progression3), + mind_read(Progression4,Progression3), + harmony(Form,CPT,Progression4,Harmony1,Harmony2). +findmelody(Form,CPT,_Parts,N1,N2,Melody1,Melody2,Harmony1,Harmony2) :- + CPT=classical, + (CPT=1451->findall(A,note0(_,A),Notes); + findall(A,note0(_,A),Notes)), + %% What note should the phrase end on? + repeat, + mind_read(N2,Notes), + classicalcomposition(N1,N2,Progression), + mind_read(Progression2,Progression), + not(Progression2=[]), + append(Melody1,[[Form,Progression2]],Melody2), + classicalcomposition(N1,N2,Progression3), + mind_read(Progression4,Progression3), + harmony(Form,CPT,Progression4,Harmony1,Harmony2). +findmelody(Form,CPT,_Parts,N1,N2,Melody1,Melody2,Harmony1,Harmony2) :- + CPT=classicalpop, + (CPT=1451->findall(A,note0(_,A),Notes); + findall(A,note0(_,A),Notes)), + %% What note should the phrase end on? + repeat, + mind_read(N2,Notes), + popclassicalcomposition(N1,N2,Progression), + mind_read(Progression2,Progression), + not(Progression2=[]), + append(Melody1,[[Form,Progression2]],Melody2), + popclassicalcomposition(N1,N2,Progression3), + mind_read(Progression4,Progression3), + harmony(Form,CPT,Progression4,Harmony1,Harmony2). + +harmony(Form,CPT,Progression,Harmony1,Harmony2) :- + harmony1(Form,CPT,Progression,[],Harmony3), + append(Harmony1,[Harmony3],Harmony2). + +harmony1(_Form,_CPT,[],Harmony,Harmony) :- !. 
+harmony1(Form,CPT,Progression1,Harmony1,Harmony2) :- + Progression1=[Note1|Progression2], + (CPT=1451->(harmony1451(Note1,2,Note2),harmony1451(Note1,4,Note3)); + (harmonyr(Note1,4,Note2),harmonyr(Note1,7,Note3))), + Harmony30=[Note2,Note3], + findall(B1,(member(A1,Harmony30),string_concat(B,_C,A1),string_length(B,1),atom_string(B1,B)),Harmony31), + append([Note1],Harmony31,Harmony3), + append(Harmony1,[[Form,Harmony3]],Harmony4), + harmony1(Form,CPT,Progression2,Harmony4,Harmony2),!. + +harmony1451(Note1,Value2,Note2) :- + note(Value1,Note1),Value3 is Value1+Value2,Value4 is Value3 mod 12,note(Value4,Note2). +harmonyr(Note1,Value2,Note2) :- + note(Value1,Note1),Value3 is Value1+Value2,Value4 is Value3 mod 12,note(Value4,Note2). + +instruments(Form1,MelodyInstruments,HarmonyInstruments,MelodyParts,HarmonyParts,Vocalstubinstrument) :- + instrumentnumber(MelodyInstrumentNumber), + instrumentnumber(HarmonyInstrumentNumber), + instrumentlist(MelodyInstrumentNumber,MelodyInstruments), + instrumentlist(HarmonyInstrumentNumber,HarmonyInstruments), + Vocalstubinstrument=[0,"Acoustic Grand Piano"], + parts(Form1,MelodyInstruments,MelodyParts), + aggregate_all(count, (member(A1,MelodyParts),not(A1=[_,_,0])), Count1), + not(Count1=0), + parts(Form1,HarmonyInstruments,HarmonyParts), + aggregate_all(count, (member(A2,HarmonyParts),not(A2=[_,_,0])), Count2), + not(Count2=0). + +instrumentnumber(NumberofInstruments) :- + %% Number of melody instruments + mind_read(NumberofInstruments,[1,2,3]). + +instrumentlist(NumberofInstruments,Instruments) :- + instrumentlist(NumberofInstruments,[],Instruments),!. + +instrumentlist(0,Instruments,Instruments) :- !. 
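`harmony1451/3` and `harmonyr/3` above perform the same operation: transpose a pitch class by an interval, wrapping modulo 12 through the `note/2` table. A self-contained sketch, where `note2/2` is an assumed stand-in for a fragment of the repository's `note/2` table (the stand-in values are assumptions for illustration):

```prolog
% Mod-12 interval addition as in harmony1451/3 and harmonyr/3.
% note2/2 is an assumed stand-in for the composer's note/2 table.
note2(0,'A').
note2(3,'C').
note2(7,'E').

transpose(Note1,Interval,Note2) :-
    note2(Value1,Note1),
    Value3 is Value1+Interval,
    Value4 is Value3 mod 12,
    note2(Value4,Note2).

% ?- transpose('A',3,Note).   % a minor third above A
% Note = 'C'.
```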
+instrumentlist(NumberofInstruments1,Instruments1,Instruments2) :- + Instruments=[[0,"Acoustic Grand Piano"],[1,"Bright Acoustic Piano"],[2,"Electric Grand Piano"],[3,"Honky-Tonk Piano"],[4,"Electric Piano 1"],[5,"Electric Piano 2"],[6,"Harpsichord"],[7,"Clavinet"],[8,"Celesta"],[9,"Glockenspiel"],[10,"Music Box"],[11,"Vibraphone"],[12,"Marimba"],[13,"Xylophone"],[14,"Tubular Bells"],[15,"Dulcimer"],[16,"Drawbar Organ"],[17,"Percussive Organ"],[18,"Rock Organ"],[19,"Church Organ"],[20,"Reed Organ"],[21,"Accordian"],[22,"Harmonica"],[23,"Tango Accordian"],[24,"Nylon Acoustic Guitar"],[25,"Steel Acoustic Guitar"],[26,"Jazz Electric Guitar"],[27,"Clean Electric Guitar"],[28,"Muted Electric Guitar"],[29,"Overdriven Guitar"],[30,"Distortion Guitar"],[31,"Guitar Harmonics"],[32,"Acoustic Bass"],[33,"Electric Bass (finger)"],[34,"Electric Bass (pick)"],[35,"Fretless Bass"],[36,"Slap Bass 1"],[37,"Slap Bass 2"],[38,"Synth Bass 1"],[39,"Synth Bass 2"],[40,"Violin"],[41,"Viola"],[42,"Cello"],[43,"Contrabass"],[44,"Tremolo Strings"],[45,"Pizzicato Strings"],[46,"Orchestral Harp"],[47,"Timpani"],[48,"String Ensemble 1"],[49,"String Ensemble 2"],[50,"Synth Strings 1"],[51,"Synth Strings 2"],[52,"Choir Aahs"],[53,"Voice Oohs"],[54,"Synth Choir"],[55,"Orchestra Hit"],[56,"Trumpet"],[57,"Trombone"],[58,"Tuba"],[59,"Muted Trumpet"],[60,"French Horn"],[61,"Brass Section"],[62,"Synth Brass 1"],[63,"Synth Brass 2"],[64,"Soprano Sax"],[65,"Alto Sax"],[66,"Tenor Sax"],[67,"Baritone Sax"],[68,"Oboe"],[69,"English Horn"],[70,"Bassoon"],[71,"Clarinet"],[72,"Piccolo"],[73,"Flute"],[74,"Recorder"],[75,"Pan Flute"],[76,"Blown Bottle"],[77,"Shakuhachi"],[78,"Whistle"],[79,"Ocarina"],[80,"Lead 1 (square)"],[81,"Lead 2 (sawtooth)"],[82,"Lead 3 (calliope)"],[83,"Lead 4 (chiff)"],[84,"Lead 5 (charang)"],[85,"Lead 6 (voice)"],[86,"Lead 7 (fifths)"],[87,"Lead 8 (bass + lead)"],[88,"Pad 1 (new age)"],[89,"Pad 2 (warm)"],[90,"Pad 3 (polysynth)"],[91,"Pad 4 (choir)"],[92,"Pad 5 (bowed)"],[93,"Pad 6 
(metallic)"],[94,"Pad 7 (halo)"],[95,"Pad 8 (sweep)"],[96,"FX 1 (rain)"],[97,"FX 2 (soundtrack)"],[98,"FX 3 (crystal)"],[99,"FX 4 (atmosphere)"],[100,"FX 5 (brightness)"],[101,"FX 6 (goblins)"],[102,"FX 7 (echoes)"],[103,"FX 8 (sci-fi)"],[104,"Sitar"],[105,"Banjo"],[106,"Shamisen"],[107,"Koto"],[108,"Kalimba"],[109,"Bagpipe"],[110,"Fiddle"],[111,"Shanai"],[112,"Tinkle Bell"],[113,"Agogo"],[114,"Steel Drums"],[115,"Woodblock"],[116,"Taiko Drum"],[117,"Melodic Tom"],[118,"Synth Drum"],[119,"Reverse Cymbal"],[120,"Guitar Fret Noise"],[121,"Breath Noise"],[122,"Seashore"]], + %% ,[123,"Bird Tweet"],[124,"Telephone Ring"],[125,"Helicopter"],[126,"Applause"],[127,"Gunshot"] + mind_read_instruments(Instrument,Instruments), + append(Instruments1,[Instrument],Instruments3), + NumberofInstruments2 is NumberofInstruments1-1, + instrumentlist(NumberofInstruments2,Instruments3,Instruments2),!. + +parts(Form,Instruments,Parts) :- + parts1(Form,Instruments,[],Parts),!. + +parts1(_Form,[],Parts,Parts) :- !. +parts1(Form,Instruments1,Parts1,Parts2) :- + Instruments1=[Instrument2|Instruments3], + parts2(Form,Instrument2,[],Part3), + append(Parts1,[Part3],Parts4), + parts1(Form,Instruments3,Parts4,Parts2),!. + +parts2([],_Instrument,Part,Part) :- !. +parts2(Form1,Instrument,Part1,Part2) :- + Form1=[Section|Form2], + (member([Section,Instrument,Playing],Part1)->true; + %% Is the instrument playing in the section? + (mind_read(Playing,[0,1]))), + append(Part1,[[Section,Instrument,Playing]],Part3), + parts2(Form2,Instrument,Part3,Part2),!. + +/** +concat_list(A,[],A):-!. +concat_list(A,List,B) :- + List=[Item|Items], + string_concat(A,Item,C), + concat_list(C,Items,B). +**/ + +concat_list3(A,[],A) :-!. +concat_list3(A,List,B) :- + List=[Item|Items], + concat_list2(A,[Item],C), + concat_list3(C,Items,B). +concat_list2(A,List,C) :- + ((List=[[Item|Items]]->true;List=[Item])-> + concat_list(A,[Item|Items],C); + fail),!. +concat_list2(A,Item,C) :- + concat_list3(A,Item,C),!. 
+ +sentencewithspaces(Sentence1,Sentence2) :- + Sentence1=[Item|Items], + string_concat(Firstletter1,Rest,Item), + string_length(Firstletter1,1), + upcase_atom(Firstletter1,Firstletter2), + concat_list3(Firstletter2,[Rest,""],Item2), + sentencewithspaces(Items,Item2,Sentence3), + string_concat(Sentence3,".",Sentence2). + +sentencewithspaces([],Sentence,Sentence) :- !. +sentencewithspaces(Sentence1,Sentence2,Sentence3) :- + Sentence1=[Word|Sentence4], + concat_list3("",[Sentence2," ",Word],Item2), + sentencewithspaces(Sentence4,Item2,Sentence3),!. + +rendersong(Form1,Voiceparts,_Maxlength,Melody, + Harmony,MelodyInstruments, + HarmonyInstruments,MelodyParts1, + HarmonyParts,Lyrics,Vocalstubinstrument,Song1,File1) :- + Totallength=8, + Voicetrack=1, + length(MelodyInstruments,MelodyInstrumentsLength), + length(HarmonyInstruments,HarmonyInstrumentsLength), + TracksNumber is MelodyInstrumentsLength+HarmonyInstrumentsLength+2, + Lyrics=[[_,Sentence1|_]|_],sentencewithspaces(Sentence1,Sentence2), + concat_list3("format=1 tracks=", [TracksNumber, " division=384\n\nBA 1 CR 0 TR 0 CH 16 Text type 2: \"Produced by Mind Reading Music Composer by Lucian Academy (Detailed Mind Reading Tree Mode)\"\nBA 1 CR 0 TR 0 CH 16 Text type 3: \"", Sentence2, "\"\nBA 1 CR 0 TR 0 CH 1 Channel volume 127\nBA 1 CR 0 TR 0 CH 16 Tempo 63.00009\n"], Song2), + printheader(Voicetrack,Vocalstubinstrument,Song2,Song3), + %%writeln(renderv1(Form1,Voiceparts,_,Lyrics,Melody, + %% Totallength,Voicetrack,1,_,Song3,Song4)), %% **** + renderv1(Form1,Voiceparts,_,Lyrics,Melody, + Totallength,Voicetrack,1,_,Song3,Song4), + Track2 is Voicetrack+1, + renderm1(Form1,Melody,MelodyParts1, + Track2,_Track3,Song4,Song5), + Track4 is MelodyInstrumentsLength+2, + renderh1(Form1,Harmony,HarmonyParts, + Track4,_Track5,Song5,Song6), + length(Form1,FormLength), + TotalBars is 4*FormLength+1, + concat_list3(Song6,["BA ",TotalBars," CR 0 TR 0 CH 16 End of track"],Song1), + 
get_time(TS),stamp_date_time(TS,date(Year,Month,Day,Hour1,Minute1,Seconda,_A,_TZ,_False),local), + concat_list3("song",[Year,Month,Day,Hour1,Minute1,Seconda],File1), + concat_list3(File1,[".txt"],File2), + concat_list3(File1,[".mid"],File3), + open_s(File2,write,Stream), + write(Stream,Song1), + close(Stream), + concat_list3("./asc2mid ",[File2," > ",File3],Command), + shell1_s(Command), + %%writeln(["Texttobr, Texttobr2 not working. Please manually breason out ",File2]), + %%N is 4, M is 4000, texttobr2(N,File2,u,M),texttobr(N,File2,u,M), + outputlyrics(File1,Lyrics), + +/** + texttobr2qb(3), %% give self breasonings + texttobr2qb(20), %%Feature 1 + texttobr2qb(20), %%Updates + texttobr2qb(20), %%Feature 2 + texttobr2qb(20), %%Updates + texttobr2qb(20), %%Feature 3 + texttobr2qb(20), %%Updates + texttobr2qb(100), %%Icon + texttobr2qb(20), %%Updates + texttobr2qb(32), %%Lyrics + texttobr2qb(36), %%Music + texttobr2qb(20), %%Updates + + texttobr2qb(2), %%Medicine + texttobr2qb(20), %%Updates + texttobr2qb(2), %%Sales + texttobr2qb(20), %%Updates + texttobr2qb(2), %%Marketing + texttobr2qb(20), %%Updates + + texttobr2qb(2), %%Graciously give or blame listener for colour imagery + texttobr2qb(20), %%Updates + + texttobr2qb(2), %%Play song + texttobr2qb(2), %%Like song + **/ +!. + +printheader(Voicetrack,Vocalstubinstrument,Song1,Song2) :- + Vocalstubinstrument=[N1,_], + N2 is N1+1, + concat_list3("BA 1 CR 0 TR ",[Voicetrack," CH ",Voicetrack," Instrument ",N2,"\nBA 1 CR 0 TR ",Voicetrack," CH ",Voicetrack," NT R 0 von=127 voff=0\n"],Song3), + string_concat(Song1,Song3,Song2). + +renderv1([],_Voiceparts,_Vocalstubinstrument,_Lyrics,_Melody,_Totallength,_Track,_Bar,_Bar2,Voice,Voice) :- !. 
+renderv1(Form1,Voiceparts1,Vocalstubinstrument,Lyrics,Melody,Totallength,Track,Bar1,Bar2,Voice1,Voice2) :- + Form1=[Section|Form2], + Voiceparts1=[Section|Voiceparts2], + renderv2(Section,Lyrics,Melody,Totallength,Track, + Bar1,Bar3,Voice1,Voice3), + renderv1(Form2,Voiceparts2,Vocalstubinstrument,Lyrics,Melody, + Totallength,Track,Bar3,Bar2,Voice3,Voice2),!. + +renderv1(Form1,Voiceparts1,Vocalstubinstrument,Lyrics,Melody,Totallength, +Track,Bar1,Bar2,Voice1,Voice2) :- + Form1=[Section1|Form2], + ((Voiceparts1=[Section2|_Voiceparts2], + not(Section1=Section2))->true;Voiceparts1=[]), + Bar3 is Bar1+4, + renderv1(Form2,Voiceparts1,Vocalstubinstrument,Lyrics,Melody, + Totallength,Track,Bar3,Bar2,Voice1,Voice2),!. + +renderv2(Section,Lyrics,_Melody,_Totallength,_Track,Bar,Bar,Voice,Voice) :- + not(member([Section|_],Lyrics)),!. +renderv2(Section1,Lyrics1,Melody1,_Totallength0,Track,Bar1,Bar2,Voice1,Voice2) :- + findall(Lyrics2,(member(Lyrics2,Lyrics1), + Lyrics2=[Section1|_]),Lyrics3), + string_concat(Section2A,_C,Section1), + string_length(Section2A,1), + atom_string(Section2,Section2A), + findall(Melody2,(member(Melody2,Melody1), + Melody2=[Section2|_]),Melody3), + Lyrics3=[[_, Lyrics1A,Lyrics2A,Lyrics3A,Lyrics4A]], + Melody3=[[_, Melody1A], [_, Melody2A]], + renderline1(Lyrics1A,Melody1A,_Totallength1,Track,Bar1,Voice1,Voice3), + Bar3 is Bar1+1, + renderline1(Lyrics2A,Melody2A,_Totallength2,Track,Bar3,Voice3,Voice4), + Bar4 is Bar3+1, + renderline1(Lyrics3A,Melody1A,_Totallength3,Track,Bar4,Voice4,Voice5), + Bar5 is Bar4+1, + renderline1(Lyrics4A,Melody2A,_Totallength4,Track,Bar5,Voice5,Voice6), + Bar6 is Bar5+1, + delete(Lyrics1,[Section1|_],Lyrics5), + renderv2(Section1,Lyrics5,Melody1,_Totallength,Track,Bar6,Bar2,Voice6,Voice2). 
+ +renderline1(Lyrics2,Melody1,_Totallength,Track,Bar1,Voice1,Voice2) :- + %%Lyrics1=[_,Lyrics2], + length(Lyrics2,Lyrics2Length), + %%Rests is Totallength-Lyrics2Length, + BarTimes1=["0","1/2","1","1+1/2","2","2+1/2","3","3+1/2"], + generatemelody(Lyrics2Length,Melody1,Melody2), + renderline2(BarTimes1,BarTimes2,Melody2,Track,Bar1,Voice1,Voice3), + renderlinerests(BarTimes2,_BarTimes3,Track,Bar1,Voice3,Voice2). + +generatemelody(Lyrics2Length,Melody1,Melody2) :- + generatemelody1(Melody1,[],Melody3), + length(Melody3,Melody2Length), + changelength(Lyrics2Length,Melody2Length,Melody3,Melody2). + +generatemelody1([],Melody,Melody) :- !. +generatemelody1(Melody1,Melody2,Melody3) :- + Melody1=[Melody4|Melody5], + append(Melody2,[Melody4,Melody4],Melody6), + generatemelody1(Melody5,Melody6,Melody3),!. + +%%changelength(_Melody2Length,Melody,Melody) :- !. +changelength(Lyrics2Length,Melody2Length,Melody,Melody) :- + Lyrics2Length=Melody2Length,!. +changelength(Lyrics2Length,Melody2Length,Melody1,Melody2) :- + Lyrics2Length > Melody2Length, + Length is Lyrics2Length-Melody2Length, + repeatlastnote1(Length,Melody1,Melody2). +changelength(Lyrics2Length,Melody2Length,Melody1,Melody2) :- + Lyrics2Length < Melody2Length, + length(Melody2,Lyrics2Length), + append(Melody2,_,Melody1). + +repeatlastnote1(Length,Melody1,Melody2) :- + reverse(Melody1,Melody3), + Melody3=[Item|_], + repeatlastnote2(Length,Item,Melody1,Melody2),!. + +repeatlastnote2(0,_Item,Melody,Melody) :- !. +repeatlastnote2(Length1,Item,Melody1,Melody2) :- + append(Melody1,[Item],Melody3), + Length2 is Length1-1, + repeatlastnote2(Length2,Item,Melody3,Melody2),!. + +renderline2(BarTimes,BarTimes,[],_Track,_Bar1,Voice,Voice) :- !. 
+renderline2(BarTimes1,BarTimes4,Melody1,Track,Bar,Voice1,Voice2) :- + Melody1=[Melody2|Melody3], + BarTimes1=[BarTimes2|BarTimes3], + concat_list3("BA ",[Bar," CR ",BarTimes2," TR ",Track," CH ",Track," NT ",Melody2,"- 1/2 voff=0\n"],Song), + string_concat(Voice1,Song,Voice3), + renderline2(BarTimes3,BarTimes4,Melody3,Track,Bar,Voice3,Voice2). + +renderlinerests(BarTimes,BarTimes,_Track,_Bar1,Voice,Voice) :- !. +renderlinerests(BarTimes1,BarTimes2,Track,Bar,Voice1,Voice2) :- + BarTimes1=[BarTimes2|BarTimes3], + concat_list3("BA ",[Bar," CR ",BarTimes2," TR ",Track," CH ",Track," NT R 1/2 voff=0\n"],Song), + string_concat(Voice1,Song,Voice3), + renderlinerests(BarTimes3,BarTimes2,Track,Bar,Voice3,Voice2). + +longtoshortform(Section1,Section2) :- + string_concat(Section2A,_C,Section1), + string_length(Section2A,1), + atom_string(Section2,Section2A),!. + + +renderm1(_Form1,_Melody,[],_Track1,_Track2,Song,Song) :- !. +renderm1(Form1,Melody,MelodyParts1,Track1,Track2,Song1,Song2) :- + %%length(Form1,FormLength), + MelodyParts1=[MelodyParts2|MelodyParts3], + MelodyParts2=[[_A,[InstrumentNumber1,_B],_C]|_D], + InstrumentNumber2 is InstrumentNumber1,%% + 1, + printheader(Track1,[InstrumentNumber2,_],Song1,Song3), + %%renderm21(Form1,Melody,MelodyParts1,Track1,1,_,Song3,Song4), + renderm21(Form1,Melody,MelodyParts2,Track1,1,_E,Song3,Song4), + Track3 is Track1+1, + renderm1(Form1,Melody,MelodyParts3,Track3,Track2,Song4,Song2),!. + + +renderm21(_Form1,_Melody,[],_Track,Bar,Bar,Voice,Voice) :- !. +renderm21(Form1,Melody,MelodyParts1,Track,Bar1,Bar2,Voice1,Voice2) :- + MelodyParts1=[MelodyPart2|MelodyParts3], + MelodyPart2=[Section1,[_,_],1], + longtoshortform(Section1,Section2), + renderm22(Section2,Melody,Track,Bar1,Bar3,Voice1,Voice3), + renderm21(Form1,Melody,MelodyParts3,Track,Bar3,Bar2,Voice3,Voice2),!. 
+ +renderm21(Form1,Melody,MelodyParts1,Track,Bar1,Bar2,Voice1,Voice2) :- + MelodyParts1=[MelodyPart2|MelodyParts3], + MelodyPart2=[_Section,[_,_],0], + Bar3 is Bar1+4, + renderm21(Form1,Melody,MelodyParts3,Track,Bar3,Bar2,Voice1,Voice2),!. + +renderm22(Section2,Melody1,_Track,Bar,Bar,Voice,Voice) :- + findall(Melody2,(member(Melody2,Melody1), + Melody2=[Section1|_],longtoshortform(Section1,Section2)),List2),List2=[],!. +renderm22(Section1,Melody1,Track,Bar1,Bar2,Voice1,Voice2) :- + /** + findall(Lyrics2,(member(Lyrics2,Lyrics1), + Lyrics2=[Section1|_]),Lyrics3), + string_concat(Section2A,_C,Section1), + string_length(Section2A,1), + atom_string(Section2,Section2A), + **/ + findall(Melody2,(member(Melody2,Melody1), + Melody2=[Section1|_]),Melody3), + %%Lyrics3=[[_, Lyrics1A,Lyrics2A,Lyrics3A,Lyrics4A]], + Melody3=[[_, Melody1A], [_, Melody2A]], + renderline1m(Melody1A,Track,Bar1,Voice1,Voice3), + Bar3 is Bar1+1, + renderline1m(Melody2A,Track,Bar3,Voice3,Voice4), + Bar4 is Bar3+1, + renderline1m(Melody1A,Track,Bar4,Voice4,Voice5), + Bar5 is Bar4+1, + renderline1m(Melody2A,Track,Bar5,Voice5,Voice6), + Bar6 is Bar5+1, + delete(Melody3,[Section1|_],Melody5), + renderm22(Section1,Melody5,Track,Bar6,Bar2,Voice6,Voice2). + +renderline1m(Melody1,Track,Bar1,Voice1,Voice2) :- + %%Lyrics1=[_,Lyrics2], + %%Rests is Totallength-Lyrics2Length, + BarTimes1=["0","1/2","1","1+1/2","2","2+1/2","3","3+1/2"], + generatemelodym(Melody1,Melody2), + renderline2(BarTimes1,BarTimes2,Melody2,Track,Bar1,Voice1,Voice3), + renderlinerests(BarTimes2,_BarTimes3,Track,Bar1,Voice3,Voice2). + +generatemelodym(Melody1,Melody2) :- + generatemelody1(Melody1,[],Melody3), + length(Melody3,Melody2Length), + Lyrics2Length=8, + changelength(Lyrics2Length,Melody2Length,Melody3,Melody2). + +renderh1(_Form1,_Harmony,[],_Track1,_Track2,Song,Song) :- !. 
+renderh1(Form1,Harmony,HarmonyParts1,Track1,Track2,Song1,Song2) :- + %%length(Form1,FormLength), + HarmonyParts1=[HarmonyParts2|HarmonyParts3], + HarmonyParts2=[[_A,[InstrumentNumber1,_B],_C]|_D], + InstrumentNumber2 is InstrumentNumber1,%% + 1, + printheader(Track1,[InstrumentNumber2,_],Song1,Song3), + %%renderh21(Form1,Harmony,HarmonyParts1,Track1,1,_,Song3,Song4), + renderh21(Form1,Harmony,HarmonyParts2,Track1,1,_E,Song3,Song4), + Track3 is Track1+1, + renderh1(Form1,Harmony,HarmonyParts3,Track3,Track2,Song4,Song2),!. + + +renderh21(_Form1,_Harmony,[],_Track,Bar,Bar,Voice,Voice) :- !. +renderh21(Form1,Harmony,HarmonyParts1,Track,Bar1,Bar2,Voice1,Voice2) :- + HarmonyParts1=[HarmonyPart2|HarmonyParts3], + HarmonyPart2=[Section1,[_,_],1], + longtoshortform(Section1,Section2), + renderh22(Section2,Harmony,Track,Bar1,Bar3,Voice1,Voice3), + renderh22(Section2,Harmony,Track,Bar3,Bar4,Voice3,Voice4), + renderh21(Form1,Harmony,HarmonyParts3,Track,Bar4,Bar2,Voice4,Voice2),!. + +renderh21(Form1,Harmony,HarmonyParts1,Track,Bar1,Bar2,Voice1,Voice2) :- + HarmonyParts1=[HarmonyPart2|HarmonyParts3], + HarmonyPart2=[_Section,[_,_],0], + Bar3 is Bar1+4, + renderh21(Form1,Harmony,HarmonyParts3,Track,Bar3,Bar2,Voice1,Voice2),!. + +renderh22(Section2,Harmony1,_Track,Bar,Bar,Voice,Voice) :- + + %%longtoshortform(Section21,Section2), + findall(Harmony2,(member(Harmony3,Harmony1),member(Harmony2,Harmony3), + Harmony2=[Section1|_],longtoshortform(Section1,Section2)),List2),List2=[],!. 
+renderh22(Section1,Harmony1A,Track,Bar1,Bar2,Voice1,Voice2) :- + Harmony1A=[Harmony1A1|Harmony1A2], + /** + findall(Lyrics2,(member(Lyrics2,Lyrics1), + Lyrics2=[Section1|_]),Lyrics3), + string_concat(Section2A,_C,Section1), + string_length(Section2A,1), + atom_string(Section2,Section2A), + **/ + + %% shorten + %%string_concat(B11,_C1,Section11), + %%string_length(B11,1),atom_string(Section1,B11), + + + + findall(Harmony2,(member(Harmony2,Harmony1A1), + Harmony2=[Section1|_] + %%,string_concat(B1,_C,Section1A), + %%string_length(B1,1),atom_string(Section1,B1) + ),Harmony3), + %% shorten + + + + (not(Harmony3=[])-> + %%Lyrics3=[[_, Lyrics1A,Lyrics2A,Lyrics3A,Lyrics4A]], + %%Harmony3=[[_, Harmony1A], [_, Harmony2A]], + (renderlines1h(Harmony3,Track,Bar1,Voice1,Voice3), + Bar3 is Bar1+1);(Bar3 is Bar1,Voice3=Voice1)), + /** + Bar3 is Bar1+1, + renderline1h(Harmony2A,Track,Bar3,Voice3,Voice4), + Bar4 is Bar3+1, + renderline1h(Harmony1A,Track,Bar4,Voice4,Voice5), + Bar5 is Bar4+1, + renderline1h(Harmony2A,Track,Bar5,Voice5,Voice6), + Bar6 is Bar5+1, + **/ + %%delete(Harmony3,[Section1|_],Harmony5), + renderh22(Section1,Harmony1A2,Track,Bar3,Bar2,Voice3,Voice2). + +renderlines1h([],_Track,_Bar,Voice,Voice) :- !. +renderlines1h(Harmony1,Track,Bar1,Voice1,Voice2) :- + %%Lyrics1=[_,Lyrics2], + %%Rests is Totallength-Lyrics2Length, + BarTimes1=["0","1/2","1","1+1/2","2","2+1/2","3","3+1/2"], + generatemelodyh(Harmony1,Harmony2), +%%writeln(generatemelodyh(Harmony1,Harmony2)), + renderlineh2(BarTimes1,BarTimes2,Harmony2,Track,Bar1,Voice1,Voice3), + renderlinerests(BarTimes2,_BarTimes3,Track,Bar1,Voice3,Voice2). + +generatemelodyh(Harmony1,Harmony2) :- + generatemelodyh1(Harmony1,[],Harmony3), + length(Harmony3,Harmony2Length), + Lyrics2Length=8, + changelengthh(Lyrics2Length,Harmony2Length, + Harmony3,Harmony2). + +generatemelodyh1([],Melody,Melody) :- !. 
+generatemelodyh1(Melody1,Melody2,Melody3) :- + Melody1=[Melody4|Melody5], + append(Melody2,[Melody4,Melody4],Melody6), + generatemelodyh1(Melody5,Melody6,Melody3),!. + +%%changelengthh(_Melody2Length,Melody,Melody) :- !. +changelengthh(Lyrics2Length,Melody2Length,Melody,Melody) :- + Lyrics2Length=Melody2Length,!. +changelengthh(Lyrics2Length,Melody2Length,Melody1,Melody2) :- + Lyrics2Length > Melody2Length, + Length is Lyrics2Length-Melody2Length, + repeatlastnoteh1(Length,Melody1,Melody2). +changelengthh(Lyrics2Length,Melody2Length,Melody1,Melody2) :- + Lyrics2Length < Melody2Length, + length(Melody2,Lyrics2Length), + append(Melody2,_,Melody1). + +repeatlastnoteh1(Length,Melody1,Melody2) :- + reverse(Melody1,Melody3), + Melody3=[Item|_], + repeatlastnoteh2(Length,Item,Melody1,Melody2),!. + +repeatlastnoteh2(0,_Item,Melody,Melody) :- !. +repeatlastnoteh2(Length1,Item,Melody1,Melody2) :- + append(Melody1,[Item],Melody3), + Length2 is Length1-1, + repeatlastnoteh2(Length2,Item,Melody3,Melody2),!. + +renderlineh2(BarTimes,BarTimes,[],_Track,_Bar1,Voice,Voice) :- !. +renderlineh2(BarTimes1,BarTimes4,Melody1,Track,Bar,Voice1,Voice2) :- + Melody1=[Melody2|Melody3], + Melody2=[_,[Melody21,Melody22,Melody23]], + BarTimes1=[BarTimes2|BarTimes3], + concat_list3("BA ",[Bar," CR ",BarTimes2," TR ",Track," CH ",Track," NT ",Melody21,"- 1/2 voff=0\n"],Song1), + concat_list3("BA ",[Bar," CR ",BarTimes2," TR ",Track," CH ",Track," NT ",Melody22,"- 1/2 voff=0\n"],Song2), + concat_list3("BA ",[Bar," CR ",BarTimes2," TR ",Track," CH ",Track," NT ",Melody23,"- 1/2 voff=0\n"],Song3), + concat_list3(Voice1,[Song1,Song2,Song3],Voice3), + renderlineh2(BarTimes3,BarTimes4,Melody3,Track,Bar,Voice3,Voice2). 
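`renderlineh2/7` above writes each harmony step as three simultaneous note events at the same bar and crotchet offset, one text line per note, in the score format that `./asc2mid` converts to MIDI. With illustrative values (bar 2, offset 0, track 3, a C major triad; the actual notes come from `harmony1/5`), the emitted lines would have this shape:

```
BA 2 CR 0 TR 3 CH 3 NT C- 1/2 voff=0
BA 2 CR 0 TR 3 CH 3 NT E- 1/2 voff=0
BA 2 CR 0 TR 3 CH 3 NT G- 1/2 voff=0
```

Because all three lines share the same `BA`/`CR` position, asc2mid starts the notes together, producing a chord rather than a melodic run.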
+
+outputlyrics(File1,Lyrics1) :-
+ Lyrics1=[Lyrics2|_Lyrics3],
+ Lyrics2=[_|Lyrics4],
+ Lyrics4=[Lyrics5|_Lyrics6],
+ sentencewithspaces(Lyrics5,Lyrics7A),
+ string_concat(Lyrics7A,"\n\n",Lyrics7),
+ outputlyrics1(Lyrics1,Lyrics7,Lyrics8),
+ concat_list3(File1,["lyrics.txt"],File2),
+ open_s(File2,write,Stream),
+ write(Stream,Lyrics8),
+ close(Stream),
+%%texttobr2(u,File2,u,u),texttobr(u,File2,u,u).
+ writeln(["Texttobr, Texttobr2 not working. Please manually breason out ",File2]).
+
+outputlyrics1([],Lyrics,Lyrics) :- !.
+outputlyrics1(Lyrics1,Lyrics5,Lyrics6) :-
+ Lyrics1=[Lyrics2|Lyrics3],
+ Lyrics2=[_|Lyrics4],
+ outputlyrics2(Lyrics4,Lyrics5,Lyrics7),
+ string_concat(Lyrics7,"\n",Lyrics8),
+ outputlyrics1(Lyrics3,Lyrics8,Lyrics6).
+
+outputlyrics2([],Lyrics,Lyrics) :- !.
+outputlyrics2(Lyrics1,Lyrics5,Lyrics6) :-
+ Lyrics1=[Lyrics2|Lyrics3],
+ sentencewithspaces(Lyrics2,Lyrics4),
+ %%writeln(Lyrics4),
+ concat_list3(Lyrics5,[Lyrics4,"\n"],Lyrics7),
+ outputlyrics2(Lyrics3,Lyrics7,Lyrics6).
+
+max([],M,M) :- !.
+max(List1,Max1,Max2) :-
+ List1=[Item|List2],
+ Item>=Max1,
+ max(List2,Item,Max2),!.
+max(List1,Max1,Max2) :-
+ List1=[Item|List2],
+ Item<Max1,
+ max(List2,Max1,Max2),!.
+
+append_list(A,[],A) :- !.
+append_list(A,List,B) :-
+ List=[Item|Items],
+ append2(A,[Item],C),
+ append_list(C,Items,B).
+
+append2(A,List,C) :-
+ ((List=[[Item|Items]]->true;List=[Item])->
+ append(A,Item,C);
+ fail),!.
+append2(A,Item,C) :-
+ append_list(A,Item,C),!.
+
+readsc(7).
+readv([["loves"],["is"],["has"],["is","in"],["moves","to"],["nur","tures"],["needs"],["makes"],["lifts"],["finds"],["forms"],["goes","to"],["writes","on"],["reads","on"],["feels"],["is"]]).
+
+reado([["one"],["the","oth","er"],["the","runn","er"],["the","draw","er"],["the","count","er"],["the","graph","er"],["the","test","er"],["the","breaths","on","er"],["the","writ","er"],["the","spell","er"],["the","updat","er"],["the","check","er"],["the","choos","er"],["the","ess","ence"],["the","comb","in","er"],["the","mir","ac","le"],["the","trans","lat","or"],["the","gramm","ar"]]).
+
+rhymes([["one","er","or","ar","an","ae","er","ler","ur","ard","ney","ald","ess","el"],["le","py","ye","ee","ice"]]).
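The `rhymes/1` fact above groups final syllables into rhyme classes: two lyric lines rhyme when their last syllables fall in the same class. A self-contained sketch of the membership test (`same_rhyme_class/2` is a hypothetical name mirroring the `rhymes2/2` helper used in these files; it assumes the `rhymes/1` fact above):

```prolog
%% same_rhyme_class(+Syllable1, +Syllable2): succeeds when both
%% syllables occur in one of the classes listed in rhymes/1.
same_rhyme_class(Syllable1, Syllable2) :-
    rhymes(Classes),
    member(Class, Classes),
    member(Syllable1, Class),
    member(Syllable2, Class).

%% e.g. same_rhyme_class("er", "or") holds via the first class.
```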
+ +%%removetoolongandnotrhyming(Lyrics1,SyllableCount,[],Verbs,Verbs) :- !. +/**removetoolongandnotrhyming(Lyrics0,Lyrics1,SyllableCount,Verbs1,Verbs2,Objects1,Objects2) :- +%% makes list of all combos, checks, asks, asks + removetoolong1(Lyrics1,SyllableCount, + Verbs1,[],Verbs3), %% do after end + * %% not until object + %% find v x + removenotrhyming1(Lyrics0,Verbs3, + Verbs4), + + removetoolong1(Lyrics1,SyllableCount,[],Verbs,Verbs) :- !.%* + removetoolong1(Lyrics1,SyllableCount,Verbs1,Verbs2,Verbs3) :- + Verbs1=[Verb4|Verbs5], + append_list(Lyrics1,Verb4,Lyrics2), + length(Lyrics2,Length), + (Length<=SyllableCount->(append(Verbs2, [Verb4],Verbs6), removetoolong1(Lyrics1,SyllableCount,Verbs5,Verbs6,Verbs3));(removetoolong1(Lyrics1,SyllableCount,Verbs5,Verbs2,Verbs3))). + +removenotrhyming1(Lyrics,Verbs1,Verbs2) :- + rhymes(Rhymes1) + length(Lyrics,Length), + Line1 is mod(Length,4), + (Line1 is 3-1-> + (Line2 is 3-2, + removenotrhyming15(Line2,Rhymes1, + Rhymes3), + removenotrhyming2(Rhymes3, + Verbs1,[],Verbs2)); + (Line1 is 4-1-> + (removenotrhyming15(Line2,Rhymes1, + Rhymes3), + removenotrhyming2(Rhymes3, + Verbs1,[],Verbs2));Verbs1=Verbs2) + ). + +removenotrhyming15(Line2,Rhymes1,Rhymes3) :- + length(List1,Line2), + append(List1,_,Lyrics), + reverse(List1,List2), + List2=[Item1|_Items2], + reverse(Item1,Item2), + Item2=[Item3|_Items4], + member(Rhymes2,Rhymes1), + member(Item3,Rhymes2), + delete(Rhymes2,Item3,Rhymes3) + +removenotrhyming2(_Rhymes3,[],Verbs,Verbs) :- !. +removenotrhyming2(Rhymes3,Verbs1,Verbs2,Verbs3) :- + (Verbs1=[Verb4|Verbs5], + (member(Verb4,Rhymes3) + delete(Rhymes3,Verb4,Rhymes4)) + ->(append(Verbs2,[Verb4],Verbs6), + removenotrhyming2(Rhymes4,Verbs5, + Verbs6,Verbs3)); + removenotrhyming2(Rhymes3,Verbs5, + Verbs2,Verbs3)). + **/ + +/**string(String) --> list(String). + +list([]) --> []. +list([L|Ls]) --> [L], list(Ls). + +comment(fiftyastest). +comment(turnoffas). +**/ + +remove_dups([],List,List) :- !. 
+remove_dups(List1,List2,List3) :-
+ List1=[Item|List4],
+ delete(List4,Item,List5),
+ append(List2,[Item],List6),
+ remove_dups(List5,List6,List3),!.
+
+shell1(Command) :-
+ (bash_command(Command,_)->
+ true;
+ (writeln(["Failed shell1 command: ",Command]),abort)
+ ),!.
+
+bash_command(Command, Output) :-
+ setup_call_cleanup(process_create(path(bash),
+ ['-c', Command],
+ [stdout(pipe(Out))]),
+ read_string(Out, _, Output),
+ close(Out)).
+
+concat_list2A(A1,B):-
+ A1=[A|List],
+ concat_list2A(A,List,B),!.
+
+concat_list2A(A,[],A):-!.
+concat_list2A(A,List,B) :-
+ List=[Item|Items],
+ string_concat(A,Item,C),
+ concat_list2A(C,Items,B).
+
+
+%mind_read(R,[]) :-
+% R=[[_,['C']]].
+/**
+mind_read(Item,List0) :-
+random_member(Item,List0).
+**/
diff --git a/Music-Composer/mindreadtestmusiccomposer-unusual-mr.pl b/Music-Composer/mindreadtestmusiccomposer-unusual-mr.pl
new file mode 100644
index 0000000000000000000000000000000000000000..c907e53ee21b56fd32549f5738a65769b46d9c92
--- /dev/null
+++ b/Music-Composer/mindreadtestmusiccomposer-unusual-mr.pl
@@ -0,0 +1,1176 @@
+/**
+
+- Form
+- Which chord progression type?
+- Lyrics - random sentences, rhyming
+- Melody - random starting and ending note of first phrase, melody and harmony from chord progression type with three notes (split for melody), the circle of fifths for connection between sections
+- Which of up to 3 instruments for each of melody and harmony
+- What sections they will play
+
+- Melody cut to the length of vocals
+
+**/
+
+:- use_module(library(pio)).
+
+:- use_module(library(date)).
+%%:- include('texttobr2qb').
+%%:- include('mindreadtestshared').
+%%:- include('texttobr2').
+%%:- include('texttobr').
+%%:- include('texttobr2qbmusic').
+
+:- include('musiclibrary').
+:- include('la_strings').
+
+sectest(0) :- !.
+sectest(N1) :-
+ writeln(["\n",N1]),
+ sectest0(_Form1,_Lyrics,_Melody,_Harmony,_MelodyParts,
+ _HarmonyParts,_Vocalstubinstrument,_Song1),
+ N2 is N1-1,
+ sectest(N2),!.
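The comment block at the head of this file summarises the pipeline that `sectest0/8` implements: choose a form, a chord progression type, lyrics, melody and harmony, then instruments and parts, and finally render the score. `sectest/1` simply repeats that pipeline N times, so a typical top-level call (illustrative) is:

```prolog
%% Generate three songs in a row; each run writes a timestamped
%% song...txt score, converts it with ./asc2mid into a .mid file,
%% and writes the matching lyrics and meta files.
?- sectest(3).
```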
+ +sectest0(Form1,Lyrics,Melody,Harmony,MelodyParts,HarmonyParts,Vocalstubinstrument,Song1) :- + check_asc2mid, + + %%texttobr2qb(2), %%Imagine song + form(Form1), + %%Form1=[v2,o], + find("Should the chord progression type be 1451, 1564, 1645, Classical or Classical Pop?",CPT), + remove_dups(Form1,[],Form2), + Voiceparts1=[v1,v2,c,s], + intersection(Form1,[v1,v2,c,s],Voiceparts2), + lyrics(Voiceparts1,Lyrics,Maxlength), + findall(B,(member(A1,Form2),string_concat(B1,_C,A1), + string_length(B1,1),atom_string(B,B1)),Form3), + remove_dups(Form3,[],Form4), + %%repeat, %% in case melody, harmony don't work + melodyharmony(Form4,CPT,Maxlength,Melody,Harmony), + %%writeln(melodyharmony(Form4,CPT,Maxlength,Melody,Harmony)), %% *** + instruments(Form1,MelodyInstruments,HarmonyInstruments, + MelodyParts,HarmonyParts,Vocalstubinstrument), + %%writeln(instruments(Form1,MelodyInstruments,HarmonyInstruments, +%% MelodyParts,HarmonyParts, +%% Vocalstubinstrument)), + %%writeln(rendersong(Form1,Voiceparts2,Maxlength,Melody,Harmony, + %% MelodyInstruments,HarmonyInstruments,MelodyParts, + %% HarmonyParts,Lyrics, + %% Vocalstubinstrument,Song1)), %%, + + rendersong(Form1,Voiceparts2,Maxlength,Melody,Harmony, + MelodyInstruments,HarmonyInstruments,MelodyParts, + HarmonyParts,Lyrics, + Vocalstubinstrument,Song1,File1), %%, + +Meta_file=[[form,Form1],[chord_progressions,CPT],[voice_part,Voiceparts2],[melody,Melody],[harmony,Harmony],[melody_instruments, + MelodyInstruments],[harmony_instruments,HarmonyInstruments],[melody_parts,MelodyParts],[harmony_parts, + HarmonyParts],[lyrics,Lyrics], + [genre,["anthem"]]], + + term_to_atom(Meta_file,Meta_file1), + string_atom(Meta_file2,Meta_file1), + + concat_list("",[File1,"_meta.txt"],File2), + (open_s(File2,write,Stream1), + write(Stream1,Meta_file2), + close(Stream1)),! + . + +%% n intro, vn verse, c chorus, i1 instrumental1, t2, instrumental 2, s solo, o outro + +findall1(Form2,Form3) :- + findall(B,shorten(Form2,B),Form3). 
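`findall1/2` collects the one-letter short forms of the section names via `shorten/2`, so that, for example, `v1` and `v2` share the melody keyed under `v`. A standalone sketch of the same first-character computation (`short_form/2` is a hypothetical name, not part of this file):

```prolog
%% short_form(+Section, -Short): map a section atom to its
%% one-letter form, e.g. v1 -> v, i1 -> i, c -> c.
short_form(Section, Short) :-
    atom_string(Section, String),
    string_concat(First, _Rest, String),
    string_length(First, 1),
    atom_string(Short, First).
```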
+ +shorten(Form2,B) :- + member(A1,Form2),string_concat(B1,_C,A1), + string_length(B1,1),atom_string(B,B1). + +form(Form) :- + Form1=[n,v1], + find("Should a chorus or instrumental come after the first verse?",CorI), + (CorI=c-> + (append(Form1,[c,v2,c,t2,s],Form2), + find(["Should a chorus and outro or two solos come after the first solo?",1],COorSS), + (COorSS=[c,o]-> + (append(Form2,[c],Form3), + append(Form3,[o],Form)); + COorSS=[s,s], + append(Form2,[s],Form3),append(Form3,[s],Form)) + ); + (CorI=i1,append(Form1,[i1,v2,c,t2,s],Form2), + find(["Should a chorus and outro or two solos come after the first solo?",2],COorSS), + (COorSS=[c,o]-> + (append(Form2,[c],Form3), + append(Form3,[o],Form)); + COorSS=[s,s], + append(Form2,[s],Form3),append(Form3,[s],Form)) + )). + +makename(N3) :- + trialy2(["Ack","Luc","Add","All","Brae","Skye","Whist","Dix","Wilb","Duk","Le","Ven", + "Syd","Don","Count","Black","Nei"],R1), + findbest(R1,N1), + trialy2(["an","ae","ye","ler","ee","ur","ard","ice","ney","ald","ess","el"],R2), + findbest(R2,N2), + append([N1],[N2],N3). + +makenames(0,Ns,Ns) :- !. +makenames(Num1,N1,N2) :- + random_member(R1,["Ack","Luc","Add","All","Brae","Skye","Whist","Dix","Wilb","Duk","Le","Ven", + "Syd","Don","Count","Black","Nei"]), + random_member(R2,["an","ae","ye","ler","ee","ur","ard","ice","ney","ald","ess","el"]), + append([R1],[R2],R3), + append(N1,[R3],N3), + Num2 is Num1-1, + makenames(Num2,N3,N2). + +findsentence(Sentence,Names1,Names2) :- + makename(N1),readv(Vs),reado(Os), + makenames(5,[],N2s1),append(N2s1,Names1,N2s), + trialy2(Vs,R1), + findbest(R1,R11), + append(Os,N2s,Os2), + trialy2(Os2,R2), + findbest(R2,R21), + append_list(N1,[R11,R21],Sentence), + (member(R21,N2s1)-> + append(Names1,[R21],Names2); + Names2=Names1). 
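`findsentence/3` above assembles each lyric line as a flat list of syllables: a generated name, then a verb from `readv/1` and an object (or another name) from `reado/1`, flattened together with `append_list/3`. `sentencewithspaces/2` later capitalises the first syllable, joins the rest with single spaces, and appends a full stop, so an illustrative round trip looks like:

```prolog
%% Illustrative query (syllable values made up for the example):
?- sentencewithspaces(["luc","an","loves","one"], S).
%% S = "Luc an loves one."
```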
+
+findrhyme(Phrase1,Phrase2,Names1,Names2) :-
+ reverse(Phrase1,Reverse),
+ Reverse=[Lastsyllable|_],
+ readv(Vs),reado(Os),makenames(5,[],N2s1),
+ append(N2s1,Names1,N2s),
+ append(Os,N2s,Os2),
+ trialy2(Os2,R2),
+ findbest(R2,R21),
+ trialy2(Vs,R1),
+ findbest(R1,R11),
+ findall(C,(member(C,N2s),reverse(C,B),B=[E|_],rhymes2(E,Lastsyllable)),Phrases22),
+ trialy2(Phrases22,R3),
+ findbest(R3,R31),
+ append_list(R21,[R11,R31],Phrase2),
+ (member(R21,N2s1)->
+ append(Names1,[R21],Names3);
+ Names3=Names1),
+ (member(R31,N2s1)->
+ append(Names3,[R31],Names2);
+ Names2=Names3).
+
+rhymes2(Syllable1,Syllable2) :-
+ rhymes(Lists),
+ member(List1,Lists),
+ member(Syllable1,List1),
+ member(Syllable2,List1).
+
+lyrics(Form1,Lyrics,Maxlength) :-
+ %%find("Who is the song about?",Character),
+ %%lyrics1(Form,Character,[],Lyrics).
+ catch((lyrics2(Form1,[],Lyrics,[],_Names,0,Maxlength)),_,
+ lyrics(Form1,Lyrics,Maxlength)).
+
+lyrics2([],Lyrics,Lyrics,Names,Names,Maxlength,Maxlength) :- !.
+lyrics2(Form1,Lyrics1,Lyrics2,Names1,Names2,Maxlength1,Maxlength2) :-
+ Form1=[Form2|Forms3],
+ findsentence(Sentence1,Names1,Names3),
+ length(Sentence1,Length1),
+ findsentence(Sentence2,Names3,Names4),
+ length(Sentence2,Length2),
+ findrhyme(Sentence1,Sentence3,Names4,Names5),
+ length(Sentence3,Length3),
+ findrhyme(Sentence2,Sentence4,Names5,Names6),
+ length(Sentence4,Length4),
+ append(Lyrics1,[[Form2,Sentence1,Sentence2,Sentence3,Sentence4]],Lyrics3),
+ max([Length1,Length2,Length3,Length4],Maxlength1,Maxlength3),
+ lyrics2(Forms3,Lyrics3,Lyrics2,Names6,Names2,Maxlength3,Maxlength2),!.
+
+melodyharmony(Form1,CPT,Maxlength,Melody,Harmony) :-
+ Partlength is Maxlength div 3,
+ Extra is Maxlength mod 3,
+ Total is Partlength+Extra,
+ _Parts=[Partlength,Partlength,Partlength,Total],
+ (CPT=1451->findall(A,note0(_,A),Notes);
+ findall(A,note0(_,A),Notes)),
+ %% What note should the song start on?
+ %%trialy2(Notes,R1), + %%findbest(R1,R11), + R11='A', + melodyharmony(Form1,CPT,_Parts2,R11,_R2,[],Melody,[],Harmony). + +melodyharmony([],_CPT,_Parts,N,N,Melody,Melody,Harmony,Harmony) :- !. +melodyharmony(Form1,CPT,Parts,N1,N2,Melody1,Melody2,Harmony1,Harmony2) :- + Form1=[Form2|Forms3], + findmelody(Form2,CPT,Parts,N1,N3,Melody1,Melody3,Harmony1,Harmony3), + (CPT=1451->harmony1451(N3,3,N4);harmonyr(N3,5,N4)), + findmelody(Form2,CPT,Parts,N4,N5,Melody3,Melody4,Harmony3,Harmony4), + reverse(Melody3,Melody3R),Melody3R=[[_,Melody3RI]|_], %% relies on repeat *1 + reverse(Melody4,Melody4R),Melody4R=[[_,Melody4RI]|_], + not((length(Melody3RI,1),length(Melody4RI,1))), + melodyharmony(Forms3,CPT,Parts,N5,N2,Melody4,Melody2, + Harmony4,Harmony2). + +findmelody(Form,CPT,_Parts,N1,N2,Melody1,Melody2,Harmony1,Harmony2) :- + CPT=1451, + (CPT=1451->findall(A,note0(_,A),Notes); + findall(A,note0(_,A),Notes)), + %% What note should the phrase end on? + repeat, + trialy2(Notes,R1), + findbest(R1,N2), + versechorussoloprogression1451(N1,N2,Progression), + trialy2(Progression,R2), + findbest(R2,Progression2), + not(Progression2=[]), + append(Melody1,[[Form,Progression2]],Melody2), + versechorussoloprogression1451(N1,N2,Progression3), + trialy2(Progression3,R3), + findbest(R3,Progression4), + harmony(Form,CPT,Progression4,Harmony1,Harmony2). +findmelody(Form,CPT,_Parts,N1,N2,Melody1,Melody2,Harmony1,Harmony2) :- + CPT=1564, + (CPT=1451->findall(A,note0(_,A),Notes); + findall(A,note0(_,A),Notes)), + %% What note should the phrase end on? + repeat, %% *1 see other *1 + trialy2(Notes,R1), + findbest(R1,N2), + versechorussoloprogression1564(N1,N2,Progression), + trialy2(Progression,R2), + findbest(R2,Progression2), + not(Progression2=[]), + append(Melody1,[[Form,Progression2]],Melody2), + versechorussoloprogression1564(N1,N2,Progression3), + trialy2(Progression3,R3), + findbest(R3,Progression4), + harmony(Form,CPT,Progression4,Harmony1,Harmony2). 
+findmelody(Form,CPT,_Parts,N1,N2,Melody1,Melody2,Harmony1,Harmony2) :- + CPT=1645, + (CPT=1451->findall(A,note0(_,A),Notes); + findall(A,note0(_,A),Notes)), + %% What note should the phrase end on? + repeat, + trialy2(Notes,R1), + findbest(R1,N2), + versechorussoloprogression1645(N1,N2,Progression), + trialy2(Progression,R2), + findbest(R2,Progression2), + not(Progression2=[]), + append(Melody1,[[Form,Progression2]],Melody2), + versechorussoloprogression1645(N1,N2,Progression3), + trialy2(Progression3,R3), + findbest(R3,Progression4), + harmony(Form,CPT,Progression4,Harmony1,Harmony2). +findmelody(Form,CPT,_Parts,N1,N2,Melody1,Melody2,Harmony1,Harmony2) :- + CPT=classical, + (CPT=1451->findall(A,note0(_,A),Notes); + findall(A,note0(_,A),Notes)), + %% What note should the phrase end on? + repeat, + trialy2(Notes,R1), + findbest(R1,N2), + classicalcomposition(N1,N2,Progression), + trialy2(Progression,R2), + findbest(R2,Progression2), + not(Progression2=[]), + append(Melody1,[[Form,Progression2]],Melody2), + classicalcomposition(N1,N2,Progression3), + trialy2(Progression3,R3), + findbest(R3,Progression4), + harmony(Form,CPT,Progression4,Harmony1,Harmony2). +findmelody(Form,CPT,_Parts,N1,N2,Melody1,Melody2,Harmony1,Harmony2) :- + CPT=classicalpop, + (CPT=1451->findall(A,note0(_,A),Notes); + findall(A,note0(_,A),Notes)), + %% What note should the phrase end on? + repeat, + trialy2(Notes,R1), + findbest(R1,N2), + popclassicalcomposition(N1,N2,Progression), + trialy2(Progression,R2), + findbest(R2,Progression2), + not(Progression2=[]), + append(Melody1,[[Form,Progression2]],Melody2), + popclassicalcomposition(N1,N2,Progression3), + trialy2(Progression3,R3), + findbest(R3,Progression4), + harmony(Form,CPT,Progression4,Harmony1,Harmony2). + +harmony(Form,CPT,Progression,Harmony1,Harmony2) :- + harmony1(Form,CPT,Progression,[],Harmony3), + append(Harmony1,[Harmony3],Harmony2). + +harmony1(_Form,_CPT,[],Harmony,Harmony) :- !. 
+harmony1(Form,CPT,Progression1,Harmony1,Harmony2) :- + Progression1=[Note1|Progression2], + (CPT=1451->(harmony1451(Note1,2,Note2),harmony1451(Note1,4,Note3)); + (harmonyr(Note1,4,Note2),harmonyr(Note1,7,Note3))), + Harmony30=[Note2,Note3], + findall(B1,(member(A1,Harmony30),string_concat(B,_C,A1),string_length(B,1),atom_string(B1,B)),Harmony31), + append([Note1],Harmony31,Harmony3), + append(Harmony1,[[Form,Harmony3]],Harmony4), + harmony1(Form,CPT,Progression2,Harmony4,Harmony2),!. + +harmony1451(Note1,Value2,Note2) :- + note(Value1,Note1),Value3 is Value1+Value2,Value4 is Value3 mod 12,note(Value4,Note2). +harmonyr(Note1,Value2,Note2) :- + note(Value1,Note1),Value3 is Value1+Value2,Value4 is Value3 mod 12,note(Value4,Note2). + +instruments(Form1,MelodyInstruments,HarmonyInstruments,MelodyParts,HarmonyParts,Vocalstubinstrument) :- + instrumentnumber(MelodyInstrumentNumber), + instrumentnumber(HarmonyInstrumentNumber), + instrumentlist(MelodyInstrumentNumber,MelodyInstruments), + instrumentlist(HarmonyInstrumentNumber,HarmonyInstruments), + Vocalstubinstrument=[0,"Acoustic Grand Piano"], + parts(Form1,MelodyInstruments,MelodyParts), + aggregate_all(count, (member(A1,MelodyParts),not(A1=[_,_,0])), Count1), + not(Count1=0), + parts(Form1,HarmonyInstruments,HarmonyParts), + aggregate_all(count, (member(A2,HarmonyParts),not(A2=[_,_,0])), Count2), + not(Count2=0). + +instrumentnumber(NumberofInstruments) :- + %% Number of melody instruments + trialy2([1,2,3],R1), + findbest(R1,NumberofInstruments). + +instrumentlist(NumberofInstruments,Instruments) :- + instrumentlist(NumberofInstruments,[],Instruments),!. + +instrumentlist(0,Instruments,Instruments) :- !. 
+instrumentlist(NumberofInstruments1,Instruments1,Instruments2) :- + Instruments=[[0,"Acoustic Grand Piano"],[1,"Bright Acoustic Piano"],[2,"Electric Grand Piano"],[3,"Honky-Tonk Piano"],[4,"Electric Piano 1"],[5,"Electric Piano 2"],[6,"Harpsichord"],[7,"Clavinet"],[8,"Celesta"],[9,"Glockenspiel"],[10,"Music Box"],[11,"Vibraphone"],[12,"Marimba"],[13,"Xylophone"],[14,"Tubular Bells"],[15,"Dulcimer"],[16,"Drawbar Organ"],[17,"Percussive Organ"],[18,"Rock Organ"],[19,"Church Organ"],[20,"Reed Organ"],[21,"Accordian"],[22,"Harmonica"],[23,"Tango Accordian"],[24,"Nylon Acoustic Guitar"],[25,"Steel Acoustic Guitar"],[26,"Jazz Electric Guitar"],[27,"Clean Electric Guitar"],[28,"Muted Electric Guitar"],[29,"Overdriven Guitar"],[30,"Distortion Guitar"],[31,"Guitar Harmonics"],[32,"Acoustic Bass"],[33,"Electric Bass (finger)"],[34,"Electric Bass (pick)"],[35,"Fretless Bass"],[36,"Slap Bass 1"],[37,"Slap Bass 2"],[38,"Synth Bass 1"],[39,"Synth Bass 2"],[40,"Violin"],[41,"Viola"],[42,"Cello"],[43,"Contrabass"],[44,"Tremolo Strings"],[45,"Pizzicato Strings"],[46,"Orchestral Harp"],[47,"Timpani"],[48,"String Ensemble 1"],[49,"String Ensemble 2"],[50,"Synth Strings 1"],[51,"Synth Strings 2"],[52,"Choir Aahs"],[53,"Voice Oohs"],[54,"Synth Choir"],[55,"Orchestra Hit"],[56,"Trumpet"],[57,"Trombone"],[58,"Tuba"],[59,"Muted Trumpet"],[60,"French Horn"],[61,"Brass Section"],[62,"Synth Brass 1"],[63,"Synth Brass 2"],[64,"Soprano Sax"],[65,"Alto Sax"],[66,"Tenor Sax"],[67,"Baritone Sax"],[68,"Oboe"],[69,"English Horn"],[70,"Bassoon"],[71,"Clarinet"],[72,"Piccolo"],[73,"Flute"],[74,"Recorder"],[75,"Pan Flute"],[76,"Blown Bottle"],[77,"Shakuhachi"],[78,"Whistle"],[79,"Ocarina"],[80,"Lead 1 (square)"],[81,"Lead 2 (sawtooth)"],[82,"Lead 3 (calliope)"],[83,"Lead 4 (chiff)"],[84,"Lead 5 (charang)"],[85,"Lead 6 (voice)"],[86,"Lead 7 (fifths)"],[87,"Lead 8 (bass + lead)"],[88,"Pad 1 (new age)"],[89,"Pad 2 (warm)"],[90,"Pad 3 (polysynth)"],[91,"Pad 4 (choir)"],[92,"Pad 5 (bowed)"],[93,"Pad 6 
(metallic)"],[94,"Pad 7 (halo)"],[95,"Pad 8 (sweep)"],[96,"FX 1 (rain)"],[97,"FX 2 (soundtrack)"],[98,"FX 3 (crystal)"],[99,"FX 4 (atmosphere)"],[100,"FX 5 (brightness)"],[101,"FX 6 (goblins)"],[102,"FX 7 (echoes)"],[103,"FX 8 (sci-fi)"],[104,"Sitar"],[105,"Banjo"],[106,"Shamisen"],[107,"Koto"],[108,"Kalimba"],[109,"Bagpipe"],[110,"Fiddle"],[111,"Shanai"],[112,"Tinkle Bell"],[113,"Agogo"],[114,"Steel Drums"],[115,"Woodblock"],[116,"Taiko Drum"],[117,"Melodic Tom"],[118,"Synth Drum"],[119,"Reverse Cymbal"],[120,"Guitar Fret Noise"],[121,"Breath Noise"],[122,"Seashore"]], + %% ,[123,"Bird Tweet"],[124,"Telephone Ring"],[125,"Helicopter"],[126,"Applause"],[127,"Gunshot"] + trialy2(Instruments,R1), + findbest(R1,Instrument), + append(Instruments1,[Instrument],Instruments3), + NumberofInstruments2 is NumberofInstruments1-1, + instrumentlist(NumberofInstruments2,Instruments3,Instruments2),!. + +parts(Form,Instruments,Parts) :- + parts1(Form,Instruments,[],Parts),!. + +parts1(_Form,[],Parts,Parts) :- !. +parts1(Form,Instruments1,Parts1,Parts2) :- + Instruments1=[Instrument2|Instruments3], + parts2(Form,Instrument2,[],Part3), + append(Parts1,[Part3],Parts4), + parts1(Form,Instruments3,Parts4,Parts2),!. + +parts2([],_Instrument,Part,Part) :- !. +parts2(Form1,Instrument,Part1,Part2) :- + Form1=[Section|Form2], + (member([Section,Instrument,Playing],Part1)->true; + %% Is the instrument playing in the section? + (trialy2([0,1],R1), + findbest(R1,Playing))), + append(Part1,[[Section,Instrument,Playing]],Part3), + parts2(Form2,Instrument,Part3,Part2),!. + +/** +concat_list(A,[],A):-!. +concat_list(A,List,B) :- + List=[Item|Items], + string_concat(A,Item,C), + concat_list(C,Items,B). +**/ + +concat_list(A,[],A) :-!. +concat_list(A,List,B) :- + List=[Item|Items], + concat_list2(A,[Item],C), + concat_list(C,Items,B). +concat_list2(A,List,C) :- + ((List=[[Item|Items]]->true;List=[Item])-> + concat_list0(A,[Item|Items],C); + fail),!. 
+concat_list2(A,Item,C) :- + concat_list(A,Item,C),!. + +concat_list0([],""):-!. +concat_list0(A1,B):- + A1=[A|List], + concat_list0(A,List,B),!. + +concat_list0(A,[],A):-!. +concat_list0(A,List,B) :- + List=[Item|Items], + string_concat(A,Item,C), + concat_list0(C,Items,B). + +sentencewithspaces(Sentence1,Sentence2) :- + Sentence1=[Item|Items], + string_concat(Firstletter1,Rest,Item), + string_length(Firstletter1,1), + upcase_atom(Firstletter1,Firstletter2), + concat_list(Firstletter2,[Rest,""],Item2), + sentencewithspaces(Items,Item2,Sentence3), + string_concat(Sentence3,".",Sentence2). + +sentencewithspaces([],Sentence,Sentence) :- !. +sentencewithspaces(Sentence1,Sentence2,Sentence3) :- + Sentence1=[Word|Sentence4], + concat_list("",[Sentence2," ",Word],Item2), + sentencewithspaces(Sentence4,Item2,Sentence3),!. + +rendersong(Form1,Voiceparts,_Maxlength,Melody, + Harmony,MelodyInstruments, + HarmonyInstruments,MelodyParts1, + HarmonyParts,Lyrics,Vocalstubinstrument,Song1,File1) :- + Totallength=8, + Voicetrack=1, + length(MelodyInstruments,MelodyInstrumentsLength), + length(HarmonyInstruments,HarmonyInstrumentsLength), + TracksNumber is MelodyInstrumentsLength+HarmonyInstrumentsLength+2, + Lyrics=[[_,Sentence1|_]|_],sentencewithspaces(Sentence1,Sentence2), + concat_list("format=1 tracks=", [TracksNumber, " division=384\n\nBA 1 CR 0 TR 0 CH 16 Text type 2: \"Produced by Mind Reading Music Composer by Lucian Academy (Mind Reading Mode)\"\nBA 1 CR 0 TR 0 CH 16 Text type 3: \"", Sentence2, "\"\nBA 1 CR 0 TR 0 CH 1 Channel volume 127\nBA 1 CR 0 TR 0 CH 16 Tempo 63.00009\n"], Song2), + printheader(Voicetrack,Vocalstubinstrument,Song2,Song3), + %%writeln(renderv1(Form1,Voiceparts,_,Lyrics,Melody, + %% Totallength,Voicetrack,1,_,Song3,Song4)), %% **** + renderv1(Form1,Voiceparts,_,Lyrics,Melody, + Totallength,Voicetrack,1,_,Song3,Song4), + Track2 is Voicetrack+1, + renderm1(Form1,Melody,MelodyParts1, + Track2,_Track3,Song4,Song5), + Track4 is MelodyInstrumentsLength+2, 
+ renderh1(Form1,Harmony,HarmonyParts, + Track4,_Track5,Song5,Song6), + length(Form1,FormLength), + TotalBars is 4*FormLength+1, + concat_list(Song6,["BA ",TotalBars," CR 0 TR 0 CH 16 End of track"],Song1), + get_time(TS),stamp_date_time(TS,date(Year,Month,Day,Hour1,Minute1,Seconda,_A,_TZ,_False),local), + concat_list("song",[Year,Month,Day,Hour1,Minute1,Seconda],File1), + concat_list(File1,[".txt"],File2), + concat_list(File1,[".mid"],File3), + open_s(File2,write,Stream), + write(Stream,Song1), + close(Stream), + concat_list("./asc2mid ",[File2," > ",File3],Command), + shell1_s(Command), + %%writeln(["Texttobr, Texttobr2 not working. Please manually breason out ",File2]), + %%N is 4, M is 4000, texttobr2(N,File2,u,M),texttobr(N,File2,u,M), + outputlyrics(File1,Lyrics), + +/** + texttobr2qb(3), %% give self breasonings + texttobr2qb(20), %%Feature 1 + texttobr2qb(20), %%Updates + texttobr2qb(20), %%Feature 2 + texttobr2qb(20), %%Updates + texttobr2qb(20), %%Feature 3 + texttobr2qb(20), %%Updates + texttobr2qb(100), %%Icon + texttobr2qb(20), %%Updates + texttobr2qb(32), %%Lyrics + texttobr2qb(36), %%Music + texttobr2qb(20), %%Updates + + texttobr2qb(2), %%Medicine + texttobr2qb(20), %%Updates + texttobr2qb(2), %%Sales + texttobr2qb(20), %%Updates + texttobr2qb(2), %%Marketing + texttobr2qb(20), %%Updates + + texttobr2qb(2), %%Graciously give or blame listener for colour imagery + texttobr2qb(20), %%Updates + + texttobr2qb(2), %%Play song + texttobr2qb(2), %%Like song + **/ +!. + +printheader(Voicetrack,Vocalstubinstrument,Song1,Song2) :- + Vocalstubinstrument=[N1,_], + N2 is N1+1, + concat_list("BA 1 CR 0 TR ",[Voicetrack," CH ",Voicetrack," Instrument ",N2,"\nBA 1 CR 0 TR ",Voicetrack," CH ",Voicetrack," NT R 0 von=127 voff=0\n"],Song3), + string_concat(Song1,Song3,Song2). + +renderv1([],_Voiceparts,_Vocalstubinstrument,_Lyrics,_Melody,_Totallength,_Track,_Bar,_Bar2,Voice,Voice) :- !. 
+renderv1(Form1,Voiceparts1,Vocalstubinstrument,Lyrics,Melody,Totallength,Track,Bar1,Bar2,Voice1,Voice2) :- + Form1=[Section|Form2], + Voiceparts1=[Section|Voiceparts2], + renderv2(Section,Lyrics,Melody,Totallength,Track, + Bar1,Bar3,Voice1,Voice3), + renderv1(Form2,Voiceparts2,Vocalstubinstrument,Lyrics,Melody, + Totallength,Track,Bar3,Bar2,Voice3,Voice2),!. + +renderv1(Form1,Voiceparts1,Vocalstubinstrument,Lyrics,Melody,Totallength, +Track,Bar1,Bar2,Voice1,Voice2) :- + Form1=[Section1|Form2], + ((Voiceparts1=[Section2|_Voiceparts2], + not(Section1=Section2))->true;Voiceparts1=[]), + Bar3 is Bar1+4, + renderv1(Form2,Voiceparts1,Vocalstubinstrument,Lyrics,Melody, + Totallength,Track,Bar3,Bar2,Voice1,Voice2),!. + +renderv2(Section,Lyrics,_Melody,_Totallength,_Track,Bar,Bar,Voice,Voice) :- + not(member([Section|_],Lyrics)),!. +renderv2(Section1,Lyrics1,Melody1,_Totallength0,Track,Bar1,Bar2,Voice1,Voice2) :- + findall(Lyrics2,(member(Lyrics2,Lyrics1), + Lyrics2=[Section1|_]),Lyrics3), + string_concat(Section2A,_C,Section1), + string_length(Section2A,1), + atom_string(Section2,Section2A), + findall(Melody2,(member(Melody2,Melody1), + Melody2=[Section2|_]),Melody3), + Lyrics3=[[_, Lyrics1A,Lyrics2A,Lyrics3A,Lyrics4A]], + Melody3=[[_, Melody1A], [_, Melody2A]], + renderline1(Lyrics1A,Melody1A,_Totallength1,Track,Bar1,Voice1,Voice3), + Bar3 is Bar1+1, + renderline1(Lyrics2A,Melody2A,_Totallength2,Track,Bar3,Voice3,Voice4), + Bar4 is Bar3+1, + renderline1(Lyrics3A,Melody1A,_Totallength3,Track,Bar4,Voice4,Voice5), + Bar5 is Bar4+1, + renderline1(Lyrics4A,Melody2A,_Totallength4,Track,Bar5,Voice5,Voice6), + Bar6 is Bar5+1, + delete(Lyrics1,[Section1|_],Lyrics5), + renderv2(Section1,Lyrics5,Melody1,_Totallength,Track,Bar6,Bar2,Voice6,Voice2). 
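The lyric lines threaded through renderv2/9 are produced by sentencewithspaces/2, which upcases the first letter of the first word, joins the words with single spaces, and appends a full stop. A minimal illustrative re-implementation in Python (the function name here is mine, not part of the source):

```python
# Illustrative sketch (not the project's Prolog): sentencewithspaces/2
# capitalises the first word, joins the words with spaces and ends with ".".

def sentence_with_spaces(words: list[str]) -> str:
    first, rest = words[0], words[1:]
    head = first[:1].upper() + first[1:]   # upcase only the first letter
    return " ".join([head] + rest) + "."

print(sentence_with_spaces(["all", "ney", "goes", "to", "the"]))
# All ney goes to the.
```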
+ +renderline1(Lyrics2,Melody1,_Totallength,Track,Bar1,Voice1,Voice2) :- + %%Lyrics1=[_,Lyrics2], + length(Lyrics2,Lyrics2Length), + %%Rests is Totallength-Lyrics2Length, + BarTimes1=["0","1/2","1","1+1/2","2","2+1/2","3","3+1/2"], + generatemelody(Lyrics2Length,Melody1,Melody2), + renderline2(BarTimes1,BarTimes2,Melody2,Track,Bar1,Voice1,Voice3), + renderlinerests(BarTimes2,_BarTimes3,Track,Bar1,Voice3,Voice2). + +generatemelody(Lyrics2Length,Melody1,Melody2) :- + generatemelody1(Melody1,[],Melody3), + length(Melody3,Melody2Length), + changelength(Lyrics2Length,Melody2Length,Melody3,Melody2). + +generatemelody1([],Melody,Melody) :- !. +generatemelody1(Melody1,Melody2,Melody3) :- + Melody1=[Melody4|Melody5], + append(Melody2,[Melody4,Melody4],Melody6), + generatemelody1(Melody5,Melody6,Melody3),!. + +%%changelength(_Melody2Length,Melody,Melody) :- !. +changelength(Lyrics2Length,Melody2Length,Melody,Melody) :- + Lyrics2Length=Melody2Length,!. +changelength(Lyrics2Length,Melody2Length,Melody1,Melody2) :- + Lyrics2Length > Melody2Length, + Length is Lyrics2Length-Melody2Length, + repeatlastnote1(Length,Melody1,Melody2). +changelength(Lyrics2Length,Melody2Length,Melody1,Melody2) :- + Lyrics2Length < Melody2Length, + length(Melody2,Lyrics2Length), + append(Melody2,_,Melody1). + +repeatlastnote1(Length,Melody1,Melody2) :- + reverse(Melody1,Melody3), + Melody3=[Item|_], + repeatlastnote2(Length,Item,Melody1,Melody2),!. + +repeatlastnote2(0,_Item,Melody,Melody) :- !. +repeatlastnote2(Length1,Item,Melody1,Melody2) :- + append(Melody1,[Item],Melody3), + Length2 is Length1-1, + repeatlastnote2(Length2,Item,Melody3,Melody2),!. + +renderline2(BarTimes,BarTimes,[],_Track,_Bar1,Voice,Voice) :- !. 
+renderline2(BarTimes1,BarTimes4,Melody1,Track,Bar,Voice1,Voice2) :- + Melody1=[Melody2|Melody3], + BarTimes1=[BarTimes2|BarTimes3], + concat_list("BA ",[Bar," CR ",BarTimes2," TR ",Track," CH ",Track," NT ",Melody2,"- 1/2 voff=0\n"],Song), + string_concat(Voice1,Song,Voice3), + renderline2(BarTimes3,BarTimes4,Melody3,Track,Bar,Voice3,Voice2). + +renderlinerests(BarTimes,BarTimes,_Track,_Bar1,Voice,Voice) :- !. +renderlinerests(BarTimes1,BarTimes2,Track,Bar,Voice1,Voice2) :- + BarTimes1=[BarTimes2|BarTimes3], + concat_list("BA ",[Bar," CR ",BarTimes2," TR ",Track," CH ",Track," NT R 1/2 voff=0\n"],Song), + string_concat(Voice1,Song,Voice3), + renderlinerests(BarTimes3,BarTimes2,Track,Bar,Voice3,Voice2). + +longtoshortform(Section1,Section2) :- + string_concat(Section2A,_C,Section1), + string_length(Section2A,1), + atom_string(Section2,Section2A),!. + + +renderm1(_Form1,_Melody,[],_Track1,_Track2,Song,Song) :- !. +renderm1(Form1,Melody,MelodyParts1,Track1,Track2,Song1,Song2) :- + %%length(Form1,FormLength), + MelodyParts1=[MelodyParts2|MelodyParts3], + MelodyParts2=[[_A,[InstrumentNumber1,_B],_C]|_D], + InstrumentNumber2 is InstrumentNumber1,%% + 1, + printheader(Track1,[InstrumentNumber2,_],Song1,Song3), + %%renderm21(Form1,Melody,MelodyParts1,Track1,1,_,Song3,Song4), + renderm21(Form1,Melody,MelodyParts2,Track1,1,_E,Song3,Song4), + Track3 is Track1+1, + renderm1(Form1,Melody,MelodyParts3,Track3,Track2,Song4,Song2),!. + + +renderm21(_Form1,_Melody,[],_Track,Bar,Bar,Voice,Voice) :- !. +renderm21(Form1,Melody,MelodyParts1,Track,Bar1,Bar2,Voice1,Voice2) :- + MelodyParts1=[MelodyPart2|MelodyParts3], + MelodyPart2=[Section1,[_,_],1], + longtoshortform(Section1,Section2), + renderm22(Section2,Melody,Track,Bar1,Bar3,Voice1,Voice3), + renderm21(Form1,Melody,MelodyParts3,Track,Bar3,Bar2,Voice3,Voice2),!. 
+ +renderm21(Form1,Melody,MelodyParts1,Track,Bar1,Bar2,Voice1,Voice2) :- + MelodyParts1=[MelodyPart2|MelodyParts3], + MelodyPart2=[_Section,[_,_],0], + Bar3 is Bar1+4, + renderm21(Form1,Melody,MelodyParts3,Track,Bar3,Bar2,Voice1,Voice2),!. + +renderm22(Section2,Melody1,_Track,Bar,Bar,Voice,Voice) :- + findall(Melody2,(member(Melody2,Melody1), + Melody2=[Section1|_],longtoshortform(Section1,Section2)),List2),List2=[],!. +renderm22(Section1,Melody1,Track,Bar1,Bar2,Voice1,Voice2) :- + /** + findall(Lyrics2,(member(Lyrics2,Lyrics1), + Lyrics2=[Section1|_]),Lyrics3), + string_concat(Section2A,_C,Section1), + string_length(Section2A,1), + atom_string(Section2,Section2A), + **/ + findall(Melody2,(member(Melody2,Melody1), + Melody2=[Section1|_]),Melody3), + %%Lyrics3=[[_, Lyrics1A,Lyrics2A,Lyrics3A,Lyrics4A]], + Melody3=[[_, Melody1A], [_, Melody2A]], + renderline1m(Melody1A,Track,Bar1,Voice1,Voice3), + Bar3 is Bar1+1, + renderline1m(Melody2A,Track,Bar3,Voice3,Voice4), + Bar4 is Bar3+1, + renderline1m(Melody1A,Track,Bar4,Voice4,Voice5), + Bar5 is Bar4+1, + renderline1m(Melody2A,Track,Bar5,Voice5,Voice6), + Bar6 is Bar5+1, + delete(Melody3,[Section1|_],Melody5), + renderm22(Section1,Melody5,Track,Bar6,Bar2,Voice6,Voice2). + +renderline1m(Melody1,Track,Bar1,Voice1,Voice2) :- + %%Lyrics1=[_,Lyrics2], + %%Rests is Totallength-Lyrics2Length, + BarTimes1=["0","1/2","1","1+1/2","2","2+1/2","3","3+1/2"], + generatemelodym(Melody1,Melody2), + renderline2(BarTimes1,BarTimes2,Melody2,Track,Bar1,Voice1,Voice3), + renderlinerests(BarTimes2,_BarTimes3,Track,Bar1,Voice3,Voice2). + +generatemelodym(Melody1,Melody2) :- + generatemelody1(Melody1,[],Melody3), + length(Melody3,Melody2Length), + Lyrics2Length=8, + changelength(Lyrics2Length,Melody2Length,Melody3,Melody2). + +renderh1(_Form1,_Harmony,[],_Track1,_Track2,Song,Song) :- !. 
+renderh1(Form1,Harmony,HarmonyParts1,Track1,Track2,Song1,Song2) :- + %%length(Form1,FormLength), + HarmonyParts1=[HarmonyParts2|HarmonyParts3], + HarmonyParts2=[[_A,[InstrumentNumber1,_B],_C]|_D], + InstrumentNumber2 is InstrumentNumber1,%% + 1, + printheader(Track1,[InstrumentNumber2,_],Song1,Song3), + %%renderh21(Form1,Harmony,HarmonyParts1,Track1,1,_,Song3,Song4), + renderh21(Form1,Harmony,HarmonyParts2,Track1,1,_E,Song3,Song4), + Track3 is Track1+1, + renderh1(Form1,Harmony,HarmonyParts3,Track3,Track2,Song4,Song2),!. + + +renderh21(_Form1,_Harmony,[],_Track,Bar,Bar,Voice,Voice) :- !. +renderh21(Form1,Harmony,HarmonyParts1,Track,Bar1,Bar2,Voice1,Voice2) :- + HarmonyParts1=[HarmonyPart2|HarmonyParts3], + HarmonyPart2=[Section1,[_,_],1], + longtoshortform(Section1,Section2), + renderh22(Section2,Harmony,Track,Bar1,Bar3,Voice1,Voice3), + renderh22(Section2,Harmony,Track,Bar3,Bar4,Voice3,Voice4), + renderh21(Form1,Harmony,HarmonyParts3,Track,Bar4,Bar2,Voice4,Voice2),!. + +renderh21(Form1,Harmony,HarmonyParts1,Track,Bar1,Bar2,Voice1,Voice2) :- + HarmonyParts1=[HarmonyPart2|HarmonyParts3], + HarmonyPart2=[_Section,[_,_],0], + Bar3 is Bar1+4, + renderh21(Form1,Harmony,HarmonyParts3,Track,Bar3,Bar2,Voice1,Voice2),!. + +renderh22(Section2,Harmony1,_Track,Bar,Bar,Voice,Voice) :- + + %%longtoshortform(Section21,Section2), + findall(Harmony2,(member(Harmony3,Harmony1),member(Harmony2,Harmony3), + Harmony2=[Section1|_],longtoshortform(Section1,Section2)),List2),List2=[],!. 
+renderh22(Section1,Harmony1A,Track,Bar1,Bar2,Voice1,Voice2) :- + Harmony1A=[Harmony1A1|Harmony1A2], + /** + findall(Lyrics2,(member(Lyrics2,Lyrics1), + Lyrics2=[Section1|_]),Lyrics3), + string_concat(Section2A,_C,Section1), + string_length(Section2A,1), + atom_string(Section2,Section2A), + **/ + + %% shorten + %%string_concat(B11,_C1,Section11), + %%string_length(B11,1),atom_string(Section1,B11), + + + + findall(Harmony2,(member(Harmony2,Harmony1A1), + Harmony2=[Section1|_] + %%,string_concat(B1,_C,Section1A), + %%string_length(B1,1),atom_string(Section1,B1) + ),Harmony3), + %% shorten + + + + (not(Harmony3=[])-> + %%Lyrics3=[[_, Lyrics1A,Lyrics2A,Lyrics3A,Lyrics4A]], + %%Harmony3=[[_, Harmony1A], [_, Harmony2A]], + (renderlines1h(Harmony3,Track,Bar1,Voice1,Voice3), + Bar3 is Bar1+1);(Bar3 is Bar1,Voice3=Voice1)), + /** + Bar3 is Bar1+1, + renderline1h(Harmony2A,Track,Bar3,Voice3,Voice4), + Bar4 is Bar3+1, + renderline1h(Harmony1A,Track,Bar4,Voice4,Voice5), + Bar5 is Bar4+1, + renderline1h(Harmony2A,Track,Bar5,Voice5,Voice6), + Bar6 is Bar5+1, + **/ + %%delete(Harmony3,[Section1|_],Harmony5), + renderh22(Section1,Harmony1A2,Track,Bar3,Bar2,Voice3,Voice2). + +renderlines1h([],_Track,_Bar,Voice,Voice) :- !. +renderlines1h(Harmony1,Track,Bar1,Voice1,Voice2) :- + %%Lyrics1=[_,Lyrics2], + %%Rests is Totallength-Lyrics2Length, + BarTimes1=["0","1/2","1","1+1/2","2","2+1/2","3","3+1/2"], + generatemelodyh(Harmony1,Harmony2), +%%writeln(generatemelodyh(Harmony1,Harmony2)), + renderlineh2(BarTimes1,BarTimes2,Harmony2,Track,Bar1,Voice1,Voice3), + renderlinerests(BarTimes2,_BarTimes3,Track,Bar1,Voice3,Voice2). + +generatemelodyh(Harmony1,Harmony2) :- + generatemelodyh1(Harmony1,[],Harmony3), + length(Harmony3,Harmony2Length), + Lyrics2Length=8, + changelengthh(Lyrics2Length,Harmony2Length, + Harmony3,Harmony2). + +generatemelodyh1([],Melody,Melody) :- !. 
+generatemelodyh1(Melody1,Melody2,Melody3) :- + Melody1=[Melody4|Melody5], + append(Melody2,[Melody4,Melody4],Melody6), + generatemelodyh1(Melody5,Melody6,Melody3),!. + +%%changelengthh(_Melody2Length,Melody,Melody) :- !. +changelengthh(Lyrics2Length,Melody2Length,Melody,Melody) :- + Lyrics2Length=Melody2Length,!. +changelengthh(Lyrics2Length,Melody2Length,Melody1,Melody2) :- + Lyrics2Length > Melody2Length, + Length is Lyrics2Length-Melody2Length, + repeatlastnoteh1(Length,Melody1,Melody2). +changelengthh(Lyrics2Length,Melody2Length,Melody1,Melody2) :- + Lyrics2Length < Melody2Length, + length(Melody2,Lyrics2Length), + append(Melody2,_,Melody1). + +repeatlastnoteh1(Length,Melody1,Melody2) :- + reverse(Melody1,Melody3), + Melody3=[Item|_], + repeatlastnoteh2(Length,Item,Melody1,Melody2),!. + +repeatlastnoteh2(0,_Item,Melody,Melody) :- !. +repeatlastnoteh2(Length1,Item,Melody1,Melody2) :- + append(Melody1,[Item],Melody3), + Length2 is Length1-1, + repeatlastnoteh2(Length2,Item,Melody3,Melody2),!. + +renderlineh2(BarTimes,BarTimes,[],_Track,_Bar1,Voice,Voice) :- !. +renderlineh2(BarTimes1,BarTimes4,Melody1,Track,Bar,Voice1,Voice2) :- + Melody1=[Melody2|Melody3], + Melody2=[_,[Melody21,Melody22,Melody23]], + BarTimes1=[BarTimes2|BarTimes3], + concat_list("BA ",[Bar," CR ",BarTimes2," TR ",Track," CH ",Track," NT ",Melody21,"- 1/2 voff=0\n"],Song1), + concat_list("BA ",[Bar," CR ",BarTimes2," TR ",Track," CH ",Track," NT ",Melody22,"- 1/2 voff=0\n"],Song2), + concat_list("BA ",[Bar," CR ",BarTimes2," TR ",Track," CH ",Track," NT ",Melody23,"- 1/2 voff=0\n"],Song3), + concat_list(Voice1,[Song1,Song2,Song3],Voice3), + renderlineh2(BarTimes3,BarTimes4,Melody3,Track,Bar,Voice3,Voice2). 
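The melody-fitting step above (generatemelody1/3 with changelength/4, and their harmony twins generatemelodyh1/3 and changelengthh/4) doubles each note, then pads by repeating the final note or truncates so the note count matches the lyric's syllable count. A hedged Python sketch of that behaviour (names are mine):

```python
# Illustrative sketch (not the project's Prolog): double each note, then pad
# with the last note or truncate to exactly n_syllables notes.

def fit_melody(melody: list[str], n_syllables: int) -> list[str]:
    doubled = [note for note in melody for _ in range(2)]
    if len(doubled) < n_syllables:
        doubled += [doubled[-1]] * (n_syllables - len(doubled))
    return doubled[:n_syllables]

print(fit_melody(["C", "E", "G"], 8))  # pads with the final note
print(fit_melody(["C", "E", "G"], 4))  # truncates
```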
+
+outputlyrics(File1,Lyrics1) :-
+	Lyrics1=[Lyrics2|_Lyrics3],
+	Lyrics2=[_|Lyrics4],
+	Lyrics4=[Lyrics5|_Lyrics6],
+	sentencewithspaces(Lyrics5,Lyrics7A),
+	string_concat(Lyrics7A,"\n\n",Lyrics7),
+	outputlyrics1(Lyrics1,Lyrics7,Lyrics8),
+	concat_list(File1,["lyrics.txt"],File2),
+	open_s(File2,write,Stream),
+	write(Stream,Lyrics8),
+	close(Stream),
+%%texttobr2(u,File2,u,u),texttobr(u,File2,u,u).
+	writeln(["Texttobr, Texttobr2 not working. Please manually breason out ",File2]).
+
+outputlyrics1([],Lyrics,Lyrics) :- !.
+outputlyrics1(Lyrics1,Lyrics5,Lyrics6) :-
+	Lyrics1=[Lyrics2|Lyrics3],
+	Lyrics2=[_|Lyrics4],
+	outputlyrics2(Lyrics4,Lyrics5,Lyrics7),
+	string_concat(Lyrics7,"\n",Lyrics8),
+	outputlyrics1(Lyrics3,Lyrics8,Lyrics6).
+
+outputlyrics2([],Lyrics,Lyrics) :- !.
+outputlyrics2(Lyrics1,Lyrics5,Lyrics6) :-
+	Lyrics1=[Lyrics2|Lyrics3],
+	sentencewithspaces(Lyrics2,Lyrics4),
+	%%writeln(Lyrics4),
+	concat_list(Lyrics5,[Lyrics4,"\n"],Lyrics7),
+	outputlyrics2(Lyrics3,Lyrics7,Lyrics6).
+
+max([],M,M) :- !.
+max(List1,Max1,Max2) :-
+	List1=[Item|List2],
+	Item>=Max1,
+	max(List2,Item,Max2),!.
+max(List1,Max1,Max2) :-
+	List1=[Item|List2],
+	Item<Max1,
+	max(List2,Max1,Max2),!.
+	%% Item1 etc.
+	%% findbest(R,Item1).
+
+find("Should a chorus or instrumental come after the first verse?",CorI) :-
+	trialy2([c,i1],R),
+	findbest(R,CorI).
+
+find(["Should a chorus and outro or two solos come after the first solo?",1],COorSS) :-
+	trialy2([[c,o],[s,s]],R),
+	findbest(R,COorSS).
+
+find(["Should a chorus and outro or two solos come after the first solo?",2],COorSS) :-
+	trialy2([[c,o],[s,s]],R),
+	findbest(R,COorSS).
+
+%%find("Who is the song about?",Character) :-
+%%	trialy2(["Ma","til","da"],["Jo","se","phine"],["Hap","py"],["Ha","rold"],R),
+%%	findbest(R,Character).
+
+find("Should the chord progression type be 1451, 1564, 1645, Classical or Classical Pop?",CPT) :-
+	trialy2([1451, 1564, 1645
+	%%
+	, classical, classicalpop
+	],R),
+	findbest(R,CPT).
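The chord-progression choice above feeds harmony1/5 earlier in the file: harmony1451/3 adds +2 and +4 semitones to the root when the 1451 type is chosen, and harmonyr/3 otherwise adds +4 and +7 (a major triad), both wrapping mod 12. An illustrative Python sketch of that offset arithmetic (note names and function names are mine):

```python
# Illustrative sketch (not the project's Prolog): companion notes are a fixed
# number of semitones above the root, wrapped around the 12-note octave.

NOTES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]

def harmony_note(root: int, offset: int) -> int:
    """Note `offset` semitones above `root`, mod 12 (as in harmonyr/3)."""
    return (root + offset) % 12

def triad(root: int) -> list[str]:
    """Root plus the +4 and +7 offsets of harmonyr/3: a major triad."""
    return [NOTES[harmony_note(root, o)] for o in (0, 4, 7)]

print(triad(0))  # C major
print(triad(9))  # A major, wrapping past the octave
```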
+ +/**generatelyricslistsverse(Character,Lyrics1,Lyrics2):- +%% read c, o, v + reado(Objects), + readv(Verbs), + + charobj( %% character verb object first + %% object verb object pairs + +lyrics1([],_Character,Lyrics,Lyrics) :- !. +lyrics1(Form1,Character,Lyrics1,Lyrics2) :- + Form1=[Form2|Forms2], + lyrics2(Form2,Character,Lyrics1,Lyrics3), + +lyrics2([instrumental,_],_Character,Lyrics1,Lyrics2) :- + +append_list(Lyrics1,[[],[],[],[]],Lyrics2). +lyrics2([verse,_],Character,Lyrics1,Lyrics2) :- + lyricsv1(Lyrics1,[Character],Lyrics3), + append_list(Lyrics1,[Lyrics3,*],Lyrics2), + . + +lyricsv1(Lyrics0,Lyrics1,Lyrics2) :- + readsc(SyllableCount), + readv(Verbs1), + +reado(Objects1), + **/ +%%removetoolongandnotrhyming(Lyrics0,Lyrics1,SyllableCount,Verbs1,Verbs2,Objects1,Objects2),] +%%findv +%% same for o +%% findo + %% choose words that fit syllable count + +append_list(A,[],A) :-!. +append_list(A,List,B) :- + List=[Item|Items], + append2(A,[Item],C), + append_list(C,Items,B). +append2(A,List,C) :- + (List=[Item]-> + append(A,Item,C); + fail),!. +append2(A,Item,C) :- + append_list(A,Item,C),!. + +readsc(7). +readv([["loves"],["is"],["has"],["is","in"],["moves","to"],["nur","tures"],["needs"],["makes"],["lifts"],["finds"],["forms"],["goes","to"],["writes","on"],["reads","on"],["feels"],["is"]]). + +reado([["one"],["the","oth","er"],["the","runn","er"],["the","draw","er"],["the","count","er"],["the","graph","er"],["the","test","er"],["the","breaths","on","er"],["the","writ","er"],["the","spell","er"],["the","updat","er"],["the","check","er"],["the","choos","er"],["the","ess","ence"],["the","comb","in","er"],["the","mir","ac","le"],["the","trans","lat","or"],["the","gramm","ar"]]). + +rhymes([["one","er","or","ar","an","ae","er","ler","ur","ard","ney","ald","ess","el"],["le","py","ye","ee","ice"]]). + +%%removetoolongandnotrhyming(Lyrics1,SyllableCount,[],Verbs,Verbs) :- !. 
+/**removetoolongandnotrhyming(Lyrics0,Lyrics1,SyllableCount,Verbs1,Verbs2,Objects1,Objects2) :- +%% makes list of all combos, checks, asks, asks + removetoolong1(Lyrics1,SyllableCount, + Verbs1,[],Verbs3), %% do after end + * %% not until object + %% find v x + removenotrhyming1(Lyrics0,Verbs3, + Verbs4), + + removetoolong1(Lyrics1,SyllableCount,[],Verbs,Verbs) :- !.%* + removetoolong1(Lyrics1,SyllableCount,Verbs1,Verbs2,Verbs3) :- + Verbs1=[Verb4|Verbs5], + append_list(Lyrics1,Verb4,Lyrics2), + length(Lyrics2,Length), + (Length<=SyllableCount->(append(Verbs2, [Verb4],Verbs6), removetoolong1(Lyrics1,SyllableCount,Verbs5,Verbs6,Verbs3));(removetoolong1(Lyrics1,SyllableCount,Verbs5,Verbs2,Verbs3))). + +removenotrhyming1(Lyrics,Verbs1,Verbs2) :- + rhymes(Rhymes1) + length(Lyrics,Length), + Line1 is mod(Length,4), + (Line1 is 3-1-> + (Line2 is 3-2, + removenotrhyming15(Line2,Rhymes1, + Rhymes3), + removenotrhyming2(Rhymes3, + Verbs1,[],Verbs2)); + (Line1 is 4-1-> + (removenotrhyming15(Line2,Rhymes1, + Rhymes3), + removenotrhyming2(Rhymes3, + Verbs1,[],Verbs2));Verbs1=Verbs2) + ). + +removenotrhyming15(Line2,Rhymes1,Rhymes3) :- + length(List1,Line2), + append(List1,_,Lyrics), + reverse(List1,List2), + List2=[Item1|_Items2], + reverse(Item1,Item2), + Item2=[Item3|_Items4], + member(Rhymes2,Rhymes1), + member(Item3,Rhymes2), + delete(Rhymes2,Item3,Rhymes3) + +removenotrhyming2(_Rhymes3,[],Verbs,Verbs) :- !. +removenotrhyming2(Rhymes3,Verbs1,Verbs2,Verbs3) :- + (Verbs1=[Verb4|Verbs5], + (member(Verb4,Rhymes3) + delete(Rhymes3,Verb4,Rhymes4)) + ->(append(Verbs2,[Verb4],Verbs6), + removenotrhyming2(Rhymes4,Verbs5, + Verbs6,Verbs3)); + removenotrhyming2(Rhymes3,Verbs5, + Verbs2,Verbs3)). + **/ +%trialy2([],R) :- +% R=[[_,['C']]]. + %%writeln([[],in,trialy2]),abort. 
+trialy2(List,R) :- +%%writeln([list,List]), +%%notrace, + length(List,Length), + ((Length=<9-> + findr4(R4), + number_string(R4,R4A), + formr5([R4A],9,Length,R5), + findr(R5,List,R)); + (Length=<99-> + findr4(R41), + findr4(R42), + formr5([R41,R42],99,Length,R5), + findr(R5,List,R)); + (Length=<999-> + findr4(R41), + findr4(R42), + findr4(R43), + formr5([R41,R42,R43],999,Length,R5), + findr(R5,List,R)); + fail), + %%writeln([r,R]),trace. + true. + +findr4(R4) :- + List1=[0,1,2,3,4,5,6,7,8,9], + Trials is 30, +%catch( + (trialy22(List1,Trials,[],R1), + findbest2(R1,R4) + %writeln1([item,Item]) + ) + %_, + %findr4(R4) + %) + . + %%number_string(R3,R2), +formr5(RList,Upper,Length,R5) :- + %%findall(D,(member(C,RList),floor(C,D)),RList2), + concat_list2A(RList,R5A), + number_string(R5B,R5A), + R51 is floor((R5B/Upper)*Length), + (R5B=Upper->R5 is R51-1;R5=R51). +findr(R4,List,R) :- + %%floor(R4,R4A), + length(A,R4), + append(A,[R|_],List). + + %%random_member(A,List), + %%R=[[_,A]]. + + /** + length(List,L), + Trials is L*3, + trialy22(List,Trials,[],R).**/ + +trialy22([],_,R,R) :- !. +trialy22(List,Trials,RA,RB) :- + List=[Item|Items], + trialy21(Item,Trials,R1), + append(RA,[R1],RC), + trialy22(Items,Trials,RC,RB),!. + +trialy21(Label,Trials,RA) :- + trialy3(Trials,[],R), + aggregate_all(count, member(true,R), Count), + RA=[Count,Label]. + +trialy3(0,R,R) :-!. +trialy3(Trials1,RA,RB) :- + trialy1(R1), + append(RA,[R1],RC), + Trials2 is Trials1-1, + trialy3(Trials2,RC,RB),!. + +%% try other nouns +trialy1(R1) :- + %%control11(A1), + trial0(A22), %% Control + sum(A22,0,S22), + mean(S22,A1), + trial0(A21), %% Test 1 + sum(A21,0,S02), + mean(S02,A2), + (A1>A2->R1=true;R1=fail). + +trial0(S3) :- N is 10, +catch( + (trial1(N,[],S),trial01(S,S3)), + _, + trial0(S3)). 
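trialy2/2 above picks a list element by generating one decimal digit per order of magnitude (findr4/1) and then scaling the concatenated number down to an index with formr5/4. A hedged Python sketch of that scaling step (the function name is mine):

```python
import math

# Illustrative sketch (not the project's Prolog): concatenate the chosen
# digits into a number in 0..upper, then map it to an index into a
# candidate list of the given length, as formr5/4 does.

def form_r5(digits: list[int], upper: int, length: int) -> int:
    n = int("".join(str(d) for d in digits))
    idx = math.floor(n / upper * length)
    # the top of the range would index one past the end, so pull it back
    return idx - 1 if n == upper else idx

print(form_r5([4], 9, 3))  # a middle digit maps to a middle index
print(form_r5([9], 9, 3))  # the top of the range maps to the last index
```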
+trial01(S1,S3) :- + sort(S1,S), + %%midpoint(S,MP), + halves(S,H1,H2), + midpoint(H1,Q1), + midpoint(H2,Q3), + IQR is Q3-Q1, + sum(S,0,S02), + mean(S02,Mean), + furthestfrommean(S,Mean,V), + D1 is 1.5*IQR, + D2 is V-Mean, + (D2>D1->(delete(S,V,S2),trial01(S2,S3));S=S3). + +trial1(0,A,A) :- !. +trial1(N,A,B) :- mindreadtest(S), append(A,[S],A2), + N1 is N-1,trial1(N1,A2,B). + +midpoint(S,MP) :- + length(S,L), + A is mod(L,2), + (A is 0-> + (M1 is L/2, M2 is M1+1,N1 is M1-1,N2 is M2-1,length(N11,N1),length(N21,N2),append(N11,[N12|_Rest1],S),append(N21,[N22|_Rest2],S),MP is (N12+N22)/2) + ; + (L2 is L+1, M1 is L2/2, N1 is M1-1,length(N11,N1),append(N11,[MP|_Rest],S))). + +halves(S,H1,H2) :- + length(S,L), + A is mod(L,2), + (A is 0-> + (M1 is L/2,length(H1,M1),append(H1,H2,S)) + ; + (L2 is L-1,M1 is L2/2,length(H1,M1),append(H1,[_|H2],S))). + +sum([],S,S):-!. +sum(S0,S1,S2) :- + S0=[S3|S4], + S5 is S1+S3, + sum(S4,S5,S2). + +mean(Sum,Mean) :- + Mean is Sum/2. + +furthestfrommean(S,Mean,V) :- + absdiffmean(S,Mean,[],D), + sort(D,D1), + reverse(D1,[[_,V]|_Rest]). + +absdiffmean([],_M,D,D) :- !. +absdiffmean(S,M,D1,D2) :- + S=[S1|S2], + S3 is abs(S1-M), + append(D1,[[S3,S1]],D3), + absdiffmean(S2,M,D3,D2). 
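trial01/2 above repeatedly removes the sample furthest from the mean while it lies more than 1.5 × IQR away, using midpoint/2 (a median) and halves/2 to get the quartiles. An illustrative Python sketch; one assumption differs from the source: mean/2 in the Prolog divides the sum by 2 rather than by the sample count, while a conventional mean is used here.

```python
# Illustrative sketch (not the project's Prolog): iterated 1.5*IQR trimming
# of the sample furthest from the mean, as in trial01/2.

def median(xs: list[float]) -> float:
    xs = sorted(xs)
    n, mid = len(xs), len(xs) // 2
    return xs[mid] if n % 2 else (xs[mid - 1] + xs[mid]) / 2

def trim_outliers(xs: list[float]) -> list[float]:
    xs = sorted(xs)
    while True:
        half = len(xs) // 2                      # halves/2 drops the middle
        q1, q3 = median(xs[:half]), median(xs[len(xs) - half:])
        iqr = q3 - q1
        mean = sum(xs) / len(xs)                 # conventional mean (see note)
        worst = max(xs, key=lambda v: abs(v - mean))
        if abs(worst - mean) > 1.5 * iqr:
            xs.remove(worst)                     # drop the outlier, retry
        else:
            return xs

print(trim_outliers([1.0, 1.1, 0.9, 1.2, 9.0]))
```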
+ +mindreadtest(Sec) :- + %% 250 br for characters to be br out with 10 br each from person to me - do when initial 250 br test done and doing 10 br test + %%comment(fiftyastest), + %%random(X),X1 is 10*X, X2 is floor(X1), (X2=<2 -> ( + %%texttobr,writeln(['true test']), %%); %% use breasonings breasoned out by computer for not by me, for job medicine for "me", at last time point + %%true), %% leave last time point blank + %%**texttobr2(640);true),%% make an A to detect reaction to gracious giving or blame of in following + get_time(TimeStamp1), + %%phrase_from_file(string(_String), 'file.txt'), + texttobr2(2), %% 100 As for answer (must be br before this on same day) + %% is gracious giving or blame + get_time(TimeStamp2), + %%comment(turnoffas), + Sec is TimeStamp2 - TimeStamp1. + +/**string(String) --> list(String). + +list([]) --> []. +list([L|Ls]) --> [L], list(Ls). + +comment(fiftyastest). +comment(turnoffas). +**/ + +remove_dups([],List,List) :- !. +remove_dups(List1,List2,List3) :- + List1=[Item|List4], + delete(List4,Item,List5), + append(List2,[Item],List6), + remove_dups(List5,List6,List3),!. + +shell1(Command) :- + (bash_command(Command,_)-> + true; + (writeln(["Failed shell1 command: ",Command]),abort) + ),!. + +bash_command(Command, Output) :- + setup_call_cleanup(process_create(path(bash), + ['-c', Command], + [stdout(pipe(Out))]), + read_string(Out, _, Output), + close(Out)). + +concat_list2A(A1,B):- + A1=[A|List], + concat_list2A(A,List,B),!. + +concat_list2A(A,[],A):-!. +concat_list2A(A,List,B) :- + List=[Item|Items], + string_concat(A,Item,C), + concat_list2A(C,Items,B). 
diff --git a/Music-Composer/mindreadtestmusiccomposer-unusual-ui-rhythm.pl b/Music-Composer/mindreadtestmusiccomposer-unusual-ui-rhythm.pl new file mode 100644 index 0000000000000000000000000000000000000000..d841e33126c5e666bac210cc8ff364338e0859df --- /dev/null +++ b/Music-Composer/mindreadtestmusiccomposer-unusual-ui-rhythm.pl @@ -0,0 +1,1348 @@ +/** + +- Form +- Which chord progression type? +- Lyrics - random sentences, rhyming +- Melody - random starting and ending note of first phrase, melody and harmony from chord progression type with three notes (split for melody), the circle of fifths for connection between sections +- Which of up to 3 instruments for each of melody and harmony +- What sections they will play + +- Melody cut to the length of vocals + +**/ + +:-use_module(library(pio)). + +:- use_module(library(date)). +%%:- include('texttobr2qb'). +%%:- include('mindreadtestshared'). +%%:- include('texttobr2'). +%%:- include('texttobr'). +%%:- include('texttobr2qbmusic'). + +:- include('musiclibrary'). +:- include('la_strings'). + +:-dynamic rhythm/1. + +sectest(0) :- !. +sectest(N1) :- + writeln(["\n",N1]), + sectest0(_Form1,_Lyrics,_Melody,_Harmony,_MelodyParts, + _HarmonyParts,_Vocalstubinstrument,_Song1), + N2 is N1-1, + sectest(N2),!. + +sectest0(Form1,Lyrics,Melody,Harmony,MelodyParts,HarmonyParts,Vocalstubinstrument,Song1) :- + check_asc2mid, + retractall(rhythm(_)), + assertz(rhythm([])), + %%texttobr2qb(2), %%Imagine song + %%form(Form1), + writeln("Please enter form in format e.g. 
[n,v1,i1,v2,c,t2,s,s,s]."), + read_string(user_input, "\n", "\r", _, Form1A), + atom_to_term(Form1A,Form1,_), + + %%Form1=[v2,o], + %%find("Should the chord progression type be 1451, 1564, 1645, Classical or Classical Pop?",CPT), + writeln("Should the chord progression type be 1451, 1564, 1645, classical or classicalpop?"), + read_string(user_input, "\n", "\r", _, CPT1), + atom_string(CPT,CPT1), + + remove_dups(Form1,[],Form2), + Voiceparts1=[v1,v2,c,s], + intersection(Form1,[v1,v2,c,s],Voiceparts2), + %%lyrics(Voiceparts1,Lyrics,Maxlength), + writeln(["Please enter lyrics for parts",Voiceparts1,"in format e.g. [[v1, [\"All\", \"ney\", \"goes\", \"to\", \"the\"], [\"Le\", \"ice\", \"is\", \"in\"], [\"the\", \"mir\", \"ac\"], [\"the\", \"graph\"]], [v2, [\"Dix\", \"ard\"]]]."]), + read_string(user_input, "\n", "\r", _End2, LyricsA), + atom_to_term(LyricsA,Lyrics,_), + +findall(DH,(member(C1H,Lyrics),C1H=[_|C2H],member(CH,C2H),length(CH,DH)),EH),sort(EH,FH),reverse(FH,GH),GH=[Maxlength|_], + + findall(B,(member(A1,Form2),string_concat(B1,_C,A1), + string_length(B1,1),atom_string(B,B1)),Form3), + remove_dups(Form3,[],Form4), + %%repeat, %% in case melody, harmony don't work + melodyharmony(Form4,CPT,Maxlength,Melody,Harmony), + %%writeln(melodyharmony(Form4,CPT,Maxlength,Melody,Harmony)), %% *** + instruments(Form1,MelodyInstruments,HarmonyInstruments, + MelodyParts,HarmonyParts,Vocalstubinstrument), + + rhythm,%(MelodyParts,HarmonyParts), + %%writeln(instruments(Form1,MelodyInstruments,HarmonyInstruments, +%% MelodyParts,HarmonyParts, +%% Vocalstubinstrument)), + %%writeln(rendersong(Form1,Voiceparts2,Maxlength,Melody,Harmony, + %% MelodyInstruments,HarmonyInstruments,MelodyParts, + %% HarmonyParts,Lyrics, + %% Vocalstubinstrument,Song1)), %%, +%% +%trace, + rendersong(Form1,Voiceparts2,Maxlength,Melody,Harmony, + MelodyInstruments,HarmonyInstruments,MelodyParts, + HarmonyParts,Lyrics, + Vocalstubinstrument,Song1,File1), %%, + +rhythm(Rhythm), 
+Meta_file=[[form,Form1],[chord_progressions,CPT],[voice_part,Voiceparts2],[melody,Melody],[harmony,Harmony],[melody_instruments, + MelodyInstruments],[harmony_instruments,HarmonyInstruments],[melody_parts,MelodyParts],[harmony_parts, + HarmonyParts],[lyrics,Lyrics], + [genre,["anthem"]],[rhythm,Rhythm]], + + term_to_atom(Meta_file,Meta_file1), + string_atom(Meta_file2,Meta_file1), + + concat_list("",[File1,"_meta.txt"],File2), + (open_s(File2,write,Stream1), + write(Stream1,Meta_file2), + close(Stream1)),! + . + %Section,Instrument,Playing +rhythm:-%(MelodyParts,HarmonyParts) :- +%trace, + %writeln("Please enter the vocal rhythms*") + %findall(R1,(member([Section,Instrument,1],MelodyParts))) + rhythm1(R), + retractall(rhythm(_)), + assertz(rhythm(R)),!. + +rhythm1(R2) :- + writeln("Please enter the song's bar rhythm in the format [[time,type=n/r,length,nth_note_from_mel_harm,velocity=0-127],...], e.g. [[0,n,1/2,1,80],[1/2,n,1/2,1,80],[1,n,1/2,2,80],[1+1/2,n,1/2,2,80],[2,n,1/2,3,80],[2+1/2,n,1/2,3,80],[3,n,1/2,4,80],[3+1/2,n,1/2,4,80]] or \"ta1 ta1 tityca ta.\" (note full stop) where \"1\" optionally denotes nth_note_from_mel_harm."), + read_string(user_input, "\n", "\r", _, R10), + %trace, + rhythm2(R10,R2),!. + +string_compound(A,B) :- catch(number(B),_,false),number_string(B,A),!. +string_compound(A,C/D) :- %catch(number(B),_,false),%number_string(B,A1), +frac_to_string(C/D,CD),foldr(string_concat,[CD],A),!. +string_compound(A,B+C/D) :- catch(number(B),_,false),number_string(B,A1),frac_to_string(C/D,CD),foldr(string_concat,[A1,"+",CD],A). +frac_to_string(A/B,AB) :- foldr(string_concat,[A,"/",B],AB). 
+
+%% n intro, vn verse, c chorus, i1 instrumental 1, t2 instrumental 2, s solo, o outro
+
+rhythm2(R10,R2) :-
+ % ta a a a(n), ta a a, ta a, ta, ti( )ca ti, same for za
+ % delete space, replace with length
+ % simplify time, ff frac
+ string_concat(R101,".",R10),
+ string_strings(R101,R11),
+ delete(R11," ",R12),
+ foldr(string_concat,R12,R13),
+ process_rhythm(R13,R14),
+ split_on_substring1(R14,"bcdefghijklm",R15),
+ %trace,
+ find_rhythm(0,1,R15,[],R2),
+ (R2=[]->true;
+ (%trace,
+ reverse(R2,[[T,_,L,_,_]|_]),
+ term_to_atom(T1,T),term_to_atom(L1,L),
+ Total is T1+L1,
+ not(Total>4))),
+ !.
+
+rhythm2(R10,R2) :-
+%trace,
+ not(R10=""),
+ catch(term_to_atom(R1,R10),_,false),
+ %member([T,_,_,_],R1),not(T>=4),
+ forall(member([A,B,C,D,F],R1),(string_compound(_,A),(B=n->true;B=r),string_compound(_,C),
+ 0=<F,F=<127)),
+ %% reconstructed from context: convert each [Time,Type,Length,Nth,Velocity]
+ %% entry to the [Time,"NT"/"R",Length,Nth,Velocity] string form the renderer uses
+ findall([A1,B1,C1,D,F],(member([A,B,C,D,F],R1),string_compound(A1,A),
+ ((B=n,B1="NT");(B=r,B1="R")),string_compound(C1,C)
+ ),R2),!.
+
+rhythm2(_R10,R2) :-
+ writeln("Error: Please re-input:"),
+ rhythm1(R2),!.
+
+process_rhythm(A,C) :-
+ process_rhythm1(Replacements),
+ atom_string(A1,A),
+ process_rhythm2(Replacements,A1,D1),
+ atom_string(D1,C),!.
+
+process_rhythm2([],A,A) :- !.
+process_rhythm2(Replacements,A,D) :-
+ Replacements=[[B,C]|G],
+ atomic_list_concat(E,B,A),
+ atomic_list_concat(E,C,F),
+ process_rhythm2(G,F,D),!.
+
+process_rhythm1([['taaaa','b'],['taaa','c'],['taa','d'],['ta','e'],['ty','g'],['ca','g'],['ti','f'],['zaaaa','h'],['zaaa','i'],['zaa','j'],['za','k'],['zi','l'],['zy','m']]).
+
+find_rhythm(_,_,[],B,B) :- !.
+find_rhythm(Time,Note_n,A,B,C) :- +%writeln1(find_rhythm(Time,Note_n,A,B,C)), +%trace, + A=[D|E], + atom_string(D2,D), + time(D2,Type,Length), + % ["1/2","NT","1/2",1,80] + ff_frac(Time,Time1), + (Type=n->Type1="NT";Type1="R"), + %Lengths=[Length1|Lengths2], + ff_frac(Length,Length1), + (E=[D1|E1]-> + (catch(number_string(D3,D1),_,false)->(Note_n1=D3,A1=E1);(Note_n1=Note_n,A1=E)); + (A1=E,Note_n1=Note_n)), + %writeln1(append(B,[[Time1,Type1,Length1,Note_n1,80]],B1)), + append(B,[[Time1,Type1,Length1,Note_n1,80]],B1), + + %catch(number_string(Length5,Length),_,false), + Time2 is Time+Length, + (Type1="NT"->Note_n2 is Note_n1+1;Note_n2=Note_n1), + find_rhythm(Time2,Note_n2,A1,B1,C),!. + + +time(b,n,4). +time(c,n,3). +time(d,n,2). +time(e,n,1). +time(f,n,0.5). +time(g,n,0.25). +time(h,r,4). +time(i,r,3). +time(j,r,2). +time(k,r,1). +time(l,r,0.5). +time(m,r,0.25). + +% 1, 1+1/2, 1/2 +ff_frac(N1,S) :- + floor(N1,N2), + (N1 =:= N2->(N21=N2,N31=""); + (N3 is N1-N2, + numbers(4,0,[],Ns), + member(N4,Ns), + member(N5,Ns), + catch(N3 is N4/N5,_,false), + (N2=0->N21="";string_concat(N2,"+",N21)), + foldr(string_concat,[N4,"/",N5],N31))), + foldr(string_concat,[N21,N31],S),!. + +findall1(Form2,Form3) :- + findall(B,shorten(Form2,B),Form3). + +shorten(Form2,B) :- + member(A1,Form2),string_concat(B1,_C,A1), + string_length(B1,1),atom_string(B,B1). 
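+
A comment-only sketch of how the syllable shorthand above travels through `process_rhythm1/1`, `split_on_substring1/3`, `find_rhythm/5` and the `time/3` table (the walk-through assumes the replacement order listed in `process_rhythm1/1`):

```prolog
%% Worked example for the prompt's shorthand "ta1 ta1 tityca ta." (a sketch):
%%   "ta"     -> e          (note, 1 beat)
%%   "tityca" -> "ti","ty","ca" -> f,g,g  (note, 1/2 + 1/4 + 1/4 beats)
%% so the bar totals 1 + 1 + (1/2+1/4+1/4) + 1 = 4 beats, satisfying the
%% not(Total>4) check in rhythm2/2.  Digit suffixes such as the "1" in "ta1"
%% survive the letter substitutions and are consumed by find_rhythm/5 as the
%% nth_note_from_mel_harm index.  Each parsed event takes the form
%%   [Time,"NT"/"R",Length,Nth_note_from_mel_harm,80]
%% with Time and Length rendered as strings by ff_frac/2, e.g. a half-beat
%% note starting on beat 1+1/2 becomes ["1+1/2","NT","1/2",N,80].
```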
+ +form(Form) :- + Form1=[n,v1], + find("Should a chorus or instrumental come after the first verse?",CorI), + (CorI=c-> + (append(Form1,[c,v2,c,t2,s],Form2), + find(["Should a chorus and outro or two solos come after the first solo?",1],COorSS), + (COorSS=[c,o]-> + (append(Form2,[c],Form3), + append(Form3,[o],Form)); + COorSS=[s,s], + append(Form2,[s],Form3),append(Form3,[s],Form)) + ); + (CorI=i1,append(Form1,[i1,v2,c,t2,s],Form2), + find(["Should a chorus and outro or two solos come after the first solo?",2],COorSS), + (COorSS=[c,o]-> + (append(Form2,[c],Form3), + append(Form3,[o],Form)); + COorSS=[s,s], + append(Form2,[s],Form3),append(Form3,[s],Form)) + )). + +makename(N3) :- + trialy2(["Ack","Luc","Add","All","Brae","Skye","Whist","Dix","Wilb","Duk","Le","Ven", + "Syd","Don","Count","Black","Nei"],R1), + findbest(R1,N1), + trialy2(["an","ae","ye","ler","ee","ur","ard","ice","ney","ald","ess","el"],R2), + findbest(R2,N2), + append([N1],[N2],N3). + +makenames(0,Ns,Ns) :- !. +makenames(Num1,N1,N2) :- + random_member(R1,["Ack","Luc","Add","All","Brae","Skye","Whist","Dix","Wilb","Duk","Le","Ven", + "Syd","Don","Count","Black","Nei"]), + random_member(R2,["an","ae","ye","ler","ee","ur","ard","ice","ney","ald","ess","el"]), + append([R1],[R2],R3), + append(N1,[R3],N3), + Num2 is Num1-1, + makenames(Num2,N3,N2). + +findsentence(Sentence,Names1,Names2) :- + makename(N1),readv(Vs),reado(Os), + makenames(5,[],N2s1),append(N2s1,Names1,N2s), + trialy2(Vs,R1), + findbest(R1,R11), + append(Os,N2s,Os2), + trialy2(Os2,R2), + findbest(R2,R21), + append_list(N1,[R11,R21],Sentence), + (member(R21,N2s1)-> + append(Names1,[R21],Names2); + Names2=Names1). 
+
+findrhyme(Phrase1,Phrase2,Names1,Names2) :-
+ reverse(Phrase1,Reverse),
+ Reverse=[Lastsyllable|_],
+ readv(Vs),reado(Os),makenames(5,[],N2s1),
+ append(N2s1,Names1,N2s),
+ append(Os,N2s,Os2),
+ trialy2(Os2,R2),
+ findbest(R2,R21),
+ trialy2(Vs,R1),
+ findbest(R1,R11),
+ findall(C,(member(C,N2s),reverse(C,B),B=[E|_],rhymes2(E,Lastsyllable)),Phrases22),
+ trialy2(Phrases22,R3),
+ findbest(R3,R31),
+ append_list(R21,[R11,R31],Phrase2),
+ (member(R21,N2s1)->
+ append(Names1,[R21],Names3);
+ Names3=Names1),
+ %% record the chosen rhyming name R31 (not the candidate list) if it was generated
+ (member(R31,N2s1)->
+ append(Names3,[R31],Names2);
+ Names2=Names3).
+
+rhymes2(Syllable1,Syllable2) :-
+ rhymes(Lists),
+ member(List1,Lists),
+ member(Syllable1,List1),
+ member(Syllable2,List1).
+
+lyrics(Form1,Lyrics,Maxlength) :-
+ %%find("Who is the song about?",Character),
+ %%lyrics1(Form,Character,[],Lyrics).
+ catch((lyrics2(Form1,[],Lyrics,[],_Names,0,Maxlength)),_,
+ lyrics(Form1,Lyrics,Maxlength)).
+
+lyrics2([],Lyrics,Lyrics,Names,Names,Maxlength,Maxlength) :- !.
+lyrics2(Form1,Lyrics1,Lyrics2,Names1,Names2,Maxlength1,Maxlength2) :-
+ Form1=[Form2|Forms3],
+ findsentence(Sentence1,Names1,Names3),
+ length(Sentence1,Length1),
+ findsentence(Sentence2,Names3,Names4),
+ length(Sentence2,Length2),
+ findrhyme(Sentence1,Sentence3,Names4,Names5),
+ length(Sentence3,Length3),
+ findrhyme(Sentence2,Sentence4,Names5,Names6),
+ length(Sentence4,Length4),
+ append(Lyrics1,[[Form2,Sentence1,Sentence2,Sentence3,Sentence4]],Lyrics3),
+ max([Length1,Length2,Length3,Length4],Maxlength1,Maxlength3),
+ lyrics2(Forms3,Lyrics3,Lyrics2,Names6,Names2,Maxlength3,Maxlength2),!.
+
+
+melodyharmony(Form1,CPT,Maxlength,Melody,Harmony) :-
+ Partlength is Maxlength div 3,
+ Extra is Maxlength mod 3,
+ Total is Partlength+Extra,
+ _Parts=[Partlength,Partlength,Partlength,Total],
+ %%(CPT=1451->findall(A,note0(_,A),Notes);
+ %%findall(A,note0(_,A),Notes)),
+ %% What note should the song start on?
+ %%trialy2(Notes,R1), + %%findbest(R1,R11), + %%R11='A', + melodyharmony(Form1,CPT,_Parts2,_R11,_R2,[],Melody,[],Harmony). + +melodyharmony([],_CPT,_Parts,N,N,Melody,Melody,Harmony,Harmony) :- !. +melodyharmony(Form1,CPT,Parts,N1,N2,Melody1,Melody2,Harmony1,Harmony2) :- + Form1=[Form2|Forms3], + findmelody(Form2,CPT,Parts,N1,N3,Melody1,Melody3,Harmony1,Harmony3), + + (CPT='1451'->harmony1451(N3,3,N4);harmonyr(N3,5,N4)), + findmelody(Form2,CPT,Parts,N4,N5,Melody3,Melody4,Harmony3,Harmony4), + %%reverse(Melody3,Melody3R),Melody3R=[[_,Melody3RI]|_], %% relies on repeat *1 + %%reverse(Melody4,Melody4R),Melody4R=[[_,Melody4RI]|_], + %%not((length(Melody3RI,1),length(Melody4RI,1))), + melodyharmony(Forms3,CPT,Parts,N5,N2,Melody4,Melody2, + Harmony4,Harmony2). + +findmelody(Form,CPT,_Parts,_N1,_N2,Melody1,Melody2,Harmony1,Harmony2) :- + CPT='1451', + %%(CPT=1451->findall(A,note0(_,A),Notes); + %%findall(A,note0(_,A),Notes)), + %% What note should the phrase end on? + %%repeat, + %%trialy2(Notes,R1), + %%findbest(R1,N2), + %%versechorussoloprogression1451(N1,N2,Progression), + %%trialy2(Progression,R2), + %%findbest(R2,Progression2), + repeat, + writeln(["Please enter melody line for form",Form,"in format e.g. four notes from [d,di,r,ri,m,f,fi,s,si,l,li,t]."]), + read_string(user_input, "\n", "\r", _End21, Progression2B), + atom_to_term(Progression2B,Progression2A,_), + solfatonotes(Progression2A,[],Progression2), +not(Progression2=[]), + append(Melody1,[[Form,Progression2]],Melody2), + %%versechorussoloprogression1451(N1,N2,Progression3), + %%trialy2(Progression3,R3), + %%findbest(R3,Progression4), + repeat, + writeln(["Please enter harmony line in format e.g. four notes from ['C','E','G'] for",Form,"."]), + read_string(user_input, "\n", "\r", _, Progression4A), + atom_to_term(Progression4A,Progression4,_), + + harmony(Form,CPT,Progression4,Harmony1,Harmony2). 
+findmelody(Form,CPT,_Parts,_N1,_N2,Melody1,Melody2,Harmony1,Harmony2) :- + CPT='1564', + repeat, + writeln(["Please enter melody line for form",Form,"in format e.g. four notes from [d,di,r,ri,m,f,fi,s,si,l,li,t]."]), + read_string(user_input, "\n", "\r", _, Progression2B), + atom_to_term(Progression2B,Progression2A,_), + solfatonotes(Progression2A,[],Progression2), not(Progression2=[]), + append(Melody1,[[Form,Progression2]],Melody2), + %%versechorussoloprogression1451(N1,N2,Progression3), + %%trialy2(Progression3,R3), + %%findbest(R3,Progression4), + repeat, + writeln(["Please enter harmony line in format e.g. four notes from ['C','E','G'] for",Form,"."]), + read_string(user_input, "\n", "\r", _, Progression4A), + atom_to_term(Progression4A,Progression4,_), + harmony(Form,CPT,Progression4,Harmony1,Harmony2). +findmelody(Form,CPT,_Parts,_N1,_N2,Melody1,Melody2,Harmony1,Harmony2) :- + CPT='1645', + repeat, + writeln(["Please enter melody line for form",Form,"in format e.g. four notes from [d,di,r,ri,m,f,fi,s,si,l,li,t]."]), + read_string(user_input, "\n", "\r", _, Progression2B), + atom_to_term(Progression2B,Progression2A,_), + solfatonotes(Progression2A,[],Progression2), not(Progression2=[]), + append(Melody1,[[Form,Progression2]],Melody2), + %%versechorussoloprogression1451(N1,N2,Progression3), + %%trialy2(Progression3,R3), + %%findbest(R3,Progression4), + repeat, + writeln(["Please enter harmony line in format e.g. four notes from ['C','E','G'] for",Form,"."]), + read_string(user_input, "\n", "\r", _, Progression4A), + atom_to_term(Progression4A,Progression4,_), + harmony(Form,CPT,Progression4,Harmony1,Harmony2). +findmelody(Form,CPT,_Parts,_N1,_N2,Melody1,Melody2,Harmony1,Harmony2) :- + CPT=classical, + repeat, + writeln(["Please enter melody line for form",Form,"in format e.g. 
four notes from [d,di,r,ri,m,f,fi,s,si,l,li,t]."]), + read_string(user_input, "\n", "\r", _, Progression2B), + atom_to_term(Progression2B,Progression2A,_), + solfatonotes(Progression2A,[],Progression2), not(Progression2=[]), + append(Melody1,[[Form,Progression2]],Melody2), + %%versechorussoloprogression1451(N1,N2,Progression3), + %%trialy2(Progression3,R3), + %%findbest(R3,Progression4), + repeat, + writeln(["Please enter harmony line in format e.g. four notes from ['C','E','G'] for",Form,"."]), + read_string(user_input, "\n", "\r", _, Progression4A), + atom_to_term(Progression4A,Progression4,_), + harmony(Form,CPT,Progression4,Harmony1,Harmony2). +findmelody(Form,CPT,_Parts,_N1,_N2,Melody1,Melody2,Harmony1,Harmony2) :- + CPT=classicalpop, + repeat, + writeln(["Please enter melody line for form",Form,"in format e.g. four notes from [d,di,r,ri,m,f,fi,s,si,l,li,t]."]), + read_string(user_input, "\n", "\r", _, Progression2B), + atom_to_term(Progression2B,Progression2A,_), + solfatonotes(Progression2A,[],Progression2), not(Progression2=[]), + append(Melody1,[[Form,Progression2]],Melody2), + %%versechorussoloprogression1451(N1,N2,Progression3), + %%trialy2(Progression3,R3), + %%findbest(R3,Progression4), + repeat, + writeln(["Please enter harmony line in format e.g. four notes from ['C','E','G'] for",Form,"."]), + read_string(user_input, "\n", "\r", _, Progression4A), + atom_to_term(Progression4A,Progression4,_), + harmony(Form,CPT,Progression4,Harmony1,Harmony2). + +harmony(Form,CPT,Progression,Harmony1,Harmony2) :- + harmony1(Form,CPT,Progression,[],Harmony3), + append(Harmony1,[Harmony3],Harmony2). + +harmony1(_Form,_CPT,[],Harmony,Harmony) :- !. 
+harmony1(Form,CPT,Progression1,Harmony1,Harmony2) :- + Progression1=[Note1|Progression2], + (CPT='1451'->(harmony1451(Note1,2,Note2),harmony1451(Note1,4,Note3)); + (harmonyr(Note1,4,Note2),harmonyr(Note1,7,Note3))), + Harmony30=[Note2,Note3], + findall(B1,(member(A1,Harmony30),string_concat(B,_C,A1),string_length(B,1),atom_string(B1,B)),Harmony31), + append([Note1],Harmony31,Harmony3), + append(Harmony1,[[Form,Harmony3]],Harmony4), + harmony1(Form,CPT,Progression2,Harmony4,Harmony2),!. + +harmony1451(Note1,Value2,Note2) :- + note(Value1,Note1),Value3 is Value1+Value2,Value4 is Value3 mod 12,note(Value4,Note2). +harmonyr(Note1,Value2,Note2) :- + note(Value1,Note1),Value3 is Value1+Value2,Value4 is Value3 mod 12,note(Value4,Note2). + + +solfatonotes([],Progression1,Progression1) :- !. +solfatonotes(Progression2A,Progression1,Progression2) :- + Progression2A=[N1|Ns], + solfegenotetonote(N1,N2), + append(Progression1,[N2],Progression3), + solfatonotes(Ns,Progression3,Progression2). + +solfegenotetonote(d,'C'). +solfegenotetonote(di,'C#'). +solfegenotetonote(r,'D'). +solfegenotetonote(ri,'D#'). +solfegenotetonote(m,'E'). +solfegenotetonote(f,'F'). +solfegenotetonote(fi,'F#'). +solfegenotetonote(s,'G'). +solfegenotetonote(si,'G#'). +solfegenotetonote(l,'A'). +solfegenotetonote(li,'A#'). +solfegenotetonote(t,'B'). + + +instruments(Form1,MelodyInstruments,HarmonyInstruments,MelodyParts,HarmonyParts,Vocalstubinstrument) :- + melodyinstrumentnumber(MelodyInstrumentNumber), + harmonyinstrumentnumber(HarmonyInstrumentNumber), + instrumentlist(MelodyInstrumentNumber,MelodyInstruments), + instrumentlist(HarmonyInstrumentNumber,HarmonyInstruments), + Vocalstubinstrument=[0,"Acoustic Grand Piano"], + parts(Form1,MelodyInstruments,MelodyParts), + aggregate_all(count, (member(A1,MelodyParts),not(A1=[_,_,0])), Count1), + not(Count1=0), + parts(Form1,HarmonyInstruments,HarmonyParts), + aggregate_all(count, (member(A2,HarmonyParts),not(A2=[_,_,0])), Count2), + not(Count2=0). 
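+
A comment-only example of the solfège table and the modular transposition above (the `harmonyr/3` query assumes the `note/2` facts, defined elsewhere in this codebase, number the chromatic scale C=0 .. B=11):

```prolog
%% Solfège-to-note conversion via the solfegenotetonote/2 facts above:
%% ?- solfatonotes([d,m,s,t],[],Notes).
%% Notes = ['C','E','G','B'].
%%
%% harmonyr/3 transposes modulo 12, e.g. a perfect fourth (5 semitones)
%% above 'C' (assuming note(0,'C') and note(5,'F')):
%% ?- harmonyr('C',5,N).
%% N = 'F'.
```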
+
+melodyinstrumentnumber(NumberofInstruments) :-
+ %% Number of melody instruments
+ trialy2("Number of melody instruments?",[1,2,3,4,5,6,7,8,9,10],R1),
+ findbest(R1,NumberofInstruments).
+
+harmonyinstrumentnumber(NumberofInstruments) :-
+ %% Number of harmony instruments
+ trialy2("Number of harmony instruments?",[1,2,3,4,5,6,7,8,9,10],R1),
+ findbest(R1,NumberofInstruments).
+
+instrumentlist(NumberofInstruments,Instruments) :-
+ instrumentlist(NumberofInstruments,[],Instruments),!.
+
+instrumentlist(0,Instruments,Instruments) :- !.
+instrumentlist(NumberofInstruments1,Instruments1,Instruments2) :-
+ Instruments=[[0,"Acoustic Grand Piano"],[1,"Bright Acoustic Piano"],[2,"Electric Grand Piano"],[3,"Honky-Tonk Piano"],[4,"Electric Piano 1"],[5,"Electric Piano 2"],[6,"Harpsichord"],[7,"Clavinet"],[8,"Celesta"],[9,"Glockenspiel"],[10,"Music Box"],[11,"Vibraphone"],[12,"Marimba"],[13,"Xylophone"],[14,"Tubular Bells"],[15,"Dulcimer"],[16,"Drawbar Organ"],[17,"Percussive Organ"],[18,"Rock Organ"],[19,"Church Organ"],[20,"Reed Organ"],[21,"Accordion"],[22,"Harmonica"],[23,"Tango Accordion"],[24,"Nylon Acoustic Guitar"],[25,"Steel Acoustic Guitar"],[26,"Jazz Electric Guitar"],[27,"Clean Electric Guitar"],[28,"Muted Electric Guitar"],[29,"Overdriven Guitar"],[30,"Distortion Guitar"],[31,"Guitar Harmonics"],[32,"Acoustic Bass"],[33,"Electric Bass (finger)"],[34,"Electric Bass (pick)"],[35,"Fretless Bass"],[36,"Slap Bass 1"],[37,"Slap Bass 2"],[38,"Synth Bass 1"],[39,"Synth Bass 2"],[40,"Violin"],[41,"Viola"],[42,"Cello"],[43,"Contrabass"],[44,"Tremolo Strings"],[45,"Pizzicato Strings"],[46,"Orchestral Harp"],[47,"Timpani"],[48,"String Ensemble 1"],[49,"String Ensemble 2"],[50,"Synth Strings 1"],[51,"Synth Strings 2"],[52,"Choir Aahs"],[53,"Voice Oohs"],[54,"Synth Choir"],[55,"Orchestra Hit"],[56,"Trumpet"],[57,"Trombone"],[58,"Tuba"],[59,"Muted Trumpet"],[60,"French Horn"],[61,"Brass Section"],[62,"Synth Brass 1"],[63,"Synth Brass 2"],[64,"Soprano Sax"],[65,"Alto Sax"],[66,"Tenor Sax"],[67,"Baritone Sax"],[68,"Oboe"],[69,"English Horn"],[70,"Bassoon"],[71,"Clarinet"],[72,"Piccolo"],[73,"Flute"],[74,"Recorder"],[75,"Pan Flute"],[76,"Blown Bottle"],[77,"Shakuhachi"],[78,"Whistle"],[79,"Ocarina"],[80,"Lead 1 (square)"],[81,"Lead 2 (sawtooth)"],[82,"Lead 3 (calliope)"],[83,"Lead 4 (chiff)"],[84,"Lead 5 (charang)"],[85,"Lead 6 (voice)"],[86,"Lead 7 (fifths)"],[87,"Lead 8 (bass + lead)"],[88,"Pad 1 (new age)"],[89,"Pad 2 (warm)"],[90,"Pad 3 (polysynth)"],[91,"Pad 4 (choir)"],[92,"Pad 5 (bowed)"],[93,"Pad 6 (metallic)"],[94,"Pad 7 (halo)"],[95,"Pad 8 (sweep)"],[96,"FX 1 (rain)"],[97,"FX 2 (soundtrack)"],[98,"FX 3 (crystal)"],[99,"FX 4 (atmosphere)"],[100,"FX 5 (brightness)"],[101,"FX 6 (goblins)"],[102,"FX 7 (echoes)"],[103,"FX 8 (sci-fi)"],[104,"Sitar"],[105,"Banjo"],[106,"Shamisen"],[107,"Koto"],[108,"Kalimba"],[109,"Bagpipe"],[110,"Fiddle"],[111,"Shanai"],[112,"Tinkle Bell"],[113,"Agogo"],[114,"Steel Drums"],[115,"Woodblock"],[116,"Taiko Drum"],[117,"Melodic Tom"],[118,"Synth Drum"],[119,"Reverse Cymbal"],[120,"Guitar Fret Noise"],[121,"Breath Noise"],[122,"Seashore"]],
+ %% ,[123,"Bird Tweet"],[124,"Telephone Ring"],[125,"Helicopter"],[126,"Applause"],[127,"Gunshot"]
+
+ trialy2("Please select an instrument from:",Instruments,R1),
+ findbest(R1,Instrument),
+ append(Instruments1,[Instrument],Instruments3),
+ NumberofInstruments2 is NumberofInstruments1-1,
+ instrumentlist(NumberofInstruments2,Instruments3,Instruments2),!.
+
+parts(Form,Instruments,Parts) :-
+ parts1(Form,Instruments,[],Parts),!.
+
+parts1(_Form,[],Parts,Parts) :- !.
+parts1(Form,Instruments1,Parts1,Parts2) :-
+ Instruments1=[Instrument2|Instruments3],
+ parts2(Form,Instrument2,[],Part3),
+ append(Parts1,[Part3],Parts4),
+ parts1(Form,Instruments3,Parts4,Parts2),!.
+
+parts2([],_Instrument,Part,Part) :- !.
+parts2(Form1,Instrument,Part1,Part2) :- + Form1=[Section|Form2], + (member([Section,Instrument,Playing],Part1)->true; + %% Is the instrument playing in the section? + term_to_atom(Instrument,Instrument1), + concat_list("Is ",[Instrument1," playing in ",Section,"?"],Question), + (trialy2(Question,[0,1],R1), + findbest(R1,Playing))), + append(Part1,[[Section,Instrument,Playing]],Part3), + parts2(Form2,Instrument,Part3,Part2),!. + +/** +concat_list(A,[],A):-!. +concat_list(A,List,B) :- + List=[Item|Items], + string_concat(A,Item,C), + concat_list(C,Items,B). +**/ + +concat_list(A,[],A) :-!. +concat_list(A,List,B) :- + List=[Item|Items], + concat_list2(A,[Item],C), + concat_list(C,Items,B). +concat_list2(A,List,C) :- + ((List=[[Item|Items]]->true;List=[Item|Items])-> + concat_list0(A,[Item|Items],C); + fail),!. +concat_list2(A,Item,C) :- + concat_list(A,Item,C),!. + +concat_list0([],""):-!. +concat_list0(A1,B):- + A1=[A|List], + concat_list0(A,List,B),!. + +concat_list0(A,[],A):-!. +concat_list0(A,List,B) :- + List=[Item|Items], + string_concat(A,Item,C), + concat_list0(C,Items,B). + +sentencewithspaces(Sentence1,Sentence2) :- + Sentence1=[Item|Items], + string_concat(Firstletter1,Rest,Item), + string_length(Firstletter1,1), + upcase_atom(Firstletter1,Firstletter2), + concat_list(Firstletter2,[Rest,""],Item2), + flatten(Items,Items1), + sentencewithspaces(Items1,Item2,Sentence3), + string_concat(Sentence3,".",Sentence2). + +sentencewithspaces([],Sentence,Sentence) :- !. +sentencewithspaces(Sentence1,Sentence2,Sentence3) :- + Sentence1=[Word|Sentence4], + concat_list("",[Sentence2," ",Word],Item2), + sentencewithspaces(Sentence4,Item2,Sentence3),!. 
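+
A comment-only example of `sentencewithspaces/2` above, which capitalises the first syllable, joins the remaining syllables with spaces, and appends a full stop:

```prolog
%% ?- sentencewithspaces(["all","ney","goes"],S).
%% S = "All ney goes.".
```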
+ +rendersong(Form1,Voiceparts,_Maxlength,Melody, + Harmony,MelodyInstruments, + HarmonyInstruments,MelodyParts1, + HarmonyParts,Lyrics,Vocalstubinstrument,Song1,File1) :- + Totallength=8, + Voicetrack=1, + length(MelodyInstruments,MelodyInstrumentsLength), + length(HarmonyInstruments,HarmonyInstrumentsLength), + TracksNumber is MelodyInstrumentsLength+HarmonyInstrumentsLength+2, + Lyrics=[[_,Sentence1|_]|_],sentencewithspaces(Sentence1,Sentence2), + concat_list("format=1 tracks=", [TracksNumber, " division=384\n\nBA 1 CR 0 TR 0 CH 16 Text type 2: \"Produced by Mind Reading Music Composer by Lucian Academy (Manual Entry Rhythm Mode)\"\nBA 1 CR 0 TR 0 CH 16 Text type 3: \"", Sentence2, "\"\nBA 1 CR 0 TR 0 CH 1 Channel volume 127\nBA 1 CR 0 TR 0 CH 16 Tempo 63.00009\n"], Song2), + printheader(Voicetrack,Vocalstubinstrument,Song2,Song3), + %%writeln(renderv1(Form1,Voiceparts,_,Lyrics,Melody, + %% Totallength,Voicetrack,1,_,Song3,Song4)), %% **** + renderv1(Form1,Voiceparts,_,Lyrics,Melody, + Totallength,Voicetrack,1,_,Song3,Song4), + Track2 is Voicetrack+1, + renderm1(Form1,Melody,MelodyParts1, + Track2,_Track3,Song4,Song5), + Track4 is MelodyInstrumentsLength+2, + renderh1(Form1,Harmony,HarmonyParts, + Track4,_Track5,Song5,Song6), + length(Form1,FormLength), + TotalBars is 4*FormLength+1, + concat_list(Song6,["BA ",TotalBars," CR 0 TR 0 CH 16 End of track"],Song1), + get_time(TS),stamp_date_time(TS,date(Year,Month,Day,Hour1,Minute1,Seconda,_A,_TZ,_False),local), + concat_list("song",[Year,Month,Day,Hour1,Minute1,Seconda],File1), + concat_list(File1,[".txt"],File2), + concat_list(File1,[".mid"],File3), + open_s(File2,write,Stream), + write(Stream,Song1), + close(Stream), + concat_list("./asc2mid ",[File2," > ",File3],Command), + shell1_s(Command), + %%writeln(["Texttobr, Texttobr2 not working. 
Please manually breason out ",File2]), + %%N is 4, M is 4000, texttobr2(N,File2,u,M),texttobr(N,File2,u,M), + outputlyrics(File1,Lyrics), + +/** + texttobr2qb(3), %% give self breasonings + texttobr2qb(20), %%Feature 1 + texttobr2qb(20), %%Updates + texttobr2qb(20), %%Feature 2 + texttobr2qb(20), %%Updates + texttobr2qb(20), %%Feature 3 + texttobr2qb(20), %%Updates + texttobr2qb(100), %%Icon + texttobr2qb(20), %%Updates + texttobr2qb(32), %%Lyrics + texttobr2qb(36), %%Music + texttobr2qb(20), %%Updates + + texttobr2qb(2), %%Medicine + texttobr2qb(20), %%Updates + texttobr2qb(2), %%Sales + texttobr2qb(20), %%Updates + texttobr2qb(2), %%Marketing + texttobr2qb(20), %%Updates + + texttobr2qb(2), %%Graciously give or blame listener for colour imagery + texttobr2qb(20), %%Updates + + texttobr2qb(2), %%Play song + texttobr2qb(2), %%Like song + **/ +!. + +printheader(Voicetrack,Vocalstubinstrument,Song1,Song2) :- + Vocalstubinstrument=[N1,_], + N2 is N1+1, + concat_list("BA 1 CR 0 TR ",[Voicetrack," CH ",Voicetrack," Instrument ",N2,"\nBA 1 CR 0 TR ",Voicetrack," CH ",Voicetrack," NT R 0 von=127 voff=0\n"],Song3), + string_concat(Song1,Song3,Song2). + +renderv1([],_Voiceparts,_Vocalstubinstrument,_Lyrics,_Melody,_Totallength,_Track,_Bar,_Bar2,Voice,Voice) :- !. +renderv1(Form1,Voiceparts1,Vocalstubinstrument,Lyrics,Melody,Totallength,Track,Bar1,Bar2,Voice1,Voice2) :- + Form1=[Section|Form2], + Voiceparts1=[Section|Voiceparts2], + renderv2(Section,Lyrics,Melody,Totallength,Track, + Bar1,Bar3,Voice1,Voice3), + renderv1(Form2,Voiceparts2,Vocalstubinstrument,Lyrics,Melody, + Totallength,Track,Bar3,Bar2,Voice3,Voice2),!. + +renderv1(Form1,Voiceparts1,Vocalstubinstrument,Lyrics,Melody,Totallength, +Track,Bar1,Bar2,Voice1,Voice2) :- + Form1=[Section1|Form2], + ((Voiceparts1=[Section2|_Voiceparts2], + not(Section1=Section2))->true;Voiceparts1=[]), + Bar3 is Bar1+4, + renderv1(Form2,Voiceparts1,Vocalstubinstrument,Lyrics,Melody, + Totallength,Track,Bar3,Bar2,Voice1,Voice2),!. 
+ +renderv2(Section,Lyrics,_Melody,_Totallength,_Track,Bar,Bar,Voice,Voice) :- + not(member([Section|_],Lyrics)),!. +renderv2(Section1,Lyrics1,Melody1,_Totallength0,Track,Bar1,Bar2,Voice1,Voice2) :- + findall(Lyrics2,(member(Lyrics2,Lyrics1), + Lyrics2=[Section1|_]),Lyrics3), + string_concat(Section2A,_C,Section1), + string_length(Section2A,1), + atom_string(Section2,Section2A), + findall(Melody2,(member(Melody2,Melody1), + Melody2=[Section2|_]),Melody3), + Lyrics3=[Lyrics3A1|_Lyrics5], + %%trace, + (Lyrics3A1=[_,Lyrics1A,Lyrics2A,Lyrics3A,Lyrics4A]->true; + (writeln(["Error:",Lyrics3A1,"does not have 4 lines"]),abort)), + Melody3=[[_, Melody1A], [_, Melody2A]], + renderline1(Lyrics1A,Melody1A,_Totallength1,Track,Bar1,Voice1,Voice3), + Bar3 is Bar1+1, + renderline1(Lyrics2A,Melody2A,_Totallength2,Track,Bar3,Voice3,Voice4), + Bar4 is Bar3+1, + renderline1(Lyrics3A,Melody1A,_Totallength3,Track,Bar4,Voice4,Voice5), + Bar5 is Bar4+1, + renderline1(Lyrics4A,Melody2A,_Totallength4,Track,Bar5,Voice5,Voice6), + Bar6 is Bar5+1, + delete(Lyrics1,[Section1|_],Lyrics51), + renderv2(Section1,Lyrics51,Melody1,_Totallength,Track,Bar6,Bar2,Voice6,Voice2). + +renderline1(Lyrics2,Melody1,_Totallength,Track,Bar1,Voice1,Voice2) :- + %%Lyrics1=[_,Lyrics2], + length(Lyrics2,Lyrics2Length), + %%Rests is Totallength-Lyrics2Length, + rhythm(BarTimes1),%=["0","1/2","1","1+1/2","2","2+1/2","3","3+1/2"], + generatemelody(Lyrics2Length,Melody1,Melody2), + renderline2(BarTimes1,BarTimes2,Melody2,Track,Bar1,Voice1,Voice3), + renderlinerests(BarTimes2,_BarTimes3,Track,Bar1,Voice3,Voice2). + +generatemelody(Lyrics2Length,Melody1,Melody2) :- + generatemelody1(Melody1,[],Melody3), + length(Melody3,Melody2Length), + changelength(Lyrics2Length,Melody2Length,Melody3,Melody2). + +generatemelody1([],Melody,Melody) :- !. +generatemelody1(Melody1,Melody2,Melody3) :- + Melody1=[Melody4|Melody5], + append(Melody2,[Melody4%,Melody4 + ],Melody6), + generatemelody1(Melody5,Melody6,Melody3),!. 
+ +%%changelength(_Melody2Length,Melody,Melody) :- !. +changelength(Lyrics2Length,Melody2Length,Melody,Melody) :- + Lyrics2Length=Melody2Length,!. +changelength(Lyrics2Length,Melody2Length,Melody1,Melody2) :- + Lyrics2Length > Melody2Length, + Length is Lyrics2Length-Melody2Length, + repeatlastnote1(Length,Melody1,Melody2). +changelength(Lyrics2Length,Melody2Length,Melody1,Melody2) :- + Lyrics2Length < Melody2Length, + length(Melody2,Lyrics2Length), + append(Melody2,_,Melody1). + +repeatlastnote1(Length,Melody1,Melody2) :- + reverse(Melody1,Melody3), + Melody3=[Item|_], + repeatlastnote2(Length,Item,Melody1,Melody2),!. + +repeatlastnote2(0,_Item,Melody,Melody) :- !. +repeatlastnote2(Length1,Item,Melody1,Melody2) :- + append(Melody1,[Item],Melody3), + Length2 is Length1-1, + repeatlastnote2(Length2,Item,Melody3,Melody2),!. + +renderline2([]%BarTimes +,_BarTimes,_,_Track,_Bar1,Voice,Voice) :- !. +renderline2(BarTimes1,BarTimes4,Melody1,Track,Bar,Voice1,Voice2) :- + BarTimes1=[BarTimes2|BarTimes3], + BarTimes2=[Time,Type,Length,Nth_note_from_mel_harm,Velocity], + %trace, + ((Type="NT", + get_item_n(Melody1,Nth_note_from_mel_harm,Melody2), + %Melody1=[Melody2|Melody3], + concat_list("BA ",[Bar," CR ",Time," TR ",Track," CH ",Track," NT ",Melody2,"- ",Length," von=",Velocity," voff=0\n"],Song))->true; + (%Type="R", + concat_list("BA ",[Bar," CR ",Time," TR ",Track," CH ",Track," NT R ",Length," voff=0\n"],Song))), + + string_concat(Voice1,Song,Voice3), + renderline2(BarTimes3,BarTimes4,Melody1,Track,Bar,Voice3,Voice2). + +renderlinerests([]%BarTimes +,_BarTimes,_Track,_Bar1,Voice,Voice) :- !. +renderlinerests(BarTimes1,BarTimes2,Track,Bar,Voice1,Voice2) :- + BarTimes1=[BarTimes2|BarTimes3], + BarTimes2=[Time,_Type,Length,_Nth_note_from_mel_harm,_Velocity], + concat_list("BA ",[Bar," CR ",Time," TR ",Track," CH ",Track," NT R ",Length," voff=0\n"],Song), + string_concat(Voice1,Song,Voice3), + renderlinerests(BarTimes3,BarTimes2,Track,Bar,Voice3,Voice2). 
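+
A comment-only example of `changelength/4` above, which fits a melody to the lyric line length: a melody shorter than the line is padded by repeating its last note (`repeatlastnote1/3`), while a longer one is truncated to a prefix:

```prolog
%% ?- changelength(5,3,['C','E','G'],M).
%% M = ['C','E','G','G','G'].
%% ?- changelength(2,3,['C','E','G'],M).
%% M = ['C','E'].
```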
+ +longtoshortform(Section1,Section2) :- + string_concat(Section2A,_C,Section1), + string_length(Section2A,1), + atom_string(Section2,Section2A),!. + + +renderm1(_Form1,_Melody,[],_Track1,_Track2,Song,Song) :- !. +renderm1(Form1,Melody,MelodyParts1,Track1,Track2,Song1,Song2) :- + %%length(Form1,FormLength), + MelodyParts1=[MelodyParts2|MelodyParts3], + MelodyParts2=[[_A,[InstrumentNumber1,_B],_C]|_D], + InstrumentNumber2 is InstrumentNumber1,%% + 1, + printheader(Track1,[InstrumentNumber2,_],Song1,Song3), + %%renderm21(Form1,Melody,MelodyParts1,Track1,1,_,Song3,Song4), + renderm21(Form1,Melody,MelodyParts2,Track1,1,_E,Song3,Song4), + Track3 is Track1+1, + renderm1(Form1,Melody,MelodyParts3,Track3,Track2,Song4,Song2),!. + + +renderm21(_Form1,_Melody,[],_Track,Bar,Bar,Voice,Voice) :- !. +renderm21(Form1,Melody,MelodyParts1,Track,Bar1,Bar2,Voice1,Voice2) :- + MelodyParts1=[MelodyPart2|MelodyParts3], + MelodyPart2=[Section1,[_,_],1], + longtoshortform(Section1,Section2), + renderm22(Section2,Melody,Track,Bar1,Bar3,Voice1,Voice3), + renderm21(Form1,Melody,MelodyParts3,Track,Bar3,Bar2,Voice3,Voice2),!. + +renderm21(Form1,Melody,MelodyParts1,Track,Bar1,Bar2,Voice1,Voice2) :- + MelodyParts1=[MelodyPart2|MelodyParts3], + MelodyPart2=[_Section,[_,_],0], + Bar3 is Bar1+4, + renderm21(Form1,Melody,MelodyParts3,Track,Bar3,Bar2,Voice1,Voice2),!. + +renderm22(Section2,Melody1,_Track,Bar,Bar,Voice,Voice) :- + findall(Melody2,(member(Melody2,Melody1), + Melody2=[Section1|_],longtoshortform(Section1,Section2)),List2),List2=[],!. 
+renderm22(Section1,Melody1,Track,Bar1,Bar2,Voice1,Voice2) :- + /** + findall(Lyrics2,(member(Lyrics2,Lyrics1), + Lyrics2=[Section1|_]),Lyrics3), + string_concat(Section2A,_C,Section1), + string_length(Section2A,1), + atom_string(Section2,Section2A), + **/ + findall(Melody2,(member(Melody2,Melody1), + Melody2=[Section1|_]),Melody3), + %%Lyrics3=[[_, Lyrics1A,Lyrics2A,Lyrics3A,Lyrics4A]], + Melody3=[[_, Melody1A], [_, Melody2A]], + renderline1m(Melody1A,Track,Bar1,Voice1,Voice3), + Bar3 is Bar1+1, + renderline1m(Melody2A,Track,Bar3,Voice3,Voice4), + Bar4 is Bar3+1, + renderline1m(Melody1A,Track,Bar4,Voice4,Voice5), + Bar5 is Bar4+1, + renderline1m(Melody2A,Track,Bar5,Voice5,Voice6), + Bar6 is Bar5+1, + delete(Melody3,[Section1|_],Melody5), + renderm22(Section1,Melody5,Track,Bar6,Bar2,Voice6,Voice2). + +renderline1m(Melody1,Track,Bar1,Voice1,Voice2) :- + %%Lyrics1=[_,Lyrics2], + %%Rests is Totallength-Lyrics2Length, + rhythm(BarTimes1),%BarTimes1=["0","1/2","1","1+1/2","2","2+1/2","3","3+1/2"], + generatemelodym(Melody1,Melody2), + renderline2(BarTimes1,BarTimes2,Melody2,Track,Bar1,Voice1,Voice3), + renderlinerests(BarTimes2,_BarTimes3,Track,Bar1,Voice3,Voice2). + +generatemelodym(Melody1,Melody2) :- + generatemelody1(Melody1,[],Melody3), + length(Melody3,Melody2Length), + Lyrics2Length=8, + changelength(Lyrics2Length,Melody2Length,Melody3,Melody2). + +renderh1(_Form1,_Harmony,[],_Track1,_Track2,Song,Song) :- !. +renderh1(Form1,Harmony,HarmonyParts1,Track1,Track2,Song1,Song2) :- + %%length(Form1,FormLength), + HarmonyParts1=[HarmonyParts2|HarmonyParts3], + HarmonyParts2=[[_A,[InstrumentNumber1,_B],_C]|_D], + InstrumentNumber2 is InstrumentNumber1,%% + 1, + printheader(Track1,[InstrumentNumber2,_],Song1,Song3), + %%renderh21(Form1,Harmony,HarmonyParts1,Track1,1,_,Song3,Song4), + renderh21(Form1,Harmony,HarmonyParts2,Track1,1,_E,Song3,Song4), + Track3 is Track1+1, + renderh1(Form1,Harmony,HarmonyParts3,Track3,Track2,Song4,Song2),!. 
+ + +renderh21(_Form1,_Harmony,[],_Track,Bar,Bar,Voice,Voice) :- !. +renderh21(Form1,Harmony,HarmonyParts1,Track,Bar1,Bar2,Voice1,Voice2) :- + HarmonyParts1=[HarmonyPart2|HarmonyParts3], + HarmonyPart2=[Section1,[_,_],1], + longtoshortform(Section1,Section2), + renderh22(Section2,Harmony,Track,Bar1,Bar3,Voice1,Voice3), + renderh22(Section2,Harmony,Track,Bar3,Bar4,Voice3,Voice4), + renderh21(Form1,Harmony,HarmonyParts3,Track,Bar4,Bar2,Voice4,Voice2),!. + +renderh21(Form1,Harmony,HarmonyParts1,Track,Bar1,Bar2,Voice1,Voice2) :- + HarmonyParts1=[HarmonyPart2|HarmonyParts3], + HarmonyPart2=[_Section,[_,_],0], + Bar3 is Bar1+4, + renderh21(Form1,Harmony,HarmonyParts3,Track,Bar3,Bar2,Voice1,Voice2),!. + +renderh22(Section2,Harmony1,_Track,Bar,Bar,Voice,Voice) :- + + %%longtoshortform(Section21,Section2), + findall(Harmony2,(member(Harmony3,Harmony1),member(Harmony2,Harmony3), + Harmony2=[Section1|_],longtoshortform(Section1,Section2)),List2),List2=[],!. +renderh22(Section1,Harmony1A,Track,Bar1,Bar2,Voice1,Voice2) :- + Harmony1A=[Harmony1A1|Harmony1A2], + /** + findall(Lyrics2,(member(Lyrics2,Lyrics1), + Lyrics2=[Section1|_]),Lyrics3), + string_concat(Section2A,_C,Section1), + string_length(Section2A,1), + atom_string(Section2,Section2A), + **/ + + %% shorten + %%string_concat(B11,_C1,Section11), + %%string_length(B11,1),atom_string(Section1,B11), + + + + findall(Harmony2,(member(Harmony2,Harmony1A1), + Harmony2=[Section1|_] + %%,string_concat(B1,_C,Section1A), + %%string_length(B1,1),atom_string(Section1,B1) + ),Harmony3), + %% shorten + + + + (not(Harmony3=[])-> + %%Lyrics3=[[_, Lyrics1A,Lyrics2A,Lyrics3A,Lyrics4A]], + %%Harmony3=[[_, Harmony1A], [_, Harmony2A]], + (renderlines1h(Harmony3,Track,Bar1,Voice1,Voice3), + Bar3 is Bar1+1);(Bar3 is Bar1,Voice3=Voice1)), + /** + Bar3 is Bar1+1, + renderline1h(Harmony2A,Track,Bar3,Voice3,Voice4), + Bar4 is Bar3+1, + renderline1h(Harmony1A,Track,Bar4,Voice4,Voice5), + Bar5 is Bar4+1, + 
renderline1h(Harmony2A,Track,Bar5,Voice5,Voice6), + Bar6 is Bar5+1, + **/ + %%delete(Harmony3,[Section1|_],Harmony5), + renderh22(Section1,Harmony1A2,Track,Bar3,Bar2,Voice3,Voice2). + +renderlines1h([],_Track,_Bar,Voice,Voice) :- !. +renderlines1h(Harmony1,Track,Bar1,Voice1,Voice2) :- + %%Lyrics1=[_,Lyrics2], + %%Rests is Totallength-Lyrics2Length, + rhythm(BarTimes1),%BarTimes1=["0","1/2","1","1+1/2","2","2+1/2","3","3+1/2"], + generatemelodyh(Harmony1,Harmony2), +%%writeln(generatemelodyh(Harmony1,Harmony2)), + renderlineh2(BarTimes1,BarTimes2,Harmony2,Track,Bar1,Voice1,Voice3), + renderlinerests(BarTimes2,_BarTimes3,Track,Bar1,Voice3,Voice2). + +generatemelodyh(Harmony1,Harmony2) :- + generatemelodyh1(Harmony1,[],Harmony3), + length(Harmony3,Harmony2Length), + Lyrics2Length=8, + changelengthh(Lyrics2Length,Harmony2Length, + Harmony3,Harmony2). + +generatemelodyh1([],Melody,Melody) :- !. +generatemelodyh1(Melody1,Melody2,Melody3) :- + Melody1=[Melody4|Melody5], + append(Melody2,[Melody4%,Melody4 + ],Melody6), + generatemelodyh1(Melody5,Melody6,Melody3),!. + +%%changelengthh(_Melody2Length,Melody,Melody) :- !. +changelengthh(Lyrics2Length,Melody2Length,Melody,Melody) :- + Lyrics2Length=Melody2Length,!. +changelengthh(Lyrics2Length,Melody2Length,Melody1,Melody2) :- + Lyrics2Length > Melody2Length, + Length is Lyrics2Length-Melody2Length, + repeatlastnoteh1(Length,Melody1,Melody2). +changelengthh(Lyrics2Length,Melody2Length,Melody1,Melody2) :- + Lyrics2Length < Melody2Length, + length(Melody2,Lyrics2Length), + append(Melody2,_,Melody1). + +repeatlastnoteh1(Length,Melody1,Melody2) :- + reverse(Melody1,Melody3), + Melody3=[Item|_], + repeatlastnoteh2(Length,Item,Melody1,Melody2),!. + +repeatlastnoteh2(0,_Item,Melody,Melody) :- !. +repeatlastnoteh2(Length1,Item,Melody1,Melody2) :- + append(Melody1,[Item],Melody3), + Length2 is Length1-1, + repeatlastnoteh2(Length2,Item,Melody3,Melody2),!. + +renderlineh2([]%BarTimes +,_BarTimes,_%[] +,_Track,_Bar1,Voice,Voice) :- !. 
+renderlineh2(BarTimes1,BarTimes4,Melody1,Track,Bar,Voice1,Voice2) :- + %Melody1=[Melody2|Melody3], + %Melody2=[_,[Melody21,Melody22,Melody23]], + BarTimes1=[BarTimes2|BarTimes3], + + + BarTimes2=[Time,Type,Length,Nth_note_from_mel_harm,Velocity], + ((Type="NT", + get_item_n(Melody1,Nth_note_from_mel_harm,Melody2), + Melody2=[_,[Melody21,Melody22,Melody23]], + %Melody1=[Melody2|Melody3], + %concat_list("BA ",[Bar," CR ",Time," TR ",Track," CH ",Track," NT ",Melody2,"- ",Length," von=",Velocity," voff=0\n"],Song)); + + + concat_list("BA ",[Bar," CR ",Time," TR ",Track," CH ",Track," NT ",Melody21,"- ",Length," von=",Velocity," voff=0\n"],Song1), + concat_list("BA ",[Bar," CR ",Time," TR ",Track," CH ",Track," NT ",Melody22,"- ",Length," von=",Velocity," voff=0\n"],Song2), + concat_list("BA ",[Bar," CR ",Time," TR ",Track," CH ",Track," NT ",Melody23,"- ",Length," von=",Velocity," voff=0\n"],Song3))->true; + + (%Type="R", + %get_item_n(Melody1,Nth_note_from_mel_harm,Melody2), + %Melody2=[_,[Melody21,Melody22,Melody23]], + %Melody1=[Melody2|Melody3], + %concat_list("BA ",[Bar," CR ",Time," TR ",Track," CH ",Track," NT ",Melody2,"- ",Length," von=",Velocity," voff=0\n"],Song)); + + + concat_list("BA ",[Bar," CR ",Time," TR ",Track," CH ",Track," NT R ",Length," voff=0\n"],Song1), + concat_list("BA ",[Bar," CR ",Time," TR ",Track," CH ",Track," NT R ",Length," voff=0\n"],Song2), + concat_list("BA ",[Bar," CR ",Time," TR ",Track," CH ",Track," NT R ",Length," voff=0\n"],Song3))), + + concat_list(Voice1,[Song1,Song2,Song3],Voice3), + renderlineh2(BarTimes3,BarTimes4,Melody1,Track,Bar,Voice3,Voice2). + +outputlyrics(File1,Lyrics1) :- + Lyrics1=[Lyrics2|_Lyrics3], + Lyrics2=[_|Lyrics4], + Lyrics4=[Lyrics5|_Lyrics6], + sentencewithspaces(Lyrics5,Lyrics7A), + string_concat(Lyrics7A,"\n\n",Lyrics7), + outputlyrics1(Lyrics1,Lyrics7,Lyrics8), + concat_list(File1,["lyrics.txt"],File2), + open_s(File2,write,Stream), + write(Stream,Lyrics8), + close(Stream). 
+%%texttobr2(u,File2,u,u),texttobr(u,File2,u,u).
+	%%writeln(["Texttobr, Texttobr2 not working. Please manually breason out ",File2]).
+
+outputlyrics1([],Lyrics,Lyrics) :- !.
+outputlyrics1(Lyrics1,Lyrics5,Lyrics6) :-
+	Lyrics1=[Lyrics2|Lyrics3],
+	Lyrics2=[_|Lyrics4],
+	outputlyrics2(Lyrics4,Lyrics5,Lyrics7),
+	string_concat(Lyrics7,"\n",Lyrics8),
+	outputlyrics1(Lyrics3,Lyrics8,Lyrics6).
+
+outputlyrics2([],Lyrics,Lyrics) :- !.
+outputlyrics2(Lyrics1,Lyrics5,Lyrics6) :-
+	Lyrics1=[Lyrics2|Lyrics3],
+	sentencewithspaces(Lyrics2,Lyrics4),
+	%%writeln(Lyrics4),
+	concat_list(Lyrics5,[Lyrics4,"\n"],Lyrics7),
+	outputlyrics2(Lyrics3,Lyrics7,Lyrics6).
+
+max([],M,M) :- !.
+max(List1,Max1,Max2) :-
+	List1=[Item|List2],
+	Item>=Max1,
+	max(List2,Item,Max2),!.
+max(List1,Max1,Max2) :-
+	List1=[Item|List2],
+	Item<Max1,
+	max(List2,Max1,Max2),!.
+
+%% findbest/2 picks the item from a list of [Score,Item] pairs with the
+%% highest score (for interactive trialy2/3 results the single pair is returned).
+findbest(R,Item1) :-
+	sort(R,R1),
+	reverse(R1,R2),
+	R2=[[_,Item1]|_Rest].
+
+find("Should a chorus or instrumental come after the first verse?",CorI) :-
+	trialy2([c,i1],R),
+	findbest(R,CorI).
+
+find(["Should a chorus and outro or two solos come after the first solo?",1],COorSS) :-
+	trialy2([[c,o],[s,s]],R),
+	findbest(R,COorSS).
+
+find(["Should a chorus and outro or two solos come after the first solo?",2],COorSS) :-
+	trialy2([[c,o],[s,s]],R),
+	findbest(R,COorSS).
+
+%%find("Who is the song about?",Character) :-
+%%	trialy2(["Ma","til","da"],["Jo","se","phine"],["Hap","py"],["Ha","rold"],R),
+%%	findbest(R,Character).
+
+find("Should the chord progression type be 1451, 1564, 1645, Classical or Classical Pop?",CPT) :-
+	trialy2([1451, 1564, 1645
+	%%
+	, classical, classicalpop
+	],R),
+	findbest(R,CPT).
+
+/**generatelyricslistsverse(Character,Lyrics1,Lyrics2):-
+%% read c, o, v
+	reado(Objects),
+	readv(Verbs),
+
+	charobj( %% character verb object first
+	%% object verb object pairs
+
+lyrics1([],_Character,Lyrics,Lyrics) :- !.
+lyrics1(Form1,Character,Lyrics1,Lyrics2) :- + Form1=[Form2|Forms2], + lyrics2(Form2,Character,Lyrics1,Lyrics3), + +lyrics2([instrumental,_],_Character,Lyrics1,Lyrics2) :- + +append_list(Lyrics1,[[],[],[],[]],Lyrics2). +lyrics2([verse,_],Character,Lyrics1,Lyrics2) :- + lyricsv1(Lyrics1,[Character],Lyrics3), + append_list(Lyrics1,[Lyrics3,*],Lyrics2), + . + +lyricsv1(Lyrics0,Lyrics1,Lyrics2) :- + readsc(SyllableCount), + readv(Verbs1), + +reado(Objects1), + **/ +%%removetoolongandnotrhyming(Lyrics0,Lyrics1,SyllableCount,Verbs1,Verbs2,Objects1,Objects2),] +%%findv +%% same for o +%% findo + %% choose words that fit syllable count + +append_list(A,[],A) :-!. +append_list(A,List,B) :- + List=[Item|Items], + append2(A,[Item],C), + append_list(C,Items,B). +append2(A,List,C) :- + (List=[Item]-> + append(A,Item,C); + fail),!. +append2(A,Item,C) :- + append_list(A,Item,C),!. + +readsc(7). +readv([["loves"],["is"],["has"],["is","in"],["moves","to"],["nur","tures"],["needs"],["makes"],["lifts"],["finds"],["forms"],["goes","to"],["writes","on"],["reads","on"],["feels"],["is"]]). + +reado([["one"],["the","oth","er"],["the","runn","er"],["the","draw","er"],["the","count","er"],["the","graph","er"],["the","test","er"],["the","breaths","on","er"],["the","writ","er"],["the","spell","er"],["the","updat","er"],["the","check","er"],["the","choos","er"],["the","ess","ence"],["the","comb","in","er"],["the","mir","ac","le"],["the","trans","lat","or"],["the","gramm","ar"]]). + +rhymes([["one","er","or","ar","an","ae","er","ler","ur","ard","ney","ald","ess","el"],["le","py","ye","ee","ice"]]). + +%%removetoolongandnotrhyming(Lyrics1,SyllableCount,[],Verbs,Verbs) :- !. 
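The live helpers above (`append_list/3`, `readv/1`, `reado/1`, `rhymes/1`) flatten syllable lists and define the rhyme classes used by the lyric generator. A minimal usage sketch, assuming the predicates above are loaded in SWI-Prolog (the query data is illustrative):

```prolog
%% append_list/3 folds syllable sublists onto an accumulator list.
?- append_list(["the"], [["runn","er"], ["loves"]], Flat).
%% Flat = ["the", "runn", "er", "loves"]

%% rhymes/1 groups final syllables that are treated as rhyming:
%% "er" and "ur" sit in the same class, so lines ending in them rhyme.
?- rhymes(Classes), member(Class, Classes),
   member("er", Class), member("ur", Class).
```

The second query succeeds because both syllables belong to the first rhyme class in `rhymes/1`.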
+/**removetoolongandnotrhyming(Lyrics0,Lyrics1,SyllableCount,Verbs1,Verbs2,Objects1,Objects2) :-
+%% makes list of all combos, checks, asks, asks
+	removetoolong1(Lyrics1,SyllableCount,
+		Verbs1,[],Verbs3), %% do after end
+	* %% not until object
+	%% find v x
+	removenotrhyming1(Lyrics0,Verbs3,
+		Verbs4),
+
+	removetoolong1(Lyrics1,SyllableCount,[],Verbs,Verbs) :- !.%*
+	removetoolong1(Lyrics1,SyllableCount,Verbs1,Verbs2,Verbs3) :-
+		Verbs1=[Verb4|Verbs5],
+		append_list(Lyrics1,Verb4,Lyrics2),
+		length(Lyrics2,Length),
+		(Length=<SyllableCount->(append(Verbs2, [Verb4],Verbs6), removetoolong1(Lyrics1,SyllableCount,Verbs5,Verbs6,Verbs3));(removetoolong1(Lyrics1,SyllableCount,Verbs5,Verbs2,Verbs3))).
+
+removenotrhyming1(Lyrics,Verbs1,Verbs2) :-
+	rhymes(Rhymes1),
+	length(Lyrics,Length),
+	Line1 is mod(Length,4),
+	(Line1 is 3-1->
+		(Line2 is 3-2,
+		removenotrhyming15(Line2,Rhymes1,
+			Rhymes3),
+		removenotrhyming2(Rhymes3,
+			Verbs1,[],Verbs2));
+	(Line1 is 4-1->
+		(Line2 is 4-2,
+		removenotrhyming15(Line2,Rhymes1,
+			Rhymes3),
+		removenotrhyming2(Rhymes3,
+			Verbs1,[],Verbs2));Verbs1=Verbs2)
+	).
+
+removenotrhyming15(Line2,Rhymes1,Rhymes3) :-
+	length(List1,Line2),
+	append(List1,_,Lyrics),
+	reverse(List1,List2),
+	List2=[Item1|_Items2],
+	reverse(Item1,Item2),
+	Item2=[Item3|_Items4],
+	member(Rhymes2,Rhymes1),
+	member(Item3,Rhymes2),
+	delete(Rhymes2,Item3,Rhymes3).
+
+removenotrhyming2(_Rhymes3,[],Verbs,Verbs) :- !.
+removenotrhyming2(Rhymes3,Verbs1,Verbs2,Verbs3) :-
+	(Verbs1=[Verb4|Verbs5],
+	(member(Verb4,Rhymes3),
+	delete(Rhymes3,Verb4,Rhymes4))
+	->(append(Verbs2,[Verb4],Verbs6),
+		removenotrhyming2(Rhymes4,Verbs5,
+			Verbs6,Verbs3));
+	removenotrhyming2(Rhymes3,Verbs5,
+		Verbs2,Verbs3)).
+	**/
+trialy2([],R) :-
+	R=[[_,['C']]].
+	%%writeln([[],in,trialy2]),abort.
+%% Delegate 2-arity calls (used by find/2, makename/1 etc.) to the
+%% interactive 3-arity version; the prompt wording here is an assumption.
+trialy2(List,R) :-
+	trialy2("Please enter an item from:",List,R).
+trialy2(Question,List,R) :- + %%random_member(A,List), + term_to_atom(List,List1), + repeat, + writeln([Question,%%"Please enter an item from:", + List1]), + read_string(user_input, "\n", "\r", _End2, A1), + atom_to_term(A1,A,_), + %%((number_string(A,A1)->true;atom_string(A,A1))->true;, + member(A,List), + R=[[_,A]]. + + /** + length(List,L), + Trials is L*3, + trialy22(List,Trials,[],R).**/ + +trialy22([],_,R,R) :- !. +trialy22(List,Trials,RA,RB) :- + List=[Item|Items], + trialy21(Item,Trials,R1), + append(RA,[R1],RC), + trialy22(Items,Trials,RC,RB),!. + +trialy21(Label,Trials,RA) :- + trialy3(Trials,[],R), + aggregate_all(count, member(true,R), Count), + RA=[Count,Label]. + +trialy3(0,R,R) :-!. +trialy3(Trials1,RA,RB) :- + trialy1(R1), + append(RA,[R1],RC), + Trials2 is Trials1-1, + trialy3(Trials2,RC,RB),!. + +%% try other nouns +trialy1(R1) :- + %%control11(A1), + trial0(A22), %% Control + sum(A22,0,S22), + mean(S22,A1), + trial0(A21), %% Test 1 + sum(A21,0,S02), + mean(S02,A2), + (A1>A2->R1=true;R1=fail). + +trial0(S3) :- N is 10, trial1(N,[],S),trial01(S,S3). +trial01(S1,S3) :- + sort(S1,S), + %%midpoint(S,MP), + halves(S,H1,H2), + midpoint(H1,Q1), + midpoint(H2,Q3), + IQR is Q3-Q1, + sum(S,0,S02), + mean(S02,Mean), + furthestfrommean(S,Mean,V), + D1 is 1.5*IQR, + D2 is V-Mean, + (D2>D1->(delete(S,V,S2),trial01(S2,S3));S=S3). + +trial1(0,A,A) :- !. +trial1(N,A,B) :- mindreadtest(S), append(A,[S],A2), + N1 is N-1,trial1(N1,A2,B). + +midpoint(S,MP) :- + length(S,L), + A is mod(L,2), + (A is 0-> + (M1 is L/2, M2 is M1+1,N1 is M1-1,N2 is M2-1,length(N11,N1),length(N21,N2),append(N11,[N12|_Rest1],S),append(N21,[N22|_Rest2],S),MP is (N12+N22)/2) + ; + (L2 is L+1, M1 is L2/2, N1 is M1-1,length(N11,N1),append(N11,[MP|_Rest],S))). + +halves(S,H1,H2) :- + length(S,L), + A is mod(L,2), + (A is 0-> + (M1 is L/2,length(H1,M1),append(H1,H2,S)) + ; + (L2 is L-1,M1 is L2/2,length(H1,M1),append(H1,[_|H2],S))). + +sum([],S,S):-!. 
+sum(S0,S1,S2) :- + S0=[S3|S4], + S5 is S1+S3, + sum(S4,S5,S2). + +mean(Sum,Mean) :- + Mean is Sum/2. + +furthestfrommean(S,Mean,V) :- + absdiffmean(S,Mean,[],D), + sort(D,D1), + reverse(D1,[[_,V]|_Rest]). + +absdiffmean([],_M,D,D) :- !. +absdiffmean(S,M,D1,D2) :- + S=[S1|S2], + S3 is abs(S1-M), + append(D1,[[S3,S1]],D3), + absdiffmean(S2,M,D3,D2). + +mindreadtest(Sec) :- + %% 250 br for characters to be br out with 10 br each from person to me - do when initial 250 br test done and doing 10 br test + %%comment(fiftyastest), + %%random(X),X1 is 10*X, X2 is floor(X1), (X2=<2 -> ( + %%texttobr,writeln(['true test']), %%); %% use breasonings breasoned out by computer for not by me, for job medicine for "me", at last time point + %%true), %% leave last time point blank + %%**texttobr2(640);true),%% make an A to detect reaction to gracious giving or blame of in following + get_time(TimeStamp1), + %%phrase_from_file(string(_String), 'file.txt'), + %%**texttobr2(2), %% 100 As for answer (must be br before this on same day) + %% is gracious giving or blame + get_time(TimeStamp2), + %%comment(turnoffas), + Sec is TimeStamp2 - TimeStamp1. + +/**string(String) --> list(String). + +list([]) --> []. +list([L|Ls]) --> [L], list(Ls). + +comment(fiftyastest). +comment(turnoffas). +**/ + +concat_list(A,[],A):-!. +concat_list(A,List,B) :- + List=[Item|Items], + string_concat(A,Item,C), + concat_list(C,Items,B). + +remove_dups([],List,List) :- !. +remove_dups(List1,List2,List3) :- + List1=[Item|List4], + delete(List4,Item,List5), + append(List2,[Item],List6), + remove_dups(List5,List6,List3),!. + +shell1(Command) :- + (bash_command(Command,_)-> + true; + (writeln(["Failed shell1 command: ",Command]),abort) + ),!. + +bash_command(Command, Output) :- + setup_call_cleanup(process_create(path(bash), + ['-c', Command], + [stdout(pipe(Out))]), + read_string(Out, _, Output), + close(Out)). 
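The statistical helpers above (`midpoint/2`, `halves/2`, `trial01/2`) implement a simple 1.5×IQR outlier filter over the timed mind-reading trials. A worked sketch with a hypothetical sample, assuming the predicates above are loaded:

```prolog
%% Quartiles are the midpoints of the two halves of the sorted sample;
%% the interquartile range (IQR) is their difference.
?- Sample = [1, 2, 3, 4, 100],
   sort(Sample, Sorted),
   halves(Sorted, H1, H2),   % H1 = [1,2], H2 = [4,100] (odd length drops the median)
   midpoint(H1, Q1),         % Q1 = 1.5
   midpoint(H2, Q3),         % Q3 = 52
   IQR is Q3 - Q1.           % IQR = 50.5
```

`trial01/2` then repeatedly deletes the value furthest from the mean while it lies more than 1.5×IQR away; note that `sort/2` also removes duplicate sample values.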
+
diff --git a/Music-Composer/mindreadtestmusiccomposer-unusual-ui.pl b/Music-Composer/mindreadtestmusiccomposer-unusual-ui.pl
new file mode 100644
index 0000000000000000000000000000000000000000..c09dff6c9c6d02ec50a8a6fd51645d61bd0c99c7
--- /dev/null
+++ b/Music-Composer/mindreadtestmusiccomposer-unusual-ui.pl
@@ -0,0 +1,1184 @@
+/**
+
+- Form
+- Which chord progression type?
+- Lyrics - random sentences, rhyming
+- Melody - random starting and ending note of first phrase, melody and harmony from chord progression type with three notes (split for melody), the circle of fifths for connection between sections
+- Which of up to 3 instruments for each of melody and harmony
+- What sections they will play
+
+- Melody cut to the length of vocals
+
+**/
+
+:- use_module(library(pio)).
+
+:- use_module(library(date)).
+%%:- include('texttobr2qb').
+%%:- include('mindreadtestshared').
+%%:- include('texttobr2').
+%%:- include('texttobr').
+%%:- include('texttobr2qbmusic').
+
+:- include('musiclibrary').
+:- include('la_strings').
+
+sectest(0) :- !.
+sectest(N1) :-
+	writeln(["\n",N1]),
+	sectest0(_Form1,_Lyrics,_Melody,_Harmony,_MelodyParts,
+		_HarmonyParts,_Vocalstubinstrument,_Song1),
+	N2 is N1-1,
+	sectest(N2),!.
+
+sectest0(Form1,Lyrics,Melody,Harmony,MelodyParts,HarmonyParts,Vocalstubinstrument,Song1) :-
+	check_asc2mid,
+	%%texttobr2qb(2), %%Imagine song
+	%%form(Form1),
+	writeln("Please enter form in format e.g. 
[n,v1,i1,v2,c,t2,s,s,s]."), + read_string(user_input, "\n", "\r", _End21, Form1A), + atom_to_term(Form1A,Form1,_), + + %%Form1=[v2,o], + %%find("Should the chord progression type be 1451, 1564, 1645, Classical or Classical Pop?",CPT), + writeln("Should the chord progression type be 1451, 1564, 1645, classical or classicalpop?"), + read_string(user_input, "\n", "\r", _End2, CPT1), + atom_string(CPT,CPT1), + + remove_dups(Form1,[],Form2), + Voiceparts1=[v1,v2,c,s], + intersection(Form1,[v1,v2,c,s],Voiceparts2), + %%lyrics(Voiceparts1,Lyrics,Maxlength), + writeln(["Please enter lyrics for parts",Voiceparts1,"in format e.g. [[v1, [\"All\", \"ney\", \"goes\", \"to\", \"the\"], [\"Le\", \"ice\", \"is\", \"in\"], [\"the\", \"mir\", \"ac\"], [\"the\", \"graph\"]], [v2, [\"Dix\", \"ard\"]]]."]), + read_string(user_input, "\n", "\r", _End2, LyricsA), + atom_to_term(LyricsA,Lyrics,_), + +findall(DH,(member(C1H,Lyrics),C1H=[_|C2H],member(CH,C2H),length(CH,DH)),EH),sort(EH,FH),reverse(FH,GH),GH=[Maxlength|_], + + findall(B,(member(A1,Form2),string_concat(B1,_C,A1), + string_length(B1,1),atom_string(B,B1)),Form3), + remove_dups(Form3,[],Form4), + %%repeat, %% in case melody, harmony don't work + melodyharmony(Form4,CPT,Maxlength,Melody,Harmony), + %%writeln(melodyharmony(Form4,CPT,Maxlength,Melody,Harmony)), %% *** + instruments(Form1,MelodyInstruments,HarmonyInstruments, + MelodyParts,HarmonyParts,Vocalstubinstrument), + %%writeln(instruments(Form1,MelodyInstruments,HarmonyInstruments, +%% MelodyParts,HarmonyParts, +%% Vocalstubinstrument)), + %%writeln(rendersong(Form1,Voiceparts2,Maxlength,Melody,Harmony, + %% MelodyInstruments,HarmonyInstruments,MelodyParts, + %% HarmonyParts,Lyrics, + %% Vocalstubinstrument,Song1)), %%, +%%trace, + rendersong(Form1,Voiceparts2,Maxlength,Melody,Harmony, + MelodyInstruments,HarmonyInstruments,MelodyParts, + HarmonyParts,Lyrics, + Vocalstubinstrument,Song1,File1), %%, + 
+Meta_file=[[form,Form1],[chord_progressions,CPT],[voice_part,Voiceparts2],[melody,Melody],[harmony,Harmony],[melody_instruments, + MelodyInstruments],[harmony_instruments,HarmonyInstruments],[melody_parts,MelodyParts],[harmony_parts, + HarmonyParts],[lyrics,Lyrics], + [genre,["anthem"]]], + + term_to_atom(Meta_file,Meta_file1), + string_atom(Meta_file2,Meta_file1), + + concat_list("",[File1,"_meta.txt"],File2), + (open_s(File2,write,Stream1), + write(Stream1,Meta_file2), + close(Stream1)),! + . + +%% n intro, vn verse, c chorus, i1 instrumental1, t2, instrumental 2, s solo, o outro + +findall1(Form2,Form3) :- + findall(B,shorten(Form2,B),Form3). + +shorten(Form2,B) :- + member(A1,Form2),string_concat(B1,_C,A1), + string_length(B1,1),atom_string(B,B1). + +form(Form) :- + Form1=[n,v1], + find("Should a chorus or instrumental come after the first verse?",CorI), + (CorI=c-> + (append(Form1,[c,v2,c,t2,s],Form2), + find(["Should a chorus and outro or two solos come after the first solo?",1],COorSS), + (COorSS=[c,o]-> + (append(Form2,[c],Form3), + append(Form3,[o],Form)); + COorSS=[s,s], + append(Form2,[s],Form3),append(Form3,[s],Form)) + ); + (CorI=i1,append(Form1,[i1,v2,c,t2,s],Form2), + find(["Should a chorus and outro or two solos come after the first solo?",2],COorSS), + (COorSS=[c,o]-> + (append(Form2,[c],Form3), + append(Form3,[o],Form)); + COorSS=[s,s], + append(Form2,[s],Form3),append(Form3,[s],Form)) + )). + +makename(N3) :- + trialy2(["Ack","Luc","Add","All","Brae","Skye","Whist","Dix","Wilb","Duk","Le","Ven", + "Syd","Don","Count","Black","Nei"],R1), + findbest(R1,N1), + trialy2(["an","ae","ye","ler","ee","ur","ard","ice","ney","ald","ess","el"],R2), + findbest(R2,N2), + append([N1],[N2],N3). + +makenames(0,Ns,Ns) :- !. 
+makenames(Num1,N1,N2) :-
+	random_member(R1,["Ack","Luc","Add","All","Brae","Skye","Whist","Dix","Wilb","Duk","Le","Ven",
+	"Syd","Don","Count","Black","Nei"]),
+	random_member(R2,["an","ae","ye","ler","ee","ur","ard","ice","ney","ald","ess","el"]),
+	append([R1],[R2],R3),
+	append(N1,[R3],N3),
+	Num2 is Num1-1,
+	makenames(Num2,N3,N2).
+
+findsentence(Sentence,Names1,Names2) :-
+	makename(N1),readv(Vs),reado(Os),
+	makenames(5,[],N2s1),append(N2s1,Names1,N2s),
+	trialy2(Vs,R1),
+	findbest(R1,R11),
+	append(Os,N2s,Os2),
+	trialy2(Os2,R2),
+	findbest(R2,R21),
+	append_list(N1,[R11,R21],Sentence),
+	(member(R21,N2s1)->
+		append(Names1,[R21],Names2);
+		Names2=Names1).
+
+findrhyme(Phrase1,Phrase2,Names1,Names2) :-
+	reverse(Phrase1,Reverse),
+	Reverse=[Lastsyllable|_],
+	readv(Vs),reado(Os),makenames(5,[],N2s1),
+	append(N2s1,Names1,N2s),
+	append(Os,N2s,Os2),
+	trialy2(Os2,R2),
+	findbest(R2,R21),
+	trialy2(Vs,R1),
+	findbest(R1,R11),
+	findall(C,(member(C,N2s),reverse(C,B),B=[E|_],rhymes2(E,Lastsyllable)),Phrases22),
+	trialy2(Phrases22,R3),
+	findbest(R3,R31),
+	append_list(R21,[R11,R31],Phrase2),
+	(member(R21,N2s1)->
+		append(Names1,[R21],Names3);
+		Names3=Names1),
+	%% track the chosen rhyming name (R31), not the candidate list
+	(member(R31,N2s1)->
+		append(Names3,[R31],Names2);
+		Names2=Names3).
+
+rhymes2(Syllable1,Syllable2) :-
+	rhymes(Lists),
+	member(List1,Lists),
+	member(Syllable1,List1),
+	member(Syllable2,List1).
+
+lyrics(Form1,Lyrics,Maxlength) :-
+	%%find("Who is the song about?",Character),
+	%%lyrics1(Form,Character,[],Lyrics).
+	catch((lyrics2(Form1,[],Lyrics,[],_Names,0,Maxlength)),_,
+		lyrics(Form1,Lyrics,Maxlength)).
+
+lyrics2([],Lyrics,Lyrics,Names,Names,Maxlength,Maxlength) :- !.
+lyrics2(Form1,Lyrics1,Lyrics2,Names1,Names2,Maxlength1,Maxlength2) :- + Form1=[Form2|Forms3], + findsentence(Sentence1,Names1,Names3), + length(Sentence1,Length1), + findsentence(Sentence2,Names3,Names4), + length(Sentence2,Length2), + findrhyme(Sentence1,Sentence3,Names4,Names5), + length(Sentence3,Length3), + findrhyme(Sentence2,Sentence4,Names5,Names6), + length(Sentence4,Length4), + append(Lyrics1,[[Form2,Sentence1,Sentence2,Sentence3,Sentence4]],Lyrics3), + max([Length1,Length2,Length3,Length4],Maxlength1,Maxlength3), + lyrics2(Forms3,Lyrics3,Lyrics2,Names6,Names2,Maxlength3,Maxlength2),!. + + +melodyharmony(Form1,CPT,Maxlength,Melody,Harmony) :- + Partlength is Maxlength div 3, + Extra is Maxlength mod 3, + Total is Partlength+Extra, + _Parts=[Partlength,Partlength,Partlength,Total], + %%(CPT=1451->findall(A,note0(_,A),Notes); + %%findall(A,note0(_,A),Notes)), + %% What note should the song start on? + %%trialy2(Notes,R1), + %%findbest(R1,R11), + %%R11='A', + melodyharmony(Form1,CPT,_Parts2,_R11,_R2,[],Melody,[],Harmony). + +melodyharmony([],_CPT,_Parts,N,N,Melody,Melody,Harmony,Harmony) :- !. +melodyharmony(Form1,CPT,Parts,N1,N2,Melody1,Melody2,Harmony1,Harmony2) :- + Form1=[Form2|Forms3], + findmelody(Form2,CPT,Parts,N1,N3,Melody1,Melody3,Harmony1,Harmony3), + + (CPT='1451'->harmony1451(N3,3,N4);harmonyr(N3,5,N4)), + findmelody(Form2,CPT,Parts,N4,N5,Melody3,Melody4,Harmony3,Harmony4), + %%reverse(Melody3,Melody3R),Melody3R=[[_,Melody3RI]|_], %% relies on repeat *1 + %%reverse(Melody4,Melody4R),Melody4R=[[_,Melody4RI]|_], + %%not((length(Melody3RI,1),length(Melody4RI,1))), + melodyharmony(Forms3,CPT,Parts,N5,N2,Melody4,Melody2, + Harmony4,Harmony2). + +findmelody(Form,CPT,_Parts,_N1,_N2,Melody1,Melody2,Harmony1,Harmony2) :- + CPT='1451', + %%(CPT=1451->findall(A,note0(_,A),Notes); + %%findall(A,note0(_,A),Notes)), + %% What note should the phrase end on? 
+ %%repeat, + %%trialy2(Notes,R1), + %%findbest(R1,N2), + %%versechorussoloprogression1451(N1,N2,Progression), + %%trialy2(Progression,R2), + %%findbest(R2,Progression2), + repeat, + writeln(["Please enter melody line for form",Form,"in format e.g. four notes from [d,di,r,ri,m,f,fi,s,si,l,li,t]."]), + read_string(user_input, "\n", "\r", _End21, Progression2B), + atom_to_term(Progression2B,Progression2A,_), + solfatonotes(Progression2A,[],Progression2), +not(Progression2=[]), + append(Melody1,[[Form,Progression2]],Melody2), + %%versechorussoloprogression1451(N1,N2,Progression3), + %%trialy2(Progression3,R3), + %%findbest(R3,Progression4), + repeat, + writeln(["Please enter harmony line in format e.g. four notes from ['C','E','G'] for",Form,"."]), + read_string(user_input, "\n", "\r", _End2, Progression4A), + atom_to_term(Progression4A,Progression4,_), + + harmony(Form,CPT,Progression4,Harmony1,Harmony2). +findmelody(Form,CPT,_Parts,_N1,_N2,Melody1,Melody2,Harmony1,Harmony2) :- + CPT='1564', + repeat, + writeln(["Please enter melody line for form",Form,"in format e.g. four notes from [d,di,r,ri,m,f,fi,s,si,l,li,t]."]), + read_string(user_input, "\n", "\r", _End2, Progression2B), + atom_to_term(Progression2B,Progression2A,_), + solfatonotes(Progression2A,[],Progression2), not(Progression2=[]), + append(Melody1,[[Form,Progression2]],Melody2), + %%versechorussoloprogression1451(N1,N2,Progression3), + %%trialy2(Progression3,R3), + %%findbest(R3,Progression4), + repeat, + writeln(["Please enter harmony line in format e.g. four notes from ['C','E','G'] for",Form,"."]), + read_string(user_input, "\n", "\r", _End2, Progression4A), + atom_to_term(Progression4A,Progression4,_), + harmony(Form,CPT,Progression4,Harmony1,Harmony2). +findmelody(Form,CPT,_Parts,_N1,_N2,Melody1,Melody2,Harmony1,Harmony2) :- + CPT='1645', + repeat, + writeln(["Please enter melody line for form",Form,"in format e.g. 
four notes from [d,di,r,ri,m,f,fi,s,si,l,li,t]."]), + read_string(user_input, "\n", "\r", _End21, Progression2B), + atom_to_term(Progression2B,Progression2A,_), + solfatonotes(Progression2A,[],Progression2), not(Progression2=[]), + append(Melody1,[[Form,Progression2]],Melody2), + %%versechorussoloprogression1451(N1,N2,Progression3), + %%trialy2(Progression3,R3), + %%findbest(R3,Progression4), + repeat, + writeln(["Please enter harmony line in format e.g. four notes from ['C','E','G'] for",Form,"."]), + read_string(user_input, "\n", "\r", _End2, Progression4A), + atom_to_term(Progression4A,Progression4,_), + harmony(Form,CPT,Progression4,Harmony1,Harmony2). +findmelody(Form,CPT,_Parts,_N1,_N2,Melody1,Melody2,Harmony1,Harmony2) :- + CPT=classical, + repeat, + writeln(["Please enter melody line for form",Form,"in format e.g. four notes from [d,di,r,ri,m,f,fi,s,si,l,li,t]."]), + read_string(user_input, "\n", "\r", _End21, Progression2B), + atom_to_term(Progression2B,Progression2A,_), + solfatonotes(Progression2A,[],Progression2), not(Progression2=[]), + append(Melody1,[[Form,Progression2]],Melody2), + %%versechorussoloprogression1451(N1,N2,Progression3), + %%trialy2(Progression3,R3), + %%findbest(R3,Progression4), + repeat, + writeln(["Please enter harmony line in format e.g. four notes from ['C','E','G'] for",Form,"."]), + read_string(user_input, "\n", "\r", _End2, Progression4A), + atom_to_term(Progression4A,Progression4,_), + harmony(Form,CPT,Progression4,Harmony1,Harmony2). +findmelody(Form,CPT,_Parts,_N1,_N2,Melody1,Melody2,Harmony1,Harmony2) :- + CPT=classicalpop, + repeat, + writeln(["Please enter melody line for form",Form,"in format e.g. 
four notes from [d,di,r,ri,m,f,fi,s,si,l,li,t]."]), + read_string(user_input, "\n", "\r", _End21, Progression2B), + atom_to_term(Progression2B,Progression2A,_), + solfatonotes(Progression2A,[],Progression2), not(Progression2=[]), + append(Melody1,[[Form,Progression2]],Melody2), + %%versechorussoloprogression1451(N1,N2,Progression3), + %%trialy2(Progression3,R3), + %%findbest(R3,Progression4), + repeat, + writeln(["Please enter harmony line in format e.g. four notes from ['C','E','G'] for",Form,"."]), + read_string(user_input, "\n", "\r", _End2, Progression4A), + atom_to_term(Progression4A,Progression4,_), + harmony(Form,CPT,Progression4,Harmony1,Harmony2). + +harmony(Form,CPT,Progression,Harmony1,Harmony2) :- + harmony1(Form,CPT,Progression,[],Harmony3), + append(Harmony1,[Harmony3],Harmony2). + +harmony1(_Form,_CPT,[],Harmony,Harmony) :- !. +harmony1(Form,CPT,Progression1,Harmony1,Harmony2) :- + Progression1=[Note1|Progression2], + (CPT='1451'->(harmony1451(Note1,2,Note2),harmony1451(Note1,4,Note3)); + (harmonyr(Note1,4,Note2),harmonyr(Note1,7,Note3))), + Harmony30=[Note2,Note3], + findall(B1,(member(A1,Harmony30),string_concat(B,_C,A1),string_length(B,1),atom_string(B1,B)),Harmony31), + append([Note1],Harmony31,Harmony3), + append(Harmony1,[[Form,Harmony3]],Harmony4), + harmony1(Form,CPT,Progression2,Harmony4,Harmony2),!. + +harmony1451(Note1,Value2,Note2) :- + note(Value1,Note1),Value3 is Value1+Value2,Value4 is Value3 mod 12,note(Value4,Note2). +harmonyr(Note1,Value2,Note2) :- + note(Value1,Note1),Value3 is Value1+Value2,Value4 is Value3 mod 12,note(Value4,Note2). + + +solfatonotes([],Progression1,Progression1) :- !. +solfatonotes(Progression2A,Progression1,Progression2) :- + Progression2A=[N1|Ns], + solfegenotetonote(N1,N2), + append(Progression1,[N2],Progression3), + solfatonotes(Ns,Progression3,Progression2). + +solfegenotetonote(d,'C'). +solfegenotetonote(di,'C#'). +solfegenotetonote(r,'D'). +solfegenotetonote(ri,'D#'). +solfegenotetonote(m,'E'). 
+solfegenotetonote(f,'F'). +solfegenotetonote(fi,'F#'). +solfegenotetonote(s,'G'). +solfegenotetonote(si,'G#'). +solfegenotetonote(l,'A'). +solfegenotetonote(li,'A#'). +solfegenotetonote(t,'B'). + + +instruments(Form1,MelodyInstruments,HarmonyInstruments,MelodyParts,HarmonyParts,Vocalstubinstrument) :- + melodyinstrumentnumber(MelodyInstrumentNumber), + harmonyinstrumentnumber(HarmonyInstrumentNumber), + instrumentlist(MelodyInstrumentNumber,MelodyInstruments), + instrumentlist(HarmonyInstrumentNumber,HarmonyInstruments), + Vocalstubinstrument=[0,"Acoustic Grand Piano"], + parts(Form1,MelodyInstruments,MelodyParts), + aggregate_all(count, (member(A1,MelodyParts),not(A1=[_,_,0])), Count1), + not(Count1=0), + parts(Form1,HarmonyInstruments,HarmonyParts), + aggregate_all(count, (member(A2,HarmonyParts),not(A2=[_,_,0])), Count2), + not(Count2=0). + +melodyinstrumentnumber(NumberofInstruments) :- + %% Number of melody instruments + trialy2("Number of melody instruments?",[1,2,3,4,5,6,7,8,9,10],R1), + findbest(R1,NumberofInstruments). + +harmonyinstrumentnumber(NumberofInstruments) :- + %% Number of harmony instruments + trialy2("Number of harmony instruments?",[1,2,3,4,5,6,7,8,9,10],R1), + findbest(R1,NumberofInstruments). + +instrumentlist(NumberofInstruments,Instruments) :- + instrumentlist(NumberofInstruments,[],Instruments),!. + +instrumentlist(0,Instruments,Instruments) :- !. 
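`solfatonotes/3` above maps the solfège syllables entered at the melody prompt onto the note names in the `solfegenotetonote/2` table. For example, assuming the predicates above are loaded:

```prolog
%% Each syllable is translated via solfegenotetonote/2 and accumulated.
?- solfatonotes([d, m, s, t], [], Notes).
%% Notes = ['C', 'E', 'G', 'B']
```

Unlisted syllables simply fail the translation, which is why `findmelody/9` rejects an empty result with `not(Progression2=[])` and reprompts.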
+instrumentlist(NumberofInstruments1,Instruments1,Instruments2) :- + Instruments=[[0,"Acoustic Grand Piano"],[1,"Bright Acoustic Piano"],[2,"Electric Grand Piano"],[3,"Honky-Tonk Piano"],[4,"Electric Piano 1"],[5,"Electric Piano 2"],[6,"Harpsichord"],[7,"Clavinet"],[8,"Celesta"],[9,"Glockenspiel"],[10,"Music Box"],[11,"Vibraphone"],[12,"Marimba"],[13,"Xylophone"],[14,"Tubular Bells"],[15,"Dulcimer"],[16,"Drawbar Organ"],[17,"Percussive Organ"],[18,"Rock Organ"],[19,"Church Organ"],[20,"Reed Organ"],[21,"Accordian"],[22,"Harmonica"],[23,"Tango Accordian"],[24,"Nylon Acoustic Guitar"],[25,"Steel Acoustic Guitar"],[26,"Jazz Electric Guitar"],[27,"Clean Electric Guitar"],[28,"Muted Electric Guitar"],[29,"Overdriven Guitar"],[30,"Distortion Guitar"],[31,"Guitar Harmonics"],[32,"Acoustic Bass"],[33,"Electric Bass (finger)"],[34,"Electric Bass (pick)"],[35,"Fretless Bass"],[36,"Slap Bass 1"],[37,"Slap Bass 2"],[38,"Synth Bass 1"],[39,"Synth Bass 2"],[40,"Violin"],[41,"Viola"],[42,"Cello"],[43,"Contrabass"],[44,"Tremolo Strings"],[45,"Pizzicato Strings"],[46,"Orchestral Harp"],[47,"Timpani"],[48,"String Ensemble 1"],[49,"String Ensemble 2"],[50,"Synth Strings 1"],[51,"Synth Strings 2"],[52,"Choir Aahs"],[53,"Voice Oohs"],[54,"Synth Choir"],[55,"Orchestra Hit"],[56,"Trumpet"],[57,"Trombone"],[58,"Tuba"],[59,"Muted Trumpet"],[60,"French Horn"],[61,"Brass Section"],[62,"Synth Brass 1"],[63,"Synth Brass 2"],[64,"Soprano Sax"],[65,"Alto Sax"],[66,"Tenor Sax"],[67,"Baritone Sax"],[68,"Oboe"],[69,"English Horn"],[70,"Bassoon"],[71,"Clarinet"],[72,"Piccolo"],[73,"Flute"],[74,"Recorder"],[75,"Pan Flute"],[76,"Blown Bottle"],[77,"Shakuhachi"],[78,"Whistle"],[79,"Ocarina"],[80,"Lead 1 (square)"],[81,"Lead 2 (sawtooth)"],[82,"Lead 3 (calliope)"],[83,"Lead 4 (chiff)"],[84,"Lead 5 (charang)"],[85,"Lead 6 (voice)"],[86,"Lead 7 (fifths)"],[87,"Lead 8 (bass + lead)"],[88,"Pad 1 (new age)"],[89,"Pad 2 (warm)"],[90,"Pad 3 (polysynth)"],[91,"Pad 4 (choir)"],[92,"Pad 5 (bowed)"],[93,"Pad 6 
(metallic)"],[94,"Pad 7 (halo)"],[95,"Pad 8 (sweep)"],[96,"FX 1 (rain)"],[97,"FX 2 (soundtrack)"],[98,"FX 3 (crystal)"],[99,"FX 4 (atmosphere)"],[100,"FX 5 (brightness)"],[101,"FX 6 (goblins)"],[102,"FX 7 (echoes)"],[103,"FX 8 (sci-fi)"],[104,"Sitar"],[105,"Banjo"],[106,"Shamisen"],[107,"Koto"],[108,"Kalimba"],[109,"Bagpipe"],[110,"Fiddle"],[111,"Shanai"],[112,"Tinkle Bell"],[113,"Agogo"],[114,"Steel Drums"],[115,"Woodblock"],[116,"Taiko Drum"],[117,"Melodic Tom"],[118,"Synth Drum"],[119,"Reverse Cymbal"],[120,"Guitar Fret Noise"],[121,"Breath Noise"],[122,"Seashore"]], + %% ,[123,"Bird Tweet"],[124,"Telephone Ring"],[125,"Helicopter"],[126,"Applause"],[127,"Gunshot"] + + trialy2("Please select an instrument from:",Instruments,R1), + findbest(R1,Instrument), + append(Instruments1,[Instrument],Instruments3), + NumberofInstruments2 is NumberofInstruments1-1, + instrumentlist(NumberofInstruments2,Instruments3,Instruments2),!. + +parts(Form,Instruments,Parts) :- + parts1(Form,Instruments,[],Parts),!. + +parts1(_Form,[],Parts,Parts) :- !. +parts1(Form,Instruments1,Parts1,Parts2) :- + Instruments1=[Instrument2|Instruments3], + parts2(Form,Instrument2,[],Part3), + append(Parts1,[Part3],Parts4), + parts1(Form,Instruments3,Parts4,Parts2),!. + +parts2([],_Instrument,Part,Part) :- !. +parts2(Form1,Instrument,Part1,Part2) :- + Form1=[Section|Form2], + (member([Section,Instrument,Playing],Part1)->true; + %% Is the instrument playing in the section? + term_to_atom(Instrument,Instrument1), + concat_list("Is ",[Instrument1," playing in ",Section,"?"],Question), + (trialy2(Question,[0,1],R1), + findbest(R1,Playing))), + append(Part1,[[Section,Instrument,Playing]],Part3), + parts2(Form2,Instrument,Part3,Part2),!. + +/** +concat_list(A,[],A):-!. +concat_list(A,List,B) :- + List=[Item|Items], + string_concat(A,Item,C), + concat_list(C,Items,B). +**/ + +concat_list(A,[],A) :-!. +concat_list(A,List,B) :- + List=[Item|Items], + concat_list2(A,[Item],C), + concat_list(C,Items,B). 
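`concat_list/3` above is the workhorse for assembling the asc2mid text events rendered elsewhere in this file. A small illustrative query (the event values are hypothetical), assuming `concat_list/3` and its helpers are loaded:

```prolog
%% Folds a list of strings and numbers onto a prefix string, as used when
%% emitting "BA ... CR ... TR ..." note lines for asc2mid.
?- concat_list("BA ", [1, " CR ", 0, " TR ", 1], S).
%% S = "BA 1 CR 0 TR 1"
```

Numbers are accepted because `string_concat/3` in SWI-Prolog converts atomic arguments to text.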
+concat_list2(A,List,C) :- + ((List=[[Item|Items]]->true;List=[Item|Items])-> + concat_list0(A,[Item|Items],C); + fail),!. +concat_list2(A,Item,C) :- + concat_list(A,Item,C),!. + +concat_list0([],""):-!. +concat_list0(A1,B):- + A1=[A|List], + concat_list0(A,List,B),!. + +concat_list0(A,[],A):-!. +concat_list0(A,List,B) :- + List=[Item|Items], + string_concat(A,Item,C), + concat_list0(C,Items,B). + +sentencewithspaces(Sentence1,Sentence2) :- + Sentence1=[Item|Items], + string_concat(Firstletter1,Rest,Item), + string_length(Firstletter1,1), + upcase_atom(Firstletter1,Firstletter2), + concat_list(Firstletter2,[Rest,""],Item2), + sentencewithspaces(Items,Item2,Sentence3), + string_concat(Sentence3,".",Sentence2). + +sentencewithspaces([],Sentence,Sentence) :- !. +sentencewithspaces(Sentence1,Sentence2,Sentence3) :- + Sentence1=[Word|Sentence4], + concat_list("",[Sentence2," ",Word],Item2), + sentencewithspaces(Sentence4,Item2,Sentence3),!. + +rendersong(Form1,Voiceparts,_Maxlength,Melody, + Harmony,MelodyInstruments, + HarmonyInstruments,MelodyParts1, + HarmonyParts,Lyrics,Vocalstubinstrument,Song1,File1) :- + Totallength=8, + Voicetrack=1, + length(MelodyInstruments,MelodyInstrumentsLength), + length(HarmonyInstruments,HarmonyInstrumentsLength), + TracksNumber is MelodyInstrumentsLength+HarmonyInstrumentsLength+2, + Lyrics=[[_,Sentence1|_]|_],sentencewithspaces(Sentence1,Sentence2), + concat_list("format=1 tracks=", [TracksNumber, " division=384\n\nBA 1 CR 0 TR 0 CH 16 Text type 2: \"Produced by Mind Reading Music Composer by Lucian Academy (Manual Entry Mode)\"\nBA 1 CR 0 TR 0 CH 16 Text type 3: \"", Sentence2, "\"\nBA 1 CR 0 TR 0 CH 1 Channel volume 127\nBA 1 CR 0 TR 0 CH 16 Tempo 63.00009\n"], Song2), + printheader(Voicetrack,Vocalstubinstrument,Song2,Song3), + %%writeln(renderv1(Form1,Voiceparts,_,Lyrics,Melody, + %% Totallength,Voicetrack,1,_,Song3,Song4)), %% **** + renderv1(Form1,Voiceparts,_,Lyrics,Melody, + Totallength,Voicetrack,1,_,Song3,Song4), + Track2 is 
Voicetrack+1, + renderm1(Form1,Melody,MelodyParts1, + Track2,_Track3,Song4,Song5), + Track4 is MelodyInstrumentsLength+2, + renderh1(Form1,Harmony,HarmonyParts, + Track4,_Track5,Song5,Song6), + length(Form1,FormLength), + TotalBars is 4*FormLength+1, + concat_list(Song6,["BA ",TotalBars," CR 0 TR 0 CH 16 End of track"],Song1), + get_time(TS),stamp_date_time(TS,date(Year,Month,Day,Hour1,Minute1,Seconda,_A,_TZ,_False),local), + concat_list("song",[Year,Month,Day,Hour1,Minute1,Seconda],File1), + concat_list(File1,[".txt"],File2), + concat_list(File1,[".mid"],File3), + open_s(File2,write,Stream), + write(Stream,Song1), + close(Stream), + concat_list("./asc2mid ",[File2," > ",File3],Command), + shell1_s(Command), + %%writeln(["Texttobr, Texttobr2 not working. Please manually breason out ",File2]), + %%N is 4, M is 4000, texttobr2(N,File2,u,M),texttobr(N,File2,u,M), + outputlyrics(File1,Lyrics), + +/** + texttobr2qb(3), %% give self breasonings + texttobr2qb(20), %%Feature 1 + texttobr2qb(20), %%Updates + texttobr2qb(20), %%Feature 2 + texttobr2qb(20), %%Updates + texttobr2qb(20), %%Feature 3 + texttobr2qb(20), %%Updates + texttobr2qb(100), %%Icon + texttobr2qb(20), %%Updates + texttobr2qb(32), %%Lyrics + texttobr2qb(36), %%Music + texttobr2qb(20), %%Updates + + texttobr2qb(2), %%Medicine + texttobr2qb(20), %%Updates + texttobr2qb(2), %%Sales + texttobr2qb(20), %%Updates + texttobr2qb(2), %%Marketing + texttobr2qb(20), %%Updates + + texttobr2qb(2), %%Graciously give or blame listener for colour imagery + texttobr2qb(20), %%Updates + + texttobr2qb(2), %%Play song + texttobr2qb(2), %%Like song + **/ +!. + +printheader(Voicetrack,Vocalstubinstrument,Song1,Song2) :- + Vocalstubinstrument=[N1,_], + N2 is N1+1, + concat_list("BA 1 CR 0 TR ",[Voicetrack," CH ",Voicetrack," Instrument ",N2,"\nBA 1 CR 0 TR ",Voicetrack," CH ",Voicetrack," NT R 0 von=127 voff=0\n"],Song3), + string_concat(Song1,Song3,Song2). 
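+%% Sketch of the track header printheader/4 appends, assuming Voicetrack=1
+%% and Vocalstubinstrument=[52,_] (52 is "Choir Aahs" in the table above;
+%% N2 is N1+1 converts to a 1-based instrument number):
+%% BA 1 CR 0 TR 1 CH 1 Instrument 53
+%% BA 1 CR 0 TR 1 CH 1 NT R 0 von=127 voff=0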
+ +renderv1([],_Voiceparts,_Vocalstubinstrument,_Lyrics,_Melody,_Totallength,_Track,_Bar,_Bar2,Voice,Voice) :- !. +renderv1(Form1,Voiceparts1,Vocalstubinstrument,Lyrics,Melody,Totallength,Track,Bar1,Bar2,Voice1,Voice2) :- + Form1=[Section|Form2], + Voiceparts1=[Section|Voiceparts2], + renderv2(Section,Lyrics,Melody,Totallength,Track, + Bar1,Bar3,Voice1,Voice3), + renderv1(Form2,Voiceparts2,Vocalstubinstrument,Lyrics,Melody, + Totallength,Track,Bar3,Bar2,Voice3,Voice2),!. + +renderv1(Form1,Voiceparts1,Vocalstubinstrument,Lyrics,Melody,Totallength, +Track,Bar1,Bar2,Voice1,Voice2) :- + Form1=[Section1|Form2], + ((Voiceparts1=[Section2|_Voiceparts2], + not(Section1=Section2))->true;Voiceparts1=[]), + Bar3 is Bar1+4, + renderv1(Form2,Voiceparts1,Vocalstubinstrument,Lyrics,Melody, + Totallength,Track,Bar3,Bar2,Voice1,Voice2),!. + +renderv2(Section,Lyrics,_Melody,_Totallength,_Track,Bar,Bar,Voice,Voice) :- + not(member([Section|_],Lyrics)),!. +renderv2(Section1,Lyrics1,Melody1,_Totallength0,Track,Bar1,Bar2,Voice1,Voice2) :- + findall(Lyrics2,(member(Lyrics2,Lyrics1), + Lyrics2=[Section1|_]),Lyrics3), + string_concat(Section2A,_C,Section1), + string_length(Section2A,1), + atom_string(Section2,Section2A), + findall(Melody2,(member(Melody2,Melody1), + Melody2=[Section2|_]),Melody3), + Lyrics3=[Lyrics3A1|_Lyrics5], + %%trace, + (Lyrics3A1=[_,Lyrics1A,Lyrics2A,Lyrics3A,Lyrics4A]->true; + (writeln(["Error:",Lyrics3A1,"does not have 4 lines"]),abort)), + Melody3=[[_, Melody1A], [_, Melody2A]], + renderline1(Lyrics1A,Melody1A,_Totallength1,Track,Bar1,Voice1,Voice3), + Bar3 is Bar1+1, + renderline1(Lyrics2A,Melody2A,_Totallength2,Track,Bar3,Voice3,Voice4), + Bar4 is Bar3+1, + renderline1(Lyrics3A,Melody1A,_Totallength3,Track,Bar4,Voice4,Voice5), + Bar5 is Bar4+1, + renderline1(Lyrics4A,Melody2A,_Totallength4,Track,Bar5,Voice5,Voice6), + Bar6 is Bar5+1, + delete(Lyrics1,[Section1|_],Lyrics51), + renderv2(Section1,Lyrics51,Melody1,_Totallength,Track,Bar6,Bar2,Voice6,Voice2). 
+ +renderline1(Lyrics2,Melody1,_Totallength,Track,Bar1,Voice1,Voice2) :- + %%Lyrics1=[_,Lyrics2], + length(Lyrics2,Lyrics2Length), + %%Rests is Totallength-Lyrics2Length, + BarTimes1=["0","1/2","1","1+1/2","2","2+1/2","3","3+1/2"], + generatemelody(Lyrics2Length,Melody1,Melody2), + renderline2(BarTimes1,BarTimes2,Melody2,Track,Bar1,Voice1,Voice3), + renderlinerests(BarTimes2,_BarTimes3,Track,Bar1,Voice3,Voice2). + +generatemelody(Lyrics2Length,Melody1,Melody2) :- + generatemelody1(Melody1,[],Melody3), + length(Melody3,Melody2Length), + changelength(Lyrics2Length,Melody2Length,Melody3,Melody2). + +generatemelody1([],Melody,Melody) :- !. +generatemelody1(Melody1,Melody2,Melody3) :- + Melody1=[Melody4|Melody5], + append(Melody2,[Melody4,Melody4],Melody6), + generatemelody1(Melody5,Melody6,Melody3),!. + +%%changelength(_Melody2Length,Melody,Melody) :- !. +changelength(Lyrics2Length,Melody2Length,Melody,Melody) :- + Lyrics2Length=Melody2Length,!. +changelength(Lyrics2Length,Melody2Length,Melody1,Melody2) :- + Lyrics2Length > Melody2Length, + Length is Lyrics2Length-Melody2Length, + repeatlastnote1(Length,Melody1,Melody2). +changelength(Lyrics2Length,Melody2Length,Melody1,Melody2) :- + Lyrics2Length < Melody2Length, + length(Melody2,Lyrics2Length), + append(Melody2,_,Melody1). + +repeatlastnote1(Length,Melody1,Melody2) :- + reverse(Melody1,Melody3), + Melody3=[Item|_], + repeatlastnote2(Length,Item,Melody1,Melody2),!. + +repeatlastnote2(0,_Item,Melody,Melody) :- !. +repeatlastnote2(Length1,Item,Melody1,Melody2) :- + append(Melody1,[Item],Melody3), + Length2 is Length1-1, + repeatlastnote2(Length2,Item,Melody3,Melody2),!. + +renderline2(BarTimes,BarTimes,[],_Track,_Bar1,Voice,Voice) :- !. 
+renderline2(BarTimes1,BarTimes4,Melody1,Track,Bar,Voice1,Voice2) :-
+ Melody1=[Melody2|Melody3],
+ BarTimes1=[BarTimes2|BarTimes3],
+ concat_list("BA ",[Bar," CR ",BarTimes2," TR ",Track," CH ",Track," NT ",Melody2,"- 1/2 voff=0\n"],Song),
+ string_concat(Voice1,Song,Voice3),
+ renderline2(BarTimes3,BarTimes4,Melody3,Track,Bar,Voice3,Voice2).
+
+%% Fill any bar times left after the melody with rests.
+renderlinerests([],[],_Track,_Bar1,Voice,Voice) :- !.
+renderlinerests(BarTimes1,BarTimes2,Track,Bar,Voice1,Voice2) :-
+ BarTimes1=[BarTime|BarTimes3],
+ concat_list("BA ",[Bar," CR ",BarTime," TR ",Track," CH ",Track," NT R 1/2 voff=0\n"],Song),
+ string_concat(Voice1,Song,Voice3),
+ renderlinerests(BarTimes3,BarTimes2,Track,Bar,Voice3,Voice2).
+
+longtoshortform(Section1,Section2) :-
+ string_concat(Section2A,_C,Section1),
+ string_length(Section2A,1),
+ atom_string(Section2,Section2A),!.
+
+
+renderm1(_Form1,_Melody,[],_Track1,_Track2,Song,Song) :- !.
+renderm1(Form1,Melody,MelodyParts1,Track1,Track2,Song1,Song2) :-
+ %%length(Form1,FormLength),
+ MelodyParts1=[MelodyParts2|MelodyParts3],
+ MelodyParts2=[[_A,[InstrumentNumber1,_B],_C]|_D],
+ InstrumentNumber2 is InstrumentNumber1,%% + 1,
+ printheader(Track1,[InstrumentNumber2,_],Song1,Song3),
+ %%renderm21(Form1,Melody,MelodyParts1,Track1,1,_,Song3,Song4),
+ renderm21(Form1,Melody,MelodyParts2,Track1,1,_E,Song3,Song4),
+ Track3 is Track1+1,
+ renderm1(Form1,Melody,MelodyParts3,Track3,Track2,Song4,Song2),!.
+
+
+renderm21(_Form1,_Melody,[],_Track,Bar,Bar,Voice,Voice) :- !.
+renderm21(Form1,Melody,MelodyParts1,Track,Bar1,Bar2,Voice1,Voice2) :-
+ MelodyParts1=[MelodyPart2|MelodyParts3],
+ MelodyPart2=[Section1,[_,_],1],
+ longtoshortform(Section1,Section2),
+ renderm22(Section2,Melody,Track,Bar1,Bar3,Voice1,Voice3),
+ renderm21(Form1,Melody,MelodyParts3,Track,Bar3,Bar2,Voice3,Voice2),!.
+ +renderm21(Form1,Melody,MelodyParts1,Track,Bar1,Bar2,Voice1,Voice2) :- + MelodyParts1=[MelodyPart2|MelodyParts3], + MelodyPart2=[_Section,[_,_],0], + Bar3 is Bar1+4, + renderm21(Form1,Melody,MelodyParts3,Track,Bar3,Bar2,Voice1,Voice2),!. + +renderm22(Section2,Melody1,_Track,Bar,Bar,Voice,Voice) :- + findall(Melody2,(member(Melody2,Melody1), + Melody2=[Section1|_],longtoshortform(Section1,Section2)),List2),List2=[],!. +renderm22(Section1,Melody1,Track,Bar1,Bar2,Voice1,Voice2) :- + /** + findall(Lyrics2,(member(Lyrics2,Lyrics1), + Lyrics2=[Section1|_]),Lyrics3), + string_concat(Section2A,_C,Section1), + string_length(Section2A,1), + atom_string(Section2,Section2A), + **/ + findall(Melody2,(member(Melody2,Melody1), + Melody2=[Section1|_]),Melody3), + %%Lyrics3=[[_, Lyrics1A,Lyrics2A,Lyrics3A,Lyrics4A]], + Melody3=[[_, Melody1A], [_, Melody2A]], + renderline1m(Melody1A,Track,Bar1,Voice1,Voice3), + Bar3 is Bar1+1, + renderline1m(Melody2A,Track,Bar3,Voice3,Voice4), + Bar4 is Bar3+1, + renderline1m(Melody1A,Track,Bar4,Voice4,Voice5), + Bar5 is Bar4+1, + renderline1m(Melody2A,Track,Bar5,Voice5,Voice6), + Bar6 is Bar5+1, + delete(Melody3,[Section1|_],Melody5), + renderm22(Section1,Melody5,Track,Bar6,Bar2,Voice6,Voice2). + +renderline1m(Melody1,Track,Bar1,Voice1,Voice2) :- + %%Lyrics1=[_,Lyrics2], + %%Rests is Totallength-Lyrics2Length, + BarTimes1=["0","1/2","1","1+1/2","2","2+1/2","3","3+1/2"], + generatemelodym(Melody1,Melody2), + renderline2(BarTimes1,BarTimes2,Melody2,Track,Bar1,Voice1,Voice3), + renderlinerests(BarTimes2,_BarTimes3,Track,Bar1,Voice3,Voice2). + +generatemelodym(Melody1,Melody2) :- + generatemelody1(Melody1,[],Melody3), + length(Melody3,Melody2Length), + Lyrics2Length=8, + changelength(Lyrics2Length,Melody2Length,Melody3,Melody2). + +renderh1(_Form1,_Harmony,[],_Track1,_Track2,Song,Song) :- !. 
+renderh1(Form1,Harmony,HarmonyParts1,Track1,Track2,Song1,Song2) :- + %%length(Form1,FormLength), + HarmonyParts1=[HarmonyParts2|HarmonyParts3], + HarmonyParts2=[[_A,[InstrumentNumber1,_B],_C]|_D], + InstrumentNumber2 is InstrumentNumber1,%% + 1, + printheader(Track1,[InstrumentNumber2,_],Song1,Song3), + %%renderh21(Form1,Harmony,HarmonyParts1,Track1,1,_,Song3,Song4), + renderh21(Form1,Harmony,HarmonyParts2,Track1,1,_E,Song3,Song4), + Track3 is Track1+1, + renderh1(Form1,Harmony,HarmonyParts3,Track3,Track2,Song4,Song2),!. + + +renderh21(_Form1,_Harmony,[],_Track,Bar,Bar,Voice,Voice) :- !. +renderh21(Form1,Harmony,HarmonyParts1,Track,Bar1,Bar2,Voice1,Voice2) :- + HarmonyParts1=[HarmonyPart2|HarmonyParts3], + HarmonyPart2=[Section1,[_,_],1], + longtoshortform(Section1,Section2), + renderh22(Section2,Harmony,Track,Bar1,Bar3,Voice1,Voice3), + renderh22(Section2,Harmony,Track,Bar3,Bar4,Voice3,Voice4), + renderh21(Form1,Harmony,HarmonyParts3,Track,Bar4,Bar2,Voice4,Voice2),!. + +renderh21(Form1,Harmony,HarmonyParts1,Track,Bar1,Bar2,Voice1,Voice2) :- + HarmonyParts1=[HarmonyPart2|HarmonyParts3], + HarmonyPart2=[_Section,[_,_],0], + Bar3 is Bar1+4, + renderh21(Form1,Harmony,HarmonyParts3,Track,Bar3,Bar2,Voice1,Voice2),!. + +renderh22(Section2,Harmony1,_Track,Bar,Bar,Voice,Voice) :- + + %%longtoshortform(Section21,Section2), + findall(Harmony2,(member(Harmony3,Harmony1),member(Harmony2,Harmony3), + Harmony2=[Section1|_],longtoshortform(Section1,Section2)),List2),List2=[],!. 
+renderh22(Section1,Harmony1A,Track,Bar1,Bar2,Voice1,Voice2) :- + Harmony1A=[Harmony1A1|Harmony1A2], + /** + findall(Lyrics2,(member(Lyrics2,Lyrics1), + Lyrics2=[Section1|_]),Lyrics3), + string_concat(Section2A,_C,Section1), + string_length(Section2A,1), + atom_string(Section2,Section2A), + **/ + + %% shorten + %%string_concat(B11,_C1,Section11), + %%string_length(B11,1),atom_string(Section1,B11), + + + + findall(Harmony2,(member(Harmony2,Harmony1A1), + Harmony2=[Section1|_] + %%,string_concat(B1,_C,Section1A), + %%string_length(B1,1),atom_string(Section1,B1) + ),Harmony3), + %% shorten + + + + (not(Harmony3=[])-> + %%Lyrics3=[[_, Lyrics1A,Lyrics2A,Lyrics3A,Lyrics4A]], + %%Harmony3=[[_, Harmony1A], [_, Harmony2A]], + (renderlines1h(Harmony3,Track,Bar1,Voice1,Voice3), + Bar3 is Bar1+1);(Bar3 is Bar1,Voice3=Voice1)), + /** + Bar3 is Bar1+1, + renderline1h(Harmony2A,Track,Bar3,Voice3,Voice4), + Bar4 is Bar3+1, + renderline1h(Harmony1A,Track,Bar4,Voice4,Voice5), + Bar5 is Bar4+1, + renderline1h(Harmony2A,Track,Bar5,Voice5,Voice6), + Bar6 is Bar5+1, + **/ + %%delete(Harmony3,[Section1|_],Harmony5), + renderh22(Section1,Harmony1A2,Track,Bar3,Bar2,Voice3,Voice2). + +renderlines1h([],_Track,_Bar,Voice,Voice) :- !. +renderlines1h(Harmony1,Track,Bar1,Voice1,Voice2) :- + %%Lyrics1=[_,Lyrics2], + %%Rests is Totallength-Lyrics2Length, + BarTimes1=["0","1/2","1","1+1/2","2","2+1/2","3","3+1/2"], + generatemelodyh(Harmony1,Harmony2), +%%writeln(generatemelodyh(Harmony1,Harmony2)), + renderlineh2(BarTimes1,BarTimes2,Harmony2,Track,Bar1,Voice1,Voice3), + renderlinerests(BarTimes2,_BarTimes3,Track,Bar1,Voice3,Voice2). + +generatemelodyh(Harmony1,Harmony2) :- + generatemelodyh1(Harmony1,[],Harmony3), + length(Harmony3,Harmony2Length), + Lyrics2Length=8, + changelengthh(Lyrics2Length,Harmony2Length, + Harmony3,Harmony2). + +generatemelodyh1([],Melody,Melody) :- !. 
+generatemelodyh1(Melody1,Melody2,Melody3) :- + Melody1=[Melody4|Melody5], + append(Melody2,[Melody4,Melody4],Melody6), + generatemelodyh1(Melody5,Melody6,Melody3),!. + +%%changelengthh(_Melody2Length,Melody,Melody) :- !. +changelengthh(Lyrics2Length,Melody2Length,Melody,Melody) :- + Lyrics2Length=Melody2Length,!. +changelengthh(Lyrics2Length,Melody2Length,Melody1,Melody2) :- + Lyrics2Length > Melody2Length, + Length is Lyrics2Length-Melody2Length, + repeatlastnoteh1(Length,Melody1,Melody2). +changelengthh(Lyrics2Length,Melody2Length,Melody1,Melody2) :- + Lyrics2Length < Melody2Length, + length(Melody2,Lyrics2Length), + append(Melody2,_,Melody1). + +repeatlastnoteh1(Length,Melody1,Melody2) :- + reverse(Melody1,Melody3), + Melody3=[Item|_], + repeatlastnoteh2(Length,Item,Melody1,Melody2),!. + +repeatlastnoteh2(0,_Item,Melody,Melody) :- !. +repeatlastnoteh2(Length1,Item,Melody1,Melody2) :- + append(Melody1,[Item],Melody3), + Length2 is Length1-1, + repeatlastnoteh2(Length2,Item,Melody3,Melody2),!. + +renderlineh2(BarTimes,BarTimes,[],_Track,_Bar1,Voice,Voice) :- !. +renderlineh2(BarTimes1,BarTimes4,Melody1,Track,Bar,Voice1,Voice2) :- + Melody1=[Melody2|Melody3], + Melody2=[_,[Melody21,Melody22,Melody23]], + BarTimes1=[BarTimes2|BarTimes3], + concat_list("BA ",[Bar," CR ",BarTimes2," TR ",Track," CH ",Track," NT ",Melody21,"- 1/2 voff=0\n"],Song1), + concat_list("BA ",[Bar," CR ",BarTimes2," TR ",Track," CH ",Track," NT ",Melody22,"- 1/2 voff=0\n"],Song2), + concat_list("BA ",[Bar," CR ",BarTimes2," TR ",Track," CH ",Track," NT ",Melody23,"- 1/2 voff=0\n"],Song3), + concat_list(Voice1,[Song1,Song2,Song3],Voice3), + renderlineh2(BarTimes3,BarTimes4,Melody3,Track,Bar,Voice3,Voice2). 
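+%% Sketch of renderlineh2/7 output for one assumed harmony item
+%% [_,["C","E","G"]] at bar 1, beat "0", on track 2; each chord note
+%% becomes its own simultaneous event:
+%% BA 1 CR 0 TR 2 CH 2 NT C- 1/2 voff=0
+%% BA 1 CR 0 TR 2 CH 2 NT E- 1/2 voff=0
+%% BA 1 CR 0 TR 2 CH 2 NT G- 1/2 voff=0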
+
+outputlyrics(File1,Lyrics1) :-
+ Lyrics1=[Lyrics2|_Lyrics3],
+ Lyrics2=[_|Lyrics4],
+ Lyrics4=[Lyrics5|_Lyrics6],
+ sentencewithspaces(Lyrics5,Lyrics7A),
+ string_concat(Lyrics7A,"\n\n",Lyrics7),
+ outputlyrics1(Lyrics1,Lyrics7,Lyrics8),
+ concat_list(File1,["lyrics.txt"],File2),
+ open_s(File2,write,Stream),
+ write(Stream,Lyrics8),
+ close(Stream).
+%%texttobr2(u,File2,u,u),texttobr(u,File2,u,u).
+ %%writeln(["Texttobr, Texttobr2 not working. Please manually breason out ",File2]).
+
+outputlyrics1([],Lyrics,Lyrics) :- !.
+outputlyrics1(Lyrics1,Lyrics5,Lyrics6) :-
+ Lyrics1=[Lyrics2|Lyrics3],
+ Lyrics2=[_|Lyrics4],
+ outputlyrics2(Lyrics4,Lyrics5,Lyrics7),
+ string_concat(Lyrics7,"\n",Lyrics8),
+ outputlyrics1(Lyrics3,Lyrics8,Lyrics6).
+
+outputlyrics2([],Lyrics,Lyrics) :- !.
+outputlyrics2(Lyrics1,Lyrics5,Lyrics6) :-
+ Lyrics1=[Lyrics2|Lyrics3],
+ sentencewithspaces(Lyrics2,Lyrics4),
+ %%writeln(Lyrics4),
+ concat_list(Lyrics5,[Lyrics4,"\n"],Lyrics7),
+ outputlyrics2(Lyrics3,Lyrics7,Lyrics6).
+
+max([],M,M) :- !.
+max(List1,Max1,Max2) :-
+ List1=[Item|List2],
+ Item>=Max1,
+ max(List2,Item,Max2),!.
+max(List1,Max1,Max2) :-
+ List1=[Item|List2],
+ Item<Max1,
+ max(List2,Max1,Max2),!.
+
+%% findbest/2 picks the highest-scoring item from a list of [Score,Item]
+%% pairs (trialy2 returns results in this shape).
+findbest(R,Item1) :-
+ sort(R,R1),
+ reverse(R1,R2),
+ R2=[[_,Item1]|_Rest].
+
+find("Should a chorus or instrumental come after the first verse?",CorI) :-
+ trialy2("Should a chorus or instrumental come after the first verse?",[c,i1],R),
+ findbest(R,CorI).
+
+find(["Should a chorus and outro or two solos come after the first solo?",1],COorSS) :-
+ trialy2("Should a chorus and outro or two solos come after the first solo?",[[c,o],[s,s]],R),
+ findbest(R,COorSS).
+
+find(["Should a chorus and outro or two solos come after the first solo?",2],COorSS) :-
+ trialy2("Should a chorus and outro or two solos come after the first solo?",[[c,o],[s,s]],R),
+ findbest(R,COorSS).
+
+%%find("Who is the song about?",Character) :-
+%% trialy2(["Ma","til","da"],["Jo","se","phine"],["Hap","py"],["Ha","rold"],R),
+%% findbest(R,Character).
+
+find("Should the chord progression type be 1451, 1564, 1645, Classical or Classical Pop?",CPT) :-
+ trialy2("Should the chord progression type be 1451, 1564, 1645, Classical or Classical Pop?",[1451, 1564, 1645
+ %%
+ , classical, classicalpop
+ ],R),
+ findbest(R,CPT).
+ +/**generatelyricslistsverse(Character,Lyrics1,Lyrics2):- +%% read c, o, v + reado(Objects), + readv(Verbs), + + charobj( %% character verb object first + %% object verb object pairs + +lyrics1([],_Character,Lyrics,Lyrics) :- !. +lyrics1(Form1,Character,Lyrics1,Lyrics2) :- + Form1=[Form2|Forms2], + lyrics2(Form2,Character,Lyrics1,Lyrics3), + +lyrics2([instrumental,_],_Character,Lyrics1,Lyrics2) :- + +append_list(Lyrics1,[[],[],[],[]],Lyrics2). +lyrics2([verse,_],Character,Lyrics1,Lyrics2) :- + lyricsv1(Lyrics1,[Character],Lyrics3), + append_list(Lyrics1,[Lyrics3,*],Lyrics2), + . + +lyricsv1(Lyrics0,Lyrics1,Lyrics2) :- + readsc(SyllableCount), + readv(Verbs1), + +reado(Objects1), + **/ +%%removetoolongandnotrhyming(Lyrics0,Lyrics1,SyllableCount,Verbs1,Verbs2,Objects1,Objects2),] +%%findv +%% same for o +%% findo + %% choose words that fit syllable count + +append_list(A,[],A) :-!. +append_list(A,List,B) :- + List=[Item|Items], + append2(A,[Item],C), + append_list(C,Items,B). +append2(A,List,C) :- + (List=[Item]-> + append(A,Item,C); + fail),!. +append2(A,Item,C) :- + append_list(A,Item,C),!. + +readsc(7). +readv([["loves"],["is"],["has"],["is","in"],["moves","to"],["nur","tures"],["needs"],["makes"],["lifts"],["finds"],["forms"],["goes","to"],["writes","on"],["reads","on"],["feels"],["is"]]). + +reado([["one"],["the","oth","er"],["the","runn","er"],["the","draw","er"],["the","count","er"],["the","graph","er"],["the","test","er"],["the","breaths","on","er"],["the","writ","er"],["the","spell","er"],["the","updat","er"],["the","check","er"],["the","choos","er"],["the","ess","ence"],["the","comb","in","er"],["the","mir","ac","le"],["the","trans","lat","or"],["the","gramm","ar"]]). + +rhymes([["one","er","or","ar","an","ae","er","ler","ur","ard","ney","ald","ess","el"],["le","py","ye","ee","ice"]]). + +%%removetoolongandnotrhyming(Lyrics1,SyllableCount,[],Verbs,Verbs) :- !. 
+/**removetoolongandnotrhyming(Lyrics0,Lyrics1,SyllableCount,Verbs1,Verbs2,Objects1,Objects2) :- +%% makes list of all combos, checks, asks, asks + removetoolong1(Lyrics1,SyllableCount, + Verbs1,[],Verbs3), %% do after end + * %% not until object + %% find v x + removenotrhyming1(Lyrics0,Verbs3, + Verbs4), + + removetoolong1(Lyrics1,SyllableCount,[],Verbs,Verbs) :- !.%* + removetoolong1(Lyrics1,SyllableCount,Verbs1,Verbs2,Verbs3) :- + Verbs1=[Verb4|Verbs5], + append_list(Lyrics1,Verb4,Lyrics2), + length(Lyrics2,Length), + (Length<=SyllableCount->(append(Verbs2, [Verb4],Verbs6), removetoolong1(Lyrics1,SyllableCount,Verbs5,Verbs6,Verbs3));(removetoolong1(Lyrics1,SyllableCount,Verbs5,Verbs2,Verbs3))). + +removenotrhyming1(Lyrics,Verbs1,Verbs2) :- + rhymes(Rhymes1) + length(Lyrics,Length), + Line1 is mod(Length,4), + (Line1 is 3-1-> + (Line2 is 3-2, + removenotrhyming15(Line2,Rhymes1, + Rhymes3), + removenotrhyming2(Rhymes3, + Verbs1,[],Verbs2)); + (Line1 is 4-1-> + (removenotrhyming15(Line2,Rhymes1, + Rhymes3), + removenotrhyming2(Rhymes3, + Verbs1,[],Verbs2));Verbs1=Verbs2) + ). + +removenotrhyming15(Line2,Rhymes1,Rhymes3) :- + length(List1,Line2), + append(List1,_,Lyrics), + reverse(List1,List2), + List2=[Item1|_Items2], + reverse(Item1,Item2), + Item2=[Item3|_Items4], + member(Rhymes2,Rhymes1), + member(Item3,Rhymes2), + delete(Rhymes2,Item3,Rhymes3) + +removenotrhyming2(_Rhymes3,[],Verbs,Verbs) :- !. +removenotrhyming2(Rhymes3,Verbs1,Verbs2,Verbs3) :- + (Verbs1=[Verb4|Verbs5], + (member(Verb4,Rhymes3) + delete(Rhymes3,Verb4,Rhymes4)) + ->(append(Verbs2,[Verb4],Verbs6), + removenotrhyming2(Rhymes4,Verbs5, + Verbs6,Verbs3)); + removenotrhyming2(Rhymes3,Verbs5, + Verbs2,Verbs3)). + **/ +trialy2([],R) :- + R=[[_,['C']]]. + %%writeln([[],in,trialy2]),abort. 
+trialy2(Question,List,R) :- + %%random_member(A,List), + term_to_atom(List,List1), + repeat, + writeln([Question,%%"Please enter an item from:", + List1]), + read_string(user_input, "\n", "\r", _End2, A1), + atom_to_term(A1,A,_), + %%((number_string(A,A1)->true;atom_string(A,A1))->true;, + member(A,List), + R=[[_,A]]. + + /** + length(List,L), + Trials is L*3, + trialy22(List,Trials,[],R).**/ + +trialy22([],_,R,R) :- !. +trialy22(List,Trials,RA,RB) :- + List=[Item|Items], + trialy21(Item,Trials,R1), + append(RA,[R1],RC), + trialy22(Items,Trials,RC,RB),!. + +trialy21(Label,Trials,RA) :- + trialy3(Trials,[],R), + aggregate_all(count, member(true,R), Count), + RA=[Count,Label]. + +trialy3(0,R,R) :-!. +trialy3(Trials1,RA,RB) :- + trialy1(R1), + append(RA,[R1],RC), + Trials2 is Trials1-1, + trialy3(Trials2,RC,RB),!. + +%% try other nouns +trialy1(R1) :- + %%control11(A1), + trial0(A22), %% Control + sum(A22,0,S22), + mean(S22,A1), + trial0(A21), %% Test 1 + sum(A21,0,S02), + mean(S02,A2), + (A1>A2->R1=true;R1=fail). + +trial0(S3) :- N is 10, trial1(N,[],S),trial01(S,S3). +trial01(S1,S3) :- + sort(S1,S), + %%midpoint(S,MP), + halves(S,H1,H2), + midpoint(H1,Q1), + midpoint(H2,Q3), + IQR is Q3-Q1, + sum(S,0,S02), + mean(S02,Mean), + furthestfrommean(S,Mean,V), + D1 is 1.5*IQR, + D2 is V-Mean, + (D2>D1->(delete(S,V,S2),trial01(S2,S3));S=S3). + +trial1(0,A,A) :- !. +trial1(N,A,B) :- mindreadtest(S), append(A,[S],A2), + N1 is N-1,trial1(N1,A2,B). + +midpoint(S,MP) :- + length(S,L), + A is mod(L,2), + (A is 0-> + (M1 is L/2, M2 is M1+1,N1 is M1-1,N2 is M2-1,length(N11,N1),length(N21,N2),append(N11,[N12|_Rest1],S),append(N21,[N22|_Rest2],S),MP is (N12+N22)/2) + ; + (L2 is L+1, M1 is L2/2, N1 is M1-1,length(N11,N1),append(N11,[MP|_Rest],S))). + +halves(S,H1,H2) :- + length(S,L), + A is mod(L,2), + (A is 0-> + (M1 is L/2,length(H1,M1),append(H1,H2,S)) + ; + (L2 is L-1,M1 is L2/2,length(H1,M1),append(H1,[_|H2],S))). + +sum([],S,S):-!. 
+sum(S0,S1,S2) :- + S0=[S3|S4], + S5 is S1+S3, + sum(S4,S5,S2). + +mean(Sum,Mean) :- + Mean is Sum/2. + +furthestfrommean(S,Mean,V) :- + absdiffmean(S,Mean,[],D), + sort(D,D1), + reverse(D1,[[_,V]|_Rest]). + +absdiffmean([],_M,D,D) :- !. +absdiffmean(S,M,D1,D2) :- + S=[S1|S2], + S3 is abs(S1-M), + append(D1,[[S3,S1]],D3), + absdiffmean(S2,M,D3,D2). + +mindreadtest(Sec) :- + %% 250 br for characters to be br out with 10 br each from person to me - do when initial 250 br test done and doing 10 br test + %%comment(fiftyastest), + %%random(X),X1 is 10*X, X2 is floor(X1), (X2=<2 -> ( + %%texttobr,writeln(['true test']), %%); %% use breasonings breasoned out by computer for not by me, for job medicine for "me", at last time point + %%true), %% leave last time point blank + %%**texttobr2(640);true),%% make an A to detect reaction to gracious giving or blame of in following + get_time(TimeStamp1), + %%phrase_from_file(string(_String), 'file.txt'), + %%**texttobr2(2), %% 100 As for answer (must be br before this on same day) + %% is gracious giving or blame + get_time(TimeStamp2), + %%comment(turnoffas), + Sec is TimeStamp2 - TimeStamp1. + +/**string(String) --> list(String). + +list([]) --> []. +list([L|Ls]) --> [L], list(Ls). + +comment(fiftyastest). +comment(turnoffas). +**/ + +concat_list(A,[],A):-!. +concat_list(A,List,B) :- + List=[Item|Items], + string_concat(A,Item,C), + concat_list(C,Items,B). + +remove_dups([],List,List) :- !. +remove_dups(List1,List2,List3) :- + List1=[Item|List4], + delete(List4,Item,List5), + append(List2,[Item],List6), + remove_dups(List5,List6,List3),!. + +shell1(Command) :- + (bash_command(Command,_)-> + true; + (writeln(["Failed shell1 command: ",Command]),abort) + ),!. + +bash_command(Command, Output) :- + setup_call_cleanup(process_create(path(bash), + ['-c', Command], + [stdout(pipe(Out))]), + read_string(Out, _, Output), + close(Out)). 
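+%% Worked examples for the quartile helpers used by trial01/2 above:
+%% ?- midpoint([1,2,3,4],MP). MP = 2.5 (mean of the two middle items)
+%% ?- midpoint([1,2,3],MP). MP = 2
+%% ?- halves([1,2,3,4],H1,H2). H1 = [1,2], H2 = [3,4]
+%% ?- halves([1,2,3],H1,H2). H1 = [1], H2 = [3] (the middle item is dropped)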
+
diff --git a/Music-Composer/mindreadtestmusiccomposer-unusual.pl b/Music-Composer/mindreadtestmusiccomposer-unusual.pl
new file mode 100644
index 0000000000000000000000000000000000000000..0d286b48a9dd627ada4d42d76c1392205cc69777
--- /dev/null
+++ b/Music-Composer/mindreadtestmusiccomposer-unusual.pl
@@ -0,0 +1,1125 @@
+/**
+
+- Form
+- Which chord progression type?
+- Lyrics - random sentences, rhyming
+- Melody - random starting and ending note of first phrase, melody and harmony from chord progression type with three notes (split for melody), the circle of fifths for connection between sections
+- Which of up to 3 instruments for each of melody and harmony
+- What sections they will play
+
+- Melody cut to the length of vocals
+
+**/
+
+:- use_module(library(pio)).
+
+:- use_module(library(date)).
+%%:- include('texttobr2qb').
+%%:- include('mindreadtestshared').
+%%:- include('texttobr2').
+%%:- include('texttobr').
+%%:- include('texttobr2qbmusic').
+
+:- include('musiclibrary').
+:- include('la_strings').
+
+sectest(0) :- !.
+sectest(N1) :-
+ writeln(["\n",N1]),
+ sectest0(_Form1,_Lyrics,_Melody,_Harmony,_MelodyParts,
+ _HarmonyParts,_Vocalstubinstrument,_Song1),
+ N2 is N1-1,
+ sectest(N2),!.
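+%% Usage sketch (assumptions: the asc2mid binary has been compiled in this
+%% directory and musiclibrary/la_strings are present):
+%% ?- sectest(2).
+%% composes two songs, each written as song<timestamp>.txt, converted to
+%% song<timestamp>.mid, with accompanying lyrics and _meta.txt files.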
+ +sectest0(Form1,Lyrics,Melody,Harmony,MelodyParts,HarmonyParts,Vocalstubinstrument,Song1) :- + check_asc2mid, + %%texttobr2qb(2), %%Imagine song + form(Form1), + %%Form1=[v2,o], + find("Should the chord progression type be 1451, 1564, 1645, Classical or Classical Pop?",CPT), + remove_dups(Form1,[],Form2), + Voiceparts1=[v1,v2,c,s], + intersection(Form1,[v1,v2,c,s],Voiceparts2), + lyrics(Voiceparts1,Lyrics,Maxlength), + findall(B,(member(A1,Form2),string_concat(B1,_C,A1), + string_length(B1,1),atom_string(B,B1)),Form3), + remove_dups(Form3,[],Form4), + %%repeat, %% in case melody, harmony don't work + melodyharmony(Form4,CPT,Maxlength,Melody,Harmony), + %%writeln(melodyharmony(Form4,CPT,Maxlength,Melody,Harmony)), %% *** + instruments(Form1,MelodyInstruments,HarmonyInstruments, + MelodyParts,HarmonyParts,Vocalstubinstrument), + %%writeln(instruments(Form1,MelodyInstruments,HarmonyInstruments, +%% MelodyParts,HarmonyParts, +%% Vocalstubinstrument)), + %%writeln(rendersong(Form1,Voiceparts2,Maxlength,Melody,Harmony, + %% MelodyInstruments,HarmonyInstruments,MelodyParts, + %% HarmonyParts,Lyrics, + %% Vocalstubinstrument,Song1)), %%, + + rendersong(Form1,Voiceparts2,Maxlength,Melody,Harmony, + MelodyInstruments,HarmonyInstruments,MelodyParts, + HarmonyParts,Lyrics, + Vocalstubinstrument,Song1,File1), %%, + +Meta_file=[[form,Form1],[chord_progressions,CPT],[voice_part,Voiceparts2],[melody,Melody],[harmony,Harmony],[melody_instruments, + MelodyInstruments],[harmony_instruments,HarmonyInstruments],[melody_parts,MelodyParts],[harmony_parts, + HarmonyParts],[lyrics,Lyrics], + [genre,["anthem"]]], + + term_to_atom(Meta_file,Meta_file1), + string_atom(Meta_file2,Meta_file1), + + concat_list("",[File1,"_meta.txt"],File2), + (open_s(File2,write,Stream1), + write(Stream1,Meta_file2), + close(Stream1)),! + . + +%% n intro, vn verse, c chorus, i1 instrumental1, t2, instrumental 2, s solo, o outro + +findall1(Form2,Form3) :- + findall(B,shorten(Form2,B),Form3). 
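+%% Example: findall1/2 shortens section names to their first letters
+%% (via shorten/2), e.g. findall1([n,v1,v2,c],F) gives F = [n,v,v,c].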
+ +shorten(Form2,B) :- + member(A1,Form2),string_concat(B1,_C,A1), + string_length(B1,1),atom_string(B,B1). + +form(Form) :- + Form1=[n,v1], + find("Should a chorus or instrumental come after the first verse?",CorI), + (CorI=c-> + (append(Form1,[c,v2,c,t2,s],Form2), + find(["Should a chorus and outro or two solos come after the first solo?",1],COorSS), + (COorSS=[c,o]-> + (append(Form2,[c],Form3), + append(Form3,[o],Form)); + COorSS=[s,s], + append(Form2,[s],Form3),append(Form3,[s],Form)) + ); + (CorI=i1,append(Form1,[i1,v2,c,t2,s],Form2), + find(["Should a chorus and outro or two solos come after the first solo?",2],COorSS), + (COorSS=[c,o]-> + (append(Form2,[c],Form3), + append(Form3,[o],Form)); + COorSS=[s,s], + append(Form2,[s],Form3),append(Form3,[s],Form)) + )). + +makename(N3) :- + trialy2A(["Ack","Luc","Add","All","Brae","Skye","Whist","Dix","Wilb","Duk","Le","Ven", + "Syd","Don","Count","Black","Nei"],R1), + findbest(R1,N1), + trialy2A(["an","ae","ye","ler","ee","ur","ard","ice","ney","ald","ess","el"],R2), + findbest(R2,N2), + append([N1],[N2],N3). + +makenames(0,Ns,Ns) :- !. +makenames(Num1,N1,N2) :- + random_member(R1,["Ack","Luc","Add","All","Brae","Skye","Whist","Dix","Wilb","Duk","Le","Ven", + "Syd","Don","Count","Black","Nei"]), + random_member(R2,["an","ae","ye","ler","ee","ur","ard","ice","ney","ald","ess","el"]), + append([R1],[R2],R3), + append(N1,[R3],N3), + Num2 is Num1-1, + makenames(Num2,N3,N2). + +findsentence(Sentence,Names1,Names2) :- + makename(N1),readv(Vs),reado(Os), + makenames(5,[],N2s1),append(N2s1,Names1,N2s), + trialy2A(Vs,R1), + findbest(R1,R11), + append(Os,N2s,Os2), + trialy2A(Os2,R2), + findbest(R2,R21), + append_list(N1,[R11,R21],Sentence), + (member(R21,N2s1)-> + append(Names1,[R21],Names2); + Names2=Names1). 
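+%% Example runs (assuming the chorus/instrumental question resolves to c
+%% and the chorus-and-outro question to [c,o]):
+%% ?- form(F).
+%% F = [n,v1,c,v2,c,t2,s,c,o].
+%% findsentence/3 builds a sentence as a syllable list, e.g.
+%% S = ["Luc","an","loves","the","writ","er"], which sentencewithspaces/2
+%% renders as "Luc an loves the writ er.".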
+
+findrhyme(Phrase1,Phrase2,Names1,Names2) :-
+ reverse(Phrase1,Reverse),
+ Reverse=[Lastsyllable|_],
+ readv(Vs),reado(Os),makenames(5,[],N2s1),
+ append(N2s1,Names1,N2s),
+ append(Os,N2s,Os2),
+ trialy2A(Os2,R2),
+ findbest(R2,R21),
+ trialy2A(Vs,R1),
+ findbest(R1,R11),
+ findall(C,(member(C,N2s),reverse(C,B),B=[E|_],rhymes2(E,Lastsyllable)),Phrases22),
+ trialy2A(Phrases22,R3),
+ findbest(R3,R31),
+ append_list(R21,[R11,R31],Phrase2),
+ (member(R21,N2s1)->
+ append(Names1,[R21],Names3);
+ Names3=Names1),
+ (member(R31,N2s1)->
+ append(Names3,[R31],Names2);
+ Names2=Names3).
+
+rhymes2(Syllable1,Syllable2) :-
+ rhymes(Lists),
+ member(List1,Lists),
+ member(Syllable1,List1),
+ member(Syllable2,List1).
+
+lyrics(Form1,Lyrics,Maxlength) :-
+ %%find("Who is the song about?",Character),
+ %%lyrics1(Form,Character,[],Lyrics).
+ catch((lyrics2(Form1,[],Lyrics,[],_Names,0,Maxlength)),_,
+ lyrics(Form1,Lyrics,Maxlength)).
+
+lyrics2([],Lyrics,Lyrics,Names,Names,Maxlength,Maxlength) :- !.
+lyrics2(Form1,Lyrics1,Lyrics2,Names1,Names2,Maxlength1,Maxlength2) :-
+ Form1=[Form2|Forms3],
+ findsentence(Sentence1,Names1,Names3),
+ length(Sentence1,Length1),
+ findsentence(Sentence2,Names3,Names4),
+ length(Sentence2,Length2),
+ findrhyme(Sentence1,Sentence3,Names4,Names5),
+ length(Sentence3,Length3),
+ findrhyme(Sentence2,Sentence4,Names5,Names6),
+ length(Sentence4,Length4),
+ append(Lyrics1,[[Form2,Sentence1,Sentence2,Sentence3,Sentence4]],Lyrics3),
+ max([Length1,Length2,Length3,Length4],Maxlength1,Maxlength3),
+ lyrics2(Forms3,Lyrics3,Lyrics2,Names6,Names2,Maxlength3,Maxlength2),!.
+
+melodyharmony(Form1,CPT,Maxlength,Melody,Harmony) :-
+ Partlength is Maxlength div 3,
+ Extra is Maxlength mod 3,
+ Total is Partlength+Extra,
+ _Parts=[Partlength,Partlength,Partlength,Total],
+ (CPT=1451->findall(A,note0(_,A),Notes);
+ findall(A,note0(_,A),Notes)),
+ %% What note should the song start on?
+ %%trialy2(Notes,R1), + %%findbest(R1,R11), + R11='A', + melodyharmony(Form1,CPT,_Parts2,R11,_R2,[],Melody,[],Harmony). + +melodyharmony([],_CPT,_Parts,N,N,Melody,Melody,Harmony,Harmony) :- !. +melodyharmony(Form1,CPT,Parts,N1,N2,Melody1,Melody2,Harmony1,Harmony2) :- + Form1=[Form2|Forms3], + findmelody(Form2,CPT,Parts,N1,N3,Melody1,Melody3,Harmony1,Harmony3), + (CPT=1451->harmony1451(N3,3,N4);harmonyr(N3,5,N4)), + findmelody(Form2,CPT,Parts,N4,N5,Melody3,Melody4,Harmony3,Harmony4), + reverse(Melody3,Melody3R),Melody3R=[[_,Melody3RI]|_], %% relies on repeat *1 + reverse(Melody4,Melody4R),Melody4R=[[_,Melody4RI]|_], + not((length(Melody3RI,1),length(Melody4RI,1))), + melodyharmony(Forms3,CPT,Parts,N5,N2,Melody4,Melody2, + Harmony4,Harmony2). + +findmelody(Form,CPT,_Parts,N1,N2,Melody1,Melody2,Harmony1,Harmony2) :- + CPT=1451, + (CPT=1451->findall(A,note0(_,A),Notes); + findall(A,note0(_,A),Notes)), + %% What note should the phrase end on? + repeat, + trialy2(Notes,R1), + findbest(R1,N2), + versechorussoloprogression1451(N1,N2,Progression), + trialy2(Progression,R2), + findbest(R2,Progression2), + not(Progression2=[]), + append(Melody1,[[Form,Progression2]],Melody2), + versechorussoloprogression1451(N1,N2,Progression3), + trialy2(Progression3,R3), + findbest(R3,Progression4), + harmony(Form,CPT,Progression4,Harmony1,Harmony2). +findmelody(Form,CPT,_Parts,N1,N2,Melody1,Melody2,Harmony1,Harmony2) :- + CPT=1564, + (CPT=1451->findall(A,note0(_,A),Notes); + findall(A,note0(_,A),Notes)), + %% What note should the phrase end on? + repeat, %% *1 see other *1 + trialy2(Notes,R1), + findbest(R1,N2), + versechorussoloprogression1564(N1,N2,Progression), + trialy2(Progression,R2), + findbest(R2,Progression2), + not(Progression2=[]), + append(Melody1,[[Form,Progression2]],Melody2), + versechorussoloprogression1564(N1,N2,Progression3), + trialy2(Progression3,R3), + findbest(R3,Progression4), + harmony(Form,CPT,Progression4,Harmony1,Harmony2). 
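+
+%% The five findmelody/9 clauses (CPT = 1451, 1564, 1645, classical and
+%% classicalpop) differ only in which progression predicate they consult.
+%% Each pairs repeat with not(Progression2=[]) so that trialy2/findbest
+%% are retried until a non-empty progression is found.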
+findmelody(Form,CPT,_Parts,N1,N2,Melody1,Melody2,Harmony1,Harmony2) :- + CPT=1645, + (CPT=1451->findall(A,note0(_,A),Notes); + findall(A,note0(_,A),Notes)), + %% What note should the phrase end on? + repeat, + trialy2(Notes,R1), + findbest(R1,N2), + versechorussoloprogression1645(N1,N2,Progression), + trialy2(Progression,R2), + findbest(R2,Progression2), + not(Progression2=[]), + append(Melody1,[[Form,Progression2]],Melody2), + versechorussoloprogression1645(N1,N2,Progression3), + trialy2(Progression3,R3), + findbest(R3,Progression4), + harmony(Form,CPT,Progression4,Harmony1,Harmony2). +findmelody(Form,CPT,_Parts,N1,N2,Melody1,Melody2,Harmony1,Harmony2) :- + CPT=classical, + (CPT=1451->findall(A,note0(_,A),Notes); + findall(A,note0(_,A),Notes)), + %% What note should the phrase end on? + repeat, + trialy2(Notes,R1), + findbest(R1,N2), + classicalcomposition(N1,N2,Progression), + trialy2(Progression,R2), + findbest(R2,Progression2), + not(Progression2=[]), + append(Melody1,[[Form,Progression2]],Melody2), + classicalcomposition(N1,N2,Progression3), + trialy2(Progression3,R3), + findbest(R3,Progression4), + harmony(Form,CPT,Progression4,Harmony1,Harmony2). +findmelody(Form,CPT,_Parts,N1,N2,Melody1,Melody2,Harmony1,Harmony2) :- + CPT=classicalpop, + (CPT=1451->findall(A,note0(_,A),Notes); + findall(A,note0(_,A),Notes)), + %% What note should the phrase end on? + repeat, + trialy2(Notes,R1), + findbest(R1,N2), + popclassicalcomposition(N1,N2,Progression), + trialy2(Progression,R2), + findbest(R2,Progression2), + not(Progression2=[]), + append(Melody1,[[Form,Progression2]],Melody2), + popclassicalcomposition(N1,N2,Progression3), + trialy2(Progression3,R3), + findbest(R3,Progression4), + harmony(Form,CPT,Progression4,Harmony1,Harmony2). + +harmony(Form,CPT,Progression,Harmony1,Harmony2) :- + harmony1(Form,CPT,Progression,[],Harmony3), + append(Harmony1,[Harmony3],Harmony2). + +harmony1(_Form,_CPT,[],Harmony,Harmony) :- !. 
+harmony1(Form,CPT,Progression1,Harmony1,Harmony2) :- + Progression1=[Note1|Progression2], + (CPT=1451->(harmony1451(Note1,2,Note2),harmony1451(Note1,4,Note3)); + (harmonyr(Note1,4,Note2),harmonyr(Note1,7,Note3))), + Harmony30=[Note2,Note3], + findall(B1,(member(A1,Harmony30),string_concat(B,_C,A1),string_length(B,1),atom_string(B1,B)),Harmony31), + append([Note1],Harmony31,Harmony3), + append(Harmony1,[[Form,Harmony3]],Harmony4), + harmony1(Form,CPT,Progression2,Harmony4,Harmony2),!. + +harmony1451(Note1,Value2,Note2) :- + note(Value1,Note1),Value3 is Value1+Value2,Value4 is Value3 mod 12,note(Value4,Note2). +harmonyr(Note1,Value2,Note2) :- + note(Value1,Note1),Value3 is Value1+Value2,Value4 is Value3 mod 12,note(Value4,Note2). + +instruments(Form1,MelodyInstruments,HarmonyInstruments,MelodyParts,HarmonyParts,Vocalstubinstrument) :- + instrumentnumber(MelodyInstrumentNumber), + instrumentnumber(HarmonyInstrumentNumber), + instrumentlist(MelodyInstrumentNumber,MelodyInstruments), + instrumentlist(HarmonyInstrumentNumber,HarmonyInstruments), + Vocalstubinstrument=[0,"Acoustic Grand Piano"], + parts(Form1,MelodyInstruments,MelodyParts), + aggregate_all(count, (member(A1,MelodyParts),not(A1=[_,_,0])), Count1), + not(Count1=0), + parts(Form1,HarmonyInstruments,HarmonyParts), + aggregate_all(count, (member(A2,HarmonyParts),not(A2=[_,_,0])), Count2), + not(Count2=0). + +instrumentnumber(NumberofInstruments) :- + %% Number of melody instruments + trialy2([1,2,3],R1), + findbest(R1,NumberofInstruments). + +instrumentlist(NumberofInstruments,Instruments) :- + instrumentlist(NumberofInstruments,[],Instruments),!. + +instrumentlist(0,Instruments,Instruments) :- !. 
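+
+%% Worked example for harmony1451/3 and harmonyr/3 above, assuming the
+%% note/2 pitch-class numbering of the included music library maps e.g.
+%% note(0,'C'), note(4,'E') and note(7,'G'): harmony1451('C',4,N) binds
+%% N='E' because (0+4) mod 12 = 4, and harmonyr('C',7,N) binds N='G'
+%% because (0+7) mod 12 = 7; both transpose a pitch class up by Value2
+%% semitones within the octave.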
+instrumentlist(NumberofInstruments1,Instruments1,Instruments2) :- + Instruments=[[0,"Acoustic Grand Piano"],[1,"Bright Acoustic Piano"],[2,"Electric Grand Piano"],[3,"Honky-Tonk Piano"],[4,"Electric Piano 1"],[5,"Electric Piano 2"],[6,"Harpsichord"],[7,"Clavinet"],[8,"Celesta"],[9,"Glockenspiel"],[10,"Music Box"],[11,"Vibraphone"],[12,"Marimba"],[13,"Xylophone"],[14,"Tubular Bells"],[15,"Dulcimer"],[16,"Drawbar Organ"],[17,"Percussive Organ"],[18,"Rock Organ"],[19,"Church Organ"],[20,"Reed Organ"],[21,"Accordian"],[22,"Harmonica"],[23,"Tango Accordian"],[24,"Nylon Acoustic Guitar"],[25,"Steel Acoustic Guitar"],[26,"Jazz Electric Guitar"],[27,"Clean Electric Guitar"],[28,"Muted Electric Guitar"],[29,"Overdriven Guitar"],[30,"Distortion Guitar"],[31,"Guitar Harmonics"],[32,"Acoustic Bass"],[33,"Electric Bass (finger)"],[34,"Electric Bass (pick)"],[35,"Fretless Bass"],[36,"Slap Bass 1"],[37,"Slap Bass 2"],[38,"Synth Bass 1"],[39,"Synth Bass 2"],[40,"Violin"],[41,"Viola"],[42,"Cello"],[43,"Contrabass"],[44,"Tremolo Strings"],[45,"Pizzicato Strings"],[46,"Orchestral Harp"],[47,"Timpani"],[48,"String Ensemble 1"],[49,"String Ensemble 2"],[50,"Synth Strings 1"],[51,"Synth Strings 2"],[52,"Choir Aahs"],[53,"Voice Oohs"],[54,"Synth Choir"],[55,"Orchestra Hit"],[56,"Trumpet"],[57,"Trombone"],[58,"Tuba"],[59,"Muted Trumpet"],[60,"French Horn"],[61,"Brass Section"],[62,"Synth Brass 1"],[63,"Synth Brass 2"],[64,"Soprano Sax"],[65,"Alto Sax"],[66,"Tenor Sax"],[67,"Baritone Sax"],[68,"Oboe"],[69,"English Horn"],[70,"Bassoon"],[71,"Clarinet"],[72,"Piccolo"],[73,"Flute"],[74,"Recorder"],[75,"Pan Flute"],[76,"Blown Bottle"],[77,"Shakuhachi"],[78,"Whistle"],[79,"Ocarina"],[80,"Lead 1 (square)"],[81,"Lead 2 (sawtooth)"],[82,"Lead 3 (calliope)"],[83,"Lead 4 (chiff)"],[84,"Lead 5 (charang)"],[85,"Lead 6 (voice)"],[86,"Lead 7 (fifths)"],[87,"Lead 8 (bass + lead)"],[88,"Pad 1 (new age)"],[89,"Pad 2 (warm)"],[90,"Pad 3 (polysynth)"],[91,"Pad 4 (choir)"],[92,"Pad 5 (bowed)"],[93,"Pad 6 
(metallic)"],[94,"Pad 7 (halo)"],[95,"Pad 8 (sweep)"],[96,"FX 1 (rain)"],[97,"FX 2 (soundtrack)"],[98,"FX 3 (crystal)"],[99,"FX 4 (atmosphere)"],[100,"FX 5 (brightness)"],[101,"FX 6 (goblins)"],[102,"FX 7 (echoes)"],[103,"FX 8 (sci-fi)"],[104,"Sitar"],[105,"Banjo"],[106,"Shamisen"],[107,"Koto"],[108,"Kalimba"],[109,"Bagpipe"],[110,"Fiddle"],[111,"Shanai"],[112,"Tinkle Bell"],[113,"Agogo"],[114,"Steel Drums"],[115,"Woodblock"],[116,"Taiko Drum"],[117,"Melodic Tom"], + [118,"Synth Drum"], + [119,"Reverse Cymbal"],[120,"Guitar Fret Noise"],[121,"Breath Noise"],[122,"Seashore"]], + %% ,[123,"Bird Tweet"],[124,"Telephone Ring"],[125,"Helicopter"],[126,"Applause"],[127,"Gunshot"] + trialy2(Instruments,R1), + findbest(R1,Instrument), + append(Instruments1,[Instrument],Instruments3), + NumberofInstruments2 is NumberofInstruments1-1, + instrumentlist(NumberofInstruments2,Instruments3,Instruments2),!. + +parts(Form,Instruments,Parts) :- + parts1(Form,Instruments,[],Parts),!. + +parts1(_Form,[],Parts,Parts) :- !. +parts1(Form,Instruments1,Parts1,Parts2) :- + Instruments1=[Instrument2|Instruments3], + parts2(Form,Instrument2,[],Part3), + append(Parts1,[Part3],Parts4), + parts1(Form,Instruments3,Parts4,Parts2),!. + +parts2([],_Instrument,Part,Part) :- !. +parts2(Form1,Instrument,Part1,Part2) :- + Form1=[Section|Form2], + (member([Section,Instrument,Playing],Part1)->true; + %% Is the instrument playing in the section? + (trialy2([0,1],R1), + findbest(R1,Playing))), + append(Part1,[[Section,Instrument,Playing]],Part3), + parts2(Form2,Instrument,Part3,Part2),!. + +/** +concat_list(A,[],A):-!. +concat_list(A,List,B) :- + List=[Item|Items], + string_concat(A,Item,C), + concat_list(C,Items,B). +**/ + +concat_list(A,[],A) :-!. +concat_list(A,List,B) :- + List=[Item|Items], + concat_list2(A,[Item],C), + concat_list(C,Items,B). +concat_list2(A,List,C) :- + ((List=[[Item|Items]]->true;List=[Item|Items])-> + concat_list0(A,[Item|Items],C); + fail),!. 
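+
+%% Illustrative traces: concat_list/3 folds string_concat/3 over a list and
+%% concat_list2/3 flattens one level of nesting first, so in SWI-Prolog
+%% (where string_concat/3 accepts any atomic argument):
+%%   concat_list("BA ",[1," CR 0"],S)  binds S = "BA 1 CR 0"
+%%   concat_list("",[["a","b"],"c"],S) binds S = "abc"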
+concat_list2(A,Item,C) :- + concat_list(A,Item,C),!. + +concat_list0([],""):-!. +concat_list0(A1,B):- + A1=[A|List], + concat_list0(A,List,B),!. + +concat_list0(A,[],A):-!. +concat_list0(A,List,B) :- + List=[Item|Items], + string_concat(A,Item,C), + concat_list0(C,Items,B). + +sentencewithspaces(Sentence1,Sentence2) :- + Sentence1=[Item|Items], + string_concat(Firstletter1,Rest,Item), + string_length(Firstletter1,1), + upcase_atom(Firstletter1,Firstletter2), + concat_list(Firstletter2,[Rest,""],Item2), + sentencewithspaces(Items,Item2,Sentence3), + string_concat(Sentence3,".",Sentence2). + +sentencewithspaces([],Sentence,Sentence) :- !. +sentencewithspaces(Sentence1,Sentence2,Sentence3) :- + Sentence1=[Word|Sentence4], + concat_list("",[Sentence2," ",Word],Item2), + sentencewithspaces(Sentence4,Item2,Sentence3),!. + +rendersong(Form1,Voiceparts,_Maxlength,Melody, + Harmony,MelodyInstruments, + HarmonyInstruments,MelodyParts1, + HarmonyParts,Lyrics,Vocalstubinstrument,Song1,File1) :- + Totallength=8, + Voicetrack=1, + length(MelodyInstruments,MelodyInstrumentsLength), + length(HarmonyInstruments,HarmonyInstrumentsLength), + TracksNumber is MelodyInstrumentsLength+HarmonyInstrumentsLength+2, + Lyrics=[[_,Sentence1|_]|_],sentencewithspaces(Sentence1,Sentence2), + concat_list("format=1 tracks=", [TracksNumber, " division=384\n\nBA 1 CR 0 TR 0 CH 16 Text type 2: \"Produced by Mind Reading Music Composer by Lucian Academy (Random Generation Mode)\"\nBA 1 CR 0 TR 0 CH 16 Text type 3: \"", Sentence2, "\"\nBA 1 CR 0 TR 0 CH 1 Channel volume 127\nBA 1 CR 0 TR 0 CH 16 Tempo 63.00009\n"], Song2), + printheader(Voicetrack,Vocalstubinstrument,Song2,Song3), + %%writeln(renderv1(Form1,Voiceparts,_,Lyrics,Melody, + %% Totallength,Voicetrack,1,_,Song3,Song4)), %% **** + renderv1(Form1,Voiceparts,_,Lyrics,Melody, + Totallength,Voicetrack,1,_,Song3,Song4), + Track2 is Voicetrack+1, + renderm1(Form1,Melody,MelodyParts1, + Track2,_Track3,Song4,Song5), + Track4 is 
MelodyInstrumentsLength+2, + renderh1(Form1,Harmony,HarmonyParts, + Track4,_Track5,Song5,Song6), + length(Form1,FormLength), + TotalBars is 4*FormLength+1, + concat_list(Song6,["BA ",TotalBars," CR 0 TR 0 CH 16 End of track"],Song1), + get_time(TS),stamp_date_time(TS,date(Year,Month,Day,Hour1,Minute1,Seconda,_A,_TZ,_False),local), + concat_list("song",[Year,Month,Day,Hour1,Minute1,Seconda],File1), + concat_list(File1,[".txt"],File2), + concat_list(File1,[".mid"],File3), + open_s(File2,write,Stream), + write(Stream,Song1), + close(Stream), + concat_list("./asc2mid ",[File2," > ",File3],Command), + shell1_s(Command), + %%writeln(["Texttobr, Texttobr2 not working. Please manually breason out ",File2]), + %%N is 4, M is 4000, texttobr2(N,File2,u,M),texttobr(N,File2,u,M), + outputlyrics(File1,Lyrics), + +/** + texttobr2qb(3), %% give self breasonings + texttobr2qb(20), %%Feature 1 + texttobr2qb(20), %%Updates + texttobr2qb(20), %%Feature 2 + texttobr2qb(20), %%Updates + texttobr2qb(20), %%Feature 3 + texttobr2qb(20), %%Updates + texttobr2qb(100), %%Icon + texttobr2qb(20), %%Updates + texttobr2qb(32), %%Lyrics + texttobr2qb(36), %%Music + texttobr2qb(20), %%Updates + + texttobr2qb(2), %%Medicine + texttobr2qb(20), %%Updates + texttobr2qb(2), %%Sales + texttobr2qb(20), %%Updates + texttobr2qb(2), %%Marketing + texttobr2qb(20), %%Updates + + texttobr2qb(2), %%Graciously give or blame listener for colour imagery + texttobr2qb(20), %%Updates + + texttobr2qb(2), %%Play song + texttobr2qb(2), %%Like song + **/ +!. + +printheader(Voicetrack,Vocalstubinstrument,Song1,Song2) :- + Vocalstubinstrument=[N1,_], + N2 is N1+1, + concat_list("BA 1 CR 0 TR ",[Voicetrack," CH ",Voicetrack," Instrument ",N2,"\nBA 1 CR 0 TR ",Voicetrack," CH ",Voicetrack," NT R 0 von=127 voff=0\n"],Song3), + string_concat(Song1,Song3,Song2). + +renderv1([],_Voiceparts,_Vocalstubinstrument,_Lyrics,_Melody,_Totallength,_Track,_Bar,_Bar2,Voice,Voice) :- !. 
+renderv1(Form1,Voiceparts1,Vocalstubinstrument,Lyrics,Melody,Totallength,Track,Bar1,Bar2,Voice1,Voice2) :- + Form1=[Section|Form2], + Voiceparts1=[Section|Voiceparts2], + renderv2(Section,Lyrics,Melody,Totallength,Track, + Bar1,Bar3,Voice1,Voice3), + renderv1(Form2,Voiceparts2,Vocalstubinstrument,Lyrics,Melody, + Totallength,Track,Bar3,Bar2,Voice3,Voice2),!. + +renderv1(Form1,Voiceparts1,Vocalstubinstrument,Lyrics,Melody,Totallength, +Track,Bar1,Bar2,Voice1,Voice2) :- + Form1=[Section1|Form2], + ((Voiceparts1=[Section2|_Voiceparts2], + not(Section1=Section2))->true;Voiceparts1=[]), + Bar3 is Bar1+4, + renderv1(Form2,Voiceparts1,Vocalstubinstrument,Lyrics,Melody, + Totallength,Track,Bar3,Bar2,Voice1,Voice2),!. + +renderv2(Section,Lyrics,_Melody,_Totallength,_Track,Bar,Bar,Voice,Voice) :- + not(member([Section|_],Lyrics)),!. +renderv2(Section1,Lyrics1,Melody1,_Totallength0,Track,Bar1,Bar2,Voice1,Voice2) :- + findall(Lyrics2,(member(Lyrics2,Lyrics1), + Lyrics2=[Section1|_]),Lyrics3), + string_concat(Section2A,_C,Section1), + string_length(Section2A,1), + atom_string(Section2,Section2A), + findall(Melody2,(member(Melody2,Melody1), + Melody2=[Section2|_]),Melody3), + Lyrics3=[[_, Lyrics1A,Lyrics2A,Lyrics3A,Lyrics4A]], + Melody3=[[_, Melody1A], [_, Melody2A]], + renderline1(Lyrics1A,Melody1A,_Totallength1,Track,Bar1,Voice1,Voice3), + Bar3 is Bar1+1, + renderline1(Lyrics2A,Melody2A,_Totallength2,Track,Bar3,Voice3,Voice4), + Bar4 is Bar3+1, + renderline1(Lyrics3A,Melody1A,_Totallength3,Track,Bar4,Voice4,Voice5), + Bar5 is Bar4+1, + renderline1(Lyrics4A,Melody2A,_Totallength4,Track,Bar5,Voice5,Voice6), + Bar6 is Bar5+1, + delete(Lyrics1,[Section1|_],Lyrics5), + renderv2(Section1,Lyrics5,Melody1,_Totallength,Track,Bar6,Bar2,Voice6,Voice2). 
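+
+%% renderv2/9 renders one stanza (two lyric lines plus their two rhyming
+%% lines) into four consecutive bars with an A-B-A-B melody pattern
+%% (Melody1A, Melody2A, Melody1A, Melody2A), so each rendered section
+%% advances the bar counter by 4, matching the Bar3 is Bar1+4 skip in the
+%% second renderv1 clause for sections without vocals.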
+ +renderline1(Lyrics2,Melody1,_Totallength,Track,Bar1,Voice1,Voice2) :- + %%Lyrics1=[_,Lyrics2], + length(Lyrics2,Lyrics2Length), + %%Rests is Totallength-Lyrics2Length, + BarTimes1=["0","1/2","1","1+1/2","2","2+1/2","3","3+1/2"], + generatemelody(Lyrics2Length,Melody1,Melody2), + renderline2(BarTimes1,BarTimes2,Melody2,Track,Bar1,Voice1,Voice3), + renderlinerests(BarTimes2,_BarTimes3,Track,Bar1,Voice3,Voice2). + +generatemelody(Lyrics2Length,Melody1,Melody2) :- + generatemelody1(Melody1,[],Melody3), + length(Melody3,Melody2Length), + changelength(Lyrics2Length,Melody2Length,Melody3,Melody2). + +generatemelody1([],Melody,Melody) :- !. +generatemelody1(Melody1,Melody2,Melody3) :- + Melody1=[Melody4|Melody5], + append(Melody2,[Melody4,Melody4],Melody6), + generatemelody1(Melody5,Melody6,Melody3),!. + +%%changelength(_Melody2Length,Melody,Melody) :- !. +changelength(Lyrics2Length,Melody2Length,Melody,Melody) :- + Lyrics2Length=Melody2Length,!. +changelength(Lyrics2Length,Melody2Length,Melody1,Melody2) :- + Lyrics2Length > Melody2Length, + Length is Lyrics2Length-Melody2Length, + repeatlastnote1(Length,Melody1,Melody2). +changelength(Lyrics2Length,Melody2Length,Melody1,Melody2) :- + Lyrics2Length < Melody2Length, + length(Melody2,Lyrics2Length), + append(Melody2,_,Melody1). + +repeatlastnote1(Length,Melody1,Melody2) :- + reverse(Melody1,Melody3), + Melody3=[Item|_], + repeatlastnote2(Length,Item,Melody1,Melody2),!. + +repeatlastnote2(0,_Item,Melody,Melody) :- !. +repeatlastnote2(Length1,Item,Melody1,Melody2) :- + append(Melody1,[Item],Melody3), + Length2 is Length1-1, + repeatlastnote2(Length2,Item,Melody3,Melody2),!. + +renderline2(BarTimes,BarTimes,[],_Track,_Bar1,Voice,Voice) :- !. 
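+
+%% Illustrative trace: generatemelody1/3 doubles every note and
+%% changelength/4 then pads with the final note or truncates to the
+%% syllable count, so generatemelody(5,['A','B'],M) binds
+%% M = ['A','A','B','B','B'] (doubling yields four notes, one short of
+%% five syllables, so the last note repeats once).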
+renderline2(BarTimes1,BarTimes4,Melody1,Track,Bar,Voice1,Voice2) :- + Melody1=[Melody2|Melody3], + BarTimes1=[BarTimes2|BarTimes3], + concat_list("BA ",[Bar," CR ",BarTimes2," TR ",Track," CH ",Track," NT ",Melody2,"- 1/2 voff=0\n"],Song), + string_concat(Voice1,Song,Voice3), + renderline2(BarTimes3,BarTimes4,Melody3,Track,Bar,Voice3,Voice2). + +renderlinerests(BarTimes,BarTimes,_Track,_Bar1,Voice,Voice) :- !. +renderlinerests(BarTimes1,BarTimes2,Track,Bar,Voice1,Voice2) :- + BarTimes1=[BarTimes2|BarTimes3], + concat_list("BA ",[Bar," CR ",BarTimes2," TR ",Track," CH ",Track," NT R 1/2 voff=0\n"],Song), + string_concat(Voice1,Song,Voice3), + renderlinerests(BarTimes3,BarTimes2,Track,Bar,Voice3,Voice2). + +longtoshortform(Section1,Section2) :- + string_concat(Section2A,_C,Section1), + string_length(Section2A,1), + atom_string(Section2,Section2A),!. + + +renderm1(_Form1,_Melody,[],_Track1,_Track2,Song,Song) :- !. +renderm1(Form1,Melody,MelodyParts1,Track1,Track2,Song1,Song2) :- + %%length(Form1,FormLength), + MelodyParts1=[MelodyParts2|MelodyParts3], + MelodyParts2=[[_A,[InstrumentNumber1,_B],_C]|_D], + InstrumentNumber2 is InstrumentNumber1,%% + 1, + printheader(Track1,[InstrumentNumber2,_],Song1,Song3), + %%renderm21(Form1,Melody,MelodyParts1,Track1,1,_,Song3,Song4), + renderm21(Form1,Melody,MelodyParts2,Track1,1,_E,Song3,Song4), + Track3 is Track1+1, + renderm1(Form1,Melody,MelodyParts3,Track3,Track2,Song4,Song2),!. + + +renderm21(_Form1,_Melody,[],_Track,Bar,Bar,Voice,Voice) :- !. +renderm21(Form1,Melody,MelodyParts1,Track,Bar1,Bar2,Voice1,Voice2) :- + MelodyParts1=[MelodyPart2|MelodyParts3], + MelodyPart2=[Section1,[_,_],1], + longtoshortform(Section1,Section2), + renderm22(Section2,Melody,Track,Bar1,Bar3,Voice1,Voice3), + renderm21(Form1,Melody,MelodyParts3,Track,Bar3,Bar2,Voice3,Voice2),!. 
+ +renderm21(Form1,Melody,MelodyParts1,Track,Bar1,Bar2,Voice1,Voice2) :- + MelodyParts1=[MelodyPart2|MelodyParts3], + MelodyPart2=[_Section,[_,_],0], + Bar3 is Bar1+4, + renderm21(Form1,Melody,MelodyParts3,Track,Bar3,Bar2,Voice1,Voice2),!. + +renderm22(Section2,Melody1,_Track,Bar,Bar,Voice,Voice) :- + findall(Melody2,(member(Melody2,Melody1), + Melody2=[Section1|_],longtoshortform(Section1,Section2)),List2),List2=[],!. +renderm22(Section1,Melody1,Track,Bar1,Bar2,Voice1,Voice2) :- + /** + findall(Lyrics2,(member(Lyrics2,Lyrics1), + Lyrics2=[Section1|_]),Lyrics3), + string_concat(Section2A,_C,Section1), + string_length(Section2A,1), + atom_string(Section2,Section2A), + **/ + findall(Melody2,(member(Melody2,Melody1), + Melody2=[Section1|_]),Melody3), + %%Lyrics3=[[_, Lyrics1A,Lyrics2A,Lyrics3A,Lyrics4A]], + Melody3=[[_, Melody1A], [_, Melody2A]], + renderline1m(Melody1A,Track,Bar1,Voice1,Voice3), + Bar3 is Bar1+1, + renderline1m(Melody2A,Track,Bar3,Voice3,Voice4), + Bar4 is Bar3+1, + renderline1m(Melody1A,Track,Bar4,Voice4,Voice5), + Bar5 is Bar4+1, + renderline1m(Melody2A,Track,Bar5,Voice5,Voice6), + Bar6 is Bar5+1, + delete(Melody3,[Section1|_],Melody5), + renderm22(Section1,Melody5,Track,Bar6,Bar2,Voice6,Voice2). + +renderline1m(Melody1,Track,Bar1,Voice1,Voice2) :- + %%Lyrics1=[_,Lyrics2], + %%Rests is Totallength-Lyrics2Length, + BarTimes1=["0","1/2","1","1+1/2","2","2+1/2","3","3+1/2"], + generatemelodym(Melody1,Melody2), + renderline2(BarTimes1,BarTimes2,Melody2,Track,Bar1,Voice1,Voice3), + renderlinerests(BarTimes2,_BarTimes3,Track,Bar1,Voice3,Voice2). + +generatemelodym(Melody1,Melody2) :- + generatemelody1(Melody1,[],Melody3), + length(Melody3,Melody2Length), + Lyrics2Length=8, + changelength(Lyrics2Length,Melody2Length,Melody3,Melody2). + +renderh1(_Form1,_Harmony,[],_Track1,_Track2,Song,Song) :- !. 
+renderh1(Form1,Harmony,HarmonyParts1,Track1,Track2,Song1,Song2) :- + %%length(Form1,FormLength), + HarmonyParts1=[HarmonyParts2|HarmonyParts3], + HarmonyParts2=[[_A,[InstrumentNumber1,_B],_C]|_D], + InstrumentNumber2 is InstrumentNumber1,%% + 1, + printheader(Track1,[InstrumentNumber2,_],Song1,Song3), + %%renderh21(Form1,Harmony,HarmonyParts1,Track1,1,_,Song3,Song4), + renderh21(Form1,Harmony,HarmonyParts2,Track1,1,_E,Song3,Song4), + Track3 is Track1+1, + renderh1(Form1,Harmony,HarmonyParts3,Track3,Track2,Song4,Song2),!. + + +renderh21(_Form1,_Harmony,[],_Track,Bar,Bar,Voice,Voice) :- !. +renderh21(Form1,Harmony,HarmonyParts1,Track,Bar1,Bar2,Voice1,Voice2) :- + HarmonyParts1=[HarmonyPart2|HarmonyParts3], + HarmonyPart2=[Section1,[_,_],1], + longtoshortform(Section1,Section2), + renderh22(Section2,Harmony,Track,Bar1,Bar3,Voice1,Voice3), + renderh22(Section2,Harmony,Track,Bar3,Bar4,Voice3,Voice4), + renderh21(Form1,Harmony,HarmonyParts3,Track,Bar4,Bar2,Voice4,Voice2),!. + +renderh21(Form1,Harmony,HarmonyParts1,Track,Bar1,Bar2,Voice1,Voice2) :- + HarmonyParts1=[HarmonyPart2|HarmonyParts3], + HarmonyPart2=[_Section,[_,_],0], + Bar3 is Bar1+4, + renderh21(Form1,Harmony,HarmonyParts3,Track,Bar3,Bar2,Voice1,Voice2),!. + +renderh22(Section2,Harmony1,_Track,Bar,Bar,Voice,Voice) :- + + %%longtoshortform(Section21,Section2), + findall(Harmony2,(member(Harmony3,Harmony1),member(Harmony2,Harmony3), + Harmony2=[Section1|_],longtoshortform(Section1,Section2)),List2),List2=[],!. 
+renderh22(Section1,Harmony1A,Track,Bar1,Bar2,Voice1,Voice2) :- + Harmony1A=[Harmony1A1|Harmony1A2], + /** + findall(Lyrics2,(member(Lyrics2,Lyrics1), + Lyrics2=[Section1|_]),Lyrics3), + string_concat(Section2A,_C,Section1), + string_length(Section2A,1), + atom_string(Section2,Section2A), + **/ + + %% shorten + %%string_concat(B11,_C1,Section11), + %%string_length(B11,1),atom_string(Section1,B11), + + + + findall(Harmony2,(member(Harmony2,Harmony1A1), + Harmony2=[Section1|_] + %%,string_concat(B1,_C,Section1A), + %%string_length(B1,1),atom_string(Section1,B1) + ),Harmony3), + %% shorten + + + + (not(Harmony3=[])-> + %%Lyrics3=[[_, Lyrics1A,Lyrics2A,Lyrics3A,Lyrics4A]], + %%Harmony3=[[_, Harmony1A], [_, Harmony2A]], + (renderlines1h(Harmony3,Track,Bar1,Voice1,Voice3), + Bar3 is Bar1+1);(Bar3 is Bar1,Voice3=Voice1)), + /** + Bar3 is Bar1+1, + renderline1h(Harmony2A,Track,Bar3,Voice3,Voice4), + Bar4 is Bar3+1, + renderline1h(Harmony1A,Track,Bar4,Voice4,Voice5), + Bar5 is Bar4+1, + renderline1h(Harmony2A,Track,Bar5,Voice5,Voice6), + Bar6 is Bar5+1, + **/ + %%delete(Harmony3,[Section1|_],Harmony5), + renderh22(Section1,Harmony1A2,Track,Bar3,Bar2,Voice3,Voice2). + +renderlines1h([],_Track,_Bar,Voice,Voice) :- !. +renderlines1h(Harmony1,Track,Bar1,Voice1,Voice2) :- + %%Lyrics1=[_,Lyrics2], + %%Rests is Totallength-Lyrics2Length, + BarTimes1=["0","1/2","1","1+1/2","2","2+1/2","3","3+1/2"], + generatemelodyh(Harmony1,Harmony2), +%%writeln(generatemelodyh(Harmony1,Harmony2)), + renderlineh2(BarTimes1,BarTimes2,Harmony2,Track,Bar1,Voice1,Voice3), + renderlinerests(BarTimes2,_BarTimes3,Track,Bar1,Voice3,Voice2). + +generatemelodyh(Harmony1,Harmony2) :- + generatemelodyh1(Harmony1,[],Harmony3), + length(Harmony3,Harmony2Length), + Lyrics2Length=8, + changelengthh(Lyrics2Length,Harmony2Length, + Harmony3,Harmony2). + +generatemelodyh1([],Melody,Melody) :- !. 
+generatemelodyh1(Melody1,Melody2,Melody3) :- + Melody1=[Melody4|Melody5], + append(Melody2,[Melody4,Melody4],Melody6), + generatemelodyh1(Melody5,Melody6,Melody3),!. + +%%changelengthh(_Melody2Length,Melody,Melody) :- !. +changelengthh(Lyrics2Length,Melody2Length,Melody,Melody) :- + Lyrics2Length=Melody2Length,!. +changelengthh(Lyrics2Length,Melody2Length,Melody1,Melody2) :- + Lyrics2Length > Melody2Length, + Length is Lyrics2Length-Melody2Length, + repeatlastnoteh1(Length,Melody1,Melody2). +changelengthh(Lyrics2Length,Melody2Length,Melody1,Melody2) :- + Lyrics2Length < Melody2Length, + length(Melody2,Lyrics2Length), + append(Melody2,_,Melody1). + +repeatlastnoteh1(Length,Melody1,Melody2) :- + reverse(Melody1,Melody3), + Melody3=[Item|_], + repeatlastnoteh2(Length,Item,Melody1,Melody2),!. + +repeatlastnoteh2(0,_Item,Melody,Melody) :- !. +repeatlastnoteh2(Length1,Item,Melody1,Melody2) :- + append(Melody1,[Item],Melody3), + Length2 is Length1-1, + repeatlastnoteh2(Length2,Item,Melody3,Melody2),!. + +renderlineh2(BarTimes,BarTimes,[],_Track,_Bar1,Voice,Voice) :- !. +renderlineh2(BarTimes1,BarTimes4,Melody1,Track,Bar,Voice1,Voice2) :- + Melody1=[Melody2|Melody3], + Melody2=[_,[Melody21,Melody22,Melody23]], + BarTimes1=[BarTimes2|BarTimes3], + concat_list("BA ",[Bar," CR ",BarTimes2," TR ",Track," CH ",Track," NT ",Melody21,"- 1/2 voff=0\n"],Song1), + concat_list("BA ",[Bar," CR ",BarTimes2," TR ",Track," CH ",Track," NT ",Melody22,"- 1/2 voff=0\n"],Song2), + concat_list("BA ",[Bar," CR ",BarTimes2," TR ",Track," CH ",Track," NT ",Melody23,"- 1/2 voff=0\n"],Song3), + concat_list(Voice1,[Song1,Song2,Song3],Voice3), + renderlineh2(BarTimes3,BarTimes4,Melody3,Track,Bar,Voice3,Voice2). 
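+
+%% Each rendered event is one line of the text format consumed by the
+%% bundled asc2mid converter, e.g.
+%%   BA 2 CR 1/2 TR 3 CH 3 NT C- 1/2 voff=0
+%% (bar number, beat within the bar, track, channel, note and duration).
+%% renderlineh2/7 writes three such lines per chord: the root plus the two
+%% harmony notes computed by harmony1/5.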
+
+outputlyrics(File1,Lyrics1) :-
+	Lyrics1=[Lyrics2|_Lyrics3],
+	Lyrics2=[_|Lyrics4],
+	Lyrics4=[Lyrics5|_Lyrics6],
+	sentencewithspaces(Lyrics5,Lyrics7A),
+	string_concat(Lyrics7A,"\n\n",Lyrics7),
+	outputlyrics1(Lyrics1,Lyrics7,Lyrics8),
+	concat_list(File1,["lyrics.txt"],File2),
+	open_s(File2,write,Stream),
+	write(Stream,Lyrics8),
+	close(Stream),
+%%texttobr2(u,File2,u,u),texttobr(u,File2,u,u).
+	writeln(["Texttobr, Texttobr2 not working. Please manually breason out ",File2]).
+
+outputlyrics1([],Lyrics,Lyrics) :- !.
+outputlyrics1(Lyrics1,Lyrics5,Lyrics6) :-
+	Lyrics1=[Lyrics2|Lyrics3],
+	Lyrics2=[_|Lyrics4],
+	outputlyrics2(Lyrics4,Lyrics5,Lyrics7),
+	string_concat(Lyrics7,"\n",Lyrics8),
+	outputlyrics1(Lyrics3,Lyrics8,Lyrics6).
+
+outputlyrics2([],Lyrics,Lyrics) :- !.
+outputlyrics2(Lyrics1,Lyrics5,Lyrics6) :-
+	Lyrics1=[Lyrics2|Lyrics3],
+	sentencewithspaces(Lyrics2,Lyrics4),
+	%%writeln(Lyrics4),
+	concat_list(Lyrics5,[Lyrics4,"\n"],Lyrics7),
+	outputlyrics2(Lyrics3,Lyrics7,Lyrics6).
+
+max([],M,M) :- !.
+max(List1,Max1,Max2) :-
+	List1=[Item|List2],
+	Item>=Max1,
+	max(List2,Item,Max2),!.
+max(List1,Max1,Max2) :-
+	List1=[Item|List2],
+	Item<Max1,
+	max(List2,Max1,Max2),!.
+
+find("Should a chorus or instrumental come after the first verse?",CorI) :-
+	trialy2([c,i1],R),
+	findbest(R,CorI).
+
+find(["Should a chorus and outro or two solos come after the first solo?",1],COorSS) :-
+	trialy2([[c,o],[s,s]],R),
+	findbest(R,COorSS).
+
+find(["Should a chorus and outro or two solos come after the first solo?",2],COorSS) :-
+	trialy2([[c,o],[s,s]],R),
+	findbest(R,COorSS).
+
+%%find("Who is the song about?",Character) :-
+%%	trialy2(["Ma","til","da"],["Jo","se","phine"],["Hap","py"],["Ha","rold"],R),
+%%	findbest(R,Character).
+
+find("Should the chord progression type be 1451, 1564, 1645, Classical or Classical Pop?",CPT) :-
+	trialy2([1451, 1564, 1645
+	%%
+	, classical, classicalpop
+	],R),
+	findbest(R,CPT).
+ +/**generatelyricslistsverse(Character,Lyrics1,Lyrics2):- +%% read c, o, v + reado(Objects), + readv(Verbs), + + charobj( %% character verb object first + %% object verb object pairs + +lyrics1([],_Character,Lyrics,Lyrics) :- !. +lyrics1(Form1,Character,Lyrics1,Lyrics2) :- + Form1=[Form2|Forms2], + lyrics2(Form2,Character,Lyrics1,Lyrics3), + +lyrics2([instrumental,_],_Character,Lyrics1,Lyrics2) :- + +append_list(Lyrics1,[[],[],[],[]],Lyrics2). +lyrics2([verse,_],Character,Lyrics1,Lyrics2) :- + lyricsv1(Lyrics1,[Character],Lyrics3), + append_list(Lyrics1,[Lyrics3,*],Lyrics2), + . + +lyricsv1(Lyrics0,Lyrics1,Lyrics2) :- + readsc(SyllableCount), + readv(Verbs1), + +reado(Objects1), + **/ +%%removetoolongandnotrhyming(Lyrics0,Lyrics1,SyllableCount,Verbs1,Verbs2,Objects1,Objects2),] +%%findv +%% same for o +%% findo + %% choose words that fit syllable count + +append_list(A,[],A) :-!. +append_list(A,List,B) :- + List=[Item|Items], + append2(A,[Item],C), + append_list(C,Items,B). +append2(A,List,C) :- + (List=[Item]-> + append(A,Item,C); + fail),!. +append2(A,Item,C) :- + append_list(A,Item,C),!. + +readsc(7). +readv([["loves"],["is"],["has"],["is","in"],["moves","to"],["nur","tures"],["needs"],["makes"],["lifts"],["finds"],["forms"],["goes","to"],["writes","on"],["reads","on"],["feels"],["is"]]). + +reado([["one"],["the","oth","er"],["the","runn","er"],["the","draw","er"],["the","count","er"],["the","graph","er"],["the","test","er"],["the","breaths","on","er"],["the","writ","er"],["the","spell","er"],["the","updat","er"],["the","check","er"],["the","choos","er"],["the","ess","ence"],["the","comb","in","er"],["the","mir","ac","le"],["the","trans","lat","or"],["the","gramm","ar"]]). + +rhymes([["one","er","or","ar","an","ae","er","ler","ur","ard","ney","ald","ess","el"],["le","py","ye","ee","ice"]]). + +%%removetoolongandnotrhyming(Lyrics1,SyllableCount,[],Verbs,Verbs) :- !. 
+/**removetoolongandnotrhyming(Lyrics0,Lyrics1,SyllableCount,Verbs1,Verbs2,Objects1,Objects2) :- +%% makes list of all combos, checks, asks, asks + removetoolong1(Lyrics1,SyllableCount, + Verbs1,[],Verbs3), %% do after end + * %% not until object + %% find v x + removenotrhyming1(Lyrics0,Verbs3, + Verbs4), + + removetoolong1(Lyrics1,SyllableCount,[],Verbs,Verbs) :- !.%* + removetoolong1(Lyrics1,SyllableCount,Verbs1,Verbs2,Verbs3) :- + Verbs1=[Verb4|Verbs5], + append_list(Lyrics1,Verb4,Lyrics2), + length(Lyrics2,Length), + (Length<=SyllableCount->(append(Verbs2, [Verb4],Verbs6), removetoolong1(Lyrics1,SyllableCount,Verbs5,Verbs6,Verbs3));(removetoolong1(Lyrics1,SyllableCount,Verbs5,Verbs2,Verbs3))). + +removenotrhyming1(Lyrics,Verbs1,Verbs2) :- + rhymes(Rhymes1) + length(Lyrics,Length), + Line1 is mod(Length,4), + (Line1 is 3-1-> + (Line2 is 3-2, + removenotrhyming15(Line2,Rhymes1, + Rhymes3), + removenotrhyming2(Rhymes3, + Verbs1,[],Verbs2)); + (Line1 is 4-1-> + (removenotrhyming15(Line2,Rhymes1, + Rhymes3), + removenotrhyming2(Rhymes3, + Verbs1,[],Verbs2));Verbs1=Verbs2) + ). + +removenotrhyming15(Line2,Rhymes1,Rhymes3) :- + length(List1,Line2), + append(List1,_,Lyrics), + reverse(List1,List2), + List2=[Item1|_Items2], + reverse(Item1,Item2), + Item2=[Item3|_Items4], + member(Rhymes2,Rhymes1), + member(Item3,Rhymes2), + delete(Rhymes2,Item3,Rhymes3) + +removenotrhyming2(_Rhymes3,[],Verbs,Verbs) :- !. +removenotrhyming2(Rhymes3,Verbs1,Verbs2,Verbs3) :- + (Verbs1=[Verb4|Verbs5], + (member(Verb4,Rhymes3) + delete(Rhymes3,Verb4,Rhymes4)) + ->(append(Verbs2,[Verb4],Verbs6), + removenotrhyming2(Rhymes4,Verbs5, + Verbs6,Verbs3)); + removenotrhyming2(Rhymes3,Verbs5, + Verbs2,Verbs3)). + **/ +trialy2([],R) :- + R=[[_,['C']]]. + %%writeln([[],in,trialy2]),abort. +trialy2(List,R) :- + trialy2B(List,R). +%%writeln([list,List]), +%%notrace, +%trialy2A([],R) :- +% R=[[_,'A']]. +trialy2A(List,R) :- + trialy2B(List,R). +trialy2B(List,R) :- + random_member(A,List), + R=[[_,A]]. 
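+
+%% In this version trialy2/2 simply draws a random member: trialy2([a,b,c],R)
+%% binds R = [[_,X]] with X one of a, b or c, and findbest/2 then extracts X.
+%% The trialy22/trialy21/trialy3 predicates below implement the earlier
+%% mind-reading version, which scores each candidate label over repeated
+%% timing trials; only their commented-out driver call remains.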
+ + /** + length(List,L), + Trials is L*3, + trialy22(List,Trials,[],R).**/ + +trialy22([],_,R,R) :- !. +trialy22(List,Trials,RA,RB) :- + List=[Item|Items], + trialy21(Item,Trials,R1), + append(RA,[R1],RC), + trialy22(Items,Trials,RC,RB),!. + +trialy21(Label,Trials,RA) :- + trialy3(Trials,[],R), + aggregate_all(count, member(true,R), Count), + RA=[Count,Label]. + +trialy3(0,R,R) :-!. +trialy3(Trials1,RA,RB) :- + trialy1(R1), + append(RA,[R1],RC), + Trials2 is Trials1-1, + trialy3(Trials2,RC,RB),!. + +%% try other nouns +trialy1(R1) :- + %%control11(A1), + trial0(A22), %% Control + sum(A22,0,S22), + mean(S22,A1), + trial0(A21), %% Test 1 + sum(A21,0,S02), + mean(S02,A2), + (A1>A2->R1=true;R1=fail). + +trial0(S3) :- N is 10, trial1(N,[],S),trial01(S,S3). +trial01(S1,S3) :- + sort(S1,S), + %%midpoint(S,MP), + halves(S,H1,H2), + midpoint(H1,Q1), + midpoint(H2,Q3), + IQR is Q3-Q1, + sum(S,0,S02), + mean(S02,Mean), + furthestfrommean(S,Mean,V), + D1 is 1.5*IQR, + D2 is V-Mean, + (D2>D1->(delete(S,V,S2),trial01(S2,S3));S=S3). + +trial1(0,A,A) :- !. +trial1(N,A,B) :- mindreadtest(S), append(A,[S],A2), + N1 is N-1,trial1(N1,A2,B). + +midpoint(S,MP) :- + length(S,L), + A is mod(L,2), + (A is 0-> + (M1 is L/2, M2 is M1+1,N1 is M1-1,N2 is M2-1,length(N11,N1),length(N21,N2),append(N11,[N12|_Rest1],S),append(N21,[N22|_Rest2],S),MP is (N12+N22)/2) + ; + (L2 is L+1, M1 is L2/2, N1 is M1-1,length(N11,N1),append(N11,[MP|_Rest],S))). + +halves(S,H1,H2) :- + length(S,L), + A is mod(L,2), + (A is 0-> + (M1 is L/2,length(H1,M1),append(H1,H2,S)) + ; + (L2 is L-1,M1 is L2/2,length(H1,M1),append(H1,[_|H2],S))). + +sum([],S,S):-!. +sum(S0,S1,S2) :- + S0=[S3|S4], + S5 is S1+S3, + sum(S4,S5,S2). + +mean(Sum,Mean) :- + Mean is Sum/2. + +furthestfrommean(S,Mean,V) :- + absdiffmean(S,Mean,[],D), + sort(D,D1), + reverse(D1,[[_,V]|_Rest]). + +absdiffmean([],_M,D,D) :- !. +absdiffmean(S,M,D1,D2) :- + S=[S1|S2], + S3 is abs(S1-M), + append(D1,[[S3,S1]],D3), + absdiffmean(S2,M,D3,D2). 
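+
+%% Illustrative traces for the statistics helpers: midpoint([1,2,3,4],MP)
+%% binds MP = 2.5, and halves([1,2,3,4,5],H1,H2) binds H1 = [1,2],
+%% H2 = [4,5] (the middle element is dropped for odd lengths). trial01/2
+%% uses them to discard, one sample at a time, any value lying more than
+%% 1.5*IQR from the value returned by mean/2 (computed here as Sum/2).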
+ +mindreadtest(Sec) :- + %% 250 br for characters to be br out with 10 br each from person to me - do when initial 250 br test done and doing 10 br test + %%comment(fiftyastest), + %%random(X),X1 is 10*X, X2 is floor(X1), (X2=<2 -> ( + %%texttobr,writeln(['true test']), %%); %% use breasonings breasoned out by computer for not by me, for job medicine for "me", at last time point + %%true), %% leave last time point blank + %%**texttobr2(640);true),%% make an A to detect reaction to gracious giving or blame of in following + get_time(TimeStamp1), + %%phrase_from_file(string(_String), 'file.txt'), + %%**texttobr2(2), %% 100 As for answer (must be br before this on same day) + %% is gracious giving or blame + get_time(TimeStamp2), + %%comment(turnoffas), + Sec is TimeStamp2 - TimeStamp1. + +/**string(String) --> list(String). + +list([]) --> []. +list([L|Ls]) --> [L], list(Ls). + +comment(fiftyastest). +comment(turnoffas). +**/ + +remove_dups([],List,List) :- !. +remove_dups(List1,List2,List3) :- + List1=[Item|List4], + delete(List4,Item,List5), + append(List2,[Item],List6), + remove_dups(List5,List6,List3),!. + +shell1(Command) :- + (bash_command(Command,_)-> + true; + (writeln(["Failed shell1 command: ",Command]),abort) + ),!. + +bash_command(Command, Output) :- + setup_call_cleanup(process_create(path(bash), + ['-c', Command], + [stdout(pipe(Out))]), + read_string(Out, _, Output), + close(Out)). + diff --git a/Music-Composer/mindreadtestmusiccomposer.pl b/Music-Composer/mindreadtestmusiccomposer.pl new file mode 100644 index 0000000000000000000000000000000000000000..5628386148af2ef9fb546908eda5529517d7db32 --- /dev/null +++ b/Music-Composer/mindreadtestmusiccomposer.pl @@ -0,0 +1,1041 @@ +/** + +- Form +- Which chord progression type? 
+- Lyrics - random sentences, rhyming +- Melody - random starting and ending note of first phrase, melody and harmony from chord progression type with three notes (split for melody), the circle of fifths for connection between sections +- Which of up to 3 instruments for each of melody and harmony +- What sections they will play + +- Melody cut to the length of vocals + +**/ + +%%use_module(library(pio)). + +:- use_module(library(date)). +%%:- include('texttobr2qb'). +%%:- include('mindreadtestshared'). + +:- include('musiclibrary'). +:- include('1451'). +:- include('1564'). +:- include('1645'). +:- include('popclassical'). +:- include('classical'). +:- include('la_strings'). + +sectest0(Form1,Lyrics,Melody,Harmony,MelodyParts,HarmonyParts,Vocalstubinstrument,Song1) :- + check_asc2mid, + form(Form1), + %%Form1=[v2,o], + find("Should the chord progression type be 1451, 1564, 1645, Classical or Classical Pop?",CPT), + remove_dups(Form1,[],Form2), + Voiceparts1=[v1,v2,c,s], + intersection(Form1,[v1,v2,c,s],Voiceparts2), + lyrics(Voiceparts1,Lyrics,Maxlength), + findall(B,(member(A1,Form2),string_concat(B1,_C,A1), + string_length(B1,1),atom_string(B,B1)),Form3), + remove_dups(Form3,[],Form4), + %%repeat, %% in case melody, harmony don't work + melodyharmony(Form4,CPT,Maxlength,Melody,Harmony), + %%writeln(melodyharmony(Form4,CPT,Maxlength,Melody,Harmony)), %% *** + instruments(Form1,MelodyInstruments,HarmonyInstruments, + MelodyParts,HarmonyParts,Vocalstubinstrument), + writeln(instruments(Form1,MelodyInstruments,HarmonyInstruments, + MelodyParts,HarmonyParts, + Vocalstubinstrument)), + %%writeln(rendersong(Form1,Voiceparts2,Maxlength,Melody,Harmony, + %% MelodyInstruments,HarmonyInstruments,MelodyParts, + %% HarmonyParts,Lyrics, + %% Vocalstubinstrument,Song1)), %%, + + rendersong(Form1,Voiceparts2,Maxlength,Melody,Harmony, + MelodyInstruments,HarmonyInstruments,MelodyParts, + HarmonyParts,Lyrics, + Vocalstubinstrument,Song1), %%, + outputlyrics(Lyrics),!. 
+
+%% n intro, vn verse, c chorus, i1 instrumental1, t2 instrumental 2, s solo, o outro
+
+findall1(Form2,Form3) :-
+ findall(B,shorten(Form2,B),Form3).
+
+shorten(Form2,B) :-
+ member(A1,Form2),string_concat(B1,_C,A1),
+ string_length(B1,1),atom_string(B,B1).
+
+form(Form) :-
+ Form1=[n,v1],
+ find("Should a chorus or instrumental come after the first verse?",CorI),
+ (CorI=c->
+ (append(Form1,[i1,v2,c,t2,s],Form2),
+ find(["Should a chorus and outro or two solos come after the first solo?",1],COorSS),
+ (COorSS=[c,o]->
+ (append(Form2,[c],Form3),
+ append(Form3,[o],Form));
+ COorSS=[s,s],
+ append(Form2,[s],Form3),append(Form3,[s],Form))
+ );
+ (CorI=i1,append(Form1,[i1,v2,c,t2,s],Form2),
+ find(["Should a chorus and outro or two solos come after the first solo?",2],COorSS),
+ (COorSS=[c,o]->
+ (append(Form2,[c],Form3),
+ append(Form3,[o],Form));
+ COorSS=[s,s],
+ append(Form2,[s],Form3),append(Form3,[s],Form))
+ )).
+
+makename(N3) :-
+ trialy2(["Ack","Luc","Add","All","Brae","Skye","Whist","Dix","Wilb","Duk","Le","Ven",
+ "Syd","Don","Count","Black","Nei"],R1),
+ findbest(R1,N1),
+ trialy2(["an","ae","ye","ler","ee","ur","ard","ice","ney","ald","ess","el"],R2),
+ findbest(R2,N2),
+ append([N1],[N2],N3).
+
+makenames(0,Ns,Ns) :- !.
+makenames(Num1,N1,N2) :-
+ random_member(R1,["Ack","Luc","Add","All","Brae","Skye","Whist","Dix","Wilb","Duk","Le","Ven",
+ "Syd","Don","Count","Black","Nei"]),
+ random_member(R2,["an","ae","ye","ler","ee","ur","ard","ice","ney","ald","ess","el"]),
+ append([R1],[R2],R3),
+ append(N1,[R3],N3),
+ Num2 is Num1-1,
+ makenames(Num2,N3,N2).
+
+findsentence(Sentence,Names1,Names2) :-
+ makename(N1),readv(Vs),reado(Os),
+ makenames(5,[],N2s1),append(N2s1,Names1,N2s),
+ trialy2(Vs,R1),
+ findbest(R1,R11),
+ append(Os,N2s,Os2),
+ trialy2(Os2,R2),
+ findbest(R2,R21),
+ append_list(N1,[R11,R21],Sentence),
+ (member(R21,N2s1)->
+ append(Names1,[R21],Names2);
+ Names2=Names1).
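form/1 above builds the song structure from two mind-read choices. Note that both branches of the chorus/instrumental question append the same middle sections, so only the ending choice changes the result. A Python sketch under assumed names, using random.choice where the Prolog calls trialy2/2 and find/2:

```python
import random

def form():
    """Sketch of form/1: intro and first verse, a fixed middle section
    (both branches of the chorus/instrumental choice append the same list),
    then either chorus + outro or two solos."""
    song = ["n", "v1"]                       # intro, verse 1
    _chorus_or_instr = random.choice(["c", "i1"])  # choice made but middle is the same
    song += ["i1", "v2", "c", "t2", "s"]     # appended on both branches
    song += random.choice([["c", "o"], ["s", "s"]])
    return song
```

Whatever the random choices, the first seven sections are fixed; only the two-section ending varies.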
+
+findrhyme(Phrase1,Phrase2,Names1,Names2) :-
+ reverse(Phrase1,Reverse),
+ Reverse=[Lastsyllable|_],
+ readv(Vs),reado(Os),makenames(5,[],N2s1),
+ append(N2s1,Names1,N2s),
+ append(Os,N2s,Os2),
+ trialy2(Os2,R2),
+ findbest(R2,R21),
+ trialy2(Vs,R1),
+ findbest(R1,R11),
+ findall(C,(member(C,N2s),reverse(C,B),B=[E|_],rhymes2(E,Lastsyllable)),Phrases22),
+ trialy2(Phrases22,R3),
+ findbest(R3,R31),
+ append_list(R21,[R11,R31],Phrase2),
+ (member(R21,N2s1)->
+ append(Names1,[R21],Names3);
+ Names3=Names1),
+ (member(R31,N2s1)->
+ append(Names3,[R31],Names2);
+ Names2=Names3).
+
+rhymes2(Syllable1,Syllable2) :-
+ rhymes(Lists),
+ member(List1,Lists),
+ member(Syllable1,List1),
+ member(Syllable2,List1).
+
+lyrics(Form1,Lyrics,Maxlength) :-
+ %%find("Who is the song about?",Character),
+ %%lyrics1(Form,Character,[],Lyrics).
+ catch((lyrics2(Form1,[],Lyrics,[],_Names,0,Maxlength)),_,
+ lyrics(Form1,Lyrics,Maxlength)).
+
+lyrics2([],Lyrics,Lyrics,Names,Names,Maxlength,Maxlength) :- !.
+lyrics2(Form1,Lyrics1,Lyrics2,Names1,Names2,Maxlength1,Maxlength2) :-
+ Form1=[Form2|Forms3],
+ findsentence(Sentence1,Names1,Names3),
+ length(Sentence1,Length1),
+ findsentence(Sentence2,Names3,Names4),
+ length(Sentence2,Length2),
+ findrhyme(Sentence1,Sentence3,Names4,Names5),
+ length(Sentence3,Length3),
+ findrhyme(Sentence2,Sentence4,Names5,Names6),
+ length(Sentence4,Length4),
+ append(Lyrics1,[[Form2,Sentence1,Sentence2,Sentence3,Sentence4]],Lyrics3),
+ max([Length1,Length2,Length3,Length4],Maxlength1,Maxlength3),
+ lyrics2(Forms3,Lyrics3,Lyrics2,Names6,Names2,Maxlength3,Maxlength2),!.
+
+melodyharmony(Form1,CPT,Maxlength,Melody,Harmony) :-
+ Partlength is Maxlength div 3,
+ Extra is Maxlength mod 3,
+ Total is Partlength+Extra,
+ _Parts=[Partlength,Partlength,Partlength,Total],
+ (CPT=1451->findall(A,note0(_,A),Notes);
+ findall(A,note0(_,A),Notes)),
+ %% What note should the song start on?
+ trialy2(Notes,R1), + findbest(R1,R11), + melodyharmony(Form1,CPT,_Parts2,R11,_R2,[],Melody,[],Harmony). + +melodyharmony([],_CPT,_Parts,N,N,Melody,Melody,Harmony,Harmony) :- !. +melodyharmony(Form1,CPT,Parts,N1,N2,Melody1,Melody2,Harmony1,Harmony2) :- + Form1=[Form2|Forms3], + findmelody(Form2,CPT,Parts,N1,N3,Melody1,Melody3,Harmony1,Harmony3), + (CPT=1451->harmony1451(N3,3,N4);harmonyr(N3,5,N4)), + findmelody(Form2,CPT,Parts,N4,N5,Melody3,Melody4,Harmony3,Harmony4), + melodyharmony(Forms3,CPT,Parts,N5,N2,Melody4,Melody2,Harmony4,Harmony2). + +findmelody(Form,CPT,_Parts,N1,N2,Melody1,Melody2,Harmony1,Harmony2) :- + CPT=1451, + (CPT=1451->findall(A,note0(_,A),Notes); + findall(A,note0(_,A),Notes)), + %% What note should the phrase end on? + %%repeat, + trialy2(Notes,R1), + findbest(R1,N2), + versechorussoloprogression1451(N1,N2,Progression), + trialy2(Progression,R2), + findbest(R2,Progression2), + not(Progression2=[]), + append(Melody1,[[Form,Progression2]],Melody2), + versechorussoloprogression1451(N1,N2,Progression3), + trialy2(Progression3,R3), + findbest(R3,Progression4), + harmony(Form,CPT,Progression4,Harmony1,Harmony2). +findmelody(Form,CPT,_Parts,N1,N2,Melody1,Melody2,Harmony1,Harmony2) :- + CPT=1564, + (CPT=1451->findall(A,note0(_,A),Notes); + findall(A,note0(_,A),Notes)), + %% What note should the phrase end on? + %%repeat, + trialy2(Notes,R1), + findbest(R1,N2), + versechorussoloprogression1564(N1,N2,Progression), + trialy2(Progression,R2), + findbest(R2,Progression2), + not(Progression2=[]), + append(Melody1,[[Form,Progression2]],Melody2), + versechorussoloprogression1564(N1,N2,Progression3), + trialy2(Progression3,R3), + findbest(R3,Progression4), + harmony(Form,CPT,Progression4,Harmony1,Harmony2). +findmelody(Form,CPT,_Parts,N1,N2,Melody1,Melody2,Harmony1,Harmony2) :- + CPT=1645, + (CPT=1451->findall(A,note0(_,A),Notes); + findall(A,note0(_,A),Notes)), + %% What note should the phrase end on? 
+ %%repeat, + trialy2(Notes,R1), + findbest(R1,N2), + versechorussoloprogression1645(N1,N2,Progression), + trialy2(Progression,R2), + findbest(R2,Progression2), + not(Progression2=[]), + append(Melody1,[[Form,Progression2]],Melody2), + versechorussoloprogression1645(N1,N2,Progression3), + trialy2(Progression3,R3), + findbest(R3,Progression4), + harmony(Form,CPT,Progression4,Harmony1,Harmony2). +findmelody(Form,CPT,_Parts,N1,N2,Melody1,Melody2,Harmony1,Harmony2) :- + CPT=classical, + (CPT=1451->findall(A,note0(_,A),Notes); + findall(A,note0(_,A),Notes)), + %% What note should the phrase end on? + %%repeat, + trialy2(Notes,R1), + findbest(R1,N2), + classicalcomposition(N1,N2,Progression), + trialy2(Progression,R2), + findbest(R2,Progression2), + not(Progression2=[]), + append(Melody1,[[Form,Progression2]],Melody2), + classicalcomposition(N1,N2,Progression3), + trialy2(Progression3,R3), + findbest(R3,Progression4), + harmony(Form,CPT,Progression4,Harmony1,Harmony2). +findmelody(Form,CPT,_Parts,N1,N2,Melody1,Melody2,Harmony1,Harmony2) :- + CPT=classicalpop, + (CPT=1451->findall(A,note0(_,A),Notes); + findall(A,note0(_,A),Notes)), + %% What note should the phrase end on? + %%repeat, + trialy2(Notes,R1), + findbest(R1,N2), + popclassicalcomposition(N1,N2,Progression), + trialy2(Progression,R2), + findbest(R2,Progression2), + not(Progression2=[]), + append(Melody1,[[Form,Progression2]],Melody2), + popclassicalcomposition(N1,N2,Progression3), + trialy2(Progression3,R3), + findbest(R3,Progression4), + harmony(Form,CPT,Progression4,Harmony1,Harmony2). + +harmony(Form,CPT,Progression,Harmony1,Harmony2) :- + harmony1(Form,CPT,Progression,[],Harmony3), + append(Harmony1,[Harmony3],Harmony2). + +harmony1(_Form,_CPT,[],Harmony,Harmony) :- !. 
+harmony1(Form,CPT,Progression1,Harmony1,Harmony2) :- + Progression1=[Note1|Progression2], + (CPT=1451->(harmony1451(Note1,2,Note2),harmony1451(Note1,4,Note3)); + (harmonyr(Note1,4,Note2),harmonyr(Note1,7,Note3))), + Harmony30=[Note2,Note3], + findall(B1,(member(A1,Harmony30),string_concat(B,_C,A1),string_length(B,1),atom_string(B1,B)),Harmony31), + append([Note1],Harmony31,Harmony3), + append(Harmony1,[[Form,Harmony3]],Harmony4), + harmony1(Form,CPT,Progression2,Harmony4,Harmony2),!. + +harmony1451(Note1,Value2,Note2) :- + note(Value1,Note1),Value3 is Value1+Value2,Value4 is Value3 mod 12,note(Value4,Note2). +harmonyr(Note1,Value2,Note2) :- + note(Value1,Note1),Value3 is Value1+Value2,Value4 is Value3 mod 12,note(Value4,Note2). + +instruments(Form1,MelodyInstruments,HarmonyInstruments,MelodyParts,HarmonyParts,Vocalstubinstrument) :- + instrumentnumber(MelodyInstrumentNumber), + instrumentnumber(HarmonyInstrumentNumber), + instrumentlist(MelodyInstrumentNumber,MelodyInstruments), + instrumentlist(HarmonyInstrumentNumber,HarmonyInstruments), + Vocalstubinstrument=[0,"Acoustic Grand Piano"], + parts(Form1,MelodyInstruments,MelodyParts), + parts(Form1,HarmonyInstruments,HarmonyParts). + +instrumentnumber(NumberofInstruments) :- + %% Number of melody instruments + trialy2([1,2,3],R1), + findbest(R1,NumberofInstruments). + +instrumentlist(NumberofInstruments,Instruments) :- + instrumentlist(NumberofInstruments,[],Instruments),!. + +instrumentlist(0,Instruments,Instruments) :- !. 
+instrumentlist(NumberofInstruments1,Instruments1,Instruments2) :- + Instruments=[[0,"Acoustic Grand Piano"],[1,"Bright Acoustic Piano"],[2,"Electric Grand Piano"],[3,"Honky-Tonk Piano"],[4,"Electric Piano 1"],[5,"Electric Piano 2"],[6,"Harpsichord"],[7,"Clavinet"],[8,"Celesta"],[9,"Glockenspiel"],[10,"Music Box"],[11,"Vibraphone"],[12,"Marimba"],[13,"Xylophone"],[14,"Tubular Bells"],[15,"Dulcimer"],[16,"Drawbar Organ"],[17,"Percussive Organ"],[18,"Rock Organ"],[19,"Church Organ"],[20,"Reed Organ"],[21,"Accordian"],[22,"Harmonica"],[23,"Tango Accordian"],[24,"Nylon Acoustic Guitar"],[25,"Steel Acoustic Guitar"],[26,"Jazz Electric Guitar"],[27,"Clean Electric Guitar"],[28,"Muted Electric Guitar"],[29,"Overdriven Guitar"],[30,"Distortion Guitar"],[31,"Guitar Harmonics"],[32,"Acoustic Bass"],[33,"Electric Bass (finger)"],[34,"Electric Bass (pick)"],[35,"Fretless Bass"],[36,"Slap Bass 1"],[37,"Slap Bass 2"],[38,"Synth Bass 1"],[39,"Synth Bass 2"],[40,"Violin"],[41,"Viola"],[42,"Cello"],[43,"Contrabass"],[44,"Tremolo Strings"],[45,"Pizzicato Strings"],[46,"Orchestral Harp"],[47,"Timpani"],[48,"String Ensemble 1"],[49,"String Ensemble 2"],[50,"Synth Strings 1"],[51,"Synth Strings 2"],[52,"Choir Aahs"],[53,"Voice Oohs"],[54,"Synth Choir"],[55,"Orchestra Hit"],[56,"Trumpet"],[57,"Trombone"],[58,"Tuba"],[59,"Muted Trumpet"],[60,"French Horn"],[61,"Brass Section"],[62,"Synth Brass 1"],[63,"Synth Brass 2"],[64,"Soprano Sax"],[65,"Alto Sax"],[66,"Tenor Sax"],[67,"Baritone Sax"],[68,"Oboe"],[69,"English Horn"],[70,"Bassoon"],[71,"Clarinet"],[72,"Piccolo"],[73,"Flute"],[74,"Recorder"],[75,"Pan Flute"],[76,"Blown Bottle"],[77,"Shakuhachi"],[78,"Whistle"],[79,"Ocarina"],[80,"Lead 1 (square)"],[81,"Lead 2 (sawtooth)"],[82,"Lead 3 (calliope)"],[83,"Lead 4 (chiff)"],[84,"Lead 5 (charang)"],[85,"Lead 6 (voice)"],[86,"Lead 7 (fifths)"],[87,"Lead 8 (bass + lead)"],[88,"Pad 1 (new age)"],[89,"Pad 2 (warm)"],[90,"Pad 3 (polysynth)"],[91,"Pad 4 (choir)"],[92,"Pad 5 (bowed)"],[93,"Pad 6 
(metallic)"],[94,"Pad 7 (halo)"],[95,"Pad 8 (sweep)"],[96,"FX 1 (rain)"],[97,"FX 2 (soundtrack)"],[98,"FX 3 (crystal)"],[99,"FX 4 (atmosphere)"],[100,"FX 5 (brightness)"],[101,"FX 6 (goblins)"],[102,"FX 7 (echoes)"],[103,"FX 8 (sci-fi)"],[104,"Sitar"],[105,"Banjo"],[106,"Shamisen"],[107,"Koto"],[108,"Kalimba"],[109,"Bagpipe"],[110,"Fiddle"],[111,"Shanai"],[112,"Tinkle Bell"],[113,"Agogo"],[114,"Steel Drums"],[115,"Woodblock"],[116,"Taiko Drum"],[117,"Melodic Tom"],[118,"Synth Drum"],[119,"Reverse Cymbal"],[120,"Guitar Fret Noise"],[121,"Breath Noise"],[122,"Seashore"],[123,"Bird Tweet"],[124,"Telephone Ring"],[125,"Helicopter"],[126,"Applause"],[127,"Gunshot"]], + trialy2(Instruments,R1), + findbest(R1,Instrument), + append(Instruments1,[Instrument],Instruments3), + NumberofInstruments2 is NumberofInstruments1-1, + instrumentlist(NumberofInstruments2,Instruments3,Instruments2),!. + +parts(Form,Instruments,Parts) :- + parts1(Form,Instruments,[],Parts),!. + +parts1(_Form,[],Parts,Parts) :- !. +parts1(Form,Instruments1,Parts1,Parts2) :- + Instruments1=[Instrument2|Instruments3], + parts2(Form,Instrument2,[],Part3), + append(Parts1,[Part3],Parts4), + parts1(Form,Instruments3,Parts4,Parts2),!. + +parts2([],_Instrument,Part,Part) :- !. +parts2(Form1,Instrument,Part1,Part2) :- + Form1=[Section1|Form2], + %% shorten + string_concat(B11,_C1,Section1), + string_length(B11,1),atom_string(Section,B11), + %% shorten + ((findall([Section,Instrument,Playing],(member([Section2,Instrument, + Playing],Part1),string_concat(B1,_C,Section2), + string_length(B1,1),atom_string(Section,B1)),Form3), + [[Section,Instrument,Playing]|_]=Form3)->true; + %% Is the instrument playing in the section? + (trialy2([0,1],R1), + findbest(R1,Playing))), + append(Part1,[[Section1,Instrument,Playing]],Part3), + parts2(Form2,Instrument,Part3,Part2),!. + +/** +concat_list(A,[],A):-!. +concat_list(A,List,B) :- + List=[Item|Items], + string_concat(A,Item,C), + concat_list(C,Items,B). 
+**/ + +concat_list(A,[],A) :-!. +concat_list(A,List,B) :- + List=[Item|Items], + concat_list2(A,[Item],C), + concat_list(C,Items,B). +concat_list2(A,List,C) :- + ((List=[[Item|Items]]->true;List=[Item])-> + concat_list0(A,[Item|Items],C); + fail),!. +concat_list2(A,Item,C) :- + concat_list(A,Item,C),!. + +concat_list0([],""):-!. +concat_list0(A1,B):- + A1=[A|List], + concat_list0(A,List,B),!. + +concat_list0(A,[],A):-!. +concat_list0(A,List,B) :- + List=[Item|Items], + string_concat(A,Item,C), + concat_list0(C,Items,B). + +sentencewithspaces(Sentence1,Sentence2) :- + Sentence1=[Item|Items], + string_concat(Firstletter1,Rest,Item), + string_length(Firstletter1,1), + upcase_atom(Firstletter1,Firstletter2), + concat_list(Firstletter2,[Rest,""],Item2), + sentencewithspaces(Items,Item2,Sentence3), + string_concat(Sentence3,".",Sentence2). + +sentencewithspaces([],Sentence,Sentence) :- !. +sentencewithspaces(Sentence1,Sentence2,Sentence3) :- + Sentence1=[Word|Sentence4], + concat_list("",[Sentence2," ",Word],Item2), + sentencewithspaces(Sentence4,Item2,Sentence3),!. 
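sentencewithspaces/2 above upper-cases the first letter of the first word, joins the words with single spaces and appends a full stop. A minimal Python equivalent (the function name is an assumption):

```python
def sentence_with_spaces(words):
    """Upper-case the first letter of the first word, join the words with
    spaces and append a full stop, mirroring sentencewithspaces/2."""
    first, *rest = words
    return " ".join([first[0].upper() + first[1:]] + rest) + "."
```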
+ +rendersong(Form1,Voiceparts,_Maxlength,Melody, + Harmony,MelodyInstruments, + HarmonyInstruments,MelodyParts1, + HarmonyParts,Lyrics,Vocalstubinstrument,Song1) :- + Totallength=8, + Voicetrack=1, + length(MelodyInstruments,MelodyInstrumentsLength), + length(HarmonyInstruments,HarmonyInstrumentsLength), + TracksNumber is MelodyInstrumentsLength+HarmonyInstrumentsLength+2, + Lyrics=[[_,Sentence1|_]|_],sentencewithspaces(Sentence1,Sentence2), + concat_list("format=1 tracks=", [TracksNumber, " division=384\n\nBA 1 CR 0 TR 0 CH 16 Text type 2: \"Produced by Mind Reading Music Composer by Lucian Academy\"\nBA 1 CR 0 TR 0 CH 16 Text type 3: \"", Sentence2, "\"\nBA 1 CR 0 TR 0 CH 1 Channel volume 127\nBA 1 CR 0 TR 0 CH 16 Tempo 63.00009\n"], Song2), + printheader(Voicetrack,Vocalstubinstrument,Song2,Song3), + %%writeln(renderv1(Form1,Voiceparts,_,Lyrics,Melody, + %% Totallength,Voicetrack,1,_,Song3,Song4)), %% **** + renderv1(Form1,Voiceparts,_,Lyrics,Melody, + Totallength,Voicetrack,1,_,Song3,Song4), + Track2 is Voicetrack+1, + renderm1(Form1,Melody,MelodyParts1, + Track2,_Track3,Song4,Song5), + Track4 is MelodyInstrumentsLength+2, + renderh1(Form1,Harmony,HarmonyParts, + Track4,_Track5,Song5,Song6), + length(Form1,FormLength), + TotalBars is 4*FormLength+1, + concat_list(Song6,["BA ",TotalBars," CR 0 TR 0 CH 16 End of track"],Song1), + get_time(TS),stamp_date_time(TS,date(Year,Month,Day,Hour1,Minute1,Seconda,_A,_TZ,_False),local), + concat_list("song",[Year,Month,Day,Hour1,Minute1,Seconda],File1), + concat_list(File1,[".txt"],File2), + open_s(File2,write,Stream), + write(Stream,Song1), + close(Stream), + concat_list("./asc2mid ",[File2," > ",File1,".mid"],Command), + shell1_s(Command),!. 
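rendersong/12 above assembles an asc2mid text score: a header giving the track count, text and tempo events, then one line per note, and finally an end-of-track event. A sketch of the per-note line that the renderline predicates emit via concat_list (the field layout is copied from those calls; the function name is hypothetical):

```python
def note_event(bar, beat, track, note):
    """One asc2mid note event: bar number, crotchet offset within the bar,
    track/channel, note name with a half-note-off duration marker."""
    return f"BA {bar} CR {beat} TR {track} CH {track} NT {note}- 1/2 voff=0"
```

Each vocal bar uses the eight offsets "0", "1/2", ..., "3+1/2", so a line of up to eight syllables fits in one bar.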
+ +printheader(Voicetrack,Vocalstubinstrument,Song1,Song2) :- + Vocalstubinstrument=[N1,_], + N2 is N1+1, + concat_list("BA 1 CR 0 TR ",[Voicetrack," CH ",Voicetrack," Instrument ",N2,"\nBA 1 CR 0 TR ",Voicetrack," CH ",Voicetrack," NT R 0 von=127 voff=0\n"],Song3), + string_concat(Song1,Song3,Song2). + +renderv1([],_Voiceparts,_Vocalstubinstrument,_Lyrics,_Melody,_Totallength,_Track,_Bar,_Bar2,Voice,Voice) :- !. +renderv1(Form1,Voiceparts1,Vocalstubinstrument,Lyrics,Melody,Totallength,Track,Bar1,Bar2,Voice1,Voice2) :- + Form1=[Section|Form2], + Voiceparts1=[Section|Voiceparts2], + renderv2(Section,Lyrics,Melody,Totallength,Track, + Bar1,Bar3,Voice1,Voice3), + renderv1(Form2,Voiceparts2,Vocalstubinstrument,Lyrics,Melody, + Totallength,Track,Bar3,Bar2,Voice3,Voice2),!. + +renderv1(Form1,Voiceparts1,Vocalstubinstrument,Lyrics,Melody,Totallength, +Track,Bar1,Bar2,Voice1,Voice2) :- + Form1=[Section1|Form2], + ((Voiceparts1=[Section2|_Voiceparts2], + not(Section1=Section2))->true;Voiceparts1=[]), + Bar3 is Bar1+4, + renderv1(Form2,Voiceparts1,Vocalstubinstrument,Lyrics,Melody, + Totallength,Track,Bar3,Bar2,Voice1,Voice2),!. + +renderv2(Section,Lyrics,_Melody,_Totallength,_Track,Bar,Bar,Voice,Voice) :- + not(member([Section|_],Lyrics)),!. 
+renderv2(Section1,Lyrics1,Melody1,_Totallength0,Track,Bar1,Bar2,Voice1,Voice2) :- + findall(Lyrics2,(member(Lyrics2,Lyrics1), + Lyrics2=[Section1|_]),Lyrics3), + string_concat(Section2A,_C,Section1), + string_length(Section2A,1), + atom_string(Section2,Section2A), + findall(Melody2,(member(Melody2,Melody1), + Melody2=[Section2|_]),Melody3), + Lyrics3=[[_, Lyrics1A,Lyrics2A,Lyrics3A,Lyrics4A]], + Melody3=[[_, Melody1A], [_, Melody2A]], + renderline1(Lyrics1A,Melody1A,_Totallength1,Track,Bar1,Voice1,Voice3), + Bar3 is Bar1+1, + renderline1(Lyrics2A,Melody2A,_Totallength2,Track,Bar3,Voice3,Voice4), + Bar4 is Bar3+1, + renderline1(Lyrics3A,Melody1A,_Totallength3,Track,Bar4,Voice4,Voice5), + Bar5 is Bar4+1, + renderline1(Lyrics4A,Melody2A,_Totallength4,Track,Bar5,Voice5,Voice6), + Bar6 is Bar5+1, + delete(Lyrics1,[Section1|_],Lyrics5), + renderv2(Section1,Lyrics5,Melody1,_Totallength,Track,Bar6,Bar2,Voice6,Voice2). + +renderline1(Lyrics2,Melody1,_Totallength,Track,Bar1,Voice1,Voice2) :- + %%Lyrics1=[_,Lyrics2], + length(Lyrics2,Lyrics2Length), + %%Rests is Totallength-Lyrics2Length, + BarTimes1=["0","1/2","1","1+1/2","2","2+1/2","3","3+1/2"], + generatemelody(Lyrics2Length,Melody1,Melody2), + renderline2(BarTimes1,BarTimes2,Melody2,Track,Bar1,Voice1,Voice3), + renderlinerests(BarTimes2,_BarTimes3,Track,Bar1,Voice3,Voice2). + +generatemelody(Lyrics2Length,Melody1,Melody2) :- + generatemelody1(Melody1,[],Melody3), + length(Melody3,Melody2Length), + changelength(Lyrics2Length,Melody2Length,Melody3,Melody2). + +generatemelody1([],Melody,Melody) :- !. +generatemelody1(Melody1,Melody2,Melody3) :- + Melody1=[Melody4|Melody5], + append(Melody2,[Melody4,Melody4],Melody6), + generatemelody1(Melody5,Melody6,Melody3),!. + +%%changelength(_Melody2Length,Melody,Melody) :- !. +changelength(Lyrics2Length,Melody2Length,Melody,Melody) :- + Lyrics2Length=Melody2Length,!. 
+changelength(Lyrics2Length,Melody2Length,Melody1,Melody2) :- + Lyrics2Length > Melody2Length, + Length is Lyrics2Length-Melody2Length, + repeatlastnote1(Length,Melody1,Melody2). +changelength(Lyrics2Length,Melody2Length,Melody1,Melody2) :- + Lyrics2Length < Melody2Length, + length(Melody2,Lyrics2Length), + append(Melody2,_,Melody1). + +repeatlastnote1(Length,Melody1,Melody2) :- + reverse(Melody1,Melody3), + Melody3=[Item|_], + repeatlastnote2(Length,Item,Melody1,Melody2),!. + +repeatlastnote2(0,_Item,Melody,Melody) :- !. +repeatlastnote2(Length1,Item,Melody1,Melody2) :- + append(Melody1,[Item],Melody3), + Length2 is Length1-1, + repeatlastnote2(Length2,Item,Melody3,Melody2),!. + +renderline2(BarTimes,BarTimes,[],_Track,_Bar1,Voice,Voice) :- !. +renderline2(BarTimes1,BarTimes4,Melody1,Track,Bar,Voice1,Voice2) :- + Melody1=[Melody2|Melody3], + BarTimes1=[BarTimes2|BarTimes3], + concat_list("BA ",[Bar," CR ",BarTimes2," TR ",Track," CH ",Track," NT ",Melody2,"- 1/2 voff=0\n"],Song), + string_concat(Voice1,Song,Voice3), + renderline2(BarTimes3,BarTimes4,Melody3,Track,Bar,Voice3,Voice2). + +renderlinerests(BarTimes,BarTimes,_Track,_Bar1,Voice,Voice) :- !. +renderlinerests(BarTimes1,BarTimes2,Track,Bar,Voice1,Voice2) :- + BarTimes1=[BarTimes2|BarTimes3], + concat_list("BA ",[Bar," CR ",BarTimes2," TR ",Track," CH ",Track," NT R 1/2 voff=0\n"],Song), + string_concat(Voice1,Song,Voice3), + renderlinerests(BarTimes3,BarTimes2,Track,Bar,Voice3,Voice2). + +longtoshortform(Section1,Section2) :- + string_concat(Section2A,_C,Section1), + string_length(Section2A,1), + atom_string(Section2,Section2A),!. + + +renderm1(_Form1,_Melody,[],_Track1,_Track2,Song,Song) :- !. 
+renderm1(Form1,Melody,MelodyParts1,Track1,Track2,Song1,Song2) :- + %%length(Form1,FormLength), + MelodyParts1=[MelodyParts2|MelodyParts3], + MelodyParts2=[[_A,[InstrumentNumber1,_B],_C]|_D], + InstrumentNumber2 is InstrumentNumber1,%% + 1, + printheader(Track1,[InstrumentNumber2,_],Song1,Song3), + %%renderm21(Form1,Melody,MelodyParts1,Track1,1,_,Song3,Song4), + renderm21(Form1,Melody,MelodyParts2,Track1,1,_E,Song3,Song4), + Track3 is Track1+1, + renderm1(Form1,Melody,MelodyParts3,Track3,Track2,Song4,Song2),!. + + +renderm21(_Form1,_Melody,[],_Track,Bar,Bar,Voice,Voice) :- !. +renderm21(Form1,Melody,MelodyParts1,Track,Bar1,Bar2,Voice1,Voice2) :- + MelodyParts1=[MelodyPart2|MelodyParts3], + MelodyPart2=[Section1,[_,_],1], + longtoshortform(Section1,Section2), + renderm22(Section2,Melody,Track,Bar1,Bar3,Voice1,Voice3), + renderm21(Form1,Melody,MelodyParts3,Track,Bar3,Bar2,Voice3,Voice2),!. + +renderm21(Form1,Melody,MelodyParts1,Track,Bar1,Bar2,Voice1,Voice2) :- + MelodyParts1=[MelodyPart2|MelodyParts3], + MelodyPart2=[_Section,[_,_],0], + Bar3 is Bar1+4, + renderm21(Form1,Melody,MelodyParts3,Track,Bar3,Bar2,Voice1,Voice2),!. + +renderm22(Section2,Melody1,_Track,Bar,Bar,Voice,Voice) :- + findall(Melody2,(member(Melody2,Melody1), + Melody2=[Section1|_],longtoshortform(Section1,Section2)),List2),List2=[],!. 
+renderm22(Section1,Melody1,Track,Bar1,Bar2,Voice1,Voice2) :- + /** + findall(Lyrics2,(member(Lyrics2,Lyrics1), + Lyrics2=[Section1|_]),Lyrics3), + string_concat(Section2A,_C,Section1), + string_length(Section2A,1), + atom_string(Section2,Section2A), + **/ + findall(Melody2,(member(Melody2,Melody1), + Melody2=[Section1|_]),Melody3), + %%Lyrics3=[[_, Lyrics1A,Lyrics2A,Lyrics3A,Lyrics4A]], + Melody3=[[_, Melody1A], [_, Melody2A]], + renderline1m(Melody1A,Track,Bar1,Voice1,Voice3), + Bar3 is Bar1+1, + renderline1m(Melody2A,Track,Bar3,Voice3,Voice4), + Bar4 is Bar3+1, + renderline1m(Melody1A,Track,Bar4,Voice4,Voice5), + Bar5 is Bar4+1, + renderline1m(Melody2A,Track,Bar5,Voice5,Voice6), + Bar6 is Bar5+1, + delete(Melody3,[Section1|_],Melody5), + renderm22(Section1,Melody5,Track,Bar6,Bar2,Voice6,Voice2). + +renderline1m(Melody1,Track,Bar1,Voice1,Voice2) :- + %%Lyrics1=[_,Lyrics2], + %%Rests is Totallength-Lyrics2Length, + BarTimes1=["0","1/2","1","1+1/2","2","2+1/2","3","3+1/2"], + generatemelodym(Melody1,Melody2), + renderline2(BarTimes1,BarTimes2,Melody2,Track,Bar1,Voice1,Voice3), + renderlinerests(BarTimes2,_BarTimes3,Track,Bar1,Voice3,Voice2). + +generatemelodym(Melody1,Melody2) :- + generatemelody1(Melody1,[],Melody3), + length(Melody3,Melody2Length), + Lyrics2Length=8, + changelength(Lyrics2Length,Melody2Length,Melody3,Melody2). + +renderh1(_Form1,_Harmony,[],_Track1,_Track2,Song,Song) :- !. +renderh1(Form1,Harmony,HarmonyParts1,Track1,Track2,Song1,Song2) :- + %%length(Form1,FormLength), + HarmonyParts1=[HarmonyParts2|HarmonyParts3], + HarmonyParts2=[[_A,[InstrumentNumber1,_B],_C]|_D], + InstrumentNumber2 is InstrumentNumber1 + 1, + printheader(Track1,[InstrumentNumber2,_],Song1,Song3), + %%renderh21(Form1,Harmony,HarmonyParts1,Track1,1,_,Song3,Song4), + renderh21(Form1,Harmony,HarmonyParts2,Track1,1,_E,Song3,Song4), + Track3 is Track1+1, + renderh1(Form1,Harmony,HarmonyParts3,Track3,Track2,Song4,Song2),!. 
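generatemelody1/3 doubles each note of a phrase, and changelength/4 (with its harmony twin changelengthh/4) then pads by repeating the last note or truncates, so the melody line comes out exactly as long as the lyric's syllable count (or the fixed length 8 for instrumental parts). A Python sketch under assumed names:

```python
def fit_melody(notes, target_len):
    """Double each note (generatemelody1/3), then repeat the final note or
    truncate until the phrase has target_len notes (changelength/4)."""
    doubled = [n for note in notes for n in (note, note)]
    if len(doubled) < target_len:
        doubled += [doubled[-1]] * (target_len - len(doubled))
    return doubled[:target_len]
```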
+ + +renderh21(_Form1,_Harmony,[],_Track,Bar,Bar,Voice,Voice) :- !. +renderh21(Form1,Harmony,HarmonyParts1,Track,Bar1,Bar2,Voice1,Voice2) :- + HarmonyParts1=[HarmonyPart2|HarmonyParts3], + HarmonyPart2=[Section1,[_,_],1], + longtoshortform(Section1,Section2), + renderh22(Section2,Harmony,Track,Bar1,Bar3,Voice1,Voice3), + renderh22(Section2,Harmony,Track,Bar3,Bar4,Voice3,Voice4), + renderh21(Form1,Harmony,HarmonyParts3,Track,Bar4,Bar2,Voice4,Voice2),!. + +renderh21(Form1,Harmony,HarmonyParts1,Track,Bar1,Bar2,Voice1,Voice2) :- + HarmonyParts1=[HarmonyPart2|HarmonyParts3], + HarmonyPart2=[_Section,[_,_],0], + Bar3 is Bar1+4, + renderh21(Form1,Harmony,HarmonyParts3,Track,Bar3,Bar2,Voice1,Voice2),!. + +renderh22(Section2,Harmony1,_Track,Bar,Bar,Voice,Voice) :- + + %%longtoshortform(Section21,Section2), + findall(Harmony2,(member(Harmony3,Harmony1),member(Harmony2,Harmony3), + Harmony2=[Section1|_],longtoshortform(Section1,Section2)),List2),List2=[],!. +renderh22(Section1,Harmony1A,Track,Bar1,Bar2,Voice1,Voice2) :- + Harmony1A=[Harmony1A1|Harmony1A2], + /** + findall(Lyrics2,(member(Lyrics2,Lyrics1), + Lyrics2=[Section1|_]),Lyrics3), + string_concat(Section2A,_C,Section1), + string_length(Section2A,1), + atom_string(Section2,Section2A), + **/ + + %% shorten + %%string_concat(B11,_C1,Section11), + %%string_length(B11,1),atom_string(Section1,B11), + + + + findall(Harmony2,(member(Harmony2,Harmony1A1), + Harmony2=[Section1|_] + %%,string_concat(B1,_C,Section1A), + %%string_length(B1,1),atom_string(Section1,B1) + ),Harmony3), + %% shorten + + + + (not(Harmony3=[])-> + %%Lyrics3=[[_, Lyrics1A,Lyrics2A,Lyrics3A,Lyrics4A]], + %%Harmony3=[[_, Harmony1A], [_, Harmony2A]], + (renderlines1h(Harmony3,Track,Bar1,Voice1,Voice3), + Bar3 is Bar1+1);(Bar3 is Bar1,Voice3=Voice1)), + /** + Bar3 is Bar1+1, + renderline1h(Harmony2A,Track,Bar3,Voice3,Voice4), + Bar4 is Bar3+1, + renderline1h(Harmony1A,Track,Bar4,Voice4,Voice5), + Bar5 is Bar4+1, + 
renderline1h(Harmony2A,Track,Bar5,Voice5,Voice6), + Bar6 is Bar5+1, + **/ + %%delete(Harmony3,[Section1|_],Harmony5), + renderh22(Section1,Harmony1A2,Track,Bar3,Bar2,Voice3,Voice2). + +renderlines1h([],_Track,_Bar,Voice,Voice) :- !. +renderlines1h(Harmony1,Track,Bar1,Voice1,Voice2) :- + %%Lyrics1=[_,Lyrics2], + %%Rests is Totallength-Lyrics2Length, + BarTimes1=["0","1/2","1","1+1/2","2","2+1/2","3","3+1/2"], + generatemelodyh(Harmony1,Harmony2), +%%writeln(generatemelodyh(Harmony1,Harmony2)), + renderlineh2(BarTimes1,BarTimes2,Harmony2,Track,Bar1,Voice1,Voice3), + renderlinerests(BarTimes2,_BarTimes3,Track,Bar1,Voice3,Voice2). + +generatemelodyh(Harmony1,Harmony2) :- + generatemelodyh1(Harmony1,[],Harmony3), + length(Harmony3,Harmony2Length), + Lyrics2Length=8, + changelengthh(Lyrics2Length,Harmony2Length, + Harmony3,Harmony2). + +generatemelodyh1([],Melody,Melody) :- !. +generatemelodyh1(Melody1,Melody2,Melody3) :- + Melody1=[Melody4|Melody5], + append(Melody2,[Melody4,Melody4],Melody6), + generatemelodyh1(Melody5,Melody6,Melody3),!. + +%%changelengthh(_Melody2Length,Melody,Melody) :- !. +changelengthh(Lyrics2Length,Melody2Length,Melody,Melody) :- + Lyrics2Length=Melody2Length,!. +changelengthh(Lyrics2Length,Melody2Length,Melody1,Melody2) :- + Lyrics2Length > Melody2Length, + Length is Lyrics2Length-Melody2Length, + repeatlastnoteh1(Length,Melody1,Melody2). +changelengthh(Lyrics2Length,Melody2Length,Melody1,Melody2) :- + Lyrics2Length < Melody2Length, + length(Melody2,Lyrics2Length), + append(Melody2,_,Melody1). + +repeatlastnoteh1(Length,Melody1,Melody2) :- + reverse(Melody1,Melody3), + Melody3=[Item|_], + repeatlastnoteh2(Length,Item,Melody1,Melody2),!. + +repeatlastnoteh2(0,_Item,Melody,Melody) :- !. +repeatlastnoteh2(Length1,Item,Melody1,Melody2) :- + append(Melody1,[Item],Melody3), + Length2 is Length1-1, + repeatlastnoteh2(Length2,Item,Melody3,Melody2),!. + +renderlineh2(BarTimes,BarTimes,[],_Track,_Bar1,Voice,Voice) :- !. 
+renderlineh2(BarTimes1,BarTimes4,Melody1,Track,Bar,Voice1,Voice2) :-
+ Melody1=[Melody2|Melody3],
+ Melody2=[_,[Melody21,Melody22,Melody23]],
+ BarTimes1=[BarTimes2|BarTimes3],
+ concat_list("BA ",[Bar," CR ",BarTimes2," TR ",Track," CH ",Track," NT ",Melody21,"- 1/2 voff=0\n"],Song1),
+ concat_list("BA ",[Bar," CR ",BarTimes2," TR ",Track," CH ",Track," NT ",Melody22,"- 1/2 voff=0\n"],Song2),
+ concat_list("BA ",[Bar," CR ",BarTimes2," TR ",Track," CH ",Track," NT ",Melody23,"- 1/2 voff=0\n"],Song3),
+ concat_list(Voice1,[Song1,Song2,Song3],Voice3),
+ renderlineh2(BarTimes3,BarTimes4,Melody3,Track,Bar,Voice3,Voice2).
+
+outputlyrics([]) :- !.
+outputlyrics(Lyrics1) :-
+ Lyrics1=[Lyrics2|Lyrics3],
+ Lyrics2=[_|Lyrics4],
+ outputlyrics2(Lyrics4),
+ writeln(""),
+ outputlyrics(Lyrics3).
+
+outputlyrics2([]) :- !.
+outputlyrics2(Lyrics1) :-
+ Lyrics1=[Lyrics2|Lyrics3],
+ sentencewithspaces(Lyrics2,Lyrics4),
+ writeln(Lyrics4),
+ outputlyrics2(Lyrics3).
+
+max([],M,M) :- !.
+max(List1,Max1,Max2) :-
+ List1=[Item|List2],
+ Item>=Max1,
+ max(List2,Item,Max2),!.
+max(List1,Max1,Max2) :-
+ List1=[Item|List2],
+ Item<Max1,
+ max(List2,Max1,Max2),!.
+
+%% findbest/2: returns the label of the highest-count [Count,Label] pair
+findbest(R,Item1) :-
+ sort(R,R1),
+ reverse(R1,R2),
+ R2=[[_,Item1]|_Rest].
+
+find("Should a chorus or instrumental come after the first verse?",CorI) :-
+ trialy2([c,i1],R),
+ findbest(R,CorI).
+
+find(["Should a chorus and outro or two solos come after the first solo?",1],COorSS) :-
+ trialy2([[c,o],[s,s]],R),
+ findbest(R,COorSS).
+
+find(["Should a chorus and outro or two solos come after the first solo?",2],COorSS) :-
+ trialy2([[c,o],[s,s]],R),
+ findbest(R,COorSS).
+
+%%find("Who is the song about?",Character) :-
+%% trialy2(["Ma","til","da"],["Jo","se","phine"],["Hap","py"],["Ha","rold"],R),
+%% findbest(R,Character).
+
+find("Should the chord progression type be 1451, 1564, 1645, Classical or Classical Pop?",CPT) :-
+ trialy2([1451, 1564, 1645, classical, classicalpop],R),
+ findbest(R,CPT).
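harmony1451/3 and harmonyr/3 above transpose a note by a fixed number of semitones modulo one octave via the note/2 table. A sketch of that arithmetic with a hypothetical note table (the real note spellings come from the included library files, so they may differ):

```python
# Hypothetical semitone table standing in for the note/2 facts.
NOTES = ["C", "Db", "D", "Eb", "E", "F", "Gb", "G", "Ab", "A", "Bb", "B"]

def harmony_offset(note, semitones):
    """Transpose a note name by a fixed interval modulo 12, the arithmetic
    both harmony1451/3 and harmonyr/3 perform."""
    return NOTES[(NOTES.index(note) + semitones) % 12]
```

For instance, offsetting by 7 semitones (as harmonyr/3 does with Value2=7) yields the perfect fifth, wrapping around at the octave.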
+ +/**generatelyricslistsverse(Character,Lyrics1,Lyrics2):- +%% read c, o, v + reado(Objects), + readv(Verbs), + + charobj( %% character verb object first + %% object verb object pairs + +lyrics1([],_Character,Lyrics,Lyrics) :- !. +lyrics1(Form1,Character,Lyrics1,Lyrics2) :- + Form1=[Form2|Forms2], + lyrics2(Form2,Character,Lyrics1,Lyrics3), + +lyrics2([instrumental,_],_Character,Lyrics1,Lyrics2) :- + +append_list(Lyrics1,[[],[],[],[]],Lyrics2). +lyrics2([verse,_],Character,Lyrics1,Lyrics2) :- + lyricsv1(Lyrics1,[Character],Lyrics3), + append_list(Lyrics1,[Lyrics3,*],Lyrics2), + . + +lyricsv1(Lyrics0,Lyrics1,Lyrics2) :- + readsc(SyllableCount), + readv(Verbs1), + +reado(Objects1), + **/ +%%removetoolongandnotrhyming(Lyrics0,Lyrics1,SyllableCount,Verbs1,Verbs2,Objects1,Objects2),] +%%findv +%% same for o +%% findo + %% choose words that fit syllable count + +append_list(A,[],A) :-!. +append_list(A,List,B) :- + List=[Item|Items], + append2(A,[Item],C), + append_list(C,Items,B). +append2(A,List,C) :- + (List=[Item]-> + append(A,Item,C); + fail),!. +append2(A,Item,C) :- + append_list(A,Item,C),!. + +readsc(7). +readv([["loves"],["is"],["has"],["is","in"],["moves","to"],["nur","tures"],["needs"],["makes"],["lifts"],["finds"],["forms"],["goes","to"],["writes","on"],["reads","on"],["feels"],["is"]]). + +reado([["one"],["the","oth","er"],["the","runn","er"],["the","draw","er"],["the","count","er"],["the","graph","er"],["the","test","er"],["the","breaths","on","er"],["the","writ","er"],["the","spell","er"],["the","updat","er"],["the","check","er"],["the","choos","er"],["the","ess","ence"],["the","comb","in","er"],["the","mir","ac","le"],["the","trans","lat","or"],["the","gramm","ar"]]). + +rhymes([["one","er","or","ar","an","ae","er","ler","ur","ard","ney","ald","ess","el"],["le","py","ye","ee","ice"]]). + +%%removetoolongandnotrhyming(Lyrics1,SyllableCount,[],Verbs,Verbs) :- !. 
+/**removetoolongandnotrhyming(Lyrics0,Lyrics1,SyllableCount,Verbs1,Verbs2,Objects1,Objects2) :- +%% makes list of all combos, checks, asks, asks + removetoolong1(Lyrics1,SyllableCount, + Verbs1,[],Verbs3), %% do after end + * %% not until object + %% find v x + removenotrhyming1(Lyrics0,Verbs3, + Verbs4), + + removetoolong1(Lyrics1,SyllableCount,[],Verbs,Verbs) :- !.%* + removetoolong1(Lyrics1,SyllableCount,Verbs1,Verbs2,Verbs3) :- + Verbs1=[Verb4|Verbs5], + append_list(Lyrics1,Verb4,Lyrics2), + length(Lyrics2,Length), + (Length<=SyllableCount->(append(Verbs2, [Verb4],Verbs6), removetoolong1(Lyrics1,SyllableCount,Verbs5,Verbs6,Verbs3));(removetoolong1(Lyrics1,SyllableCount,Verbs5,Verbs2,Verbs3))). + +removenotrhyming1(Lyrics,Verbs1,Verbs2) :- + rhymes(Rhymes1) + length(Lyrics,Length), + Line1 is mod(Length,4), + (Line1 is 3-1-> + (Line2 is 3-2, + removenotrhyming15(Line2,Rhymes1, + Rhymes3), + removenotrhyming2(Rhymes3, + Verbs1,[],Verbs2)); + (Line1 is 4-1-> + (removenotrhyming15(Line2,Rhymes1, + Rhymes3), + removenotrhyming2(Rhymes3, + Verbs1,[],Verbs2));Verbs1=Verbs2) + ). + +removenotrhyming15(Line2,Rhymes1,Rhymes3) :- + length(List1,Line2), + append(List1,_,Lyrics), + reverse(List1,List2), + List2=[Item1|_Items2], + reverse(Item1,Item2), + Item2=[Item3|_Items4], + member(Rhymes2,Rhymes1), + member(Item3,Rhymes2), + delete(Rhymes2,Item3,Rhymes3) + +removenotrhyming2(_Rhymes3,[],Verbs,Verbs) :- !. +removenotrhyming2(Rhymes3,Verbs1,Verbs2,Verbs3) :- + (Verbs1=[Verb4|Verbs5], + (member(Verb4,Rhymes3) + delete(Rhymes3,Verb4,Rhymes4)) + ->(append(Verbs2,[Verb4],Verbs6), + removenotrhyming2(Rhymes4,Verbs5, + Verbs6,Verbs3)); + removenotrhyming2(Rhymes3,Verbs5, + Verbs2,Verbs3)). + **/ +trialy2([],R) :- + R=[[_,['C']]]. + %%writeln([[],in,trialy2]),abort. +trialy2(List,R) :- + random_member(A,List), + R=[[_,A]]. + + /** + length(List,L), + Trials is L*3, + trialy22(List,Trials,[],R).**/ + +trialy22([],_,R,R) :- !. 
+trialy22(List,Trials,RA,RB) :- + List=[Item|Items], + trialy21(Item,Trials,R1), + append(RA,[R1],RC), + trialy22(Items,Trials,RC,RB),!. + +trialy21(Label,Trials,RA) :- + trialy3(Trials,[],R), + aggregate_all(count, member(true,R), Count), + RA=[Count,Label]. + +trialy3(0,R,R) :-!. +trialy3(Trials1,RA,RB) :- + trialy1(R1), + append(RA,[R1],RC), + Trials2 is Trials1-1, + trialy3(Trials2,RC,RB),!. + +%% try other nouns +trialy1(R1) :- + %%control11(A1), + trial0(A22), %% Control + sum(A22,0,S22), + mean(S22,A1), + trial0(A21), %% Test 1 + sum(A21,0,S02), + mean(S02,A2), + (A1>A2->R1=true;R1=fail). + +trial0(S3) :- N is 10, trial1(N,[],S),trial01(S,S3). +trial01(S1,S3) :- + sort(S1,S), + %%midpoint(S,MP), + halves(S,H1,H2), + midpoint(H1,Q1), + midpoint(H2,Q3), + IQR is Q3-Q1, + sum(S,0,S02), + mean(S02,Mean), + furthestfrommean(S,Mean,V), + D1 is 1.5*IQR, + D2 is V-Mean, + (D2>D1->(delete(S,V,S2),trial01(S2,S3));S=S3). + +trial1(0,A,A) :- !. +trial1(N,A,B) :- mindreadtest(S), append(A,[S],A2), + N1 is N-1,trial1(N1,A2,B). + +midpoint(S,MP) :- + length(S,L), + A is mod(L,2), + (A is 0-> + (M1 is L/2, M2 is M1+1,N1 is M1-1,N2 is M2-1,length(N11,N1),length(N21,N2),append(N11,[N12|_Rest1],S),append(N21,[N22|_Rest2],S),MP is (N12+N22)/2) + ; + (L2 is L+1, M1 is L2/2, N1 is M1-1,length(N11,N1),append(N11,[MP|_Rest],S))). + +halves(S,H1,H2) :- + length(S,L), + A is mod(L,2), + (A is 0-> + (M1 is L/2,length(H1,M1),append(H1,H2,S)) + ; + (L2 is L-1,M1 is L2/2,length(H1,M1),append(H1,[_|H2],S))). + +sum([],S,S):-!. +sum(S0,S1,S2) :- + S0=[S3|S4], + S5 is S1+S3, + sum(S4,S5,S2). + +mean(Sum,Mean) :- + Mean is Sum/2. + +furthestfrommean(S,Mean,V) :- + absdiffmean(S,Mean,[],D), + sort(D,D1), + reverse(D1,[[_,V]|_Rest]). + +absdiffmean([],_M,D,D) :- !. +absdiffmean(S,M,D1,D2) :- + S=[S1|S2], + S3 is abs(S1-M), + append(D1,[[S3,S1]],D3), + absdiffmean(S2,M,D3,D2). 
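+%% Note on trial01/2 above: it is an outlier-trimming pass. halves/3 and
+%% midpoint/2 give the quartiles, and the sample furthest from the centre is
+%% repeatedly deleted while its signed distance exceeds 1.5*IQR. Note that
+%% mean/2 divides the sum by a fixed 2 rather than by the sample count, so
+%% the "centre" here is half the sum. A Python sketch mirroring exactly this
+%% behaviour (function names are illustrative, not from the source):

```python
def midpoint(s):
    # midpoint/2: median of an already sorted list
    n = len(s)
    return (s[n // 2 - 1] + s[n // 2]) / 2 if n % 2 == 0 else s[n // 2]

def trim_outliers(samples):
    # trial01/2: drop the value furthest from the centre while its signed
    # distance exceeds 1.5 * IQR; "centre" mirrors mean/2 (sum / 2, not a
    # true arithmetic mean)
    s = sorted(samples)
    while True:
        n = len(s)
        if n % 2 == 0:                       # halves/3, even case
            h1, h2 = s[:n // 2], s[n // 2:]
        else:                                # halves/3, odd case: skip middle
            h1, h2 = s[:(n - 1) // 2], s[(n - 1) // 2 + 1:]
        iqr = midpoint(h2) - midpoint(h1)
        centre = sum(s) / 2
        v = max(s, key=lambda x: abs(x - centre))
        if v - centre > 1.5 * iqr:
            s.remove(v)
        else:
            return s
```

With a true arithmetic mean this would be the standard Tukey fence; as written, the fixed divisor makes the centre drift with sample size, though the A1>A2 comparison in trialy1/1 is unaffected because both sides use the same divisor.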
+ +mindreadtest(Sec) :- + %% 250 br for characters to be br out with 10 br each from person to me - do when initial 250 br test done and doing 10 br test + %%comment(fiftyastest), + %%random(X),X1 is 10*X, X2 is floor(X1), (X2=<2 -> ( + %%texttobr,writeln(['true test']), %%); %% use breasonings breasoned out by computer for not by me, for job medicine for "me", at last time point + %%true), %% leave last time point blank + %%**texttobr2(640);true),%% make an A to detect reaction to gracious giving or blame of in following + get_time(TimeStamp1), + %%phrase_from_file(string(_String), 'file.txt'), + %%**texttobr2(2), %% 100 As for answer (must be br before this on same day) + %% is gracious giving or blame + get_time(TimeStamp2), + %%comment(turnoffas), + Sec is TimeStamp2 - TimeStamp1. + +/**string(String) --> list(String). + +list([]) --> []. +list([L|Ls]) --> [L], list(Ls). + +comment(fiftyastest). +comment(turnoffas). +**/ + +remove_dups([],List,List) :- !. +remove_dups(List1,List2,List3) :- + List1=[Item|List4], + delete(List4,Item,List5), + append(List2,[Item],List6), + remove_dups(List5,List6,List3),!. + +shell1(Command) :- + (bash_command(Command,_)-> + true; + (writeln(["Failed shell1 command: ",Command]),abort) + ),!. + +bash_command(Command, Output) :- + setup_call_cleanup(process_create(path(bash), + ['-c', Command], + [stdout(pipe(Out))]), + read_string(Out, _, Output), + close(Out)). 
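+%% Note on bash_command/2 above: it shells out via SWI-Prolog's
+%% process_create/3 and reads the child's stdout, and shell1/1 wraps it,
+%% aborting with a diagnostic on failure. The same pattern in Python's
+%% subprocess module (a sketch for comparison, not part of the source):

```python
import subprocess

def bash_command(command):
    # bash_command/2: run `command` under bash and return captured stdout,
    # as process_create(path(bash), ['-c', Command], [stdout(pipe(Out))])
    result = subprocess.run(["bash", "-c", command],
                            capture_output=True, text=True, check=True)
    return result.stdout

def shell1(command):
    # shell1/1: abort with a diagnostic if the command fails
    try:
        return bash_command(command)
    except subprocess.CalledProcessError as e:
        raise SystemExit(["Failed shell1 command: ", command]) from e
```

As in the Prolog, stderr is left attached to the parent and only the exit status distinguishes success from failure.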
+ diff --git a/Music-Composer/mus_sectest1.pl b/Music-Composer/mus_sectest1.pl new file mode 100644 index 0000000000000000000000000000000000000000..1169778049aa14c9cd030cd37c55498823c22508 --- /dev/null +++ b/Music-Composer/mus_sectest1.pl @@ -0,0 +1,34 @@ + +%% do 100 business songs at start of day + +%% automatically do a sale every hour until 11 PM x for n hours + +%% get txt,mid filenames +%% get br count of txt +%% br multiple + +%% move them into another folder (delete or save on la com next day - compress and email) + +%% later: react to periods + +sectest01(N) :- + write("*** Please indicate "), + write(N), + writeln(" number 1 songs. ***"), + sleep(5), + sectest11(N). + +sectest11(0) :- !. +sectest11(N1) :- + (catch( + sectest0(_Form1,_Lyrics,_Melody,_Harmony,_MelodyParts, + _HarmonyParts,_Vocalstubinstrument,_Song1), + %(random(X),X1 is ceiling(2*X),X1 =1), + _, + false)-> + (writeln([song,N1,done]), + N2 is N1-1, + sectest11(N2) + ); + sectest11(N1) +). diff --git a/Music-Composer/musiclibrary.pl b/Music-Composer/musiclibrary.pl new file mode 100644 index 0000000000000000000000000000000000000000..6b2596d9210d7415de1a744dfa1af7cb093e8652 --- /dev/null +++ b/Music-Composer/musiclibrary.pl @@ -0,0 +1,51 @@ +notestonames(Progression1, Progression2) :- +clean(Progression1, Progression2). +flag1(Items, Flag, Progression) :- +flag2(Items, Flag, [], Progression), !. +flag1([], b, []) :- !. +flag2([], a, Progression, Progression) :- !. +flag2([[a,Progression1]|Items], _, Progression2, Progression3) :- +append(Progression2, [Progression1], Progression4), +flag2(Items, _, Progression4, Progression3). +flag2([[b,_]|Items], _, Progression2, Progression3) :- +flag2(Items, _, Progression2, Progression3). +clean(Items1, Items2) :- +clean1(Items1,[],Items2). +clean1([],I,I). +clean1(Items,I1,F1) :- +allnumbers(Items,[],Names),append(I1,[Names],F1),!. +clean1([Item|Items],I1,F1) :- +clean1(Item, I1, F2), +clean1(Items, F2, F1). +allnumbers([],Names,Names). 
+allnumbers([Item|Items],Names1,Names2):- +number(Item),note(Item,Name),append(Names1,[Name],Names3),allnumbers(Items,Names3,Names2). + +note0(0,'C'). +note0(1,'D'). +note0(2,'E'). +note0(3,'F'). +note0(4,'G'). +note0(5,'A'). +note0(6,'B'). + +note(0,'C'). +note(1,'C#'). +note(2,'D'). +note(3,'D#'). +note(4,'E'). +note(5,'F'). +note(6,'F#'). +note(7,'G'). +note(8,'G#'). +note(9,'A'). +note(10,'A#'). +note(11,'B'). + +check_asc2mid :- + directory_files("./",F),delete_invisibles_etc(F,G),%writeln1(G), + (member("asc2mid",G)->true;(writeln1("Error: asc2mid not found."),abort)). + +delete_invisibles_etc(F,G) :- + findall(J,(member(H,F),atom_string(H,J),not(J="."),not(J=".."),not(string_concat(".",_,J))),G). + diff --git a/Music-Composer/popclassical-4.pl b/Music-Composer/popclassical-4.pl new file mode 100644 index 0000000000000000000000000000000000000000..633f20c335e64abf58c9d7556aa80de42c5679d0 --- /dev/null +++ b/Music-Composer/popclassical-4.pl @@ -0,0 +1,108 @@ +/** +Classical Pop (with minor sixth) Composition +Based on 2 3 5 6 7 8 10 11 intervals +Counter=4 +?- popclassicalcomposition('C','D'). +?- popclassicalcomposition('C','C#'). +?- popclassicalcomposition('C','E'). +?- popclassicalcomposition('C','D#'). +**/ +popclassicalcomposition(Name1, Name2,Progression2) :- +note(Note1, Name1), +note(Note2, Name2), +popclassicalcomposition1(Note1, Note2, 0, _, [Note1], Progression1), notestonames(Progression1,Progression2),!. +popclassicalcomposition1(_, _, Counter, b, Progression, Progression) :- +Counter = 4, !. +popclassicalcomposition1(Note, Note, _Counter, a, Progression, Progression) :- !. 
+popclassicalcomposition1(Note1, Note2, Counter1, Flag1, Progression1, Progression2) :- +Counter2 is Counter1 + 1, +steppopclassical1(Note1, Note3), +append(Progression1, [Note3], Progression3), +popclassicalcomposition2(Note3, Note2, Note1, Counter2, Flag2, Progression3, Progression4), +steppopclassical2(Note1, Note4), +append(Progression1, [Note4], Progression5), +popclassicalcomposition2(Note4, Note2, Note1, Counter2, Flag3, Progression5, Progression6), +steppopclassical3(Note1, Note5), +append(Progression1, [Note5], Progression7), +popclassicalcomposition2(Note5, Note2, Note1, Counter2, Flag4, Progression7, Progression8), + +steppopclassical4(Note1, Note7), +append(Progression1, [Note7], Progression9), +popclassicalcomposition2(Note7, Note2, Note3, Counter2, Flag5, Progression9, Progression10), + +steppopclassical5(Note1, Note8), +append(Progression1, [Note8], Progression11), +popclassicalcomposition2(Note8, Note2, Note3, Counter2, Flag6, Progression11, Progression12), + +steppopclassical52(Note1, Note82), +append(Progression1, [Note82], Progression112), +popclassicalcomposition2(Note82, Note2, Note3, Counter2, Flag62, Progression112, Progression122), + +steppopclassical6(Note1, Note9), +append(Progression1, [Note9], Progression13), +popclassicalcomposition2(Note9, Note2, Note3, Counter2, Flag7, Progression13, Progression14), + +steppopclassical7(Note1, Note10), +append(Progression1, [Note10], Progression15), +popclassicalcomposition2(Note10, Note2, Note3, Counter2, Flag8, Progression15, Progression16), + +flag1([[Flag2, Progression4], [Flag3, Progression6], [Flag4, Progression8], + +[Flag5, Progression10], [Flag6, Progression12], [Flag62, Progression122], [Flag7, Progression14], [Flag8, Progression16]], Flag1, Progression2). + +popclassicalcomposition2(_, _, _, Counter, b, Progression, Progression) :- +Counter = 4, !. +popclassicalcomposition2(Note, _, Note, _, _, _, _) :- !. +popclassicalcomposition2(Note, Note, _, _Counter, a, Progression, Progression) :- !. 
+popclassicalcomposition2(Note1, Note2, Note3, Counter1, Flag1, Progression1, Progression2) :- +Counter2 is Counter1 + 1, +steppopclassical1(Note1, Note4), +append(Progression1, [Note4], Progression3), +popclassicalcomposition2(Note4, Note2, Note3, Counter2, Flag2, Progression3, Progression4), +steppopclassical2(Note1, Note5), +append(Progression1, [Note5], Progression5), +popclassicalcomposition2(Note5, Note2, Note3, Counter2, Flag3, Progression5, Progression6), +steppopclassical3(Note1, Note6), +append(Progression1, [Note6], Progression7), +popclassicalcomposition2(Note6, Note2, Note3, Counter2, Flag4, Progression7, Progression8), + +steppopclassical4(Note1, Note7), +append(Progression1, [Note7], Progression9), +popclassicalcomposition2(Note7, Note2, Note3, Counter2, Flag5, Progression9, Progression10), + +steppopclassical5(Note1, Note8), +append(Progression1, [Note8], Progression11), +popclassicalcomposition2(Note8, Note2, Note3, Counter2, Flag6, Progression11, Progression12), + +steppopclassical52(Note1, Note82), +append(Progression1, [Note82], Progression112), +popclassicalcomposition2(Note82, Note2, Note3, Counter2, Flag62, Progression112, Progression122), + +steppopclassical6(Note1, Note9), +append(Progression1, [Note9], Progression13), +popclassicalcomposition2(Note9, Note2, Note3, Counter2, Flag7, Progression13, Progression14), + +steppopclassical7(Note1, Note10), +append(Progression1, [Note10], Progression15), +popclassicalcomposition2(Note10, Note2, Note3, Counter2, Flag8, Progression15, Progression16), + +flag1([[Flag2, Progression4], [Flag3, Progression6], [Flag4, Progression8], + +[Flag5, Progression10], [Flag6, Progression12], [Flag62, Progression122], [Flag7, Progression14], [Flag8, Progression16]], Flag1, Progression2). + +steppopclassical1(Note1, Note2) :- +Value is Note1+2, Note2 is Value mod 12. +steppopclassical2(Note1, Note2) :- +Value is Note1+3, Note2 is Value mod 12. 
+steppopclassical3(Note1, Note2) :- +Value is Note1+5, Note2 is Value mod 12. +steppopclassical4(Note1, Note2) :- +Value is Note1+6, Note2 is Value mod 12. +steppopclassical5(Note1, Note2) :- +Value is Note1+7, Note2 is Value mod 12. +steppopclassical52(Note1, Note2) :- +Value is Note1+8, Note2 is Value mod 12. +steppopclassical6(Note1, Note2) :- +Value is Note1+10, Note2 is Value mod 12. +steppopclassical7(Note1, Note2) :- +Value is Note1+11, Note2 is Value mod 12. diff --git a/Music-Composer/popclassical.pl b/Music-Composer/popclassical.pl new file mode 100644 index 0000000000000000000000000000000000000000..7ee53242b690404df591583ea3ff48edc8beb727 --- /dev/null +++ b/Music-Composer/popclassical.pl @@ -0,0 +1,108 @@ +/** +Classical Pop (with minor sixth) Composition +Based on 2 3 5 6 7 8 10 11 intervals +Counter=4 +?- popclassicalcomposition('C','D'). +?- popclassicalcomposition('C','C#'). +?- popclassicalcomposition('C','E'). +?- popclassicalcomposition('C','D#'). +**/ +popclassicalcomposition(Name1, Name2,Progression2) :- +note(Note1, Name1), +note(Note2, Name2), +popclassicalcomposition1(Note1, Note2, 0, _, [Note1], Progression1), notestonames(Progression1,Progression2),!. +popclassicalcomposition1(_, _, Counter, b, Progression, Progression) :- +Counter = 3, !. +popclassicalcomposition1(Note, Note, _Counter, a, Progression, Progression) :- !. 
+popclassicalcomposition1(Note1, Note2, Counter1, Flag1, Progression1, Progression2) :- +Counter2 is Counter1 + 1, +steppopclassical1(Note1, Note3), +append(Progression1, [Note3], Progression3), +popclassicalcomposition2(Note3, Note2, Note1, Counter2, Flag2, Progression3, Progression4), +steppopclassical2(Note1, Note4), +append(Progression1, [Note4], Progression5), +popclassicalcomposition2(Note4, Note2, Note1, Counter2, Flag3, Progression5, Progression6), +steppopclassical3(Note1, Note5), +append(Progression1, [Note5], Progression7), +popclassicalcomposition2(Note5, Note2, Note1, Counter2, Flag4, Progression7, Progression8), + +steppopclassical4(Note1, Note7), +append(Progression1, [Note7], Progression9), +popclassicalcomposition2(Note7, Note2, Note3, Counter2, Flag5, Progression9, Progression10), + +steppopclassical5(Note1, Note8), +append(Progression1, [Note8], Progression11), +popclassicalcomposition2(Note8, Note2, Note3, Counter2, Flag6, Progression11, Progression12), + +steppopclassical52(Note1, Note82), +append(Progression1, [Note82], Progression112), +popclassicalcomposition2(Note82, Note2, Note3, Counter2, Flag62, Progression112, Progression122), + +steppopclassical6(Note1, Note9), +append(Progression1, [Note9], Progression13), +popclassicalcomposition2(Note9, Note2, Note3, Counter2, Flag7, Progression13, Progression14), + +steppopclassical7(Note1, Note10), +append(Progression1, [Note10], Progression15), +popclassicalcomposition2(Note10, Note2, Note3, Counter2, Flag8, Progression15, Progression16), + +flag1([[Flag2, Progression4], [Flag3, Progression6], [Flag4, Progression8], + +[Flag5, Progression10], [Flag6, Progression12], [Flag62, Progression122], [Flag7, Progression14], [Flag8, Progression16]], Flag1, Progression2). + +popclassicalcomposition2(_, _, _, Counter, b, Progression, Progression) :- +Counter = 3, !. +popclassicalcomposition2(Note, _, Note, _, _, _, _) :- !. +popclassicalcomposition2(Note, Note, _, _Counter, a, Progression, Progression) :- !. 
+popclassicalcomposition2(Note1, Note2, Note3, Counter1, Flag1, Progression1, Progression2) :- +Counter2 is Counter1 + 1, +steppopclassical1(Note1, Note4), +append(Progression1, [Note4], Progression3), +popclassicalcomposition2(Note4, Note2, Note3, Counter2, Flag2, Progression3, Progression4), +steppopclassical2(Note1, Note5), +append(Progression1, [Note5], Progression5), +popclassicalcomposition2(Note5, Note2, Note3, Counter2, Flag3, Progression5, Progression6), +steppopclassical3(Note1, Note6), +append(Progression1, [Note6], Progression7), +popclassicalcomposition2(Note6, Note2, Note3, Counter2, Flag4, Progression7, Progression8), + +steppopclassical4(Note1, Note7), +append(Progression1, [Note7], Progression9), +popclassicalcomposition2(Note7, Note2, Note3, Counter2, Flag5, Progression9, Progression10), + +steppopclassical5(Note1, Note8), +append(Progression1, [Note8], Progression11), +popclassicalcomposition2(Note8, Note2, Note3, Counter2, Flag6, Progression11, Progression12), + +steppopclassical52(Note1, Note82), +append(Progression1, [Note82], Progression112), +popclassicalcomposition2(Note82, Note2, Note3, Counter2, Flag62, Progression112, Progression122), + +steppopclassical6(Note1, Note9), +append(Progression1, [Note9], Progression13), +popclassicalcomposition2(Note9, Note2, Note3, Counter2, Flag7, Progression13, Progression14), + +steppopclassical7(Note1, Note10), +append(Progression1, [Note10], Progression15), +popclassicalcomposition2(Note10, Note2, Note3, Counter2, Flag8, Progression15, Progression16), + +flag1([[Flag2, Progression4], [Flag3, Progression6], [Flag4, Progression8], + +[Flag5, Progression10], [Flag6, Progression12], [Flag62, Progression122], [Flag7, Progression14], [Flag8, Progression16]], Flag1, Progression2). + +steppopclassical1(Note1, Note2) :- +Value is Note1+2, Note2 is Value mod 12. +steppopclassical2(Note1, Note2) :- +Value is Note1+3, Note2 is Value mod 12. 
+steppopclassical3(Note1, Note2) :- +Value is Note1+5, Note2 is Value mod 12. +steppopclassical4(Note1, Note2) :- +Value is Note1+6, Note2 is Value mod 12. +steppopclassical5(Note1, Note2) :- +Value is Note1+7, Note2 is Value mod 12. +steppopclassical52(Note1, Note2) :- +Value is Note1+8, Note2 is Value mod 12. +steppopclassical6(Note1, Note2) :- +Value is Note1+10, Note2 is Value mod 12. +steppopclassical7(Note1, Note2) :- +Value is Note1+11, Note2 is Value mod 12. diff --git a/Music-Composer/stems/.DS_Store b/Music-Composer/stems/.DS_Store new file mode 100644 index 0000000000000000000000000000000000000000..5008ddfcf53c02e82d7eee2e57c38e5672ef89f6 Binary files /dev/null and b/Music-Composer/stems/.DS_Store differ diff --git a/Music-Composer/texttobr2qb.pl b/Music-Composer/texttobr2qb.pl new file mode 100644 index 0000000000000000000000000000000000000000..5702c5bb72e066272fe1857cb538b1b6d47e02d8 --- /dev/null +++ b/Music-Composer/texttobr2qb.pl @@ -0,0 +1,10 @@ +%% 250 breasonings + +texttobr2 :- 
+br1([[a,1,1.5,0],[about,1,1,0],[ache,1,0,0],[after,1,1,0],[all,1,1.5,0],[an,1,1.5,0],[and,1,1,0],[appearances,5,0,5],[apple,5,5,5],[are,1,1,0],[area,5,5,0],[argument,1,1.5,0],[as,1,1,0],[asanas,180,50,1],[at,1,1,0],[ate,15,2,1],[avoid,1,1,0],[b,1,1,1],[bacteria,2,1,1],[based,5,5,1],[be,50,30,180],[before,1,1,0],[being,50,30,180],[binding,1,1,0],[blocking,1,1,1],[bolt,2,1,1],[box,1,1,1],[breakfast,10,10,1],[breasoned,5,5,5],[breasoning,5,5,5],[breasonings,5,5,5],[by,1,1,0],[chapter,10,15,1],[clear,1,1,0],[colds,1,1,1],[comfort,180,50,30],[comfortable,50,50,100],[complexes,1,1,1],[conception,50,14,8],[connected,1,1,0],[correctness,1,1,0],[day,10,10,10],[depression,1,1,0],[details,1,1,1],[did,1,1,0],[do,50,30,180],[dotted,1,1,0],[down,1,1,0],[drinking,5,5,5],[during,1,1.5,0],[each,1,1,0],[earned,1,1.5,0],[education,50,30,180],[effects,1,1.5,0],[elderberries,1,1,1],[ensure,1,1,0],[excess,1,1,1],[exercises,10,10,1],[f,5,5,5],[feeling,1,1,0],[felt,1,1,0],[first,1,1.5,0],[flu,1,1,0],[food,2,1,1],[for,1,1,0],[from,1,1,0],[fruits,1,1,1],[function,1,1.5,0],[gaffer,10,10,1],[gave,1,1,1],[glasses,15,11,3],[god,50,30,180],[grade,1,1.5,0],[grains,1,1,1],[had,1,1,0],[hallucinogenic,1,1,0],[happiness,1,1,0],[happy,1,1,0],[have,5,5,5],[head,15,20,23],[headache,1,1,0],[headaches,1,1,0],[health,50,30,180],[healthy,50,30,180],[help,5,5,0],[helped,50,30,180],[high,1,1.5,0],[hills,5,5,2],[honey,10,10,10],[i,50,30,180],[imagery,5,5,0],[in,1,1,0],[incompatibility,1,1,0],[interpretation,20,30,0],[it,1,1,0],[keep,1,1,0],[laughter,1,1.5,0],[maintain,1,1,0],[maintaining,1,1,0],[master,50,30,180],[mind,3,4,4],[minutes,20,30,0],[mistake,1,1,0],[muscle,1,1,1],[muscles,1,1,1],[music,1,1,0],[must,1,1,0],[my,50,30,180],[namaskar,15,10,23],[neck,15,15,10],[next,1,1,0],[nietzsche,50,30,180],[no,1,1,0],[noticed,50,30,180],[nut,1,1,0],[nuts,1,1,1],[of,1,1,0],[off,1,1.5,0],[on,1,1,0],[one,1,1.5,0],[organs,1,1,1],[out,1,1,0],[over,1,1,0],[part,1,1,1],[peace,1,1.5,0],[perfect,1,1.5,0],[performed,50,30,180
],[performing,50,30,250],[philosophy,10,20,1],[physiology,50,30,180],[pimple,1,1,0],[pm,5,5,5],[poem,1,1.5,0],[positive,1,1,0],[pot,5,5,10],[prayer,5,5,0],[prepared,1,1,0],[prevent,1,1,0],[prevented,1,1,0],[preventing,1,1,0],[prevention,1,1,0],[problems,1,1.5,0],[product,1,150,1],[products,5,5,5],[psychiatry,50,30,180],[psychology,2,2,0],[quality,1,1.5,0],[quantum,1,1,0],[read,15,1,20],[relax,180,50,30],[relaxed,180,50,30],[remain,1,1.5,0],[require,1,1,0],[room,500,400,300],[s,1,1,0],[safe,1,1.5,0],[sales,1,1.5,0],[same,1,1,0],[schizophrenia,1,1,0],[second,1,1.5,0],[see,1,0,1],[seek,1,1,0],[sell,5,3,0],[sets,5,1,0],[sex,1,1.5,0],[short,1,1,0],[single,1,1.5,0],[sites,20,0,30],[sitting,50,70,150],[sized,30,1,0],[skin,1,1,0],[sleep,180,50,30],[slices,2,2,2],[so,1,1,0],[societal,50,30,180],[some,1,1,0],[spine,1,1,10],[spiritual,50,30,180],[state,1,1.5,0],[stated,20,30,0],[stating,50,30,180],[status,1,1.5,0],[stop,5,1,15],[straight,1,1,0],[strawberry,1,1,1],[structure,1,1,1],[studied,50,30,180],[study,100,50,100],[studying,50,30,180],[subject,50,30,180],[successful,1,1.5,0],[surya,1,1.5,0],[sutra,5,1.5,0],[table,7,5,0],[tape,10,10,1],[task,15,2.5,0],[technique,1,1,0],[test,20,20,0],[text,20,30,0],[that,1,1,0],[the,1,1,0],[them,50,30,180],[then,1,1,0],[they,50,30,180],[thinking,1,1.5,0],[third,1,1.5,0],[this,1,1,0],[thoughts,1,1,0],[through,1,1,0],[tied,1,1,0],[time,1,1.5,0],[to,1,1,0],[together,1,1,0],[touch,1,1,0],[train,1,1,1],[trains,50,30,180],[travel,10,3,0],[treating,20,15,3],[turn,1,1,0],[two,1,1.5,0],[university,100,75,3],[unwanted,1,1,0],[up,1,1,0],[upasana,100,75,100],[used,1,1,0],[using,1,1,0],[var,1,1,1],[vegetables,5,5,5],[videos,5,0,3],[view,5,5,5],[virality,1,1,0],[viruses,1,1,2],[vitamin,1,1,1],[walks,100,500,0],[wanted,15,20,3],[warm,1,1,0],[was,1,1,0],[water,5,5,5],[way,1,1,0],[ways,50,100,0],[well,50,30,180],[where,1,1,0],[whole,1,1,1],[with,1,1,0],[without,1,1,0],[words,5,7,1],[write,15,1,1],[writing,5,5,0],[wrote,15,1,1],[years,3,1,0]]). 
+ +br1([[a,1,1.5,0],[about,1,1,0],[ache,1,0,0],[after,1,1,0],[all,1,1.5,0],[an,1,1.5,0],[and,1,1,0],[appearances,5,0,5],[apple,5,5,5],[are,1,1,0],[area,5,5,0],[argument,1,1.5,0],[as,1,1,0],[asanas,180,50,1],[at,1,1,0],[ate,15,2,1],[avoid,1,1,0],[b,1,1,1],[bacteria,2,1,1],[based,5,5,1],[be,50,30,180],[before,1,1,0],[being,50,30,180],[binding,1,1,0],[blocking,1,1,1],[bolt,2,1,1],[box,1,1,1],[breakfast,10,10,1],[breasoned,5,5,5],[breasoning,5,5,5],[breasonings,5,5,5],[by,1,1,0],[chapter,10,15,1],[clear,1,1,0],[colds,1,1,1],[comfort,180,50,30],[comfortable,50,50,100],[complexes,1,1,1],[conception,50,14,8],[connected,1,1,0],[correctness,1,1,0],[day,10,10,10],[depression,1,1,0],[details,1,1,1],[did,1,1,0],[do,50,30,180],[dotted,1,1,0],[down,1,1,0],[drinking,5,5,5],[during,1,1.5,0],[each,1,1,0],[earned,1,1.5,0],[education,50,30,180],[effects,1,1.5,0],[elderberries,1,1,1],[ensure,1,1,0],[excess,1,1,1],[exercises,10,10,1],[f,5,5,5],[feeling,1,1,0],[felt,1,1,0],[first,1,1.5,0],[flu,1,1,0],[food,2,1,1],[for,1,1,0],[from,1,1,0],[fruits,1,1,1],[function,1,1.5,0],[gaffer,10,10,1],[gave,1,1,1],[glasses,15,11,3],[god,50,30,180],[grade,1,1.5,0],[grains,1,1,1],[had,1,1,0],[hallucinogenic,1,1,0],[happiness,1,1,0],[happy,1,1,0],[have,5,5,5],[head,15,20,23],[headache,1,1,0],[headaches,1,1,0],[health,50,30,180],[healthy,50,30,180],[help,5,5,0],[helped,50,30,180],[high,1,1.5,0],[hills,5,5,2],[honey,10,10,10],[i,50,30,180],[imagery,5,5,0],[in,1,1,0],[incompatibility,1,1,0],[interpretation,20,30,0],[it,1,1,0],[keep,1,1,0],[laughter,1,1.5,0],[maintain,1,1,0],[maintaining,1,1,0],[master,50,30,180],[mind,3,4,4],[minutes,20,30,0],[mistake,1,1,0],[muscle,1,1,1],[muscles,1,1,1],[music,1,1,0],[must,1,1,0],[my,50,30,180],[namaskar,15,10,23],[neck,15,15,10],[next,1,1,0],[nietzsche,50,30,180],[no,1,1,0],[noticed,50,30,180],[nut,1,1,0],[nuts,1,1,1],[of,1,1,0],[off,1,1.5,0],[on,1,1,0],[one,1,1.5,0],[organs,1,1,1],[out,1,1,0],[over,1,1,0],[part,1,1,1],[peace,1,1.5,0],[perfect,1,1.5,0],[performed,50,30,1
80],[performing,50,30,250],[philosophy,10,20,1],[physiology,50,30,180],[pimple,1,1,0],[pm,5,5,5],[poem,1,1.5,0],[positive,1,1,0],[pot,5,5,10],[prayer,5,5,0],[prepared,1,1,0],[prevent,1,1,0],[prevented,1,1,0],[preventing,1,1,0],[prevention,1,1,0],[problems,1,1.5,0],[product,1,150,1],[products,5,5,5],[psychiatry,50,30,180],[psychology,2,2,0],[quality,1,1.5,0],[quantum,1,1,0],[read,15,1,20],[relax,180,50,30],[relaxed,180,50,30],[remain,1,1.5,0],[require,1,1,0],[room,500,400,300],[s,1,1,0],[safe,1,1.5,0],[sales,1,1.5,0],[same,1,1,0],[schizophrenia,1,1,0],[second,1,1.5,0],[see,1,0,1],[seek,1,1,0],[sell,5,3,0],[sets,5,1,0],[sex,1,1.5,0],[short,1,1,0],[single,1,1.5,0],[sites,20,0,30],[sitting,50,70,150],[sized,30,1,0],[skin,1,1,0],[sleep,180,50,30],[slices,2,2,2],[so,1,1,0],[societal,50,30,180],[some,1,1,0],[spine,1,1,10],[spiritual,50,30,180],[state,1,1.5,0],[stated,20,30,0],[stating,50,30,180],[status,1,1.5,0],[stop,5,1,15],[straight,1,1,0],[strawberry,1,1,1],[structure,1,1,1],[studied,50,30,180],[study,100,50,100],[studying,50,30,180],[subject,50,30,180],[successful,1,1.5,0],[surya,1,1.5,0],[sutra,5,1.5,0],[table,7,5,0],[tape,10,10,1],[task,15,2.5,0],[technique,1,1,0],[test,20,20,0],[text,20,30,0],[that,1,1,0],[the,1,1,0],[them,50,30,180],[then,1,1,0],[they,50,30,180],[thinking,1,1.5,0],[third,1,1.5,0],[this,1,1,0],[thoughts,1,1,0],[through,1,1,0],[tied,1,1,0],[time,1,1.5,0],[to,1,1,0],[together,1,1,0],[touch,1,1,0],[train,1,1,1],[trains,50,30,180],[travel,10,3,0],[treating,20,15,3],[turn,1,1,0],[two,1,1.5,0],[university,100,75,3],[unwanted,1,1,0],[up,1,1,0],[upasana,100,75,100],[used,1,1,0],[using,1,1,0],[var,1,1,1],[vegetables,5,5,5],[videos,5,0,3],[view,5,5,5],[virality,1,1,0],[viruses,1,1,2],[vitamin,1,1,1],[walks,100,500,0],[wanted,15,20,3],[warm,1,1,0],[was,1,1,0],[water,5,5,5],[way,1,1,0],[ways,50,100,0],[well,50,30,180],[where,1,1,0],[whole,1,1,1],[with,1,1,0],[without,1,1,0],[words,5,7,1],[write,15,1,1],[writing,5,5,0],[wrote,15,1,1],[years,3,1,0]]). 
+ +texttobr2(0):-!. +texttobr2(N1):- + texttobr2,N2 is N1-1,texttobr2(N2). diff --git a/Music-Composer/texttobr2qbmusic.pl b/Music-Composer/texttobr2qbmusic.pl new file mode 100644 index 0000000000000000000000000000000000000000..9f9b2c33bb873621fe6fde0e3670311ccb509bb3 --- /dev/null +++ b/Music-Composer/texttobr2qbmusic.pl @@ -0,0 +1,8 @@ +texttobr2qb :- +br1([[a,1,1.5,0],[about,1,1,0],[ache,1,0,0],[after,1,1,0],[all,1,1.5,0],[an,1,1.5,0],[and,1,1,0],[appearances,5,0,5],[apple,5,5,5],[are,1,1,0],[area,5,5,0],[argument,1,1.5,0],[as,1,1,0],[asanas,180,50,1],[at,1,1,0],[ate,15,2,1],[avoid,1,1,0],[b,1,1,1],[bacteria,2,1,1],[based,5,5,1],[be,50,30,180],[before,1,1,0],[being,50,30,180],[binding,1,1,0],[blocking,1,1,1],[bolt,2,1,1],[box,1,1,1],[breakfast,10,10,1],[breasoned,5,5,5],[breasoning,5,5,5],[breasonings,5,5,5],[by,1,1,0],[chapter,10,15,1],[clear,1,1,0],[colds,1,1,1],[comfort,180,50,30],[comfortable,50,50,100],[complexes,1,1,1],[conception,50,14,8],[connected,1,1,0],[correctness,1,1,0],[day,10,10,10],[depression,1,1,0],[details,1,1,1],[did,1,1,0],[do,50,30,180],[dotted,1,1,0],[down,1,1,0],[drinking,5,5,5],[during,1,1.5,0],[each,1,1,0],[earned,1,1.5,0],[education,50,30,180],[effects,1,1.5,0],[elderberries,1,1,1],[ensure,1,1,0],[excess,1,1,1],[exercises,10,10,1],[f,5,5,5],[feeling,1,1,0],[felt,1,1,0],[first,1,1.5,0],[flu,1,1,0],[food,2,1,1],[for,1,1,0],[from,1,1,0],[fruits,1,1,1],[function,1,1.5,0],[gaffer,10,10,1],[gave,1,1,1],[glasses,15,11,3],[god,50,30,180],[grade,1,1.5,0],[grains,1,1,1],[had,1,1,0],[hallucinogenic,1,1,0],[happiness,1,1,0],[happy,1,1,0],[have,5,5,5],[head,15,20,23],[headache,1,1,0],[headaches,1,1,0],[health,50,30,180],[healthy,50,30,180],[help,5,5,0],[helped,50,30,180],[high,1,1.5,0],[hills,5,5,2],[honey,10,10,10],[i,50,30,180],[imagery,5,5,0],[in,1,1,0],[incompatibility,1,1,0],[interpretation,20,30,0],[it,1,1,0],[keep,1,1,0],[laughter,1,1.5,0],[maintain,1,1,0],[maintaining,1,1,0],[master,50,30,180],[mind,3,4,4],[minutes,20,30,0],[mistake,1,1,0],
[muscle,1,1,1],[muscles,1,1,1],[music,1,1,0],[must,1,1,0],[my,50,30,180],[namaskar,15,10,23],[neck,15,15,10],[next,1,1,0],[nietzsche,50,30,180],[no,1,1,0],[noticed,50,30,180],[nut,1,1,0],[nuts,1,1,1],[of,1,1,0],[off,1,1.5,0],[on,1,1,0],[one,1,1.5,0],[organs,1,1,1],[out,1,1,0],[over,1,1,0],[part,1,1,1],[peace,1,1.5,0],[perfect,1,1.5,0],[performed,50,30,180],[performing,50,30,250],[philosophy,10,20,1],[physiology,50,30,180],[pimple,1,1,0],[pm,5,5,5],[poem,1,1.5,0],[positive,1,1,0],[pot,5,5,10],[prayer,5,5,0],[prepared,1,1,0],[prevent,1,1,0],[prevented,1,1,0],[preventing,1,1,0],[prevention,1,1,0],[problems,1,1.5,0],[product,1,150,1],[products,5,5,5],[psychiatry,50,30,180],[psychology,2,2,0],[quality,1,1.5,0],[quantum,1,1,0],[read,15,1,20],[relax,180,50,30],[relaxed,180,50,30],[remain,1,1.5,0],[require,1,1,0],[room,500,400,300],[s,1,1,0],[safe,1,1.5,0],[sales,1,1.5,0],[same,1,1,0],[schizophrenia,1,1,0],[second,1,1.5,0],[see,1,0,1],[seek,1,1,0],[sell,5,3,0],[sets,5,1,0],[sex,1,1.5,0],[short,1,1,0],[single,1,1.5,0],[sites,20,0,30],[sitting,50,70,150],[sized,30,1,0],[skin,1,1,0],[sleep,180,50,30],[slices,2,2,2],[so,1,1,0],[societal,50,30,180],[some,1,1,0],[spine,1,1,10],[spiritual,50,30,180],[state,1,1.5,0],[stated,20,30,0],[stating,50,30,180],[status,1,1.5,0],[stop,5,1,15],[straight,1,1,0],[strawberry,1,1,1],[structure,1,1,1],[studied,50,30,180],[study,100,50,100],[studying,50,30,180],[subject,50,30,180],[successful,1,1.5,0],[surya,1,1.5,0],[sutra,5,1.5,0],[table,7,5,0],[tape,10,10,1],[task,15,2.5,0],[technique,1,1,0],[test,20,20,0],[text,20,30,0],[that,1,1,0],[the,1,1,0],[them,50,30,180],[then,1,1,0],[they,50,30,180],[thinking,1,1.5,0],[third,1,1.5,0],[this,1,1,0],[thoughts,1,1,0],[through,1,1,0],[tied,1,1,0],[time,1,1.5,0],[to,1,1,0],[together,1,1,0],[touch,1,1,0],[train,1,1,1],[trains,50,30,180],[travel,10,3,0],[treating,20,15,3],[turn,1,1,0],[two,1,1.5,0],[university,100,75,3],[unwanted,1,1,0],[up,1,1,0],[upasana,100,75,100],[used,1,1,0],[using,1,1,0],[var,1,1,1],[veg
etables,5,5,5],[videos,5,0,3],[view,5,5,5],[virality,1,1,0],[viruses,1,1,2],[vitamin,1,1,1],[walks,100,500,0],[wanted,15,20,3],[warm,1,1,0],[was,1,1,0],[water,5,5,5],[way,1,1,0],[ways,50,100,0],[well,50,30,180],[where,1,1,0],[whole,1,1,1],[with,1,1,0],[without,1,1,0],[words,5,7,1],[write,15,1,1],[writing,5,5,0],[wrote,15,1,1],[years,3,1,0]]). + +br1([[a,1,1.5,0],[about,1,1,0],[ache,1,0,0],[after,1,1,0],[all,1,1.5,0],[an,1,1.5,0],[and,1,1,0],[appearances,5,0,5],[apple,5,5,5],[are,1,1,0],[area,5,5,0],[argument,1,1.5,0],[as,1,1,0],[asanas,180,50,1],[at,1,1,0],[ate,15,2,1],[avoid,1,1,0],[b,1,1,1],[bacteria,2,1,1],[based,5,5,1],[be,50,30,180],[before,1,1,0],[being,50,30,180],[binding,1,1,0],[blocking,1,1,1],[bolt,2,1,1],[box,1,1,1],[breakfast,10,10,1],[breasoned,5,5,5],[breasoning,5,5,5],[breasonings,5,5,5],[by,1,1,0],[chapter,10,15,1],[clear,1,1,0],[colds,1,1,1],[comfort,180,50,30],[comfortable,50,50,100],[complexes,1,1,1],[conception,50,14,8],[connected,1,1,0],[correctness,1,1,0],[day,10,10,10],[depression,1,1,0],[details,1,1,1],[did,1,1,0],[do,50,30,180],[dotted,1,1,0],[down,1,1,0],[drinking,5,5,5],[during,1,1.5,0],[each,1,1,0],[earned,1,1.5,0],[education,50,30,180],[effects,1,1.5,0],[elderberries,1,1,1],[ensure,1,1,0],[excess,1,1,1],[exercises,10,10,1],[f,5,5,5],[feeling,1,1,0],[felt,1,1,0],[first,1,1.5,0],[flu,1,1,0],[food,2,1,1],[for,1,1,0],[from,1,1,0],[fruits,1,1,1],[function,1,1.5,0],[gaffer,10,10,1],[gave,1,1,1],[glasses,15,11,3],[god,50,30,180],[grade,1,1.5,0],[grains,1,1,1],[had,1,1,0],[hallucinogenic,1,1,0],[happiness,1,1,0],[happy,1,1,0],[have,5,5,5],[head,15,20,23],[headache,1,1,0],[headaches,1,1,0],[health,50,30,180],[healthy,50,30,180],[help,5,5,0],[helped,50,30,180],[high,1,1.5,0],[hills,5,5,2],[honey,10,10,10],[i,50,30,180],[imagery,5,5,0],[in,1,1,0],[incompatibility,1,1,0],[interpretation,20,30,0],[it,1,1,0],[keep,1,1,0],[laughter,1,1.5,0],[maintain,1,1,0],[maintaining,1,1,0],[master,50,30,180],[mind,3,4,4],[minutes,20,30,0],[mistake,1,1,0],[muscle,1,
1,1],[muscles,1,1,1],[music,1,1,0],[must,1,1,0],[my,50,30,180],[namaskar,15,10,23],[neck,15,15,10],[next,1,1,0],[nietzsche,50,30,180],[no,1,1,0],[noticed,50,30,180],[nut,1,1,0],[nuts,1,1,1],[of,1,1,0],[off,1,1.5,0],[on,1,1,0],[one,1,1.5,0],[organs,1,1,1],[out,1,1,0],[over,1,1,0],[part,1,1,1],[peace,1,1.5,0],[perfect,1,1.5,0],[performed,50,30,180],[performing,50,30,250],[philosophy,10,20,1],[physiology,50,30,180],[pimple,1,1,0],[pm,5,5,5],[poem,1,1.5,0],[positive,1,1,0],[pot,5,5,10],[prayer,5,5,0],[prepared,1,1,0],[prevent,1,1,0],[prevented,1,1,0],[preventing,1,1,0],[prevention,1,1,0],[problems,1,1.5,0],[product,1,150,1],[products,5,5,5],[psychiatry,50,30,180],[psychology,2,2,0],[quality,1,1.5,0],[quantum,1,1,0],[read,15,1,20],[relax,180,50,30],[relaxed,180,50,30],[remain,1,1.5,0],[require,1,1,0],[room,500,400,300],[s,1,1,0],[safe,1,1.5,0],[sales,1,1.5,0],[same,1,1,0],[schizophrenia,1,1,0],[second,1,1.5,0],[see,1,0,1],[seek,1,1,0],[sell,5,3,0],[sets,5,1,0],[sex,1,1.5,0],[short,1,1,0],[single,1,1.5,0],[sites,20,0,30],[sitting,50,70,150],[sized,30,1,0],[skin,1,1,0],[sleep,180,50,30],[slices,2,2,2],[so,1,1,0],[societal,50,30,180],[some,1,1,0],[spine,1,1,10],[spiritual,50,30,180],[state,1,1.5,0],[stated,20,30,0],[stating,50,30,180],[status,1,1.5,0],[stop,5,1,15],[straight,1,1,0],[strawberry,1,1,1],[structure,1,1,1],[studied,50,30,180],[study,100,50,100],[studying,50,30,180],[subject,50,30,180],[successful,1,1.5,0],[surya,1,1.5,0],[sutra,5,1.5,0],[table,7,5,0],[tape,10,10,1],[task,15,2.5,0],[technique,1,1,0],[test,20,20,0],[text,20,30,0],[that,1,1,0],[the,1,1,0],[them,50,30,180],[then,1,1,0],[they,50,30,180],[thinking,1,1.5,0],[third,1,1.5,0],[this,1,1,0],[thoughts,1,1,0],[through,1,1,0],[tied,1,1,0],[time,1,1.5,0],[to,1,1,0],[together,1,1,0],[touch,1,1,0],[train,1,1,1],[trains,50,30,180],[travel,10,3,0],[treating,20,15,3],[turn,1,1,0],[two,1,1.5,0],[university,100,75,3],[unwanted,1,1,0],[up,1,1,0],[upasana,100,75,100],[used,1,1,0],[using,1,1,0],[var,1,1,1],[vegetables,5,
5,5],[videos,5,0,3],[view,5,5,5],[virality,1,1,0],[viruses,1,1,2],[vitamin,1,1,1],[walks,100,500,0],[wanted,15,20,3],[warm,1,1,0],[was,1,1,0],[water,5,5,5],[way,1,1,0],[ways,50,100,0],[well,50,30,180],[where,1,1,0],[whole,1,1,1],[with,1,1,0],[without,1,1,0],[words,5,7,1],[write,15,1,1],[writing,5,5,0],[wrote,15,1,1],[years,3,1,0]]). + +texttobr2qb(0):-!. +texttobr2qb(N1):- + texttobr2qb,N2 is N1-1,texttobr2qb(N2). diff --git a/Philosophy/.DS_Store b/Philosophy/.DS_Store new file mode 100644 index 0000000000000000000000000000000000000000..e1770559f95876dc5c8956c68c2b5a64bd262912 Binary files /dev/null and b/Philosophy/.DS_Store differ diff --git a/Philosophy/1 4 23.pl b/Philosophy/1 4 23.pl new file mode 100644 index 0000000000000000000000000000000000000000..9605a4349a413c60732d533a7629088b2119fdda --- /dev/null +++ b/Philosophy/1 4 23.pl @@ -0,0 +1,239 @@ +% 1 4 23.pl + +% 42 algs + +:-include('../listprologinterpreter/listprolog.pl'). + +% 1. I prepared to visualise the dimensions of the object that a word represented. I did this by writing the words on the pedagogy screen. First, I set up the pedagogy screen. Second, I wrote down the words on it. Third, I drew the object on the screen. In this way, I prepared to visualise the dimensions of the object that a word represented by writing the words on the pedagogy screen. + +object( +[ +[ +"* ", +"* ", +"* "], +[ +"* ", +"** ", +"* "], +[ +" ", +" ", +" "] +]). + +/* +object( +[ + [ + [*,-,-], + [*,-,-], + [*,-,-] + ], + [ + [*,-,-], + [*,*,-], + [*,-,-] + ], + [ + [-,-,-], + [-,-,-], + [-,-,-] + ] +]). +*/ +% verify,convert to matrix, find min,max x,y,z, find diff (dimensions) + +% dimensions(X,Y,Z). +% X = Z, Z = 2, +% Y = 3. + +dimensions(X,Y,Z) :- + object(Grid), + %find_bounds(Grid,X1,Y1,Z1), + verify(Grid,X1,Y1,Z1), + convert_to_matrix(Grid,X1,Y1,Z1,Matrix), + find_min_max_x_y_z(Matrix,Min_X,Max_X,Min_Y,Max_Y,Min_Z,Max_Z), + find_dimensions(Min_X,Max_X,Min_Y,Max_Y,Min_Z,Max_Z,X,Y,Z), + !. 
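As a side note (not part of the generated file), the min/max extraction that `find_min_max_x_y_z/7` performs with sort/append can also be written with SWI-Prolog's library predicates `min_list/2` and `max_list/2`. A minimal sketch for the X axis, assuming the `[X,Y,Z,Char]` cell list built by `convert_to_matrix/5` (the predicate name `x_extent` is illustrative):

```prolog
% Sketch only: an equivalent way to get the X extent of the drawn cells.
% Assumes Matrix is the [X,Y,Z,Char] list built by convert_to_matrix/5.
x_extent(Matrix, Min_X, Max_X) :-
    findall(X, (member([X,_Y,_Z,C], Matrix), C = (*)), Xs),
    min_list(Xs, Min_X),   % library(lists)
    max_list(Xs, Max_X).
```

The same shape applies to the Y and Z axes.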
+ +verify(Grid,X1,Y1,Z1) :- + length(Grid,Z1), + + member(Plane,Grid), + length(Plane,Y1), + forall(member(Plane1,Grid), + length(Plane1,Y1)), + + member(Line,Plane), + string_length(Line,X1), + forall(member(Line1,Plane), + string_length(Line1,X1)). + +convert_to_matrix(Grid,X1,Y1,Z1,Matrix) :- + numbers(Z1,1,[],Zs), + findall([X,Y,Z,Char],(member(Z,Zs), + get_item_n(Grid,Z,Plane), + numbers(Y1,1,[],Ys), + %findall(*,( + member(Y,Ys), + get_item_n(Plane,Y,Line), + string_chars(Line,Chars), + numbers(X1,1,[],Xs), + %findall(*,( + member(X,Xs), + get_item_n(Chars,X,Char)),Matrix). + +find_min_max_x_y_z(Matrix,Min_X,Max_X,Min_Y,Max_Y,Min_Z,Max_Z) :- + findall(X,(member([X,_Y,_Z,C],Matrix),C=(*)),Xs), + sort(Xs,Xs2), + append([Min_X],_,Xs2), + append(_,[Max_X],Xs2), + + findall(Y,(member([_X,Y,_Z,C],Matrix),C=(*)),Ys), + sort(Ys,Ys2), + append([Min_Y],_,Ys2), + append(_,[Max_Y],Ys2), + + findall(Z,(member([_X,_Y,Z,C],Matrix),C=(*)),Zs), + sort(Zs,Zs2), + append([Min_Z],_,Zs2), + append(_,[Max_Z],Zs2). + + find_dimensions(Min_X,Max_X,Min_Y,Max_Y,Min_Z,Max_Z,X,Y,Z) :- + X is Max_X-Min_X+1, + Y is Max_Y-Min_Y+1, + Z is Max_Z-Min_Z+1. + +% 9 + +% 11. I prepared to test that the giraffe had a long enough neck. I did this by testing the assignment with students. First, I tested the hypotenuse. Second, I tested the giraffe. Third, I tested the hiraffe (sic). In this way, I prepared to test that the giraffe had a long enough neck by testing the assignment with students. + +% length1(3,5,L). +% L = 4.0. + +length1(A,C,Length) :- Length is sqrt(C^2-A^2). + +% 21. I prepared to report the crime. I did this by identifying the murder (in fact, drinking tea). First, I identified that no one knew when the murder would occur. Second, I identified when it would happen. Third, I identified that it happened. In this way, I prepared to report the crime by identifying the murder (in fact, drinking tea). + +% report1(Ts). +% Ts = [5]. + +time1(1,nothing). +time1(2,nothing). 
+time1(3,nothing). +time1(4,nothing). +time1(5,murder). +time1(6,nothing). +time1(7,nothing). + +report1(Times) :- findall(Time1,time1(Time1,murder),Times). + +% 31. I prepared to transform vegetarians. I did this by eating the vegan nuggets. First, I served the vegetarian rice paper roll. Second, I walked to the next table. Third, I ate the vegan nuggets. In this way, I prepared to transform vegetarians by eating the vegan nuggets. + +% matrix_finder1([[[0,1],[-1,0]],[[1,0],[0,1]]],Matrix). + +% Matrix = [[0, -1.0], [1.0, 0]]. + +matrix_finder1(Ms,Ma) :- + Ms=[[M1,M2]|M3], + matrix_finder(M1,M2,Ma), + forall(member([M11,M12],M3),matrix_finder(M11,M12,Ma)),!. + +matrix_finder([I1,I2],[O1,O2],[[Ma1,Ma2],[Ma3,Ma4]]) :- + degree(D), + multiplier(M1), + cos_or_sin(CS1), + (CS1=cos->cos1(D,CS11);sin1(D,CS11)), + multiplier(M2), + cos_or_sin(CS2), + (CS2=cos->cos1(D,CS21);sin1(D,CS21)), + multiplier(M3), + cos_or_sin(CS3), + (CS3=cos->cos1(D,CS31);sin1(D,CS31)), + multiplier(M4), + cos_or_sin(CS4), + (CS4=cos->cos1(D,CS41);sin1(D,CS41)), + epsilon(E), + + Ma1 is M1*CS11, + Ma2 is M2*CS21, + Ma3 is M3*CS31, + Ma4 is M4*CS41, + + O11 is I1*Ma1+I2*Ma2, + O21 is I1*Ma3+I2*Ma4, + + O11 < O1+E, + O11 > O1-E, + + O21 < O2+E, + O21 > O2-E. + + +degree(90). +degree(-90). +multiplier(1). +multiplier(-1). +cos_or_sin(cos). +cos_or_sin(sin). +epsilon(0.000001). + +% 30 + +cos1(Deg,Cos_D) :- Cos_D1 is cos(Deg*pi/180), + epsilon(E), ((Cos_D1 < E, Cos_D1 > -E)-> Cos_D is 0; Cos_D = Cos_D1). +sin1(Deg,Sin_D) :- Sin_D1 is sin(Deg*pi/180), + epsilon(E), ((Sin_D1 < E, Sin_D1 > -E) -> Sin_D is 0; Sin_D = Sin_D1). + +% 1. In a binary opposition, one will be stronger. This is because one starts at it (it is the origin). The contention is exploring the relationship/directionality/methodology between two fundamental objects which have a relation. + +% and1([[a,b,c],[a,d,c],[e,f]],[a,c],Lines). 
+% Lines = [[a, b, c], [a, d, c]]
+
+and1(Lines1,Conj,Lines2) :-
+	findall(L,(member(L,Lines1),subtract(Conj,L,L3),L3=[]),Lines2).
+
+% 11. I prepared to enjoy a high quality of life. I did this by employing a legal consultant to help legally protect educational institutions. First, I wrote a contract to protect the organisation from donated land being revoked. Second, I wrote a contract so that people who had been employed to provide services wouldn't break their contracts. Third, I wrote a contract to protect the organisation from donated money being revoked. In this way, I prepared to enjoy a high quality of life by employing a legal consultant to help legally protect educational institutions.
+
+/*
+?- revoke([land,contract,money],R).
+R = true.
+
+?- revoke([land],R).
+R = true.
+
+?- revoke([else],R).
+R = false.
+
+?- revoke([contract,money],R).
+R = true.
+
+?- revoke([],R).
+R = false.
+*/
+
+revoke(Items,R) :-
+	((member(land,Items)->true;
+	(member(contract,Items)->true;
+	(member(money,Items))))->R=true;R=false).
+
+% 36
+
+% COMPUTATIONAL ENGLISH by Lucian Green Drawing connections 3 of 4.txt
+
+% 21. I prepared to equate Plato's soul (including the mind and body) with the soul created by 50 As (16 250 breasoning As) in Lucianic Computational English. I did this by writing 16 250 breasoning areas of study influenced by Plato's forms about Computational English. First, I equated the Platonic body with Lucianic Computational English bodily developed things. Second, I equated the Platonic mind with the Lucianic Computational English mental ontological states. Third, I equated the Platonic soul with the Lucianic Computational English continual soulful aimingness (sic). In this way, I prepared to equate Plato's soul (including the mind and body) with the soul created by 50 As (16 250 breasoning As) in Lucianic Computational English by writing 16 250 breasoning areas of study influenced by Plato's forms about Computational English. 
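The nested if-then-else in `revoke/2` above amounts to asking whether `Items` shares at least one element with the protected list. A possible simplification (a sketch using SWI-Prolog's `intersection/3`; the name `revoke2` is hypothetical):

```prolog
% Sketch: revoke/2 rewritten as a non-empty-intersection test.
revoke2(Items, R) :-
    intersection(Items, [land,contract,money], Common),
    ( Common = [] -> R = false ; R = true ).
```

It gives the same answers as the example queries listed above, e.g. `revoke2([else],R)` binds `R = false` and `revoke2([contract,money],R)` binds `R = true`.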
+
+% does each direction have 50 As?
+
+dir(1,2,true).
+dir(2,3,true).
+dir(3,1,true).
+dir(2,1,true).
+dir(3,2,true).
+dir(1,3,true).
+
+dirs_true :- forall((member(A,[1,2,3]),member(B,[1,2,3]),not(A=B)),
+dir(A,B,true)).
+
+% 43 diff --git a/Philosophy/10 2 23.pl b/Philosophy/10 2 23.pl new file mode 100644 index 0000000000000000000000000000000000000000..e3fe5ae484ef0da4578e9679644e8f5dd77eba10 --- /dev/null +++ b/Philosophy/10 2 23.pl @@ -0,0 +1,52 @@ +% 10 2 23.pl
+
+% Need 11 more algs
+
+% ["Computational English","COMPUTATIONAL ENGLISH by Lucian Green Perspectives 1 of 4.txt",0,algorithms,"6. I prepared to verify the definition of the word. I did this by writing the third technique algorithm in meaning. First, I entered the query. Second, I followed the algorithm to find the relevant definition of the query. Third, I read the returned definition. In this way, I prepared to verify the definition of the word by writing the third technique algorithm in meaning."]
+
+:-include('../listprologinterpreter/listprolog.pl').
+%:-include('../listprologinterpreter/la_maths.pl').
+
+ruler1(A,L) :- length(A,L).
+ruler2(A,L) :- string_length(A,L).
+fan(T1,T2) :- T2 is T1-5.
+pen(A,A) :- writeln(A).
+bag(A,A) :- findall(_,(member(B,A),writeln(B)),_).
+
+feed(dog,A,B) :- B=stomach(A).
+walk(dog,X1,Y1,X2,Y2,V) :- X3 is X2-X1, Y3 is Y2-Y1, V = i*X3+j*Y3.
+magnitude(X1,Y1,X2,Y2,M) :- M is sqrt((X2-X1)^2+(Y2-Y1)^2).
+
+phone([X1,Y1,M],[X2,Y2,M]):-writeln([[X1,Y1,M],[X2,Y2,M]]).
+email([A1,B1,C1,D1,E],[A2,B2,C2,D2,E]):-writeln([[A1,B1,C1,D1,E],[A2,B2,C2,D2,E]]).
+
+% printer("a b c de",20,J).
+% J = "a     b     c     de". 
+ +printer(S,Width,%A,B,CL% +Justified_line +) :- + string_length(S,L),B is Width div L,A is Width mod L, + atomic_list_concat(C,' ',S), + foldr(string_concat,C,CC), + string_length(CC,CCL), + length(C,CL), + append(C1,[C2],C), + string_length(C2,C2L), + A1 is (Width-CCL%+C2L + ) div (CL-1), + %A2 is (10-L%+C2L + %) mod (CL-1), + (B>=1->(numbers(A1,1,[],N),findall(' ',member(_,N),D),foldr(string_concat,D,'',E), + append(C1,[''],C3), + atomic_list_concat(C3,E,Justified_line1), + + string_length(Justified_line1,JL), + A3 is Width-JL-C2L, + + numbers(A3,1,[],N1),findall(' ',member(_,N1),D2),%append(D1,[E],D2), + foldr(string_concat,D2,'',E1), + foldr(string_concat,[Justified_line1,E1,C2],Justified_line) + ); + Justified_line=S),!. + \ No newline at end of file diff --git a/Philosophy/10 4 23.pl b/Philosophy/10 4 23.pl new file mode 100644 index 0000000000000000000000000000000000000000..a8d1f4993ad2dd043e8a800399203816df3330fe --- /dev/null +++ b/Philosophy/10 4 23.pl @@ -0,0 +1,54 @@ +% 10 4 23.pl + +% 19 algs done + +:-include('../listprologinterpreter/listprolog.pl'). + +% 11. I prepared to test a philosophy over time. I did this by writing the philosophy that there are three parts of a philosophy. First, I read the introduction to Melchior. Second, I enjoyed the mid-section on the set of Excelsior. Third, I mercurially concluded the endgame. In this way, I prepared to test a philosophy over time by writing the philosophy that there are three parts of a philosophy. + +% beginnings_middles_ends(10,5,BME). 
+% [[1,[1,2,3,4,5]],[2,[1,2,3,4,5]],[3,[1,2,3,4,5]],[4,[1,2,3,4,5]],[5,[1,2,3,4,5]],[6,[1,2,3,4,5]],[7,[1,2,3,4,5]],[8,[1,2,3,4,5]],[9,[1,2,3,4,5]],[10,[1,2,3,4,5]]] + +beginnings_middles_ends(Number_of_lines1,Max_line_length,BME) :- + generate_lines(Number_of_lines1,Max_line_length,[],L1), + foldr(append,L1,L2), + permutation(L2,L3), + findall(Item,member([Item,_],L3),Items1), + sort(Items1,Items2), + findall([Item2,Items4],(member(Item2,Items2), + findall(Item3,member([Item2,Item3],L3),Items3), + sort(Items3,Items4)),BME),!. + +generate_lines(0,_Max_line_length,L,L) :- !. +generate_lines(Number_of_lines1,Max_line_length,L1,L2) :- + numbers(Max_line_length,1,[],L3), + findall([Number_of_lines1,Item],member(Item,L3),L4), + append(L1,[L4],L5), + Number_of_lines2 is Number_of_lines1-1, + generate_lines(Number_of_lines2,Max_line_length,L5,L2). + +% 21. I prepared to breason out the seen-as object, for each sentence of a VET course I wrote, with 5 breasonings. I did this by writing 20 breasonings (an 80 breasoning-long A) per sentence. First, I wrote the question's A. Second, I wrote the answer's A. Third, I wrote a connection A between the question A and answer A. In this way, I prepared to breason out the seen-as object, for each sentence of a VET course I wrote, with 5 breasonings by writing 20 breasonings (an 80 breasoning-long A) per sentence. + +% key word in sentence + +% phrase(sentence(A),[the,john,read,to,sera]). + +sentence(_)-->subject(_),verb_phrase(_). +verb_phrase(_)-->verb(_),person(_). +subject(_)-->determiner(_),person(_). +verb(_)-->[walked,to]. +verb(_)-->[saw]. +verb(_)-->[talked,to]. +verb(_)-->[read,to]. +verb(_)-->[wrote,to]. +determiner(_)-->[the]. +determiner(_)-->[a]. +person(_)-->[john]. +person(_)-->[peter]. +person(_)-->[eshan]. +person(_)-->[mary]. +person(_)-->[james]. +person(_)-->[jonno]. +person(_)-->[jonny]. +person(_)-->[richard]. +person(_)-->[sera]. 
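A usage note on the DCG above (queries are assumed, not from the original file): `phrase/2` runs the grammar in both directions, recognising token lists and generating them.

```prolog
% Recognition: does a token list parse as a sentence?
% ?- phrase(sentence(_),[the,john,read,to,sera]).
% true.

% Generation: enumerate sentences the grammar accepts
% (first clauses of determiner/verb/person are tried first).
% ?- phrase(sentence(_),S).
% S = [the,john,walked,to,john] ;
% S = [the,john,walked,to,peter] ...
```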
diff --git a/Philosophy/11 6 23.pl b/Philosophy/11 6 23.pl new file mode 100644 index 0000000000000000000000000000000000000000..4701211db5af6709d2303b993e029e76d60c5b4a --- /dev/null +++ b/Philosophy/11 6 23.pl @@ -0,0 +1,484 @@ +% 11 6 23.pl + +% 55 algs + +% ["Fundamentals of Pedagogy and Pedagogy Indicators","FUNDAMENTALS OF PEDAGOGY by Lucian Green Rebreathsonings 2 of 4.txt",0,algorithms,"17. The Asperger patient showed his mastery of sports statistics by remembering the match with the top score. He did this by driving someone up to the door in a tricycle. First, he started at the edge of the courtyard. Second, he drove through the courtyard. Third, he stopped at the door. In this way, the Asperger patient prepared to demonstrate his mastery of sports statistics by remembering the match with the top score by driving someone up to the door in a tricycle."] + +% The Asperger patient showed his mastery of sports statistics by remembering the match with the top score. + +% top([4,5,7,3,2],Max). + +top(List,Max) :- + sort(List,List2), + append(_,[Max],List2),!. + +% He did this by driving someone up to the door in a tricycle. + +% Line Graph + +:-include('../LuciansHandBitMap-Font/characterbr.pl'). +:-include('../listprologinterpreter/listprolog.pl'). + +make_grid(X,Y,Grid) :- + numbers(Y,1,[],Ys), + numbers(X,1,[],Xs), + findall([X1,Y1,[]],(member(Y1,Ys),member(X1,Xs)),Grid). + +% line_graph(2,2,[1,2],_). +% * +%* + +/* +line_graph(4,4,[1,2],_). + * + * + * + + +line_graph(5,5,[1,2],_). + * + * + * + * + + +line_graph(10,10,[10,1,10],_). + * * + * * + * * + * * + * * + * * + * * + * * + * + * + +*/ +% line_graph(20,20,[10,1,5,4,3,8],_). + +line_graph(X_win,Y_win,Ys,Grid) :- + make_grid(X_win,Y_win,Grid1), + length(Ys,Xs_L), + sort(Ys,Ys2), + append(_,[Y_max],Ys2), + numbers(Xs_L,1,[],Xs), + findall([X2,Y2],(member(X,Xs),get_item_n(Ys,X,Y),Y2 is floor(Y*(Y_win/Y_max)),X2 is floor(X*(X_win/Xs_L))),Ps), + plot(Ps,Grid1,Grid), + prettyprint1_lg(Grid,Y_win,X_win),!. 
+ +plot(Ps,Grid1,Grid2) :- + Ps=[[X1,Y1],[X2,Y2]], + %write([[X1,Y1],[X2,Y2]]), + line1(X1,Y1,X2,Y2,Grid1,Grid2,1),!. +plot(Ps,Grid1,Grid2) :- + Ps=[[X1,Y1],[X2,Y2]|Ps2], + %write([X2,Y2]), + line1(X1,Y1,X2,Y2,Grid1,Grid3,1), + plot([[X2,Y2]|Ps2],Grid3,Grid2),!. +plot(_,G,G) :- !. + +prettyprint1_lg(_C,0,_X) :- !. +prettyprint1_lg(C,N,X) :- + prettyprint2_lg(C,N,1,X), + writeln(''), + N2 is N-1, + prettyprint1_lg(C,N2,X). +prettyprint2_lg(_C,_N,X1,X) :- X1 is X+1, !. +prettyprint2_lg(C,N,M,X) :- + member([M,N,M2],C), + (M2=[]->write(' ');write('*')),write('\t'), + M3 is M+1, + prettyprint2_lg(C,N,M3,X). + +% 37 left + +% First, he started at the edge of the courtyard. + +% Graphs, returns true if point is inside a triangle + +/* + +edge(3, 3, 1, 1, 1, 3, 3, 2, 2, 2). +Point is in triangle. +1 1 +1 2 1 +1 + +edge(3, 3, 1, 1, 1, 3, 3, 2, 3, 1). +Point is not in triangle. +1 1 +1 1 1 +1 2 + +edge(10, 10, 2, 2, 2, 9, 8, 5, 5, 5). +Point is in triangle. + + 1 + 1 1 1 + 1 1 1 1 + 1 1 1 1 1 1 + 1 1 1 2 1 1 1 + 1 1 1 1 1 + 1 1 1 + 1 + +edge(10, 10, 2, 2, 2, 9, 8, 5, 8, 1). +Point is not in triangle. + + 1 + 1 1 1 + 1 1 1 1 + 1 1 1 1 1 1 + 1 1 1 1 1 1 1 + 1 1 1 1 1 + 1 1 1 + 1 + 2 + +*/ + +edge(X_win,Y_win,X1,Y1,X2,Y2,X3,Y3,PX,PY) :- + make_grid(X_win,Y_win,Grid1), + draw_edges(X1,Y1,X2,Y2,X3,Y3,Grid1,Grid2), + triangle_centre(X1,Y1,X2,Y2,X3,Y3,QX,QY), + fill(QX,QY,Grid2,Grid3), + ((member([PX,PY,M],Grid3), + member(1,M))-> + writeln("Point is in triangle."); + writeln("Point is not in triangle.")), + line1(PX,PY,PX,PY,Grid3,Grid4,2), + prettyprint1_e(Grid4,Y_win,X_win),!. + +draw_edges(X1,Y1,X2,Y2,X3,Y3,Grid1,Grid4) :- + line1(X1,Y1,X2,Y2,Grid1,Grid2,1), + line1(X2,Y2,X3,Y3,Grid2,Grid3,1), + line1(X3,Y3,X1,Y1,Grid3,Grid4,1). + +triangle_centre(X1,Y1,X2,Y2,X3,Y3,QX,QY) :- + QX is floor((X1 + X2 + X3)/3), + QY is floor((Y1 + Y2 + Y3)/3). 
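As an aside, the flood-fill approach above can be cross-checked against the standard same-side test: a point lies inside a triangle when the 2D cross products against all three edges have the same sign. A sketch under that assumption (predicate names are illustrative, not from the file; points exactly on an edge count as inside here):

```prolog
% Sketch: same-side point-in-triangle test using 2D cross products.
point_in_triangle(X1,Y1,X2,Y2,X3,Y3,PX,PY) :-
    edge_sign(X1,Y1,X2,Y2,PX,PY,S1),
    edge_sign(X2,Y2,X3,Y3,PX,PY,S2),
    edge_sign(X3,Y3,X1,Y1,PX,PY,S3),
    S1 = S2, S2 = S3.

% Sign of the cross product (B-A) x (P-A).
edge_sign(AX,AY,BX,BY,PX,PY,S) :-
    C is (BX-AX)*(PY-AY) - (BY-AY)*(PX-AX),
    ( C >= 0 -> S = 1 ; S = -1 ).
```

For the first `edge/10` example above, `point_in_triangle(1,1,1,3,3,2,2,2)` succeeds and `point_in_triangle(1,1,1,3,3,2,3,1)` fails, matching "Point is in triangle." and "Point is not in triangle.".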
+
+fill(X,Y,Grid1,Grid6) :-
+	((member([X,Y,M],Grid1),
+	member(1,M))->
+	Grid1=Grid6;
+	(line1(X,Y,X,Y,Grid1,Grid2,1),
+	XM is X-1,
+	XP is X+1,
+	YM is Y-1,
+	YP is Y+1,
+	fill(XM,Y,Grid2,Grid3),
+	fill(XP,Y,Grid3,Grid4),
+	fill(X,YM,Grid4,Grid5),
+	fill(X,YP,Grid5,Grid6))),!.
+
+prettyprint1_e(_C,0,_X) :- !.
+prettyprint1_e(C,N,X) :-
+	prettyprint2_e(C,N,1,X),
+	writeln(''),
+	N2 is N-1,
+	prettyprint1_e(C,N2,X).
+prettyprint2_e(_C,_N,X1,X) :- X1 is X+1, !.
+prettyprint2_e(C,N,M,X) :-
+	member([M,N,M2],C),
+	(M2=[]->write(' ');
+	(member(2,M2)->write(2);
+	(write(1)))),write('\t'),
+	M3 is M+1,
+	prettyprint2_e(C,N,M3,X).
+
+% 25 left
+
+% Second, he drove through the courtyard.
+
+% 3D edge
+
+% edge_3d(3,3,3,1,1,1,3,1,1,1,3,1,1,1,3,2,2,2).
+
+% edge_3d(3,3,2,1,1,1,3,1,1,1,3,1,1,1,2,2,2,2).
+
+edge_3d(X_win,Y_win,Z_win,X1,Y1,Z1,X2,Y2,Z2,X3,Y3,Z3,X4,Y4,Z4,PX,PY,PZ) :-
+	make_grid_3d(X_win,Y_win,Z_win,Grid1),
+	draw_edges_3d(X1,Y1,Z1,X2,Y2,Z2,X3,Y3,Z3,X4,Y4,Z4,Grid1,Grid21),
+	draw_faces_3d(X1,Y1,Z1,X2,Y2,Z2,X3,Y3,Z3,X4,Y4,Z4,Grid21,Grid2),
+	
+%prettyprint_3d_1_e1(Grid2,Y_win,X_win,Z_win),
+
+	tetrahedron_centre_3d(X1,Y1,Z1,X2,Y2,Z2,X3,Y3,Z3,X4,Y4,Z4,QX,QY,QZ),
+	fill_3d(QX,QY,QZ,Grid2,Grid3),
+	((member([PX,PY,PZ,M],Grid3),
+	member(1,M))->
+	writeln("Point is in tetrahedron.");
+	writeln("Point is not in tetrahedron.")),
+	line1_3d(PX,PY,PZ,PX,PY,PZ,Grid3,Grid4,2),
+	prettyprint_3d_1_e1(Grid4,Y_win,X_win,Z_win),!.
+
+make_grid_3d(X,Y,Z,Grid) :-
+	numbers(Y,1,[],Ys),
+	numbers(X,1,[],Xs),
+	numbers(Z,1,[],Zs),
+	findall([X1,Y1,Z1,[]],(member(Y1,Ys),member(X1,Xs),member(Z1,Zs)),Grid).
+
+draw_edges_3d(X1,Y1,Z1,X2,Y2,Z2,X3,Y3,Z3,X4,Y4,Z4,Grid1,Grid7) :-
+	line1_3d(X1,Y1,Z1,X2,Y2,Z2,Grid1,Grid2,1),
+	line1_3d(X2,Y2,Z2,X3,Y3,Z3,Grid2,Grid3,1),
+	line1_3d(X3,Y3,Z3,X1,Y1,Z1,Grid3,Grid4,1),
+	line1_3d(X1,Y1,Z1,X4,Y4,Z4,Grid4,Grid5,1),
+	line1_3d(X2,Y2,Z2,X4,Y4,Z4,Grid5,Grid6,1),
+	line1_3d(X3,Y3,Z3,X4,Y4,Z4,Grid6,Grid7,1). 
+
+draw_faces_3d(X1,Y1,Z1,X2,Y2,Z2,X3,Y3,Z3,X4,Y4,Z4,Grid1,Grid5) :-
+	draw_face_3d(X1,Y1,Z1,X2,Y2,Z2,X3,Y3,Z3,Grid1,Grid2),
+	draw_face_3d(X1,Y1,Z1,X2,Y2,Z2,X4,Y4,Z4,Grid2,Grid3),
+	draw_face_3d(X1,Y1,Z1,X4,Y4,Z4,X3,Y3,Z3,Grid3,Grid4),
+	draw_face_3d(X4,Y4,Z4,X2,Y2,Z2,X3,Y3,Z3,Grid4,Grid5).
+
+draw_face_3d(X1,Y1,Z1,X2,Y2,Z2,X3,Y3,Z3,Grid1,Grid2) :-
+	DX is X3-X2,
+	DY is Y3-Y2,
+	DZ is Z3-Z2,
+	Mod is sqrt(DX^2+DY^2+DZ^2),
+	Segs is ceiling(Mod/0.5),
+	Segs2 is Segs-1,
+	numbers(Segs2,0,[],S),
+	% each point lies S1/Segs of the way from (X2,Y2,Z2) to (X3,Y3,Z3)
+	findall([SX,SY,SZ],(member(S1,S),
+	SX is floor(X2+S1*DX/Segs),
+	SY is floor(Y2+S1*DY/Segs),
+	SZ is floor(Z2+S1*DZ/Segs)),Ps),
+	draw_lines(X1,Y1,Z1,Ps,Grid1,Grid2).
+
+draw_lines(_X1,_Y1,_Z1,[],Grid,Grid) :- !.
+draw_lines(X1,Y1,Z1,Ps,Grid1,Grid2) :-
+	Ps=[[X2,Y2,Z2]|Ps1],
+	line1_3d(X1,Y1,Z1,X2,Y2,Z2,Grid1,Grid3,1),
+	draw_lines(X1,Y1,Z1,Ps1,Grid3,Grid2).
+
+tetrahedron_centre_3d(X1,Y1,Z1,X2,Y2,Z2,X3,Y3,Z3,X4,Y4,Z4,QX,QY,QZ) :-
+	QX is floor((X1 + X2 + X3 + X4)/4),
+	QY is floor((Y1 + Y2 + Y3 + Y4)/4),
+	QZ is floor((Z1 + Z2 + Z3 + Z4)/4).
+
+fill_3d(X,Y,Z,Grid1,Grid6) :-
+	((member([X,Y,Z,M],Grid1),
+	member(1,M))->
+	Grid1=Grid6;
+	(line1_3d(X,Y,Z,X,Y,Z,Grid1,Grid2,1),
+	XM is X-1,
+	XP is X+1,
+	YM is Y-1,
+	YP is Y+1,
+	ZM is Z-1,
+	ZP is Z+1,
+	fill_3d(X,Y,ZM,Grid2,Grid3),
+	fill_3d(X,Y,ZP,Grid3,Grid41),
+	fill_3d(XM,Y,Z,Grid41,Grid31),
+	fill_3d(XP,Y,Z,Grid31,Grid4),
+	fill_3d(X,YM,Z,Grid4,Grid5),
+	fill_3d(X,YP,Z,Grid5,Grid6))),!.
+
+prettyprint_3d_1_e1(Grid,X,Y,Z) :-
+	numbers(Y,1,[],YN1),
+	reverse(YN1,YN),
+	numbers(X,1,[],XN),
+	numbers(Z,1,[],ZN1),
+	reverse(ZN1,ZN),
+
+
+	findall(_,(member(Z1,ZN),
+	nl,nl,
+	member(Y1,YN),
+	nl,
+	member(X1,XN),
+	member([Z1,X1,Y1,M2],Grid),
+
+	(M2=[]->write(' ');
+	(member(2,M2)->write(2);
+	(write(1)))),write('\t')
+
+	%(Pixel1=[*]->Pixel="*";Pixel=" "),
+	%write(Pixel)
+	),_).
+
+/*
+prettyprint_3d_1_e1(_C,_X,_Y,0) :- !. 
+prettyprint_3d_1_e1(C,X,Y,Z) :- + prettyprint_3d_1_e(C,1,X,Z), + writeln(''), + Z2 is Z-1, + prettyprint_3d_1_e1(C,X,Y,Z2). + +prettyprint_3d_1_e(_C,0,_X,_Z) :- !. +prettyprint_3d_1_e(C,N,X,Z) :- + prettyprint_3d_2_e(C,N,Z,1,X), + writeln(''), + N2 is N-1, + prettyprint_3d_1_e(C,N2,X,Z). +prettyprint_3d_2_e(_C,_N,_Z,X1,X) :- X1 is X+1, !. +prettyprint_3d_2_e(C,N,Z,M,X) :- + member([M,N,Z,M2],C), + (M2=[]->write(' '); + (member(2,M2)->write(2); + (write(1)))),write('\t'), + M3 is M+1, + prettyprint_3d_2_e(C,N,Z,M3,X). +*/ + + + +line1_3d(X1,Y1,Z1,X2,Y2,Z2,C2,C4,N3) :- + %%_3d(X1=_3d(XA1=X1,XA2=X2);_3d(XA1=X2,XA2=X1)), + %%_3d(Y1=_3d(YA1=Y1,YA2=Y2);_3d(YA1=Y2,YA2=Y1)), + %%gridline1_3d(XA1,YA1,XA2,YA2,C2,C3,N3). + gridline1_3d(X1,Y1,Z1,X2,Y2,Z2,C2,C3,1000), + findall([X,Y,Z,M2],(member([X,Y,Z,M],C3), + (member(1000,M)->(delete(M,1000,M1),append(M1,[N3],M2)); + M=M2)),C4). + +%% Graphs _3d(X1,Y1) to _3d(X2,Y2) + +gridline1_3d(X1,Y1,Z1,X2,Y2,Z2,C2,C3,N3) :- + sortbyx_3d(X1,Y1,X2,Y2,XA1,YA1,XA2,YA2), + equation_3d(XA1,YA1,XA2,YA2,M,C), + gridline_orig_2_3d(XA1,YA1,XA2,YA2,M,C,C2,C4,N3), + %C4=C3, + %C2=C4, + sortbyx_3d(X1,Z1,X2,Z2,XXA1,ZZA1,XXA2,ZZA2), + equation_3d(XXA1,ZZA1,XXA2,ZZA2,M0,C0), + gridline2_3d(XXA1,ZZA1,XXA2,ZZA2,M0,C0,C4,C3,N3), + + !. + %%writeln_3d(Grid1), + %%sort_3d(YA1,YA2,YB1,YB2), + %%print_3d(XA1,YB1,XA2,YB2,Grid1,_Grid2),!. + +%% Sorts _3d(X1,Y1) and _3d(X2,Y2) by X + +sortbyx_3d(X1,Y1,X2,Y2,X1,Y1,X2,Y2) :- + X2 >= X1. +sortbyx_3d(X1,Y1,X2,Y2,X2,Y2,X1,Y1) :- + X2 < X1. + +%% Finds the rise and run of _3d(X1,Y1) and _3d(X2,Y2) + +equation_3d(X1,Y1,X2,Y2,M,C) :- + DY is Y2-Y1, + DX is X2-X1, + %%writeln_3d([y2,Y2,y1,Y1,x2,X2,x1,X1,dy,DY,dx,DX]), %% + equation2_3d(DY,DX,M,Y1,X1,C). + +%% Finds the gradient m and y-intercept c of _3d(X1,Y1) and _3d(X2,Y2) + +equation2_3d(_DY,0,999999999,_Y1,X1,X1) :- + !. +equation2_3d(DY,DX,M,Y1,X1,C) :- + M is DY/DX, + C is Y1-M*X1 + %%,writeln_3d([m,M,c,C]) + . 
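A small worked example of `equation_3d/6` above (queries assumed, not in the file): the line through (1,1) and (3,5) has gradient 2 and y-intercept -1, and vertical lines hit the `equation2_3d/6` sentinel case.

```prolog
% ?- equation_3d(1,1,3,5,M,C).
% M = 2, C = -1.
%
% Vertical line (DX = 0) uses the 999999999 sentinel gradient:
% ?- equation_3d(2,1,2,5,M,C).
% M = 999999999, C = 2.
```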
+
+%% Finds the graph of the line connecting the two points. It does this by finding the graph flipped in the y=x line if the gradient m is greater than 1 or less than -1, so that the graph is not disjointed
+
+gridline2_3d(X1,_Y1,X2,_Y2,M,C,C2,Grid,N3) :-
+	M =< 1, M >= -1,
+	%%x_3d(X),%%X1 is X+1,
+	gridline3_3d(X1,X2,M,C,C2,Grid,N3,_X).
+gridline2_3d(X1,Y1,_X2,Y2,M,_C,C22,Grid1,N3) :-
+	(M > 1; M < -1),
+	M2 is 1/M,
+	sort_3d(Y1,Y2,YA1,YA2),
+	C2 is X1-M2*Y1,
+	flipxy_3d(C22,[],Grid),
+	%%y_3d(Y),
+	%%Y1 is Y+1,
+	gridline3_3d(YA1,YA2,M2,C2,Grid,Grid2,N3,_Y1),
+	%%writeln_3d(['***',flipxygrid,Grid2]),
+	flipxy_3d(Grid2,[],Grid1).
+
+%% Sorts Y1 and Y2
+
+sort_3d(Y1,Y2,Y1,Y2) :-
+	Y1=<Y2.
+sort_3d(Y1,Y2,Y2,Y1) :-
+	Y1>Y2.
+
+%% Plots a point at each x-value of the graph
+
+gridline3_3d(X1,X2,_M,_C,Grid,Grid,_N3,_N4) :-
+	%%X1 is N4+1. %% swap,
+	X1 is X2+1.
+gridline3_3d(X1,X2,M,C,Grid1,Grid2,N3,_N4) :-
+	Z is floor(M*X1+C), % or round ***
+	%%Coord = [X1,Y],
+	findall(New,(
+	member([X0,Y0,Z0,M0],Grid1),
+	([X0,Y0,Z0,M0]=[X1,Y,Z,M2]->
+	%member([X1,Y,Z,M2],Grid1),
+	%delete(Grid1,[X1,Y,Z,M2],Grid11),
+	(append(M2,[N3],M3),New=[X1,Y,Z,M3]);
+	New=[X0,Y0,Z0,M0])
+	),Grid3),
+
+	%append(Grid11,[[X1,Y,M3]],Grid3),
+	%%writeln_3d([X1,Y,M3]), %%
+	X3 is X1+1,
+	gridline3_3d(X3,X2,M,C,Grid3,Grid2,N3,_N42).
+
+%% Flips the graph in the y=x line
+
+flipxy_3d([],Grid,Grid) :- !.
+flipxy_3d(Grid1,Grid2,Grid3) :-
+	Grid1 = [Coord1 | Coords],
+	Coord1 = [X, Y, Z, M],
+	Coord2 = [Z, Y, X, M],
+	append(Grid2,[Coord2],Grid4),
+	flipxy_3d(Coords,Grid4,Grid3).
+
+gridline_orig_2_3d(X1,_Y1,X2,_Y2,M,C,C2,Grid,N3) :-
+	M =< 1, M >= -1,
+	%%x_3d(X),%%X1 is X+1,
+	gridline_orig_3_3d(X1,X2,M,C,C2,Grid,N3,_X).
+gridline_orig_2_3d(X1,Y1,_X2,Y2,M,_C,C22,Grid1,N3) :-
+	(M > 1; M < -1),
+	M2 is 1/M,
+	sort_3d(Y1,Y2,YA1,YA2),
+	C2 is X1-M2*Y1,
+	flipxy_orig_3d(C22,[],Grid),
+	%%y_3d(Y),
+	%%Y1 is Y+1,
+	gridline_orig_3_3d(YA1,YA2,M2,C2,Grid,Grid2,N3,_Y1),
+	%%writeln_3d(['***',flipxygrid,Grid2]),
+	flipxy_3d(Grid2,[],Grid1). 
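A usage note (query assumed, not from the file): `flipxy_3d/3` above swaps the X and Z components of each cell, so applying it twice returns the original grid, which is why the steep-gradient branch can flip, plot, and flip back.

```prolog
% ?- flipxy_3d([[1,2,3,[a]]],[],G1), flipxy_3d(G1,[],G2).
% G1 = [[3,2,1,[a]]],
% G2 = [[1,2,3,[a]]].
```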
+ +gridline_orig_3_3d(X1,X2,_M,_C,Grid,Grid,_N3,_N4) :- + %%X1 is N4+1. %% swap, + X1 is X2+1. +gridline_orig_3_3d(X1,X2,M,C,Grid1,Grid2,N3,_N4) :- + Z is floor(M*X1+C), % or round *** + %%Coord = [X1,Y], + %trace, + findall(New,( + member([X0,Y0,Z0,M0],Grid1), + ([X0,Y0,Z0,M0]=[X1,Y,Z,M2]-> + %member([X1,Y,Z,M2],Grid1), + %delete(Grid1,[X1,Y,Z,M2],Grid11), + (append(M2,[N3],M3),New=[X1,Y,Z,M3]); + New=[X0,Y0,Z0,M0]) + ),Grid3), + + %append(Grid11,[[X1,Y,M3]],Grid3), + %%writeln_3d([X1,Y,M3]), %% + X3 is X1+1, + gridline_orig_3_3d(X3,X2,M,C,Grid3,Grid2,N3,_N42). + +%% Flips the graph in the y=x line + +flipxy_orig_3d([],Grid,Grid) :- !. +flipxy_orig_3d(Grid1,Grid2,Grid3) :- + Grid1 = [Coord1 | Coords], + Coord1 = [X, Y, Z, M], + Coord2 = [Y, X, Z, M], + append(Grid2,[Coord2],Grid4), + flipxy_orig_3d(Coords,Grid4,Grid3). + +% 16 algs over \ No newline at end of file diff --git a/Philosophy/16 9 23.pl b/Philosophy/16 9 23.pl new file mode 100644 index 0000000000000000000000000000000000000000..750df41b7386d02db58b932534d0461567f8d4b5 --- /dev/null +++ b/Philosophy/16 9 23.pl @@ -0,0 +1,151 @@ +% 16 9 23 + +% 35-7=28 algs + +% ["Fundamentals of Pedagogy and Pedagogy Indicators","FUNDAMENTALS OF PEDAGOGY by Lucian Green Two Uses 19 of 30.txt",0,algorithms,"186. ALEXIS: The subject should include 50 As in each book."] + +book1(book1,50). +book1(book2,60). +book1(book3,25). +book1(book4,0). +book1(book5,50). + +% ?- all_fifty(D). +% D = [[book1, 50], [book2, 60], [book5, 50]]. + +all_fifty(Data):-findall([Name,Number],(book1(Name,Number),Number >=50),Data). + +/* +?- pretty_print_table([[1,2,3],[4,5,6],[7,8,9]]). +1 2 3 +4 5 6 +7 8 9 +*/ +pretty_print_table(Data) :- +findall(_,(member(Item,Data), +findall(_,(member(Item1,Item),write(Item1),write("\t")),_), +writeln("")),_). + +% pretty_print_nd([[[1,2],[3,4]],[[1,2],[3,4]]]). +/* +1 2 +3 4 + +1 2 +3 4 + pretty_print_nd([[[[1,2],[3,4]],[[1,2],[3,4]]],[[[1,2],[3,4]],[[1,2],[3,4]]]]). 
+1	2
+3	4
+
+1	2
+3	4
+
+1	2
+3	4
+
+1	2
+3	4
+
+pretty_print_nd([[[[[1,2],[3,4]],[[1,2],[3,4]]],[[[1,2],[3,4]],[[1,2],[3,4]]]],[[[[1,2],[3,4]],[[1,2],[3,4]]],[[[1,2],[3,4]],[[1,2],[3,4]]]]]).
+
+1	2
+3	4
+
+1	2
+3	4
+
+1	2
+3	4
+
+1	2
+3	4
+
+1	2
+3	4
+
+1	2
+3	4
+
+1	2
+3	4
+
+1	2
+3	4
+
+*/
+pretty_print_nd(Data):-
+findall(_,(member(Item,Data),
+((is_list(Item),Item=[Item1|_],is_list(Item1),Item1=[Item2|_],
+not(is_list(Item2)))->
+(pretty_print_table(Item),nl);
+pretty_print_nd(Item)
+)
+),_).
+/*
+?- pretty_print_all_fifty.
+book1	50
+book2	60
+book5	50
+*/
+pretty_print_all_fifty :-
+all_fifty(Data),
+pretty_print_table(Data).
+
+% 28-14=14
+
+% ["Delegate workloads, Lecturer, Recordings","Delegate Workloads 1.txt",0,algorithms,"	32. I compared notes."]
+
+:-include('../listprologinterpreter/la_maths.pl').
+
+/*
+find_complexity(n,C).
+C = n.
+
+find_complexity(sqrn, C).
+C=sqrn.
+*/
+n(N,N1) :- n(N,[],N1).
+n([],N,N) :- !.
+n(N,N1,N2) :- N=[N31|N32],
+append(N1,[N31],N4),
+%N4 is N1+1,
+n(N32,N4,N2).
+
+sqrn(Ns,N1) :- %numbers(N,1,[],Ns),
+findall([N2,N3],(member(N2,Ns),member(N3,Ns)),N1).
+
+find_complexity(Name,Complexity) :-
+	N0=10,
+	numbers(N0,1,[],N01),
+	(Name=n->n(N01,N1);
+	Name=sqrn->sqrn(N01,N1)),
+	N2 is N0^2,
+	numbers(N2,1,[],Ns2),
+	(close1(N01,N1)->Complexity=n;
+	close1(N1,Ns2)->Complexity=sqrn).
+close1(N0,N1) :-
+	length(N0,L0),
+	length(N1,L1),
+	L11 is L1-1,
+	L12 is L1+1,
+	L11 < L0,
+	L0 < L12.
+
+% sentence grammar that maps words to breasoning object names
+
+sentence(B) --> noun1(B1), space(B2), verb(B3), space(B4), noun1(B5), full_stop(B6), {foldr(append,[B1,[B2],[B3],[B4],B5,[B6]],B)}.
+noun1(B) --> determiner(B1), space(B2), noun(B3), {foldr(append,[[B1],[B2],[B3]],B)}.
+noun1(B) --> noun(B1), {foldr(append,[[B1]],B)}.
+
+determiner("down") --> "the".
+determiner("down") --> "an".
+determiner("down") --> "a".
+
+space("square") --> " ".
+full_stop("block") --> ".".
+
+noun("pear") --> "pear".
+noun("person") --> "mara".
+
+verb("pen") --> "wrote".
+verb("mouth") --> "addressed".
+
+br("down", [1,1,0]).
+br("square",[1,1,0]).
+br("block", [1,1,1]). +br("pear", [5,5,10]). +br("person",[50,30,180]). +br("pen", [15,0.5,0.5]). +br("mouth", [4,0,1]). + +% br with sent and alg + +% types + +% eg type_checker([c1(a,b),c2(a),c3(b,c)],Types). +% Types = [c1(atom, number), c2(atom), c3(number, string)]. + +type_checker([],[]). +type_checker([C1|C],[T1|T]) :- + functor(C1,C2,Ar), + (Ar=1-> + (arg(1,C1,A1), + type(A1,T2), + type(C2,T2), + functor(T1,C2,Ar),arg(1,T1,T2)); + (Ar=2, + arg(1,C1,A1),arg(2,C1,A2), + type(A1,T2),type(A2,T3), + type(C2,T2,T3), + functor(T1,C2,Ar),arg(1,T1,T2),arg(2,T1,T3))), + type_checker(C,T). + +type(a,atom). +type(b,number). +type(c,string). + +type(c1,atom,number). +type(c2,atom). +type(c3,number,string). + +% types with a sentence + +% eg types_to_sentence([c1(atom, number), c2(atom), c3(number, string), c4(string, atom)],S). +% S = "I converted the value from an atom to a number, checked it was an atom, then converted it to a string, then back to an atom." + +types_to_sentence([],[]). +types_to_sentence(T,S) :- + findall(T1b,(member(T1,T), + ((functor(T1,C2,2),arg(1,T1,A1),arg(2,T1,A2))->T1b=[C2,A1,A2]; + (functor(T1,C2,1),arg(1,T1,A1),T1b=[C2,A1]))),T2b), + t2s(T2b,[],S1,T2b),foldr(string_concat,S1,S),!. +t2s([],S,S,_). +t2s([T1|T2b],S1,S2,SS) :- + S1=[], + (T1=[_,A1,A2]->(det(A1,DA1),det(A2,DA2),foldr(string_concat,["I converted the value from ",DA1," ",A1," to ",DA2," ",A2,""],S3)); + (T1=[_,A1]->(det(A1,DA1),foldr(string_concat,["I checked the value was ",DA1," ",A1,""],S3)))), + append(S1,[S3],S4), + t2s(T2b,S4,S2,SS). + +t2s([T1],S1,S2,SS) :- + not(S1=[]), + (T1=[_,_A1,A2]->(det(A2,DA2),findall(X,member([_,X,_],SS),X1),(member(A2,X1)->W="back";W=""),foldr(string_concat,[", then ",W," to ",DA2," ",A2,"."],S3)); + (T1=[_,A1]->(det(A1,DA1),findall(X,member([_,X],SS),X1),(member(A1,X1)->W=" again";W=""),foldr(string_concat,[", then checked it was ",DA1," ",A1,W,"."],S3)))), + append(S1,[S3],S4), + t2s([],S4,S2,SS). 
+ +t2s([T1|T2b],S1,S2,SS) :- + not(S1=[]), + (T1=[_,_A1,A2]->(det(A2,DA2),foldr(string_concat,[", then converted it to ",DA2," ",A2,""],S3)); + (T1=[_,A1]->(det(A1,DA1),foldr(string_concat,[", checked it was ",DA1," ",A1,""],S3)))), + append(S1,[S3],S4), + t2s(T2b,S4,S2,SS). + + +det(atom,an). +det(number,a). +det(string,a). + +% 68 in total + + diff --git a/Philosophy/17 9 23.pl b/Philosophy/17 9 23.pl new file mode 100644 index 0000000000000000000000000000000000000000..4c9bbe1183020d88803efaa97380ac325c38bd9a --- /dev/null +++ b/Philosophy/17 9 23.pl @@ -0,0 +1,175 @@ +% 17 9 23 + +% 35-7=28 + +:-include('../Text-to-Breasonings/text_to_breasonings.pl'). + +% ["Fundamentals of Meditation and Meditation Indicators","FUNDAMENTALS OF MEDITATION by Lucian Green Hours Prayer 2 of 4.txt",0,algorithms,"11. I prepared to take care of the ducklings. + +/* +?- keep_on_explaining_until_it_makes_sense. +That's good. Are there any areas of the idea you still don't understand? (y/n) +|: y +That's good. Are there any areas of the idea you still don't understand? (y/n) +|: n +*/ + +keep_on_explaining_until_it_makes_sense :- + texttobr2_1(1), + writeln("That's good. Are there any areas of the idea you still don't understand? (y/n)"), + read_string(user_input,"\n\r","\n\r",_,S), + (S="y"->keep_on_explaining_until_it_makes_sense;true),!. + +/* +?- cycle_text("Good. Are there any further areas to mind map? (y/n)"). +Good. Are there any further areas to mind map? (y/n) +|: y +Good. Are there any further areas to mind map? (y/n) +|: n + +?- cycle_text("I see. Are there any other things you'd like to mention to me? (y/n)"). +I see. Are there any other things you'd like to mention to me? (y/n) +|: y +I see. Are there any other things you'd like to mention to me? (y/n) +|: n + +?- cycle_text("Well done. Are there any parts of language you are interested in? (y/n)"). +Well done. Are there any parts of language you are interested in? (y/n) +|: y +Well done. 
Are there any parts of language you are interested in? (y/n) +|: n + +?- cycle_text("Do you believe in duckling love? Do you love me too? (y/n)"). +Do you believe in duckling love? Do you love me too? (y/n) +|: y + +*/ + +cycle_text(Text) :- + texttobr2_1(1), + writeln(Text), + read_string(user_input,"\n\r","\n\r",_,S), + (S="y"->cycle_text(Text);true),!. + +% 28-8=20 + +% ["Short Arguments","Professor Algorithm.txt",0,algorithms,"9. I prepared to say it encouragingly. I did this by performing the verb. First, I loved you. Second, I loved it. Third, I wanted it."] + +/* +?- verb(Object,Adjective,Verb,Adverb,Room,Part_of_room,Direction,Time_to_do,Time_to_finish). +Object = book, +Adjective = intelligent, +Verb = read, +Adverb = studiously, +Room = library, +Part_of_room = (table), +Direction = n, +Time_to_do = [30, minutes], +Time_to_finish = [40, minutes] ; + +Object = computer, +Adjective = amazing, +Verb = read, +Adverb = carefully, +Room = bedroom, +Part_of_room = (table), +Direction = n, +Time_to_do = [60, minutes], +Time_to_finish = [120, minutes] ; + +Object = project, +Adjective = interesting, +Verb = complete, +Adverb = intricately, +Room = living_room, +Part_of_room = (table), +Direction = n, +Time_to_do = [120, minutes], +Time_to_finish = [5, minutes] ; + +Object = model, +Adjective = well_designed, +Verb = construct, +Adverb = thoughtfully, +Room = kitchen, +Part_of_room = (table), +Direction = e, +Time_to_do = [40, minutes], +Time_to_finish = [10, minutes] ; + +Object = chemistry_set, +Adjective = thought_provoking, +Verb = make_molecule, +Adverb = methodically, +Room = exam, +Part_of_room = (table), +Direction = n, +Time_to_do = [15, minutes], +Time_to_finish = [0, minutes]. +*/ + +verb(book,intelligent,read,studiously,library,table,n,[30,minutes],[40,minutes]). +verb(computer,amazing,read,carefully,bedroom,table,n,[60,minutes],[120,minutes]). +verb(project,interesting,complete,intricately,living_room,table,n,[120,minutes],[5,minutes]). 
+verb(model,well_designed,construct,thoughtfully,kitchen,table,e,[40,minutes],[10,minutes]). +verb(chemistry_set,thought_provoking,make_molecule,methodically,exam,table,n,[15,minutes],[0,minutes]). + +% 20 - 6 - 1 luciancicd = 13 + +% ["Medicine","MEDICINE by Lucian Green Less Depression 3 of 4.txt",0,algorithms,"22. I prepared to prevent bullying by teaching pedagogy, medicine and meditation to give skills to complete tasks with positive functionalism. I did this by providing Brass service as breasonings currency. First, I found the brass instrument. Second, I pressed my lips next against the mouthpiece. Third, I played a note. In this way, I prepared to prevent bullying by teaching pedagogy, medicine and meditation to give skills to complete tasks with positive functionalism by providing Brass service as breasonings currency."] + +/* +?- distance_greater(0,0,14,15,9). +false. + +?- distance_greater(0,0,14,15,200). +true. +*/ + +distance_greater(X1,Y1,X2,Y2,D) :- + D >= sqrt((Y2-Y1)^2 + (X2-X1)^2). + +/* +?- distance_greater3d(0,0,0,14,15,0,9). +false. + +?- distance_greater3d(0,0,0,14,15,0,200). +true. +*/ + +distance_greater3d(X1,Y1,Z1,X2,Y2,Z2,D) :- + D >= sqrt((Y2-Y1)^2 + (X2-X1)^2 + (Z2-Z1)^2). + +% lettering_spacing(10,3,S). +lettering_spacing(L,N,S) :- + S is L/N. + +% print_heading("Happy World",30,S). +% S = " H a p p y W o r l d ". + +% print_heading("EXCELLENT!",40,S). +% S = " E X C E L L E N T !". + +% print_heading("Lucian Green",18,S). +% S = "L uc ia n Gr ee n". + +% print_heading("1.. 2... 3...",15,S). +% S = "1.. 2. .. 3... ". + +:-include('../listprologinterpreter/la_maths.pl'). +:-include('../listprologinterpreter/la_strings_string.pl'). 
+ +print_heading(String,L,S) :- + string_length(String,SL1), + lettering_spacing(L,SL1,S1), + numbers(SL1,1,[],SLN), + string_strings(String,Strings), + findall([SLN1,S2],(member(SLN2,SLN), + get_item_n(Strings,SLN2,S2), + SLN1 is floor(SLN2*S1)),S3), + numbers(L,1,[],LN), + findall(S5,(member(LN1,LN), + (member([LN1,S4],S3)->S5=S4;S5=" ")),S6), + findall(S7,(member(S8,S6),atom_string(S7,S8)),S9), + string_chars(S,S9),!. + diff --git a/Philosophy/18 6 23.pl b/Philosophy/18 6 23.pl new file mode 100644 index 0000000000000000000000000000000000000000..7dc7f891ad484c4b170c7585d95a72aecad98349 --- /dev/null +++ b/Philosophy/18 6 23.pl @@ -0,0 +1,300 @@ +% 18 6 23.pl + +% 80 algs needed + +% ["Fundamentals of Pedagogy and Pedagogy Indicators","FUNDAMENTALS OF PEDAGOGY by Lucian Green Rebreathsonings 2 of 4.txt",0,algorithms,"17. The Asperger patient showed his mastery of sports statistics by remembering the match with the top score. He did this by driving someone up to the door in a tricycle. First, he started at the edge of the courtyard. Second, he drove through the courtyard. Third, he stopped at the door. In this way, the Asperger patient prepared to demonstrate his mastery of sports statistics by remembering the match with the top score by driving someone up to the door in a tricycle."] + +% Third, he stopped at the door. + +% update verifiers - notifies if an update is available + +%:- use_module(library(date)). + +% update_verifier("luciangreen","a"). +% update_verifier("luciangreen","b"). 
+ +update_verifier(User,Repository) :- + + get_time(TS),%stamp_date_time(TS,date(Year,Month,Day,Hour,Minute,_Seconda,_A,_TZ,_False),local), + repositories(List), + findall(TS1,(member([User,Repository,Date|_],List), + Date=[Year1,Month1,Day1,Hour1,Minute1], + date_time_stamp(date(Year1,Month1,Day1,Hour1,Minute1,0,_,_,_),TS1)),TS2), + sort(TS2,TS3), + append(_,[TS4],TS3), + (TS>=TS4-> + writeln([User,Repository,"is up to date."]); + writeln([User,Repository,"is not up to date."])),!. + +repositories( +[ +["luciangreen","a",[2023,6,1,0,0]], +["luciangreen","b",[2024,6,1,0,0]] +]). + +:-include('version.pl'). + +% version adder - given new lines, asks for version info and updates version + +% version_adder(["a","b"],"version.txt","version2.txt"). + +version_adder(Changes,File1,File2) :- + findall([Type1,Label],(member(Change,Changes), + %repeat, + writeln(["For change:",Change]), + writeln("Is the change an API change (a), a new feature (f) or a bug fix (b)?"), + read_string(user_input,"\n\r","\n\r",_,Type),atom_string(Type1,Type),(Type1=a->true;(Type1=f->true;Type1=b)), + writeln("Please enter a description of the change: "), + read_string(user_input,"\n\r","\n\r",_,Label) + ),Log), + + open_file_s(File1,File_term1), + append(File_term1,Log,File_term2), + version1(File_term2,0,0,0,A,F,B), + writeln(["Version",A,F,B]), + save_file_s(File2,File_term2). + +% 8 done + +% ["Short Arguments","Medicine - Quantum Box of Circulatory System 2.txt",0,algorithms,"4. I felt clear. I knew it was there, and moved it on. I meditated, giving my voluntary control over involuntary processes. I detected a piece of food in my blood vessel. It wasn't there later."] + +:-include('../listprologinterpreter/listprolog.pl'). + +% Medicine - Quantum Box of Circulatory System + +% accident_preventer_game. 
+
+accident_preventer_game :-
+ %tty_size(R,_C),
+ R=8,
+ numbers(R,1,[],Rs),
+ numbers(8,1,[],Ns),
+ random_permutation(Ns,Ns2),
+ findall(" ",member(_,Rs),L),
+ append([9],L,L1),
+ accident_preventer_game2(Ns2,L1).
+
+accident_preventer_game2(Ns,L) :-
+ %findall(_,(member(L3,L),writeln(L3),nl),_),
+ writeln(L),
+ append(_,[N1],L),
+ (not(N1=" ")->
+ writeln("Game Over - You Lost.");
+ %(Ns=[]->
+ ((not((member(N,L),number(N))),Ns=[])->
+ writeln("Game Over - You Won.");
+ (move_down(Ns,Ns1,L,L1),
+ accident_preventer_game2(Ns1,L1)))),!.
+
+move_down(Ns,Ns1,L0,L2) :-
+ keyboard_input(L0,L,T),
+ T1 is 0.5-T,
+ sleep(T1),
+ random(1,3,N1),
+ ((N1=1,not(Ns=[]))->
+ (append(L3,[_],L),
+ Ns=[N|Ns1],
+ append([N],L3,L2));
+ (append(L3,[_],L),
+ Ns=Ns1,
+ append([" "],L3,L2))),!.
+
+keyboard_input(L1,L2,T) :-
+ get_time(T1),
+
+(catch(call_with_time_limit(0.5,get_single_char(Code)),_,fail)->
+ (%trace,
+ get_time(T2),
+ (catch(number_codes(Num,[Code]),_,fail)->true;Num=65),
+ T is T2-T1,
+ %trace,
+ %get_single_char(Code),
+ (member(Num,L1)->
+ (get_n_item(L1,Num,N),
+ put_item_n(L1,N," ",L2)
+ );
+ L1=L2
+ ));
+ (L1=L2,T=0.5)),!.
+
+% I felt clear.
+
+% food_dissolver_game.
+
+food_dissolver_game :-
+ %tty_size(R,_C),
+ R=8,
+ numbers(R,1,[],Rs),
+ numbers(3,1,[],Ns),
+ random_permutation(Ns,Ns2),
+ findall(" ",member(_,Rs),L),
+ append([4],L,L1),
+ food_dissolver_game2(Ns2,L1).
+
+food_dissolver_game2(Ns,L) :-
+ %findall(_,(member(L3,L),writeln(L3),nl),_),
+ writeln(L),
+ append(_,[N1],L),
+ (not(N1=" ")->
+ writeln("Game Over - You Lost.");
+ %(Ns=[]->
+ ((not((member(N,L),number(N))),Ns=[])->
+ writeln("Game Over - You Won.");
+ (move_down_fd(Ns,Ns1,L,L1),
+ food_dissolver_game2(Ns1,L1)))),!.
+
+move_down_fd(Ns,Ns1,L0,L2) :-
+ keyboard_input_fd(L0,L,T),
+ T1 is 1-T,
+ sleep(T1),
+ random(1,3,N1),
+ ((N1=1,not(Ns=[]))->
+ (append(L3,[_],L),
+ Ns=[N|Ns1],
+ append([N],L3,L2));
+ (append(L3,[_],L),
+ Ns=Ns1,
+ append([" "],L3,L2))),!.
+
+keyboard_input_fd(L1,L2,T) :-
+ get_time(T1),
+
+(catch(call_with_time_limit(1,get_single_char(Code)),_,fail)->
+ (%trace,
+ get_time(T2),
+ (catch(number_codes(Num,[Code]),_,fail)->true;Num=65),
+ T is T2-T1,
+ %trace,
+ %get_single_char(Code),
+ (member(Num,L1)->
+ (%trace,
+ reverse(L1,L11),
+ get_n_item(L11,Num,N0),
+ length(L11,Le),
+ N is Le-N0+1,
+ N1 is Num-1,
+ (N1= 0->
+ put_item_n(L1,N," ",L2);
+ put_item_n(L1,N,N1,L2))
+ );
+ L1=L2
+ ));
+ (L1=L2,T=0.5)),!.
+
+% 19 done
+
+% I knew it was there, and moved it on.
+
+higher_lower :-
+ random(X),X1 is floor(X*100)/10,
+ higher_lower(9,X1).
+higher_lower(0,_X1) :- writeln("You lost."),!.
+higher_lower(N,X1) :-
+ guess(N1),
+ (N1=X1->writeln("Correct.");
+ ((N1<X1->writeln("Higher.");
+ writeln("Lower.")),
+ N2 is N-1,
+ higher_lower(N2,X1))),!.
+guess(N) :-
+ %repeat,
+ writeln("Please guess the number between 0 and 10."),
+ read_string(user_input,"\n\r","\n\r",_,N1),
+ %number(N1),
+ (number_string(N,N1)->true;guess(N)).
+
+% 23 done
+
+% I meditated, giving my voluntary control over involuntary processes.
+
+% meditation_chain_letter(Sample,Branching_factor,Levels_taken_to_fill_sample).
+
+% meditation_chain_letter(1,1,L).
+% L=2.
+
+% meditation_chain_letter(2,1,L).
+% L=3.
+
+% meditation_chain_letter(3,1,L).
+% L=6.
+
+meditation_chain_letter(S,N,L) :-
+ numbers(S,1,[],Ss),
+ findall((-),member(_,Ss),S1),
+ %random(X),X1 is floor(X*S),
+ S11 is S+2,
+ random(1,S11,X1),
+ meditation_chain_letter1(S,S1,N,X1,0,L).
+meditation_chain_letter1(S,S1,N,X1,L1,L2) :-
+ (forall(member(S0,S1),S0=(*))->L1=L2;
+ (numbers(N,1,[],Ns),
+ X11 is X1+2,
+ findall(X2,(member(_,Ns),random(1,X11,X2)%random(X),X2 is ceiling((X1+0)*X)+1
+ ),S2),
+ numbers(S,1,[],Ss),
+ findall(S5,(member(S3,Ss),get_item_n(S1,S3,S4),(member(S3,%S4,
+ S2)->S5=(*);S5=S4)),S6),
+ findall(_,member((*),S6),S7),
+ length(S7,X3),
+ L3 is L1+1,
+ meditation_chain_letter1(S,S6,N,X3,L3,L2))),!.
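meditation_chain_letter/3 above fills the sample by random recruitment, so its level count varies between runs. A deterministic sketch of the same growth idea (chain_letter_levels/3 is hypothetical, not in the source, and assumes every reached person recruits exactly Branching others per level, starting from one person):

```prolog
% chain_letter_levels(+Sample,+Branching,-Levels): hypothetical deterministic
% model of meditation_chain_letter/3 above. The number of people reached
% multiplies by (Branching+1) each level, starting from 1, and Levels counts
% the levels taken until the whole sample of size Sample is reached.
chain_letter_levels(Sample,Branching,Levels) :-
    chain_letter_levels1(1,Sample,Branching,0,Levels).
chain_letter_levels1(Reached,Sample,_Branching,Levels,Levels) :-
    Reached >= Sample, !.
chain_letter_levels1(Reached,Sample,Branching,Levels0,Levels) :-
    Reached1 is Reached*(Branching+1),
    Levels1 is Levels0+1,
    chain_letter_levels1(Reached1,Sample,Branching,Levels1,Levels).
```

For example, chain_letter_levels(8,1,L) doubles 1 -> 2 -> 4 -> 8, giving L = 3, whereas the randomised version above would return a different L on each run.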
+ +% 29 done + +% with 44 from nn.pl, 7 left + +% I detected a piece of food in my blood vessel. It wasn't there later. + +:-include('11 6 23.pl'). + +% create_check_cube(3,3,3). +% create_check_cube(4,4,4). +% create_check_cube(5,5,5). + +create_check_cube(Y_win,X_win,Z_win) :- + create_rectangular_prism_in_space(Y_win,X_win,Z_win,G),detect_data(G),prettyprint_3d_1_e1(G,Y_win,X_win,Z_win),!. + +create_rectangular_prism_in_space(X_min,Y_min,Z_min,Grid2) :- + make_grid_3d(X_min,Y_min,Z_min,Grid1), + random(1,3,N), + (N=1-> + (random(1,X_min,X1), + random(1,X_min,X2), + random(1,Y_min,Y1), + random(1,Y_min,Y2), + random(1,Z_min,Z1), + random(1,Z_min,Z2), + draw_cube_edges_3d(X1,Y1,Z1,X2,Y2,Z2,Grid1,Grid2)); + Grid1=Grid2). + +detect_data(Grid) :- + member([_X,_Y,_Z,M],Grid),member(1,M). + + /* + 1 2 3 4, 5 6 7 8 + + 1 111 + 2 211 + 3 121 + 4 221 + 5 112 + 6 212 + 7 122 + 8 222 + */ + +draw_cube_edges_3d(X1,Y1,Z1,X2,Y2,Z2,Grid1,Grid13) :- + line1_3d(X1,Y1,Z1,X2,Y1,Z1,Grid1,Grid2,1), + line1_3d(X2,Y1,Z1,X1,Y2,Z1,Grid2,Grid3,1), + line1_3d(X1,Y2,Z1,X2,Y2,Z1,Grid3,Grid4,1), + line1_3d(X2,Y2,Z1,X1,Y1,Z1,Grid4,Grid5,1), + line1_3d(X1,Y1,Z2,X2,Y1,Z2,Grid5,Grid6,1), + line1_3d(X2,Y1,Z2,X1,Y2,Z2,Grid6,Grid7,1), + line1_3d(X1,Y2,Z2,X2,Y2,Z2,Grid7,Grid8,1), + line1_3d(X2,Y2,Z2,X1,Y1,Z2,Grid8,Grid9,1), + line1_3d(X1,Y1,Z1,X1,Y1,Z2,Grid9,Grid10,1), + line1_3d(X2,Y1,Z1,X2,Y2,Z2,Grid10,Grid11,1), + line1_3d(X1,Y2,Z1,X1,Y2,Z2,Grid11,Grid12,1), + line1_3d(X2,Y2,Z1,X2,Y2,Z2,Grid12,Grid13,1). + diff --git a/Philosophy/19 3 23.pl b/Philosophy/19 3 23.pl new file mode 100644 index 0000000000000000000000000000000000000000..67663bff09873fb22c4637886a37d0a581e90007 --- /dev/null +++ b/Philosophy/19 3 23.pl @@ -0,0 +1,121 @@ +% 19 3 23.pl + +% 28 algs needed + +% ["Computational English","COMPUTATIONAL ENGLISH by Lucian Green Derivability 3 of 4.txt",0,algorithms,"23. I prepared to be perfectly healthy. I did this by writing 10 80-breasoning subjects to be with-it in a Medicine degree. 
First, I wrote the organ subjects. Second, I wrote the theological surgery subjects. Third, I wrote the organ sutra subjects. In this way, I prepared to be perfectly healthy by writing 10 80-breasoning subjects to be with-it in a Medicine degree."] + +:-include('../Text-to-Breasonings/text_to_breasonings.pl'). + +healthy(circulatory_system_and_cardiovascular_system):-random(X),X1 is round(10*X),texttobr2(X1).%if nothing else is wrong, is there anything wrong? +healthy(digestive_system_and_excretory_system):-random(X),X1 is round(10*X),texttobr2(X1).%if nothing else is wrong, is there anything wrong? +healthy(endocrine_system):-random(X),X1 is round(10*X),texttobr2(X1).%if nothing else is wrong, is there anything wrong? +healthy(integumentary_system_and_exocrine_system):-random(X),X1 is round(10*X),texttobr2(X1).%if nothing else is wrong, is there anything wrong? +healthy(immune_system_and_lymphatic_system):-random(X),X1 is round(10*X),texttobr2(X1).%if nothing else is wrong, is there anything wrong? +healthy(muscular_system):-random(X),X1 is round(10*X),texttobr2(X1).%if nothing else is wrong, is there anything wrong? +healthy(nervous_system):-random(X),X1 is round(10*X),texttobr2(X1).%if nothing else is wrong, is there anything wrong? +healthy(renal_system_and_urinary_system):-random(X),X1 is round(10*X),texttobr2(X1).%if nothing else is wrong, is there anything wrong? +healthy(reproductive_system):-random(X),X1 is round(10*X),texttobr2(X1).%if nothing else is wrong, is there anything wrong? +healthy(respiratory_system):-random(X),X1 is round(10*X),texttobr2(X1).%if nothing else is wrong, is there anything wrong? +healthy(skeletal_system):-random(X),X1 is round(10*X),texttobr2(X1).%if nothing else is wrong, is there anything wrong? + +% run daily + +run_daily:-findall(_,(healthy(A),writeln1([A,"healthy"])),_). + +run_for_each_place(N):-numbers(N,1,[],Ns),findall(_,(member(N1,Ns),writeln([place,N1]),run_daily,nl),_). 
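run_daily and run_for_each_place above iterate with numbers/4 from the included libraries; as used throughout these files, numbers(N,1,[],Ns) builds the list [1,2,...,N]. A minimal self-contained equivalent for readers without the include (numbers_list/2 is a hypothetical stand-in, not the library predicate):

```prolog
% numbers_list(+N,-Ns): hypothetical stand-in for numbers(N,1,[],Ns) as used
% above; builds [1,2,...,N] with the built-in numlist/3.
numbers_list(N,Ns) :- numlist(1,N,Ns).
```

So run_for_each_place(3) visits places 1, 2 and 3, running run_daily once per place.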
+ +br_n_calculator(Text,Br_n_required,Gl_br_n_required) :- + SepandPad="#@~%`$?-+*^,()|.:;=_/[]<>{}\n\r\s\t\\!'\"0123456789", + split_string(Text,"\n\r.","\n\r.",Text2), + length(Text2,L2), + writeln(["Number of lines and sentences:",L2]), + split_string(Text,SepandPad,SepandPad,Text3), + length(Text3,L3), + writeln(["Number of words:",L3]), + Gl_br_n_required is round(125*Br_n_required-L2), + writeln(["Number of GL breasonings required for",Br_n_required,"breasonings:",Gl_br_n_required]). + +find1(Text,String,List) :- find11(Text,String,[],List). +find11(Text,String,List1,List2):-not(sub_string(Text,_A,_B,_C,String)),append(List1,[Text],List2),!. +find11(Text,String,List1,List2):-sub_string(Text,A,_B,C,String),!,sub_string(Text,0,A,_,String2), +sub_string(Text,_,C,0,String3),append(List1,[String2,String],List3),find11(String3,String,List3,List2). + +%replace within words +% 19 algs done + +organ_sutra :- findall(_,(healthy(Organ),texttobr2(1),writeln(["Organ sutra for:",Organ])),_). + +theological_surgery :- findall(_,(healthy(Organ),texttobr2(1),writeln(["I am happy with my",Organ,"organ"])),_). + +%["Short Arguments","Part_of_Room.txt",0,algorithms,"10. I prepared to admit that it was this planet and that the sky was 1 metre above the ground. I did this by stating that where the planetary outpost was a building, the space station was itself. First, I noticed the building on a planet. Second, I noticed the space station in the sky. Third, I went from Earth, to the space station, to the planetary outpost."] + +% go1("Earth","planetary outpost"). + +go1(A,A):-!. +go1(A,B) :- link1(A,C),go1(C,B),!. +link1("Earth","space station"). +link1("space station","planetary outpost"). + +% 26 done + +% ["Time Travel","Interesting histories to visit 2.txt",0,algorithms,"43. The character from history wanted an optimised algorithm."] + +%minimise_dfa([[a,b],[a,c],[b,d],[c,d],[d,a]],Min_dfa). +%Min_dfa=[[a,b],[b,d],[d,a]] +/* +minimise_dfa(T,DFA) :- + minimise_dfa1(T,T,DFA),!. 
+ +minimise_dfa1([],DFA,DFA) :- !. +minimise_dfa1(Transitions,DFA1,DFA2) :- + %Transitions=[T2|T3], + %append(DFA1,[T2],DFA4), + DFA1=DFA4, + minimise_dfa(Transitions,%T2,T3, + DFA4,DFA3), + (DFA4=DFA3->DFA3=DFA2; + minimise_dfa1(DFA3,DFA3,DFA2)). + +minimise_dfa([],DFA,DFA) :- !. +minimise_dfa(Transitions,DFA1,DFA2) :- + Transitions=[T2|T3], + %append(DFA1,[T2],DFA4), + DFA1=DFA4, + minimise_dfa2(T2,T3,DFA4,DFA3), + minimise_dfa(T3,DFA3,DFA2). + +minimise_dfa2(T2,T3,DFA1,DFA2) :- + T2=[From,To], + ((minimise_dfa3(From,To,T3,[],DFA3), + %append(DFA3,[T2],DFA2), + subtract(T3,DFA3,DFA4), + append([T2],DFA4,DFA2) + )->true; + DFA1=DFA2),!. + +minimise_dfa3(From,To,T3,DFA1,DFA3) :- + T3=[T4|_T5], + T4=[From,To2], + member([From,To],T3), + %not(To2=To), + %DFA1=DFA3, + append(DFA1,[[From,To]],DFA3), + %minimise_dfa3(To1,To2,T5,DFA3,DFA2), + !. +minimise_dfa3(From,To,T3,DFA1,DFA2) :- + T3=[T4|T5], + T4=[From,To1], + member([To,To2],T3), + append(DFA1,[T4],DFA3), + %DFA1=DFA3, + minimise_dfa3(To1,To2,T5,DFA3,DFA2),!. +minimise_dfa3(From,To,T3,DFA1,DFA2) :- + T3=[T4|T5], + not(T4=[From,_To1]), + %member([To,To2],T3), + %append(DFA1,[T4],DFA3), + minimise_dfa3(From,To,T5,DFA1,DFA2),!. +*/ + + +%% 33 algs diff --git a/Philosophy/2 4 23.pl b/Philosophy/2 4 23.pl new file mode 100644 index 0000000000000000000000000000000000000000..9e78131b54e6f55b4045ca2c751deb9b9509b049 --- /dev/null +++ b/Philosophy/2 4 23.pl @@ -0,0 +1,224 @@ +% 2 4 23.pl + +% 42 algs + +:-include('26 3 23.pl'). +%:-include('1 4 23.pl'). + +% 31. I prepared to connect the students' thoughts together. I did this by writing 50 breasonings per connection between sentences in philosophy. First, I constructed the train track. Second, I placed the train on it. Third, I let the train drive on the track. In this way, I prepared to connect the students' thoughts together by writing 50 breasonings per connection between sentences in philosophy. + +% distribute thoughts, find jump lists + +person1(anita). 
+person1(john). +person1(peter). +person1(susan). +person1(susie). +person1(aaron). +person1(hamilton). +person1(albert). +person1(bei-en). +person1(heshan). +person1(manaia). + +% connect_thoughts(A),writeln(A). +% [[3,heshan],[12,albert],[21,albert],[30,aaron],[39,bei-en],[48,albert],[57,albert],[66,susan],[75,manaia],[84,susie],[93,anita]] + +connect_thoughts(L2) :- + distribute_thoughts(L), + find_jump_lists(L,L2). + +distribute_thoughts(L) :- + findall(N,person1(N),N1), + numbers(100,1,[],Ns), + findall([N2,P],(member(N2,Ns),random_member(P,N1)),L). + +find_jump_lists(L1,L2) :- + random_member(Start,[1,2,3,4,5,6,7,8,9,10]), + random_member(Jump,[2,3,4,5,6,7,8,9]), + jump(Start,Jump,100,L1,[],L2). + +jump(Start,_Jump,Limit,_L1,L2,L2) :- Start >= Limit,!. +jump(Start,Jump,Limit,L1,L21,L22) :- + member([Start,Name],L1), + append(L21,[[Start,Name]],L23), + Start2 is Start+Jump, + jump(Start2,Jump,Limit,L1,L23,L22),!. + +% 17 + +% 1a. I prepared to reassure Hamlet. I did this by holding the skull. First, the clown entertained me. Second, I was at the deathbed of the clown. Third, I held the skull of the clown when digging the grave for the girl. In this way, I prepared to reassure Hamlet by holding the skull. + + +/* +closer(0,0,0,1,2,0). true. +closer(0,0,0,1,0.2,0). +false. +*/ + +closer(X1,Y1,X2,Y2,X3,Y3) :- + D1 is sqrt((X2-X1)^2+(Y2-Y1)^2), + D2 is sqrt((X3-X1)^2+(Y3-Y1)^2), + D1 < D2. + +% 11. I prepared to experience the art forms of God (the master). I did this by trusting God (the master). First, I trusted the art of the master. Second, I trusted the music of the master. Third, I trusted the architecture of the master. In this way, I prepared to experience the art forms of God (the master) by trusting God (the master). + +/* +?- house_cross_section([r1,r2,r3],[1,1.5],A). +A = [[r1, 1], [r2, 1], [r3, 1]]. + +?- house_cross_section([r1,r2,r3],[1,1.5],A). +A = [[r1, 1.5], [r2, 1], [r3, 1]]. 
+*/ +house_cross_section(Rs,Ls,A) :- + findall([R,L],(member(R,Rs),random_member(L,Ls)),A). + +% 23 + +% 21. I prepared to scroll the text down and add a new conclusion at the vanishing point on the horizon. I did this by concluding a new conclusion from two conclusions. First, I read the first conclusion. Second, I read the second conclusion. Third, I wrote a third conclusion formed from the first clause in the first conclusion and the second clause in the second conclusion. In this way, I prepared to scroll the text down and add a new conclusion at the vanishing point on the horizon by concluding a new conclusion from two conclusions. + +setting1([ +"( ) ( ) () ( ) ( ) ()", +"^ ^ ^ ^ ^", +". . .. . . ... . .. . .", +"( ) ( ) () ( ) ( ) ()", +"^ ^ ^ ^ ^", +". . .. . . ... . .. . .", +"( ) ( ) () ( ) ( ) ()", +"^ ^ ^ ^ ^", +". . .. . . ... . .. . ."]). + +window_x(7). +window_y(7). + +inkey_pic :- + window_x(R), + window_y(C), + setting1(S), + verify(S,X1,Y1), + convert_to_matrix(S,X1,Y1,Matrix), + + tty_size(R1,_C), + X is 1,%round(C/2), + Y is R-2,%round(R/2), + move(R1,C,R,1,X1,1,Y1,X,Y,Matrix). + +verify(Plane,X1,Y1) :- + %length(Grid,Z1), + + %member(Plane,Grid), + length(Plane,Y1), + %forall(member(Plane1,Grid), + %length(Plane1,Y1)), + + member(Line,Plane), + string_length(Line,X1), + forall(member(Line1,Plane), + string_length(Line1,X1)). + +convert_to_matrix(Grid,X1,Y1,Matrix) :- + %numbers(Z1,1,[],Zs), + findall([X,Y,Char],(%member(Z,Zs), + %get_item_n(Grid,Z,Plane), + numbers(Y1,1,[],Ys), + %findall(*,( + member(Y,Ys), + get_item_n(Grid,Y,Line), + string_chars(Line,Chars), + numbers(X1,1,[],Xs), + %findall(*,( + member(X,Xs), + get_item_n(Chars,X,Char)),Matrix). 
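convert_to_matrix/4 above flattens a grid of equal-length strings into [X,Y,Char] cells, which move/10 and tty_put101/6 then look up by coordinate. A self-contained sketch of the same encoding (grid_cells/2 is hypothetical, using only built-ins in place of numbers/4 and get_item_n/3):

```prolog
% grid_cells(+Lines,-Cells): hypothetical restatement of the encoding used by
% convert_to_matrix/4 above. Each cell becomes [X,Y,Char], with X the 1-based
% column and Y the 1-based row, emitted in row-major order.
grid_cells(Lines,Cells) :-
    findall([X,Y,Char],
        (nth1(Y,Lines,Line),        % pick each row in order
         string_chars(Line,Chars),  % split the row into characters
         nth1(X,Chars,Char)),       % pick each column in order
        Cells).
```

For example, grid_cells(["ab","cd"],C) gives C = [[1,1,a],[2,1,b],[1,2,c],[2,2,d]], the same row-major [X,Y,Char] list convert_to_matrix/4 produces.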
+ +move(R,Win_X,Win_Y,X_min,X_max,Y_min,Y_max,X,Y,Matrix) :- + %tty_goto(X, Y), + tty_put101(%'*', + Win_X,Win_Y, + R, + X,Y,Matrix), + %1 + %write('*'),%, 1), + read_key(Atom), + (Atom=other->true; + (move1(Win_X,Win_Y,X_min,X_max,Y_min,Y_max,Atom,X,Y,X1,Y1), + move(R,Win_X,Win_Y,X_min,X_max,Y_min,Y_max,X1,Y1,Matrix))),!. + +move1(_Win_X,Win_Y,_X_min,_X_max,Y_min,_Y_max,up,X,Y,X,Y2) :- + Y1 is Y-1, + (Y1 is Y_min+3 + ->Y2=Y;Y2=Y1). +move1(_Win_X,_Win_Y,_X_min,_X_max,_Y_min,Y_max,down,X,Y,X,Y2) :-%trace, + Y1 is Y+1, + (Y1 is Y_max%+Win_Y%+1 + ->Y2=Y;Y2=Y1). +move1(_Win_X,_Win_Y,X_min,_X_max,_Y_min,_Y_max,left,X,Y,X2,Y) :- + X1 is X-1, + (X1 is X_min-1->X2=X;X2=X1). +move1(Win_X,_Win_Y,_X_min,X_max,_Y_min,_Y_max,right,X,Y,X2,Y) :- + X1 is X+1, + (X1 is X_max-Win_X+1 + ->X2=X;X2=X1). + +tty_put101(%C, + Win_X,Win_Y, + R,X,Y,Matrix) :- + numbers(R,1,[],Rs), + findall(_,(member(_,Rs),nl),_), + + Y1 is Y+1,%+2*Win_Y, + X1 is X+Win_X, + numbers(Y1,1,[],Ys), + numbers(X1,X,[],Xs), + findall(_,(member(Y2,Ys), + findall(_,(member(X2,Xs),member([X2,Y2,C],Matrix), + write(C)),_),writeln("")),_), + + %findall(_,(member(_,Xs),write(' ')),_), + %write(C), + Y11 is R-Win_Y,%Y,%Y1+1, + numbers(Y11,1,[],Y1s), + findall(_,(member(_,Y1s),nl),_). + +% 35 + +% ["Green, L 2021, Exploring opposites in Hamlet 4 of 4, Lucian Academy Press, Melbourne.","Green, L 2021",1,"COMPUTATIONAL ENGLISH + +% 31. I prepared to declare Hamlet a success. I did this by agreeing with Hamlet. First, I observed Hamlet avoid all the action. Second, I observed Hamlet avoid suicide. Third, I observed Hamlet avoid becoming a murderer. In this way, I prepared to declare Hamlet a success by agreeing with Hamlet. + +/* +?- evader(5,N1),writeln(N1). +N1 = [[6, 6], [1, 6], [5, 2], [4, 3], [2, 1]]. + +?- evader(6,N1),writeln(N1). +false. + +?- evader(5,N1),writeln(N1). +N1 = [[3, 1], [3, 6], [3, 3], [5, 6], [3, 3]]. + +?- evader(6,N1),writeln(N1). +N1 = [[7, 3], [4, 7], [1, 1], [5, 1], [8, 5], [1, 6]]. 
+*/
+
+evader(N,N1) :-
+ evader(N,[],N1).
+evader(0,N1,N1) :- !.
+evader(N,N0,N1) :-
+ random_member(X,[1,2,3,4,5,6,7,8]),
+ random_member(Y,[1,2,3,4,5,6,7,8]),
+ XM is X-1,
+ XP is X+1,
+ YM is Y-1,
+ YP is Y+1,
+ not(member([XM,YM],N0)),
+ not(member([XM,Y],N0)),
+ not(member([XM,YP],N0)),
+ not(member([X,YM],N0)),
+
+ not(member([X,YP],N0)),
+ not(member([XP,YM],N0)),
+ not(member([XP,Y],N0)),
+ not(member([XP,YP],N0)),
+ append(N0,[[X,Y]],N01),
+ NN is N-1,
+ evader(NN,N01,N1).
+
 diff --git a/Philosophy/24 3 23.pl b/Philosophy/24 3 23.pl new file mode 100644 index 0000000000000000000000000000000000000000..4a25406d1af4757f39224b6c1d4bb510fb0e9ab5 --- /dev/null +++ b/Philosophy/24 3 23.pl @@ -0,0 +1,170 @@ +% 24 3 23.pl
+
+% 42 algs
+:-include('../listprologinterpreter/listprolog.pl').
+
+% 1. I started the web business in the future and the present. There were a set number of places. *SSI in C was needed for more places. I had fifty As, or at least 50 breasonings for each conclusion.
+
+% ?- ssi_in_c(S).
+% S = 60.
+
+ssi_in_c(Sum) :-
+ X_max is 3,
+ Y_max is 4,
+ Z_max is 5,
+ numbers(X_max,1,[],Xs),
+ numbers(Y_max,1,[],Ys),
+ numbers(Z_max,1,[],Zs),
+
+ % define array int a[X][Y][Z]
+
+ findall(Ys1,(member(_Z,Zs),
+ findall(Xs1,(member(_Y,Ys),
+ findall(Xn,(member(_X,Xs), Xn=1),Xs1)),Ys1)),Zs1),
+
+ % count all values in array
+
+ flatten(Zs1,Zs2),sum(Zs2,0,Sum).
+
+% 1. The bot transcribed the text.
+
+% ocr1(T).
+% T = [l, -, *].
+ocr1(Text):-ocr([l-[['*','-'],['*','*']],i-[['*','-'],['*','-']],v-[['*','*'],['*','-']],'bs'-[['*','-'],['-','*']],'/'-[['-','*'],['*','-']],'.'-[['-','-'],['*','-']],'-'-[['*','*'],['-','-']],'_'-[['-','-'],['*','*']],','-[['-','*'],['*','*']],'\''-[['-','*'],['-','-']],'*'-[['*','*'],['*','*']]],[[['*','-'],['*','*']],[['*','*'],['-','-']],[['*','*'],['*','*']]],Text).
+
+ocr(Characters,Graphics,Text) :-
+ findall(C,(member(G,Graphics),member(C-G,Characters)),Text).
+
+% 1.
The symbols for all, there exists, ->, <->, ^, v and ~ (not) need to be updated to account for language's semantic properties. *For example, there exist 3 (the symbol there exists! means there exists 1). The symbol ~avb could represent a although b because 'a although' implies ~a and 'although b' implies b, hence ~avb. + +a1(A):-a(A),!. +a(1). +a(2). +a(3). + +% 11. I prepared to reduce psychiatric costs. I did this by preventing etiological phenomena (the virus AIDS, in fact maintaining human happiness by using a condom, in fact abstaining from sex). *First, I stood at the start of the path. Second, I started walking along the path. Third, I walked along the path until the end of it. In this way, I prepared to reduce psychiatric costs by preventing etiological phenomena (the disease AIDS, in fact maintaining human happiness by using a condom, in fact abstaining from sex). + +% p("* ** ** ",PS). % PS = [1, 3, 7]. + +p(String,PS) :- + string_codes(String,String2), + split_on_substring117(String2,` `,[],List1), + p2(List1,1,[],PS). +p2([],_,L,L) :- !. +p2(List,N1,L1,L2) :- + List=[L|Ls], + (L=" "-> + (N3 is N1+1, + p2(Ls,N3,L1,L2))),!. +p2(List,N1,L1,L2) :- + List=[L|Ls], + (string_length(L,LN), + append(L1,[N1],L3), + N3 is N1+LN, + p2(Ls,N3,L3,L2)). + +% 21. I prepared to eat the tofu nuggets. I did this by preventing animal products from being produced (I cooked the tofu). First, I placed the tofu in the wok. Second, I started to cook it. Third, I stopped cooking it after 4 minutes. In this way, I prepared to eat the tofu nuggets by preventing animal products from being produced (I cooked the tofu). + +%bar_with_time(0,10). +bar_with_time(F,F) :- !. +bar_with_time(Now,F) :- + sleep(1), + bar_n(Now,F), + Now1 is Now+1, + bar_with_time(Now1,F). +bar_n(N,M) :- + P is round((N/M)*10), + P1 is round((N/M)*100), + numbers(P,1,[],N1), + findall("*",member(_,N1),B), + write(B),write(" "),write(P1),writeln("%"). + +% 31. I prepared to become a web fan. 
*I did this by running a viral algorithm. First, I breasoned out 5 As. Second, I performed 45 brain thoughts after meditating using 80 lucian mantras (drawing an illustration). Third, I dotted on a grid, making, doing, having, time to prepare for the video web site counter, a large cloud to protect oneself from the wires being felt, a non-famous wire and a famous wire. In this way, I prepared to become a web fan by running a viral algorithm. + +% viral_poster_hands_train(It). +% It = [1,2,3] + +% once the 4*50 As for a work has been completed, viral As stretching out from each side are completed (where one imagines virality to come from), like hands, and this reminded me of spiritual hands on University posters, that guide the eyes around the poster, like a train. + +viral_poster_hands_train(Itin4) :- + Itin=[], + point(1,Itin,Itin2), + point(2,Itin2,Itin3), + point(3,Itin3,Itin4). + +point(A,I1,I2) :- + append(I1,[A],I2). + +% 1. After reading a page on Agnès van Rees, the Director of the project Characteristics of Argumentative Discourse found using the Yahoo! search criteria of 'narratology, contention and characteristics', I became interested in resolution of differences of opinion in everyday discussion. + +% side_determiner([1,1,2,3,4,1],Ave). +% 1=agreeing, 10=disagreeing + +side_determiner(Closer_to_agreeing_or_disagreeing,Ave) :- + sum(Closer_to_agreeing_or_disagreeing,0,Sum), + Ave is Sum/10. + +% 11. I prepared to be a meditation (philosophy) teacher. I did this by virtuously running up the pole. First, I stepped on the first rung of the stairs on the pole. Second, I prepared to walk to the next step. Third, I repeated this until I had ran up the pole. In this way, I prepared to be a meditation (philosophy) teacher by virtuously running up the pole. + +create_edu_pol(A,D):- + writeln("Are there any more contentions? 
(y/n)"),
+	read_string(user_input,"\n\r","\n\r",_,C),
+	(C="y"->
+	(writeln("What is the binary contention?"),
+	read_string(user_input,"\n\r","\n\r",_,B),
+	writeln("What is your opinion (1-strongly disagree, 5-strongly agree)?"),
+	read_string(user_input,"\n\r","\n\r",_,O),
+	append(A,[[B,O]],E),
+	create_edu_pol(E,D));
+	(A=D,
+	writeln(D))),!.
+
+% 21. I prepared to observe the mouse man ascend the group of slopes. *I did this by observing the mouse man run up the slope. First, I observed the mouse man stand at the foot of the slope. Second, I observed the mouse running up the slope. Third, I observed the mouse stop at the top of the slope. In this way, I prepared to observe the mouse man ascend the group of slopes by observing the mouse man run up the slope.
+
+% find_peak_height(10,"ddd-uu-d-d",H).
+find_peak_height(H1,S,H) :-
+	string_codes(S,C),
+	fph(H1,C,[],H2),
+	sort(H2,H3),
+	append(_,[H],H3),!.
+fph(_,[],H,H) :- !.
+fph(H1,C,H5,H6) :-
+	C=[H2|H3],
+	([H2]=`-` ->
+	H4=H1;
+	([H2]=`u` ->
+	H4 is H1+1;
+	([H2]=`d` ->
+	H4 is H1-1))),
+	append(H5,[H4],H7),
+	fph(H4,H3,H7,H6),!.
+
+% COMPUTATIONAL ENGLISH by Lucian Green Analysing characteristics of arguments 4 of 4.txt
+
+% 31. I prepared to observe the volunteer vitiate the community. I did this by observing the volunteer teach meditation (philosophy of music). First, she gave a copy of the degree to her student. Second, her student breasoned out the degree. Third, the volunteer taught her student meditation (philosophy of music) as she had been taught. In this way, I prepared to observe the volunteer vitiate the community by observing the volunteer teach meditation (philosophy of music).
+
+% check_text_with_ontologies(A),writeln(A).
+% A=[[The,person,was,good],[The,person,was,happy],[The,person,was,updated],[The,phone,was,good],[The,phone,was,updated],[The,dog,was,good],[The,dog,was,happy]] + +check_text_with_ontologies(Texts) :- + % find text + findall(B1,subject(B1),B), + findall(D1,object(D1),D), + findall(["The",A,"was",C],(member(A,B),member(C,D)),E), + % check text + findall(J1,member(J1),J), + findall([G3,G,G1,H],(member([G3,G,G1,H],E),member([G,H],J)),Texts). + +subject(person). +subject(phone). +subject(dog). +object(good). +object(happy). +object(updated). +member([person,good]). +member([phone,good]). +member([dog,good]). +member([person,happy]). +member([dog,happy]). +member([person,updated]). +member([phone,updated]). diff --git a/Philosophy/25 3 23.pl b/Philosophy/25 3 23.pl new file mode 100644 index 0000000000000000000000000000000000000000..790ee6188dc24c482264638c583562fa6b03ee26 --- /dev/null +++ b/Philosophy/25 3 23.pl @@ -0,0 +1,125 @@ +% 25 3 23.pl + +:-include('../listprologinterpreter/listprolog.pl'). + +% 42 algs + +% 1. Computational English uses techniques to disseminate texts hermeneutically (interpret them). + +% make_connection(Subject,Object,Meronym). + +make_connection("computer","file","stored"). +make_connection("screen","image","displayed"). +make_connection("artist","diagram","drew"). +make_connection("artist","diagram","modified"). +make_connection("philosopher","connection","made"). +make_connection("computer scientist","algorithm","wrote"). +make_connection("computer scientist","algorithm","debugged"). +make_connection("computer scientist","algorithm","saved"). +make_connection("computer scientist","algorithm","loaded"). +make_connection("computer scientist","algorithm","deleted"). + +% 11.I prepared to store the positive Computational English phenomena in a box. I did this by finding the positive results in Computational English. First, I wrote down the first leg of the path. Second, I prepared to write down the next leg of the path. 
Third, I repeated this until the correct result had been positively found. In this way, I prepared to store the positive Computational English phenomena in a box by finding the positive results in Computational English. + +negative_result(Subject,Object,Meronym) :- + make_connection(Subject,Object,Meronym,(-)). + +make_connection("computer","file","cooled",(-)). +make_connection("screen","image","produced a smell from",(-)). +make_connection("artist","diagram","wrote",(-)). +make_connection("artist","diagram","heated",(-)). +make_connection("philosopher","connection","wore",(-)). +make_connection("computer scientist","algorithm","tasted",(-)). +make_connection("computer scientist","algorithm","touched",(-)). +make_connection("computer scientist","algorithm","smelled",(-)). +make_connection("computer scientist","algorithm","ignored",(-)). +make_connection("computer scientist","algorithm","was blinded by the",(-)). + +% 21.I prepared to write the book. I did this by writing the text. First, I thought of a topic. Second, I wrote the reason. Third, I wrote an inference between the reason and the conclusion. In this way, I prepared to write the book by writing the text. + +% popular_links([[1,2],[1,3],[1,4],[2,5],[3,5],[4,6],[5,7],[6,7]],Trail). +% Trail = [[1, 5, 7]] + +popular_links(Transitions,Trail) :- + N = 1, + pl(N,Transitions,[N],Trail). +pl(N,Transitions,Trail1,Trail2) :- member([N,N1],Transitions),not(member([N1,_],Transitions)),append(Trail1,[N1],Trail2),!. 
+pl(N,Transitions,Trail1,Trail2) :-
+	% find all linked to states
+findall([C1,C],(member([N,C1],Transitions),member([C1,C],Transitions)),E),
+	sort(E,K),
+
+	% find frequency of occurrences of states
+findall([J,G],(member([_,G],K),findall(G,member([_,G],E),H),length(H,J)),M4),
+	sort(M4,M5),
+
+	% find most popular links
+	append(_,[[L1,_M6]],M5),
+	findall(M7,member([L1,M7],M5),M8),
+
+	% follow most popular links
+	findall(Trail5,(member(M9,M8),
+	append(Trail1,[M9],Trail3),
+	pl(M9,Transitions,[],Trail4),
+	append(Trail3,Trail4,Trail5)),Trail2).
+
+% 27
+
+% 31. I prepared to discover the protein code. I did this by writing that Bioschemistry (sic) referred to computational biochemistry, the study of the computational nature of proteins. First, I discovered the first biochemical reaction. Second, I prepared to discover the next biochemical reaction. Third, I repeated this until I had discovered all the biochemical reactions and the desired result had been achieved. In this way, I prepared to discover the protein code by writing that Bioschemistry (sic) referred to computational biochemistry, the study of the computational nature of proteins.
+
+%maze([[1,2],[1,3],[2,4],[2,5],[5,6],[5,7]],1,7).
+
+maze(_,F,F) :- !.
+maze(Ts,I,F) :-
+	member([I,N],Ts),maze(Ts,N,F).
+
+% 1a. Will Computational English be like a calculator, as in people should understand the theory before relying on it? Advantages: will allow focus on more material, as more data can be analysed (information would have to be presented in such a way as to allow reviewing in the best way possible). User will be able to program using Conglish's (Computational English) features (write essay, fill out grammar to recognise sentences with the same meaning, compressing the program e.g. 2 identical conclusions compressed to 1 conclusion, with reasons from both attached to it, action to take based on arriving at a conclusion e.g. return a telephone call, cook a meal, clean up or write a poem).
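The maze/3 walker can be exercised with a small query; the following is a self-contained copy (renamed maze2/3 so it can sit beside the original without clashing), showing that the transition pairs travel in a single list argument.

```prolog
% Self-contained copy of the maze/3 depth-first walker, renamed maze2/3.
% The transitions are passed as ONE list, so a query reads
% maze2([[From,To],...],Start,Finish).
maze2(_, F, F) :- !.
maze2(Ts, I, F) :-
    member([I, N], Ts),   % pick any transition leaving I
    maze2(Ts, N, F).      % and continue from its target

% ?- maze2([[1,2],[1,3],[2,4],[2,5],[5,6],[5,7]],1,7).
% true.
```

On the example transitions the walker backtracks out of the dead ends 4 and 6 before finding the path 1-2-5-7.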
+ +%model_to_result(1,1,2,2,2,R). +% R = 4 +model_to_result(A1,B1,R1,A2,B2,R2) :- + ind(A1,B1,R1,Op), + interpret(A2,B2,Op,R2). + +%ind(1,1,2,Op). +ind(A,B,R,Op) :- + R is A+B, +Op=(+). + +interpret(A,B,Op,R) :- Op=(+),R is A+B. + +% 34 + +% 11. I prepared to write 'I ate the apple' and 'I ate the apple, therefore I gained energy', therefore 'I gained energy'. I did this by writing that 'and' in Ball Prolog is represented by the ball travelling forwards. First, I observed that the ball started at the position of the first proposition. Second, I watched it roll along the track, representing the conjunction. Third, I observed that the ball finished at the position of the second proposition. In this way, I prepared to write 'I ate the apple' and 'I ate the apple, therefore I gained energy', therefore 'I gained energy' by writing that 'and' in Ball Prolog is represented by the ball travelling forwards. + +% modus_ponens('I ate the apple', 'I ate the apple'-'I gained energy', B). +% B = 'I gained energy'. + +modus_ponens(A,A-B,B). + +% Computational English is like a Calculator 3 of 4 + +% 21. I prepared to perform computations on the ontology, step by step. I did this by writing an ontology, in other words, the data structure, containing the arguments in order. First, I wrote words, the arguments down in the ontology. Second, I wrote the grammar, the order of the arguments under the words, the arguments. Third, I wrote the words, the arguments, in the order of action in the ontology. In this way, I prepared to perform computations on the ontology, step by step by writing an ontology, in other words, the data structure, containing the arguments in order. + +% summarise_structure([a,[b,c,[d,[e]]]],[],[],R). +% R = [a, [b, c, [d, [e]]]]. + +% summarise_structure([a,[b,c,[d,[e]]]],[a,e],[],R). % R = [b, c, [d]] + +% summarise_structure([a,[b,c,[d,[e]]]],[a,d],[],R). +% R = [b, c, [e]] + +summarise_structure([],_D,R,R) :- !. 
+summarise_structure([A|B],D,R1,R2) :- + not(length(B,1)), + (member(A,D)->R1=R3; + append(R1,[A],R3)), + summarise_structure(B,D,R3,R2),!. +summarise_structure([A,B],D,R1,R2) :- + (member(A,D)-> + (summarise_structure(B,D,R1,R2)); + (not(member(A,D))-> + (append(R1,[A],R3), + summarise_structure(B,D,[],R4), + (R4=[]->R2=R3; + append([R3,[R4]],R2))))). diff --git a/Philosophy/29 3 24.pl b/Philosophy/29 3 24.pl new file mode 100644 index 0000000000000000000000000000000000000000..b58fbf9e915bd2527e6a53de0318d6844bd8af12 --- /dev/null +++ b/Philosophy/29 3 24.pl @@ -0,0 +1,63 @@ +% 29 3 23.pl + +% ["Creating and Helping Pedagogues","CREATE AND HELP PEDAGOGUES by Lucian Green Daily Professional Requirement of the Pedagogy Helper 3 of 5.txt",0,algorithms,"23. The self examined that the work was finished on time. The self knew that pedagogy Aigs helpers helped. The self observed the helper help the self. The self saw the second helper help. The second helper also helped the self."] + +% The self examined that the work was finished on time. +% The day's life accreditation and work breasonings were completed each day. + +% calculates needed background helping breasonings according to one's time travel schedule and whether new breasonings will be written + +%:- use_module(library(date)). + +:-include('../listprologinterpreter/listprolog.pl'). 
+ +% (guess before 12PM x) br from now to tt (poss after walk, nap) +% continue using alg, ask whether in 2024 or 5689, if 2024 2*80 br, if 5689, now-11 AM next day + +schedule :- + +get_time(TS),stamp_date_time(TS,date(YearValue,MonthValue,DayValue,HourValue,MinuteValue,_SecondsValue1,_A,_TZ,_False),local), + + foldr(string_concat,["Scheduler\n\n","The time now is ",HourValue,":",MinuteValue,".\n\n"],B), + writeln(B), + + writeln("Will you go for a walk (y/n)?"), + read_string(user_input,"\n\r","\n\r",_,S1), + (S1="y"->(writeln("How many minutes will it take (including preparations)?"), + read_string(user_input,"\n\r","\n\r",_,S2), + number_string(N2,S2));N2=0), + + writeln("Will you have a nap afterwards (y/n)?"), + read_string(user_input,"\n\r","\n\r",_,S3), + (S3="y"->(writeln("How many minutes will it take?"), + read_string(user_input,"\n\r","\n\r",_,S4), + number_string(N4,S4));N4=0), + %trace, + C is 60*(N2+N4), + + %date_time_stamp(date(YearValue,DayValue,HourValue,MinuteValue,0,0,-,-),Value1A), + D is TS%Value1A + +C, + + E is ceiling((N2+N4)/60), + F is E*5, + stamp_date_time(D,date(_,_,_,HourValue1,MinuteValue1,_SecondsValue,_A1,_TZ1,_False1),local), +foldr(string_concat,["I assume you will time travel after ",HourValue1,":",MinuteValue1,".\n\n","You will need ",E,"*5=",F," breasonings.\n\n","Press when you have time travelled."],G), + writeln(G), + get_single_char(_), + writeln("You will need 160 breasonings at your first destination.\n\nPlease press when you have time travelled after this."), + get_single_char(_), + + get_time(TS1),stamp_date_time(TS1,date(_,_,_,HourValue2,_MinuteValue2,_,_,_,_),local), + + H is 11+(24-HourValue2), + J is H*5, + + foldr(string_concat,["\n\n","You will need ",H,"*5=",J," breasonings until 11 AM tomorrow.\n\n","Thank you."],K), + writeln(K), + !. 
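The breasoning arithmetic inside schedule/0 can be factored into a small helper. A minimal sketch, assuming the same rate used above (5 breasonings per hour, hours rounded up); breasonings_needed/3 is a hypothetical name, not part of the original code.

```prolog
% Sketch of the rate calculation in schedule/0: walk and nap minutes
% are rounded up to whole hours (E is ceiling((N2+N4)/60) above), then
% multiplied by the assumed rate of 5 breasonings per hour (F is E*5).
breasonings_needed(WalkMins, NapMins, Breasonings) :-
    Hours is ceiling((WalkMins + NapMins) / 60),
    Breasonings is Hours * 5.

% ?- breasonings_needed(30, 45, B).
% B = 10.
```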
+ +% 3 algs + +% paraph mr + diff --git a/Philosophy/3 6 23.pl b/Philosophy/3 6 23.pl new file mode 100644 index 0000000000000000000000000000000000000000..96940e60abbd223ac5373f9a3c65d2ffea3012dc --- /dev/null +++ b/Philosophy/3 6 23.pl @@ -0,0 +1,309 @@ +% 3 6 23.pl + +% 80 algs + +% % ["Fundamentals of Pedagogy and Pedagogy Indicators","FUNDAMENTALS OF PEDAGOGY by Lucian Green Part of Room 3 of 4.txt",0,algorithms,"25. The brainworks participant wrote breasonings (thought of X, Y and Z dimensions for objects) for a 'seen-as' essay based directly on secondary literature, and handed in an essay of his design. He did this by moving the arch from his toes. First, he bent down. Second, he put the arch over his toes. Third, he removed the arch. In this way, the brainworks participant wrote breasonings (thought of X, Y and Z dimensions for objects) for a 'seen-as' essay based directly on secondary literature, and handed in an essay of his design by moving the arch from his toes."] + +% He did this by moving the arch from his toes. + +% apologies to 19 3 23.pl + +/* + +minimise_dfa([[a,b],[b,c],[a,d],[d,c],[a,e],[e,c]],B). +B = [[a, b], [b, c]] . + +minimise_dfa([[a,b1],[b1,b2],[b2,c],[a,d1],[d1,d2],[d2,c],[a,e1],[e1,e2],[e2,c]],B). +B = [[a, b1], [b1, b2], [b2, c]] . + +minimise_dfa([[a,b1],[b1,b2],[b2,b3],[b3,c],[a,d1],[d1,d2],[d2,c],[a,e1],[e1,e2],[e2,c]],B). +B = [[a, b1], [b1, b2], [b2, b3], [b3, c], [a, d1], [d1, d2], [d2, c]]. + +*/ + +%:-include('../listprologinterpreter/listprolog.pl'). + +minimise_dfa(A,B) :- + minimise_dfa1(A,[],B). + + +minimise_dfa1([],DFA,DFA) :- !. +minimise_dfa1(Transitions,DFA1,DFA2) :- + %Transitions=[T2|T3], + %append(DFA1,[T2],DFA4), + minimise_dfa(Transitions,%T2,T3, + DFA1,DFA3), + (Transitions=DFA3->DFA3=DFA2; + minimise_dfa1(DFA3,[],DFA2)). + + +minimise_dfa([],B,B) :- !. +minimise_dfa(A,B,C) :- + A=[[D1,D2]|E], + findall(_,member([D1,_],A),D4), + length(D4,1), + append(B,[[D1,D2]],F), + minimise_dfa(E,F,C). 
+minimise_dfa(A,B,C) :- +%writeln(minimise_dfa(A,B,C)), + A=[[D1,D2]|E], + findall(_,member([D1,_],A),D4), + not(length(D4,1)), + %trace, + ((find_loops(D1,A,[],_C1%,1,N1 + ,[],L), + findall(H3,(member([_F,_G,H],L),length(H,LeH),LeH>=2,H=[_H1|H3]),H2), + foldr(append,H2,H4), + foldr(append,H4,H5), + %L=[L1|L2], + %foldr(append,L2,L3), + %find_loop(D2,E,[],C11,1,N2), + %N1=N2 + subtract(E,H5,C2%B1 + ), + append([[D1,D2]],C2,C3), + + (H5=[]->C=C3; + minimise_dfa(C3,[],C)) %E=C).%, +%, + %append(C2,L1,C) + )->true; + (%trace,%subtract(A,E,A1), + %append(A,B,B1), + minimise_dfa(E,B,C))). %E=C).%, + %minimise_dfa(B1,B,C). +%find_loop(_,_,C,X,X,C,N,N). +%find_loop(D2,E1,B,X,X,C,N,N) :- +% member([D2,D3],E1), +% append(B,[[D2,D3]],C). + +find_loop(D2,E1,B,_X1,X2,C,N1,N2) :- + N3 is N1+1, + member([D2,D3],E1),%subtract(E1,[[D2,D3]],E2), + %findall(_,member([D2,_],E1),D4), + %length(D4,1), + append(B,[[D2,D3]],B1), + X3=D3, + find_loop1(D3,E1,B1,X3,X2,C,N3,N2). +find_loop1(_D3,_E1,C,X,X,C,N,N). +find_loop1(D3,E1,B1,X1,X2,C,N3,N2) :- + find_loop(D3,E1,B1,X1,X2,C,N3,N2). + + +% find loops that are the same length +find_loops(A,B,_C,_D,_L1,L4) :- +%trace, + findall([D1,N,X2],find_loop(A,B,[]%C + ,[],X2,D1,1,N),L2), + findall([N,X21],member([_,N,X21],L2),X22), + sort(X22,X23), + findall([N,X,LN],(member([N,X],X23),findall(L,member([L,N,X],L2),LN)),L4).%, + %findall(N,member([_,N,_],L2),N1), + %N1=[N2|N3], + %forall(member(N4,N3),N4=N2), + %findall(L3,member([L3,_,_],L2),L4). +% (member(D1,L1)->L1=L2; +% (append(L1,[D1],L3), +% find_loops(A,B,C,D,L3,L2))). + +% 14 + +% First, he bent down. + +% check_dependencies([["a",[]]],[["a","a.pl",[["b"]]]]). + +% check_dependencies([["a",[]]],[["a","a.pl",[["include","b.pl"]]],["a","b.pl",[["c"]]]]). + +% check_dependencies([["a",[]]],[["a","a.pl",[["a",":-","b"],["b"]]]]). + +% check_dependencies([["a",["b"]]],[["a","a.pl",[["include","c.pl"],["init",":-",["d","e"]]]],["a","c.pl",[["f"]]],["b","h.pl",[["e",":-","f"]]]]). 
+ +% check_dependencies([["a",["b"]],["a",["b"]]],[["a","a.pl",[["include","c.pl"],["init",":-",["d","e"]]]],["a","c.pl",[["f"]]],["b","h.pl",[["e",":-","f"]]]]). + + +% removed init (first predicate from registry) + +check_dependencies(Registry,Files) :- + % check package manager registry file + findall(_,(member([R1%,P1 + ,Rs],Registry),append([R1],Rs,Rs2), + + % check included files + member(R,Rs2), + (findall(_,(member([R,_F,P],Files), + findall(_,(member(["include",F1],P), + check_file(F1,Files)),B), + length(P,L1),%length(B,L1), + findall(_,(member([P1],P),(string(P1)->P1=P2;P1=[P2,_,_]), + check_predicate([P2],Files)),A), + append(A,B,AB), + length(AB,L1) + ),_C) + %length(Files,L2),length(C,L2) + )),_D). + %length(Registry,L3),length(D,L3). + + % check predicates + +check_file(F1,Files) :- + findall(_,(member([_R,F1,P],Files), + findall(_,(member(Predicate,P), + check_predicate(Predicate,Files)),A), + length(P,L1),length(A,L1)),_B). + %length(Files,L2),length(B,L2). + +check_predicate(Predicate,Files) :- + findall(_,(member([_R,_F1,P],Files), + findall(_,(member(Predicate1,P), + Predicate1=Predicate2, + (Predicate2=Predicate->true; + (Predicate=[Predicate2,":-",Ps], + findall(_,(member(P1,Ps), + check_predicate(P1,Files)),A), + length(Ps,L1),length(A,L1)))),B), + length(P,L2),length(B,L2)),_C). + %length(Files,L3),length(C,L3). + +% later: check unnecessary packages, files and predicates x check everything aligns above + +% 22+41 from lc = 63 + +% Second, he put the arch over his toes. + +% reach end + +%reach_end(5). +%reach_end(1). +%reach_end(0). + +reach_end(N) :- + numbers(N,1,[],Ns), + random_permutation(Ns,Ns1), + sort1(Ns1,[],Ns2), + reverse(Ns2,Ns3), + numbers(N,1,[],Ns3),!. + +sort1([],B,B). +sort1(A,B,C) :- + max(A,D), + delete(A,D,F), + append(B,[D],E), + sort1(F,E,C). +max([A],A). +max(A,D) :- + A=[B|C], + max1(B,C,D). +max1(A,[],A). +max1(B,C,D) :- + C=[E|F], + (B>=E->max1(B,F,D);max1(E,F,D)). + +% 72 + +% Third, he removed the arch. 
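The sort1/3 predicate above is a selection sort that repeatedly extracts the maximum, so it yields a descending list; note that delete/3 removes every occurrence of the maximum, which is safe in reach_end/1 because the input is a permutation of distinct numbers. The same descending order can be had from standard built-ins (desc_sort/2 is a hypothetical name):

```prolog
% Descending sort with standard built-ins, matching what sort1/3
% computes on duplicate-free input: msort/2 sorts ascending while
% keeping duplicates, reverse/2 flips the order.
desc_sort(List, Sorted) :-
    msort(List, Ascending),
    reverse(Ascending, Sorted).

% ?- desc_sort([3,1,5,2], S).
% S = [5, 3, 2, 1].
```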
+ +/* +reverse_engineering1(5,A). +A = 1+4 ; +A = 2+3 ; +A = 3+2 ; +A = 4+1 ; +*/ + +reverse_engineering1(N,A) :- + numbers(N,1,[],Ns),member(N1,Ns),member(N2,Ns),N is N1 + N2,A=N1+N2. + +:-include('data_to_alg.pl'). + +%reverse_engineering2([[[a,b,c,d],[b,c]],[[e,b,f,d],[b,f]]],A). +reverse_engineering2(Ls,A) :- + test1(off,1,_), + findall([[DFN,L1],[D,H],DHN],(member([L,L1],Ls), + + Z=[b,d,f,g,h,j], + append(B1,C1,Z), + append([D1],E1,C1), + append(F1,G1,E1), + append([H1],J1,G1), + + append([D1],F1,DF1), + + %writeln([df1,DF1]), + interpretpart(match4,[v,sys1],DF1,[[[v,b],B1],[[v,d],D1],[[v,f],F1],[[v,g],G1],[[v,h],H1],[[v,j],J1]],Vars,_), + + getvalue([v,sys1],L0,Vars),%),V), + +%foldr(append,L0,L00), +L0=L00, +%writeln(L00), +%writeln(L00=L1), + L00=[d,f],%L1, + %trace, + member([[v,d],D],Vars), + member([[v,h],H],Vars), + get_n_item(Z,D,DN), + get_n_item(Z,H,HN), + + get_item_n(L,DN,DNN), + get_item_n(L,HN,HNN), + + append(_B,C,L), + append([DNN],E,C), + append(F,G,E), + %trace, + append([HNN],_J,G), + + append([DNN],F,DFN), + append([DNN],[HNN],DHN) + + %writeln([[b,d,f,g,j],[_B,DNN,_F,HNN,_J]]) + ),V), + +findall(A,member([A,_,_],V),V0), + %V=[[V31,V32]|V2], + forall(member([V41,V42],V0),V41=V42), + + findall([B1,B2],member([_,[B1,B2],_],V),V01), + + findall(B3,member([B3,_],V01),V02), + findall(B4,member([_,B4],V01),V03), + + V02=[V021|V022], + forall(member(V023,V022),V023=V021), + V03=[V031|V032], + forall(member(V033,V032),V033=V031), + + findall(C,member([_,_,C],V),V04), +%trace, + V04=[[D,H]|_], + A=(append(b,c,l), + append([D],e,c), + append(f,g,e), + append([H],j,g)). + +/* + +A = (append(b, c, l), append([b], e, c), append(f, g, e), append([d], j, g)). + +append(_B,C,[a,b,c,d]), append([b],E,C), append(F,G,E), append([d],_J,G). +_B = [a], +C = [b, c, d], +E = [c, d], * +F = [c], +G = [d], +_J = [] ; + +reverse_engineering2([[[a,b,c,d],[b,c]],[[z,b,q,d],[b,q]],[[e,b,f,d],[b,f]]],A). 
+A = (append(b, c, l), append([b], e, c), append(f, g, e), append([d], j, g)). + +reverse_engineering2([[[a,b,c,d],[b,c]]],A). +A = (append(b, c, l), append([b], e, c), append(f, g, e), append([d], j, g)). + +reverse_engineering2([[[a,b,r,c,d],[b,r,c]]],A). +works if Z=[b,d,f,g,h,j] changed to Z=[b,d,r,f,g,h,j] + L00=[d,f] changed to L00=[d,r,f] above + same answer + +*/ + +% 81 \ No newline at end of file diff --git a/Philosophy/5 2 23 messager.pl b/Philosophy/5 2 23 messager.pl new file mode 100644 index 0000000000000000000000000000000000000000..b34e738fd0a3393999c3149f4ed0f7953d99591e --- /dev/null +++ b/Philosophy/5 2 23 messager.pl @@ -0,0 +1,36 @@ +% 5 2 23.pl + +% ["Fundamentals of Meditation and Meditation Indicators","FUNDAMENTALS OF MEDITATION by Lucian Green Upasana Sutra 4 of 4.txt",0,algorithms,"31. *I prepared to send the letter to the Queen. I did this by licking the stamp. First, I detached the stamp from the perforated sheet. Second, I licked it. Third, I attached it to the envelope. In this way, I prepared to send the letter to the Queen by licking the stamp."] + +% messager + +% this version: one user without login creates, edits or deletes messages - can port from pl to ssiws (lp) later + +%:-include('../') +:- include('../listprologinterpreter/listprolog.pl'). +:- include('../private/la_com_ssi1.pl'). +:- include('../private/la_com_ssi1_run_in_prolog.pl'). + +messager :- display_messages(Messages), nl, writeln("Enter (0 - Exit, 1 - New message, 2 - Edit a message, 3 - Delete a message):"), +read_string(S),(number_string(N,S)->go_messager_option(N,Messages);messager),!. + +display_messages(Messages2) :- open_file('messages.txt',Messages),length(Messages,Length),numbers(Length,1,[],Ns), findall([N,Title,Message],(member2(Ns,N),get_item_n(Messages,N,[N,Title,Message]),%write(N),write(" - "), +writeln(Title)),Messages2). + +go_messager_option(0,_Messages) :- true,!. 
+go_messager_option(1,Messages) :- length(Messages,L),L2 is L+1,writeln("Enter Title:"),read_string(T),writeln("Enter Message:"),read_string(M),append(Messages,[[L2,T,M]],Messages2),sort(Messages2,Messages21),write_file('messages.txt',Messages21),messager.
+go_messager_option(2,Messages) :- display_messages_to_select(Messages),length(Messages,L),nl,write("Enter number of message to edit (1-"),write(L),writeln(" or 0 to go up):"),read_string(S),(number_string(N,S)->(go_edit_message(N,Messages));go_messager_option(2,Messages)).
+go_messager_option(3,Messages) :- display_messages_to_select(Messages),length(Messages,L),nl,write("Enter number of message to delete (1-"),write(L),writeln(" or 0 to go up):"),read_string(S),(number_string(N,S)->(go_delete_message(N,Messages));go_messager_option(3,Messages)).
+
+% fall through to the menu for unrecognised options (the menu offers 0-3)
+go_messager_option(N,_Messages) :- not(N=0),not(N=1),not(N=2),not(N=3),messager.
+
+go_edit_message(0,_Messages) :- messager,!.
+go_edit_message(N,Messages) :- %trace,
+get_item_n(Messages,N,[N,Title,Message]),writeln("Edit title:"),text_area("rows=\"4\" style=\"width:100%\"",Title,T2),writeln("Edit message:"),text_area("rows=\"4\" style=\"width:100%\"",Message,M2),delete(Messages,[N,_A,_B],Messages0),append(Messages0,[[N,T2,M2]],Messages2),sort(Messages2,Messages21),write_file('messages.txt',Messages21),nl,writeln("Edited message."),nl,messager.
+
+go_delete_message(0,_Messages) :- messager,!.
+go_delete_message(N,Messages) :- delete_item_n(Messages,N,Messages2),write_file('messages.txt',Messages2),nl,writeln("Deleted message."),nl,messager.
+
+display_messages_to_select(Messages) :-%trace,
+findall([N," - ",T,"\n"],member2(Messages,[N,T,_M]),Messages3),
+maplist([n,append],Messages3,[],Messages1),maplist([n,string_concat],Messages1,"",Messages2),writeln(Messages2).
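The list formatting done in display_messages_to_select/1 can also be sketched with standard SWI-Prolog built-ins, using with_output_to/2 and format/2 in place of the listprolog maplist/append helpers (format_message_list/2 is a hypothetical name):

```prolog
% Build the "N - Title" menu string from [[N,Title,Body],...] messages,
% one entry per line, without the listprolog helper predicates.
format_message_list(Messages, String) :-
    with_output_to(string(String),
        forall(member([N, Title, _Body], Messages),
               format("~w - ~w~n", [N, Title]))).

% ?- format_message_list([[1,"Hello",_],[2,"News",_]], S).
% S = "1 - Hello\n2 - News\n".
```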
\ No newline at end of file diff --git a/Philosophy/8 4 23.pl b/Philosophy/8 4 23.pl new file mode 100644 index 0000000000000000000000000000000000000000..12c69e2de2e1b4c4f7a366673f4c2e8de9e6a5c6 --- /dev/null +++ b/Philosophy/8 4 23.pl @@ -0,0 +1,16 @@ +% 8 4 23.pl + +% 27 done so far today + +:-include('big_connections_with_bag3_and_mr_short_books-rel.pl'). + +% How can the program cope with real variation? 2 of 4 + +% 11. I prepared to help the girl onto the tram stop. I did this by writing that the feminine gender of the noun was given by the example, 'girl'. First, I noted that the person was female. Second, I noted that the person was young. Third, I called her a girl. In this way, I prepared to help the girl onto the tram stop by writing that the feminine gender of the noun was given by the example, 'girl'. + +% find key sents with a key term, relate other sents in para in chain of terms + +% big_connections_with_bag3_and_mr(["computational","english"]). + + + diff --git a/Philosophy/LICENSE b/Philosophy/LICENSE new file mode 100644 index 0000000000000000000000000000000000000000..3a27c90f1c2986ac0fb73b4ae956c6205343dfdd --- /dev/null +++ b/Philosophy/LICENSE @@ -0,0 +1,29 @@ +BSD 3-Clause License + +Copyright (c) 2020, Lucian Green +All rights reserved. + +Redistribution and use in source and binary forms, with or without +modification, are permitted provided that the following conditions are met: + +1. Redistributions of source code must retain the above copyright notice, this + list of conditions and the following disclaimer. + +2. Redistributions in binary form must reproduce the above copyright notice, + this list of conditions and the following disclaimer in the documentation + and/or other materials provided with the distribution. + +3. Neither the name of the copyright holder nor the names of its + contributors may be used to endorse or promote products derived from + this software without specific prior written permission. 
+ +THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" +AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE +IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE +DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE +FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL +DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR +SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER +CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, +OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE +OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. diff --git a/Philosophy/aa_log.txt b/Philosophy/aa_log.txt new file mode 100644 index 0000000000000000000000000000000000000000..0637a088a01e8ddab3bf3fa98dbe804cbde1a0dc --- /dev/null +++ b/Philosophy/aa_log.txt @@ -0,0 +1 @@ +[] \ No newline at end of file diff --git a/Philosophy/alg_cache.txt b/Philosophy/alg_cache.txt new file mode 100644 index 0000000000000000000000000000000000000000..00ec8723df85da5eaf7722f0b2d1c1438816b43d --- /dev/null +++ b/Philosophy/alg_cache.txt @@ -0,0 +1 @@ +[0,""] \ No newline at end of file diff --git a/Philosophy/alg_dict2.txt b/Philosophy/alg_dict2.txt new file mode 100644 index 0000000000000000000000000000000000000000..22280e822ec3e6fc88b335912866370a7a3364f0 --- /dev/null +++ b/Philosophy/alg_dict2.txt @@ -0,0 +1 @@ 
+[[[1,"[",2],[2,"[",136],[136,"n",44],[44,",",138],[138,"f",46],[46,"u",140],[140,"n",48],[48,"c",142],[142,"t",50],[50,"i",144],[144,"o",52],[52,"n",146],[146,"2",147],[146,"]",54],[54,",",15],[15,"""",16],[15,"[",56],[16,":",17],[17,"-",18],[18,"""",19],[19,",",20],[20,"[",21],[21,"[",22],[22,"[",23],[23,"n",24],[24,",",25],[25,"e",26],[26,"q",27],[27,"u",28],[28,"a",29],[29,"l",30],[30,"s",31],[31,"4",32],[32,"_",33],[33,"o",34],[34,"n",35],[35,"]",36],[36,"]",37],[37,"]",38],[38,"]",39],[39,"0",40],[40,"1",[-,"[[n,function],"":-"",[[[n,equals4_on]]]]01"]],[56,"[",57],[57,"v",58],[58,",",59],[59,"1",60],[60,"]",61],[61,",",62],[62,"[",63],[63,"v",64],[64,",",65],[65,"2",66],[66,"]",67],[67,",",68],[68,"[",69],[69,"v",70],[70,",",71],[71,"3",72],[72,"]",73],[73,"]",74],[74,",",75],[75,"""",76],[76,":",77],[77,"-",78],[78,"""",79],[79,",",80],[80,"[",81],[81,"[",82],[82,"[",83],[83,"n",84],[84,",",85],[85,"+",86],[86,"]",87],[87,",",88],[88,"[",89],[89,"[",90],[90,"v",91],[91,",",92],[92,"1",93],[93,"]",94],[94,",",95],[95,"[",96],[96,"v",97],[97,",",98],[98,"2",99],[99,"]",100],[100,",",101],[101,"[",102],[102,"v",103],[103,",",104],[104,"6",105],[105,"]",106],[106,"]",107],[107,"]",108],[108,",",109],[109,"[",110],[110,"[",111],[111,"n",112],[112,",",113],[113,"+",114],[114,"]",115],[115,",",116],[116,"[",117],[117,"[",118],[118,"v",119],[119,",",120],[120,"6",121],[121,"]",122],[122,",",123],[123,"[",124],[124,"v",125],[125,",",126],[126,"3",127],[127,"]",128],[128,"]",129],[129,"]",130],[130,"]",131],[131,"]",132],[132,"0",133],[133,"1",[-,"[[n,function],[[v,1],[v,2],[v,3]],"":-"",[[[n,+],[[v,1],[v,2],[v,6]]],[[n,+],[[v,6],[v,3]]]]]01"]],[147,"]",148],[148,",",149],[149,"[",150],[150,"[",151],[151,"v",152],[152,",",153],[153,"1",154],[154,"]",155],[155,",",156],[156,"[",157],[157,"v",158],[158,",",159],[159,"2",160],[160,"]",161],[161,"]",162],[162,",",163],[163,"""",164],[164,":",165],[165,"-",166],[166,"""",167],[167,",",168],[168,"[",169],[169,"[",170],[170,
"[",171],[171,"n",172],[172,",",173],[173,"i",174],[174,"s",175],[175,"]",176],[176,",",177],[177,"[",178],[178,"[",179],[179,"v",180],[180,",",181],[181,"1",182],[182,"]",183],[183,"]",184],[184,"]",185],[185,",",186],[186,"[",187],[187,"[",188],[188,"n",189],[189,",",190],[190,"i",191],[191,"s",192],[192,"]",193],[193,",",194],[194,"[",195],[195,"[",196],[196,"v",197],[197,",",198],[198,"2",199],[199,"]",200],[200,"]",201],[201,"]",202],[202,"]",203],[203,"]",204],[204,"0",205],[205,"1",[-,"[[n,function2],[[v,1],[v,2]],"":-"",[[[n,is],[[v,1]]],[[n,is],[[v,2]]]]]01"]]],[[1,"[",2],[2,"[",125],[125,"n",99],[99,",",127],[127,"a",6],[127,"b",101],[127,"c",128],[6,"p",7],[7,"p",8],[8,"e",9],[9,"n",10],[10,"d",11],[11,"1",12],[12,"]",13],[13,",",14],[14,"[",15],[15,"[",16],[16,"v",17],[17,",",18],[18,"1",19],[19,"]",20],[20,"]",21],[21,",",22],[22,"""",23],[23,":",24],[24,"-",25],[25,"""",26],[26,",",27],[27,"[",28],[28,"[",29],[29,"[",30],[30,"n",31],[31,",",32],[32,"b",33],[33,"]",34],[34,",",35],[35,"[",36],[36,"[",37],[37,"v",38],[38,",",39],[39,"2",40],[40,"]",41],[41,"]",42],[42,"]",43],[43,",",44],[44,"[",45],[45,"[",46],[46,"n",47],[47,",",48],[48,"c",49],[49,"]",50],[50,",",51],[51,"[",52],[52,"[",53],[53,"v",54],[54,",",55],[55,"3",56],[56,"]",57],[57,"]",58],[58,"]",59],[59,",",60],[60,"[",61],[61,"[",62],[62,"n",63],[63,",",64],[64,"a",65],[65,"p",66],[66,"p",67],[67,"e",68],[68,"n",69],[69,"d",70],[70,"]",71],[71,",",72],[72,"[",73],[73,"[",74],[74,"v",75],[75,",",76],[76,"2",77],[77,"]",78],[78,",",79],[79,"[",80],[80,"v",81],[81,",",82],[82,"3",83],[83,"]",84],[84,",",85],[85,"[",86],[86,"v",87],[87,",",88],[88,"1",89],[89,"]",90],[90,"]",91],[91,"]",92],[92,"]",93],[93,"]",94],[94,"0",95],[95,"1",[-,"[[n,append1],[[v,1]],"":-"",[[[n,b],[[v,2]]],[[n,c],[[v,3]]],[[n,append],[[v,2],[v,3],[v,1]]]]]01"]],[101,"]",102],[102,",",103],[103,"""",104],[104,":",105],[105,"-",106],[106,"""",107],[107,",",108],[108,"[",109],[109,"[",110],[110,"[",111],[111,"n",112],[1
12,",",113],[113,"t",114],[114,"r",115],[115,"u",116],[116,"e",117],[117,"]",118],[118,"]",119],[119,"]",120],[120,"]",121],[121,"0",122],[122,"1",[-,"[[n,b],"":-"",[[[n,true]]]]01"]],[128,"]",129],[129,",",130],[130,"""",131],[131,":",132],[132,"-",133],[133,"""",134],[134,",",135],[135,"[",136],[136,"[",137],[137,"[",138],[138,"n",139],[139,",",140],[140,"t",141],[141,"r",142],[142,"u",143],[143,"e",144],[144,"]",145],[145,"]",146],[146,"]",147],[147,"]",148],[148,"0",149],[149,"1",[-,"[[n,c],"":-"",[[[n,true]]]]01"]]],[[1,"[",2],[2,"[",65],[65,"n",35],[35,",",67],[67,"c",37],[37,"o",69],[69,"u",39],[39,"n",71],[71,"t",41],[41,"]",73],[73,",",43],[43,"""",13],[43,"[",75],[13,":",14],[14,"-",15],[15,"""",16],[16,",",17],[17,"[",18],[18,"[",19],[19,"[",20],[20,"n",21],[21,",",22],[22,"t",23],[23,"r",24],[24,"u",25],[25,"e",26],[26,"]",27],[27,"]",28],[28,"]",29],[29,"]",30],[30,"0",31],[75,"[",76],[76,"v",77],[77,",",78],[78,"1",79],[79,"]",80],[80,",",81],[81,"[",82],[82,"v",83],[83,",",84],[84,"2",85],[85,"]",86],[86,"]",87],[87,",",88],[88,"""",89],[89,":",90],[90,"-",91],[91,"""",92],[92,",",93],[93,"[",94],[94,"[",95],[95,"[",96],[96,"n",97],[97,",",98],[98,"+",99],[99,"]",100],[100,",",101],[101,"[",102],[102,"[",103],[103,"v",104],[104,",",105],[105,"1",106],[106,"]",107],[107,",",108],[108,"[",109],[109,"v",110],[110,",",111],[111,"5",112],[112,"]",113],[113,"]",114],[114,"]",115],[115,",",116],[116,"[",117],[117,"[",118],[118,"n",119],[119,",",120],[120,"c",121],[121,"o",122],[122,"u",123],[123,"n",124],[124,"t",125],[125,"]",126],[126,",",127],[127,"[",128],[128,"[",129],[129,"v",130],[130,",",131],[131,"5",132],[132,"]",133],[133,",",134],[134,"[",135],[135,"v",136],[136,",",137],[137,"2",138],[138,"]",139],[139,"]",140],[140,"]",141],[141,"]",142],[142,"]",143],[143,"0",144],[144,"1",[-,"[[n,count],[[v,1],[v,2]],"":-"",[[[n,+],[[v,1],[v,5]]],[[n,count],[[v,5],[v,2]]]]]01"]]],[[1,"[",2],[2,"[",100],[100,"n",51],[51,",",102],[102,"g",53],[102,"r",6],[6,"e"
,7],[7,"v",8],[8,"e",9],[9,"r",10],[10,"s",11],[11,"e",12],[12,"]",13],[13,",",14],[14,"[",15],[15,"[",16],[16,"v",17],[17,",",18],[18,"2",19],[19,"]",20],[20,",",21],[21,"[",22],[22,"v",23],[23,",",24],[24,"2",25],[25,"]",26],[26,"]",27],[27,",",28],[28,"""",29],[29,":",30],[30,"-",31],[31,"""",32],[32,",",33],[33,"[",34],[34,"[",35],[35,"[",36],[36,"n",37],[37,",",38],[38,"t",39],[39,"r",40],[40,"u",41],[41,"e",42],[42,"]",43],[43,"]",44],[44,"]",45],[45,"]",46],[46,"0",47],[47,"1",[-,"[[n,reverse],[[v,2],[v,2]],"":-"",[[[n,true]]]]01"]],[53,"r",54],[54,"a",55],[55,"m",56],[56,"m",57],[57,"a",58],[58,"r",59],[59,"1",60],[60,"]",61],[61,",",62],[62,"[",63],[63,"[",64],[64,"v",65],[65,",",66],[66,"1",67],[67,"]",68],[68,"]",69],[69,",",70],[70,"""",71],[71,":",72],[72,"-",73],[73,"""",74],[74,",",75],[75,"[",76],[76,"[",77],[77,"[",78],[78,"n",79],[79,",",80],[80,"n",81],[81,"o",82],[82,"u",83],[83,"n",84],[84,"]",85],[85,",",86],[86,"[",87],[87,"[",88],[88,"v",89],[89,",",90],[90,"1",91],[91,"]",92],[92,"]",93],[93,"]",94],[94,"]",95],[95,"]",96],[96,"0",97]],[[1,"[",2],[2,"[",139],[139,"n",66],[66,",",141],[141,"g",68],[68,"r",143],[143,"a",70],[70,"m",145],[145,"m",72],[72,"a",147],[147,"r",74],[74,"1",149],[149,"]",76],[76,",",151],[151,"[",78],[78,"[",153],[153,"v",80],[80,",",155],[155,"1",82],[82,"]",157],[157,",",84],[157,"]",158],[84,"[",23],[23,"v",24],[24,",",25],[25,"2",26],[26,"]",27],[27,",",90],[27,"]",28],[28,",",29],[29,"""",30],[30,":",31],[31,"-",32],[32,"""",33],[33,",",34],[34,"[",35],[35,"[",36],[36,"[",37],[37,"n",38],[38,",",39],[39,"n",40],[40,"o",41],[41,"u",42],[42,"n",43],[43,"]",44],[44,",",45],[45,"[",46],[46,"[",47],[47,"v",48],[48,",",49],[49,"1",50],[50,"]",51],[51,",",52],[52,"[",53],[53,"v",54],[54,",",55],[55,"2",56],[56,"]",57],[57,"]",58],[58,"]",59],[59,"]",60],[60,"]",61],[61,"0",62],[62,"1",[-,"[[n,grammar1],[[v,1],[v,2]],"":-"",[[[n,noun],[[v,1],[v,2]]]]]01"]],[90,"[",91],[91,"v",92],[92,",",93],[93,"3",94],[94,"]",95],[95,"
]",96],[96,",",97],[97,"""",98],[98,":",99],[99,"-",100],[100,"""",101],[101,",",102],[102,"[",103],[103,"[",104],[104,"[",105],[105,"n",106],[106,",",107],[107,"n",108],[108,"o",109],[109,"u",110],[110,"n",111],[111,"]",112],[112,",",113],[113,"[",114],[114,"[",115],[115,"v",116],[116,",",117],[117,"1",118],[118,"]",119],[119,",",120],[120,"[",121],[121,"v",122],[122,",",123],[123,"2",124],[124,"]",125],[125,",",126],[126,"[",127],[127,"v",128],[128,",",129],[129,"3",130],[130,"]",131],[131,"]",132],[132,"]",133],[133,"]",134],[134,"]",135],[135,"0",136],[136,"1",[-,"[[n,grammar1],[[v,1],[v,2],[v,3]],"":-"",[[[n,noun],[[v,1],[v,2],[v,3]]]]]01"]],[158,",",159],[159,"""",160],[160,":",161],[161,"-",162],[162,"""",163],[163,",",164],[164,"[",165],[165,"[",166],[166,"[",167],[167,"n",168],[168,",",169],[169,"n",170],[170,"o",171],[171,"u",172],[172,"n",173],[173,"]",174],[174,",",175],[175,"[",176],[176,"[",177],[177,"v",178],[178,",",179],[179,"1",180],[180,"]",181],[181,"]",182],[182,"]",183],[183,"]",184],[184,"]",185],[185,"0",186],[186,"1",[-,"[[n,grammar1],[[v,1]],"":-"",[[[n,noun],[[v,1]]]]]01"]]],[[1,"[",2],[2,"[",132],[132,"n",70],[70,",",134],[134,"c",72],[134,"g",6],[6,"r",7],[7,"a",8],[8,"m",9],[9,"m",10],[10,"a",11],[11,"r",12],[12,"1",13],[13,"]",14],[14,",",15],[15,"[",16],[16,"[",17],[17,"v",18],[18,",",19],[19,"1",20],[20,"]",21],[21,",",22],[22,"[",23],[23,"v",24],[24,",",25],[25,"2",26],[26,"]",27],[27,"]",28],[28,",",29],[29,"""",30],[30,":",31],[31,"-",32],[32,"""",33],[33,",",34],[34,"[",35],[35,"[",36],[36,"[",37],[37,"n",38],[38,",",39],[39,"c",40],[40,"o",41],[41,"m",42],[42,"p",43],[43,"o",44],[44,"u",45],[45,"n",46],[46,"d",47],[47,"]",48],[48,",",49],[49,"[",50],[50,"[",51],[51,"v",52],[52,",",53],[53,"1",54],[54,"]",55],[55,",",56],[56,"[",57],[57,"v",58],[58,",",59],[59,"2",60],[60,"]",61],[61,"]",62],[62,"]",63],[63,"]",64],[64,"]",65],[65,"0",66],[66,"1",[-,"[[n,grammar1],[[v,1],[v,2]],"":-"",[[[n,compound],[[v,1],[v,2]]]]]01"]],[72,"o",
73],[73,"m",74],[74,"p",75],[75,"o",76],[76,"u",77],[77,"n",78],[78,"d",79],[79,"2",80],[80,"1",81],[81,"2",145],[81,"3",82],[82,"]",83],[83,",",84],[84,"[",85],[85,"[",86],[86,"v",87],[87,",",88],[88,"1",89],[89,"]",90],[90,",",91],[91,"[",92],[92,"v",93],[93,",",94],[94,"1",95],[95,"]",96],[96,",",97],[97,"[",98],[98,"v",99],[99,",",100],[100,"3",101],[101,"]",102],[102,",",103],[103,"[",104],[104,"v",105],[105,",",106],[106,"3",107],[107,"]",108],[108,"]",109],[109,",",110],[110,"""",111],[111,":",112],[112,"-",113],[113,"""",114],[114,",",115],[115,"[",116],[116,"[",117],[117,"[",118],[118,"n",119],[119,",",120],[120,"t",121],[121,"r",122],[122,"u",123],[123,"e",124],[124,"]",125],[125,"]",126],[126,"]",127],[127,"]",128],[128,"0",129],[129,"1",[-,"[[n,compound213],[[v,1],[v,1],[v,3],[v,3]],"":-"",[[[n,true]]]]01"]],[145,"]",146],[146,",",147],[147,"[",148],[148,"[",149],[149,"v",150],[150,",",151],[151,"1",152],[152,"]",153],[153,",",154],[154,"[",155],[155,"v",156],[156,",",157],[157,"1",158],[158,"]",159],[159,",",160],[160,"[",161],[161,"v",162],[162,",",163],[163,"3",164],[164,"]",165],[165,",",166],[166,"[",167],[167,"v",168],[168,",",169],[169,"3",170],[170,"]",171],[171,"]",172],[172,",",173],[173,"""",174],[174,":",175],[175,"-",176],[176,"""",177],[177,",",178],[178,"[",179],[179,"[",180],[180,"[",181],[181,"n",182],[182,",",183],[183,"t",184],[184,"r",185],[185,"u",186],[186,"e",187],[187,"]",188],[188,"]",189],[189,"]",190],[190,"]",191],[191,"0",192],[192,"1",[-,"[[n,compound212],[[v,1],[v,1],[v,3],[v,3]],"":-"",[[[n,true]]]]01"]]],[[1,"[",2],[2,"[",152],[152,"n",87],[87,",",154],[154,"c",155],[154,"g",89],[154,"l",6],[6,"o",7],[7,"o",8],[8,"k",9],[9,"a",10],[10,"h",11],[11,"e",12],[12,"a",13],[13,"d",14],[14,"]",15],[15,",",16],[16,"[",17],[17,"[",18],[18,"v",19],[19,",",20],[20,"1",21],[21,"]",22],[22,",",23],[23,"[",24],[24,"v",25],[25,",",26],[26,"1",27],[27,"]",28],[28,",",29],[29,"[",30],[30,"v",31],[31,",",32],[32,"3",33],[33,"]",34],[34,"]",
35],[35,",",36],[36,"""",37],[37,":",38],[38,"-",39],[39,"""",40],[40,",",41],[41,"[",42],[42,"[",43],[43,"[",44],[44,"n",45],[45,",",46],[46,"s",47],[47,"t",48],[48,"r",49],[49,"i",50],[50,"n",51],[51,"g",52],[52,"c",53],[53,"o",54],[54,"n",55],[55,"c",56],[56,"a",57],[57,"t",58],[58,"]",59],[59,",",60],[60,"[",61],[61,"[",62],[62,"v",63],[63,",",64],[64,"3",65],[65,"]",66],[66,",",67],[67,"[",68],[68,"v",69],[69,",",70],[70,"5",71],[71,"]",72],[72,",",73],[73,"[",74],[74,"v",75],[75,",",76],[76,"1",77],[77,"]",78],[78,"]",79],[79,"]",80],[80,"]",81],[81,"]",82],[82,"0",83],[83,"1",[-,"[[n,lookahead],[[v,1],[v,1],[v,3]],"":-"",[[[n,stringconcat],[[v,3],[v,5],[v,1]]]]]01"]],[89,"r",90],[90,"a",91],[91,"m",92],[92,"m",93],[93,"a",94],[94,"r",95],[95,"1",96],[96,"]",97],[97,",",98],[98,"[",99],[99,"[",100],[100,"v",101],[101,",",102],[102,"1",103],[103,"]",104],[104,",",105],[105,"[",106],[106,"v",107],[107,",",108],[108,"2",109],[109,"]",110],[110,"]",111],[111,",",112],[112,"""",113],[113,":",114],[114,"-",115],[115,"""",116],[116,",",117],[117,"[",118],[118,"[",119],[119,"[",120],[120,"n",121],[121,",",122],[122,"c",123],[123,"o",124],[124,"m",125],[125,"p",126],[126,"o",127],[127,"u",128],[128,"n",129],[129,"d",130],[130,"]",131],[131,",",132],[132,"[",133],[133,"[",134],[134,"v",135],[135,",",136],[136,"1",137],[137,"]",138],[138,",",139],[139,"[",140],[140,"v",141],[141,",",142],[142,"2",143],[143,"]",144],[144,"]",145],[145,"]",146],[146,"]",147],[147,"]",148],[148,"0",149],[149,"1",[-,"[[n,grammar1],[[v,1],[v,2]],"":-"",[[[n,compound],[[v,1],[v,2]]]]]01"]],[155,"o",156],[156,"m",157],[157,"p",158],[158,"o",159],[159,"u",160],[160,"n",161],[161,"d",162],[162,"2",163],[163,"1",164],[164,"3",165],[165,"]",166],[166,",",167],[167,"[",168],[168,"[",169],[169,"v",170],[170,",",171],[171,"3",172],[172,"]",173],[173,",",174],[174,"[",175],[175,"v",176],[176,",",177],[177,"3",178],[178,"]",179],[179,"]",180],[180,",",181],[181,"""",182],[182,":",183],[183,"-",184],[184
,"""",185],[185,",",186],[186,"[",187],[187,"[",188],[188,"[",189],[189,"n",190],[190,",",191],[191,"t",192],[192,"r",193],[193,"u",194],[194,"e",195],[195,"]",196],[196,"]",197],[197,"]",198],[198,"]",199],[199,"0",200],[200,"1",[-,"[[n,compound213],[[v,3],[v,3]],"":-"",[[[n,true]]]]01"]]],[[1,"[",2],[2,"[",117],[117,"n",67],[67,",",119],[119,"c",69],[69,"o",121],[121,"m",71],[71,"p",123],[123,"o",73],[73,"u",125],[125,"n",75],[75,"d",127],[127,"2",77],[77,"1",129],[129,"2",79],[129,"3",16],[16,"]",17],[17,",",18],[18,"[",19],[19,"[",20],[20,"v",21],[21,",",22],[22,"1",23],[23,"]",24],[24,",",25],[25,"[",26],[26,"v",27],[27,",",28],[28,"1",29],[29,"]",30],[30,",",31],[31,"[",32],[32,"v",33],[33,",",34],[34,"3",35],[35,"]",36],[36,",",37],[37,"[",38],[38,"v",39],[39,",",40],[40,"3",41],[41,"]",42],[42,"]",43],[43,",",44],[44,"""",45],[45,":",46],[46,"-",47],[47,"""",48],[48,",",49],[49,"[",50],[50,"[",51],[51,"[",52],[52,"n",53],[53,",",54],[54,"t",55],[55,"r",56],[56,"u",57],[57,"e",58],[58,"]",59],[59,"]",60],[60,"]",61],[61,"]",62],[62,"0",63],[63,"1",[-,"[[n,compound213],[[v,1],[v,1],[v,3],[v,3]],"":-"",[[[n,true]]]]01"]],[79,"]",80],[80,",",81],[81,"[",82],[82,"[",83],[83,"v",84],[84,",",85],[85,"1",137],[85,"3",86],[86,"]",87],[87,",",88],[88,"[",89],[89,"v",90],[90,",",91],[91,"3",92],[92,"]",93],[93,"]",94],[94,",",95],[95,"""",96],[96,":",97],[97,"-",98],[98,"""",99],[99,",",100],[100,"[",101],[101,"[",102],[102,"[",103],[103,"n",104],[104,",",105],[105,"t",106],[106,"r",107],[107,"u",108],[108,"e",109],[109,"]",110],[110,"]",111],[111,"]",112],[112,"]",113],[113,"0",114],[114,"1",[-,"[[n,compound212],[[v,3],[v,3]],"":-"",[[[n,true]]]]01"]],[137,"]",138],[138,",",139],[139,"[",140],[140,"v",141],[141,",",142],[142,"1",143],[143,"]",144],[144,",",145],[145,"[",146],[146,"v",147],[147,",",148],[148,"3",149],[149,"]",150],[150,",",151],[151,"[",152],[152,"v",153],[153,",",154],[154,"3",155],[155,"]",156],[156,"]",157],[157,",",158],[158,"""",159],[159,":",160]
,[160,"-",161],[161,"""",162],[162,",",163],[163,"[",164],[164,"[",165],[165,"[",166],[166,"n",167],[167,",",168],[168,"t",169],[169,"r",170],[170,"u",171],[171,"e",172],[172,"]",173],[173,"]",174],[174,"]",175],[175,"]",176],[176,"0",177],[177,"1",[-,"[[n,compound212],[[v,1],[v,1],[v,3],[v,3]],"":-"",[[[n,true]]]]01"]]],[[1,"[",2],[2,"[",71],[71,"n",31],[31,",",73],[73,"a",33],[73,"g",74],[33,"]",7],[7,",",8],[8,"""",9],[8,"[",36],[9,":",10],[10,"-",11],[11,"""",12],[12,",",13],[13,"[",14],[14,"[",15],[15,"[",16],[16,"n",17],[17,",",18],[18,"t",19],[19,"r",20],[20,"u",21],[21,"e",22],[22,"]",23],[23,"]",24],[24,"]",25],[25,"]",26],[26,"0",27],[27,"1",[-,"[[n,a],"":-"",[[[n,true]]]]01"]],[36,"[",37],[37,"v",38],[38,",",39],[39,"1",40],[40,"]",41],[41,",",42],[42,"[",43],[43,"v",44],[44,",",45],[45,"1",46],[46,"]",47],[47,"]",48],[48,",",49],[49,"""",50],[50,":",51],[51,"-",52],[52,"""",53],[53,",",54],[54,"[",55],[55,"[",56],[56,"[",57],[57,"n",58],[58,",",59],[59,"t",60],[60,"r",61],[61,"u",62],[62,"e",63],[63,"]",64],[64,"]",65],[65,"]",66],[66,"]",67],[67,"0",68],[68,"1",[-,"[[n,a],[[v,1],[v,1]],"":-"",[[[n,true]]]]01"]],[74,"r",75],[75,"a",76],[76,"m",77],[77,"m",78],[78,"a",79],[79,"r",80],[80,"1",81],[81,"]",82],[82,",",83],[83,"[",84],[84,"[",85],[85,"v",86],[86,",",87],[87,"1",88],[88,"]",89],[89,",",90],[90,"[",91],[91,"v",92],[92,",",93],[93,"2",94],[94,"]",95],[95,"]",96],[96,",",97],[97,"""",98],[98,":",99],[99,"-",100],[100,"""",101],[101,",",102],[102,"[",103],[103,"[",104],[104,"[",105],[105,"n",106],[106,",",107],[107,"c",108],[108,"o",109],[109,"m",110],[110,"p",111],[111,"o",112],[112,"u",113],[113,"n",114],[114,"d",115],[115,"]",116],[116,",",117],[117,"[",118],[118,"[",119],[119,"v",120],[120,",",121],[121,"1",122],[122,"]",123],[123,",",124],[124,"[",125],[125,"v",126],[126,",",127],[127,"2",128],[128,"]",129],[129,"]",130],[130,"]",131],[131,"]",132],[132,"]",133],[133,"0",134],[134,"1",[-,"[[n,grammar1],[[v,1],[v,2]],"":-"",[[[n,compound],[[v,
1],[v,2]]]]]01"]]],[[1,"[",2],[2,"[",129],[129,"n",67],[67,",",131],[131,"c",69],[131,"n",132],[69,"o",7],[7,"m",8],[8,"p",9],[9,"o",10],[10,"u",11],[11,"n",12],[12,"d",13],[13,"2",14],[14,"1",15],[15,"2",79],[15,"3",16],[16,"]",17],[17,",",18],[18,"[",19],[19,"[",20],[20,"v",21],[21,",",22],[22,"1",23],[23,"]",24],[24,",",25],[25,"[",26],[26,"v",27],[27,",",28],[28,"1",29],[29,"]",30],[30,",",31],[31,"[",32],[32,"v",33],[33,",",34],[34,"3",35],[35,"]",36],[36,",",37],[37,"[",38],[38,"v",39],[39,",",40],[40,"3",41],[41,"]",42],[42,"]",43],[43,",",44],[44,"""",45],[45,":",46],[46,"-",47],[47,"""",48],[48,",",49],[49,"[",50],[50,"[",51],[51,"[",52],[52,"n",53],[53,",",54],[54,"t",55],[55,"r",56],[56,"u",57],[57,"e",58],[58,"]",59],[59,"]",60],[60,"]",61],[61,"]",62],[62,"0",63],[63,"1",[-,"[[n,compound213],[[v,1],[v,1],[v,3],[v,3]],"":-"",[[[n,true]]]]01"]],[79,"]",80],[80,",",81],[81,"[",82],[82,"[",83],[83,"v",84],[84,",",85],[85,"1",86],[86,"]",87],[87,",",88],[88,"[",89],[89,"v",90],[90,",",91],[91,"1",92],[92,"]",93],[93,",",94],[94,"[",95],[95,"v",96],[96,",",97],[97,"3",98],[98,"]",99],[99,",",100],[100,"[",101],[101,"v",102],[102,",",103],[103,"3",104],[104,"]",105],[105,"]",106],[106,",",107],[107,"""",108],[108,":",109],[109,"-",110],[110,"""",111],[111,",",112],[112,"[",113],[113,"[",114],[114,"[",115],[115,"n",116],[116,",",117],[117,"t",118],[118,"r",119],[119,"u",120],[120,"e",121],[121,"]",122],[122,"]",123],[123,"]",124],[124,"]",125],[125,"0",126],[126,"1",[-,"[[n,compound212],[[v,1],[v,1],[v,3],[v,3]],"":-"",[[[n,true]]]]01"]],[132,"u",133],[133,"m",134],[134,"b",135],[135,"e",136],[136,"r",137],[137,"2",138],[138,"1",139],[139,"2",140],[140,"]",141],[141,",",142],[142,"[",143],[143,"[",144],[144,"v",145],[145,",",146],[146,"1",147],[147,"]",148],[148,",",149],[149,"[",150],[150,"v",151],[151,",",152],[152,"1",153],[153,"]",154],[154,",",155],[155,"[",156],[156,"v",157],[157,",",158],[158,"3",159],[159,"]",160],[160,",",161],[161,"[",162],[162,"v",16
3],[163,",",164],[164,"3",165],[165,"]",166],[166,"]",167],[167,",",168],[168,"""",169],[169,":",170],[170,"-",171],[171,"""",172],[172,",",173],[173,"[",174],[174,"[",175],[175,"[",176],[176,"n",177],[177,",",178],[178,"t",179],[179,"r",180],[180,"u",181],[181,"e",182],[182,"]",183],[183,"]",184],[184,"]",185],[185,"]",186],[186,"0",187],[187,"1",[-,"[[n,number212],[[v,1],[v,1],[v,3],[v,3]],"":-"",[[[n,true]]]]01"]]],[[1,"[",2],[2,"[",126],[126,"n",63],[63,",",128],[128,"l",129],[128,"w",65],[65,"o",7],[7,"r",8],[8,"d",9],[9,"2",10],[10,"1",11],[11,"2",12],[12,"]",13],[12,"_",72],[13,",",14],[14,"[",15],[15,"[",16],[16,"v",17],[17,",",18],[18,"1",19],[19,"]",20],[20,",",21],[21,"[",22],[22,"v",23],[23,",",24],[24,"1",25],[25,"]",26],[26,",",27],[27,"[",28],[28,"v",29],[29,",",30],[30,"3",31],[31,"]",32],[32,",",33],[33,"[",34],[34,"v",35],[35,",",36],[36,"3",37],[37,"]",38],[38,"]",39],[39,",",40],[40,"""",41],[41,":",42],[42,"-",43],[43,"""",44],[44,",",45],[45,"[",46],[46,"[",47],[47,"[",48],[48,"n",49],[49,",",50],[50,"t",51],[51,"r",52],[52,"u",53],[53,"e",54],[54,"]",55],[55,"]",56],[56,"]",57],[57,"]",58],[58,"0",59],[59,"1",[-,"[[n,word212],[[v,1],[v,1],[v,3],[v,3]],"":-"",[[[n,true]]]]01"]],[72,"a",73],[73,"t",74],[74,"o",75],[75,"m",76],[76,"]",77],[77,",",78],[78,"[",79],[79,"[",80],[80,"v",81],[81,",",82],[82,"1",83],[83,"]",84],[84,",",85],[85,"[",86],[86,"v",87],[87,",",88],[88,"1",89],[89,"]",90],[90,",",91],[91,"[",92],[92,"v",93],[93,",",94],[94,"3",95],[95,"]",96],[96,",",97],[97,"[",98],[98,"v",99],[99,",",100],[100,"3",101],[101,"]",102],[102,"]",103],[103,",",104],[104,"""",105],[105,":",106],[106,"-",107],[107,"""",108],[108,",",109],[109,"[",110],[110,"[",111],[111,"[",112],[112,"n",113],[113,",",114],[114,"t",115],[115,"r",116],[116,"u",117],[117,"e",118],[118,"]",119],[119,"]",120],[120,"]",121],[121,"]",122],[122,"0",123],[123,"1",[-,"[[n,word212_atom],[[v,1],[v,1],[v,3],[v,3]],"":-"",[[[n,true]]]]01"]],[129,"o",130],[130,"o",131],[131,"k",
132],[132,"a",133],[133,"h",134],[134,"e",135],[135,"a",136],[136,"d",137],[137,"]",138],[138,",",139],[139,"[",140],[140,"[",141],[141,"v",142],[142,",",143],[143,"1",144],[144,"]",145],[145,",",146],[146,"[",147],[147,"v",148],[148,",",149],[149,"1",150],[150,"]",151],[151,",",152],[152,"[",153],[153,"v",154],[154,",",155],[155,"3",156],[156,"]",157],[157,"]",158],[158,",",159],[159,"""",160],[160,":",161],[161,"-",162],[162,"""",163],[163,",",164],[164,"[",165],[165,"[",166],[166,"[",167],[167,"n",168],[168,",",169],[169,"s",170],[170,"t",171],[171,"r",172],[172,"i",173],[173,"n",174],[174,"g",175],[175,"c",176],[176,"o",177],[177,"n",178],[178,"c",179],[179,"a",180],[180,"t",181],[181,"]",182],[182,",",183],[183,"[",184],[184,"[",185],[185,"v",186],[186,",",187],[187,"3",188],[188,"]",189],[189,",",190],[190,"[",191],[191,"v",192],[192,",",193],[193,"5",194],[194,"]",195],[195,",",196],[196,"[",197],[197,"v",198],[198,",",199],[199,"1",200],[200,"]",201],[201,"]",202],[202,"]",203],[203,"]",204],[204,"]",205],[205,"0",206],[206,"1",[-,"[[n,lookahead],[[v,1],[v,1],[v,3]],"":-"",[[[n,stringconcat],[[v,3],[v,5],[v,1]]]]]01"]]],[[1,"[",2],[2,"[",90],[90,"n",58],[58,",",92],[92,"g",6],[92,"s",60],[6,"r",7],[7,"a",8],[8,"m",9],[9,"m",10],[10,"a",11],[11,"r",12],[12,"1",13],[13,"]",14],[14,",",15],[15,"[",16],[16,"[",17],[17,"v",18],[18,",",19],[19,"1",20],[20,"]",21],[21,"]",22],[22,",",23],[23,"""",24],[24,":",25],[25,"-",26],[26,"""",27],[27,",",28],[28,"[",29],[29,"[",30],[30,"[",31],[31,"n",32],[32,",",33],[33,"s",34],[34,"e",35],[35,"n",36],[36,"t",37],[37,"e",38],[38,"n",39],[39,"c",40],[40,"e",41],[41,"]",42],[42,",",43],[43,"[",44],[44,"[",45],[45,"v",46],[46,",",47],[47,"1",48],[48,"]",49],[49,"]",50],[50,"]",51],[51,"]",52],[52,"]",53],[53,"0",54],[54,"1",[-,"[[n,grammar1],[[v,1]],"":-"",[[[n,sentence],[[v,1]]]]]01"]],[60,"u",61],[61,"b",62],[62,"j",63],[63,"e",64],[64,"c",65],[65,"t",66],[66,"]",67],[67,",",68],[68,"""",69],[68,"[",102],[69,":",70],[70,"-",
71],[71,"""",72],[72,",",73],[73,"[",74],[74,"[",75],[75,"[",76],[76,"n",77],[77,",",78],[78,"t",79],[79,"r",80],[80,"u",81],[81,"e",82],[82,"]",83],[83,"]",84],[84,"]",85],[85,"]",86],[86,"0",87],[87,"1",[-,"[[n,subject],"":-"",[[[n,true]]]]01"]],[102,"[",103],[103,"v",104],[104,",",105],[105,"1",106],[106,"]",107],[107,",",108],[108,"[",109],[109,"v",110],[110,",",111],[111,"1",112],[112,"]",113],[113,"]",114],[114,",",115],[115,"""",116],[116,":",117],[117,"-",118],[118,"""",119],[119,",",120],[120,"[",121],[121,"[",122],[122,"[",123],[123,"n",124],[124,",",125],[125,"t",126],[126,"r",127],[127,"u",128],[128,"e",129],[129,"]",130],[130,"]",131],[131,"]",132],[132,"]",133],[133,"0",134],[134,"1",[-,"[[n,subject],[[v,1],[v,1]],"":-"",[[[n,true]]]]01"]]],[[1,"[",2],[2,"[",77],[77,"n",34],[34,",",79],[79,"o",80],[79,"v",36],[36,"e",7],[7,"r",8],[8,"b",9],[9,"]",10],[10,",",11],[11,"""",12],[11,"[",42],[12,":",13],[13,"-",14],[14,"""",15],[15,",",16],[16,"[",17],[17,"[",18],[18,"[",19],[19,"n",20],[20,",",21],[21,"t",22],[22,"r",23],[23,"u",24],[24,"e",25],[25,"]",26],[26,"]",27],[27,"]",28],[28,"]",29],[29,"0",30],[30,"1",[-,"[[n,verb],"":-"",[[[n,true]]]]01"]],[42,"[",43],[43,"v",44],[44,",",45],[45,"1",46],[46,"]",47],[47,",",48],[48,"[",49],[49,"v",50],[50,",",51],[51,"1",52],[52,"]",53],[53,"]",54],[54,",",55],[55,"""",56],[56,":",57],[57,"-",58],[58,"""",59],[59,",",60],[60,"[",61],[61,"[",62],[62,"[",63],[63,"n",64],[64,",",65],[65,"t",66],[66,"r",67],[67,"u",68],[68,"e",69],[69,"]",70],[70,"]",71],[71,"]",72],[72,"]",73],[73,"0",74],[74,"1",[-,"[[n,verb],[[v,1],[v,1]],"":-"",[[[n,true]]]]01"]],[80,"b",81],[81,"j",82],[82,"e",83],[83,"c",84],[84,"t",85],[85,"]",86],[86,",",87],[87,"""",88],[88,":",89],[89,"-",90],[90,"""",91],[91,",",92],[92,"[",93],[93,"[",94],[94,"[",95],[95,"n",96],[96,",",97],[97,"t",98],[98,"r",99],[99,"u",100],[100,"e",101],[101,"]",102],[102,"]",103],[103,"]",104],[104,"]",105],[105,"0",106],[106,"1",[-,"[[n,object],"":-"",[[[n,true]]]]0
1"]]],[[1,"[",2],[2,"[",127],[127,"n",50],[50,",",129],[129,"c",130],[129,"g",52],[129,"o",6],[6,"b",7],[7,"j",8],[8,"e",9],[9,"c",10],[10,"t",11],[11,"]",12],[12,",",13],[13,"[",14],[14,"[",15],[15,"v",16],[16,",",17],[17,"1",18],[18,"]",19],[19,",",20],[20,"[",21],[21,"v",22],[22,",",23],[23,"1",24],[24,"]",25],[25,"]",26],[26,",",27],[27,"""",28],[28,":",29],[29,"-",30],[30,"""",31],[31,",",32],[32,"[",33],[33,"[",34],[34,"[",35],[35,"n",36],[36,",",37],[37,"t",38],[38,"r",39],[39,"u",40],[40,"e",41],[41,"]",42],[42,"]",43],[43,"]",44],[44,"]",45],[45,"0",46],[46,"1",[-,"[[n,object],[[v,1],[v,1]],"":-"",[[[n,true]]]]01"]],[52,"r",53],[53,"a",54],[54,"m",55],[55,"m",56],[56,"a",57],[57,"r",58],[58,"1",59],[59,"]",60],[60,",",61],[61,"[",62],[62,"[",63],[63,"v",64],[64,",",65],[65,"1",66],[66,"]",67],[67,",",68],[68,"[",69],[69,"v",70],[70,",",71],[71,"2",72],[72,"]",73],[73,"]",74],[74,",",75],[75,"""",76],[76,":",77],[77,"-",78],[78,"""",79],[79,",",80],[80,"[",81],[81,"[",82],[82,"[",83],[83,"n",84],[84,",",85],[85,"c",86],[86,"o",87],[87,"m",88],[88,"p",89],[89,"o",90],[90,"u",91],[91,"n",92],[92,"d",93],[93,"2",94],[94,"1",95],[95,"]",96],[96,",",97],[97,"[",98],[98,"[",99],[99,"v",100],[100,",",101],[101,"1",102],[102,"]",103],[103,",",104],[104,"[",105],[105,"v",106],[106,",",107],[107,"2",108],[108,"]",109],[109,"]",110],[110,"]",111],[111,",",112],[112,"[",113],[113,"[",114],[114,"n",115],[115,",",116],[116,"c",117],[117,"u",118],[118,"t",119],[119,"]",120],[120,"]",121],[121,"]",122],[122,"]",123],[123,"0",124],[124,"1",[-,"[[n,grammar1],[[v,1],[v,2]],"":-"",[[[n,compound21],[[v,1],[v,2]]],[[n,cut]]]]01"]],[130,"o",131],[131,"m",132],[132,"p",133],[133,"o",134],[134,"u",135],[135,"n",136],[136,"d",137],[137,"2",138],[138,"1",139],[139,"3",140],[140,"]",141],[141,",",142],[142,"[",143],[143,"[",144],[144,"v",145],[145,",",146],[146,"3",147],[147,"]",148],[148,",",149],[149,"[",150],[150,"v",151],[151,",",152],[152,"3",153],[153,"]",154],[154,"]",155],[155,
",",156],[156,"""",157],[157,":",158],[158,"-",159],[159,"""",160],[160,",",161],[161,"[",162],[162,"[",163],[163,"[",164],[164,"n",165],[165,",",166],[166,"t",167],[167,"r",168],[168,"u",169],[169,"e",170],[170,"]",171],[171,"]",172],[172,"]",173],[173,"]",174],[174,"0",175],[175,"1",[-,"[[n,compound213],[[v,3],[v,3]],"":-"",[[[n,true]]]]01"]]],[[1,"[",2],[2,"[",117],[117,"n",67],[67,",",119],[119,"c",69],[69,"o",121],[121,"m",71],[71,"p",123],[123,"o",73],[73,"u",125],[125,"n",75],[75,"d",127],[127,"2",77],[77,"1",129],[129,"2",79],[129,"3",16],[16,"]",17],[17,",",18],[18,"[",19],[19,"[",20],[20,"v",21],[21,",",22],[22,"1",23],[23,"]",24],[24,",",25],[25,"[",26],[26,"v",27],[27,",",28],[28,"1",29],[29,"]",30],[30,",",31],[31,"[",32],[32,"v",33],[33,",",34],[34,"3",35],[35,"]",36],[36,",",37],[37,"[",38],[38,"v",39],[39,",",40],[40,"3",41],[41,"]",42],[42,"]",43],[43,",",44],[44,"""",45],[45,":",46],[46,"-",47],[47,"""",48],[48,",",49],[49,"[",50],[50,"[",51],[51,"[",52],[52,"n",53],[53,",",54],[54,"t",55],[55,"r",56],[56,"u",57],[57,"e",58],[58,"]",59],[59,"]",60],[60,"]",61],[61,"]",62],[62,"0",63],[63,"1",[-,"[[n,compound213],[[v,1],[v,1],[v,3],[v,3]],"":-"",[[[n,true]]]]01"]],[79,"]",80],[80,",",81],[81,"[",82],[82,"[",83],[83,"v",84],[84,",",85],[85,"1",137],[85,"3",86],[86,"]",87],[87,",",88],[88,"[",89],[89,"v",90],[90,",",91],[91,"3",92],[92,"]",93],[93,"]",94],[94,",",95],[95,"""",96],[96,":",97],[97,"-",98],[98,"""",99],[99,",",100],[100,"[",101],[101,"[",102],[102,"[",103],[103,"n",104],[104,",",105],[105,"t",106],[106,"r",107],[107,"u",108],[108,"e",109],[109,"]",110],[110,"]",111],[111,"]",112],[112,"]",113],[113,"0",114],[114,"1",[-,"[[n,compound212],[[v,3],[v,3]],"":-"",[[[n,true]]]]01"]],[137,"]",138],[138,",",139],[139,"[",140],[140,"v",141],[141,",",142],[142,"1",143],[143,"]",144],[144,",",145],[145,"[",146],[146,"v",147],[147,",",148],[148,"3",149],[149,"]",150],[150,",",151],[151,"[",152],[152,"v",153],[153,",",154],[154,"3",155],[155,"]",156],
[156,"]",157],[157,",",158],[158,"""",159],[159,":",160],[160,"-",161],[161,"""",162],[162,",",163],[163,"[",164],[164,"[",165],[165,"[",166],[166,"n",167],[167,",",168],[168,"t",169],[169,"r",170],[170,"u",171],[171,"e",172],[172,"]",173],[173,"]",174],[174,"]",175],[175,"]",176],[176,"0",177],[177,"1",[-,"[[n,compound212],[[v,1],[v,1],[v,3],[v,3]],"":-"",[[[n,true]]]]01"]]],[[1,"[",2],[2,"[",113],[113,"n",53],[53,",",115],[115,"n",55],[115,"w",116],[55,"u",7],[7,"m",8],[8,"b",9],[9,"e",10],[10,"r",11],[11,"2",12],[12,"1",13],[13,"2",14],[14,"]",15],[15,",",16],[16,"[",17],[17,"[",18],[18,"v",19],[19,",",20],[20,"1",70],[20,"3",21],[21,"]",22],[22,",",23],[23,"[",24],[24,"v",25],[25,",",26],[26,"3",27],[27,"]",28],[28,"]",29],[29,",",30],[30,"""",31],[31,":",32],[32,"-",33],[33,"""",34],[34,",",35],[35,"[",36],[36,"[",37],[37,"[",38],[38,"n",39],[39,",",40],[40,"t",41],[41,"r",42],[42,"u",43],[43,"e",44],[44,"]",45],[45,"]",46],[46,"]",47],[47,"]",48],[48,"0",49],[49,"1",[-,"[[n,number212],[[v,3],[v,3]],"":-"",[[[n,true]]]]01"]],[70,"]",71],[71,",",72],[72,"[",73],[73,"v",74],[74,",",75],[75,"1",76],[76,"]",77],[77,",",78],[78,"[",79],[79,"v",80],[80,",",81],[81,"3",82],[82,"]",83],[83,",",84],[84,"[",85],[85,"v",86],[86,",",87],[87,"3",88],[88,"]",89],[89,"]",90],[90,",",91],[91,"""",92],[92,":",93],[93,"-",94],[94,"""",95],[95,",",96],[96,"[",97],[97,"[",98],[98,"[",99],[99,"n",100],[100,",",101],[101,"t",102],[102,"r",103],[103,"u",104],[104,"e",105],[105,"]",106],[106,"]",107],[107,"]",108],[108,"]",109],[109,"0",110],[110,"1",[-,"[[n,number212],[[v,1],[v,1],[v,3],[v,3]],"":-"",[[[n,true]]]]01"]],[116,"o",117],[117,"r",118],[118,"d",119],[119,"2",120],[120,"1",121],[121,"2",122],[122,"]",123],[123,",",124],[124,"[",125],[125,"[",126],[126,"v",127],[127,",",128],[128,"3",129],[129,"]",130],[130,",",131],[131,"[",132],[132,"v",133],[133,",",134],[134,"3",135],[135,"]",136],[136,"]",137],[137,",",138],[138,"""",139],[139,":",140],[140,"-",141],[141,"""",142],[142,
",",143],[143,"[",144],[144,"[",145],[145,"[",146],[146,"n",147],[147,",",148],[148,"t",149],[149,"r",150],[150,"u",151],[151,"e",152],[152,"]",153],[153,"]",154],[154,"]",155],[155,"]",156],[156,"0",157],[157,"1",[-,"[[n,word212],[[v,3],[v,3]],"":-"",[[[n,true]]]]01"]]],[[1,"[",2],[2,"[",120],[120,"n",63],[63,",",122],[122,"s",65],[122,"w",6],[6,"o",7],[7,"r",8],[8,"d",9],[9,"2",10],[10,"1",11],[11,"2",12],[12,"]",13],[13,",",14],[14,"[",15],[15,"[",16],[16,"v",17],[17,",",18],[18,"1",19],[19,"]",20],[20,",",21],[21,"[",22],[22,"v",23],[23,",",24],[24,"1",25],[25,"]",26],[26,",",27],[27,"[",28],[28,"v",29],[29,",",30],[30,"3",31],[31,"]",32],[32,",",33],[33,"[",34],[34,"v",35],[35,",",36],[36,"3",37],[37,"]",38],[38,"]",39],[39,",",40],[40,"""",41],[41,":",42],[42,"-",43],[43,"""",44],[44,",",45],[45,"[",46],[46,"[",47],[47,"[",48],[48,"n",49],[49,",",50],[50,"t",51],[51,"r",52],[52,"u",53],[53,"e",54],[54,"]",55],[55,"]",56],[56,"]",57],[57,"]",58],[58,"0",59],[59,"1",[-,"[[n,word212],[[v,1],[v,1],[v,3],[v,3]],"":-"",[[[n,true]]]]01"]],[65,"e",66],[66,"n",67],[67,"t",68],[68,"e",69],[69,"n",70],[70,"c",71],[71,"e",72],[72,"c",73],[73,"h",74],[74,"a",75],[75,"r",76],[76,"s",77],[77,"]",78],[78,",",79],[79,"[",80],[80,"[",81],[81,"v",82],[82,",",83],[83,"1",84],[84,"]",85],[85,"]",86],[86,",",87],[87,"""",88],[88,":",89],[89,"-",90],[90,"""",91],[91,",",92],[92,"[",93],[93,"[",94],[94,"[",95],[95,"[",154],[95,"n",96],[96,",",97],[97,"l",98],[98,"e",99],[99,"t",100],[100,"t",101],[101,"e",102],[102,"r",103],[103,"s",104],[104,"]",105],[105,",",106],[106,"[",107],[107,"[",108],[108,"v",109],[109,",",110],[110,"1",111],[111,"]",112],[112,"]",113],[113,"]",114],[114,"]",115],[115,"]",116],[116,"0",117],[117,"1",[-,"[[n,sentencechars],[[v,1]],"":-"",[[[n,letters],[[v,1]]]]]01"]],[154,"n",155],[155,",",156],[156,"s",157],[157,"t",158],[158,"r",159],[159,"i",160],[160,"n",161],[161,"g",162],[162,"t",163],[163,"o",164],[164,"n",165],[165,"u",166],[166,"m",167],[167,"b",168]
,[168,"e",169],[169,"r",170],[170,"]",171],[171,",",172],[172,"[",173],[173,"[",174],[174,"v",175],[175,",",176],[176,"1",177],[177,"]",178],[178,",",179],[179,"[",180],[180,"v",181],[181,",",182],[182,"3",183],[183,"]",184],[184,"]",185],[185,"]",186],[186,",",187],[187,"[",188],[188,"[",189],[189,"n",190],[190,",",191],[191,"n",192],[192,"u",193],[193,"m",194],[194,"b",195],[195,"e",196],[196,"r",197],[197,"]",198],[198,",",199],[199,"[",200],[200,"[",201],[201,"v",202],[202,",",203],[203,"3",204],[204,"]",205],[205,"]",206],[206,"]",207],[207,"]",208],[208,"]",209],[209,"]",210],[210,"0",211],[211,"1",[-,"[[n,sentencechars],[[v,1]],"":-"",[[[[n,stringtonumber],[[v,1],[v,3]]],[[n,number],[[v,3]]]]]]01"]]],[[1,"[",2],[2,"[",107],[107,"n",56],[56,",",109],[109,"s",58],[58,"e",111],[111,"n",60],[60,"t",113],[113,"e",62],[62,"n",115],[115,"c",64],[64,"e",117],[117,"c",66],[66,"h",119],[119,"a",68],[68,"r",121],[121,"s",70],[70,"]",123],[123,",",72],[72,"[",125],[125,"[",74],[74,"v",127],[127,",",76],[76,"1",129],[129,"]",78],[78,"]",131],[131,",",80],[80,"""",133],[133,":",82],[82,"-",135],[135,"""",84],[84,",",137],[137,"[",86],[86,"[",139],[139,"[",88],[88,"n",141],[141,",",90],[90,"=",143],[143,"]",92],[92,",",145],[145,"[",94],[94,"[",147],[147,"v",96],[96,",",149],[149,"1",98],[98,"]",151],[151,"]",100],[100,"]",153],[153,"]",102],[102,"]",155],[155,"0",104]],[[1,"[",2],[2,"[",103],[103,"n",56],[56,",",105],[105,"f",58],[105,"s",6],[6,"e",7],[7,"n",8],[8,"t",9],[9,"e",10],[10,"n",11],[11,"c",12],[12,"e",13],[13,"c",14],[14,"h",15],[15,"a",16],[16,"r",17],[17,"s",18],[18,"]",19],[19,",",20],[20,"[",21],[21,"[",22],[22,"v",23],[23,",",24],[24,"1",25],[25,"]",26],[26,"]",27],[27,",",28],[28,"""",29],[29,":",30],[30,"-",31],[31,"""",32],[32,",",33],[33,"[",34],[34,"[",35],[35,"[",36],[36,"n",37],[37,",",38],[38,"=",39],[39,"]",40],[40,",",41],[41,"[",42],[42,"[",43],[43,"v",44],[44,",",45],[45,"1",46],[46,"]",47],[47,"]",48],[48,"]",49],[49,"]",50],[50,"]",51],[51,"0
",52],[52,"1",[-,"[[n,sentencechars],[[v,1]],"":-"",[[[n,=],[[v,1]]]]]01"]],[58,"i",59],[59,"n",60],[60,"a",61],[61,"l",62],[62,"c",63],[63,"h",64],[64,"a",65],[65,"r",66],[66,"]",67],[67,",",68],[68,"[",69],[69,"[",70],[70,"v",71],[71,",",72],[72,"1",73],[73,"]",74],[74,"]",75],[75,",",76],[76,"""",77],[77,":",78],[78,"-",79],[79,"""",80],[80,",",81],[81,"[",82],[82,"[",83],[83,"[",84],[84,"n",85],[85,",",86],[86,"=",87],[87,"]",88],[88,",",89],[89,"[",90],[90,"[",91],[91,"v",92],[92,",",93],[93,"1",94],[94,"]",95],[95,"]",96],[96,"]",97],[97,"]",98],[98,"]",99],[99,"0",100]],[[1,"[",2],[2,"[",117],[117,"n",52],[52,",",119],[119,"f",6],[119,"g",54],[119,"s",120],[6,"i",7],[7,"n",8],[8,"a",9],[9,"l",10],[10,"c",11],[11,"h",12],[12,"a",13],[13,"r",14],[14,"]",15],[15,",",16],[16,"[",17],[17,"[",18],[18,"v",19],[19,",",20],[20,"1",21],[21,"]",22],[22,"]",23],[23,",",24],[24,"""",25],[25,":",26],[26,"-",27],[27,"""",28],[28,",",29],[29,"[",30],[30,"[",31],[31,"[",32],[32,"n",33],[33,",",34],[34,"=",35],[35,"]",36],[36,",",37],[37,"[",38],[38,"[",39],[39,"v",40],[40,",",41],[41,"1",42],[42,"]",43],[43,"]",44],[44,"]",45],[45,"]",46],[46,"]",47],[47,"0",48],[48,"1",[-,"[[n,finalchar],[[v,1]],"":-"",[[[n,=],[[v,1]]]]]01"]],[54,"r",55],[55,"a",56],[56,"m",57],[57,"m",58],[58,"a",59],[59,"r",60],[60,"1",61],[61,"]",62],[62,",",63],[63,"[",64],[64,"[",65],[65,"v",66],[66,",",67],[67,"1",68],[68,"]",69],[69,",",70],[70,"[",71],[71,"v",72],[72,",",73],[73,"2",74],[74,"]",75],[75,"]",76],[76,",",77],[77,"""",78],[78,":",79],[79,"-",80],[80,"""",81],[81,",",82],[82,"[",83],[83,"[",84],[84,"[",85],[85,"n",86],[86,",",87],[87,"s",88],[88,"e",89],[89,"n",90],[90,"t",91],[91,"e",92],[92,"n",93],[93,"c",94],[94,"e",95],[95,"]",96],[96,",",97],[97,"[",98],[98,"[",99],[99,"v",100],[100,",",101],[101,"1",102],[102,"]",103],[103,",",104],[104,"[",105],[105,"v",106],[106,",",107],[107,"2",108],[108,"]",109],[109,"]",110],[110,"]",111],[111,"]",112],[112,"]",113],[113,"0",114],[114,"1",[-,
"[[n,grammar1],[[v,1],[v,2]],"":-"",[[[n,sentence],[[v,1],[v,2]]]]]01"]],[120,"u",121],[121,"b",122],[122,"j",123],[123,"e",124],[124,"c",125],[125,"t",126],[126,"]",127],[127,",",128],[128,"""",129],[129,":",130],[130,"-",131],[131,"""",132],[132,",",133],[133,"[",134],[134,"[",135],[135,"[",136],[136,"n",137],[137,",",138],[138,"t",139],[139,"r",140],[140,"u",141],[141,"e",142],[142,"]",143],[143,"]",144],[144,"]",145],[145,"]",146],[146,"0",147],[147,"1",[-,"[[n,subject],"":-"",[[[n,true]]]]01"]]],[[1,"[",2],[2,"[",80],[80,"n",51],[51,",",82],[82,"s",6],[82,"v",53],[6,"u",7],[7,"b",8],[8,"j",9],[9,"e",10],[10,"c",11],[11,"t",12],[12,"]",13],[13,",",14],[14,"[",15],[15,"[",16],[16,"v",17],[17,",",18],[18,"1",19],[19,"]",20],[20,",",21],[21,"[",22],[22,"v",23],[23,",",24],[24,"1",25],[25,"]",26],[26,"]",27],[27,",",28],[28,"""",29],[29,":",30],[30,"-",31],[31,"""",32],[32,",",33],[33,"[",34],[34,"[",35],[35,"[",36],[36,"n",37],[37,",",38],[38,"t",39],[39,"r",40],[40,"u",41],[41,"e",42],[42,"]",43],[43,"]",44],[44,"]",45],[45,"]",46],[46,"0",47],[47,"1",[-,"[[n,subject],[[v,1],[v,1]],"":-"",[[[n,true]]]]01"]],[53,"e",54],[54,"r",55],[55,"b",56],[56,"]",57],[57,",",58],[58,"""",59],[58,"[",89],[59,":",60],[60,"-",61],[61,"""",62],[62,",",63],[63,"[",64],[64,"[",65],[65,"[",66],[66,"n",67],[67,",",68],[68,"t",69],[69,"r",70],[70,"u",71],[71,"e",72],[72,"]",73],[73,"]",74],[74,"]",75],[75,"]",76],[76,"0",77],[77,"1",[-,"[[n,verb],"":-"",[[[n,true]]]]01"]],[89,"[",90],[90,"v",91],[91,",",92],[92,"1",93],[93,"]",94],[94,",",95],[95,"[",96],[96,"v",97],[97,",",98],[98,"1",99],[99,"]",100],[100,"]",101],[101,",",102],[102,"""",103],[103,":",104],[104,"-",105],[105,"""",106],[106,",",107],[107,"[",108],[108,"[",109],[109,"[",110],[110,"n",111],[111,",",112],[112,"t",113],[113,"r",114],[114,"u",115],[115,"e",116],[116,"]",117],[117,"]",118],[118,"]",119],[119,"]",120],[120,"0",121],[121,"1",[-,"[[n,verb],[[v,1],[v,1]],"":-"",[[[n,true]]]]01"]]],[[1,"[",2],[2,"[",89],[89,"n",
44],[44,",",91],[91,"n",92],[91,"o",46],[46,"b",7],[7,"j",8],[8,"e",9],[9,"c",10],[10,"t",11],[11,"]",12],[12,",",13],[13,"[",14],[14,"[",15],[15,"v",16],[16,",",17],[17,"1",58],[17,"3",18],[18,"]",19],[19,"]",20],[20,",",21],[21,"""",22],[22,":",23],[23,"-",24],[24,"""",25],[25,",",26],[26,"[",27],[27,"[",28],[28,"[",29],[29,"n",30],[30,",",31],[31,"t",32],[32,"r",33],[33,"u",34],[34,"e",35],[35,"]",36],[36,"]",37],[37,"]",38],[38,"]",39],[39,"0",40],[40,"1",[-,"[[n,object],[[v,3]],"":-"",[[[n,true]]]]01"]],[58,"]",59],[59,",",60],[60,"[",61],[61,"v",62],[62,",",63],[63,"1",64],[64,"]",65],[65,"]",66],[66,",",67],[67,"""",68],[68,":",69],[69,"-",70],[70,"""",71],[71,",",72],[72,"[",73],[73,"[",74],[74,"[",75],[75,"n",76],[76,",",77],[77,"t",78],[78,"r",79],[79,"u",80],[80,"e",81],[81,"]",82],[82,"]",83],[83,"]",84],[84,"]",85],[85,"0",86],[86,"1",[-,"[[n,object],[[v,1],[v,1]],"":-"",[[[n,true]]]]01"]],[92,"u",93],[93,"m",94],[94,"b",95],[95,"e",96],[96,"r",97],[97,"2",98],[98,"1",99],[99,"2",100],[100,"]",101],[101,",",102],[102,"[",103],[103,"[",104],[104,"v",105],[105,",",106],[106,"3",107],[107,"]",108],[108,",",109],[109,"[",110],[110,"v",111],[111,",",112],[112,"3",113],[113,"]",114],[114,"]",115],[115,",",116],[116,"""",117],[117,":",118],[118,"-",119],[119,"""",120],[120,",",121],[121,"[",122],[122,"[",123],[123,"[",124],[124,"n",125],[125,",",126],[126,"t",127],[127,"r",128],[128,"u",129],[129,"e",130],[130,"]",131],[131,"]",132],[132,"]",133],[133,"]",134],[134,"0",135],[135,"1",[-,"[[n,number212],[[v,3],[v,3]],"":-"",[[[n,true]]]]01"]]],[[1,"[",2],[2,"[",125],[125,"n",65],[65,",",127],[127,"f",128],[127,"n",6],[127,"p",67],[6,"u",7],[7,"m",8],[8,"b",9],[9,"e",10],[10,"r",11],[11,"2",12],[12,"1",13],[13,"2",14],[14,"]",15],[15,",",16],[16,"[",17],[17,"[",18],[18,"v",19],[19,",",20],[20,"1",21],[21,"]",22],[22,",",23],[23,"[",24],[24,"v",25],[25,",",26],[26,"1",27],[27,"]",28],[28,",",29],[29,"[",30],[30,"v",31],[31,",",32],[32,"3",33],[33,"]",34],[34,",",3
5],[35,"[",36],[36,"v",37],[37,",",38],[38,"3",39],[39,"]",40],[40,"]",41],[41,",",42],[42,"""",43],[43,":",44],[44,"-",45],[45,"""",46],[46,",",47],[47,"[",48],[48,"[",49],[49,"[",50],[50,"n",51],[51,",",52],[52,"t",53],[53,"r",54],[54,"u",55],[55,"e",56],[56,"]",57],[57,"]",58],[58,"]",59],[59,"]",60],[60,"0",61],[61,"1",[-,"[[n,number212],[[v,1],[v,1],[v,3],[v,3]],"":-"",[[[n,true]]]]01"]],[67,"o",68],[68,"s",69],[69,"i",70],[70,"t",71],[71,"i",72],[72,"v",73],[73,"i",74],[74,"t",75],[75,"y",76],[76,"s",77],[77,"c",78],[78,"o",79],[79,"r",80],[80,"e",81],[81,"]",82],[82,",",83],[83,"[",84],[84,"[",85],[85,"v",86],[86,",",87],[87,"2",88],[88,"]",89],[89,",",90],[90,"[",91],[91,"v",92],[92,",",93],[93,"3",94],[94,"]",95],[95,",",96],[96,"[",97],[97,"v",98],[98,",",99],[99,"3",100],[100,"]",101],[101,"]",102],[102,",",103],[103,"""",104],[104,":",105],[105,"-",106],[106,"""",107],[107,",",108],[108,"[",109],[109,"[",110],[110,"[",111],[111,"n",112],[112,",",113],[113,"t",114],[114,"r",115],[115,"u",116],[116,"e",117],[117,"]",118],[118,"]",119],[119,"]",120],[120,"]",121],[121,"0",122],[122,"1",[-,"[[n,positivityscore],[[v,2],[v,3],[v,3]],"":-"",[[[n,true]]]]01"]],[128,"u",129],[129,"n",130],[130,"c",131],[131,"t",132],[132,"i",133],[133,"o",134],[134,"n",135],[135,"]",136],[136,",",137],[137,"[",138],[138,"[",139],[139,"v",140],[140,",",141],[141,"1",142],[142,"]",143],[143,",",144],[144,"[",145],[145,"v",146],[146,",",147],[147,"2",148],[148,"]",149],[149,",",150],[150,"[",151],[151,"v",152],[152,",",153],[153,"3",154],[154,"]",155],[155,"]",156],[156,",",157],[157,"""",158],[158,":",159],[159,"-",160],[160,"""",161],[161,",",162],[162,"[",163],[163,"[",164],[164,"[",165],[165,"[",166],[166,"n",167],[167,",",168],[168,"+",169],[169,"]",170],[170,",",171],[171,"[",172],[172,"[",173],[173,"v",174],[174,",",175],[175,"1",176],[176,"]",177],[177,",",178],[178,"[",179],[179,"v",180],[180,",",181],[181,"2",182],[182,"]",183],[183,",",184],[184,"[",185],[185,"v",186],[18
6,",",187],[187,"3",188],[188,"]",189],[189,"]",190],[190,"]",191],[191,"]",192],[192,"]",193],[193,"]",194],[194,"0",195],[195,"1",[-,"[[n,function],[[v,1],[v,2],[v,3]],"":-"",[[[[n,+],[[v,1],[v,2],[v,3]]]]]]01"]]],[[1,"[",2],[2,"[",153],[153,"n",71],[71,",",155],[155,"g",6],[155,"l",73],[6,"r",7],[7,"a",8],[8,"m",9],[9,"m",10],[10,"a",11],[11,"r",12],[12,"1",13],[13,"]",14],[14,",",15],[15,"[",16],[16,"[",17],[17,"v",18],[18,",",19],[19,"1",20],[20,"]",21],[21,",",22],[21,"]",172],[22,"[",23],[23,"v",24],[24,",",25],[25,"2",26],[26,"]",27],[27,"]",28],[28,",",29],[29,"""",30],[30,":",31],[31,"-",32],[32,"""",33],[33,",",34],[34,"[",35],[35,"[",36],[36,"[",37],[37,"n",38],[38,",",39],[39,"l",40],[40,"o",41],[41,"o",42],[42,"k",43],[43,"a",44],[44,"h",45],[45,"e",46],[46,"a",47],[47,"d",48],[48,"]",49],[49,",",50],[50,"[",51],[51,"[",52],[52,"v",53],[53,",",54],[54,"1",55],[55,"]",56],[56,",",57],[57,"[",58],[58,"v",59],[59,",",60],[60,"2",61],[61,"]",62],[62,"]",63],[63,"]",64],[64,"]",65],[65,"]",66],[66,"0",67],[67,"1",[-,"[[n,grammar1],[[v,1],[v,2]],"":-"",[[[n,lookahead],[[v,1],[v,2]]]]]01"]],[73,"o",74],[74,"o",75],[75,"k",76],[76,"a",77],[77,"h",78],[78,"e",79],[79,"a",80],[80,"d",81],[81,"]",82],[82,",",83],[83,"[",84],[84,"[",85],[85,"v",86],[86,",",87],[87,"1",88],[88,"]",89],[89,",",90],[90,"[",91],[91,"v",92],[92,",",93],[93,"1",94],[94,"]",95],[95,",",96],[96,"[",97],[97,"v",98],[98,",",99],[99,"3",100],[100,"]",101],[101,"]",102],[102,",",103],[103,"""",104],[104,":",105],[105,"-",106],[106,"""",107],[107,",",108],[108,"[",109],[109,"[",110],[110,"[",111],[111,"n",112],[112,",",113],[113,"s",114],[114,"t",115],[115,"r",116],[116,"i",117],[117,"n",118],[118,"g",119],[119,"c",120],[120,"o",121],[121,"n",122],[122,"c",123],[123,"a",124],[124,"t",125],[125,"]",126],[126,",",127],[127,"[",128],[128,"[",129],[129,"v",130],[130,",",131],[131,"3",132],[132,"]",133],[133,",",134],[134,"[",135],[135,"v",136],[136,",",137],[137,"5",138],[138,"]",139],[139,",",140
],[140,"[",141],[141,"v",142],[142,",",143],[143,"1",144],[144,"]",145],[145,"]",146],[146,"]",147],[147,"]",148],[148,"]",149],[149,"0",150],[150,"1",[-,"[[n,lookahead],[[v,1],[v,1],[v,3]],"":-"",[[[n,stringconcat],[[v,3],[v,5],[v,1]]]]]01"]],[172,",",173],[173,"""",174],[174,":",175],[175,"-",176],[176,"""",177],[177,",",178],[178,"[",179],[179,"[",180],[180,"[",181],[181,"n",182],[182,",",183],[183,"s",184],[184,"e",185],[185,"n",186],[186,"t",187],[187,"e",188],[188,"n",189],[189,"c",190],[190,"e",191],[191,"]",192],[192,",",193],[193,"[",194],[194,"[",195],[195,"v",196],[196,",",197],[197,"1",198],[198,"]",199],[199,"]",200],[200,"]",201],[201,"]",202],[202,"]",203],[203,"0",204],[204,"1",[-,"[[n,grammar1],[[v,1]],"":-"",[[[n,sentence],[[v,1]]]]]01"]]],[[1,"[",2],[2,"[",83],[83,"n",37],[37,",",85],[85,"s",39],[85,"v",86],[39,"u",7],[7,"b",8],[8,"j",9],[9,"e",10],[10,"c",11],[11,"t",12],[12,"]",13],[13,",",14],[14,"""",15],[14,"[",48],[15,":",16],[16,"-",17],[17,"""",18],[18,",",19],[19,"[",20],[20,"[",21],[21,"[",22],[22,"n",23],[23,",",24],[24,"t",25],[25,"r",26],[26,"u",27],[27,"e",28],[28,"]",29],[29,"]",30],[30,"]",31],[31,"]",32],[32,"0",33],[33,"1",[-,"[[n,subject],"":-"",[[[n,true]]]]01"]],[48,"[",49],[49,"v",50],[50,",",51],[51,"1",52],[52,"]",53],[53,",",54],[54,"[",55],[55,"v",56],[56,",",57],[57,"1",58],[58,"]",59],[59,"]",60],[60,",",61],[61,"""",62],[62,":",63],[63,"-",64],[64,"""",65],[65,",",66],[66,"[",67],[67,"[",68],[68,"[",69],[69,"n",70],[70,",",71],[71,"t",72],[72,"r",73],[73,"u",74],[74,"e",75],[75,"]",76],[76,"]",77],[77,"]",78],[78,"]",79],[79,"0",80],[80,"1",[-,"[[n,subject],[[v,1],[v,1]],"":-"",[[[n,true]]]]01"]],[86,"e",87],[87,"r",88],[88,"b",89],[89,"]",90],[90,",",91],[91,"""",92],[92,":",93],[93,"-",94],[94,"""",95],[95,",",96],[96,"[",97],[97,"[",98],[98,"[",99],[99,"n",100],[100,",",101],[101,"t",102],[102,"r",103],[103,"u",104],[104,"e",105],[105,"]",106],[106,"]",107],[107,"]",108],[108,"]",109],[109,"0",110],[110,"1",[-,"[[n,
verb],"":-"",[[[n,true]]]]01"]]],[[1,"[",2],[2,"[",77],[77,"n",48],[48,",",79],[79,"n",50],[79,"v",6],[6,"e",7],[7,"r",8],[8,"b",9],[9,"]",10],[10,",",11],[11,"[",12],[12,"[",13],[13,"v",14],[14,",",15],[15,"1",16],[16,"]",17],[17,",",18],[18,"[",19],[19,"v",20],[20,",",21],[21,"1",22],[22,"]",23],[23,"]",24],[24,",",25],[25,"""",26],[26,":",27],[27,"-",28],[28,"""",29],[29,",",30],[30,"[",31],[31,"[",32],[32,"[",33],[33,"n",34],[34,",",35],[35,"t",36],[36,"r",37],[37,"u",38],[38,"e",39],[39,"]",40],[40,"]",41],[41,"]",42],[42,"]",43],[43,"0",44],[44,"1",[-,"[[n,verb],[[v,1],[v,1]],"":-"",[[[n,true]]]]01"]],[50,"o",51],[51,"u",52],[52,"n",53],[53,"]",54],[54,",",55],[55,"""",56],[55,"[",86],[56,":",57],[57,"-",58],[58,"""",59],[59,",",60],[60,"[",61],[61,"[",62],[62,"[",63],[63,"n",64],[64,",",65],[65,"t",66],[66,"r",67],[67,"u",68],[68,"e",69],[69,"]",70],[70,"]",71],[71,"]",72],[72,"]",73],[73,"0",74],[74,"1",[-,"[[n,noun],"":-"",[[[n,true]]]]01"]],[86,"[",87],[87,"v",88],[88,",",89],[89,"1",90],[90,"]",91],[91,",",92],[92,"[",93],[93,"v",94],[94,",",95],[95,"1",96],[96,"]",97],[97,"]",98],[98,",",99],[99,"""",100],[100,":",101],[101,"-",102],[102,"""",103],[103,",",104],[104,"[",105],[105,"[",106],[106,"[",107],[107,"n",108],[108,",",109],[109,"t",110],[110,"r",111],[111,"u",112],[112,"e",113],[113,"]",114],[114,"]",115],[115,"]",116],[116,"]",117],[117,"0",118],[118,"1",[-,"[[n,noun],[[v,1],[v,1]],"":-"",[[[n,true]]]]01"]]],[[1,"[",2],[2,"[",199],[199,"n",102],[102,",",201],[201,"a",104],[201,"m",202],[104,"g",7],[7,"r",8],[8,"e",9],[9,"e",10],[10,"]",11],[11,",",12],[12,"[",13],[13,"[",14],[14,"v",15],[15,",",16],[16,"1",17],[17,"]",18],[18,",",19],[19,"[",20],[20,"v",21],[21,",",22],[22,"2",23],[23,"]",24],[24,",",25],[25,"[",26],[26,"v",27],[27,",",28],[28,"3",29],[29,"]",30],[30,"]",31],[31,",",32],[32,"""",33],[33,":",34],[34,"-",35],[35,"""",36],[36,",",37],[37,"[",38],[38,"[",39],[39,"[",40],[40,"n",41],[41,",",42],[42,"m",43],[43,"e",44],[44,"m",45],[45,
"b",46],[46,"e",47],[47,"r",48],[48,"a",49],[49,"1",50],[50,"]",51],[51,",",52],[52,"[",53],[53,"[",54],[54,"v",55],[55,",",56],[56,"1",57],[56,"2",155],[57,"]",58],[58,",",59],[59,"[",60],[60,"v",61],[61,",",62],[62,"3",63],[63,"]",64],[64,"]",65],[65,"]",66],[66,",",67],[67,"[",68],[68,"[",69],[69,"n",70],[70,",",71],[71,"m",72],[72,"e",73],[73,"m",74],[74,"b",75],[75,"e",76],[76,"r",77],[77,"a",78],[78,"2",79],[79,"]",80],[80,",",81],[81,"[",82],[82,"[",83],[83,"v",84],[84,",",85],[85,"2",86],[86,"]",87],[87,",",88],[88,"[",89],[89,"v",90],[90,",",91],[91,"3",92],[92,"]",93],[93,"]",94],[94,"]",95],[95,"]",96],[96,"]",97],[97,"0",98],[98,"1",[-,"[[n,agree],[[v,1],[v,2],[v,3]],"":-"",[[[n,membera1],[[v,1],[v,3]]],[[n,membera2],[[v,2],[v,3]]]]]01"]],[155,"]",156],[156,",",157],[157,"[",158],[158,"v",159],[159,",",160],[160,"3",161],[161,"]",162],[162,"]",163],[163,"]",164],[164,",",165],[165,"[",166],[166,"[",167],[167,"n",168],[168,",",169],[169,"m",170],[170,"e",171],[171,"m",172],[172,"b",173],[173,"e",174],[174,"r",175],[175,"a",176],[176,"2",177],[177,"]",178],[178,",",179],[179,"[",180],[180,"[",181],[181,"v",182],[182,",",183],[183,"1",184],[184,"]",185],[185,",",186],[186,"[",187],[187,"v",188],[188,",",189],[189,"3",190],[190,"]",191],[191,"]",192],[192,"]",193],[193,"]",194],[194,"]",195],[195,"0",196],[196,"1",[-,"[[n,agree],[[v,1],[v,2],[v,3]],"":-"",[[[n,membera1],[[v,2],[v,3]]],[[n,membera2],[[v,1],[v,3]]]]]01"]],[202,"e",203],[203,"m",204],[204,"b",205],[205,"e",206],[206,"r",207],[207,"a",208],[208,"1",209],[209,"]",210],[210,",",211],[211,"[",212],[212,"[",213],[213,"v",214],[214,",",215],[215,"2",216],[216,"]",217],[217,"]",218],[218,",",219],[219,"""",220],[220,":",221],[221,"-",222],[222,"""",223],[223,",",224],[224,"[",225],[225,"[",226],[226,"[",227],[227,"n",228],[228,",",229],[229,"t",230],[230,"r",231],[231,"u",232],[232,"e",233],[233,"]",234],[234,"]",235],[235,"]",236],[236,"]",237],[237,"0",238],[238,"1",[-,"[[n,membera1],[[v,2]],"":-"",
[[[n,true]]]]01"]]],[[1,"[",2],[2,"[",87],[87,"n",46],[46,",",89],[89,"c",90],[89,"m",48],[48,"e",7],[7,"m",8],[8,"b",9],[9,"e",10],[10,"r",11],[11,"a",12],[12,"2",13],[12,"3",55],[13,"]",14],[14,",",15],[15,"[",16],[16,"[",17],[17,"v",18],[18,",",19],[19,"2",20],[20,"]",21],[21,"]",22],[22,",",23],[23,"""",24],[24,":",25],[25,"-",26],[26,"""",27],[27,",",28],[28,"[",29],[29,"[",30],[30,"[",31],[31,"n",32],[32,",",33],[33,"t",34],[34,"r",35],[35,"u",36],[36,"e",37],[37,"]",38],[38,"]",39],[39,"]",40],[40,"]",41],[41,"0",42],[42,"1",[-,"[[n,membera2],[[v,2]],"":-"",[[[n,true]]]]01"]],[55,"]",56],[56,",",57],[57,"[",58],[58,"[",59],[59,"v",60],[60,",",61],[61,"2",62],[62,"]",63],[63,"]",64],[64,",",65],[65,"""",66],[66,":",67],[67,"-",68],[68,"""",69],[69,",",70],[70,"[",71],[71,"[",72],[72,"[",73],[73,"n",74],[74,",",75],[75,"t",76],[76,"r",77],[77,"u",78],[78,"e",79],[79,"]",80],[80,"]",81],[81,"]",82],[82,"]",83],[83,"0",84],[84,"1",[-,"[[n,membera3],[[v,2]],"":-"",[[[n,true]]]]01"]],[90,"o",91],[91,"m",92],[92,"p",93],[93,"o",94],[94,"u",95],[95,"n",96],[96,"d",97],[97,"2",98],[98,"1",99],[99,"3",100],[100,"]",101],[101,",",102],[102,"[",103],[103,"[",104],[104,"v",105],[105,",",106],[106,"3",107],[107,"]",108],[108,",",109],[109,"[",110],[110,"v",111],[111,",",112],[112,"3",113],[113,"]",114],[114,"]",115],[115,",",116],[116,"""",117],[117,":",118],[118,"-",119],[119,"""",120],[120,",",121],[121,"[",122],[122,"[",123],[123,"[",124],[124,"n",125],[125,",",126],[126,"t",127],[127,"r",128],[128,"u",129],[129,"e",130],[130,"]",131],[131,"]",132],[132,"]",133],[133,"]",134],[134,"0",135],[135,"1",[-,"[[n,compound213],[[v,3],[v,3]],"":-"",[[[n,true]]]]01"]]],[[1,"[",2],[2,"[",117],[117,"n",67],[67,",",119],[119,"c",69],[69,"o",121],[121,"m",71],[71,"p",123],[123,"o",73],[73,"u",125],[125,"n",75],[75,"d",127],[127,"2",77],[77,"1",129],[129,"2",79],[129,"3",16],[16,"]",17],[17,",",18],[18,"[",19],[19,"[",20],[20,"v",21],[21,",",22],[22,"1",23],[23,"]",24],[24,",",25],[25
,"[",26],[26,"v",27],[27,",",28],[28,"1",29],[29,"]",30],[30,",",31],[31,"[",32],[32,"v",33],[33,",",34],[34,"3",35],[35,"]",36],[36,",",37],[37,"[",38],[38,"v",39],[39,",",40],[40,"3",41],[41,"]",42],[42,"]",43],[43,",",44],[44,"""",45],[45,":",46],[46,"-",47],[47,"""",48],[48,",",49],[49,"[",50],[50,"[",51],[51,"[",52],[52,"n",53],[53,",",54],[54,"t",55],[55,"r",56],[56,"u",57],[57,"e",58],[58,"]",59],[59,"]",60],[60,"]",61],[61,"]",62],[62,"0",63],[63,"1",[-,"[[n,compound213],[[v,1],[v,1],[v,3],[v,3]],"":-"",[[[n,true]]]]01"]],[79,"]",80],[80,",",81],[81,"[",82],[82,"[",83],[83,"v",84],[84,",",85],[85,"1",137],[85,"3",86],[86,"]",87],[87,",",88],[88,"[",89],[89,"v",90],[90,",",91],[91,"3",92],[92,"]",93],[93,"]",94],[94,",",95],[95,"""",96],[96,":",97],[97,"-",98],[98,"""",99],[99,",",100],[100,"[",101],[101,"[",102],[102,"[",103],[103,"n",104],[104,",",105],[105,"t",106],[106,"r",107],[107,"u",108],[108,"e",109],[109,"]",110],[110,"]",111],[111,"]",112],[112,"]",113],[113,"0",114],[114,"1",[-,"[[n,compound212],[[v,3],[v,3]],"":-"",[[[n,true]]]]01"]],[137,"]",138],[138,",",139],[139,"[",140],[140,"v",141],[141,",",142],[142,"1",143],[143,"]",144],[144,",",145],[145,"[",146],[146,"v",147],[147,",",148],[148,"3",149],[149,"]",150],[150,",",151],[151,"[",152],[152,"v",153],[153,",",154],[154,"3",155],[155,"]",156],[156,"]",157],[157,",",158],[158,"""",159],[159,":",160],[160,"-",161],[161,"""",162],[162,",",163],[163,"[",164],[164,"[",165],[165,"[",166],[166,"n",167],[167,",",168],[168,"t",169],[169,"r",170],[170,"u",171],[171,"e",172],[172,"]",173],[173,"]",174],[174,"]",175],[175,"]",176],[176,"0",177],[177,"1",[-,"[[n,compound212],[[v,1],[v,1],[v,3],[v,3]],"":-"",[[[n,true]]]]01"]]],[[1,"[",2],[2,"[",109],[109,"n",51],[51,",",111],[111,"f",112],[111,"w",53],[53,"o",7],[7,"r",8],[8,"d",9],[9,"2",10],[10,"1",11],[11,"2",12],[12,"]",13],[13,",",14],[14,"[",15],[15,"[",16],[16,"v",17],[17,",",18],[18,"1",66],[18,"3",19],[19,"]",20],[20,",",21],[21,"[",22],[22,"v",23]
,[23,",",24],[24,"3",25],[25,"]",26],[26,"]",27],[27,",",28],[28,"""",29],[29,":",30],[30,"-",31],[31,"""",32],[32,",",33],[33,"[",34],[34,"[",35],[35,"[",36],[36,"n",37],[37,",",38],[38,"t",39],[39,"r",40],[40,"u",41],[41,"e",42],[42,"]",43],[43,"]",44],[44,"]",45],[45,"]",46],[46,"0",47],[47,"1",[-,"[[n,word212],[[v,3],[v,3]],"":-"",[[[n,true]]]]01"]],[66,"]",67],[67,",",68],[68,"[",69],[69,"v",70],[70,",",71],[71,"1",72],[72,"]",73],[73,",",74],[74,"[",75],[75,"v",76],[76,",",77],[77,"3",78],[78,"]",79],[79,",",80],[80,"[",81],[81,"v",82],[82,",",83],[83,"3",84],[84,"]",85],[85,"]",86],[86,",",87],[87,"""",88],[88,":",89],[89,"-",90],[90,"""",91],[91,",",92],[92,"[",93],[93,"[",94],[94,"[",95],[95,"n",96],[96,",",97],[97,"t",98],[98,"r",99],[99,"u",100],[100,"e",101],[101,"]",102],[102,"]",103],[103,"]",104],[104,"]",105],[105,"0",106],[106,"1",[-,"[[n,word212],[[v,1],[v,1],[v,3],[v,3]],"":-"",[[[n,true]]]]01"]],[112,"i",113],[113,"n",114],[114,"a",115],[115,"l",116],[116,"c",117],[117,"h",118],[118,"a",119],[119,"r",120],[120,"]",121],[121,",",122],[122,"[",123],[123,"[",124],[124,"v",125],[125,",",126],[126,"1",127],[127,"]",128],[128,"]",129],[129,",",130],[130,"""",131],[131,":",132],[132,"-",133],[133,"""",134],[134,",",135],[135,"[",136],[136,"[",137],[137,"[",138],[138,"n",139],[139,",",140],[140,"=",141],[141,"]",142],[142,",",143],[143,"[",144],[144,"[",145],[145,"v",146],[146,",",147],[147,"1",148],[148,"]",149],[149,"]",150],[150,"]",151],[151,"]",152],[152,"]",153],[153,"0",154],[154,"1",[-,"[[n,finalchar],[[v,1]],"":-"",[[[n,=],[[v,1]]]]]01"]]],[[1,"[",2],[2,"[",112],[112,"n",52],[52,",",114],[114,"d",115],[114,"f",6],[114,"p",54],[6,"i",7],[7,"n",8],[8,"a",9],[9,"l",10],[10,"c",11],[11,"h",12],[12,"a",13],[13,"r",14],[14,"]",15],[15,",",16],[16,"[",17],[17,"[",18],[18,"v",19],[19,",",20],[20,"1",21],[21,"]",22],[22,"]",23],[23,",",24],[24,"""",25],[25,":",26],[26,"-",27],[27,"""",28],[28,",",29],[29,"[",30],[30,"[",31],[31,"[",32],[32,"n",33],[33,",
",34],[34,"=",35],[35,"]",36],[36,",",37],[37,"[",38],[38,"[",39],[39,"v",40],[40,",",41],[41,"1",42],[42,"]",43],[43,"]",44],[44,"]",45],[45,"]",46],[46,"]",47],[47,"0",48],[48,"1",[-,"[[n,finalchar],[[v,1]],"":-"",[[[n,=],[[v,1]]]]]01"]],[54,"o",55],[55,"s",56],[56,"i",57],[57,"t",58],[58,"i",59],[59,"v",60],[60,"i",61],[61,"t",62],[62,"y",63],[63,"s",64],[64,"c",65],[65,"o",66],[66,"r",67],[67,"e",68],[68,"]",69],[69,",",70],[70,"[",71],[71,"[",72],[72,"v",73],[73,",",74],[74,"2",75],[75,"]",76],[76,",",77],[77,"[",78],[78,"v",79],[79,",",80],[80,"3",81],[81,"]",82],[82,",",83],[83,"[",84],[84,"v",85],[85,",",86],[86,"3",87],[87,"]",88],[88,"]",89],[89,",",90],[90,"""",91],[91,":",92],[92,"-",93],[93,"""",94],[94,",",95],[95,"[",96],[96,"[",97],[97,"[",98],[98,"n",99],[99,",",100],[100,"t",101],[101,"r",102],[102,"u",103],[103,"e",104],[104,"]",105],[105,"]",106],[106,"]",107],[107,"]",108],[108,"0",109],[109,"1",[-,"[[n,positivityscore],[[v,2],[v,3],[v,3]],"":-"",[[[n,true]]]]01"]],[115,"e",116],[116,"l",117],[117,"e",118],[118,"t",119],[119,"e",120],[120,"a",121],[121,"2",122],[122,"]",123],[123,",",124],[124,"[",125],[125,"[",126],[126,"v",127],[127,",",128],[128,"2",129],[129,"]",130],[130,",",131],[131,"[",132],[132,"v",133],[133,",",134],[134,"3",135],[135,"]",136],[136,"]",137],[137,",",138],[138,"""",139],[139,":",140],[140,"-",141],[141,"""",142],[142,",",143],[143,"[",144],[144,"[",145],[145,"[",146],[146,"n",147],[147,",",148],[148,"f",149],[149,"a",150],[150,"i",151],[151,"l",152],[152,"]",153],[153,"]",154],[154,"]",155],[155,"]",156],[156,"0",157],[157,"1",[-,"[[n,deletea2],[[v,2],[v,3]],"":-"",[[[n,fail]]]]01"]]],[[1,"[",2],[2,"[",120],[120,"n",46],[46,",",122],[122,"a",48],[122,"e",123],[122,"m",6],[6,"e",7],[7,"m",8],[8,"b",9],[9,"e",10],[10,"r",11],[11,"a",12],[12,"3",13],[13,"]",14],[14,",",15],[15,"[",16],[16,"[",17],[17,"v",18],[18,",",19],[19,"2",20],[20,"]",21],[21,"]",22],[22,",",23],[23,"""",24],[24,":",25],[25,"-",26],[26,"""",27],[27,",
",28],[28,"[",29],[29,"[",30],[30,"[",31],[31,"n",32],[32,",",33],[33,"f",34],[34,"a",35],[35,"i",36],[36,"l",37],[37,"]",38],[38,"]",39],[39,"]",40],[40,"]",41],[41,"0",42],[42,"1",[-,"[[n,membera3],[[v,2]],"":-"",[[[n,fail]]]]01"]],[48,"p",49],[49,"p",50],[50,"e",51],[51,"n",52],[52,"d",53],[53,"1",54],[54,"]",55],[55,",",56],[56,"[",57],[57,"[",58],[58,"v",59],[59,",",60],[60,"1",61],[61,"]",62],[62,",",63],[63,"[",64],[64,"v",65],[65,",",66],[66,"2",67],[67,"]",68],[68,",",69],[69,"[",70],[70,"v",71],[71,",",72],[72,"3",73],[73,"]",74],[74,"]",75],[75,",",76],[76,"""",77],[77,":",78],[78,"-",79],[79,"""",80],[80,",",81],[81,"[",82],[82,"[",83],[83,"[",84],[84,"n",85],[85,",",86],[86,"a",87],[87,"p",88],[88,"p",89],[89,"e",90],[90,"n",91],[91,"d",92],[92,"]",93],[93,",",94],[94,"[",95],[95,"[",96],[96,"v",97],[97,",",98],[98,"1",99],[99,"]",100],[100,",",101],[101,"[",102],[102,"v",103],[103,",",104],[104,"2",105],[105,"]",106],[106,",",107],[107,"[",108],[108,"v",109],[109,",",110],[110,"3",111],[111,"]",112],[112,"]",113],[113,"]",114],[114,"]",115],[115,"]",116],[116,"0",117],[117,"1",[-,"[[n,append1],[[v,1],[v,2],[v,3]],"":-"",[[[n,append],[[v,1],[v,2],[v,3]]]]]01"]],[123,"q",124],[124,"u",125],[125,"a",126],[126,"l",127],[127,"s",128],[128,"1",129],[129,"1",130],[130,"]",131],[131,",",132],[132,"[",133],[133,"[",134],[134,"v",135],[135,",",136],[136,"1",137],[137,"]",138],[138,",",139],[139,"[",140],[140,"v",141],[141,",",142],[142,"1",143],[143,"]",144],[144,"]",145],[145,",",146],[146,"""",147],[147,":",148],[148,"-",149],[149,"""",150],[150,",",151],[151,"[",152],[152,"[",153],[153,"[",154],[154,"n",155],[155,",",156],[156,"t",157],[157,"r",158],[158,"u",159],[159,"e",160],[160,"]",161],[161,"]",162],[162,"]",163],[163,"]",164],[164,"0",165],[165,"1",[-,"[[n,equals11],[[v,1],[v,1]],"":-"",[[[n,true]]]]01"]]],[[1,"[",2],[2,"[",102],[102,"n",56],[56,",",104],[104,"i",105],[104,"m",58],[104,"n",6],[6,"u",7],[7,"m",8],[8,"b",9],[9,"e",10],[10,"r",11],[11,"1",
12],[12,"1",13],[13,"]",14],[14,",",15],[15,"[",16],[16,"[",17],[17,"v",18],[18,",",19],[19,"1",20],[20,"]",21],[21,"]",22],[22,",",23],[23,"""",24],[24,":",25],[25,"-",26],[26,"""",27],[27,",",28],[28,"[",29],[29,"[",30],[30,"[",31],[31,"n",32],[32,",",33],[33,"n",34],[34,"u",35],[35,"m",36],[36,"b",37],[37,"e",38],[38,"r",39],[39,"]",40],[40,",",41],[41,"[",42],[42,"[",43],[43,"v",44],[44,",",45],[45,"1",46],[46,"]",47],[47,"]",48],[48,"]",49],[49,"]",50],[50,"]",51],[51,"0",52],[52,"1",[-,"[[n,number11],[[v,1]],"":-"",[[[n,number],[[v,1]]]]]01"]],[58,"i",59],[59,"n",60],[60,"u",61],[61,"s",62],[62,"1",63],[63,"1",64],[64,"]",65],[65,",",66],[66,"[",67],[67,"[",68],[68,"v",69],[69,",",70],[70,"1",71],[71,"]",72],[72,",",73],[73,"[",74],[74,"v",75],[75,",",76],[76,"1",77],[77,"]",78],[78,"]",79],[79,",",80],[80,"""",81],[81,":",82],[82,"-",83],[83,"""",84],[84,",",85],[85,"[",86],[86,"[",87],[87,"[",88],[88,"n",89],[89,",",90],[90,"t",91],[91,"r",92],[92,"u",93],[93,"e",94],[94,"]",95],[95,"]",96],[96,"]",97],[97,"]",98],[98,"0",99],[99,"1",[-,"[[n,minus11],[[v,1],[v,1]],"":-"",[[[n,true]]]]01"]],[105,"f",106],[106,"1",107],[107,"1",108],[108,"]",109],[109,",",110],[110,"[",111],[111,"[",112],[112,"v",113],[113,",",114],[114,"1",115],[115,"]",116],[116,",",117],[117,"[",118],[118,"v",119],[119,",",120],[120,"2",121],[121,"]",122],[122,"]",123],[123,",",124],[124,"""",125],[125,":",126],[126,"-",127],[127,"""",128],[128,",",129],[129,"[",130],[130,"[",131],[131,"[",132],[132,"n",133],[133,",",134],[134,"""",135],[135,"-",136],[136,">",137],[137,"""",138],[138,"]",139],[139,",",140],[140,"[",141],[141,"[",142],[142,"[",143],[143,"n",144],[144,",",145],[145,"i",146],[146,"s",147],[147,"]",148],[148,",",149],[149,"[",150],[150,"[",151],[151,"v",152],[152,",",153],[153,"1",154],[154,"]",155],[155,"]",156],[156,"]",157],[157,",",158],[158,"[",159],[159,"[",160],[160,"n",161],[161,",",162],[162,"i",163],[163,"s",164],[164,"]",165],[165,",",166],[166,"[",167],[167,"[",168]
,[168,"v",169],[169,",",170],[170,"2",171],[171,"]",172],[172,"]",173],[173,"]",174],[174,",",175],[175,"[",176],[176,"[",177],[177,"n",178],[178,",",179],[179,"i",180],[180,"s",181],[181,"]",182],[182,",",183],[183,"[",184],[184,"[",185],[185,"v",186],[186,",",187],[187,"2",188],[188,"]",189],[189,"]",190],[190,"]",191],[191,"]",192],[192,"]",193],[193,"]",194],[194,"]",195],[195,"0",196],[196,"1",[-,"[[n,if11],[[v,1],[v,2]],"":-"",[[[n,""->""],[[[n,is],[[v,1]]],[[n,is],[[v,2]]],[[n,is],[[v,2]]]]]]]01"]]],[[1,"[",2],[2,"[",131],[131,"n",60],[60,",",133],[133,"d",134],[133,"n",6],[133,"o",62],[6,"o",7],[7,"t",8],[8,"1",9],[9,"1",10],[10,"]",11],[11,",",12],[12,"[",13],[13,"[",14],[14,"v",15],[15,",",16],[16,"1",17],[17,"]",18],[18,"]",19],[19,",",20],[20,"""",21],[21,":",22],[22,"-",23],[23,"""",24],[24,",",25],[25,"[",26],[26,"[",27],[27,"[",28],[28,"n",29],[29,",",30],[30,"n",31],[31,"o",32],[32,"t",33],[33,"]",34],[34,",",35],[35,"[",36],[36,"[",37],[37,"[",38],[38,"n",39],[39,",",40],[40,"=",41],[41,"]",42],[42,",",43],[43,"[",44],[44,"[",45],[45,"v",46],[46,",",47],[47,"1",48],[48,"]",49],[49,"]",50],[50,"]",51],[51,"]",52],[52,"]",53],[53,"]",54],[54,"]",55],[55,"0",56],[56,"1",[-,"[[n,not11],[[v,1]],"":-"",[[[n,not],[[[n,=],[[v,1]]]]]]]01"]],[62,"r",63],[63,"1",64],[64,"1",65],[65,"]",66],[66,",",67],[67,"[",68],[68,"[",69],[69,"v",70],[70,",",71],[71,"1",72],[72,"]",73],[73,"]",74],[74,",",75],[75,"""",76],[76,":",77],[77,"-",78],[78,"""",79],[79,",",80],[80,"[",81],[81,"[",82],[82,"[",83],[83,"n",84],[84,",",85],[85,"o",86],[86,"r",87],[87,"]",88],[88,",",89],[89,"[",90],[90,"[",91],[91,"[",92],[92,"n",93],[93,",",94],[94,"i",95],[95,"s",96],[96,"]",97],[97,",",98],[98,"[",99],[99,"[",100],[100,"v",101],[101,",",102],[102,"1",103],[103,"]",104],[104,"]",105],[105,"]",106],[106,",",107],[107,"[",108],[108,"[",109],[109,"n",110],[110,",",111],[111,"i",112],[112,"s",113],[113,"]",114],[114,",",115],[115,"[",116],[116,"[",117],[117,"v",118],[118,",",119],[119,"
1",120],[120,"]",121],[121,"]",122],[122,"]",123],[123,"]",124],[124,"]",125],[125,"]",126],[126,"]",127],[127,"0",128],[128,"1",[-,"[[n,or11],[[v,1]],"":-"",[[[n,or],[[[n,is],[[v,1]]],[[n,is],[[v,1]]]]]]]01"]],[134,"o",135],[135,"w",136],[136,"n",137],[137,"p",138],[138,"i",139],[139,"p",140],[140,"e",141],[141,"]",142],[142,",",143],[143,"[",144],[144,"[",145],[145,"v",146],[146,",",147],[147,"1",148],[148,"]",149],[149,",",150],[150,"[",151],[151,"v",152],[152,",",153],[153,"1",154],[154,"]",155],[155,",",156],[156,"[",157],[157,"v",158],[158,",",159],[159,"3",160],[160,"]",161],[161,"]",162],[162,",",163],[163,"""",164],[164,":",165],[165,"-",166],[166,"""",167],[167,",",168],[168,"[",169],[169,"[",170],[170,"[",171],[171,"n",172],[172,",",173],[173,"t",174],[174,"r",175],[175,"u",176],[176,"e",177],[177,"]",178],[178,"]",179],[179,"]",180],[180,"]",181],[181,"0",182],[182,"1",[-,"[[n,downpipe],[[v,1],[v,1],[v,3]],"":-"",[[[n,true]]]]01"]]],[[1,"[",2],[2,"[",159],[159,"n",66],[66,",",161],[161,"g",6],[161,"i",68],[161,"l",162],[6,"e",7],[7,"t",8],[8,"i",9],[9,"t",10],[10,"e",11],[11,"m",12],[12,"n",13],[13,"]",14],[14,",",15],[15,"[",16],[16,"[",17],[17,"v",18],[18,",",19],[19,"2",20],[20,"]",21],[21,",",22],[22,"[",23],[23,"v",24],[24,",",25],[25,"3",26],[26,"]",27],[27,"]",28],[28,",",29],[29,"""",30],[30,":",31],[31,"-",32],[32,"""",33],[33,",",34],[34,"[",35],[35,"[",36],[36,"[",37],[37,"n",38],[38,",",39],[39,"h",40],[40,"e",41],[41,"a",42],[42,"d",43],[43,"]",44],[44,",",45],[45,"[",46],[46,"[",47],[47,"v",48],[48,",",49],[49,"2",50],[50,"]",51],[51,",",52],[52,"[",53],[53,"v",54],[54,",",55],[55,"3",56],[56,"]",57],[57,"]",58],[58,"]",59],[59,"]",60],[60,"]",61],[61,"0",62],[62,"1",[-,"[[n,getitemn],[[v,2],[v,3]],"":-"",[[[n,head],[[v,2],[v,3]]]]]01"]],[68,"d",69],[69,"e",70],[70,"n",71],[71,"t",72],[72,"i",73],[73,"c",74],[74,"a",75],[75,"l",76],[76,"]",77],[77,",",78],[78,"[",79],[79,"[",80],[80,"v",81],[81,",",82],[82,"1",83],[83,"]",84],[84,",",85],[8
5,"[",86],[86,"v",87],[87,",",88],[88,"2",89],[89,"]",90],[90,"]",91],[91,",",92],[92,"""",93],[93,":",94],[94,"-",95],[95,"""",96],[96,",",97],[97,"[",98],[98,"[",99],[99,"[",100],[100,"n",101],[101,",",102],[102,"+",103],[103,"]",104],[104,",",105],[105,"[",106],[106,"[",107],[107,"v",108],[108,",",109],[109,"1",110],[110,"]",111],[111,",",112],[112,"[",113],[113,"v",114],[114,",",115],[115,"2",116],[116,"]",117],[117,",",118],[118,"[",119],[119,"v",120],[120,",",121],[121,"5",122],[122,"]",123],[123,"]",124],[124,"]",125],[125,",",126],[126,"[",127],[127,"[",128],[128,"n",129],[129,",",130],[130,"+",131],[131,"]",132],[132,",",133],[133,"[",134],[134,"[",135],[135,"v",136],[136,",",137],[137,"2",138],[138,"]",139],[139,",",140],[140,"[",141],[141,"v",142],[142,",",143],[143,"1",144],[144,"]",145],[145,",",146],[146,"[",147],[147,"v",148],[148,",",149],[149,"5",150],[150,"]",151],[151,"]",152],[152,"]",153],[153,"]",154],[154,"]",155],[155,"0",156],[156,"1",[-,"[[n,identical],[[v,1],[v,2]],"":-"",[[[n,+],[[v,1],[v,2],[v,5]]],[[n,+],[[v,2],[v,1],[v,5]]]]]01"]],[162,"e",163],[163,"n",164],[164,"g",165],[165,"t",166],[166,"h",167],[167,"1",168],[168,"]",169],[169,",",170],[170,"[",171],[171,"[",172],[172,"v",173],[173,",",174],[174,"2",175],[175,"]",176],[176,",",177],[177,"[",178],[178,"v",179],[179,",",180],[180,"2",181],[181,"]",182],[182,"]",183],[183,",",184],[184,"""",185],[185,":",186],[186,"-",187],[187,"""",188],[188,",",189],[189,"[",190],[190,"[",191],[191,"[",192],[192,"n",193],[193,",",194],[194,"t",195],[195,"r",196],[196,"u",197],[197,"e",198],[198,"]",199],[199,"]",200],[200,"]",201],[201,"]",202],[202,"0",203],[203,"1",[-,"[[n,length1],[[v,2],[v,2]],"":-"",[[[n,true]]]]01"]]],[[1,"[",2],[2,"[",139],[139,"n",81],[81,",",141],[141,"i",83],[141,"m",6],[6,"e",7],[6,"i",143],[7,"m",8],[8,"b",9],[9,"e",10],[10,"r",11],[11,"1",12],[12,"a",13],[13,"]",14],[14,",",15],[15,"[",16],[16,"[",17],[17,"v",18],[18,",",19],[19,"1",20],[20,"]",21],[21,",",22],[22,"[",
23],[23,"v",24],[24,",",25],[25,"2",26],[26,"]",27],[27,"]",28],[28,",",29],[29,"""",30],[30,":",31],[31,"-",32],[32,"""",33],[33,",",34],[34,"[",35],[35,"[",36],[36,"[",37],[37,"n",38],[38,",",39],[39,"i",40],[40,"n",41],[41,"t",42],[42,"e",43],[43,"r",44],[44,"s",45],[45,"e",46],[46,"c",47],[47,"t",48],[48,"i",49],[49,"o",50],[50,"n",51],[51,"2",52],[52,"]",53],[53,",",54],[54,"[",55],[55,"[",56],[56,"v",57],[57,",",58],[58,"1",59],[59,"]",60],[60,",",61],[61,"[",62],[62,"v",63],[63,",",64],[64,"2",65],[65,"]",66],[66,",",67],[67,"[",68],[68,"v",69],[69,",",70],[70,"6",71],[71,"]",72],[72,"]",73],[73,"]",74],[74,"]",75],[75,"]",76],[76,"0",77],[77,"1",[-,"[[n,member1a],[[v,1],[v,2]],"":-"",[[[n,intersection2],[[v,1],[v,2],[v,6]]]]]01"]],[83,"n",84],[84,"t",85],[85,"e",86],[86,"r",87],[87,"s",88],[88,"e",89],[89,"c",90],[90,"t",91],[91,"i",92],[92,"o",93],[93,"n",94],[94,"2",95],[95,"]",96],[96,",",97],[97,"[",98],[98,"[",99],[99,"v",100],[100,",",101],[101,"1",102],[102,"]",103],[103,",",104],[104,"[",105],[105,"v",106],[106,",",107],[107,"3",108],[108,"]",109],[109,",",110],[110,"[",111],[111,"v",112],[112,",",113],[113,"3",114],[114,"]",115],[115,"]",116],[116,",",117],[117,"""",118],[118,":",119],[119,"-",120],[120,"""",121],[121,",",122],[122,"[",123],[123,"[",124],[124,"[",125],[125,"n",126],[126,",",127],[127,"t",128],[128,"r",129],[129,"u",130],[130,"e",131],[131,"]",132],[132,"]",133],[133,"]",134],[134,"]",135],[135,"0",136],[136,"1",[-,"[[n,intersection2],[[v,1],[v,3],[v,3]],"":-"",[[[n,true]]]]01"]],[143,"n",144],[144,"u",145],[145,"s",146],[146,"1",147],[147,"]",148],[148,",",149],[149,"[",150],[150,"[",151],[151,"v",152],[152,",",153],[153,"1",154],[154,"]",155],[155,",",156],[156,"[",157],[157,"v",158],[158,",",159],[159,"1",160],[160,"]",161],[161,"]",162],[162,",",163],[163,"""",164],[164,":",165],[165,"-",166],[166,"""",167],[167,",",168],[168,"[",169],[169,"[",170],[170,"[",171],[171,"n",172],[172,",",173],[173,"t",174],[174,"r",175],[175,"u",176
],[176,"e",177],[177,"]",178],[178,"]",179],[179,"]",180],[180,"]",181],[181,"0",182],[182,"1",[-,"[[n,minus1],[[v,1],[v,1]],"":-"",[[[n,true]]]]01"]]],[[1,"[",2],[2,"[",91],[91,"n",57],[57,",",93],[93,"d",6],[93,"s",59],[6,"e",7],[7,"l",8],[8,"e",9],[9,"t",10],[10,"e",11],[11,"2",12],[12,"]",13],[13,",",14],[14,"[",15],[15,"[",16],[16,"v",17],[17,",",18],[18,"2",19],[19,"]",20],[20,",",21],[21,"[",22],[22,"v",23],[23,",",24],[24,"3",25],[25,"]",26],[26,",",27],[27,"[",28],[28,"v",29],[29,",",30],[30,"3",31],[31,"]",32],[32,"]",33],[33,",",34],[34,"""",35],[35,":",36],[36,"-",37],[37,"""",38],[38,",",39],[39,"[",40],[40,"[",41],[41,"[",42],[42,"n",43],[43,",",44],[44,"t",45],[45,"r",46],[46,"u",47],[47,"e",48],[48,"]",49],[49,"]",50],[50,"]",51],[51,"]",52],[52,"0",53],[53,"1",[-,"[[n,delete2],[[v,2],[v,3],[v,3]],"":-"",[[[n,true]]]]01"]],[59,"u",60],[60,"b",61],[61,"s",62],[62,"t",63],[63,"r",64],[64,"i",65],[65,"n",66],[66,"g",67],[67,"]",68],[68,",",69],[69,"""",70],[69,"[",105],[70,":",71],[71,"-",72],[72,"""",73],[73,",",74],[74,"[",75],[75,"[",76],[76,"[",77],[77,"n",78],[78,",",79],[79,"t",80],[80,"r",81],[81,"u",82],[82,"e",83],[83,"]",84],[84,"]",85],[85,"]",86],[86,"]",87],[87,"0",88],[88,"1",[-,"[[n,substring],"":-"",[[[n,true]]]]01"]],[105,"[",106],[106,"v",107],[107,",",108],[108,"2",109],[109,"]",110],[110,"]",111],[111,",",112],[112,"""",113],[113,":",114],[114,"-",115],[115,"""",116],[116,",",117],[117,"[",118],[118,"[",119],[119,"[",120],[120,"n",121],[121,",",122],[122,"n",123],[123,"o",124],[124,"t",125],[125,"]",126],[126,",",127],[127,"[",128],[128,"[",129],[129,"[",130],[130,"n",131],[131,",",132],[132,"=",133],[133,"]",134],[134,",",135],[135,"[",136],[136,"[",137],[137,"v",138],[138,",",139],[139,"2",140],[140,"]",141],[141,"]",142],[142,"]",143],[143,"]",144],[144,"]",145],[145,",",146],[146,"[",147],[147,"[",148],[148,"n",149],[149,",",150],[150,"f",151],[151,"a",152],[152,"i",153],[153,"l",154],[154,"]",155],[155,"]",156],[156,"]",157],[15
7,"]",158],[158,"0",159],[159,"1",[-,"[[n,substring],[[v,2]],"":-"",[[[n,not],[[[n,=],[[v,2]]]]],[[n,fail]]]]01"]]],[[1,"[",2],[2,"[",75],[75,"n",46],[46,",",77],[77,"l",6],[77,"o",48],[6,"i",7],[7,"s",8],[8,"t",9],[9,"h",10],[10,"e",11],[11,"a",12],[12,"d",13],[13,"]",14],[14,",",15],[15,"[",16],[16,"[",17],[17,"v",18],[18,",",19],[19,"1",20],[20,"]",21],[21,"]",22],[22,",",23],[23,"""",24],[24,":",25],[25,"-",26],[26,"""",27],[27,",",28],[28,"[",29],[29,"[",30],[30,"[",31],[31,"n",32],[32,",",33],[33,"t",34],[34,"r",35],[35,"u",36],[36,"e",37],[37,"]",38],[38,"]",39],[39,"]",40],[40,"]",41],[41,"0",42],[42,"1",[-,"[[n,listhead],[[v,1]],"":-"",[[[n,true]]]]01"]],[48,"r",49],[49,"1",50],[50,"2",51],[51,"]",52],[52,",",53],[53,"""",54],[54,":",55],[55,"-",56],[56,"""",57],[57,",",58],[58,"[",59],[59,"[",60],[60,"[",61],[61,"n",62],[62,",",63],[63,"t",64],[64,"r",65],[65,"u",66],[66,"e",67],[67,"]",68],[68,"]",69],[69,"]",70],[70,"]",71],[71,"0",72]],[[1,"[",2],[2,"[",121],[121,"n",63],[63,",",123],[123,"d",124],[123,"i",65],[65,"n",7],[7,"t",8],[8,"e",9],[9,"r",10],[10,"s",11],[11,"e",12],[12,"c",13],[13,"t",14],[14,"i",15],[15,"o",16],[16,"n",17],[17,"1",18],[17,"2",77],[18,"]",19],[19,",",20],[20,"[",21],[21,"[",22],[22,"v",23],[23,",",24],[24,"2",25],[25,"]",26],[26,",",27],[27,"[",28],[28,"v",29],[29,",",30],[30,"3",31],[31,"]",32],[32,",",33],[33,"[",34],[34,"v",35],[35,",",36],[36,"3",37],[37,"]",38],[38,"]",39],[39,",",40],[40,"""",41],[41,":",42],[42,"-",43],[43,"""",44],[44,",",45],[45,"[",46],[46,"[",47],[47,"[",48],[48,"n",49],[49,",",50],[50,"t",51],[51,"r",52],[52,"u",53],[53,"e",54],[54,"]",55],[55,"]",56],[56,"]",57],[57,"]",58],[58,"0",59],[59,"1",[-,"[[n,intersection1],[[v,2],[v,3],[v,3]],"":-"",[[[n,true]]]]01"]],[77,"]",78],[78,",",79],[79,"[",80],[80,"[",81],[81,"v",82],[82,",",83],[83,"1",84],[84,"]",85],[85,",",86],[86,"[",87],[87,"v",88],[88,",",89],[89,"3",90],[90,"]",91],[91,",",92],[92,"[",93],[93,"v",94],[94,",",95],[95,"3",96],[96,"]",97],
[97,"]",98],[98,",",99],[99,"""",100],[100,":",101],[101,"-",102],[102,"""",103],[103,",",104],[104,"[",105],[105,"[",106],[106,"[",107],[107,"n",108],[108,",",109],[109,"t",110],[110,"r",111],[111,"u",112],[112,"e",113],[113,"]",114],[114,"]",115],[115,"]",116],[116,"]",117],[117,"0",118],[118,"1",[-,"[[n,intersection2],[[v,1],[v,3],[v,3]],"":-"",[[[n,true]]]]01"]],[124,"e",125],[125,"l",126],[126,"e",127],[127,"t",128],[128,"e",129],[129,"2",130],[130,"]",131],[131,",",132],[132,"[",133],[133,"[",134],[134,"v",135],[135,",",136],[136,"2",137],[137,"]",138],[138,",",139],[139,"[",140],[140,"v",141],[141,",",142],[142,"3",143],[143,"]",144],[144,",",145],[145,"[",146],[146,"v",147],[147,",",148],[148,"3",149],[149,"]",150],[150,"]",151],[151,",",152],[152,"""",153],[153,":",154],[154,"-",155],[155,"""",156],[156,",",157],[157,"[",158],[158,"[",159],[159,"[",160],[160,"n",161],[161,",",162],[162,"t",163],[163,"r",164],[164,"u",165],[165,"e",166],[166,"]",167],[167,"]",168],[168,"]",169],[169,"]",170],[170,"0",171],[171,"1",[-,"[[n,delete2],[[v,2],[v,3],[v,3]],"":-"",[[[n,true]]]]01"]]],[[1,"[",2],[2,"[",102],[102,"n",66],[66,",",104],[104,"c",68],[104,"g",6],[6,"r",7],[7,"e",8],[8,"a",9],[9,"t",10],[10,"e",11],[11,"r",12],[12,"t",13],[13,"h",14],[14,"a",15],[15,"n",16],[16,"]",17],[17,",",18],[18,"[",19],[19,"[",20],[20,"v",21],[21,",",22],[22,"1",23],[23,"]",24],[24,",",25],[25,"[",26],[26,"v",27],[27,",",28],[28,"2",29],[29,"]",30],[30,"]",31],[31,",",32],[32,"""",33],[33,":",34],[34,"-",35],[35,"""",36],[36,",",37],[37,"[",38],[38,"[",39],[39,"[",40],[40,"n",41],[41,",",42],[42,">",43],[43,"]",44],[44,",",45],[45,"[",46],[46,"[",47],[47,"v",48],[48,",",49],[49,"1",50],[50,"]",51],[51,",",52],[52,"[",53],[53,"v",54],[54,",",55],[55,"2",56],[56,"]",57],[57,"]",58],[58,"]",59],[59,"]",60],[60,"]",61],[61,"0",62],[62,"1",[-,"[[n,greaterthan],[[v,1],[v,2]],"":-"",[[[n,>],[[v,1],[v,2]]]]]01"]],[68,"o",69],[69,"n",70],[70,"j",71],[71,"u",72],[72,"n",73],[73,"c",74],[74,"
t",75],[75,"i",76],[76,"o",77],[77,"n",78],[78,"]",79],[79,",",80],[80,"""",81],[80,"[",118],[81,":",82],[82,"-",83],[83,"""",84],[84,",",85],[85,"[",86],[86,"[",87],[87,"[",88],[88,"n",89],[89,",",90],[90,"t",91],[91,"r",92],[92,"u",93],[93,"e",94],[94,"]",95],[95,"]",96],[96,"]",97],[97,"]",98],[98,"0",99],[99,"1",[-,"[[n,conjunction],"":-"",[[[n,true]]]]01"]],[118,"[",119],[119,"v",120],[120,",",121],[121,"1",122],[122,"]",123],[123,",",124],[124,"[",125],[125,"v",126],[126,",",127],[127,"2",128],[128,"]",129],[129,"]",130],[130,",",131],[131,"""",132],[132,":",133],[133,"-",134],[134,"""",135],[135,",",136],[136,"[",137],[137,"[",138],[138,"[",139],[139,"n",140],[140,",",141],[141,"n",142],[142,"o",143],[143,"t",144],[144,"]",145],[145,",",146],[146,"[",147],[147,"[",148],[148,"[",149],[149,"[",150],[150,"n",151],[151,",",152],[152,"=",153],[153,"]",154],[154,",",155],[155,"[",156],[156,"[",157],[157,"v",158],[158,",",159],[159,"1",160],[160,"]",161],[161,"]",162],[162,"]",163],[163,",",164],[164,"[",165],[165,"[",166],[166,"n",167],[167,",",168],[168,"=",169],[169,"]",170],[170,",",171],[171,"[",172],[172,"[",173],[173,"v",174],[174,",",175],[175,"2",176],[176,"]",177],[177,"]",178],[178,"]",179],[179,"]",180],[180,"]",181],[181,"]",182],[182,"]",183],[183,"]",184],[184,"0",185],[185,"1",[-,"[[n,conjunction],[[v,1],[v,2]],"":-"",[[[n,not],[[[[n,=],[[v,1]]],[[n,=],[[v,2]]]]]]]]01"]]],[[1,"[",2],[2,"[",106],[106,"n",47],[47,",",108],[108,"s",49],[49,"o",110],[49,"u",7],[7,"m",8],[8,"]",9],[9,",",10],[10,"[",11],[11,"[",12],[12,"v",13],[13,",",14],[14,"2",15],[15,"]",16],[16,",",17],[17,"[",18],[18,"v",19],[19,",",20],[20,"2",21],[21,"]",22],[22,"]",23],[23,",",24],[24,"""",25],[25,":",26],[26,"-",27],[27,"""",28],[28,",",29],[29,"[",30],[30,"[",31],[31,"[",32],[32,"n",33],[33,",",34],[34,"t",35],[35,"r",36],[36,"u",37],[37,"e",38],[38,"]",39],[39,"]",40],[40,"]",41],[41,"]",42],[42,"0",43],[43,"1",[-,"[[n,sum],[[v,2],[v,2]],"":-"",[[[n,true]]]]01"]],[110,"r",51],
[51,"t",52],[52,"0",53],[52,"1",113],[53,"]",54],[54,",",55],[55,"[",56],[56,"[",57],[57,"v",58],[58,",",59],[59,"1",60],[60,"]",61],[61,",",62],[62,"[",63],[63,"v",64],[64,",",65],[65,"2",66],[66,"]",67],[67,"]",68],[68,",",69],[69,"""",70],[70,":",71],[71,"-",72],[72,"""",73],[73,",",74],[74,"[",75],[75,"[",76],[76,"[",77],[77,"n",78],[78,",",79],[79,"s",80],[80,"o",81],[81,"r",82],[82,"t",83],[83,"1",84],[84,"]",85],[85,",",86],[86,"[",87],[87,"[",88],[88,"v",89],[89,",",90],[90,"1",91],[91,"]",92],[92,",",93],[93,"[",94],[94,"v",95],[95,",",96],[96,"2",97],[97,"]",98],[98,"]",99],[99,"]",100],[100,"]",101],[101,"]",102],[102,"0",103],[103,"1",[-,"[[n,sort0],[[v,1],[v,2]],"":-"",[[[n,sort1],[[v,1],[v,2]]]]]01"]],[113,"]",114],[114,",",115],[115,"[",116],[116,"[",117],[117,"v",118],[118,",",119],[119,"2",120],[120,"]",121],[121,",",122],[122,"[",123],[123,"v",124],[124,",",125],[125,"2",126],[126,"]",127],[127,"]",128],[128,",",129],[129,"""",130],[130,":",131],[131,"-",132],[132,"""",133],[133,",",134],[134,"[",135],[135,"[",136],[136,"[",137],[137,"n",138],[138,",",139],[139,"t",140],[140,"r",141],[141,"u",142],[142,"e",143],[143,"]",144],[144,"]",145],[145,"]",146],[146,"]",147],[147,"0",148],[148,"1",[-,"[[n,sort1],[[v,2],[v,2]],"":-"",[[[n,true]]]]01"]]],[[1,"[",2],[2,"[",121],[121,"n",63],[63,",",123],[123,"d",124],[123,"m",65],[65,"a",7],[7,"x",8],[8,"i",9],[9,"m",10],[10,"u",11],[11,"m",12],[12,"]",13],[13,",",14],[14,"[",15],[15,"[",16],[16,"v",17],[17,",",18],[18,"2",19],[19,"]",20],[20,",",21],[21,"[",22],[22,"v",23],[23,",",24],[24,"2",25],[25,"]",26],[26,",",27],[27,"[",28],[28,"v",29],[29,",",30],[30,"4",31],[31,"]",32],[32,",",33],[33,"[",34],[34,"v",35],[35,",",36],[36,"4",37],[37,"]",38],[38,"]",39],[39,",",40],[40,"""",41],[41,":",42],[42,"-",43],[43,"""",44],[44,",",45],[45,"[",46],[46,"[",47],[47,"[",48],[48,"n",49],[49,",",50],[50,"t",51],[51,"r",52],[52,"u",53],[53,"e",54],[54,"]",55],[55,"]",56],[56,"]",57],[57,"]",58],[58,"0",59],[124,"i",1
25],[125,"s",126],[126,"j",127],[127,"u",128],[128,"n",129],[129,"c",130],[130,"t",131],[131,"i",132],[132,"o",133],[133,"n",134],[134,"]",135],[135,",",136],[136,"""",137],[137,":",138],[138,"-",139],[139,"""",140],[140,",",141],[141,"[",142],[142,"[",143],[143,"[",144],[144,"n",145],[145,",",146],[146,"t",147],[147,"r",148],[148,"u",149],[149,"e",150],[150,"]",151],[151,"]",152],[152,"]",153],[153,"]",154],[154,"0",155],[155,"1",[-,"[[n,disjunction],"":-"",[[[n,true]]]]01"]]],[[1,"[",2],[2,"[",136],[136,"n",90],[90,",",138],[138,"d",6],[138,"e",92],[6,"i",7],[7,"s",8],[8,"j",9],[9,"u",10],[10,"n",11],[11,"c",12],[12,"t",13],[13,"i",14],[14,"o",15],[15,"n",16],[16,"]",17],[17,",",18],[18,"[",19],[19,"[",20],[20,"v",21],[21,",",22],[22,"1",23],[23,"]",24],[24,",",25],[25,"[",26],[26,"v",27],[27,",",28],[28,"2",29],[29,"]",30],[30,"]",31],[31,",",32],[32,"""",33],[33,":",34],[34,"-",35],[35,"""",36],[36,",",37],[37,"[",38],[38,"[",39],[39,"[",40],[40,"n",41],[41,",",42],[42,"n",43],[43,"o",44],[44,"t",45],[45,"]",46],[46,",",47],[47,"[",48],[48,"[",49],[49,"[",50],[50,"[",51],[51,"n",52],[52,",",53],[53,"=",54],[54,"]",55],[55,",",56],[56,"[",57],[57,"[",58],[58,"v",59],[59,",",60],[60,"1",61],[61,"]",62],[62,"]",63],[63,"]",64],[64,",",65],[65,"[",66],[66,"[",67],[67,"n",68],[68,",",69],[69,"=",70],[70,"]",71],[71,",",72],[72,"[",73],[73,"[",74],[74,"v",75],[75,",",76],[76,"2",77],[77,"]",78],[78,"]",79],[79,"]",80],[80,"]",81],[81,"]",82],[82,"]",83],[83,"]",84],[84,"]",85],[85,"0",86],[86,"1",[-,"[[n,disjunction],[[v,1],[v,2]],"":-"",[[[n,not],[[[[n,=],[[v,1]]],[[n,=],[[v,2]]]]]]]]01"]],[92,"x",93],[93,"p",94],[94,"r",95],[95,"e",96],[96,"s",97],[97,"s",98],[98,"i",99],[99,"o",100],[100,"n",101],[101,"n",102],[102,"o",103],[103,"t",104],[104,"h",105],[105,"e",106],[106,"a",107],[107,"d",108],[108,"a",109],[109,"c",110],[110,"h",111],[111,"e",112],[112,"]",113],[113,",",114],[114,"""",115],[114,"[",162],[115,":",116],[116,"-",117],[117,"""",118],[118,",",119],[119,
"[",120],[120,"[",121],[121,"[",122],[122,"n",123],[123,",",124],[124,"t",125],[125,"r",126],[126,"u",127],[127,"e",128],[128,"]",129],[129,"]",130],[130,"]",131],[131,"]",132],[132,"0",133],[133,"1",[-,"[[n,expressionnotheadache],"":-"",[[[n,true]]]]01"]],[162,"[",163],[163,"v",164],[164,",",165],[165,"1",166],[166,"]",167],[167,"]",168],[168,",",169],[169,"""",170],[170,":",171],[171,"-",172],[172,"""",173],[173,",",174],[174,"[",175],[175,"[",176],[176,"[",177],[177,"n",178],[178,",",179],[179,"n",180],[180,"o",181],[181,"t",182],[182,"]",183],[183,",",184],[184,"[",185],[185,"[",186],[186,"[",187],[187,"[",188],[188,"n",189],[189,",",190],[190,"=",191],[191,"]",192],[192,",",193],[193,"[",194],[194,"[",195],[195,"v",196],[196,",",197],[197,"1",198],[198,"]",199],[199,"]",200],[200,"]",201],[201,"]",202],[202,"]",203],[203,"]",204],[204,"]",205],[205,"]",206],[206,"0",207],[207,"1",[-,"[[n,expressionnotheadache],[[v,1]],"":-"",[[[n,not],[[[[n,=],[[v,1]]]]]]]]01"]]],[[1,"[",2],[2,"[",98],[98,"n",38],[38,",",100],[100,"f",101],[100,"m",40],[40,"a",7],[7,"i",8],[8,"n",9],[9,"r",10],[10,"o",11],[11,"l",12],[12,"e",13],[13,"]",14],[14,",",15],[15,"""",16],[15,"[",50],[16,":",17],[17,"-",18],[18,"""",19],[19,",",20],[20,"[",21],[21,"[",22],[22,"[",23],[23,"n",24],[24,",",25],[25,"t",26],[26,"r",27],[27,"u",28],[28,"e",29],[29,"]",30],[30,"]",31],[31,"]",32],[32,"]",33],[33,"0",34],[34,"1",[-,"[[n,mainrole],"":-"",[[[n,true]]]]01"]],[50,"[",51],[51,"v",52],[52,",",53],[53,"1",54],[54,"]",55],[55,"]",56],[56,",",57],[57,"""",58],[58,":",59],[59,"-",60],[60,"""",61],[61,",",62],[62,"[",63],[63,"[",64],[64,"[",65],[65,"n",66],[66,",",67],[67,"n",68],[68,"o",69],[69,"t",70],[70,"]",71],[71,",",72],[72,"[",73],[73,"[",74],[74,"[",75],[75,"[",76],[76,"n",77],[77,",",78],[78,"=",79],[79,"]",80],[80,",",81],[81,"[",82],[82,"[",83],[83,"v",84],[84,",",85],[85,"1",86],[86,"]",87],[87,"]",88],[88,"]",89],[89,"]",90],[90,"]",91],[91,"]",92],[92,"]",93],[93,"]",94],[94,"0",95],[95,"
1",[-,"[[n,mainrole],[[v,1]],"":-"",[[[n,not],[[[[n,=],[[v,1]]]]]]]]01"]],[101,"u",102],[102,"n",103],[103,"c",104],[104,"t",105],[105,"i",106],[106,"o",107],[107,"n",108],[108,"2",109],[109,"]",110],[110,",",111],[111,"[",112],[112,"[",113],[113,"v",114],[114,",",115],[115,"1",116],[116,"]",117],[117,",",118],[118,"[",119],[119,"v",120],[120,",",121],[121,"2",122],[122,"]",123],[123,",",124],[124,"[",125],[125,"v",126],[126,",",127],[127,"3",128],[128,"]",129],[129,"]",130],[130,",",131],[131,"""",132],[132,":",133],[133,"-",134],[134,"""",135],[135,",",136],[136,"[",137],[137,"[",138],[138,"[",139],[139,"n",140],[140,",",141],[141,"i",142],[142,"s",143],[143,"]",144],[144,",",145],[145,"[",146],[146,"[",147],[147,"v",148],[148,",",149],[149,"2",150],[150,"]",151],[151,",",152],[152,"[",153],[153,"v",154],[154,",",155],[155,"1",156],[156,"]",157],[157,"]",158],[158,"]",159],[159,",",160],[160,"[",161],[161,"[",162],[162,"n",163],[163,",",164],[164,"i",165],[165,"s",166],[166,"]",167],[167,",",168],[168,"[",169],[169,"[",170],[170,"v",171],[171,",",172],[172,"3",173],[173,"]",174],[174,"]",175],[175,"]",176],[176,"]",177],[177,"]",178],[178,"0",179],[179,"1",[-,"[[n,function2],[[v,1],[v,2],[v,3]],"":-"",[[[n,is],[[v,2],[v,1]]],[[n,is],[[v,3]]]]]01"]]],[[1,"[",2],[2,"[",149],[149,"n",66],[66,",",151],[151,"f",68],[151,"g",6],[6,"e",7],[7,"t",8],[8,"i",9],[9,"t",10],[10,"e",11],[11,"m",12],[12,"n",13],[13,"]",14],[14,",",15],[15,"[",16],[16,"[",17],[17,"v",18],[18,",",19],[19,"2",20],[20,"]",21],[21,",",22],[22,"[",23],[23,"v",24],[24,",",25],[25,"3",26],[26,"]",27],[27,"]",28],[28,",",29],[29,"""",30],[30,":",31],[31,"-",32],[32,"""",33],[33,",",34],[34,"[",35],[35,"[",36],[36,"[",37],[37,"n",38],[38,",",39],[39,"h",40],[40,"e",41],[41,"a",42],[42,"d",43],[43,"]",44],[44,",",45],[45,"[",46],[46,"[",47],[47,"v",48],[48,",",49],[49,"2",50],[50,"]",51],[51,",",52],[52,"[",53],[53,"v",54],[54,",",55],[55,"3",56],[56,"]",57],[57,"]",58],[58,"]",59],[59,"]",60],[60,"]",61]
,[61,"0",62],[68,"u",69],[69,"n",70],[70,"c",71],[71,"t",72],[72,"i",73],[73,"o",74],[74,"n",75],[75,"2",76],[76,"]",77],[77,",",78],[78,"[",79],[79,"[",80],[80,"v",81],[81,",",82],[82,"1",83],[83,"]",84],[84,",",85],[85,"[",86],[86,"v",87],[87,",",88],[88,"2",89],[89,"]",90],[90,",",91],[91,"[",92],[92,"v",93],[93,",",94],[94,"3",95],[95,"]",96],[96,"]",97],[97,",",98],[98,"""",99],[99,":",100],[100,"-",101],[101,"""",102],[102,",",103],[103,"[",104],[104,"[",105],[105,"[",106],[106,"n",107],[107,",",108],[108,"i",109],[109,"s",110],[110,"]",111],[111,",",112],[112,"[",113],[113,"[",114],[114,"v",115],[115,",",116],[116,"2",117],[117,"]",118],[118,",",119],[119,"[",120],[120,"v",121],[121,",",122],[122,"1",123],[123,"]",124],[124,"]",125],[125,"]",126],[126,",",127],[127,"[",128],[128,"[",129],[129,"n",130],[130,",",131],[131,"i",132],[132,"s",133],[133,"]",134],[134,",",135],[135,"[",136],[136,"[",137],[137,"v",138],[138,",",139],[139,"3",140],[140,"]",141],[141,"]",142],[142,"]",143],[143,"]",144],[144,"]",145],[145,"0",146],[146,"1",[-,"[[n,function2],[[v,1],[v,2],[v,3]],"":-"",[[[n,is],[[v,2],[v,1]]],[[n,is],[[v,3]]]]]01"]]],[[1,"[",2],[2,"[",65],[65,"n",35],[35,",",67],[67,"m",68],[67,"t",37],[37,"e",7],[7,"s",8],[8,"t",9],[9,"1",10],[9,"2",41],[10,"]",11],[11,",",12],[12,"""",13],[13,":",14],[14,"-",15],[15,"""",16],[16,",",17],[17,"[",18],[18,"[",19],[19,"[",20],[20,"n",21],[21,",",22],[22,"t",23],[23,"r",24],[24,"u",25],[25,"e",26],[26,"]",27],[27,"]",28],[28,"]",29],[29,"]",30],[30,"0",31],[31,"1",[-,"[[n,test1],"":-"",[[[n,true]]]]01"]],[41,"]",42],[42,",",43],[43,"""",44],[44,":",45],[45,"-",46],[46,"""",47],[47,",",48],[48,"[",49],[49,"[",50],[50,"[",51],[51,"n",52],[52,",",53],[53,"t",54],[54,"r",55],[55,"u",56],[56,"e",57],[57,"]",58],[58,"]",59],[59,"]",60],[60,"]",61],[61,"0",62],[62,"1",[-,"[[n,test2],"":-"",[[[n,true]]]]01"]],[68,"a",69],[69,"p",70],[70,"]",71],[71,",",72],[72,"[",73],[73,"[",74],[74,"v",75],[75,",",76],[76,"1",77],[77,"]",78],[78
,",",79],[79,"[",80],[80,"v",81],[81,",",82],[82,"3",83],[83,"]",84],[84,",",85],[85,"[",86],[86,"v",87],[87,",",88],[88,"3",89],[89,"]",90],[90,"]",91],[91,",",92],[92,"""",93],[93,":",94],[94,"-",95],[95,"""",96],[96,",",97],[97,"[",98],[98,"[",99],[99,"[",100],[100,"n",101],[101,",",102],[102,"t",103],[103,"r",104],[104,"u",105],[105,"e",106],[106,"]",107],[107,"]",108],[108,"]",109],[109,"]",110],[110,"0",111],[111,"1",[-,"[[n,map],[[v,1],[v,3],[v,3]],"":-"",[[[n,true]]]]01"]]],[[1,"[",2],[2,"[",123],[123,"n",70],[70,",",125],[125,"a",6],[125,"f",72],[125,"p",126],[6,"d",7],[7,"d",8],[8,"]",9],[9,",",10],[10,"[",11],[11,"[",12],[12,"v",13],[13,",",14],[14,"1",15],[15,"]",16],[16,",",17],[17,"[",18],[18,"v",19],[19,",",20],[20,"2",21],[21,"]",22],[22,",",23],[23,"[",24],[24,"v",25],[25,",",26],[26,"3",27],[27,"]",28],[28,"]",29],[29,",",30],[30,"""",31],[31,":",32],[32,"-",33],[33,"""",34],[34,",",35],[35,"[",36],[36,"[",37],[37,"[",38],[38,"n",39],[39,",",40],[40,"+",41],[41,"]",42],[42,",",43],[43,"[",44],[44,"[",45],[45,"v",46],[46,",",47],[47,"1",48],[48,"]",49],[49,",",50],[50,"[",51],[51,"v",52],[52,",",53],[53,"2",54],[54,"]",55],[55,",",56],[56,"[",57],[57,"v",58],[58,",",59],[59,"3",60],[60,"]",61],[61,"]",62],[62,"]",63],[63,"]",64],[64,"]",65],[65,"0",66],[66,"1",[-,"[[n,add],[[v,1],[v,2],[v,3]],"":-"",[[[n,+],[[v,1],[v,2],[v,3]]]]]01"]],[72,"i",73],[73,"n",74],[74,"d",75],[75,"a",76],[76,"l",77],[77,"l",78],[78,"1",79],[79,"]",80],[80,",",81],[81,"[",82],[82,"[",83],[83,"v",84],[84,",",85],[85,"1",86],[86,"]",87],[87,",",88],[88,"[",89],[89,"v",90],[90,",",91],[91,"3",92],[92,"]",93],[93,",",94],[94,"[",95],[95,"v",96],[96,",",97],[97,"3",98],[98,"]",99],[99,"]",100],[100,",",101],[101,"""",102],[102,":",103],[103,"-",104],[104,"""",105],[105,",",106],[106,"[",107],[107,"[",108],[108,"[",109],[109,"n",110],[110,",",111],[111,"t",112],[112,"r",113],[113,"u",114],[114,"e",115],[115,"]",116],[116,"]",117],[117,"]",118],[118,"]",119],[119,"0",120],[120,"1
",[-,"[[n,findall1],[[v,1],[v,3],[v,3]],"":-"",[[[n,true]]]]01"]],[126,"l",127],[127,"u",128],[128,"s",129],[129,"o",130],[130,"n",131],[131,"e",132],[132,"]",133],[133,",",134],[134,"[",135],[135,"[",136],[136,"v",137],[137,",",138],[138,"1",139],[139,"]",140],[140,",",141],[141,"[",142],[142,"v",143],[143,",",144],[144,"2",145],[145,"]",146],[146,"]",147],[147,",",148],[148,"""",149],[149,":",150],[150,"-",151],[151,"""",152],[152,",",153],[153,"[",154],[154,"[",155],[155,"[",156],[156,"n",157],[157,",",158],[158,"+",159],[159,"]",160],[160,",",161],[161,"[",162],[162,"[",163],[163,"v",164],[164,",",165],[165,"1",166],[166,"]",167],[167,",",168],[168,"[",169],[169,"v",170],[170,",",171],[171,"2",172],[172,"]",173],[173,"]",174],[174,"]",175],[175,"]",176],[176,"]",177],[177,"0",178],[178,"1",[-,"[[n,plusone],[[v,1],[v,2]],"":-"",[[[n,+],[[v,1],[v,2]]]]]01"]]],[[1,"[",2],[2,"[",89],[89,"n",58],[58,",",91],[91,"a",60],[91,"c",92],[91,"f",6],[6,"i",7],[7,"n",8],[8,"d",9],[9,"a",10],[10,"l",11],[11,"l",12],[12,"1",13],[13,"]",14],[14,",",15],[15,"[",16],[16,"[",17],[17,"v",18],[18,",",19],[19,"1",20],[20,"]",21],[21,",",22],[22,"[",23],[23,"v",24],[24,",",25],[25,"3",26],[26,"]",27],[27,",",28],[28,"[",29],[29,"v",30],[30,",",31],[31,"3",32],[32,"]",33],[33,"]",34],[34,",",35],[35,"""",36],[36,":",37],[37,"-",38],[38,"""",39],[39,",",40],[40,"[",41],[41,"[",42],[42,"[",43],[43,"n",44],[44,",",45],[45,"t",46],[46,"r",47],[47,"u",48],[48,"e",49],[49,"]",50],[50,"]",51],[51,"]",52],[52,"]",53],[53,"0",54],[54,"1",[-,"[[n,findall1],[[v,1],[v,3],[v,3]],"":-"",[[[n,true]]]]01"]],[60,"_",61],[61,"t",62],[62,"o",63],[63,"_",64],[64,"c",65],[65,"]",66],[66,",",67],[67,"""",68],[68,":",69],[69,"-",70],[70,"""",71],[71,",",72],[72,"[",73],[73,"[",74],[74,"[",75],[75,"n",76],[76,",",77],[77,"t",78],[78,"r",79],[79,"u",80],[80,"e",81],[81,"]",82],[82,"]",83],[83,"]",84],[84,"]",85],[85,"0",86],[86,"1",[-,"[[n,a_to_c],"":-"",[[[n,true]]]]01"]],[92,"o",93],[93,"u",94],[94,"n",95],[9
5,"t",96],[96,"]",97],[97,",",98],[98,"""",99],[99,":",100],[100,"-",101],[101,"""",102],[102,",",103],[103,"[",104],[104,"[",105],[105,"[",106],[106,"n",107],[107,",",108],[108,"c",109],[109,"u",110],[110,"t",111],[111,"]",112],[112,"]",113],[113,"]",114],[114,"]",115],[115,"0",116],[116,"1",[-,"[[n,count],"":-"",[[[n,cut]]]]01"]]],[[1,"[",2],[2,"[",81],[81,"n",35],[35,",",83],[83,"c",6],[83,"i",84],[83,"r",37],[6,"o",7],[7,"u",8],[8,"n",9],[9,"t",10],[10,"]",11],[11,",",12],[12,"""",13],[13,":",14],[14,"-",15],[15,"""",16],[16,",",17],[17,"[",18],[18,"[",19],[19,"[",20],[20,"n",21],[21,",",22],[22,"t",23],[23,"r",24],[24,"u",25],[25,"e",26],[26,"]",27],[27,"]",28],[28,"]",29],[29,"]",30],[30,"0",31],[31,"1",[-,"[[n,count],"":-"",[[[n,true]]]]01"]],[37,"e",38],[38,"v",39],[39,"e",40],[40,"r",41],[41,"s",42],[42,"e",43],[43,"]",44],[44,",",45],[45,"[",46],[46,"[",47],[47,"v",48],[48,",",49],[49,"2",50],[50,"]",51],[51,",",52],[52,"[",53],[53,"v",54],[54,",",55],[55,"2",56],[56,"]",57],[57,"]",58],[58,",",59],[59,"""",60],[60,":",61],[61,"-",62],[62,"""",63],[63,",",64],[64,"[",65],[65,"[",66],[66,"[",67],[67,"n",68],[68,",",69],[69,"t",70],[70,"r",71],[71,"u",72],[72,"e",73],[73,"]",74],[74,"]",75],[75,"]",76],[76,"]",77],[77,"0",78],[78,"1",[-,"[[n,reverse],[[v,2],[v,2]],"":-"",[[[n,true]]]]01"]],[84,"n",85],[85,"t",86],[86,"e",87],[87,"r",88],[88,"s",89],[89,"e",90],[90,"c",91],[91,"t",92],[92,"i",93],[93,"o",94],[94,"n",95],[95,"1",96],[96,"]",97],[97,",",98],[98,"[",99],[99,"[",100],[100,"v",101],[101,",",102],[102,"2",103],[103,"]",104],[104,",",105],[105,"[",106],[106,"v",107],[107,",",108],[108,"3",109],[109,"]",110],[110,",",111],[111,"[",112],[112,"v",113],[113,",",114],[114,"3",115],[115,"]",116],[116,"]",117],[117,",",118],[118,"""",119],[119,":",120],[120,"-",121],[121,"""",122],[122,",",123],[123,"[",124],[124,"[",125],[125,"[",126],[126,"n",127],[127,",",128],[128,"t",129],[129,"r",130],[130,"u",131],[131,"e",132],[132,"]",133],[133,"]",134],[134,"]",1
35],[135,"]",136],[136,"0",137],[137,"1",[-,"[[n,intersection1],[[v,2],[v,3],[v,3]],"":-"",[[[n,true]]]]01"]]],[[1,"[",2],[2,"[",137],[137,"n",63],[63,",",139],[139,"a",65],[139,"i",6],[139,"m",140],[6,"n",7],[7,"t",8],[8,"e",9],[9,"r",10],[10,"s",11],[11,"e",12],[12,"c",13],[13,"t",14],[14,"i",15],[15,"o",16],[16,"n",17],[17,"2",18],[18,"]",19],[19,",",20],[20,"[",21],[21,"[",22],[22,"v",23],[23,",",24],[24,"1",25],[25,"]",26],[26,",",27],[27,"[",28],[28,"v",29],[29,",",30],[30,"3",31],[31,"]",32],[32,",",33],[33,"[",34],[34,"v",35],[35,",",36],[36,"3",37],[37,"]",38],[38,"]",39],[39,",",40],[40,"""",41],[41,":",42],[42,"-",43],[43,"""",44],[44,",",45],[45,"[",46],[46,"[",47],[47,"[",48],[48,"n",49],[49,",",50],[50,"t",51],[51,"r",52],[52,"u",53],[53,"e",54],[54,"]",55],[55,"]",56],[56,"]",57],[57,"]",58],[58,"0",59],[59,"1",[-,"[[n,intersection2],[[v,1],[v,3],[v,3]],"":-"",[[[n,true]]]]01"]],[65,"p",66],[66,"p",67],[67,"e",68],[68,"n",69],[69,"d",70],[70,"1",71],[71,"]",72],[72,",",73],[73,"[",74],[74,"[",75],[75,"v",76],[76,",",77],[77,"1",78],[78,"]",79],[79,",",80],[80,"[",81],[81,"v",82],[82,",",83],[83,"2",84],[84,"]",85],[85,",",86],[86,"[",87],[87,"v",88],[88,",",89],[89,"3",90],[90,"]",91],[91,"]",92],[92,",",93],[93,"""",94],[94,":",95],[95,"-",96],[96,"""",97],[97,",",98],[98,"[",99],[99,"[",100],[100,"[",101],[101,"n",102],[102,",",103],[103,"a",104],[104,"p",105],[105,"p",106],[106,"e",107],[107,"n",108],[108,"d",109],[109,"]",110],[110,",",111],[111,"[",112],[112,"[",113],[113,"v",114],[114,",",115],[115,"1",116],[116,"]",117],[117,",",118],[118,"[",119],[119,"v",120],[120,",",121],[121,"2",122],[122,"]",123],[123,",",124],[124,"[",125],[125,"v",126],[126,",",127],[127,"3",128],[128,"]",129],[129,"]",130],[130,"]",131],[131,"]",132],[132,"]",133],[133,"0",134],[134,"1",[-,"[[n,append1],[[v,1],[v,2],[v,3]],"":-"",[[[n,append],[[v,1],[v,2],[v,3]]]]]01"]],[140,"i",141],[141,"n",142],[142,"u",143],[143,"s",144],[144,"1",145],[145,"]",146],[146,",",147],[1
47,"[",148],[148,"[",149],[149,"v",150],[150,",",151],[151,"1",152],[152,"]",153],[153,",",154],[154,"[",155],[155,"v",156],[156,",",157],[157,"1",158],[158,"]",159],[159,"]",160],[160,",",161],[161,"""",162],[162,":",163],[163,"-",164],[164,"""",165],[165,",",166],[166,"[",167],[167,"[",168],[168,"[",169],[169,"n",170],[170,",",171],[171,"t",172],[172,"r",173],[173,"u",174],[174,"e",175],[175,"]",176],[176,"]",177],[177,"]",178],[178,"]",179],[179,"0",180],[180,"1",[-,"[[n,minus1],[[v,1],[v,1]],"":-"",[[[n,true]]]]01"]]],[[1,"[",2],[2,"[",107],[107,"n",57],[57,",",109],[109,"d",6],[109,"m",59],[6,"e",7],[7,"l",8],[8,"e",9],[9,"t",10],[10,"e",11],[11,"2",12],[12,"]",13],[13,",",14],[14,"[",15],[15,"[",16],[16,"v",17],[17,",",18],[18,"2",19],[19,"]",20],[20,",",21],[21,"[",22],[22,"v",23],[23,",",24],[24,"3",25],[25,"]",26],[26,",",27],[27,"[",28],[28,"v",29],[29,",",30],[30,"3",31],[31,"]",32],[32,"]",33],[33,",",34],[34,"""",35],[35,":",36],[36,"-",37],[37,"""",38],[38,",",39],[39,"[",40],[40,"[",41],[41,"[",42],[42,"n",43],[43,",",44],[44,"t",45],[45,"r",46],[46,"u",47],[47,"e",48],[48,"]",49],[49,"]",50],[50,"]",51],[51,"]",52],[52,"0",53],[53,"1",[-,"[[n,delete2],[[v,2],[v,3],[v,3]],"":-"",[[[n,true]]]]01"]],[59,"e",111],[59,"u",60],[60,"t",61],[61,"u",62],[62,"a",63],[63,"l",64],[64,"l",65],[65,"y",66],[66,"e",67],[67,"x",68],[68,"c",69],[69,"l",70],[70,"u",71],[71,"s",72],[72,"i",73],[73,"v",74],[74,"e",75],[75,"]",76],[76,",",77],[77,"[",78],[78,"[",79],[79,"v",80],[80,",",81],[81,"2",82],[82,"]",83],[83,"]",84],[84,",",85],[85,"""",86],[86,":",87],[87,"-",88],[88,"""",89],[89,",",90],[90,"[",91],[91,"[",92],[92,"[",93],[93,"n",94],[94,",",95],[95,"t",96],[96,"r",97],[97,"u",98],[98,"e",99],[99,"]",100],[100,"]",101],[101,"]",102],[102,"]",103],[103,"0",104],[104,"1",[-,"[[n,mutuallyexclusive],[[v,2]],"":-"",[[[n,true]]]]01"]],[111,"m",112],[112,"b",113],[113,"e",114],[114,"r",115],[115,"a",116],[116,"3",117],[117,"]",118],[118,",",119],[119,"[",120],[120,"["
,121],[121,"v",122],[122,",",123],[123,"2",124],[124,"]",125],[125,"]",126],[126,",",127],[127,"""",128],[128,":",129],[129,"-",130],[130,"""",131],[131,",",132],[132,"[",133],[133,"[",134],[134,"[",135],[135,"n",136],[136,",",137],[137,"t",138],[138,"r",139],[139,"u",140],[140,"e",141],[141,"]",142],[142,"]",143],[143,"]",144],[144,"]",145],[145,"0",146],[146,"1",[-,"[[n,membera3],[[v,2]],"":-"",[[[n,true]]]]01"]]],[[1,"[",2],[2,"[",107],[107,"n",60],[60,",",109],[109,"d",62],[109,"m",110],[62,"e",63],[62,"u",7],[7,"p",8],[8,"l",9],[9,"i",10],[10,"c",11],[11,"a",12],[12,"t",13],[13,"e",14],[14,"s",15],[15,"]",16],[16,",",17],[17,"[",18],[18,"[",19],[19,"v",20],[20,",",21],[21,"2",22],[22,"]",23],[23,",",24],[24,"[",25],[25,"v",26],[26,",",27],[27,"3",28],[28,"]",29],[29,",",30],[30,"[",31],[31,"v",32],[32,",",33],[33,"3",34],[34,"]",35],[35,"]",36],[36,",",37],[37,"""",38],[38,":",39],[39,"-",40],[40,"""",41],[41,",",42],[42,"[",43],[43,"[",44],[44,"[",45],[45,"n",46],[46,",",47],[47,"t",48],[48,"r",49],[49,"u",50],[50,"e",51],[51,"]",52],[52,"]",53],[53,"]",54],[54,"]",55],[55,"0",56],[56,"1",[-,"[[n,duplicates],[[v,2],[v,3],[v,3]],"":-"",[[[n,true]]]]01"]],[63,"l",64],[64,"e",65],[65,"t",66],[66,"e",67],[67,"a",68],[68,"2",69],[69,"]",70],[70,",",71],[71,"[",72],[72,"[",73],[73,"v",74],[74,",",75],[75,"2",76],[76,"]",77],[77,",",78],[78,"[",79],[79,"v",80],[80,",",81],[81,"3",82],[82,"]",83],[83,"]",84],[84,",",85],[85,"""",86],[86,":",87],[87,"-",88],[88,"""",89],[89,",",90],[90,"[",91],[91,"[",92],[92,"[",93],[93,"n",94],[94,",",95],[95,"f",96],[96,"a",97],[97,"i",98],[98,"l",99],[99,"]",100],[100,"]",101],[101,"]",102],[102,"]",103],[103,"0",104],[104,"1",[-,"[[n,deletea2],[[v,2],[v,3]],"":-"",[[[n,fail]]]]01"]],[110,"e",111],[111,"m",112],[112,"b",113],[113,"e",114],[114,"r",115],[115,"a",116],[116,"4",117],[117,"]",118],[118,",",119],[119,"[",120],[120,"[",121],[121,"v",122],[122,",",123],[123,"2",124],[124,"]",125],[125,"]",126],[126,",",127],[127,"""",128]
,[128,":",129],[129,"-",130],[130,"""",131],[131,",",132],[132,"[",133],[133,"[",134],[134,"[",135],[135,"n",136],[136,",",137],[137,"f",138],[138,"a",139],[139,"i",140],[140,"l",141],[141,"]",142],[142,"]",143],[143,"]",144],[144,"]",145],[145,"0",146],[146,"1",[-,"[[n,membera4],[[v,2]],"":-"",[[[n,fail]]]]01"]]],[[1,"[",2],[2,"[",100],[100,"n",66],[66,",",102],[102,"m",6],[102,"s",68],[6,"e",7],[7,"m",8],[8,"b",9],[9,"e",10],[10,"r",11],[11,"a",12],[12,"4",13],[13,"]",14],[14,",",15],[15,"[",16],[16,"[",17],[17,"v",18],[18,",",19],[19,"1",20],[20,"]",21],[21,",",22],[22,"[",23],[23,"v",24],[24,",",25],[25,"2",26],[26,"]",27],[27,"]",28],[28,",",29],[29,"""",30],[30,":",31],[31,"-",32],[32,"""",33],[33,",",34],[34,"[",35],[35,"[",36],[36,"[",37],[37,"n",38],[38,",",39],[39,"h",40],[40,"e",41],[41,"a",42],[42,"d",43],[43,"]",44],[44,",",45],[45,"[",46],[46,"[",47],[47,"v",48],[48,",",49],[49,"1",50],[50,"]",51],[51,",",52],[52,"[",53],[53,"v",54],[54,",",55],[55,"2",56],[56,"]",57],[57,"]",58],[58,"]",59],[59,"]",60],[60,"]",61],[61,"0",62],[62,"1",[-,"[[n,membera4],[[v,1],[v,2]],"":-"",[[[n,head],[[v,1],[v,2]]]]]01"]],[68,"u",69],[69,"b",70],[70,"s",71],[71,"t",72],[72,"r",73],[73,"i",74],[74,"n",75],[75,"g",76],[76,"]",77],[77,",",78],[78,"""",79],[78,"[",114],[79,":",80],[80,"-",81],[81,"""",82],[82,",",83],[83,"[",84],[84,"[",85],[85,"[",86],[86,"n",87],[87,",",88],[88,"t",89],[89,"r",90],[90,"u",91],[91,"e",92],[92,"]",93],[93,"]",94],[94,"]",95],[95,"]",96],[96,"0",97],[97,"1",[-,"[[n,substring],"":-"",[[[n,true]]]]01"]],[114,"[",115],[115,"v",116],[116,",",117],[117,"2",118],[118,"]",119],[119,"]",120],[120,",",121],[121,"""",122],[122,":",123],[123,"-",124],[124,"""",125],[125,",",126],[126,"[",127],[127,"[",128],[128,"[",129],[129,"n",130],[130,",",131],[131,"n",132],[132,"o",133],[133,"t",134],[134,"]",135],[135,",",136],[136,"[",137],[137,"[",138],[138,"[",139],[139,"n",140],[140,",",141],[141,"=",142],[142,"]",143],[143,",",144],[144,"[",145],[145,"[",14
6],[146,"v",147],[147,",",148],[148,"2",149],[149,"]",150],[150,"]",151],[151,"]",152],[152,"]",153],[153,"]",154],[154,",",155],[155,"[",156],[156,"[",157],[157,"n",158],[158,",",159],[159,"f",160],[160,"a",161],[161,"i",162],[162,"l",163],[163,"]",164],[164,"]",165],[165,"]",166],[166,"]",167],[167,"0",168],[168,"1",[-,"[[n,substring],[[v,2]],"":-"",[[[n,not],[[[n,=],[[v,2]]]]],[[n,fail]]]]01"]]],[[1,"[",2],[2,"[",94],[94,"n",46],[46,",",96],[96,"a",48],[96,"l",6],[6,"i",7],[7,"s",8],[8,"t",9],[9,"h",10],[10,"e",11],[11,"a",12],[12,"d",13],[13,"]",14],[14,",",15],[15,"[",16],[16,"[",17],[17,"v",18],[18,",",19],[19,"1",20],[20,"]",21],[21,"]",22],[22,",",23],[23,"""",24],[24,":",25],[25,"-",26],[26,"""",27],[27,",",28],[28,"[",29],[29,"[",30],[30,"[",31],[31,"n",32],[32,",",33],[33,"t",34],[34,"r",35],[35,"u",36],[36,"e",37],[37,"]",38],[38,"]",39],[39,"]",40],[40,"]",41],[41,"0",42],[42,"1",[-,"[[n,listhead],[[v,1]],"":-"",[[[n,true]]]]01"]],[48,"d",49],[49,"d",50],[50,"]",51],[51,",",52],[52,"[",53],[53,"[",54],[54,"v",55],[55,",",56],[56,"2",57],[57,"]",58],[58,",",59],[59,"[",60],[60,"v",61],[61,",",62],[62,"2",112],[62,"3",63],[63,"]",64],[64,",",65],[65,"[",66],[66,"v",67],[67,",",68],[68,"3",69],[69,"]",70],[70,"]",71],[71,",",72],[72,"""",73],[73,":",74],[74,"-",75],[75,"""",76],[76,",",77],[77,"[",78],[78,"[",79],[79,"[",80],[80,"n",81],[81,",",82],[82,"t",83],[83,"r",84],[84,"u",85],[85,"e",86],[86,"]",87],[87,"]",88],[88,"]",89],[89,"]",90],[90,"0",91],[91,"1",[-,"[[n,add],[[v,2],[v,3],[v,3]],"":-"",[[[n,true]]]]01"]],[112,"]",113],[113,"]",114],[114,",",115],[115,"""",116],[116,":",117],[117,"-",118],[118,"""",119],[119,",",120],[120,"[",121],[121,"[",122],[122,"[",123],[123,"n",124],[124,",",125],[125,"t",126],[126,"r",127],[127,"u",128],[128,"e",129],[129,"]",130],[130,"]",131],[131,"]",132],[132,"]",133],[133,"0",134],[134,"1",[-,"[[n,add],[[v,2],[v,2]],"":-"",[[[n,true]]]]01"]]],[[1,"[",2],[2,"[",76],[76,"n",33],[33,",",78],[78,"a",35],[35,"d",80],[
80,"d",37],[37,"2",82],[37,"]",9],[9,",",10],[10,"""",11],[10,"[",40],[11,":",12],[12,"-",13],[13,"""",14],[14,",",15],[15,"[",16],[16,"[",17],[17,"[",18],[18,"n",19],[19,",",20],[20,"t",21],[21,"r",22],[22,"u",23],[23,"e",24],[24,"]",25],[25,"]",26],[26,"]",27],[27,"]",28],[28,"0",29],[29,"1",[-,"[[n,add],"":-"",[[[n,true]]]]01"]],[40,"[",41],[41,"v",42],[42,",",43],[43,"2",44],[44,"]",45],[45,"]",46],[46,",",47],[47,"""",48],[48,":",49],[49,"-",50],[50,"""",51],[51,",",52],[52,"[",53],[53,"[",54],[54,"[",55],[55,"n",56],[56,",",57],[57,"a",58],[58,"d",59],[59,"d",60],[60,"]",61],[61,",",62],[62,"[",63],[63,"[",64],[64,"v",65],[65,",",66],[66,"2",67],[67,"]",68],[68,"]",69],[69,"]",70],[70,"]",71],[71,"]",72],[72,"0",73],[73,"1",[-,"[[n,add],[[v,2]],"":-"",[[[n,add],[[v,2]]]]]01"]],[82,"]",83],[83,",",84],[84,"[",85],[85,"[",86],[86,"v",87],[87,",",88],[88,"1",89],[89,"]",90],[90,",",91],[91,"[",92],[92,"v",93],[93,",",94],[94,"2",95],[95,"]",96],[96,"]",97],[97,",",98],[98,"""",99],[99,":",100],[100,"-",101],[101,"""",102],[102,",",103],[103,"[",104],[104,"[",105],[105,"[",106],[106,"n",107],[107,",",108],[108,"=",109],[109,"]",110],[110,",",111],[111,"[",112],[112,"[",113],[113,"v",114],[114,",",115],[115,"1",116],[116,"]",117],[117,"]",118],[118,"]",119],[119,",",120],[120,"[",121],[121,"[",122],[122,"n",123],[123,",",124],[124,"=",125],[125,"]",126],[126,",",127],[127,"[",128],[128,"[",129],[129,"v",130],[130,",",131],[131,"2",132],[132,"]",133],[133,"]",134],[134,"]",135],[135,"]",136],[136,"]",137],[137,"0",138],[138,"1",[-,"[[n,add2],[[v,1],[v,2]],"":-"",[[[n,=],[[v,1]]],[[n,=],[[v,2]]]]]01"]]],[[1,"[",2],[2,"[",138],[138,"n",62],[62,",",140],[140,"1",141],[140,"a",64],[64,"d",7],[7,"d",8],[8,"0",67],[8,"3",9],[9,"]",10],[10,",",11],[11,"[",12],[12,"[",13],[13,"v",14],[14,",",15],[15,"1",16],[16,"]",17],[17,",",18],[18,"[",19],[19,"v",20],[20,",",21],[21,"2",22],[22,"]",23],[23,"]",24],[24,",",25],[25,"""",26],[26,":",27],[27,"-",28],[28,"""",29],[29,",",30]
,[30,"[",31],[31,"[",32],[32,"[",33],[33,"n",34],[34,",",35],[35,"t",36],[36,"a",37],[37,"i",38],[38,"l",39],[39,"]",40],[40,",",41],[41,"[",42],[42,"[",43],[43,"v",44],[44,",",45],[45,"1",46],[46,"]",47],[47,",",48],[48,"[",49],[49,"v",50],[50,",",51],[51,"2",52],[52,"]",53],[53,"]",54],[54,"]",55],[55,"]",56],[56,"]",57],[57,"0",58],[58,"1",[-,"[[n,add3],[[v,1],[v,2]],"":-"",[[[n,tail],[[v,1],[v,2]]]]]01"]],[67,"]",68],[68,",",69],[69,"[",70],[70,"[",71],[71,"v",72],[72,",",73],[73,"1",74],[74,"]",75],[75,",",76],[76,"[",77],[77,"v",78],[78,",",79],[79,"2",80],[80,"]",81],[81,"]",82],[82,",",83],[83,"""",84],[84,":",85],[85,"-",86],[86,"""",87],[87,",",88],[88,"[",89],[89,"[",90],[90,"[",91],[91,"n",92],[92,",",93],[93,"1",94],[94,"]",95],[95,",",96],[96,"[",97],[97,"[",98],[98,"v",99],[99,",",100],[100,"1",101],[101,"]",102],[102,",",103],[103,"[",104],[104,"v",105],[105,",",106],[106,"4",107],[107,"]",108],[108,"]",109],[109,"]",110],[110,",",111],[111,"[",112],[112,"[",113],[113,"n",114],[114,",",115],[115,"=",116],[116,"]",117],[117,",",118],[118,"[",119],[119,"[",120],[120,"v",121],[121,",",122],[122,"4",123],[123,"]",124],[124,",",125],[125,"[",126],[126,"v",127],[127,",",128],[128,"2",129],[129,"]",130],[130,"]",131],[131,"]",132],[132,"]",133],[133,"]",134],[134,"0",135],[135,"1",[-,"[[n,add0],[[v,1],[v,2]],"":-"",[[[n,1],[[v,1],[v,4]]],[[n,=],[[v,4],[v,2]]]]]01"]],[141,"]",142],[142,",",143],[143,"[",144],[144,"[",145],[145,"v",146],[146,",",147],[147,"1",148],[148,"]",149],[149,",",150],[150,"[",151],[151,"v",152],[152,",",153],[153,"2",154],[154,"]",155],[155,"]",156],[156,",",157],[157,"""",158],[158,":",159],[159,"-",160],[160,"""",161],[161,",",162],[162,"[",163],[163,"[",164],[164,"[",165],[165,"n",166],[166,",",167],[167,"a",168],[168,"d",169],[169,"d",170],[170,"2",171],[171,"]",172],[172,",",173],[173,"[",174],[174,"[",175],[175,"v",176],[176,",",177],[177,"1",178],[178,"]",179],[179,",",180],[180,"[",181],[181,"v",182],[182,",",183],[183,"4",184
],[184,"]",185],[185,"]",186],[186,"]",187],[187,",",188],[188,"[",189],[189,"[",190],[190,"n",191],[191,",",192],[192,"=",193],[193,"]",194],[194,",",195],[195,"[",196],[196,"[",197],[197,"v",198],[198,",",199],[199,"4",200],[200,"]",201],[201,",",202],[202,"[",203],[203,"v",204],[204,",",205],[205,"2",206],[206,"]",207],[207,"]",208],[208,"]",209],[209,"]",210],[210,"]",211],[211,"0",212],[212,"1",[-,"[[n,1],[[v,1],[v,2]],"":-"",[[[n,add2],[[v,1],[v,4]]],[[n,=],[[v,4],[v,2]]]]]01"]]],[[1,"[",2],[2,"[",160],[160,"n",103],[103,",",162],[162,"1",6],[162,"a",105],[6,"]",7],[7,",",8],[8,"[",9],[9,"[",10],[10,"v",11],[11,",",12],[12,"1",13],[13,"]",14],[14,",",15],[15,"[",16],[16,"v",17],[17,",",18],[18,"2",19],[19,"]",20],[20,"]",21],[21,",",22],[22,"""",23],[23,":",24],[24,"-",25],[25,"""",26],[26,",",27],[27,"[",28],[28,"[",29],[29,"[",30],[30,"n",31],[31,",",32],[32,"a",33],[33,"d",34],[34,"d",35],[35,"3",36],[36,"]",37],[37,",",38],[38,"[",39],[39,"[",40],[40,"v",41],[41,",",42],[42,"1",43],[43,"]",44],[44,",",45],[45,"[",46],[46,"v",47],[47,",",48],[48,"4",49],[49,"]",50],[50,"]",51],[51,"]",52],[52,",",53],[53,"[",54],[54,"[",55],[55,"n",56],[56,",",57],[57,"1",58],[58,"]",59],[59,",",60],[60,"[",61],[61,"[",62],[62,"v",63],[63,",",64],[64,"4",65],[65,"]",66],[66,",",67],[67,"[",68],[68,"v",69],[69,",",70],[70,"6",71],[71,"]",72],[72,"]",73],[73,"]",74],[74,",",75],[75,"[",76],[76,"[",77],[77,"n",78],[78,",",79],[79,"=",80],[80,"]",81],[81,",",82],[82,"[",83],[83,"[",84],[84,"v",85],[85,",",86],[86,"6",87],[87,"]",88],[88,",",89],[89,"[",90],[90,"v",91],[91,",",92],[92,"2",93],[93,"]",94],[94,"]",95],[95,"]",96],[96,"]",97],[97,"]",98],[98,"0",99],[99,"1",[-,"[[n,1],[[v,1],[v,2]],"":-"",[[[n,add3],[[v,1],[v,4]]],[[n,1],[[v,4],[v,6]]],[[n,=],[[v,6],[v,2]]]]]01"]],[105,"d",106],[106,"d",107],[107,"0",166],[107,"3",108],[108,"]",109],[109,",",110],[110,"[",111],[111,"[",112],[112,"v",113],[113,",",114],[114,"1",115],[115,"]",116],[116,",",117],[117,"[",118],[118,"v"
,119],[119,",",120],[120,"2",121],[121,"]",122],[122,"]",123],[123,",",124],[124,"""",125],[125,":",126],[126,"-",127],[127,"""",128],[128,",",129],[129,"[",130],[130,"[",131],[131,"[",132],[132,"n",133],[133,",",134],[134,"t",135],[135,"a",136],[136,"i",137],[137,"l",138],[138,"]",139],[139,",",140],[140,"[",141],[141,"[",142],[142,"v",143],[143,",",144],[144,"1",145],[145,"]",146],[146,",",147],[147,"[",148],[148,"v",149],[149,",",150],[150,"2",151],[151,"]",152],[152,"]",153],[153,"]",154],[154,"]",155],[155,"]",156],[156,"0",157],[157,"1",[-,"[[n,add3],[[v,1],[v,2]],"":-"",[[[n,tail],[[v,1],[v,2]]]]]01"]],[166,"]",167],[167,",",168],[168,"[",169],[169,"[",170],[170,"v",171],[171,",",172],[172,"1",173],[173,"]",174],[174,",",175],[175,"[",176],[176,"v",177],[177,",",178],[178,"2",179],[179,"]",180],[180,"]",181],[181,",",182],[182,"""",183],[183,":",184],[184,"-",185],[185,"""",186],[186,",",187],[187,"[",188],[188,"[",189],[189,"[",190],[190,"n",191],[191,",",192],[192,"a",193],[193,"d",194],[194,"d",195],[195,"3",196],[196,"]",197],[197,",",198],[198,"[",199],[199,"[",200],[200,"v",201],[201,",",202],[202,"1",203],[203,"]",204],[204,",",205],[205,"[",206],[206,"v",207],[207,",",208],[208,"4",209],[209,"]",210],[210,"]",211],[211,"]",212],[212,",",213],[213,"[",214],[214,"[",215],[215,"n",216],[216,",",217],[217,"=",218],[218,"]",219],[219,",",220],[220,"[",221],[221,"[",222],[222,"v",223],[223,",",224],[224,"4",225],[225,"]",226],[226,",",227],[227,"[",228],[228,"v",229],[229,",",230],[230,"2",231],[231,"]",232],[232,"]",233],[233,"]",234],[234,"]",235],[235,"]",236],[236,"0",237],[237,"1",[-,"[[n,add0],[[v,1],[v,2]],"":-"",[[[n,add3],[[v,1],[v,4]]],[[n,=],[[v,4],[v,2]]]]]01"]]],[[1,"[",2],[2,"[",201],[201,"n",103],[103,",",203],[203,"a",105],[105,"d",205],[205,"d",107],[107,"o",207],[207,"r",109],[109,"s",209],[209,"u",111],[111,"b",211],[211,"t",113],[113,"r",213],[213,"a",115],[115,"c",215],[215,"t",117],[117,"2",217],[217,"]",119],[119,",",219],[219,"[",121
],[121,"[",221],[221,"v",123],[123,",",223],[223,"1",125],[125,"]",225],[225,",",127],[127,"[",227],[227,"v",129],[129,",",229],[229,"2",131],[131,"]",231],[231,",",133],[133,"[",233],[233,"v",135],[135,",",235],[235,"3",137],[137,"]",237],[237,"]",139],[139,",",239],[239,"""",141],[141,":",241],[241,"-",143],[143,"""",243],[243,",",145],[145,"[",245],[245,"[",147],[147,"[",247],[247,"n",149],[149,",",249],[249,"+",52],[249,"-",151],[52,"]",53],[53,",",54],[54,"[",55],[55,"[",56],[56,"v",57],[57,",",58],[58,"1",59],[59,"]",60],[60,",",61],[61,"[",62],[62,"v",63],[63,",",64],[64,"2",65],[65,"]",66],[66,",",67],[67,"[",68],[68,"v",69],[69,",",70],[70,"3",269],[70,"6",71],[71,"]",72],[72,"]",73],[73,"]",74],[74,",",75],[75,"[",76],[76,"[",77],[77,"n",78],[78,",",79],[79,"=",80],[80,"]",81],[81,",",82],[82,"[",83],[83,"[",84],[84,"v",85],[85,",",86],[86,"6",87],[87,"]",88],[88,",",89],[89,"[",90],[90,"v",91],[91,",",92],[92,"3",93],[93,"]",94],[94,"]",95],[95,"]",96],[96,"]",97],[97,"]",98],[98,"0",99],[99,"1",[-,"[[n,addorsubtract2],[[v,1],[v,2],[v,3]],"":-"",[[[n,+],[[v,1],[v,2],[v,6]]],[[n,=],[[v,6],[v,3]]]]]01"]],[151,"]",152],[152,",",153],[153,"[",154],[154,"[",155],[155,"v",156],[156,",",157],[157,"1",158],[158,"]",159],[159,",",160],[160,"[",161],[161,"v",162],[162,",",163],[163,"2",164],[164,"]",165],[165,",",166],[166,"[",167],[167,"v",168],[168,",",169],[169,"6",170],[170,"]",171],[171,"]",172],[172,"]",173],[173,",",174],[174,"[",175],[175,"[",176],[176,"n",177],[177,",",178],[178,"=",179],[179,"]",180],[180,",",181],[181,"[",182],[182,"[",183],[183,"v",184],[184,",",185],[185,"6",186],[186,"]",187],[187,",",188],[188,"[",189],[189,"v",190],[190,",",191],[191,"3",192],[192,"]",193],[193,"]",194],[194,"]",195],[195,"]",196],[196,"]",197],[197,"0",198],[198,"1",[-,"[[n,addorsubtract2],[[v,1],[v,2],[v,3]],"":-"",[[[n,-],[[v,1],[v,2],[v,6]]],[[n,=],[[v,6],[v,3]]]]]01"]],[269,"]",270],[270,"]",271],[271,"]",272],[272,"]",273],[273,"]",274],[274,"0",275],[275,"1",
[-,"[[n,addorsubtract2],[[v,1],[v,2],[v,3]],"":-"",[[[n,+],[[v,1],[v,2],[v,3]]]]]01"]]],[[1,"[",2],[2,"[",157],[157,"n",81],[81,",",159],[159,"a",83],[83,"d",161],[161,"d",85],[85,"o",163],[163,"r",87],[87,"s",165],[165,"u",89],[89,"b",167],[167,"t",91],[91,"r",169],[169,"a",93],[93,"c",171],[171,"t",95],[95,"2",96],[95,"3",173],[96,"]",97],[97,",",98],[98,"[",99],[99,"[",100],[100,"v",101],[101,",",102],[102,"1",103],[103,"]",104],[104,",",105],[105,"[",106],[106,"v",107],[107,",",108],[108,"2",109],[109,"]",110],[110,",",111],[111,"[",112],[112,"v",113],[113,",",114],[114,"3",115],[115,"]",116],[116,"]",117],[117,",",118],[118,"""",119],[119,":",120],[120,"-",121],[121,"""",122],[122,",",123],[123,"[",124],[124,"[",125],[125,"[",126],[126,"n",127],[127,",",128],[128,"+",129],[129,"]",130],[130,",",131],[131,"[",132],[132,"[",133],[133,"v",134],[134,",",135],[135,"1",136],[136,"]",137],[137,",",138],[138,"[",139],[139,"v",140],[140,",",141],[141,"2",142],[142,"]",143],[143,",",144],[144,"[",145],[145,"v",146],[146,",",147],[147,"3",148],[148,"]",149],[149,"]",150],[150,"]",151],[151,"]",152],[152,"]",153],[153,"0",154],[154,"1",[-,"[[n,addorsubtract2],[[v,1],[v,2],[v,3]],"":-"",[[[n,+],[[v,1],[v,2],[v,3]]]]]01"]],[173,"]",20],[20,",",21],[21,"[",22],[22,"[",23],[23,"v",24],[24,",",25],[25,"1",26],[26,"]",27],[27,",",28],[28,"[",29],[29,"v",30],[30,",",31],[31,"2",32],[32,"]",33],[33,",",34],[34,"[",35],[35,"v",36],[36,",",37],[37,"3",38],[38,"]",39],[39,"]",40],[40,",",41],[41,"""",42],[42,":",43],[43,"-",44],[44,"""",45],[45,",",46],[46,"[",47],[47,"[",48],[48,"[",49],[49,"n",50],[50,",",51],[51,"-",52],[52,"]",53],[53,",",54],[54,"[",55],[55,"[",56],[56,"v",57],[57,",",58],[58,"1",59],[59,"]",60],[60,",",61],[61,"[",62],[62,"v",63],[63,",",64],[64,"2",65],[65,"]",66],[66,",",67],[67,"[",68],[68,"v",69],[69,",",70],[70,"3",71],[71,"]",72],[72,"]",73],[73,"]",74],[74,"]",75],[75,"]",76],[76,"0",77]],[[1,"[",2],[2,"[",154],[154,"n",81],[81,",",156],[156,"1",83],[156
,"a",6],[6,"d",7],[7,"d",8],[8,"0",9],[9,"]",10],[10,",",11],[11,"[",12],[12,"[",13],[13,"v",14],[14,",",15],[15,"1",16],[16,"]",17],[17,",",18],[18,"[",19],[19,"v",20],[20,",",21],[21,"2",22],[22,"]",23],[23,"]",24],[24,",",25],[25,"""",26],[26,":",27],[27,"-",28],[28,"""",29],[29,",",30],[30,"[",31],[31,"[",32],[32,"[",33],[33,"n",34],[34,",",35],[35,"1",36],[36,"]",37],[37,",",38],[38,"[",39],[39,"[",40],[40,"v",41],[41,",",42],[42,"1",43],[43,"]",44],[44,",",45],[45,"[",46],[46,"v",47],[47,",",48],[48,"4",49],[49,"]",50],[50,"]",51],[51,"]",52],[52,",",53],[53,"[",54],[54,"[",55],[55,"n",56],[56,",",57],[57,"=",58],[58,"]",59],[59,",",60],[60,"[",61],[61,"[",62],[62,"v",63],[63,",",64],[64,"4",65],[65,"]",66],[66,",",67],[67,"[",68],[68,"v",69],[69,",",70],[70,"2",71],[71,"]",72],[72,"]",73],[73,"]",74],[74,"]",75],[75,"]",76],[76,"0",77],[77,"1",[-,"[[n,add0],[[v,1],[v,2]],"":-"",[[[n,1],[[v,1],[v,4]]],[[n,=],[[v,4],[v,2]]]]]01"]],[83,"]",84],[84,",",85],[85,"[",86],[86,"[",87],[87,"v",88],[88,",",89],[89,"1",90],[90,"]",91],[91,",",92],[92,"[",93],[93,"v",94],[94,",",95],[95,"2",96],[96,"]",97],[97,"]",98],[98,",",99],[99,"""",100],[100,":",101],[101,"-",102],[102,"""",103],[103,",",104],[104,"[",105],[105,"[",106],[106,"[",107],[107,"n",108],[108,",",109],[109,"+",110],[109,"-",184],[110,"]",111],[111,",",112],[112,"[",113],[113,"[",114],[114,"v",115],[115,",",116],[116,"1",117],[117,"]",118],[118,",",119],[119,"[",120],[120,"v",121],[121,",",122],[122,"5",123],[123,"]",124],[124,"]",125],[125,"]",126],[126,",",127],[127,"[",128],[128,"[",129],[129,"n",130],[130,",",131],[131,"=",132],[132,"]",133],[133,",",134],[134,"[",135],[135,"[",136],[136,"v",137],[137,",",138],[138,"5",139],[139,"]",140],[140,",",141],[141,"[",142],[142,"v",143],[143,",",144],[144,"2",145],[145,"]",146],[146,"]",147],[147,"]",148],[148,"]",149],[149,"]",150],[150,"0",151],[151,"1",[-,"[[n,1],[[v,1],[v,2]],"":-"",[[[n,+],[[v,1],[v,5]]],[[n,=],[[v,5],[v,2]]]]]01"]],[184,"]",185],[185,","
,186],[186,"[",187],[187,"[",188],[188,"v",189],[189,",",190],[190,"1",191],[191,"]",192],[192,",",193],[193,"[",194],[194,"v",195],[195,",",196],[196,"5",197],[197,"]",198],[198,"]",199],[199,"]",200],[200,",",201],[201,"[",202],[202,"[",203],[203,"n",204],[204,",",205],[205,"=",206],[206,"]",207],[207,",",208],[208,"[",209],[209,"[",210],[210,"v",211],[211,",",212],[212,"5",213],[213,"]",214],[214,",",215],[215,"[",216],[216,"v",217],[217,",",218],[218,"2",219],[219,"]",220],[220,"]",221],[221,"]",222],[222,"]",223],[223,"]",224],[224,"0",225],[225,"1",[-,"[[n,1],[[v,1],[v,2]],"":-"",[[[n,-],[[v,1],[v,5]]],[[n,=],[[v,5],[v,2]]]]]01"]]],[[1,"[",2],[2,"[",153],[153,"n",79],[79,",",155],[155,"a",81],[81,"2",7],[81,"3",82],[81,"d",157],[7,"]",8],[8,",",9],[9,"[",10],[10,"[",11],[11,"v",12],[12,",",13],[13,"1",14],[14,"]",15],[15,",",16],[16,"[",17],[17,"v",18],[18,",",19],[19,"2",20],[20,"]",21],[21,"]",22],[22,",",23],[23,"""",24],[24,":",25],[25,"-",26],[26,"""",27],[27,",",28],[28,"[",29],[29,"[",30],[30,"[",31],[31,"n",32],[32,",",33],[33,"+",34],[34,"]",35],[35,",",36],[36,"[",37],[37,"[",38],[38,"v",39],[39,",",40],[40,"1",41],[41,"]",42],[42,",",43],[43,"[",44],[44,"v",45],[45,",",46],[46,"5",47],[47,"]",48],[48,"]",49],[49,"]",50],[50,",",51],[51,"[",52],[52,"[",53],[53,"n",54],[54,",",55],[55,"=",56],[56,"]",57],[57,",",58],[58,"[",59],[59,"[",60],[60,"v",61],[61,",",62],[62,"5",63],[63,"]",64],[64,",",65],[65,"[",66],[66,"v",67],[67,",",68],[68,"2",69],[69,"]",70],[70,"]",71],[71,"]",72],[72,"]",73],[73,"]",74],[74,"0",75],[75,"1",[-,"[[n,a2],[[v,1],[v,2]],"":-"",[[[n,+],[[v,1],[v,5]]],[[n,=],[[v,5],[v,2]]]]]01"]],[82,"]",83],[83,",",84],[84,"[",85],[85,"[",86],[86,"v",87],[87,",",88],[88,"1",89],[89,"]",90],[90,",",91],[91,"[",92],[92,"v",93],[93,",",94],[94,"2",95],[95,"]",96],[96,"]",97],[97,",",98],[98,"""",99],[99,":",100],[100,"-",101],[101,"""",102],[102,",",103],[103,"[",104],[104,"[",105],[105,"[",106],[106,"n",107],[107,",",108],[108,"-",109],[109,
"]",110],[110,",",111],[111,"[",112],[112,"[",113],[113,"v",114],[114,",",115],[115,"1",116],[116,"]",117],[117,",",118],[118,"[",119],[119,"v",120],[120,",",121],[121,"5",122],[122,"]",123],[123,"]",124],[124,"]",125],[125,",",126],[126,"[",127],[127,"[",128],[128,"n",129],[129,",",130],[130,"=",131],[131,"]",132],[132,",",133],[133,"[",134],[134,"[",135],[135,"v",136],[136,",",137],[137,"5",138],[138,"]",139],[139,",",140],[140,"[",141],[141,"v",142],[142,",",143],[143,"2",144],[144,"]",145],[145,"]",146],[146,"]",147],[147,"]",148],[148,"]",149],[149,"0",150],[150,"1",[-,"[[n,a3],[[v,1],[v,2]],"":-"",[[[n,-],[[v,1],[v,5]]],[[n,=],[[v,5],[v,2]]]]]01"]],[157,"d",158],[158,"0",159],[159,"]",160],[160,",",161],[161,"[",162],[162,"[",163],[163,"v",164],[164,",",165],[165,"1",166],[166,"]",167],[167,",",168],[168,"[",169],[169,"v",170],[170,",",171],[171,"2",172],[172,"]",173],[173,"]",174],[174,",",175],[175,"""",176],[176,":",177],[177,"-",178],[178,"""",179],[179,",",180],[180,"[",181],[181,"[",182],[182,"[",183],[183,"n",184],[184,",",185],[185,"1",186],[186,"]",187],[187,",",188],[188,"[",189],[189,"[",190],[190,"v",191],[191,",",192],[192,"1",193],[193,"]",194],[194,",",195],[195,"[",196],[196,"v",197],[197,",",198],[198,"2",199],[199,"]",200],[200,"]",201],[201,"]",202],[202,"]",203],[203,"]",204],[204,"0",205],[205,"1",[-,"[[n,add0],[[v,1],[v,2]],"":-"",[[[n,1],[[v,1],[v,2]]]]]01"]]],[[1,"[",2],[2,"[",153],[153,"n",79],[79,",",155],[155,"1",81],[155,"a",156],[81,"]",7],[7,",",8],[8,"[",9],[9,"[",10],[10,"v",11],[11,",",12],[12,"1",13],[13,"]",14],[14,",",15],[15,"[",16],[16,"v",17],[17,",",18],[18,"2",19],[19,"]",20],[20,"]",21],[21,",",22],[22,"""",23],[23,":",24],[24,"-",25],[25,"""",26],[26,",",27],[27,"[",28],[28,"[",29],[29,"[",30],[30,"n",31],[31,",",32],[32,"a",33],[33,"2",34],[33,"3",109],[34,"]",35],[35,",",36],[36,"[",37],[37,"[",38],[38,"v",39],[39,",",40],[40,"1",41],[41,"]",42],[42,",",43],[43,"[",44],[44,"v",45],[45,",",46],[46,"4",47],[47,"]",48]
,[48,"]",49],[49,"]",50],[50,",",51],[51,"[",52],[52,"[",53],[53,"n",54],[54,",",55],[55,"=",56],[56,"]",57],[57,",",58],[58,"[",59],[59,"[",60],[60,"v",61],[61,",",62],[62,"4",63],[63,"]",64],[64,",",65],[65,"[",66],[66,"v",67],[67,",",68],[68,"2",69],[69,"]",70],[70,"]",71],[71,"]",72],[72,"]",73],[73,"]",74],[74,"0",75],[75,"1",[-,"[[n,1],[[v,1],[v,2]],"":-"",[[[n,a2],[[v,1],[v,4]]],[[n,=],[[v,4],[v,2]]]]]01"]],[109,"]",110],[110,",",111],[111,"[",112],[112,"[",113],[113,"v",114],[114,",",115],[115,"1",116],[116,"]",117],[117,",",118],[118,"[",119],[119,"v",120],[120,",",121],[121,"4",122],[122,"]",123],[123,"]",124],[124,"]",125],[125,",",126],[126,"[",127],[127,"[",128],[128,"n",129],[129,",",130],[130,"=",131],[131,"]",132],[132,",",133],[133,"[",134],[134,"[",135],[135,"v",136],[136,",",137],[137,"4",138],[138,"]",139],[139,",",140],[140,"[",141],[141,"v",142],[142,",",143],[143,"2",144],[144,"]",145],[145,"]",146],[146,"]",147],[147,"]",148],[148,"]",149],[149,"0",150],[150,"1",[-,"[[n,1],[[v,1],[v,2]],"":-"",[[[n,a3],[[v,1],[v,4]]],[[n,=],[[v,4],[v,2]]]]]01"]],[156,"d",157],[157,"d",158],[158,"0",159],[159,"]",160],[160,",",161],[161,"[",162],[162,"[",163],[163,"v",164],[164,",",165],[165,"1",166],[166,"]",167],[167,",",168],[168,"[",169],[169,"v",170],[170,",",171],[171,"2",172],[172,"]",173],[173,"]",174],[174,",",175],[175,"""",176],[176,":",177],[177,"-",178],[178,"""",179],[179,",",180],[180,"[",181],[181,"[",182],[182,"[",183],[183,"n",184],[184,",",185],[185,"1",186],[186,"]",187],[187,",",188],[188,"[",189],[189,"[",190],[190,"v",191],[191,",",192],[192,"1",193],[193,"]",194],[194,",",195],[195,"[",196],[196,"v",197],[197,",",198],[198,"2",199],[199,"]",200],[200,"]",201],[201,"]",202],[202,"]",203],[203,"]",204],[204,"0",205],[205,"1",[-,"[[n,add0],[[v,1],[v,2]],"":-"",[[[n,1],[[v,1],[v,2]]]]]01"]]],[[1,"[",2],[2,"[",132],[132,"n",78],[78,",",134],[134,"1",6],[134,"a",80],[6,"]",7],[7,",",8],[8,"[",9],[9,"[",10],[10,"v",11],[11,",",12],[12,"1",13],
[13,"]",14],[14,",",15],[15,"[",16],[16,"v",17],[17,",",18],[18,"2",19],[19,"]",20],[20,"]",21],[21,",",22],[22,"""",23],[23,":",24],[24,"-",25],[25,"""",26],[26,",",27],[27,"[",28],[28,"[",29],[29,"[",30],[30,"n",31],[31,",",32],[32,"+",33],[33,"]",34],[34,",",35],[35,"[",36],[36,"[",37],[37,"v",38],[38,",",39],[39,"1",40],[40,"]",41],[41,",",42],[42,"[",43],[43,"v",44],[44,",",45],[45,"5",46],[46,"]",47],[47,"]",48],[48,"]",49],[49,",",50],[50,"[",51],[51,"[",52],[52,"n",53],[53,",",54],[54,"=",55],[55,"]",56],[56,",",57],[57,"[",58],[58,"[",59],[59,"v",60],[60,",",61],[61,"5",62],[62,"]",63],[63,",",64],[64,"[",65],[65,"v",66],[66,",",67],[67,"2",68],[68,"]",69],[69,"]",70],[70,"]",71],[71,"]",72],[72,"]",73],[73,"0",74],[80,"d",81],[81,"d",82],[82,"0",83],[83,"]",84],[84,",",85],[85,"[",86],[86,"[",87],[87,"v",88],[88,",",89],[89,"1",90],[90,"]",91],[91,",",92],[92,"[",93],[93,"v",94],[94,",",95],[95,"2",96],[96,"]",97],[97,"]",98],[98,",",99],[99,"""",100],[100,":",101],[101,"-",102],[102,"""",103],[103,",",104],[104,"[",105],[105,"[",106],[106,"[",107],[107,"n",108],[108,",",109],[109,"1",110],[110,"]",111],[111,",",112],[112,"[",113],[113,"[",114],[114,"v",115],[115,",",116],[116,"1",117],[117,"]",118],[118,",",119],[119,"[",120],[120,"v",121],[121,",",122],[122,"2",123],[123,"]",124],[124,"]",125],[125,"]",126],[126,"]",127],[127,"]",128],[128,"0",129],[129,"1",[-,"[[n,add0],[[v,1],[v,2]],"":-"",[[[n,1],[[v,1],[v,2]]]]]01"]]],[[1,"[",2],[2,"[",156],[156,"n",71],[71,",",158],[158,"1",73],[158,"a",6],[6,"d",7],[7,"d",8],[8,"0",9],[8,"2",162],[9,"]",10],[10,",",11],[11,"[",12],[12,"[",13],[13,"v",14],[14,",",15],[15,"1",16],[16,"]",17],[17,",",18],[18,"[",19],[19,"v",20],[20,",",21],[21,"2",22],[22,"]",23],[23,",",24],[24,"[",25],[25,"v",26],[26,",",27],[27,"3",28],[28,"]",29],[29,"]",30],[30,",",31],[31,"""",32],[32,":",33],[33,"-",34],[34,"""",35],[35,",",36],[36,"[",37],[37,"[",38],[38,"[",39],[39,"n",40],[40,",",41],[41,"1",42],[42,"]",43],[43,",",44],[44,"
[",45],[45,"[",46],[46,"v",47],[47,",",48],[48,"1",49],[49,"]",50],[50,",",51],[51,"[",52],[52,"v",53],[53,",",54],[54,"2",55],[55,"]",56],[56,",",57],[57,"[",58],[58,"v",59],[59,",",60],[60,"3",61],[61,"]",62],[62,"]",63],[63,"]",64],[64,"]",65],[65,"]",66],[66,"0",67],[67,"1",[-,"[[n,add0],[[v,1],[v,2],[v,3]],"":-"",[[[n,1],[[v,1],[v,2],[v,3]]]]]01"]],[73,"]",74],[74,",",75],[75,"[",76],[76,"[",77],[77,"v",78],[78,",",79],[79,"1",80],[80,"]",81],[81,",",82],[82,"[",83],[83,"v",84],[84,",",85],[85,"2",86],[86,"]",87],[87,",",88],[88,"[",89],[89,"v",90],[90,",",91],[91,"3",92],[92,"]",93],[93,"]",94],[94,",",95],[95,"""",96],[96,":",97],[97,"-",98],[98,"""",99],[99,",",100],[100,"[",101],[101,"[",102],[102,"[",103],[103,"n",104],[104,",",105],[105,"+",106],[106,"]",107],[107,",",108],[108,"[",109],[109,"[",110],[110,"v",111],[111,",",112],[112,"1",113],[113,"]",114],[114,",",115],[115,"[",116],[116,"v",117],[117,",",118],[118,"2",119],[119,"]",120],[120,",",121],[121,"[",122],[122,"v",123],[123,",",124],[124,"6",125],[125,"]",126],[126,"]",127],[127,"]",128],[128,",",129],[129,"[",130],[130,"[",131],[131,"n",132],[132,",",133],[133,"=",134],[134,"]",135],[135,",",136],[136,"[",137],[137,"[",138],[138,"v",139],[139,",",140],[140,"6",141],[141,"]",142],[142,",",143],[143,"[",144],[144,"v",145],[145,",",146],[146,"3",147],[147,"]",148],[148,"]",149],[149,"]",150],[150,"]",151],[151,"]",152],[152,"0",153],[153,"1",[-,"[[n,1],[[v,1],[v,2],[v,3]],"":-"",[[[n,+],[[v,1],[v,2],[v,6]]],[[n,=],[[v,6],[v,3]]]]]01"]],[162,"]",163],[163,",",164],[164,"[",165],[165,"[",166],[166,"v",167],[167,",",168],[168,"1",169],[169,"]",170],[170,",",171],[171,"[",172],[172,"v",173],[173,",",174],[174,"2",175],[175,"]",176],[176,"]",177],[177,",",178],[178,"""",179],[179,":",180],[180,"-",181],[181,"""",182],[182,",",183],[183,"[",184],[184,"[",185],[185,"[",186],[186,"n",187],[187,",",188],[188,"=",189],[189,"]",190],[190,",",191],[191,"[",192],[192,"[",193],[193,"v",194],[194,",",195],[195,"
1",196],[196,"]",197],[197,"]",198],[198,"]",199],[199,",",200],[200,"[",201],[201,"[",202],[202,"n",203],[203,",",204],[204,"=",205],[205,"]",206],[206,",",207],[207,"[",208],[208,"[",209],[209,"v",210],[210,",",211],[211,"2",212],[212,"]",213],[213,"]",214],[214,"]",215],[215,"]",216],[216,"]",217],[217,"0",218],[218,"1",[-,"[[n,add2],[[v,1],[v,2]],"":-"",[[[n,=],[[v,1]]],[[n,=],[[v,2]]]]]01"]]],[[1,"[",2],[2,"[",141],[141,"n",62],[62,",",143],[143,"a",64],[64,"d",145],[145,"d",66],[66,"0",67],[66,"2",147],[66,"3",9],[9,"]",10],[10,",",11],[11,"[",12],[12,"[",13],[13,"v",14],[14,",",15],[15,"1",16],[16,"]",17],[17,",",18],[18,"[",19],[19,"v",20],[20,",",21],[21,"2",22],[22,"]",23],[23,"]",24],[24,",",25],[25,"""",26],[26,":",27],[27,"-",28],[28,"""",29],[29,",",30],[30,"[",31],[31,"[",32],[32,"[",33],[33,"n",34],[34,",",35],[35,"t",36],[36,"a",37],[37,"i",38],[38,"l",39],[39,"]",40],[40,",",41],[41,"[",42],[42,"[",43],[43,"v",44],[44,",",45],[45,"1",46],[46,"]",47],[47,",",48],[48,"[",49],[49,"v",50],[50,",",51],[51,"2",52],[52,"]",53],[53,"]",54],[54,"]",55],[55,"]",56],[56,"]",57],[57,"0",58],[58,"1",[-,"[[n,add3],[[v,1],[v,2]],"":-"",[[[n,tail],[[v,1],[v,2]]]]]01"]],[67,"]",68],[68,",",69],[69,"[",70],[70,"[",71],[71,"v",72],[72,",",73],[73,"1",74],[74,"]",75],[75,",",76],[76,"[",77],[77,"v",78],[78,",",79],[79,"2",80],[80,"]",81],[81,"]",82],[82,",",83],[83,"""",84],[84,":",85],[85,"-",86],[86,"""",87],[87,",",88],[88,"[",89],[89,"[",90],[90,"[",91],[91,"n",92],[92,",",93],[93,"a",94],[94,"d",95],[95,"d",96],[96,"2",97],[97,"]",98],[98,",",99],[99,"[",100],[100,"[",101],[101,"v",102],[102,",",103],[103,"1",104],[104,"]",105],[105,",",106],[106,"[",107],[107,"v",108],[108,",",109],[109,"4",110],[110,"]",111],[111,"]",112],[112,"]",113],[113,",",114],[114,"[",115],[115,"[",116],[116,"n",117],[117,",",118],[118,"=",119],[119,"]",120],[120,",",121],[121,"[",122],[122,"[",123],[123,"v",124],[124,",",125],[125,"4",126],[126,"]",127],[127,",",128],[128,"[",129],[129,
"v",130],[130,",",131],[131,"2",132],[132,"]",133],[133,"]",134],[134,"]",135],[135,"]",136],[136,"]",137],[137,"0",138],[138,"1",[-,"[[n,add0],[[v,1],[v,2]],"":-"",[[[n,add2],[[v,1],[v,4]]],[[n,=],[[v,4],[v,2]]]]]01"]],[147,"]",148],[148,",",149],[149,"[",150],[150,"[",151],[151,"v",152],[152,",",153],[153,"1",154],[154,"]",155],[155,",",156],[156,"[",157],[157,"v",158],[158,",",159],[159,"2",160],[160,"]",161],[161,"]",162],[162,",",163],[163,"""",164],[164,":",165],[165,"-",166],[166,"""",167],[167,",",168],[168,"[",169],[169,"[",170],[170,"[",171],[171,"n",172],[172,",",173],[173,"=",174],[174,"]",175],[175,",",176],[176,"[",177],[177,"[",178],[178,"v",179],[179,",",180],[180,"1",181],[181,"]",182],[182,"]",183],[183,"]",184],[184,",",185],[185,"[",186],[186,"[",187],[187,"n",188],[188,",",189],[189,"=",190],[190,"]",191],[191,",",192],[192,"[",193],[193,"[",194],[194,"v",195],[195,",",196],[196,"2",197],[197,"]",198],[198,"]",199],[199,"]",200],[200,"]",201],[201,"]",202],[202,"0",203],[203,"1",[-,"[[n,add2],[[v,1],[v,2]],"":-"",[[[n,=],[[v,1]]],[[n,=],[[v,2]]]]]01"]]],[[1,"[",2],[2,"[",167],[167,"n",84],[84,",",169],[169,"a",6],[169,"f",170],[169,"i",86],[6,"d",7],[7,"d",8],[8,"0",9],[9,"]",10],[10,",",11],[11,"[",12],[12,"[",13],[13,"v",14],[14,",",15],[15,"1",16],[16,"]",17],[17,",",18],[18,"[",19],[19,"v",20],[20,",",21],[21,"2",22],[22,"]",23],[23,"]",24],[24,",",25],[25,"""",26],[26,":",27],[27,"-",28],[28,"""",29],[29,",",30],[30,"[",31],[31,"[",32],[32,"[",33],[33,"n",34],[34,",",35],[35,"a",36],[36,"d",37],[37,"d",38],[38,"2",39],[39,"]",40],[40,",",41],[41,"[",42],[42,"[",43],[43,"v",44],[44,",",45],[45,"1",46],[46,"]",47],[47,",",48],[48,"[",49],[49,"v",50],[50,",",51],[51,"4",52],[52,"]",53],[53,"]",54],[54,"]",55],[55,",",56],[56,"[",57],[57,"[",58],[58,"n",59],[59,",",60],[60,"=",61],[61,"]",62],[62,",",63],[63,"[",64],[64,"[",65],[65,"v",66],[66,",",67],[67,"1",68],[68,"]",69],[69,",",70],[70,"[",71],[71,"v",72],[72,",",73],[73,"2",74],[74,"]",75
],[75,"]",76],[76,"]",77],[77,"]",78],[78,"]",79],[79,"0",80],[80,"1",[-,"[[n,add0],[[v,1],[v,2]],"":-"",[[[n,add2],[[v,1],[v,4]]],[[n,=],[[v,1],[v,2]]]]]01"]],[86,"m",87],[87,"p",88],[88,"l",89],[89,"i",90],[90,"e",91],[91,"s",92],[92,"2",93],[93,"]",94],[94,",",95],[95,"[",96],[96,"[",97],[97,"v",98],[98,",",99],[99,"1",100],[100,"]",101],[101,",",102],[102,"[",103],[103,"v",104],[104,",",105],[105,"2",106],[106,"]",107],[107,"]",108],[108,",",109],[109,"""",110],[110,":",111],[111,"-",112],[112,"""",113],[113,",",114],[114,"[",115],[115,"[",116],[116,"[",117],[117,"n",118],[118,",",119],[119,"""",120],[120,"-",121],[121,">",122],[122,"""",123],[123,"]",124],[124,",",125],[125,"[",126],[126,"[",127],[127,"[",128],[128,"n",129],[129,",",130],[130,"i",131],[131,"s",132],[132,"]",133],[133,",",134],[134,"[",135],[135,"[",136],[136,"v",137],[137,",",138],[138,"1",139],[139,"]",140],[140,"]",141],[141,"]",142],[142,",",143],[143,"[",144],[144,"[",145],[145,"n",146],[146,",",147],[147,"i",148],[148,"s",149],[149,"]",150],[150,",",151],[151,"[",152],[152,"[",153],[153,"v",154],[154,",",155],[155,"2",156],[156,"]",157],[157,"]",158],[158,"]",159],[159,"]",160],[160,"]",161],[161,"]",162],[162,"]",163],[163,"0",164],[164,"1",[-,"[[n,implies2],[[v,1],[v,2]],"":-"",[[[n,""->""],[[[n,is],[[v,1]]],[[n,is],[[v,2]]]]]]]01"]],[170,"i",171],[171,"n",172],[172,"d",173],[173,"a",174],[174,"l",175],[175,"l",176],[176,"1",177],[177,"]",178],[178,",",179],[179,"[",180],[180,"[",181],[181,"v",182],[182,",",183],[183,"1",184],[184,"]",185],[185,",",186],[186,"[",187],[187,"v",188],[188,",",189],[189,"2",190],[190,"]",191],[191,"]",192],[192,",",193],[193,"""",194],[194,":",195],[195,"-",196],[196,"""",197],[197,",",198],[198,"[",199],[199,"[",200],[200,"[",201],[201,"n",202],[202,",",203],[203,"f",204],[204,"i",205],[205,"n",206],[206,"d",207],[207,"a",208],[208,"l",209],[209,"l",210],[210,"]",211],[211,",",212],[212,"[",213],[213,"[",214],[214,"v",215],[215,",",216],[216,"3",217],[217,"
]",218],[218,",",219],[219,"[",220],[220,"v",221],[221,",",222],[222,"1",223],[223,"]",224],[224,",",225],[225,"[",226],[226,"v",227],[227,",",228],[228,"3",229],[229,"]",230],[230,",",231],[231,"[",232],[232,"v",233],[233,",",234],[234,"2",235],[235,"]",236],[236,"]",237],[237,"]",238],[238,"]",239],[239,"]",240],[240,"0",241],[241,"1",[-,"[[n,findall1],[[v,1],[v,2]],"":-"",[[[n,findall],[[v,3],[v,1],[v,3],[v,2]]]]]01"]]],[[1,"[",2],[2,"[",139],[139,"n",69],[69,",",141],[141,"e",71],[141,"m",6],[6,"a",7],[7,"p",8],[8,"l",9],[9,"i",10],[10,"s",11],[11,"t",12],[12,"1",13],[13,"]",14],[14,",",15],[15,"[",16],[16,"[",17],[17,"v",18],[18,",",19],[19,"1",20],[20,"]",21],[21,",",22],[22,"[",23],[23,"v",24],[24,",",25],[25,"2",26],[26,"]",27],[27,"]",28],[28,",",29],[29,"""",30],[30,":",31],[31,"-",32],[32,"""",33],[33,",",34],[34,"[",35],[35,"[",36],[36,"[",37],[37,"n",38],[38,",",39],[39,"m",40],[40,"a",41],[41,"p",42],[42,"l",43],[43,"i",44],[44,"s",45],[45,"t",46],[46,"]",47],[47,",",48],[48,"[",49],[49,"[",50],[50,"v",51],[51,",",52],[52,"1",53],[53,"]",54],[54,",",55],[55,"[",56],[56,"v",57],[57,",",58],[58,"2",59],[59,"]",60],[60,"]",61],[61,"]",62],[62,"]",63],[63,"]",64],[64,"0",65],[65,"1",[-,"[[n,maplist1],[[v,1],[v,2]],"":-"",[[[n,maplist],[[v,1],[v,2]]]]]01"]],[71,"q",72],[72,"u",73],[73,"a",74],[74,"l",75],[75,"s",76],[76,"4",77],[77,"1",78],[78,"]",79],[79,",",80],[80,"[",81],[81,"[",82],[82,"v",83],[83,",",84],[84,"1",85],[85,"]",86],[86,",",87],[87,"[",88],[88,"v",89],[89,",",90],[90,"2",91],[91,"]",92],[92,",",164],[92,"]",93],[93,",",94],[94,"""",95],[95,":",96],[96,"-",97],[97,"""",98],[98,",",99],[99,"[",100],[100,"[",101],[101,"[",102],[102,"n",103],[103,",",104],[104,"e",105],[105,"q",106],[106,"u",107],[107,"a",108],[108,"l",109],[109,"s",110],[110,"4",111],[111,"]",112],[112,",",113],[113,"[",114],[114,"[",115],[115,"v",116],[116,",",117],[117,"1",118],[118,"]",119],[119,",",120],[120,"[",121],[121,"v",122],[122,",",123],[123,"2",124],[124,"]",125]
,[125,",",126],[126,"[",127],[127,"v",128],[128,",",129],[129,"6",130],[130,"]",131],[131,"]",132],[132,"]",133],[133,"]",134],[134,"]",135],[135,"0",136],[136,"1",[-,"[[n,equals41],[[v,1],[v,2]],"":-"",[[[n,equals4],[[v,1],[v,2],[v,6]]]]]01"]],[164,"[",165],[165,"v",166],[166,",",167],[167,"3",168],[168,"]",169],[169,",",170],[170,"[",171],[171,"v",172],[172,",",173],[173,"4",174],[174,"]",175],[175,"]",176],[176,",",177],[177,"""",178],[178,":",179],[179,"-",180],[180,"""",181],[181,",",182],[182,"[",183],[183,"[",184],[184,"[",185],[185,"n",186],[186,",",187],[187,"e",188],[188,"q",189],[189,"u",190],[190,"a",191],[191,"l",192],[192,"s",193],[193,"4",194],[194,"]",195],[195,",",196],[196,"[",197],[197,"[",198],[198,"v",199],[199,",",200],[200,"1",201],[201,"]",202],[202,",",203],[203,"[",204],[204,"v",205],[205,",",206],[206,"2",207],[207,"]",208],[208,",",209],[209,"[",210],[210,"v",211],[211,",",212],[212,"3",213],[213,"]",214],[214,",",215],[215,"[",216],[216,"v",217],[217,",",218],[218,"4",219],[219,"]",220],[220,"]",221],[221,"]",222],[222,"]",223],[223,"]",224],[224,"0",225],[225,"1",[-,"[[n,equals41],[[v,1],[v,2],[v,3],[v,4]],"":-"",[[[n,equals4],[[v,1],[v,2],[v,3],[v,4]]]]]01"]]],[[1,"[",2],[2,"[",145],[145,"n",81],[81,",",147],[147,"e",83],[83,"q",149],[149,"u",85],[85,"a",151],[151,"l",87],[87,"s",153],[153,"4",89],[89,"1",155],[155,"]",91],[91,",",157],[157,"""",158],[157,"[",93],[93,"[",17],[17,"v",18],[18,",",19],[19,"1",20],[20,"]",21],[21,",",22],[22,"[",23],[23,"v",24],[24,",",25],[25,"2",26],[26,"]",27],[27,",",28],[27,"]",105],[28,"[",29],[29,"v",30],[30,",",31],[31,"3",32],[32,"]",33],[33,"]",34],[34,",",35],[35,"""",36],[36,":",37],[37,"-",38],[38,"""",39],[39,",",40],[40,"[",41],[41,"[",42],[42,"[",43],[43,"n",44],[44,",",45],[45,"e",46],[46,"q",47],[47,"u",48],[48,"a",49],[49,"l",50],[50,"s",51],[51,"4",52],[52,"]",53],[53,",",54],[54,"[",55],[55,"[",56],[56,"v",57],[57,",",58],[58,"1",59],[59,"]",60],[60,",",61],[61,"[",62],[62,"v",63],[63,
",",64],[64,"2",65],[65,"]",66],[66,",",67],[67,"[",68],[68,"v",69],[69,",",70],[70,"3",71],[71,"]",72],[72,"]",73],[73,"]",74],[74,"]",75],[75,"]",76],[76,"0",77],[77,"1",[-,"[[n,equals41],[[v,1],[v,2],[v,3]],"":-"",[[[n,equals4],[[v,1],[v,2],[v,3]]]]]01"]],[105,",",106],[106,"""",107],[107,":",108],[108,"-",109],[109,"""",110],[110,",",111],[111,"[",112],[112,"[",113],[113,"[",114],[114,"n",115],[115,",",116],[116,"e",117],[117,"q",118],[118,"u",119],[119,"a",120],[120,"l",121],[121,"s",122],[122,"4",123],[123,"]",124],[124,",",125],[125,"[",126],[126,"[",127],[127,"v",128],[128,",",129],[129,"1",130],[130,"]",131],[131,",",132],[132,"[",133],[133,"v",134],[134,",",135],[135,"2",136],[136,"]",137],[137,"]",138],[138,"]",139],[139,"]",140],[140,"]",141],[141,"0",142],[142,"1",[-,"[[n,equals41],[[v,1],[v,2]],"":-"",[[[n,equals4],[[v,1],[v,2]]]]]01"]],[158,":",159],[159,"-",160],[160,"""",161],[161,",",162],[162,"[",163],[163,"[",164],[164,"[",165],[165,"n",166],[166,",",167],[167,"e",168],[168,"q",169],[169,"u",170],[170,"a",171],[171,"l",172],[172,"s",173],[173,"4",174],[174,"]",175],[175,",",176],[176,"[",177],[177,"[",178],[178,"v",179],[179,",",180],[180,"1",181],[181,"]",182],[182,",",183],[183,"[",184],[184,"v",185],[185,",",186],[186,"2",187],[187,"]",188],[188,",",189],[189,"[",190],[190,"v",191],[191,",",192],[192,"4",193],[193,"]",194],[194,",",195],[195,"[",196],[196,"v",197],[197,",",198],[198,"5",199],[199,"]",200],[200,"]",201],[201,"]",202],[202,"]",203],[203,"]",204],[204,"0",205],[205,"1",[-,"[[n,equals41],"":-"",[[[n,equals4],[[v,1],[v,2],[v,4],[v,5]]]]]01"]]],[[1,"[",2],[2,"[",145],[145,"n",81],[81,",",147],[147,"e",83],[83,"q",149],[149,"u",85],[85,"a",151],[151,"l",87],[87,"s",153],[153,"4",89],[89,"1",155],[155,"]",91],[91,",",157],[157,"[",93],[93,"[",159],[159,"v",95],[95,",",161],[161,"1",97],[97,"]",163],[163,",",99],[99,"[",165],[165,"v",101],[101,",",167],[167,"2",103],[103,"]",169],[169,",",28],[169,"]",105],[28,"[",29],[29,"v",30],[30,"
,",31],[31,"3",32],[32,"]",33],[33,"]",34],[34,",",35],[35,"""",36],[36,":",37],[37,"-",38],[38,"""",39],[39,",",40],[40,"[",41],[41,"[",42],[42,"[",43],[43,"n",44],[44,",",45],[45,"e",46],[46,"q",47],[47,"u",48],[48,"a",49],[49,"l",50],[50,"s",51],[51,"4",52],[52,"]",53],[53,",",54],[54,"[",55],[55,"[",56],[56,"v",57],[57,",",58],[58,"1",59],[59,"]",60],[60,",",61],[61,"[",62],[62,"v",63],[63,",",64],[64,"2",65],[65,"]",66],[66,",",67],[67,"[",68],[68,"v",69],[69,",",70],[70,"3",71],[71,"]",72],[72,"]",73],[73,"]",74],[74,"]",75],[75,"]",76],[76,"0",77],[77,"1",[-,"[[n,equals41],[[v,1],[v,2],[v,3]],"":-"",[[[n,equals4],[[v,1],[v,2],[v,3]]]]]01"]],[105,",",106],[106,"""",107],[107,":",108],[108,"-",109],[109,"""",110],[110,",",111],[111,"[",112],[112,"[",113],[113,"[",114],[114,"n",115],[115,",",116],[116,"e",117],[117,"q",118],[118,"u",119],[119,"a",120],[120,"l",121],[121,"s",122],[122,"4",123],[123,"]",124],[124,",",125],[125,"[",126],[126,"[",127],[127,"v",128],[128,",",129],[129,"1",130],[130,"]",131],[131,",",132],[132,"[",133],[133,"v",134],[134,",",135],[135,"2",136],[136,"]",137],[137,"]",138],[138,"]",139],[139,"]",140],[140,"]",141],[141,"0",142]],[[1,"[",2],[2,"[",121],[121,"n",57],[57,",",123],[123,"e",59],[59,"q",125],[125,"u",61],[61,"a",127],[127,"l",63],[63,"s",129],[129,"4",65],[65,"1",131],[131,"]",67],[67,",",133],[133,"[",69],[69,"[",135],[135,"v",71],[71,",",137],[137,"1",73],[73,"]",139],[139,",",75],[139,"]",22],[22,",",23],[23,"""",24],[24,":",25],[25,"-",26],[26,"""",27],[27,",",28],[28,"[",29],[29,"[",30],[30,"[",31],[31,"n",32],[32,",",33],[33,"e",34],[34,"q",35],[35,"u",36],[36,"a",37],[37,"l",38],[38,"s",39],[39,"4",40],[40,"]",41],[41,",",42],[42,"[",43],[43,"[",44],[44,"v",45],[45,",",46],[46,"1",47],[47,"]",48],[48,"]",49],[49,"]",50],[50,"]",51],[51,"]",52],[52,"0",53],[53,"1",[-,"[[n,equals41],[[v,1]],"":-"",[[[n,equals4],[[v,1]]]]]01"]],[75,"[",76],[76,"v",77],[77,",",78],[78,"2",79],[79,"]",80],[80,"]",81],[81,",",82],[82,"""",83
],[83,":",84],[84,"-",85],[85,"""",86],[86,",",87],[87,"[",88],[88,"[",89],[89,"[",90],[90,"n",91],[91,",",92],[92,"e",93],[93,"q",94],[94,"u",95],[95,"a",96],[96,"l",97],[97,"s",98],[98,"4",99],[99,"]",100],[100,",",101],[101,"[",102],[102,"[",103],[103,"v",104],[104,",",105],[105,"1",106],[106,"]",107],[107,",",108],[108,"[",109],[109,"v",110],[110,",",111],[111,"2",112],[112,"]",113],[113,"]",114],[114,"]",115],[115,"]",116],[116,"]",117],[117,"0",118]],[[1,"[",2],[2,"[",119],[119,"n",67],[67,",",121],[121,"e",69],[69,"q",123],[123,"u",71],[71,"a",125],[125,"l",73],[73,"s",127],[127,"4",75],[75,"1",129],[129,"]",77],[77,",",131],[131,"""",16],[131,"[",79],[16,":",17],[17,"-",18],[18,"""",19],[19,",",20],[20,"[",21],[21,"[",22],[22,"[",23],[23,"n",24],[24,",",25],[25,"e",26],[26,"q",27],[27,"u",28],[28,"a",29],[29,"l",30],[30,"s",31],[31,"4",32],[32,"]",33],[33,",",34],[34,"[",35],[35,"[",36],[36,"v",37],[37,",",38],[38,"1",39],[39,"]",40],[40,",",41],[41,"[",42],[42,"v",43],[43,",",44],[44,"2",45],[45,"]",46],[46,",",47],[47,"[",48],[48,"v",49],[49,",",50],[50,"4",51],[51,"]",52],[52,",",53],[53,"[",54],[54,"v",55],[55,",",56],[56,"6",57],[57,"]",58],[58,"]",59],[59,"]",60],[60,"]",61],[61,"]",62],[62,"0",63],[63,"1",[-,"[[n,equals41],"":-"",[[[n,equals4],[[v,1],[v,2],[v,4],[v,6]]]]]01"]],[79,"[",80],[80,"v",81],[81,",",82],[82,"1",83],[83,"]",84],[84,",",138],[84,"]",85],[85,",",86],[86,"""",87],[87,":",88],[88,"-",89],[89,"""",90],[90,",",91],[91,"[",92],[92,"[",93],[93,"[",94],[94,"n",95],[95,",",96],[96,"e",97],[97,"q",98],[98,"u",99],[99,"a",100],[100,"l",101],[101,"s",102],[102,"4",103],[103,"]",104],[104,",",105],[105,"[",106],[106,"[",107],[107,"v",108],[108,",",109],[109,"1",110],[110,"]",111],[111,"]",112],[112,"]",113],[113,"]",114],[114,"]",115],[115,"0",116],[116,"1",[-,"[[n,equals41],[[v,1]],"":-"",[[[n,equals4],[[v,1]]]]]01"]],[138,"[",139],[139,"v",140],[140,",",141],[141,"2",142],[142,"]",143],[143,",",144],[144,"[",145],[145,"v",146],[146,",",14
7],[147,"3",148],[148,"]",149],[149,"]",150],[150,",",151],[151,"""",152],[152,":",153],[153,"-",154],[154,"""",155],[155,",",156],[156,"[",157],[157,"[",158],[158,"[",159],[159,"n",160],[160,",",161],[161,"e",162],[162,"q",163],[163,"u",164],[164,"a",165],[165,"l",166],[166,"s",167],[167,"4",168],[168,"]",169],[169,",",170],[170,"[",171],[171,"[",172],[172,"v",173],[173,",",174],[174,"1",175],[175,"]",176],[176,",",177],[177,"[",178],[178,"v",179],[179,",",180],[180,"2",181],[181,"]",182],[182,",",183],[183,"[",184],[184,"v",185],[185,",",186],[186,"3",187],[187,"]",188],[188,"]",189],[189,"]",190],[190,"]",191],[191,"]",192],[192,"0",193],[193,"1",[-,"[[n,equals41],[[v,1],[v,2],[v,3]],"":-"",[[[n,equals4],[[v,1],[v,2],[v,3]]]]]01"]]],[[1,"[",2],[2,"[",133],[133,"n",69],[69,",",135],[135,"f",136],[135,"m",71],[71,"a",7],[7,"p",8],[8,"l",9],[9,"i",10],[10,"s",11],[11,"t",12],[12,"1",13],[13,"]",14],[14,",",15],[15,"[",16],[16,"[",17],[17,"v",18],[18,",",19],[19,"1",20],[20,"]",21],[21,",",22],[22,"[",23],[23,"v",24],[24,",",25],[25,"2",26],[26,"]",27],[27,"]",28],[28,",",29],[29,"""",30],[30,":",31],[31,"-",32],[32,"""",33],[33,",",34],[34,"[",35],[35,"[",36],[36,"[",37],[37,"n",38],[38,",",39],[39,"m",40],[40,"a",41],[41,"p",42],[42,"l",43],[43,"i",44],[44,"s",45],[45,"t",46],[46,"]",47],[47,",",48],[48,"[",49],[49,"[",50],[50,"v",51],[51,",",52],[52,"1",53],[53,"]",54],[54,",",55],[55,"[",56],[56,"v",57],[57,",",58],[58,"2",59],[59,"]",60],[60,"]",61],[61,"]",62],[62,"]",63],[63,"]",64],[64,"0",65],[136,"i",137],[137,"n",138],[138,"d",139],[139,"a",140],[140,"l",141],[141,"l",142],[142,"1",143],[143,"]",144],[144,",",145],[145,"[",146],[146,"[",147],[147,"v",148],[148,",",149],[149,"1",150],[150,"]",151],[151,",",152],[152,"[",153],[153,"v",154],[154,",",155],[155,"2",156],[156,"]",157],[157,"]",158],[158,",",159],[159,"""",160],[160,":",161],[161,"-",162],[162,"""",163],[163,",",164],[164,"[",165],[165,"[",166],[166,"[",167],[167,"n",168],[168,",",169],[169,"f",1
70],[170,"i",171],[171,"n",172],[172,"d",173],[173,"a",174],[174,"l",175],[175,"l",176],[176,"]",177],[177,",",178],[178,"[",179],[179,"[",180],[180,"v",181],[181,",",182],[182,"3",183],[183,"]",184],[184,",",185],[185,"[",186],[186,"v",187],[187,",",188],[188,"3",189],[189,"]",190],[190,",",191],[191,"[",192],[192,"v",193],[193,",",194],[194,"1",195],[195,"]",196],[196,",",197],[197,"[",198],[198,"v",199],[199,",",200],[200,"3",201],[201,"]",202],[202,",",203],[203,"[",204],[204,"v",205],[205,",",206],[206,"2",207],[207,"]",208],[208,"]",209],[209,"]",210],[210,"]",211],[211,"]",212],[212,"0",213],[213,"1",[-,"[[n,findall1],[[v,1],[v,2]],"":-"",[[[n,findall],[[v,3],[v,3],[v,1],[v,3],[v,2]]]]]01"]]],[[1,"[",2],[2,"[",127],[127,"n",75],[75,",",129],[129,"e",77],[77,"q",131],[131,"u",79],[79,"a",133],[133,"l",81],[81,"s",135],[135,"4",83],[83,"1",137],[137,"]",85],[85,",",139],[139,"[",87],[87,"[",141],[141,"v",89],[89,",",143],[143,"1",91],[91,"]",145],[145,",",22],[145,"]",93],[22,"[",23],[23,"v",24],[24,",",25],[25,"2",26],[26,"]",27],[27,",",152],[27,"]",28],[28,",",29],[29,"""",30],[30,":",31],[31,"-",32],[32,"""",33],[33,",",34],[34,"[",35],[35,"[",36],[36,"[",37],[37,"n",38],[38,",",39],[39,"e",40],[40,"q",41],[41,"u",42],[42,"a",43],[43,"l",44],[44,"s",45],[45,"4",46],[46,"]",47],[47,",",48],[48,"[",49],[49,"[",50],[50,"v",51],[51,",",52],[52,"2",53],[53,"]",54],[54,",",55],[55,"[",56],[56,"v",57],[57,",",58],[58,"1",59],[59,"]",60],[60,",",61],[61,"[",62],[62,"v",63],[63,",",64],[64,"1",65],[65,"]",66],[66,"]",67],[67,"]",68],[68,"]",69],[69,"]",70],[70,"0",71],[71,"1",[-,"[[n,equals41],[[v,1],[v,2]],"":-"",[[[n,equals4],[[v,2],[v,1],[v,1]]]]]01"]],[93,",",94],[94,"""",95],[95,":",96],[96,"-",97],[97,"""",98],[98,",",99],[99,"[",100],[100,"[",101],[101,"[",102],[102,"n",103],[103,",",104],[104,"e",105],[105,"q",106],[106,"u",107],[107,"a",108],[108,"l",109],[109,"s",110],[110,"4",111],[111,"]",112],[112,",",113],[113,"[",114],[114,"[",115],[115,"v",116],[116,
",",117],[117,"1",118],[118,"]",119],[119,"]",120],[120,"]",121],[121,"]",122],[122,"]",123],[123,"0",124],[124,"1",[-,"[[n,equals41],[[v,1]],"":-"",[[[n,equals4],[[v,1]]]]]01"]],[152,"[",153],[153,"v",154],[154,",",155],[155,"3",156],[156,"]",157],[157,"]",158],[158,",",159],[159,"""",160],[160,":",161],[161,"-",162],[162,"""",163],[163,",",164],[164,"[",165],[165,"[",166],[166,"[",167],[167,"n",168],[168,",",169],[169,"e",170],[170,"q",171],[171,"u",172],[172,"a",173],[173,"l",174],[174,"s",175],[175,"4",176],[176,"]",177],[177,",",178],[178,"[",179],[179,"[",180],[180,"v",181],[181,",",182],[182,"2",183],[183,"]",184],[184,",",185],[185,"[",186],[186,"v",187],[187,",",188],[188,"3",189],[189,"]",190],[190,",",191],[191,"[",192],[192,"v",193],[193,",",194],[194,"1",195],[195,"]",196],[196,"]",197],[197,"]",198],[198,"]",199],[199,"]",200],[200,"0",201],[201,"1",[-,"[[n,equals41],[[v,1],[v,2],[v,3]],"":-"",[[[n,equals4],[[v,2],[v,3],[v,1]]]]]01"]]],[[1,"[",2],[2,"[",169],[169,"n",81],[81,",",171],[171,"e",83],[83,"q",173],[173,"u",85],[85,"a",175],[175,"l",87],[87,"s",177],[177,"4",89],[89,"1",179],[179,"]",91],[91,",",181],[181,"[",93],[93,"[",183],[183,"v",95],[95,",",185],[185,"1",97],[97,"]",187],[187,",",99],[99,"[",189],[189,"v",101],[101,",",191],[191,"2",103],[103,"]",193],[193,",",105],[193,"]",28],[28,",",29],[29,"""",30],[30,":",31],[31,"-",32],[32,"""",33],[33,",",34],[34,"[",35],[35,"[",36],[36,"[",37],[37,"n",38],[38,",",39],[39,"e",40],[40,"q",41],[41,"u",42],[42,"a",43],[43,"l",44],[44,"s",45],[45,"4",46],[46,"]",47],[47,",",48],[48,"[",49],[49,"[",50],[50,"v",51],[51,",",52],[52,"2",53],[53,"]",54],[54,",",55],[55,"[",56],[56,"v",57],[57,",",58],[58,"1",59],[59,"]",60],[60,",",61],[61,"[",62],[62,"v",63],[63,",",64],[64,"1",65],[65,"]",66],[66,",",67],[67,"[",68],[68,"v",69],[69,",",70],[70,"1",71],[71,"]",72],[72,"]",73],[73,"]",74],[74,"]",75],[75,"]",76],[76,"0",77],[77,"1",[-,"[[n,equals41],[[v,1],[v,2]],"":-"",[[[n,equals4],[[v,2],[v,1],[v,1],
[v,1]]]]]01"]],[105,"[",106],[106,"v",107],[107,",",108],[108,"3",109],[109,"]",110],[110,"]",111],[111,",",112],[112,"""",113],[113,":",114],[114,"-",115],[115,"""",116],[116,",",117],[117,"[",118],[118,"[",119],[119,"[",120],[120,"n",121],[121,",",122],[122,"e",123],[123,"q",124],[124,"u",125],[125,"a",126],[126,"l",127],[127,"s",128],[128,"4",129],[129,"]",130],[130,",",131],[131,"[",132],[132,"[",133],[133,"v",134],[134,",",135],[135,"1",225],[135,"2",136],[136,"]",137],[137,",",138],[138,"[",139],[139,"v",140],[140,",",141],[141,"3",142],[142,"]",143],[143,",",144],[144,"[",145],[145,"v",146],[146,",",147],[147,"1",148],[148,"]",149],[149,",",150],[150,"[",151],[151,"v",152],[152,",",153],[153,"1",154],[154,"]",155],[155,",",156],[156,"[",157],[157,"v",158],[158,",",159],[159,"1",160],[160,"]",161],[161,"]",162],[162,"]",163],[163,"]",164],[164,"]",165],[165,"0",166],[166,"1",[-,"[[n,equals41],[[v,1],[v,2],[v,3]],"":-"",[[[n,equals4],[[v,2],[v,3],[v,1],[v,1],[v,1]]]]]01"]],[225,"]",226],[226,",",227],[227,"[",228],[228,"v",229],[229,",",230],[230,"1",231],[231,"]",232],[232,",",233],[233,"[",234],[234,"v",235],[235,",",236],[236,"1",237],[237,"]",238],[238,",",239],[239,"[",240],[240,"v",241],[241,",",242],[242,"2",243],[243,"]",244],[244,",",245],[245,"[",246],[246,"v",247],[247,",",248],[248,"3",249],[249,"]",250],[250,"]",251],[251,"]",252],[252,"]",253],[253,"]",254],[254,"0",255],[255,"1",[-,"[[n,equals41],[[v,1],[v,2],[v,3]],"":-"",[[[n,equals4],[[v,1],[v,1],[v,1],[v,2],[v,3]]]]]01"]]],[[1,"[",2],[2,"[",205],[205,"n",105],[105,",",207],[207,"e",107],[207,"m",208],[107,"q",7],[7,"u",8],[8,"a",9],[9,"l",10],[10,"s",11],[11,"4",12],[12,"1",13],[13,"]",14],[14,",",15],[15,"[",16],[16,"[",17],[17,"v",18],[18,",",19],[19,"1",20],[20,"]",21],[21,",",22],[22,"[",23],[23,"v",24],[24,",",25],[25,"2",26],[26,"]",27],[27,",",28],[28,"[",29],[29,"v",30],[30,",",31],[31,"3",32],[32,"]",33],[33,",",34],[34,"[",35],[35,"v",36],[36,",",37],[37,"4",38],[38,"]",39],[39,",",
40],[40,"[",41],[41,"v",42],[42,",",43],[43,"5",44],[44,"]",45],[45,"]",46],[46,",",47],[47,"""",48],[48,":",49],[49,"-",50],[50,"""",51],[51,",",52],[52,"[",53],[53,"[",54],[54,"[",55],[55,"n",56],[56,",",57],[57,"e",58],[58,"q",59],[59,"u",60],[60,"a",61],[61,"l",62],[62,"s",63],[63,"4",64],[64,"]",65],[65,",",66],[66,"[",67],[67,"[",68],[68,"v",69],[69,",",70],[70,"1",71],[70,"3",172],[71,"]",72],[72,",",73],[73,"[",74],[74,"v",75],[75,",",76],[76,"2",77],[77,"]",78],[78,",",79],[79,"[",80],[80,"v",81],[81,",",82],[82,"3",83],[83,"]",84],[84,",",85],[85,"[",86],[86,"v",87],[87,",",88],[88,"4",89],[89,"]",90],[90,",",91],[91,"[",92],[92,"v",93],[93,",",94],[94,"5",95],[95,"]",96],[96,"]",97],[97,"]",98],[98,"]",99],[99,"]",100],[100,"0",101],[101,"1",[-,"[[n,equals41],[[v,1],[v,2],[v,3],[v,4],[v,5]],"":-"",[[[n,equals4],[[v,1],[v,2],[v,3],[v,4],[v,5]]]]]01"]],[172,"]",173],[173,",",174],[174,"[",175],[175,"v",176],[176,",",177],[177,"4",178],[178,"]",179],[179,",",180],[180,"[",181],[181,"v",182],[182,",",183],[183,"5",184],[184,"]",185],[185,",",186],[186,"[",187],[187,"v",188],[188,",",189],[189,"1",190],[190,"]",191],[191,",",192],[192,"[",193],[193,"v",194],[194,",",195],[195,"2",196],[196,"]",197],[197,"]",198],[198,"]",199],[199,"]",200],[200,"]",201],[201,"0",202],[202,"1",[-,"[[n,equals41],[[v,1],[v,2],[v,3],[v,4],[v,5]],"":-"",[[[n,equals4],[[v,3],[v,4],[v,5],[v,1],[v,2]]]]]01"]],[208,"e",209],[209,"m",210],[210,"b",211],[211,"e",212],[212,"r",213],[213,"2",214],[214,"a",215],[215,"]",216],[216,",",217],[217,"[",218],[218,"[",219],[219,"v",220],[220,",",221],[221,"1",222],[222,"]",223],[223,",",224],[224,"[",225],[225,"v",226],[226,",",227],[227,"2",228],[228,"]",229],[229,"]",230],[230,",",231],[231,"""",232],[232,":",233],[233,"-",234],[234,"""",235],[235,",",236],[236,"[",237],[237,"[",238],[238,"[",239],[239,"n",240],[240,",",241],[241,"m",242],[242,"e",243],[243,"m",244],[244,"b",245],[245,"e",246],[246,"r",247],[247,"2",248],[248,"]",249],[249,",",2
50],[250,"[",251],[251,"[",252],[252,"v",253],[253,",",254],[254,"1",255],[255,"]",256],[256,",",257],[257,"[",258],[258,"v",259],[259,",",260],[260,"2",261],[261,"]",262],[262,"]",263],[263,"]",264],[264,",",265],[265,"[",266],[266,"[",267],[267,"n",268],[268,",",269],[269,"c",270],[270,"u",271],[271,"t",272],[272,"]",273],[273,"]",274],[274,"]",275],[275,"]",276],[276,"0",277],[277,"1",[-,"[[n,member2a],[[v,1],[v,2]],"":-"",[[[n,member2],[[v,1],[v,2]]],[[n,cut]]]]01"]]],[[1,"[",2],[2,"[",138],[138,"n",64],[64,",",140],[140,"c",6],[140,"m",66],[6,"a",7],[7,"l",8],[8,"l",9],[9,"1",10],[10,"a",11],[10,"b",146],[11,"]",12],[12,",",13],[13,"[",14],[14,"[",15],[15,"v",16],[16,",",17],[17,"1",18],[18,"]",19],[19,",",20],[20,"[",21],[21,"v",22],[22,",",23],[23,"2",24],[24,"]",25],[25,"]",26],[26,",",27],[27,"""",28],[28,":",29],[29,"-",30],[30,"""",31],[31,",",32],[32,"[",33],[33,"[",34],[34,"[",35],[35,"n",36],[36,",",37],[37,"c",38],[38,"a",39],[39,"l",40],[40,"l",41],[41,"]",42],[42,",",43],[43,"[",44],[44,"[",45],[45,"v",46],[46,",",47],[47,"1",48],[48,"]",49],[49,",",50],[50,"[",51],[51,"v",52],[52,",",53],[53,"2",54],[54,"]",55],[55,"]",56],[56,"]",57],[57,"]",58],[58,"]",59],[59,"0",60],[60,"1",[-,"[[n,call1a],[[v,1],[v,2]],"":-"",[[[n,call],[[v,1],[v,2]]]]]01"]],[66,"e",67],[67,"m",68],[68,"b",69],[69,"e",70],[70,"r",71],[71,"2",72],[72,"a",73],[73,"]",74],[74,",",75],[75,"[",76],[76,"[",77],[77,"v",78],[78,",",79],[79,"1",80],[80,"]",81],[81,",",82],[82,"[",83],[83,"v",84],[84,",",85],[85,"2",86],[86,"]",87],[87,"]",88],[88,",",89],[89,"""",90],[90,":",91],[91,"-",92],[92,"""",93],[93,",",94],[94,"[",95],[95,"[",96],[96,"[",97],[97,"n",98],[98,",",99],[99,"m",100],[100,"e",101],[101,"m",102],[102,"b",103],[103,"e",104],[104,"r",105],[105,"2",106],[106,"]",107],[107,",",108],[108,"[",109],[109,"[",110],[110,"v",111],[111,",",112],[112,"1",113],[113,"]",114],[114,",",115],[115,"[",116],[116,"v",117],[117,",",118],[118,"2",119],[119,"]",120],[120,"]",121],[121,"]",1
22],[122,",",123],[123,"[",124],[124,"[",125],[125,"n",126],[126,",",127],[127,"c",128],[128,"u",129],[129,"t",130],[130,"]",131],[131,"]",132],[132,"]",133],[133,"]",134],[134,"0",135],[135,"1",[-,"[[n,member2a],[[v,1],[v,2]],"":-"",[[[n,member2],[[v,1],[v,2]]],[[n,cut]]]]01"]],[146,"]",147],[147,",",148],[148,"[",149],[149,"[",150],[150,"v",151],[151,",",152],[152,"1",153],[153,"]",154],[154,",",155],[155,"[",156],[156,"v",157],[157,",",158],[158,"2",159],[159,"]",160],[160,"]",161],[161,",",162],[162,"""",163],[163,":",164],[164,"-",165],[165,"""",166],[166,",",167],[167,"[",168],[168,"[",169],[169,"[",170],[170,"n",171],[171,",",172],[172,"c",173],[173,"a",174],[174,"l",175],[175,"l",176],[176,"]",177],[177,",",178],[178,"[",179],[179,"[",180],[180,"v",181],[181,",",182],[182,"1",183],[183,"]",184],[184,",",185],[185,"[",186],[186,"v",187],[187,",",188],[188,"2",189],[189,"]",190],[190,",",191],[191,"[",192],[192,"v",193],[193,",",194],[194,"1",195],[195,"]",196],[196,",",197],[197,"[",198],[198,"v",199],[199,",",200],[200,"2",201],[201,"]",202],[202,",",203],[203,"[",204],[204,"v",205],[205,",",206],[206,"1",207],[207,"]",208],[208,",",209],[209,"[",210],[210,"v",211],[211,",",212],[212,"2",213],[213,"]",214],[214,"]",215],[215,"]",216],[216,",",217],[217,"[",218],[218,"[",219],[219,"n",220],[220,",",221],[221,"e",222],[222,"q",223],[223,"u",224],[224,"a",225],[225,"l",226],[226,"s",227],[227,"4",228],[228,"_",229],[229,"o",230],[230,"n",231],[231,"]",232],[232,"]",233],[233,"]",234],[234,"]",235],[235,"0",236],[236,"1",[-,"[[n,call1b],[[v,1],[v,2]],"":-"",[[[n,call],[[v,1],[v,2],[v,1],[v,2],[v,1],[v,2]]],[[n,equals4_on]]]]01"]]],[[1,"[",2],[2,"[",110],[110,"n",61],[61,",",112],[112,"l",63],[112,"m",6],[112,"t",113],[6,"i",7],[7,"d",8],[8,"d",9],[9,"l",10],[10,"e",11],[11,"]",12],[12,",",13],[13,"[",14],[14,"[",15],[15,"v",16],[16,",",17],[17,"1",18],[18,"]",19],[19,",",20],[20,"[",21],[21,"v",22],[22,",",23],[23,"2",24],[24,"]",25],[25,"]",26],[26,",",27],[27,
"""",28],[28,":",29],[29,"-",30],[30,"""",31],[31,",",32],[32,"[",33],[33,"[",34],[34,"[",35],[35,"n",36],[36,",",37],[37,"/",38],[38,"]",39],[39,",",40],[40,"[",41],[41,"[",42],[42,"v",43],[43,",",44],[44,"1",45],[45,"]",46],[46,",",47],[47,"[",48],[48,"v",49],[49,",",50],[50,"2",51],[51,"]",52],[52,"]",53],[53,"]",54],[54,"]",55],[55,"]",56],[56,"0",57],[57,"1",[-,"[[n,middle],[[v,1],[v,2]],"":-"",[[[n,/],[[v,1],[v,2]]]]]01"]],[63,"e",64],[64,"v",65],[65,"e",66],[66,"l",67],[67,"_",68],[68,"w",69],[69,"i",70],[70,"t",71],[71,"h",72],[72,"]",73],[73,",",74],[74,"[",75],[75,"[",76],[76,"v",77],[77,",",78],[78,"1",79],[79,"]",80],[80,",",81],[81,"[",82],[82,"v",83],[83,",",84],[84,"1",85],[85,"]",86],[86,"]",87],[87,",",88],[88,"""",89],[89,":",90],[90,"-",91],[91,"""",92],[92,",",93],[93,"[",94],[94,"[",95],[95,"[",96],[96,"n",97],[97,",",98],[98,"t",99],[99,"r",100],[100,"u",101],[101,"e",102],[102,"]",103],[103,"]",104],[104,"]",105],[105,"]",106],[106,"0",107],[107,"1",[-,"[[n,level_with],[[v,1],[v,1]],"":-"",[[[n,true]]]]01"]],[113,"r",114],[114,"a",115],[115,"_",116],[116,"l",117],[117,"a",118],[118,"s",119],[119,"]",120],[120,",",121],[121,"[",122],[122,"[",123],[123,"v",124],[124,",",125],[125,"1",126],[126,"]",127],[127,",",128],[128,"[",129],[129,"v",130],[130,",",131],[131,"2",132],[132,"]",133],[133,"]",134],[134,",",135],[135,"""",136],[136,":",137],[137,"-",138],[138,"""",139],[139,",",140],[140,"[",141],[141,"[",142],[142,"[",143],[143,"n",144],[144,",",145],[145,"l",146],[146,"a",147],[147,"s",148],[148,"]",149],[149,",",150],[150,"[",151],[151,"[",152],[152,"v",153],[153,",",154],[154,"1",155],[155,"]",156],[156,",",157],[157,"[",158],[158,"v",159],[159,",",160],[160,"5",161],[161,"]",162],[162,"]",163],[163,"]",164],[164,",",165],[165,"[",166],[166,"[",167],[167,"n",168],[168,",",169],[169,"a",170],[170,"p",171],[171,"p",172],[172,"e",173],[173,"n",174],[174,"d",175],[175,"]",176],[176,",",177],[177,"[",178],[178,"[",179],[179,"v",180],[180,",",181]
,[181,"5",182],[182,"]",183],[183,",",184],[184,"[",185],[185,"v",186],[186,",",187],[187,"2",188],[188,"]",189],[189,"]",190],[190,"]",191],[191,"]",192],[192,"]",193],[193,"0",194],[194,"1",[-,"[[n,tra_las],[[v,1],[v,2]],"":-"",[[[n,las],[[v,1],[v,5]]],[[n,append],[[v,5],[v,2]]]]]01"]]],[[1,"[",2],[2,"[",92],[92,"n",47],[47,",",94],[94,"b",95],[94,"d",49],[94,"l",6],[6,"a",7],[7,"s",8],[8,"]",9],[9,",",10],[10,"[",11],[11,"[",12],[12,"v",13],[13,",",14],[14,"2",15],[15,"]",16],[16,",",17],[17,"[",18],[18,"v",19],[19,",",20],[20,"2",21],[21,"]",22],[22,"]",23],[23,",",24],[24,"""",25],[25,":",26],[26,"-",27],[27,"""",28],[28,",",29],[29,"[",30],[30,"[",31],[31,"[",32],[32,"n",33],[33,",",34],[34,"t",35],[35,"r",36],[36,"u",37],[37,"e",38],[38,"]",39],[39,"]",40],[40,"]",41],[41,"]",42],[42,"0",43],[43,"1",[-,"[[n,las],[[v,2],[v,2]],"":-"",[[[n,true]]]]01"]],[49,"a",50],[50,"s",51],[51,"h",52],[52,"e",53],[53,"s",54],[54,"]",55],[55,",",56],[56,"[",57],[57,"[",58],[58,"v",59],[59,",",60],[60,"2",61],[61,"]",62],[62,",",63],[63,"[",64],[64,"v",65],[65,",",66],[66,"2",67],[67,"]",68],[68,"]",69],[69,",",70],[70,"""",71],[71,":",72],[72,"-",73],[73,"""",74],[74,",",75],[75,"[",76],[76,"[",77],[77,"[",78],[78,"n",79],[79,",",80],[80,"t",81],[81,"r",82],[82,"u",83],[83,"e",84],[84,"]",85],[85,"]",86],[86,"]",87],[87,"]",88],[88,"0",89],[89,"1",[-,"[[n,dashes],[[v,2],[v,2]],"":-"",[[[n,true]]]]01"]],[95,"e",96],[96,"d",97],[97,"r",98],[98,"o",99],[99,"o",100],[100,"m",101],[101,"_",102],[102,"t",103],[103,"o",104],[104,"_",105],[105,"g",106],[106,"a",107],[107,"r",108],[108,"d",109],[109,"e",110],[110,"n",111],[111,"]",112],[112,",",113],[113,"""",114],[114,":",115],[115,"-",116],[116,"""",117],[117,",",118],[118,"[",119],[119,"[",120],[120,"[",121],[121,"n",122],[122,",",123],[123,"t",124],[124,"r",125],[125,"u",126],[126,"e",127],[127,"]",128],[128,"]",129],[129,"]",130],[130,"]",131],[131,"0",132],[132,"1",[-,"[[n,bedroom_to_garden],"":-"",[[[n,true]]]]01"]]],[[1,"[",2
],[2,"[",93],[93,"n",41],[41,",",95],[95,"f",96],[95,"s",43],[43,"t",7],[7,"o",8],[8,"p",9],[9,"_",10],[10,"a",11],[11,"t",12],[12,"_",13],[13,"t",14],[14,"o",15],[15,"p",16],[16,"]",17],[17,",",18],[18,"""",19],[18,"[",56],[19,":",20],[20,"-",21],[21,"""",22],[22,",",23],[23,"[",24],[24,"[",25],[25,"[",26],[26,"n",27],[27,",",28],[28,"t",29],[29,"r",30],[30,"u",31],[31,"e",32],[32,"]",33],[33,"]",34],[34,"]",35],[35,"]",36],[36,"0",37],[37,"1",[-,"[[n,stop_at_top],"":-"",[[[n,true]]]]01"]],[56,"[",57],[57,"v",58],[58,",",59],[59,"1",60],[60,"]",61],[61,"]",62],[62,",",63],[63,"""",64],[64,":",65],[65,"-",66],[66,"""",67],[67,",",68],[68,"[",69],[69,"[",70],[70,"[",71],[71,"n",72],[72,",",73],[73,"h",74],[74,"e",75],[75,"a",76],[76,"d",77],[77,"]",78],[78,",",79],[79,"[",80],[80,"[",81],[81,"v",82],[82,",",83],[83,"1",84],[84,"]",85],[85,"]",86],[86,"]",87],[87,"]",88],[88,"]",89],[89,"0",90],[90,"1",[-,"[[n,stop_at_top],[[v,1]],"":-"",[[[n,head],[[v,1]]]]]01"]],[96,"u",97],[97,"n",98],[98,"c",99],[99,"t",100],[100,"i",101],[101,"o",102],[102,"n",103],[103,"]",104],[104,",",105],[105,"[",106],[106,"[",107],[107,"v",108],[108,",",109],[109,"2",110],[110,"]",111],[111,",",112],[112,"[",113],[113,"v",114],[114,",",115],[115,"3",116],[116,"]",117],[117,",",118],[118,"[",119],[119,"v",120],[120,",",121],[121,"3",122],[122,"]",123],[123,"]",124],[124,",",125],[125,"""",126],[126,":",127],[127,"-",128],[128,"""",129],[129,",",130],[130,"[",131],[131,"[",132],[132,"[",133],[133,"n",134],[134,",",135],[135,"t",136],[136,"r",137],[137,"u",138],[138,"e",139],[139,"]",140],[140,"]",141],[141,"]",142],[142,"]",143],[143,"0",144],[144,"1",[-,"[[n,function],[[v,2],[v,3],[v,3]],"":-"",[[[n,true]]]]01"]]],[[1,"[",2],[2,"[",134],[134,"n",84],[84,",",136],[136,"c",86],[136,"g",6],[6,"r",7],[7,"a",8],[8,"m",9],[9,"m",10],[10,"a",11],[11,"r",12],[12,"1",13],[13,"]",14],[14,",",15],[15,"[",16],[16,"[",17],[17,"v",18],[18,",",19],[19,"1",20],[20,"]",21],[21,",",22],[22,"[",23],[23,"v",24]
,[24,",",25],[25,"2",26],[26,"]",27],[27,",",28],[28,"[",29],[29,"v",30],[30,",",31],[31,"3",32],[32,"]",33],[33,"]",34],[34,",",35],[35,"""",36],[36,":",37],[37,"-",38],[38,"""",39],[39,",",40],[40,"[",41],[41,"[",42],[42,"[",43],[43,"n",44],[44,",",45],[45,"c",46],[46,"o",47],[47,"m",48],[48,"p",49],[49,"o",50],[50,"u",51],[51,"n",52],[52,"d",53],[53,"2",54],[54,"1",55],[55,"]",56],[56,",",57],[57,"[",58],[58,"[",59],[59,"v",60],[60,",",61],[61,"1",62],[62,"]",63],[63,",",64],[64,"[",65],[65,"v",66],[66,",",67],[67,"2",68],[68,"]",69],[69,",",70],[70,"[",71],[71,"v",72],[72,",",73],[73,"3",74],[74,"]",75],[75,"]",76],[76,"]",77],[77,"]",78],[78,"]",79],[79,"0",80],[80,"1",[-,"[[n,grammar1],[[v,1],[v,2],[v,3]],"":-"",[[[n,compound21],[[v,1],[v,2],[v,3]]]]]01"]],[86,"o",87],[87,"m",88],[88,"p",89],[89,"o",90],[90,"u",91],[91,"n",92],[92,"d",93],[93,"2",94],[94,"1",95],[95,"3",96],[96,"]",97],[97,",",98],[98,"[",99],[99,"[",100],[100,"v",101],[101,",",102],[102,"1",154],[102,"3",103],[103,"]",104],[104,",",105],[105,"[",106],[106,"v",107],[107,",",108],[108,"3",109],[109,"]",110],[110,"]",111],[111,",",112],[112,"""",113],[113,":",114],[114,"-",115],[115,"""",116],[116,",",117],[117,"[",118],[118,"[",119],[119,"[",120],[120,"n",121],[121,",",122],[122,"t",123],[123,"r",124],[124,"u",125],[125,"e",126],[126,"]",127],[127,"]",128],[128,"]",129],[129,"]",130],[130,"0",131],[131,"1",[-,"[[n,compound213],[[v,3],[v,3]],"":-"",[[[n,true]]]]01"]],[154,"]",155],[155,",",156],[156,"[",157],[157,"v",158],[158,",",159],[159,"1",160],[160,"]",161],[161,",",162],[162,"[",163],[163,"v",164],[164,",",165],[165,"3",166],[166,"]",167],[167,",",168],[168,"[",169],[169,"v",170],[170,",",171],[171,"3",172],[172,"]",173],[173,"]",174],[174,",",175],[175,"""",176],[176,":",177],[177,"-",178],[178,"""",179],[179,",",180],[180,"[",181],[181,"[",182],[182,"[",183],[183,"n",184],[184,",",185],[185,"t",186],[186,"r",187],[187,"u",188],[188,"e",189],[189,"]",190],[190,"]",191],[191,"]",192],[192
,"]",193],[193,"0",194],[194,"1",[-,"[[n,compound213],[[v,1],[v,1],[v,3],[v,3]],"":-"",[[[n,true]]]]01"]]],[[1,"[",2],[2,"[",117],[117,"n",55],[55,",",119],[119,"c",57],[57,"o",121],[121,"m",59],[59,"p",123],[123,"o",61],[61,"u",125],[125,"n",63],[63,"d",127],[127,"2",65],[65,"1",129],[129,"2",67],[129,"]",130],[67,"]",17],[17,",",18],[18,"[",19],[19,"[",20],[20,"v",21],[21,",",22],[22,"1",74],[22,"3",23],[23,"]",24],[24,",",25],[25,"[",26],[26,"v",27],[27,",",28],[28,"3",29],[29,"]",30],[30,"]",31],[31,",",32],[32,"""",33],[33,":",34],[34,"-",35],[35,"""",36],[36,",",37],[37,"[",38],[38,"[",39],[39,"[",40],[40,"n",41],[41,",",42],[42,"t",43],[43,"r",44],[44,"u",45],[45,"e",46],[46,"]",47],[47,"]",48],[48,"]",49],[49,"]",50],[50,"0",51],[51,"1",[-,"[[n,compound212],[[v,3],[v,3]],"":-"",[[[n,true]]]]01"]],[74,"]",75],[75,",",76],[76,"[",77],[77,"v",78],[78,",",79],[79,"1",80],[80,"]",81],[81,",",82],[82,"[",83],[83,"v",84],[84,",",85],[85,"3",86],[86,"]",87],[87,",",88],[88,"[",89],[89,"v",90],[90,",",91],[91,"3",92],[92,"]",93],[93,"]",94],[94,",",95],[95,"""",96],[96,":",97],[97,"-",98],[98,"""",99],[99,",",100],[100,"[",101],[101,"[",102],[102,"[",103],[103,"n",104],[104,",",105],[105,"t",106],[106,"r",107],[107,"u",108],[108,"e",109],[109,"]",110],[110,"]",111],[111,"]",112],[112,"]",113],[113,"0",114],[114,"1",[-,"[[n,compound212],[[v,1],[v,1],[v,3],[v,3]],"":-"",[[[n,true]]]]01"]],[130,",",131],[131,"[",132],[132,"[",133],[133,"v",134],[134,",",135],[135,"3",136],[136,"]",137],[137,"]",138],[138,",",139],[139,"""",140],[140,":",141],[141,"-",142],[142,"""",143],[143,",",144],[144,"[",145],[145,"[",146],[146,"[",147],[147,"n",148],[148,",",149],[149,"t",150],[150,"r",151],[151,"u",152],[152,"e",153],[153,"]",154],[154,"]",155],[155,"]",156],[156,"]",157],[157,"0",158],[158,"1",[-,"[[n,compound21],[[v,3]],"":-"",[[[n,true]]]]01"]]],[[1,"[",2],[2,"[",109],[109,"n",51],[51,",",111],[111,"w",53],[53,"o",113],[113,"r",55],[55,"d",115],[115,"2",57],[57,"1",117],[117,"
2",59],[117,"3",118],[59,"]",13],[13,",",14],[14,"[",15],[15,"[",16],[16,"v",17],[17,",",18],[18,"1",66],[18,"3",19],[19,"]",20],[20,",",21],[21,"[",22],[22,"v",23],[23,",",24],[24,"3",25],[25,"]",26],[26,"]",27],[27,",",28],[28,"""",29],[29,":",30],[30,"-",31],[31,"""",32],[32,",",33],[33,"[",34],[34,"[",35],[35,"[",36],[36,"n",37],[37,",",38],[38,"t",39],[39,"r",40],[40,"u",41],[41,"e",42],[42,"]",43],[43,"]",44],[44,"]",45],[45,"]",46],[46,"0",47],[47,"1",[-,"[[n,word212],[[v,3],[v,3]],"":-"",[[[n,true]]]]01"]],[66,"]",67],[67,",",68],[68,"[",69],[69,"v",70],[70,",",71],[71,"1",72],[72,"]",73],[73,",",74],[74,"[",75],[75,"v",76],[76,",",77],[77,"3",78],[78,"]",79],[79,",",80],[80,"[",81],[81,"v",82],[82,",",83],[83,"3",84],[84,"]",85],[85,"]",86],[86,",",87],[87,"""",88],[88,":",89],[89,"-",90],[90,"""",91],[91,",",92],[92,"[",93],[93,"[",94],[94,"[",95],[95,"n",96],[96,",",97],[97,"t",98],[98,"r",99],[99,"u",100],[100,"e",101],[101,"]",102],[102,"]",103],[103,"]",104],[104,"]",105],[105,"0",106],[106,"1",[-,"[[n,word212],[[v,1],[v,1],[v,3],[v,3]],"":-"",[[[n,true]]]]01"]],[118,"]",119],[119,",",120],[120,"[",121],[121,"[",122],[122,"v",123],[123,",",124],[124,"3",125],[125,"]",126],[126,",",127],[127,"[",128],[128,"v",129],[129,",",130],[130,"3",131],[131,"]",132],[132,"]",133],[133,",",134],[134,"""",135],[135,":",136],[136,"-",137],[137,"""",138],[138,",",139],[139,"[",140],[140,"[",141],[141,"[",142],[142,"n",143],[143,",",144],[144,"t",145],[145,"r",146],[146,"u",147],[147,"e",148],[148,"]",149],[149,"]",150],[150,"]",151],[151,"]",152],[152,"0",153],[153,"1",[-,"[[n,word213],[[v,3],[v,3]],"":-"",[[[n,true]]]]01"]]],[[1,"[",2],[2,"[",149],[149,"n",85],[85,",",151],[151,"f",87],[151,"l",152],[151,"s",6],[6,"e",7],[7,"n",8],[8,"t",9],[9,"e",10],[10,"n",11],[11,"c",12],[12,"e",13],[13,"c",14],[14,"h",15],[15,"a",16],[16,"r",17],[17,"s",18],[18,"]",19],[19,",",20],[20,"[",21],[21,"[",22],[22,"v",23],[23,",",24],[24,"1",25],[25,"]",26],[26,",",27],[27,"[",28],[28
,"v",29],[29,",",30],[30,"2",31],[31,"]",32],[32,"]",33],[33,",",34],[34,"""",35],[35,":",36],[36,"-",37],[37,"""",38],[38,",",39],[39,"[",40],[40,"[",41],[41,"[",42],[42,"n",43],[43,",",44],[44,"n",45],[45,"o",46],[46,"t",47],[47,"]",48],[48,",",49],[49,"[",50],[50,"[",51],[51,"[",52],[52,"n",53],[53,",",54],[54,"m",55],[55,"e",56],[56,"m",57],[57,"b",58],[58,"e",59],[59,"r",60],[60,"]",61],[61,",",62],[62,"[",63],[63,"[",64],[64,"v",65],[65,",",66],[66,"2",67],[67,"]",68],[68,",",69],[69,"[",70],[70,"v",71],[71,",",72],[72,"1",73],[73,"]",74],[74,"]",75],[75,"]",76],[76,"]",77],[77,"]",78],[78,"]",79],[79,"]",80],[80,"0",81],[81,"1",[-,"[[n,sentencechars],[[v,1],[v,2]],"":-"",[[[n,not],[[[n,member],[[v,2],[v,1]]]]]]]01"]],[87,"i",88],[88,"n",89],[89,"a",90],[90,"l",91],[91,"c",92],[92,"h",93],[93,"a",94],[94,"r",95],[95,"]",96],[96,",",97],[97,"[",98],[98,"[",99],[99,"v",100],[100,",",101],[101,"1",102],[102,"]",103],[103,",",104],[104,"[",105],[105,"v",106],[106,",",107],[107,"2",108],[108,"]",109],[109,"]",110],[110,",",111],[111,"""",112],[112,":",113],[113,"-",114],[114,"""",115],[115,",",116],[116,"[",117],[117,"[",118],[118,"[",119],[119,"n",120],[120,",",121],[121,"m",122],[122,"e",123],[123,"m",124],[124,"b",125],[125,"e",126],[126,"r",127],[127,"]",128],[128,",",129],[129,"[",130],[130,"[",131],[131,"v",132],[132,",",133],[133,"2",134],[134,"]",135],[135,",",136],[136,"[",137],[137,"v",138],[138,",",139],[139,"1",140],[140,"]",141],[141,"]",142],[142,"]",143],[143,"]",144],[144,"]",145],[145,"0",146],[146,"1",[-,"[[n,finalchar],[[v,1],[v,2]],"":-"",[[[n,member],[[v,2],[v,1]]]]]01"]],[152,"o",153],[153,"o",154],[154,"k",155],[155,"a",156],[156,"h",157],[157,"e",158],[158,"a",159],[159,"d",160],[160,"]",161],[161,",",162],[162,"[",163],[163,"[",164],[164,"v",165],[165,",",166],[166,"1",167],[167,"]",168],[168,",",169],[169,"[",170],[170,"v",171],[171,",",172],[172,"1",173],[173,"]",174],[174,",",175],[175,"[",176],[176,"v",177],[177,",",178],[178,"3",179],[
179,"]",180],[180,"]",181],[181,",",182],[182,"""",183],[183,":",184],[184,"-",185],[185,"""",186],[186,",",187],[187,"[",188],[188,"[",189],[189,"[",190],[190,"n",191],[191,",",192],[192,"s",193],[193,"t",194],[194,"r",195],[195,"i",196],[196,"n",197],[197,"g",198],[198,"c",199],[199,"o",200],[200,"n",201],[201,"c",202],[202,"a",203],[203,"t",204],[204,"]",205],[205,",",206],[206,"[",207],[207,"[",208],[208,"v",209],[209,",",210],[210,"3",211],[211,"]",212],[212,",",213],[213,"[",214],[214,"v",215],[215,",",216],[216,"5",217],[217,"]",218],[218,",",219],[219,"[",220],[220,"v",221],[221,",",222],[222,"1",223],[223,"]",224],[224,"]",225],[225,"]",226],[226,"]",227],[227,"]",228],[228,"0",229],[229,"1",[-,"[[n,lookahead],[[v,1],[v,1],[v,3]],"":-"",[[[n,stringconcat],[[v,3],[v,5],[v,1]]]]]01"]]],[[1,"[",2],[2,"[",134],[134,"n",84],[84,",",136],[136,"c",86],[136,"g",6],[6,"r",7],[7,"a",8],[8,"m",9],[9,"m",10],[10,"a",11],[11,"r",12],[12,"1",13],[13,"]",14],[14,",",15],[15,"[",16],[16,"[",17],[17,"v",18],[18,",",19],[19,"1",20],[20,"]",21],[21,",",22],[22,"[",23],[23,"v",24],[24,",",25],[25,"2",26],[26,"]",27],[27,",",28],[28,"[",29],[29,"v",30],[30,",",31],[31,"3",32],[32,"]",33],[33,"]",34],[34,",",35],[35,"""",36],[36,":",37],[37,"-",38],[38,"""",39],[39,",",40],[40,"[",41],[41,"[",42],[42,"[",43],[43,"n",44],[44,",",45],[45,"c",46],[46,"o",47],[47,"m",48],[48,"p",49],[49,"o",50],[50,"u",51],[51,"n",52],[52,"d",53],[53,"2",54],[54,"1",55],[55,"]",56],[56,",",57],[57,"[",58],[58,"[",59],[59,"v",60],[60,",",61],[61,"1",62],[62,"]",63],[63,",",64],[64,"[",65],[65,"v",66],[66,",",67],[67,"2",68],[68,"]",69],[69,",",70],[70,"[",71],[71,"v",72],[72,",",73],[73,"3",74],[74,"]",75],[75,"]",76],[76,"]",77],[77,"]",78],[78,"]",79],[79,"0",80],[80,"1",[-,"[[n,grammar1],[[v,1],[v,2],[v,3]],"":-"",[[[n,compound21],[[v,1],[v,2],[v,3]]]]]01"]],[86,"o",87],[87,"m",88],[88,"p",89],[89,"o",90],[90,"u",91],[91,"n",92],[92,"d",93],[93,"2",94],[94,"1",95],[95,"3",96],[96,"]",97],[97,",",9
8],[98,"[",99],[99,"[",100],[100,"v",101],[101,",",102],[102,"1",154],[102,"3",103],[103,"]",104],[104,",",105],[105,"[",106],[106,"v",107],[107,",",108],[108,"3",109],[109,"]",110],[110,"]",111],[111,",",112],[112,"""",113],[113,":",114],[114,"-",115],[115,"""",116],[116,",",117],[117,"[",118],[118,"[",119],[119,"[",120],[120,"n",121],[121,",",122],[122,"t",123],[123,"r",124],[124,"u",125],[125,"e",126],[126,"]",127],[127,"]",128],[128,"]",129],[129,"]",130],[130,"0",131],[131,"1",[-,"[[n,compound213],[[v,3],[v,3]],"":-"",[[[n,true]]]]01"]],[154,"]",155],[155,",",156],[156,"[",157],[157,"v",158],[158,",",159],[159,"1",160],[160,"]",161],[161,",",162],[162,"[",163],[163,"v",164],[164,",",165],[165,"3",166],[166,"]",167],[167,",",168],[168,"[",169],[169,"v",170],[170,",",171],[171,"3",172],[172,"]",173],[173,"]",174],[174,",",175],[175,"""",176],[176,":",177],[177,"-",178],[178,"""",179],[179,",",180],[180,"[",181],[181,"[",182],[182,"[",183],[183,"n",184],[184,",",185],[185,"t",186],[186,"r",187],[187,"u",188],[188,"e",189],[189,"]",190],[190,"]",191],[191,"]",192],[192,"]",193],[193,"0",194],[194,"1",[-,"[[n,compound213],[[v,1],[v,1],[v,3],[v,3]],"":-"",[[[n,true]]]]01"]]],[[1,"[",2],[2,"[",117],[117,"n",55],[55,",",119],[119,"c",57],[57,"o",121],[121,"m",59],[59,"p",123],[123,"o",61],[61,"u",125],[125,"n",63],[63,"d",127],[127,"2",65],[65,"1",129],[129,"2",67],[129,"]",130],[67,"]",17],[17,",",18],[18,"[",19],[19,"[",20],[20,"v",21],[21,",",22],[22,"1",74],[22,"3",23],[23,"]",24],[24,",",25],[25,"[",26],[26,"v",27],[27,",",28],[28,"3",29],[29,"]",30],[30,"]",31],[31,",",32],[32,"""",33],[33,":",34],[34,"-",35],[35,"""",36],[36,",",37],[37,"[",38],[38,"[",39],[39,"[",40],[40,"n",41],[41,",",42],[42,"t",43],[43,"r",44],[44,"u",45],[45,"e",46],[46,"]",47],[47,"]",48],[48,"]",49],[49,"]",50],[50,"0",51],[51,"1",[-,"[[n,compound212],[[v,3],[v,3]],"":-"",[[[n,true]]]]01"]],[74,"]",75],[75,",",76],[76,"[",77],[77,"v",78],[78,",",79],[79,"1",80],[80,"]",81],[81,",",82],[8
2,"[",83],[83,"v",84],[84,",",85],[85,"3",86],[86,"]",87],[87,",",88],[88,"[",89],[89,"v",90],[90,",",91],[91,"3",92],[92,"]",93],[93,"]",94],[94,",",95],[95,"""",96],[96,":",97],[97,"-",98],[98,"""",99],[99,",",100],[100,"[",101],[101,"[",102],[102,"[",103],[103,"n",104],[104,",",105],[105,"t",106],[106,"r",107],[107,"u",108],[108,"e",109],[109,"]",110],[110,"]",111],[111,"]",112],[112,"]",113],[113,"0",114],[114,"1",[-,"[[n,compound212],[[v,1],[v,1],[v,3],[v,3]],"":-"",[[[n,true]]]]01"]],[130,",",131],[131,"[",132],[132,"[",133],[133,"v",134],[134,",",135],[135,"3",136],[136,"]",137],[137,"]",138],[138,",",139],[139,"""",140],[140,":",141],[141,"-",142],[142,"""",143],[143,",",144],[144,"[",145],[145,"[",146],[146,"[",147],[147,"n",148],[148,",",149],[149,"t",150],[150,"r",151],[151,"u",152],[152,"e",153],[153,"]",154],[154,"]",155],[155,"]",156],[156,"]",157],[157,"0",158],[158,"1",[-,"[[n,compound21],[[v,3]],"":-"",[[[n,true]]]]01"]]],[[1,"[",2],[2,"[",109],[109,"n",51],[51,",",111],[111,"w",53],[53,"o",113],[113,"r",55],[55,"d",115],[115,"2",57],[57,"1",117],[117,"2",59],[117,"3",118],[59,"]",13],[13,",",14],[14,"[",15],[15,"[",16],[16,"v",17],[17,",",18],[18,"1",66],[18,"3",19],[19,"]",20],[20,",",21],[21,"[",22],[22,"v",23],[23,",",24],[24,"3",25],[25,"]",26],[26,"]",27],[27,",",28],[28,"""",29],[29,":",30],[30,"-",31],[31,"""",32],[32,",",33],[33,"[",34],[34,"[",35],[35,"[",36],[36,"n",37],[37,",",38],[38,"t",39],[39,"r",40],[40,"u",41],[41,"e",42],[42,"]",43],[43,"]",44],[44,"]",45],[45,"]",46],[46,"0",47],[47,"1",[-,"[[n,word212],[[v,3],[v,3]],"":-"",[[[n,true]]]]01"]],[66,"]",67],[67,",",68],[68,"[",69],[69,"v",70],[70,",",71],[71,"1",72],[72,"]",73],[73,",",74],[74,"[",75],[75,"v",76],[76,",",77],[77,"3",78],[78,"]",79],[79,",",80],[80,"[",81],[81,"v",82],[82,",",83],[83,"3",84],[84,"]",85],[85,"]",86],[86,",",87],[87,"""",88],[88,":",89],[89,"-",90],[90,"""",91],[91,",",92],[92,"[",93],[93,"[",94],[94,"[",95],[95,"n",96],[96,",",97],[97,"t",98],[98,"r",
99],[99,"u",100],[100,"e",101],[101,"]",102],[102,"]",103],[103,"]",104],[104,"]",105],[105,"0",106],[106,"1",[-,"[[n,word212],[[v,1],[v,1],[v,3],[v,3]],"":-"",[[[n,true]]]]01"]],[118,"]",119],[119,",",120],[120,"[",121],[121,"[",122],[122,"v",123],[123,",",124],[124,"3",125],[125,"]",126],[126,",",127],[127,"[",128],[128,"v",129],[129,",",130],[130,"3",131],[131,"]",132],[132,"]",133],[133,",",134],[134,"""",135],[135,":",136],[136,"-",137],[137,"""",138],[138,",",139],[139,"[",140],[140,"[",141],[141,"[",142],[142,"n",143],[143,",",144],[144,"t",145],[145,"r",146],[146,"u",147],[147,"e",148],[148,"]",149],[149,"]",150],[150,"]",151],[151,"]",152],[152,"0",153],[153,"1",[-,"[[n,word213],[[v,3],[v,3]],"":-"",[[[n,true]]]]01"]]],[[1,"[",2],[2,"[",149],[149,"n",85],[85,",",151],[151,"f",87],[151,"l",152],[151,"s",6],[6,"e",7],[7,"n",8],[8,"t",9],[9,"e",10],[10,"n",11],[11,"c",12],[12,"e",13],[13,"c",14],[14,"h",15],[15,"a",16],[16,"r",17],[17,"s",18],[18,"]",19],[19,",",20],[20,"[",21],[21,"[",22],[22,"v",23],[23,",",24],[24,"1",25],[25,"]",26],[26,",",27],[27,"[",28],[28,"v",29],[29,",",30],[30,"2",31],[31,"]",32],[32,"]",33],[33,",",34],[34,"""",35],[35,":",36],[36,"-",37],[37,"""",38],[38,",",39],[39,"[",40],[40,"[",41],[41,"[",42],[42,"n",43],[43,",",44],[44,"n",45],[45,"o",46],[46,"t",47],[47,"]",48],[48,",",49],[49,"[",50],[50,"[",51],[51,"[",52],[52,"n",53],[53,",",54],[54,"m",55],[55,"e",56],[56,"m",57],[57,"b",58],[58,"e",59],[59,"r",60],[60,"]",61],[61,",",62],[62,"[",63],[63,"[",64],[64,"v",65],[65,",",66],[66,"2",67],[67,"]",68],[68,",",69],[69,"[",70],[70,"v",71],[71,",",72],[72,"1",73],[73,"]",74],[74,"]",75],[75,"]",76],[76,"]",77],[77,"]",78],[78,"]",79],[79,"]",80],[80,"0",81],[81,"1",[-,"[[n,sentencechars],[[v,1],[v,2]],"":-"",[[[n,not],[[[n,member],[[v,2],[v,1]]]]]]]01"]],[87,"i",88],[88,"n",89],[89,"a",90],[90,"l",91],[91,"c",92],[92,"h",93],[93,"a",94],[94,"r",95],[95,"]",96],[96,",",97],[97,"[",98],[98,"[",99],[99,"v",100],[100,",",101],[101,"1",
102],[102,"]",103],[103,",",104],[104,"[",105],[105,"v",106],[106,",",107],[107,"2",108],[108,"]",109],[109,"]",110],[110,",",111],[111,"""",112],[112,":",113],[113,"-",114],[114,"""",115],[115,",",116],[116,"[",117],[117,"[",118],[118,"[",119],[119,"n",120],[120,",",121],[121,"m",122],[122,"e",123],[123,"m",124],[124,"b",125],[125,"e",126],[126,"r",127],[127,"]",128],[128,",",129],[129,"[",130],[130,"[",131],[131,"v",132],[132,",",133],[133,"2",134],[134,"]",135],[135,",",136],[136,"[",137],[137,"v",138],[138,",",139],[139,"1",140],[140,"]",141],[141,"]",142],[142,"]",143],[143,"]",144],[144,"]",145],[145,"0",146],[146,"1",[-,"[[n,finalchar],[[v,1],[v,2]],"":-"",[[[n,member],[[v,2],[v,1]]]]]01"]],[152,"o",153],[153,"o",154],[154,"k",155],[155,"a",156],[156,"h",157],[157,"e",158],[158,"a",159],[159,"d",160],[160,"]",161],[161,",",162],[162,"[",163],[163,"[",164],[164,"v",165],[165,",",166],[166,"1",167],[167,"]",168],[168,",",169],[169,"[",170],[170,"v",171],[171,",",172],[172,"1",173],[173,"]",174],[174,",",175],[175,"[",176],[176,"v",177],[177,",",178],[178,"3",179],[179,"]",180],[180,"]",181],[181,",",182],[182,"""",183],[183,":",184],[184,"-",185],[185,"""",186],[186,",",187],[187,"[",188],[188,"[",189],[189,"[",190],[190,"n",191],[191,",",192],[192,"s",193],[193,"t",194],[194,"r",195],[195,"i",196],[196,"n",197],[197,"g",198],[198,"c",199],[199,"o",200],[200,"n",201],[201,"c",202],[202,"a",203],[203,"t",204],[204,"]",205],[205,",",206],[206,"[",207],[207,"[",208],[208,"v",209],[209,",",210],[210,"3",211],[211,"]",212],[212,",",213],[213,"[",214],[214,"v",215],[215,",",216],[216,"5",217],[217,"]",218],[218,",",219],[219,"[",220],[220,"v",221],[221,",",222],[222,"1",223],[223,"]",224],[224,"]",225],[225,"]",226],[226,"]",227],[227,"]",228],[228,"0",229],[229,"1",[-,"[[n,lookahead],[[v,1],[v,1],[v,3]],"":-"",[[[n,stringconcat],[[v,3],[v,5],[v,1]]]]]01"]]],[[1,"[",2],[2,"[",120],[120,"n",90],[90,",",122],[122,"e",123],[122,"q",6],[122,"t",92],[6,"u",7],[7,"e",8],[8
,"r",9],[9,"y",10],[10,"_",11],[11,"p",12],[12,"r",13],[13,"e",14],[14,"d",15],[15,"]",16],[16,",",17],[17,"""",18],[18,":",19],[19,"-",20],[20,"""",21],[21,",",22],[22,"[",23],[23,"[",24],[24,"[",25],[25,"n",26],[26,",",27],[27,"e",28],[28,"q",29],[29,"u",30],[30,"a",31],[31,"l",32],[32,"s",33],[33,"4",34],[34,"_",35],[35,"o",36],[36,"n",37],[37,"]",38],[38,"]",39],[39,",",40],[40,"[",41],[41,"[",42],[42,"n",43],[43,",",44],[44,"c",45],[45,"h",46],[46,"e",47],[47,"c",48],[48,"k",49],[49,"t",50],[50,"y",51],[51,"p",52],[52,"e",53],[53,"s",54],[54,"_",55],[55,"i",56],[56,"n",57],[57,"p",58],[58,"u",59],[59,"t",60],[60,"s",61],[61,"]",62],[62,",",63],[63,"[",64],[64,"]",65],[65,"]",66],[66,",",67],[67,"[",68],[68,"[",69],[69,"n",70],[70,",",71],[71,"e",72],[72,"q",73],[73,"u",74],[74,"a",75],[75,"l",76],[76,"s",77],[77,"4",78],[78,"_",79],[79,"o",80],[80,"n",81],[81,"]",82],[82,"]",83],[83,"]",84],[84,"]",85],[85,"0",86],[86,"1",[-,"[[n,query_pred],"":-"",[[[n,equals4_on]],[[n,checktypes_inputs],[]],[[n,equals4_on]]]]01"]],[92,"y",93],[93,"p",94],[94,"e",95],[95,"s",96],[96,"]",97],[97,",",98],[98,"""",99],[99,":",100],[100,"-",101],[101,"""",102],[102,",",103],[103,"[",104],[104,"[",105],[105,"[",106],[106,"n",107],[107,",",108],[108,"t",109],[109,"r",110],[110,"u",111],[111,"e",112],[112,"]",113],[113,"]",114],[114,"]",115],[115,"]",116],[116,"0",117],[117,"1",[-,"[[n,types],"":-"",[[[n,true]]]]01"]],[123,"x",124],[124,"t",125],[125,"r",126],[126,"a",127],[127,"c",128],[128,"t",129],[129,"_",130],[130,"m",131],[131,"o",132],[132,"d",133],[133,"e",134],[134,"s",135],[135,"2",136],[136,"]",137],[137,",",138],[138,"[",139],[139,"[",140],[140,"v",141],[141,",",142],[142,"2",143],[143,"]",144],[144,",",145],[145,"[",146],[146,"v",147],[147,",",148],[148,"2",149],[149,"]",150],[150,",",151],[151,"[",152],[152,"v",153],[153,",",154],[154,"5",155],[155,"]",156],[156,",",157],[157,"[",158],[158,"v",159],[159,",",160],[160,"5",161],[161,"]",162],[162,"]",163],[163,",",164],[1
64,"""",165],[165,":",166],[166,"-",167],[167,"""",168],[168,",",169],[169,"[",170],[170,"[",171],[171,"[",172],[172,"n",173],[173,",",174],[174,"c",175],[175,"u",176],[176,"t",177],[177,"]",178],[178,"]",179],[179,"]",180],[180,"]",181],[181,"0",182],[182,"1",[-,"[[n,extract_modes2],[[v,2],[v,2],[v,5],[v,5]],"":-"",[[[n,cut]]]]01"]]],[[1,"[",2],[2,"[",109],[109,"n",54],[54,",",111],[111,"c",56],[111,"d",112],[56,"h",7],[7,"e",8],[8,"c",9],[9,"k",10],[10,"t",11],[11,"y",12],[12,"p",13],[13,"e",14],[14,"s",15],[15,"1",16],[15,"3",66],[16,"]",17],[17,",",18],[18,"[",19],[19,"[",20],[20,"v",21],[21,",",22],[22,"3",23],[23,"]",24],[24,",",25],[25,"[",26],[26,"v",27],[27,",",28],[28,"4",29],[29,"]",30],[30,"]",31],[31,",",32],[32,"""",33],[33,":",34],[34,"-",35],[35,"""",36],[36,",",37],[37,"[",38],[38,"[",39],[39,"[",40],[40,"n",41],[41,",",42],[42,"c",43],[43,"u",44],[44,"t",45],[45,"]",46],[46,"]",47],[47,"]",48],[48,"]",49],[49,"0",50],[50,"1",[-,"[[n,checktypes1],[[v,3],[v,4]],"":-"",[[[n,cut]]]]01"]],[66,"]",67],[67,",",68],[68,"[",69],[69,"[",70],[70,"v",71],[71,",",72],[72,"2",73],[73,"]",74],[74,",",75],[75,"[",76],[76,"v",77],[77,",",78],[78,"3",79],[79,"]",80],[80,",",81],[81,"[",82],[82,"v",83],[83,",",84],[84,"4",85],[85,"]",86],[86,"]",87],[87,",",88],[88,"""",89],[89,":",90],[90,"-",91],[91,"""",92],[92,",",93],[93,"[",94],[94,"[",95],[95,"[",96],[96,"n",97],[97,",",98],[98,"c",99],[99,"u",100],[100,"t",101],[101,"]",102],[102,"]",103],[103,"]",104],[104,"]",105],[105,"0",106],[106,"1",[-,"[[n,checktypes3],[[v,2],[v,3],[v,4]],"":-"",[[[n,cut]]]]01"]],[112,"e",113],[113,"b",114],[114,"u",115],[115,"g",116],[116,"_",117],[117,"c",118],[118,"a",119],[119,"l",120],[120,"l",121],[121,"]",122],[122,",",123],[123,"[",124],[124,"[",125],[125,"v",126],[126,",",127],[127,"1",128],[128,"]",129],[129,",",130],[130,"[",131],[131,"v",132],[132,",",133],[133,"2",134],[134,"]",135],[135,",",136],[136,"[",137],[137,"v",138],[138,",",139],[139,"3",140],[140,"]",141],[141,"]
",142],[142,",",143],[143,"""",144],[144,":",145],[145,"-",146],[146,"""",147],[147,",",148],[148,"[",149],[149,"[",150],[150,"[",151],[151,"n",152],[152,",",153],[153,"w",154],[154,"r",155],[155,"i",156],[156,"t",157],[157,"e",158],[158,"l",159],[159,"n",160],[160,"]",161],[161,",",162],[162,"[",163],[163,"[",164],[164,"v",165],[165,",",166],[166,"2",167],[167,"]",168],[168,",",169],[169,"[",170],[170,"v",171],[171,",",172],[172,"3",173],[173,"]",174],[174,"]",175],[175,"]",176],[176,"]",177],[177,"]",178],[178,"0",179],[179,"1",[-,"[[n,debug_call],[[v,1],[v,2],[v,3]],"":-"",[[[n,writeln],[[v,2],[v,3]]]]]01"]]],[[1,"[",2],[2,"[",160],[160,"n",77],[77,",",162],[162,"d",79],[79,"e",164],[164,"b",81],[81,"u",166],[166,"g",83],[83,"_",168],[168,"e",12],[168,"f",85],[168,"t",169],[12,"x",13],[13,"i",14],[14,"t",15],[15,"]",16],[16,",",17],[17,"[",18],[18,"[",19],[19,"v",20],[20,",",21],[21,"1",22],[22,"]",23],[23,",",24],[24,"[",25],[25,"v",26],[26,",",27],[27,"2",28],[28,"]",29],[29,",",30],[30,"[",31],[31,"v",32],[32,",",33],[33,"3",34],[34,"]",35],[35,"]",36],[36,",",37],[37,"""",38],[38,":",39],[39,"-",40],[40,"""",41],[41,",",42],[42,"[",43],[43,"[",44],[44,"[",45],[45,"n",46],[46,",",47],[47,"w",48],[48,"r",49],[49,"i",50],[50,"t",51],[51,"e",52],[52,"l",53],[53,"n",54],[54,"]",55],[55,",",56],[56,"[",57],[57,"[",58],[58,"v",59],[59,",",60],[60,"2",61],[61,"]",62],[62,",",63],[63,"[",64],[64,"v",65],[65,",",66],[66,"3",67],[67,"]",68],[68,"]",69],[69,"]",70],[70,"]",71],[71,"]",72],[72,"0",73],[73,"1",[-,"[[n,debug_exit],[[v,1],[v,2],[v,3]],"":-"",[[[n,writeln],[[v,2],[v,3]]]]]01"]],[85,"a",86],[86,"i",87],[87,"l",88],[88,"]",89],[89,",",90],[90,"[",91],[91,"[",92],[92,"v",93],[93,",",94],[94,"1",95],[95,"]",96],[96,",",97],[97,"[",98],[98,"v",99],[99,",",100],[100,"2",101],[101,"]",102],[102,",",103],[103,"[",104],[104,"v",105],[105,",",106],[106,"3",107],[107,"]",108],[108,"]",109],[109,",",110],[110,"""",111],[111,":",112],[112,"-",113],[113,"""",114],[114,",",
115],[115,"[",116],[116,"[",117],[117,"[",118],[118,"n",119],[119,",",120],[120,"w",121],[121,"r",122],[122,"i",123],[123,"t",124],[124,"e",125],[125,"l",126],[126,"n",127],[127,"]",128],[128,",",129],[129,"[",130],[130,"[",131],[131,"v",132],[132,",",133],[133,"2",134],[134,"]",135],[135,",",136],[136,"[",137],[137,"v",138],[138,",",139],[139,"3",140],[140,"]",141],[141,"]",142],[142,"]",143],[143,",",144],[144,"[",145],[145,"[",146],[146,"n",147],[147,",",148],[148,"f",149],[149,"a",150],[150,"i",151],[151,"l",152],[152,"]",153],[153,"]",154],[154,"]",155],[155,"]",156],[156,"0",157],[157,"1",[-,"[[n,debug_fail],[[v,1],[v,2],[v,3]],"":-"",[[[n,writeln],[[v,2],[v,3]]],[[n,fail]]]]01"]],[169,"y",170],[170,"p",171],[171,"e",172],[172,"s",173],[173,"_",174],[174,"c",175],[175,"a",176],[176,"l",177],[177,"l",178],[178,"]",179],[179,",",180],[180,"[",181],[181,"[",182],[182,"v",183],[183,",",184],[184,"1",185],[185,"]",186],[186,"]",187],[187,",",188],[188,"""",189],[189,":",190],[190,"-",191],[191,"""",192],[192,",",193],[193,"[",194],[194,"[",195],[195,"[",196],[196,"n",197],[197,",",198],[198,"w",199],[199,"r",200],[200,"i",201],[201,"t",202],[202,"e",203],[203,"l",204],[204,"n",205],[205,"]",206],[206,",",207],[207,"[",208],[208,"[",209],[209,"v",210],[210,",",211],[211,"1",212],[212,"]",213],[213,"]",214],[214,"]",215],[215,"]",216],[216,"]",217],[217,"0",218],[218,"1",[-,"[[n,debug_types_call],[[v,1]],"":-"",[[[n,writeln],[[v,1]]]]]01"]]],[[1,"[",2],[2,"[",136],[136,"n",65],[65,",",138],[138,"d",67],[138,"i",139],[67,"e",7],[7,"b",8],[8,"u",9],[9,"g",10],[10,"_",11],[11,"t",12],[12,"y",13],[13,"p",14],[14,"e",15],[15,"s",16],[16,"_",17],[17,"e",18],[17,"f",79],[18,"x",19],[19,"i",20],[20,"t",21],[21,"]",22],[22,",",23],[23,"[",24],[24,"[",25],[25,"v",26],[26,",",27],[27,"1",28],[28,"]",29],[29,"]",30],[30,",",31],[31,"""",32],[32,":",33],[33,"-",34],[34,"""",35],[35,",",36],[36,"[",37],[37,"[",38],[38,"[",39],[39,"n",40],[40,",",41],[41,"w",42],[42,"r",43],[43,"i"
,44],[44,"t",45],[45,"e",46],[46,"l",47],[47,"n",48],[48,"]",49],[49,",",50],[50,"[",51],[51,"[",52],[52,"v",53],[53,",",54],[54,"1",55],[55,"]",56],[56,"]",57],[57,"]",58],[58,"]",59],[59,"]",60],[60,"0",61],[61,"1",[-,"[[n,debug_types_exit],[[v,1]],"":-"",[[[n,writeln],[[v,1]]]]]01"]],[79,"a",80],[80,"i",81],[81,"l",82],[82,"]",83],[83,",",84],[84,"[",85],[85,"[",86],[86,"v",87],[87,",",88],[88,"1",89],[89,"]",90],[90,"]",91],[91,",",92],[92,"""",93],[93,":",94],[94,"-",95],[95,"""",96],[96,",",97],[97,"[",98],[98,"[",99],[99,"[",100],[100,"n",101],[101,",",102],[102,"w",103],[103,"r",104],[104,"i",105],[105,"t",106],[106,"e",107],[107,"l",108],[108,"n",109],[109,"]",110],[110,",",111],[111,"[",112],[112,"[",113],[113,"v",114],[114,",",115],[115,"1",116],[116,"]",117],[117,"]",118],[118,"]",119],[119,",",120],[120,"[",121],[121,"[",122],[122,"n",123],[123,",",124],[124,"f",125],[125,"a",126],[126,"i",127],[127,"l",128],[128,"]",129],[129,"]",130],[130,"]",131],[131,"]",132],[132,"0",133],[133,"1",[-,"[[n,debug_types_fail],[[v,1]],"":-"",[[[n,writeln],[[v,1]]],[[n,fail]]]]01"]],[139,"s",140],[140,"_",141],[141,"l",142],[142,"i",143],[143,"s",144],[144,"t",145],[145,"]",146],[146,",",147],[147,"[",148],[148,"[",149],[149,"v",150],[150,",",151],[151,"1",152],[152,"]",153],[153,"]",154],[154,",",155],[155,"""",156],[156,":",157],[157,"-",158],[158,"""",159],[159,",",160],[160,"[",161],[161,"[",162],[162,"[",163],[163,"n",164],[164,",",165],[165,"=",166],[166,"]",167],[167,",",168],[168,"[",169],[169,"[",170],[170,"v",171],[171,",",172],[172,"1",173],[173,"]",174],[174,"]",175],[175,"]",176],[176,"]",177],[177,"]",178],[178,"0",179],[179,"1",[-,"[[n,is_list],[[v,1]],"":-"",[[[n,=],[[v,1]]]]]01"]]],[[1,"[",2],[2,"[",111],[111,"n",68],[68,",",113],[113,"c",70],[113,"i",6],[6,"s",7],[7,"_",8],[8,"l",9],[9,"i",10],[10,"s",11],[11,"t",12],[12,"]",13],[13,",",14],[14,"[",15],[15,"[",16],[16,"v",17],[17,",",18],[18,"1",19],[19,"]",20],[20,"]",21],[21,",",22],[22,"""",23],[23,
":",24],[24,"-",25],[25,"""",26],[26,",",27],[27,"[",28],[28,"[",29],[29,"[",30],[30,"n",31],[31,",",32],[32,"e",33],[33,"q",34],[34,"u",35],[35,"a",36],[36,"l",37],[37,"s",38],[38,"4",39],[39,"]",40],[40,",",41],[41,"[",42],[42,"[",43],[43,"v",44],[44,",",45],[45,"1",46],[46,"]",47],[47,",",48],[48,"[",49],[49,"v",50],[50,",",51],[51,"3",52],[52,"]",53],[53,",",54],[54,"[",55],[55,"v",56],[56,",",57],[57,"5",58],[58,"]",59],[59,"]",60],[60,"]",61],[61,"]",62],[62,"]",63],[63,"0",64],[64,"1",[-,"[[n,is_list],[[v,1]],"":-"",[[[n,equals4],[[v,1],[v,3],[v,5]]]]]01"]],[70,"o",71],[71,"u",72],[72,"n",73],[73,"t",74],[74,"]",75],[75,",",76],[76,"[",77],[77,"[",78],[78,"v",79],[79,",",80],[80,"1",81],[81,"]",82],[82,"]",83],[83,",",84],[84,"""",85],[85,":",86],[86,"-",87],[87,"""",88],[88,",",89],[89,"[",90],[90,"[",91],[91,"[",92],[92,"n",93],[93,",",94],[94,"=",95],[95,"]",96],[96,",",97],[97,"[",98],[98,"[",99],[99,"v",100],[100,",",101],[101,"1",102],[102,"]",103],[103,"]",104],[104,"]",105],[105,"]",106],[106,"]",107],[107,"0",108]],[[1,"[",2],[2,"[",107],[107,"n",48],[48,",",109],[109,"a",110],[109,"c",6],[109,"f",50],[6,"o",7],[7,"u",8],[8,"n",9],[9,"t",10],[10,"]",11],[11,",",12],[12,"[",13],[13,"[",14],[14,"v",15],[15,",",16],[16,"1",17],[17,"]",18],[18,"]",19],[19,",",20],[20,"""",21],[21,":",22],[22,"-",23],[23,"""",24],[24,",",25],[25,"[",26],[26,"[",27],[27,"[",28],[28,"n",29],[29,",",30],[30,"=",31],[31,"]",32],[32,",",33],[33,"[",34],[34,"[",35],[35,"v",36],[36,",",37],[37,"1",38],[38,"]",39],[39,"]",40],[40,"]",41],[41,"]",42],[42,"]",43],[43,"0",44],[44,"1",[-,"[[n,count],[[v,1]],"":-"",[[[n,=],[[v,1]]]]]01"]],[50,"u",51],[51,"n",52],[52,"c",53],[53,"t",54],[54,"i",55],[55,"o",56],[56,"n",57],[57,"]",58],[58,",",59],[59,"[",60],[60,"[",61],[61,"v",62],[62,",",63],[63,"1",64],[64,"]",65],[65,",",66],[66,"[",67],[67,"v",68],[68,",",69],[69,"1",70],[70,"]",71],[71,",",72],[72,"[",73],[73,"v",74],[74,",",75],[75,"3",76],[76,"]",77],[77,",",78],[78,"[",79],[79,
"v",80],[80,",",81],[81,"3",82],[82,"]",83],[83,"]",84],[84,",",85],[85,"""",86],[86,":",87],[87,"-",88],[88,"""",89],[89,",",90],[90,"[",91],[91,"[",92],[92,"[",93],[93,"n",94],[94,",",95],[95,"t",96],[96,"r",97],[97,"u",98],[98,"e",99],[99,"]",100],[100,"]",101],[101,"]",102],[102,"]",103],[103,"0",104],[104,"1",[-,"[[n,function],[[v,1],[v,1],[v,3],[v,3]],"":-"",[[[n,true]]]]01"]],[110,"p",111],[111,"p",112],[112,"e",113],[113,"n",114],[114,"d",115],[115,"1",116],[116,"]",117],[117,",",118],[118,"[",119],[119,"[",120],[120,"v",121],[121,",",122],[122,"1",123],[123,"]",124],[124,"]",125],[125,",",126],[126,"""",127],[127,":",128],[128,"-",129],[129,"""",130],[130,",",131],[131,"[",132],[132,"[",133],[133,"[",134],[134,"n",135],[135,",",136],[136,"a",137],[137,"]",138],[138,",",139],[139,"[",140],[140,"[",141],[141,"v",142],[142,",",143],[143,"1",144],[144,"]",145],[145,"]",146],[146,"]",147],[147,"]",148],[148,"]",149],[149,"0",150],[150,"1",[-,"[[n,append1],[[v,1]],"":-"",[[[n,a],[[v,1]]]]]01"]]],[[1,"[",2],[2,"[",73],[73,"n",31],[31,",",75],[75,"a",6],[75,"e",33],[6,"]",7],[7,",",8],[8,"""",9],[9,":",10],[10,"-",11],[11,"""",12],[12,",",13],[13,"[",14],[14,"[",15],[15,"[",16],[16,"n",17],[17,",",18],[18,"t",19],[19,"r",20],[20,"u",21],[21,"e",22],[22,"]",23],[23,"]",24],[24,"]",25],[25,"]",26],[26,"0",27],[27,"1",[-,"[[n,a],"":-"",[[[n,true]]]]01"]],[33,"q",34],[34,"u",35],[35,"a",36],[36,"l",37],[37,"s",38],[38,"4",39],[39,"1",83],[39,"_",40],[40,"o",41],[41,"n",42],[42,"1",43],[43,"]",44],[44,",",45],[45,"""",46],[46,":",47],[47,"-",48],[48,"""",49],[49,",",50],[50,"[",51],[51,"[",52],[52,"[",53],[53,"n",54],[54,",",55],[55,"e",56],[56,"q",57],[57,"u",58],[58,"a",59],[59,"l",60],[60,"s",61],[61,"4",62],[62,"_",63],[63,"o",64],[64,"n",65],[65,"]",66],[66,"]",67],[67,"]",68],[68,"]",69],[69,"0",70],[70,"1",[-,"[[n,equals4_on1],"":-"",[[[n,equals4_on]]]]01"]],[83,"]",84],[84,",",85],[85,"""",86],[86,":",87],[87,"-",88],[88,"""",89],[89,",",90],[90,"[",91],[91,"[",
92],[92,"[",93],[93,"n",94],[94,",",95],[95,"t",96],[96,"r",97],[97,"u",98],[98,"e",99],[99,"]",100],[100,"]",101],[101,"]",102],[102,"]",103],[103,"0",104],[104,"1",[-,"[[n,equals41],"":-"",[[[n,true]]]]01"]]],[[1,"[",2],[2,"[",71],[71,"n",38],[38,",",73],[73,"e",40],[40,"q",75],[75,"u",42],[42,"a",77],[77,"l",44],[44,"s",79],[79,"4",46],[46,"1",81],[81,"]",48],[48,",",83],[83,"""",50],[50,":",85],[85,"-",52],[52,"""",87],[87,",",54],[54,"[",89],[89,"[",56],[56,"[",91],[91,"n",58],[58,",",93],[93,"t",60],[60,"r",95],[95,"u",62],[62,"e",97],[97,"]",64],[64,"]",99],[99,"]",66],[66,"]",101],[101,"0",68]],[[1,"[",2],[2,"[",71],[71,"n",38],[38,",",73],[73,"e",40],[40,"q",75],[75,"u",42],[42,"a",77],[77,"l",44],[44,"s",79],[79,"4",46],[46,"1",81],[81,"]",48],[48,",",83],[83,"""",50],[50,":",85],[85,"-",52],[52,"""",87],[87,",",54],[54,"[",89],[89,"[",56],[56,"[",91],[91,"n",58],[58,",",93],[93,"t",60],[60,"r",95],[95,"u",62],[62,"e",97],[97,"]",64],[64,"]",99],[99,"]",66],[66,"]",101],[101,"0",68]],[[1,"[",2],[2,"[",71],[71,"n",38],[38,",",73],[73,"e",40],[40,"q",75],[75,"u",42],[42,"a",77],[77,"l",44],[44,"s",79],[79,"4",46],[46,"1",81],[81,"]",48],[48,",",83],[83,"""",50],[50,":",85],[85,"-",52],[52,"""",87],[87,",",54],[54,"[",89],[89,"[",56],[56,"[",91],[91,"n",58],[58,",",93],[93,"t",60],[60,"r",95],[95,"u",62],[62,"e",97],[97,"]",64],[64,"]",99],[99,"]",66],[66,"]",101],[101,"0",68]],[[1,"[",2],[2,"[",71],[71,"n",38],[38,",",73],[73,"e",40],[40,"q",75],[75,"u",42],[42,"a",77],[77,"l",44],[44,"s",79],[79,"4",46],[46,"1",81],[81,"]",48],[48,",",83],[83,"""",50],[50,":",85],[85,"-",52],[52,"""",87],[87,",",54],[54,"[",89],[89,"[",56],[56,"[",91],[91,"n",58],[58,",",93],[93,"t",60],[60,"r",95],[95,"u",62],[62,"e",97],[97,"]",64],[64,"]",99],[99,"]",66],[66,"]",101],[101,"0",68]],[[1,"[",2],[2,"[",97],[97,"n",38],[38,",",99],[99,"e",40],[40,"q",101],[101,"u",42],[42,"a",103],[103,"l",44],[44,"s",105],[105,"4",46],[46,"1",13],[46,"2",107],[13,"]",14],[14,",",15],[15,""""
,16],[15,"[",50],[16,":",17],[17,"-",18],[18,"""",19],[19,",",20],[20,"[",21],[21,"[",22],[22,"[",23],[23,"n",24],[24,",",25],[25,"t",26],[26,"r",27],[27,"u",28],[28,"e",29],[29,"]",30],[30,"]",31],[31,"]",32],[32,"]",33],[33,"0",34],[34,"1",[-,"[[n,equals41],"":-"",[[[n,true]]]]01"]],[50,"[",51],[51,"v",52],[52,",",53],[53,"1",54],[54,"]",55],[55,"]",56],[56,",",57],[57,"""",58],[58,":",59],[59,"-",60],[60,"""",61],[61,",",62],[62,"[",63],[63,"[",64],[64,"[",65],[65,"n",66],[66,",",67],[67,"e",68],[68,"q",69],[69,"u",70],[70,"a",71],[71,"l",72],[72,"s",73],[73,"4",74],[74,"2",75],[75,"]",76],[76,",",77],[77,"[",78],[78,"[",79],[79,"v",80],[80,",",81],[81,"1",82],[82,"]",83],[83,",",84],[84,"[",85],[85,"v",86],[86,",",87],[87,"4",88],[88,"]",89],[89,"]",90],[90,"]",91],[91,"]",92],[92,"]",93],[93,"0",94],[94,"1",[-,"[[n,equals41],[[v,1]],"":-"",[[[n,equals42],[[v,1],[v,4]]]]]01"]],[107,"]",108],[108,",",109],[109,"""",110],[110,":",111],[111,"-",112],[112,"""",113],[113,",",114],[114,"[",115],[115,"[",116],[116,"[",117],[117,"n",118],[118,",",119],[119,"t",120],[120,"r",121],[121,"u",122],[122,"e",123],[123,"]",124],[124,"]",125],[125,"]",126],[126,"]",127],[127,"0",128],[128,"1",[-,"[[n,equals42],"":-"",[[[n,true]]]]01"]]],[[1,"[",2],[2,"[",70],[70,"n",44],[44,",",72],[72,"b",46],[72,"c",73],[72,"e",6],[6,"q",7],[7,"u",8],[8,"a",9],[9,"l",10],[10,"s",11],[11,"4",12],[12,"1",13],[13,"]",14],[14,",",15],[15,"""",16],[16,":",17],[17,"-",18],[18,"""",19],[19,",",20],[20,"[",21],[21,"[",22],[22,"[",23],[23,"n",24],[24,",",25],[25,"e",26],[26,"q",27],[27,"u",28],[28,"a",29],[29,"l",30],[30,"s",31],[31,"4",32],[32,"_",33],[33,"o",34],[34,"n",35],[35,"]",36],[36,"]",37],[37,"]",38],[38,"]",39],[39,"0",40],[40,"1",[-,"[[n,equals41],"":-"",[[[n,equals4_on]]]]01"]],[46,"]",47],[47,",",48],[48,"""",49],[49,":",50],[50,"-",51],[51,"""",52],[52,",",53],[53,"[",54],[54,"[",55],[55,"[",56],[56,"n",57],[57,",",58],[58,"t",59],[59,"r",60],[60,"u",61],[61,"e",62],[62,"]",63],[63,"]",
64],[64,"]",65],[65,"]",66],[66,"0",67],[67,"1",[-,"[[n,b],"":-"",[[[n,true]]]]01"]],[73,"]",74],[74,",",75],[75,"""",76],[76,":",77],[77,"-",78],[78,"""",79],[79,",",80],[80,"[",81],[81,"[",82],[82,"[",83],[83,"n",84],[84,",",85],[85,"t",86],[86,"r",87],[87,"u",88],[88,"e",89],[89,"]",90],[90,"]",91],[91,"]",92],[92,"]",93],[93,"0",94],[94,"1",[-,"[[n,c],"":-"",[[[n,true]]]]01"]]],[[1,"[",2],[2,"[",127],[127,"n",66],[66,",",129],[129,"e",68],[68,"q",131],[131,"u",70],[70,"a",133],[133,"l",72],[72,"s",135],[135,"4",74],[74,"1",137],[137,"]",76],[76,",",139],[139,"[",78],[78,"[",141],[141,"v",80],[80,",",143],[143,"1",82],[82,"]",145],[145,",",84],[84,"[",147],[147,"v",86],[86,",",149],[149,"2",88],[88,"]",151],[151,"]",90],[90,",",153],[153,"""",92],[92,":",155],[155,"-",94],[94,"""",157],[157,",",96],[96,"[",159],[159,"[",98],[98,"[",161],[161,"n",100],[100,",",163],[163,"h",102],[163,"t",164],[163,"w",40],[40,"r",41],[41,"a",42],[42,"p",43],[43,"]",44],[44,",",45],[45,"[",46],[46,"[",47],[47,"v",48],[48,",",49],[49,"1",50],[50,"]",51],[51,",",52],[52,"[",53],[53,"v",54],[54,",",55],[55,"2",56],[56,"]",57],[57,"]",58],[58,"]",59],[59,"]",60],[60,"]",61],[61,"0",62],[62,"1",[-,"[[n,equals41],[[v,1],[v,2]],"":-"",[[[n,wrap],[[v,1],[v,2]]]]]01"]],[102,"e",103],[103,"a",104],[104,"d",105],[105,"]",106],[106,",",107],[107,"[",108],[108,"[",109],[109,"v",110],[110,",",111],[111,"1",112],[112,"]",113],[113,",",114],[114,"[",115],[115,"v",116],[116,",",117],[117,"2",118],[118,"]",119],[119,"]",120],[120,"]",121],[121,"]",122],[122,"]",123],[123,"0",124],[124,"1",[-,"[[n,equals41],[[v,1],[v,2]],"":-"",[[[n,head],[[v,1],[v,2]]]]]01"]],[164,"a",165],[165,"i",166],[166,"l",167],[167,"]",168],[168,",",169],[169,"[",170],[170,"[",171],[171,"v",172],[172,",",173],[173,"1",174],[174,"]",175],[175,",",176],[176,"[",177],[177,"v",178],[178,",",179],[179,"2",180],[180,"]",181],[181,"]",182],[182,"]",183],[183,"]",184],[184,"]",185],[185,"0",186],[186,"1",[-,"[[n,equals41],[[v,1],[v,2
]],"":-"",[[[n,tail],[[v,1],[v,2]]]]]01"]]],[[1,"[",2],[2,"[",153],[153,"n",89],[89,",",155],[155,"e",91],[91,"q",157],[157,"u",93],[93,"a",159],[159,"l",95],[95,"s",161],[161,"4",97],[97,"1",163],[163,"]",99],[99,",",165],[165,"[",101],[101,"[",167],[167,"v",103],[103,",",169],[169,"1",105],[105,"]",171],[171,",",107],[107,"[",173],[173,"v",109],[109,",",175],[175,"2",111],[111,"]",177],[177,"]",113],[113,",",179],[179,"""",115],[115,":",181],[181,"-",117],[117,"""",183],[183,",",119],[119,"[",185],[185,"[",121],[121,"[",187],[187,"n",123],[123,",",189],[189,"m",125],[125,"e",191],[191,"m",127],[127,"b",193],[193,"e",129],[129,"r",195],[195,"2",131],[195,"3",196],[195,"]",46],[46,",",47],[47,"[",48],[48,"[",49],[49,"v",50],[50,",",51],[51,"1",52],[52,"]",53],[53,",",54],[54,"[",55],[55,"v",56],[56,",",57],[57,"2",58],[58,"]",59],[59,"]",60],[60,"]",61],[61,",",62],[62,"[",63],[63,"[",64],[64,"n",65],[65,",",66],[66,"m",67],[67,"e",68],[68,"m",69],[69,"b",70],[70,"e",71],[71,"r",72],[72,"]",73],[73,",",74],[74,"[",75],[75,"[",76],[76,"v",77],[77,",",78],[78,"1",79],[79,"]",80],[80,"]",81],[81,"]",82],[82,"]",83],[83,"]",84],[84,"0",85],[85,"1",[-,"[[n,equals41],[[v,1],[v,2]],"":-"",[[[n,member],[[v,1],[v,2]]],[[n,member],[[v,1]]]]]01"]],[131,"]",132],[132,",",133],[133,"[",134],[134,"[",135],[135,"v",136],[136,",",137],[137,"1",138],[138,"]",139],[139,",",140],[140,"[",141],[141,"v",142],[142,",",143],[143,"2",144],[144,"]",145],[145,"]",146],[146,"]",147],[147,"]",148],[148,"]",149],[149,"0",150],[150,"1",[-,"[[n,equals41],[[v,1],[v,2]],"":-"",[[[n,member2],[[v,1],[v,2]]]]]01"]],[196,"]",197],[197,",",198],[198,"[",199],[199,"[",200],[200,"v",201],[201,",",202],[202,"2",203],[203,"]",204],[204,",",205],[205,"[",206],[206,"v",207],[207,",",208],[208,"1",209],[209,"]",210],[210,"]",211],[211,"]",212],[212,"]",213],[213,"]",214],[214,"0",215],[215,"1",[-,"[[n,equals41],[[v,1],[v,2]],"":-"",[[[n,member3],[[v,2],[v,1]]]]]01"]]],[[1,"[",2],[2,"[",121],[121,"n",69],[69,",
",123],[123,"e",71],[71,"q",125],[125,"u",73],[73,"a",127],[127,"l",75],[75,"s",129],[129,"4",77],[77,"1",131],[131,"]",79],[79,",",133],[133,"[",81],[81,"[",135],[135,"v",83],[83,",",137],[137,"1",85],[85,"]",139],[139,",",22],[139,"]",87],[22,"[",23],[23,"v",24],[24,",",25],[25,"2",26],[26,"]",27],[27,"]",28],[28,",",29],[29,"""",30],[30,":",31],[31,"-",32],[32,"""",33],[33,",",34],[34,"[",35],[35,"[",36],[36,"[",37],[37,"n",38],[38,",",39],[39,"e",40],[40,"q",41],[41,"u",42],[42,"a",43],[43,"l",44],[44,"s",45],[45,"3",46],[46,"]",47],[47,",",48],[48,"[",49],[49,"[",50],[50,"v",51],[51,",",52],[52,"2",53],[53,"]",54],[54,",",55],[55,"[",56],[56,"v",57],[57,",",58],[58,"1",59],[59,"]",60],[60,"]",61],[61,"]",62],[62,"]",63],[63,"]",64],[64,"0",65],[65,"1",[-,"[[n,equals41],[[v,1],[v,2]],"":-"",[[[n,equals3],[[v,2],[v,1]]]]]01"]],[87,",",88],[88,"""",89],[89,":",90],[90,"-",91],[91,"""",92],[92,",",93],[93,"[",94],[94,"[",95],[95,"[",96],[96,"n",97],[97,",",98],[98,"e",99],[99,"q",100],[100,"u",101],[101,"a",102],[102,"l",103],[103,"s",104],[104,"2",105],[104,"3",158],[105,"]",106],[106,",",107],[107,"[",108],[108,"[",109],[109,"v",110],[110,",",111],[111,"1",112],[112,"]",113],[113,"]",114],[114,"]",115],[115,"]",116],[116,"]",117],[117,"0",118],[118,"1",[-,"[[n,equals41],[[v,1]],"":-"",[[[n,equals2],[[v,1]]]]]01"]],[158,"]",159],[159,",",160],[160,"[",161],[161,"[",162],[162,"v",163],[163,",",164],[164,"2",165],[165,"]",166],[166,",",167],[167,"[",168],[168,"v",169],[169,",",170],[170,"1",171],[171,"]",172],[172,"]",173],[173,"]",174],[174,"]",175],[175,"]",176],[176,"0",177],[177,"1",[-,"[[n,equals41],[[v,1]],"":-"",[[[n,equals3],[[v,2],[v,1]]]]]01"]]],[[1,"[",2],[2,"[",150],[150,"n",80],[80,",",152],[152,"e",82],[82,"q",154],[154,"u",84],[84,"a",156],[156,"l",86],[86,"s",158],[158,"4",88],[88,"1",160],[160,"]",90],[90,",",162],[162,"[",92],[92,"[",164],[164,"v",94],[94,",",166],[166,"1",96],[96,"]",168],[168,",",98],[98,"[",170],[170,"v",100],[100,",",172],[172,
"2",102],[102,"]",174],[174,",",28],[174,"]",104],[28,"[",29],[29,"v",30],[30,",",31],[31,"3",32],[32,"]",33],[33,"]",34],[34,",",35],[35,"""",36],[36,":",37],[37,"-",38],[38,"""",39],[39,",",40],[40,"[",41],[41,"[",42],[42,"[",43],[43,"n",44],[44,",",45],[45,"d",46],[46,"e",47],[47,"l",48],[48,"e",49],[49,"t",50],[50,"e",51],[51,"]",52],[52,",",53],[53,"[",54],[54,"[",55],[55,"v",56],[56,",",57],[57,"1",58],[58,"]",59],[59,",",60],[60,"[",61],[61,"v",62],[62,",",63],[63,"2",64],[64,"]",65],[65,",",66],[66,"[",67],[67,"v",68],[68,",",69],[69,"3",70],[70,"]",71],[71,"]",72],[72,"]",73],[73,"]",74],[74,"]",75],[75,"0",76],[76,"1",[-,"[[n,equals41],[[v,1],[v,2],[v,3]],"":-"",[[[n,delete],[[v,1],[v,2],[v,3]]]]]01"]],[104,",",105],[105,"""",106],[106,":",107],[107,"-",108],[108,"""",109],[109,",",110],[110,"[",111],[111,"[",112],[112,"[",113],[113,"n",114],[114,",",115],[115,"m",116],[115,"s",187],[116,"a",117],[117,"p",118],[118,"l",119],[119,"i",120],[120,"s",121],[121,"t",122],[122,"]",123],[123,",",124],[124,"[",125],[125,"[",126],[126,"v",127],[127,",",128],[128,"1",129],[129,"]",130],[130,",",131],[131,"[",132],[132,"v",133],[133,",",134],[134,"2",135],[135,"]",136],[136,",",137],[137,"[",138],[138,"v",139],[139,",",140],[140,"8",141],[141,"]",142],[142,"]",143],[143,"]",144],[144,"]",145],[145,"]",146],[146,"0",147],[147,"1",[-,"[[n,equals41],[[v,1],[v,2]],"":-"",[[[n,maplist],[[v,1],[v,2],[v,8]]]]]01"]],[187,"o",188],[188,"r",189],[189,"t",190],[190,"]",191],[191,",",192],[192,"[",193],[193,"[",194],[194,"v",195],[195,",",196],[196,"1",197],[197,"]",198],[198,",",199],[199,"[",200],[200,"v",201],[201,",",202],[202,"2",203],[203,"]",204],[204,"]",205],[205,"]",206],[206,"]",207],[207,"]",208],[208,"0",209],[209,"1",[-,"[[n,equals41],[[v,1],[v,2]],"":-"",[[[n,sort],[[v,1],[v,2]]]]]01"]]],[[1,"[",2],[2,"[",156],[156,"n",92],[92,",",158],[158,"e",94],[94,"q",160],[160,"u",96],[96,"a",162],[162,"l",98],[98,"s",164],[164,"4",100],[100,"1",166],[166,"]",102],[102,",",16
8],[168,"[",104],[104,"[",170],[170,"v",106],[106,",",172],[172,"1",108],[108,"]",174],[174,",",110],[110,"[",176],[176,"v",112],[112,",",178],[178,"2",114],[114,"]",180],[180,",",28],[180,"]",116],[28,"[",29],[29,"v",30],[30,",",31],[31,"3",32],[32,"]",33],[33,"]",34],[34,",",35],[35,"""",36],[36,":",37],[37,"-",38],[38,"""",39],[39,",",40],[40,"[",41],[41,"[",42],[42,"[",43],[43,"n",44],[44,",",45],[45,"i",46],[46,"n",47],[47,"t",48],[48,"e",49],[49,"r",50],[50,"s",51],[51,"e",52],[52,"c",53],[53,"t",54],[54,"i",55],[55,"o",56],[56,"n",57],[57,"]",58],[58,",",59],[59,"[",60],[60,"[",61],[61,"v",62],[62,",",63],[63,"1",64],[64,"]",65],[65,",",66],[66,"[",67],[67,"v",68],[68,",",69],[69,"2",70],[70,"]",71],[71,",",72],[72,"[",73],[73,"v",74],[74,",",75],[75,"3",76],[76,"]",77],[77,",",78],[78,"[",79],[79,"v",80],[80,",",81],[81,"8",82],[82,"]",83],[83,"]",84],[84,"]",85],[85,"]",86],[86,"]",87],[87,"0",88],[88,"1",[-,"[[n,equals41],[[v,1],[v,2],[v,3]],"":-"",[[[n,intersection],[[v,1],[v,2],[v,3],[v,8]]]]]01"]],[116,",",117],[117,"""",118],[118,":",119],[119,"-",120],[120,"""",121],[121,",",122],[122,"[",123],[123,"[",124],[124,"[",125],[125,"n",126],[126,",",127],[127,"m",128],[128,"a",129],[129,"p",130],[130,"l",131],[131,"i",132],[132,"s",133],[133,"t",134],[134,"]",135],[135,",",136],[136,"[",137],[137,"[",138],[138,"v",139],[139,",",140],[140,"1",141],[141,"]",142],[142,",",143],[143,"[",144],[144,"v",145],[145,",",146],[146,"2",147],[147,"]",148],[148,"]",149],[149,"]",150],[150,"]",151],[151,"]",152],[152,"0",153]],[[1,"[",2],[2,"[",133],[133,"n",57],[57,",",135],[135,"e",59],[59,"q",137],[137,"u",61],[61,"a",139],[139,"l",63],[63,"s",141],[141,"4",65],[65,"1",143],[65,"2",66],[66,"]",67],[67,",",68],[68,"[",69],[69,"[",70],[70,"v",71],[71,",",72],[72,"1",73],[73,"]",74],[74,",",75],[75,"[",76],[76,"v",77],[77,",",78],[78,"2",79],[79,"]",80],[80,",",81],[81,"[",82],[82,"v",83],[83,",",84],[84,"4",85],[85,"]",86],[86,"]",87],[87,",",88],[88,"""",89],[89,":",90]
,[90,"-",91],[91,"""",92],[92,",",93],[93,"[",94],[94,"[",95],[95,"[",96],[96,"n",97],[97,",",98],[98,"e",99],[99,"q",100],[100,"u",101],[101,"a",102],[102,"l",103],[103,"s",104],[104,"4",105],[105,"]",106],[106,",",107],[107,"[",108],[108,"[",109],[109,"v",110],[110,",",111],[111,"1",112],[112,"]",113],[113,",",114],[114,"[",115],[115,"v",116],[116,",",117],[117,"2",118],[118,"]",119],[119,",",120],[120,"[",121],[121,"v",122],[122,",",123],[123,"4",124],[124,"]",125],[125,"]",126],[126,"]",127],[127,"]",128],[128,"]",129],[129,"0",130],[130,"1",[-,"[[n,equals42],[[v,1],[v,2],[v,4]],"":-"",[[[n,equals4],[[v,1],[v,2],[v,4]]]]]01"]],[143,"]",14],[14,",",15],[15,"""",146],[15,"[",16],[16,"[",17],[17,"v",18],[18,",",19],[19,"1",20],[20,"]",21],[21,"]",22],[22,",",23],[23,"""",24],[24,":",25],[25,"-",26],[26,"""",27],[27,",",28],[28,"[",29],[29,"[",30],[30,"[",31],[31,"n",32],[32,",",33],[33,"e",34],[34,"q",35],[35,"u",36],[36,"a",37],[37,"l",38],[38,"s",39],[39,"4",40],[40,"]",41],[41,",",42],[42,"[",43],[43,"[",44],[44,"v",45],[45,",",46],[46,"1",47],[47,"]",48],[48,"]",49],[49,"]",50],[50,"]",51],[51,"]",52],[52,"0",53],[53,"1",[-,"[[n,equals41],[[v,1]],"":-"",[[[n,equals4],[[v,1]]]]]01"]],[146,":",147],[147,"-",148],[148,"""",149],[149,",",150],[150,"[",151],[151,"[",152],[152,"[",153],[153,"n",154],[154,",",155],[155,"t",156],[156,"r",157],[157,"u",158],[158,"e",159],[159,"]",160],[160,"]",161],[161,"]",162],[162,"]",163],[163,"0",164],[164,"1",[-,"[[n,equals41],"":-"",[[[n,true]]]]01"]]],[[1,"[",2],[2,"[",91],[91,"n",58],[58,",",93],[93,"e",60],[60,"q",95],[95,"u",62],[62,"a",97],[97,"l",64],[64,"s",99],[99,"4",66],[66,"1",101],[101,"]",68],[68,",",103],[103,"""",70],[103,"[",16],[16,"[",17],[17,"v",18],[18,",",19],[19,"1",20],[20,"]",21],[21,"]",22],[22,",",23],[23,"""",24],[24,":",25],[25,"-",26],[26,"""",27],[27,",",28],[28,"[",29],[29,"[",30],[30,"[",31],[31,"n",32],[32,",",33],[33,"e",34],[34,"q",35],[35,"u",36],[36,"a",37],[37,"l",38],[38,"s",39],[39,"4",40],
[40,"2",41],[41,"]",42],[42,",",43],[43,"[",44],[44,"[",45],[45,"v",46],[46,",",47],[47,"1",48],[48,"]",49],[49,"]",50],[50,"]",51],[51,"]",52],[52,"]",53],[53,"0",54],[54,"1",[-,"[[n,equals41],[[v,1]],"":-"",[[[n,equals42],[[v,1]]]]]01"]],[70,":",71],[71,"-",72],[72,"""",73],[73,",",74],[74,"[",75],[75,"[",76],[76,"[",77],[77,"n",78],[78,",",79],[79,"e",114],[79,"t",80],[80,"r",81],[81,"u",82],[82,"e",83],[83,"]",84],[84,"]",85],[85,"]",86],[86,"]",87],[87,"0",88],[88,"1",[-,"[[n,equals41],"":-"",[[[n,true]]]]01"]],[114,"q",115],[115,"u",116],[116,"a",117],[117,"l",118],[118,"s",119],[119,"4",120],[120,"_",121],[121,"o",122],[122,"n",123],[123,"]",124],[124,"]",125],[125,"]",126],[126,"]",127],[127,"0",128],[128,"1",[-,"[[n,equals41],"":-"",[[[n,equals4_on]]]]01"]]],[[1,"[",2],[2,"[",146],[146,"n",75],[75,",",148],[148,"e",149],[148,"f",77],[77,"u",7],[7,"n",8],[8,"c",9],[9,"t",10],[10,"i",11],[11,"o",12],[12,"n",13],[13,"2",85],[13,"]",14],[14,",",15],[15,"[",16],[16,"[",17],[17,"v",18],[18,",",19],[19,"1",20],[20,"]",21],[21,",",22],[22,"[",23],[23,"v",24],[24,",",25],[25,"2",26],[26,"]",27],[27,",",28],[28,"[",29],[29,"v",30],[30,",",31],[31,"3",32],[32,"]",33],[33,"]",34],[34,",",35],[35,"""",36],[36,":",37],[37,"-",38],[38,"""",39],[39,",",40],[40,"[",41],[41,"[",42],[42,"[",43],[43,"n",44],[44,",",45],[45,"+",46],[46,"]",47],[47,",",48],[48,"[",49],[49,"[",50],[50,"v",51],[51,",",52],[52,"1",53],[53,"]",54],[54,",",55],[55,"[",56],[56,"v",57],[57,",",58],[58,"2",59],[59,"]",60],[60,",",61],[61,"[",62],[62,"v",63],[63,",",64],[64,"3",65],[65,"]",66],[66,"]",67],[67,"]",68],[68,"]",69],[69,"]",70],[70,"0",71],[71,"1",[-,"[[n,function],[[v,1],[v,2],[v,3]],"":-"",[[[n,+],[[v,1],[v,2],[v,3]]]]]01"]],[85,"]",86],[86,",",87],[87,"[",88],[88,"[",89],[89,"v",90],[90,",",91],[91,"1",92],[92,"]",93],[93,",",94],[94,"[",95],[95,"v",96],[96,",",97],[97,"2",98],[98,"]",99],[99,",",100],[100,"[",101],[101,"v",102],[102,",",103],[103,"3",104],[104,"]",105],[105,"]",106],[106
,",",107],[107,"""",108],[108,":",109],[109,"-",110],[110,"""",111],[111,",",112],[112,"[",113],[113,"[",114],[114,"[",115],[115,"n",116],[116,",",117],[117,"+",118],[118,"]",119],[119,",",120],[120,"[",121],[121,"[",122],[122,"v",123],[123,",",124],[124,"1",125],[125,"]",126],[126,",",127],[127,"[",128],[128,"v",129],[129,",",130],[130,"2",131],[131,"]",132],[132,",",133],[133,"[",134],[134,"v",135],[135,",",136],[136,"3",137],[137,"]",138],[138,"]",139],[139,"]",140],[140,"]",141],[141,"]",142],[142,"0",143],[143,"1",[-,"[[n,function2],[[v,1],[v,2],[v,3]],"":-"",[[[n,+],[[v,1],[v,2],[v,3]]]]]01"]],[149,"q",150],[150,"u",151],[151,"a",152],[152,"l",153],[153,"s",154],[154,"4",155],[155,"1",156],[156,"]",157],[157,",",158],[158,"[",159],[159,"[",160],[160,"v",161],[161,",",162],[162,"1",163],[163,"]",164],[164,",",165],[165,"[",166],[166,"v",167],[167,",",168],[168,"2",169],[169,"]",170],[170,",",171],[171,"[",172],[172,"v",173],[173,",",174],[174,"3",175],[175,"]",176],[176,"]",177],[177,",",178],[178,"""",179],[179,":",180],[180,"-",181],[181,"""",182],[182,",",183],[183,"[",184],[184,"[",185],[185,"[",186],[186,"n",187],[187,",",188],[188,"e",189],[189,"q",190],[190,"u",191],[191,"a",192],[192,"l",193],[193,"s",194],[194,"4",195],[195,"2",196],[196,"]",197],[197,",",198],[198,"[",199],[199,"[",200],[200,"v",201],[201,",",202],[202,"1",203],[203,"]",204],[204,",",205],[205,"[",206],[206,"v",207],[207,",",208],[208,"2",209],[209,"]",210],[210,",",211],[211,"[",212],[212,"v",213],[213,",",214],[214,"3",215],[215,"]",216],[216,"]",217],[217,"]",218],[218,"]",219],[219,"]",220],[220,"0",221],[221,"1",[-,"[[n,equals41],[[v,1],[v,2],[v,3]],"":-"",[[[n,equals42],[[v,1],[v,2],[v,3]]]]]01"]]],[[1,"[",2],[2,"[",159],[159,"n",81],[81,",",161],[161,"e",6],[161,"f",83],[6,"q",7],[7,"u",8],[8,"a",9],[9,"l",10],[10,"s",11],[11,"4",12],[12,"2",13],[13,"]",14],[14,",",15],[15,"[",16],[16,"[",17],[17,"v",18],[18,",",19],[19,"1",20],[20,"]",21],[21,",",22],[22,"[",23],[23,"v",24],[2
4,",",25],[25,"2",26],[26,"]",27],[27,",",28],[28,"[",29],[29,"v",30],[30,",",31],[31,"4",32],[32,"]",33],[33,"]",34],[34,",",35],[35,"""",36],[36,":",37],[37,"-",38],[38,"""",39],[39,",",40],[40,"[",41],[41,"[",42],[42,"[",43],[43,"n",44],[44,",",45],[45,"e",46],[46,"q",47],[47,"u",48],[48,"a",49],[49,"l",50],[50,"s",51],[51,"4",52],[52,"]",53],[53,",",54],[54,"[",55],[55,"[",56],[56,"v",57],[57,",",58],[58,"1",59],[59,"]",60],[60,",",61],[61,"[",62],[62,"v",63],[63,",",64],[64,"2",65],[65,"]",66],[66,",",67],[67,"[",68],[68,"v",69],[69,",",70],[70,"4",71],[71,"]",72],[72,"]",73],[73,"]",74],[74,"]",75],[75,"]",76],[76,"0",77],[77,"1",[-,"[[n,equals42],[[v,1],[v,2],[v,4]],"":-"",[[[n,equals4],[[v,1],[v,2],[v,4]]]]]01"]],[83,"u",84],[84,"n",85],[85,"c",86],[86,"t",87],[87,"i",88],[88,"o",89],[89,"n",90],[90,"1",91],[90,"]",170],[91,"]",92],[92,",",93],[93,"[",94],[94,"[",95],[95,"v",96],[96,",",97],[97,"1",98],[98,"]",99],[99,",",100],[100,"[",101],[101,"v",102],[102,",",103],[103,"1",104],[104,"]",105],[105,",",106],[106,"[",107],[107,"v",108],[108,",",109],[109,"3",110],[110,"]",111],[111,",",112],[112,"[",113],[113,"v",114],[114,",",115],[115,"3",116],[116,"]",117],[117,"]",118],[118,",",119],[119,"""",120],[120,":",121],[121,"-",122],[122,"""",123],[123,",",124],[124,"[",125],[125,"[",126],[126,"[",127],[127,"n",128],[128,",",129],[129,"f",130],[130,"u",131],[131,"n",132],[132,"c",133],[133,"t",134],[134,"i",135],[135,"o",136],[136,"n",137],[137,"]",138],[138,",",139],[139,"[",140],[140,"[",141],[141,"v",142],[142,",",143],[143,"1",144],[144,"]",145],[145,",",146],[146,"[",147],[147,"v",148],[148,",",149],[149,"3",150],[150,"]",151],[151,"]",152],[152,"]",153],[153,"]",154],[154,"]",155],[155,"0",156],[156,"1",[-,"[[n,function1],[[v,1],[v,1],[v,3],[v,3]],"":-"",[[[n,function],[[v,1],[v,3]]]]]01"]],[170,",",171],[171,"[",172],[172,"[",173],[173,"v",174],[174,",",175],[175,"1",176],[176,"]",177],[177,",",178],[178,"[",179],[179,"v",180],[180,",",181],[181,"1",182]
,[182,"]",183],[183,",",184],[184,"[",185],[185,"v",186],[186,",",187],[187,"3",188],[188,"]",189],[189,",",190],[190,"[",191],[191,"v",192],[192,",",193],[193,"3",194],[194,"]",195],[195,"]",196],[196,",",197],[197,"""",198],[198,":",199],[199,"-",200],[200,"""",201],[201,",",202],[202,"[",203],[203,"[",204],[204,"[",205],[205,"n",206],[206,",",207],[207,"t",208],[208,"r",209],[209,"u",210],[210,"e",211],[211,"]",212],[212,"]",213],[213,"]",214],[214,"]",215],[215,"0",216],[216,"1",[-,"[[n,function],[[v,1],[v,1],[v,3],[v,3]],"":-"",[[[n,true]]]]01"]]],[[1,"[",2],[2,"[",142],[142,"n",83],[83,",",144],[144,"f",85],[85,"u",146],[146,"n",87],[87,"c",148],[148,"t",89],[89,"i",150],[150,"o",91],[91,"n",152],[152,"1",14],[152,"]",93],[14,"]",15],[15,",",16],[16,"""",156],[16,"[",17],[17,"[",18],[18,"v",19],[19,",",20],[20,"1",21],[21,"]",22],[22,",",23],[23,"[",24],[24,"v",25],[25,",",26],[26,"2",27],[27,"]",28],[28,",",29],[29,"[",30],[30,"v",31],[31,",",32],[32,"2",33],[33,"]",34],[34,",",35],[35,"[",36],[36,"v",37],[37,",",38],[38,"1",39],[39,"]",40],[40,"]",41],[41,",",42],[42,"""",43],[43,":",44],[44,"-",45],[45,"""",46],[46,",",47],[47,"[",48],[48,"[",49],[49,"[",50],[50,"n",51],[51,",",52],[52,"f",53],[53,"u",54],[54,"n",55],[55,"c",56],[56,"t",57],[57,"i",58],[58,"o",59],[59,"n",60],[60,"]",61],[61,",",62],[62,"[",63],[63,"[",64],[64,"v",65],[65,",",66],[66,"1",67],[67,"]",68],[68,",",69],[69,"[",70],[70,"v",71],[71,",",72],[72,"2",73],[73,"]",74],[74,"]",75],[75,"]",76],[76,"]",77],[77,"]",78],[78,"0",79],[79,"1",[-,"[[n,function1],[[v,1],[v,2],[v,2],[v,1]],"":-"",[[[n,function],[[v,1],[v,2]]]]]01"]],[93,",",94],[94,"[",95],[95,"[",96],[96,"v",97],[97,",",98],[98,"1",99],[99,"]",100],[100,",",101],[101,"[",102],[102,"v",103],[103,",",104],[104,"1",105],[105,"]",106],[106,",",107],[107,"[",108],[108,"v",109],[109,",",110],[110,"3",111],[111,"]",112],[112,",",113],[113,"[",114],[114,"v",115],[115,",",116],[116,"3",117],[117,"]",118],[118,"]",119],[119,",",120],[120
,"""",121],[121,":",122],[122,"-",123],[123,"""",124],[124,",",125],[125,"[",126],[126,"[",127],[127,"[",128],[128,"n",129],[129,",",130],[130,"t",131],[131,"r",132],[132,"u",133],[133,"e",134],[134,"]",135],[135,"]",136],[136,"]",137],[137,"]",138],[138,"0",139],[139,"1",[-,"[[n,function],[[v,1],[v,1],[v,3],[v,3]],"":-"",[[[n,true]]]]01"]],[156,":",157],[157,"-",158],[158,"""",159],[159,",",160],[160,"[",161],[161,"[",162],[162,"[",163],[163,"n",164],[164,",",165],[165,"t",166],[166,"r",167],[167,"u",168],[168,"e",169],[169,"]",170],[170,"]",171],[171,"]",172],[172,"]",173],[173,"0",174],[174,"1",[-,"[[n,function1],"":-"",[[[n,true]]]]01"]]],[[1,"[",2],[2,"[",118],[118,"n",76],[76,",",120],[120,"f",78],[78,"u",122],[122,"n",80],[80,"c",124],[124,"t",82],[82,"i",126],[126,"o",84],[84,"n",128],[128,"1",86],[86,"]",130],[130,",",88],[88,"""",132],[88,"[",17],[17,"[",18],[18,"v",19],[19,",",20],[20,"1",21],[20,"2",93],[21,"]",22],[22,",",23],[23,"[",24],[24,"v",25],[25,",",26],[26,"2",27],[27,"]",28],[28,"]",29],[29,",",30],[30,"""",31],[31,":",32],[32,"-",33],[33,"""",34],[34,",",35],[35,"[",36],[36,"[",37],[37,"[",38],[38,"n",39],[39,",",40],[40,"e",41],[41,"q",42],[42,"u",43],[43,"a",44],[44,"l",45],[45,"s",46],[46,"4",47],[47,"]",48],[48,",",49],[49,"[",50],[50,"[",51],[51,"v",52],[52,",",53],[53,"1",54],[54,"]",55],[55,",",56],[56,"[",57],[57,"v",58],[58,",",59],[59,"1",60],[60,"]",61],[61,",",62],[62,"[",63],[63,"v",64],[64,",",65],[65,"2",66],[66,"]",67],[67,"]",68],[68,"]",69],[69,"]",70],[70,"]",71],[71,"0",72],[72,"1",[-,"[[n,function1],[[v,1],[v,2]],"":-"",[[[n,equals4],[[v,1],[v,1],[v,2]]]]]01"]],[93,"]",94],[94,"]",95],[95,",",96],[96,"""",97],[97,":",98],[98,"-",99],[99,"""",100],[100,",",101],[101,"[",102],[102,"[",103],[103,"[",104],[104,"n",105],[105,",",106],[106,"t",107],[107,"r",108],[108,"u",109],[109,"e",110],[110,"]",111],[111,"]",112],[112,"]",113],[113,"]",114],[114,"0",115],[115,"1",[-,"[[n,function1],[[v,2]],"":-"",[[[n,true]]]]01"]],[132,":"
,133],[133,"-",134],[134,"""",135],[135,",",136],[136,"[",137],[137,"[",138],[138,"[",139],[139,"n",140],[140,",",141],[141,"t",142],[142,"r",143],[143,"u",144],[144,"e",145],[145,"]",146],[146,"]",147],[147,"]",148],[148,"]",149],[149,"0",150],[150,"1",[-,"[[n,function1],"":-"",[[[n,true]]]]01"]]],[[1,"[",2],[2,"[",137],[137,"n",65],[65,",",139],[139,"f",67],[67,"u",141],[141,"n",69],[69,"c",143],[143,"t",71],[71,"i",145],[145,"o",73],[73,"n",147],[147,"1",75],[75,"]",149],[149,",",77],[77,"[",151],[151,"[",79],[79,"v",153],[153,",",81],[81,"1",155],[155,"]",83],[83,",",157],[157,"[",85],[85,"v",159],[159,",",87],[87,"1",161],[161,"]",89],[89,",",163],[163,"[",91],[91,"v",165],[165,",",93],[93,"4",33],[93,"5",167],[33,"]",34],[34,",",35],[35,"[",36],[36,"v",37],[37,",",38],[38,"4",39],[39,"]",40],[40,",",102],[40,"]",41],[41,",",42],[42,"""",43],[43,":",44],[44,"-",45],[45,"""",46],[46,",",47],[47,"[",48],[48,"[",49],[49,"[",50],[50,"n",51],[51,",",52],[52,"t",53],[53,"r",54],[54,"u",55],[55,"e",56],[56,"]",57],[57,"]",58],[58,"]",59],[59,"]",60],[60,"0",61],[61,"1",[-,"[[n,function1],[[v,1],[v,1],[v,4],[v,4]],"":-"",[[[n,true]]]]01"]],[102,"[",103],[103,"v",104],[104,",",105],[105,"6",106],[106,"]",107],[107,",",108],[108,"[",109],[109,"v",110],[110,",",111],[111,"7",112],[112,"]",113],[113,"]",114],[114,",",115],[115,"""",116],[116,":",117],[117,"-",118],[118,"""",119],[119,",",120],[120,"[",121],[121,"[",122],[122,"[",123],[123,"n",124],[124,",",125],[125,"t",126],[126,"r",127],[127,"u",128],[128,"e",129],[129,"]",130],[130,"]",131],[131,"]",132],[132,"]",133],[133,"0",134],[134,"1",[-,"[[n,function1],[[v,1],[v,1],[v,4],[v,4],[v,6],[v,7]],"":-"",[[[n,true]]]]01"]],[167,"]",168],[168,",",169],[169,"[",170],[170,"v",171],[171,",",172],[172,"5",173],[173,"]",174],[174,",",175],[175,"[",176],[176,"v",177],[177,",",178],[178,"7",179],[179,"]",180],[180,",",181],[181,"[",182],[182,"v",183],[183,",",184],[184,"8",185],[185,"]",186],[186,"]",187],[187,",",188],[188,""""
,189],[189,":",190],[190,"-",191],[191,"""",192],[192,",",193],[193,"[",194],[194,"[",195],[195,"[",196],[196,"n",197],[197,",",198],[198,"t",199],[199,"r",200],[200,"u",201],[201,"e",202],[202,"]",203],[203,"]",204],[204,"]",205],[205,"]",206],[206,"0",207],[207,"1",[-,"[[n,function1],[[v,1],[v,1],[v,5],[v,5],[v,7],[v,8]],"":-"",[[[n,true]]]]01"]]],[[1,"[",2],[2,"[",63],[63,"n",34],[34,",",65],[65,"a",36],[36,"d",67],[67,"d",38],[38,"0",9],[38,"2",69],[9,"]",10],[10,",",11],[11,"""",12],[12,":",13],[13,"-",14],[14,"""",15],[15,",",16],[16,"[",17],[17,"[",18],[18,"[",19],[19,"n",20],[20,",",21],[21,"f",22],[21,"t",52],[22,"a",23],[23,"i",24],[24,"l",25],[25,"]",26],[26,"]",27],[27,"]",28],[28,"]",29],[29,"0",30],[30,"1",[-,"[[n,add0],"":-"",[[[n,fail]]]]01"]],[52,"r",53],[53,"u",54],[54,"e",55],[55,"]",56],[56,"]",57],[57,"]",58],[58,"]",59],[59,"0",60],[60,"1",[-,"[[n,add0],"":-"",[[[n,true]]]]01"]],[69,"]",70],[70,",",71],[71,"[",72],[72,"[",73],[73,"v",74],[74,",",75],[75,"1",76],[76,"]",77],[77,",",78],[78,"[",79],[79,"v",80],[80,",",81],[81,"2",82],[82,"]",83],[83,"]",84],[84,",",85],[85,"""",86],[86,":",87],[87,"-",88],[88,"""",89],[89,",",90],[90,"[",91],[91,"[",92],[92,"[",93],[93,"n",94],[94,",",95],[95,"=",96],[96,"]",97],[97,",",98],[98,"[",99],[99,"[",100],[100,"v",101],[101,",",102],[102,"1",103],[103,"]",104],[104,"]",105],[105,"]",106],[106,",",107],[107,"[",108],[108,"[",109],[109,"n",110],[110,",",111],[111,"=",112],[112,"]",113],[113,",",114],[114,"[",115],[115,"[",116],[116,"v",117],[117,",",118],[118,"2",119],[119,"]",120],[120,"]",121],[121,"]",122],[122,"]",123],[123,"]",124],[124,"0",125],[125,"1",[-,"[[n,add2],[[v,1],[v,2]],"":-"",[[[n,=],[[v,1]]],[[n,=],[[v,2]]]]]01"]]],[[1,"[",2],[2,"[",138],[138,"n",62],[62,",",140],[140,"1",64],[140,"a",6],[6,"d",7],[7,"d",8],[8,"3",9],[9,"]",10],[10,",",11],[11,"[",12],[12,"[",13],[13,"v",14],[14,",",15],[15,"1",16],[16,"]",17],[17,",",18],[18,"[",19],[19,"v",20],[20,",",21],[21,"2",22],[22,"]",23],[23,"
]",24],[24,",",25],[25,"""",26],[26,":",27],[27,"-",28],[28,"""",29],[29,",",30],[30,"[",31],[31,"[",32],[32,"[",33],[33,"n",34],[34,",",35],[35,"t",36],[36,"a",37],[37,"i",38],[38,"l",39],[39,"]",40],[40,",",41],[41,"[",42],[42,"[",43],[43,"v",44],[44,",",45],[45,"1",46],[46,"]",47],[47,",",48],[48,"[",49],[49,"v",50],[50,",",51],[51,"2",52],[52,"]",53],[53,"]",54],[54,"]",55],[55,"]",56],[56,"]",57],[57,"0",58],[58,"1",[-,"[[n,add3],[[v,1],[v,2]],"":-"",[[[n,tail],[[v,1],[v,2]]]]]01"]],[64,"]",65],[65,",",66],[66,"[",67],[67,"[",68],[68,"v",69],[69,",",70],[70,"1",71],[71,"]",72],[72,",",73],[73,"[",74],[74,"v",75],[75,",",76],[76,"2",77],[77,"]",78],[78,"]",79],[79,",",80],[80,"""",81],[81,":",82],[82,"-",83],[83,"""",84],[84,",",85],[85,"[",86],[86,"[",87],[87,"[",88],[88,"n",89],[89,",",90],[90,"a",91],[91,"d",92],[92,"d",93],[93,"2",94],[93,"3",171],[94,"]",95],[95,",",96],[96,"[",97],[97,"[",98],[98,"v",99],[99,",",100],[100,"1",101],[101,"]",102],[102,",",103],[103,"[",104],[104,"v",105],[105,",",106],[106,"4",107],[107,"]",108],[108,"]",109],[109,"]",110],[110,",",111],[111,"[",112],[112,"[",113],[113,"n",114],[114,",",115],[115,"=",116],[116,"]",117],[117,",",118],[118,"[",119],[119,"[",120],[120,"v",121],[121,",",122],[122,"4",123],[123,"]",124],[124,",",125],[125,"[",126],[126,"v",127],[127,",",128],[128,"2",129],[129,"]",130],[130,"]",131],[131,"]",132],[132,"]",133],[133,"]",134],[134,"0",135],[135,"1",[-,"[[n,1],[[v,1],[v,2]],"":-"",[[[n,add2],[[v,1],[v,4]]],[[n,=],[[v,4],[v,2]]]]]01"]],[171,"]",172],[172,",",173],[173,"[",174],[174,"[",175],[175,"v",176],[176,",",177],[177,"1",178],[178,"]",179],[179,",",180],[180,"[",181],[181,"v",182],[182,",",183],[183,"4",184],[184,"]",185],[185,"]",186],[186,"]",187],[187,",",188],[188,"[",189],[189,"[",190],[190,"n",191],[191,",",192],[192,"1",193],[193,"]",194],[194,",",195],[195,"[",196],[196,"[",197],[197,"v",198],[198,",",199],[199,"4",200],[200,"]",201],[201,",",202],[202,"[",203],[203,"v",204],[204,",",20
5],[205,"6",206],[206,"]",207],[207,"]",208],[208,"]",209],[209,",",210],[210,"[",211],[211,"[",212],[212,"n",213],[213,",",214],[214,"=",215],[215,"]",216],[216,",",217],[217,"[",218],[218,"[",219],[219,"v",220],[220,",",221],[221,"6",222],[222,"]",223],[223,",",224],[224,"[",225],[225,"v",226],[226,",",227],[227,"2",228],[228,"]",229],[229,"]",230],[230,"]",231],[231,"]",232],[232,"]",233],[233,"0",234],[234,"1",[-,"[[n,1],[[v,1],[v,2]],"":-"",[[[n,add3],[[v,1],[v,4]]],[[n,1],[[v,4],[v,6]]],[[n,=],[[v,6],[v,2]]]]]01"]]],[[1,"[",2],[2,"[",140],[140,"n",93],[93,",",142],[142,"m",6],[142,"n",95],[6,"e",7],[7,"m",8],[8,"b",9],[9,"e",10],[10,"r",11],[11,"_",12],[12,"t",13],[13,"r",14],[14,"y",15],[15,"]",16],[16,",",17],[17,"[",18],[18,"[",19],[19,"v",20],[20,",",21],[21,"1",22],[22,"]",23],[23,",",24],[24,"[",25],[25,"v",26],[26,",",27],[27,"2",28],[28,"]",29],[29,"]",30],[30,",",31],[31,"""",32],[32,":",33],[33,"-",34],[34,"""",35],[35,",",36],[36,"[",37],[37,"[",38],[38,"[",39],[39,"n",40],[40,",",41],[41,"m",42],[42,"e",43],[43,"m",44],[44,"b",45],[45,"e",46],[46,"r",47],[47,"2",48],[48,"]",49],[49,",",50],[50,"[",51],[51,"[",52],[52,"v",53],[53,",",54],[54,"1",55],[55,"]",56],[56,",",57],[57,"[",58],[58,"v",59],[59,",",60],[60,"2",61],[61,"]",62],[62,"]",63],[63,"]",64],[64,",",65],[65,"[",66],[66,"[",67],[67,"n",68],[68,",",69],[69,"e",70],[70,"q",71],[71,"u",72],[72,"a",73],[73,"l",74],[74,"s",75],[75,"4",76],[76,"]",77],[77,",",78],[78,"[",79],[79,"[",80],[80,"v",81],[81,",",82],[82,"2",83],[83,"]",84],[84,"]",85],[85,"]",86],[86,"]",87],[87,"]",88],[88,"0",89],[89,"1",[-,"[[n,member_try],[[v,1],[v,2]],"":-"",[[[n,member2],[[v,1],[v,2]]],[[n,equals4],[[v,2]]]]]01"]],[95,"o",96],[96,"t",97],[97,"1",98],[98,"]",99],[99,",",100],[100,"""",101],[101,":",102],[102,"-",103],[103,"""",104],[104,",",105],[105,"[",106],[106,"[",107],[107,"[",108],[108,"n",109],[109,",",110],[110,"n",111],[111,"o",112],[112,"t",113],[113,"]",114],[114,",",115],[115,"[",116],[116,"[",117]
,[117,"[",118],[118,"[",167],[118,"n",119],[119,",",120],[120,"e",121],[121,"q",122],[122,"u",123],[123,"a",124],[124,"l",125],[125,"s",126],[126,"4",127],[127,"]",128],[128,",",129],[129,"[",130],[130,"]",131],[131,"]",132],[132,"]",133],[133,"]",134],[134,"]",135],[135,"]",136],[136,"0",137],[137,"1",[-,"[[n,not1],"":-"",[[[n,not],[[[n,equals4],[]]]]]]01"]],[167,"n",168],[168,",",169],[169,"e",170],[170,"q",171],[171,"u",172],[172,"a",173],[173,"l",174],[174,"s",175],[175,"4",176],[176,"]",177],[177,",",178],[178,"[",179],[179,"]",180],[180,"]",181],[181,",",182],[182,"[",183],[183,"[",184],[184,"n",185],[185,",",186],[186,"e",187],[187,"q",188],[188,"u",189],[189,"a",190],[190,"l",191],[191,"s",192],[192,"4",193],[193,"]",194],[194,",",195],[195,"[",196],[196,"]",197],[197,"]",198],[198,"]",199],[199,"]",200],[200,"]",201],[201,",",202],[202,"[",203],[203,"[",204],[204,"n",205],[205,",",206],[206,"t",207],[207,"r",208],[208,"u",209],[209,"e",210],[210,"]",211],[211,"]",212],[212,"]",213],[213,"]",214],[214,"0",215],[215,"1",[-,"[[n,not1],"":-"",[[[n,not],[[[[n,equals4],[]],[[n,equals4],[]]]]],[[n,true]]]]01"]]],[[1,"[",2],[2,"[",142],[142,"n",82],[82,",",144],[144,"b",84],[144,"n",6],[6,"o",7],[7,"t",8],[8,"1",9],[9,"]",10],[10,",",11],[11,"""",12],[12,":",13],[13,"-",14],[14,"""",15],[15,",",16],[16,"[",17],[17,"[",18],[18,"[",19],[19,"n",20],[20,",",21],[21,"n",22],[22,"o",23],[23,"t",24],[24,"]",25],[25,",",26],[26,"[",27],[27,"[",28],[28,"[",29],[29,"[",30],[30,"n",31],[31,",",32],[32,"e",33],[33,"q",34],[34,"u",35],[35,"a",36],[36,"l",37],[37,"s",38],[38,"4",39],[39,"]",40],[40,",",41],[41,"[",42],[42,"]",43],[43,"]",44],[44,",",45],[45,"[",46],[46,"[",47],[47,"n",48],[48,",",49],[49,"e",50],[50,"q",51],[51,"u",52],[52,"a",53],[53,"l",54],[54,"s",55],[55,"4",56],[56,"]",57],[57,",",58],[58,"[",59],[59,"]",60],[60,"]",61],[61,"]",62],[62,"]",63],[63,"]",64],[64,",",65],[65,"[",66],[66,"[",67],[67,"n",68],[68,",",69],[69,"t",70],[70,"r",71],[71,"u",72],[72,"e"
,73],[73,"]",74],[74,"]",75],[75,"]",76],[76,"]",77],[77,"0",78],[78,"1",[-,"[[n,not1],"":-"",[[[n,not],[[[[n,equals4],[]],[[n,equals4],[]]]]],[[n,true]]]]01"]],[84,"r",85],[85,"a",86],[86,"c",87],[87,"k",88],[88,"e",89],[89,"t",90],[90,"s",91],[91,"1",92],[92,"]",93],[93,",",94],[94,"""",95],[95,":",96],[96,"-",97],[97,"""",98],[98,",",99],[99,"[",100],[100,"[",101],[101,"[",102],[102,"[",103],[103,"n",104],[104,",",105],[105,"t",106],[105,"w",167],[106,"r",107],[107,"u",108],[108,"e",109],[109,"]",110],[110,"]",111],[111,",",112],[112,"[",113],[113,"[",114],[114,"[",115],[115,"n",116],[116,",",117],[117,"t",118],[118,"r",119],[119,"u",120],[120,"e",121],[121,"]",122],[122,"]",123],[123,",",124],[124,"[",125],[125,"[",126],[126,"n",127],[127,",",128],[128,"t",129],[129,"r",130],[130,"u",131],[131,"e",132],[132,"]",133],[133,"]",134],[134,"]",135],[135,"]",136],[136,"]",137],[137,"]",138],[138,"0",139],[139,"1",[-,"[[n,brackets1],"":-"",[[[[n,true]],[[[n,true]],[[n,true]]]]]]01"]],[167,"r",168],[168,"i",169],[169,"t",170],[170,"e",171],[171,"l",172],[172,"n",173],[173,"]",174],[174,",",175],[175,"[",176],[176,"]",177],[177,"]",178],[178,"]",179],[179,",",180],[180,"[",181],[181,"[",182],[182,"[",183],[183,"n",184],[184,",",185],[185,"w",186],[186,"r",187],[187,"i",188],[188,"t",189],[189,"e",190],[190,"l",191],[191,"n",192],[192,"]",193],[193,",",194],[194,"[",195],[195,"]",196],[196,"]",197],[197,",",198],[198,"[",199],[199,"[",200],[200,"n",201],[201,",",202],[202,"w",203],[203,"r",204],[204,"i",205],[205,"t",206],[206,"e",207],[207,"l",208],[208,"n",209],[209,"]",210],[210,",",211],[211,"[",212],[212,"]",213],[213,"]",214],[214,"]",215],[215,"]",216],[216,"]",217],[217,"0",218],[218,"1",[-,"[[n,brackets1],"":-"",[[[[n,writeln],[]]],[[[n,writeln],[]],[[n,writeln],[]]]]]01"]]],[[1,"[",2],[2,"[",88],[88,"n",54],[54,",",90],[90,"b",56],[90,"n",91],[56,"r",7],[7,"a",8],[8,"c",9],[9,"k",10],[10,"e",11],[11,"t",12],[12,"s",13],[13,"1",14],[14,"]",15],[15,",",16],[16,"""
",17],[17,":",18],[18,"-",19],[19,"""",20],[20,",",21],[21,"[",22],[22,"[",23],[23,"[",24],[24,"[",25],[24,"n",75],[25,"n",26],[26,",",27],[27,"t",28],[28,"r",29],[29,"u",30],[30,"e",31],[31,"]",32],[32,"]",33],[33,",",34],[34,"[",35],[35,"[",36],[36,"[",37],[37,"n",38],[38,",",39],[39,"t",40],[40,"r",41],[41,"u",42],[42,"e",43],[43,"]",44],[44,"]",45],[45,"]",46],[46,"]",47],[47,"]",48],[48,"]",49],[49,"0",50],[50,"1",[-,"[[n,brackets1],"":-"",[[[[n,true]],[[[n,true]]]]]]01"]],[75,",",76],[76,"t",77],[77,"r",78],[78,"u",79],[79,"e",80],[80,"]",81],[81,"]",82],[82,"]",83],[83,"]",84],[84,"0",85],[85,"1",[-,"[[n,brackets1],"":-"",[[[n,true]]]]01"]],[91,"o",92],[92,"t",93],[93,"1",94],[94,"]",95],[95,",",96],[96,"""",97],[97,":",98],[98,"-",99],[99,"""",100],[100,",",101],[101,"[",102],[102,"[",103],[103,"[",104],[104,"n",105],[105,",",106],[106,"n",107],[107,"o",108],[108,"t",109],[109,"]",110],[110,",",111],[111,"[",112],[112,"[",113],[113,"[",114],[114,"n",115],[115,",",116],[116,"m",117],[117,"e",118],[118,"m",119],[119,"b",120],[120,"e",121],[121,"r",122],[122,"]",123],[123,",",124],[124,"[",125],[125,"]",126],[126,"]",127],[127,"]",128],[128,"]",129],[129,"]",130],[130,"]",131],[131,"0",132],[132,"1",[-,"[[n,not1],"":-"",[[[n,not],[[[n,member],[]]]]]]01"]]],[[1,"[",2],[2,"[",153],[153,"n",93],[93,",",155],[155,"c",95],[155,"f",6],[155,"o",156],[6,"i",7],[7,"n",8],[8,"d",9],[9,"a",10],[10,"l",11],[11,"l",12],[12,"1",13],[13,"]",14],[14,",",15],[15,"[",16],[16,"[",17],[17,"v",18],[18,",",19],[19,"1",20],[20,"]",21],[21,",",22],[22,"[",23],[23,"v",24],[24,",",25],[25,"2",26],[26,"]",27],[27,"]",28],[28,",",29],[29,"""",30],[30,":",31],[31,"-",32],[32,"""",33],[33,",",34],[34,"[",35],[35,"[",36],[36,"[",37],[37,"n",38],[38,",",39],[39,"f",40],[40,"i",41],[41,"n",42],[42,"d",43],[43,"a",44],[44,"l",45],[45,"l",46],[46,"]",47],[47,",",48],[48,"[",49],[49,"[",50],[50,"v",51],[51,",",52],[52,"3",53],[53,"]",54],[54,",",55],[55,"[",56],[56,"v",57],[57,",",58],[58,"1",59]
,[59,"]",60],[60,",",61],[61,"[",62],[62,"v",63],[63,",",64],[64,"6",65],[65,"]",66],[66,",",67],[67,"[",68],[68,"v",69],[69,",",70],[70,"6",71],[71,"]",72],[72,",",73],[73,"[",74],[74,"v",75],[75,",",76],[76,"3",77],[77,"]",78],[78,",",79],[79,"[",80],[80,"v",81],[81,",",82],[82,"2",83],[83,"]",84],[84,"]",85],[85,"]",86],[86,"]",87],[87,"]",88],[88,"0",89],[89,"1",[-,"[[n,findall1],[[v,1],[v,2]],"":-"",[[[n,findall],[[v,3],[v,1],[v,6],[v,6],[v,3],[v,2]]]]]01"]],[95,"u",96],[96,"t",97],[97,"1",98],[98,"]",99],[99,",",100],[100,"[",101],[101,"[",102],[102,"v",103],[103,",",104],[104,"1",105],[105,"]",106],[106,"]",107],[107,",",108],[108,"""",109],[109,":",110],[110,"-",111],[111,"""",112],[112,",",113],[113,"[",114],[114,"[",115],[115,"[",116],[116,"n",117],[117,",",118],[118,"f",119],[119,"i",120],[120,"n",121],[121,"d",122],[122,"a",123],[123,"l",124],[124,"l",125],[125,"]",126],[126,",",127],[127,"[",128],[128,"[",129],[129,"v",130],[130,",",131],[131,"2",132],[132,"]",133],[133,",",134],[134,"[",135],[135,"v",136],[136,",",137],[137,"2",138],[138,"]",139],[139,",",140],[140,"[",141],[141,"v",142],[142,",",143],[143,"1",144],[144,"]",145],[145,"]",146],[146,"]",147],[147,"]",148],[148,"]",149],[149,"0",150],[150,"1",[-,"[[n,cut1],[[v,1]],"":-"",[[[n,findall],[[v,2],[v,2],[v,1]]]]]01"]],[156,"r",157],[157,"1",158],[158,"2",159],[159,"]",160],[160,",",161],[161,"""",162],[162,":",163],[163,"-",164],[164,"""",165],[165,",",166],[166,"[",167],[167,"[",168],[168,"[",169],[169,"n",170],[170,",",171],[171,"t",172],[172,"r",173],[173,"u",174],[174,"e",175],[175,"]",176],[176,"]",177],[177,"]",178],[178,"]",179],[179,"0",180],[180,"1",[-,"[[n,or12],"":-"",[[[n,true]]]]01"]]],[[1,"[",2],[2,"[",63],[63,"n",34],[34,",",65],[65,"o",36],[36,"r",67],[67,"1",38],[38,"2",69],[69,"]",40],[40,",",71],[71,"""",42],[42,":",73],[73,"-",44],[44,"""",75],[75,",",46],[46,"[",77],[77,"[",48],[48,"[",79],[79,"n",50],[50,",",81],[81,"f",52],[81,"t",82],[52,"a",23],[23,"i",24],[24,"l",25],[
25,"]",26],[26,"]",27],[27,"]",28],[28,"]",29],[29,"0",30],[82,"r",83],[83,"u",84],[84,"e",85],[85,"]",86],[86,"]",87],[87,"]",88],[88,"]",89],[89,"0",90],[90,"1",[-,"[[n,or12],"":-"",[[[n,true]]]]01"]]],[[1,"[",2],[2,"[",143],[143,"n",76],[76,",",145],[145,"f",78],[78,"u",147],[147,"n",80],[80,"c",149],[149,"t",82],[82,"i",151],[151,"o",84],[84,"n",153],[153,"]",86],[86,",",155],[155,"""",88],[88,":",157],[157,"-",90],[90,"""",159],[159,",",92],[92,"[",161],[161,"[",94],[94,"[",163],[163,"n",96],[96,",",165],[165,"""",166],[165,"n",26],[165,"o",98],[26,"o",27],[27,"t",28],[28,"]",29],[29,",",30],[30,"[",31],[31,"[",32],[32,"[",33],[33,"[",34],[34,"n",35],[35,",",36],[36,"f",37],[37,"a",38],[38,"l",39],[39,"s",40],[40,"e",41],[41,"]",42],[42,"]",43],[43,",",44],[44,"[",45],[45,"[",46],[46,"n",47],[47,",",48],[48,"f",49],[49,"a",50],[50,"l",51],[51,"s",52],[52,"e",53],[53,"]",54],[54,"]",55],[55,"]",56],[56,"]",57],[57,"]",58],[58,",",59],[59,"[",60],[60,"[",61],[61,"n",62],[62,",",63],[63,"t",64],[64,"r",65],[65,"u",66],[66,"e",67],[67,"]",68],[68,"]",69],[69,"]",70],[70,"]",71],[71,"0",72],[72,"1",[-,"[[n,function],"":-"",[[[n,not],[[[[n,false]],[[n,false]]]]],[[n,true]]]]01"]],[98,"r",99],[99,"]",100],[100,",",101],[101,"[",102],[102,"[",103],[103,"[",104],[104,"n",105],[105,",",106],[106,"f",107],[107,"a",108],[108,"l",109],[109,"s",110],[110,"e",111],[111,"]",112],[112,"]",113],[113,",",114],[114,"[",115],[115,"[",116],[116,"n",117],[117,",",118],[118,"t",119],[119,"r",120],[120,"u",121],[121,"e",122],[122,"]",123],[123,"]",124],[124,"]",125],[125,"]",126],[126,",",127],[127,"[",128],[128,"[",129],[129,"n",130],[130,",",131],[131,"t",132],[132,"r",133],[133,"u",134],[134,"e",135],[135,"]",136],[136,"]",137],[137,"]",138],[138,"]",139],[139,"0",140],[140,"1",[-,"[[n,function],"":-"",[[[n,or],[[[n,false]],[[n,true]]]],[[n,true]]]]01"]],[166,"-",167],[167,">",168],[168,"""",169],[169,"]",170],[170,",",171],[171,"[",172],[172,"[",173],[173,"[",174],[174,"n",175],[17
5,",",176],[176,"t",177],[177,"r",178],[178,"u",179],[179,"e",180],[180,"]",181],[181,"]",182],[182,",",183],[183,"[",184],[184,"[",185],[185,"n",186],[186,",",187],[187,"t",188],[188,"r",189],[189,"u",190],[190,"e",191],[191,"]",192],[192,"]",193],[193,"]",194],[194,"]",195],[195,",",196],[196,"[",197],[197,"[",198],[198,"n",199],[199,",",200],[200,"t",201],[201,"r",202],[202,"u",203],[203,"e",204],[204,"]",205],[205,"]",206],[206,"]",207],[207,"]",208],[208,"0",209],[209,"1",[-,"[[n,function],"":-"",[[[n,""->""],[[[n,true]],[[n,true]]]],[[n,true]]]]01"]]],[[1,"[",2],[2,"[",119],[119,"n",84],[84,",",121],[121,"f",6],[121,"r",86],[121,"t",122],[6,"u",7],[7,"n",8],[8,"c",9],[9,"t",10],[10,"i",11],[11,"o",12],[12,"n",13],[13,"]",14],[14,",",15],[15,"""",16],[16,":",17],[17,"-",18],[18,"""",19],[19,",",20],[20,"[",21],[21,"[",22],[22,"[",23],[23,"n",24],[24,",",25],[25,"""",26],[26,"-",27],[27,">",28],[28,"""",29],[29,"]",30],[30,",",31],[31,"[",32],[32,"[",33],[33,"[",34],[34,"n",35],[35,",",36],[36,"t",37],[37,"r",38],[38,"u",39],[39,"e",40],[40,"]",41],[41,"]",42],[42,",",43],[43,"[",44],[44,"[",45],[45,"n",46],[46,",",47],[47,"t",48],[48,"r",49],[49,"u",50],[50,"e",51],[51,"]",52],[52,"]",53],[53,",",54],[54,"[",55],[55,"[",56],[56,"n",57],[57,",",58],[58,"t",59],[59,"r",60],[60,"u",61],[61,"e",62],[62,"]",63],[63,"]",64],[64,"]",65],[65,"]",66],[66,",",67],[67,"[",68],[68,"[",69],[69,"n",70],[70,",",71],[71,"t",72],[72,"r",73],[73,"u",74],[74,"e",75],[75,"]",76],[76,"]",77],[77,"]",78],[78,"]",79],[79,"0",80],[80,"1",[-,"[[n,function],"":-"",[[[n,""->""],[[[n,true]],[[n,true]],[[n,true]]]],[[n,true]]]]01"]],[86,"a",87],[87,"i",88],[88,"n",89],[89,"f",90],[90,"o",91],[91,"r",92],[92,"e",93],[93,"s",94],[94,"t",95],[95,"]",96],[96,",",97],[97,"""",98],[98,":",99],[99,"-",100],[100,"""",101],[101,",",102],[102,"[",103],[103,"[",104],[104,"[",105],[105,"n",106],[106,",",107],[107,"t",108],[108,"r",109],[109,"u",110],[110,"e",111],[111,"]",112],[112,"]",113],[113,"]",1
14],[114,"]",115],[115,"0",116],[116,"1",[-,"[[n,rainforest],"":-"",[[[n,true]]]]01"]],[122,"r",123],[123,"a",124],[124,"v",125],[125,"e",126],[126,"r",127],[127,"s",128],[128,"e",129],[129,"]",130],[130,",",131],[131,"[",132],[132,"[",133],[133,"v",134],[134,",",135],[135,"1",136],[136,"]",137],[137,",",138],[138,"[",139],[139,"v",140],[140,",",141],[141,"2",142],[142,"]",143],[143,"]",144],[144,",",145],[145,"""",146],[146,":",147],[147,"-",148],[148,"""",149],[149,",",150],[150,"[",151],[151,"[",152],[152,"[",153],[153,"n",154],[154,",",155],[155,"t",156],[156,"r",157],[157,"a",158],[158,"v",159],[159,"e",160],[160,"r",161],[161,"s",162],[162,"e",163],[163,"]",164],[164,",",165],[165,"[",166],[166,"[",167],[167,"v",168],[168,",",169],[169,"1",170],[170,"]",171],[171,",",172],[172,"[",173],[173,"v",174],[174,",",175],[175,"2",176],[176,"]",177],[177,",",178],[178,"[",179],[179,"v",180],[180,",",181],[181,"6",182],[182,"]",183],[183,",",184],[184,"[",185],[185,"v",186],[186,",",187],[187,"8",188],[188,"]",189],[189,"]",190],[190,"]",191],[191,",",192],[192,"[",193],[193,"[",194],[194,"n",195],[195,",",196],[196,"c",197],[197,"u",198],[198,"t",199],[199,"]",200],[200,"]",201],[201,"]",202],[202,"]",203],[203,"0",204],[204,"1",[-,"[[n,traverse],[[v,1],[v,2]],"":-"",[[[n,traverse],[[v,1],[v,2],[v,6],[v,8]]],[[n,cut]]]]01"]]],[[1,"[",2],[2,"[",65],[65,"n",35],[35,",",67],[67,"a",37],[67,"f",68],[37,"p",7],[7,"p",8],[8,"l",9],[9,"y",10],[10,"]",11],[11,",",12],[12,"""",13],[13,":",14],[14,"-",15],[15,"""",16],[16,",",17],[17,"[",18],[18,"[",19],[19,"[",20],[20,"n",21],[21,",",22],[22,"t",23],[23,"r",24],[24,"u",25],[25,"e",26],[26,"]",27],[27,"]",28],[28,"]",29],[29,"]",30],[30,"0",31],[68,"u",69],[69,"n",70],[70,"c",71],[71,"t",72],[72,"i",73],[73,"o",74],[74,"n",75],[75,"]",76],[76,",",77],[77,"""",78],[78,":",79],[79,"-",80],[80,"""",81],[81,",",82],[82,"[",83],[83,"[",84],[84,"[",85],[85,"n",86],[86,",",87],[87,"w",88],[88,"r",89],[89,"i",90],[90,"t",91],[91,"e",92]
,[92,"l",93],[93,"n",94],[94,"]",95],[95,",",96],[96,"[",97],[97,"]",98],[98,"]",99],[99,"]",100],[100,"]",101],[101,"0",102],[102,"1",[-,"[[n,function],"":-"",[[[n,writeln],[]]]]01"]]],[[1,"[",2],[2,"[",63],[63,"n",34],[34,",",65],[65,"o",36],[36,"r",67],[67,"1",38],[38,"2",69],[69,"]",40],[40,",",71],[71,"""",42],[42,":",73],[73,"-",44],[44,"""",75],[75,",",46],[46,"[",77],[77,"[",48],[48,"[",79],[79,"n",50],[50,",",81],[81,"t",52],[52,"r",83],[83,"u",54],[54,"e",85],[85,"]",56],[56,"]",87],[87,"]",58],[58,"]",89],[89,"0",60]],[[1,"[",2],[2,"[",3],[3,"n",4],[4,",",5],[5,"f",6],[6,"u",7],[7,"n",8],[8,"c",9],[9,"t",10],[10,"i",11],[11,"o",12],[12,"n",13],[13,"]",14],[14,",",15],[15,"""",16],[15,"[",56],[16,":",17],[17,"-",18],[18,"""",19],[19,",",20],[20,"[",21],[21,"[",22],[22,"[",23],[23,"n",24],[24,",",25],[25,"e",26],[26,"q",27],[27,"u",28],[28,"a",29],[29,"l",30],[30,"s",31],[31,"4",32],[32,"_",33],[33,"o",34],[34,"n",35],[35,"]",36],[36,"]",37],[37,"]",38],[38,"]",39],[39,"0",40],[40,"1",[-,"[[n,function],"":-"",[[[n,equals4_on]]]]01"]],[56,"[",57],[57,"v",58],[58,",",59],[59,"1",60],[60,"]",61],[61,",",62],[62,"[",63],[63,"v",64],[64,",",65],[65,"2",66],[66,"]",67],[67,",",68],[68,"[",69],[69,"v",70],[70,",",71],[71,"3",72],[72,"]",73],[73,"]",74],[74,",",75],[75,"""",76],[76,":",77],[77,"-",78],[78,"""",79],[79,",",80],[80,"[",81],[81,"[",82],[82,"[",83],[83,"n",84],[84,",",85],[85,"+",86],[86,"]",87],[87,",",88],[88,"[",89],[89,"[",90],[90,"v",91],[91,",",92],[92,"1",93],[93,"]",94],[94,",",95],[95,"[",96],[96,"v",97],[97,",",98],[98,"2",99],[99,"]",100],[100,",",101],[101,"[",102],[102,"v",103],[103,",",104],[104,"6",105],[105,"]",106],[106,"]",107],[107,"]",108],[108,",",109],[109,"[",110],[110,"[",111],[111,"n",112],[112,",",113],[113,"+",114],[114,"]",115],[115,",",116],[116,"[",117],[117,"[",118],[118,"v",119],[119,",",120],[120,"6",121],[121,"]",122],[122,",",123],[123,"[",124],[124,"v",125],[125,",",126],[126,"3",127],[127,"]",128],[128,"]",129],[129
,"]",130],[130,"]",131],[131,"]",132],[132,"0",133],[133,"1",[-,"[[n,function],[[v,1],[v,2],[v,3]],"":-"",[[[n,+],[[v,1],[v,2],[v,6]]],[[n,+],[[v,6],[v,3]]]]]01"]]]] \ No newline at end of file diff --git a/Philosophy/alg_file.txt b/Philosophy/alg_file.txt new file mode 100644 index 0000000000000000000000000000000000000000..15ed7d8f680e87de70912455a951b64e8faafbf9 --- /dev/null +++ b/Philosophy/alg_file.txt @@ -0,0 +1 @@ +[[\t[n,function \ No newline at end of file diff --git a/Philosophy/alg_file2.txt b/Philosophy/alg_file2.txt new file mode 100644 index 0000000000000000000000000000000000000000..15ed7d8f680e87de70912455a951b64e8faafbf9 --- /dev/null +++ b/Philosophy/alg_file2.txt @@ -0,0 +1 @@ +[[\t[n,function \ No newline at end of file diff --git a/Philosophy/alg_gen3.pl b/Philosophy/alg_gen3.pl new file mode 100644 index 0000000000000000000000000000000000000000..06b1da47d17b5a227cf238f91c4c1359c6ec2830 --- /dev/null +++ b/Philosophy/alg_gen3.pl @@ -0,0 +1,401 @@ +% 11, total 17 + +/* + +f e.e. + +?- alg_gen. +f is not in dictionary. Is it computational (y/n)? +| y +What is the command name for f ( for the same)? +| +e is not in dictionary. Is it computational (y/n)? +| n +What is the algorithm note for f? +| f +What is the command name for e ( for the same)? +| +What is the algorithm note for e? +| e +true. 
+ +*/ + +% test then put q skips + +% for 80 algs + +% a b word c automate -> (word->word,automate->recurse) -> recurse(word) + +% - word or automate used if only a b or c present (or same with another sent), otherwise asks which words to use * x +% collects words to note and discard (automatically discards connectives and any non-computational terms without brdict such as plus, variable, box, square, right, down), +% asks about new words x +% x discards conn and non comp words, asks about rest +% x * +% *** x records computational, discarded terms x + +% alg - takes previous status of word as arg or function, in previous relations (if has no function, then make function), if has no previous relation (and, a fn/arg) then ask +% x uses previous relations +% *** x records weightings of and uses algs (notes string) for single computational terms (also uses dict1 for converting to comp terms) + +% tech_dict.txt: automate,recurse, alg_dict.txt: [a, b, c, word,recurse],recurse(word) x + +% *** alg_gen_dict1.txt [add,sum], alg_gen_dict2.txt [sum,1,"1+1=2"] + +:-include('../Text-to-Breasonings/text_to_breasonings.pl'). +:-include('../listprologinterpreter/la_files.pl'). + +alg_gen :- + + phrase_from_file_s(string(File), "../Text-to-Breasonings/file.txt"), + string_codes(File_string0,File), + + downcase_atom(File_string0,File_string1), + split_string(File_string1,".\n\r",".\n\r",File_string31), + + delete(File_string31,"",File_string3), + + % to delete connectives such as and, the + % open_file_s("connectives.txt",Connectives), + + open_file_s("alg_gen3_dict1.txt",Alg_gen_dict1), + + open_file_s("alg_gen3_dict2.txt",Alg_gen_dict2), + +% * Recursively do it, replacing old dict entries + alg_gen1(File_string3,Alg_gen_dict1,Alg_gen_dict1a,Alg_gen_dict2,Alg_gen_dict2a), + + save_file_s("alg_gen3_dict1.txt",Alg_gen_dict1a), + + save_file_s("alg_gen3_dict2.txt",Alg_gen_dict2a). + + +alg_gen1([],Alg_gen_dict1,Alg_gen_dict1,Alg_gen_dict2,Alg_gen_dict2) :- !. 
+alg_gen1(File_string3,Alg_gen_dict1,Alg_gen_dict1a,Alg_gen_dict2,Alg_gen_dict2a) :- + + File_string3=[File_string3a|Rest], + %findall(Words1j,(member(File_string3a,File_string3), + SepandPad="#@~%`$?-+*^,()|.:;=_/[]<>{}\n\r\s\t\\!'\"0123456789", + split_string(File_string3a,SepandPad,SepandPad,File_string2), + + + %trace, + (findall([AA,AB%AD,AB,AF + ],(member(AA,File_string2),member([AA,AB],Alg_gen_dict1)%,member([AD,AB,AF],Alg_gen_dict2) + ),AC), + findall(-,(member([_,-],AC)),AG), + ((length(AG,LA),length(AC,LA))-> + subtract(Alg_gen_dict1,AC,Alg_gen_dict11); + Alg_gen_dict1=Alg_gen_dict11)), + + repeat, + + % if l AC - l AG + findall(-%[AA,AB%AD,AB,AF + %] + ,(member(AA,File_string2),not(member([AA,AB],Alg_gen_dict1)))%,member([AD,AB,AF],Alg_gen_dict2) + ,AC1), + (length(AC1,1)->Single=true;Single=false), + %trace, + alg_gen2(File_string2,Single,Alg_gen_dict11,Alg_gen_dict1b, + Alg_gen_dict2,Alg_gen_dict2b, + [],Alg_gen_dict3b, + New_flag + ), + +% if a word is new, asks for the main term in the sentence, otherwise determines it from weights +%trace, +((not(var(New_flag)),New_flag=true)-> +( +term_to_atom(File_string3a,File_string3a1), + +(length(Alg_gen_dict3b,1)->S11=1; +( +length(Alg_gen_dict3b,L), +numbers(L,1,[],N), +findall([N1," - ",Item,"\n"],(member(N1,N),get_item_n(Alg_gen_dict3b,N1,Item)),List),foldr(append,List,[],Item2a), +foldr(string_concat,Item2a,"",Item2), + +%term_to_atom(Alg_gen_dict3b,Alg_gen_dict3b1), +foldr(string_concat,["The sentence: ",File_string3a1," has the computational terms:","\n",Item2,"What is the number of the main computational term?"],"",Prompt1), +repeat, +writeln(Prompt1), +read_string(user_input,"\n","\r",_,S1), +number_string(S11,S1) +)), +get_item_n(Alg_gen_dict3b,S11,Item3), +Flag=true +% +%Alg_gen_dict2=Alg_gen_dict2c +) +; +(%trace, +(Alg_gen_dict3b=[]->fail;true), +findall([Weight,M,Note],(member(M,File_string2),member([Weight,M,Note],Alg_gen_dict2)),M1), +(M1=[]-> +Flag=fail; +(sort(M1,M2), 
+reverse(M2,[[W,Item3,Note]|_]), +Flag=true)))), + +% () * repeat, get A out of [A,B] + +% if a word's note is not in a_dict2, then asks for its alg + +(Flag=fail-> +Alg_gen_dict2b=Alg_gen_dict2c; +( +(Note=(-)-> +(foldr(string_concat,["What is the algorithm note for ",Item3,"?"],"",Prompt2), +%repeat, +writeln(Prompt2), +read_string(user_input,"\n","\r",_,S4), +%trace, +member([W,Item3,_],Alg_gen_dict2b), +delete(Alg_gen_dict2b,[W,Item3,_],Alg_gen_dict2c1), +append(Alg_gen_dict2c1,[[W,Item3,S4]],Alg_gen_dict2c)) +; +Alg_gen_dict2b=Alg_gen_dict2c))), + +alg_gen1(Rest,Alg_gen_dict1b,Alg_gen_dict1a,Alg_gen_dict2c,Alg_gen_dict2a),!. + + %alg_gen1(Rest,Alg_gen_dict1a,Alg_gen_dict1b,Alg_gen_dict2a,Alg_gen_dict2b). + + +alg_gen2([],_,Alg_gen_dict1,Alg_gen_dict1, + Alg_gen_dict2,Alg_gen_dict2, + Alg_gen_dict3,Alg_gen_dict3, + _New_flag + ) :- !. + +alg_gen2(File_string2,Single,Alg_gen_dict1a1,Alg_gen_dict1b, + Alg_gen_dict2a1,Alg_gen_dict2b, + Alg_gen_dict3a,Alg_gen_dict3b, + New_flag + ) :- + + alg_gen3(File_string2,Alg_gen_dict1a1,Alg_gen_dict1a, + Alg_gen_dict2a1,Alg_gen_dict2a, + [],Alg_gen_dict4a), + + File_string2=[File_string22|Rest], + +length(Alg_gen_dict4a,L), +numbers(L,1,[],N), + +(Alg_gen_dict4a=[]-> +( + Alg_gen_dict1a=Alg_gen_dict1d, + Alg_gen_dict2a=Alg_gen_dict2d, + Alg_gen_dict3a=Alg_gen_dict3d +) +; +( +term_to_atom(File_string2,File_string3a1), + +(length(Alg_gen_dict4a,1)->SS=[1]; +( +findall([N1," - ",Item,"\n"],(member(N1,N),get_item_n(Alg_gen_dict4a,N1,Item)),List),foldr(append,List,[],Item2a), +foldr(string_concat,Item2a,"",Item2), + +%term_to_atom(Alg_gen_dict3b,Alg_gen_dict3b1), +foldr(string_concat,["The sentence: ",File_string3a1," has the undefined words:","\n",Item2,"What are the numbers of the computational terms?"],"",Prompt1), +repeat, +writeln(Prompt1), +read_string(user_input,"\n","\r",_,S1), +split_string(S1,", ",", ",S3), +%trace, + +findall(S5,(member(S4,S3),number_string(S5,S4)),SS))), +%trace, 
+findall(Item3,(member(S5,SS),get_item_n(Alg_gen_dict4a,S5,Item3)),S6), +findall(Item3,(member(S5,N),not(member(S5,SS)), +get_item_n(Alg_gen_dict4a,S5,Item3)),S7), +%Flag=true, + +% +%Alg_gen_dict2=Alg_gen_dict2c + + + + /* +File_string2=[Word|Rest], + +% if a word is not in a_dict1, then asks for that word's status or computational version + +(member([Word,Words1d],Alg_gen_dict1a)-> +(Alg_gen_dict1a=Alg_gen_dict1c, +member([W,Words1d,Note],Alg_gen_dict2a), +W1 is W+1, +delete(Alg_gen_dict2a,[W,Words1d,Note],Alg_gen_dict2a1), +append(Alg_gen_dict2a1,[[W1,Words1d,Note]],Alg_gen_dict2c), +append(Alg_gen_dict3a,[Words1d],Alg_gen_dict3c) + +) +; +(foldr(string_concat,[Word," is not in dictionary. Is it computational (y/n)?"],"",Prompt1), +%repeat, + +(Single=true->S1="y"; +(writeln(Prompt1), +read_string(user_input,"\n","\r",_,S1))), + +(S1="y"-> +*/ +alg_gen4(S6,Alg_gen_dict1a,Alg_gen_dict1c, +Alg_gen_dict2a,Alg_gen_dict2c, +Alg_gen_dict3a,Alg_gen_dict3c, +New_flag), + +alg_gen5(S7,Alg_gen_dict1c,Alg_gen_dict1d, +Alg_gen_dict2c,Alg_gen_dict2d, +Alg_gen_dict3c,Alg_gen_dict3d) +)), +/* +findall(*,(member(S7,S6),foldr(string_concat,["What is the command name for ",S7," ( for the same)?"],"",Prompt2), +%repeat, +writeln(Prompt2), +read_string(user_input,"\n","\r",_,S2), +(S2=""->S3=Word;S3=S2), +append(Alg_gen_dict1a,[[Word,S3]],Alg_gen_dict1c), +append(Alg_gen_dict2a,[[0,S3,-]],Alg_gen_dict2c), +append(Alg_gen_dict3a,[S3],Alg_gen_dict3c), +New_flag=true +); +(append(Alg_gen_dict1a,[[Word,-]],Alg_gen_dict1c), +Alg_gen_dict2a=Alg_gen_dict2c, +Alg_gen_dict3a=Alg_gen_dict3c) +))), +*/ +alg_gen2(Rest,Single,Alg_gen_dict1d,Alg_gen_dict1b, +Alg_gen_dict2d,Alg_gen_dict2b, +Alg_gen_dict3d,Alg_gen_dict3b, +New_flag +). + + +alg_gen3([],Alg_gen_dict1,Alg_gen_dict1, + Alg_gen_dict2,Alg_gen_dict2, + Alg_gen_dict3,Alg_gen_dict3) :- !. 
+ +alg_gen3(File_string2,Alg_gen_dict1a,Alg_gen_dict1b, + Alg_gen_dict2a,Alg_gen_dict2b, + Alg_gen_dict3a,Alg_gen_dict3b) :- + +File_string2=[Word|Rest], + +% if a word is not in a_dict1, then asks for that word's status or computational version + +(member([Word,Words1d],Alg_gen_dict1a)-> +(Alg_gen_dict1a=Alg_gen_dict1c, +member([W,Words1d,Note],Alg_gen_dict2a), +W1 is W+1, +delete(Alg_gen_dict2a,[W,Words1d,Note],Alg_gen_dict2a1), +append(Alg_gen_dict2a1,[[W1,Words1d,Note]],Alg_gen_dict2c), +Alg_gen_dict3a=Alg_gen_dict3c +) +; +(Alg_gen_dict1a=Alg_gen_dict1c, +Alg_gen_dict2a=Alg_gen_dict2c, +append(Alg_gen_dict3a,[Word%Words1d +],Alg_gen_dict3c) +) +), + +alg_gen3(Rest,Alg_gen_dict1c,Alg_gen_dict1b, + Alg_gen_dict2c,Alg_gen_dict2b, + Alg_gen_dict3c,Alg_gen_dict3b). + + + +alg_gen4([],Alg_gen_dict1,Alg_gen_dict1, +Alg_gen_dict2,Alg_gen_dict2, +Alg_gen_dict3,Alg_gen_dict3, +_New_flag) :- !. + +alg_gen4(S6,Alg_gen_dict1a,Alg_gen_dict1b, +Alg_gen_dict2a,Alg_gen_dict2b, +Alg_gen_dict3a,Alg_gen_dict3b, +New_flag) :- + +S6=[S7|S8], + +foldr(string_concat,["What is the command name for ",S7," ( for the same)?"],"",Prompt2), +%repeat, +writeln(Prompt2), +read_string(user_input,"\n","\r",_,S2), +(S2=""->S3=S7;S3=S2), +append(Alg_gen_dict1a,[[S7,S3]],Alg_gen_dict1c), +append(Alg_gen_dict2a,[[0,S3,-]],Alg_gen_dict2c), +append(Alg_gen_dict3a,[S3],Alg_gen_dict3c), +New_flag=true, + +alg_gen4(S8,Alg_gen_dict1c,Alg_gen_dict1b, +Alg_gen_dict2c,Alg_gen_dict2b, +Alg_gen_dict3c,Alg_gen_dict3b, +New_flag). + + + +alg_gen5([],Alg_gen_dict1,Alg_gen_dict1, +Alg_gen_dict2,Alg_gen_dict2, +Alg_gen_dict3,Alg_gen_dict3) :- !. + +alg_gen5(S7,Alg_gen_dict1a,Alg_gen_dict1b, +Alg_gen_dict2a,Alg_gen_dict2b, +Alg_gen_dict3a,Alg_gen_dict3b) :- + +S7=[Word|S8], + +append(Alg_gen_dict1a,[[Word,-]],Alg_gen_dict1c), +Alg_gen_dict2a=Alg_gen_dict2c, +Alg_gen_dict3a=Alg_gen_dict3c, + +alg_gen5(S8,Alg_gen_dict1c,Alg_gen_dict1b, +Alg_gen_dict2c,Alg_gen_dict2b, +Alg_gen_dict3c,Alg_gen_dict3b). 
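The clause of alg_gen3/7 above implements a simple scan over a sentence's words: a word already mapped in Alg_gen_dict1 has the weight of its command's entry in Alg_gen_dict2 incremented, while an unmapped word is collected (via Alg_gen_dict3) for the prompting done later by alg_gen4/8 and alg_gen5/7. A minimal sketch of that pass, in Python for brevity — the dict shapes mirror the Prolog lists `[word, command]` and `[weight, command, note]`, and the name `scan_words` is illustrative, not part of the source:

```python
def scan_words(words, dict1, dict2):
    """dict1 maps a word to its command name; dict2 maps a command
    name to [weight, note].  Known words bump their command's weight
    (as alg_gen3/7 does); unknown words are collected for prompting."""
    undefined = []
    for word in words:
        if word in dict1:
            command = dict1[word]
            weight, note = dict2[command]
            dict2[command] = [weight + 1, note]
        else:
            undefined.append(word)
    return undefined
```

The returned list plays the role of Alg_gen_dict4a, which alg_gen2/9 then presents to the user as "the undefined words".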
+ +% * if not in a_dict1, * + +/* + findall(Words1e,(member(Words1c,File_string2), + (member([Words1c,Words1d],Alg_gen_dict1)->Words1e=Words1d; + Words1e=Words1c)),Words1f), + + + subtract(Words1f,Connectives,Words1g1), + + % remove duplicates + sort(Words1g1,Words1h), + + (member([Words1h,_A],Alg_dict)->fail*; + ( + + term_to_atom(Words1f,Words1g), + + foldr(string_concat,"What are the [Computational_words,Algorithm] for: ",Words1g, + "\n","e.g. [[[word,word],[automate,recurse]], [recurse(word)]] ?",Prompt), + repeat, + writeln(Prompt), + read_string(user_input,"\n","\r",_,S), + + term_to_atom(S1,S), + + S1=[Computational_words,_Algorithm], + + findall([A1,B1],(member([A,B],Computational_words), + (string_atom(A1,A),string_atom(B1,B))),Computational_words1), + + findall(Words1e,(member(Words1c,Words1h), + (member([Words1c,Words1d],Computational_words1)-> + Words1e=Words1d; + Words1e=Words1c)),Words1j) + ))),Words1k), + + % foldr(append,Words1k,[],Words1m), + + append() + + +*/ + diff --git a/Philosophy/arg_cache.txt b/Philosophy/arg_cache.txt new file mode 100644 index 0000000000000000000000000000000000000000..00ec8723df85da5eaf7722f0b2d1c1438816b43d --- /dev/null +++ b/Philosophy/arg_cache.txt @@ -0,0 +1 @@ +[0,""] \ No newline at end of file diff --git a/Philosophy/bag2phil.pl b/Philosophy/bag2phil.pl new file mode 100644 index 0000000000000000000000000000000000000000..cc3e6631766f1d9eb1e081f06c18f02eec8edcae --- /dev/null +++ b/Philosophy/bag2phil.pl @@ -0,0 +1,69 @@ +% bag2phil.pl + +% breasoning algorithm generator 2 to philosophy + +% uses breasoning algorithm generator 2 to output the maximum number of philosophy paragraphs with 5 sentences. + +% later: do this for combos of texts + +:-include('../listprologinterpreter/listprolog.pl'). +:-include('br_alg_gen2.pl'). + +/* +?- bag2phil(Br),writeln(Br). 
+Enter number of input files: +|: 2 +Enter file 1 path: +|: ../Lucian-Academy/Books/IMMORTALITY/Immortality 1.txt +Enter file 2 path: +|: ../Lucian-Academy/Books/IMMORTALITY/Immortality 2.txt +*/ + +bag2phil(Br32) :- +repeat, +writeln("Enter number of input files:"), +read_string(user_input,"\n\r","\n\r",_,S1), +number_string(N,S1), +numbers(N,1,[],N1), +findall([T2,"\n",T3,"\n",T4,"\n"],(member(N2,N1),write("Enter file "),write(N2),writeln(" path:"),read_string(user_input,"\n\r","\n\r",_,S2), + +%open_s(S2,read,T1),read_term(T1,[T2,T3,_,T4]),close(S2) + +phrase_from_file_s(string(A),S2),term_to_atom([T2,T3,_,T4],A)%writeln(A1). + +),T5), +%trace, +flatten(T5,T6), +%findall([T61,". "],member(T61,T6),T62), +%flatten(T62,T63), +foldr(string_concat,T6,T7), +open_s("../Text-to-Breasonings/file.txt",write,S3),write(S3,T7),close(S3), +%trace, +br_alg_gen2(Br0), +sort(Br0,Br), + +findall(T63,(member(T61,Br),string_concat(T61,". ",T63)),T62), +%flatten(T62,Br11), + +length(T62,N3), +N4 is N3 div 5, + +N6 is 5*N4, +length(L,N6), +append(L,L2,T62), + +divide(1,L,5,[],Br2), + +N8 is N4+1, +append(Br2,[N8,". ",L2],Br3), + +flatten(Br3,Br31),foldr(string_concat,Br31,Br32),!. + + +divide(_,[],_,Br2,Br2) :- !. +divide(N7,Br,N4,Br1,Br2) :- + length(L,N4),append(L,L2,Br), + append(Br1,[N7,". ",L,"\n"],Br3), + N8 is N7+1, + divide(N8,L2,N4,Br3,Br2). + diff --git a/Philosophy/bag2philn.pl b/Philosophy/bag2philn.pl new file mode 100644 index 0000000000000000000000000000000000000000..e272d8cd07931c175cb6a47c9fa3561aed42a096 --- /dev/null +++ b/Philosophy/bag2philn.pl @@ -0,0 +1,103 @@ +% bag2phil.pl + +% breasoning algorithm generator 2 to philosophy + +% uses breasoning algorithm generator 2 to output the maximum number of philosophy paragraphs with 5 sentences. + +% later: do this for combos of texts + +%:-include('../listprologinterpreter/listprolog.pl'). +:-include('br_alg_gen2.pl'). +%:-dynamic spec/1. +/* +?- bag2phil(100,2,Br),writeln(Br). 
+Enter number of input files: +|: 2 +Enter file 1 path: +|: ../Lucian-Academy/Books/IMMORTALITY/Immortality 1.txt +Enter file 2 path: +|: ../Lucian-Academy/Books/IMMORTALITY/Immortality 2.txt +*/ + +bag2phil(Length,Twists,Br32) :- +repeat, +writeln("Enter number of input files:"), +read_string(user_input,"\n\r","\n\r",_,S1), +number_string(N,S1), +numbers(N,1,[],N1), +findall([T2,"\n",T3,"\n",T4,"\n"],(member(N2,N1),write("Enter file "),write(N2),writeln(" path:"),read_string(user_input,"\n\r","\n\r",_,S2), + +%open_s(S2,read,T1),read_term(T1,[T2,T3,_,T4]),close(S2) + +phrase_from_file_s(string(A),S2),term_to_atom([T2,T3,_,T4],A)%writeln(A1). + +),T5), + +bag2phil2(Length,T5,Twists,Br32). + +bag2phil2(Length,T5,Twists,Br32) :- + +%trace, +flatten(T5,T6), +%findall([T61,". "],member(T61,T6),T62), +%flatten(T62,T63), +foldr(string_concat,T6,T7), +%open_s("../Text-to-Breasonings/file.txt",write,S3),write(S3,T7),close(S3), +%trace, + +repeat(Length,Twists,T7,Br32),!. + +repeat(_,0,T7,T7) :- !. + +repeat(Length,N,T7,Br33) :- + +br_alg_gen21(T7,Br00), + + +sort(Br00,Br0), + +% + +%Length2 is Length*1.1, + +random_permutation(Br0,Br000), + + +length(Br000,LL1), +(LL1>=Length-> +(length(Br,Length), +append(Br,_,Br000)); +Br=Br0 +) +, +%*/ + + +findall(T63,(member(T61,Br),string_concat(T61,". ",T63)),T62), +%flatten(T62,Br11), + +length(T62,N3), +N4 is N3 div 5, + +N6 is 5*N4, +length(L,N6), +append(L,L2,T62), + +divide(1,L,5,[],Br2), + +N8 is N4+1, +append(Br2,[N8,". ",L2],Br3), + +flatten(Br3,Br31),foldr(string_concat,Br31,Br32), + +NN is N-1, + repeat(Length,NN,Br32,Br33), +!. + + +divide(_,[],_,Br2,Br2) :- !. +divide(N7,Br,N4,Br1,Br2) :- + length(L,N4),append(L,L2,Br), + append(Br1,[N7,". ",L,"\n"],Br3), + N8 is N7+1, + divide(N8,L2,N4,Br3,Br2). 
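For reference, divide/5 above chunks the sentence list into numbered five-sentence paragraphs before the remainder is appended as a final numbered group. A small hand-traced query under SWI-Prolog:

```prolog
% divide(+ParaNo,+Sentences,+N,+Acc,-Out) groups Sentences into numbered
% chunks of N, accumulating [ParaNo,". ",Chunk,"\n"] for each group.
% ?- divide(1,[a,b,c,d,e,f,g,h,i,j],5,[],Out).
% Out = [1,". ",[a,b,c,d,e],"\n",2,". ",[f,g,h,i,j],"\n"].
```

The caller then flattens this structure and folds it into one string with foldr(string_concat,...).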
diff --git a/Philosophy/bag_args2.pl b/Philosophy/bag_args2.pl new file mode 100644 index 0000000000000000000000000000000000000000..d4152a3b1c64ae4afbb2f6b45f62f3dcc5c78428 --- /dev/null +++ b/Philosophy/bag_args2.pl @@ -0,0 +1,170 @@ +:-include('big_connections_with_bag3_and_mr_short_books_aa540_args_init2.pl'). +:-include('../Text-to-Breasonings/text_to_breasonings3.pl'). +:-include('../t2ab/t2ab.pl'). +:-include('../listprologinterpreter/la_files.pl'). +:- dynamic count2/1. +:-include('la_vps.pl'). +%:-include('../Lucian-Academy/word_count.pl'). + +%bag_args(96000). +get_r(X2) :- +random(X),X1 is ceiling(X*1000000),foldr(string_concat,["Books/args-",X1,"/"],X3), +(exists_directory(X3)->get_r(X2);X3=X2). + +bag_args(Limit1,C2) :- + (( +(exists_directory('Books/args')->(get_r(X2),mv("Books/args/",X2));true), +time((Split_into_n=%1,% +2, +%Limit is ceiling((Limit1*2.5*Split_into_n)),%/(*5)), +Limit is Limit1,%*2.5*Split_into_n)),%/(*5)), + + retractall(count2(_)), + assertz(count2(0)), + retractall(br_bar(_)), + assertz(br_bar([])), + %length(L,10), + %L=[1,2,3,4,5,6,7,8,9,10], + open_string_file_s("../Lucian-Academy/Books1/args/lgtext_a.txt",File_string), + term_to_atom(File_string2,File_string), + File_string2=["","","",File_string1], + not(File_string1=""), + split1(Split_into_n,File_string1,File_strings), + length(File_strings,L), + numbers(L,1,[],Ls), + + concurrent_maplist(a1([File_strings,Limit]),Ls,_N2))) + )-> + + (true + + );true%bag_args(Limit1) + ), + + count21(C2),writeln([count,C2,/,Limit1]), + + get_time(A),stamp_date_time(A,date(_Year,_Month,_Day,Hour1,Minute1,_Seconda,_A,_TZ,_False),local), + +open_string_file_s("aa_log.txt",File_string_b), + term_to_atom(File_string2_b,File_string_b), + append(File_string2_b,[[arg,Hour1,Minute1,C2]],File_string2_b1), + + open_s("aa_log.txt",write,Stream1a), + write(Stream1a,File_string2_b1), + close(Stream1a), + %delete(N2,0,N3),append(_,[N4],N3),writeln1(N4), + + br_bar(Br), + 
+open_string_file_s("arg_cache.txt",File_string_b_cache), + term_to_atom([Cache_N,File_string2_b_cache],File_string_b_cache), + + term_to_atom(Br,N5),atom_string(N5,N6), + + string_concat(File_string2_b_cache,N6,N7), + + WC is C2+Cache_N, + + %word_count(["string",N7],WC), + + (WC>6000-> + (texttobr2(u,u,N7,u,[auto,on]), + t2ab(u,u,N7,u,on), + term_to_atom([0,""],Cache_atom), + open_s("arg_cache.txt",write,Stream1a_cache), + write(Stream1a_cache,Cache_atom), + close(Stream1a_cache) + ); + (%WC1 is Cache_N+WC, + term_to_atom([WC,N7],N8), + open_s("arg_cache.txt",write,Stream1a_cache), + write(Stream1a_cache,N8), + close(Stream1a_cache) + )), + + + + + !. + +%%%% + +count21(C) :- count2(C). + +a1([File_strings,Limit],L1,File_string_a) :- + %findall(Sent_br2,( + %member(L1,Ls), + count21(C2),(C2>=Limit-> + (File_string_a=""%open_s("../Lucian-Academy/Books1/algs/lgalgs_a.txt",write,Stream1), + %write(Stream1,File_string), + %close(Stream1),abort + );(writeln([count,C2,/,Limit]),get_item_n(File_strings,L1,N), + + + +(((catch(call_with_time_limit(0.84, + time( big_connections_with_bag3_and_mr(N,File_string_a))), + time_limit_exceeded, + false)),%-> + + findall(_,sub_string(File_string_a,_,_,_,". "),A),length(A,L2), +split_string(File_string_a,"\n\r","\n\r",NL),length(NL,NLN),Sent_br2 is L2-NLN, + + %split_string(File_string_a,"\n\r.","\n\r.",Sents), + %split_string(File_string_a,"\n\r","\n\r",Sents1), + + %length(Sents1,Sents1L), + %length(Sents,Sent_br21), + %Sent_br2 is Sent_br21-Sents1L, + + %trace, + %t2b, + %t2ab, +%N1=1,texttobr2(N1,"Books/args/lgtext_a.txt",u,u,false,false,false,false,false,false,[auto,on]), + %trace, + %t2ab(u,"Books/args/lgtext_a.txt",u,u,on), + + count21(C), + C1 is C+Sent_br2, + retractall(count2(_)), + assertz(count2(C1)), + + br_bar(Br), + append(Br,[File_string_a],Br1), + retractall(br_bar(_)), + assertz(br_bar(Br1)) + + + )->true;(File_string_a="")))),!. + + %split1(17,A"a:-b,c.",B). + +sum(A,B,C) :- C is A+B. 
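The Sent_br2 computation in the a1/3 worker above counts ". " matches and subtracts the number of newline-separated segments, so line breaks are not counted as extra sentences. A hedged, self-contained restatement of that step (the helper name count_sentences/2 is hypothetical):

```prolog
% count_sentences(+String,-N): count occurrences of ". " in String, then
% subtract the number of "\n"/"\r"-separated segments, mirroring the
% Sent_br2 computation in a1/3 above. Hypothetical helper name.
count_sentences(String,N) :-
	findall(_,sub_string(String,_,_,_,". "),Matches),
	length(Matches,L),
	split_string(String,"\n\r","\n\r",Lines),
	length(Lines,LN),
	N is L-LN.
```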
+split1(N,A,B) :- + split_string(A,"\n\r","\n\r",C), + split2(N,C,[],D), + findall(H,(member(D1,D), + findall([E,"\n"],member(E,D1),F), + flatten(F,G), + foldr(string_concat,G,H)),B),!. +split2(N,A,B,C) :- + length(D,N), + (append(D,E,A)-> + (append(B,[D],F), + split2(N,E,F,C)); + append(B,[A],C)),!. + + + t2b :- +N1=1,texttobr2(N1,"Books/args/lgtext_a.txt",u,u,false,false,false,false,false,false,[auto,on]),!. + +t2ab:-t2ab(u,"Books/args/lgtext_a.txt",u,u,on),!. + +print_report :- + open_string_file_s("aa_log.txt",File_string), + term_to_atom(File_string2,File_string), + sort(File_string2,File_string22), + reverse(File_string22,File_string21), + length(File_string21,L), + numbers(L,1,[],Ls), + findall(_,(member(L1,Ls),get_item_n(File_string21,L1,A),writeln([L1|A])),_),!. \ No newline at end of file diff --git a/Philosophy/bag_cat_arg.pl b/Philosophy/bag_cat_arg.pl new file mode 100644 index 0000000000000000000000000000000000000000..0cdeb943a596cbd377f6e83547845f4a3c7253ea --- /dev/null +++ b/Philosophy/bag_cat_arg.pl @@ -0,0 +1,12 @@ +#!/usr/bin/swipl -f -q + +:-include('cat_arg_files.pl'). +:- initialization(catch(main, Err, handle_error(Err))). + +handle_error(_Err):- + halt(1). +main :- + cat_arg_files(6000), + nl, + halt. +main :- halt(1). diff --git a/Philosophy/bag_tt1.pl b/Philosophy/bag_tt1.pl new file mode 100644 index 0000000000000000000000000000000000000000..e7e9751bf87b2d05e92391f9142b00141064ed5b --- /dev/null +++ b/Philosophy/bag_tt1.pl @@ -0,0 +1,12 @@ +#!/usr/bin/swipl -f -q + +:-include('text_to_breasonings.pl'). +:- initialization(catch(main, Err, handle_error(Err))). + +handle_error(_Err):- + halt(1). +main :- + texttobr2(u,u,"square",u), + nl, + halt. +main :- halt(1). diff --git a/Philosophy/bell.pl b/Philosophy/bell.pl new file mode 100644 index 0000000000000000000000000000000000000000..370e2d98f13513805a1a2a6806147fe9c36d61c6 --- /dev/null +++ b/Philosophy/bell.pl @@ -0,0 +1,37 @@ +:-include('../listprologinterpreter/listprolog.pl'). 
+
+bell(Spoken_notification) :-
+ catch((bell1(Spoken_notification)->true;true),_,true),!.
+
+bell1(Spoken_notification) :-
+ %alarm_clock(H,M),
+ /*
+ string_codes(S,[101, 99, 104, 111, 32, %022, %86,
+ 007%, 71
+ ]),*/
+
+ os_running(OS),
+
+ (OS=macos->
+ foldr(string_concat,["afplay /System/Library/Sounds/Funk.aiff\n","say \""%done"%"echo "
+ ,%""%
+ Spoken_notification,
+ "\""],S1);
+
+ (OS=linux->
+ foldr(string_concat,["echo -e '\a'\n","spd-say \""%done"%"echo "
+ ,%""%
+ Spoken_notification,
+ "\""],S1);
+
+ (OS=windows->
+ foldr(string_concat,["ECHO \"^G\"\n","PowerShell -Command \"Add-Type -AssemblyName System.Speech; (New-Object System.Speech.Synthesis.SpeechSynthesizer).Speak('",Spoken_notification,"');\""],S1);
+ fail))),
+
+ shell1_s(S1).
+
+% Detect the host OS via SWI-Prolog flags. The apple flag must be tested
+% before unix, since macOS sets both.
+os_running(OS) :-
+ (current_prolog_flag(apple, true)->OS=macos;
+ (current_prolog_flag(unix, true)->OS=linux;
+ (current_prolog_flag(windows, true)->OS=windows;
+ OS=other))),!.
diff --git a/Philosophy/big_connections_with_bag3_and_mr_short_books.pl b/Philosophy/big_connections_with_bag3_and_mr_short_books.pl new file mode 100644 index 0000000000000000000000000000000000000000..8ef83a61a3c5cb134398c87b280a56285fee8985 --- /dev/null +++ b/Philosophy/big_connections_with_bag3_and_mr_short_books.pl @@ -0,0 +1,108 @@ +% big_connections_with_bag3_and_mr.pl
+
+% This will conclude the philosophy project,
+% writing 16*n joined details.
+
+:-include('bag2philn.pl').
+
+big_connections_with_bag3_and_mr2(Keywords) :-
+%trace,
+ retractall(spec(_)),
+ assertz(spec(on)),
+ findall(Keyword1,(member(Keyword,Keywords),downcase_atom(Keyword,Keyword1)),Keywords2),
+ retractall(keywords(_)),
+ assertz(keywords(Keywords2)),
+ big_connections_with_bag3_and_mr.
+
+choose_texts(K1,H,J1):-
+%member(H,G),
+not(string_concat("dot",_,H)),
+string_concat(K1,H,H1),open_file_s(H1,[A,B,_,File_term]),
+flatten([A,"\n",B,"\n",File_term,"\n"],J1).
+ +big_connections_with_bag3_and_mr :- + /*(not(spec(_))-> + (retractall(spec(_)), + assertz(spec(off))); + true), +*/ +(exists_directory('Books')->true;make_directory('Books')), + + +K=[ +"../Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators1/" + +%"../Lucian-Academy/Books/BOTS/" +%/* +/*"../Lucian-Academy/Books/COMPUTER SCIENCE/", +"../Lucian-Academy/Books/Computational English/", +"../Lucian-Academy/Books/Creating and Helping Pedagogues/", +"../Lucian-Academy/Books/Delegate workloads, Lecturer, Recordings/" +, +*/ +%"../Lucian-Academy/Books/ECONOMICS/", +/* +"../Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/", +"../Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/", + +"../Lucian-Academy/Books/IMMORTALITY/", + +"../Lucian-Academy/Books/Lecturer/", +"../Lucian-Academy/Books/MEDICINE/", +"../Lucian-Academy/Books/MEDITATION/", +"../Lucian-Academy/Books/Mind Reading/", +"../Lucian-Academy/Books/PEDAGOGY/", +"../Lucian-Academy/Books/POLITICS/", +"../Lucian-Academy/Books/POPOLOGY/" +*/ +%"../Lucian-Academy/Books/REPLICATION/", +%"../Lucian-Academy/Books/SALES/" +/*, +"../Lucian-Academy/Books/Short Arguments/", +"../Lucian-Academy/Books/SIMULATION/", +"../Lucian-Academy/Books/Time Travel/" +%*/ +%"../Lucian-Academy/Books/books2/" +], + + +findall(_J,(member(K1,K), directory_files(K1,F), + delete_invisibles_etc(F,G), + +%/* +findall(_,(member(H,G),not(string_concat("dot",_,H)), +string_concat(K1,H,H1),open_file_s(H1,[A,B,_,File_term]), + + +flatten([A,"\n",B,"\n",File_term,"\n"],J1), +foldr(string_concat,J1,"",J2), + +split_string(J2,"\n\r.","\n\r.",J3), +delete(J3,"",J4), +length(J4,Length), +Length2 is Length*16, +%trace, + bag2phil2(Length2,J2,2,Br32), + + string_concat("../Lucian-Academy/Books/",K2,K1), + string_concat("Books/",K2,K3), +(exists_directory(K3)->true;make_directory(K3)), + +%string_concat(K4,"/",K3), +%string_concat(K4,".txt",K5), +foldr(string_concat,[K3,H],K5), + 
+
+open_s(K5,write,S),
+write(S,Br32),close(S),
+
+writeln([K5,written])
+
+
+),_J10)
+%*/
+%J10=["Hello, I am Lucian Green. I enjoy programming, writing Philosophy and going for walks. I also compose songs and do acting. Programming is on the topics of compilers, inductive algorithm finders and philosophy. I love animals, and secretly wrote songs about my (human-like) pets. I also enjoy programming about each of my books, on topics such as meditation.
+%"],
+
+
+
+),_),!.
\ No newline at end of file
diff --git a/Philosophy/big_connections_with_bag3_and_mr_short_books_aa540_algs_init.pl b/Philosophy/big_connections_with_bag3_and_mr_short_books_aa540_algs_init.pl new file mode 100644 index 0000000000000000000000000000000000000000..9c4f0c182800be8817895100dfa69d6dd93501a6 --- /dev/null +++ b/Philosophy/big_connections_with_bag3_and_mr_short_books_aa540_algs_init.pl @@ -0,0 +1,18 @@ +% big_connections_with_bag3_and_mr.pl
+
+% This will conclude the philosophy project,
+% writing 16*n joined details.
+
+:-include('bag2philn.pl').
+
+big_connections_with_bag3_and_mr2(File_term,Br32) :-
+
+flatten([File_term],J1),
+foldr(string_concat,J1,"",J2),
+
+split_string(J2,"\n\r.","\n\r.",J3),
+delete(J3,"",J4),
+length(J4,Length),
+Length2 is Length*16,
+%trace,
+	bag2phil2(Length2,J2,2,Br32),!.
\ No newline at end of file
diff --git a/Philosophy/br_alg_gen2.pl b/Philosophy/br_alg_gen2.pl new file mode 100644 index 0000000000000000000000000000000000000000..be94e05afedb68b8a815bddb3c77436057e44120 --- /dev/null +++ b/Philosophy/br_alg_gen2.pl @@ -0,0 +1,75 @@ +% ["Creating and Helping Pedagogues","CREATE AND HELP PEDAGOGUES by Lucian Green Pedagogy Helper - Write on Breasoning - Computer Science 4 of 4.txt",0,algorithms," 61. I knew about or."]
+
+% join two sentences using br alg gen
+
+% br_alg_gen2(Br),writeln1(Br).
+
+% input:
+% 10.
I added that I could write breasonings by substituting "The algorithm finds the output for an input using similar yet different enough phrases" for the algorithm. For example, the writer can find the complete set of non-synonymous words that take turns with a word. The linguistic algorithm writer attempted to cover the humanities framework with ideas by tying critical terms from the real world to the algorithms in one version. Also, she could impressionably convert algorithms to functionalism and back. In a way, it was treating old as new. There were existing ideas and recent research in contemporary times. + +% output: +% [" For example for an input using similar yet different enough phrases"" for the algorithm"," Also, she writer can find the complete set of non-synonymous words that take turns with a word"] + +%:- include('../listprologinterpreter/listprolog'). +:- include('br_alg_gen.pl'). +:-include('../listprologinterpreter/la_maths.pl'). + +br_alg_gen2(Br) :- + +phrase_from_file_s(string(Codes), "../Text-to-Breasonings/file.txt"), + + string_codes(String,Codes), + +br_alg_gen21(String,Br),!. 
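The sentence-joining idea in br_alg_gen2, splicing the prefix of one sentence onto the suffix of another where the two share a word, can be sketched in a simplified single-pivot form (splice_at/4 is a hypothetical name, not part of this codebase):

```prolog
% splice_at(+Pivot,+Sent1,+Sent2,-Joined): keep Sent1 up to and including
% Pivot, then continue with Sent2's words after Pivot. A simplified,
% hypothetical sketch of the join br_alg_gen2 performs on word lists.
splice_at(Pivot,Sent1,Sent2,Joined) :-
	append(Prefix,[Pivot|_],Sent1),
	append(_,[Pivot|Suffix],Sent2),
	append(Prefix,[Pivot|Suffix],Joined).

% ?- splice_at(the,[john,ate,the,apple],[peter,cut,the,pear],J).
% J = [john,ate,the,pear].
```

The real predicate works over word-pair positions found with get_n_item rather than a single shared pivot, which is why its output sentences can read loosely, as in the sample output above.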
+ +br_alg_gen21(String,Br) :- + + SepandPad="&#@~%`$?-+*^,()|.:;=_/[]<>{}\n\r\s\t\\\"!0123456789", % doesn't have "'" xx + %string_codes(String1,Codes), + %string_to_list2(SepandPad,[],SepandPad1), + string_codes(SepandPad,SepandPad1), + + + split_string(String,"\n\r.","\n\r.",Strings), + +findall(File_list1,(member(String1,Strings), string_codes(String1,Codes1), + split_on_substring117(Codes1,SepandPad1,[],File_list1)),File_list), + + +% gets aa bb from Aa Bb +findall(C,(member(AA,File_list), +findall([A,B1],(member(A,AA),downcase_atom(A,B),atom_string(B,B1)),C)),CC), + +%findall(A,member([_,A],C),D), + +% gets join word pairs +%trace, +%flatten(CC,CC1), +%foldr(string_concat,CC1,CC2), +br_gen2(String,Raw,_Chains), + +% finds joined sentences with Aa Bb + +findall(L3,%[G,E,F,H], +(member([E,F],Raw), +%member([C1,C2],CC),get_n_item([C1,C2],E,N1), +member(C3,CC),get_n_item(C3,[_,E],N1), +%numbers(N1,1,[],N11), +length(L1,N1),%append(G,[E|_],C2) +append(L1,_,C3),%append(L2,_,C1), +findall(C4,member([C4,_],L1),C5), +member(D3,CC),%append(_,[F|H],D2)),Br). +get_n_item(D3,[_,F],N2), +%numbers(N1,1,[],N11), +length(L2,N2),%append(G,[E|_],C2) +append(_,L2,D3), +findall(D4,member([D4,_],L2),D5), +foldr(append,[C5,[" "],D5],[],L31), +%trace, +%((%not(not(spec(_))), +%spec(on))->(findall(L312,(member(L311,L31),downcase_atom(L311,L312)),L313),keywords(Keywords),not(intersection(L313,Keywords,[]))%,trace +%);true), +%writeln1(C5),trace, + +foldr(string_concat,L31,"",L3)),Br). + diff --git a/Philosophy/bt-lp2sm_en2test1.pl b/Philosophy/bt-lp2sm_en2test1.pl new file mode 100644 index 0000000000000000000000000000000000000000..30459970c9d0ce623803f40643f9aef43d053962 --- /dev/null +++ b/Philosophy/bt-lp2sm_en2test1.pl @@ -0,0 +1,306 @@ +:-include('../SSI/ssi.pl'). +:-include('../State-Machine-to-List-Prolog/sm_to_lp.pl'). + +% back-translates List Prolog to State Machines + +% bt-lp2sm_en2test(A,B). 
+% A - output: total tests +% B - output: total correct results + +% bt-lp2sm_en2test1(N,B). +% N - input: number to test +% B - output: result + + +%?- retractall(lang(_)),assertz(lang("en")),Lang="en2",numbers(247,1,[],N),findall(["lp2sm_en2test(",N1,",",Functions2b,",",Functions3b,").\n"],(member(N1,N),(once((test(N1,Q,F,R),query_box(Q,Query1,Functions,Functions1),convert_to_grammar_part1(Functions1,[],Functions2,_),trans_alg(Functions2,"en",Lang,F2),retractall(lang(_)),assertz(lang("en")),add_line_numbers_to_algorithm1(Functions2,Functions2a),find_pred_numbers(Functions2a,[],Pred_numbers),retractall(pred_numbers(_)),assertz(pred_numbers(Pred_numbers)),find_state_machine1(Functions2a,Functions3,Pred_numbers),sm_to_lp(Functions3,Functions4),term_to_atom(F2,Functions2b),term_to_atom(Functions3,Functions3b))))),A),foldr(append,A,B1),foldr(string_concat,B1,B),writeln(B). + + +lp2sm_en2test(1,[[["n","query box 1"],[],":-",[[["n","function"]]]]],[[0,[n,query_box_1],[],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,function],[]]]]]). +lp2sm_en2test(2,[[["n","query box 1"],[["v",c]],":-",[[["n","function"],[1,1,["v",c]]]]]],[[0,[n,query_box_1],[[v,c]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,function],[1,1,[v,c]]]]]]). +lp2sm_en2test(3,[[["n","query box 1"],[["v",c]],":-",[[["n","function"],[1,1,["v",c]]]]]],[[0,[n,query_box_1],[[v,c]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,function],[1,1,[v,c]]]]]]). +lp2sm_en2test(4,[[["n","query box 1"],[["v",a]],":-",[[["n","append 1"],[["v",a]]]]]],[[0,[n,query_box_1],[[v,a]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,append1],[[v,a]]]]]]). +lp2sm_en2test(5,[[["n","query box 1"],[["v",n]],":-",[[["n","count"],[1,["v",n]]]]]],[[0,[n,query_box_1],[[v,n]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,count],[1,[v,n]]]]]]). 
+lp2sm_en2test(6,[[["n","query box 1"],[["v",n]],":-",[[["n","count"],[0,["v",n]]]]]],[[0,[n,query_box_1],[[v,n]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,count],[0,[v,n]]]]]]). +lp2sm_en2test(7,[[["n","query box 1"],[["v",l]],":-",[[["n","reverse"],[[1,2,3],[],["v",l]]]]]],[[0,[n,query_box_1],[[v,l]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,reverse],[[1,2,3],[],[v,l]]]]]]). +lp2sm_en2test(8,[[["n","query box 1"],[],":-",[[["n","grammar 1"],["apple"]]]]],[[0,[n,query_box_1],[],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,grammar1],["apple"]]]]]). +lp2sm_en2test(9,[[["n","query box 1"],[],":-",[[["n","grammar 1"],["aaa"]]]]],[[0,[n,query_box_1],[],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,grammar1],["aaa"]]]]]). +lp2sm_en2test(10,[[["n","query box 1"],[["v",t]],":-",[[["n","grammar 1"],["aa",["v",t]]]]]],[[0,[n,query_box_1],[[v,t]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,grammar1],["aa",[v,t]]]]]]). +lp2sm_en2test(11,[[["n","query box 1"],[["v",t],["v",u]],":-",[[["n","grammar 1"],["aa",["v",t],["v",u]]]]]],[[0,[n,query_box_1],[[v,t],[v,u]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,grammar1],["aa",[v,t],[v,u]]]]]]). +lp2sm_en2test(12,[[["n","query box 1"],[],":-",[[["n","grammar 1"],["aa"]]]]],[[0,[n,query_box_1],[],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,grammar1],["aa"]]]]]). +lp2sm_en2test(13,[[["n","query box 1"],[["v",t]],":-",[[["n","grammar 1"],["[a,a]",["v",t]]]]]],[[0,[n,query_box_1],[[v,t]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,grammar1],["[a,a]",[v,t]]]]]]). +lp2sm_en2test(14,[[["n","query box 1"],[["v",t]],":-",[[["n","grammar 1"],["[a]",["v",t]]]]]],[[0,[n,query_box_1],[[v,t]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,grammar1],["[a]",[v,t]]]]]]). 
+lp2sm_en2test(15,[[["n","query box 1"],[["v",t]],":-",[[["n","grammar 1"],["[[""aa,]"",b,""c"",[]],1]",["v",t]]]]]],[[0,[n,query_box_1],[[v,t]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,grammar1],["[[""aa,]"",b,""c"",[]],1]",[v,t]]]]]]). +lp2sm_en2test(16,[[["n","query box 1"],[],":-",[[["n","grammar 1"],["john ate the apple"]]]]],[[0,[n,query_box_1],[],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,grammar1],["john ate the apple"]]]]]). +lp2sm_en2test(17,[[["n","query box 1"],[["v",t]],":-",[[["n","grammar 1"],["aaa1 ,-'! a? b! b.",["v",t]]]]]],[[0,[n,query_box_1],[[v,t]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,grammar1],["aaa1 ,-'! a? b! b.",[v,t]]]]]]). +lp2sm_en2test(18,[[["n","query box 1"],[["v",c]],":-",[[["n","grammar 1"],["what is 1+11",["v",c]]]]]],[[0,[n,query_box_1],[[v,c]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,grammar1],["what is 1+11",[v,c]]]]]]). +lp2sm_en2test(19,[[["n","query box 1"],[["v",s]],":-",[[["n","positivityscore"],[["not","you","like","a","walk"],["would","you","like","a","walk"],0,["v",s]]]]]],[[0,[n,query_box_1],[[v,s]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,positivityscore],[["not","you","like","a","walk"],["would","you","like","a","walk"],0,[v,s]]]]]]). +lp2sm_en2test(20,[[["n","query box 1"],[["v",c]],":-",[[["n","function"],[1,1,["v",c]]]]]],[[0,[n,query_box_1],[[v,c]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,function],[1,1,[v,c]]]]]]). +lp2sm_en2test(21,[[["n","query box 1"],[["v",t]],":-",[[["n","grammar 1"],["ate",["v",t]]]]]],[[0,[n,query_box_1],[[v,t]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,grammar1],["ate",[v,t]]]]]]). 
+lp2sm_en2test(22,[[["n","query box 1"],[],":-",[[["n","grammar 1"],["peter cut the pear"]]]]],[[0,[n,query_box_1],[],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,grammar1],["peter cut the pear"]]]]]). +lp2sm_en2test(23,[[["n","query box 1"],[["v",s]],":-",[[["n","agree"],[["a","b","c"],["d","e","f"],["a","b","c","g"],["v",s]]]]]],[[0,[n,query_box_1],[[v,s]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,agree],[["a","b","c"],["d","e","f"],["a","b","c","g"],[v,s]]]]]]). +lp2sm_en2test(24,[[["n","query box 1"],[["v",s]],":-",[[["n","modus ponens"],["a",[["a","b"],["c","d"],["e","f"]],["v",s]]]]]],[[0,[n,query_box_1],[[v,s]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,modus_ponens],["a",[["a","b"],["c","d"],["e","f"]],[v,s]]]]]]). +lp2sm_en2test(25,[[["n","query box 1"],[["v",s]],":-",[[["n","grammar 1"],["aaa1 ,-' +a +b +b +","aaa1 ,-' +a +b +a",["v",s]]]]]],[[0,[n,query_box_1],[[v,s]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,grammar1],["aaa1 ,-' +a +b +b +","aaa1 ,-' +a +b +a",[v,s]]]]]]). +lp2sm_en2test(26,[[["n","query box 1"],[["v",s]],":-",[[["n","append 1"],[["a"],["b"],["v",s]]]]]],[[0,[n,query_box_1],[[v,s]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,append1],[["a"],["b"],[v,s]]]]]]). +lp2sm_en2test(27,[[["n","query box 1"],[],":-",[[["n","equals 11"],["a","a"]]]]],[[0,[n,query_box_1],[],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,equals11],["a","a"]]]]]). +lp2sm_en2test(28,[[["n","query box 1"],[],":-",[[["n","number 11"],[1]]]]],[[0,[n,query_box_1],[],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,number11],[1]]]]]). +lp2sm_en2test(29,[[["n","query box 1"],[["v",c]],":-",[[["n","minus 11"],[[1,2,3],[3],["v",c]]]]]],[[0,[n,query_box_1],[[v,c]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,minus11],[[1,2,3],[3],[v,c]]]]]]). 
+lp2sm_en2test(30,[[["n","query box 1"],[["v",b]],":-",[[["n","if 11"],[1,["v",b]]]]]],[[0,[n,query_box_1],[[v,b]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,if11],[1,[v,b]]]]]]). +lp2sm_en2test(31,[[["n","query box 1"],[],":-",[[["n","not 11"],[1]]]]],[[0,[n,query_box_1],[],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,not11],[1]]]]]). +lp2sm_en2test(32,[[["n","query box 1"],[],":-",[[["n","or 11"],[1]]]]],[[0,[n,query_box_1],[],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,or11],[1]]]]]). +lp2sm_en2test(33,[[["n","query box 1"],[],":-",[[["n","downpipe"],[3,1,[[3,[4,2]],[2,[3,1]]]]]]]],[[0,[n,query_box_1],[],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,downpipe],[3,1,[[3,[4,2]],[2,[3,1]]]]]]]]). +lp2sm_en2test(34,[[["n","query box 1"],[["v",c]],":-",[[["n","getitemn"],[3,[1,2,3],["v",c]]]]]],[[0,[n,query_box_1],[[v,c]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,getitemn],[3,[1,2,3],[v,c]]]]]]). +lp2sm_en2test(35,[[["n","query box 1"],[],":-",[[["n","identical"],[1,2]]]]],[[0,[n,query_box_1],[],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,identical],[1,2]]]]]). +lp2sm_en2test(36,[[["n","query box 1"],[],":-",[[["n","associative"],[1,2,3]]]]],[[0,[n,query_box_1],[],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,associative],[1,2,3]]]]]). +lp2sm_en2test(37,[[["n","query box 1"],[["v",l]],":-",[[["n","length 1"],[[1],0,["v",l]]]]]],[[0,[n,query_box_1],[[v,l]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,length1],[[1],0,[v,l]]]]]]). +lp2sm_en2test(38,[[["n","query box 1"],[["v",d]],":-",[[["n","optimise 1"],[[[5,4],[3,2],[1,0]],["v",d]]]]]],[[0,[n,query_box_1],[[v,d]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,optimise1],[[[5,4],[3,2],[1,0]],[v,d]]]]]]). 
+lp2sm_en2test(39,[[["n","query box 1"],[],":-",[[["n","member 1a"],[1,[1,2]]]]]],[[0,[n,query_box_1],[],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,member1a],[1,[1,2]]]]]]). +lp2sm_en2test(40,[[["n","query box 1"],[["v",a]],":-",[[["n","minus 1"],[[1,2,3],[1,2],["v",a]]]]]],[[0,[n,query_box_1],[[v,a]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,minus1],[[1,2,3],[1,2],[v,a]]]]]]). +lp2sm_en2test(41,[[["n","query box 1"],[],":-",[[["n","part of string"],[[1,2,3,4],[2,3]]]]]],[[0,[n,query_box_1],[],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,part_of_string],[[1,2,3,4],[2,3]]]]]]). +lp2sm_en2test(42,[[["n","query box 1"],[["v",a],["v",b]],":-",[[["n","find all"],[["v",b],[[["n","or 12"],[["v",b]]]],["v",a]]]]]],[[0,[n,query_box_1],[[v,a],[v,b]],":-",[[0,[on_true,1],[go_after,-2],[on_false,-3],[go_to_predicates,-],[n,findall],[[v,b],[v,a]]],[1,[on_true,2],[go_after,[findall_exit_function,0]],[on_false,[findall_fail_function,0]],[go_to_predicates,-],[n,"[]"]],[2,[on_true,[exit_function,1]],[go_after,-],[on_false,[fail_function,1]],[go_to_predicates,-],[n,or12],[[v,b]]]]]]). +lp2sm_en2test(43,[[["n","query box 1"],[["v",a]],":-",[[["n","intersection 1"],[[1,2,3],[3,4,5],[],["v",a]]]]]],[[0,[n,query_box_1],[[v,a]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,intersection1],[[1,2,3],[3,4,5],[],[v,a]]]]]]). +lp2sm_en2test(44,[[["n","query box 1"],[["v",a]],":-",[[["n","delete 2"],[[1,1,2],1,[],["v",a]]]]]],[[0,[n,query_box_1],[[v,a]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,delete2],[[1,1,2],1,[],[v,a]]]]]]). +lp2sm_en2test(45,[[["n","query box 1"],[],":-",[[["n","greaterthan"],[3,2]]]]],[[0,[n,query_box_1],[],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,greaterthan],[3,2]]]]]). 
+lp2sm_en2test(46,[[["n","query box 1"],[["v",c]],":-",[[["n","conjunction"],["true","false",["v",c]]]]]],[[0,[n,query_box_1],[[v,c]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,conjunction],["true","false",[v,c]]]]]]). +lp2sm_en2test(47,[[["n","query box 1"],[["v",l]],":-",[[["n","sum"],[[3,1,2],0,["v",l]]]]]],[[0,[n,query_box_1],[[v,l]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,sum],[[3,1,2],0,[v,l]]]]]]). +lp2sm_en2test(48,[[["n","query box 1"],[["v",l]],":-",[[["n","sort 0"],[[9,4,8,2,1,5,7,6,3,10],["v",l]]]]]],[[0,[n,query_box_1],[[v,l]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,sort0],[[9,4,8,2,1,5,7,6,3,10],[v,l]]]]]]). +lp2sm_en2test(49,[[["n","query box 1"],[["v",m]],":-",[[["n","maximum 0"],[[2,1,3,5,-1],["v",m]]]]]],[[0,[n,query_box_1],[[v,m]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,maximum0],[[2,1,3,5,-1],[v,m]]]]]]). +lp2sm_en2test(50,[[["n","query box 1"],[["v",c]],":-",[[["n","disjunction"],["true","false",["v",c]]]]]],[[0,[n,query_box_1],[[v,c]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,disjunction],["true","false",[v,c]]]]]]). +lp2sm_en2test(51,[[["n","query box 1"],[["v",c]],":-",[[["n","expressionnotheadache"],["true",["v",c]]]]]],[[0,[n,query_box_1],[[v,c]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,expressionnotheadache],["true",[v,c]]]]]]). +lp2sm_en2test(52,[[["n","query box 1"],[["v",c]],":-",[[["n","mainrole"],[7,["v",c]]]]]],[[0,[n,query_box_1],[[v,c]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,mainrole],[7,[v,c]]]]]]). +lp2sm_en2test(53,[[["n","query box 1"],[["v",c]],":-",[[["n","function"],[[["n","function 2"],[2]],1,1,["v",c]]]]]],[[0,[n,query_box_1],[[v,c]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,function],[[[n,function2],[2]],1,1,[v,c]]]]]]). 
+lp2sm_en2test(54,[[["n","query box 1"],[["v",c]],":-",[[["n","function"],[[["n","function 2"],[2]],1,1,["v",c]]]]]],[[0,[n,query_box_1],[[v,c]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,function],[[[n,function2],[2]],1,1,[v,c]]]]]]). +lp2sm_en2test(55,[[["n","query box 1"],[["v",c]],":-",[[["n","test 1"],[["v",c]]]]]],[[0,[n,query_box_1],[[v,c]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,test1],[[v,c]]]]]]). +lp2sm_en2test(56,[[["n","query box 1"],[["v",d]],":-",[[["n","map"],[["n","add"],[1,2,3],0,["v",d]]]]]],[[0,[n,query_box_1],[[v,d]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,map],[[n,add],[1,2,3],0,[v,d]]]]]]). +lp2sm_en2test(57,[[["n","query box 1"],[["v",d]],":-",[[["n","find all 1"],[["n","plusone"],[1,2,3],[],["v",d]]]]]],[[0,[n,query_box_1],[[v,d]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,findall1],[[n,plusone],[1,2,3],[],[v,d]]]]]]). +lp2sm_en2test(58,[[["n","query box 1"],[["v",d]],":-",[[["n","find all 1"],[["n","a to c"],["a","b","a"],[],["v",d]]]]]],[[0,[n,query_box_1],[[v,d]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,findall1],[[n,a_to_c],["a","b","a"],[],[v,d]]]]]]). +lp2sm_en2test(59,[[["n","query box 1"],[["v",n]],":-",[[["n","count"],[1,["v",n]]]]]],[[0,[n,query_box_1],[[v,n]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,count],[1,[v,n]]]]]]). +lp2sm_en2test(60,[[["n","query box 1"],[],":-",[[["n","a"]]]]],[[0,[n,query_box_1],[],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,a],[]]]]]). +lp2sm_en2test(61,[[["n","query box 1"],[["v",l]],":-",[[["n","add"],[[1,2,3],3,[],["v",l]]]]]],[[0,[n,query_box_1],[[v,l]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,add],[[1,2,3],3,[],[v,l]]]]]]). 
+lp2sm_en2test(62,[[["n","query box 1"],[["v",l]],":-",[[["n","add"],[[1],[2,3],["v",l]]]]]],[[0,[n,query_box_1],[[v,l]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,add],[[1],[2,3],[v,l]]]]]]). +lp2sm_en2test(63,[[["n","query box 1"],[["v",b]],":-",[[["n","add"],[1,["v",b]]]]]],[[0,[n,query_box_1],[[v,b]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,add],[1,[v,b]]]]]]). +lp2sm_en2test(64,[[["n","query box 1"],[["v",b]],":-",[[["n","add 0"],[[1,2],["v",b]]]]]],[[0,[n,query_box_1],[[v,b]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,add0],[[1,2],[v,b]]]]]]). +lp2sm_en2test(65,[[["n","query box 1"],[["v",b]],":-",[[["n","add 0"],[[1],["v",b]]]]]],[[0,[n,query_box_1],[[v,b]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,add0],[[1],[v,b]]]]]]). +lp2sm_en2test(66,[[["n","query box 1"],[],":-",[[["n","addorsubtract 1"],[2,1,1]]]]],[[0,[n,query_box_1],[],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,addorsubtract1],[2,1,1]]]]]). +lp2sm_en2test(67,[[["n","query box 1"],[],":-",[[["n","addorsubtract 1"],[2,1,1]]]]],[[0,[n,query_box_1],[],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,addorsubtract1],[2,1,1]]]]]). +lp2sm_en2test(68,[[["n","query box 1"],[],":-",[[["n","addorsubtract 1"],[2,1,1]]]]],[[0,[n,query_box_1],[],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,addorsubtract1],[2,1,1]]]]]). +lp2sm_en2test(69,[[["n","query box 1"],[],":-",[[["n","add 0"],[2,1]]]]],[[0,[n,query_box_1],[],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,add0],[2,1]]]]]). +lp2sm_en2test(70,[[["n","query box 1"],[],":-",[[["n","add 0"],[1,2]]]]],[[0,[n,query_box_1],[],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,add0],[1,2]]]]]). 
+lp2sm_en2test(71,[[["n","query box 1"],[],":-",[[["n","add 0"],[1,2]]]]],[[0,[n,query_box_1],[],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,add0],[1,2]]]]]). +lp2sm_en2test(72,[[["n","query box 1"],[["v",b]],":-",[[["n","add 0"],[1,["v",b]]]]]],[[0,[n,query_box_1],[[v,b]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,add0],[1,[v,b]]]]]]). +lp2sm_en2test(73,[[["n","query box 1"],[["v",c]],":-",[[["n","add 0"],[1,1,["v",c]]]]]],[[0,[n,query_box_1],[[v,c]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,add0],[1,1,[v,c]]]]]]). +lp2sm_en2test(74,[[["n","query box 1"],[["v",c]],":-",[[["n","add 0"],[[1,2],["v",c]]]]]],[[0,[n,query_box_1],[[v,c]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,add0],[[1,2],[v,c]]]]]]). +lp2sm_en2test(75,[[["n","query box 1"],[["v",c]],":-",[[["n","add 0"],[[],["v",c]]]]]],[[0,[n,query_box_1],[[v,c]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,add0],[[],[v,c]]]]]]). +lp2sm_en2test(76,[[["n","query box 1"],[["v",b]],":-",[[["n","implies 2"],[1,["v",b]]]]]],[[0,[n,query_box_1],[[v,b]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,implies2],[1,[v,b]]]]]]). +lp2sm_en2test(77,[[["n","query box 1"],[["v",b]],":-",[[["n","find all 1"],[[1,2,3],["v",b]]]]]],[[0,[n,query_box_1],[[v,b]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,findall1],[[1,2,3],[v,b]]]]]]). +lp2sm_en2test(78,[[["n","query box 1"],[["v",b]],":-",[[["n","map list 1"],[[1,2,3],["v",b]]]]]],[[0,[n,query_box_1],[[v,b]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,maplist1],[[1,2,3],[v,b]]]]]]). +lp2sm_en2test(79,[[["n","query box 1"],[["v",b]],":-",[[["n","equals 41"],[[1,2,3],["v",b]]]]]],[[0,[n,query_box_1],[[v,b]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,equals41],[[1,2,3],[v,b]]]]]]). 
+lp2sm_en2test(80,[[["n","query box 1"],[["v",a],["v",b],["v",c],["v",d]],":-",[[["n","equals 41"],[["v",a],["v",d],["v",c],["v",b]]]]]],[[0,[n,query_box_1],[[v,a],[v,b],[v,c],[v,d]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,equals41],[[v,a],[v,d],[v,c],[v,b]]]]]]). +lp2sm_en2test(81,[[["n","query box 1"],[["v",a],["v",b],["v",c]],":-",[[["n","equals 41"],[["v",a],["v",c],["v",b]]]]]],[[0,[n,query_box_1],[[v,a],[v,b],[v,c]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,equals41],[[v,a],[v,c],[v,b]]]]]]). +lp2sm_en2test(82,[[["n","query box 1"],[["v",a],["v",b]],":-",[[["n","equals 41"],[["v",a],["v",b]]]]]],[[0,[n,query_box_1],[[v,a],[v,b]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,equals41],[[v,a],[v,b]]]]]]). +lp2sm_en2test(83,[[["n","query box 1"],[],":-",[[["n","equals 41"]]]]],[[0,[n,query_box_1],[],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,equals41],[]]]]]). +lp2sm_en2test(84,[[["n","query box 1"],[["v",a],["v",b],["v",c]],":-",[[["n","equals 41"],[["v",a],["v",c],["v",b]]]]]],[[0,[n,query_box_1],[[v,a],[v,b],[v,c]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,equals41],[[v,a],[v,c],[v,b]]]]]]). +lp2sm_en2test(85,[[["n","query box 1"],[["v",a],["v",b]],":-",[[["n","equals 41"],[["v",a],["v",b]]]]]],[[0,[n,query_box_1],[[v,a],[v,b]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,equals41],[[v,a],[v,b]]]]]]). +lp2sm_en2test(86,[[["n","query box 1"],[["v",a],["v",b]],":-",[[["n","equals 41"],[["v",a],["v",b]]]]]],[[0,[n,query_box_1],[[v,a],[v,b]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,equals41],[[v,a],[v,b]]]]]]). +lp2sm_en2test(87,[[["n","query box 1"],[["v",a]],":-",[[["n","equals 41"],[["v",a]]]]]],[[0,[n,query_box_1],[[v,a]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,equals41],[[v,a]]]]]]). 
+lp2sm_en2test(88,[[["n","query box 1"],[["v",a],["v",b]],":-",[[["n","equals 41"],[["v",a],["v",b]]]]]],[[0,[n,query_box_1],[[v,a],[v,b]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,equals41],[[v,a],[v,b]]]]]]). +lp2sm_en2test(89,[[["n","query box 1"],[["v",a],["v",b]],":-",[[["n","equals 41"],[["v",a],["v",b]]]]]],[[0,[n,query_box_1],[[v,a],[v,b]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,equals41],[[v,a],[v,b]]]]]]). +lp2sm_en2test(90,[[["n","query box 1"],[],":-",[[["n","equals 41"]]]]],[[0,[n,query_box_1],[],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,equals41],[]]]]]). +lp2sm_en2test(91,[[["n","query box 1"],[],":-",[[["n","equals 41"],[[1,2,3]]]]]],[[0,[n,query_box_1],[],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,equals41],[[1,2,3]]]]]]). +lp2sm_en2test(92,[[["n","query box 1"],[["v",a],["v",b],["v",d]],":-",[[["n","equals 41"],[["v",a],["v",b],["v",d]]]]]],[[0,[n,query_box_1],[[v,a],[v,b],[v,d]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,equals41],[[v,a],[v,b],[v,d]]]]]]). +lp2sm_en2test(93,[[["n","query box 1"],[["v",b]],":-",[[["n","map list 1"],[[[1],[2],[3]],["v",b]]]]]],[[0,[n,query_box_1],[[v,b]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,maplist1],[[[1],[2],[3]],[v,b]]]]]]). +lp2sm_en2test(94,[[["n","query box 1"],[["v",b]],":-",[[["n","map list 1"],[[[[1]],[[2]],[[3]]],["v",b]]]]]],[[0,[n,query_box_1],[[v,b]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,maplist1],[[[[1]],[[2]],[[3]]],[v,b]]]]]]). +lp2sm_en2test(95,[[["n","query box 1"],[["v",b]],":-",[[["n","find all 1"],[[1,2,3],["v",b]]]]]],[[0,[n,query_box_1],[[v,b]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,findall1],[[1,2,3],[v,b]]]]]]). 
+lp2sm_en2test(96,[[["n","query box 1"],[["v",b]],":-",[[["n","equals 41"],[1,["v",b]]]]]],[[0,[n,query_box_1],[[v,b]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,equals41],[1,[v,b]]]]]]). +lp2sm_en2test(97,[[["n","query box 1"],[["v",a]],":-",[[["n","equals 41"],[["v",a]]]]]],[[0,[n,query_box_1],[[v,a]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,equals41],[[v,a]]]]]]). +lp2sm_en2test(98,[[["n","query box 1"],[["v",a],["v",b]],":-",[[["n","equals 41"],[[[1,2],3,4],["v",a],["v",b]]]]]],[[0,[n,query_box_1],[[v,a],[v,b]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,equals41],[[[1,2],3,4],[v,a],[v,b]]]]]]). +lp2sm_en2test(99,[[["n","query box 1"],[["v",b]],":-",[[["n","equals 41"],[1,["v",b]]]]]],[[0,[n,query_box_1],[[v,b]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,equals41],[1,[v,b]]]]]]). +lp2sm_en2test(100,[[["n","query box 1"],[["v",b],["v",c]],":-",[[["n","equals 41"],[1,["v",c],["v",b]]]]]],[[0,[n,query_box_1],[[v,b],[v,c]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,equals41],[1,[v,c],[v,b]]]]]]). +lp2sm_en2test(101,[[["n","query box 1"],[["v",b],["v",c]],":-",[[["n","equals 41"],[1,["v",c],["v",b]]]]]],[[0,[n,query_box_1],[[v,b],[v,c]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,equals41],[1,[v,c],[v,b]]]]]]). +lp2sm_en2test(102,[[["n","query box 1"],[["v",b1],["v",b2],["v",b3]],":-",[[["n","equals 41"],[1,[2,3],["v",b1],["v",b2],["v",b3]]]]]],[[0,[n,query_box_1],[[v,b1],[v,b2],[v,b3]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,equals41],[1,[2,3],[v,b1],[v,b2],[v,b3]]]]]]). 
+lp2sm_en2test(103,[[["n","query box 1"],[["v",b1],["v",b2],["v",b3]],":-",[[["n","equals 41"],[1,[2,3],["v",b1],["v",b2],["v",b3]]]]]],[[0,[n,query_box_1],[[v,b1],[v,b2],[v,b3]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,equals41],[1,[2,3],[v,b1],[v,b2],[v,b3]]]]]]). +lp2sm_en2test(104,[[["n","query box 1"],[["v",b]],":-",[[["n","find all 1"],[[[1,2],[3,4]],["v",b]]]]]],[[0,[n,query_box_1],[[v,b]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,findall1],[[[1,2],[3,4]],[v,b]]]]]]). +lp2sm_en2test(105,[[["n","query box 1"],[["v",b]],":-",[[["n","member 2a"],[["v",b],[1,11,111]]]]]],[[0,[n,query_box_1],[[v,b]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,member2a],[[v,b],[1,11,111]]]]]]). +lp2sm_en2test(106,[[["n","query box 1"],[["v",b]],":-",[[["n","call 1a"],[["v",b],[1,11,111]]]]]],[[0,[n,query_box_1],[[v,b]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,call1a],[[v,b],[1,11,111]]]]]]). +lp2sm_en2test(107,[[["n","query box 1"],[["v",b]],":-",[[["n","call 1b"],[[1,11,111],["v",b]]]]]],[[0,[n,query_box_1],[[v,b]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,call1b],[[1,11,111],[v,b]]]]]]). +lp2sm_en2test(108,[[["n","query box 1"],[["v",b]],":-",[[["n","call 1b"],[[1,11,111],["v",b]]]]]],[[0,[n,query_box_1],[[v,b]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,call1b],[[1,11,111],[v,b]]]]]]). +lp2sm_en2test(109,[[["n","query box 1"],[["v",b]],":-",[[["n","middle"],[2,["v",b]]]]]],[[0,[n,query_box_1],[[v,b]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,middle],[2,[v,b]]]]]]). +lp2sm_en2test(110,[[["n","query box 1"],[["v",b]],":-",[[["n","level with"],[170,["v",b]]]]]],[[0,[n,query_box_1],[[v,b]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,level_with],[170,[v,b]]]]]]). 
+lp2sm_en2test(111,[[["n","query box 1"],[["v",a]],":-",[[["n","tra las"],[5,["v",a]]]]]],[[0,[n,query_box_1],[[v,a]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,tra_las],[5,[v,a]]]]]]). +lp2sm_en2test(112,[[["n","query box 1"],[["v",a]],":-",[[["n","final gong"],[5,["v",a]]]]]],[[0,[n,query_box_1],[[v,a]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,final_gong],[5,[v,a]]]]]]). +lp2sm_en2test(113,[[["n","query box 1"],[["v",b]],":-",[[["n","bedroom to garden"],["bedroom",["v",b]]]]]],[[0,[n,query_box_1],[[v,b]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,bedroom_to_garden],["bedroom",[v,b]]]]]]). +lp2sm_en2test(114,[[["n","query box 1"],[["v",a]],":-",[[["n","stop at top"],[["-","-","-","top"],["v",a]]]]]],[[0,[n,query_box_1],[[v,a]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,stop_at_top],[["-","-","-","top"],[v,a]]]]]]). +lp2sm_en2test(115,[[["n","query box 1"],[["v",result]],":-",[[["n","function"],[[["n1","a"]],[["a",5]],[],["v",result]]]]]],[[0,[n,query_box_1],[[v,result]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,function],[[["n1","a"]],[["a",5]],[],[v,result]]]]]]). +lp2sm_en2test(116,[[["n","query box 1"],[["v",t]],":-",[[["n","grammar 1"],[".aaa.bbb.",[".","?"],["v",t]]]]]],[[0,[n,query_box_1],[[v,t]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,grammar1],[".aaa.bbb.",[".","?"],[v,t]]]]]]). +lp2sm_en2test(117,[[["n","query box 1"],[["v",t]],":-",[[["n","grammar 1"],["a a. a ",[" ","."],["v",t]]]]]],[[0,[n,query_box_1],[[v,t]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,grammar1],["a a. a ",[" ","."],[v,t]]]]]]). +lp2sm_en2test(118,[[["n","query box 1"],[],":-",[[["n","query pred"]]]]],[[0,[n,query_box_1],[],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,query_pred],[]]]]]). 
+lp2sm_en2test(119,[[["n","query box 1"],[],":-",[[["n","count"],[2]]]]],[[0,[n,query_box_1],[],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,count],[2]]]]]). +lp2sm_en2test(120,[[["n","query box 1"],[["v",a],["v",b]],":-",[[["n","function"],[1,["v",b],2,["v",a]]]]]],[[0,[n,query_box_1],[[v,a],[v,b]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,function],[1,[v,b],2,[v,a]]]]]]). +lp2sm_en2test(121,[[["n","query box 1"],[["v",a]],":-",[[["n","append 1"],[["v",a]]]]]],[[0,[n,query_box_1],[[v,a]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,append1],[[v,a]]]]]]). +lp2sm_en2test(122,[[["n","query box 1"],[],":-",[[["n","equals 4 on1"]]]]],[[0,[n,query_box_1],[],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,equals4_on1],[]]]]]). +lp2sm_en2test(123,[[["n","query box 1"],[["v",b],["v",c]],":-",[[["n","equals 41"],[[["v",b],"|",["v",c]]]]]]],[[0,[n,query_box_1],[[v,b],[v,c]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,equals41],[[[v,b],"|",[v,c]]]]]]]). +lp2sm_en2test(124,[[["n","query box 1"],[["v",a],["v",b],["v",c],["v",d]],":-",[[["n","equals 41"],[[[["v",a],"|",["v",d]],["v",c],"|",["v",b]]]]]]],[[0,[n,query_box_1],[[v,a],[v,b],[v,c],[v,d]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,equals41],[[[[v,a],"|",[v,d]],[v,c],"|",[v,b]]]]]]]). +lp2sm_en2test(125,[[["n","query box 1"],[["v",a],["v",b],["v",c]],":-",[[["n","equals 41"],[[[["v",a],["v",c]],"|",["v",b]]]]]]],[[0,[n,query_box_1],[[v,a],[v,b],[v,c]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,equals41],[[[[v,a],[v,c]],"|",[v,b]]]]]]]). +lp2sm_en2test(126,[[["n","query box 1"],[["v",a],["v",b]],":-",[[["n","equals 41"],[[["v",a],"|",["v",b]]]]]]],[[0,[n,query_box_1],[[v,a],[v,b]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,equals41],[[[v,a],"|",[v,b]]]]]]]). 
+lp2sm_en2test(127,[[["n","query box 1"],[["v",a],["v",b],["v",c],["v",d]],":-",[[["n","equals 41"],[[["v",a],["v",c],"|",["v",b],["v",d]]]]]]],[[0,[n,query_box_1],[[v,a],[v,b],[v,c],[v,d]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,equals41],[[[v,a],[v,c],"|",[v,b],[v,d]]]]]]]). +lp2sm_en2test(128,[[["n","query box 1"],[["v",a],["v",b],["v",c]],":-",[[["n","equals 41"],[[[["v",a]],["v",c],"|",["v",b]]]]]]],[[0,[n,query_box_1],[[v,a],[v,b],[v,c]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,equals41],[[[[v,a]],[v,c],"|",[v,b]]]]]]]). +lp2sm_en2test(129,[[["n","query box 1"],[["v",a],["v",b]],":-",[[["n","equals 41"],[[["v",a],"|",["v",b]]]]]]],[[0,[n,query_box_1],[[v,a],[v,b]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,equals41],[[[v,a],"|",[v,b]]]]]]]). +lp2sm_en2test(130,[[["n","query box 1"],[["v",a],["v",b]],":-",[[["n","equals 41"],[[["v",a],"|",[["v",b]]]]]]]],[[0,[n,query_box_1],[[v,a],[v,b]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,equals41],[[[v,a],"|",[[v,b]]]]]]]]). +lp2sm_en2test(131,[[["n","query box 1"],[["v",a]],":-",[[["n","equals 41"],[[["v",a]]]]]]],[[0,[n,query_box_1],[[v,a]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,equals41],[[[v,a]]]]]]]). +lp2sm_en2test(132,[[["n","query box 1"],[["v",a],["v",b]],":-",[[["n","equals 41"],[[["v",a],["v",b]]]]]]],[[0,[n,query_box_1],[[v,a],[v,b]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,equals41],[[[v,a],[v,b]]]]]]]). +lp2sm_en2test(133,[[["n","query box 1"],[["v",a],["v",b]],":-",[[["n","equals 41"],[[["v",a],["v",b]]]]]]],[[0,[n,query_box_1],[[v,a],[v,b]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,equals41],[[[v,a],[v,b]]]]]]]). 
+lp2sm_en2test(134,[[["n","query box 1"],[["v",a],["v",b],["v",c],["v",d]],":-",[[["n","equals 41"],[[["v",a],["v",c],"|",["v",b],"|",["v",d]]]]]]],[[0,[n,query_box_1],[[v,a],[v,b],[v,c],[v,d]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,equals41],[[[v,a],[v,c],"|",[v,b],"|",[v,d]]]]]]]). +lp2sm_en2test(135,[[["n","query box 1"],[],":-",[[["n","equals 41"],[[1,2,3]]]]]],[[0,[n,query_box_1],[],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,equals41],[[1,2,3]]]]]]). +lp2sm_en2test(136,[[["n","query box 1"],[["v",a],["v",b],["v",d]],":-",[[["n","equals 41"],[[["v",a],"|",[["v",b],"|",["v",d]]]]]]]],[[0,[n,query_box_1],[[v,a],[v,b],[v,d]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,equals41],[[[v,a],"|",[[v,b],"|",[v,d]]]]]]]]). +lp2sm_en2test(137,[[["n","query box 1"],[["v",b]],":-",[[["n","equals 41"],[["v",b]]]]]],[[0,[n,query_box_1],[[v,b]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,equals41],[[v,b]]]]]]). +lp2sm_en2test(138,[[["n","query box 1"],[],":-",[[["n","equals 41"]]]]],[[0,[n,query_box_1],[],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,equals41],[]]]]]). +lp2sm_en2test(139,[[["n","query box 1"],[["v",a],["v",d]],":-",[[["n","append 1"],[["v",a],["v",d]]]]]],[[0,[n,query_box_1],[[v,a],[v,d]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,append1],[[v,a],[v,d]]]]]]). +lp2sm_en2test(140,[[["n","query box 1"],[["v",b]],":-",[[["n","equals 41"],[[1,2,3],["v",b]]]]]],[[0,[n,query_box_1],[[v,b]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,equals41],[[1,2,3],[v,b]]]]]]). +lp2sm_en2test(141,[[["n","query box 1"],[["v",b]],":-",[[["n","equals 41"],[[1,2,3],["v",b]]]]]],[[0,[n,query_box_1],[[v,b]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,equals41],[[1,2,3],[v,b]]]]]]). 
+lp2sm_en2test(142,[[["n","query box 1"],[["v",b]],":-",[[["n","equals 41"],[[1,2,3],["v",b]]]]]],[[0,[n,query_box_1],[[v,b]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,equals41],[[1,2,3],[v,b]]]]]]). +lp2sm_en2test(143,[[["n","query box 1"],[["v",b]],":-",[[["n","equals 41"],[[1,2,3],["v",b]]]]]],[[0,[n,query_box_1],[[v,b]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,equals41],[[1,2,3],[v,b]]]]]]). +lp2sm_en2test(144,[[["n","query box 1"],[["v",b]],":-",[[["n","equals 41"],[[1,2,3],["v",b]]]]]],[[0,[n,query_box_1],[[v,b]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,equals41],[[1,2,3],[v,b]]]]]]). +lp2sm_en2test(145,[[["n","query box 1"],[["v",c]],":-",[[["n","equals 41"],[[[1,2,3]],["v",c]]]]]],[[0,[n,query_box_1],[[v,c]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,equals41],[[[1,2,3]],[v,c]]]]]]). +lp2sm_en2test(146,[[["n","query box 1"],[["v",c]],":-",[[["n","equals 41"],[[[1,2,3],4,5],["v",c]]]]]],[[0,[n,query_box_1],[[v,c]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,equals41],[[[1,2,3],4,5],[v,c]]]]]]). +lp2sm_en2test(147,[[["n","query box 1"],[["v",c]],":-",[[["n","equals 41"],[["v",c]]]]]],[[0,[n,query_box_1],[[v,c]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,equals41],[[v,c]]]]]]). +lp2sm_en2test(148,[[["n","query box 1"],[["v",c]],":-",[[["n","equals 41"],[["v",c]]]]]],[[0,[n,query_box_1],[[v,c]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,equals41],[[v,c]]]]]]). +lp2sm_en2test(149,[[["n","query box 1"],[["v",b],["v",c]],":-",[[["n","equals 41"],[[[1,2,3]],["v",b],["v",c]]]]]],[[0,[n,query_box_1],[[v,b],[v,c]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,equals41],[[[1,2,3]],[v,b],[v,c]]]]]]). 
+lp2sm_en2test(150,[[["n","query box 1"],[["v",c]],":-",[[["n","equals 41"],[[[4,5,6]],["v",c]]]]]],[[0,[n,query_box_1],[[v,c]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,equals41],[[[4,5,6]],[v,c]]]]]]). +lp2sm_en2test(151,[[["n","query box 1"],[["v",c]],":-",[[["n","equals 41"],[[[6,2,3],[5]],["v",c]]]]]],[[0,[n,query_box_1],[[v,c]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,equals41],[[[6,2,3],[5]],[v,c]]]]]]). +lp2sm_en2test(152,[[["n","query box 1"],[["v",c]],":-",[[["n","equals 41"],[[6,2,3],[1,2,3],["v",c]]]]]],[[0,[n,query_box_1],[[v,c]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,equals41],[[6,2,3],[1,2,3],[v,c]]]]]]). +lp2sm_en2test(153,[[["n","query box 1"],[["v",c]],":-",[[["n","equals 41"],[[[4,5,6]],["v",c]]]]]],[[0,[n,query_box_1],[[v,c]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,equals41],[[[4,5,6]],[v,c]]]]]]). +lp2sm_en2test(154,[[["n","query box 1"],[["v",c]],":-",[[["n","equals 41"],[[[4,5,6]],["v",c]]]]]],[[0,[n,query_box_1],[[v,c]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,equals41],[[[4,5,6]],[v,c]]]]]]). +lp2sm_en2test(155,[[["n","query box 1"],[["v",a]],":-",[[["n","equals 41"],[["v",a]]]]]],[[0,[n,query_box_1],[[v,a]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,equals41],[[v,a]]]]]]). +lp2sm_en2test(156,[[["n","query box 1"],[["v",a],["v",b],["v",c]],":-",[[["n","equals 42"],[["v",a],[["v",b],["v",c]]]]]]],[[0,[n,query_box_1],[[v,a],[v,b],[v,c]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,equals42],[[v,a],[[v,b],[v,c]]]]]]]). +lp2sm_en2test(157,[[["n","query box 1"],[["v",a]],":-",[[["n","equals 41"],[["v",a]]]]]],[[0,[n,query_box_1],[[v,a]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,equals41],[[v,a]]]]]]). 
+lp2sm_en2test(158,[[["n","query box 1"],[["v",a]],":-",[[["n","equals 41"],[["v",a]]]]]],[[0,[n,query_box_1],[[v,a]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,equals41],[[v,a]]]]]]). +lp2sm_en2test(159,[[["n","query box 1"],[],":-",[[["n","equals 41"]]]]],[[0,[n,query_box_1],[],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,equals41],[]]]]]). +lp2sm_en2test(160,[[["n","query box 1"],[["v",c]],":-",[[["n","function"],[1,1,["v",c]]]]]],[[0,[n,query_box_1],[[v,c]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,function],[1,1,[v,c]]]]]]). +lp2sm_en2test(161,[[["n","query box 1"],[["v",c]],":-",[[["n","function"],[1,1,["v",c]]]]]],[[0,[n,query_box_1],[[v,c]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,function],[1,1,[v,c]]]]]]). +lp2sm_en2test(162,[[["n","query box 1"],[["v",a],["v",b],["v",c]],":-",[[["n","equals 41"],[["v",a],["v",b],["v",c]]]]]],[[0,[n,query_box_1],[[v,a],[v,b],[v,c]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,equals41],[[v,a],[v,b],[v,c]]]]]]). +lp2sm_en2test(163,[[["n","query box 1"],[["v",a],["v",b]],":-",[[["n","function 1"],[["v",a],["v",a],["v",b],["v",b]]]]]],[[0,[n,query_box_1],[[v,a],[v,b]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,function1],[[v,a],[v,a],[v,b],[v,b]]]]]]). +lp2sm_en2test(164,[[["n","query box 1"],[["v",a],["v",b]],":-",[[["n","function 1"],[["v",a],["v",a],["v",b],["v",b]]]]]],[[0,[n,query_box_1],[[v,a],[v,b]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,function1],[[v,a],[v,a],[v,b],[v,b]]]]]]). 
+lp2sm_en2test(165,[[["n","query box 1"],[["v",a],["v",b]],":-",[[[["n","equals 4"],[["v",a],[1,2]]],[["n","equals 4"],[["v",b],[0,"|",["v",a]]]]]]]],[[0,[n,query_box_1],[[v,a],[v,b]],":-",[[0,[on_true,1],[go_after,-2],[on_false,-3],[go_to_predicates,-],[n,"[]"]],[1,[on_true,2],[go_after,-],[on_false,[fail_function,0]],[go_to_predicates,-],[n,equals4],[[v,a],[1,2]]],[2,[on_true,[exit_function,0]],[go_after,-],[on_false,[fail_function,0]],[go_to_predicates,-],[n,equals4],[[v,b],[0,"|",[v,a]]]]]]]). +lp2sm_en2test(166,[[["n","query box 1"],[["v",a],["v",b]],":-",[[[["n","equals 4"],[["v",a],0]],[["n","equals 4"],[["v",b],[["v",a],"|",[1,2]]]]]]]],[[0,[n,query_box_1],[[v,a],[v,b]],":-",[[0,[on_true,1],[go_after,-2],[on_false,-3],[go_to_predicates,-],[n,"[]"]],[1,[on_true,2],[go_after,-],[on_false,[fail_function,0]],[go_to_predicates,-],[n,equals4],[[v,a],0]],[2,[on_true,[exit_function,0]],[go_after,-],[on_false,[fail_function,0]],[go_to_predicates,-],[n,equals4],[[v,b],[[v,a],"|",[1,2]]]]]]]). +lp2sm_en2test(167,[[["n","query box 1"],[["v",a],["v",b]],":-",[[["n","equals 4"],[[["v",a],"|",["v",b]],[1,"|",2]]]]]],[[0,[n,query_box_1],[[v,a],[v,b]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,equals4],[[[v,a],"|",[v,b]],[1,"|",2]]]]]]). +lp2sm_en2test(168,[[["n","query box 1"],[["v",a],["v",b]],":-",[[["n","function 1"],[["v",a],"|",["v",b]]]]]],[[0,[n,query_box_1],[[v,a],[v,b]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,function1],[[v,a],"|",[v,b]]]]]]). +lp2sm_en2test(169,[[["n","query box 1"],[["v",a],["v",b]],":-",[[["n","function 1"],[["v",a],["v",b]]]]]],[[0,[n,query_box_1],[[v,a],[v,b]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,function1],[[v,a],[v,b]]]]]]). 
+lp2sm_en2test(170,[[["n","query box 1"],[["v",a]],":-",[[["n","function 1"],[["v",a],["v",a]]]]]],[[0,[n,query_box_1],[[v,a]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,function1],[[v,a],[v,a]]]]]]). +lp2sm_en2test(171,[[["n","query box 1"],[],":-",[[["n","function 1"],[[[]]]]]]],[[0,[n,query_box_1],[],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,function1],[[[]]]]]]]). +lp2sm_en2test(172,[[["n","query box 1"],[],":-",[[["n","equals 4"],[[],[[]]]]]]],[[0,[n,query_box_1],[],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,equals4],[[],[[]]]]]]]). +lp2sm_en2test(173,[[["n","query box 1"],[["v",a],["v",b],["v",c],["v",d],["v",e]],":-",[[["n","equals 4"],[[["v",d],["v",d],["v",a],["v",b],["v",a]],[["v",e],["v",c],1,["v",c],["v",b]]]]]]],[[0,[n,query_box_1],[[v,a],[v,b],[v,c],[v,d],[v,e]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,equals4],[[[v,d],[v,d],[v,a],[v,b],[v,a]],[[v,e],[v,c],1,[v,c],[v,b]]]]]]]). +lp2sm_en2test(174,[[["n","query box 1"],[["v",a],["v",c],["v",e]],":-",[[["n","function 1"],[["v",e],["v",c],["v",a],["v",c],["v",a]]]]]],[[0,[n,query_box_1],[[v,a],[v,c],[v,e]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,function1],[[v,e],[v,c],[v,a],[v,c],[v,a]]]]]]). +lp2sm_en2test(175,[[["n","query box 1"],[["v",a],["v",b],["v",c],["v",d],["v",e1],["v",e2],["v",f],["v",f2]],":-",[[["n","equals 4"],[[[["v",e1],["v",e2]],["v",c],["v",a],["v",c],["v",a],["v",e1]],[["v",d],["v",d],[[1,1],[1,1]],["v",b],["v",b],[["v",f],["v",f2]]]]]]]],[[0,[n,query_box_1],[[v,a],[v,b],[v,c],[v,d],[v,e1],[v,e2],[v,f],[v,f2]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,equals4],[[[[v,e1],[v,e2]],[v,c],[v,a],[v,c],[v,a],[v,e1]],[[v,d],[v,d],[[1,1],[1,1]],[v,b],[v,b],[[v,f],[v,f2]]]]]]]]). 
+lp2sm_en2test(176,[[["n","query box 1"],[["v",a],["v",c],["v",e1],["v",e2]],":-",[[["n","function 1"],[[["v",e1],["v",e2]],["v",c],["v",a],["v",c],["v",a],["v",e1]]]]]],[[0,[n,query_box_1],[[v,a],[v,c],[v,e1],[v,e2]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,function1],[[[v,e1],[v,e2]],[v,c],[v,a],[v,c],[v,a],[v,e1]]]]]]). +lp2sm_en2test(177,[[["n","query box 1"],[["v",a],["v",b],["v",c],["v",d],["v",e1],["v",e2],["v",f],["v",f2]],":-",[[["n","equals 4"],[[[["v",e1],["v",e2]],"|",[["v",c],["v",a],["v",c],["v",a],["v",e1]]],[["v",d],"|",[["v",d],[[1,1],[1,1]],["v",b],["v",b],[["v",f],["v",f2]]]]]]]]],[[0,[n,query_box_1],[[v,a],[v,b],[v,c],[v,d],[v,e1],[v,e2],[v,f],[v,f2]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,equals4],[[[[v,e1],[v,e2]],"|",[[v,c],[v,a],[v,c],[v,a],[v,e1]]],[[v,d],"|",[[v,d],[[1,1],[1,1]],[v,b],[v,b],[[v,f],[v,f2]]]]]]]]]). +lp2sm_en2test(178,[[["n","query box 1"],[["v",a],["v",c],["v",e1],["v",e2]],":-",[[["n","function 1"],[[["v",e1],["v",e2]],"|",[["v",c],["v",a],["v",c],["v",a],["v",e1]]]]]]],[[0,[n,query_box_1],[[v,a],[v,c],[v,e1],[v,e2]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,function1],[[[v,e1],[v,e2]],"|",[[v,c],[v,a],[v,c],[v,a],[v,e1]]]]]]]). +lp2sm_en2test(179,[[["n","query box 1"],[],":-",[[["n","add 0"]]]]],[[0,[n,query_box_1],[],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,add0],[]]]]]). +lp2sm_en2test(180,[[["n","query box 1"],[],":-",[[["n","add 0"]]]]],[[0,[n,query_box_1],[],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,add0],[]]]]]). +lp2sm_en2test(181,[[["n","query box 1"],[["v",b]],":-",[[["n",1],[[1,2],["v",b]]]]]],[[0,[n,query_box_1],[[v,b]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,1],[[1,2],[v,b]]]]]]). 
+lp2sm_en2test(182,[[["n","query box 1"],[["v",b]],":-",[[["n","member try"],[[1,2,3],["v",b]]]]]],[[0,[n,query_box_1],[[v,b]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,member_try],[[1,2,3],[v,b]]]]]]). +lp2sm_en2test(183,[[["n","query box 1"],[],":-",[[["n","not 1"]]]]],[[0,[n,query_box_1],[],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,not1],[]]]]]). +lp2sm_en2test(184,[[["n","query box 1"],[],":-",[[["n","not 1"]]]]],[[0,[n,query_box_1],[],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,not1],[]]]]]). +lp2sm_en2test(185,[[["n","query box 1"],[],":-",[[["n","not 1"]]]]],[[0,[n,query_box_1],[],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,not1],[]]]]]). +lp2sm_en2test(186,[[["n","query box 1"],[],":-",[[["n","brackets 1"]]]]],[[0,[n,query_box_1],[],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,brackets1],[]]]]]). +lp2sm_en2test(187,[[["n","query box 1"],[],":-",[[["n","brackets 1"]]]]],[[0,[n,query_box_1],[],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,brackets1],[]]]]]). +lp2sm_en2test(188,[[["n","query box 1"],[],":-",[[["n","brackets 1"]]]]],[[0,[n,query_box_1],[],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,brackets1],[]]]]]). +lp2sm_en2test(189,[[["n","query box 1"],[],":-",[[["n","brackets 1"]]]]],[[0,[n,query_box_1],[],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,brackets1],[]]]]]). +lp2sm_en2test(190,[[["n","query box 1"],[],":-",[[["n","true"]]]]],[[0,[n,query_box_1],[],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,true],[]]]]]). +lp2sm_en2test(191,[[["n","query box 1"],[],":-",[[["n","not 1"]]]]],[[0,[n,query_box_1],[],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,not1],[]]]]]). 
+lp2sm_en2test(192,[[["n","query box 1"],[["v",b]],":-",[[["n","find all 1"],[[1,2,3],["v",b]]]]]],[[0,[n,query_box_1],[[v,b]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,findall1],[[1,2,3],[v,b]]]]]]). +lp2sm_en2test(193,[[["n","query box 1"],[["v",b]],":-",[[["n","find all 1"],[[[[1,2,3,4]]],["v",b]]]]]],[[0,[n,query_box_1],[[v,b]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,findall1],[[[[1,2,3,4]]],[v,b]]]]]]). +lp2sm_en2test(194,[[["n","query box 1"],[["v",a]],":-",[[["n","cut 1"],[["v",a]]]]]],[[0,[n,query_box_1],[[v,a]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,cut1],[[v,a]]]]]]). +lp2sm_en2test(195,[[["n","query box 1"],[["v",a],["v",b]],":-",[[["n","find all"],[["v",b],[[["n","or 12"],[["v",b]]]],["v",a]]]]]],[[0,[n,query_box_1],[[v,a],[v,b]],":-",[[0,[on_true,1],[go_after,-2],[on_false,-3],[go_to_predicates,-],[n,findall],[[v,b],[v,a]]],[1,[on_true,2],[go_after,[findall_exit_function,0]],[on_false,[findall_fail_function,0]],[go_to_predicates,-],[n,"[]"]],[2,[on_true,[exit_function,1]],[go_after,-],[on_false,[fail_function,1]],[go_to_predicates,-],[n,or12],[[v,b]]]]]]). +lp2sm_en2test(196,[[["n","query box 1"],[["v",a],["v",b]],":-",[[["n","find all"],[["v",b],[[["n","or 12"],[["v",b]]]],["v",a]]]]]],[[0,[n,query_box_1],[[v,a],[v,b]],":-",[[0,[on_true,1],[go_after,-2],[on_false,-3],[go_to_predicates,-],[n,findall],[[v,b],[v,a]]],[1,[on_true,2],[go_after,[findall_exit_function,0]],[on_false,[findall_fail_function,0]],[go_to_predicates,-],[n,"[]"]],[2,[on_true,[exit_function,1]],[go_after,-],[on_false,[fail_function,1]],[go_to_predicates,-],[n,or12],[[v,b]]]]]]). 
+lp2sm_en2test(197,[[["n","query box 1"],[["v",a],["v",b]],":-",[[["n","find all"],[["v",b],[[["n","member"],[["v",b],[1]]]],["v",a]]]]]],[[0,[n,query_box_1],[[v,a],[v,b]],":-",[[0,[on_true,1],[go_after,-2],[on_false,-3],[go_to_predicates,-],[n,findall],[[v,b],[v,a]]],[1,[on_true,2],[go_after,[findall_exit_function,0]],[on_false,[findall_fail_function,0]],[go_to_predicates,-],[n,"[]"]],[2,[on_true,[exit_function,1]],[go_after,-],[on_false,[fail_function,1]],[go_to_predicates,-],[n,member],[[v,b],[1]]]]]]). +lp2sm_en2test(198,[[["n","query box 1"],[],":-",[[["n","function"]]]]],[[0,[n,query_box_1],[],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,function],[]]]]]). +lp2sm_en2test(199,[[["n","query box 1"],[["v",a]],":-",[[["n","function"],[1,1,["v",a]]]]]],[[0,[n,query_box_1],[[v,a]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,function],[1,1,[v,a]]]]]]). +lp2sm_en2test(200,[[["n","query box 1"],[],":-",[[["n","function"]]]]],[[0,[n,query_box_1],[],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,function],[]]]]]). +lp2sm_en2test(201,[[["n","query box 1"],[],":-",[[["n","function"]]]]],[[0,[n,query_box_1],[],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,function],[]]]]]). +lp2sm_en2test(202,[[["n","query box 1"],[],":-",[[["n","function"]]]]],[[0,[n,query_box_1],[],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,function],[]]]]]). +lp2sm_en2test(203,[[["n","query box 1"],[],":-",[[["n","traverse"],[2,6]]]]],[[0,[n,query_box_1],[],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,traverse],[2,6]]]]]). +lp2sm_en2test(204,[[["n","query box 1"],[],":-",[[["n","function"]]]]],[[0,[n,query_box_1],[],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,function],[]]]]]). 
+lp2sm_en2test(205,[[["n","query box 1"],[["v",b]],":-",[[["n","find all 1"],[[[1,2],[1,4]],["v",b]]]]]],[[0,[n,query_box_1],[[v,b]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,findall1],[[[1,2],[1,4]],[v,b]]]]]]). +lp2sm_en2test(206,[[["n","query box 1"],[["v",a],["v",b]],":-",[[["n","find all"],[["v",b],[[["n","or 12"],[1,["v",b]]]],["v",a]]]]]],[[0,[n,query_box_1],[[v,a],[v,b]],":-",[[0,[on_true,1],[go_after,-2],[on_false,-3],[go_to_predicates,-],[n,findall],[[v,b],[v,a]]],[1,[on_true,2],[go_after,[findall_exit_function,0]],[on_false,[findall_fail_function,0]],[go_to_predicates,-],[n,"[]"]],[2,[on_true,[exit_function,1]],[go_after,-],[on_false,[fail_function,1]],[go_to_predicates,-],[n,or12],[1,[v,b]]]]]]). +lp2sm_en2test(207,[[["n","query box 1"],[["v",b1]],":-",[[["n","find all 1"],[[1,2],["v",b1]]]]]],[[0,[n,query_box_1],[[v,b1]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,findall1],[[1,2],[v,b1]]]]]]). +lp2sm_en2test(208,[[["n","query box 1"],[["v",b],["v",c],["v",g]],":-",[[["n","reverse 1"],[[1,2,3],[],["v",c],["v",b],["v",g]]]]]],[[0,[n,query_box_1],[[v,b],[v,c],[v,g]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,reverse1],[[1,2,3],[],[v,c],[v,b],[v,g]]]]]]). +lp2sm_en2test(209,[[["n","query box 1"],[["v",a]],":-",[[["n","a"],[["a","b"],"",["v",a]]]]]],[[0,[n,query_box_1],[[v,a]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,a],[["a","b"],"",[v,a]]]]]]). +lp2sm_en2test(210,[[["n","query box 1"],[],":-",[[["n","grammar 1"],[[apple]]]]]],[[0,[n,query_box_1],[],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,grammar1],[[apple]]]]]]). +lp2sm_en2test(211,[[["n","query box 1"],[],":-",[[["n","grammar 1"],[[apple,banana]]]]]],[[0,[n,query_box_1],[],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,grammar1],[[apple,banana]]]]]]). 
+lp2sm_en2test(212,[[["n","query box 1"],[],":-",[[["n","grammar 1"],[[apple,banana,carrot]]]]]],[[0,[n,query_box_1],[],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,grammar1],[[apple,banana,carrot]]]]]]). +lp2sm_en2test(213,[[["n","query box 1"],[["v",a]],":-",[[["n","reverse 1"],[[1,2,3],[],["v",a]]]]]],[[0,[n,query_box_1],[[v,a]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,reverse1],[[1,2,3],[],[v,a]]]]]]). +lp2sm_en2test(214,[[["n","query box 1"],[["v",a]],":-",[[["n","append 1"],[[1,2,3],[],["v",a]]]]]],[[0,[n,query_box_1],[[v,a]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,append1],[[1,2,3],[],[v,a]]]]]]). +lp2sm_en2test(215,[[["n","query box 1"],[["v",a],["v",b],["v",c]],":-",[[["n","find all"],[[["v",a],["v",b]],[["n","append"],[["v",a],["v",b],[1,2,3]]],["v",c]]]]]],[[0,[n,query_box_1],[[v,a],[v,b],[v,c]],":-",[[0,[on_true,1],[go_after,-2],[on_false,-3],[go_to_predicates,-],[n,findall],[[[v,a],[v,b]],[v,c]]],[1,[on_true,[findall_exit_function,0]],[go_after,-],[on_false,[findall_fail_function,0]],[go_to_predicates,-],[n,append],[[v,a],[v,b],[1,2,3]]]]]]). +lp2sm_en2test(216,[[["n","query box 1"],[],":-",[[["n","traverse"],[2,8]]]]],[[0,[n,query_box_1],[],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,traverse],[2,8]]]]]). +lp2sm_en2test(217,[[["n","query box 1"],[],":-",[[["n","traverse"],[2,6]]]]],[[0,[n,query_box_1],[],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,traverse],[2,6]]]]]). +lp2sm_en2test(218,[[["n","query box 1"],[["v",b]],":-",[[["n","find all 1"],[[[1,2],[3,4]],["v",b]]]]]],[[0,[n,query_box_1],[[v,b]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,findall1],[[[1,2],[3,4]],[v,b]]]]]]). 
+lp2sm_en2test(219,[[["n","query box 1"],[["v",a],["v",b],["v",c]],":-",[[["n","find all"],[[["v",a],["v",b]],[["n","concatenate strings"],[["v",a],["v",b],"abc"]],["v",c]]]]]],[[0,[n,query_box_1],[[v,a],[v,b],[v,c]],":-",[[0,[on_true,1],[go_after,-2],[on_false,-3],[go_to_predicates,-],[n,findall],[[[v,a],[v,b]],[v,c]]],[1,[on_true,[findall_exit_function,0]],[go_after,-],[on_false,[findall_fail_function,0]],[go_to_predicates,-],[n,stringconcat],[[v,a],[v,b],"abc"]]]]]). +lp2sm_en2test(220,[[["n","query box 1"],[],":-",[[["n","concatenate strings"],["a","b","ab"]]]]],[[0,[n,query_box_1],[],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,stringconcat],["a","b","ab"]]]]]). +lp2sm_en2test(221,[[["n","query box 1"],[["v",a]],":-",[[["n","concatenate strings"],["a","b",["v",a]]]]]],[[0,[n,query_box_1],[[v,a]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,stringconcat],["a","b",[v,a]]]]]]). +lp2sm_en2test(222,[[["n","query box 1"],[["v",a]],":-",[[["n","concatenate strings"],["a",["v",a],"ab"]]]]],[[0,[n,query_box_1],[[v,a]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,stringconcat],["a",[v,a],"ab"]]]]]). +lp2sm_en2test(223,[[["n","query box 1"],[["v",a]],":-",[[["n","concatenate strings"],[["v",a],"b","ab"]]]]],[[0,[n,query_box_1],[[v,a]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,stringconcat],[[v,a],"b","ab"]]]]]). +lp2sm_en2test(224,[[["n","query box 1"],[],":-",[[["n","append"],[["a"],["b"],["a","b"]]]]]],[[0,[n,query_box_1],[],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,append],[["a"],["b"],["a","b"]]]]]]). +lp2sm_en2test(225,[[["n","query box 1"],[["v",a]],":-",[[["n","append"],[["a"],["b"],["v",a]]]]]],[[0,[n,query_box_1],[[v,a]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,append],[["a"],["b"],[v,a]]]]]]). 
+lp2sm_en2test(226,[[["n","query box 1"],[["v",a]],":-",[[["n","append"],[["a"],["v",a],["a","b"]]]]]],[[0,[n,query_box_1],[[v,a]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,append],[["a"],[v,a],["a","b"]]]]]]). +lp2sm_en2test(227,[[["n","query box 1"],[["v",a]],":-",[[["n","append"],[["v",a],["b"],["a","b"]]]]]],[[0,[n,query_box_1],[[v,a]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,append],[[v,a],["b"],["a","b"]]]]]]). +lp2sm_en2test(228,[[["n","query box 1"],[["v",a],["v",b],["v",c]],":-",[[["n","find all"],[[["v",a],["v",b]],[["n","append"],[["v",a],["v",b],["a","b","c"]]],["v",c]]]]]],[[0,[n,query_box_1],[[v,a],[v,b],[v,c]],":-",[[0,[on_true,1],[go_after,-2],[on_false,-3],[go_to_predicates,-],[n,findall],[[[v,a],[v,b]],[v,c]]],[1,[on_true,[findall_exit_function,0]],[go_after,-],[on_false,[findall_fail_function,0]],[go_to_predicates,-],[n,append],[[v,a],[v,b],["a","b","c"]]]]]]). +lp2sm_en2test(229,[[["n","query box 1"],[["v",a],["v",c]],":-",[[["n","find all"],[["v",a],[["n","member"],[["v",a],["a","b"]]],["v",c]]]]]],[[0,[n,query_box_1],[[v,a],[v,c]],":-",[[0,[on_true,1],[go_after,-2],[on_false,-3],[go_to_predicates,-],[n,findall],[[v,a],[v,c]]],[1,[on_true,[findall_exit_function,0]],[go_after,-],[on_false,[findall_fail_function,0]],[go_to_predicates,-],[n,member],[[v,a],["a","b"]]]]]]). +lp2sm_en2test(230,[[["n","query box 1"],[["v",a],["v",b]],":-",[[["n","append"],[["a"],["v",a],["v",b]]]]]],[[0,[n,query_box_1],[[v,a],[v,b]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,append],[["a"],[v,a],[v,b]]]]]]). 
+lp2sm_en2test(231,[[["n","query box 1"],[["v",a],["v",c],["v",d]],":-",[[["n","find all"],[[["v",a],["v",c]],[["n","append"],[["v",a],["a","b"],["v",c]]],["v",d]]]]]],[[0,[n,query_box_1],[[v,a],[v,c],[v,d]],":-",[[0,[on_true,1],[go_after,-2],[on_false,-3],[go_to_predicates,-],[n,findall],[[[v,a],[v,c]],[v,d]]],[1,[on_true,[findall_exit_function,0]],[go_after,-],[on_false,[findall_fail_function,0]],[go_to_predicates,-],[n,append],[[v,a],["a","b"],[v,c]]]]]]). +lp2sm_en2test(232,[[["n","query box 1"],[],":-",[[["n","member"],["a",["a","b"]]]]]],[[0,[n,query_box_1],[],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,member],["a",["a","b"]]]]]]). +lp2sm_en2test(233,[[["n","query box 1"],[["v",a],["v",d]],":-",[[["n","find all"],[["v",a],[["n","member"],["a",["v",a]]],["v",d]]]]]],[[0,[n,query_box_1],[[v,a],[v,d]],":-",[[0,[on_true,1],[go_after,-2],[on_false,-3],[go_to_predicates,-],[n,findall],[[v,a],[v,d]]],[1,[on_true,[findall_exit_function,0]],[go_after,-],[on_false,[findall_fail_function,0]],[go_to_predicates,-],[n,member],["a",[v,a]]]]]]). +lp2sm_en2test(234,[[["n","query box 1"],[["v",a]],":-",[[["n","member"],[["v",a],["a","b"]]]]]],[[0,[n,query_box_1],[[v,a]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,member],[[v,a],["a","b"]]]]]]). +lp2sm_en2test(235,[[["n","query box 1"],[["v",a],["v",b],["v",d]],":-",[[["n","find all"],[[["v",a],["v",b]],[["n","member"],[["v",b],["v",a]]],["v",d]]]]]],[[0,[n,query_box_1],[[v,a],[v,b],[v,d]],":-",[[0,[on_true,1],[go_after,-2],[on_false,-3],[go_to_predicates,-],[n,findall],[[[v,a],[v,b]],[v,d]]],[1,[on_true,[findall_exit_function,0]],[go_after,-],[on_false,[findall_fail_function,0]],[go_to_predicates,-],[n,member],[[v,b],[v,a]]]]]]). 
+lp2sm_en2test(236,[[["n","query box 1"],[["v",a],["v",b],["v",c],["v",d]],":-",[[["n","find all"],[[["v",a],["v",b],["v",c]],[["n","append"],[["v",a],["a"],[["v",b],"|",["v",c]]]],["v",d]]]]]],[[0,[n,query_box_1],[[v,a],[v,b],[v,c],[v,d]],":-",[[0,[on_true,1],[go_after,-2],[on_false,-3],[go_to_predicates,-],[n,findall],[[[v,a],[v,b],[v,c]],[v,d]]],[1,[on_true,[findall_exit_function,0]],[go_after,-],[on_false,[findall_fail_function,0]],[go_to_predicates,-],[n,append],[[v,a],["a"],[[v,b],"|",[v,c]]]]]]]). +lp2sm_en2test(237,[[["n","query box 1"],[["v",a],["v",b],["v",c],["v",d]],":-",[[["n","find all"],[[["v",a],["v",b],["v",c]],[["n","append"],[["v",a],["b",["v",b]],["v",c]]],["v",d]]]]]],[[0,[n,query_box_1],[[v,a],[v,b],[v,c],[v,d]],":-",[[0,[on_true,1],[go_after,-2],[on_false,-3],[go_to_predicates,-],[n,findall],[[[v,a],[v,b],[v,c]],[v,d]]],[1,[on_true,[findall_exit_function,0]],[go_after,-],[on_false,[findall_fail_function,0]],[go_to_predicates,-],[n,append],[[v,a],["b",[v,b]],[v,c]]]]]]). +lp2sm_en2test(238,[[["n","query box 1"],[["v",a],["v",b],["v",c],["v",d]],":-",[[["n","find all"],[[["v",a],["v",b],["v",c]],[["n","append"],[["v",a],["v",b],["v",c]]],["v",d]]]]]],[[0,[n,query_box_1],[[v,a],[v,b],[v,c],[v,d]],":-",[[0,[on_true,1],[go_after,-2],[on_false,-3],[go_to_predicates,-],[n,findall],[[[v,a],[v,b],[v,c]],[v,d]]],[1,[on_true,[findall_exit_function,0]],[go_after,-],[on_false,[findall_fail_function,0]],[go_to_predicates,-],[n,append],[[v,a],[v,b],[v,c]]]]]]). +lp2sm_en2test(239,[[["n","query box 1"],[["v",a],["v",b],["v",c]],":-",[[["n","append"],[["v",a],["b",["v",b]],[["v",c]]]]]]],[[0,[n,query_box_1],[[v,a],[v,b],[v,c]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,append],[[v,a],["b",[v,b]],[[v,c]]]]]]]). 
+lp2sm_en2test(240,[[["n","query box 1"],[["v",a],["v",c]],":-",[[["n","append"],[["v",a],["b"],[["v",c]]]]]]],[[0,[n,query_box_1],[[v,a],[v,c]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,append],[[v,a],["b"],[[v,c]]]]]]]). +lp2sm_en2test(241,[[["n","query box 1"],[["v",a]],":-",[[["n","reverse 1"],[["v",a],[],[3,2,1]]]]]],[[0,[n,query_box_1],[[v,a]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,reverse1],[[v,a],[],[3,2,1]]]]]]). +lp2sm_en2test(242,[[["n","query box 1"],[["v",a]],":-",[[["n","append 1"],[["v",a],[],[1,2,3]]]]]],[[0,[n,query_box_1],[[v,a]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,append1],[[v,a],[],[1,2,3]]]]]]). +lp2sm_en2test(243,[[["n","query box 1"],[["v",a]],":-",[[["n","back propagate"],[["v",a]]]]]],[[0,[n,query_box_1],[[v,a]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,back_propagate],[[v,a]]]]]]). +lp2sm_en2test(244,[[["n","query box 1"],[["v",d]],":-",[[["n","foldl 1"],[["n","stringconcata 1"],["a","b"],"",["v",d]]]]]],[[0,[n,query_box_1],[[v,d]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,foldl1],[[n,stringconcata1],["a","b"],"",[v,d]]]]]]). +lp2sm_en2test(245,[[["n","query box 1"],[["v",a],["v",x],["v",y],["v",z]],":-",[[[["n","equals 4"],[["v",a],"abc"]],[["n","string chars"],[["v",a],[["v",x],["v",y],["v",z]]]]]]]],[[0,[n,query_box_1],[[v,a],[v,x],[v,y],[v,z]],":-",[[0,[on_true,1],[go_after,-2],[on_false,-3],[go_to_predicates,-],[n,"[]"]],[1,[on_true,2],[go_after,-],[on_false,[fail_function,0]],[go_to_predicates,-],[n,equals4],[[v,a],"abc"]],[2,[on_true,[exit_function,0]],[go_after,-],[on_false,[fail_function,0]],[go_to_predicates,-],[n,string_chars],[[v,a],[[v,x],[v,y],[v,z]]]]]]]). +lp2sm_en2test(246,[[["n","query box 1"],[],":-",[[["n","test 2"]]]]],[[0,[n,query_box_1],[],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,test2],[]]]]]). 
+lp2sm_en2test(247,[[["n","query box 1"],[["v",a]],":-",[[["n","test 2"],[["v",a]]]]]],[[0,[n,query_box_1],[[v,a]],":-",[[0,[on_true,-2],[go_after,-],[on_false,-3],[go_to_predicates,-],[n,test2],[[v,a]]]]]]). + + + + +bt-lp2sm_en2test(Lang,BL,RL) :- + +findall(A,(lp2sm_en2test(N,I,O),((retractall(lang(_)),assertz(lang("en")),sm_to_lp(O,I2),trans_alg(I2,"en",Lang,I3),I3=I)->(writeln([bt-lp2sm_en2test,N,passed]),A=passed);(writeln([bt-lp2sm_en2test,N,failed]),A=failed))),B), +length(B,BL), +findall(_,member(passed,B),R),length(R,RL),!. + +bt-lp2sm_en2test1(Lang,N,A) :- +lp2sm_en2test(N,I,O),((retractall(lang(_)),assertz(lang("en")),sm_to_lp(O,I2),trans_alg(I2,"en",Lang,I3),I=I3)->(writeln([bt-lp2sm_en2test,N,passed]),A=passed);(writeln([bt-lp2sm_en2test,N,failed]),A=failed)),!. + + +%?- test(1,Q,F,R),query_box(Q,Query1,Functions,Functions1),convert_to_grammar_part1(Functions1,[],Functions2,_),add_line_numbers_to_algorithm1(Functions2,Functions2a),find_pred_numbers(Functions2a,[] ,Pred_numbers),retractall(pred_numbers(_)),assertz(pred_numbers(Pred_numbers)),find_state_machine1(Functions2a,Functions3,Pred_numbers),writeln(Functions2),sm_to_lp(Functions3,Functions4),writeln(Functions4). + + +% to do +% back-translation, on I not O + + +bt1-lp2sm_en2test(Lang,BL,RL) :- +findall(A,(lp2sm_en2test(N,I,O),(((once((trans_alg(I,Lang,"en",I2),retractall(lang(_)),assertz(lang("en")),add_line_numbers_to_algorithm1(I2,Functions2a),find_pred_numbers(Functions2a,[] ,Pred_numbers),retractall(pred_numbers(_)),assertz(pred_numbers(Pred_numbers)),find_state_machine1(Functions2a,O1,Pred_numbers))),O=O1)->(writeln([bt1-lp2sm_en2test,N,passed]),A=passed);(writeln([bt1-lp2sm_en2test,N,failed]),A=failed)))),B), +length(B,BL), +findall(_,member(passed,B),R),length(R,RL),!. 
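The `bt-lp2sm_en2test/3` and `bt1-lp2sm_en2test/3` harnesses above run every stored test case, print `passed` or `failed` per case, and use `findall/3` plus `length/2` to return the total count and the pass count. A minimal Python sketch of that counting pattern (the `check` callback stands in for the translate-and-compare goal and is an assumption, not part of the source):

```python
def run_tests(cases, check):
    # cases: list of (n, input_term, output_term) tuples;
    # check mirrors the translate-and-compare goal in the Prolog harness.
    results = []
    for n, i, o in cases:
        status = "passed" if check(i, o) else "failed"
        print(["bt-lp2sm_en2test", n, status])
        results.append(status)
    # total attempted, number passed (as in length(B,BL) ... length(R,RL))
    return len(results), results.count("passed")
```

For example, `run_tests([(1, "a", "a"), (2, "a", "b")], lambda i, o: i == o)` reports each case and returns `(2, 1)`.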
+ +bt1-lp2sm_en2test1(Lang,N,A) :- +lp2sm_en2test(N,I,O),(((once((trans_alg(I,Lang,"en",I2),retractall(lang(_)),assertz(lang("en")),add_line_numbers_to_algorithm1(I2,Functions2a),find_pred_numbers(Functions2a,[] ,Pred_numbers),retractall(pred_numbers(_)),assertz(pred_numbers(Pred_numbers)),find_state_machine1(Functions2a,O1,Pred_numbers))),O=O1)->(writeln([bt1-lp2sm_en2test,N,passed]),A=passed);(writeln([bt1-lp2sm_en2test,N,failed]),A=failed))),!. diff --git a/Philosophy/cat_alg_files.sh b/Philosophy/cat_alg_files.sh new file mode 100644 index 0000000000000000000000000000000000000000..6c0c7e6e80175048b444886dca156ab16808e8da --- /dev/null +++ b/Philosophy/cat_alg_files.sh @@ -0,0 +1,3 @@ +#!/bin/bash +swipl --goal=main --stand_alone=true -o cat_alg_files -c cat_alg_files2.pl +./cat_alg_files \ No newline at end of file diff --git a/Philosophy/cat_alg_files1.pl b/Philosophy/cat_alg_files1.pl new file mode 100644 index 0000000000000000000000000000000000000000..68e28181e1dd32bddf10c968441be8591cfc1ca9 --- /dev/null +++ b/Philosophy/cat_alg_files1.pl @@ -0,0 +1,6 @@ +:-include('cat_alg_files.pl'). +main:-catch(cat_alg_files(6000),Err,handle_error(Err)),halt. + +handle_error(_Err):- + halt(1). +main :- halt(1). \ No newline at end of file diff --git a/Philosophy/cat_arg_files1.pl b/Philosophy/cat_arg_files1.pl new file mode 100644 index 0000000000000000000000000000000000000000..a3a593163f37186a1bcb3fc2b73540b0e2901d16 --- /dev/null +++ b/Philosophy/cat_arg_files1.pl @@ -0,0 +1,6 @@ +:-include('cat_arg_files.pl'). +main:-catch(cat_arg_files(6000),Err,handle_error(Err)),halt. + +handle_error(_Err):- + halt(1). +main :- halt(1). \ No newline at end of file diff --git a/Philosophy/caw_wo_lpi.pl b/Philosophy/caw_wo_lpi.pl new file mode 100644 index 0000000000000000000000000000000000000000..6213fe16387183107f0aa5d5fa8b3c92f43aaa61 --- /dev/null +++ b/Philosophy/caw_wo_lpi.pl @@ -0,0 +1 @@ +a:-A=true,B=true,C=(b:-A,B),retractall(b),assertz(C),b. 
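The one-liner in `caw_wo_lpi.pl` builds a clause term `C = (b :- A, B)` at runtime, retracts any existing `b`, asserts the new clause, and then calls it. A rough Python analogue (the `database` dict is a stand-in for the Prolog clause store; names are illustrative only):

```python
# Stand-in for the Prolog clause database.
database = {}

def assertz(name, conds):
    # retractall + assertz: replace any previous definition of `name`
    # with a rule that succeeds when all its conditions succeed.
    database[name] = lambda: all(c() for c in conds)

def a():
    A = lambda: True
    B = lambda: True
    assertz("b", [A, B])   # C = (b :- A, B)
    return database["b"]() # call b
```

Calling `a()` succeeds, just as the Prolog query `a.` does.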
\ No newline at end of file diff --git a/Philosophy/cicd.txt b/Philosophy/cicd.txt new file mode 100644 index 0000000000000000000000000000000000000000..a85e3271b57ace1802f61beeffe8df62dd44e62c --- /dev/null +++ b/Philosophy/cicd.txt @@ -0,0 +1,65 @@ +% need 23 tests + +[ +["","1 4 23.pl",(object(A),A=[ +[ +"* ", +"* ", +"* "], +[ +"* ", +"** ", +"* "], +[ +" ", +" ", +" "] +])], + +["","1 4 23.pl",(dimensions(X,Y,Z), X = 2, Z = 2, Y = 3)], + +["","1 4 23.pl",(length1(3,5,L), L = 4.0)], + +["","1 4 23.pl",(report1(Ts), Ts = [5])], + +["","1 4 23.pl",(matrix_finder1([[[0,1],[-1,0]],[[1,0],[0,1]]],Matrix), Matrix = [[0, -1.0], [1.0, 0]])], + +["","1 4 23.pl",(degree(90))], + +["","1 4 23.pl",(degree(-90))], + +["","1 4 23.pl",(multiplier(1))], + +["","1 4 23.pl",(multiplier(-1))], + +["","1 4 23.pl",(cos_or_sin(cos))], + +["","1 4 23.pl",(cos_or_sin(sin))], + +["","1 4 23.pl",(epsilon(0.000001))], + +["","1 4 23.pl",(and1([[a,b,c],[a,d,c],[e,f]],[a,c],Lines), Lines = [[a, b, c], [a, d, c]])], + +["","1 4 23.pl",(revoke([land,contract,money],R),R = true)], + +["","1 4 23.pl",(revoke([land],R),R = true)], + +["","1 4 23.pl",(revoke([else],R),R = false)], + +["","1 4 23.pl",(revoke([contract,money],R),R = true)], + +["","1 4 23.pl",(revoke([],R),R = false)], + +["","1 4 23.pl",(dir(1,2,true))], + +["","1 4 23.pl",(dir(2,3,true))], + +["","1 4 23.pl",(dir(3,1,true))], + +["","1 4 23.pl",(dir(2,1,true))], + +["","1 4 23.pl",(dir(3,2,true))], + +["","1 4 23.pl",(dir(1,3,true))] + +] diff --git a/Philosophy/colours_to_image.pl b/Philosophy/colours_to_image.pl new file mode 100644 index 0000000000000000000000000000000000000000..0cd978bfb3a620e3806798e60a610a296d0ed877 --- /dev/null +++ b/Philosophy/colours_to_image.pl @@ -0,0 +1,51 @@ +% realism, ironism + +% creates an image from names of colours + +% [[white,white,red,white,white],[white,red,yellow,red,white],[red,yellow,blue,yellow,red],[white,red,yellow,red,white],[white,white,red,white,white]] + +% 
[[red,white],[white,red]] + +:-include('../listprologinterpreter/la_strings.pl'). + +%:-include('../listprologinterpreter/la_strings.pl'). + +colours_to_image :- + writeln("Enter pixel colours for image as e.g. \"[[red,yellow],[yellow,red]]\"."), + read_string(user_input, "\n", "\r", _, Input), + term_to_atom(Grid,Input), + length(Grid,Y), + Grid=[Row|_], + length(Row,X), + findall(Pixel_RGB2,(member(Line,Grid),findall([R," ",G," ",B,"\n"],(member(Pixel_colour,Line),colour(Pixel_colour,[R,G,B])),Pixel_RGB2)),Grid2), + %trace, + %writeln1(Grid31), + maplist(append,[Grid2],[Grid32]), + maplist(append,[Grid32],[Grid3]), + %maplist(append,[[Grid33]],[Grid34]), + %maplist(append,[[Grid34]],[Grid3]), + +%trace, + maplist(append,[[["P3","\n",X," ",Y,"\n","255","\n"],Grid3,["\n"]]],[Grid31]), + + concat_list(Grid31,Grid4), + + writeln("Enter file name, ending with \".ppm\"."), + read_string(user_input, "\n", "\r", _, File), + %term_to_atom(Grid4,D85), + string_atom(D85,Grid4), + + (open_s(File,write,Stream2), + write(Stream2,D85), + close(Stream2)),!. + + +colour(red, [255, 0, 0]). +colour(black, [0, 0, 0]). +colour(white, [255, 255, 255]). +colour(yellow, [255, 255, 0]). +colour(green, [0, 255, 0]). +colour(blue, [0, 0, 255]). +colour(purple, [128, 0, 128]). +colour(orange, [255, 128, 0]). +colour(brown, [128, 64, 0]). diff --git a/Philosophy/combo_pass.pl b/Philosophy/combo_pass.pl new file mode 100644 index 0000000000000000000000000000000000000000..becd3c68e33c3d10228bbe11ba5e7558ebee9520 --- /dev/null +++ b/Philosophy/combo_pass.pl @@ -0,0 +1,24 @@ +:-include('../listprologinterpreter/listprolog.pl'). + +%?- combo_pass([1,2,3,5],[1,4,3,6],11,G). +%G = [[1, 2, 3, 5]]. 
+ +combo_pass(A,B,Sum,G) :- + length(A,L), + numbers(L,1,[],Ns), + + % Finds the possibility list [[1], [2, 4], [3], [5, 6]] + findall(C,(member(N,Ns),get_item_n(A,N,An),get_item_n(B,N,Bn),(An=Bn->C=[An];C=[An,Bn])),D), + + % Finds the expanded possibility list [[1, 2, 3, 5], [1, 2, 3, 6], [1, 4, 3, 5], [1, 4, 3, 6]] + + findall(H,findall2(D,[],H),J), + + % Finds the list that sums to Sum=11 [[1, 2, 3, 5]] + + findall(M,(member(M,J),sum(M,Sum)),G) + .%(E,(member(F,D),member(E,F)),G). + +findall2([],A,A) :- !. +findall2(A,B,C) :- + A=[D|E],member(F,D),append(B,[F],G),findall2(E,G,C). \ No newline at end of file diff --git a/Philosophy/connect_unrelatedness.pl b/Philosophy/connect_unrelatedness.pl new file mode 100644 index 0000000000000000000000000000000000000000..54968b7de60b1501758d1ae9ff302b51df2eeaf2 --- /dev/null +++ b/Philosophy/connect_unrelatedness.pl @@ -0,0 +1,14 @@ +% ["Medicine","MEDICINE by Lucian Green Prevent Headaches on Train and a Bent Spine 3 of 4.txt",0,algorithms,"23. I prepared to eat the soy fish. I did this by providing Social Work service as breasonings currency. First, I worked as a big social worker. Second, I enamoured it lightly. Third, I connected unrelatedness. In this way, I prepared to eat the soy fish by providing Social Work service as breasonings currency."] + +% ?- connect_unrelatedness([[a,b],[a,c],[d,e]],Other_connections),writeln(Other_connections). +% Other_connections = [[a,a],[a,d],[a,e],[b,a],[b,b],[b,c],[b,d],[b,e],[c,a],[c,b],[c,c],[c,d],[c,e],[d,a],[d,b],[d,c],[d,d],[e,a],[e,b],[e,c],[e,d],[e,e]] + +connect_unrelatedness(Connections,Other_connections) :- + collect_vars(Connections,Vars), + findall([A,B],(member(A,Vars),member(B,Vars)),C), + subtract(C,Connections,Other_connections). + +collect_vars(Connections,Vars2) :- + maplist(append,[Connections],[Vars1]), + sort(Vars1,Vars2). 
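`connect_unrelatedness/2` above collects every variable mentioned in the given connections, takes the cartesian product of that set with itself, and subtracts the existing pairs, as the sample query in the comments shows. The same idea as a short Python sketch (a hypothetical helper, not part of the repository):

```python
def connect_unrelatedness(connections):
    # collect_vars: flatten the pairs and deduplicate
    variables = sorted({v for pair in connections for v in pair})
    # every ordered pair over the variables, minus the given connections
    all_pairs = [[a, b] for a in variables for b in variables]
    return [p for p in all_pairs if p not in connections]
```

With `[["a","b"], ["a","c"], ["d","e"]]` this yields the 22 remaining pairs listed in the comment above (25 ordered pairs over 5 variables, minus the 3 given connections).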
+ \ No newline at end of file diff --git a/Philosophy/connections.txt b/Philosophy/connections.txt new file mode 100644 index 0000000000000000000000000000000000000000..59b59e723ffc872ebac0105b84d856501965562c --- /dev/null +++ b/Philosophy/connections.txt @@ -0,0 +1 @@ +[[["get","map","for","goal"],[1,2,"for",4]]] \ No newline at end of file diff --git a/Philosophy/data_to_alg.pl b/Philosophy/data_to_alg.pl new file mode 100644 index 0000000000000000000000000000000000000000..95b0655294f685586b47fa0644850f7b1aa8fbe2 --- /dev/null +++ b/Philosophy/data_to_alg.pl @@ -0,0 +1,39 @@ +% ["Short Arguments","God Algorithm.txt",0,algorithms,"5. I prepared to ask why why was. I did this by stating that I knew the result of the God algorithm was why. First, I noticed the occurrence of the event. Second, I read the result of the God algorithm. Third, I worked out that the result explained the occurrence."] + +% * Third, I worked out that the result explained the occurrence. + +:-include('../listprologinterpreter/listprolog.pl'). + +% as [A,B] is to [[B],A], [c,d] is to X + +% test1(off,1,A). + +% data_to_alg([a,b],[[b],a],[c,d],A2). +% A2=[[d],c] + +data_to_alg(Data1,Data2,Alg1,Alg2) :- + % finds A=a, etc. and [A,B] + data_to_alg1(Data1,[],_Vars1,[],Alg3), + data_to_alg1(Data2,[],_Vars3,[],Alg4), + + % finds [[B],A] from A=a etc. and [[b],a] + % [A,B],[c,d] + interpretpart(match4,Alg3,Alg1,[],Vars2,_), + % [[B],A] + interpretpart(match4,Alg4,[v,sys1],Vars2,Vars4,_), + + % finds [[d],c] from [A,B], [[B],A] and [c,d] + getvalue([v,sys1],Alg2,Vars4). + + +data_to_alg1([],Vars,Vars,Alg,Alg) :- !. +data_to_alg1(Data1,Vars1,Vars2,Alg1,Alg2) :- + Data1=[Data2|Data3], + (is_list(Data2)->%List=true;List=false), + (data_to_alg1(Data2,Vars1,Vars3,[],Alg4), + Alg5=Alg4,append(Alg1,[Alg5],Alg3)); + (append(Vars1,[[[v,Data2],Data2]],Vars3), + append(Alg1,[[v,Data2]],Alg3))), + data_to_alg1(Data3,Vars3,Vars2,Alg3,Alg2),!. 
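`data_to_alg/4` above solves the analogy "as [A,B] is to [[B],A], [c,d] is to X" by abstracting the first example into variable patterns, binding those variables against the new data, and instantiating the second pattern. A compact Python sketch of the bind/instantiate steps (function names are illustrative, not from the source):

```python
def bind(pattern, value, env):
    # walk pattern and value together; leaves of the pattern are
    # variable names bound to the corresponding value
    if isinstance(pattern, list):
        for p, v in zip(pattern, value):
            bind(p, v, env)
    else:
        env[pattern] = value

def instantiate(pattern, env):
    # rebuild the second pattern using the bindings collected above
    if isinstance(pattern, list):
        return [instantiate(p, env) for p in pattern]
    return env[pattern]

def data_to_alg(data1, data2, alg1):
    env = {}
    bind(data1, alg1, env)        # e.g. A=c, B=d
    return instantiate(data2, env)
```

`data_to_alg(["A", "B"], [["B"], "A"], ["c", "d"])` gives `[["d"], "c"]`, matching the worked example in the source comments.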
+ +% if data2 is list, wrap its alg in [] diff --git a/Philosophy/day.sh b/Philosophy/day.sh new file mode 100644 index 0000000000000000000000000000000000000000..b23bfaf844d66d3589ef88eb012f1e119f2be9d9 --- /dev/null +++ b/Philosophy/day.sh @@ -0,0 +1,3 @@ +#!/bin/sh +swipl -f -q ./bag_args21.pl +swipl -f -q ./bag_algs1.pl \ No newline at end of file diff --git a/Philosophy/day2.pl b/Philosophy/day2.pl new file mode 100644 index 0000000000000000000000000000000000000000..bef1c0a79b93b8df638279c4d6346b28765bbf5f --- /dev/null +++ b/Philosophy/day2.pl @@ -0,0 +1,139 @@ +#!/usr/bin/swipl -f -q + +:-include('../listprologinterpreter/listprolog.pl'). +:- use_module(library(date)). +:-include('flush_caches.pl'). +%:-include('bell.pl'). + +%:-include('../listprologinterpreter/la_files.pl'). + +%:- initialization(catch(main, Err, handle_error(Err))). + +main :- +S is 60*60, +catch(catch(call_with_time_limit(S,main2),time_limit_exceeded,(writeln("Timed out."),handle_error(_Err))),Err,handle_error(_Err)),halt. +handle_error(_Err):- + halt(1). +main :- halt(1). 
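`main/0` in `day2.pl` wraps `main2` in `catch(call_with_time_limit(S, main2), time_limit_exceeded, ...)` so a hung run fails after an hour instead of blocking forever. A hedged Python sketch of the same guard, using only the standard library (note that a Python thread cannot be forcibly killed, so this only abandons the result on timeout):

```python
from concurrent.futures import ThreadPoolExecutor
from concurrent.futures import TimeoutError as FutureTimeout

def run_with_time_limit(fn, seconds, on_timeout="Timed out."):
    # mirrors catch(call_with_time_limit(S, main2), time_limit_exceeded, ...)
    with ThreadPoolExecutor(max_workers=1) as pool:
        future = pool.submit(fn)
        try:
            return future.result(timeout=seconds)
        except FutureTimeout:
            return on_timeout
```

`run_with_time_limit(lambda: 42, 2)` returns `42`; a function that overruns the limit returns the timeout message instead.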
+main2 :- + +/* +date_time_stamp(date(2023,1,1,1,0,0,_,_,_),TS0), +date_time_stamp(date(2023,1,2,1,0,0,_,_,_),TS01), +TSD is TS01-TS0, + +get_time(TS1), +In_a_day is TS1+TSD, + + +Hours=[9,12,15], + +get_time(TS),stamp_date_time(TS,date(Year,Month,Day,Hour1,Minute1,Seconda,_A,_TZ,_False),local), + +findall(TSA0,(member(H,Hours),date_time_stamp(date(Year,Month,Day,H,0,0,_,_,_),TSA0)),TSAA), + +*/ +open_file_s("../Lucian-Academy/Books1/algs/lgalgs_a.txt",[_,_,_,FS1]), +not(FS1=""), +open_file_s("../Lucian-Academy/Books1/args/lgtext_a.txt",[_,_,_,FS2]), +not(FS2=""), + +(exists_file('aa_log.txt')->( + +get_time(TS1), + stamp_date_time(TS1,date(Year,Month,Day,Hour1,Minute1,_Seconda,_A,_TZ,_False),local), + foldr(string_concat,["aa_log-",Year,"-",Month,"-",Day,"-",Hour1,"-",Minute1,".txt"],Name), + + mv("aa_log.txt",Name), + + open_s("aa_log.txt",write,SB), + write(SB,'[]'),close(SB) + + );true), + +texttobr2(25,u,"square",u), +BL is %1000,% +4*16000, +%5*16000*8*5, +day_loop(BL,0,0%,In_a_day,TSAA +), +flush_caches, +A is (25-1)*(1 % teleport 1 + + 1 % anti-ageing medicine 1 + + 1 % teleport 2 + + 1 % anti-ageing medicine 2 + + 2 % meditation + + 2),% a thought + +texttobr2_1(A), +%bell("Freezing Ages Complete"), + + nl, + halt. +main :- halt(1). + + +day_loop(Br_lim,Br_ar,Br_al%NB,In_a_day,TSA +) :- + +%get_time(TS1), +(Br_ar>Br_lim->Br_ar2=Br_ar; +( +foldr(string_concat,["swipl -f -q ./bag_args21.pl"],S3)%, +,catch(bash_command(S3,Br_ar10), _, (foldr(string_concat,["Warning."%%"Error: Can't clone ",User3,"/",Repository3," repository on GitHub." 
+ ],_Text4),%writeln1(Text4), + false%fail%abort + )), + +split_string(Br_ar10,"\n\r","\n\r",Br_ar11), +append(_,[Br_ar12],Br_ar11), +number_string(Br_ar1,Br_ar12), + +Br_ar2 is Br_ar+Br_ar1)), + +(Br_al>Br_lim->Br_al2=Br_al; + +%/* +( +foldr(string_concat,["swipl -f -q ./bag_algs1.pl"],S31)%, +,catch(bash_command(S31,Br_al10), _, (foldr(string_concat,["Warning."%%"Error: Can't clone ",User3,"/",Repository3," repository on GitHub." + ],_Text41),%writeln1(Text4), + false%fail%abort + )), +%*/ +split_string(Br_al10,"\n\r","\n\r",Br_al11), +append(_,[Br_al12],Br_al11), +number_string(Br_al1,Br_al12), + + +%get_time(TS1), +Br_al2 is Br_al+Br_al1)), + +/* +findall(TS2,(member(TS2,TSA),TS1>TS2, + +%foldr(atom_concat,['echo "" | mutt -s "Alarm ',NB1,' Breasonings" luciangreen@lucianacademy.com',''],A1), + +%shell1(A1) + + foldr(string_concat,["afplay /System/Library/Sounds/Funk.aiff\nsay \" Alarm "%done"%"echo " + ,%""% + NB1, + "\" Breasonings"],S1), + shell1_s(S1) + +),TS3), +subtract(TSA,TS3), + +*/ +%atom_concat('echo "" | mutt -s "BAG Day Test" luciangreen@lucianacademy.com','',A1), + +%shell1(A1) + +((Br_ar>Br_lim,Br_al>Br_lim)->true; + day_loop(Br_lim,Br_ar2,Br_al2%NB1,In_a_day,TS3 + )) + + ,!. + + \ No newline at end of file diff --git a/Philosophy/day_init.sh b/Philosophy/day_init.sh new file mode 100644 index 0000000000000000000000000000000000000000..bdf29e90f44199b6471f99f15bc7395477fea967 --- /dev/null +++ b/Philosophy/day_init.sh @@ -0,0 +1,13 @@ +#!/bin/sh + +cd GitHub/private2/Text-to-Breasonings/ + +swipl -f -q ./bag_tt1.pl + +cd ../Philosophy/ + +#swipl -f -q ./bag_cat_alg.pl + +#swipl -f -q ./bag_cat_arg.pl + +swipl -f -q ./day.pl diff --git a/Philosophy/debug_tools.pl b/Philosophy/debug_tools.pl new file mode 100644 index 0000000000000000000000000000000000000000..261aee76a9ef4faba07fb700491fdf5fad5c6b5c --- /dev/null +++ b/Philosophy/debug_tools.pl @@ -0,0 +1,22 @@ +% debug_tools.pl + +:-include('pretty_print_table.pl'). 
+:-include('sub_term_with_address.pl'). + +test_p(A) :- (catch(call_with_time_limit(10,A),_,(writeln1(["Predicate timed out:",A]),abort))->true;(writeln1(["Predicate failed:",A]),fail%abort +)),!. + +test_r(A,B) :- +sub_term_types_wa([string,number,atom,[]],A,Instances1), +sub_term_types_wa([string,number,atom,[]],B,Instances2), +(Instances1=Instances2->true; +(subtract(Instances1,Instances2,Instances1a), +subtract(Instances2,Instances1,Instances2a), +(pretty_print_table([A],A1)->true;pretty_print_table([[A]],A1)), +(pretty_print_table([B],B1)->true;pretty_print_table([[B]],B1)), +pretty_print_table([Instances1a],Instances1a1), +pretty_print_table([Instances2a],Instances2a1), +writeln(["**********","\n","Error in result:",A1,"\n**********","\n","given correct result:",B1,"\n**********","\n","Missing terminals:",Instances1a1, +"\n**********","\n","Incorrect terminals:",Instances2a1]), +true%abort +)),!. diff --git a/Philosophy/doc_db.pl b/Philosophy/doc_db.pl new file mode 100644 index 0000000000000000000000000000000000000000..b707db4c292a811f324584a5a6484f74c54ae673 --- /dev/null +++ b/Philosophy/doc_db.pl @@ -0,0 +1,13 @@ +:-include('../listprologinterpreter/listprolog.pl'). + +doc_db(String) :- + +open_s("docs.txt",read,Stream), +read(Stream,Term), +close(Stream), + +findall([Command," - ",Example,"
"], + (member([Command,Example],Term)),Term2), + + foldr(append,Term2,Term3), + foldr(string_concat,Term3,String). diff --git a/Philosophy/eh.pl b/Philosophy/eh.pl new file mode 100644 index 0000000000000000000000000000000000000000..86ea70cd0ac096eca8194213a7c71b1c84fe0acc --- /dev/null +++ b/Philosophy/eh.pl @@ -0,0 +1,176 @@ +% eh.pl + +:- use_module(library(http/thread_httpd)). +:- use_module(library(http/http_dispatch)). +:- use_module(library(http/http_error)). +:- use_module(library(http/html_write)). + +% we need this module from the HTTP client library for http_read_data +:- use_module(library(http/http_client)). +:- http_handler('/', eh_form, []). + +% http://localhost:8000/a + +:-include('../Lucian-Academy/short_essay_helper3_agps.pl'). +:-include('../listprologinterpreter/la_maths.pl'). +:-include('../Text-to-Breasonings/text_to_breasonings.pl'). + +%:-include('test_form1.pl'). + +%:- include('files/listprolog.pl'). + +eh_server(Port) :- + http_server(http_dispatch, [port(Port)]). + + /* + browse http://127.0.0.1:8000/ + This demonstrates handling POST requests + */ + + eh_form(_Request) :- + + %format('Content-type: text/html~n~n', []), + +% data_copy(Header,Footer), +% format(Header,[]), + +% + + reply_html_page( + title('Essay Helper'), + [ + +form([action='/landing1', method='POST'], [ + + p([], [ + label([for=title],'Essay Title'), + input([name=title, type=textarea]) + ]), + + p([], [ + label([for=s1l],'Source 1 long citation, e.g. Surname, I., & Surname, I. (2000). Title. Publisher'), + input([name=s1l, type=textarea]) + ]), + + p([], [ + label([for=s1s],'Source 1 short citation, e.g. Surname, 2000'), + input([name=s1s, type=textarea]) + ]), + + p([], [ + label([for=s1p],'Source 1 page number of first page, e.g. 1'), + input([name=s1p, type=textarea]) + ]), + + p([], [ + label([for=s1c],'Source 1 content'), + input([name=s1c, type=textarea]) + ]), + + p([], input([name=submit, type=submit, value='Submit'], [])) + ])])%, + + %format(Footer,[]) + . 
+
+
+ :- http_handler('/landing1', landing_pad1, []).
+
+ landing_pad1(Request) :-
+ member(method(post), Request), !,
+ http_read_data(Request, Data, []),
+
+
+ format('Content-type: text/html~n~n', []),
+ %format('<html>', []),
+
+Data=[%%debug='off',%%Debug1,
+title=Title,s1l=S1l,s1s=S1s,s1p=S1p,s1c=S1c,submit=_],
+
+
+ data_copy(Header,Footer),
+
+ format(Header,[]),
+
+ atom_string(Title,Title1),
+ atom_string(S1l,S1l1),
+ atom_string(S1s,S1s1),
+ atom_number(S1p,S1p1),
+ atom_string(S1c,S1c1),
+
+ term_to_atom(
+ [S1l1,S1s1,S1p1,S1c1],Texts),
+
+ atom_string(Texts,Texts1),
+ Texts2=[Texts1],
+ %writeln1(Texts1),
+ short_essay_helper(Texts2,Title,1,Essay_0),
+
+ atomic_list_concat(A,"\n",Essay_0),
+ atomic_list_concat(A,"<br>",Essay_01),
+
+ writeln(Essay_01),
+%/*
+%term_to_atom(Hidden1,Hidden),
+
+
+ format(Footer,[])
+
+%,_,writeln("Error."))
+.
+
+
+
+data_copy(Header,Footer) :-
+
+Header='<html>
+<head>
+<title>State Saving Interpreter</title>
+</head>
+<body>
+',
+
+Footer='
+</body>
+</html>
+ +'. diff --git a/Philosophy/family_sols.pl b/Philosophy/family_sols.pl new file mode 100644 index 0000000000000000000000000000000000000000..e520f83e51b6e1d93934d2168947406e441ad9cf --- /dev/null +++ b/Philosophy/family_sols.pl @@ -0,0 +1,144 @@ +parent(albert, jim). +parent(albert, peter). +parent(jim, brian). +parent(peter, lee). +parent(peter, john). + +year_of_birth(albert, 1926). +year_of_birth(peter, 1945). +year_of_birth(jim, 1949). +year_of_birth(brian, 1974). +year_of_birth(john, 1974). +year_of_birth(lee, 1975). + +gender(albert,male). +gender(jim,male). +gender(peter,male). +gender(brian,male). +gender(lee,male). +gender(john,male). + +grandparent(GG) :- + findall([A,B],parent(A,B),C), + findall([Grandparent,Grandchild],(member([Grandparent, Child],C), + member([Child, Grandchild],C)),GG),!. + +% older(A,B) +% means A is older than B +% +older(GG) :- + findall([A,B],year_of_birth(A,B),C), + findall([A1,B1],(member([A1, Y1],C), + member([B1, Y2],C), Y2 > Y1),GG),!. + + +% siblings(A,B) +% means A and B are siblings +% +siblings(GG) :- + findall([A,B],parent(A,B),C), + findall([A1,B1],(member([X, A1],C), + member([X, B1],C),not(A1=B1)),GG),!. + + % A & B share a common parent + % + % A is different from B (Bratko, p175) + + + +% sibling_list(Child, Siblings) +% Siblings is a list of A1's brothers and sisters +% +sibling_list(A1, Siblings) :- + findall([A,B],parent(A,B),C), + findall([A1,B1],(member([X, A1],C), + member([X, B1],C),not(A1=B1)),List), + remove_duplicates(List, Siblings),!. + + +% remove_duplicates(List, Result) +% +% Removes duplicate entries in a list +% +remove_duplicates([], []):-!. + +remove_duplicates([[X1,X2]|Rest], [[X1,X2]|Result]) :- + delete(Rest,[X1,X2],R1), + delete(R1,[X2,X1],R2), + remove_duplicates(R2, Result),!. 
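Editorial note on remove_duplicates/2 above: it treats [X1,X2] and [X2,X1] as the same pair, keeping the first occurrence and deleting later copies in either order. A short Python sketch of the same behaviour (illustration only, not part of the repository):

```python
# Sketch (not part of the repository) of remove_duplicates/2 above:
# keep the first occurrence of each pair, dropping later copies in
# either order ([X1,X2] or [X2,X1]).
def remove_duplicates(pairs):
    result = []
    for x1, x2 in pairs:
        if [x1, x2] not in result and [x2, x1] not in result:
            result.append([x1, x2])
    return result
```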
+
+% older_brother(A,B)
+% means A is an older brother of B
+%
+older_brother(GG) :-
+ older_brother1(P,Y,G),
+
+findall([A,B,C1,B3],(member([A,B],P),member([B,C1],Y),member([B,B3],G)),C),findall([A1,B1],(member([X, A1, C2, male],C),member([X, B1, C21, _],C),C21 > C2),GG),!.
+
+older_brother1(P,Y,G) :-
+ findall([A,B],parent(A,B),P),
+ findall([A,B],year_of_birth(A,B),Y),
+ findall([A,B],gender(A,B),G),!.
+
+
+% descendant(Person, Descendant)
+% means Descendant is a descendant of Person.
+%
+
+descendant(A1,B1) :- findall([A,B],parent(A,B),C), descendant1(A1,B1,C,C).
+descendant1(_Person, [],[],_) :-!.
+descendant1(Person, D2,[[Person,C2]|C1],E) :-
+ descendant1(C2,D, E,E),
+ descendant1(Person,Descendant, C1,E),
+ foldr_append([[C2],D,Descendant],[],D2),!.
+
+descendant1(Person1, Descendant,[[Person2,_C2]|C1],E) :-
+ not(Person1=Person2),
+ descendant1(Person1,Descendant, C1,E),!.
+
+foldr_append([],B,B) :- !.
+foldr_append([A1|A2],B,C) :-
+ append(B,A1,D),
+ foldr_append(A2,D,C),!.
+
+% ancestor(Person, Ancestor)
+% means Ancestor is an ancestor of Person.
+%
+% This is functionally equivalent to descendant(Ancestor, Person).
+%
+
+ancestor(A1,B1) :- findall([A,B],parent(A,B),C), ancestor1(A1,B1,C,C).
+ancestor1(_Person, [],[],_) :-!.
+ancestor1(Person, D2,[[C2,Person]|C1],E) :-
+ ancestor1(C2,D, E,E),
+ ancestor1(Person,Ancestor, C1,E),
+ foldr_append([[C2],D,Ancestor],[],D2),!.
+
+ancestor1(Person1, Ancestor,[[_C2,Person2]|C1],E) :-
+ not(Person1=Person2),
+ ancestor1(Person1,Ancestor, C1,E),!.
+
+
+% children(Parent, ChildList)
+% ChildList is bound to a list of the children of Parent.
+%
+children(Parent, ChildList) :-
+ findall(Child, parent(Parent,Child), ChildList).
+
+
+% list_count(List, Count)
+% Count is bound to the number of elements in List.
+%
+list_count([], 0).
+list_count([_|Tail], Count) :-
+ list_count(Tail, TailCount),
+ Count is TailCount + 1.
+
+
+% count_descendants(Person, Count)
+% Count is bound to the number of descendants of Person.
+% +count_descendants(Person, Count) :- + descendant(Person, List), + list_count(List, Count). + \ No newline at end of file diff --git a/Philosophy/family_test.pl b/Philosophy/family_test.pl new file mode 100644 index 0000000000000000000000000000000000000000..4e4f2a00a97e5e41859e405e52b73c4023231943 --- /dev/null +++ b/Philosophy/family_test.pl @@ -0,0 +1,41 @@ +family_test:- + +grandparent(Result1), + +writeln(Result1), + +older(Result2), + +writeln(Result2), + +siblings(Result3), + +writeln(Result3), + +sibling_list(john,Result5), + +writeln(Result5), + +older_brother(Result6), + +writeln(Result6), + +descendant(albert,Result7), + +writeln(Result7), + +ancestor(john,Result8), + +writeln(Result8), + +children(albert, ChildList), + +writeln(ChildList), + +list_count([1,2,3], Count1), + +writeln(Count1), + +count_descendants(albert, Count2), + +writeln(Count2). diff --git a/Philosophy/file.txt b/Philosophy/file.txt new file mode 100644 index 0000000000000000000000000000000000000000..e69de29bb2d1d6434b8b29ae775ad8c2e48c5391 diff --git a/Philosophy/file2.txt b/Philosophy/file2.txt new file mode 100644 index 0000000000000000000000000000000000000000..e69de29bb2d1d6434b8b29ae775ad8c2e48c5391 diff --git a/Philosophy/file_fd3.txt b/Philosophy/file_fd3.txt new file mode 100644 index 0000000000000000000000000000000000000000..721ff7078d1f63cf5d325d0ff9544541099229bb --- /dev/null +++ b/Philosophy/file_fd3.txt @@ -0,0 +1 @@ +language \ No newline at end of file diff --git a/Philosophy/file_fddb.txt b/Philosophy/file_fddb.txt new file mode 100644 index 0000000000000000000000000000000000000000..c115f9f42357fb4709fd2a3889deef02071f6d77 --- /dev/null +++ b/Philosophy/file_fddb.txt @@ -0,0 +1 @@ +b f. 
\ No newline at end of file diff --git a/Philosophy/find_details3_db bu.txt b/Philosophy/find_details3_db bu.txt new file mode 100644 index 0000000000000000000000000000000000000000..318f4af70257b73d0a310c51d3bb3c65dd52d3c7 --- /dev/null +++ b/Philosophy/find_details3_db bu.txt @@ -0,0 +1 @@ +[[[""],""],[["b","c"],"b c"],[["c","d"]," c d"],[["d","e"]," d e"],[["e","f"]," e f"]] \ No newline at end of file diff --git a/Philosophy/flatten1.pl b/Philosophy/flatten1.pl new file mode 100644 index 0000000000000000000000000000000000000000..3d6487bca644b40f9acdb12b429cf56a1ec5ae86 --- /dev/null +++ b/Philosophy/flatten1.pl @@ -0,0 +1,8 @@ +flatten1(A,B) :- + flatten2(A,[],B),!. + +flatten2([],B,B) :- !. +flatten2(A,B,C) :- + (not(is_list(A))->append(B,[A],C); + (A=[D|E],flatten2(D,B,F), + flatten2(E,F,C))),!. diff --git a/Philosophy/flush_caches.pl b/Philosophy/flush_caches.pl new file mode 100644 index 0000000000000000000000000000000000000000..be59bcd78f38280439d5c2a49238b568ef912b92 --- /dev/null +++ b/Philosophy/flush_caches.pl @@ -0,0 +1,34 @@ +:-include('../Text-to-Breasonings/text_to_breasonings.pl'). +:-include('../t2ab/t2ab.pl'). +:-include('../listprologinterpreter/la_files.pl'). +%:-include('../Lucian-Academy/word_count.pl'). + +%bag_args(96000). + +flush_caches :- + +open_string_file_s("arg_cache.txt",File_string_b_cache), + term_to_atom([_Cache_N,File_string2_b_cache],File_string_b_cache), + + +open_string_file_s("alg_cache.txt",File_string_b_cache2), + term_to_atom([_Cache_N2,File_string2_b_cache2],File_string_b_cache2), + + + + string_concat(File_string2_b_cache,File_string2_b_cache2,N7), + + + %word_count(["string",N7],WC), + + texttobr2(u,u,N7,u,[auto,on]), + t2ab(u,u,N7,u,on), + term_to_atom([0,""],Cache_atom), + open_s("arg_cache.txt",write,Stream1a_cache), + write(Stream1a_cache,Cache_atom), + close(Stream1a_cache), + + open_s("alg_cache.txt",write,Stream1a_cache1), + write(Stream1a_cache1,Cache_atom), + close(Stream1a_cache1), + !. 
\ No newline at end of file diff --git a/Philosophy/folder_name/file_name.txt b/Philosophy/folder_name/file_name.txt new file mode 100644 index 0000000000000000000000000000000000000000..31219e89db9c67f1faa498d7981c46876c99b1d9 --- /dev/null +++ b/Philosophy/folder_name/file_name.txt @@ -0,0 +1 @@ +[[["player_name","Harold"],["score",10],["symbol",key_word]],[["player_name","Queenie"],["score",20],["symbol",schema]]] \ No newline at end of file diff --git a/Philosophy/foldl1.pl b/Philosophy/foldl1.pl new file mode 100644 index 0000000000000000000000000000000000000000..949926e8635b92ef0f5b7cd0724e7b62af942374 --- /dev/null +++ b/Philosophy/foldl1.pl @@ -0,0 +1,37 @@ +/* + +?- foldl1(string_concat1,["a","b"],"",C). +C = "ab". + +?- foldl1(string_concat1,["a","b"],D,"ab"). +D = "". + +?- foldl1(string_concat1,A,"","ab"). +fail +* should return an error + +?- foldl1(append1,[["a"],["b","c"]],[],C). +C = ["a", "b", "c"]. + +?- foldl1(append1,[["a"],["b","c"]],D,["a", "b", "c"]). +D = []. + +?- foldl1(append1,A,[],["a", "b", "c"]). +A = [["a", "b", "c"]] + +*/ + +:-include('string_concat1.pl'). + +foldl1(_,[],A,A):-!. +foldl1(Function,[A1|D],B,C) :- +functor(A,Function,3),arg(1,A,B),arg(2,A,A1),arg(3,A,A3), +A, foldl1(Function,D,A3,C),!. + +% build this higher order function directly in lp + +% test 244 with different modes + +% without an initial value, use the first value + +% foldr diff --git a/Philosophy/if-then.pl b/Philosophy/if-then.pl new file mode 100644 index 0000000000000000000000000000000000000000..f1ca5f3762d028e3273862c3c97c199621fc2072 --- /dev/null +++ b/Philosophy/if-then.pl @@ -0,0 +1,24 @@ +/* +I want to design a top-level interface for Prolog that is not buggy (where cut successfully cuts choicepoints, not if-then causing choicepoints to be buggily cut). I thought that I could convert if-then to clauses where the truth of the antecedent is reified and its truth is tested to be true or false at the start of each clause. 
+
+For example, test1 below shows the original if-then clause.
+*/
+test1:-(a1(1)->b1;c1).
+a1(1).
+b1.
+c1.
+
+/*
+Here, test2 shows how the if-then has been converted into separate clauses. test1. does the same thing as test2., but test2's choicepoints remain available to cut, whereas the if-then clause in test1 may buggily cut choicepoints.
+*/
+test2:-a2(1,R),d2(R).
+
+a2(1,true):-!.
+a2(_,false).
+d2(true):-b2,!.
+d2(false):-c2.
+b2:-writeln(b2).
+c2:-writeln(c2).
+
+
+% I could convert to the test2 format at compilation and display if-then using the test1 format in trace mode.
\ No newline at end of file
diff --git a/Philosophy/interpret.pl
new file mode 100644
index 0000000000000000000000000000000000000000..7c05e1773ef337c4943cabb3719a825fe3e32015
--- /dev/null
+++ b/Philosophy/interpret.pl
@@ -0,0 +1,5 @@
+% run an interpreter in lc, lpi, ssi
+
+% interpret(1,"+",1,E).
+% E=2.
+interpret(A,"+",B,C) :- C is A+B.
\ No newline at end of file
diff --git a/Philosophy/intersection1.pl
new file mode 100644
index 0000000000000000000000000000000000000000..8dd9ad65c683f6af8e8e8229e82d10ab90c5641a
--- /dev/null
+++ b/Philosophy/intersection1.pl
@@ -0,0 +1,13 @@
+intersection1(A,B,C):-
+findall(D,(member(D,A),member(D,B)),C).
+
+intersection2(A,B,C):-intersection2(A,B,[],C),!.
+intersection2([],_,A,A):-!.
+intersection2([A1|A2],B,C,D):-
+member1(A1,B),
+intersection2(A2,B,[A1|C],D).
+intersection2([_|A2],B,C,D):-
+intersection2(A2,B,C,D).
+
+member1(A,[A|_]):-!.
+member1(A,[_|B2]):-member1(A,B2).
\ No newline at end of file
diff --git a/Philosophy/invert_string_concat.pl
new file mode 100644
index 0000000000000000000000000000000000000000..a59cbbd4528d69d0e52396a86d3f156c4b2374fb
--- /dev/null
+++ b/Philosophy/invert_string_concat.pl
@@ -0,0 +1,13 @@
+:-include('pf_types.pl').
+
+% b1("abc",B).
+% B = "abc".
+
+b1(A,G) :-
+string_codes(A,B),findall(C,(member(D,B),char_code(C,D)),E),a(F,[],E),foldr(string_concat,F,G),!.
+
+% reverse2("abc",B).
+% B = "cba".
+
+reverse2(A,G) :-
+string_codes(A,B),findall(C,(member(D,B),char_code(C,D)),E),reverse1(F,[],E),foldr(string_concat,F,G),!.
\ No newline at end of file
diff --git a/Philosophy/join_period.pl
new file mode 100644
index 0000000000000000000000000000000000000000..0053e82847375f527eea14fe9f0fdf36ccc338e9
--- /dev/null
+++ b/Philosophy/join_period.pl
@@ -0,0 +1,11 @@
+% join_period(["a",".","b","c",".",".","s"],C).
+% C = ["a.", "bc..", "s"].
+join_period([],[]) :-!.
+join_period([A],[A]) :-!.
+join_period([X11,".",X13|Xs], [X12|Ys]) :-
+ string_concat(X11,".",X12),
+ not(X13="."),
+ join_period([X13|Xs], Ys),!.
+join_period([A1,A2|A3],Ys) :-
+ foldr(string_concat,[A1,A2],A4),
+ join_period([A4|A3],Ys),!.
diff --git a/Philosophy/lfl2html.pl
new file mode 100644
index 0000000000000000000000000000000000000000..56db842829151c34eb831d572e4a2b2f1e9b641b
--- /dev/null
+++ b/Philosophy/lfl2html.pl
@@ -0,0 +1,45 @@
+% lfl2html.pl
+
+:-include('../listprologinterpreter/la_strings.pl').
+
+%:-include('../listprologinterpreter/la_strings.pl').
+
+% [[b],"Hello"] -> <b>Hello</b>
+% [[p],[[b],"Hello"]] -> <p><b>Hello</b></p>
+
+/*
+
+[[html],[[head],[[title],"Title"]],[[body],[[p],[[b],"Hello"]]]]
+
+H = "<html><head><title>Title</title></head><body><p><b>Hello</b></p></body></html>
".
+
+*/
+
+lfl2html(HTML) :-
+ read_string(user_input,'\n','\r',_,Input),
+ term_to_atom(Term,Input),
+ lfl2html2(Term,HTML),!.
+
+lfl2html2([],[]).
+
+lfl2html2(String,String) :- string(String),!.
+
+lfl2html2(Term,HTML) :-
+ lfl2html_codes(Tag,Open_tag,Close_tag),
+ (Term=[Tag|String]->
+ (lfl2html2(String,HTML3),
+ (is_list(HTML3)->concat_list(HTML3,HTML2);HTML2=HTML3),
+ %trace,
+ concat_list([Open_tag,HTML2,Close_tag],HTML))).
+
+
+lfl2html2([Term|Term2],[HTML|HTML2]) :-
+ lfl2html2(Term,HTML),
+ lfl2html2(Term2,HTML2).
+
+lfl2html_codes([html],"<html>","</html>").
+lfl2html_codes([head],"<head>","</head>").
+lfl2html_codes([body],"<body>","</body>").
+lfl2html_codes([title],"<title>","</title>").
+lfl2html_codes([p],"<p>","</p>").
+lfl2html_codes([b],"<b>","</b>").
diff --git a/Philosophy/magic_square.pl
new file mode 100644
index 0000000000000000000000000000000000000000..b6018f6b0c3180d3b7020b239f17998e46431b06
--- /dev/null
+++ b/Philosophy/magic_square.pl
@@ -0,0 +1,33 @@
+/*
+?- magic_square([A,B,C,D,E,F,G,H,I]).
+
+A = 2,
+B = 7,
+C = 6,
+D = 9,
+E = 5,
+F = 1,
+G = 4,
+H = 3,
+I = 8 ;
+
+*/
+magic_square([A,B,C,
+ D,E,F,
+ G,H,I]) :-
+ permutation([1,2,3,
+ 4,5,6,
+ 7,8,9],[A,B,C,
+ D,E,F,
+ G,H,I]),
+ J is A+B+C,
+ J is D+E+F,
+ J is G+H+I,
+ J is A+D+G,
+ J is B+E+H,
+ J is C+F+I,
+
+ J is A+E+I,
+ J is G+E+C.
+
\ No newline at end of file
diff --git a/Philosophy/make_exercises.pl
new file mode 100644
index 0000000000000000000000000000000000000000..8a941d18d7e3ad7f67efc6e95a349c95fa7dd8a6
--- /dev/null
+++ b/Philosophy/make_exercises.pl
@@ -0,0 +1,97 @@
+% fill the gap, multi choice fill the gap, ()matching, multi-choice matching
+% - ignore words starting with _, connectives, duplicate answers
+
+% (split on space), split and keep punctuation and space except '-
+
+:-include('../listprologinterpreter/la_strings.pl').
+:-include('../listprologinterpreter/la_strings_string.pl').
+:-include('../listprologinterpreter/la_maths.pl').
+
+% ?- trace,split_into_sentences("a b c. d e f.",*,2, E). * run with make_exercises
+% E = [["a b _____.", "c"], [" _____ e f.", "d"]].
+
+split_into_sentences(Text,Connectives,Number_of_questions,Exercises) :-
+
+ split_string(Text,".\n\r",".\n\r",Sentences2),
+ findall(Sentences3,(member(Sentence2,Sentences2),
+ foldr(string_concat,[Sentence2,"."],"",Sentences3)),Sentences),
+
+make_exercise0(Connectives,Sentences,%Sentences1,
+Number_of_questions,[],Exercises1),
+
+findall([Exercise4,Word],(member([Exercise2,Word],Exercises1),foldl(replace_undefined_with_gap,Exercise2,[],Exercise3),
+foldr(string_concat,Exercise3,Exercise4)),Exercises)
+.
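Editorial note on split_into_sentences/4 above: it splits the text into sentences, blanks one suitable (non-connective, non-punctuation) word per sentence, and pairs each gapped sentence with its answer. A simplified Python sketch of the per-sentence step (illustration only, not part of the repository; `make_gap_exercise` and its word filter are invented names):

```python
import random

# Sketch (not part of the repository) of the core step of make_exercises.pl:
# blank out one non-connective word of a sentence, returning the gapped
# sentence together with the answer word.
def make_gap_exercise(sentence, connectives, rng=random):
    words = sentence.rstrip(".").split(" ")
    # candidate positions: plain words that are not connectives
    candidates = [i for i, w in enumerate(words)
                  if w.isalpha() and w not in connectives]
    i = rng.choice(candidates)
    answer = words[i]
    words[i] = "_____"
    return " ".join(words) + ".", answer
```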
+ +replace_undefined_with_gap(A,C,D) :- + (var(A)-> B="_____";B=A),append(C,[B],D). + +make_exercise0(_,_,%_Sentences, +0,Exercises,Exercises) :- !. +make_exercise0(Connectives,Sentences,%Sentences1, +Number_of_questions,Exercises1,Exercises2) :- + +make_exercise1(Connectives,Sentences,%Sentences1, +Sentences0,%Sentences01, +Exercise), + +append(Exercises1,[Exercise],Exercises3), + +Number_of_questions2 is Number_of_questions-1, + +make_exercise0(Connectives,Sentences0,%Sentences01, +Number_of_questions2,Exercises3,Exercises2). + +make_exercise1(Connectives,Sentences,%Sentences1, +Sentences0,%Sentences01, +Exercise) :- + +length(Sentences,Length), numbers(Length,1,[],N), + random_member(N1,%Item,%Pair,%[Sentence,Sentence1], + N%Sentences1 + ), + + get_item_n(Sentences,N1,Item), + +delete_item_n(Sentences,N1,Sentences11), +%delete_item_n(Sentences1,N1,Sentences12), + +%trace, +(make_exercise(Connectives,Sentences,Item,Exercise)-> +(Sentences11=Sentences0%,Sentences12=Sentences01 +); +(make_exercise1(Connectives,Sentences11,%Sentences12 +Sentences0,%Sentences01 +Exercise))). +% if fails then try again + %delete() +% don't use if there is a duplicate + + +make_exercise(Connectives,Sentences1,%[ +String%,Words] +,Exercise) :- + string_codes(String,Codes), + +Punct1='&#@~%`$?+*^,()|.:;=_/[]<>{}\s\t\\"!0123456789', +string_codes(Punct1,Punct), + split_on_substring117(Codes,Punct,[],Exercise1), % except -'\n\r + +find_a_word_to_blank(Connectives,Exercise1,Punct,Sentences1,Exercise). 
+ + +find_a_word_to_blank(Connectives,Exercise1,Punct,Sentences,Exercise) :- + +length(Exercise1,Length), numbers(Length,1,[],N), findall(List,(member(N1,N),get_item_n(Exercise1,N1,Word),(((member(Word,Connectives))->true;(string_codes(Word,[Codes]), + member(Codes,Punct)))-> List=[-,N1,Word];List=[+,N1,Word])),List1), + +findall([+,N2,Word],member([+,N2,Word],List1),List2), + +random_member([+,N2,Word],List2), + +put_item_n(Exercise1,N2,_,Exercise2), + +(member(Exercise2,Sentences)-> +find_a_word_to_blank(Connectives,Exercise1,Punct,Sentences,Exercise); +Exercise=[Exercise2,Word]). + diff --git a/Philosophy/maths_formulas.pl b/Philosophy/maths_formulas.pl new file mode 100644 index 0000000000000000000000000000000000000000..441fd453b3deea02760eee0081e38345022a7217 --- /dev/null +++ b/Philosophy/maths_formulas.pl @@ -0,0 +1,11 @@ +% formula((round(1+2.2*ceiling(2.7))),A),!. +%A=8 +formula(A*B,C) :- formula(A,C1), formula(B,C2), C is C1*C2. +formula(A/B,C) :- formula(A,C1), formula(B,C2), C is C1/C2. +formula(A+B,C) :- formula(A,C1), formula(B,C2), C is C1+C2. +formula(A-B,C) :- formula(A,C1), formula(B,C2), C is C1-C2. +formula(A,C) :- number(A), C=A. + +formula(round(A),C) :- formula(A,C1), C is round(C1). +formula(ceiling(A),C) :- formula(A,C1), C is ceiling(C1). +formula(floor(A),C) :- formula(A,C1), C is floor(C1). diff --git a/Philosophy/memorisation.pl b/Philosophy/memorisation.pl new file mode 100644 index 0000000000000000000000000000000000000000..ba5392bc81a8811e34754f7af207c9bee9023e15 --- /dev/null +++ b/Philosophy/memorisation.pl @@ -0,0 +1,18 @@ +:-include('../listprologinterpreter/listprolog.pl'). + +%memorisation([a,b,c],2) :- +memorisation(_,0) :- !. 
+memorisation(List1,N1) :-
+ repeat,
+ writeln("Say out loud and type the following to help memorise it:"),
+ writeln1(List1),
+ read_string(user_input, "\n", "\r", _, List11),
+ term_to_atom(List1,List11),
+ reverse(List1,List2),
+ repeat,
+ writeln("Reverse, say out loud and type the following to help memorise it:"),
+ writeln1(List2),
+ read_string(user_input, "\n", "\r", _, List21),
+ term_to_atom(List2,List21),
+ N2 is N1-1,
+ memorisation(List1,N2),!.
\ No newline at end of file
diff --git a/Philosophy/operate1.pl
new file mode 100644
index 0000000000000000000000000000000000000000..9536763d01e82adb18083eaa3d977f7f4e913d3c
--- /dev/null
+++ b/Philosophy/operate1.pl
@@ -0,0 +1,216 @@
+% ["Computational English","COMPUTATIONAL ENGLISH by Lucian Green *How can the program cope with real variation? 1 of 4.txt",0,algorithms,"2. I prepared to direct students to enroll in another subject. I did this by writing that the intensive part of the verb was given by the example, 'The subject is full'. First, I enrolled in the subject. Second, I noticed that others prepared to enroll in the subject as well. Third, I noticed that they repeated this until the subject was full. In this way, I prepared to direct students to enroll in another subject by writing that the intensive part of the verb was given by the example, 'The subject is full'."]
+
+/*
+
+?- ['operate1.pl'].
+true.
+
+?- operate1([+,2,1],Outputs),writeln(Outputs).
+[[+,1,1,2],[+,1,2,3],[+,1,3,4],[+,1,4,5],[+,1,5,6],[+,1,1,2],[+,1,2,3],[+,1,3,4],[+,1,4,5],[+,1,5,6],[+,2,1,3],[+,2,2,4],[+,2,3,5],[+,2,4,6],[+,2,5,7],[+,2,1,3],[+,2,2,4],[+,2,3,5],[+,2,4,6],[+,2,5,7],[+,3,1,4],[+,3,2,5],[+,3,3,6],[+,3,4,7],[+,3,5,8],[+,3,1,4],[+,3,2,5],[+,3,3,6],[+,3,4,7],[+,3,5,8],[+,4,1,5],[+,4,2,6],[+,4,3,7],[+,4,4,8],[+,4,5,9],[+,4,1,5],[+,4,2,6],[+,4,3,7],[+,4,4,8],[+,4,5,9],[+,5,1,6],[+,5,2,7],[+,5,3,8],[+,5,4,9],[+,5,5,10],[+,5,1,6],[+,5,2,7],[+,5,3,8],[+,5,4,9],[+,5,5,10],[+,1,1,2],[+,1,2,3],[+,1,3,4],[+,1,4,5],[+,1,5,6],[+,1,1,2],[+,1,2,3],[+,1,3,4],[+,1,4,5],[+,1,5,6],[+,2,1,3],[+,2,2,4],[+,2,3,5],[+,2,4,6],[+,2,5,7],[+,2,1,3],[+,2,2,4],[+,2,3,5],[+,2,4,6],[+,2,5,7],[+,3,1,4],[+,3,2,5],[+,3,3,6],[+,3,4,7],[+,3,5,8],[+,3,1,4],[+,3,2,5],[+,3,3,6],[+,3,4,7],[+,3,5,8],[+,4,1,5],[+,4,2,6],[+,4,3,7],[+,4,4,8],[+,4,5,9],[+,4,1,5],[+,4,2,6],[+,4,3,7],[+,4,4,8],[+,4,5,9],[+,5,1,6],[+,5,2,7],[+,5,3,8],[+,5,4,9],[+,5,5,10],[+,5,1,6],[+,5,2,7],[+,5,3,8],[+,5,4,9],[+,5,5,10]] +Outputs = [[+, 1, 1, 2], [+, 1, 2, 3], [+, 1, 3, 4], [+, 1, 4, 5], [+, 1, 5, 6], [+, 1, 1|...], [+, 1|...], [+|...], [...|...]|...]. + +?- operate1([*,2,1],Outputs),writeln(Outputs). 
+[[*,1,1,1],[*,1,2,2],[*,1,3,3],[*,1,4,4],[*,1,5,5],[*,1,1,1],[*,1,2,2],[*,1,3,3],[*,1,4,4],[*,1,5,5],[*,2,1,2],[*,2,2,4],[*,2,3,6],[*,2,4,8],[*,2,5,10],[*,2,1,2],[*,2,2,4],[*,2,3,6],[*,2,4,8],[*,2,5,10],[*,3,1,3],[*,3,2,6],[*,3,3,9],[*,3,4,12],[*,3,5,15],[*,3,1,3],[*,3,2,6],[*,3,3,9],[*,3,4,12],[*,3,5,15],[*,4,1,4],[*,4,2,8],[*,4,3,12],[*,4,4,16],[*,4,5,20],[*,4,1,4],[*,4,2,8],[*,4,3,12],[*,4,4,16],[*,4,5,20],[*,5,1,5],[*,5,2,10],[*,5,3,15],[*,5,4,20],[*,5,5,25],[*,5,1,5],[*,5,2,10],[*,5,3,15],[*,5,4,20],[*,5,5,25],[*,1,1,1],[*,1,2,2],[*,1,3,3],[*,1,4,4],[*,1,5,5],[*,1,1,1],[*,1,2,2],[*,1,3,3],[*,1,4,4],[*,1,5,5],[*,2,1,2],[*,2,2,4],[*,2,3,6],[*,2,4,8],[*,2,5,10],[*,2,1,2],[*,2,2,4],[*,2,3,6],[*,2,4,8],[*,2,5,10],[*,3,1,3],[*,3,2,6],[*,3,3,9],[*,3,4,12],[*,3,5,15],[*,3,1,3],[*,3,2,6],[*,3,3,9],[*,3,4,12],[*,3,5,15],[*,4,1,4],[*,4,2,8],[*,4,3,12],[*,4,4,16],[*,4,5,20],[*,4,1,4],[*,4,2,8],[*,4,3,12],[*,4,4,16],[*,4,5,20],[*,5,1,5],[*,5,2,10],[*,5,3,15],[*,5,4,20],[*,5,5,25],[*,5,1,5],[*,5,2,10],[*,5,3,15],[*,5,4,20],[*,5,5,25]] +Outputs = [[*, 1, 1, 1], [*, 1, 2, 2], [*, 1, 3, 3], [*, 1, 4, 4], [*, 1, 5, 5], [*, 1, 1|...], [*, 1|...], [*|...], [...|...]|...]. + +?- operate1([-,2,1],Outputs),writeln(Outputs). 
+[[-,1,1,0],[-,1,2,-1],[-,1,3,-2],[-,1,4,-3],[-,1,5,-4],[-,1,1,0],[-,1,2,-1],[-,1,3,-2],[-,1,4,-3],[-,1,5,-4],[-,2,1,1],[-,2,2,0],[-,2,3,-1],[-,2,4,-2],[-,2,5,-3],[-,2,1,1],[-,2,2,0],[-,2,3,-1],[-,2,4,-2],[-,2,5,-3],[-,3,1,2],[-,3,2,1],[-,3,3,0],[-,3,4,-1],[-,3,5,-2],[-,3,1,2],[-,3,2,1],[-,3,3,0],[-,3,4,-1],[-,3,5,-2],[-,4,1,3],[-,4,2,2],[-,4,3,1],[-,4,4,0],[-,4,5,-1],[-,4,1,3],[-,4,2,2],[-,4,3,1],[-,4,4,0],[-,4,5,-1],[-,5,1,4],[-,5,2,3],[-,5,3,2],[-,5,4,1],[-,5,5,0],[-,5,1,4],[-,5,2,3],[-,5,3,2],[-,5,4,1],[-,5,5,0],[-,1,1,0],[-,1,2,-1],[-,1,3,-2],[-,1,4,-3],[-,1,5,-4],[-,1,1,0],[-,1,2,-1],[-,1,3,-2],[-,1,4,-3],[-,1,5,-4],[-,2,1,1],[-,2,2,0],[-,2,3,-1],[-,2,4,-2],[-,2,5,-3],[-,2,1,1],[-,2,2,0],[-,2,3,-1],[-,2,4,-2],[-,2,5,-3],[-,3,1,2],[-,3,2,1],[-,3,3,0],[-,3,4,-1],[-,3,5,-2],[-,3,1,2],[-,3,2,1],[-,3,3,0],[-,3,4,-1],[-,3,5,-2],[-,4,1,3],[-,4,2,2],[-,4,3,1],[-,4,4,0],[-,4,5,-1],[-,4,1,3],[-,4,2,2],[-,4,3,1],[-,4,4,0],[-,4,5,-1],[-,5,1,4],[-,5,2,3],[-,5,3,2],[-,5,4,1],[-,5,5,0],[-,5,1,4],[-,5,2,3],[-,5,3,2],[-,5,4,1],[-,5,5,0]] +Outputs = [[-, 1, 1, 0], [-, 1, 2, -1], [-, 1, 3, -2], [-, 1, 4, -3], [-, 1, 5, -4], [-, 1, 1|...], [-, 1|...], [-|...], [...|...]|...]. + +?- operate1([/,2,1],Outputs),writeln(Outputs). 
+[[/,1,1,1],[/,1,2,0.5],[/,1,3,0.3333333333333333],[/,1,4,0.25],[/,1,5,0.2],[/,1,1,1],[/,1,2,0.5],[/,1,3,0.3333333333333333],[/,1,4,0.25],[/,1,5,0.2],[/,2,1,2],[/,2,2,1],[/,2,3,0.6666666666666666],[/,2,4,0.5],[/,2,5,0.4],[/,2,1,2],[/,2,2,1],[/,2,3,0.6666666666666666],[/,2,4,0.5],[/,2,5,0.4],[/,3,1,3],[/,3,2,1.5],[/,3,3,1],[/,3,4,0.75],[/,3,5,0.6],[/,3,1,3],[/,3,2,1.5],[/,3,3,1],[/,3,4,0.75],[/,3,5,0.6],[/,4,1,4],[/,4,2,2],[/,4,3,1.3333333333333333],[/,4,4,1],[/,4,5,0.8],[/,4,1,4],[/,4,2,2],[/,4,3,1.3333333333333333],[/,4,4,1],[/,4,5,0.8],[/,5,1,5],[/,5,2,2.5],[/,5,3,1.6666666666666667],[/,5,4,1.25],[/,5,5,1],[/,5,1,5],[/,5,2,2.5],[/,5,3,1.6666666666666667],[/,5,4,1.25],[/,5,5,1],[/,1,1,1],[/,1,2,0.5],[/,1,3,0.3333333333333333],[/,1,4,0.25],[/,1,5,0.2],[/,1,1,1],[/,1,2,0.5],[/,1,3,0.3333333333333333],[/,1,4,0.25],[/,1,5,0.2],[/,2,1,2],[/,2,2,1],[/,2,3,0.6666666666666666],[/,2,4,0.5],[/,2,5,0.4],[/,2,1,2],[/,2,2,1],[/,2,3,0.6666666666666666],[/,2,4,0.5],[/,2,5,0.4],[/,3,1,3],[/,3,2,1.5],[/,3,3,1],[/,3,4,0.75],[/,3,5,0.6],[/,3,1,3],[/,3,2,1.5],[/,3,3,1],[/,3,4,0.75],[/,3,5,0.6],[/,4,1,4],[/,4,2,2],[/,4,3,1.3333333333333333],[/,4,4,1],[/,4,5,0.8],[/,4,1,4],[/,4,2,2],[/,4,3,1.3333333333333333],[/,4,4,1],[/,4,5,0.8],[/,5,1,5],[/,5,2,2.5],[/,5,3,1.6666666666666667],[/,5,4,1.25],[/,5,5,1],[/,5,1,5],[/,5,2,2.5],[/,5,3,1.6666666666666667],[/,5,4,1.25],[/,5,5,1]] +Outputs = [[/, 1, 1, 1], [/, 1, 2, 0.5], [/, 1, 3, 0.3333333333333333], [/, 1, 4, 0.25], [/, 1, 5, 0.2], [/, 1, 1|...], [/, 1|...], [/|...], [...|...]|...]. + +*/ + +:-include('../listprologinterpreter/la_maths.pl'). +:-include('../listprologinterpreter/la_strings.pl'). + +number_generator(Min,Max,N) :- + numbers(Max,Min,[],Ns), + member(N,Ns). + +operations([[+,2,1],[-,2,1],[*,2,1],[/,2,1]]). + +% operate1([+,2,1],Outputs). 
+ +operate1([Operator,Input_number,Output_number],Outputs) :- + numbers(Input_number,1,[],Input_numbers), + findall([Input1,Input2],(member(_Input_number1,Input_numbers), + number_generator(1,5,Input1), + member(_Input_number2,Input_numbers), + number_generator(1,5,Input2)),Inputs), + operate2([Operator,_,Output_number],Inputs,Outputs). + +operate2([Operator,_Input_number,Output_number],Inputs,Outputs) :- + %length(List,Input_number), + findall([Operator,A,B,Outputs1],(member([A,B],Inputs), + operate3(Operator,[A,B],Output_number,Outputs1)),Outputs). + +operate3(+,[A,B],1,Output) :- + Output is A+B. +operate3(-,[A,B],1,Output) :- + Output is A-B. +operate3(*,[A,B],1,Output) :- + Output is A*B. +operate3(/,[A,B],1,Output) :- + Output is A/B. + +/* + +% 2. I prepared to direct students to enroll in another subject. + +?- enroll("Joan","Clemens","Mathematics",[["Mathematics",[]]],E2),writeln(E2). +[[Mathematics,[[Joan,Clemens]]]] +E2 = [["Mathematics", [["Joan", "Clemens"]]]]. + +?- enroll("Anne","Mable","Mathematics",[["Mathematics", [["Joan", "Clemens"]]]],E2),writeln(E2). +[[Mathematics,[[Joan,Clemens],[Anne,Mable]]]] +E2 = [["Mathematics", [["Joan", "Clemens"], ["Anne", "Mable"]]]]. + +*/ + +enroll(First,Second,Subject,Enrollments1,Enrollments2) :- + member([Subject,Enrollments3],Enrollments1), + append(Enrollments3,[[First,Second]],Enrollments4), + delete(Enrollments1,[Subject,Enrollments3],Enrollments5), + append(Enrollments5,[[Subject,Enrollments4]],Enrollments2). + +enroll2(First,Second,Subject,Enrollments1,Enrollments2) :- + member([Subject,Q,Enrollments3],Enrollments1), + append(Enrollments3,[[First,Second]],Enrollments4), + delete(Enrollments1,[Subject,Q,Enrollments3],Enrollments5), + append(Enrollments5,[[Subject,Q,Enrollments4]],Enrollments2). + +% I did this by writing that the intensive part of the verb was given by the example, 'The subject is full'. 
+ +% v, o matched with base meanings +% alg found + +% - +% alg: A equals/is 1 plus/summed with 1 + +% sentence_to_meaning(["a","is","1","summed with","1"],Meaning). +% Meaning = ["a", "equals", "1", "plus", "1"]. + +sentence_to_meaning(Sentence,Meaning) :- + findall(Part1,(member(Part,Sentence),((meaning(Part1,Parts), + member(Part,Parts))->true;Part1=Part)),Meaning). + +meaning("equals",["is","is calculated to be"]). +meaning("plus",["summed with","added to"]). + +% meaning_to_algorithm(["a", "equals", "1", "plus", "1"],A). +% A = ["a", "equals", 2]. + +meaning_to_algorithm([V, "equals", A, "plus", B],Algorithm) :- + number_string(A1,A), + number_string(B1,B), + operate3(+,[A1,B1],1,Output), + Algorithm=[V, "equals", Output]. + +/* +?- foldr(unenroll,[[a,b,m],[c,d,m]],[[m,[[e,f],[a,b]]]],A). +A = [[m, [[e, f]]]]. +*/ + +unenroll(First,Second,Subject,Enrollments1,Enrollments2) :- + member([Subject,Enrollments3],Enrollments1), + delete(Enrollments3,[First,Second],Enrollments4), + delete(Enrollments1,[Subject,Enrollments3],Enrollments5), + append(Enrollments5,[[Subject,Enrollments4]],Enrollments2). + +number_of_students(Subject,Enrollments,Number_of_students) :- + member([Subject,Enrollments2],Enrollments), + length(Enrollments2,Number_of_students). + +% enroll_with_quota([[first1,last1,"Mathematics"],[first1,last1,"Mathematics"]],[["Mathematics",1, [["Joan", "Clemens"]]]],Enrollments2). + +/* + +enroll_with_quota([[a,b,m],[c,d,m]],[[m,1,[[e,f]]]],A). +A = [[m, 1, [[e, f]]]]. + +enroll_with_quota([[a,b,m],[c,d,m]],[[m,0,[]]],A). +A = [[m, 0, []]]. + +enroll_with_quota([[a,b,m],[c,d,m]],[[m,5,[[e,f]]]],A). +A = [[m, 5, [[e, f], [c, d], [a, b]]]]. + +enroll_with_quota([[a,b,m],[c,d,m]],[[m,2,[[e,f]]]],A). +A = [[m, 2, [[e, f], [a, b]]]]. 
+ +*/ + +enroll_with_quota(Students_to_enroll,Enrollments1,Enrollments2) :- + % find list of subjects in demand + findall(Subject,(member([_,_,Subject],Students_to_enroll)),Subjects1), + sort(Subjects1,Subjects2), + + % find list of quota limits for each subject, number of students currently in each subject and the number of places available + findall([Subject,_Quota1,_Number_of_current_students1,Places_available],(member(Subject,Subjects2),member([Subject,Quota,Current_students],Enrollments1),length(Current_students,Number_of_current_students),Places_available is Quota-Number_of_current_students,Places_available>=1),Data), + + % find lists of new students for each subject + findall(New_students2,(member([Subject,_Quota,_Number_of_current_students2,Places_available],Data),length(New_students,Places_available), + findall([First,Last,Subject],member([First,Last,Subject],Students_to_enroll),Students_demanding_enrollment), + + %trace, + length(Students_demanding_enrollment,Number_of_students_demanding_enrollment), + (Places_available>Number_of_students_demanding_enrollment-> + New_students2=Students_demanding_enrollment; + (append(New_students,_,Students_demanding_enrollment), + New_students=New_students2))),New_students1), + + %trace, + + % add students + foldl(append,New_students1,[],New_students3), + + %(New_students1=[]->Enrollments1=Enrollments2; + %([New_students3]=New_students1, + foldl(enroll2,New_students3,Enrollments1,Enrollments2).%%)). + +enroll([First,Second,Subject],Enrollments1,Enrollments2) :- + enroll(First,Second,Subject,Enrollments1,Enrollments2). + +unenroll([First,Second,Subject],Enrollments1,Enrollments2) :- + unenroll(First,Second,Subject,Enrollments1,Enrollments2). + +enroll2([First,Second,Subject],Enrollments1,Enrollments2) :- + enroll2(First,Second,Subject,Enrollments1,Enrollments2). + +% ["Creating and Helping Pedagogues","CREATE AND HELP PEDAGOGUES by Lucian Green Pedagogy Helper - Write on Breasoning - Politics 3 of 3.txt",0,algorithms,"21. 
Rural area residents wrote pedagogies."]
+
+% grade_book
+
+/*
+
+enter_grade(f,s,m,"a1",60,[[m,1,[[f,s,[]]]]],A),writeln1(A).
+A=[[m,1,[[f,s,[["a1",60],["Overall_grade",60]]]]]]
+
+enter_grade(f,s,m,"a2",70,[[m,1,[[f,s,[["a1",60],["Overall_grade",60]]]]]],A),writeln1(A).
+A=[[m,1,[[f,s,[["a1",60],["a2",70],["Overall_grade",65]]]]]]
+
+*/
+
+enter_grade(First,Second,Subject,Assignment_name,Grade,Enrollments1,Enrollments2) :-
+ member([Subject,Q,Enrollments3],Enrollments1),
+ member([First,Second,Grades1],Enrollments3),
+ delete(Grades1,[Assignment_name,_],Grades2),
+ append(Grades2,[[Assignment_name,Grade]],Grades3),
+ delete(Grades3,["Overall_grade",_],Grades4),
+ %trace,
+ findall(Grade2,(member([_,Grade2],Grades4)),Grades6),
+ find_overall_grade(Grades6,Overall_grade),
+ append(Grades4,[["Overall_grade",Overall_grade]],Grades5),
+ delete(Enrollments3,[First,Second,Grades1],Enrollments4),
+ append(Enrollments4,[[First,Second,Grades5]],Enrollments5),
+ delete(Enrollments1,[Subject,Q,Enrollments3],Enrollments6),
+ append(Enrollments6,[[Subject,Q,Enrollments5]],Enrollments2).
+
+% find overall grade
+
+find_overall_grade(List,Overall_grade) :-
+ % sum/3 (repository helper) sums a whole list: sum(List,Acc,Total)
+ sum(List,0,Sum),
+ length(List,Length),
+ Overall_grade is floor(Sum/Length). diff --git a/Philosophy/p1.pl b/Philosophy/p1.pl new file mode 100644 index 0000000000000000000000000000000000000000..a6722ba6612b7834034a08609753d23b0f596292 --- /dev/null +++ b/Philosophy/p1.pl @@ -0,0 +1,11 @@ +/*
+p1(["a",["b", "b"],"a",["b"]],[],A).
+A = ["a", ["b", "b"], "a", ["b"]]
+*/
+
+p1([],V1,V1).
+p1(V1,V2,V3):-equals4(V1,[V11,V12|V13]),equals4(V11,"a"),p2(V12,[],V15),append(V2,[V11,V15],V14),p1(V13,V14,V3).
+p2([],V4,V4).
+p2(V4,V5,V6):-equals4(V4,[V7|V8]),equals4(V7,"b"),append(V5,[V7],V9),p2(V8,V9,V6).
+
+equals4(A,B):-A=B.
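The grade-book code above stores each assignment as a `[Name,Grade]` pair and recomputes `"Overall_grade"` after every entry as the floor of the mean of all grades. A minimal standalone sketch of that averaging step, assuming SWI-Prolog's built-in `sum_list/2` in place of the repository's own `sum/3` helper (`find_overall_grade_sketch` is an illustrative name, not part of the file):

```prolog
% Floor of the mean of a list of grades, as in find_overall_grade/2 above.
% Uses SWI-Prolog's built-in sum_list/2; the repository's version calls
% its own sum/3 helper instead.
find_overall_grade_sketch(Grades, Overall) :-
    sum_list(Grades, Sum),
    length(Grades, Length),
    Overall is floor(Sum / Length).

% ?- find_overall_grade_sketch([60, 70], G).
% G = 65.
```

So after entering a 60 and a 70, the stored `"Overall_grade"` is 65, matching the worked example in the comment block above.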
diff --git a/Philosophy/paraphraser1.pl b/Philosophy/paraphraser1.pl new file mode 100644 index 0000000000000000000000000000000000000000..072c965619f7641b3afbc180a18a154940f36d72 --- /dev/null +++ b/Philosophy/paraphraser1.pl @@ -0,0 +1,120 @@ +% paraphraser1.pl + +% Tries to paraphrase each non-trivial word + +% Uses formatting preserver x splits on space, return x needs formatting - use lpi 117 + +/* +% place text to be paraphrased in file.txt +cd Philosophy/ +swipl +['paraphraser1.pl']. +paraphraser([file,"file.txt"],_). +% Find paraphrased file in file2.txt +*/ + +:-dynamic auto/1. + +:- include('../listprologinterpreter/listprolog'). + +paraphraser(Parameters,File_list_a,Auto) :- + retractall(auto(_)), + assertz(auto(Auto)), + paraphraser(Parameters,File_list_a),!. + +paraphraser([string,String],File_list_a) :- + (not(catch(auto(_),_,false))-> + (retractall(auto(_)), + assertz(auto(off)));true), + %File1="test1.pl", + string_codes(String,Codes), + paraphraser1(Codes,File_list_a),!. + +paraphraser([file,File1],File_list_a +) :- + (not(catch(auto(_),_,false))-> + (retractall(auto(_)), + assertz(auto(off)));true), + %File1="test1.pl", + phrase_from_file_s(string(Codes), File1), + paraphraser1(Codes,File_list_a), + (open_s("file2.txt",write,Stream1), + write(Stream1,File_list_a), + close(Stream1)),!. 
+ +paraphraser1(Codes,File_list_a) :- + SepandPad=%".",% + "&#@~%`$?-+*^,()|.:;=_/[]<>{}\n\r\s\t\\\"!0123456789", % doesn't have "'" xx + %string_codes(String1,Codes), + %string_to_list2(SepandPad,[],SepandPad1), + string_codes(SepandPad,SepandPad1), + %split_string2(String1,SepandPad1,File_list), + split_on_substring117(Codes,SepandPad1,[],File_list), + %split_string(String1,SepandPad,SepandPad,File_list), + + phrase_from_file_s(string(Codes2), "thesaurus.txt"), + %trace, + string_codes(String2,Codes2), + atom_to_term(String2,Synonym_list,_), + +paraphrase1(File_list,[],File_list2a,Synonym_list,Synonym_list2), + + concat_list(File_list2a,File_list_a), + + term_to_atom(Synonym_list2,Synonym_list_a), + + (open_s("thesaurus.txt",write,Stream2), + write(Stream2,Synonym_list_a), + close(Stream2)),!. + +string_to_list2(B,A1,A2) :- + string_concat(D,"",B), + string_length(D,1), + append(A1,[D],A2),!. +string_to_list2(A,B,C) :- + string_concat(D,E,A), + string_length(D,1), + append(B,[D],F), + string_to_list2(E,F,C). + + paraphrase1([""],File_list,File_list,Synonym_list,Synonym_list) :- !. + paraphrase1([],File_list,File_list,Synonym_list,Synonym_list) :- !. +paraphrase1(File_list,File_list1,File_list2,Synonym_list,Synonym_list2) :- + File_list=[File_list3|File_list4], + string_concat(File_list5,_E,File_list3), + string_length(File_list5,1), + string_codes(File_list5,File_list5_c), + (not(phrase(word1(File_list5_c),_))->true; + member(File_list3,["the","a","i","on","with","of","an","for","to","was","were","and","in","my","from","out","by"])), + append(File_list1,[File_list3],File_list6), + paraphrase1(File_list4,File_list6,File_list2,Synonym_list,Synonym_list2). 
+ +paraphrase1(File_list,File_list1,File_list2,Synonym_list,Synonym_list2) :- + File_list=[File_list3|File_list4], + string_concat(File_list5,_E,File_list3), + string_length(File_list5,1), + %letters(File_list5), + string_codes(File_list5,File_list5_c), + phrase(word1(File_list5_c),_), + (((member([File_list3,Synonym],Synonym_list)->true; + member([Synonym,File_list3],Synonym_list))-> + (append(File_list1,[Synonym],File_list6), + Synonym_list=Synonym_list1); + ((%true%trace, + %trace, + auto(off)) + -> + (concat_list(["What is a synonym for: ",File_list3,"\n","or to skip."],Notification),writeln(Notification), + read_string(user_input, "\n", "\r", _End, Input), + (Input=""->(Synonym2=File_list3);(Synonym2=Input)),append(Synonym_list,[[File_list3,Synonym2]],Synonym_list1)); + (%File_list1=File_list6, + Synonym2=File_list3, + Synonym_list=Synonym_list1) + )), + append(File_list1,[Synonym2],File_list6)), + paraphrase1(File_list4,File_list6,File_list2,Synonym_list1,Synonym_list2). + + + + + diff --git a/Philosophy/paraphraser1_lp1.txt b/Philosophy/paraphraser1_lp1.txt new file mode 100644 index 0000000000000000000000000000000000000000..2b610ea94e080121909c3a6bffc30d5e6dd403cb --- /dev/null +++ b/Philosophy/paraphraser1_lp1.txt @@ -0,0 +1,4 @@ +assertz(auto(Auto)), +paraphrase1(File_list1,[],File_list,Synonym_list1,Synonym_list), +term_to_atom([File_list,Synonym_list],A),write(A). +:-include('paraphraser1.pl'). diff --git a/Philosophy/pft.pl b/Philosophy/pft.pl new file mode 100644 index 0000000000000000000000000000000000000000..1ddb2cf87b4d35828cb04ee4901caa1fff3f01b3 --- /dev/null +++ b/Philosophy/pft.pl @@ -0,0 +1,528 @@ +/* + +Moral compass + +1. I performed the task for the student well. 
* + +- replicating work for the student +text +- ontologies, connections, possibly algs + - algs (sentences or words instead of letters, pft - sees if have predictable patterns otherwise uses nn like caw (starting closer to the place x) - recog sort or max with algs, (which also produces data in the backwards direction) x forwards direction) + - nn x pft + - recn +role + + +- replace data including characters in strings with symbols +- move data using prev alg + - equals4 version that concatenates strings +- recn for patterns (list type) - use data types and vars in pft types + +1. data to types +2. groups types into lists - multiple sets of i,o +3. alg to write pred that converts i->o x uses types to convert x prev alg +4. try with new data + + +data: i:[1,[a,a,a]] o: [[a,a,a],2] +types: {atom...} + +i:"ab" or "abab" +o:"ba" or "baba" + +[a,b,a] - {[a,1],[a,2]}:[a,1] +[[a,b],a] - [{[a,1],[a,2]}]:[a,1] +x +[[a,b],a] - {[a,1],[a,2]}:[a,1] + +i: [3,1,2] +o: [1,2] + +result: sort, remove last item + +*/ + +% data to types with lists + +%% data_to_types2.pl + +% needs vars, list of vars-data and finds lists across sets of data + +% data to types with strings + +/** + +['../listprologinterpreter/listprolog.pl']. + +** Data given is the list of lists in input, not input and output. + +% try with single v, mult v items, mult sets v with prev, l=5,4, brackets of brackets with prev + + +data_to_types22([[1]],T1,VD),find_lists(T1,[],Sets2),writeln(T1),writeln(VD),writeln(Sets2). +Sets2 = [[t, number, 1]]. + +* +data_to_types22([[1,1]],T1,VD),find_lists(T1,[],Sets2),writeln(T1),writeln(VD),writeln(Sets2). +Sets2 = [[t, number, 1], [t, number, 1]]. + +data_to_types22([[1],[1]],T1,VD), find_lists(T1,[],Sets2),writeln(T1),writeln(VD),writeln(Sets2). +Sets2 = [[t, number, 1]]. +v + +data_to_types22([[1,1],[1,1]],T1,VD), find_lists(T1,[],Sets2),writeln(T1),writeln(VD),writeln(Sets2). +Sets2 = [[t, number, 1], [t, number, 1]]. 
+v
+
+
+
+data_to_types22([[[1,1]]],T1,VD),find_lists(T1,[],Sets2),writeln(T1),writeln(VD),writeln(Sets2).
+Sets2 = [[[t, brackets], [[t, number, 1], [t, number, 1]]]].
+
+
+
+data_to_types22([[[1,1]],[[1,1]]],T1,VD),find_lists(T1,[],Sets2),writeln(T1),writeln(VD),writeln(Sets2).
+Sets2 = [[[t, list], [[t, number, 1], [t, number, 1]]]].
+
+
+data_to_types22([[[1]],[[2]]],T1,VD),find_lists(T1,[],Sets2),writeln(T1),writeln(VD),writeln(Sets2).
+Sets2 = [[[t, list], [[t, number, 1]]]].
+
+data_to_types22([[[1]],[[1]]],T1,VD),find_lists(T1,[],Sets2),writeln(T1),writeln(VD),writeln(Sets2).
+Sets2 = [[[t, list], [[t, number, 1]]]].
+
+
+
+data_to_types22([[[1,"a"]],[[2,"b"]]],T1,VD),find_lists(T1,[],Sets2),writeln(T1),writeln(VD),writeln(Sets2).
+Sets2 = [[[t, list], [[t, number, 1], [t, string, 2]]]].
+
+
+data_to_types22([[[1]],[[2,2]]],T1,VD),find_lists(T1,[],Sets2),writeln(T1),writeln(VD),writeln(Sets2).
+Sets2 = [[[t, list], [[t, number, 1]]]].
+
+data_to_types22([[[1,1]],[[2,2,2,2]]],T1,VD),find_lists(T1,[],Sets2),writeln(T1),writeln(VD),writeln(Sets2).
+Sets2 = [[[t, list], [[t, number, 1], [t, number, 1]]]].
+
+
+data_to_types22([[["1",1]],[["2",2,"2",2]]],T1,VD),find_lists(T1,[],Sets2),writeln(T1),writeln(VD),writeln(Sets2).
+Sets2 = [[[t, list], [[t, string, 1], [t, number, 2]]]].
+
+
+data_to_types22([[["2",2,"2",2]],[["2",2,"2",2]]],T1,VD),find_lists(T1,[],Sets2),writeln(T1),writeln(VD),writeln(Sets2).
+Sets2 = [[[t,list],[[t,string,1],[t,number,2],[t,string,1],[t,number,2]]]]
+
+
+data_to_types22([[[1]],[[1,1]]],T1,VD),find_lists(T1,[],Sets2),writeln(T1),writeln(VD),writeln(Sets2).
+Sets2 = [[[t, list], [[t, number, 1]]]].
+
+data_to_types22([[[1,2,3,4]],[[1,2,3,4,1,2,3,4]]],T1,VD),find_lists(T1,[],Sets2),writeln(T1),writeln(VD),writeln(Sets2).
+Sets2 = [[[t,list],[[t,number,1],[t,number,2],[t,number,3],[t,number,4]]]] + +data_to_types22([[[[1,2,3,4]]],[[[1,2,3,4,1,2,3,4]]]],T1,VD),find_lists(T1,[],Sets2),writeln(T1),writeln(VD),writeln(Sets2). +Sets2 = [[[t,list],[[t,list],[[t,number,1],[t,number,2],[t,number,3],[t,number,4]]]]] + +data_to_types22([[[[[1,2,3,4]]]],[[[[1,2,3,4,1,2,3,4]]]]],T1,VD),find_lists(T1,[],Sets2),writeln(T1),writeln(VD),writeln(Sets2). +Sets2 = [[[t,list],[[t,list],[[t,list],[[t,number,1],[t,number,2],[t,number,3],[t,number,4]]]]]] + +data_to_types22([[[[a,b],2,3,4]],[[[a,b],2,3,4,[a,b],2,3,4]]],T1,VD),find_lists(T1,[],Sets2),writeln(T1),writeln(VD),writeln(Sets2). +x + + +* need data ID(N) to write alg + +**/ + +:- dynamic get_data_n1/1. +:- include('pft_expand_types.pl'). + +%data_to_types22(Data,Types4,VD) :- +% retractall(get_data_n1(_)), +% assertz(get_data_n1(0)), + +data_to_types22(Data,Types4,VD) :- + retractall(get_data_n1(_)), + assertz(get_data_n1(0)), + %findall([Types2,VD2],(member(Data1,Data),data_to_types2(Data1,[],Types2,[],VD2)),Types31), + data_to_types221(Data,[],Types4,[],VD). + %sort(Types3,Types4). + %findall(Types,member([Types,_],Types31),Types3), + %findall(VD1,member([_,VD1],Types31),VD). + + +data_to_types221([],Types,Types,VD,VD) :- !. +data_to_types221(Data,Types1,Types2,VD1,VD2) :- + Data=[Data1|Data2], + pft_expand_types(Data1,[],Types3,VD1,VD3), + append(Types1,[Types3],Types4), + data_to_types221(Data2,Types4,Types2,VD3,VD2). + + +get_data_n(N) :- + get_data_n1(N1), + N is N1+1, + retractall(get_data_n1(_)), + assertz(get_data_n1(N)). + +/* +data_to_types20(Data,Types1,Types2,VD1,VD2) :- + (data_to_types21(Data,Types1,Types2,VD1,VD2)->true; + data_to_types2(Data,Types1,Types2,VD1,VD2,true)). + +%data_to_types2(Data,Types1,Types2,VD1,VD2) :- +% data_to_types2(Data,Types1,Types2,VD1,VD2). 
+ +data_to_types21(Data,Types1,Types2,VD1,VD2) :- +get_lang_word("t",T), +get_lang_word("number",Dbw_number), + number(Data), + (member([Dbw_number,Data,N],VD1)->VD2=VD1; + (get_data_n(N), + append(VD1,[[Dbw_number,Data,N]],VD2))), + append(Types1,[T,Dbw_number,N],Types2),!. +data_to_types21(Data,Types1,Types2,VD1,VD2) :- +%trace, +get_lang_word("t",T), +get_lang_word("string",Dbw_string), + string(Data), + (member([Dbw_string,Data,N],VD1)->VD2=VD1; + (get_data_n(N), + append(VD1,[[Dbw_string,Data,N]],VD2))), + append(Types1,[T,Dbw_string,N],Types2),!. +data_to_types21(Data,Types1,Types2,VD1,VD2) :- +get_lang_word("t",T), +get_lang_word("atom",Dbw_atom), + atom(Data), + (member([Dbw_atom,Data,N],VD1)->VD2=VD1; + (get_data_n(N), + append(VD1,[[Dbw_atom,Data,N]],VD2))), + append(Types1,[T,Dbw_atom,N],Types2),!. + +data_to_types2([],Types,Types,VD,VD,_) :- !. +data_to_types2(Data1,Types1,Types7,VD1,VD2,_) :- +get_lang_word("t",T), +get_lang_word("brackets",Dbw_brackets), +%trace, + Data1=[[Data2|Data3]|Data4], + %not(single_item(Data2)), + data_to_types2(Data2,[],Types3,VD1,VD3,true), + data_to_types2(Data3,Types3,Types4,VD3,VD4,false), + Types5=[[T,Dbw_brackets],[Types4]], + data_to_types2(Data4,[],Types6,VD4,VD2,true), + foldr(append,[Types1,Types5,Types6],Types2), + Types7=[[T,Dbw_brackets],Types2], + !. + +data_to_types2(Data1,Types1,Types41,VD1,VD2,Start) :- +get_lang_word("t",T), +get_lang_word("brackets",Dbw_brackets), +%trace, + Data1=[Data2|Data3], + %not(single_item(Data2)), + %trace, + data_to_types20(Data2,[],Types3,VD1,VD3), + append([],[Types3],Types31), + data_to_types2(Data3,Types31 + ,Types4,VD3,VD2,false), + %trace, + (Start=true->Types42=[[T,Dbw_brackets],Types4]; + Types42=Types4), + foldr(append,[Types1,Types42],Types41), + %Types5=[[[T,Dbw_brackets],Types4]], + !. +* + +%************ + +* +find_lists(Sets0,Sets1,Sets2) :- +%Sets=[Sets0], +%trace, + findall(B,(member(S,Sets0),S=B%[[[t, brackets],B]] + ),Sets), + find_lists1(Sets,Sets1,Sets2). 
+/ + + +find_lists(Sets,Sets1,Sets1) :- maplist(is_empty_list,Sets),!. +find_lists(Sets,Sets1,Sets2) :- +%Sets=[Sets0], +%trace, + %findall(B,(member(S,Sets0),S=[[t, brackets],B]),Sets), + findall(Heads3,member([Heads3|_],Sets),Heads), + %trace, + findall(Bodies3,member([_|Bodies3],Sets),Bodies), + find_lists2(Heads,[],Sets3), + append(Sets1,[Sets3],Sets4), + find_lists(Bodies,Sets4,Sets2). + +is_empty_list([]). + +find_lists2([],Sets,Sets) :- !. +find_lists2(Heads,Sets1,Sets2) :- +get_lang_word("t",Dbw_t), +get_lang_word("brackets",Dbw_brackets), +get_lang_word("list",Dbw_list), + Heads=[Head|Heads2], + %trace, + get_type1(Head,Type), + + (Type=brackets-> + (%findall(Heads3,Heads=[[[Dbw_t, Dbw_brackets]|Heads3]|_],Heads4), + findall(Heads3,member([[Dbw_t, Dbw_brackets],Heads3],Heads),Heads4), + + %trace, + ((forall((%trace, + member(H,Heads4), + H=[Head1|_]%member(H1,H), + %member(Head1,H) + ),get_type1(Head1,brackets)), + %trace, + Heads4=Heads42, + + not(maplist(is_empty_list,Heads42)) + )-> + (%trace,%findall(Head1,member([Head1],Heads4),Heads41), + find_lists2(Heads42,[],Sets4), + L2=[[Dbw_t, Dbw_list],Sets4], + append(Sets1,L2,Sets31), + _List_of_lists=true%Heads42=Heads41 + + ); (%trace, + List_of_lists=false,%Heads42=Heads4), + %findall(H2,member([H2],Heads4),Heads42) + Heads4=Heads42, + %(%trace, + %Heads=[[[Dbw_t, Dbw_brackets]|Heads4]|_H5], + test_lists(Heads42,L,CFLM),%,Heads5 + %trace, + (CFLM=single%false%length(L,1) + ->L1=[[Dbw_t, Dbw_brackets],L];L1=[[Dbw_t, Dbw_list],L]), + (List_of_lists=true->L2=[[Dbw_t, Dbw_list],L1];L2=L1), + append(Sets1,L2,Sets31))), + %foldr(append,Sets31,Sets3) + Sets31=Sets3 + ); + + (Type=brackets2,%findall(Heads3,Heads=[[[Dbw_t, Dbw_brackets]|Heads3]|_],Heads4), + %findall(Heads3,member([Heads3],Heads),Heads4), + %Heads4=Heads, + %((%trace,%false, + %* + %trace, + findall(Heads3,member(Heads3,Heads),Heads4), + + %trace, + %Heads4=Heads, + %(( + + %false, + + % replace any top level brackets with lists (1) - here true if 
no top level brackets - in 2nd consequent (1) x just process them + + %find_lists3(Heads4,Sets1,Sets2) + (( + Heads4=Heads42, + + % if detects brackets + + not(maplist(is_empty_list,Heads42)) + )-> + (%trace,%findall(Head1,member([Head1],Heads4),Heads41), + %foldl( + find_lists2(Heads42,[],Sets4), + L2=[[Dbw_t, Dbw_list],Sets4], + append(Sets1,L2,Sets31), + _List_of_lists=true%Heads42=Heads41 + + ); (%trace, + List_of_lists=false,%Heads42=Heads4), + %findall(H2,member([H2],Heads4),Heads42) + Heads4=Heads42, + %(%trace, + %Heads=[[[Dbw_t, Dbw_brackets]|Heads4]|_H5], + test_lists(Heads42,L,CFLM),%,Heads5 + %trace, + (CFLM=single%false%length(L,1) + ->L1=[[Dbw_t, Dbw_brackets],L];L1=[[Dbw_t, Dbw_list],L]), + (List_of_lists=true->L2=[[Dbw_t, Dbw_list],L1];L2=L1), + append(Sets1,L2,Sets31))), + %foldr(append,Sets31,Sets3) + Sets31=Sets3 + )->true + ; + (forall(member(Head1,Heads2),get_type1(Head1,Type)), + findall(N,(member([_,_,N],Heads)),Ns), + Ns=[N1|_], + append(Sets1,[Dbw_t,Type,N1],Sets3))), + %append(Sets1,[[[Dbw_t, Dbw_list],L]],Sets3))))), + Sets3=Sets2. + %find_lists2(Heads2,Sets3,Sets2). + +get_type1(Head1,Type) :- + Head1=[_,Head|_], + (Head=number->Type=number; + (Head=string->Type=string; + (Head=atom->Type=atom; + (Head1=[[_,brackets]|_]->Type=brackets; + Type=brackets2 + )))). + +test_lists(Heads1,L,CFLM%,Heads3 +) :- +%trace, + findall(Common_factors1,(member(Head,Heads1),length(Head,Length), + common_factors(Length,Common_factors1)),Common_factors), + %foldr(append,Common_factors,Common_factors3), + %sort(Common_factors3,Common_factors4), + length(Common_factors,CFL), + (CFL>1->CFLM=multiple;(%trace, + CFLM=single)), + Common_factors=[CF1|CF2], + (%CF2=[]->Common_factors4=CF1; + (%trace, + foldr(intersection,CF2,CF1,Common_factors4))), + reverse(Common_factors4,Common_factors5), + %trace, + test_lists2(Common_factors5,Heads1,L%,Heads3 + ). 
+ +common_factors(Length,Common_factors1) :- + Length2 is ceiling(Length), + numbers(Length2,1,[],Ns), + findall(N,(member(N,Ns),0 is mod(Length2,N)),Common_factors1). + +test_lists2(Common_factors1,Heads1,L%,Heads2 +) :- + member(Common_factor,Common_factors1),%,|Common_factors2], + test_lists3(Common_factor,Heads1,L),%,Heads3 + %)-> + + !. + +test_lists3(Common_factor,Heads1,L%,Heads3 +) :- + length(L,Common_factor), + Heads1=[Head|_], + append(L,_Head2,Head), + %trace, + %Heads1=Heads11,% + if_brackets_tl(Heads1,[],Heads11), + + forall(member(Head3,Heads11),types_in(Common_factor,L,Head3)). + %test_lists4(Common_factor,L,Heads1,Heads3). + +%test_lists4(Common_factor,L,Heads1,Heads3) :- + +types_in(_Common_factor,_L,[]) :- !. +types_in(Common_factor,L,Head3) :- + length(M,Common_factor), + append(M,N,Head3), + types_in2(L,M), + types_in(Common_factor,L,N). + +% find L at end above +% L should be [_,_,_] not [_] + +% insert list[] + +types_in2([],[]) :- !. +types_in2(L,M) :- + L=[L1|L2], + M=[M1|M2], + %trace, + L1=[_,Type,_N1], + M1=[_,Type,_N2], + types_in2(L2,M2). + +% replace brackets with lists +% if bottom level (non brackets) test +% perhaps replace upper parts of alg with this + +% x test and return repeating lists, not +% +find_lists3(Heads1,Heads,Heads) :- + maplist(is_empty_list,Heads1),!. 
+find_lists3(Heads4,Heads51,Heads52) :- +%trace, + get_lang_word("t",Dbw_t), + get_lang_word("brackets",Dbw_brackets), + get_lang_word("list",Dbw_list), + findall(Head,member([Head|_],Heads4),Heads7), + findall(Tail,member([_|Tail],Heads4),Tails7), + Heads7=[Heads8|Heads9], + %Heads4=[Heads5|Heads6], + get_type1(Heads8,Type), + forall(member(Heads10,Heads9), + get_type1(Heads10,Type)), + + ((Type=brackets-> + find_lists3(Heads7,[],Heads11), + %T=[[n,]] + (length(Heads7,1)-> + T=[[[Dbw_t,Dbw_brackets],Heads11]];T=Heads11), + find_lists3(Tails7,[],Tails11), + foldr(append,[Heads51,T,Tails11],Heads52))->true; + + ((Type=brackets2-> + find_lists3(Heads7,[],Heads11), + %T=[[n,]] + (length(Heads7,1)-> + T=[[[Dbw_t,Dbw_brackets],Heads11]];T=Heads11), + find_lists3(Tails7,[],Tails11), + foldr(append,[Heads51,T,Tails11],Heads52))->true; + + (%test_lists + find_lists2(Heads7,L,CFLM), + find_lists3(Tails7,[],Tails11), + + (CFLM=single%false%length(L,1) + ->L1=[[Dbw_t, Dbw_brackets],L];L1=[[Dbw_t, Dbw_list],L]), + %(List_of_lists=true->L2=[[Dbw_t, Dbw_list],L1];L2=L1), + %append(Sets1,L2,Sets31))), + foldr(append,[Heads51,L1,Tails11],Heads52)))). + + +if_brackets_tl(Heads11,H1,Heads1) :- + if_brackets_tl1(Heads11,H1,%Heads1% + [Heads1] + ). + +if_brackets_tl1([],H,H) :- !. +if_brackets_tl1(Heads11,H1,Heads1) :- +%trace, + %findall(Head,member([Head|_],Heads4),Heads7), + %findall(Tail,member([_|Tail],Heads4),Tails7), + + Heads11=[Heads2|Heads3], + (get_type1(Heads2,Type)->true;Type=multiple), + + ((Type=brackets, + Heads2=[[[t,brackets],Heads5]], + test_lists(Heads5,Heads41,_), + Heads4=[Heads41] + %foldr(append,Heads42,Heads43), + %[Heads43]=Heads4 + )->true; + + ((%trace, + Type=brackets2, + %Heads2=[Heads5], % [_|_]? + %trace, + test_lists_a(Heads2,[],Heads4))->true; + + (Type=multiple, + [Heads2]=Heads4))), + + append(H1,Heads4,H2), + if_brackets_tl1(Heads3,H2,Heads1a), + Heads1=[Heads1a]. + +test_lists_a([],B,B) :- !. 
+test_lists_a(A,B,C) :-
+ A=[D|E],
+ test_lists([[D]],F,_),
+ append(B,[F],G),
+ test_lists_a(E,G,C),!.
+
+ */ diff --git a/Philosophy/program.c b/Philosophy/program.c new file mode 100644 index 0000000000000000000000000000000000000000..93c60bcaa6e68476482ef6503fe60383813a04f1 --- /dev/null +++ b/Philosophy/program.c @@ -0,0 +1 @@ +#include <stdio.h>\n#include <stdlib.h>\n\nint main(void)\n{\n char str1[20];\n\n scanf(\"%19s\", str1);\n \n printf(\"%s\\n\", str1);\n return 0;\n} \ No newline at end of file diff --git a/Philosophy/random_phrase.pl b/Philosophy/random_phrase.pl new file mode 100644 index 0000000000000000000000000000000000000000..1f90d169d9327abcc71d5defccd6fcb899baa1bd --- /dev/null +++ b/Philosophy/random_phrase.pl @@ -0,0 +1,34 @@ +random_phrase(Noun1,Verb,Noun2) :-
+ choose(noun(Noun1)),choose(verb(Verb)),choose(noun(Noun2)).
+random_phrase(Noun1,Verb,Noun2,Ablative,Noun3) :-
+ choose(noun(Noun1)),choose(verb(Verb)),choose(noun(Noun2)),choose(ablative(Ablative)),choose(noun(Noun3)).
+
+choose(C) :-
+ findall(C,C,D),random_member(C,D).
+ %C.
+/*
+ C..=[A,B],
+ C
+ functor(C,A,1),D=C,arg(1,D,_),
+ retractall(D),
+ arg(1,C,B),
+ %((A=curr_cp,B=24)->writeln(set(curr_cp,24));true),
+ assertz(C),! .
+ */
+noun(peter).
+noun(sue).
+noun(a_book).
+noun(a_song).
+verb(reading).
+verb(walking).
+verb(composing).
+verb(walked_with).
+verb(talked_with).
+ablative(by).
+ablative(with).
+ablative(from).
+ablative(in).
+ablative(on).
+ablative(out_of).
+
+ diff --git a/Philosophy/replace_nothing_with_v.pl b/Philosophy/replace_nothing_with_v.pl new file mode 100644 index 0000000000000000000000000000000000000000..ef1aaf798668c28dad3564095d903824b7e06637 --- /dev/null +++ b/Philosophy/replace_nothing_with_v.pl @@ -0,0 +1,20 @@ +replace_nothing_with_v([],Vars,Vars%,_
+) :- !.
+replace_nothing_with_v(Term,Vars1,Vars2%,Start +) :- + not(atom(Term)), + Term=[Term1|Term2], + replace_nothing_with_v(Term1,[],Vars3%,true + ), + append(Vars1,[Vars3],Vars4), + replace_nothing_with_v(Term2,Vars4,Vars2%,false + ),!. +replace_nothing_with_v(Term,Vars1,Vars2%,Start +) :- + atom(Term), + %(Start=true-> + %append(Vars1,V,Vars2); + append(Vars1,[v,Term],Vars2),!. +%replace_nothing_with_v(Term,Vars,Vars) :- +% not(Term=[v,_]),!. + diff --git a/Philosophy/sentence_count.pl b/Philosophy/sentence_count.pl new file mode 100644 index 0000000000000000000000000000000000000000..da54272ca902e2f033fac1e312e7342458ae1137 --- /dev/null +++ b/Philosophy/sentence_count.pl @@ -0,0 +1,37 @@ +% needs t2b repository + +:-include('../listprologinterpreter/listprolog.pl'). +:-include('../Text-to-Breasonings/text_to_breasonings.pl'). + +sentence_count(Count) :- + +K=["../../../Google Drive/HTML/files/Philosophy/Books/CREATE AND HELP PEDAGOGUES.txt", +"../../../Google Drive/HTML/files/Philosophy/Books/FUNDAMENTALS OF PEDAGOGY.txt", +"../../../Google Drive/HTML/files/Philosophy/Books/PEDAGOGY GUIDE.txt", +"../../../Google Drive/HTML/files/Philosophy/Books/PEDAGOGY INDICATORS.txt"], + +% A = 5960. + +% sales = 150 + +findall(J5,(%member(K1,K), %directory_files(K1,F), + %delete_invisibles_etc(F,G), + +%findall([File_term,"\n"],( +member(H,K),%string_concat(K1,H,H1), +open_string_file_s(H,File_term),flatten(File_term,J1), +foldr(string_concat,J1,"",J2), + +split_string(J2,"\n\r.","\n\r.",J3), +delete(J3,"",J4), +length(J4,J5) + +),J6), +sum(J6,0,Count). + + +%N=1,M=u,texttobr2(N,u,J2,M,false,false,false,false,false,false), + + +%N=1,M=u,texttobr(N,u,J2,M). 
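sentence_count/1 above concatenates each file's text, splits it on full stops and line breaks, and counts the non-empty pieces. The counting step on its own reads as follows; a minimal sketch using SWI-Prolog's built-in `split_string/4` and `delete/3` (`count_sentences` is an illustrative name; the file list and t2b helpers in the real predicate come from the surrounding repository):

```prolog
% Count sentence-like segments by splitting on "." and line breaks,
% then discarding empty fragments, as sentence_count/1 does per file.
count_sentences(Text, Count) :-
    split_string(Text, "\n\r.", "\n\r.", Pieces0),
    delete(Pieces0, "", Pieces),
    length(Pieces, Count).

% ?- count_sentences("One. Two.\nThree.", N).
% N = 3.
```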
+ diff --git a/Philosophy/simplify_types_with_n.pl b/Philosophy/simplify_types_with_n.pl new file mode 100644 index 0000000000000000000000000000000000000000..76971abcbd3c0bc488b6839fd406d702cc3530a6 --- /dev/null +++ b/Philosophy/simplify_types_with_n.pl @@ -0,0 +1,114 @@ +%% simplify_types_with_n.pl + +/** + +?- simplify_types_with_n([[[t, brackets], [[t, number]]], [t, string], [t, number]],[],T). +T = [[[t, number]], [t, string], [t, number]]. + +**/ + + +simplify_types_with_n([],Types,Types) :- !. +simplify_types_with_n(Data,Types1,Types2) :- +get_lang_word("t",T), +get_lang_word("number",Dbw_number), +Data=[T,Dbw_number,_N,D], +% number(Data), +append(Types1,[D],Types2),!. +simplify_types_with_n(Data,Types1,Types2) :- +get_lang_word("t",T), +get_lang_word("string",Dbw_string), +Data=[T,Dbw_string,_N,D], + %string(Data), + append(Types1,[D],Types2),!. +simplify_types_with_n(Data,Types1,Types2) :- +get_lang_word("t",T), +get_lang_word("atom",Dbw_atom), +Data=[T,Dbw_atom,_N,D], + %string(Data), + append(Types1,[D],Types2),!. +simplify_types_with_n(Data1,Types1,Types2) :- +get_lang_word("t",T), +get_lang_word("brackets",Dbw_brackets), + Data1=[[[T,Dbw_brackets],Types4]|Types6], + simplify_types_with_n(Types4,[],Data2), + Types5=[Data2], + append_list3([Types1,Types5],Types2a), + +simplify_types_with_n(Types6,Types2a,Types2),!. + + +simplify_types_with_n(Data1,Types1,Types2) :- +get_lang_word("t",T), +get_lang_word("list",Dbw_list), +%trace, + Data1=[[[T,Dbw_list],Types4]|Types6], +%trace, + simplify_types_with_n(Types4,[],Types21),%[Data2|Data2a]), + Types22=[Types21], +/* +(Data2=[Data2a]->Types5=[{Data2a}]; +(Data2=[Data2a|Data2b]-> +(square_to_round([Data2a|Data2b],Types52), +round_to_curly(Types52,Types51), +Types5=[Types51]);false)), +*/ + append_list3([Types1,Types22],Types2a), + simplify_types_with_n(Types6,Types2a,Types2),!. 
+ + /* +simplify_types_with_n(Data1,Types1,Types2) :- +get_lang_word("t",T), +get_lang_word("brackets",Dbw_brackets), + Data1=[[T,Dbw_brackets]|Types4], + simplify_types_with_n(Types4,[],Data2), + Types5=[[Data2],Data4], + simplify_types_with_n(Data4,[],Types6), + append_list3([Types1,Types5,Types6],Types2),!. +*/ + +/* +simplify_types_with_n(Data1,Types1,Types2) :- +get_lang_word("t",T), +get_lang_word("brackets",Dbw_brackets), + Data1=[[Data2|Data3]|Data4], + simplify_types_with_n(Data2,[],Types3), + simplify_types_with_n(Data3,Types3,Types4), + Types5=[[[T,Dbw_brackets],Types4]], + simplify_types_with_n(Data4,[],Types6), + append_list3([Types1,Types5,Types6],Types2),!. + +*/ + + +simplify_types_with_n(Data,Types1,Types2) :- +get_lang_word("t",T), +%get_lang_word("string",Dbw_string), +Data=[T,A], + %string(Data), + append(Types1,[[T,A]],Types2),!. + +simplify_types_with_n(Data,Types1,Types2) :- +get_lang_word("t",T), +%get_lang_word("string",Dbw_string), +Data=[T,_A,_N,D], + %string(Data), + append(Types1,[D],Types2),!. + + +simplify_types_with_n(Data1,Types1,Types2) :- + Data1=[Data2|Data3], + simplify_types_with_n(Data2,Types1,Types3), + simplify_types_with_n(Data3,Types3,Types2),!. + %%Types2=[Types4]. + +/* +append_list3(A1,B):- + append_list3(A1,[],B),!. + +append_list3([],A,A):-!. +append_list3(List,A,B) :- + List=[Item|Items], + append(A,Item,C), + append_list3(Items,C,B). +*/ \ No newline at end of file diff --git a/Philosophy/simulation.pl b/Philosophy/simulation.pl new file mode 100644 index 0000000000000000000000000000000000000000..91a175aee420b32ea28b27eb8d291e5e9a93f005 --- /dev/null +++ b/Philosophy/simulation.pl @@ -0,0 +1,136 @@ +% ["Computational English","COMPUTATIONAL ENGLISH by Lucian Green The Science of Crossing Over 3 of 4.txt",0,algorithms,"27. I prepared to determine that the crossing over of characters was induction. I did this by determining the crossing over of characters by induction of physical simulation. 
First, I observed the first character facing east in a business suit. Second, I observed the second character touching the first character's hand, facing west in a business suit. Third, I induced that the two characters were making a business agreement by shaking hands. In this way, I prepared to determine that the crossing over of characters was induction by determining the crossing over of characters by induction of physical simulation."] + +% * I prepared to determine that the crossing over of characters was induction. + +% - Logs meetings and tests whether two people have met through others. + +% simulation(20,20,5,20,Simulation,Meetings,1,2). + +:-include('../listprologinterpreter/la_maths.pl'). + +simulation(X,Y,N_people,N_frames,Simulation,Meetings,Person1,Person2) :- + Spaces is X*Y,(N_people>Spaces->(writeln(["Error: N_people",N_people,">Spaces",X*Y]),abort);true), +% size, number of people, number of frames +% random starting positions + random_starting_positions(X,Y,N_people,Frame1), +% random movements + random_movements(X,Y,Frame1,1,N_frames,[Frame1],Simulation), +% log_meetings + log_meetings(Simulation,Simulation,[],Meetings), +% met_a_chain_member + met_a_chain_member(Meetings,Person1,Person1,Person2). + +random_starting_positions(X,Y,N_people,Frame1) :- + random_starting_positions1(X,Y,1,N_people,[],Frame1). + +random_starting_positions1(_X,_Y,N_people,N_people1,Frame1,Frame1) :- + N_people1 is N_people-1,!. +random_starting_positions1(X,Y,N_people1,N_people2,Frame1,Frame2) :- + random_starting_positions2(X,Y,X1,Y1,Frame1),%,Frame3), + append(Frame1,[[N_people1,[X1,Y1]]],Frame4), + N_people3 is N_people1+1, + random_starting_positions1(X,Y,N_people3,N_people2,Frame4,Frame2). + +random_starting_positions2(X,Y,X1,Y1,Frame1) :-%,Frame2) :- + random_starting_positions3(X,Y,X2,Y2), + (member([_,[X2,Y2]],Frame1)-> + fail%random_starting_positions2(X,Y,X1,Y1,Frame1) + ;%,Frame2); + (X1=X2,Y1=Y2)).%Frame2=[X2,Y2]). 
+ +random_starting_positions3(X,Y,X1,Y1) :- + %random(X2),X1 is ceiling(X*X2), + %random(Y2),Y1 is ceiling(Y*Y2). +%repeat, +numbers(X,1,[],X2), +numbers(Y,1,[],Y2), +member(X1,X2), +member(Y1,Y2). +% finds list of new positions, retries if deadlocked + +% need to move from old pos, check in curr frame + +random_movements(_X_bounds,_Y_bounds,_Frame1,N_frames,N_frames,Simulation,Simulation) :- !. +random_movements(X_bounds,Y_bounds,Frame1,N_frames1,N_frames2,Simulation1,Simulation2) :- + random_movements1(X_bounds,Y_bounds,Frame1,Frame1,[],Frame2), + append(Simulation1,[Frame2],Simulation3), + N_frames3 is N_frames1+1, + random_movements(X_bounds,Y_bounds,Frame2,N_frames3,N_frames2,Simulation3,Simulation2). + +% if it fails, retry () * + +random_movements1(_,_,_,[],Frame,Frame) :- !. +random_movements1(X_bounds,Y_bounds,_,Frame1,Frame11,Frame2) :- + Frame1=[[N,[X,Y]]|Rest], + random_starting_positions3(X_bounds,Y_bounds,X,Y,X1,Y1,Frame11), + append(Frame11,[[N,[X1,Y1]]],Frame3), + random_movements1(X_bounds,Y_bounds,_,Rest,Frame3,Frame2). + +random_starting_positions3(X_bounds,Y_bounds,X,Y,X1,Y1,Frame1) :-%,Frame2) :- + random_starting_positions31(X_bounds,Y_bounds,X,Y,X2,Y2), + (member([_,[X2,Y2]],Frame1)-> + fail%random_starting_positions3(X_bounds,Y_bounds,X,Y,X1,Y1,Frame1) + ;%,Frame2); + (X1=X2,Y1=Y2)).%Frame2=[X2,Y2]). + +random_starting_positions31(X_bounds,Y_bounds,X,Y,X2,Y2) :- + Choices=[stay,left,right,up,down], + (X=1->delete(Choices,left,Choices2);Choices=Choices2), + (Y=1->delete(Choices2,up,Choices3);Choices2=Choices3), + (X=X_bounds->delete(Choices3,right,Choices4);Choices3=Choices4), + (Y=Y_bounds->delete(Choices4,down,Choices5);Choices4=Choices5), + %repeat, + member(Direction,Choices5), + random_starting_positions32(Direction,X,Y,X2,Y2). + +random_starting_positions32(stay,X,Y,X,Y) :- !. +random_starting_positions32(left,X,Y,X1,Y) :- + X1 is X-1,!. +random_starting_positions32(right,X,Y,X1,Y) :- + X1 is X+1,!. 
+random_starting_positions32(up,X,Y,X,Y1) :- + Y1 is Y-1,!. +random_starting_positions32(down,X,Y,X,Y1) :- + Y1 is Y+1,!. + +log_meetings(_,[],Meetings,Meetings) :- !. +log_meetings(_,Simulation,Meetings1,Meetings2) :- + Simulation=[Simulation1|Simulation2], + log_meetings1(Simulation1,Simulation1,[],Meetings3), + append(Meetings1,[Meetings3],Meetings4), + log_meetings(_,Simulation2,Meetings4,Meetings2). + +log_meetings1(_,[],Meetings,Meetings) :- !. +log_meetings1(Simulation1,Simulation2,Meetings1,Meetings2) :- + Simulation2=[[N,[X,Y]]|Rest], + meetings(N,X,Y,Simulation1,[],Meetings3), + append(Meetings1,Meetings3,Meetings4), + log_meetings1(Simulation1,Rest,Meetings4,Meetings2). + +meetings(N1,X,Y,Simulation1,Meetings1,Meetings2) :- + X1 is X-1, + X2 is X+1, + Y1 is Y-1, + Y2 is Y+1, + meeting(N1,X1,Y,Simulation1,Meetings1,Meetings3), + meeting(N1,X2,Y,Simulation1,Meetings3,Meetings4), + meeting(N1,X,Y1,Simulation1,Meetings4,Meetings5), + meeting(N1,X,Y2,Simulation1,Meetings5,Meetings2). + +meeting(N1,X,Y,Simulation1,Meetings1,Meetings2) :- + (member([N,[X,Y]],Simulation1)-> + append(Meetings1,[[N1,N]],Meetings2); + Meetings1=Meetings2). + +% tests whether person 1 has a link to person 2 over time +met_a_chain_member(_Meetings,Person0,Person,Person) :- + writeln(["Link between",Person0,"and",Person]),!. +met_a_chain_member([],Person0,Person1,Person2) :- + (not(Person1=Person2)->(writeln(["No link between",Person0,"and",Person2]),!);true). +met_a_chain_member(Meetings,Person0,Person1,Person2) :- + Meetings=[Meetings1|Meetings2], + ((member([Person1,_Person3],Meetings1)->true; + member([_Person3,Person1],Meetings1))-> + met_a_chain_member(_Meetings,Person0,Person2,Person2); + met_a_chain_member(Meetings2,Person0,Person1,Person2) + ). 
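For cross-checking, the met_a_chain_member test above can be read as a reachability question: are Person1 and Person2 connected in the undirected graph whose edges are all logged meeting pairs? A minimal Python sketch of that reading (an illustrative re-implementation, not part of this repository; the function name and the frames-of-pairs input format mirroring the Meetings term are assumptions):

```python
from collections import defaultdict, deque

def met_a_chain_member(meetings, person1, person2):
    """Breadth-first search: is person1 linked to person2 through meetings?

    `meetings` is a list of frames, each a list of [a, b] meeting pairs,
    mirroring the Meetings term produced by log_meetings/4.
    """
    # Build an undirected adjacency map from every logged meeting pair.
    adjacency = defaultdict(set)
    for frame in meetings:
        for a, b in frame:
            adjacency[a].add(b)
            adjacency[b].add(a)
    # Standard BFS outward from person1.
    seen = {person1}
    queue = deque([person1])
    while queue:
        person = queue.popleft()
        if person == person2:
            return True
        for other in adjacency[person] - seen:
            seen.add(other)
            queue.append(other)
    return False
```

Unlike the Prolog clause, which scans the meeting frames in order, this sketch considers all frames at once, so it answers whether any chain of meetings links the two people.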
diff --git a/Philosophy/string_to_image.pl b/Philosophy/string_to_image.pl new file mode 100644 index 0000000000000000000000000000000000000000..44650ee8bac7605ac3960740298bea98f077ff1a --- /dev/null +++ b/Philosophy/string_to_image.pl @@ -0,0 +1,212 @@ +:-include('../LuciansHandBitMap-Font/characterbr.pl'). +:-include('../listprologinterpreter/la_maths.pl'). +:-include('../listprologinterpreter/la_strings.pl'). +:-include('../listprologinterpreter/la_strings_string.pl'). +%:-include('strings_to_grid.pl'). +:-include('scaler.pl'). + +% string_to_pbm("ab\nac\nad",5,5,"a.pbm"). +% string_to_pbm("a",5,5,"a.pbm"). + +string_to_image(["text","x_bounds",XB1,"y_bounds",YB1,"x",X1,"y",Y1,"size",S1,"colour",C1,T1],_X,_Y,File_contents1,File_contents2) :- + %X=40,Y=24, + %X=5,Y=5, + %background(X,Y,File_contents1), + scaler(S1,Cs), + retractall(characterbr2(_)), + assertz(characterbr2(Cs)), + %trace, + string_to_image1(XB1,YB1,1,1,X1,Y1,S1,C1,T1,File_contents1,File_contents2), + %writeln([file_contents2,File_contents2]), + + render_pbm_file(File_contents2,XB1,YB1,File_contents3), + random(X3), + concat_list(["paint_x",X3,".ppm"],X4), + write_pbm(File_contents3,XB1,YB1,X4), + !. + +/* +background(X1,Y1,File_contents1) :- + X is X1*6, + Y is Y1*10, + numbers(X,1,[],XN), + numbers(Y,1,[],YN), + findall([X2,Y2,0],(member(X2,XN), + member(Y2,YN)),File_contents1). 
+*/ + +ctobr_1(S1,C1_1,Grid2) :- + %writeln(C1), + characterbr2(Cs), + string_atom(C1_1,C1), + member([C1,_C1Name,C2],Cs), + %writeln(C1Name),writeln(''), + + grid(S1,Grid1), + member([X1,Y1,M1],C2), + N2=1, + Stroke1=0, + %% States:[[this,state],Line:[[any,state,to,this,state],[true,or,false]],State:[[this,state,to,a,state],states]] + States= [ + [[1,s], false, [[1,s],[1,-]]], + [[1,-], false, [[3,s],[2,-]]], + %%[[2,s], false, [[2,s],[3,-]]], + [[2,-], true, [[3,s],[2,-]]], + [[3,s], true, [[1,s],[1,-]]] + %%[[3,-], false, [[3,s],[2,-]]] + ], + +M1=[N2|_Ms], +(changegrid2(_Prevstate,[1,s],Grid1,Grid2,X1,Y1,C2,_C4,N2,Stroke1,States); +changegrid2(_Prevstate,[1,-],Grid1,Grid2,X1,Y1,C2,_C4,N2,Stroke1,States)), + + %y(Y), + %prettyprint1(Grid2,Y),writeln(''), + %prettyprint1A(Grid2,Y), + !. %% 2 + +%%string_to_image1(_,_,40,24,_,File_contents,File_contents) :- !. +%string_to_image1(_,_,_,_,"",File_contents,File_contents) :- !. +string_to_image1(_X,_Y,_,_,_X1,_Y1,"",_C1,_T1,File_contents,File_contents) :- !. +string_to_image1(X_lim,Y_lim,_X,Y,XA1,YA1,S1,C1,String,File_contents1,File_contents2) :- + string_concat(Char,String2,String), + string_length(Char,1), + (Char="\n"->true;Char="\r"), + + X1 is 1,Y1 is Y+1, + string_to_image1(X_lim,Y_lim,X1,Y1,XA1,YA1,S1,C1,String2,File_contents1,File_contents2). 
+string_to_image1(X_lim,Y_lim,X,Y,XA1,YA1,S1,C1,String,File_contents1,File_contents2) :- + string_concat(Char1,String2,String), + string_length(Char1,1), + + characterbr2(Cs), + string_atom(Char1,Char), + %trace, + (member([Char,_,_],Cs)->Char2=Char; + Char2=' '), + + %trace, + + (Char2=' '->(File_contents1=File_contents3); + (ctobr_1(S1,Char2,Grid2), + %writeln1(Grid2), + + convert_to_pbm(Grid2,5,9,XA1,YA1,S1,C1,Grid3), + %trace, + %writeln1([c,Grid3]), + +%(Char2="b"->print_grid_1(Grid3,13,10);true), + + render_to_screen(File_contents1,Grid3,5,9,X,Y,X_lim,Y_lim,XA1,YA1,S1,C1,File_contents3))), + ((X=X_lim,Y=Y_lim)->File_contents3=File_contents2; + ((X is X_lim->(X1 is 1,Y1 is Y+1); + (X1 is X+1,Y1 is Y)), + string_to_image1(X_lim,Y_lim,X1,Y1,XA1,YA1,S1,C1,String2,File_contents3,File_contents2))). + +convert_to_pbm(Grid1,X,Y,_XA1,_YA1,_S1,C1,Grid2) :- + numbers(Y,0,[],YN1), + reverse(YN1,YN), + numbers(X,0,[],XN), +%trace, + findall([X1,Y1|Pixel],(member(Y1,YN), + member(X1,XN), + member([X1,Y1|Pixel1],Grid1), + ((Pixel1=[[]]->true;Pixel1=[255,255,255])->Pixel=[255,255,255];(%trace, + colour(C1, [R, G, B]),Pixel=[R, G, B])) + %write(Pixel),write("\t") + ),Grid2). + %,writeln1([grid2,Grid2]). + +render_to_screen(File_contents1,Grid3,X2,Y2,X,Y,_X_lim,Y_lim,XA1,YA1,S1,_C1,File_contents2) :- +%trace, +%writeln1([b,Grid3]), + numbers(Y2,1,[],YN1), + reverse(YN1,YN), + numbers(X2,1,[],XN), + +%trace, + findall([X3,Y3,R,G,B],(member(Y1,YN), + member(X1,XN), + member([X1,Y1,R,G,B],Grid3), + X3 is %X1,% + S1*(6*(X-1)+X1)+XA1, + Y3 is %Y1% + S1*(10*(Y_lim-Y)+Y1)+YA1 + %(Pixel1=[]->Pixel=0;Pixel=1) + %write(Pixel),write("\t") + ),Grid2), + + %trace, + %X4 is 2*6,Y4 is 1*10, + %*X4 is X_lim*6,Y4 is Y_lim*10, + + %print_grid_1(Grid2,5,9), + %*print_grid_1(Grid2,X4,Y4), + + + replace_1(File_contents1,Grid2,File_contents2). + +replace_1(File_contents,[],File_contents) :- !. 
+replace_1(File_contents1,Grid2,File_contents2) :- + Grid2=[[X,Y,R,G,B]|Grid4], + delete(File_contents1,[X,Y,_,_,_],File_contents3), + append(File_contents3,[[X,Y,R,G,B]],File_contents4), + replace_1(File_contents4,Grid4,File_contents2). + +render_pbm_file(Grid1,X2,Y2,Grid2) :- + X is X2*6, + Y is Y2*10, + numbers(Y,1,[],YN1), + reverse(YN1,YN), + numbers(X,1,[],XN), + + %trace, + + findall([A%,["\n"] + ],(member(Y1,YN), + findall([R,"\t",G,"\t",B,"\n"],(member(Y1,YN), + + %nl, + member(X1,XN), + member([X1,Y1,R,G,B],Grid1)),A2), + + maplist(append,[A2],[A]) + + ),Grid3), + + %writeln1(Grid3), + + %trace, + + maplist(append,[Grid3],[Grid2a]), + + maplist(append,[Grid2a],[Grid2b]), + + concat_list(Grid2b,Grid2). + %writeln1(Grid2). + %(Pixel1=[*]->Pixel="*";Pixel=" "), + %write(Pixel),write("\t") + %),_), + %nl. + +write_pbm(File_contents3,X2,Y2,File) :- + X is X2*6, + Y is Y2*10, + concat_list(["P3\n",X," ",Y,"\n","255","\n",File_contents3],File_contents4), + open_s(File,write,Stream), + write(Stream,File_contents4),close(Stream). + + + +print_grid_1(Grid,X,Y) :- + numbers(Y,1,[],YN1), + reverse(YN1,YN), + numbers(X,1,[],XN), + + findall(_,(member(Y1,YN), + nl, + member(X1,XN), + member([X1,Y1,Pixel],Grid), + %(Pixel1=1->Pixel="*";Pixel=" "), + write(Pixel)),_). + diff --git a/Philosophy/stwa.pl b/Philosophy/stwa.pl new file mode 100644 index 0000000000000000000000000000000000000000..602c1a88386d5a2489e672cc5e331ea61f69c971 --- /dev/null +++ b/Philosophy/stwa.pl @@ -0,0 +1 @@ +:-include('sub_term_with_address.pl'). \ No newline at end of file diff --git a/Philosophy/sub_term_with_address.pl b/Philosophy/sub_term_with_address.pl new file mode 100644 index 0000000000000000000000000000000000000000..9f32dd9c589108ef9c2054c506c73919cb853657 --- /dev/null +++ b/Philosophy/sub_term_with_address.pl @@ -0,0 +1,518 @@ +/* + +Subterm with address + +sub_term_wa(Subterm, Term, Instances). 
+Subterm - Subterm to find
+Term - Initial list to search
+Instances - List of addresses and subterms
+
+Subterm (with address) gives the "address" of a term within a term, for example [[[1], a]] for a in a and [[[1, 2], b]] for b in [a,b], where each successive index in the address descends one level of nesting. Put subterm with address puts an item at an address.
+
+e.g.s of addresses: [1]=a,[1,1]=a in [a,b], [1,1,1]=a in [[a],[b],[c]]
+
+Please don't pass sub term with address a term containing _ to search in; it will return a result containing the search term for each _.
+
+*/
+
+help(sub_term_with_address) :-
+	help(subterm_with_address).
+help(sub_term_wa) :-
+	help(subterm_with_address).
+help(stwa) :-
+	help(subterm_with_address).
+
+help(subterm_with_address) :-
+
+print_message(information,'Subterm With Address Help'),
+
+Message=
+'sub_term_wa(Subterm, Term, Instances)
+sub_term_wa([a,_], [[a,b], [a,c]], Instances).
+Instances = [[[1, 1], [a, b]], [[1, 2], [a, c]]]
+
+get_sub_term_wa(Term, Address, Item)
+get_sub_term_wa([[1, 4], 2, 3], [1, 1, 2], Item).
+Item = 4
+
+put_sub_term_wa(Item, Address, Term1, Term2)
+put_sub_term_wa(88, [1,1], [[2, 3], 4], Term2).
+Term2 = [88, 4]
+
+put_sub_term_wa_smooth(Item, Address, Term1, Term2)
+put_sub_term_wa_smooth([88,1], [1,1], [[2,3],4], Term2).
+Term2 = [88, 1, 4]
+
+delete_sub_term_wa(Instances, Term1, Term2)
+delete_sub_term_wa([[1, 1], [1, 2]], [a, b], Term2).
+Term2 = []
+
+foldr(put_sub_term_wa_ae,Instances, Term1, Term2).
+foldr(put_sub_term_wa_ae,[[[1, 1], [v, 2]], [[1, 2], [v, 3]]], [[v, 1], [v, 2]], Term2).
+Term2 = [[v, 2], [v, 3]]
+
+foldr(put_sub_term_wa_ae_smooth, Instances, Term1, Term2)
+foldr(put_sub_term_wa_ae_smooth, [[[1, 1], [v, 2]], [[1, 2], [v, 3]]], [[v, 1], [v, 2]], Term2). 
+Term2 = [v, 2, v, 3] + +sub_term_types_wa(Heuristic, Term, Instances) +sub_term_types_wa([all([number, string])], [1,[a,3]], Instances) +Instances = [[[1,2], [a,3]]] + +Possible heuristics: +var +string +atom +[] +number +compound (non-list compounds) +all(Insert_more_heuristics) (all the items are of a type, can be used to select terminals) +heuristic(Heuristic, Output_variable) (Heuristic may be for example A=a and Output_variable=A)', + +writeln(Message). + +subterm_wa(Subterm, Term, Instances) :- + sub_term_wa(Subterm, Term, Instances). +subterm_with_address(Subterm, Term, Instances) :- + sub_term_wa(Subterm, Term, Instances). +sub_term_with_address(Subterm, Term, Instances) :- + sub_term_wa(Subterm, Term, Instances). +stwa(Subterm, Term, Instances) :- + sub_term_wa(Subterm, Term, Instances). + +get_subterm_wa(Term, Address, Item) :- + get_sub_term_wa(Term, Address, Item). +get_subterm_with_address(Term, Address, Item) :- + get_sub_term_wa(Term, Address, Item). +get_sub_term_with_address(Term, Address, Item) :- + get_sub_term_wa(Term, Address, Item). +get_stwa(Term, Address, Item) :- + get_sub_term_wa(Term, Address, Item). + +put_subterm_wa(Item, Address, Term1, Term2) :- + put_sub_term_wa(Item, Address, Term1, Term2). +put_subterm_with_address(Item, Address, Term1, Term2) :- + put_sub_term_wa(Item, Address, Term1, Term2). +put_sub_term_with_address(Item, Address, Term1, Term2) :- + put_sub_term_wa(Item, Address, Term1, Term2). +put_stwa(Item, Address, Term1, Term2) :- + put_sub_term_wa(Item, Address, Term1, Term2). + +put_subterm_wa_smooth(Item, Address, Term1, Term2) :- + put_sub_term_wa_smooth(Item, Address, Term1, Term2). +put_subterm_with_address_smooth(Item, Address, Term1, Term2) :- + put_sub_term_wa_smooth(Item, Address, Term1, Term2). +put_sub_term_with_address_smooth(Item, Address, Term1, Term2) :- + put_sub_term_wa_smooth(Item, Address, Term1, Term2). 
+put_stwa_smooth(Item, Address, Term1, Term2) :- + put_sub_term_wa_smooth(Item, Address, Term1, Term2). + +delete_subterm_wa(Instances, Term1, Term2) :- + delete_sub_term_wa(Instances, Term1, Term2). +delete_subterm_with_address(Instances, Term1, Term2) :- + delete_sub_term_wa(Instances, Term1, Term2). +delete_sub_term_with_address(Instances, Term1, Term2) :- + delete_sub_term_wa(Instances, Term1, Term2). +delete_stwa(Instances, Term1, Term2) :- + delete_sub_term_wa(Instances, Term1, Term2). + +foldr(put_subterm_wa_ae,Instances, Term1, Term2) :- + foldr(put_sub_term_wa_ae,Instances, Term1, Term2). +foldr(put_subterm_with_address_ae,Instances, Term1, Term2) :- + foldr(put_sub_term_wa_ae,Instances, Term1, Term2). +foldr(put_sub_term_with_address_ae,Instances, Term1, Term2) :- + foldr(put_sub_term_wa_ae,Instances, Term1, Term2). +foldr(put_stwa_ae,Instances, Term1, Term2) :- + foldr(put_sub_term_wa_ae,Instances, Term1, Term2). + +foldr(put_subterm_wa_ae_smooth, Instances, Term1, Term2) :- + foldr(put_sub_term_wa_ae_smooth, Instances, Term1, Term2). +foldr(put_subterm_with_address_ae_smooth, Instances, Term1, Term2) :- + foldr(put_sub_term_wa_ae_smooth, Instances, Term1, Term2). +foldr(put_sub_term_with_address_ae_smooth, Instances, Term1, Term2) :- + foldr(put_sub_term_wa_ae_smooth, Instances, Term1, Term2). +foldr(put_stwa_ae_smooth, Instances, Term1, Term2) :- + foldr(put_sub_term_wa_ae_smooth, Instances, Term1, Term2). + +subterm_types_wa(Heuristic, Term, Instances) :- + sub_term_types_wa(Heuristic, Term, Instances). +subterm_types_with_address(Heuristic, Term, Instances) :- + sub_term_types_wa(Heuristic, Term, Instances). +sub_term_types_with_address(Heuristic, Term, Instances) :- + sub_term_types_wa(Heuristic, Term, Instances). +sttwa(Heuristic, Term, Instances) :- + sub_term_types_wa(Heuristic, Term, Instances). + +% Requires +%:-include('../listprologinterpreter/listprolog.pl'). 
+ +test_stwa :- + test_sub_term_wa, + test_get_sub_term_wa, + test_put_sub_term_wa, + test_put_sub_term_wa_smooth, + test_delete_sub_term_wa, + test_foldr_put_sub_term_wa_ae, + test_foldr_put_sub_term_wa_ae_smooth, + test_sub_term_types_wa,!. + + + +test_sub_term_wa :- + +findall(_,(member([N,ST,T,In], +[ +[1,[a,_],[[a,b],[a,c]], + [[[1, 1], [a, b]], [[1, 2], [a, c]]]], + +[2,[a,_],[[a,b],[[a,c]]], + [[[1, 1], [a, b]], [[1, 2, 1], [a, c]]]], + +[3,[],[[a,b],[a,c]], + []], + +[4,[a,_],[[a,b],[a,[a,e]]], + [[[1, 1], [a, b]], [[1, 2], [a, [a, e]]]]], + +[5,[],[[]], + [[[1, 1], []]]], + +[6,[],[], + [[[1], []]]], + +[7,_,_, + [[[1], _]]], + +[8,[_],[[_],_], + [[[1, 1], [_]], [[1, 2], [_]]]], + +[9,_,[[_],_], + [[[1], [[_], _]]]], + +[10,1,[[1,_],1,_], + [[[1, 1, 1], 1], [[1, 1, 2], 1], [[1, 2], 1], [[1, 3], 1]]], + +[11,[_,_],[[_,_]], + [[[1, 1], [_, _]]]], + +[12,[_,_],[_,_], + [[[1], [_, _]]]], + +[13,[_],[_,_], + [[[1, 1], [_]], [[1, 2], [_]]]], + +[14,[_],[_], + [[[1], [_]]]], + +[15,[[[_]]],[_,_], + [[[1, 1], [[[_]]]], [[1, 2], [[[_]]]]]], + +[16,[1],[[1]], + [[[1, 1], [1]]]], + +[17,[_],[[_]], + [[[1], [[_]]]]], + +[18,[_],[[_]], + + [[[1], [[_]]]]], + +[19,[_],_, + [[[1], [_]]]] + +]), + ((sub_term_wa(ST,T,In1),In1=In)->R=success;R=fail), + writeln([R,sub_term_wa,test,N])),_),!. + + +sub_term_wa(Find,A,B) :- + dynamic(stwa/1), + retractall(stwa(_)), + assertz(stwa(%[ + find%,Find] + )), + sub_term_wa1([1],_Ns2,0,[A],Find,[],B1,true), + findall([C3,C2],(member([C1,C2],B1),append([_],C3,C1),not(C3=[])),B), + %length(Find,L), + %findall(B1,(member(B1,B),length(B1,L)),B2), + !. + +%sub_term_wa2(Ns,Ns,_N,E,_Find,B,B,_) :- !. +% var(E), +sub_term_wa1(Ns1,Ns2,N,E,Find,B,B1,Flag) :- + (([E1]=E,var(E1))->B1=[[[_,1],E]]; + sub_term_wa2(Ns1,Ns2,N,E,Find,B,B1,Flag)),!. + +%sub_term_wa2(Ns,Ns,_N,[],_Find,B,B,_) :- !. +%sub_term_wa2(Ns,Ns,_N,E,_Find,B,B,_) :- +% not(var(E)),E=[],!. 
+%sub_term_wa2(Ns,Ns,_N,E,_Find,B,B1,_) :- +% ((not(var(E)),E=[])->(C0=types->fail;B1=B);(fail%,var(E),B=[],B1=_ +% )),!. + % put into stwa with types, merge +sub_term_wa2(Ns1,Ns2,N,A,Find,B,C,First) :- + stwa(%[ + C0%,Find] + ), + (((C0=find,First=true,not(Ns1=[_]))->(A=Find->append(B,[[Ns1,A]],C);fail); + ((C0=types,First=true%,not(Ns1=[_]) + )->(is_t(Find,A,First,true)->append(B,[[Ns1,A]],C)) + ))->true; + (not(find_first(is_list(A))), + sub_term_wa3(Ns1,Ns2,N,A,Find,B,C,First))),!. +sub_term_wa2(Ns,Ns,_N,[],_Find,B,B,_) :- !. +sub_term_wa2(Ns1,Ns2,N,A,Find,B,C,First) :- + find_first(is_list(A)), + sub_term_wa4(Ns1,Ns2,N,A,Find,B,C,First). + +sub_term_wa3(Ns1,Ns1,_N,A,Find,B,C,true) :- + stwa(%[ + C0%,Find] + ), + (C0=find->A=Find;(C0=types,is_t(Find,A,true,true))), + not(Ns1=[_]), + append(B,[[Ns1,A]],C). +sub_term_wa3(Ns,Ns,_N,A,Find,B,B,true) :- + stwa(%[ + C0%,Find] + ), + (C0=find->not(A=Find);(C0=types,not(is_t(Find,A,true,true)))),!. + +sub_term_wa4(Ns1,Ns1,_N,A,Find,B,C,true) :- + stwa(%[ + C0%,Find] + ), + (C0=find->A=Find;(C0=types,is_t(Find,A,true,true))), + not(Ns1=[_]), + append(B,[[Ns1,A]],C). +sub_term_wa4(Ns1,Ns2,N,A,Find,B,C,First) :- + copy_term(Find,Find1), + N1 is N+1, + (First=true->N11=1;N11=N1), + append(Ns1,[N11],Ns5), + %trace, + A=[D|E],sub_term_wa2(Ns5,_Ns3,N11,D,Find1,B,F,true), + %writeln(N11), + %writeln(First),trace, + sub_term_wa2(Ns1,Ns2,N11,E,Find,F,C,false). + + +test_get_sub_term_wa :- + +findall(_,(member([N,T1,Add,It1], +[[1, +[1,2,3],[1,1], + 1], + +[2,[1,2,3],[1], + [1, 2, 3]], + +[3,[[1,4],2,3],[1,1,1], + 1], + +[4,[[1,4],2,3],[1,1,2], + 4] +]), + ((get_sub_term_wa(T1,Add,It2),It1=It2)->R=success;R=fail), + writeln([R,get_sub_term_wa,test,N])),_),!. + + +get_sub_term_wa(List,[1],List) :- !. +get_sub_term_wa(List,[_|Ns],L1) :- + get_sub_term_wa1(List,Ns,L1),!. + +get_sub_term_wa1(List,Ns,L1) :- + Ns=[N], + get_item_n(List,N,L1),!. 
+get_sub_term_wa1(List,Ns,L1) :- + Ns=[N|Ns2], + get_item_n(List,N,L3), + get_sub_term_wa1(L3,Ns2,L1). + + +test_put_sub_term_wa :- + +findall(_,(member([N,It,Add,T1,T2], +[ +[1,9,[1, 2, 1, 1, 2, 3],[[1,2],[[[4,[5,7,8],6]]]], + [[1, 2], [[[4, [5, 7, 9], 6]]]]], + +[2,88,[1,1],[[2,3],4], + [88, 4]] + +]), + ((put_sub_term_wa(It,Add,T1,T21),T21=T2)->R=success;R=fail), + writeln([R,put_sub_term_wa,test,N])),_),!. + +test_put_sub_term_wa_smooth :- + +findall(_,(member([N,It,Add,T1,T2], +[ +[1,[9],[1, 2, 1, 1, 2, 3],[[1,2],[[[4,[5,7,8],6]]]], + [[1, 2], [[[4, [5, 7, 9], 6]]]]], + +[2,[88,1],[1,1],[[2,3],4], + [88, 1, 4]] + +]), + ((put_sub_term_wa_smooth(It,Add,T1,T21),T21=T2)->R=success;R=fail), + writeln([R,put_sub_term_wa_smooth,test,N])),_),!. + +put_sub_term_wa_smooth(A,B,C,D) :- + dynamic(stwa_smooth/1), + retractall(stwa_smooth(_)), + assertz(stwa_smooth(on)), + put_sub_term_wa(A,B,C,D), + retractall(stwa_smooth(_)), + assertz(stwa_smooth(off)), + !. + +put_sub_term_wa(A,B,C,D) :- + dynamic(stwa_smooth/1), + (stwa_smooth(_)->true; + retractall(stwa_smooth(_)), + assertz(stwa_smooth(off))), + put_sub_term_wa2(A,B,C,D),!. + +put_sub_term_wa2(List,[1],_L1,List) :- !. +put_sub_term_wa2(List,[_|Ns],L1,L2) :- + put_sub_term_wa1(List,Ns,L1,L2),!. + +put_sub_term_wa1(List,Ns,L1,L2) :- + Ns=[N], + (stwa_smooth(on)-> + (N1 is N-1, + length(L,N1), + length(L1,N3), + N2 is N3-N, + length(L3,N2), + append(L,L4,L1), + append(_,L3,L4), + foldr(append,[L,List,L3],L2)); + put_item_n(L1,N,List,L2)),!. +put_sub_term_wa1(List,Ns,L1,L2) :- + Ns=[N|Ns2], + get_item_n(L1,N,L3), + put_sub_term_wa1(List,Ns2,L3,L4), + put_item_n(L1,N,L4,L2). + + +test_delete_sub_term_wa :- + +findall(_,(member([N,In,T,T2], +[ +[1,[[1, 1], [1, 2]], [a, b], + []], + +[2,[[1]],1, + []] + +]), + ((delete_sub_term_wa(In,T,T21),T21=T2)->R=success;R=fail), + writeln([R,delete_sub_term_wa,test,N])),_),!. 
+ + +delete_sub_term_wa(NNs,L1,L2) :- + foldr(put_sub_term_wa("&del"), NNs, L1, L3), + delete_sub_term_wa2(L3,"&del",[],L2),!. + +delete_sub_term_wa2(Find,Find,A,A) :-!. +delete_sub_term_wa2([],_,A,A). +delete_sub_term_wa2([Find|D],Find,B,C):- + delete_sub_term_wa2(D,Find,B,C),!. +delete_sub_term_wa2([A|D],Find,B,[A1|C]):- + (is_list(A)->delete_sub_term_wa2(A,Find,[],A1);A=A1), + delete_sub_term_wa2(D,Find,B,C). + + +/* + +A5=[[v,1],[v,2]],sub_term_wa([v,_],A5,A),findall([Ad,[v,A1]],(member([Ad,[v,A2]],A),A1 is A2+1),A3),foldr(put_sub_term_wa_ae,A3,A5,A4). +A5 = [[v, 1], [v, 2]], +A = [[[1, 1], [v, 1]], [[1, 2], [v, 2]]], +A3 = [[[1, 1], [v, 2]], [[1, 2], [v, 3]]], +A4 = [[v, 2], [v, 3]]. + +*/ + +test_foldr_put_sub_term_wa_ae :- + +findall(_,(member([N,In,T,T2], +[ +[1,[[[1, 1], [v, 2]], [[1, 2], [v, 3]]], [[v, 1], [v, 2]], + [[v, 2], [v, 3]]] + +]), + ((foldr(put_sub_term_wa_ae,In,T,T21),T21=T2)->R=success;R=fail), + writeln([R,foldr(put_sub_term_wa_ae),test,N])),_),!. + + + +test_foldr_put_sub_term_wa_ae_smooth :- + +findall(_,(member([N,In,T,T2], +[ +[1,[[[1, 1], [v, 2]], [[1, 2], [v, 3]]], [[v, 1], [v, 2]], + [v, 2, v, 3]] + +]), + ((foldr(put_sub_term_wa_ae_smooth,In,T,T21),T21=T2)->R=success;R=fail), + writeln([R,foldr(put_sub_term_wa_ae_smooth),test,N])),_),!. + + +put_sub_term_wa_ae([E,A],B,C) :- + put_sub_term_wa(A,E,B,C),!. + + +put_sub_term_wa_ae_smooth([E,A],B,C) :- + put_sub_term_wa_smooth(A,E,B,C),!. + + +test_sub_term_types_wa :- + +findall(_,(member([N,H,T,In], +[ +[1,[string,atom,[],number,var,compound], ["a",a,[],1,_,1+1], + [[[1, 1], "a"], [[1, 2], a], [[1, 3], []], [[1, 4], 1], [[1, 5], _], [[1, 6], 1+1]]], + +[2,[[]], [], + [[[1], []]]], + +[3,[[]],[[]], + [[[1, 1], []]]], + +[4,[all([number,string])],[1,["a",3]], + [[[1,2],["a",3]]] +] + +]), + ((sub_term_types_wa(H,T,In1),In1=In)->R=success;R=fail), + writeln([R,sub_term_types_wa,test,N])),_),!. 
+ + +sub_term_types_wa(H,A,B) :- + dynamic(stwa/1), + retractall(stwa(_)), + assertz(stwa(types)), + copy_term(H,H1), + sub_term_wa2([1],_Ns2,0,A,H1,[],B,true), + !. + +% find terminal lists with particular types +% sub_term_types_wa([all([string])],["a","b",["c","d",["e"]]],In). + +% In = [[[1, 3, 3], ["e"]]]. + +is_t(H,A,First0,First) :- + ((member(var,H),var(A))->true; + ((member(string,H),string(A))->true; + ((member(atom,H),atom(A))->true; + ((member([],H),not(var(A)),A=[])->true; + ((member(number,H),number(A)->true; + ((member(compound,H),not(is_list(A)),compound(A)->true; + ((First0=true,First=true,member(all(K),H), + is_list(A), + forall(member(A1,A),is_t(K,A1,First0,false))); + ((member(heuristic(He,Output),H), + Output=A,He) + )))))))))),!. diff --git a/Philosophy/subterm_with_address.pl b/Philosophy/subterm_with_address.pl new file mode 100644 index 0000000000000000000000000000000000000000..602c1a88386d5a2489e672cc5e331ea61f69c971 --- /dev/null +++ b/Philosophy/subterm_with_address.pl @@ -0,0 +1 @@ +:-include('sub_term_with_address.pl'). \ No newline at end of file diff --git a/Philosophy/subtract1.pl b/Philosophy/subtract1.pl new file mode 100644 index 0000000000000000000000000000000000000000..5326ff30e5a987f5a7604360b73b22292958d59a --- /dev/null +++ b/Philosophy/subtract1.pl @@ -0,0 +1,6 @@ +subtract1(C,[],C):-!. +subtract1(A,[B1|B2],E):-delete1(A,B1,[],D),subtract1(D,B2,E). + +delete1([],_,C,C):-!. +delete1([A1|A2],A1,C,D):-delete1(A2,A1,C,D),!. +delete1([A1|A2],B,C,D):-delete1(A2,B,[A1|C],D),!. 
\ No newline at end of file diff --git a/Philosophy/tech_dict.txt b/Philosophy/tech_dict.txt new file mode 100644 index 0000000000000000000000000000000000000000..1cf662936a88b5f2388d2a2b1e8df43855c8875c --- /dev/null +++ b/Philosophy/tech_dict.txt @@ -0,0 +1 @@ +[["automate","recurse"],["word","word"]] \ No newline at end of file diff --git a/Philosophy/test.lp b/Philosophy/test.lp new file mode 100644 index 0000000000000000000000000000000000000000..06b99f75d255f0e20f18957fb1a563a1f83736cb --- /dev/null +++ b/Philosophy/test.lp @@ -0,0 +1,7 @@ +[ +[[n,test],":-", +[ + [[n,read_string],[[v,a]]], + [[n,writeln],[[v,a]]] +]] +]. \ No newline at end of file diff --git a/Philosophy/test15_vgp.pl b/Philosophy/test15_vgp.pl new file mode 100644 index 0000000000000000000000000000000000000000..5bc87f7f13889756a9e4c764f6fddd245117da7b --- /dev/null +++ b/Philosophy/test15_vgp.pl @@ -0,0 +1,32 @@ +query_box_2(T):-grammar1("[[\"aa,]\",b,\"c\",[]],1]",T). +grammar1(U,T):-compound(U,"",[],T). +compound213(U,U,T,T). +compound(Vgp1,Vgp2,T,U):-grammar_part("[",Vgp1,Vgp3),grammar_part("]",Vgp3,Vgp4),compound213(Vgp4,Vgp2,T,U). +compound(Vgp1,Vgp2,T,U):-grammar_part("[",Vgp1,Vgp3),compound21(Vgp3,Vgp4,T,V),grammar_part("]",Vgp4,Vgp5),compound213(Vgp5,Vgp2,V,U). +compound212(U,U,T,T). +compound21(Vgp1,Vgp2,T,U):-item(Vgp1,Vgp3,I),lookahead(Vgp3,Vgp4,"]"),wrap(I,Itemname1),append(T,Itemname1,V),compound212(Vgp4,Vgp2,V,U). +compound21(Vgp1,Vgp2,T,U):-item(Vgp1,Vgp3,I),grammar_part(",",Vgp3,Vgp4),compound21(Vgp4,Vgp2,[],Compound1name),wrap(I,Itemname1),append(T,Itemname1,V),append(V,Compound1name,U). +item(Vgp1,Vgp2,T):-grammar_part("\"",Vgp1,Vgp3),word21(Vgp3,Vgp4,"",T),grammar_part("\"",Vgp4,Vgp2). +item(Vgp1,Vgp2,T):-number21(Vgp1,Vgp2,"",U),stringtonumber(U,T). +item(Vgp1,Vgp2,T):-word21_atom(Vgp1,Vgp2,"",T1),atom_string(T,T1). +item(Vgp1,Vgp2,T):-compound(Vgp1,Vgp2,[],T). +number212(U,U,T,T). 
+number21(Vgp1,Vgp2,T,U):-grammar_part(A,Vgp1,Vgp3),commaorrightbracketnext(Vgp3,Vgp4),((stringtonumber(A,A1),number(A1))->(true);((equals4(A,".")->(true);(equals4(A,"-"))))),stringconcat(T,A,V),number212(Vgp4,Vgp2,V,U). +number21(Vgp1,Vgp2,T,U):-grammar_part(A,Vgp1,Vgp3),((stringtonumber(A,A1),number(A1))->(true);((equals4(A,".")->(true);(equals4(A,"-"))))),stringconcat(T,A,V),number21(Vgp3,Vgp2,"",Numberstring),stringconcat(V,Numberstring,U). +word212(U,U,T,T). +word21(Vgp1,Vgp2,T,U):-grammar_part(A,Vgp1,Vgp3),quote_next(Vgp3,Vgp4),not((=(A,"\""))),stringconcat(T,A,V),word212(Vgp4,Vgp2,V,U). +word21(Vgp1,Vgp2,T,U):-grammar_part(A,Vgp1,Vgp3),not((=(A,"\""))),stringconcat(T,A,V),word21(Vgp3,Vgp2,"",Wordstring),stringconcat(V,Wordstring,U). +word212_atom(U,U,T,T). +word21_atom(Vgp1,Vgp2,T,U):-%writeln1(word21_atom(Vgp1,Vgp2,T,U)),trace, +grammar_part(A,Vgp1,Vgp3),commaorrightbracketnext(Vgp3,Vgp4),not((=(A,"\""))),not((=(A,"["))),not((=(A,"]"))),stringconcat(T,A,V),word212_atom(Vgp4,Vgp2,V,U). +word21_atom(Vgp1,Vgp2,T,U):-grammar_part(A,Vgp1,Vgp3),not((=(A,"\""))),not((=(A,"["))),not((=(A,"]"))),stringconcat(T,A,V),word21_atom(Vgp3,Vgp2,"",Wordstring),stringconcat(V,Wordstring,U). +commaorrightbracketnext(Vgp1,Vgp2):-lookahead(Vgp1,Vgp2,","). +commaorrightbracketnext(Vgp1,Vgp2):-lookahead(Vgp1,Vgp2,"]"). +quote_next(Vgp1,Vgp2):-lookahead(Vgp1,Vgp2,"\""). +lookahead(A,A,B):-stringconcat(B,D,A). + +grammar_part(A,B,C):-string_concat(A,C,B),string_length(A,1). +stringconcat(A,B,C):-string_concat(A,B,C). +stringtonumber(A,B):-number_string(B,A). +equals4(A,B):-A=B. +wrap(A,B):-B=[A]. diff --git a/Philosophy/test_family.pl b/Philosophy/test_family.pl new file mode 100644 index 0000000000000000000000000000000000000000..3251ea9ca3c51da73e358deadb050c9784324d40 --- /dev/null +++ b/Philosophy/test_family.pl @@ -0,0 +1,20 @@ +:-include('../Prolog-to-List-Prolog/p2lpconverter.pl'). +%:-include('../SSI-1d011c3148fcfb6edd02292b0151fa9aa858848d/ssi.pl'). 
+:-include('../SSI/ssi.pl'). +%:-include('../listprologinterpreter/listprolog.pl'). +:-include('family_sols.pl'). +:-include('family_test.pl'). + +test_family :- + %working_directory1(A,A), + %working_directory1(_,"../"), + %fastp2lp("family_sols.pl",Functions), + + p2lpconverter([file,"family_sols.pl"],Functions1), + p2lpconverter([file,"family_test.pl"],Functions2), + + append(Functions1,Functions2,Functions3), + + Debug=on, + + lucianpl(Debug,[[n,family_test]],Functions3,[[]]). \ No newline at end of file diff --git a/Philosophy/trans_txt.txt b/Philosophy/trans_txt.txt new file mode 100644 index 0000000000000000000000000000000000000000..b4b2bb3c96c89755402e8326a931b888c2d30476 --- /dev/null +++ b/Philosophy/trans_txt.txt @@ -0,0 +1 @@ +[[1,"A. \nA.",3,"A.A."],[5,"A.A.",7,"a.a."]] \ No newline at end of file diff --git a/Philosophy/unconceived_child_was_examined.pl b/Philosophy/unconceived_child_was_examined.pl new file mode 100644 index 0000000000000000000000000000000000000000..c13090bdbb373e411f83b1db87da1a5ea2a50e92 --- /dev/null +++ b/Philosophy/unconceived_child_was_examined.pl @@ -0,0 +1,6 @@ +% ["Mind Reading","mr for scientific property detection 4.txt",0,algorithms,"62. I mind tested that the unconceived child was examined."] + +unconceived_child_was_examined :- + writeln("Have you written 1 A if you are a graduate of undergraduate or below, 15 As if you are an Honours graduate, 50 As if you are a Master graduate or 200 As if you are a PhD graduate for the successful birth of your unconceived child (y/n)?"), + read_string(user_input, "\n", "\r", _, "y"). 
+ \ No newline at end of file diff --git a/Philosophy/version.pl b/Philosophy/version.pl new file mode 100644 index 0000000000000000000000000000000000000000..0d9471571c493b53e37ccb202ef6acd176c06e40 --- /dev/null +++ b/Philosophy/version.pl @@ -0,0 +1,39 @@ +% version.pl + +% Calculates the version of an algorithm from a history file with API changes, features and bug fixes + +:-include('../listprologinterpreter/la_files.pl'). +:-include('../listprologinterpreter/la_strings.pl'). +:-include('../listprologinterpreter/la_strings_string.pl'). + +version(File,A,F,B) :- + +open_file_s(File,File_term), + +version1(File_term,0,0,0,A,F,B). + + +version1([],A,F,B,A,F,B) :- !. +version1(File_term,A1,_F1,_B1,A2,F2,B2) :- + File_term=[[a,_]|R], + A3 is A1+1, + F3 is 0, + B3 is 0, +version1(R,A3,F3,B3,A2,F2,B2). + +version1(File_term,A1,F1,_B1,A2,F2,B2) :- + File_term=[[f,_]|R], + A3 is A1, + F3 is F1+1, + B3 is 0, +version1(R,A3,F3,B3,A2,F2,B2). + +version1(File_term,A1,F1,B1,A2,F2,B2) :- + File_term=[[b,_]|R], + A3 is A1, + F3 is F1, + B3 is B1+1, +version1(R,A3,F3,B3,A2,F2,B2). + + + diff --git a/Philosophy/version2.txt b/Philosophy/version2.txt new file mode 100644 index 0000000000000000000000000000000000000000..8bcd837629687ba46f949faf06adb38ee976081f --- /dev/null +++ b/Philosophy/version2.txt @@ -0,0 +1 @@ +[[f,"Wrote version.pl."],[b,"Added underscores at lines 15 and 22."],[b,"d 1"],[f,"d 2"]] \ No newline at end of file diff --git a/Philosophy/web-editor-pw.pl b/Philosophy/web-editor-pw.pl new file mode 100644 index 0000000000000000000000000000000000000000..f662dff18d7d9ff8a951f1d7d6e77aba87235527 --- /dev/null +++ b/Philosophy/web-editor-pw.pl @@ -0,0 +1 @@ +password("apple8"). 
\ No newline at end of file diff --git a/Philosophy/word_frequency_count.pl b/Philosophy/word_frequency_count.pl new file mode 100644 index 0000000000000000000000000000000000000000..f7ddbc85130963fb7f979d07e7e13f2937af7435 --- /dev/null +++ b/Philosophy/word_frequency_count.pl @@ -0,0 +1,22 @@ +:-include('../listprologinterpreter/la_strings.pl').
+:-include('../listprologinterpreter/la_strings_string.pl').
+
+% word_frequency_count(["file","file.txt"],Freq).
+
+% word_frequency_count(["string","a b c"],Freq).
+% Freq = [["a", 1], ["b", 1], ["c", 1]]
+
+word_frequency_count([Type,String1],Freq) :-
+	SepandPad="&#@~%`$?-+*^,()|.:;=_/[]<>{}\n\r\s\t\\\"!'0123456789",
+
+	(Type="file"->
+	(phrase_from_file_s(string(String2), String1));
+	String1=String2),
+	split_string(String2,SepandPad,SepandPad,String3),
+	%writeln(String3),
+	%length(String3,Words),
+
+	sort(String3,String4),
+	findall([A,A2],(member(A,String4),findall(A,(member(A,String3)),A1),length(A1,A2)),Freq).
+
+ diff --git a/Philosophy/word_frequency_count2.pl b/Philosophy/word_frequency_count2.pl new file mode 100644 index 0000000000000000000000000000000000000000..b832ad6b94e8e336cd7311476cd83751963d5198 --- /dev/null +++ b/Philosophy/word_frequency_count2.pl @@ -0,0 +1,30 @@ +:-include('../listprologinterpreter/la_strings.pl').
+:-include('../listprologinterpreter/la_strings_string.pl').
+
+% word_frequency_count2(["file","file.txt"],Freq).
+
+% word_frequency_count2(["string","a b c"],Freq).
+% Freq = [a-1, b-1, c-1]
+
+word_frequency_count2([Type,String1],Freq) :-
+	SepandPad="&#@~%`$?-+*^,()|.:;=_/[]<>{}\n\r\s\t\\\"!'0123456789",
+
+	(Type="file"->
+	(phrase_from_file_s(string(String2), String1));
+	String1=String2),
+	split_string(String2,SepandPad,SepandPad,String3),
+	%writeln(String3),
+	%length(String3,Words),
+
+	word_frequency_count2(String3,Freq).
+
+/*
+	sort(String3,String4),
+	findall([A2,A],(member(A,String4),findall(A,(member(A,String3)),A1),length(A1,A2)),Freq). 
+ */ + + +word_frequency_count2(Words,Counts) :- + maplist(downcase_atom, Words, LwrWords), + msort(LwrWords, Sorted), + clumped(Sorted, Counts),!. diff --git a/Philosophy/word_game.pl b/Philosophy/word_game.pl new file mode 100644 index 0000000000000000000000000000000000000000..fa2dddae4264b7007fd37da30e8cea657e4d7f7c --- /dev/null +++ b/Philosophy/word_game.pl @@ -0,0 +1,37 @@ +:-include('../listprologinterpreter/listprolog.pl'). + +word_game :- + writeln("Player 1, please enter the word for player 2 to guess without showing them."), + read_string(user_input,"\n","\r",_,S), + nl, nl, nl, nl, nl, nl, nl, nl, nl, nl, nl, nl, nl, nl, nl, nl, nl, nl, nl, nl, nl, nl, nl, nl, nl, nl, + string_codes(S,C), + findall(C2,(member(C1,C),string_codes(C2,[C1])),S2), + length(S2,N), + numbers(N,1,[],S1), + N2 is 20, + word_game1(S1,S2,N2). + +%word_game(Numbers,Correct_letters,Tries_left) +word_game1(_,S,0) :- writeln("You lost!"),write("The word was "),foldr(string_concat,S,"",S1),write(S1),writeln("."),!. +word_game1(S,S,_) :- writeln("You won!"),write("The word was "),foldr(string_concat,S,"",S1),write(S1),writeln("."),!. +word_game1(S,S1,N) :- + repeat, + writeln("Player 2, please enter e.g. 1,a where 1 is the position number and a is the letter."), + writeln(S), + read_string(user_input,"\n","\r",_,S2), + split_string(S2,", ",", ",[N1,L1]), + number_string(N2,N1), + + get_item_n(S1,N2,L2), + (L1=L2-> + (writeln("Correct!"), + put_item_n(S,N2,L1,S3), + N0 is N-1, + word_game1(S3,S1,N0)) + ; + (writeln("Incorrect!"), + N0 is N-1, + word_game1(S,S1,N0))),!. + + + \ No newline at end of file diff --git a/Program-Finder-from-Data-/LICENSE b/Program-Finder-from-Data-/LICENSE new file mode 100644 index 0000000000000000000000000000000000000000..d925e8c4aa63a274f76a5c22de79d0ada99b0da7 --- /dev/null +++ b/Program-Finder-from-Data-/LICENSE @@ -0,0 +1,29 @@ +BSD 3-Clause License + +Copyright (c) 2019, Lucian Green +All rights reserved. 
+ +Redistribution and use in source and binary forms, with or without +modification, are permitted provided that the following conditions are met: + +1. Redistributions of source code must retain the above copyright notice, this + list of conditions and the following disclaimer. + +2. Redistributions in binary form must reproduce the above copyright notice, + this list of conditions and the following disclaimer in the documentation + and/or other materials provided with the distribution. + +3. Neither the name of the copyright holder nor the names of its + contributors may be used to endorse or promote products derived from + this software without specific prior written permission. + +THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" +AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE +IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE +DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE +FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL +DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR +SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER +CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, +OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE +OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. diff --git a/Program-Finder-from-Data-/README.md b/Program-Finder-from-Data-/README.md new file mode 100644 index 0000000000000000000000000000000000000000..5fcc42ee6c968bf4277c36fa4aa0e4d5b356d5da --- /dev/null +++ b/Program-Finder-from-Data-/README.md @@ -0,0 +1,52 @@ +# Program-Finder-from-Data- +Finds recursive algorithms from the structure of data. + +# Getting Started + +Please read the following instructions on how to install the project on your computer for writing algorithms. 
+ +# Prerequisites + +* Please download and install SWI-Prolog for your machine at `https://www.swi-prolog.org/build/`. + +# 1. Install manually + +Download this repository and the List Prolog Interpreter repository. + +# 2. Or Install from List Prolog Package Manager (LPPM) + +* Download the LPPM Repository: + +``` +mkdir GitHub +cd GitHub/ +git clone https://github.com/luciangreen/List-Prolog-Package-Manager.git +cd List-Prolog-Package-Manager +swipl +['lppm']. +lppm_install("luciangreen","Program-Finder-from-Data-"). +halt. +``` + +# Running + +* In Shell: +`cd Program-Finder-from-Data-` +`swipl` +`['programfinder'].` + +* Running the algorithm +To generate an algorithm: +`Input1=[["n","a"]],Inputs2=[["a",5]],Output=[["n",5]],programfinder(Input1,Inputs2,Output,Extras,Program),writeln1(Program),international_interpret([lang,"en"],on,[[n,function],[Input1,Inputs2,[],[v,result]]],Program,Result).` + +# Versioning + +We will use SemVer for versioning. + +# Authors + +Lucian Green - Initial programmer - Lucian Academy + +# License + +This project is licensed under the BSD 3-Clause License - see the LICENSE file for details diff --git a/Program-Finder-from-Data-/program_finder.pl b/Program-Finder-from-Data-/program_finder.pl new file mode 100644 index 0000000000000000000000000000000000000000..90493d01c77b538ae99b183cde0aa0572b6af406 --- /dev/null +++ b/Program-Finder-from-Data-/program_finder.pl @@ -0,0 +1,332 @@ +%% Program Finder + +/** + +Also load interpret. + +*** BEFORE +?- Input1=[[n,a]],Inputs2=[[a,5]] ,Output=[[n, 5]] ,programfinder(Input1,Inputs2,Output,Extras,Program),writeln(Program),interpret(off,[function,[Input1,Inputs2,[],result]],Program,Result). 
+[[function,[[],inputs2,output,output]],[function,[input1,inputs2,inputs3,output],:-,[[head,input1,head],[tail,input1,tail],[head=[a,b]],[[atom,a],[atom,b]],[member,inputs21,inputs2],[inputs21=[b,c]],[item1=[a,c]],[append,inputs3,item1,item2],[function,[tail,inputs2,item2,output]]]]] +Input1 = [[n, a]], +Inputs2 = [[a, 5]], +Output = [[n, 5]], +Extras = [], +Program = [[function,[[],inputs2,output,output]],[function,[input1,inputs2,inputs3,output],:-,[[head,input1,head],[tail,input1,tail],[head=[a,b]],[[atom,a],[atom,b]],[member,inputs21,inputs2],[inputs21=[b,c]],[item1=[a,c]],[append,inputs3,item1,item2],[function,[tail,inputs2,item2,output]]]]], +Result = [[result, [n, 5]]] + +** AFTER + +Input1=[[n,a]],Inputs2=[[a,5]] ,Output=[[n, 5]] ,programfinder(Input1,Inputs2,Output,Extras,Program),writeln(Program),interpret(off,[[n,function],[Input1,Inputs2,[],[v,result]]],Program,Result). +[[[n,function],[[],[v,inputs2],[v,output],[v,output]]],[[n,function],[[v,input1],[v,inputs2],[v,inputs3],[v,output]],:-,[[[n,head],[[v,input1],[v,head]]],[[n,tail],[[v,input1],[v,tail]]],[[n,=],[[v,head],[[v,a],[v,b]]]],[[[n,atom],[v,a]],[[n,atom],[v,b]]],[[n,member],[[v,inputs21],[v,inputs2]]],[[n,=],[[v,inputs21],[[v,b],[v,c]]]],[[[n,number],[v,c]]],[[n,=],[[v,item1],[[v,a],[v,c]]]],[[n,append],[[v,inputs3],[v,item1],[v,item2]]],[[n,function],[[v,tail],[v,inputs2],[v,item2],[v,output]]]]]] +Input1 = [[n, a]], +Inputs2 = [[a, 5]], +Output = [[n, 5]], +Extras = Result, Result = [], + +**/ + +programfinder(Input1,Inputs2,Output,Extras,Program) :- + labelall(Inputs2,inputs2,1,1,[],Inputs2Labels), +%%writeln([inputs2Labels,Inputs2Labels]), + label(Extras,extras,1,1,[],ExtrasLabels), +%%writeln([extrasLabels,ExtrasLabels]), +%% Inputs2=[Item|_Items], +%% ((length(Item,1),Extra=[])-> %% () is2 has 1-length tuples - then processes i1, is2 simultaneously +%% replace1is2(Input1,Inputs2,Output,Program); + replace(Input1,Output,Inputs2Labels,ExtrasLabels,[],Relations1), %% replace returns the 
relations for given input for a replacement +%%writeln([relations1,Relations1]), + deleteduplicates(Relations1,[],Relations2), +%%writeln([relations2,Relations2]), + FunctionName=[n,function], + (not(member([extrarelation,_,_,_,_,_],Relations2))-> + makebasecase(FunctionName,BaseCase); + makebasecase2(FunctionName,BaseCase)), +%%writeln([baseCase,BaseCase]), + findprogram(FunctionName,Relations2,1,[],Program1), + append(BaseCase,Program1,Program) + . +%%) +/** +replace1is2(Input1,Inputs2,Output,Program) :- + Input1=[Item1|Items1], + Inputs2=[Item2|Items2], + Output=[Item3|Items3], +**/ +replace([],_Output1,_Inputs2Labels,_ExtrasLabels,Relations,Relations). +replace(Input1,Output1,Inputs2Labels,ExtrasLabels,Relations1,Relations2) :- + ((atom(Input1);number(Input1))->Input1a=[Input1];Input1a=Input1), + Input1a=[Item1|Items1], + ((atom(Output1);number(Output1))->Output1a=[Output1];Output1a=Output1), + Output1a=[Item2|Items2], %% record what we need to give to output, then give it in a clause acc considerations + label(Item1,input1,1,1,[],Input1Labels), + label(Item2,output,1,1,[],OutputLabels), %% find position, type of values, (will find identicalness) + relations1(Input1Labels,OutputLabels,Inputs2Labels,ExtrasLabels,Relations1,Relations3), %% find rel types, which case extra/findargs relates to +%% ExtrasLabels later +%% return option config from rep, type config in extras, extras with relations + replace(Items1,Items2,Inputs2Labels,ExtrasLabels,Relations3,Relations2). + +labelall([],_Range,_ListItemNumber,_Position,Labels,Labels). +labelall(Data1,Range,ListItemNumber,Position1,Labels1,Labels2) :- + Data1=[Data2|Data3], + label(Data2,Range,ListItemNumber,Position1,Labels1,Labels3), + Position2 is Position1 + 1, + labelall(Data3,Range,ListItemNumber,Position2,Labels3,Labels2). +label([],_Range,_ItemNumber,_Position,ItemLabels,ItemLabels). 
+label(Item1,Range,ItemNumber,Position1,ItemLabels1,ItemLabels2) :- + ((atom(Item1);number(Item1))->Item1a=[Item1];Item1a=Item1), + Item1a=[Item2|Items], %% removed [] from i1 + ((atom(Item2),Type=[n,atom]); + (number(Item2),Type=[n,number]); + (string(Item2),Type=[n,string]); + (var(Item2),Type=[n,variable])), %% *** + Label=[Item2,Type,Range,ItemNumber,Position1], + append(ItemLabels1,[Label],ItemLabels3), + Position2 is Position1 + 1, + label(Items,Range,ItemNumber,Position2,ItemLabels3,ItemLabels2). +relations1([],_OutputLabels,_Inputs2Labels,_ExtrasLabels,Relations,Relations). +relations1(Input1Labels,OutputLabels,Inputs2Labels,ExtrasLabels,Relations1,Relations2) :- + ((atom(Input1Labels);number(Input1Labels))->Input1Labels1a=[Input1Labels];Input1Labels1a=Input1Labels), + Input1Labels1a=[Item1a|Items1], + Item1a=[Item1,Type1,Range1,ItemNumber1,Position1], + relations2(Item1,Type1,Range1,ItemNumber1,Position1,_Items1,OutputLabels,Inputs2Labels,ExtrasLabels,Relations1,Relations3), + relations1(Items1,OutputLabels,Inputs2Labels,ExtrasLabels,Relations3,Relations2). +/**relations2(Item1,Range1,ItemNumber1,Position1,Items1,_OutputLabels,_Inputs2Labels,_ExtrasLabels,Relations1,Relations2,ExtrasRelations,ExtrasRelations) :- +writeln(r21), + member(Item2a,Items1), + Item2a=[Item2,_Type2,Range2,ItemNumber2,Position2], + Item1=Item2, + append(Relations1,[[Item1,Range1,ItemNumber1,Position1],[Item2,Range2,ItemNumber2,Position2]],Relations2). 
+**/ +relations2(Item1,Type1,Range1,ItemNumber1,Position1,_Items1,OutputLabels,Inputs2Labels,ExtrasLabels,Relations1,Relations2) :- +%%writeln(r22), + ((Option1=Inputs2Labels,Option2=OutputLabels);(Option1=OutputLabels,Option2=Inputs2Labels)), + member(Item2a,Option1), + Item2a=[Item2,Type2,Range2,ItemNumber2,Position2], + Item1=Item2, + append(Relations1,[[[Item1,Type1,Range1,ItemNumber1,Position1],[Item2,Type2,Range2,ItemNumber2,Position2]]],Relations3), + member(Item3a,Option1), + Item3a=[Item3,Type3,Range3,ItemNumber3,Position3], + ItemNumber2=ItemNumber3, + ((Item3=empty,Position1=Position2,not(Position2=Position3))->( + member(Item5a,ExtrasLabels), + Item5a=[Item5,Type5,Range5,ItemNumber5,Position5], + Item2=Item5, + append(Relations3,[[extrarelation,Item5,Type5,Range5,ItemNumber5,Position5]],Relations4) + );Relations4=Relations3),( + member(Item4a,Option2), + Item4a=[Item4,Type4,Range4,ItemNumber4,Position4], + Item3=Item4, + append(Relations4,[[[Item3,Type3,Range3,ItemNumber3,Position3],[Item4,Type4,Range4,ItemNumber4,Position4]]],Relations2)), + !. +relations2(Item1,_Range1,_ItemNumber1,_Position1,_Items1,OutputLabels,Inputs2Labels,ExtrasLabels,Relations1,Relations2) :- +writeln(r23), +%% ((Item3=empty,Position1=Position2,not(Position2=Position3))->( + member(Item5a,ExtrasLabels), + Item5a=[Item5,Type5,Range5,ItemNumber5,Position5], + Item1=Item5, + append(Relations1,[[extrarelation,Item5,Type5,Range5,ItemNumber5,Position5]],Relations2), + %%);ExtrasRelations1=ExtrasRelations2), + (Option=Inputs2Labels;Option=OutputLabels), + not((member(Item4a,Option), + Item4a=[Item4,_Type1,_Range4,_ItemNumber4,_Position4], + Item4=Item5)). + +%% append(Relations3,[[Item3,Range3,ItemNumber3,Position3],[Item4,Range4,ItemNumber4,Position4]],Relations2) + +deleteduplicates([],List,List) :- !. 
+deleteduplicates(List1,List2,List3) :- + %%((not(List1=[])-> + List1=[[[Item1,Type1,Range1,ItemNumber1,Position1],[Item2,Type2,Range2,ItemNumber2,Position2]]|Rest], + delete(Rest,[[_Item12,Type1,Range1,ItemNumber3,Position1],[_Item22,Type2,Range2,ItemNumber4,Position2]],Rest2), %% does vv need deleting too v + delete(Rest2,[[_Item23,Type2,Range2,ItemNumber4,Position2],[_Item13,Type1,Range1,ItemNumber3,Position1]],Rest3), + ItemNumber1=ItemNumber2,ItemNumber1=ItemNumber3,ItemNumber1=ItemNumber4, + append(List2,[[[Item1,Type1,Range1,ItemNumber1,Position1],[Item2,Type2,Range2,ItemNumber2,Position2]]],List4), + deleteduplicates(Rest3,List4,List3), + %%);( +%%writeln(here1), + %% List3=[])) + !. + +deleteduplicates2([],List,List) :- !. +deleteduplicates2(List1,List2,List3) :- + List1=[Item|Rest1], + delete(List2,Item,List5), + deleteduplicates2(Rest1,List5,List3), + !. + +findprogram(_FunctionName,Relations,FunctionNumber,Program,Program) :- + not(member([[_Item1,_Type1,_Range1,FunctionNumber,_Position1],[_Item2,_Type2,_Range2,FunctionNumber,_Position2]],Relations)), + !. 
+findprogram(FunctionName,Relations,FunctionNumber1,Program1,Program2) :- + %%findprogram2(Relations,FunctionNumber1,1,[],Vars,[],Header,[],Body1), + input1arguments(Relations,FunctionNumber1,1,[[undef,[v,"`"]]],Vars1,[],Header1,[],TypeStatements1), + inputs2arguments(Relations,FunctionNumber1,1,Vars1,Vars2,[],Header2,[],TypeStatements2), + deleteduplicates2(TypeStatements1,TypeStatements2,TypeStatements3), + %trace, + outputarguments(Relations,FunctionNumber1,1,Vars2,Vars3,[],Header3), + extrasarguments(Relations,FunctionNumber1,1,Vars3,_Vars4,[],Header4), + (not(Header4=[])-> + makecode(FunctionName,Header1,TypeStatements1,TypeStatements3,Header2,Header3,Code1); + makecode(FunctionName,Header1,TypeStatements1,TypeStatements3,Header2,Header3,Header4,Code1)), + %%processcode( + append(Program1,Code1,Program3), + FunctionNumber2 is FunctionNumber1 + 1, + findprogram(FunctionName,Relations,FunctionNumber2,Program3,Program2). + +input1arguments(_Relations,_FunctionNumber,3,Vars,Vars,Header,Header,TypeStatements,TypeStatements). +input1arguments(Relations,FunctionNumber,Position1,Vars1,Vars2,Header1,Header2,TypeStatements1,TypeStatements2) :- + (member([[Item1,Type1,input1,FunctionNumber,Position1],[Item1,_Type2,_Range2,FunctionNumber,_Position2]],Relations);member([[Item1,_Type2,_Range2,FunctionNumber,_Position2],[Item1,Type1,input1,FunctionNumber,Position1]],Relations)), + var(Item1,Var,Vars1,Vars3), + append(Header1,[Var],Header3), + Position2 is Position1 + 1, + append(TypeStatements1,[[Type1,[Var]]],TypeStatements3), %% add to lpi + input1arguments(Relations,FunctionNumber,Position2,Vars3,Vars2,Header3,Header2,TypeStatements3,TypeStatements2). 
+input1arguments(Relations,FunctionNumber,Position1,Vars1,Vars2,Header1,Header2,TypeStatements1,TypeStatements2) :- + not((member([[Item1,Type1,input1,FunctionNumber,Position1],[Item1,_Type2,_Range2,FunctionNumber,_Position2]],Relations);member([[Item1,_Type22,_Range22,FunctionNumber,_Position22],[Item1,Type1,input1,FunctionNumber,Position1]],Relations))), + append(Header1,[undef],Header3), %% check this works in lpi + Position2 is Position1 + 1, + input1arguments(Relations,FunctionNumber,Position2,Vars1,Vars2,Header3,Header2,TypeStatements1,TypeStatements2). + +inputs2arguments(_Relations,_FunctionNumber,3,Vars,Vars,Header,Header,TypeStatements,TypeStatements). +inputs2arguments(Relations,FunctionNumber,Position1,Vars1,Vars2,Header1,Header2,TypeStatements1,TypeStatements2) :- + (member([[Item1,Type1,inputs2,FunctionNumber,Position1],[Item1,_Type2,_Range2,FunctionNumber,_Position2]],Relations);member([[Item1,_Type2,_Range2,FunctionNumber,_Position2],[Item1,Type1,inputs2,FunctionNumber,Position1]],Relations)), + var(Item1,Var,Vars1,Vars3), + append(Header1,[Var],Header3), + Position2 is Position1 + 1, + append(TypeStatements1,[[Type1,[Var]]],TypeStatements3), + inputs2arguments(Relations,FunctionNumber,Position2,Vars3,Vars2,Header3,Header2,TypeStatements3,TypeStatements2). +inputs2arguments(Relations,FunctionNumber,Position1,Vars1,Vars2,Header1,Header2,TypeStatements1,TypeStatements2) :- + not((member([[Item1,Type1,inputs2,FunctionNumber,Position1],[Item1,_Type2,_Range2,FunctionNumber,_Position2]],Relations);member([[Item1,_Type22,_Range22,FunctionNumber,_Position22],[Item1,Type1,inputs2,FunctionNumber,Position1]],Relations))), + append(Header1,[undef],Header3), %% check this works in lpi + Position2 is Position1 + 1, + inputs2arguments(Relations,FunctionNumber,Position2,Vars1,Vars2,Header3,Header2,TypeStatements1,TypeStatements2). + +outputarguments(_Relations,_FunctionNumber,3,Vars,Vars,Header,Header). 
+outputarguments(Relations,FunctionNumber,Position1,Vars1,Vars2,Header1,Header2) :- + (member([[Item1,_Type1,output,FunctionNumber,Position1],[Item1,_Type2,_Range2,FunctionNumber,_Position2]],Relations);member([[Item1,_Type22,_Range22,FunctionNumber,_Position22],[Item1,_Type12,output,FunctionNumber,Position1]],Relations)), + var(Item1,Var,Vars1,Vars3), + append(Header1,[Var],Header3), + Position2 is Position1 + 1, + outputarguments(Relations,FunctionNumber,Position2,Vars3,Vars2,Header3,Header2). +outputarguments(Relations,FunctionNumber,Position1,Vars1,Vars2,Header1,Header2) :- + not((member([[Item1,Type1,output,FunctionNumber,Position1],[Item1,_Type2,_Range2,FunctionNumber,_Position2]],Relations);member([[Item1,_Type22,_Range22,FunctionNumber,_Position22],[Item1,Type1,output,FunctionNumber,Position1]],Relations))), + append(Header1,[undef],Header3), %% check this works in lpi + Position2 is Position1 + 1, + outputarguments(Relations,FunctionNumber,Position2,Vars1,Vars2,Header3,Header2). + +extrasarguments(_Relations,_FunctionNumber,3,Vars,Vars,Header,Header). +extrasarguments(Relations,FunctionNumber,Position1,Vars1,Vars2,Header1,Header2) :- + member([extrarelation,Item1,_Type1,_Range,FunctionNumber,Position1],Relations), + var(Item1,Var,Vars1,Vars3), + append(Header1,[Var],Header3), + Position2 is Position1 + 1, + extrasarguments(Relations,FunctionNumber,Position2,Vars3,Vars2,Header3,Header2). +extrasarguments(Relations,FunctionNumber,Position1,Vars1,Vars2,Header1,Header2) :- + not(member([extrarelation,_Item1,_Type1,_Range,FunctionNumber,Position1],Relations)), + append(Header1,[undef],Header3), %% check this works in lpi + Position2 is Position1 + 1, + extrasarguments(Relations,FunctionNumber,Position2,Vars1,Vars2,Header3,Header2). + +makebasecase(FunctionName,Code) :- %**** + Code=[[FunctionName,[[],[v,inputs2],[v,output],[v,output]]]]. 
+makecode(FunctionName,Header1,TypeStatements1,TypeStatements2,Header2,Header3,Code) :- + append([[v,head]],[Header1],List1), + append([[v,head1]],[Header2],List2), + append([[v,item1]],[Header3],List3), + Code=[ + [FunctionName,[[v,input1],[v,inputs2],[v,inputs3],[v,output]],":-", + [ + [[n,head],[[v,input1],[v,head]]], + [[n,tail],[[v,input1],[v,tail]]], + [[n,equals1],List1], + TypeStatements1, + %[[n,member2],[[v,inputs2],[v,inputs21]]], %% brackets here in new lpi + [[n,head],[[v,inputs2],[v,head1]]], + [[n,tail],[[v,inputs2],[v,tail1]]], + [[n,equals1],List2], + TypeStatements2, + [[n,equals2],List3], + [[n,wrap],[[v,item1],[v,item1a]]], + [[n,append],[[v,inputs3],[v,item1a],[v,item2]]], %% brackets here in new lpi + [FunctionName,[[v,tail],[v,tail1],[v,item2],[v,output]] +]] + + ]]. +makebasecase2(FunctionName,Code) :- + Code=[[FunctionName,[[],[v,inputs2],[v,extras],[v,extras],[v,output],[v,output]]]]. +makecode(FunctionName,Header1,TypeStatements1,TypeStatements2,Header2,Header3,Header4,Code) :- + append([[v,head]],[Header1],List1), + append([[v,head1]],[Header2],List2), + append([[v,item1]],[Header3],List3), + append([[v,item3]],[Header4],List4), + Code=[ + [FunctionName,[[v,input1],[v,inputs2],[v,extras1],[v,extras2],[v,inputs3],[v,output]],":-", + [ + [[n,head],[[v,input1],[v,head]]], + [[n,tail],[[v,input1],[v,tail]]], + [[n,equals1],List1], + TypeStatements1, + %[[n,member2],[[v,inputs2],[v,inputs21]]], %% brackets here in new lpi + [[n,head],[[v,inputs2],[v,head1]]], + [[n,tail],[[v,inputs2],[v,tail1]]], + [[n,equals1],List2], + TypeStatements2, + [[n,equals2],List3], + TypeStatements2, + [[n,wrap],[[v,item1],[v,item1a]]], + [[n,append],[[v,inputs3],[v,item1a],[v,item2]]], %% brackets here in new lpi + [[n,equals2],List4], + [[n,append],[[v,extras1],[v,item3],[v,extras2]]], + [FunctionName,[[v,tail],[v,tail1],[v,item2],[v,output]] +] + ] + ] + ]. 
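The specification in the README — `programfinder` turning `Input1=[["n","a"]]` and `Inputs2=[["a",5]]` into `Output=[["n",5]]` — amounts to pairing each input key with the value looked up in `Inputs2`. A minimal sketch of that input/output relation (a hypothetical illustration of what the generated recursive program computes, not the List Prolog code itself):

```python
# Hypothetical sketch: each [key, value-name] pair in input1 is rewritten to
# [key, value], where value is found by looking the value-name up in the
# inputs2 association list.
def apply_lookup(input1, inputs2):
    table = dict((k, v) for k, v in inputs2)
    return [[key, table[name]] for key, name in input1]
```

So `apply_lookup([["n", "a"]], [["a", 5]])` reproduces the README's expected output `[["n", 5]]`.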
+ +findprogram2(Relations,FunctionNumber,ArgumentNumber1,Vars1,Vars2,Header1,Header2,Body1,Body2) :- + ArgumentNumber2 is ArgumentNumber1 + 1, + findprogram2(Relations,FunctionNumber,ArgumentNumber2,Vars1,Vars2,Header1,Header2,Body1,Body2). +/**processcode( + deleteduplicatecode( + optimisecode(**/ +var(Item,Var,Vars,Vars) :- + member([Item,Var],Vars). +var(Item1,Var1A,Vars1,Vars2) :- + length(Vars1,Vars1Length1), + Vars1Length2 is Vars1Length1-1, + length(Vars3,Vars1Length2), + append(Vars3,[[_Item2,[v,Var2]]],Vars1), + char_code(Var2,Var2Code1), + Var2Code2 is Var2Code1 + 1, + Var2Code2 =< 122, + char_code(Var1,Var2Code2), + Var1A=[v,Var1], + append(Vars1,[[Item1,[v,Var1]]],Vars2). + +/** + +fp2 +-() 2v in is2,o + +- convert to var a,b,c, etc. +- append varname to header, command to body +2p of vars +remove duplicate code +merge code, i.e. m +efmar + + %% if vals are same, then do something, e.g. write empty, otherwise put them together + %% () if e.g. firstargs are needed, do same treating it as an output, test if same vals affects it +1. o->is, rec type,pos,identicalness format of o (any empty, etc), is + a. lack of presence in empty in is2, or identicalness in i1, is2,->firstargs added to - in [n,n], one n from each of i1, is2 +2. is->o, code + +finds mp with 1 or 2 in i1, unless is2 has 1-length tuples - then processes i1, is2 simultaneously + +order: +[] +format +type +identicalness + +**/ \ No newline at end of file diff --git a/Program-Finder-from-Data-/programfinder.pl b/Program-Finder-from-Data-/programfinder.pl new file mode 100644 index 0000000000000000000000000000000000000000..c163e03c9954ad000e52940a95207899934ea03d --- /dev/null +++ b/Program-Finder-from-Data-/programfinder.pl @@ -0,0 +1,8 @@ +:- include('../listprologinterpreter/listprolog.pl'). +%%:- include('caw5 copy 12.pl'). +%%:- include('cawpverify.pl'). +%%:- include('rcawp.pl'). +%%:- include('rcaw.pl'). +%%**:- include('texttobr2.pl'). 
+%:- include('../listprologinterpreter/la_strings.pl'). +:- include('program_finder.pl'). \ No newline at end of file diff --git a/Prolog-to-List-Prolog-master/.DS_Store b/Prolog-to-List-Prolog-master/.DS_Store new file mode 100644 index 0000000000000000000000000000000000000000..dca0fa4557795fbafde08577faf1521a71189c46 Binary files /dev/null and b/Prolog-to-List-Prolog-master/.DS_Store differ diff --git a/Prolog-to-List-Prolog-master/LICENSE b/Prolog-to-List-Prolog-master/LICENSE new file mode 100644 index 0000000000000000000000000000000000000000..d925e8c4aa63a274f76a5c22de79d0ada99b0da7 --- /dev/null +++ b/Prolog-to-List-Prolog-master/LICENSE @@ -0,0 +1,29 @@ +BSD 3-Clause License + +Copyright (c) 2019, Lucian Green +All rights reserved. + +Redistribution and use in source and binary forms, with or without +modification, are permitted provided that the following conditions are met: + +1. Redistributions of source code must retain the above copyright notice, this + list of conditions and the following disclaimer. + +2. Redistributions in binary form must reproduce the above copyright notice, + this list of conditions and the following disclaimer in the documentation + and/or other materials provided with the distribution. + +3. Neither the name of the copyright holder nor the names of its + contributors may be used to endorse or promote products derived from + this software without specific prior written permission. + +THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" +AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE +IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE +DISCLAIMED. 
IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE +FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL +DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR +SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER +CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, +OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE +OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. diff --git a/Prolog-to-List-Prolog-master/README.md b/Prolog-to-List-Prolog-master/README.md new file mode 100644 index 0000000000000000000000000000000000000000..f0850473122f00f3de73bf31d198b309bde64ed7 --- /dev/null +++ b/Prolog-to-List-Prolog-master/README.md @@ -0,0 +1,79 @@ +# Prolog-to-List-Prolog +Converts Prolog algorithms to List Prolog algorithms + +# Prerequisites + +* Please download and install SWI-Prolog for your machine at `https://www.swi-prolog.org/build/`. + +# 1. Install manually + +Download this repository and the List Prolog interpreter. + +# 2. Or Install from List Prolog Package Manager (LPPM) + +* Download the LPPM Repository: + +``` +mkdir GitHub +cd GitHub/ +git clone https://github.com/luciangreen/List-Prolog-Package-Manager.git +cd List-Prolog-Package-Manager +swipl +['lppm']. +lppm_install("luciangreen","Prolog-to-List-Prolog"). +halt. +``` + +# Running Prolog to List Prolog + +* In Shell: +`cd Prolog-to-List-Prolog` +`swipl` +`['p2lpconverter.pl'].` + +Run: + +Convert Prolog code to List Prolog code by copying the Prolog algorithm into `test1.pl` and running: `p2lpconverter(S1),pp0(S1,S2),writeln(S2).` + +# Tests + +* Run `p2lp_test(A,B)` or `p2lp_test1(N,B)`, where N is the test number, from `p2lpverify.pl` in Prolog to List Prolog. 
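As a rough illustration of the conversion (assuming the shape shown in the converter's own comments, where `b(C,D)` becomes `[[n,b],[[v,c],[v,d]]]`), the head mapping can be sketched in Python; `to_list_prolog` is a hypothetical name for this sketch, not part of the repository:

```python
# Hypothetical sketch of the List Prolog head shape: predicate names are
# tagged [n, name] and variables are lower-cased and tagged [v, name].
def to_list_prolog(name, variables):
    return [["n", name.lower()], [["v", v.lower()] for v in variables]]
```

For example, `to_list_prolog("b", ["C", "D"])` mirrors how `b(C,D)` is rendered in List Prolog.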
+ +# Prolog to Simple List Prolog + +Install: +Load each file into SWI-Prolog using `['la_strings.pl'].` and `['p2slpconverter.pl'].` + +Run: + +Convert Prolog code to Simple List Prolog code by copying the Prolog algorithm into `test1.pl` and running: `p2slpconverter.` +Input: +``` +a. +b(C,D). +ef(G):-(h(I)->true;true),!. +``` +Note: the output uses `[a,*,*]` rather than `a`; this is Simple List Prolog. `[a,*,*]` is for inputting into the CAWPS predicate dictionary. + +Output: +``` +[[[a,*,*]],[[b,*,*],[c,d]],[[ef,*,*],[g],:-,[[[[h,*,*],[i]],->,true,or,true],!]]] +``` + +Pretty-print a List Prolog algorithm by typing e.g.: +``` +pp0([[[a,*,*]],[[b,*,*],[c,d]],[[ef,*,*],[g],(:-),[[[[h,*,*],[i]],->,true,or,true],!]]]). +``` +Output: +``` +[ +[[a,*,*]], +[[b,*,*],[c,d]], +[[ef,*,*],[g],(:-), +[ + [[[h,*,*],[i]],->,true,or,true], + ! +]] +] +``` + +* See also State Machine to List Prolog, which converts from the State Machine generated within SSI (a Prolog compiler written in Prolog) to List Prolog (a version of Prolog with algorithms written as lists). The State Machine may be operated on by, for example, optimisers. diff --git a/Prolog-to-List-Prolog-master/la_strings.pl b/Prolog-to-List-Prolog-master/la_strings.pl new file mode 100644 index 0000000000000000000000000000000000000000..778f427c8a676e4bd9bbd439c0d022d08bfe2a11 --- /dev/null +++ b/Prolog-to-List-Prolog-master/la_strings.pl @@ -0,0 +1,20 @@ +%% la_strings.pl + +:- use_module(library(pio)). +:- use_module(library(dcg/basics)). + +open_s(File,Mode,Stream) :- + atom_string(File1,File), + open(File1,Mode,Stream),!. + +string_atom(String,Atom) :- + atom_string(Atom,String),!. + +phrase_from_file_s(string(Output), String) :- + atom_string(String1,String), + phrase_from_file(string(Output), String1),!. + +writeln1(Term) :- + term_to_atom(Term,Atom), + writeln(Atom),!. 
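The Simple List Prolog shape documented in the README above (`a.` becoming `[[a,*,*]]` and `b(C,D).` becoming `[[b,*,*],[c,d]]`) can be sketched as follows; `to_slp` is a hypothetical helper for illustration only, not repository code:

```python
# Hypothetical sketch of the Simple List Prolog head shape: the functor is
# wrapped as [name, *, *] and the arguments are lower-cased into a second list.
def to_slp(name, args):
    clause = [[name.lower(), "*", "*"]]
    if args:  # a zero-arity fact like a. stays as [[a,*,*]] alone
        clause.append([a.lower() for a in args])
    return clause
```

Calling `to_slp("b", ["C", "D"])` mirrors the `[[b,*,*],[c,d]]` entry in the README's example output.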
+ \ No newline at end of file diff --git a/Prolog-to-List-Prolog-master/p2lpconverter.pl b/Prolog-to-List-Prolog-master/p2lpconverter.pl new file mode 100644 index 0000000000000000000000000000000000000000..c24f2875ff7f3a877ccf5db21eade7a558e66677 --- /dev/null +++ b/Prolog-to-List-Prolog-master/p2lpconverter.pl @@ -0,0 +1,1049 @@ +:-include('../listprologinterpreter/la_strings.pl'). +:-include('pretty_print.pl'). +:-include('pretty_print_lp2p.pl'). +:-include('p2lpverify.pl'). + +%% p2lpconverter(S1),pp0(S1,S2),writeln(S2). + +/**trace, +string_codes("a.\nb(C,D).\nef('A'):-(h(a)->true;true),!.",A),phrase(file(B),A),write(B). +**/ + +%% [[[n,a]],[[n,b],[[v,c],[v,d]]],[[n,ef],['A'],:-,[[[n,->],[[[n,h],[a]],[[n,true]],[[n,true]]]],[[n,cut]]]]] + + +:- use_module(library(pio)). +:- use_module(library(dcg/basics)). + +:- include('la_strings.pl'). + +:-dynamic keep_comments/1. + +init_keep_comments :- + (not(keep_comments(_) + )-> + (retractall(keep_comments(_)), + assertz(keep_comments([]))); + true),!. + +turn_keep_comments_on :- + retractall(keep_comments(_)), + assertz(keep_comments([percentage_comments,slash_star_comments%,newlines + ])),!. + +turn_keep_comments_off :- + retractall(keep_comments(_)), + assertz(keep_comments([])),!. + + +p2lpconverter([string,String],List3) :- + turn_keep_comments_on, + %File1="test1.pl", + string_codes(String,String1), + (phrase(file(List3),String1)->true;%(writeln(Error), + fail),!. + +p2lpconverter([file,File1],List3) :- + turn_keep_comments_on, + %File1="test1.pl", + readfile(File1,"test1.pl file read error.",List3),!. + + +p2lpconverter(List3) :- + turn_keep_comments_on, + File1="test1.pl", + readfile(File1,"test1.pl file read error.",List3),!. + +readfile(List1,_Error,List3) :- + %init_keep_comments, + phrase_from_file_s(string(List6), List1), + (phrase(file(List3),List6)->true;%(writeln(Error), + fail). + %writeln1(List3) . + +string(String) --> list(String). + +list([]) --> []. +list([L|Ls]) --> [L], list(Ls). 
+ +%file(N) --> newlines1(N),!. + +file(Ls2) --> newlines1(N1),predicate(L),newlines1(N2), +%{writeln1(L)}, +file(Ls), +%{writeln1(L)}, %%*** + {foldr(append,[N1, + L,N2, + Ls],Ls2)}, + %delete(Ls3,[],Ls2)}, + !. +file(Ls2) --> newlines1(Ls2),!. +file([]) --> [],!. + +/* +file(Ls2) --> newlines1(N),file(Ls), +%{writeln1(L)}, %%*** + {%foldr(append,[[L],N|Ls],Ls3), + delete([N|Ls],[],Ls2)}, +%{foldr(append,[N],Ls2)}, +%{writeln1(L)}, +!. +*/ +%%predicate([]) --> newlines1(_). + +predicate(A2) --> + ":-",newlines1(_),name1(Word11), + "(",newlines1(N1),varnames(Varnames),")", + newlines1(N2),%":-",%newlines1(N3),%{trace}, + %lines(L), + ".", + {foldr(append,[[[":-",[n,Word11],Varnames%N, + ]],N1,N2%,N3 + ],A2) + %delete(A,[],A2) + }. +predicate(A2) --> + ":-",newlines1(_),name1(Word11), + %"(", + newlines1(N1),name1(Word13),%varnames(Varnames),newlines1(_), + "/",newlines1(_),name1(Word12),%")", + newlines1(N2),%":-",%newlines1(N3),%{trace}, + %lines(L), + ".", + {foldr(append,[[[":-",[n,Word11],[Word13,"/",Word12]%,Varnames%N, + ]],N1,N2%,N3 + ],A2) + %delete(A,[],A2) + }. + +predicate(A) --> + name1(Word11), + ".", {A=[[[n,Word11]]] + }. +predicate(A) --> + name1(Word11), + "(",newlines1(N1),varnames(Varnames),%,newlines1(N2), + ")", + ".", {foldr(append,[[[[n,Word11],Varnames]],N1],A) + }. +predicate(A2) --> + name1(Word11), + "(",newlines1(N1),varnames(Varnames),")", + newlines1(N2),":-",newlines1(N3),%{trace}, + lines(L), ".", + {foldr(append,[[[[n,Word11],Varnames,":-",%N, + L]],N1,N2,N3 + ],A2) + %delete(A,[],A2) + }. +predicate(A2) --> + name1(Word11), + "(",newlines1(N1),varnames(Varnames),")", + newlines1(N2),"->",newlines1(N3), + lines(L), ".", + {foldr(append,[[[[n,Word11],Varnames,"->", + L]],N1,N2,N3],A2) + %delete(A,[],A2) + + }. +predicate(A2) --> + name1(Word11), + "(",newlines1(N1),varnames(Varnames),")", + newlines1(N2),"-->",newlines1(N3), + lines(L), ".", + {foldr(append,[[[[n,Word11],Varnames,"->",%N, + L]],N1,N2,N3],A2) + %delete(A,[],A2) + + }. 
+predicate(A2) --> + name1(Word11), + newlines1(N1),":-",newlines1(N2),%{trace}, + lines(L), ".", + {foldr(append,[[[[n,Word11],":-",%N, + L]],N1,N2],A2) + + %delete(A,[],A2) + + }. +predicate(A2) --> + name1(Word11), + newlines1(N1),"->",newlines1(N2), + lines(L), ".", + {foldr(append,[[[[n,Word11],"->",%N, + L]],N1,N2],A2) + + %delete(A,[],A2) + + }. +predicate(A2) --> + name1(Word11), + newlines1(N1),"-->",newlines1(N2), + lines(L), ".", + {foldr(append,[[[[n,Word11],"->",%N, + L]],N1,N2],A2) + + %delete(A,[],A2) + + }. + +/**name1([L3|Xs]) --> [X], {string_codes(L2,[X]),(char_type(X,alnum)->true;L2="_"),downcase_atom(L2,L3)}, name1(Xs), !. +name1([]) --> []. +**/ + + +name1(X1) --> name1a(X1). + +name1a(X1) --> name10(X11), + {(string_atom(X12,X11),number_string(X1,X12)->true; + ((%contains_string(X11)-> + string_atom2(X1,X11)%;X11=X1) + )))}.%%., X2->X1 {atom_string(X2,X1)}. + +name1a(X1) --> name2(X1). + + +name10(XXs) --> [X], + {char_code(Ch1,X),(char_type(X,alnum)->true;(Ch1='_'->true;( + Ch1='!'->true;Ch1='.'))), + atom_string(CA,Ch1),downcase_atom(CA,CA2)}, + name10(Xs), + {atom_concat(CA2,Xs,XXs)}, !. +name10(XXs) --> [X], + {char_code(Ch1,X),(char_type(X,alnum)->true;(Ch1='_'->true; + Ch1='!')), + atom_string(CA,Ch1),downcase_atom(CA,XXs)}, !. +%%name10('') --> []. + +name11(X1) --> %{trace}, +name101(X11), + {(string_atom(X12,X11),number_string(X1,X12)->true; + ((%contains_string(X11)-> + string_atom2(X1,X11)%;X11=X1) + )))}.%%., X2->X1 {atom_string(X2,X1)}. + +/** +\"'a'\" +\"\\\"a\\\"\" +'"a"' +'\\'a\\'' +**/ + +name101(XXs) --> "'",name1010(XXs1),"'", + {atom_concat_list(['\'',XXs1,'\''],XXs)}, !. + +name1010(XXs) --> [X],[Y], + {char_code(Ch1,X),char_code(Ch2,Y),%char_type(X,ascii),%->true;(Ch1='\''->true;(Ch1='"'->true;(Ch1='_'->true; + %Ch1='!'->true;Ch1='.')))), + Ch1='\\',Ch2='\'', + atom_string(CA2,Ch1),atom_string(CA22,Ch2)},%%downcase_atom(CA,CA2)}, + name1010(Xs), + {atom_concat_list([CA2,CA22,Xs],XXs)}, !. 
+name1010(XXs) --> [X], [Y], + {char_code(Ch1,X),char_code(Ch2,Y),%char_type(X,ascii),%->true;(Ch1='\''->true;(Ch1='"'->true;(Ch1='_'->true; + %Ch1='!'->true;Ch1='.')))), + Ch1='\\',Ch2='\'', + atom_string(CA2,Ch1),atom_string(CA22,Ch2)},%%downcase_atom(CA,CA2)}, + %name101(Xs), + {atom_concat_list([CA2,CA22,''],XXs)}, !. + +name1010(XXs) --> [X], +{char_code(Ch1,X),%char_type(X,ascii),%->true;(Ch1='\''->true;(Ch1='"'->true;(Ch1='_'->true; + %Ch1='!'->true;Ch1='.')))), + not(Ch1='\''),%not(Ch1='('), + atom_string(CA2,Ch1)},%%downcase_atom(CA,CA2)}, + name1010(Xs), + {atom_concat(CA2,Xs,XXs)}, !. +name1010(XXs) --> [X], + {char_code(Ch1,X),%char_type(X,ascii),%->true;(Ch1='\''->true;(Ch1='"'->true;(Ch1='_'->true; + %Ch1='!'->true;Ch1='.')))), + not(Ch1='\''),%not(Ch1='('), + atom_string(CA2,Ch1)},%%downcase_atom(CA,CA2)}, + %name101(Xs), + {atom_concat(CA2,'',XXs)}, !. + + +name101(XXs) --> "\"",name1011(XXs1),"\"", + {atom_concat_list(['"',XXs1,'"'],XXs)}, !. + + /* +name1011(XXs) --> [X], + {char_code(Ch1,X),%char_code(Ch2,Y),%char_type(X,ascii),%->true;(Ch1='\''->true;(Ch1='"'->true;(Ch1='_'->true; + %Ch1='!'->true;Ch1='.')))), + Ch1='"',%trace,%Ch2='"', + atom_string(CA2,Ch1)%,atom_string(CA22,Ch2) + },%%downcase_atom(CA,CA2)}, + %name1011(Xs), + {atom_concat_list([%'\'',CA2 + ],XXs)}, !. +*/ +name1011(XXs) --> [X],[Y], + {char_code(Ch1,X),char_code(Ch2,Y),%char_type(X,ascii),%->true;(Ch1='\''->true;(Ch1='"'->true;(Ch1='_'->true; + %Ch1='!'->true;Ch1='.')))), + Ch1='\\',Ch2='"', + atom_string(CA2,Ch1),atom_string(CA22,Ch2)},%%downcase_atom(CA,CA2)}, + name1011(Xs), + {atom_concat_list([CA2,CA22,Xs],XXs)}, !. +name1011(XXs) --> [X], [Y], + {char_code(Ch1,X),char_code(Ch2,Y),%char_type(X,ascii),%->true;(Ch1='\''->true;(Ch1='"'->true;(Ch1='_'->true; + %Ch1='!'->true;Ch1='.')))), + Ch1='\\',Ch2='"', + atom_string(CA2,Ch1),atom_string(CA22,Ch2)},%%downcase_atom(CA,CA2)}, + %name101(Xs), + {atom_concat_list([CA2,CA22,''],XXs)}, !. 
+ +name1011(XXs) --> [X], + {char_code(Ch1,X),%char_type(X,ascii),%->true;(Ch1='\''->true;(Ch1='"'->true;(Ch1='_'->true; + %Ch1='!'->true;Ch1='.')))), + not(Ch1='"'),%not(Ch1='('), + atom_string(CA2,Ch1)},%%downcase_atom(CA,CA2)}, + name1011(Xs), + {atom_concat(CA2,Xs,XXs)}, !. +name1011(XXs) --> [X], + {char_code(Ch1,X),%char_type(X,ascii),%->true;(Ch1='\''->true;(Ch1='"'->true;(Ch1='_'->true; + %Ch1='!'->true;Ch1='.')))), + not(Ch1='"'),%not(Ch1='('), + atom_string(CA2,Ch1)},%%downcase_atom(CA,CA2)}, + %name101(Xs), + {atom_concat(CA2,'',XXs)}, !. + + + +name101(XXs) --> name1012(XXs1), + {atom_concat_list([XXs1],XXs)}, !. + +name1012(XXs) --> + [X], + lookahead2([',',')',']','|' + ]), + {char_code(Ch1,X),%char_type(X,ascii), + %->true;(Ch1='\''->true;(Ch1='"'->true;(Ch1='_'->true; + %Ch1='!'->true;Ch1='.')))), + not(Ch1='['),not(Ch1=']'),not(Ch1='('), + atom_string(CA2,Ch1)},%%downcase_atom(CA,CA2)}, + %name101(Xs), + {atom_concat(CA2,'',XXs)}, !. + + +name1012(XXs) --> %{trace}, + [X], %lookahead2([',',')'%,']' + %]), + %{trace}, + %lookahead3(A), + {%char_code(ChA,A),not(ChA=','),not(ChA=')'), + char_code(Ch1,X),%char_type(X,ascii), + %->true;(Ch1='\''->true;(Ch1='"'->true;(Ch1='_'->true; + %Ch1='!'->true;Ch1='.')))), + %not(Ch1=','), + not(Ch1='['),not(Ch1=']'),not(Ch1='('), + atom_string(CA2,Ch1)},%%downcase_atom(CA,CA2)}, + name1012(Xs), + {atom_concat(CA2,Xs,XXs)}, !. + + +name2(X1) --> name20(X1).%%, {atom_string(X2,X1)}. + + +name20(XXs) --> [X], %{trace}, +%lookahead(Y), + {char_code(Ch1,X),%%char_type(X,alnum)->true; + %trace, + %writeln(Y), + %(Ch1='-'->trace;true), + %((Ch1='-',[Y]=`>`)->fail;( + (Ch1='+'->true;(Ch1='-'->true;(Ch1='*'->true; + (Ch1='/'->true;(Ch1='<'->true;(Ch1='>'->true; + (Ch1='='))))))),atom_string(CA,Ch1), + downcase_atom(CA,CA2)}, + name20(Xs), + {atom_concat(CA2,Xs,XXs)}, !. 
+name20(XXs) --> [X], + {char_code(Ch1,X),%%(char_type(X,alnum)->true; + (Ch1='+'->true;(Ch1='-'->true;(Ch1='*'->true; + (Ch1='/'->true;(Ch1='<'->true;(Ch1='>'->true; + (Ch1='='))))))), + atom_string(CA,Ch1),downcase_atom(CA,XXs)}, !. +%%name20('') --> []. + +/** +name2([L3|Xs]) --> [X], {(char_type(X,alnum)->true; + char_type(X,punct)),string_codes(L2,[X]),downcase_atom(L2,L3)}, name2(Xs), !. +name2([]) --> [].**/ + +%%spaces1(_) --> [X], {char_type(X,space)},!. +%spaces1([X|Xs]) --> [X], {char_type(X,space)}, spaces1(Xs), !. +%spaces1([]) --> []. +%spaces1(_)-->newlines1(_). +/** +varnames([L1|Ls]) --> varname1(L1),",", %%{writeln(L)}, %%*** +varnames(Ls), !. +varnames([L1]) --> varname1(L1), +!. +**/ + +varnames01(L1) --> %{trace}, +"[",newlines1(_N1), +varnames0(L2),newlines1(_), +"]",newlines1(_N2), +{%foldr(append,[[L2],N1,N2],L1) +L1=L2}, +!. + +/* +varnames01(L1) --> {trace}, +"[",newlines1(N1),varnames0(L2),%,newlines1(_), +"]",newlines1(N2), +{foldr(append,[[L2],N1,N2],L1)}, +!. +*/ + +varnames01(L1) --> varname1(L2), + {L1=L2},!. + +/* +varnames01(L1) --> %{trace}, +"[",%newlines1(N1), +varname1(L2),%,newlines1(_), +"]",%newlines1(N2), +{foldr(append,[[L2]],L1)}, +!. +*/ + +varnames(L3) --> %{trace}, +"[",%newlines1(N1), +varnames0(L1),%newlines1(_), +"]",%newlines1(N2), +",", +%newlines1(N3), +varnames(L2), + {foldr(append,[[L1],L2],L3) + %foldr(append,[[L4]],L3) + }, + %{append([L1],L2,L3)}, + !. +% {maplist(append,[[[[L1],L2]]],[[L3]])},!. + + +varnames(L3) --> %{trace}, +"[",newlines1(_),varnames0(L1),newlines1(_),"]",newlines1(_),"|",varnames(L2),newlines1(_), + {maplist(append,[[[[L1],"|",L2]]],[L3])},!. + + + +varnames(L1) --> %{trace}, +"[",newlines1(_),varnames0(L2),newlines1(_),"]",newlines1(_), +{L1 = [L2]}, +!. + + +/* +varnames(L1) --> %{trace}, +%"[","]",",","[","]", +varnames0(L0), +newlines1(_),spaces1(_),varnames0(L2), + {append(L0,L2,L1)},!. 
+ */ + +varnames(L3) --> %{trace}, +%"[", +newlines1(_),varnames0(L3),newlines1(_),%"]", +%",", +%newlines1(_),spaces1(_),varnames(L2), + %{append(L1,L2,L3)}, + !. + +varnames(L1) --> %{trace}, +"[",newlines1(_),"]",newlines1(_),",",newlines1(_),varnames(L2), + {append([[]],L2,L1)},!. + +varnames(L1) --> %{trace}, +"[",newlines1(_),"]",newlines1(_),"|",newlines1(_),varnames(L2), + {maplist(append,[[[[[]],"|",L2]]],[L1])},!. + + + +varnames(L1) --> %{trace}, +"[",newlines1(_),"]", newlines1(_), +{L1 = []}, +!. + +varnames(L1) --> %{trace}, +varnames0(L1), +!. + +varnames0(L1) --> varname1(L2),%{trace}, +lookahead1,%{notrace}, + {L1=[L2]},!. + +varnames0(Ls2) --> %{trace}, +varname1(L1),newlines1(_),",", newlines1(_),%%{writeln(L)}, %%*** + varnames0(Ls), newlines1(_), + {append([L1],Ls,Ls2)},!. +% {maplist(append,[[[L1,Ls]]],[[Ls2]])},!. + +varnames0(Ls2) --> varname1(L1),newlines1(_),"|", newlines1(_),%%{writeln(L)}, %%*** + varnames0([Ls]), newlines1(_), + {append_list([L1,"|",Ls],Ls2)},!. +% {maplist(append,[[[L1,"|",Ls]]],[Ls2])},!. + +varnames3(L1) --> varname1(L2),%{trace}, +%lookahead1,%{notrace}, + {L1=L2},!. + +varnames3(L1) --> varnames(L1),%{trace}, +%lookahead1,%{notrace}, + %{L1=[L2]}, + !. + +lookahead1(A,A) :- append(`]`,_,A). +lookahead1(A,A) :- append(`)`,_,A). +%lookahead1(A,A) :- append(`.`,_,A). +%lookahead1(A,A) :- append(`,`,_,A). +%lookahead1(A,A) :- append(`|`,_,A). + +%lookahead3(A,A) :- lookahead1(A,A) +%lookahead3(A,A) :- append(`,`,_,A). + +%lookahead(_,[],[]) :-!. +lookahead(B2,A,A) :- +%trace, + %member(B,B1), + %string_codes(B,B2), + append([B2],_D,A),!. + +lookahead2(B1,A,A):- +%trace, + member(B,B1), + string_codes(B,B2), + append(B2,_D,A). + +varname1([]) --> "[",newlines1(_),"]",newlines1(_). 
%%{writeln(L)}, %%*** +varname1(L4) --> %{trace}, +name11(L1), newlines1(_),%%{writeln(L)}, %%*** +{%trace,%%atom_string(L1,L10),string_codes(L2,L10), +(((string(L1)->true;(atom_concat(A,_,L1),atom_length(A,1),(not(is_upper(A)),not(A='_'))))->L4=L1;(downcase_atom(%%L2 +L1,L3),L4=[v,L3])))%%L3A + +%%,term_to_atom(L3A,L4)%%,atom_string(L4A,L4) +}. +varname1(L4) --> "(",newlines1(_),line(L4),newlines1(_),")",newlines1(_). +varname1(L1) --> +"[",newlines1(_),varnames0(L2),newlines1(_),"]",newlines1(_), +{L1 = L2},!. + +varname1(A) --> varname_term(A). +%varnames(A) --> varnames_term(B),","varnames(C),[]. + +varname_term(A) --> {%trace, +true},name1(Word11),{%(Word11=phrase_from_file->trace;true)% +%writeln1(Word11) +true +}, + "(",newlines1(_),varnames(Varnames),")", + newlines1(_), {A=[[n,Word11],Varnames]%,trace + },!. + +%comment([]) --> []. +comment(X1) --> %spaces1(_), +[X], {char_code('%',X)},comment1(Xs), {append([X],Xs,X2),string_codes(X3,X2),X1=[[[n,comment],[X3]]]},!. + +%comment1([]) --> [], !. +comment1([X|Xs]) --> %{trace}, +[X], %lookahead(_A), +{not(char_type(X,newline))%,not(A=[]) +}, comment1(Xs), !. +%%newlines1([X]) --> [X], {char_type(X,newline)},!. +%comment1([X]) --> [X], %lookahead(A), +%!. +%{(char_type(X,newline)->true;A=[])}, !. +comment1([]) --> [X], lookahead(A), {(char_type(X,newline)->true;A=[])}, !. +comment1([]) --> [], !. + + +comment2(X1) --> %spaces1(_), +[XA],[XB], {char_code('/',XA),char_code('*',XB)},comment3(Xs), {flatten([XA,XB|Xs],X4),%foldr(append,X4,X2), +string_codes(X3,X4),X1=[[[n,comment],[X3]]]},!. + +%comment1([]) --> [], !. +comment3([XA|Xs]) --> [XA],%[XB], + lookahead(XB),{not((char_code('*',XA),char_code('/',XB)))}, comment3(Xs), !. +%%newlines1([X]) --> [X], {char_type(X,newline)},!. +comment3([XA,XB]) --> [XA],[XB], {char_code('*',XA),char_code('/',XB)}, !. + + +%newlines1(X3) --> newlines0(X),newlines1(X2),{append(X,X2,X3)},!. +/* +newlines1(X) --> newlines0(X). 
+newlines0(X) --> newlines11(X2), {%(not(X2=[])-> +X=[[[n,newlines],[X2]]]%;X=[]) +},!. +newlines0(X) --> newlines12(X2), {X=[[[n,percentage_comments],[X2]]]},!. +newlines0(X) --> newlines13(X2), {X=[[[n,slash_star_comments],[X2]]]},!. +newlines00(X) --> newlines11(X). +newlines00(X) --> newlines12(X). +newlines00(X) --> newlines13(X). +*/ +newlines1(Xs) --> [X], {char_type(X,space)}, newlines1(Xs), +%{%keep_comments(Y), +%(%true%member(newlines,Y) +%->append([X],Xs,Xs2);Xs2=[]) +%}, +!. +newlines1(Xs2 +) --> %{trace}, +comment(X), newlines1(Xs), %{append([X],Xs,Xs2)}, +{keep_comments(Y),(member(percentage_comments,Y)->append(X,Xs,Xs2);Xs2=[])}, +!. +newlines1(Xs2 +) --> comment2(X), newlines1(Xs), %{append([X],Xs,Xs2)}, +{keep_comments(Y),(member(slash_star_comments,Y)->append(X,Xs,Xs2);Xs2=[])}, +!. +%%newlines1([X]) --> [X], {char_type(X,newline)},!. +%newlines1([]) --> [],%lookahead([]), +%!. +newlines1([]) --> [],%lookahead([]), +!. + +/* +newlines11([]) --> [],%lookahead([]), +!. +newlines12([]) --> [],%lookahead([]), +!. +newlines13([]) --> [],%lookahead([]), +!. +*/ + +/** +was comments +newlines1([]) --> "%", comments(_), "\n". +newlines1([]) --> "/","*", commentsa(_), "*","/". +newlines1([]) --> newlines1(_). + +comments([L|Ls]) --> comments2(L), +%%{writeln(L)}, %%*** +comments(Ls), !. +comments([L]) --> comments2(L), +%%{writeln(L)}, +!. + +comments2(_) --> spaces1(_),name1(_).%%[X], {string_codes(X1,[X]), not(X1="\n")}. + + +commentsa([L|Ls]) --> comments3(L), +%%{writeln(L)}, %%*** +commentsa(Ls), !. +commentsa([L]) --> comments3(L), +%%{writeln(L)}, +!. + +comments3(_) --> spaces1(_),name1(_).%%[X], [Y], {string_codes(X1,[X]), + %%string_codes(Y1,[Y]), not((X1="*",Y1="/"))}. + +**/ +comma_or_semicolon --> ",". +comma_or_semicolon --> ";"%,{trace} +. 
+ +lines(Ls2) --> %{trace},%newlines1(_), +newlines1(_),line(L),newlines1(N1),comma_or_semicolon,%",", +newlines1(N2), +%{writeln(L)}, %%*** +lines(Ls), %trace, +newlines1(N3), +%{delete([L,N|Ls],[],Ls2)}, !. +%lines(Ls2) --> line(L),",",newlines1(N), +%%{writeln(L)}, %%*** +%lines(Ls), +{foldr(append,[[L],N1,N2, +Ls,N3 +],Ls2%[],Ls2 +)}, !. +lines([L]) --> line(L), +%%{writeln(L)}, +!. +varname_or_names(Varnames1) --> varnames([Varnames1]). +varname_or_names(Varname) --> varname1(Varname). + +% maplist(append,[[[40],B1,[41]]],[B12]), % "()" + +%varname1 +line(A) -->%{trace}, + varname_or_names(Varnames1),newlines1(_),"=",newlines1(_), + varname_or_names(Varnames2),%newlines1(_), + {A=[[n,equals4],[Varnames1,Varnames2]] + }. +/* +line(A) -->%{trace}, + "true",%newlines1(_), + {trace}, + {A=[[n,true]] + }. +*/ + +line(A) --> %%spaces1(_), + name1(Word11), %% name(A,B,C) + {%trace, + Word11=not}, + "(",newlines1(_), + lines(Lines),")", + {A=[[n,Word11],Lines]},!. + +line(A) --> %%spaces1(_), + name1(Word11),newlines1(_), %% name(A,B,C). + %{trace,(Word11="->"->trace;true)}, + {%trace, + not(Word11=findall)}, + "(",newlines1(_),varnames(Varnames),")", + {A=[[n,Word11],Varnames]},!. +line(A) --> %%spaces1(_), + name1(Variable1),newlines1(_),{%trace, + not(Variable1=findall)}, + newlines1(_), %% A = B*Y + (name1(_Is)|name2(_Equals)), newlines1(_), + name1(Variable2), newlines1(_), + name2(Operator),newlines1(_), + {%trace, + not(Operator=(->))}, + name1(Variable3), %newlines1(_), + { %% A=B*Y + v_if_string_or_atom(Variable2,Variable2a), + v_if_string_or_atom(Variable3,Variable3a), + v_if_string_or_atom(Variable1,Variable1a), + A=[[n,Operator],[Variable2a,Variable3a,Variable1a]]},!. +line(A) --> %%spaces1(_), + name1(Word10),{%trace, + not(Word10=findall)}, + newlines1(_), %% A is B + name2(Word21), newlines1(_), name1(Word11), + {v_if_string_or_atom(Word10,Word10a), + v_if_string_or_atom(Word11,Word11a), + A=[[n,Word21],[Word10a,Word11a]]},!. 
+line(A) --> %%spaces1(_), + name1(Word10),{%trace, + not(Word10=findall)}, + newlines1(_), %% A is B + name1(Word21), newlines1(_), name1(Word11), + {v_if_string_or_atom(Word10,Word10a), + v_if_string_or_atom(Word11,Word11a), + A=[[n,Word21],[Word10a,Word11a]]},!. +/*line(A) --> %%spaces1(_), + name1(Word10),{%trace, + not(Word10=findall)}, + %spaces1(_), + %% A = [B,C] + "is", + %name2(Word21), + %spaces1(_), + name1(Word11), + {v_if_string_or_atom(Word10,Word10a), + v_if_string_or_atom(Word11,Word11a), + A=[[n,=],[Word10a,Word11a]]},!. + */ +line(A) --> %%spaces1(_), + name1(Word10),{%trace, + not(Word10=findall)}, + %spaces1(_), + %% A = [B,C] + "=", + %name2(Word21), + %spaces1(_), + "[", + "]", + {v_if_string_or_atom(Word10,Word10a), + %v_if_string_or_atom(Word11,Word11a), + %v_if_string_or_atom(Word12,Word12a), + A=[[n,equals4],[Word10a,[]]]},!. +line(A) --> %%spaces1(_), + name1(Word10),{%trace, + not(Word10=findall)}, + %spaces1(_), + %% A = [B,C] + "=", + %name2(Word21), + %spaces1(_), + "\"\"", + {v_if_string_or_atom(Word10,Word10a), + %v_if_string_or_atom(Word11,Word11a), + %v_if_string_or_atom(Word12,Word12a), + A=[[n,equals4],[Word10a,""]]},!. +line(A) --> %%spaces1(_), + %{trace}, + name1(Word10),newlines1(_),{%trace, + not(Word10=findall)}, + %spaces1(_), + %% A = [B,C] + "=",newlines1(_), + %name2(Word21), + %spaces1(_), + "[",newlines1(_), + varnames(Word11),newlines1(_),"]",%newlines1(_), + {v_if_string_or_atom(Word10,Word10a), + %v_if_string_or_atom(Word11,Word11a), + %v_if_string_or_atom(Word12,Word12a), + A=[[n,equals4],[Word10a,Word11]]},!. +line(A) --> %%spaces1(_), + %{trace}, + %name1(Word10),{%trace, + %not(Word10=findall)}, + %spaces1(_), + %% A = [B,C] + %"=", + %name2(Word21), + %spaces1(_), + "[",newlines1(_), + varnames(Word11),newlines1(_),"]",%newlines1(_), + {%v_if_string_or_atom(Word10,Word10a), + %v_if_string_or_atom(Word11,Word11a), + %v_if_string_or_atom(Word12,Word12a), + A=Word11},!. 
+line(A) --> %%spaces1(_), + %{trace}, + %name1(Word10),{%trace, + %not(Word10=findall)}, + %spaces1(_), + %% A = [B,C] + %"=", + %name2(Word21), + %spaces1(_), + "[",newlines1(_), + %varnames(Word11), + "]",%newlines1(_), + {%v_if_string_or_atom(Word10,Word10a), + %v_if_string_or_atom(Word11,Word11a), + %v_if_string_or_atom(Word12,Word12a), + A=[]},!. + +line(A) --> %%spaces1(_), + name1(Word10),{%trace, + not(Word10=findall)}, + %spaces1(_), + %% A = [B,C] + "=", + %name2(Word21), + %spaces1(_), + "[",name1(Word11), + "|",name1(Word12), + "]", + {v_if_string_or_atom(Word10,Word10a), + v_if_string_or_atom(Word11,Word11a), + v_if_string_or_atom(Word12,Word12a), + A=[[n,equals4],[Word10a,[Word11a,"|",Word12a]]]},!. +line(A) --> %%spaces1(_), + name1(Word10),newlines1(_),{%trace, + not(Word10=findall)}, + %spaces1(_), + %% A = [B,C] + "=",newlines1(_), + %name2(Word21), + %spaces1(_), + varnames3(Word11),%newlines1(_), + {v_if_string_or_atom(Word10,Word10a), + %v_if_string_or_atom(Word11,Word11a), + A=[[n,=],[Word10a,Word11]]},!. +line(A) --> %%spaces1(_), + %{trace}, + name1(Word10),{%trace, + not(Word10=findall)}, + %spaces1(_), + %% A=B + "=", + %name2(Word21), + %spaces1(_), + name1(Word11), + {v_if_string_or_atom(Word10,Word10a), + v_if_string_or_atom(Word11,Word11a), + A=[[n,=],[Word10a,Word11a]]},!. + +line(A) --> %%spaces1(_), + name1(Word11), newlines1(_),%% name(A,B,C) + {%trace, + Word11=findall}, + "(",newlines1(_), + varnames01(Varnames),newlines1(_),",",newlines1(_), + "(",newlines1(_),lines(A1),newlines1(_),")",newlines1(_),",", + newlines1(_), + varnames01(Varnames2),newlines1(_),")",%newlines1(_), + {A=[[n,Word11],[Varnames,A1,Varnames2]]},!. 
+ +line(A) --> %%spaces1(_), + name1(Word11), newlines1(_),%% name(A,B,C) + {%trace, + Word11=findall}, + "(",newlines1(_),varnames01(Varnames),newlines1(_),",",newlines1(_), + line(A1),newlines1(_),",",newlines1(_), + varnames01(Varnames2),newlines1(_),")",%newlines1(_), + {A=[[n,Word11],[Varnames,A1,Varnames2]]},!. + +line(Word1) --> + "{",line(Word2),"}",{Word1=[[n,code],Word2]},!. + +line(Word1) --> + "(",newlines1(_),line(Word2),")",{Word1=[Word2]},!. +line(Word1) -->%{trace}, + "(",newlines1(_),line(Word2),newlines1(_),"->",newlines1(_), + line(Word3),newlines1(_),";",newlines1(_N2), + line(Word4),newlines1(_N3),")", + {%(Word4=[[[[n,_]|_]|_]|_]->Word4=[Word41];Word4=Word41), + %if_one_item_then_remove_brackets() + Word1=[[n,"->"],[Word2,Word3,Word4]]},!. +line(Word1) --> + "(",newlines1(_), + line(Word2),newlines1(_N21),"->",newlines1(_N2),line(Word3),newlines1(_N3),")", + {Word1=[[n,"->"],[Word2,Word3]]},!. +line(Word1) --> + "(",newlines1(_), + line(Word2),";",newlines1(_N2),line(Word3),")", + {Word1=[[n,or],[Word2,Word3]]},!. +line([[n,cut]]) --> %%spaces1(_), + name1(Word), + {not(Word=findall)}, + {Word=!},!. +line([[n,Word]]) --> %%spaces1(_), + name1(Word),{%trace, + not(Word=findall)}. +%% {Word=true},!. + +line(Word1) --> + "(",newlines1(_), + lines(Word2),")", + {Word1=Word2},!. +line(Word1) --> + "{",newlines1(_),lines(Word2),"}", + {Word1=[[n,code],Word2]},!. + +%%a(_) -->",". +%%line([]) --> newlines1(_),!. + + +%% **** Pretty Print + +/** + +?- pp0([[[a,*,*]],[[b,*,*],[c,d]],[[ef,*,*],[g],(:-),[[[[h,*,*],[i]],->,true,or,true],!]]]). +[ +[[a,*,*]], +[[b,*,*],[c,d]], +[[ef,*,*],[g],(:-), +[ + [[[h,*,*],[i]],->,true,or,true], + ! +]] +] +**/ +/** +concat_list(A,[],A):-!. +concat_list(A,List,B) :- + List=[Item|Items], + string_concat(A,Item,C), + concat_list(C,Items,B). +**/ + + + +/** +pp3([]) :- !. +pp3(List1) :- + List1=[List2|List3], + writeln1(List2), + pp3(List3). 
+**/ + +/** +concat_list_term(List,String) :- +%trace, + findall(A,(member(Item,List), + %trace, + (atom(Item) -> Item=A; + term_to_atom(Item,A)) + %notrace + ),List1), + concat_list(List1,String). +**/ + +contains_string(Atom,String) :- + string_concat(A,B,Atom), + string_length(A,1), + A="\"", + string_concat(String,C,B), + string_length(C,1), + C="\"",!. + +% remove " if string, leave as atom if atom + +string_atom4(String,Atom) :- %trace, +((atom(String)->( +% string_atom(Atom,String),!. + +term_to_atom(A,Atom), +(var(A)->String=Atom; +atom_string(A,String)) +);String=Atom)->((writeln1(string_atom2(String,Atom))));(writeln1(string_atom2(String,Atom)),false)). + +% \" \\\"\" -> " \"" +% or '" \\""' -> '" \""' + +string_atom2(String1,Atom1) :- +%writeln1(string_atom2(String1,Atom1)), + contains_string(Atom1,String2),%trace, + %foldr(string_concat,String2,String1), +%trace, + %String1=Atom1, + %string_atom(String1,String2), + %string_strings(Atom1,A), + %append([_],A1,A), + %append(A2,[_],A1), + %foldr(string_concat,A2,String1), + + %delete1_p2lp(Atom1,"\"",String1), + atomic_list_concat(A,"\\",String2), + atomic_list_concat(A,"",Atom2), + atom_string(Atom2,String1), + + %string_atom(String1,String2), + %replace(String2,"'","#",String1), + %string_atom(String1,String2), + !. +string_atom2(String1,Atom1) :- + atom(Atom1),%String1=Atom1, + %trace, + %replace(Atom1,"\"","&",String2), + delete1_p2lp(Atom1,"'",%"#", + String3), + + %string_strings(Atom1,A), + %append([_],A1,A), + %append(A2,[_],A1), + %foldr(string_concat,A2,String3), + + string_atom(String3,String1), + !. + +replace(A1,Find,Replace,F) :- +string_concat("%",A1,A2), +string_concat(A2,"%",A), split_string(A,Find,Find,B),findall([C,Replace],(member(C,B)),D),maplist(append,[D],[E]),concat_list(E,F1),string_concat(F2,G,F1),string_length(G,1), + string_concat("%",F3,F2), + string_concat(F,"%",F3). 
+ +v_if_string_or_atom(String_or_atom,V) :- + (((string(String_or_atom)->true; + atom(String_or_atom)), + %trace, + string_concat(A,_,String_or_atom), + string_length(A,1) + %,is_upper(A) + )-> + V=[v,String_or_atom]; + V=String_or_atom),!. + +delete1_p2lp(A ,Find,F) :- +%writeln1(delete1_p2lp(A ,Find,F)), +%string_concat("%",A1,A2), +%string_concat(A2,"%",A), +%trace, +string_strings(A,B), +(append([Find],C,B)->true;C=B), +(append(D,[Find],C)->true;D=C), +foldr(string_concat,D,F). %split_string(A,Find,"",B),%findall([C,Replace],(member(C,B)),D), + %maplist(append,[[B]],[E]),concat_list(E,F).%,string_concat(F,G,F1),string_length(G,1). + %string_concat("%",F3,F2), + %string_concat(F,"%",F3). diff --git a/Prolog-to-List-Prolog-master/p2lpverify.pl b/Prolog-to-List-Prolog-master/p2lpverify.pl new file mode 100644 index 0000000000000000000000000000000000000000..9756853b9fd842e713e174b3a48f00ba2fab76e9 --- /dev/null +++ b/Prolog-to-List-Prolog-master/p2lpverify.pl @@ -0,0 +1,548 @@ +%% p2lp_test(Total,Score). + +%%:- use_module(library(time)). + +%% Test cases, Debug=trace=on or off, NTotal=output=total cases, Score=output=result + +p2lp_test(NTotal,Score) :- p2lp_test(0,NTotal,0,Score),!. +p2lp_test(NTotal,NTotal,Score,Score) :- + findall(_,p2lp_test(_,_,_),Ns),length(Ns,NL),NL=NTotal, !. +p2lp_test(NTotal1,NTotal2,Score1,Score2) :- + NTotal3 is NTotal1+1, + p2lp_test(NTotal3,In,Out), + ((p2lpconverter([string,In],Result1), + %writeln1([result1,Result1]), + Out=Result1 + )->(Score3 is Score1+1,writeln0([p2lp_test,NTotal3,passed]));(Score3=Score1,writeln0([p2lp_test,NTotal3,failed]))), + writeln0(""), + p2lp_test(NTotal3,NTotal2,Score3,Score2),!. + +%% Test individual cases, Debug=trace=on or off, N=case number, Passed=output=result + +p2lp_test1(N,Passed) :- + p2lp_test(N,In,Out), + ((p2lpconverter([string,In],Result1), + %writeln1([result1,Result1]), + Out=Result1 + )->(Passed=passed,writeln0([p2lp_test,N,passed]));(Passed=failed,writeln0([p2lp_test,N,failed]))),!. 
+ + +p2lp_test(1,":-use_module(library(date)).", +[ +[":-",[n,use_module],[[[n,library],[date]]]] +] +). + +p2lp_test(2,":-include('luciancicd.pl').", +[ +[":-",[n,include],['luciancicd.pl']] +] +). + +p2lp_test(3,":-include('../Prolog-to-List-Prolog/p2lpconverter.pl').", +[ +[":-",[n,include],['../Prolog-to-List-Prolog/p2lpconverter.pl']] +] +). + +p2lp_test(4,"use_module(library(pio)).", +[ +[[n,use_module],[[[n,library],[pio]]]] +] +). + +p2lp_test(5,"use_module(library(dcg/basics)).", +[ +[[n,use_module],[[[n,library],['dcg/basics']]]] +] +). + +p2lp_test(6,":-dynamic keep_comments/1.", +[ +[":-",[n,dynamic],[keep_comments,"/",1]] +] +). + +p2lp_test(7,"a:-a.", +[ +[[n,a],":-", +[ + [[n,a]] +]] +] +). + +p2lp_test(8,"a(A,B). +/*1*/ +%2", +[ +[[n,a],[[v,a],[v,b]]], +[[n,comment],["/*1*/"]], +[[n,comment],["%2"]] +] +). + +p2lp_test(9,"a:-d(E). +/*2*/ +/*3*/", +[[[n, a], ":-", [[[n, d], [[v, e]]]]], +[[n, comment], ["/*2*/"]], +[[n, comment], ["/*3*/"]]] + + +). + +p2lp_test(10,"a(B,C)-->d(E). +/*1*/ +/*2*/", +[ +[[n,a],[[v,b],[v,c]],"->", +[ + [[n,d],[[v,e]]] +]], +[[n,comment],["/*1*/"]], +[[n,comment],["/*2*/"]] +] +). + +p2lp_test(11,"a:-d(E). +/*2*/ +/*3*/", +[ +[[n,a],":-", +[ + [[n,d],[[v,e]]] +]], +[[n,comment],["/*2*/"]], +[[n,comment],["/*3*/"]] +] +). + +p2lp_test(12,"a-->d(E). +/*2*/ +/*3*/", +[ +[[n,a],"->", +[ + [[n,d],[[v,e]]] +]], +[[n,comment],["/*2*/"]], +[[n,comment],["/*3*/"]] +] +). + +p2lp_test(13,"a-->a.", +[ +[[n,a],"->", +[ + [[n,a]] +]] +] +). + +p2lp_test(14,"a-->d(E). +/*2*/ +/*3*/", +[ +[[n,a],"->", +[ + [[n,d],[[v,e]]] +]], +[[n,comment],["/*2*/"]], +[[n,comment],["/*3*/"]] +] +). + +p2lp_test(15,"a:-b([A,B],1,[C]).", +[ +[[n,a],":-", +[ + [[n,b],[[[v,a],[v,b]],1,[[v,c]]]] +]] +] +). + +p2lp_test(16,"a. +/*1*/", +[ +[[n,a]], +[[n,comment],["/*1*/"]] +] +). + +p2lp_test(17,"b:-c.", +[ +[[n,b],":-", +[ + [[n,c]] +]] +] +). + +p2lp_test(18,"a:-b,c,d. 
+/*1*/ +%2", +[ +[[n,a],":-", +[ + [[n,b]], + [[n,c]], + [[n,d]] +]], +[[n,comment],["/*1*/"]], +[[n,comment],["%2"]] +] +). + +p2lp_test(19,"a(A):-[]=[[]].", +[ +[[n,a],[[v,a]],":-", +[ + [[n,equals4],[[],[[]]]] +]] +] +). + +p2lp_test(20,"a(A):-([]=[[]]).", +[ +[[n,a],[[v,a]],":-", +[ + [ + [[n,equals4],[[],[[]]]] + ] +]] +] +). + +p2lp_test(21,"a(A):-([1]=A).", +[ +[[n,a],[[v,a]],":-", +[ + [ + [[n,equals4],[[1],[v,a]]] + ] +]] +] +). + +p2lp_test(22,"a(A):-(B=A).", +[ +[[n,a],[[v,a]],":-", +[ + [ + [[n,=],[[v,b],[v,a]]] + ] +]] +] +). + +p2lp_test(23,"a(A):-([1]=[1]).", +[ +[[n,a],[[v,a]],":-", +[ + [ + [[n,equals4],[[1],[1]]] + ] +]] +] +). + +p2lp_test(24,"a([A,B,C]).", +[ +[[n,a],[[[v,a],[v,b],[v,c]]]] +] +). + +p2lp_test(25,"a([A,B]).", +[ +[[n,a],[[[v,a],[v,b]]]] +] +). + +p2lp_test(26,"a([A|C]).", +[ +[[n,a],[[[v,a],"|",[v,c]]]] +] +). + +p2lp_test(27,"a([[A,B]]).", +[ +[[n,a],[[[[v,a],[v,b]]]]] +] +). + +p2lp_test(28,"a(A,[]).", +[ +[[n,a],[[v,a],[]]] +] +). + +p2lp_test(29,"a(A,[A]).", +[ +[[n,a],[[v,a],[[v,a]]]] +] +). + +p2lp_test(30,"a([[A]|C]).", +[ +[[n,a],[[[[v,a]],"|",[v,c]]]] +] +). + +p2lp_test(31,"a(A,[[A]|C]).", +[ +[[n,a],[[v,a],[[[v,a]],"|",[v,c]]]] +] +). + +p2lp_test(32,"a(A,[[A,B]|[C,D]]).", +[ +[[n,a],[[v,a],[[[v,a],[v,b]],"|",[[v,c],[v,d]]]]] +] +). + +p2lp_test(33,"ba(12).", +[ +[[n,ba],[12]] +] +). + +p2lp_test(34,"ba(12,1).", +[ +[[n,ba],[12,1]] +] +). + +p2lp_test(35,"a(1.1).", +[ +[[n,a],[1.1]] +] +). + +p2lp_test(36,"a(\"dsf\").", +[ +[[n,a],["dsf"]] +] +). + +p2lp_test(37,"a(dd).", +[ +[[n,a],['dd']] +] +). + +p2lp_test(38,"a(A):-findall(A,hello(A),B).", +[ +[[n,a],[[v,a]],":-", +[ + [[n,findall], + [ + [v,a], + + [[n,hello],[[v,a]]], + + [v,b] + ]] +]] +] +). + +p2lp_test(39,"a(A):-findall(A,hello(A),B),!.", +[ +[[n,a],[[v,a]],":-", +[ + [[n,findall], + [ + [v,a], + + [[n,hello],[[v,a]]], + + [v,b] + ]], + [[n,cut]] +]] +] +). 
+ +p2lp_test(40,"a(A):-findall(A,(hello(A),hello(A)),B).", +[ +[[n,a],[[v,a]],":-", +[ + [[n,findall], + [ + [v,a], + + [ + [[n,hello],[[v,a]]], + [[n,hello],[[v,a]]] + ], + + [v,b] + ]] +]] +] +). + +p2lp_test(41,"a([A]):-A is 1+1.", +[ +[[n,a],[[[v,a]]],":-", +[ + [[n,+],[1,1,[v,a]]] +]] +] +). + +p2lp_test(42,"ef(G):-(h(I)->true;true).", +[ +[[n,ef],[[v,g]],":-", +[ + [[n,"->"], + [ + [[n,h],[[v,i]]], + + [[n,true]], + + [[n,true]] + ]] +]] +] +). + +p2lp_test(43,"ef(G):-(h(I)->true;true),!.", +[ +[[n,ef],[[v,g]],":-", +[ + [[n,"->"], + [ + [[n,h],[[v,i]]], + + [[n,true]], + + [[n,true]] + ]], + [[n,cut]] +]] +] +). + +p2lp_test(44,"ef(G):-(h(I)->true).", +[ +[[n,ef],[[v,g]],":-", +[ + [[n,"->"], + [ + [[n,h],[[v,i]]], + + [[n,true]] + ]] +]] +] +). + +p2lp_test(45,"ef(G):-(h(I)->true),!.", +[ +[[n,ef],[[v,g]],":-", +[ + [[n,"->"], + [ + [[n,h],[[v,i]]], + + [[n,true]] + ]], + [[n,cut]] +]] +] +). + +p2lp_test(46,"compound21(T,U)-->item(I).", +[ +[[n,compound21],[[v,t],[v,u]],"->", +[ + [[n,item],[[v,i]]] +]] +] +). + +p2lp_test(47,"compound21(T,U)-->item(I),!.", +[ +[[n,compound21],[[v,t],[v,u]],"->", +[ + [[n,item],[[v,i]]], + [[n,cut]] +]] +] +). + +p2lp_test(48,"a([A]).", +[ +[[n,a],[[[v,a]]]] +] +). + +p2lp_test(49,"a(A).", +[ +[[n,a],[[v,a]]] +] +). + +p2lp_test(50,"compound21(T,U)-->{wrap(I,Itemname1)}.", +[ +[[n,compound21],[[v,t],[v,u]],"->", +[ + [[n,code],[[n,wrap],[[v,i],[v,itemname1]]]] +]] +] +). + +p2lp_test(51,"compound21(T,U)-->{wrap(I,Itemname1),append(T,Itemname1,V)}.", +[ +[[n,compound21],[[v,t],[v,u]],"->", +[ + [[n,code],[[[n,wrap],[[v,i],[v,itemname1]]],[[n,append],[[v,t],[v,itemname1],[v,v]]]]] +]] +] +). 
+
+p2lp_test(52,"compound21(T,U)-->item(I),lookahead(\"]\"),{wrap(I,Itemname1),append(T,Itemname1,V)},compound212(V,U).",
+[
+[[n,compound21],[[v,t],[v,u]],"->",
+[
+ [[n,item],[[v,i]]],
+ [[n,lookahead],["]"]],
+ [[n,code],[[[n,wrap],[[v,i],[v,itemname1]]],[[n,append],[[v,t],[v,itemname1],[v,v]]]]],
+ [[n,compound212],[[v,v],[v,u]]]
+]]
+]
+).
+
+p2lp_test(53,"a(A):-findall([A,C],hello(A,C),B).",
+[
+[[n,a],[[v,a]],":-",
+[
+ [[n,findall],
+ [
+  [[v,a],[v,c]],
+
+  [[n,hello],[[v,a],[v,c]]],
+
+  [v,b]
+ ]]
+]]
+]
+).
+
+p2lp_test(54,"a:-a(\"\n\",\"\\\"\").",
+[[[n, a], ":-",[[[n,a],["\n",""""]]]]]
+).
+
+p2lp_test(55,"a:-a(\"\\\"\").",
+[[[n, a], ":-",[[[n,a],[""""]]]]]
+).
+
+p2lp_test(56,"a:-a(\" \\\"\").",
+[[[n, a], ":-",[[[n,a],[" \""]]]]]
+).
+
+p2lp_test(57,"a:-a(\"\\\" \").",
+[[[n, a], ":-",[[[n,a],[""" "]]]]]
+).
diff --git a/Prolog-to-List-Prolog-master/p2slpconverter.pl b/Prolog-to-List-Prolog-master/p2slpconverter.pl
new file mode 100644
index 0000000000000000000000000000000000000000..1404f80e4247efa5de01e7667d4fff2ec6033a68
--- /dev/null
+++ b/Prolog-to-List-Prolog-master/p2slpconverter.pl
@@ -0,0 +1,239 @@
+%% trace,string_codes("a.\nb(C,D).\nef(G):-(h(I)->true;true),!.",A),phrase(file(B),A),write(B).
+
+%% [[[a,*,*]],[[b,*,*],[c,d]],[[ef,*,*],[g],:-,[[[[h,*,*],[i]],->,true,or,true],!]]]
+
+:- use_module(library(pio)).
+:- use_module(library(dcg/basics)).
+
+:- include('la_strings.pl').
+
+p2slpconverter :-
+ File1="test1.pl",
+ readfile(File1,"test1.pl file read error.").
+
+readfile(List1,Error) :-
+ phrase_from_file_s(string(List6), List1),
+ (phrase(file(List3),List6)->true;(writeln(Error),abort)),
+ writeln(List3).
+
+string(String) --> list(String).
+
+list([]) --> [].
+list([L|Ls]) --> [L], list(Ls).
+
+file([L|Ls]) --> predicate(L),newlines1(_),
+%%{writeln(L)}, %%***
+file(Ls), !.
+file([L]) --> predicate(L),newlines1(_),
+%%{writeln(L)},
+!.
+
+%%predicate([]) --> newlines1(_).
+predicate(A) -->
+ name1(Word11),
+ ".", {A=[[Word11,"*","*"]]
+ }.
+predicate(A) --> + name1(Word11), + "(",varnames(Varnames),")", + ".", {A=[[Word11,"*","*"],Varnames] + }. +predicate(A) --> + name1(Word11), + "(",varnames(Varnames),")", + spaces1(_),":-",newlines1(_), + lines(L), ".", + {A=[[Word11,"*","*"],Varnames,(:-),L] + }. + +/**name1([L3|Xs]) --> [X], {string_codes(L2,[X]),(char_type(X,alnum)->true;L2="_"),downcase_atom(L2,L3)}, name1(Xs), !. +name1([]) --> []. +**/ + +name1(X1) --> name10(X2), {atom_string(X2,X1)}. + +name10(XXs) --> [X], + {char_code(Ch1,X),(char_type(X,alnum)->true;(Ch1='_'->true; + Ch1='!')), + atom_string(CA,Ch1),downcase_atom(CA,CA2)}, + name10(Xs), + {atom_concat(CA2,Xs,XXs)}, !. +name10(XXs) --> [X], + {char_code(Ch1,X),(char_type(X,alnum)->true;(Ch1='_'->true; + Ch1='!')), + atom_string(CA,Ch1),downcase_atom(CA,XXs)}, !. +%%name10('') --> []. + +name2(X1) --> name20(X2), {atom_string(X2,X1)}. + +name20(XXs) --> [X], + {char_code(Ch1,X),%%char_type(X,alnum)->true; + (Ch1='+'->true;(Ch1='-'->true;(Ch1='*'->true; + (Ch1='/'->true;(Ch1='<'->true;(Ch1='>'->true; + (Ch1='='))))))),atom_string(CA,Ch1), + downcase_atom(CA,CA2)}, + name20(Xs), + {atom_concat(CA2,Xs,XXs)}, !. +name20(XXs) --> [X], + {char_code(Ch1,X),%%(char_type(X,alnum)->true; + (Ch1='+'->true;(Ch1='-'->true;(Ch1='*'->true; + (Ch1='/'->true;(Ch1='<'->true;(Ch1='>'->true; + (Ch1='='))))))), + atom_string(CA,Ch1),downcase_atom(CA,XXs)}, !. +%%name20('') --> []. + +/** +name2([L3|Xs]) --> [X], {(char_type(X,alnum)->true; + char_type(X,punct)),string_codes(L2,[X]),downcase_atom(L2,L3)}, name2(Xs), !. +name2([]) --> [].**/ + +%%spaces1(_) --> [X], {char_type(X,space)},!. +spaces1([X|Xs]) --> [X], {char_type(X,space)}, spaces1(Xs), !. +spaces1([]) --> []. + +varnames([L1|Ls]) --> name1(L1),",", %%{writeln(L)}, %%*** +varnames(Ls), !. +varnames([L4]) --> name1(L1), {string_codes(L2,L1),downcase_atom(L2,L3),atom_string(L3,L4)},!. + + +newlines1([X|Xs]) --> [X], {char_type(X,newline)}, newlines1(Xs), !. 
+%%newlines1([X]) --> [X], {char_type(X,newline)},!. +newlines1([]) --> [],!. + +/** +was comments +newlines1([]) --> "%", comments(_), "\n". +newlines1([]) --> "/","*", commentsa(_), "*","/". +newlines1([]) --> newlines1(_). + +comments([L|Ls]) --> comments2(L), +%%{writeln(L)}, %%*** +comments(Ls), !. +comments([L]) --> comments2(L), +%%{writeln(L)}, +!. + +comments2(_) --> spaces1(_),name1(_).%%[X], {string_codes(X1,[X]), not(X1="\n")}. + + +commentsa([L|Ls]) --> comments3(L), +%%{writeln(L)}, %%*** +commentsa(Ls), !. +commentsa([L]) --> comments3(L), +%%{writeln(L)}, +!. + +comments3(_) --> spaces1(_),name1(_).%%[X], [Y], {string_codes(X1,[X]), + %%string_codes(Y1,[Y]), not((X1="*",Y1="/"))}. + +**/ + +lines([L|Ls]) --> line(L),",",newlines1(_), +%%{writeln(L)}, %%*** +lines(Ls), !. +lines([L]) --> line(L), +%%{writeln(L)}, +!. + +line(A) --> %%spaces1(_), + name1(Word11), %% name(A,B,C). + "(",varnames(Varnames),")", + {A=[[Word11,"*","*"],Varnames]},!. +line(A) --> %%spaces1(_), + name1(Word10), spaces1(_), %% A = B*Y + (name1(Word21)|name2(Word21)), spaces1(_), + name1(Word11), + name2(Word12), name1(Word13), + {string_concat(Word11,Word12,B), + string_concat(B,Word13,C), + A=[Word10,Word21,C]},!. +line(A) --> %%spaces1(_), + name1(Word10), spaces1(_), %% A is B + name2(Word21), spaces1(_), name1(Word11), + {A=[Word10,Word21,Word11]},!. +line(Word1) --> + "(",line(Word2),")",{Word1=[Word2]},!. +line(Word1) --> + "(",line(Word2),"->",line(Word3),";",line(Word4),")", + {Word1=[Word2,->,Word3,or,Word4]},!. +line(Word1) --> + "(",line(Word2),"->",line(Word3),")", + {Word1=[Word2,->,Word3]},!. +line(Word1) --> + "(",line(Word2),";",line(Word3),")", + {Word1=[Word2,or,Word3]},!. +line(Word) --> %%spaces1(_), + name1(Word), + {(Word="true"->true;Word="!")},!. +line(Word1) --> + "(",lines(Word2),")", + {Word1=[Word2]},!. +%%a(_) -->",". +%%line([]) --> newlines1(_),!. 
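
The `file//1` grammar above can be exercised directly with `phrase/2`; the query below is the worked example from the header comment at the top of p2slpconverter.pl, reproduced in runnable form (the expected result is the one stated in that comment):

```prolog
% The worked example from the header comment of p2slpconverter.pl:
% parse three Prolog clauses into the simple List Prolog form.
?- string_codes("a.\nb(C,D).\nef(G):-(h(I)->true;true),!.", Codes),
   phrase(file(Clauses), Codes),
   write(Clauses).
% [[[a,*,*]],[[b,*,*],[c,d]],
%  [[ef,*,*],[g],:-,[[[[h,*,*],[i]],->,true,or,true],!]]]
```

Each clause becomes a list headed by `[Name,*,*]`, with argument lists, `:-` and body structure nested as further lists.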
+ + +%% **** Pretty Print + +/** + +?- pp0([[[a,*,*]],[[b,*,*],[c,d]],[[ef,*,*],[g],(:-),[[[[h,*,*],[i]],->,true,or,true],!]]]). +[ +[[a,*,*]], +[[b,*,*],[c,d]], +[[ef,*,*],[g],(:-), +[ + [[[h,*,*],[i]],->,true,or,true], + ! +]] +] +**/ + +concat_list(A,[],A):-!. +concat_list(A,List,B) :- + List=[Item|Items], + string_concat(A,Item,C), + concat_list(C,Items,B). + + +pp0(List) :- + writeln("["), + pp1(List), + writeln("]"),!. + +pp1([]):-!. +pp1(List1) :- + List1=[List2], + (((List2=[[_Name,*,*]]->true;List2=[[_Name,*,*], + _Variables]), + write(List2),writeln(","))->true; + (List2=[[Name,*,*],Variables1,(:-),Body]-> + (term_to_atom(Variables1,Variables2), + concat_list("[[",[Name,",*,*],",Variables2, + ",(:-),"],String), + writeln(String),writeln("["),pp2(Body),writeln("]]")))),!. +pp1(List1) :- + List1=[List2|Lists3], + (((List2=[[_Name,*,*]]->true;List2=[[_Name,*,*], + _Variables]), + write(List2),writeln(","))->true; + (List2=[[Name,*,*],Variables1,(:-),Body]-> + (term_to_atom(Variables1,Variables2), + concat_list("[[",[Name,",*,*],",Variables2, + ",(:-),"],String), + writeln(String),writeln("["),pp2(Body),writeln("]],")))), + pp1(Lists3),!. +pp2([]):-!. +pp2(List1) :- + List1=[List2], + write("\t"),writeln(List2),!. +pp2(List1) :- + List1=[List2|Lists3], + write("\t"),write(List2),writeln(","), + pp2(Lists3),!. + + +pp3([]) :- !. +pp3(List1) :- + List1=[List2|List3], + writeln1(List2), + pp3(List3). \ No newline at end of file diff --git a/Prolog-to-List-Prolog-master/pretty_print.pl b/Prolog-to-List-Prolog-master/pretty_print.pl new file mode 100644 index 0000000000000000000000000000000000000000..97fa7233d63d74d24eb57ed32238f2d195f7b4b0 --- /dev/null +++ b/Prolog-to-List-Prolog-master/pretty_print.pl @@ -0,0 +1,264 @@ +% ["Medicine","MEDICINE by Lucian Green Head of State Head Ache Prevention 4 of 4.txt",0,algorithms,"32. I prepared to observe the people like my lecturer friend and meditation student was a doctor and friend. 
I did this by observing the hansard. First, I found the hansard. Second, I observed him listen to the politician. Third, I observed him take notes. In this way, I prepared to observe the people like my lecturer friend and meditation student was a doctor and friend by observing the hansard."] + +% * I did this by observing the hansard. + +% For nested if-then, findall + +:-dynamic pp_separate_comma/1. + +pretty_print(List,String) :- + (pp0(List,String)->true; + pp_1(List,String)). + +symbol_1(":-","\":-\""). +symbol_1("->","\"->\""). + +pp0_2(A,B) :- + retractall(pp_separate_comma(_)), + assertz(pp_separate_comma("\n")), + pp0(A,B),!. + +pp0_3(A,B) :- + retractall(pp_separate_comma(_)), + assertz(pp_separate_comma("")), + pp0(A,B),!. + +pp0(A,B) :- +%/* + ((pp_separate_comma(_PSC) + %not(var(PSC)) + )-> + true;(retractall(pp_separate_comma(_)), + assertz(pp_separate_comma("")))), + %*/ + pp01(A,B). + +pp01([],'[]') :- !. +pp01(List,String2) :- +%trace, + +%trace, + pp1(List,'',String1), + concat_list(['[\n',String1],String5), + %replace(String3,"&","\"",String4), + %replace(String3,"#","'",String5), + string_concat(String6,B,String5),string_length(B,2), + string_concat(String6,'\n]',String2), + !. +pp1([],S,S):-!. +pp1([List1],S1,S2) :- + %List1=[List2], + pp3(List1,S1,S2). +pp1(List1,S1,S2) :- + List1=[List2|Lists3], + pp3(List2,S1,S3), + pp1(Lists3,S3,S2),!. +pp2([],S,S,_N):-!. 
+pp2(List1,S1,S2,N) :- + pp_separate_comma(PSC), + List1=[List2], + + (List2=[[n,findall],[V1,Body,V2]]-> % if then else + (N2 is N+1, + pp2([Body],'',S4,N2), + length(Counter,N), + findall('\t',member(_,Counter),Ts1), + concat_list(Ts1,Ts), + length(Counter2,N2), + findall('\t',member(_,Counter2),Ts21), + concat_list(Ts21,Ts2), + %pp2(Lists3,'',S3,N), + term_to_atom(V1,V11), + term_to_atom(V2,V21), + + concat_list([S1,'\n',Ts,'[[n,findall]',',','\n',Ts,'[', + '\n',Ts2,V11,PSC,',','\n', + S4,PSC,',',%'\n',Ts, + '\n', + '\n',Ts2,V21, + '\n',%Ts,'],[', + %S5, + Ts,']]'],S2)); + + (List2=[[n,"->"],[If,Then]]-> % if then else + (N2 is N+1, + pp2([If],'',S4,N2), + pp2([Then],'',S5,N2), + length(Counter,N), + findall('\t',member(_,Counter),Ts1), + concat_list(Ts1,Ts), + %pp2(Lists3,'',S3,N), + concat_list([S1,'\n',Ts,'[[n,"->"]',',','\n',Ts,'[', + S4,PSC,',',%'\n',Ts, + '\n',%Ts,'],[', + S5,'\n',Ts,']]'],S2)); + + %* remove \n Ts + %* put in n,"->" + (List2=[[n,"->"],[If,Then,Else]]-> % if then else + (N2 is N+1, + pp2([If],'',S4,N2), + pp2([Then],'',S5,N2), + pp2([Else],'',S51,N2), + length(Counter,N), + findall('\t',member(_,Counter),Ts1), + concat_list(Ts1,Ts), + %pp2(Lists3,'',S3,N), + concat_list([S1,'\n',Ts,'[[n,"->"]',',','\n',Ts,'[', + S4,PSC,',',%'\n',Ts, + '\n',%Ts,'],[', + S5,PSC,',',%'\n',Ts, + '\n',%Ts,'],[', + S51,'\n',Ts,']]'],S2)); + %concat_list([S1,'\n',Ts,S4,',','\n',Ts,S5,',','\n',Ts,S51],S2)); + + %write("\t")%,writeln(List2), + (%pp2(List2,'',List2a,N), + List2=[[n,_]|_], + term_to_atom(List2,List2a), + length(Counter,N), + findall('\t',member(_,Counter),Ts1), + concat_list(Ts1,Ts), + concat_list([S1,'\n',Ts,List2a],S2)); + + (pp2(List2,'',List2a,N), + %term_to_atom(List2,List2a), + length(Counter,N), + findall('\t',member(_,Counter),Ts1), + concat_list(Ts1,Ts), + concat_list([S1,%'\n',Ts, + '\n',Ts,'[', + List2a, + '\n',Ts,']' + ],S2))))), + !. 
+pp2(List1,S1,S2,N) :- + pp_separate_comma(PSC), + List1=[List2|Lists3], + %write("\t"),write(List2),writeln(","), + + (List2=[[n,findall],[V1,Body,V2]]-> % if then else + (N2 is N+1, + pp2([Body],'',S4,N2), + length(Counter,N), + findall('\t',member(_,Counter),Ts1), + concat_list(Ts1,Ts), + length(Counter2,N2), + findall('\t',member(_,Counter2),Ts21), + concat_list(Ts21,Ts2), + %pp2(Lists3,'',S3,N), + term_to_atom(V1,V11), + term_to_atom(V2,V21), + pp2(Lists3,'',S3,N), + concat_list([S1,'\n',Ts,'[[n,findall]',',','\n',Ts,'[', + '\n',Ts2,V11,PSC,',','\n', + S4,PSC,',',%'\n',Ts, + '\n', + '\n',Ts2,V21, + '\n',%Ts,'],[', + %S5, + Ts,']]',PSC,',',S3],S2)); + + (List2=[[n,"->"],[If,Then]]-> % if then else + (N2 is N+1, + pp2([If],'',S4,N2), + pp2([Then],'',S5,N2), + length(Counter,N), + findall('\t',member(_,Counter),Ts1), + concat_list(Ts1,Ts), + pp2(Lists3,'',S3,N), + concat_list([S1,'\n',Ts,'[[n,"->"]',',','\n',Ts,'[', + S4,PSC,',',%'\n',Ts, + '\n',%Ts,'],[', + S5,'\n',Ts,']]',PSC,',',S3],S2)); + %concat_list([S1,'\n',Ts,S4,',','\n',Ts,S5,',',S3],S2)); + + (List2=[[n,"->"],[If,Then,Else]]-> % if then else + (N2 is N+1, + pp2([If],'',S4,N2), + pp2([Then],'',S5,N2), + pp2([Else],'',S51,N2), + length(Counter,N), + findall('\t',member(_,Counter),Ts1), + concat_list(Ts1,Ts), + pp2(Lists3,'',S3,N), + concat_list([S1,'\n',Ts,'[[n,"->"]',',','\n',Ts,'[', + S4,PSC,',',%'\n',Ts, + '\n',%Ts,'],[', + S5,PSC,',',%'\n',Ts, + '\n',%Ts,'],[', + S51,'\n',Ts,']]',PSC,',',S3],S2)); + %concat_list([S1,'\n',Ts,S4,',','\n',Ts,S5,',','\n',Ts,S51,',',S3],S2)); + (%pp2(List2,'',List2a,N), + %trace, + List2=[[N_or_v,_]|_],(N_or_v=n->true;N_or_v=v), + pp2(Lists3,'',S3,N), + term_to_atom(List2,List2a), + length(Counter,N), + findall('\t',member(_,Counter),Ts1), + concat_list(Ts1,Ts), + (S3=''->Comma='';Comma=','), + concat_list([S1,'\n',Ts,List2a,Comma,S3],S2)); + + (pp2(List2,'',List2a,N), + pp2(Lists3,'',S3,N), + %term_to_atom(List2,List2a), + length(Counter,N), + 
findall('\t',member(_,Counter),Ts1), + concat_list(Ts1,Ts), + concat_list([S1,%'\n',Ts, + '\n',Ts,'[', + List2a, + '\n',Ts,']',PSC, + ',',S3 + ],S2))))), + %concat_list([S1,%'\n',Ts, + %List2a,',',S3],S2)))), + +/* + + (term_to_atom(List2,List2a), + pp2(Lists3,'',S3,N), + length(Counter,N), + findall('\t',member(_,Counter),Ts1), + concat_list(Ts1,Ts), + concat_list([S1,'\n',Ts,List2a,',',S3],S2)))), + */ + !. + +pp_1(List,String) :- + term_to_atom(List,Atom), + atom_string(Atom,String). + +pp3(List1,S1,S3) :- + pp_separate_comma(PSC), + symbol_1(Symbol,Symbol1), + List1=List2, + (((List2=[[_N10,_Name]]->true;List2=[[_N10,_Name], + _Variables]), + term_to_atom(List2,List2a), + concat_list([S1,List2a,PSC,',\n'],S3))->true; + + ((List2=[":-",[_,_Word11],_Varnames%N, + ]%,N3 + ;List2=[":-",[_,_Word11],[_Word13,"/",_Word12]%,Varnames%N, + ] + ), + term_to_atom(List2,List2a), + concat_list([S1,List2a,PSC,',\n'],S3))->true; + + ((List2=[[N1,Name],Variables1,Symbol,Body]-> + (term_to_atom(Variables1,Variables2), + concat_list([S1,'[[',N1,',',Name,'],',Variables2, + PSC,',',Symbol1,',\n[',PSC],String), + %trace, + pp2(Body,'',B1,1), + %string_concat(B1,",",B11), + concat_list([String,B1,'\n]]',PSC,',\n'],S3)))->true; + List2=[[N1,Name],Symbol,Body]-> + (%term_to_atom(Variables1,Variables2), + concat_list([S1,'[[',N1,',',Name,']',PSC,',',Symbol1,',\n[',PSC],String), + %trace, + pp2(Body,'',B1,1), + %string_concat(B1,",",B11), + concat_list([String,B1,'\n]]',PSC,',\n'],S3)))),!. diff --git a/Prolog-to-List-Prolog-master/pretty_print_lp2p.pl b/Prolog-to-List-Prolog-master/pretty_print_lp2p.pl new file mode 100644 index 0000000000000000000000000000000000000000..20ecebe5a872ef95e533b7ad3351611a9dc4fc1f --- /dev/null +++ b/Prolog-to-List-Prolog-master/pretty_print_lp2p.pl @@ -0,0 +1,441 @@ +% ["Medicine","MEDICINE by Lucian Green Head of State Head Ache Prevention 4 of 4.txt",0,algorithms,"32. 
I prepared to observe the people like my lecturer friend and meditation student was a doctor and friend. I did this by observing the hansard. First, I found the hansard. Second, I observed him listen to the politician. Third, I observed him take notes. In this way, I prepared to observe the people like my lecturer friend and meditation student was a doctor and friend by observing the hansard."] + +% * I did this by observing the hansard. + +% For nested if-then, findall + +%:-include('../List-Prolog-to-Prolog-Converter/lp2pconverter.pl'). + +pretty_print_lp2p(List,String) :- + (pp_lp2p0(List,String)->true; + pp_lp2p_1(List,String)). + +symbol_1_lp2p(":-",":-"). +symbol_1_lp2p("->","->"). + +pp_lp2p0([],'') :- !. +pp_lp2p0(List,String2) :- +%trace, + pp_lp2p1(List,'',String1), + concat_list(['',String1],String6),%5), + %replace(String3,"&","\"",String4), + %replace(String3,"#","'",String5), + %string_concat(String6,B,String5),string_length(B,2), + string_concat(String6,'\n',String2), + !. +pp_lp2p1([],S,S):-!. +pp_lp2p1([List1],S1,S2) :- + %List1=[List2], + pp_lp2p3(List1,S1,S2). +pp_lp2p1(List1,S1,S2) :- + List1=[List2|Lists3], + pp_lp2p3(List2,S1,S3), + pp_lp2p1(Lists3,S3,S2),!. +pp_lp2p2([],S,S,_N):-!. 
+pp_lp2p2(List1,S1,S2,N) :- + List1=[List2], + + (List2=[[n,findall],[V1,Body,V2]]-> % if then else + (N2 is N+1, + pp_lp2p2([Body],'',S4,N2), + length(Counter,N), + findall('\t',member(_,Counter),Ts1), + concat_list(Ts1,Ts), + length(Counter2,N2), + findall('\t',member(_,Counter2),Ts21), + concat_list(Ts21,Ts2), + %pp_lp2p2(Lists3,'',S3,N), + %term_to_atom(V1,V11), + interpretstatementlp2p5(V1,'',V11), + %term_to_atom(V2,V21), + interpretstatementlp2p5(V2,'',V21), + concat_list([S1,'\n',Ts,'findall(',%'\n',Ts, + %'\n',Ts2, + V11,',','\n', + S4,',',%'\n',Ts, + '\n', + '\n',Ts2,V21, + %'\n',%Ts,'],[', + %S5, + ')'%Ts,']]' + ],S2)); + + (List2=[[n,"->"],[If,Then]]-> % if then else + (N2 is N+1, + pp_lp2p2([If],'',S4,N2), + pp_lp2p2([Then],'',S5,N2), + length(Counter,N), + findall('\t',member(_,Counter),Ts1), + concat_list(Ts1,Ts), + %pp_lp2p2(Lists3,'',S3,N), + concat_list([S1,'\n',Ts,%'[[n,"->"]',',', + %'\n',Ts, + '(', + S4,'->',%'\n',Ts, + '\n',Ts,'', + S5,'\n',Ts,')'],S2)); + + (List2=[[n,code],Code1]-> % if then else + (N2 is N+1, + (%trace, + (Code1=[A|_],not(predicate_or_rule_name(A)))-> + Code1=Code;[Code1]=Code), + pp_lp2p2(Code,'',S4,N2), + %pp_lp2p2([Then],'',S5,N2), + length(Counter,N), + findall('\t',member(_,Counter),Ts1), + concat_list(Ts1,Ts), + %pp_lp2p2(Lists3,'',S3,N), + concat_list([S1,'\n',Ts,%'[[n,"->"]',',', + %'\n',Ts, + '{', + S4,'',%'\n',Ts, + %'\n',Ts,'', + %S5, + '\n',Ts,'}'],S2)); + + %* remove \n Ts + %* put in n,"->" + (List2=[[n,"->"],[If,Then,Else]]-> % if then else + (N2 is N+1, + pp_lp2p2([If],'',S4,N2), + pp_lp2p2([Then],'',S5,N2), + pp_lp2p2([Else],'',S51,N2), + length(Counter,N), + findall('\t',member(_,Counter),Ts1), + concat_list(Ts1,Ts), + %pp_lp2p2(Lists3,'',S3,N), + concat_list([S1,'\n',Ts,%'[[n,"->"]',',','\n',Ts, + '(', + S4,'->',%'\n',Ts, + '\n',%Ts,'],[', + S5,';',%'\n',Ts, + '\n',%Ts,'],[', + S51,'\n',Ts,')'],S2)); + %concat_list([S1,'\n',Ts,S4,',','\n',Ts,S5,',','\n',Ts,S51],S2)); + + %write("\t")%,writeln(List2), + 
(%pp_lp2p2(List2,'',List2a,N), + List2=[[n,cut]|_]-> + (%pp4_lp2p3_1_4(List2,List2a), + List2a="!", + length(Counter,N), + findall('\t',member(_,Counter),Ts1), + concat_list(Ts1,Ts), + concat_list([S1,'\n',Ts,List2a],S2)); + + (List2=[[n,comment],[Comment]]-> % if then else + (%N2 is N+1, + %pp_lp2p2([Comment],'',S4,N), + %pp_lp2p2([Then],'',S5,N2), + %pp_lp2p2([Else],'',S51,N2), + length(Counter,N), + findall('\t',member(_,Counter),Ts1), + concat_list(Ts1,Ts), + %pp_lp2p2(Lists3,'',S3,N), + concat_list([S1,'\n',Ts,Comment],S2)); + + (%pp_lp2p2(List2,'',List2a,N), + List2=[[n,_]|_]-> + (pp4_lp2p3_1_4(List2,List2a), + length(Counter,N), + findall('\t',member(_,Counter),Ts1), + concat_list(Ts1,Ts), + concat_list([S1,'\n',Ts,List2a],S2)); + + (N2 is N+1, + pp_lp2p2(List2,'',List2a,N2), + %term_to_atom(List2,List2a), + length(Counter,N), + findall('\t',member(_,Counter),Ts1), + concat_list(Ts1,Ts), + concat_list([S1,%'\n',Ts, + '\n',Ts,'(', + List2a, + '\n',Ts,')' + ],S2))))))))), + !. +pp_lp2p2(List1,S1,S2,N) :- + List1=[List2|Lists3], + %write("\t"),write(List2),writeln(","), + + (List2=[[n,findall],[V1,Body,V2]]-> % if then else + (N2 is N+1, + pp_lp2p2([Body],'',S4,N2), + length(Counter,N), + findall('\t',member(_,Counter),Ts1), + concat_list(Ts1,Ts), + length(Counter2,N2), + findall('\t',member(_,Counter2),Ts21), + concat_list(Ts21,Ts2), + %pp_lp2p2(Lists3,'',S3,N), + interpretstatementlp2p5(V1,'',V11), + interpretstatementlp2p5(V2,'',V21), + pp_lp2p2(Lists3,'',S3,N), + concat_list([S1,'\n',Ts,'findall(',%'\n',Ts,'', + %'\n',Ts2, + V11,',','\n', + S4,',',%'\n',Ts, + '\n', + '\n',Ts2,V21, + '',%Ts,'],[', + %S5, + %Ts, + ')',',',S3],S2)); + + (List2=[[n,"->"],[If,Then]]-> % if then else + (N2 is N+1, + pp_lp2p2([If],'',S4,N2), + pp_lp2p2([Then],'',S5,N2), + length(Counter,N), + findall('\t',member(_,Counter),Ts1), + concat_list(Ts1,Ts), + pp_lp2p2(Lists3,'',S3,N), + concat_list([S1,'\n',Ts,'(','','\n', + Ts,'', + S4,'->',%'\n',Ts, + '\n',%Ts,'],[', + S5,'\n',Ts, + 
')',',',S3],S2)); + %concat_list([S1,'\n',Ts,S4,',','\n',Ts,S5,',',S3],S2)); + + (List2=[[n,code],Code1]-> % if then else + (N2 is N+1, + (%trace, + (Code1=[A|_],not(predicate_or_rule_name(A)))-> + Code1=Code;[Code1]=Code), + pp_lp2p2(Code,'',S4,N2), + %pp_lp2p2([Then],'',S5,N2), + %pp_lp2p2([Else],'',S51,N2), + length(Counter,N), + findall('\t',member(_,Counter),Ts1), + concat_list(Ts1,Ts), + pp_lp2p2(Lists3,'',S3,N), + concat_list([S1,'\n',Ts,'{',%',','\n',Ts,'[', + S4,%'->',%'\n',Ts, + '\n'%Ts,'],[', + %S5,';',%'\n',Ts, + %'\n',%Ts,'],[', + %S51%,'\n' + ,Ts + ,'}',',',S3],S2)); + + (List2=[[n,"->"],[If,Then,Else]]-> % if then else + (N2 is N+1, + pp_lp2p2([If],'',S4,N2), + pp_lp2p2([Then],'',S5,N2), + pp_lp2p2([Else],'',S51,N2), + length(Counter,N), + findall('\t',member(_,Counter),Ts1), + concat_list(Ts1,Ts), + pp_lp2p2(Lists3,'',S3,N), + concat_list([S1,'\n',Ts,'(',%',','\n',Ts,'[', + S4,'->',%'\n',Ts, + '\n',%Ts,'],[', + S5,';',%'\n',Ts, + '\n',%Ts,'],[', + S51%,'\n' + %,Ts + ,')',',',S3],S2)); + %concat_list([S1,'\n',Ts,S4,',','\n',Ts,S5,',','\n',Ts,S51,',',S3],S2)); + (%pp_lp2p2(List2,'',List2a,N), + %trace, + (List2=[[N_or_v,cut]|_],(N_or_v=n->true;N_or_v=v))-> + (pp_lp2p2(Lists3,'',S3,N), + %pp4_lp2p3_1_4(List2,List2a), + List2a="!", + length(Counter,N), + findall('\t',member(_,Counter),Ts1), + concat_list(Ts1,Ts), + (S3=''->Comma='';Comma=','), + concat_list([S1,'\n',Ts,List2a,Comma,S3],S2)); + + (List2=[[n,comment],[Comment]]-> % if then else + (pp_lp2p2(Lists3,'',S3,N),%N2 is N+1, + %pp_lp2p2([Comment],'',S4,N), + %pp_lp2p2([Then],'',S5,N2), + %pp_lp2p2([Else],'',S51,N2), + length(Counter,N), + findall('\t',member(_,Counter),Ts1), + concat_list(Ts1,Ts), + %pp_lp2p2(Lists3,'',S3,N), + concat_list([S1,'\n',Ts,Comment,S3],S2)); + + (%pp_lp2p2(List2,'',List2a,N), + %trace, + (List2=[[N_or_v,_]|_],(N_or_v=n->true;N_or_v=v))-> + (pp_lp2p2(Lists3,'',S3,N), + pp4_lp2p3_1_4(List2,List2a), + length(Counter,N), + findall('\t',member(_,Counter),Ts1), + 
concat_list(Ts1,Ts), + (S3=''->Comma='';Comma=','), + concat_list([S1,'\n',Ts,List2a,Comma,S3],S2)); + + (N2 is N+1, + pp_lp2p2(List2,'',List2a,N2), + pp_lp2p2(Lists3,'',S3,N), + %term_to_atom(List2,List2a), + length(Counter,N), + findall('\t',member(_,Counter),Ts1), + concat_list(Ts1,Ts), + concat_list([S1,%'\n',Ts, + '\n',Ts,'(', + List2a, + '\n',Ts,')', + ',',S3 + ],S2))))))))), + %concat_list([S1,%'\n',Ts, + %List2a,',',S3],S2)))), + +/* + + (term_to_atom(List2,List2a), + pp_lp2p2(Lists3,'',S3,N), + length(Counter,N), + findall('\t',member(_,Counter),Ts1), + concat_list(Ts1,Ts), + concat_list([S1,'\n',Ts,List2a,',',S3],S2)))), + */ + !. + +pp_lp2p_1(List,String) :- + pp4_lp2p3_1_4(List,Atom), + atom_string(Atom,String). + +pp_lp2p3(List1,S1,S3) :- + %symbol_1_lp2p(Symbol,Symbol1), + %List1=List2, + ((pp4_lp2p3_1(List1,String), + %trace, + %pp_lp2p2(Body,'',B1,1), + %string_concat(B1,",",B11), + concat_list([S1,String,".\n"%,'.\n\n' + ],S3) + + )->true; + ((pp4_lp2p3_2(List1,String), + + %trace, + %pp_lp2p2(Body,'',B1,1), + %string_concat(B1,",",B11), + concat_list([S1,String,"\n"%,%B1, + %'.\n\n' + ],S3))->true; + + ((pp4_lp2p3_21(List1,String), + + %trace, + %pp_lp2p2(Body,'',B1,1), + %string_concat(B1,",",B11), + concat_list([S1,String,".\n"%,%B1, + %'.\n\n' + ],S3))->true; + + + %)->true; + ((pp4_lp2p3_3(List1,String,B1), + + %string_concat(B1,",",B11), + concat_list([S1,String,B1,'.\n\n'],S3)->true; + + ((pp4_lp2p3_4(List1,String,B1), + %string_concat(B1,",",B11), + concat_list([S1,String,B1,'.\n\n'],S3)->true; + + ((pp4_lp2p3_5(List1,String), + %string_concat(B1,",",B11), + concat_list([S1,String,'.\n\n'],S3)->true; + + ((pp4_lp2p3_6(List1,String), + %string_concat(B1,",",B11), + concat_list([S1,String,'.\n\n'],S3)) + )))))))))),!. 
+ +pp4_lp2p3_1_4(L,S) :- + (pp4_lp2p3_1(L,S)->true; + (pp4_lp2p3_2(L,S)->true; + (pp4_lp2p3_21(L,S)->true; + ((pp4_lp2p3_3(L,String,B1), + string_concat(String,B1,S))->true; + (pp4_lp2p3_4(L,String,B1), + string_concat(String,B1,S)))))),!. + +pp4_lp2p3_1(List1,S3) :- + List1=[[_N10,Name]], + %term_to_atom(List2,List2a), + %concat_list([S1,List2a,'\n'],S3) + + + %term_to_atom(Variables1,Variables2), + concat_list(['',Name%,'\n' + ],String), + %trace, + %pp_lp2p2(Body,'',B1,1), + %string_concat(B1,",",B11), + concat_list([String%,'\n\n' + ],S3). + +pp4_lp2p3_2(List1,String) :- + List1=[[_N10,comment], + [Comment]], + %term_to_atom(List2,List2a), + %concat_list([S1,List2a,'\n'],S3) + + + %interpretstatementlp2p5(Variables,'',Variables2,false), + concat_list(['',Comment%, + %',',%Symbol1, + %'\n' + ],String),!. + +pp4_lp2p3_21(List1,String) :- + List1=[[_N10,Name], + Variables], + %term_to_atom(List2,List2a), + %concat_list([S1,List2a,'\n'],S3) + + + interpretstatementlp2p5(Variables,'',Variables2,false), + concat_list(['',Name,'(',Variables2,')'%, + %',',%Symbol1, + %'\n' + ],String). + +pp4_lp2p3_3(List2,String,B1) :- + List2=[[_N1,Name],Variables1,Symbol,Body], + symbol_1_lp2p(Symbol,Symbol1), + interpretstatementlp2p5(Variables1,'',Variables2,false), + concat_list(['',Name,'(',Variables2,')', + '',Symbol1,'\n'],String), + %trace, + pp_lp2p2(Body,'',B1,1). + %string_concat(B10). + +pp4_lp2p3_4(List2,String,B1) :- + List2=[[_N1,Name],Symbol,Body], + symbol_1_lp2p(Symbol,Symbol1), + %term_to_atom(Variables1,Variables2), + concat_list([Name,Symbol1,'\n'],String), + %trace, + pp_lp2p2(Body,'',B1,1). + +pp4_lp2p3_5(List2,String) :- + List2=[Symbol,[_,Word11],[Word13,"/",Word12]%,Varnames%N, + ], + symbol_1_lp2p(Symbol,Symbol1), + %term_to_atom(Variables1,Variables2), + concat_list([Symbol1,Word11," ",Word13,"/",Word12],String). + %trace, + + %pp_lp2p2(Body,'',B1,1). 
+ +pp4_lp2p3_6(List2,String) :- + List2=[Symbol,[_,Word11],Variables1%N, + ], + symbol_1_lp2p(Symbol,Symbol1), + %term_to_atom(Variables1,Variables2), + interpretstatementlp2p5(Variables1,'',Variables2,false), + concat_list([Symbol1,Word11,'(',Variables2,')'],String). + %trace, + %pp_lp2p2(Body,'',B1,1). diff --git a/Prolog-to-List-Prolog-master/test1.pl b/Prolog-to-List-Prolog-master/test1.pl new file mode 100644 index 0000000000000000000000000000000000000000..c8a27fca0e9b5fddf0389989f6570fa2f0327939 --- /dev/null +++ b/Prolog-to-List-Prolog-master/test1.pl @@ -0,0 +1,64 @@ +:- use_module(library(date)). + +:-include('luciancicd.pl'). +:-include('../Prolog-to-List-Prolog/p2lpconverter.pl'). + +use_module(library(pio)). +use_module(library(dcg/basics)). + + +:-dynamic keep_comments/1. + +a:-a. +a(/*1*/%2 + +A,B). +a(/*1*/B,C)/*2*/:-d(E). +a(/*1*/B,C)/*2*/->d(E). +a/*2*/:-/*3*/d(E). +a/*2*/->/*3*/d(E). +a->a. +a/*2*/-->/*3*/d(E). +a:-b([A,B],1,[C]). +a./*1*/ +b:-c. +a:-b, +/*1*/ +c, +%2 +d. +a(A):-[]=[[]]. +a(A):-([]=[[]]). +a(A):-([1]=A). +a(A):-(B=A). +a(A):-([1]=[1]). +a([A,B,C]). +a([A,B]). +a([A|C]). +a([[A,B]]). +a(A,[]). +a(A,[A]). +a([[A]|C]). +a(A,[[A]|C]). +a(A,[[A,B]|[C,D]]). +ba(12). +ba(12,1). +a(1.1). +a("dsf"). +a('dd'). +a(A):-findall(A,hello(A),B). +a(A):-findall(A,hello(A),B),!. +a(A):-findall(A,(hello(A),hello(A)),B). +a([A]):-A is 1+1. +ef(G):-(h(I)->true;true). +ef(G):-(h(I)->true;true),!. +ef(G):-(h(I)->true). +ef(G):-(h(I)->true),!. +compound21(T,U)->item(I). +compound21(T,U)->item(I),!. +a([A]). +a(A). +compound21(T,U)->{wrap(I,Itemname1)}. +compound21(T,U)->{wrap(I,Itemname1),append(T,Itemname1,V)}. +compound21(T,U)->item(I),lookahead("]"),{wrap(I,Itemname1),append(T,Itemname1,V)},compound212(V,U). +a(A):-findall([A,C],hello(A,C),B). 
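
The final clause of test1.pl is the same source string as fixture 53 earlier in this diff, so the two files can be read side by side: test1.pl supplies one clause per syntactic feature the converter must handle, and the `p2lp_test/3` fixtures give the target terms. For that last clause, the expected List Prolog form (copied from `p2lp_test(53, ...)` above) is:

```prolog
% Prolog source (last clause of test1.pl):
%   a(A) :- findall([A,C], hello(A,C), B).
% Expected List Prolog term, as given in p2lp_test(53, ...):
[[n,a],[[v,a]],":-",
 [
  [[n,findall],
   [
    [[v,a],[v,c]],
    [[n,hello],[[v,a],[v,c]]],
    [v,b]
   ]]
 ]]
```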
diff --git a/SSI/.DS_Store b/SSI/.DS_Store new file mode 100644 index 0000000000000000000000000000000000000000..639d16dbe7b54b4e531127b494244df6eb2caa20 Binary files /dev/null and b/SSI/.DS_Store differ diff --git a/SSI/LICENSE b/SSI/LICENSE new file mode 100644 index 0000000000000000000000000000000000000000..920183ffbdaca89fc6faee40b6de08a6c8a061c7 --- /dev/null +++ b/SSI/LICENSE @@ -0,0 +1,29 @@ +BSD 3-Clause License + +Copyright (c) 2021, Lucian Green +All rights reserved. + +Redistribution and use in source and binary forms, with or without +modification, are permitted provided that the following conditions are met: + +1. Redistributions of source code must retain the above copyright notice, this + list of conditions and the following disclaimer. + +2. Redistributions in binary form must reproduce the above copyright notice, + this list of conditions and the following disclaimer in the documentation + and/or other materials provided with the distribution. + +3. Neither the name of the copyright holder nor the names of its + contributors may be used to endorse or promote products derived from + this software without specific prior written permission. + +THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" +AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE +IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE +DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE +FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL +DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR +SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER +CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, +OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE +OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. 
diff --git a/SSI/README.md b/SSI/README.md new file mode 100644 index 0000000000000000000000000000000000000000..517fb8cf0b8519599f79d08295d015eb4edfd8ec --- /dev/null +++ b/SSI/README.md @@ -0,0 +1,156 @@ +# State Saving Interpreter (SSI) + +![D7123F59-111C-4A88-A9BD-5006F7D36E7A](https://user-images.githubusercontent.com/15845542/226259087-0e485e4f-feaa-4bcc-95bc-ac81b31a5994.jpeg) +* White hexagons on a black background. + +* With the State Saving Interpreter, users can save the state of the interpreter between sessions; SSI deprecates the List Prolog Interpreter. + +# Getting Started + +Please follow the instructions below to install the project on your computer. + +# Prerequisites + +* Please download and install SWI-Prolog for your machine at `https://www.swi-prolog.org/build/`. + +* You may need to install gawk using Homebrew. + +* Install Translation Shell on Mac, etc. +Change the line +``` +culturaltranslationtool/ctt2.pl +trans_location("../../../gawk/trans"). +``` +to the correct location of trans. + +# 1. Install manually + +Download this repository, List Prolog Interpreter (LPI) and the repositories on which LPI depends. The Languages repository enables SSI to be run in different languages. + +# 2. Or Install from List Prolog Package Manager (LPPM) + +* Download the LPPM Repository: + +``` +mkdir GitHub +cd GitHub/ +git clone https://github.com/luciangreen/List-Prolog-Package-Manager.git +cd List-Prolog-Package-Manager +swipl +['lppm']. +lppm_install("luciangreen","SSI"). +halt. +``` + +# Running + +* In Shell: +`cd SSI` +`swipl` +`['ssi'].` + +* You may need to replace `swipl` with `swipl --stack-limit=1G` above to run tests 15 and 25. + +# Learn Prolog with SSI + +You can learn List Prolog, SSI's programming language, in the List Prolog Documentation. + +# Explanation of what it does + +* State Saving Interpreter (SSI), like List Prolog Interpreter, runs algorithms in the List Prolog language.
List Prolog algorithms are lists, which can be more easily generated and analysed. SSI differs in that it manually creates and follows choice points (allowing multiple possible solutions, as in Prolog), unlike List Prolog Interpreter, which relies on Prolog to manage choice points. +* SSI was written to save the states of the Prolog interpreter. +* It solves the problem of backtracking by saving states. +* It works by manually storing and referring to choice points, records of the variable binding table and recursive states. These include "virtual" commands (commands such as `member2`, which compute all their results when they are first run). The interpreter loads these choice points as control flows through predicates, commands and recursion. +* It supports the higher-order `call` command. +* SSI supports `member2`, `stringconcat`, nested `findall`, `cut` and the commands of List Prolog Interpreter. +* SSI will enable running segments of a complex algorithm on a supercomputer and prevent data loss in low-power areas. + +# Tests + +To run SSI-specific tests, run: +`only_ssi_test(off,NTotal,Score).` + +To run all tests (main, types, open and open types) in any language: +``` +ssi_test_all00("en",off,NTotal,Score). +ssi_test_all00("en2",off,NTotal,Score). +``` +where "en2" is a variant of English with e.g. `"concatenate strings"` instead of `stringconcat`; use "en" or another available language code (see the Languages repository for instructions on how to install different languages). + +To run a single test from the main, types, open or open types suites, run one of: +``` +ssi_test_all01(test, 4,"en2",off,1,Passed). +ssi_test_all01(test_types_cases,6,"en2",off,1,Passed). +ssi_test_all01(testopen_cases, 3,"en2",off,1,Passed). +ssi_test_all01(test_open_types, 5,"en2",off,1,Passed). +``` +where the user should replace 1 with the test number from + +* lpiverify4.pl +* lpiverify4_types.pl +* lpiverify4_open.pl +* lpiverify4_open_types.pl + +respectively.
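
For scripted runs, the same test goals can be passed to SWI-Prolog non-interactively. This is a sketch only: it assumes `ssi.pl` is the load file and uses the goal names shown in this README; the exact flags may need adjusting for your setup.

```shell
# Sketch: run one main-suite SSI test from the shell, then exit.
# Assumes the SSI directory layout and goal names shown in this README.
cd SSI
swipl --stack-limit=1G -g 'ssi_test_all01(test,4,"en2",off,1,Passed)' -t halt ssi.pl
```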
+ +* Note 1: drag and drop the contents of test_open_and_types_open_data/ into an empty file in BBEdit (Mac) to copy and paste into Terminal for tests with input. + +To run all tests (main, types, open and open types) back-translating to and from any language: +``` +ssi_test_all_bt00("en2",off,NTotal,Score). +``` + +To run a single test from the main, types, open or open types suites, run one of: +``` +ssi_test_all_bt01(test, 4,"en2",off,1,Passed). +ssi_test_all_bt01(test_types_cases,6,"en2",off,1,Passed). +ssi_test_all_bt01(testopen_cases, 3,"en2",off,1,Passed). +ssi_test_all_bt01(test_open_types, 5,"en2",off,1,Passed). +``` +where the user replaces 1 with the test number from + +* lpiverify4.pl +* lpiverify4_types.pl +* lpiverify4_open.pl +* lpiverify4_open_types.pl + +respectively. + +* See note 1 above. + +# SSI Web Service + +* A low-code backend web Prolog service. + +* After loading `swipl` and `[ssi].`, run web apps with `ssi_server(8000).` and go to `http://127.0.0.1:8000` in your browser. + +* Set the application to run, the window's title and the background colour in `ssi-api.pl`. + +* Before running on your server, change the API key in `api-key(1234567890).` in `ssi-api-key.pl` to a different API key. + +* To delete old session files every month, run `swipl ssi-ws.pl` using a scheduler in Linux, e.g. `crontab -e`. + +* Setting_up_a_VPS_with_TextToBr describes how to set up a Virtual Private Server, which can run Prolog over the web. + +* SSI Web Service provides a small but central set of commands, mainly for AI purposes in Prolog. For more tutorials, see my Lucian Academy YouTube Channel. + +* To comply with browsers' security requirements, only run SSIWS apps behind a password screen, as follows: +``` +read_password("password") +``` + +# Documentation + +See List Prolog Documentation. + +# SSI with Prolog queries + +* Run SSI with Prolog queries.
+ +# Authors + +Lucian Green - Initial programmer - Lucian Academy + +# License + +I licensed this project under the BSD3 License - see the LICENSE.md file for details. diff --git a/SSI/a.out b/SSI/a.out new file mode 100644 index 0000000000000000000000000000000000000000..7f0aeb54535dc8180f9f5938767725f6773e3687 Binary files /dev/null and b/SSI/a.out differ diff --git a/SSI/cp_since_findall_start2.pl b/SSI/cp_since_findall_start2.pl new file mode 100644 index 0000000000000000000000000000000000000000..f5404f99a597769f138667a06b5634bf927b2eaf --- /dev/null +++ b/SSI/cp_since_findall_start2.pl @@ -0,0 +1,74 @@ +get_later_cps_than_cp11(List1,Cp1,Cp3,Cps) :- +%trace, +%member([Cp_a,Cp_b|Cp1],List1), +%trace, +append(_,B,List1), + +append([Cp1],Cps,B), + +%writeln1(["*4",append([Cp1],Cps,B)]), + [_,_|Cp3]=Cp1. +/* + get_later_cps_than_cp1(List1,Cp1,Cp3,Cps) :- + %curr_cp(N), + %writeln1([cp1,Cp1]), + %writeln1([list1,List1]), + %trace, + %Cp1= + %trace, + Cp1=[Cp_b,_Cp_c|Cp3], + %member(Cp1,List1), + member([Cp_b,Cp_a1|_Cp5],List1), + member([Cp_a1,Cp_a|_Cp4],List1), + get_later_cps_than_cp(List1,Cp_a,[],Cps). + */ + %Cp2=[_,_|Cp3]. +/* +get_later_cps_than_cp1(List1,Cp_a,Cps,Cps) :- + not(member([_,Cp_a|_],List1)),!. +get_later_cps_than_cp1(List1,Cp_a,Cps1,Cps2) :- + member([Cp_b,Cp_a|Cp3],List1), + append(Cps1,[[Cp_b,Cp_a|Cp3]],Cps3), + get_later_cps_than_cp1(List1,Cp_b,Cps3,Cps2),!. 
+*/ +% new above that returns cp a + +/* +cp_since_findall_start2(List1,Cp1,Cp2,Cp3,CP_Vars1,CP_Vars2) :- + %writeln1("y for trace:"),(get_char(y)->trace;true), + %get(curr_cp,Curr_cp,CP_Vars1),%writeln([curr_cp,Curr_cp]), + (debug4(on)->writeln1(cp_since_findall_start22(List1,Cp1,Cp2,Cp3,CP_Vars1,CP_Vars2));true), + (get_last_p_before_n2(List1,Cp1,Cp2,Cp3,CP_Vars1,CP_Vars2)->true;false%(writeln([get_last_p_before_n2,abort]),abort) + ), + %writeln1(cp_since_findall_start22(List1,Cp1,Cp2,Cp3)), + %get(curr_cp,Curr_cp1,CP_Vars2),%writeln([curr_cp,Curr_cp1]), + (debug4(on)->writeln([cp_since_findall_start22,List1,Cp1,Cp2,Cp3,CP_Vars1,CP_Vars2]);true). + */ + %notrace. + + /* +cp_since_findall_start22(List1,Cp1,Cp2,Cp3) :- +%trace, +writeln1([list1,List1]), + curr_cp(N), + cp_since_findall_start21(List1,Cp1,N,Cp2), + Cp2=[_,_|Cp3]. + +cp_since_findall_start21(List1,_Cp1,N,_Cp2) :- + (not(member([N,_|_],List1)),fail),!. +cp_since_findall_start21(List1,Cp1,N,Cp2) :- + member([N,B|Cp1],List1), + %((Cp1=[_,_,_,["returns to",_]|_])-> + %fail%cp_since_findall_start21(List1,Cp1,B,Cp2) + %; + Cp2=[N,B|Cp1],!. +cp_since_findall_start21(List1,Cp1,N,Cp2) :- + member([B,N|Cp1],List1), + %((Cp1=[_,_,_,["returns to",_]|_])-> + %cp_since_findall_start21(List1,Cp1,B,Cp2); + Cp2=[B,N|Cp1],!. +cp_since_findall_start21(List1,Cp1,N,Cp2) :- + member([B,N|Cp3],List1), + not(Cp1=Cp3), + cp_since_findall_start21(List1,Cp1,B,Cp2),!. 
+*/ \ No newline at end of file diff --git a/SSI/d.pl b/SSI/d.pl new file mode 100644 index 0000000000000000000000000000000000000000..1c38fb649479c493c91f5a0cb2bca50cfc4be220 --- /dev/null +++ b/SSI/d.pl @@ -0,0 +1,179 @@ +d(Pred_id,D0,Level,Predicate_number,Line_number_b,Query,Vars1,Vars2,All_predicate_numbers,Line,Choice_point_trail10,Globals33,Functions,Result1, Result2,Globals2,Choice_point_trail3,CP_Vars10,CP_Vars2) :- + +%writeln1(d(Pred_id,D0,Level,Predicate_number,Line_number_b,Query,Vars1,Vars2,All_predicate_numbers,Line,Choice_point_trail1,Globals3,Functions,Result1, Result2,Globals2,Choice_point_trail3,CP_Vars1,CP_Vars2)), + +%trace, +(false% too slow +%last_call_optimisation(Globals3,Choice_point_trail10,Choice_point_trail1,Predicate_number,Line_number_b,Functions,CP_Vars10,CP_Vars1,Globals33,Globals3) +->true%writeln1(last_call_optimisation(Globals3,Choice_point_trail10,Choice_point_trail1,Predicate_number,Line_number_b,Functions,CP_Vars10,CP_Vars1)) +; +(Choice_point_trail10=Choice_point_trail1,CP_Vars1=CP_Vars10,Globals33=Globals3)), + +%( +%append(D,All_predicate_numbers,All_predicate_numbers3), + %All_predicate_numbers3= + + % * vars1,3 + + Level2 is Level+1, + +%writeln1(append_cp(Choice_point_trail1,[[Pred_id,Level,Predicate_number,["returns to",Line_number_b],"predicate",Query, + %Vars2,All_predicate_numbers]],Choice_point_trail11) +%), + +%Line=[Function,Arguments], + + + +get_lang_word("v",Dbw_v1),Dbw_v1=Dbw_v, +%get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +%get_lang_word("call",Dbw_call1),Dbw_call1=Dbw_call, + +%find_pred_sm(Reserved_words1), +Line=Query1, + %((Query1=[[Dbw_n,Dbw_call],[Function,Arguments]] + %)->true; +%( +Query1=[Function,Arguments] +%) +%) +, + + ((Function=[Dbw_v,Function2], + not(reserved_word2(Function2)))-> + (append([Function],Arguments,Arguments1), + substitutevarsA1(Arguments1,Vars1,[],Vars3,[],FirstArgs), + + append(Globals3,[[[firstargs_uv2,Pred_id],FirstArgs]],Globals31), + 
del_append(Globals31,[[[vars1,Pred_id],Vars1]],Globals32), + + Vars3=[Function1|Vars31], + Query2=[Function1,Vars31], + + pred_numbers(Pred_numbers), + length(Arguments,Arity1), + %trace, + member([Function1,Arity1,Pred_numbers1],Pred_numbers), + findall([All_predicate_numbers0,"prev_pred_id",Pred_id],member(All_predicate_numbers0,Pred_numbers1),D) + %All_predicate_numbers01=[[All_predicate_numbers1,"prev_pred_id",Prev_pred_id]|All_predicate_numbers2] + + + ); + ( + substitutevarsA1(Arguments,Vars1,[],Vars3,[],FirstArgs), + + append(Globals3,[[[firstargs_uv2,Pred_id],FirstArgs]],Globals32), + + Query2=[Function,Vars3], + findall([All_predicate_numbers0,"prev_pred_id",Pred_id],member(All_predicate_numbers0,D0),D) + + ) + + + + + + + + ), + %interpret2(Query2,Functions0,Functions0,Result1), + +/* + updatevars2(FirstArgs,Result1,[],Vars5), + updatevars3(Vars1,Vars5,Vars6), + +*/ + D= + [All_predicate_numbers1|All_predicate_numbers2], + + All_predicate_numbers1=[All_predicate_numbers11,"prev_pred_id",Prev_pred_id], + + + +append_cp(Choice_point_trail1,[[Pred_id,Level,Predicate_number,["returns to",Line_number_b,"pred_id",Pred_id],"predicate",Query, + Vars1,All_predicate_numbers + %D + ]],Choice_point_trail11,CP_Vars1,CP_Vars3), + + %% ** + %(Pred_id=3->writeln(here1);true), + + %trace, +%cut_cps_if_necessary(Prev_pred_id,Choice_point_trail11,Choice_point_trail12,CP_Vars3,CP_Vars32,All_predicate_numbers11,Globals32) , +Choice_point_trail11=Choice_point_trail12, +CP_Vars3=CP_Vars32, + %notrace, + %writeln1(cut_cps_if_necessary(Prev_pred_id,Choice_point_trail11,Choice_point_trail12,CP_Vars3,CP_Vars32,All_predicate_numbers11,Globals32) ), + ssi1([["prev_pred_id",Prev_pred_id],Level2,All_predicate_numbers11,-1,"predicate",Query2, + Vars1,All_predicate_numbers2], + _End_result2, %don't need + Functions,Vars2, + Result1, Result2, % don't need + Globals32,Globals2, + Choice_point_trail12, + Choice_point_trail3, + CP_Vars32,CP_Vars2) + + +% append(Result3,[End_result],Result2) 
+ . + +cut_cps_if_necessary(Pred_id,Choice_point_trail11,Choice_point_trail2,CP_Vars1,CP_Vars2,Predicate_number,Globals3) :- + +cut_cps_if_necessary(Pred_id,Choice_point_trail11,Choice_point_trail2,CP_Vars1,CP_Vars2,Predicate_number,Globals3,check-rec). + +/* +cut_cps_if_necessary1(Pred_id,Choice_point_trail11,Choice_point_trail2,CP_Vars1,CP_Vars2,Predicate_number,Globals3) :- + +cut_cps_if_necessary(Pred_id,Choice_point_trail11,Choice_point_trail2,CP_Vars1,CP_Vars2,Predicate_number,Globals3,no-check-rec). +*/ + +cut_cps_if_necessary(Pred_id,Choice_point_trail11,Choice_point_trail2,CP_Vars1,CP_Vars2,Predicate_number,Globals3,CR_flag) :- + + % find pred_id group + +findall(Pred_ids,collect_connected_pred_ids(Pred_id,[Pred_id],Pred_ids,Predicate_number,Globals3),Pred_ids1), + +%(not(Pred_ids1=[[_]])->writeln1(cut_cps_if_necessary(Pred_id,Choice_point_trail11,Choice_point_trail2,CP_Vars1,CP_Vars2,Predicate_number,Globals3));true), + +flatten(Pred_ids1,Pred_ids1a), +sort(Pred_ids1a,Pred_ids2), + + %writeln([2,Pred_ids2]), + %(Pred_ids2=[117]->%writeln(*****************)% + %true + %turndebug(on) + %;true), + + % if no cp data for pred_id group, cut cps + findall([A,B2,C,D_Level,E_Predicate_number2,F_Line_number_a2,Pred_or_line,H,I,All_predicate_numbers2],(member([A,B2,C,D_Level,E_Predicate_number2,F_Line_number_a2,Pred_or_line,H,I,All_predicate_numbers2],Choice_point_trail11), + member(C,Pred_ids2)%,not(F_Line_number_a2= -1), + %not(F_Line_number_a2=["returns to", _, "pred_id", _]) + ),M), + ((CR_flag=check-rec->recursive_predicate(Pred_id,Pred_ids2,Globals3);true),(forall(member([A,B2,C,D_Level,E_Predicate_number2,F_Line_number_a2,Pred_or_line,H,I,All_predicate_numbers2],M), + + (Pred_or_line="line"-> + (All_predicate_numbers2=[_Ab,_Bb,_Cb,_Db,_Eb, + _Fb,Vars2c],not(Vars2c=[])); + Pred_or_line="predicate"-> + not(All_predicate_numbers2=[]))))-> + + +cut_cps(Choice_point_trail11,Choice_point_trail2,CP_Vars1,CP_Vars2,Pred_id,Predicate_number,Globals3); + 
+(Choice_point_trail11=Choice_point_trail2,CP_Vars1=CP_Vars2)). + +%recursive_predicate(A,A,_):- !. +recursive_predicate(A,B,_):-member(A,B),!. +recursive_predicate(A,B,Globals3):- + member([pred_id_chain,C,A],Globals3), + (member(C,B)->true; + (recursive_predicate(C,B,Globals3))),!. + +/* +recursive_predicate(A,_B,Predicate_number,Globals3):- + member([pred_id_chain,C,A],Globals3), + ((%member(C,B), + member([[pred_num,C],Predicate_number],Globals3))->true; + (recursive_predicate(C,_B,Globals3))),!. +*/ \ No newline at end of file diff --git a/SSI/e.pl b/SSI/e.pl new file mode 100644 index 0000000000000000000000000000000000000000..671227f73d1d1d7d8bd241175f5819fbf5518ce2 --- /dev/null +++ b/SSI/e.pl @@ -0,0 +1,197 @@ +e(Pred_id,Level,Predicate_number,Vars3,End_result,Functions,Vars2,Result1, Result2, + Globals1a,Globals2, + Choice_point_trail1, + Choice_point_trail3, + CP_Vars1,CP_Vars2 + ) :- +%trace, + +/* member([[function,Pred_id],Function],Globals1), + member([[arguments1,Pred_id],Arguments1],Globals1), + + member([[skip,Pred_id],Skip],Globals1), + +(debug_fail(Skip,[Function,Arguments1])->true;true), +*/ + +((not(Level=0))->( + + %trace, + Level2 is Level-1, + + (Level2 = 0 -> + + ( %(Pred_id=3->writeln(here2);true), + + % pred_id in following is _ (and _ for pred_id in ssi1 calls in e.pl because they are -3 line calls, and shouldn't pass previous pred ids to be added to pred id chain) + + ssi1([_,0,_Predicate_number,-3,"predicate",_Query_a, + Vars3,_All_predicate_numbers], _Result21, Functions,Vars2, + Result1, Result2, + Globals1a,Globals2, + Choice_point_trail1, + Choice_point_trail3, + CP_Vars1,CP_Vars2)); + + ( + + %Choice_point_trail1=Choice_point_trail11, + + %reverse(Choice_point_trail1,Choice_point_trail11), + +%writeln([choice_point_trail11,Choice_point_trail11]), +((%trace, +%writeln(here1), +/*writeln1(get_last_p_before_n(Choice_point_trail1, + [_Pred_id,Level1,Predicate_number2,Line_number2b,Pred_or_line,Query2,Vars4,All_predicate_numbers2], + 
[Cp_a,Cb_b,_Pred_id,Level1,Predicate_number2,Line_number2b,Pred_or_line,Query2,Vars4,All_predicate_numbers2],_)), + */ + + %writeln([choice_point_trail1,Choice_point_trail1]), + + Query2=[_|_], + %trace, +%writeln("*1"), + get_last_cp_before_n(Choice_point_trail1, + [Pred_id,Level1,Predicate_number2,Line_number2b,Pred_or_line,Query2,Vars4,All_predicate_numbers2], + [Cp_a,Cb_b,Pred_id,Level1,Predicate_number2,Line_number2b,Pred_or_line,Query2,Vars4,All_predicate_numbers2],_, + CP_Vars1,CP_Vars21) , + + member([[function,Pred_id],Function],Globals1a), + member([[arguments1,Pred_id],Arguments1],Globals1a), + + member([[skip,Pred_id],Skip],Globals1a), + + %delete(Globals1a,[[function,Pred_id],Function],Globals1), + Globals1a=Globals1, + +(debug_fail(Skip,[Function,Arguments1])->true;true) + + +%Query2=[AB1|AB2],not(var(AB1)),not(var(AB2)), %trace, + /* +writeln1(get_last_p_before_n(Choice_point_trail1, + [Pred_id,Level1,Predicate_number2,Line_number2b,Pred_or_line,Query2,Vars4,All_predicate_numbers2], + [Cp_a,Cb_b,Pred_id,Level1,Predicate_number2,Line_number2b,Pred_or_line,Query2,Vars4,All_predicate_numbers2],_)), + */ + +%member([_Pred_id,Level1,Predicate_number2,Line_number2b,Pred_or_line,Query2,Vars4,All_predicate_numbers2],Choice_point_trail11), +%(((Pred_or_line="predicate",not(All_predicate_numbers2=[]))->true;(Pred_or_line="line",%writeln1(All_predicate_numbers2),All_predicate_numbers2=[_,_,_,_,_,_,Vars2c],not(Vars2c=[])))) + + %[[n,member2],[[1,2,3],empty],_204342,_204348,[[[[[v,a],[1,2,3]],[[v,b],2]],[[1,2,3],2]],[[[[v,a],[1,2,3]],[[v,b],3]],[[1,2,3],3]]]] + +%writeln1(delete(Choice_point_trail1,[Level,Predicate_number2,Line_number2b,Pred_or_line,Query2,Vars4,All_predicate_numbers2],Choice_point_trail12)), + +/*writeln1(delete_cp(Choice_point_trail1,[Cp_a,Cb_b,_Pred_id,Level,Predicate_number2,Line_number2b,Pred_or_line,Query2,Vars4,All_predicate_numbers2],Choice_point_trail12)), +*/ + + +/* 
+writeln1(delete_cp(Choice_point_trail1,[Cp_a,Cb_b,_Pred_id,Level,Predicate_number2,Line_number2b,Pred_or_line,Query2,Vars4,All_predicate_numbers2],Choice_point_trail12)) +*/ +) +-> + + + (Pred_or_line="predicate"-> + + %(All_predicate_numbers2=[]-> + ( + + +%load_local_from_global_cp_trail(Pred_id,%Choice_point_trail1,Choice_point_trail1_new,CP_Vars21,CP_Vars22), + + delete_cp(Choice_point_trail1,[Cp_a,Cb_b,Pred_id,Level1,Predicate_number2,Line_number2b,Pred_or_line,Query2,Vars4,All_predicate_numbers2],Choice_point_trail12,CP_Vars21,CP_Vars3,_), + + %save_local_to_global_cp_trail(Choice_point_trail12,[],CP_Vars3111,CP_Vars3), + + +%*** + + %); + + + All_predicate_numbers2=[All_predicate_numbers3|All_predicate_numbers4], + + All_predicate_numbers3=[All_predicate_numbers31,"prev_pred_id",Prev_pred_id], + %trace, + + + delete_cp_value(Choice_point_trail12,[All_predicate_numbers31,"prev_pred_id",Prev_pred_id],Choice_point_trail13,CP_Vars3,CP_Vars31), + + + member([[level,Pred_id],L],Globals1), + %Level3 is Level+1, + %Level3 is Level, + + %(Pred_id=3->writeln(here3);true), + + % pred id in the following was _ + (Line_number2b = ["returns to", _, "pred_id", _] -> (Line_number2b1 = -1%,trace + ) ; Line_number2b1 = Line_number2b), + +ssi1([["prev_pred_id",Prev_pred_id],L,All_predicate_numbers31,Line_number2b1,"predicate",Query2, + Vars4,All_predicate_numbers4], End_result,Functions,Vars2, + Result1, Result2,%2, + Globals1,Globals2, + Choice_point_trail13, + Choice_point_trail3, + CP_Vars31,CP_Vars2) + +); + +(Pred_or_line="line" -> + +( +%trace, + +%*** loads new frame + +delete_until_last_cp(Choice_point_trail1,Choice_point_trail6,D1,AC,CP_Vars21,CP_Vars4), +%writeln1(delete_until_last_cp(Choice_point_trail1,Choice_point_trail6,D1,AC,CP_Vars21,CP_Vars4)), + + ( + D1=[_,_,Pred_id2,Level11,Predicate_number11,Line_number_a11,"line",-, + Vars2d11,Vars2e11], + + %writeln1([d1,D1]), + + ssi1([Pred_id2,Level11,Predicate_number11,Line_number_a11,"line",-, + Vars2d11,Vars2e11], 
_, Functions,Vars2, + Result1, Result2, + Globals1,Globals2, + Choice_point_trail6, + Choice_point_trail3,["appearance of command",AC], + CP_Vars4,CP_Vars2) + )))); + +%**** + ( + /* + %reverse(Globals1,Globals3), + member([[firstargs,Pred_id],FirstArgs],Globals1), %*delete, where was pred called from? - prev level in cps + delete(Globals1,[[firstargs,Pred_id],FirstArgs],Globals41), + member([[function,Pred_id],Function],Globals41), + delete(Globals41,[[function,Pred_id],Function],Globals51), + member([[arguments1,Pred_id],Arguments1],Globals51), + delete(Globals51,[[arguments1,Pred_id],Arguments1],Globals61), + member([[skip,Pred_id],Skip],Globals61), + delete(Globals61,[[skip,Pred_id],Skip],Globals71), + member([[level,Pred_id],Level],Globals71), + delete(Globals71,[[level,Pred_id],Level],Globals222), + + %reverse(Globals21,Globals222), +*/ + +%(debug_fail(Skip,[Function,Arguments1])->true;true), + +%writeln([globals1a,Globals1a]), + +ssi1([Pred_id,Level,Predicate_number,-3,"predicate",-, + [],_All_predicate_numbers2], End_result,Functions,Vars2, + Result1, Result2,%2, + Globals1a,Globals2, % *** Globals1->Globals222 + Choice_point_trail1, + Choice_point_trail3, + CP_Vars1,CP_Vars2)) +))))). \ No newline at end of file diff --git a/SSI/end_nested_findall.pl b/SSI/end_nested_findall.pl new file mode 100644 index 0000000000000000000000000000000000000000..4e618920108c1798b1295bf76e264f9ad5d234e9 --- /dev/null +++ b/SSI/end_nested_findall.pl @@ -0,0 +1,275 @@ +end_nested_findall(FA,Pred_id,Level,Predicate_number,Line_number_b,Choice_point_trail1,Choice_point_trail2,Vars1,Vars2,CP_Vars1,CP_Vars2,Functions,Globals1,Globals2,Result1, Result2,End_line42) :- + % if there are no more choice points after the current findall, then pass back results. 
+ + % [1,2,1,1,0,-1,"predicate",[[n,query_box_1],[[v,b]]],[[[v,b],empty]],[]],[2,3,1,1,0,["returns to",0,"pred_id",1],"predicate",-,[[[v,b],empty]],[]],[3,4,2,2,1,-1,"predicate",[[n,findall1],[[[1,2],[3,4]],[v,b]]],[[[v,a],[[1,2],[3,4]]],[[v,b],empty]],[]],[4,5,2,2,1,0,"findall",-,[old_vars,[[[v,a],[[1,2],[3,4]]],[[v,b],empty]]],[findall_vars,[empty]],[format_vars,[v,b1]],[result_var,[v,b]]],[5,6,2,2,1,2,"line",_536118,_536124,[[n,member2],[[[1,2],[3,4]],empty],_462556,_462562,_462568,_462574,[]]],[6,7,2,2,1,3,"findall",-,[old_vars,[[[v,a],[[1,2],[3,4]]],[[v,b],empty],[[v,a1],[3,4]]]],[findall_vars,[8]],[format_vars,[v,a3]],[result_var,[v,b1]]],[7,8,2,2,1,5,"line",_577278,_577284,[[n,member2],[[3,4],empty],_557636,_557642,_557648,_557654,[]]],[8,9,2,2,1,[findall_exit_function,3],"predicate",-,[[[v,a],[[1,2],[3,4]]],[[v,b],empty],[[v,a1],[1,2]],[[v,a2],2],[[v,a3],7],[[v,b1],[6,7]]],[[n,member2],[[1,2],empty],_482002,_482008,_482014,_482020,[]]],[9,10,2,2,1,5,"line",_499516,_499522,[[n,member2],[[1,2],empty],_482002,_482008,_482014,_482020,[]]] + +/* + +[4,5,2,2,1,0,"findall",-,[old_vars,[[[v,a],[[1,2],[3,4]]],[[v,b],empty]]],[findall_vars,[empty]],[format_vars,[v,b1]],[result_var,[v,b]]] + +[6,7,2,2,1,3,"findall",-,[old_vars,[[[v,a],[[1,2],[3,4]]],[[v,b],empty],[[v,a1],[3,4]]]],[findall_vars,[8]],[format_vars,[v,a3]],[result_var,[v,b1]]], + [8,9,2,2,1,[findall_exit_function,3],"predicate",-,[[[v,a],[[1,2],[3,4]]],[[v,b],empty],[[v,a1],[1,2]],[[v,a2],2],[[v,a3],7],[[v,b1],[6,7]]],[[n,member2],[[1,2],empty],_482002,_482008,_482014,_482020,[]]] + +x + + +[[4,5,2,2,1,0,"findall",-,[old_vars,[[[v,a],[[[1,2,3,4]]]],[[v,b],empty]]],[findall_vars,[]],[format_vars,[v,b1]],[result_var,[v,b]]], + +[6,7,2,2,1,3,"findall",-,[old_vars,[[[v,a],[[[1,2,3,4]]]],[[v,b],empty],[[v,a1],[[1,2,3,4]]]]],[findall_vars,[]],[format_vars,[v,b2]],[result_var,[v,b1]]], + 
+[8,9,2,2,1,6,"findall",-,[old_vars,[[[v,a],[[[1,2,3,4]]]],[[v,b],empty],[[v,a1],[[1,2,3,4]]],[[v,a2],[1,2,3,4]]]],[findall_vars,[5,6,7]],[format_vars,[v,a4]],[result_var,[v,b2]]]] + +*/ + +%(cp_since_findall_start(Choice_point_trail1,Level1,_D10,E1,[%A1c,A2c, +%Ac, Bc, Cc, Dc, "findall", -, +%[old_vars,Ec],[findall_vars,Fc],[format_vars,Gc],[result_var,Hc]],CP_Vars5a, +%CP_Vars5a1)-> + +% put end_nested_findall at end of process cp x +% checks for fa/cp -> process cp ; end_nested_findall, which goes to ssi when cp + +%ssi; +%( + +%trace, + +get_last_p_before_n(Choice_point_trail1,[_,Level1,Predicate_number1,Line_number_a,"findall",-,[old_vars,Old_vars],[findall_vars,Findall_vars],[format_vars,Format_vars],[result_var,Result_var]], + [Cp_b1,Cb_b2,_,Level1,Predicate_number1,Line_number_a,"findall",-,[old_vars,Old_vars],[findall_vars,Findall_vars],[format_vars,Format_vars],[result_var,Result_var]],_,CP_Vars1,CP_Vars4), + D1=[Cp_b1,Cb_b2,_,Level1,Predicate_number1,_Line_number_a,"findall",-,[old_vars,Old_vars],[findall_vars,Findall_vars],[format_vars,Format_vars],[result_var,Result_var]], + + get_lang_word("v",Dbw_v1),Dbw_v1=Dbw_v, + + remember_and_turn_off_debug(Debug), + +find_sys(Sys_name), + + %trace, + interpretpart(match4,Format_vars,[Dbw_v,Sys_name],Vars1,Vars2fa,_), + +%trace, +%writeln([*,FA]), + (not(FA=fail)->getvalue([Dbw_v,Sys_name],Value3,Vars2fa); + is_empty(Value3)), + + +(is_empty(Value3)-> Findall_vars=Findall_vars2; append(Findall_vars,[Value3],Findall_vars2)), + + + %delete_cp(Choice_point_trail1,D1,Choice_point_trail3,CP_Vars4,CP_Vars3,_), + Choice_point_trail1=Choice_point_trail3, + CP_Vars4=CP_Vars3, + + interpretpart(match4,Result_var,Findall_vars2,[]%Vars1 + ,Vars2fb,_), + + turn_back_debug(Debug), + + getvalue(Result_var,Value4,Vars2fb), + + %trace, + subtract(Old_vars,[[Result_var,_]],Vars2fd), + + append(Vars2fd,[[Result_var,Value4]],Vars2fc),%Vars2fc1), + +%trace, + 
+%findall([Var_x,Val_x],(member([Var_x,Val_x],Vars2fc1),not(member([Var_x,_],Format_vars))),Vars2fc), + +%trace, + +reverse(Choice_point_trail3,Choice_point_trail4), + +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("findall",Dbw_findall1),Dbw_findall1=Dbw_findall, + +%trace, +(Line_number_b=[_,Line_number_b2]->Line_number_b2=_Line_number_b1; +true%Line_number_b2=Line_number_b +), + +%writeln(hered), + +/* +writeln1([member,[skip,Pred_id,Line_number_b2],Skip%,Globals10 +]), + + member([[skip,Pred_id,Line_number_b2],Skip],Globals1), + %Globals10=Globals1, + + %delete(Globals10,[[skip,Pred_id,Line_number_b2],Skip],Globals1), +%trace, +debug_exit(Skip,[[Dbw_n,Dbw_findall]]), +*/ + +%trace, +((get_lang_word("findall_exit_function",Dbw_findall_exit_function1),Dbw_findall_exit_function1=Dbw_findall_exit_function, +%trace, +(member_cut1([_Ad,_Bd,_Cf,_Ef,_Ff,[Dbw_findall_exit_function,_Gf],"line",-,_Vars11|_],Choice_point_trail4)->true;true), + +%writeln1(member_cut1([_Ad,_Bd,Cf,Ef,Ff,[findall_exit_function,Gf],"line",-,Vars11|_],Choice_point_trail4)), + +member_cut1([_A1c,_A2c, +_Ac, _Bc, _Cc, _Dc, "findall", -, +[old_vars,Ec],[findall_vars,_Fc],[format_vars,_Gc],[result_var,Hc]], +Choice_point_trail4), + +%trace, + +/* + remember_and_turn_off_debug(Debug3), +find_sys(Sys_name3), + interpretpart(match4,Hc,[Dbw_v,Sys_name3],Vars11,Vars2fa1,_), + + getvalue([Dbw_v,Sys_name3],Value31,Vars2fa1), + */ + %(Findall_vars2=[empty]-> Findall_vars4=[];Findall_vars4=Findall_vars2), + +%append(Findall_vars,[Value31],Findall_vars41), +%Findall_vars41=[Value31], +Findall_vars41=Value4, + +%append([Value31],[Value4],Value41), + +append(Ec,[[Hc,Findall_vars41]],Result42), + + %turn_back_debug(Debug3), + + + + +% run end_nested_findall when about to end findall +% copy end findall in ssi3 +% delete the rest of nested findall in ssi3 +% exits end_nested_findall when 1 findall left + +not(cp_since_findall_start(Choice_point_trail3,_,_,_,_,CP_Vars3,_)), + 
+flush_after_last_findall(Choice_point_trail3,Choice_point_trail51,CP_Vars3,CP_Vars311), +%trace, +delete_cp(Choice_point_trail51,D1,Choice_point_trail5,CP_Vars311,CP_Vars411,_), + +%(( +%trace, +reverse(Choice_point_trail5,Choice_point_trail52), +%trace, +member_cut1([_,_,_,_,_,_,"findall",-|_],Choice_point_trail52) +) + +%cp_since_findall_start(Choice_point_trail5,Level1,_D10,E1,_D1,CP_Vars41,CP_Vars4) +-> +(%writeln(here1),writeln([line_number_a,_Line_number_a]), +%trace, +%writeln(herea), + +%writeln1([member,[skip,Pred_id,Line_number_a],Skip%,Globals10 +%]), + + member([[skip,Pred_id,Line_number_a],Skip],Globals1), + %Globals10=Globals1, + + %delete(Globals10,[[skip,Pred_id,Line_number_b2],Skip],Globals1), +%trace, +debug_exit(Skip,[[Dbw_n,Dbw_findall],[Format_vars,"...",Value4]]), + + +end_nested_findall(exit,Pred_id,Level,Predicate_number,Line_number_b,Choice_point_trail5,Choice_point_trail2,Result42,Vars2,CP_Vars411,CP_Vars2,Functions,Globals1,Globals2,Result1, Result2,End_line42)); +(%trace, +(%writeln(hereb), + +cp_since_findall_start(Choice_point_trail3,Level,_,E100,D100,CP_Vars3,CP_Vars3111)-> + +( +/* +ssi1([Pred_id,Level,Predicate_number,Line_number_c%Line_number_c + ,"line",Query, + Vars1,%Jc,%Old_vars, + All_predicate_numbers], End_result, Functions,Vars2, + Result1, Result2, + Globals3,Globals2, + Choice_point_trail3, % Choice_point_trail11 to Choice_point_trail1a + Choice_point_trail2, + CP_Vars3,CP_Vars2) + */ + + %trace, + E100=[_C_Pred_id2,_C_Level3,_C_Predicate_number2,C_Line_number_a2,_C_Pred_or_line%"line" + ,_C_Query_a2,_CCPV0,_CCPV3], + +%trace, +%writeln([choice_point_trail3,Choice_point_trail3]), + + process_cp(C_Line_number_a2,%exit% + FA + ,D100,E100, + +_, + +Vars1, + _End_result, Functions,Vars2, %% CPVs here?**** CPV0,CPV to CPV1,CPV2 + Result1, Result2, + Globals1,Globals2, + Choice_point_trail3, + Choice_point_trail2, + CP_Vars3111,CP_Vars2 + %*/ + ) + + ); + +(%writeln(herec), +%/* 
+%writeln1([member,[skip,Pred_id,Line_number_a],Skip%,Globals10 +%]), + + member([[skip,Pred_id,Line_number_a],Skip1],Globals1), + %Globals10=Globals1, + + %delete(Globals10,[[skip,Pred_id,Line_number_b2],Skip],Globals1), +%trace, +debug_exit(Skip1,[[Dbw_n,Dbw_findall],[Format_vars,"...",Value4]]), +%*/ +%trace, +((%var(Line_number_b), +number(End_line42))->Line_number_b3=End_line42;Line_number_b3=Line_number_b), +%notrace, +exit_findall_line(Pred_id,Globals1,Predicate_number,Line_number_b3,Functions,Line_number_c), + + +flush_after_last_findall(Choice_point_trail3,Choice_point_trail51,CP_Vars1,CP_Vars311), + + +%trace, +get_last_p_before_n(Choice_point_trail51,[_,_Level_f,_Predicate_number_f,_Line_number_a_f,"findall",-|Rest_f], + [Cp_b1_f,Cb_b2_f,_,_Level_f,_Predicate_number_f,_Line_number_a_f,"findall",-|Rest_f],_,CP_Vars1,CP_Vars4), + D1_f=[Cp_b1_f,Cb_b2_f,_,_Level_f,_Predicate_number_f,_Line_number_a_f,"findall",-|Rest_f], + +delete_cp(Choice_point_trail51,D1_f,Choice_point_trail52,CP_Vars311,CP_Vars411,_), + + +%member([0,["on true",_],["go after",Findall_end_line2]|_],Lines2), + + ssi1([Pred_id,Level,Predicate_number,Line_number_c%Line_number_c + ,"line",_Query, + Vars2fc,%Jc,%Old_vars, + _All_predicate_numbers], _End_result, Functions,Vars2, + Result1, Result2, + Globals1,Globals2, + Choice_point_trail52, % Choice_point_trail11 to Choice_point_trail1a + Choice_point_trail2, + CP_Vars411,CP_Vars2) + + /* +Choice_point_trail3=Choice_point_trail2, +Vars1=Vars2, +CP_Vars3=CP_Vars2, +Cd=Ce,Ed=Ee,Fd=Fe,Gd=Ge, +Globals1=Globals2 +*/ +)))) +. 
+ +% * put in findall exit with vars in process CP + +% *** test whether cps before process cps or end nested findall, call this pred at end of end nested findall +% x just p cp ; e n fa, call ssi after e n fa \ No newline at end of file diff --git a/SSI/file.txt b/SSI/file.txt new file mode 100644 index 0000000000000000000000000000000000000000..8395beeca8e90b28f98b775d18f6fcbbd8c71907 --- /dev/null +++ b/SSI/file.txt @@ -0,0 +1 @@ +["Name","a"]. \ No newline at end of file diff --git a/SSI/find_pred_sm.pl b/SSI/find_pred_sm.pl new file mode 100644 index 0000000000000000000000000000000000000000..8abca0aacdeec3cbd0497b047ba7530223a34e12 --- /dev/null +++ b/SSI/find_pred_sm.pl @@ -0,0 +1,102 @@ +/** + +cd SSI +swipl +['../listprologinterpreter/listprolog.pl']. +assertz(lang("en2")). +load_lang_db. +['find_pred_sm.pl']. +find_pred_sm(RW,"en"),writeln1(RW). + +["+","-","*","/","abort","any","append","atom","brackets","call","ceiling","code","creep","cut","date","delete","equals1","equals2","equals3","equals4","exit","fail","grammar","head","is","length","letters","list","member","member2","n","not","number","or","predicatename","random","round","skip","string","string_from_file","stringconcat","stringtonumber","sys","findall_sys","t","tail","true","unwrap","v","variable","vgp","wrap","input","output","string_length","sort","intersection","read_string","writeln","atom_string","trace","notrace","sqrt","notrace"] + +**/ + +/* +find_pred_sm(Reserved_words1) :- + 
Reserved_words1=["+","-","*","/","abort","any","append","atom","brackets","call","ceiling","code","creep","cut","date","delete","equals1","equals2","equals3","equals4","equals4_on","equals4_off","exit","fail","grammar","head","is","length","letters","list","member","member2","member3","n","not","number","or","predicatename","random","round","skip","string","string_from_file","stringconcat","stringtonumber","sys","findall_sys","t","tail","true","unwrap","v","variable","vgp","wrap","input","output","string_length","sort","intersection","read_string","writeln","atom_string","trace","notrace","sqrt","notrace","get_lang_word" +,"on_true","go_after","on_false","go_to_predicates", +"exit_function","fail_function","findall_exit_function"]. + + */ +% list of reserved words from lang dict + +/** + lang(Lang1), + %(Lang1="en"->Lang2="en2";Lang2=Lang1), + lang_db(Lang_DB), + findall(En_word3, + (member([En_word,En_word2,Lang1,_T_word], + Lang_DB), + (Lang2="en"->En_word3=En_word;En_word3=En_word2), + not(member(En_word,["findall","maplist"]))), + % these reserved words are taken out to be run as list prolog algorithms + Reserved_words1). 
+**/ + +/** + +% keeps list of pred numbers of clauses for a predicate (run after finding predicate and line numbers and before finding state machine) + +find_pred_numbers([[0,[n,downpipe],[[v,a],[v,a],[v,b]]],[1,[n,downpipe],[[v,a],[v,b],[v,c]],":-",[[0,[n,member2],[[v,c],[v,c1]]],[1,[n,equals1],[[v,c1]]],[2,[n,equals1],[[v,c12]]],[3,[n,"->"],[[4,[n,>],[[v,a],[v,c121]]],[5,[n,downpipe],[[v,c121],[v,b],[v,c]]],[6,[n,"->"],[[7,[n,>],[[v,a],[v,c122]]],[8,[n,downpipe],[[v,c122],[v,b],[v,c]]],[9,[n,fail]]]]]]]]], +["+","-","*","/","abort","any","append","atom","brackets","call","ceiling","code","creep","cut","date","delete","equals","equals2","equals3","equals4","exit","fail","grammar","head","is","length","letters","list","member","member2","n","not","number","or","predicatename","random","round","skip","string","string_from_file","stringconcat","stringtonumber","sys","findall_sys","t","tail","true","unwrap","v","variable","vgp","wrap","input","output","string_length","sort","intersection","read_string","writeln","atom_string","trace","notrace","sqrt","notrace"], +Pred_numbers),writeln1(Pred_numbers). + +Pred_numbers = [[[n, downpipe], 3, [0, 1]]]. 
+ +**/ + +find_pred_numbers(Functions,_Reserved_words,Pred_numbers) :- +%trace, + findall([Pred_name1,Arity1],( + + %member([_Pred_number1,Pred_name1|Arguments_Body],Functions),(Arguments=":-"->Arity1=0;length(Arguments,Arity1)) + + member([Pred_number2,Pred_name1|Rest],Functions), +pred_rest(Arity1,Rest,_) + /* +(Rest=[Args,":-",Lines]->length(Args,Arity1); +(Rest=[Args]->(Lines=[[[Dbw_n,Dbw_true]]],length(Args,Arity1)); +(Rest=[":-",Lines]->Arity1=0; +(Rest=[[Dbw_n,_],Args,Lines]->length(Args,Arity1); +(Rest=[],Lines=[[[Dbw_n,Dbw_true]]],Arity1=0))))) + */ + + ),Unique_predicates1), + sort(Unique_predicates1,Unique_predicates2), + + % functions may contain a reserved word: + %forall((member([Pred_name3,_Arity3],Unique_predicates2),not(member(Pred_name3,Reserved_words))),(term_to_atom(Pred_name3,Pred_name31),concat_list(["Error: Functions contains reserved word: ",Pred_name31,"."],Note),writeln(Note),abort)), + %trace, + findall([Pred_name2,Arity1,Pred_number3],(member([Pred_name2,Arity1],Unique_predicates2),findall(Pred_number2,( + + +%member([Pred_number2,Pred_name2,Arguments2|_],Functions),length(Arguments2,Arity2) + + + member([Pred_number2,Pred_name2|Rest],Functions), + pred_rest(Arity1,Rest,_) +/* +(Rest=[Args,":-",Lines]->length(Args,Arity1); +(Rest=[Args]->(Lines=[[[Dbw_n,Dbw_true]]],length(Args,Arity1)); +(Rest=[":-",Lines]->Arity1=0; +(Rest=[[Dbw_n,_],Args,Lines]->length(Args,Arity1); +(Rest=[],Lines=[[[Dbw_n,Dbw_true]]],Arity1=0))))) +*/ + + +),Pred_number3)),Pred_numbers). + +% stores these in pred calls in the sm - see ssi_find_sm.pl + +pred_rest(Arity1,Rest,Lines) :- + +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("true",Dbw_true1),Dbw_true1=Dbw_true, + +(Rest=[Args,":-",Lines]->length(Args,Arity1); +(Rest=[Args]->(Lines=[[[Dbw_n,Dbw_true]]],length(Args,Arity1)); +(Rest=[":-",Lines]->Arity1=0; +(Rest=[[Dbw_n,_],Args|Lines]->length(Args,Arity1); +(Rest=[],Lines=[[[Dbw_n,Dbw_true]]],Arity1=0))))),!. 
+ diff --git a/SSI/flush_after_last_findall.pl b/SSI/flush_after_last_findall.pl new file mode 100644 index 0000000000000000000000000000000000000000..535553c415e0da1e1db9017698d1c0ad8854c5a7 --- /dev/null +++ b/SSI/flush_after_last_findall.pl @@ -0,0 +1,33 @@ +flush_after_last_findall(Choice_point_trail1,Choice_point_trail2,CP_Vars1,CP_Vars2) :- + + +reverse(Choice_point_trail1,Choice_point_trail14), + member([A1,A2,_,_,_,_,"findall",_|_],Choice_point_trail14), + %_D11=[_,_,_,_,"findall",_|_], +%trace, + get_later_cps_than_cp11(Choice_point_trail1, + [A1,A2,_,_,_,_,"findall",-|_], + _D1,B), + +%trace, + +delete_cp2(Choice_point_trail1,B,Choice_point_trail2,CP_Vars1,CP_Vars2). + %subtract(Choice_point_trail1,B,Choice_point_trail2). + +delete_cp2(List1,[],List1,Cp_vars1,Cp_vars1) :- !. + delete_cp2(List1,B,List2,Cp_vars1,Cp_vars2) :- + B=[B1|B21], delete_cp(List1,B1,List3,Cp_vars1,Cp_vars3,Swaps), + replace_cps(Swaps,B21,[],B22), + delete_cp2(List3,B22,List2,Cp_vars3,Cp_vars2). + + +replace_cps(_,[],B,B) :- !. +replace_cps(Swaps,B21,B23,B22) :- + B21=[B31|B32], + (member([B31,B33],Swaps)-> + append(B23,[B33],B34); + append(B23,[B31],B34)), + replace_cps(Swaps,B32,B34,B22). + + + diff --git a/SSI/input.txt b/SSI/input.txt new file mode 100644 index 0000000000000000000000000000000000000000..075c842d1b8cdbc5f791e44aa60e3b9639d3b452 --- /dev/null +++ b/SSI/input.txt @@ -0,0 +1 @@ +"a" \ No newline at end of file diff --git a/SSI/interpretstatement3.pl b/SSI/interpretstatement3.pl new file mode 100644 index 0000000000000000000000000000000000000000..6fb6ebb14be4bdc4e79bff9dafdd38c62f190625 --- /dev/null +++ b/SSI/interpretstatement3.pl @@ -0,0 +1,32 @@ + +%** debug displays in not, findall +interpretstatement3(ssi,_,_,[[Dbw_n,"[]"]|_],Vars,Vars,_Result21,_Cut,_,Skip) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +Skip=false. + %debug_call(Skip,[[Dbw_n,"[]"]]). + + %debug_call(Skip,[[n,"[]"]]). 
+interpretstatement3(ssi,_,_,[[Dbw_n,Dbw_not]|_],Vars,Vars,_Result21,_Cut,_,Skip) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("not",Dbw_not1),Dbw_not1=Dbw_not, + debug_call(Skip,[[Dbw_n,Dbw_not]]). + +interpretstatement3(ssi,_,_,[[Dbw_n,Dbw_or]|_],Vars,Vars,_Result21,_Cut,_,Skip) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("or",Dbw_or1),Dbw_or1=Dbw_or, +Skip=false. + %debug_call(Skip,[[Dbw_n,Dbw_or]]). + +interpretstatement3(ssi,_,_,[[Dbw_n,"->"]|_],Vars,Vars,_Result21,_Cut,_,Skip) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +Skip=false. + %debug_call(Skip,[[Dbw_n,"->"]]). + +interpretstatement3(ssi,_,_,[[Dbw_n,Dbw_findall]|Args],Vars,Vars,_Result21,_Cut,_,Skip) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("findall",Dbw_findall1),Dbw_findall1=Dbw_findall, +%Skip=false. + %trace, + debug_call(Skip,[[Dbw_n,Dbw_findall]|Args]). + +%% ** findall/2 is done above diff --git a/SSI/li.txt b/SSI/li.txt new file mode 100644 index 0000000000000000000000000000000000000000..0637a088a01e8ddab3bf3fa98dbe804cbde1a0dc --- /dev/null +++ b/SSI/li.txt @@ -0,0 +1 @@ +[] \ No newline at end of file diff --git a/SSI/only_ssi_verify4.pl b/SSI/only_ssi_verify4.pl new file mode 100644 index 0000000000000000000000000000000000000000..7c21d8b673f55107cca084a30973098d86b5d331 --- /dev/null +++ b/SSI/only_ssi_verify4.pl @@ -0,0 +1,473 @@ +%% only_ssi_test(Debug[on/off],Total,Score). + +%%:- use_module(library(time)). + +%% only_ssi_test cases, Debug=trace=on or off, NTotal=output=total cases, Score=output=result + +only_ssi_test(Debug,NTotal,Score) :- only_ssi_test(Debug,0,NTotal,0,Score),!. +only_ssi_test(_Debug,NTotal,NTotal,Score,Score) :- NTotal=3, !. 
+only_ssi_test(Debug,NTotal1,NTotal2,Score1,Score2) :- + NTotal3 is NTotal1+1, + only_ssi_test(NTotal3,Query,Functions,Result), + ((international_lucianpl([lang,"en"],Debug,Query,Functions,Result1) + %,writeln1([Result,Result1]) + ,Result=Result1) + ->(Score3 is Score1+1,writeln0([only_ssi_test,NTotal3,passed]));(Score3=Score1,writeln0([only_ssi_test,NTotal3,failed]))), + writeln0(""), + only_ssi_test(Debug,NTotal3,NTotal2,Score3,Score2),!. + +%% only_ssi_test individual cases, Debug=trace=on or off, N=case number, Passed=output=result + +only_ssi_test1(Debug,N,Passed) :- + only_ssi_test(N,Query,Functions,Result), + ((international_lucianpl([lang,"en"],Debug,Query,Functions,Result1), + %writeln1([result1,Result1]), + Result=Result1 + )->(Passed=passed,writeln0([only_ssi_test,N,passed]));(Passed=failed,writeln0([only_ssi_test,N,failed]))),!. + + +only_ssi_test(1,[[n,findall],[[[v,a],[v,b]],[[n,appendz],[[v,a],[v,b],[1,2,3]]],[v,c]]], +[ +[[n,appendz],[[],[v,a],[v,a]]%,":-",[[[n,true]]] +], +[[n,appendz],[[[v,a],"|",[v,d]],[v,b],[[v,a],"|",[v,c]]],":-", +[ +[[n,appendz],[[v,d],[v,b],[v,c]]]%,[[n,cut]] +] +]] +,[[[[v,c],[[[],[1,2,3]],[[1],[2,3]],[[1,2],[3]],[[1,2,3],[]]]]]]). 
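Test 1 above asks `findall` to enumerate every solution of `appendz([v,a],[v,b],[1,2,3])`; the expected `[v,c]` binding lists all four prefix/suffix splits. A short Python sketch (illustrative only; `appendz_splits` is an invented name) reproduces that enumeration, in the same order the clauses fire — the base clause (`[] ++ A = A`) first, then each recursive step moving one element into the prefix:

```python
def appendz_splits(xs):
    """Yield every (prefix, suffix) split of xs, mirroring
    appendz/3 run with its third argument bound."""
    return [(xs[:i], xs[i:]) for i in range(len(xs) + 1)]

splits = appendz_splits([1, 2, 3])
# Corresponds to the expected result in only_ssi_test(1, ...):
# [[[],[1,2,3]], [[1],[2,3]], [[1,2],[3]], [[1,2,3],[]]]
```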
+ +only_ssi_test(2,[[n,family_test]], % compatible with ssi, requires more memory + +[ +[[n,parent],[albert,jim]], +[[n,parent],[albert,peter]], +[[n,parent],[jim,brian]], +[[n,parent],[peter,lee]], +[[n,parent],[peter,john]], +[[n,year_of_birth],[albert,1926]], +[[n,year_of_birth],[peter,1945]], +[[n,year_of_birth],[jim,1949]], +[[n,year_of_birth],[brian,1974]], +[[n,year_of_birth],[john,1974]], +[[n,year_of_birth],[lee,1975]], +[[n,gender],[albert,male]], +[[n,gender],[jim,male]], +[[n,gender],[peter,male]], +[[n,gender],[brian,male]], +[[n,gender],[lee,male]], +[[n,gender],[john,male]], +[[n,grandparent],[[v,gg]],":-", +[ + [[n,findall], + [ + [[v,a],[v,b]], + + [[n,parent],[[v,a],[v,b]]], + + [v,c] + ]], + [[n,findall], + [ + [[v,grandparent],[v,grandchild]], + + [ + [[n,member],[[[v,grandparent],[v,child]],[v,c]]], + [[n,member],[[[v,child],[v,grandchild]],[v,c]]] + ], + + [v,gg] + ]], + [[n,cut]] +]], +[[n,comment],["% older(A,B)"]], +[[n,comment],["% means A is older than B"]], +[[n,comment],["%"]], +[[n,older],[[v,gg]],":-", +[ + [[n,findall], + [ + [[v,a],[v,b]], + + [[n,year_of_birth],[[v,a],[v,b]]], + + [v,c] + ]], + [[n,findall], + [ + [[v,a1],[v,b1]], + + [ + [[n,member],[[[v,a1],[v,y1]],[v,c]]], + [[n,member],[[[v,b1],[v,y2]],[v,c]]], + [[n,>],[[v,y2],[v,y1]]] + ], + + [v,gg] + ]], + [[n,cut]] +]], +[[n,comment],["% siblings(A,B)"]], +[[n,comment],["% means A and B are siblings"]], +[[n,comment],["%"]], +[[n,siblings],[[v,gg]],":-", +[ + [[n,findall], + [ + [[v,a],[v,b]], + + [[n,parent],[[v,a],[v,b]]], + + [v,c] + ]], + [[n,findall], + [ + [[v,a1],[v,b1]], + + [ + [[n,member],[[[v,x],[v,a1]],[v,c]]], + [[n,member],[[[v,x],[v,b1]],[v,c]]], + [[n,not],[[[n,=],[[v,a1],[v,b1]]]]] + ], + + [v,gg] + ]], + [[n,cut]] +]], +[[n,comment],["% A & B share a common parent"]], +[[n,comment],["%"]], +[[n,comment],["% A is different from B (Bratko, p175)"]], +[[n,comment],["% sibling_list(Child, Siblings)"]], +[[n,comment],["% Siblings is a list of A1's brothers and 
sisters"]], +[[n,comment],["%"]], +[[n,sibling_list],[[v,a1],[v,siblings]],":-", +[ + [[n,findall], + [ + [[v,a],[v,b]], + + [[n,parent],[[v,a],[v,b]]], + + [v,c] + ]], + [[n,findall], + [ + [[v,a1],[v,b1]], + + [ + [[n,member],[[[v,x],[v,a1]],[v,c]]], + [[n,member],[[[v,x],[v,b1]],[v,c]]], + [[n,not],[[[n,=],[[v,a1],[v,b1]]]]] + ], + + [v,list] + ]], + [[n,remove_duplicates],[[v,list],[v,siblings]]], + [[n,cut]] +]], +[[n,comment],["% remove_duplicates(List, Result)"]], +[[n,comment],["%"]], +[[n,comment],["% Removes duplicate entries in a list"]], +[[n,comment],["%"]], +[[n,remove_duplicates],[[],[]],":-", +[ + [[n,cut]] +]], +[[n,remove_duplicates],[[[[v,x1],[v,x2]],"|",[v,rest]],[[[v,x1],[v,x2]],"|",[v,result]]],":-", +[ + [[n,delete],[[v,rest],[[v,x1],[v,x2]],[v,r1]]], + [[n,delete],[[v,r1],[[v,x2],[v,x1]],[v,r2]]], + [[n,remove_duplicates],[[v,r2],[v,result]]], + [[n,cut]] +]], +[[n,comment],["% older_brother(A,B)"]], +[[n,comment],["% means A is an older brother of B"]], +[[n,comment],["%"]], +[[n,older_brother],[[v,gg]],":-", +[ + [[n,older_brother1],[[v,p],[v,y],[v,g]]], + [[n,findall], + [ + [[v,a],[v,b],[v,c1],[v,b3]], + + [ + [[n,member],[[[v,a],[v,b]],[v,p]]], + [[n,member],[[[v,b],[v,c1]],[v,y]]], + [[n,member],[[[v,b],[v,b3]],[v,g]]] + ], + + [v,c] + ]], + [[n,findall], + [ + [[v,a1],[v,b1]], + + [ + [[n,member],[[[v,x],[v,a1],[v,c2],male],[v,c]]], + [[n,member],[[[v,x],[v,b1],[v,c21],[v,'_']],[v,c]]], + [[n,>],[[v,c2],[v,c21]]] + ], + + [v,gg] + ]], + [[n,cut]] +]], +[[n,older_brother1],[[v,p],[v,y],[v,g]],":-", +[ + [[n,findall], + [ + [[v,a],[v,b]], + + [[n,parent],[[v,a],[v,b]]], + + [v,p] + ]], + [[n,findall], + [ + [[v,a],[v,b]], + + [[n,year_of_birth],[[v,a],[v,b]]], + + [v,y] + ]], + [[n,findall], + [ + [[v,a],[v,b]], + + [[n,gender],[[v,a],[v,b]]], + + [v,g] + ]], + [[n,cut]] +]], +[[n,comment],["% descendant(Person, Descendant)"]], +[[n,comment],["% means Descendant is a descendant of Person."]], +[[n,comment],["%"]], 
+[[n,descendant],[[v,a1],[v,b1]],":-", +[ + [[n,findall], + [ + [[v,a],[v,b]], + + [[n,parent],[[v,a],[v,b]]], + + [v,c] + ]], + [[n,descendant1],[[v,a1],[v,b1],[v,c],[v,c]]] +]], +[[n,descendant1],[[v,'_person'],[],[],[v,'_']],":-", +[ + [[n,cut]] +]], +[[n,descendant1],[[v,person],[v,d2],[[[v,person],[v,c2]],"|",[v,c1]],[v,e]],":-", +[ + [[n,descendant1],[[v,c2],[v,d],[v,e],[v,e]]], + [[n,descendant1],[[v,person],[v,descendant],[v,c1],[v,e]]], + [[n,foldr_append],[[[[v,c2]],[v,d],[v,descendant]],[],[v,d2]]], + [[n,cut]] +]], +[[n,descendant1],[[v,person1],[v,descendant],[[[v,person2],[v,'_c2']],"|",[v,c1]],[v,e]],":-", +[ + [[n,not],[[[n,=],[[v,person1],[v,person2]]]]], + [[n,descendant1],[[v,person1],[v,descendant],[v,c1],[v,e]]], + [[n,cut]] +]], +[[n,foldr_append],[[],[v,b],[v,b]],":-", +[ + [[n,cut]] +]], +[[n,foldr_append],[[[v,a1],"|",[v,a2]],[v,b],[v,c]],":-", +[ + [[n,append],[[v,b],[v,a1],[v,d]]], + [[n,foldr_append],[[v,a2],[v,d],[v,c]]], + [[n,cut]] +]], +[[n,comment],["% ancestor(Person, Ancestor)"]], +[[n,comment],["% means Ancestor is an ancestor of Person."]], +[[n,comment],["%"]], +[[n,comment],["% This is functionally equivalent to descendant(Ancestor, Person)."]], +[[n,comment],["%"]], +[[n,ancestor],[[v,a1],[v,b1]],":-", +[ + [[n,findall], + [ + [[v,a],[v,b]], + + [[n,parent],[[v,a],[v,b]]], + + [v,c] + ]], + [[n,ancestor1],[[v,a1],[v,b1],[v,c],[v,c]]] +]], +[[n,ancestor1],[[v,'_person'],[],[],[v,'_']],":-", +[ + [[n,cut]] +]], +[[n,ancestor1],[[v,person],[v,d2],[[[v,c2],[v,person]],"|",[v,c1]],[v,e]],":-", +[ + [[n,ancestor1],[[v,c2],[v,d],[v,e],[v,e]]], + [[n,ancestor1],[[v,person],[v,ancestor],[v,c1],[v,e]]], + [[n,foldr_append],[[[[v,c2]],[v,d],[v,ancestor]],[],[v,d2]]], + [[n,cut]] +]], +[[n,ancestor1],[[v,person1],[v,ancestor],[[[v,'_c2'],[v,person2]],"|",[v,c1]],[v,e]],":-", +[ + [[n,not],[[[n,=],[[v,person1],[v,person2]]]]], + [[n,ancestor1],[[v,person1],[v,ancestor],[v,c1],[v,e]]], + [[n,cut]] +]], +[[n,comment],["% children(Parent, 
ChildList)"]], +[[n,comment],["% ChildList is bound to a list of the children of Parent."]], +[[n,comment],["%"]], +[[n,children],[[v,parent],[v,childlist]],":-", +[ + [[n,findall], + [ + [v,child], + + [[n,parent],[[v,parent],[v,child]]], + + [v,childlist] + ]] +]], +[[n,comment],["% list_count(List, Count)"]], +[[n,comment],["% Count is bound to the number of elements in List."]], +[[n,comment],["%"]], +[[n,list_count],[[],0]], +[[n,list_count],[[[v,'_'],"|",[v,tail]],[v,count]],":-", +[ + [[n,list_count],[[v,tail],[v,tailcount]]], + [[n,+],[[v,tailcount],1,[v,count]]] +]], +[[n,comment],["% count_descendants(Person, Count)"]], +[[n,comment],["% Count is bound to the number of descendants of Person."]], +[[n,comment],["%"]], +[[n,count_descendants],[[v,person],[v,count]],":-", +[ + [[n,descendant],[[v,person],[v,list]]], + [[n,list_count],[[v,list],[v,count]]] +]], +[[n,family_test],":-", +[ + [[n,grandparent],[[v,result1]]], + [[n,writeln],[[v,result1]]], + [[n,older],[[v,result2]]], + [[n,writeln],[[v,result2]]], + [[n,siblings],[[v,result3]]], + [[n,writeln],[[v,result3]]], + [[n,sibling_list],[john,[v,result5]]], + [[n,writeln],[[v,result5]]], + [[n,older_brother],[[v,result6]]], + [[n,writeln],[[v,result6]]], + [[n,descendant],[albert,[v,result7]]], + [[n,writeln],[[v,result7]]], + [[n,ancestor],[john,[v,result8]]], + [[n,writeln],[[v,result8]]], + [[n,children],[albert,[v,childlist]]], + [[n,writeln],[[v,childlist]]], + [[n,list_count],[[1,2,3],[v,count1]]], + [[n,writeln],[[v,count1]]], + [[n,count_descendants],[albert,[v,count2]]], + [[n,writeln],[[v,count2]]] +]] +] +,[[]]). 
+ + +/* +only_ssi_test(3,[[n,older_brother],[[v,result6]]], %% trying to get working with ssi + +[ +[[n,parent],[albert, jim]], +[[n,parent],[albert, peter]], +[[n,parent],[jim, brian]], +[[n,parent],[john, darren]], +[[n,parent],[peter, lee]], +[[n,parent],[peter, sandra]], +[[n,parent],[peter, james]], +[[n,parent],[peter, kate]], +[[n,parent],[peter, kyle]], +[[n,parent],[brian, jenny]], +[[n,parent],[irene, jim]], +[[n,parent],[irene, peter]], +[[n,parent],[pat, brian]], +[[n,parent],[pat, darren]], +[[n,parent],[amanda, jenny]], + +[[n,older_brother],[[v,c]],":-", +[ + [[n,findall], + [ + [[v,a],[v,b]], + + [ + [[n,siblings],[[v,a],[v,b]]], + [[n,male],[[v,a]]], + [[n,older],[[v,a],[v,b]]] + ], + + [v,c] + ]] +]], +[[n,siblings],[[v,a],[v,b]],":-", +[ + [[n,parent],[[v,x],[v,a]]], + [[n,parent],[[v,x],[v,b]]], + [[n,not],[[[n,=],[[v,a],[v,b]]]]] +]], +[[n,male],[albert]], +[[n,male],[jim]], +[[n,male],[peter]], +[[n,male],[brian]], +[[n,male],[john]], +[[n,male],[darren]], +[[n,male],[james]], +[[n,male],[kyle]], +[[n,yearofbirth],[irene,1923]], +[[n,yearofbirth],[pat,1954]], +[[n,yearofbirth],[lee,1970]], +[[n,yearofbirth],[sandra,1973]], +[[n,yearofbirth],[jenny,2004]], +[[n,yearofbirth],[amanda,1979]], +[[n,yearofbirth],[albert,1926]], +[[n,yearofbirth],[jim,1949]], +[[n,yearofbirth],[peter,1945]], +[[n,yearofbirth],[brian,1974]], +[[n,yearofbirth],[john,1955]], +[[n,yearofbirth],[darren,1976]], +[[n,yearofbirth],[james,1969]], +[[n,yearofbirth],[kate,1975]], +[[n,yearofbirth],[kyle,1976]], +[[n,older],[[v,a],[v,b]],":-", +[ + [[n,yearofbirth],[[v,a],[v,y1]]], + [[n,yearofbirth],[[v,b],[v,y2]]], + [[n,>],[[v,y2],[v,y1]]] +]], +[[n,family_test],":-", +[ + [[n,older_brother],[[v,result6]]], + [[n,writeln],[[v,result6]]] +]] +], +[[[[v,result6],[[peter, jim], [james, lee], [james, sandra], [james, kate], [james, kyle], [peter, jim], [brian, darren]]]]]). 
+ +*/ + +only_ssi_test(3,[[n,grammar1],[["a","b","c","b","c"]]], +[ + [[n,grammar1],[[v,s]],":-", + [ + [[n,1],[[v,s],[]]] + ] + ], + + [[n,1],"->",[[]]], + [[n,1],"->",[["a"],[[n,2]],[[n,1]]]], + [[n,2],"->",[[]]], + [[n,2],"->",[["b"],[[n,3]],[[n,2]]]], + [[n,3],"->",[[]]], + [[n,3],"->",[["c"],[[n,3]]]] + +],[[]]). + diff --git a/SSI/optimisations.pl b/SSI/optimisations.pl new file mode 100644 index 0000000000000000000000000000000000000000..e6730da94ca9761920b4960f230dcc33e522d699 --- /dev/null +++ b/SSI/optimisations.pl @@ -0,0 +1,81 @@ + +/* +del_append(Globals31,[[[vars1,Pred_id],Vars1]],Globals32) :- + delete(Globals31,[[[vars1,Pred_id],_]],Globals33), + append(Globals33,[[[vars1,Pred_id],Vars1]],Globals32),!. +*/ +del_append(Globals1,[[[vars1,Pred_id],Vars1]],Globals3) :- +(var(Vars1)->Globals1=Globals3; +append(Globals1,[[[vars1,Pred_id],Vars1]],Globals3)),!. + +last_call_optimisation(_Globals3,Choice_point_trail11,Choice_point_trail3,Predicate_number,Line_number_b,Functions,CP_Vars1,CP_Vars2,Globals33,Globals3) :- + +% is determinate + +member([[pred_num,Pred_id],Predicate_number],Globals33),!, + +findall(Pred_ids,collect_connected_pred_ids1(Pred_id,[Pred_id +],Pred_ids,Predicate_number,Globals33),Pred_ids1), + +flatten(Pred_ids1,Pred_ids1a), +sort(Pred_ids1a,Pred_ids2), + findall([A,B2,C,D_Level,E_Predicate_number2,F_Line_number_a2,Pred_or_line,H,I,All_predicate_numbers2],(member([A,B2,C,D_Level,E_Predicate_number2,F_Line_number_a2,Pred_or_line,H,I,All_predicate_numbers2],Choice_point_trail11), + member(C,Pred_ids2) + ),M), + + %recursive_predicate(Pred_id,Pred_ids2,Globals3), + (forall(member([A,B2,C,D_Level,E_Predicate_number2,F_Line_number_a2,Pred_or_line,H,I,All_predicate_numbers2],M), + + (Pred_or_line="line"-> + (All_predicate_numbers2=[_Ab,_Bb,_Cb,_Db,_Eb, + _Fb,Vars2c],(Vars2c=[])); + Pred_or_line="predicate"-> + (All_predicate_numbers2=[])))), + +% is in last clause of a predicate + 
+append(_,[[Predicate_number,F1,Args1|Commands]|Functions1],Functions),!, + + + +not((append(_,[[_Predicate_number_b,F1,Args2|_]|_],Functions1), +length(Args1,L),length(Args2,L))),!, + +%[1,[n,count],[1,2],":-" + +% is last command +append(_,[Commands1],Commands),!, +append(_,[[Line_number_b,_,_,_,_,F2,Args3|_]|_],Commands1), +%[0, [on_true, -2], [go_after, -], [on_false, -3], [go_to_predicates, [1, 2]], [n, count], [0, [v, n]]] + +% is recursive + +F1=F2,length(Args1,L),length(Args3,L),!, + +%delete_frame_contents + +collect_connected_pred_ids2(Pred_id,Pred_ids2,Pred_ids3,Globals33), + +flatten(Pred_ids3,Pred_ids3a), +sort(Pred_ids3a,Pred_ids4), + + + findall(E2,(member(C1,Pred_ids4), + E2=[_A,_B2,C1,_D_Level,_E_Predicate_number2,_F_Line_number_a2,_Pred_or_line,_H,_I,_All_predicate_numbers2], + member(E2,Choice_point_trail11) + + ),E3), + %trace, + delete(E3,[_,_,_Pred_id,_,_,-1,"predicate",_,_,_],E32), + + delete(E32,[_,_,_Pred_id1,_,_,["returns to",_,"pred_id",_],"predicate",_,_,_],E31), + %E32=E31, + +delete_cps(Choice_point_trail11,E31,Choice_point_trail3,CP_Vars1,CP_Vars2), +Globals3=Globals33,!. + +% - change records of pn to new pn, same pid +% connect firstargs in globals to new first args x needs to go back through frame tops, first args needs to be computed afterwards, so don't do in 1 frame + +%run_in_current_frame() :-!. 
+ diff --git a/SSI/pred_minus_one_fail2.pl b/SSI/pred_minus_one_fail2.pl new file mode 100644 index 0000000000000000000000000000000000000000..9044a74f49224d312f41ef67d72990dff6964fa9 --- /dev/null +++ b/SSI/pred_minus_one_fail2.pl @@ -0,0 +1,183 @@ +pred_minus_one_fail2([Pred_id1,Level,Predicate_number,_Line_number,"predicate",Query_a, + Vars,All_predicate_numbers], Result21, Functions,Vars2, + Result1, Result2, + Globals1,Globals2, + Choice_point_trail1, + Choice_point_trail3, + CP_Vars1,CP_Vars2) :- + %trace, + member([[firstargs,Pred_id1],_FirstArgs],Globals1), + + member([[function,Pred_id1],Function],Globals1), + + member([[arguments1,Pred_id1],Arguments1],Globals1), + + member([[skip,Pred_id1],Skip],Globals1), + + member([[level,Pred_id1],Level_a],Globals1), + + %delete(Globals1,[[function,Pred_id1],Function],Globals3), + + Globals1=Globals3, + +/*(trace, + (debug3(on)-> +write(["L",Level_a]);true), + + (debug_fail(Skip,[Function,Arguments1])->true;true), +*/ +%trace, + %updatevars(FirstArgs,Vars,[],Result), + %unique1(Result,[],Vars3), + + /* + Pred_id1=Pred_id4, + + get_last_p_before_n(Choice_point_trail1,[%_,_, + Pred_id4,_Level,_Predicate_number4,-1,"predicate",[Function,Arguments1], %** + _Vars,_All_predicate_numbers4], + [_,_,Pred_id4,_Level,_Predicate_number4,-1,"predicate",[Function,Arguments1], %** + _Vars,_All_predicate_numbers4],_%Choice_point_trail5 + , + CP_Vars1,CP_Vars3), + */ + + /* + trace,writeln1(["here1:",%_,_, + Pred_id4,_Level,_Predicate_number4,-1,"predicate",[Function,Arguments1], %** + _Vars,_All_predicate_numbers4]), + */ +%trace, + %findresult3(Arguments1,Vars3,[],Result22), + /* + (debug3(on)-> +write(["L",Level_a]);true), + debug_exit(Skip,[Function,Result22]), % return result21 + checktypes(Function,Result22), +*/ + ((not(Level_a=0))->( + Level2 is Level-1, + + (Level2 = 0 -> + + %trace, + ( + %trace, + (debug3(on)-> +write0(["L",Level_a]);true), + + (debug_fail(Skip,[Function,Arguments1])->true;true), + + 
+ssi1([_,0,Predicate_number,-3,"predicate",Query_a, + Vars,All_predicate_numbers], Result21, Functions,Vars2, + Result1, Result2, + Globals1,Globals2, + Choice_point_trail1, + Choice_point_trail3, + CP_Vars1,CP_Vars2)); + + (%trace, + +%writeln1([globals3,Globals3]), +%member([pred_id_chain,PID1,Pred_id1],Globals3), +%trace, + PID1=Pred_id1, + + get_last_p_before_n(Choice_point_trail1, + [PID1,Level3,Predicate_number3,["returns to",Line_number3,"pred_id",Pred_id3] + ,"predicate",Query2,Vars4,All_predicate_numbers3], + [_Cp_a,_Cb_b,PID1,Level3,Predicate_number3,["returns to",Line_number3,"pred_id",Pred_id3] + ,"predicate",Query2,Vars4,All_predicate_numbers3],_, + CP_Vars1,CP_Vars41), + + get_last_p_before_n(Choice_point_trail1, + [PID1,Level21,Predicate_number2,-1%Line_number2b + ,"predicate",Query21,Vars9,All_predicate_numbers2], + [_Cp_a1,_Cb_b1,PID1,Level21,Predicate_number2,-1%Line_number2b + ,"predicate",Query21,Vars9,All_predicate_numbers2],_, + CP_Vars41,CP_Vars4), + +/*writeln1(get_last_p_before_n(Choice_point_trail1, + [PID1,Level3,Predicate_number2,-1%Line_number2b + ,"predicate",Query2,Vars4,All_predicate_numbers2], + [_Cp_a,_Cb_b,PID1,Level3,Predicate_number2,Line_number2b,"predicate",Query2,Vars4,All_predicate_numbers2],_, + CP_Vars1,CP_Vars4)), + */ + + %member([[vars1,PID1],Vars5],Globals3), + + (%Line_number2b=["returns to", + %Line_number3,"pred_id",Pred_id3] + + true % *** this may lower test results + -> + ( +%member([[pred_num,Pred_id3],Predicate_number2],Globals3), +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("true",Dbw_true1),Dbw_true1=Dbw_true, + +member([Predicate_number2,_F|Rest],Functions), +(Rest=[_Args,":-",Lines]->true; +(Rest=[_Args]->Lines=[[[Dbw_n,Dbw_true]]]; +(Rest=[":-",Lines]; +(Rest=[],Lines=[[[Dbw_n,Dbw_true]]])))), + + + +get_lang_word("on_true",Dbw_on_true1),Dbw_on_true1=Dbw_on_true, +get_lang_word("go_after",Dbw_go_after1),Dbw_go_after1=Dbw_go_after, 
+get_lang_word("on_false",Dbw_on_false1),Dbw_on_false1=Dbw_on_false, +get_lang_word("go_to_predicates",Dbw_go_to_predicates1),Dbw_go_to_predicates1=Dbw_go_to_predicates, + + member([Line_number3,[Dbw_on_true,_A],[Dbw_go_after,_B],[Dbw_on_false,C],[Dbw_go_to_predicates,_D]|_Line],Lines), +C=Line_number2a +%trace, +/* + reverse(Globals1,Globals33), + + member([[firstargs_uv2,Pred_id3],FirstArgs1],Globals33), +%trace, +%writeln1([globals33,Globals33]), + Globals33=Globals43, + + member([[vars1,Pred_id3],Vars11],Globals43), + + Globals43=Globals212, + +reverse(Globals212,Globals22), + + updatevars2(FirstArgs1,Vars3,[],Vars5), + updatevars3(Vars11,Vars5,Vars6), + reverse(Vars6,[],Vars7), + ((not(Vars7=[])-> + ( + unique1(Vars7,[],Vars8) + ) +);( + Vars8=[])), + + Vars8=Vars44 %% 4 not 2? *** + */ +) +; + +(Line_number2a=Line_number3%Line_number2b +%, +%,Level_x=Level2 +%Vars44=Vars3 + +)), % Line_number2 to 2b + + member([[level,PID1],Level_b],Globals3), + + %trace, + ssi1([PID1,Level_b,Predicate_number2,Line_number2a,"line",-, + Vars4,All_predicate_numbers2], _, Functions,Vars2, + Result1, Result2, + Globals3,Globals2, + Choice_point_trail1, + Choice_point_trail3, + CP_Vars4,CP_Vars2) + + )))). 
\ No newline at end of file diff --git a/SSI/pred_minus_three.pl b/SSI/pred_minus_three.pl new file mode 100644 index 0000000000000000000000000000000000000000..156ece858db9de2c684dfb211d2a0b0405044d6d --- /dev/null +++ b/SSI/pred_minus_three.pl @@ -0,0 +1,192 @@ +pred_minus_three([Pred_id1,Level,Predicate_number,_Line_number,"predicate",Query_a, + Vars,All_predicate_numbers], Result21, Functions,Vars2, + Result1, Result2, + Globals1,Globals2, + Choice_point_trail1, + Choice_point_trail3, + CP_Vars1,CP_Vars2) :- + %trace, + member([[firstargs,Pred_id1],_FirstArgs],Globals1), + + member([[function,Pred_id1],Function],Globals1), + + member([[arguments1,Pred_id1],Arguments1],Globals1), + + member([[skip,Pred_id1],Skip],Globals1), + + member([[level,Pred_id1],Level_a],Globals1), + + Globals1=Globals3, + +%/*(trace, + (debug3(on)-> +write0(["L",Level_a]);true), + + (debug_fail(Skip,[Function,Arguments1])->true;true), +%*/ +%trace, + %updatevars(FirstArgs,Vars,[],Result), + %unique1(Result,[],Vars3), + + /* + Pred_id1=Pred_id4, + + get_last_p_before_n(Choice_point_trail1,[%_,_, + Pred_id4,_Level,_Predicate_number4,-1,"predicate",[Function,Arguments1], %** + _Vars,_All_predicate_numbers4], + [_,_,Pred_id4,_Level,_Predicate_number4,-1,"predicate",[Function,Arguments1], %** + _Vars,_All_predicate_numbers4],_%Choice_point_trail5 + , + CP_Vars1,CP_Vars3), + */ + + /* + trace,writeln1(["here1:",%_,_, + Pred_id4,_Level,_Predicate_number4,-1,"predicate",[Function,Arguments1], %** + _Vars,_All_predicate_numbers4]), + */ +%trace, + %findresult3(Arguments1,Vars3,[],Result22), + /* + (debug3(on)-> +write(["L",Level_a]);true), + debug_exit(Skip,[Function,Result22]), % return result21 + checktypes(Function,Result22), +*/ + ((not(Level_a=0))->( + Level2 is Level-1, + + (Level2 = 0 -> + + %trace, + ( + %trace, + (debug3(on)-> +write0(["L",Level_a]);true), + + (debug_fail(Skip,[Function,Arguments1])->true;true), + + +ssi1([_,0,Predicate_number,-3,"predicate",Query_a, + 
Vars,All_predicate_numbers], Result21, Functions,Vars2, + Result1, Result2, + Globals1,Globals2, + Choice_point_trail1, + Choice_point_trail3, + CP_Vars1,CP_Vars2)); + + (%trace, + +%(Mode=pred_minus_three-> +%writeln1([globals3,Globals3]), +member([pred_id_chain,PID1,Pred_id1],Globals3),%; + +%PID1=Pred_id1), +%trace, + + get_last_p_before_n(Choice_point_trail1, + [PID1,Level3,Predicate_number3,["returns to",Line_number3,"pred_id",Pred_id3] + ,"predicate",Query2,Vars4,All_predicate_numbers3], + [_Cp_a,_Cb_b,PID1,Level3,Predicate_number3,["returns to",Line_number3,"pred_id",Pred_id3] + ,"predicate",Query2,Vars4,All_predicate_numbers3],_, + CP_Vars1,CP_Vars41), + + get_last_p_before_n(Choice_point_trail1, + [PID1,Level21,Predicate_number2,-1%Line_number2b + ,"predicate",Query21,Vars9,All_predicate_numbers2], + [_Cp_a1,_Cb_b1,PID1,Level21,Predicate_number2,-1%Line_number2b + ,"predicate",Query21,Vars9,All_predicate_numbers2],_, + CP_Vars41,CP_Vars42), + +/* +delete_cp(Choice_point_trail1,[_Cp_a1,_Cb_b1,PID1,Level21,Predicate_number2,-1%Line_number2b + ,"predicate",Query21,Vars9,All_predicate_numbers2],Choice_point_trail1a,CP_Vars42,CP_Vars4,_), + */ + +Choice_point_trail1=Choice_point_trail1a,CP_Vars42=CP_Vars4, +%trace, writeln1([all_predicate_numbers2,All_predicate_numbers2]), +%notrace, + +/*writeln1(get_last_p_before_n(Choice_point_trail1, + [PID1,Level3,Predicate_number2,-1%Line_number2b + ,"predicate",Query2,Vars4,All_predicate_numbers2], + [_Cp_a,_Cb_b,PID1,Level3,Predicate_number2,Line_number2b,"predicate",Query2,Vars4,All_predicate_numbers2],_, + CP_Vars1,CP_Vars4)), + */ + + %member([[vars1,PID1],Vars5],Globals3), + + (%Line_number2b=["returns to", + %Line_number3,"pred_id",Pred_id3] + + true % *** this may lower test results + -> + ( +%member([[pred_num,Pred_id3],Predicate_number2],Globals3), + +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("true",Dbw_true1),Dbw_true1=Dbw_true, + +member([Predicate_number2,_F|Rest],Functions), 
+(Rest=[_Args,":-",Lines]->true; +(Rest=[_Args]->Lines=[[[Dbw_n,Dbw_true]]]; +(Rest=[":-",Lines]; +(Rest=[],Lines=[[[Dbw_n,Dbw_true]]])))), + + +get_lang_word("on_true",Dbw_on_true1),Dbw_on_true1=Dbw_on_true, +get_lang_word("go_after",Dbw_go_after1),Dbw_go_after1=Dbw_go_after, +get_lang_word("on_false",Dbw_on_false1),Dbw_on_false1=Dbw_on_false, +get_lang_word("go_to_predicates",Dbw_go_to_predicates1),Dbw_go_to_predicates1=Dbw_go_to_predicates, + + member([Line_number3,[Dbw_on_true,_A],[Dbw_go_after,_B],[Dbw_on_false,C],[Dbw_go_to_predicates,_D]|_Line],Lines), +C=Line_number2a +%trace, +/* + reverse(Globals1,Globals33), + + member([[firstargs_uv2,Pred_id3],FirstArgs1],Globals33), +%trace, +%writeln1([globals33,Globals33]), + Globals33=Globals43, + + member([[vars1,Pred_id3],Vars11],Globals43), + + Globals43=Globals212, + +reverse(Globals212,Globals22), + + updatevars2(FirstArgs1,Vars3,[],Vars5), + updatevars3(Vars11,Vars5,Vars6), + reverse(Vars6,[],Vars7), + ((not(Vars7=[])-> + ( + unique1(Vars7,[],Vars8) + ) +);( + Vars8=[])), + + Vars8=Vars44 %% 4 not 2? *** + */ +) +; + +(Line_number2a=Line_number3%Line_number2b +%, +%,Level_x=Level2 +%Vars44=Vars3 + +)), % Line_number2 to 2b + + member([[level,PID1],Level_b],Globals3), + + %trace, + ssi1([PID1,Level_b,Predicate_number2,Line_number2a,"line",-, + Vars4,All_predicate_numbers2], _, Functions,Vars2, + Result1, Result2, + Globals3,Globals2, + Choice_point_trail1a, + Choice_point_trail3, + CP_Vars4,CP_Vars2) + + )))). 
\ No newline at end of file diff --git a/SSI/process_cp.pl b/SSI/process_cp.pl new file mode 100644 index 0000000000000000000000000000000000000000..7e37133ba36bf39afa5553abdaca35fe7eee9bd1 --- /dev/null +++ b/SSI/process_cp.pl @@ -0,0 +1,228 @@ +process_cp(Findall_end_line,FA,D1,E1, + +Query_a2, + +Vars1, + End_result, Functions,Vars2, %% CPVs here?**** CPV0,CPV to CPV1,CPV2 + Result1, Result2, + Globals3,Globals2, + Choice_point_trail1e, + Choice_point_trail3, + CP_Vars31,CP_Vars2 + %*/ + ) :- +%trace, +%trace,writeln(here1), + D1=[Pred_id3,Level4 + %1 + ,Predicate_number14,Line_number_a14,"findall",-,[old_vars,Old_vars],[findall_vars,Findall_vars],[format_vars,Format_vars],[result_var,Result_var]], + + + %writeln([d1,D1]), + E1=[Pred_id2,Level3,Predicate_number2,Line_number_a2,Pred_or_line%"line" + ,Query_a2,_CPV0,CPV3], + %trace, + (Pred_or_line="line"-> + (CPV3=[CPV_A1,CPV_A2,CPV_A31,CPV_A41,CPV_A51, + CPV_A61,CPV], + + %CPV=[CPV1|CPV2], + CPV=[[CPV11,CPV12]|CPV2], + + get_lang_word("v",Dbw_v1),Dbw_v1=Dbw_v, + + remember_and_turn_off_debug(Debug), + + find_sys(Sys_name), + +%trace, + interpretpart(match4,Format_vars,[Dbw_v,Sys_name],Vars1,Vars2fa,_), +%writeln([*,FA]), + (not(FA=fail)->getvalue([Dbw_v,Sys_name],Value3,Vars2fa); + is_empty(Value3)), + +(is_empty(Value3)->Findall_vars=Findall_vars2; append(Findall_vars,[Value3],Findall_vars2)), + + turn_back_debug(Debug), + +%trace, + get_last_p_before_n(Choice_point_trail1e,D1,[Cp_a1,Cp_a2|D1],_,CP_Vars31,CP_Vars4), + + +%trace, + + +replace_cp(Choice_point_trail1e,Cp_a1,Cp_a2,D1,[Pred_id3,Level4 + %1 + ,Predicate_number14,Line_number_a14,"findall",-,[old_vars,Old_vars],[findall_vars,Findall_vars2],[format_vars,Format_vars],[result_var,Result_var]],Choice_point_trail1b,CP_Vars4,CP_Vars6), + + %delete_cp(Choice_point_trail1e,[Cp_a1,Cp_a2|D1],Choice_point_trail1a,CP_Vars4,CP_Vars5,_), + + %D2=[Pred_id3,Level4 + %1 + 
%,Predicate_number14,Line_number_a14,"findall",-,[old_vars,Old_vars],[findall_vars,Findall_vars2],[format_vars,Format_vars],[result_var,Result_var]], + + %append_cp(Choice_point_trail1a,[D2],Choice_point_trail1b,CP_Vars5,CP_Vars6), + +%trace, + +%trace, +(cp_since_findall_start(Choice_point_trail1b,_,_,E1,_,CP_Vars6,_)-> +(%trace, +delete_cp(Choice_point_trail1b,[_,_|E1],Choice_point_trail1c,CP_Vars6,CP_Vars7,_)); +(CP_Vars6=CP_Vars7,Choice_point_trail1b=Choice_point_trail1c)), + + +%delete_cp(Choice_point_trail1b,[_,_|E1],Choice_point_trail1c,CP_Vars6,CP_Vars7,_),%* + + %CPV22=CPV2, + CPV22=[CPV_A1,CPV_A2,CPV12,CPV11], + + CPV23=[CPV_A1,CPV_A2,CPV_A31,CPV_A41,CPV_A51, + CPV_A61,CPV2], +%/* +%writeln(["*1",append_cp(Choice_point_trail1c,[[Pred_id2,Level3,Predicate_number2,Line_number_a2,Pred_or_line,_,_,CPV23]],Choice_point_trail1d,CP_Vars7,CP_Vars8)]), +append_cp(Choice_point_trail1c,[[Pred_id2,Level3,Predicate_number2,Line_number_a2,Pred_or_line,_,_,CPV23]],Choice_point_trail1d,CP_Vars7,CP_Vars8),/* + +%*/ +/* get_last_p_before_n(Choice_point_trail1b,E1,[Cp_a11,Cp_a21|E1],_,CP_Vars6,CP_Vars7), + + +%trace, + + +replace_cp(Choice_point_trail1b,Cp_a11,Cp_a21,E1,[Pred_id2,Level3,Predicate_number2,Line_number_a2,Pred_or_line,_,_,CPV23],Choice_point_trail1d,CP_Vars7,CP_Vars8), +*/ +%writeln1(["*1",append_cp(Choice_point_trail1c,[[Pred_id2,Level3,Predicate_number2,Line_number_a2,Pred_or_line,_,_,CPV23]],Choice_point_trail1d,CP_Vars7,CP_Vars8)]), + +get_lang_word("findall_exit_function",Dbw_findall_exit_function1),Dbw_findall_exit_function1=Dbw_findall_exit_function, + +append_cp(Choice_point_trail1d,[[Pred_id2,Level3,Predicate_number2,[Dbw_findall_exit_function,Findall_end_line],"line",_, + Vars2fa,_]],Choice_point_trail1d1, + CP_Vars8,CP_Vars81), % Pred_id n? 
+ +%CPV1=[CPV10, "prev_pred_id", Prev_pred_id], + + ssi1([Pred_id2,Level3, %* + Predicate_number2,Line_number_a2,"line",Query_a2, + Old_vars,CPV23], End_result, Functions,Vars2, %% CPVs here?**** CPV0,CPV to CPV1,CPV2 + Result1, Result2, + Globals3,Globals2, + Choice_point_trail1d1, + Choice_point_trail3,["appearance of command",CPV22], + CP_Vars81,CP_Vars2) + + + + + ); + (Pred_or_line="predicate"-> + (CPV3=CPV, + + CPV=[CPV1|CPV2], + + get_lang_word("v",Dbw_v1),Dbw_v1=Dbw_v, + + remember_and_turn_off_debug(Debug), + + find_sys(Sys_name), + + interpretpart(match4,Format_vars,[Dbw_v,Sys_name],Vars1,Vars2fa,_), +%writeln([*,FA]), + (not(FA=fail)->getvalue([Dbw_v,Sys_name],Value3,Vars2fa); + is_empty(Value3)), + +(is_empty(Value3)->Findall_vars=Findall_vars2; append(Findall_vars,[Value3],Findall_vars2)), + + turn_back_debug(Debug), + +%trace, + get_last_p_before_n(Choice_point_trail1e,D1,[Cp_a1,Cp_a2|D1],_,CP_Vars31,CP_Vars4), + replace_cp(Choice_point_trail1e,Cp_a1,Cp_a2,D1,[Pred_id3,Level4 + %1 + ,Predicate_number14,Line_number_a14,"findall",-,[old_vars,Old_vars],[findall_vars,Findall_vars2],[format_vars,Format_vars],[result_var,Result_var]],Choice_point_trail1b,CP_Vars4,CP_Vars6), + +%x +%delete_cp(Choice_point_trail1e,[Cp_a1,Cp_a2|D1],Choice_point_trail1a,CP_Vars4,CP_Vars5,_), + +%x D2=[Pred_id3,Level4 + %1 + %,Predicate_number14,Line_number_a14,"findall",-,[old_vars,Old_vars],[findall_vars,Findall_vars2],[format_vars,Format_vars],[result_var,Result_var]], + + %append_cp(Choice_point_trail1a,[D2],Choice_point_trail1b,CP_Vars5,CP_Vars6), + + delete_cp(Choice_point_trail1b,[_,_|E1],Choice_point_trail1c,CP_Vars6,CP_Vars7,_), + + %(Pred_or_line="line"-> + %(CPV22=[CPV_A1,CPV_A2,CPV_A3,CPV_A4,CPV_A5, + %CPV_A6,CPV2] + + %); + %(Pred_or_line="predicate"-> + CPV22=CPV2, + +%/* + +%writeln1([*,append_cp(Choice_point_trail1c,[[Pred_id2,Level3,CPV1,Line_number_a2,Pred_or_line,_,_,CPV22]],Choice_point_trail1d,CP_Vars7,CP_Vars8)]), + +% Get and delete old cp +%trace, 
+(cp_since_findall_start3(Choice_point_trail1c,_Level1,_D13,CPX,_D11,CP_Vars7,CP_Vars8)-> +(%trace, +delete_cp(Choice_point_trail1c,CPX,Choice_point_trail1c1,CP_Vars8,CP_Vars9,_)); +(CP_Vars7=CP_Vars9,Choice_point_trail1c=Choice_point_trail1c1)), + +append_cp(Choice_point_trail1c1,[[Pred_id2,Level3,CPV1,Line_number_a2,Pred_or_line,_,_,CPV22]],Choice_point_trail1d,CP_Vars9,CP_Vars8), + +%*/ + %get_last_p_before_n(Choice_point_trail1b,E1,[Cp_a11,Cp_a21|E1],_,CP_Vars6,CP_Vars7), + + +%trace, + + +%replace_cp(Choice_point_trail1b,Cp_a11,Cp_a21,E1,[Pred_id2,Level3,CPV1,Line_number_a2,Pred_or_line,_,_,CPV22],Choice_point_trail1d,CP_Vars7,CP_Vars8), + +%writeln1(["*2",append_cp(Choice_point_trail1c,[[Pred_id2,Level3,Predicate_number2,Line_number_a2,Pred_or_line,_,_,CPV23]],Choice_point_trail1d,CP_Vars7,CP_Vars8)]), + +get_lang_word("findall_exit_function",Dbw_findall_exit_function1),Dbw_findall_exit_function1=Dbw_findall_exit_function, +get_lang_word("findall_fail_function",Dbw_findall_fail_function1),Dbw_findall_fail_function1=Dbw_findall_fail_function, +%writeln([*,FA]), +(not(FA=exit)->FA1=Dbw_findall_fail_function;FA1=Dbw_findall_exit_function), + +append_cp(Choice_point_trail1d,[[Pred_id2,Level3,CPV1,[FA1,Findall_end_line],"line",_, + Vars2fa,_]],Choice_point_trail1d1, + CP_Vars8,CP_Vars81), % Pred_id n? 
+
+%trace,
+
+(
+CPV1=[CPV10, "prev_pred_id", Prev_pred_id]->true;
+(%writeln("Error here"),
+fail)),%->
+
+ssi1([Prev_pred_id,Level3, %*
+ CPV10,Line_number_a2,"line",Query_a2,
+ _,CPV2], End_result, Functions,Vars2, %% CPVs here?**** CPV0,CPV to CPV1,CPV2
+ Result1, Result2,
+ Globals3,Globals2,
+ Choice_point_trail1d1,
+ Choice_point_trail3,
+ CP_Vars81,CP_Vars2)
+ /*
+ ;
+(Line_number_a2=["returns to", CPV10, "pred_id", Prev_pred_id]),
+%[CPV10, "prev_pred_id", Prev_pred_id]),
+
+ ssi1([Prev_pred_id,Level3, %*
+ CPV10,Line_number_a2,"line",Query_a2,
+ _,CPV2], End_result, Functions,Vars2, %% CPVs here?**** CPV0,CPV to CPV1,CPV2
+ Result1, Result2,
+ Globals3,Globals2,
+ Choice_point_trail1d1,
+ Choice_point_trail3,
+ CP_Vars81,CP_Vars2)
+ ).
+ ))).
+*/
+ ))).
\ No newline at end of file
diff --git a/SSI/program.c b/SSI/program.c
new file mode 100644
index 0000000000000000000000000000000000000000..d76c027f1c886f2e9504dde5ae14f0d24282d42b
--- /dev/null
+++ b/SSI/program.c
@@ -0,0 +1,12 @@
+#include <stdio.h>
+#include <stdlib.h>
+
+int main(void)
+{
+	char str1[20];
+
+	scanf("%19s", str1);
+
+	printf("%s\n", str1);
+	return 0;
+}
\ No newline at end of file
diff --git a/SSI/program.txt b/SSI/program.txt
new file mode 100644
index 0000000000000000000000000000000000000000..d76c027f1c886f2e9504dde5ae14f0d24282d42b
--- /dev/null
+++ b/SSI/program.txt
@@ -0,0 +1,12 @@
+#include <stdio.h>
+#include <stdlib.h>
+
+int main(void)
+{
+	char str1[20];
+
+	scanf("%19s", str1);
+
+	printf("%s\n", str1);
+	return 0;
+}
\ No newline at end of file
diff --git a/SSI/sessions.pl b/SSI/sessions.pl
new file mode 100644
index 0000000000000000000000000000000000000000..eff745dfd4e3be939b574f43e9115e744a16e119
--- /dev/null
+++ b/SSI/sessions.pl
@@ -0,0 +1,58 @@
+generate_session_key(Key5) :-
+	repeat,
+	api-key(API_key),
+	length(Key,10),
+	findall(Key1,(member(_,Key),random(X),Key1 is ceiling(10*X)),Key2),
+	flatten([API_key,"-",Key2],List),
+	foldr(string_concat,List,"",Key4),
+	string_atom(Key4,Key5),
+	
foldr(string_concat,["sessions/session",Key5,".txt"],Path),
+	not(exists_file(Path)),
+	%number_string(Key31,Key3),
+	%Key4=API_key-Key31,
+	!.
+
+
+% each day, delete sessions older than one month (2592000 seconds)
+
+delete_old_sessions :-
+
+get_time(Now),
+One_month_ago is Now - 2592000.0,
+
+%findall(J5,(%member(K1,K),
+directory_files("sessions/",F),
+	delete_invisibles_etc(F,G),
+
+findall(_,(
+member(H,G),string_concat("sessions/",H,H1),
+set_time_file(H1, [access(Access)], []),
+Access < One_month_ago,
+delete_file(H1)
+),_),!.
+
+
+% save session
+
+% api key...
+
+%save_session_first_time(Session) :-
+%	generate_session_key(Session_number),
+%	save_session(Session_number,Session),!.
+
+save_session(Session_number,Session) :-
+	(var(Session_number)->generate_session_key(Session_number);true),
+	foldr(string_concat,["sessions/session",Session_number,".txt"],Path),
+
+	save_file_s(Path,Session),!.
+
+% () check if generated key is the same as another key
+
+
+% get session
+
+get_session(Session_number,Session) :-
+	foldr(string_concat,["sessions/session",Session_number,".txt"],Path),
+	open_file_s(Path,Session),!.
+
+
diff --git a/SSI/sessions/.DS_Store b/SSI/sessions/.DS_Store
new file mode 100644
index 0000000000000000000000000000000000000000..5008ddfcf53c02e82d7eee2e57c38e5672ef89f6
Binary files /dev/null and b/SSI/sessions/.DS_Store differ
diff --git a/SSI/ssi b/SSI/ssi
new file mode 100644
index 0000000000000000000000000000000000000000..3ee2b7f4c109864d720d63556001484d4458359f
Binary files /dev/null and b/SSI/ssi differ
diff --git a/SSI/ssi-api-key.pl b/SSI/ssi-api-key.pl
new file mode 100644
index 0000000000000000000000000000000000000000..a909fdd9728062c0eb40e3e7f7710ea8d023d3ef
--- /dev/null
+++ b/SSI/ssi-api-key.pl
@@ -0,0 +1 @@
+api-key(1234567890).
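The session-key scheme in sessions.pl above concatenates the API key, a hyphen, and ten random values of `ceiling(10*X)` (each in 1..10, so the digit suffix can be longer than ten characters), retrying until no matching session file exists. A minimal Python sketch of the same idea — the function and directory names here are illustrative, not part of the repository:

```python
import os
import random

def generate_session_key(api_key, session_dir="sessions"):
    # Mirrors generate_session_key/1: api key + "-" + ten random
    # values in 1..10, retried until sessions/session<key>.txt is free.
    while True:
        digits = "".join(str(random.randint(1, 10)) for _ in range(10))
        key = "{}-{}".format(api_key, digits)
        path = os.path.join(session_dir, "session{}.txt".format(key))
        if not os.path.exists(path):
            return key
```

Like the Prolog `repeat`/`not(exists_file(Path))` loop, the retry only guards against a file-name collision at generation time; it does not lock the directory against a concurrent writer.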
\ No newline at end of file diff --git a/SSI/ssi-api.pl b/SSI/ssi-api.pl new file mode 100644 index 0000000000000000000000000000000000000000..2081bcb547d0e55cd323a31933ccae8fe1229754 --- /dev/null +++ b/SSI/ssi-api.pl @@ -0,0 +1,271 @@ +:- use_module(library(http/thread_httpd)). +:- use_module(library(http/http_dispatch)). +:- use_module(library(http/http_error)). +:- use_module(library(http/html_write)). + +% we need this module from the HTTP client library for http_read_data +:- use_module(library(http/http_client)). +:- http_handler('/', ssi_web_form, []). + +%:-include('test_form1.pl'). + +%:- include('files/listprolog.pl'). +%:- include('paraphraser1_lp.pl'). + +%:-include('../Prolog-to-List-Prolog/p2lpconverter.pl'). + +ssi_server(Port) :- + http_server(http_dispatch, [port(Port)]). + + /* + browse http://127.0.0.1:8000/ + This demonstrates handling POST requests + */ + + ssi_web_form(_Request) :- +retractall(html_api_maker_or_terminal(_)), +assertz(html_api_maker_or_terminal(html + %terminal + )), + format('Content-type: text/html~n~n', []), + +data(Header,Footer), + +format(Header,[]), + +Debug=off, + + +%open_string_file_s("../private/la_ws.pl",S),p2lpconverter([string,S],T), +test_open_types_cases(25,Query,Types,Modes,Functions), + +%international_lucianpl([lang,"en"],Debug,[[n,main]],T,_Result), +international_lucianpl([lang,"en"],Debug,Query,Types,Modes,Functions,_Result), +%p2lpconverter([file,"../private/la_com_ssi1.pl"],List3), + +%testopen_cases(8,[[n,test]],List3), + %test(1,Query,Functions,Result), + +% Form and HTML Table +%test1(Functions), +%Query=[[n,test]], + %ssi_test(List3), + + %para(List3), + %international_lucianpl([lang,"en"],Debug,[[n,paraphraser]],List3,_Result1), + + +format(Footer,[]) + + . + + :- http_handler('/landing', landing_pad, []). + + landing_pad(Request) :- + member(method(post), Request), !, + http_read_data(Request, Data, []), + + + format('Content-type: text/html~n~n', []), + format('

', []), + %%portray_clause(Data), + + %%term_to_atom(Term,Data), + + %format(Data,[]) + +%/* +Data=[%%debug='off',%%Debug1, +input=Input1, +ssi=Hidden1,submit=_], +%term_to_atom(Hidden1,Hidden), + +%writeln(Data), +%*/ +%/* + +%writeln(Hidden1), +%writeln(Input1), + + %replace_new(Hidden1,""","\"",Hidden2), + %replace_new(Input1,""","\"",Input2), + +%writeln(Hidden2), +%writeln(Input2) +%/* +term_to_atom(Hidden11,Hidden1), + +%writeln1([atom_string(Hidden11,Hidden1)]), + +get_session(Hidden11,Session), + +%/* +%term_to_atom( +Session=[Dbw_n,Dbw_read_string,Value1,Variable1,Line_number_b,Skip,lang(Lang), + + + +debug2(Debug2), +debug3(Debug3), +debug4(Debug4), +retry_back(Retry_back), +retry_back_stack(Retry_back_stack), +retry_back_stack_n(Retry_back_stack_n), +cumulative_or_current_text(Cumulative_or_current_text), +number_of_current_text(Number_of_current_text), +html_api_maker_or_terminal(Html_api_maker_or_terminal), +pred_numbers(Pred_numbers), + +pred_id(Pred_id_a), +types(Types), +typestatements(Typestatements), +modestatements(Modestatements), + + + ssi1([Pred_id,Level,Predicate_number,A,"line",Query, + Vars3,All_predicate_numbers], End_result3, Functions,_Vars2, + Result1, Result2, + Globals3,Globals2, + Choice_point_trail1e, + Choice_point_trail3, + CP_Vars3,CP_Vars2), + + ssi1([C,"line",Query,Vars1])], + %Session), + %*/ + +%/* +lang(Lang), + +retractall(session_number(_)),assertz(session_number(Hidden11)), +retractall(debug2(_)),assertz(debug2(Debug2)), +retractall(debug3(_)),assertz(debug3(Debug3)), +retractall(debug4(_)),assertz(debug4(Debug4)), +retractall(retry_back(_)),assertz(retry_back(Retry_back)), +retractall(retry_back_stack(_)),assertz(retry_back_stack(Retry_back_stack)), +retractall(retry_back_stack_n(_)),assertz(retry_back_stack_n(Retry_back_stack_n)), +retractall(cumulative_or_current_text(_)),assertz(cumulative_or_current_text(Cumulative_or_current_text)), 
+retractall(number_of_current_text(_)),assertz(number_of_current_text(Number_of_current_text)), +retractall(html_api_maker_or_terminal(_)),assertz(html_api_maker_or_terminal(Html_api_maker_or_terminal)), +retractall(pred_numbers(_)),assertz(pred_numbers(Pred_numbers)), + +retractall(pred_id(_)),assertz(pred_id(Pred_id_a)), +retractall(types(_)),assertz(types(Types)), +retractall(typestatements(_)), + + %findall([A,C],(member([A,B],TypeStatements),expand_types(B,[],C)),TypeStatements1), + +assertz(typestatements(Typestatements)), +retractall(modestatements(_)),assertz(modestatements(Modestatements)), + + +atom_string(Input1,Value1A), +%*/ + +data(Header,Footer), +%/* +format(Header,[]), + + + %test(1,Query,Functions,Result), + %international_lucianpl([lang,"en"],Debug,Query,Functions,Result1), + + +%trace, + + ((val1emptyorvalsequal(Value1,Value1A), + putvalue(Variable1,Value1A,Vars3,Vars2a))-> + (debug_exit(Skip,[[Dbw_n,Dbw_read_string],[Value1A]]), + + (var(Skip)->Globals3=Globals4; + append(Globals3,[[[skip,Pred_id,Line_number_b],Skip]],Globals4)), + + %trace, + ssi1([Pred_id,Level,Predicate_number,A,"line",Query, + Vars2a,All_predicate_numbers], _End_result31, Functions,_Vars21, + Result1, Result2, + Globals4,Globals2, + Choice_point_trail1e, + Choice_point_trail3, + CP_Vars3,CP_Vars2) + + ) +; (debug_fail(Skip,[[Dbw_n,Dbw_read_string],[variable]]), + + ssi1([Pred_id,Level,Predicate_number,C,"line",Query, + Vars1,All_predicate_numbers], End_result3, Functions,_Vars21, + Result1, Result2, + Globals3,Globals2, + Choice_point_trail1e, + Choice_point_trail3, + CP_Vars3,CP_Vars2) + + +)), +%*/ +format(Footer,[]) +%*/ +. +%term_to_atom(Debug2,'off'), +%term_to_atom(Query2,Query1), +%term_to_atom(Functions2,Functions1), + +%international_interpret([lang,"en"],Debug2,Query2,Functions2,Result), + %%format('

========~n', []), + %%portray_clause + %portray_clause(result), + %%writeln1(Data), + +%format('

'). + + +data(Header,Footer) :- + +Header=' + + + + + State Saving Interpreter + + + + + + + +
+ + + + + + +
+

', + +Footer='

+
+
+ +
+ +'. diff --git a/SSI/ssi-ws.pl b/SSI/ssi-ws.pl new file mode 100644 index 0000000000000000000000000000000000000000..c51b74a813ba21314dd5581717c9119deb2ac134 --- /dev/null +++ b/SSI/ssi-ws.pl @@ -0,0 +1,36 @@ +#!/usr/bin/swipl -f -q + +% Place in root directory, check paths below. + +% run with swipl ssi-ws.pl in e.g. crontab -e + +:- initialization main. + +:- use_module(library(http/thread_httpd)). +:- use_module(library(http/http_dispatch)). +:- use_module(library(http/http_error)). +:- use_module(library(http/html_write)). + +% we need this module from the HTTP client library for http_read_data +:- use_module(library(http/http_client)). +:- http_handler('/', web_form, []). + +% GitHub/Algorithm-Writer-with-Lists/ +:- include('GitHub/SSI/ssi.pl'). + +main :- + working_directory(_, 'GitHub/SSI/'), + +delete_old_sessions, +ssi_server(8000),sleep(86390),halt. + +main :- halt(1). + +%server(Port) :- +% http_server(http_dispatch, [port(Port)]). + + /* + browse http://127.0.0.1:8000/ + This demonstrates handling POST requests + */ + diff --git a/SSI/ssi.pl b/SSI/ssi.pl new file mode 100644 index 0000000000000000000000000000000000000000..9c270421f5197b3b76753b2f1e907f99dd17c36c --- /dev/null +++ b/SSI/ssi.pl @@ -0,0 +1,547 @@ +%% ssi.pl +%% State Saving Interpreter +%% +%% Introduces retry in trace + +%% convert alg to state machine (sm) +%% it is a nondeterministic (nd) state machine, i.e. 
it has forks between parts of the sm if there are nd choices left or it has failed a clause, with backtracking to the last choice point on the line - predicates and clauses are signposted in the sm
+%% predicates, clauses and individual predicate sms
+%% - predicates and clauses sms are separate because the sm goes from one predicate to next x only clauses not predicates sm x predicates sm as well because they are separate but have back-connection origin values that make temporary parts of (a separate) state machine that correspond to level numbers in the trace x there are sm links from ends of predicates to where the predicate was called from
+%% clauses in clauses sms catch fails from previous clauses
+
+%% try without recursion at first
+
+%% x sm not nec, just scope ds with var, (recursive states x) nd states
+%% - can fail predicate that doesn't call another predicate, then try another predicate
+
+%% on if then, goes to other clause if fails, etc.
+
+%% unique number for every line of program
+
+%% line (from command) to rectangle (e.g. predicate)
+
+%% track list for backtracking
+%% 1 predicate number, 1,2 line number or bracketed part, etc.
+%% - failed predicates removed from track list
+%% doesn't need to split predicates in half to turn algorithms into web services
+
+%% a :- b, c. not a:-a.
+%% a(1,1,Result):- Result is 1+1.
+
+/**
+crop down pred from top x just goes to next command
+- sets up antecedent(s) in if, not, or if, brackets backtracks to it
+- x based on states, it continues on from antecedents
+**/
+
+%% find called line
+%% find next line
+%% - go to next numbered line if in brackets, etc.
+%% - if last line of brackets
+%% - if antecedent that is true, ..., vv
+%% if fails, returns to last choice point
+%:-include('listprologinterpreter1listrecursion4.pl'). % enabling these 3 causes a hang on loading
+%:-include('listprologinterpreter3preds5.pl').
+%:-include('grammar.pl').
+
+%:-include('../../../GitHub/listprologinterpreter/lpiverify4.pl').
+%:-include('ssi_find_state_machine.pl').
+
+%ssi_verify(Debug,N) :-
+%	test(N,Query,Functions,Result),
+%	ssi(Debug,Query,Functions,Result).
+
+%:-include('interpretstatement3.pl').
+:-include('ssi-api.pl').
+
+:-include('interpretstatement3.pl').
+
+:-include('../listprologinterpreter/listprolog.pl').
+:-include('find_pred_sm.pl').
+:-include('ssi_find_state_machine.pl').
+%:-include('find_types_sm.pl').
+%:-include('go_forward_or_backtrack.pl').
+:-include('ssi3.pl').
+:-include('ssi_verify4_test_lang_all.pl').
+:-include('ssi_verify4.pl').
+:-include('ssi_verify4_types.pl').
+:-include('ssi_verify4_test_bt_lang_all.pl').
+:-include('ssi_verify4_open.pl').
+:-include('ssi_verify4_open_types.pl').
+
+:-include('ssi_listrecursion4.pl').
+:-include('ssi_3preds5.pl').
+:-include('d.pl').
+:-include('e.pl').
+:-include('cp_since_findall_start2.pl').
+:-include('pred_minus_three.pl').
+:-include('pred_minus_one_fail2.pl').
+:-include('flush_after_last_findall.pl').
+:-include('process_cp.pl').
+:-include('end_nested_findall.pl').
+:-include('used_by_call_command.pl').
+:-include('sessions.pl').
+:-include('ssi-api-key.pl').
+%:-include('replace_in_term.pl').
+%:-include('local_and_global_cp_trails.pl').
+:-include('optimisations.pl').
+:-include('only_ssi_verify4.pl').
+:-include('ssi_verify_pl.pl').
+
+:- dynamic debug2/1.
+:- dynamic debug3/1.
+:- dynamic debug4/1.
+:- dynamic retry_back/1.
+:- dynamic retry_back_stack/1.
+:- dynamic retry_back_stack_n/1.
+:- dynamic cumulative_or_current_text/1.
+:- dynamic number_of_current_text/1.
+:- dynamic html_api_maker_or_terminal/1.
+:- dynamic session_number/1.
+
+%:- dynamic screen_text/1.
+%:- dynamic curr_screen_text/1.
+
+:- dynamic pred_numbers/1.
+%:- dynamic curr_cp/1.
+:- dynamic curr_cp_index/1.
+:- dynamic pred_id/1.
+
+:- dynamic types/1.
+:- dynamic typestatements/1.
+:- dynamic modestatements/1.
+ +main2:- +time(( +only_ssi_test(off,NTotal1,Score1), +ssi_test_all00("en",off,NTotal2,Score2), +ssi_test_all00("en2",off,NTotal3,Score3), +ssi_test_all_bt00("en2",off,NTotal4,Score4), +writeln(only_ssi_test(off,NTotal1,Score1)), +writeln(ssi_test_all00("en",off,NTotal2,Score2)), +writeln(ssi_test_all00("en2",off,NTotal3,Score3)), +writeln(ssi_test_all_bt00("en2",off,NTotal4,Score4)))). +%:- dynamic hidden/1. + + +/* +ssi(Debug,Query,Functions1,Result) :- + %load_lang_db, % * check this is done once for whole ssi + + retractall(debug(_)), + assertz(debug(Debug)), + convert_to_grammar_part1(Functions1,[],Functions2,_), + add_line_numbers_to_algorithm1(Functions2,Functions2a), + %%writeln1(Functions2a), + find_pred_sm(Reserved_words1),%,"en"), +find_pred_numbers(Functions2a,Reserved_words,Pred_numbers), + find_state_machine1(Functions2a,Functions3,Pred_numbers), + + %%writeln1(Functions3), + prep_predicate_call(Query,Functions3, + All_predicate_numbers), + ssi1([1,1,"predicate",Query,[], + All_predicate_numbers],Functions3,[],Result,[],_Choice_point_trail). + +*/ + +prep_predicate_call(Query,Functions3,All_predicate_numbers) :- + +%writeln(prep_predicate_call(Query,Functions3,All_predicate_numbers)), +%trace, + Query=[Name|Arguments1], + (Arguments1=[]->Arguments_length=0; + (Arguments1=[Arguments3], + length(Arguments3,Arguments_length))), + findall(Predicate_number1, + ( + + %member([Predicate_number1,Name| + % [Arguments2|_]],Functions3), + %length(Arguments2,Arguments_length) +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("true",Dbw_true1),Dbw_true1=Dbw_true, + + member([Predicate_number1,Name|Rest],Functions3), +(Rest=[Args,":-",Lines]->length(Args,Arguments_length); +(Rest=[Args]->(Lines=[[[Dbw_n,Dbw_true]]],length(Args,Arguments_length)); +(Rest=[":-",Lines]->Arguments_length=0; +(Rest=[],Lines=[[[Dbw_n,Dbw_true]]],Arguments_length=0)))) + + + ), + All_predicate_numbers). 
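`prep_predicate_call/3` above selects the candidate clause numbers for a query by matching name and arity, treating a clause with no argument list as arity 0 and a clause with no `:-` body as an implicit `true`. A rough Python rendering of just that matching step — the clause shapes are simplified and the helper name is illustrative:

```python
def prep_predicate_call(query, functions):
    # Each clause is [number, name, args?, ":-"?, body?]; collect the
    # numbers of clauses whose name and argument count match the query.
    name, *rest = query
    arity = len(rest[0]) if rest else 0
    matches = []
    for number, cname, *tail in functions:
        if cname != name:
            continue
        # tail may be [args, ":-", body], [args], [":-", body] or []
        if tail and isinstance(tail[0], list) and (len(tail) == 1 or tail[1] == ":-"):
            cargs = tail[0]
        else:
            cargs = []
        if len(cargs) == arity:
            matches.append(number)
    return matches
```

As in the Prolog, all matching clause numbers are gathered up front; the interpreter then works through them, keeping the rest as choice points.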
+ +add_line_numbers_to_algorithm1(Algorithm1,Algorithm2) :- + add_line_numbers_to_algorithm2(Algorithm1,[],Algorithm2,0,_). + +add_line_numbers_to_algorithm2([],Algorithm,Algorithm,N,N) :- !. +add_line_numbers_to_algorithm2(Algorithm1,Algorithm2,Algorithm3,Number1,Number2) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("true",Dbw_true1),Dbw_true1=Dbw_true, + Algorithm1=[Function1|Functions], + ((Function1=[Name,Arguments1,Symbol1,Body1],symbol(Symbol1,Symbol2), + findall(Arguments3,(member(Arguments2,Arguments1),slp2lp_variables(Arguments2,Arguments3)),Arguments4), + %Number1a is Number1+1, + add_line_numbers_to_algorithm_body2(Body1,Body2,0,_), + append(Algorithm2,[[Number1,Name,Arguments4,Symbol2,Body2]],Algorithm4))->true; + + ((Function1=[Name,Symbol1,Body1],symbol(Symbol1,Symbol2), + + %Number1a is Number1+1, + add_line_numbers_to_algorithm_body2(Body1,Body2,0,_), + append(Algorithm2,[[Number1,Name,[],Symbol2,Body2]],Algorithm4))->true; + + ((Function1=[Name,Arguments1],symbol(":-",Symbol2), + findall(Arguments3,(member(Arguments2,Arguments1),slp2lp_variables(Arguments2,Arguments3)),Arguments4), + + add_line_numbers_to_algorithm_body2([[[Dbw_n,Dbw_true]]],Body2,0,_), + +append(Algorithm2,[[Number1,Name,Arguments4,Symbol2,Body2]],Algorithm4))->true; + + + (Function1=[Name],symbol(":-",Symbol2), + add_line_numbers_to_algorithm_body2([[[Dbw_n,Dbw_true]]],Body2,0,_), + + append(Algorithm2,[[Number1,Name,[],Symbol2,Body2]],Algorithm4))->true; + + % [":-", [n, include], ['../b/b.pl']] + ((Function1=[Symbol2,Name,Arguments1],symbol(":-",Symbol2), + findall(Arguments3,(member(Arguments2,Arguments1),slp2lp_variables(Arguments2,Arguments3)),Arguments4), + + add_line_numbers_to_algorithm_body2([[[Dbw_n,Dbw_true]]],Body2,0,_), + +append(Algorithm2,[[Number1,Symbol2,Name,Arguments4,Body2]],Algorithm4)))))), + + + Number1a is Number1+1, + %%writeln1([Number1,Name,Arguments4,Symbol2,Body2]), + 
add_line_numbers_to_algorithm2(Functions,Algorithm4,Algorithm3,Number1a,Number2). + + +symbol(Symbol,Symbol) :-!. + +%%slp2lp_variables(Name1,[v,Name1]) :- predicate_or_rule_name(Name1),!. +slp2lp_variables(Name,Name) :- !. + +/** +add_line_numbers_to_algorithm_body(Body1,Body2) :- + findall(*,(member(Statement1,Body1 +add_line_numbers_to_algorithm_body(Body1,[],Body2) :- + +**/ + +%%predicate_or_rule_name([A,B]) :- atom(A),is_list(B),!. +predicate_or_rule_name([V_or_n,_Name]) :- get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("v",Dbw_v1),Dbw_v1=Dbw_v, +(V_or_n=Dbw_v->true;V_or_n=Dbw_n),!.%%,atom(Name),!. +%% x: predicate_or_rule_name(V_or_n) :- (V_or_n=v->true;V_or_n=n),fail,!. + +add_line_numbers_to_algorithm_body2([],[],N,N):-!.%%,Body3 + + +%%add_line_numbers_to_algorithm_body2([],Body,Body) :- !. +add_line_numbers_to_algorithm_body2(Body1,Body2%%,Body3 +,Number1,Number2) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, + Body1=[[Statements1|Statements1a]|Statements2 + ], + + not(predicate_or_rule_name(Statements1)), + Number1a is Number1+1, +add_line_numbers_to_algorithm_body2([Statements1],Body3,Number1a,Number3), %% 2->1 + + add_line_numbers_to_algorithm_body2(Statements1a,Body4,Number3,Number4), + add_line_numbers_to_algorithm_body2(Statements2,Body5,Number4,Number2), + +%% append([Body3,Body4],Body6), +%% append([[Body6],Body5],Body2), + + append(Body3,Body4,Body34), + Body6=[Number1,[Dbw_n,"[]"],Body34 + ], + append([Body6],Body5,Body2), + + !. 
+ + + + +add_line_numbers_to_algorithm_body2(Body1,Body2,Number1,Number2) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("not",Dbw_not1),Dbw_not1=Dbw_not, + Body1=[[[Dbw_n,Dbw_not],Statement]|Statements2 %% [] removed from Statement + ], + Number1a is Number1+1, + add_line_numbers_to_algorithm_body2(Statement,Body3,Number1a,Number3), + add_line_numbers_to_algorithm_body2(Statements2,Body4,Number3,Number2), + %trace, + append([Number1,%%*, + [Dbw_n,Dbw_not]],Body3,Body5), + append([Body5],Body4 + ,Body2), + + !. + + + + +add_line_numbers_to_algorithm_body2(Body1,Body2,Number1,Number2) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("or",Dbw_or1),Dbw_or1=Dbw_or, + + Body1=[[[Dbw_n,Dbw_or],[Statements1,Statements2]]|Statements3], + Number1a is Number1+1, + add_line_numbers_to_algorithm_body2([Statements1],Body3,Number1a,Number3), + add_line_numbers_to_algorithm_body2([Statements2],Body4,Number3,Number4), + add_line_numbers_to_algorithm_body2(Statements3,Body5,Number4,Number2), + append(Body3,Body4,Body34), + Body6=[Number1,[Dbw_n,Dbw_or],Body34 + ], + append([Body6],Body5,Body2), + !. + + +add_line_numbers_to_algorithm_body2(Body1,Body2,Number1,Number2) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, + Body1=[[[Dbw_n,"->"],[Statements1,Statements2]]|Statements3], + Number1a is Number1+1, + add_line_numbers_to_algorithm_body2([Statements1],Body3,Number1a,Number3), + add_line_numbers_to_algorithm_body2([Statements2],Body4,Number3,Number4), + + add_line_numbers_to_algorithm_body2(Statements3,Body5,Number4,Number2), + append(Body3,Body4,Body34), + Body6=[Number1,[Dbw_n,"->"],Body34 + ], + append([Body6],Body5,Body2), + + !. 
+ + + + +add_line_numbers_to_algorithm_body2(Body1,Body2,Number1,Number2) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, + Body1=[[[Dbw_n,"->"],[Statements1,Statements2,Statements2a]]|Statements3], + Number1a is Number1+1, + add_line_numbers_to_algorithm_body2([Statements1],Body3,Number1a,Number3), + add_line_numbers_to_algorithm_body2([Statements2],Body4,Number3,Number4), + %%trace, + add_line_numbers_to_algorithm_body2([Statements2a],Body5,Number4,Number5), + add_line_numbers_to_algorithm_body2(Statements3,Body6,Number5,Number2), + append_list2([Body3,Body4,Body5],Body345), + Body7=[Number1,[Dbw_n,"->"],Body345], + append([Body7],Body6,Body2), + + !. + + +add_line_numbers_to_algorithm_body2(Body1,Body2,Number1,Number2) :- + Body1=[Statement|Statements], + not(predicate_or_rule_name(Statement)), + add_line_numbers_to_algorithm_statement1(Statement,Result1,Number1,Number3), + add_line_numbers_to_algorithm_body2(Statements,Result2,Number3,Number2), + append_list2([Result1,Result2],Body2),!. + +add_line_numbers_to_algorithm_statement1(Statement,Result1,Number1,Number2) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("findall",Dbw_findall1),Dbw_findall1=Dbw_findall, + ((Statement=[[Dbw_n,Dbw_findall],[Arguments1,Arguments2,Arguments3]], + %Arguments=Result2, + %trace, + Number1a is Number1+1, + add_line_numbers_to_algorithm_body2([Arguments2],Body3,Number1a,Number2), + %%*** [Arguments2] to Arguments2 + +%findall(Argument,(member(Argument,Arguments),(predicate_or_rule_name(Argument))),Result2), + Result1=[[Number1,[Dbw_n,Dbw_findall],[Arguments1,Arguments3,Body3]]])). 
+ +/* +add_line_numbers_to_algorithm_statement1(Statement,Result1,Number1,Number2) :- + ((Statement=[[n,maplist],[Arguments1,Arguments2,Arguments3,Arguments4]], + %Arguments=Result2, + %trace, + Number1a is Number1+1, + add_line_numbers_to_algorithm_body2([Arguments1],Body3,Number1a,Number2), + +%findall(Argument,(member(Argument,Arguments),(predicate_or_rule_name(Argument))),Result2), + Result1=[[Number1,[n,maplist],[Arguments2,Arguments3,Arguments4,Body3]]])). + */ + +add_line_numbers_to_algorithm_statement1(Statement,Result1,Number1,Number2) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("findall",Dbw_findall1),Dbw_findall1=Dbw_findall, + %(Statement=[[n,cut]]->trace;true), + + ((Statement=[[Dbw_n,Name],Arguments], + not(Name=Dbw_findall), + + Arguments=Result2, + %findall(Argument,(member(Argument,Arguments),(predicate_or_rule_name(Argument))),Result2), + Result1=[[Number1,[Dbw_n,Name],Result2]])->true; + (Statement=[[Dbw_n,Name]], + Result1=[[Number1,[Dbw_n,Name],[]]])), + Number2 is Number1+1. + +add_line_numbers_to_algorithm_statement1(Statement,Result1,Number1,Number2) :- +get_lang_word("v",Dbw_v1),Dbw_v1=Dbw_v, + ((Statement=[[Dbw_v,Name],Arguments], + %not(Name=findall), + Arguments=Result2, + %findall(Argument,(member(Argument,Arguments),(predicate_or_rule_name(Argument))),Result2), + Result1=[[Number1,[Dbw_v,Name],Result2]])->true; + (Statement=[[Dbw_v,Name]], + Result1=[[Number1,[Dbw_v,Name],[]]])), + Number2 is Number1+1. 
+ +%%*del:Functions2, + +%% *** don't worry about sublevels, just level numbers x + +%ssi0 + +% newer than comments in next section: + +% go to types, pred or lines pred +% - whether this pred determines which to go to, or it is already decided x +% - * sm connects types at start - no call/exit preds, just goes to next item in sm, +% - types sm creator - checks types (do non types sm first) +% - * sm of preds (to clauses from calls, allowing backtracking) +% - whether need call and exit preds, or do anyway x + +/** +ssi1([Level,Sublevel,"predicate",Query,Predicate_numbers_finished,All_predicate_numbers],Functions1,Result1,Result2,Choice_point_trail1,Choice_point_trail2) :- + append(Predicate_numbers_finished,Curr_predicate_queue,All_predicate_numbers), + (Curr_predicate_queue=[]-> + + (% No other clauses of this predicate to try + + % add to cps + append(Choice_point_trail1,[[Level,Sublevel,"predicate",Query,Predicate_numbers_finished,All_predicate_numbers]],Choice_point_trail3), + + + %call first line +%* + + + % when reach last line: + %predicate finished, no more results from it + + % if level 1 end x, call prev cp + + %- if not level 1 call prev lev - find var state of last line of prev lev - update vars + + % + ) + ;( + % Some clauses of this predicate left to try + + % add to cps + % call first line of pred, + % when reach last line, call next clause (dup vars *(one list for old results, one for curr results x label var results with cp, *sublevel at least x, also need ()x-predicate and line numbers*), add to new var list, delete cps back to pred call) + + % * var results/cps saved from where began when try each clause, deleted back until and to when trying and failing respectively + + )), + + % * return from call-header to call first line x check types without being called by pred header (do separately) + % * check types separately (and call, exit separately) from pred, line (after pred) + % when call a pred from a line, use lpi_list_rec find_result etc + % * 
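The `add_line_numbers_to_algorithm_body2/4` clauses above give every statement a sequential number, with each bracketed group (`[]`, `not`, `or`, `->`, `findall` bodies) taking a number of its own before its contents are numbered recursively. A toy Python model of that depth-first numbering — real List Prolog terms carry `[n,...]`/`[v,...]` tags that are omitted here:

```python
def number_body(body, n=0):
    # Depth-first numbering: a plain statement gets the next number;
    # a nested list gets a number of its own, then its contents are
    # numbered recursively, and numbering continues after the group.
    out = []
    for stmt in body:
        if isinstance(stmt, list):
            inner, n2 = number_body(stmt, n + 1)
            out.append([n, inner])
            n = n2
        else:
            out.append([n, stmt])
            n += 1
    return out, n
```

The running counter threaded through the recursion plays the role of the `Number1`/`Number2` accumulator pair, so every line of the program ends up with a unique number for the choice-point trail.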
predicates and lines have separate call and exit/fail preds +% * second sm for types, pred, lines + + % when retries, deletes vars, cps up to and not including that call (*cps have sublevel to delete back until) + + +call_first_line1() :- + query_to_vars(), + + Sublevel2 is Sublevel1+1, + ssi1([Level,Sublevel2,"line",1, % line number + Query*,Predicate_numbers_finished,All_predicate_numbers],Functions1,Result1,Result2,Choice_point_trail1,Choice_point_trail2), + +query_to_vars(Query,) :- + + + + %**** + + find_called_lines(Query,Functions1,Predicate_numbers), + ssi_call(Query,Functions1,Predicate_numbers,Result1), + true. + **/ + /* +find_called_lines(Query,Functions1,Predicate_numbers) :- + findall(Number1,(Query=[Name|Arguments1], + length(Arguments1,Arguments_length), + member([Number1,Name|Arguments2],Functions1), + (Arguments2=[Arguments3,":-",_Body1]->true; + (Arguments2=[":-",_Body2]->Arguments3=[]; + (Arguments2=[Arguments3]->true; + (Arguments2=[]->Arguments3=[])))), + length(Arguments3,Arguments_length)),Predicate_numbers). + +ssi_call(Query,Functions1,Predicate_numbers,Vars21) :- %% ** added vars21 + findall(Vars2,(member(Predicate_number,Predicate_numbers), + %% *** findall xx + member([Predicate_number,_|Arguments2],Functions1), + + Query=[Name|Arguments1], % Name,Arguments1 -> Name|Arguments1 + %% checktypes_inputs(Name,Arguments1) + + (Arguments2=[Arguments3,":-",Body]->true; + (Arguments2=[":-",Body]->Arguments3=[]; + (Arguments2=[Arguments3]->Body=[]; + (Arguments2=[]->(Arguments3=[],Body=[]))))), + + checkarguments(Arguments1,Arguments3,[],Vars1,[],FirstArgs), + + %% debug_call(Skip,[Name,Arguments1]), + +%% find first line to run, keeping track list of command numbers gone past + find_first_line_to_run(Body,Vars1,Vars2)),Vars21), + %run_line + true. +%% find called line +%% find next line +%% - go to next numbered line if in brackets, etc. 
+%% - if last line of brackets +%% - if antecedent that is true, ..., vv +%% if fails, returns to last choice point + +find_first_line_to_run([],Vars,Vars) :- !. +*/ +%%find_first_line_to_run(Body,Vars1,Vars2) :- + + %% line 0, - easy to find + %% usually for next line finds structure of pred, + %% finds next line + +%% find_next_line_to_run (Predicate_number,Previous_line_number) :- +%% Predicate_number= ... +%% Previous_line_number = -1 if just called the predicate + +%% pass nondeterminism, recursion, variable states +%% exit status of previous line, e.g. fail so backtrack + +%% go to line+1 if in body, different if in different body structure, different if in consequent 1 of 2 consequents of statement + +%% as goes, record line numbers to go to when reach a line number based on if then, work these out when at start, delete these x +%% predicate line -2 is return from predicate +%% later: check for singleton in branch + +%% if goes past choice-point (e.g. member, some function calls) add it on to stack +%% recursion is found as go + +/** +Predicate line +0 for first line +-1 if just called the predicate +-2 to return from predicate +-3 if failed the predicate +**/ +/* +find_next_line_to_run(Predicate_number,-1) :- + run(Predicate_number,0),!. +find_next_line_to_run(Predicate_number,Previous_line_number) :- +true. +*/ diff --git a/SSI/ssi3.pl b/SSI/ssi3.pl new file mode 100644 index 0000000000000000000000000000000000000000..6505c6216806c76a0a7ee4201f8202f47b1d7230 --- /dev/null +++ b/SSI/ssi3.pl @@ -0,0 +1,2189 @@ +% test(2,Q,F,R),lucianpl(off,Q,F,R2). + + +/* +see lpi test 15 for swipl call with enough memory for test 15 + +numbers(194,1,[],N),findall(N1,(member(N1,N),test(N1,Q,F,R),catch(call_with_time_limit(4,lucianpl(off,Q,F,R)),_,false),writeln(N1)),N2),sort(N2,N3),writeln(N3),length(N3,L),subtract(N,N3,U),writeln(U). 
+ +turn leash on in lpi, ssi + +turn_save_debug(on),numbers(13,13,[],N),findall(N1,(member(N1,N),test(N1,Q,F,R1),do_saved_debug([]),interpret(on,Q,F,R2),writeln(""),saved_debug(S1),do_saved_debug([]),catch(call_with_time_limit(3,lucianpl(on,Q,F,R3)),_,false),writeln(""),saved_debug(S2),S1=S2,do_saved_debug([]),writeln(N1)),N2),sort(N2,N3),writeln(N3),length(N3,L),subtract(N,N3,U),writeln(U). + +NOT: +turn_save_debug(on),numbers(13,13,[],N),findall(N1,(member(N1,N),test(N1,Q,F,R1),do_saved_debug([]),interpret(on,Q,F,R2),saved_debug(S1),writeln1([s,S1]),writeln(""),do_saved_debug([]),catch(call_with_time_limit(3,lucianpl(on,Q,F,R3)),_,false),saved_debug(S2),writeln([s,S2]),S1=S2,do_saved_debug([]),writeln(N1),writeln("")),N2),sort(N2,N3),writeln(N3),length(N3,L),subtract(N,N3,U),writeln(U). + +*/ + +lucianpl(Debug,Query,Functions1,Result) :- + international_lucianpl([lang,"en"], + Debug,Query,Functions1,Result). + +international_lucianpl([lang,Lang],Debug,Query,Functions1,Result) :- + retractall(lang(_)), + assertz(lang(Lang)), + lucianpl_1(Debug,Query,Functions1,Result). + +international_lucianpl([lang,Lang],Debug,Query,TypeStatements,ModeStatements,Functions1,Result) :- + retractall(lang(_)), + assertz(lang(Lang)), + lucianpl_1(Debug,Query,TypeStatements,ModeStatements,Functions1,Result). + + +lucianpl_1(Debug,Query,Functions1,Result) :- + retractall(types(_)), + assertz(types(off)), +lucianpl11(Debug,Query,Functions1,Result),!. + +lucianpl_1(Debug,Query,TypeStatements,ModeStatements,Functions1,Result) :- + retractall(types(_)), + assertz(types(on)), + retractall(typestatements(_)), + findall([A,C],(member([A,B],TypeStatements),expand_types(B,[],C)),TypeStatements1), + +assertz(typestatements(TypeStatements1)), + retractall(modestatements(_)), + assertz(modestatements(ModeStatements)), +lucianpl11(Debug,Query,Functions1,Result). 
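The entry points above layer the setup: `lucianpl/4` defaults the language to "en", `international_lucianpl` records it, and `lucianpl_1` records whether type checking is on (keeping the expanded type and mode statements when it is) before calling `lucianpl11`. A schematic Python sketch of that layering, with a dict standing in for the `retractall`/`assertz` globals — all names here are illustrative, not the repository's API:

```python
STATE = {}

def lucianpl(debug, query, functions):
    # lucianpl/4: default the language to "en".
    return international_lucianpl("en", debug, query, functions)

def international_lucianpl(lang, debug, query, functions,
                           type_statements=None, mode_statements=None):
    # retractall(lang(_)), assertz(lang(Lang))
    STATE["lang"] = lang
    # lucianpl_1: types(on) only when type statements were supplied.
    STATE["types"] = type_statements is not None
    if type_statements is not None:
        STATE["typestatements"] = type_statements
        STATE["modestatements"] = mode_statements
    return run_interpreter(debug, query, functions)

def run_interpreter(debug, query, functions):
    # Stand-in for lucianpl11/4; just report the recorded flags.
    return STATE["lang"], STATE["types"]
```

Because the flags are global, each top-level call overwrites the previous configuration, mirroring the retract-then-assert pattern in the Prolog.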
+ +lucianpl11(Debug,Query,Functions,Result) :- + ((not(lang(_Lang1)) + %var(Lang1) + )-> + (retractall(lang(_)), + assertz(lang("en"))); + true), + load_lang_db, + query_box(Query,Query1,Functions,Functions1), + + %writeln1(query_box(Query,Query1,Functions,Functions1)), +%%writeln1([i1]), + %%writeln1(convert_to_grammar_part1(Functions1,[],Functions2,_)), + convert_to_grammar_part1(Functions1,[],Functions2,_), + %insert_cuts(Functions2a,Functions2), + + %writeln1(convert_to_grammar_part1(Functions1,[],Functions2,_)), %trace, + %writeln1(Functions2), + %%pp3(Functions2), + %%writeln1(lucianpl1(Debug,Query,Functions2,Functions2,Result)), + %findall(Result1, + + %trace, + add_line_numbers_to_algorithm1(Functions2,Functions2a), + + %writeln1(add_line_numbers_to_algorithm1(Functions2,Functions2a)), + %writeln1(Functions2a), + + %find_pred_sm(Reserved_words),%,"en"), +find_pred_numbers(Functions2a,[]%Reserved_words +,Pred_numbers), + + retractall(pred_numbers(_)), + assertz(pred_numbers(Pred_numbers)), +%trace, + find_state_machine1(Functions2a,Functions3,Pred_numbers), + +%writeln1(find_state_machine1(Functions2a,Functions3,Pred_numbers)), + % find first predicate +%trace, + prep_predicate_call(Query1,Functions3, + All_predicate_numbers), + + lucianpl1(Debug), + + % ssi1([Level, % Trace level + % All_predicate_numbers1 % List of instances of this predicate left to call + % Predicate_or_line_number, % predicate nor line number in predicate + % Destination, % "predicate" or "line" + % Query, % Query when Destination="predicate" + % Vars, % Input Vars + % All_predicate_numbers], % predicates remaining to try at current level - saved in cp trail + % End_result, % Result of predicate + % Functions, % functions in algorithm + % Vars2, % output vars + % Result1, Result2, % Initial and cumulative lists of non-deterministic results + % Globals1, Globals2, % Initial and cumulative lists of assertz globals + % Choice_point_trail1, % Initial and cumulative lists of choice points + 
% Choice_point_trail2, Cpv1, Cpv2) + %trace, + %trace, + findall([All_predicate_numbers0,"prev_pred_id",0],member(All_predicate_numbers0,All_predicate_numbers),All_predicate_numbers01), + All_predicate_numbers01=[[All_predicate_numbers1,"prev_pred_id",Prev_pred_id]|All_predicate_numbers2], + %trace, + ssi1([["prev_pred_id",Prev_pred_id],1,All_predicate_numbers1,-1,"predicate",Query1,[], + All_predicate_numbers2],_End_result,Functions3,_Vars2,[],Result1, + [],_Globals2, + [], _Choice_point_trail2, + [[curr_cp,0],[curr_cp_index,0],[global_cp_trail,[]]],_), + Result=Result1. + +lucianpl1(Debug) :- + + retractall(debug(_)), + assertz(debug(Debug)), + retractall(cut(_)), + assertz(cut(off)), + + %retractall(leash1(_)), + %assertz(leash1(off)), + %% Should normally be off + (not(leash1(_))->(retractall(leash1(_)),assertz(leash1(off)));true), + + + retractall(sys(_)), + assertz(sys(1)), + retractall(pred_id(_)), + assertz(pred_id(%100% + 0 + )), + + retractall(debug2(_)), + retractall(debug3(_)), + retractall(debug4(_)), + retractall(retry_back(_)), + retractall(retry_back_stack(_)), + retractall(retry_back_stack_n(_)), + retractall(cumulative_or_current_text(_)), + retractall(number_of_current_text(_)), + %retractall(html_api_maker_or_terminal(_)), + %retractall(screen_text(_)), + %retractall(curr_screen_text(_)), + + retractall(session_number(_)), + assertz(session_number(_)), + + assertz(debug2(off)), % on - displays ssi debug info + assertz(debug3(off)), % on - displays level + assertz(debug4(off)), % on - displays append cp etc. 
trace + assertz(retry_back(on)), % on - retry/back mode options available in trace mode + assertz(retry_back_stack([])), % on - retry/back mode options available in trace mode + %assertz(screen_text([])), + % assertz(curr_screen_text("")), + assertz(retry_back_stack_n(0)), + assertz(cumulative_or_current_text(current)), + assertz(number_of_current_text(1)), + %assertz(html_api_maker_or_terminal(html + %terminal + %)), + (not(save_debug(_))->(retractall(save_debug(_)),assertz(save_debug(off)));true), + +(not(equals4(_Equals4))->(retractall(equals4(_)),assertz(equals4(on)));true). + +append_retry_back_stack(_Item) :- +true. +%trace, + +/* find_retry_back_stack_n(N2), + retry_back_stack(List1), + append(List1,[[N2,Item]],List2), + retractall(retry_back_stack(_)), + assertz(retry_back_stack(List2)). +*/ + +/* +replace(A,Find,Replace,F) :- + split_string(A,Find,Find,B),findall([C,Replace],(member(C,B)),D),maplist(append,[D],[E]),concat_list(E,F),!. +*/ + +/* +print_text :- +trace, + html_api_maker_or_terminal(Html_api_maker_or_terminal), + retry_back_stack(Stack), + cumulative_or_current_text(Cumulative_or_current_text), + (Cumulative_or_current_text=cumulative-> + findall([Text1,"\n"],member([_,[text,Text1]],Stack),Text2); + get_curr_text(Stack,Text2)), + term_to_atom(Text2,Text4), + %concat_list(Text2,Text4), + (Html_api_maker_or_terminal=html-> + (replace(Text4,"\n","
",Text3), + format(Text3,[])); + (Text4=Text3, + writeln(Text3))). + +get_curr_text(Stack,Text2) :- +trace, + number_of_current_text(N1), + ((%member([N1,[_,Text1]],Stack), + append(_,[[N1,[_,Text1]]|A],Stack))->true;Text1=[]), + findall([Text10,"\n"],member([_,[text,Text10]],[[N1,[_,Text1]]|A]),Text2), + + %flatten(Text2,Text3), + + retry_back_stack_n(N2), + retractall(number_of_current_text(_)), + assertz(number_of_current_text(N2)). +*/ +ssi1([_,0,Predicate_number,Line_number,"predicate",Query_a, + Vars,All_predicate_numbers], End_result, Functions,Vars2, + Result1, Result2, + Globals1,Globals2, + Choice_point_trail1, + Choice_point_trail3, + CP_Vars1,CP_Vars2) :- + + append_retry_back_stack([ssi,[[_,0,Predicate_number,Line_number,"predicate",Query_a, + Vars,All_predicate_numbers], End_result, Functions,Vars2, + Result1, Result2, + Globals1,Globals2, + Choice_point_trail1, + Choice_point_trail3, + CP_Vars1,CP_Vars2]]), + + +(( Level=_, + %/* + (debug2(on)-> + writeln1(ssi1([_,0,Predicate_number,Line_number,"predicate",Query_a, + Vars,All_predicate_numbers], End_result, Functions,Vars2, + Result1, Result2, + Globals1,Globals2, + Choice_point_trail1, + Choice_point_trail3, + CP_Vars1,CP_Vars2));true), + %*/ + + %trace, + (Line_number = -2 -> + (%trace, + + %***** + + +( + +%**** + ((var(Vars)->[Result1]=Result2; + append(Result1,[Vars],Result2)),%, + + %save_local_to_global_cp_trail(Choice_point_trail1,Choice_point_trail3,CP_Vars1,CP_Vars2) + + Choice_point_trail1=Choice_point_trail3, + CP_Vars1=CP_Vars2 + +%Choice_point_trail1=Choice_point_trail3 + ) + )); + + (Line_number = -3 -> + + (%trace, + + %***** + %save_local_to_global_cp_trail(Choice_point_trail1,Choice_point_trail1a,CP_Vars1,CP_Vars1a), +Choice_point_trail1=Choice_point_trail1a, +CP_Vars1=CP_Vars1a, + + + +(( +%writeln(here2), + Query2=[_|_], + +%writeln("*2"), + +get_last_cp_before_n(Choice_point_trail1a, + 
[Pred_id,Level1,Predicate_number2,Line_number2b,Pred_or_line,Query2,Vars4,All_predicate_numbers2], + [Cp_a,Cb_b,Pred_id,Level1,Predicate_number2,Line_number2b,Pred_or_line,Query2,Vars4,All_predicate_numbers2],_, + CP_Vars1a,CP_Vars3) + +%(((Pred_or_line="predicate",not(All_predicate_numbers2=[]))->true;(Pred_or_line="line",All_predicate_numbers2=[_,_,_,_,_,_,Vars2c],not(Vars2c=[])))) + + + + ) +-> + + (Pred_or_line="predicate"-> + + ( + + delete_cp(Choice_point_trail1,[Cp_a,Cb_b,Pred_id,Level,Predicate_number2,Line_number2b,Pred_or_line,Query2,Vars4,All_predicate_numbers2],Choice_point_trail12, + CP_Vars3,CP_Vars4,_), + +All_predicate_numbers2=[All_predicate_numbers3|All_predicate_numbers4], + + All_predicate_numbers3=[All_predicate_numbers31,"prev_pred_id",Prev_pred_id], + + + %(Line_number2b = ["returns to", _, "pred_id", _] -> (Line_number2b1 = -1,trace) ; Line_number2b1 = Line_number2b), + + ssi1([Prev_pred_id,Level,All_predicate_numbers31,Line_number2b,"predicate",Query2, + Vars4,All_predicate_numbers4], End_result,Functions,Vars2, + Result1, Result2, + Globals1,Globals2, + Choice_point_trail12, + Choice_point_trail3, + CP_Vars4,CP_Vars2) + +); + +(Pred_or_line="line" -> + +( + +delete_until_last_cp(Choice_point_trail1,Choice_point_trail6,D1,AC,CP_Vars3,CP_Vars5), + + ( + D1=[_,_,Pred_id2,Level11,Predicate_number11,Line_number_a11,"line",-, + Vars2d11,Vars2e11], + + ssi1([Pred_id2,Level11,Predicate_number11,Line_number_a11,"line",-, + Vars2d11,Vars2e11], End_result, Functions,Vars2, + Result1, Result2, + Globals1,Globals2, + Choice_point_trail6, + Choice_point_trail3,["appearance of command",AC], + CP_Vars5,CP_Vars2) + )))); + +%**** + (Result2=Result1, + CP_Vars1a=CP_Vars2, + Globals1=Globals2, + Choice_point_trail1a=Choice_point_trail3) + + + )))) + )->true;(writeln0([ssi1,0,abort]), + %abort% + fail%number_string(a,_)%abort + )),!. 
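+
+% Illustrative sketch only - the helper below is hypothetical and is not
+% called by the interpreter; it just names the special line-number
+% convention that the ssi1 clauses dispatch on:
+ssi_line_status(0,first_line).  % first line of the predicate body
+ssi_line_status(-1,call).       % the predicate has just been called
+ssi_line_status(-2,exit).       % successful return from the predicate
+ssi_line_status(-3,fail).       % failed exit; backtrack to the last choice point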
+ + + +ssi1([Pred_id_a1,Level,Predicate_number,Line_number,"predicate",Query_a, + Vars,All_predicate_numbers], Result21, Functions,Vars2, + Result1, Result2, + Globals1,Globals2, + Choice_point_trail1, + Choice_point_trail3, + CP_Vars1,CP_Vars2) :- + + +append_retry_back_stack([ssi,[[Pred_id_a1,Level,Predicate_number,Line_number,"predicate",Query_a, + Vars,All_predicate_numbers], Result21, Functions,Vars2, + Result1, Result2, + Globals1,Globals2, + Choice_point_trail1, + Choice_point_trail3, + CP_Vars1,CP_Vars2]]), + + +(( %writeln([all_predicate_numbers,All_predicate_numbers]), + + (Pred_id_a1=["prev_pred_id",Pred_id1]->true;Pred_id_a1=Pred_id1), + + (debug2(on)-> +writeln1( +ssi1([Pred_id_a1,Level,Predicate_number,Line_number,"predicate",Query_a, + Vars,All_predicate_numbers], Result21, Functions,Vars2, + Result1, Result2, + Globals1,Globals2, + Choice_point_trail1, + Choice_point_trail3, + CP_Vars1,CP_Vars2));true), + %*/ + ((not(Line_number= -2), not(Line_number= -3))-> + (Query_a=[Function,Arguments1]->Query_a=Query; + ((Query_a=[Function],Query=[Function,[]],Arguments1=[])->true; + (Query_a=(-),Query=Query_a))); + true), + + (Line_number = -1 % just started + -> + (%trace, + + (( + member([Predicate_number,Function,Arguments2,":-",_Body],Functions), + length(Arguments1,Length), + length(Arguments2,Length), + +((( +checktypes_inputs(Function,Arguments1), + + %%writeln1(checkarguments(Arguments1,Arguments2,[],Vars1,[],FirstArgs)), + %trace, + checkarguments(Arguments1,Arguments2,[],Vars3,[],FirstArgs), + %notrace, + +%writeln1([checkarguments,"Arguments1",Arguments1,"Arguments2",Arguments2,"Vars1",Vars1,"FirstArgs",FirstArgs]), + (debug3(on)-> +write0(["L",Level]);true), +%trace, +debug_call(Skip,[Function,Arguments1]) +%,notrace +) + + +), + +find_pred_id(Pred_id), + +%trace, +%writeln1([pred_id_chain,Pred_id1,Pred_id]), +append(Globals1, +[[[firstargs,Pred_id],FirstArgs], +[[function,Pred_id],Function], +[[arguments1,Pred_id],Arguments1], 
+[[skip,Pred_id],Skip], +[[level,Pred_id],Level], +[[pred_num,Pred_id],Predicate_number] +],Globals333), + +(var(Pred_id1)->Globals333=Globals3; +append(Globals333,[[pred_id_chain,Pred_id1,Pred_id]], +Globals3)), + + %get_last_p_before_n(Choice_point_trail1,[_Pred_id_1,_Level_1,[_P1,"prev_pred_id",Predicate_number],-1,"predicate",_46492998,_46493004,[]]],[Cp_a1,Cp_a2|D1],_,CP_Vars31,CP_Vars4), + +%trace, +%(var(Query)->Query=(-);true), +%/* + +/* +writeln1([*,Choice_point_trail1,[[Pred_id,Level,Predicate_number,-1,"predicate",Query, + Vars3, + All_predicate_numbers]],Choice_point_trail11, + CP_Vars1,CP_Vars3]), + */ + +%/* +append_cp(Choice_point_trail1,[[Pred_id,Level,Predicate_number,-1,"predicate",Query, + Vars3, + All_predicate_numbers]],Choice_point_trail11, + CP_Vars1,CP_Vars3), + %*/ +%%*/ +%Choice_point_trail1=Choice_point_trail11,CP_Vars1=CP_Vars3, + + /*writeln1(["*3",append_cp(Choice_point_trail1,[[Pred_id,Level,Predicate_number,-1,"predicate",Query, + Vars3, + All_predicate_numbers]],Choice_point_trail11, + CP_Vars1,CP_Vars3)]),*/ + %writeln1(Choice_point_trail11), + + ssi1([Pred_id,Level,Predicate_number,0,"line",-, + Vars3, + All_predicate_numbers], _, Functions,Vars2, % first Vars1 to Vars, 2nd Vars1 to Vars2 + Result1, Result2, + Globals3,Globals2, + Choice_point_trail11, + Choice_point_trail3, + CP_Vars3,CP_Vars2) + + )->true; + + (%Line_number = -1, + + (debug3(on)-> +write0(["L",Level]);true), +%trace, +debug_call(Skip1,[Function,Arguments1]), + + (debug_fail(Skip1,[Function,Arguments1])->true;true), % below instead + + All_predicate_numbers=[Predicate_number_a|All_predicate_numbers2], + + Predicate_number_a=[Predicate_number_a1,"prev_pred_id",Prev_pred_id], + + + %trace, + + delete_cp_value(Choice_point_trail1,[Predicate_number_a1,"prev_pred_id",Prev_pred_id],Choice_point_trail31,CP_Vars1,CP_Vars21), + + %trace, + + %Pred_id1=["prev_pred_id",Pred_id], + %(Pred_id=3->writeln(here4);true), + 
+ssi1([["prev_pred_id",Prev_pred_id],Level,Predicate_number_a1,Line_number,"predicate",Query_a, + Vars,All_predicate_numbers2], Result21, Functions,Vars2, + Result1, Result2, + Globals1,Globals2, + Choice_point_trail31, + Choice_point_trail3, + CP_Vars21,CP_Vars2) + ))->true; + + (%trace, + Level2 is Level-1, + %member([pred_id_chain,PID1,Pred_id1],Globals1), + %member([[pred_num,PID1],Predicate_number_c],Globals1), + +pred_minus_one_fail2([Pred_id1,Level2,Predicate_number,-3,"predicate",-, % (-) as pred id + [],_All_predicate_numbers], Result21,Functions,Vars2, + Result1, Result2,%2, + Globals1,Globals2, + Choice_point_trail1, + Choice_point_trail3, + CP_Vars1,CP_Vars2)) + ) + + + ); + + (Line_number = -2 + % true exit from pred + ->( + ( + %trace, + %delete_cp(Choice_point_trail1, + %[_,_,Pred_id1,Level,Predicate_number,_,"predicate"|_], + %Choice_point_trail1a,CP_Vars1,CP_Vars1a,_), + + %save_local_to_global_cp_trail(Choice_point_trail1,Choice_point_trail1a,CP_Vars1,CP_Vars1a), + Choice_point_trail1=Choice_point_trail1a, + CP_Vars1=CP_Vars1a, + + %Choice_point_trail1=Choice_point_trail1a, + %CP_Vars1=CP_Vars1a, + + member([[firstargs,Pred_id1],FirstArgs],Globals1), + + member([[arguments1,Pred_id1],Arguments1],Globals1), +%trace, + member([[skip,Pred_id1],Skip],Globals1), + + member([[level,Pred_id1],Level_a],Globals1), + + %delete(Globals1,[[firstargs,Pred_id1],FirstArgs],Globals3) + Globals1=Globals3, + + (debug3(on)-> +write0(["L",Level_a]);true), + + debug_fail_fail(Skip), +%trace, + updatevars(FirstArgs,Vars,[],Result), + unique1(Result,[],Vars3), + + Pred_id1=Pred_id4, + + get_last_p_before_n(Choice_point_trail1a,[%_,_, + Pred_id4,Level1,Predicate_number4,-1,"predicate",[Function,Arguments1], %** + Vars9,All_predicate_numbers4], + [_,_,Pred_id4,Level1,Predicate_number4,-1,"predicate",[Function,Arguments1], %** + Vars9,All_predicate_numbers4],_%Choice_point_trail5 + , + CP_Vars1a,CP_Vars3), + + Choice_point_trail1a=Choice_point_trail2, + CP_Vars3=CP_Vars41, 
+ + %trace, + /* + delete_cp(Choice_point_trail1,[_,_, + Pred_id4,Level1,Predicate_number4,-1,"predicate",[Function,Arguments1], %** + Vars9,All_predicate_numbers4],Choice_point_trail2, + CP_Vars3,CP_Vars41,_), + */ + + + /* + trace,writeln1(["here1:",%_,_, + Pred_id4,Level1,Predicate_number4,-1,"predicate",[Function,Arguments1], %** + Vars9,All_predicate_numbers4]), + */ +%trace, + findresult3(Arguments1,Vars3,[],Result22), + + %trace, + (debug3(on)-> +write0(["L",Level_a]);true), +%notrace, + debug_exit(Skip,[Function,Result22]), % return result21 + checktypes(Function,Result22), + + ((not(Level_a=0))->( + Level2 is Level-1, + + (Level2 = 0 -> + + %trace, + ( + +ssi1([_,0,_Predicate_number,-2,"predicate",_Query_a, + Vars3,_All_predicate_numbers], _Result21, Functions,Vars2, + Result1, Result2, + Globals1,Globals2, + Choice_point_trail2, + Choice_point_trail3, + CP_Vars41,CP_Vars2)); + + (%trace, + +%writeln1([globals3,Globals3]), +member([pred_id_chain,PID1,Pred_id1],Globals3), +%trace, + get_last_p_before_n(Choice_point_trail2, + [PID1,Level3,Predicate_number2,Line_number2b,"predicate",Query2,Vars4,All_predicate_numbers2], + [_Cp_a,_Cb_b,PID1,Level3,Predicate_number2,Line_number2b,"predicate",Query2,Vars4,All_predicate_numbers2],_, + CP_Vars41,CP_Vars42), + + %delete_cp(Choice_point_trail2,[_Cp_a,_Cb_b,PID1,Level3,Predicate_number2,Line_number2b,"predicate",Query2,Vars4,All_predicate_numbers2],Choice_point_trail2a,CP_Vars42,CP_Vars4,_), +Choice_point_trail2=Choice_point_trail2a,CP_Vars42=CP_Vars4, + + + (Line_number2b=["returns to", + Line_number3,"pred_id",Pred_id3] + -> + ( +member([[pred_num,Pred_id3],Predicate_number2],Globals3), + +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("true",Dbw_true1),Dbw_true1=Dbw_true, + +member([Predicate_number2,_F|Rest],Functions), +(Rest=[_Args,":-",Lines]->true; +(Rest=[_Args]->Lines=[[[Dbw_n,Dbw_true]]]; +(Rest=[":-",Lines]; +(Rest=[],Lines=[[[Dbw_n,Dbw_true]]])))), + 
+get_lang_word("on_true",Dbw_on_true1),Dbw_on_true1=Dbw_on_true, +get_lang_word("go_after",Dbw_go_after1),Dbw_go_after1=Dbw_go_after, +get_lang_word("on_false",Dbw_on_false1),Dbw_on_false1=Dbw_on_false, +get_lang_word("go_to_predicates",Dbw_go_to_predicates1),Dbw_go_to_predicates1=Dbw_go_to_predicates, + + + member([Line_number3,[Dbw_on_true,A],[Dbw_go_after,_B],[Dbw_on_false,_C],[Dbw_go_to_predicates,_D]|_Line],Lines), +A=Line_number2a, +%trace, + reverse(Globals1,Globals33), + + member([[firstargs_uv2,Pred_id3],FirstArgs1],Globals33), +%trace, +%writeln1([globals33,Globals33]), + Globals33=Globals43, + + member([[vars1,Pred_id3],Vars11],Globals43), + + Globals43=Globals212, + +reverse(Globals212,_Globals22), + + updatevars2(FirstArgs1,Vars3,[],Vars5), + (var(Vars11)->(Vars11=[]%,trace + );true), + updatevars3(Vars11,Vars5,Vars6), + reverse(Vars6,[],Vars7), + ((not(Vars7=[])-> + ( + unique1(Vars7,[],Vars8) + ) +);( + Vars8=[])), + + Vars8=Vars44 %% 4 not 2? *** +) +; + +(Line_number2a=Line_number2b +%, +%,Level_x=Level2 +%Vars44=Vars3 + +)), % Line_number2 to 2b + + %trace, + ssi1([PID1,Level2,Predicate_number2,Line_number2a,"line",-, + Vars44,All_predicate_numbers2], _, Functions,Vars2, + Result1, Result2, + Globals3,Globals2, + Choice_point_trail2a, + Choice_point_trail3, + CP_Vars4,CP_Vars2) + + ))))) +->true; + ( + ssi1([Pred_id_a1,Level,Predicate_number,-3%Line_number + ,"predicate",Query_a, + Vars,All_predicate_numbers], Result21, Functions,Vars2, + Result1, Result2, + Globals1,Globals2, + Choice_point_trail1, + Choice_point_trail3, + CP_Vars1,CP_Vars2) + + ) +) + %)); + ; + (Line_number = -3 + % fail exit from pred + ->( + + %save_local_to_global_cp_trail(Choice_point_trail1,Choice_point_trail1a,CP_Vars1,CP_Vars1a), + Choice_point_trail1=Choice_point_trail1a, + CP_Vars1=CP_Vars1a, + + +pred_minus_three([Pred_id_a1,Level,Predicate_number,Line_number,"predicate",Query_a, + Vars,All_predicate_numbers], Result21, Functions,Vars2, + Result1, Result2, + 
Globals1,Globals2, + Choice_point_trail1a, + Choice_point_trail3, + CP_Vars1a,CP_Vars2) + + )))) + + )->true;(writeln0([ssi1,predicate,abort]), + %abort% + fail + %number_string(a,_)%abort + )),!. + + + + +ssi1([Pred_id,Level,Predicate_number,Line_number_a,"line",Query, + Vars1,All_predicate_numbers], _, Functions,Vars2, + Result1, Result2, + Globals1,Globals2, + Choice_point_trail1, + Choice_point_trail3, + CP_Vars1,CP_Vars2) :- + +ssi1([Pred_id,Level,Predicate_number,Line_number_a,"line",Query, + Vars1,All_predicate_numbers], _, Functions,Vars2, + Result1, Result2, + Globals1,Globals2, + Choice_point_trail1, + Choice_point_trail3,["appearance of command",-], + CP_Vars1,CP_Vars2). + +ssi1([Pred_id,Level,Predicate_number,Line_number_a,"line",Query, + Vars1,All_predicate_numbers], _, Functions,Vars2, + Result1, Result2, + Globals1,Globals2, + Choice_point_trail1, + Choice_point_trail3,["appearance of command",AC], + CP_Vars1,CP_Vars2) :- + +append_retry_back_stack([ssi,[[Pred_id,Level,Predicate_number,Line_number_a,"line",Query, + Vars1,All_predicate_numbers], _, Functions,Vars2, + Result1, Result2, + Globals1,Globals2, + Choice_point_trail1, + Choice_point_trail3,["appearance of command",AC], + CP_Vars1,CP_Vars2]]), + %/* + +(( %writeln([all_predicate_numbers,All_predicate_numbers]), + (debug2(on)-> +writeln1( + ssi1([Pred_id,Level,Predicate_number,Line_number_a,"line",Query, + Vars1,All_predicate_numbers], _, Functions,Vars2, + Result1, Result2, + Globals1,Globals2, + Choice_point_trail1, + Choice_point_trail3,["appearance of command",AC], + CP_Vars1,CP_Vars2));true), + +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("true",Dbw_true1),Dbw_true1=Dbw_true, + + +member([Predicate_number,_F|Rest],Functions), +(Rest=[_Args,":-",Lines]->true; +(Rest=[_Args]->Lines=[[[Dbw_n,Dbw_true]]]; +(Rest=[":-",Lines]; +(Rest=[],Lines=[[[Dbw_n,Dbw_true]]])))), + +(%trace, +get_lang_word("on_true",Dbw_on_true1),Dbw_on_true1=Dbw_on_true, 
+get_lang_word("go_after",Dbw_go_after1),Dbw_go_after1=Dbw_go_after, +get_lang_word("on_false",Dbw_on_false1),Dbw_on_false1=Dbw_on_false, +get_lang_word("go_to_predicates",Dbw_go_to_predicates1),Dbw_go_to_predicates1=Dbw_go_to_predicates, + +%trace, +return_to_last_non_end_function(Line_number_a,Lines,Line_number_b,[Dbw_on_true,A],[Dbw_go_after,B],[Dbw_on_false,C],[Dbw_go_to_predicates,D],Line,Globals1,Pred_id,Line_number_a,FA,(-),End_line42) + +,(debug2(on)->writeln1([*,return_to_last_non_end_function(Line_number_a,Lines,Line_number_b,[Dbw_on_true,A],[Dbw_go_after,B],[Dbw_on_false,C],[Dbw_go_to_predicates,D],Line,Globals1,Pred_id,Line_number_a,FA,(-),End_line42)]);true) + +), + + +((%trace,(( +not(Line_number_b= -2),not(Line_number_b= -3), +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("findall",Dbw_findall1),Dbw_findall1=Dbw_findall, + +Line=[[Dbw_n,Dbw_findall],[S1,S2]])-> + +(%trace, +%trace, +%writeln([here_ln,Line_number_a,Line_number_b]), +append_cp(Choice_point_trail1,[[Pred_id,Level,Predicate_number,Line_number_b,"findall",-, + [old_vars,Vars1],[findall_vars,[]],[format_vars,S1], + [result_var,S2]]],Choice_point_trail1e, + CP_Vars1,CP_Vars3) + + +); +(Choice_point_trail1=Choice_point_trail1e, +CP_Vars1=CP_Vars3) +), + +%(Pred_id=1->trace;true), + +del_append(Globals1,[[[vars1,Pred_id],Vars1]],Globals3), + + ((get_lang_word("findall_exit_function",Dbw_findall_exit_function1),Dbw_findall_exit_function1=Dbw_findall_exit_function, +get_lang_word("findall_fail_function",Dbw_findall_fail_function1),Dbw_findall_fail_function1=Dbw_findall_fail_function, + + + (Line_number_b=[Dbw_findall_exit_function,Findall_end_line]->true; + Line_number_b=[Dbw_findall_fail_function,Findall_end_line]) + %,trace + %writeln(here0) + )-> + + (( + +%trace, + +cp_since_findall_start(Choice_point_trail1e,Level,_D10,E1,D1,CP_Vars3,CP_Vars31) + %writeln1(cp_since_findall_start(Choice_point_trail1e,Level,D1,E1,CP_Vars3,CP_Vars31)) + )-> + ( +%trace, + 
+%cut_cps_if_necessary1(Pred_id,Choice_point_trail1e,Choice_point_trail1f,CP_Vars31,CP_Vars311,Predicate_number,Globals3), + + process_cp(Findall_end_line,FA,D1,E1, + +_, + +Vars1, + _End_result, Functions,Vars2, %% CPVs here?**** CPV0,CPV to CPV1,CPV2 + Result1, Result2, + Globals3,Globals2, + Choice_point_trail1e, + Choice_point_trail3, + CP_Vars31,CP_Vars2) + ); + + ( + %trace, + %cut_cps_if_necessary1(Pred_id,Choice_point_trail1e,Choice_point_trail1f,CP_Vars3,CP_Vars311,Predicate_number,Globals3), + +%clear_cps(Choice_point_trail1e,Choice_point_trail1f,CP_Vars3,CP_Vars3a), + +Choice_point_trail1e=Choice_point_trail1f, + +CP_Vars3=CP_Vars3a, + %writeln(here2), + end_nested_findall(FA + ,Pred_id,Level,Predicate_number,Line_number_b,Choice_point_trail1f,Choice_point_trail3,Vars1,Vars2,CP_Vars3a,CP_Vars2,Functions,Globals3,Globals2,Result1, Result2,End_line42) + +)); +%Level32 is Level-1, + ((Line_number_b= -1 -> true;(Line_number_b= -2 ->true;Line_number_b= -3))-> + %trace, + + ((Line_number_b= -3)-> + + e(Pred_id,Level,Predicate_number,Vars1,_End_result,Functions,Vars2,Result1, Result2, + Globals3,Globals2, + Choice_point_trail1e, + Choice_point_trail3, + CP_Vars3,CP_Vars2 + ); + + (%trace, + ssi1([Pred_id,Level, %* + Predicate_number,Line_number_b,"predicate",Query, + Vars1,All_predicate_numbers], _End_result, Functions,Vars2, + Result1, Result2, + Globals3,Globals2, + Choice_point_trail1e, + Choice_point_trail3, + CP_Vars3,CP_Vars2))); + + + ( %(trace, + %not(D='-') + (((%trace, + D=[_|_])-> true;(D=(*),get_lang_word("v",Dbw_v1),Dbw_v1=Dbw_v,Line=[Function,Arguments],Function=[Dbw_v,_Function2],%,not(reserved_word2(Function2)) + +append([Function],Arguments,Arguments1), + substitutevarsA1(Arguments1,Vars1,[],Vars3,[],FirstArgs), + + append(Globals3,[[[firstargs_uv2,Pred_id],FirstArgs]],Globals31), + del_append(Globals31,[[[vars1,Pred_id],Vars1]],_Globals32), + + Vars3=[Function1|Vars31], + _Query2=[Function1,Vars31], + 
pred_numbers(Pred_numbers),length(Arguments,Arity1),member([Function1,Arity1,_Pred_numbers1],Pred_numbers))) + ) -> + +(d(Pred_id,D,Level,Predicate_number,Line_number_b,Query,Vars1,Vars2,All_predicate_numbers,Line,Choice_point_trail1e,Globals3,Functions,Result1, Result2,Globals2,Choice_point_trail3,CP_Vars3,CP_Vars2)); + +((get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, + get_lang_word("cut",Dbw_cut1),Dbw_cut1=Dbw_cut, + %writeln1([line,Line]), + + Line=[[Dbw_n,Dbw_cut],[]]) -> + %trace, + (cut_cps(Choice_point_trail1e,Choice_point_trail11,CP_Vars3,CP_Vars4,Pred_id,Predicate_number,Globals3), + + %writeln1(cut_cps(Choice_point_trail1e,Choice_point_trail11,CP_Vars3,CP_Vars4,Pred_id,Predicate_number,Globals3)), + + ssi1([Pred_id,Level,Predicate_number,A,"line",Query, + Vars1,All_predicate_numbers], _End_result3, Functions,Vars2, + Result1, Result2, + Globals3,Globals2, + Choice_point_trail11, + Choice_point_trail3, + CP_Vars4,CP_Vars2) + + ); + + (((%trace, + get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, + %trace, + get_lang_word("read_string",Dbw_read_string1),get_lang_word("read_password",Dbw_read_string2), + get_lang_word("text_area",Dbw_read_string3), + %writeln1([Dbw_read_string1,Dbw_read_string2]), + %trace, + %Dbw_read_string1=Dbw_read_string->true;Dbw_read_string2=Dbw_read_string), + %trace, + %writeln1([line,Line]), + + ((Line=[[Dbw_n,Dbw_read_string1],[Variable1]], + Dbw_read_string1=Dbw_read_string)->true; + (Line=[[Dbw_n,Dbw_read_string2],[Variable1]], + Dbw_read_string2=Dbw_read_string); + (Line=[[Dbw_n,Dbw_read_string3],[Variable1,Variable2,Variable3]], + Dbw_read_string3=Dbw_read_string)), + + %true% + html_api_maker_or_terminal(html + ) + )-> + + ( + %trace, + % if html, output web form code, stop + % () if api maker, + % if terminal, use ssi read string below + + %interpretpart(read_string1,Variable1,Vars1,Vars3) + + +/* + (var(Skip)->Globals3=Globals4; + append(Globals3,[[[skip,Pred_id,Line_number_b],Skip]],Globals4)), + + (%trace, + 
Vars2c=[]->(Choice_point_trail1e=Choice_point_trail11, + CP_Vars3=CP_Vars4); + append_cp(Choice_point_trail1e,[[Pred_id,Level,Predicate_number,Line_number_a,"line",_, + Vars3,Vars2c]],Choice_point_trail11,CP_Vars3,CP_Vars4)), +*/ + +getvalue(Variable1,Value1,Vars1), + +(Dbw_read_string3=Dbw_read_string-> +(getvalue(Variable2,Value2,Vars1), +getvalue(Variable3,Value3,Vars1), +debug_call(Skip,[[Dbw_n,Dbw_read_string3],[Value1,Value2,variable]])); +% text area + +(debug_call(Skip,[[Dbw_n,Dbw_read_string],[variable]]))), + + +lang(Lang), + +debug2(Debug2), +debug3(Debug3), +debug4(Debug4), +retry_back(Retry_back), +retry_back_stack(Retry_back_stack), +retry_back_stack_n(Retry_back_stack_n), +cumulative_or_current_text(Cumulative_or_current_text), +number_of_current_text(Number_of_current_text), +html_api_maker_or_terminal(Html_api_maker_or_terminal), +pred_numbers(Pred_numbers), +%curr_cp_index(Curr_cp_index), +%trace, +%*pred_id(Pred_id), +%writeln([Pred_id,pred_id(Pred_id_a)]), +pred_id(Pred_id_a), +(true%pred_id(Pred_id) +->true;(writeln([Pred_id,pred_id(Pred_id_a)]),fail)), +(types(Types)->true;(writeln(types),fail)), +(types(on)-> +(typestatements(Typestatements), +modestatements(Modestatements));true), +%writeln(1), +session_number(Session_number), +%retractall(hidden(Hidden)), +%writeln(2), + +%assertz(hidden( +Hidden=Session_number, + +%writeln(3), + +(Dbw_read_string3=Dbw_read_string-> %text area +(Value1A=Value3,Variable1A=Variable3); +(Value1A=Value1,Variable1A=Variable1)), + +Hidden3=[Dbw_n,Dbw_read_string,Value1A,Variable1A,Line_number_b,Skip,lang(Lang), + +debug2(Debug2), +debug3(Debug3), +debug4(Debug4), +retry_back(Retry_back), +retry_back_stack(Retry_back_stack), +retry_back_stack_n(Retry_back_stack_n), +cumulative_or_current_text(Cumulative_or_current_text), +number_of_current_text(Number_of_current_text), +html_api_maker_or_terminal(Html_api_maker_or_terminal), +pred_numbers(Pred_numbers), +pred_id(Pred_id_a), +types(Types), 
+typestatements(Typestatements), +modestatements(Modestatements), + + + ssi1([Pred_id,Level,Predicate_number,A,"line",Query, + Vars1,All_predicate_numbers], _End_result3, Functions,Vars2, + Result1, Result2, + Globals3,Globals2, + Choice_point_trail1e, + Choice_point_trail3, + CP_Vars3,CP_Vars2), + + ssi1([C,"line",Query,Vars1])], + +%writeln(4), + +save_session(Session_number,Hidden3), + +%writeln(5), + + %print_text, + + %trace, + term_to_atom(Hidden,Hidden1), + + replace_new(Hidden1,"\"",""",Hidden2), + %replace_new(Hidden2,"\'","\'\'",Hidden3), + + %writeln1([Dbw_read_string=Dbw_read_string2]), + + %writeln1(["*1",Dbw_read_string=Dbw_read_string2]), + + + + (Dbw_read_string=Dbw_read_string2 % read_password + -> + Form_input="password"; + + Form_input="text"), + + + (Dbw_read_string=Dbw_read_string3 % text_area + -> + concat_list([""],CL1); + + concat_list([""],CL1)), + + + concat_list([" + +
+ + ",CL1,"

+

+ +
+"],Form_text1), + +/*'", + 'a', + %, + "' + */ + +atom_string(Form_text,Form_text1), + + %format(Hidden1,[]), + + format(Form_text,[]) + + + + ) + + ); + + (%trace, + (( + + (not((get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, + get_lang_word("read_string",Dbw_read_string1),get_lang_word("read_password",Dbw_read_string2),Dbw_read_string1=Dbw_read_string,Dbw_read_string2=Dbw_read_string, + %writeln1([line,Line]), + + Line=[[Dbw_n,Dbw_read_string],[Variable1]], + + html_api_maker_or_terminal(html) + ))), + + ((%trace, + AC=(-)) -> + + (%writeln1(interpretstatement2(ssi,Functions,Functions,Line,Vars1,Vars3,_Result21,_Cut,Vars2c)), + interpretstatement2(ssi,Functions,Functions,Line,Vars1,Vars3,_Result21,_Cut,Vars2c,Skip)); + (%trace, + interpretstatement2(ssi,Functions,Functions,Line,Vars1,Vars3,_Result21,_Cut,Vars2c,AC,Skip))) + ) + % choose certain commands from lpi for ssi, rest customised + + -> + + % if triggers end_function (meaning the end of an if-then clause), writes trace display and goes to "go after" line at start of function + + % end_function is given with line number to signal the end of the if-then statement etc. + % -2 line number for end of predicate, -3 for failed predicate - earlier + + % n.b. 
findall, maplist (x not in new shell, with different end_function) need other code expansion in sm - with call, new shell + % also type check in new shell, separate from alg + % iso commands need to be done like c + + % - do bagof, setof later + ( + + (var(Skip)->Globals3=Globals4; + ( + %writeln1([append,[skip,Pred_id,Line_number_b],Skip]), + append(Globals3,[[[skip,Pred_id,Line_number_b],Skip]],Globals4))), + + (%trace, + Vars2c=[]->(Choice_point_trail1e=Choice_point_trail11, + CP_Vars3=CP_Vars4); + (%trace, + append_cp(Choice_point_trail1e,[[Pred_id,Level,Predicate_number,Line_number_a,"line",_, + Vars3,Vars2c]],Choice_point_trail11,CP_Vars3,CP_Vars4))), + + ssi1([Pred_id,Level,Predicate_number,A,"line",Query, + Vars3,All_predicate_numbers], _End_result3, Functions,Vars2, + Result1, Result2, + Globals4,Globals2, + Choice_point_trail11, + Choice_point_trail3, + CP_Vars4,CP_Vars2) + ) + + ; + (%trace, + + ssi1([Pred_id,Level,Predicate_number,C,"line",Query, + Vars1,All_predicate_numbers], _End_result4, Functions,Vars2, + Result1, Result2, + Globals3,Globals2, + Choice_point_trail1e, + Choice_point_trail3, + CP_Vars3,CP_Vars2))))))))) )->true;(writeln0([ssi1,line,abort]), + %abort% + fail%number_string(a,_)%abort + )),!. + + + +% t or f +% return true or false result from pred + + +interpretstatement2(ssi,Functions,Functions,Line,Vars2,Vars3,Result21,Cut,_,Skip) :- + %false. + interpretstatement3(ssi,Functions,Functions,Line,Vars2,Vars3,Result21,Cut,_,Skip). + +interpretstatement2(ssi,Functions,Functions,Line,Vars2,Vars3,Result21,_Cut1,Vars2c,_Skip) :- + %writeln1(interpretstatement2(Functions,Functions,Line,Vars2,Vars3,Result21,_Cut1)), + %false.% + interpretstatement1(ssi,Functions,Functions,Line,Vars2,Vars3,Result21,_Cut,Vars2c). + +interpretstatement2(ssi,Functions,Functions,Line,Vars2,Vars3,Result21,Cut,_,AC,_Skip) :- +%trace, + interpretstatement4(ssi,Functions,Functions,Line,Vars2,Vars3,Result21,Cut,_,AC). 
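+
+% Clause-order sketch (comment only, summarising the clauses above):
+% interpretstatement2 first tries the SSI-specific command table via
+% interpretstatement3, then falls back to the generic LPI command
+% interpreter interpretstatement1; the arity-11 clause, which receives
+% the "appearance of command" marker AC on re-entry after backtracking,
+% resumes the command via interpretstatement4.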
+ + +delete_until_last_cp(Choice_point_trail1,Choice_point_trail2,D1,AC +%,Vars3 +,CP_Vars1,CP_Vars2) :- +(delete_until_last_cp0(Choice_point_trail1,Choice_point_trail2,D1,AC +%,Vars3 +,CP_Vars1,CP_Vars2)->true +%writeln1(delete_until_last_cp0(Choice_point_trail1,Choice_point_trail2,D1,AC +%,Vars3,CP_Vars1,CP_Vars2)) +; +(writeln1(delete_until_last_cp0(Choice_point_trail1,Choice_point_trail2,D1,AC +%,Vars3 +,CP_Vars1,CP_Vars2)),abort)),!. + +delete_until_last_cp0(Choice_point_trail1,Choice_point_trail2,D1,AC +%,Vars3 +,CP_Vars1,CP_Vars2) :- +%trace, +%writeln(here4), + + get_last_p_before_n(Choice_point_trail1, + [Pred_id,Level,Predicate_number,Line_number_a,"line",_, + Vars3,Vars2c], + D2,D,CP_Vars1,CP_Vars3), + + D=[Pred_id,Level,Predicate_number,Line_number_a,"line",_, + Vars3,Vars2c], + %trace, + D2=[Cp_a,Cp_b|D], + + %(Vars2c=[]->fail; + Vars2c=Vars2e, +((%trace, + Vars2e=[[Dbw_n,Dbw_member],[Value1,Value2],_,_,_,_%[Value1a,Value2a] + ,Vars2e1], + (Vars2e1=[]->fail; + (Vars2e1=[Vars2e2|Vars2e3], + %(findall(Vars2f1,(member([Vars2f1,_],Vars2e2)),Vars2f), + findall(AC1,(member(AC1,Vars2e2)),AC2), % *** x ac2=Vars2e2 + %trace, + Vars2e2=[Value4,_], + %findall(AC3,(member([_,AC3],Vars2e2)),AC4), + Vars2e2=[_,Value5a],%,Value2a + %|_], + + AC=[[Dbw_n,Dbw_member],[Value1,Value2],Value5a,Value4,Vars2e3])))->true; + + + + + (%trace, + Vars2e=[[Dbw_n,Dbw_member],[Value1,Value2,Value3],_,_,_,_%[Value1a,Value2a] + ,Vars2e1], + (Vars2e1=[]->fail; + (Vars2e1=[Vars2e2|Vars2e3], + %(findall(Vars2f1,(member([Vars2f1,_],Vars2e2)),Vars2f), + findall(AC1,(member(AC1,Vars2e2)),AC2), % *** x ac2=Vars2e2 + %trace, + Vars2e2=[Value4,_], + %findall(AC3,(member([_,AC3],Vars2e2)),AC4), + Vars2e2=[_,Value5a],%,Value2a + %|_], + + AC=[[Dbw_n,Dbw_member],[Value1,Value2,Value3],Value5a,Value4,Vars2e3])))), % Vars2e3->Vars2e1 + + + D10=[Pred_id,Level,Predicate_number,Line_number_a,"line",-, + AC2,%Vars2f, + AC], + + D1=[Cp_a,Cp_b|D10] + + + , + + 
%get_later_cps_than_cp(Choice_point_trail1,D2,C1), + %subtract(C1,D2,C2), + + %save_local_to_global_cp_trail(Choice_point_trail1,[],CP_Vars3,CP_Vars3b), + + %load_local_from_global_cp_trail(Pred_id,%Choice_point_trail1, +%Choice_point_trail2a,CP_Vars3b,CP_Vars3a), + Choice_point_trail1=Choice_point_trail2a, + CP_Vars3=CP_Vars3a, + +delete_cp(Choice_point_trail2a,D2,Choice_point_trail3,CP_Vars3a,CP_Vars4,_), + append_cp(Choice_point_trail3,[D10],Choice_point_trail2,CP_Vars4,CP_Vars2), + %set(curr_cp,Cp_a), %% * try without this + %reverse(C1,Choice_point_trail2), + %trace,writeln([D1,AC]), + !. + +% get earlier arg2=[cba,cpb|_] +% D1= no cpa cpb +% B=list cpa cpb of cps + +% cp since arg2= no cpa cpb +% arg3= cpa cpb_ +% E1= no cpa cpb + +cp_since_findall_start(Choice_point_trail1,_Level,D1,E1,D11,CP_Vars1,CP_Vars2) :- +%trace, + CP_Vars1=CP_Vars2, + %trace, + +reverse(Choice_point_trail1,Choice_point_trail14), + member_cut1([A1,A2,A_Pred_id,A_Level,A_Predicate_number,A_Line_number_a,"findall",A3|A4],Choice_point_trail14), + D11=[A_Pred_id,A_Level,A_Predicate_number,A_Line_number_a,"findall",A3|A4], + %writeln1(cp_since_findall_start(Choice_point_trail1,Level,D1,E1)), + get_later_cps_than_cp11(Choice_point_trail1, + [A1,A2,A_Pred_id,A_Level,A_Predicate_number,A_Line_number_a,"findall",A3|A4], + D1,B), + + reverse(B,B1), + + %D1=[_Pred_id,Level,_Predicate_number,_Line_number_a,"findall",-|_], + %member([_,_|D1],Choice_point_trail1), + + member([_A,_B2,C,D_Level + %2 + ,E_Predicate_number2,F_Line_number_a2,Pred_or_line,H,I,All_predicate_numbers2],B1), + + not(F_Line_number_a2=["returns to", _, "pred_id", _]), + +%not(var(H)), + E1=[C,D_Level + %2 + ,E_Predicate_number2,F_Line_number_a2,Pred_or_line,H,I,All_predicate_numbers2], + + /*cp_since_findall_start2( + %get_last_p_before_n( + B, + [_,Level + %2 + ,_Predicate_number2,_Line_number_a2,Pred_or_line,_,_,All_predicate_numbers2],_,E1,CP_Vars1,CP_Vars2), + */ + + + (Pred_or_line="line"-> + 
(All_predicate_numbers2=[_,_,_,_,_, + _,Vars2c],not(Vars2c=[])); + Pred_or_line="predicate"-> + not(All_predicate_numbers2=[]))%) + + . + + +cp_since_findall_start3(Choice_point_trail1,_Level,_D1,E1,_D11,CP_Vars1,CP_Vars2) :- +%trace, + CP_Vars1=CP_Vars2, + %trace, + +reverse(Choice_point_trail1,Choice_point_trail14), + member_cut1([A1,A2,A_Pred_id,A_Level,A_Predicate_number,A_Line_number_a,"findall",A3|A4],Choice_point_trail14), + D1=[A_Pred_id,A_Level,A_Predicate_number,A_Line_number_a,"findall",A3|A4], + %writeln1(cp_since_findall_start(Choice_point_trail1,Level,D1,E1)), + get_later_cps_than_cp11(Choice_point_trail1, + [A1,A2,A_Pred_id,A_Level,A_Predicate_number,A_Line_number_a,"findall",A3|A4], + D1,B), + + reverse(B,B1), + + %D1=[_Pred_id,Level,_Predicate_number,_Line_number_a,"findall",-|_], + %member([_,_|D1],Choice_point_trail1), + +%[5,6,3,3,[3,"prev_pred_id",2],-1,"predicate",_12340,_12346,[[4,"prev_pred_id",2],[5,"prev_pred_id",2],[6,"prev_pred_id",2]]] + member([A,B2,C,D_Level + %2 + ,[P1,"prev_pred_id",P2],-1,"predicate",H,I,All_predicate_numbers2],B1), + + %not(F_Line_number_a2=["returns to", _, "pred_id", _]), + +%not(var(H)), + E1=[A,B2,C,D_Level + %2 + ,[P1,"prev_pred_id",P2],-1,"predicate",H,I,All_predicate_numbers2] + + /*cp_since_findall_start2( + %get_last_p_before_n( + B, + [_,Level + %2 + ,_Predicate_number2,_Line_number_a2,Pred_or_line,_,_,All_predicate_numbers2],_,E1,CP_Vars1,CP_Vars2), + */ + + +/* +(Pred_or_line="line"-> + (All_predicate_numbers2=[_,_,_,_,_, + _,Vars2c],not(Vars2c=[])); + Pred_or_line="predicate"-> + not(All_predicate_numbers2=[]))%) +*/ + . + member_cut1([A1,A2,A_Pred_id,A_Level,A_Predicate_number,A_Line_number_a,A5,A3|A4],Choice_point_trail14) :- + member([A1,A2,A_Pred_id,A_Level,A_Predicate_number,A_Line_number_a,A5,A3|A4],Choice_point_trail14),!. 
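+
+/* member_cut1/2 cuts after its first solution, so it returns only the
+   most recent "findall" frame in the reversed trail. Illustrative query
+   (made-up trail entries, not from a real run):
+
+   ?- member_cut1([A1,A2,Pid,Lvl,Pn,Ln,"findall",A3|A4],
+          [[1,2,p1,0,1,3,"findall",[],[]],
+           [2,3,p2,0,1,4,"findall",[],[]]]).
+   A1 = 1, A2 = 2, Pid = p1 (first match only).
+*/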
+ + +% return until last non end function, go to line after +% return until last non end function, go to line for false + +% return line of last non end function +% returns go to predicates + +% bc + +return_to_last_non_end_function(E1,Lines,End_line4,[Dbw_on_true,A1],[Dbw_go_after,B1],[Dbw_on_false,C1],[Dbw_go_to_predicates,D1],Line1,_Globals,_,_,FA,E2,End_line42) :- + +%writeln1([*,return_to_last_non_end_function(E1,Lines,End_line4,[Dbw_on_true,A1],[Dbw_go_after,B1],[Dbw_on_false,C1],[Dbw_go_to_predicates,D1],Line1,_Globals,_,_,FA)]), + +get_lang_word("on_true",Dbw_on_true1),Dbw_on_true1=Dbw_on_true, +get_lang_word("go_after",Dbw_go_after1),Dbw_go_after1=Dbw_go_after, +get_lang_word("on_false",Dbw_on_false1),Dbw_on_false1=Dbw_on_false, +get_lang_word("go_to_predicates",Dbw_go_to_predicates1),Dbw_go_to_predicates1=Dbw_go_to_predicates, + +get_lang_word("exit_function",Dbw_exit_function1),Dbw_exit_function1=Dbw_exit_function, +get_lang_word("findall_exit_function",Dbw_findall_exit_function1),Dbw_findall_exit_function1=Dbw_findall_exit_function, +get_lang_word("findall_fail_function",Dbw_findall_fail_function1),Dbw_findall_fail_function1=Dbw_findall_fail_function, +get_lang_word("fail_function",Dbw_fail_function1),Dbw_fail_function1=Dbw_fail_function, + + + % if -2 or -3, fill A-Line + ((E1= -1->true;(E1= -2->true; E1= -3))-> + (End_line4=E1,A1=(-),B1=(-),C1=(-),D1=(-),Line1=(-)); + ((E1=[Dbw_exit_function,_]->true;E1=[Dbw_fail_function,_])-> + fail; + (E1=[Dbw_findall_exit_function,_]-> + (%go_after(E1,Lines,End_line4) + End_line4=E1,A1=(-),B1=(-),C1=(-),D1=(-),Line1=(-),FA=exit); + (E1=[Dbw_findall_fail_function,E]-> + (((number(E2),E2= -3)->(member([E,[Dbw_on_true,A11],[Dbw_go_after,B11],[Dbw_on_false,C11],[Dbw_go_to_predicates,D11]|Line11],Lines), + +((%trace, +not(B11= -1),not(B11= -2),not(B11= -3),number(B11)%,trace +)-> (member([B11,[Dbw_on_true,A1],[Dbw_go_after,B1],[Dbw_on_false,C1],[Dbw_go_to_predicates,D1]|Line1],Lines),End_line42=B11); 
+(A11=A1,B11=B1,C11=C1,D11=D1,Line11=Line1,End_line42=_)), + + FA=fail); + (member([E,[Dbw_on_true,A1],[Dbw_go_after,B1],[Dbw_on_false,C1],[Dbw_go_to_predicates,D1]|Line1],Lines), + %member([B11,[Dbw_on_true,A1],[Dbw_go_after,B1],[Dbw_on_false,C1],[Dbw_go_to_predicates,D1]|Line1],Lines),%End_line4=B11, + FA=fail))); + %End_line4=E1,A1=(-),B1=(-),C1=(-),D1=(-),Line1=(-),FA=fail); + (find_line_number(E1,E),End_line4=E, + member([E,[Dbw_on_true,A1],[Dbw_go_after,B1],[Dbw_on_false,C1],[Dbw_go_to_predicates,D1]|Line1],Lines)))))). + +return_to_last_non_end_function(E1,Lines,End_line4,[Dbw_on_true,A1],[Dbw_go_after,B1],[Dbw_on_false,C1],[Dbw_go_to_predicates,D1],Line1,Globals,Pred_id,Line_number_a,FA,E2,End_line42) :- +%trace, +%writeln1([lines,Lines]), +%writeln1(return_to_last_non_end_function(E1,Lines,End_line4,["on true",A1],["go after",B1],["on false",C1],["go to predicates",D1],Line1)), +%trace, +%writeln(E1), + find_line_number(E1,E), + %trace, + %writeln1([globals,Globals]), + find_line_number(Line_number_a,Line_number_a1), + (leash1(on)->Skip=false;member([[skip,Pred_id,Line_number_a1],Skip],Globals)), + + get_lang_word("n",Dbw_n), + get_lang_word("not",Dbw_not), + +get_lang_word("on_true",Dbw_on_true1),Dbw_on_true1=Dbw_on_true, +get_lang_word("go_after",Dbw_go_after1),Dbw_go_after1=Dbw_go_after, +get_lang_word("on_false",Dbw_on_false1),Dbw_on_false1=Dbw_on_false, +get_lang_word("go_to_predicates",Dbw_go_to_predicates1),Dbw_go_to_predicates1=Dbw_go_to_predicates, + +get_lang_word("exit_function",Dbw_exit_function1),Dbw_exit_function1=Dbw_exit_function, +get_lang_word("fail_function",Dbw_fail_function1),Dbw_fail_function1=Dbw_fail_function, + member([E,[Dbw_on_true,_A],[Dbw_go_after,B],[Dbw_on_false,C],[Dbw_go_to_predicates,_D]|Line],Lines), + (E1=[Dbw_exit_function,_]-> + (F=B,E21=E2,(Line=[[Dbw_n,Dbw_not]]-> + (debug_fail(Skip,Line)->true;true);true%debug_exit(Skip,Line) + )); + (%trace, + E1=[Dbw_fail_function,_]-> + (%trace, + F=C,E21= 
-3,(Line=[[Dbw_n,Dbw_not]]-> + debug_exit(Skip,Line) + ;((Line=[[Dbw_n,Dbw_not]], + debug_fail(Skip,Line))->true;true))))), + + return_to_last_non_end_function(F,Lines,End_line4,[Dbw_on_true,A1],[Dbw_go_after,B1],[Dbw_on_false,C1],[Dbw_go_to_predicates,D1],Line1,Globals,Pred_id,Line_number_a,FA,E21,End_line42). + + +find_line_number(Line_number1,Line_number2) :- + +get_lang_word("exit_function",Dbw_exit_function1),Dbw_exit_function1=Dbw_exit_function, +get_lang_word("fail_function",Dbw_fail_function1),Dbw_fail_function1=Dbw_fail_function, + +((Line_number1=[Dbw_exit_function,Line_number3]->true; + Line_number1=[Dbw_fail_function,Line_number3])-> + Line_number2=Line_number3; Line_number2=Line_number1),! + . + +find_pred_id(N2) :- + pred_id(N1), + N2 is N1+1, + retractall(pred_id(_)), + assertz(pred_id(N2)). + + +append_cp(List1,CP,List5a,CP_Vars1,CP_Vars2) :- + %writeln1(append_cp(List1,CP,List5a,CP_Vars1,CP_Vars2)), + %trace, + %(writeln("y to trace"),(get_char(y)->trace;true)), + %trace, + get(curr_cp,Curr_cp,CP_Vars1),%writeln([curr_cp,Curr_cp]), + (debug4(on)->writeln1(append_cp(List1,CP,List5a,CP_Vars1,CP_Vars2));true), + (append_cp1(List1,CP,List5a,CP_Vars1,CP_Vars2)->true;(writeln0([append_cp,abort]),abort)), + %writeln1(append_cp(List1,CP,List5a)), + get(curr_cp,Curr_cp1,CP_Vars2),%writeln([curr_cp,Curr_cp1]), + (debug4(on)->writeln0([append_cp,curr_cp,Curr_cp,Curr_cp1,List5a,CP_Vars1,CP_Vars2]);true). + %notrace. 
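+
+% The choice point trail is kept as a chain of edges [From,To|CP], where
+% From and To are integer indices and CP is the saved state. append_cp1
+% links the new CP into the chain at the current position (curr_cp), then
+% renumbers the indices consecutively. E.g. appending a CP to an empty
+% trail takes the first branch of append_cp1 and yields [[1,2|CP]] with
+% min_cp=1, max_cp=2 and curr_cp=2.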
+ +append_cp1(List1,CP,List5a,CP_Vars1,CP_Vars2a) :- + CP=[CP2], + + get(curr_cp,N,CP_Vars1), + get(curr_cp_index,Curr_cp_index1,CP_Vars1), + Curr_cp_index2 is Curr_cp_index1+1, + + ( + ((List1=[], + %curr_cp_index3 is curr_cp_index2+1, + List5=[[1,2|CP2]]), + set(curr_cp,2,CP_Vars1,CP_Vars3), + set(curr_cp_index,2,CP_Vars3,CP_Vars4), + set(min_cp,1,CP_Vars4,CP_Vars5), + set(max_cp,2,CP_Vars5,CP_Vars2) +%curr_cp_index2=curr_cp_index3 + )->true; + + ((member([N,_A|_CP21],List1),not(member([_F,N|_],List1)), + append([[Curr_cp_index2,N|CP2]],List1,List5), + set(curr_cp,N,CP_Vars1,CP_Vars3), + set(curr_cp_index,Curr_cp_index2,CP_Vars3,CP_Vars4), + set(min_cp,Curr_cp_index2,CP_Vars4,CP_Vars2) + )->true; + + ((member([_A,N|_CP21],List1),not(member([N,_F|_],List1)), + %curr_cp_index3 is curr_cp_index2+1, + append(List1,[[N,Curr_cp_index2|CP2]],List5), + set(curr_cp,Curr_cp_index2,CP_Vars1,CP_Vars3), + set(curr_cp_index,Curr_cp_index2,CP_Vars3,CP_Vars4), + set(max_cp,Curr_cp_index2,CP_Vars4,CP_Vars2) + %,(curr_cp_index2=24->writeln([curr_cp_index2,curr_cp_index2]);true) + )->true; + + ( + %curr_cp_index3 is curr_cp_index2+1, + member([D,N|CP4],List1), + member([N,_B|_CP21],List1), + delete(List1,[D,N|CP4],List4), + append(List4,[[D,Curr_cp_index2|CP4]],List2), + append(List2,[[Curr_cp_index2,N|CP2]],List5), + + set(curr_cp,N,CP_Vars1,CP_Vars3), + set(curr_cp_index,Curr_cp_index2,CP_Vars3,CP_Vars2) + + )))), + + renumber_cps(List5,List5a,CP_Vars2,CP_Vars2a,_), + %sort(List5,List5a), + + !. + +renumber_cps(List1,List2,CP_Vars1,CP_Vars2,Swaps) :- + get(min_cp,Min_cp,CP_Vars1), + get(max_cp,Max_cp,CP_Vars1), % max_cp x curr_cp_index + get(curr_cp,Curr_cp31,CP_Vars1), +renumber_cps1(List1,Min_cp,Max_cp,[],List2,1,Max_cp2,Curr_cp31,Curr_cp3,[],Swaps), + set(min_cp,1,CP_Vars1,CP_Vars3), + set(max_cp,Max_cp2,CP_Vars3,CP_Vars4), + set(curr_cp,Curr_cp3,CP_Vars4,CP_Vars5), + set(curr_cp_index,Max_cp2,CP_Vars5,CP_Vars2). 
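+
+/* renumber_cps/5 walks the edge chain from min_cp to max_cp and renumbers
+   the indices consecutively from 1, returning old->new pairs in Swaps and
+   updating curr_cp to its renumbered value. Illustrative chain:
+
+   a trail [[5,9,a],[9,12,b]] with min_cp=5, max_cp=12 is renumbered to
+   [[1,2,a],[2,3,b]] with min_cp=1, max_cp=3.
+*/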
+ + +renumber_cps1([[A,B|C]],A,B,List1,List2,Curr_cp,Curr_cp2,Curr_cp31,Curr_cp3,Swaps1,Swaps2) :- + Curr_cp2 is Curr_cp+1, + append(List1,[[Curr_cp,Curr_cp2|C]],List2), + append(Swaps1,[[[A,B|C],[Curr_cp,Curr_cp2|C]]],Swaps2), + (A=Curr_cp31-> Curr_cp3=Curr_cp;true), + (B=Curr_cp31-> Curr_cp3=Curr_cp2;true) ,!. +renumber_cps1(List1,Min_cp,Max_cp,List1a,List2,Curr_cp,Max_cp2,Curr_cp31,Curr_cp3,Swaps1,Swaps2) :- + member([Min_cp,B|C],List1), + delete(List1,[Min_cp,B|C],List3), + Curr_cp2 is Curr_cp+1, + append(List1a,[[Curr_cp,Curr_cp2|C]],List4), + append(Swaps1,[[[Min_cp,B|C],[Curr_cp,Curr_cp2|C]]],Swaps3), + (Min_cp=Curr_cp31-> Curr_cp3=Curr_cp;true), + (B=Curr_cp31-> Curr_cp3=Curr_cp2;true), renumber_cps1(List3,B,Max_cp,List4,List2, + Curr_cp2,Max_cp2,Curr_cp31,Curr_cp3,Swaps3,Swaps2),!. + + + +get_cp(List1,N,Cp) :- + (member([N,B|Cp1],List1)->true;(writeln0("get_cp failed"),abort)), + Cp=[N,B|Cp1]. + + +get_last_p_before_n(List1,Cp1,Cp2,Cp3,CP_Vars1,CP_Vars1) :- + + %writeln1("y for trace:"),(get_char(y)->trace;true), + %get(curr_cp,Curr_cp,CP_Vars1),%writeln([curr_cp,Curr_cp]), + (debug4(on)->writeln1(get_last_p_before_n2(List1,Cp1,Cp2,Cp3,CP_Vars1,CP_Vars1));true), + (get_last_p_before_n2(List1,Cp1,Cp2,Cp3,CP_Vars1,CP_Vars1)->true;false%(writeln([get_last_p_before_n2,abort]),abort)), + %writeln1(get_last_p_before_n2(List1,Cp1,Cp2,Cp3) + ), + %get(curr_cp,Curr_cp1,CP_Vars1),%writeln([curr_cp,Curr_cp1]), + (debug4(on)->writeln0([get_last_p_before_n,Cp2,CP_Vars1,CP_Vars1]);true). + %notrace. 
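+
+% get_cp/3 looks up the edge that starts at index N, e.g.
+% ?- get_cp([[1,2,a],[2,3,b]],2,Cp).
+% Cp = [2,3,b].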
+ +get_last_p_before_n2(List1,Cp1,Cp2,Cp3,CP_Vars1,CP_Vars1) :- + + reverse(List1,B1), + +/* + member([global_cp_trail,Global_cp_trail1],CP_Vars1), + +append(Global_cp_trail1,[[_,_,List1]],Global_cp_trail1a), + reverse(Global_cp_trail1a%List1 + ,B1), + */ + %D1=[_Pred_id,Level,_Predicate_number,_Line_number_a,"findall",-|_], + %member([_,_|D1],Choice_point_trail1), + + Cp1=[%A,B2, + _CX,D_Level + %2 + ,E_Predicate_number2,F_Line_number_a2,Pred_or_line,H,I|All_predicate_numbers2], + + %member([_,_,B11],B1), + %reverse(B11,B12), + + member([A,B2,C,D_Level + %2 + ,E_Predicate_number2,F_Line_number_a2,Pred_or_line,H,I|All_predicate_numbers2],B1),%B12), + + Cp2=[A,B2,C,D_Level + %2 + ,E_Predicate_number2,F_Line_number_a2,Pred_or_line,H,I|All_predicate_numbers2], + + /*cp_since_findall_start2( + %get_last_p_before_n( + B, + [_,Level + %2 + ,_Predicate_number2,_Line_number_a2,Pred_or_line,_,_,All_predicate_numbers2],_,E1,CP_Vars1,CP_Vars2), + */ + + /* + (Pred_or_line="line"-> + (All_predicate_numbers2=[_,_,_,_,_, + _,Vars2c],not(Vars2c=[])); + Pred_or_line="predicate"-> + not(All_predicate_numbers2=[])), + */ + + Cp2=[_,_|Cp3] +%) + /* +%trace, +%writeln1([list1,List1]), +%trace, + get(curr_cp,N,CP_Vars1), + get_last_p_before_n1(List1,Cp1,N,Cp2), + Cp2=[_,_|Cp3] + */ + . + +get_last_cp_before_n(List1,Cp1,Cp2,Cp3,CP_Vars1,CP_Vars1) :- + + %writeln1("y for trace:"),(get_char(y)->trace;true), + %get(curr_cp,Curr_cp,CP_Vars1),%writeln([curr_cp,Curr_cp]), + (debug4(on)->writeln1(get_last_cp_before_n2(List1,Cp1,Cp2,Cp3,CP_Vars1,CP_Vars1));true), + (get_last_cp_before_n2(List1,Cp1,Cp2,Cp3,CP_Vars1,CP_Vars1)->true;false%(writeln([get_last_p_before_n2,abort]),abort)), + %writeln1(get_last_p_before_n2(List1,Cp1,Cp2,Cp3) + ), + %get(curr_cp,Curr_cp1,CP_Vars1),%writeln([curr_cp,Curr_cp1]), + (debug4(on)->writeln0([get_last_cp_before_n,Cp2,CP_Vars1,CP_Vars1]);true). + %notrace. 
+ +get_last_cp_before_n2(List1,Cp1,Cp2,Cp3,CP_Vars1,CP_Vars1) :- +/* + member([global_cp_trail,Global_cp_trail1],CP_Vars1), + +append(Global_cp_trail1,[[_,_,List1]],Global_cp_trail1a), + + reverse(Global_cp_trail1a%List1 + ,B1), +*/ + reverse(List1,B1), + %D1=[_Pred_id,Level,_Predicate_number,_Line_number_a,"findall",-|_], + %member([_,_|D1],Choice_point_trail1), + + Cp1=[%A,B2, + C,D_Level + %2 + ,E_Predicate_number2,F_Line_number_a2,Pred_or_line,H,I,All_predicate_numbers2], + + %not(F_Line_number_a2=["returns to", _, "pred_id", _]), + + %member([_,_,B11],B1), + %reverse(B11,B12), + + member([A,B2,C,D_Level + %2 + ,E_Predicate_number2,F_Line_number_a2,Pred_or_line,H,I,All_predicate_numbers2],B1),%B12), + + Cp2=[A,B2,C,D_Level + %2 + ,E_Predicate_number2,F_Line_number_a2,Pred_or_line,H,I,All_predicate_numbers2], + + /*cp_since_findall_start2( + %get_last_p_before_n( + B, + [_,Level + %2 + ,_Predicate_number2,_Line_number_a2,Pred_or_line,_,_,All_predicate_numbers2],_,E1,CP_Vars1,CP_Vars2), + */ + + + (Pred_or_line="line"-> + (All_predicate_numbers2=[_,_,_,_,_, + _,Vars2c],not(Vars2c=[])); + Pred_or_line="predicate"-> + not(All_predicate_numbers2=[])), + + Cp2=[_,_|Cp3] +%) + /* +%trace, +%writeln1([list1,List1]), +%trace, + get(curr_cp,N,CP_Vars1), + get_last_p_before_n1(List1,Cp1,N,Cp2), + Cp2=[_,_|Cp3] + */ + . + +/* +get_last_p_before_n1(List1,_Cp1,N,_Cp2) :- + (not(member([N,_|_],List1)),fail). +get_last_p_before_n1(List1,Cp1,N,Cp2) :- + member([B,N|Cp1],List1), + %((Cp1=[_,_,_,["returns to",_]|_])-> + %get_last_p_before_n1(List1,Cp1,B,Cp2); + Cp2=[B,N|Cp1]. +get_last_p_before_n1(List1,Cp1,N,Cp2) :- +%trace,writeln1(List1), + member([N,B|Cp1],List1), + %((Cp1=[_,_,_,["returns to",_]|_])-> + %fail%get_last_p_before_n1(List1,Cp1,B,Cp2) + %; + Cp2=[N,B|Cp1],!. +get_last_p_before_n1(List1,Cp1,N,Cp2) :- + member([B,N|Cp3],List1), + not(Cp1=Cp3), + get_last_p_before_n1(List1,Cp1,B,Cp2). +*/ +/* + +[debug] ?- get_later_cps_than_cp([[1,2,a],[2,3,b],[3,4,c]],[2,3,b],C). 
C = [[2, 3, b], [1, 2, a]]. + +[debug] ?- get_later_cps_than_cp([[1,2,a],[2,3,b],[3,4,c]],[1,2,a],C). +C = [[1, 2, a]]. + +[debug] ?- get_later_cps_than_cp([[1,2,a],[2,3,b],[3,4,c]],[3,4,c],C). +C = [[3, 4, c], [2, 3, b], [1, 2, a]]. + +*/ +/* +get_later_cps_than_cp(List1,Cp1,Cps) :- + %curr_cp(N), + Cp1=[_Cp_b,Cp_a|_Cp], + get_later_cps_than_cp(List1,Cp_a,[],Cps1), + sort(Cps1,Cps). + %Cp2=[_,_|Cp3]. + +get_later_cps_than_cp(List1,Cp_a,Cps,Cps) :- + not(member([Cp_a,_|_],List1)),!. +get_later_cps_than_cp(List1,Cp_a,Cps1,Cps2) :- + member([Cp_a,Cp_b|Cp3],List1), + append(Cps1,[[Cp_a,Cp_b|Cp3]],Cps3), + get_later_cps_than_cp(List1,Cp_b,Cps3,Cps2),!. +*/ +set(A,B,CP_Vars1,CP_Vars2) :- + delete(CP_Vars1,[A,_],CP_Vars3), + append(CP_Vars3,[[A,B]],CP_Vars2),!. + +get(A,B,CP_Vars1) :- + member([A,B],CP_Vars1),!. + +/* + +[debug] ?- trace, delete_cp([[1,2,a],[2,3,b],[3,4,c]],[1,2,a],A, [[curr_cp_index,4],[min_cp,1],[max_cp,4],[curr_cp,3]],V). +A = [[1, 2, b], [2, 3, c]], +V = [[min_cp, 1], [max_cp, 3], [curr_cp, 1], [curr_cp_index, 3]]. + + +[debug] ?- trace, delete_cp([[1,2,a],[2,3,b],[3,4,c]],[2,3,b],A, [[curr_cp_index,4],[min_cp,1],[max_cp,4],[curr_cp,3]],V). +A = [[1, 2, a], [2, 3, c]], +V = [[min_cp, 1], [max_cp, 3], [curr_cp, 2], [curr_cp_index, 3]]. + + +[trace] ?- trace, delete_cp([[1,2,a],[2,3,b],[3,4,c]],[3,4,c],A, [[curr_cp_index,4],[min_cp,1],[max_cp,4],[curr_cp,3]],V). +A = [[1, 2, a], [2, 3, b]], +V = [[min_cp, 1], [max_cp, 3], [curr_cp, 3], [curr_cp_index, 3]]. 
+ +*/ + +delete_cp(List1,CP,List5,CP_Vars1,CP_Vars2,Swaps) :- + %writeln1(delete_cp1(List1,CP,List5,CP_Vars1,CP_Vars2,Swaps)), + %trace, + get(curr_cp,Curr_cp,CP_Vars1),%writeln([curr_cp,Curr_cp]), + %trace, + (debug4(on)->writeln0([delete_cp,curr_cp,Curr_cp,CP,List1,List5,CP_Vars1,CP_Vars2,Swaps]);true), + +(delete_cp1(List1,CP,List5,CP_Vars1,CP_Vars2,Swaps)->true;(writeln0([delete_cp1,abort]),abort)), + %writeln1(delete_cp1(List1,CP,List5)), + get(curr_cp,Curr_cp1,CP_Vars1),%writeln([curr_cp,Curr_cp1]), + (debug4(on)->writeln0([delete_cp,curr_cp,Curr_cp,Curr_cp1,CP,List1,List5,CP_Vars1,CP_Vars2,Swaps]);true). + %notrace. + +delete_cp1(List1,CP,List5a,CP_Vars1,CP_Vars2a,Swaps) :- + CP=[A,B|CP2], + + %writeln1([delete_cp1,[A,B|CP2]]), + ( + (List1=[])->true; + + (List1=[[A,B|CP2]], + List5=[], + set(curr_cp,0,CP_Vars1,CP_Vars3), + set(curr_cp_index,0,CP_Vars3,CP_Vars4), + set(min_cp,0,CP_Vars4,CP_Vars5), + set(max_cp,0,CP_Vars5,CP_Vars2) + )->true; + + (member([A,B|CP2],List1), + not(member([_F,A|_],List1)), + delete(List1,[A,B|CP2],List5), + member([B,C|_CP3],List5), + %(B=23->(trace,writeln(here1));true), + %(curr_cp(A)->set(curr_cp,B);true) + set(curr_cp,C,CP_Vars1,CP_Vars3), + set(min_cp,B,CP_Vars3,CP_Vars2) + )->true; + + (%trace, + member([A,B|CP2],List1), + not(member([B,_F|_],List1)), + delete(List1,[A,B|CP2],List5), + %member([C,A|_CP3],List5), + %(A=23->(trace + %,writeln(here2));true), + %(curr_cp(B)->set(curr_cp,A);true) + set(curr_cp,A,CP_Vars1,CP_Vars3), + set(max_cp,A,CP_Vars3,CP_Vars2) + %,notrace + )->true; + + ( + member([D,A|CP4],List1), + delete(List1,[D,A|CP4],List2), + member([B,_C|_CP3],List2), + member([A,B|CP2],List2), + delete(List2,[A,B|CP2],List4), + append(List4,[[D,B|CP4]],List5), + + %(B=23->(trace,writeln(here3));true), + + %(curr_cp(A)->set(curr_cp,B);true) + set(curr_cp,B,CP_Vars1,CP_Vars2) + + )), + + %trace, + get(max_cp,Max_cp,CP_Vars2),%writeln([curr_cp,Curr_cp]), + set(curr_cp,Max_cp,CP_Vars2,CP_Vars2b), + 
renumber_cps(List5,List5a,CP_Vars2b,CP_Vars2a,Swaps) + ,!. + + +replace_cp(Choice_point_trail1e,Cp_a1,Cp_a2,D1,D2,Choice_point_trail1b,CP_Vars1,CP_Vars2) :- + +%trace, +(member([Cp_a1,Cp_a2|D1],Choice_point_trail1e)->true; +(writeln0("replace_cp abort"),abort)), + delete(Choice_point_trail1e,[Cp_a1,Cp_a2|D1],Choice_point_trail1f), + append(Choice_point_trail1f,[[Cp_a1,Cp_a2|D2]],Choice_point_trail1g), + +%writeln1(["*5", append(Choice_point_trail1f,[[Cp_a1,Cp_a2|D2]],Choice_point_trail1g)]), + + set(curr_cp,Cp_a1,CP_Vars1,CP_Vars2), + sort(Choice_point_trail1g,Choice_point_trail1b). + +exit_findall_line(Pred_id1,Globals1,Predicate_number,Line_number_b,Functions,Line_number_c) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("true",Dbw_true1),Dbw_true1=Dbw_true, +get_lang_word("findall_exit_function",Dbw_findall_exit_function1),Dbw_findall_exit_function1=Dbw_findall_exit_function, +get_lang_word("exit_function",Dbw_exit_function1),Dbw_exit_function1=Dbw_exit_function, + +%trace, + + member([Predicate_number,_F|Rest],Functions), +(Rest=[_Args,":-",Lines]->true; +(Rest=[_Args]->Lines=[[[Dbw_n,Dbw_true]]]; +(Rest=[":-",Lines]; +(Rest=[],Lines=[[[Dbw_n,Dbw_true]]])))), + +get_lang_word("on_true",Dbw_on_true1),Dbw_on_true1=Dbw_on_true, +get_lang_word("go_after",Dbw_go_after1),Dbw_go_after1=Dbw_go_after, + +((Line_number_b=[Dbw_findall_exit_function,C]->true; +Line_number_b=[Dbw_exit_function,C])-> +(member([C,[Dbw_on_true,_A],[Dbw_go_after,B]|_Line],Lines), +exit_findall_line(Pred_id1,Globals1,Predicate_number,B,Functions,Line_number_c)); + + %find_line_number(Line_number_c,E), + %member([[skip,Pred_id,E],Skip],Globals), + + +%debug_exit(Skip,Line), + +(Line_number_c=Line_number_b%trace,member([Line_number_b,["on true",_A],["go after",Line_number_c]|_],Lines) +)). 
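+
+% replace_cp/8 substitutes the payload of the edge Cp_a1-Cp_a2 (D1 -> D2)
+% while keeping its endpoints, and exit_findall_line/6 follows chains of
+% findall_exit_function / exit_function markers through each clause's
+% "go_after" links until it reaches a concrete line number (Line_number_c).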
+ +%/* +collect_connected_pred_ids(Pred_id,Pred_ids1,Pred_ids2,Predicate_number,Globals3) :- + +collect_connected_pred_ids1(Pred_id,Pred_ids1,Pred_ids3,Predicate_number,Globals3), + +%writeln1([*,Pred_ids3]), +Pred_ids3=Pred_ids2, +%findall(Pred_ids6,(member(Pred_id5,Pred_ids3), +%collect_connected_pred_ids2(Pred_id5,[Pred_id5],Pred_ids6,Globals3)), +%Pred_ids2), +!. + +collect_connected_pred_ids1(Pred_id,Pred_ids1,Pred_ids2,Predicate_number,Globals3) :- + + ((member([pred_id_chain,Prev_pred_id,Pred_id],Globals3), + not(member(Prev_pred_id,Pred_ids1)), + member([[pred_num,Prev_pred_id],Predicate_number],Globals3))-> + (%trace, + append(Pred_ids1,[Prev_pred_id],Pred_ids3), + %findall(Pred_ids4, collect_connected_pred_ids2(Prev_pred_id,Pred_ids3,Pred_ids4,Globals3),Pred_ids5), + Pred_ids3=Pred_ids5, + collect_connected_pred_ids1(Prev_pred_id,Pred_ids5,Pred_ids2,Predicate_number,Globals3)); + Pred_ids1=Pred_ids2). + + + /*collect_connected_pred_ids0(Pred_id,Pred_ids1,Pred_ids2,Predicate_number,Globals3) :- + +collect_connected_pred_ids01(Pred_id,Pred_ids1,Pred_ids3,Predicate_number,Globals3), + +%writeln1([*,Pred_ids3]), +Pred_ids3=Pred_ids2, +%findall(Pred_ids6,(member(Pred_id5,Pred_ids3), +%collect_connected_pred_ids2(Pred_id5,[Pred_id5],Pred_ids6,Globals3)), +%Pred_ids2), +!. + +collect_connected_pred_ids01(Pred_id,Pred_ids1,Pred_ids2,Predicate_number,Globals3) :- + + ((%member([pred_id_chain,Prev_pred_id,Pred_id],Globals3), + Prev_pred_id=Pred_id, + not(member(Prev_pred_id,Pred_ids1)), + member([[pred_num,Prev_pred_id],Predicate_number],Globals3))-> + (%trace, + append(Pred_ids1,[Prev_pred_id],Pred_ids3), + %findall(Pred_ids4, collect_connected_pred_ids2(Prev_pred_id,Pred_ids3,Pred_ids4,Globals3),Pred_ids5), + Pred_ids3=Pred_ids5, + collect_connected_pred_ids01(Prev_pred_id,Pred_ids5,Pred_ids2,Predicate_number,Globals3)); + Pred_ids1=Pred_ids2). 
+ +collect_connected_pred_ids01(Pred_id,Pred_ids1,Pred_ids2,Predicate_number,Globals3) :- + + ((member([pred_id_chain,Prev_pred_id,Pred_id],Globals3), + not(member(Prev_pred_id,Pred_ids1)), + member([[pred_num,Prev_pred_id],Predicate_number],Globals3))-> + (%trace, + append(Pred_ids1,[Prev_pred_id],Pred_ids3), + %findall(Pred_ids4, collect_connected_pred_ids2(Prev_pred_id,Pred_ids3,Pred_ids4,Globals3),Pred_ids5), + Pred_ids3=Pred_ids5, + collect_connected_pred_ids01(Prev_pred_id,Pred_ids5,Pred_ids2,Predicate_number,Globals3)); + Pred_ids1=Pred_ids2). + */ +%collect_connected_pred_ids2(Pred_id,Pred_ids1,Pred_ids1,_Globals3) :- + + %not(member([pred_id_chain,Pred_id,Next_pred_id],Globals3)). +/* +collect_connected_pred_ids2(Pred_id,Pred_ids1,Pred_ids2,Globals3) :- + + (findall([%Next_pred_id, + Next_pred_id1],(member([pred_id_chain,Pred_id,Next_pred_id],Globals3)),Next_pred_id1), + member(Next_pred_id2,Next_pred_id), + collect_connected_pred_ids2(Next_pred_id2,[Next_pred_id],Pred_ids2,Globals3)).%),Pred_ids2)). + */ + + collect_connected_pred_ids2(Pred_id,Pred_ids1,Pred_ids2,Globals3) :- + ((findall(Next_pred_id,member([pred_id_chain,Pred_id,Next_pred_id],Globals3),Next_pred_id), + %member(Next_pred_id,Next_pred_id1), + %not(member(Next_pred_id,Pred_ids1)) + subtract(Next_pred_id,Pred_ids1,Next_pred_id1) + %member([[pred_num,Prev_pred_id],Predicate_number],Globals3) + )-> + (%trace, + append(Pred_ids1,Next_pred_id1,Pred_ids3), + %findall(Pred_ids4, collect_connected_pred_ids2(Prev_pred_id,Pred_ids3,Pred_ids4,Globals3),Pred_ids5), + Pred_ids3=Pred_ids5, + findall(Pred_ids6,(member(Next_pred_id2,Next_pred_id1),collect_connected_pred_ids2(Next_pred_id2,Pred_ids5,Pred_ids6,Globals3)),Pred_ids7), + append([Pred_id,Pred_ids7],Next_pred_id1,Pred_ids2)); + Pred_ids1=Pred_ids2). 
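+
+% collect_connected_pred_ids1/5 walks the [pred_id_chain,Prev,Curr] links
+% in Globals backwards from Pred_id, collecting predecessors whose
+% [pred_num,Prev] entry matches Predicate_number;
+% collect_connected_pred_ids2/4 walks the same links forwards,
+% accumulating successors via findall.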
+ + /* + -> + (subtract(Next_pred_id1,[Pred_id],Pred_ids11), + append(Pred_ids1,[Pred_ids11],Pred_ids3)); + Pred_ids1=Pred_ids3), + collect_connected_pred_ids2(Next_pred_id,Pred_ids3,Pred_ids2,Globals3). +*/ + +%*/ +/* +collect_connected_pred_ids(Pred_id,Pred_ids1,Pred_ids2,Predicate_number,Globals3) :- + + member([pred_id_chain,Pred_id,Next_pred_id],Globals3), + ((not(member(Next_pred_id,Pred_ids1)), + member([[pred_num,Next_pred_id],Predicate_number],Globals3))-> + (append(Pred_ids1,[Next_pred_id],Pred_ids3), + collect_connected_pred_ids(Next_pred_id,Pred_ids3,Pred_ids2,Predicate_number,Globals3)); + Pred_ids1=Pred_ids2),!. +*/ +clear_cps(Choice_point_trail1a,Choice_point_trail1,CP_Vars1a,CP_Vars1) :- +%trace, +findall([A,B,C,D,E,L,M,F,H,J], +(member([A,B,C,D,E,L,M,F,H,J],Choice_point_trail1a),(J=[]->true;J=[_Ab,_Bb,_Cb,_Db,_Eb, + _Fb,[]]),not(L= -1),not(L=["returns to", _, "pred_id", _]),not(L=[findall_exit_function,_]),not(M="findall")),K), +delete_cps(Choice_point_trail1a,K,Choice_point_trail1,CP_Vars1a,CP_Vars1),!. 
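+
+% clear_cps/4 drops trail entries whose alternatives are exhausted (the
+% last field is [] or ends in []), while keeping control-flow frames that
+% are still needed for retry: line number -1, "returns to" markers,
+% findall_exit_function markers and "findall" frames themselves.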
+ +% delete choicepoints in all clauses of current predicate x +% find cps of same name, arity that have same previous pred_id x +cut_cps(Choice_point_trail1a,Choice_point_trail2,CP_Vars1a,CP_Vars2,Pred_id,Predicate_number,Globals3) :- + +%trace, % collect pred ids connected by curr pred num + + +% clear empty Cps + +%clear_cps(Choice_point_trail1a,Choice_point_trail1,CP_Vars1a,CP_Vars1), +Choice_point_trail1a=Choice_point_trail1, +CP_Vars1a=CP_Vars1, +% (possibly not necessarily) collect connected pred ids + +findall(Pred_ids,collect_connected_pred_ids(Pred_id,[Pred_id],Pred_ids,Predicate_number,Globals3),Pred_ids1), + +flatten(Pred_ids1,Pred_ids1a), +sort(Pred_ids1a,Pred_ids2), + +%writeln([1,Pred_ids2]), + % replace their cps with [] x + + +/* member([pred_id_chain,Prev_pred_id,Pred_id],Globals3), + pred_numbers(Pred_numbers), + member([Function1,Arity1,Pred_numbers1],Pred_numbers), + member(Predicate_number,Pred_numbers1), + + findall(Pred_id1,(member([pred_id_chain,Prev_pred_id,Pred_id1],Globals3), + member([Function1,Arity1,Pred_numbers2],Pred_numbers), + member([[pred_num,Pred_id1],Predicate_number1],Globals3), + member(Predicate_number1,Pred_numbers2)), + Pred_ids2), +*/ + findall(%[ + E2%,E4 + %] + ,(member(C,Pred_ids2), + E2=[_A,_B2,C,_D_Level,_E_Predicate_number2,_F_Line_number_a2,_Pred_or_line,_H,_I,_All_predicate_numbers2], + member(E2,Choice_point_trail1) + + %E1=[C,D_Level,E_Predicate_number2,F_Line_number_a2,Pred_or_line,H,I,All_predicate_numbers2], + +/* + (Pred_or_line="line"-> + (All_predicate_numbers2=[Ab,Bb,Cb,Db,Eb, + Fb,Vars2c],not(Vars2c=[]), + E4=[A,B2,C,D_Level,E_Predicate_number2,F_Line_number_a2,Pred_or_line,H,I,[Ab,Bb,Cb,Db,Eb, + Fb,[]]]); + Pred_or_line="predicate"-> + not(All_predicate_numbers2=[]), + E4=[A,B2,C,D_Level,E_Predicate_number2,F_Line_number_a2,Pred_or_line,H,I,[]]) +*/ + ),E3), + + % * delete not replace []s, heap since first line needed for retry + % - delete rest of cps with pred_ids2 + +%replace_cps(E3,B21,B23,B22), + 
+replace1(Choice_point_trail1,Pred_ids2,Choice_point_trail3,CP_Vars1,CP_Vars3), +%trace, +delete1(E3,Pred_ids2,E31),%,CP_Vars3,CP_Vars4), + + %delete_cp2(Choice_point_trail1,E31,Choice_point_trail2,CP_Vars1,CP_Vars2). + +delete_cps(Choice_point_trail3,E31,Choice_point_trail2,CP_Vars3,CP_Vars2). + +%/* +delete_cps(Choice_point_trail1,[],Choice_point_trail1,CP_Vars,CP_Vars) :- !. + +delete_cps(Choice_point_trail1,[E31|E32],Choice_point_trail2,CP_Vars1,CP_Vars2) :- +E31=[_A,_B|E33], +delete_cp(Choice_point_trail1,[_,_|E33],Choice_point_trail3,CP_Vars1,CP_Vars3,_), +delete_cps(Choice_point_trail3,E32,Choice_point_trail2,CP_Vars3,CP_Vars2),!. +%*/ + %replace_cp2(Choice_point_trail1,E3,Choice_point_trail2,CP_Vars1,CP_Vars2). + +%/* +replace1(E3,Pred_ids2,E31,CP_Vars1,CP_Vars1) :- + +findall(M, +(member(N,E3),((N=[A,B,C,D,E,L,"predicate",F,H,_J],(L= -1->true;L=["returns to", _, "pred_id", _]), +member(C,Pred_ids2))-> +M=[A,B,C,D,E,L,"predicate",F,H,[]]; +M=N)),E31),!. + +%*/ + +delete1(E3,Pred_ids2,E31%,CP_Vars1,CP_Vars3 +) :- + +findall([A,B,C,D,E,L,"predicate",F,H,J], +(member([A,B,C,D,E,L,"predicate",F,H,J],E3),(L= -1->true;L=["returns to", _, "pred_id", _]), +member(C,Pred_ids2)),K), +subtract(E3,K,E31%,CP_Vars1,CP_Vars3,_ +),!. + +delete1(E,_,E) :- !. + + replace_cp2(Choice_point_trail,[],Choice_point_trail,CP_Vars,CP_Vars) :- !. + replace_cp2(Choice_point_trail1,E3,Choice_point_trail2,CP_Vars1,CP_Vars2) :- + + E3=[[[A,B|E1],[A,B|E11]]|E2], +replace_cp(Choice_point_trail1,A,B,E1,E11,Choice_point_trail3,CP_Vars1,CP_Vars3), +replace_cp2(Choice_point_trail3,E2,Choice_point_trail2,CP_Vars3,CP_Vars2). 
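+
+% replace_cp2/5 applies a list of swaps of the form [[A,B|Old],[A,B|New]]:
+% each step calls replace_cp/8 to substitute the payload of the edge A-B
+% while keeping its endpoints, threading the CP variables through.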
+ + delete_cp_value(Choice_point_trail1,[Predicate_number_a1,"prev_pred_id",Prev_pred_id],Choice_point_trail31,CP_Vars1,CP_Vars21) :- + reverse(Choice_point_trail1,Choice_point_trail14), + +% [8, 9, 4, 3, [4, "prev_pred_id", 2], -1, "predicate", _51400588, _51400594, [[5, "prev_pred_id", 2], [6, "prev_pred_id", 2]]] + member([A1,A2,A_Pred_id,A_Level,A_Predicate_number,-1,"predicate",A4,A3,A5 + ],Choice_point_trail14), + %D11=[A_Pred_id,A_Level,A_Predicate_number,-1,"predicate",A3,A4,A5], + +((member([Predicate_number_a1,"prev_pred_id",Prev_pred_id],A5), +delete(A5,[Predicate_number_a1,"prev_pred_id",Prev_pred_id],A6), +%A5=[A6|A7]-> + +replace_cp(Choice_point_trail1,A1,A2, +[A_Pred_id,A_Level,A_Predicate_number,-1,"predicate",A4,A3,A5], +[A_Pred_id,A_Level,A_Predicate_number,-1,"predicate",A4,A3,A6], +Choice_point_trail31,CP_Vars1,CP_Vars21))->true; +(Choice_point_trail1=Choice_point_trail31, +CP_Vars1=CP_Vars21)). + + +find_retry_back_stack_n(N2) :- + retry_back_stack_n(N1), + N2 is N1+1, + retractall(retry_back_stack_n(_)), + assertz(retry_back_stack_n(N2)). diff --git a/SSI/ssi_3preds5.pl b/SSI/ssi_3preds5.pl new file mode 100644 index 0000000000000000000000000000000000000000..52358d46902bef348d0596e152c0b39a083116b4 --- /dev/null +++ b/SSI/ssi_3preds5.pl @@ -0,0 +1,676 @@ +/* +interpretpart(read_string1,Variable1,Vars1,Vars2) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("read_string",Dbw_read_string), + getvalue(Variable1,Value1,Vars1), + debug_call(Skip,[[Dbw_n,Dbw_read_string],[variable]]), + ((read_string(user_input, "\n", "\r", _End1, Value1A), + val1emptyorvalsequal(Value1,Value1A), + putvalue(Variable1,Value1A,Vars1,Vars2))-> + debug_exit(Skip,[[Dbw_n,Dbw_read_string],[Value1A]]) +; debug_fail(Skip,[[Dbw_n,Dbw_read_string],[variable]])),!. 
+*/ + +/*ssi_interpretpart(member,Variable1,Variable2,Vars1,Vars2,Vars2c) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("member",Dbw_member), + getvalues_equals4(Variable1,Variable2,Value1,Value2,Vars1), +debug_call(Skip,[[Dbw_n,Dbw_member],[Value1,Value2]]), + %(((not(Value2=empty)->member(Value2,Value1), + ((findall([[Value1,Value3],Vars2b,Value3],(member(Value3,Value1), + putvalue_equals4(Variable2,Value3,Vars1,Vars2b)%%,Vars2=Vars1 + ),Vars2a),Vars2a=[[_,Vars2,Value3]|Vars2d], + findall([[Value11,Value21],Vars2e],member([[Value11,Value21],Vars2e,Value31],Vars2d),Vars2c1), + Vars2c=[[[Dbw_n,Dbw_member],[Value1,Value2],Value3]|[Vars2c1]] + )-> + debug_exit(Skip,[[Dbw_n,Dbw_member],[Value1,Value3]]) +; debug_fail(Skip,[[Dbw_n,Dbw_member],[Value1,Value2]])),!. +*/ + +ssi_interpretpart(member2,Dbw_member2,Variable1,Variable2,Vars1,Vars2,Vars2c) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +%get_lang_word("member2",Dbw_member21),Dbw_member21=Dbw_member2, +get_lang_word("v",Dbw_v), + + getvalues_equals4(Variable1,Variable2,Value1,Value2,Vars1), +debug_call(Skip,[[Dbw_n,Dbw_member2],[Value2,Value1]]), + ((%Value2=empty, + + matrix_member(Matrix),findall(X,(member(Y,[Value1,Value2]),(contains_var([Dbw_v,_],Y)->X=o;X=i)),Z), +foldr(atom_concat,Z,'',W),(member(W,Matrix)->true;(writeln([incorrect,member2,modes,W]),abort)), + +(W=ii-> +(member(Value2,Value1)-> + + (Vars1=Vars2, + debug_exit(Skip,[[Dbw_n,Dbw_member2],[Value2,Value1]])); + (Vars1=Vars2, + debug_fail(Skip,[[Dbw_n,Dbw_member2],[Value2,Value1]]))); + +( +(W=io-> +((findall([Vars2b,[Value1,Value2a],Value2a],(member(Value2a,Value1), + putvalue_equals4(Variable2,Value2a,Vars1,Vars2b) + ),Vars2a),Vars2a=[[Vars2,_,Value2a]|Vars2d], + findall([Vars2e,Vals2g],member([Vars2e,Vals2g,_],Vars2d),Vars2c1), + Vars2c=[[Dbw_n,Dbw_member2],[Value1,Value2],_,_,%,%Value2a + _,_,%[Value1,Value2a] + Vars2c1], + + + debug_exit(Skip,[[Dbw_n,Dbw_member2],[Value2a,Value1]]))); + + + +(W=oi-> + +(%trace, + command_n_sols(N), 
+ findall([Vars2b,[Value1a,Value2],Value1a],( + +%replace_in_term([Value2,],_%'$VAR'(_) +% ,empty,Value1A1), + findnsols(N,Value1A1,(member(Value2,Value1A), + + %Value1A=[Value3A2|Value3A3], + %ValueIA1=[Value3A2,"|",Value3A3], + + replace_in_term(Value1A,_%'$VAR'(_) + ,empty2,Value1A2), + + convert_to_lp_pipe(Value1A2,Value1A3), + + find_v_sys(V_sys), + + replace_in_term(Value1A3,empty2%'$VAR'(_) + ,V_sys,Value1A1) + + ) + ,ValueA),!, + + + %val1emptyorvalsequal(Value3,Value3A), + %trace, + %Vars1=Vars2, + member(Value1a,ValueA), + putvalue_equals4(Variable1,Value1a,Vars1,Vars2b) ),Vars2a), + %trace, + Vars2a=[[Vars2,_,Value1a]|Vars2d], + findall([Vars2e,Vals2g],member([Vars2e,Vals2g,_],Vars2d),Vars2c1), + Vars2c=[[Dbw_n,Dbw_member2],[Value1,Value2],_,_,%,%Value2a + _,_,%[Value1,Value2a] + Vars2c1] + + , + debug_exit(Skip,[[Dbw_n,Dbw_member2],[Value2,Value1a]])); + + +(W=oo-> + + +(%trace, +command_n_sols(N), + findall([Vars2b,[Value1,Value2a],%Value1a, + Value2a],(findnsols(N,%[Value1A2, + Value2A2%] + ,(member(_Value1A,Value2A), + %replace_in_term(Value1A,_%'$VAR'(_) + %,empty,Value1A1), + replace_in_term(Value2A,_%'$VAR'(_) + ,empty2,Value2A1), + + %convert_to_lp_pipe(Value1A1,Value1A2), + convert_to_lp_pipe(Value2A1,Value2A3), + + find_v_sys(V_sys), + + replace_in_term(Value2A3,empty2%'$VAR'(_) + ,V_sys,Value2A2) + + + ) + ,ValueA),!, + %val1emptyorvalsequal(Value3,Value3A), + %trace, + %Vars1=Vars2, + member(%[Value1a, + Value2a%] + ,ValueA), + %putvalue_equals4(Variable1,Value1a,Vars1,Vars3),%)-> + putvalue_equals4(Variable2,Value2a,Vars1,Vars2b) + ),Vars2a),Vars2a=[[Vars2,_,%Value1a, + Value2a]|Vars2d], + findall([Vars2e,Vals2g],member([Vars2e,Vals2g,%_, + _],Vars2d),Vars2c1), + +%trace, + Vars2c=[[Dbw_n,Dbw_member2],[Value1,Value2a],_,_,%,%Value2a + _,_,%[Value1,Value2a] + Vars2c1], + + debug_exit(Skip,[[Dbw_n,Dbw_member2],[Value2a,Value1]])))))))) ->true;(%writeln1(fail-ssi_interpretpart(member2,Variable1,Variable2,Vars1,Vars2,Vars2c)), + fail)). 
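+
+% Mode-dispatch sketch (illustrative, not part of the interpreter): each
+% argument of member2 is classified as i (instantiated) or o (still contains
+% an unbound [v,_] variable, per contains_var/2), the letters are joined into
+% an atom with the project's foldr/4, and that atom selects a branch after
+% being validated against matrix_member/1 (unknown modes abort with
+% [incorrect,member2,modes,W]). For example, with Value1=[1,2] ground and
+% Value2=[v,x] open, the classification gives Z=[i,o] and W=io, so the io
+% branch enumerates member(Value2a,[1,2]) and keeps the uncommitted solutions
+% in Vars2c for later backtracking.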
+ + + + +ssi_interpretpart(stringconcat,Variable1,Variable2,Variable3,Vars1,Vars2,Vars2c) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("stringconcat",Dbw_stringconcat1),Dbw_stringconcat1=Dbw_stringconcat, +%Vars1=Vars2, +%trace, + +getvalues_equals4(Variable1,Variable2,Variable3,Value1z,Value2z,Value3z,Vars1), +debug_call(Skip,[[Dbw_n,Dbw_stringconcat],[Value1z,Value2z,Value3z]]), + +matrix(Matrix),findall(X,(member(Y,[Value1z,Value2z,Value3z]),(is_empty(Y)->X=o;X=i)),Z), +foldr(atom_concat,Z,'',W),(member(W,Matrix)->true;(writeln([incorrect,stringconcat,modes,W]),abort)), + +%findall(Item_u,(member(Item,[Value1z,Value2z,Value3z]),replace_empty_with_undefined(Item,Item_u)),[Value1,Value2,Value3]), +[Value1z,Value2z,Value3z]=[Value1,Value2,Value3], + +(W=iii-> +(%findall([Vars2b,[Value1,Value2a],Value2a],(,* +((string_concat(Value1,Value2,Value3) +%, +%replace_undefined_with_empty(Item,Item_e) +)-> +(Vars1=Vars2, +debug_exit(Skip,[[Dbw_n,Dbw_stringconcat],[Value1,Value2,Value3]])); +(Vars1=Vars2, +debug_fail(Skip,[[Dbw_n,Dbw_stringconcat],[Value1,Value2,Value3]]))), + %putvalue_equals4(Variable2,Value2a,Vars1,Vars2b) + %),Vars2a),Vars2a=[[Vars2,_,Value2a]|Vars2d], + %findall([Vars2e,Vals2g],member([Vars2e,Vals2g,_],Vars2d),Vars2c1)), + + Vars2c=[[Dbw_n,Dbw_stringconcat],[Value1,Value2,Value3],_,_,%,%Value2a + _,_,%[Value1,Value2a] + []] +); + +%numbers(3,1,[],N), +%findall(P,(member(N1,N),get_item_n(Z,N1,Z1),(Z1=i->get_item_n([Value1,Value2,Value3],N1,P));P=[v1,N1]%(Z1=o,get_item_n([Value1a,Value2a,Value3a],N1,P)) +%),P1), + +(W=iio-> + (findall([Vars2b,[Value1,Value2,Value3a],Value3a],(string_concat(Value1,Value2,Value3a), + putvalue_equals4(Variable3,Value3a,Vars1,Vars2b) + ),Vars2a),Vars2a=[[Vars2,_,Value3a]|Vars2d], + findall([Vars2e,Vals2g],member([Vars2e,Vals2g,_],Vars2d),Vars2c1), + + Vars2c=[[Dbw_n,Dbw_stringconcat],[Value1,Value2,Value3],_,_,%,%Value2a + _,_,%[Value1,Value2a] + Vars2c1], + + 
debug_exit(Skip,[[Dbw_n,Dbw_stringconcat],[Value1,Value2,Value3a]])); + + +(W=ioi-> + (findall([Vars2b,[Value1,Value2a,Value3],Value2a],(string_concat(Value1,Value2a,Value3), + putvalue_equals4(Variable2,Value2a,Vars1,Vars2b) + ),Vars2a),Vars2a=[[Vars2,_,Value2a]|Vars2d], + findall([Vars2e,Vals2g],member([Vars2e,Vals2g,_],Vars2d),Vars2c1), + + Vars2c=[[Dbw_n,Dbw_stringconcat],[Value1,Value2,Value3],_,_,%,%Value2a + _,_,%[Value1,Value2a] + Vars2c1], + + debug_exit(Skip,[[Dbw_n,Dbw_stringconcat],[Value1,Value2a,Value3]])); + + +/* +(W=ioo-> + (findall([Vars2b,[Value1,Value2a,Value3a],Value2a,Value3a],(string_concat*(Value1,Value2a,Value3a), + putvalue_equals4(Variable2,Value2a,Vars1,Vars2b) + ),Vars2a),Vars2a=[[Vars2,_,Value2a]|Vars2d], + findall([Vars2e,Vals2g],member([Vars2e,Vals2g,_],Vars2d),Vars2c1), + + Vars2c=[[Dbw_n,Dbw_stringconcat],[Value1,Value2,Value3],_,_,%,%Value2a + _,_,%[Value1,Value2a] + Vars2c1], + + debug_exit(Skip,[[Dbw_n,Dbw_stringconcat],[Value1,Value2a,Value3]])); +*/ + +(W=oii-> + (findall([Vars2b,[Value1a,Value2,Value3],Value1a],(string_concat(Value1a,Value2,Value3), + putvalue_equals4(Variable1,Value1a,Vars1,Vars2b) + ),Vars2a),Vars2a=[[Vars2,_,Value1a]|Vars2d], + findall([Vars2e,Vals2g],member([Vars2e,Vals2g,_],Vars2d),Vars2c1), + + Vars2c=[[Dbw_n,Dbw_stringconcat],[Value1,Value2,Value3],_,_,%,%Value2a + _,_,%[Value1,Value2a] + Vars2c1], + + debug_exit(Skip,[[Dbw_n,Dbw_stringconcat],[Value1a,Value2,Value3]])); + + +/* +(W=oio-> + (findall([Vars2b,[Value1a,Value2,Value3],Value1a],(string_concat(Value1a,Value2,Value3), + putvalue_equals4(Variable1,Value1a,Vars1,Vars2b) + ),Vars2a),Vars2a=[[Vars2,_,Value1a]|Vars2d], + findall([Vars2e,Vals2g],member([Vars2e,Vals2g,_],Vars2d),Vars2c1), + + Vars2c=[[Dbw_n,Dbw_stringconcat],[Value1,Value2,Value3],_,_,%,%Value2a + _,_,%[Value1,Value2a] + Vars2c1], + + debug_exit(Skip,[[Dbw_n,Dbw_stringconcat],[Value1a,Value2,Value3]])); +*/ + +(W=ooi-> + (%trace, + 
findall([Vars2b,[Value1a,Value2a,Value3],Value1a,Value2a],(string_concat(Value1a,Value2a,Value3), + putvalue_equals4(Variable1,Value1a,Vars1,Vars2ba), + putvalue_equals4(Variable2,Value2a,Vars2ba,Vars2b) + ),Vars2a),Vars2a=[[Vars2,_,Value1a,Value2a]|Vars2d], + findall([Vars2e,Vals2g],member([Vars2e,Vals2g,_,_],Vars2d),Vars2c1), + + Vars2c=[[Dbw_n,Dbw_stringconcat],[Value1,Value2,Value3],_,_,%,%Value2a + _,_,%[Value1,Value2a] + Vars2c1], + + debug_exit(Skip,[[Dbw_n,Dbw_stringconcat],[Value1a,Value2a,Value3]])) + +))))). + + + +ssi_interpretpart(append,Variable1,Variable2,Variable3,Vars1,Vars2,Vars2c) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("append",Dbw_append1),Dbw_append1=Dbw_append, +get_lang_word("v",Dbw_v), + +%Vars1=Vars2, + getvalues_equals4(Variable1,Variable2,Variable3,Value1z,Value2z,Value3z,Vars1), +debug_call(Skip,[[Dbw_n,Dbw_append],[Value1z,Value2z,Value3z]]), + +matrix(Matrix),findall(X,(member(Y,[Value1z,Value2z,Value3z]),(contains_var([Dbw_v,_],Y)->X=o;X=i)),Z), +foldr(atom_concat,Z,'',W),(member(W,Matrix)->true;(writeln([incorrect,append,modes,W]),abort)), + +%findall(Item_u,(member(Item,[Value1z,Value2z,Value3z]),replace_empty_with_undefined(Item,Item_u)),[Value1,Value2,Value3]), +[Value1z,Value2z,Value3z]=[Value1,Value2,Value3], + +( +(W=iii-> +(%findall([Vars2b,[Value1,Value2a],Value2a],(,* +((append(Value1,Value2,Value3) +%, +%replace_undefined_with_empty(Item,Item_e) +)-> +(Vars1=Vars2, +debug_exit(Skip,[[Dbw_n,Dbw_append],[Value1,Value2,Value3]])); +(Vars1=Vars2, +debug_fail(Skip,[[Dbw_n,Dbw_append],[Value1,Value2,Value3]]))), + %putvalue_equals4(Variable2,Value2a,Vars1,Vars2b) + %),Vars2a),Vars2a=[[Vars2,_,Value2a]|Vars2d], + %findall([Vars2e,Vals2g],member([Vars2e,Vals2g,_],Vars2d),Vars2c1)), + + Vars2c=[[Dbw_n,Dbw_append],[Value1,Value2,Value3],W,_,%,%Value2a + _,_,%[Value1,Value2a] + []] +); + + +%numbers(3,1,[],N), 
+%findall(P,(member(N1,N),get_item_n(Z,N1,Z1),(Z1=i->get_item_n([Value1,Value2,Value3],N1,P));P=[v1,N1]%(Z1=o,get_item_n([Value1a,Value2a,Value3a],N1,P)) +%),P1), + +(W=iio-> + (findall([Vars2b,[Value1,Value2,Value3a],Value3a],(append(Value1,Value2,Value3a), + putvalue_equals4(Variable3,Value3a,Vars1,Vars2b) + ),Vars2a),Vars2a=[[Vars2,_,Value3a]|Vars2d], + findall([Vars2e,Vals2g],member([Vars2e,Vals2g,_],Vars2d),Vars2c1), + + Vars2c=[[Dbw_n,Dbw_append],[Value1,Value2,Value3],_,_,%,%Value2a + _,_,%[Value1,Value2a] + Vars2c1], + + debug_exit(Skip,[[Dbw_n,Dbw_append],[Value1,Value2,Value3a]])); + + +(W=ioi-> + (findall([Vars2b,[Value1,Value2a,Value3],Value2a],(append(Value1,Value2a,Value3), + putvalue_equals4(Variable2,Value2a,Vars1,Vars2b) + ),Vars2a),Vars2a=[[Vars2,_,Value2a]|Vars2d], + findall([Vars2e,Vals2g],member([Vars2e,Vals2g,_],Vars2d),Vars2c1), + + Vars2c=[[Dbw_n,Dbw_append],[Value1,Value2,Value3],_,_,%,%Value2a + _,_,%[Value1,Value2a] + Vars2c1], + + debug_exit(Skip,[[Dbw_n,Dbw_append],[Value1,Value2a,Value3]])); + + +%/* % this is possible later x +(W=ioo-> + + + +findall([Vars2b,[Value1,Value2a,Value3a],Value3a],( + + append1(Value1,_Value2A,Value3A),%ValueA), + find_v_sys(V_sys), + + replace_in_term(Value3A,_%'$VAR'(_) + ,V_sys,Value3A1), + Value3A1=[Value3A2|Value3A3], + Value3A4=[Value3A2,"|",Value3A3], + %val1emptyorvalsequal(Value3,Value3A), + %trace, + %Vars1=Vars2, + %member([Value1A,Value3A],ValueA), + putvalue_equals4(Variable3,Value3A4,Vars1,Vars2b)%)-> + + + %(findall([Vars2b,[Value1,Value2a,Value3a],Value2a,Value3a],(append*(Value1,Value2a,Value3a), + %putvalue_equals4(Variable2,Value2a,Vars1,Vars2b) + ),Vars2a),Vars2a=[[Vars2,_,Value3a]|Vars2d], + findall([Vars2e,Vals2g],member([Vars2e,Vals2g,_],Vars2d),Vars2c1), + + Vars2c=[[Dbw_n,Dbw_append],[Value1,Value2,Value3],_,_,%,%Value2a + _,_,%[Value1,Value2a] + Vars2c1], + + debug_exit(Skip,[[Dbw_n,Dbw_append],[Value1,Value2,Value3a]])); +%*/ + +(W=oii-> + 
(findall([Vars2b,[Value1a,Value2,Value3],Value1a],(append(Value1a,Value2,Value3), + putvalue_equals4(Variable1,Value1a,Vars1,Vars2b) + ),Vars2a),Vars2a=[[Vars2,_,Value1a]|Vars2d], + findall([Vars2e,Vals2g],member([Vars2e,Vals2g,_],Vars2d),Vars2c1), + + Vars2c=[[Dbw_n,Dbw_append],[Value1,Value2,Value3],_,_,%,%Value2a + _,_,%[Value1,Value2a] + Vars2c1], + + debug_exit(Skip,[[Dbw_n,Dbw_append],[Value1a,Value2,Value3]])); + + +%/* possible later x + +(W=oio-> + + +(command_n_sols(N), + findall([Vars2b,[Value1a,Value2,Value3a],Value1a,Value3a],(findnsols(N,[Value1A1,Value3A1],(append1(Value1A,Value2,Value3A), + find_v_sys(V_sys1), + replace_in_term(Value1A,_%'$VAR'(_) + ,V_sys1,Value1A1), + find_v_sys(V_sys2), + replace_in_term(Value3A,_%'$VAR'(_) + ,V_sys2,Value3A1)) + ,ValueA),!, + %val1emptyorvalsequal(Value3,Value3A), + %trace, + %Vars1=Vars2, + member([Value1a,Value3a],ValueA), + putvalue_equals4(Variable1,Value1a,Vars1,Vars3),%)-> + putvalue_equals4(Variable3,Value3a,Vars3,Vars2b) + ),Vars2a),Vars2a=[[Vars2,_,Value1a,Value3a]|Vars2d], + findall([Vars2e,Vals2g],member([Vars2e,Vals2g,_,_],Vars2d),Vars2c1), + + Vars2c=[[Dbw_n,Dbw_append],[Value1,Value2,Value3],_,_,%,%Value2a + _,_,%[Value1,Value2a] + Vars2c1], + + debug_exit(Skip,[[Dbw_n,Dbw_append],[Value1a,Value2,Value3a]])); +%*/ + +(W=ooi-> + (%trace, + findall([Vars2b,[Value1a,Value2a,Value3],Value1a,Value2a],(append(Value1a,Value2a,Value3), + putvalue_equals4(Variable1,Value1a,Vars1,Vars2ba), + putvalue_equals4(Variable2,Value2a,Vars2ba,Vars2b) + ),Vars2a),Vars2a=[[Vars2,_,Value1a,Value2a]|Vars2d], + findall([Vars2e,Vals2g],member([Vars2e,Vals2g,_,_],Vars2d),Vars2c1), + + Vars2c=[[Dbw_n,Dbw_append],[Value1,Value2,Value3],_,_,%,%Value2a + _,_,%[Value1,Value2a] + Vars2c1], + + debug_exit(Skip,[[Dbw_n,Dbw_append],[Value1a,Value2a,Value3]])); + +(W=ooo-> + + +(%trace, +command_n_sols(N), +%N=3, +%find_v_sys(V_sys), +replace_in_term(Value2,[Dbw_v,_],%'$VAR'(_) + _,Value22), findall([Vars2b,[Value1a,%Value2a, + 
Value3a],Value1a,%Value2a, + Value3a],(findnsols(N,[Value1A1,%Value2A1, + Value3A1],(append1(Value1A,Value22,Value3A), + find_v_sys(V_sys1), + find_v_sys(V_sys2), + replace_in_term(Value1A,_%'$VAR'(_) + ,empty2,Value1A2), + %replace_in_term(Value2A,_%'$VAR'(_) + %,empty,Value2A1), + replace_in_term(Value3A,_%'$VAR'(_) + ,empty2,Value3A2), + convert_to_lp_pipe(Value1A2,Value1A3), + %convert_to_lp_pipe(Value2A2,Value2A1), + convert_to_lp_pipe(Value3A2,Value3A3), + replace_in_term(Value1A3,empty2%'$VAR'(_) + ,V_sys1,Value1A1), + replace_in_term(Value3A3,empty2%'$VAR'(_) + ,V_sys2,Value3A1) + ) + ,ValueA),!, + %val1emptyorvalsequal(Value3,Value3A), + %trace, + %Vars1=Vars2, + member([Value1a,%Value2a, + Value3a],ValueA), + putvalue_equals4(Variable1,Value1a,Vars1,Vars3b),%)-> + %putvalue_equals4(Variable2,Value2a,Vars3b,Vars3),%)-> + putvalue_equals4(Variable3,Value3a,Vars3b,Vars2b) + ),Vars2a),Vars2a=[[Vars2,_,Value1a,%Value2a, + Value3a]|Vars2d], + findall([Vars2e,Vals2g],member([Vars2e,Vals2g,_,_],Vars2d),Vars2c1), + + Vars2c=[[Dbw_n,Dbw_append],[Value1,Value2,Value3],_,_,%,%Value2a + _,_,%[Value1,Value2a] + Vars2c1], + + debug_exit(Skip,[[Dbw_n,Dbw_append],[Value1a,Value2a,Value3a]])) + + +)))))))->true;(%writeln1(fail-ssi_interpretpart(append,Variable1,Variable2,Variable3,Vars1,Vars2,Vars2c)), +fail)). + + + + + % writeln(Vars2c),trace. +%%; %%debug_fail(Skip,[[n,member2],[Value1,Value2]])),!. +%% ((debug(on)->(writeln1([fail,[[n,member2],[Value1,value]],"Press c."]),(leash1(on)->true;(not(get_single_char(97))->true;abort)));true),fail))))). 
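+
+% A note on the Vars2c result record (shape inferred from the clauses above):
+% each nondeterministic builtin returns a record of the form
+%
+%   [[n,Name],CallArgs,_,_,_,_,RemainingSolutions]
+%
+% where RemainingSolutions is a list of [Vars,Values] pairs for the solutions
+% that were not committed to (the four middle slots are mostly anonymous; the
+% deterministic iii branches leave RemainingSolutions as []). On retry the
+% interpreter can pop the next pair instead of recomputing the findall/4,
+% which is what the AC-taking replay clauses below rely on.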
+
+ssi_interpretpart(member3,Variable2,Variable1,Vars1,Vars2,Vars2c) :-
+get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n,
+get_lang_word("member3",Dbw_member31),Dbw_member31=Dbw_member3,
+ getvalues_equals4(Variable1,Variable2,Value1,Value2,Vars1),
+debug_call(Skip,[[Dbw_n,Dbw_member3],[Value1,Value2]]), (%Value2=empty,
+ ((findall([Vars2b,[Value1,Value2a],Value2a],(member(Value2a,Value1),
+ putvalue_equals4(Variable2,Value2a,Vars1,Vars2b)
+ ),Vars2a),Vars2a=[[Vars2,_,Value2a]|Vars2d],
+ findall([Vars2e,Vals2g],member([Vars2e,Vals2g,_],Vars2d),Vars2c1),
+ Vars2c=[[Dbw_n,Dbw_member3],[Value1,Value2],_,_,%,[Value1,Value2a]
+ _,_,Vars2c1]
+
+ ))),
+ debug_exit(Skip,[[Dbw_n,Dbw_member3],[Value1,Value2a]]).
+
+%%%
+
+/*
+ssi_interpretpart(member,Variable1,Variable2,Vars1,Vars2,Vars2c,AC) :-
+get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n,
+get_lang_word("member",Dbw_member),
+AC=[[_Dbw_n,_Dbw_member],[Value1,Value2],[Value1a,Value2a]],
+ %getvalues_equals4(Variable1,Variable2,Value1,Value2,Vars1),
+debug_call(Skip,[[Dbw_n,Dbw_member],[Value1,Value2]]),
+ %(((not(Value2=empty)->member(Value2,Value1),
+ %((findall([[Value1,Value3],Vars2b,Value3],(member(Value3,Value1),
+ %putvalue_equals4(Variable2,Value3,Vars1,Vars2b)%%,Vars2=Vars1
+ %),Vars2a),Vars2a=[[_,Vars2,Value3]|Vars2d],
+ %findall([[Value11,Value21],Vars2e],member([[Value11,Value21],Vars2e,Value31],Vars2d),Vars2c1),
+ %Vars2c=[[[Dbw_n,Dbw_member],[Value1,Value2],Value3]|[Vars2c1]]
+ %)->
+ debug_exit(Skip,[[Dbw_n,Dbw_member],[Value1,Value3]])
+%; debug_fail(Skip,[[Dbw_n,Dbw_member],[Value1,Value2]])),
+!.
+*/ + +ssi_interpretpart(member2,Dbw_member2,_Variable1,_Variable2,_Vars1,Vars2,_Vars2c,AC) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +%get_lang_word("member2",Dbw_member21),Dbw_member21=Dbw_member2, +%trace, +AC=[[Dbw_n,Dbw_member2],Value1_Value2,Value1a_Value2a,Vars2|_], +%writeln1(AC), +%writeln1(AC), + %getvalues_equals4(Variable1,Variable2,Value1,Value2,Vars1), + Value1_Value2=[Value1_x,Value2_x], +debug_call(Skip,[[Dbw_n,Dbw_member2],[Value2_x,Value1_x]]), + %((%Value2=empty, + %((findall([Vars2b,Value2a],(member(Value2a,Value1), + %putvalue_equals4(Variable2,Value2a,Vars1,Vars2b) + %),Vars2a),Vars2a=[[Vars2,Value2a]|Vars2d], + % findall(Vars2e,member([Vars2e,_],Vars2d),Vars2c) + %))), + Value1a_Value2a=[Value1a_x,Value2a_x], + debug_exit(Skip,[[Dbw_n,Dbw_member2],[Value2a_x,Value1a_x]])%) + . + + + + +ssi_interpretpart(stringconcat,_Variable1,_Variable2,_Variable3,_Vars1,Vars2,_Vars2c,AC) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("stringconcat",Dbw_stringconcat1),Dbw_stringconcat1=Dbw_stringconcat, + +AC=[[Dbw_n,Dbw_stringconcat],Values,Values_a,Vars2|_], + +debug_call(Skip,[[Dbw_n,Dbw_stringconcat],Values]), + + debug_exit(Skip,[[Dbw_n,Dbw_stringconcat],Values_a]). + + + + +ssi_interpretpart(append,_Variable1,_Variable2,_Variable3,_Vars1,Vars2,_Vars2c,AC) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("append",Dbw_append1),Dbw_append1=Dbw_append, + +AC=[[Dbw_n,Dbw_append],Values,Values_a,Vars2|_], + +debug_call(Skip,[[Dbw_n,Dbw_append],Values]), + + debug_exit(Skip,[[Dbw_n,Dbw_append],Values_a]). + +%%; %%debug_fail(Skip,[[n,member2],[Value1,Value2]])),!. +%% ((debug(on)->(writeln1([fail,[[n,member2],[Value1,value]],"Press c."]),(leash1(on)->true;(not(get_single_char(97))->true;abort)));true),fail))))). 
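+
+% Replay sketch (reading inferred from the clause bodies above): when a stored
+% AC record [[n,Name],Values,Values_a,Vars2|_] is supplied, these arity+1
+% variants of member2, stringconcat and append do no recomputation at all.
+% They simply re-emit debug_call/debug_exit with the recorded call and exit
+% values and return the recorded Vars2, i.e. they replay a previously computed
+% solution when the interpreter retries into this point.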
+ +/* +ssi_interpretpart(member3,_Variable2,_Variable1,_Vars1,Vars2,_Vars2c,AC) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("member3",Dbw_member2), +%trace, +AC=[[_Dbw_n,_Dbw_member],[Value1,Value2],[Value1a,Value2a],Vars2|_], + +%getvalues_equals4(Variable1,Variable2,Value1,Value2,Vars1), +debug_call(Skip,[[Dbw_n,Dbw_member2],[Value1,Value2]]), %((%Value2=empty, + %((findall([Vars2b,Value2a],(member(Value2a,Value1), + %putvalue_equals4(Variable2,Value2a,Vars1,Vars2b) + %),Vars2a),Vars2a=[[Vars2,Value2a]|Vars2d], + % findall(Vars2e,member([Vars2e,_],Vars2d),Vars2c) + %))), + debug_exit(Skip,[[Dbw_n,Dbw_member2],[Value1a,Value2a]]). +*/ +/* +ssi_interpretpart(stringconcat,Terminal,Phrase2,Phrase1,Vars1,Vars2,Vars2c) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("stringconcat",Dbw_stringconcat), + %%Variables1=[Terminal,Phrase1,Phrase2], %% terminal can be v or "a" + ((getvalues2([Terminal,Phrase1,Phrase2], + [],[TerminalValue1,Phrase1Value1,Phrase2Value1],Vars1,[],[Flag1,Flag2,_Flag3]), %% prolog vars, list of vars, [v]=[prolog var] + %%delete(Value1,Value2,Value3A), + + (findall([Vars2b,TerminalValue1,TerminalValue3,Phrase1Value1,Phrase1Value3,Phrase2Value1],( + (Terminal=[_Value]->TerminalValue2=[TerminalValue1];TerminalValue2=TerminalValue1), + + +(Terminal=""->(TerminalValue2="", + +string_concat(TerminalValue2,Phrase2Value1,Phrase1Value1))->true; + ((var(TerminalValue2)->(string_concat(TerminalValue2,Phrase2Value1,Phrase1Value1)),string_length(TerminalValue2,1) + );string_concat(TerminalValue2,Phrase2Value1,Phrase1Value1))), + + putvalue(Terminal,TerminalValue2,Vars1,Vars3), + putvalue(Phrase2,Phrase2Value1,Vars3,Vars4), + putvalue(Phrase1,Phrase1Value1,Vars4,Vars2b) + + ),Vars2a),Vars2a=[[Vars2,TerminalValue1,TerminalValue3,Phrase1Value1,Phrase1Value3,Phrase2Value1]|Vars2d], + findall(Vars2e,member([Vars2e,_],Vars2d),Vars2c)), + (Flag1=true->TerminalValue3=variable1;TerminalValue3=TerminalValue1), + 
(Flag2=true->Phrase1Value3=variable2;Phrase1Value3=Phrase1Value1))-> + +(debug_call(Skip,[[Dbw_n,Dbw_stringconcat],[TerminalValue3,Phrase1Value3,Phrase2]]), + +debug_exit(Skip,[[Dbw_n,Dbw_stringconcat],[TerminalValue1,Phrase1Value1,Phrase2Value1]]) + ); + + (debug_call(Skip,[[Dbw_n,Dbw_stringconcat],[variable1,variable2,variable3]]), + debug_fail(Skip,[[Dbw_n,Dbw_stringconcat],[variable1,variable2,variable3]]) + )).%!. +*/ + +/* +ssi_interpretpart(grammar_part,Variables1,Vars1,Vars2) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +%get_lang_word("grammar_part",Dbw_grammar_part), + + Variables1=[Terminal,Phrase1,Phrase2], %% terminal can be v or "a" + %%terminal(Terminal), + getvalues2([Terminal,Phrase1,Phrase2], + [],[TerminalValue1,Phrase1Value1,Phrase2Value1],Vars1,[],[Flag1,Flag2,_Flag3]), %% prolog vars, list of vars, [v]=[prolog var] + %%delete(Value1,Value2,Value3A), + (( (Terminal=[_Value]->TerminalValue2=[TerminalValue1];TerminalValue2=TerminalValue1), + + + +((string(Phrase1Value1)->Phrase1Value1=Phrase1Value11;(number(Phrase1Value1)->number_string(Phrase1Value1,Phrase1Value11);Phrase1Value1=Phrase1Value11)), + +(Terminal=""->TerminalValue2="";true), + +(((var(TerminalValue2)->(string_concat(TerminalValue2,Phrase2Value1,Phrase1Value11)),string_length(TerminalValue2,1));string_concat(TerminalValue2,Phrase2Value1,Phrase1Value11))->true; + +string_concat(TerminalValue2,Phrase2Value1,Phrase1Value11))->true; + + + +((Phrase1Value1=[_ItemA|_ItemsA]),(Terminal=[]->(TerminalValue2=[], + +((var(TerminalValue2)->length(TerminalValue2,1);true),(append(TerminalValue2,Phrase2Value1,Phrase1Value1))))->true; + +(append(TerminalValue2,Phrase2Value1,Phrase1Value1)->true)))), + + putvalue(Terminal,TerminalValue2,Vars1,Vars3), + putvalue(Phrase2,Phrase2Value1,Vars3,Vars4), + putvalue(Phrase1,Phrase1Value1,Vars4,Vars2), + (Flag1=true->TerminalValue3=variable1;TerminalValue3=TerminalValue1), + (Flag2=true->Phrase1Value3=variable2;Phrase1Value3=Phrase1Value1))-> + 
(debug_call(Skip,[[Dbw_n,grammar_part],[TerminalValue3,Phrase1Value3,Phrase2]]), + debug_exit(Skip,[[Dbw_n,grammar_part],[TerminalValue1,Phrase1Value1,Phrase2Value1]])); + +% CAW requires input,input,output with "a","ab",[v,a] where [v,a]="b" + (debug_call(Skip,[[Dbw_n,grammar_part],[Terminal,Phrase1,Phrase2]]), + (debug_fail(Skip,[[Dbw_n,grammar_part],[Terminal,Phrase1,Phrase2]])))),!. + +*/ + \ No newline at end of file diff --git a/SSI/ssi_find_state_machine.pl b/SSI/ssi_find_state_machine.pl new file mode 100644 index 0000000000000000000000000000000000000000..73053fafd04049b4a8c2295fcf74487eedfb0848 --- /dev/null +++ b/SSI/ssi_find_state_machine.pl @@ -0,0 +1,526 @@ +%% finds line number to go to in predicate if true or false +%% record predicate number + +/** + +x: load lpi first +['ssi2.pl']. +['ssi_find_state_machine.pl']. +['../../../GitHub/listprologinterpreter/lpiverify4.pl']. +['../../../GitHub/listprologinterpreter/la_strings.pl']. + +test(33,_,A,_),add_line_numbers_to_algorithm1(A,A2),writeln1(A2). [[0,[n,downpipe],[[v,a],[v,a],[v,b]]],[1,[n,downpipe],[[v,a],[v,b],[v,c]],":-",[[0,[n,member2],[[v,c],[v,c1]]],[1,[n,equals1],[[v,c1]]],[2,[n,equals1],[[v,c12]]],[3,[n,"->"],[[4,[n,>],[[v,a],[v,c121]]],[5,[n,downpipe],[[v,c121],[v,b],[v,c]]],[6,[n,"->"],[[7,[n,>],[[v,a],[v,c122]]],[8,[n,downpipe],[[v,c122],[v,b],[v,c]]],[9,[n,fail]]]]]]]]] + + +A=[[0,[n,downpipe],[[v,a],[v,a],[v,b]]],[1,[n,downpipe],[[v,a],[v,b],[v,c]],":-",[[0,[n,member2],[[v,c],[v,c1]]],[1,[n,equals1],[[v,c1]]],[2,[n,equals1],[[v,c12]]],[3,[n,"->"],[[4,[n,>],[[v,a],[v,c121]]],[5,[n,downpipe],[[v,c121],[v,b],[v,c]]],[6,[n,"->"],[[7,[n,>],[[v,a],[v,c122]]],[8,[n,downpipe],[[v,c122],[v,b],[v,c]]],[9,[n,fail]]]]]]]]], find_state_machine1(A,B),writeln1(B). 
+ +[[0,[n,downpipe],[[v,a],[v,a],[v,b]]],[1,[n,downpipe],[[v,a],[v,b],[v,c]],":-",[[0,[n,member2],[[v,c],[v,c1]]],[1,[n,equals1],[[v,c1]]],[2,[n,equals1],[[v,c12]]],[3,[n,"->"],[[4,[n,>],[[v,a],[v,c121]]],[5,[n,downpipe],[[v,c121],[v,b],[v,c]]],[6,[n,"->"],[[7,[n,>],[[v,a],[v,c122]]],[8,[n,downpipe],[[v,c122],[v,b],[v,c]]],[9,[n,fail]]]]]]]]] +A = B +x + +B=[[0,[n,downpipe],[[v,a],[v,a],[v,b]]],[1,[n,downpipe],[[v,a],[v,b],[v,c]],":-",[[0,1,-,-3,[n,member2],[[v,c],[v,c1]]],[1,2,-,-3,[n,equals1],[[v,c1]]],[2,3,-,-3,[n,equals1],[[v,c12]]],[3,4,-2,-3,[n,"->"],[[4,5,-,6,[n,>],[[v,a],[v,c121]]],[5,[end_function,3],-,-3,[n,downpipe],[[v,c121],[v,b],[v,c]]],[6,7,[end_function,3],-3,[n,"->"],[[7,8,-,9,[n,>],[[v,a],[v,c122]]],[8,[end_function,6],-,-3,[n,downpipe],[[v,c122],[v,b],[v,c]]],[9,[end_function,6],-,-3,[n,fail]]]]]]]]] + +*** + +A=[[0,[n,downpipe],[[v,a],[v,a],[v,b]]],[1,[n,downpipe],[[v,a],[v,b],[v,c]],":-",[[0,[n,member2],[[v,c],[v,c1]]],[1,[n,equals1],[[v,c1]]],[2,[n,equals1],[[v,c12]]],[3,[n,"->"],[[4,[n,>],[[v,a],[v,c121]]],[5,[n,downpipe],[[v,c121],[v,b],[v,c]]],[6,[n,"->"],[[7,[n,>],[[v,a],[v,c122]]],[8,[n,downpipe],[[v,c122],[v,b],[v,c]]],[9,[n,fail]]]]]]]]], find_state_machine1(A,B,[[[n, downpipe], 3, [0, 1]]]),writeln1(B). 
+ +B= +[[0,[n,downpipe],[[v,a],[v,a],[v,b]]], + +[1,[n,downpipe],[[v,a],[v,b],[v,c]],":-", +[[0,["on true",1],["go after",-],["on false",-3],["go to predicates",-],[n,member2],[[v,c],[v,c1]]], +[1,["on true",2],["go after",-],["on false",-3],["go to predicates",-],[n,equals1],[[v,c1]]], +[2,["on true",3],["go after",-],["on false",-3],["go to predicates",-],[n,equals1],[[v,c12]]], +[3,["on true",4],["go after",-2],["on false",-3],[n,"->"]], +[4,["on true",5],["go after",-],["on false",6],["go to predicates",-],[n,>],[[v,a],[v,c121]]], +[5,["on true",[end_function,3]],["go after",-],["on false",-3],["go to predicates",[0,1]],[n,downpipe],[[v,c121],[v,b],[v,c]]], +[6,["on true",7],["go after",[end_function,3]],["on false",-3],[n,"->"]], +[7,["on true",8],["go after",-],["on false",9],["go to predicates",-],[n,>],[[v,a],[v,c122]]], +[8,["on true",[end_function,6]],["go after",-],["on false",-3],["go to predicates",[0,1]],[n,downpipe],[[v,c122],[v,b],[v,c]]], +[9,["on true",[end_function,6]],["go after",-],["on false",-3],["go to predicates",-],[n,fail]]]]] + +*** + +A= +[ + [[n,reverse],[[],[v,l],[v,l]]], + [[n,reverse],[[v,l],[v,m],[v,n]],":-", + [ [[n,head],[[v,l],[v,h]]], + [[n,tail],[[v,l],[v,t]]], + [[n,wrap],[[v,h],[v,h1]]], + [[n,append],[[v,h1],[v,m],[v,o]]], + [[n,reverse],[[v,t],[v,o],[v,n]]] + ] + ] +], +add_line_numbers_to_algorithm1(A,A2),writeln1(A2). + +A2= +[[0,[n,reverse],[[],[v,l],[v,l]]],[1,[n,reverse],[[v,l],[v,m],[v,n]],":-",[[0,[n,head],[[v,l],[v,h]]],[1,[n,tail],[[v,l],[v,t]]],[2,[n,wrap],[[v,h],[v,h1]]],[3,[n,append],[[v,h1],[v,m],[v,o]]],[4,[n,reverse],[[v,t],[v,o],[v,n]]]]]] + +['find_pred_sm.pl']. +find_pred_sm(Reserved_words1,_),writeln(Reserved_words1). 
+[+,-,*,/,abort,any,append,atom,brackets,call,ceiling,code,creep,cut,date,delete,equals1,equals2,equals3,equals4,exit,fail,grammar,head,is,length,letters,list,member,member2,n,not,number,or,predicatename,random,round,skip,string,string_from_file,stringconcat,stringtonumber,sys,findall_sys,t,tail,true,unwrap,v,variable,vgp,wrap,input,output,string_length,sort,intersection,read_string,writeln,atom_string,trace,notrace,sqrt,notrace] + +A= +[[0,[n,reverse],[[],[v,l],[v,l]]],[1,[n,reverse],[[v,l],[v,m],[v,n]],":-",[[0,[n,head],[[v,l],[v,h]]],[1,[n,tail],[[v,l],[v,t]]],[2,[n,wrap],[[v,h],[v,h1]]],[3,[n,append],[[v,h1],[v,m],[v,o]]],[4,[n,reverse],[[v,t],[v,o],[v,n]]]]]], +find_state_machine1(A,B,_,[+,-,*,/,abort,any,append,atom,brackets,call,ceiling,code,creep,cut,date,delete,equals1,equals2,equals3,equals4,exit,fail,grammar,head,is,length,letters,list,member,member2,n,not,number,or,predicatename,random,round,skip,string,string_from_file,stringconcat,stringtonumber,sys,findall_sys,t,tail,true,unwrap,v,variable,vgp,wrap,input,output,string_length,sort,intersection,read_string,writeln,atom_string,trace,notrace,sqrt,notrace]),writeln1(B). + +B= +[[0,[n,reverse],[[],[v,l],[v,l]]],[1,[n,reverse],[[v,l],[v,m],[v,n]],":-",[[0,["on true",1],["go after",-],["on false",-3],["go to predicates",-],[n,head],[[v,l],[v,h]]],[1,["on true",2],["go after",-],["on false",-3],["go to predicates",-],[n,tail],[[v,l],[v,t]]],[2,["on true",3],["go after",-],["on false",-3],["go to predicates",-],[n,wrap],[[v,h],[v,h1]]],[3,["on true",4],["go after",-],["on false",-3],["go to predicates",-],[n,append],[[v,h1],[v,m],[v,o]]],[4,["on true",-2],["go after",-],["on false",-3],["go to predicates",-],[n,reverse],[[v,t],[v,o],[v,n]]]]]] + + + +[debug] ?- test(77,_,C,_),add_line_numbers_to_algorithm1(C,D),writeln1(C),writeln1(D). 
+C=[[[n,findall1],[[v,a],[v,b]],":-",[[[n,findall],[[v,a1],[[n,member2],[[v,a],[v,a1]]],[v,b]]]]]] +D=[[0,[n,findall1],[[v,a],[v,b]],":-",[[0,[n,findall],[[v,a1],[v,b],[[1,[n,member2],[[v,a],[v,a1]]]]]]]]] + +[debug] ?- find_state_machine1([[0,[n,findall1],[[v,a],[v,b]],":-",[[0,[n,findall],[[v,a1],[v,b],[[1,[n,member2],[[v,a],[v,a1]]]]]]]]],F,["+","-","*","/","abort","any","append","atom","brackets","call","ceiling","code","creep","cut","date","delete","equals1","equals2","equals3","equals4","equals4_on","equals4_off","exit","fail","grammar","head","is","length","letters","list","member","member2","member3","n","not","number","or","predicatename","random","round","skip","string","string_from_file","stringconcat","stringtonumber","sys","findall_sys","t","tail","true","unwrap","v","variable","vgp","wrap","input","output","string_length","sort","intersection","read_string","writeln","atom_string","trace","notrace","sqrt","notrace","get_lang_word"]),writeln1(F). +F=[[0,[n,findall1],[[v,a],[v,b]],":-",[[0,["on true",1],["go after",-2],["on false",-3],[n,findall],[[v,a1],[v,b]]],[1,["on true",[end_function,0]],["go after",-],["on false",-3],["go to predicates",-],[n,member2],[[v,a],[v,a1]]]]]] + +**/ + +find_state_machine1(Algorithm1,Algorithm2,Pred_numbers) :- + find_state_machine2(Algorithm1,[],Algorithm2,Pred_numbers). + +find_state_machine2([],Algorithm,Algorithm,_) :- !. 
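+
+% Summary sketch of the transformation, consistent with the worked examples in
+% the comment above: find_state_machine1/3 rewrites each line-numbered body
+% line
+%
+%   [N,[n,Name],Args]
+%
+% into
+%
+%   [N,["on true",T],["go after",G],["on false",F],
+%    ["go to predicates",P],[n,Name],Args]
+%
+% where T is the line to jump to on success, G the continuation after an
+% if-then-else arm (- when unused), F the line on failure, and P the clause
+% numbers of a user-defined predicate to call (- for builtins). Special
+% values: -2 succeeds and returns from the clause, -3 fails it, and
+% [end_function,N] jumps past the if-then-else or nested block numbered N.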
+find_state_machine2(Algorithm1,Algorithm2,Algorithm3,Pred_numbers) :- + Algorithm1=[Function1|Functions], + ((Function1=[Number,Name,Arguments1,Symbol1,Body1],symbol(Symbol1,Symbol2), + findall(Arguments3,(member(Arguments2,Arguments1),slp2lp_variables(Arguments2,Arguments3)),Arguments4), + find_state_machine_body2(Body1,Body2,-2,-3,Pred_numbers), + append(Algorithm2,[[Number,Name,Arguments4,Symbol2,Body2]],Algorithm4)) + /*->true; + + ((Function1=[Number,Name,Symbol1,Body1],symbol(Symbol1,Symbol2), + + find_state_machine_body2(Body1,Body2,-2,-3,Pred_numbers), + append(Algorithm2,[[Number,Name,[],Symbol2,Body2]],Algorithm4))->true; + + ((Function1=[Number,Name,Arguments1],symbol(Symbol1,Symbol2), + findall(Arguments3,(member(Arguments2,Arguments1),slp2lp_variables(Arguments2,Arguments3)),Arguments4), + append(Algorithm2,[[Number,Name,Arguments4,Symbol2,[[[n,true]]]]],Algorithm4))->true; + + (Function1=[Number,Name],symbol(Symbol1,Symbol2), + append(Algorithm2,[[Number,Name,[],Symbol2,[[[n,true]]]]],Algorithm4)))) + */ + ), + %%writeln1([Number,Name,Arguments4,Symbol2,Body2]), + %%trace, + find_state_machine2(Functions,Algorithm4,Algorithm3,Pred_numbers). + + %%%****Delete these duplicate predicates +symbol(Symbol,Symbol) :-!. + +%%slp2lp_variables(Name1,[v,Name1]) :- predicate_or_rule_name(Name1),!. +slp2lp_variables(Name,Name) :- !. + +/** +find_state_machine_body(Body1,Body2) :- + findall(*,(member(Statement1,Body1 +find_state_machine_body(Body1,[],Body2) :- + +**/ + +%%predicate_or_rule_name([A,B]) :- atom(A),is_list(B),!. + +/** +defined in grammar.pl +predicate_or_rule_name([V_or_n,_Name]) :- (V_or_n=v->true;V_or_n=n),!.%%,atom(Name),!. +**/ + +%% x: predicate_or_rule_name(V_or_n) :- (V_or_n=v->true;V_or_n=n),fail,!. + +find_state_machine_body2([],[],_,_,_):-!.%%,Body3 + + +%%find_state_machine_body2([],Body,Body) :- !. 
+find_state_machine_body2(Body1,Body2%%,Body3 +,Return_line_true,Return_line_false,Pred_numbers) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("exit_function",Dbw_exit_function1),Dbw_exit_function1=Dbw_exit_function, +get_lang_word("on_true",Dbw_on_true1),Dbw_on_true1=Dbw_on_true, +get_lang_word("go_after",Dbw_go_after1),Dbw_go_after1=Dbw_go_after, +get_lang_word("on_false",Dbw_on_false1),Dbw_on_false1=Dbw_on_false, +get_lang_word("go_to_predicates",Dbw_go_to_predicates1),Dbw_go_to_predicates1=Dbw_go_to_predicates, +get_lang_word("fail_function",Dbw_fail_function1),Dbw_fail_function1=Dbw_fail_function, + + Body1=[[Number,[Dbw_n,"[]"],[Statements1|Statements1a]]|Statements2 + ], + not(predicate_or_rule_name(Statements1)), + not(number(Statements1)), + find_first_line_number(Statements1,Statements1_number), + + (Statements1a=[]->Statements1a_number=[Dbw_exit_function,Number]; + find_first_line_number(Statements1a,Statements1a_number)), + + (Statements2=[]->Statements2_number=Return_line_true; + find_first_line_number(Statements2,Statements2_number)), + + + %%find_first_line_number(Statements2,Statements2_number), + find_state_machine_body2([Statements1],Body3,Statements1a_number,[Dbw_fail_function,Number],Pred_numbers), %% 2->1 + + find_state_machine_body2([Statements1a],Body4,[Dbw_exit_function,Number],[Dbw_fail_function,Number],Pred_numbers), + find_state_machine_body2([Statements2],Body5,Return_line_true,Return_line_false,Pred_numbers), + + maplist(append,[[Body3,Body4,Body5]],[Body345]), + + %% [A,B1,B2,C| - A - Line number, B1 - Line to go to if true, + %% B2 - Line to go to next + %% C - Line to go to if false + +Body6=[Number,[Dbw_on_true,Statements1_number],[Dbw_go_after,Statements2_number],[Dbw_on_false,Return_line_false],[Dbw_go_to_predicates,-],[Dbw_n,"[]"]%,Body34 + ], + append([Body6],Body345,Body2), + !. 
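+
+/** The clause above handles a bracketed group, written
+[Number,[n,"[]"],Statements]. Each inner statement exits on success to the
+next statement in the group, or to [exit_function,Number] when the group is
+exhausted; every inner statement fails to [fail_function,Number]; and the
+group line itself records "go after" as the first line following the group.
+For example (hand-traced sketch), a group at line 1 containing statements at
+lines 2 and 3 compiles to a [n,"[]"] line whose "on true" is 2, with line 2's
+"on true" pointing at 3 and line 3's at [exit_function,1]. **/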
+
+
+%% Make predicates into DCGs with a bottom case, so that input can be given at
+%% any line; a line doesn't need to go back to any matching line; a single-line
+%% DCG would allow traversing the tree structure, giving a return-to number as
+%% it goes.
+
+%% If a statement fails, the machine backtracks along the track list and goes
+%% to the return line in nested if-thens.
+%% It returns to choice points when it finishes predicates.
+%% -3 if it successfully finishes the predicate.
+
+find_state_machine_body2(Body1,Body2,Return_line_true,Return_line_false,Pred_numbers) :-
+get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n,
+get_lang_word("not",Dbw_not1),Dbw_not1=Dbw_not,
+get_lang_word("exit_function",Dbw_exit_function1),Dbw_exit_function1=Dbw_exit_function,
+get_lang_word("fail_function",Dbw_fail_function1),Dbw_fail_function1=Dbw_fail_function,
+
+%%trace,
+ Body1=[[Number1,[Dbw_n,Dbw_not],Statement]|Statements2
+ ],
+ %trace,
+ find_first_line_number(Statement,Statement_number),
+ (Statements2=[]->Statements2_number=Return_line_true;
+ find_first_line_number(Statements2,Statements2_number)),
+ %%%% swap args 3,4 ?
+ find_state_machine_body2([Statement],Body3,[Dbw_exit_function,Number1],[Dbw_fail_function,Number1],Pred_numbers),
+
+ %writeln1(Body3),
+ find_state_machine_body2([Statements2],Body4,Return_line_true,Return_line_false,Pred_numbers),
+ %%Number2 is Number1+1,
+ %writeln1(Body4),
+ %%if_empty_list_then_return(Statements2,Number2,Number3),
+
+ maplist(append,[[Body3,Body4]],[Body34]),
+
+get_lang_word("on_true",Dbw_on_true1),Dbw_on_true1=Dbw_on_true,
+get_lang_word("go_after",Dbw_go_after1),Dbw_go_after1=Dbw_go_after,
+get_lang_word("on_false",Dbw_on_false1),Dbw_on_false1=Dbw_on_false,
+get_lang_word("go_to_predicates",Dbw_go_to_predicates1),Dbw_go_to_predicates1=Dbw_go_to_predicates,
+
+ Body2a=[Number1,[Dbw_on_true,Statement_number],[Dbw_go_after,Return_line_false],
+ [Dbw_on_false,Statements2_number],[Dbw_go_to_predicates,-],
+ [Dbw_n,Dbw_not]],
+ append([Body2a],Body34,Body2),
+ %append([Body5],Body4
+ % ,Body2),
+
+ !.
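+
+/** In the [n,not] clause above, the success and failure exits are effectively
+swapped to give negation as failure: the inner goal is entered via "on true",
+and its compiled body exits to [exit_function,Number1] on success and
+[fail_function,Number1] on failure. The [n,not] line itself sends "on false"
+to the statements after the negation (the negation succeeds when the inner
+goal fails), while "go after" carries the enclosing failure line. **/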
+
+
+%% s1 and s2 may need separate line numbers for nondet for or
+%% find first line number in each of s1, s2
+%% account for brackets
+%% vars have separate list for nondet, splices existing lists
+find_state_machine_body2(Body1,Body2,Return_line_true,Return_line_false,Pred_numbers) :-
+get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n,
+get_lang_word("or",Dbw_or1),Dbw_or1=Dbw_or,
+get_lang_word("exit_function",Dbw_exit_function1),Dbw_exit_function1=Dbw_exit_function,
+ Body1=[[Number,[Dbw_n,Dbw_or],[Statements1,Statements2]]|Statements3],
+
+ find_first_line_number(Statements1,Statements1_number),
+ find_first_line_number(Statements2,Statements2_number),
+ (Statements3=[]->Statements3_number=Return_line_true;
+ find_first_line_number(Statements3,Statements3_number)),
+ find_state_machine_body2([Statements1],Body3,[Dbw_exit_function,Number],Statements2_number,Pred_numbers),
+ find_state_machine_body2([Statements2],Body4,[Dbw_exit_function,Number],Return_line_false,Pred_numbers),
+ find_state_machine_body2([Statements3],Body5,Return_line_true,Return_line_false,Pred_numbers),
+ %%Number2 is Number1+1,
+ %%if_empty_list_then_return(Statements3,Number2,Number3),
+ get_lang_word("on_true",Dbw_on_true1),Dbw_on_true1=Dbw_on_true,
+get_lang_word("go_after",Dbw_go_after1),Dbw_go_after1=Dbw_go_after,
+get_lang_word("on_false",Dbw_on_false1),Dbw_on_false1=Dbw_on_false,
+get_lang_word("go_to_predicates",Dbw_go_to_predicates1),Dbw_go_to_predicates1=Dbw_go_to_predicates,
+
+ maplist(append,[[Body3,Body4,Body5]],[Body345]),
+ Body6=[Number,[Dbw_on_true,Statements1_number],[Dbw_go_after,Statements3_number],[Dbw_on_false,Return_line_false],[Dbw_go_to_predicates,-],[Dbw_n,Dbw_or]],%Body34
+ %],
+ append([Body6],Body345,Body2),
+ !.
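+
+/** For [n,or] above, the two branches are chained through their failure
+exits: the first branch is compiled with the first line of the second branch
+as its failure return, so the machine only tries the second branch when the
+first fails, and both branches succeed to [exit_function,Number]. The
+statements after the disjunction become the "go after" target of the or
+line. **/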
+ +%% If true at end of section, return to next line + +find_state_machine_body2(Body1,Body2,Return_line_true,Return_line_false,Pred_numbers) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("exit_function",Dbw_exit_function1),Dbw_exit_function1=Dbw_exit_function, + + get_lang_word("on_true",Dbw_on_true1),Dbw_on_true1=Dbw_on_true, +get_lang_word("go_after",Dbw_go_after1),Dbw_go_after1=Dbw_go_after, +get_lang_word("on_false",Dbw_on_false1),Dbw_on_false1=Dbw_on_false, +get_lang_word("go_to_predicates",Dbw_go_to_predicates1),Dbw_go_to_predicates1=Dbw_go_to_predicates, + + Body1=[[Number,[Dbw_n,"->"],[Statements1,Statements2]]|Statements3], + + find_first_line_number(Statements1,Statements1_number), + find_first_line_number(Statements2,Statements2_number), + (Statements3=[]->Statements3_number=Return_line_true; + find_first_line_number(Statements3,Statements3_number)), +find_state_machine_body2([Statements1],Body3,Statements2_number,Return_line_false,Pred_numbers), + find_state_machine_body2([Statements2],Body4,[Dbw_exit_function,Number],Return_line_false,Pred_numbers), + + find_state_machine_body2([Statements3],Body5,Return_line_true,Return_line_false,Pred_numbers), + + maplist(append,[[Body3,Body4,Body5]],[Body345]), + %append(Body3,Body4,Body34), + Body6=[Number,[Dbw_on_true,Statements1_number],[Dbw_go_after,Statements3_number],[Dbw_on_false,Return_line_false],[Dbw_go_to_predicates,-],[Dbw_n,"->"]], + append([Body6],Body345,Body2), + %append(Body61,Body5,Body2), + + !. 
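+
+/** The binary [n,"->"] clause above compiles if-then without an else: the
+condition's success return is the first line of the then-branch, its failure
+return is the enclosing failure line, and the then-branch succeeds to
+[exit_function,Number]. **/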
+ + + + +find_state_machine_body2(Body1,Body2,Return_line_true,Return_line_false,Pred_numbers) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("exit_function",Dbw_exit_function1),Dbw_exit_function1=Dbw_exit_function, + get_lang_word("on_true",Dbw_on_true1),Dbw_on_true1=Dbw_on_true, +get_lang_word("go_after",Dbw_go_after1),Dbw_go_after1=Dbw_go_after, +get_lang_word("on_false",Dbw_on_false1),Dbw_on_false1=Dbw_on_false, +get_lang_word("go_to_predicates",Dbw_go_to_predicates1),Dbw_go_to_predicates1=Dbw_go_to_predicates, + Body1=[[Number,[Dbw_n,"->"],[Statements1,Statements2,Statements2a]]|Statements3], + %trace, + + find_first_line_number(Statements1,Statements1_number), + find_first_line_number(Statements2,Statements2_number), + find_first_line_number(Statements2a,Statements2a_number), + %%trace, + (Statements3=[]->Statements3_number=Return_line_true; + find_first_line_number(Statements3,Statements3_number)), + find_state_machine_body2([Statements1],Body3,Statements2_number,Statements2a_number,Pred_numbers), + find_state_machine_body2([Statements2],Body4,[Dbw_exit_function,Number],Return_line_false,Pred_numbers), + %%trace, + find_state_machine_body2([Statements2a],Body5,[Dbw_exit_function,Number],Return_line_false,Pred_numbers), + find_state_machine_body2([Statements3],Body6,Return_line_true,Return_line_false,Pred_numbers), + + maplist(append,[[Body3,Body4,Body5,Body6]],[Body3456]), + Body7=[Number,[Dbw_on_true,Statements1_number],[Dbw_go_after,Statements3_number],[Dbw_on_false,Return_line_false],[Dbw_go_to_predicates,-],[Dbw_n,"->"]], + append([Body7],Body3456,Body2), + %append(Body71,Body6,Body2), + + !. 
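+
+/** The ternary [n,"->"] clause above is if-then-else: the condition's
+success return is the first line of the then-branch and its failure return is
+the first line of the else-branch, so exactly one of the two branches runs.
+Both branches succeed to [exit_function,Number] and fail to the enclosing
+failure line. **/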
+ + +find_state_machine_body2(Body1,Body2,Return_line_true,Return_line_false,Pred_numbers) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("findall",Dbw_findall1),Dbw_findall1=Dbw_findall, +get_lang_word("findall_exit_function",Dbw_findall_exit_function1),Dbw_findall_exit_function1=Dbw_findall_exit_function, +get_lang_word("findall_fail_function",Dbw_findall_fail_function1),Dbw_findall_fail_function1=Dbw_findall_fail_function, + get_lang_word("on_true",Dbw_on_true1),Dbw_on_true1=Dbw_on_true, +get_lang_word("go_after",Dbw_go_after1),Dbw_go_after1=Dbw_go_after, +get_lang_word("on_false",Dbw_on_false1),Dbw_on_false1=Dbw_on_false, +get_lang_word("go_to_predicates",Dbw_go_to_predicates1),Dbw_go_to_predicates1=Dbw_go_to_predicates, + +Body1=[[Number,[Dbw_n,Dbw_findall],[Statements1,Statements2,Statements2a]]|Statements3], + %trace, + + %find_first_line_number(Statements1,Statements1_number), + %find_first_line_number(Statements2,Statements2_number), + find_first_line_number(Statements2a,Statements2a_number), + %%trace, + (Statements3=[]->Statements3_number=Return_line_true; + find_first_line_number(Statements3,Statements3_number)), + %find_state_machine_body2([Statements1],Body3,Statements2_number,Statements2a_number,Pred_numbers), + %find_state_machine_body2([Statements2],Body4,[end_function,Number],Return_line_false,Pred_numbers), + %%trace, + find_state_machine_body2(Statements2a,Body5,[Dbw_findall_exit_function,Number],[Dbw_findall_fail_function,Number]%Return_line_false + ,Pred_numbers), + find_state_machine_body2(Statements3,Body6,Return_line_true,Return_line_false,Pred_numbers), + + maplist(append,[[%Body3,Body4, + Body5,Body6]],[Body56]), + Body7=[Number,[Dbw_on_true,Statements2a_number],[Dbw_go_after,Statements3_number],[Dbw_on_false,Return_line_false],[Dbw_go_to_predicates,-],[Dbw_n,Dbw_findall],[Statements1,Statements2]], + append([Body7],Body56,Body2), + %append(Body71,Body6,Body2), + + !. 
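+
+/** For [n,findall] above, only the goal body (Statements2a) is compiled into
+state-machine lines; the result template and output variable (Statements1 and
+Statements2) stay on the findall line itself as extra arguments. The goal
+exits through the dedicated markers [findall_exit_function,Number] and
+[findall_fail_function,Number] so that the interpreter running the state
+machine can collect solutions over backtracking before continuing at the
+"go after" line. **/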
+ + +find_state_machine_body2(Body1,Body2,Return_line_true,Return_line_false,Pred_numbers) :- + Body1=[]->Body2=[]; + (Body1=[Statement|Statements], + (( + (get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, + Statement=[N,[Dbw_n,_]|_],number(N))-> + (%not(predicate_or_rule_name(Statement)), + %not(number(Statement)), + (Statements=[]->Statements_number=Return_line_true; + find_first_line_number(Statements,Statements_number)), + find_state_machine_statement1(Statement,Result1,Statements_number,Return_line_false,Pred_numbers)); + +find_state_machine_body2(Statement,Body2,Return_line_true,Return_line_false,Pred_numbers))), + find_state_machine_body2(Statements,Result2,Return_line_true,Return_line_false,Pred_numbers)), + append_list2([Result1,Result2],Body2),!. + + +find_state_machine_body2(Body1,Body2,Return_line_true,Return_line_false,Pred_numbers) :- + Body1=[]->Body2=[]; + (Body1=[Statement|Statements], + (( + (get_lang_word("v",Dbw_v1),Dbw_v1=Dbw_v, + Statement=[N,[Dbw_v,_]|_],number(N))-> + (%not(predicate_or_rule_name(Statement)), + %not(number(Statement)), + (Statements=[]->Statements_number=Return_line_true; + find_first_line_number(Statements,Statements_number)), + find_state_machine_statement1(Statement,Result1,Statements_number,Return_line_false,Pred_numbers)); + +find_state_machine_body2(Statement,Body2,Return_line_true,Return_line_false,Pred_numbers))), + find_state_machine_body2(Statements,Result2,Return_line_true,Return_line_false,Pred_numbers)), + append_list2([Result1,Result2],Body2),!. 
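+
+/** The two clauses above dispatch ordinary statements: a call [N,[n,Name]|_]
+or [N,[v,Name]|_] whose first element is a line number is passed to
+find_state_machine_statement1/5 with the first line of the remaining
+statements as its success return. find_state_machine_statement1 (below) looks
+the callee up in Pred_numbers to fill "go to predicates" with its clause
+numbers, or - when there is no entry; [v,Name] calls get * there, since the
+called predicate is only known at run time. **/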
+ + /* +find_state_machine_body2(Body1,Body2,Return_line_true,Return_line_false,Pred_numbers) :- + + Body1=[[Number,[n,maplist],[Statements1,Statements2,Statements2a,Statements2b]]|Statements3], + %trace, + + %find_first_line_number(Statements1,Statements1_number), + %find_first_line_number(Statements2,Statements2_number), + find_first_line_number(Statements2a,Statements2a_number), + %%trace, + (Statements3=[]->Statements3_number=Return_line_true; + find_first_line_number(Statements3,Statements3_number)), + %find_state_machine_body2([Statements1],Body3,Statements2_number,Statements2a_number,Pred_numbers), + %find_state_machine_body2([Statements2],Body4,[end_function,Number],Return_line_false,Pred_numbers), + %%trace, + find_state_machine_body2(Statements2a,Body5,[end_function,Number],Return_line_false,Pred_numbers), + find_state_machine_body2(Statements3,Body6,Return_line_true,Return_line_false,Pred_numbers), + + append_list2([%Body3,Body4, + Body5],Body345), + Body7=[Number,["on true",Statements2a_number],["go after",Statements3_number],["on false",Return_line_false],[n,findall],[Statements1,Statements2]], + append([Body7],Body345,Body71), + append(Body71,Body6,Body2), + + !. 
+ */ + /** +find_state_machine_statement1(Statement,Result1,Return_line_true,Return_line_false,Pred_numbers1) :- + ((Statement=[Number,[n,Name1],[Arguments1,Arguments2,Arguments3]], + + Name1=findall, +%trace, + length(Arguments,Arity1), + + find_state_machine_body2(Arguments3,Result3,Return_line_true,Return_line_false,Pred_numbers1), + + %atom_string(Name1,Name), + %(member(Name,Reserved_words)->Pred_numbers2=none;(member([[n,Name],Arity1,Pred_numbers2],Pred_numbers1))), + %(Name1=downpipe->trace;true), + %(member([[n,Name1],Arity1,Pred_numbers2a],Pred_numbers1)->Pred_numbers2=Pred_numbers2a;Pred_numbers2=(-)), + + %Arguments=Result2, + %findall(Argument,(member(Argument,Arguments),(predicate_or_rule_name(Argument))),Result2), + Result1=[[Number,["on true",Return_line_true],["go after",-],["on false",Return_line_false],["go to predicates",-],[n,Name1],[Arguments1,Arguments2]]])), + + append_list2([Result1,Result3],Result2),!. + **/ + +find_state_machine_statement1(Statement,Result1,Return_line_true,Return_line_false,Pred_numbers1) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("findall",Dbw_findall1),Dbw_findall1=Dbw_findall, +get_lang_word("on_true",Dbw_on_true1),Dbw_on_true1=Dbw_on_true, +get_lang_word("go_after",Dbw_go_after1),Dbw_go_after1=Dbw_go_after, +get_lang_word("on_false",Dbw_on_false1),Dbw_on_false1=Dbw_on_false, +get_lang_word("go_to_predicates",Dbw_go_to_predicates1),Dbw_go_to_predicates1=Dbw_go_to_predicates, + ((Statement=[Number,[Dbw_n,Name1],Arguments], + + not(Name1=Dbw_findall), + + length(Arguments,Arity1), + %atom_string(Name1,Name), + %(member(Name,Reserved_words)->Pred_numbers2=none;(member([[n,Name],Arity1,Pred_numbers2],Pred_numbers1))), + %(Name1=downpipe->trace;true), + (member([[Dbw_n,Name1],Arity1,Pred_numbers2a],Pred_numbers1)->Pred_numbers2=Pred_numbers2a;Pred_numbers2=(-)), + + Arguments=Result2, + %findall(Argument,(member(Argument,Arguments),(predicate_or_rule_name(Argument))),Result2), + + 
Result1=[[Number,[Dbw_on_true,Return_line_true],[Dbw_go_after,-],[Dbw_on_false,Return_line_false],[Dbw_go_to_predicates,Pred_numbers2],[Dbw_n,Name1],Result2]])->true; + (%%Statement=[_,[n,Name]], + %%trace, + Statement=[Number,[Dbw_n,Name1]], + length([],Arity1), + %atom_string(Name1,Name), +%(member(Name,Reserved_words)->Pred_numbers2=none;(member([[n,Name],Arity1,Pred_numbers2],Pred_numbers1))), +(member([[Dbw_n,Name1],Arity1,Pred_numbers2a],Pred_numbers1)->Pred_numbers2=Pred_numbers2a;Pred_numbers2=(-)), + + +Result1=[[Number,[Dbw_on_true,Return_line_true],[Dbw_go_after,-],[Dbw_on_false,Return_line_false],[Dbw_go_to_predicates,Pred_numbers2],[Dbw_n,Name1]]])). + + + + find_state_machine_statement1(Statement,Result1,Return_line_true,Return_line_false,_Pred_numbers1) :- +get_lang_word("v",Dbw_v1),Dbw_v1=Dbw_v, +get_lang_word("findall",Dbw_findall1),Dbw_findall1=Dbw_findall, +get_lang_word("on_true",Dbw_on_true1),Dbw_on_true1=Dbw_on_true, +get_lang_word("go_after",Dbw_go_after1),Dbw_go_after1=Dbw_go_after, +get_lang_word("on_false",Dbw_on_false1),Dbw_on_false1=Dbw_on_false, +get_lang_word("go_to_predicates",Dbw_go_to_predicates1),Dbw_go_to_predicates1=Dbw_go_to_predicates, + ((Statement=[Number,[Dbw_v,Name1],Arguments], + %trace, + + not(Name1=Dbw_findall), + + %length(Arguments,Arity1), + %atom_string(Name1,Name), + %(member(Name,Reserved_words)->Pred_numbers2=none;(member([[n,Name],Arity1,Pred_numbers2],Pred_numbers1))), + %(Name1=downpipe->trace;true), + %(member([[v,Name1],Arity1,Pred_numbers2a],Pred_numbers1)->Pred_numbers2=Pred_numbers2a;Pred_numbers2=(-)), + + Pred_numbers2=(*), + + Arguments=Result2, + %findall(Argument,(member(Argument,Arguments),(predicate_or_rule_name(Argument))),Result2), + Result1=[[Number,[Dbw_on_true,Return_line_true],[Dbw_go_after,-],[Dbw_on_false,Return_line_false],[Dbw_go_to_predicates,Pred_numbers2],[Dbw_v,Name1],Result2]])->true; + (%%Statement=[_,[n,Name]], + %%trace, + Statement=[Number,[Dbw_v,Name1]], + %length([],Arity1), + 
%atom_string(Name1,Name), +%(member(Name,Reserved_words)->Pred_numbers2=none;(member([[n,Name],Arity1,Pred_numbers2],Pred_numbers1))), +%(member([[v,Name1],Arity1,Pred_numbers2a],Pred_numbers1)->Pred_numbers2=Pred_numbers2a;Pred_numbers2=(-)), + Pred_numbers2=(*), + +Result1=[[Number,[Dbw_on_true,Return_line_true],[Dbw_go_after,-],[Dbw_on_false,Return_line_false],[Dbw_go_to_predicates,Pred_numbers2],[Dbw_v,Name1]]])). + %%writeln([[Number,Return_line_true,Return_line_false,[n,Name],Result2]]). + +%%if_empty_list_then_return([],_Number,-3) :- !. +%%if _empty_list_then_return(Statements,Number,Number) :- +%% not(Statements=[]),!. + +find_first_line_number([],-2) :- !. +find_first_line_number(Number,Number) :- number(Number),!. +find_first_line_number([[Number|_]|_],Number) :- number(Number),!. +find_first_line_number([Statement|_],Number) :- +%%trace, + not((Statement=[Number|_],number(Number))), + find_first_line_number(Statement,Number),!. + +/** + Result1=[[Number,Return_line_true,-,Return_line_false,[n,Name]]])). + + Result1=[[Number,Return_line_true,-,Return_line_false,[n,Name],Result2]])->true; + +**/ \ No newline at end of file diff --git a/SSI/ssi_listrecursion4.pl b/SSI/ssi_listrecursion4.pl new file mode 100644 index 0000000000000000000000000000000000000000..5ec6ef2af8fe2000bfdbd82e498ad97d00471423 --- /dev/null +++ b/SSI/ssi_listrecursion4.pl @@ -0,0 +1,490 @@ +/* +interpretstatement1(ssi,_F0,_Functions,[[Dbw_n,Dbw_member],[Variable1,Variable2]],Vars1,Vars2,true,nocut,[]) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("member",Dbw_member1),Dbw_member1=Dbw_member, +%%writeln1(8), + ssi_interpretpart(member,Variable1,Variable2,Vars1,Vars2). 
+*/ + + + + + +interpretstatement1(ssi,_Functions0,_Functions,Query1,Vars1,Vars8,true,nocut,_) :- +%writeln1([*,_Functions0,_Functions]), +get_lang_word("v",Dbw_v1),Dbw_v1=Dbw_v, +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("call",Dbw_call1),Dbw_call1=Dbw_call, + +%%writeln1("h1/10"), +%trace, +%find_pred_sm(Reserved_words1), + ((Query1=[[Dbw_n,Dbw_call],[[lang,Lang1],Debug1,[Function,Arguments],Functions%,Result + ]],Tm=off%, + %not(member(Dbw_call,Reserved_words1)) + )->true; + (Query1=[[Dbw_n,Dbw_call],[[lang,Lang1],Debug1,[Function,Arguments],Types,Modes,Functions%,Result + ]],Tm=on)), + + %trace, + + lang(Lang2a), + types(Types2a), + (Types2a=on->(typestatements(TypeStatements2a), + modestatements(ModeStatements2a));true), + + (Lang1=same->lang(Lang2);Lang2=Lang1), + (Debug1=same->debug(Debug2);Debug2=Debug1), + + %%not(Function=[n,grammar]->true;Function=[n,grammar_part]), **** +%%writeln1(["Arguments",Arguments,"Vars1",Vars1]), + %%***writeln1(substitutevarsA1(Arguments,Vars1,[],Vars3,[],FirstArgs)), + ((Function=[Dbw_v,Function2], + not(reserved_word2(Function2)))-> + (append([Function],Arguments,Arguments1), + substitutevarsA1(Arguments1,Vars1,[],Vars3,[],FirstArgs), + Vars3=[Function1|Vars31], + Query2=[Function1,Vars31]); + (substitutevarsA1(Arguments,Vars1,[],Vars3,[],FirstArgs), + %simplify(Vars32,Vars3), %%% var to value, after updatevars: more vars to values, and select argument vars from latest vars +%%writeln1([substitutevarsA1,arguments,Arguments,vars1,Vars1,vars3,Vars3,firstargs,FirstArgs]), + Query2=[Function,Vars3])), %% Bodyvars2? 
+%% debug(on)->writeln1([call,[Function,[Vars3]]]), +%%writeln1(["Query2",Query2,"Functions0",Functions0]), + + + %interpret2(Query2,Functions0,Functions0,Result1), + + remember_and_turn_off_debug(Debug), + + + + + %trace, + + query_box(Query2,_Query3,Functions,Functions1), + convert_to_grammar_part1(Functions1,[],Functions2,_), + %insert_cuts(Functions2a,Functions2), +%Functions1=Functions2, + %writeln1(convert_to_grammar_part1(Functions1,[],Functions2,_)), %trace, + %writeln1(Functions2), + %%pp3(Functions2), + %%writeln1(lucianpl11(Debug,Query,Functions2,Functions2,Result)), + %findall(Result1, + + %trace, + add_line_numbers_to_algorithm1(Functions2,Functions2a), +%Functions2=Functions2a, + %writeln1(add_line_numbers_to_algorithm1(Functions2,Functions2a)), + %writeln1(Functions2a), +%append(Functions2a,Functions1a,Functions2b), + + pred_numbers(Pred_numbers0), + + %find_pred_sm(Reserved_words),%,"en"), +find_pred_numbers(Functions2a,[],%Reserved_words, +Pred_numbers), + + retractall(pred_numbers(_)), + assertz(pred_numbers(Pred_numbers)), +%trace, + + find_state_machine1(Functions2a,Functions3a,Pred_numbers), + + + + + + %trace, + ((Tm=off->international_lucianpl1([lang,Lang2],%off,% + Debug2, + Query2,Functions3a,Result1a) + %international_interpret([lang,Lang2],off,%Debug2, +%Query2,Functions,Result1a) +; + +international_lucianpl1([lang,Lang2],%off,% +Debug2, +Query2,Types,Modes,Functions3a,Result1a) %international_interpret([lang,Lang2],off,%Debug2 + %Query2,Types,Modes,Functions,Result1a) + )->true;(turn_back_debug(Debug),false)), + + turn_back_debug(Debug), + + + member(Result1,Result1a), + + retractall(pred_numbers(_)), + assertz(pred_numbers(Pred_numbers0)), + + retractall(lang(_)), + assertz(lang(Lang2a)), + + retractall(types(_)), + assertz(types(Types2a)), + + (Types2a=on->( + retractall(typestatements(_)), + %findall([A,C],(member([A,B],TypeStatements2a),expand_types(B,[],C)),TypeStatements2a1), + +assertz(typestatements(TypeStatements2a)), + 
retractall(modestatements(_)), + assertz(modestatements(ModeStatements2a)));true), + + + updatevars2(FirstArgs,Result1,[],Vars5), + updatevars3(Vars1,Vars5,Vars6), + reverse(Vars6,[],Vars7), + ((not(Vars7=[])-> + %%Vars7=[Var71|Vars72], + (%trace, + unique1(Vars7,[],Vars8)%,notrace + ) +);( +%%writeln1(here1), + Vars8=[])). + + + + +interpretstatement1(ssi,Functions0,_Functions,Query1,Vars1,Vars8,true,nocut,_) :- + + %trace, + %writeln(interpretstatement1(ssi,Functions0,_Functions,Query1,Vars1,Vars8,true,nocut)), + +get_lang_word("v",Dbw_v1),Dbw_v1=Dbw_v, +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("call",Dbw_call1),Dbw_call1=Dbw_call, + +%%writeln1("h1/10"), +%trace, +%writeln([Functions0,Functions0]), +%find_pred_sm(Reserved_words1), + + %trace, + ((Query1=[[Dbw_n,Dbw_call],[Function,Arguments]]%, not_reserved_word(Function,Reserved_words1) + )->true; +(Query1=[Function,Arguments], +Function=[Dbw_v,_Function2] +%not(reserved_word2(Function2)) +))%,Function=[Dbw_n1,Function_a],atom_string(Function_a,Function_s), +%,not_reserved_word(Function,Reserved_words1)) +%) +, + +%trace, + %%not(Function=[n,grammar]->true;Function=[n,grammar_part]), **** +%%writeln1(["Arguments",Arguments,"Vars1",Vars1]), + %%***writeln1(substitutevarsA1(Arguments,Vars1,[],Vars3,[],FirstArgs)), + ((Function=[Dbw_v,Function2], + not(reserved_word2(Function2)))-> + (%trace, + append([Function],Arguments,Arguments1), + %trace, + substitutevarsA1(Arguments1,Vars1,[],Vars3,[],FirstArgs), + Vars3=[Function1|Vars31], + Query2=[Function1,Vars31]); + (%trace, + substitutevarsA1(Arguments,Vars1,[],Vars3,[],FirstArgs), + %simplify(Vars32,Vars3), %%% var to value, after updatevars: more vars to values, and select argument vars from latest vars +%writeln1([substitutevarsA1,arguments,Arguments,vars1,Vars1,vars3,Vars3,firstargs,FirstArgs]), + Query2=[Function,Vars3])), %% Bodyvars2? 
+%(Function=[n,compound213]->%true +%trace +%;true), + %trace, +%% debug(on)->writeln1([call,[Function,[Vars3]]]), +%%writeln1(["Query2",Query2,"Functions0",Functions0]), +%trace, +%writeln1(interpret2(Query2,Functions0,Functions0,Result1)), + + remember_and_turn_off_debug(Debug), + turndebug(off), %trace, + (lucianpl1(off,%,off,%Debug, + Query2,Functions0,Result1a)%interpret2(Query2,Functions0,Functions0,Result1) + ->true;(turn_back_debug(Debug),false)), + + turn_back_debug(Debug), + + %trace, + member(Result1,Result1a), + %writeln1(interpret2(Query2,Functions0,Functions0,Result1)), + %writeln1(updatevars2(FirstArgs,Result1,[],Vars5)), + + updatevars2(FirstArgs,Result1,[],Vars5), + updatevars3(Vars1,Vars5,Vars6), + reverse(Vars6,[],Vars7), + ((not(Vars7=[])-> + %%Vars7=[Var71|Vars72], + (%trace, + unique1(Vars7,[],Vars8)%,notrace + ) +);( +%%writeln1(here1), + Vars8=[])). +%%**** reverse and take first instance of each variable. + %%findresult3(Arguments,Vars6,[],Result2) +%%writeln1(["FirstArgs",FirstArgs,"Result1",Result1,"Vars5",Vars5,"Vars4",Vars4]), +%%writeln1(["Vars1:",Vars1,"Vars4:",Vars4]), +%% debug(on)->writeln1([exit,[Function,[Result2]]]). +/* +interpretstatement1(ssi,Functions0,_Functions,Query,Vars,Vars,true,nocut,_) :- +find_pred_sm(Reserved_words1), + + Query=[Function], + %trace, + not_reserved_word(Function,Reserved_words1), +%debug_call(Skip,[Function]), + + remember_and_turn_off_debug(Debug), + turndebug(off), %trace, + (lucianpl1(off,%Debug, + Query,Functions0,_Result1)%interpret2(Query,Functions0,Functions0,_Result1) + ->true;(turn_back_debug(Debug),false)), + + turn_back_debug(Debug) + + + + +,!. 
+ +*/ + + +interpretstatement1(ssi,_F0,_Functions,[[Dbw_n,Dbw_stringconcat],[Variable1,Variable2,Variable3]],Vars1,Vars2,true,nocut,Vars2c) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +%trace, +((get_lang_word("stringconcat",Dbw_stringconcat1),Dbw_stringconcat1=Dbw_stringconcat)->true; +Dbw_stringconcat=string_concat), +%%writeln1(8), + + ssi_interpretpart(stringconcat,Variable1,Variable2,Variable3,Vars1,Vars2,Vars2c). + +interpretstatement1(ssi,_F0,_Functions,[[Dbw_n,Dbw_append],[Variable1,Variable2,Variable3]],Vars1,Vars2,true,nocut,Vars2c) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +%trace, +get_lang_word("append",Dbw_append1),Dbw_append1=Dbw_append, +%%writeln1(8), + + ssi_interpretpart(append,Variable1,Variable2,Variable3,Vars1,Vars2,Vars2c). + +interpretstatement1(ssi,_F0,_Functions,[[Dbw_n,Dbw_member2],[Variable2,Variable1]],Vars1,Vars2,true,nocut,Vars2c) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +%trace, +((get_lang_word("member2",Dbw_member21), +Dbw_member21=Dbw_member2)-> ssi_interpretpart(member2,Dbw_member2,Variable2,Variable1,Vars1,Vars2,Vars2c); +(get_lang_word("member",Dbw_member22), +Dbw_member22=Dbw_member2, +%%writeln1(8), +%trace, + ssi_interpretpart(member2,Dbw_member2,Variable1,Variable2,Vars1,Vars2,Vars2c))). + +interpretstatement1(ssi,_F0,_Functions,[[Dbw_n,Dbw_member3],[Variable1,Variable2]],Vars1,Vars2,true,nocut,Vars2c) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +%trace, +get_lang_word("member3",Dbw_member31),Dbw_member31=Dbw_member3, +%%writeln1(8), + ssi_interpretpart(member3,Variable1,Variable2,Vars1,Vars2,Vars2c). + +/* +interpretstatement4(ssi,_F0,_Functions,[[Dbw_n,Dbw_member],[Variable1,Variable2]],Vars1,Vars2,true,nocut,[],AC) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("member",Dbw_member1),Dbw_member1=Dbw_member, +%%writeln1(8), + ssi_interpretpart(member,Variable1,Variable2,Vars1,Vars2,AC). 
+*/
+interpretstatement4(ssi,_F0,_Functions,[[Dbw_n,Dbw_member2],[Variable1,Variable2]],Vars1,Vars2,true,nocut,Vars2c,AC) :-
+get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n,
+%trace,
+((get_lang_word("member2",Dbw_member21),
+Dbw_member21=Dbw_member2)->true;
+(get_lang_word("member",Dbw_member22),
+Dbw_member22=Dbw_member2)),
+
+%%writeln1(8),
+
+ ssi_interpretpart(member2,Dbw_member2,Variable1,Variable2,Vars1,Vars2,Vars2c,AC).
+
+
+
+
+
+interpretstatement4(ssi,_F0,_Functions,[[Dbw_n,Dbw_stringconcat],[Variable1,Variable2,Variable3]],Vars1,Vars2,true,nocut,Vars2c,AC) :-
+get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n,
+%trace,
+((get_lang_word("stringconcat",Dbw_stringconcat1),Dbw_stringconcat1=Dbw_stringconcat)->true;
+Dbw_stringconcat=string_concat),
+%%writeln1(8),
+
+ ssi_interpretpart(stringconcat,Variable1,Variable2,Variable3,Vars1,Vars2,Vars2c,AC).
+
+interpretstatement4(ssi,_F0,_Functions,[[Dbw_n,Dbw_append],[Variable1,Variable2,Variable3]],Vars1,Vars2,true,nocut,Vars2c,AC) :-
+get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n,
+%trace,
+get_lang_word("append",Dbw_append1),Dbw_append1=Dbw_append,
+%%writeln1(8),
+
+ ssi_interpretpart(append,Variable1,Variable2,Variable3,Vars1,Vars2,Vars2c,AC).
+
+interpretstatement4(ssi,_F0,_Functions,[[Dbw_n,Dbw_member3],[Variable1,Variable2]],Vars1,Vars2,true,nocut,Vars2c,AC) :-
+get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n,
+%trace,
+get_lang_word("member3",Dbw_member31),Dbw_member31=Dbw_member3,
+%%writeln1(8),
+ ssi_interpretpart(member3,Variable1,Variable2,Vars1,Vars2,Vars2c,AC).
+
+
+interpretstatement1(ssi,F0,Functions,[[Dbw_n,Dbw_name]|Args],Vars1,Vars2,T,C,[]) :- %writeln(here),
+interpretstatement1(ssi,F0,Functions,[[Dbw_n,Dbw_name]|Args],Vars1,Vars2,T,C).
+
+%%%%
+
+
+/*
+interpretstatement1(ssi,_F0,_Functions,[[Dbw_n,Dbw_equals4_on]|_],Vars,Vars,true,nocut,[]) :- %writeln(here),
+interpretstatement1(non-ssi,_F0,_Functions,[[Dbw_n,Dbw_equals4_on]|_],Vars,Vars,true,nocut).
+ +interpretstatement1(ssi,_F0,_Functions,[[Dbw_n,Dbw_equals4_off]|_],Vars,Vars,true,nocut,[]) :- %writeln(here), +interpretstatement1(non-ssi,_F0,_Functions,[[Dbw_n,Dbw_equals4_off]|_],Vars,Vars,true,nocut). + +interpretstatement1(ssi,_F0,_Functions,[[n,trace2]|_],Vars,Vars,true,nocut,[]) :- %writeln(here), +interpretstatement1(non-ssi,_F0,_Functions,[[n,trace2]|_],Vars,Vars,true,nocut). + +interpretstatement1(ssi,_F0,_Functions,[[Dbw_n,Dbw_trace]|_],Vars,Vars,true,nocut,[]) :- +interpretstatement1(non-ssi,_F0,_Functions,[[Dbw_n,Dbw_trace]|_],Vars,Vars,true,nocut). + +interpretstatement1(ssi,_F0,_Functions,[[Dbw_n,Dbw_notrace]|_],Vars,Vars,true,nocut,[]) :- +interpretstatement1(non-ssi,_F0,_Functions,[[Dbw_n,Dbw_notrace]|_],Vars,Vars,true,nocut). + +% * Different in ssi + +interpretstatement1(ssi,_F0,_Functions,[[Dbw_n,Dbw_cut]|_],Vars,Vars,true,cut,[]) :- +interpretstatement1(non-ssi,_F0,_Functions,[[Dbw_n,Dbw_cut]|_],Vars,Vars,true,cut). + +interpretstatement1(ssi,_F0,_Functions,[[Dbw_n,Dbw_true]|_],Vars,Vars,_,nocut,[]) :- +interpretstatement1(non-ssi,_F0,_Functions,[[Dbw_n,Dbw_true]|_],Vars,Vars,_,nocut). +interpretstatement1(ssi,_F0,_Functions,[[Dbw_n,Dbw_fail]|_],Vars,Vars,_,nocut,[]) :- +interpretstatement1(non-ssi,_F0,_Functions,[[Dbw_n,Dbw_fail]|_],Vars,Vars,_,nocut). + +interpretstatement1(ssi,_F0,_Functions,[[Dbw_n,Dbw_atom],[Variable]],Vars,Vars,true,nocut,[]) :- +interpretstatement1(non-ssi,_F0,_Functions,[[Dbw_n,Dbw_atom],[Variable]],Vars,Vars,true,nocut). + +interpretstatement1(ssi,_F0,_Functions,[[Dbw_n,Dbw_string],[Variable]],Vars,Vars,true,nocut,[]) :- +interpretstatement1(non-ssi,_F0,_Functions,[[Dbw_n,Dbw_string],[Variable]],Vars,Vars,true,nocut). + +interpretstatement1(ssi,_F0,_Functions,[[Dbw_n,Dbw_number],[Variable]],Vars,Vars,true,nocut,[]) :- +interpretstatement1(non-ssi,_F0,_Functions,[[Dbw_n,Dbw_number],[Variable]],Vars,Vars,true,nocut). 
+ +interpretstatement1(ssi,_F0,_Functions,[[Dbw_n,Dbw_letters],[Variable]],Vars,Vars,true,nocut,[]) :- +interpretstatement1(non-ssi,_F0,_Functions,[[Dbw_n,Dbw_letters],[Variable]],Vars,Vars,true,nocut). + +interpretstatement1(ssi,_F0,_Functions,[[Dbw_n,Dbw_variable],[Variable]],Vars,Vars,true,nocut,[]) :- +interpretstatement1(non-ssi,_F0,_Functions,[[Dbw_n,Dbw_variable],[Variable]],Vars,Vars,true,nocut). + +interpretstatement1(ssi,_F0,_Functions,[[Dbw_n,Operator],[Variable1,Variable2]],Vars1,Vars2,true,nocut,[]) :- +interpretstatement1(non-ssi,_F0,_Functions,[[Dbw_n,Operator],[Variable1,Variable2]],Vars1,Vars2,true,nocut). + +interpretstatement1(ssi,_F0,_Functions,[[Dbw_n,Operator],[Variable2,Variable3,Variable1]],Vars1,Vars2,true,nocut,[]) :- +interpretstatement1(non-ssi,_F0,_Functions,[[Dbw_n,Operator],[Variable2,Variable3,Variable1]],Vars1,Vars2,true,nocut). + +interpretstatement1(ssi,_F0,_Functions,[[Dbw_n,Operator],[Variable1,Variable2]],Vars1,Vars2,true,nocut,[]) :- +interpretstatement1(non-ssi,_F0,_Functions,[[Dbw_n,Operator],[Variable1,Variable2]],Vars1,Vars2,true,nocut). + +interpretstatement1(ssi,_F0,_Functions,[[Dbw_n,Dbw_equals1],[Variable1,[Variable2,Variable3]]],Vars1,Vars2,true,nocut,[]) :- +interpretstatement1(non-ssi,_F0,_Functions,[[Dbw_n,Dbw_equals1],[Variable1,[Variable2,Variable3]]],Vars1,Vars2,true,nocut). + +interpretstatement1(ssi,_F0,_Functions,[[Dbw_n,Dbw_equals2],[Variable1,[Variable2,Variable3]]],Vars1,Vars2,true,nocut,[]) :- +interpretstatement1(non-ssi,_F0,_Functions,[[Dbw_n,Dbw_equals2],[Variable1,[Variable2,Variable3]]],Vars1,Vars2,true,nocut). + +interpretstatement1(ssi,_F0,_Functions,[[Dbw_n,Dbw_equals3],[Variable1,Variable2]],Vars1,Vars2,true,nocut,[]) :- +interpretstatement1(non-ssi,_F0,_Functions,[[Dbw_n,Dbw_equals3],[Variable1,Variable2]],Vars1,Vars2,true,nocut). 
+ + +interpretstatement1(ssi,_F0,_Functions,[[Dbw_n,Dbw_equals4],[Variable1,Variable2]],Vars1,Vars2,true,nocut,[]) :- +interpretstatement1(non-ssi,_F0,_Functions,[[Dbw_n,Dbw_equals4],[Variable1,Variable2]],Vars1,Vars2,true,nocut). + +interpretstatement1(ssi,_F0,_Functions,[[Dbw_n,Dbw_wrap],[Variable1,Variable2]],Vars1,Vars2,true,nocut,[]) :- +interpretstatement1(non-ssi,_F0,_Functions,[[Dbw_n,Dbw_wrap],[Variable1,Variable2]],Vars1,Vars2,true,nocut). + +interpretstatement1(ssi,_F0,_Functions,[[Dbw_n,Dbw_unwrap],[Variable1,Variable2]],Vars1,Vars2,true,nocut,[]) :- +interpretstatement1(non-ssi,_F0,_Functions,[[Dbw_n,Dbw_unwrap],[Variable1,Variable2]],Vars1,Vars2,true,nocut). + +interpretstatement1(ssi,_F0,_Functions,[[Dbw_n,Dbw_head],[Variable1,Variable2]],Vars1,Vars2,true,nocut,[]) :- +interpretstatement1(non-ssi,_F0,_Functions,[[Dbw_n,Dbw_head],[Variable1,Variable2]],Vars1,Vars2,true,nocut). + +interpretstatement1(ssi,_F0,_Functions,[[Dbw_n,Dbw_tail],[Variable1,Variable2]],Vars1,Vars2,true,nocut,[]) :- +interpretstatement1(non-ssi,_F0,_Functions,[[Dbw_n,Dbw_tail],[Variable1,Variable2]],Vars1,Vars2,true,nocut). + + +interpretstatement1(ssi,_F0,_Functions,[[Dbw_n,Dbw_delete],[Variable1,Variable2,Variable3]],Vars1,Vars2,true,nocut,[]) :- +interpretstatement1(non-ssi,_F0,_Functions,[[Dbw_n,Dbw_delete],[Variable1,Variable2,Variable3]],Vars1,Vars2,true,nocut). +%%** all in form f,[1,1,etc], including + with 0,1 + +interpretstatement1(ssi,_F0,_Functions,[[Dbw_n,Dbw_append],[Variable1,Variable2,Variable3]],Vars1,Vars2,true,nocut,[]) :- +interpretstatement1(non-ssi,_F0,_Functions,[[Dbw_n,Dbw_append],[Variable1,Variable2,Variable3]],Vars1,Vars2,true,nocut). +%get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +%get_lang_word("append",Dbw_append1),Dbw_append1=Dbw_append, +%%writeln1(9), + %ssi_interpretpart(append,Variable1,Variable2,Variable3,Vars1,Vars2). 
+ + +interpretstatement1(ssi,_F0,_Functions,[[Dbw_n,Dbw_stringconcat],[Variable1,Variable2,Variable3]],Vars1,Vars2,true,nocut,[]) :- +interpretstatement1(non-ssi,_F0,_Functions,[[Dbw_n,Dbw_stringconcat],[Variable1,Variable2,Variable3]],Vars1,Vars2,true,nocut). + +%get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +%get_lang_word("stringconcat",Dbw_stringconcat1),Dbw_stringconcat1=Dbw_stringconcat, + +%ssi_interpretpart(stringconcat,Variable1,Variable2,Variable3,Vars1,Vars2). + + +interpretstatement1(ssi,_F0,_Functions,[[Dbw_n,Dbw_stringtonumber],[Variable1,Variable2]],Vars1,Vars2,true,nocut,[]) :- +interpretstatement1(non-ssi,_F0,_Functions,[[Dbw_n,Dbw_stringtonumber],[Variable1,Variable2]],Vars1,Vars2,true,nocut). + +interpretstatement1(ssi,_F0,_Functions,[[Dbw_n,Dbw_random],[Variable1]],Vars1,Vars2,true,nocut,[]) :- +interpretstatement1(non-ssi,_F0,_Functions,[[Dbw_n,Dbw_random],[Variable1]],Vars1,Vars2,true,nocut). + +interpretstatement1(ssi,_F0,_Functions,[[Dbw_n,Dbw_length],[Variable1,Variable2]],Vars1,Vars2,true,nocut,[]) :- +interpretstatement1(non-ssi,_F0,_Functions,[[Dbw_n,Dbw_length],[Variable1,Variable2]],Vars1,Vars2,true,nocut). + +interpretstatement1(ssi,_F0,_Functions,[[Dbw_n,Dbw_ceiling],[Variable1,Variable2]],Vars1,Vars2,true,nocut,[]) :- +interpretstatement1(non-ssi,_F0,_Functions,[[Dbw_n,Dbw_ceiling],[Variable1,Variable2]],Vars1,Vars2,true,nocut). + +interpretstatement1(ssi,_F0,_Functions,[[Dbw_n,Dbw_date],[Year,Month,Day,Hour,Minute,Seconds]],Vars1,Vars2,true,nocut,[]) :- +interpretstatement1(non-ssi,_F0,_Functions,[[Dbw_n,Dbw_date],[Year,Month,Day,Hour,Minute,Seconds]],Vars1,Vars2,true,nocut). + +interpretstatement1(ssi,_F0,_Functions,[[Dbw_n,Dbw_round],[N1,N2]],Vars1,Vars2,true,nocut,[]) :- +interpretstatement1(non-ssi,_F0,_Functions,[[Dbw_n,Dbw_round],[N1,N2]],Vars1,Vars2,true,nocut). 
+ +interpretstatement1(ssi,_Grammar,_Grammar2,[[Dbw_n,grammar_part],[Variable1,Variable2,Variable3]],Vars1,Vars2,true,nocut,[]) :- +interpretstatement1(non-ssi,_Grammar,_Grammar2,[[Dbw_n,grammar_part],[Variable1,Variable2,Variable3]],Vars1,Vars2,true,nocut). + + +interpretstatement1(ssi,_Functions0,_Functions,[[Dbw_n,Dbw_string_from_file],[Variable1,Variable2]],Vars1,Vars2,true,nocut,[]) :- +interpretstatement1(non-ssi,_Functions0,_Functions,[[Dbw_n,Dbw_string_from_file],[Variable1,Variable2]],Vars1,Vars2,true,nocut). + +interpretstatement1(ssi,Functions0,Functions,[[Dbw_n,Dbw_maplist],[Variable1,Variable2,Variable3,Variable4]],Vars1,Vars2,true,nocut,[]) :- +interpretstatement1(non-ssi,Functions0,Functions,[[Dbw_n,Dbw_maplist],[Variable1,Variable2,Variable3,Variable4]],Vars1,Vars2,true,nocut). + +interpretstatement1(ssi,_F0,_Functions,[[Dbw_n,Dbw_string_length],[Variable1,Variable2]],Vars1,Vars2,true,nocut,[]) :- +interpretstatement1(non-ssi,_F0,_Functions,[[Dbw_n,Dbw_string_length],[Variable1,Variable2]],Vars1,Vars2,true,nocut). + +interpretstatement1(ssi,_F0,_Functions,[[Dbw_n,Dbw_sort],[Variable1,Variable2]],Vars1,Vars2,true,nocut,[]) :- +interpretstatement1(non-ssi,_F0,_Functions,[[Dbw_n,Dbw_sort],[Variable1,Variable2]],Vars1,Vars2,true,nocut). + +interpretstatement1(ssi,_F0,_Functions,[[Dbw_n,Dbw_intersection],[Variable1,Variable2,Variable3]],Vars1,Vars2,true,nocut,[]) :- +interpretstatement1(non-ssi,_F0,_Functions,[[Dbw_n,Dbw_intersection],[Variable1,Variable2,Variable3]],Vars1,Vars2,true,nocut). + +interpretstatement1(ssi,_F0,_Functions,[[Dbw_n,Dbw_read_string],[Variable1]],Vars1,Vars2,true,nocut,[]) :- +interpretstatement1(non-ssi,_F0,_Functions,[[Dbw_n,Dbw_read_string],[Variable1]],Vars1,Vars2,true,nocut). + +interpretstatement1(ssi,_F0,_Functions,[[Dbw_n,Dbw_writeln],[Variable1]],Vars1,Vars2,true,nocut,[]) :- +interpretstatement1(non-ssi,_F0,_Functions,[[Dbw_n,Dbw_writeln],[Variable1]],Vars1,Vars2,true,nocut). 
+ +interpretstatement1(ssi,_F0,_Functions,[[Dbw_n,Dbw_atom_string],[Variable1,Variable2]],Vars1,Vars2,true,nocut,[]) :- +interpretstatement1(non-ssi,_F0,_Functions,[[Dbw_n,Dbw_atom_string],[Variable1,Variable2]],Vars1,Vars2,true,nocut). + +interpretstatement1(ssi,_F0,_Functions,[[Dbw_n,Dbw_get_lang_word],[Variable1,Variable2]],Vars1,Vars2,true,nocut,[]) :- +interpretstatement1(non-ssi,_F0,_Functions,[[Dbw_n,Dbw_get_lang_word],[Variable1,Variable2]],Vars1,Vars2,true,nocut). + +*/ \ No newline at end of file diff --git a/SSI/ssi_verify4.pl b/SSI/ssi_verify4.pl new file mode 100644 index 0000000000000000000000000000000000000000..01e771671d519eca7e12e2baca41adddeecee5ee --- /dev/null +++ b/SSI/ssi_verify4.pl @@ -0,0 +1,26 @@ +%% ssi_test(Debug[on/off],Total,Score). + +%%:- use_module(library(time)). + +%% ssi_test cases, Debug=trace=on or off, NTotal=output=total cases, Score=output=result + +ssi_test(Debug,NTotal,Score) :- ssi_test(Debug,0,NTotal,0,Score),!. +ssi_test(_Debug,NTotal,NTotal,Score,Score) :- NTotal=249, !. +ssi_test(Debug,NTotal1,NTotal2,Score1,Score2) :- + NTotal3 is NTotal1+1, + test(NTotal3,Query,Functions,Result), + ((international_lucianpl([lang,"en"],Debug,Query,Functions,Result1) + %,writeln1([Result,Result1]) + ,Result=Result1) + ->(Score3 is Score1+1,writeln0([ssi_test,NTotal3,passed]));(Score3=Score1,writeln0([ssi_test,NTotal3,failed]))), + writeln0(""), + ssi_test(Debug,NTotal3,NTotal2,Score3,Score2),!. + +%% ssi_test individual cases, Debug=trace=on or off, N=case number, Passed=output=result + +ssi_test1(Debug,N,Passed) :- + test(N,Query,Functions,Result), + ((international_lucianpl([lang,"en"],Debug,Query,Functions,Result1), + %writeln1([result1,Result1]), + Result=Result1 + )->(Passed=passed,writeln0([ssi_test,N,passed]));(Passed=failed,writeln0([ssi_test,N,failed]))),!. 
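A note on how `ssi_verify4.pl` above is driven: a case counts as passed only when the interpreter's result unifies with the stored one (`Result=Result1`), and `Debug=on` switches on tracing. Assuming the interpreter and the `test/4` cases are already loaded, the whole suite or a single case can be run like this (queries are illustrative usage, not part of the file):

```
?- ssi_test(off,Total,Score).    % run all 249 cases; Score of Total passed
?- ssi_test1(off,1,Passed).      % run case 1 only; Passed = passed or failed
```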
diff --git a/SSI/ssi_verify4_open.pl b/SSI/ssi_verify4_open.pl new file mode 100644 index 0000000000000000000000000000000000000000..eead4036fbd432c8743fd19ac8cb8e07399a386d --- /dev/null +++ b/SSI/ssi_verify4_open.pl @@ -0,0 +1,14 @@ +%% ssi_test cases, Debug=trace=on or off, N=output=result +ssi_testopen(Debug,NTotal) :- ssi_testopen(Debug,0,NTotal),!. +ssi_testopen(_Debug,NTotal,NTotal) :- NTotal=3, !. +ssi_testopen(Debug,NTotal1,NTotal2) :- + NTotal3 is NTotal1+1, + testopen_cases(NTotal3,Query,Functions), + ((international_lucianpl([lang,"en"],Debug,Query,Functions,Result),not(Result=[]))->(writeln0([ssi_test,NTotal3,result,Result]),writeln0([ssi_test,NTotal3,passed]));(writeln0([ssi_test,NTotal3,failed]))), + writeln0(""), + ssi_testopen(Debug,NTotal3,NTotal2),!. + +%% ssi_test individual cases, Debug=trace=on or off, N=case number +ssi_testopen1(Debug,N) :- + testopen_cases(N,Query,Functions), +((international_lucianpl([lang,"en"],Debug,Query,Functions,Result),not(Result=[]))->(writeln0([ssi_test,N,result,Result]),writeln0([ssi_test,N,passed]));(writeln0([ssi_test,N,failed]))),!. diff --git a/SSI/ssi_verify4_open_types.pl b/SSI/ssi_verify4_open_types.pl new file mode 100644 index 0000000000000000000000000000000000000000..295d21cda29ab34d792e658d0376361ba2d18500 --- /dev/null +++ b/SSI/ssi_verify4_open_types.pl @@ -0,0 +1,23 @@ +%% ssi_test_open_types(Debug[on/off],Total,Score). + +%%:- use_module(library(time)). + +%% ssi_test cases, Debug=trace=on or off, NTotal=output=total cases, Score=output=result + +ssi_test_open_types(Debug,NTotal,Score) :- ssi_test_open_types(Debug,0,NTotal,0,Score),!. +ssi_test_open_types(_Debug,NTotal,NTotal,Score,Score) :- NTotal=24, !. 
+ssi_test_open_types(Debug,NTotal1,NTotal2,Score1,Score2) :- + NTotal3 is NTotal1+1, + test_open_types_cases(NTotal3,Query,Types,Modes,Functions), + ((international_lucianpl([lang,"en"],Debug,Query,Types,Modes,Functions,Result),not(Result=[]))->(Score3 is Score1+1,writeln0([ssi_test_open_types,NTotal3,result,Result]),writeln0([ssi_test_open_types,NTotal3,passed]));(Score3=Score1,writeln0([ssi_test_open_types,NTotal3,failed]))), + writeln0(""), + ssi_test_open_types(Debug,NTotal3,NTotal2,Score3,Score2),!. + +%% ssi_test individual cases, Debug=trace=on or off, N=case number, Passed=output=result + +ssi_test_open_types1(Debug,N,Passed) :- + test_open_types_cases(N,Query,Types,Modes,Functions), + (((international_lucianpl([lang,"en"],Debug,Query,Types,Modes,Functions,Result),not(Result=[]))%%writeln(Result1), + %%Result=Result1 + )->(Passed=passed,writeln0([ssi_test_open_types,N,result,Result]),writeln0([ssi_test_open_types,N,passed]));(Passed=failed,writeln0([ssi_test_open_types,N,failed]))),!. + diff --git a/SSI/ssi_verify4_test_bt_lang_all.pl b/SSI/ssi_verify4_test_bt_lang_all.pl new file mode 100644 index 0000000000000000000000000000000000000000..3c65201e1eeebd4df7d430130ae6d1beaa190d29 --- /dev/null +++ b/SSI/ssi_verify4_test_bt_lang_all.pl @@ -0,0 +1,145 @@ +%:-include('../Languages/make_docs.pl'). + +%:-include('ssi_verify4.pl'). +%:-include('ssi_verify4_types.pl'). +%:-include('ssi_verify4_open.pl'). +%:-include('ssi_verify4_open_types.pl'). + +% ssi_test (NTotal3,Query,Functions,Result) +% ssi_test_types_cases (NTotal3,Query,Types,Modes,Functions,Result) +% ssi_testopen_cases (N, Query,Functions) +% ssi_test_open_types (NTotal3,Query,Types,Modes,Functions) + +%% ssi_test_all_bt00("en2",off,NTotal,Score). 
+ +ssi_test_all_bt00(Lang,Debug,NTotal,Score) :- + retractall(lang(_)), + assertz(lang(Lang)), + + ssi_test_all_bt0(test,4,Lang,Debug,NT1,S1), + writeln0([ssi_verify4,S1,/,NT1,passed]), + writeln0(""), writeln0(""), + + ssi_test_all_bt0(test_types_cases,6,Lang,Debug,NT2,S2), + writeln0([ssi_verify4_types,S2,/,NT2,passed]), + writeln0(""), writeln0(""), + + ssi_test_all_bt0(testopen_cases,3,Lang,Debug,NT3,S3), + writeln0([ssi_verify4_open,S3,/,NT3,passed]), + writeln0(""), writeln0(""), + + ssi_test_all_bt0(test_open_types_cases,5,Lang,Debug,NT4,S4), + writeln0([ssi_verify4_open_types,S4,/,NT4,passed]), + writeln0(""), writeln0(""), + + NTotal is NT1+NT2+NT3+NT4, + Score is S1+S2+S3+S4. + +ssi_test_all_bt0(Ssi_test,Arity,Lang,Debug,NTotal,Score) :- + functor(Ssi_test2,Ssi_test,Arity), + findall(ssi_test2,(Ssi_test2),B),length(B,NTotal1), +ssi_test_all_bt0(Ssi_test,Arity,Lang,Debug,0,NTotal,0,Score,NTotal1),!. +ssi_test_all_bt0(_Ssi_test,_Arity,_Lang,_Debug,NTotal,NTotal,Score,Score,NTotal) :- +%NTotal=105, + !. +ssi_test_all_bt0(Ssi_test,Arity,Lang,Debug,NTotal1,NTotal2,Score1,Score2,NTotal4) :- + NTotal3 is NTotal1+1, + ssi_test_all_bt000(Ssi_test,Debug,NTotal3,Score1,Score3,Lang), + writeln0(""), + ssi_test_all_bt0(Ssi_test,Arity,Lang,Debug,NTotal3,NTotal2,Score3,Score2,NTotal4),!. + +%% ssi_test_all_bt01 individual cases, Debug=trace=on or off, N=case number, Passed=output=result + +%% ssi_test_all_bt01(ssi_test, 4,"en2",off,1,Passed). +%% ssi_test_all_bt01(ssi_test_types_cases,6,"en2",off,1,Passed). +%% ssi_test_all_bt01(ssi_testopen_cases, 3,"en2",off,1,Passed). +%% ssi_test_all_bt01(ssi_test_open_types, 5,"en2",off,1,Passed). 
+ +ssi_test_all_bt01(test,_Arity,Lang,Debug,NTotal3,Passed) :- + ssi_test_all_bt000(test,Debug,NTotal3,0,Passed1,Lang), + (Passed1=1->Passed=passed;Passed=failed), + /** + ((international_lucianpl([lang,"en"],Debug,Query,Functions,Result1),%%writeln(Result1), + Result=Result1 + )->(Passed=passed,writeln([ssi_test_all_bt0,N,passed]));(Passed=failed,writeln([ssi_test_all_bt0,N,failed]))), +**/ + !. + +ssi_test_all_bt000(test,Debug,NTotal3,Score1,Score3,Lang) :- + test(NTotal3,Query,Functions,Result), + trans_alg1(Query,"en",Lang,Query1), + (Query=Query1->true%writeln("Query=Query1") + ;(writeln0("not(Query=Query1)"),abort)), + trans_alg1(Functions,"en",Lang,Functions1), + (Debug=on->writeln1(Functions);true), + (Debug=on->writeln1(Functions1);true), + (Functions=Functions1->true%writeln("Functions=Functions1") + ;(writeln0("not(Functions=Functions1)"),abort)), + trans_alg1(Result,"en",Lang,Result1), + (Result1=_Result11->true%writeln("Result1=Result11") + ;(writeln0("not(Result1=Result11)"),abort)), + (international_lucianpl([lang,"en"],Debug,Query1,Functions1,Result1) + %%writeln1(Result2 + ->(Score3 is Score1+1,writeln0([ssi_test,NTotal3,passed]));(Score3=Score1,writeln0([ssi_test,NTotal3,failed]))). 
+ +ssi_test_all_bt000(test_types_cases,Debug,NTotal3,Score1,Score3,Lang) :- + +test_types_cases(NTotal3,Query,Types,Modes,Functions,Result), + + trans_alg1(Query,"en",Lang,Query1), + %trans_alg1(Types,"en",Lang,Types1), + + findall([F1|Types00],(member([F1|Types003],Types), + + expand_types1(Types003,[],Types00)),Types004), + + %Types004=[[[n, find_record], [[t, brackets], [[[t, list], [[t, number], [t, string]]], [t, number], [t, string]]]]], + + trans_alg1(Types004,"en",Lang,Types005),%,expand_types1(Types002,[],Types003),simplify_types(Types003,[],Types00) + %),Types1), + + %simplify_types(Types01,[],Types1),%findall + findall([F|Types100],(member([F|Types101],Types005),simplify_types(Types101,[],Types100)),Types1), + + + trans_alg1(Modes,"en",Lang,Modes1), + trans_alg1(Functions,"en",Lang,Functions1), + (Debug=on->writeln1(Functions1);true), + trans_alg1(Result,"en",Lang,Result1), + +(international_lucianpl([lang,"en"],Debug,Query1,Types1,Modes1,Functions1,Result1)->(Score3 is Score1+1,writeln0([ssi_test_types,NTotal3,passed]));(Score3=Score1,writeln0([ssi_test_types,NTotal3,failed]))). + +ssi_test_all_bt000(testopen_cases,Debug,NTotal3,Score1,Score3,Lang) :- + testopen_cases(NTotal3,Query,Functions), + trans_alg1(Query,"en",Lang,Query1), + trans_alg1(Functions,"en",Lang,Functions1), + (Debug=on->writeln1(Functions1);true), + ((international_lucianpl([lang,"en"],Debug,Query1,Functions1,Result),not(Result=[]))->(Score3 is Score1+1,writeln0([ssi_testopen,NTotal3,result,Result]),writeln0([ssi_test,NTotal3,passed]));(Score3=Score1,writeln0([ssi_testopen,NTotal3,failed]))). 
+ +ssi_test_all_bt000(test_open_types_cases,Debug,NTotal3,Score1,Score3,Lang) :- + test_open_types_cases(NTotal3,Query,Types,Modes,Functions), + trans_alg1(Query,"en",Lang,Query1), + %trans_alg1(Types,"en",Lang,Types1), + + findall([F1|Types00],(member([F1|Types003],Types), + + expand_types1(Types003,[],Types00)),Types004), + + %Types004=[[[n, find_record], [[t, brackets], [[[t, list], [[t, number], [t, string]]], [t, number], [t, string]]]]], + + trans_alg1(Types004,"en",Lang,Types005),%,expand_types1(Types002,[],Types003),simplify_types(Types003,[],Types00) + %),Types1), + + %simplify_types(Types01,[],Types1),%findall + findall([F|Types100],(member([F|Types101],Types005),simplify_types(Types101,[],Types100)),Types1), + + trans_alg1(Modes,"en",Lang,Modes1), + trans_alg1(Functions,"en",Lang,Functions1), + (Debug=on->writeln1(Functions1);true), + + +((international_lucianpl([lang,"en"],Debug,Query1,Types1,Modes1,Functions1,Result),not(Result=[]))->(Score3 is Score1+1,writeln0([ssi_test_open_types,NTotal3,result,Result]),writeln0([ssi_test_open_types,NTotal3,passed]));(Score3=Score1,writeln0([ssi_test_open_types,NTotal3,failed]))). + +trans_alg1(Query,"en",Lang,Query1) :- + trans_alg(Query,"en",Lang,Query2), + trans_alg(Query2,Lang,"en",Query1). diff --git a/SSI/ssi_verify4_test_lang_all.pl b/SSI/ssi_verify4_test_lang_all.pl new file mode 100644 index 0000000000000000000000000000000000000000..a0d58ea5a4b538b1bd584c2e763dc86da02893a2 --- /dev/null +++ b/SSI/ssi_verify4_test_lang_all.pl @@ -0,0 +1,135 @@ +%:-include('../Languages/make_docs.pl'). + +%:-include('ssi_verify4.pl'). +%:-include('ssi_verify4_types.pl'). +%:-include('ssi_verify4_open.pl'). +%:-include('ssi_verify4_open_types.pl'). + +%% ssi_test_all00("en",off,NTotal,Score). + +%% ssi_test_all0(test,4,"en",off,A,B). + +%% ssi_test_all01(test,_,"en",off,1,Passed). 
+ +ssi_test_all00(Lang,Debug,NTotal,Score) :- + retractall(lang(_)), + assertz(lang(Lang)), + + ssi_test_all0(test,4,Lang,Debug,NT1,S1), + writeln0([ssi_verify4,S1,/,NT1,passed]), + writeln0(""), writeln0(""), + + ssi_test_all0(test_types_cases,6,Lang,Debug,NT2,S2), + writeln0([ssi_verify4_types,S2,/,NT2,passed]), + writeln0(""), writeln0(""), + + ssi_test_all0(testopen_cases,3,Lang,Debug,NT3,S3), + writeln0([ssi_verify4_open,S3,/,NT3,passed]), + writeln0(""), writeln0(""), + + ssi_test_all0(test_open_types_cases,5,Lang,Debug,NT4,S4), + writeln0([ssi_verify4_open_types,S4,/,NT4,passed]), + writeln0(""), writeln0(""), + + NTotal is NT1+NT2+NT3+NT4, + Score is S1+S2+S3+S4. + +ssi_test_all0(Test,Arity,Lang,Debug,NTotal,Score) :- + functor(Test2,Test,Arity), + findall(Test2,(Test2),B),length(B,NTotal1), +ssi_test_all0(Test,Arity,Lang,Debug,0,NTotal,0,Score,NTotal1),!. +ssi_test_all0(_Test,_Arity,_Lang,_Debug,NTotal,NTotal,Score,Score,NTotal) :- +%NTotal=105, + !. +ssi_test_all0(Test,Arity,Lang,Debug,NTotal1,NTotal2,Score1,Score2,NTotal4) :- + NTotal3 is NTotal1+1, + ssi_test_all000(Test,Debug,NTotal3,Score1,Score3,Lang), + writeln0(""), + ssi_test_all0(Test,Arity,Lang,Debug,NTotal3,NTotal2,Score3,Score2,NTotal4),!. + +%% ssi_test_all01 individual cases, Debug=trace=on or off, N=case number, Passed=output=result + +%% ssi_test_all01(test, 4,"en2",off,1,Passed). +%% ssi_test_all01(test_types_cases,6,"en2",off,1,Passed). +%% ssi_test_all01(testopen_cases, 3,"en2",off,1,Passed). +%% ssi_test_all01(test_open_types, 5,"en2",off,1,Passed). + +ssi_test_all01(Test,_Arity,Lang,Debug,NTotal3,Passed) :- + ssi_test_all000(Test,Debug,NTotal3,0,Passed1,Lang), + (Passed1=1->Passed=passed;Passed=failed), + /** + ((international_lucianpl([lang,"en"],Debug,Query,Functions,Result1),%%writeln(Result1), + Result=Result1 + )->(Passed=passed,writeln([ssi_test_all0,N,passed]));(Passed=failed,writeln([ssi_test_all0,N,failed]))), +**/ + !. 
+ +ssi_test_all000(test,Debug,NTotal3,Score1,Score3,Lang) :- + test(NTotal3,Query,Functions,Result), + trans_alg(Query,"en",Lang,Query1), + trans_alg(Functions,"en",Lang,Functions1), + (Debug=on->writeln1(Functions);true), + (Debug=on->writeln1(Functions1);true), + trans_alg(Result,"en",Lang,Result1), + + + (( +catch(call_with_time_limit(800,international_lucianpl([lang,Lang],Debug,Query1,Functions1,Result1)),_,false) %,writeln1(Result2) + ) + ->(Score3 is Score1+1,writeln0([ssi_test,NTotal3,passed]));(Score3=Score1,writeln0([ssi_test,NTotal3,failed]))). + +ssi_test_all000(test_types_cases,Debug,NTotal3,Score1,Score3,Lang) :- + +test_types_cases(NTotal3,Query,Types,Modes,Functions,Result), + + trans_alg(Query,"en",Lang,Query1), + + retractall(lang(_)), + assertz(lang("en")), + + %/* + findall([F1|Types00],(member([F1|Types003],Types), + + + expand_types1(Types003,[],Types00)),Types004), + + %Types004=[[[n, find_record], [[t, brackets], [[[t, list], [[t, number], [t, string]]], [t, number], [t, string]]]]], + + trans_alg(Types004,"en",Lang,Types005),%,expand_types1(Types002,[],Types003),simplify_types(Types003,[],Types00) + %),Types1), + + %simplify_types(Types01,[],Types1),%findall + findall([F|Types100],(member([F|Types101],Types005), + + %retractall(lang(_)), + %assertz(lang("en")), + + simplify_types(Types101,[],Types100)),Types1), + %*/ + %trans_alg(Types,"en",Lang,Types1), + trans_alg(Modes,"en",Lang,Modes1), + trans_alg(Functions,"en",Lang,Functions1), + (Debug=on->writeln1(Functions1);true), + trans_alg(Result,"en",Lang,Result1), + +(catch(call_with_time_limit(800,international_lucianpl([lang,Lang],Debug,Query1,Types1,Modes1,Functions1,Result1)),_,false) %,writeln1(Result2) +->(Score3 is Score1+1,writeln0([ssi_test_types,NTotal3,passed]));(Score3=Score1,writeln0([ssi_test_types,NTotal3,failed]))). 
+ +ssi_test_all000(testopen_cases,Debug,NTotal3,Score1,Score3,Lang) :- + testopen_cases(NTotal3,Query,Functions), + trans_alg(Query,"en",Lang,Query1), + trans_alg(Functions,"en",Lang,Functions1), + (Debug=on->writeln1(Functions1);true), + ((international_lucianpl([lang,Lang],Debug,Query1,Functions1,Result),not(Result=[]))->(Score3 is Score1+1,writeln0([ssi_testopen,NTotal3,result,Result]),writeln0([ssi_test,NTotal3,passed]));(Score3=Score1,writeln0([ssi_testopen,NTotal3,failed]))). + +ssi_test_all000(test_open_types_cases,Debug,NTotal3,Score1,Score3,Lang) :- + test_open_types_cases(NTotal3,Query,Types,Modes,Functions), + trans_alg(Query,"en",Lang,Query1), + trans_alg(Types,"en",Lang,Types1), + trans_alg(Modes,"en",Lang,Modes1), + trans_alg(Functions,"en",Lang,Functions1), + (Debug=on->writeln1(Functions1);true), + + +((international_lucianpl([lang,Lang],Debug,Query1,Types1,Modes1,Functions1,Result),not(Result=[]))->(Score3 is Score1+1,writeln0([ssi_test_open_types,NTotal3,result,Result]),writeln0([ssi_test_open_types,NTotal3,passed]));(Score3=Score1,writeln0([ssi_test_open_types,NTotal3,failed]))). + diff --git a/SSI/ssi_verify4_types.pl b/SSI/ssi_verify4_types.pl new file mode 100644 index 0000000000000000000000000000000000000000..65683031a5ae44d590b2ad3737b15e8e38df86cd --- /dev/null +++ b/SSI/ssi_verify4_types.pl @@ -0,0 +1,23 @@ +%% ssi_test_types(Debug[on/off],Total,Score). + +%%:- use_module(library(time)). + +%% ssi_test cases, Debug=trace=on or off, NTotal=output=total cases, Score=output=result + +ssi_test_types(Debug,NTotal,Score) :- ssi_test_types(Debug,0,NTotal,0,Score),!. +ssi_test_types(_Debug,NTotal,NTotal,Score,Score) :- NTotal=58, !. 
+ssi_test_types(Debug,NTotal1,NTotal2,Score1,Score2) :- + NTotal3 is NTotal1+1, + test_types_cases(NTotal3,Query,Types,Modes,Functions,Result), + (international_lucianpl([lang,"en"],Debug,Query,Types,Modes,Functions,Result)->(Score3 is Score1+1,writeln0([ssi_test_types,NTotal3,passed]));(Score3=Score1,writeln0([ssi_test_types,NTotal3,failed]))), + writeln0(""), + ssi_test_types(Debug,NTotal3,NTotal2,Score3,Score2),!. + +%% ssi_test individual cases, Debug=trace=on or off, N=case number, Passed=output=result + +ssi_test_types1(Debug,N,Passed) :- + test_types_cases(N,Query,Types,Modes,Functions,Result), + ((international_lucianpl([lang,"en"],Debug,Query,Types,Modes,Functions,Result1),%writeln(Result1), + Result=Result1 + )->(Passed=passed,writeln0([ssi_test_types,N,passed]));(Passed=failed,writeln0([ssi_test_types,N,failed]))),!. + diff --git a/SSI/ssi_verify_pl.pl b/SSI/ssi_verify_pl.pl new file mode 100644 index 0000000000000000000000000000000000000000..aacedb8a56e4d41b0645bd33eb262eda80f7ce6b --- /dev/null +++ b/SSI/ssi_verify_pl.pl @@ -0,0 +1,32 @@ +%% ssi_test_pl(Debug[on/off],Total,Score). + +%%:- use_module(library(time)). + +%% ssi_test_pl cases, Debug=trace=on or off, NTotal=output=total cases, Score=output=result + +ssi_test_pl(Debug,NTotal,Score) :- ssi_test_pl(Debug,0,NTotal,0,Score),!. +ssi_test_pl(_Debug,NTotal,NTotal,Score,Score) :- NTotal=1, !. +ssi_test_pl(Debug,NTotal1,NTotal2,Score1,Score2) :- + NTotal3 is NTotal1+1, + test_pl(NTotal3,Query,Functions,Result), + p2lpconverter_command([string,Query],Query1), + p2lpconverter([string,Functions],Functions1), + p2lpconverter_term([string,Result],Result2), + ((international_lucianpl([lang,"en"],Debug,Query1,Functions1,Result1), + %writeln1([result1,Result1]), + Result2=Result1 + )->(Score3 is Score1+1,writeln0([ssi_test_pl,NTotal3,passed]));(Score3=Score1,writeln0([ssi_test_pl,NTotal3,failed]))), + writeln0(""), + ssi_test_pl(Debug,NTotal3,NTotal2,Score3,Score2),!. 
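The `ssi_test_pl` harness differs from the other suites in that each `test_pl/4` case stores its query, clauses and expected result as Prolog source strings, which are first converted to List Prolog with `p2lpconverter_command/2`, `p2lpconverter/2` and `p2lpconverter_term/2` before interpretation. Assuming those converters and the cases are loaded, usage mirrors the other suites (queries shown for illustration):

```
?- ssi_test_pl(off,Total,Score).   % convert and run every test_pl/4 case
?- ssi_test_pl1(off,1,Passed).     % convert and run case 1 only
```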
+ +%% ssi_test_pl individual cases, Debug=trace=on or off, N=case number, Passed=output=result + +ssi_test_pl1(Debug,N,Passed) :- + test_pl(N,Query,Functions,Result), + p2lpconverter_command([string,Query],Query1), + p2lpconverter([string,Functions],Functions1), + p2lpconverter_term([string,Result],Result2), + ((international_lucianpl([lang,"en"],Debug,Query1,Functions1,Result1), + %writeln1([result1,Result1]), + Result2=Result1 + )->(Passed=passed,writeln0([ssi_test_pl,N,passed]));(Passed=failed,writeln0([ssi_test_pl,N,failed]))),!. diff --git a/SSI/used_by_call_command.pl b/SSI/used_by_call_command.pl new file mode 100644 index 0000000000000000000000000000000000000000..c5813010d8f8b1942f79db8a2892c5de6a96a197 --- /dev/null +++ b/SSI/used_by_call_command.pl @@ -0,0 +1,119 @@ +% used by call command + +lucianpl1(Debug,Query,Functions1,Result) :- + lang(Lang1), + international_lucianpl1([lang,Lang1], + Debug,Query,Functions1,Result). + +international_lucianpl1([lang,Lang],Debug,Query,Functions1,Result) :- + retractall(lang(_)), + assertz(lang(Lang)), + lucianpl1_1(Debug,Query,Functions1,Result). + +international_lucianpl1([lang,Lang],Debug,Query,TypeStatements,ModeStatements,Functions1,Result) :- + retractall(lang(_)), + assertz(lang(Lang)), + lucianpl1_1(Debug,Query,TypeStatements,ModeStatements,Functions1,Result). + + +lucianpl1_1(Debug,Query,Functions1,Result) :- + retractall(types(_)), + assertz(types(off)), +lucianpl111(Debug,Query,Functions1,Result),!. + +lucianpl1_1(Debug,Query,TypeStatements,ModeStatements,Functions1,Result) :- + retractall(types(_)), + assertz(types(on)), + retractall(typestatements(_)), + findall([A,C],(member([A,B],TypeStatements),expand_types(B,[],C)),TypeStatements1), + +assertz(typestatements(TypeStatements1)), + retractall(modestatements(_)), + assertz(modestatements(ModeStatements)), +lucianpl111(Debug,Query,Functions1,Result). 
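The two `lucianpl1_1` clauses above select whether type checking is on: the 4-argument form asserts `types(off)`, while the 6-argument form expands each type statement with `expand_types/3`, asserts the expanded `typestatements` and the `modestatements`, and sets `types(on)` before delegating to `lucianpl111/4`. For example, to run with an explicit language and no type checking (the `Query` and `Functions` arguments here are placeholders, not values from this file):

```
?- international_lucianpl1([lang,"en"],off,Query,Functions,Result).
```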
+ +lucianpl111(Debug,Query,Functions,Result) :- + /* + ((not(lang(_Lang1)) + %var(Lang1) + )-> + (retractall(lang(_)), + assertz(lang("en"))); + true), + load_lang_db, + */ +%trace, + query_box(Query,Query1,[],Functions1), + +%get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, + +delete(Functions,[0|_],Functions1a), % delete query box + + %writeln1(query_box(Query,Query1,Functions,Functions1)), +%%writeln1([i1]), + %%writeln1(convert_to_grammar_part1(Functions1,[],Functions2,_)), + convert_to_grammar_part1(Functions1,[],Functions2,_), + %insert_cuts(Functions2a,Functions2), +%Functions1=Functions2, + %writeln1(convert_to_grammar_part1(Functions1,[],Functions2,_)), %trace, + %writeln1(Functions2), + %%pp3(Functions2), + %%writeln1(lucianpl11(Debug,Query,Functions2,Functions2,Result)), + %findall(Result1, + + %trace, + add_line_numbers_to_algorithm1(Functions2,Functions2a), +%Functions2=Functions2a, + %writeln1(add_line_numbers_to_algorithm1(Functions2,Functions2a)), + %writeln1(Functions2a), +append(Functions2a,Functions1a,Functions2b), + + %find_pred_sm(Reserved_words),%,"en"), +find_pred_numbers(Functions2b,[]%Reserved_words +,Pred_numbers), + +%pred_numbers(Pred_numbers0), +%writeln([pred_numbers0,Pred_numbers0]), + retractall(pred_numbers(_)), + assertz(pred_numbers(Pred_numbers)), +%trace, +%writeln([pred_numbers1,Pred_numbers]), + + find_state_machine1(Functions2a,Functions3a,Pred_numbers), + append(Functions3a,Functions1a,Functions3b), + +%Functions2a=Functions3, +%writeln1(find_state_machine1(Functions2a,Functions3,Pred_numbers)), + % find first predicate +%trace, + %writeln1([functions3a,Functions3a]), + prep_predicate_call(Query1,Functions3b, + All_predicate_numbers), + + lucianpl1(Debug), + + % ssi1([Level, % Trace level + % All_predicate_numbers1 % List of instances of this predicate left to call + % Predicate_or_line_number, % predicate nor line number in predicate + % Destination, % "predicate" or "line" + % Query, % Query when Destination="predicate" + % Vars, % 
Input Vars + % All_predicate_numbers], % predicates remaining to try at current level - saved in cp trail + % End_result, % Result of predicate + % Functions, % functions in algorithm + % Vars2, % output vars + % Result1, Result2, % Initial and cumulative lists of non-deterministic results + % Globals1, Globals2, % Initial and cumulative lists of assertz globals + % Choice_point_trail1, % Initial and cumulative lists of choice points + % Choice_point_trail2, Cpv1, Cpv2) + %trace, + %trace, + findall([All_predicate_numbers0,"prev_pred_id",0],member(All_predicate_numbers0,All_predicate_numbers),All_predicate_numbers01), + All_predicate_numbers01=[[All_predicate_numbers1,"prev_pred_id",Prev_pred_id]|All_predicate_numbers2], + %trace, + ssi1([["prev_pred_id",Prev_pred_id],1,All_predicate_numbers1,-1,"predicate",Query1,[], + All_predicate_numbers2],_End_result,Functions3b,_Vars2,[],Result1, + [],_Globals2, + [], _Choice_point_trail2, + [[curr_cp,0],[curr_cp_index,0]],_), + Result=Result1. \ No newline at end of file diff --git a/Simple-List-Prolog-to-List-Prolog/LICENSE b/Simple-List-Prolog-to-List-Prolog/LICENSE new file mode 100644 index 0000000000000000000000000000000000000000..d925e8c4aa63a274f76a5c22de79d0ada99b0da7 --- /dev/null +++ b/Simple-List-Prolog-to-List-Prolog/LICENSE @@ -0,0 +1,29 @@ +BSD 3-Clause License + +Copyright (c) 2019, Lucian Green +All rights reserved. + +Redistribution and use in source and binary forms, with or without +modification, are permitted provided that the following conditions are met: + +1. Redistributions of source code must retain the above copyright notice, this + list of conditions and the following disclaimer. + +2. Redistributions in binary form must reproduce the above copyright notice, + this list of conditions and the following disclaimer in the documentation + and/or other materials provided with the distribution. + +3. 
Neither the name of the copyright holder nor the names of its + contributors may be used to endorse or promote products derived from + this software without specific prior written permission. + +THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" +AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE +IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE +DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE +FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL +DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR +SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER +CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, +OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE +OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. diff --git a/Simple-List-Prolog-to-List-Prolog/README.md b/Simple-List-Prolog-to-List-Prolog/README.md new file mode 100644 index 0000000000000000000000000000000000000000..6924cb7ec00edd9b8398838ca1243b4539a36cdf --- /dev/null +++ b/Simple-List-Prolog-to-List-Prolog/README.md @@ -0,0 +1,39 @@ +# Simple-List-Prolog-to-List-Prolog + +Converts Simple List Prolog algorithms to List Prolog algorithms + +# Prerequisites + +* Please download and install SWI-Prolog for your machine at `https://www.swi-prolog.org/build/`. + +# 1. Install manually + +Download this repository. + +# 2. Or Install from List Prolog Package Manager (LPPM) + +* Download the LPPM Repository: + +``` +mkdir GitHub +cd GitHub/ +git clone https://github.com/luciangreen/List-Prolog-Package-Manager.git +cd List-Prolog-Package-Manager +swipl +['lppm']. +lppm_install("luciangreen","Simple-List-Prolog-to-List-Prolog"). 
+halt +``` + +# Running + +* In Shell: +`cd Simple-List-Prolog-to-List-Prolog` +`swipl` +`['slp2lp.pl'].` + +* Convert Simple List Prolog code to List Prolog code by running: +``` +?- slp2lp([[f1,[a,b,c,d,e],(:-),[[+,[a,b,f]],[+,[c,f,g]],[+,[d,g,h]],[=,[h,e]]]]],Algorithm2),writeln(Algorithm2). +Algorithm2=[[[n,f1],[[v,a],[v,b],[v,c],[v,d],[v,e]],":-",[[[n,+],[[v,a],[v,b],[v,f]]],[[n,+],[[v,c],[v,f],[v,g]]],[[n,+],[[v,d],[v,g],[v,h]]],[[n,=],[[v,h],[v,e]]]]]] +``` diff --git a/Simple-List-Prolog-to-List-Prolog/la_strings.pl b/Simple-List-Prolog-to-List-Prolog/la_strings.pl new file mode 100644 index 0000000000000000000000000000000000000000..b163083f2c576fd216a2c465dc15a722d9e62d6e --- /dev/null +++ b/Simple-List-Prolog-to-List-Prolog/la_strings.pl @@ -0,0 +1,31 @@ +%% la_strings.pl + +:- use_module(library(pio)). +:- use_module(library(dcg/basics)). + +open_s(File,Mode,Stream) :- + atom_string(File1,File), + open(File1,Mode,Stream),!. + +string_atom(String,Atom) :- + atom_string(Atom,String),!. + +phrase_from_file_s(string(Output), String) :- + atom_string(String1,String), + phrase_from_file(string(Output), String1),!. + +append_list2([],[]):-!. +append_list2(A1,B):- + A1=[A|List], + append_list2(A,List,B),!. + +append_list2(A,[],A):-!. +append_list2(A,List,B) :- + List=[Item|Items], + append(A,Item,C), + append_list2(C,Items,B). + +writeln1(Term) :- + term_to_atom(Term,Atom), + writeln(Atom),!. + diff --git a/Simple-List-Prolog-to-List-Prolog/slp2lp.pl b/Simple-List-Prolog-to-List-Prolog/slp2lp.pl new file mode 100644 index 0000000000000000000000000000000000000000..cdf26db36cdcaa218411d84c3756d47b7793e349 --- /dev/null +++ b/Simple-List-Prolog-to-List-Prolog/slp2lp.pl @@ -0,0 +1,129 @@ +%% slp2lp.pl + +/** +slp2lp([[f1,[a,b,c,d,e],(:-),[[+,[a,b,f]],[+,[c,f,g]],[+,[d,g,h]],[=,[h,e]]]]],Algorithm2).
+ +Algorithm2= [[[n,f1],[[v,a],[v,b],[v,c],[v,d],[v,e]],":-",[[[n,+],[[v,a],[v,b],[v,f]]],[[n,+],[[v,c],[v,f],[v,g]]],[[n,+],[[v,d],[v,g],[v,h]]],[[n,=],[[v,h],[v,e]]]]]] + +**/ + +:- include('la_strings.pl'). + +slp2lp(Algorithm1,Algorithm2) :- + findall([[n,Name],Arguments4,Symbol2,Body2],(member(Function1,Algorithm1),Function1=[Name,Arguments1,Symbol1,Body1],symbol(Symbol1,Symbol2), + findall(Arguments3,(member(Arguments2,Arguments1),slp2lp_variables(Arguments2,Arguments3)),Arguments4), + process_body2(Body1,Body2)),Algorithm2). + + +symbol((:-),":-"):-!. +symbol(Symbol,Symbol) :-!. + +slp2lp_variables(Name1,[v,Name1]) :- predicate_or_rule_name(Name1),!. +slp2lp_variables(Name,Name) :- !. + +/** +process_body(Body1,Body2) :- + findall(*,(member(Statement1,Body1 +process_body(Body1,[],Body2) :- + +**/ + +%%predicate_or_rule_name([A,B]) :- atom(A),is_list(B),!. +predicate_or_rule_name(A) :- atom(A),!. + + +process_body2([],[]):-!.%%,Body3 + + +%%process_body2([],Body,Body) :- !. +process_body2(Body1,Body2%%,Body3 +) :- + Body1=[[Statements1|Statements1a]|Statements2 + ], + + not(predicate_or_rule_name(Statements1)), + process_body2([Statements1],Body3), %% 2->1 + + process_body2(Statements1a,Body4), + process_body2(Statements2,Body5), + + append([Body3,Body4],Body6), + append([[Body6],Body5],Body2), + !. + + + + +process_body2(Body1,Body2) :- + Body1=[[not,[Statement]]|Statements2 + ], + + process_body2([Statement],Body3), + process_body2(Statements2,Body4), + append([[n,not]],Body3,Body5), + append([Body5],Body4 + ,Body2), + + !. + + + + +process_body2(Body1,Body2) :- + Body1=[[or,[Statements1,Statements2]]|Statements3], + process_body2([Statements1],Body3), + process_body2([Statements2],Body4), + process_body2(Statements3,Body5), + append(Body3,Body4,Body34), + Body6=[[n,or],Body34 + ], + append([Body6],Body5,Body2), + !. 
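+
+/**
+For example, the clause above rewrites a Simple List Prolog
+disjunction into List Prolog form (a hypothetical query in the style
+of the example at the top of this file):
+
+?- process_body2([[or,[[a,[x]],[b,[x]]]]],Body2).
+Body2 = [[[n,or],[[[n,a],[[v,x]]],[[n,b],[[v,x]]]]]]
+**/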
+ + +process_body2(Body1,Body2) :- + Body1=[["->",[Statements1,Statements2]]|Statements3], + process_body2([Statements1],Body3), + process_body2([Statements2],Body4), + + process_body2(Statements3,Body5), + append(Body3,Body4,Body34), + Body6=[[n,"->"],Body34 + ], + append([Body6],Body5,Body2), + + !. + + + + +process_body2(Body1,Body2) :- + Body1=[["->",[Statements1,Statements2,Statements2a]]|Statements3], + process_body2([Statements1],Body3), + process_body2([Statements2],Body4), + process_body2([Statements2a],Body5), + process_body2(Statements3,Body6), + append_list2([Body3,Body4,Body5],Body345), + Body7=[[n,"->"],Body345], + append([Body7],Body6,Body2), + + !. + + +process_body2(Body1,Body2) :- + Body1=[Statement|Statements], + not(predicate_or_rule_name(Statement)), + process_statement1(Statement,Result1), + process_body2(Statements,Result2), + append_list2([Result1,Result2],Body2),!. + + +process_statement1(Statement,Result1) :- + ((Statement=[Name,Arguments], + findall(Name2,(member(Argument,Arguments), + (predicate_or_rule_name(Argument)->Name2=[v,Argument]; + Name2=Argument)),Result2), + Result1=[[[n,Name],Result2]])->true; + ((Statement=[Name], + Result1=[[[n,Name]]]))). + \ No newline at end of file diff --git a/Split-on-Phrases/LICENSE b/Split-on-Phrases/LICENSE new file mode 100644 index 0000000000000000000000000000000000000000..d925e8c4aa63a274f76a5c22de79d0ada99b0da7 --- /dev/null +++ b/Split-on-Phrases/LICENSE @@ -0,0 +1,29 @@ +BSD 3-Clause License + +Copyright (c) 2019, Lucian Green +All rights reserved. + +Redistribution and use in source and binary forms, with or without +modification, are permitted provided that the following conditions are met: + +1. Redistributions of source code must retain the above copyright notice, this + list of conditions and the following disclaimer. + +2. 
Redistributions in binary form must reproduce the above copyright notice, + this list of conditions and the following disclaimer in the documentation + and/or other materials provided with the distribution. + +3. Neither the name of the copyright holder nor the names of its + contributors may be used to endorse or promote products derived from + this software without specific prior written permission. + +THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" +AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE +IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE +DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE +FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL +DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR +SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER +CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, +OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE +OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. diff --git a/Split-on-Phrases/README.md b/Split-on-Phrases/README.md new file mode 100644 index 0000000000000000000000000000000000000000..e7a4733a85fd460541d937928f8e5e892505ccff --- /dev/null +++ b/Split-on-Phrases/README.md @@ -0,0 +1,198 @@ +# Split-on-Phrases +Splits text files into groups of files. It was initially written to break multiple chapters per file (with paragraph numbers up to about 42) into 15-minute speakable files for YouTube. It splits on every 10 paragraphs. + +# Prerequisites + +* Please download and install SWI-Prolog for your machine at `https://www.swi-prolog.org/build/`. + +* You may need to install gawk using Homebrew. + +* Install Translation Shell on Mac, etc. +Change the line in +``` +culturaltranslationtool/ctt2.pl +trans_location("../../../gawk/trans"). 
+``` +to correct location of trans. + +# 1. Install manually + +Download this repository, the List Prolog Interpreter repository, the Languages repository and Cultural Translation Tool. + +# 2. Or Install from List Prolog Package Manager (LPPM) + +* Download the LPPM Repository: + +``` +mkdir GitHub +cd GitHub/ +git clone https://github.com/luciangreen/List-Prolog-Package-Manager.git +cd List-Prolog-Package-Manager +swipl +['lppm']. +lppm_install("luciangreen","Split-on-Phrases"). +halt +``` + +# What it does + +* Takes an input file, e.g.: + +``` +title1 + +1. a +2. a2 +10. b +11. b +19. c +20. d +21. e +29. c + +title2 + +1. a +2. a2 +10. b +11. b +19. c +20. d +21. e +29. c +30. d +31. e +39. c +40. d +41. e +42. f + +eof +``` + +* and the phrases file: + +``` +title1 +title2 +``` + +* and produces the files: + +`grand title by Lucian Green title1 1 of 4.txt` +``` +grand title +by Lucian Green +title1 1 of 4 + +1. a +2. a2 +10. b +``` + +`grand title by Lucian Green title1 2 of 4.txt` +``` +grand title +by Lucian Green +title1 2 of 4 + +11. b +19. c +20. d +``` + +`grand title by Lucian Green title1 3 of 4.txt` +``` +grand title +by Lucian Green +title1 3 of 4 + +21. e +29. c +``` + +`grand title by Lucian Green title2 1 of 4.txt` +``` +grand title +by Lucian Green +title2 1 of 4 + +1. a +2. a2 +10. b +``` + +`grand title by Lucian Green title2 2 of 4.txt` +``` +grand title +by Lucian Green +title2 2 of 4 + +11. b +19. c +20. d +``` + +`grand title by Lucian Green title2 3 of 4.txt` +``` +grand title +by Lucian Green +title2 3 of 4 + +21. e +29. c +30. d +``` + +`grand title by Lucian Green title2 4 of 4.txt` +``` +grand title +by Lucian Green +title2 4 of 4 + +31. e +39. c +40. d +41. e +42. 
f + +# Running + +* In Shell: +`cd Split-on-Phrases` +`swipl` +`['../listprologinterpreter/listprolog'].` + +* Load the Split on Phrases program by typing: +`['split_on_phrases'].` + +* The algorithm is called in the form: +`split_on_phrases1("grand title","phrasesfile.txt","inputfile.txt").` + +Where: +* `grand title` - the title at the start of each file. +* `phrasesfile.txt` - the file with the phrases to split the file on. +* `inputfile.txt` - the file with the titled lists of paragraphs to split. + +# Append EOF + +* Append EOF (for Mac Terminal) appends the string `eof` (not the EOF symbol) to each text file in the Data folder next to appendeof.sh, writing the results to the Out folder. + +* The Shell script is called in the form: +`./appendeof.sh` + +# Text to AIFF + +* Text to AIFF (for Mac Terminal) converts the text files in the Data folder next to t2aiff.sh into AIFF files, which can be converted to MP3s and added as sound tracks to YouTube videos. + +* The Shell script is called in the form: +`./t2aiff.sh` + +# Authors + +Lucian Green - Initial programmer - Lucian Academy + +# License + +I licensed this project under the BSD3 License - see the LICENSE.md file for details diff --git a/Split-on-Phrases/appendeof.sh b/Split-on-Phrases/appendeof.sh new file mode 100644 index 0000000000000000000000000000000000000000..70ed323645f45ad38ab74e1bfffa25698d24dee8 --- /dev/null +++ b/Split-on-Phrases/appendeof.sh @@ -0,0 +1,4 @@ +#!/bin/bash +for filename in Data/*.txt; do + cat "$filename" "eof.txt">"Out/$(basename "$filename" .txt)-eof.txt" +done \ No newline at end of file diff --git a/Split-on-Phrases/eof.txt b/Split-on-Phrases/eof.txt new file mode 100644 index 0000000000000000000000000000000000000000..2285a781856c2e258bd3bdf1d7d15095c9bb0cde --- /dev/null +++ b/Split-on-Phrases/eof.txt @@ -0,0 +1,2 @@ + +eof \ No newline at end of file diff --git a/Split-on-Phrases/inputfile.txt b/Split-on-Phrases/inputfile.txt new file mode 100644 index 
0000000000000000000000000000000000000000..f76b4b1110b8be98e7425b33357e05bd33379efd --- /dev/null +++ b/Split-on-Phrases/inputfile.txt @@ -0,0 +1,29 @@ +title1 + +1. a +2. a2 +10. b +11. b +19. c +20. d +21. e +29. c + +title2 + +1. a +2. a2 +10. b +11. b +19. c +20. d +21. e +29. c +30. d +31. e +39. c +40. d +41. e +42. f + +eof \ No newline at end of file diff --git a/Split-on-Phrases/phrasesfile.txt b/Split-on-Phrases/phrasesfile.txt new file mode 100644 index 0000000000000000000000000000000000000000..0c2454b462451b7c9fb158450efcaaee2cdcedd5 --- /dev/null +++ b/Split-on-Phrases/phrasesfile.txt @@ -0,0 +1,2 @@ +title1 +title2 \ No newline at end of file diff --git a/Split-on-Phrases/split_on_phrases.pl b/Split-on-Phrases/split_on_phrases.pl new file mode 100644 index 0000000000000000000000000000000000000000..5c730c9b1601803a5761aadd1641b63feaa1e9e2 --- /dev/null +++ b/Split-on-Phrases/split_on_phrases.pl @@ -0,0 +1,298 @@ +:- use_module(library(pio)). +:- include('../listprologinterpreter/la_strings'). + +%% split_on_phrases1("grand title","phrasesfile.txt","inputfile.txt"). + +split_on_phrases1(Title,PhrasesFile,InputFile) :- + get_phrases(PhrasesFile,Phrases), + split_on_phrases2(Phrases,InputFile,Text,Title), + write_files(Text),!. + +get_phrases(PhrasesFile,Phrases) :- + SepandPad="\n", + phrase_from_file_s(string(String00), PhrasesFile), + split_string(String00,SepandPad,SepandPad,List1), + delete(List1,"",Phrases1), + findall(Text6,(member(Text7,Phrases1), + string_codes(Text6,Text7)),Phrases). + +split_on_phrases2(Phrases2,InputFile,Text4,Title) :- + phrase_from_file_s(string(Text2), InputFile), + %%string_codes(Phrases1,Phrases2), + %%string_codes(Text2,Text1), + split_on_phrases3(Phrases2,Text2,[],Text3,Title), + %%term_to_atom([split_on_phrases3(Phrases2,Text2,[],Text3,Title)],AA), + %%writeln1(AA), + %%findall(Text4a,(member(Text7,Text3), + findall([A1,B1],(member(Text8,Text3),Text8=[A,B], + string_codes(A,A1),string_codes(B,B1)),Text4).%%),Text4). 
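+
+/**
+Example (from the README): splitting inputfile.txt on the titles
+listed in phrasesfile.txt writes files such as
+"grand title by Lucian Green title1 1 of 4.txt", each containing the
+grand title, the author line, the part heading and a group of
+paragraphs:
+
+?- split_on_phrases1("grand title","phrasesfile.txt","inputfile.txt").
+**/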
+ +split_on_phrases3([],_Text1,Text2,Text2,_Title) :- !. +%%split_on_phrases3(_Phrases1,[],Text2,Text2,_Title) :- !. +%% Phrases1=[Phrase2], + +split_on_phrases3(Phrases1,Text1,Text2,Text3,Title) :- + (Phrases1=[Phrase2b,Phrase2a|Phrases3]->true; + (Phrases1=[Phrase2b], + Phrase2a="eof",Phrases3=[])), + length(Phrases1,Phrases1L), + writeln(Phrase2b), +split_on_phrases3a(Phrase2b,Text1,Text2,Title,Phrase2a,E14,Phrases4,Phrases1L), +(Phrase2a="eof"->(%%writeln([phrases3,Phrases3]), +Phrases3=Phrases3a); +(%%writeln(append([Phrase2a],Phrases3,Phrases3a)), + append([Phrase2a],Phrases3,Phrases3a))), + split_on_phrases3(Phrases3a,E14,Phrases4,Text3,Title). + +split_on_phrases3a(Phrase2b,Text1,Text2,Title,Phrase2a,E14,Phrases4,Phrases1L) :- + %%string_concat(Phrase2b,"\n",Phrase2c), + %%string_codes(Phrase2c,Phrase2), + string_codes(Phrase2b,Phrase22), + string_codes("1.",String1), + string_codes("11.",String11), + string_codes("21.",String21), + string_codes("31.",String31), + string_codes(Phrase2a,Phrase2aa), + string_codes("eof",EofC), + + ( + (%%Phrases1L=1-> + %%split_on_phrases4(Text1,E11,Text2,Text4, + %%String1,Phrase22,EofC,Title,1) + %%; + split_on_phrases41(Text1,E11,Text2,Text4, + String1,Phrase22,String11,Title,1,Phrase2aa), + writeln("11") + )-> + ((%Phrases1L=1-> + %%split_on_phrases4(E11,E12,Text4,Text5, + %%String11,Phrase22,EofC,Title,2) + %%; + split_on_phrases41(E11,E12,Text4,Text5, + String11,Phrase22,String21,Title,2,Phrase2aa), + writeln("21") + )-> + ((%%Phrases1L=1-> + %%split_on_phrases4(E12,E13,Text5,Text6, + %%String21,Phrase22,EofC,Title,3) + %%; + split_on_phrases41(E12,E13,Text5,Text6, + String21,Phrase22,String31,Title,3,Phrase2aa), + writeln("31") + )-> + ( + ((Phrases1L=1-> + ((split_on_phrases4(E13,E14,Text6,Phrases4, + String31,Phrase22,EofC,Title,4), + writeln("eof"))-> + true;(writeln("Error: No eof."),abort) + ) + ; + (%%writeln("here"), + %%writeln(split_on_phrases4(E13,E14,Text6,Phrases4, + %%String31,Phrase22,Phrase2aa,Title,4)), 
+ %%( + (split_on_phrases42(E13,E14,Text6,Phrases4, + String31,Phrase22,_Empty,Title,4,Phrase2aa), + writeln("-"))%%-> + %%true;(E13=E14,Text6=Phrases4) + ) + ) + ) + ) + ; + (Phrases1L=1-> + (split_on_phrases4(E12,E14,Text5,Phrases4, + String21,Phrase22,EofC,Title,3), + writeln("eof")) + ; + (split_on_phrases41(E12,E14,Text5,Phrases4, + String21,Phrase22,String31,Title,3,Phrase2aa),%%-> + %%true;(E12=E14,Text5=Phrases4) + writeln("31")) + )) + ; + (Phrases1L=1-> + (split_on_phrases4(E11,E14,Text4,Phrases4, + String11,Phrase22,EofC,Title,2), + writeln("eof")) + ; + (split_on_phrases41(E11,E14,Text4,Phrases4, + String11,Phrase22,String21,Title,2,Phrase2aa),%%-> + %%true;(E11=E14,Text4=Phrases4) + writeln("21")) + )) + ; + (Phrases1L=1-> + (split_on_phrases4(Text1,E14,Text2,Phrases4, + String1,Phrase22,EofC,Title,1), + writeln("eof")) + ; + (split_on_phrases41(Text1,E14,Text2,Phrases4, + String1,Phrase22,String11,Title,1,Phrase2aa),%%-> + %%true;(Text1=E14,Text2=Phrases4) + writeln("11")) + ) + ). + %%append_list([Text4,Text5,Text6,Text7],Phrases4). + +split_on_phrases4(Text1,E11,Text2,Text4,Phrase2,Phrase22,Phrase2a,Title,N) :- + split_on_phrases4a(Text1,_C1,D1,Phrase2,_E1), + %%string_codes(C,C1), + %%string_codes(D,D1), + string_codes(Phrase20,Phrase2), + string_codes(Phrase202,Phrase22), + split_on_phrases4b(D1,Phrase2a,E11,C11,_), + string_codes(C110,C11), + concat_list([Phrase20,C110],C12), + string_codes(C10,C12), + number_string(N,NS), + concat_list([Title," ","by Lucian Green"," ",Phrase202," ",NS," of 4.txt"],E101), + concat_list([Title,"\n","by Lucian Green","\n",Phrase202," ",NS," of 4\n\n",C10],C122), + append(Text2,[[E101,C122]],Text4). + +split_on_phrases4a(Text1,C1,D1,Phrase2,E1) :- + append(C1,E1,Text1), + append(Phrase2,D1,E1). +split_on_phrases4b(D1,Phrase2a,E11,C11,D10) :- + append(C11,E11,D1), + %%writeln1(append(Phrase2a,D10,E11)), + append(Phrase2a,D10,E11). 
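+
+/**
+split_on_phrases4a/5 and split_on_phrases4b/5 split a list of
+character codes at the first occurrence of a phrase using append/3:
+the codes before the phrase and the codes after it are returned
+separately. A hypothetical example:
+
+?- split_on_phrases4a(`abXcd`,C1,D1,`X`,_E1),
+   atom_codes(Before,C1),atom_codes(After,D1).
+Before = ab,
+After = cd
+**/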
+ +split_on_phrases41(Text1,E11,Text2,Text4,Phrase2,Phrase22,Phrase2a,Title,N,Phrase2aa) :- + +string_codes(Text1z,Text1), +%%*string_codes(E11z,E11), +%%string_codes(Text2z,Text2), +%%*string_codes(Text4z,Text4), +string_codes(Phrase2z,Phrase2), +string_codes(Phrase22z,Phrase22), +string_codes(Phrase2az,Phrase2a), +string_codes(Titlez,Title), +%%*string_codes(Nz,N), +string_codes(Phrase2aaz,Phrase2aa), +%%writeln1(split_on_phrases41(Text1z,_E11z,Text2,_Text4z,Phrase2z,Phrase22z,Phrase2az,Titlez,N,Phrase2aaz)), + + +%%string_codes(Text1z,Text1),writeln([text1,Text1z]), + %%writeln(split_on_phrases4b(Text1,Phrase2aa,E11x,Text1a,D100)), + split_on_phrases4b(Text1,Phrase2aa,E11x,Text1a,D100), + +string_codes(Text1z,Text1), +string_codes(Phrase2aaz,Phrase2aa), +string_codes(E11xz,E11x), +string_codes(Text1az,Text1a), +string_codes(D100z,D100), + +%%writeln1( split_on_phrases4b(Text1z,Phrase2aaz,E11xz,Text1az,D100z)), + +%%writeln1( split_on_phrases4a(Text1a,_C1,D1,Phrase2,_E1)), + split_on_phrases4a(Text1a,C1,D1,Phrase2,E1), + +string_codes(C1z,C1), +string_codes(D1z,D1), +string_codes(E1z,E1), + %%writeln(split_on_phrases4a(Text1a,C1z,D1z,Phrase2z,E1z)), + + %%string_codes(C,C1), + %%string_codes(D,D1), + %%string_codes(Phrase20,Phrase2), + string_codes(Phrase202,Phrase22), + + %%writeln1( split_on_phrases4b(Text1a,Phrase2a,_,_,_)), + (split_on_phrases4b(Text1a,Phrase2a,_E11q,_C11q,_D10q)-> %% D1->Text1a + + ((split_on_phrases4b(Text1,Phrase22,_,_,C11r)->true; + C11r=Text1), %% old + split_on_phrases4b(C11r,Phrase2a,E11,C11,_D102)); + %%split_on_phrases4b(Text1a,Phrase2,_E11q,_C11q,_D10q)); + (C11=Text1a,E11=E11x)), %% old + +%%writeln1(split_on_phrases4b(D1,Phrase2aa,_F112,G11)), + %%split_on_phrases4b(D10,Phrase2aa,_F11,G11,H), + %%()split_on_phrases4b(D1,Phrase2a,E11,C11), + + %%split_on_phrases4b(G11,Phrase2a,E11,C11), + string_codes(C110,C11), + concat_list([%%Phrase20, + C110%%,E11 + ],C12), + string_codes(C10,C12), + number_string(N,NS), + concat_list([Title," 
","by Lucian Green"," ",Phrase202," ",NS," of 4.txt"],E101), + concat_list([Title,"\n","by Lucian Green","\n",Phrase202," ",NS," of 4\n\n",C10],C122), + append(Text2,[[E101,C122]],Text4). + +split_on_phrases42(Text1,E11,Text2,Text4,_Phrase2,Phrase22,_Phrase2a,Title,N,Phrase2aa) :- + +string_codes(Text1z,Text1), +%%string_codes(E11z,E11), +%%string_codes(Text2z,Text2), +%%string_codes(Text4z,Text4), +%%string_codes(Phrase2z,Phrase2), +%%string_codes(Phrase22z,Phrase22), +%%string_codes(Phrase2az,Phrase2a), +%%string_codes(Titlez,Title), +%%string_codes(Nz,N), +string_codes(Phrase2aaz,Phrase2aa), +%%writeln1(split_on_phrases41(Text1z,E11z,Text2z,Text4z,Phrase2z,Phrase22z,Phrase2az,Titlez,N,Phrase2aaz)), +%%string_codes(Text1z,Text1),%%writeln([text1,Text1z]), + split_on_phrases4b(Text1,Phrase2aa,E11,C11,E11x), + string_codes(E11z,E11), +string_codes(C11z,C11), +string_codes(E11xz,E11x), +%%***writeln1(split_on_phrases4b_42here(Text1z,Phrase2aaz,E11z,C11z,E11xz)), +(C11=[]->Text4=Text2;( +%%string_codes(Text1z,Text1), +%%string_codes(Phrase2aaz,Phrase2aa), +%%string_codes(Text1az,Text1a), +%%string_codes(D100z,D100), + +%%writeln1( split_on_phrases4b(Text1z,Phrase2aaz,E11z,Text1az,D100z)), + +%%writeln1( split_on_phrases4a(Text1a,_C1,D1,Phrase2,_E1)), + %%*split_on_phrases4a(Text1a,_C1,_D1,Phrase2,_E1), + %%string_codes(C,C1), + %%string_codes(D,D1), + %%string_codes(Phrase20,Phrase2), + string_codes(Phrase202,Phrase22), + + %%*writeln1( split_on_phrases4b(Text1a,Phrase2a,E11,C11,_D10)), + %%*(split_on_phrases4b(Text1a,Phrase2a,_E11q,_C11q,_D10q)-> %% D1->Text1a +/** + ((split_on_phrases4b(Text1,Phrase22,_,_,C11r)->true; + C11r=Text1), %% old + split_on_phrases4b(C11r,Phrase2a,E11,C11,_D102)); + %%split_on_phrases4b(Text1a,Phrase2,_E11q,_C11q,_D10q)); + (C11=Text1a,E11=E11x)), %% old +**/ +%%writeln1(split_on_phrases4b(D1,Phrase2aa,_F112,G11)), + %%split_on_phrases4b(D10,Phrase2aa,_F11,G11,H), + %%()split_on_phrases4b(D1,Phrase2a,E11,C11), + + 
%%split_on_phrases4b(G11,Phrase2a,E11,C11), + string_codes(C110,C11), + concat_list([%%Phrase20, + C110%%,E11 + ],C12), + string_codes(C10,C12), + number_string(N,NS), + concat_list([Title," ","by Lucian Green"," ",Phrase202," ",NS," of 4.txt"],E101), + concat_list([Title,"\n","by Lucian Green","\n",Phrase202," ",NS," of 4\n\n",C10],C122), + append(Text2,[[E101,C122]],Text4))). + %%writeln([e11,E11]). + +write_files(Text1) :- + findall(_,(member(Text2,Text1),%%member(Text2,Text3),%%member(Text2,Text4), + Text2=[Title1,Contents1],string_codes(Title2,Title1),string_codes(Contents2,Contents1), + write_file(Title2,Contents2)),_). + +write_file(File,String) :- + open_s(File,write,Stream), + write(Stream,String), + close(Stream). + +string(String) --> list(String). + +list([]) --> []. +list([L|Ls]) --> [L], list(Ls). diff --git a/Split-on-Phrases/t2aiff.sh b/Split-on-Phrases/t2aiff.sh new file mode 100644 index 0000000000000000000000000000000000000000..33da3e544a736755fca78a23dc11c4f9cd877745 --- /dev/null +++ b/Split-on-Phrases/t2aiff.sh @@ -0,0 +1,4 @@ +#!/bin/bash +for filename in Data/*.txt; do + Say -v Fred -o "Data/$(basename "$filename" .txt).aiff" -f "$filename" +done \ No newline at end of file diff --git a/State-Machine-to-List-Prolog/.DS_Store b/State-Machine-to-List-Prolog/.DS_Store new file mode 100644 index 0000000000000000000000000000000000000000..bfdba582b92992d6d1f244dee138d2c86c4dcc98 Binary files /dev/null and b/State-Machine-to-List-Prolog/.DS_Store differ diff --git a/State-Machine-to-List-Prolog/LICENSE b/State-Machine-to-List-Prolog/LICENSE new file mode 100644 index 0000000000000000000000000000000000000000..cbee26b2a100ccce61cfcd80f0ab2580494d4795 --- /dev/null +++ b/State-Machine-to-List-Prolog/LICENSE @@ -0,0 +1,28 @@ +BSD 3-Clause License + +Copyright (c) 2022, Lucian Green + +Redistribution and use in source and binary forms, with or without +modification, are permitted provided that the following conditions are met: + +1. 
Redistributions of source code must retain the above copyright notice, this + list of conditions and the following disclaimer. + +2. Redistributions in binary form must reproduce the above copyright notice, + this list of conditions and the following disclaimer in the documentation + and/or other materials provided with the distribution. + +3. Neither the name of the copyright holder nor the names of its + contributors may be used to endorse or promote products derived from + this software without specific prior written permission. + +THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" +AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE +IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE +DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE +FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL +DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR +SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER +CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, +OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE +OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. diff --git a/State-Machine-to-List-Prolog/README.md b/State-Machine-to-List-Prolog/README.md new file mode 100644 index 0000000000000000000000000000000000000000..b70249e95c17cc87000cceac2dab4c84df26628f --- /dev/null +++ b/State-Machine-to-List-Prolog/README.md @@ -0,0 +1,110 @@ +# State Machine to List Prolog + +* Convert State Saving Interpreter State Machines to List Prolog algorithms. + + +# Getting Started + +Please read the following instructions on how to install the project on your computer for converting algorithms. + +# Prerequisites + +* Please download and install SWI-Prolog for your machine at `https://www.swi-prolog.org/build/`. 
+ +* You may need to install gawk using Homebrew. + +* Install Translation Shell on Mac, etc. +Change line in +``` +culturaltranslationtool/ctt2.pl +trans_location("../../../gawk/trans"). +``` +to correct location of trans. + +# 1. Install manually + +* Download: +* this repository. +* State Saving Interpreter. (Note: see SSI's page for additional repositories needed.) + +# 2. Or Install from List Prolog Package Manager (LPPM) + +* Download the LPPM Repository: + +``` +mkdir GitHub +cd GitHub/ +git clone https://github.com/luciangreen/List-Prolog-Package-Manager.git +cd List-Prolog-Package-Manager +swipl +['lppm']. +lppm_install("luciangreen","State-Machine-to-List-Prolog"). +halt +``` + +# Running + +* In Shell: +`cd State-Machine-to-List-Prolog` +`swipl` +`['sm_to_lp.pl'].` + +* For example, convert a List Prolog test algorithm to SM form and back to LP form and test they have the same result. (Replace `3` in `numbers(3,1,[],A)` with the maximum test number below.) + +``` +?- numbers(10,1,[],A),findall(B,(Debug=off,member(B,A),test(B,Q,F,R),query_box(Q,Query1,F,Functions1),convert_to_grammar_part1(Functions1,[],Functions2,_),add_line_numbers_to_algorithm1(Functions2,Functions2a),find_pred_numbers(Functions2a,[],Pred_numbers),find_state_machine1(Functions2a,Functions3,Pred_numbers),sm_to_lp(Functions3,Functions2b),lucianpl(Debug,Q,F,R1),Functions2b=[[_|_]|Functions2c],lucianpl(off,Q,Functions2c,R2),(R1=R2->Result=success;Result=fail),writeln([B,Result])),C),sort(C,C1),writeln(C1). + +[[1,success],[2,success],[3,success],[4,success],[5,success],[6,success],[7,success],[8,success],[9,success],[10,success]] +``` + +* The original and converted algorithm have the same results, showing the converter has worked. 
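+
+* The round trip for a single test number can also be run step by step (a sketch using the same conversion predicates as the query above):
+
+```
+?- test(1,Q,F,R),
+query_box(Q,_Query1,F,Functions1),
+convert_to_grammar_part1(Functions1,[],Functions2,_),
+add_line_numbers_to_algorithm1(Functions2,Functions2a),
+find_pred_numbers(Functions2a,[],Pred_numbers),
+find_state_machine1(Functions2a,Functions3,Pred_numbers),
+sm_to_lp(Functions3,Functions2b).
+```
+
+Here `Functions2b` is the algorithm converted back to List Prolog form.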
+ +* To pretty print the converted List Prolog for a test number: + +``` +A=[250],findall(B,(member(B,A),test(B,Q,F,R),query_box(Q,Query1,F,Functions1),convert_to_grammar_part1(Functions1,[],Functions2,_),add_line_numbers_to_algorithm1(Functions2,Functions2a),find_pred_numbers(Functions2a,[],Pred_numbers),find_state_machine1(Functions2a,Functions3,Pred_numbers),sm_to_lp(Functions3,Functions2b),pp0(Functions2,Functions21),writeln(Functions21),pp0(Functions2b,Functions2b1),writeln(Functions2b1)),C). + +[ +[[n,query_box_1],[[v,b]],":-", +[ + [[n,a],[[v,b]]] +]], +[[n,a],[[v,b]],":-", +[ + [[n,"->"], + [ + [ + [[n,c],[[v,c]]] + ], + + [ + [[n,c],[[v,c]]] + ], + + [[n,"->"], + [ + [ + [[n,c],[[v,c]]] + ], + + [ + [[n,c],[[v,c]]] + ], + + [ + [[n,c],[[v,c]]] + ] + ]] + ]] +]] +] +``` + +# Authors + +Lucian Green - Initial programmer - Lucian Academy + +# License + +I licensed this project under the BSD3 License - see the LICENSE.md file for details + diff --git a/State-Machine-to-List-Prolog/file.txt b/State-Machine-to-List-Prolog/file.txt new file mode 100644 index 0000000000000000000000000000000000000000..a4689b2db1dea7f0796358691e489cbe73a7fd78 --- /dev/null +++ b/State-Machine-to-List-Prolog/file.txt @@ -0,0 +1 @@ +["Name","61"]. \ No newline at end of file diff --git a/State-Machine-to-List-Prolog/sm_to_lp.pl b/State-Machine-to-List-Prolog/sm_to_lp.pl new file mode 100644 index 0000000000000000000000000000000000000000..c93739a1d63dccdbaeb336102fa7e54ab844cccb --- /dev/null +++ b/State-Machine-to-List-Prolog/sm_to_lp.pl @@ -0,0 +1,853 @@ +:-include('../SSI/ssi.pl'). + +sm_to_lp(Algorithm1,Algorithm2) :- + sm_to_lp(Algorithm1,[],Algorithm2),!. + +sm_to_lp([],Algorithm,Algorithm) :- !. 
+sm_to_lp(Algorithm1,Algorithm2,Algorithm3) :- + Algorithm1=[Function1|Functions], + (Function1=[_Number,Name,Arguments1,Symbol1,Body1] + ->%symbol(Symbol1,Symbol2), + ( + get_up_to_next_chunk(Body1,%Body2, + [],Body2), + (false->%Arguments1=[]-> + append(Algorithm2,[[%Number, + Name,Symbol1,Body2]],Algorithm4); + append(Algorithm2,[[%Number, + Name,Arguments1,Symbol1,Body2]],Algorithm4) + + ) + ) + /*; + (trace,Function1=[_Number,Name|Arguments1], + + %get_up_to_next_chunk(Body1,%Body2, + %[],Body2), + (Arguments1=[]-> + append(Algorithm2,[[%Number, + Name]],Algorithm4); + append(Algorithm2,[[%Number, + Name,Arguments1]],Algorithm4) + ) + ) + */), + %writeln1([Name,Arguments1,Symbol1,Body2]), +%catch(call_with_time_limit(2,sm_to_lp(Functions,Algorithm4,Algorithm3)),_,false). + + sm_to_lp(Functions,Algorithm4,Algorithm3). + + + %gets next chunk (section) including up to a chunk, then goes over it and searches for more chunks * + +% get_up_to_next_chunk(B,[],Commands). + +get_up_to_next_chunk([],C,C) :- !. 
+get_up_to_next_chunk(B,C1,C2) :- + + get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, + get_lang_word("v",Dbw_v1),Dbw_v1=Dbw_v, +get_lang_word("on_true",Dbw_on_true1),Dbw_on_true1=Dbw_on_true, +get_lang_word("on_false",Dbw_on_false1),Dbw_on_false1=Dbw_on_false, +get_lang_word("go_after",Dbw_go_after1),Dbw_go_after1=Dbw_go_after,get_lang_word("go_to_predicates",Dbw_go_to_predicates1),Dbw_go_to_predicates1=Dbw_go_to_predicates, +get_lang_word("not",Dbw_not1),Dbw_not1=Dbw_not, +get_lang_word("or",Dbw_or1),Dbw_or1=Dbw_or, +get_lang_word("findall",Dbw_findall1),Dbw_findall1=Dbw_findall, + + + append(A,C,B), + +C=[C111|D], + +C111=[_Number,[Dbw_on_true,_Statements1_number],[Dbw_go_after,_Statements2_number],[Dbw_on_false,_Return_line_false],[Dbw_go_to_predicates,_Predicates],[Dbw_n_or_v1,F]|_Arguments], +(Dbw_n_or_v1=Dbw_n->true;Dbw_n_or_v1=Dbw_v), +%trace, +( +(F="[]"->true;(F=Dbw_not->true;(F=Dbw_or->true;(F="->"->true;(F=Dbw_findall))))) +-> +( +convert_chunk(A,E), +append(C1,E,C3), +%* append E +%trace, +append(C,[],CD), +get_chunks(CD,[],C22), +append(C3,C22,C2) +) +; +( +%trace, +%not(( +%(F="[]"->true;(F=Dbw_not->true;(F=Dbw_or->true;(F="->"->true;(F=Dbw_findall))))))), + +%trace, +/*convert_chunk(B,E), +*/ +%trace, +append(A,[C111],AC), +convert_chunk(AC,E), +append(C1,E,C3), +%* append E +%trace, +%append(C,[],CD), +get_up_to_next_chunk(D,[],C22), +append(C3,C22,C2) + +)) +,! +. + +/* +get_up_to_next_chunk2([],C,C) :- !. 
+get_up_to_next_chunk2(B,C1,C2) :- + + get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, + get_lang_word("v",Dbw_v1),Dbw_v1=Dbw_v, +get_lang_word("on_true",Dbw_on_true1),Dbw_on_true1=Dbw_on_true, +get_lang_word("on_false",Dbw_on_false1),Dbw_on_false1=Dbw_on_false, +get_lang_word("go_after",Dbw_go_after1),Dbw_go_after1=Dbw_go_after,get_lang_word("go_to_predicates",Dbw_go_to_predicates1),Dbw_go_to_predicates1=Dbw_go_to_predicates, +get_lang_word("not",Dbw_not1),Dbw_not1=Dbw_not, +get_lang_word("or",Dbw_or1),Dbw_or1=Dbw_or, +get_lang_word("findall",Dbw_findall1),Dbw_findall1=Dbw_findall, + + + %append(A,C,B), + +%C=[C111|D], + +%C111=[_Number,[Dbw_on_true,_Statements1_number],[Dbw_go_after,_Statements2_number],[Dbw_on_false,_Return_line_false],[Dbw_go_to_predicates,_Predicates],[Dbw_n_or_v1,F]|_Arguments], +%(Dbw_n_or_v1=Dbw_n->true;Dbw_n_or_v1=Dbw_v), +%trace, + +( +forall(member([_Number,[Dbw_on_true,_Statements1_number],[Dbw_go_after,_Statements2_number],[Dbw_on_false,_Return_line_false],[Dbw_go_to_predicates,_Predicates],[Dbw_n_or_v1,F]|_Arguments],B),not((F="[]"->true;(F=Dbw_not->true;(F=Dbw_or->true;(F="->"->true;(F=Dbw_findall))))))) + +-> + + +(convert_chunk(B,E), +append(C1,E,C2)) +; + +get_up_to_next_chunk(B,C1,C2)) +,! +. +*/ + + +get_chunks([],C,C) :- !. 
+ +get_chunks(CD,C1,C2) :- + + get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, + get_lang_word("v",Dbw_v1),Dbw_v1=Dbw_v, +get_lang_word("on_true",Dbw_on_true1),Dbw_on_true1=Dbw_on_true, +get_lang_word("on_false",Dbw_on_false1),Dbw_on_false1=Dbw_on_false, +get_lang_word("go_after",Dbw_go_after1),Dbw_go_after1=Dbw_go_after,get_lang_word("go_to_predicates",Dbw_go_to_predicates1),Dbw_go_to_predicates1=Dbw_go_to_predicates, + %get_lang_word("not",Dbw_not1),Dbw_not1=Dbw_not, +%get_lang_word("or",Dbw_or1),Dbw_or1=Dbw_or, +%get_lang_word("findall",Dbw_findall1),Dbw_findall1=Dbw_findall, + +CD=[[Number,[Dbw_on_true,_Statements1_number],[Dbw_go_after,_Statements2_number],[Dbw_on_false,_Return_line_false],[Dbw_go_to_predicates,_Predicates],[Dbw_n_or_v1,F]|Arguments]|D], + +(Dbw_n_or_v1=Dbw_n->true;Dbw_n_or_v1=Dbw_v), +%F="[]", +%trace, +%(F=findall->trace;true), + + get(F,Number,D,E,[],C4,Arguments,wrap), + + +append(C1,C4,C6), +%trace, +get_up_to_next_chunk(E,[],C5), +append(C6,C5,C2) +,! +. + + + +convert_chunk([],[]) :- !. +convert_chunk(A,B) :- + + get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, + get_lang_word("v",Dbw_v1),Dbw_v1=Dbw_v, +get_lang_word("on_true",Dbw_on_true1),Dbw_on_true1=Dbw_on_true, +get_lang_word("on_false",Dbw_on_false1),Dbw_on_false1=Dbw_on_false, +get_lang_word("go_after",Dbw_go_after1),Dbw_go_after1=Dbw_go_after,get_lang_word("go_to_predicates",Dbw_go_to_predicates1),Dbw_go_to_predicates1=Dbw_go_to_predicates, + +findall(A1,(member([_Number,[Dbw_on_true,_Statements1_number],[Dbw_go_after,_Statements2_number],[Dbw_on_false,_Return_line_false],[Dbw_go_to_predicates,_Predicates],[Dbw_n_or_v,F],Arguments],A),(Dbw_n_or_v=Dbw_n->true;Dbw_n_or_v=Dbw_v), +%trace, +(Arguments=[]->A1=[[Dbw_n_or_v,F]];A1=[[Dbw_n_or_v,F],Arguments]) +),B). 
+ + + +get(F,Number,D,E2,C1,C4,[[Argument1,Argument2]],Wrap) :- + + get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("not",Dbw_not1),Dbw_not1=Dbw_not, +get_lang_word("or",Dbw_or1),Dbw_or1=Dbw_or, +get_lang_word("findall",Dbw_findall1),Dbw_findall1=Dbw_findall, + +(F="[]"->true;(F=Dbw_not->true;(F=Dbw_findall))), + +%trace, + +(Wrap=nowrap->append([_],D1,D);D1=D), + + %append(A,C,D1), + +%(false%Wrap=nowrap +%->not(A=[]);true), +%trace, +get_last_line(Number,D1,C111,A,E,Number2,F2,C,Arguments2), + +%reverse(C,CR), +% CR=[C111|_], + %*C111=[Number2,[Dbw_on_true,A3],[Dbw_go_after,B],[Dbw_on_false,_],[Dbw_go_to_predicates,_],[Dbw_n_or_v1,F2]|Arguments2], + +%(Dbw_n_or_v1=Dbw_n->true;Dbw_n_or_v1=Dbw_v), + + +%(A3=[Fail_or_exit,Number]->true;B=[Fail_or_exit,Number]), +%(Fail_or_exit=exit_function->true;Fail_or_exit=fail_function), + +%trace, +%append(Ay,C11y,C), +%C=[C111|E], +%C11y=[C111|E], +%append(A,Ay,Az), +/* +%convert_chunk +%trace, +append(A,[C111],AC), +get_up_to_next_chunk(AC,[],C31), +*/ + + +/* +get_lang_word("not",Dbw_not1),Dbw_not1=Dbw_not, +get_lang_word("or",Dbw_or1),Dbw_or1=Dbw_or, +get_lang_word("findall",Dbw_findall1),Dbw_findall1=Dbw_findall, +*/ + +%trace, + +((F2="[]"->true;(F2=Dbw_not->true;(F2=Dbw_or->true;(F2="->"->true;(F2=Dbw_findall)))))-> + + + (%append(A,[C111],AC), + %trace, + get_up_to_next_chunk(A,[],Ax1), + get(F2,Number2,C%AC%E E* + ,E2,[],C32,Arguments2,nowrap), + append(Ax1 + ,C32,C31)); + +( +%convert_chunk +%trace, +append(A,[C111],AC), +%trace, +get_up_to_next_chunk(AC,[],C31), +E=E2 +)), + + +((F="[]", +(true%Wrap=wrap +-> +C3=[C31]; +C3= +C31 +))->true; + +((F=Dbw_not, +(true%Wrap=wrap +-> +C3=[[[Dbw_n,F],C31]]; +C3=[[Dbw_n,F],C31]) +)->true; + +(F=Dbw_findall, +foldr(append,[[Argument1],C31,[Argument2]],Arguments3), +(true%Wrap=wrap +-> +C3=[[[Dbw_n,F],Arguments3]]; +C3=[[Dbw_n,F],Arguments3]) +))) + + +, % "[]" + +append(C1,C3,C4). 
+ +/* +get(Dbw_not,Number,D,E2,_C1,C3,_,Wrap) :- + +F=Dbw_not, + + + get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, + get_lang_word("v",Dbw_v1),Dbw_v1=Dbw_v, +get_lang_word("on_true",Dbw_on_true1),Dbw_on_true1=Dbw_on_true, +get_lang_word("on_false",Dbw_on_false1),Dbw_on_false1=Dbw_on_false, +get_lang_word("go_after",Dbw_go_after1),Dbw_go_after1=Dbw_go_after,get_lang_word("go_to_predicates",Dbw_go_to_predicates1),Dbw_go_to_predicates1=Dbw_go_to_predicates, +*/ +/* + get_lang_word("not",Dbw_not1),Dbw_not1=Dbw_not, + +%trace, +(Wrap=nowrap->append([_],D1,D);D1=D), + + append(A,C,D1), +(false%Wrap=nowrap +->not(A=[]);true), + + C=[C112|E], + +C112=[Number2,[Dbw_on_true,A3],[Dbw_go_after,A4],[Dbw_on_false,_%[fail_function,Number] +],[Dbw_go_to_predicates,_],[Dbw_n_or_v1,F2]|Arguments2], + +(Dbw_n_or_v1=Dbw_n->true;Dbw_n_or_v1=Dbw_v), + +(A3=[Fail_or_exit,Number]->true;A4=[Fail_or_exit,Number]), + +(Fail_or_exit=exit_function->true;Fail_or_exit=fail_function), + +%convert_chunk +%trace, + + + +get_lang_word("not",Dbw_not1),Dbw_not1=Dbw_not, +get_lang_word("or",Dbw_or1),Dbw_or1=Dbw_or, +get_lang_word("findall",Dbw_findall1),Dbw_findall1=Dbw_findall, + +%trace, + +((F2="[]"->true;(F2=Dbw_not->true;(F2=Dbw_or->true;(F2="->"->true;(F2=Dbw_findall)))))-> + + + (%append(A,[C111],AC), + %trace, + get_up_to_next_chunk(A,[],Ax1), + + get(F2,Number2,C%AC%E E* + ,E2,[],C32,Arguments2,nowrap), + append(Ax1 + ,C32,C31)); + +( +%convert_chunk +%trace, +append(A,[C112],AC), +%trace, +get_up_to_next_chunk(AC,[],C31), +E=E2 +)), + +(true%Wrap=wrap +-> +C3=[[[Dbw_n,F],C31]]; +C3=[[Dbw_n,F],C31]). 
+*/ + + +%->,3 +get(F,Number,D,E5,_C1,C3,_,Wrap) :- + + get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("not",Dbw_not1),Dbw_not1=Dbw_not, +get_lang_word("or",Dbw_or1),Dbw_or1=Dbw_or, +get_lang_word("findall",Dbw_findall1),Dbw_findall1=Dbw_findall, + +F="->", +%trace, +/* get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, + get_lang_word("v",Dbw_v1),Dbw_v1=Dbw_v, +get_lang_word("on_true",Dbw_on_true1),Dbw_on_true1=Dbw_on_true, +get_lang_word("on_false",Dbw_on_false1),Dbw_on_false1=Dbw_on_false, +get_lang_word("go_after",Dbw_go_after1),Dbw_go_after1=Dbw_go_after,get_lang_word("go_to_predicates",Dbw_go_to_predicates1),Dbw_go_to_predicates1=Dbw_go_to_predicates, +*/ +%trace, +(Wrap=nowrap->append([_],D1,D);D1=D), + +get_last_line(Number,D1,C111,A,E,Number2,F2,C,Arguments2), + +% append(A,C,D1), +%(false%Wrap=nowrap +%->not(A=[]);true), + + %C=[C111|E], + +%trace, + %C111=[Number2,[Dbw_on_true,_N1],[Dbw_go_after,_],[Dbw_on_false,_Number3],[Dbw_go_to_predicates,_],[Dbw_n_or_v1,F2]|Arguments2], + +%(Dbw_n_or_v1=Dbw_n->true;Dbw_n_or_v1=Dbw_v), + +%trace, +%get_lang_word("not",Dbw_not1),Dbw_not1=Dbw_not, +%get_lang_word("or",Dbw_or1),Dbw_or1=Dbw_or, +%get_lang_word("findall",Dbw_findall1),Dbw_findall1=Dbw_findall, + +%trace, + +((F2="[]"->true;(F2=Dbw_not->true;(F2=Dbw_or->true;(F2="->"->true;(F2=Dbw_findall)))))-> + + + (%append(A,[C111],AC), + %trace, + get_up_to_next_chunk(A,[],Ax1), + get(F2,Number2,C%AC%E E* + ,E2,[],C32,Arguments2,nowrap), + append(Ax1 + ,C32,C31)); + +( +%convert_chunk +%trace, +append(A,[C111],AC), +%trace, +get_up_to_next_chunk(AC,[],C31), +E=E2 +)), + +%trace, +%(Wrap=nowrap->append([_],E21,E2);E21=E2), + +get_last_line(Number,E2,C112,A1,E1x,Number31,F3,C11,Arguments3), + + %append(A1,C11,E2), + + %C11=[C112|E1x], + + %trace, + %C112=[Number31,[Dbw_on_true,A3],[Dbw_go_after,B],[Dbw_on_false,_Number4],[Dbw_go_to_predicates,_],[Dbw_n_or_v2,F3]|Arguments3], + +%(Dbw_n_or_v2=Dbw_n->true;Dbw_n_or_v2=Dbw_v), + 
+%(A3=[Fail_or_exit,Number]->true;B=[Fail_or_exit,Number]), + + %(Fail_or_exit=exit_function->true;Fail_or_exit=fail_function), + +((F3="[]"->true;(F3=Dbw_not->true;(F3=Dbw_or->true;(F3="->"->true;(F3=Dbw_findall)))))-> + + + (%append(A1,[C112],AC1), + get_up_to_next_chunk(A1,[],Ax), + get(F3,Number31,%E1% + C11%,AC1%E* + ,E3,[]%C1 + ,C311x,Arguments3,nowrap), + append(Ax,C311x,C311)); + +( + +%convert_chunk +%trace, +append(A1,[C112],AC1), +get_up_to_next_chunk(AC1,[],C311) +,E1x=E3 +)), + +%(Wrap=nowrap->append([_],E31,E3);E31=E3), + +get_last_line(Number,E3,C1131,A13,E4,Number41,F33,C113,Arguments33), + + %append(A13,C113,E3), + + %C113=[C1131|E4], + + %trace, + %C1131=[Number41,[Dbw_on_true,A33],[Dbw_go_after,B3],[Dbw_on_false,_Number43],[Dbw_go_to_predicates,_],[Dbw_n_or_v3,F33]|Arguments33], + +%(Dbw_n_or_v3=Dbw_n->true;Dbw_n_or_v3=Dbw_v), + +%(A33=[Fail_or_exit2,Number]->true;B3=[Fail_or_exit2,Number]), + + %(Fail_or_exit2=exit_function->true;Fail_or_exit2=fail_function), + +((F33="[]"->true;(F33=Dbw_not->true;(F33=Dbw_or->true;(F33="->"->true;(F33=Dbw_findall)))))-> + + + (%append(A1,[C112],AC1), + get_up_to_next_chunk(A13,[],A13x), + get(F33,Number41,%E1% + C113%,AC1%E* + ,E5,[]%C1 + ,C3131,Arguments33,nowrap), + append(A13x,C3131,C313) + % eliminate ->,2 + %here1 + ); + +( + +%convert_chunk +%trace, +append(A13,[C1131],AC13), +get_up_to_next_chunk(AC13,[],C313) +,E4=E5 +)), + +%trace, +foldr(append,[C31,C311,C313],C314), + +(true%Wrap=wrap +-> +C3=[[[Dbw_n,F],C314]]; +C3=[[Dbw_n,F],C314]). 
+ + +get(F,Number,D,E3,_C1,C3,_,Wrap) :- + + get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("not",Dbw_not1),Dbw_not1=Dbw_not, +get_lang_word("or",Dbw_or1),Dbw_or1=Dbw_or, +get_lang_word("findall",Dbw_findall1),Dbw_findall1=Dbw_findall, + +(F=Dbw_or->true;F="->"), % ->,2 + +/* + get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, + get_lang_word("v",Dbw_v1),Dbw_v1=Dbw_v, +get_lang_word("on_true",Dbw_on_true1),Dbw_on_true1=Dbw_on_true, +get_lang_word("on_false",Dbw_on_false1),Dbw_on_false1=Dbw_on_false, +get_lang_word("go_after",Dbw_go_after1),Dbw_go_after1=Dbw_go_after,get_lang_word("go_to_predicates",Dbw_go_to_predicates1),Dbw_go_to_predicates1=Dbw_go_to_predicates, +*/ + +%trace, +(Wrap=nowrap->append([_],D1,D);D1=D), + +get_last_line(Number,D1,C111,A,E,Number2,F2,C,Arguments2), + +%append(A,C,D1), +%(false%Wrap=nowrap +%->not(A=[]);true), + + %C=[C111|E], + + %C111=[Number2,[Dbw_on_true,[Fail_or_exit,Number]],[Dbw_go_after,_],[Dbw_on_false,_Number3],[Dbw_go_to_predicates,_],[Dbw_n_or_v1,F2]|Arguments2], + + %(Fail_or_exit=exit_function->true;Fail_or_exit=fail_function), + + +%(Dbw_n_or_v1=Dbw_n->true;Dbw_n_or_v1=Dbw_v), + + + +/* +%convert_chunk +%trace, +append(A,[C111],AC), +%trace, +get_up_to_next_chunk(AC,[],C31), + + + append(A1,C11,E), + + C11=[C112|E1], + + C112=[_Number31,[Dbw_on_true,[exit_function,Number]],[Dbw_go_after,_],[Dbw_on_false,_Number4],[Dbw_go_to_predicates,_],[Dbw_n,_F3]|_Arguments3], + +%convert_chunk +%trace, +append(A1,[C112],AC1), +get_up_to_next_chunk(AC1,[],C311), +*/ + + + + + +/* +get_lang_word("not",Dbw_not1),Dbw_not1=Dbw_not, +get_lang_word("or",Dbw_or1),Dbw_or1=Dbw_or, +get_lang_word("findall",Dbw_findall1),Dbw_findall1=Dbw_findall, +*/ +%trace, + +((F2="[]"->true;(F2=Dbw_not->true;(F2=Dbw_or->true;(F2="->"->true;(F2=Dbw_findall)))))-> + + + (%append(A,[C111],AC), + %trace, + get_up_to_next_chunk(A,[],Ax1), + get(F2,Number2,C%AC%E E* + ,E2,[],C32,Arguments2,nowrap), + append(Ax1 + ,C32,C31)); + +( +%convert_chunk +%trace, 
+append(A,[C111],AC), +%trace, +get_up_to_next_chunk(AC,[],C31), +E=E2 +)), + + +%(Wrap=nowrap->append([_],E21,E2);E21=E2), + +get_last_line(Number,E2,C112,A1,E1,Number31,F3,C11,Arguments3), + + %append(A1,C11,E2), + + %C11=[C112|E1], + + %C112=[Number31,[Dbw_on_true,[Fail_or_exit2,Number]],[Dbw_go_after,_],[Dbw_on_false,_Number4],[Dbw_go_to_predicates,_],[Dbw_n_or_v2,F3]|Arguments3], + + %(Fail_or_exit2=exit_function->true;Fail_or_exit2=fail_function), + + %(Dbw_n_or_v2=Dbw_n->true;Dbw_n_or_v2=Dbw_v), + +((F3="[]"->true;(F3=Dbw_not->true;(F3=Dbw_or->true;(F3="->"->true;(F3=Dbw_findall)))))-> + + + (%append(A1,[C112],AC1), + get_up_to_next_chunk(A1,[],Ax), + get(F3,Number31,%E1% + C11%,AC1%E* + ,E3,[]%C1 + ,C3111,Arguments3,nowrap), + append(Ax,C3111,C311)); + +( + +%convert_chunk +%trace, +append(A1,[C112],AC1), +get_up_to_next_chunk(AC1,[],C311) +,E1=E3 +)), + + +%trace, +foldr(append,[C31,C311],C313), + +(true%Wrap=wrap +-> +C3=[[[Dbw_n,F],C313]]; +C3=[[Dbw_n,F],C313]). + +/* + +%->,2 +get("->",Number,D,E3,_C1,C3,_,Wrap) :- + +F="->", + get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, + get_lang_word("v",Dbw_v1),Dbw_v1=Dbw_v, +get_lang_word("on_true",Dbw_on_true1),Dbw_on_true1=Dbw_on_true, +get_lang_word("on_false",Dbw_on_false1),Dbw_on_false1=Dbw_on_false, +get_lang_word("go_after",Dbw_go_after1),Dbw_go_after1=Dbw_go_after,get_lang_word("go_to_predicates",Dbw_go_to_predicates1),Dbw_go_to_predicates1=Dbw_go_to_predicates, + +(Wrap=nowrap->append([_],D1,D);D1=D), + + append(A,C,D1), +(false%Wrap=nowrap +->not(A=[]);true), + + C=[C111|E], + +%trace, + C111=[Number2,[Dbw_on_true,_N1],[Dbw_go_after,_],[Dbw_on_false,_Number3],[Dbw_go_to_predicates,_],[Dbw_n_or_v1,F2]|Arguments2], +%trace, + +(Dbw_n_or_v1=Dbw_n->true;Dbw_n_or_v1=Dbw_v), + +((F2="[]"->true;(F2=Dbw_not->true;(F2=Dbw_or->true;(F2="->"->true;(F2=Dbw_findall)))))-> + + + (%append(A,[C111],AC), + %trace, + get_up_to_next_chunk(A,[],Ax1), + get(F2,Number2,C%AC%E E* + ,E2,[],C32,Arguments2,nowrap), + append(Ax1 + 
,C32,C31)); + +( +%convert_chunk +%trace, +append(A,[C111],AC), +%trace, +get_up_to_next_chunk(AC,[],C31), +E=E2 +)), + +%(Wrap=nowrap->append([_],E21,E2);E21=E2), + + append(A1,C11,E2), + + C11=[C112|E1], + + %trace, + C112=[Number31,[Dbw_on_true,A3],[Dbw_go_after,B],[Dbw_on_false,_Number4],[Dbw_go_to_predicates,_],[Dbw_n_or_v2,F3]|Arguments3], + +(Dbw_n_or_v2=Dbw_n->true;Dbw_n_or_v2=Dbw_v), + +(A3=[Fail_or_exit3,Number]->true;B=[Fail_or_exit3,Number]), + + (Fail_or_exit3=exit_function->true;Fail_or_exit3=fail_function), + +((F3="[]"->true;(F3=Dbw_not->true;(F3=Dbw_or->true;(F3="->"->true;(F3=Dbw_findall)))))-> + + + (%append(A1,[C112],AC1), + get_up_to_next_chunk(A1,[],Ax), + get(F3,Number31,%E1% + C11%,AC1%E* + ,E3,[]%C1 + ,C3111,Arguments3,nowrap), + append(Ax,C3111,C311)); + +( + +%convert_chunk +%trace, +append(A1,[C112],AC1), +get_up_to_next_chunk(AC1,[],C311) +,E1=E3 +)), + +%trace, +foldr(append,[C31,C311],C313), + +(true%Wrap=wrap +-> +C3=[[[Dbw_n,F],C313]]; +C3=[[Dbw_n,F],C313]). 
+*/ + +/* +get(Dbw_findall,Number,D,E2,_C1,C3,[Argument1,Argument2],Wrap) :- + +F=Dbw_findall, + + get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, + get_lang_word("v",Dbw_v1),Dbw_v1=Dbw_v, +get_lang_word("on_true",Dbw_on_true1),Dbw_on_true1=Dbw_on_true, +get_lang_word("on_false",Dbw_on_false1),Dbw_on_false1=Dbw_on_false, +get_lang_word("go_after",Dbw_go_after1),Dbw_go_after1=Dbw_go_after,get_lang_word("go_to_predicates",Dbw_go_to_predicates1),Dbw_go_to_predicates1=Dbw_go_to_predicates, + +%get_lang_word("findall_exit_function",Dbw_findall_exit_function1),Dbw_findall_exit_function1=Dbw_findall_exit_function, +%get_lang_word("findall_fail_function",Dbw_findall_fail_function1),Dbw_findall_fail_function1=Dbw_findall_fail_function, + +get_lang_word("findall",Dbw_findall1),Dbw_findall1=Dbw_findall, + +(Wrap=nowrap->append([_],D1,D);D1=D), + + append(A,C,D1), +(false%Wrap=nowrap +->not(A=[]);true), + + C=[C111|E], + + C111=[Number2,[Dbw_on_true,[Fail_or_exit1,Number]],[Dbw_go_after,_],[Dbw_on_false,[Fail_or_exit2,Number]],[Dbw_go_to_predicates,_],[Dbw_n_or_v1,F2]|Arguments2], + + (Fail_or_exit1=exit_function->true;Fail_or_exit1=fail_function), + (Fail_or_exit2=exit_function->true;Fail_or_exit2=fail_function), + +(Dbw_n_or_v1=Dbw_n->true;Dbw_n_or_v1=Dbw_v), + +%convert_chunk +%trace, + + + +get_lang_word("not",Dbw_not1),Dbw_not1=Dbw_not, +get_lang_word("or",Dbw_or1),Dbw_or1=Dbw_or, +get_lang_word("findall",Dbw_findall1),Dbw_findall1=Dbw_findall, + +%trace, + +((F2="[]"->true;(F2=Dbw_not->true;(F2=Dbw_or->true;(F2="->"->true;(F2=Dbw_findall)))))-> + + + (%append(A,[C111],AC), + %trace, + get_up_to_next_chunk(A,[],Ax1), + get(F2,Number2,C%AC%E E* + ,E2,[],C32,Arguments2,nowrap), + append(Ax1 + ,C32,C31)); + +( +%convert_chunk +%trace, +append(A,[C111],AC), +%trace, +get_up_to_next_chunk(AC,[],C31), +E=E2 +)), + + +%C3=[C31], % "[]" +foldr(append,[[Argument1],C31,[Argument2]],Arguments3), + +(true%Wrap=wrap +-> +C3=[[[Dbw_n,F],Arguments3]]; +C3=[[Dbw_n,F],Arguments3]). 
+ +*/ + +/* + get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, + get_lang_word("v",Dbw_v1),Dbw_v1=Dbw_v, +get_lang_word("on_true",Dbw_on_true1),Dbw_on_true1=Dbw_on_true, +get_lang_word("on_false",Dbw_on_false1),Dbw_on_false1=Dbw_on_false, +get_lang_word("go_after",Dbw_go_after1),Dbw_go_after1=Dbw_go_after,get_lang_word("go_to_predicates",Dbw_go_to_predicates1),Dbw_go_to_predicates1=Dbw_go_to_predicates, + + +*/ + +% if exit or fail fn is in go after, find deepest extent of curr fn, through true, false, the same rule as above +% otherwise, use furthest last end line +% this works for all fns + +get_last_line(Number,D,CL,A,E,Number2,F2,C,Arguments2) :- +get_last_line2(Number,D,CL), +CL=[Number2,[_Dbw_on_true,_],[_Dbw_go_after,_],[_Dbw_on_false,_],[_Dbw_go_to_predicates,_],[_Dbw_n_or_v1,F2]|Arguments2], +append(A,C,D), +C=[CL|E]. + + +get_last_line2(Number,C,CL) :- + + ((reverse(C,CR),member([Number2,[_Dbw_on_true,_],[_Dbw_go_after,B],[_Dbw_on_false,_],[_Dbw_go_to_predicates,_],[_Dbw_n_or_v1,_F2]|_Arguments2],CR), + (( + B=[Fail_or_exit,Number], +(Fail_or_exit=exit_function->true;(Fail_or_exit=fail_function->true; +(Fail_or_exit=findall_exit_function->true; +Fail_or_exit=findall_fail_function))) +%(Dbw_n_or_v1=Dbw_n->true;Dbw_n_or_v1=Dbw_v) +, + +get_last_line2(Number2,C,CL))->true; + +(%reverse(C,CR), +member([Number2,[Dbw_on_true,A],[Dbw_go_after,B],[Dbw_on_false,D],[Dbw_go_to_predicates,E],[Dbw_n_or_v1,F2]|Arguments2],CR), + +%(Dbw_n_or_v1=Dbw_n->true;Dbw_n_or_v1=Dbw_v), + + +(A=[Fail_or_exit,A2]->A1=A2;A1=A), +%(D=[Fail_or_exit2,D2]->D1=D2;D1=D), + +(Fail_or_exit=exit_function->true;(Fail_or_exit=fail_function->true; +(Fail_or_exit=findall_exit_function->true; +Fail_or_exit=findall_fail_function))), + +%get_last_line(A1,C,CL1), + +%get_last_line(D1,C,CL2), + +%(CL1>=CL2->CL=CL1;CL=CL2))),!. + +(A1=Number),%,->true%;D1=Number), +CL=[Number2,[Dbw_on_true,A],[Dbw_go_after,B],[Dbw_on_false,D],[Dbw_go_to_predicates,E],[Dbw_n_or_v1,F2]|Arguments2])))). 
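The comment above `get_last_line` describes the idea: if a line's exit/fail link in go_after points back at the current function, the function's extent continues at that line, so search again from there; otherwise the last line is the one whose exit/fail target is the function itself. The sketch below is a loose Python model of that two-pass search, not repository code: the dict fields (`number`, `on_true`, `go_after`) and the simplified line shape are assumptions based on `get_last_line2` above, and the findall exit/fail variants are omitted.

```python
# Loose model of the get_last_line2 idea: follow exit/fail links in go_after
# to the deepest extent of the current function; otherwise take the last line
# whose on_true exits or fails back to the function numbered `number`.
# Field names and the simplified line shape are assumptions, not repo code.

def last_line(number, lines):
    exits = (("exit_function", number), ("fail_function", number))
    # Pass 1 (from the end, as the Prolog reverses the line list): if a
    # line's go_after exits/fails back to `number`, recurse from that line.
    for line in reversed(lines):
        if line["go_after"] in exits:
            return last_line(line["number"], lines)
    # Pass 2: otherwise the last line is one whose on_true target is the
    # exit/fail of the function numbered `number`.
    for line in reversed(lines):
        if line["on_true"] in exits:
            return line
    return None

lines = [
    {"number": 1, "on_true": ("exit_function", 0), "go_after": None},
    {"number": 2, "on_true": None, "go_after": ("exit_function", 0)},
    {"number": 3, "on_true": ("exit_function", 2), "go_after": None},
]
```

Under these assumptions, `last_line(0, lines)` recurses through line 2 (whose go_after exits back to function 0) and lands on line 3, mirroring how the Prolog tries the go_after branch before falling back to a direct exit line.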
diff --git a/Text-to-Breasonings-master/.DS_Store b/Text-to-Breasonings-master/.DS_Store
new file mode 100644
index 0000000000000000000000000000000000000000..7b160be1118e3bb428389fa677ad56bc4637a2ac
Binary files /dev/null and b/Text-to-Breasonings-master/.DS_Store differ
diff --git a/Text-to-Breasonings-master/Disclaimer.txt b/Text-to-Breasonings-master/Disclaimer.txt
new file mode 100644
index 0000000000000000000000000000000000000000..8f7445f1a9f2bcd57bded0d454996f495a64809f
--- /dev/null
+++ b/Text-to-Breasonings-master/Disclaimer.txt
@@ -0,0 +1,3 @@
+Disclaimer
+
+We take no responsibility for using this software. Please see a qualified professional if needed.
diff --git a/Text-to-Breasonings-master/Instructions_for_Using_texttobr(2).pl.txt b/Text-to-Breasonings-master/Instructions_for_Using_texttobr(2).pl.txt
new file mode 100644
index 0000000000000000000000000000000000000000..f06a488116035e807fb9f85ec5aa22f43fdf9664
--- /dev/null
+++ b/Text-to-Breasonings-master/Instructions_for_Using_texttobr(2).pl.txt
@@ -0,0 +1,279 @@
+Instructions for Using texttobr(2).pl
+
+This software is not for use for human conception.
+
+Use the computer as an extension of yourself to breason out texts.
+
+Glossary: Breasoning is thinking of the x, y and z measurements in metres of an object in a sentence.
+
+1. Perform the following Daily Regimen, containing meditation, pedagogy and medicine preparation.
+
+Meditation - (this is what Setting up a VPS with TextToBr.txt simulates).
+
+You can naturally prevent a headache by silently repeating the mantras "lucian" and "arem" for twenty minutes, twice per day. The mantras become lighter and lighter, until one forgets the thought, clearing the mind. Also, silently repeat the sutra "green" for twenty minutes, twice per day, after two months of using the mantras in the morning and evening. Also, pray for no digestive system pops from practising the sutra each day.
+ +Quantum Box and Nut and Bolt + + +Quantum Box + + +The quantum box is a guided meditation that I will describe later that prevents headaches, (relaxes head and neck muscles), and prevents muscle aches and pains. In it, you can make anything (e.g. a headache) instantly disappear, based on quantum physics (where at the quantum level, things may disappear if we want them to). + + +Nut and Bolt + + +The nut and bolt is a guided meditation that is a more detailed prevention of problems the Quantum Box prevents. In it, pretend to unscrew a nut from a long thin bolt each day to prevent a headache, etc. + + +One-off + +One-off: Repeat Upasana * 800 and breason out the following 7 breasonings details to God. I make a small recommendation of a University accredited short Education course every two years or a degree every ten years. + + + +Repeat Upasana * 800 if not done in a chain of meditation days linked to today (and green, Purusha, Use, Teacher, Maharishi, Doctor each x 800 before Upasana if they are not done in a chain of meditation days linked to today). + +Breason out (think of examples with objects that you have thought of the x, y and z dimensions of) the following 7 breasonings details as an exercise. Pretend to breason out the breasonings details to God. + +See http://lucianspedagogy.blogspot.com.au/p/writing-sentences-by-making.html + +1. Details - As an exercise, think of two uses, a future use and two types for each object. + +2. Breasoning - Think of the x, y, z dimensions and colour of each object. + +3. Rebreasoning - Think of the fact that the person and the object in a sentence are connected by a verb (an action) that means they touch. + +4. Breathsoning - Think of a reason that the object is judged good (a judgment adjective). + +5. Rebreathsoning - Think of a reason that the verb (action) is judged done well (a judgment adverb). + +6. Space - Think of spatial characteristics of the person in relation to the object (room, part of room e.g. 
table and direction in room e.g. plate). + +7. Time - Think of temporal characteristics of the person in relation to the object (time to prepare e.g. buy, time to do e.g. eat, and time to finish e.g. place core in a receptacle). + + + + +Daily Regimen + + +I recommend Yoga and Qi Gong. + + +Indicate 2 radio buttons (one for each of the following ideas, and one for each of these ideas working anyway, where a radio button is indicated by breasoning out, or thinking of the x, y and z dimensions of a counter, 0.01, 0.01, 0.005 metres). + +Too (just say “too” for each idea following to indicate the two radio buttons) + +meditation (108 lucian and arem mantras and 108 green sutras) + +Also, for the following, which also have quantum box and nut and bolt: + +No digestive system pops from practising the sutra + + +Also, for the following, which also have quantum box and nut and bolt: + +Protector from Headache in Meditation after Honours Study, etc. + +Also, for the following, which also have quantum box and nut and bolt: + +No headache + + +The following turn off headaches etc. + +Also, for the following, which also have quantum box and nut and bolt: + +Turn off workload from all employees including you below you, + +Also, for the following, which also have quantum box and nut and bolt: + +Detect and turn off workloads using manager algorithm. (The self does the As, turning off the workloads.) + +Also, for the following, which also have quantum box and nut and bolt: + +Detect and turn off workloads using manager network. (The others help finish the As, turning off the workloads.) + +Also, for the following, which also have quantum box and nut and bolt: + +No muscle aches/pains + + +2. One-off, breason out 250 characters from Lecturer Training, to be able to breason out anything you want, not just found out breasonings. + + +“Lecturer (all in this document) + + + 1. I collected the comments. + + 2. I wrote the formula. + + 3. I wrote the input. + + 4. I wrote the output. 
+
+ 5. I had the rights to government.
+
+ 6. I had the rights to the state.
+
+ 7. I had the rights to vote.
+
+ 8. I amalgamated the b”
+
+
+3. One-off, breason out 250 characters from Recordings Training, to be able to indicate that breasonings are implicitly breasoned out for you.
+
+
+“Recordings from 30 As 1
+
+
+ 1. I meditated on the day I produced recordings.
+
+ 2. I confirmed training.
+
+ 3. Recordings relate to the rest of pedagogy.
+
+ 4. The delegator matched recordings (workloads) with different recordings.
+
+ 5. I endorsed the 5*50 As pract”
+
+
+“Too”: 5 symbolic strawberries (each is 0.01, 0.01, 0.01 cm) for Recordings to work.
+
+
+4. Each day, prepare for using texttobr[1/2]:
+
+
+Realise that the hand breasoning switch is done for you each day, from the first time you switch it on, by breasoning out the following ten words from file2qb.txt (medicine) each day:
+"i the to and in by of this prepared a"
+
+
+Use only ONE of the following plans. It is recommended to progress through plans 1-3, for at least one month each. Commence instructions at 8 AM at the earliest. Warning: Not following these instructions when using texttobr[1/2] can result in haemorrhoids, eyelid twitches and eyelid itchiness.
+
+
+Note: In the following, the 250 characters/10 words (which you MUST breason out each day before using texttobr[1/2]) from Job Medicine are:
+
+
+“I relaxed my whole body. I had a bath. I ran the water. I stepped into the bath. I enjoyed the bath.
+
+I played the game. I massaged myself. I massaged my neck. I massaged my shoulders. I felt good.
+
+I had a hot bath. I ran the water. I turned t”
+
+
+5. 250 characters from Delegate Workloads
+
+"1. I cut off infinity.
+ 2. I used the sun.
+ 3. The queen helped.
+ 4. The delivery arrived.
+ 5. I earned the role.
+ 6. I wrote the developed text.
+ 7. I knew about the lotus spoon.
+ 8. I knew the female politician.
+ 9. I knew positive religion.
+ 10. 
P" + +Plan 1: + +Every Monday and Thursday: + +1. Prepare the radio button: Accredited, at the time, turned off (to make accreditation acceptable) computer science, breathsoned, VET, sales, education for the following. + +2. The next step means that you are seen as breasoning out text before texttobr, computer housing the breasonings, 250 breasonings per waking hour for 16 hours, stopping at midnight (from word breasonings first), no medical problems from it. + +3. Breason out a radio button (0.01, 0.01, 0.005 cm). + +4. Hand-breason out (not use texttobr[1/2] for) 250 characters from Job Medicine. + + +Plan 2: + +Every Monday and Thursday: + +1. Prepare the radio button: Accredited, at the time, turned off (to make accreditation acceptable) computer science, breathsoned, VET, sales, education for the following. + +2. The next step means that you are seen as breasoning out text before texttobr, computer housing the breasonings, 200 breasonings per waking hour for 16 hours, stopping at midnight (from word breasonings first), no medical problems from it. + +3. Breason out a radio button (0.01, 0.01, 0.005 cm). + +4. Hand-breason out (not use texttobr[1/2] for) 250 characters from Job Medicine. + +Every other day: + +1. Prepare the radio button: Accredited, at the time, turned off (to make accreditation acceptable) CS, breathsoned, VET, sales, education for the following. + +2. The next step means that you are seen as breasoning out text before texttobr, computer housing the breasonings, 250 breasonings per waking hour for 1 hour (from word breasonings first), no medical problems from it. + +3. Breason out a radio button (0.01, 0.01, 0.005 cm). + +4. Hand-breason out (not use texttobr[1/2] for) 250 characters from Job Medicine. + + +Plan 3: + +Every day: + +1. Prepare the radio button: Accredited, at the time, turned off (to make accreditation acceptable) computer science, breathsoned, VET, sales, education for the following. + +2. 
The next step means that you are seen as breasoning out text before texttobr, computer housing the breasonings, 70 breasonings per waking hour for 16 hours, stopping at midnight (from word breasonings first), no medical problems from it.
+
+3. Breason out a radio button (0.01, 0.01, 0.005 cm).
+
+4. Hand-breason out (not use texttobr[1/2] for) 250 characters from Job Medicine.
+
+
+6. If necessary, download SWI-Prolog for Mac or Windows and install it.
+
+
+7. Type the following commands in SWI-Prolog:
+
+
+[texttobr2].
+
+
+[texttobr].
+
+
+texttobr2.
+
+
+texttobr.
+
+(To breason out each character in the file).
+
+
+Alternative to 7: It is recommended to replace the contents of file.txt with Lecturer Training, Recordings Training and JobMedicine.txt, one-off, and type the following commands in SWI-Prolog:
+
+* Note: see T2B_docs.md for instructions about running TextToBreasonings.
+
+Notes:
+
+
+Replace ⁃ and – in file.txt with - or the program crashes.
+
+Replace “ and ” in file.txt with " or the program crashes.
+
+Replace ‘ and ’ in file.txt with ' or the program crashes.
+
+Replace MS Word returns in file.txt with normal returns or the program crashes.
+
+Replace the non-breaking space " " in file.txt with a normal space " " or the program crashes.
+
+
+If there are any invisible characters apart from esc, then brdict.txt may become corrupt and should be replaced with a backup. You could directly delete the invisible characters from brdict.txt, but Lucian Academy accepts no responsibility for this.
+
+
+You don't have to reinsert smart quotes in file.txt afterwards, because texttobr2 doesn't change the file. Simply use the copy of your text from before you inserted it into file.txt and changed it.
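The replacement notes above amount to a small character clean-up pass over a copy of file.txt. The following Python sketch is illustrative only; the repository's own code is Prolog and ships no such script, and the exact code points are inferred from the notes:

```python
# Illustrative clean-up for a copy of file.txt, per the notes above.
# The character list is inferred from the notes; this script is not part
# of the Text-to-Breasonings repository.
REPLACEMENTS = [
    ("\u2043", "-"),   # hyphen bullet (⁃) -> hyphen
    ("\u2013", "-"),   # en dash (–) -> hyphen
    ("\u201c", '"'),   # left smart double quote -> "
    ("\u201d", '"'),   # right smart double quote -> "
    ("\u2018", "'"),   # left smart single quote -> '
    ("\u2019", "'"),   # right smart single quote -> '
    ("\r\n", "\n"),    # MS Word (CRLF) return -> normal return
    ("\r", "\n"),      # bare carriage return -> normal return
    ("\u00a0", " "),   # non-breaking space -> normal space
]

def sanitise(text):
    # Apply the replacements in order; CRLF is handled before bare CR.
    for bad, good in REPLACEMENTS:
        text = text.replace(bad, good)
    return text
```

Run it on a copy (for example, `sanitise(open("file.txt").read())`), since the note above recommends keeping the unmodified text from before it was pasted into file.txt.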
diff --git a/Text-to-Breasonings-master/Job_Medicine.txt b/Text-to-Breasonings-master/Job_Medicine.txt new file mode 100644 index 0000000000000000000000000000000000000000..8b0fb953930a0d7c869183e7b93ecafb90e39e1c --- /dev/null +++ b/Text-to-Breasonings-master/Job_Medicine.txt @@ -0,0 +1,12 @@ +Job Medicine + +[* start 170, 250 chars, 250 words] I relaxed my whole body. I had a bath. I ran the water. I stepped into the bath. I enjoyed the bath. +I played the game. I massaged myself. I massaged my neck. I ma[* end 170 chars] ssaged my shoulders. I felt good. +I had a hot bath. I ran the water. I[* end 250 chars] turned the hot tap on. I turned the cold tap on. I made sure that the water was hot enough. +I put my foot in the water. I tested the water. I inserted my finger into the water. I removed my finger from the water. I ran more hot or cold water if the temperature was too cold or hot, respectively. +I relaxed thinking of the bath times. I thought of the bath times. I thought of today’s bath time. I thought of tomorrow’s bath time. I thought of the bath time on the day after tomorrow. +I gently squeezed the muscle. I found the muscle’s end. I felt the muscle. I found its end. I touched it. +I massaged the muscle’s length. I found the muscle’s length. I found the muscle’s start. I found the muscle’s end. I subtracted the muscle’s start from the muscle’s end to find the muscle’s length. +I tested whether the muscle needed more squeezing. I started massaging the muscle. I placed one finger on one side of the muscle. I placed another finger on the other side of the muscle. I felt if the muscle needed squeezing. [* 250 words] +I massaged myself and stayed comfortable. I finished massaging the muscle. I squeezed the muscle a little. I felt that I could finish squeezing the muscle. I gently turned my head. +I relaxed. I thought of relaxing. I thought of meditation (medicine). I practised meditation (relaxation). I massaged my body. 
\ No newline at end of file diff --git a/Text-to-Breasonings-master/LICENSE b/Text-to-Breasonings-master/LICENSE new file mode 100644 index 0000000000000000000000000000000000000000..3a27c90f1c2986ac0fb73b4ae956c6205343dfdd --- /dev/null +++ b/Text-to-Breasonings-master/LICENSE @@ -0,0 +1,29 @@ +BSD 3-Clause License + +Copyright (c) 2020, Lucian Green +All rights reserved. + +Redistribution and use in source and binary forms, with or without +modification, are permitted provided that the following conditions are met: + +1. Redistributions of source code must retain the above copyright notice, this + list of conditions and the following disclaimer. + +2. Redistributions in binary form must reproduce the above copyright notice, + this list of conditions and the following disclaimer in the documentation + and/or other materials provided with the distribution. + +3. Neither the name of the copyright holder nor the names of its + contributors may be used to endorse or promote products derived from + this software without specific prior written permission. + +THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" +AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE +IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE +DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE +FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL +DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR +SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER +CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, +OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE +OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. 
diff --git a/Text-to-Breasonings-master/Lecturer.txt b/Text-to-Breasonings-master/Lecturer.txt new file mode 100644 index 0000000000000000000000000000000000000000..614f5333fd8e9ae801f6e15e1bd12936c83576be --- /dev/null +++ b/Text-to-Breasonings-master/Lecturer.txt @@ -0,0 +1,393 @@ +Lecturer
+
+Contents
+
+Lecturer (all in this document)
+Simulated Intelligence
+Lecturer Simulated Intelligence
+Lecturer in Lecturer Pedagogy
+Lecturer in Recordings Pedagogy
+Pedagogy Helper Lecturer
+People Values Lecturer
+History lecturer
+Philosophy/Computational English lecturer
+Politics lecturer
+English lecturer
+Linguistics lecturer
+Computer Science lecturer
+Psychology lecturer
+Meditation lecturer
+Medicine lecturer
+Lecturer
+Metaphysics Lecturer
+Hermeneutics Lecturer
+Gay Studies Lecturer
+Popology Lecturer
+Societology Lecturer
+Simulated Intelligence Lecture continued.
+Lucianism Lecturer
+Culturology Lecturer
+Rhetoric Lecturer
+Lecturer (all in this document) Edit
+I collected the comments.
+I wrote the formula.
+I wrote the input.
+I wrote the output.
+I had the rights to government.
+I had the rights to the state.
+I had the rights to vote.
+I amalgamated the bodies.
+I agreed.
+I disagreed (concurred).
+Simulated Intelligence Edit
+I established simulated intelligence.
+I wrote on pedagogy.
+I became a pedagogue.
+I produced pedagogical recordings.
+I became a pedagogical helper.
+I wrote on people values.
+I wrote on history.
+I wrote on philosophy (Computational English).
+I wrote on politics.
+I wrote on literary studies.
+I wrote on linguistics.
+I wrote on computer science.
+I wrote on psychology.
+I wrote on meditation.
+I wrote on medicine.
+Lecturer Simulated Intelligence Edit
+The seen-as version was the first depth applied to the ideas in the simulation.
+I finished the sentence with the simulation.
+I finished the topic with the simulation.
+I finished the book with the simulation.
+I simulated the lecturer with a function.
+I verified reality with the simulation.
+I was good in the simulation.
+I tested that the developed idea test was successful.
+I verified that the sentence in the simulation was correct.
+I applied the second department to the sentence in the simulation.
+Lecturer Pedagogy Edit
+I wrote the types and future details.
+For example, I wrote the use detail.
+I wrote the breasonings.
+I wrote the rebreasoning.
+I wrote the breathsoning.
+I wrote the rebreathsoning.
+I wrote the room.
+I wrote the part of the room.
+I wrote the direction.
+I wrote what happened before.
+I wrote what happened now.
+I wrote what happened in the future.
+Lecturer in Lecturer Pedagogy Edit
+I checked that I had enough before leaving.
+I waited.
+I wrote the important ideas first.
+I checked that I had enough for when I had returned.
+I studied philosophy subjects.
+I studied to teach a competency.
+I could help others with A.
+I wrote interestingly.
+I had the prerequisite studies.
+I found a student.
+Lecturer in Recordings Pedagogy Edit
+I could care for myself when playing recordings (not have a headache).
+I was free with recordings.
+I briefed (played to) the recipient of recordings.
+I knew everything I talked about in recordings.
+I prepared for (meditated on) the future from recordings.
+I had no muscle aches.
+I helped others to present recordings.
+The class average increased.
+I wrote on the good idea.
+I recommended medicine and meditation.
+Pedagogy Helper Lecturer Edit
+I wrote the three details down.
+I researched the student.
+I wrote computer science on the idea.
+I wrote philosophy on the idea.
+I wrote politics on the idea.
+I linked the details together.
+I found the topic title.
+I wrote in logic.
+I wrote on the whole idea (three steps and the use).
+I multiplied the idea when leading a committee on it.
+People Values Lecturer Edit
+The subject was saving life.
+The subject was perfect (earning 100%).
+The subject was giving (answering specific questions).
+The subject was led (acting on feedback).
+The subject helped the disadvantaged.
+The subject accepted the reward.
+The subject gave feedback.
+The subject answered the question.
+The subject rewrote the course.
+The subject noticed improvement in helped people.
+History lecturer Edit
+I was an historian.
+I wrote the most important ideas.
+I wrote on the history of madness.
+I wrote on the history of education.
+I ate an apple per day (10 breasonings) in meditation.
+I breasoned out 20 finding fault (agreeing) breasonings in business per day.
+I performed the lecturer preparation.
+I wrote 10 breasonings per day in teaching.
+I wrote 10 breasonings per day for happiness medicine.
+There was a government history campaign.
+Philosophy/Computational English lecturer Edit
+I wrote on Computational English.
+Computational English is like a cake.
+I connected two texts.
+I cut off infinity.
+I found the opposite of an argument.
+I determined the event’s history over time.
+I protected the world with a simulation.
+I found the property from the file length.
+I determined the term from the definition.
+I determined the definition from the term.
+Politics lecturer Edit
+I had the right to constitution.
+I had the right to state.
+I had the right to government.
+I had the right to vote.
+I had the right to amalgamation.
+The country was equitable.
+The country maintained a high quality of life.
+There was regular re-election of government.
+The voters regularly verified the policies.
+The best ideas were found.
+English lecturer Edit
+I agreed.
+I disagreed.
+I told the story of the self.
+I told the story of us.
+I told the story of now.
+I wrote the secondary text.
+I wrote the debate.
+I wrote on another.
+I wrote on the different groups.
+I wrote on the past and future.
+Linguistics lecturer Edit
+I chose red with culture.
+I created the lingua franca.
+I started the newspaper.
+I wrote the article.
+I wrote a new language.
+I wrote on universalism.
+I defined the meaning.
+I was editor.
+I edited an article.
+I spoke in a new language.
+Computer Science lecturer Edit
+I wrote a new logic.
+I wrote a new language of logic.
+I wrote the formula.
+I wrote the formula finder.
+I wrote the input.
+I wrote the output.
+I applied the new logic to the question.
+I applied the new logic to the problem.
+I wrote the units of the input.
+I wrote the units of the output.
+Psychology lecturer Edit
+I wrote the algorithm in steps.
+I wrote the algorithm in a hierarchy.
+I wrote the algorithm with logic.
+I added emotion to the learning aid.
+I wrote the memory aid.
+I wrote the name of one object per step.
+I wrote that everyone was included.
+I wrote that the argument was valid.
+I wrote that the feeling made the experience memorable.
+I tested that I was rested.
+Meditation lecturer Edit
+I chose spiritual instead of (and) mindfulness.
+I wrote a meditation idea per day on my aim.
+I wrote a meditation idea per day on my method.
+I wrote a meditation idea per day on my apparatus.
+I wrote a meditation idea per day on my hypothesis.
+I wrote a meditation idea per day on my results.
+I wrote a meditation idea per day on my discussion.
+I wrote a meditation idea per day on my conclusion.
+I wrote a meditation idea per day on my outliers.
+I wrote a meditation idea per day on my new discoveries.
+Medicine lecturer Edit
+I wrote six levels on the solution to the problem.
+I helped each person with meditation.
+I helped each person with medicine.
+I helped each person with pedagogy.
+I helped each person with Computational English.
+I found protection by natural law.
+I found objects that increased longevity.
+I found objects that increased happiness.
+I found objects that increased grades.
+I found objects that increased consciousness (expression).
+Lecturer Edit
+Metaphysics
+Hermeneutics
+Communication
+Gay Studies
+Popology
+Societology
+Lucianism
+Culturology
+Rhetoric
+Cognitive Science
+Psychology
+Philosophy of Language
+Development
+Body Metaphor
+Mind Metaphor
+Future
+Aesthetics
+Epistemology
+Science
+Love
+Sex
+Primary Philosophy
+Secondary Philosophy
+Logic
+Brain Metaphor
+Poetics
+Computational Philosophy
+Archeology
+Architecture
+Cosmology
+Ethics
+Laws
+Economics
+Sport
+Games
+Metaphysics Lecturer Edit
+I wrote the idea as an object.
+I wrote the idea as a word.
+I wrote the idea as a sentence.
+I wrote the idea as a paragraph.
+I wrote the idea as a chapter.
+I wrote the idea as a book.
+I wrote the idea as an algorithm.
+I detected the size of the object.
+I described the use.
+I simplified the idea to all appropriate uses.
+Hermeneutics Lecturer Edit
+I wrote the queryable ideology.
+I wrote the query.
+I wrote the ontology as the answer.
+I wrote the conjoined parts.
+I wrote one of the parts of the disjunction.
+I wrote the answer set in the ideology.
+I converted the question into an answer.
+I wrote the reason for the ontology.
+I inferred the reason with the conclusion.
+I verified the “i, j” part.
+Lecturer Communication Edit
+I spoke.
+I wrote solutions to puzzles.
+I won the point with the universal statement.
+The post-graduate was expected to become a lecturer.
+I wrote the sentence maze.
+I communicated well.
+I communicated bravely.
+I communicated to the king.
+I communicated to everyone.
+I communicated to the world.
+Gay Studies Lecturer Edit
+I was friendly with everyone.
+I was happy with everyone.
+I was good with everyone.
+I wrote the poem.
+I wrote the articles.
+I wrote about sex.
+I wrote about liberation.
+I wrote about happiness.
+I wrote about goodness.
+Popology Lecturer Edit
+The person was just.
+The person was happy.
+The person was sentient.
+The person was sapient.
+The person was agreeable.
+The person met the standard.
+The person smiled.
+The person refreshed his appearance in his mind.
+The person weighed up each side.
+The person made allowance for a new point of view.
+Societology Lecturer Edit
+The person was right.
+The person was good.
+The person was helpful.
+The person was useful.
+The person was caring (well-dressed).
+There was enough leeway (the person was equitable).
+The societology model worked well.
+The other was interesting.
+The society performed the job itself.
+Society stood still.
+Simulated Intelligence Lecture continued. Edit
+Simulated Intelligence contains all ideas.
+I thought around corners of the simulation.
+I noticed Kinglish about (the characters reacted to) Simulated Intelligence.
+Breasonings didn’t like (liked) the leader in the simulation.
+God collected the value pertaining to whether the character in the simulation had critiqued him.
+The simulation supported equality of marking agreement and disagreement.
+Accreditation in the simulation is testing good not bad (and testing other good) religions.
+Accreditation in the simulation is breasoning time points being assessable because of payment to help with birth, where this help might otherwise not be given.
+Accreditation in the simulation is the meditation mantra (word) being assessable, not practised but seen as a generic skill.
+Accreditation in the simulation is the medicine quantum switch being assessable because it is a generic skill.
+Accreditation in pedagogy in the simulation is breasoned breasoning lists “creating pedagogues” and “helping pedagogues” being assessable.
+I was quite intelligent.
+The king commanded the simulation.
+I started the simulation part.
+The hint worked in the simulation.
+The person maintained his good health in the simulation.
+The element in the simulation advertised it was there until change.
+Lucianism Lecturer Edit
+I read philosophy because I knew the author.
+Lucianism is taking over the world.
+Lucianism is the philosophy of God.
+The relevant philosophy is interpreted.
+Philosophy helped the world.
+The self chose itself.
+The self chose the other.
+I examined the change from non-accreditation to accreditation.
+Humanist Pedagogy worked.
+Meditation worked.
+Culturology Lecturer Edit
+I examined [Lucianic] culture.
+I examined the goodness of life.
+Culturology is good (defended).
+Culturology was with presentness.
+Culturology had a point (culture has the selves).
+Culture was “small idea”-like, in other words, culture had a small idea.
+I examined culturology, in other words two models joined in culturology.
+I noticed the vestiches (sic), in other words I was present to, or eyed culturology.
+The croc swamp found the island, in other words culturology decorated the leaves, or canvas, or took care of the ward.
+I rewarded the culture from above with art; I rewrote culturology from above in “arg art”.
+Rhetoric Lecturer Edit
+I wrote the rhetoric of safety. Creativity was derived from God.
+I said the statement, meaning it. I know this is a production of culturology.
+The house of rhetoric advanced. As A was verified, the rhetorician advanced against what was like fogs of food.
+As I saw, he saw. So were waddling if grassy. As I saw what there was to see, they saw it too.
+As I looked over myself, I was happy. As I faced the wood, I stacked it.
+As the lion roared, it was rhetoric. As I ran the program, I verified it.
+As the lion transfixed on the lion cub on a crab cond (sic) lim (sic), the lioness transfixed a point. As I talked, my students were transfixed by people not necessarily disagreeing (agreed) (talking) doing work.
+As the guidelines were for self-aim advice, they were for the other.
+As I created the ontology, I used it.
+As the self was verified to exist, the other was verified to exist. As the car brought food, the disabled woman led.
+Cognitive Science Lecturer Edit
+Cognitive science was computation written in sentences. Cognitive science was what was right.
+Cognition was thought, soul and deed.
Cognition followed words.
+I further cognised a previous cognition. Cognition followed words.
+The lecturer wrote God’s cognitive science algorithm. I structured the thought.
+The meontology came true. I gained from the safe cognitive science.
+The cognitive scientist didn’t release God’s not reasonings (did not release the person’s metadata). The cognitive scientist started with 0, not 1.
+In cognitive science, the person was only said to be performing an action when they touched the object to perform that action. Cognitive science was touching the object.
+Cognitive science is the visualisation of algorithms. Cognitive science is the counting algorithm.
+Cognitive science was the void. Cognitive science was about writing (the domain of the algorithm) (the plant decided on a side of the debate).
+The hook properly hooked (didn’t fall off in cognitive science). \ No newline at end of file diff --git a/Text-to-Breasonings-master/Memory_returned_when_suffering_from_Parkinsons.txt b/Text-to-Breasonings-master/Memory_returned_when_suffering_from_Parkinsons.txt new file mode 100644 index 0000000000000000000000000000000000000000..7f6bcf5a1110d9ec6f3479958784e7dbdba5c40d --- /dev/null +++ b/Text-to-Breasonings-master/Memory_returned_when_suffering_from_Parkinsons.txt @@ -0,0 +1,11 @@ +Putting "Frank" on done up 50 As for meditation helped his memory return while suffering from Parkinson's.
+
+time((N is 21*3*3*3*4, M is 4000, texttobr2(N,u,u,M),texttobr(N,u,u,M))).
+
+21 people, 3 (Lucian, God, the area of study), 3 (today, tomorrow, the day after tomorrow), 3 (standard, B, B to B), 4 * 50 As
+
+time((N is 21*3*3*2*4, M is 4000, texttobr2(N,u,u,M),texttobr(N,u,u,M))).
+
+Instead of 3 (standard, B, B to B): 2 (going against itself to arrive at the destination and return)
+
+https://lucianpedia.fandom.com/wiki/Random_Combination_Algorithm_Writer \ No newline at end of file diff --git a/Text-to-Breasonings-master/README.md b/Text-to-Breasonings-master/README.md new file mode 100644 index 0000000000000000000000000000000000000000..14444bf07c7c826d06f9ace75185cbb3d15ca532 --- /dev/null +++ b/Text-to-Breasonings-master/README.md @@ -0,0 +1,181 @@ +Essay Helper | Text-to-Breasonings (Helps Earn High Distinctions) | Grammar Logic (Helps Mind Map Details)
+
+# Text-to-Breasonings Documentation
+
+* Text-to-Breasonings (T2B) is a set of algorithms that allow breasoning out (in this case, the computer thinks of the x, y and z dimensions of) objects to help earn high distinctions, support but not conceive healthy children, earn jobs, sell products, produce quantum energy, read minds, display spiritual screens, time travel, replicate or become immortal. T2B automates breasoning, where breasoning prevents possible health problems of other methods (see Disclaimer).
+
+* TextToBreasonings can (1) breason out words over a local machine and a Virtual Private Server (VPS) host (a server that you can use as a central machine, which has environmental advantages), (2) breason out characters over a local machine or (3) breason out words, breathsonings (judgement adjectives and adverbs), rooms, parts of rooms, directions in rooms, objects to prepare for an action and objects to finish an action over a local machine.
+
+* Generally, 80 word breasonings are needed to earn a high distinction at undergraduate level and below, have healthy children or sell products. This increases to 2 * 15 * 80=2400 breasonings per Honours level chapter (the factor of 2 being one set to the lecturer and one from them to you), approximately 2 * 50 * 80=8000 breasonings per Masters level assignment and approximately 2 * 4 * 50 * 80=32,000 breasonings per PhD level assignment.
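The counts in the paragraph above all come from one product (copies × units × the base of 80 breasonings). A quick sketch to check the arithmetic, including the ×125 H1 multiplier quoted in the NB that follows (the function name is illustrative):

```python
# Breasoning counts per assignment, as stated in the README above:
# a base of 80 word breasonings, doubled (one set to the lecturer,
# one from them to you), times a per-level unit count.
BASE = 80

def breasonings(copies: int, units: int) -> int:
    return copies * units * BASE

honours = breasonings(2, 15)       # 2 * 15 * 80
masters = breasonings(2, 50)       # 2 * 50 * 80
phd     = breasonings(2, 4 * 50)   # 2 * 4 * 50 * 80

print(honours, masters, phd)                     # 2400 8000 32000
# The H1 figures quoted in the NB multiply these by 125:
print(honours * 125, masters * 125, phd * 125)   # 300000 1000000 4000000
```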
*NB:* To actually earn H1, multiply the number of breasonings by 125 (if the rest of the breasonings are generated by Grammar Logic), except at undergraduate level, i.e. Honours=300,000 breasonings, Masters=1,000,000 breasonings and PhD=4,000,000 breasonings.
+
+* In addition to the 80 philosophy sentence breasonings, a certain number of short algorithms is necessary to earn a high distinction for an assignment. If there is only a small number of algorithms, the lecturer may do them.
+
+| Level | Algorithms required |
+| ------------- | ------------------: |
+| Undergraduate | 80 |
+| Honours | 80 |
+| Master's | 80 |
+| PhD | 80 |
+
+* 50 As (50 * 80=4000 breasonings) are required to earn a job.
+
+* Replace the 80 breasonings and 80 algorithms above with 250 each for marks of up to 100%.
+
+* To earn H1, also breason out (and detail out in Honours and above) the lecture notes.
+
+* Essay Writing and Pedagogy Tips
+
+# Getting Started
+
+Please read the following instructions on how to install the project on your computer for breasoning.
+
+# Prerequisites
+
+* Please download and install SWI-Prolog for your machine at `https://www.swi-prolog.org/build/`.
+
+# 1. Install manually
+
+Download this repository, the List Prolog Interpreter Repository and the Algorithm Writer with Lists Repository.
+
+# 2. Or Install from List Prolog Package Manager (LPPM)
+
+* Download the LPPM Repository:
+
+```
+mkdir GitHub
+cd GitHub/
+git clone https://github.com/luciangreen/List-Prolog-Package-Manager.git
+cd List-Prolog-Package-Manager
+swipl
+['lppm'].
+lppm_install("luciangreen","Text-to-Breasonings").
+halt.
+```
+
+# Caution:
+
+* Before running texttobr, think of two radio buttons put on recordings, put through with prayer, nut and bolt, quantum box prayer 1, 1, 0.5 cm and 1, 1, 0.5 cm.
+
+* Follow the instructions in Instructions for Using texttobr(2) when using texttobr, texttobr2 or mind reader to avoid medical problems.
+
+* The Prevent Headaches App may be necessary.
+
+* See also Time Machine, which uses Text to Breasonings, to travel through time with meditation.
+
+# Running
+
+* In Shell:
+`cd Text-to-Breasonings`
+`swipl`
+
+* (1) To breason out words over a local machine and a VPS host, in the SWI-Prolog environment, enter:
+`['../listprologinterpreter/listprolog'].`
+`[texttobr2].`
+
+* Please follow the instructions in "Setting up a VPS with TextToBr.txt" to set up a VPS with TextToBr up to and including "Upload them to the VPS's...".
+
+* In the SWI-Prolog environment, enter:
+`texttobr2(N,File,String,M).`
+where N is the number of times to breason out the file, File is the file name, String is the string to breason out and M is the number of words in the file to breason out, e.g.:
+* `texttobr2(u,"file.txt",u,u).`
+Breasons out file.txt.
+* `texttobr2(2,"file.txt",u,u).`
+Breasons out file.txt twice.
+* `texttobr2(u,u,"Hello world.",u).`
+Breasons out "Hello world.".
+* `texttobr2(3,u,"a b c",2).`
+Breasons out the first two words in "a b c" ("a" and "b") 3 times.
+
+* (2) To breason out characters over a local machine, in the SWI-Prolog environment, enter:
+`[texttobr].`
+
+* In the SWI-Prolog environment, enter:
+`texttobr(N,File,String,M).`
+where N is the number of times to breason out the file, File is the file name, String is the string to breason out and M is the number of characters in the file to breason out, e.g.:
+* `texttobr(u,"file.txt",u,u).`
+Breasons out file.txt.
+* `texttobr(2,"file.txt",u,u).`
+Breasons out file.txt twice.
+* `texttobr(u,u,"Hello world.",u).`
+Breasons out "Hello world.".
+* `texttobr(3,u,"abc",2).`
+Breasons out the first two characters in "abc" ("a" and "b") 3 times.
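As a rough model of the argument convention shared by the examples above — N repetitions, a string of tokens, M leading tokens, with u meaning "use the default" — the behaviour can be sketched in Python (this illustrates the calling convention only; it is not the Prolog implementation, and the names are hypothetical):

```python
# Illustrative model of the texttobr/texttobr2 argument convention:
# N repetitions, a String of tokens, M = how many leading tokens to use;
# u stands for "unspecified, use the default". A sketch only, not the
# real Prolog implementation.
u = None  # the "u" argument: take the default

def tokens_to_breason(n, string, m, by_char=False):
    reps = 1 if n is u else n                    # default: once
    toks = list(string) if by_char else string.split()
    take = len(toks) if m is u else m            # default: whole string
    return toks[:take] * reps

# texttobr2(3,u,"a b c",2): first two words, three times
print(tokens_to_breason(3, "a b c", 2))          # ['a', 'b', 'a', 'b', 'a', 'b']
# texttobr(3,u,"abc",2): first two characters, three times
print(tokens_to_breason(3, "abc", 2, by_char=True))
```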
+
+* (3) To breason out words, breathsonings (judgement adjectives and judgement adverbs), rooms, parts of rooms, directions in rooms, objects to prepare for an action and objects to finish an action over a local machine, in the SWI-Prolog environment, enter:
+`['text_to_breasonings.pl'].`
+
+* In the SWI-Prolog environment, enter:
+`texttobr2(u,u,u,u,false,false,false,false,false,false).`
+where the first four arguments may be changed as in (1) above, and only words are breasoned out.
+* `texttobr2(u,u,u,u,true,false,false,false,false,false).`
+where the first four arguments may be changed as in (1) above, and only words and breathsonings are breasoned out.
+* `texttobr2(u,u,u,u,true,true,true,true,true,true).`
+where the first four arguments may be changed as in (1) above, and words, breathsonings (judgement adjectives and judgement adverbs), rooms, parts of rooms, directions in rooms, objects to prepare for an action and objects to finish an action are breasoned out.
+
+# Reading Algorithm
+
+* The algorithm often runs too quickly. To notice a number of words ("read them") in unread texts, where Master=6 algorithms, PhD=~16 algorithms and professor/politician=~50 algorithms, run with:
+```
+['../Text-to-Breasonings/text_to_breasonings.pl'].
+W is 50*4,texttobr2(u,u,u,u,false,false,false,false,false,false,W).
+% where W is the number of words to read,
+% 50 is the number of algorithms,
+% and there are approximately 4 words per algorithm.
+```
+
+# Installing and Running T2B with VPS Synchronisation
+
+* Use a Virtual Private Server (VPS) Sync to breason out new meditators' names on your machine, sync with the VPS, and run meditation and text to breasonings on the VPS (by pasting rcaw.paste into the terminal).
+* Install this repository using the instructions above.
+
+* To install, in username folder/texttobr2.sh replace the path `/username/codefolder/` with the location of the installation of this repository and move the file to e.g. your `/username/` folder.
+
+* To run, type `./texttobr2.sh` in the `/username/` folder and T2B will run (breasoning out only breasonings, not breathsonings, etc.), syncing with the VPS.
+
+# Repairing Corrupted Dictionary Files
+
+* If you accidentally enter a mistake in Text to Breasonings, you can repair the dictionary files using the following method.
+
+* Find texttobrall2_reading.pl or your Text to Breasonings algorithm file.
+* Open it with BBEdit.
+* Find the line containing "%%{writeln(L)}". For brdict2.txt, find the "%%{writeln(L)}" line following "filet([L|Ls]) --> entryt(L),",",".
+* Uncomment (delete "%%" from) the line "%%{writeln(L)}" and the other "%%{writeln(L)}" below it, and save the file.
+* Load Text to Breasonings with: ['text_to_breasonings.pl'].
+* Run Text to Breasonings with e.g. "N=u,M=u,texttobr2(N,"file.txt",u,M,false,false,false,false,false,false).".
+* Read the final dictionary entry before the error.
+* Open the dictionary file with BBEdit.
+* Locate the final entry "[tz,dash]," before the error "[⁃,dash],".
+* Delete or modify "[⁃,dash]," so that it doesn't contain the illegal character "⁃".
+* Re-enter the comments ("%%") that were removed before.
+* Load Text to Breasonings with:
+['../Text-to-Breasonings/text_to_breasonings.pl'].
+* Run Text to Breasonings with e.g. "N=u,M=u,texttobr2(N,"file.txt",u,M,false,false,false,false,false,false)." to check that the dictionary file is not corrupt.
+
+
+# Versioning
+
+We will use SemVer for versioning.
+
+# Authors
+
+Lucian Green - Initial programmer - Lucian Academy
+
+# License
+
+I licensed this project under the BSD 3-Clause License - see the LICENSE file for details
+
+
+# Addendum
+
+rcaw.paste added - this can be run each day to meet daily regimen requirements, such as preventing unwanted thoughts (mindreadtestsec.pl), giving meditators thoughts (mindreadingcaw.pl), giving thoughts about psychoanalysis (mindreadtestpsychiatrist.pl) and giving philosophical details (mindreadtestsecthoughts.pl). diff --git a/Text-to-Breasonings-master/Read_Me.txt b/Text-to-Breasonings-master/Read_Me.txt new file mode 100644 index 0000000000000000000000000000000000000000..adf5bc35c96195c17db281d89111d2f349a95659 --- /dev/null +++ b/Text-to-Breasonings-master/Read_Me.txt @@ -0,0 +1,3 @@ +Read Me
+
+Please read "Setting up a VPS with TextToBr.txt" to set up a VPS with TextToBr. \ No newline at end of file diff --git a/Text-to-Breasonings-master/Recordings.txt b/Text-to-Breasonings-master/Recordings.txt new file mode 100644 index 0000000000000000000000000000000000000000..fbb6c2a60b824f6ce7775a7822a1fd11bd91f7ec --- /dev/null +++ b/Text-to-Breasonings-master/Recordings.txt @@ -0,0 +1,191 @@ +Recordings
+
+Recordings from 30 As 1 Edit
+I meditated on the day I produced recordings.
+I confirmed training.
+Recordings relate to the rest of pedagogy.
+The delegator matched recordings (workloads) with different recordings.
+I endorsed the 5*50 As practicums.
+I can now be free.
+I noticed the other’s recordings.
+I produced the recording of the lecture.
+I flashed the recording.
+I observed sales looking up to recordings.
+I observed virality looking up to recordings.
+I was the main figure studied with the help of mantra meditation breasoning helpers.
+I noticed pedagogy in primary school like acting.
+I noticed pedagogy in vocational education and training like acting.
+The people were responsible.
+Meditators returned to the mantra by itself to become pedagogy helpers.
+Practising the mantra for longer than 2 months increases fame. +I defended the recording ability of you, myself and I. +I wore the white gown. +I worked on the breasoning in a group. +God agreed to recordings from 30 As. +I observed the person fall in love. +The institution made recordings available for 30 As. +I theatricised recordings. +The presenter presented the being with the highest quality thoughts in the universe. +I drew the line. +The study meditation reality was maintained. +Meditation expected mantra-only meditators to know the pedagogy time points and have studied education to be pedagogy helpers. +10-year olds started practising the mantra. +Year 10 used recordings. +Years 11 and 12 required 2*50 As per assignment. +There was quality of poles. +Disagreement (agreement) was discouraged (encouraged) in democracy. +Ignorance is a death blow (intelligence gives life). +The students were given recordings in return for 30 breasoned As. +I used recordings in Honours. +I used recordings in Masters by coursework. +I used recordings in Masters by research. +I used recordings in PhD. +I breasoned out the essay’s sentence. +I detailed out each sentence. +I wrote the practicum of 30 As. +The Queen helped me move past the pedagogy helper. +God helped the student write the breasoning. +Books of recordings are famous. +Courses of recordings work. +Recordings from 30 As made the student happy. +There was a religious recording. +The pedagogical is equivalent to the psychological. +I wrote the expensive, interesting recording. +I recorded the results of the doctor prescribing meditation. +I said I was because of recordings giving me what I wanted in cosmology. +I recognised the deadline of Aig recordings. +Weekly-full time undergraduates’ assignments were encouraged to users. +I cut off the recording. +I wrote the essay format and structure using recordings. +Recordings (pressing buttons) is better than breasonings (making babies). 
+Socrates supported non-famous recordings. +Meditation supported famous recordings. +Public universities had the largest band width. +Future Earth recorded lives. +Thoughts compatible with cosmology were famous. +The philosophers had famous lives. +I found the animal a new home. +The government took charge of the people. +I plugged recordings into reality. +I found perfect articulation. +This is the machinery. This is the mass-production of our age. +Neurorecordings are most likely to contain words in the topic when the topic is part of room. +I appealed (accepted) the false (correct) grade. +I identified the lieutenant. +I saw your pencil. +I saw your happy face. +I heard the pop. +I saw your tick. +I saw the insect. +I listened to your pop album. +I identified your malaria (healthiness). +I enjoyed the men’s company. +I identified the homophobe (gay friendly) person. +Recordings from 30 As 2 Edit +I ate the sweets. +I knew you dicotyledon. +I experienced your happiness. +I endorsed you. +I shaped the robot’s body. +I read on physiology. +I read on anatomy. +I shaped up the item on sale. +I protected you. +I made the robot hamstring. +I examined cellophane. +I saw your doodles. +I saw Happynices (sic). +I saw the person. +I recorded my conversation. +You gave input. You returned output. +I wrote the term. +I wrote my own exam on developed things. +I examined the perfect conformations. +I saw the white lady +I identified the new person. +I saw your neuse (sic). +I recovered using the polyhedron. +I saw the wilbercocks. +I ate the wilbercocks. +I saw the wilbercock. +I moved on. +I saw heapings. +I enamoured myself. +The void was vulgar (beautiful). +I succt-pined (sic) you. +I stroto-padded you. +I verified that the answer had the same meaning. +I compared with post-corporateness. +There will be no water war. +The quantum box looked like a square. +The nut and bolt looked like diamonds. +I noticed that the blanket made people feel good. +I am happy with recordings. 
+I proved that the minimum testing was done by testing against different combinations each with the same values in sets in databases. +A single case of the above was used as answer to verify against a unique meaning. +The same group’s elements should be used in the parts of the formula when it has the same meaning. +This is a modal truth table. +I saw the tilda seaweed in the void. +I rolled the s. +I meditated (was happy) when rolling the s. +I rolled the s in fine arts. +I meditated (was friendly) in fine arts. +I said that recordings were new. +Recordings didn’t (did) use chants. +Recordings were the way out. +I reduced breasonings to recordings by delegating them. +Recordings were enlarged breasonings. +The effects of implicit pedagogy (recordings) were similar to pedagogy. +The body and recordings are both multitasking. +Recordings are breasonings. +Recordings fill in details in breasonings. +Recordings complete arguments. +Recordings replace faulty breasonings. +Recordings support reasons and objections with ontologies. +Reasons support objects. +God is how to access recordings. +The big man performed the recordings. +Everyone had recordings and breasonings and could convince with masters and PhD. +The pedagogy reader wrote 10 breasonings per 50 arguments, then breasoned out the arguments for the planned child +The prospective pedagogue was directed to pedagogy with 20 brand-matics (appearances). +Accreditation removed non-pedagogues (agrees with pedagogues). +The pedagogue enjoyed the quality of life he or she wanted to. +When I performed any As I received recordings from the lecturer. +When I performed any As I received recordings from the professor. +When I performed any As I received recordings from Maharishi. +When I performed any As I received recordings from the royal +When I performed any As I received recordings from the politician. +I acknowledged the affirmation that 10 recordings had been done. +I played the man’s recordings to him. 
+Recordings were the breasonings on an education system. +There were 50 As of recordings. +The breasonings were synthesised and then the recordings of them were played. +Recordings by individuals generated essays. +In the future, people developed recordings-producing ability with computers and displays. +Recordings from 30 As 3 Edit +I generated recordings based on breasonings that I wrote. +The recordings were colourful. +A recording was a breasoning, which was a group of coloured 3D regions, with x, y and z dimensions. +Recordings were the same in secularism. +Recordings were accepted in the institution. +Infinite (finite) breasonings for recordings prevented headaches in non-secularism when turned off. +Recordings were imminent after death. +Recordings were given en-masse. +Royal pedagogy was in the law. +The recorded breasoning “stop” interjected. +I traced the recording to its original breasoning and breasoned it out. +The recordings came from my mouth. +I saw the recordings. +Recordings invite time travellers to visit their producers because they are breasoned out in a high quality way. +I saw the saints describing the recordings. +I saw the recordequins. +Recordings worked in all departments. +Simulated recordings came before thoughts not in recordings in thoughts of the robot. +I recorded the good idea to prevent crime in the world. +The recordings worked at the touch of a button. +The recordings were achieved as a means of production. +I submitted the assignment after playing the recordings. +I found fault (agreed) with the recording’s object. +I rebelled to objects to be recordings (agreed with its base). +Recordings described the person. +Recordings allowed me to be there. 
\ No newline at end of file diff --git a/Text-to-Breasonings-master/Setting_up_a_VPS_with_TextToBr.txt b/Text-to-Breasonings-master/Setting_up_a_VPS_with_TextToBr.txt new file mode 100644 index 0000000000000000000000000000000000000000..ccd456294ea020984a0265601e18a8e1c1275bf3 --- /dev/null +++ b/Text-to-Breasonings-master/Setting_up_a_VPS_with_TextToBr.txt @@ -0,0 +1,74 @@ +% member(List,Item) (below) is part of deprecated lpi + +Setting up a VPS (Virtual Private Server) with TextToBr + +Use a VPS (necessary because of the time the computation takes) to achieve the effects of meditation without meditating. It provides many daily benefits, such as improved health. + +Please read Instructions for Using texttobr(2).pl.txt before installing. + +Note: These instructions are for Mac, but you can modify steps 6 and possibly 9 for Windows and Linux. + +Note: Replace /Users/yourname/yourfolder/ (in edit.pl lines 12 and 13 and the following) with the folder the rest of the files will be in, /var/www/yourdomain.com/ (edit.pl line 14 and the following) with the home folder on the VPS, and xxx.xxx.xxx.xxx (edit.pl line 14 and the following) with your VPS's IP address. + +Replace the stub meditator and doctor names in meditationnoreplace.pl and medicinenoreplace.pl respectively. Meditators are taught by Lucian Academy, or by a teacher with 50 Meditation As who dots on the utterance-arem link daily; doctors are those who need the effects of medicine. + +1. Sign up for a free account with a website hosting company. +2. Sign up for a free or paid domain name inside the website hosting company's control panel. +3. Sign up for a VPS account (about $10 per 3 months). +4. Ask the admin to install SWI-Prolog for Linux (e.g. Debian - https://packages.debian.org/stable/swi-prolog). +5. Set up an FTP account on the VPS using the special control panel. +6. 
Allow signing in to your VPS account without having to type your password each time: + +How to set up an SSH key in the Mac terminal and save the public key for logging in: +https://www.linode.com/docs/security/authentication/use-public-key-authentication-with-ssh/ +Does an SSH key still ask for a password and passphrase? +https://stackoverflow.com/questions/21095054/ssh-key-still-asking-for-password-and-passphrase +Where the SSH configuration file is on a Mac: +http://osxdaily.com/2011/04/05/setup-ssh-config-fie/ +Press Command-Shift-. to see invisible Finder files, and use AirDrop rather than scp or FTP to transfer files to new Macs. + +7. Copy the contents of the folder to "yourfolder". + +8. Upload them to the VPS's /var/www/yourdomain.com/ folder. + +Note: Read the following file to change the number of people this script aims for: + +Memory returned when suffering from Parkinsons.txt + +9. Save the following (it is not a shell script) in a text file, then log in and paste it into the Mac terminal. + +ssh root@xxx.xxx.xxx.xxx -p 22 +%% If SSH keys are not installed, paste the rest after entering the password; otherwise paste the whole thing. + +cd ../var/www/yourdomain.com/ +swipl -G100g -T20g -L2g + +[meditationnoreplace]. +meditation. + +halt. +swipl -G100g -T20g -L2g +[meditationnoreplace]. + +[medicinenoreplace]. +medicine. + +leash(-all),visible(+all). +protocol("./file.txt"). +trace. +caw00(off,[n,f],[[[n,append],2,1],[[n,delete],2,1],[[n,head],1,1],[[n,tail],1,1],[[n,member],1,1]],2,8,_InputVarList,_OutputVarList,[],_Program2,Ps). +notrace,noprotocol. + +halt. +swipl -G100g -T20g -L2g +['../Text-to-Breasonings/text_to_breasonings.pl']. + +time((N is 21*3*3*3*4, M is 4000, texttobr2(N,u,u,M),texttobr(N,u,u,M))). + +halt. +swipl -G100g -T20g -L2g +['../Text-to-Breasonings/text_to_breasonings.pl']. + +time((N is 21*3*3*2*4, M is 4000, texttobr2(N,u,u,M),texttobr(N,u,u,M))).
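+The links in step 6 boil down to roughly the following commands. This is a minimal sketch assuming OpenSSH on macOS or Linux; the throwaway key directory, the key filename id_ed25519_vps and the Host alias "vps" are illustrative choices, not part of this repository, and xxx.xxx.xxx.xxx stands for your VPS's IP address as above.

```shell
#!/bin/sh
set -e

# A throwaway directory is used so this sketch does not touch your real
# ~/.ssh; in practice you would generate the key in ~/.ssh itself.
KEYDIR="$(mktemp -d)"

# 1. Generate a key pair; -N "" means no passphrase (set one if you prefer).
ssh-keygen -q -t ed25519 -f "$KEYDIR/id_ed25519_vps" -N ""

# 2. Install the public key on the VPS (run manually; asks for the
#    password one last time):
#    ssh-copy-id -i "$KEYDIR/id_ed25519_vps.pub" root@xxx.xxx.xxx.xxx

# 3. A Host alias in the SSH config so "ssh vps" picks up the key
#    automatically.
cat > "$KEYDIR/config" <<EOF
Host vps
    HostName xxx.xxx.xxx.xxx
    User root
    Port 22
    IdentityFile $KEYDIR/id_ed25519_vps
EOF

ls "$KEYDIR"
```

+After running the commented ssh-copy-id line and moving the key pair and config stanza into ~/.ssh, logging in with ssh should no longer prompt for a password, which lets the pasted session in step 9 run unattended.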
+ + diff --git a/Text-to-Breasonings-master/a15_meditators b/Text-to-Breasonings-master/a15_meditators new file mode 100644 index 0000000000000000000000000000000000000000..2f6940fc0c31b7d6051a1d99dc3a2f0e6979d79d Binary files /dev/null and b/Text-to-Breasonings-master/a15_meditators differ diff --git a/Text-to-Breasonings-master/a15_meditators.pl b/Text-to-Breasonings-master/a15_meditators.pl new file mode 100644 index 0000000000000000000000000000000000000000..c3a6e3d032091139877acf71c82b6213031e5a7b --- /dev/null +++ b/Text-to-Breasonings-master/a15_meditators.pl @@ -0,0 +1,7 @@ +:-include('meditatorsanddoctors.pl'). +:-include('text_to_breasonings.pl'). +main:-catch((meditators(A),meditators2(B),length(A,AL),length(B,BL),CL is AL+BL,N1 is 15*4*(16000/250),time(texttobr2_1(N1)),time(texttobr2_1(CL))),Err,handle_error(Err)),halt. + +handle_error(_Err):- + halt(1). +main :- halt(1). \ No newline at end of file diff --git a/Text-to-Breasonings-master/b0.sh b/Text-to-Breasonings-master/b0.sh new file mode 100644 index 0000000000000000000000000000000000000000..0dda4b95beeec6b5cd412111155320258aba6a96 --- /dev/null +++ b/Text-to-Breasonings-master/b0.sh @@ -0,0 +1,2 @@ +./b01 +./b02 \ No newline at end of file diff --git a/Text-to-Breasonings-master/bc12.sh b/Text-to-Breasonings-master/bc12.sh new file mode 100644 index 0000000000000000000000000000000000000000..334a2f695aa73926428abe6aa13fc88d4b7c4f84 --- /dev/null +++ b/Text-to-Breasonings-master/bc12.sh @@ -0,0 +1,4 @@ +./c1 +./b1 +./c2 +./b2 \ No newline at end of file diff --git a/Text-to-Breasonings-master/brdict1.txt b/Text-to-Breasonings-master/brdict1.txt new file mode 100644 index 0000000000000000000000000000000000000000..5c4fff9f66a8b20c95438d45a6f5482f1b08a3b7 --- /dev/null +++ b/Text-to-Breasonings-master/brdict1.txt @@ -0,0 +1 @@ 
+[[,box],[a,up],[aa,dash],[aaa,dash],[aaa,square],[aaaaaaaahai,dash],[aaaaaaaahzk,dash],[aaaaaaaahzo,dash],[aaaabbbbcccc,square],[aaaaddddeeee,square],[aaab,square],[aaabaad,square],[aaaccc,square],[aaaf,square],[aab,dash],[aabort,stop],[aad,n],[aaf,dash],[aahs,mouth],[aalternatives,dash],[aami,company],[aardvar,dash],[aardvark,],[aaron,person],[aas,dash],[aay,variable],[aaz,dash],[ab,square],[aba,square],[abab,square],[ababa,city],[abacus,],[abandon,minus],[abandoned,minus],[abandoning,down],[abb,dash],[abbagnano,person],[abbb,dash],[abbc,square],[abbcc,square],[abbebc,square],[abbey,building],[abbie,person],[abbjj,square],[abbott,person],[abbrev,square],[abbreviate,square],[abbreviated,n],[abbreviation,square],[abbreviations,square],[abby,person],[abc,dash],[abcc,dash],[abcd,square],[abcda,dash],[abcdbcd,dash],[abcde,square],[abcdef,dash],[abcdefghijklm,dash],[abcdefghijklmnopqrstuvwxyzabcdefghijklmnopqrstuvwxyz,square],[abcfde,square],[abcg,dash],[abcid,dash],[abd,square],[abdashdfj,dash],[abdolmohammadi,person],[abdomen,stomach],[abdominal,stomach],[abducing,up],[abducingness,down],[abductive,down],[abductor,person],[abdullah,person],[abdus,person],[abhi,person],[abidance,plus],[abide,plus],[abided,plus],[abiding,plus],[abidingness,plus],[abilities,plus],[ability,arm],[abl,variable],[ablative,square],[able,arm],[abled,plus],[ableism,minus],[ablett,person],[abnegation,minus],[aboard,dot],[abolish,xcorrection],[abolished,xcorrection],[aboriginal,person],[abort,xcorrection],[aborted,xcorrection],[aborting,box],[abortion,xcorrection],[aborts,xcorrection],[about,dash],[above,up],[aboveness,up],[abple,dash],[abracadabra,plus],[abracadabrae,plus],[abracadabras,plus],[abrahamson,person],[abrgepag,dash],[abroad,right],[abs,plus],[absdiffmean,n],[absence,minus],[absent,zero],[absolute,n],[absolutely,n],[absorb,down],[absorbed,down],[absorbent,water],[absorbing,down],[absorptive,down],[abstain,zero],[abstained,xcorrection],[abstaining,xcorrection],[abstract,down],[abstract
ed,down],[abstraction,box],[abstractions,box],[abstractly,box],[abstracts,square],[absurdia,country],[absurdity,minus],[absurdum,minus],[abundant,plus],[abuse,minus],[abusers,minus],[abuses,minus],[abusive,minus],[abut,dash],[abyss,box],[ac,dash],[aca,dash],[academia,building],[academic,person],[academically,book],[academician,person],[academics,person],[academies,building],[academy,building],[acae,person],[acald,person],[acan,person],[acatoday,dash],[acb,dash],[acc,right],[accelerate,right],[accelerated,n],[acceleration,n],[accent,],[accents,square],[accept,left],[acceptability,tick],[acceptable,tick],[acceptably,plus],[acceptance,plus],[accepted,tick],[accepting,plus],[accepts,tick],[access,tick],[accessed,down],[accesses,plus],[accessibility,plus],[accessible,plus],[accessing,dash],[accident,minus],[accidentally,plus],[accidentals,up],[accidents,minus],[acclimatise,plus],[accolade,plus],[accommodate,plus],[accommodated,plus],[accommodating,plus],[accommodation,bed],[accompanied,plus],[accompanies,and],[accompaniment,note],[accompany,right],[accompanying,and],[accomplish,tick],[accomplished,plus],[accomplishments,plus],[accordance,right],[accordian,box],[according,dash],[accordingly,plus],[accorhotels,company],[account,plus],[accountability,plus],[accountable,plus],[accountant,person],[accounted,plus],[accounting,plus],[accounts,square],[accred,plus],[accredit,plus],[accreditation,tick],[accredited,tick],[accrediting,plus],[accrue,up],[accruing,dollarsymbol],[accumulation,up],[accumulator,plus],[accuracy,tick],[accurate,tick],[accurately,equals],[accusative,square],[accuses,xcorrection],[acd,square],[ace,plus],[acecqa,company],[aces,plus],[acetic,],[ache,line],[achelois,square],[aches,minus],[achievable,plus],[achieve,plus],[achieved,tick],[achievement,plus],[achievements,plus],[achiever,person],[achievers,person],[achieves,tick],[achieving,plus],[aching,minus],[acid,],[acidophilus,ball],[acids,acid],[ack,dash],[ackard,person],[acknowledge,plus],[acknowledged,pape
r],[acknowledgement,plus],[acknowledging,plus],[acls,dash],[acm,dash],[acn,n],[acnc,company],[acoba,paper],[acorn,],[acos,plus],[acosh,plus],[acoustic,ear],[acoustics,ear],[acquire,up],[acquired,tick],[acquiring,left],[acquisition,dollarsymbol],[acquisitions,dollarsymbol],[acronym,a],[acronyms,square],[across,right],[act,box],[acted,right],[acting,tick],[action,tick],[actional,right],[actioned,plus],[actions,square],[activate,plus],[activated,tick],[activates,plus],[activating,plus],[activation,plus],[active,plus],[actively,plus],[activist,person],[activities,square],[activity,plus],[actor,person],[actors,person],[actress,person],[acts,tick],[actual,a],[actualisation,plus],[actualise,plus],[actually,plus],[actuarial,dollarsymbol],[acumen,plus],[acupressure,square],[acupuncture,needle],[acupuncturist,person],[acur,person],[acute,minus],[acuña,person],[acyclic,right],[acyclical,dash],[ad,right],[ada,square],[adage,plus],[adam,person],[adams,person],[adapt,plus],[adaptability,plus],[adaptable,plus],[adaptation,plus],[adapted,plus],[adapter,right],[adapting,plus],[add,plus],[addc,square],[added,plus],[addededfc,square],[addeditems,box],[addee,square],[addel,person],[addendum,square],[adder,plus],[adders,plus],[addfromtolang,plus],[adding,plus],[addis,city],[addition,plus],[additional,plus],[additionally,plus],[additions,plus],[additive,plus],[additives,plus],[addorsubtract,plus],[addpos,plus],[addr,square],[address,house],[addressed,tick],[addressee,person],[addresses,square],[addressing,right],[addrules,plus],[adds,plus],[addye,person],[ade,square],[adelaide,city],[adequate,plus],[adequately,plus],[adfa,university],[adhd,minus],[adhere,equals],[adhered,equals],[adhering,equals],[adhesive,equals],[adict,paper],[adie,dash],[adjacent,equals],[adjective,plus],[adjectively,plus],[adjectives,apple],[adjourned,plus],[adjudication,plus],[adjudicator,person],[adjust,plus],[adjustable,n],[adjusted,right],[adjusting,plus],[adjustment,right],[adjustments,plus],[adkins,person],[adm
in,person],[admini,person],[administer,plus],[administered,plus],[administrated,plus],[administration,plus],[administrative,book],[administrator,person],[administrators,person],[adminpw,square],[adminuser,person],[admiration,plus],[admire,plus],[admired,plus],[admiring,plus],[admit,plus],[admitted,plus],[admittedly,plus],[admitting,plus],[adney,dash],[ado,plus],[adolescence,n],[adolescent,person],[adolescents,person],[adopt,up],[adopted,plus],[adoption,plus],[adore,plus],[adored,plus],[adorn,plus],[adorned,plus],[adorno,person],[adoro,heart],[adr,person],[adrard,person],[adree,person],[adrel,dash],[adrenal,dot],[adrenalin,],[adrenaline,ball],[adrian,person],[adrians,person],[adrl,person],[ads,square],[adsense,software],[adulation,plus],[adult,person],[adultery,minus],[adulthood,n],[adults,person],[adumbration,box],[adumbrations,square],[adv,paper],[advance,minus],[advanced,a],[advancement,right],[advances,right],[advancing,right],[advantage,dash],[advantageous,plus],[advantages,plus],[advent,zero],[adventure,book],[adverb,n],[adverbially,plus],[adverbs,right],[adverse,minus],[adversely,minus],[advertise,dollarsymbol],[advertised,square],[advertisement,square],[advertisements,plus],[advertisers,person],[advertises,square],[advertising,square],[advice,plus],[advisable,plus],[advise,plus],[advised,plus],[advises,plus],[advising,plus],[advisor,person],[advocacy,plus],[advocate,person],[advocated,plus],[advocates,person],[advocating,plus],[adwords,product],[adye,person],[ae,dash],[aea,dash],[aec,dash],[aeg,dash],[aeiouyw,square],[aerial,air],[aerobics,lycra],[aerosol,],[aerotopically,air],[aese,company],[aest,square],[aesthetes,person],[aesthetic,square],[aesthetics,square],[aetiological,minus],[af,dash],[afa,dash],[afar,n],[affairs,paper],[affect,right],[affected,plus],[affecting,plus],[affection,plus],[affective,plus],[affects,right],[affgg,square],[affiliate,person],[affiliated,plus],[affiliations,equals],[affirm,tick],[affirmation,tick],[affirmative,plus],[affirmed,s
quare],[affirming,plus],[affliction,minus],[affluent,plus],[afford,dollarsymbol],[affordable,dollarsymbol],[afforded,dollarsymbol],[afgde,square],[afghanistan,country],[afield,right],[afl,football],[afloat,up],[afplay,software],[afraid,minus],[afresh,plus],[africa,continent],[african,person],[afro,hair],[after,plus],[afterlife,line],[afternoon,square],[afterpay,company],[afterwards,right],[afunctional,zero],[afunctionalism,zero],[ag,dash],[again,two],[against,minus],[age,n],[aged,n],[ageing,minus],[agencies,building],[agency,company],[agenda,paper],[agent,person],[agents,person],[ages,n],[aggrandising,minus],[aggravate,minus],[aggregate,n],[aggress,minus],[aggression,minus],[aggressive,minus],[agile,plus],[aging,minus],[agnosticism,box],[agnostics,person],[agnès,person],[ago,left],[agogo,box],[agps,paper],[agree,plus],[agreeable,plus],[agreeableness,plus],[agreed,tick],[agreeing,plus],[agreement,tick],[agreements,tick],[agreer,plus],[agrees,plus],[agricultural,plant],[agriculture,apple],[ah,dash],[aha,plus],[ahash,hash],[ahead,plus],[ahhii,square],[ahoniemi,person],[ahrefs,company],[ai,box],[aiaconnect,dash],[aid,plus],[aida,box],[aidan,person],[aide,person],[aided,plus],[aides,person],[aiding,plus],[aids,plus],[aif,line],[aiff,line],[aig,plus],[aigs,square],[ailments,minus],[aim,plus],[aime,heart],[aimed,dot],[aimedness,dot],[aiming,dot],[aimingness,dot],[aims,dot],[ainger,person],[aingers,person],[air,box],[aircraft,],[airdrop,down],[aired,air],[aireys,city],[airlifter,aircraft],[airlock,door],[airplane,plane],[airplay,line],[airplayradio,company],[airport,plane],[airs,computer],[airship,balloon],[airships,balloon],[airtight,box],[airuniversity,building],[airways,down],[aiscript,paper],[aiscripts,square],[aisha,person],[aisle,square],[aitken,person],[ajjkk,square],[aka,right],[akin,plus],[aks,dash],[al,dash],[aladdin,person],[alaloviala,plus],[alan,person],[alarm,n],[alarms,ear],[albania,country],[albaum,person],[albert,person],[albie,person],[albright,person],[al
bum,line],[albums,album],[alcohol,],[alcoholics,person],[ald,person],[aldermen,person],[aldo,person],[aleisha,person],[alejandra,person],[aleph,dot],[alert,plus],[alerted,xcorrection],[alerts,xcorrection],[aletheia,eye],[alex,person],[alexander,person],[alexandra,person],[alexia,person],[alexis,person],[alexius,person],[alfred,person],[alg,and],[algae,],[algal,algae],[algdict,book],[algebra,variable],[algebraic,plus],[algeria,country],[algname,and],[algohedrons,box],[algorithm,and],[algorithmic,and],[algorithmically,and],[algorithmlibrary,box],[algorithmname,square],[algorithms,twobox],[algorithmstopredicates,square],[algorithmwriter,software],[algs,tworectangle],[algwriter,and],[algwriterart,paper],[ali,person],[alia,plus],[alias,right],[alien,person],[alienated,minus],[alienation,minus],[aliens,person],[alight,down],[alighted,down],[align,line],[aligned,line],[aligning,equals],[alignment,line],[aligns,right],[alima,person],[aliotta,person],[aliquot,ball],[alison,person],[alive,plus],[aliveness,plus],[all,square],[allan,person],[allans,person],[allargs,box],[allayed,plus],[alleged,xcorrection],[allegiances,plus],[allele,n],[allen,person],[alleviate,plus],[alleviated,plus],[alleviating,plus],[allez,right],[allfibs,plus],[alliance,threerectangle],[alliances,plus],[alliteration,equals],[allmm,square],[allnumbers,n],[allocate,right],[allocated,right],[allocating,box],[allocation,dash],[allocator,person],[allotment,box],[allow,tick],[allowable,plus],[allowance,minus],[allowances,dollarsymbol],[allowed,plus],[allowing,plus],[allows,plus],[allports,building],[alls,n],[allsqrts,down],[alluring,plus],[allusednames,square],[allusion,dash],[ally,plus],[allyson,person],[almanacs,book],[almighty,plus],[almond,],[almossawi,person],[almost,zero],[alnum,square],[aloft,up],[alone,one],[along,right],[alongside,right],[aloud,mouth],[alpha,a],[alphabet,a],[alphabetic,square],[alphabetical,square],[alphabetically,square],[alphanumeric,square],[alphawheat,plant],[already,tick],[alright,
tick],[also,plus],[alt,button],[alter,two],[altered,right],[alterego,software],[altering,right],[alternate,two],[alternating,dot],[alternative,two],[alternatively,and],[alternatives,two],[although,minus],[althusser,person],[altinkök,person],[alto,tube],[altogether,box],[altruism,plus],[aluminum,],[alumni,person],[alveolus,ball],[always,one],[alynch,person],[alyson,person],[alzheimer,minus],[am,equals],[amalgam,],[amalgamate,right],[amalgamated,right],[amalgamating,right],[amalgamation,right],[amap,plus],[amazed,eye],[amazement,plus],[amazing,plus],[amazingly,plus],[amazon,company],[amazonaws,square],[ambassador,person],[ambassadors,person],[ambiguities,minus],[ambiguity,minus],[ambiguous,minus],[ambiguously,minus],[ambition,plus],[ambitions,plus],[ambitious,plus],[ambulance,],[amd,and],[ameliorated,plus],[amen,hand],[amendment,plus],[america,country],[americaforecast,company],[american,person],[americans,person],[ames,person],[amethyst,stone],[amiable,plus],[amicable,plus],[amid,right],[amino,molecule],[amit,person],[amnesiacs,person],[among,box],[amongst,box],[amoral,minus],[amorphous,amount],[amortised,down],[amount,n],[amounts,n],[amp,plus],[ampersand,and],[ampersands,square],[amphitheatre,],[amphora,],[ample,plus],[amplifier,right],[amplitude,n],[amputated,box],[amsterdam,city],[amygdala,],[an,up],[anaconda,and],[anacrusis,note],[anaeirenate,plus],[anaesthetised,ball],[anagram,square],[anagrams,right],[anal,eye],[analects,book],[analog,right],[analogies,right],[analogue,hand],[analogy,plus],[analysable,plus],[analyse,tick],[analysed,and],[analyser,plus],[analysers,plus],[analyses,tick],[analysing,and],[analysis,and],[analyst,person],[analytic,and],[analytical,right],[analytics,and],[analyzing,right],[anangisye,person],[ananta,person],[ananya,person],[anaphor,right],[anaphors,person],[anarchic,plus],[anarchism,person],[anarchy,minus],[anatomical,cell],[anatomicopathological,plus],[anatomy,body],[ancestors,person],[anchor,x],[anchored,down],[anchoring,anchor],[anc
ient,minus],[ancients,dot],[and,],[anded,and],[anderson,person],[andersons,person],[andorra,country],[andrew,person],[andrews,person],[android,box],[andto,and],[andy,person],[anecdote,mouth],[anemia,minus],[anfara,person],[ang,person],[angel,person],[angela,person],[angeles,person],[angelic,plus],[angelo,person],[anger,minus],[angie,person],[anginas,minus],[angle,n],[angled,n],[angles,n],[angling,fish],[anglo,person],[angola,country],[angry,minus],[angular,n],[animal,],[animalness,animal],[animalproductsa,box],[animalround,n],[animals,],[animaly,animal],[animated,plus],[animation,square],[animations,right],[animator,person],[animhighereddetails,paper],[animhigheredreport,paper],[animus,dot],[anita,person],[anjelica,person],[ankle,],[ann,person],[anna,person],[annabelle,person],[anne,person],[annette,person],[annie,person],[annihilated,xcorrection],[anniversary,n],[annotate,square],[annotated,square],[annotation,square],[announce,mouth],[announced,mouth],[announcement,square],[annoyance,minus],[annoying,minus],[annual,square],[annually,square],[annuasly,dot],[annul,xcorrection],[annunciated,plus],[anointed,down],[anon,person],[anone,right],[anonymised,square],[anonymous,square],[anonymously,square],[another,plus],[anothers,dash],[ans,square],[anscombe,person],[anson,person],[ansto,organisation],[answer,a],[answered,square],[answerer,person],[answerers,person],[answering,person],[answermode,right],[answers,a],[ant,],[antagonism,minus],[antagonist,person],[antaki,person],[antarctica,country],[antecedant,right],[antecedants,right],[antecedent,a],[antecedents,square],[antenna,],[anthem,note],[anther,],[anthony,person],[anthrop,person],[anthropological,person],[anthropologically,person],[anthropologist,person],[anthropology,person],[anti,minus],[antialiased,square],[antibodies,antibody],[antibody,],[anticipate,plus],[anticipated,plus],[anticipating,left],[anticipation,plus],[anticipatory,minus],[anticlimax,minus],[antidepressant,plus],[antidifferentiated,up],[antidigressi
ng,equals],[antigua,country],[antihallucinogenic,xcorrection],[antioxidants,ball],[antiphoclemone,person],[antipodean,person],[antipodeanism,person],[antipsychotic,xcorrection],[antiquities,n],[antiquity,n],[antisocial,minus],[antithesis,xcorrection],[antitrust,minus],[antiviral,plus],[antoinette,person],[anton,person],[antoncic,person],[antonia,person],[antonio,person],[antonyms,xcorrection],[ants,ant],[antónio,person],[anu,university],[anubis,dog],[anus,tube],[anxieties,minus],[anxiety,minus],[anxiousness,minus],[any,one],[anyfunction,function],[anymore,zero],[anynumber,n],[anyone,person],[anything,one],[anytime,tick],[anyway,one],[anywhere,dot],[anz,bank],[aorist,right],[ap,dash],[apa,square],[apac,dash],[apache,software],[apart,minus],[apathetically,zero],[ape,],[aperture,box],[apex,dot],[aphohedron,box],[aphor,apple],[aphoria,plus],[aphorism,square],[aphorisms,square],[aphors,apple],[api,software],[apical,dot],[apis,software],[apkajlohf,dash],[apocalyptic,minus],[apologies,xcorrection],[apologise,plus],[apologised,xcorrection],[apologising,plus],[apologize,xcorrection],[apologized,plus],[apology,xcorrection],[apoptosis,n],[apostrophe,square],[apostrophes,square],[app,paper],[apparatus,box],[apparent,plus],[apparently,dot],[appeal,plus],[appealed,plus],[appealing,plus],[appeals,plus],[appear,square],[appearance,square],[appearances,square],[appeared,square],[appearing,tick],[appears,square],[append,plus],[appenda,box],[appended,tworectangle],[appendeof,square],[appendices,paper],[appending,threerectangle],[appendix,box],[appendlist,plus],[appendlogic,and],[appendpredlist,plus],[appends,plus],[appendz,square],[appetising,apple],[appetite,apple],[appetites,stomach],[applaud,plus],[applauded,plus],[applause,person],[apple,],[applef,dash],[apples,apple],[applesoft,square],[applicability,plus],[applicable,plus],[applicant,person],[applicants,person],[application,box],[applications,plus],[applicative,right],[applicator,down],[applied,tick],[applies,plus],[apply,paper]
,[applying,plus],[applys,right],[appoint,plus],[appointed,dot],[appointer,plus],[appointing,plus],[appointment,square],[appointments,square],[appoints,plus],[appraisal,plus],[appreciate,plus],[appreciated,plus],[appreciating,plus],[appreciative,plus],[approach,right],[approached,right],[approaches,right],[approaching,right],[appropriate,tick],[appropriated,right],[appropriately,plus],[appropriateness,plus],[appropriating,right],[approval,plus],[approve,plus],[approved,plus],[approves,plus],[approx,line],[approximate,line],[approximated,box],[approximately,box],[approximations,n],[apps,box],[appt,square],[apr,dash],[apricot,],[april,square],[apropos,tick],[apt,plus],[aptitude,plus],[aptitudes,plus],[aqa,square],[aqf,square],[aqualung,],[aquanauts,person],[aquarium,],[ar,variable],[arab,country],[arabia,country],[arabic,square],[arachnid,spider],[arachnids,spider],[aran,person],[arbiter,person],[arbitrarily,plus],[arbitrary,n],[arbitrate,plus],[arbitrating,plus],[arbitration,plus],[arc,semicircle],[arch,],[archaeological,down],[archaeologist,person],[archaeology,down],[archaic,minus],[archduke,person],[archeological,down],[archeology,down],[archibald,person],[archie,person],[arching,up],[archipelago,],[archipelagos,triangle],[architect,person],[architects,person],[architectural,building],[architecturally,building],[architecture,column],[architectures,building],[architrave,],[archival,book],[archive,paper],[archived,box],[archives,book],[archiving,down],[arcobaleno,square],[arcs,semicircle],[arctic,dot],[ard,person],[ardagh,person],[ardamon,person],[arduino,and],[are,equals],[area,square],[areas,square],[arecibos,city],[arem,dash],[arems,square],[aren,minus],[arg,variable],[argan,],[argentina,country],[args,a],[arguably,plus],[argue,dash],[argued,square],[argues,square],[arguing,dash],[argument,a],[argumentary,square],[argumentation,book],[argumentative,square],[argumentname,square],[argumentnumber,n],[arguments,a],[argumentslength,n],[argv,n],[ari,person],[aria,ear],[
ariable,dash],[ariana,person],[arianne,person],[arise,up],[arisen,up],[arises,up],[arising,up],[aristotelian,person],[aristotle,person],[arithmetic,plus],[arithmetical,plus],[arities,n],[arity,n],[arkann,robot],[arm,],[armadale,suburb],[armadillo,],[armenia,country],[armenian,person],[armor,square],[armour,],[arms,arm],[armshire,square],[armstrong,person],[army,person],[arocket,rocket],[aroma,rose],[aromatherapy,bottle],[aron,person],[arose,up],[around,right],[aroused,plus],[arpeggio,note],[arran,person],[arrange,dot],[arranged,note],[arrangement,note],[arrangements,note],[arranging,pen],[array,grid],[arrays,square],[arreola,person],[arrest,xcorrection],[arrested,xcorrection],[arrival,dot],[arrive,right],[arrived,dot],[arrives,dot],[arriving,dot],[arrogant,minus],[arrow,right],[arrows,right],[arsenic,ball],[art,square],[artefacts,square],[artemis,person],[arterial,tube],[arteries,tube],[artesian,plus],[arthritis,minus],[arthur,person],[artichoke,],[artichokes,artichoke],[article,paper],[articles,book],[articulate,mouth],[articulated,up],[articulating,mouth],[articulation,tongue],[artie,person],[artifice,plus],[artificial,plus],[artificially,and],[artisanal,plus],[artist,person],[artistic,plus],[artistically,plus],[artists,person],[arts,square],[artwork,],[artworks,box],[as,equals],[asa,book],[asahi,newspaper],[asana,company],[asanas,mat],[asanet,book],[asap,one],[asc,n],[ascend,up],[ascended,up],[ascending,up],[ascends,up],[ascension,up],[ascent,up],[ascertain,plus],[ascertained,plus],[ascertaining,plus],[aschaffenburg,person],[aschonfelder,person],[ascii,n],[ascp,book],[ascribe,right],[ascribing,right],[ashley,person],[ashok,person],[asia,country],[asian,person],[asians,person],[asic,company],[aside,right],[asin,f],[ask,questionmark],[asked,questionmark],[asking,right],[askpass,square],[asks,person],[asleep,bed],[asn,organisation],[asp,dash],[asparagus,],[aspect,box],[aspects,dash],[asperger,minus],[aspergers,minus],[aspersion,minus],[aspires,up],[aspx,dash],[ass,p
aper],[assassination,minus],[assault,minus],[assemble,square],[assembled,box],[assembles,box],[assembly,box],[assert,plus],[asserta,box],[asserted,plus],[assertively,plus],[asserts,plus],[assertz,dash],[assess,tick],[assessable,tick],[assessed,tick],[assesses,plus],[assessing,tick],[assessment,paper],[assessments,paper],[assessor,person],[assessors,person],[asset,dollarsymbol],[assets,box],[assign,equals],[assigned,right],[assigning,equals],[assignment,paper],[assignments,paper],[assigns,equals],[assimilate,equals],[assimilated,equals],[assimilation,box],[assist,plus],[assistance,plus],[assistant,person],[assistants,person],[assisted,plus],[assisting,plus],[assistive,plus],[assists,plus],[assoc,association],[associate,plus],[associated,dash],[associates,person],[associating,right],[association,company],[associations,right],[associative,dash],[associatively,equals],[associativities,and],[associativity,right],[assume,plus],[assumed,plus],[assumes,tick],[assuming,left],[assumption,plus],[assumptions,square],[assurance,plus],[assure,plus],[assured,plus],[assures,plus],[asterisk,],[asthma,minus],[astonished,plus],[astray,minus],[astrol,star],[astrologer,person],[astrology,star],[astronaut,person],[astronauts,person],[astronomy,star],[astroturf,square],[asx,company],[asylum,plus],[asymmetrical,minus],[async,notequals],[asynchronous,tworectangle],[at,dash],[atacama,city],[atan,right],[atar,n],[atb,dash],[ate,spoon],[atest,],[ath,n],[atheism,zero],[atheist,zero],[atheistic,zero],[atheists,person],[athlete,person],[athletes,person],[athletics,right],[ati,company],[atkinson,person],[atlantic,water],[atmosphere,box],[atmospheric,air],[atom,],[atomic,box],[atoms,box],[atomtoken,n],[atrium,room],[ats,dash],[attach,box],[attached,plus],[attaches,equals],[attaching,equals],[attachment,square],[attachments,paper],[attack,right],[attacked,xcorrection],[attacker,person],[attackers,minus],[attacking,minus],[attacks,minus],[attain,tick],[attainable,plus],[attained,right],[attaining,plu
s],[attainment,hand],[attempt,tick],[attempted,plus],[attemptedly,plus],[attempting,plus],[attempts,plus],[attend,down],[attendance,plus],[attendant,person],[attended,plus],[attendees,person],[attender,person],[attending,tick],[attention,tick],[attentional,plus],[attentive,plus],[attentively,plus],[attest,plus],[attesting,plus],[attic,room],[attire,shirt],[attitude,plus],[attitudes,mind],[attorney,person],[attract,plus],[attracted,dash],[attracting,plus],[attraction,plus],[attractions,plus],[attractive,plus],[attribute,n],[attributed,dash],[attributes,square],[attuned,equals],[au,country],[auburn,square],[auction,dollarsymbol],[aud,dollarsign],[audacity,and],[audible,plus],[audience,person],[audiences,person],[audient,company],[audio,note],[audiobar,square],[audiobooks,audiobar],[audioship,company],[audit,tick],[audited,tick],[auditing,plus],[auditioner,person],[auditor,person],[auditorium,room],[auditory,ear],[audits,plus],[aug,square],[augmentary,up],[augmentation,up],[augmentative,plus],[augmented,up],[august,square],[aumanufacturing,company],[aun,dash],[aunt,person],[aunts,person],[aural,ear],[auras,box],[aurora,light],[aus,country],[auspicious,plus],[auspost,company],[austen,person],[austin,person],[australasian,country],[australia,country],[australian,person],[australians,person],[australias,country],[australis,country],[austria,country],[auth,plus],[authentic,plus],[authenticate,plus],[authenticated,plus],[authenticating,plus],[authentication,tick],[authenticity,plus],[author,person],[authored,square],[authorise,plus],[authorised,plus],[authoritarian,plus],[authoritarianism,minus],[authoritative,plus],[authorities,person],[authority,person],[authorization,plus],[authorized,plus],[authors,person],[authorship,paper],[autism,minus],[autist,person],[auto,tick],[autobiographical,book],[autobiography,book],[autobios,paper],[autobot,person],[autocomplete,tick],[autocompleted,plus],[autocorrect,plus],[autocue,square],[autogenerator,right],[autograph,square],[autoimmu
ne,plus],[autoluna,moon],[automagically,plus],[automata,and],[automatable,circle],[automatarchy,person],[automate,box],[automated,cycle],[automates,cycle],[automatic,circle],[automatically,and],[automating,cycle],[automation,circle],[automaton,and],[automator,circle],[automobile,car],[autonomic,plus],[autonomous,plus],[autonomy,circle],[autorun,right],[autorunning,plus],[autosaved,plus],[autosaves,plus],[autowrite,pen],[autumn,square],[auxilliary,plus],[auyeung,person],[av,dash],[avail,square],[availabilities,square],[availability,tick],[available,plus],[avalanche,snow],[avalanches,minus],[avb,or],[ave,n],[avenue,road],[avenues,path],[average,line],[averages,line],[avert,xcorrection],[avi,dash],[avid,plus],[avignon,city],[avocado,],[avoid,xcorrection],[avoidance,xcorrection],[avoided,xcorrection],[avoiding,xcorrection],[avoids,xcorrection],[avon,person],[avz,dash],[aw,software],[awabakal,person],[await,plus],[awaiting,plus],[awake,plus],[awaked,up],[awakened,plus],[award,plus],[awarded,plus],[awarding,plus],[awards,tick],[awardspace,company],[aware,tick],[awareness,plus],[awarning,xcorrection],[away,zero],[awe,plus],[awn,n],[awx,square],[awy,square],[ax,variable],[axels,line],[axes,line],[axiomatic,and],[axioms,right],[axis,line],[axle,line],[axolotl,],[axon,line],[ay,variable],[ayers,person],[ayur,plus],[ayurveda,plus],[ayurvedic,box],[az,dash],[azerbaijan,country],[azizan,person],[azu,person],[azurite,stone],[azzx,dash],[añadida,dash],[añc,plus],[aś,dash],[aṃ,plus],[aṃś,plus],[b,],[ba,paper],[baba,person],[babe,person],[babel,city],[babied,plus],[babier,plus],[babies,baby],[baby,],[bacd,dash],[bach,person],[bachelor,person],[bachelors,book],[bachkirov,person],[back,left],[backbone,],[backconversion,left],[backdated,down],[backdates,left],[backdrop,square],[backed,right],[backend,and],[background,square],[backgrounds,square],[backing,line],[backlashes,minus],[backlinks,right],[backlog,left],[backlogged,right],[backlogs,book],[backpack,bag],[backpropagate,left],[bac
kpropagation,left],[backs,plus],[backslash,],[backslashes,square],[backticks,square],[backtrace,left],[backtrack,left],[backtracked,left],[backtracking,left],[backtracks,left],[backtracktest,tick],[backtranslate,left],[backtranslated,right],[backtranslateuntilcorrect,tick],[backtranslation,square],[backtranslations,right],[backtranslationwords,square],[backup,tworectangle],[backups,paper],[backward,left],[backwards,left],[bacteria,bacterium],[bacterial,bacteria],[bacterium,],[bad,minus],[baddie,person],[badge,square],[badges,badge],[badly,minus],[badminton,shuttlecock],[badness,minus],[baduanjin,dash],[bae,company],[baffled,minus],[bag,],[bagel,],[bagged,bag],[bagof,box],[bagpipe,],[bagpipes,],[bags,bag],[bahama,plus],[bahamas,country],[bahrain,country],[bailey,person],[bain,person],[bake,bread],[bakelite,],[baker,person],[baking,oven],[baktraks,line],[bal,n],[balance,box],[balanced,equals],[bald,head],[balff,person],[bali,person],[balibanuba,apple],[baljit,person],[ball,],[ballantine,person],[ballet,right],[balloon,],[balloonifying,balloon],[balloons,balloon],[ballot,paper],[ballots,paper],[balls,ball],[balsa,wood],[balustrade,],[balustrades,balustrade],[bam,dash],[bamboo,],[bambury,city],[ban,dash],[banana,],[bananas,banana],[band,],[bandcamp,company],[bands,note],[bandura,person],[bandwidth,line],[bangladesh,country],[bangor,dog],[banjo,],[bank,building],[banker,person],[banking,dollarsymbol],[banks,dollarsymbol],[banned,xcorrection],[banner,square],[banning,xcorrection],[banquet,apple],[bar,square],[barata,person],[barbados,country],[barbara,person],[barbarians,person],[barbaric,minus],[barbecued,barbeque],[barbeque,],[barbuda,country],[barcelona,city],[barcode,n],[bare,down],[barebones,bone],[barely,plus],[barest,down],[bargain,plus],[baricello,cello],[baritone,person],[bark,plus],[barked,line],[barkly,street],[barley,],[barnacle,],[barnacles,barnacle],[barnes,person],[barnyard,],[barometers,n],[baroness,person],[barr,person],[barra,city],[barrabool,suburb],[ba
rracuda,fish],[barracudas,fish],[barred,xcorrection],[barrel,],[barrelled,barrel],[barrett,person],[barricuda,fish],[barrier,box],[barriers,minus],[barrow,wheelbarrow],[barry,person],[bars,box],[bartimes,n],[bartlett,person],[barwon,river],[basal,zero],[basc,book],[base,square],[baseball,],[basecase,one],[basecasecondition,dash],[basecasecondition,tick],[basecaselistlength,variable],[based,square],[baseline,line],[basement,square],[basename,square],[bases,base],[bash,square],[bashibazouks,minus],[basic,line],[basically,one],[basics,n],[basil,person],[basing,square],[basis,square],[basket,],[bass,note],[bassoon,tube],[bat,],[batch,n],[bates,person],[bath,],[bathe,water],[bathed,bath],[bathers,],[bathing,water],[bathroom,room],[baths,bath],[bathtimes,bath],[baton,],[batt,person],[batter,],[batteries,battery],[battery,],[battle,minus],[battled,minus],[bauble,ball],[baulsch,person],[baum,person],[bay,water],[bayeux,dot],[bazaar,room],[bb,xcorrection],[bbb,dash],[bbbccc,square],[bbc,square],[bbcc,square],[bbedit,square],[bbj,square],[bbjj,square],[bbq,],[bc,dash],[bc,square],[bcabcd,square],[bcb,dash],[bcba,square],[bcbcbcd,square],[bcbcd,square],[bcc,square],[bcd,square],[bcdfghjklmnpqrstvwxyz,dash],[bcm,dash],[bcs,square],[bd,variable],[bdict,book],[bdrict,paper],[be,equals],[beac,company],[beach,sand],[beachball,ball],[beacon,company],[beagle,dog],[beaker,],[beam,line],[beamed,right],[beaming,right],[beams,line],[bean,],[beanie,],[beaning,right],[beans,bean],[bear,],[beard,],[bearer,person],[bearers,person],[bearing,tick],[bearings,plus],[beasoning,right],[beat,a],[beating,dot],[beatles,person],[beatrice,person],[beats,note],[beau,person],[beautiful,square],[beauty,face],[beauvoir,person],[bec,left],[became,equals],[because,left],[becharm,plus],[beckett,person],[beckon,plus],[beckoning,left],[become,equals],[becomes,right],[becoming,right],[bed,],[bedfellow,person],[bedroom,bed],[beds,bed],[bedspread,blanket],[bedtime,n],[bee,],[beechler,person],[beef,],[beek,person],
[beeline,right],[been,equals],[bees,bee],[beese,right],[beetle,],[before,minus],[beforehand,left],[befriend,plus],[befriended,plus],[befriending,plus],[befu,dash],[began,zero],[begged,left],[begin,zero],[beginner,person],[beginners,person],[beginning,zero],[beginnings,zero],[begins,one],[behalf,right],[behave,plus],[behaved,plus],[behaves,plus],[behaving,plus],[behavior,plus],[behavioral,plus],[behaviors,plus],[behaviour,plus],[behavioural,right],[behaviours,right],[beheld,plus],[behemoth,monster],[behind,right],[bei,person],[being,person],[beings,person],[belarus,country],[belgium,country],[belief,square],[beliefs,tick],[believable,plus],[believe,tick],[believed,plus],[believers,person],[believes,plus],[believing,plus],[belize,country],[bell,],[bella,person],[bellissimo,plus],[bells,box],[belly,stomach],[belmont,suburb],[belong,box],[belonged,plus],[belonging,box],[belongingness,plus],[belongings,box],[belousoff,person],[below,down],[belt,],[ben,person],[benbya,person],[bench,],[benchmarking,plus],[benchmarks,plus],[bend,right],[bender,person],[bending,line],[bends,arc],[beneficence,plus],[beneficial,plus],[benefit,plus],[benefited,plus],[benefiting,plus],[benefits,plus],[benefitted,plus],[benefitting,plus],[benevolence,plus],[benfield,person],[benin,country],[benincasa,person],[benisalachocolat,chocolate],[benjamin,person],[bennett,person],[benni,person],[bent,minus],[bentivoglio,person],[bentwood,wood],[berkeley,building],[berklee,company],[bernadette,person],[bernie,person],[bernstein,person],[bernsteintapes,person],[berocca,box],[beroccas,tablet],[berries,berry],[berrio,person],[berry,],[bert,person],[berys,person],[beside,right],[besides,right],[best,a],[bestow,plus],[bestseller,dollarsymbol],[bestthinking,company],[bet,box],[beta,tworectangle],[betray,minus],[betrayed,minus],[betroth,down],[bettania,dog],[better,plus],[bettering,plus],[betting,dollarsymbol],[betty,person],[between,dash],[betweenizing,box],[betweenness,box],[betweennesses,box],[beverage,cup],[
beware,xcorrection],[beyond,dot],[bf,dash],[bfs,right],[bg,square],[bgcolor,square],[bgv,dash],[bhagavad,book],[bhandari,person],[bhaskar,newspaper],[bhi,person],[bhide,person],[bhutan,country],[bi,two],[bias,minus],[bib,book],[bibby,person],[bibi,person],[bible,book],[biblical,book],[bibliographic,book],[bibliographies,paper],[bibliography,paper],[biblioteca,book],[bicameral,room],[biceps,muscle],[bicochemistry,ball],[bicycle,],[bid,dollarsymbol],[bidder,person],[biddle,plus],[biden,person],[bidirectional,dash],[bieber,person],[big,box],[bigger,greaterthan],[biggest,n],[biggs,person],[bilayer,box],[bildungsroman,book],[bilingual,book],[bill,person],[billing,dollarsymbol],[billion,line],[billionaire,dollarsymbol],[billions,dollarsymbol],[bills,paper],[billy,person],[biloba,plant],[bimbo,person],[bin,box],[binaries,n],[binary,n],[bind,box],[binding,square],[bindings,dash],[binds,dash],[bing,person],[bingo,counter],[bins,bin],[bio,paper],[bioactive,plus],[biochemical,ball],[biochemist,person],[biochemistry,ball],[biodiverse,plus],[biodiversity,plus],[bioethanol,box],[biofeedback,left],[biographies,book],[biography,book],[biological,book],[biologically,book],[biologist,person],[biology,book],[biomedical,book],[biomolecular,ball],[biomolecules,ball],[bios,paper],[bioschemistry,and],[biotechnology,right],[bips,dash],[birch,person],[bird,],[birdcage,cage],[birds,bird],[birth,zero],[birthday,square],[birthdays,square],[birthplace,dot],[birthright,plus],[births,zero],[bis,plus],[biscuit,],[biscuits,biscuit],[bisect,box],[bisected,box],[bisecting,down],[bishop,person],[bismarck,person],[bissau,country],[bit,one],[bitar,person],[bitcodes,dot],[bite,tooth],[biteful,],[biting,tooth],[bitl,square],[bitmap,square],[bits,tworectangle],[bitstream,company],[bitter,minus],[bitterness,minus],[bitwise,one],[biweekly,two],[bix,],[bizarre,plus],[bjelke,person],[bjorn,dog],[bk,person],[bl,dash],[black,square],[blackandwhitehen,duck],[blackandwhitehenwithblackspeckles,duck],[blackberry,],[
blackboard,],[blackcurrant,],[blackduck,duck],[blackel,person],[blackice,person],[blackie,duck],[blackl,square],[blackler,person],[blackwell,person],[blackwhite,square],[blackye,person],[bladder,],[blade,],[blades,blade],[blair,person],[blake,person],[blame,xcorrection],[blamed,minus],[blames,xcorrection],[blaming,minus],[blanchard,person],[blank,square],[blanket,],[blankets,blanket],[blast,plus],[blasted,minus],[ble,dash],[blemishes,ball],[blend,plus],[blended,box],[blender,],[blending,box],[blendr,company],[bless,plus],[blessed,plus],[blessedness,plus],[blessing,plus],[blessings,plus],[blew,mouth],[blind,minus],[blinded,minus],[blindfold,],[blindfolded,blindfold],[blindfolds,blindfold],[blinding,minus],[blindly,square],[bliss,plus],[blissenobiarella,person],[blissful,plus],[bloating,minus],[blob,ball],[block,box],[blockage,minus],[blockages,ball],[blocked,box],[blockedness,box],[blocking,box],[blocks,box],[blog,paper],[blogger,company],[blogs,book],[blogspot,paper],[blonde,hair],[blood,],[bloodshed,minus],[bloodstream,blood],[bloom,flower],[blossom,],[blow,right],[blowers,right],[blowing,mouth],[blown,mouth],[blue,square],[blueberries,berry],[blueberry,],[bluegrass,note],[bluenature,planet],[blueprint,paper],[bluestone,],[bluff,minus],[blunden,plus],[blunders,minus],[blunt,plus],[blurb,square],[blurred,minus],[blurring,minus],[blurry,square],[blytheman,person],[blythman,person],[bmc,dash],[bme,dash],[bmj,company],[bmjopen,company],[bn,variable],[bo,minus],[board,person],[boarded,up],[boarding,school],[boardroom,room],[boas,person],[boat,],[boats,boat],[bob,person],[bobbed,up],[bobbin,ball],[bobbio,person],[bobble,ball],[bobby,person],[bock,person],[bocvfiqavd,dash],[bodied,person],[bodies,body],[bodily,person],[body,person],[bodyvars,variable],[bogged,down],[boggle,minus],[boggling,plus],[bohemian,n],[boil,n],[boiled,water],[boiling,n],[bold,square],[bolivia,country],[bollie,bottle],[bolly,bottle],[bolman,person],[bologna,city],[bolshoi,city],[bolt,twobox],[bolted
,bolt],[bolting,bolt],[bolts,bolt],[bomb,ball],[bombing,minus],[bompel,dash],[bon,plus],[bonanza,plus],[bonaparte,person],[bond,right],[bondi,dot],[bonding,right],[bonds,right],[bone,],[bones,bone],[bongo,plus],[bonilla,person],[bonita,plus],[bonjour,plus],[bonjoura,plus],[bonkers,minus],[bonnet,hat],[bonus,plus],[book,],[bookcase,book],[booked,book],[bookings,tick],[bookkeeping,plus],[bookmaking,book],[bookmark,square],[bookmarked,n],[bookmarks,paper],[bookofbadarguments,book],[books,book],[bookshelves,box],[booksmy,book],[bool,n],[boolean,n],[booleans,n],[boom,line],[booming,plus],[boost,up],[boosted,up],[boosting,up],[boot,],[booth,and],[booting,right],[boots,boot],[border,square],[borderline,down],[borders,square],[bore,down],[bored,minus],[borges,person],[boring,minus],[boris,person],[born,one],[borrow,right],[borrowed,right],[borrowing,right],[boryslawski,person],[bos,dash],[bosnia,country],[boss,person],[boston,city],[bot,dot],[botanic,plant],[botanical,plant],[botanist,person],[botany,plant],[bote,dash],[both,tworectangle],[bothersome,minus],[botid,n],[bots,dot],[botswana,country],[bottle,],[bottleneck,bottle],[bottlenecks,minus],[bottler,person],[bottles,bottle],[bottom,down],[bottoms,down],[bought,dollarsymbol],[boulder,city],[bouma,person],[bounce,ball],[bounced,up],[bounces,equals],[bouncing,ball],[bound,plus],[boundaries,square],[boundary,square],[bounded,n],[bounds,line],[bountifulness,plus],[bounty,plus],[bouquet,flower],[bourbon,bottle],[bourdain,person],[bourgeois,person],[bourgeoisie,person],[bout,left],[bovine,cow],[bow,dot],[bowed,bow],[bowel,organ],[bower,person],[bowes,person],[bowie,person],[bowl,],[bowlby,person],[bowled,ball],[bowlen,person],[bowler,ball],[bowling,ball],[bowls,bowl],[box,],[boxes,box],[boy,person],[boyacigiller,person],[boycott,xcorrection],[boycotts,minus],[boys,person],[bp,dash],[br,apple],[bra,],[brac,box],[bracket,],[bracketed,box],[bracketing,box],[brackets,box],[bradbury,person],[bradfield,person],[brae,dog],[braeard,p
erson],[braille,square],[brain,],[brains,brain],[brainstorm,paper],[brainstormed,paper],[brainstorming,square],[braintree,company],[brainwaves,n],[brainworks,brain],[bran,],[branch,dash],[branched,right],[branches,dash],[branching,dash],[brand,square],[branding,square],[brands,square],[brass,tube],[brat,minus],[braun,person],[brave,plus],[bravely,plus],[bravery,plus],[bravest,plus],[bravestudios,company],[brayden,person],[brazil,country],[brazilian,country],[brazzaville,country],[brdict,book],[brdicts,book],[brea,tar],[breach,minus],[bread,],[breadcrumbs,dot],[breadth,right],[break,dash],[breakdown,dash],[breakdowns,down],[breakers,box],[breakfast,plate],[breaking,xcorrection],[breaks,two],[breakthroughs,plus],[breasdostoning,threerectangle],[breasdostonings,threerectangle],[breason,box],[breasoned,apple],[breasoner,person],[breasoners,person],[breasoning,apple],[breasoningesquenesses,apple],[breasoninglimit,n],[breasoningname,square],[breasoningocracy,and],[breasonings,apple],[breasoningthe,dash],[breasoningwriter,software],[breasoniong,apple],[breasonism,box],[breasons,square],[breast,],[breastfeed,milk],[breastoned,box],[breasts,breast],[breath,air],[breathdostonings,step],[breathe,lung],[breathed,lung],[breathes,air],[breathing,lung],[breathless,minus],[breaths,dash],[breathsoned,sugar],[breathsoning,plus],[breathsoningitis,minus],[breathsonings,berry],[breathsons,apple],[breathwork,mouth],[breavsonings,square],[breed,animal],[breeding,right],[breeds,person],[breedsonings,apple],[breen,apple],[breenississimos,right],[breeze,right],[brenka,person],[brent,person],[brett,person],[brew,glass],[brian,person],[brianmoreau,person],[bribery,minus],[bricabraced,fabric],[brice,person],[brichko,person],[brick,],[brickle,person],[bricoleurs,square],[bridge,],[bridged,bridge],[bridges,right],[bridget,person],[bridging,dash],[brief,n],[briefed,paper],[briefing,paper],[briefly,n],[briefs,square],[briggs,person],[bright,ball],[brighter,n],[brightness,n],[brigid,person],[brillia
nce,plus],[brilliant,plus],[brim,],[brimmed,],[bring,right],[bringer,up],[bringing,left],[brings,right],[brinkman,person],[bristle,bristle],[bristles,bristle],[britannica,book],[british,person],[brits,],[brlog,book],[broach,mouth],[broached,plus],[broad,n],[broadband,right],[broadbee,plus],[broadcast,right],[broadcaster,person],[broadcasters,person],[broadcasthost,right],[broadcasting,right],[broadcasts,right],[broaden,right],[broader,plus],[broadly,plus],[broadminded,plus],[broccoli,],[broke,minus],[broken,xcorrection],[broker,person],[brokerage,dollarsymbol],[brolga,bird],[bronchi,tube],[bronze,],[brook,person],[brooke,person],[brooks,person],[broom,],[broth,soup],[brother,person],[brothers,person],[brought,right],[brow,],[brown,square],[brownkutschenkovargo,person],[browse,down],[browsed,right],[browser,square],[browsers,and],[browsing,right],[brs,twobox],[brtdict,variable],[brth,apple],[brthdict,book],[brthlength,n],[brueckner,person],[bruin,person],[bruising,minus],[brumaire,person],[brunei,country],[brunello,person],[bruno,person],[brunswick,suburb],[brush,],[brushed,brush],[brushing,right],[brute,down],[bryan,person],[bryce,person],[bs,square],[bsb,book],[bsc,paper],[bsd,paper],[bshot,dash],[bst,box],[bt,right],[btteduvjjoy,dash],[btw,box],[bu,dash],[bubble,],[bubbles,ball],[buble,person],[bubs,company],[bucket,],[buckle,],[buckled,buckle],[buckling,plus],[buckwheat,box],[bud,ball],[buddha,person],[buddhism,buddha],[buddhist,person],[buddhists,person],[buddies,person],[budding,plus],[buddy,person],[budgerigar,bird],[budget,dollarsymbol],[budgeted,dollarsymbol],[budgeting,dollarsymbol],[budgy,bird],[buffer,box],[buffered,box],[bug,minus],[buggily,minus],[buggy,minus],[bugs,minus],[bugsfixed,plus],[build,box],[buildable,box],[builder,up],[builders,up],[building,],[buildings,building],[builds,box],[built,box],[builtbyme,dash],[bulbourethral,tube],[bulgaria,country],[bulging,ball],[bulk,n],[bulky,box],[bull,],[bullet,],[bulletin,paper],[bulletins,paper],[bullets,
square],[bullies,minus],[bulls,bull],[bully,minus],[bullying,minus],[bumped,minus],[bumping,up],[bumps,up],[bun,],[bunches,box],[bundle,box],[bundled,box],[bundles,box],[bung,dash],[bunn,person],[bunnies,rabbit],[bunny,rabbit],[buns,bun],[buoy,buoy],[burawoy,person],[burden,minus],[burdened,minus],[burdens,minus],[bureau,company],[bureaucracy,minus],[bureaucrat,person],[bureaucratic,minus],[burette,flask],[burgeoning,plus],[burger,],[burgers,burger],[burglary,minus],[buried,down],[burik,person],[burke,person],[burkina,country],[burma,country],[burn,minus],[burned,fire],[burner,fire],[burning,fire],[burnout,minus],[burnt,minus],[burrowed,down],[bursaries,dollarsymbol],[bursary,dollarsymbol],[burst,right],[bursts,up],[burundi,country],[bus,],[bush,plant],[bushfire,fire],[busily,plus],[business,square],[businesses,company],[businessman,person],[businessmen,person],[businessperson,person],[bust,],[busters,xcorrection],[busy,plus],[but,minus],[butler,person],[butter,],[buttered,butter],[butterflies,minus],[butterfly,],[butterscotch,],[buttocks,],[button,box],[buttons,square],[buy,dollarsign],[buyer,person],[buyers,person],[buying,a],[buys,dollarsymbol],[buzz,plus],[buzzy,plus],[bv,dash],[bve,dash],[bw,box],[bwe,dash],[bwn,n],[bx,variable],[bxx,variable],[bxydcqw,dash],[bxzt,dash],[by,dash],[bye,n],[byjus,dash],[bypass,line],[bypassed,right],[byproducts,right],[byron,person],[byte,a],[bytecode,square],[bytes,square],[byy,variable],[bz,dash],[bögels,person],[c,],[c,box],[ca,dash],[caaf,square],[cab,car],[cabaña,sausage],[cabbage,],[cabinet,],[cable,line],[cabo,country],[cached,box],[caction,hand],[cad,and],[cadillacs,car],[caesar,person],[caf,square],[cafe,building],[cafs,square],[café,building],[cage,box],[cagr,n],[cai,person],[caissy,person],[cake,],[cal,square],[calamity,minus],[calc,plus],[calcbtremaining,dash],[calciplus,liquid],[calcium,ball],[calculate,plus],[calculated,equals],[calculates,equals],[calculating,equals],[calculation,equals],[calculations,and],[calcula
tor,plus],[calculus,n],[calendar,square],[calf,],[calibrated,n],[calibrations,line],[caliendo,person],[california,state],[call,speechballoon],[callaghan,person],[called,dash],[caller,person],[callibrate,line],[callibration,line],[callibrationvalue,variable],[calligraphy,square],[calling,speechballoon],[calliope,box],[calls,square],[callterm,square],[calm,line],[calmed,plus],[calming,plus],[calmness,plus],[calormen,dash],[caltech,university],[calves,calf],[cam,box],[cambodia,country],[cambodian,country],[cambridge,university],[came,right],[camera,box],[cameras,box],[cameron,person],[cameroon,country],[camilla,person],[camouflage,shirt],[camp,],[campaign,box],[campaignid,dash],[campaigning,plus],[campaigns,plus],[campbell,person],[campfire,fire],[camping,fire],[campus,square],[campuses,building],[camus,person],[can,dash],[canada,country],[canadian,person],[canadians,person],[canal,water],[canaveral,suburb],[canaverals,up],[canberra,city],[canc,xcorrection],[cancel,xcorrection],[cancellable,xcorrection],[cancellation,xcorrection],[cancellations,xcorrection],[cancelled,xcorrection],[canceller,xcorrection],[cancelling,xcorrection],[cancels,xcorrection],[cancer,minus],[candid,plus],[candidate,person],[candidates,person],[candies,lolly],[candle,],[candlelight,candle],[candles,candle],[candy,lolly],[cane,],[canine,dog],[canister,box],[cannibals,person],[cannibis,minus],[cannot,xcorrection],[canoe,],[canon,paper],[canonical,plus],[canopy,tree],[cant,minus],[cantaloupe,],[canterbury,city],[cantilever,line],[canvas,square],[cap,up],[capabilities,plus],[capability,plus],[capable,plus],[capacity,n],[cape,suburb],[capellic,person],[caper,],[caperchione,person],[capers,caper],[capillaries,tube],[capillary,tube],[capital,square],[capitalisation,up],[capitalisations,up],[capitalise,up],[capitalised,up],[capitalises,up],[capitalism,dollarsign],[capitalismmagazine,book],[capitalist,person],[capitalists,person],[capitally,country],[capitals,up],[capped,square],[cappuccino,],[capricorn,
ball],[caps,square],[capsicum,],[capstone,box],[capsulated,box],[capsule,],[captain,person],[captcha,down],[caption,square],[captions,square],[captivated,tick],[captives,person],[capture,box],[captured,square],[captures,box],[capturing,plus],[car,],[carafe,jug],[caramel,],[caramelised,caramel],[carat,n],[carbines,person],[carbohydrates,ball],[carbon,],[carcinogeenan,minus],[carcinogen,minus],[card,square],[cardboard,square],[cardio,heart],[cardiovascular,heart],[cards,paper],[care,hand],[cared,plus],[career,line],[careers,line],[careful,plus],[carefully,plus],[carer,person],[carers,person],[cares,plus],[caretaker,person],[caring,plus],[caringly,plus],[caringness,plus],[carl,person],[carlisle,street],[carlo,person],[carly,person],[carnations,flower],[carnelian,person],[carob,],[carofarion,dash],[carol,person],[carolina,person],[caroline,person],[carpathians,book],[carpenter,person],[carpentry,wood],[carpet,],[carrageenan,ball],[carre,person],[carriage,vehicle],[carriages,vehicle],[carrie,person],[carried,box],[carrier,box],[carries,box],[carriesout,plus],[carrot,],[carrots,carrot],[carry,box],[carryable,box],[carrying,box],[cars,car],[carson,person],[cart,box],[carter,person],[cartesian,plus],[cartmel,person],[carton,box],[cartons,box],[cartridge,],[carts,box],[cartwheeling,right],[carucci,person],[carved,right],[carving,chisel],[cas,dash],[cascade,down],[cascading,down],[case,paper],[cased,down],[casein,],[cases,dash],[cash,dollarsymbol],[cashew,nut],[casomorphin,ball],[casserole,pot],[cast,person],[castanets,],[casting,tick],[castle,],[castration,ball],[casual,plus],[casually,plus],[cat,],[catalan,book],[catalina,person],[catalog,book],[catalogue,book],[cataracts,eye],[catarina,person],[catastrophe,minus],[catastrophes,minus],[catch,box],[catcher,box],[catches,xcorrection],[catchier,plus],[catchiness,plus],[catching,plus],[catchphrases,square],[catchy,plus],[categorical,dash],[categorically,box],[categories,square],[categorised,box],[categorises,box],[categorising,
box],[category,square],[cater,apple],[catered,apple],[caterer,person],[catering,apple],[caters,plus],[cates,dash],[cathcart,person],[cathedral,],[catherine,person],[catheter,tube],[cato,person],[catriona,person],[cats,cat],[cattle,cow],[cauberghe,person],[caucas,person],[caucasian,person],[caught,xcorrection],[caulfield,suburb],[causal,right],[causality,right],[causation,right],[causations,right],[causative,right],[cause,right],[caused,right],[causes,right],[causing,right],[caution,xcorrection],[cautious,plus],[cautiously,plus],[caval,company],[cave,box],[caveats,plus],[cavell,person],[caves,cave],[caviar,],[cavity,box],[caw,asterisk],[cawa,a],[cawdel,and],[cawmp,and],[cawmpedit,dash],[cawmptest,tick],[cawp,and],[cawplistprolog,square],[cawps,up],[cawptest,plus],[cawptesta,tick],[cawpverify,plus],[cawpverifya,tick],[cawthorne,person],[cb,dash],[cba,square],[cbc,square],[cbd,square],[cbr,right],[cbswwcc,dash],[cc,dash],[cca,variable],[ccb,variable],[ccc,square],[cccbbb,square],[cccnumber,n],[ccl,variable],[cconverter,right],[ccpv,variable],[ccreep,right],[cd,],[cda,variable],[cdc,dash],[cde,dash],[cded,box],[cdfour,dash],[cds,cd],[ce,right],[cease,minus],[ceased,zero],[cecd,company],[ceg,right],[ceiba,tree],[ceil,up],[ceiling,square],[cele,box],[celebrant,person],[celebrate,plus],[celebrated,plus],[celebrates,plus],[celebrating,plus],[celebration,plus],[celebrations,plus],[celebrities,person],[celebrity,person],[celebs,person],[celery,],[celesta,tube],[celeste,person],[celestially,plus],[celia,person],[celibacy,zero],[celibate,plus],[cell,square],[cellar,box],[cello,],[cellophane,square],[cellpadding,square],[cells,box],[cellspacing,square],[cellular,cell],[celsius,n],[cengage,company],[censorship,minus],[censure,xcorrection],[census,paper],[centanni,person],[center,dot],[centers,building],[centigrade,n],[centimetre,n],[centipede,],[central,dot],[centralisation,right],[centralised,one],[centrality,n],[centrally,n],[centre,building],[centred,n],[centredness,n],[centre
link,company],[centres,dot],[centro,dot],[centuries,n],[century,n],[ceo,person],[ceos,person],[cerebellum,brain],[cerebros,person],[ceremonies,plus],[ceremony,box],[ceresio,city],[cert,square],[certa,company],[certain,up],[certainly,a],[certificate,paper],[certificates,paper],[certification,plus],[certified,plus],[cf,dash],[cfl,variable],[cflm,variable],[cfvkai,dash],[cggc,dash],[cgi,software],[cgpt,box],[ch,variable],[cha,square],[chad,country],[chai,cup],[chaillier,person],[chain,line],[chaine,right],[chains,line],[chair,],[chaise,carriage],[chaisomboon,person],[chaiwat,person],[chalice,],[chalk,],[chalky,chalk],[challenge,xcorrection],[challenged,xcorrection],[challenges,minus],[challenging,plus],[cham,person],[chamber,room],[championed,plus],[chan,person],[chance,plus],[chancellor,person],[chances,plus],[chandler,person],[chang,person],[change,delta],[changed,minus],[changegrid,grid],[changelength,n],[changelengthh,n],[changeover,circle],[changeovers,right],[changeperson,person],[changer,person],[changes,xcorrection],[changing,xcorrection],[channel,paper],[channelling,right],[channels,n],[chant,note],[chanter,person],[chants,note],[chaos,minus],[chaotic,minus],[chaplaincies,paper],[chapter,book],[chapters,book],[char,a],[character,person],[characterbr,box],[characterbrdiscontinuous,square],[characterbreasoner,a],[characterbrscript,paper],[characterisation,plus],[characterise,plus],[characterised,plus],[characterises,plus],[characterising,plus],[characteristic,square],[characteristics,square],[characters,a],[charang,box],[charge,dollarsymbol],[charged,plus],[charges,plus],[charging,right],[charisma,plus],[charismatic,plus],[charitable,plus],[charities,zero],[charity,plus],[charles,person],[charley,person],[charlie,person],[charlotte,person],[charm,plus],[charmer,person],[charnes,person],[charobj,apple],[charon,person],[chars,a],[charset,square],[chart,paper],[charted,square],[charter,paper],[charting,paper],[charts,chart],[chased,right],[chasing,right],[chasm,box
],[chastity,plus],[chat,paper],[chatbot,box],[chatbots,box],[chatgpt,company],[chats,square],[chatted,mouth],[chatting,speechballoon],[chaucer,person],[chaudhary,person],[chauffeur,person],[chd,dash],[cheap,dollarsign],[cheaper,lessthan],[cheapest,n],[cheat,minus],[cheated,minus],[cheating,minus],[cheats,minus],[chec,plus],[chechnya,person],[check,tick],[checkarguments,tick],[checkbox,square],[checkboxes,v],[checked,tick],[checker,tick],[checkerboard,plane],[checkers,tick],[checking,tick],[checklist,square],[checklists,v],[checkmate,plus],[checkout,box],[checks,tick],[checktypes,tick],[cheek,square],[cheekily,minus],[cheeks,cheek],[cheeky,minus],[cheer,plus],[cheered,plus],[cheerful,plus],[cheers,plus],[cheese,],[cheeseburger,],[cheesecake,],[cheesonings,cheese],[chef,person],[chekov,person],[chem,ball],[chemical,box],[chemicals,oxygen],[chemist,person],[chemistry,book],[chen,person],[cheques,square],[cher,person],[cherished,plus],[cherries,cherry],[cherry,],[chess,game],[chessboard,],[chest,],[chestnut,],[cheung,person],[chew,mouth],[chewed,tooth],[chewing,tooth],[chi,n],[chiat,person],[chicago,city],[chicken,],[chickens,chicken],[chickpea,],[chicks,bird],[chief,person],[chiff,box],[child,],[childbearing,person],[childcare,plus],[childhood,line],[children,person],[childrenh,person],[childrens,child],[chile,country],[chilled,n],[chime,note],[chin,person],[china,country],[chindogu,plus],[chinese,person],[chinos,pants],[chip,box],[chippendale,person],[chips,],[chiropractic,muscle],[chiropractor,person],[chirp,bird],[chisel,],[chiselling,right],[chisels,chisel],[chit,mouth],[chitranishek,person],[chitty,person],[chivalrous,person],[chloe,person],[chmod,pen],[chn,dash],[cho,person],[chocolate,],[chodorow,person],[choice,dash],[choicepoint,dot],[choicepoints,dash],[choices,tick],[choir,note],[choirs,person],[choke,minus],[choked,minus],[chomsks,person],[chomsky,person],[chonga,dash],[choos,box],[choose,dash],[chooser,person],[chooses,person],[choosing,tick],[chopped,twor
ectangle],[chopping,knife],[chopstick,],[chopsticks,chopstick],[chord,threerectangle],[chords,note],[choreography,foot],[chorus,square],[chorussoloprogression,square],[chose,finger],[chosen,tick],[chowdorow,person],[chris,person],[christ,person],[christian,person],[christianity,person],[christie,person],[christina,person],[christine,person],[christmas,box],[christopher,person],[christos,person],[chrome,paper],[chronic,minus],[chronicles,book],[chronological,n],[chronologically,right],[chrysalis,],[chs,book],[chua,person],[chubby,plus],[chull,dash],[chunhui,person],[chunk,box],[chunked,box],[chunker,box],[chunks,box],[church,building],[churchill,person],[churned,circle],[churning,circle],[chutney,],[chylomicrons,ball],[ci,right],[cic,dash],[ciccy,square],[cicd,right],[cine,tablet],[cinema,square],[cinematic,line],[cinematographer,person],[cinematography,camera],[cinnamon,],[ciprian,person],[circa,box],[circle,],[circled,circle],[circles,circle],[circuit,square],[circuitry,and],[circuits,line],[circular,circle],[circulate,circle],[circulated,right],[circulating,circle],[circulation,circle],[circulatory,circle],[circumcise,box],[circumference,circle],[circumnavigate,right],[circumnavigated,right],[circumnavigating,right],[circumnavigator,circle],[circumstance,square],[circumstances,dash],[circumvent,xcorrection],[circus,tent],[cire,school],[cit,university],[citation,square],[citations,square],[cite,dash],[cited,square],[cites,square],[cities,city],[citing,square],[citizen,person],[citizens,person],[citizenship,paper],[citric,orange],[citroen,car],[citrus,orange],[city,],[civil,plus],[civilians,person],[civilisation,line],[civilisations,planet],[civilly,plus],[cjem,dash],[cjwkcajwto,dash],[ck,dash],[cking,dash],[cl,dash],[claim,paper],[claimed,plus],[claiming,square],[claims,square],[claire,person],[clamp,],[clapped,hand],[clapping,hand],[clara,person],[clare,person],[clarification,plus],[clarified,plus],[clarify,plus],[clarifying,plus],[clarinet,tube],[clari
ty,plus],[clarke,person],[clash,cymbal],[clashed,minus],[clashes,xcorrection],[clashing,minus],[class,square],[classcentral,company],[classed,square],[classes,room],[classic,paper],[classical,paper],[classicalcomposition,note],[classicalcompositionwithlength,note],[classicalpop,ball],[classicalpopcomposition,note],[classics,book],[classification,dash],[classified,box],[classifiers,box],[classify,dash],[classifying,box],[classmate,person],[classroom,room],[classrooms,room],[claudine,person],[claudius,person],[clause,square],[clausen,square],[clausenum,n],[clauses,square],[clavinet,],[claw,],[claws,claw],[clean,square],[cleaned,plus],[cleaner,square],[cleaning,right],[cleanliness,plus],[cleanly,plus],[cleanmymac,and],[cleanse,minus],[cleanup,zero],[clear,dash],[clearance,plus],[cleared,dash],[clearer,tick],[clearest,plus],[clearing,box],[clearly,dash],[clearness,equals],[clears,zero],[cleavage,box],[clef,note],[clegg,person],[clem,person],[clemens,person],[clenched,equals],[clength,n],[cleobabus,person],[cleopatra,person],[cleric,person],[clerk,person],[clerks,person],[clet,dash],[clever,plus],[cleverly,plus],[cliches,minus],[click,dot],[clicked,equals],[clicking,down],[clicks,plus],[client,person],[clients,person],[cliff,],[clifton,person],[climate,water],[climates,air],[climatology,book],[climax,dot],[climb,up],[climbed,up],[climber,person],[climbing,up],[climbs,up],[cling,plus],[clinic,tablet],[clinical,plus],[clinically,plus],[clinicians,person],[clinics,building],[clinton,person],[clip,],[clipboards,paper],[clipped,down],[clipper,clipper],[clippered,clipper],[clippers,],[clipping,box],[clips,square],[clique,triangle],[cliques,triangle],[clitoris,ball],[cloaked,xcorrection],[clock,n],[clocked,n],[clocking,n],[clocks,clock],[clockwise,right],[clockwork,dash],[clone,plus],[cloned,plus],[clones,box],[cloning,box],[clop,note],[close,n],[closebracket,],[closed,down],[closedroundbracket,],[closedsquarebracket,],[closely,n],[closeness,n],[closer,n],[closest,one],[closing
,n],[closure,n],[closures,down],[cloth,square],[clothe,pants],[clothes,coat],[clotheshorse,rack],[clothing,shirt],[cloud,],[cloudfront,cloud],[clouds,box],[cloudy,box],[clove,clove],[clover,],[cloves,clove],[clown,person],[cloying,sugar],[clozapine,tablet],[clozapines,tablet],[clp,n],[clpfd,book],[clt,right],[club,room],[clubs,organisation],[clues,square],[clunky,minus],[cluster,star],[clutch,rod],[cm,n],[cmd,square],[cmedsav,box],[cn,variable],[cnn,square],[cnu,dash],[cnumber,variable],[co,company],[coach,person],[coaching,plus],[coalesced,box],[coalesces,box],[coalface,square],[coarse,square],[coat,],[coaxed,plus],[cochran,person],[cockatoo,bird],[cockatoos,bird],[cockerel,chicken],[cockfest,bird],[cocks,person],[coconut,],[coconuts,coconut],[code,variable],[coded,and],[codefolder,box],[codeless,square],[codenamed,square],[coder,person],[coderdojo,and],[codereuser,tick],[codes,square],[codified,plus],[codify,right],[coding,and],[codings,and],[cody,person],[coefficient,n],[coelho,person],[coffee,],[coffey,person],[cog,],[cognised,mind],[cognition,and],[cognitive,brain],[cognitively,and],[cognito,brain],[cognizant,plus],[cohen,person],[coherence,plus],[coherent,plus],[cohesion,plus],[cohesiveness,plus],[cohort,person],[cohorts,person],[coiffeur,clippers],[coil,cylinder],[coin,],[coincide,tworectangle],[coincided,equals],[coincidences,equals],[coincidentally,plus],[coincides,equals],[coined,plus],[coins,coin],[coit,],[coits,coit],[coker,person],[col,square],[colander,],[colanderianisation,colander],[cold,minus],[colds,bacterium],[cole,person],[colero,person],[coles,company],[collaborate,plus],[collaborated,plus],[collaborating,plus],[collaboration,person],[collaborations,dash],[collaborative,plus],[collaboratively,plus],[collaborators,person],[collage,],[collapse,down],[collapsed,down],[collapsing,down],[collar,],[collate,box],[collated,box],[collating,box],[collation,box],[colleague,person],[colleagues,person],[collect,left],[collected,square],[collecting,box],[coll
ection,box],[collections,box],[collective,box],[collectively,box],[collectivism,person],[collectivist,person],[collector,person],[collects,square],[collectwords,tworectangle],[collectwordsdiff,minus],[colleen,person],[college,building],[colleges,school],[collegiality,plus],[collide,equals],[collided,equals],[colliders,equals],[colliding,equals],[collins,person],[collision,equals],[collisions,equals],[collude,minus],[colnerud,person],[colombia,country],[colon,],[colonial,city],[colonialism,country],[colonized,country],[colony,person],[color,square],[colorado,city],[colossal,plus],[colour,square],[coloured,square],[colourful,balloon],[colouring,square],[colourless,zero],[colourlessness,square],[colours,balloon],[cols,square],[coltan,person],[column,],[columns,line],[colvin,person],[com,box],[coma,minus],[comb,dash],[combat,right],[combating,xcorrection],[combats,xcorrection],[combed,comb],[combinants,dot],[combination,multiply],[combinations,plus],[combinator,dash],[combinators,dash],[combine,one],[combined,plus],[combiner,person],[combines,one],[combining,plus],[combo,tworectangle],[combophil,algorithm],[combos,tworectangle],[combs,person],[combustion,up],[come,right],[comeback,plus],[comedy,book],[comes,right],[comfort,person],[comfortable,seat],[comfortably,plus],[comforted,plus],[comforting,plus],[comic,book],[comical,plus],[comics,book],[coming,right],[comm,company],[comma,box],[command,square],[commanddict,book],[commanded,square],[commanding,plus],[commandletter,square],[commands,plus],[commaorrightbracketnext,square],[commaquoteorrightbracketnext,right],[commas,box],[commemorating,square],[commence,dot],[commenced,dot],[commendation,plus],[commendations,plus],[comment,a],[commentary,paper],[commentator,person],[commentators,person],[commented,speechballoon],[commentfinder,square],[commenting,mouth],[comments,twobox],[commentsa,square],[commerce,dollarsymbol],[commercial,dollarsymbol],[commercially,dollarsymbol],[commission,dollarsymbol],[commissioned,plus],[co
mmissioning,plus],[commit,plus],[commitment,plus],[commitments,plus],[commits,plus],[committal,plus],[committed,plus],[committee,person],[committees,person],[committing,plus],[commodification,dollarsymbol],[commodities,dollarsymbol],[common,plus],[commonalities,plus],[commonly,plus],[commonplace,plus],[commons,plus],[commonwealth,country],[commtainer,box],[communal,box],[communic,paper],[communicate,square],[communicated,right],[communicating,mouth],[communication,dash],[communications,paper],[communicative,square],[communicator,book],[communicators,person],[communism,equals],[communist,person],[communists,person],[communities,person],[community,person],[commutative,equals],[commutativity,dash],[comoros,country],[comp,equals],[compact,down],[companies,building],[companiesmarketcap,dash],[companion,person],[companions,person],[companionship,person],[company,building],[comparable,equals],[comparative,greaterthan],[comparatively,greaterthan],[compare,greaterthan],[compared,greaterthan],[compares,dot],[comparing,dash],[comparison,dash],[comparisonoperator,greaterthan],[comparisons,greaterthan],[compartment,box],[compass,right],[compassion,plus],[compassionate,plus],[compatibility,tick],[compatible,plus],[compatibly,plus],[compcard,square],[compelling,plus],[compensate,plus],[compensated,plus],[compensating,xcorrection],[compensation,dollarsymbol],[compet,plus],[compete,xcorrection],[competed,plus],[competence,plus],[competences,plus],[competencies,tick],[competency,paper],[competent,plus],[competently,plus],[competes,xcorrection],[competing,xcorrection],[competition,two],[competitionism,plus],[competitions,dash],[competitive,plus],[competitiveness,plus],[competitor,person],[competitors,person],[compilation,book],[compile,down],[compiled,down],[compiler,and],[compilers,right],[compiles,down],[compiling,down],[complacency,minus],[complain,xcorrection],[complained,xcorrection],[complaining,minus],[complains,xcorrection],[complaint,xcorrection],[complaints,xcorrection],[com
plement,dash],[complementary,dash],[complemented,box],[complementing,plus],[complements,dot],[complete,a],[completed,tick],[completely,one],[completeness,tick],[completes,n],[completing,n],[completion,n],[complex,square],[complexes,box],[complexification,minus],[complexified,minus],[complexify,plus],[complexifying,up],[complexion,face],[complexities,n],[complexity,n],[complexly,plus],[compliance,plus],[complicated,and],[complicates,minus],[complicating,minus],[complications,minus],[complies,tick],[compliment,plus],[complimentary,plus],[compliments,plus],[comply,plus],[complying,plus],[component,square],[components,dash],[compose,note],[composed,box],[composer,person],[composex,paper],[composing,note],[composition,paper],[compositions,note],[compost,box],[composure,plus],[compound,tworectangle],[compounds,tworectangle],[comprehend,plus],[comprehension,right],[comprehensive,plus],[comprehensiveness,plus],[compress,down],[compressed,box],[compressing,down],[compression,down],[compressor,down],[compromise,minus],[compromises,xcorrection],[comps,company],[compulsory,plus],[computable,and],[computated,and],[computation,and],[computational,and],[computationalenglish,book],[computationalism,and],[computationally,plus],[computations,and],[compute,and],[computed,and],[computer,box],[computerised,and],[computers,twobox],[computerscience,and],[computes,algorithm],[computing,and],[comswipl,software],[con,dash],[conc,square],[concat,plus],[concatenate,plus],[concatenated,plus],[concatenates,plus],[concatenating,plus],[concatenation,plus],[concatl,left],[concatmap,right],[concatr,plus],[concave,circle],[concavity,box],[conceal,box],[concealed,down],[concealing,square],[concealment,box],[conceals,box],[conceive,right],[conceived,plus],[conceivers,person],[conceiving,one],[concentrate,tick],[concentrated,tick],[concentrating,plus],[concentration,plus],[concept,paper],[conception,plus],[conceptions,plus],[concepts,square],[conceptual,box],[conceptualised,plus],[conceptualises,plus],[
conceptualising,eye],[conceptually,plus],[concern,plus],[concerned,xcorrection],[concerning,right],[concerns,plus],[concert,note],[concerted,plus],[concerto,note],[concerts,note],[conch,],[concise,box],[concisely,down],[conclude,square],[concluded,square],[concludes,n],[concluding,n],[conclusion,square],[conclusions,square],[concordances,down],[concords,jet],[concourse,square],[concrete,box],[concs,square],[concur,plus],[concurred,plus],[concurrence,equals],[concurrent,equals],[concurrently,zero],[cond,equals],[condense,down],[condensed,down],[condiments,sauce],[condition,n],[conditional,dash],[conditioned,plus],[conditioner,],[conditioning,plus],[conditions,dash],[condom,],[condoms,condom],[conduct,plus],[conducted,note],[conducting,note],[conductor,person],[conducts,plus],[cone,],[cones,cone],[confectioner,person],[confer,dash],[conference,room],[conferences,room],[conferred,right],[conferring,right],[confidence,plus],[confident,plus],[confidential,square],[confidentiality,box],[confidently,plus],[config,n],[configuration,dash],[configurations,n],[configure,n],[configured,n],[configuring,plus],[confined,box],[confines,box],[confirm,plus],[confirmability,plus],[confirmation,tick],[confirmatory,plus],[confirmed,tick],[confirming,plus],[confirms,plus],[conflict,minus],[conflicted,minus],[conflicting,minus],[conflicts,minus],[conform,plus],[conformation,box],[conformations,box],[conforming,right],[conformity,plus],[confront,minus],[confrontation,minus],[confrontational,minus],[confrontations,minus],[confronting,minus],[confronts,minus],[confucian,person],[confucius,person],[confuse,minus],[confused,minus],[confusing,minus],[confusingly,minus],[confusion,minus],[congestion,minus],[conglish,and],[congo,country],[congratulated,plus],[congratulating,plus],[congratulations,plus],[congruent,equals],[congtology,up],[conj,and],[conjecture,plus],[conjectured,right],[conjoin,and],[conjoined,right],[conjoiner,right],[conjoints,plus],[conjugational,right],[conjunction,plus],[conj
unctions,and],[conjunctivitis,minus],[conjure,plus],[conjured,plus],[conjures,square],[conjuring,box],[conklin,person],[conn,person],[connect,dash],[connectable,right],[connected,plus],[connecticut,city],[connecting,plus],[connection,plus],[connectionism,right],[connectionnumber,n],[connectionnumberprompt,square],[connectionprompt,square],[connections,dash],[connective,dash],[connectives,dash],[connectivity,line],[connector,right],[connectors,box],[connects,plus],[connoisseur,person],[connotations,plus],[conns,dash],[cons,minus],[conscience,plus],[conscientious,plus],[conscientiousness,plus],[conscious,brain],[consciously,plus],[consciousness,plus],[consciousnesses,plus],[consec,right],[consecutive,right],[consent,plus],[consented,plus],[consenting,plus],[consequence,right],[consequences,dot],[consequent,a],[consequential,right],[consequentialist,person],[consequents,right],[conservation,equals],[conservatism,minus],[conservative,plus],[conservatively,n],[conserve,plus],[conserved,plus],[conserving,plus],[conses,box],[consider,tick],[considerable,plus],[considerably,plus],[considerate,plus],[consideration,plus],[considerations,dash],[considered,plus],[considering,plus],[considers,plus],[consist,box],[consisted,plus],[consistency,plus],[consistent,plus],[consistently,plus],[consisting,box],[consists,dash],[console,square],[consolidate,plus],[consolidated,plus],[consolidation,plus],[consolidator,plus],[consonants,square],[consortium,company],[const,n],[constancy,plus],[constant,n],[constantdis,equals],[constantly,n],[constants,k],[constellation,star],[constituent,box],[constituents,person],[constitute,box],[constituted,equals],[constitutes,plus],[constituting,box],[constitution,paper],[constitutional,book],[constitutions,book],[constitutive,plus],[constrain,down],[constrained,plus],[constraining,n],[constraint,n],[constraints,box],[construct,box],[constructed,box],[constructing,box],[construction,box],[constructions,box],[constructive,box],[constructivism,box],[constr
uctor,box],[constructors,and],[constructs,box],[construed,plus],[consult,plus],[consultancy,plus],[consultant,person],[consultants,person],[consultation,paper],[consulted,plus],[consulting,right],[consume,plus],[consumed,plus],[consumer,person],[consumers,person],[consuming,left],[consumption,box],[cont,right],[contact,dash],[contacted,dash],[contactform,paper],[contacting,equals],[contacts,person],[contagious,minus],[contain,box],[containable,box],[contained,box],[container,box],[containers,box],[containing,box],[contains,square],[contaminable,minus],[contaminated,minus],[contamination,minus],[contemplated,plus],[contemplating,plus],[contemplation,plus],[contemporary,square],[contempt,minus],[contend,plus],[contended,plus],[contenders,person],[content,a],[contention,two],[contentions,square],[contents,square],[contest,plus],[contested,xcorrection],[contesting,plus],[context,square],[contexts,box],[contextsi,square],[contextual,box],[contextualisation,plus],[continent,],[continental,square],[continents,square],[contingency,plus],[contingent,plus],[contingents,person],[continual,right],[continually,right],[continuance,right],[continuation,right],[continue,right],[continued,right],[continues,right],[continuing,right],[continuities,right],[continuity,line],[continuous,circle],[continuously,circle],[contour,right],[contours,box],[contra,xcorrection],[contrabass,box],[contracello,cello],[contract,paper],[contracted,down],[contracting,down],[contraction,right],[contractionism,down],[contractor,person],[contractors,person],[contracts,paper],[contractual,paper],[contradiction,minus],[contradictions,minus],[contradictorily,minus],[contradictory,minus],[contradicts,xcorrection],[contradistinction,notequals],[contrarily,xcorrection],[contrary,minus],[contrast,minus],[contrasted,minus],[contrasting,minus],[contrasts,minus],[contrected,plus],[contrection,plus],[contribute,plus],[contributed,plus],[contributes,right],[contributing,plus],[contribution,right],[contributions,box],[c
ontributor,person],[contributors,person],[control,tick],[controlled,plus],[controller,right],[controllers,plus],[controlling,plus],[controls,right],[controversial,minus],[controversy,minus],[conundrums,minus],[convenience,plus],[convenient,plus],[convenor,person],[convention,plus],[conventional,plus],[conventionally,plus],[conventions,tick],[converge,down],[converged,right],[convergence,down],[convergent,right],[conversants,person],[conversation,speechballoon],[conversational,mouth],[conversations,paper],[converse,square],[conversed,square],[conversely,plus],[conversing,mouth],[conversion,right],[conversions,right],[convert,right],[converted,right],[converter,right],[converters,right],[converting,right],[converts,right],[convex,circle],[convey,paper],[conveyable,right],[conveyed,plus],[conveying,paper],[conveys,right],[convince,plus],[convinced,plus],[convincingly,plus],[convoluted,minus],[convoy,person],[coo,dash],[coogan,person],[cook,n],[cookbook,book],[cooked,n],[cookery,apple],[cookie,],[cookies,cookie],[cooking,n],[cooks,person],[cool,n],[cooled,n],[cooler,lessthan],[cooling,n],[coootherlangpairs,tworectangle],[cooperated,plus],[cooperation,plus],[cooperative,plus],[cooperatives,plus],[coord,n],[coordinate,n],[coordinated,dash],[coordinates,dot],[coordination,plus],[coordinator,person],[coordinators,person],[coords,dot],[coordstype,square],[coorss,note],[coowithoutlangpairs,square],[cop,person],[cope,plus],[coped,plus],[copied,two],[copies,plus],[copilot,person],[coping,plus],[copiously,n],[copper,],[copula,equals],[copulate,plus],[copulation,plus],[copy,paper],[copyediting,paper],[copying,plus],[copyright,tick],[copyrighted,tick],[copyrighting,plus],[copyw,paper],[copywrite,square],[copywriter,person],[copywriting,pen],[copywritings,paper],[copywritten,plus],[copywrote,pen],[coral,],[cord,],[cordial,],[cordnode,dash],[cords,dash],[corduroy,pants],[core,paper],[cores,paper],[cori,dash],[cork,],[corkingdale,person],[corks,cork],[cormack,person],[corn,],[corneli
a,person],[corner,zero],[cornered,box],[corners,dot],[cornerstone,box],[cornstarch,box],[cornupcopias,plus],[coronary,heart],[coronavirus,virus],[coronel,person],[corporal,person],[corporate,dollarsymbol],[corporateness,dollarsymbol],[corporation,company],[corporations,company],[corps,person],[corpuscle,ball],[corr,xcorrection],[correct,tick],[corrected,tick],[correcter,xcorrection],[correcting,xcorrection],[correction,xcorrection],[corrections,xcorrection],[correctly,tick],[correctness,tick],[correctors,person],[corrects,tick],[correl,right],[correlate,right],[correlated,dash],[correlates,dash],[correlating,dash],[correlation,dash],[correlational,right],[correlations,dash],[correspond,right],[corresponded,right],[correspondence,right],[correspondences,dash],[corresponding,dash],[corresponds,right],[corridor,room],[corridors,line],[corrs,dash],[corrupt,minus],[corrupted,minus],[corruption,minus],[corrupts,minus],[cortisol,ball],[cos,right],[cosine,n],[cosmic,plus],[cosmogony,book],[cosmol,box],[cosmological,box],[cosmologically,plus],[cosmologies,box],[cosmologist,person],[cosmologue,person],[cosmology,box],[cosmopolitan,magazine],[cosmos,box],[cosms,box],[cost,dollarsymbol],[costa,country],[costan,person],[costantino,person],[costly,dollarsymbol],[costs,dollarsign],[costume,hat],[costumes,coat],[cosy,plus],[cot,dash],[cototherlangpairs,square],[cottage,house],[cotton,line],[cottoning,plus],[couch,chair],[cough,mouth],[coughing,mouth],[could,dash],[couldn,xcorrection],[coun,person],[council,person],[councillors,person],[councils,up],[counselling,person],[counsellor,person],[counsellors,person],[count,n],[countable,n],[countalgorithmfirstline,square],[countalgorithmsecondline,square],[counted,n],[counter,n],[counteract,xcorrection],[counterbalance,equals],[countered,equals],[countering,equals],[countermand,xcorrection],[countermeasure,xcorrection],[counterpart,plus],[counterparts,equals],[counterpoint,dot],[counterproductive,minus],[counters,counter],[countess,person
],[counting,n],[countries,square],[country,],[countryman,person],[counts,n],[counttwouses,n],[couple,person],[couples,person],[coupling,equals],[courage,plus],[couresware,dash],[courier,person],[course,tworectangle],[coursedesc,square],[coursera,company],[courses,book],[courseware,book],[coursework,paper],[court,building],[courteous,plus],[courteously,plus],[courtesy,plus],[courtier,person],[courtiers,person],[courtyard,yard],[cover,square],[coverage,square],[covered,square],[covering,square],[covers,square],[covid,minus],[cow,],[cowan,person],[cowden,person],[coworkers,person],[cowperson,person],[cows,animal],[cox,person],[cp,right],[cpa,variable],[cpb,variable],[cpd,plus],[cps,right],[cpt,n],[cpu,computer],[cpus,and],[cpv,variable],[cpvs,variable],[cpx,dot],[cq,plus],[cr,line],[crab,],[crack,box],[cracked,right],[cracker,xcorrection],[cracking,box],[cracks,box],[craft,box],[crafted,plus],[crafting,box],[crane,],[cranes,crane],[crang,person],[cranial,head],[crank,rod],[crap,ball],[crash,minus],[crashed,minus],[crashes,down],[crashing,minus],[crate,box],[crates,crate],[cravitron,plus],[crawled,right],[crawling,right],[cre,dash],[creagh,person],[cream,],[creaming,cream],[creams,cream],[creamy,cream],[crease,right],[creases,line],[create,plus],[created,person],[creates,person],[createview,plus],[creatine,ball],[creating,plus],[creatinine,ball],[creation,one],[creationism,plus],[creations,plus],[creative,left],[creativeinnovationglobal,plus],[creatively,plus],[creativity,and],[creator,person],[creators,person],[creature,animal],[creatures,dog],[cred,paper],[credence,plus],[credentials,plus],[credibility,plus],[credible,plus],[credit,plus],[credited,plus],[crediting,plus],[creditline,card],[creditor,person],[creed,plus],[creek,],[creelman,person],[creep,right],[creeper,right],[cremate,],[cremated,down],[cremating,box],[crepe,],[crescendo,slash],[creswell,person],[crew,person],[crewmember,person],[cricket,ball],[cricos,n],[cried,minus],[crime,minus],[crimes,minus],[crimi
nal,person],[criminals,person],[criminology,xcorrection],[crises,minus],[crisis,minus],[crit,paper],[criteria,dash],[criterion,square],[critic,person],[criticable,xcorrection],[critical,xcorrection],[criticality,xcorrection],[critically,xcorrection],[criticism,square],[critics,person],[critique,paper],[critiqued,paper],[critiques,paper],[critiquing,plus],[croaker,frog],[croatia,country],[croc,dash],[crocodile,],[croissant,],[cron,software],[crontab,n],[crook,person],[crooke,person],[crop,plant],[crosbie,person],[crosby,person],[cross,x],[crossbow,],[crossed,minus],[crossers,person],[crosses,x],[crossing,equals],[crossings,dot],[crossover,software],[crossroads,x],[crossword,square],[crotchet,line],[crotchets,note],[crow,],[crowd,person],[crowds,person],[crowing,line],[crown,],[crs,dash],[crsotvt,dash],[crucial,tick],[cruise,person],[crumb,],[crumpled,down],[crunches,left],[crunching,down],[crush,down],[crushed,down],[cry,tear],[crying,minus],[crypt,room],[cryptic,right],[cryptically,right],[cryptographic,right],[cryptography,right],[crystal,box],[crystals,crystal],[cs,box],[csav,dash],[css,paper],[cstring,variable],[cstrings,variable],[cstringsr,variable],[cstringsrr,variable],[csymf,square],[ct,dash],[ctat,dash],[ctc,curve],[ctcs,arc],[ctinput,square],[ctobr,box],[ctrl,box],[ctt,box],[cttbacktranslate,square],[cttinput,n],[cttorig,box],[cttorigtran,right],[ctts,person],[cttsimplified,right],[ctx,dash],[cub,],[cuba,country],[cubby,],[cube,box],[cubeful,box],[cubes,box],[cubic,box],[cubie,person],[cubist,box],[cuckoo,bird],[cud,dash],[cuddled,circle],[cuddling,circle],[cue,square],[cues,square],[cuff,cylinder],[cuisine,tofu],[cukierman,person],[cul,plus],[culinary,apple],[culminating,up],[culmination,n],[culpa,minus],[culpate,minus],[culpated,minus],[cult,paper],[cultivate,plus],[cultivated,plus],[cultivating,plus],[cultofpedagogy,apple],[cultural,dash],[culturally,glass],[culturaltranslationtool,software],[culture,box],[cultured,plus],[cultures,square],[culturologist
s,person],[culturology,book],[culturvate,company],[cumbersome,minus],[cumquat,],[cumulative,n],[cup,],[cupboard,],[cupcake,],[cupcakes,cupcake],[cupid,person],[cupping,box],[cups,cup],[cur,dash],[curable,plus],[curated,plus],[curb,stop],[curbed,xcorrection],[curd,],[cure,plus],[cured,plus],[curing,plus],[curiosity,plus],[curious,eye],[curiously,plus],[curl,square],[curled,right],[curler,cylinder],[curly,line],[curr,dash],[currant,],[currencies,dollarsymbol],[currency,square],[current,tick],[currentdate,n],[currently,one],[currents,right],[curricula,book],[curricular,book],[curriculum,book],[curried,n],[currnotification,plus],[curry,],[currying,variable],[curse,minus],[cursor,square],[curtis,person],[curve,square],[curved,arc],[curves,arc],[curving,right],[curvy,arc],[cushion,],[cushions,cushion],[cusp,dot],[cust,person],[custodians,person],[custodianship,plus],[custody,room],[custom,plus],[customary,paper],[customer,person],[customers,person],[customisability,plus],[customisable,plus],[customisation,plus],[customisations,plus],[customise,right],[customised,dash],[customiser,box],[customising,square],[customized,plus],[customs,plus],[custs,person],[cut,xcorrection],[cute,plus],[cutedog,dog],[cuticle,cuticle],[cuticles,cuticle],[cutlass,],[cutoff,n],[cuts,n],[cutter,],[cutting,box],[cv,paper],[cve,dash],[cvs,right],[cw,book],[cwb,minus],[cwd,paper],[cws,paper],[cx,variable],[cxl,dash],[cxs,dash],[cxx,variable],[cxzrnlnncxplrjmyn,dash],[cy,variable],[cyber,computer],[cybersecurity,plus],[cyborg,person],[cycle,circle],[cycles,circle],[cyclic,circle],[cyclical,circle],[cycling,right],[cyclist,person],[cylinder,],[cylindrical,cylinder],[cymatics,line],[cymbal,],[cynthia,person],[cyprus,country],[cystic,minus],[cytokines,cell],[cyy,variable],[cz,dash],[czech,country],[czechia,country],[czz,variable],[côte,country],[d,square],[da,dash],[dabbed,dot],[dabble,down],[dad,person],[dada,book],[daddy,person],[dadirri,person],[dads,person],[daeea,dash],[daffodil,],[daguerreotypes,p
lus],[dagwell,person],[dahl,person],[dai,person],[daigou,dash],[daily,square],[dailyregimen,plus],[dainik,newspaper],[dairy,milk],[daisies,daisy],[daisy,],[dal,dash],[dale,person],[daloz,person],[dam,],[damage,minus],[damaged,minus],[damages,minus],[damaging,minus],[damfjdxt,dash],[damien,person],[damper,],[dampness,water],[dan,person],[danby,person],[dance,right],[danced,right],[dancer,person],[dancing,hand],[dandm,and],[danger,minus],[dangerous,minus],[dangers,minus],[dangled,down],[daniel,person],[danish,danish],[danishes,],[dann,person],[dao,path],[daodejing,book],[daoism,book],[daoismheideggergreen,paper],[daoist,person],[darcy,person],[daring,plus],[daringnesses,plus],[dark,square],[darkening,square],[darker,square],[darkest,square],[darling,person],[darlings,person],[darren,person],[darwin,person],[das,down],[dasein,person],[dash,],[dashboard,and],[dashes,dash],[data,a],[dataalgdict,book],[database,box],[databaseformulafinder,up],[databases,box],[datadictionary,square],[dataflow,right],[datalayer,line],[dataless,minus],[date,square],[dateandtime,n],[dates,square],[datestamp,square],[datetime,n],[dative,square],[datura,n],[daub,down],[daughter,person],[daughters,person],[daunt,minus],[dave,person],[davey,person],[david,person],[davis,person],[daw,computer],[dawkins,person],[dawn,n],[dawson,person],[day,],[daydob,n],[daylearned,square],[daylight,sun],[days,day],[daysb,square],[daysbspeoplearmy,n],[dayvaluea,variable],[dayvalueb,variable],[db,square],[dba,dash],[dbase,box],[dbcc,dash],[dbff,box],[dbpia,paper],[dbs,square],[dbw,square],[dc,dash],[dcd,dash],[dcg,square],[dcgs,dash],[dcgutils,right],[dd,dash],[ddd,dash],[dddr,dash],[dde,dash],[dded,dash],[de,country],[dea,person],[dead,minus],[deadening,down],[deadline,n],[deadlines,n],[deadlocked,equals],[deaf,minus],[deafness,ear],[deakin,university],[deal,plus],[dealer,person],[dealing,plus],[dealings,plus],[dealmaking,dollarsymbol],[deals,plus],[dealt,plus],[dean,person],[deane,person],[deanimate,zero],[deanna,
person],[dear,person],[dearly,plus],[death,zero],[deathbed,bed],[deaths,dot],[deb,person],[debate,two],[debated,square],[debates,square],[debbie,person],[debian,software],[debriefing,square],[debris,box],[debt,minus],[debu,dash],[debug,a],[debugged,xcorrection],[debugger,tick],[debuggers,xcorrection],[debugging,tick],[debugvalue,n],[debutanted,plus],[dec,square],[decade,n],[decades,n],[decay,minus],[deceive,minus],[deceiving,minus],[december,square],[decentralised,dot],[deception,minus],[deceptive,minus],[decide,tick],[decided,tick],[decides,tick],[deciding,tick],[deciduate,down],[deciduousness,down],[decimal,n],[decimetre,tenrectangle],[decipher,right],[deciphered,right],[deciphering,right],[decision,tick],[decisions,tick],[decisive,plus],[deck,square],[declaration,paper],[declarations,square],[declarative,and],[declare,mouth],[declared,square],[decline,down],[decoded,square],[decoding,right],[decomp,down],[decompose,minus],[decomposeandbuild,right],[decomposed,down],[decomposes,down],[decomposing,down],[decomposition,minus],[decompress,up],[decompressing,up],[decomps,down],[decons,left],[deconstruct,down],[deconstructed,down],[deconstructing,down],[deconstruction,down],[decor,box],[decorated,square],[decoration,plus],[decrease,down],[decreased,minus],[decreases,down],[decreasing,down],[decrypt,left],[decrypted,plus],[decryption,right],[decurion,dash],[ded,dash],[dedicated,plus],[deduce,right],[deduced,right],[deducer,up],[deducing,right],[deduct,minus],[deduction,right],[deductive,up],[deed,hand],[deeds,hand],[deemed,plus],[deemer,person],[deeming,plus],[deep,down],[deepening,zline],[deepens,down],[deeper,down],[deepest,down],[deeply,down],[deeptissue,box],[deer,],[def,dash],[default,one],[defeasible,plus],[defeat,minus],[defeated,xcorrection],[defeating,xcorrection],[defeats,xcorrection],[defect,right],[defects,minus],[defence,plus],[defenceedit,dash],[defences,dash],[defend,xcorrection],[defended,xcorrection],[defending,plus],[defensibility,plus],[defensive,plus
],[defer,right],[deference,plus],[deferens,tube],[defiantly,plus],[deficiencies,minus],[deficiency,minus],[deficit,minus],[define,dash],[defined,square],[defines,square],[defining,dash],[definite,plus],[definitely,plus],[definition,square],[definitions,square],[definitive,plus],[definitively,plus],[definitiveness,plus],[deflect,right],[deflected,right],[deflecting,right],[deflection,right],[deforestation,tree],[defying,xcorrection],[deg,n],[degenerate,minus],[degeneration,minus],[degradation,minus],[degrade,down],[degree,a],[degreelessness,zero],[degrees,paper],[dehiscence,up],[dehlinger,person],[dei,plus],[deidentified,xcorrection],[deidentifier,xcorrection],[deigned,plus],[deigning,plus],[deity,person],[dej,dash],[del,dash],[delay,dash],[delayed,n],[delays,n],[delect,plus],[delectable,apple],[delected,plus],[delecting,tongue],[delegate,person],[delegated,down],[delegatee,person],[delegatees,person],[delegater,person],[delegates,person],[delegating,right],[delegation,right],[delegator,person],[delegators,person],[delete,minus],[deletea,xcorrection],[deleted,xcorrection],[deleteduplicatecode,and],[deleteduplicates,xcorrection],[deletefirst,xcorrection],[deleter,xcorrection],[deletes,xcorrection],[deleting,minus],[deletion,xcorrection],[deletions,minus],[deliberate,plus],[deliberately,plus],[delicate,plus],[delicatessen,shop],[delicious,plus],[deliciousness,plus],[delight,plus],[delighted,plus],[delightful,plus],[delightfully,plus],[delightited,plus],[delights,plus],[delilah,person],[delimeter,dash],[delimit,box],[delimited,dash],[delimiter,square],[delimiters,square],[delineated,plus],[deliver,right],[delivered,right],[deliverer,person],[deliveries,right],[delivering,right],[delivers,right],[delivery,right],[deloan,person],[delta,],[deltoid,muscle],[delve,down],[delves,down],[delving,down],[demand,left],[demanded,plus],[demanding,minus],[demands,minus],[demarcates,equals],[demarcating,plus],[dematerialing,box],[dematerialised,zero],[demeaning,minus],[demeanour,plus]
,[demented,minus],[dementia,minus],[demetrioued,plus],[demetrius,person],[demo,paper],[democracy,dot],[democratic,plus],[democratically,plus],[democrats,person],[demoed,plus],[demographic,person],[demographics,person],[demon,minus],[demonadalised,box],[demonstrate,plus],[demonstrated,tick],[demonstrates,plus],[demonstrating,plus],[demonstration,hand],[demonstrations,box],[demoralised,minus],[demuted,up],[den,box],[dena,city],[denature,minus],[denatured,minus],[dendrite,right],[denial,xcorrection],[denied,xcorrection],[denies,xcorrection],[denis,person],[denison,person],[denmark,country],[denominator,n],[denote,right],[denoted,square],[denotes,square],[denoting,square],[dense,n],[density,dot],[dental,tooth],[dentist,person],[dentistry,tooth],[denuded,plus],[denured,plus],[deny,minus],[denzin,person],[deontological,person],[deontology,person],[deoxygenated,minus],[departed,right],[departing,right],[department,building],[departmental,book],[departments,paper],[departure,right],[depend,right],[dependability,plus],[dependable,plus],[depended,left],[dependence,down],[dependencies,up],[dependency,left],[dependent,left],[dependently,right],[dependents,up],[depending,left],[depends,left],[depict,square],[depicted,square],[depicting,square],[depicts,right],[depleting,minus],[deploy,right],[deployed,plus],[deploying,plus],[deployment,box],[deposit,dollarsymbol],[deposited,dollarsymbol],[depositing,dollarsymbol],[deprecate,xcorrection],[deprecated,minus],[depressed,minus],[depression,minus],[depressive,minus],[deps,up],[dept,building],[depth,n],[depts,building],[deputy,person],[dequals,company],[deransart,dash],[derby,square],[dereconstruction,down],[deregulation,plus],[derhuerst,person],[derision,minus],[derivability,right],[derivatives,down],[derive,plus],[derived,right],[deriving,right],[dermatol,person],[dermatological,square],[derogatory,minus],[derrida,person],[derridean,person],[derrimut,dot],[deruiter,person],[des,person],[desc,square],[descartes,person],[descend,down],
[descended,down],[descends,down],[descent,down],[describe,square],[described,square],[describes,dash],[describing,square],[descrip,square],[description,square],[descriptioncombination,dash],[descriptionedit,dash],[descriptions,square],[descriptiontext,square],[descriptive,square],[descriptor,square],[descriptors,square],[desensi,person],[desert,sand],[deserve,tick],[deserved,plus],[deservedness,plus],[deserves,plus],[deservingly,plus],[desiccated,down],[desiccating,down],[desiderative,plus],[design,line],[designated,plus],[designates,plus],[designed,tick],[designer,person],[designers,person],[designing,square],[designs,square],[desirable,plus],[desire,plus],[desired,plus],[desires,plus],[desiring,plus],[desk,],[desks,desk],[desktop,square],[desktops,and],[despite,xcorrection],[despotic,minus],[desseminate,down],[dessert,apple],[desserts,dessert],[dessicated,down],[dest,dot],[destination,dot],[destinations,dot],[destined,dot],[destiny,dot],[destroy,xcorrection],[destroyed,xcorrection],[destroying,xcorrection],[destruction,minus],[destructive,minus],[det,square],[detach,notequals],[detached,plus],[detail,dash],[detailed,dot],[detailedly,plus],[detailer,square],[detailers,square],[detailing,dash],[details,box],[detailsoptionally,dash],[detect,equals],[detectability,plus],[detectable,plus],[detected,tick],[detecting,tick],[detection,equals],[detector,box],[detectors,eye],[detects,person],[detergent,],[deteriorated,down],[determinant,plus],[determinants,plus],[determination,n],[determine,equals],[determined,plus],[determiner,plus],[determines,right],[determining,tick],[determinism,plus],[determinisms,plus],[deterministic,equals],[deterred,xcorrection],[detlog,book],[detour,right],[detoured,right],[detracting,minus],[detrimental,minus],[detritus,ball],[deutscher,person],[deuxième,box],[dev,square],[devaluation,down],[develop,plus],[developed,plus],[developedly,plus],[developedness,plus],[developer,person],[developers,person],[developing,box],[development,plus],[developmen
tal,box],[developmentally,plus],[developments,plus],[develops,box],[deviations,plus],[device,and],[devices,dash],[devil,person],[devin,person],[devise,square],[devised,plus],[devising,square],[devoid,minus],[devops,right],[devoted,plus],[devotion,plus],[devour,mouth],[dewey,person],[dewzguiy,dash],[dexterous,finger],[df,dash],[dfa,line],[dfb,dash],[dff,dash],[dfn,dash],[dfs,down],[dgc,dash],[dgl,dash],[dh,dash],[dhn,variable],[di,dash],[diabetes,minus],[diagnose,tick],[diagnosed,square],[diagnoses,plus],[diagnosing,square],[diagnosis,tick],[diagnostic,tick],[diagnostics,tick],[diagonal,line],[diagonally,right],[diagonals,line],[diagram,square],[diagrams,square],[dial,circle],[dialect,square],[dialectic,paper],[dialectics,paper],[dialectise,square],[dialled,n],[dialogical,square],[dialogically,square],[dialogue,paper],[dialogues,mouth],[diameter,n],[diamond,],[diamonds,box],[dian,person],[diana,person],[dianne,person],[diaphragm,],[diareasoner,square],[diaries,book],[diarrhoea,minus],[diary,book],[dibb,person],[dice,],[dichotomies,two],[dichotomy,dash],[dick,person],[dickson,person],[dickybird,bird],[dicotyledon,flower],[dict,book],[dictate,mouth],[dictated,mouth],[dictates,mouth],[dictation,mouth],[dictators,person],[diction,tongue],[dictionaries,book],[dictionary,book],[dicts,book],[did,dash],[didactic,plus],[diddly,minus],[didgeridoo,],[didn,xcorrection],[dido,person],[die,minus],[died,n],[dies,minus],[diet,apple],[dietary,apple],[diets,apple],[dif,variable],[diff,minus],[diffed,notequals],[differ,minus],[differed,two],[difference,minus],[differencebetween,minus],[differences,minus],[differencet,minus],[different,minus],[differential,plus],[differentiate,minus],[differentiated,two],[differentiates,box],[differentiating,minus],[differentiation,minus],[differentiations,plus],[differently,minus],[differing,notequals],[differs,minus],[difficult,minus],[difficulties,xcorrection],[difficultly,minus],[difficulty,minus],[diffs,notequals],[diffusion,right],[différence,minu
s],[dig,down],[digby,person],[digest,stomach],[digested,stomach],[digestible,stomach],[digesting,stomach],[digestion,stomach],[digestive,intestines],[digging,down],[digicon,square],[digit,n],[digital,finger],[digitally,n],[digits,n],[digittoint,right],[dignified,plus],[dilated,right],[dilemma,minus],[dilemmas,minus],[diligence,plus],[diligent,plus],[dillet,person],[dilucidated,plus],[diluted,water],[dim,n],[dimension,line],[dimensional,n],[dimensionality,box],[dimensionally,box],[dimensions,box],[diminish,down],[diminished,down],[diminishing,down],[dimly,n],[dinacarya,square],[dine,carrot],[dined,mouth],[diner,apple],[diners,person],[ding,dash],[dinicolantonio,person],[dining,potato],[dinley,person],[dinner,potato],[dinosaur,],[dion,person],[diorama,box],[dioxide,],[dip,paper],[dipankar,person],[diploma,paper],[diplomas,paper],[diplomat,person],[diplomatic,plus],[dipped,down],[dir,box],[direct,down],[direct,right],[directed,right],[directing,dot],[direction,right],[directional,right],[directionality,right],[directiondict,book],[directiondifferencet,d],[directiondt,d],[directionend,variable],[directionlength,n],[directionlengtht,n],[directions,right],[directionstring,variable],[directionstringth,variable],[directionvalues,variable],[directionx,variable],[directiony,variable],[directionz,variable],[directive,right],[directives,right],[directly,line],[director,person],[directories,box],[directors,person],[directory,paper],[dirs,right],[dirt,],[dirty,minus],[dis,variable],[disab,minus],[disabilities,minus],[disability,minus],[disabled,minus],[disadvanced,dash],[disadvantage,minus],[disadvantaged,minus],[disadvantages,minus],[disagree,xcorrection],[disagreeable,minus],[disagreed,xcorrection],[disagreeing,xcorrection],[disagreement,minus],[disagreements,xcorrection],[disagrees,xcorrection],[disallow,xcorrection],[disallowed,minus],[disambiguate,box],[disambiguated,right],[disandclass,variable],[disappear,zero],[disappeared,xcorrection],[disappearing,minus],[disappears,xco
rrection],[disappointed,minus],[disappointedly,minus],[disappointing,minus],[disarm,xcorrection],[disarmed,plus],[disarming,xcorrection],[disaster,minus],[disasters,minus],[disc,],[discard,xcorrection],[discarded,xcorrection],[discarding,xcorrection],[discards,xcorrection],[discern,plus],[discernable,plus],[discerned,v],[discernible,plus],[discerning,plus],[discharge,right],[discharging,right],[discipiated,plus],[discipled,right],[disciples,person],[disciplinarity,book],[disciplinary,xcorrection],[discipline,book],[disciplined,plus],[disciplines,book],[disciplining,plus],[disclaimed,plus],[disclaimer,plus],[disclass,variable],[disclosing,right],[disclosure,plus],[disclosures,eye],[discomfort,minus],[disconcertedness,minus],[disconnect,notequals],[disconnected,xcorrection],[disconnects,notequals],[discontiguous,line],[discontinued,xcorrection],[discount,minus],[discounted,xcorrection],[discounts,dollarsymbol],[discourage,xcorrection],[discouraged,xcorrection],[discours,paper],[discourse,square],[discourses,book],[discover,plus],[discoverable,plus],[discovered,tick],[discoveries,plus],[discovering,plus],[discovers,plus],[discovery,plus],[discredit,minus],[discredited,xcorrection],[discreet,plus],[discrepancy,minus],[discriminant,notequals],[discriminate,minus],[discriminated,xcorrection],[discrimination,minus],[discriminatory,minus],[discripiated,plus],[discs,circle],[discursive,dash],[discus,],[discuses,discus],[discuss,mouth],[discussed,square],[discusses,mouth],[discussing,mouth],[discussion,square],[discussions,square],[disease,minus],[diseased,minus],[diseases,minus],[disembarked,left],[disembarking,right],[disempower,minus],[disengagement,minus],[disenrolled,xcorrection],[disentangling,right],[disequality,notequals],[disguise,glasses],[disguises,mask],[disgusting,minus],[dish,],[dishes,dish],[dishevelled,minus],[dishonesty,minus],[disinfectant,],[disinfected,disinfectant],[disintegrate,down],[disintegrated,down],[disinterest,minus],[disinterested,minus],[disised
,minus],[disj,or],[disjointed,box],[disjunction,or],[disjunctions,or],[disjunctive,or],[disk,square],[dislike,xcorrection],[disliked,xcorrection],[dislikes,xcorrection],[disliking,minus],[dislodged,right],[dismantle,down],[dismantled,box],[dismantling,down],[dismiss,minus],[dismissal,xcorrection],[dismissed,xcorrection],[disorder,minus],[disordered,minus],[disorders,minus],[disorganised,minus],[disparagingly,minus],[disparities,minus],[disparity,minus],[dispassionate,minus],[dispatch,box],[dispensable,plus],[dispensation,plus],[dispense,right],[dispensed,down],[dispenser,down],[dispensers,dispenser],[dispensing,down],[disperse,right],[displacement,n],[display,square],[displayed,square],[displayer,display],[displaying,square],[displays,square],[disposable,plus],[disposal,down],[dispose,down],[disposed,down],[disposition,plus],[dispositional,plus],[dispute,minus],[disputed,xcorrection],[disputes,minus],[disregard,minus],[disregarded,minus],[disrespect,minus],[disrespectful,minus],[disrespects,minus],[disrupt,xcorrection],[disrupted,minus],[disrupting,minus],[disruption,minus],[disruptive,minus],[disruptor,minus],[disrupts,minus],[dissect,down],[dissected,right],[dissection,down],[disseminate,down],[disseminating,right],[dissemination,right],[disseminator,person],[dissertation,paper],[dissimilarity,notequals],[dissipate,down],[dissolve,water],[dissolved,water],[dissolver,down],[dissuade,xcorrection],[distance,line],[distanced,n],[distances,n],[distancing,n],[distant,n],[distaste,minus],[distastefully,minus],[distil,down],[distinct,plus],[distinction,two],[distinctions,a],[distinctive,plus],[distinctively,plus],[distinctiveness,plus],[distinctly,equals],[distinguish,box],[distinguished,plus],[distinguishing,right],[distort,minus],[distortion,minus],[distracted,minus],[distracting,minus],[distraction,minus],[distractions,minus],[distress,minus],[distressing,minus],[distribute,right],[distributed,right],[distributing,dash],[distribution,dash],[distributions,right],[distri
butive,right],[district,square],[districts,square],[disturbance,minus],[disturbances,minus],[disturbing,minus],[ditransitive,person],[div,square],[dive,down],[dived,down],[diverge,two],[diverged,two],[divergence,v],[divergences,right],[divergent,right],[diverging,right],[diverse,plus],[diversification,plus],[diversified,plus],[diversify,plus],[diversifying,plus],[diversions,dash],[diversity,plus],[diverted,right],[divertissement,plus],[divertissements,xcorrection],[dives,down],[divide,box],[dividebyfour,box],[divided,box],[dividend,dollarsymbol],[dividends,dollarsymbol],[dividers,box],[divides,division],[dividing,box],[divine,person],[divinely,plus],[diving,down],[divinity,person],[divisible,box],[division,box],[divisional,box],[divisions,box],[divorced,xcorrection],[divorcee,person],[divulged,up],[divulging,up],[dix,person],[dixee,person],[dixel,dash],[dixey,person],[dixie,goat],[dixl,person],[djablucinate,minus],[djameson,person],[djibouti,country],[dl,dash],[dlight,plus],[dlighted,plus],[dlive,plus],[dll,box],[dlo,dash],[dm,variable],[dmsd,dash],[dmsm,dash],[dmsr,dash],[dmss,dash],[dmst,dash],[dn,variable],[dna,],[dnn,variable],[do,person],[dob,square],[dobd,square],[dobm,square],[doby,square],[doc,paper],[dock,right],[docked,equals],[docking,equals],[docktime,and],[dockyards,water],[docs,paper],[doctor,person],[doctoral,book],[doctorate,book],[doctors,person],[doctrine,square],[doctype,square],[document,paper],[documentaries,plus],[documentary,square],[documentation,paper],[documented,paper],[documenting,paper],[documents,paper],[docx,paper],[dodged,xcorrection],[dodgem,xcorrection],[doer,plus],[does,right],[doesn,minus],[doffing,down],[dog,],[dogberry,person],[dogged,minus],[dogs,dog],[doh,dot],[doi,square],[doing,dash],[dois,n],[dojo,room],[dole,person],[doll,],[dollar,dollarsymbol],[dollars,dollarsymbol],[dollarsign,],[dollarsymbol,],[dollie,doll],[dollop,cream],[dom,person],[domain,square],[domains,square],[dome,hemisphere],[domestic,building],[dominance,plu
s],[dominant,plus],[dominate,minus],[dominated,minus],[dominates,plus],[dominating,plus],[domination,minus],[dominguez,person],[dominic,person],[dominica,country],[dominican,country],[dominions,house],[don,dash],[donald,person],[donaldkellett,person],[donate,dollarsymbol],[donated,right],[donating,dollarsymbol],[donation,dollarsymbol],[donations,dollarsymbol],[done,plus],[donee,dash],[donel,dash],[doney,person],[donl,person],[donna,person],[donor,person],[donrita,person],[dont,xcorrection],[donttry,minus],[doobies,plus],[doodles,pen],[door,],[doorbell,bell],[doors,door],[doorway,door],[doorways,door],[dorbuchers,person],[dore,dash],[dorjee,person],[dormant,dash],[dormitory,room],[dorothy,person],[dos,tick],[dose,n],[doses,n],[dostoevsky,person],[dot,],[dote,plus],[doting,plus],[dots,dot],[dotsandutterances,dot],[dott,plus],[dotted,dot],[dotting,dot],[double,two],[doubled,tworectangle],[doublemaze,right],[doubling,tworectangle],[doubly,tworectangle],[doubt,xcorrection],[doubting,minus],[doug,person],[dough,],[doughnut,],[dougsson,person],[dove,],[doves,dove],[dovetail,equals],[dovetailed,equals],[dow,person],[down,],[downcase,down],[downhill,down],[download,down],[downloaded,down],[downloading,down],[downloads,down],[downpipe,down],[downplay,xcorrection],[downplayed,xcorrection],[downs,minus],[downside,down],[downsize,down],[downstairs,down],[downton,city],[downturn,minus],[downwards,down],[dowry,dollarsymbol],[dowsed,water],[doyle,person],[dozen,n],[dozens,box],[dp,dash],[dqb,box],[dr,up],[dra,box],[draft,paper],[drafted,paper],[drafting,paper],[drafts,paper],[drag,right],[dragged,right],[dragging,right],[dragon,],[drain,box],[drained,down],[drains,down],[drake,duck],[drakes,duck],[drama,stage],[dramatic,stage],[dramatica,software],[dramatically,plus],[dramatis,book],[drank,water],[draw,paper],[drawbar,box],[drawer,box],[drawers,],[drawing,square],[drawings,square],[drawn,right],[draws,square],[drdftr,note],[dreadful,minus],[dream,box],[dreamballoon,],[dreamed,box],
[dreamily,box],[dreams,speechballoon],[dreamt,square],[dreamy,dreamballoon],[dress,],[dressed,trousers],[dressing,plus],[dressmaker,person],[drew,pen],[drfile,paper],[dribbled,water],[dribbling,down],[dried,plus],[dries,up],[drigas,person],[drill,plus],[drilling,down],[drink,glass],[drinkable,water],[drinkers,person],[drinking,glass],[drinks,water],[drip,tube],[dripping,water],[drive,right],[driven,right],[driver,person],[drivers,person],[drives,right],[driveway,],[driving,right],[drizzled,down],[drizzles,down],[drone,robot],[drones,robot],[drop,down],[dropbox,box],[droplet,water],[dropped,down],[dropping,down],[drops,down],[dropwhile,plus],[drove,right],[drug,],[drugs,switch],[drum,],[drums,drum],[drunk,water],[drupe,cherry],[dry,box],[drying,square],[dryness,square],[ds,dash],[dsf,square],[dsorted,right],[dsp,dollarsymbol],[dsss,dash],[dsui,dash],[dt,variable],[dtd,dash],[dtree,dash],[dts,n],[du,dash],[dual,tworectangle],[dualise,tworectangle],[dualism,tworectangle],[dualistic,tworectangle],[duality,two],[dubai,city],[duchess,person],[duck,],[duckduckgo,company],[ducker,person],[duckling,],[ducklings,duckling],[ducks,duck],[due,n],[duel,right],[dug,down],[duk,person],[duke,cat],[dukel,person],[dukeltrance,person],[dukie,person],[dukkel,dash],[dukl,person],[dulcimer,box],[dulex,company],[dumit,person],[dumler,person],[dummies,minus],[dump,down],[dumped,down],[dumpling,],[dumptonesquenesses,plus],[duncan,person],[dune,sand],[dunmore,person],[dup,plus],[duplicate,two],[duplicated,two],[duplicates,tworectangle],[duplicating,two],[duplication,two],[dupoux,person],[dups,n],[durability,plus],[duration,n],[durations,n],[dure,person],[durham,person],[during,one],[durum,],[dust,sphere],[duster,],[dutch,person],[duties,plus],[duty,plus],[duuyqdrkdfmcsjskj,dash],[dvd,and],[dvouz,dash],[dwayne,person],[dwelling,house],[dwim,dash],[dwyer,person],[dx,variable],[dxx,variable],[dy,variable],[dyad,two],[dye,],[dyed,ink],[dyes,box],[dying,minus],[dylan,person],[dylib,book],[dynamic,
plus],[dynamically,plus],[dynamics,n],[dynamism,plus],[dysfunctional,minus],[dyson,person],[dz,variable],[dznvkjcnza,dash],[e,box],[ea,earth],[each,dash],[eagle,],[eagled,arm],[eaning,dash],[ear,],[earlier,lessthan],[earliest,n],[earlobe,earlobe],[earlobes,earlobe],[early,left],[earn,dollarsign],[earned,dollarsign],[earner,person],[earners,dollarsymbol],[earning,dollarsign],[earningjobsprotectioninjobs,plus],[earnings,dollarsymbol],[earns,dollarsign],[earphone,],[earphones,],[earrings,circle],[ears,ear],[earth,planet],[earthlings,person],[earthquake,minus],[ease,plus],[eased,plus],[easier,plus],[easiest,plus],[easily,plus],[easiness,plus],[easing,plus],[east,right],[easter,egg],[eastern,left],[easterners,person],[easy,plus],[eat,apple],[eaten,mouth],[eater,person],[eateries,apple],[eaters,person],[eatery,apple],[eating,banana],[eats,apple],[eb,dash],[eba,dash],[ebay,company],[ebbed,plus],[ebeacon,company],[ebook,book],[ebrary,paper],[ebsco,and],[ec,variable],[eca,dash],[ecaster,company],[ecb,dash],[echo,square],[echoed,mouth],[echoes,n],[echoing,plus],[eco,apple],[ecological,plant],[ecologically,plant],[ecologist,person],[ecology,plant],[econ,dollarsymbol],[econd,dash],[econometric,right],[economic,dollarsymbol],[economical,dollarsymbol],[economically,dollarsymbol],[economics,dollarsign],[economisation,dollarsymbol],[economist,person],[economists,person],[economy,dollarsymbol],[ecosystem,plus],[ecosystems,plant],[ecritina,person],[ecritudine,paper],[ecstasy,plus],[ecstatic,plus],[ecuador,country],[ed,left],[edc,square],[edcf,square],[edcfd,square],[edd,square],[eddie,person],[ede,square],[edelweiss,plant],[edge,square],[edges,line],[edible,apple],[edit,pen],[editable,plus],[edited,right],[editing,tick],[edition,paper],[editions,n],[editor,square],[editorial,square],[editors,person],[edits,right],[edn,book],[eds,person],[edu,building],[educ,book],[educare,plus],[educate,book],[educated,plus],[educating,plus],[education,person],[educational,book],[educationally,plus],
[educative,plus],[educator,person],[educators,person],[edward,person],[edwards,person],[edx,company],[ee,dash],[eee,company],[eeg,electron],[ef,plus],[efde,square],[effect,square],[effected,plus],[effective,tick],[effectively,plus],[effectiveness,plus],[effects,a],[effeminate,person],[effervent,plus],[efficacy,plus],[efficiency,plus],[efficient,variable],[efficiently,n],[effigies,sculpture],[effort,plus],[effortless,zero],[effortlessly,plus],[effortlessness,plus],[efforts,plus],[efmar,multiply],[efron,person],[eg,peach],[egalitarian,person],[egg,],[eggplant,],[eggs,ball],[ego,person],[egocentric,minus],[egos,person],[egs,square],[egyankosh,person],[egypt,country],[eh,dash],[eigengrau,square],[eigentlichkeit,plus],[eight,eightrectangle],[eighth,n],[eightrectangle,],[eighty,n],[eilat,person],[eilish,person],[einahpets,person],[einstein,person],[eisenhart,person],[either,or],[el,dash],[elaborate,plus],[elaboration,plus],[elapse,n],[elapsed,minus],[elapses,n],[elastic,],[elate,plus],[elbow,],[elbows,elbow],[elder,person],[elderberries,berry],[elderberry,],[elderly,n],[elders,person],[eldest,n],[eldorado,city],[elected,plus],[election,tick],[elections,tick],[elective,plus],[electoral,n],[electric,electron],[electrical,ball],[electrician,person],[electricity,electron],[electrified,electron],[electro,ball],[electrode,plus],[electromagnetic,plus],[electron,],[electronic,note],[electronics,and],[electrons,minus],[electrotherapy,plus],[elegance,plus],[elegant,square],[elegantly,plus],[element,n],[elementarily,plus],[elementary,one],[elements,one],[elenchus,plus],[elephantine,minus],[elet,right],[elevated,up],[elevation,zline],[elevations,square],[eleven,n],[elf,person],[eli,person],[elicit,plus],[eliciting,plus],[elide,right],[eligible,plus],[eliminate,minus],[eliminated,xcorrection],[eliminates,minus],[eliminating,xcorrection],[elimination,minus],[eliminativist,xcorrection],[eliminativistism,xcorrection],[elimininate,minus],[elite,plus],[elites,plus],[elitist,plus],[eliza,pe
rson],[elizabeth,person],[ellen,person],[ellie,person],[elliott,person],[ellipse,],[elly,person],[ellyard,person],[elmo,],[elope,minus],[eloquent,mouth],[els,square],[elsa,person],[else,or],[elsevier,company],[elsewhere,box],[elt,square],[eluded,xcorrection],[elusive,minus],[elvira,person],[ely,person],[em,box],[email,letterbox],[emailed,right],[emails,paper],[emanate,right],[emanated,right],[emanates,right],[embark,right],[embarked,plus],[embarking,right],[embarrassed,minus],[embarrassing,minus],[embarrassingly,minus],[embed,box],[embedded,down],[embedding,down],[embellishment,plus],[embodied,person],[embodies,plus],[embodiment,person],[embossed,box],[embouchure,mouth],[embrace,plus],[embraced,person],[embryo,],[embryos,embryo],[emerald,],[emerge,right],[emerged,up],[emergence,right],[emergency,minus],[emerges,up],[emerging,up],[emeritus,person],[emerson,person],[emily,person],[eminent,plus],[emir,person],[emirates,country],[emission,right],[emissions,right],[emitted,right],[emitters,right],[emitting,right],[emma,person],[emmons,person],[emoji,square],[emoticon,square],[emotion,smile],[emotional,smile],[emotionally,plus],[emotions,smile],[emotive,plus],[emoyed,plus],[empathetic,plus],[empathise,tick],[empathised,plus],[empathising,plus],[empathy,plus],[emperor,person],[empersonified,person],[emphases,plus],[emphasis,dot],[emphasise,plus],[emphasised,plus],[emphasises,plus],[emphasising,plus],[emphasized,plus],[emphasizes,plus],[emphatic,plus],[empire,country],[empirical,eye],[empirically,square],[empiricism,book],[employ,dash],[employed,plus],[employee,person],[employees,person],[employer,person],[employers,person],[employing,plus],[employment,plus],[employs,plus],[emporium,shop],[emporiums,building],[empower,plus],[empowered,plus],[empowering,plus],[empowerment,plus],[emptied,box],[emptiness,box],[empting,box],[empty,twobox],[emptycord,zero],[emptying,box],[emptyorvalsequal,dash],[emulate,right],[emulated,right],[emulating,plus],[emulator,box],[en,book],[enable,pl
us],[enabled,plus],[enables,plus],[enabling,plus],[enact,plus],[enacted,plus],[enacting,box],[enactment,plus],[enamelise,minus],[enamour,plus],[enamourate,plus],[enamoured,plus],[enamouring,plus],[enamourmonts,plus],[encapsulated,box],[encased,box],[ence,dash],[enchanted,plus],[enchilada,],[enclaves,box],[enclosed,box],[encloses,box],[encode,right],[encoded,right],[encoding,right],[encompasses,plus],[encounter,plus],[encountered,plus],[encountering,tick],[encounters,person],[encourage,plus],[encouraged,plus],[encouragement,plus],[encouragements,plus],[encourages,plus],[encouraging,plus],[encouragingly,plus],[encrypt,right],[encrypted,right],[encrypting,right],[encryption,plus],[encuntglish,box],[encyclopaedia,book],[encyclopedia,book],[encyclopedias,book],[end,n],[endanger,minus],[endangered,minus],[endeavours,plus],[ended,n],[endgame,n],[endgrammar,variable],[endgrammara,square],[ending,n],[endings,dot],[endless,n],[endnotes,paper],[endocrine,gland],[endocrinology,book],[endometrium,box],[endorse,plus],[endorsed,plus],[endorsement,plus],[endorsing,plus],[endowing,plus],[endpoint,n],[endpoints,dot],[endrec,dot],[ends,dot],[endstring,square],[endurance,plus],[endure,plus],[endured,right],[enduring,plus],[eneathiesnesses,plus],[enemies,person],[energetic,plus],[energise,up],[energised,plus],[energy,right],[energyeducation,book],[enforce,plus],[enforced,plus],[enforcement,plus],[eng,paper],[engage,box],[engaged,plus],[engagement,plus],[engaging,plus],[engels,person],[engender,plus],[engendering,plus],[engine,and],[engineer,person],[engineered,box],[engineering,plus],[engines,and],[england,country],[english,book],[engram,square],[engraved,box],[engrossed,minus],[engrossment,minus],[enhance,plus],[enhanced,plus],[enhancement,plus],[enhancements,plus],[enhances,plus],[enhancing,plus],[enigma,dot],[enigmas,plus],[enigmatic,plus],[enimals,animal],[enjoy,plus],[enjoyable,plus],[enjoyably,plus],[enjoyed,plus],[enjoying,plus],[enjoyment,plus],[enjoyments,plus],[enjoys,plus],[e
nlarge,up],[enlarged,up],[enlarging,up],[enlightened,plus],[enlightening,plus],[enlightenment,plus],[enlisted,plus],[enliven,plus],[enlivens,plus],[enmeshment,box],[eno,person],[enotecama,plus],[enough,n],[enqljvfy,dash],[enquire,square],[enquiries,square],[enquiring,square],[enquiry,plus],[enriches,plus],[enrol,plus],[enroll,tick],[enrolled,plus],[enrolling,tick],[enrollment,tick],[enrollments,square],[enrolment,plus],[enrolments,plus],[ensconce,plus],[ensemble,person],[enshrined,plus],[ensightment,eye],[enslaving,minus],[ensue,dot],[ensuing,right],[ensure,tick],[ensured,plus],[ensures,plus],[ensuring,plus],[entailed,plus],[entails,right],[entanglement,right],[enter,right],[entered,right],[enterer,person],[enteric,stomach],[entering,square],[enterprise,building],[enterprises,plus],[enterprising,plus],[enters,right],[entertain,plus],[entertained,plus],[entertaining,plus],[entertainment,plus],[enthusiasm,plus],[enthusiastically,plus],[enthusiasts,person],[entice,plus],[enticed,plus],[enticing,plus],[entire,n],[entirely,n],[entities,person],[entitled,square],[entitlement,plus],[entitlements,plus],[entity,person],[entrain,tick],[entrained,plus],[entrance,door],[entrants,person],[entrapping,box],[entreat,plus],[entrenched,down],[entrepreneur,person],[entrepreneurial,plus],[entrepreneurs,person],[entrepreneurship,one],[entries,square],[entropy,down],[entrust,plus],[entrusting,plus],[entrustments,plus],[entry,right],[entryt,square],[entwined,right],[enumerate,n],[enumerated,n],[enumerator,n],[enumerators,n],[enunciated,plus],[enunciating,square],[enus,person],[env,box],[envelope,],[environment,box],[environmental,plant],[environmentalism,plant],[environmentally,plus],[environments,box],[environs,box],[envisaged,eye],[envisaging,eye],[envision,eye],[envisioned,square],[enwrapped,box],[enya,person],[enzymatic,],[enzyme,plus],[enzymes,box],[eof,n],[eofc,n],[eol,square],[ep,line],[epages,company],[ephesia,person],[epic,book],[epidemic,minus],[epidemics,minus],[epididymis,down
],[epinephrine,],[epiphany,plus],[episode,square],[episodes,square],[episteme,pen],[epistemological,box],[epistemologically,box],[epistemology,plus],[epithelium,square],[epoché,leftbracket],[epsilon,n],[epson,company],[epsonnet,line],[epton,plus],[epub,book],[eq,equals],[eqing,plus],[equal,equals],[equaled,equals],[equaling,equals],[equalitarian,person],[equality,equals],[equalled,equals],[equalling,equals],[equally,equals],[equals,],[equate,equals],[equated,equals],[equating,equals],[equation,tworectangle],[equations,equals],[equatorial,country],[equi,equals],[equilibrium,equals],[equip,plus],[equipment,computer],[equipped,plus],[equispace,n],[equispaced,square],[equitable,equals],[equitably,plus],[equity,plus],[equiv,equals],[equivalent,equals],[er,person],[era,line],[eradicate,xcorrection],[eradicated,xcorrection],[eras,line],[erase,xcorrection],[erased,xcorrection],[erasing,xcorrection],[erasure,xcorrection],[erc,dash],[ere,down],[erect,building],[erected,up],[erecting,up],[ereignis,box],[ergonomic,seat],[ergonomics,chair],[erica,person],[eritrea,country],[eroded,down],[erotisense,plus],[erpaccio,person],[err,minus],[erratic,minus],[erroneous,minus],[erroneously,minus],[error,xcorrection],[errors,minus],[ertain,minus],[eruption,volcano],[es,dash],[esau,person],[esc,right],[escalation,up],[escape,right],[escaped,right],[escapes,right],[eshan,person],[esl,book],[esol,book],[esophagus,tube],[esoteric,plus],[esotericnesses,plus],[especially,plus],[espeland,person],[esplanade,],[espouse,plus],[espouses,plus],[espousing,plus],[esque,plus],[ess,person],[essay,paper],[essayists,person],[essaylib,company],[essays,paper],[essence,apple],[essences,plus],[essent,box],[essential,one],[essentially,plus],[essentials,plus],[essonsciblenesses,plus],[est,equals],[establish,box],[established,zero],[establishes,box],[establishing,box],[estate,building],[estates,house],[esteem,plus],[ester,person],[esther,person],[estimate,n],[estimated,n],[estimating,equals],[estonia,country],[eswa
tini,country],[et,and],[etc,box],[etcetera,right],[etch,right],[etching,square],[eternal,n],[eternally,n],[eternity,n],[eth,dash],[ethel,person],[ethic,plus],[ethical,tick],[ethically,plus],[ethicist,person],[ethicists,person],[ethico,plus],[ethics,tick],[ethicst,dash],[ethiopia,country],[ethnic,person],[ethnically,person],[ethnicity,plus],[ethnographic,book],[ethnography,book],[etienne,person],[etiological,minus],[etiquette,plus],[etre,equals],[etymological,right],[etymology,square],[eu,country],[euan,person],[eucalyptus,tree],[eugenically,plus],[eukarya,person],[eukaryote,person],[eunice,person],[euro,dollarsymbol],[europe,continent],[european,continent],[eustace,person],[ev,dash],[evacuation,right],[evader,xcorrection],[eval,equals],[evaluability,plus],[evaluate,plus],[evaluated,tick],[evaluates,equals],[evaluating,plus],[evaluation,tick],[evaluations,plus],[evaluative,plus],[evaporated,up],[evaporation,up],[eve,person],[even,two],[evened,square],[evening,square],[evenings,moon],[evenly,line],[event,square],[eventbrite,company],[eventful,plus],[eventide,plus],[events,dot],[eventual,n],[eventually,n],[ever,dash],[everitt,person],[evermore,right],[every,n],[everybody,person],[everyday,square],[everyone,person],[everyones,person],[everything,a],[everyvarcovered,variable],[everyvarmentioned,box],[everywhere,box],[evidence,paper],[evident,tick],[evidential,box],[evie,person],[evil,xcorrection],[evince,plus],[evm,box],[evocative,plus],[evoke,plus],[evoked,plus],[evokes,up],[evoking,plus],[evolution,right],[evolutionary,right],[evolve,right],[evolved,plus],[evolving,right],[evonne,person],[evp,dash],[ew,right],[eway,company],[ex,left],[exacerbate,minus],[exacerbated,minus],[exact,equals],[exactly,equals],[exactness,equals],[exaggerate,plus],[exaggerated,plus],[exaggeration,plus],[exaltate,plus],[exam,paper],[examinable,plus],[examination,paper],[examinationism,eye],[examinations,paper],[examinative,plus],[examine,down],[examined,eye],[examinedness,plus],[examiner,person
],[examines,eye],[examining,eye],[example,apple],[examples,apple],[exams,paper],[excalibur,sword],[excavate,box],[excavating,box],[excavation,up],[exceed,plus],[exceeded,plus],[excel,box],[excelled,plus],[excellence,a],[excellences,plus],[excellent,a],[excelsior,dot],[except,xcorrection],[excepted,minus],[exception,xcorrection],[exceptional,plus],[exceptionally,plus],[exceptions,minus],[excerpt,square],[excerpts,paper],[excess,box],[excesses,minus],[excessive,minus],[exchange,right],[exchanged,right],[exchanges,right],[excide,right],[excite,plus],[excited,plus],[excitement,plus],[exciting,plus],[exclaim,square],[exclaimed,plus],[exclamation,exclamationmark],[exclamationmark,],[exclude,minus],[excluded,minus],[excluding,xcorrection],[exclusion,dot],[exclusive,box],[exclusively,plus],[exclusivity,plus],[excrement,ball],[excrete,down],[excretory,down],[excursion,right],[excuse,square],[excuses,minus],[exe,software],[exec,square],[executable,and],[execute,tick],[executecommand,equals],[executed,right],[executes,equals],[executing,and],[execution,dash],[executive,person],[executives,person],[exegeses,book],[exegesiticals,book],[exemplary,plus],[exemplified,plus],[exemplify,plus],[exercise,shoe],[exercised,right],[exercises,paper],[exercising,leg],[exert,right],[exfoliated,flower],[exhale,air],[exhaled,right],[exhaling,right],[exhausted,minus],[exhaustion,minus],[exhaustive,plus],[exhibit,dot],[exhibited,square],[exhibition,square],[exhibitions,square],[exiciting,dash],[exigencies,plus],[exist,box],[existed,plus],[existence,dot],[existences,plus],[existent,box],[existential,box],[existentiell,square],[exister,person],[existing,a],[exists,plus],[exit,left],[exited,up],[exitfail,right],[exiting,right],[exits,right],[exocrine,organ],[exolec,book],[exon,right],[exp,square],[expand,right],[expandable,right],[expanded,tworectangle],[expander,right],[expanding,plus],[expands,box],[expanse,box],[expansion,plus],[expansionism,right],[expansionist,plus],[expansions,right],[expansiv
e,right],[expect,plus],[expectancies,n],[expectancy,n],[expectation,box],[expectations,plus],[expected,box],[expectedness,plus],[expecting,left],[expend,plus],[expenditure,dollarsymbol],[expenses,dollarsymbol],[expensive,dollarsymbol],[experience,apple],[experienceable,plus],[experienced,plus],[experiencers,person],[experiences,box],[experiencing,plus],[experiential,plus],[experiment,tick],[experimental,tick],[experimentation,box],[experimented,plus],[experimenter,person],[experimenting,minus],[experiments,tick],[expert,person],[expertise,plus],[experts,person],[expiration,n],[expire,n],[expired,minus],[expires,n],[expiry,n],[explain,dash],[explained,plus],[explainer,plus],[explainers,plus],[explaining,speechballoon],[explains,plus],[explanation,square],[explanations,square],[explanatory,plus],[explicated,plus],[explicates,plus],[explicating,square],[explicit,plus],[explicitism,plus],[explicitly,plus],[exploit,minus],[exploitation,plus],[exploited,minus],[exploiting,plus],[exploration,right],[explorations,path],[explore,right],[explored,right],[explorer,plus],[explores,plus],[exploring,right],[explosion,up],[explosive,up],[exponent,n],[exponential,box],[exponentially,up],[exponents,n],[export,right],[exported,right],[exporting,right],[exports,right],[expos,paper],[expose,plus],[exposed,plus],[exposes,plus],[exposing,square],[expositio,paper],[exposition,paper],[expositions,paper],[exposure,n],[expounded,plus],[expr,plus],[express,right],[expressed,right],[expresses,square],[expressing,right],[expression,plus],[expressionnotatom,plus],[expressionnotheadache,xcorrection],[expressions,square],[expressive,plus],[expt,paper],[expulsion,xcorrection],[extend,right],[extended,right],[extending,n],[extension,line],[extensions,plus],[extensive,n],[extensively,plus],[extent,dash],[extents,right],[exterior,right],[external,right],[extinct,minus],[extinction,minus],[extol,plus],[extolled,plus],[extr,plus],[extra,plus],[extract,up],[extracted,up],[extraction,up],[extractmodes,up]
,[extracts,paper],[extracurricular,dot],[extraneous,minus],[extraordinary,plus],[extrapolate,right],[extrapolated,right],[extrapolation,right],[extrarelation,dash],[extras,plus],[extrasarguments,variable],[extraslabels,square],[extrasrelations,dash],[extravagance,minus],[extreme,minus],[extremely,plus],[extrinsic,right],[extroversion,plus],[extruded,right],[exts,right],[exuberant,plus],[exx,variable],[ey,person],[eye,],[eyebrow,],[eyebrows,brow],[eyed,eye],[eyeless,minus],[eyelid,],[eyes,eye],[eyeshadow,square],[eyesight,eye],[ez,variable],[eztalks,company],[ezzx,dash],[ezzxe,dash],[ezzxf,dash],[f,box],[fa,variable],[fab,person],[fabinyi,person],[fabric,],[fabrics,fabric],[fac,box],[face,],[facebook,paper],[faced,face],[faces,face],[facetime,n],[facets,square],[facial,face],[facilitate,plus],[facilitated,plus],[facilitates,plus],[facilitating,plus],[facilitation,plus],[facilitator,person],[facilitators,person],[facilities,building],[facility,room],[facing,right],[facitly,square],[facs,dash],[fact,a],[factor,n],[factored,n],[factorial,up],[factorisation,n],[factorise,square],[factors,dash],[factory,right],[facts,square],[factual,square],[faculties,building],[faculty,person],[fad,minus],[fade,line],[faded,down],[fader,square],[fading,line],[faeces,ball],[fahrenheit,n],[fail,xcorrection],[failed,minus],[failing,minus],[failings,minus],[failover,minus],[fails,minus],[failure,minus],[failures,minus],[faina,person],[faint,minus],[fair,plus],[faire,plus],[fairer,plus],[fairly,plus],[fairness,plus],[fairs,plus],[fairy,],[fait,plus],[faith,plus],[faithfulness,plus],[faiths,plus],[fake,minus],[fall,down],[fallace,dash],[fallacies,minus],[fallacy,minus],[fallen,down],[fallible,minus],[fallin,down],[falling,down],[fallopian,tube],[falls,down],[false,minus],[falsehood,minus],[falsely,minus],[falsified,xcorrection],[fam,person],[fame,plus],[familiar,plus],[familiarise,plus],[familiarised,plus],[families,person],[family,person],[famine,minus],[famous,person],[famously,plus],[famou
sness,plus],[famousnesses,plus],[fan,person],[fancy,plus],[fandom,company],[fanning,fan],[fans,person],[fantasias,plus],[fantasies,plus],[fantasised,plus],[fantastic,plus],[fantastically,plus],[fantasy,plus],[fao,company],[fapred,square],[faq,paper],[faqs,paper],[far,n],[farce,minus],[farcical,minus],[farewell,hand],[farion,person],[farm,box],[farmer,person],[farmers,person],[farming,apple],[farms,animal],[farrow,pig],[fas,dash],[fascinated,plus],[fascinating,plus],[fascium,box],[fashion,plus],[fashionable,plus],[faso,country],[fast,n],[fasted,zero],[fasten,plus],[fastened,equals],[faster,greaterthan],[fastest,n],[fat,],[fatah,person],[fatal,minus],[fate,minus],[father,person],[fathers,person],[fats,box],[fatty,fat],[fault,minus],[faultless,plus],[faults,minus],[faulty,minus],[fauna,animal],[fauve,plus],[favicon,square],[favorite,plus],[favour,plus],[favourable,plus],[favourably,plus],[favoured,plus],[favouring,plus],[favourite,plus],[favourites,plus],[favouritism,minus],[favours,plus],[fb,paper],[fbc,dash],[fbff,dash],[fbid,dash],[fc,dash],[fcre,right],[fd,dash],[fddb,box],[fdf,dash],[fe,],[fea,plus],[fear,minus],[fearless,plus],[fears,minus],[fearsome,minus],[feasibility,plus],[feasible,plus],[feasibly,plus],[feast,apple],[feasting,apple],[feat,plus],[feather,],[feathers,feather],[featherstone,person],[feature,square],[featured,square],[features,plus],[featurescawmp,square],[featuring,plus],[feb,square],[february,square],[febrvarius,person],[feces,ball],[fed,n],[federal,room],[feds,person],[fee,dollarsign],[feed,apple],[feedback,left],[feeder,down],[feedforward,right],[feeding,apple],[feeds,right],[feel,hand],[feeling,square],[feelings,plus],[feels,plus],[fees,dollarsymbol],[feet,foot],[felicity,person],[felis,person],[felix,person],[fell,down],[feller,person],[fellow,person],[fellows,person],[fellowships,plus],[felp,person],[felt,square],[feltman,person],[female,person],[females,person],[feminine,square],[femininity,person],[feminism,book],[feminist,person],[femi
nistic,person],[feminists,person],[fence,],[ferdinand,person],[fergie,person],[fern,plant],[fernandez,person],[fernando,person],[fernfortuniversity,building],[ferrando,person],[ferrarotti,person],[ferris,wheel],[fertilisation,right],[fertilise,right],[fertilised,plus],[fertiliser,],[fertility,plus],[festiches,plus],[festival,note],[fetch,right],[fetched,box],[fetta,cheese],[feudal,n],[feudalism,n],[fever,minus],[feverishly,plus],[few,n],[fewer,n],[fewest,n],[ff,tick],[fff,dash],[fglasgow,city],[fh,dash],[fi,book],[fiance,person],[fiat,car],[fibonacci,plus],[fibre,],[fibres,box],[fibrosis,minus],[fibs,plus],[fiction,book],[fictional,book],[fiddle,violin],[fiddly,minus],[fidelity,plus],[fie,dash],[fiedler,person],[fiedlers,person],[field,box],[fields,square],[fieldwork,book],[fierce,minus],[fiesole,theatre],[fif,fiverectangle],[fifteen,fifteenrectangle],[fifteenrectangle,],[fifth,fiverectangle],[fifths,fivecrectangle],[fifties,n],[fifty,n],[fiftyastest,tick],[fiftybreasoningspersecond,f],[fig,],[fight,xcorrection],[fighting,minus],[fights,minus],[figuratively,right],[figure,person],[figured,square],[figurehead,person],[figures,number],[figuring,plus],[fiji,country],[file,paper],[filealg,and],[filecontents,paper],[filed,box],[filekeeplacomhits,dot],[filelist,square],[filemaking,paper],[filename,square],[filenames,square],[filenametime,t],[fileout,square],[filer,box],[files,paper],[filesex,right],[filet,paper],[fileticket,square],[filex,paper],[filexx,paper],[filexyz,paper],[filezilla,and],[filho,person],[filing,paper],[filipinos,person],[fill,square],[fillbuf,square],[filled,one],[filler,down],[fillers,box],[filling,box],[fills,down],[film,line],[filmed,box],[filming,box],[films,square],[filosofia,book],[filter,box],[filtered,box],[filtering,square],[filters,software],[filtrate,ball],[fin,],[final,n],[finalchar,square],[finalise,n],[finalised,n],[finalists,person],[finally,n],[finance,dollarsymbol],[finances,dollarsymbol],[financial,dollarsign],[financially,dollarsymbo
l],[financier,person],[financiers,dollarsymbol],[find,a],[findable,dot],[findall,tick],[findalls,eye],[findargs,variable],[findbest,plus],[finder,person],[finders,person],[finding,person],[findings,square],[findmelody,note],[findnsols,n],[findo,eye],[findprogram,square],[findr,square],[findresul,tick],[findresult,n],[findrhyme,equals],[findrulesflowingtopv,eye],[finds,person],[findsentence,square],[findsols,n],[findtypes,box],[findv,plus],[fine,plus],[finer,greaterthan],[finery,square],[finesse,plus],[finesses,plus],[finger,],[fingered,finger],[fingers,finger],[fingertip,square],[fingertips,square],[finish,n],[finished,tick],[finishes,tick],[finishing,n],[finite,n],[finitely,n],[finitism,n],[finkel,person],[finland,country],[finn,person],[finnish,country],[fire,],[firecracker,box],[fired,plus],[firefighter,person],[fireguard,blanket],[fireman,person],[fireworks,up],[firm,square],[firmament,up],[firmed,up],[firming,plus],[firmly,plus],[firs,dash],[first,one],[firstargs,variable],[firstletter,square],[firstline,line],[firstly,one],[firstname,square],[firststate,n],[firstvar,variable],[firstvarname,square],[fiscal,dollarsymbol],[fischer,person],[fish,],[fisher,person],[fishes,fish],[fiske,person],[fist,],[fit,box],[fitness,plus],[fits,equals],[fitted,down],[fitting,tick],[fitzgerald,person],[fitzgibbon,person],[fitzpatrick,person],[five,fiverectangle],[fivecrectangle,],[fiverectangle,],[fiverr,company],[fix,plus],[fixed,tick],[fixes,plus],[fixing,xcorrection],[fl,variable],[flag,square],[flagdisjunction,or],[flagella,line],[flagellum,line],[flagged,plus],[flags,n],[flamboyant,plus],[flame,],[flamingo,],[flammable,minus],[flapping,up],[flappings,up],[flappy,wing],[flare,light],[flash,box],[flashed,square],[flashing,light],[flask,],[flat,square],[flatten,down],[flattened,plane],[flattening,down],[flaunt,plus],[flavell,person],[flavours,strawberry],[flaw,minus],[flawed,minus],[flawless,plus],[flawlessly,plus],[flaws,minus],[fledged,plus],[flemington,suburb],[flesh,square]
,[fleshed,plus],[fleshing,up],[flew,right],[flex,dash],[flexes,dash],[flexibilities,plus],[flexibility,plus],[flexible,dash],[flexibly,right],[flexing,right],[flick,right],[flicked,tick],[flicking,tick],[flight,right],[flights,right],[flinders,person],[flip,right],[flipped,mirror],[flipping,right],[flips,mirror],[flipxy,mirror],[flipxygrid,grid],[flittering,minus],[float,one],[floated,up],[floating,up],[floats,up],[flock,duck],[flocks,duck],[flood,water],[floor,square],[floret,flower],[florida,state],[floss,],[flotation,buoy],[flour,ball],[flourish,plus],[flourishing,plus],[flow,line],[flowchart,],[flowed,right],[flower,],[flowers,flower],[flowing,right],[flown,right],[flows,line],[flp,function],[flu,virus],[fluctuations,minus],[fluent,plus],[fluently,tick],[fluffy,down],[fluid,water],[fluidity,right],[fluke,plus],[fluorite,stone],[flush,down],[flushed,down],[flute,note],[fluttering,right],[fly,right],[flyer,person],[flying,right],[flynn,person],[fm,square],[fmap,right],[fmr,country],[fn,function],[fna,right],[fnal,right],[fnalism,dash],[fnism,right],[fns,right],[fo,dash],[focal,eye],[focus,eye],[focused,eye],[focuses,eye],[focusing,eye],[fogs,box],[foisted,minus],[fold,paper],[foldable,right],[folded,down],[folder,paper],[folders,folder],[folding,down],[foldl,right],[foldr,right],[folds,line],[folk,guitar],[folklore,plus],[folks,person],[follow,right],[followed,two],[follower,person],[followers,person],[following,dash],[follows,right],[fond,plus],[fondled,hand],[fondness,plus],[font,a],[fonts,a],[fontspace,company],[food,apple],[foodchemistry,book],[foods,apple],[foodstuffs,apple],[fool,minus],[foot,],[footage,line],[football,],[footer,square],[foothills,hill],[footnote,square],[footnotes,square],[footpath,path],[footprint,],[footprintless,zero],[foottowel,towel],[footy,],[for,dash],[forall,plus],[foramen,organ],[forbidden,minus],[forbidding,xcorrection],[force,right],[forced,right],[forces,n],[forcing,minus],[ford,car],[fore,left],[forearm,forearm],[forearms,forea
rm],[forecast,right],[forecasts,n],[forefront,zero],[forego,xcorrection],[forehead,],[foreheads,forehead],[foreign,square],[forerunner,left],[foresaw,eye],[foresee,eye],[foreseeable,right],[foreseen,right],[foreshore,square],[forest,tree],[forestry,tree],[forests,tree],[foretold,mouth],[forever,one],[forgasz,person],[forge,down],[forget,minus],[forgetfulness,minus],[forgets,xcorrection],[forgetting,minus],[forging,plus],[forgive,xcorrection],[forgiveness,plus],[forgot,minus],[forgotten,minus],[fork,],[forked,right],[forks,dash],[form,paper],[formal,plus],[formalisation,plus],[formalism,a],[formally,plus],[format,dash],[formation,plus],[formations,plus],[formative,plus],[formatively,plus],[formats,paper],[formatted,square],[formatter,square],[formatting,paper],[formed,plus],[former,minus],[formerly,left],[formidable,minus],[forming,box],[formingness,plus],[formistical,box],[formlength,n],[formr,square],[forms,paper],[formula,plus],[formulae,and],[formulas,and],[formulate,plus],[formulated,plus],[formulation,plus],[fornix,and],[forrester,person],[forster,person],[forte,line],[forth,right],[forthcoming,right],[forthright,plus],[forthrightly,plus],[fortified,plus],[fortnight,square],[fortress,building],[fortuitous,plus],[fortuna,plus],[fortune,dollarsymbol],[fortunes,dollarsymbol],[forty,fortyrectangle],[fortyrectangle,],[forum,building],[forums,paper],[forward,right],[forwarded,right],[forwardness,right],[forwards,right],[fossen,person],[fossey,person],[fossil,],[foster,person],[fostered,plus],[fostering,plus],[fosters,plus],[foucauldian,person],[foucault,person],[fought,xcorrection],[found,plus],[foundation,square],[foundational,down],[foundations,box],[founded,down],[founder,person],[founders,person],[founding,plus],[fountain,],[four,fourrectangle],[fourrectangle,],[fours,shoe],[fourth,fourrectangle],[fox,animal],[foxtel,company],[fp,dash],[fph,right],[fpp,square],[fqmer,dash],[fr,paper],[fractals,square],[fraction,n],[fractional,n],[fractions,n],[fragile,minus],[fra
gility,minus],[fragment,n],[fragmentation,box],[fragments,square],[fragrance,perfume],[fragranced,perfume],[fragrant,flower],[frailty,minus],[frame,square],[framed,square],[framenumber,n],[frames,square],[framework,box],[frameworks,square],[framing,square],[framings,square],[franca,dash],[france,country],[frances,person],[francesca,person],[franchise,building],[franchised,plus],[francis,person],[francisco,city],[franco,country],[francois,person],[frangipane,flower],[frank,person],[franked,dollarsymbol],[frankenstein,person],[frankfurt,],[frankfurts,frankfurt],[frankie,person],[frankincense,],[française,book],[fraud,minus],[fraudulent,minus],[fraught,minus],[fred,person],[freddi,person],[frederick,person],[fredric,person],[free,a],[freeaccountingsoftware,company],[freebsd,plus],[freed,plus],[freedom,dash],[freefilesync,and],[freeimages,paper],[freelance,plus],[freely,plus],[freeman,person],[freemason,person],[freeness,zero],[freer,plus],[freesias,flower],[freeway,road],[freeze,zero],[freezer,],[freezing,minus],[freight,box],[french,paper],[frenchs,company],[freq,n],[freqs,n],[frequencies,f],[frequency,n],[frequent,f],[frequented,n],[frequently,n],[fresh,plus],[freshly,zero],[freshness,plus],[fret,line],[fretless,zero],[frets,minus],[freud,person],[freudian,person],[freya,person],[fri,square],[friar,person],[friday,square],[fried,oil],[friend,person],[friendlily,plus],[friendliness,dash],[friendlinesses,square],[friendly,plus],[friends,person],[friendship,plus],[friendships,plus],[fries,chip],[frieze,square],[frighten,minus],[frightened,minus],[fringes,square],[frivells,ribbon],[fro,right],[frock,dress],[frog,],[frolicked,plus],[from,dash],[frombae,person],[frombarrabool,dash],[fromglasgow,city],[fromglasgow,right],[fromlang,paper],[front,left],[frontal,left],[frontend,square],[frontier,line],[frontiers,line],[frown,minus],[frowned,minus],[frowning,minus],[froze,zero],[frozen,n],[frugally,plus],[fruit,apple],[fruiterer,person],[fruiting,apple],[fruition,plus],[fruitmo
nger,person],[fruits,apple],[fruity,apple],[frustrated,minus],[frustrating,minus],[frustration,minus],[frustrations,minus],[frying,pan],[fs,variable],[fsfs,square],[fst,square],[ft,dash],[ftoc,right],[ftp,right],[fu,square],[fudge,],[fuel,],[fuels,plus],[fulcrum,dot],[fulfil,tick],[fulfill,tick],[fulfilled,plus],[fulfilling,plus],[full,one],[fulladjective,plus],[fuller,person],[fullness,box],[fullstop,],[fully,n],[fume,nose],[fun,plus],[funcs,and],[function,a],[functional,square],[functionalise,function],[functionalised,down],[functionalism,function],[functionality,and],[functionally,plus],[functionarguments,square],[functioncalls,square],[functioned,plus],[functioning,plus],[functionname,square],[functionnumber,n],[functionresult,square],[functions,square],[functionymous,plus],[functionyms,dash],[functor,right],[functordis,variable],[functors,f],[fund,dollarsymbol],[fundamental,line],[fundamentalist,down],[fundamentally,down],[fundamentals,line],[funded,dollarsymbol],[funding,dollarsymbol],[fundraising,dollarsymbol],[funds,dollarsymbol],[funeral,n],[fungal,fungus],[fungus,],[funk,note],[funky,note],[funnel,],[funnelable,box],[funnelled,funnel],[funnelling,down],[funnels,funnel],[funnier,plus],[funny,smile],[funtastics,plus],[fur,],[furious,minus],[furniture,],[furrow,box],[furtado,person],[further,dot],[furthered,right],[furthering,right],[furthermore,plus],[furthest,n],[furthestfrommean,dot],[fury,minus],[fusing,equals],[fuss,minus],[fussy,plus],[future,plus],[futurelearn,company],[futures,right],[futuristic,n],[fx,tube],[fxs,square],[fz,variable],[g,box],[ga,dash],[gaant,square],[gabe,person],[gabetarian,dash],[gabon,country],[gabriel,person],[gadamer,person],[gadamerian,person],[gadgets,and],[gaelic,person],[gaffer,person],[gaga,person],[gaiety,plus],[gail,person],[gain,plus],[gained,plus],[gaining,plus],[gains,up],[galactic,box],[galactically,box],[galah,],[galanter,person],[galaxies,star],[galaxy,star],[gall,box],[gallbladder,organ],[galleries,square],[galleri
ic,plus],[gallipoli,city],[gallstone,minus],[gallstones,minus],[galumphs,plus],[galvanise,plus],[gamba,leg],[gambia,country],[gambling,minus],[game,a],[gameplay,plus],[games,square],[gamification,plus],[gaming,plus],[gamma,square],[gantar,person],[gao,person],[gap,square],[gapless,zero],[gaps,box],[garage,],[garageband,box],[garbage,],[garden,plant],[gardener,person],[gardeners,person],[gardening,plant],[gardens,plant],[gare,person],[garfield,person],[garland,person],[garlanded,flower],[garlard,person],[garlic,],[garment,shirt],[garments,shirt],[garner,person],[gary,person],[garydougsson,person],[gas,box],[gasper,person],[gassed,gas],[gastric,stomach],[gastrointestinal,stomach],[gastronomer,person],[gate,],[gates,person],[gateway,gate],[gather,box],[gathered,person],[gathering,box],[gathers,box],[gaud,apple],[gauge,n],[gauged,tick],[gauging,plus],[gauiea,dash],[gaut,person],[gave,pencilsharpener],[gavin,person],[gawenda,person],[gawk,and],[gay,person],[gayness,plus],[gays,person],[gaze,eye],[gazebo,],[gazebos,gazebo],[gazed,eye],[gazelles,animal],[gb,dash],[gbd,right],[gc,variable],[gcc,software],[gclid,dash],[gcloud,cloud],[gd,variable],[gdp,company],[ge,variable],[gear,n],[geared,plus],[ged,person],[geeds,person],[geeks,person],[geelong,city],[geis,person],[gel,],[gelder,person],[gem,jewel],[gemini,ball],[gems,jewel],[gemstone,],[gen,right],[gender,person],[gendered,person],[genders,person],[gene,n],[genealogy,down],[general,tick],[generalise,plus],[generalised,square],[generalises,plus],[generalist,plus],[generalities,plus],[generality,tworectangle],[generalizability,plus],[generally,one],[generally,square],[generate,plus],[generated,plus],[generatelevels,plus],[generatelyricslistsverse,square],[generatemelody,note],[generatemelodyh,note],[generatemelodym,note],[generaterange,plus],[generates,right],[generati,plus],[generating,right],[generation,square],[generations,person],[generative,plus],[generativeart,plus],[generativity,plus],[generator,right],[generators,a
nd],[generic,one],[generous,plus],[generously,plus],[genes,n],[genesis,zero],[genetastics,plus],[genetic,a],[genetically,dot],[geneticist,person],[genetics,n],[genitals,box],[genitive,square],[genius,person],[genome,line],[genre,book],[genres,box],[gens,dot],[gentile,person],[gentle,plus],[gently,plus],[genuine,plus],[genuinely,plus],[genus,box],[genutils,right],[geocities,company],[geode,box],[geodes,box],[geoff,person],[geographic,country],[geographical,country],[geographies,country],[geography,book],[geolocation,dot],[geomatics,n],[geometers,person],[geometric,box],[geometrical,square],[geometrically,square],[geometry,box],[georg,person],[george,person],[georgia,person],[geraldine,person],[gereffi,person],[germ,],[german,paper],[germany,country],[germinate,seed],[germs,germ],[gerry,person],[gerrymandering,minus],[gertrude,person],[gerund,box],[gerundive,right],[gervais,person],[gestalt,n],[gestaltron,line],[gestate,n],[gestation,n],[gesticulated,plus],[gestures,right],[get,plus],[getchr,a],[getitemn,square],[getline,line],[getnthelem,square],[gets,left],[getting,left],[getty,company],[getv,variable],[getvalue,box],[getvalues,box],[getvar,variable],[gf,variable],[gfc,minus],[gg,right],[ggg,variable],[gggg,variable],[ggs,variable],[ggslrbv,dash],[gh,dash],[ghana,country],[ghc,and],[ghci,and],[ghei,dash],[gherkin,gherkin],[gherkins,gherkin],[ghetto,building],[ghost,person],[ghosts,minus],[giant,],[giants,giant],[gibberish,minus],[gift,box],[gifted,plus],[gifts,box],[giggle,plus],[giggled,person],[giggling,plus],[gill,person],[gillette,company],[gillian,person],[gilligan,person],[ginger,],[gingerly,plus],[ginkgo,plant],[ginseng,plant],[gip,minus],[giraffe,],[giri,person],[girl,person],[girls,person],[gist,left],[gists,square],[git,box],[gita,book],[gitenv,and],[github,book],[githubusercontent,box],[giuca,square],[giuseppes,person],[give,plus],[given,plus],[givenate,zero],[givens,plus],[giver,person],[gives,plus],[giving,plus],[gl,variable],[gla,and],[glacier,snow],[g
lad,plus],[glade,lawn],[glance,eye],[gland,],[glands,box],[glar,n],[glaring,plus],[glass,],[glasses,],[glaucoma,minus],[gleam,light],[gleaming,light],[gleeson,person],[glen,person],[glenhuntly,suburb],[gliding,right],[glitch,minus],[glittering,photon],[global,world],[globalisation,ball],[globalization,ball],[globalizing,planet],[globalleadershipexperience,plus],[globally,ball],[globals,box],[globe,ball],[glockenspiel,],[glockenspiels,xylophone],[gloria,plus],[gloriae,plus],[glories,plus],[glory,plus],[glossaries,book],[glossary,book],[glove,],[gloves,glove],[glow,light],[glucagon,ball],[glucose,ball],[glue,down],[glued,glue],[gluggy,minus],[glycogen,ball],[glyn,person],[glyphs,square],[gmail,paper],[gnu,software],[go,right],[goal,tick],[goals,n],[goat,],[goats,goat],[goblins,person],[god,person],[godfather,person],[godhead,head],[godly,person],[godmother,person],[godness,plus],[gods,person],[goering,person],[goers,right],[goes,right],[goethe,person],[goffee,person],[gofundme,dollarsymbol],[goggles,],[goglsgqx,dash],[gogu,person],[going,right],[goings,plus],[goji,plant],[gold,box],[goldberg,person],[golden,box],[goldilocks,person],[goldsmith,person],[goldstein,person],[goleman,person],[golf,ball],[gomez,person],[gondola,],[gone,zero],[gong,plus],[good,plus],[goodall,person],[goodbye,n],[gooddamn,dash],[goodman,person],[goodness,plus],[goodnesses,plus],[goodnight,plus],[goods,box],[google,building],[googleapis,box],[googleloc,dash],[googleplay,paper],[googlepreview,square],[googles,dash],[googletagmanager,software],[goonie,person],[goose,],[gooseberry,berry],[gorbachev,person],[gorelik,person],[gorgeous,plus],[gorkhali,person],[gossamer,],[gossip,mouth],[gossipname,square],[gosub,down],[got,left],[goto,right],[gotostates,right],[gotu,plant],[gourd,ball],[goureeve,person],[gov,government],[govender,person],[govern,plus],[governance,plus],[governed,plus],[government,],[governments,person],[governors,person],[govolunteer,website],[gown,],[gp,square],[gprt,dash],[gra,dot]
,[grabbed,hand],[grabbing,hand],[grabs,hand],[grace,plus],[gracious,plus],[graciously,plus],[grad,person],[grade,a],[gradebook,book],[graded,a],[grades,a],[gradient,n],[gradients,n],[grading,dash],[gradual,line],[gradually,one],[graduate,person],[graduated,paper],[graduates,person],[graduating,tick],[graduation,plus],[graduations,plus],[grafted,up],[graham,person],[grail,cup],[grain,ball],[grainge,person],[grains,],[gram,and],[gramm,and],[grammar,paper],[grammara,square],[grammarform,square],[grammarforms,square],[grammarinterpreter,dash],[grammark,and],[grammarly,tick],[grammarname,square],[grammarrulerest,square],[grammars,dash],[grammatical,dash],[grammatically,dash],[gramophone,],[grams,box],[gramsci,person],[grand,plus],[grande,person],[grandfather,person],[grandiose,plus],[grandma,person],[grandmother,person],[grandparents,person],[grandson,person],[grandviewresearch,company],[granita,],[granite,],[grant,right],[granted,plus],[granting,plus],[grants,dollarsymbol],[granularised,box],[granulated,down],[grape,],[grapefruit,],[graph,square],[graphed,square],[grapher,square],[graphic,square],[graphical,square],[graphically,square],[graphics,square],[graphing,square],[graphs,square],[grapple,hand],[grapples,plus],[gras,person],[grasp,hand],[grasped,hand],[grasps,hand],[grass,square],[grassroots,down],[grassy,square],[grate,square],[grated,down],[grateful,plus],[grater,],[grating,down],[gratitude,plus],[grattan,street],[grave,],[gravedigger,person],[gravel,ball],[graveyards,box],[gravity,n],[gray,person],[great,plus],[greater,greaterthan],[greaterthan,],[greatest,plus],[greatly,plus],[gree,person],[greece,country],[greed,minus],[greedson,person],[greek,person],[greeks,person],[green,square],[greenhouse,building],[greenie,person],[greenland,country],[greenlucian,person],[greens,person],[greensutra,square],[greer,person],[greet,plus],[greeted,plus],[greeter,face],[greeting,square],[greg,person],[gregarious,plus],[gregory,person],[greiner,person],[gremlin,person],[grena
da,country],[grenadines,country],[grene,person],[greta,person],[grew,up],[grey,square],[greyhen,duck],[greying,minus],[gribble,person],[grid,],[gridline,line],[gridlines,line],[grids,square],[griegian,person],[grievance,minus],[grievances,xcorrection],[grieve,xcorrection],[griffin,person],[griffith,university],[griffiths,person],[grip,hand],[gripped,hand],[gripping,hand],[grips,hand],[grit,plus],[gritted,plus],[gritty,plus],[groan,minus],[grocery,building],[groove,box],[groovy,plus],[groped,hand],[ground,plane],[groundbreaking,plus],[grounded,down],[grounding,square],[grounds,plus],[groundsman,person],[group,square],[grouped,square],[grouping,square],[groupings,box],[groups,square],[grow,plant],[grower,person],[growing,up],[growling,line],[grown,up],[grows,up],[growth,up],[groysberg,person],[grub,],[grumbles,minus],[gs,dash],[gsbs,plus],[gst,dollarsymbol],[gt,greaterthan],[gtag,square],[gtm,software],[guache,],[guage,square],[guarantee,plus],[guaranteed,plus],[guaranteeing,plus],[guarantees,plus],[guard,person],[guarded,plus],[guardian,person],[guards,person],[guatemala,country],[guava,],[guba,person],[gudmundsdottir,person],[guerillacreative,company],[guernsey,],[guerriero,person],[guess,n],[guessed,plus],[guesses,questionmark],[guessing,n],[guest,person],[guesthouse,building],[guestlist,square],[guests,person],[gui,square],[guidance,plus],[guide,paper],[guided,right],[guidelines,square],[guides,square],[guiding,line],[guilds,box],[guilt,minus],[guiltily,minus],[guilty,minus],[guinea,country],[guirgis,person],[guitar,],[guitarplayer,person],[gulch,down],[gully,person],[gum,box],[gumley,person],[gumnut,],[gums,teeth],[gumtree,tree],[gun,minus],[gunfire,minus],[guns,minus],[gunshot,minus],[gupta,person],[gurgitated,up],[gurol,person],[guru,person],[gurus,person],[gustav,person],[gut,stomach],[guts,stomach],[gutter,],[guttman,person],[guy,person],[guyana,country],[guys,person],[gvc,dash],[gvcc,dash],[gvim,software],[gwfkgclcbgasyhq,dash],[gym,room],[gymansium,room],[g
ymcana,pony],[gymnasium,building],[gymnast,person],[gyms,weight],[gyrate,right],[gülcan,person],[h,variable],[ha,plus],[haapasalo,person],[habilitate,plus],[habit,plus],[habitable,plus],[habitably,plus],[habits,plus],[hack,down],[hacked,minus],[hacker,person],[hackernoon,dash],[hackers,person],[hackett,person],[hacking,down],[hackman,person],[had,square],[hadj,person],[hadn,xcorrection],[hadron,right],[haemmhorrhoids,minus],[haemorrhoid,minus],[haemorrhoids,minus],[haggedly,minus],[haggish,minus],[hagiography,person],[haha,plus],[hail,ball],[hailey,person],[hair,],[haircut,hair],[hairdresser,person],[hairdressing,scissors],[haired,hair],[hairstyles,hair],[hairstylist,person],[haiti,country],[haitsma,person],[halchitectures,square],[half,box],[halfway,n],[halides,ball],[hall,building],[halls,hall],[hallucinated,box],[hallucination,minus],[hallucinations,minus],[hallucinatory,minus],[hallucinogenic,minus],[hallway,hall],[halo,circle],[halt,box],[halted,xcorrection],[halting,zero],[halved,box],[halves,square],[ham,],[hamburger,],[hamel,person],[hamilton,person],[hamish,person],[hamlet,person],[hammer,],[hammersley,person],[hammock,],[hampstead,city],[hamster,],[hamstring,line],[hand,],[handbook,book],[handed,hand],[handful,box],[handing,hand],[handkerchief,],[handle,],[handled,hand],[handler,hand],[handlers,hand],[handles,handle],[handling,hand],[handmade,box],[hands,hand],[handshake,hand],[handstand,down],[handwriting,square],[handwritten,square],[handy,hand],[hang,person],[hanging,down],[hangs,minus],[hank,person],[hannaford,person],[hannigan,person],[hans,person],[hansard,person],[hansen,person],[hansom,car],[hao,person],[hap,dash],[happelstanded,plus],[happen,tick],[happened,tick],[happening,zero],[happenings,box],[happens,tick],[happier,plus],[happies,plus],[happiest,plus],[happily,plus],[happiness,plus],[happinesses,plus],[happisissiances,plus],[happy,],[happy,plus],[happyface,],[happynices,person],[hapsichords,plus],[harass,minus],[harassment,minus],[harbinger,p
erson],[hard,minus],[hardback,book],[harddiskless,xcorrection],[harder,minus],[hardest,n],[hardgrave,person],[hardily,plus],[hardly,minus],[hardship,minus],[hardware,box],[hardwired,line],[hardy,person],[hargraves,person],[harlen,person],[harlequin,harlequin],[harlequinade,person],[harlequinades,person],[harlequins,harlequin],[harley,person],[harm,minus],[harmed,minus],[harmful,minus],[harming,minus],[harmless,plus],[harmlessly,plus],[harmlessness,plus],[harmonica,],[harmonics,n],[harmonies,note],[harmonious,plus],[harmonised,equals],[harmoniser,box],[harmony,note],[harmonyinstrumentnumber,n],[harmonyinstruments,tube],[harmonyinstrumentslength,n],[harmonypart,note],[harmonyparts,line],[harmonypartslength,n],[harmonyr,plus],[harms,minus],[harness,],[harnesses,right],[harnessing,plus],[harold,person],[harp,],[harper,person],[harpist,person],[harps,harp],[harpsichord,note],[harpsichorders,note],[harpsichords,note],[harraway,person],[harrington,person],[harris,person],[harrison,person],[harry,person],[harsha,person],[hart,person],[harvard,university],[harvest,plant],[harvested,grain],[harvester,up],[has,pencilsharpener],[hash,],[hashtags,square],[haskell,and],[hasn,minus],[hassaan,person],[hasseldine,person],[hat,],[hate,minus],[hated,minus],[hating,minus],[hatred,minus],[hats,hat],[hatted,hat],[haughty,minus],[hauled,up],[have,apple],[haven,home],[haverbanders,person],[havilland,person],[having,hand],[hawke,person],[hawking,person],[hawkins,person],[hawthorn,suburb],[hay,],[hayes,person],[haystack,hay],[hazelnut,nut],[hazen,person],[hazily,box],[hazy,minus],[hbb,dash],[hbr,company],[hc,variable],[hcl,],[hd,box],[hdd,box],[hdmi,square],[hdrdocuments,paper],[he,person],[head,],[headache,minus],[headaches,minus],[headdress,],[headed,right],[headedness,head],[header,square],[headers,square],[heading,square],[headings,square],[headlines,square],[headphones,],[headpiece,],[headroom,box],[heads,dot],[headset,],[headshot,square],[headsofstate,person],[headstrong,minus],[headwi
nd,left],[heal,tick],[healed,plus],[healing,plus],[health,person],[healthand,plus],[healthier,plus],[healthily,plus],[healthiness,plus],[healthy,person],[heap,box],[heapings,ball],[heaps,n],[hear,ear],[heard,ear],[hearing,ear],[heart,],[heartbeat,heart],[hearted,plus],[heartening,plus],[heartland,city],[hearts,heart],[heat,n],[heated,n],[heater,],[heaters,heater],[heathcote,person],[heather,person],[heating,n],[heaven,zero],[heavenly,plus],[heavier,greaterthan],[heavily,n],[heaviness,n],[heavy,n],[hebrew,book],[hecklers,minus],[hecs,dollarsymbol],[hedgehog,],[hedgehogged,plus],[hedonistic,minus],[heel,],[heffalump,pig],[hegel,person],[hegelgreen,paper],[hegelian,person],[hegelpossyllabus,book],[heh,dash],[hei,person],[heidegger,person],[heideggerian,person],[heidi,person],[height,n],[heights,zline],[heiness,person],[heir,person],[heisenberg,zero],[held,hand],[helen,person],[helena,person],[helicopter,],[heliport,square],[helium,],[hell,minus],[hello,hand],[helloa,plus],[helloaa,plus],[helloc,plus],[helloca,plus],[hellocc,plus],[hellocca,plus],[hellod,plus],[hellodc,plus],[hellok,plus],[hellow,plus],[help,square],[helpdesk,desk],[helped,person],[helper,person],[helpers,person],[helpful,plus],[helpfulness,plus],[helpfulnesses,plus],[helping,plus],[helps,plus],[helpt,plus],[helter,n],[hem,line],[hemel,city],[hemisphere,],[hemispheres,hemisphere],[hemmed,left],[hen,duck],[hence,right],[henderson,person],[henle,person],[henry,person],[hens,duck],[henson,person],[heptagon,],[her,person],[herb,plant],[herbal,amount],[herbert,person],[herbology,plant],[herbs,amount],[here,down],[herea,square],[hereb,square],[herec,square],[hered,square],[heritage,square],[hermaphrodites,person],[hermaphroditic,minus],[hermeneutic,dash],[hermeneutical,book],[hermeneutically,right],[hermeneutics,square],[hermia,person],[hernán,person],[hero,person],[heroes,person],[heroic,plus],[herren,person],[herring,fish],[herrings,fish],[hers,person],[herself,person],[hersey,person],[hertz,n],[herzegovina
,country],[hesh,person],[heshan,person],[heshe,person],[hesitation,minus],[hesterity,plus],[heterogeneous,notequals],[heterosexual,person],[heuristic,plus],[heuristics,and],[hew,person],[hewso,book],[hex,n],[hexadecimal,n],[hexagon,],[hexagonal,hexagon],[hexagons,hexagon],[hey,one],[hflush,down],[hhh,dash],[hhi,square],[hhii,square],[hi,hand],[hibbins,person],[hibisci,plant],[hibiscus,plant],[hibiscuses,flower],[hiccups,minus],[hickletons,plus],[hid,minus],[hidden,down],[hide,minus],[hides,box],[hiding,box],[hierarchical,dash],[hierarchically,dash],[hierarchies,dash],[hierarchy,dash],[hieroglyph,square],[hieroglyphics,square],[hieroglyphs,square],[high,a],[higher,line],[highest,n],[highlight,light],[highlighted,square],[highlighting,square],[highlights,plus],[highly,plus],[hight,n],[highton,suburb],[highway,road],[highways,road],[hike,right],[hiked,right],[hiking,boot],[hilbert,person],[hill,],[hillary,person],[hillier,person],[hills,hill],[hilsley,person],[him,person],[himlkampf,book],[himself,person],[hinder,minus],[hindered,minus],[hinders,minus],[hindi,language],[hindrances,minus],[hindsight,eye],[hindu,person],[hinduism,book],[hinge,],[hint,square],[hinted,square],[hintinesses,plus],[hints,plus],[hipaa,square],[hippies,person],[hippocampus,],[hippolyte,person],[hips,],[hiptalipuppies,puppy],[hiraffe,giraffe],[hire,dollarsymbol],[hired,plus],[hirer,person],[hires,dollarsymbol],[hiring,plus],[his,person],[hispanic,person],[hisrich,person],[hissed,minus],[hist,paper],[histogram,n],[historian,person],[historians,person],[historic,book],[historical,n],[historically,book],[histories,book],[history,square],[hit,tick],[hitch,box],[hitchens,person],[hither,right],[hithertoness,note],[hitler,person],[hits,note],[hitting,equals],[hive,],[hived,bee],[hjsdfsa,dash],[hk,dash],[hkq,dash],[hlo,dash],[hn,variable],[hnn,variable],[ho,plus],[hoax,minus],[hobbies,plus],[hoboken,person],[hoc,down],[hog,pig],[hoist,up],[hoisted,up],[hoity,plus],[holbrook,person],[hold,hand],[holder,
person],[holders,hand],[holding,hand],[holds,hand],[hole,],[holes,circle],[holiday,square],[holidayed,plus],[holidaying,plus],[holidays,square],[holily,plus],[holiness,plus],[holism,plus],[holistic,plus],[holistically,plus],[hollownesses,box],[hollywood,city],[holmes,person],[holocap,],[holodeck,square],[hologram,box],[holograms,right],[hologranates,pomegranate],[holographic,box],[holy,paper],[home,house],[homebrew,software],[homeless,person],[homelessness,minus],[homemade,plus],[homeostasis,line],[homepage,paper],[homes,house],[homework,paper],[homicide,minus],[hominems,person],[homo,person],[homoeroticness,plus],[homographs,equals],[homophobe,minus],[homophones,equals],[homophonous,equals],[homosexual,person],[homosexuality,equals],[homosexuals,person],[honduras,country],[hone,square],[honed,plus],[honest,plus],[honestly,plus],[honesty,plus],[honey,],[honeycomb,hexagon],[hong,city],[honing,square],[honked,horn],[honking,horn],[honky,ball],[honorary,plus],[honorific,square],[honour,a],[honourable,a],[honourably,plus],[honoured,a],[honouringly,plus],[honours,book],[hood,person],[hook,],[hooked,hook],[hooks,hook],[hooky,hook],[hoop,],[hoops,hoop],[hop,up],[hope,plus],[hoped,plus],[hopeful,plus],[hopefully,a],[hopefully,plus],[hopes,plus],[hopetoun,city],[hoping,box],[hopkins,person],[hopping,right],[hops,right],[horace,person],[horizon,line],[horizons,horizon],[horizontal,line],[horizontally,line],[hormone,ball],[hormones,box],[horn,],[horoscope,moon],[horoscopes,plus],[horrific,minus],[horse,],[horseman,person],[horsemen,person],[horses,horse],[horticulture,plant],[hose,tube],[hosing,tube],[hospital,building],[hospitalisation,plus],[hospitalisations,plus],[hospitality,plus],[hospitals,building],[host,computer],[hosted,plus],[hostel,building],[hostility,minus],[hosting,box],[hot,n],[hotel,building],[hotline,n],[hotspots,square],[hotter,up],[houghton,person],[hour,square],[hourglass,],[hourlinesses,n],[hourly,n],[hours,square],[hoursprayer,square],[hourvaluea,variable
],[hourvalueb,variable],[house,],[housed,house],[household,house],[housekeeper,person],[housekeeping,plus],[houses,house],[housing,house],[hove,person],[hover,up],[hovercar,],[how,dash],[howard,person],[howardfinestudio,company],[howdy,plus],[howe,person],[howell,company],[howells,company],[however,minus],[hows,person],[howzat,ball],[hpc,computer],[hq,plus],[hr,book],[href,dash],[hs,variable],[hsc,paper],[ht,dash],[html,paper],[htmlpreview,square],[http,tick],[httpd,software],[https,tick],[hu,person],[hub,box],[hubpages,company],[hubspot,company],[huckleys,building],[hudders,company],[hudson,company],[hug,],[huge,n],[huggable,person],[hugged,hug],[hugging,person],[hughes,person],[hugo,person],[huh,minus],[hull,box],[hulpke,person],[hum,square],[human,person],[humane,plus],[humanism,book],[humanist,person],[humanistic,person],[humanists,person],[humanitarian,person],[humanitas,book],[humanities,building],[humanity,person],[humankind,person],[humanness,plus],[humanoid,person],[humans,person],[humble,plus],[humbled,plus],[humbleness,plus],[humbling,plus],[humiliate,minus],[humility,plus],[humming,lip],[hummus,],[humorous,plus],[humour,plus],[humourous,plus],[humphrey,person],[humphries,person],[humpy,],[hun,person],[hundred,square],[hundreds,threerectangle],[hung,down],[hungary,country],[hunger,apple],[hungry,tongue],[hunt,eye],[hunted,dot],[huprich,person],[hurdle,box],[hurdles,hurdle],[hurt,minus],[hurting,minus],[hurtled,right],[hurtness,minus],[hurts,minus],[husband,person],[husbands,person],[hush,zero],[huskies,dog],[husky,dog],[husserl,person],[huston,person],[hut,building],[hutchings,person],[hutchinson,person],[hvuetwgillii,dash],[hw,paper],[hyam,person],[hybrid,two],[hybrids,plus],[hydrangeas,flower],[hydrant,water],[hydrated,water],[hydrochloric,hcl],[hydrogen,],[hydroponics,water],[hygiene,plus],[hygienic,plus],[hyle,box],[hymns,book],[hype,minus],[hyper,right],[hyperbole,up],[hypercard,software],[hyperlinks,right],[hyperlog,square],[hypermarket,dollarsymbol
],[hypersensitive,minus],[hypervacillations,right],[hyphen,],[hypochondria,minus],[hypotenuse,line],[hypothalamus,box],[hypotheses,tick],[hypothesis,one],[hypothesise,one],[hypothesised,square],[hypothetical,square],[hysteresis,dash],[hysterically,plus],[hz,n],[i,person],[ia,plus],[iable,dash],[iacit,right],[iaclcbgasyhq,dash],[ian,person],[iata,book],[iax,dash],[ibid,book],[ibidem,right],[ibis,],[ibises,ibis],[ibisworld,company],[ibooks,and],[ice,person],[icecream,],[iceland,country],[icicle,],[icing,],[icon,square],[iconism,square],[icons,square],[icsamined,plus],[ict,computer],[icy,ice],[id,square],[idahobit,plus],[idea,a],[ideal,plus],[idealism,circle],[idealist,person],[ideality,plus],[ideally,plus],[ideas,tworectangle],[ideation,square],[idemic,dash],[identical,equals],[identicalness,equals],[identifiable,plus],[identification,square],[identified,eye],[identifier,n],[identifiers,square],[identifies,tick],[identify,a],[identifying,n],[identifywriters,person],[identigenitrix,plus],[identities,person],[identity,square],[ideological,plus],[ideologies,book],[ideology,square],[idiom,square],[idiosyncrasies,minus],[idiot,person],[idiots,person],[idle,zero],[ids,square],[idyll,flower],[ie,person],[ieee,organisation],[if,dash],[ifcn,dash],[ifind,plus],[iframe,square],[ifs,right],[igf,dash],[iglesias,person],[ignite,plus],[ignition,right],[ignoramus,person],[ignorance,minus],[ignore,xcorrection],[ignored,xcorrection],[ignores,xcorrection],[ignoring,xcorrection],[igo,company],[igual,dash],[ih,dash],[iherb,company],[ii,tworectangle],[iie,computer],[iii,dash],[iio,dash],[ij,square],[ikheloa,person],[ikinger,person],[il,square],[ilham,person],[ill,minus],[illegal,minus],[illegally,minus],[illicit,minus],[illinois,city],[illness,minus],[illnesses,minus],[illocutionary,plus],[illuminated,light],[illuminati,person],[illusion,plus],[illusory,minus],[illustator,software],[illustrate,square],[illustrated,square],[illustrates,square],[illustration,square],[illustrations,square],[i
llustrative,square],[illustrator,software],[illustrious,plus],[ilo,plus],[ilpi,right],[ilya,person],[im,company],[imac,computer],[image,square],[imagen,dash],[imageric,square],[imagery,square],[images,square],[imaginable,square],[imaginary,plus],[imagination,plus],[imaginative,plus],[imagine,square],[imagined,plus],[imagines,plus],[imaging,square],[imagining,square],[imaiangreen,person],[imbalance,minus],[imbalances,minus],[imc,company],[img,square],[imitate,right],[imitated,right],[imitates,right],[imitating,right],[imitation,box],[imm,plus],[immature,minus],[immaturity,minus],[immediate,zero],[immediately,one],[immediatism,plus],[immersed,down],[immersing,down],[immersion,down],[immigration,right],[imminent,zero],[imminently,n],[immoral,minus],[immorality,minus],[immortal,n],[immortalise,plus],[immortality,n],[immortals,person],[immune,plus],[immunity,plus],[immunogymnastics,plus],[immunology,book],[immutability,dot],[immutable,right],[immutably,right],[imogen,person],[impact,tick],[impacted,tick],[impactful,a],[impacts,plus],[impair,minus],[impaired,minus],[impairment,minus],[impairments,minus],[impart,square],[impartial,plus],[impeccable,plus],[impede,minus],[impediment,minus],[impediments,minus],[impending,zero],[impenetrable,xcorrection],[imperative,right],[imperatives,right],[imperceptible,dot],[imperceptibly,zero],[imperfect,minus],[imperfections,minus],[imperial,person],[imperialism,country],[impersonal,minus],[impetus,plus],[impinge,xcorrection],[implant,box],[implanted,down],[implants,box],[implement,plus],[implementation,plus],[implementations,and],[implemented,right],[implementing,and],[implements,right],[implicated,right],[implication,right],[implications,right],[implicit,down],[implicitly,box],[implied,right],[implies,right],[imply,right],[implying,right],[import,right],[importance,plus],[important,plus],[importantly,plus],[imported,down],[importing,down],[imports,right],[imposed,minus],[impossible,minus],[impractical,xcorrection],[impress,plus],[impr
essed,plus],[impression,square],[impressionably,plus],[impressionism,square],[impressive,plus],[improper,minus],[improperly,minus],[impropriety,minus],[improve,plus],[improved,up],[improvement,plus],[improvements,plus],[improves,plus],[improving,plus],[impulse,tick],[impurities,minus],[impurity,minus],[in,square],[inability,minus],[inaccessible,minus],[inaction,minus],[inactions,minus],[inactivity,minus],[inadequacies,minus],[inadequate,minus],[inadvertently,minus],[inanimate,zero],[inappropriate,minus],[inbound,right],[inbox,box],[inbuilt,box],[inc,plus],[incan,person],[incapable,minus],[incapacitated,minus],[incentive,plus],[incentives,plus],[inception,zero],[inch,n],[inches,n],[incidence,n],[incident,dot],[incidental,n],[incidents,box],[incision,knife],[incite,mouth],[incl,box],[inclination,plus],[incline,n],[inclish,suitcase],[include,plus],[included,plus],[includes,left],[including,plus],[inclusion,box],[inclusions,dash],[inclusive,box],[inclusively,plus],[inclusiveness,plus],[incognito,dash],[income,dollarsign],[incoming,right],[incompatibility,xcorrection],[incompatible,xcorrection],[incomplete,minus],[incompletely,minus],[incongruous,minus],[inconsequential,plus],[inconsistencies,minus],[inconsistent,minus],[inconvenience,minus],[incorporate,down],[incorporated,right],[incorporating,dot],[incorporation,plus],[incorrect,xcorrection],[incorrectly,minus],[increase,plus],[increased,plus],[increaser,plus],[increases,up],[increasing,plus],[increasingly,up],[incredible,plus],[increment,plus],[incremental,plus],[incrementally,right],[incubator,heater],[ind,up],[indeed,plus],[indefensible,minus],[indefinitely,n],[indefinitive,minus],[indefinitively,xcorrection],[indent,dash],[indentation,right],[indented,right],[indenting,right],[independence,plus],[independent,one],[independently,plus],[index,dash],[indexanimhighered,paper],[indexdownload,paper],[indexed,n],[indexga,dash],[indexing,n],[indexped,paper],[indextexttobr,paper],[india,country],[indian,person],[indiana,pe
rson],[indic,plus],[indicate,a],[indicated,tick],[indicates,up],[indicating,dash],[indicative,plus],[indicator,n],[indicators,i],[indices,n],[indie,plus],[indieradioshow,note],[indigeneity,person],[indigenous,person],[indirect,dot],[indirectly,right],[indispensable,plus],[indiv,person],[individual,person],[individualism,person],[individualistic,person],[individuality,plus],[individually,person],[individualness,person],[individuals,person],[individuation,plus],[indonesia,country],[indonesian,book],[indoor,building],[indoors,building],[indubitably,plus],[induc,up],[induced,right],[induct,plus],[induction,plus],[inductive,up],[inductively,up],[inductor,right],[indulge,plus],[indulging,plus],[industrial,company],[industrialisation,planet],[industries,company],[industry,building],[ineachiated,box],[ineffective,minus],[inefficient,minus],[inepretaer,dash],[inept,minus],[inequalities,greaterthan],[inequality,notequals],[inequity,minus],[inert,zero],[inessive,box],[inevitable,plus],[inexorable,plus],[inexpensive,n],[inexpensively,plus],[inexplicable,minus],[inexplicably,minus],[inextricably,plus],[infact,plus],[infallibility,plus],[infallible,plus],[infamous,minus],[infant,baby],[infants,baby],[infatuated,plus],[infatuations,plus],[infect,minus],[infected,minus],[infection,virus],[infections,minus],[infer,right],[inference,right],[inferenced,right],[inferences,right],[inferential,right],[inferred,right],[infers,right],[infertile,minus],[infiltrate,right],[infiltrating,xcorrection],[infinary,n],[infinite,circle],[infinitely,n],[infinitesimally,n],[infinitive,right],[infinitum,n],[infinity,n],[infix,plus],[inflammation,minus],[inflated,ball],[inflating,up],[inflation,n],[inflection,up],[inflow,right],[influence,person],[influenced,right],[influencers,person],[influences,right],[influencing,plus],[influential,plus],[influenza,virus],[influx,right],[info,paper],[infopreneur,person],[infopreneurs,person],[infopreneursummit,square],[inform,mouth],[informal,plus],[informally,minus
],[information,paper],[informational,square],[informative,square],[informed,plus],[informer,person],[informing,plus],[informs,square],[infra,n],[infrastructure,box],[infringement,minus],[infringers,minus],[infringing,minus],[infuse,down],[infused,plus],[infusion,down],[ing,right],[ingenue,plus],[ingenuity,plus],[ingest,stomach],[inglish,hinge],[ingredient,apple],[ingredients,apple],[ingrid,person],[ingrown,minus],[ingénue,dash],[inhabit,down],[inhabitant,person],[inhabitants,person],[inhabiting,plus],[inhalation,lung],[inhale,air],[inhaled,air],[inherent,plus],[inherentana,plus],[inherently,plus],[inherit,left],[inherited,left],[inhibit,minus],[inhibiting,xcorrection],[inhibits,xcorrection],[inhumane,minus],[init,one],[initial,one],[initialisation,zero],[initialise,zero],[initialised,zero],[initialising,plus],[initialization,zero],[initially,one],[initialphilosophy,square],[initials,square],[initiate,one],[initiated,one],[initiates,one],[initiating,zero],[initiation,zero],[initiative,right],[initiatives,plus],[initiator,right],[initpredicate,square],[inject,syringe],[injecting,syringe],[injunctive,box],[injuries,minus],[injury,minus],[injustice,minus],[injustices,minus],[ink,],[inkey,square],[inks,ink],[inky,],[inkyclassic,inky],[inline,dash],[inmates,person],[inmodes,right],[innate,plus],[inner,box],[innocence,plus],[innocent,plus],[innovate,plus],[innovates,plus],[innovating,plus],[innovation,a],[innovations,plus],[innovative,plus],[innovators,plus],[inopportune,minus],[inorder,up],[input,variable],[inputasantecedant,n],[inputfile,paper],[inputs,n],[inputted,n],[inputting,right],[inputtype,square],[inputvalues,n],[inputvarlist,twobox],[inputvars,variable],[inputvarsa,variable],[inquests,line],[inquired,square],[inquirer,person],[inquiries,square],[inquiry,square],[inroads,right],[ins,box],[inscription,pen],[insect,],[insects,insect],[insensitive,minus],[insentient,minus],[inseparability,equals],[inseparable,equals],[insert,box],[insertdoublebackslashbeforequote,bo
x],[inserted,xline],[inserting,down],[insertion,box],[insertions,box],[inserts,box],[inside,box],[insider,person],[insight,eye],[insightcrime,company],[insightful,plus],[insights,plus],[insomnia,minus],[inspect,eye],[inspected,eye],[inspecting,eye],[inspection,eye],[inspector,person],[inspiration,plus],[inspirational,plus],[inspirations,plus],[inspire,right],[inspired,tick],[inspires,plus],[inspiring,plus],[inst,note],[instability,minus],[instagram,company],[install,down],[installation,down],[installations,computer],[installed,down],[installer,down],[installers,down],[installing,down],[installs,right],[instalments,dollarsymbol],[instance,n],[instances,n],[instant,zero],[instantaneously,zero],[instantiate,plus],[instantiated,plus],[instantiation,tick],[instantly,one],[instated,plus],[instead,one],[instil,plus],[instincts,plus],[institute,company],[instituted,plus],[instituting,plus],[institution,organisation],[institutional,building],[institutionalised,plus],[institutions,building],[instr,note],[instruct,tick],[instructed,paper],[instructing,square],[instruction,square],[instructional,square],[instructions,paper],[instructor,person],[instructors,person],[instructs,paper],[instructure,square],[instrument,pen],[instrumental,guitar],[instrumentals,note],[instrumentation,note],[instrumentlist,square],[instrumentnumber,n],[instruments,violin],[insufficient,xcorrection],[insufficiently,xcorrection],[insulator,box],[insulin,box],[insult,minus],[insulted,minus],[insurance,paper],[int,n],[intact,line],[intactify,plus],[intake,right],[intangible,square],[integer,n],[integers,n],[integrate,one],[integrated,plus],[integrates,box],[integrating,down],[integration,box],[integrations,box],[integrative,dash],[integrator,equals],[integrity,plus],[integument,right],[integumentary,skin],[intell,plus],[intellect,plus],[intellectual,plus],[intellectually,brain],[intellectuals,person],[intelligence,plus],[intelligences,square],[intelligent,plus],[intelligently,and],[intend,dash],[intendant
,person],[intended,right],[intends,right],[intense,n],[intensify,plus],[intensity,n],[intensive,a],[intent,plus],[intention,right],[intentional,plus],[intentions,square],[inter,dash],[interact,plus],[interacted,dash],[interacting,tick],[interaction,plus],[interactions,right],[interactive,square],[interactively,plus],[interacts,right],[intercept,c],[intercepted,equals],[intercepting,box],[intercepts,n],[interchange,right],[interchangeable,box],[interchangeably,right],[interchanged,right],[interchanges,right],[intercommunicated,right],[interconnect,right],[interconnectable,right],[interconnected,right],[interconnectedness,plus],[interconnection,right],[interconnections,right],[intercourse,square],[interdependence,plus],[interdependent,right],[interdisciplinary,tworectangle],[interest,plus],[interested,plus],[interesting,tick],[interestingengineering,company],[interestingly,plus],[interestingness,plus],[interests,square],[interface,square],[interfaced,square],[interfaces,square],[interfacing,equals],[interference,minus],[intergalactic,right],[intergalactically,dot],[intergenre,right],[interim,n],[interior,room],[interjected,mouth],[interleaved,line],[interline,square],[interludes,line],[intermediate,box],[intermingled,right],[intermittent,n],[intermittently,dot],[intermix,box],[intern,person],[internal,square],[internalised,box],[internally,box],[international,planet],[internationalisation,ball],[internationalised,square],[internationally,ball],[internet,book],[internetics,company],[internship,paper],[interobject,right],[interp,and],[interpersonal,person],[interplanetary,planet],[interplay,right],[interpolate,right],[interpolating,right],[interpolation,right],[interpred,square],[interpret,person],[interpretable,plus],[interpretation,paper],[interpretations,plus],[interpretative,plus],[interpretbody,square],[interpretbodylp,square],[interpreted,dash],[interpreter,person],[interpreters,square],[interpreting,right],[interpretpart,dash],[interprets,dash],[interpretstatemen
t,dash],[interpretstatementlp,right],[interps,equals],[interquartile,minus],[interrelate,right],[interrelated,right],[interrelatedness,dash],[interrelation,dash],[interrelations,right],[interrelationship,right],[interrelationships,right],[interrogate,xcorrection],[interrogated,square],[interrupt,box],[interrupted,minus],[interrupting,box],[interruption,minus],[interruptions,minus],[intersect,box],[intersected,box],[intersecting,box],[intersection,and],[intersectional,box],[intersectionality,box],[intersections,box],[intersex,two],[interspecies,dash],[interspersed,right],[intersplicings,plus],[interstate,right],[interstellar,ball],[interstellarphone,right],[interstitial,box],[interstitium,box],[intersubjectively,plus],[intersubjectivity,right],[intertextual,right],[intertextualise,right],[intertextualised,right],[intertextuality,box],[intertwine,two],[intertwined,line],[intertwining,right],[intertwiningly,right],[interval,minus],[intervals,minus],[intervene,xcorrection],[intervention,box],[interventions,xcorrection],[interview,dash],[interviewed,person],[interviewer,person],[interviewers,person],[interviewing,square],[interviews,person],[interweaved,right],[interweaving,right],[intestinal,tube],[intestine,],[intestines,],[intimacy,equals],[intimate,equals],[intimating,plus],[intimation,plus],[intmath,equals],[into,square],[intolerance,xcorrection],[intoxicated,minus],[intoxicating,minus],[intra,right],[intranet,square],[intransitive,dot],[intrepid,plus],[intricate,square],[intrigued,plus],[intriguing,plus],[intrinsic,box],[intro,square],[introduce,dash],[introduced,one],[introduces,one],[introducing,dash],[introduction,square],[introductions,square],[introductory,one],[introns,right],[introversion,plus],[introvert,square],[introverted,minus],[introverts,square],[intructions,paper],[intrusion,minus],[ints,n],[intset,n],[intuition,plus],[intuitive,plus],[intuitively,plus],[intuitiveness,plus],[intuitivity,plus],[invalid,minus],[invalidates,xcorrection],[invariant,equal
s],[invasive,minus],[invasively,minus],[invasiveness,minus],[invent,box],[invented,box],[inventing,box],[invention,and],[inventions,and],[inventive,and],[inventors,person],[inventory,box],[invents,plus],[inverse,right],[inversely,right],[inversion,box],[inversions,left],[invert,right],[inverted,right],[inverter,x],[invertible,left],[inverting,right],[inverts,right],[invest,dollarsymbol],[invested,dollarsymbol],[investigate,plus],[investigated,eye],[investigates,eye],[investigating,eye],[investigation,tick],[investigations,book],[investing,dollarsymbol],[investism,dollarsymbol],[investment,n],[investments,dollarsymbol],[investor,person],[investors,person],[invests,dollarsymbol],[invigilated,plus],[invincibility,plus],[invincible,plus],[invisible,square],[invisibles,zero],[invisibly,zero],[invitation,paper],[invitations,square],[invite,left],[invited,plus],[invites,right],[inviting,left],[invitralised,plus],[invoke,plus],[invoked,box],[invokes,plus],[invoking,plus],[involuntarily,minus],[involuntary,zero],[involutional,equals],[involve,left],[involved,tick],[involvement,plus],[involves,left],[involving,plus],[inwards,right],[io,right],[iodine,ball],[ioi,dash],[ion,ball],[ioo,dash],[ios,square],[ip,n],[ipad,square],[iphone,],[iphones,iphone],[ipn,n],[iq,a],[iqr,minus],[iqrexcel,box],[iran,country],[iranian,person],[iraq,country],[ireland,country],[irigaray,person],[iris,circle],[irish,person],[irjci,dash],[iron,],[ironed,iron],[ironic,right],[ironically,plus],[ironing,down],[ironism,plus],[ironisms,up],[irony,right],[irrational,minus],[irrationality,minus],[irrazionali,minus],[irregular,plus],[irrel,minus],[irrelevant,minus],[irresponsibility,minus],[irreversible,minus],[irritable,minus],[irritating,minus],[irst,dash],[irwin,person],[is,equals],[isaac,person],[isbn,n],[iscomparison,equals],[iscomplex,plus],[iscomplicated,plus],[isdigit,n],[isempty,box],[iser,plus],[ishq,person],[isinfixof,box],[isla,square],[islam,religion],[islamic,person],[island,],[islands,island],[
islist,square],[ism,box],[isn,minus],[iso,paper],[isolate,box],[isolated,box],[isolation,one],[isop,equals],[isoperator,equals],[isplus,plus],[israel,country],[israeli,country],[isreview,book],[iss,book],[isset,equals],[isspace,square],[issue,dash],[issued,plus],[issues,dash],[istituto,building],[isval,equals],[isvalempty,n],[isvalstr,threebox],[isvalstrempty,tworectangle],[isvalstrorundef,box],[isvar,equals],[it,down],[it,up],[italian,book],[italic,line],[italicised,square],[italics,square],[italy,country],[itch,minus],[itchiness,minus],[item,n],[itema,n],[itema,square],[itemid,n],[itemised,box],[itemlabels,square],[itemname,square],[itemnumber,n],[items,n],[itemsa,n],[itemsa,square],[iterate,n],[iterated,n],[iteration,n],[iterations,n],[iterative,n],[iterativedeepeningdepth,variable],[ith,dash],[itin,right],[itinerary,square],[itness,plus],[its,up],[itself,up],[itunes,software],[iup,book],[iv,fourrectangle],[ivci,line],[ivf,right],[ivison,person],[ivle,square],[ivn,variable],[ivns,variable],[ivo,person],[ivoire,country],[ivory,],[iwda,dash],[ixamine,plus],[ixx,n],[izcoatl,person],[izzimokays,plus],[j,],[jack,person],[jackall,person],[jackdaw,bird],[jacket,],[jackman,person],[jackova,person],[jackson,person],[jacob,person],[jacqueline,person],[jacques,person],[jag,person],[jagged,minus],[jail,building],[jailed,xcorrection],[jailings,xcorrection],[jairo,person],[jakat,person],[jake,person],[jam,],[jamaica,country],[james,person],[jameson,person],[jamesroyalmelbournehospital,person],[jamie,person],[jan,person],[jane,person],[janette,person],[jangle,plus],[january,square],[japan,country],[japanese,book],[japonica,flower],[jar,],[jarful,jar],[jargon,square],[jarrah,person],[jarret,person],[jarvis,person],[jase,person],[jaskula,person],[jasmine,person],[jason,person],[jasp,person],[java,and],[javascript,square],[jaw,],[jay,person],[jaynes,person],[jazz,note],[jazzed,note],[jb,dash],[jc,variable],[jcep,company],[jcm,dash],[je,person],[jealous,minus],[jed,person],[jeffrey
,person],[jeffs,person],[jekyll,person],[jellioes,lolly],[jelly,],[jeloquated,plus],[jelucian,person],[jennifer,person],[jenny,person],[jensen,person],[jeopardy,minus],[jepsen,person],[jeremy,person],[jerked,right],[jerome,person],[jerry,person],[jersey,dot],[jess,person],[jesse,person],[jessica,person],[jest,plus],[jester,person],[jesuit,person],[jesus,person],[jet,plane],[jetde,dash],[jets,jet],[jettilise,xcorrection],[jewel,],[jewels,jewel],[jewish,person],[jews,person],[jezebel,person],[jf,dash],[jhrktec,dash],[jian,ball],[jig,square],[jiggled,up],[jiggling,up],[jigsaw,square],[jill,person],[jillian,person],[jilted,minus],[jim,person],[jimbo,person],[jimenez,person],[jimmy,person],[jimoh,person],[jing,person],[jingles,note],[jirga,person],[jit,box],[jitterbug,bug],[jive,note],[jives,note],[jl,variable],[jmir,dash],[jms,dash],[jn,dash],[jo,person],[joan,person],[job,person],[jobmedicine,plus],[jobs,person],[joe,person],[joel,person],[joensuu,dash],[joey,person],[jog,right],[jogged,right],[jogging,right],[john,person],[johnson,person],[join,tick],[joined,plus],[joiner,plus],[joining,tworectangle],[joins,box],[joint,box],[joints,dot],[jointure,equals],[joke,plus],[joked,plus],[jokers,person],[jokes,plus],[jokester,person],[joking,plus],[jolt,xcorrection],[jompelise,dash],[jonason,person],[jonathan,person],[jones,person],[jong,person],[jonno,person],[jonny,person],[jonquil,flower],[joon,person],[jordan,person],[jose,person],[joseph,person],[josh,person],[joshii,person],[jost,person],[jostle,minus],[josé,person],[journal,book],[journalcode,n],[journalism,paper],[journalist,person],[journals,book],[journey,path],[journeyed,right],[journeys,right],[jovial,plus],[joy,plus],[joyful,plus],[joyfully,plus],[joyously,plus],[joysticks,rod],[jp,dash],[jpeg,square],[jpg,square],[jquery,plus],[jr,dash],[js,square],[jsbasic,and],[json,dash],[jt,dash],[jtgz,dash],[jube,],[jubilacious,plus],[jubilant,plus],[jubilantly,plus],[judge,person],[judged,plus],[judgement,plus],[judgements,
tick],[judges,person],[judging,plus],[judgment,],[judgmentally,plus],[judgments,square],[judy,person],[jug,],[jughead,person],[juice,],[juiced,apple],[juices,],[juicily,juice],[juicing,apple],[juicy,water],[jul,square],[julia,person],[julian,person],[julie,person],[juliet,person],[july,square],[julz,person],[jump,up],[jumped,right],[jumper,],[jumping,up],[jumps,up],[jun,dash],[junction,equals],[june,square],[jungian,person],[jungle,tree],[juni,person],[junior,down],[juniper,],[jupiter,planet],[jupyter,software],[jurisdiction,square],[jury,person],[just,one],[justice,plus],[justices,person],[justifiable,plus],[justification,plus],[justified,plus],[justify,right],[justifying,plus],[justin,person],[justly,plus],[juvenile,person],[juxtaposing,square],[jvcf,dash],[jx,square],[jyfz,dash],[jyotish,plus],[k,variable],[ka,n],[kadenze,company],[kadyrov,person],[kagan,person],[kakkonen,person],[kaleidoscope,],[kalff,person],[kalimba,box],[kamahen,person],[kamil,person],[kanai,person],[kangaroo,],[kant,person],[kantian,person],[karagoz,person],[karamazov,person],[kardinia,dot],[karen,person],[karena,person],[karl,person],[karlsson,person],[karma,plus],[karp,person],[karsten,person],[kastaniotis,person],[kat,person],[kataria,person],[kate,person],[katerelos,person],[katherine,person],[kathleea,person],[kathleen,person],[kathrin,person],[kathryn,person],[kathy,person],[katie,person],[katy,person],[katz,person],[kaufmann,person],[kavussanu,person],[kazakhstan,country],[kb,box],[kdp,software],[keats,person],[keep,one],[keeper,person],[keeping,box],[keeps,right],[keiko,person],[keith,person],[kelly,person],[kelsey,person],[kelson,person],[ken,person],[kennedy,person],[kenny,person],[kenya,country],[kenzhaliyev,person],[kept,plus],[ker,plus],[keras,and],[kernel,and],[kernie,right],[kerry,person],[kettle,],[kevin,person],[key,],[keyboard,],[keyboards,keyboard],[keyhole,],[keypad,box],[keypress,box],[keypresses,box],[keys,key],[keysort,right],[keystroke,square],[keystrokes,square],[key
value,n],[keyword,square],[keywords,square],[kfc,company],[kg,dot],[khamenei,person],[khan,person],[khurram,person],[khz,f],[kick,foot],[kickboard,],[kicked,foot],[kicking,leg],[kid,child],[kidding,plus],[kidman,person],[kidney,],[kidneys,kidney],[kids,child],[kie,dash],[kilda,person],[kill,minus],[killed,minus],[killer,person],[killers,person],[killing,minus],[kim,person],[kimberley,person],[kinaesthetic,hand],[kind,variable],[kinder,plus],[kindergarten,building],[kindle,product],[kindling,stick],[kindly,plus],[kindness,plus],[kinds,box],[kinesthetic,right],[kinetic,right],[king,person],[kingdom,person],[kingdoms,person],[kinglish,paper],[kings,person],[kinin,right],[kip,person],[kiribati,country],[kirsten,person],[kirsty,person],[kiss,plus],[kissed,mouth],[kissing,mouth],[kit,box],[kitchen,person],[kitching,person],[kite,],[kitt,person],[kitten,],[kitts,country],[kiwi,],[kizlik,person],[kjmdxpeaw,dash],[kkft,dash],[kl,square],[klaussinani,person],[klein,person],[klenowski,person],[km,plus],[kmlf,building],[knaves,person],[knee,],[knees,knee],[knelt,down],[knew,tick],[knickerbocker,plus],[knife,],[knight,person],[knights,person],[knit,right],[knitting,fabric],[knived,knife],[knn,n],[knns,equals],[knoch,person],[knock,equals],[knocked,right],[knocking,equals],[knockout,right],[knot,right],[knots,dot],[know,person],[knowing,tick],[knowingly,plus],[knowledge,book],[knowledgeable,plus],[known,plus],[knowness,plus],[knownness,plus],[knowns,plus],[knows,plus],[ko,person],[kobylarz,person],[koch,person],[kodály,person],[kohlberg,person],[kola,plant],[kolaci,person],[kolmogorov,person],[kolt,person],[kompozer,pen],[kong,city],[konstantatos,person],[kookaburra,bird],[korea,country],[korean,country],[korevaar,person],[kost,person],[kostakidis,person],[koto,box],[kpi,plus],[kpis,tick],[kr,square],[krezi,person],[kripke,person],[krishna,person],[kritikos,person],[kronberg,person],[kteily,person],[kumar,person],[kundalini,dot],[kunzite,stone],[kurt,person],[kuwait,country],[kva
le,person],[kw,square],[kwlk,dash],[kyhmsopfy,dash],[kyrgyzstan,country],[l,variable],[la,building],[lab,room],[label,square],[labelall,square],[labeled,square],[labeling,square],[labelled,square],[labelling,square],[labels,square],[lability,minus],[labor,room],[laboratories,room],[laboratory,room],[laborious,dash],[laboriously,minus],[labors,plus],[labour,hand],[labourer,person],[labrador,dog],[labs,room],[laced,right],[laces,shoelaces],[lachlan,person],[lachy,person],[lacing,right],[lack,minus],[lacked,minus],[lacking,minus],[lacom,paper],[lacomfile,square],[lacrosse,ball],[lactic,box],[lactose,],[ladbury,person],[ladder,],[ladders,ladder],[laden,down],[ladies,person],[lady,person],[lag,n],[lagging,minus],[lah,dot],[lai,person],[laid,down],[laissez,plus],[lake,water],[lakeside,right],[lama,person],[lamb,person],[lambda,square],[lambropoulos,person],[lamp,light],[lan,square],[lancefield,person],[land,],[landed,down],[landing,square],[landmarks,dot],[lands,square],[landslide,down],[lane,],[lanes,lane],[lang,paper],[langprolog,and],[langs,paper],[language,book],[languageprolog,and],[languages,book],[lanka,country],[lanolin,],[lanyard,paper],[lanzer,person],[laos,country],[laoshi,person],[laozi,person],[lap,box],[lapel,],[lapels,lapel],[lapis,],[lapping,water],[laptop,],[lareau,person],[large,n],[largely,plus],[largenesses,plus],[larger,greaterthan],[largest,n],[larrikin,person],[larrikinism,plus],[larry,person],[laryngitis,minus],[larynx,box],[las,note],[lasagne,],[lascelles,right],[lasso,],[lassoed,lassoo],[lassoo,],[last,n],[lastchar,square],[lastcharisaid,square],[lastcharpartnersaid,square],[lastcstrings,variable],[lasted,n],[lasting,plus],[lastline,line],[lastly,n],[lastname,square],[lastperson,person],[lastrule,square],[lasts,n],[lastsyllable,square],[lat,dash],[latch,equals],[late,minus],[latency,n],[later,two],[lateral,right],[latest,n],[latex,square],[lather,person],[latin,book],[latitude,n],[latter,two],[latvia,country],[lau,person],[laugh,plus],[laughed,pl
us],[laughing,plus],[laughter,a],[launch,up],[launched,up],[launches,up],[launching,plus],[launchpad,and],[laundry,water],[laura,person],[laurel,flower],[laurels,flower],[lavender,],[law,square],[lawful,square],[lawfully,plus],[lawn,],[lawrence,person],[laws,and],[lawsuit,book],[lawyer,person],[lawyers,person],[lay,down],[layer,box],[layered,box],[layering,line],[layers,box],[laying,box],[layout,paper],[lays,down],[lazuli,],[lazy,minus],[lb,dash],[lc,v],[lcms,and],[lcritique,variable],[lcs,variable],[lctd,square],[lctda,variable],[ldmg,person],[le,dash],[lead,plus],[leader,person],[leadermode,right],[leaders,person],[leadership,person],[leading,one],[leads,right],[leaf,],[leaflet,paper],[league,dash],[leagues,dash],[leah,person],[lean,line],[leaning,down],[leanne,person],[leant,arm],[leap,up],[leaped,right],[leaper,person],[leaps,right],[leard,person],[learn,tick],[learned,plus],[learner,person],[learners,person],[learning,book],[learnmeditationnow,line],[learns,plus],[learnt,plus],[learntd,square],[learntm,square],[learnty,square],[lease,dollarsymbol],[leash,],[least,n],[leather,box],[leave,right],[leaves,tick],[leaving,right],[lebanon,country],[lec,mouth],[lechte,person],[lecs,book],[lectern,],[lects,book],[lecture,paper],[lectured,book],[lecturer,person],[lecturers,person],[lectures,paper],[lecturing,mouth],[led,right],[leder,person],[ledge,box],[ledger,book],[ledwidge,person],[lee,person],[leece,dash],[leeway,right],[left,],[leftbracket,],[leftmost,left],[leftover,n],[leftovers,box],[leg,line],[legacy,paper],[legal,square],[legalities,book],[legally,plus],[legible,square],[legislating,plus],[legislation,square],[legislative,book],[legitimate,plus],[legs,leg],[leh,dash],[leianne,person],[leila,person],[leisure,plus],[lejos,dash],[lem,variable],[lemanis,person],[lemon,],[lemonade,],[len,n],[lend,hand],[lending,right],[lends,plus],[length,twobox],[lengthed,n],[lengthened,right],[lengthmeditatorsanddoctors,n],[lengthphil,n],[lengths,n],[lengtht,variable],[lengthways
,right],[lengthwise,line],[lengthy,n],[lenient,plus],[lenin,person],[lens,],[lenses,lens],[lent,right],[lentil,],[leo,person],[leon,person],[leonard,person],[leonardo,person],[leonardodavinci,duck],[leone,country],[leorke,person],[leprosy,xcorrection],[ler,square],[les,person],[leslie,person],[lesotho,country],[less,minus],[lessened,down],[lessening,down],[lesser,down],[lesson,paper],[lessons,square],[lessthan,],[leste,country],[let,equals],[lets,dash],[letter,paper],[lettera,square],[letterbox,],[letterc,variable],[lettering,square],[letters,paper],[letting,plus],[lettuce,],[lettuces,lettuce],[leu,variable],[leukaemia,minus],[lev,dash],[level,square],[levelled,n],[levels,tworectangle],[lever,rod],[leverage,right],[leveraged,plus],[leveraging,plus],[levi,person],[levinas,person],[levine,person],[levinthal,person],[levitate,up],[levity,minus],[levy,person],[lewin,person],[lewis,person],[lex,square],[lexicon,square],[lexicool,dash],[ley,variable],[leye,person],[lf,square],[lff,right],[lfl,dash],[lg,person],[lgalgs,box],[lgbtiq,person],[lgbtiqa,person],[lgbtiqas,person],[lgbtqi,person],[lgbtqia,person],[lgreen,person],[lgtext,box],[li,person],[liabilities,minus],[liability,minus],[liable,minus],[liaise,plus],[liaison,person],[liaisons,person],[lib,book],[libby,person],[libdyld,book],[libedit,pen],[liberal,plus],[liberalism,book],[liberality,plus],[liberally,plus],[liberation,plus],[liberia,country],[libexec,book],[librarian,person],[librarians,person],[libraries,book],[library,book],[libs,book],[libswipl,book],[libsystem,book],[libya,country],[licence,square],[licences,square],[license,paper],[licensed,paper],[licenses,paper],[licensing,square],[liceo,person],[lick,tongue],[licked,tongue],[licking,tongue],[licks,tongue],[licky,tongue],[licorice,],[liculia,plus],[lid,],[lie,minus],[liechtenstein,country],[lied,minus],[liegle,person],[lien,person],[lies,minus],[lieutenant,person],[liew,person],[life,person],[lifeless,minus],[lifeline,line],[lifelong,n],[lifelongness,plus
],[lifespan,n],[lifespans,line],[lifestyle,dot],[lifetime,line],[lifetimes,line],[lift,up],[lifted,up],[lifter,up],[lifting,up],[lifts,up],[ligaments,right],[light,line],[lighted,dot],[lighter,n],[lightest,n],[lighthouse,],[lighting,light],[lightly,n],[lightness,n],[lightning,down],[lights,light],[likable,plus],[like,plus],[liked,plus],[likelihood,plus],[likely,plus],[likened,plus],[likeness,plus],[likenesses,plus],[likes,plus],[likewise,plus],[liking,plus],[lilibet,person],[lilo,],[lilting,up],[lily,flower],[lim,square],[limahl,person],[limbered,plus],[limbs,arm],[limit,n],[limitation,right],[limitations,minus],[limited,one],[limiter,n],[limiting,n],[limits,n],[limousine,],[limreached,n],[lin,person],[lincoln,person],[lindy,person],[line,],[lineage,line],[linear,line],[lineareqn,equals],[linearity,right],[linearly,line],[lined,plane],[linen,fabric],[liner,box],[lines,line],[lineup,line],[ling,person],[lingard,person],[lingua,paper],[linguistic,paper],[linguistically,paper],[linguistics,paper],[lining,box],[link,dash],[linkclick,right],[linked,dash],[linkedin,company],[linker,right],[linking,right],[links,dash],[linn,person],[lino,],[linode,dash],[linux,operatingsystem],[lion,],[lioness,lion],[lions,lion],[lip,mouth],[lipid,fat],[lipids,ball],[lipinit,zero],[lips,n],[lipstick,],[lipv,tick],[liquefied,water],[liquid,],[liquids,liquid],[lisa,person],[lisp,minus],[list,tworectangle],[lista,variable],[listb,variable],[listc,variable],[listd,variable],[listed,square],[listen,ear],[listenable,note],[listened,ear],[listener,person],[listeners,person],[listening,ear],[lister,person],[listhead,square],[listing,paper],[listings,square],[listitemnumber,n],[listlessness,minus],[listof,square],[listoflists,variable],[listoutputs,square],[listprolog,software],[listprologinterpreter,equals],[listrecursion,circle],[listrest,square],[lists,tworectangle],[listsorvars,or],[lit,light],[literacy,a],[literally,equals],[literariness,book],[literary,book],[literature,book],[lithium,],[lith
uania,country],[lithuanian,book],[litigation,xcorrection],[litre,n],[litres,n],[little,n],[liturgical,book],[liu,person],[liv,organisation],[live,person],[lived,plus],[livelihood,plus],[livelihoods,plus],[lively,plus],[liver,],[lives,line],[living,plus],[livn,variable],[lizard,],[ll,right],[llb,paper],[llc,company],[lld,variable],[llen,variable],[llist,square],[llistitem,square],[llm,variable],[llms,and],[lloyd,person],[llvm,and],[lly,variable],[lm,person],[lmedic,book],[lms,square],[lmy,company],[ln,variable],[lns,variable],[lo,dash],[load,box],[loaded,up],[loading,up],[loads,up],[loadt,up],[loaf,],[loan,dollarsymbol],[loans,dollarsymbol],[loathe,minus],[lobbying,plus],[lobe,brain],[lobes,ball],[lobster,],[loc,dot],[local,one],[locale,square],[localhost,computer],[locally,suburb],[locals,person],[locate,dot],[located,dot],[locating,dot],[location,dot],[locations,dot],[locative,square],[lock,],[locked,equals],[lockerbie,city],[locking,equals],[locs,n],[locsin,dash],[locus,n],[lodge,house],[lodgment,right],[loeb,person],[log,],[logarithm,square],[logarithmic,square],[logged,paper],[logging,tick],[logic,and],[logical,and],[logicalconjunction,and],[logicaldisjunction,or],[logically,and],[logicalnot,xcorrection],[logicise,and],[logicism,and],[logico,and],[logics,and],[logicus,book],[login,square],[logins,square],[logistics,dash],[logo,square],[logos,square],[logout,down],[logowriter,and],[logs,log],[loiterer,person],[lolled,tongue],[lollies,lolly],[lolling,tongue],[lolliop,],[lollipop,],[lolly,],[london,city],[lone,square],[lonely,minus],[long,line],[longed,plus],[longer,greaterthan],[longest,n],[longestprefix,square],[longevities,n],[longevity,n],[longing,plus],[longingly,plus],[longitude,n],[longitudinal,line],[longness,n],[longtoshortform,right],[longue,person],[longueur,n],[look,eye],[lookahead,right],[lookalike,plus],[looked,eye],[looking,eye],[lookout,eye],[looks,eye],[lookup,eye],[loop,circle],[loopback,left],[looped,circle],[loophole,hole],[looping,circle],[loop
s,circle],[loose,minus],[loosely,plus],[loosened,minus],[looseness,plus],[loosening,right],[lopes,person],[lord,person],[lords,person],[lorelle,person],[loren,person],[lorian,house],[lorikeet,],[lorna,person],[lorraine,person],[lorry,truck],[los,city],[losada,person],[lose,minus],[losers,person],[loses,minus],[losing,minus],[loss,minus],[losses,minus],[lost,minus],[lot,n],[lote,book],[loted,paper],[lots,n],[lotus,flower],[loud,n],[louder,greaterthan],[loudest,n],[louis,person],[louisa,person],[louise,person],[louisegreen,person],[lounge,sofa],[love,heart],[loved,heart],[loveliana,person],[lovely,plus],[lover,person],[lovers,person],[loves,plus],[loving,heart],[lovingkindness,plus],[lovingly,heart],[lovingtude,plus],[low,n],[lower,square],[lowera,a],[lowerb,b],[lowerc,c],[lowercase,square],[lowerd,d],[lowere,e],[lowered,down],[lowerf,f],[lowerg,g],[lowerh,h],[loweri,i],[lowering,down],[lowerj,j],[lowerk,k],[lowerl,l],[lowerm,m],[lowern,n],[lowero,o],[lowerp,p],[lowerq,q],[lowerr,r],[lowers,s],[lowert,t],[loweru,u],[lowerv,v],[lowerw,w],[lowerx,x],[lowery,y],[lowerz,z],[lowest,n],[lowliest,down],[loyal,person],[loyalties,plus],[loyalty,tick],[lozenge,],[lp,variable],[lpcawmp,and],[lpcl,square],[lpconverter,right],[lpd,variable],[lpi,square],[lpiv,and],[lpiverify,tick],[lpm,variable],[lpo,building],[lpp,variable],[lppm,plus],[lpredstestdel,dash],[lpv,tick],[lpverify,plus],[lpy,variable],[lr,square],[lrjj,dash],[ls,variable],[lss,variable],[lst,line],[lstm,and],[lstms,right],[lt,lessthan],[ltbranch,line],[ltd,n],[lth,l],[ltleaf,dash],[ltree,dash],[ltrees,dash],[lu,person],[lub,person],[lubricant,],[lubricated,oil],[lubricating,lubricant],[luc,person],[luca,person],[lucan,person],[lucca,city],[luce,light],[lucia,country],[lucian,person],[lucianacademy,person],[lucianacademyapis,software],[luciancicd,right],[luciandmgreen,person],[luciangreen,person],[luciangreenfringe,person],[luciangreenmusic,note],[luciangreensphilosophyapril,box],[luciangreensphilosophyaugust,box],[lu
ciangreensphilosophydecember,book],[luciangreensphilosophyfebruary,folder],[luciangreensphilosophyjanuary,paper],[luciangreensphilosophyjune,book],[luciangreensphilosophymarch,box],[luciangreensphilosophymay,box],[luciangreensphilosophynotebooks,book],[luciangreensphilosophynovember,paper],[luciangreensphilosophyoctober,box],[luciangreensphilosophyseptember,box],[lucianic,person],[lucianiccommentasacelebrityreport,paper],[lucianiccomputationalenglishreport,paper],[lucianicdaoistheadachemedicinetexttobr,paper],[lucianicmedicinesuccessfulconceptionandpreventmiscarriage,paper],[lucianicmeditation,square],[lucianicmeditationapps,software],[lucianicpedagogyanarchyquiz,paper],[lucianicpedagogycreateandhelpapedagoguereport,paper],[lucianicpedagogylecturerreport,paper],[lucianicpedagogyrecordingsreport,paper],[lucianictexttobr,paper],[lucianism,book],[lucianmantrapureform,square],[lucianmantrasunsafety,square],[lucianos,and],[lucianpedia,paper],[lucianphilosopher,person],[lucianpl,right],[lucians,person],[luciansair,computer],[lucianshand,hand],[lucianshandbitmap,dot],[lucianshd,box],[lucianspedagogy,book],[luciansphilosophy,square],[lucid,plus],[lucien,person],[luck,plus],[luckier,plus],[luckies,plus],[luckily,plus],[lucky,plus],[lucy,person],[lud,variable],[ludo,],[ludocytes,cell],[ludwig,person],[luggage,bag],[lugoondo,plus],[luke,person],[lukewarm,n],[lulang,plus],[lullaby,note],[lulu,person],[lum,variable],[lump,],[lumps,box],[luna,moon],[lunar,moon],[lunch,apple],[lunches,apple],[lunchtime,apple],[lung,],[lungfuls,lung],[lunging,right],[lungs,lung],[lur,dash],[lurking,minus],[lush,water],[lust,dash],[lustful,plus],[lustrous,plus],[lute,],[lutephonics,note],[luu,variable],[luxembourg,country],[luxury,plus],[luy,variable],[luís,person],[ly,square],[lycra,],[lyft,company],[lying,minus],[lymph,],[lymphatic,lymph],[lyn,person],[lynch,person],[lynxlet,and],[lyons,person],[lyotard,],[lyre,],[lyric,square],[lyrics,square],[lyricsa,square],[lyricsv,square],[lysander,person],[l
yzeme,right],[m,box],[ma,dash],[maak,person],[maas,person],[mable,person],[mac,box],[macadamia,nut],[macbook,],[macca,person],[macedonia,country],[macfarlane,person],[mach,person],[machete,],[machiavelli,person],[machiavellian,person],[machiavellianism,minus],[machina,box],[machinations,right],[machine,box],[machinery,and],[machines,and],[macintosh,computer],[macmillan,company],[macos,box],[macpath,square],[macquarie,university],[macqueen,person],[macrae,person],[macro,n],[macrocombos,box],[macroeconomics,dollarsymbol],[macs,computer],[macvim,box],[mad,minus],[madagascar,country],[madam,person],[madang,person],[madcatsound,building],[made,box],[madeleine,person],[madeness,box],[madeup,plus],[madness,minus],[madonna,person],[madrid,city],[maestro,person],[magazine,],[magazines,book],[magda,person],[maggot,],[magic,star],[magical,plus],[magicians,person],[magister,person],[magistero,person],[magistri,person],[magna,n],[magnesium,ball],[magnet,],[magnetic,right],[magnetosphere,ball],[magnificent,plus],[magnifying,multiplication],[magnitude,n],[magnum,plus],[magpie,bird],[magulous,plus],[magut,person],[maharishi,person],[maharishisutra,square],[mahasti,person],[maher,person],[mahoganies,],[mahogany,wood],[maid,person],[mail,paper],[mailbox,box],[mailer,right],[mailing,paper],[mailto,paper],[main,one],[mainloop,circle],[mainly,one],[mainness,plus],[mainrole,person],[mainstream,plus],[maintain,dash],[maintained,plus],[maintainer,person],[maintaining,dash],[maintains,tick],[maintenance,right],[maize,plant],[major,happyface],[majoring,plus],[majority,n],[majorly,plus],[majors,square],[make,box],[makebasecase,square],[makecode,and],[makename,square],[makenames,square],[maker,person],[makerandomlist,square],[makerid,n],[makers,person],[makes,hand],[making,box],[mal,person],[maladministration,minus],[malaecotopics,plus],[malaria,minus],[malatesta,person],[malawi,country],[malaysia,country],[malaysian,person],[malcolm,person],[malcontents,minus],[maldives,country],[male,person]
,[malebranches,person],[maleficence,minus],[males,person],[malfunction,minus],[malfunctioned,minus],[malfunctioning,minus],[malfunctions,minus],[malhesian,person],[mali,country],[malibu,city],[mallet,rod],[malloc,box],[malloy,person],[malnourishment,minus],[malta,country],[malterud,person],[maltodextrin,ball],[malus,apple],[malvern,suburb],[malware,minus],[mambo,square],[mamet,person],[mammary,breast],[man,person],[manage,plus],[manageable,plus],[managed,plus],[management,plus],[manager,person],[managerial,person],[managers,person],[manages,plus],[managing,plus],[manahimhorisit,plus],[manaia,person],[mandarin,],[mandatory,plus],[mandy,person],[mangelsdorf,person],[mangione,person],[mango,],[manifest,square],[manifestation,plus],[manifested,plus],[manifestfile,square],[manifesting,tick],[manifesto,book],[manifests,plus],[manipulate,hand],[manipulated,hand],[manipulates,right],[manipulating,minus],[manipulation,right],[mannequin,],[manner,plus],[mannerisms,right],[manners,plus],[manny,person],[mantainers,person],[mantel,box],[mantelpiece,],[mantelpieces,mantelpiece],[mantissa,n],[mantle,square],[mantra,a],[mantras,square],[manual,book],[manually,hand],[manufacture,box],[manufactured,box],[manufacturer,person],[manufacturers,person],[manufacturing,box],[manup,person],[manuscript,paper],[manuscripts,paper],[manwarring,person],[many,n],[map,square],[maple,tree],[maplist,tworectangle],[maplists,right],[mapm,right],[mapped,map],[mapping,square],[maps,square],[mar,and],[maracas,],[marachusanchuchay,plus],[maraded,plus],[maranatha,square],[marble,ball],[marbles,ball],[marc,person],[marca,person],[march,person],[marched,right],[marchers,right],[marching,right],[mardi,person],[maremma,dog],[margarine,],[margin,line],[marginalised,minus],[margins,n],[maria,person],[mariam,person],[marie,person],[marimba,],[marina,person],[marine,water],[marino,country],[mario,person],[marion,person],[marital,right],[mark,lowera],[markdown,paper],[marked,tick],[markedly,plus],[marker,plus],[mark
er,square],[markers,square],[market,dollarsymbol],[marketed,square],[marketer,person],[marketers,person],[marketing,dollarsymbol],[marketplace,building],[markets,dollarsymbol],[marking,square],[markle,person],[marks,a],[markup,square],[marlebong,city],[marmalade,],[maroon,person],[marquee,],[marriage,two],[marriages,dash],[married,plus],[marrion,person],[marrow,box],[marry,plus],[marrying,right],[mars,planet],[marshall,person],[marshalling,right],[marshmallow,],[marson,person],[martha,person],[martin,person],[martino,person],[martorana,person],[marx,person],[marxian,person],[marxism,book],[marxist,person],[marxs,person],[mary,person],[marzipan,],[masculine,square],[masculinities,book],[masculinity,person],[mashed,down],[mask,],[masking,mask],[masks,mask],[maslow,person],[mason,person],[masonry,organisation],[mass,n],[massage,hand],[massaged,hand],[massaging,hand],[masse,person],[masses,person],[massive,n],[master,person],[mastered,n],[masterer,n],[mastering,n],[masterpiece,square],[masters,person],[mastery,plus],[mat,],[mataura,person],[match,equals],[matched,equals],[matches,equals],[matching,equals],[matchingness,equals],[mate,person],[material,box],[materialisable,plus],[materialisation,plus],[materialise,box],[materialised,box],[materialiser,plus],[materialisers,plus],[materialising,plus],[materialism,box],[materialistic,box],[materials,box],[materna,person],[maternity,baby],[math,plus],[mathematical,plus],[mathematically,equals],[mathematician,person],[mathematicians,person],[mathematics,multiply],[mathison,person],[mathjeopardy,plus],[maths,plus],[mathsisfun,equals],[mathworks,plus],[matics,up],[matilda,person],[matlabcentral,plus],[matriarchs,person],[matrices,n],[matrix,grid],[matt,person],[matter,box],[mattered,plus],[mattering,plus],[matters,plus],[matthew,person],[mattress,],[mature,n],[maturity,n],[mauboy,person],[maude,person],[maunsang,person],[maura,person],[maurice,person],[mauritania,country],[mauritius,country],[mawk,person],[max,n],[maxclauses,var
iable],[maxdepth,n],[maximal,plus],[maximally,a],[maxime,person],[maximise,plus],[maximised,plus],[maximising,plus],[maximum,n],[maxlength,n],[maxn,n],[maxpredicates,variable],[maxs,n],[maxterms,variable],[maxwell,person],[may,dash],[maya,country],[maybe,zero],[maybeapply,plus],[mayeroff,person],[mayfair,robot],[mayhem,minus],[mayonnaise,],[maypole,],[maze,],[mazes,box],[mb,square],[mba,paper],[mbarelation,dash],[mbas,paper],[mbc,dash],[mbti,n],[mc,person],[mcall,dash],[mcawp,and],[mccall,person],[mccallum,person],[mccarthy,person],[mcclure,person],[mccutcheon,person],[mcdermott,person],[mcdonalds,company],[mcewan,person],[mcgrath,person],[mcintyre,person],[mckelvey,person],[mckinsey,person],[mclean,person],[mclellan,person],[mcnally,person],[mcnamara,person],[mcowan,person],[mcpherson,person],[mcq,square],[mcqs,square],[md,person],[mdb,plus],[me,person],[mea,down],[meaf,dash],[meal,apple],[meals,carrot],[mean,dash],[meandered,right],[meaning,apple],[meaningful,plus],[meaningfulness,plus],[meaningless,minus],[meanings,dash],[meanjin,book],[means,dash],[meant,tick],[meantime,line],[meantness,plus],[measurable,n],[measure,n],[measured,n],[measurement,n],[measurements,n],[measures,plus],[measuring,n],[measuringly,n],[meat,minus],[meath,person],[meats,meat],[mechanic,person],[mechanical,right],[mechanically,right],[mechanics,right],[mechanism,right],[mechanisms,right],[med,plus],[medal,],[medi,plus],[media,square],[median,two],[mediated,plus],[mediating,plus],[mediation,plus],[medibank,company],[medic,book],[medical,plus],[medically,book],[medicare,company],[medication,],[medications,switch],[medicinal,tablet],[medicine,paper],[medicinenoreplace,plus],[medicines,box],[medieval,book],[medit,line],[meditate,tick],[meditated,meditator],[meditates,dot],[meditating,meditator],[meditation,meditator],[meditational,plus],[meditationally,mat],[meditationincreasedbrainpotential,up],[meditationindicatordecreasedstress,down],[meditationindicatorincreasedbloodflow,up],[meditationind
icatorlowerriskofcancerandotherdiseasesinworkersandbroadcasters,down],[meditationnoreplace,dot],[meditations,square],[meditationteachersutra,square],[meditative,plus],[meditatively,plus],[meditator,person],[meditators,person],[meditatorsanddoctors,person],[mediterranean,person],[meditnoreplace,right],[medits,person],[medium,box],[medtech,company],[meet,dash],[meeting,dash],[meetings,dash],[meets,dash],[meetup,company],[meg,person],[megalopolises,city],[megan,person],[megapixels,square],[megascreendom,plus],[meghan,person],[mei,person],[mein,person],[meinong,person],[meisner,person],[mel,city],[melancholia,minus],[melancholy,minus],[melanie,person],[melb,city],[melbourne,city],[melchior,],[melinda,person],[melissa,person],[mellitus,minus],[mellow,plus],[melodic,note],[melodies,note],[melody,note],[melodyharmony,note],[melodyinstrumentnumber,n],[melodyinstruments,tube],[melodyinstrumentslength,n],[melodypart,note],[melodyparts,line],[melodypartslength,n],[melon,],[melt,down],[melted,down],[melting,minus],[melts,down],[mem,up],[member,person],[membera,box],[memberchk,box],[memberlp,box],[memberprogram,box],[memberr,tworectangle],[members,person],[membership,paper],[memberships,folder],[membrane,box],[meme,square],[memes,square],[memo,paper],[memorable,n],[memories,box],[memorisation,square],[memorise,tick],[memorised,plus],[memorises,square],[memorising,down],[memory,box],[memphis,city],[men,person],[mena,company],[menace,minus],[menacing,minus],[mend,plus],[mene,plus],[meniscus,n],[meniscuses,meniscus],[menken,person],[menkoff,person],[meno,person],[mens,person],[mensicus,line],[menstrual,blood],[mental,brain],[mentally,brain],[mentee,person],[mentees,person],[mention,dash],[mentioned,dash],[mentioning,square],[mentions,mouth],[mentor,person],[mentored,plus],[mentoring,plus],[mentors,person],[menu,square],[menulog,company],[menus,square],[meontological,box],[meontology,box],[meoworld,sphere],[merch,product],[merchandise,box],[merchant,person],[merchantability,dollarsy
mbol],[merchantid,dash],[merchants,person],[mercurially,mercury],[mercury,],[mercy,plus],[mere,plus],[merely,plus],[merge,one],[merged,right],[mergedtree,tree],[merger,box],[mergers,one],[merges,box],[mergesort,up],[mergetexttobrdict,threerectangle],[merging,right],[meringue,],[meristem,line],[merit,plus],[merited,plus],[meritocracy,plus],[meritocratic,plus],[merleau,person],[merman,person],[meronym,box],[meronymic,equals],[meronymnal,box],[meronyms,box],[merrington,person],[merror,xcorrection],[mes,dash],[mesentery,organ],[mesh,square],[mesmerized,plus],[mess,minus],[message,square],[messager,square],[messages,square],[messaging,square],[messences,ball],[messenger,right],[messing,minus],[messy,minus],[met,dash],[meta,square],[metabolic,right],[metabolise,down],[metabolism,right],[metabolites,ball],[metacognition,circle],[metacognitively,plus],[metadata,square],[metal,],[metallic,square],[metals,metal],[metamucil,plant],[metaphilosophy,book],[metaphor,apple],[metaphorical,box],[metaphorically,right],[metaphorist,person],[metaphors,square],[metaphysical,plus],[metaphysically,box],[metaphysician,person],[metaphysics,dot],[metaprogramming,square],[meter,one],[metered,line],[metering,n],[meters,n],[method,plus],[methodical,plus],[methodically,plus],[methodological,right],[methodologies,right],[methodology,square],[methods,dash],[methuselah,person],[meticulous,plus],[meticulously,plus],[metodologici,plus],[metre,line],[metres,square],[metric,n],[metrics,n],[metrolyrics,site],[mexception,dash],[mexican,country],[mexico,country],[mexit,zero],[mfail,xcorrection],[mfalse,xcorrection],[mfile,paper],[mg,n],[mgrammar,right],[mgs,school],[mhealth,company],[mhr,person],[mhu,dash],[mi,minus],[miami,city],[mic,microphone],[michael,person],[michaelreturnedfromthailand,person],[michaels,person],[michel,person],[michelle,person],[michigan,city],[mick,person],[micky,person],[micro,n],[microbiology,book],[microchip,and],[microchips,and],[microcircuit,and],[microcircuits,and],[microcred,
paper],[microeconomics,dollarsymbol],[microlevel,down],[micromanagement,down],[micromentor,company],[micronesia,country],[microphone,],[microprocessor,and],[microprocessors,and],[microscope,],[microsoft,building],[mid,note],[midday,n],[middle,two],[middles,n],[midgley,person],[midi,note],[midiutils,right],[midlying,dot],[midnight,zero],[midpoint,dot],[mids,note],[midsummer,n],[midway,n],[mifflin,person],[might,dash],[migraines,minus],[migrant,person],[migrate,right],[migrated,right],[migrating,right],[migration,right],[mike,person],[mil,n],[mild,plus],[milder,n],[mildly,plus],[mile,n],[milestone,box],[milestones,dot],[milieu,box],[military,minus],[milk,],[milked,milk],[milking,milk],[milkmaid,person],[milks,milk],[milkshake,milk],[milkshakes,cup],[milky,milk],[millar,person],[millenia,line],[millennia,line],[millennials,person],[miller,person],[millinery,hat],[million,line],[millionaire,person],[millions,n],[millipede,],[millswyn,path],[milne,person],[mimetic,right],[mimic,right],[mimicked,equals],[mimicking,equals],[mimickry,plus],[mimicry,tworectangle],[mimics,two],[min,n],[mince,],[minced,box],[mind,brain],[minded,plus],[mindedness,plus],[mindful,plus],[mindfully,plus],[mindfulness,plus],[minding,plus],[mindmap,dash],[mindmapping,dash],[mindread,left],[mindreader,eye],[mindreading,brain],[mindreadingcaw,and],[mindreadtest,tick],[mindreadtesta,tick],[mindreadtestb,tick],[mindreadtestcharacters,a],[mindreadteste,tick],[mindreadtestlang,plus],[mindreadtestmeditation,up],[mindreadtestminus,minus],[mindreadtestminusunrelatedthought,minus],[mindreadtestmusiccomposer,note],[mindreadtestne,tick],[mindreadtestobj,apple],[mindreadtestpersoncomputer,computer],[mindreadtestpsychiatrist,person],[mindreadtestsec,plus],[mindreadtestsecthoughts,box],[mindreadtestsecurity,xcorrection],[mindreadtestshared,plus],[mindreadtestz,tick],[minds,brain],[mindset,plus],[mindsuite,brain],[mindtools,plus],[mine,down],[mineral,],[minerals,mineral],[minestrone,soup],[mingle,right],[mingliang,p
erson],[mini,n],[miniature,down],[minimal,one],[minimalism,down],[minimalist,box],[minimax,company],[minimaxir,algorithm],[minimisation,down],[minimise,minus],[minimised,down],[minimiser,down],[minimises,down],[minimising,down],[minimum,n],[mining,down],[minions,person],[minister,person],[ministers,person],[ministries,box],[ministry,plus],[minor,sadface],[minorities,person],[minority,n],[mins,square],[mint,plant],[mintzberg,person],[minus,],[minute,square],[minutes,paper],[minutevaluea,variable],[minutevalueb,variable],[mir,dash],[miracle,plus],[miracles,plus],[miraculous,plus],[miraculously,plus],[miraculousness,plus],[miranda,person],[miriam,person],[miriammorris,person],[mirriam,person],[mirror,],[mirrors,mirror],[mirth,happyface],[mis,minus],[misaligned,notequals],[misbehaving,minus],[misbehaviour,minus],[miscalculate,minus],[miscarriage,minus],[miscarriages,minus],[miscellaneous,right],[miscellany,box],[misconceived,minus],[misconduct,minus],[miscounted,minus],[misdoings,minus],[miseducative,minus],[misery,minus],[misfit,minus],[misfortunes,minus],[misguided,minus],[misha,person],[mishandling,minus],[misheard,minus],[mishra,person],[misinformation,minus],[misinformed,minus],[mismatch,notequals],[misnamed,minus],[miso,],[misrepresentation,minus],[misrepresented,minus],[misrepresenting,minus],[miss,person],[missed,minus],[misses,minus],[missing,minus],[mission,plus],[missionaries,person],[missioncloud,right],[missions,right],[misspelled,minus],[misspellings,minus],[mistake,xcorrection],[mistaken,minus],[mistakes,xcorrection],[mistress,person],[mistyped,minus],[mistypes,minus],[misunderstand,minus],[misunderstanding,minus],[misunderstandings,minus],[misunderstood,minus],[misused,minus],[mit,university],[mited,ball],[mitigate,plus],[mitigated,plus],[mitigation,xcorrection],[mitochondria,],[mitsea,person],[mix,dot],[mixed,plus],[mixer,n],[mixes,box],[mixing,n],[mixture,ball],[mixtures,amount],[mkcid,dash],[mkevt,dash],[mkgroupid,dash],[mklykpuce,dash],[mkrid,dash],[
mktemp,variable],[mktype,dash],[ml,dash],[mlb,company],[mlblawyers,company],[mleash,leash],[mls,n],[mm,],[mmc,and],[mn,person],[mnemonic,square],[mnemonics,square],[mnode,dash],[mnws,dash],[mo,person],[mobile,box],[mobiles,iphone],[mobility,right],[mock,plus],[mocktail,glass],[mockup,plus],[mod,n],[modal,n],[modalities,m],[modalities,n],[modality,n],[mode,b],[moded,right],[model,box],[modeled,box],[modelled,right],[modelling,box],[models,and],[moderate,plus],[moderated,plus],[moderately,plus],[moderation,plus],[modern,plus],[modernist,plus],[modernity,plus],[moderns,person],[modes,n],[modespec,square],[modestatements,square],[modestly,plus],[modesty,plus],[modifiable,plus],[modification,plus],[modifications,right],[modified,right],[modifier,plus],[modifiers,key],[modifies,right],[modify,right],[modifying,right],[modifyitem,plus],[modularisation,box],[modularise,dash],[modularised,box],[module,box],[modules,book],[modus,n],[modusponens,right],[modw,square],[mohammad,person],[mohammadfneish,person],[mohammed,person],[moistened,square],[mol,n],[molars,tooth],[mold,box],[molding,mold],[moldova,country],[molecular,ball],[molecularly,ball],[molecule,ball],[molecules,ball],[molly,person],[molyhedrons,box],[mom,person],[moment,one],[momentarily,one],[momentary,n],[moments,one],[mon,square],[monaco,country],[monad,and],[monadalisation,right],[monadalised,line],[monadic,and],[monarch,person],[monarchs,person],[monarchy,king],[monash,university],[monastery,building],[monastic,person],[monastics,person],[monday,square],[mondragón,equals],[monetary,dollarsymbol],[monetise,dollarsymbol],[money,dollarsign],[moneyless,zero],[moneymaking,dollarsymbol],[mongolia,country],[monies,dollarsymbol],[moniker,plus],[monitor,square],[monitored,plus],[monitoring,eye],[monitors,plus],[moniz,person],[monk,person],[monkey,],[mono,box],[monocle,circle],[monogamy,box],[monologue,paper],[monologues,paper],[monopoles,up],[monopoly,box],[monos,box],[monotheism,person],[monotheistic,person],[monotone,o
ne],[monotonicities,plus],[monster,person],[montelhedrons,box],[montenegro,country],[montero,person],[montessori,person],[month,square],[monthdob,n],[monthlearned,square],[monthly,],[months,square],[monthvaluea,variable],[monthvalueb,variable],[montreal,city],[mooc,paper],[moocs,book],[mood,square],[moods,variable],[moody,person],[moog,note],[moon,],[moonbelt,ball],[mooney,person],[moonlight,moon],[moonwalk,right],[mop,],[mopped,mop],[moral,tick],[morale,plus],[morality,tick],[morally,plus],[morals,plus],[more,plus],[moreover,and],[morgan,person],[moriac,suburb],[mormonisms,person],[morning,sun],[mornings,sun],[morocco,country],[morphs,right],[morris,person],[mortal,n],[mortality,n],[mortals,person],[mortar,],[mortarboard,],[mortem,n],[mortgage,n],[moslem,person],[mosquito,mosquito],[mosquitoes,mosquito],[mossop,person],[most,one],[mostly,one],[mother,person],[motherhood,plus],[mothers,person],[moti,right],[motion,right],[motioning,right],[motions,right],[motivate,plus],[motivated,plus],[motivates,plus],[motivating,plus],[motivation,plus],[motivational,plus],[motivations,plus],[motives,plus],[motor,right],[motto,square],[mould,box],[moulded,box],[moules,person],[mound,earth],[mount,up],[mountain,],[mountaineer,person],[mountains,mountain],[mounted,down],[mouse,],[mouseovers,dot],[mousse,],[moustache,],[moustached,moustache],[mouth,],[mouthful,mouth],[mouthfuls,cube],[mouthing,mouth],[mouthpiece,mouth],[mouthwash,],[move,right],[moved,right],[movement,right],[movements,right],[mover,right],[moves,right],[movie,line],[moviegoer,person],[movies,line],[moviesandsales,square],[moviestorm,software],[moving,right],[movingappearances,right],[movingappearancespurusha,square],[mow,right],[moxibustion,dash],[moyn,person],[mozambique,country],[mozart,person],[mozilla,application],[mp,dash],[mpa,book],[mr,variable],[mrcaw,and],[mrcawmp,square],[mredo,two],[mrs,person],[mrtree,dash],[mrtsthoughts,square],[mrwed,company],[ms,building],[mscore,n],[mscp,school],[msetup,n],[msg,squar
e],[msn,company],[msort,right],[mssbtl,mat],[mt,mountain],[mteach,book],[mteaching,book],[mth,m],[mtjcbct,dash],[mtotal,n],[mtrace,dash],[mtree,dash],[mtrees,dash],[mtrue,one],[mts,square],[much,one],[mucous,box],[mueller,person],[muesli,],[muffin,],[muffins,muffin],[mug,],[muhammad,person],[mukesh,person],[mul,n],[mulch,box],[mulched,box],[mulling,plus],[mulready,person],[mult,multiplication],[multi,n],[multichoice,n],[multiclause,square],[multiclauses,square],[multiconnections,right],[multicultural,tworectangle],[multiculturalism,plus],[multidim,box],[multidimensional,dash],[multidisciplinary,book],[multifile,paper],[multileveled,dash],[multiline,two],[multilingual,book],[multimedia,square],[multimethod,plus],[multinational,ball],[multiness,n],[multiple,n],[multiples,multiplication],[multiplication,multiplication],[multiplications,multiplication],[multiplicity,n],[multiplied,multiply],[multiplier,n],[multiplies,multiplication],[multiply,],[multiplybytwo,multiply],[multiplying,multiplication],[multireps,box],[multistage,n],[multistate,square],[multitask,plus],[multitasking,n],[multithreaded,line],[multithreading,line],[multitrait,box],[multitude,two],[multiverse,box],[multiverses,box],[multividual,person],[multividuals,person],[mulvaney,person],[mum,university],[mummery,company],[mummoco,person],[mummy,person],[mums,person],[mundane,minus],[munery,plus],[munify,equals],[munster,person],[muradbd,person],[murch,plus],[murder,minus],[murdered,minus],[murderer,person],[murderers,person],[murdering,minus],[muriel,person],[murnaghan,person],[murphy,person],[murray,person],[mus,note],[musc,dash],[muscle,box],[muscles,box],[muscomp,note],[muscovy,duck],[muscular,muscle],[mushroom,plant],[mushrooms,mushroom],[music,note],[musical,note],[musically,note],[musicals,note],[musiccomposer,software],[musician,person],[musicians,person],[musicking,note],[musicles,box],[musiclibrary,book],[musings,plus],[muslims,person],[must,dash],[mustafa,person],[mutants,person],[mutate,right],[m
utated,right],[mutating,right],[mutation,n],[muted,zero],[mutex,box],[mutexed,box],[mutexes,box],[mutt,software],[mutual,plus],[mutually,tworectangle],[mutuallyexclusive,notequals],[mv,right],[mvisible,square],[mvm,dash],[mvs,dash],[mwarning,xcorrection],[mwt,company],[mwyeqclcbgasyhq,dash],[mx,variable],[my,person],[myadmissionessay,company],[myanmar,country],[mycorrhizae,],[myelem,square],[myers,person],[myface,company],[myhead,square],[myki,square],[mymind,company],[myness,person],[myproduct,multiplication],[myreverse,left],[myriad,two],[myrrh,],[myself,person],[mysql,and],[mysteries,questionmark],[mysterious,plus],[mystery,plus],[mystic,plus],[mystical,plus],[mysum,plus],[mytail,square],[myth,minus],[myths,square],[myune,paper],[mywhile,plus],[mzi,dash],[n,variable],[na,dash],[nab,company],[nac,dash],[nacl,],[nad,dash],[nadia,person],[nagel,person],[nagging,minus],[naif,person],[nail,down],[nailed,nail],[nails,nail],[naive,plus],[nakar,person],[naked,zero],[nal,variable],[namaskar,hand],[namaskara,plus],[name,threerectangle],[nameable,square],[named,square],[namely,square],[names,a],[nameserver,square],[namibia,country],[naming,square],[namogoo,company],[nan,person],[nana,person],[nancy,person],[nanga,city],[nanny,person],[nanometres,line],[nanotode,down],[nanotubes,box],[naomi,person],[nap,bed],[napa,tree],[napkin,],[naplan,paper],[naples,city],[napoletana,tomato],[nappies,nappy],[nappy,],[narcissism,minus],[narcissist,minus],[narcissistic,minus],[narcissists,minus],[narcissus,person],[nare,dash],[narrated,mouth],[narrating,mouth],[narration,book],[narrative,square],[narratives,square],[narratology,mouth],[narrator,person],[narrow,box],[narrowed,down],[narrowing,minus],[narrows,minus],[narst,person],[nasa,up],[nasal,nose],[nascendi,right],[natalie,person],[nathalie,person],[nathan,person],[nation,square],[national,country],[nationalism,country],[nationalist,person],[nationality,square],[nationally,country],[nations,square],[native,person],[natively,plus],[nativ
ity,baby],[natural,plus],[naturalistic,plus],[naturally,plus],[naturalness,plus],[nature,sun],[natureofcode,and],[natures,plus],[naughty,minus],[nauru,country],[nautilus,square],[nav,square],[naval,sea],[navel,orange],[navigate,right],[navigated,right],[navigating,right],[navigation,right],[navigational,right],[navigator,and],[navy,person],[nay,minus],[nazarene,person],[nazi,person],[nazis,person],[nazism,person],[naïve,minus],[nb,plus],[nba,n],[nbsp,dash],[nby,left],[nc,dash],[ncbi,company],[ncolour,square],[ncurl,dash],[nd,dash],[ndis,right],[ne,dash],[near,n],[nearby,equals],[neared,plus],[nearest,equals],[nearing,right],[nearly,plus],[neat,plus],[neaten,plus],[neatened,plus],[neatly,plus],[neatness,plus],[nebuchadnezzar,person],[nebula,nebula],[nebulous,nebula],[nec,plus],[necessarily,tick],[necessary,one],[necessitate,plus],[necessitating,plus],[necessitivist,plus],[necessity,plus],[neck,],[necking,equals],[necklace,],[nectar,],[nectarine,],[nectars,nectar],[nee,dash],[need,left],[needaccess,dash],[needed,left],[needing,left],[needle,],[needles,needle],[needless,xcorrection],[needlessly,minus],[needs,left],[needy,minus],[nef,dash],[neg,minus],[negatable,minus],[negatably,minus],[negate,xcorrection],[negated,xcorrection],[negates,xcorrection],[negating,xcorrection],[negation,minus],[negative,minus],[negatively,minus],[negativity,minus],[negeia,minus],[negentropic,down],[neglected,minus],[negligence,minus],[negotiate,plus],[negotiated,plus],[negotiating,plus],[negotiation,plus],[negotiations,square],[negotiator,person],[nei,person],[neiess,person],[neiey,person],[neigh,program],[neighborhood,suburb],[neighbour,equals],[neighbourhood,suburb],[neighbouring,equals],[neighbours,person],[neil,person],[neiler,person],[neiney,person],[neither,xcorrection],[nel,person],[nellie,person],[nelly,person],[nelson,person],[neo,plus],[neof,square],[neoliberalism,plus],[neon,square],[nepal,country],[nephew,person],[nephtali,person],[ner,dash],[nerd,person],[nerve,],[nerves,right]
,[nervous,nerve],[nervousness,minus],[nesn,company],[ness,plus],[nest,],[nested,box],[nesting,nest],[nestlé,company],[net,box],[netherlands,country],[network,dot],[networked,line],[networking,star],[networks,dash],[neumann,person],[neural,brain],[neurally,brain],[neuro,brain],[neuroalgorithms,and],[neurobot,person],[neurocode,n],[neurodiverse,person],[neurofeatures,plus],[neurointerpreter,and],[neuromodel,right],[neuron,star],[neuronal,neuron],[neuronetwork,plus],[neuronetworks,plus],[neurons,star],[neuroplastic,square],[neuroplasticity,plus],[neurorecordings,square],[neuroscience,brain],[neuroscientist,person],[neurosolver,plus],[neurospected,eye],[neurotechnologies,brain],[neurotic,minus],[neuroticism,minus],[neurotypical,plus],[neuse,plus],[neuter,square],[neutral,zero],[never,zero],[nevertheless,plus],[nevis,country],[new,plus],[newborns,baby],[newbranch,dash],[newbranchifcall,dash],[newcastle,city],[newcat,and],[newcomers,person],[newer,one],[newland,person],[newline,square],[newlines,square],[newlist,square],[newly,plus],[newn,plus],[newness,one],[newrule,square],[newrulenumber,n],[news,paper],[newsletter,paper],[newsletters,paper],[newspaper,paper],[newspapers,book],[newsreader,person],[newsworthy,paper],[newton,person],[next,right],[ney,dash],[nfns,right],[nfrom,dash],[ng,person],[ngaire,person],[nhalt,dash],[nhandle,dash],[nhmrc,book],[ni,square],[niall,person],[niaz,person],[nib,],[nibbled,tooth],[nicaragua,country],[niccolò,person],[niccoló,person],[nice,plus],[nicely,plus],[nicer,greaterthan],[nicest,plus],[niche,box],[niches,box],[nicholas,person],[nicholls,person],[nick,person],[nickered,minus],[nicknames,square],[nicknaming,square],[nicola,person],[nicole,person],[nida,university],[nidaacttech,square],[nidaamericanaccent,square],[niece,person],[nielsen,person],[niet,person],[nieto,person],[nietzche,person],[nietzsche,person],[nietzschean,person],[niger,country],[nigeria,country],[night,square],[nightclubs,building],[nighter,square],[nightfall,n],[nigh
tingale,],[nightmare,minus],[nightmares,minus],[nights,square],[nih,company],[nike,person],[nikola,person],[nil,zero],[nile,river],[nine,ninerectangle],[ninerectangle,],[ninja,person],[ninjitsu,book],[ninstallation,down],[nint,dash],[ninth,n],[nirit,person],[nirvana,zero],[nitesh,person],[nitin,person],[nk,dash],[nl,square],[nleash,dash],[nlm,square],[nlp,bracket],[nls,down],[nlucian,person],[nm,dash],[nmain,dash],[nmany,dash],[nmr,n],[nn,right],[nna,square],[nnb,square],[nnc,square],[nnd,square],[nne,square],[nnf,variable],[nng,variable],[nnh,variable],[nni,variable],[nnit,box],[nnitt,box],[nnl,variable],[nnmy,box],[nno,dash],[nnobzgawdoxnmxdjgpyvege,dash],[nns,right],[nnzss,dash],[no,xcorrection],[noam,person],[nobel,plus],[noble,plus],[nobles,person],[nobodyproblemsapp,software],[nocut,xcorrection],[noddings,person],[node,dot],[nodes,dot],[nodisplay,square],[nogating,xcorrection],[noheadaches,xcorrection],[noheadachesapp,box],[noise,zero],[nola,person],[nolan,person],[nomenclature,square],[nominalisation,square],[nominalised,square],[nominate,square],[nominated,plus],[nominative,square],[non,xcorrection],[nonabort,plus],[nonaligned,minus],[nonbeing,person],[nonbeings,person],[nonbreaking,square],[nondet,n],[nondeterminism,two],[nondeterministic,or],[nondeterministically,n],[nondurable,minus],[none,zero],[nonempty,square],[nonemptycord,square],[nonequally,notequals],[nonhuman,animal],[nonmonotonicities,minus],[nonmonotonicity,plus],[nonprofits,zero],[nonrec,right],[nonrecursive,dot],[nonsense,minus],[nontechnical,right],[nonvar,square],[nonverbal,paper],[nonviolent,plus],[noodle,],[noodles,noodle],[noon,n],[nopqrstuvwxyz,dash],[noprotocol,xcorrection],[nor,zero],[noradrenaline,],[norberto,person],[nordlinger,person],[norepinephrine,],[noreply,minus],[norm,plus],[normal,plus],[normalising,down],[normally,plus],[normative,n],[normativise,field],[normativity,plus],[norms,plus],[norover,dash],[norris,person],[north,up],[northwest,right],[norway,country],[norwegian,per
son],[noscript,square],[nose,],[nosepieces,peg],[nosing,nose],[nossal,person],[nostril,],[nostrils,nostril],[not,xcorrection],[notable,plus],[notably,plus],[notation,square],[notations,square],[notcher,down],[notchers,down],[note,],[notebook,book],[notebooks,book],[noted,plus],[notepad,paper],[notequals,],[notes,paper],[notestonames,note],[noteworthy,plus],[nothing,zero],[nothingness,zero],[notice,eye],[noticeable,plus],[noticed,person],[noticing,square],[notification,plus],[notifications,plus],[notified,square],[notifies,tick],[notify,plus],[notifying,square],[noting,square],[notion,square],[notions,box],[notrace,xcorrection],[notranslate,xcorrection],[notree,dash],[nots,xcorrection],[noumena,box],[noumenal,dot],[noumenalist,person],[noumenon,apple],[noumenonisation,dot],[noun,apple],[nouns,square],[nourish,plus],[nourished,plus],[nourishing,plus],[nourishment,plus],[nov,square],[novaes,person],[novel,book],[novels,book],[november,square],[now,zero],[nowell,person],[nowhere,box],[nowrap,equals],[nozzle,box],[np,square],[npackage,box],[npm,down],[npp,dash],[npredicate,square],[nq,square],[nr,dash],[nrectangle,],[ns,n],[nsay,dash],[nsw,square],[nswipl,dash],[nsys,square],[nt,square],[ntexttobr,apple],[nth,n],[ntime,dash],[ntitle,square],[ntotal,n],[ntotal,right],[nuance,plus],[nuances,plus],[nuclear,minus],[nucleus,ball],[nude,minus],[nudging,right],[nudity,person],[nuggets,box],[nuland,person],[null,zero],[num,n],[number,one],[numbera,variable],[numbered,n],[numbering,n],[numberofinstruments,n],[numberofphils,n],[numbers,n],[numberstring,square],[numeracy,one],[numerals,n],[numeric,n],[numerical,number],[numerous,n],[numin,n],[numinputs,variable],[numout,n],[numoutputs,variable],[nums,n],[nuptial,dash],[nur,plus],[nurse,person],[nursery,building],[nursing,plus],[nurture,plus],[nurtured,plus],[nurturers,person],[nurtures,plus],[nurturing,plus],[nusrat,person],[nussbaum,person],[nut,],[nuted,plus],[nutmeg,plant],[nutrient,ball],[nutrients,apple],[nutrifactored,apple],
[nutrition,apple],[nutritional,apple],[nutritious,apple],[nuts,nut],[nutshell,nut],[nuzzle,mouth],[nuzzled,mouth],[nv,square],[nvivo,square],[nwcxu,dash],[nwhen,dash],[nwhere,dash],[nx,dash],[ny,variable],[nylon,line],[nyou,dash],[nyse,dash],[nz,n],[nǚ,person],[nǚzǐ,person],[o,variable],[oak,tree],[oaks,tree],[oar,],[oarsman,person],[oasis,],[oat,],[oats,oat],[obd,book],[obedience,plus],[obesity,fat],[obey,plus],[obeyed,plus],[obj,apple],[object,apple],[objected,xcorrection],[objecting,xcorrection],[objection,xcorrection],[objectionable,minus],[objections,xcorrection],[objective,box],[objectively,plus],[objectives,dot],[objectivity,box],[objectnames,square],[objectpath,line],[objects,apple],[objectstofinish,box],[objectstoprepare,apple],[objecttofinish,box],[objecttofinishdict,book],[objecttofinishdifferencet,d],[objecttofinishdt,d],[objecttofinishend,variable],[objecttofinishlength,n],[objecttofinishlengtht,n],[objecttofinishstring,variable],[objecttofinishstringth,variable],[objecttofinishvalues,variable],[objecttofinishx,variable],[objecttofinishy,variable],[objecttofinishz,variable],[objecttoprepare,apple],[objecttopreparedict,book],[objecttopreparedifferencet,d],[objecttopreparedt,d],[objecttoprepareend,variable],[objecttopreparelength,n],[objecttopreparelengtht,n],[objecttopreparestring,variable],[objecttopreparestringth,variable],[objecttopreparevalues,variable],[objecttopreparex,variable],[objecttopreparey,variable],[objecttopreparez,variable],[objs,apple],[oblate,person],[oblates,person],[obligate,plus],[obligated,plus],[obligates,plus],[obligating,plus],[obligation,plus],[obligations,plus],[obliged,plus],[oboe,tube],[obs,box],[obscure,minus],[observable,plus],[observant,eye],[observation,eye],[observational,eye],[observations,n],[observatory,star],[observe,eye],[observed,eye],[observer,person],[observers,person],[observing,eye],[obsessed,minus],[obsolescence,minus],[obsolete,minus],[obsoleted,xcorrection],[obsoletion,xcorrection],[obstacle,box],[obstacles,
box],[obstruct,box],[obstruction,box],[obtain,left],[obtained,right],[obvious,plus],[obviously,plus],[obviousness,plus],[ocarina,box],[occasion,square],[occasionally,plus],[occasions,plus],[occupant,person],[occupants,person],[occupation,paper],[occupational,paper],[occupied,plus],[occupies,dot],[occupy,square],[occupying,plus],[occur,plus],[occuring,box],[occurred,plus],[occurrence,plus],[occurrences,plus],[occurring,dot],[occurs,dot],[ocean,water],[oceanographer,person],[oceans,water],[oclass,variable],[ocr,and],[oct,eightrectangle],[octagon,],[octahedral,octahedron],[octahedron,],[octahedrons,octahedron],[octave,eightrectangle],[octaves,line],[octet,eightrectangle],[october,square],[octopus,],[ocw,book],[odd,n],[oddly,minus],[odds,p],[odwyer,person],[oedipal,person],[oedipus,person],[oesophagus,tube],[oeuvre,book],[oevre,book],[of,dash],[off,xcorrection],[offable,tick],[offend,minus],[offended,minus],[offender,person],[offending,minus],[offensive,minus],[offer,plus],[offered,right],[offering,hand],[offerings,box],[offers,apple],[office,room],[officer,person],[officers,person],[offices,room],[officeworks,company],[official,plus],[offing,down],[offline,zero],[offs,down],[offset,n],[offsets,n],[offside,right],[offspring,person],[ofh,box],[often,one],[oga,dash],[ogvoh,dash],[oh,plus],[ohdvoxc,dash],[ohio,person],[oi,dash],[oii,dash],[oil,],[oiled,oil],[oiling,oil],[oio,dash],[oivn,variable],[ok,plus],[okay,plus],[ol,variable],[old,dash],[olddoctors,person],[older,greaterthan],[oldest,n],[oldham,person],[oldmeditators,person],[olds,person],[olfactory,nose],[olive,],[oliver,person],[olivia,person],[olkin,person],[olsson,person],[olympiad,book],[olympic,ball],[olympics,ball],[om,person],[oman,country],[omar,person],[omega,square],[omerbsezer,person],[omissions,minus],[omit,xcorrection],[omits,minus],[omitted,zero],[omitting,xcorrection],[on,plus],[onboard,plus],[onboarding,dot],[once,one],[oncoming,square],[ond,dash],[one,],[oneanother,person],[onelonghandle,handle],[on
eness,one],[ones,one],[oneself,person],[onezero,one],[onfray,person],[ong,person],[ongoing,line],[onings,apple],[onion,],[onlily,one],[online,dash],[onlinelibrary,book],[only,one],[onnahope,plus],[onness,square],[onnonletter,square],[onoy,person],[ons,dot],[onset,line],[ont,square],[ontario,city],[ontesol,book],[ontic,square],[ontinue,dash],[onto,down],[ontol,box],[ontological,square],[ontologically,and],[ontologies,square],[ontologise,square],[ontologised,box],[ontology,and],[onward,right],[onwards,right],[oo,dot],[ooh,plus],[oohs,note],[ooi,dash],[ool,dash],[ooo,right],[oop,minus],[op,right],[opaque,square],[open,dash],[openbracket,],[opened,up],[openeditor,square],[openeditorwitherror,xcorrection],[opening,up],[openings,box],[openly,tick],[openmind,company],[openness,plus],[openroundbracket,],[opens,square],[opensquarebracket,],[opera,book],[operand,square],[operate,hand],[operated,box],[operates,plus],[operating,line],[operatingsystem,],[operation,and],[operational,tick],[operations,and],[operative,and],[operativity,plus],[operator,and],[operators,and],[ophelia,person],[opiate,plus],[opie,person],[opinion,square],[opinions,square],[opponent,person],[opponents,person],[opportunities,plus],[opportunity,plus],[oppose,xcorrection],[opposed,xcorrection],[opposing,dash],[opposite,dash],[opposites,xcorrection],[opposition,minus],[oppositions,xcorrection],[oppress,minus],[oppressed,minus],[oppression,minus],[oppressions,minus],[oppressive,minus],[oppressiveness,minus],[oprah,person],[oprotocol,dash],[ops,box],[opt,plus],[optative,plus],[optical,eye],[optimal,plus],[optimally,one],[optimisation,one],[optimisations,down],[optimise,one],[optimisecode,one],[optimised,one],[optimiser,one],[optimisers,down],[optimises,down],[optimising,down],[optimism,plus],[optimistic,plus],[optimization,plus],[optimum,plus],[option,dash],[optional,plus],[optionally,plus],[options,plus],[optometry,eye],[opus,book],[opusman,person],[or,],[oral,mouth],[orally,mouth],[orange,],[oranges,orange],
[orangutan,],[oranoos,person],[orb,ball],[orbiting,circle],[orch,violin],[orchestra,instrument],[orchestral,baton],[orcutorifthen,right],[ord,n],[ordained,plus],[order,n],[ordered,n],[ordering,paper],[orders,n],[ordinary,plus],[ordinate,n],[ordinated,plus],[ordinates,n],[ordination,plus],[ordinator,person],[ords,n],[ore,dash],[oreo,],[org,building],[organ,box],[organic,apple],[organisation,building],[organisational,company],[organisations,organisation],[organise,plus],[organised,tick],[organiser,person],[organising,dash],[organism,person],[organisms,person],[organization,building],[organizational,company],[organizations,building],[organized,plus],[organizer,person],[organizers,person],[organs,box],[orhan,person],[orient,right],[orientation,right],[orientations,line],[oriented,right],[orienteered,right],[orienteering,right],[orienteers,person],[orienting,right],[orig,plus],[origin,box],[original,one],[originality,plus],[originally,one],[originals,one],[originate,zero],[originated,dot],[originating,dot],[origins,dot],[orlando,person],[ornament,plus],[orphanage,building],[orphaned,dot],[orphanedalgorithms,variable],[orphanedbreasonings,apple],[orphanedbreathsonings,variable],[orphaneddirections,variable],[orphanedobjectstofinish,variable],[orphanedobjectstoprepare,variable],[orphanedpartsofrooms,variable],[orphanedrooms,variable],[orsola,person],[orthodontic,tooth],[orthodontist,person],[orthodox,plus],[orthogonal,square],[ory,person],[os,box],[osaka,city],[osborne,person],[oscar,person],[oscillating,right],[ost,note],[osxdaily,dash],[ot,right],[otac,variable],[oth,dash],[other,person],[otherdepartment,building],[othered,two],[otherlander,person],[others,person],[otherwise,minus],[otte,person],[ou,dash],[oua,company],[ouch,minus],[oui,box],[ouptut,square],[our,dash],[ours,person],[ourselves,person],[out,minus],[outages,minus],[outback,tree],[outbreak,minus],[outbursts,minus],[outcome,n],[outcomes,right],[outdoor,plant],[outdoors,right],[outed,right],[outer,right],[oute
rmost,right],[outfit,trousers],[outfits,shirt],[outflow,right],[outgoing,plus],[outlandish,minus],[outlasting,n],[outlawed,xcorrection],[outlawing,xcorrection],[outlay,down],[outlaying,square],[outlet,building],[outlets,box],[outlier,dot],[outliers,minus],[outline,square],[outlined,square],[outlines,paper],[outlining,dash],[outlook,eye],[outlooks,eye],[outlying,dot],[outmodes,right],[outness,right],[outpatients,person],[outperform,plus],[outperforming,plus],[outpost,city],[output,variable],[outputarguments,variable],[outputasconsequent,n],[outputformat,n],[outputlabels,square],[outputless,zero],[outputlyrics,square],[outputnumber,n],[outputs,variable],[outputted,n],[outputting,right],[outputtype,square],[outputvalues,n],[outputvar,variable],[outputvarlist,twobox],[outputvars,box],[outputvarsl,n],[outro,n],[outs,right],[outset,n],[outside,square],[outsider,person],[outsiders,person],[outsource,right],[outsourced,right],[outsourcing,right],[outstanding,plus],[outturned,right],[outward,right],[outwardly,right],[outwards,right],[outweigh,greaterthan],[ov,dash],[oval,square],[ovale,oval],[ovar,variable],[ovary,organ],[oven,],[over,xcorrection],[overactive,up],[overall,n],[overarching,box],[overbearing,xcorrection],[overbearingly,xcorrection],[overcame,up],[overcast,sky],[overcharged,minus],[overcome,xcorrection],[overcomes,xcorrection],[overcoming,xcorrection],[overconfidence,minus],[overconfident,minus],[overdo,minus],[overdoing,minus],[overdone,plus],[overdriven,up],[overeating,minus],[overemphasis,minus],[overflow,minus],[overhanging,right],[overhead,up],[overheard,ear],[overlap,square],[overlapped,n],[overlapping,box],[overlaps,square],[overload,minus],[overlooked,minus],[overlooks,minus],[overly,plus],[overnight,square],[overreliance,minus],[override,right],[overriding,plus],[overs,right],[overseas,dot],[oversee,eye],[overseer,person],[overseers,person],[overshadowed,minus],[oversight,minus],[overspend,dollarsymbol],[overtake,right],[overtaken,xcorrection],[overtake
s,right],[overthrow,xcorrection],[overthrowing,xcorrection],[overtime,plus],[overtook,right],[overture,note],[overturned,up],[overturning,plus],[overuse,minus],[overview,square],[overviews,plus],[overwatered,water],[overwhelmed,minus],[overwhelming,minus],[overwork,minus],[overwrite,down],[overwriting,left],[overwrote,down],[overzealous,plus],[ovid,person],[ovn,variable],[ovns,variable],[owing,dollarsymbol],[owl,],[owls,owl],[own,dash],[owned,box],[owner,person],[ownerisation,plus],[owners,person],[ownership,box],[owning,dollarsymbol],[owns,box],[oxe,dash],[oxford,university],[oxley,person],[oxygen,],[oxygenated,box],[oyqym,dash],[oz,person],[ozemail,company],[ozone,],[p,a],[pa,dash],[pablo,person],[pace,n],[pachelbel,person],[pacific,water],[pacing,n],[pack,box],[package,box],[packaged,box],[packages,software],[packaging,box],[packed,box],[packet,],[packing,down],[packs,box],[pad,line],[padded,box],[padding,box],[paddled,right],[paddock,square],[pads,line],[paedophile,person],[paedophiles,person],[paedophilia,minus],[paella,],[page,paper],[pagerank,n],[pages,paper],[paginated,paper],[pagination,paper],[pagliarella,person],[paid,dollarsign],[paidós,person],[pain,minus],[painful,minus],[painless,plus],[pains,minus],[painstaking,plus],[painstakingly,plus],[paint,],[paintbrush,brush],[painted,paint],[painter,person],[painting,],[paintings,painting],[paints,paint],[pair,two],[paired,box],[pairs,tworectangle],[pairwise,box],[pak,person],[pakarinen,person],[pakistan,country],[pal,person],[palace,building],[palacebos,dash],[palau,country],[palestine,country],[palette,],[palimpsest,paper],[pallet,],[pallot,person],[palm,up],[palmer,person],[palms,palm],[palpation,dash],[palsy,minus],[pam,person],[pamphlet,],[pamphlets,pamphlet],[pan,],[panama,country],[pancake,],[panchan,person],[pancreas,],[pandemic,minus],[pane,square],[panegyric,plus],[panel,square],[panforte,],[pangs,minus],[panic,minus],[panning,dash],[pansies,flower],[pant,],[pantry,],[pants,],[panty,underpants],[papa
,person],[paper,],[paperback,book],[papers,paper],[papersga,dash],[papua,country],[para,square],[parabola,n],[paracosm,plus],[parade,right],[paraded,person],[paradigm,right],[paradigmatic,plus],[paradigms,right],[parading,person],[paradox,minus],[paradoxical,minus],[paradoxically,minus],[paragraph,square],[paragraphs,square],[paraguay,country],[parakeet,bird],[paralink,company],[parallel,line],[parallels,line],[paralysed,minus],[parameter,n],[parameters,n],[paramilitary,person],[parampalli,person],[params,variable],[paraphrase,square],[paraphrased,square],[paraphraser,right],[paraphrasers,right],[paraphrases,right],[paraphrasing,square],[paraphrasings,equals],[paraprogramming,plus],[paras,square],[parasol,],[parathyroid,box],[parcel,box],[parchment,paper],[pareil,equals],[parent,up],[parental,person],[parentheses,square],[parenthesis,brackets],[parenthesised,square],[parenthood,person],[parenting,person],[parents,person],[pareto,plus],[parietal,brain],[paris,city],[parises,city],[parisian,city],[park,],[parke,person],[parked,dot],[parking,car],[parkinson,person],[parkinsons,minus],[parks,plant],[parliament,building],[parliamentarians,person],[parliamentary,book],[parlour,room],[parmar,person],[parochialism,minus],[parodied,plus],[parodies,plus],[paronamastically,equals],[paronomastic,equals],[paronomastically,plus],[paronomasticism,dash],[parriage,plus],[parrot,],[parsana,person],[parse,down],[parsed,down],[parser,dash],[parsers,right],[parses,dash],[parsesentence,square],[parshina,person],[parsing,down],[part,box],[partake,plus],[partaking,plus],[parted,right],[parth,dash],[parthh,dash],[partial,n],[partially,one],[participant,person],[participants,person],[participate,tick],[participated,plus],[participating,tick],[participation,plus],[participatory,plus],[participle,plus],[participles,plus],[particle,ball],[particles,box],[particular,one],[particularly,plus],[particulars,plus],[parties,person],[partisan,plus],[partlength,n],[partly,plus],[partner,person],[partner
ed,right],[partnering,right],[partners,person],[partnership,plus],[partnerships,dash],[partofroom,table],[partofroomdict,book],[partofroomdifferencet,d],[partofroomdt,variable],[partofroomend,variable],[partofroomlength,n],[partofroomlengtht,n],[partofroomstring,variable],[partofroomstringth,variable],[partofroomvalues,variable],[partofroomx,variable],[partofroomy,variable],[partofroomz,variable],[parts,tworectangle],[partsofrooms,table],[party,house],[pas,plus],[pasa,city],[pasadena,suburb],[pashphalt,plus],[pass,dash],[passage,right],[passages,hall],[passcode,n],[passed,tick],[passenger,person],[passengers,person],[passer,person],[passers,person],[passes,right],[passing,dash],[passion,plus],[passionate,plus],[passionne,plus],[passive,zero],[passphrase,square],[passport,book],[passwd,square],[password,square],[passwordatom,box],[passwords,square],[passwordstring,square],[past,left],[pasta,],[pastable,square],[paste,square],[pasted,square],[pastime,n],[pasting,square],[pastry,],[pasttries,left],[pasty,],[pat,hand],[pataki,person],[patches,square],[patents,plus],[paternalistic,person],[path,],[pathetic,minus],[pathogen,minus],[pathogenesis,line],[pathogenic,line],[pathogens,bacterium],[pathological,minus],[pathologically,minus],[pathology,minus],[paths,path],[pathway,],[pathways,path],[patience,plus],[patient,person],[patients,person],[patriage,plus],[patriarchy,person],[patricia,person],[patrick,person],[patrol,person],[patrolled,plus],[patron,person],[patrons,person],[patted,hand],[pattern,square],[patterns,square],[patting,hand],[patty,],[paul,person],[paula,person],[paulhus,person],[paull,person],[pauls,person],[paultramlgreturningfromshowreel,person],[pause,line],[paused,zero],[pauses,box],[pavane,paper],[pavement,path],[paves,right],[paving,plane],[paw,],[pawing,right],[pawn,person],[pawns,person],[paws,paw],[pax,line],[pay,dollarsign],[payanimhighered,dollarsymbol],[paycheck,dollarsymbol],[paycommentasacelebrity,a],[paydaoistheadachemedicinetexttobr,dollarsign
],[payer,person],[paying,dollarsymbol],[payment,dollarsign],[payments,dollarsymbol],[payne,person],[paypal,company],[paypalapis,company],[pays,dollarsymbol],[pbm,square],[pbook,book],[pbt,dash],[pc,box],[pcap,dash],[pce,and],[pck,book],[pckg,box],[pconverter,right],[pcre,right],[pdf,paper],[pdhnkk,dash],[pea,],[peace,a],[peaceful,plus],[peacefully,plus],[peach,],[peaches,peach],[peacock,peacock],[peacocks,peacock],[peak,n],[peaks,dot],[peanut,nut],[peanuts,nut],[pear,],[pearl,person],[pears,pear],[pearson,company],[peas,pea],[peasants,person],[pecan,],[pecking,seed],[ped,book],[pedagogical,paper],[pedagogically,book],[pedagogies,book],[pedagogise,book],[pedagogue,person],[pedagogues,person],[pedagogy,book],[pedal,pedal],[pedalling,pedal],[pedanticals,person],[pedestal,],[pedestrian,person],[pedestrians,person],[pedocracy,one],[pedophile,person],[pedophilia,minus],[pedroza,person],[peds,person],[pee,dot],[peek,eye],[peeked,eye],[peeking,eye],[peel,box],[peeled,apple],[peeling,up],[peephole,hole],[peer,person],[peers,person],[peg,],[pegboard,],[peirce,person],[pelicanstore,dollarsymbol],[pellas,person],[pellets,],[peloponnesia,city],[pen,],[penalised,xcorrection],[penalties,xcorrection],[penalty,minus],[pencil,],[pencilsharpener,],[pending,plus],[pendulum,],[penetrate,right],[penetrating,down],[penetration,right],[pengines,square],[penguin,],[penis,],[penned,pen],[penning,pen],[penny,person],[pennycook,person],[pens,pen],[pensiero,box],[pension,dollarsymbol],[pensions,dollarsymbol],[pentagon,],[pentose,sugar],[penultimatevars,variable],[peo,person],[people,person],[peoples,person],[pepper,],[peppered,pepper],[peppermint,plant],[peppers,capsicum],[peptides,],[per,dash],[perceive,eye],[perceived,eye],[perceiving,plus],[percent,n],[percentage,n],[percentile,n],[perception,square],[perceptions,eye],[perceptron,eye],[perceptual,eye],[percussion,drum],[percussive,equals],[pereira,person],[perennial,line],[perez,person],[perfect,a],[perfection,one],[perfectionism,plus],[perf
ectly,one],[perfects,plus],[perforated,line],[perform,a],[performance,stage],[performances,box],[performative,plus],[performativity,plus],[performed,person],[performer,person],[performers,person],[performerstuff,person],[performing,person],[performs,priscilla],[perfume,],[perfusion,right],[pergola,],[perhaps,zero],[period,line],[periodically,t],[periods,line],[peripheral,right],[periphery,n],[perishable,minus],[peristaltic,down],[perks,plus],[perlin,person],[perm,plus],[permanent,plus],[permanently,plus],[permeable,right],[permeate,down],[permission,plus],[permits,tick],[permitted,plus],[permutation,right],[permutations,right],[perpendicular,square],[perpendicularly,square],[perpetual,n],[perpetuate,plus],[perpetuity,n],[perron,person],[perry,person],[perseverance,plus],[persevering,plus],[persia,country],[persist,plus],[persistence,plus],[persistent,plus],[persistently,plus],[persists,plus],[person,],[personable,plus],[personae,person],[personal,dot],[personalised,plus],[personalism,person],[personalities,person],[personality,person],[personalityforge,company],[personalizing,plus],[personally,person],[personcoords,n],[personell,person],[personhood,plus],[personnel,person],[persons,person],[perspectival,eye],[perspective,eye],[perspectives,dot],[perspectivism,eye],[perspiration,water],[persps,eye],[persson,person],[persuade,plus],[persuaded,plus],[persuading,plus],[persuasive,plus],[pertain,plus],[pertaining,right],[pertinent,plus],[peru,country],[pessimising,minus],[pestel,plus],[pestle,],[pet,animal],[petal,],[petals,petal],[peter,person],[peters,person],[petersburg,city],[petersburgs,city],[petersen,person],[petissiences,plus],[pets,animal],[petty,person],[pf,square],[pfile,paper],[pfs,and],[pft,box],[pg,paper],[pgrad,person],[pgst,dash],[pgtd,square],[ph,dash],[phaggusemes,equals],[phallus,rod],[phalluses,rod],[phantasmagoria,box],[phantom,person],[pharmaceuticals,medicine],[pharmacological,ball],[pharmacology,ball],[pharmacy,building],[pharynx,tube],[phase,line
],[phd,book],[phds,book],[pheasant,],[phenomena,box],[phenomenological,right],[phenomenologist,person],[phenomenology,square],[phenomenon,square],[pher,square],[phi,square],[phil,a],[philanthropical,plus],[philanthropist,person],[philanthropists,person],[philanthropy,right],[philatelist,person],[philip,person],[philippines,country],[philips,person],[phillip,person],[phillips,person],[philn,paper],[philosopher,person],[philosophers,person],[philosophic,dash],[philosophical,dash],[philosophically,right],[philosophicon,square],[philosophicus,right],[philosophies,paper],[philosophise,square],[philosophised,book],[philosophy,book],[phils,book],[philslengthlist,square],[phine,dash],[phoenix,bird],[phone,n],[phonebook,n],[phoneme,square],[phonemes,square],[phones,phone],[phonetic,square],[phontology,mouth],[photo,square],[photograph,square],[photographed,square],[photographer,person],[photographic,square],[photographing,square],[photographs,square],[photography,square],[photon,],[photons,ball],[photos,square],[photoshop,software],[photosynthesis,right],[photosynthetic,right],[php,paper],[phpversion,n],[phrase,square],[phrases,square],[phrasesfile,paper],[phrasevalue,n],[phrasevarname,square],[phrasing,square],[phrasings,square],[phryne,person],[phyllis,person],[phylogenetic,dash],[phys,box],[physic,square],[physical,box],[physicalism,box],[physically,box],[physician,person],[physicians,person],[physicist,person],[physics,plus],[physiological,body],[physiologically,person],[physiologies,body],[physiologist,person],[physiology,person],[physiotherapy,book],[physis,right],[pi,],[pia,person],[pianissimos,line],[pianist,person],[piano,],[pic,square],[picas,n],[piccolo,note],[pick,plus],[picked,apple],[pickering,person],[pickies,n],[picking,tick],[pickle,],[pickled,n],[picks,tick],[picky,plus],[picture,square],[pictured,square],[pictures,square],[picturing,square],[pid,dash],[pie,],[piece,paper],[pieces,box],[piecing,right],[pierced,dot],[piercings,hole],[pierre,person],[piety,pl
us],[pig,],[pigeon,],[pigeonhole,box],[pigment,square],[pilcher,person],[pile,up],[pill,tablet],[pillar,cylinder],[pillars,rod],[pillow,],[pillowed,pillow],[pillows,pillow],[pilot,person],[piloting,plus],[pimple,],[pin,],[pinball,ball],[pinch,minus],[pine,tree],[pineal,gland],[pineapple,],[pinecone,],[pined,plus],[pinhead,dot],[pinhole,hole],[pink,square],[pinkies,finger],[pinkwhistles,whistle],[pinky,finger],[pinnacle,dot],[pinned,dot],[pinpoint,dot],[pins,pin],[pinter,person],[pinterest,company],[pinx,square],[pio,right],[pious,plus],[pip,],[pipe,],[pipeline,line],[pipelines,line],[piper,person],[pipes,pipe],[pipette,pipette],[pipetted,pipette],[pippin,apple],[pips,pip],[pipsqueaks,minus],[pisa,city],[pistachio,nut],[pistrui,person],[pitch,tick],[pitched,plus],[pitches,tick],[pitchouli,],[pitchproof,note],[pitfalls,minus],[pithfulnesses,down],[pitrisaur,dinosaur],[pitted,stone],[pittosporum,plant],[pituitary,box],[pity,minus],[pivot,right],[pivotal,dot],[pixar,company],[pixel,square],[pixelate,square],[pixelated,square],[pixellated,square],[pixels,square],[pixie,person],[pixies,person],[pizza,],[pizzicato,finger],[pl,square],[pla,dash],[place,building],[placebo,tablet],[placed,hand],[placeholder,dot],[placeholders,dot],[placement,square],[placenta,organ],[places,tree],[placing,down],[plagiarise,minus],[plagiarised,minus],[plagiarises,minus],[plagiarising,minus],[plagiarism,minus],[plagues,minus],[plain,plus],[plainer,plus],[plainised,right],[plainness,plus],[plaintiffs,person],[plan,paper],[plane,],[planes,plane],[planet,],[planetarium,star],[planetary,planet],[planets,ball],[plank,],[planned,paper],[planner,square],[planners,right],[planning,right],[plans,paper],[plant,],[planted,seed],[planting,seed],[plants,plant],[plaque,minus],[plasma,],[plastic,plane],[plasticated,plastic],[plasticity,n],[plate,],[plateau,box],[platelet,],[platelets,platelet],[platform,box],[platforms,square],[platinum,ball],[plato,person],[platolucian,person],[platonic,person],[platonism,bo
ok],[platonist,person],[platter,plate],[plausible,plus],[play,ball],[playback,right],[played,ball],[player,person],[players,person],[playful,ball],[playfully,plus],[playground,swings],[playing,plus],[playlist,square],[plays,book],[playwright,person],[pldoc,square],[ple,person],[plea,plus],[pleasant,plus],[please,plus],[pleased,plus],[pleases,plus],[pleasing,plus],[pleasure,plus],[pleasures,plus],[plebeian,person],[plectrum,plectrum],[plectrums,plectrum],[pleiotropic,n],[plentiful,n],[plenty,plus],[pless,person],[pliers,],[plimsoll,line],[plode,box],[plogue,company],[plos,book],[plot,dot],[plots,dot],[plotted,paper],[plotting,dot],[plough,],[ploughed,right],[pls,square],[plucked,right],[plucking,right],[plug,box],[plugged,right],[plugins,box],[plugs,right],[pluhar,person],[plum,],[plumber,up],[plume,feather],[plunger,down],[pluperfect,right],[plural,n],[pluralism,box],[pluralistic,box],[plus,],[plusone,plus],[plustwo,plus],[pluto,planet],[pm,dash],[pmc,square],[pmi,square],[pms,software],[pn,n],[pnal,n],[pnals,n],[png,paper],[po,university],[pocket,],[pocketing,box],[pocketwatch,],[pod,square],[podcast,paper],[podcaster,person],[podiatrist,person],[podium,],[podophile,person],[pods,pod],[poem,paper],[poems,equals],[poet,person],[poetic,square],[poetics,book],[poetry,square],[poh,person],[poi,dash],[point,dot],[pointed,dot],[pointedness,dot],[pointer,dot],[pointers,dot],[pointing,down],[pointless,xcorrection],[points,dot],[poisson,person],[poked,down],[poking,right],[pol,box],[poland,country],[polanyi,person],[polarity,rod],[polastrini,person],[pole,dot],[polemical,minus],[poles,line],[police,person],[policeman,person],[policies,square],[policy,paper],[polis,city],[polish,plus],[polished,plus],[polishing,square],[polite,person],[politely,plus],[politeness,plus],[politenesses,plus],[political,up],[politically,dot],[politician,person],[politicians,person],[politicity,plus],[politico,dot],[politics,dash],[politicsrelation,dash],[polkinghorns,plus],[poll,plus],[pollution,
box],[polly,person],[polonialnesses,person],[poltergeist,minus],[polycaring,plus],[polyglot,n],[polygon,square],[polygons,square],[polyhedron,box],[polyhedrons,box],[polymath,person],[polymorphism,box],[polynomial,n],[polynomials,line],[polyp,ball],[polysynth,n],[polytechnic,university],[polzin,person],[pom,tomato],[pome,apple],[pomegranate,],[pommelles,plus],[pomodoro,tomato],[pomodoros,paper],[pomology,dash],[poms,book],[pond,water],[ponds,suburb],[pone,dash],[ponens,hand],[ponty,person],[pony,],[poo,person],[poofters,person],[pooh,bear],[pool,],[poor,minus],[poorly,minus],[pooty,person],[pop,bubble],[popclassical,ball],[popclassicalcomposition,note],[popcorn,],[pope,person],[popogogy,book],[popol,book],[popological,person],[popologically,person],[popology,book],[poppadum,],[popped,up],[poppies,flower],[popping,air],[poppy,person],[pops,bubble],[popsicle,],[popular,plus],[popularities,plus],[popularity,plus],[popularized,plus],[populate,down],[populated,person],[populatevars,variable],[populating,person],[population,n],[populations,n],[populum,person],[porcupine,],[pork,],[porridge,],[port,box],[portable,right],[portalid,n],[ported,right],[porter,person],[portfolio,book],[portfolios,book],[porting,right],[portion,box],[portions,box],[portrait,square],[portray,square],[portugal,country],[portuguese,paper],[pos,plus],[pose,plus],[posed,plus],[poseia,plus],[poses,plus],[posing,plus],[posited,plus],[positing,plus],[position,n],[positioned,n],[positioning,plus],[positionlist,square],[positions,person],[positive,plus],[positively,plus],[positives,plus],[positivestate,plus],[positivism,plus],[positivist,plus],[positivity,plus],[positivityscore,plus],[posits,plus],[poss,plus],[possess,box],[possessing,box],[possession,plus],[possibilities,dash],[possibility,dash],[possible,plus],[possibly,one],[possiblynotworking,minus],[post,right],[postal,paper],[postcode,square],[postcolonial,right],[postcolonialism,country],[posted,letter],[poster,paper],[posteriori,right],[posterity,
n],[posters,paper],[postfix,right],[postgrads,person],[postgraduate,person],[posthumanism,right],[posting,square],[postkey,n],[postludetudine,square],[postnet,paper],[postobon,dash],[postpone,xcorrection],[posts,square],[postsong,note],[poststructural,book],[poststructuralism,right],[postulated,plus],[posture,line],[pot,],[potato,],[potatoes,potato],[potbelly,],[potency,n],[potential,tick],[potentiality,plus],[potentially,plus],[potentials,plus],[potholes,down],[potion,],[potoroo,],[potpourri,flower],[pots,pot],[potter,person],[pottery,pot],[pouch,bag],[pouches,bag],[poulton,person],[pounder,down],[pounding,down],[pounds,n],[pour,right],[poured,water],[pouring,down],[pov,dot],[poverty,minus],[powder,box],[powell,person],[power,plus],[powered,right],[powerful,plus],[powerfully,plus],[powering,n],[powerless,zero],[powerlessness,minus],[powerload,n],[powerman,person],[powerpoint,software],[powers,person],[powerschool,company],[poynter,person],[pozible,company],[pp,square],[ppca,company],[ppi,n],[ppm,paper],[ppx,dash],[pq,line],[pr,square],[pra,dash],[prac,plus],[pract,dash],[practica,paper],[practical,plus],[practicality,plus],[practically,plus],[practice,plus],[practiced,plus],[practices,square],[practicing,plus],[practicum,paper],[practicums,paper],[practise,plus],[practised,plus],[practisers,person],[practises,plus],[practising,plus],[practitioner,person],[practitioners,person],[pragmatic,plus],[pragmatism,right],[prahran,suburb],[praise,plus],[praised,plus],[praising,plus],[pranayama,yline],[praxemes,square],[pray,plus],[prayed,plus],[prayer,paper],[prayers,square],[praying,brain],[pre,left],[preamble,square],[prec,left],[precative,up],[precaution,plus],[precautions,xcorrection],[precede,right],[preceded,left],[precedence,up],[precedes,right],[preceeding,left],[precious,plus],[precipice,up],[precise,plus],[precisely,equals],[precision,plus],[precludes,xcorrection],[precomputed,and],[precondition,plus],[precur,left],[precursor,left],[precursors,left],[pred,square],[
predation,animal],[predationism,up],[predator,minus],[predatorial,up],[predators,animals],[predefined,tick],[predetermined,plus],[predicate,n],[predicateer,person],[predicateform,square],[predicateforms,square],[predicatename,square],[predicatenamen,square],[predicates,square],[predicatesa,square],[predicatesdictionary,and],[predicatised,square],[predict,two],[predictability,plus],[predictable,dot],[predicted,right],[predicting,right],[prediction,square],[predictions,square],[predictive,left],[predictively,plus],[predictor,variable],[predicts,up],[prednamearity,n],[prednamearitylist,square],[predominantly,plus],[preds,square],[preened,plus],[preening,plus],[prefer,plus],[preferable,plus],[preferably,plus],[preference,tick],[preferences,tick],[preferred,plus],[preferring,plus],[prefers,plus],[prefix,left],[prefixes,square],[prefixof,left],[preg,dash],[pregnancies,baby],[pregnancy,line],[pregnant,person],[prehistoric,line],[prehistory,line],[preincrement,plus],[preincrements,plus],[prejudice,minus],[prejudices,minus],[preliminarily,one],[preliminary,one],[prelude,note],[preludes,left],[premise,square],[premises,room],[premium,one],[premiums,dollarsymbol],[première,one],[preoccupation,plus],[preoedipal,right],[prep,right],[prepaid,dollarsymbol],[prepar,plus],[preparation,left],[preparations,plus],[preparatory,right],[prepare,right],[prepared,plus],[preparedness,plus],[prepares,plus],[preparing,plus],[prepensive,left],[preplpi,and],[preposition,box],[prepositioned,right],[prepositions,box],[preprocessed,plus],[prequel,book],[prereqs,right],[prerequisite,right],[prerequisites,right],[prerogative,plus],[presage,n],[preschool,building],[prescribe,square],[prescribed,paper],[prescribing,paper],[prescription,paper],[preselected,plus],[presence,person],[present,box],[presentation,square],[presentations,square],[presented,square],[presenter,person],[presenters,person],[presenting,square],[presentness,plus],[presents,square],[preservation,plus],[preserve,one],[preserved,right],
[preserver,plus],[preserves,right],[preserving,plus],[preset,one],[preside,plus],[president,person],[prespises,plus],[press,box],[pressed,down],[presser,down],[presses,finger],[pressing,down],[pressure,n],[prestige,plus],[prestigious,plus],[prestigiousness,plus],[presto,plus],[preston,person],[prestring,square],[presumption,square],[presuppositions,square],[pretend,plus],[pretended,plus],[pretending,plus],[prettily,plus],[pretty,plus],[prettyprint,square],[prev,left],[prevalent,plus],[prevent,xcorrection],[preventable,xcorrection],[preventative,plus],[preventbodyproblems,xcorrection],[prevented,xcorrection],[preventer,xcorrection],[preventheadaches,xcorrection],[preventing,xcorrection],[prevention,xcorrection],[preventive,xcorrection],[prevents,xcorrection],[preview,square],[previews,square],[previous,left],[previousbt,left],[previously,minus],[prevstate,n],[prey,down],[preying,down],[price,dollarsign],[priced,dollarsymbol],[prices,dollarsymbol],[pricing,dollarsymbol],[prickly,spine],[pride,plus],[priest,person],[prig,dash],[prim,one],[primarily,one],[primary,one],[prime,one],[primed,plus],[primer,paper],[primera,one],[primes,n],[primitive,down],[primordial,n],[prince,person],[princes,person],[princess,person],[princesses,person],[principal,person],[principals,person],[principe,country],[principias,book],[principle,square],[principles,square],[print,square],[printable,paper],[printed,square],[printer,],[printers,paper],[printf,square],[printheader,square],[printing,square],[printline,line],[printmaking,square],[printout,square],[printphonebook,book],[prints,square],[prior,left],[priori,left],[priorities,n],[prioritise,plus],[prioritised,plus],[priority,one],[priscilla,],[prism,],[prisms,cube],[prisoner,person],[prisons,xcorrection],[pristine,plus],[privacy,box],[private,box],[privately,plus],[privateness,plus],[privilege,plus],[privileged,plus],[privileges,plus],[prize,box],[prizes,plus],[prlote,paper],[prlotereview,box],[prlr,box],[prnewswire,company],[pro,plus],[p
roactive,plus],[probabilistic,n],[probability,one],[probable,one],[probably,plus],[probe,right],[probed,down],[probiotic,plant],[problem,xcorrection],[problematic,minus],[problematise,xcorrection],[problematised,xcorrection],[problems,a],[proboscis,right],[proc,plus],[procedural,right],[procedure,and],[procedures,right],[proceed,right],[proceeded,right],[proceedings,line],[proceeds,dollarsymbol],[process,line],[processcode,and],[processed,right],[processes,right],[processing,right],[processor,and],[processors,box],[processual,right],[proclaim,square],[proclaimed,plus],[proclamation,square],[procrastinator,person],[procreate,down],[procure,up],[procurement,right],[procures,hand],[prod,box],[prodigious,plus],[prodos,box],[prods,product],[produce,right],[produced,right],[producer,person],[producers,person],[produces,right],[producing,right],[product,leash],[production,right],[productions,box],[productive,plus],[productively,plus],[productivity,plus],[products,apple],[prof,person],[profane,minus],[profession,book],[professional,person],[professionalism,tick],[professionally,plus],[professionals,person],[professor,person],[professorial,person],[professors,person],[professorships,person],[proficient,plus],[profile,paper],[profiled,paper],[profit,plus],[profitability,dollarsymbol],[profitable,dollarsymbol],[profits,dollarsymbol],[proforma,paper],[profoundly,plus],[profs,person],[prog,square],[program,b],[programa,and],[programfinder,square],[programfindercall,right],[programmability,and],[programmable,and],[programmatic,and],[programme,book],[programmed,and],[programmer,person],[programmers,person],[programmes,book],[programming,and],[programs,square],[progress,right],[progressed,right],[progresses,right],[progressing,right],[progression,nrectangle],[progressions,line],[progressive,plus],[progs,square],[prohibited,xcorrection],[prohibitive,minus],[project,box],[projected,right],[projectile,ball],[projecting,square],[projection,square],[projections,up],[projector,right],[pr
ojectors,right],[projects,paper],[prolactin,],[prolapse,minus],[prolegomenon,square],[proletarian,person],[proletariats,person],[prolifically,n],[proline,],[prolog,paper],[prologtoplevel,up],[prologue,book],[prolong,right],[prolonged,n],[prolongs,n],[promenade,right],[prominence,plus],[prominent,plus],[prominently,plus],[promiscuity,minus],[promise,plus],[promised,plus],[promises,plus],[promising,plus],[promote,plus],[promoted,plus],[promoters,person],[promotes,plus],[promoting,plus],[promotion,up],[promotional,plus],[promotions,up],[prompt,square],[prompted,dash],[promptedread,square],[prompting,plus],[promptly,plus],[prompts,square],[prone,plus],[pronoun,person],[pronounce,mouth],[pronounced,mouth],[pronouncement,square],[pronouncing,tongue],[pronouns,person],[pronunciation,mouth],[proof,tick],[prooflistened,ear],[proofread,eye],[proofreader,person],[proofreading,paper],[proofs,paper],[prop,apple],[propaganda,minus],[propagate,right],[propagated,right],[propagating,right],[propagation,right],[propel,right],[propeller,],[propensity,plus],[proper,plus],[properly,plus],[properties,n],[property,dash],[prophet,person],[proponents,person],[proportion,n],[proportional,n],[proportionality,equals],[proportions,box],[proposal,plus],[proposals,paper],[propose,right],[proposed,paper],[proposes,plus],[proposing,plus],[proposition,square],[propositional,tick],[propositions,a],[propped,strut],[proprietary,dollarsymbol],[props,apple],[propulsion,right],[proquest,and],[pros,plus],[prose,paper],[prosecute,xcorrection],[prosody,note],[prospect,person],[prospected,plus],[prospective,plus],[prospects,plus],[prospectus,book],[prosperity,plus],[prosperous,dollarsymbol],[prostate,box],[prostheses,leg],[prosthetic,],[prostitutes,person],[prostitution,minus],[protect,plus],[protected,plus],[protectedness,plus],[protecting,plus],[protection,plus],[protectionism,plus],[protections,plus],[protective,plus],[protector,plus],[protectors,square],[protects,plus],[protein,ball],[proteins,protein],[
proto,plus],[protocol,line],[protocols,dash],[protonmail,paper],[prototype,software],[prototypes,and],[protractor,n],[protruded,right],[protrudes,right],[proud,plus],[proudly,plus],[proulx,person],[proust,person],[prove,right],[proved,plus],[proven,plus],[prover,plus],[proves,right],[provide,right],[provided,right],[provider,company],[providers,person],[provides,right],[providing,right],[province,square],[proving,right],[provision,right],[provisional,plus],[proviso,plus],[provoke,minus],[provoked,right],[provoking,plus],[proximal,n],[proximity,n],[proxy,dash],[prozacs,tablet],[prozesky,person],[prue,person],[prune,minus],[pruned,n],[prunes,box],[pruning,xcorrection],[présent,zero],[ps,dash],[psc,dash],[pseudo,square],[pseudonym,square],[pseudospecs,n],[psipsina,square],[psu,university],[psych,person],[psychiatric,brain],[psychiatrically,brain],[psychiatrist,person],[psychiatrists,person],[psychiatry,person],[psychic,person],[psychoanalysis,brain],[psychoanalyst,person],[psychoanalytic,brain],[psychographic,brain],[psychological,brain],[psychologically,brain],[psychologies,square],[psychologist,person],[psychologists,person],[psychology,paper],[psychopaths,person],[psychopathy,minus],[psychosis,minus],[pt,dash],[ptsd,minus],[pty,plus],[pub,plus],[public,paper],[publication,book],[publications,book],[publicisation,plus],[publicise,plus],[publicised,plus],[publicising,plus],[publicist,person],[publicity,paper],[publicly,plus],[publish,paper],[publishable,book],[published,book],[publisher,person],[publishers,company],[publishes,book],[publishing,book],[pudding,],[puddle,water],[puffin,],[puffs,air],[pugh,person],[puja,plus],[pull,left],[pulled,right],[pulling,right],[pulls,right],[pulp,],[pulse,square],[pummeled,down],[pummeling,down],[pummelled,hand],[pummelling,hand],[pump,box],[pumped,pump],[pumping,right],[pumpkin,],[pun,equals],[punch,magazine],[punct,square],[punctual,n],[punctuation,fullstop],[punish,minus],[punished,xcorrection],[punishment,minus],[punkt,dot],[p
up,],[pupils,person],[pupkebqf,dash],[puppet,],[puppets,puppet],[puppies,puppy],[puppy,],[purchase,dollarsymbol],[purchased,dollarsymbol],[purchasers,person],[purchases,dollarsymbol],[purchasing,dollarsymbol],[pure,box],[puree,],[purely,plus],[purified,plus],[purist,equals],[purity,plus],[purple,square],[purported,plus],[purporters,person],[purpose,dash],[purposeful,plus],[purposes,dash],[purse,coin],[pursue,right],[pursued,right],[pursuing,right],[pursuit,right],[pursuits,right],[purusha,air],[purushans,person],[purushas,person],[push,right],[pushed,right],[pushing,right],[pushkin,person],[puspita,person],[pussy,cat],[put,right],[putchar,square],[putin,person],[putonghua,book],[puts,right],[putstr,square],[putstrln,n],[putting,right],[puttogether,plus],[putvalue,box],[putvalues,down],[putvar,down],[puzzle,box],[puzzles,box],[pv,dash],[pvc,],[pw,square],[pwd,box],[pws,variable],[px,dash],[py,software],[pyes,tick],[pyramid,],[python,and],[pz,variable],[pǔtōnghua,book],[pǔtōnghuà,book],[q,dash],[qa,plus],[qadb,right],[qatar,country],[qb,box],[qbmusic,note],[qbpl,box],[qbrbqeiwal,dash],[qe,dash],[qi,dot],[qigong,book],[qigongbrocades,plus],[qigongmantra,square],[qigongsutra,square],[qingyu,person],[qm,dash],[qp,variable],[qrom,dash],[qs,square],[qsort,up],[qsp,dash],[qsuhxoccdwqavd,dash],[qsuxdqninknuubrqyba,dash],[qtdi,dash],[qu,square],[qua,down],[quadb,right],[quadcycles,car],[quaderni,fourrectangle],[quadrant,book],[quadrants,square],[quadratic,two],[quadrilateral,square],[quadroots,n],[quadruple,fourrectangle],[quake,minus],[qual,dash],[qualia,n],[qualification,paper],[qualificationism,dollarsymbol],[qualifications,paper],[qualified,plus],[qualify,plus],[qualifying,plus],[qualitative,n],[qualitatively,plus],[qualities,square],[quality,a],[qualitygurus,person],[qualms,minus],[quals,paper],[quantification,n],[quantified,n],[quantitative,n],[quantities,n],[quantitive,n],[quantity,n],[quantum,n],[quantumbox,box],[quantumlink,right],[quarrelled,minus],[quarryise,stone]
,[quarter,box],[quartered,box],[quarterly,n],[quarters,room],[quartile,square],[quartz,],[quasi,plus],[quasifontanaland,city],[quasilogics,f],[quasiquotation,square],[quason,plus],[quaver,note],[qubit,n],[queen,person],[queenie,person],[queenly,plus],[queens,person],[queensland,square],[queer,person],[quell,plus],[quench,water],[quenched,water],[quenching,water],[quentin,person],[queried,square],[querier,person],[queries,questionmark],[query,questionmark],[queryable,square],[queryb,square],[querying,square],[question,questionmark],[questionable,minus],[questioned,questionmark],[questioner,person],[questioning,questionmark],[questionmark,],[questionmode,right],[questionnaire,paper],[questionnaires,paper],[questions,questionmark],[queue,line],[queued,line],[queueing,line],[queues,threerectangle],[quick,n],[quicker,greaterthan],[quickly,plus],[quickness,n],[quicksort,up],[quicktime,and],[quiet,n],[quietly,zero],[quince,],[quinces,quince],[quincy,person],[quinlan,person],[quinn,person],[quinoa,],[quip,plus],[quips,plus],[quipu,string],[quipus,string],[quirk,minus],[quirks,minus],[quirky,minus],[quit,right],[quite,plus],[quits,left],[quitting,left],[quivers,right],[quiz,paper],[quizmaze,plus],[quizzed,paper],[quizzes,paper],[quo,person],[quoll,animal],[quora,company],[quorum,person],[quot,square],[quota,n],[quotation,square],[quote,square],[quoted,square],[quotes,square],[quoth,mouth],[quotient,n],[quoting,square],[qutie,dog],[qv,variable],[qvahhcqz,dash],[qwfmqsneubo,dash],[qx,variable],[qy,variable],[qz,variable],[r,box],[ra,dash],[raab,person],[rabbit,],[rabbits,rabbit],[rabia,person],[rabiashahid,person],[raboude,person],[rac,right],[race,right],[races,person],[rachel,person],[racial,person],[racing,right],[racism,minus],[racist,minus],[rack,],[racket,plus],[radial,circle],[radians,right],[radiation,],[radical,two],[radicalism,plus],[radically,plus],[radio,zline],[radioactive,minus],[radioairplay,company],[radios,right],[radish,],[radius,n],[rae,person],[raedt,person
],[raft,],[rag,],[rage,minus],[raggatt,person],[raggatts,person],[raichel,person],[rail,line],[rain,down],[rainbow,square],[rained,water],[rainforest,tree],[raining,water],[raise,up],[raised,up],[raises,up],[raisin,raisin],[raising,up],[raisins,raisin],[raison,square],[raj,person],[rajendra,person],[rake,],[raked,rake],[rally,person],[ralph,person],[ram,],[rambada,right],[ramifications,right],[ramirez,person],[ramp,],[ramping,up],[ramponponed,person],[ramzan,person],[ran,path],[ranamukhaarachchi,person],[rancid,minus],[rancière,person],[rand,n],[random,n],[randomfn,n],[randomfns,n],[randomisation,n],[randomise,n],[randomised,n],[randomises,n],[randomized,n],[randomlist,n],[randomly,n],[randomness,n],[randvars,variable],[rang,n],[range,line],[ranged,right],[ranger,person],[ranges,line],[ranging,n],[rango,dog],[rani,person],[ranjit,person],[rank,n],[ranked,n],[ranking,n],[rankings,n],[ranks,square],[ransacked,minus],[rape,minus],[rapid,n],[rapidly,n],[rapidssl,box],[rapidtables,dash],[rapport,plus],[rapprochements,plus],[raptures,plus],[rapturous,plus],[rapturously,plus],[rapunzel,person],[rard,dash],[rare,plus],[rarely,minus],[rash,minus],[rashti,person],[rasp,minus],[raspberries,raspberry],[raspberry,berry],[rasping,minus],[rastapopolous,person],[rata,plus],[rate,n],[rated,n],[ratepayers,person],[rates,n],[rathbowne,person],[rather,minus],[ratified,plus],[ratify,plus],[rating,n],[ratings,n],[ratio,divide],[rational,plus],[rationale,paper],[rationales,square],[rationalisation,plus],[rationalist,person],[rationality,plus],[rationalizations,plus],[rationalized,plus],[rationally,plus],[ratios,r],[raven,],[raw,one],[rawgit,box],[ray,line],[rayner,person],[rays,right],[raza,person],[razor,blade],[rb,dash],[rbv,tree],[rc,dash],[rcaw,square],[rcawmp,n],[rcawp,and],[rcawpa,square],[rcawpastea,square],[rcawpspec,paper],[rcmp,person],[rct,right],[rd,dash],[rdbc,dash],[re,square],[rea,dash],[reach,dot],[reachable,plus],[reached,right],[reachedness,plus],[reaches,dot],[reaching,
dot],[react,tick],[reacted,right],[reacting,square],[reaction,right],[reactions,right],[reactivity,plus],[read,book],[readability,eye],[readable,plus],[readd,eye],[readee,person],[reader,person],[readers,eye],[readfile,paper],[readily,plus],[readin,eye],[readiness,plus],[reading,book],[readinginstructions,eye],[readings,square],[readline,eye],[readme,paper],[readmejson,paper],[readmes,square],[reado,apple],[reads,book],[readsc,n],[readthedocs,eye],[readv,right],[ready,plus],[real,box],[realclean,plus],[realisation,plus],[realise,tick],[realised,plus],[realises,plus],[realising,tick],[realism,box],[realist,box],[realistic,plus],[realistically,plus],[realities,box],[reality,box],[really,plus],[realm,country],[realpeoplecastings,company],[realtime,zero],[realtimesync,and],[reap,right],[reappear,square],[reappears,plus],[reapplied,plus],[reapply,plus],[rear,left],[reared,plus],[rearrangement,n],[rearranging,right],[reason,square],[reasonable,square],[reasonableness,plus],[reasonably,plus],[reasoned,square],[reasoner,person],[reasoning,square],[reasonings,square],[reasons,square],[reassemble,box],[reassembled,plus],[reassess,plus],[reassign,equals],[reassigned,number],[reassurance,plus],[reassure,plus],[reassured,plus],[reassuring,plus],[reassuringly,plus],[reattached,plus],[reattaches,dash],[reattaching,plus],[reb,xcorrection],[rebecca,person],[rebelled,minus],[rebellious,minus],[rebelliousness,minus],[rebirth,zero],[reborn,zero],[rebound,right],[rebounder,u],[rebr,one],[rebrand,right],[rebreason,equals],[rebreasoned,equals],[rebreasoning,one],[rebreasonings,equals],[rebreathsoned,plus],[rebreathsoning,plus],[rebreathsonings,hand],[rebuilding,up],[rebuilt,building],[rebut,xcorrection],[rebuttal,xcorrection],[rebuttals,xcorrection],[rebutted,xcorrection],[rebutting,xcorrection],[rec,tick],[recalculate,plus],[recalculated,equals],[recalibrate,n],[recall,down],[recalling,plus],[recast,plus],[receipt,paper],[receipts,square],[receive,left],[received,left],[receiver,box],[re
ceivers,receiver],[receives,left],[receiving,left],[recent,left],[recently,minus],[receptacle,box],[reception,plus],[receptionist,person],[receptive,plus],[receptor,box],[recession,minus],[recharge,plus],[recheck,plus],[recipe,paper],[recipes,apple],[recipient,person],[recipients,person],[reciprocation,right],[reciprocity,right],[recite,mouth],[recited,mouth],[recklessness,minus],[recklinghausen,minus],[reclaimed,plus],[recline,down],[reclined,down],[reclining,down],[reclusive,minus],[recn,circle],[recog,plus],[recognisability,eye],[recognisable,tick],[recognise,eye],[recognised,eye],[recogniser,tick],[recognises,tick],[recognising,eye],[recognition,plus],[recognized,eye],[recognizing,plus],[recollected,plus],[recombined,down],[recommend,plus],[recommendation,plus],[recommendations,plus],[recommended,plus],[recommending,plus],[recommends,tick],[recompile,right],[recompiling,right],[recompute,plus],[recomputed,right],[reconceptualisation,square],[reconciled,plus],[reconciles,tick],[reconciliation,plus],[reconciling,plus],[reconfigurations,plus],[reconfigured,right],[reconnect,dash],[reconnected,dash],[reconnecting,equals],[reconsider,plus],[reconstruct,up],[reconstructed,up],[reconstructing,up],[reconstruction,up],[reconstructs,box],[recontinue,right],[reconvert,right],[reconverted,right],[record,square],[recorded,line],[recordequins,person],[recorder,paper],[recorders,line],[recording,square],[recordings,square],[records,down],[recounted,n],[recover,right],[recovered,right],[recovering,xcorrection],[recovery,plus],[recreate,plus],[recreated,plus],[recreating,plus],[recreation,plus],[recruit,up],[recruited,plus],[recruiter,person],[recruiters,person],[recruiting,up],[recruitment,plus],[rect,square],[rectangle,square],[rectangles,square],[rectangular,square],[rectified,plus],[rectify,xcorrection],[rectum,tube],[recuperate,plus],[recur,plus],[recurred,circle],[recurrent,n],[recurring,circle],[recurs,circle],[recurse,circle],[recursion,circle],[recursions,circle],[recur
sive,dash],[recursivecall,circle],[recursively,circle],[recursivety,circle],[recursor,circle],[recyclables,circle],[recycle,plus],[recycled,amount],[recycler,plus],[recycling,circle],[red,square],[redblackduck,duck],[redcoats,person],[reddit,company],[redeem,tick],[redefine,down],[redefined,square],[redefining,right],[redemption,tick],[redesign,pen],[redesigned,paper],[redesigning,plus],[redid,plus],[redirect,right],[redirected,right],[rediscover,eye],[redisplayed,square],[redistribute,right],[redistributed,right],[redistribution,right],[redistributions,right],[redo,two],[redoing,left],[redone,tick],[redraft,paper],[redrafted,paper],[redrafting,paper],[redrafts,paper],[redrawn,pen],[redress,right],[redshift,n],[reduce,minus],[reduced,minus],[reducer,down],[reduces,minus],[reducing,one],[reductio,down],[reduction,minus],[reductionism,down],[reductions,right],[reductive,down],[reductivist,down],[redundancy,minus],[redundant,minus],[reed,tube],[reedoei,dash],[reeds,plant],[reefman,person],[reenter,down],[reentered,right],[reentering,right],[rees,person],[ref,left],[refactoring,plus],[refer,right],[referee,person],[referees,person],[reference,right],[referenced,right],[references,square],[referencing,right],[referendum,paper],[referral,paper],[referred,right],[referrer,right],[referring,dash],[refers,dash],[refill,box],[refine,plus],[refined,down],[refinement,plus],[refinements,plus],[refining,plus],[refit,plus],[reflect,n],[reflected,line],[reflecting,n],[reflection,square],[reflections,right],[reflective,right],[reflector,right],[reflects,dot],[reflex,right],[reflexively,circle],[reflexivity,circle],[reflexology,foot],[reform,plus],[reformation,plus],[reformatted,square],[reforming,plus],[reforms,plus],[reformulation,right],[refound,plus],[refracted,n],[refraction,n],[refractive,n],[reframe,square],[reframing,square],[refreeze,zero],[refreezing,box],[refresh,plus],[refreshed,plus],[refreshes,plus],[refreshing,plus],[refreshments,water],[refrigerator,],[refs,square],[r
efuelling,fuel],[refuge,plus],[refugee,person],[refugees,person],[refund,dollarsymbol],[refunded,dollarsymbol],[refunds,right],[refused,minus],[refutations,xcorrection],[reg,plus],[regained,right],[regal,person],[regan,person],[regard,plus],[regarded,plus],[regarding,dash],[regardless,plus],[regards,dash],[regatta,boat],[regenerate,plus],[regenerated,plus],[regenerating,plus],[regeneration,plus],[regime,square],[regimen,paper],[regiment,person],[regimes,dash],[region,square],[regional,square],[regionally,square],[regions,square],[register,paper],[registeracompany,plus],[registered,n],[registering,square],[registeringcompany,company],[registers,plus],[registrar,person],[registration,plus],[registry,square],[regogitation,minus],[regress,left],[regression,down],[regrowing,up],[regular,one],[regularity,n],[regularly,n],[regulate,line],[regulated,plus],[regulating,line],[regulation,book],[regulations,square],[regulatory,plus],[regurgitated,up],[rehabilitation,plus],[rehearsal,paper],[rehearse,plus],[rehearsed,note],[rehearsing,plus],[rehse,person],[reid,person],[reification,plus],[reified,plus],[reigned,plus],[reigning,plus],[reimagine,plus],[reimplement,plus],[reincarnation,plus],[reine,equals],[reiner,n],[reinforce,box],[reinforced,plus],[reinforcement,plus],[reinforcing,plus],[reinitialise,zero],[reinsert,box],[reinserted,right],[reinstating,plus],[reinterpret,two],[reinterpreting,book],[reintroduce,plus],[reinvented,plus],[reishi,plant],[reject,xcorrection],[rejected,xcorrection],[rejecting,xcorrection],[rejection,xcorrection],[rejects,xcorrection],[rejoiced,plus],[rejoin,plus],[rejoined,right],[rejuvenate,plus],[rejuvenated,plus],[rel,dash],[relabelled,square],[relabelling,square],[relatable,plus],[relate,dash],[related,dash],[relates,dash],[relating,dash],[relation,dash],[relational,right],[relations,dash],[relationship,right],[relationships,dash],[relative,person],[relatively,minus],[relativeness,right],[relatives,person],[relativism,right],[relativity,right],[rel
ax,person],[relaxation,plus],[relaxed,person],[relaxes,down],[relaxing,plus],[relearning,plus],[release,right],[released,right],[releases,up],[releasing,right],[relegate,down],[relevance,equals],[relevant,tick],[relevantly,plus],[reliability,plus],[reliable,plus],[reliably,plus],[reliance,right],[reliant,plus],[relic,minus],[relied,right],[relief,plus],[relies,left],[relieve,plus],[relieved,plus],[relig,person],[religion,book],[religions,person],[religious,organisation],[relish,],[relishment,plus],[reload,plus],[reloaded,up],[reloading,up],[reloads,up],[relocate,right],[rels,dash],[reluctance,minus],[rely,right],[relying,left],[rem,brain],[remain,one],[remainder,n],[remainders,n],[remained,dot],[remaining,tworectangle],[remains,plus],[remap,right],[remark,square],[remarkable,plus],[remarks,square],[remastered,plus],[remediation,plus],[remedied,plus],[remedies,plus],[remember,brain],[remembered,plus],[remembering,right],[remembers,box],[remembrance,up],[remind,plus],[reminded,equals],[reminder,n],[reminders,square],[reminding,square],[reminds,square],[remix,note],[remixes,paper],[remnants,dot],[remorse,minus],[remote,dot],[remotehost,computer],[remotely,box],[remoteuser,person],[removal,minus],[removals,right],[remove,minus],[removebrackets,minus],[removebrackets,xcorrection],[removed,minus],[removenotice,xcorrection],[removenotrhyming,xcorrection],[remover,up],[removerepeatedterm,xcorrection],[removers,minus],[removes,xcorrection],[removetoolong,xcorrection],[removetoolongandnotrhyming,xcorrection],[removing,xcorrection],[remunerate,dollarsymbol],[remunerated,dollarsymbol],[remuneration,dollarsymbol],[remvdup,minus],[rena,person],[renaissance,square],[renal,kidney],[rename,right],[renamed,square],[renames,right],[renaming,right],[render,square],[rendered,square],[renderer,square],[renderh,note],[rendering,square],[renderline,line],[renderlineh,line],[renderlinerests,box],[renderlines,line],[renderm,note],[renders,square],[rendersong,square],[renderv,square],[renditi
ons,plus],[renegade,person],[renegades,person],[renew,plus],[renewable,plus],[renewal,plus],[renewed,plus],[renewing,plus],[renogitation,minus],[renovated,plus],[renowned,plus],[rent,dollarsymbol],[rentable,dollarsymbol],[rental,building],[rented,dollarsymbol],[renumber,n],[reopen,up],[reorder,n],[reordered,dash],[reordering,line],[reorders,tick],[reorganisation,plus],[reorganised,plus],[reorganize,plus],[reoutputted,n],[rep,square],[repair,plus],[repaired,plus],[repairing,plus],[repairs,plus],[repay,dollarsymbol],[repeat,n],[repeated,n],[repeatedly,n],[repeating,n],[repeatlastnote,n],[repeatlastnoteh,note],[repeats,n],[repel,xcorrection],[repellant,plus],[repels,xcorrection],[repetition,n],[repetitions,n],[repetitive,two],[replace,dash],[replaceable,plus],[replacecard,card],[replaced,dash],[replacement,dash],[replacements,dash],[replacer,right],[replaces,dash],[replacesynonyms,right],[replacing,right],[replay,line],[replayed,right],[replaying,plus],[repleteness,n],[repli,plus],[replic,plus],[replicant,person],[replicants,person],[replicat,company],[replicate,two],[replicated,plus],[replicates,plus],[replicating,plus],[replication,plus],[replications,plus],[replicator,box],[replicators,plus],[replied,square],[replies,square],[reply,a],[replying,left],[repo,box],[repoint,up],[report,paper],[reported,paper],[reportedly,plus],[reporter,person],[reporters,person],[reporting,paper],[reports,paper],[reportspam,xcorrection],[repos,box],[reposition,right],[repositioning,right],[repositories,box],[repository,paper],[repositoryname,square],[reposting,square],[reprehensible,minus],[represent,right],[representation,square],[representations,square],[representative,person],[representatives,person],[represented,square],[representing,right],[represents,square],[reprint,paper],[reprocesses,right],[reproduce,right],[reproducing,two],[reproduction,right],[reproductive,plus],[reprogrammed,and],[reps,square],[republic,country],[republican,person],[repurchase,dollarsymbol],[reputable,plu
s],[reputation,plus],[request,left],[requested,questionmark],[requester,person],[requesters,person],[requesting,square],[requests,left],[require,left],[required,left],[requirement,tick],[requirements,paper],[requires,left],[requiring,left],[requisite,plus],[requisites,right],[reran,right],[reranked,n],[rereading,eye],[rerun,plus],[rerunning,right],[res,paper],[reschedule,right],[rescued,plus],[rescuer,person],[rescuing,plus],[resealable,bag],[research,paper],[researchable,paper],[researched,paper],[researcher,person],[researchers,person],[researchgate,paper],[researching,tick],[reseller,person],[reselling,dollarsymbol],[resemble,plus],[resembled,equals],[resembling,right],[resend,right],[resent,right],[reservation,plus],[reserve,plus],[reserved,plus],[reservedness,plus],[reserves,box],[reset,zero],[resetcps,right],[resetcpstohere,dot],[resets,left],[resetting,zero],[residences,house],[resident,person],[residential,house],[residents,person],[resides,plus],[residing,house],[residue,box],[resilience,plus],[resist,xcorrection],[resistance,minus],[resistant,plus],[resists,xcorrection],[resolute,line],[resolutely,plus],[resolution,square],[resolutions,square],[resolve,tick],[resolved,plus],[resolves,tick],[resolving,plus],[resonance,n],[resonate,plus],[resonated,plus],[resonates,plus],[resonating,plus],[resort,building],[resounded,plus],[resource,box],[resourced,plus],[resources,box],[resourcesfulnesses,plus],[resp,box],[respecified,two],[respecify,two],[respect,plus],[respected,plus],[respectful,plus],[respecting,plus],[respective,box],[respectively,tworectangle],[respects,plus],[respiratory,lung],[respond,plus],[responded,square],[respondents,person],[responder,right],[responding,right],[responds,plus],[response,tick],[responses,tick],[responsibilities,paper],[responsibility,plus],[responsible,dash],[responsibly,plus],[responsive,plus],[responsiveness,plus],[rest,threebox],[restart,dot],[restarted,right],[restarting,right],[restated,square],[restating,left],[restaurant,
building],[restaurants,company],[rested,down],[resting,zero],[restlast,box],[restofgrammar,square],[restore,left],[restored,down],[restores,left],[restoring,plus],[restrain,xcorrection],[restraint,plus],[restrict,n],[restricted,box],[restrictions,minus],[restrictive,minus],[restricts,xcorrection],[restructure,box],[rests,box],[resubmission,up],[resubmit,right],[resubmitting,right],[result,n],[resulted,right],[resulting,equals],[results,a],[resume,paper],[resumes,paper],[resuming,right],[resupply,down],[resuscitate,plus],[resuscitated,plus],[resuscitating,up],[resynth,star],[resynthesis,star],[resynthesise,star],[resynthesised,star],[retail,dollarsymbol],[retailers,person],[retain,plus],[retained,plus],[retaining,plus],[retains,plus],[reteach,plus],[reteaches,square],[retention,plus],[retest,dot],[retested,plus],[rethink,square],[rethinking,brain],[retina,circle],[retired,plus],[retirement,n],[retouched,plus],[retract,left],[retractall,one],[retracted,left],[retracting,left],[retracts,left],[retrain,book],[retraining,right],[retranslate,right],[retranslated,right],[retransplantation,right],[retreat,building],[retreats,left],[retried,plus],[retries,plus],[retrievable,plus],[retrieval,plus],[retrieve,up],[retrieved,right],[retrieves,left],[retrieving,right],[retro,plus],[retry,tick],[retrying,plus],[return,left],[returnable,right],[returned,right],[returnedfromthailand,right],[returner,up],[returning,left],[returns,left],[retweet,plus],[retweeted,plus],[retyping,square],[reuploading,up],[reusable,plus],[reuse,tick],[reused,tick],[reuser,person],[reuses,tick],[reusing,tick],[rev,left],[reveal,plus],[revealed,plus],[revealing,box],[reveals,square],[revelation,plus],[revenue,dollarsymbol],[reverbnation,company],[revered,plus],[reverehealth,heart],[reversal,left],[reverse,threerectangle],[reversealgorithmfirstline,square],[reversealgorithmfourthline,square],[reversealgorithmsecondline,square],[reversealgorithmthirdline,square],[reversed,mirror],[reverses,left],[reversing,r
ight],[revert,left],[reverted,left],[reverter,right],[reverting,left],[review,paper],[reviewed,tick],[reviewer,person],[reviewers,person],[reviewing,paper],[reviews,paper],[revise,book],[revised,eye],[revising,plus],[revision,plus],[revisited,dot],[revitalising,plus],[revived,plus],[revoke,xcorrection],[revoked,xcorrection],[revolt,minus],[revolution,circle],[revolutionary,plus],[revolutionised,plus],[revolutions,circle],[revolve,plus],[revolved,right],[reward,plus],[rewarded,plus],[rewarding,plus],[rewards,plus],[reword,square],[reworded,square],[rewording,right],[rework,plus],[reworked,plus],[rewrite,pen],[rewriter,pen],[rewriters,plus],[rewrites,two],[rewriting,pen],[rewritten,paper],[rewrote,pen],[rex,king],[reyes,person],[reynardine,person],[reynolds,person],[rezai,person],[rf,variable],[rff,right],[rgb,n],[rhapsody,note],[rhesus,monkey],[rhetoric,square],[rhetorical,right],[rhetorically,right],[rhetorician,person],[rhino,rhinoceros],[rhinoceros,rhinoceros],[rhizomatic,dash],[rhizome,dash],[rhizomes,dash],[rhizomic,dash],[rhizomicity,dash],[rhoden,person],[rhodochrosite,stone],[rhodopsin,ball],[rhomboid,square],[rhs,right],[rhubarb,],[rhyme,equals],[rhymed,equals],[rhymes,equals],[rhyming,equals],[rhys,person],[rhythm,note],[rhythmic,note],[rhythms,note],[ri,dash],[riable,dash],[ribbon,],[ribbons,square],[rica,country],[rice,person],[rich,coin],[richard,person],[richardson,person],[richer,dollarsign],[richmond,suburb],[rick,person],[ricky,person],[ricocheting,right],[ricoeur,person],[rid,xcorrection],[ridden,minus],[ride,car],[ridge,box],[ridiculed,minus],[riding,right],[ridley,person],[rife,minus],[riff,note],[rig,book],[rigatoni,],[right,],[rightbracketnext,square],[rightful,plus],[rightmost,right],[rightness,tick],[rightnesses,plus],[rights,plus],[rightup,up],[rigid,box],[rigmaroles,plus],[rigor,plus],[rigorous,tick],[rigorously,plus],[rigorousness,tick],[rigour,plus],[riley,dog],[rim,box],[ring,],[ringing,n],[rinsed,water],[rinsing,water],[rip,person],[ripe
,plus],[ripenesses,plus],[rise,up],[rises,up],[rising,up],[risk,xcorrection],[risking,minus],[risks,minus],[risky,minus],[rissole,],[rita,person],[ritetag,company],[rittel,person],[rivalry,minus],[river,],[rivers,right],[rivista,person],[rizzacasa,person],[rizzo,person],[rl,dash],[rli,institute],[rlist,line],[rlr,right],[rlsatarget,dash],[rm,xcorrection],[rmit,building],[rmoldtmpfile,paper],[rms,n],[rn,dash],[rna,n],[rnh,company],[rnn,right],[rnns,right],[ro,right],[road,],[roadblock,minus],[roadhouse,house],[roads,road],[roadside,right],[roald,person],[roared,lion],[roast,n],[roasted,n],[roasting,n],[rob,person],[robbie,person],[robbins,person],[robe,],[robin,person],[robopod,],[robot,person],[robotic,person],[roboticist,person],[robotics,person],[robots,box],[robust,plus],[robyn,person],[rochelle,person],[rock,note],[rocked,plus],[rocket,],[rocking,right],[rocks,rock],[rococo,square],[rod,person],[rode,right],[rodrigues,person],[rods,rod],[rog,person],[rogaining,plus],[roger,person],[rohan,person],[roi,person],[roity,plus],[rold,dash],[role,person],[roleplay,paper],[roles,person],[roll,],[rolled,ball],[rolling,ball],[roman,person],[romania,country],[rome,city],[romeo,person],[romeos,restaurant],[ron,person],[ronson,person],[roof,],[rook,],[room,],[roomdict,book],[roomdifferencet,d],[roomdt,variable],[roomend,variable],[roomlength,n],[roomlengtht,n],[roompartname,square],[rooms,room],[roomstring,variable],[roomstringth,variable],[roomvalues,variable],[roomx,variable],[roomy,variable],[roomz,variable],[roosting,down],[root,dash],[roots,right],[rope,],[ropensci,software],[ropes,rope],[rorty,person],[rose,],[rosemary,person],[rosemont,person],[rosenberg,person],[rosenkranz,person],[roses,rose],[rosettacode,and],[rosin,],[ross,person],[rossman,person],[rosuly,rose],[rosy,rose],[rot,circle],[rota,company],[rotarian,person],[rotarians,person],[rotary,building],[rotaryprahran,company],[rotate,semicircle],[rotated,arc],[rotating,right],[rotation,right],[rote,plus],[rotorac
t,dash],[rots,right],[rotunda,],[rough,minus],[roughly,minus],[round,dot],[rounded,circle],[rounding,down],[rounds,right],[roundtable,table],[roused,plus],[rousing,plus],[rousseau,person],[route,right],[routenote,company],[router,line],[routes,right],[routine,paper],[routinely,plus],[routines,and],[routining,left],[routledge,book],[roux,],[row,line],[rowdy,minus],[rowed,right],[rower,person],[rowing,right],[rows,line],[royal,king],[royalmelbournehospital,hospital],[royals,person],[royalties,dollarsymbol],[royalty,person],[rpg,plus],[rpl,tick],[rpv,company],[rqidlbhwp,dash],[rr,dash],[rrs,variable],[rs,variable],[rsa,tick],[rst,dash],[rsvp,left],[rsync,right],[rtch,dot],[rtcmiux,dash],[rtf,paper],[rto,school],[ru,paper],[rub,down],[rubbed,down],[rubber,box],[rubbing,equals],[rubbish,minus],[rubinstein,person],[rubric,v],[rubrics,square],[ruby,],[rudd,person],[rude,minus],[rudeness,minus],[rudimental,box],[rudolf,person],[rue,dash],[rug,],[rugby,ball],[ruishihealthcare,plus],[rule,plus],[rulea,square],[ruled,plus],[rulename,square],[ruler,],[rules,and],[rulesx,plus],[ruling,plus],[ruminating,minus],[rumination,minus],[rummaged,down],[run,path],[runa,tick],[runabout,box],[runall,right],[runb,tick],[rune,tick],[rung,],[rungrammarly,software],[rungs,rung],[runhosting,company],[runn,dash],[runne,tick],[runner,person],[runners,right],[running,shoe],[runny,water],[runs,path],[runtime,n],[runz,tick],[rural,paddock],[rush,right],[rushing,right],[russell,person],[russia,country],[russian,paper],[rut,box],[ruth,person],[ruts,box],[rv,dash],[rvawki,dash],[rvs,dash],[rw,dash],[rwanda,country],[rxcawp,and],[ryan,person],[ryde,person],[rye,bread],[s,box],[sa,dash],[saas,dollarsymbol],[sabalcore,company],[sac,plus],[saccharine,sugar],[sack,],[sacked,xcorrection],[sacking,xcorrection],[sacrament,paper],[sacramento,city],[sacred,plus],[sacrifice,minus],[sacrifices,minus],[sacrificing,xcorrection],[sacrosanctnesses,plus],[sacrosancts,plus],[sad,tear],[sadface,],[sadler,person],[sadly,m
inus],[sadness,],[safari,and],[safe,a],[safeguard,plus],[safeguarding,plus],[safeguards,plus],[safely,plus],[safer,plus],[safest,tick],[safety,plus],[sage,person],[sages,person],[saggy,down],[saharan,sand],[said,speechballoon],[sail,boat],[sailed,right],[sailor,person],[sailors,person],[sails,sail],[saint,person],[saints,person],[sake,plus],[sal,variable],[salad,tomato],[salami,],[salaries,dollarsymbol],[salary,dollarsymbol],[saldana,person],[sale,dollarsign],[saleable,dollarsymbol],[salek,person],[salem,person],[salerno,city],[sales,dollarsign],[salespeople,person],[salesperson,person],[salivary,liquid],[salivating,water],[sally,person],[salmon,square],[salome,person],[salon,building],[salt,],[saltaté,right],[salted,salt],[salts,crystal],[salty,salt],[salute,plus],[saluted,plus],[salvador,country],[salvage,plus],[sam,person],[samad,person],[samadhi,plus],[samantha,person],[samaritan,person],[samaritans,person],[samatha,plus],[same,equals],[sameexp,equals],[sameexperience,dot],[sameness,equals],[samenesses,equals],[samepn,equals],[samoa,country],[samovar,tea],[sample,box],[samplereport,paper],[samples,n],[samplestock,box],[sampling,n],[samsung,tablet],[samwalrus,person],[san,city],[sanctioning,plus],[sanctity,plus],[sand,],[sandals,],[sandbox,box],[sandelowski,person],[sandiford,person],[sandpaper,],[sandra,person],[sands,person],[sandstone,],[sandwich,],[sandwiches,sandwich],[sandy,person],[sane,plus],[sang,note],[sanitary,square],[sanity,plus],[sank,down],[sanskrit,square],[sansserifprint,box],[santa,city],[santo,person],[santos,person],[sao,country],[sap,],[sapient,plus],[sapling,tree],[saplings,tree],[sapub,dash],[sara,person],[sarah,person],[sarasota,person],[sarkozys,person],[sarte,person],[sasnett,person],[sassmaiden,person],[sat,seat],[satanism,religion],[satchwell,person],[satell,person],[satellite,],[satiated,plus],[satisfaction,plus],[satisfactorily,plus],[satisfactory,plus],[satisfiability,one],[satisfiable,plus],[satisfied,plus],[satisfies,tick],[satisf
y,tick],[satisfying,tick],[saturated,water],[saturday,square],[sauce,],[saucepan,],[saucer,],[saucers,saucer],[sauces,sauce],[saucy,plus],[saudi,country],[sauer,person],[saunders,person],[sausage,],[sav,eye],[savant,person],[savants,person],[save,plus],[saveable,down],[saved,tick],[saver,plus],[savers,plus],[saves,tick],[saving,plus],[savings,plus],[savoured,plus],[savoury,mayonnaise],[savvy,company],[saw,eye],[sawtooth,square],[sawyer,person],[sax,tube],[saxophone,],[say,speechballoon],[sayer,person],[sayers,person],[saying,mouth],[says,speechballoon],[sb,dash],[sbr,paper],[sc,dash],[scaffolding,box],[scalability,tworectangle],[scalable,tworectangle],[scale,nrectangle],[scaled,up],[scaler,n],[scales,up],[scaling,n],[scammer,minus],[scams,minus],[scan,eye],[scandizzo,person],[scanf,left],[scanned,square],[scanner,right],[scanning,eye],[scarborough,city],[scarcity,minus],[scared,minus],[scarf,],[scarring,minus],[scary,minus],[scattered,down],[scavenger,person],[scenario,box],[scenarios,box],[scene,square],[scenes,box],[scenic,tree],[scent,perfume],[scented,nose],[schachte,person],[schapzazzure,plus],[schedule,square],[scheduled,dash],[scheduler,n],[schedules,square],[scheduling,square],[schema,square],[scheme,right],[schemes,square],[schepisi,person],[schisandra,plant],[schizophrenia,minus],[schizophrenic,minus],[schmitt,person],[scholar,person],[scholarising,person],[scholarly,person],[scholars,person],[scholarship,dollarsymbol],[scholarships,down],[schonfelder,person],[school,building],[schoolmaster,person],[schools,building],[schoolyard,],[schroeter,person],[schudmack,person],[schumacher,person],[schwarz,person],[schwerdt,person],[sci,testtube],[scicluna,person],[science,testtube],[scienced,testtube],[sciencedirect,company],[sciences,testtube],[scientific,and],[scientifically,testtube],[scientist,person],[scientists,person],[scintillated,plus],[scintillation,plus],[scissors,],[scone,],[scones,scone],[scoop,],[scooped,scoop],[scooping,down],[scoops,ball],[scope,das
h],[scopes,dash],[scopic,up],[scopus,dot],[score,n],[scored,n],[scores,n],[scoring,n],[scorn,minus],[scorned,minus],[scorning,minus],[scots,person],[scott,person],[scottjeffrey,person],[scoundrels,person],[scourer,],[scouring,down],[scout,person],[scouted,eye],[scp,two],[scrap,xcorrection],[scraped,right],[scraper,],[scraping,right],[scrapped,down],[scratch,line],[scratcher,person],[scratching,down],[scratchy,minus],[scrawn,minus],[screams,line],[screen,square],[screening,square],[screens,square],[screenshot,square],[screenshots,square],[screw,],[screwdriver,],[screwed,down],[scribbled,pen],[scribbling,pen],[scripsi,book],[script,paper],[scripted,paper],[scripting,paper],[scripts,paper],[scriptural,book],[scripture,book],[scriptures,book],[scriptwriter,person],[scrn,square],[scrnsong,note],[scrnsongs,note],[scroll,paper],[scrolling,square],[scrotum,ball],[scrubbing,down],[scrumptious,plus],[scrutinise,tick],[scrutiny,plus],[scuba,air],[scull,person],[sculpted,chisel],[sculptor,person],[sculpture,],[scuola,building],[sdc,company],[sdcinternetics,company],[sde,air],[sdk,box],[se,dash],[sea,water],[seahorse,],[seal,],[seale,person],[sealed,plus],[sealing,equals],[seamless,plus],[seamlessly,line],[seances,dash],[search,square],[searchable,plus],[searched,eye],[searchers,person],[searches,square],[searching,eye],[seashells,shell],[seashore,line],[season,square],[seasons,square],[seat,],[seated,seat],[seats,seat],[seaweed,plant],[sebastien,person],[sec,line],[secateurs,],[secolsky,person],[second,two],[seconda,line],[secondary,two],[secondhen,duck],[secondly,n],[secondness,two],[seconds,one],[secondsvaluea,variable],[secondsvalueb,variable],[secondvar,variable],[secondvarname,square],[secondvarparent,variable],[secondvarparentname,square],[secrecy,xcorrection],[secret,questionmark],[secretary,person],[secreted,right],[secretly,plus],[secrets,square],[secs,n],[sectarian,box],[sectest,plus],[sectichords,n],[section,square],[sectional,box],[sections,square],[sector,box],[sec
tors,circle],[secular,plus],[secularism,dot],[secularity,zero],[secularly,plus],[secure,plus],[secured,plus],[securely,plus],[securepay,company],[security,plus],[sedate,minus],[see,square],[seeable,eye],[seed,],[seedlings,seed],[seeds,seed],[seeing,eye],[seek,left],[seeker,person],[seekers,person],[seeking,eye],[seeks,hand],[seem,tick],[seemed,plus],[seeming,plus],[seemingly,plus],[seems,plus],[seen,eye],[seep,down],[seer,person],[sees,eye],[seesaw,line],[segment,line],[segmentation,square],[segmented,box],[segments,box],[segs,variable],[segunda,box],[segv,square],[sehj,dash],[seiende,plus],[seize,hand],[seizing,hand],[select,dash],[selected,plus],[selectee,person],[selecting,tick],[selection,square],[selections,square],[selective,plus],[selector,person],[selectpositions,n],[selects,square],[selecttouching,equals],[selena,person],[self,person],[selfcontrol,and],[selfhood,person],[selfie,square],[selfish,minus],[selfishness,minus],[selflessness,plus],[sell,dollarsign],[sellar,person],[seller,person],[sellers,person],[selling,dollarsymbol],[sells,dollarsymbol],[selves,person],[selye,person],[semantic,dash],[semantics,square],[semblance,plus],[semester,square],[semesters,square],[semi,box],[semicircle,],[semicolon,square],[semidet,square],[semily,n],[seminal,tube],[seminar,room],[seminars,room],[semitones,square],[semiz,person],[semler,person],[semolina,],[semver,n],[sen,square],[send,right],[sender,person],[sending,right],[sends,right],[sendspace,company],[senegal,country],[senior,dot],[senorita,person],[sensable,plus],[sensation,plus],[sensations,finger],[sense,eye],[sensed,eye],[senses,eye],[sensible,plus],[sensing,plus],[sensitive,hand],[sensitively,plus],[sensitivities,plus],[sensitivity,plus],[sensor,eye],[sensors,eye],[sensory,finger],[sent,right],[sentence,box],[sentencechars,square],[sentenceendpunctuation,fullstop],[sentences,paper],[sentenceswordsfrompos,square],[sentencewithspaces,square],[sentencewordsfrompos,square],[sentience,plus],[sentient,plus],[senti
ently,plus],[sentiment,plus],[sents,paper],[seo,plus],[seoclerk,company],[seoclerks,company],[sep,square],[sepandpad,square],[separate,two],[separated,box],[separately,tworectangle],[separateness,box],[separates,box],[separating,box],[separation,box],[september,square],[septuagenarian,seventyrectangle],[septum,box],[sequel,book],[sequence,n],[sequencer,square],[sequences,n],[sequencing,and],[sequential,right],[sequin,],[sequitur,right],[sera,person],[serbia,country],[serene,plus],[serengitis,minus],[serial,n],[series,square],[serious,tick],[seriously,plus],[seriousness,plus],[serotonin,ball],[servant,person],[servants,person],[serve,right],[served,plus],[server,right],[servers,computer],[serves,hand],[service,plus],[services,plus],[servicing,plus],[serviette,],[serving,plate],[ses,plus],[sesame,seed],[sesquicentenary,n],[session,square],[sessionaim,square],[sessions,paper],[set,a],[setarg,n],[setback,minus],[setof,box],[sets,fiverectangle],[settable,equals],[settee,],[setting,box],[settings,n],[settle,plus],[settled,person],[settlement,plus],[settler,person],[settlers,person],[settling,down],[setton,person],[setup,plus],[seven,sevenrectangle],[sevenfold,box],[sevenrectangle,],[seventh,sevenrectangle],[seventyrectangle,seventyrectangle],[several,n],[severe,n],[severed,minus],[severely,minus],[sew,right],[sewage,ball],[sewed,right],[sewing,thread],[sewn,right],[sex,person],[sexes,person],[sexily,plus],[sexism,minus],[sexist,minus],[sexual,down],[sexualities,person],[sexuality,n],[sexually,plus],[sexy,plus],[seychelles,country],[sf,dash],[sfile,paper],[sfu,university],[sgetcode,and],[sh,software],[shabbanhykenever,box],[shade,square],[shaded,shade],[shades,square],[shadow,square],[shadowing,square],[shadowkill,person],[shadows,square],[shaft,line],[shah,person],[shakabanga,dash],[shake,cup],[shaker,],[shakers,shaker],[shakespeare,person],[shakespearean,person],[shaking,up],[shakuhachi,box],[shall,right],[sham,minus],[shame,minus],[shamisen,box],[shampoo,],[shamrock,per
son],[shanai,box],[shandy,book],[shane,person],[shank,person],[shape,diamond],[shaped,box],[shapes,square],[shaping,ball],[share,right],[shared,square],[shareholder,person],[shareholders,person],[shares,square],[sharing,square],[shark,],[sharon,person],[sharp,dot],[sharpe,person],[sharpen,plus],[sharpened,plus],[sharpener,blade],[sharpening,dot],[sharpness,plus],[shauna,person],[shave,square],[shaved,hair],[shaver,],[shaving,right],[shavings,line],[she,person],[sheared,right],[shearing,down],[shed,],[sheds,shed],[sheep,],[sheer,plus],[sheeran,person],[sheet,paper],[sheetofpaper,paper],[sheets,paper],[sheila,person],[shelf,],[shell,box],[shelled,box],[shelley,person],[shelling,plane],[shells,dash],[shells,line],[shelter,],[sheltered,square],[shelves,shelf],[shen,person],[shepherd,dog],[shetgovekar,person],[sheu,person],[shevardnadzes,plus],[shibboleth,plus],[shield,square],[shields,square],[shift,square],[shifting,right],[shifts,square],[shilito,person],[shimbun,newspaper],[shine,down],[shines,star],[shining,star],[shiny,square],[ship,],[shipped,right],[shipping,square],[ships,ship],[shirley,person],[shirt,],[shirts,shirt],[shirtsleeve,],[shitsued,plus],[shmoop,dash],[shoal,coral],[shock,minus],[shocking,minus],[shoe,],[shoelaces,],[shoes,shoe],[shone,light],[shook,minus],[shoot,equals],[shooting,box],[shoots,up],[shop,dollarsymbol],[shopper,person],[shoppers,person],[shopping,apple],[shops,dollarsymbol],[shore,],[short,dash],[shortage,minus],[shortcake,],[shortcomings,minus],[shortcourses,line],[shortcut,right],[shortcuts,right],[shorten,n],[shortened,n],[shortening,minus],[shorter,n],[shortest,n],[shorthand,square],[shortlist,square],[shortlisted,plus],[shortlisting,n],[shortly,n],[shortness,n],[shorttree,tree],[shot,square],[shotgun,minus],[shou,plant],[should,dash],[shoulder,],[shoulders,],[shouldn,xcorrection],[shouted,square],[show,square],[showalter,person],[showbiz,square],[showcase,plus],[showed,finger],[shower,water],[showered,water],[showering,water],[show
ers,water],[showing,square],[shown,square],[showpaperpdf,paper],[showreel,square],[shows,stage],[shredded,down],[shredding,up],[shrewd,plus],[shse,company],[shuffle,down],[shulman,person],[shunted,right],[shut,minus],[shutter,plane],[shuttle,spacecraft],[shuttlecock,],[shuttlecocks,shuttlecock],[shy,minus],[si,variable],[sibling,person],[siblings,person],[sic,tick],[siccen,plus],[sichians,person],[sick,minus],[sickness,minus],[sicling,plus],[side,square],[sided,dot],[sides,square],[sidestepping,right],[sideways,right],[sierra,country],[siesta,bed],[sieve,],[sieved,sieve],[sieving,sieve],[sifted,down],[sifter,down],[sig,square],[sigcrashhandler,plus],[sigcse,dash],[siggets,square],[siggraph,software],[sight,eye],[sighted,eye],[sighting,eye],[sights,eye],[sign,square],[signage,square],[signal,one],[signaled,signal],[signaling,right],[signalled,square],[signalling,right],[signals,plus],[signature,a],[signatures,tick],[signed,square],[significance,plus],[significances,plus],[significant,plus],[significantly,plus],[signification,plus],[signified,right],[signifier,square],[signifies,plus],[signify,right],[signifying,dash],[signing,square],[signorelli,person],[signpost,square],[signposted,signpost],[signposting,square],[signposts,signpost],[signs,square],[signup,square],[signups,plus],[sigtramp,square],[silence,zero],[silent,zero],[silently,zero],[silicon,],[silk,],[silly,minus],[silo,cylinder],[silos,box],[silva,person],[silver,ball],[sim,box],[similar,equals],[similarities,equals],[similarity,equals],[similarly,equals],[simile,equals],[simmered,n],[simmo,person],[simon,person],[simone,person],[simple,a],[simplephilosophy,square],[simpler,down],[simplest,dot],[simplicity,dot],[simplification,one],[simplifications,minus],[simplified,one],[simplifies,down],[simplify,dot],[simplifying,one],[simplistic,plus],[simply,plus],[simpsons,person],[simulacrum,box],[simulate,box],[simulated,and],[simulates,and],[simulating,box],[simulation,box],[simulationees,person],[simulations,box]
,[simulator,and],[simulatory,square],[simultanenously,equals],[simultaneous,one],[simultaneously,two],[simut,person],[sin,minus],[since,dot],[sincere,plus],[sincerely,plus],[sincerity,plus],[sine,function],[sing,note],[singapore,country],[singer,person],[singh,person],[singing,note],[single,one],[singles,one],[singleton,one],[singletons,one],[singly,box],[singular,one],[sink,box],[sinkhole,hole],[siphoned,down],[sipped,water],[sipping,water],[sir,person],[siri,and],[sister,person],[sit,seat],[sitar,],[site,paper],[sitemap,paper],[sites,paper],[sitkin,person],[sitting,],[situated,dot],[situation,square],[situational,plus],[situations,box],[sity,box],[siva,person],[six,],[sixrectangle,],[sixteen,n],[sixth,sixrectangle],[sixty,n],[size,n],[sized,ruler],[sizes,n],[sizzling,plus],[sjgrant,person],[skeletal,bone],[skeleton,],[skelter,right],[skene,person],[sketch,square],[sketched,pen],[sketching,pen],[skewed,minus],[skewer,],[skewered,skewer],[skewering,skewer],[skewers,skewer],[ski,],[skianis,person],[skies,box],[skiied,ski],[skiing,right],[skill,tick],[skilled,plus],[skillful,plus],[skillman,person],[skills,a],[skim,box],[skimmed,box],[skin,square],[skip,two],[skipped,right],[skipping,right],[skips,right],[skittle,],[skivvy,],[skull,],[skuuudle,company],[sky,square],[skye,dog],[skyeae,person],[skyrocket,up],[skyscraper,building],[sl,variable],[sla,paper],[slack,minus],[slander,minus],[slang,minus],[slanted,right],[slap,hand],[slash,],[slate,square],[slated,plus],[slave,person],[slavery,minus],[slaves,person],[sld,up],[sleek,square],[sleep,],[sleepily,plus],[sleeping,bed],[sleepinterval,minus],[sleeps,bed],[sleepy,minus],[sleeve,],[sleeved,sleeve],[sleeves,sleeve],[slept,bed],[slg,plus],[slice,box],[sliced,box],[slicer,down],[slices,box],[slicing,knife],[slid,right],[slide,down],[slider,n],[sliders,right],[slides,square],[sliding,right],[slight,n],[slightly,dash],[slingshot,],[slip,minus],[slipped,down],[slippery,water],[slipping,right],[slope,up],[slopes,up],[slot,hole
],[slotted,right],[sloughed,right],[slovakia,country],[slovenia,country],[slow,n],[slowdown,down],[slowed,minus],[slower,lessthan],[slowing,line],[slowly,zero],[slowness,n],[slp,box],[slpconverter,right],[slug,],[sm,right],[small,amount],[smaller,lessthan],[smallest,n],[smarbangers,sausage],[smart,tick],[smarter,plus],[smartphone,phone],[smartphones,phone],[smartwatch,],[smedley,person],[smell,rose],[smelled,nose],[smelling,nose],[smells,nose],[smelt,nose],[smile,],[smiled,face],[smiley,face],[smiling,mouth],[smith,person],[smithereen,plus],[smock,],[smoke,],[smoked,smoke],[smoking,minus],[smooth,line],[smoothed,plus],[smoother,plus],[smoothie,cup],[smoothies,cup],[smoothing,down],[smoothly,line],[smorgasbord,bread],[smp,and],[sms,right],[smurf,person],[smv,right],[smythe,person],[sn,variable],[snack,apple],[snail,],[snake,],[snakes,snake],[snap,box],[snaps,dot],[snapshot,square],[snapshots,paper],[sneakers,shoe],[sneeze,nose],[sniff,nose],[sniffing,nose],[sniper,dot],[snooker,ball],[snow,],[snowflake,],[snowy,duck],[snuff,],[snuffed,xcorrection],[snuggled,down],[snyder,person],[so,right],[soaked,water],[soap,],[soapie,square],[soc,book],[socatoan,circle],[soccer,ball],[soci,book],[social,person],[socialisation,plus],[socialise,plus],[socialised,plus],[socialising,plus],[socialism,equals],[socialist,person],[socialists,person],[socially,person],[societal,person],[societally,person],[societies,person],[societol,book],[societological,room],[societologically,person],[societologist,person],[societology,person],[society,person],[socio,person],[sociocultural,plus],[socioeconomic,dollarsymbol],[sociologia,person],[sociological,book],[sociology,book],[sociologygroup,person],[sock,sock],[socks,sock],[socrates,person],[socratic,person],[soda,bottle],[sodhi,person],[sofa,],[soft,square],[softened,down],[softening,down],[softens,down],[softer,plus],[softly,n],[softness,plus],[software,square],[soh,dot],[soil,],[soimort,person],[sol,plus],[sola,ball],[solar,sun],[sold,dollarsign
],[soldier,person],[soldiers,person],[soldiership,person],[sole,],[solely,one],[soles,],[solfatonotes,note],[solfege,note],[solfegenotetonote,right],[solicitude,minus],[solid,box],[solidarity,plus],[solids,box],[soliloquy,paper],[solipsism,minus],[solitude,one],[solo,one],[solomon,country],[solos,person],[sols,n],[solstices,n],[solution,plus],[solutions,tick],[solvable,plus],[solve,tick],[solved,tick],[solver,equals],[solves,plus],[solving,tick],[soma,body],[somali,person],[somalia,country],[some,one],[somebody,person],[someday,n],[somehow,dash],[someone,person],[somersault,circle],[something,one],[sometime,n],[sometimes,one],[someway,minus],[somewhat,plus],[somewhere,dot],[son,person],[sonar,square],[song,square],[songs,book],[songwriter,person],[songwriting,note],[sonic,note],[sonorous,plus],[sons,person],[soon,n],[sooner,greaterthan],[soothing,plus],[sophia,person],[sophie,person],[sophisticated,plus],[sophistication,plus],[soprano,person],[sore,minus],[soreness,minus],[sorgenti,person],[sorrow,minus],[sorry,xcorrection],[sort,a],[sortbyx,line],[sortcn,x],[sortcna,x],[sortcnremvdup,right],[sorted,line],[sorting,right],[sorts,line],[sostalgia,minus],[sostalgic,minus],[sostalgically,minus],[soufflé,],[sought,plus],[soul,],[soulful,plus],[souls,person],[sound,note],[soundcloud,company],[sounded,note],[sounding,dash],[soundly,plus],[soundness,plus],[sounds,note],[soundtrack,line],[soundtracks,note],[soup,bowl],[sour,minus],[source,square],[sourced,plus],[sources,dot],[sourcing,book],[sourness,lemon],[sousa,person],[south,down],[southampton,suburb],[southern,down],[southgate,person],[southwest,right],[souvenir,book],[soviet,country],[sowed,seed],[sown,down],[sowwell,person],[soy,glass],[sp,paper],[space,box],[spacecraft,],[spaced,box],[spaceport,building],[spaces,space],[spaceship,spacecraft],[spacetime,line],[spaceweb,paper],[spacing,n],[spadaccini,person],[spade,],[spades,box],[spaghetti,],[spain,country],[spam,box],[span,line],[spaniel,dog],[spanish,paper],[spare,p
lus],[spared,plus],[sparing,plus],[sparingly,plus],[spark,box],[sparks,dot],[spars,right],[sparse,minus],[spartacus,person],[spatial,box],[spatiality,box],[spatially,box],[spatiotemporal,dot],[spatula,],[spatulas,spatula],[speak,speechballoon],[speakable,mouth],[speaker,person],[speakers,person],[speaking,mouth],[speaks,mouth],[spear,],[speared,spear],[spearhead,right],[spearing,spear],[spec,paper],[special,plus],[specialisation,tick],[specialisations,plus],[specialise,plus],[specialised,down],[specialises,plus],[specialising,dot],[specialism,plus],[specialisms,book],[specialist,person],[specialists,person],[speciality,book],[specialization,line],[specialized,plus],[specially,plus],[speciation,dash],[species,animal],[specific,tick],[specifically,one],[specification,paper],[specifications,dash],[specificity,one],[specified,tick],[specifier,plus],[specifiers,square],[specifies,square],[specify,paper],[specifying,n],[specimen,box],[speckles,dot],[specs,paper],[specstage,right],[specstages,dot],[spectacles,glasses],[spectacularly,plus],[spectrum,line],[speculated,plus],[speculation,plus],[sped,right],[speech,mouth],[speechballoon,],[speeches,square],[speechwriter,person],[speed,n],[speedily,n],[speeding,n],[speeds,n],[spel,square],[spell,square],[spellbinding,plus],[spelled,square],[speller,person],[spelling,square],[spells,square],[spelt,square],[spencer,person],[spend,dollarsymbol],[spenders,person],[spending,dollarsymbol],[spent,dollarsymbol],[sperm,],[spf,n],[sphere,ball],[spheres,ball],[spherical,sphere],[sphermulated,ball],[sphinx,],[sphygmomanometer,],[spiccato,box],[spice,plant],[spiced,up],[spicer,person],[spider,],[spiel,square],[spike,n],[spill,down],[spilling,water],[spin,top],[spinach,],[spinal,spine],[spindle,line],[spine,],[spinner,circle],[spinoza,person],[spiral,circle],[spiraled,spiral],[spiralling,circle],[spirit,person],[spirits,glass],[spiritual,person],[spiritualism,dash],[spirituality,plus],[spiritually,tick],[spitting,amount],[splash,water],[spla
shed,water],[splatted,minus],[splayed,square],[spleen,],[splices,box],[split,two],[splitfurther,box],[splitfurthert,box],[splitintosent,box],[splitintosentences,box],[splitonphrases,box],[splits,two],[splitting,two],[spoiled,minus],[spoke,speechballoon],[spoken,speechballoon],[spokes,line],[spokespeople,person],[spokesperson,person],[sponge,],[sponged,sponge],[sponges,sponge],[sponsor,person],[sponsored,dollarsymbol],[sponsors,person],[sponsorships,dollarsymbol],[spontaneous,zero],[spool,],[spoolos,person],[spoon,],[spooned,spoon],[spoonerisms,square],[spoonful,spoon],[spoons,spoon],[spoorthi,person],[sport,ball],[sporting,ball],[sports,ball],[spot,tick],[spotify,company],[spotless,and],[spotlight,light],[spots,dot],[spotted,eye],[spout,],[spray,right],[sprayed,down],[spread,right],[spreading,right],[spreads,right],[spreadsheet,plus],[spreadsheets,square],[spreams,right],[sprig,leaf],[spring,],[springcm,dash],[springer,book],[springs,water],[sprinkler,water],[sprint,right],[sprints,right],[sprites,square],[spurned,plus],[spurns,plus],[spy,eye],[sq,multiplication],[sql,software],[sqrt,line],[sqrtpm,square],[squadron,person],[square,],[squared,square],[squares,square],[squash,],[squat,down],[squatted,down],[squatting,down],[squeezed,square],[squeezer,],[squeezing,hand],[squelch,orange],[squelching,orange],[squib,minus],[squirted,right],[squirter,right],[squirting,right],[squished,down],[squishing,down],[src,paper],[sread,eye],[sree,person],[sri,country],[srl,company],[srv,computer],[ss,s],[ssaged,dash],[ssh,software],[ssi,right],[ssivoziruuezia,dash],[ssiws,paper],[sskip,dash],[ssl,line],[ssn,dash],[st,dash],[stabbed,down],[stabbing,minus],[stabilise,plus],[stability,box],[stable,box],[stably,box],[stacey,person],[stack,box],[stacked,box],[stackexchange,paper],[stacking,box],[stackoverflow,paper],[stacks,box],[staff,person],[stafford,company],[stage,],[stagecraft,stage],[staged,n],[stages,dot],[stagger,dash],[staggered,box],[staggering,dash],[staging,plus],[stagnation
,minus],[staining,ink],[stainless,zero],[stairs,step],[stake,],[staked,stake],[stakeholder,person],[stakeholders,person],[stakes,plus],[staley,person],[stalin,person],[stalk,],[stall,stall],[stallion,horse],[stalls,stall],[stalwarts,plus],[stamp,square],[stamped,square],[stamps,square],[stan,person],[stance,person],[stand,box],[standard,a],[standardisation,n],[standardised,plus],[standards,square],[standing,person],[standpoint,dot],[standpoints,square],[stands,box],[stanford,building],[stanley,person],[staple,dash],[stapler,],[star,],[starboard,right],[starches,ball],[stardom,plus],[starfish,],[staring,eye],[starjazzled,star],[stark,person],[starnow,star],[starred,plus],[stars,star],[start,startingline],[started,plus],[starter,zero],[starters,one],[starting,one],[startingline,],[startrec,dot],[starts,one],[startup,one],[startups,company],[stasis,plus],[stat,n],[state,a],[stated,paper],[stately,plus],[statement,paper],[statements,square],[states,n],[statesman,person],[stateterritory,square],[static,box],[stating,person],[station,box],[stationisation,building],[stations,building],[statistical,tick],[statistically,n],[statistics,plus],[statisticshowto,plus],[stats,plus],[statu,plus],[statue,],[statuette,],[status,a],[statuses,square],[stave,line],[stay,down],[stayed,dot],[staying,one],[stayingness,down],[stays,plus],[stc,dash],[stdio,box],[stdlib,book],[stdout,line],[stead,plus],[steadfast,plus],[steadily,n],[steady,plus],[steal,left],[stealing,minus],[stealth,plus],[steam,],[steamed,steam],[steel,line],[steep,n],[steeper,up],[steer,right],[steered,right],[steering,right],[stefanovic,person],[stellar,star],[stem,line],[stemming,up],[stemmingly,line],[stems,line],[stencil,],[step,],[stepclassical,box],[steph,person],[stephanie,person],[stephanies,person],[stephen,person],[stepped,step],[stepping,step],[steppopclassical,box],[steps,step],[stepwise,step],[stereo,tworectangle],[stereotypes,square],[sterile,minus],[sterilising,plus],[sterility,zero],[sterling,person],[stern
,person],[steve,person],[steven,person],[stevens,person],[stevenson,person],[stew,pot],[stewart,person],[stewed,n],[stick,box],[sticker,square],[stickies,square],[stickily,equals],[sticking,equals],[sticks,box],[sticky,equals],[stiff,minus],[stiffness,n],[stigma,minus],[still,zero],[stillbirth,minus],[stillbirths,minus],[stillbornness,minus],[stillness,box],[stills,square],[stilltodo,right],[stimulate,plus],[stimulated,plus],[stimulating,plus],[stimulation,plus],[stimulatory,plus],[stimulus,plus],[stinoj,person],[stipulated,plus],[stir,right],[stirred,right],[stirring,circle],[stitched,needle],[stlp,n],[stmts,square],[stock,box],[stocking,box],[stockings,],[stocks,box],[stokes,person],[stole,minus],[stolen,minus],[stoller,person],[stomach,],[stone,],[stones,minus],[stonier,person],[stonnington,suburb],[stood,person],[stool,],[stop,],[stopped,n],[stopping,n],[stops,zero],[stopwatch,n],[storage,n],[store,building],[stored,down],[stores,building],[storey,building],[stories,book],[storing,down],[storm,water],[storms,right],[story,paper],[storyboard,square],[storyboarded,square],[storybook,book],[storytelling,paper],[stott,person],[stove,],[str,square],[straight,line],[straightened,line],[straightforward,square],[straightforwardly,plus],[strained,minus],[straining,down],[straitsresearch,company],[strand,line],[stranded,minus],[strange,minus],[strangelove,person],[strangely,plus],[stranger,person],[strangers,person],[strap,],[strapped,strap],[straps,square],[strategic,paper],[strategically,plus],[strategies,square],[strategy,a],[stratosphere,box],[stravinsky,person],[straw,],[strawberries,strawberry],[strawberry,berry],[straws,straw],[stray,minus],[stream,line],[streamed,right],[streaming,line],[streamline,right],[streamlined,line],[streamlining,line],[streams,line],[street,],[streets,street],[strength,n],[strengthen,plus],[strengthened,plus],[strengthening,plus],[strengthens,plus],[strengths,plus],[stress,xcorrection],[stressed,minus],[stresses,minus],[stressful,minus],[
stretch,software],[stretched,right],[stretches,right],[stretching,right],[strict,tick],[strictly,plus],[strictness,plus],[stricture,box],[strike,down],[strikes,dot],[striking,plus],[string,twobox],[stringa,square],[stringchars,square],[stringconcat,plus],[stringconcat,threerectangle],[stringconcata,box],[strings,square],[stringsorvars,or],[stringth,variable],[stringtobreason,square],[stringtonumber,right],[stringx,square],[strip,square],[stripe,company],[stripes,line],[stripped,minus],[strips,square],[strive,plus],[striving,plus],[strlen,n],[stroke,line],[strokes,right],[stroking,right],[strong,n],[stronger,greaterthan],[strongest,plus],[strongholds,plus],[strongly,plus],[stroto,down],[strove,plus],[struck,box],[struct,box],[structs,box],[structural,box],[structuralism,box],[structurally,box],[structure,box],[structured,box],[structurers,person],[structures,box],[structuring,box],[struggle,minus],[struggled,minus],[struggles,minus],[struggling,minus],[strut,box],[struts,zline],[strutted,right],[stuart,person],[stub,square],[stubborn,minus],[stubs,square],[stuck,minus],[stuckey,person],[stud,ball],[studded,up],[student,person],[students,person],[studi,room],[studied,person],[studies,book],[studio,room],[studios,room],[studiosity,company],[studious,pen],[study,room],[studying,person],[stueduc,dash],[stuff,box],[stumble,minus],[stump,cylinder],[stunning,plus],[stunt,right],[stunts,right],[stupefying,minus],[stupid,minus],[sturdy,box],[style,a],[styled,plus],[styles,plus],[stylesheet,square],[styling,plus],[stylisation,plus],[stylise,line],[stylised,plus],[stylish,plus],[stylized,square],[stylus,pen],[su,person],[sub,dash],[subarguments,v],[subatom,square],[subatomic,ball],[subatoms,square],[subbotsky,person],[subbranch,dash],[subcategories,square],[subcategory,dash],[subclauses,dash],[subclinical,down],[subconsciously,box],[subcultures,plus],[subcutaneous,down],[subdirectory,dash],[subdomains,square],[subfunctions,right],[subgoals,plus],[subheadings,square],[subj,squar
e],[subject,person],[subjective,plus],[subjectively,person],[subjectivism,person],[subjectivist,plus],[subjectivity,person],[subjects,square],[subjectsandobjects,square],[subjugated,down],[subjugation,down],[subjunctive,plus],[sublevel,dash],[sublevels,dash],[sublime,plus],[subliminal,n],[sublist,square],[sublists,square],[submarine,],[submenu,square],[submerged,water],[submerging,down],[submission,paper],[submissions,right],[submit,paper],[submits,up],[submitted,paper],[submitting,paper],[subordinate,down],[subordinating,down],[subordination,down],[subpar,minus],[subparts,box],[subpredicate,down],[subpredicates,down],[subroutines,square],[subscale,n],[subscribed,plus],[subscriber,person],[subscribers,person],[subscription,paper],[subscriptions,paper],[subsentences,square],[subsequent,right],[subset,box],[subsets,box],[subsidiaries,down],[subsidiary,right],[subsidies,minus],[subsidise,minus],[subsidised,minus],[subsidising,dollarsymbol],[subsidy,down],[subsisted,plus],[subsistence,apple],[subspecies,person],[subspecifications,square],[substance,box],[substances,box],[substandard,minus],[substantial,box],[substantiate,plus],[substantiated,plus],[substantive,plus],[substinence,plus],[substitute,down],[substituted,dash],[substitutes,box],[substitutevarsa,dash],[substituting,box],[substitution,right],[substrate,box],[substring,square],[substrings,square],[subsume,up],[subsumed,down],[subterm,variable],[subtitle,square],[subtle,plus],[subtleties,one],[subtlety,plus],[subtopics,square],[subtract,minus],[subtracted,minus],[subtracting,minus],[subtraction,minus],[subtractions,subtraction],[subtractive,minus],[subtractors,minus],[subtree,box],[subtrees,dash],[subtypes,dash],[suburb,square],[suburbs,suburb],[subvar,variable],[subvariables,box],[subvars,variable],[succ,right],[succeed,plus],[succeeded,plus],[succeeds,plus],[success,tick],[successes,plus],[successful,a],[successfully,tick],[succession,right],[successions,right],[successive,right],[successively,right],[successor
,person],[successors,right],[succleton,water],[succt,up],[succulent,plus],[succulently,water],[such,dash],[suck,up],[sucked,up],[sucking,right],[suction,up],[sudan,country],[sudo,plus],[sue,person],[sued,minus],[suess,person],[suffer,minus],[suffered,minus],[sufferer,person],[sufferers,minus],[suffering,minus],[sufferings,minus],[suffice,plus],[sufficiency,plus],[sufficient,plus],[sufficiently,plus],[suffix,right],[suffixes,square],[sufism,book],[sugar,],[sugars,sugar],[sugary,sugar],[sugg,plus],[suggest,plus],[suggested,dash],[suggesting,square],[suggestion,plus],[suggestions,square],[suggestive,plus],[suggestively,plus],[suggests,right],[suhas,person],[suicidal,minus],[suicide,minus],[suit,],[suitability,plus],[suitable,plus],[suitcase,],[suitcases,suitcase],[suite,box],[suited,equals],[suiting,plus],[suits,plus],[sulfide,ball],[sulk,minus],[sullivan,person],[sully,person],[sultan,person],[sultana,],[sultanas,sultana],[sum,n],[sumlist,square],[summaries,square],[summarisation,square],[summarise,square],[summarised,square],[summarises,square],[summarising,square],[summarized,paper],[summary,square],[summative,n],[summatively,n],[summed,n],[summer,sun],[summing,plus],[summit,n],[sums,plus],[sumsq,plus],[sumtimes,n],[sun,star],[sunbathe,down],[sunburn,minus],[sunday,square],[sunderland,person],[sunfeltnesses,plus],[sung,note],[sunglasses,],[sunjay,person],[sunk,down],[sunlight,sun],[sunny,sun],[sunrise,sun],[sunscreen,],[sunset,sun],[sunshade,],[sunshine,light],[sunstroke,minus],[suo,a],[suor,water],[sup,up],[super,up],[superabled,plus],[superannuation,dollarsymbol],[superb,plus],[supercard,software],[supercomputer,box],[supercomputers,and],[supercomputing,and],[superconductors,right],[superego,up],[superficial,down],[superficiality,minus],[superfluous,minus],[superfoods,apple],[superhero,person],[superimpose,square],[superimposed,equals],[superimposes,down],[superimposing,square],[superintendent,person],[superior,plus],[superiors,person],[supermarket,line],[supermar
kets,building],[supermodel,person],[superordinate,n],[superpower,plus],[superseded,xcorrection],[superstar,person],[supervise,plus],[supervised,plus],[supervisor,person],[supervisors,person],[suppl,plus],[supple,plus],[supplement,plus],[supplemental,plus],[supplementary,plus],[supplemented,plus],[supplements,plus],[supplied,plus],[supplier,right],[suppliers,person],[supplies,apple],[supply,right],[supplying,right],[support,square],[supported,box],[supporter,person],[supporters,person],[supporting,box],[supportive,plus],[supports,right],[suppose,plus],[supposed,right],[supposedly,plus],[suppress,xcorrection],[suppressed,down],[supremely,plus],[sur,dash],[surd,n],[surds,n],[sure,plus],[surely,plus],[sureness,plus],[surf,box],[surface,square],[surfaces,square],[surfer,person],[surfing,water],[surgeon,person],[surgery,box],[suriname,country],[surmised,plus],[surname,square],[surnames,square],[surpass,right],[surpassed,right],[surpassing,right],[surplus,plus],[surprise,star],[surprised,minus],[surprises,plus],[surprising,minus],[surprisingly,plus],[surreal,n],[surrected,up],[surrendering,plus],[surrogate,person],[surrounded,box],[surrounding,right],[surroundings,box],[surveillance,eye],[survey,paper],[surveyed,paper],[surveying,plus],[surveyor,person],[surveys,paper],[survival,plus],[survive,line],[survived,plus],[surviving,plus],[survivor,person],[surya,sun],[susa,person],[susan,person],[susannah,person],[susceptible,minus],[suscicipi,dash],[suscicipid,dash],[sushi,],[susie,person],[suspected,plus],[suspend,xcorrection],[suspended,zero],[suspenseful,n],[suspension,line],[suspensions,zero],[suspicious,minus],[sustain,plus],[sustainability,plus],[sustainable,plus],[sustained,plus],[sustaining,right],[sustenance,apple],[sutherland,person],[sutra,a],[sutras,square],[suzanne,person],[suzie,person],[svetlana,person],[sw,dash],[swab,],[swallow,stomach],[swallowed,down],[swallowing,apple],[swam,right],[swamp,water],[swan,],[swans,swan],[swanston,street],[swap,x],[swapped,right]
,[swapping,dash],[swaps,dash],[swashbuckling,plus],[swaziland,country],[sweat,],[sweaty,water],[sweden,country],[swedish,person],[sweep,right],[sweet,apple],[sweetened,sugar],[sweetening,plus],[sweeter,sugar],[sweetinvincibleandprayedfor,plus],[sweetly,sugar],[sweetness,sugar],[sweets,ball],[swept,right],[swi,dot],[swim,right],[swimmer,person],[swimmers,person],[swimming,water],[swin,university],[swinburne,university],[swine,pig],[swing,right],[swinging,right],[swings,swings],[swipe,right],[swipl,twobox],[swirl,right],[swirled,right],[swish,plus],[swiss,country],[switch,dash],[switchboard,dot],[switched,one],[switches,dash],[switching,dash],[switzerland,country],[sword,],[swore,plus],[swot,plus],[swung,right],[sx,variable],[sy,square],[syd,city],[sydn,person],[sydney,dash],[syll,variable],[syllable,square],[syllablecount,n],[syllables,square],[syllabus,paper],[syllogistical,right],[symbol,square],[symbolic,x],[symbolise,dash],[symbolised,dash],[symbolises,right],[symbolising,a],[symbolized,n],[symbolizing,equals],[symbols,a],[symmetrical,equals],[symmetrically,right],[symmetry,equals],[sympathetic,plus],[sympathised,plus],[sympathising,plus],[sympathy,plus],[symposium,room],[symptoms,dash],[syn,equals],[synapses,neuron],[synaptic,dot],[sync,equals],[synchronisation,equals],[synchronise,equals],[synchronised,tworectangle],[synchroniser,tick],[synchrotron,and],[syncing,equals],[syncopated,note],[syndrome,minus],[syndromes,minus],[synergita,company],[synergy,one],[synlait,company],[syno,equals],[synocommands,equals],[synogism,equals],[synogistic,equals],[synogistically,right],[synogrammars,square],[synologic,right],[synon,equals],[synonym,equals],[synonymous,equals],[synonymously,equals],[synonyms,equals],[synopsis,paper],[synopup,puppy],[synothoughts,square],[syntactic,square],[syntactical,square],[syntax,dash],[syntaxes,dash],[synth,tube],[synthese,book],[syntheses,dot],[synthesis,star],[synthesise,box],[synthesised,dash],[synthesiser,box],[synthesising,right],[synth
esized,box],[synthetic,box],[synthetical,right],[synthetically,box],[syria,country],[syringe,],[syrup,],[sys,and],[syspred,b],[system,dash],[systematic,plus],[systematically,plus],[systematising,plus],[systemctl,square],[systemic,and],[systems,box],[sysutils,right],[syydw,dash],[sz,variable],[t,variable],[ta,variable],[tab,square],[table,paper],[tabled,n],[tables,table],[tablespoons,spoon],[tablet,box],[tablets,tablet],[tabling,equals],[tablon,person],[taboos,minus],[tabs,square],[tabular,square],[tackle,xcorrection],[tackled,minus],[tackling,xcorrection],[tact,plus],[tactful,plus],[tactfully,plus],[tactic,plus],[tactical,plus],[tactics,plus],[tactile,finger],[tadpole,],[tadpoles,tadpole],[tae,paper],[taeass,book],[tafta,company],[tag,square],[tagged,square],[tagger,square],[tagging,square],[tagline,line],[tags,square],[tai,up],[taiko,box],[tail,twobox],[taila,variable],[tailed,n],[tailor,person],[tailored,plus],[tailoring,jacket],[tails,down],[tait,person],[tajikistan,country],[take,left],[takeaway,plus],[taken,left],[takeover,plus],[taker,plus],[takers,right],[takes,left],[taking,left],[takings,dollarsymbol],[tale,paper],[talent,person],[talented,plus],[talents,plus],[tales,book],[talip,company],[talis,person],[talk,person],[talked,speechballoon],[talker,mouth],[talking,speechballoon],[talks,paper],[tall,n],[tallest,n],[tallied,n],[tallies,n],[tally,n],[tamas,person],[tambourine,],[tame,plus],[tamouridis,person],[tampered,minus],[tamsin,person],[tan,person],[tandfonline,book],[tandy,person],[tang,person],[tangent,dot],[tangents,line],[tangerine,],[tangible,box],[tango,box],[tangy,orange],[tank,box],[tanks,tank],[tantalized,plus],[tantra,plus],[tantrum,minus],[tanzania,country],[tanzanian,person],[tanzanians,person],[tao,dot],[tap,one],[tape,],[tapes,square],[taping,tape],[taponada,plus],[tapped,down],[tapper,person],[tapping,person],[taps,down],[tar,tar],[taras,person],[target,dot],[targeted,dot],[targetid,dash],[targeting,dot],[targets,dot],[tariffs,n],[tarnish,m
inus],[tarpaulin,fabric],[tart,],[task,book],[taskforce,person],[tasking,square],[tasks,paper],[taste,tongue],[tasted,tongue],[tasteful,apple],[tasteless,minus],[tastes,tongue],[tasting,tongue],[tasty,apple],[tat,dash],[tatiana,person],[tattoos,square],[taught,tick],[tautological,right],[tautology,right],[tawid,person],[tax,n],[taxation,dollarsymbol],[taxes,dollarsymbol],[taxi,car],[taxing,money],[taxonomised,dash],[taxonomy,dash],[tay,person],[taylor,person],[taylorfrancis,person],[tb,variable],[tbody,square],[tbroker,company],[tc,variable],[tchaikovsky,person],[tcm,plus],[tcp,plus],[td,square],[te,person],[tea,water],[teach,person],[teacher,person],[teachermagazine,book],[teachers,person],[teaches,room],[teaching,person],[teacups,cup],[team,person],[teams,person],[teamwork,person],[tear,],[tearing,minus],[tears,water],[tease,minus],[teased,minus],[teasing,minus],[tech,box],[techconnect,company],[technavio,dash],[technical,a],[technically,and],[technician,person],[technique,paper],[techniques,one],[technium,company],[techno,note],[technological,and],[technologically,and],[technologies,software],[technologisation,and],[technology,and],[techopedia,paper],[tecture,plus],[ted,person],[teddies,toy],[tedious,minus],[tee,person],[teenage,n],[teenager,person],[teenagers,person],[teeth,tooth],[tel,n],[telecast,box],[telegram,square],[telegraph,right],[telepathic,right],[telepathically,dash],[telepaths,person],[telepathy,right],[telephone,],[teleport,right],[teleportation,right],[teleported,right],[teleporter,right],[teleporting,right],[teleports,right],[teletypist,person],[television,box],[tell,mouth],[telling,mouth],[tells,speechballoon],[telnet,square],[telstra,company],[temerity,minus],[temp,box],[tempel,person],[temper,minus],[temperance,plus],[temperature,n],[tempered,plus],[template,paper],[templates,paper],[temple,building],[temples,building],[tempo,n],[tempold,box],[temporal,n],[temporality,n],[temporally,n],[temporarily,plus],[temporary,plus],[temporium,n],[tempos,
n],[temptation,minus],[tempted,plus],[ten,tenrectangle],[tenacity,plus],[tenancy,square],[tenants,person],[tend,plus],[tended,right],[tendency,plus],[tendon,fibre],[tendons,right],[tends,plus],[teng,person],[tennis,ball],[tenor,person],[tenrectangle,],[tens,tenrectangle],[tense,minus],[tension,minus],[tensorflow,software],[tent,],[tentacle,line],[tentacles,tentacle],[tentaclising,tentacle],[tenth,n],[tents,tent],[tenuously,minus],[tenure,right],[tenured,paper],[teo,person],[ter,n],[tere,person],[teresa,person],[term,b],[terma,box],[termb,box],[termed,square],[terminal,square],[terminally,n],[terminals,square],[terminalvalue,n],[terminates,n],[termination,xcorrection],[terminations,zero],[terminator,box],[terminology,square],[termius,box],[termless,zero],[terms,b],[ternary,threerectangle],[terraformed,earth],[terraforming,planet],[terrain,map],[territory,square],[terror,minus],[terrorism,minus],[tertiary,n],[tes,person],[tesla,person],[tess,person],[tessellating,square],[tesseract,software],[test,paper],[testa,tick],[testable,tick],[testcicd,v],[testcut,box],[tested,dash],[testees,person],[tester,tick],[testers,person],[testes,ball],[testicle,ball],[testimonial,plus],[testimonials,paper],[testimony,square],[testing,tick],[testnumber,n],[testopen,tick],[tests,tick],[testtrace,tick],[testtube,],[tether,line],[tetrahedron,],[tetris,square],[texas,state],[text,paper],[texta,square],[textarea,square],[textbook,book],[textbooks,book],[textbroker,company],[textedit,box],[textgenrnn,algorithm],[texting,box],[textoffice,box],[texts,paper],[texttoalg,right],[texttobr,right],[texttobrall,box],[texttobrc,equals],[texttobreasoning,right],[texttobreasonings,software],[texttobreathsonings,right],[texttobrqb,apple],[texttobrqbpl,box],[texttobrth,right],[textual,square],[textured,square],[textures,square],[tf,dash],[tfe,dash],[tfile,square],[tfjs,algorithm],[tfn,n],[tg,variable],[th,n],[thailand,country],[than,greaterthan],[thank,plus],[thanked,plus],[thankful,plus],[thankfully,plus]
,[thanking,plus],[thanks,hand],[that,up],[thats,equals],[the,up],[thea,person],[theater,stage],[theatre,stage],[theatrical,stage],[theatricised,stage],[thebreasonings,apple],[theconversation,paper],[theft,minus],[theguardian,company],[their,left],[them,person],[thematic,square],[theme,square],[themed,square],[themes,square],[themselves,person],[then,right],[thens,right],[theodor,person],[theol,book],[theologian,person],[theological,person],[theologically,person],[theologist,person],[theology,book],[theorem,n],[theorems,right],[theoretical,right],[theoretically,and],[theories,square],[theorised,plus],[theorist,person],[theorists,person],[theory,and],[ther,person],[therapeutic,plus],[therapists,person],[therapy,plus],[there,down],[therefore,right],[thereness,dot],[theres,down],[thermo,n],[thermometer,n],[thermonuclear,n],[thermos,flask],[thermostats,n],[thes,equals],[thesaurus,book],[these,down],[theses,book],[thesis,book],[theta,n],[they,person],[thick,n],[thicket,plant],[thickness,n],[thief,person],[thigh,],[thighs,],[thin,box],[thinbook,square],[thing,pencilsharpener],[thingness,box],[things,pencilsharpener],[think,brain],[thinker,person],[thinkers,person],[thinking,a],[thinks,plus],[thinkstock,person],[third,three],[thirdly,plus],[thirds,square],[thirst,water],[thirsty,tongue],[thirty,thirtyrectangle],[thirtyrectangle,],[thiruvengadam,person],[this,down],[thisby,person],[thoma,person],[thomas,person],[thompson,person],[thomson,person],[thon,plus],[thonet,person],[thorne,person],[thorough,plus],[thoroughly,plus],[thoroughness,plus],[those,down],[thoseacquisitions,plus],[thosesales,plus],[though,xcorrection],[thought,box],[thoughtful,square],[thoughtfully,box],[thoughts,square],[thousand,n],[thousands,n],[thread,line],[threaded,line],[threading,right],[threads,line],[threat,minus],[threatened,minus],[threatening,minus],[threatens,minus],[threats,minus],[three,threerectangle],[threebox,],[threerectangle,],[threes,threerectangle],[threshold,],[thresholds,n],[threw,bal
l],[throat,],[throne,],[through,dash],[throughout,n],[throughs,right],[throw,ball],[throwing,right],[thrown,right],[throws,up],[ths,dash],[thstreet,street],[thu,square],[thumar,person],[thumb,],[thumbnails,square],[thumbs,finger],[thunberg,person],[thunder,line],[thunderstorm,water],[thunks,box],[thurs,square],[thursday,square],[thursdays,square],[thus,right],[thy,left],[thymine,ball],[thymus,],[thyroid,box],[ti,dash],[tial,n],[tian,person],[tiananmen,gate],[tianity,person],[tianrenheyi,plus],[tibetan,person],[tick,],[ticked,tick],[ticket,square],[ticketing,square],[tickets,square],[ticking,tick],[tickle,plus],[tickled,plus],[tickling,plus],[tiddlies,counter],[tiddly,counter],[tide,line],[tides,right],[tidied,dash],[tidy,plus],[tie,],[tied,equals],[tier,square],[tiers,line],[ties,dash],[tight,n],[tighten,xcorrection],[tightened,xcorrection],[tightening,down],[tightly,n],[tightrope,right],[tights,],[til,dash],[tilda,person],[tile,],[till,right],[tilt,right],[tilted,triangle],[tilting,right],[tim,person],[time,],[timed,n],[timeframes,n],[timeless,plus],[timelily,n],[timeline,line],[timelines,line],[timely,zero],[timemachine,],[timemachinetest,plus],[timenow,n],[timeout,n],[timeouts,zero],[timephone,n],[timer,n],[times,n],[timespace,box],[timestamp,variable],[timetable,square],[timetables,square],[timetodo,n],[timetravelgrandfatherparadoxsolved,tick],[timezone,square],[timing,n],[timings,n],[timist,minus],[timor,country],[timothy,person],[timpani,drum],[tin,],[tina,person],[tinalia,plus],[ting,ear],[tinge,dot],[tinkle,dot],[tins,can],[tinsel,],[tinsels,tinsel],[tinted,n],[tiny,n],[tion,dash],[tip,up],[tipped,plus],[tips,plus],[tire,minus],[tired,minus],[tiredly,minus],[tiredness,minus],[tires,minus],[tiring,minus],[tis,equals],[tissue,box],[tissues,box],[tiszai,person],[tit,variable],[titillations,plus],[title,square],[titled,square],[titles,square],[titlez,square],[tizziwinkle,hedgehog],[tl,box],[tla,plus],[tlalli,person],[tln,variable],[tm,company],[tmp,paper],[tmps,
paper],[tn,variable],[to,right],[toadie,person],[toadstool,],[toast,],[toasted,toast],[toastmaster,company],[toastmasters,company],[tobacco,minus],[tobago,country],[tobias,person],[toboggan,],[toby,person],[toc,dash],[today,square],[todays,square],[toddler,person],[todo,tick],[toe,],[toenail,],[toes,toe],[toffee,],[tofu,],[together,dash],[toggle,right],[togo,country],[toi,person],[toiled,hand],[toilet,down],[toiletries,toothbrush],[toity,plus],[token,n],[tokenise,square],[tokeniser,square],[tokenises,square],[tokens,variable],[tolang,paper],[told,speechballoon],[tolerance,plus],[tolerate,plus],[tolerated,plus],[tolines,right],[tolkeinesque,person],[toller,dollarsymbol],[tollway,road],[tolower,square],[tom,drum],[tomato,],[tomatoes,tomato],[tombs,room],[tome,country],[tomfromglasgow,person],[tommillswyn,person],[tomorrow,square],[tonal,n],[tonally,n],[tone,smile],[toned,plus],[tones,n],[tonewheel,note],[tong,dash],[tonga,country],[tongs,],[tongue,],[tonics,note],[tonified,plus],[tonight,square],[tonk,ball],[tonsils,ball],[tony,person],[too,plus],[took,left],[tool,hammer],[toolbox,box],[toole,person],[tools,right],[toom,person],[toomey,person],[toorak,suburb],[tooth,],[toothbrush,],[toothpaste,],[toothpick,],[top,square],[topaz,stone],[topic,a],[topical,square],[topics,tworectangle],[topk,box],[toplevel,dot],[topliss,person],[topologically,dot],[topology,right],[topped,up],[topper,up],[topping,],[topple,down],[topresentandpagesto,right],[tops,up],[toptal,company],[tor,person],[torch,],[torso,],[tort,minus],[torte,cake],[torus,],[tos,paper],[toss,right],[tossed,up],[tot,baby],[total,uppert],[totalbars,n],[totality,n],[totalled,n],[totallength,n],[totally,n],[totals,n],[totalvars,n],[totem,plus],[totted,right],[touch,square],[touched,finger],[touches,equals],[touching,equals],[tough,minus],[toupée,],[tour,right],[toured,right],[touring,right],[tourism,right],[tourist,person],[tourists,person],[tournament,right],[tournaments,greaterthan],[tours,right],[tow,right],[toward
,right],[towards,right],[towardsdatascience,right],[towel,],[towels,towel],[tower,],[town,],[towords,right],[toxic,minus],[toxins,minus],[toy,box],[toyota,car],[toys,toy],[tp,square],[tpfile,square],[tquk,company],[tr,right],[tra,plus],[trace,dash],[traced,line],[traces,line],[trachea,tube],[tracing,line],[track,tick],[tracked,plus],[tracking,tick],[tracks,line],[tracksnumber,n],[tract,box],[tractatus,book],[traction,right],[tractor,right],[tracy,person],[trade,book],[traded,right],[tradesman,person],[tradesmanship,plus],[tradie,person],[trading,right],[tradition,plus],[traditional,plus],[traditionally,plus],[traditions,book],[traffic,right],[trafficking,minus],[tragedy,minus],[tragically,minus],[trail,right],[trailer,note],[trails,right],[train,box],[trained,tick],[trainee,person],[trainees,person],[trainer,person],[trainers,person],[training,paper],[trainings,paper],[trains,person],[trait,square],[traits,square],[trajectories,right],[trajectory,right],[tram,],[tramlgreturningfromshowreel,tram],[trams,tram],[tran,right],[trancing,right],[tranfrom,left],[trans,right],[transaction,dollarsymbol],[transactional,box],[transactionally,right],[transactions,dollarsymbol],[transceiver,dot],[transcend,line],[transcended,up],[transcendence,up],[transcendences,up],[transcendental,line],[transcendentalism,up],[transcending,up],[transcends,up],[transcribe,right],[transcribed,right],[transcribing,pen],[transcript,paper],[transcription,right],[transcripts,paper],[transdisciplinarity,dot],[transdisciplinary,right],[transf,right],[transfer,right],[transferability,right],[transferable,right],[transference,right],[transferred,right],[transferring,right],[transfers,right],[transfixed,equals],[transform,right],[transformation,right],[transformational,right],[transformations,right],[transformative,right],[transformed,right],[transforming,right],[transforms,right],[transgender,person],[transgress,minus],[transgressed,minus],[transgression,minus],[transgressions,minus],[transhumanism,right
],[transistors,right],[transition,right],[transitioned,right],[transitioning,right],[transitionism,right],[transitions,right],[transitive,ball],[transitively,right],[transitivity,right],[translatable,tick],[translatative,right],[translate,right],[translated,apple],[translatedtext,square],[translates,right],[translating,right],[translation,right],[translationa,paper],[translationdictionary,book],[translationi,right],[translationmanagementsystem,box],[translationpairs,box],[translations,right],[translationword,square],[translationwords,square],[translator,person],[translators,right],[transliterations,square],[translocal,right],[translucent,right],[transmission,right],[transmit,right],[transmitted,right],[transmitter,zline],[transmitters,right],[transmitting,right],[transparency,square],[transparent,n],[transparently,plus],[transplanted,right],[transplanting,right],[transport,right],[transported,right],[transporting,right],[transpose,right],[transposing,right],[transsexual,person],[transsexualisation,right],[transversion,right],[transversions,right],[transvert,right],[transverted,dot],[transverter,right],[transverting,right],[transverts,right],[trap,box],[trapeze,],[trapezium,],[traps,stick],[trash,box],[trauma,minus],[traumatic,minus],[traumatised,minus],[travails,book],[travel,train],[travelled,right],[traveller,person],[travellers,person],[travelling,right],[travels,right],[traversal,right],[traversals,down],[traverse,right],[traversed,right],[traverses,right],[traversing,down],[trawl,right],[trawled,right],[trawling,right],[trawls,right],[tray,],[trays,tray],[treacle,],[treasure,coin],[treasurer,person],[treasures,coin],[treasury,dollarsymbol],[treat,apple],[treated,plus],[treating,book],[treatment,tick],[treatments,book],[treats,plus],[treble,note],[tree,],[trees,tree],[treesort,up],[treetime,n],[treetops,tree],[tremolo,line],[trend,right],[trends,up],[trent,person],[triad,triangle],[trial,line],[trialed,n],[trialled,plus],[trialling,tick],[trials,n],[trialy,n],[t
riangle,square],[triangles,triangle],[triangular,triangle],[triangulate,plus],[triangulation,triangle],[triathlon,n],[tribal,person],[tribe,person],[tributaries,right],[trice,plus],[triceps,muscle],[trick,plus],[tricked,plus],[tricks,plus],[tricky,minus],[triculating,down],[tricycle,],[trident,],[tried,tick],[trier,dash],[tries,tick],[trigger,tick],[triggered,plus],[triggering,right],[triggers,plus],[trigonometric,triangle],[trilogy,book],[trim,square],[trimester,square],[trimino,triangle],[trimming,down],[trinidad,country],[trinity,threerectangle],[trip,right],[tripartite,box],[triple,threerectangle],[triplec,dash],[triplets,three],[tripped,minus],[tripping,minus],[trips,right],[tristram,book],[triumph,plus],[triumphant,plus],[trivia,square],[trivial,plus],[trivially,plus],[trivium,threerectangle],[trobe,person],[troff,person],[trojan,minus],[troll,person],[trolley,],[trombone,],[trope,right],[tropes,plus],[trophy,],[tropical,water],[trotsky,person],[troubadour,person],[trouble,minus],[troubles,minus],[troubleshooting,plus],[trousers,],[trout,fish],[troy,person],[truck,],[true,one],[trump,person],[trumpet,],[trumpeting,trumpet],[trumpets,trumpet],[truncate,n],[truncated,n],[truncates,n],[truncating,n],[trundle,right],[trunk,],[trust,plus],[trusted,plus],[trusting,plus],[trustingly,plus],[trusts,plus],[trustworthiness,plus],[trustworthy,plus],[truth,one],[truthful,plus],[truthfully,plus],[truths,plus],[try,dash],[trying,tick],[tryoutputs,variable],[tryoutputsa,variable],[trytranslations,tick],[ts,dash],[tst,variable],[tsunami,water],[tt,dash],[ttb,apple],[ttfa,dash],[tts,dash],[ttspec,square],[ttt,dash],[tttt,variable],[tty,square],[ttys,dash],[tub,box],[tuba,tube],[tube,box],[tubes,tube],[tubing,tube],[tubising,tube],[tubular,tube],[tubules,tube],[tuck,plus],[tucked,box],[tucker,person],[tucking,down],[tue,square],[tues,],[tuesday,square],[tuesdays,square],[tuition,plus],[tulip,flower],[tumour,minus],[tumours,minus],[tune,note],[tuned,note],[tunes,note],[tunic,],[t
uning,n],[tunisia,country],[tunnel,],[tunnels,box],[tuple,tworectangle],[tuples,square],[turbine,right],[ture,plus],[tures,plus],[turin,city],[turing,person],[turkey,country],[turkmenistan,country],[turmeric,],[turn,square],[turnaround,right],[turnbull,person],[turncut,line],[turndebug,tick],[turned,two],[turnequals,right],[turners,right],[turning,right],[turnip,],[turnitin,box],[turnoffas,zero],[turnover,circle],[turnpike,right],[turns,tworectangle],[turntables,box],[turquoise,stone],[turtle,],[turtleart,right],[turtles,turtle],[tutankhamen,person],[tute,room],[tutes,room],[tutor,person],[tutored,plus],[tutorial,room],[tutorials,room],[tutoring,book],[tutors,person],[tuvalu,country],[tuxedo,],[tuxedos,tuxedo],[tv,],[tw,company],[twain,person],[tween,box],[tweet,bird],[twelfth,n],[twelve,twelverectangle],[twelverectangle,twelverectangle],[twenty,twentyrectangle],[twentyman,person],[twentyrectangle,],[twice,two],[twigg,person],[twin,person],[twirl,right],[twirled,circle],[twirling,right],[twist,right],[twisted,x],[twisting,right],[twists,right],[twitch,n],[twitched,n],[twitches,n],[twitter,building],[two,],[twobox,],[twodimensional,square],[twofold,plus],[tworectangle,],[twoshorthandles,handle],[twouses,tworectangle],[tx,variable],[txr,variable],[txt,paper],[txtsetting,square],[ty,dash],[tying,dot],[type,box],[typeclass,box],[typed,box],[typedict,book],[types,variable],[typesoftypes,square],[typesoftypesdictionary,paper],[typestatements,equals],[typewriter,],[typical,plus],[typically,plus],[typified,plus],[typing,square],[typist,person],[typographed,square],[typographical,box],[typology,box],[typos,minus],[tyr,variable],[tyrannosaurus,dinosaur],[tyranny,minus],[tyre,],[tyson,person],[tz,dash],[tzr,variable],[u,variable],[ua,n],[ual,dash],[ub,variable],[uba,dash],[ubd,plus],[uber,company],[ubiquitously,box],[ubreen,square],[ubu,person],[uc,variable],[ucgjzsx,dash],[ucr,company],[udaya,person],[udoagwu,dash],[ue,dash],[ug,dash],[uganda,country],[ugly,minus],[uh,minus],
[uhealth,company],[uht,box],[ui,square],[uk,country],[ukraine,country],[ul,square],[ulml,dash],[ultimate,plus],[ultimately,plus],[ultraviolet,n],[um,dash],[umami,tofu],[umberto,person],[umbrella,],[un,one],[una,one],[unabated,right],[unable,minus],[unabridged,book],[unacceptable,minus],[unaccredited,minus],[unacknowledged,minus],[unadjacent,equals],[unadvanced,minus],[unaffected,plus],[unallocated,dash],[unambiguous,plus],[uname,square],[unanswered,minus],[unassuming,plus],[unassumingness,plus],[unauthorised,minus],[unauthorized,minus],[unavailable,minus],[unavoidable,minus],[unaware,minus],[unawareness,minus],[unbalanced,greaterthan],[unbeatable,plus],[unbelievable,minus],[unbiased,plus],[unblock,plus],[unblocked,box],[unblocking,right],[unborn,dot],[unbroken,plus],[unbuttoned,left],[uncannily,plus],[uncapped,up],[uncared,minus],[uncertain,minus],[uncertainty,minus],[unchanged,one],[unchanging,one],[unchecked,minus],[unchosen,minus],[unclaimed,minus],[unclassified,minus],[uncle,person],[unclear,xcorrection],[unclearly,minus],[uncles,person],[uncomfortable,minus],[uncomment,minus],[uncommented,box],[uncommenting,xcorrection],[uncompetitive,minus],[uncompile,down],[uncompressed,box],[unconceal,up],[unconcealed,up],[unconcealing,eye],[unconceals,eye],[unconceived,zero],[unconditional,plus],[unconditionally,plus],[unconfident,minus],[unconnected,minus],[unconscious,zero],[unconsciously,minus],[uncontactable,minus],[uncontrollable,minus],[uncorrect,minus],[uncountable,n],[uncover,up],[uncovered,up],[uncovering,down],[uncovers,right],[uncritical,minus],[uncurled,right],[undead,plus],[undebated,minus],[undecided,minus],[undef,zero],[undefined,dash],[undefinedalgorithms,variable],[undefinedbreasonings,apple],[undefinedbreathsonings,variable],[undefineddirections,variable],[undefinedobjectstofinish,variable],[undefinedobjectstoprepare,variable],[undefinedpartsofrooms,variable],[undefinedrooms,variable],[under,down],[underbelly,square],[underdog,person],[undergarments,shirt]
,[undergirds,plus],[undergo,plus],[undergoes,plus],[undergoing,plus],[undergrad,n],[undergraduate,person],[undergraduates,person],[underground,building],[underhanded,minus],[underline,line],[underlined,line],[underlying,square],[undermine,minus],[underneath,down],[underpants,underpants],[underpin,plus],[underpinned,down],[underpins,down],[underrepresentation,minus],[underscore,],[underscored,line],[underscores,square],[understand,tick],[understandable,tick],[understanding,square],[understandingly,plus],[understandings,plus],[understands,plus],[understated,line],[understood,plus],[undertake,plus],[undertaken,plus],[undertaking,plus],[undertook,plus],[underwater,water],[underwent,down],[undesirable,minus],[undetected,minus],[undeveloped,plus],[undevelopedly,plus],[undid,minus],[undirected,minus],[undisciplined,minus],[undiscombobulated,plus],[undiscovered,plus],[undo,xcorrection],[undoes,xcorrection],[undoing,xcorrection],[undone,minus],[undos,left],[undress,down],[undue,minus],[unduly,minus],[une,university],[unemployment,minus],[unend,box],[unendingly,n],[unenroll,right],[unenrolled,up],[unep,company],[unequal,notequals],[unequally,minus],[unesco,building],[unethical,minus],[unethically,minus],[uneventful,plus],[unexamined,minus],[unexp,box],[unexpanded,box],[unexpected,xcorrection],[unexpectedly,minus],[unexplained,minus],[unexplored,minus],[unexpressive,minus],[unfair,minus],[unfairly,minus],[unfamiliar,minus],[unfavourable,minus],[unfilled,box],[unfinished,minus],[unfitness,minus],[unfolded,right],[unfolding,right],[unfolds,right],[unforgettable,plus],[unfortunate,minus],[unfortunately,minus],[unfound,minus],[unfounded,minus],[unfreezing,plus],[unfriendly,minus],[unfulfilled,minus],[unfunded,zero],[unfurled,right],[ungrammatical,minus],[unguent,],[unhappiness,minus],[unhappy,minus],[unhd,dash],[unhealthiness,minus],[unhealthy,minus],[unheard,minus],[uni,building],[unidentified,minus],[unification,equals],[unifications,equals],[unified,one],[uniform,one],[uniformi
sation,plus],[uniformise,plus],[uniformised,plus],[uniformiser,plus],[uniformity,equals],[uniformly,n],[uniforms,shirt],[unify,one],[unifying,equals],[unimelb,building],[unimportant,minus],[uninitialised,zero],[uninstall,xcorrection],[uninstantiated,minus],[unintelligent,minus],[unintended,minus],[uninterrupted,plus],[union,threerectangle],[unions,equals],[unique,one],[uniquely,one],[uniqueremainders,n],[unirioja,dash],[unis,building],[unit,one],[united,square],[units,line],[unity,box],[univ,square],[univer,box],[univeristy,dash],[universal,box],[universalisation,box],[universalism,one],[universality,n],[universally,box],[universe,box],[universes,box],[universitates,box],[universities,university],[university,building],[unix,dash],[unjust,minus],[unkind,minus],[unknown,n],[unknowns,minus],[unlawful,minus],[unless,xcorrection],[unlike,minus],[unlikely,minus],[unlimited,n],[unlines,line],[unlinked,dash],[unload,xcorrection],[unlock,key],[unlocked,right],[unlocks,right],[unmanifested,one],[unmarked,zero],[unmatch,notequals],[unmeant,minus],[unminimised,minus],[unmodified,dash],[unmutexed,plus],[unnameable,square],[unnamed,zero],[unnatural,minus],[unnaturalness,minus],[unnec,xcorrection],[unnecessarily,minus],[unnecessariness,minus],[unnecessary,minus],[unneeded,xcorrection],[unnegated,plus],[unnoted,box],[unnoticeable,plus],[unobservably,minus],[unobtrusive,plus],[unoptimised,minus],[unordered,n],[unoriginal,minus],[unpack,up],[unpacked,up],[unpacking,up],[unparaphrased,equals],[unpersuadable,minus],[unpile,up],[unplanned,plus],[unplugged,box],[unpolished,minus],[unpopularity,minus],[unpredictability,minus],[unpredictable,minus],[unpredictably,minus],[unprepared,minus],[unpreventable,minus],[unprocessed,plus],[unproduced,dash],[unproductive,plus],[unprofessionalism,minus],[unprofessionally,dash],[unprofitable,minus],[unprotected,minus],[unproven,minus],[unquoted,square],[unravel,right],[unravelled,right],[unravelling,right],[unread,minus],[unreasonable,minus],[unreasona
bly,minus],[unrecognised,minus],[unreconstructed,minus],[unregistered,minus],[unregulated,minus],[unrelated,xcorrection],[unrelatedly,dash],[unrelatedness,plus],[unreliability,minus],[unreliable,minus],[unreliably,minus],[unresearched,minus],[unreset,minus],[unresolved,minus],[unresolvedly,minus],[unrestricted,n],[unreturned,minus],[unrolled,right],[unsafe,minus],[unsafely,minus],[unsaid,zero],[unsatisfactory,minus],[unsavoury,minus],[unscheduled,minus],[unscrew,up],[unscrewed,up],[unscrewing,right],[unseen,zero],[unskilled,minus],[unsolvable,minus],[unsolved,minus],[unspecified,zero],[unspoken,square],[unstable,minus],[unstoppable,plus],[unsucc,xcorrection],[unsuccessful,xcorrection],[unsuggested,minus],[unsupported,minus],[unsure,minus],[unsureness,minus],[unsurmountable,minus],[unsustainable,minus],[unsynthesised,minus],[untaken,plus],[untenable,plus],[untested,minus],[unthinkable,xcorrection],[unthinking,minus],[until,dot],[untitled,square],[unto,right],[untouched,square],[untraced,right],[untrained,zero],[untranslatable,minus],[untried,minus],[untrue,zero],[untwisted,line],[unupdated,minus],[unused,square],[unusual,minus],[unveil,right],[unverified,minus],[unvisited,xcorrection],[unwanted,minus],[unwantedly,minus],[unwavering,plus],[unwell,minus],[unwinding,right],[unwittingly,minus],[unworlding,plus],[unwound,plus],[unwrap,right],[unwrapped,up],[unwrapping,right],[unwritten,zero],[unzip,right],[unzipped,notequals],[unzipping,down],[uo,right],[uon,university],[uonline,paper],[uop,dash],[up,],[upanisad,book],[upanisads,book],[upanishads,book],[upasana,meditator],[upasanasutra,square],[upbeat,plus],[upbringing,plus],[upcase,square],[upcoming,one],[upda,plus],[updat,up],[update,plus],[updated,tick],[updatefile,plus],[updatelocal,up],[updater,tick],[updaters,plus],[updates,up],[updatetrans,plus],[updatevar,variable],[updatevars,variable],[updating,up],[upf,n],[upgrade,up],[upgraded,up],[upgrades,up],[upgrading,up],[upheld,plus],[uphill,up],[uphold,hand],[upholding,
up],[upkeep,plus],[uplifting,up],[upload,up],[uploaded,up],[uploading,up],[uploads,up],[upness,up],[upon,down],[upper,up],[uppera,a],[upperb,b],[upperc,c],[uppercase,up],[upperd,d],[uppere,e],[upperf,f],[upperg,g],[upperh,h],[upperi,i],[upperj,j],[upperk,k],[upperl,l],[upperm,m],[uppern,n],[uppero,o],[upperp,p],[upperq,q],[upperr,r],[uppers,s],[uppert,t],[upperu,u],[upperv,v],[upperw,w],[upperx,x],[uppery,y],[upperz,z],[uprety,person],[upright,up],[uprise,xcorrection],[ups,up],[upset,minus],[upsets,minus],[upsetting,minus],[upside,up],[upsized,up],[upsold,dollarsymbol],[upstairs,up],[uptime,n],[uptown,city],[upward,up],[upwards,up],[upwork,company],[ur,person],[urban,box],[urea,],[ureters,tube],[urethra,tube],[urge,plus],[urged,plus],[urgency,xcorrection],[urgent,plus],[urgently,plus],[uri,square],[urinary,urine],[urinated,liquid],[urination,urine],[urine,],[url,square],[urls,square],[urn,],[uruguay,country],[us,person],[usa,country],[usability,plus],[usable,plus],[usage,hand],[usb,box],[usborne,company],[use,spoon],[usea,hand],[useableness,plus],[used,plus],[useexisting,plus],[useful,plus],[usefulness,plus],[useless,dot],[uselessness,minus],[user,person],[userid,n],[userinput,square],[username,square],[usernames,square],[users,person],[uses,tworectangle],[using,dash],[usp,square],[usr,person],[ustad,person],[usual,plus],[usually,one],[usurp,minus],[utc,dash],[utensil,fork],[uterus,box],[utf,dash],[utilise,plus],[utilised,plus],[utilising,dash],[utilitarian,person],[utilitarianism,box],[utilities,right],[utility,plus],[utilize,plus],[utm,dash],[utpermianbasinx,university],[utter,square],[utterance,square],[utterances,square],[uttered,square],[uttering,square],[utters,mouth],[uu,dash],[uv,line],[uva,n],[uvb,n],[uvks,dash],[uxrqkjk,dash],[uypi,dash],[uzbekistan,country],[uztuabyg,dash],[v,variable],[va,variable],[vacancies,box],[vacation,square],[vaccinated,plus],[vaccination,syringe],[vaccine,],[vaccines,xcorrection],[vacuum,box],[vacuumed,up],[vadher,person],[vag,pe
rson],[vagina,box],[vague,minus],[vaguely,minus],[vagus,box],[vain,minus],[vaj,person],[val,one],[valerie,person],[valiant,plus],[valid,plus],[validate,tick],[validated,plus],[validates,plus],[validating,plus],[validation,plus],[validity,one],[valley,box],[valleys,down],[vals,n],[valuable,dollarsymbol],[valuation,plus],[value,one],[valuea,variable],[valued,n],[valueia,variable],[valuer,person],[values,one],[valuing,plus],[valve,box],[van,person],[vandalised,minus],[vandelanotte,person],[vanessa,person],[vanga,person],[vanilla,icecream],[vanishing,box],[vansh,person],[vanuatu,country],[vaporisation,zero],[vaporise,zero],[vaporised,zero],[vaporiser,xcorrection],[vaporising,air],[var,variable],[vara,variable],[varese,city],[variability,n],[variable,],[variablename,square],[variables,variable],[variance,minus],[variant,two],[variants,right],[variation,n],[variations,n],[varicose,minus],[varied,plus],[varies,n],[varieties,plus],[variety,n],[various,equals],[variously,plus],[variousness,plus],[varlist,tworectangle],[varlists,square],[varn,dash],[varname,square],[varnames,variable],[varnum,n],[vars,variable],[varsd,box],[varsx,variable],[vary,n],[varyers,dash],[varying,two],[vas,tube],[vascular,tube],[vasculature,tube],[vase,],[vast,n],[vastu,tree],[vasuli,person],[vatican,city],[vaudren,person],[vault,box],[vb,dash],[vc,variable],[vca,university],[vce,paper],[vd,variable],[vdwhhj,dash],[ve,dash],[vec,right],[vector,right],[vectors,right],[ved,n],[veda,dash],[vedic,person],[vegan,person],[veganism,zucchini],[vegans,person],[vegetable,box],[vegetables,squash],[vegetarian,apple],[vegetarianism,zucchini],[vegetarians,person],[vehemence,xcorrection],[vehement,minus],[vehicle,],[vehicles,car],[vein,tube],[veins,tube],[veirman,person],[vel,dash],[velocities,n],[velocity,n],[ven,person],[venan,person],[vending,dollarsymbol],[vendone,duck],[vendor,person],[vendors,person],[venezuela,country],[venice,dash],[venn,square],[ventilation,box],[ventriloquist,person],[venture,right],[venu
e,building],[venus,person],[venusian,person],[verb,equals],[verbal,tongue],[verbalisation,mouth],[verbally,mouth],[verbose,plus],[verbphrase,square],[verbs,square],[verde,country],[verdict,plus],[verifiable,plus],[verifiably,plus],[verification,tick],[verificational,plus],[verificationalism,tick],[verificationism,tick],[verifications,v],[verificative,v],[verified,tick],[verifier,person],[verifiers,v],[verifies,tick],[verify,tick],[verifying,tick],[vermicelli,spaghetti],[vermillion,square],[vernacular,square],[verpaccio,person],[vers,n],[versa,dash],[verse,square],[versechorussoloprogression,right],[versed,plus],[verses,square],[version,one],[versioning,n],[versions,tworectangle],[versity,square],[versus,dash],[vertex,dot],[vertical,up],[vertically,up],[very,plus],[ves,person],[vesicle,ball],[vesicles,vesicle],[vessel,cup],[vessels,tube],[vest,],[vestibular,box],[vestiches,square],[vestments,robe],[vet,paper],[veterans,person],[veterinary,person],[vetiver,],[vetus,n],[vetusia,land],[vetusian,square],[vgaz,dash],[vgh,dash],[vgp,square],[vgp,variable],[vgps,square],[vgps,variable],[vi,box],[via,right],[viability,plus],[viable,plus],[viagra,tablet],[vibrant,plus],[vibraphone,dot],[vibrated,n],[vibrato,n],[vic,state],[vice,two],[vicinity,n],[victim,person],[victor,person],[victoria,state],[victorian,person],[victoriously,plus],[victory,plus],[video,square],[videoed,camera],[videoing,line],[videophone,square],[videos,square],[vids,line],[vienna,person],[viete,person],[vietnam,country],[view,eye],[viewable,square],[viewaction,eye],[viewed,eye],[viewer,person],[viewers,person],[viewfinder,square],[viewing,eye],[viewitem,plus],[viewpoint,eye],[viewpoints,eye],[viewport,square],[viewproductviaportal,dash],[views,eye],[viewscreen,square],[vile,minus],[villain,person],[villains,person],[vim,and],[vince,person],[vincent,country],[vinci,person],[vine,],[vineet,person],[vinegar,],[vintage,plus],[viol,viola],[viola,],[violated,minus],[violation,minus],[violations,minus],[violence,m
inus],[violent,minus],[violin,],[viral,n],[viralise,right],[virality,plus],[virgin,person],[virginia,person],[virgo,star],[virility,person],[virt,box],[virtanen,person],[virtual,box],[virtualization,box],[virtually,box],[virtue,plus],[virtues,plus],[virtuous,plus],[virtuously,plus],[virus,],[viruses,virus],[visa,dollarsymbol],[visceral,tissue],[viscosity,n],[viscous,n],[visibility,eye],[visible,square],[vision,eye],[visionary,person],[visioning,eye],[visions,eye],[visit,one],[visited,one],[visitees,person],[visiting,plus],[visitor,person],[visitors,person],[visor,hat],[vista,tree],[visual,square],[visualisation,square],[visualisations,square],[visualise,square],[visualised,eye],[visualises,eye],[visualising,square],[visualization,eye],[visualize,eye],[visualized,eye],[visually,eye],[visuals,square],[vita,line],[vitae,line],[vital,plus],[vitamin,capsule],[vitamins,ball],[vitiate,plus],[vitiated,plus],[vitro,testtube],[viva,plus],[vive,plus],[vivid,plus],[vivo,person],[vivs,variable],[vivshaw,person],[vj,person],[vl,n],[vladimir,person],[vln,variable],[vlucians,person],[vmr,right],[vn,dash],[vns,right],[vo,variable],[vocab,square],[vocabularies,square],[vocabulary,square],[vocal,speechballoon],[vocalist,person],[vocalists,person],[vocally,mouth],[vocals,note],[vocalstubinstrument,tube],[vocation,book],[vocational,book],[vocationally,plus],[vocative,plus],[voced,book],[vocedplus,and],[voff,n],[voice,mouth],[voiceparts,line],[voices,square],[voicetrack,line],[voicing,square],[void,zero],[voids,box],[vol,n],[volcanic,up],[volcano,],[volume,box],[volumes,box],[voluntarily,plus],[voluntary,plus],[volunteer,person],[volunteering,plus],[volunteers,person],[vomit,box],[vomited,up],[vomiting,up],[von,n],[voom,right],[voomed,right],[vorstellung,box],[vote,tick],[voted,n],[voter,person],[voters,person],[votes,plus],[voting,square],[voucher,square],[vouchers,square],[vous,person],[voutier,person],[vowels,square],[vows,square],[vox,tube],[voxel,box],[voxels,box],[voyage,right],[vp
,square],[vpn,and],[vpns,square],[vps,computer],[vpsbu,box],[vpscodes,n],[vpsfilename,square],[vpspath,square],[vpsstring,square],[vpsterm,square],[vpsunit,box],[vr,variable],[vrai,box],[vren,person],[vrooming,right],[vrqa,company],[vs,dash],[vsorted,right],[vsys,variable],[vt,dash],[vtp,square],[vu,dash],[vuckanova,person],[vulgar,minus],[vulnerabilities,minus],[vulnerability,minus],[vulnerable,minus],[vulnerably,plus],[vulva,organ],[vv,mirror],[vyes,tick],[w,variable],[wa,variable],[wackery,plus],[waddling,duck],[wade,person],[waded,right],[wafer,],[waffle,],[wage,dollarsymbol],[wages,dollarsymbol],[wahr,box],[wailful,ear],[waist,],[wait,n],[waited,plus],[waiter,person],[waiting,plus],[waiver,plus],[wake,right],[wakeful,plus],[waking,up],[wales,state],[walk,path],[walked,right],[walker,person],[walkers,person],[walking,path],[walks,path],[walkthrough,right],[walkthroughs,right],[walkway,right],[wall,square],[wallabies,wallaby],[wallaby,],[wallace,person],[wallerstein,person],[wallet,],[walls,box],[walnut,],[walter,person],[walton,person],[waltzing,right],[wam,book],[wan,person],[wand,],[wandered,right],[wandering,right],[wang,person],[want,plus],[wanted,book],[wantedly,plus],[wantedness,plus],[wanting,left],[wantingness,plus],[wants,left],[war,minus],[ward,plus],[wardley,person],[wardrobe,],[wards,room],[ware,plus],[warehouse,building],[warehouses,building],[warfare,minus],[wark,person],[warm,plus],[warmed,up],[warming,up],[warmth,plus],[warn,xcorrection],[warne,person],[warned,xcorrection],[warning,minus],[warnings,xcorrection],[warns,xcorrection],[warp,right],[warrant,paper],[warranted,plus],[warranties,plus],[warranty,square],[warring,minus],[wars,minus],[wart,box],[warthog,warthog],[warthogs,warthog],[warts,wart],[was,equals],[wash,water],[washed,water],[washing,water],[washington,person],[washy,minus],[wasn,notequals],[waste,minus],[wasted,minus],[wasteland,minus],[wasting,minus],[watch,eye],[watched,eye],[watcher,person],[watching,eye],[water,amount],[waterb
ed,bed],[waterchick,],[waterchicks,waterchick],[watercress,],[watered,water],[watering,water],[wateringly,plus],[watermark,square],[watermelon,],[waters,person],[watertight,box],[waterway,water],[watery,water],[watson,person],[waurn,suburb],[wav,file],[wave,right],[waved,hand],[wavelength,n],[wavering,minus],[waves,right],[waving,hand],[wavs,paper],[wax,],[way,path],[wayback,company],[ways,path],[wb,variable],[wc,variable],[we,person],[weak,minus],[weakened,minus],[weaker,minus],[weakness,minus],[weaknesses,minus],[wealth,dollarsymbol],[wealthy,dollarsymbol],[weapons,minus],[wear,coat],[wearer,person],[wearers,person],[wearing,trousers],[wears,plus],[weary,minus],[weasel,],[weasoned,plus],[weather,water],[weave,right],[weaved,right],[web,paper],[webarranger,algorithm],[webb,person],[webbed,duck],[webber,person],[webinar,square],[webinars,square],[weblog,paper],[webmaster,person],[webpage,paper],[webpages,paper],[website,paper],[websites,paper],[webster,person],[wed,square],[wedding,dash],[wedgeman,person],[wednesday,square],[weebly,company],[weed,plant],[week,sevenrectangle],[weekday,square],[weekdays,square],[weekend,square],[weekends,square],[weekly,square],[weeks,square],[ween,box],[weet,],[wei,person],[weigh,n],[weighed,n],[weighers,n],[weighing,n],[weight,n],[weighted,down],[weightings,n],[weights,n],[weighty,n],[weingart,person],[weird,minus],[weirdness,minus],[welcome,plus],[welcomed,plus],[welcoming,hand],[welfare,dollarsymbol],[well,person],[wellbeing,plus],[wellington,person],[wellness,plus],[welsh,country],[wemba,person],[wendy,person],[went,right],[wentworth,program],[wept,amount],[wer,box],[were,equals],[weren,notequals],[west,left],[western,left],[wet,liquid],[wetransfer,right],[wetted,water],[wetting,water],[weyoun,person],[wgets,and],[whale,],[whales,whale],[wharf,],[wharton,university],[what,plus],[whatever,one],[whats,box],[whd,variable],[wheat,],[wheats,plant],[wheatsheaf,wheat],[wheel,circle],[wheelbarrow,],[wheelchair,],[wheeled,right],[wheeledl
y,wheel],[wheeler,person],[wheels,wheel],[when,zero],[whenever,n],[where,square],[whereas,dash],[whereby,plus],[wherefarers,person],[wherein,plus],[wherever,dot],[whet,plus],[whether,dash],[whfuwamyi,dash],[which,up],[while,dash],[whilst,equals],[whirl,right],[whisk,],[whisking,whisk],[whist,person],[whistl,person],[whistle,tube],[whistleblower,person],[whistler,dog],[whistlice,person],[whistlur,person],[whistye,person],[white,square],[whiteboard,],[whiteduckling,duck],[whitelaw,person],[whiteleys,person],[whiter,square],[whiterhen,duck],[whitetocolour,right],[whitewater,duck],[whitey,duck],[whitmore,person],[whittaker,person],[whittington,person],[whittling,down],[who,person],[whoa,plus],[whois,dash],[whole,box],[wholeness,n],[wholesalers,person],[wholesaling,right],[wholesome,plus],[wholly,n],[whom,person],[whose,box],[why,dash],[wi,right],[wicca,religion],[wiccans,person],[wick,],[wicked,minus],[wickets,],[wide,line],[widely,plus],[wider,greaterthan],[widgets,and],[widow,person],[width,n],[widths,n],[wife,person],[wifi,right],[wig,hair],[wiggled,right],[wigglesworth,person],[wigwam,wigwam],[wigwams,wigwam],[wiki,book],[wikia,book],[wikihow,paper],[wikimedia,company],[wikipedia,book],[wil,dash],[wilb,person],[wilbarans,person],[wilbercock,person],[wilbercocks,person],[wilbur,goat],[wild,lion],[wildcard,square],[wilderness,tree],[wildlife,animal],[wildly,plus],[wiley,company],[wiliam,person],[wilkins,person],[will,right],[william,person],[williams,person],[willie,person],[williec,person],[willing,plus],[willingly,plus],[willingness,plus],[willis,person],[willow,person],[wills,plus],[wilson,person],[wilsons,person],[wilt,down],[wimmer,person],[win,a],[winchell,person],[winckler,person],[wind,right],[winding,circle],[window,square],[windowless,wall],[windows,square],[windpipe,],[winds,right],[windsock,],[wine,],[winemaker,person],[wing,dash],[wingate,person],[wings,wing],[winkled,plus],[winks,eye],[winner,person],[winners,person],[winnie,person],[winning,a],[winnings
,dollarsymbol],[wins,plus],[winston,person],[winter,water],[wiped,right],[wiper,],[wiping,right],[wire,line],[wireframe,box],[wires,line],[wiring,wire],[wisdom,plus],[wise,plus],[wisely,book],[wish,left],[wished,plus],[wishes,plus],[wishing,plus],[wishlist,paper],[wishy,minus],[wit,plus],[witch,person],[witchcraft,plus],[witches,witch],[with,dash],[withdraw,right],[withdrawal,minus],[withdraws,right],[withdrew,left],[wither,minus],[withfromtolang,plus],[within,box],[withitness,plus],[witholding,dollarsymbol],[without,minus],[withoutfromtolang,minus],[withstood,plus],[witness,person],[witnesses,person],[wits,plus],[witt,person],[wittgenstein,person],[wittgensteinian,person],[wives,person],[wizardry,plus],[wk,square],[wm,variable],[wn,n],[wnn,dash],[wo,dash],[wok,],[woke,up],[wolcott,person],[woman,person],[womanhood,plus],[womb,],[wombat,],[women,person],[womens,person],[won,plus],[wonder,plus],[wonderbras,bra],[wondered,speechballoon],[wonderful,plus],[wondering,mouth],[wonderment,plus],[wondrous,plus],[wong,person],[woo,plus],[wood,],[woodblock,box],[woodcutter,person],[wooden,spoon],[woodford,person],[woodwind,flute],[wool,],[woolworths,shop],[word,lowera],[wording,square],[wordnet,star],[wordpress,paper],[words,lowera],[wordsfrompos,square],[wordstring,square],[wordswithtwouses,square],[wordy,minus],[wore,shirt],[work,paper],[workable,plus],[workaround,plus],[workarounds,plus],[workbooks,book],[workduring,dash],[worked,plus],[worker,person],[workers,person],[workforce,person],[working,paper],[workingness,plus],[workings,and],[workload,paper],[workloads,book],[workman,person],[workout,up],[workplace,building],[workplaces,room],[works,book],[worksheet,paper],[worksheets,paper],[workshop,room],[workshops,book],[workstation,square],[workweek,paper],[world,],[worlding,ball],[worldly,plus],[worlds,planet],[worldview,eye],[worldwide,planet],[worm,],[wormed,right],[wormhole,box],[worms,minus],[worn,down],[worried,minus],[worries,minus],[worry,minus],[worrying,minus],[wor
se,minus],[worship,plus],[worshippers,person],[worshipping,plus],[worst,minus],[wort,plant],[worth,dollarsymbol],[worthless,minus],[worthwhile,plus],[worthy,plus],[wos,mouth],[wot,down],[would,zero],[wouldn,minus],[wound,minus],[woven,right],[wp,square],[wqtxts,dash],[wr,variable],[wrap,box],[wrapped,box],[wrapper,box],[wrappers,box],[wrapping,box],[wraps,box],[wreak,minus],[wreath,],[wreck,minus],[wrecked,minus],[wrecking,minus],[wri,person],[wright,person],[wrike,person],[wrist,],[writ,dash],[write,pencil],[writeln,box],[writelns,square],[writenotification,plus],[writeq,square],[writer,person],[writers,person],[writes,person],[writing,paper],[writings,square],[written,box],[wrong,xcorrection],[wrongdoing,minus],[wrongly,minus],[wrongness,minus],[wrote,pen],[ws,dash],[wss,dog],[wtg,dash],[wtjpd,dash],[wu,plant],[wuc,dash],[wuh,dash],[wujec,person],[wwc,dash],[www,asterisk],[wxdxuwu,dash],[wxmyyfsdqarlgjvlmof,dash],[wyatt,person],[x,],[xa,dash],[xander,person],[xas,variable],[xb,variable],[xbjgehu,dash],[xc,variable],[xcommand,square],[xcompliment,plus],[xcorrection,],[xd,dash],[xderivefunctor,square],[xemote,plus],[xenon,planet],[xf,dash],[xfavorite,plus],[xgoodbye,n],[xgossip,minus],[xgottago,n],[xhello,one],[xhgpiykilr,dash],[xi,person],[xiaolong,person],[xinitiate,one],[xinmin,person],[xinsult,minus],[xintroduce,one],[xit,dash],[xix,n],[xkl,dash],[xlen,line],[xline,],[xm,line],[xmath,plus],[xmax,n],[xmemory,box],[xmin,n],[xml,square],[xn,variable],[xnfxvjk,dash],[xnfxwml,dash],[xngf,dash],[xnone,zero],[xnonsense,minus],[xnp,dash],[xns,variable],[xochi,person],[xor,box],[xp,line],[xrest,variable],[xs,x],[xskwk,dash],[xss,n],[xswx,company],[xsz,dash],[xu,variable],[xuanxue,school],[xval,n],[xviii,n],[xx,plus],[xxa,variable],[xxi,n],[xxii,n],[xxs,dash],[xxx,dash],[xxxx,plus],[xxxxxxb,square],[xy,variable],[xylophone,],[xyz,company],[xz,variable],[xzle,dash],[y,yline],[ya,dash],[yacht,],[yachts,yacht],[yael,person],[yahoo,person],[yan,person],[yang,plus],[yantra,squ
are],[yao,person],[yard,box],[yarra,river],[yashwantreddy,person],[yaw,right],[yawn,mouth],[yawned,up],[yawning,box],[yb,variable],[yc,variable],[yd,variable],[ydwpw,dash],[ye,person],[yeah,plus],[year,square],[yeardob,n],[yearlearned,square],[yearly,square],[yearn,plus],[years,threerectangle],[yearvaluea,variable],[yearvalueb,variable],[yefuititi,plus],[yellow,square],[yellowgod,person],[yeltsin,person],[yemen,country],[yemiscigil,person],[yes,plus],[yesorno,plus],[yesornoagain,plus],[yesterday,square],[yestree,dash],[yet,one],[yetis,person],[yew,tree],[yf,variable],[yield,box],[yielded,plus],[yielding,right],[yields,plus],[yik,person],[yili,company],[yin,minus],[yip,person],[ylen,line],[yline,],[ym,line],[ymax,n],[ymin,n],[yml,dash],[yn,variable],[yns,variable],[yo,dash],[yodeler,person],[yodeller,person],[yoga,mat],[yogamantra,square],[yogasutra,square],[yogasutrachildrenh,plus],[yoghurt,],[yogi,person],[yogic,and],[yogically,and],[yogis,person],[yohn,person],[yoko,person],[yomiuri,newspaper],[yorick,person],[york,city],[yorker,magazine],[yorkey,person],[you,person],[youare,equals],[young,n],[younger,lessthan],[your,person],[yourarticlelibrary,book],[yourcoach,person],[yourdictionary,book],[yourdomain,square],[yourfolder,box],[yourlegaldff,plus],[yourname,square],[yourorganization,company],[yours,box],[yourself,person],[yourselves,person],[youth,person],[youthful,n],[youths,person],[youtu,dash],[youtube,video],[yovel,person],[yoyo,],[yp,dash],[yqypmzmqbhasvox,dash],[yrest,variable],[ys,y],[yt,paper],[yu,person],[yussy,person],[yuste,person],[yval,n],[yy,variable],[yyyy,square],[yyzvs,dash],[yz,variable],[z,zline],[za,dash],[zac,person],[zag,right],[zags,left],[zambia,country],[zamzar,company],[zang,dash],[zap,plus],[zaphir,person],[zaria,person],[zb,dash],[zd,variable],[zeal,plus],[zealand,country],[zealic,plus],[zeals,plus],[zebulontheta,planet],[zero,],[zeroed,zero],[zerofill,zero],[zeros,zero],[zest,orange],[zesty,orange],[zgf,dash],[zhan,person],[zhang,person
],[zhiping,person],[zhongshu,person],[zig,right],[zimbabwe,country],[zinc,ball],[zingy,plus],[zip,],[zippcode,n],[zipwith,up],[zither,],[zjh,dash],[zl,dash],[zlen,line],[zline,],[zm,dash],[zn,variable],[zns,variable],[zoltan,person],[zombie,person],[zone,square],[zoned,square],[zones,square],[zoo,building],[zoology,animal],[zoom,right],[zoomed,right],[zooming,up],[zourabichvili,person],[zp,variable],[zpd,n],[zqso,dash],[zs,z],[zu,variable],[zubair,person],[zucchini,],[zuhandenheit,plus],[zwischen,box],[zwith,plus],[zxy,square],[zygote,],[zz,dash],[zza,variable],[zzx,square],[zzxwriteflag,square],[à,dash],[éclair,biscuit],[égale,dash],[übermensch,person],[īṣ,dash],[ṣ,dash]] \ No newline at end of file diff --git a/Text-to-Breasonings-master/brdict2.txt b/Text-to-Breasonings-master/brdict2.txt new file mode 100644 index 0000000000000000000000000000000000000000..1fbc55cdb30409f21d1e40d2262f4ae91b0f9f18 --- /dev/null +++ b/Text-to-Breasonings-master/brdict2.txt @@ -0,0 +1 @@ +[[a,1,1.5,0],[aardvark,20,10,10],[abacus,20,1,1],[accent,1,0.5,0],[acetic,1,1,1],[acid,1,1,1],[acorn,1,1,1],[adrenalin,1,1,1],[aerosol,5,5,15],[air,1,1,1],[aircraft,10,10,2],[album,10,10,0.5],[alcohol,0.1,0.1,0.1],[algae,1,1,1],[algorithm,1,1,0],[almond,2,1,0.5],[aluminum,1,1,0.01],[amalgam,1,1,1],[ambulance,200,400,200],[amount,0.5,0.5,0.5],[amphitheatre,100,100,100],[amphora,30,30,60],[amygdala,1,1,1],[anchor,50,10,50],[and,1,1,0],[animal,100,30,80],[animals,30,30,50],[ankle,2,1,2],[ant,0.2,0.1,0.1],[antenna,0.5,0.5,100],[anther,0.1,0.1,1],[antibody,1,1,1],[ape,50,30,180],[apple,5,5,5],[application,1,1,1],[apricot,4,4,4],[aqualung,20,20,30],[aquarium,100,50,40],[arc,1,1,0],[arch,200,30,250],[archipelago,1,5,1],[architrave,100,20,10],[argan,1,1,1],[arm,10,10,100],[armadillo,20,30,50],[armour,55,35,190],[artichoke,5,5,5],[artwork,10,1,10],[asparagus,20,1,1],[association,400,500,300],[asterisk,1,1,0],[atest,1,1,0],[atom,1,1,1],[audiobar,10,0,0.5],[avocado,5,4,3],[axolotl,10,2,1],[b,1,1,1],[baby,30,5
0,20],[backbone,100,1,1],[backslash,1,1.5,0],[bacteria,1,1,1],[bacterium,2,1,1],[badge,1,1,0.3],[bag,35,10,55],[bagel,10,10,5],[bagpipe,20,20,50],[bagpipes,30,30,50],[bakelite,1,1,0.1],[ball,5,5,5],[balloon,20,20,30],[balustrade,10,100,100],[bamboo,1,1,200],[banana,20,10,5],[band,1000,500,180],[banjo,30,10,60],[bank,500,400,300],[barbeque,50,50,120],[barley,1,1,100],[barnacle,1,1,1],[barnyard,1000,5000,300],[barrel,80,80,100],[base,100,100,1],[baseball,5,5,5],[basket,30,30,40],[bat,30,10,7],[bath,150,60,70],[bathers,30,20,20],[baton,15,0.5,0.5],[batter,1,1,1],[battery,5,1,1],[bbq,100,100,150],[beaker,5,5,7],[bean,5,0.5,0.5],[beanie,20,20,20],[bear,50,30,150],[beard,20,5,5],[bed,200,200,50],[bee,1,0.5,0.5],[beef,1,1,1],[beetle,1,1,0.5],[bell,2,2,2],[belt,120,5,1],[bench,100,30,50],[berry,1,1,1],[bicycle,200,30,100],[bin,80,80,150],[bird,50,25,5],[biscuit,5,1,0.5],[biteful,1,1,1],[bix,10,5,1],[blackberry,1,1,1],[blackboard,400,10,100],[blackcurrant,1,1,1],[bladder,10,5,10],[blade,5,0.01,1],[blanket,200,200,5],[blender,15,15,40],[blindfold,20,20,5],[blood,1,1,1],[blossom,5,5,5],[blueberry,1,1,1],[bluestone,30,20,20],[boat,300,150,50],[body,50,30,180],[bolt,4,1,1],[bone,45,10,10],[book,15,20,1],[boot,30,10,20],[bottle,5,1,20],[bow,100,1,1],[bowl,30,30,5],[box,1,1,1],[bra,30,25,10],[bracket,1,1.5,0],[brackets,1,1.5,0],[brain,20,20,30],[bran,0.5,0.3,0.1],[bread,30,10,5],[breast,10,10,10],[brick,20,10,5],[bridge,10,1,2],[brim,30,30,0.5],[brimmed,30,30,1],[bristle,0.1,0.1,1],[brits,10,5,1],[broccoli,5,5,5],[bronze,1,1,0.3],[broom,50,10,150],[brow,4,1,1],[brush,20,2,3],[bubble,1,1,1],[bucket,20,20,30],[buckle,2,0.5,2],[buddha,50,30,180],[bug,1,1,0.4],[building,50,50,50],[bull,100,200,180],[bullet,2,1,1],[bun,5,5,5],[buoy,100,100,100],[burger,10,10,5],[bus,10,1,1],[bust,30,25,1],[butter,1,1,1],[butterfly,10,5,1],[butterscotch,1,1,0.5],[buttocks,30,15,20],[button,1,1,1],[c,1,1,1],[cabbage,20,20,20],[cabinet,100,100,100],[cage,100,50,50],[cake,10,10,10],[calf,150,50,100],[camer
a,10,10,10],[camp,50,50,3],[can,5,5,10],[candle,1,1,20],[cane,10,1,100],[canoe,300,100,50],[cantaloupe,15,15,25],[caper,0.5,0.5,0.5],[cappuccino,10,10,15],[capsicum,5,5,8],[capsule,1,0.25,0.25],[car,400,200,150],[caramel,0.5,0.5,0.1],[carbon,1,1,1],[card,5,3,0.1],[carob,2,0.5,0.5],[carpet,10,10,0.5],[carriage,200,100,200],[carrot,10,1,1],[cartridge,3,1,1],[casein,1,1,1],[castanets,5,5,2],[castle,16,16,16],[cat,40,15,30],[cathedral,2,5,4],[cave,100,100,100],[caviar,0.2,0.2,0.2],[cd,10,10,0.1],[celery,1,1,1],[cell,1,1,1],[cello,50,20,120],[centipede,100,1,1],[chair,50,50,100],[chalice,5,5,20],[chalk,4,0.5,0.5],[chart,1,1,0],[cheek,5,0.0001,5],[cheese,10,10,0.25],[cheeseburger,10,10,2.2],[cheesecake,10,10,2],[cherry,1,1,1],[chessboard,16,16,1],[chest,30,20,1],[chestnut,1,1,1],[chicken,30,40,40],[chickpea,0.5,0.5,0.5],[child,30,20,150],[chip,4,0.5,0.5],[chips,5,1,1],[chisel,1,1,15],[chocolate,1,1,1],[chopstick,20,0.5,0.5],[chrysalis,1,8,1],[chutney,1,1,1],[cinnamon,0.1,0.1,0.1],[circle,1,1,0],[city,100,100,10],[claimed,0,0,0],[clamp,4,1,4],[clavinet,1,1,40],[claw,0.5,2,2],[cliff,100,100,100],[clip,1,1,1],[clipper,10,10,1],[clippers,20,10,0.5],[clock,10,1,10],[closebracket,1,1.5,0],[closedroundbracket,1,1.5,0],[closedsquarebracket,1,1.5,0],[cloud,3,2,0],[clove,1,1,2],[clover,2,2,4],[coat,50,20,100],[coconut,15,15,15],[coffee,5,5,10],[cog,1,1,0.5],[coin,1,1,0.25],[coit,10,10,1],[colander,20,20,5],[collage,100,100,0.5],[collar,20,20,2],[colon,1,1,0],[column,20,20,400],[comb,10,2,0.2],[company,100,100,10],[computer,100,30,50],[conch,10,10,10],[conditioner,1,1,1],[condom,20,3,3],[cone,1,1,1],[continent,150,100,0],[cookie,5,5,1],[copper,1,1,0.01],[coral,1,1,1],[cord,5,1,1],[cordial,1,1,1],[cork,2,1,1],[corn,15,5,5],[counter,1,1,0.1],[country,100,100,1],[cow,100,200,180],[crab,10,5,2],[crane,5,10,10],[crate,50,50,50],[cream,1,1,1],[creek,100,400,100],[cremate,5,5,5],[crepe,10,10,0.3],[crocodile,400,100,40],[croissant,15,5,3],[crossbow,50,1,150],[crow,100,50,30],[crown,20,20,10
],[crumb,0.1,0.1,0.1],[crystal,1,1,2],[cub,30,60,40],[cubby,200,200,200],[cube,1,1,1],[cumquat,3,3,3],[cup,5,5,5],[cupboard,100,50,200],[cupcake,5,5,5],[curd,1,1,1],[currant,1,1,0.5],[curry,1,1,1],[curve,1,1,0],[cushion,30,30,5],[cuticle,0.1,0.5,0.1],[cutlass,10,5,50],[cutter,1,5,20],[cycle,1,1,0],[cylinder,1,1,1],[cymbal,2,100,100],[d,1,1.5,0],[daffodil,5,5,30],[daisy,1,0.1,10],[dam,2000,2000,100],[damper,1,1,1],[danish,15,15,13],[danishes,15,15,3],[dash,1,1,0],[day,1,1,0],[deer,50,180,160],[delta,1,1.5,0],[desk,200,100,100],[dessert,10,10,10],[detergent,1,1,1],[diamond,2,2,0],[diaphragm,25,18,7],[dice,1,1,1],[didgeridoo,100,10,10],[dinosaur,50,120,180],[dioxide,2,1,1],[dirt,0.1,0.1,0.1],[disc,10,10,0.1],[discus,10,10,3],[dish,30,30,1],[disinfectant,1,1,1],[dispenser,1,1,1],[display,50,1,30],[divide,1,1.5,0],[division,1,1.5,0],[dna,1,1,1],[dog,100,30,90],[doll,5,3,18],[dollarsign,1,1.5,0],[dollarsymbol,1,1.5,0],[door,100,10,200],[dot,1,1,0],[dough,1,1,1],[doughnut,10,3,3],[dove,30,15,20],[down,1,1,0],[dragon,500,100,400],[drawers,100,100,100],[dreamballoon,10,10,0],[dress,50,25,130],[driveway,10,10,1],[drug,1,1,4],[drum,30,30,20],[duck,30,40,50],[duckling,5,5,5],[dumpling,2,2,2],[durum,0.1,0.1,0.1],[duster,10,5,1],[dye,1,1,1],[e,1,1.5,0],[eagle,200,100,50],[ear,0.5,3,5],[earlobe,0.5,1,1],[earphone,1,1,1],[earphones,0.5,0.5,0.5],[earth,1,1,1],[egg,3,3,5],[eggplant,10,5,5],[eightrectangle,8,1,0],[elastic,10,0.05,0.01],[elbow,5,5,5],[elderberry,1,1,1],[electron,1,1,1],[ellipse,2,1,0],[elmo,30,100,15],[embryo,1,1,1],[emerald,1,2,1],[enchilada,20,2,2],[envelope,20,10,0.01],[enzymatic,1,1,1],[epinephrine,1,1,1],[equals,1,1,0],[esplanade,50,100,2],[exclamationmark,1,1.5,0],[eye,5,5,5],[eyebrow,4,0.001,1],[eyelid,4,0.001,2],[f,1,1.5,0],[fabric,30,30,0.1],[face,20,15,30],[fairy,50,30,180],[fan,20,20,5],[fat,1,1,1],[fe,0.1,0.1,0.1],[feather,15,1,0.1],[fence,100,1,50],[fertiliser,1,1,1],[fibre,1,1,10],[field,1,1,1],[fifteenrectangle,15,1,0],[fig,3,3,3],[file,20,30,0.0001],[fi
n,2,30,30],[finger,1,10,1],[fire,1,1,2],[fish,30,10,10],[fist,10,10,10],[fivecrectangle,5,1,0],[fiverectangle,5,1,0],[flame,1,1,5],[flamingo,50,100,150],[flask,10,5,20],[floss,30,0.1,0.1],[flowchart,20,30,0.0001],[flower,1,1,10],[flute,100,2,2],[folder,30,0.0005,20],[foot,30,10,10],[football,30,15,15],[footprint,30,10,0],[footy,30,10,10],[forearm,50,10,10],[forehead,10,0.001,5],[fork,2,20,1],[fortyrectangle,40,1,1],[fossil,2,1,0.5],[fountain,100,100,150],[fourrectangle,4,1,0],[frankfurt,5,1,1],[frankincense,0.1,0.1,0.1],[freezer,90,70,30],[frog,5,5,3],[fudge,1,1,1],[fuel,1,1,1],[fullstop,1,1,0],[function,1,1.5,0],[fungus,1,1,1],[funnel,10,10,10],[fur,1,1,1],[furniture,200,100,100],[g,1,1.5,0],[galah,10,10,40],[game,30,30,5],[garage,1000,6000,220],[garbage,50,50,100],[garlic,1,1,1],[gas,1,1,1],[gate,200,3,100],[gazebo,500,500,400],[gel,1,1,1],[gemstone,1,1,1],[germ,2,1,1],[gherkin,4,1,1],[giant,60,40,190],[ginger,5,3,1],[giraffe,200,100,300],[gland,1,1,0.2],[glass,5,5,5],[glasses,15,11,3],[glockenspiel,40,20,5],[glove,25,10,3],[glue,0.5,0.5,0.5],[goat,120,30,160],[goggles,20,21,2],[gondola,100,500,50],[goose,40,30,50],[gossamer,1,1,0.001],[government,10,5,5],[gown,50,20,130],[grain,0.1,0.1,0.1],[grains,1,1,1],[gramophone,40,30,30],[granita,1,1,1],[granite,1,1,1],[grape,1,1,1],[grapefruit,10,10,10],[grater,10,10,20],[grave,180,50,30],[greaterthan,1,1,0],[grid,2,2,0],[grub,1,0.5,0.5],[guache,1,1,1],[guava,5,5,7],[guernsey,50,20,50],[guitar,150,50,15],[gumnut,0.5,0.5,0.5],[gutter,200,10,10],[h,1,1.5,0],[hair,1,0,0],[hall,30,50,5],[ham,5,5,0.1],[hamburger,10,10,5],[hammer,15,5,1],[hammock,70,300,100],[hamster,10,20,15],[hand,20,10,3],[handkerchief,30,30,0.0001],[handle,15,1,5],[happy,1,1,0],[happyface,20,2,30],[harlequin,50,30,180],[harmonica,10,3,1],[harness,30,20,50],[harp,100,20,150],[hash,1,1.5,0],[hat,30,30,15],[hay,1,1,1],[hcl,1,1,1],[head,15,20,23],[headdress,20,20,5],[headphones,30,10,20],[headpiece,20,20,5],[headset,21,21,21],[heart,1,1,0],[heater,30,30,30],[hed
gehog,15,15,15],[heel,10,10,0],[helicopter,50,30,15],[helium,1,1,1],[hemisphere,1,1,0.5],[heptagon,1,1,0],[hexagon,1,1,0],[hill,5,5,2],[hinge,4,0.3,4],[hippocampus,1,1,1],[hips,30,20,1],[hive,40,40,50],[hole,1,1,0],[holocap,20,20,10],[home,20,30,15],[honey,10,10,10],[hook,1,1,0.1],[hoop,100,100,1],[horizon,10000,0,100],[horn,5,5,50],[horse,100,200,200],[hospital,100,100,10],[hourglass,10,10,30],[house,30,100,30],[hovercar,400,200,150],[hug,50,65,180],[hummus,1,1,1],[humpy,200,200,150],[hurdle,100,50,50],[hydrogen,1,1,1],[hyphen,1,1,0],[i,1,1.5,0],[ibis,30,30,50],[ice,1,1,1],[icecream,5,5,15],[icicle,0.5,0.5,2],[icing,5,5,0.001],[ink,1,1,1],[inky,30,100,80],[insect,1,1,0.5],[institute,5000,5000,300],[instrument,50,20,120],[intestine,30,20,20],[intestines,30,20,20],[iphone,10,20,0.5],[iron,20,10,10],[island,100,100,10],[ivory,1,1,0.1],[j,1,1,1],[jacket,50,20,90],[jam,1,1,1],[jar,10,10,10],[jaw,20,10,10],[jelly,5,5,5],[jet,50,75,5],[jewel,1,1,1],[jube,1,1,1],[judgment,1,1,0],[jug,10,10,30],[juice,1,1,1],[juices,1,1,1],[jumper,50,20,60],[juniper,1,1,1],[k,1,1.5,0],[kaleidoscope,30,1,1],[kangaroo,100,50,150],[kettle,20,20,30],[key,5,1,0.001],[keyboard,50,20,2],[keyhole,0.5,3,0.5],[kickboard,30,40,4],[kidney,4,1,5],[king,50,30,180],[kite,50,0.2,100],[kitten,20,20,25],[kiwi,5,3,3],[knee,10,10,10],[knife,1,20,1],[l,1,1.5,0],[lactose,1,1,1],[ladder,10,50,300],[land,100,100,5],[lane,10,100,1],[language,1,1,0],[lanolin,0.5,0.5,0.5],[lapel,5,0.1,20],[lapis,1,1,1],[laptop,30,20,0.5],[lasagne,15,10,10],[lasso,1,1,0],[lassoo,1,1,10],[lavender,1,1,20],[lawn,100,100,3],[lazuli,1,1,1],[leaf,5,1,0.1],[leash,100,1,1],[lectern,100,50,150],[left,1,1,0],[left_bracket,1,1,0],[leftbracket,1,1.5,0],[leg,15,20,90],[lemon,5,5,5],[lemonade,1,1,1],[lens,1,0.1,1],[lentil,0.5,0.5,0.3],[lessthan,1,1,0],[letter,20,30,0.0001],[letterbox,20,10,20],[lettuce,15,15,15],[licorice,1,1,1],[lid,5,5,1],[light,10,10,10],[lighthouse,10,10,100],[lilo,50,180,10],[limousine,200,800,150],[line,1,0,0],[lino,10,10,0.
5],[lion,80,150,100],[lip,4,0.5,1],[lipstick,7,1,1],[liquid,1,1,1],[lithium,1,1,1],[liver,25,10,6],[lizard,3,20,1],[loaf,30,10,10],[lobster,40,30,10],[lock,2,1,5],[log,30,10,10],[lolliop,10,0.5,20],[lollipop,1,1,10],[lolly,1,1,1],[lorikeet,7,7,25],[lowera,1,1,0],[lozenge,1,1,0.5],[lubricant,1,1,1],[ludo,30,30,1],[lump,1,1,1],[lung,10,10,30],[lute,30,10,90],[lycra,50,20,120],[lymph,1,1,1],[lyotard,50,20,120],[lyre,50,10,50],[m,1,1.5,0],[macbook,30,20,20],[machete,30,10,1],[magazine,20,30,0.3],[maggot,1,0.5,0.5],[magnet,5,5,1],[mahoganies,1,1,1],[mandarin,5,5,3],[mango,5,5,10],[mannequin,50,30,180],[mantelpiece,200,10,100],[map,1,1,0],[maracas,10,10,30],[margarine,1,1,1],[marimba,100,20,20],[marmalade,1,1,1],[marquee,400,400,250],[marshmallow,1,1,1],[marzipan,1,1,1],[mask,20,10,30],[mat,180,50,1],[mattress,200,200,25],[mayonnaise,1,1,1],[maypole,1000,1000,200],[maze,100,100,10],[meat,1,1,1],[medal,5,0.5,5],[medication,1,1,0.1],[medicine,1,1,0.5],[meditator,100,100,150],[melchior,1,1,1],[melon,20,20,20],[meniscus,1,1,0.1],[mercury,0.1,0.1,0.1],[meringue,20,20,5],[metal,1,1,0.2],[microphone,5,5,20],[microscope,15,15,30],[milk,0,0,0],[milk,1,1,1],[millipede,1000,1,1],[mince,1,1,1],[mind,20,20,30],[mineral,1,1,1],[minus,1,1,0],[mirror,1,1,0],[miso,1,1,1],[mitochondria,4,2,1],[mm,0.1,0,0],[mold,1,1,1],[molecule,1,1,1],[money,1,1,0.1],[monkey,40,20,150],[monster,70,50,230],[monthly,1,1,0],[moon,1,1,1],[mop,30,30,150],[mortar,20,20,15],[mortarboard,30,30,10],[mosquito,0.5,0.5,0.5],[mountain,10,10,5],[mouse,15,4,5],[mousse,5,5,5],[moustache,4,1,1],[mouth,4,0.5,0],[mouthwash,1,1,1],[muesli,1,1,1],[muffin,5,5,5],[mug,10,12,10],[multiplication,1,1,0],[multiply,1,1,0],[muscle,10,0.5,0.5],[mushroom,5,5,5],[mycorrhizae,0.1,0.1,0.1],[myrrh,1,1,1],[n,1,1,0],[nacl,0.1,0.1,0.1],[nail,0.3,0.3,3],[napkin,100,100,0.2],[nappy,100,100,0.3],[nebula,1,1,1],[neck,15,15,10],[necklace,30,30,0.3],[nectar,1,1,1],[nectarine,5,5,5],[needle,0.01,0.01,3],[nerve,1,1,5],[nest,30,30,10],[neuron,10,1,1],[
newspaper,50,75,0.0001],[nib,1,1,2],[nightingale,10,10,30],[ninerectangle,9,1,0],[noodle,1,0.5,0.5],[noradrenaline,1,1,1],[norepinephrine,1,1,1],[nose,1,2,4],[nostril,0.5,0.5,0],[note,1,1,0],[notequals,1,1,0],[nrectangle,3,1,0],[number,1,1.5,0],[nut,1,1,1],[o,1,1.5,0],[oar,150,10,3],[oasis,1,1,0],[oat,1,0.5,0.1],[octagon,1,1,0],[octahedron,1,1,1],[octopus,10,10,10],[oil,0.1,0.1,0.1],[olive,1,2,1],[one,1,1.5,0],[onion,5,5,5],[openbracket,1,1.5,0],[openroundbracket,1,1.5,0],[opensquarebracket,1,1.5,0],[operatingsystem,1,1,1],[or,1,1,0],[orange,10,10,10],[orangutan,50,30,180],[oreo,1,1,0.5],[organ,5,1,5],[organisation,1,1,0.1],[oval,2,1,0],[oven,100,100,100],[owl,20,20,40],[oxygen,2,1,1],[ozone,1,1,1],[p,1,1.5,0],[packet,10,10,1],[paddock,200,200,10],[paella,20,20,4],[paint,5,5,1],[painting,100,1,100],[palette,20,10,2],[pallet,10,10,0.5],[palm,10,10,0],[pamphlet,30,20,0.0001],[pan,30,30,5],[pancake,10,10,0.3],[pancreas,10,5,2],[panforte,1,1,1],[pant,30,20,95],[pantry,100,100,200],[pants,30,20,100],[paper,20,30,0.0001],[parasol,80,80,50],[park,1,1,0],[parrot,10,10,40],[pasta,20,20,5],[pastry,1,1,0.1],[pasty,15,5,4],[path,100,1000,0],[pathway,100,1000,0],[patty,10,10,1],[paw,5,5,30],[pea,0.5,0.5,0.5],[peach,5,5,5],[peacock,100,40,80],[pear,5,5,10],[pecan,2,1,1],[pedal,12,2,12],[pedestal,50,50,30],[peg,1,1,1],[pegboard,10,1,10],[pellets,1,1,1],[pen,15,1,1],[pencil,15,1,1],[pencilsharpener,1,1,1],[pendulum,1,1,20],[penguin,30,30,100],[penis,3,3,10],[pentagon,1,1,0],[pepper,0.2,0.2,0.2],[peptides,1,1,1],[perfume,0.1,0.1,0.1],[pergola,500,200,200],[person,50,30,180],[pestle,15,3,3],[petal,1,1,0.01],[pheasant,40,30,50],[phone,5,10,0.5],[photon,1,1,1],[pi,1,1,0],[piano,150,260,190],[pickle,2,1,0.4],[pie,10,10,3],[pig,50,20,40],[pigeon,30,15,15],[pillow,80,40,15],[pimple,0.1,0.1,0.1],[pin,0.03,0.03,3],[pineapple,1,1,1],[pinecone,5,5,5],[pip,0.5,0.3,0.3],[pipe,1,1.5,0],[pipette,0.5,0.5,20],[pitchouli,0.1,0.1,0.1],[pizza,30,30,1],[plane,75,50,5],[planet,1,1,1],[plank,15,150,0.5],
[plant,1,1,10],[plasma,1,1,1],[plastic,1,1,0.3],[plate,10,10,1],[platelet,1,1,0.5],[plectrum,1,1,0.1],[pliers,20,2,1],[plough,500,400,300],[plum,4,4,4],[plus,1,1,0],[pocket,20,5,20],[pocketwatch,3,5,5],[pod,5,1,0.3],[podium,100,50,150],[pomegranate,7,7,7],[pony,150,80,120],[pool,100,100,30],[popcorn,1,1,1],[poppadum,20,20,0.5],[popsicle,10,1,20],[porcupine,15,10,10],[pork,1,1,1],[porridge,10,10,5],[pot,5,5,10],[potato,5,3,3],[potbelly,30,5,30],[potion,5,5,5],[potoroo,30,40,50],[printer,30,20,20],[priscilla,50,30,250],[prism,1,1,1],[product,30,20,30],[program,1,1,0],[prolactin,1,1,1],[proline,1,1,1],[propeller,30,10,10],[prosthetic,20,30,100],[protein,3,1,1],[publisher,500,400,300],[pudding,10,10,10],[puffin,10,10,30],[pulp,1,1,1],[pump,1,1,3],[pumpkin,30,30,20],[pup,20,40,25],[puppet,20,10,30],[puppy,30,40,50],[puree,1,1,1],[pvc,0.1,0.1,0.1],[pyramid,1,1,0.5],[q,1,1.5,0],[quartz,0.1,0.1,0.1],[questionmark,1,1.5,0],[quince,5,5,5],[quinoa,1,1,1],[r,1,1.5,0],[rabbit,20,20,50],[rack,150,1,1],[radiation,1,1,1],[radish,2,2,2],[raft,150,150,30],[rag,10,10,0.1],[raisin,1,0.5,0.3],[rake,30,10,150],[ram,1,1,0.0001],[ramp,500,100,100],[raspberry,1,1,1],[raven,5,5,30],[receiver,1,1,1],[refrigerator,100,70,150],[religion,50,30,180],[relish,1,1,1],[restaurant,1500,2000,400],[rhinoceros,100,400,150],[rhubarb,1,1,0.5],[ribbon,30,2,0.001],[rigatoni,4,1,1],[right,1,1,0],[ring,1.2,1.2,0.1],[rissole,5,5,2],[river,10,1,1],[road,20,200,1],[robe,50,20,140],[robopod,200,200,200],[robot,50,30,180],[rock,1,1,1],[rocket,1,1,5],[rod,10,1,1],[roll,5,5,5],[roof,100,100,5],[rook,1,1,4],[room,500,400,300],[rope,1000,1,1],[rose,10,10,20],[rosin,1,1,1],[rotunda,100,100,50],[roux,1,1,1],[ruby,1,1,1],[rug,100,200,0.3],[ruler,30,1,0],[rung,50,5,1],[s,1,1.5,0],[sack,80,80,150],[sadface,20,2,30],[sadness,1,0,1],[sail,100,10,400],[salami,30,5,5],[salt,0.005,0.005,0.0025],[sand,0.1,0.1,0.1],[sandals,10,30,3],[sandpaper,1,1,0.3],[sandstone,20,10,10],[sandwich,10,10,1.5],[sap,0.5,0.5,0.1],[satellite,100,100,
100],[sauce,1,1,1],[saucepan,20,10,5],[saucer,20,20,1],[sausage,10,3,3],[saxophone,10,30,50],[scarf,100,20,0.3],[school,100,100,10],[schoolyard,10000,10000,200],[scissors,20,10,1],[scone,5,5,5],[scoop,10,2,2],[scourer,5,5,5],[scraper,20,5,5],[screw,0.5,0.5,3],[screwdriver,20,1,1],[sculpture,60,25,60],[sea,1,1,1],[seahorse,1,0.5,5],[seal,1,1,0.1],[seat,50,50,100],[secateurs,20,5,5],[seed,1,1,0.5],[semicircle,2,1,0],[semolina,0.1,0.1,0.1],[sequin,1,1,0.1],[serviette,50,50,0.0001],[settee,400,100,100],[sevenrectangle,7,1,0],[seventyrectangle,70,1,0],[shade,1,1,0],[shaker,10,10,30],[shampoo,1,1,1],[shark,300,90,70],[shaver,5,20,3],[shed,120,200,200],[sheep,100,50,70],[shelf,100,20,1],[shell,3,3,1],[shelter,500,200,250],[ship,1000,10000,15000],[shirt,50,30,40],[shirtsleeve,10,10,60],[shoe,30,10,5],[shoelaces,40,0.3,0.3],[shop,1000,1000,300],[shore,1000,100,100],[shortcake,5,1,0.5],[shoulder,10,10,10],[shoulders,10,10,10],[shuttlecock,5,5,5],[sieve,20,20,5],[signal,1,1,0],[signpost,100,10,200],[silicon,1,1,1],[silk,1,1,0.01],[sitar,20,5,50],[site,30,20,0],[sitting,50,70,150],[six,1,1.5,0],[sixrectangle,6,1,0],[skeleton,50,30,180],[skewer,0.3,0.3,20],[ski,100,10,10],[skin,1,1,0.1],[skittle,10,10,30],[skivvy,50,20,60],[skull,20,20,30],[sky,1,1,0],[slash,1,1,0],[sleep,180,50,30],[sleeve,100,10,10],[slingshot,5,20,20],[slug,4,1,1],[smartwatch,4,10,5],[smile,1,1,0],[smock,40,20,80],[smoke,1,1,1],[snail,5,1,3],[snake,10,100,10],[snow,1,1,1],[snowflake,1,1,0],[snuff,0.1,0.1,0.1],[soap,5,3,2],[sock,30,10,15],[sofa,400,100,100],[software,1,1,0],[soil,1,1,1],[sole,30,10,0],[soles,10,30,1],[soufflé,10,10,20],[soul,50,30,180],[soup,15,15,3],[space,1,1.5,0],[spacecraft,800,400,400],[spade,10,2,50],[spaghetti,30,0.2,0.2],[spatula,20,0.5,0.5],[spear,5,1,100],[speechballoon,1,1,0],[sperm,1,0.5,0.1],[sphere,1,1,1],[sphinx,50,180,80],[sphygmomanometer,100,3,3],[spider,10,5,3],[spinach,5,5,0.01],[spine,1,1,10],[spiral,1,1,1],[spleen,2,2,2],[sponge,5,5,3],[spool,1,1,2],[spoon,15,2,1],[spout,
10,1,1],[spring,1,1,5],[square,1,1,0],[squash,5,5,5],[squeezer,10,10,10],[stage,100,50,75],[stake,5,1,30],[stalk,0.3,0.3,10],[stall,400,100,100],[stapler,20,2,3],[star,1,1,1],[starfish,10,10,1],[startingline,100,1,0],[state,1,1,0],[statue,50,30,180],[statuette,5,5,30],[steam,1,1,1],[stencil,10,10,0.1],[step,100,30,20],[stick,10,1,1],[stockings,30,30,100],[stomach,15,10,10],[stone,10,10,1],[stool,30,30,30],[stop,1,1,0],[stove,100,100,1],[strap,30,1,0.1],[straw,0.005,0.005,20],[strawberry,1,1,1],[street,500,15,0],[string,0.1,0.1,20],[strut,0.5,1,2],[submarine,5,1,1],[subtraction,1,1,0],[suburb,1,1,0],[sugar,1,1,1],[suit,50,30,135],[suitcase,50,15,40],[sultana,1,0.5,0.3],[sun,10,10,10],[sunglasses,20,15,5],[sunscreen,0.5,0.5,0.5],[sunshade,100,100,100],[sushi,5,2,2],[swab,1,1,1],[swan,40,30,50],[sweat,1,1,0.1],[swings,500,10,250],[switch,1,1,1],[sword,100,10,3],[syringe,3,0.5,10],[syrup,1,1,1],[t,1,1.5,0],[table,100,100,100],[tablet,20,30,0.4],[tadpole,1,1,0.2],[tambourine,20,20,5],[tangerine,5,5,5],[tank,40,10,50],[tape,10,10,1],[tar,1,1,1],[tart,5,5,2],[tea,1,1,1],[tear,0.5,0.5,0.5],[teeth,0.5,0.5,1],[telephone,20,30,1],[tenrectangle,10,1,0],[tent,200,200,150],[tentacle,10,1,1],[testtube,1,1,10],[tetrahedron,1,1,1],[theatre,1000,1000,100],[thigh,15,30,0],[thighs,20,20,30],[thirtyrectangle,30,1,0],[thread,50,0.01,0.01],[three,1,1.5,0],[threebox,3,1,1],[threerectangle,3,1,0],[threshold,100,30,5],[throat,1,1,10],[throne,100,100,200],[thumb,10,1,1],[thymus,1,1,2],[tick,1,1,0],[tie,5,100,0.3],[tights,30,20,100],[tile,10,10,0.1],[time,1,1.5,0],[timemachine,1,1,1],[tin,10,10,10],[tinsel,100,5,5],[tissue,1,1,1],[toadstool,5,5,5],[toast,10,10,0.5],[toboggan,50,50,50],[toe,2,2,2],[toenail,1,1,0.3],[toffee,1,1,1],[tofu,2,1,1],[tomato,5,5,5],[tongs,20,1,1],[tongue,5,10,1],[tooth,1,1,1],[toothbrush,20,0.5,1],[toothpaste,20,3,3],[toothpick,0.2,0.1,4],[top,2,2,2],[topping,1,1,1],[torch,30,5,5],[torso,50,20,45],[torus,1,1,1],[toupée,20,20,1],[towel,100,100,1],[tower,10,10,100],[town
,100,100,10],[toy,5,3,18],[train,10,3,0],[tram,10,2,3],[trapeze,70,1,1000],[trapezium,1,1,0],[tray,50,30,1],[treacle,1,1,1],[tree,5,5,10],[triangle,1,1,0.1],[tricycle,50,30,40],[trident,5,1,5],[trolley,60,100,100],[trombone,40,10,10],[trophy,30,30,50],[trousers,40,20,100],[truck,200,500,200],[trumpet,30,15,20],[trunk,200,100,100],[tube,3,1,1],[tues,0,0,0],[tunic,50,20,120],[tunnel,10,1,1],[turmeric,1,1,1],[turnip,5,5,5],[turtle,10,10,2],[tuxedo,50,20,130],[tv,150,5,100],[twelverectangle,12,1,1],[twentyrectangle,20,1,0],[two,1,1.5,0],[twobox,2,1,1],[tworectangle,2,1,0],[typewriter,30,20,10],[tyre,100,5,100],[u,1,1.5,0],[umbrella,100,100,100],[underpants,30,20,20],[underscore,1,1,0],[unguent,1,1,1],[university,100,100,10],[up,1,1,0],[uppert,1,1.5,0],[urea,1,1,1],[urine,1,1,1],[urn,10,10,20],[v,1,1.5,0],[vaccine,1,0.5,20],[variable,1,1,0],[vase,10,10,30],[vehicle,8,1,2],[vesicle,1,1,1],[vest,50,20,50],[vetiver,1,1,1],[video,1,1,0],[vine,10,1,1],[vinegar,1,1,1],[viola,40,15,150],[violin,30,100,10],[virus,1,1,1],[volcano,10,10,5],[w,1,1.5,0],[wafer,1,1,0.1],[waffle,1,1,0.5],[waist,30,20,1],[wall,300,10,400],[wallaby,100,80,130],[wallet,3,5,1],[walnut,2,2,2],[wand,30,0.5,0.5],[wardrobe,200,50,200],[wart,0.5,0.5,0.5],[warthog,50,30,40],[water,1,1,1],[waterchick,10,5,7],[watercress,1,1,0.1],[watermelon,40,40,30],[wax,1,1,1],[weasel,10,50,15],[website,20,0,30],[weet,10,5,1],[weight,10,10,10],[whale,5000,2500,500],[wharf,3,30,3],[wheat,1,1,30],[wheel,100,10,100],[wheelbarrow,50,100,50],[wheelchair,100,100,100],[whisk,5,5,20],[whistle,3,1,1],[whiteboard,200,10,100],[wick,0.1,0.1,20],[wickets,10,2,80],[wigwam,150,150,250],[windpipe,1,1,5],[windsock,30,10,10],[wine,5,5,30],[wing,5,1,0.5],[wiper,40,1,30],[wire,200,0.1,0.1],[witch,50,30,180],[wok,30,50,20],[womb,20,20,30],[wombat,30,50,30],[wood,5,1,1],[wool,20,10,10],[world,1,1,1],[worm,10,0.3,0.3],[wreath,30,30,5],[wrist,10,1,3],[write,1,1,0],[x,1,1,1],[xcorrection,1,1,0],[xline,1,0,0],[xylophone,100,20,20],[y,1,1.5,0],[yacht,50
0,200,800],[yard,500,500,250],[yline,0,1,0],[yoghurt,1,1,1],[yoyo,5,2,100],[z,1,1.5,0],[zero,1,1.5,0],[zip,1,0.2,5],[zither,30,40,5],[zline,0,0,1],[zucchini,20,3,3],[zygote,1,1,2]] \ No newline at end of file diff --git a/Text-to-Breasonings-master/brthdict.txt b/Text-to-Breasonings-master/brthdict.txt new file mode 100644 index 0000000000000000000000000000000000000000..057b8f948dbd9a210952f51249d732d6994c2b82 --- /dev/null +++ b/Text-to-Breasonings-master/brthdict.txt @@ -0,0 +1 @@ +[[a,useful]] \ No newline at end of file diff --git a/Text-to-Breasonings-master/d.sh b/Text-to-Breasonings-master/d.sh new file mode 100644 index 0000000000000000000000000000000000000000..87c67d10badfaffb522265d3d326ed1ef7397082 --- /dev/null +++ b/Text-to-Breasonings-master/d.sh @@ -0,0 +1,15 @@ +./bc12.sh +./bc12.sh +./bc12.sh +./bc12.sh +./bc12.sh +./bc12.sh +./bc12.sh +./bc12.sh +./bc12.sh +./bc12.sh +./bc12.sh +./bc12.sh +./bc12.sh +./bc12.sh +./bc12.sh \ No newline at end of file diff --git a/Text-to-Breasonings-master/daily_medit b/Text-to-Breasonings-master/daily_medit new file mode 100644 index 0000000000000000000000000000000000000000..2b082c3445ce4ef5063e144783d3996e6568cc9a Binary files /dev/null and b/Text-to-Breasonings-master/daily_medit differ diff --git a/Text-to-Breasonings-master/daily_medit.pl b/Text-to-Breasonings-master/daily_medit.pl new file mode 100644 index 0000000000000000000000000000000000000000..da318f819c95eda4030fbc3ffdf4ba7f248f4b8b --- /dev/null +++ b/Text-to-Breasonings-master/daily_medit.pl @@ -0,0 +1,8 @@ +:-include('text_to_breasonings.pl'). +:-include('meditatorsanddoctors.pl'). +:-include('meditationnoreplace2.pl'). +main:-catch(meditation,Err,handle_error(Err)),halt. + +handle_error(_Err):- + halt(1). +main :- halt(1). 
\ No newline at end of file diff --git a/Text-to-Breasonings-master/delegate_workloads.txt b/Text-to-Breasonings-master/delegate_workloads.txt new file mode 100644 index 0000000000000000000000000000000000000000..35d59df3277bcc1b522a3c6866cc59a28fffe824 --- /dev/null +++ b/Text-to-Breasonings-master/delegate_workloads.txt @@ -0,0 +1,250 @@ +Delegate Workloads 1 + + 1. I cut off infinity. + 2. I used the sun. + 3. The queen helped. + 4. The delivery arrived. + 5. I earned the role. + 6. I wrote the developed text. + 7. I knew about the lotus spoon. + 8. I knew the female politician. + 9. I knew positive religion. + 10. People are safe. + 11. I knew about God (the philosopher). + 12. I breastoned (sic) out the delegating. + 13. I wrote 10 details per sentence. + 14. I owned the image. + 15. The graduations officer was there. + 16. There was philosophy. + 17. There was music. + 18. There was acting. + 19. I delegated the workload to myself. + 20. I made the model. + 21. I redrafted the idea. + 22. I wrote the book. + 23. I counted the breasonings per paragraph. + 24. I counted the As per book. + 25. I am with it over the ideas given to me. + 26. I delegated the work. + 27. I saw you in the inference. + 28. I verified the work. + 29. I went around the sun. + 30. I eyed the robots back. + 31. I multiplied by 10. + 32. I compared notes. + 33. I efficiently edited the sentence by finding a good reason for it. + 34. I found the criteria and wrote the assignment and delegated it. + 35. I wrote the assignment with the correct criteria in accreditation in tenure. + 36. The company delegated the work. + 37. I noticed the sculpture. + 38. I tricked the person’s eye. + 39. The duckling lived. + 40. Theatre increased music. + 41. Philosophy increased computer science. + 42. Pedagogy increased meditation. + 43. Medicine increased Computational English. + 44. Politics increased economics. + 45. Popology increased societology. + 46. The delegatee (sic) completed the workload. 
+ 47. I moved my arms to keep them sunburn-free. + 48. I delegated the workload in Pedagogy. + 49. I knew the Professor. + 50. I talked with the people. + 51. I performed work well. + 52. The child created the star. + 53. The delegatee was protected by natural law. + 54. The delegator was protected by natural law. + 55. I engaged with the thought. + 56. I observed that the excellence threshold had been reached. + 57. I observed that the goodness threshold had been reached. + 58. I observed that the very much goodness threshold had been reached. + 59. I observed that the high quality threshold had been reached. + 60. I observed that the area of study threshold had been reached. + 61. I observed that the product standard had been reached. + 62. I observed that the emeritus award threshold had been reached. + 63. I taught the special skills. + 64. I delegated the workload to the employee at a single level. + 65. I earned the sense to check. + 66. I shaped foods in appetising ways. + 67. I stamped the work. + 68. I plotted the graph of the delegates’ workload against time. + 69. I plotted the graph of the delegatees’ workload against time. + 70. Time stood still when the workload was delegated. + 71. I helped with the workload. + 72. I examined the workload. + 73. The present famous employee reached readiness threshold. + 74. The present hidden employee reached readiness threshold. + 75. I watched the film. + 76. I knew about famousness. + 77. I prayed for each breasoning. + 78. I breasoned each prayed-for breasoning in the form of a sentence object. + 79. I retained breasonings, recordings and blessedness. + 80. The passage is safe. + +Delegate Workloads 2 + + 1. Recordings of 80 breasonings prayed for and put through are compatible with conception. + 2. The same hen of the clutch of eggs will protect the eggs. + 3. The greater one was chosen because of being preferred to recordings. + 4. I won. + 5. The sun rose. + 6. 
I found the meditation (pedagogy) employees at the University. + 7. I delegated work to the suction holder. + 8. I noticed the pine. + 9. I noticed the professor chain. + 10. I ate on the professor in chains. + 11. The order of equality in breasonings is prayer, recordings and kinder breasonings for individuals. + 12. I saw the professor reading my chapters. + 13. I employed the employee. + 14. Numbers of people believed the idea. + 15. I can be like Plato. + 16. I can be like marzipan jellioes. + 17. I worked in the group. + 18. I went through the levels in deciding what work to delegate. + 19. I entrained myself in a better relationship with students as a student as teacher. + 20. I am big in politicity (sic). + 21. I delegated writing the argument. + 22. I delegated the task where the subject’s tasks, prerequisites and the previous knowledge were known psychologically. + 23. I used the vegetable product. + 24. There was an esoteric feeling in the air. + 25. Many things can come from nothing. + 26. The idea was detailed. + 27. The idea was done-up. + 28. I had a business. + 29. I wrote the idea-in-itself. + 30. An idea was given to me. + 31. I examined the content. + 32. I deserved the idea. + 33. The student attributed ideas to the model at University. + 34. I synthesised the animal product. + 35. I gave the input and output for the delegated algorithm (disagreeing instead of agreeing (agreeing)). + 36. I disagreed (agreed) with the incorrect (correct) logic symbol. + 37. Addition is logically walking up a hill as imagined to be stated by e.g.s. + 38. I prayed for the breasoning to ensure it wasn’t (was) flawed (correct). + 39. The role character marched forward. + 40. I installed the pink screen. + 41. I examined the model heart. + 42. I made the police box. + 43. I wrote the idea by-by-itself. + 44. I wrote the inter-disciplinary area of study points down. + 45. I applied the literature to the philosophy, then wrote on the philosophy in itself. + 46. 
I wrote the book. + 47. I performed the algorithm. + 48. I employed (trialled) the delegate workload retrieval model. + 49. I knew what was real. + 50. I noticed the monastics (people) retrieving information. + 51. As the workload was delegated, the element was needed. + 52. I said the workload was unnecessary (necessary). + 53. I wanted you. + 54. Kant predicted that meditation would overtake. + 55. Romeo and Juliet was intertwined. + 56. I agreed with delegating workloads. + 57. I delegated the workloads. + 58. I asked for help. + 59. Pedagogy will be done. + 60. I asked nicely for work to be done. + 61. I delegated grouping exposition points. + 62. I delegated connecting critique points. + 63. I expected Lucianic Meditation. + 64. I verified that the breasoning lists were not the same. + 65. I planned to be good. + 66. I knew about lost phalluses (sic) (definitiveness). + 67. I was racial. + 68. I studied the subject. + 69. I observed the individual working. + 70. I alighted at Green. + 71. I delighted the studio audience member. + 72. I designed the idea. + 73. I helped work with an employee. + 74. I noticed the Geelong Skyscraper. + 75. I performed the most intelligent work at each point. + 76. I was given actual songs. + 77. I gave Daniel the cord. + 78. My student submitted his assignment. + 79. I read the logic of the assignment. + 80. I had fun thinking of the future. + +Delegate Workloads 3 + + 1. I noticed that each workload delegate was friendly. + 2. Positivity is in agreement. + 3. I noticed the post-traumatic stress disorder (good health) of the gay rape victim (gay). + 4. The philosopher was a doctor and a meditator. + 5. I observed the vandalised (healthy) tree. + 6. There should be breaks at work. + 7. I delegated rest. + 8. I noticed 3/8 was written 8/3 (by the idea of the plimsoll line-like symbol) in the top down form. + 9. I examined the man’s ideas in the order that the honorary fellows thought of them in. + 10. I examined life. + 11. 
I am at the top. + 12. I corrected the line to the square. + 13. I noticed that there was mass production. + 14. The means of production were creative. + 15. God was dead (alive) (knew) a breasoning helper. + 16. The breasoning helper worked at God’s University. + 17. God didn’t make the breasoning helper available (only in accreditation). + 18. God wrote breasonings. + 19. Normal is high quality. + 20. I compared ideas to get the work done. + 21. I noticed the stillness. + 22. I wrote music. + 23. I dotted on my work for the day. + 24. I breasoned out my work for the day. + 25. I noticed the nth degree. + 26. I noticed harlequinades. + 27. I experienced effortless threshold. + 28. I chose the content. + 29. I noticed the students perform the work. + 30. I noticed the fireworks. + 31. The pink object represented work having been done. + 32. Delegate workloads was an assignment in the topic “production”. + 33. Creativity is a skill in the topic “production”. + 34. Delegation was the negative part of sin(x), workload was the positive part. + 35. The University texts were supported by Masters graduates. + 36. The workloads of the University input and output were delegated. + 37. The workload was planned with delegation. + 38. The queen delegated the circle of work. + 39. The words were credible. + 40. The words inspired love. + 41. No one (someone) abandoned (adopted) me. + 42. Breasonings are University and Vocational Education and Training put together. + 43. Individual accreditation was for delegate workloads. + 44. Medicine was required for 250 breasonings. + 45. Inky wondered if she would be a mother. + 46. Inky avoided the red line. + 47. I moved to a new location. + 48. I delegated the breasoning. + 49. The head of state delegated the workloads. + 50. I delegated Education accreditation for breasonings. + 51. I delegated help from the pedagogy helper for breasonings. + 52. I avoided workplace bullying. + 53. I started the next task. + 54. I ate the croissant. 
+ 55. Delegation was equal to workloads (not only because they were equal in type with them). + 56. Delegating workloads was commercially viable. + 57. Delegating workloads was compatible with evolution. + 58. I am the true philosopher of the abbey. + 59. Vocational Education and Training precedes University. + 60. Vocational Education and Training can be at (was different from) University. + 61. The heart complements the brain at University. + 62. You delegated the work. + 63. I performed the work. + 64. I am developed by what will appear. + 65. The students’ topics are what will appear. + 66. Stacking of workloads as triangles become the Gods. + 67. My pedagogy is musical, theatrical, and philosophical. + 68. The Education Doctoral graduate was unfunded until University. + 69. The PhD topic list was arrived at by agreement. + 70. PhDs were based on secondary texts. + 71. The PhD thesis was slightly new on the topic. + 72. Delegate workloads was a classic by the 340 billion people. + 73. I was manual. + 74. The vocational liberal arts course delegated the workloads. + 75. I exemplified the connection. + 76. Using a different pedagogy helper would show (ensure) the same error (correctness) in continuity. + 77. I jumped around (I practised saltaté). + 78. The negative terms (initial positive terms) (delegators) were replaced with positive terms (workloads). + 79. I observed performing As in front of an abbott may have led to overcoming the chiropractor in an experiential acting class. + 80. I noticed God (the seed for a child). + + diff --git a/Text-to-Breasonings-master/diff.pl b/Text-to-Breasonings-master/diff.pl new file mode 100644 index 0000000000000000000000000000000000000000..3f1016c8d0d7728aa75879ca3d555daf19689477 --- /dev/null +++ b/Text-to-Breasonings-master/diff.pl @@ -0,0 +1,19 @@ +:-include('../listprologinterpreter/listprolog.pl'). 
+main :- open_string_file_s("../Text-to-Breasonings/file0.txt",A), +open_string_file_s("../Text-to-Breasonings/file01.txt",A0), +find_sent(A,A1),find_sent(A0,A01), + +subtract(A1,A01,B1),length(B1,B1L), +subtract(A01,A1,B2),length(B2,B2L), +%findall([X,"\n"],member(X,B1),B1X), +%findall([X,"\n"],member(X,B2),B2X), +flatten([B1L,"sentences deleted from file0.txt", +B2L,"sentences added to file01.txt", +"*****","Deleted from file0.txt:",B1, +"*****","Added to file01.txt:",B2],List), +findall(_,(member(L,List),term_to_atom(L,L1),writeln(L1)),_), +%writeln(Atom), +!. + +find_sent(A,F) :- +split_string(A,".\n\r",".\n\r",B),delete(B,"",A1),findall(C,(member(C,A1),string_chars(C,D),not(forall(member(E,D),char_type(E,white)))),F),!. diff --git a/Text-to-Breasonings-master/directiondict.txt b/Text-to-Breasonings-master/directiondict.txt new file mode 100644 index 0000000000000000000000000000000000000000..0b70af307ae330566c274e01ca086e19bec684c5 --- /dev/null +++ b/Text-to-Breasonings-master/directiondict.txt @@ -0,0 +1 @@ +[[and,display]] \ No newline at end of file diff --git a/Text-to-Breasonings-master/edit.pl b/Text-to-Breasonings-master/edit.pl new file mode 100644 index 0000000000000000000000000000000000000000..0af1bcc2a34d4962b9832e74823fb3399a8f46ac --- /dev/null +++ b/Text-to-Breasonings-master/edit.pl @@ -0,0 +1,108 @@ +:- include(library(edit)). +:- multifile + edit:edit_command/2. +:- multifile prolog_edit:load/0. +%%:- use_module(library(filesex)). 
+ +run1 :- + Brdict1='brdict1.txt', + Brdict2='brdict2.txt', + Brdict1vps='brdict1vps.txt', + Brdict2vps='brdict2vps.txt', + MacPath='~/yourfolder/', + MacPath1='/Users/yourname/yourfolder/', + VPSPath='root@xxx.xxx.xxx.xxx:/var/www/yourdomain.com/', + Scp='scp -p', + atom_concat(MacPath1,Brdict1, Brdict1Filename), + atom_concat(MacPath1,Brdict1vps,Brdict1vpsFilename), + + + shell1(Scp,VPSPath,Brdict1,MacPath,Brdict1vps,Command1), + shell1(Command1), + + ((time_file(Brdict1Filename,Brdict1FilenameTime), + time_file(Brdict1vpsFilename,Brdict1FilenameTime))->true; + + (prep1(Brdict1,Brdict1Term), + prep1(Brdict1vps,Brdict1vpsTerm), + + (Brdict1Term=Brdict1vpsTerm->true; + + (append(Brdict1Term,Brdict1vpsTerm,Brdict1Term2), + + update1(Brdict1,Brdict1Term2), + + shell1(Scp,MacPath,Brdict1,VPSPath,Brdict1,Command3), + shell1(Command3))))), + + shell1('rm',MacPath,Brdict1vps,Command5), + shell1(Command5), + + + atom_concat(MacPath1,Brdict2, Brdict2Filename), + atom_concat(MacPath1,Brdict2vps,Brdict2vpsFilename), + + shell1(Scp,VPSPath,Brdict2,MacPath,Brdict2vps,Command2), + shell1(Command2), + + ((time_file(Brdict2Filename,Brdict2FilenameTime), + time_file(Brdict2vpsFilename,Brdict2FilenameTime))->true; + + (prep2(Brdict2,Brdict2Term), + prep2(Brdict2vps,Brdict2vpsTerm), + + (Brdict2Term=Brdict2vpsTerm->true; + + (append(Brdict2Term,Brdict2vpsTerm,Brdict2Term2), + + update1(Brdict2,Brdict2Term2), + + shell1(Scp,MacPath,Brdict2,VPSPath,Brdict2,Command4), + shell1(Command4))))), + shell1('rm',MacPath,Brdict2vps,Command6), + shell1(Command6), +!. + +prep1(Brdict1,Brdict1Term1) :- + phrase_from_file(string(BrDict0), Brdict1), + splitfurther(BrDict0,BrDict01), + sort(BrDict01,Brdict1Term1), + !. + +prep2(Brdict2,Brdict2Term1) :- + phrase_from_file(string(BrDict0t), Brdict2), + splitfurthert(BrDict0t,BrDict01t), + sort(BrDict01t,Brdict2Term1), + !. + +update1(File,BrDict1) :- + sort(BrDict1,BrDict2), + open(File,write,Stream), + write(Stream,BrDict2), + close(Stream). 
+ +shell1(Command1,Path,Object,Command4) :- + atom_concat(Command1,' ',Command2), + atom_concat(Command2,Path,Command3), + atom_concat(Command3,Object,Command4). + +shell1n(Command1,Command2,Command3) :- + atom_concat(Command1,'\n',Command1A), + atom_concat(Command1A,Command2,Command3). + +shell1(Command1,Path1,Object1,Path2,Object2,Command5) :- + shell1(Command1,Path1,Object1,Command4), + shell1(Command4,Path2,Object2,Command5). + +shell1(Command) :- + (bash_command(Command,_)-> + true; + (writeln(["Failed shell1 command: ",Command]),abort) + ),!. + +bash_command(Command, Output) :- + setup_call_cleanup(process_create(path(bash), + ['-c', Command], + [stdout(pipe(Out))]), + read_string(Out, _, Output), + close(Out)). diff --git a/Text-to-Breasonings-master/file.txt b/Text-to-Breasonings-master/file.txt new file mode 100644 index 0000000000000000000000000000000000000000..6103c7e2489f9ecc162230d4392ffd747f7a27b1 --- /dev/null +++ b/Text-to-Breasonings-master/file.txt @@ -0,0 +1 @@ +["","","",""] \ No newline at end of file diff --git a/Text-to-Breasonings-master/go.pl b/Text-to-Breasonings-master/go.pl new file mode 100644 index 0000000000000000000000000000000000000000..fa1ec681a6d44b54319e8b287dee79df55328b13 --- /dev/null +++ b/Text-to-Breasonings-master/go.pl @@ -0,0 +1,32 @@ +:-include('../Text-to-Breasonings/text_to_breasonings.pl'). + +go :- +working_directory(_,'../Philosophy/'), + +A=[person1,person2], + +time(findall(_,(member(P,A), + +time(( +foldr(string_concat,["swipl --stack-limit=80G -f -q ./bag_args21.pl"],S3)%, +,catch(bash_command(S3,_), _, (foldr(string_concat,["Warning."%%"Error: Can't clone ",User3,"/",Repository3," repository on GitHub." + ],_Text4),%writeln1(Text4), + fail%abort + )), + +foldr(string_concat,["swipl --stack-limit=80G -f -q ./bag_algs1.pl"],S31)%, +,catch(bash_command(S31,_), _, (foldr(string_concat,["Warning."%%"Error: Can't clone ",User3,"/",Repository3," repository on GitHub." 
+ ],_Text4),%writeln1(Text4), + fail%abort + )), + + texttobr2_1(1), % meditation + texttobr2_1(1), % teleport 1 + texttobr2_1(1), % medicine 1 + texttobr2_1(1), % teleport 2 + texttobr2_1(1), % medicine 2 + texttobr2_1(1), % a thought + + writeln([P,done]) + )) + ),_)),!. \ No newline at end of file diff --git a/Text-to-Breasonings-master/grammar.pl b/Text-to-Breasonings-master/grammar.pl new file mode 100644 index 0000000000000000000000000000000000000000..ccd69b9bff1bdad5c20795f5f254623f7cd9ea57 --- /dev/null +++ b/Text-to-Breasonings-master/grammar.pl @@ -0,0 +1,279 @@ +%% Replace with grammar part, write grammar part, write grammars to convert string to compound + +%% Replace with grammar part + +%% Make atoms into strings as soon as entered x have both, because it is annoying to enter strings as strings* - prefer atoms because they're faster, use strings only when necessary - * get rid of atoms as strings and use atoms only as names in form [n,name], v + +%% ** Need compound terms in LPI +%% convert_to_grammar_part1([[n,nouns],[[v,l1],[v,l2]]],"->",[[[n,word],[[v,noun1]]],[[n,word],[[v,noun2]]],[[n,code],[[n,append],[[v,l1],[[v,noun1],[v,noun2]],[v,l2]]]]]]],[],G). +%% convert_to_grammar_part1([[[n,noun],[[v,l1],[v,l2]],"->",[[[n,word],[[v,noun1]]]]]],[],G). +%% convert_to_grammar_part1([[[n,noun],"->",[[[n,word],[[v,noun1]]]]]],[],G). +%% convert_to_grammar_part1([[[n,noun],"->",[[[v,a]],[[n,code],[[n,=],[[v,a],"apple"]]]]]],[],G). +%% convert_to_grammar_part1([[[n,noun],"->",[[v,a],[[n,code],[[n,=],[[v,a],["apple"]]]]]]],[],G). +%% convert_to_grammar_part1([[[n,noun],"->",[["apple"]]]],[],G). +%% convert_to_grammar_part1([[[n,noun],"->",["apple"]]],[],G). 
+ +%% Notes +%% Brackets around body in input v + +%% base case [] v +%% calls, terminals, code v +%% if calls - 2vp or non 2vp v + +%% No bc needed except [] v +%% Recurse after all except bc [] v + +%% code in converted code has no brackets v + +%% $->v v + +%% must have a call or terminal v + +%% terminals have vgps, undef g t->g p v + +%% can find x as terminals, can find singletons later v + +%% grammar_part,[n,noun],[v,vgp1],[v,vgp2] should be +%% grammar_part,[[n,noun],[v,vgp1],[v,vgp2]] do using list proc v + +%% append [[]] not [] to [[v v]] v + +%% Print [] around body v + +%% append list in first query args 1<->2 v + +%% ** change [v,name] to [v*,name] so new interpreter, grammar command can run them, can have multiple v, v* or groups of these used in an interpreter shell x use a marker to avoid replacing variables with values when passing algorithms to shells, or detect and leave without marker + +%% keep original length recorded to add base case in case when add variables +convert_to_grammar_part1(Grammar1,Grammar2,Grammar3) :- + convert_to_grammar_part11(Grammar1,Grammar2,Grammar4,[],EndGrammar), + append(Grammar4,EndGrammar,Grammar3),!. + +convert_to_grammar_part11([],Grammar,Grammar,EndGrammar,EndGrammar) :- !. +convert_to_grammar_part11(Grammar1,Grammar2,Grammar3,EndGrammar1,EndGrammar2) :- + Grammar1=[Grammar4|Grammar5], + (Grammar4=[[n,Name],Variables1,"->",Body1]->true; + (Grammar4=[[n,Name],"->",Body1],Variables1=[])), + %%((maplist(no_calls,Body1))-> %% this is a base case + %%append([[v,vgp],[v,vgp]],Variables1,Variables2); + append([[v,vgp1],[v,vgp2]],Variables1,Variables2) + %%) + , + member(Item1,Body1),call_or_terminal(Item1), %% If not, operator expected. 
+ append([[n,Name]],Variables2,Variables3), + Grammar6=[[n,grammar_part],Variables3,":-"], + convert_to_grammar_part20(Body1,1,2,2,[],Body2), + append(Grammar6,[Body2],Grammar7), + + %% member to check all doesn't work elsewhere, do ; to ->true; +(maplist(basecasecondition(Variables3,[n,Name]),Grammar2)-> +((Variables1=[]->(Grammar9=[[n,grammar_part],[[n,Name],[],[v,vgp]]],Grammar10=[[n,grammar_part],[[n,Name],"",[v,vgp]]],append(EndGrammar1,[[[n,grammar_part],[[n,Name],[v,vgp],[v,vgp]]]],EndGrammar3) +);( +Grammar9=[[n,grammar_part],[[n,Name],[],[v,vgp]|Variables1]],Grammar10=[[n,grammar_part],[[n,Name],"",[v,vgp]|Variables1]],append(EndGrammar1,[[[n,grammar_part],[[n,Name],[v,vgp],[v,vgp]|Variables1]]],EndGrammar3) +) + ), + append(Grammar2,[Grammar9, + Grammar10, + Grammar7],Grammar8)); + (EndGrammar1=EndGrammar3, + append(Grammar2,[ + Grammar7],Grammar8)) + ), + convert_to_grammar_part11(Grammar5,Grammar8,Grammar3,EndGrammar3,EndGrammar2),!. +convert_to_grammar_part11(Grammar1,Grammar2,Grammar3,EndGrammar1,EndGrammar2) :- + Grammar1=[Grammar4|Grammar5], + ((((Grammar4=[_Name1,_Variables1,":-",_Body1]->true; + Grammar4=[_Name2,":-",_Body2])->true; + Grammar4=[_Name3,_Variables2])->true; + Grammar4=[_Name4])->true; + (writeln(["Error: Grammar",Grammar4,"badly formed."]),abort)), + append(Grammar2,[Grammar4],Grammar6), + convert_to_grammar_part11(Grammar5,Grammar6,Grammar3,EndGrammar1,EndGrammar2),!. + +basecasecondition(Variables3,[n,Name],Item1) :- + not((Item1=[[n,grammar_part],Item2|_Rest2],Item2=[[n,Name]|_Rest3],length(Variables3,Length),length(Item2,Length))). + +no_calls(Item) :- + not(call1(Item)),!. + +convert_to_grammar_part20(Body1,FirstVar1,SecondVar1,SecondVarParent,Body2,Body3) :- + (count_call_or_terminal(Body1,0,1,[],_I)-> + SecondVar2=SecondVar1;SecondVar2=3), + +convert_to_grammar_part2(Body1,FirstVar1,SecondVar2,SecondVarParent,Body2,Body3),!. + +count_call_or_terminal([],N,N,I,I) :- !. 
+count_call_or_terminal([Item|Items],N1,N2,I1,I2) :- + (call_or_terminal(Item)->(N3 is N1+1,append(I1,[Item],I3));N3 is N1,I3=I1), + count_call_or_terminal(Items,N3,N2,I3,I2),!. + + +convert_to_grammar_part2([],_FirstVar,_SecondVar,_SecondVarParent,Body,Body) :- !. + +%% only want 2vp to be used if last call + + +convert_to_grammar_part2(Body1,FirstVar1,SecondVar,SecondVarParent,Body2,Body3) :- + Body1=[Item|Items], + call1(Item), + (last_call_or_terminal2(Items)-> + convert_to_grammar_part31(Body1,FirstVar1,SecondVar,SecondVarParent,Body2,Body3); + convert_to_grammar_part32(Body1,FirstVar1,SecondVar,SecondVarParent,Body2,Body3)), !. + +%% +convert_to_grammar_part2(Body1,FirstVar1,SecondVar,SecondVarParent,Body2,Body3) :- + Body1=[Item|Items], + terminal(Item), + (last_call_or_terminal2(Items)-> + convert_to_grammar_part31t(Body1,FirstVar1,SecondVar,SecondVarParent,Body2,Body3); + convert_to_grammar_part32t(Body1,FirstVar1,SecondVar,SecondVarParent,Body2,Body3)), !. + +convert_to_grammar_part2(Body1,FirstVar,SecondVar,SecondVarParent,Body2,Body3) :- + Body1=[Item|Rest1], + Item=[[n,code]|Rest2], + append(Body2,Rest2,Body4), + convert_to_grammar_part2(Rest1,FirstVar,SecondVar,SecondVarParent,Body4,Body3),!. + +convert_to_grammar_part2(Body1,_FirstVar,_SecondVar,_SecondVarParent,_Body4,_Body3) :- + Body1=[Item|_Rest1], + writeln(["Error: Item",Item,"badly formed."]),abort,!. + +convert_to_grammar_part31(Body1,FirstVar,SecondVar,SecondVarParent,Body2,Body3) :- + Body1=[Item1|Rest1], + convert_to_grammar_part311(Item1,FirstVar,SecondVarParent,Body2,Body4), + convert_to_grammar_part2(Rest1,FirstVar,SecondVar,SecondVarParent,Body4,Body3), !. + +convert_to_grammar_part311([[n,RuleName]],FirstVar1,SecondVarParent,Body2,Body3) :- + to_variable_name(FirstVar1,FirstVarName), + to_variable_name(SecondVarParent,SecondVarParentName), + Call=[[n,RuleName],FirstVarName,SecondVarParentName], + append([[n,grammar_part]],[Call],Item), + append(Body2,[Item],Body3),!. 
+ +convert_to_grammar_part311([[n,RuleName]|[Variables1]],FirstVar1,SecondVarParent,Body2,Body3) :- + variables(Variables1), + to_variable_name(FirstVar1,FirstVarName), + to_variable_name(SecondVarParent,SecondVarParentName), + append([[n,RuleName],FirstVarName,SecondVarParentName],Variables1,Call), + append([[n,grammar_part]],[Call],Item2), + append(Body2,[Item2],Body3),!. + +convert_to_grammar_part32(Body1,FirstVar1,SecondVar,SecondVarParent,Body2,Body3) :- + Body1=[Item1|Rest1], + convert_to_grammar_part321(Item1,Rest1,FirstVar1,SecondVar,SecondVarParent,Body2,Body3),!. + +convert_to_grammar_part321(Item1,Rest1,FirstVar1,SecondVar1,SecondVarParent,Body2,Body3) :- + Item1=[[n,RuleName]], + to_variable_name(FirstVar1,FirstVarName), + to_variable_name(SecondVar1,SecondVarName), + Call=[[n,RuleName],FirstVarName,SecondVarName], + append([[n,grammar_part]],[Call],Item2), + append(Body2,[Item2],Body4), + FirstVar2 is SecondVar1, + SecondVar2 is SecondVar1+1, + convert_to_grammar_part2(Rest1,FirstVar2,SecondVar2,SecondVarParent,Body4,Body3),!. + +convert_to_grammar_part321(Item1,Rest1,FirstVar1,SecondVar1,SecondVarParent,Body2,Body3) :- + Item1=[[n,RuleName]|[Variables1]], + variables(Variables1), + to_variable_name(FirstVar1,FirstVarName), + to_variable_name(SecondVar1,SecondVarName), + append([[n,RuleName],FirstVarName,SecondVarName],Variables1,Call), + append([[n,grammar_part]],[Call],Item2), + append(Body2,[Item2],Body4), + FirstVar2 is SecondVar1, + SecondVar2 is SecondVar1+1, + convert_to_grammar_part2(Rest1,FirstVar2,SecondVar2,SecondVarParent,Body4,Body3),!. + +%% + +convert_to_grammar_part31t(Body1,FirstVar,SecondVar,SecondVarParent,Body2,Body3) :- + Body1=[Item1|Rest1], + convert_to_grammar_part311t(Item1,FirstVar,SecondVarParent,Body2,Body4), + convert_to_grammar_part2(Rest1,FirstVar,SecondVar,SecondVarParent,Body4,Body3), !. 
+ +convert_to_grammar_part311t(Item1,FirstVar1,SecondVarParent,Body2,Body3) :- + to_variable_name(FirstVar1,FirstVarName), + to_variable_name(SecondVarParent,SecondVarParentName), + Call=[Item1,FirstVarName,SecondVarParentName], + append([[n,grammar_part]],[Call],Item2), + append(Body2,[Item2],Body3),!. + +convert_to_grammar_part32t(Body1,FirstVar1,SecondVar,SecondVarParent,Body2,Body3) :- + Body1=[Item1|Rest1], + convert_to_grammar_part321t(Item1,Rest1,FirstVar1,SecondVar,SecondVarParent,Body2,Body3),!. + +convert_to_grammar_part321t(Item1,Rest1,FirstVar1,SecondVar1,SecondVarParent,Body2,Body3) :- + to_variable_name(FirstVar1,FirstVarName), + to_variable_name(SecondVar1,SecondVarName), + Call=[Item1,FirstVarName,SecondVarName], + append([[n,grammar_part]],[Call],Item2), + append(Body2,[Item2],Body4), + FirstVar2 is SecondVar1, + SecondVar2 is SecondVar1+1, + convert_to_grammar_part2(Rest1,FirstVar2,SecondVar2,SecondVarParent,Body4,Body3),!. + +last_call_or_terminal2([]) :- !. +last_call_or_terminal2(Body1) :- + Body1=[Item|Body2], + not(call_or_terminal(Item)), + last_call_or_terminal2(Body2),!. + +to_variable_name(Var,Name1) :- + atom_concat(vgp,Var,Name2), + Name1=[v,Name2],!. +variable_name([v,_Name]) :- !. +predicate_or_rule_name([n,_Name]) :- !. +predicate_or_rule([[n,_Name]|_Variables]) :- !. +variables([]) :- !. +variables(Variables1) :- + Variables1=[Variable|Variables2], + terminal(Variable),%%=[v,_VariableName], + variables(Variables2),!. +terminal([]) :- !. +terminal(Item1) :- + not(code(Item1)), + ([Item2]=Item1->true;Item2=Item1), + (variable_name(Item2)->true;string(Item2)),!. +code(Item) :- + (Item=[[n,code]|_Rest]->true;Item=[n,code]),!. +call1(Item) :- (Item=[[n,_PredicateName]|_Variables]->true;Item=[n,_PredicateName]),not(code(Item)),!. +call_not_grammar([[n,PredicateName]|_Variables]) :- not(PredicateName=grammar),not(PredicateName=grammar_part),!. +call_grammar_part([[n,grammar_part]|_Variables]) :- !. +name([n,_Name]):-!. 
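The helper predicates above classify grammar-body items: `code/1` matches `[n,code]` items, `call1/1` matches `[n,Name]` calls that are not code, and `to_variable_name/2` builds the chained `vgp` variables threaded through `convert_to_grammar_part2`. A minimal Python sketch of those term shapes (the function names are hypothetical; only the list shapes come from the Prolog above):

```python
def to_variable_name(counter):
    # Mirrors to_variable_name/2: prefix the counter with "vgp" and
    # wrap the result as a [v, Name] variable term.
    return ["v", "vgp%d" % counter]

def is_code(item):
    # Mirrors code/1: [n, code] or [[n, code] | rest].
    return item == ["n", "code"] or (
        isinstance(item, list) and len(item) > 0 and item[0] == ["n", "code"])

def is_call(item):
    # Mirrors call1/1: [n, Name] or [[n, Name] | vars], excluding code items.
    if is_code(item):
        return False
    if isinstance(item, list) and len(item) == 2 and item[0] == "n":
        return True
    return (isinstance(item, list) and len(item) > 0 and
            isinstance(item[0], list) and len(item[0]) == 2 and item[0][0] == "n")
```

For example, `to_variable_name(2)` yields `["v", "vgp2"]`, matching the `vgp1`, `vgp2`, … chain of intermediate phrase variables generated above.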
+call_or_terminal(Item) :- + terminal(Item)->true;call1(Item),!. + +%% write grammar part + +%% [[]] in args + +%% collect code commands to write + + +%% Terminal known, Phrase2 known +%% Terminal unknown, Phrase2 known +%% Terminal known, Phrase2 unknown +%% Terminal unknown, Phrase2 unknown - error +%% if known or unknown, can be [v] + +%% Gets val or sets prolog var to undef, puts val in undef in var + +%% Doesn't support [v], use wrap(v) x - append not string concat if P1 is ["a"] + +getvalues2([],Values,Values,_Vars,Flags,Flags) :- !. +getvalues2(VarNames1,Values1,Values2,Vars,Flags1,Flags2) :- + VarNames1=[VarName1|VarNames2], + (VarName1=[VarName2]->Flag1=true;VarName2=VarName1), + getvalue(VarName2,Value1,Vars), + (Value1=empty->Flag2=true;(Value2=Value1,Flag2=false)), + (Flag1=true->Value3=[Value2];Value3=Value2), + append(Values1,Value3,Values3), + append(Flags1,[Flag2],Flags3), + getvalues2(VarNames2,Values3,Values2,Vars,Flags3,Flags2),!. +undefined_to_empty([],Values,Values) :- !. +undefined_to_empty(Values1,Values2,Values3) :- + Values1=[Value1|Values4], + (var(Value1)->Value2=empty;Value2=Value1), + append(Values2,[Value2],Values5), + undefined_to_empty(Values4,Values5,Values3),!. \ No newline at end of file diff --git a/Text-to-Breasonings-master/group_meditation b/Text-to-Breasonings-master/group_meditation new file mode 100644 index 0000000000000000000000000000000000000000..fea863f638a1ba5fb8b4139f2f39eafd66023f7e Binary files /dev/null and b/Text-to-Breasonings-master/group_meditation differ diff --git a/Text-to-Breasonings-master/group_meditation.pl b/Text-to-Breasonings-master/group_meditation.pl new file mode 100644 index 0000000000000000000000000000000000000000..0733703b15e14c828d7c567106a58b37baffb2f4 --- /dev/null +++ b/Text-to-Breasonings-master/group_meditation.pl @@ -0,0 +1,8 @@ +:-include('text_to_breasonings.pl'). +:-include('meditatorsanddoctors.pl'). +:-include('meditationnoreplace2.pl'). 
+main:-catch((meditators(A),meditators2(B),length(A,AL),length(B,BL),CL is AL+BL,N1 is 108*2,texttobr2_1(N1),texttobr2_1(CL)),Err,handle_error(Err)),halt. + +handle_error(_Err):- + halt(1). +main :- halt(1). \ No newline at end of file diff --git a/Text-to-Breasonings-master/in username folder/texttobr2.sh b/Text-to-Breasonings-master/in username folder/texttobr2.sh new file mode 100644 index 0000000000000000000000000000000000000000..01549a0f3d9c4438e56558204b21f7a2cc0ff084 --- /dev/null +++ b/Text-to-Breasonings-master/in username folder/texttobr2.sh @@ -0,0 +1,3 @@ +#!/bin/sh +cd /Users/username/codefolder/ +swipl -G100g -T20g -L2g -s meditation1.pl diff --git a/Text-to-Breasonings-master/listprologinterpreter1listrecursion4.pl b/Text-to-Breasonings-master/listprologinterpreter1listrecursion4.pl new file mode 100644 index 0000000000000000000000000000000000000000..a9c35f7fda7e88f35e21dde6f1bff9c3a297c290 --- /dev/null +++ b/Text-to-Breasonings-master/listprologinterpreter1listrecursion4.pl @@ -0,0 +1,780 @@ +:- dynamic debug/1. +:- dynamic cut/1. + +/** List Prolog Interpreter **/ + +interpret(Debug,Query,Functions1,Result) :- +%%writeln([i1]), + convert_to_grammar_part1(Functions1,[],Functions2), + %%writeln(Functions2), +%%writeln(Functions2), + interpret1(Debug,Query,Functions2,Functions2,Result), + !. +interpret1(Debug,Query,Functions1,Functions2,Result) :- +%%writeln([i11]), + retractall(debug(_)), + assertz(debug(Debug)), + retractall(cut(_)), + assertz(cut(off)), + member1(Query,Functions1,Functions2,Result). +%%member1([_,R],_,[],R). +%%member1(_,_,[],[]). +member1(_,_,[],_) :- fail,!. 
+member1(Query,Functions,Functions2,Vars8) :- +%%writeln([m1]), + cut(off)->( + (Query=[Function,Arguments1], + (Functions2=[[Function,Arguments2,":-",Body]|_Functions3]), + length(Arguments1,Length), + length(Arguments2,Length), + + ((Function=[n,grammar]->true;Function=[n,grammar_part])->checkarguments1(Arguments1,Arguments2,[],Vars1,[],FirstArgs);checkarguments(Arguments1,Arguments2,[],Vars1,[],FirstArgs),!), + %%->ca2 +%%writeln([checkarguments,"Arguments1",Arguments1,"Arguments2",Arguments2,"Vars1",Vars1,"FirstArgs",FirstArgs]), + (debug(on)->(writeln([call,[Function,Arguments1],"Press c."]),(not(get_single_char(97))->true;abort));true), + interpretbody(Functions,Functions2,Vars1,Vars2,Body,true), + updatevars(FirstArgs,Vars2,[],Result), + %%reverse(Result,[],Vars7), + ((not(Result=[])-> + %%Result=[Var71|Vars72], + unique1(Result,[],Vars8), +%%writeln(["FirstArgs",FirstArgs,"Vars",Vars2,"Result",Result,"Vars7",Vars7,"Vars72",Vars72,"Var71",Var71,"Vars8",Vars8]), +%%writeln(["Vars8",Vars8]), + findresult3(Arguments1,Vars8,[],Result2) +%%writeln([findresult3,"Arguments1",Arguments1,"Vars8",Vars8,"Result2",Result2]) + );( +%%writeln(here1), + Vars8=[],Result2=[])), +%%writeln(["Arguments1",Arguments1,"Vars2",Vars2,"Result",Result]), + (debug(on)->(writeln([exit,[Function,Result2],"Press c."]),(not(get_single_char(97))->true;abort));true)) + ->true; + (%%Query=[Function,_Arguments1], + %%Functions2=[[Function,_Arguments2,":-",_Body]|Functions3], %% make like previous trunk? + member11(Query,Functions,Functions2,Vars8)) + );(turncut(off)%%,Result=[] + ). 
+member11(Query,Functions,Functions2,Result) :- +%%writeln([m11]), +%%writeln(["Query",Query,"Functions",Functions,"Functions2",Functions2,"Result",Result]), + cut(off)->( + (Query=[Function], + (Functions2=[[Function,":-",Body]|_Functions3]), + (debug(on)->(writeln([call,[Function],"Press c."]),(not(get_single_char(97))->true;abort));true), + Result=[], + interpretbody(Functions,Functions2,[],_Vars2,Body,true),!, + (debug(on)->(writeln([exit,[Function],"Press c."]),(not(get_single_char(97))->true;abort));true) + )->true; + (%%Query=[Function], + %%Functions2=[[Function]|Functions3], + member12(Query,Functions,Functions2,Result)) + );(turncut(off)). +member12(Query,Functions,Functions2,Vars8) :- +%%writeln([m12]), + cut(off)->( + (Query=[Function,Arguments1], + (Functions2=[[Function,Arguments2]|_Functions3]), + length(Arguments1,Length), + length(Arguments2,Length), + ((Function=[n,grammar]->true;Function=[n,grammar_part])->checkarguments1(Arguments1,Arguments2,[],Vars1,[],FirstArgs);checkarguments(Arguments1,Arguments2,[],Vars1,[],FirstArgs),!), +%%writeln([checkarguments,"Arguments1",Arguments1,"Arguments2",Arguments2,"Vars1",Vars1,"FirstArgs",FirstArgs]), + updatevars(FirstArgs,Vars1,[],Result), + %%reverse(Result,[],Vars7), + ((not(Result=[])-> + %%Result=[Var71|Vars72], + unique1(Result,[],Vars8), + findresult3(Arguments1,Vars8,[],Result2) + );( +%%writeln(here2), + Vars8=[],Result2=[])), + (debug(on)->(writeln([call,[Function,Arguments1],"Press c."]),(not(get_single_char(97))->true;abort));true), + (debug(on)->(writeln([exit,[Function,Result2],"Press c."]),(not(get_single_char(97))->true;abort));true) + )->true; + (%%Query=[Function,_Arguments1], + %%Functions2=[[Function,_Arguments2]|Functions3], + member13(Query,Functions,Functions2,Vars8)) + );(turncut(off)). 
+member13(Query,Functions,Functions2,Result) :- +%%writeln([m13]), + cut(off)->( + (Query=[Function],!, + (Functions2=[[Function]|_Functions3]), + (debug(on)->(writeln([call,[Function],"Press c."]),(not(get_single_char(97))->true;abort));true), + Result=[], + %%interpretbody(Functions,[],_Vars2,Body,true), + (debug(on)->(writeln([exit,[Function],"Press c."]),(not(get_single_char(97))->true;abort));true) + )->true; + (%%Query=[Function], + Functions2=[_Function|Functions3], + member1(Query,Functions,Functions3,Result)) + );(turncut(off)). +interpret2(Query,Functions1,Functions2,Result) :- +%%writeln(i2), +%%writeln(["%%interpret2 Query",Query,"Functions1",Functions1,"Functions2",Functions2]), + member2(Query,Functions1,Functions2,Result). +%%member2([_,R],_,[],R). +%%member2(_,_,[],[]). +member2(_,_,[],_) :- fail,!. +member2(Query,Functions,Functions2,Vars8) :- +%%writeln([m2]), + cut(off)->( + (Query=[Function,Arguments1], + (Functions2=[[Function,Arguments2,":-",Body]|_Functions3]), + length(Arguments1,Length), + length(Arguments2,Length), + ((Function=[n,grammar]->true;Function=[n,grammar_part])->checkarguments1(Arguments1,Arguments2,[],Vars1,[],FirstArgs);checkarguments(Arguments1,Arguments2,[],Vars1,[],FirstArgs),!), +%%writeln([checkarguments,"Arguments1",Arguments1,"Arguments2",Arguments2,"Vars1",Vars1,"FirstArgs",FirstArgs]), + (debug(on)->(writeln([call,[Function,Arguments1],"Press c."]),(not(get_single_char(97))->true;abort));true), + interpretbody(Functions,Functions2,Vars1,Vars2,Body,true), %%**arg2 change +%%writeln(["Functions",Functions,"Functions2",Functions2,"Vars1",Vars1,"Vars2",Vars2,"Body",Body]), + updatevars(FirstArgs,Vars2,[],Result), + %%reverse(Result,[],Vars7), + ((not(Result=[])-> + %%Result=[Var71|Vars72], + unique1(Result,[],Vars8), + findresult3(Arguments1,Vars8,[],Result2) +%%writeln(["Vars2",Vars2,"Result",Result]), + );( + %%writeln(here3), + Vars8=[],Result2=[])), + (debug(on)->(writeln([exit,[Function,Result2],"Press 
c."]),(not(get_single_char(97))->true;abort));true) + )->true; + (%%Query=[Function,_Arguments1], + %%Functions2=[[Function,_Arguments2,":-",_Body]|Functions3], + member21(Query,Functions,Functions2,Vars8)) + );(turncut(off)). +member21(Query,Functions,Functions2,Result) :- +%%writeln([m21]), + cut(off)->( + (Query=[Function], + (Functions2=[[Function,":-",Body]|_Functions3]), + Vars1=[], + (debug(on)->(writeln([call,[Function],"Press c."]),(not(get_single_char(97))->true;abort));true), + interpretbody(Functions,Functions2,Vars1,_Vars2,Body,true),!, %%**arg2 change + (debug(on)->(writeln([exit,[Function],"Press c."]),(not(get_single_char(97))->true;abort));true) + )->true; + (%%Query=[Function], + %%Functions2=[[Function]|Functions3], + member22(Query,Functions,Functions2,Result)) + );(turncut(off)). +member22(Query,Functions,Functions2,Vars8) :- +%%writeln([m22]), + cut(off)->( + (Query=[Function,Arguments1], + (Functions2=[[Function,Arguments2]|_Functions3]), + length(Arguments1,Length), + length(Arguments2,Length), + ((Function=[n,grammar]->true;Function=[n,grammar_part])->checkarguments1(Arguments1,Arguments2,[],Vars1,[],FirstArgs);checkarguments(Arguments1,Arguments2,[],Vars1,[],FirstArgs),!), +%%writeln([checkarguments,"Arguments1",Arguments1,"Arguments2",Arguments2,"Vars1",Vars1,"FirstArgs",FirstArgs]), + updatevars(FirstArgs,Vars1,[],Result), + %%reverse(Result,[],Vars7), + ((not(Result=[])-> + %%Result=[Var71|Vars72], + unique1(Result,[],Vars8), + findresult3(Arguments1,Vars8,[],Result2) + );( +%%writeln(here4), + Vars8=[],Result2=[])), + (debug(on)->(writeln([call,[Function,Arguments1],"Press c."]),(not(get_single_char(97))->true;abort));true), + (debug(on)->(writeln([exit,[Function,Result2],"Press c."]),(not(get_single_char(97))->true;abort));true) + )->true; + (%%Query=[Function,_Arguments1], + %%Functions2=[[Function,_Arguments2]|Functions3], + member23(Query,Functions,Functions2,Vars8)) + );(turncut(off)). 
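The `member1` … `member13` cascade (and the `member2*` copies used for re-entrant calls) tries each stored definition against the query in a fixed order of head shapes, the same four forms `convert_to_grammar_part11` accepts. A Python sketch of that shape dispatch (`clause_shape` is a hypothetical name):

```python
def clause_shape(defn):
    # The four head shapes tried in order by member1/member11/member12/member13:
    if len(defn) == 4 and defn[2] == ":-":
        return "rule with arguments"      # [Name, Args, ":-", Body]
    if len(defn) == 3 and defn[1] == ":-":
        return "rule without arguments"   # [Name, ":-", Body]
    if len(defn) == 2:
        return "fact with arguments"      # [Name, Args]
    if len(defn) == 1:
        return "fact without arguments"   # [Name]
    return "malformed"                    # rejected with an error in the source
```

Note that matching a definition with arguments additionally requires the query's and the head's argument lists to have the same length (`length(Arguments1,Length), length(Arguments2,Length)` in the clauses above).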
+member23(Query,Functions,Functions2,Vars8) :- +%%writeln([m23]), + cut(off)->( + (Query=[Function],!, + (Functions2=[[Function]|_Functions3]), + (debug(on)->(writeln([call,[Function],"Press c."]),(not(get_single_char(97))->true;abort));true), + Vars8=[], + (debug(on)->(writeln([exit,[Function],"Press c."]),(not(get_single_char(97))->true;abort));true) + )->true; + (%%Query=[Function], + Functions2=[_Function|Functions3], + member2(Query,Functions,Functions3,Vars8)) + );(turncut(off)). +checkarguments([],[],Vars,Vars,FirstArgs,FirstArgs). +checkarguments(Arguments1,Arguments2,Vars1,Vars2,FirstArgs1,FirstArgs2) :- %% +%%writeln(1), + Arguments1=[Value|Arguments3], %% Value may be a number, string, list or tree + expressionnotatom(Value), + Arguments2=[Variable2|Arguments4], + not(var(Variable2)),isvar(Variable2), + putvalue(Variable2,Value,Vars1,Vars3), + checkarguments(Arguments3,Arguments4,Vars3,Vars2,FirstArgs1,FirstArgs2). +checkarguments(Arguments1,Arguments2,Vars1,Vars2,FirstArgs1,FirstArgs2) :- %%A +%%writeln(2), + Arguments1=[Variable|Arguments3], %% Value may be a number, string, list or tree + not(var(Variable)),isvar(Variable), + Arguments2=[Value|Arguments4], + expressionnotatom(Value), + putvalue(Variable,Value,Vars1,Vars3), + append(FirstArgs1,[[Variable,Value]],FirstArgs3), + checkarguments(Arguments3,Arguments4,Vars3,Vars2,FirstArgs3,FirstArgs2). +checkarguments(Arguments1,Arguments2,Vars1,Vars2,FirstArgs1,FirstArgs2) :- +%%writeln(3), + Arguments1=[Variable1|Arguments3], + not(var(Variable1)),isvar(Variable1), + Arguments2=[Variable2|Arguments4], + not(var(Variable2)),isvar(Variable2), + (getvalue(Variable2,Value,Vars1)->true;Value=empty), + %%((Value=empty->Value1=Variable2;Value1=Value))), + putvalue(Variable2,Value,Vars1,Vars3), + append(FirstArgs1,[[Variable1,Variable2]],FirstArgs3), + checkarguments(Arguments3,Arguments4,Vars3,Vars2,FirstArgs3,FirstArgs2). 
+checkarguments(Arguments1,Arguments2,Vars1,Vars2,FirstArgs1,FirstArgs2) :- +%%writeln(4), + Arguments1=[Value1|Arguments3], + expressionnotatom(Value1), + Arguments2=[Value1|Arguments4], + expressionnotatom(Value1), + checkarguments(Arguments3,Arguments4,Vars1,Vars2,FirstArgs1,FirstArgs2). + +checkarguments1([],[],Vars,Vars,FirstArgs,FirstArgs). +checkarguments1(Arguments1,Arguments2,Vars1,Vars2,FirstArgs1,FirstArgs2) :- %% +%%writeln(1), + Arguments1=[Value|Arguments3], %% Value may be a number, string, list or tree + expressionnotatom(Value), + Arguments2=[Variable2|Arguments4], + not(var(Variable2)),isvar(Variable2), + putvalue(Variable2,Value,Vars1,Vars3), + checkarguments1(Arguments3,Arguments4,Vars3,Vars2,FirstArgs1,FirstArgs2). +checkarguments1(Arguments1,Arguments2,Vars1,Vars2,FirstArgs1,FirstArgs2) :- %%A +%%writeln(2), + Arguments1=[Variable|Arguments3], %% Value may be a number, string, list or tree + not(var(Variable)),isvar(Variable), + Arguments2=[Value|Arguments4], + expressionnotatom(Value), + putvalue(Variable,Value,Vars1,Vars3), + append(FirstArgs1,[[Variable,_]],FirstArgs3), + checkarguments1(Arguments3,Arguments4,Vars3,Vars2,FirstArgs3,FirstArgs2). +checkarguments1(Arguments1,Arguments2,Vars1,Vars2,FirstArgs1,FirstArgs2) :- +%%writeln(3), + Arguments1=[Variable1|Arguments3], + not(var(Variable1)),isvar(Variable1), + Arguments2=[Variable2|Arguments4], + not(var(Variable2)),isvar(Variable2), + (getvalue(Variable2,Value,Vars1)->((Value=empty->Value1=Variable2;Value1=Value))), + putvalue(Variable2,Value1,Vars1,Vars3), + append(FirstArgs1,[[Variable1,Variable2]],FirstArgs3), + checkarguments1(Arguments3,Arguments4,Vars3,Vars2,FirstArgs3,FirstArgs2). +checkarguments1(Arguments1,Arguments2,Vars1,Vars2,FirstArgs1,FirstArgs2) :- +%%writeln(4), + Arguments1=[Value1|Arguments3], + expressionnotatom(Value1), + Arguments2=[Value1|Arguments4], + expressionnotatom(Value1), + checkarguments1(Arguments3,Arguments4,Vars1,Vars2,FirstArgs1,FirstArgs2). 
+ +interpretbody(_Functions1,_Functions2,Vars,Vars,[],true) :- !. + +interpretbody(Functions0,Functions,Vars1,Vars2,Body,Result1) :- + Body=[Statements1|Statements2],not(predicate_or_rule_name(Statements1)), + interpretbody(Functions0,Functions,Vars1,Vars3,Statements1,Result2), + interpretbody(Functions0,Functions,Vars3,Vars2,Statements2,Result3), + %%((Result3=cut)->!;true), + logicalconjunction(Result1,Result2,Result3),!. + +interpretbody(Functions0,Functions,Vars1,Vars2,Body,Result1) :- + Body=[[not,[Statements]]|Statements2], + interpretbody(Functions0,Functions,Vars1,Vars3,Statements,Result2), + %%((Result2=cut)->!;true), + interpretbody(Functions0,Functions,Vars3,Vars2,Statements2,Result3), + %%((Result3=cut)->!;true), + logicalnot(Result2,Result4), + (logicalconjunction(Result1,Result4,Result3)->true;(Result1=false)),!. +interpretbody(Functions0,Functions,Vars1,Vars2,Body,Result1) :- + Body=[[Statements1],or,[Statements2]], + (interpretbody(Functions0,Functions,Vars1,Vars2,Statements1,Result1)->true; + %%,((Value1=cut)->!;true)); + interpretbody(Functions0,Functions,Vars1,Vars2,Statements2,Result1)),!. + %%,((Value=cut)->!;true)). + %%(logicaldisjunction(Result1,Value1,Value2)->true;(Result1=false)). + + +interpretbody(Functions0,Functions,Vars1,Vars2,Body,Result1) :- + Body=[Statement|Statements], +%%writeln(["Functions0",Functions0,"Functions",Functions,"Statement",Statement,"Vars1",Vars1,"Vars3",Vars3,"Result2",Result2,"Cut",Cut]), + not(predicate_or_rule_name(Statement)), + interpretstatement1(Functions0,Functions,Statement,Vars1,Vars3,Result2,Cut), +%%writeln(["here1"]), + ((not(Cut=cut))->(Functions2=Functions);(turncut(on))), %% cut to interpret1/2 (assertz) +%%writeln(["here3"]), + interpretbody(Functions0,Functions2,Vars3,Vars2,Statements,Result3), + %%((Result3=cut)->!;true), +%%writeln(["here4"]), + logicalconjunction(Result1,Result2,Result3),!. +%%writeln([Result1,Result2,Result3]). 
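`interpretbody/6` threads the binding list through a body: each statement runs against the bindings produced so far, and the body's result is the conjunction of its statements' results. A condensed Python sketch of that threading (the `interpret_statement` callback is hypothetical and stands in for `interpretstatement1`):

```python
def interpret_body(statements, bindings, interpret_statement):
    # Mirrors the conjunction clause of interpretbody/6: thread the
    # bindings through each statement in order; fail as soon as one fails.
    for statement in statements:
        ok, bindings = interpret_statement(statement, bindings)
        if not ok:
            return False, bindings
    return True, bindings
```

The Prolog version additionally handles `not`, disjunction via `or`, and cut signalling through `turncut/1`, which this sketch omits.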
+turncut(State1) :- + cut(State2), + retract(cut(State2)), + assertz(cut(State1)). +logicaldisjunction(true,Result2,Result3) :- + true(Result2);true(Result3). +logicalconjunction(true,Result2,Result3) :- + true(Result2),true(Result3). +logicalnot(Result1,Result2) :- + true(Result1),false(Result2). +logicalnot(Result1,Result2) :- + false(Result1),true(Result2). +true(true). +false(false). + +%%interpretstatement1(_F0,[],_,Vars,Vars,true,nocut) :- ! +%%writeln("AND HERE!") +%% . + +interpretstatement1(_F0,_Functions,[[n,cut]],Vars,Vars,true,cut) :- !. + +interpretstatement1(_F0,_Functions,[[n,atom],[Variable]],Vars,Vars,true,nocut) :- + getvalue(Variable,Value,Vars), + (debug(on)->(writeln([call,[[n,atom],[Value]],"Press c."]),(not(get_single_char(97))->true;abort));true), + atom(Value), + (debug(on)->(writeln([exit,[[n,atom],[Value]],"Press c."]),(not(get_single_char(97))->true;abort));true). + +interpretstatement1(_F0,_Functions,[[n,string],[Variable]],Vars,Vars,true,nocut) :- + getvalue(Variable,Value,Vars), + (debug(on)->(writeln([call,[[n,string],[Value]],"Press c."]),(not(get_single_char(97))->true;abort));true), + string(Value), + (debug(on)->(writeln([exit,[[n,string],[Value]],"Press c."]),(not(get_single_char(97))->true;abort));true). + +interpretstatement1(_F0,_Functions,[[n,number],[Variable]],Vars,Vars,true,nocut) :- + getvalue(Variable,Value,Vars), + (debug(on)->(writeln([call,[[n,number],[Value]],"Press c."]),(not(get_single_char(97))->true;abort));true), + number(Value), + (debug(on)->(writeln([exit,[[n,number],[Value]],"Press c."]),(not(get_single_char(97))->true;abort));true). 
+ +interpretstatement1(_F0,_Functions,[[n,letters],[Variable]],Vars,Vars,true,nocut) :- + getvalue(Variable,Value,Vars), + (debug(on)->(writeln([call,[[n,letters],[Value]],"Press c."]),(not(get_single_char(97))->true;abort));true), + string_codes(Value,Value1), + phrase(word1(Value1),_), + (debug(on)->(writeln([exit,[[n,letters],[Value]],"Press c."]),(not(get_single_char(97))->true;abort));true). + +interpretstatement1(_F0,_Functions,[[n,variable],[Variable]],Vars,Vars,true,nocut) :- + var(Variable), + (debug(on)->(writeln([call,[[n,variable],[Variable]],"Press c."]),(not(get_single_char(97))->true;abort));true), + (debug(on)->(writeln([exit,[[n,variable],[Variable]],"Press c."]),(not(get_single_char(97))->true;abort));true). + +interpretstatement1(_F0,_Functions,[[n,Operator],[Variable1,Variable2]],Vars1,Vars2,true,nocut) :- + isop(Operator), + interpretpart(is,Variable1,Variable2,Vars1,Vars2). + +interpretstatement1(_F0,_Functions,[[n,Operator],[Variable1,Variable2]],Vars1,Vars2,true,nocut) :- +%%writeln(31), + isop(Operator), + interpretpart(is,Variable2,Variable1,Vars1,Vars2). + +interpretstatement1(_F0,_Functions,[[n,+],[Variable2,Variable3,Variable1]],Vars1,Vars2,true,nocut) :- +%%writeln(4), + interpretpart(isplus,Variable1,Variable2,Variable3,Vars1,Vars2). + +%%interpretstatement1(_F0,_Functions,[Variable2+Variable3,is,Variable1],Vars1,Vars2,true,nocut) :- +%%writeln(41), + %%interpretpart(isplus,Variable1,Variable2,Variable3,Vars1,Vars2). + +interpretstatement1(_F0,_Functions,[[n,=],[Variable1,[Variable2,Variable3]]],Vars1,Vars2,true,nocut) :- +%%writeln(5), + interpretpart(match,Variable1,Variable2,Variable3,Vars1,Vars2). + +%%interpretstatement1(_F0,_Functions,[[Variable2,Variable3]=Variable1],Vars1,Vars2,true,nocut) :- +%%writeln(51), +%% interpretpart(match,Variable1,Variable2,Variable3,Vars1,Vars2). 
+ +interpretstatement1(_F0,_Functions,[[n,wrap],[Variable1,Variable2]],Vars1,Vars2,true,nocut) :- +%%writeln(52), wrap + interpretpart(bracket1,Variable1,Variable2,Vars1,Vars2). + +interpretstatement1(_F0,_Functions,[[n,unwrap],[Variable1,Variable2]],Vars1,Vars2,true,nocut) :- +%%writeln(53), unwrap + interpretpart(bracket2,Variable1,Variable2,Vars1,Vars2). + +interpretstatement1(_F0,_Functions,[[n,head],[Variable1,Variable2]],Vars1,Vars2,true,nocut) :- +%%writeln(6), + interpretpart(head,Variable1,Variable2,Vars1,Vars2). + +interpretstatement1(_F0,_Functions,[[n,tail],[Variable1,Variable2]],Vars1,Vars2,true,nocut) :- +%%writeln(61), + interpretpart(tail,Variable1,Variable2,Vars1,Vars2). + + +interpretstatement1(_F0,_Functions,[[n,member],[Variable1,Variable2]],Vars1,Vars2,true,nocut) :- +%%writeln(8), + interpretpart(member,Variable1,Variable2,Vars1,Vars2). + +interpretstatement1(_F0,_Functions,[[n,delete],[Variable1,Variable2,Variable3]],Vars1,Vars2,true,nocut) :- +%%writeln(), + interpretpart(delete,Variable1,Variable2,Variable3,Vars1,Vars2). +%%** all in form f,[1,1,etc], including + with 0,1 + +interpretstatement1(_F0,_Functions,[[n,append],[Variable1,Variable2,Variable3]],Vars1,Vars2,true,nocut) :- +%%writeln(9), + interpretpart(append,Variable1,Variable2,Variable3,Vars1,Vars2). + +interpretstatement1(_F0,_Functions,[[n,stringconcat],[Variable1,Variable2,Variable3]],Vars1,Vars2,true,nocut) :- +%%writeln(9), + interpretpart(stringconcat,Variable1,Variable2,Variable3,Vars1,Vars2). + +/**interpretstatement1(_F0,_Functions,[[n,grammar_part]|Variables1],Vars1,Vars2,true,nocut) :- +%%writeln(x9), + [Variables2]=Variables1, + interpretpart(grammar_part,Variables2,Vars1,Vars2),!.**/ + +interpretstatement1(_F0,_Functions,[[n,stringtonumber],[Variable1,Variable2]],Vars1,Vars2,true,nocut) :- +%%writeln(52), wrap + interpretpart(stringtonumber,Variable1,Variable2,Vars1,Vars2). 
+ +interpretstatement1(Functions0,_Functions,Query1,Vars1,Vars8,true,nocut) :- +%%writeln("h1/10"), + Query1=[[n,grammar]|Arguments], + ((Arguments=[[Grammar1,Phrase1,RuleName|Variables2]], + %%[Variables3]=Variables2, + name(RuleName), + convert_to_grammar_part1(Grammar1,[],Grammar2))->true; + (Grammar2=Functions0, + ((Arguments=[[Phrase1,RuleName|Variables2]] + %%([Variables3]=Variables2->true;(Variables2=[],Variables3=[])) + )))), + +%%writeln(["Arguments",Arguments,"Vars1",Vars1]), + +%%substitutevarsA1(Phrase,Vars1,[],Vars3,[],FirstArgs1), + +%%Vars3=[[[v,PhraseVarName],PhraseValue]], +%%Vars4=[[[v,vgp1],PhraseValue]], + + append([Phrase1],Variables2,Variables4), %% *** Should V3 be in [] v + +substitutevarsA1(Variables4,Vars1,[],Vars2,[],FirstArgs), %%% var to value, after updatevars: more vars to values, and select argument vars from latest vars +%%writeln([substitutevarsA1,arguments,Arguments,vars1,Vars1,vars3,Vars3,firstargs,FirstArgs]), + +Vars2=[Phrase2|Vars4], +((Phrase2=[]->true;Phrase2=[_A|_B])->End=[];End=""), + (not(Vars4=[])->append([RuleName,Phrase2,End],Vars4,Vars5); + (Vars5=[RuleName,Phrase2,End])), + Query2=[[n,grammar_part],Vars5], + ((((terminal(RuleName), + + + (not(Vars4=[])->append([Phrase2,RuleName],Vars4,Vars52); + (Vars52=[Phrase2,RuleName])), + + (debug(on)->(writeln([call,[[n,grammar],Vars52],"Press c."]),(not(get_single_char(97))->true;abort));true), + + interpretpart(grammar_part,Vars5,[],Result1), + + updatevars2(FirstArgs,Result1,[],Vars51), + updatevars3(Vars2,Vars51,Vars6), + reverse(Vars6,[],Vars7), + ((not(Vars7=[])-> + %%Vars7=[Var71|Vars72], + unique1(Vars7,[],Vars8) +)->true;( +%%writeln(here1), + Vars8=[]),strip(Vars8,[],Result2))->true)), + + (debug(on)->(writeln([exit,[[n,grammar],Result2],"Press c."]),(not(get_single_char(97))->true;abort));true) + +)->true;(not(terminal(RuleName)), + %% Bodyvars2? 
+ + (not(Vars4=[])->append([Phrase2,RuleName],Vars4,Vars52); + (Vars52=[Phrase2,RuleName])), + +(debug(on)->(writeln([call,[[n,grammar],Vars52],"Press c."]),(not(get_single_char(97))->true;abort));true), + +%% debug(on)->writeln([call,[Function,[Vars3]]]), +%%writeln(["Query2",Query2,"Functions0",Functions0]), + interpret2(Query2,Grammar2,Grammar2,Result1), + (debug(on)->(writeln([exit,[[n,grammar],Vars52],"Press c."]),(not(get_single_char(97))->true;abort));true), + + updatevars2(FirstArgs,Result1,[],Vars51), + updatevars3(Vars2,Vars51,Vars6), + reverse(Vars6,[],Vars7), + ((not(Vars7=[])-> + %%Vars7=[Var71|Vars72], + unique1(Vars7,[],Vars8) +);( +%%writeln(here1), + Vars8=[])))),!. + + +interpretstatement1(Grammar,_Grammar2,Query1,Vars1,Vars8,true,nocut) :- +%%writeln("h1/10"), +%%trace,%%%%**** + Query1=[[n,grammar_part]|Arguments], + Arguments=[[RuleName|Variables2]], + %%(([Variables4|Rest]=Variables2->Variables3=Variables2;(Variables2=[],Variables3=[]))), + + ((not(terminal(RuleName)), +%%writeln(["Arguments",Arguments,"Vars1",Vars1]), + substitutevarsA1(Variables2,Vars1,[],Vars3,[],FirstArgs), %%% var to value, after updatevars: more vars to values, and select argument vars from latest vars +%%writeln([substitutevarsA1,arguments,Arguments,vars1,Vars1,vars3,Vars3,firstargs,FirstArgs]), + (not(Vars3=[])->(append([RuleName],Vars3,Vars4),Query2=[[n,grammar_part],Vars4]); + Query2=[[n,grammar_part],RuleName]), %% Bodyvars2? 
+%% debug(on)->writeln([call,[Function,[Vars3]]]), +%%writeln(["Query2",Query2,"Functions0",Functions0]), + %%notrace,%%**** + interpret2(Query2,Grammar,Grammar,Result1), + %%trace,%**** + updatevars2(FirstArgs,Result1,[],Vars5), + updatevars3(Vars1,Vars5,Vars6), + reverse(Vars6,[],Vars7), + ((not(Vars7=[])-> + %%Vars7=[Var71|Vars72], + unique1(Vars7,[],Vars8) +)->true;( +%%writeln(here1), + Vars8=[]))->true)->true; +(terminal(RuleName),substitutevarsA1(Variables2,Vars1,[],Vars3,[],FirstArgs), +%%writeln(here), %%**** +%%Vars3=[Phrase,End], +%%Vars41=[Phrase,[v,vgp]], +append([RuleName],Vars3,Vars9), +%%writeln([vars9,Vars9]), %%%%%***** +interpretpart(grammar_part,Vars9,[],Result1), + updatevars2(FirstArgs,Result1,[],Vars5), + updatevars3(Vars3,Vars5,Vars6), + reverse(Vars6,[],Vars7), + ((not(Vars7=[])-> + %%Vars7=[Var71|Vars72], + unique1(Vars7,[],Vars8) + %%writeln([vars8,Vars8]) %%%***** +)->true;( +%%writeln(here1), + Vars8=[]))->true)),%%notrace, %%**** + !. + + +interpretstatement1(Functions0,_Functions,Query1,Vars1,Vars8,true,nocut) :- +%%writeln("h1/10"), + Query1=[Function,Arguments],not(Function=[n,grammar]->true;Function=[n,grammar_part]), +%%writeln(["Arguments",Arguments,"Vars1",Vars1]), + substitutevarsA1(Arguments,Vars1,[],Vars3,[],FirstArgs), %%% var to value, after updatevars: more vars to values, and select argument vars from latest vars +%%writeln([substitutevarsA1,arguments,Arguments,vars1,Vars1,vars3,Vars3,firstargs,FirstArgs]), + Query2=[Function,Vars3], %% Bodyvars2? +%% debug(on)->writeln([call,[Function,[Vars3]]]), +%%writeln(["Query2",Query2,"Functions0",Functions0]), + interpret2(Query2,Functions0,Functions0,Result1), + updatevars2(FirstArgs,Result1,[],Vars5), + updatevars3(Vars1,Vars5,Vars6), + reverse(Vars6,[],Vars7), + ((not(Vars7=[])-> + %%Vars7=[Var71|Vars72], + unique1(Vars7,[],Vars8) +);( +%%writeln(here1), + Vars8=[])). +%%**** reverse and take first instance of each variable. 
+	%%findresult3(Arguments,Vars6,[],Result2)
+%%writeln(["FirstArgs",FirstArgs,"Result1",Result1,"Vars5",Vars5,"Vars4",Vars4]),
+%%writeln(["Vars1:",Vars1,"Vars4:",Vars4]),
+%%	debug(on)->writeln([exit,[Function,[Result2]]]).
+interpretstatement1(Functions0,_Functions,Query,Vars,Vars,true,nocut) :-
+	Query=[Function],
+	(debug(on)->writeln([call,[Function]]);true),
+	interpret2(Query,Functions0,Functions0,_Result1),
+	(debug(on)->writeln([exit,[Function]]);true).
+
+word1([])-->[].
+word1([A|As]) --> [A],word1(As),{%%atom_codes(A,AC),
+char_type(A,alpha)},!.
+/**interpretstatement1(_Functions0, _Functions,_Query,_Vars1,_Vars2,false) :-
+	writeln([false]).
+**/
+interpretstatement2(Value,_Vars,Value) :-
+	(number(Value);atom(Value)).
+interpretstatement2(Variable,Vars1,Value) :-
+	getvalue(Variable,Value,Vars1).
+interpretstatement3(A + B,Vars,Value1) :-
+	interpretstatement2(A,Vars,Value2),
+	interpretstatement2(B,Vars,Value3),
+	Value1 = Value2 + Value3.
+interpretstatement3(Value,_Vars,Value) :-
+	(number(Value);atom(Value)).
+interpretstatement3(Variable,Vars,Value) :-
+	getvalue(Variable,Value,Vars).
+getvalue(Variable,Value,Vars) :-
+	((not(isvar(Variable)),isvalstrorundef(Value),Variable=Value)->true;
+	(isvar(Variable),isvalstrorundef(Value),getvar(Variable,Value,Vars))).
+putvalue(Variable,Value,Vars1,Vars2) :-
+	((not(isvar(Variable)),isvalstrorundef(Value),Variable=Value,Vars1=Vars2)->true;
+	(isvar(Variable),isvalstrorundef(Value),updatevar(Variable,Value,Vars1,Vars2))),!.
+getvar(Variable,Value,Vars) :-
+	member([Variable,Value],Vars),
+	not(Value=empty).
+getvar(undef,undef,_Vars) :-
+	!.
+getvar(Variable,empty,Vars) :-
+	not(member([Variable,_Value],Vars))->true;
+	member([Variable,empty],Vars).
+updatevar(undef,_Value,Vars,Vars) :-
+	!.
+updatevar(Variable,Value,Vars1,Vars2) :- + ((((member([Variable,empty],Vars1), + delete(Vars1,[Variable,empty],Vars3), + append(Vars3,[[Variable,Value]],Vars2))->true; + ((not(member([Variable,Value1],Vars1)), + ((Value1=empty)->true;(Value1=Value)))), + append(Vars1,[[Variable,Value]],Vars2))->true; + (member([Variable,Value],Vars1),Vars2=Vars1))->true; + (undef(Variable), + append(Vars1,[[Variable,Value]],Vars2))). +/**updatevars(_FirstArgs,[],Vars,Vars). +updatevars(FirstArgs,Vars1,Vars2,Vars3) :- + Vars1=[[Variable1,Value]|Vars4], + ((member([Variable2,Variable1],FirstArgs), %% removed brackets around firstargs here and 2 line below + append(Vars2,[[Variable2,Value]],Vars5))->true; + (member([Variable1,_Variable2],FirstArgs), + append(Vars2,[[Variable1,Value]],Vars5))), + updatevars(FirstArgs,Vars4,Vars5,Vars3), + !. +updatevars(FirstArgs,Vars1,Vars2,Vars3) :- + Vars1=[_Vars4|Vars5], + updatevars(FirstArgs,Vars5,Vars2,Vars3).**/ +updatevars([],_Vars1,Vars2,Vars2) :- !. +updatevars(FirstArgs,Vars1,Vars2,Vars3) :- + FirstArgs=[[Orig,New]|Rest], + (expressionnotatom(New)->append(Vars2,[[Orig,New]],Vars4); + (member([New,Value],Vars1), + append(Vars2,[[Orig,Value]],Vars4))), + updatevars(Rest,Vars1,Vars4,Vars3),!. +updatevars2(_FirstArgs,[],Vars,Vars) :- !. +updatevars2(FirstArgs,Vars1,Vars2,Vars3) :- + Vars1=[[Variable,Value]|Vars4], + (member(Variable,FirstArgs), %% removed brackets around firstargs here and 2 line below, ** vars1 into arg in (10), check cond + append(Vars2,[[Variable,Value]],Vars5)), + updatevars2(FirstArgs,Vars4,Vars5,Vars3). +updatevars3(Vars1,[],Vars1). +updatevars3(Vars1,Vars2,Vars4) :- + Vars2=[[Variable,Value]|Vars5], + delete(Vars1,[Variable,empty],Vars6), + append(Vars6,[[Variable,Value]],Vars7), + updatevars3(Vars7,Vars5,Vars4), + !. +updatevars3(Vars1,Vars2,Vars4) :- + Vars2=[[Variable,Value]|Vars5], + append(Vars1,[[Variable,Value]],Vars6), + updatevars3(Vars6,Vars5,Vars4). +reverse([],List,List). 
+reverse(List1,List2,List3) :- + List1=[Head|Tail], + append([Head],List2,List4), + reverse(Tail,List4,List3). +unique1([],Items,Items). +unique1([Item|Items1],Items2,Items3) :- + delete(Items1,Item,Items4), + append(Items2,[Item],Items5), + unique1(Items4,Items5,Items3). +isvar([v,_Value]) :- !. +isval(Value) :- + number(Value). +isvalstr(N) :- + isval(N);string(N). +isvalempty(N) :- + isval(N);(N=empty). +/**isvalstrempty(N) :- + isval(N);(string(N);N=empty).**/ +isvalstrempty(N) :- + var(N),!. +isvalstrempty(N) :- + isval(N),!. +isvalstrempty(N) :- + string(N). +isvalstrempty(empty). +isvalstrempty([]). +/**isvalstrempty(N) :- + atom(N),fail,!. +**/ +isvalstrorundef(N) :- + var(N),!. +isvalstrorundef(N) :- + not(var(N)),isval(N),!. +isvalstrorundef(N) :- + not(var(N)),expression(N),!. +undef(N) :- + var(N). +/** +expression(N) :- + isval(N);(string(N);atom(N)),!. +expression([]). +expression(empty). +expression([N]) :- + expression(N). +expression([N|Ns]):- + expression(N),expression(Ns). +**/ + +expression(empty) :- + !. +expression(N) :- + isval(N),!. +expression(N) :- + string(N),!. +expression(N) :- + atom(N),!. +expression([]) :- + !. +expression(N) :- + not(atom(N)), + length(N,L),L>=1, + expression2(N). +expression2([]). +expression2([N|Ns]) :- + expression3(N), + expression2(Ns). +expression3(N) :- + isval(N),!. +expression3(N) :- + string(N),!. +expression3(N) :- + atom(N),!. + +expressionnotatom(N) :- + isvalstrempty(N),!. +expressionnotatom(N) :- + not(atom(N)), + length(N,L),L>=1, + expressionnotatom2(N),!. +expressionnotatom(Name) :- + predicate_or_rule_name(Name),!. +expressionnotatom2([]). +expressionnotatom2([N|Ns]) :- + isvalstrempty(N), + expressionnotatom2(Ns). + +substitutevarsA1(Arguments,Vars1,Vars2,Vars3,FirstArgs1,FirstArgs2) :- + substitutevarsA2(Arguments,Vars1,Vars2,Vars3,FirstArgs1,FirstArgs2),!. +substitutevarsA2([],_Vars1,Vars2,Vars2,FirstArgs,FirstArgs):-!. 
+substitutevarsA2(Arguments,Vars1,Vars2,Vars3,FirstArgs1,FirstArgs2) :- + Arguments=[Variable|Variables], + ((getvalue(Variable,Value,Vars1), + Value=empty)-> + ((append(Vars2,[Variable],Vars4)), + (isvar(Variable)->append(FirstArgs1,[Variable], + FirstArgs3);FirstArgs3=FirstArgs1)); + (getvalue(Variable,Value,Vars1), + append(Vars2,[Value],Vars4)), + FirstArgs3=FirstArgs1), + substitutevarsA2(Variables,Vars1,Vars4,Vars3,FirstArgs3,FirstArgs2),!. + +findresult3([],_Result,Result2,Result2). +findresult3(Arguments1,Result1,Result2,Result3) :- + Arguments1=[Value|Arguments2], + expressionnotatom(Value), + append(Result2,[Value],Result4), + findresult3(Arguments2,Result1,Result4,Result3). +findresult3(Arguments1,Result1,Result2,Result3) :- + Arguments1=[Variable|Arguments2], + isvar(Variable), + member([Variable,Value],Result1), + append(Result2,[Value],Result4), + findresult3(Arguments2,Result1,Result4,Result3). + +strip([],Result2,Result2). +strip(Arguments1,Result2,Result3) :- + Arguments1=[[Variable,Value]|Arguments2], + isvar(Variable), + append(Result2,[Value],Result4), + strip(Arguments2,Result4,Result3). 
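The helpers above operate on the interpreter's variable store, a list of `[[v,Name],Value]` pairs in which the atom `empty` marks a not-yet-bound variable. As a minimal sketch (not part of the interpreter itself; `demo_var_store` is a hypothetical name), the following shows how `putvalue/4`, `getvalue/3` and `unique1/3` defined above behave on that representation:

```prolog
% Hypothetical usage sketch, assuming the predicates above are loaded.
% putvalue/4 replaces an empty binding, getvalue/3 reads it back, and
% unique1/3 removes exact duplicate pairs while keeping first-seen order.
demo_var_store :-
    % bind [v,x] to 3 in a store where it was previously empty
    putvalue([v,x],3,[[[v,x],empty]],Vars),
    getvalue([v,x],Value,Vars),
    writeln([vars,Vars,value,Value]),     % [[[v,x],3]] and 3
    % drop a repeated [Variable,Value] pair, keeping the first occurrence
    unique1([[[v,x],1],[[v,y],2],[[v,x],1]],[],Unique),
    writeln([unique,Unique]).             % [[[v,x],1],[[v,y],2]]
```

Note that `unique1/3` only deduplicates identical pairs; keeping the latest binding of each variable is handled separately by reversing the store first, as the comment "reverse and take first instance of each variable" above indicates.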
diff --git a/Text-to-Breasonings-master/listprologinterpreter3preds5.pl b/Text-to-Breasonings-master/listprologinterpreter3preds5.pl new file mode 100644 index 0000000000000000000000000000000000000000..5bab8794134637fbffd1255171f1a130675a2040 --- /dev/null +++ b/Text-to-Breasonings-master/listprologinterpreter3preds5.pl @@ -0,0 +1,218 @@ +interpretpart(is,Variable1,Variable2,Vars1,Vars2) :- + getvalues(Variable1,Variable2,Value1,Value2,Vars1), + + %%getvalue(Value1,Value1A,Vars1), + %%isvalstr(Value1), + %%isvalstr(Value1A), + expression(Value1), + expression(Value2), + val1emptyorvalsequal(Value1,Value2), + %%isval(Value2), + putvalue(Variable1,Value2,Vars1,Vars2), + (debug(on)->(writeln([call,[[n,is],[variable,Value2]],"Press c."]),(not(get_single_char(97))->true;abort));true), + (debug(on)->(writeln([exit,[[n,is],[Variable1,Value2]],"Press c."]),(not(get_single_char(97))->true;abort));true). + + interpretpart(bracket1,Variable1,Variable2,Vars1,Vars2) :- + getvalues(Variable1,Variable2,Value1,Value2,Vars1), + Value1A = [Value2], + val1emptyorvalsequal(Value1,Value1A), + %%val1emptyorvalsequal(Value1A,Value2), + putvalue(Variable1,Value1A,Vars1,Vars2), + (debug(on)->(writeln([call,[[n,wrap],[variable,[Value2]]],"Press c."]),(not(get_single_char(97))->true;abort));true), + (debug(on)->(writeln([exit,[[n,wrap],[Variable2,[Value2]]],"Press c."]),(not(get_single_char(97))->true;abort));true). + + interpretpart(stringtonumber,Variable2,Variable1,Vars1,Vars2) :- + getvalues(Variable1,Variable2,Value1,Value2,Vars1), + %%Value1A = [Value2], + (Value2A=""->Value1=""; + number_string(Value1,Value2A)), + val1emptyorvalsequal(Value2,Value2A), + %%val1emptyorvalsequal(Value1A,Value2), + putvalue(Variable2,Value2A,Vars1,Vars2), + (debug(on)->(writeln([call,[[n,stringtonumber],[Value1,Value2A]],"Press c."]),(not(get_single_char(97))->true;abort));true), + (debug(on)->(writeln([exit,[[n,stringtonumber],[Value1,Value2A]],"Press c."]),(not(get_single_char(97))->true;abort));true). 
+ +interpretpart(bracket2,Variable1,Variable2,Vars1,Vars2) :- + getvalues(Variable1,Variable2,Value1,Value2,Vars1), + Value1A = Value2, + val1emptyorvalsequal(Value1,Value1A), + %%val1emptyorvalsequal(Value2A,Value1), + putvalue(Variable1,Value1A,Vars1,Vars2), + (debug(on)->(writeln([call,[[n,unwrap],[[variable],[Value2]]],"Press c."]),(not(get_single_char(97))->true;abort));true), + (debug(on)->(writeln([exit,[[n,unwrap],[[Value2],[Value2]]],"Press c."]),(not(get_single_char(97))->true;abort));true). + +interpretpart(head,Variable1,Variable2,Vars1,Vars2) :- + getvalues(Variable1,Variable2,Value1,Value2,Vars1), + Value1=[Value1A|_Rest], + val1emptyorvalsequal(Value2,Value1A), + putvalue(Variable2,Value1A,Vars1,Vars2), + (debug(on)->(writeln([call,[[n,head],[Value1,variable]],"Press c."]),(not(get_single_char(97))->true;abort));true), + (debug(on)->(writeln([exit,[[n,head],[Value1,Value1A]],"Press c."]),(not(get_single_char(97))->true;abort));true), !. + +interpretpart(tail,Variable1,Variable2,Vars1,Vars2) :- + getvalues(Variable1,Variable2,Value1,Value2,Vars1), + Value1=[_Head|Value1A], + %%removebrackets(Value1A,Value1B), + val1emptyorvalsequal(Value2,Value1A), + putvalue(Variable2,Value1A,Vars1,Vars2), + (debug(on)->(writeln([call,[[n,tail],[Value1,variable]],"Press c."]),(not(get_single_char(97))->true;abort));true), + (debug(on)->(writeln([exit,[[n,tail],[Value1,Value1A]],"Press c."]),(not(get_single_char(97))->true;abort));true). 
+ +interpretpart(member,Variable1,Variable2,Vars1,Vars2) :- + getvalues(Variable1,Variable2,Value1,Value2,Vars1), + (not(Value1=empty)->(member(Value1,Value2),Vars2=Vars1, + (debug(on)->(writeln([call,[[n,member],[Value1,Value2]],"Press c."]),(not(get_single_char(97))->true;abort));true), + (debug(on)->(writeln([exit,[[n,member],[Value1,Value2]],"Press c."]),(not(get_single_char(97))->true;abort));true)); + (member(Value3,Value2), + putvalue(Variable1,Value3,Vars1,Vars2), + (debug(on)->(writeln([call,[[n,member],[variable1,Value2]],"Press c."]),(not(get_single_char(97))->true;abort));true), + (debug(on)->(writeln([exit,[[n,member],[Value3,Value2]],"Press c."]),(not(get_single_char(97))->true;abort));true))). + +interpretpart(isplus,Variable1,Variable2,Variable3,Vars1,Vars2) :- + getvalues(Variable1,Variable2,Variable3,Value1,Value2,Value3,Vars1), + isvalempty(Value1), + isval(Value2), + isval(Value3), + Value1A is Value2 + Value3, + val1emptyorvalsequal(Value1,Value1A), + putvalue(Variable1,Value1A,Vars1,Vars2), + (debug(on)->(writeln([call,[[n,+],[Value2,Value3,variable]],"Press c."]),(not(get_single_char(97))->true;abort));true), + (debug(on)->(writeln([exit,[[n,+],[Value2,Value3,Value1A]],"Press c."]),(not(get_single_char(97))->true;abort));true). + +interpretpart(match,Variable1,Variable2,Variable3,Vars1,Vars2) :- + getvalues(Variable1,Variable2,Variable3,Value1,Value2,Value3,Vars1), + Value1 = [Value2A, Value3A], + val1emptyorvalsequal(Value2,Value2A), + val1emptyorvalsequal(Value3,Value3A), + putvalue(Variable2,Value2A,Vars1,Vars3), + putvalue(Variable3,Value3A,Vars3,Vars2), + (debug(on)->(writeln([call,[[n,=],[[Value2A, Value3A],[variable1,variable2]]],"Press c."]),(not(get_single_char(97))->true;abort));true), + (debug(on)->(writeln([exit,[[n,=],[[Value2A, Value3A],[Value2A, Value3A]],"Press c."]]),(not(get_single_char(97))->true;abort));true). 
+ +interpretpart(match,Variable1,Variable2,Variable3,Vars1,Vars2) :- + getvalues(Variable1,Variable2,Variable3,Value1,Value2,Value3,Vars1), + Value1A = [Value2, Value3], + val1emptyorvalsequal(Value1,Value1A), + putvalue(Variable1,Value1A,Vars1,Vars2), + (debug(on)->(writeln([call,[[n,=],[variable,[Value2,Value3]]],"Press c."]),(not(get_single_char(97))->true;abort));true), + (debug(on)->(writeln([exit,[[n,=],[Value2,Value3],[Value2,Value3]],"Press c."]),(not(get_single_char(97))->true;abort));true). + +interpretpart(delete,Variable1,Variable2,Variable3,Vars1,Vars2) :- + getvalues(Variable1,Variable2,Variable3,Value1,Value2,Value3,Vars1), + delete(Value1,Value2,Value3A), + val1emptyorvalsequal(Value3,Value3A), + putvalue(Variable3,Value3A,Vars1,Vars2), + (debug(on)->(writeln([call,[[n,delete],[Value1,Value2,variable3]],"Press c."]),(not(get_single_char(97))->true;abort));true), + (debug(on)->(writeln([exit,[[n,delete],[Value1,Value2,Value3A]],"Press c."]),(not(get_single_char(97))->true;abort));true). + + +interpretpart(append,Variable1,Variable2,Variable3,Vars1,Vars2) :- + getvalues(Variable1,Variable2,Variable3,Value1,Value2,Value3,Vars1), + append1(Value1,Value2,Value3A), + val1emptyorvalsequal(Value3,Value3A), + putvalue(Variable3,Value3A,Vars1,Vars2), + (debug(on)->(writeln([call,[[n,append],[Value1,Value2,variable3]],"Press c."]),(not(get_single_char(97))->true;abort));true), + (debug(on)->(writeln([exit,[[n,append],[Value1,Value2,Value3A]],"Press c."]),(not(get_single_char(97))->true;abort));true). 
+ + +interpretpart(stringconcat,Terminal,Phrase2,Phrase1,Vars1,Vars2) :- + %%Variables1=[Terminal,Phrase1,Phrase2], %% terminal can be v or "a" + getvalues2([Terminal,Phrase1,Phrase2], + [],[TerminalValue1,Phrase1Value1,Phrase2Value1],Vars1,[],[Flag1,Flag2,_Flag3]), %% prolog vars, list of vars, [v]=[prolog var] + %%delete(Value1,Value2,Value3A), + (Terminal=[_Value]->TerminalValue2=[TerminalValue1];TerminalValue2=TerminalValue1), + +(Terminal=""->(TerminalValue2="", + +string_concat(TerminalValue2,Phrase2Value1,Phrase1Value1))->true; + ((var(TerminalValue2)->(string_concat(TerminalValue2,Phrase2Value1,Phrase1Value1)),string_length(TerminalValue2,1));string_concat(TerminalValue2,Phrase2Value1,Phrase1Value1))), + + putvalue(Terminal,TerminalValue2,Vars1,Vars3), + putvalue(Phrase2,Phrase2Value1,Vars3,Vars4), + putvalue(Phrase1,Phrase1Value1,Vars4,Vars2), + (Flag1=true->TerminalValue3=variable1;TerminalValue3=TerminalValue1), + (Flag2=true->Phrase1Value3=variable2;Phrase1Value3=Phrase1Value1), + (debug(on)->(writeln([call,[[n,stringconcat],[TerminalValue3,Phrase1Value3,Phrase2]],"Press c."]),(not(get_single_char(97))->true;abort));true), + (debug(on)->(writeln([exit,[[n,stringconcat],[TerminalValue1,Phrase1Value1,Phrase2Value1]],"Press c."]),(not(get_single_char(97))->true;abort));true),!. 
+ + + +interpretpart(grammar_part,Variables1,Vars1,Vars2) :- + Variables1=[Terminal,Phrase1,Phrase2], %% terminal can be v or "a" + %%terminal(Terminal), + getvalues2([Terminal,Phrase1,Phrase2], + [],[TerminalValue1,Phrase1Value1,Phrase2Value1],Vars1,[],[Flag1,Flag2,_Flag3]), %% prolog vars, list of vars, [v]=[prolog var] + %%delete(Value1,Value2,Value3A), + (Terminal=[_Value]->TerminalValue2=[TerminalValue1];TerminalValue2=TerminalValue1), + + + +((string(Phrase1Value1)->Phrase1Value1=Phrase1Value11;(number(Phrase1Value1)->number_string(Phrase1Value1,Phrase1Value11);Phrase1Value1=Phrase1Value11)), + +(Terminal=""->TerminalValue2="";true), + +(((var(TerminalValue2)->(string_concat(TerminalValue2,Phrase2Value1,Phrase1Value11)),string_length(TerminalValue2,1));string_concat(TerminalValue2,Phrase2Value1,Phrase1Value11))->true; + +string_concat(TerminalValue2,Phrase2Value1,Phrase1Value11))->true; + + + +((Phrase1Value1=[_ItemA|_ItemsA]),(Terminal=[]->(TerminalValue2=[], + +((var(TerminalValue2)->length(TerminalValue2,1);true),(append(TerminalValue2,Phrase2Value1,Phrase1Value1))))->true; + +(append(TerminalValue2,Phrase2Value1,Phrase1Value1)->true)))), + + putvalue(Terminal,TerminalValue2,Vars1,Vars3), + putvalue(Phrase2,Phrase2Value1,Vars3,Vars4), + putvalue(Phrase1,Phrase1Value1,Vars4,Vars2), + (Flag1=true->TerminalValue3=variable1;TerminalValue3=TerminalValue1), + (Flag2=true->Phrase1Value3=variable2;Phrase1Value3=Phrase1Value1), + (debug(on)->(writeln([call,[[n,grammar_part],[TerminalValue3,Phrase1Value3,Phrase2]],"Press c."]),(not(get_single_char(97))->true;abort));true), + (debug(on)->(writeln([exit,[[n,grammar_part],[TerminalValue1,Phrase1Value1,Phrase2Value1]],"Press c."]),(not(get_single_char(97))->true;abort));true),!. + +getvalues(Variable1,Variable2,Value1,Value2,Vars) :- + getvalue(Variable1,Value1,Vars), + getvalue(Variable2,Value2,Vars). 
+getvalues(Variable1,Variable2,Variable3,Value1,Value2,Value3,Vars) :- + getvalue(Variable1,Value1,Vars), + getvalue(Variable2,Value2,Vars), + getvalue(Variable3,Value3,Vars). +val1emptyorvalsequal(empty,_Value) :- !. +val1emptyorvalsequal(Value,Value) :- + not(Value=empty). +isop(is). +isop(=). +stringconcat1([],Item,Item) :- + !. +stringconcat1(Item11,Item21,Item31) :- + +replace_empty_with_empty_set( [Item11,Item21,Item31],[],[Item1,Item2,Item3]), +maplist(expressionnotatom,[Item1,Item2,Item3]), + string_concat(Item1,Item2,Item3),!. + +append1([],Item,Item) :- + !. +append1(Item11,Item21,Item31) :- + +replace_empty_with_empty_set( [Item11,Item21,Item31],[],[Item1,Item2,Item3]), +maplist(expressionnotatom,[Item1,Item2,Item3]), +/**((isvalstr(Item1),Item1A=[Item1]);(not(isvalstr(Item1)),Item1A=Item1)), + ((isvalstr(Item2),Item2A=[Item2]);(not(isvalstr(Item2)),Item2A=Item2)), + %%((isvalstr(Item3),Item3A=[Item3]);(not(isvalstr(Item3)),Item3A=Item3)), + **/ + append(Item1,Item2,Item3),!. +/**delete1(Item1,Item2,Item3) :- + ((isvalstr(Item1),Item1A=[Item1]);(not(isvalstr(Item1)),Item1A=Item1)), + ((isvalstr(Item2),Item2A=[Item2]);(not(isvalstr(Item2)),Item2A=Item2)), + %%((isvalstr(Item3),Item3A=[Item3]);(not(isvalstr(Item3)),Item3A=Item3)), + delete(Item1A,Item2A,Item3). +**/ +replace_empty_with_empty_set([],A,A) :-!. +replace_empty_with_empty_set(A,B,C) :- + A=[Item1|Items], + (var(Item1)->Item2=Item1;(Item1=empty->Item2=[];Item2=Item1)), + append(B,[Item2],D), + replace_empty_with_empty_set(Items,D,C),!. +removebrackets([[Value]],Value) :-!. +removebrackets(Value,Value). 
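The list-handling built-ins above route through `append1/3` and `replace_empty_with_empty_set/3`, which normalise the interpreter's `empty` marker to `[]` and check each argument with `expressionnotatom/1` before delegating to the standard `append/3`. A minimal sketch of that behaviour (hypothetical `demo_append1` wrapper, assuming this file is loaded):

```prolog
% Hypothetical usage sketch, assuming append1/3 and
% replace_empty_with_empty_set/3 above are loaded.
demo_append1 :-
    % ordinary list append via the interpreter's wrapper
    append1([1,2],[3],Xs),
    writeln(Xs),                          % [1,2,3]
    % the empty marker is rewritten to [] before appending
    replace_empty_with_empty_set([empty,[a]],[],Ys),
    writeln(Ys).                          % [[],[a]]
```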
\ No newline at end of file diff --git a/Text-to-Breasonings-master/lucian_s_pedagogy.txt b/Text-to-Breasonings-master/lucian_s_pedagogy.txt new file mode 100644 index 0000000000000000000000000000000000000000..b48666cf3273ba26b07d7bcb7b81db870e11397a --- /dev/null +++ b/Text-to-Breasonings-master/lucian_s_pedagogy.txt @@ -0,0 +1,331 @@ +Lucian’s Pedagogy +Writing Sentences By Making Distinctions Between Objects (The Pedagogical Ways of Thinking) + +The first aim of this website is to help students write sentences by making distinctions between objects. + +I recommend you read the following pages: + +1. Details - As an exercise, think of two uses, a future use and two types for each object. +2. Breasoning - Think of the x, y, z dimensions and colour of each object. +3. Rebreasoning - Think of the fact that the person and the object in a sentence are connected by a verb (an action) that means they touch. +4. Breathsoning - Think of a reason that the object is good. +5. Rebreathsoning - Think of a reason that the verb (action) is done well. +6. Space - Think of spatial characteristics of the person in relation to the object. +7. Time - Think of temporal characteristics of the person in relation to the object. +8. God Algorithm - Initialise your ability to hone breasonings for H1s. +9. Professor Algorithm - Use your ability to hone breasonings for H1s. +Ideas +The following ideas should be represented in high quality work. + +1. Two uses +a->b, a->c +Include two uses or ideas about an idea from your essay. For example, John (0.5m, 0.3m, 1.8m) ate an apple (0.6m, 0.6m, 0.6m) has the two uses John (see previous dimensions) planted the seed (0.005m, 0.0025m, 0.0025m) and John stepped over the line (0.5m, 0.01m, 0.0001m). + +2. Future +a->b +Think about how one of your ideas relates to the future. For example, a student should learn to read a sign (0.3m, 0.6m, 2m) so he can step over a crack (0.5m, 0.01m, 0.005m). + +3. 
Two types +a->b, c->b +Think about how two objects are co-ordinated in relation to each other. For example, Luke ate the wheat biscuit (0.1m, 0.05m, 0.01m), which was softened by milk, which was constituted by the wheat biscuit (0.1m, 0.05m, 0.01m) and the glass of milk (0.05m, 0.05m, 0.15m). + + +Breasoning +Pray for the breasoning, then think of the x, y and z dimensions, and the colour of each object (their "breasonings"). + + + +When you think of the object's properties, you will think of the object's specific name. + +See also Professor Algorithm - Use your ability to hone breasonings for H1s. + +Creating original philosophy with help only from God: Also, to visualise imagery of a quality necessary to God, think of objects that appear normally, as part of an argument, inside a square. There are sub squares for the verb for the object, uses and algorithmic objects for use with the object. + +Be God: In addition, this square is necessary to think of with you as God. + + + +Rebreasoning +There should be a chain of breasoned objects interacting to ensure the flow of the essay or story. Think of the way two objects touch when they are mentioned together. +a. Think of the fact the two objects touch. +b. Do either of the objects change size? +c. Do either of the objects change colour? +These are known as the "rebreasoning" rules. + + +where apple = 6x6x6 cm + +When you think of the action's properties, you will think of the action's specific name. + + +Breathsoning +Think of an adjective (a "breathsoning"), which is a human judgement for each noun. For example, describe an apple as "delicious". + + + + + +where apple = 6x6x6 cm + +Rebreathsoning +Think of an adverb (called a "rebreathsoning"), which is a human judgement for each verb. For example, describe "John ate the apple" (6x6x6 cm) with "ravenously". + + + + + +Space +a. Which room is the proposition set in? + + where dining room = 10x5x5 m + +b. Which part of the room is it set in? 
+
+ where chair = 0.5x0.5x0.5 m
+
+
+
+c. Which direction is the person facing?
+
+
+
+where dinner plate = 20x20x5 cm
+
+
+Time
+a. What happened before the event?
+
+
+where apple = 6x6x6 cm, shopping bag = 30x10x40 cm.
+b. How long will the event take?
+
+
+where apple = 6x6x6 cm
+c. What will happen after the event?
+
+
+
+where apple core = 1x1x6 cm, bin = 20x20x20 cm
+
+God Algorithm
+
+To breason out an A, which can be an 80%, 90% or 100% for 85, 130 or 190 breasonings respectively:
+1. Spiritually "turn off" the following, meaning it is protected.
+2. Spiritually "play" 3 previously breasoned out As (done using Lucian Green's Anarchy Quiz – see Appendix 1) to Cosmology (meaning that it is protected).
+3. Breason out the object by thinking of its x, y and z dimensions.
+4. Spiritually place a Lucian doll on the left side of a stage with you behind it, looking at the object.
+5. Move Lucian to the right, to a mother doll, to forget the object.
+6. Imagine the object is replaced with a counter. Move Lucian to the left, to recognise that it is divine, and spiritually listen to him saying a three line prayer.
+7. To help the counter and its seen-as version move forward in a straight line, say "It is a line", then say "It is the same", then repeat the three prayer points, "I love you", "I love you dearly" and "I love you forever". This step "dots on" the breasoning in a "high quality way".
+
+Professor Algorithm
+
+The "Professor Algorithm" is different from the "God Algorithm" even though it is an alternative to it because it allows working out the way of thinking (including the verb), not just the object the verb is performed on.
+
+1a. "Find out" an A (agreeing) way of thinking after thinking of a "blue eater that moves in the x direction" (that comes from the most general perspective Darwinism applied to Pedagogy, i.e. one moves in the x direction to eat).
+
+The way to use this is:
+
+
+- The first step in “dotting” anything on requires thinking of a professor thinking of a pathway for a breasoning (an action on an object) in relation to that object, where a certain number of breasonings form part of an A-grade argument, by moving a cotton swab along a pathway from inside a clean, dry test tube.
+- Pretend the cotton part of the test tube system is the object to be breasoned out.
+- Breason out the object (visualise it as being measured by the dimensions x m, y m, z m).
+- As part of visualising the breasoning, pretend to remove the blue dots from the ground below the object to make sure the object is “pure” by “asking” that the dots should be removed, then checking that this is done.
+- This removal of impurities can also be done while pretending to untouch a foam blue strip from another strip, meaning the dots are “lifted” away.
+- Take the object “off” its “on” status by repeating the untouching step, this time to remove it from view.
+
+Appendix 1
+
+Anarchy Quiz
+
+Earn A in 85 questions. Use the A you earn to earn A in an assignment (search for Lucian Green breasoning list or Lucian Green BR in Google for more breasoning lists), to have a healthy baby or use 50 As to earn a job.
+You still have to do the work, and there are no guarantees.
+
+For these questions, do not check any previous answers before answering, and do not submit the same set of answers twice or more for different purposes.
+
+ 1. What is the reason you would like to complete this quiz?
+
+This seen-as essay will be breasoned out by you (you will be asked for the x, y and z dimensions of each object from a short sentence about the army, originally the argument for my essay on anarchy, which I wrote a pop-song about).
+
+NB: Before breasoning out each of the following objects (thinking of their x, y and z dimensions) pray for the breasoning.
+
+EXPOSITION:
+
+1.
What are the x, y and z dimensions in metres of "cup" in "The soldier cleaned himself with a cup of water"? + +2. What are the x, y and z dimensions in metres of "bottle" in "The soldier drank a bottle of apple juice"? + +3. What are the x, y and z dimensions in metres of "apple" in "The soldier put an apple in his lunch-box"? + +4. What are the x, y and z dimensions in metres of "periscope" in "The soldier looked at the stand through the periscope"? + +5. What are the x, y and z dimensions in metres of "pear" in "The soldier ate a pear with the friend he found on the Internet"? + +CRITIQUE: + +6. What are the x, y and z dimensions in metres of "brick" in "The soldier stood on a brick, which was like a wall"? + +7. What are the x, y and z dimensions in metres of "log" in "The soldier stepped over the log"? + +8. What are the x, y and z dimensions in metres of "flag" in "The soldier found a flag"? + +9. What are the x, y and z dimensions in metres of "bun" in "The soldier ate a bun, which he had bartered for"? + +10. What are the x, y and z dimensions in metres of "plant" in "The soldier watered the plant, after moving it"? + +DETAILED REASONING: + +11. What are the x, y and z dimensions in metres of "tofu" in "The soldier ate tofu"? + +12. What are the x, y and z dimensions in metres of "garbage bag" in "The soldier moved the garbage bag"? + +13. What are the x, y and z dimensions in metres of "nuts" in "The soldier chewed nuts at the theatre"? + +14. What are the x, y and z dimensions in metres of "processed cheese" in "The soldier bit processed cheese"? + +15. What are the x, y and z dimensions in metres of "cup" in "The soldier swallowed a cup of grape juice"? + +16. What are the x, y and z dimensions in metres of "roll" in "The soldier ate a roll with vegan cheese"? + +17. What are the x, y and z dimensions in metres of "peanut" in "The soldier nipped a peanut"? + +18. 
What are the x, y and z dimensions in metres of "nectarine" in "The soldier munched a nectarine"? + +19. What are the x, y and z dimensions in metres of "sugar" in "The soldier packed sugar in his bag"? + +20. What are the x, y and z dimensions in metres of "ball" in "The soldier threw a ball in the air"? + +21. What are the x, y and z dimensions in metres of "banana" in "The soldier peeled the banana"? + +22. What are the x, y and z dimensions in metres of "orange" in "The soldier squeezed the orange"? + +23. What are the x, y and z dimensions in metres of "mandarin" in "The soldier removed a segment from a mandarin"? + +24. What are the x, y and z dimensions in metres of "bra" in "The soldier made a bra"? + +25. What are the x, y and z dimensions in metres of "stand" in "The soldier jumped to touch the top of the stand"? + +26. What are the x, y and z dimensions in metres of "ring" in "The soldier wore a ring"? + +27. What are the x, y and z dimensions in metres of "watering container" in "The soldier watered the apricot tree with the watering container"? + +28. What are the x, y and z dimensions in metres of "base" in "The soldier placed the base on the flat ground"? + +29. What are the x, y and z dimensions in metres of "abdominal muscle exerciser" in "The soldier exercised his abdominal muscles with the abdominal muscle exerciser"? + +30. What are the x, y and z dimensions in metres of "flask" in "The soldier gargled water from his flask"? + +31. What are the x, y and z dimensions in metres of "dried fig" in "The soldier chewed the dried fig"? + +32. What are the x, y and z dimensions in metres of "shorts" in "The soldier ran on the spot in shorts"? + +33. What are the x, y and z dimensions in metres of "two sticks" in "The soldier jumped over two sticks"? + +34. What are the x, y and z dimensions in metres of "hoop" in "The soldier swung the hoop around his waist"? + +35. 
+What are the x, y and z dimensions in metres of "glass" in "The soldier drank a glass of water"?
+
+MIND MAP:
+
+36. What are the x, y and z dimensions in metres of "bible" in "The army chaplain distributed the bibles"?
+
+37. What are the x, y and z dimensions in metres of "wood" in "The soldier cut the wood into skirting boards"?
+
+38. What are the x, y and z dimensions in metres of "stretcher" in "The soldier lay down on the stretcher"?
+
+39. What are the x, y and z dimensions in metres of "locker" in "The soldier found the correct locker"?
+
+40. What are the x, y and z dimensions in metres of "carrot" in the sentence "The soldier ate the carrot to check it was healthy"?
+
+41. What are the x, y and z dimensions in metres of "seat" in the sentence "The soldier sat on the seat to check it was stable"?
+
+42. What are the x, y and z dimensions in metres of "salt" in the sentence "The soldier salted the onion"?
+
+43. What are the x, y and z dimensions in metres of "name badge" in the sentence "The soldier found his name badge"?
+
+44. What are the x, y and z dimensions in metres of "drum" in the sentence "The soldier beat a regular rhythm on the drum"?
+
+45. What are the x, y and z dimensions in metres of "bow" in the sentence "The soldier stayed balanced when aiming the arrow at the target with the bow"?
+
+46. What are the x, y and z dimensions in metres of "crotchet" in the sentence "The soldier moved the crotchet forward one beat of the musical bar"?
+
+47. What are the x, y and z dimensions in metres of "money" in the sentence "The soldier labelled the money to take home"?
+
+48. What are the x, y and z dimensions in metres of "celery" in the sentence "The soldier fed the kangaroo the celery"?
+
+49. What are the x, y and z dimensions in metres of "balloon" in the sentence "The soldier blew up a balloon"?
+
+50. What are the x, y and z dimensions in metres of "corn" in the sentence "The soldier ate the corn"?
+
+51.
+What are the x, y and z dimensions in metres of "towel" in the sentence "The soldier towelled himself dry"?
+
+52. What are the x, y and z dimensions in metres of "playing card" in the sentence "The soldier placed the playing card on the table"?
+
+53. What are the x, y and z dimensions in metres of "tea cup" in the sentence "The soldier drank tea from the tea cup"?
+
+54. What are the x, y and z dimensions in metres of "stick" in the sentence "The soldier moved the stick off the road"?
+
+55. What are the x, y and z dimensions in metres of "label" in the sentence "The soldier read the label"?
+
+56. What are the x, y and z dimensions in metres of "pole" in the sentence "The soldier stood straight against the pole"?
+
+57. What are the x, y and z dimensions in metres of "pupil" in the sentence "The soldier looked at the girl's pupil"?
+
+58. What are the x, y and z dimensions in metres of "face" in the sentence "The soldier looked at the boy's face"?
+
+59. What are the x, y and z dimensions in metres of "number" in the sentence "The soldier wrote the table number on the vegemite"?
+
+60. What are the x, y and z dimensions in metres of "ladder" in the sentence "The soldier liked his friend because his friend was able to climb a ladder"?
+
+61. What are the x, y and z dimensions in metres of "tea towel" in the sentence "The soldier folded the tea towel until it was hand sized"?
+
+62. What are the x, y and z dimensions in metres of "cow" in the sentence "The soldier counted the number of times the cow mooed"?
+
+63. What are the x, y and z dimensions in metres of "glasses" in the sentence "The soldier opened a shape book at the page for science and looked at the illustration of the glasses"?
+
+64. What are the x, y and z dimensions in metres of "mitochondrion" in the sentence "The soldier examined the model mitochondrion"?
+
+65. What are the x, y and z dimensions in metres of "candle" in the sentence "The soldier lit a candle at church"?
+
+66.
What are the x, y and z dimensions in metres of "birth schedule" in the sentence "The soldier displayed the birth schedule"? + +67. What are the x, y and z dimensions in metres of "staff timetable" in the sentence "The soldier wrote the staff timetable"? + +68. What are the x, y and z dimensions in metres of "room timetable" in the sentence "The soldier read the room timetable"? + +69. What are the x, y and z dimensions in metres of "couch" in the sentence "The soldier sat on the couch"? + +70. What are the x, y and z dimensions in metres of "pencil case" in the sentence "The soldier put a pencil in the pencil case"? + +71. What are the x, y and z dimensions in metres of "hole puncher" in the sentence "The soldier punched holes in the paper with the hole puncher"? + +72. What are the x, y and z dimensions in metres of "water container" in the sentence "The soldier took the water container with him for his hike"? + +73. What are the x, y and z dimensions in metres of "pen" in the sentence "The soldier returned his friend's pen to his friend"? + +74. What are the x, y and z dimensions in metres of "two pieces of paper" in the sentence "The soldier wrote what he will do and what he said he will do on the two pieces of paper, respectively"? + +75. What are the x, y and z dimensions in metres of "piece of paper" in the sentence "The soldier wrote five reasons to do a job in his company on a piece of paper"? + +76. What are the x, y and z dimensions in metres of "tambourine" in the sentence "The soldier made a model of a tambourine"? + +77. What are the x, y and z dimensions in metres of "friend" in the sentence "The soldier found the friend of his who mentioned the same key phrase as him"? + +78. What are the x, y and z dimensions in metres of "three friends" in the sentence "The soldier found two other friends, each of whom said the same key phrase as another one in the group"? + +79. 
What are the x, y and z dimensions in metres of "objects" in the sentence "The soldier found objects representing what he and his three friends had said they needed together"? + +80. What are the x, y and z dimensions in metres of "the library" in the sentence "The soldier checked the random place: the library"? + +81. What are the x, y and z dimensions in metres of "the conference room" in the sentence "The soldier checked the conference room, in the library"? + +82. What are the x, y and z dimensions in metres of "the encyclopaedia" in the sentence "The soldier read an article in the encyclopaedia to read widely"? + +83. What are the x, y and z dimensions in metres of "the blank book" in the sentence "The soldier wrote his own encyclopaedia article in the blank book"? + +84. What are the x, y and z dimensions in metres of "wheat biscuits" in the sentence "The soldier moistened wheat biscuits after he answered a girl friend's call"? + +85. What are the x, y and z dimensions in metres of "glove" in the sentence "The soldier inserted his hand in his glove"? + + + diff --git a/Text-to-Breasonings-master/luciansphilosophy.txt b/Text-to-Breasonings-master/luciansphilosophy.txt new file mode 100644 index 0000000000000000000000000000000000000000..979ed2e5b03e77a8fa7fbd132b184f0cf35b4e08 --- /dev/null +++ b/Text-to-Breasonings-master/luciansphilosophy.txt @@ -0,0 +1 @@ + \ No newline at end of file diff --git a/Text-to-Breasonings-master/medicinenoreplace.pl b/Text-to-Breasonings-master/medicinenoreplace.pl new file mode 100644 index 0000000000000000000000000000000000000000..04905a28422792baed50df4c7fae891f67c88026 --- /dev/null +++ b/Text-to-Breasonings-master/medicinenoreplace.pl @@ -0,0 +1,13 @@ +%% Medicine.pl + +medicine :- + File="I love you, I love you, I love you. Love.", + doctors(Doctors), + length(Doctors,DL), + texttobr2(DL,u,File,u),!. 
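+%% Editor's sketch (hedged, added comment only): medicine/0 above breasons out
+%% the mantra text once per listed doctor. With the two placeholder entries
+%% below, length(Doctors,DL) binds DL = 2, so the call made is
+%% texttobr2(2,u,File,u). Replace the placeholder rows with real entries so
+%% that DL matches the intended number of doctors.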
+doctors([ +%% Don't include meditators here +[doctor1firstname,surname,dobd,dobm,doby,started_day,started_month,started_year], +[doctor2firstname,surname,dobd,dobm,doby,started_day,started_month,started_year] +%% number of meditators + number of doctors= e.g. 21 +]). diff --git a/Text-to-Breasonings-master/meditation.pl b/Text-to-Breasonings-master/meditation.pl new file mode 100644 index 0000000000000000000000000000000000000000..82fdabf95cb683f054e1fab166278cc68f97a7cc --- /dev/null +++ b/Text-to-Breasonings-master/meditation.pl @@ -0,0 +1,49 @@ +%% Meditation.pl + +%% lucian, green, friendliness, medicine, sutras, courses, qi gong and yoga mantras and sutras - arem 10 br links +%% Stops linking in meditators 100 years after date of learning LM + +%%:- include('qb2.pl'). +:- include('../listprologinterpreter/listprolog'). +:- include('texttobr'). +%%:- include('texttobr2qb'). +%%:- include('texttobrqb'). +:- include('../listprologinterpreter/la_strings'). +:- include('mergetexttobrdict'). +:- include('edit.pl'). + +meditation :- + File="I love you, I love you, I love you. 
Arem.", + Utterances=[lucianicmeditationapps,dailyregimen,nobodyproblemsapp,lucian, green, friendliness, medicine, childrenh1earningjobsprotectioninjobs,headsofstate,lucianmantrapureform,lucianmantrasunsafety,maharishisutra,meditationteachersutra,movingappearances,purusha,upasanasutra,yellowgod,greensutra,bluenature,appearances,pranayama,soma,hoursprayer,fiftybreasoningspersecond,meditationindicatorlowerriskofcancerandotherdiseasesinworkersandbroadcasters,meditationindicatordecreasedstress,meditationindicatorincreasedbloodflow,meditationincreasedbrainpotential,autobiography,computationalenglish,computerscience,economics,hermeneutics,pedagogy,breasonings,quantumbox,lucianicmeditation,lm,meditation,metaphysics,music,plays,popology,theology, qigongmantra,qigongsutra, yogamantra,yogasutra], + %% initiate_utterances(File,Utterances), + %% currentDate(Today), + %% Today=date(Year,Month,Day), + meditators(Meditators), + length(Utterances,UL), + length(Meditators,ML), + Length2 is UL+ML, + texttobr2(Length2,u,File,u),texttobr(Length2,u,File,u),!. + + %% protect(File,Year,Month,Day,Meditators,[],Old), + %% writeln([oldmeditators,Old]),!. +initiate_utterances(_File,[]) :- !. +initiate_utterances(File,[_Utterances|Utterances]) :- + texttobr2(File),texttobr(File), + initiate_utterances(File,Utterances). +protect(_File,_Year,_Month,_Day,[],Old1,Old1) :- !. +protect(File,Year1,_Month1,_Day1,[Meditator|Meditators],Old1,Old2) :- + Meditator=[_FirstName,_LastName,_DayDOB,_MonthDOB,_YearDOB,_DayLearned,_MonthLearned,YearLearned], + Year2 is YearLearned+101, + (Year2=Year1->append(Old1,[Meditator],Old3);Old3=Old1), + texttobr2(File),texttobr(File), %% Sender + texttobr2(File),texttobr(File), %% Recipient + + protect(File,Year1,_Month,_Day,Meditators,Old3,Old2). +currentDate(Today) :- + get_time(Stamp), + stamp_date_time(Stamp,DateTime,local), + date_time_value(date,DateTime,Today). 
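+%% Editor's sketch (hedged, added comment only): meditation/0 above runs
+%% texttobr2/4 and texttobr/4 with Length2 = number of utterances + number of
+%% meditators, e.g.
+%% ?- length([a,b,c],UL), length([m],ML), Length2 is UL+ML.
+%% Length2 = 4.
+%% So each placeholder row added to meditators/1 below increases the
+%% breasoning count by one.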
+meditators([ +[first,surname,dob_d,dob_m,dob_y,day_of_learning_d,day_of_learning_m,day_of_learning_y] +]). + diff --git a/Text-to-Breasonings-master/meditation1.pl b/Text-to-Breasonings-master/meditation1.pl new file mode 100644 index 0000000000000000000000000000000000000000..68fffc18f507256f265fad62b6d74cf7d0a02a01 --- /dev/null +++ b/Text-to-Breasonings-master/meditation1.pl @@ -0,0 +1,12 @@ +#!SWI-Prolog -f -q + +:- include('meditation.pl'). + +:- initialization prolog_edit:main. + +main :- + time((texttobr2(u,u,u,u),texttobr(u,u,u,u))),time((prolog_edit:mergetexttobrdict)), + halt(0) . +main :- + halt(1). + \ No newline at end of file diff --git a/Text-to-Breasonings-master/meditationnoreplace.pl b/Text-to-Breasonings-master/meditationnoreplace.pl new file mode 100644 index 0000000000000000000000000000000000000000..879122e521a45475d78ed035a5a2abdb44612185 --- /dev/null +++ b/Text-to-Breasonings-master/meditationnoreplace.pl @@ -0,0 +1,2 @@ +:-include('meditationnoreplace2.pl'). +:-include('meditatorsanddoctors'). diff --git a/Text-to-Breasonings-master/meditationnoreplace2.pl b/Text-to-Breasonings-master/meditationnoreplace2.pl new file mode 100644 index 0000000000000000000000000000000000000000..77e9c532c093ffda76d01ca02061186293b0eb53 --- /dev/null +++ b/Text-to-Breasonings-master/meditationnoreplace2.pl @@ -0,0 +1,566 @@ +%% Meditation.pl + +%% lucian, green, friendliness, medicine, sutras, courses, qi gong and yoga mantras and sutras - arem 10 br links +%% Stops linking in meditators 100 years after date of learning LM + +%%:- include('qb2.pl'). +:- include('../listprologinterpreter/listprolog'). +:- include('texttobr'). +:- include('texttobr2qb'). +%%:- include('texttobrqb'). +:- include('../listprologinterpreter/la_strings'). +%:- include('mergetexttobrdict'). +%:- include('edit.pl'). 
+%% Meditation.pl + +%% 14 5 19 - 250 for dot, point or utterance instance + +meditation :- + +meditation1(Utterances1,Utterances2,Immortality), +meditation2(Utterances1,Utterances2,Immortality). + +meditation1(Utterances1,Utterances2,Immortality) :- + + Utterances1=[%%dailyregimen and noheadaches, + time_travel,mind_reading,medicine,meditation,fish_oil, + + tt_loc_one_to_two,tt_loc_two_to_three,tt_loc_three_to_one, +a_for_love_head_of_state,b_for_love_head_of_state,b_to_b_for_love_head_of_state, +a_for_immortality,b_for_immortality,b_to_b_for_immortality, +a_for_body_replacement,b_for_body_replacement,b_to_b_for_body_replacement, +a_for_immortality_medicine_for_thinking,b_for_immortality_medicine_for_thinking,b_to_b_for_immortality_medicine_for_thinking, +a_for_immortality_medicine_for_stopping_dementia,b_for_immortality_medicine_for_stopping_dementia,b_to_b_for_immortality_medicine_for_stopping_dementia, +a_for_immortality_medicine_for_seeing_clearly,b_for_immortality_medicine_for_seeing_clearly,b_to_b_for_immortality_medicine_for_seeing_clearly, +a_for_immortality_medicine_for_muscle_relaxation,b_for_immortality_medicine_for_muscle_relaxation,b_to_b_for_immortality_medicine_for_muscle_relaxation, +a_for_immortality_medicine_for_antidepressant,b_for_immortality_medicine_for_antidepressant,b_to_b_for_immortality_medicine_for_antidepressant, +a_for_immortality_medicine_for_antipsychotic,b_for_immortality_medicine_for_antipsychotic,b_to_b_for_immortality_medicine_for_antipsychotic, +a_for_other_medicines_for_the_body,b_for_other_medicines_for_the_body,b_to_b_for_other_medicines_for_the_body, +a_for_the_other_as,b_for_the_other_as,b_to_b_for_the_other_as, +a_for_thank_head_of_state,b_for_thank_head_of_state,b_to_b_for_thank_head_of_state, + + + %%/** + 
connect_what_i_say_and_the_question,the_adverse,knock_out_cancer_frequency,i_am_serious,rod_off,good_circulation,no_smells,knock_out_annoying_frequency,knock_out_nagging_frequency,knock_out_complaining_frequency,stop_pain_inflammation_help_blood_flow_relaxation_and_wellbeing_and_deeptissue_massage +,memory,learning,no_cancer,comments,i_do_not_mean_it,law_to_protect_product,krishna_rcawp_for_wanted_people,say_reason_to_them,wanted_otherwise_dismissed,graciously_give_as, royalty, royalty_about_sales, nothing_rather_than_something,to_lucian,to_god,to_lecturer,today,tomorrow,day_after_tomorrow,a,b,b_to_b,seen_as_version,high_quality_version,army_go,army_return, +%%** + no_digestive_system_pops_from_practising_the_sutra, +first_radio_button_put_on_recordings_put_through_with_pray_nut_and_bolt_quantum_box_prayer_for_protector_from_headache_in_meditation_after_honours_study, +second_radio_button_put_on_recordings_put_through_with_pray_nut_and_bolt_quantum_box_prayer_for_protector_from_headache_in_meditation_after_honours_study, +first_radio_button_put_on_recordings_put_through_with_pray_nut_and_bolt_quantum_box_prayer_for_no_headache, +second_radio_button_put_on_recordings_put_through_with_pray_nut_and_bolt_quantum_box_prayer_for_no_headache, +first_radio_button_put_on_recordings_put_through_with_pray_nut_and_bolt_quantum_box_prayer_for_turn_off_workload_from_all_employees_including_you_below_you, +second_radio_button_put_on_recordings_put_through_with_pray_nut_and_bolt_quantum_box_prayer_for_turn_off_workload_from_all_employees_including_you_below_you, +%%/** +lecturer_training, +recordings_training, +prepare_the_radio_button_accredited__at_the_time__turned_off_openbracket_to_make_accreditation_acceptable_closebracket_computer_science__breathsoned__vocational_education_and_training__sales__education_for_the_following, 
+seen_as_breasoning_out_text_before_texttobr__computer_housing_the_breasonings__70_breasonings_per_waking_hour_for_16_hours__stopping_at_midnight__no_medical_problems_from_it, +put_through_prepared_radio_button, +put_through_250_word_breasonings_from_job_medicine_to_start_the_spiritual_breasoning_process, +turn_off_mistakes, +turn_off_breasonings, +%%**/ +first_radio_button_put_on_recordings_put_through_with_pray_nut_and_bolt_quantum_box_prayer_for_detect_and_turn_off_workloads_using_manager_algorithm, +second_radio_button_put_on_recordings_put_through_with_pray_nut_and_bolt_quantum_box_prayer_for_detect_and_turn_off_workloads_using_manager_algorithm, +first_radio_button_put_on_recordings_put_through_with_pray_nut_and_bolt_quantum_box_prayer_for_detect_and_turn_off_workloads_using_manager_network, +second_radio_button_put_on_recordings_put_through_with_pray_nut_and_bolt_quantum_box_prayer_for_detect_and_turn_off_workloads_using_manager_network,first_radio_button_put_on_recordings_put_through_with_pray_nut_and_bolt_quantum_box_prayer_for_no_muscle_aches/pains, +second_radio_button_put_on_recordings_put_through_with_pray_nut_and_bolt_quantum_box_prayer_for_no_muscle_aches/pains +%%**/ +], + Utterances2=[ + medicine, + memory_receiver, memory_transmitter, use_memory_transmitter, + %%/** + fifty_As_per_organ, +quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_the_heart, +quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_heart_muscles, +quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_continually_circulating_blood_around_the_body, +quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_knowing_your_heart_is_working_because_you_can_feel_your_heart_beat, 
+quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_feeling_the_blood_at_your_pulse, +quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_pumping_blood_containing_new_oxygen_to_every_part_of_your_body, +quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_pumping_the_old_blood_without_oxygen_back_through_the_lungs_where_it_picks_up_new_oxygen_to_repeat_this_cycle, +quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_other_heart_function, +quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_the_heart_should_be_kept_going_and_not_have_blocks, +quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_heart_muscles_should_be_kept_going_and_not_have_blocks, +quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_continually_circulating_blood_around_the_body_should_be_kept_going_and_not_have_blocks, +quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_knowing_your_heart_is_working_because_you_can_feel_your_heart_beat_should_be_kept_going_and_not_have_blocks, +quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_feeling_the_blood_at_your_pulse_should_be_kept_going_and_not_have_blocks, +quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_pumping_blood_containing_new_oxygen_to_every_part_of_your_body_should_be_kept_going_and_not_have_blocks, 
+quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_pumping_the_old_blood_without_oxygen_back_through_the_lungs_where_it_picks_up_new_oxygen_to_repeat_this_cycle_should_be_kept_going_and_not_have_blocks, +quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_other_heart_medicine, +quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_the_lungs, +quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_breathing, +quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_filtering_oxygen_from_the_air_through_tiny_vessels_into_the_blood, +quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_carrying_oxygen_to_the_heart_to_be_pumped_round_your_body, +quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_filtering_carbon_dioxide_from_your_body_when_you_breathe_out, +quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_other_lungs_function, +quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_the_lungs_should_be_kept_going_and_not_have_blocks, +quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_breathing_should_be_kept_going_and_not_have_blocks, +quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_filtering_oxygen_from_the_air_through_tiny_vessels_into_the_blood_should_be_kept_going_and_not_have_blocks, 
+quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_carrying_oxygen_to_the_heart_to_be_pumped_round_your_body_should_be_kept_going_and_not_have_blocks, +quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_filtering_carbon_dioxide_from_your_body_when_you_breathe_out_should_be_kept_going_and_not_have_blocks, +quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_other_lungs_medicine, +quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_the_liver, +quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_filtering_the_blood, +quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_filtering_chemicals_and_impurities, +quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_including_from_drugs_and_medications, +quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_making_and_processes_many_body_fats_and_the_liver_regrowing, +quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_other_liver_function, +quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_the_liver_should_be_kept_going_and_not_have_blocks, +quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_filtering_the_blood_should_be_kept_going_and_not_have_blocks, +quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_filtering_chemicals_and_impurities_should_be_kept_going_and_not_have_blocks, 
+quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_including_from_drugs_and_medications_should_be_kept_going_and_not_have_blocks, +quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_making_and_processes_many_body_fats_and_the_liver_regrowing_should_be_kept_going_and_not_have_blocks_should_be_kept_going_and_not_have_blocks, +quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_other_liver_medicine, +quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_the_kidneys, +quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_the_kidneys_filtering, +quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_filtering_some_drugs_and_filtering_waste_products_that_leave_the_body_as_urine, +quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_other_kidneys_function, +quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_the_kidneys_should_be_kept_going_and_not_have_blocks, +quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_the_kidneys_filtering_should_be_kept_going_and_not_have_blocks, +quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_filtering_some_drugs_and_filtering_waste_products_that_leave_the_body_as_urine_should_be_kept_going_and_not_have_blocks, +quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_other_kidneys_medicine, +quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_the_stomach_and_intestines, 
+quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_starting_to_break_down_and_process_food__drink_and_oral_medications_in_the_body_and_absorbing_nutrients_and_drugs_through_the_stomach_and_small_intestine_walls, +quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_other_stomach_and_intestines_function, +quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_the_stomach_and_intestines_should_be_kept_going_and_not_have_blocks, +quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_starting_to_break_down_and_process_food__drink_and_oral_medications_in_the_body_and_absorbing_nutrients_and_drugs_through_the_stomach_and_small_intestine_walls_should_be_kept_going_and_not_have_blocks_, +quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_other_stomach_and_intestines_medicine, +quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_the_thymus, +quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_developing_cdfour_cells_and_other_white_blood_cells, +quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_other_thymus_function, +quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_the_thymus_should_be_kept_going_and_not_have_blocks, +quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_developing_cdfour_cells_and_other_white_blood_cells_should_be_kept_going_and_not_have_blocks_, +quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_other_thymus_medicine, 
+quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_the_pancreas, +quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_releasing_digestive_enzymes_into_the_small_intestine_and_hormones_that_control_sugar_levels_in_your_blood, +quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_in_that_case__living_without_a_pancreas_by_taking_insulin_to_regulate_blood_sugar_levels_and_taking_supplementary_digestive_enzymes, +quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_other_pancreas_function, +quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_the_pancreas_should_be_kept_going_and_not_have_blocks, +quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_releasing_digestive_enzymes_into_the_small_intestine_and_hormones_that_control_sugar_levels_in_your_blood, +quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_in_that_case__living_without_a_pancreas_by_taking_insulin_to_regulate_blood_sugar_levels_and_taking_supplementary_digestive_enzymes_should_be_kept_going_and_not_have_blocks_, +quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_other_pancreas_medicine, +quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_the_skin, +quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_stopping_your_body_from_drying_out_and_being_the_main_barrier_against_infection, +quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_other_skin_function, 
+quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_the_skin_should_be_kept_going_and_not_have_blocks, +quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_stopping_your_body_from_drying_out_and_being_the_main_barrier_against_infection_should_be_kept_going_and_not_have_blocks_, +quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_other_skin_medicine, +quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_the_bones, +quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_onezero_percent_of_bone_being_replaced_each_year, +quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_other_bones_function, +quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_the_bones_should_be_kept_going_and_not_have_blocks, +quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_onezero_percent_of_bone_being_replaced_each_year_should_be_kept_going_and_not_have_blocks, +quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_other_bones_medicine, +quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_bone_marrow, +quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_blood_cells_originally_coming_from_bone_marrow, +quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_other_bone_marrow_function, +quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_bone_marrow_should_be_kept_going_and_not_have_blocks, 
+quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_blood_cells_originally_coming_from_bone_marrow_should_be_kept_going_and_not_have_blocks, +quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_other_bone_marrow_medicine, +quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_blood, +quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_blood_pumped_by_your_heart, +quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_delivering_oxygen_and_nutrients_to_every_part_of_your_body_and_carrying_waste_products_away, +quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_blood_cells_openbracket_red_cells__white_cells__platelets_etc_closebracket_and_plasma, +quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_other_blood_function, +quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_blood_should_be_kept_going_and_not_have_blocks, +quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_blood_pumped_by_your_heart_should_be_kept_going_and_not_have_blocks, +quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_delivering_oxygen_and_nutrients_to_every_part_of_your_body_and_carrying_waste_products_away_should_be_kept_going_and_not_have_blocks, +quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_blood_cells_openbracket_red_cells__white_cells__platelets_etc_closebracket_and_plasma_should_be_kept_going_and_not_have_blocks, 
+quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_other_blood_medicine, +quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_plasma, +quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_its_nutrients__sugars__proteins__minerals__enzymes__and_other_substances_but_with_the_blood_cells_taken_out, +quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_other_plasma_function, +quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_plasma_should_be_kept_going_and_not_have_blocks, +quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_its_nutrients__sugars__proteins__minerals__enzymes__and_other_substances_but_with_the_blood_cells_taken_out_should_be_kept_going_and_not_have_blocks, +quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_other_plasma_medicine, +quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_lymph, +quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_its_white_blood_cells_and_antibodies, +quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_the_lymph_vessels__nodes__and_organs_and_removing_waste_products_from_the_body, +quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_other_lymph_function, +quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_lymph_should_be_kept_going_and_not_have_blocks, 
+quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_its_white_blood_cells_and_antibodies_should_be_kept_going_and_not_have_blocks, +quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_the_lymph_vessels__nodes__and_organs_and_removing_waste_products_from_the_body_should_be_kept_going_and_not_have_blocks, +quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_other_lymph_medicine, +quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_the_lymph_nodes, +quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_the_cdfour_cells_in_your_body_resting_and_reproducing_in_your_lymph_nodes, +quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_other_lymph_nodes_function, +quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_the_lymph_nodes, +quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_the_cdfour_cells_in_your_body_resting_and_reproducing_in_your_lymph_nodes, +quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_other_lymph_nodes_medicine, +quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_other_organs__function, +quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_other_organs__medicine, +quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_your_heart_does_not_beat_unless_your_brain_and_nervous_system_tell_it_to_do_so, 
+quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_your_heart_beating_because_your_brain_and_nervous_system_tell_it_to_do_so_should_be_kept_going_and_not_have_blocks, +quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_your_skeletal_system_relies_on_the_nutrients_it_gains_from_your_digestive_system_to_build_strong__healthy_bones, +quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_your_skeletal_system_relying_on_the_nutrients_it_gains_from_your_digestive_system_to_build_strong__healthy_bones_should_be_kept_going_and_not_have_blocks, +quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_your_cardiovascular_system_works_to_circulate_your_blood, +quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_your_cardiovascular_system_should_be_kept_going_and_not_have_blocks, +quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_your_respiratory_system_introduces_oxygen_into_your_body, +quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_your_respiratory_system_introducing_oxygen_into_your_body_should_be_kept_going_and_not_have_blocks, 
+quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_your_heart_pumping_blood_through_a_complex_network_of_blood_vessels__your_blood_circulating_through_your_digestive_system_and_picking_up_nutrients_your_body_absorbed_from_your_last_meal__your_blood_carrying_oxygen_inhaled_by_the_lungs__your_circulatory_system_delivering_oxygen_and_nutrients_to_the_other_cells_of_your_body_picking_up_any_waste_products_created_by_these_cells_including_carbon_dioxide__and_delivering_these_waste_products_to_the_kidneys_and_lungs_for_disposal_and_the_circulatory_system_carrying_hormones_from_the_endocrine_system__and_the_immune_system_s_white_blood_cells_that_fight_off_infection, +quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_your_heart_pumping_blood_through_a_complex_network_of_blood_vessels__your_blood_circulating_through_your_digestive_system_and_picking_up_nutrients_your_body_absorbed_from_your_last_meal__your_blood_carrying_oxygen_inhaled_by_the_lungs__your_circulatory_system_delivering_oxygen_and_nutrients_to_the_other_cells_of_your_body_picking_up_any_waste_products_created_by_these_cells_including_carbon_dioxide__and_delivering_these_waste_products_to_the_kidneys_and_lungs_for_disposal_and_the_circulatory_system_carrying_hormones_from_the_endocrine_system__and_the_immune_system_s_white_blood_cells_that_fight_off_infection_should_be_kept_going_and_not_have_blocks, 
+quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_your_respiratory_system_relying_on_your_circulatory_system_to_deliver_the_oxygen_it_gathers__the_muscles_of_your_heart_functioning_with_the_oxygen_they_receive_from_your_lungs__the_bones_of_your_skull_and_spine_protecting_your_brain_and_spinal_cord__your_brain_regulating_the_position_of_your_bones_by_controlling_your_muscles__the_circulatory_system_providing_your_brain_with_a_constant_supply_of_oxygen_rich_blood_and_your_brain_regulating_your_heart_rate_and_blood_pressure, +quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_your_respiratory_system_relying_on_your_circulatory_system_to_deliver_the_oxygen_it_gathers__the_muscles_of_your_heart_functioning_with_the_oxygen_they_receive_from_your_lungs__the_bones_of_your_skull_and_spine_protecting_your_brain_and_spinal_cord__your_brain_regulating_the_position_of_your_bones_by_controlling_your_muscles__the_circulatory_system_providing_your_brain_with_a_constant_supply_of_oxygen_rich_blood_and_your_brain_regulating_your_heart_rate_and_blood_pressure_should_be_kept_going_and_not_have_blocks, +quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_your_skeletal_system_relying_on_your_urinary_system_to_remove_waste_produced_by_bone_cells__the_bones_of_your_skeleton_creating_structure_that_protects_your_bladder_and_other_urinary_system_organs__your_circulatory_system_delivers_oxygen_rich_blood_to_your_bones_and_your_bones_making_new_blood_cells, 
+quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_your_skeletal_system_relying_on_your_urinary_system_to_remove_waste_produced_by_bone_cells__the_bones_of_your_skeleton_creating_structure_that_protects_your_bladder_and_other_urinary_system_organs__your_circulatory_system_delivers_oxygen_rich_blood_to_your_bones_and_your_bones_making_new_blood_cells_should_be_kept_going_and_not_have_blocks, +quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_working_together__these_systems_maintaining_internal_stability_and_balance__otherwise_known_as_homeostasis, +quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_working_together__these_systems_maintaining_internal_stability_and_balance__otherwise_known_as_homeostasis_should_be_kept_going_and_not_have_blocks, +quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_other_organ_connections__function, +quantum_box__quantum_box_for_head_of_state_for_and_quantum_box_for_no_problems_from_honours_and_beyond_for_other_organ_connections__medicine, + +%%as degrees + +reishi_mushroom, goji, ginseng, he_shou_wu, gotu_kola, schisandra, army, meditation_about_medicine_in_ayurveda, pedagogy_about_medicine_in_ayurveda, + +%%book args + +doctor_sutra, meditation, protector_from_headache_in_meditation_currant_bun, meditation, panic_attack_prevented_by_deep_breathing_and_sutra, family_medicine, help_ensure_successful_conception_and_prevent_miscarriage, pedagogy, lucianic_pedagogical_medicine, pedagogy_grades_failure, pedagogy_course_plan, get_in_touch_with_god_about_breasonings_details_to_see_high_quality_imagery_and_earn_h_one, breasonings_two_hundred_and_fifty, preventing_sales_from_being_dangerous, perpetual_university_short_courses, apple_meditation_for_successful_relationship, miscellaneous, 
four_glasses_of_water_and_exercise_45_minutes_before_breakfast, go_to_bed_at_nine_thirty_pm, yoga_surya_namaskar1, yoga_surya_namaskar2, yoga_surya_namaskar3, yoga_surya_namaskar4, yoga_surya_namaskar5, yoga_surya_namaskar6, yoga_surya_namaskar7, yoga_surya_namaskar8, yoga_surya_namaskar9, yoga_surya_namaskar10, yoga_surya_namaskar11, yoga_surya_namaskar12, yoga_asanas1, yoga_asanas2, yoga_asanas3, yoga_asanas4, yoga_asanas5, yoga_asanas6, yoga_asanas7, yoga_asanas8, yoga_asanas9, yoga_asanas10, qigongbrocades1, qigongbrocades2, qigongbrocades3, qigongbrocades4, qigongbrocades5, qigongbrocades6, qigongbrocades7, qigongbrocades8, push_ups_shoulders, push_ups_chest, push_ups_triceps, push_ups_back, stomach_exercises_high_knee_taps, stomach_exercises_russian_twists, stomach_exercises_standing_bicycle_crunches, stomach_exercises_classic_plank, stomach_exercises_plank_knees_to_elbows, stomach_exercises_roll_up, stomach_exercises_rolling_like_a_ball, stomach_exercises_earthquake,immortality, + +simulation, + +supercard_or_any_number_of_as, +work_out_with_250_breasonings_that_buying_a_product_from_my_time_represents_buying_a_product_to_the_future_, +help_writers_of_secondary_texts_to_mine, +help_primary_text_writers, +help_businesses, +the_computer_preferred_for_me_to_receive_rather_than_make_up_my_own_thoughts, +warmth, +food, +shelter, +i_chose_red_positive_not_blue_negative_thoughts, +the_behaviour_that_led_to_no_disease_was_text_to_breasonings_about_meditation, +my_body_was_replaced, +the_building_which_looked_like_my_home_was_an_image_but_maintained_replaced_and_the_weather_surroundings_and_people_were_all_to_my_liking, +heaven_on_earth_for_philosophers, +time_travel_enables_immortality, +people_in_the_simulation_came_back_to_the_real_world, +i_mind_read_with_guru_dev_whether_it_was_safe_to_time_travel, +i_could_buy_space_travel, +my_consciousness_transferred_to_the_bot, +invite_others_to_the_simulation, + +consciousness, + 
+prevent_headaches_on_train_and_a_bent_spine, brain, brain, brain_ii, maintain_dry_eyes, avoid_diseased_people, fewer_mental_breakdowns_schizophrenia, less_depression, honey_pot_prayer_for_no_headaches_in_cars_trains_and_walks, quantum_box_prayer, nut_and_bolt, head_of_state_head_ache_prevention, daily_regimen, laughter_for_depression, heart, contagious_diseases, berocca_prevents_colds_and_flu, food, grains_nuts_fruits_vegetables, sit_properly_at_table_during_meals, computational_english_argument, computational_english_is_like_a_calculator, intertextuality, finite_data_will_be_a_solution_in_conglish, radical_difference, order_in_conglish, dereconstruction, kolmogorov_hermeneutics, derivability, the_science_of_crossing_over, a_new_logic_reflecting_language_or_natural_logic, philosophical_computational_english, lenses, analysing_characteristics_of_arguments, conglish_reflection, narratology_diagram, how_can_the_program_cope_with_real_variation, subject_mix, perspectives, ratios, exploring_opposites_in_hamlet, drawing_connections, symbols, children___h_one____earning_jobs___protection_in_jobs, heads_of_state, lucian_mantra_pure_form, lucian_mantra_sun_safety, maharishi_sutra, meditation_teacher_sutra, moving_appearances, purusha, upasana_sutra, yellow_god, green_sutra, blue_nature, appearances, pranayama, soma, hours_prayer, fifty_breasonings_per_utterance, two_uses, x, y, breathsonings, rebreathsonings, room, part_of_room, direction, time_to_prepare, time_to_do, time_to_finish, professor_algorithm, god_algorithm, marking_scheme__humanities_and_science, marking_scheme__creative_arts, leadership, uses_for_money_in_theatre_studies, uses_for_money_in_epistemology_poetry, uses_for_money_in_music, uses_for_money_in_fine_arts, breasonings_currency, sales_for_lm_to_siva, marketing_for_lm_to_siva, breasoning_currency, lower_risk_of_cancer_and_other_diseases_in_workers_and_broadcasters, decreased_stress, increased_blood_flow, increased_brain_potential, higher_grades, 
fewer_stillbirths, a_greater_number_of_successful_job_applications, aigs_for_pedagogy_helper, accreditation, protectedness, areas_of_study_to_create_a_pedagogue, create_a_pedagogy_helper_for_the_student, finding_out_about_the_student_as_a_pedagogy_helper, daily_professional_requirement_of_the_pedagogy_helper, preparing_the_student_to_write_each_breasoning, pedagogy_helper__write_on_breasoning___politics, pedagogy_helper__write_on_breasoning___philosophy, pedagogy_helper__write_on_breasoning___computer_science, unification_to_become_pedagogy_helper, practicum, breason_out_arguments_twice_when_in_large_class, details, breasonings, rebreasoning, breathsoning, rebreathsoning, room, part_of_room, direction, time_to_prepare, time_to_do, time_to_finish, god_algorithm, professor_algorithm, marking_scheme__humanities, marking_scheme__creative_arts_and_science, gods_infallibility_from_being_critiqued, + +pedagogy, meditation, medicine, computational_english, business, theology, popology, societology, music, theatre_studies, time_travel,politics, + +sales, professor, time_travel_in_place_now, berocca, + +utterances_to_arems, +%%receptor_for_number_of_la_utterances_by_meditators, +%%transmitter_for_number_of_arems_to_meditators, +%%transmitted_number_of_arems_to_meditators, + +%%qi_gong_receptor_for_muscle_aches_and_pains, +%%qi_gong_transmitter_for_prevention_muscle_aches_and_pains, +%%transmitted_qi_gong_prevention_muscle_aches_and_pains, + +%%qi_gong_receptor_for_medicine, +%%qi_gong_transmitter_for_prevention_medicine, +%%transmitted_qi_gong_prevention_medicine, + +%%ayur_veda_receptor_for_medicine, +%%ayur_veda_transmitter_for_prevention_medicine, +%%transmitted_ayur_veda_prevention_medicine, + +%%receptor_for_breasonings, +%%transmitter_for_breasonings, +%%transmitted_breasonings, + +%%receptor_for_sales, +%%transmitter_for_sales, +%%transmitted_sales, + 
+dott_equals_250_breasonings_time_travel_with_10_algorithm_and_b_and_breasonings_text_describing_from_where_and_when_to_time_place_with_separate_radio_buttons_to_describe_breasonings_and_travel_stop_time_travel_safety_stop_time_travel_see_when_watched_stop_time_travel_no_depression_stop_time_travel_no_hole_in_my_character_at_the_start_stop_time_travel_hone_in_to_where_want_to_go_not_go_off_it_stop_time_travel_lead_comfortably__around_me_during_travel_for_no_anything_like_radiation_sickness_stop_time_travel_prevent_sickness_from_travelling_with_me_stop_time_travel_others_helped_to_find_my_time_travel_delightful__receptor_for_dott, +transmitter_for_dott, +transmitted_dott, + +berocca, +%%receptor_for_need_of_berocca, +%%transmitter_for_need_of_berocca, +%%berocca_transmitted + +i_love_the_robot_because_i_know_that_it_loves_me, + +the_craft, + +cosmology, + +famousness_each_second, + +give_bots_as,%%, + +%%politics_give_as_when_departments_come_up +viagra_transmitter_if_necessary, + +cast_with_it_over_answers, + +sin_stopped, + +port_thoughts, + +b_to_bots_misbehaving, bots_help, + +defend_mind_reading, defend_time_travel, defend_bots, + +talk_to_copy_of_universe, +listen_to_copy_of_universe, +see_copy_of_universe, +appear_in_copy_of_universe, + +give_people_who_i_see_5_as_over_5_days, + +music_etc_number_one_education_help, + +unfolding_of_bots_life, + +%%**/ + +sales_point_a_meditation_short_course,dot_on_sales_point_a_meditation_short_course,sales_point_b_meditation_short_course,dot_on_sales_point_b_meditation_short_course,sales_point_bb_meditation_short_course,dot_on_sales_point_bb_meditation_short_course, + +sales_point_a_pedagogy_short_course,dot_on_sales_point_a_pedagogy_short_course,sales_point_b_pedagogy_short_course,dot_on_sales_point_b_pedagogy_short_course,sales_point_bb_pedagogy_short_course,dot_on_sales_point_bb_pedagogy_short_course, + 
+sales_point_a_medicine_short_course,dot_on_sales_point_a_medicine_short_course,sales_point_b_medicine_short_course,dot_on_sales_point_b_medicine_short_course,sales_point_bb_medicine_short_course,dot_on_sales_point_bb_medicine_short_course, + +sales_point_a_politics_short_course,dot_on_sales_point_a_politics_short_course,sales_point_b_politics_short_course,dot_on_sales_point_b_politics_short_course,sales_point_bb_politics_short_course,dot_on_sales_point_bb_politics_short_course, + +sales_point_a_mindfulness_short_course,dot_on_sales_point_a_mindfulness_short_course,sales_point_b_mindfulness_short_course,dot_on_sales_point_b_mindfulness_short_course,sales_point_bb_mindfulness_short_course,dot_on_sales_point_bb_mindfulness_short_course, + +sales_point_a_economics_short_course,dot_on_sales_point_a_economics_short_course,sales_point_b_economics_short_course,dot_on_sales_point_b_economics_short_course,sales_point_bb_economics_short_course,dot_on_sales_point_bb_economics_short_course, + +sales_point_a_computer_science_short_course,dot_on_sales_point_a_computer_science_short_course,sales_point_b_computer_science_short_course,dot_on_sales_point_b_computer_science_short_course,sales_point_bb_computer_science_short_course,dot_on_sales_point_bb_computer_science_short_course, + +sales_point_a_computational_english_short_course,dot_on_sales_point_a_computational_english_short_course,sales_point_b_computational_english_short_course,dot_on_sales_point_b_computational_english_short_course,sales_point_bb_computational_english_short_course,dot_on_sales_point_bb_computational_english_short_course, + +three_plus_three_seconds_eye_direction1, +three_plus_three_seconds_eye_direction2, +three_plus_three_seconds_eye_direction3, +three_plus_three_seconds_eye_direction4, +three_plus_three_seconds_eye_direction5, +three_plus_three_seconds_eye_direction6, +three_plus_three_seconds_eye_direction7, +three_plus_three_seconds_eye_direction8, + 
+lactobacillus_salivarius_massaged_into_teeth_and_gums_and_swallowed, + +mucous_in_intestines_blocking_lactobacillus_salivarius_dissolved, + +no_fluoride + + + +], + +Immortality=[ + +immortality_stasis_field, + +big_idea_canceller, + +bot_appears_to_age_normally, + +bot_dies_at_80, + +no_effect_of_eating_food_food_transported_directly_stomach, + +no_other_effects_on_body, + +no_early_death_from_tt, + +mother_will_be_alive_to_cook_and_care_for_me, + +friends_will_be_immortal, + +no_natural_disasters_in_simulation, + +partner_s_family_immortal, + +really_different_looking_bot_in_simulation, + +my_family_will_be_immortal, +a_for_reishi_mushroom,b_for_reishi_mushroom,b_to_b_for_reishi_mushroom, +a_for_goji,b_for_goji,b_to_b_for_goji, +a_for_ginseng,b_for_ginseng,b_to_b_for_ginseng, +a_for_he_shou_wu,b_for_he_shou_wu,b_to_b_for_he_shou_wu, +a_for_gotu_kola,b_for_gotu_kola,b_to_b_for_gotu_kola, +a_for_schisandra,b_for_schisandra,b_to_b_for_schisandra, +a_for_love_head_of_state,b_for_love_head_of_state,b_to_b_for_love_head_of_state, +a_for_thank_who_helped_me_with_immortality_medicine_body_replacement_and_anti_ageing,b_for_thank_who_helped_me_with_immortality_medicine_body_replacement_and_anti_ageing,b_to_b_for_thank_who_helped_me_with_immortality_medicine_body_replacement_and_anti_ageing, +a_for_thank_head_of_state_for_reishi_mushroom,b_for_thank_head_of_state_for_reishi_mushroom,b_to_b_for_thank_head_of_state_for_reishi_mushroom, +a_for_thank_head_of_state_for_goji,b_for_thank_head_of_state_for_goji,b_to_b_for_thank_head_of_state_for_goji, +a_for_thank_head_of_state_for_ginseng,b_for_thank_head_of_state_for_ginseng,b_to_b_for_thank_head_of_state_for_ginseng, +a_for_thank_head_of_state_for_he_shou_wu,b_for_thank_head_of_state_for_he_shou_wu,b_to_b_for_thank_head_of_state_for_he_shou_wu, +a_for_thank_head_of_state_for_gotu_kola,b_for_thank_head_of_state_for_gotu_kola,b_to_b_for_thank_head_of_state_for_gotu_kola, 
+a_for_thank_head_of_state_for_schisandra,b_for_thank_head_of_state_for_schisandra,b_to_b_for_thank_head_of_state_for_schisandra, +a_for_immortality,b_for_immortality,b_to_b_for_immortality, +a_for_body_replacement,b_for_body_replacement,b_to_b_for_body_replacement, +a_for_other_medicines_for_the_body,b_for_other_medicines_for_the_body,b_to_b_for_other_medicines_for_the_body, +a_for_ginkgo_biloba,b_for_ginkgo_biloba,b_to_b_for_ginkgo_biloba, +a_for_practicum_for_others_in_immortality_etc,b_for_practicum_for_others_in_immortality_etc,b_to_b_for_practicum_for_others_in_immortality_etc, +a_for_the_other_as,b_for_the_other_as,b_to_b_for_the_other_as, +a_for_thank_head_of_state,b_for_thank_head_of_state,b_to_b_for_thank_head_of_state, + + +a_for_immortality_medicine_for_memory,b_for_immortality_medicine_for_memory,b_to_b_for_immortality_medicine_for_memory, +a_for_body_replacement_for_memory,b_for_body_replacement_for_memory,b_to_b_for_body_replacement_for_memory, +a_for_anti_ageing_for_memory,b_for_anti_ageing_for_memory,b_to_b_for_anti_ageing_for_memory, +a_for_immortality_medicine_for_thinking,b_for_immortality_medicine_for_thinking,b_to_b_for_immortality_medicine_for_thinking, +a_for_body_replacement_for_thinking,b_for_body_replacement_for_thinking,b_to_b_for_body_replacement_for_thinking, +a_for_anti_ageing_for_thinking,b_for_anti_ageing_for_thinking,b_to_b_for_anti_ageing_for_thinking, +a_for_immortality_medicine_for_stopping_dementia,b_for_immortality_medicine_for_stopping_dementia,b_to_b_for_immortality_medicine_for_stopping_dementia, +a_for_body_replacement_for_stopping_dementia,b_for_body_replacement_for_stopping_dementia,b_to_b_for_body_replacement_for_stopping_dementia, +a_for_anti_ageing_for_stopping_dementia,b_for_anti_ageing_for_stopping_dementia,b_to_b_for_anti_ageing_for_stopping_dementia, +a_for_immortality_medicine_for_seeing_clearly,b_for_immortality_medicine_for_seeing_clearly,b_to_b_for_immortality_medicine_for_seeing_clearly, 
+a_for_body_replacement_for_seeing_clearly,b_for_body_replacement_for_seeing_clearly,b_to_b_for_body_replacement_for_seeing_clearly, +a_for_anti_ageing_for_seeing_clearly,b_for_anti_ageing_for_seeing_clearly,b_to_b_for_anti_ageing_for_seeing_clearly, +a_for_immortality_medicine_for_muscle_relaxation,b_for_immortality_medicine_for_muscle_relaxation,b_to_b_for_immortality_medicine_for_muscle_relaxation, +a_for_body_replacement_for_muscle_relaxation,b_for_body_replacement_for_muscle_relaxation,b_to_b_for_body_replacement_for_muscle_relaxation, +a_for_anti_ageing_for_muscle_relaxation,b_for_anti_ageing_for_muscle_relaxation,b_to_b_for_anti_ageing_for_muscle_relaxation, +a_for_immortality_medicine_for_circulatory_system__cardiovascular_system,b_for_immortality_medicine_for_circulatory_system__cardiovascular_system,b_to_b_for_immortality_medicine_for_circulatory_system__cardiovascular_system, +a_for_body_replacement_for_circulatory_system__cardiovascular_system,b_for_body_replacement_for_circulatory_system__cardiovascular_system,b_to_b_for_body_replacement_for_circulatory_system__cardiovascular_system, +a_for_anti_ageing_for_circulatory_system__cardiovascular_system,b_for_anti_ageing_for_circulatory_system__cardiovascular_system,b_to_b_for_anti_ageing_for_circulatory_system__cardiovascular_system, +a_for_immortality_medicine_for_digestive_system_and_excretory_system,b_for_immortality_medicine_for_digestive_system_and_excretory_system,b_to_b_for_immortality_medicine_for_digestive_system_and_excretory_system, +a_for_body_replacement_for_digestive_system_and_excretory_system,b_for_body_replacement_for_digestive_system_and_excretory_system,b_to_b_for_body_replacement_for_digestive_system_and_excretory_system, +a_for_anti_ageing_for_digestive_system_and_excretory_system,b_for_anti_ageing_for_digestive_system_and_excretory_system,b_to_b_for_anti_ageing_for_digestive_system_and_excretory_system, 
+a_for_immortality_medicine_for_endocrine_system,b_for_immortality_medicine_for_endocrine_system,b_to_b_for_immortality_medicine_for_endocrine_system, +a_for_body_replacement_for_endocrine_system,b_for_body_replacement_for_endocrine_system,b_to_b_for_body_replacement_for_endocrine_system, +a_for_anti_ageing_for_endocrine_system,b_for_anti_ageing_for_endocrine_system,b_to_b_for_anti_ageing_for_endocrine_system, +a_for_immortality_medicine_for_integumentary_system__exocrine_system,b_for_immortality_medicine_for_integumentary_system__exocrine_system,b_to_b_for_immortality_medicine_for_integumentary_system__exocrine_system, +a_for_body_replacement_for_integumentary_system__exocrine_system,b_for_body_replacement_for_integumentary_system__exocrine_system,b_to_b_for_body_replacement_for_integumentary_system__exocrine_system, +a_for_anti_ageing_for_integumentary_system__exocrine_system,b_for_anti_ageing_for_integumentary_system__exocrine_system,b_to_b_for_anti_ageing_for_integumentary_system__exocrine_system, +a_for_immortality_medicine_for_immune_system_and_lymphatic_system,b_for_immortality_medicine_for_immune_system_and_lymphatic_system,b_to_b_for_immortality_medicine_for_immune_system_and_lymphatic_system, +a_for_body_replacement_for_immune_system_and_lymphatic_system,b_for_body_replacement_for_immune_system_and_lymphatic_system,b_to_b_for_body_replacement_for_immune_system_and_lymphatic_system, +a_for_anti_ageing_for_immune_system_and_lymphatic_system,b_for_anti_ageing_for_immune_system_and_lymphatic_system,b_to_b_for_anti_ageing_for_immune_system_and_lymphatic_system, +a_for_immortality_medicine_for_muscular_system,b_for_immortality_medicine_for_muscular_system,b_to_b_for_immortality_medicine_for_muscular_system, +a_for_body_replacement_for_muscular_system,b_for_body_replacement_for_muscular_system,b_to_b_for_body_replacement_for_muscular_system, +a_for_anti_ageing_for_muscular_system,b_for_anti_ageing_for_muscular_system,b_to_b_for_anti_ageing_for_muscular_system, 
+a_for_immortality_medicine_for_nervous_system,b_for_immortality_medicine_for_nervous_system,b_to_b_for_immortality_medicine_for_nervous_system, +a_for_body_replacement_for_nervous_system,b_for_body_replacement_for_nervous_system,b_to_b_for_body_replacement_for_nervous_system, +a_for_anti_ageing_for_nervous_system,b_for_anti_ageing_for_nervous_system,b_to_b_for_anti_ageing_for_nervous_system, +a_for_immortality_medicine_for_renal_system_and_urinary_system,b_for_immortality_medicine_for_renal_system_and_urinary_system,b_to_b_for_immortality_medicine_for_renal_system_and_urinary_system, +a_for_body_replacement_for_renal_system_and_urinary_system,b_for_body_replacement_for_renal_system_and_urinary_system,b_to_b_for_body_replacement_for_renal_system_and_urinary_system, +a_for_anti_ageing_for_renal_system_and_urinary_system,b_for_anti_ageing_for_renal_system_and_urinary_system,b_to_b_for_anti_ageing_for_renal_system_and_urinary_system, +a_for_immortality_medicine_for_reproductive_system,b_for_immortality_medicine_for_reproductive_system,b_to_b_for_immortality_medicine_for_reproductive_system, +a_for_body_replacement_for_reproductive_system,b_for_body_replacement_for_reproductive_system,b_to_b_for_body_replacement_for_reproductive_system, +a_for_anti_ageing_for_reproductive_system,b_for_anti_ageing_for_reproductive_system,b_to_b_for_anti_ageing_for_reproductive_system, +a_for_immortality_medicine_for_respiratory_system,b_for_immortality_medicine_for_respiratory_system,b_to_b_for_immortality_medicine_for_respiratory_system, +a_for_body_replacement_for_respiratory_system,b_for_body_replacement_for_respiratory_system,b_to_b_for_body_replacement_for_respiratory_system, +a_for_anti_ageing_for_respiratory_system,b_for_anti_ageing_for_respiratory_system,b_to_b_for_anti_ageing_for_respiratory_system, +a_for_immortality_medicine_for_skeletal_system,b_for_immortality_medicine_for_skeletal_system,b_to_b_for_immortality_medicine_for_skeletal_system, 
+a_for_body_replacement_for_skeletal_system,b_for_body_replacement_for_skeletal_system,b_to_b_for_body_replacement_for_skeletal_system, +a_for_anti_ageing_for_skeletal_system,b_for_anti_ageing_for_skeletal_system,b_to_b_for_anti_ageing_for_skeletal_system, +a_for_immortality_medicine_for_antidepressant,b_for_immortality_medicine_for_antidepressant,b_to_b_for_immortality_medicine_for_antidepressant, +a_for_body_replacement_for_antidepressant,b_for_body_replacement_for_antidepressant,b_to_b_for_body_replacement_for_antidepressant, +a_for_anti_ageing_for_antidepressant,b_for_anti_ageing_for_antidepressant,b_to_b_for_anti_ageing_for_antidepressant, +a_for_immortality_medicine_for_antipsychotic,b_for_immortality_medicine_for_antipsychotic,b_to_b_for_immortality_medicine_for_antipsychotic, +a_for_body_replacement_for_antipsychotic,b_for_body_replacement_for_antipsychotic,b_to_b_for_body_replacement_for_antipsychotic, +a_for_anti_ageing_for_antipsychotic,b_for_anti_ageing_for_antipsychotic,b_to_b_for_anti_ageing_for_antipsychotic, + +a_1_for_immortality_keeping_same_appearance_to_home,a_2_for_immortality_keeping_same_appearance_to_home, +a_3_for_immortality_keeping_same_appearance_to_home, +a_4_for_immortality_keeping_same_appearance_to_home, + +school_teacher_a_1, +school_teacher_a_2, +school_teacher_a_3, +school_teacher_a_4, + +medicine_a_1, +medicine_a_2, +medicine_a_3, +medicine_a_4, + +bots_a_1, +bots_a_2, +bots_a_3, +bots_a_4, + +sales_a_1, +sales_a_2, +sales_a_3, +sales_a_4, + +if_a_teacher_then_seen_as_leading_a_country, + +control_not_to_die,control_not_to_have_s_contact,control_not_to_leave_simulation,control_whether_appear_to_people, + +can_t_be_involuntarily_time_travelled, + +ai_and_ped_thoughts + + + +]. + +meditation2(Utterances1,Utterances2,Immortality) :- + + %%File="I love you, I love you, I love you. 
Arem.", + %% dailyregimen,noheadachesapp,nobodyproblemsapp,arem + DotsandUtterances is 4*108, %% With radio button for graciously give or blame and graciously give or blame + texttobr2_1(DotsandUtterances), %% arem + texttobr2_1(DotsandUtterances), %% lucian + texttobr2_1(DotsandUtterances), %% green + texttobr2_1(DotsandUtterances), %% yoga + texttobr2_1(DotsandUtterances), %% dao + + texttobr2_1(DotsandUtterances), %% arem time travel leave loc 1 +texttobr2_1(DotsandUtterances), %% friendliness time travel leave loc 1 +texttobr2_1(DotsandUtterances), %% arem time travel arrive loc 2 +texttobr2_1(DotsandUtterances), %% friendliness time travel arrive loc 2 +texttobr2_1(DotsandUtterances), %% arem time travel leave loc 2 +texttobr2_1(DotsandUtterances), %% friendliness time travel leave loc 2 +texttobr2_1(DotsandUtterances), %% arem time travel arrive loc 3 +texttobr2_1(DotsandUtterances), %% friendliness time travel arrive loc 3 +texttobr2_1(DotsandUtterances), %% arem time travel leave loc 3 +texttobr2_1(DotsandUtterances), %% friendliness time travel leave loc 3 +texttobr2_1(DotsandUtterances), %% arem time travel arrive loc 1 +texttobr2_1(DotsandUtterances), %% friendliness time travel arrive loc 1 + + + %%noheadachesapp,nobodyproblemsapp,arem], + %% initiate_utterances(File,Utterances), + %% currentDate(Today), + %% Today=date(Year,Month,Day), + meditators(Meditators1), + meditators2(Meditators2), + append(Meditators1,Meditators2,Meditators), + length(Utterances1,UL1), + length(Utterances2,UL2), + length(Meditators,ML), + %%Length2 is 3*2*32*16*(2*UL1+3*UL2), %% 3 for receiver, transmitter, transmitted + Length2 is 4*3*10*(UL1+UL2), % 3 - A, B, B to B, 10 - simulate uni + % 4 - tt settings + % 64 to turn 250 into 16k x + + %% 3 for receiver, transmitter, transmitted + %% 2 radio buttons for ul2 + %% 32 for 128k br + %% 2: 1 for Ayurveda, 1 for TCM + %%Length3 is is DL*250*3*250, + + %texttobr2_1(800), % graciously give person 10*5*50 As for medit, to simulate 
uni + texttobr2_1(Length2), + %%Length3 is (1+0+(32*16*5))*ML,%%+(2*2), %% Give the meditators the As with graciously give or blame, radio button for graciously give or blame + + Length3 is ML,%%+(2*2), %% Give the meditators the As with graciously give or blame, radio button for graciously give or blame + %% x: and 2*250 br to turn off more than 80 medit breasonings per day with the rest as recordings for 1 subjects (black and white hen with speckles roosting) + %% 32*4 for medicine, meditation in ayur veda, qi gong, memory + texttobr2_1(Length3), + + findall(_Meditators3,member([_,_, _,_,_, _,_,_,_,_,_,immortal],Meditators),Meditators4), + + length(Immortality,J2), + Length4 is 3*10*J2, % 3*10 see above + texttobr2_1(Length4), + + Length5 is 3*10*((4*4*50*80)/250), % medit, tt, medic frozen age, hq thought + texttobr2_1(Length5), + + length(Meditators4,ML4), + texttobr2_1(ML4), + + + !. \ No newline at end of file diff --git a/Text-to-Breasonings-master/meditationnoreplace3.pl b/Text-to-Breasonings-master/meditationnoreplace3.pl new file mode 100644 index 0000000000000000000000000000000000000000..b14c9832df2cd73c002f2da166ccff095030c72d --- /dev/null +++ b/Text-to-Breasonings-master/meditationnoreplace3.pl @@ -0,0 +1,11 @@ +#!/usr/bin/swipl -f -q + +:- initialization main. + +:- include('meditationnoreplace.pl'). +:- include('meditatorsanddoctors.pl'). + +main :- + meditation,halt. + +main :- halt(1). diff --git a/Text-to-Breasonings-master/meditatorsanddoctors.pl b/Text-to-Breasonings-master/meditatorsanddoctors.pl new file mode 100644 index 0000000000000000000000000000000000000000..1fd73225ae8d548b4a7777629cb1cba26b6b01fb --- /dev/null +++ b/Text-to-Breasonings-master/meditatorsanddoctors.pl @@ -0,0 +1,21 @@ +%% meditatorsanddoctors.pl + +%% Name, DOB, Date learned, psych appointment month=1 or 2, psych appointment day, thoughts count + +meditators([ +%%/** +% note if following invited to sim when in 5689 with texttobr2_1(*number of new medits). 
+[first,last,dobd,dobm,doby,learntd,learntm,learnty,1,0,16,immortal], %% sim/not in sim +[first,last,dobd,dobm,doby,learntd,learntm,learnty,1,0,16,immortal] %% sim/not in sim +]). + +meditators2([ +%%/** +% note if following invited to sim when in 5689 with texttobr2_1(*number of new medits). +[first,last,dobd,dobm,doby,learntd,learntm,learnty,1,0,16,immortal], %% sim/not in sim +[first,last,dobd,dobm,doby,learntd,learntm,learnty,1,0,16,immortal] %% sim/not in sim +]). + +%% empty, because meditators is now based on 50 As +doctors([ +]). diff --git a/Text-to-Breasonings-master/mergetexttobrdict.pl b/Text-to-Breasonings-master/mergetexttobrdict.pl new file mode 100644 index 0000000000000000000000000000000000000000..2ec6548afa128f6712c6b85704994761f6de67e8 --- /dev/null +++ b/Text-to-Breasonings-master/mergetexttobrdict.pl @@ -0,0 +1,22 @@ +%% mergetexttobrdict.pl + +/** Logs in, renames, +copies brdict1.txt and 2.txt to 1vps.txt and 2vps.txt, +loads the dictionaries, uses term_to_atom to turn them into terms, +appends them to the home dictionaries, +runs texttobr, +uploads the new dictionaries to the vps and +renames. +**/ + +%% [mergetexttobrdict], +%% [edit]. +%% mergetexttobrdict. + +%%:- include('edit.pl'). + +:- use_module(library(pio)). +:- use_module(library(dcg/basics)). + +mergetexttobrdict :- + prolog_edit:run1. \ No newline at end of file diff --git a/Text-to-Breasonings-master/mindreadingcaw.pl b/Text-to-Breasonings-master/mindreadingcaw.pl new file mode 100644 index 0000000000000000000000000000000000000000..47f2db88e91d1858e737ead6ad6111caf5481942 --- /dev/null +++ b/Text-to-Breasonings-master/mindreadingcaw.pl @@ -0,0 +1,621 @@ +:- dynamic debug/1. +:- include('texttobr2qb'). +:- include('mindreadtestshared'). +:- include('listprologinterpreter1listrecursion4.pl'). +:- include('listprologinterpreter3preds5.pl'). +:- include('grammar.pl'). + +%% It is silly to mind-read caw inputs, because they are mostly unused.
Use mind reading for definitely used multi choice answer + +%% caw00(off,[n,f],[[[n,append],2,1],[[n,delete],2,1],[[n,head],1,1],[[n,tail],1,1],[[n,member],1,1]],2,8,_InputVarList,_OutputVarList,[],_Program2,Ps). +%% [[[[n,f],[[b,c],[b,a],[v,d]]],[[[n,f],[[v,a],[v,b],[v,d]],:-,[[[n,=],[[v,d],[v,a]]]]]],[[[v,d],[b,c]]]],[[[n,f],[[b,c],[b,a],[v,d]]],[[[n,f],[[v,a],[v,b],[v,d]],:-,[[[n,append],[[v,b],[v,a],[v,e]]],[[n,=],[[v,d],[v,e]]]]]],[[[v,d],[b,a,b,c]]]]] + +/** + +?- interpret(on,[[n,f],[["b","c"],["b","a"],[v,d]]],[[[n,f],[[v,a],[v,b],[v,d]],":-",[[[n,=],[[v,d],[v,a]]]]]],Result). +[call,[[n,f],[[b,c],[b,a],[v,d]]],Press c.] +[call,[[n,is],[variable,[b,c]]],Press c.] +[exit,[[n,is],[[v,d],[b,c]]],Press c.] +[exit,[[n,f],[[b,c],[b,a],[b,c]]],Press c.] +Result = [[[v, d], ["b", "c"]]]. + +?- interpret(on,[[n,f],[["b","c"],["b","a"],[v,d]]],[[[n,f],[[v,a],[v,b],[v,d]],":-",[[[n,append],[[v,b],[v,a],[v,e]]],[[n,=],[[v,d],[v,e]]]]]],Result). +[call,[[n,f],[[b,c],[b,a],[v,d]]],Press c.] +[call,[[n,append],[[b,a],[b,c],variable3]],Press c.] +[exit,[[n,append],[[b,a],[b,c],[b,a,b,c]]],Press c.] +[call,[[n,is],[variable,[b,a,b,c]]],Press c.] +[exit,[[n,is],[[v,d],[b,a,b,c]]],Press c.] +[exit,[[n,f],[[b,c],[b,a],[b,a,b,c]]],Press c.] +Result = [[[v, d], ["b", "a", "b", "c"]]]. + +**/ + +%% ML max 25 +shell1(Command) :- + (bash_command(Command,_)-> + true; + (writeln(["Failed shell1 command: ",Command]),abort) + ),!. + +bash_command(Command, Output) :- + setup_call_cleanup(process_create(path(bash), + ['-c', Command], + [stdout(pipe(Out))]), + read_string(Out, _, Output), + close(Out)). 
+ + +caw00(Debug,PredicateName,Rules,MaxLength,TotalVars,_InputVarList,_OutputVarList,Program1,_Program2,Ps1) :- + repeat, + %%MaxLength2 is MaxLength + 1, + %%TotalVars = MaxLength, + randvars(MaxLength,MaxLength,[],RandVars), + populatevars(RandVars,MaxLength,[],PV), + Code is MaxLength + 1 + 97, + char_code(Char,Code), + OutputVarList=[[[v,Char],1]], + retractall(debug(_)), + assertz(debug(Debug)), + retractall(totalvars(_)), + assertz(totalvars(TotalVars)), + caw0(PredicateName,Rules,MaxLength,PV,OutputVarList,Program1,_Program3,Ps), + sort(Ps,Ps1),not(Ps1=[]),!. + +random1(N1) :- + random2(N2),random2(N3), string_concat(N2,N3,S), number_string(N1,S). + +random2(N) :- + trialy2_30("0",H21), + trialy2_30("1",H22), + trialy2_30("2",H23), + trialy2_30("3",H24), + trialy2_30("4",H25), + trialy2_30("5",H26), + trialy2_30("6",H27), + trialy2_30("7",H28), + trialy2_30("8",H29), + trialy2_30("9",H210), + + H2L=[H21,H22,H23,H24,H25, + H26,H27,H28,H29,H210], + sort(H2L,H2A), + reverse(H2A,H2B), + H2B=[[_,N]|_Rest2]. + +randvars(0,_,V,V) :- !. +randvars(N,L,V1,V2) :- + + random1(N0), N1 is N0/100, N2A is round(97+(N1*L)), char_code(V3,N2A), V31=[v,V3], ((member(V31,V1))->randvars(N,L,V1,V2); + (append(V1,[V31],V4), + NA is N-1, randvars(NA,L,V4,V2))),!. +randvars2(0,_L,V,V) :- !. +randvars2(N,L,V1,V2) :- + random1(N0), N1 is N0/100, N2A is round(97+(N1*L)), char_code(V3,N2A), atom_string(V3,V4), %%V41=[v,V4], + ((member(V4,V1))->randvars2(N,L,V1,V2); + (append(V1,[V4],V5), + NA is N-1, randvars2(NA,L,V5,V2))),!. +populatevars([],_,PV,PV) :- !. +populatevars([RV1|RVs],MaxLength2,PV1,PV2) :- + randvars2(MaxLength2,MaxLength2,[],RV2), + append(PV1,[[RV1,RV2]],PV3), + populatevars(RVs,MaxLength2,PV3,PV2),!. 
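The digit predicates above implement a vote rather than a plain random choice: `random2/1` trials each of the ten digits with `trialy2_30/2`, which returns a `[Count,Label]` pair, then sorts the pairs so the digit with the most `true` trials comes first after `reverse/2`, and `random1/1` concatenates two such digits into a number from 0 to 99. A minimal self-contained sketch of the same selection pattern, with the mind-reading trial replaced by a coin flip (`trialy1_demo/1`, `trialy2_demo/3` and `random2_demo/1` are hypothetical names, for illustration only):

```prolog
% Sketch of the digit-vote pattern behind random1/random2.
% trialy1_demo/1 stands in for trialy1/1: here it is just a fair
% coin flip (an assumption for illustration, not the real trial).
trialy1_demo(R) :- random(X), (X < 0.5 -> R = true ; R = false).

% Trial a label N times and return [Count,Label], like trialy2_30/2.
trialy2_demo(N, Label, [Count, Label]) :-
    findall(R, (between(1, N, _), trialy1_demo(R)), Rs),
    aggregate_all(count, member(true, Rs), Count).

% Pick the digit with the most true trials, like random2/1.
random2_demo(Digit) :-
    findall(P, (member(D, ["0","1","2","3","4","5","6","7","8","9"]),
                trialy2_demo(30, D, P)), Ps),
    sort(Ps, Sorted),                  % ascending by count
    reverse(Sorted, [[_, Digit]|_]).   % highest count wins
```

Calling `random2_demo(D)` binds `D` to one of the digit strings; two such digits concatenated give the 0–99 value that `random1/1` scales by list length in the selection clauses below.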
+ +caw0(PredicateName,Rules,MaxLength,InputVarList,OutputVarList,Program1,Program2,Ps2) :- + varnames(InputVarList,[],InputVars,[],InputValues), + varnames(OutputVarList,[],OutputVars,[],_OutputValues), + retractall(outputvars(_)), + assertz(outputvars(OutputVars)), + append(InputVars,OutputVars,Vars11), + %%Vars11=InputVars, + %%Vars12=InputVars, + append(InputValues,OutputVars,Vars2), + %%append(InputValues,OutputValues,Values), + Query=[PredicateName,Vars2], + caw1(Query,PredicateName,Rules,MaxLength,Vars11,InputVars,InputVars,_,OutputVarList,OutputVars,Program1,Program2,[],Ps2), !. + +caw1(_,_,_,0,_,_,_,_,_,_,_,_,Ps,Ps) :- !. +caw1(Query,PredicateName,Rules,MaxLength,VarList,InputVars1,InputVars2,InputVars3,OutputVarList,OutputVars,Program1,Program2,Programs2,Ps1) :- + MaxLength2 is MaxLength - 1, + addrules0(InputVars2,OutputVars,OutputVars,[],Program3), + %%writeln([addrules(InputVars2,OutputVars,OutputVars,[],PenultimateVars,[],Program3)]), + %%optimise(Program1,InputVars1,InputVars2,PenultimateVars,Program4), %% IV2->3 + %%writeln([optimise(Program1,InputVars1,InputVars2,PenultimateVars,Program4)]), + append(Program1,Program3,Program5), + append(InputVars1,OutputVars,Vars2), + Program2=[ + [PredicateName,Vars2,":-", + Program5 + ] + ],debug(Debug), + + %%writeln([interpret(Debug,Query,Program2,OutputVarList2)]), + + interpret(Debug,Query,Program2,OutputVarList2), + %%writeln([interpret(Debug,Query,Program2,OutputVarList2)]), + append(Programs2,[[Query,Program2,OutputVarList2]],Ps1).%% ,Programs3->Ps1 + %%caw1a(Query,PredicateName,Rules,MaxLength2,VarList,InputVars1,InputVars2,InputVars3,OutputVarList,OutputVars,[],_Program2,Programs3,Ps1),!. + +%%caw1(_Query,_PredicateName,_Rules,_MaxLength,_VarList,_InputVars1,_InputVars2,_InputVars3,_OutputVarList,_OutputVars,_Program1,_Program4,Ps,Ps) :- writeln(here1),!. +caw1(Query,PredicateName,Rules,MaxLength2,VarList,InputVars1,InputVars2,InputVars3,OutputVarList,OutputVars,[],_Program2,Programs3,Programs3) :- !. 
+ %%writeln([here1, caw1(Query,PredicateName,Rules,MaxLength2,VarList,InputVars1,InputVars2,InputVars3,OutputVarList,OutputVars,[],_Program21,Programs3,Programs3)]),!. + +caw1a(Query,PredicateName,Rules,MaxLength,VarList,InputVars1,InputVars2,InputVars3,OutputVarList,OutputVars,Program1,Program4,Ps1,Ps2) :- + + %%writeln([caw(Query,PredicateName,Rules,MaxLength,VarList,InputVars1,InputVars2,OutputVarList,OutputVars,Program1,Program4)]), + %%MaxLength2 is MaxLength - 1, + %%writeln(["ml",MaxLength2]), + reverse(InputVars2,InputVars5), + random1(N0), N1 is N0/100, length(Rules,L), N2 is round(L*N1)-1, + (N2>=0-> + (length(List1,N2), append(List1,List2,Rules), + List2=[[RuleName,NumInputs,NumOutputs]|_Rest]);fail), + %%writeln([member([RuleName,NumInputs,NumOutputs],Rules)]), + %%writeln([rule(RuleName,NumInputs,NumOutputs,VarList,VarList2,Rule)]), + rule(RuleName,NumInputs,NumOutputs,InputVars5,InputVars4,VarList,VarList2,Rule), +%%writeln([inputVars1,InputVars1]), +%% writeln([rule(RuleName,NumInputs,NumOutputs,InputVars5,InputVars4,VarList,VarList2,Rule)]), + append(Program1,[Rule],Program3), + %%writeln([inputVars3,InputVars3]), + %%InputVars2=InputVars3, + %%writeln([program4,Program4]), + +%%caw1a(Query,PredicateName,Rules,MaxLength,VarList,InputVars1,InputVars2,InputVars3,OutputVarList,OutputVars,Program1,Program4,Ps,Ps) :- +%%writeln([here,caw1a(Query,PredicateName,Rules,MaxLength,VarList,InputVars1,InputVars2,InputVars3,OutputVarList,OutputVars,Program1,Program4,Ps,Ps)]) + + +caw(Query,PredicateName,Rules,MaxLength,VarList2,InputVars1,InputVars4,InputVars3,OutputVarList,OutputVars,Program3,Program4,Ps1,Ps2),!. + + +caw(_,_,_,0,_,_,_,_,_,_,_,_,Ps,Ps) :- !. 
+caw(Query,PredicateName,Rules,MaxLength,VarList,InputVars1,InputVars2,InputVars3,OutputVarList,OutputVars,Program1,Program2,Programs1,Programs2) :- + MaxLength2 is MaxLength - 1, + addrules(InputVars2,OutputVars,OutputVars,[],_PenultimateVars,[],Program3), + %%writeln([addrules(InputVars2,OutputVars,OutputVars,[],PenultimateVars,[],Program3)]), + %%optimise(Program1,InputVars1,InputVars2,PenultimateVars,Program4), %% IV2->3 + %%writeln([optimise(Program1,InputVars1,InputVars2,PenultimateVars,Program4)]), + append(Program1,Program3,Program5), + append(InputVars1,OutputVars,Vars2), + Program2=[ + [PredicateName,Vars2,":-", + Program5 + ] + ],debug(Debug), +%%*** +%% () choose iv1 as args during caw, () eliminate e problem, could move forward in optimiser but don't need it v +%% should have a leading edge of 1 immediately previous (new output) as an arg in latest rule v, go backwards to choose latest possible args x, 3 x rules can have same as previous rule's output as an output x: at a time +%% chunks will solve having at least 1 rule that connects to last output +%% can optimise number of inputs + +%% test member, = in caw + + %%writeln([interpret(Debug,Query,Program2,OutputVarList2)]), + + interpret(Debug,Query,Program2,OutputVarList2), + %%writeln([interpret(Debug,Query,Program2,OutputVarList2)]), + append(Programs1,[[Query,Program2,OutputVarList2]],Programs3), + cawa(Query,PredicateName,Rules,MaxLength2,VarList,InputVars1,InputVars2,InputVars3,OutputVarList,OutputVars,[],_Program2,Programs3,Programs2),!. +%%caw(_,_,_,_,_,_,_,_,_,_,_,_,Ps,Ps) :- !. + +caw(Query,PredicateName,Rules,MaxLength2,VarList,InputVars1,InputVars2,InputVars3,OutputVarList,OutputVars,[],_Program2,Programs3,Programs3) :- + %%writeln([here2, caw(Query,PredicateName,Rules,MaxLength2,VarList,InputVars1,InputVars2,InputVars3,OutputVarList,OutputVars,[],_Program21,Programs3,Programs3)]), + !. 
+ + +cawa(Query,PredicateName,Rules,MaxLength,VarList,InputVars1,InputVars2,InputVars3,OutputVarList,OutputVars,Program1,Program4,Ps1,Ps2) :- + %%writeln([caw(Query,PredicateName,Rules,MaxLength,VarList,InputVars1,InputVars2,OutputVarList,OutputVars,Program1,Program4)]), + %%MaxLength2 is MaxLength - 1, + %%writeln(["ml",MaxLength2]), + random1(N0), N1 is N0/100, length(Rules,L), N2 is round(L*N1)-1, + (N2>=0-> + (length(List1,N2), append(List1,List2,Rules), + List2=[[RuleName,NumInputs,NumOutputs]|_Rest]);fail), + %%writeln([member([RuleName,NumInputs,NumOutputs],Rules)]), + %%writeln([rule(RuleName,NumInputs,NumOutputs,VarList,VarList2,Rule)]), + rule(RuleName,NumInputs,NumOutputs,InputVars2,InputVars4,VarList,VarList2,Rule), +%%writeln([inputVars1,InputVars1]), +%% writeln([rule(RuleName,NumInputs,NumOutputs,InputVars2,InputVars4,VarList,VarList2,Rule)]), + append(Program1,[Rule],Program3), + %%writeln([inputVars3,InputVars3]), + %%InputVars2=InputVars3, + %%writeln([program4,Program4]), +caw(Query,PredicateName,Rules,MaxLength,VarList2,InputVars1,InputVars4,InputVars3,OutputVarList,OutputVars,Program3,Program4,Ps1,Ps2), !. + +varnames([],Vars,Vars,Values,Values) :- !. +varnames(VarList,Vars1,Vars2,Values1,Values2) :- + VarList=[Var|Vars3], + Var=[VarName,Value], + append(Vars1,[VarName],Vars4), + append(Values1,[Value],Values3), + varnames(Vars3,Vars4,Vars2,Values3,Values2),!. + +addrules0(_,_,[],Program,Program) :- !. 
+addrules0(VarList,OutputVars1,OutputVars2,Program1,Program2) :- + OutputVars2=[OutputVar|OutputVars3], + + random1(N0), N1 is N0/100, length(VarList,L), N2 is round(L*N1)-1, + (N2>=0-> + (length(List1,N2), append(List1,List2,VarList), + List2=[Var|_Rest]);fail), + + random1(N01), N11 is N01/100, length(OutputVars1,L1), N21 is round(L1*N11)-1, + (N21>=0-> + (length(List11,N21), append(List11,List21,OutputVars1), + List21=[OutputVar|_Rest2]);fail), + + append(Program1,[[[n,=],[OutputVar,Var]]],Program3), + addrules0(VarList,OutputVars1,OutputVars3,Program3,Program2),!. + +addrules(_,_,[],PV,PV,Program,Program) :- !. +addrules(VarList,OutputVars1,OutputVars2,PenultimateVars1,PenultimateVars2,Program1,Program2) :- + restlast(VarList,[],_,Var), + %%OutputVars2=[OutputVar|OutputVars3], + + random1(N0), N1 is N0/100, length(OutputVars2,L), N2 is round(L*N1)-1, + (N2>=0-> + (length(List1,N2), append(List1,List2,OutputVars2), + List2=[OutputVar|_Rest]);fail), + + delete(OutputVars1,OutputVar,OutputVars3), +%% member(Var,VarList), + + random1(N01), N11 is N01/100, length(OutputVars1,L1), N21 is round(L1*N11)-1, + (N21>=0-> + (length(List11,N21), append(List11,List21,OutputVars1), + List21=[OutputVar|_Rest2]);fail), + + append(Program1,[[[n,=],[OutputVar,Var]]],Program3), + append(PenultimateVars1,[Var],PenultimateVars3), + addrules2(VarList,OutputVars1,OutputVars3,PenultimateVars3,PenultimateVars2,Program3,Program2),!. + +addrules2(_,_,[],PV,PV,Program,Program) :- !. 
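The selection idiom repeated in `addrules0/5`, `addrules/7`, `addrules2/7` and the `rule/8` clauses — `length(Prefix,N2), append(Prefix,Suffix,List), Suffix=[X|_]` — extracts the element at 0-based index `N2`, i.e. it behaves like the library predicate `nth0/3`. A sketch of the equivalence (`pick_by_split/3` is a hypothetical name):

```prolog
% The prefix-split idiom used throughout this file picks the element
% at 0-based index N: skip a prefix of length N, take the next element.
pick_by_split(N, List, X) :-
    length(Prefix, N),
    append(Prefix, [X|_], List).

% Equivalent library call:
% ?- pick_by_split(2, [a,b,c,d], X). %% X = c
% ?- nth0(2, [a,b,c,d], X).          %% X = c
```

The guard `N2>=0 -> ... ; fail` in the surrounding clauses covers the empty-list case, where `round(L*N1)-1` is -1 and no element can be picked.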
+addrules2(VarList,OutputVars1,OutputVars2,PenultimateVars1,PenultimateVars2,Program1,Program2) :- +%% restlast(VarList,[],_,Var), + OutputVars2=[OutputVar|OutputVars3], + + random1(N0), N1 is N0/100, length(VarList,L), N2 is round(L*N1)-1, + (N2>=0-> + (length(List1,N2), append(List1,List2,VarList), + List2=[Var|_Rest]);fail), + + not(member(Var,PenultimateVars1)), + + random1(N01), N11 is N01/100, length(OutputVars1,L1), N21 is round(L1*N11)-1, + (N21>=0-> + (length(List11,N21), append(List11,List21,OutputVars1), + List21=[OutputVar|_Rest2]);fail), + + append(Program1,[[[n,=],[OutputVar,Var]]],Program3), + append(PenultimateVars1,[Var],PenultimateVars3), + addrules2(VarList,OutputVars1,OutputVars3,PenultimateVars3,PenultimateVars2,Program3,Program2),!. + +%% optimise([[append,[a,a,d]],[append,[a,a,e]],[append,[a,a,f]],[append,[a,b,g]]],[g],P). +/** +optimise(Program1,InputVars1,InputVars2,PenultimateVars,Program2) :- + reverse(Program1,Program4), + findrulesflowingtopv1(Program4,InputVars1,InputVars2,PenultimateVars,[],Rules,true), + %%findrulesflowingtopv1a(Program1,_Program32,InputVars1,InputVars2,PenultimateVars,[],_Rules1), + intersection(Program1,Rules,Program3), + unique1(Program3,[],Program2). +findrulesflowingtopv1(_,_,_,[],Rules,Rules,false). +findrulesflowingtopv1(Program0,InputVars1,InputVars2,Var,Rules1,Rules2,IV1Flag1) :- + (atom(Var);length(Var,1)), + findrulesflowingtopv20(Program0,Program0,InputVars1,InputVars2,Var,Rules1,Rules2,IV1Flag1). +findrulesflowingtopv1(Program0,InputVars1,InputVars2,Vars1,Rules1,Rules2,IV1Flag1) :- + Vars1=[Var|Vars2], + findrulesflowingtopv20(Program0,Program0,InputVars1,InputVars2,Var,Rules1,Rules3,IV1Flag2), + findrulesflowingtopv1(Program0,InputVars1,InputVars2,Vars2,Rules3,Rules2,IV1Flag3), + iv1flagdisjunction(IV1Flag2,IV1Flag3,IV1Flag1). + +%%findrulesflowingtopv2([],Program,Program,_,_,Rules,Rules). +findrulesflowingtopv20(_,[],_InputVars1,_InputVars2,_Var,Rules,Rules,false). 
+findrulesflowingtopv20(Program0,Rules4,InputVars1,InputVars2,Var,Rules1,Rules2,IV1Flag1) :- + Rules4=[Rule|Rules], + (findrulesflowingtopv2(Program0,Rule,InputVars1,InputVars2,Var,Rules1,Rules3,IV1Flag2)->true;(Rules3=Rules1,IV1Flag2=false)), + %%delete(Program0,Rule,Program1), + findrulesflowingtopv20(Program0,Rules,InputVars1,InputVars2,Var,Rules3,Rules2,IV1Flag3),%%p1->0 + iv1flagdisjunction(IV1Flag2,IV1Flag3,IV1Flag1). +%%findrulesflowingtopv2(_,[],[],_,_,_,Rules,Rules). +findrulesflowingtopv2(Program0,Rule,InputVars1,InputVars2,Var,Rules1,Rules2,IV1Flag1) :- + Rule=[_PredicateName,Vars], + restlast(Vars,[],Rest,Var), + %%delete(Program1,[PredicateName,Vars],Program2), + %%Program2=Program1, + %%(not(intersection(Rulesx,Rules1))-> x + %% append, append, unique1 + %%append(Rules1,[Rule],Rules3);Rules3=Rules1), + + %%member(Var2,Rest), + %%member(Var2,InputVars1), + + length(Rest,Length1), Length1>=1, + subtract(Rest,InputVars1,IV3s), + length(IV3s,Length3), + subtract(Rest,IV3s,IV1s), + length(IV1s,Length2), Length2>=1, + subtract(IV3s,InputVars2,[]), + + IV1Flag2=true, + + %%delete(Program0,Rule,Program1), + + %%(delete(Program0,Rule,Program3), + %%iv3s1(IV3s,Program3,IV3s,[]), + (Length3>=1-> + (findrulesflowingtopv1(Program0,InputVars1,InputVars2,IV3s,[],Rules5,IV1Flag3),not(Rules5=[])); + (Rules5=[],IV1Flag3=false)), + iv1flagdisjunction(IV1Flag2,IV1Flag3,IV1Flag4), + %%->true; Rules5=[],IV1Flag1=IV1Flag4), + + ((findrulesflowingtopv1(Program0,InputVars1,InputVars2,IV1s,[],Rules6,IV1Flag5), %%iv1s->rest, etc + iv1flagdisjunction(IV1Flag4,IV1Flag5,IV1Flag1))->true;(Rules6=[],IV1Flag1=IV1Flag4)), + + append([Rule],Rules1,Rules9), + append(Rules9,Rules5,Rules7), + append(Rules7,Rules6,Rules8), + unique1(Rules8,[],Rules2). 
+ +/** +findrulesflowingtopv2(_Program0,Rule,InputVars1,InputVars2,Var,Rules1,Rules2,IV1Flag1) :- + Rule=[_PredicateName,Vars], + restlast(Vars,[],Rest,Var), + %%delete(Program1,[PredicateName,Vars],Program2), + %%Program2=Program1, + (not(member(Rule,Rules1))-> + append(Rules1,[Rule],Rules2);Rules2=Rules1), + subset(Rest,InputVars2), + + intersection(Rest,InputVars1,Intersection), + length(Intersection,0), + +%% not((member(Var2,Rest), +%% member(Var2,InputVars1))), + + IV1Flag1=false. +**/ +/** +findrulesflowingtopv2(Program0,Rule,InputVars1,InputVars2,Var,Rules1,Rules2,IV1Flag1) :- + Rule=[_PredicateName,Vars], + restlast(Vars,[],Rest,Var), + %%delete(Program1,[PredicateName,Vars],Program3), + %%Program3=Program1, + %%append(Rules1,[Rule],Rules3), + subset(Rest,InputVars2), + + intersection(Rest,InputVars1,Intersection), + length(Intersection,0), + +%% not((member(Var2,Rest), +%% member(Var2,InputVars1))), + +%% delete(Program0,Rule,Program1), + + IV1Flag2=false, + findrulesflowingtopv1(Program0,InputVars1,InputVars2,Rest,[],Rules4,IV1Flag3), + %%not(Rules4=[]), + iv1flagdisjunction(IV1Flag2,IV1Flag3,IV1Flag1), + + append(Rules1,[Rule],Rules7), + append(Rules7,Rules4,Rules8), + unique1(Rules8,[],Rules2). +**/ +/** +%%->true;(Program2=Program1,Rules2=Rules1)). +findrulesflowingtopv2(Rule,Program0,Program1,_Program2,InputVars1,InputVars,Var,Rules1,Rules2,IV1Flag1) :- + Rule=[PredicateName,Vars], + restlast(Vars,[],Rest,Var), + %%delete(Program1,[PredicateName,Vars],Program4), + %%Program4=Program1, + append(Rules1,[[PredicateName,Vars]],Rules3), + findrulesflowingtopv1(Program0,Program1,_Program2,InputVars1,InputVars,Rest,Rules3,Rules2,IV1Flag3), + iv1flagdisjunction(IV1Flag2,IV1Flag3,IV1Flag1). + + %%findrulesflowingtopv2(Program5,Program2,Rest,Rules3,Rules2). + +**/ +iv1flagdisjunction(A,B,true) :- + ((A=true)->true; (B=true)),!. +iv1flagdisjunction(_,_,false) :- !. +/** +iv3s0([],_,IV3s1,IV3s2). +iv3s0(IV3s,Program0,IV3s1,IV3s2). 
+ IV3s=[IV3|IV3s3], + iv3s1(IV3,Program0,IV3s1,IV3s4), + iv3s0(IV3s3,Program0,IV3s4,IV3s2). +iv3s1(_,[],IV3s,IV3s). +iv3s1(IV3,Program0,IV3s1,IV3s2) :- + Program0=[Rule|Rules], + iv3s2(IV3,Rule,IV3s1,IV3s3), + iv3s1(IV3,Rules,IV3s3,IV3s2). +iv3s2(IV3,Rule,IV3s,IV3s1,IV3s2). + Rule=[_PredicateName,Vars], + restlast(Vars,[],_Rest,IV3), + delete(IV3s1,IV3,IV3s2). + + +findrulesflowingtopv1a(_,_,_,_,[],Rules,Rules). +findrulesflowingtopv1a(Program1,Program2,InputVars1,InputVars2,Var,Rules1,Rules2) :- + atom(Var), + findrulesflowingtopv2a(Program1,Program2,InputVars1,InputVars2,Var,Rules1,Rules2). +findrulesflowingtopv1a(Program1,Program2,InputVars1,InputVars2,Vars1,Rules1,Rules2) :- + Vars1=[Var|Vars2], + findrulesflowingtopv2(Program1,Program3,InputVars1,InputVars2,Var,Rules1,Rules3), + findrulesflowingtopv1a(Program3,Program2,InputVars1,InputVars2,Vars2,Rules3,Rules2). +%%findrulesflowingtopv2([],Program,Program,_,_,Rules,Rules). +findrulesflowingtopv2a([],[],_,_,_,Rules,Rules). +findrulesflowingtopv2a(Program1,Program2,_InputVars1,InputVars2,Var,Rules1,Rules2) :- + member([PredicateName,Vars],Program1), + restlast(Vars,[],Rest,Var), + ( +%%delete(Program1,[PredicateName,Vars],Program2), +Program2=Program1, + append(Rules1,[[PredicateName,Vars]],Rules2), + subset(Rest,InputVars2)). +findrulesflowingtopv2a(Program1,Program2,InputVars1,InputVars2,Var,Rules1,Rules2) :- + member([PredicateName,Vars],Program1), + restlast(Vars,[],Rest,Var), + ( +%%delete(Program1,[PredicateName,Vars],Program3), +Program3=Program1, + append(Rules1,[[PredicateName,Vars]],Rules3), + subset(Rest,InputVars2)), + findrulesflowingtopv1a(Program3,Program2,InputVars1,InputVars2,Rest,Rules3,Rules2). + +%%->true;(Program2=Program1,Rules2=Rules1)). 
+findrulesflowingtopv2a(Program1,Program2,InputVars1,InputVars,Var,Rules1,Rules2) :-
+	member([PredicateName,Vars],Program1),
+	restlast(Vars,[],Rest,Var),
+	%%delete(Program1,[PredicateName,Vars],Program4),
+	Program4=Program1,
+	append(Rules1,[[PredicateName,Vars]],Rules3),
+	findrulesflowingtopv1a(Program4,Program2,InputVars1,InputVars,Rest,Rules3,Rules2).
+	%%findrulesflowingtopv2(Program5,Program2,Rest,Rules3,Rules2).
+	**/
+restlast([],_,_,_) :- fail, !.
+restlast([Last],Rest,Rest,Last) :-
+	Last=[v,_],!.
+restlast(Last,Rest,Rest,Last) :-
+	length(Last,1),!.
+restlast(Vars1,Rest1,Rest2,Last) :-
+	Vars1=[Var|Vars2],
+	append(Rest1,[Var],Rest3),
+	restlast(Vars2,Rest3,Rest2,Last),!.
+
+
+
+rule(RuleName,1,1,InputVars1,InputVars2,VarList,VarList2,Rule) :-
+
+	random1(N0), N1 is N0/100, length(InputVars1,L), N2 is round(L*N1)-1,
+	(N2>=0->
+	(length(List1,N2), append(List1,List2,InputVars1),
+	List2=[Var|_Rest]);fail),
+
+	rule2(RuleName,Var,VarList,VarList2,Rule,Var1),
+	restlast(InputVars1,[],_,Last), %% Last should be outputs - 2nd last rule?
+	(Var=Last->true;Last=Var1),
+	append(InputVars1,[Var1],InputVars2),!.
+rule(RuleName,1,2,InputVars1,InputVars2,VarList,VarList2,Rule) :-
+
+	random1(N0), N1 is N0/100, length(InputVars1,L), N2 is round(L*N1)-1,
+	(N2>=0->
+	(length(List1,N2), append(List1,List2,InputVars1),
+	List2=[Var|_Rest]);fail),
+
+	rule3(RuleName,Var,VarList,VarList2,Rule,Var1,Var2),
+	restlast(InputVars1,[],_,Last),
+	(Var=Last->true;
+	(Last=Var1->true;Last=Var2)),
+	append(InputVars1,[Var1,Var2],InputVars2),!.
+rule(RuleName,2,0,InputVars,InputVars,VarList,VarList,Rule) :- +%%writeln([rule(RuleName,2,1,InputVars1,InputVars2,VarList,VarList2,Rule)]), + random1(N0), N1 is N0/100, length(InputVars,L), N2 is round(L*N1)-1, + (N2>=0-> + (length(List1,N2), append(List1,List2,InputVars), + List2=[Var|_Rest]);fail), + + random1(N01), N11 is N01/100, length(InputVars,L1), N21 is round(L1*N11)-1, + (N21>=0-> + (length(List11,N21), append(List11,List21,InputVars), + List21=[Vara|_Rest2]);fail), + + rule6(RuleName,Var,Vara,_VarList,_VarList2,Rule), + restlast(InputVars,[],_,Last), +%%writeln([last,Last]), + (Var=Last->true;Vara=Last),!. +rule(RuleName,2,1,InputVars1,InputVars2,VarList,VarList2,Rule) :- +%%writeln([rule(RuleName,2,1,InputVars1,InputVars2,VarList,VarList2,Rule)]), + random1(N0), N1 is N0/100, length(InputVars1,L), N2 is round(L*N1)-1, + (N2>=0-> + (length(List1,N2), append(List1,List2,InputVars1), + List2=[Var|_Rest]);fail), + + random1(N01), N11 is N01/100, length(InputVars1,L1), N21 is round(L1*N11)-1, + (N21>=0-> + (length(List11,N21), append(List11,List21,InputVars1), + List21=[Vara|_Rest2]);fail), + + rule4(RuleName,Var,Vara,VarList,VarList2,Rule,Var1), + restlast(InputVars1,[],_,Last), +%%writeln([last,Last]), + ((Var=Last->true;Vara=Last)->true; + (Last=Var1)), +%%writeln([var,Var,vara,Vara]), + append(InputVars1,[Var1],InputVars2),!. 
+rule(RuleName,2,2,InputVars1,InputVars2,VarList,VarList2,Rule) :-
+
+	random1(N0), N1 is N0/100, length(InputVars1,L), N2 is round(L*N1)-1,
+	(N2>=0->
+	(length(List1,N2), append(List1,List2,InputVars1),
+	List2=[Var|_Rest]);fail),
+
+	random1(N01), N11 is N01/100, length(InputVars1,L1), N21 is round(L1*N11)-1,
+	(N21>=0->
+	(length(List11,N21), append(List11,List21,InputVars1),
+	List21=[Vara|_Rest2]);fail),
+
+	rule5(RuleName,Var,Vara,VarList,VarList2,Rule,Var1,Var2),
+	restlast(InputVars1,[],_,Last),
+	((Var=Last->true;Vara=Last)->true;
+	(Last=Var1->true;Last=Var2)), %% make last var2, use different inputs from previous rule, make this line use a version of previous line as well (args from rule before that) - redo rules based on past programming
+	append(InputVars1,[Var1,Var2],InputVars2),!.
+
+
+rule2(RuleName,Var,VarList,VarList2,Rule,Var1) :-
+	var(VarList,Var1,VarList2),
+	Rule=[RuleName,[Var,Var1]],!.%%,
+	%%member(Var,!.
+
+rule3(RuleName,Var,VarList,VarList3,Rule,Var1,Var2) :-
+	var(VarList,Var1,VarList2),
+	var(VarList2,Var2,VarList3),
+	Rule=[RuleName,[Var,Var1,Var2]],!.
+
+rule6(RuleName,Var,Vara,_VarList,_VarList2,Rule) :-
+	Rule=[RuleName,[Var,Vara]],!.
+
+rule4(RuleName,Var,Vara,VarList,VarList2,Rule,Var1) :-
+	var(VarList,Var1,VarList2),
+	Rule=[RuleName,[Var,Vara,Var1]],!.
+
+%%ae be with predicate support also
+rule5(RuleName,Var,Vara,VarList,VarList3,Rule,Var1,Var2) :-
+	var(VarList,Var1,VarList2),
+	var(VarList2,Var2,VarList3),
+	Rule=[RuleName,[Var,Vara,Var1,Var2]],!.
+
+%%var(Item,Var,Vars,Vars) :-
+%%	member([Item,Var],Vars).
+var(Vars1,Var1,Vars2) :-
+	length(Vars1,Vars1Length1),
+	Vars1Length2 is Vars1Length1-1,
+	length(Vars3,Vars1Length2),
+	append(Vars3,[Var2],Vars1),
+	Var2=[v,Var21],
+	char_code(Var21,Var2Code1),
+	Var2Code2 is Var2Code1 + 1,
+	var2(Var2Code2,Var1),
+	append(Vars1,[Var1],Vars2),!.
+
+var2(Code,Var1) :-
+	outputvars(OutputVars),
+	totalvars(TotalVars),
+	Code2 is 96+TotalVars,
+	Code =< Code2, %% 122
+	char_code(Var11,Code),
+	Var1=[v,Var11],
+	not(member(Var1,OutputVars)),!.
+var2(Var2Code,Var1) :-
+	Var2Code2 is Var2Code + 1,
+	totalvars(TotalVars),
+	Code2 is 96+TotalVars,
+	Var2Code2 =< Code2,
+	var2(Var2Code2,Var1),!.
+
+%% this goes from e.g. c not a to TotalVars
+%% skip over vars, start from a x reassign vars to abc etc
+
+%% if returning 12 from 12345 remove 345 args
+
+%% try lowest possible number of vars first (return shortest program first), then keep on adding number of vars
+%% can eliminate same program both here and in assessment verification
diff --git a/Text-to-Breasonings-master/mindreadtestpsychiatrist.pl b/Text-to-Breasonings-master/mindreadtestpsychiatrist.pl
new file mode 100644
index 0000000000000000000000000000000000000000..6120e7f6f31e7df136a95e2cf740510a37334aae
--- /dev/null
+++ b/Text-to-Breasonings-master/mindreadtestpsychiatrist.pl
@@ -0,0 +1,99 @@
+%% mind read test
+
+%% Make files different for different tests
+
+%% *** Important: initialise program before running for the first time:
+%% N is 1*2*3*5,texttobr2(N). %% 100 As for 1 (turned on)*2 (to and from computer)*3 (rb, itself (already done), graciously give or blame, radio button for graciously give or blame)*5 (5 objects)
+%% N is 1*2*3*5,texttobr2(N). %% 100 As for 1 (turned off)*2 (to and from computer)*3 (rb, itself (already done), graciously give or blame, radio button for graciously give or blame)*5 (5 objects)
+%% also breason out and dot on objects before line above and breason out and dot on when recognising and saying object (with all objects having different breasonings)
+
+%%use_module(library(pio)).
+
+:- use_module(library(date)).
+:- include('texttobr2qb').
+:- include('mindreadtestshared').
+ +sectest(Person1):- + +Person1=[Item1a,Item2a,_,_,_,_,_,_,Month2,Day2,_], + Person2=[Item1a,Item2a], +get_time(TS),stamp_date_time(TS,date(_Year,Month1,Day1,_Hour1,_Minute1,_Seconda,_A,_TZ,_False),local), + + ((Month2 is mod(Month1,2)) -> + ((Day1 is Day2)->do_c(Month1,Day1,Person2);true);true). + +do_c(Month1,Day1,Person):- + find_time(Hour,Minutes,Seconds), + %% "Do you see (hallucinatory) appearances?" + trialy2_6("Yes",R1), + trialy2_6("No",R2), + R=[R1,R2/**,R3,R4,R5,R6,R7,R8,R9,R10**,R11,R12,R13,R14,R15,R16,R17,R18,R19,R20,R21,R22,R23,R24,R25,R26,R27**/ + ], + sort(R,RA), + reverse(RA,RB), + RB=[[_,RC]|_Rest1], + + %% "Do you feel depressed?" + trialy2_6("Yes",S1), + trialy2_6("No",S2), + S=[S1,S2/**,S3,S4,S5,S6,S7,S8,S9,S10**,S11,S12,S13,S14,S15,S16,S17,S18,S19,S20,S21,S22,S23,S24,S25,S26,S27**/ + ], + sort(S,SA), + reverse(SA,SB), + SB=[[_,SC]|_Rest2], + + %% "Do you have headaches?" + trialy2_6("Yes",T1), + trialy2_6("No",T2), + T=[T1,T2/**,T3,T4,T5,T6,T7,T8,T9,T10**,T11,T12,T13,T14,T15,T16,T17,T18,T19,T20,T21,T22,T23,T24,T25,T26,T27**/ + ], + sort(T,TA), + reverse(TA,TB), + TB=[[_,TC]|_Rest3], + + %% "Do you have a job?" + trialy2_6("Yes",U1), + trialy2_6("No",U2), + U=[U1,U2/**,U3,U4,U5,U6,U7,U8,U9,U10**,U11,U12,U13,U14,U15,U16,U17,U18,U19,U20,U21,U22,U23,U24,U25,U26,U27**/ + ], + sort(U,UA), + reverse(UA,UB), + UB=[[_,UC]|_Rest4], + + %% "Do you have a business?" + trialy2_6("Yes",V1), + trialy2_6("No",V2), + V=[V1,V2/**,V3,V4,V5,V6,V7,V8,V9,V10**,V11,V12,V13,V14,V15,V16,V17,V18,V19,V20,V21,V22,V23,V24,V25,V26,V27**/ + ], + sort(V,VA), + reverse(VA,VB), + VB=[[_,VC]|_Rest5], + + %% "Are you doing training?" + trialy2_6("Yes",W1), + trialy2_6("No",W2), + W=[W1,W2/**,W3,W4,W5,W6,W7,W8,W9,W10**,W11,W12,W13,W14,W15,W16,W17,W18,W19,W20,W21,W22,W23,W24,W25,W26,W27**/ + ], + sort(W,WA), + reverse(WA,WB), + WB=[[_,WC]|_Rest6], + + %% "Do you have a partner?" 
+ trialy2_6("Yes",X1), + trialy2_6("No",X2), + X=[X1,X2/**,X3,X4,X5,X6,X7,X8,X9,X10**,X11,X12,X13,X14,X15,X16,X17,X18,X19,X20,X21,X22,X23,X24,X25,X26,X27**/ + ], + sort(X,XA), + reverse(XA,XB), + XB=[[_,XC]|_Rest7], + + %% "Do you have sex?" + trialy2_6("Yes",Y1), + trialy2_6("No",Y2), + Y=[Y1,Y2/**,Y3,Y4,Y5,Y6,Y7,Y8,Y9,Y10**,Y11,Y12,Y13,Y14,Y15,Y16,Y17,Y18,Y19,Y20,Y21,Y22,Y23,Y24,Y25,Y26,Y27**/ + ], + sort(Y,YA), + reverse(YA,YB), + YB=[[_,YC]|_Rest8], + + writeln([Person,[appointment,Month1,Day1,Hour,Minutes,Seconds],["Do you see (hallucinatory) appearances?",RC],["Do you feel depressed?",SC],["Do you have headaches?",TC],["Do you have a job?",UC],["Do you have a business?",VC],["Are you doing training?",WC],["Do you have a partner?",XC],["Do you have sex?",YC]]). diff --git a/Text-to-Breasonings-master/mindreadtestsec.pl b/Text-to-Breasonings-master/mindreadtestsec.pl new file mode 100644 index 0000000000000000000000000000000000000000..a8aa9cc3644fa771dd259bc64236a7403a1a4a56 --- /dev/null +++ b/Text-to-Breasonings-master/mindreadtestsec.pl @@ -0,0 +1,32 @@ +%% mind read test + +%% Make files different for different tests + +%% *** Important: initialise program before running for the first time: +%% N is 1*2*3*5,texttobr2(N). %% 100 As for 1 (turned on)*2 (to and from computer)*3 (rb, itself (already done), graciously give or blame, radio button for graciously give or blame)*5 (5 objects) +%% N is 1*2*3*5,texttobr2(N). %% 100 As for 1 (turned off)*2 (to and from computer)*3 (rb, itself (already done), graciously give or blame, radio button for graciously give or blame)*5 (5 objects) +%% also breason out and dot on objects before line above and breason out and dot on when recognising and saying object (with all objects having different breasonings) + + +%%use_module(library(pio)). + +:- use_module(library(date)). +:- include('texttobr2qb'). +:- include('mindreadtestshared'). 
+ +sectest(Person):- + find_time(H,M,S), + threats(0,Threats), + writeln([Person,H,M,S,Threats,threats]). + +threats(Threats1,Threats2):- + %% "Given that they are not likely to have meant it and that there is nothing wrong, is there anything else that is wrong?" + trialy2_6("Yes",R1), + trialy2_6("No",R2), + R=[R1,R2/**,R3,R4,R5,R6,R7,R8,R9,R10**,R11,R12,R13,R14,R15,R16,R17,R18,R19,R20,R21,R22,R23,R24,R25,R26,R27**/ + ], + sort(R,RA), + reverse(RA,RB), + RB=[[_,Answer]|_Rest], + + (Answer="No"->Threats2=Threats1;(Threats3 is Threats1+1,threats(Threats3,Threats2))). diff --git a/Text-to-Breasonings-master/mindreadtestsecthoughts.pl b/Text-to-Breasonings-master/mindreadtestsecthoughts.pl new file mode 100644 index 0000000000000000000000000000000000000000..4bc2ca9b089f2ad6794d6e5f7fbd461552e328b2 --- /dev/null +++ b/Text-to-Breasonings-master/mindreadtestsecthoughts.pl @@ -0,0 +1,40 @@ +%% mind read test + +%% Make files different for different tests + +%% *** Important: initialise program before running for the first time: +%% N is 1*2*3*5,texttobr2(N). %% 100 As for 1 (turned on)*2 (to and from computer)*3 (rb, itself (already done), graciously give or blame, radio button for graciously give or blame)*5 (5 objects) +%% N is 1*2*3*5,texttobr2(N). %% 100 As for 1 (turned off)*2 (to and from computer)*3 (rb, itself (already done), graciously give or blame, radio button for graciously give or blame)*5 (5 objects) +%% also breason out and dot on objects before line above and breason out and dot on when recognising and saying object (with all objects having different breasonings) + + +%%use_module(library(pio)). + +:- use_module(library(date)). +:- include('texttobr2qb'). +:- include('mindreadtestshared'). +:- include('../Algorithm-Writer-with-Lists/grammar_logic_to_alg_random'). 
+ +sectest(Person):- + List=[1%%,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20,21,22,23,24,25,26,27,28,29 + ], + findall(B,(member(_A,List), + %%find_time(H,M,S), + random_member(H,[0,1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20,21,22,23]), + random_member(M,[0,1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20,21,22,23,24,25,26,27,28,29,30,31,32,33,34,35,36,37,38,39,40,41,42,43,44,45,46,47,48,49,50,51,52,53,54,55,56,57,58,59]), + random_member(S,[0,1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20,21,22,23,24,25,26,27,28,29,30,31,32,33,34,35,36,37,38,39,40,41,42,43,44,45,46,47,48,49,50,51,52,53,54,55,56,57,58,59]), + B=[H,M,S]),C), + sort(C,D), + findall(_,(member(E,D),writeln1([Person,thought,at,time,E]), + thought, %% radio button + thought, %% thought + thought, %% seen-as version + thought, %% comment + thought, %% connection + thought, %% B + thought %% B solution + +),_). + +thought:- + grammar_logic_to_alg1. diff --git a/Text-to-Breasonings-master/mindreadtestshared.pl b/Text-to-Breasonings-master/mindreadtestshared.pl new file mode 100644 index 0000000000000000000000000000000000000000..e87d7882723c760ce3297657f8406a919a5daa75 --- /dev/null +++ b/Text-to-Breasonings-master/mindreadtestshared.pl @@ -0,0 +1,327 @@ +%% Name, DOB, Date learned, psych appointment month=0 or 1, psych appointment day, thoughts count + +sectest0 :- +meditators(A), +findall(_,(member(B,A),sectest(B)),_) +/* +sectest([first,last,dobd,dobm,doby,daylearned,monthlearned,yearlearned,0,1,16]), +sectest([first,last,dobd,dobm,doby,daylearned,monthlearned,yearlearned,0,1,16]) +*/. + +sectest1 :- +meditators2(A), +findall(_,(member(B,A),sectest(B)),_) +/* +sectest([first,last,dobd,dobm,doby,daylearned,monthlearned,yearlearned,1,0,16]), +sectest([first,last,dobd,dobm,doby,daylearned,monthlearned,yearlearned,1,0,16]) +*/ +. 
+ +find_time(H,M,S) :- + trialy2_15("0",H11), + trialy2_15("1",H12), + trialy2_15("2",H13), + H1L=[H11,H12,H13], + sort(H1L,H1A), + reverse(H1A,H1B), + H1B=[[_,H1]|_Rest1], + + (H1="2"->( + trialy2_30("0",H21), + trialy2_30("1",H22), + trialy2_30("2",H23), + trialy2_30("3",H24), + H2L=[H21,H22,H23,H24], + sort(H2L,H2A), + reverse(H2A,H2B), + H2B=[[_,H2]|_Rest2] + ) + ;( + trialy2_30("0",H21), + trialy2_30("1",H22), + trialy2_30("2",H23), + trialy2_30("3",H24), + trialy2_30("4",H25), + trialy2_30("5",H26), + trialy2_30("6",H27), + trialy2_30("7",H28), + trialy2_30("8",H29), + trialy2_30("9",H210), + + H2L=[H21,H22,H23,H24,H25, + H26,H27,H28,H29,H210], + sort(H2L,H2A), + reverse(H2A,H2B), + H2B=[[_,H2]|_Rest2] + )), + + trialy2_15("0",M11), + trialy2_15("1",M12), + trialy2_15("2",M13), + trialy2_15("3",M14), + trialy2_15("4",M15), + trialy2_15("5",M16), + M1L=[M11,M12,M13,M14,M15,M16], + sort(M1L,M1A), + reverse(M1A,M1B), + M1B=[[_,M1]|_Rest3], + + trialy2_30("0",M21), + trialy2_30("1",M22), + trialy2_30("2",M23), + trialy2_30("3",M24), + trialy2_30("4",M25), + trialy2_30("5",M26), + trialy2_30("6",M27), + trialy2_30("7",M28), + trialy2_30("8",M29), + trialy2_30("9",M210), + M2L=[M21,M22,M23,M24,M25,M26,M27,M28,M29,M210], + sort(M2L,M2A), + reverse(M2A,M2B), + M2B=[[_,M2]|_Rest4], + + trialy2_15("0",S11), + trialy2_15("1",S12), + trialy2_15("2",S13), + trialy2_15("3",S14), + trialy2_15("4",S15), + trialy2_15("5",S16), + S1L=[S11,S12,S13,S14,S15,S16], + sort(S1L,S1A), + reverse(S1A,S1B), + S1B=[[_,S1]|_Rest5], + + trialy2_30("0",S21), + trialy2_30("1",S22), + trialy2_30("2",S23), + trialy2_30("3",S24), + trialy2_30("4",S25), + trialy2_30("5",S26), + trialy2_30("6",S27), + trialy2_30("7",S28), + trialy2_30("8",S29), + trialy2_30("9",S210), + S2L=[S21,S22,S23,S24,S25,S26,S27,S28,S29,S210], + sort(S2L,S2A), + reverse(S2A,S2B), + S2B=[[_,S2]|_Rest6], + + string_concat(H1,H2,H), + string_concat(M1,M2,M), + string_concat(S1,S2,S). 
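find_time/3 above builds HH:MM:SS one digit at a time, voting per digit and restricting the hour's units digit to 0-3 when the tens digit is 2. A minimal Python sketch of the same digit-by-digit construction (`pick` and `score` are hypothetical stand-ins for the trialy2-based vote, not the original code):

```python
# Sketch of find_time/3's digit-by-digit selection. `pick` replaces
# the trialy2_* vote: it returns the highest-scoring candidate digit,
# like taking the head of the sorted-and-reversed [Count, Label] list.
def pick(candidates, score):
    return max(candidates, key=score)

def find_time(score):
    h1 = pick("012", score)  # tens digit of the hour
    # When the tens digit is 2, the hour can only be 20-23.
    h2 = pick("0123" if h1 == "2" else "0123456789", score)
    m1 = pick("012345", score)       # tens digit of the minute
    m2 = pick("0123456789", score)
    s1 = pick("012345", score)       # tens digit of the second
    s2 = pick("0123456789", score)
    return h1 + h2, m1 + m2, s1 + s2
```

With a scorer that always prefers the largest digit, the hour constraint caps the result at "23" rather than "29".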
+ +trialy2_6(Label,RA) :- + %%writeln([testing,Label]), + trialy1(R1), + trialy1(R2), + trialy1(R3), + trialy1(R4), + trialy1(R5), + trialy1(R6), /** + trialy1(R7), + trialy1(R8), + trialy1(R9), + trialy1(R10), + trialy1(R11), + trialy1(R12), + trialy1(R13), + trialy1(R14), + trialy1(R15), + trialy1(R16), + trialy1(R17), + trialy1(R18), + trialy1(R19), + trialy1(R20), + trialy1(R21), + trialy1(R22), + trialy1(R23), + trialy1(R24), + trialy1(R25), + trialy1(R26), + trialy1(R27), + trialy1(R28), + trialy1(R29), + trialy1(R30), **/ + R=[R1,R2,R3,R4,R5,R6 /**,R7,R8,R9,R10, + R11,R12,R13,R14,R15,R16,R17,R18,R19,R20, + R21,R22,R23,R24,R25,R26,R27,R28,R29,R30 **/ + ], + %%(member(true,R)->( + aggregate_all(count, member(true,R), Count), + RA=[Count,Label].%%,writeln([Label,Count,"/10"]));true). + +trialy2_15(Label,RA) :- + %%writeln([testing,Label]), + trialy1(R1), + trialy1(R2), + trialy1(R3), + trialy1(R4), + trialy1(R5), + trialy1(R6), + trialy1(R7), + trialy1(R8), + trialy1(R9), + trialy1(R10), + trialy1(R11), + trialy1(R12), + trialy1(R13), + trialy1(R14), + trialy1(R15), + /** + trialy1(R16), + trialy1(R17), + trialy1(R18), + trialy1(R19), + trialy1(R20), + trialy1(R21), + trialy1(R22), + trialy1(R23), + trialy1(R24), + trialy1(R25), + trialy1(R26), + trialy1(R27), + trialy1(R28), + trialy1(R29), + trialy1(R30),**/ + R=[R1,R2,R3,R4,R5,R6,R7,R8,R9,R10, + R11,R12,R13,R14,R15 /**,R16,R17,R18,R19,R20, + R21,R22,R23,R24,R25,R26,R27,R28,R29,R30 + **/], + %%(member(true,R)->( + aggregate_all(count, member(true,R), Count), + RA=[Count,Label].%%,writeln([Label,Count,"/10"]));true). 
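trialy2_6/2 and trialy2_15/2 differ only in how many trialy1/1 trials they run: each returns a [Count, Label] pair, and callers sort the pairs and reverse to take the label with the highest count. A compact Python sketch of that pattern (`trial` is a hypothetical stand-in for trialy1/1):

```python
# Sketch of the trialy2_* pattern: run a boolean trial n times for a
# label and return [count, label]; pick_answer mirrors the caller's
# sort(R,RA), reverse(RA,RB), RB=[[_,Answer]|_] step.
def trialy2(label, trial, n=6):
    count = sum(1 for _ in range(n) if trial(label))
    return [count, label]

def pick_answer(labels, trial, n=6):
    results = sorted((trialy2(l, trial, n) for l in labels), reverse=True)
    return results[0][1]
```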
+
+trialy2_30(Label,RA) :-
+ %%writeln([testing,Label]),
+ trialy1(R1),
+ trialy1(R2),
+ trialy1(R3),
+ trialy1(R4),
+ trialy1(R5),
+ trialy1(R6),
+ trialy1(R7),
+ trialy1(R8),
+ trialy1(R9),
+ trialy1(R10),
+ trialy1(R11),
+ trialy1(R12),
+ trialy1(R13),
+ trialy1(R14),
+ trialy1(R15),
+ trialy1(R16),
+ trialy1(R17),
+ trialy1(R18),
+ trialy1(R19),
+ trialy1(R20),
+ trialy1(R21),
+ trialy1(R22),
+ trialy1(R23),
+ trialy1(R24),
+ trialy1(R25),
+ trialy1(R26),
+ trialy1(R27),
+ trialy1(R28),
+ trialy1(R29),
+ trialy1(R30),
+ R=[R1,R2,R3,R4,R5,R6,R7,R8,R9,R10,
+ R11,R12,R13,R14,R15,R16,R17,R18,R19,R20,
+ R21,R22,R23,R24,R25,R26,R27,R28,R29,R30],
+ aggregate_all(count, member(true,R), Count),
+ RA=[Count,Label].
+
+
+trialy1(R1) :-
+ trial0(A22), %% Control
+ mean(A22,A1),
+ trial0(A21), %% Test 1
+ mean(A21,A2),
+ (A1>A2->R1=true;R1=fail).
+
+trial0(S3) :- N is 10, trial1(N,[],S),trial01(S,S3).
+trial01(S1,S3) :-
+ sort(S1,S),
+ length(S,L),
+ (L<2->S3=S; %% too few samples to compute quartiles
+ (halves(S,H1,H2),
+ midpoint(H1,Q1),
+ midpoint(H2,Q3),
+ IQR is Q3-Q1,
+ mean(S,Mean),
+ furthestfrommean(S,Mean,V),
+ D1 is 1.5*IQR,
+ D2 is abs(V-Mean), %% absolute distance, so outliers below the mean are also trimmed
+ (D2>D1->(delete(S,V,S2),trial01(S2,S3));S=S3))).
+
+trial1(0,A,A) :- !.
+trial1(N,A,B) :- mindreadtest(S), append(A,[S],A2),
+ N1 is N-1,trial1(N1,A2,B).
+
+midpoint(S,MP) :-
+ length(S,L),
+ A is mod(L,2),
+ (A is 0->
+ (M1 is L/2, M2 is M1+1,N1 is M1-1,N2 is M2-1,length(N11,N1),length(N21,N2),append(N11,[N12|_Rest1],S),append(N21,[N22|_Rest2],S),MP is (N12+N22)/2)
+ ;
+ (L2 is L+1, M1 is L2/2, N1 is M1-1,length(N11,N1),append(N11,[MP|_Rest],S))).
+
+halves(S,H1,H2) :-
+ length(S,L),
+ A is mod(L,2),
+ (A is 0->
+ (M1 is L/2,length(H1,M1),append(H1,H2,S))
+ ;
+ (L2 is L-1,M1 is L2/2,length(H1,M1),append(H1,[_|H2],S))).
+
+sum([],S,S):-!.
+sum(S0,S1,S2) :-
+ S0=[S3|S4],
+ S5 is S1+S3,
+ sum(S4,S5,S2).
+
+mean([],0) :- !.
+mean(List,Mean) :-
+ sum(List,0,Sum),
+ length(List,Length),
+ Mean is Sum/Length. %% arithmetic mean: sum divided by the sample count
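trial01/2 above performs iterative IQR-based outlier trimming: the sample list is sorted (sort/2 also removes duplicates), the quartiles are taken as the medians of the lower and upper halves, and the sample furthest from the mean is removed while it lies more than 1.5 × IQR away. A minimal Python sketch of that procedure (assuming the mean is the sum divided by the sample count, and absolute distance from the mean):

```python
# Sketch of trial01/2's outlier trimming. median corresponds to
# midpoint/2; the slices correspond to halves/3 (odd-length lists
# drop the middle element).
def median(xs):
    n = len(xs)
    mid = n // 2
    return xs[mid] if n % 2 else (xs[mid - 1] + xs[mid]) / 2

def trim_outliers(samples):
    s = sorted(set(samples))  # sort/2 also removes duplicates
    while len(s) >= 2:
        half = len(s) // 2
        q1, q3 = median(s[:half]), median(s[-half:])
        mean = sum(s) / len(s)
        v = max(s, key=lambda x: abs(x - mean))  # furthestfrommean/3
        if abs(v - mean) > 1.5 * (q3 - q1):
            s.remove(v)
        else:
            break
    return s
```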
+ +furthestfrommean(S,Mean,V) :- + absdiffmean(S,Mean,[],D), + sort(D,D1), + reverse(D1,[[_,V]|_Rest]). + +absdiffmean([],_M,D,D) :- !. +absdiffmean(S,M,D1,D2) :- + S=[S1|S2], + S3 is abs(S1-M), + append(D1,[[S3,S1]],D3), + absdiffmean(S2,M,D3,D2). + +mindreadtest(Sec) :- + %% 250 br for characters to be br out with 10 br each from person to me - do when initial 250 br test done and doing 10 br test + %%comment(fiftyastest), + %%random(X),X1 is 10*X, X2 is floor(X1), (X2=<2 -> ( + %%texttobr,writeln(['true test']), %%); %% use breasonings breasoned out by computer for not by me, for job medicine for "me", at last time point + %%true), %% leave last time point blank + %%**texttobr2(640);true),%% make an A to detect reaction to gracious giving or blame of in following + get_time(TimeStamp1), + %%phrase_from_file(string(_String), 'file.txt'), + texttobr2(2), %% 100 As for answer (must be br before this on same day) + %% is gracious giving or blame + get_time(TimeStamp2), + %%comment(turnoffas), + Sec is TimeStamp2 - TimeStamp1. + +shell1(Command) :- + (bash_command(Command,_)-> + true; + (writeln(["Failed shell1 command: ",Command]),abort) + ),!. + +bash_command(Command, Output) :- + setup_call_cleanup(process_create(path(bash), + ['-c', Command], + [stdout(pipe(Out))]), + read_string(Out, _, Output), + close(Out)). 
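bash_command/2 above spawns `bash -c` and captures the command's standard output, and shell1/1 wraps it, aborting when the command cannot be run. The same shape in Python, as a rough sketch using the standard subprocess module (raising an exception plays the role of abort/0):

```python
import subprocess

# Sketch of bash_command/2 and shell1/1: run a command through
# `bash -c`, capture stdout, and raise on a non-zero exit status.
def bash_command(command):
    result = subprocess.run(["bash", "-c", command],
                            capture_output=True, text=True)
    return result.returncode, result.stdout

def shell1(command):
    code, out = bash_command(command)
    if code != 0:
        raise RuntimeError("Failed shell1 command: " + command)
    return out
```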
diff --git a/Text-to-Breasonings-master/objecttofinishdict.txt b/Text-to-Breasonings-master/objecttofinishdict.txt new file mode 100644 index 0000000000000000000000000000000000000000..96c78b1baa6ec9c854b78bfb3b96dc4ff4613f95 --- /dev/null +++ b/Text-to-Breasonings-master/objecttofinishdict.txt @@ -0,0 +1 @@ +[[and,line]] \ No newline at end of file diff --git a/Text-to-Breasonings-master/objecttopreparedict.txt b/Text-to-Breasonings-master/objecttopreparedict.txt new file mode 100644 index 0000000000000000000000000000000000000000..daa5d00e22a9615a36153b53ec3253e368cd7d2b --- /dev/null +++ b/Text-to-Breasonings-master/objecttopreparedict.txt @@ -0,0 +1 @@ +[[and,keyboard]] \ No newline at end of file diff --git a/Text-to-Breasonings-master/partofroomdict.txt b/Text-to-Breasonings-master/partofroomdict.txt new file mode 100644 index 0000000000000000000000000000000000000000..05cfe8d2e70468340f8a2047355ec700518fb652 --- /dev/null +++ b/Text-to-Breasonings-master/partofroomdict.txt @@ -0,0 +1 @@ +[[and,table]] \ No newline at end of file diff --git a/Text-to-Breasonings-master/prompt_meditation b/Text-to-Breasonings-master/prompt_meditation new file mode 100644 index 0000000000000000000000000000000000000000..ec37bf244ab34cd1f9ec81ed453a19d271883c42 Binary files /dev/null and b/Text-to-Breasonings-master/prompt_meditation differ diff --git a/Text-to-Breasonings-master/prompt_meditation.pl b/Text-to-Breasonings-master/prompt_meditation.pl new file mode 100644 index 0000000000000000000000000000000000000000..2cb91927a015b2f3180441b945baafdce3b47f15 --- /dev/null +++ b/Text-to-Breasonings-master/prompt_meditation.pl @@ -0,0 +1,6 @@ +:-include('prompt_meditation1.pl'). +main:-catch(prompt_meditation,Err,handle_error(Err)),halt. + +handle_error(_Err):- + halt(1). +main :- halt(1). 
\ No newline at end of file diff --git a/Text-to-Breasonings-master/prompt_meditation1.pl b/Text-to-Breasonings-master/prompt_meditation1.pl new file mode 100644 index 0000000000000000000000000000000000000000..f819be15dcc718d04757984d77ba9cc69ac864ae --- /dev/null +++ b/Text-to-Breasonings-master/prompt_meditation1.pl @@ -0,0 +1,4 @@ +prompt_meditation :- +repeat,writeln("Please think of 108 arems and 108 friendlinesses, then enter \"y\":"), +read_string(user_input,"\n\r","\n\r",_,S), +S="y",!. \ No newline at end of file diff --git a/Text-to-Breasonings-master/prompt_question b/Text-to-Breasonings-master/prompt_question new file mode 100644 index 0000000000000000000000000000000000000000..93954c2fae06fecef03191c4d946541672c524a8 Binary files /dev/null and b/Text-to-Breasonings-master/prompt_question differ diff --git a/Text-to-Breasonings-master/prompt_question.pl b/Text-to-Breasonings-master/prompt_question.pl new file mode 100644 index 0000000000000000000000000000000000000000..ed1332c020f8c0925e3a623ae17212ad083672a1 --- /dev/null +++ b/Text-to-Breasonings-master/prompt_question.pl @@ -0,0 +1,6 @@ +:-include('prompt_question1.pl'). +main:-catch(prompt_question,Err,handle_error(Err)),halt. + +handle_error(_Err):- + halt(1). +main :- halt(1). 
diff --git a/Text-to-Breasonings-master/prompt_question1.pl b/Text-to-Breasonings-master/prompt_question1.pl new file mode 100644 index 0000000000000000000000000000000000000000..b743507816348b447fce3992acb567172c8c735b --- /dev/null +++ b/Text-to-Breasonings-master/prompt_question1.pl @@ -0,0 +1,4 @@ +prompt_question :- +repeat,writeln("Please ask the future Head of State and the Simulation Co-ordinator for the age freezing medicine, then say \"too\" to each of the following:\n\nA, B, B to B for Reishi Mushroom\nA, B, B to B for Goji\nA, B, B to B for Ginseng\nA, B, B to B for He-Shou-Wu\nA, B, B to B for Gotu Kola\nA, B, B to B for Schisandra\nA, B, B to B for Love head of state\nA, B, B to B for Thank who helped me with immortality medicine, body replacement and anti-ageing\nA, B, B to B for Thank head of state for Reishi Mushroom\nA, B, B to B for Thank head of state for Goji\nA, B, B to B for Thank head of state for Ginseng\nA, B, B to B for Thank head of state for He-Shou-Wu\nA, B, B to B for Thank head of state for Gotu Kola\nA, B, B to B for Thank head of state for Schisandra\nA, B, B to B for Immortality\nA, B, B to B for Body replacement\nA, B, B to B for Anti-ageing medicine\nA, B, B to B for immortality medicine, body replacement and anti-ageing for memory\nA, B, B to B for immortality medicine, body replacement and anti-ageing for thinking\nA, B, B to B for immortality medicine, body replacement and anti-ageing for stopping dementia\nA, B, B to B for immortality medicine, body replacement and anti-ageing for seeing clearly\nA, B, B to B for immortality medicine, body replacement and anti-ageing for muscle relaxation\nA, B, B to B for immortality medicine, body replacement and anti-ageing for Circulatory system / Cardiovascular system\nA, B, B to B for immortality medicine, body replacement and anti-ageing for Digestive system and Excretory system\nA, B, B to B for immortality medicine, body replacement and anti-ageing for Endocrine system\nA, B, B to B 
for immortality medicine, body replacement and anti-ageing for Integumentary system / Exocrine system\nA, B, B to B for immortality medicine, body replacement and anti-ageing for Immune system and lymphatic system:\nA, B, B to B for immortality medicine, body replacement and anti-ageing for Muscular system\nA, B, B to B for immortality medicine, body replacement and anti-ageing for Nervous system\nA, B, B to B for immortality medicine, body replacement and anti-ageing for Renal system and Urinary system\nA, B, B to B for immortality medicine, body replacement and anti-ageing for Reproductive system\nA, B, B to B for immortality medicine, body replacement and anti-ageing for Respiratory system\nA, B, B to B for immortality medicine, body replacement and anti-ageing for Skeletal System\nA, B, B to B for immortality medicine, body replacement and anti-ageing for antidepressant\nA, B, B to B for immortality medicine, body replacement and anti-ageing for antipsychotic\nA, B, B to B for Other medicines for the body\nA, B, B to B for ginkgo biloba\nA, B, B to B for practicum for others in immortality, etc.\nA, B, B to B for the other As\nA, B, B to B for Thank head of state.\n\nThen enter \"y\":"), +read_string(user_input,"\n\r","\n\r",_,S), +S="y",!. \ No newline at end of file diff --git a/Text-to-Breasonings-master/random_proof_reader.pl b/Text-to-Breasonings-master/random_proof_reader.pl new file mode 100644 index 0000000000000000000000000000000000000000..0b78a9d9cd7403d56177886161deeb12a39b53f2 --- /dev/null +++ b/Text-to-Breasonings-master/random_proof_reader.pl @@ -0,0 +1,17 @@ +% random_proof_reader.pl + +% randomly highlight parts to proofread + +:-include('../listprologinterpreter/listprolog.pl'). 
+main :- open_string_file_s("../Text-to-Breasonings/file.txt",A),
+find_sent(A,A1),
+
+findall([Mark,B],(member(B,A1),((B="\n"->true;B=".")->Mark="";random_member(Mark,[""," *: "]))),List),
+flatten(List,List1),
+foldr(string_concat,List1,S),
+save_file_s("../Text-to-Breasonings/file-rpr-out.txt",S),
+!.
+
+find_sent(A,F) :-
+ split_string(A,"\n\r","\n\r",B),
+ findall([B14,"\n"],
+  (member(B11,B),
+   split_string(B11,".",".",B12),
+   findall([B13,"."],member(B13,B12),B1),
+   flatten(B1,B15),
+   (string_concat(_,".",B11)->B14=B15;append(B14,[_],B15))),
+  B2),
+ flatten(B2,B3),
+ ((string_concat(_,"\n",A)->true;string_concat(_,"\r",A))->B21=B3;append(B21,[_],B3)),
+ findall(C,
+  (member(C,B21),
+   string_chars(C,D),
+   not(forall(member(E,D),char_type(E,white)))),
+  F),!.
+
 diff --git a/Text-to-Breasonings-master/rcaw.paste b/Text-to-Breasonings-master/rcaw.paste
new file mode 100644
index 0000000000000000000000000000000000000000..421a893d0fa7063232d558b8769e60ea271b041f
--- /dev/null
+++ b/Text-to-Breasonings-master/rcaw.paste
@@ -0,0 +1,292 @@
+% member(List,Item) (below) is part of deprecated lpi
+
+%% rcaw.paste - paste this into terminal
+%% before pasting, to avoid conjunctivitis and haemorrhoids, think of two radio buttons put on recordings, put through with prayer, nut and bolt, quantum box prayer 1, 1, 0.5 cm and 1, 1, 0.5 cm.
+
+ssh root@x.x.x.x -p 22
+/** Access your VPS - see https://github.com/luciangreen/listprologinterpreter/blob/master/Setting_up_a_VPS_with_TextToBr.txt
+
+set up shell ssh key terminal mac save public key log in
+https://www.linode.com/docs/security/authentication/use-public-key-authentication-with-ssh/
+does ssh key require password
+https://stackoverflow.com/questions/21095054/ssh-key-still-asking-for-password-and-passphrase
+where is ssh configuration file mac
+http://osxdaily.com/2011/04/05/setup-ssh-config-fie/
+command-shift-. see invisible finder files
+**/
+
+cd ../var/www/yourdomain.com/
+
+%% Meditation and Medicine
+
+swipl -G100g -T20g -L2g
+
+[meditationnoreplace].
+meditation. + +halt. + +%% Preparation for Mind Reading - or at least gives you 100 done up As + +leash(-all),visible(+all). +protocol("./file.txt"). +trace. +caw00(off,[n,f],[[[n,append],2,1],[[n,delete],2,1],[[n,head],1,1],[[n,tail],1,1],[[n,member],1,1]],2,8,_InputVarList,_OutputVarList,[],_Program2,Ps). +notrace,noprotocol. + +%% ** Stop - close window x halt (last time I stopped here) +halt. +swipl -G100g -T20g -L2g +['../Text-to-Breasonings/text_to_breasonings.pl']. + +[meditatorsanddoctors]. +meditators(Meditators),doctors(Doctors),append(Meditators,Doctors,MeditatorsandDoctors),length(MeditatorsandDoctors,LengthMeditatorsandDoctors), + +time((N is 100*4, M is 4000, texttobr2(N,u,u,M),texttobr(N,u,u,M))). + +%% Give the meditators, etc. the As. + +[texttobr2qb]. + +[meditatorsanddoctors]. +meditators(Meditators),doctors(Doctors),append(Meditators,Doctors,MeditatorsandDoctors),length(MeditatorsandDoctors,LengthMeditatorsandDoctors1), + +LengthMeditatorsandDoctors2 is 3*LengthMeditatorsandDoctors1, +texttobr2(LengthMeditatorsandDoctors2). + +halt. + +%% Security + +%% First half + +swipl -G100g -T20g -L2g + +FileName2='./sfile1.txt', +FileName3='sfile1.txt', + +protocol(FileName2). +notrace. + +[mindreadtestsec]. + +sectest0. + +noprotocol. + +['../Text-to-Breasonings/text_to_breasonings.pl']. + +FileName2='./sfile1.txt', +FileName3='sfile1.txt', + +time((texttobr2(u,FileName3,u,u),texttobr(u,FileName3,u,u))), + +atom_concat('echo "" | mutt -s "Security 1" you@yourdomain.com -a ',FileName3,A1), + +shell1(A1). + +halt. + +%% Second half + +swipl -G100g -T20g -L2g + +FileName2='./sfile2.txt', +FileName3='sfile2.txt', + +protocol(FileName2). +notrace. + +[mindreadtestsec]. + +sectest1. + +noprotocol. + +['../Text-to-Breasonings/text_to_breasonings.pl']. 
+ +FileName2='./sfile2.txt', +FileName3='sfile2.txt', + +time((texttobr2(u,FileName3,u,u),texttobr(u,FileName3,u,u))), + +atom_concat('echo "" | mutt -s "Security 2" you@yourdomain.com -a ',FileName3,A1), + +shell1(A1). + +halt. + +%% Give the meditators, etc. the As. + +swipl -G100g -T20g -L2g + +[mindreadtestsec]. + +[texttobr2qb]. + +[meditatorsanddoctors]. +meditators(Meditators),doctors(Doctors),append(Meditators,Doctors,MeditatorsandDoctors),length(MeditatorsandDoctors,LengthMeditatorsandDoctors1), + +LengthMeditatorsandDoctors2 is 2*LengthMeditatorsandDoctors1, +texttobr2(LengthMeditatorsandDoctors2). + +halt. + +%% Meditation + +%% andrew - duplicate for others from "andrew" +swipl -G100g -T20g -L2g +[listprolog]. +[mindreadingcaw]. +caw00(off,[n,f],[[[n,append],2,1],[[n,delete],2,1],[[n,head],1,1],[[n,tail],1,1],[[n,member],1,1]],2,8,_InputVarList,_OutputVarList,[],_Program2,Ps),Ps=[[Query,Functions,_Result1]|_Rest],leash(-all),visible(+all),protocol("./andrew.txt"),trace,interpret(off,Query,Functions,_Result2),notrace,noprotocol. +halt. +swipl -G100g -T20g -L2g +['../Text-to-Breasonings/text_to_breasonings.pl']. +time((N is 2, M is 8000, texttobr2(N,"andrew.txt",u,M),texttobr(N,"andrew.txt",u,M))). +%% Give the meditator the As. +[texttobr2qb]. +texttobr2(3). %% Dot on As, GG,B,BB and dot on +atom_concat('echo "" | mutt -s "Meditations andrew" you@yourdomain.com -a andrew.txt','',A1), +shell1(A1). +halt. + + +%% Psychiatrist + +%% First half + +swipl -G100g -T20g -L2g + +FileName2='./pfile1.txt', +FileName3='pfile1.txt', + +protocol(FileName2). +notrace. + +[mindreadtestpsychiatrist]. + +sectest0. + +noprotocol. + +['../Text-to-Breasonings/text_to_breasonings.pl']. + +FileName2='./pfile1.txt', +FileName3='pfile1.txt', + +time((texttobr2(u,FileName3,u,u),texttobr(u,FileName3,u,u))), + +atom_concat('echo "" | mutt -s "Psychiatric Report 1" your@yourdomain -a ',FileName3,A1), + +shell1(A1). + +halt. 
+
+%% Second half
+
+swipl -G100g -T20g -L2g
+
+FileName2='./pfile2.txt',
+FileName3='pfile2.txt',
+
+protocol(FileName2).
+notrace.
+
+[mindreadtestpsychiatrist].
+
+sectest1.
+
+noprotocol.
+
+['../Text-to-Breasonings/text_to_breasonings.pl'].
+
+FileName2='./pfile2.txt',
+FileName3='pfile2.txt',
+
+time((texttobr2(u,FileName3,u,u),texttobr(u,FileName3,u,u))),
+
+atom_concat('echo "" | mutt -s "Psychiatric Report 2" you@yourdomain.com -a ',FileName3,A1),
+
+shell1(A1).
+
+halt.
+
+%% Give the meditators, etc. the As.
+
+swipl -G100g -T20g -L2g
+
+[mindreadtestsec].
+
+[texttobr2qb].
+
+[meditatorsanddoctors].
+meditators(Meditators),doctors(Doctors),append(Meditators,Doctors,MeditatorsandDoctors),length(MeditatorsandDoctors,LengthMeditatorsandDoctors1),
+
+LengthMeditatorsandDoctors2 is 2*LengthMeditatorsandDoctors1,
+texttobr2(LengthMeditatorsandDoctors2).
+
+halt.
+
+
+%% Thoughts
+
+%% First half
+
+swipl -G100g -T20g -L2g
+
+FileName2='./tfile1.txt',
+FileName3='tfile1.txt',
+
+protocol(FileName2).
+notrace.
+
+[mindreadtestsecthoughts].
+
+sectest0.
+
+noprotocol.
+
+['../Text-to-Breasonings/text_to_breasonings.pl'].
+
+FileName2='./tfile1.txt',
+FileName3='tfile1.txt',
+
+time((texttobr2(u,FileName3,u,u),texttobr(u,FileName3,u,u))),
+
+atom_concat('echo "" | mutt -s "Thoughts 1" luciangreen@lucianacademy.com -a ',FileName3,A1),
+
+shell1(A1).
+
+halt.
+
+%% Second half
+
+swipl -G100g -T20g -L2g
+
+FileName2='./tfile2.txt',
+FileName3='tfile2.txt',
+
+protocol(FileName2).
+notrace.
+
+[mindreadtestsecthoughts].
+
+sectest1.
+
+noprotocol.
+
+['../Text-to-Breasonings/text_to_breasonings.pl'].
+
+FileName2='./tfile2.txt',
+FileName3='tfile2.txt',
+
+time((texttobr2(u,FileName3,u,u),texttobr(u,FileName3,u,u))),
+
+atom_concat('echo "" | mutt -s "Thoughts 2" luciangreen@lucianacademy.com -a ',FileName3,A1),
+
+shell1(A1).
+
+halt.
+ + +logout diff --git a/Text-to-Breasonings-master/roomdict.txt b/Text-to-Breasonings-master/roomdict.txt new file mode 100644 index 0000000000000000000000000000000000000000..b4ef9b66cf7f191ff9d39d2a4eb9c1be5d006a6e --- /dev/null +++ b/Text-to-Breasonings-master/roomdict.txt @@ -0,0 +1 @@ +[[and,office]] \ No newline at end of file diff --git a/Text-to-Breasonings-master/sectest_producer.pl b/Text-to-Breasonings-master/sectest_producer.pl new file mode 100644 index 0000000000000000000000000000000000000000..ca351a53c140eb865351498a6c301c300ff7b780 --- /dev/null +++ b/Text-to-Breasonings-master/sectest_producer.pl @@ -0,0 +1,77 @@ +% place in gh/t2b (not in private2/t2b) + +%% mind read test + +%% Make files different for different tests + +%% *** Important: initialise program before running for the first time: +%% N is 1*2*3*5,texttobr2(N). %% 100 As for 1 (turned on)*2 (to and from computer)*3 (rb, itself (already done), graciously give or blame, radio button for graciously give or blame)*5 (5 objects) +%% N is 1*2*3*5,texttobr2(N). %% 100 As for 1 (turned off)*2 (to and from computer)*3 (rb, itself (already done), graciously give or blame, radio button for graciously give or blame)*5 (5 objects) +%% also breason out and dot on objects before line above and breason out and dot on when recognising and saying object (with all objects having different breasonings) + + +%%use_module(library(pio)). + +:- use_module(library(date)). +:- include('../Text-to-Breasonings/text_to_breasonings.pl'). +:- include('../mindreader/mindreadtestshared'). +:-include('../listprologinterpreter/la_strings_string.pl'). %**** change path on server +:-include('../listprologinterpreter/la_maths.pl'). %**** change path on server +:-include('../Text-to-Breasonings/text_to_breasonings.pl'). +:-include('../Algorithm-Writer-with-Lists/grammar_logic_to_alg.pl'). 
+ +/* +- for each customer, each time +- medit and medic each given 250 +- 3 points each given 250 +- my opinion - + if they have finished, are happy +*/ + +sectest_p:- + %random(X),N is round(100*X), + + %phrase_from_file_s(string([N,_]), "a_tally.txt"), + %string_codes(String02b,String00a), + %atom_to_term(String00a,[N,_],[]), + +numbers(2,1,[],Ns), +%writeln(N), +findall(_,(member(N1,Ns), + + find_time1(H,M,S), + politeness(0,Medit), + politeness(0,Medic), + politeness(0,Three_points), + politeness(0,My_opinion), % if there's nothing else wrong, is there anything else wrong? + % no_death(0,Threats2), % medits for life + writeln([N1,H,M,S,Medit,medit]), + writeln([N1,H,M,S,Medic,medic]), + writeln([N1,H,M,S,Three_points,three_points]), + writeln([N1,H,M,S,My_opinion,my_opinion]), + + + Br=16000, W=80, + grammar_logic_to_alg1("I am happy.",Br,GL_out1), + term_to_atom(GL_out1,GL_out2), + string_atom(GL_out,GL_out2), + texttobr2(u,u,GL_out,Br,false,false,false,false,false,false,W), + texttobr(u,u,GL_out,Br) + + + + ),_).%,Threats2,no_death]). + +find_time1(H,M,S) :- + find_time(H,M,S),!. + +politeness(Threats1,Threats2):- + %% "Given that we are interested in friendliness in primary school, secondary school and university, is there anything else?" + trialy2_6("Yes",R1), + trialy2_6("No",R2), + R=[R1,R2/**,R3,R4,R5,R6,R7,R8,R9,R10**,R11,R12,R13,R14,R15,R16,R17,R18,R19,R20,R21,R22,R23,R24,R25,R26,R27**/ + ], + sort(R,RA), + reverse(RA,RB), + RB=[[_,Answer]|_Rest], + + (Answer="No"->Threats2=Threats1;(Threats3 is Threats1+1,politeness(Threats3,Threats2))),!. 
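politeness/2 (like threats/2 in mindreadtestsec) repeats the Yes/No vote, incrementing its tally until "No" wins. The recursion reduces to a simple counting loop, sketched here in Python (`ask` is a hypothetical stand-in for the trialy2_6-based vote):

```python
# Sketch of the politeness/2 / threats/2 tally: keep asking until the
# answer is "No", counting how many times "Yes" won first.
def tally(ask):
    count = 0
    while ask() == "Yes":
        count += 1
    return count
```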
diff --git a/Text-to-Breasonings-master/single_meditation b/Text-to-Breasonings-master/single_meditation new file mode 100644 index 0000000000000000000000000000000000000000..a2323c08407bbb2e62e66597c670ccfbdee611fe Binary files /dev/null and b/Text-to-Breasonings-master/single_meditation differ diff --git a/Text-to-Breasonings-master/single_meditation.pl b/Text-to-Breasonings-master/single_meditation.pl new file mode 100644 index 0000000000000000000000000000000000000000..a9fcbf25694441076b27ab62b7f535fa6a270385 --- /dev/null +++ b/Text-to-Breasonings-master/single_meditation.pl @@ -0,0 +1,2 @@ +:-include('text_to_breasonings.pl'). +main:-N is 108*2,texttobr2_1(N). diff --git a/Text-to-Breasonings-master/t2b4 b/Text-to-Breasonings-master/t2b4 new file mode 100644 index 0000000000000000000000000000000000000000..b6bc2100d275134da1db37979e8d6894bf704665 Binary files /dev/null and b/Text-to-Breasonings-master/t2b4 differ diff --git a/Text-to-Breasonings-master/text_to_breasonings.pl b/Text-to-Breasonings-master/text_to_breasonings.pl new file mode 100644 index 0000000000000000000000000000000000000000..776f354addf82e36a528b9f7efa6ab054fc606a5 --- /dev/null +++ b/Text-to-Breasonings-master/text_to_breasonings.pl @@ -0,0 +1,10 @@ +:-include('../Text-to-Breasonings/texttobr.pl'). +:-include('../Text-to-Breasonings/texttobrall2_reading.pl'). + +:- include('../listprologinterpreter/listprolog'). +:- include('../Text-to-Breasonings/texttobr2qb'). +%%:- include('texttobrqb'). +%:- include('../listprologinterpreter/la_strings'). +%:- include('mergetexttobrdict'). +%:- include('edit.pl'). +%:- include('meditatorsanddoctors'). 
diff --git a/Text-to-Breasonings-master/text_to_breasonings3.pl b/Text-to-Breasonings-master/text_to_breasonings3.pl new file mode 100644 index 0000000000000000000000000000000000000000..7b1aae2cd314b315ca0e580719e31fdba392eb15 --- /dev/null +++ b/Text-to-Breasonings-master/text_to_breasonings3.pl @@ -0,0 +1,10 @@ +:-include('../Text-to-Breasonings/texttobr.pl'). +:-include('../Text-to-Breasonings/texttobrall2_reading3.pl'). + +:- include('../listprologinterpreter/listprolog'). +:- include('../Text-to-Breasonings/texttobr2qb'). +%%:- include('texttobrqb'). +%:- include('../listprologinterpreter/la_strings'). +%:- include('mergetexttobrdict'). +%:- include('edit.pl'). +%:- include('meditatorsanddoctors'). diff --git a/Text-to-Breasonings-master/text_to_breasonings4.pl b/Text-to-Breasonings-master/text_to_breasonings4.pl new file mode 100644 index 0000000000000000000000000000000000000000..b4f088aa18a02f1bac305802b9fe6bb85f47ba56 --- /dev/null +++ b/Text-to-Breasonings-master/text_to_breasonings4.pl @@ -0,0 +1,10 @@ +:-include('../Text-to-Breasonings/texttobr.pl'). +:-include('../Text-to-Breasonings/texttobrall2_reading4.pl'). + +:- include('../listprologinterpreter/listprolog'). +:- include('../Text-to-Breasonings/texttobr2qb'). +%%:- include('texttobrqb'). +%:- include('../listprologinterpreter/la_strings'). +%:- include('mergetexttobrdict'). +%:- include('edit.pl'). +%:- include('meditatorsanddoctors'). diff --git a/Text-to-Breasonings-master/texttobr.pl b/Text-to-Breasonings-master/texttobr.pl new file mode 100644 index 0000000000000000000000000000000000000000..73d600f855884ffe81c9371355afe384a695a2eb --- /dev/null +++ b/Text-to-Breasonings-master/texttobr.pl @@ -0,0 +1,134 @@ +%% use_module(library(pio)). 
In la_strings + + +%% texttobr - converts file stream to list of 3D dimensions of each character + +texttobr(N1,Filex1,Stringx1,M1) :- + + ((number(N1),N=N1)->true; + (N1=u,N=1)), + + ((Filex1=u,Filex="file.txt")->true; + Filex=Filex1), + + ((number(M1),M=M1)->true; + M=all), %% If m1 is undefined or all then m=all + + texttobrc1(N,Filex,Stringx1,M),!. + +texttobrc1(0,_,_,_) :- !. +texttobrc1(N1,Filex,Stringx1,BreasoningLimit) :- + texttobrc2(Filex,Stringx1,BreasoningLimit), + N2 is N1-1, + texttobrc1(N2,Filex,Stringx1,BreasoningLimit). + +texttobrc2(Filex,Stringx1,M) :- + + ((Stringx1=u, + phrase_from_file_s(string2(String1), Filex))->true; + string_codes(Stringx1,String1)), + + ((number(M),length(String,M), + append(String,_,String1))->true; + String=String1), + + br(String),!. + +string2(String) --> list2(String). +list2([]) --> []. +list2([L|Ls]) --> [L], list2(Ls). + +br([]) :- + !. +br([Code|Codes]) :- + char_code(Character,Code), + br(Character,_X,_Y,_Z), + brth2(Character,_Brth), + %%writeln([Character]),%%,X,Y,Z]), %% + %%write(' '), + br(Codes). +brth2(_,sweetinvincibleandprayedfor). +br('A',1,1.5,0). +br('B',1,1.5,0). +br('C',1,1.5,0). +br('D',1,1.5,0). +br('E',1,1.5,0). +br('F',1,1.5,0). +br('G',1,1.5,0). +br('H',1,1.5,0). +br('I',1,1.5,0). +br('J',1,1.5,0). +br('K',1,1.5,0). +br('L',1,1.5,0). +br('M',1,1.5,0). +br('N',1,1.5,0). +br('O',1,1.5,0). +br('P',1,1.5,0). +br('Q',1,1.5,0). +br('R',1,1.5,0). +br('S',1,1.5,0). +br('T',1,1.5,0). +br('U',1,1.5,0). +br('V',1,1.5,0). +br('W',1,1.5,0). +br('X',1,1.5,0). +br('Y',1,1.5,0). +br('Z',1,1.5,0). +br('a',1,1,0). +br('b',1,1.5,0). +br('c',1,1,0). +br('d',1,1.5,0). +br('e',1,1,0). +br('f',1,2.5,0). +br('g',1,2,0). +br('h',1,1.5,0). +br('i',1,1.5,0). +br('j',1,2.5,0). +br('k',1,1.5,0). +br('l',1,1.5,0). +br('m',1,1,0). +br('n',1,1,0). +br('o',1,1,0). +br('p',1,2,0). +br('q',1,2,0). +br('r',1,1,0). +br('s',1,1,0). +br('t',1,1.5,0). +br('u',1,1,0). +br('v',1,1,0). +br('w',1,1,0). +br('x',1,1,0). +br('y',1,2,0). 
+br('z',1,1,0). + +br('?',1,1.5,0). +br('-',1,1,0). +br(' ',1,1,0). +br(',',1,1,0). +br('(',1,1.5,0). +br(')',1,1.5,0). +br('|',1,2.5,0). +br('.',1,1,0). +br(':',1,1,0). +br('_',1,1,0). +br('\'',1,1.5,0). +br('[',1,1.5,0). +br(']',1,1.5,0). +br('<',1,1,0). +br('>',1,1,0). + +br('0',1,1.5,0). +br('1',1,1.5,0). +br('2',1,1.5,0). +br('3',1,1.5,0). +br('4',1,1.5,0). +br('5',1,1.5,0). +br('6',1,1.5,0). +br('7',1,1.5,0). +br('8',1,1.5,0). +br('9',1,1.5,0). + +br('{',1,1.5,0). +br('}',1,1.5,0). +br('\n',1,1,0). +br(_,1,1,0). diff --git a/Text-to-Breasonings-master/texttobr2_square b/Text-to-Breasonings-master/texttobr2_square new file mode 100644 index 0000000000000000000000000000000000000000..5492426c7835ca90adaa4e9497dc701569ee6bfe Binary files /dev/null and b/Text-to-Breasonings-master/texttobr2_square differ diff --git a/Text-to-Breasonings-master/texttobr2_square.pl b/Text-to-Breasonings-master/texttobr2_square.pl new file mode 100644 index 0000000000000000000000000000000000000000..d4f18c9517cb2f1ec28c34301f9502eb0ac23adb --- /dev/null +++ b/Text-to-Breasonings-master/texttobr2_square.pl @@ -0,0 +1,6 @@ +:-include('text_to_breasonings.pl'). +main:-catch((texttobr2_1(1),texttobr2(4,u,"square",u)),Err,handle_error(Err)),halt. + +handle_error(_Err):- + halt(1). +main :- halt(1). 
\ No newline at end of file diff --git a/Text-to-Breasonings-master/texttobr2qb.pl b/Text-to-Breasonings-master/texttobr2qb.pl new file mode 100644 index 0000000000000000000000000000000000000000..7f17746b7f9c9c6c32393187fa7a44e8c66ac2e4 --- /dev/null +++ b/Text-to-Breasonings-master/texttobr2qb.pl @@ -0,0 +1,29 @@ +%% 250 breasonings + +texttobr2 :- +br1([[a,1,1.5,0],[about,1,1,0],[ache,1,0,0],[after,1,1,0],[all,1,1.5,0],[an,1,1.5,0],[and,1,1,0],[appearances,5,0,5],[apple,5,5,5],[are,1,1,0],[area,5,5,0],[argument,1,1.5,0],[as,1,1,0],[asanas,180,50,1],[at,1,1,0],[ate,15,2,1],[avoid,1,1,0],[b,1,1,1],[bacteria,2,1,1],[based,5,5,1],[be,50,30,180],[before,1,1,0],[being,50,30,180],[binding,1,1,0],[blocking,1,1,1],[bolt,2,1,1],[box,1,1,1],[breakfast,10,10,1],[breasoned,5,5,5],[breasoning,5,5,5],[breasonings,5,5,5],[by,1,1,0],[chapter,10,15,1],[clear,1,1,0],[colds,1,1,1],[comfort,180,50,30],[comfortable,50,50,100],[complexes,1,1,1],[conception,50,14,8],[connected,1,1,0],[correctness,1,1,0],[day,10,10,10],[depression,1,1,0],[details,1,1,1],[did,1,1,0],[do,50,30,180],[dotted,1,1,0],[down,1,1,0],[drinking,5,5,5],[during,1,1.5,0],[each,1,1,0],[earned,1,1.5,0],[education,50,30,180],[effects,1,1.5,0],[elderberries,1,1,1],[ensure,1,1,0],[excess,1,1,1],[exercises,10,10,1],[f,5,5,5],[feeling,1,1,0],[felt,1,1,0],[first,1,1.5,0],[flu,1,1,0],[food,2,1,1],[for,1,1,0],[from,1,1,0],[fruits,1,1,1],[function,1,1.5,0],[gaffer,10,10,1],[gave,1,1,1],[glasses,15,11,3],[god,50,30,180],[grade,1,1.5,0],[grains,1,1,1],[had,1,1,0],[hallucinogenic,1,1,0],[happiness,1,1,0],[happy,1,1,0],[have,5,5,5],[head,15,20,23],[headache,1,1,0],[headaches,1,1,0],[health,50,30,180],[healthy,50,30,180],[help,5,5,0],[helped,50,30,180],[high,1,1.5,0],[hills,5,5,2],[honey,10,10,10],[i,50,30,180],[imagery,5,5,0],[in,1,1,0],[incompatibility,1,1,0],[interpretation,20,30,0],[it,1,1,0],[keep,1,1,0],[laughter,1,1.5,0],[maintain,1,1,0],[maintaining,1,1,0],[master,50,30,180],[mind,3,4,4],[minutes,20,30,0],[mistake,1,1,0],[musc
le,1,1,1],[muscles,1,1,1],[music,1,1,0],[must,1,1,0],[my,50,30,180],[namaskar,15,10,23],[neck,15,15,10],[next,1,1,0],[nietzsche,50,30,180],[no,1,1,0],[noticed,50,30,180],[nut,1,1,0],[nuts,1,1,1],[of,1,1,0],[off,1,1.5,0],[on,1,1,0],[one,1,1.5,0],[organs,1,1,1],[out,1,1,0],[over,1,1,0],[part,1,1,1],[peace,1,1.5,0],[perfect,1,1.5,0],[performed,50,30,180],[performing,50,30,250],[philosophy,10,20,1],[physiology,50,30,180],[pimple,1,1,0],[pm,5,5,5],[poem,1,1.5,0],[positive,1,1,0],[pot,5,5,10],[prayer,5,5,0],[prepared,1,1,0],[prevent,1,1,0],[prevented,1,1,0],[preventing,1,1,0],[prevention,1,1,0],[problems,1,1.5,0],[product,1,150,1],[products,5,5,5],[psychiatry,50,30,180],[psychology,2,2,0],[quality,1,1.5,0],[quantum,1,1,0],[read,15,1,20],[relax,180,50,30],[relaxed,180,50,30],[remain,1,1.5,0],[require,1,1,0],[room,500,400,300],[s,1,1,0],[safe,1,1.5,0],[sales,1,1.5,0],[same,1,1,0],[schizophrenia,1,1,0],[second,1,1.5,0],[see,1,0,1],[seek,1,1,0],[sell,5,3,0],[sets,5,1,0],[sex,1,1.5,0],[short,1,1,0],[single,1,1.5,0],[sites,20,0,30],[sitting,50,70,150],[sized,30,1,0],[skin,1,1,0],[sleep,180,50,30],[slices,2,2,2],[so,1,1,0],[societal,50,30,180],[some,1,1,0],[spine,1,1,10],[spiritual,50,30,180],[state,1,1.5,0],[stated,20,30,0],[stating,50,30,180],[status,1,1.5,0],[stop,5,1,15],[straight,1,1,0],[strawberry,1,1,1],[structure,1,1,1],[studied,50,30,180],[study,100,50,100],[studying,50,30,180],[subject,50,30,180],[successful,1,1.5,0],[surya,1,1.5,0],[sutra,5,1.5,0],[table,7,5,0],[tape,10,10,1],[task,15,2.5,0],[technique,1,1,0],[test,20,20,0],[text,20,30,0],[that,1,1,0],[the,1,1,0],[them,50,30,180],[then,1,1,0],[they,50,30,180],[thinking,1,1.5,0],[third,1,1.5,0],[this,1,1,0],[thoughts,1,1,0],[through,1,1,0],[tied,1,1,0],[time,1,1.5,0],[to,1,1,0],[together,1,1,0],[touch,1,1,0],[train,1,1,1],[trains,50,30,180],[travel,10,3,0],[treating,20,15,3],[turn,1,1,0],[two,1,1.5,0],[university,100,75,3],[unwanted,1,1,0],[up,1,1,0],[upasana,100,75,100],[used,1,1,0],[using,1,1,0],[var,1,1,1],[vegetabl
es,5,5,5],[videos,5,0,3],[view,5,5,5],[virality,1,1,0],[viruses,1,1,2],[vitamin,1,1,1],[walks,100,500,0],[wanted,15,20,3],[warm,1,1,0],[was,1,1,0],[water,5,5,5],[way,1,1,0],[ways,50,100,0],[well,50,30,180],[where,1,1,0],[whole,1,1,1],[with,1,1,0],[without,1,1,0],[words,5,7,1],[write,15,1,1],[writing,5,5,0],[wrote,15,1,1],[years,3,1,0]]). + +br1([[a,1,1.5,0],[about,1,1,0],[ache,1,0,0],[after,1,1,0],[all,1,1.5,0],[an,1,1.5,0],[and,1,1,0],[appearances,5,0,5],[apple,5,5,5],[are,1,1,0],[area,5,5,0],[argument,1,1.5,0],[as,1,1,0],[asanas,180,50,1],[at,1,1,0],[ate,15,2,1],[avoid,1,1,0],[b,1,1,1],[bacteria,2,1,1],[based,5,5,1],[be,50,30,180],[before,1,1,0],[being,50,30,180],[binding,1,1,0],[blocking,1,1,1],[bolt,2,1,1],[box,1,1,1],[breakfast,10,10,1],[breasoned,5,5,5],[breasoning,5,5,5],[breasonings,5,5,5],[by,1,1,0],[chapter,10,15,1],[clear,1,1,0],[colds,1,1,1],[comfort,180,50,30],[comfortable,50,50,100],[complexes,1,1,1],[conception,50,14,8],[connected,1,1,0],[correctness,1,1,0],[day,10,10,10],[depression,1,1,0],[details,1,1,1],[did,1,1,0],[do,50,30,180],[dotted,1,1,0],[down,1,1,0],[drinking,5,5,5],[during,1,1.5,0],[each,1,1,0],[earned,1,1.5,0],[education,50,30,180],[effects,1,1.5,0],[elderberries,1,1,1],[ensure,1,1,0],[excess,1,1,1],[exercises,10,10,1],[f,5,5,5],[feeling,1,1,0],[felt,1,1,0],[first,1,1.5,0],[flu,1,1,0],[food,2,1,1],[for,1,1,0],[from,1,1,0],[fruits,1,1,1],[function,1,1.5,0],[gaffer,10,10,1],[gave,1,1,1],[glasses,15,11,3],[god,50,30,180],[grade,1,1.5,0],[grains,1,1,1],[had,1,1,0],[hallucinogenic,1,1,0],[happiness,1,1,0],[happy,1,1,0],[have,5,5,5],[head,15,20,23],[headache,1,1,0],[headaches,1,1,0],[health,50,30,180],[healthy,50,30,180],[help,5,5,0],[helped,50,30,180],[high,1,1.5,0],[hills,5,5,2],[honey,10,10,10],[i,50,30,180],[imagery,5,5,0],[in,1,1,0],[incompatibility,1,1,0],[interpretation,20,30,0],[it,1,1,0],[keep,1,1,0],[laughter,1,1.5,0],[maintain,1,1,0],[maintaining,1,1,0],[master,50,30,180],[mind,3,4,4],[minutes,20,30,0],[mistake,1,1,0],[muscle,1,1,1],
[muscles,1,1,1],[music,1,1,0],[must,1,1,0],[my,50,30,180],[namaskar,15,10,23],[neck,15,15,10],[next,1,1,0],[nietzsche,50,30,180],[no,1,1,0],[noticed,50,30,180],[nut,1,1,0],[nuts,1,1,1],[of,1,1,0],[off,1,1.5,0],[on,1,1,0],[one,1,1.5,0],[organs,1,1,1],[out,1,1,0],[over,1,1,0],[part,1,1,1],[peace,1,1.5,0],[perfect,1,1.5,0],[performed,50,30,180],[performing,50,30,250],[philosophy,10,20,1],[physiology,50,30,180],[pimple,1,1,0],[pm,5,5,5],[poem,1,1.5,0],[positive,1,1,0],[pot,5,5,10],[prayer,5,5,0],[prepared,1,1,0],[prevent,1,1,0],[prevented,1,1,0],[preventing,1,1,0],[prevention,1,1,0],[problems,1,1.5,0],[product,1,150,1],[products,5,5,5],[psychiatry,50,30,180],[psychology,2,2,0],[quality,1,1.5,0],[quantum,1,1,0],[read,15,1,20],[relax,180,50,30],[relaxed,180,50,30],[remain,1,1.5,0],[require,1,1,0],[room,500,400,300],[s,1,1,0],[safe,1,1.5,0],[sales,1,1.5,0],[same,1,1,0],[schizophrenia,1,1,0],[second,1,1.5,0],[see,1,0,1],[seek,1,1,0],[sell,5,3,0],[sets,5,1,0],[sex,1,1.5,0],[short,1,1,0],[single,1,1.5,0],[sites,20,0,30],[sitting,50,70,150],[sized,30,1,0],[skin,1,1,0],[sleep,180,50,30],[slices,2,2,2],[so,1,1,0],[societal,50,30,180],[some,1,1,0],[spine,1,1,10],[spiritual,50,30,180],[state,1,1.5,0],[stated,20,30,0],[stating,50,30,180],[status,1,1.5,0],[stop,5,1,15],[straight,1,1,0],[strawberry,1,1,1],[structure,1,1,1],[studied,50,30,180],[study,100,50,100],[studying,50,30,180],[subject,50,30,180],[successful,1,1.5,0],[surya,1,1.5,0],[sutra,5,1.5,0],[table,7,5,0],[tape,10,10,1],[task,15,2.5,0],[technique,1,1,0],[test,20,20,0],[text,20,30,0],[that,1,1,0],[the,1,1,0],[them,50,30,180],[then,1,1,0],[they,50,30,180],[thinking,1,1.5,0],[third,1,1.5,0],[this,1,1,0],[thoughts,1,1,0],[through,1,1,0],[tied,1,1,0],[time,1,1.5,0],[to,1,1,0],[together,1,1,0],[touch,1,1,0],[train,1,1,1],[trains,50,30,180],[travel,10,3,0],[treating,20,15,3],[turn,1,1,0],[two,1,1.5,0],[university,100,75,3],[unwanted,1,1,0],[up,1,1,0],[upasana,100,75,100],[used,1,1,0],[using,1,1,0],[var,1,1,1],[vegetables,5,5,5],
[videos,5,0,3],[view,5,5,5],[virality,1,1,0],[viruses,1,1,2],[vitamin,1,1,1],[walks,100,500,0],[wanted,15,20,3],[warm,1,1,0],[was,1,1,0],[water,5,5,5],[way,1,1,0],[ways,50,100,0],[well,50,30,180],[where,1,1,0],[whole,1,1,1],[with,1,1,0],[without,1,1,0],[words,5,7,1],[write,15,1,1],[writing,5,5,0],[wrote,15,1,1],[years,3,1,0]]). + +texttobr2(A) :- A1 is 1*A, texttobr2_2(A1). +texttobr2_2(0):-!. +texttobr2_2(N1):- + texttobr2,N2 is N1-1,texttobr2_2(N2),!. + +texttobr2_a(A,B) :- A1 is 1*A, texttobr2_a2(A1,B). + +texttobr2_a2(0,_):-!. +texttobr2_a2(N1,A):- + texttobr2(A,_),N2 is N1-1,texttobr2_a2(N2,A),!. + +texttobr2_1(A) :- A1 is 1*A, texttobr2_12(A1). + +texttobr2_12(0):-!. +texttobr2_12(N1):- + texttobr2,texttobr2(meditation,_),texttobr2(medicine,_),texttobr2(pedagogy,_),N2 is N1-1,texttobr2_12(N2),!. + +texttobr2(meditation,[[person,50,30,180],[heart,1,1,0],[person,50,30,180],[person,50,30,180],[heart,1,1,0],[a,1,1.5,0],[dash,1,1,0],[person,50,30,180],[person,50,30,180],[hair,1,0,0],[person,50,30,180],[nail,0.3,0.3,3],[person,50,30,180],[person,50,30,180],[square,1,1,0],[a,1,1.5,0],[person,50,30,180],[apple,5,5,5],[person,50,30,180],[plus,1,1,0],[person,50,30,180],[person,50,30,180],[plus,1,1,0],[person,50,30,180],[down,1,1,0],[equals,1,1,0],[square,1,1,0],[plus,1,1,0],[person,50,30,180],[plus,1,1,0],[a,1,1.5,0],[person,50,30,180],[apple,5,5,5],[left,1,1,0],[a,1,1.5,0],[person,50,30,180],[plus,1,1,0],[equals,1,1,0],[plus,1,1,0],[person,50,30,180],[apple,5,5,5],[n,1,1,0],[right,1,1,0],[left,1,1,0],[down,1,1,0],[equals,1,1,0],[plus,1,1,0],[person,50,30,180],[heart,1,1,0],[person,50,30,180],[person,50,30,180],[heart,1,1,0],[a,1,1.5,0],[dash,1,1,0],[person,50,30,180],[person,50,30,180],[hair,1,0,0],[person,50,30,180],[nail,0.3,0.3,3],[person,50,30,180],[person,50,30,180],[square,1,1,0],[a,1,1.5,0],[person,50,30,180],[apple,5,5,5],[person,50,30,180],[plus,1,1,0],[person,50,30,180],[person,50,30,180],[plus,1,1,0],[person,50,30,180],[down,1,1,0],[equals,1,1,0],[square,1,1
,0],[plus,1,1,0],[person,50,30,180],[plus,1,1,0],[a,1,1.5,0],[person,50,30,180],[apple,5,5,5],[left,1,1,0],[a,1,1.5,0],[person,50,30,180],[plus,1,1,0],[equals,1,1,0],[plus,1,1,0],[person,50,30,180],[apple,5,5,5],[n,1,1,0],[right,1,1,0],[left,1,1,0],[down,1,1,0],[equals,1,1,0],[plus,1,1,0],[person,50,30,180],[heart,1,1,0],[person,50,30,180],[person,50,30,180],[heart,1,1,0],[a,1,1.5,0],[dash,1,1,0],[person,50,30,180],[person,50,30,180],[hair,1,0,0],[person,50,30,180],[nail,0.3,0.3,3],[person,50,30,180],[person,50,30,180],[square,1,1,0],[a,1,1.5,0],[person,50,30,180],[apple,5,5,5],[person,50,30,180],[plus,1,1,0],[person,50,30,180],[person,50,30,180],[plus,1,1,0],[person,50,30,180],[down,1,1,0],[equals,1,1,0],[square,1,1,0],[plus,1,1,0],[person,50,30,180],[plus,1,1,0],[a,1,1.5,0],[person,50,30,180],[apple,5,5,5],[left,1,1,0],[a,1,1.5,0],[person,50,30,180],[plus,1,1,0],[equals,1,1,0],[plus,1,1,0],[person,50,30,180],[apple,5,5,5],[n,1,1,0],[right,1,1,0],[left,1,1,0],[down,1,1,0],[equals,1,1,0],[plus,1,1,0],[person,50,30,180],[heart,1,1,0],[person,50,30,180],[person,50,30,180],[heart,1,1,0],[a,1,1.5,0],[dash,1,1,0],[person,50,30,180],[person,50,30,180],[hair,1,0,0],[person,50,30,180],[nail,0.3,0.3,3],[person,50,30,180],[person,50,30,180],[square,1,1,0],[a,1,1.5,0],[person,50,30,180],[apple,5,5,5],[person,50,30,180],[plus,1,1,0],[person,50,30,180],[person,50,30,180],[plus,1,1,0],[person,50,30,180],[down,1,1,0],[equals,1,1,0],[square,1,1,0],[plus,1,1,0],[person,50,30,180],[plus,1,1,0],[a,1,1.5,0],[person,50,30,180],[apple,5,5,5],[left,1,1,0],[a,1,1.5,0],[person,50,30,180],[plus,1,1,0],[equals,1,1,0],[plus,1,1,0],[person,50,30,180],[apple,5,5,5],[n,1,1,0],[right,1,1,0],[left,1,1,0],[down,1,1,0],[equals,1,1,0],[plus,1,1,0],[person,50,30,180],[heart,1,1,0],[person,50,30,180],[person,50,30,180],[heart,1,1,0],[a,1,1.5,0],[dash,1,1,0],[person,50,30,180],[person,50,30,180],[hair,1,0,0],[person,50,30,180],[nail,0.3,0.3,3],[person,50,30,180],[person,50,30,180],[square,1,1,0],[a,1,1.5
,0],[person,50,30,180],[apple,5,5,5],[person,50,30,180],[plus,1,1,0],[person,50,30,180],[person,50,30,180],[plus,1,1,0],[person,50,30,180],[down,1,1,0],[equals,1,1,0],[square,1,1,0],[plus,1,1,0],[person,50,30,180],[plus,1,1,0],[a,1,1.5,0],[person,50,30,180],[apple,5,5,5],[left,1,1,0],[a,1,1.5,0],[person,50,30,180],[plus,1,1,0],[equals,1,1,0],[plus,1,1,0],[person,50,30,180],[apple,5,5,5],[n,1,1,0],[right,1,1,0],[left,1,1,0],[down,1,1,0],[equals,1,1,0],[plus,1,1,0],[person,50,30,180],[heart,1,1,0],[person,50,30,180],[person,50,30,180],[heart,1,1,0],[a,1,1.5,0],[dash,1,1,0],[person,50,30,180],[person,50,30,180],[hair,1,0,0],[person,50,30,180],[nail,0.3,0.3,3],[person,50,30,180],[person,50,30,180],[square,1,1,0]]). + +texttobr2(medicine,[[person,50,30,180],[plus,1,1,0],[person,50,30,180],[person,50,30,180],[heart,1,1,0],[plus,1,1,0],[down,1,1,0],[person,50,30,180],[plus,1,1,0],[person,50,30,180],[dash,1,1,0],[down,1,1,0],[person,50,30,180],[plus,1,1,0],[a,1,1.5,0],[person,50,30,180],[plus,1,1,0],[heart,1,1,0],[plus,1,1,0],[down,1,1,0],[person,50,30,180],[equals,1,1,0],[dash,1,1,0],[down,1,1,0],[person,50,30,180],[plus,1,1,0],[person,50,30,180],[person,50,30,180],[plus,1,1,0],[person,50,30,180],[person,50,30,180],[plus,1,1,0],[person,50,30,180],[person,50,30,180],[heart,1,1,0],[down,1,1,0],[down,1,1,0],[person,50,30,180],[heart,1,1,0],[person,50,30,180],[person,50,30,180],[dash,1,1,0],[down,1,1,0],[person,50,30,180],[plus,1,1,0],[person,50,30,180],[baby,30,50,20],[person,50,30,180],[plus,1,1,0],[down,1,1,0],[person,50,30,180],[plus,1,1,0],[plus,1,1,0],[zero,1,1.5,0],[person,50,30,180],[plus,1,1,0],[person,50,30,180],[person,50,30,180],[heart,1,1,0],[plus,1,1,0],[down,1,1,0],[person,50,30,180],[plus,1,1,0],[person,50,30,180],[dash,1,1,0],[down,1,1,0],[person,50,30,180],[plus,1,1,0],[a,1,1.5,0],[person,50,30,180],[plus,1,1,0],[heart,1,1,0],[plus,1,1,0],[down,1,1,0],[person,50,30,180],[equals,1,1,0],[dash,1,1,0],[down,1,1,0],[person,50,30,180],[plus,1,1,0],[person,50,30,180
],[person,50,30,180],[plus,1,1,0],[person,50,30,180],[person,50,30,180],[plus,1,1,0],[person,50,30,180],[person,50,30,180],[heart,1,1,0],[down,1,1,0],[down,1,1,0],[person,50,30,180],[heart,1,1,0],[person,50,30,180],[person,50,30,180],[dash,1,1,0],[down,1,1,0],[person,50,30,180],[plus,1,1,0],[person,50,30,180],[baby,30,50,20],[person,50,30,180],[plus,1,1,0],[down,1,1,0],[person,50,30,180],[plus,1,1,0],[plus,1,1,0],[zero,1,1.5,0],[person,50,30,180],[plus,1,1,0],[person,50,30,180],[person,50,30,180],[heart,1,1,0],[plus,1,1,0],[down,1,1,0],[person,50,30,180],[plus,1,1,0],[person,50,30,180],[dash,1,1,0],[down,1,1,0],[person,50,30,180],[plus,1,1,0],[a,1,1.5,0],[person,50,30,180],[plus,1,1,0],[heart,1,1,0],[plus,1,1,0],[down,1,1,0],[person,50,30,180],[equals,1,1,0],[dash,1,1,0],[down,1,1,0],[person,50,30,180],[plus,1,1,0],[person,50,30,180],[person,50,30,180],[plus,1,1,0],[person,50,30,180],[person,50,30,180],[plus,1,1,0],[person,50,30,180],[person,50,30,180],[heart,1,1,0],[down,1,1,0],[down,1,1,0],[person,50,30,180],[heart,1,1,0],[person,50,30,180],[person,50,30,180],[dash,1,1,0],[down,1,1,0],[person,50,30,180],[plus,1,1,0],[person,50,30,180],[baby,30,50,20],[person,50,30,180],[plus,1,1,0],[down,1,1,0],[person,50,30,180],[plus,1,1,0],[plus,1,1,0],[zero,1,1.5,0],[person,50,30,180],[plus,1,1,0],[person,50,30,180],[person,50,30,180],[heart,1,1,0],[plus,1,1,0],[down,1,1,0],[person,50,30,180],[plus,1,1,0],[person,50,30,180],[dash,1,1,0],[down,1,1,0],[person,50,30,180],[plus,1,1,0],[a,1,1.5,0],[person,50,30,180],[plus,1,1,0],[heart,1,1,0],[plus,1,1,0],[down,1,1,0],[person,50,30,180],[equals,1,1,0],[dash,1,1,0],[down,1,1,0],[person,50,30,180],[plus,1,1,0],[person,50,30,180],[person,50,30,180],[plus,1,1,0],[person,50,30,180],[person,50,30,180],[plus,1,1,0],[person,50,30,180],[person,50,30,180],[heart,1,1,0],[down,1,1,0],[down,1,1,0],[person,50,30,180],[heart,1,1,0],[person,50,30,180],[person,50,30,180],[dash,1,1,0],[down,1,1,0],[person,50,30,180],[plus,1,1,0],[person,50,30,180],[
baby,30,50,20],[person,50,30,180],[plus,1,1,0],[down,1,1,0],[person,50,30,180],[plus,1,1,0],[plus,1,1,0],[zero,1,1.5,0],[person,50,30,180],[plus,1,1,0],[person,50,30,180],[person,50,30,180],[heart,1,1,0],[plus,1,1,0],[down,1,1,0],[person,50,30,180],[plus,1,1,0],[person,50,30,180],[dash,1,1,0],[down,1,1,0],[person,50,30,180],[plus,1,1,0],[a,1,1.5,0],[person,50,30,180],[plus,1,1,0],[heart,1,1,0],[plus,1,1,0],[down,1,1,0],[person,50,30,180],[equals,1,1,0],[dash,1,1,0],[down,1,1,0],[person,50,30,180],[plus,1,1,0],[person,50,30,180],[person,50,30,180],[plus,1,1,0],[person,50,30,180],[person,50,30,180],[plus,1,1,0],[person,50,30,180]]). + +texttobr2(pedagogy,[[person,50,30,180],[heart,1,1,0],[down,1,1,0],[person,50,30,180],[heart,1,1,0],[person,50,30,180],[person,50,30,180],[heart,1,1,0],[baby,30,50,20],[up,1,1,0],[person,50,30,180],[heart,1,1,0],[down,1,1,0],[person,50,30,180],[heart,1,1,0],[person,50,30,180],[person,50,30,180],[heart,1,1,0],[person,50,30,180],[person,50,30,180],[plus,1,1,0],[down,1,1,0],[person,50,30,180],[heart,1,1,0],[person,50,30,180],[person,50,30,180],[plus,1,1,0],[person,50,30,180],[plus,1,1,0],[a,1,1.5,0],[equals,1,1,0],[plus,1,1,0],[person,50,30,180],[plus,1,1,0],[down,1,1,0],[person,50,30,180],[plus,1,1,0],[down,1,1,0],[person,50,30,180],[plus,1,1,0],[person,50,30,180],[plus,1,1,0],[square,1,1,0],[box,1,1,1],[down,1,1,0],[and,1,1,0],[square,1,1,0],[down,1,1,0],[square,1,1,0],[equals,1,1,0],[person,50,30,180],[zero,1,1.5,0],[a,1,1.5,0],[equals,1,1,0],[plus,1,1,0],[person,50,30,180],[heart,1,1,0],[down,1,1,0],[person,50,30,180],[heart,1,1,0],[person,50,30,180],[person,50,30,180],[heart,1,1,0],[baby,30,50,20],[up,1,1,0],[person,50,30,180],[heart,1,1,0],[down,1,1,0],[person,50,30,180],[heart,1,1,0],[person,50,30,180],[person,50,30,180],[heart,1,1,0],[person,50,30,180],[person,50,30,180],[plus,1,1,0],[down,1,1,0],[person,50,30,180],[heart,1,1,0],[person,50,30,180],[person,50,30,180],[plus,1,1,0],[person,50,30,180],[plus,1,1,0],[a,1,1.5,0],[equals,1,
1,0],[plus,1,1,0],[person,50,30,180],[plus,1,1,0],[down,1,1,0],[person,50,30,180],[plus,1,1,0],[down,1,1,0],[person,50,30,180],[plus,1,1,0],[person,50,30,180],[plus,1,1,0],[square,1,1,0],[box,1,1,1],[down,1,1,0],[and,1,1,0],[square,1,1,0],[down,1,1,0],[square,1,1,0],[equals,1,1,0],[person,50,30,180],[zero,1,1.5,0],[a,1,1.5,0],[equals,1,1,0],[plus,1,1,0],[person,50,30,180],[heart,1,1,0],[down,1,1,0],[person,50,30,180],[heart,1,1,0],[person,50,30,180],[person,50,30,180],[heart,1,1,0],[baby,30,50,20],[up,1,1,0],[person,50,30,180],[heart,1,1,0],[down,1,1,0],[person,50,30,180],[heart,1,1,0],[person,50,30,180],[person,50,30,180],[heart,1,1,0],[person,50,30,180],[person,50,30,180],[plus,1,1,0],[down,1,1,0],[person,50,30,180],[heart,1,1,0],[person,50,30,180],[person,50,30,180],[plus,1,1,0],[person,50,30,180],[plus,1,1,0],[a,1,1.5,0],[equals,1,1,0],[plus,1,1,0],[person,50,30,180],[plus,1,1,0],[down,1,1,0],[person,50,30,180],[plus,1,1,0],[down,1,1,0],[person,50,30,180],[plus,1,1,0],[person,50,30,180],[plus,1,1,0],[square,1,1,0],[box,1,1,1],[down,1,1,0],[and,1,1,0],[square,1,1,0],[down,1,1,0],[square,1,1,0],[equals,1,1,0],[person,50,30,180],[zero,1,1.5,0],[a,1,1.5,0],[equals,1,1,0],[plus,1,1,0],[person,50,30,180],[heart,1,1,0],[down,1,1,0],[person,50,30,180],[heart,1,1,0],[person,50,30,180],[person,50,30,180],[heart,1,1,0],[baby,30,50,20],[up,1,1,0],[person,50,30,180],[heart,1,1,0],[down,1,1,0],[person,50,30,180],[heart,1,1,0],[person,50,30,180],[person,50,30,180],[heart,1,1,0],[person,50,30,180],[person,50,30,180],[plus,1,1,0],[down,1,1,0],[person,50,30,180],[heart,1,1,0],[person,50,30,180],[person,50,30,180],[plus,1,1,0],[person,50,30,180],[plus,1,1,0],[a,1,1.5,0],[equals,1,1,0],[plus,1,1,0],[person,50,30,180],[plus,1,1,0],[down,1,1,0],[person,50,30,180],[plus,1,1,0],[down,1,1,0],[person,50,30,180],[plus,1,1,0],[person,50,30,180],[plus,1,1,0],[square,1,1,0],[box,1,1,1],[down,1,1,0],[and,1,1,0],[square,1,1,0],[down,1,1,0],[square,1,1,0],[equals,1,1,0],[person,50,30,180],[zero
,1,1.5,0],[a,1,1.5,0],[equals,1,1,0],[plus,1,1,0],[person,50,30,180],[heart,1,1,0],[down,1,1,0],[person,50,30,180],[heart,1,1,0],[person,50,30,180],[person,50,30,180],[heart,1,1,0],[baby,30,50,20],[up,1,1,0],[person,50,30,180],[heart,1,1,0],[down,1,1,0],[person,50,30,180],[heart,1,1,0],[person,50,30,180],[person,50,30,180],[heart,1,1,0],[person,50,30,180],[person,50,30,180],[plus,1,1,0],[down,1,1,0],[person,50,30,180],[heart,1,1,0],[person,50,30,180],[person,50,30,180],[plus,1,1,0],[person,50,30,180],[plus,1,1,0],[a,1,1.5,0]]). \ No newline at end of file diff --git a/Text-to-Breasonings-master/texttobrall2_reading.pl b/Text-to-Breasonings-master/texttobrall2_reading.pl new file mode 100644 index 0000000000000000000000000000000000000000..34a0fdbae3f97673b7daa60ab706bd69d0af6944 --- /dev/null +++ b/Text-to-Breasonings-master/texttobrall2_reading.pl @@ -0,0 +1,666 @@ +% ['../Text-to-Breasonings/text_to_breasonings.pl']. +% W is 50*4,texttobr2(u,u,u,u,false,false,false,false,false,false,W) +% where W is the number of words to read +% and there are approximately 4 words per algorithm. + +%% Important: See instructions for using texttobr.pl at https://lucianpedia.wikia.com/wiki/Instructions_for_Using_texttobr(2).pl . + +%% use_module(library(pio)). %% In la_strings +%% use_module(library(dcg/basics)). %% In la_strings + +%% texttobr2 - converts file stream to dimensions of objects represented by the words +%% has object name as separate field for new users of texttobr to verify breasonings by hand +%% brdict1.txt contains word and object name, brdict2.txt contains object name and x, y and z + +%% texttobr2(Runs,File,StringtoBreason,BreasoningLimit). +:- include('../Text-to-Breasonings/mergetexttobrdict.pl'). +%:- include('../listprologinterpreter/la_strings'). + +%% Brth is true or false +texttobr2(N1,Filex1,Stringx1,M1) :- + texttobr2(N1,Filex1,Stringx1,M1,false,false,false,false,false,false,0,[auto,off]). 
+
+texttobr2(N1,Filex1,Stringx1,M1,[auto,Auto]) :-
+	texttobr2(N1,Filex1,Stringx1,M1,false,false,false,false,false,false,0,[auto,Auto]).
+texttobr2(N1,Filex1,Stringx1,M1,Brth,Room,PartOfRoom,Direction,
+	ObjectToPrepare,ObjectToFinish) :-
+	texttobr2(N1,Filex1,Stringx1,M1,Brth,Room,
+	PartOfRoom,Direction,ObjectToPrepare,ObjectToFinish,0,[auto,off]).
+
+texttobr2(N1,Filex1,Stringx1,M1,Brth,Room,PartOfRoom,Direction,
+	ObjectToPrepare,ObjectToFinish,[auto,Auto]) :-
+	texttobr2(N1,Filex1,Stringx1,M1,Brth,Room,
+	PartOfRoom,Direction,ObjectToPrepare,ObjectToFinish,0,[auto,Auto]),!.
+
+% W is the number of words to read (see the header comment), passed
+% through as Words_to_read.
+texttobr2(N1,Filex1,Stringx1,M1,Brth,Room,PartOfRoom,Direction,
+	ObjectToPrepare,ObjectToFinish,W) :-
+	texttobr2(N1,Filex1,Stringx1,M1,Brth,Room,
+	PartOfRoom,Direction,ObjectToPrepare,ObjectToFinish,W,[auto,off]).
+
+texttobr2(N1,Filex1,Stringx1,M1,Brth,Room,PartOfRoom,Direction,ObjectToPrepare,ObjectToFinish,Words_to_read,[auto,Auto]) :-
+	retractall(auto(_)),
+	assertz(auto(Auto)),
+
+(Stringx1=""->true;
+(	retractall(complete_display(_)),
+	assertz(complete_display(false)),
+
+	retractall(words_to_read(_)),
+	assertz(words_to_read(Words_to_read)),
+
+	((number(N1),N=N1)->true;
+	(N1=u,N=1)),
+
+	((Filex1=u,Filex="../Text-to-Breasonings/file.txt")->true;
+	Filex=Filex1),
+
+	((number(M1),M=M1)->true;
+	M=all), %% If M1 is undefined or all then M=all
+
+	prep(List1,BrDict03,BrDict03t,Filex,Stringx1,M,Brth,BrthDict03,Room,RoomDict03,PartOfRoom,PartOfRoomDict03,Direction,DirectionDict03,ObjectToPrepare,ObjectToPrepareDict03,ObjectToFinish,ObjectToFinishDict03),
+
+	retractall(n(_)),
+	assertz(n(N)),
+
+	retractall(brDict03(_)),
+	assertz(brDict03(BrDict03)),
+
+	retractall(brDict03t(_)),
+	assertz(brDict03t(BrDict03t)),
+
+br2(List1,N),%,BrDict03,BrDict2,BrDict03t,BrDict03t2,N,Brth,BrthDict03,BrthDict04,Room,RoomDict03,RoomDict04,PartOfRoom,PartOfRoomDict03,PartOfRoomDict04,Direction,DirectionDict03,DirectionDict04,ObjectToPrepare,ObjectToPrepareDict03,ObjectToPrepareDict04,ObjectToFinish,ObjectToFinishDict03,ObjectToFinishDict04), + +brDict031(BrDict2), +brDict03t1(BrDict03t2), + sort(BrDict2,BrDict3), + (BrDict03=BrDict3->true; + (open_s("../Text-to-Breasonings/brdict1.txt",write,Stream), +%% string_codes(BrDict3), + write(Stream,BrDict3), + close(Stream))), + + sort(BrDict03t2,BrDict03t3), + (BrDict03t=BrDict03t3->true; + (open_s("../Text-to-Breasonings/brdict2.txt",write,Stream2), +%% string_codes(BrDict3), + write(Stream2,BrDict03t3), + close(Stream2))), + + /* + ((Brth=true, + sort(BrthDict04,BrthDict044), + (BrthDict03=BrthDict044->true; + (open_s("../Text-to-Breasonings/brthdict.txt",write,Stream3), +%% string_codes(BrDict3), + write(Stream3,BrthDict044), + close(Stream3))))->true;true), + + ((Room=true, + sort(RoomDict04,RoomDict044), + (RoomDict04=RoomDict044->true; + (open_s("../Text-to-Breasonings/roomdict.txt",write,Stream4), +%% string_codes(BrDict3), + write(Stream4,RoomDict044), + close(Stream4))))->true;true), + + ((PartOfRoom=true, + sort(PartOfRoomDict04,PartOfRoomDict044), + (PartOfRoomDict04=PartOfRoomDict044->true; + (open_s("../Text-to-Breasonings/partofroomdict.txt",write,Stream5), +%% string_codes(BrDict3), + write(Stream5,PartOfRoomDict044), + close(Stream5))))->true;true), + + ((Direction=true, + sort(DirectionDict04,DirectionDict044), + (DirectionDict04=DirectionDict044->true; + (open_s("../Text-to-Breasonings/directiondict.txt",write,Stream6), +%% string_codes(BrDict3), + write(Stream6,DirectionDict044), + close(Stream6))))->true;true), + + ((ObjectToPrepare=true, + sort(ObjectToPrepareDict04,ObjectToPrepareDict044), + (ObjectToPrepareDict04=ObjectToPrepareDict044->true; + (open_s("../Text-to-Breasonings/objecttopreparedict.txt",write,Stream7), +%% 
string_codes(BrDict3), + write(Stream7,ObjectToPrepareDict044), + close(Stream7))))->true;true), + + ((ObjectToFinish=true, + sort(ObjectToFinishDict04,ObjectToFinishDict044), + (ObjectToFinishDict04=ObjectToFinishDict044->true; + (open_s("../Text-to-Breasonings/objecttofinishdict.txt",write,Stream8), +%% string_codes(BrDict3), + write(Stream8,ObjectToFinishDict044), + close(Stream8))))->true;true), + */ + + length(List1,List1_length_a), + Dividend_a is ceiling(List1_length_a/250), + Dividend_b is Dividend_a*3, % for graciously giving + texttobr2_a(Dividend_b,meditation), + texttobr2_a(Dividend_b,medicine), + texttobr2_a(Dividend_b,pedagogy) + )), + + !. + + +replace0(Input,Find,Replace,SepandPad,M,Output0) :- + replace00(Input,Find,Replace,SepandPad,[],Output1), + truncate(Output1,M,Output0),!. +replace00([],_Find,_Replace,_SepandPad,Input,Input) :- !. +replace00(Input1,Find,Replace,SepandPad,Input2,Input3) :- + Input1=[Input4|Input5], + string_codes(Input4,String1), + replace1(String1,Find,Replace,[],Input7), + string_codes(Output,Input7), + split_string(Output,SepandPad,SepandPad,Input8), + append(Input2,Input8,Input9), + replace00(Input5,Find,Replace,SepandPad,Input9,Input3), !. +replace1([],_Find,_Replace,Input,Input) :- !. +replace1(Input1,Find,Replace,Input2,Input3) :- + Input1=[Input4|Input5], + member(Input4,Find), + append(Input2,[Replace],Input6), + replace1(Input5,Find,Replace,Input6,Input3), !. +replace1(Input1,Find,Replace,Input2,Input3) :- + Input1=[Input4|Input5], + not(member(Input4,Find)), + append(Input2,[Input4],Input6), + replace1(Input5,Find,Replace,Input6,Input3), !. + +split_string_onnonletter(String00,List1) :- + string_codes(String00,String1), + split_string_onnonletter(String1,[],List0), + string_codes(List0,List2), + split_string(List2," "," ",List1),!. +split_string_onnonletter([],Input,Input) :- !. 
+split_string_onnonletter(Input1,Input2,Input3) :-
+	Input1=[Input4|Input5],
+	not(char_type(Input4,alpha)),
+	append(Input2,[32],Input6),
+	split_string_onnonletter(Input5,Input6,Input3), !.
+split_string_onnonletter(Input1,Input2,Input3) :-
+	Input1=[Input4|Input5],
+	char_type(Input4,alpha),
+	append(Input2,[Input4],Input6),
+	split_string_onnonletter(Input5,Input6,Input3), !.
+
+%% Truncates List1 to its first M items when M is a number and M is less than or equal to the length of List1; otherwise returns List1 unchanged
+truncate(List1,M,String0) :-
+	((number(M),length(String0,M),
+	append(String0,_,List1))->true;
+	String0=List1),!.
+
+
+
+prep(List,BrDict03,BrDict03t,Filex,Stringx1,M,Brth,BrthDict03,Room,RoomDict03,PartOfRoom,PartOfRoomDict03,Direction,DirectionDict03,ObjectToPrepare,ObjectToPrepareDict03,ObjectToFinish,ObjectToFinishDict03) :-
+
+concurrent(2,[(
+	phrase_from_file_s(string(BrDict0), "../Text-to-Breasonings/brdict1.txt"),
+	%%Chars="’",
+	SepandPad="&#@~%`$?-+*^,()|.:;=_/[]<>{}\n\r\s\t\\\"!'0123456789",
+	%%split_string(BrDict0,SepandPad,SepandPad,BrDict01),
+%%writeln([brDict0,BrDict0]),
+%%writeln([brdict1]),
+	splitfurther(BrDict0,BrDict01),
+%%writeln([brDict01,BrDict01]),
+	%%char_code(Escape,27),
+	%%delete(BrDict01,[Escape,_,_,_,_],BrDict021),
+%%writeln([brDict021,BrDict021]),
+	%%char_code(Apostrophe,8217),
+	%%delete(BrDict021,[Apostrophe,_,_,_,_],BrDict02),
+%%writeln([brDict02,BrDict02]),
+	sort(BrDict01,BrDict03),
+%%writeln([brDict03,BrDict03]),
+	length(BrDict03,Length0),write("Number of words in dictionary: "), writeln(Length0)
+	),(
+	%%writeln(''),
+	%%writeln([brdict2]),
+	phrase_from_file_s(string(BrDict0t), "../Text-to-Breasonings/brdict2.txt"),
+	%%Chars="’",
+	%%split_string(BrDict0,SepandPad,SepandPad,BrDict01),
+%%writeln([brDict0,BrDict0]),
+	splitfurthert(BrDict0t,BrDict01t),
+%%writeln([brDict01,BrDict01]),
+	%%delete(BrDict01t,[Escape,_,_,_,_],BrDict021t),
+%%writeln([brDict021,BrDict021]),
+	%%delete(BrDict021t,[Apostrophe,_,_,_,_],BrDict02t),
+%%writeln([brDict02,BrDict02]), + sort(BrDict01t,BrDict03t), + +% br_freq %B=BrDict03t,A=BrDict03,findall([DL,C,"\n"],(member([C,_,_,_],B),findall(_,member([_,C],A),D),length(D,DL)),E),sort(E,F),reverse(F,G),writeln([br_freq,G]), + +%%writeln([brDict03,BrDict03]), + length(BrDict03t,Length0t),write("Number of unique breasonings in dictionary: "), writeln(Length0t) + )],[]), + + ((Stringx1=u, + phrase_from_file_s(string(String001), Filex))->true; + String001=Stringx1), + + process_t2b(String001,String00), + + split_string(String00,SepandPad,SepandPad,List1), + %%split_string_onnonletter(String00,List1), + + truncate(List1,M,List), + + /**replace0(String0,[8221, 8220], 34, SepandPad, M, String1), + replace0(String1,[8216, 8217], 39, SepandPad, M, String2), + replace0(String2,[8259, 8211, 8212], 45, SepandPad, M, String3), + replace0(String3,[160], 32, SepandPad, M, List), + **/ + +%%atom_codes(Atom999,String),writeln([atom999,Atom999]), + +%%writeln([list,List]), + %%delete(List,Escape,List11), +%%writeln([list11,List11]), + %%delete(List11,Apostrophe,List1), +%%writeln([list1,List1]), + length(List,Length1),write("Number of words to breason out in file: "), writeln(Length1), + sort(List,List2), +%%writeln([list2,List2]), + length(List2,Length2),write("Number of unique words in file: "), writeln(Length2), + + + + ((Brth=true, + phrase_from_file_s(string(BrthDict0), "../Text-to-Breasonings/brthdict.txt"), splitfurther(BrthDict0,BrthDict01), + sort(BrthDict01,BrthDict03), + length(BrthDict03,BrthLength0),write("Number of unique breathsonings in dictionary: "), writeln(BrthLength0))->true;true), + + + ((Room=true, + phrase_from_file_s(string(RoomDict0), "../Text-to-Breasonings/roomdict.txt"), splitfurther(RoomDict0,RoomDict01), + sort(RoomDict01,RoomDict03), + length(RoomDict03,RoomLength0),write("Number of unique rooms in dictionary: "), writeln(RoomLength0))->true;true), + + ((PartOfRoom=true, + phrase_from_file_s(string(PartOfRoomDict0), 
"../Text-to-Breasonings/partofroomdict.txt"), splitfurther(PartOfRoomDict0,PartOfRoomDict01), + sort(PartOfRoomDict01,PartOfRoomDict03), + length(PartOfRoomDict03,PartOfRoomLength0),write("Number of unique parts of rooms in dictionary: "), writeln(PartOfRoomLength0))->true;true), + + ((Direction=true, + phrase_from_file_s(string(DirectionDict0), "../Text-to-Breasonings/directiondict.txt"), splitfurther(DirectionDict0,DirectionDict01), + sort(DirectionDict01,DirectionDict03), + length(DirectionDict03,DirectionLength0),write("Number of unique directions in dictionary: "), writeln(DirectionLength0))->true;true), + + ((ObjectToPrepare=true, + phrase_from_file_s(string(ObjectToPrepareDict0), "../Text-to-Breasonings/objecttopreparedict.txt"), splitfurther(ObjectToPrepareDict0,ObjectToPrepareDict01), + sort(ObjectToPrepareDict01,ObjectToPrepareDict03), + length(ObjectToPrepareDict03,ObjectToPrepareLength0),write("Number of unique objects to prepare in dictionary: "), writeln(ObjectToPrepareLength0))->true;true), + + ((ObjectToFinish=true, + phrase_from_file_s(string(ObjectToFinishDict0), "../Text-to-Breasonings/objecttofinishdict.txt"), splitfurther(ObjectToFinishDict0,ObjectToFinishDict01), + sort(ObjectToFinishDict01,ObjectToFinishDict03), + length(ObjectToFinishDict03,ObjectToFinishLength0),write("Number of unique objects to finish in dictionary: "), writeln(ObjectToFinishLength0))->true;true), + +(complete_display(true)-> + ((Stringx1=u, %% Use file, not string as input. 
+ + %%maplist(downcase_atom, List2, List3), + maplist(string_lower, List2, List3), + +%%writeln([list3,List3]), + towords3(BrDict03,[],BrDict04,[],_ObjectNames,[],AllUsedNames), + towords2(BrDict03t,[],BrDict04t), + +%%writeln([brDict04,BrDict04]), + subtract(List3,BrDict04,D1), +%%writeln([list3,brDict04,d1,List3,BrDict04,D1]), +%%writeln(["subtract(BrDict04,List3,D1).",List3,BrDict04,D1]), + length(D1,Length01),Difference is abs(Length01),write("Number of words remaining to define: "), writeln(Difference), + + subtract(AllUsedNames,BrDict04t,D2), + %%delete(D21,'',D2), + length(D2,Length01t),Differencet is abs(Length01t),write("Number of undefined breasonings: "), writeln(Differencet), + %% writeln([undefinedbreasonings,D2]), %% Print undefined breasonings + + %%delete(D31,'',D3), + subtract(BrDict04t,AllUsedNames,D3), + length(D3,Length01t2),Differencet2 is abs(Length01t2),write("Number of orphaned breasonings: "), writeln(Differencet2), + + %%,writeln([orphanedbreasonings,D3]) %% Print orphaned breasonings + + ((Brth=true, + + %%towords3(BrDict03,[],BrDict04,[],_ObjectNames,[],AllUsedNames), + towords2a(BrthDict03,[],BrtDict04t), + + %%subtract(List3,BrtDict04t,Dt1), + %%length(Dt1,Lengtht01),Differencet1 is abs(Lengtht01),write("Number of words remaining to define breathsonings for: "), writeln(Differencet1), + %%writeln(["Number of words remaining to define breathsonings for",Dt1]), %% Print number of words remaining to define breathsonings for + + subtract(AllUsedNames,BrtDict04t,Dt2), + length(Dt2,Lengtht01t),Differencet22 is abs(Lengtht01t),write("Number of undefined breathsonings: "), writeln(Differencet22), + %%writeln([undefinedbreathsonings,Dt2]), %% Print undefined breathsonings + + subtract(BrtDict04t,AllUsedNames,Dt3), + length(Dt3,Lengtht01t2),Differencet3 is abs(Lengtht01t2),write("Number of orphaned breathsonings: "), writeln(Differencet3) + + %%writeln([orphanedbreathsonings,Dt3]) %% Print orphaned breathsonings + + )->true;true), + + 
((Room=true, + towords2a(RoomDict03,[],RoomDict04t), + subtract(AllUsedNames,RoomDict04t,RoomDt2), + length(RoomDt2,RoomLengtht01t),RoomDifferencet22 is abs(RoomLengtht01t),write("Number of undefined rooms: "), writeln(RoomDifferencet22), + %%writeln([undefinedrooms,RoomDt2]), %% Print undefined rooms + subtract(RoomDict04t,AllUsedNames,RoomDt3), + length(RoomDt3,RoomLengtht01t2),RoomDifferencet3 is abs(RoomLengtht01t2),write("Number of orphaned rooms: "), writeln(RoomDifferencet3) + %%writeln([orphanedrooms,RoomDt3]) %% Print orphaned rooms + )->true;true), + + ((PartOfRoom=true, + towords2a(PartOfRoomDict03,[],PartOfRoomDict04t), + subtract(AllUsedNames,PartOfRoomDict04t,PartOfRoomDt2), + length(PartOfRoomDt2,PartOfRoomLengtht01t),PartOfRoomDifferencet22 is abs(PartOfRoomLengtht01t),write("Number of undefined parts of rooms: "), writeln(PartOfRoomDifferencet22), + %%writeln([undefinedPartsOfRooms,PartOfRoomDt2]), %% Print undefined PartsOfRooms + subtract(PartOfRoomDict04t,AllUsedNames,PartOfRoomDt3), + length(PartOfRoomDt3,PartOfRoomLengtht01t2),PartOfRoomDifferencet3 is abs(PartOfRoomLengtht01t2),write("Number of orphaned parts of rooms: "), writeln(PartOfRoomDifferencet3) + %%writeln([orphanedPartsOfRooms,PartOfRoomDt3]) %% Print orphaned PartsOfRooms + )->true;true), + + ((Direction=true, + towords2a(DirectionDict03,[],DirectionDict04t), + subtract(AllUsedNames,DirectionDict04t,DirectionDt2), + length(DirectionDt2,DirectionLengtht01t),DirectionDifferencet22 is abs(DirectionLengtht01t),write("Number of undefined directions: "), writeln(DirectionDifferencet22), + %%writeln([undefinedDirections,DirectionDt2]), %% Print undefined Directions + subtract(DirectionDict04t,AllUsedNames,DirectionDt3), + length(DirectionDt3,DirectionLengtht01t2),DirectionDifferencet3 is abs(DirectionLengtht01t2),write("Number of orphaned directions: "), writeln(DirectionDifferencet3) + %%writeln([orphanedDirections,DirectionDt3]) %% Print orphaned Directions + )->true;true), + + 
((ObjectToPrepare=true, + towords2a(ObjectToPrepareDict03,[],ObjectToPrepareDict04t), + subtract(AllUsedNames,ObjectToPrepareDict04t,ObjectToPrepareDt2), + length(ObjectToPrepareDt2,ObjectToPrepareLengtht01t),ObjectToPrepareDifferencet22 is abs(ObjectToPrepareLengtht01t),write("Number of undefined objects to prepare: "), writeln(ObjectToPrepareDifferencet22), + %%writeln([undefinedObjectsToPrepare,ObjectToPrepareDt2]), %% Print undefined ObjectsToPrepare + subtract(ObjectToPrepareDict04t,AllUsedNames,ObjectToPrepareDt3), + length(ObjectToPrepareDt3,ObjectToPrepareLengtht01t2),ObjectToPrepareDifferencet3 is abs(ObjectToPrepareLengtht01t2),write("Number of orphaned objects to prepare: "), writeln(ObjectToPrepareDifferencet3) + %%writeln([orphanedObjectsToPrepare,ObjectToPrepareDt3]) %% Print orphaned ObjectsToPrepare + )->true;true), + + ((ObjectToFinish=true, + towords2a(ObjectToFinishDict03,[],ObjectToFinishDict04t), + subtract(AllUsedNames,ObjectToFinishDict04t,ObjectToFinishDt2), + length(ObjectToFinishDt2,ObjectToFinishLengtht01t),ObjectToFinishDifferencet22 is abs(ObjectToFinishLengtht01t),write("Number of undefined objects to finish: "), writeln(ObjectToFinishDifferencet22), + %%writeln([undefinedObjectsToFinish,ObjectToFinishDt2]), %% Print undefined ObjectsToFinish + subtract(ObjectToFinishDict04t,AllUsedNames,ObjectToFinishDt3), + length(ObjectToFinishDt3,ObjectToFinishLengtht01t2),ObjectToFinishDifferencet3 is abs(ObjectToFinishLengtht01t2),write("Number of orphaned objects to finish: "), writeln(ObjectToFinishDifferencet3) + %%writeln([orphanedObjectsToFinish,ObjectToFinishDt3]) %% Print orphaned ObjectsToFinish + )->true;true) + + + +)->true;(string(Filex),writeln("Number of words, unique words, unique breathsonings, words remaining to define, undefined breasonings, orphaned breasonings, undefined breathsonings and orphaned breathsonings skipped for speed when breasoning out a string.")));true) + +,!. + +%n2(N) :-n(N). 
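The dictionary checks above all follow one bookkeeping pattern per category (breasonings, breathsonings, rooms, parts of rooms, directions, objects): subtract the defined names from the names used in the text to count "undefined" entries, and subtract in the other direction to count "orphaned" entries. A minimal Python sketch of that set-difference reporting (the function and variable names are illustrative, not from this repository):

```python
def report_differences(used_names, defined_names, label):
    """Mirror the subtract/length pattern used for each dictionary check."""
    # Used in the text but missing from the dictionary ("undefined").
    undefined = [w for w in used_names if w not in defined_names]
    # Defined in the dictionary but never used in the text ("orphaned").
    orphaned = [w for w in defined_names if w not in used_names]
    print(f"Number of undefined {label}: {len(undefined)}")
    print(f"Number of orphaned {label}: {len(orphaned)}")
    return undefined, orphaned
```

For example, `report_differences(["table", "chair"], ["chair", "desk"], "rooms")` reports one undefined and one orphaned entry, analogous to the paired `subtract/3` plus `length/2` calls above.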
+brDict031(BrDict2) :- brDict03(BrDict2). +brDict03t1(BrDict03t2) :- brDict03t(BrDict03t2). + +br2(List1,N) :- +%br2(_,A,A,B,B,0,_Brth,BrthDict03,BrthDict03,_Room,RoomDict03,RoomDict03,_PartOfRoom,PartOfRoomDict03,PartOfRoomDict03,_Direction,DirectionDict03,DirectionDict03,_ObjectToPrepare,ObjectToPrepareDict03,ObjectToPrepareDict03,_ObjectToFinish,ObjectToFinishDict03,ObjectToFinishDict03) :- !. +%br2(List1,BrDict03,BrDict2,BrDict03t,BrDict03t2,N1,Brth,BrthDict03,BrthDict04,Room,RoomDict03,RoomDict04,PartOfRoom,PartOfRoomDict03,PartOfRoomDict04,Direction,DirectionDict03,DirectionDict04,ObjectToPrepare,ObjectToPrepareDict03,ObjectToPrepareDict04,ObjectToFinish,ObjectToFinishDict03,ObjectToFinishDict04) :- + + length(NL,N), + findall(_,(member(_,NL), + (auto(on)-> + concurrent_maplist(br,List1,_); + maplist(br,List1,_))),_),!. + %*** + +%br(List1,BrDict03,BrDict21,BrDict03t,BrDict03t21,Brth,BrthDict03,BrthDict041,Room,RoomDict03,RoomDict041,PartOfRoom,PartOfRoomDict03,PartOfRoomDict041,Direction,DirectionDict03,DirectionDict041,ObjectToPrepare,ObjectToPrepareDict03,ObjectToPrepareDict041,ObjectToFinish,ObjectToFinishDict03,ObjectToFinishDict041), + %N2 is N1-1, + %br2(List1,BrDict21,BrDict2,BrDict03t21,BrDict03t2,N2,Brth,BrthDict041,BrthDict04,Room,RoomDict041,RoomDict04,PartOfRoom,PartOfRoomDict041,PartOfRoomDict04,Direction,DirectionDict041,DirectionDict04,ObjectToPrepare,ObjectToPrepareDict041,ObjectToPrepareDict04,ObjectToFinish,ObjectToFinishDict041,ObjectToFinishDict04),!. + +towords2([],A,A) :- !. +towords2(BrDict03,A,B) :- + BrDict03=[[Word,_,_,_]|Rest], + %%atom_string(Atom,Word), + append(A,[Word],C), + towords2(Rest,C,B). + +towords2a([],A,A) :- !. +towords2a(BrDict03,A,B) :- + BrDict03=[[Word,_]|Rest], + %%atom_string(Atom,Word), + append(A,[Word],C), + towords2a(Rest,C,B). + +towords3([],A,A,C,C,D,D) :- !. 
+towords3(BrDict03,A,B,D,E,G,H) :- + BrDict03=[[Word1,Word2]|Rest], + (Word2=""->append(G,[Word1],I)->true; + append(G,[Word2],I)), + append(A,[Word1],C), + append(D,[Word2],F), + towords3(Rest,C,B,F,E,I,H). + +string(String) --> list(String). + +list([]) --> []. +list([L|Ls]) --> [L], list(Ls). + +splitfurther(BrDict01,N) :- + phrase(file0(N),BrDict01). + +file0(N) --> "[", file(N), "]", !. +file0([]) --> []. + +%%file([]) --> []. +file([L|Ls]) --> entry(L),",", +%%{writeln(L)}, %%*** +file(Ls), !. %% file(Ls),{M=[Ls]})), !. %%, {writeln(["l",L])},",", file(Ls), {writeln(["ls",Ls])},!. %%, {append(L,Ls,M)}, !. +file([L]) --> entry(L), +%%{writeln(L)}, +!. %%(entry(L),{M=L});{M=[],(writeln("Warning - Entry in incorrect format.") +%%,abort +%%)}, !. + +entry([Word2,Word4]) --> + "[", word(Word), {string_codes(Word2,Word),string(Word2)}, + ",", + word(Word3), {string_codes(Word4,Word3),string(Word4)}, + "]". + +splitfurthert(BrDict01,N) :- + phrase(file0t(N),BrDict01). + +file0t(N) --> "[", filet(N), "]", !. +file0t([]) --> []. + +%%file([]) --> []. +filet([L|Ls]) --> entryt(L),",", +%%{writeln(L)}, %%*** +filet(Ls), !. %% file(Ls),{M=[Ls]})), !. %%, {writeln(["l",L])},",", file(Ls), {writeln(["ls",Ls])},!. %%, {append(L,Ls,M)}, !. +filet([L]) --> entryt(L), +%%{writeln(L)}, +!. %%(entry(L),{M=L});{M=[],(writeln("Warning - Entry in incorrect format.") +%%,abort +%%)}, !. + +entryt([Word2,X3,Y3,Z3]) --> + "[", word(Word), {string_codes(Word2,Word),string(Word2)}, + ",", + digits(X),",",{atom_codes(X2,X),atom_number(X2,X3),number(X3)}, + digits(Y),",",{atom_codes(Y2,Y),atom_number(Y2,Y3),number(Y3)}, + digits(Z),{atom_codes(Z2,Z),atom_number(Z2,Z3),number(Z3)}, + "]". + +word([X|Xs]) --> [X], {char_type(X,csymf)->true;(X=27->true;X=8217)}, word(Xs), !. +%%word([X]) --> [X], {char_type(X,csymf);(X=27;X=8217)}, !. +word([]) --> []. + +digits([X|Xs]) --> [X], {(char_type(X,digit)->true;(string_codes(Word2,[X]),Word2="."))}, digits(Xs), !. 
+%%digits([X]) --> [X], {(char_type(X,digit);(string_codes(Word2,[X]),Word2="."))}, !. +digits([]) --> []. + +br(Word,_) :- +%[],B,B,C,C,_,D,D,_Room,RoomDict03,RoomDict03,_PartOfRoom,PartOfRoomDict03,PartOfRoomDict03,_Direction,DirectionDict03,DirectionDict03,_ObjectToPrepare,ObjectToPrepareDict03,ObjectToPrepareDict03,_ObjectToFinish,ObjectToFinishDict03,ObjectToFinishDict03) :- + %!. +%br([Word|Words],BrDict,BrDict2,BrDict4,BrDict5,Brth,BrthDict03,BrthDict04,Room,RoomDict03,RoomDict04,PartOfRoom,PartOfRoomDict03,PartOfRoomDict04,Direction,DirectionDict03,DirectionDict04,ObjectToPrepare,ObjectToPrepareDict03,ObjectToPrepareDict04,ObjectToFinish,ObjectToFinishDict03,ObjectToFinishDict04) :- + downcase_atom(Word, Word2), atom_string(Word2,Word3), + + /* + words_to_read1(WR1), + (WR1>0->(writeln(WR1),write(Word), + texttobr2(3),nl,sleep(0.12), + WR2 is WR1-1, + retractall(words_to_read(_)), + assertz(words_to_read(WR2))); + true), + */ + + /**member([Word3,X,Y,Z],BrDict4) -> %% This feature is a bug because words in brdict2 shouldn't necessarily be the words in brdict1 + %%(append(BrDict,[[Word3,""]],BrDict3), BrDict3t=BrDict4, + %%br(Words,BrDict3,BrDict2,BrDict3t,BrDict5)) + %%; + %%(**/ + + %%(member([Word3,X,Y,Z],BrDict4) -> %% This feature is a bug because words in brdict1 should correspond to those in brdict2 + %%(atom_concat("The breasoning for ", Word3, P1), + %%atom_concat(P1, " is defined. 
Enter object name (without spaces), if different for ", Prompt)); + %Prompt="Enter object name (without spaces), if different for "), + + %%writeln([word3,Word3]), + brDict031(BrDict), + + (member([Word3,String4],BrDict)-> + BrDict3=BrDict; + ((repeat, + write("Enter object name (without spaces), if different for "), writeln(Word3),read_string1("box",user_input, "\n", "\r", _End2, String2),split_string(String2, "", " ", String3),String3=[String4]), + append(BrDict,[[Word3,String4]],BrDict3), + texttobr(1,u,String4,1))), + %%*brth(Word3,_Brth), + +(String4=""->String5=Word3;String5=String4), + + downcase_atom(String5, String52), atom_string(String52,String53), + +brDict03t1(BrDict4), + + (member([String53,_X,_Y,_Z],BrDict4)-> + BrDict3t1=BrDict4; + ((repeat, + write("Enter x, y and z in cm for "), writeln(String53),read_string1("1,1,1",user_input, "\n", "\r", _End, String),split_string(String, ",", " ", Values),Values=[X1,Y1,Z1],number_string(X,X1),number_string(Y,Y1),number_string(Z,Z1)), + append(BrDict4,[[String53,X,Y,Z]],BrDict3t1))), + %%*brth(String53,_Brth2), + %%write("br(\'"),write(Word3),writeln("\',)."), +%% writeln([Word3,X,Y,Z]), + %%write(' '), + + /* + (Brth=true,(member([String53,_Breathsoning],BrthDict03)-> + BrthDict3=BrthDict03; + ((repeat, + write("Enter human judgement (without spaces), if different for "), writeln(String53),read_string1("good",user_input, "\n", "\r", _End2, Stringth2),split_string(Stringth2, "", " ", Stringth3),Stringth3=[Stringth4]),texttobr(1,u,Stringth4,1), + append(BrthDict03,[[String53,Stringth4]],BrthDict3)))->true;true), + + (Room=true,(member([String53,_Room],RoomDict03)-> + RoomDict3=RoomDict03; + ((repeat, + write("Enter room (without spaces), if different for "), writeln(String53),read_string1("living",user_input, "\n", "\r", _RoomEnd2, RoomStringth2),split_string(RoomStringth2, "", " ", RoomStringth3),RoomStringth3=[RoomStringth4]),texttobr(1,u,RoomStringth4,1), + 
append(RoomDict03,[[String53,RoomStringth4]],RoomDict3)))->true;true), + + + (Room=true,(member([RoomStringth4,_X,_Y,_Z],BrDict3t1)-> + BrDict3t2=BrDict3t1; + ((repeat, + write("Enter x, y and z in cm for "), writeln(RoomStringth4),read_string1("500,400,300",user_input, "\n", "\r", _End, RoomString),split_string(RoomString, ",", " ", RoomValues),RoomValues=[RoomX1,RoomY1,RoomZ1],number_string(RoomX,RoomX1),number_string(RoomY,RoomY1),number_string(RoomZ,RoomZ1)), + append(BrDict3t1,[[RoomStringth4,RoomX,RoomY,RoomZ]],BrDict3t2)))->true;BrDict3t2=BrDict3t1), + + +(PartOfRoom=true,(member([String53,_PartOfRoom],PartOfRoomDict03)-> + PartOfRoomDict3=PartOfRoomDict03; + ((repeat, + write("Enter part of room (without spaces), if different for "), writeln(String53),read_string1("corner",user_input, "\n", "\r", _PartOfRoomEnd2, PartOfRoomStringth2),split_string(PartOfRoomStringth2, "", " ", PartOfRoomStringth3),PartOfRoomStringth3=[PartOfRoomStringth4]),texttobr(1,u,PartOfRoomStringth4,1), + append(PartOfRoomDict03,[[String53,PartOfRoomStringth4]],PartOfRoomDict3)))->true;true), + + + (PartOfRoom=true,(member([PartOfRoomStringth4,_X,_Y,_Z],BrDict3t2)-> + BrDict3t3=BrDict3t2; + ((repeat, + write("Enter x, y and z in cm for "), writeln(PartOfRoomStringth4),read_string1("100,100,300",user_input, "\n", "\r", _End, PartOfRoomString),split_string(PartOfRoomString, ",", " ", PartOfRoomValues),PartOfRoomValues=[PartOfRoomX1,PartOfRoomY1,PartOfRoomZ1],number_string(PartOfRoomX,PartOfRoomX1),number_string(PartOfRoomY,PartOfRoomY1),number_string(PartOfRoomZ,PartOfRoomZ1)), + append(BrDict3t2,[[PartOfRoomStringth4,PartOfRoomX,PartOfRoomY,PartOfRoomZ]],BrDict3t3)))->true;BrDict3t3=BrDict3t2), + +(Direction=true,(member([String53,_Direction],DirectionDict03)-> + DirectionDict3=DirectionDict03; + ((repeat, + write("Enter direction (without spaces), if different for "), writeln(String53),read_string1("table",user_input, "\n", "\r", _DirectionEnd2, 
DirectionStringth2),split_string(DirectionStringth2, "", " ", DirectionStringth3),DirectionStringth3=[DirectionStringth4]),texttobr(1,u,DirectionStringth4,1), + append(DirectionDict03,[[String53,DirectionStringth4]],DirectionDict3)))->true;true), + + + (Direction=true,(member([DirectionStringth4,_X,_Y,_Z],BrDict3t3)-> + BrDict3t4=BrDict3t3; + ((repeat, + write("Enter x, y and z in cm for "), writeln(DirectionStringth4),read_string1("100,100,100",user_input, "\n", "\r", _End, DirectionString),split_string(DirectionString, ",", " ", DirectionValues),DirectionValues=[DirectionX1,DirectionY1,DirectionZ1],number_string(DirectionX,DirectionX1),number_string(DirectionY,DirectionY1),number_string(DirectionZ,DirectionZ1)), + append(BrDict3t3,[[DirectionStringth4,DirectionX,DirectionY,DirectionZ]],BrDict3t4)))->true;BrDict3t4=BrDict3t3), + + +(ObjectToPrepare=true,(member([String53,_ObjectToPrepare],ObjectToPrepareDict03)-> + ObjectToPrepareDict3=ObjectToPrepareDict03; + ((repeat, + write("Enter object to prepare (without spaces), if different for "), writeln(String53),read_string1("measure",user_input, "\n", "\r", _ObjectToPrepareEnd2, ObjectToPrepareStringth2),split_string(ObjectToPrepareStringth2, "", " ", ObjectToPrepareStringth3),ObjectToPrepareStringth3=[ObjectToPrepareStringth4]),texttobr(1,u,ObjectToPrepareStringth4,1), + append(ObjectToPrepareDict03,[[String53,ObjectToPrepareStringth4]],ObjectToPrepareDict3)))->true;true), + + + (ObjectToPrepare=true,(member([ObjectToPrepareStringth4,_X,_Y,_Z],BrDict3t4)-> + BrDict3t5=BrDict3t4; + ((repeat, + write("Enter x, y and z in cm for "), writeln(ObjectToPrepareStringth4),read_string1("30,1,0.2",user_input, "\n", "\r", _End, ObjectToPrepareString),split_string(ObjectToPrepareString, ",", " ", 
ObjectToPrepareValues),ObjectToPrepareValues=[ObjectToPrepareX1,ObjectToPrepareY1,ObjectToPrepareZ1],number_string(ObjectToPrepareX,ObjectToPrepareX1),number_string(ObjectToPrepareY,ObjectToPrepareY1),number_string(ObjectToPrepareZ,ObjectToPrepareZ1)), + append(BrDict3t4,[[ObjectToPrepareStringth4,ObjectToPrepareX,ObjectToPrepareY,ObjectToPrepareZ]],BrDict3t5)))->true;BrDict3t5=BrDict3t4), + + +(ObjectToFinish=true,(member([String53,_ObjectToFinish],ObjectToFinishDict03)-> + ObjectToFinishDict3=ObjectToFinishDict03; + ((repeat, + write("Enter object to finish (without spaces), if different for "), writeln(String53),read_string1("file",user_input, "\n", "\r", _ObjectToFinishEnd2, ObjectToFinishStringth2),split_string(ObjectToFinishStringth2, "", " ", ObjectToFinishStringth3),ObjectToFinishStringth3=[ObjectToFinishStringth4]),texttobr(1,u,ObjectToFinishStringth4,1), + append(ObjectToFinishDict03,[[String53,ObjectToFinishStringth4]],ObjectToFinishDict3)))->true;true), + + + (ObjectToFinish=true,(member([ObjectToFinishStringth4,_X,_Y,_Z],BrDict3t5)-> + BrDict3t6=BrDict3t5; + ((repeat, + write("Enter x, y and z in cm for "), writeln(ObjectToFinishStringth4),read_string1("20,30,0.0001",user_input, "\n", "\r", _End, ObjectToFinishString),split_string(ObjectToFinishString, ",", " ", ObjectToFinishValues),ObjectToFinishValues=[ObjectToFinishX1,ObjectToFinishY1,ObjectToFinishZ1],number_string(ObjectToFinishX,ObjectToFinishX1),number_string(ObjectToFinishY,ObjectToFinishY1),number_string(ObjectToFinishZ,ObjectToFinishZ1)), + append(BrDict3t5,[[ObjectToFinishStringth4,ObjectToFinishX,ObjectToFinishY,ObjectToFinishZ]],BrDict3t6)))->true;BrDict3t6=BrDict3t5), + +*/ + +retractall(brDict03(_)), +assertz(brDict03(BrDict3)), + +retractall(brDict03t(_)), +assertz(brDict03t(BrDict3t1)),!. 
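The `br/2` clause above follows a lookup-or-prompt pattern: if the word already has an object name in brdict1 it is reused; otherwise the user is prompted (or, in auto mode, `read_string1` supplies a default such as "box"), and the new pair is appended so it is remembered. The same then happens for the object's x, y, z dimensions against brdict2. A rough Python sketch of this pattern, with hypothetical names:

```python
def lookup_or_prompt(pairs, key, default, auto=True):
    """Return the stored value for key, or obtain one and extend the dictionary.

    Sketches br/2's flow: reuse an existing entry, otherwise take a default in
    auto mode (as read_string1 does) or ask the user interactively.
    """
    for k, v in pairs:          # member([Word3,String4],BrDict) analogue
        if k == key:
            return v, pairs
    answer = default if auto else input(f"Enter object name for {key}: ")
    return answer, pairs + [(key, answer)]   # append and keep for next time
```

This is only an approximation: the Prolog version also persists the updated dictionaries via `retractall/assertz` at the end of the clause.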
+ +%br(Words,BrDict3,BrDict2,BrDict3t6,BrDict5,Brth,BrthDict3,BrthDict04,Room,RoomDict3,RoomDict04,PartOfRoom,PartOfRoomDict3,PartOfRoomDict04,Direction,DirectionDict3,DirectionDict04,ObjectToPrepare,ObjectToPrepareDict3,ObjectToPrepareDict04,ObjectToFinish,ObjectToFinishDict3,ObjectToFinishDict04). + %%). +brth(_,sweetinvincibleandprayedfor). + +%% finds unknown words, asks for their br in form "n of m: word", verify, (can go back x) append and sort, save +read_string1(S,user_input, "\n", "\r", _End, ObjectToFinishString) :- + (auto(on)->S=ObjectToFinishString; + read_string(user_input, "\n", "\r", _End, ObjectToFinishString)),!. + + +process_t2b(A,C) :- + replace_t2b(Replacements), + atom_string(A1,A), + replace1_t2b(Replacements,A1,D1), + atom_string(D1,C),!. + +replace1_t2b([],A,A) :- !. +replace1_t2b(Replacements,A,D) :- + Replacements=[[B,C]|G], + atomic_list_concat(E,B,A), + atomic_list_concat(E,C,F), + replace1_t2b(G,F,D),!. + + replace_t2b([['\\',''],['–',' '],['—',' '],['“','\''],['”','\''],['‘','\''],['’','\''],['⁃','-']]). diff --git a/Text-to-Breasonings-master/texttobrall2_reading3.pl b/Text-to-Breasonings-master/texttobrall2_reading3.pl new file mode 100644 index 0000000000000000000000000000000000000000..722cda6c74004e285f450163bad519313b32de91 --- /dev/null +++ b/Text-to-Breasonings-master/texttobrall2_reading3.pl @@ -0,0 +1,511 @@ +% ['../Text-to-Breasonings/text_to_breasonings.pl']. +% W is 50*4,texttobr2(u,u,u,u,false,false,false,false,false,false,W) +% where W is the number of words to read +% and there are approximately 4 words per algorithm. + +%% Important: See instructions for using texttobr.pl at https://lucianpedia.wikia.com/wiki/Instructions_for_Using_texttobr(2).pl . + +%% use_module(library(pio)). %% In la_strings +%% use_module(library(dcg/basics)). 
%% In la_strings + +%% texttobr2 - converts file stream to dimensions of objects represented by the words +%% has object name as separate field for new users of texttobr to verify breasonings by hand +%% brdict1.txt contains word and object name, brdict2.txt contains object name and x, y and z + +%% texttobr2(Runs,File,StringtoBreason,BreasoningLimit). +:- include('../Text-to-Breasonings/mergetexttobrdict.pl'). +%:- include('../listprologinterpreter/la_strings'). +%:- include('../Philosophy/14 10 23.pl'). + +%% Brth is true or false +%texttobr2(N1,Filex1,Stringx1,M1) :- + %texttobr2(N1,Filex1,Stringx1,M1,false,false,false,false,false,false,0,[auto,off]). + +texttobr2(N1,Filex1,Stringx1,M1,[auto,on]) :- + texttobr2(N1,Filex1,Stringx1,M1,false,false,false,false,false,false,0,[auto,on]). +/*texttobr2(N1,Filex1,Stringx1,M1,Brth,Room,PartOfRoom,Direction, + ObjectToPrepare,ObjectToFinish) :- + texttobr2(N1,Filex1,Stringx1,M1,Brth,Room, + PartOfRoom,Direction,ObjectToPrepare,ObjectToFinish,0,[auto,off]). + */ +texttobr2(N1,Filex1,Stringx1,M1,Brth,Room,PartOfRoom,Direction, + ObjectToPrepare,ObjectToFinish,[auto,on]) :- + texttobr2(N1,Filex1,Stringx1,M1,Brth,Room, + PartOfRoom,Direction,ObjectToPrepare,ObjectToFinish,0,[auto,on]),!. + +%texttobr2(N1,Filex1,Stringx1,M1,Brth,Room,PartOfRoom,Direction, +% ObjectToPrepare,ObjectToFinish,W) :- +% texttobr2(N1,Filex1,Stringx1,M1,Brth,Room, + %PartOfRoom,Direction,ObjectToPrepare,ObjectToFinish,0,[auto,off]). 
+ +texttobr2(N1,Filex1,Stringx1,M1,Brth,Room,PartOfRoom,Direction,ObjectToPrepare,ObjectToFinish,Words_to_read,[auto,on]) :- + retractall(auto(_)), + assertz(auto(on)), + +(Stringx1=""->true; +( retractall(complete_display(_)), + assertz(complete_display(false)), + + retractall(words_to_read(_)), + assertz(words_to_read(Words_to_read)), + + ((number(N1),N=N1)->true; + (N1=u,N=1)), + + ((Filex1=u,Filex="../Text-to-Breasonings/file.txt")->true; + Filex=Filex1), + + ((number(M1),M=M1)->true; + M=all), %% If m1 is undefined or all then m=all + + prep(List1,BrDict03,BrDict03t,Filex,Stringx1,M,Brth,BrthDict03,Room,RoomDict03,PartOfRoom,PartOfRoomDict03,Direction,DirectionDict03,ObjectToPrepare,ObjectToPrepareDict03,ObjectToFinish,ObjectToFinishDict03), + +findall(List3,(member(C,List1),downcase_atom(C,List2),atom_string(List2,List3)),D), + br2(D,BrDict03,BrDict2,BrDict03t,BrDict03t2,N,Brth,BrthDict03,BrthDict04,Room,RoomDict03,RoomDict04,PartOfRoom,PartOfRoomDict03,PartOfRoomDict04,Direction,DirectionDict03,DirectionDict04,ObjectToPrepare,ObjectToPrepareDict03,ObjectToPrepareDict04,ObjectToFinish,ObjectToFinishDict03,ObjectToFinishDict04), +/* sort(BrDict2,BrDict3), + (BrDict03=BrDict3->true; + + (open_s("../Text-to-Breasonings/brdict1.txt",write,Stream), +%% string_codes(BrDict3), + write(Stream,BrDict3), + close(Stream))), + + sort(BrDict03t2,BrDict03t3), + (BrDict03t=BrDict03t3->true; + (open_s("../Text-to-Breasonings/brdict2.txt",write,Stream2), +%% string_codes(BrDict3), + write(Stream2,BrDict03t3), + close(Stream2))), + + ((Brth=true, + sort(BrthDict04,BrthDict044), + (BrthDict03=BrthDict044->true; + (open_s("../Text-to-Breasonings/brthdict.txt",write,Stream3), +%% string_codes(BrDict3), + write(Stream3,BrthDict044), + close(Stream3))))->true;true), + + ((Room=true, + sort(RoomDict04,RoomDict044), + (RoomDict04=RoomDict044->true; + (open_s("../Text-to-Breasonings/roomdict.txt",write,Stream4), +%% string_codes(BrDict3), + write(Stream4,RoomDict044), + 
close(Stream4))))->true;true), + + ((PartOfRoom=true, + sort(PartOfRoomDict04,PartOfRoomDict044), + (PartOfRoomDict04=PartOfRoomDict044->true; + (open_s("../Text-to-Breasonings/partofroomdict.txt",write,Stream5), +%% string_codes(BrDict3), + write(Stream5,PartOfRoomDict044), + close(Stream5))))->true;true), + + ((Direction=true, + sort(DirectionDict04,DirectionDict044), + (DirectionDict04=DirectionDict044->true; + (open_s("../Text-to-Breasonings/directiondict.txt",write,Stream6), +%% string_codes(BrDict3), + write(Stream6,DirectionDict044), + close(Stream6))))->true;true), + + ((ObjectToPrepare=true, + sort(ObjectToPrepareDict04,ObjectToPrepareDict044), + (ObjectToPrepareDict04=ObjectToPrepareDict044->true; + (open_s("../Text-to-Breasonings/objecttopreparedict.txt",write,Stream7), +%% string_codes(BrDict3), + write(Stream7,ObjectToPrepareDict044), + close(Stream7))))->true;true), + + ((ObjectToFinish=true, + sort(ObjectToFinishDict04,ObjectToFinishDict044), + (ObjectToFinishDict04=ObjectToFinishDict044->true; + (open_s("../Text-to-Breasonings/objecttofinishdict.txt",write,Stream8), +%% string_codes(BrDict3), + write(Stream8,ObjectToFinishDict044), + close(Stream8))))->true;true), + */ + length(List1,List1_length_a), + Dividend_a is ceiling(List1_length_a/250), + Dividend_b is Dividend_a*3, % for graciously giving + texttobr2_a(Dividend_b,meditation), + texttobr2_a(Dividend_b,medicine), + texttobr2_a(Dividend_b,pedagogy) + )), + + !. + + +replace0(Input,Find,Replace,SepandPad,M,Output0) :- + replace00(Input,Find,Replace,SepandPad,[],Output1), + truncate(Output1,M,Output0),!. +replace00([],_Find,_Replace,_SepandPad,Input,Input) :- !. +replace00(Input1,Find,Replace,SepandPad,Input2,Input3) :- + Input1=[Input4|Input5], + string_codes(Input4,String1), + replace1(String1,Find,Replace,[],Input7), + string_codes(Output,Input7), + split_string(Output,SepandPad,SepandPad,Input8), + append(Input2,Input8,Input9), + replace00(Input5,Find,Replace,SepandPad,Input9,Input3), !. 
+replace1([],_Find,_Replace,Input,Input) :- !.
+replace1(Input1,Find,Replace,Input2,Input3) :-
+ Input1=[Input4|Input5],
+ member(Input4,Find),
+ append(Input2,[Replace],Input6),
+ replace1(Input5,Find,Replace,Input6,Input3), !.
+replace1(Input1,Find,Replace,Input2,Input3) :-
+ Input1=[Input4|Input5],
+ \+ member(Input4,Find),
+ append(Input2,[Input4],Input6),
+ replace1(Input5,Find,Replace,Input6,Input3), !.
+
+split_string_onnonletter(String00,List1) :-
+ string_codes(String00,String1),
+ split_string_onnonletter(String1,[],List0),
+ string_codes(List0,List2),
+ split_string(List2," "," ",List1),!.
+split_string_onnonletter([],Input,Input) :- !.
+split_string_onnonletter(Input1,Input2,Input3) :-
+ Input1=[Input4|Input5],
+ \+ char_type(Input4,alpha),
+ append(Input2,[32],Input6),
+ split_string_onnonletter(Input5,Input6,Input3), !.
+split_string_onnonletter(Input1,Input2,Input3) :-
+ Input1=[Input4|Input5],
+ char_type(Input4,alpha),
+ append(Input2,[Input4],Input6),
+ split_string_onnonletter(Input5,Input6,Input3), !.
+
+%% Truncates List1 to its first M elements when M is a number and List1 has at least M elements; otherwise returns List1 unchanged.
+truncate(List1,M,String0) :-
+ ((number(M),length(String0,M),
+ append(String0,_,List1))->true;
+ String0=List1),!.
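The two helpers above can be read as follows: `truncate/3` keeps only the first M elements when M is a number and the list is at least that long (otherwise, e.g. when M is `all`, the list passes through unchanged), and `split_string_onnonletter/2` maps every non-letter to a space before tokenising. A hedged Python equivalent, approximating Prolog's `char_type(X,alpha)` with `str.isalpha`:

```python
def truncate(items, m):
    # Keep the first m elements when m is a number and enough elements exist;
    # otherwise (e.g. m == "all") return the list unchanged.
    if isinstance(m, int) and m <= len(items):
        return items[:m]
    return items

def split_on_nonletter(text):
    # Map every non-alphabetic character to a space, then split on whitespace.
    cleaned = "".join(c if c.isalpha() else " " for c in text)
    return cleaned.split()
```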
+ + + +prep(List,BrDict03,BrDict03t,Filex,Stringx1,M,Brth,BrthDict03,Room,RoomDict03,PartOfRoom,PartOfRoomDict03,Direction,DirectionDict03,ObjectToPrepare,ObjectToPrepareDict03,ObjectToFinish,ObjectToFinishDict03) :- + phrase_from_file_s(string(BrDict0), "../Text-to-Breasonings/brdict1.txt"), + %%Chars="’", + SepandPad="&#@~%`$?-+*^,()|.:;=_/[]<>{}\n\r\s\t\\\"!'0123456789", + %%split_string(BrDict0,SepandPad,SepandPad,BrDict01), +%%writeln([brDict0,BrDict0]), +%%writeln([brdict1]), + splitfurther(BrDict0,BrDict01), +%%writeln([brDict01,BrDict01]), + %%char_code(Escape,27), + %%delete(BrDict01,[Escape,_,_,_,_],BrDict021), +%%writeln([brDict021,BrDict021]), + %%char_code(Apostrophe,8217), + %%delete(BrDict021,[Apostrophe,_,_,_,_],BrDict02), +%%writeln([brDict02,BrDict02]), + sort(BrDict01,BrDict03), +%%writeln([brDict03,BrDict03]), + length(BrDict03,Length0),write("Number of words in dictionary: "), writeln(Length0), + + %%writeln(''), + %%writeln([brdict2]), + phrase_from_file_s(string(BrDict0t), "../Text-to-Breasonings/brdict2.txt"), + %%Chars="’", + %%split_string(BrDict0,SepandPad,SepandPad,BrDict01), +%%writeln([brDict0,BrDict0]), + splitfurthert(BrDict0t,BrDict01t), +%%writeln([brDict01,BrDict01]), + %%delete(BrDict01t,[Escape,_,_,_,_],BrDict021t), +%%writeln([brDict021,BrDict021]), + %%delete(BrDict021t,[Apostrophe,_,_,_,_],BrDict02t), +%%writeln([brDict02,BrDict02]), + sort(BrDict01t,BrDict03t), + +% br_freq %B=BrDict03t,A=BrDict03,findall([DL,C,"\n"],(member([C,_,_,_],B),findall(_,member([_,C],A),D),length(D,DL)),E),sort(E,F),reverse(F,G),writeln([br_freq,G]), + +%%writeln([brDict03,BrDict03]), + length(BrDict03t,Length0t),write("Number of unique breasonings in dictionary: "), writeln(Length0t), + + ((Stringx1=u, + phrase_from_file_s(string(String001), Filex))->true; + String001=Stringx1), + + process_t2b(String001,String00), + + split_string(String00,SepandPad,SepandPad,List1), + %%split_string_onnonletter(String00,List1), + + truncate(List1,M,List), + + 
/**replace0(String0,[8221, 8220], 34, SepandPad, M, String1), + replace0(String1,[8216, 8217], 39, SepandPad, M, String2), + replace0(String2,[8259, 8211, 8212], 45, SepandPad, M, String3), + replace0(String3,[160], 32, SepandPad, M, List), + **/ + +%%atom_codes(Atom999,String),writeln([atom999,Atom999]), + +%%writeln([list,List]), + %%delete(List,Escape,List11), +%%writeln([list11,List11]), + %%delete(List11,Apostrophe,List1), +%%writeln([list1,List1]), + length(List,Length1),write("Number of words to breason out in file: "), writeln(Length1), + sort(List,List2), +%%writeln([list2,List2]), + length(List2,Length2),write("Number of unique words in file: "), writeln(Length2), + + + + ((Brth=true, + phrase_from_file_s(string(BrthDict0), "../Text-to-Breasonings/brthdict.txt"), splitfurther(BrthDict0,BrthDict01), + sort(BrthDict01,BrthDict03), + length(BrthDict03,BrthLength0),write("Number of unique breathsonings in dictionary: "), writeln(BrthLength0))->true;true), + + + ((Room=true, + phrase_from_file_s(string(RoomDict0), "../Text-to-Breasonings/roomdict.txt"), splitfurther(RoomDict0,RoomDict01), + sort(RoomDict01,RoomDict03), + length(RoomDict03,RoomLength0),write("Number of unique rooms in dictionary: "), writeln(RoomLength0))->true;true), + + ((PartOfRoom=true, + phrase_from_file_s(string(PartOfRoomDict0), "../Text-to-Breasonings/partofroomdict.txt"), splitfurther(PartOfRoomDict0,PartOfRoomDict01), + sort(PartOfRoomDict01,PartOfRoomDict03), + length(PartOfRoomDict03,PartOfRoomLength0),write("Number of unique parts of rooms in dictionary: "), writeln(PartOfRoomLength0))->true;true), + + ((Direction=true, + phrase_from_file_s(string(DirectionDict0), "../Text-to-Breasonings/directiondict.txt"), splitfurther(DirectionDict0,DirectionDict01), + sort(DirectionDict01,DirectionDict03), + length(DirectionDict03,DirectionLength0),write("Number of unique directions in dictionary: "), writeln(DirectionLength0))->true;true), + + ((ObjectToPrepare=true, + 
phrase_from_file_s(string(ObjectToPrepareDict0), "../Text-to-Breasonings/objecttopreparedict.txt"), splitfurther(ObjectToPrepareDict0,ObjectToPrepareDict01), + sort(ObjectToPrepareDict01,ObjectToPrepareDict03), + length(ObjectToPrepareDict03,ObjectToPrepareLength0),write("Number of unique objects to prepare in dictionary: "), writeln(ObjectToPrepareLength0))->true;true), + + ((ObjectToFinish=true, + phrase_from_file_s(string(ObjectToFinishDict0), "../Text-to-Breasonings/objecttofinishdict.txt"), splitfurther(ObjectToFinishDict0,ObjectToFinishDict01), + sort(ObjectToFinishDict01,ObjectToFinishDict03), + length(ObjectToFinishDict03,ObjectToFinishLength0),write("Number of unique objects to finish in dictionary: "), writeln(ObjectToFinishLength0))->true;true), + +(complete_display(true)-> + ((Stringx1=u, %% Use file, not string as input. + + %%maplist(downcase_atom, List2, List3), + maplist(string_lower, List2, List3), + +%%writeln([list3,List3]), + towords3(BrDict03,[],BrDict04,[],_ObjectNames,[],AllUsedNames), + towords2(BrDict03t,[],BrDict04t), + +%%writeln([brDict04,BrDict04]), + subtract(List3,BrDict04,D1), +%%writeln([list3,brDict04,d1,List3,BrDict04,D1]), +%%writeln(["subtract(BrDict04,List3,D1).",List3,BrDict04,D1]), + length(D1,Length01),Difference is abs(Length01),write("Number of words remaining to define: "), writeln(Difference), + + subtract(AllUsedNames,BrDict04t,D2), + %%delete(D21,'',D2), + length(D2,Length01t),Differencet is abs(Length01t),write("Number of undefined breasonings: "), writeln(Differencet), + %% writeln([undefinedbreasonings,D2]), %% Print undefined breasonings + + %%delete(D31,'',D3), + subtract(BrDict04t,AllUsedNames,D3), + length(D3,Length01t2),Differencet2 is abs(Length01t2),write("Number of orphaned breasonings: "), writeln(Differencet2), + + %%,writeln([orphanedbreasonings,D3]) %% Print orphaned breasonings + + ((Brth=true, + + %%towords3(BrDict03,[],BrDict04,[],_ObjectNames,[],AllUsedNames), + towords2a(BrthDict03,[],BrtDict04t), + 
+ %%subtract(List3,BrtDict04t,Dt1), + %%length(Dt1,Lengtht01),Differencet1 is abs(Lengtht01),write("Number of words remaining to define breathsonings for: "), writeln(Differencet1), + %%writeln(["Number of words remaining to define breathsonings for",Dt1]), %% Print number of words remaining to define breathsonings for + + subtract(AllUsedNames,BrtDict04t,Dt2), + length(Dt2,Lengtht01t),Differencet22 is abs(Lengtht01t),write("Number of undefined breathsonings: "), writeln(Differencet22), + %%writeln([undefinedbreathsonings,Dt2]), %% Print undefined breathsonings + + subtract(BrtDict04t,AllUsedNames,Dt3), + length(Dt3,Lengtht01t2),Differencet3 is abs(Lengtht01t2),write("Number of orphaned breathsonings: "), writeln(Differencet3) + + %%writeln([orphanedbreathsonings,Dt3]) %% Print orphaned breathsonings + + )->true;true), + + ((Room=true, + towords2a(RoomDict03,[],RoomDict04t), + subtract(AllUsedNames,RoomDict04t,RoomDt2), + length(RoomDt2,RoomLengtht01t),RoomDifferencet22 is abs(RoomLengtht01t),write("Number of undefined rooms: "), writeln(RoomDifferencet22), + %%writeln([undefinedrooms,RoomDt2]), %% Print undefined rooms + subtract(RoomDict04t,AllUsedNames,RoomDt3), + length(RoomDt3,RoomLengtht01t2),RoomDifferencet3 is abs(RoomLengtht01t2),write("Number of orphaned rooms: "), writeln(RoomDifferencet3) + %%writeln([orphanedrooms,RoomDt3]) %% Print orphaned rooms + )->true;true), + + ((PartOfRoom=true, + towords2a(PartOfRoomDict03,[],PartOfRoomDict04t), + subtract(AllUsedNames,PartOfRoomDict04t,PartOfRoomDt2), + length(PartOfRoomDt2,PartOfRoomLengtht01t),PartOfRoomDifferencet22 is abs(PartOfRoomLengtht01t),write("Number of undefined parts of rooms: "), writeln(PartOfRoomDifferencet22), + %%writeln([undefinedPartsOfRooms,PartOfRoomDt2]), %% Print undefined PartsOfRooms + subtract(PartOfRoomDict04t,AllUsedNames,PartOfRoomDt3), + length(PartOfRoomDt3,PartOfRoomLengtht01t2),PartOfRoomDifferencet3 is abs(PartOfRoomLengtht01t2),write("Number of orphaned parts of rooms: "), 
writeln(PartOfRoomDifferencet3) + %%writeln([orphanedPartsOfRooms,PartOfRoomDt3]) %% Print orphaned PartsOfRooms + )->true;true), + + ((Direction=true, + towords2a(DirectionDict03,[],DirectionDict04t), + subtract(AllUsedNames,DirectionDict04t,DirectionDt2), + length(DirectionDt2,DirectionLengtht01t),DirectionDifferencet22 is abs(DirectionLengtht01t),write("Number of undefined directions: "), writeln(DirectionDifferencet22), + %%writeln([undefinedDirections,DirectionDt2]), %% Print undefined Directions + subtract(DirectionDict04t,AllUsedNames,DirectionDt3), + length(DirectionDt3,DirectionLengtht01t2),DirectionDifferencet3 is abs(DirectionLengtht01t2),write("Number of orphaned directions: "), writeln(DirectionDifferencet3) + %%writeln([orphanedDirections,DirectionDt3]) %% Print orphaned Directions + )->true;true), + + ((ObjectToPrepare=true, + towords2a(ObjectToPrepareDict03,[],ObjectToPrepareDict04t), + subtract(AllUsedNames,ObjectToPrepareDict04t,ObjectToPrepareDt2), + length(ObjectToPrepareDt2,ObjectToPrepareLengtht01t),ObjectToPrepareDifferencet22 is abs(ObjectToPrepareLengtht01t),write("Number of undefined objects to prepare: "), writeln(ObjectToPrepareDifferencet22), + %%writeln([undefinedObjectsToPrepare,ObjectToPrepareDt2]), %% Print undefined ObjectsToPrepare + subtract(ObjectToPrepareDict04t,AllUsedNames,ObjectToPrepareDt3), + length(ObjectToPrepareDt3,ObjectToPrepareLengtht01t2),ObjectToPrepareDifferencet3 is abs(ObjectToPrepareLengtht01t2),write("Number of orphaned objects to prepare: "), writeln(ObjectToPrepareDifferencet3) + %%writeln([orphanedObjectsToPrepare,ObjectToPrepareDt3]) %% Print orphaned ObjectsToPrepare + )->true;true), + + ((ObjectToFinish=true, + towords2a(ObjectToFinishDict03,[],ObjectToFinishDict04t), + subtract(AllUsedNames,ObjectToFinishDict04t,ObjectToFinishDt2), + length(ObjectToFinishDt2,ObjectToFinishLengtht01t),ObjectToFinishDifferencet22 is abs(ObjectToFinishLengtht01t),write("Number of undefined objects to finish: "), 
writeln(ObjectToFinishDifferencet22), + %%writeln([undefinedObjectsToFinish,ObjectToFinishDt2]), %% Print undefined ObjectsToFinish + subtract(ObjectToFinishDict04t,AllUsedNames,ObjectToFinishDt3), + length(ObjectToFinishDt3,ObjectToFinishLengtht01t2),ObjectToFinishDifferencet3 is abs(ObjectToFinishLengtht01t2),write("Number of orphaned objects to finish: "), writeln(ObjectToFinishDifferencet3) + %%writeln([orphanedObjectsToFinish,ObjectToFinishDt3]) %% Print orphaned ObjectsToFinish + )->true;true) + + + +)->true;(string(Filex),writeln("Number of words, unique words, unique breathsonings, words remaining to define, undefined breasonings, orphaned breasonings, undefined breathsonings and orphaned breathsonings skipped for speed when breasoning out a string.")));true) + +,!. + +br2(_,A,A,B,B,0,_Brth,BrthDict03,BrthDict03,_Room,RoomDict03,RoomDict03,_PartOfRoom,PartOfRoomDict03,PartOfRoomDict03,_Direction,DirectionDict03,DirectionDict03,_ObjectToPrepare,ObjectToPrepareDict03,ObjectToPrepareDict03,_ObjectToFinish,ObjectToFinishDict03,ObjectToFinishDict03) :- !. 
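The repeated subtract/length pairs above all compute the same two set differences: names used by the word dictionary but missing from a detail dictionary ("undefined"), and names defined in a detail dictionary but never used ("orphaned"). A minimal Python sketch of that check (hypothetical helper, not part of the Prolog source):

```python
def dictionary_report(used_names, defined_names):
    """Mirror the subtract/3 + length/2 pattern: return the names used
    but never defined, and the names defined but never used."""
    undefined = [n for n in used_names if n not in defined_names]
    orphaned = [n for n in defined_names if n not in used_names]
    return undefined, orphaned

# e.g. "box" is used but undefined, "line" is defined but orphaned
undefined, orphaned = dictionary_report(["box", "square"], ["square", "line"])
```

Like subtract/3, this preserves the order (and any duplicates) of the first list.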
+br2(List1,BrDict03,BrDict2,BrDict03t,BrDict03t2,N1,Brth,BrthDict03,BrthDict04,Room,RoomDict03,RoomDict04,PartOfRoom,PartOfRoomDict03,PartOfRoomDict04,Direction,DirectionDict03,DirectionDict04,ObjectToPrepare,ObjectToPrepareDict03,ObjectToPrepareDict04,ObjectToFinish,ObjectToFinishDict03,ObjectToFinishDict04) :- + br(List1,BrDict03,BrDict21,BrDict03t,BrDict03t21,Brth,BrthDict03,BrthDict041,Room,RoomDict03,RoomDict041,PartOfRoom,PartOfRoomDict03,PartOfRoomDict041,Direction,DirectionDict03,DirectionDict041,ObjectToPrepare,ObjectToPrepareDict03,ObjectToPrepareDict041,ObjectToFinish,ObjectToFinishDict03,ObjectToFinishDict041), + N2 is N1-1, + br2(List1,BrDict21,BrDict2,BrDict03t21,BrDict03t2,N2,Brth,BrthDict041,BrthDict04,Room,RoomDict041,RoomDict04,PartOfRoom,PartOfRoomDict041,PartOfRoomDict04,Direction,DirectionDict041,DirectionDict04,ObjectToPrepare,ObjectToPrepareDict041,ObjectToPrepareDict04,ObjectToFinish,ObjectToFinishDict041,ObjectToFinishDict04),!. + +towords2([],A,A) :- !. +towords2(BrDict03,A,B) :- + BrDict03=[[Word,_,_,_]|Rest], + %%atom_string(Atom,Word), + append(A,[Word],C), + towords2(Rest,C,B). + +towords2a([],A,A) :- !. +towords2a(BrDict03,A,B) :- + BrDict03=[[Word,_]|Rest], + %%atom_string(Atom,Word), + append(A,[Word],C), + towords2a(Rest,C,B). + +towords3([],A,A,C,C,D,D) :- !. +towords3(BrDict03,A,B,D,E,G,H) :- + BrDict03=[[Word1,Word2]|Rest], + (Word2=""->append(G,[Word1],I)->true; + append(G,[Word2],I)), + append(A,[Word1],C), + append(D,[Word2],F), + towords3(Rest,C,B,F,E,I,H). + +string(String) --> list(String). + +list([]) --> []. +list([L|Ls]) --> [L], list(Ls). + +splitfurther(BrDict01,N) :- + phrase(file0(N),BrDict01). + +file0(N) --> "[", file(N), "]", !. +file0([]) --> []. + +%%file([]) --> []. +file([L|Ls]) --> entry(L),",", +%%{writeln(L)}, %%*** +file(Ls), !. %% file(Ls),{M=[Ls]})), !. %%, {writeln(["l",L])},",", file(Ls), {writeln(["ls",Ls])},!. %%, {append(L,Ls,M)}, !. +file([L]) --> entry(L), +%%{writeln(L)}, +!. 
%%(entry(L),{M=L});{M=[],(writeln("Warning - Entry in incorrect format.") +%%,abort +%%)}, !. + +entry([Word2,Word4]) --> + "[", word(Word), {string_codes(Word2,Word),string(Word2)}, + ",", + word(Word3), {string_codes(Word4,Word3),string(Word4)}, + "]". + +splitfurthert(BrDict01,N) :- + phrase(file0t(N),BrDict01). + +file0t(N) --> "[", filet(N), "]", !. +file0t([]) --> []. + +%%file([]) --> []. +filet([L|Ls]) --> entryt(L),",", +%%{writeln(L)}, %%*** +filet(Ls), !. %% file(Ls),{M=[Ls]})), !. %%, {writeln(["l",L])},",", file(Ls), {writeln(["ls",Ls])},!. %%, {append(L,Ls,M)}, !. +filet([L]) --> entryt(L), +%%{writeln(L)}, +!. %%(entry(L),{M=L});{M=[],(writeln("Warning - Entry in incorrect format.") +%%,abort +%%)}, !. + +entryt([Word2,X3,Y3,Z3]) --> + "[", word(Word), {string_codes(Word2,Word),string(Word2)}, + ",", + digits(X),",",{atom_codes(X2,X),atom_number(X2,X3),number(X3)}, + digits(Y),",",{atom_codes(Y2,Y),atom_number(Y2,Y3),number(Y3)}, + digits(Z),{atom_codes(Z2,Z),atom_number(Z2,Z3),number(Z3)}, + "]". + +word([X|Xs]) --> [X], {char_type(X,csymf)->true;(X=27->true;X=8217)}, word(Xs), !. +%%word([X]) --> [X], {char_type(X,csymf);(X=27;X=8217)}, !. +word([]) --> []. + +digits([X|Xs]) --> [X], {(char_type(X,digit)->true;(string_codes(Word2,[X]),Word2="."))}, digits(Xs), !. +%%digits([X]) --> [X], {(char_type(X,digit);(string_codes(Word2,[X]),Word2="."))}, !. +digits([]) --> []. + +t(BrDict,BrDict4,Word) :- + (member([Word,Word1],BrDict)-> + (member([Word1,_X,_Y,_Z],BrDict4)->true;true);true). + +br([],B,B,C,C,_,D,D,_Room,RoomDict03,RoomDict03,_PartOfRoom,PartOfRoomDict03,PartOfRoomDict03,_Direction,DirectionDict03,DirectionDict03,_ObjectToPrepare,ObjectToPrepareDict03,ObjectToPrepareDict03,_ObjectToFinish,ObjectToFinishDict03,ObjectToFinishDict03) :- + !. 
+br(Words,BrDict,BrDict2,BrDict4,BrDict5,Brth,BrthDict03,BrthDict04,Room,RoomDict03,RoomDict04,PartOfRoom,PartOfRoomDict03,PartOfRoomDict04,Direction,DirectionDict03,DirectionDict04,ObjectToPrepare,ObjectToPrepareDict03,ObjectToPrepareDict04,ObjectToFinish,ObjectToFinishDict03,ObjectToFinishDict04) :- +%trace, + maplist(t(BrDict,BrDict4),Words),!. + %%). +brth(_,sweetinvincibleandprayedfor). + +%% finds unknown words, asks for their br in form "n of m: word", verify, (can go back x) append and sort, save +read_string1(S,user_input, "\n", "\r", _End, ObjectToFinishString) :- + (auto(on)->S=ObjectToFinishString; + read_string(user_input, "\n", "\r", _End, ObjectToFinishString)),!. + +%/* +process_t2b(A,C) :- + replace_t2b(Replacements), + atom_string(A1,A), + replace1_t2b(Replacements,A1,D1), + atom_string(D1,C),!. + +replace1_t2b([],A,A) :- !. +replace1_t2b(Replacements,A,D) :- + Replacements=[[B,C]|G], + atomic_list_concat(E,B,A), + atomic_list_concat(E,C,F), + replace1_t2b(G,F,D),!. + + replace_t2b([['\\',''],['–',' '],['—',' '],['“','\''],['”','\''],['‘','\''],['’','\''],['⁃','-']]). +%*/ \ No newline at end of file diff --git a/Text-to-Breasonings-master/texttobrall2_reading4.pl b/Text-to-Breasonings-master/texttobrall2_reading4.pl new file mode 100644 index 0000000000000000000000000000000000000000..334557c2d8d7a4c60e0f2cceee9e26c175c4e6aa --- /dev/null +++ b/Text-to-Breasonings-master/texttobrall2_reading4.pl @@ -0,0 +1,623 @@ +main_t2b4:-catch(texttobr2(u,"file.txt",u,u,[auto,on]),Err,handle_error(Err)),halt. + +handle_error(_Err):- + halt(1). +main :- halt(1). + +% ['../Text-to-Breasonings/text_to_breasonings.pl']. +% W is 50*4,texttobr2(u,u,u,u,false,false,false,false,false,false,W) +% where W is the number of words to read +% and there are approximately 4 words per algorithm. + +%% Important: See instructions for using texttobr.pl at https://lucianpedia.wikia.com/wiki/Instructions_for_Using_texttobr(2).pl . + +%% use_module(library(pio)). 
%% In la_strings +%% use_module(library(dcg/basics)). %% In la_strings + +%% texttobr2 - converts file stream to dimensions of objects represented by the words +%% has object name as separate field for new users of texttobr to verify breasonings by hand +%% brdict1.txt contains word and object name, brdict2.txt contains object name and x, y and z + +%% texttobr2(Runs,File,StringtoBreason,BreasoningLimit). +:- include('../Text-to-Breasonings/mergetexttobrdict.pl'). +:-include('../listprologinterpreter/la_files.pl'). +:-include('../listprologinterpreter/la_strings.pl'). + +:-dynamic sn/1. +:-dynamic f/1. + +%:- include('../listprologinterpreter/la_strings'). +%:- include('../Philosophy/14 10 23.pl'). + +%% Brth is true or false +%texttobr2(N1,Filex1,Stringx1,M1) :- + %texttobr2(N1,Filex1,Stringx1,M1,false,false,false,false,false,false,0,[auto,off]). + +texttobr2(N1,Filex1,Stringx1,M1,[auto,on]) :- + texttobr2(N1,Filex1,Stringx1,M1,false,false,false,false,false,false,0,[auto,on]). +/*texttobr2(N1,Filex1,Stringx1,M1,Brth,Room,PartOfRoom,Direction, + ObjectToPrepare,ObjectToFinish) :- + texttobr2(N1,Filex1,Stringx1,M1,Brth,Room, + PartOfRoom,Direction,ObjectToPrepare,ObjectToFinish,0,[auto,off]). + */ +texttobr2(N1,Filex1,Stringx1,M1,Brth,Room,PartOfRoom,Direction, + ObjectToPrepare,ObjectToFinish,[auto,on]) :- + texttobr2(N1,Filex1,Stringx1,M1,Brth,Room, + PartOfRoom,Direction,ObjectToPrepare,ObjectToFinish,0,[auto,on]),!. + +%texttobr2(N1,Filex1,Stringx1,M1,Brth,Room,PartOfRoom,Direction, +% ObjectToPrepare,ObjectToFinish,W) :- +% texttobr2(N1,Filex1,Stringx1,M1,Brth,Room, + %PartOfRoom,Direction,ObjectToPrepare,ObjectToFinish,0,[auto,off]). 
+ +texttobr2(N1,Filex1,Stringx1,M1,Brth,Room,PartOfRoom,Direction,ObjectToPrepare,ObjectToFinish,Words_to_read,[auto,on]) :- + retractall(auto(_)), + assertz(auto(on)), + +(Stringx1=""->true; +( retractall(complete_display(_)), + assertz(complete_display(false)), + + retractall(words_to_read(_)), + assertz(words_to_read(Words_to_read)), + + ((number(N1),N=N1)->true; + (N1=u,N=1)), + + ((Filex1=u,Filex="../Text-to-Breasonings/file.txt")->true; + Filex=Filex1), + + ((number(M1),M=M1)->true; + M=all), %% If m1 is undefined or all then m=all + + prep(List1,BrDict03,BrDict03t,Filex,Stringx1,M,Brth,BrthDict03,Room,RoomDict03,PartOfRoom,PartOfRoomDict03,Direction,DirectionDict03,ObjectToPrepare,ObjectToPrepareDict03,ObjectToFinish,ObjectToFinishDict03), + +findall(List3,(member(C,List1),downcase_atom(C,List2),atom_string(List2,List3)),D), + br2(D,BrDict03,BrDict2,BrDict03t,BrDict03t2,N,Brth,BrthDict03,BrthDict04,Room,RoomDict03,RoomDict04,PartOfRoom,PartOfRoomDict03,PartOfRoomDict04,Direction,DirectionDict03,DirectionDict04,ObjectToPrepare,ObjectToPrepareDict03,ObjectToPrepareDict04,ObjectToFinish,ObjectToFinishDict03,ObjectToFinishDict04), +/* sort(BrDict2,BrDict3), + (BrDict03=BrDict3->true; + + (open_s("../Text-to-Breasonings/brdict1.txt",write,Stream), +%% string_codes(BrDict3), + write(Stream,BrDict3), + close(Stream))), + + sort(BrDict03t2,BrDict03t3), + (BrDict03t=BrDict03t3->true; + (open_s("../Text-to-Breasonings/brdict2.txt",write,Stream2), +%% string_codes(BrDict3), + write(Stream2,BrDict03t3), + close(Stream2))), + + ((Brth=true, + sort(BrthDict04,BrthDict044), + (BrthDict03=BrthDict044->true; + (open_s("../Text-to-Breasonings/brthdict.txt",write,Stream3), +%% string_codes(BrDict3), + write(Stream3,BrthDict044), + close(Stream3))))->true;true), + + ((Room=true, + sort(RoomDict04,RoomDict044), + (RoomDict04=RoomDict044->true; + (open_s("../Text-to-Breasonings/roomdict.txt",write,Stream4), +%% string_codes(BrDict3), + write(Stream4,RoomDict044), + 
close(Stream4))))->true;true), + + ((PartOfRoom=true, + sort(PartOfRoomDict04,PartOfRoomDict044), + (PartOfRoomDict04=PartOfRoomDict044->true; + (open_s("../Text-to-Breasonings/partofroomdict.txt",write,Stream5), +%% string_codes(BrDict3), + write(Stream5,PartOfRoomDict044), + close(Stream5))))->true;true), + + ((Direction=true, + sort(DirectionDict04,DirectionDict044), + (DirectionDict04=DirectionDict044->true; + (open_s("../Text-to-Breasonings/directiondict.txt",write,Stream6), +%% string_codes(BrDict3), + write(Stream6,DirectionDict044), + close(Stream6))))->true;true), + + ((ObjectToPrepare=true, + sort(ObjectToPrepareDict04,ObjectToPrepareDict044), + (ObjectToPrepareDict04=ObjectToPrepareDict044->true; + (open_s("../Text-to-Breasonings/objecttopreparedict.txt",write,Stream7), +%% string_codes(BrDict3), + write(Stream7,ObjectToPrepareDict044), + close(Stream7))))->true;true), + + ((ObjectToFinish=true, + sort(ObjectToFinishDict04,ObjectToFinishDict044), + (ObjectToFinishDict04=ObjectToFinishDict044->true; + (open_s("../Text-to-Breasonings/objecttofinishdict.txt",write,Stream8), +%% string_codes(BrDict3), + write(Stream8,ObjectToFinishDict044), + close(Stream8))))->true;true), + */ + length(List1,List1_length_a), + Dividend_a is ceiling(List1_length_a/250), + Dividend_b is Dividend_a*3, % for graciously giving + texttobr2_a(Dividend_b,meditation), + texttobr2_a(Dividend_b,medicine), + texttobr2_a(Dividend_b,pedagogy) + )), + + !. + + +replace0(Input,Find,Replace,SepandPad,M,Output0) :- + replace00(Input,Find,Replace,SepandPad,[],Output1), + truncate(Output1,M,Output0),!. +replace00([],_Find,_Replace,_SepandPad,Input,Input) :- !. +replace00(Input1,Find,Replace,SepandPad,Input2,Input3) :- + Input1=[Input4|Input5], + string_codes(Input4,String1), + replace1(String1,Find,Replace,[],Input7), + string_codes(Output,Input7), + split_string(Output,SepandPad,SepandPad,Input8), + append(Input2,Input8,Input9), + replace00(Input5,Find,Replace,SepandPad,Input9,Input3), !. 
+replace1([],_Find,_Replace,Input,Input) :- !.
+replace1(Input1,Find,Replace,Input2,Input3) :-
+	Input1=[Input4|Input5],
+	member(Input4,Find),
+	append(Input2,[Replace],Input6),
+	replace1(Input5,Find,Replace,Input6,Input3), !.
+replace1(Input1,Find,Replace,Input2,Input3) :-
+	Input1=[Input4|Input5],
+	not(member(Input4,Find)),
+	append(Input2,[Input4],Input6),
+	replace1(Input5,Find,Replace,Input6,Input3), !.
+
+split_string_onnonletter(String00,List1) :-
+	string_codes(String00,String1),
+	split_string_onnonletter(String1,[],List0),
+	string_codes(List0,List2),
+	split_string(List2," "," ",List1),!.
+split_string_onnonletter([],Input,Input) :- !.
+split_string_onnonletter(Input1,Input2,Input3) :-
+	Input1=[Input4|Input5],
+	not(char_type(Input4,alpha)),
+	append(Input2,[32],Input6),
+	split_string_onnonletter(Input5,Input6,Input3), !.
+split_string_onnonletter(Input1,Input2,Input3) :-
+	Input1=[Input4|Input5],
+	char_type(Input4,alpha),
+	append(Input2,[Input4],Input6),
+	split_string_onnonletter(Input5,Input6,Input3), !.
+
+%% Truncates List1 to its first M elements if M is a number and M is less than or equal to the length of List1; otherwise returns List1 unchanged
+truncate(List1,M,String0) :-
+	((number(M),length(String0,M),
+	append(String0,_,List1))->true;
+	String0=List1),!.
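truncate/3 keeps only the first M words when a numeric limit is given; any non-number (the atoms all or u) leaves the list untouched. A hedged Python equivalent (illustrative sketch only):

```python
def truncate(items, m):
    """Keep the first m items when m is an integer and m <= len(items);
    otherwise (e.g. m == "all") return the list unchanged, as truncate/3 does."""
    if isinstance(m, int) and m <= len(items):
        return items[:m]
    return items
```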
+ + + +prep(List,BrDict03,BrDict03t,Filex,Stringx1,M,Brth,BrthDict03,Room,RoomDict03,PartOfRoom,PartOfRoomDict03,Direction,DirectionDict03,ObjectToPrepare,ObjectToPrepareDict03,ObjectToFinish,ObjectToFinishDict03) :- + phrase_from_file_s(string(BrDict0), "../Text-to-Breasonings/brdict1.txt"), + + %%Chars="’", + SepandPad="&#@~%`$?-+*^,()|.:;=_/[]<>{}\n\r\s\t\\\"!'0123456789", + %%split_string(BrDict0,SepandPad,SepandPad,BrDict01), +%%writeln([brDict0,BrDict0]), +%%writeln([brdict1]), + splitfurther(BrDict0,BrDict01), +%%writeln([brDict01,BrDict01]), + %%char_code(Escape,27), + %%delete(BrDict01,[Escape,_,_,_,_],BrDict021), +%%writeln([brDict021,BrDict021]), + %%char_code(Apostrophe,8217), + %%delete(BrDict021,[Apostrophe,_,_,_,_],BrDict02), +%%writeln([brDict02,BrDict02]), + sort(BrDict01,BrDict03), +%%writeln([brDict03,BrDict03]), + length(BrDict03,Length0),write("Number of words in dictionary: "), writeln(Length0), + + %%writeln(''), + %%writeln([brdict2]), + phrase_from_file_s(string(BrDict0t), "../Text-to-Breasonings/brdict2.txt"), + %%Chars="’", + %%split_string(BrDict0,SepandPad,SepandPad,BrDict01), +%%writeln([brDict0,BrDict0]), + splitfurthert(BrDict0t,BrDict01t), +%%writeln([brDict01,BrDict01]), + %%delete(BrDict01t,[Escape,_,_,_,_],BrDict021t), +%%writeln([brDict021,BrDict021]), + %%delete(BrDict021t,[Apostrophe,_,_,_,_],BrDict02t), +%%writeln([brDict02,BrDict02]), + sort(BrDict01t,BrDict03t), + +% br_freq %B=BrDict03t,A=BrDict03,findall([DL,C,"\n"],(member([C,_,_,_],B),findall(_,member([_,C],A),D),length(D,DL)),E),sort(E,F),reverse(F,G),writeln([br_freq,G]), + +%%writeln([brDict03,BrDict03]), + length(BrDict03t,Length0t),write("Number of unique breasonings in dictionary: "), writeln(Length0t), + + ((Stringx1=u, + phrase_from_file_s(string(String001), Filex))->true; + String001=Stringx1), + + process_t2b(String001,String00), + + retractall(f(_)), + assertz(f(String00)), + + %trace, + %string_codes(SNL0,String00), + 
split_string(String00,"\n\r.","\n\r.",SNL1), + delete(SNL1,"",SNL), + length(SNL,SN), + retractall(sn(_)), + assertz(sn(SN)), + + split_string(String00,SepandPad,SepandPad,List1), + %%split_string_onnonletter(String00,List1), + + truncate(List1,M,List), + + %findall([X," "],member(X,List),Y), + %foldr(string_concat,Y,Y1), + + + /**replace0(String0,[8221, 8220], 34, SepandPad, M, String1), + replace0(String1,[8216, 8217], 39, SepandPad, M, String2), + replace0(String2,[8259, 8211, 8212], 45, SepandPad, M, String3), + replace0(String3,[160], 32, SepandPad, M, List), + **/ + +%%atom_codes(Atom999,String),writeln([atom999,Atom999]), + +%%writeln([list,List]), + %%delete(List,Escape,List11), +%%writeln([list11,List11]), + %%delete(List11,Apostrophe,List1), +%%writeln([list1,List1]), + length(List,Length1),write("Number of words to breason out in file: "), writeln(Length1), + sort(List,List2), +%%writeln([list2,List2]), + length(List2,Length2),write("Number of unique words in file: "), writeln(Length2), + + + + ((Brth=true, + phrase_from_file_s(string(BrthDict0), "../Text-to-Breasonings/brthdict.txt"), splitfurther(BrthDict0,BrthDict01), + sort(BrthDict01,BrthDict03), + length(BrthDict03,BrthLength0),write("Number of unique breathsonings in dictionary: "), writeln(BrthLength0))->true;true), + + + ((Room=true, + phrase_from_file_s(string(RoomDict0), "../Text-to-Breasonings/roomdict.txt"), splitfurther(RoomDict0,RoomDict01), + sort(RoomDict01,RoomDict03), + length(RoomDict03,RoomLength0),write("Number of unique rooms in dictionary: "), writeln(RoomLength0))->true;true), + + ((PartOfRoom=true, + phrase_from_file_s(string(PartOfRoomDict0), "../Text-to-Breasonings/partofroomdict.txt"), splitfurther(PartOfRoomDict0,PartOfRoomDict01), + sort(PartOfRoomDict01,PartOfRoomDict03), + length(PartOfRoomDict03,PartOfRoomLength0),write("Number of unique parts of rooms in dictionary: "), writeln(PartOfRoomLength0))->true;true), + + ((Direction=true, + 
phrase_from_file_s(string(DirectionDict0), "../Text-to-Breasonings/directiondict.txt"), splitfurther(DirectionDict0,DirectionDict01), + sort(DirectionDict01,DirectionDict03), + length(DirectionDict03,DirectionLength0),write("Number of unique directions in dictionary: "), writeln(DirectionLength0))->true;true), + + ((ObjectToPrepare=true, + phrase_from_file_s(string(ObjectToPrepareDict0), "../Text-to-Breasonings/objecttopreparedict.txt"), splitfurther(ObjectToPrepareDict0,ObjectToPrepareDict01), + sort(ObjectToPrepareDict01,ObjectToPrepareDict03), + length(ObjectToPrepareDict03,ObjectToPrepareLength0),write("Number of unique objects to prepare in dictionary: "), writeln(ObjectToPrepareLength0))->true;true), + + ((ObjectToFinish=true, + phrase_from_file_s(string(ObjectToFinishDict0), "../Text-to-Breasonings/objecttofinishdict.txt"), splitfurther(ObjectToFinishDict0,ObjectToFinishDict01), + sort(ObjectToFinishDict01,ObjectToFinishDict03), + length(ObjectToFinishDict03,ObjectToFinishLength0),write("Number of unique objects to finish in dictionary: "), writeln(ObjectToFinishLength0))->true;true), + +(complete_display(true)-> + ((Stringx1=u, %% Use file, not string as input. 
+ + %%maplist(downcase_atom, List2, List3), + maplist(string_lower, List2, List3), + +%%writeln([list3,List3]), + towords3(BrDict03,[],BrDict04,[],_ObjectNames,[],AllUsedNames), + towords2(BrDict03t,[],BrDict04t), + +%%writeln([brDict04,BrDict04]), + subtract(List3,BrDict04,D1), +%%writeln([list3,brDict04,d1,List3,BrDict04,D1]), +%%writeln(["subtract(BrDict04,List3,D1).",List3,BrDict04,D1]), + length(D1,Length01),Difference is abs(Length01),write("Number of words remaining to define: "), writeln(Difference), + + subtract(AllUsedNames,BrDict04t,D2), + %%delete(D21,'',D2), + length(D2,Length01t),Differencet is abs(Length01t),write("Number of undefined breasonings: "), writeln(Differencet), + %% writeln([undefinedbreasonings,D2]), %% Print undefined breasonings + + %%delete(D31,'',D3), + subtract(BrDict04t,AllUsedNames,D3), + length(D3,Length01t2),Differencet2 is abs(Length01t2),write("Number of orphaned breasonings: "), writeln(Differencet2), + + %%,writeln([orphanedbreasonings,D3]) %% Print orphaned breasonings + + ((Brth=true, + + %%towords3(BrDict03,[],BrDict04,[],_ObjectNames,[],AllUsedNames), + towords2a(BrthDict03,[],BrtDict04t), + + %%subtract(List3,BrtDict04t,Dt1), + %%length(Dt1,Lengtht01),Differencet1 is abs(Lengtht01),write("Number of words remaining to define breathsonings for: "), writeln(Differencet1), + %%writeln(["Number of words remaining to define breathsonings for",Dt1]), %% Print number of words remaining to define breathsonings for + + subtract(AllUsedNames,BrtDict04t,Dt2), + length(Dt2,Lengtht01t),Differencet22 is abs(Lengtht01t),write("Number of undefined breathsonings: "), writeln(Differencet22), + %%writeln([undefinedbreathsonings,Dt2]), %% Print undefined breathsonings + + subtract(BrtDict04t,AllUsedNames,Dt3), + length(Dt3,Lengtht01t2),Differencet3 is abs(Lengtht01t2),write("Number of orphaned breathsonings: "), writeln(Differencet3) + + %%writeln([orphanedbreathsonings,Dt3]) %% Print orphaned breathsonings + + )->true;true), + + 
((Room=true, + towords2a(RoomDict03,[],RoomDict04t), + subtract(AllUsedNames,RoomDict04t,RoomDt2), + length(RoomDt2,RoomLengtht01t),RoomDifferencet22 is abs(RoomLengtht01t),write("Number of undefined rooms: "), writeln(RoomDifferencet22), + %%writeln([undefinedrooms,RoomDt2]), %% Print undefined rooms + subtract(RoomDict04t,AllUsedNames,RoomDt3), + length(RoomDt3,RoomLengtht01t2),RoomDifferencet3 is abs(RoomLengtht01t2),write("Number of orphaned rooms: "), writeln(RoomDifferencet3) + %%writeln([orphanedrooms,RoomDt3]) %% Print orphaned rooms + )->true;true), + + ((PartOfRoom=true, + towords2a(PartOfRoomDict03,[],PartOfRoomDict04t), + subtract(AllUsedNames,PartOfRoomDict04t,PartOfRoomDt2), + length(PartOfRoomDt2,PartOfRoomLengtht01t),PartOfRoomDifferencet22 is abs(PartOfRoomLengtht01t),write("Number of undefined parts of rooms: "), writeln(PartOfRoomDifferencet22), + %%writeln([undefinedPartsOfRooms,PartOfRoomDt2]), %% Print undefined PartsOfRooms + subtract(PartOfRoomDict04t,AllUsedNames,PartOfRoomDt3), + length(PartOfRoomDt3,PartOfRoomLengtht01t2),PartOfRoomDifferencet3 is abs(PartOfRoomLengtht01t2),write("Number of orphaned parts of rooms: "), writeln(PartOfRoomDifferencet3) + %%writeln([orphanedPartsOfRooms,PartOfRoomDt3]) %% Print orphaned PartsOfRooms + )->true;true), + + ((Direction=true, + towords2a(DirectionDict03,[],DirectionDict04t), + subtract(AllUsedNames,DirectionDict04t,DirectionDt2), + length(DirectionDt2,DirectionLengtht01t),DirectionDifferencet22 is abs(DirectionLengtht01t),write("Number of undefined directions: "), writeln(DirectionDifferencet22), + %%writeln([undefinedDirections,DirectionDt2]), %% Print undefined Directions + subtract(DirectionDict04t,AllUsedNames,DirectionDt3), + length(DirectionDt3,DirectionLengtht01t2),DirectionDifferencet3 is abs(DirectionLengtht01t2),write("Number of orphaned directions: "), writeln(DirectionDifferencet3) + %%writeln([orphanedDirections,DirectionDt3]) %% Print orphaned Directions + )->true;true), + + 
((ObjectToPrepare=true, + towords2a(ObjectToPrepareDict03,[],ObjectToPrepareDict04t), + subtract(AllUsedNames,ObjectToPrepareDict04t,ObjectToPrepareDt2), + length(ObjectToPrepareDt2,ObjectToPrepareLengtht01t),ObjectToPrepareDifferencet22 is abs(ObjectToPrepareLengtht01t),write("Number of undefined objects to prepare: "), writeln(ObjectToPrepareDifferencet22), + %%writeln([undefinedObjectsToPrepare,ObjectToPrepareDt2]), %% Print undefined ObjectsToPrepare + subtract(ObjectToPrepareDict04t,AllUsedNames,ObjectToPrepareDt3), + length(ObjectToPrepareDt3,ObjectToPrepareLengtht01t2),ObjectToPrepareDifferencet3 is abs(ObjectToPrepareLengtht01t2),write("Number of orphaned objects to prepare: "), writeln(ObjectToPrepareDifferencet3) + %%writeln([orphanedObjectsToPrepare,ObjectToPrepareDt3]) %% Print orphaned ObjectsToPrepare + )->true;true), + + ((ObjectToFinish=true, + towords2a(ObjectToFinishDict03,[],ObjectToFinishDict04t), + subtract(AllUsedNames,ObjectToFinishDict04t,ObjectToFinishDt2), + length(ObjectToFinishDt2,ObjectToFinishLengtht01t),ObjectToFinishDifferencet22 is abs(ObjectToFinishLengtht01t),write("Number of undefined objects to finish: "), writeln(ObjectToFinishDifferencet22), + %%writeln([undefinedObjectsToFinish,ObjectToFinishDt2]), %% Print undefined ObjectsToFinish + subtract(ObjectToFinishDict04t,AllUsedNames,ObjectToFinishDt3), + length(ObjectToFinishDt3,ObjectToFinishLengtht01t2),ObjectToFinishDifferencet3 is abs(ObjectToFinishLengtht01t2),write("Number of orphaned objects to finish: "), writeln(ObjectToFinishDifferencet3) + %%writeln([orphanedObjectsToFinish,ObjectToFinishDt3]) %% Print orphaned ObjectsToFinish + )->true;true) + + + +)->true;(string(Filex),writeln("Number of words, unique words, unique breathsonings, words remaining to define, undefined breasonings, orphaned breasonings, undefined breathsonings and orphaned breathsonings skipped for speed when breasoning out a string.")));true) + +,!. 
+ +br2(_,A,A,B,B,0,_Brth,BrthDict03,BrthDict03,_Room,RoomDict03,RoomDict03,_PartOfRoom,PartOfRoomDict03,PartOfRoomDict03,_Direction,DirectionDict03,DirectionDict03,_ObjectToPrepare,ObjectToPrepareDict03,ObjectToPrepareDict03,_ObjectToFinish,ObjectToFinishDict03,ObjectToFinishDict03) :- !. +br2(List1,BrDict03,BrDict2,BrDict03t,BrDict03t2,N1,Brth,BrthDict03,BrthDict04,Room,RoomDict03,RoomDict04,PartOfRoom,PartOfRoomDict03,PartOfRoomDict04,Direction,DirectionDict03,DirectionDict04,ObjectToPrepare,ObjectToPrepareDict03,ObjectToPrepareDict04,ObjectToFinish,ObjectToFinishDict03,ObjectToFinishDict04) :- + br(List1,BrDict03,BrDict21,BrDict03t,BrDict03t21,Brth,BrthDict03,BrthDict041,Room,RoomDict03,RoomDict041,PartOfRoom,PartOfRoomDict03,PartOfRoomDict041,Direction,DirectionDict03,DirectionDict041,ObjectToPrepare,ObjectToPrepareDict03,ObjectToPrepareDict041,ObjectToFinish,ObjectToFinishDict03,ObjectToFinishDict041), + N2 is N1-1, + br2(List1,BrDict21,BrDict2,BrDict03t21,BrDict03t2,N2,Brth,BrthDict041,BrthDict04,Room,RoomDict041,RoomDict04,PartOfRoom,PartOfRoomDict041,PartOfRoomDict04,Direction,DirectionDict041,DirectionDict04,ObjectToPrepare,ObjectToPrepareDict041,ObjectToPrepareDict04,ObjectToFinish,ObjectToFinishDict041,ObjectToFinishDict04),!. + +towords2([],A,A) :- !. +towords2(BrDict03,A,B) :- + BrDict03=[[Word,_,_,_]|Rest], + %%atom_string(Atom,Word), + append(A,[Word],C), + towords2(Rest,C,B). + +towords2a([],A,A) :- !. +towords2a(BrDict03,A,B) :- + BrDict03=[[Word,_]|Rest], + %%atom_string(Atom,Word), + append(A,[Word],C), + towords2a(Rest,C,B). + +towords3([],A,A,C,C,D,D) :- !. +towords3(BrDict03,A,B,D,E,G,H) :- + BrDict03=[[Word1,Word2]|Rest], + (Word2=""->append(G,[Word1],I)->true; + append(G,[Word2],I)), + append(A,[Word1],C), + append(D,[Word2],F), + towords3(Rest,C,B,F,E,I,H). + +string(String) --> list(String). + +list([]) --> []. +list([L|Ls]) --> [L], list(Ls). + +splitfurther(BrDict01,N) :- + phrase(file0(N),BrDict01). + +file0(N) --> "[", file(N), "]", !. 
+file0([]) --> []. + +%%file([]) --> []. +file([L|Ls]) --> entry(L),",", +%%{writeln(L)}, %%*** +file(Ls), !. %% file(Ls),{M=[Ls]})), !. %%, {writeln(["l",L])},",", file(Ls), {writeln(["ls",Ls])},!. %%, {append(L,Ls,M)}, !. +file([L]) --> entry(L), +%%{writeln(L)}, +!. %%(entry(L),{M=L});{M=[],(writeln("Warning - Entry in incorrect format.") +%%,abort +%%)}, !. + +entry([Word2,Word4]) --> + "[", word(Word), {string_codes(Word2,Word),string(Word2)}, + ",", + word(Word3), {string_codes(Word4,Word3),string(Word4)}, + "]". + +splitfurthert(BrDict01,N) :- + phrase(file0t(N),BrDict01). + +file0t(N) --> "[", filet(N), "]", !. +file0t([]) --> []. + +%%file([]) --> []. +filet([L|Ls]) --> entryt(L),",", +%%{writeln(L)}, %%*** +filet(Ls), !. %% file(Ls),{M=[Ls]})), !. %%, {writeln(["l",L])},",", file(Ls), {writeln(["ls",Ls])},!. %%, {append(L,Ls,M)}, !. +filet([L]) --> entryt(L), +%%{writeln(L)}, +!. %%(entry(L),{M=L});{M=[],(writeln("Warning - Entry in incorrect format.") +%%,abort +%%)}, !. + +entryt([Word2,X3,Y3,Z3]) --> + "[", word(Word), {string_codes(Word2,Word),string(Word2)}, + ",", + digits(X),",",{atom_codes(X2,X),atom_number(X2,X3),number(X3)}, + digits(Y),",",{atom_codes(Y2,Y),atom_number(Y2,Y3),number(Y3)}, + digits(Z),{atom_codes(Z2,Z),atom_number(Z2,Z3),number(Z3)}, + "]". + +word([X|Xs]) --> [X], {char_type(X,csymf)->true;(X=27->true;X=8217)}, word(Xs), !. +%%word([X]) --> [X], {char_type(X,csymf);(X=27;X=8217)}, !. +word([]) --> []. + +digits([X|Xs]) --> [X], {(char_type(X,digit)->true;(string_codes(Word2,[X]),Word2="."))}, digits(Xs), !. +%%digits([X]) --> [X], {(char_type(X,digit);(string_codes(Word2,[X]),Word2="."))}, !. +digits([]) --> []. + +t(BrDict,BrDict4,Word,W2) :- + W1=[square,1,1,0], + (member([Word,Word1],BrDict)-> + (member([Word1,X,Y,Z],BrDict4)->W2=[Word1,X,Y,Z];W2=W1);W2=W1). 
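t/4 above resolves each word through two dictionaries — word to object name (brdict1), then object name to x, y, z dimensions (brdict2) — substituting a default 1x1x0 square whenever either lookup misses. A Python sketch of the same two-stage lookup (function and dictionary names are assumptions, not from the source):

```python
DEFAULT = ("square", 1, 1, 0)  # fallback breasoning, as in W1=[square,1,1,0]

def lookup(word, word_to_object, object_to_xyz):
    """Two-stage dictionary lookup with a default on either miss."""
    obj = word_to_object.get(word)
    if obj is None:
        return DEFAULT
    xyz = object_to_xyz.get(obj)
    return (obj, *xyz) if xyz is not None else DEFAULT

lookup("apple", {"apple": "ball"}, {"ball": (5, 5, 5)})  # ("ball", 5, 5, 5)
```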
+ +br([],B,B,C,C,_,D,D,_Room,RoomDict03,RoomDict03,_PartOfRoom,PartOfRoomDict03,PartOfRoomDict03,_Direction,DirectionDict03,DirectionDict03,_ObjectToPrepare,ObjectToPrepareDict03,ObjectToPrepareDict03,_ObjectToFinish,ObjectToFinishDict03,ObjectToFinishDict03) :- + !. +br(Words,BrDict,BrDict2,BrDict4,BrDict5,Brth,BrthDict03,BrthDict04,Room,RoomDict03,RoomDict04,PartOfRoom,PartOfRoomDict03,PartOfRoomDict04,Direction,DirectionDict03,DirectionDict04,ObjectToPrepare,ObjectToPrepareDict03,ObjectToPrepareDict04,ObjectToFinish,ObjectToFinishDict03,ObjectToFinishDict04) :- + +%trace, + findall([Word1,X,Y,Z],(member(W1,Words),t(BrDict,BrDict4,W1,[Word1,X,Y,Z])),Words1), + length(Words1,Words1L), + sn(SN), + + divide(1000,Words1,[],Words2), + length(Words2,Words2L), + numbers(Words2L,1,[],Words2Ns), + findall(Words6,(member(UN,Words2Ns),get_item_n(Words2,UN,U), + findall(["[",W,",",X,",",Y,",",Z,"]",","],(member([W,X,Y,Z],U)),U1), + flatten(U1,Words3), + append(Words4,[_],Words3), + flatten(["a",UN,"(",Words4,")"],Words5), + %trace, + foldr(string_concat,Words5,Words6)),Words7), + %foldr(string_concat,Words7,Words8), + + findall([X1,",\n"],member(X1,Words7),X2), + flatten(["a:-",X2],X3), + append(X31,[_],X3), + foldr(string_concat,X31,Words8), + + findall([X1,".\n"],member(X1,Words7),X21), + flatten(X21,X22), + foldr(string_concat,X22,Words81), + foldr(string_concat,["%SN=",SN,"\n","main:-catch(a,Err,handle_error(Err)),halt.\nhandle_error(_Err):-\n halt(1).\nmain :- halt(1).\n",Words8,".\n",Words81],Words9), + save_file_s("a.pl",Words9), + shell1_s("swipl --goal=main --stand_alone=true -o a -c a.pl"), + + + %trace, + + R is ceiling((4*16000)/SN), % medit, tt, medic, hq thoughts + + %numbers(R,1,[],Rs), + length(Rs,R), + findall(["a",","],member(_R1,Rs),R2), + flatten(["b:-",R2],R3), + append(R31,[_],R3), + flatten(["%R=",R,"\n","main:-catch(b,Err,handle_error(Err)),halt.\nhandle_error(_Err):-\n halt(1).\nmain :- halt(1).\n",R31,".\n",Words8,".\n",Words81],Words92), 
+foldr(string_concat,Words92,RWords8), + + %foldr(string_concat,Words91,RWords8), + save_file_s("b.pl",RWords8), + shell1_s("swipl --goal=main --stand_alone=true -o b -c b.pl"), + + +% for attached people (to a single simulation person) + +/* + R0 is ceiling((4*16000)/SN), % medit, tt, medic frozen age, hq thought + + + %numbers(R,1,[],Rs), + length(Rs0,R0), + findall(["a",","],member(_,Rs0),R20), + flatten(["b:-",R20],R30), + append(R310,[_],R30), + flatten(["%R=",R0,"\n","main:-b.\n",R310,".\n",Words8,".\n",Words81],Words920), +foldr(string_concat,Words920,RWords80), + + %foldr(string_concat,Words91,RWords8), + save_file_s("b0.pl",RWords80), + shell1_s("swipl --goal=main --stand_alone=true -o b0 -c b0.pl"), +*/ + + +f(F),term_to_atom(F,F1), + flatten([":-include('texttobr.pl').\n","main:-catch(texttobr(",R,",u,",F1,",u),Err,handle_error(Err)),halt.\nhandle_error(_Err):-\n halt(1).\nmain :- halt(1).\n"],Words93), +foldr(string_concat,Words93,RWords81), + save_file_s("c.pl",RWords81), + shell1_s("swipl --goal=main --stand_alone=true -o c -c c.pl"), + + + !. + +divide(_,[],Words2,Words2) :- !. +divide(N,Words1,Words21,Words22) :- + length(L,N), + (append(L,L1,Words1)->L2=L; + (L2=Words1,L1=[])), + append(Words21,[L2],Words23), + divide(N,L1,Words23,Words22),!. + + %%). +brth(_,sweetinvincibleandprayedfor). + +%% finds unknown words, asks for their br in form "n of m: word", verify, (can go back x) append and sort, save +read_string1(S,user_input, "\n", "\r", _, ObjectToFinishString) :- + (auto(on)->S=ObjectToFinishString; + read_string(user_input, "\n", "\r", _, ObjectToFinishString)),!. + +%/* +process_t2b(A,C) :- + replace_t2b(Replacements), + atom_string(A1,A), + replace1_t2b(Replacements,A1,D1), + atom_string(D1,C),!. + +replace1_t2b([],A,A) :- !. +replace1_t2b(Replacements,A,D) :- + Replacements=[[B,C]|G], + atomic_list_concat(E,B,A), + atomic_list_concat(E,C,F), + replace1_t2b(G,F,D),!. 
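The `divide/4` helper above splits a list into chunks of at most N elements, with the final chunk possibly shorter; `br/23` appears to use it to group breasonings into 1000-item predicates. An illustrative query with made-up data:

```prolog
% Illustrative query for divide/4 with a chunk size of 2.
?- divide(2, [a,b,c,d,e], [], Chunks).
% Chunks = [[a,b],[c,d],[e]]
```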
+ + replace_t2b([['\\',''],['–',' '],['—',' '],['“','\''],['”','\''],['‘','\''],['’','\''],['⁃','-']]). +%*/ \ No newline at end of file diff --git a/Text-to-Breasonings-master/time_hop b/Text-to-Breasonings-master/time_hop new file mode 100644 index 0000000000000000000000000000000000000000..55829e98759764b44122e6817b7ce72bbba7ef34 Binary files /dev/null and b/Text-to-Breasonings-master/time_hop differ diff --git a/Text-to-Breasonings-master/time_hop.pl b/Text-to-Breasonings-master/time_hop.pl new file mode 100644 index 0000000000000000000000000000000000000000..18b13c50bcbf3d46477fa1d2fce0ccd76f9e8e30 --- /dev/null +++ b/Text-to-Breasonings-master/time_hop.pl @@ -0,0 +1,6 @@ +:-include('text_to_breasonings.pl'). +main:-catch((N is 1,texttobr2_1(N)),Err,handle_error(Err)),halt. + +handle_error(_Err):- + halt(1). +main :- halt(1). \ No newline at end of file diff --git a/Text-to-Breasonings-master/truncate.pl b/Text-to-Breasonings-master/truncate.pl new file mode 100644 index 0000000000000000000000000000000000000000..827a50170a5fe240af74695283ba855ba9d20dfe --- /dev/null +++ b/Text-to-Breasonings-master/truncate.pl @@ -0,0 +1,44 @@ +%:-include('la_strings2.pl'). +:-include('../listprologinterpreter/la_strings.pl'). +:-include('../listprologinterpreter/la_strings_string.pl'). + +% word_count(["file","file.txt"],Words). +% Words = 69. + +% word_count(["string","a b c"],Words). +% Words = 3. 
+ + +% given file with number of As required and gl file length, gives div and mod values for t2b + +truncate1(Type,File1,Words1,File2) :- + + +SepandPad="&#@~%`$?-+*^,()|.:;=_/[]<>{}\n\r\s\t\\\"!'0123456789", + + (Type=file-> + (phrase_from_file_s(string(String2), File1)); + (Type=string,File1=String2)), + split_string(String2,SepandPad,SepandPad,String3), + %writeln(String3), + length(String3,Words2), + + (Words1>Words2->Words3=Words2;Words3=Words1), + length(String4,Words3), + append(String4,_,String3), + + findall([Item," "],member(Item,String4),Item2), + maplist(append,[Item2],[Item3]), + concat_list(Item3,String5), + + %term_to_atom(Item2,String4a), + %string_atom(String5,String4a), + + (Type=file-> + + (open_s(File2,write,Stream3), + write(Stream3,String5), + close(Stream3)); + + (Type=string,File2=String5)) + ,!. diff --git a/Text-to-Breasonings-master/truncate_between.pl b/Text-to-Breasonings-master/truncate_between.pl new file mode 100644 index 0000000000000000000000000000000000000000..9923ec033d09ca4393d23d5b8230950237dacddb --- /dev/null +++ b/Text-to-Breasonings-master/truncate_between.pl @@ -0,0 +1,43 @@ +%:-include('la_strings2.pl'). +:-include('../listprologinterpreter/la_strings.pl'). +:-include('../listprologinterpreter/la_strings_string.pl'). + +% word_count(["file","file.txt"],Words). +% Words = 69. + +% word_count(["string","a b c"],Words). +% Words = 3. 
+ + +% given file with number of As required and gl file length, gives div and mod values for t2b + +truncate_between1(File1,Words_before,Words_during,File2) :- + + Words_total is Words_before+Words_during, + +SepandPad="&#@~%`$?-+*^,()|.:;=_/[]<>{}\n\r\s\t\\\"!'0123456789", + + (true-> + (phrase_from_file_s(string(String2), File1)); + _String1=String2), + split_string(String2,SepandPad,SepandPad,String3), + %writeln(String3), + length(String3,Words2), + + (Words_total>Words2->Words3=Words2;Words3=Words_total), + length(String4,Words3), + append(String4,_,String3), + + length(String5a,Words_during), + append(_,String5a,String4), + + findall([Item," "],member(Item,String5a),Item2), + maplist(append,[Item2],[Item3]), + concat_list(Item3,String5), + + %term_to_atom(Item2,String4a), + %string_atom(String5,String4a), + + (open_s(File2,write,Stream3), + write(Stream3,String5), + close(Stream3)),!. diff --git a/Text-to-Breasonings-master/truncate_words_conserving_formatting.pl b/Text-to-Breasonings-master/truncate_words_conserving_formatting.pl new file mode 100644 index 0000000000000000000000000000000000000000..fa824f372e692c2aa16cb1ff3a3ac709d1dc6ee3 --- /dev/null +++ b/Text-to-Breasonings-master/truncate_words_conserving_formatting.pl @@ -0,0 +1,78 @@ +:-include('../listprologinterpreter/la_strings.pl'). +:-include('../listprologinterpreter/la_strings_string.pl'). + +% ?- truncate_words_conserving_formatting(["string","a123b ()c d e f"],3,["string",S]). +% S = "a123b ()c". + +/** +?- truncate_words_conserving_formatting(["file","file3.txt"],1,["file","file4.txt"]). 
+**/ + +truncate_words_conserving_formatting([Type1,File1],Words1,[Type2,File2]) :- + % split on non-alpha chars, count n words and keep formatting + + SepandPad="&#@~%`$?-+*^,()|.:;=_/[]<>{}\n\r\s\t\\\"!'0123456789", + + (Type1="file"-> + (phrase_from_file_s(string(String2), File1)); + File1=String2), + + split_string_onnonletter_by_itself2(String2,SepandPad,String3), + + %trace, + + first_n_words(String3,Words1,[],String4), + maplist(append,[[String4]],[Item3]), + + concat_list(Item3,String5), + + (Type2="file"-> + (open_s(File2,write,Stream2), + write(Stream2,String5), + close(Stream2),!); + (String5=File2)). + + + +split_string_onnonletter_by_itself2(String00,Chars,List1) :- + string_codes(String00,String1), + split_string_onnonletter_by_itself2(String1,[],Chars,List0), + %string_codes(String1a,List0), + %writeln(String1a), + %trace, + string_codes(List0,List2), + split_string(List2,"ª","ª",List1),!. +split_string_onnonletter_by_itself2([],Input,_Chars,Input) :- !. +split_string_onnonletter_by_itself2(Input1,Input2,Chars,Input3) :- + Input1=[Input4|Input5], + %not(char_type(Input4,alpha)), + string_codes(Chars,Codes), + member(Code,Codes), + Input4=Code, + append(Input2,`ª`,Input81), + append(Input81,`‡`,Input8), + append(Input8,[Code],Input7), + append(Input7,`ª`,Input6), + split_string_onnonletter_by_itself2(Input5,Input6,Chars,Input3), !. +split_string_onnonletter_by_itself2(Input1,Input2,Chars,Input3) :- + Input1=[Input4|Input5], + append(Input2,[Input4],Input6), + split_string_onnonletter_by_itself2(Input5,Input6,Chars,Input3), !. + +first_n_words([],_,String,String) :- !. +first_n_words(_String1,0,String,String) :- !. +first_n_words(String1,Words1,String2a,String2b) :- + String1=[String3|String4], + string_length(String3,2), + string_concat(A,B,String3),string_length(B,1), + A="‡", + append(String2a,[B],String5), + first_n_words(String4,Words1,String5,String2b),!. 
+first_n_words(String1,Words1,String2a,String2b) :-
+ String1=[String3|String4],
+ (not((string_length(String3,2),
+ string_concat(A,B,String3),string_length(B,1),
+ A="‡"))),
+ append(String2a,[String3],String5),
+ Words2 is Words1-1,
+ first_n_words(String4,Words2,String5,String2b),!.
diff --git a/Text-to-Object-Name/.DS_Store b/Text-to-Object-Name/.DS_Store
new file mode 100644
index 0000000000000000000000000000000000000000..5008ddfcf53c02e82d7eee2e57c38e5672ef89f6
Binary files /dev/null and b/Text-to-Object-Name/.DS_Store differ
diff --git a/Text-to-Object-Name/Instructions_for_Breasoning.txt b/Text-to-Object-Name/Instructions_for_Breasoning.txt
new file mode 100644
index 0000000000000000000000000000000000000000..883fe296c2ac7d5e5e69998b8fd05a2287306460
--- /dev/null
+++ b/Text-to-Object-Name/Instructions_for_Breasoning.txt
@@ -0,0 +1,180 @@
+Instructions for Breasoning
+
+
+Glossary: Breasoning is thinking of the x, y and z measurements in metres of an object in a sentence.
+
+
+
+1. Perform the following Daily Regimen, containing meditation, pedagogy and medicine preparation.
+
+
+Meditation
+
+
+You can naturally prevent a headache by silently repeating the mantras "lucian" and "arem" for twenty minutes, twice per day. The mantras become lighter and lighter, until one forgets the thought, clearing the mind. Also, silently repeat the sutra "green" for twenty minutes, twice per day, after two months of using the mantras in the morning and evening. Also, pray for no digestive system pops from practising the sutra each day.
+
+Quantum Box and Nut and Bolt
+
+
+Quantum Box
+
+
+The quantum box is a guided meditation, described later, that prevents headaches (it relaxes the head and neck muscles) and prevents muscle aches and pains. In it, you can make anything (e.g. a headache) instantly disappear, based on quantum physics (where, at the quantum level, things may disappear if we want them to).
+ + +Nut and Bolt + + +The nut and bolt is a guided meditation that is a more detailed prevention of problems the Quantum Box prevents. In it, pretend to unscrew a nut from a long thin bolt each day to prevent a headache, etc. + + +One-off + +One-off: Repeat Upasana * 800 and breason out the following 7 breasonings details to God. I make a small recommendation of a University accredited short Education course every two years or a degree every ten years. + + + +Repeat Upasana * 800 if not done in a chain of meditation days linked to today (and green, Purusha, Use, Teacher, Maharishi, Doctor each x 800 before Upasana if they are not done in a chain of meditation days linked to today). + +Breason out (think of examples with objects that you have thought of the x, y and z dimensions of) the following 7 breasonings details as an exercise. Pretend to breason out the breasonings details to God. + +See http://lucianspedagogy.blogspot.com.au/p/writing-sentences-by-making.html + +1. Details - As an exercise, think of two uses, a future use and two types for each object. + +2. Breasoning - Think of the x, y, z dimensions and colour of each object. + +3. Rebreasoning - Think of the fact that the person and the object in a sentence are connected by a verb (an action) that means they touch. + +4. Breathsoning - Think of a reason that the object is judged good (a judgment adjective). + +5. Rebreathsoning - Think of a reason that the verb (action) is judged done well (a judgment adverb). + +6. Space - Think of spatial characteristics of the person in relation to the object (room, part of room e.g. table and direction in room e.g. plate). + +7. Time - Think of temporal characteristics of the person in relation to the object (time to prepare e.g. buy, time to do e.g. eat, and time to finish e.g. place core in a receptacle). + + + + +Daily Regimen + + +I recommend Yoga and Qi Gong. 
+ + +Indicate 2 radio buttons (one for each of the following ideas, and one for each of these ideas working anyway, where a radio button is indicated by breasoning out, or thinking of the x, y and z dimensions of a counter, 0.01, 0.01, 0.005 metres). + +Too (just say “too” for each idea following to indicate the two radio buttons) + +meditation (80 lucian and arem mantras and 80 green sutras) + +Also, for the following, which also have quantum box and nut and bolt: + +No digestive system pops from practising the sutra + + +Also, for the following, which also have quantum box and nut and bolt: + +Protector from Headache in Meditation after Honours Study, etc. + +Also, for the following, which also have quantum box and nut and bolt: + +No headache + + +The following turn off headaches etc. + +Also, for the following, which also have quantum box and nut and bolt: + +Turn off workload from all employees including you below you, + +Also, for the following, which also have quantum box and nut and bolt: + +Detect and turn off workloads using manager algorithm. (The self does the As, turning off the workloads.) + +Also, for the following, which also have quantum box and nut and bolt: + +Detect and turn off workloads using manager network. (The others help finish the As, turning off the workloads.) + +Also, for the following, which also have quantum box and nut and bolt: + +No muscle aches/pains + + +2. One-off, breason out 250 characters from Lecturer Training, to be able to breason out anything you want, not just found out breasonings. + + +“Lecturer (all in this document) + + + 1. I collected the comments. + + 2. I wrote the formula. + + 3. I wrote the input. + + 4. I wrote the output. + + 5. I had the rights to government. + + 6. I had the rights to the state. + + 7. I had the rights to vote. + + 8. I amalgamated the b” + + +3. One-off, breason out 250 characters from Recordings Training, to be able to indicate that breasonings are implicitly breasoned out for you. 
+ + +“Recordings from 30 As 1 + + + 1. I meditated on the day I produced recordings. + + 2. I confirmed training. + + 3. Recordings relate to the rest of pedagogy. + + 4. The delegator matched recordings (workloads) with different recordings. + + 5. I endorsed the 5*50 As pract” + + +“Too”: 5 symbolic strawberries (each is 0.01, 0.01, 0.01 cm) for Recordings to work. + + + +4. 250 characters from Delegate Workloads + +"1. I cut off infinity. + 2. I used the sun. + 3. The queen helped. + 4. The delivery arrived. + 5. I earned the role. + 6. I wrote the developed text. + 7. I knew about the lotus spoon. + 8. I knew the female politician. + 9. I knew positive religion. + 10. P" + + +Notes on Text-to-Object-Name: + + +Replace ⁃, – in file.txt with - or crashes + +Replace “,” in file.txt with " or crashes + +Replace ‘,’ in file.txt with ' or crashes + +Replace MS Word return in file.txt with normal return or crashes + +Replace nonbreaking space " " in file.txt with normal space " " or crashes + + +If there are any invisible characters apart from esc then brdict.txt may become corrupt, and should be replaced with a backup. You could directly delete the invisible characters from brdict.txt, but Lucian Academy accepts no responsibility for this. + + +You don't have to reinsert smart quotes in file.txt afterwards because texttobr2 doesn't change the file. Simply use the copy of your text before you inserted and changed it in file.txt. diff --git a/Text-to-Object-Name/LICENSE b/Text-to-Object-Name/LICENSE new file mode 100644 index 0000000000000000000000000000000000000000..920183ffbdaca89fc6faee40b6de08a6c8a061c7 --- /dev/null +++ b/Text-to-Object-Name/LICENSE @@ -0,0 +1,29 @@ +BSD 3-Clause License + +Copyright (c) 2021, Lucian Green +All rights reserved. + +Redistribution and use in source and binary forms, with or without +modification, are permitted provided that the following conditions are met: + +1. 
Redistributions of source code must retain the above copyright notice, this + list of conditions and the following disclaimer. + +2. Redistributions in binary form must reproduce the above copyright notice, + this list of conditions and the following disclaimer in the documentation + and/or other materials provided with the distribution. + +3. Neither the name of the copyright holder nor the names of its + contributors may be used to endorse or promote products derived from + this software without specific prior written permission. + +THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" +AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE +IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE +DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE +FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL +DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR +SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER +CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, +OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE +OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. diff --git a/Text-to-Object-Name/README.md b/Text-to-Object-Name/README.md new file mode 100644 index 0000000000000000000000000000000000000000..2c8c2e2e28f34721f96d26fb74a74b9ee70c54bd --- /dev/null +++ b/Text-to-Object-Name/README.md @@ -0,0 +1,76 @@ +# Text-to-Object-Name + +* Text-to-Object-Name (T2ON) is a set of algorithms that converts a file to the set of a certain number of words, and a word frequency counter. It does not breason out words. + +* To breason out words, one needs to finish an Education short course and read the instructions in Instructions_for_Breasoning.txt. 
+ +* Generally, 80 word breasonings are needed to earn a high distinction at undergraduate level and below, have healthy children or sell products. This increases to 2 * 15 * 80=2400 breasonings per Honours level chapter, approximately 2 * 50 * 80=8000 breasonings per Masters level assignment and approximately 2 * 4 * 50 * 80=32,000 breasonings per PhD level assignment. + +* 50 As (50 * 80=4000 breasonings) are required to earn a job. + +# Getting Started + +Please read the following instructions on how to install the project on your computer for preparing for breasoning. + +# Prerequisites + +* Please download and install SWI-Prolog for your machine at `https://www.swi-prolog.org/build/`. + +# 1. Install manually + +Download this repository, Text to Breasonings repository, the Lucian Academy Data repository, and the List Prolog Interpreter Repository. + +# 2. Or Install from List Prolog Package Manager (LPPM) + +* Download the LPPM Repository: + +``` +mkdir GitHub +cd GitHub/ +git clone https://github.com/luciangreen/List-Prolog-Package-Manager.git +cd List-Prolog-Package-Manager +swipl +['lppm']. +lppm_install("luciangreen","Text-to-Object-Name"). +halt +``` + +# Running + +* In Shell: +`cd Text-to-Object-Name` +`swipl` + +* Enter: +`['text_to_object_name.pl'].` + +* In the SWI-Prolog environment, enter: +`t2on(N,File,String,M).` +where N is the number of times to output the file, File is the file name, String is the string to output and M is the number of words in the file to output, e.g.: +* `t2on(u,"file.txt",u,u).` or `t2on(u,u,u,u).` +Outputs file.txt. +* `t2on(2,"file.txt",u,u).` +Outputs file.txt twice. +* `t2on(u,u,"Hello world.",u).` +Breason out "Hello world.". +* `t2on(3,u,"a b c",2).` +Outputs the first two words in "a b c" ("a" and "b") 3 times. + +# Reading Algorithm + +* The algorithm often runs too quickly. 
To notice a number of words ("read them") in unread texts, where Master=6 algorithms, PhD=~16 algorithms and professor/politician=~50 algorithms, run with:
+```
+['text_to_object_name.pl'].
+W is 50*4,t2on2(u,u,u,u,W).
+% where W is the number of words to read,
+% 50 is the number of algorithms,
+% and there are approximately 4 words per algorithm.
+```
+
+# Authors
+
+Lucian Green - Initial programmer - Lucian Academy
+
+# License
+
+I licensed this project under the BSD3 License - see the LICENSE file for details.
diff --git a/Text-to-Object-Name/file.txt b/Text-to-Object-Name/file.txt
new file mode 100644
index 0000000000000000000000000000000000000000..c57eff55ebc0c54973903af5f72bac72762cf4f4
--- /dev/null
+++ b/Text-to-Object-Name/file.txt
@@ -0,0 +1 @@
+Hello World!
\ No newline at end of file
diff --git a/Text-to-Object-Name/t2on_reading.pl b/Text-to-Object-Name/t2on_reading.pl
new file mode 100644
index 0000000000000000000000000000000000000000..882c7dd7d91bade4385b6edbcb4d3e71f83733bf
--- /dev/null
+++ b/Text-to-Object-Name/t2on_reading.pl
@@ -0,0 +1,186 @@
+:- include('../listprologinterpreter/la_maths.pl').
+:- include('../Text-to-Breasonings/text_to_breasonings.pl').
+
+t2on(N1,Filex1,Stringx1,M1) :-
+ t2on2(N1,Filex1,Stringx1,M1,0).
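In the README calls above, such as `t2on(u,"file.txt",u,u).`, the atom `u` marks an argument that should take its default value, and `t2on/4` itself passes 0 for the words-to-read count. A hypothetical helper (not part of the repository) capturing that convention:

```prolog
% Hypothetical sketch of the u-as-default argument convention used
% throughout t2on: u selects the default, any other value is literal.
arg_or_default(u, Default, Default) :- !.
arg_or_default(Value, _Default, Value).

% ?- arg_or_default(u, 1, N).   % N = 1
% ?- arg_or_default(5, 1, N).   % N = 5
```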
+t2on2(N1,Filex1,Stringx1,M1,Words_to_read) :- + retractall(complete_display(_)), + assertz(complete_display(false)), + + retractall(words_to_read(_)), + assertz(words_to_read(Words_to_read)), + + ((number(N1),N=N1)->true; + (N1=u,N=1)), + + ((Filex1=u,Filex="../Text-to-Object-Name/file.txt")->true; + Filex=Filex1), + + ((number(M1),M=M1)->true; + M=all), %% If m1 is undefined or all then m=all + + prep(List1,T2ON_Dict03,Filex,Stringx1,M), + t2on_2(List1,T2ON_Dict03,T2ON_Dict2,[],A12,N), + + sort(T2ON_Dict2,T2ON_Dict3), + + (T2ON_Dict03=T2ON_Dict3->true; + (save_file_s("../Lucian-Academy-Data/Text-to-Object-Name/t2on_dict1.txt",T2ON_Dict3))), + +/* +open_file_s("../Lucian-Academy-Data/Text-to-Object-Name/t2on_end_text.txt",TET), + + length(List1,List1_length_a), + Dividend_a is ceiling(List1_length_a/250), + Dividend_b is Dividend_a*3, % for graciously giving + + findall(_,(member([_,TET_T],TET),t2on(Dividend_b,string,TET_T,u)),_), + */ + findall(B,(member([C,D],A12),(D=""->B=C;B=D)),F),sort(F,G),findall([L,J],(member(J,G),findall(J,member(J,F),K),length(K,L)),M11),sort(M11,M2), + writeln1(M2), + + !. + +%% Truncates the list if m is not undefined and m is greater than or equal to the length of string0 +truncate(List1,M,String0) :- + ((number(M),length(String0,M), + append(String0,_,List1))->true; + String0=List1),!. 
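`truncate/3` above keeps the first `M` elements when `M` is a number and otherwise (for example, when `M` is the atom `all`) returns the list unchanged, because the `number(M)` guard fails. Illustrative queries:

```prolog
% Illustrative queries for truncate/3.
?- truncate([a,b,c,d], 2, X).
% X = [a,b]
?- truncate([a,b,c,d], all, X).
% X = [a,b,c,d]   (number(all) fails, so the whole list is kept)
```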
+ +prep(List,T2ON_Dict03,Filex,Stringx1,M) :- + open_file_s("../Lucian-Academy-Data/Text-to-Object-Name/t2on_dict1.txt", T2ON_Dict01), + SepandPad="&#@~%`$?-+*^,()|.:;=_/[]<>{}\n\r\s\t\\\"!'0123456789", + + sort(T2ON_Dict01,T2ON_Dict03), + length(T2ON_Dict03,Length0),write("Number of words in dictionary: "), writeln(Length0), + + %sort(T2ON_Dict01t,T2ON_Dict03t), + + + ((Stringx1=u, + phrase_from_file_s(string(String00), Filex))->true; + String00=Stringx1), + + string_codes(String001,String00), + string_lower(String001,String002), + string_codes(String002,String003), + + split_string(String003,SepandPad,SepandPad,List1), + + truncate(List1,M,List), + + length(List,Length1),write("Number of words to find the object name for in file: "), writeln(Length1), + sort(List,List3), +%%writeln([list2,List2]), + %length(List2,Length2),write("Number of unique words in file: "), writeln(Length2), + +(complete_display(true)-> + ((Stringx1=u, %% Use file, not string as input. + + %%maplist(downcase_atom, List2, List3), + %maplist(string_lower, List2, List3), + +%%writeln([list3,List3]), + towords3(T2ON_Dict03,[],T2ON_Dict04,[],_ObjectNames,[],_AllUsedNames), + %towords2(T2ON_Dict03t,[],T2ON_Dict04t), + +%%writeln([t2on_Dict04,T2ON_Dict04]), + subtract(List3,T2ON_Dict04,D1), +%%writeln([list3,t2on_Dict04,d1,List3,T2ON_Dict04,D1]), +%%writeln(["subtract(T2ON_Dict04,List3,D1).",List3,T2ON_Dict04,D1]), + length(D1,Length01),Difference is abs(Length01),write("Number of words remaining to define: "), writeln(Difference) + + + %subtract(AllUsedNames,T2ON_Dict04t,D2), + %%delete(D21,'',D2), + %length(D2,Length01t),Differencet is abs(Length01t),write("Number of undefined object names: "), writeln(Differencet), + + %%delete(D31,'',D3), + %subtract(T2ON_Dict04t,AllUsedNames,D3), + + + + +)->true;(string(Filex),writeln("Number of words, unique words and words remaining to define skipped for speed when finding object names for a string.")));true) + +,!. + +t2on_2(_,A,A,B,B,0) :- !. 
+t2on_2(List1,T2ON_Dict03,T2ON_Dict2,A11,A12,N1) :- + t2on(List1,T2ON_Dict03,T2ON_Dict21,A11,A13), + N2 is N1-1, + t2on_2(List1,T2ON_Dict21,T2ON_Dict2,A13,A12,N2),!. + +towords2([],A,A) :- !. +towords2(T2ON_Dict03,A,B) :- + T2ON_Dict03=[[Word,_,_,_]|Rest], + %%atom_string(Atom,Word), + append(A,[Word],C), + towords2(Rest,C,B). + +towords2a([],A,A) :- !. +towords2a(T2ON_Dict03,A,B) :- + T2ON_Dict03=[[Word,_]|Rest], + %%atom_string(Atom,Word), + append(A,[Word],C), + towords2a(Rest,C,B). + +towords3([],A,A,C,C,D,D) :- !. +towords3(T2ON_Dict03,A,B,D,E,G,H) :- + T2ON_Dict03=[[Word1,Word2]|Rest], + (Word2=""->append(G,[Word1],I)->true; + append(G,[Word2],I)), + append(A,[Word1],C), + append(D,[Word2],F), + towords3(Rest,C,B,F,E,I,H). + + +t2on([],B,B,C,C) :- + !. +t2on([Word3|Words],T2ON_Dict,T2ON_Dict2,A11,A12) :- + %downcase_atom(Word, Word2), atom_string(Word2,Word3), + + words_to_read(WR1), + (WR1>0->(writeln(WR1),write(Word3), + t2on2(3),nl,sleep(0.12), + WR2 is WR1-1, + retractall(words_to_read(_)), + assertz(words_to_read(WR2))); + true), + + %/**member([Word3,X,Y,Z],T2ON_Dict4) -> %% This feature is a bug because words in t2on_dict2 shouldn't necessarily be the words in t2on_dict1 + %append(A11,[[Word3,""]],T2ON_Dict3), T2ON_Dict3t=T2ON_Dict4, + %%t2on_(Words,T2ON_Dict3,T2ON_Dict2,T2ON_Dict3t,T2ON_Dict5)) + %%; + %%(**/ + + %%(member([Word3,X,Y,Z],T2ON_Dict4) -> %% This feature is a bug because words in t2on_dict1 should correspond to those in t2on_dict2 + %%(atom_concat("The for ", Word3, P1), + %%atom_concat(P1, " is defined. 
Enter object name (without spaces), if different for ", Prompt)); + %Prompt="Enter object name (without spaces), if different for "), + + %%writeln([word3,Word3]), + %trace, + (member([Word3,String4],T2ON_Dict)-> + T2ON_Dict3=T2ON_Dict; + ((repeat, + write("Enter object name (without spaces), if different for "), writeln(Word3),read_string(user_input, "\n", "\r", _End2, String2),split_string(String2, "", " ", String3),String3=[String4]), + append(T2ON_Dict,[[Word3,String4]],T2ON_Dict3) + %t2on(1,u,String4,1) + )), + + append(A11,[[Word3,String4]],A13), + + %%*t2on_th(Word3,_T2ON_th), + %concat_list(["[",Word3,",",String4,"], "],Notification1), + %write(Notification1), +%(String4=""->String5=Word3;String5=String4), + + %downcase_atom(String5, String52), atom_string(String52,String53), + + +t2on(Words,T2ON_Dict3,T2ON_Dict2,A13,A12). + %%). +%t2on_th(_,sweetinvincibleandprayedfor). + +%% finds unknown words, asks for their t2on_ in form "n of m: word", verify, (can go back x) append and sort, save \ No newline at end of file diff --git a/Text-to-Object-Name/t2on_repeating_text.pl b/Text-to-Object-Name/t2on_repeating_text.pl new file mode 100644 index 0000000000000000000000000000000000000000..81939b280928e5e7996117503fb94082e6344819 --- /dev/null +++ b/Text-to-Object-Name/t2on_repeating_text.pl @@ -0,0 +1,10 @@ +t2on_repeating_text(N1):- + phrase_from_file_s(string(Dict), "../Lucian-Academy-Data/Text-to-Object-Name/t2on_repeating_text.txt"), + string_codes(Dict_string,Dict), + term_to_atom(Dict_term,Dict_string), + Dict_term=[N,String], + %SepandPad="&#@~%`$?-+*^,()|.:;=_/[]<>{}\n\r\s\t\\\"!'0123456789", + %split_string(String,SepandPad,SepandPad,List), + numbers(N1,1,[],N1_list), + findall(_,(member(_N11,N1_list), + t2on(N,string,String,u)),_),!. 
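`t2on_repeating_text/1` above reads a single term of the form `[N,String]` from t2on_repeating_text.txt via `term_to_atom/2` and runs `t2on/4` on the string `N1` times. A hypothetical example of the file's expected contents:

```prolog
% Hypothetical contents of
% ../Lucian-Academy-Data/Text-to-Object-Name/t2on_repeating_text.txt
% (one term, with no trailing full stop, readable by term_to_atom/2):
[3,"The text to repeat."]
```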
diff --git a/Text-to-Object-Name/text_to_object_name.pl b/Text-to-Object-Name/text_to_object_name.pl new file mode 100644 index 0000000000000000000000000000000000000000..5cd1f5157b40eba879f58125c6881ccfe6436af5 --- /dev/null +++ b/Text-to-Object-Name/text_to_object_name.pl @@ -0,0 +1,10 @@ +:-include('../Text-to-Object-Name/t2on_reading.pl'). + +:- include('../listprologinterpreter/listprolog'). +:- include('../listprologinterpreter/la_files.pl'). +:- include('../Text-to-Object-Name/t2on_repeating_text.pl'). +%%:- include('texttobrqb'). +%:- include('../listprologinterpreter/la_strings'). +%:- include('mergetexttobrdict'). +%:- include('edit.pl'). +%:- include('meditatorsanddoctors'). diff --git a/Time_Machine-main/.DS_Store b/Time_Machine-main/.DS_Store new file mode 100644 index 0000000000000000000000000000000000000000..5008ddfcf53c02e82d7eee2e57c38e5672ef89f6 Binary files /dev/null and b/Time_Machine-main/.DS_Store differ diff --git a/Time_Machine-main/Caveats.md b/Time_Machine-main/Caveats.md new file mode 100644 index 0000000000000000000000000000000000000000..32a355ac8eea855f56aa96458c6d2c772fb7172e --- /dev/null +++ b/Time_Machine-main/Caveats.md @@ -0,0 +1,23 @@ +# Immortality Caveats + +Further explanation of the terms (recommended): +* Lecturer skill - Instead of finding out philosophies from the representations of a pedagogue such as Nietzsche to write down, I studied Honours in philosophy to earn the skill of writing my own ideas down, hence the adage "Forget Nietzsche". To earn the marks for Honours, I worked out the correct essay format, the ways of thinking to earn high distinctions and to connect quotes from at least two sources in one of the paragraphs. 
I also recommend using a grammar and form checker, breasoning out 5 million breasonings per assignment (representing the top value of 5*50 As for a PhD assignment and copywriting job) using Text-to-Breasonings and Grammar-Logic, and writing 80-250 philosophy breasonings and 80-250 algorithms per high distinction. Breasonings are sentences which convey a philosophical idea, where the object and other words in the sentence are breasoned out for high distinctions, given ways of thinking such as details (two uses for an object), etc. I first wrote dreams and ideas, like those in a book of theories, as breasonings. I became interested in computational hermeneutics, or the interpretation of texts, to start writing algorithms, and counted one clause of Prolog as one algorithm. Writing algorithms checks not only the objects in writing, but the logic.
+* Recordings - The ability to implicitly, or quickly, convey breasonings, and sometimes to send objects or oneself as breasonings (a "bot") through space and time. A much-desired skill; one can indicate "As" (high distinctions) with a thought command after working on the original A.
+
+How to get these skills:
+* For the lecturer skill, I was chosen as a tutor at University - I best understood a subject on Prolog compilers, for which I earned 100% in two assignments; I earned a scholarship to Logic Summer School at ANU, met my logic professor at an IEEE meeting and was selected as a tutor in the subject.
+* For the recordings skill - Write a number-one song with Music Composer and possibly join an acting agent to earn extras and acting roles, increasing one's "recordings" skills. I met a cosmologist, wrote a great first song which was aired on the radio, studied a theatre studies short course and joined an acting agent.
+* Mind reading is recommended as the way to write songs with Music Composer for the best results.
+* Lucian (the founder of Time Machine) wrote a number-one song on a radio show, which is optional.
+* As well as the meditation, etc. subjects, I activated my writing ability through Creative Writing, and my pedagogy skill, which formed the basis of the meditation-based subjects, through Education. + +Notes: +* Immortality medicine - 100% of Lucianic Meditation doctors can help with immortality as a form of medicine. +* After joining the simulation from the year 5689, your clothes and appearance will automatically match your time travel destination. +* The other advantages of this simulation are single-thought activated time travel, protection from accidents in possibilities of the multiverse, avoiding some weather, better health in immortality and possible replacement of faulty body parts using "quantum" operations (imperceptible operations initiated by As), the possible detectability of electrons when one is not looking in the simulation and a possible different physics of the simulation which is more stable. There are reports of only seeing people from other times and those close to oneself after activating the simulation, seeing those with 50 As, and not seeing time travellers visiting the home time. +* It is better not to take risks in the simulation, and remember to avoid danger. + +# Important + +* One should breason out 4\*50 As (16,000\*125=2000000 breasonings) including some handwritten content and mostly Grammar Logic content, breasoned out with Text to Breasonings to not age in the home dimension every ten years. +* One should do this and have 250 breasonings per bodily organ system in the Daily Regimen per meditator per dimension time travelled to each day. diff --git a/Time_Machine-main/Immortality.md b/Time_Machine-main/Immortality.md new file mode 100644 index 0000000000000000000000000000000000000000..27862f7ca23c40dc2f319ad22fef6fa805540cee --- /dev/null +++ b/Time_Machine-main/Immortality.md @@ -0,0 +1,29 @@ +# Immortality + +* Write an A to time travel, and an A to become immortal and breason it out. 
+
+* Before using `texttobr2_1(1).` to time travel to October 1 5689, read:
+
+  * Before running texttobr, think of two radio buttons put on recordings, put through with prayer, nut and bolt, quantum box prayer 1, 1, 0.5 cm and 1, 1, 0.5 cm.
+
+  * Follow instructions in Instructions for Using texttobr(2) when using texttobr, texttobr2 or mind reader to avoid medical problems.
+
+  * The Prevent Headaches App may be necessary.
+
+* Read Immortality entry 1 for instructions on how to enter the simulation to be immortal. ("Work.. out with 250 breasonings that buying the sim card to my time represented buying the simulation chip - `texttobr2_1(1).`. Turn... it on - `texttobr2_1(1).`.") You can indicate to become immortal.
+
+* There are text-to-breasonings-supported window periods in the future. For example, it is recommended to repeatedly visit the time October 1-16 5689, time travelling back to the start of this time every 16 days; otherwise, a non-text-to-breasonings-supported time may cause medical problems.
+
+* You need to visit a future time that supports Immortality each day to activate Immortality.
+
+* Switch for Immortality medicine to be immortal, replace one's body, stop dementia, have clear sight, think clearly and relax muscles, each month in Immortality. If necessary, use Grammar-Logic to generate 4000 word breasonings for a department, for example Education, generate 4000 word breasonings for B (disagreement with) Education and 4000 word breasonings for B to B to (solve disagreements with) Education, then hand-breason them out.
+
+* In total, there should be 16,000 sentences breasoned out for immortality.
+
+* Switch to turn off medical problems once from each department you have been developed about to activate Immortality.
Use Grammar-Logic to generate 4000 word breasonings from the list of body systems, generate 4000 word breasonings for B (problems with) body systems and 4000 word breasonings for B to B to (solve problems with) body systems, then use Text-to-Breasonings to breason them out. + +* The custodians, the robots, require residents to think of an English (autobiographical) statement to be thought of per day (your aim that day). + +* As an aside, watch videos from University departments every week (with 50 As each, breasoned out with Text to Breasonings) for professional development. + +* More entries on the Simulation and Immortality from August 2021, September 2021 and October 2021. Further months at Lucian Green's Philosophy. diff --git a/Time_Machine-main/Instructions to freeze age - T2B4 (Faster).txt b/Time_Machine-main/Instructions to freeze age - T2B4 (Faster).txt new file mode 100644 index 0000000000000000000000000000000000000000..381f15790b9a795d2a6e247d9bd284f6728c4406 --- /dev/null +++ b/Time_Machine-main/Instructions to freeze age - T2B4 (Faster).txt @@ -0,0 +1,59 @@ +# Instructions to freeze age - T2B4 (Faster) + +# Requires luciangreen/Philosophy repository +# https://github.com/luciangreen/Philosophy/ + +# The robots are simple minded. The Text to Breasonings algorithm breasons out sentences and algorithms, which pass the professional requirement of 4*50 sentence As (A=80 sentences) for two destinations per day of meditation, time teleportation and medicine for anti-ageing. + +# One-off +cd Dropbox/GitHub/Text-to-Breasonings +swipl --goal=main --stand_alone=true -o t2b4 -c text_to_breasonings4.pl + +# Each week +cd Dropbox/GitHub/Philosophy + +swipl --stack-limit=80G +['cat_arg_files.pl']. +cat_arg_files(6000). +halt. + +swipl --stack-limit=80G +['cat_alg_files.pl']. +cat_alg_files(6000). +halt. 
+ +cd ../Text-to-Breasonings + +cp ../Lucian-Academy/Books1/algs/lgalgs_a.txt file.txt +./t2b4 +rm a.pl +mv a a1 +rm b.pl +mv b b1 +rm c.pl +mv c c1 + +cp ../Lucian-Academy/Books1/args/lgtext_a.txt file.txt +./t2b4 +rm a.pl +mv a a2 +rm b.pl +mv b b2 +rm c.pl +mv c c2 + +# Every day, in each dimension +cd Dropbox/GitHub/Text-to-Breasonings + +swipl +['text_to_breasonings.pl']. +% for pedagogy, meditation and medicine about breasonings +texttobr2_1(1). +texttobr2(4,u,"square",u). +halt. + +# Either +./bc12.sh + +# Or, for 15 people +./d.sh diff --git a/Time_Machine-main/Instructions to freeze age.txt b/Time_Machine-main/Instructions to freeze age.txt new file mode 100644 index 0000000000000000000000000000000000000000..ef379dfb406acea24eb5896f3c37b0e9a346d62f --- /dev/null +++ b/Time_Machine-main/Instructions to freeze age.txt @@ -0,0 +1,34 @@ +# Instructions to freeze age + +# Requires luciangreen/Philosophy repository +# https://github.com/luciangreen/Philosophy/ + +# The robots are simple minded. The BAG algorithm produces sentences and mock algorithms, which pass the professional requirement of 4*50 sentence As (A=80 sentences) for two destinations per day of 2 meditations, 2 time teleportations and 2 medicines for anti-ageing. + +# Optionally run on a Virtual Private Server (VPS) (See https://github.com/luciangreen/Text-to-Breasonings/blob/master/Setting_up_a_VPS_with_TextToBr.txt). + +# So, 64,000 argument and algorithm breasonings per destination (of meditation, time teleportation and medicine for anti-ageing, must have home and October 5689 and wait for the breasonings to finish to take effect in each destination). I prefer to run the 64,000 breasonings for a third time after returning to 5689, as this is a new leg of the journey. + +cd Dropbox/GitHub/Philosophy + +swipl --stack-limit=80G +['cat_arg_files.pl']. +cat_arg_files(6000). +halt. + +swipl --stack-limit=80G +['cat_alg_files.pl']. +cat_alg_files(6000). +halt. + +swipl +['day2.pl']. +main. 
+ +# STOP + +# Print running totals + +swipl +['print_totals.pl']. +print_totals. diff --git a/Time_Machine-main/LICENSE b/Time_Machine-main/LICENSE new file mode 100644 index 0000000000000000000000000000000000000000..920183ffbdaca89fc6faee40b6de08a6c8a061c7 --- /dev/null +++ b/Time_Machine-main/LICENSE @@ -0,0 +1,29 @@ +BSD 3-Clause License + +Copyright (c) 2021, Lucian Green +All rights reserved. + +Redistribution and use in source and binary forms, with or without +modification, are permitted provided that the following conditions are met: + +1. Redistributions of source code must retain the above copyright notice, this + list of conditions and the following disclaimer. + +2. Redistributions in binary form must reproduce the above copyright notice, + this list of conditions and the following disclaimer in the documentation + and/or other materials provided with the distribution. + +3. Neither the name of the copyright holder nor the names of its + contributors may be used to endorse or promote products derived from + this software without specific prior written permission. + +THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" +AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE +IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE +DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE +FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL +DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR +SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER +CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, +OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE +OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. 
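The per-file steps in the "Instructions to freeze age - T2B4 (Faster)" script above repeat the same copy, run and cleanup sequence (`cp <src> file.txt; ./t2b4; rm a.pl; mv a a1; ...`) for each source text. A sketch of that sequence factored into one helper — `run_t2b4` is a hypothetical name, and since the real `./t2b4` binary is not available here, the breasoning command is passed in as a callable so the demonstration can use a stub:

```python
# Illustrative sketch of the repeated per-file steps in the faster
# freeze-age script above, factored into one helper.
import os
import shutil
import tempfile

def run_t2b4(source, tag, t2b4):
    shutil.copy(source, "file.txt")    # cp <src> file.txt
    t2b4()                             # stands in for running ./t2b4
    for f in ("a", "b", "c"):
        if os.path.exists(f + ".pl"):
            os.remove(f + ".pl")       # rm a.pl etc.
        os.rename(f, f + tag)          # mv a a1 etc.

# Demonstration with a stub t2b4 in a temporary directory.
os.chdir(tempfile.mkdtemp())
with open("src.txt", "w") as fh:
    fh.write("text\n")

def stub_t2b4():
    # The real ./t2b4 reads file.txt and writes a, b, c, a.pl, b.pl, c.pl.
    for name in ("a", "b", "c", "a.pl", "b.pl", "c.pl"):
        open(name, "w").close()

run_t2b4("src.txt", "1", stub_t2b4)
print(sorted(os.listdir()))  # includes 'a1', 'b1', 'c1'; no '.pl' files
```

In the script's own terms, the two existing runs correspond to `run_t2b4("../Lucian-Academy/Books1/algs/lgalgs_a.txt", "1", ...)` and `run_t2b4("../Lucian-Academy/Books1/args/lgtext_a.txt", "2", ...)`.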
diff --git a/Time_Machine-main/Prevent-Headaches.md b/Time_Machine-main/Prevent-Headaches.md new file mode 100644 index 0000000000000000000000000000000000000000..81057c3e1842ec3f9600a89a1b7ab42c7f6e3134 --- /dev/null +++ b/Time_Machine-main/Prevent-Headaches.md @@ -0,0 +1,33 @@ +# Prevent Headaches
+
+* Introduction
+* What is this?
+Use the Prevent Headaches App to help prevent stress (non-pathological) headaches. It combines meditation with a non-invasive spiritual medical technique and a daily regimen. Because stress headaches can start in the body, it is recommended to use our muscle aches and pains preventative technique. This app can help prevent headaches in post-Honours University students, on trains, in cars, and in the sun. Please stay out of the sun in high-UV hours and do yoga before long train journeys.
+* Meditation
+You can naturally prevent a headache by silently repeating the mantra "arem" for twenty minutes, twice per day. The mantra becomes lighter and lighter, until one forgets the thought, clearing the mind. Also, silently repeat the sutra "green" for twenty minutes, twice per day, after two months of using the mantra in the morning and evening. Also, pray for no digestive system pops from practising the sutra each day.
+* Quantum Box and Nut and Bolt
+* Quantum Box
+The quantum box is a guided meditation, described later, that prevents headaches (relaxing head and neck muscles) and prevents muscle aches and pains. In it, you can make anything (e.g. a headache) instantly disappear, based on quantum physics (where, at the quantum level, things may disappear if we want them to).
+* Nut and Bolt
+The nut and bolt is a guided meditation that is a more detailed prevention of the problems the Quantum Box prevents. In it, pretend to unscrew a nut from a long thin bolt each day to prevent a headache, etc.
+* Daily Regimen +Indicate 2 radio buttons (one for each of the following ideas, and one for each of these ideas working anyway, where a radio button is indicated by breasoning out, or thinking of the x, y and z dimensions of a counter, 0.01, 0.01, 0.005 metres). +Too (just say “too” for each idea following to indicate the two radio buttons) +meditation (108 arem mantras and 108 green sutras) +Also for the following, which also have quantum box and nut and bolt: +No digestive system pops from practising the sutra + +Also for the following, which also have quantum box and nut and bolt: +Protector from Headache in Meditation after Honours Study, etc. +Also for the following, which also have quantum box and nut and bolt: +No headache + +The following turn off headaches etc. +Also for the following, which also have quantum box and nut and bolt: +Turn off workload from all employees including you below you, +Also for the following, which also have quantum box and nut and bolt: +Detect and turn off workloads using manager algorithm. (The self does the As, turning off the workloads.) +Also for the following, which also have quantum box and nut and bolt: +Detect and turn off workloads using manager network. (The others help finish the As, turning off the workloads.) +* Prevention Technique +If you feel a headache coming on, then think, "I don't want nothing to happen, followed by a headache. This idea works at the time. Each of the other parts have been taken care of. Green, Doctor." \ No newline at end of file diff --git a/Time_Machine-main/Protection.md b/Time_Machine-main/Protection.md new file mode 100644 index 0000000000000000000000000000000000000000..8c8209f43e28efdbe46b788d7be8ceb4a63e667b --- /dev/null +++ b/Time_Machine-main/Protection.md @@ -0,0 +1,26 @@ +# Additional Protection from Dying + +* Immortal people can be protected by giving them 4*50 As of high-quality thoughts at the start of a period (or once per day). 
+* The 4*50 As are Breasoning Algorithm Generator (BAG) (sentence) As completed with freezing age As (Note 1) or freezing age As (faster) (Note 2). +* Note 2: Run ./d.sh in Freezing Age As (Faster) (above) daily to earn sentience in the simulation (have one's living quarters and appearance) by preparing for 15 simulants. +* Once you have passed preparing for 15 simulants, spend 4*50 high distinctions on acquiring a computer in the future. Thought command this computer for high distinctions, rather than using your computer. The future computer can only affect the future, not other times. +* Notes 1 and 2: Run ./day2 in Note 1 to freeze your age, followed by the following to freeze others' ages: +First, edit meditators' names, e.g. (john, not John) in `meditatorsanddoctors.pl`. +``` +cd Text-to-Breasonings +swipl +['text_to_breasonings.pl']. +['meditatorsanddoctors.pl','meditationnoreplace2.pl']. +meditation. +halt. +``` +It is better for you to time travel but they don't have to. +* See big_medit1.md for instructions on running a Prolog program to automatically time travel and increase the longevity of your meditators each day. +* Run +``` +cd Daily-Regimen +./d-prep.sh # if necessary +./d.sh +``` +four times to switch off home and future ageing and switch them off being switched off, done twice, once in the present and once in 5689. +* Do the above four times again (to install time crystals to indicate arguments are automatically breasoned out after daily meditation, to use these time crystals, done twice, once in the present and once in 5689), multiplied for each argument (text to breasonings for enough breasonings per breasoning session, meditation, medicine for anti-ageing, time travel, mind reading, having a spiritual future apartment and a spiritual future computer). 
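As a check of the totals used throughout these notes (the figures 4*50 As, A = 80 sentences, and 125 breasonings per sentence are taken from the README and the freeze-age instructions in this repository; this sketch only verifies the arithmetic):

```python
# Arithmetic behind the "4*50 As" figure quoted above.
as_count = 4 * 50              # 200 As
sentences = as_count * 80      # one A is 80 sentences -> 16,000
breasonings = sentences * 125  # 125 breasonings per sentence -> 2,000,000
print(sentences, breasonings)  # 16000 2000000
```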
diff --git a/Time_Machine-main/README.md b/Time_Machine-main/README.md new file mode 100644 index 0000000000000000000000000000000000000000..d11b68871d5fd5942afcdd3fe33fcd5f51d637b4 --- /dev/null +++ b/Time_Machine-main/README.md @@ -0,0 +1,81 @@ +![lines-1171160](https://user-images.githubusercontent.com/15845542/124043330-28198f00-da4e-11eb-8703-37e03cda6b73.jpg)
+Image of blue vertical lines disappearing into a vanishing point (Photo by Maxime Perron Caissy from FreeImages)
+
+# Time_Machine
+
+* Summary: To become immortal, you can dot on the co-op’s planned 16k (sentence) breasonings for meditation and 16k breasonings for anti-ageing medicine for the day in your home time each day, without time travelling, with two radio buttons (1 by 1 by 0.5 cm for a radio button). Don’t forget to lock off ageing (to other people) and to lock off this lock from being tampered with, with a further 16k breasonings for each lock. The locks are done once. Tell others about this repository, or dot the breasonings on for them each day with their consent, to help them become immortal.
+
+* You will need 1000 pedagogy breasonings for text to breasonings for enough breasonings per breasoning session, 80 meditation breasonings, 80 medicine breasonings for anti-ageing, 80 time travel breasonings, 80 mind reading breasonings, 80 time crystal breasonings, 80 breasonings for having a spiritual future apartment and 80 breasonings for having a spiritual future computer, indicated in the present and the correct set of dates in 5689 each day after joining the simulation for immortality without a laptop with power supply. Otherwise, you can run the algorithm in this repository.
+* Welcome to Lucian's Time Machine, which offers a positive way of life, extending life for those who would like it and bringing advanced technology to us so that all may enjoy immortality through meditation.
+
+* The way of life in the simulation is relaxing; there is no pressure to work, and one can live on a pension, creating art, studying and having fun.
+* Imagine the possibilities: articulating to the top of academia and industry, enjoying the perks of relaxation from activity and preventing the onset of age-related body breakdown.
+
+* Warning: Changing the algorithm not to include meditation may be fatal. You should meditate with 108 arems and 108 greens before and after travel.
+* Travel through time with meditation.
+* YouTube video
+* "I can convert myself to a bot, convert shadow bots to bots (allowing people and finance) and time travel as a bot (all with 250 breasonings). There is a lag while people are ready for you to time travel."
+
+# Prerequisites
+
+* Please download and install SWI-Prolog for your machine at `https://www.swi-prolog.org/build/`.
+
+# 1. Install manually
+
+Download this repository, Lucian-Academy, List Prolog Interpreter, the Text-to-Breasonings repository and the repositories these repositories depend on.
+
+# 2. Or Install from List Prolog Package Manager (LPPM)
+
+* Download the LPPM Repository:
+
+```
+mkdir GitHub
+cd GitHub/
+git clone https://github.com/luciangreen/List-Prolog-Package-Manager.git
+cd List-Prolog-Package-Manager
+swipl
+['lppm'].
+lppm_install("luciangreen","Time_Machine").
+halt.
+```
+
+# Caution:
+
+* Before running texttobr, think of two radio buttons put on recordings, put through with prayer, nut and bolt, quantum box prayer 1, 1, 0.5 cm and 1, 1, 0.5 cm.
+
+* Follow instructions in Instructions for Using texttobr(2) when using texttobr, texttobr2 or mind reader to avoid medical problems.
+
+* The Prevent Headaches App may be necessary.
+
+* Instructions on becoming immortal
+
+* The Immortality Caveats describe ways to enter immortality and more about the simulation.
+ +* Running Time Machine requires the Text to Breasonings algorithm, which requires: +- Meditation before running the algorithm and after arriving at a time travel destination. +- Understanding of the Pedagogy (breasoning) skill, activated by a University short course in Education. +- Recordings skill, which is activated by a person who has this skill after breasoning out a Recordings High Distinction. +- Other requirements, including the Lecturer skill +- The window (ability) to time travel may disappear soon after learning the time travel skill, so it is advisable to travel to the time in Instructions on becoming immortal to gain a permanent time travel skill. +- The courses above required for time travel are included in the algorithm below (so they may not be required apart from the High Distinction texts in the algorithm). + +- Instructions to freeze one's age in one's home time - Video + +- Additional Protection from Dying + +# Running + +* In Shell: +`cd Time_Machine` +`swipl` or possibly `swipl --stack-limit=4G` if more memory is needed +`['time_machine.pl'].` +`time_machine_prepare.` to breason out necessary courses for time travel including: +- Delegate workloads, Lecturer, Recordings +- Fundamentals of Meditation and Meditation Indicators +- Fundamentals of Pedagogy and Pedagogy Indicators +- Lecturer +- Medicine +- Mind Reading +- Time Travel + +* To time travel: `texttobr2_1(3).` with radiation switched off gives "appearance time travel". This means people from the times will walk past you wearing clothes and surrounded by the setting of your current time. Your computer will work and you can go home and recharge it to time travel to your home again. A bot representing you will appear at home while you are time travelling and bots representing those from your home will appear to you while you are time travelling. +* Using `texttobr2_1(3).` not `texttobr2_1(1).` indicates A, B (catching blocks to travel) and B to B (objecting to blocks to travel). 
diff --git a/Time_Machine-main/big_medit1.md b/Time_Machine-main/big_medit1.md new file mode 100644 index 0000000000000000000000000000000000000000..ff347fb0226b19c2d94ebe502f7a9c337c8b62c6 --- /dev/null +++ b/Time_Machine-main/big_medit1.md @@ -0,0 +1,43 @@ +# big_medit1.sh + +* Automatically help meditators to time travel each day and increase their longevity. + +* Add meditators (e.g. john, not John) to `Text-to-Breasonings/meditatorsanddoctors.pl`. + +* When you add a meditator, please invite them to the simulation when they end up in 5689 at the end of the first day with `texttobr2_1(*number of new meditators)`, e.g. `texttobr2_1(1).`. + +* Please record that you have invited them to the simulation in `Text-to-Breasonings/meditatorsanddoctors.pl`. + +* If using `big_medit1.pl` below, replace the secret key in `chatgpt_qa.pl` according to ChatGPT. You may comment/uncomment lines of the script at `Daily-Regimen/big_medit.sh`. + +# Weekly Instructions + +* Update Philosophy, Lucian-Academy (at the minimum) each week: + +``` +cd GitHub +rm -rf Philosophy Lucian-Academy +git clone https://github.com/luciangreen/Philosophy.git +git clone https://github.com/luciangreen/Lucian-Academy.git + +cd Daily-Regimen +./d-prep.sh +``` + +# Each day: + +* Personal meditation + +* Run the daily script with: + +``` +cd Daily-Regimen +./big_medit1.sh +``` + +* Or, run the (much) faster daily script: + +``` +cd Daily-Regimen +./big_medit2.sh +``` diff --git a/Time_Machine-main/time_machine.pl b/Time_Machine-main/time_machine.pl new file mode 100644 index 0000000000000000000000000000000000000000..466eb6ee3bed98e2327b2fc9ca353f7404918496 --- /dev/null +++ b/Time_Machine-main/time_machine.pl @@ -0,0 +1,31 @@ +% needs t2b repository + +:-include('../listprologinterpreter/listprolog.pl'). +:-include('../Text-to-Breasonings/text_to_breasonings.pl'). 
+ +time_machine_prepare :- + +K=["../Lucian-Academy/Books/Time Travel/", +"../Lucian-Academy/Books/Fundamentals of Pedagogy and Pedagogy Indicators/", +"../Lucian-Academy/Books/Medicine/", +"../Lucian-Academy/Books/Fundamentals of Meditation and Meditation Indicators/", +"../Lucian-Academy/Books/Lecturer/", +"../Lucian-Academy/Books/Delegate workloads, Lecturer, Recordings/", +"../Lucian-Academy/Books/Mind Reading/" +], + + +findall(J,(member(K1,K), directory_files(K1,F), + delete_invisibles_etc(F,G), + +findall([File_term,"\n"],(member(H,G),string_concat(K1,H,H1),open_file_s(H1,File_term)),J)),J3), + +flatten(J3,J1), +foldr(string_concat,J1,"",J2), + + +N=1,M=u,texttobr2(N,u,J2,M,false,false,false,false,false,false), + + +N=1,M=u,texttobr(N,u,J2,M). + diff --git a/culturaltranslationtool/.DS_Store b/culturaltranslationtool/.DS_Store new file mode 100644 index 0000000000000000000000000000000000000000..fbf611cc9151f99ee9abbbb152fc239891bfec32 Binary files /dev/null and b/culturaltranslationtool/.DS_Store differ diff --git a/culturaltranslationtool/LICENSE b/culturaltranslationtool/LICENSE new file mode 100644 index 0000000000000000000000000000000000000000..3a27c90f1c2986ac0fb73b4ae956c6205343dfdd --- /dev/null +++ b/culturaltranslationtool/LICENSE @@ -0,0 +1,29 @@ +BSD 3-Clause License + +Copyright (c) 2020, Lucian Green +All rights reserved. + +Redistribution and use in source and binary forms, with or without +modification, are permitted provided that the following conditions are met: + +1. Redistributions of source code must retain the above copyright notice, this + list of conditions and the following disclaimer. + +2. Redistributions in binary form must reproduce the above copyright notice, + this list of conditions and the following disclaimer in the documentation + and/or other materials provided with the distribution. + +3. 
Neither the name of the copyright holder nor the names of its + contributors may be used to endorse or promote products derived from + this software without specific prior written permission. + +THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" +AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE +IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE +DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE +FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL +DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR +SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER +CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, +OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE +OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. diff --git a/culturaltranslationtool/README.md b/culturaltranslationtool/README.md new file mode 100644 index 0000000000000000000000000000000000000000..6bd4513027335b7688f703367ba5fabb1b0c883e --- /dev/null +++ b/culturaltranslationtool/README.md @@ -0,0 +1,38 @@ +# Cultural Translation Tool (CTT) + +* CTT allows translating folders and files of text and Prolog files from one language to another. It is used only with files without filenames as strings in Prolog, so if you are translating an algorithm, use it with individual files that contain the text from your algorithm. +* Requires ChatGPT API key (from the ChatGPT website) in `Daily-Regimen/chatgpt_qa_key.txt`. + +# Prerequisites + +* Please download and install SWI-Prolog for your machine at `https://www.swi-prolog.org/build/`. + +# 1. Install manually + +Download this repository and repositories listed in `List-Prolog-Package-Manager/lppm_registry.txt` under `"Cultural Translation Tool"`. + +# 2. 
Or Install from List Prolog Package Manager (LPPM)
+
+* Download the LPPM Repository:
+
+```
+mkdir GitHub
+cd GitHub/
+git clone https://github.com/luciangreen/List-Prolog-Package-Manager.git
+cd List-Prolog-Package-Manager
+swipl
+['lppm'].
+lppm_install("luciangreen","culturaltranslationtool").
+halt.
+```
+
+# Running
+
+* In Shell:
+`cd culturaltranslationtool`
+`swipl`
+`['ctt3.pl'].`
+`ctt3("source/", "destination/", "English", "French").`
+
+* where `"source/"` is the source directory or file, `"destination/"` is the destination directory or file, `"English"` is the original language (optional) and `"French"` is the destination language.
+ diff --git a/culturaltranslationtool/ctt2.pl b/culturaltranslationtool/ctt2.pl new file mode 100644 index 0000000000000000000000000000000000000000..3175d1ef2a82e0ac4cd780e319743062fce82495 --- /dev/null +++ b/culturaltranslationtool/ctt2.pl @@ -0,0 +1 @@ +:-include('vintage_ctt2/ctt2.pl'). \ No newline at end of file diff --git a/culturaltranslationtool/ctt3.pl b/culturaltranslationtool/ctt3.pl new file mode 100644 index 0000000000000000000000000000000000000000..8d1789537568705169cfdea9c5043d65939797e4 --- /dev/null +++ b/culturaltranslationtool/ctt3.pl @@ -0,0 +1,155 @@ +% 19 5 24.pl x ctt3.pl
+
+% ["Short Arguments","Appearances.txt",0,algorithms,"3. I prepared to help the appearances, increase to academia, recommend meditation and improved logical thinking. I did this by stating that the student constructed a logic of the appearance. First, I observed the plebeian write the number of As on different topics. Second, I observed her either have children or something else. Third, I read research about graduating from the Master degree."]
+
+%tag("<&t>","","<&t,'","'>","").
+ +/* +p2lp for pl +find strings in files +insert tags (auto around strings) x +approve tags x + +split strings by \n, \t, (tags x) +delete white space strings +save white space formatting at start and end of strings +approve grammar x + +trans +- delete missing strings from orig in data file +- only unknown strings +approve orig, trans, btt file by changing orig, orig2, trans +save orig2, trans data +sub back into copy of orig, (with tags x) +render orig wo tags x + +* Use only with files without e.g. filenames as strings in Prolog, so use it with individual files that are data Prolog or text files. + +* gitl in ctt/lppm etc. +*/ + +:-include('../listprologinterpreter/listprolog.pl'). +:-include('../gitl/find_files.pl'). +%:-include('../Prolog-to-List-Prolog/p2lpconverter.pl'). +:-include('../List-Prolog-to-Prolog-Converter/lp2pconverter.pl'). +:-include('../Philosophy/sub_term_with_address.pl'). +:-include('../Daily-Regimen/chatgpt_qa.pl'). + +ctt3(Source,Dest,_Orig_lang,Dest_lang) :- +(exists_file_s(Source)-> +(open_string_file_s(Source,File1), +Files=[[Source,File1]], +Dest_type=file +); +(find_files(Source,Files), + +(not(exists_directory_s(Dest))->(writeln([Dest,"is not a directory."]),abort);true), + +Dest_type=folder +)), +findall(X,(member([X1,X2],Files), +(string_concat(_,".pl",X1)-> +(%trace, +p2lpconverter([string,X2],LP), +sub_term_types_wa([string],LP,Instances1), +findall([Address,Z9],(member([Address,Y1],Instances1),%trace, +find_formatting(Y1,Z9) +),Z7), +X=[stwa,X1,LP,Z7]); +(%trace, +find_formatting(X2,Z9), +X=[string,X1,Z9]))),X3), + +findall(A9,(member(A10,X3), +(A10=[stwa,X11,LP,X12]-> +(findall([Address,A171],(member(A12,X12), +A12=[Address,A13], +findall(A14,(member(A15,A13), +(is_white_space(A15)->A14=A15; +(foldr(string_concat,["What is ",A15," in the language ",Dest_lang,"?"],S), +(catch(q(S,A16),_,false)->A16=A14; +(writeln("Translation failed."),abort))))),A17), +foldr(string_concat,A17,A171) +),A18), +A9=[stwa,X11,LP,%X12, +A18]); + 
+(A10=[string,X11,X12]-> +findall(A14,(member(A15,X12), +(is_white_space(A15)->A14=A15; +(foldr(string_concat,["What is ",A15," in the language ",Dest_lang,"?"],S), +(catch(q(S,A16),_,false)->A16=A14; +(writeln("Translation failed."),abort))))),A17), +foldr(string_concat,A17,A171), +A9=[string,X11,A171]))),A19), + +findall(A20,(member(A21,A19), +(A21=[string,X1,Z9]->A20=[X1,Z9]; +(A21=[stwa,X11,LP,%X12, +A18]-> +(%trace, +foldr(put_sub_term_wa_ae,A18,LP,X13), +lp2p1(X13,X14), +A20=[X11,X14])))),X21), +%trace, +findall(_,(member([K2,File2],X21), +(Dest_type=file->K3=Dest; +( + +(string_concat(S5,"/",Dest)->true;S5=Dest), +split_string1(K2,"/",S2), +append([S3],S4,S2), +%foldr(string_concat,[S5,"/",S4],S6) +flatten([S3,"/../",S5,S4],S6), +foldr(string_concat,S6,K3))), +open_s(K3,write,S), +write(S,File2),close(S) +),_),!. + +%q1(A,A). + +is_white_space(C) :- + string_chars(C,D),forall(member(E,D),char_type(E,space)). + +find_spaces_before(A,A51,A8) :- + string_chars(A,A1), + findall(A2,(member(A3,A1),atom_string(A3,A2)),A4), + append(A5,A6,A4), + append([A7],_A8,A6), + not(is_space(A7)), + foldr(string_concat,A6,A8), + foldr(string_concat,A5,A51),!. +find_spaces_before(A,A,"") :- + string_chars(A,A1), + findall(A2,(member(A3,A1),atom_string(A3,A2)),A4), + forall(member(A5,A4),is_space(A5)),!. +find_spaces_before(A,"",A) :- !. + +find_spaces_after(A,A52,A8) :- + string_chars(A,A1), + findall(A2,(member(A3,A1),atom_string(A3,A2)),A4), + reverse(A4,A41), + append(A5,A6,A41), + append([A7],_A8,A6), + not(is_space(A7)), + reverse(A6,A61), + foldr(string_concat,A61,A8), + reverse(A5,A51), + foldr(string_concat,A51,A52),!. +find_spaces_after(A,A,"") :- + string_chars(A,A1), + findall(A2,(member(A3,A1),atom_string(A3,A2)),A4), + forall(member(A5,A4),is_space(A5)),!. +find_spaces_after(A,"",A) :- !. + +find_spaces_before_and_after(A,Before,B,After) :- + find_spaces_before(A,Before,C), + find_spaces_after(C,After,B),!. 
+
+
+find_formatting(Y1,Z9) :-
+	split_on_substring1(Y1,"\n\t",Y2),
+findall(Z8,(member(Z1,Y2),
+find_spaces_before_and_after(Z1,Z2,Z3,Z4),
+delete([Z2,Z3,Z4],"",Z8)),Z5),
+flatten(Z5,Z9). diff --git a/culturaltranslationtool/vintage cultural translation tool/.DS_Store b/culturaltranslationtool/vintage cultural translation tool/.DS_Store new file mode 100644 index 0000000000000000000000000000000000000000..858a8a46d116e51f14ad1177fa91be92544cf4fa Binary files /dev/null and b/culturaltranslationtool/vintage cultural translation tool/.DS_Store differ diff --git a/culturaltranslationtool/vintage cultural translation tool/ctt.pl b/culturaltranslationtool/vintage cultural translation tool/ctt.pl new file mode 100644 index 0000000000000000000000000000000000000000..e19d037a21ab9c973d5a1a7e0e0fc36be51c1f01 --- /dev/null +++ b/culturaltranslationtool/vintage cultural translation tool/ctt.pl @@ -0,0 +1,929 @@ +/** Cultural Translation Tool
+
+The Cultural Translation Tool saves previous translations and texts to back-translate, saving time.
+
+*** Important: Please quit Grammarly (or the grammar editor) before running and after each use.
+*** If using vi instead of Grammarly, please grammar check new sentences in Grammarly before entering.
+- assumed cultural customisations before use of alg
+- texttobr[1,2] after back-translation or (lote texttobr[1,2] when) translation is complete
+
+files to operate on different folder, different tmp folder
+
+simulate repeat by automatically entering entered text when not in correct format
+
+include common ctt preds
+
+texttobr/2 this alg, plan before monday
+
+\' in strings fr can't, aren't - convert to and from - when debugged
+
+
+[ctt].
+[edit].
+
+New Features
+- 5 11 18 Takes previous backtranslations in any language pair that match original and tries to translate them in the given language pair.
+
+**/
+
+% These must be directives (:-); as plain facts, the libraries would never be loaded.
+:- use_module(library(pio)).
+:- use_module(library(dcg/basics)).
+
+%%:- include(library(edit)).
+%%:- use_module(library(edit)).
+%%:- multifile edit:edit_command/2. +%%:- multifile prolog_edit:load/0. + +%% run with prolog_edit:ctt. + +%%:- include('edit.pl'). + +ctt :- + cttInput('files/ctt-input.txt',CttInput1,FromLang,ToLang,Tokens2,_Tokens3,Tokens32), + File1='files/ctt-orig1-orig2.txt', + readfile(File1,CttOrig1Orig21,COOOtherLangPairs,COOWithoutLangPairs,FromLang,ToLang,"files/ctt-orig1-orig2.txt file read error.","Number of back-translation pairs in lang1->lang2: "), + File2='files/ctt-orig-tran.txt', + readfile(File2,CttOrigTran1,COTOtherLangPairs,_,FromLang,ToLang,"files/ctt-orig-tran.txt file read error.","Number of translation pairs in lang1->lang2: "), + + %% ctt-orig1-orig2.txt, ctt-orig-tran.txt->google + + calcbtremaining(CttInput1,CttOrig1Orig21,CttOrigTran1,FromLang,ToLang), %% redo calc based on google x can't bec relies on on-fly processing + %%ctt(Tokens2,[],Translation,CttOrig1Orig21,_Translation1,CttOrig1Orig212,CttOrigTran1,CttOrigTran2,FromLang,ToLang), + ctt2(Tokens2,[],Translation,CttOrig1Orig21,CttOrig1Orig212,CttOrigTran1,CttOrigTran2,FromLang,ToLang,COOWithoutLangPairs), + updatetrans(Translation,Tokens32,"",Translation2), + removerepeatedterm(CttOrig1Orig212,[],CttOrig1Orig2123), + addfromtolang(FromLang,ToLang,CttOrig1Orig2123,[],CttOrig1Orig2122), + addfromtolang(FromLang,ToLang,CttOrigTran2,[],CttOrigTran21), + wrap2(COOOtherLangPairs,[],COO1), + wrap2(COTOtherLangPairs,[],COT1), + append(CttOrig1Orig2122,COO1,COO3), + append(CttOrigTran21,COT1,COT), + updatefile1(COO3,File1), + updatefile1(COT,File2), + updatefile2(Translation2,'files/ctt-output.txt'),!. + +removerepeatedterm([],List,List) :- !. +removerepeatedterm(List1,List2,List3) :- + List1=[[Item,Item]|Rest], + append(List2,[[Item,""]],List4), + removerepeatedterm(Rest,List4,List3),!. +removerepeatedterm(List1,List2,List3) :- + List1=[Item|Rest], + append(List2,[Item],List4), + removerepeatedterm(Rest,List4,List3),!. + +addfromtolang(_FromLang,_ToLang,[],WithFromToLang,WithFromToLang) :- !. 
+addfromtolang(FromLang,ToLang,WithoutFromToLang,WithFromToLang1,WithFromToLang2) :- + WithoutFromToLang=[[A,B]|Rest], + WithFromToLang3=[[FromLang],[ToLang],[[A],[B]]], + append(WithFromToLang1,[WithFromToLang3],WithFromToLang4), + addfromtolang(FromLang,ToLang,Rest,WithFromToLang4,WithFromToLang2), !. + + +/** + updatetrans(Tokens3,Translation1,Translation2), + updatefile(CttOrig1Orig212,File1). + updatefile(CttOrigTran2,File2), + updatefile(Translation2,'files/ctt-output.txt'). +**/ + +updatefile1(List2,File) :- + sort(List2,List3), + updatefile3(List3,File). +updatefile2(List2,File) :- + updatefile3(List2,File). + +updatefile3(List2,File) :- + open(File,write, Stream), +%% string_codes(List2), + write(Stream,List2), + close(Stream),!. + +updatetrans(_,[],Translation,Translation) :- !. +updatetrans(Translation,Tokens1,Translation2,Translation3) :- + Tokens1=[[_,r,N]|Tokens2], + repeat("\n",N,String), + atom_concat(Translation2,String,Translation4), + updatetrans(Translation,Tokens2,Translation4,Translation3),!. +updatetrans(Translation,Tokens1,Translation2,Translation3) :- + Tokens1=[[_,s,N]|Tokens2], + repeat(" ",N,String), + atom_concat(Translation2,String,Translation4), + updatetrans(Translation,Tokens2,Translation4,Translation3),!. +updatetrans(Translation,Tokens1,Translation2,Translation3) :- + Tokens1=[N|Tokens2], + member([N,Token],Translation), + delete(Translation,[N,Token],Translationa), + atom_concat(Translation2,Token,Translation4), + updatetrans(Translationa,Tokens2,Translation4,Translation3),!. + +/**atom_concat2([],List,List) :- !. +atom_concat2(List1,List2,List3) :- + List1=[List4|List5], + atom_concat(List2,List4,List6), + atom_concat2(List5,List6,List3),!.**/ + +repeat(Str,1,Str). +repeat(Str,Num,Res):- + Num1 is Num-1, + repeat(Str,Num1,Res1), + string_concat(Str, Res1, Res). + +string(String) --> list(String). + +list([]) --> []. +list([L|Ls]) --> [L], list(Ls). 
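`repeat/3` above builds `Num` concatenated copies of `Str`, but its recursive clause has no `Num > 1` guard, so it can loop on backtracking or with `Num < 1` (callers such as `updatetrans/4` avoid this by only passing counts of at least 1). A hedged standalone variant, renamed `repeat_n/3` to avoid clashing with the definition above, with the guard made explicit:

```prolog
% repeat_n/3: build Num concatenated copies of Str (Num >= 1).
repeat_n(Str, 1, Str) :- !.
repeat_n(Str, Num, Res) :-
    Num > 1,                        % guard added; the original relies on callers
    Num1 is Num - 1,
    repeat_n(Str, Num1, Res1),
    string_concat(Str, Res1, Res).
```

For example, `repeat_n("\n", 3, R)` gives `R = "\n\n\n"`, matching how `updatetrans/4` reconstructs runs of newlines and spaces from the `[_, r, N]` and `[_, s, N]` tokens.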
+ +cttInput(CttInput1,CttInput2,FromLang,ToLang,Tokens2,Tokens3,Tokens32) :- + %%SepandPad=".?!", %% split by \n and record number of \ns between paras, then split into sentences adding back punctuation, trim white space before sentence recording number of \ss before - record sentence # + phrase_from_file(string(CttInput4), CttInput1), + (phrase(entry1(CttInput3),CttInput4)->true;(writeln("files/ctt-input.txt file read error."),abort)), + CttInput3=[FromLang,ToLang,Tokens1], + removers(Tokens1,1,[],Tokens2,[],Tokens3,[],Tokens31,[],Tokens32), + length(Tokens2,Length0),write("Number of sentences to translate in files/ctt-input.txt: "), writeln(Length0), + sort(Tokens31,CttInput2), + length(CttInput2,Length1),write("Number of unique sentences to translate in files/ctt-input.txt: "), writeln(Length1). + +%% [en,fr] +%% Num of sentences to modify, etc. + +removers([],_,Tokens1,Tokens1,Tokens2,Tokens2,Tokens3,Tokens3,Tokens4,Tokens4) :- !. +removers(Tokens1,N1,Tokens2,Tokens3,Tokens7,Tokens8,Tokens10,Tokens11,Tokens13,Tokens14) :- + Tokens1=[AtomToken4|Tokens5], %% Token4 to AtomToken4 + not(AtomToken4=[r,_]),not(AtomToken4=[s,_]), %% " + %%[Token41]=Token4, + %%atom_string(AtomToken4,Token41), + append(Tokens2,[[N1,AtomToken4]],Tokens6), + append(Tokens13,[N1],Tokens15), + append(Tokens10,[AtomToken4],Tokens12), + append(Tokens7,[N1],Tokens9), + N2 is N1+1, + removers(Tokens5,N2,Tokens6,Tokens3,Tokens9,Tokens8,Tokens12,Tokens11,Tokens15,Tokens14),!. +removers(Tokens1,N1,Tokens2,Tokens3,Tokens6,Tokens7,Tokens10,Tokens11,Tokens13,Tokens14) :- + Tokens1=[Token4|Tokens5], + ((Token4=[r,N3],T=r);(Token4=[s,N3],T=s)), + append(Tokens13,[[N1,T, N3]],Tokens15), + %%atom_string(AtomToken4,Token4), + %%append(Tokens6,[[AtomToken4]],Tokens8), + N2 is N1+1, + removers(Tokens5,N2,Tokens2,Tokens3,Tokens6,Tokens7,Tokens10,Tokens11,Tokens15,Tokens14),!. 
+ +readfile(List1,List2,List7,List8,FromLang,ToLang,Error,Notification) :- + phrase_from_file(string(List6), List1), + (phrase(file0(List3),List6)->true;(writeln(Error),abort)), + select(List3,FromLang,ToLang,[],List5,[],List7,[],List8), + sort(List5,List2), + length(List2,Length),write(Notification), writeln(Length) %% + . + +select([],_FromLang,_ToLang,List1,List1,List2,List2,List3,List3) :- !. +select(List,FromLang,ToLang,List1,List2,List5,List6,List7,List8) :- + List=[Item1|List3], + ((Item1=[FromLang,ToLang,Item2], + append(List1,[Item2],List3)); + (Item1=[FromLang,ToLang,Item2,Item3], + append(List1,[[Item2,Item3]],List4))), + select(List3,FromLang,ToLang,List4,List2,List5,List6,List7,List8),!. +select(List,FromLang,ToLang,List1,List2,List4,List5,List7,List8) :- + List=[Item1|List3], + append(List4,[Item1],List6), + Item1=[_,_,Item2,Item3], + append(List7,[[Item2,Item3]],List9), + select(List3,FromLang,ToLang,List1,List2,List6,List5,List9,List8),!. + +calcbtremaining(CttInput1,CttOrig1Orig211,CttOrigTran1,FromLang,ToLang) :- + towords(FromLang,ToLang,CttOrig1Orig211,[],CttOrig1Orig212,[],_ObjectNames,[],AllUsedNames), + tranfrom(CttOrigTran1,[],TranFrom), + subtract(CttInput1,CttOrig1Orig212,D1), %% + length(D1,Length),Difference is abs(Length),write("Number of back-translations remaining to define: "), writeln(Difference), + + %%towords2(CttInput1,[],CttInput2), + subtract(AllUsedNames,TranFrom,D2), %% Should AUN be appended to TF, " x + %%delete(D21,'',D2), + length(D2,Length01t),Differencet is abs(Length01t),write("Number of undefined back-translations: "), writeln(Differencet), + %%writeln([undefined,D2]), %% Print undefined + + %%delete(D31,'',D3), + subtract(TranFrom,AllUsedNames,D3), + length(D3,Length01t2),Differencet2 is abs(Length01t2),write("Number of orphaned translations: "), writeln(Differencet2),!. + %%writeln([orphaned,D3]). %% Print orphaned + + +towords(_,_,[],A,A,C,C,D,D) :- !. 
+towords(FromLang,ToLang,J,A,B,D,E,G,H) :- + J=[[Word1,Word2]|Rest], + Atom1=Word1, + Atom2=Word2, + (Atom2=''->append(G,[Atom1],I); + append(G,[Atom2],I)), + append(A,[Atom1],C), + append(D,[Atom2],F), + towords(FromLang,ToLang,Rest,C,B,F,E,I,H). +towords(FromLang,ToLang,J,A,B,D,E,G,H) :- + not(J=[[[FromLang,ToLang],[_,_]]|Rest]), + towords(FromLang,ToLang,Rest,A,B,D,E,G,H). + +tranfrom([],TranFrom,TranFrom) :- !. +tranfrom(CttOrigTran1,TranFrom1,TranFrom2) :- + CttOrigTran1 = [[TranFrom3,_Tran]|TranFrom4], + append(TranFrom1,[TranFrom3],TranFrom5), + tranfrom(TranFrom4,TranFrom5,TranFrom2), !. + +/**%%% already to atom in removers +towords2([],A,A) :- !. +towords2(BrDict03,A,B) :- + BrDict03=[[N,Word]|Rest], + atom_string(Atom,Word), + append(A,[[N,Word]],C), + towords2(Rest,C,B). +**/ + +%% ctt-orig1-orig2, ctt-orig-tran + +%% string_codes("[[[[a],[a]],[[a],[a]]],[[[a],[a]],[[a],[a]]],[[[a],[a]],[[a],[a]]]]",Y),phrase(file0(X),Y). + +file0(N) --> "[", file(N), "]", !. +file0([]) --> []. + +%%file([]) --> []. +file([L|Ls]) --> entry(L),",", +%%{writeln(L)}, %%*** +file(Ls), !. +file([L]) --> entry(L), +%%{writeln(L)}, +!. + +%%quote(_X)-->"'". + +/** string_codes( +"[[[a],[a]],[[a],[a]]]" +,Y),phrase(entry(X),Y). **/ + +entry([Word1,Word2,Word3,Word4]) --> + "[","[", sentence1(Word11), "]", {string_codes(Word1,Word11),string(Word1)}, + ",", + "[", sentence1(Word22),"]", {string_codes(Word2,Word22),string(Word2)}, + ",", + "[","[", sentence1(Word33), "]", {string_codes(Word3,Word33),string(Word3)}, + ",", + "[",sentence1(Word44), "]","]", {string_codes(Word4,Word44),string(Word4)}, + "]",!. + +sentence1([X|Xs]) --> [X], {((true%%char_type(X,alnum) +;char_type(X,white));char_type(X,punct)), not(X=93) +%% ] +, not(X=91) +%% [ +}, sentence1(Xs), !. +sentence1([]) --> []. + +%% string_codes("[[a],[a],[a.a.a. Hello, how are you?\n]]",Y),phrase(entry1(X),Y). 
+%% X = [a,a,[[a.],[a.],[a.],[s,1],[Hello, how are you?],[r,1]]] + +entry1([Word1,Word2,Word3]) --> + "[","[", sentence1(Word11), "]", {string_codes(Word1,Word11),string(Word1)}, + ",", + "[", sentence1(Word22), "]", {string_codes(Word2,Word22),string(Word2)}, + ",", + "[", paragraph(Word3), %% {string_codes(Word3,Word33),string(Word3)}, + "]","]", !. + +%%paragraph([]) --> []. + +%% "a.a." +paragraph(AAs) --> spaces(Y), {%%atom_string(Y,YS), +string_length(Y,YLen), (YLen>0->Start=[[s,YLen]];Start=[])}, sentence32(X), returns(Z), {%%atom_string(X,XS), +[XS]=X, +string_length(XS,XLen), atom_string(XS,X1), (XLen>0->append(Start,[X1],Next);Next=Start), %%atom_string(Z,ZS), +string_length(Z,ZLen), (ZLen>0->append(Next,[[r,ZLen]],Last);Last=Next)},paragraph(As),{ append(Last,As,AAs)}, !. +%% "a." +paragraph(Last) --> spaces(Y), {%%atom_string(Y,YS), +string_length(Y,YLen), (YLen>0->Start=[[s,YLen]];Start=[])}, sentence32(X), returns(Z), {%%atom_string(X,XS), +[XS]=X, +string_length(XS,XLen), atom_string(XS,X1), (XLen>0->append(Start,[X1],Next);Next=Start), %%atom_string(Z,ZS), +string_length(Z,ZLen), (ZLen>0->append(Next,[[r,ZLen]],Last);Last=Next)}, !. + +%% "a\na\n" +paragraph(AAs) --> spaces(Y), {%%atom_string(Y,YS), +string_length(Y,YLen), (YLen>0->Start=[[s,YLen]];Start=[])}, sentence33(X), returns(Z),{%%atom_string(X,XS), +[XS]=X, +string_length(XS,XLen), atom_string(XS,X1), (XLen>0->append(Start,[X1],Next);Next=Start), %%atom_string(Z,ZS), +string_length(Z,ZLen), ZLen>=1,(ZLen>0->append(Next,[[r,ZLen]],Last);Last=Next)},paragraph(As),{append(Last,As,AAs)}, !. +%% "a\n" +paragraph(Last) --> spaces(Y), {%%atom_string(Y,YS), +string_length(Y,YLen), (YLen>0->Start=[[s,YLen]];Start=[])}, sentence33(X), returns(Z), {%%atom_string(X,XS), +[XS]=X, +string_length(XS,XLen), atom_string(XS,X1), (XLen>0->append(Start,[X1],Next);Next=Start), %%atom_string(Z,ZS), +string_length(Z,ZLen), ZLen>=1,(ZLen>0->append(Next,[[r,ZLen]],Last);Last=Next)}, !. 
+ +%% "a" +paragraph(Next) --> spaces(Y), {%%atom_string(Y,YS), +string_length(Y,YLen), (YLen>0->Start=[[s,YLen]];Start=[])}, sentence33(X), {%%atom_string(X,XS), +[XS]=X, +string_length(XS,XLen), atom_string(XS,X1), (XLen>0->append(Start,[X1],Next);Next=Start)}, !. + + +spaces(XXs) --> [X], {X=32}, spaces(Xs), {char_code(Ch,X),atom_string(CA,Ch),atom_concat(CA,Xs,XXs)}, !. %% Space +spaces('') --> []. + +sentence32([XsZ]) --> sentence33(Xs), sentenceendpunctuation(Z), {atom_string(CA,Xs),atom_concat(CA,Z,XsZ)}, !. +sentence32('') --> []. + +%%sentence321(CA) --> sentence33(Xs), {atom_string(CA,Xs)}, !. +%%sentence321('') --> []. + +returns(XXs) --> [X], {X=10}, returns(Xs), {char_code(Ch,X),atom_string(CA,Ch),atom_concat(CA,Xs,XXs)}, !. %% Newline +returns('') --> []. + + +%% sentence33([C|Xs]) +sentence33(CXs) --> [X], {((true%%char_type(X,alnum) +;char_type(X,white));char_type(X,punct)), not(X=93),char_code(C,X) +%% ] +, not(X=91) + +%% . +, not(X=46) +%% ! +, not(X=33) +%% ? +, not(X=63) +%% \n +, not(X=10) + +}, sentence33(Xs), {atom_string(CA,C),atom_concat(CA,Xs,CXs)}, !. +sentence33('') --> []. + +sentenceendpunctuation(Z) --> [Z1], {char_code(Z,Z1),(Z='.';(Z='?';(Z='!')))}, !. + + +%% [['<<>>','How do you do?'],[[[BT:'How are you?',bt]...],['How are you - entered?'...]]] +%% [[<<>>,hello],[],[[hello,]]] + +file1(N) --> "[", "[", "[", sentence1(_A1), "]", ",", "[", sentence1(_A2), "]", "]", ",", "[", file3(_A3), "]", %%% +",", "[",file3(N), "]", "]", optional_end(_A4), !. +%%file1([]) --> []. + +optional_end(_A4) --> [Return],[EOF],{Return=13,EOF=10}, !. +optional_end(_A4) --> [], !. + +%%file2([Entries]) --> "[", file4(Entries), "]", !. +%%%%file2([]) --> []. + +%%file([]) --> []. +file3([L|Ls]) --> entry2(L),",", +%%{writeln(L)}, %% +file3(Ls), !. +file3([L]) --> entry2(L), +%%{writeln(L)}, +!. + +/**%%file([]) --> []. +file4([L|Ls]) --> entry3(L),",", +%%{writeln(L)}, %% +file4(Ls), !. 
+file4([L]) --> entry3(L), +%%{writeln(L)}, +!.**/ + + +entry2([Word1,Word2]) --> + "[","[", sentence1(Word11), "]", {string_codes(Word1,Word11),string(Word1)}, + ",", "[", sentence1(Word22), "]", {string_codes(Word2,Word22),string(Word2)}, "]",!. + + + +%%entry3([Word1]) --> +%% "[", sentence1(Word11), "]", {string_codes(Word1,Word11),string(Word1)}, +%% !. + +bash_command(Command, Output) :- + setup_call_cleanup(process_create(path(bash), + ['-c', Command], + [stdout(pipe(Out))]), + read_string(Out, _, Output), + close(Out)). + + +swap_quote_to_space(A,B) :- + string_codes(A,C),findall(D,(member(C1,C),swap1(C1,D)),E),string_codes(B,E),!. +swap1(A,B) :- string_codes("'",[A]),string_codes(" ",[B]),!. +swap1(A,A) :- !. + +translate(Input,FromLang,ToLang,Output3) :- + %%swap_quote_to_space(Input,Input1), + insertdoublebackslashbeforequote(Input,Input1), + concat_list(["../../../trans ",FromLang,":",ToLang," \"",Input1,"\""],F), + %%atom_concat("export GOOGLE_APPLICATION_CREDENTIALS=\"/Users/luciangreen/Dropbox/Program Finder/possibly not working/translationmanagementsystem/Cultural Translation Tool-19XXXXXXb4.json\"\ncurl -s -X POST -H \"Content-Type: application/json\" -H \"Authorization: Bearer \"$(/Users/luciangreen/Dropbox/Program\\ Finder/possibly\\ not\\ working/translationmanagementsystem/google-cloud-sdk/bin/gcloud auth application-default print-access-token) --data \"{ +/** + 'q': '",Input,A), + atom_concat(A,"', + 'source': '",B), + atom_concat(B,FromLang,C), + atom_concat(C,"', + 'target': '",D), + atom_concat(D,ToLang,E), + atom_concat(E,"', + 'format': 'text' +}\" \"https://translation.googleapis.com/language/translate/v2\"",F), +**/ +repeat, + +catch(call_with_time_limit(5, +catch( + (bash_command(F,Output1)), + _, + (writeln("Translate failed. 
Press c to retry."), + read_string(user_input, "\n", "\r", _,C), + C="c"->fail;abort) +) +), + time_limit_exceeded, + (writeln1("Error: translate timed out."),abort)), + + split_string(Output1,"\033","\033",Output2), + Output2=[_,Output3a|_], %% *** May be 3rd item on Linux + %%atom_concat("{\n \"data\": {\n \"translations\": [\n {\n \"translatedText\": \"",A1,Output1),atom_concat(Output2,"\"\n }\n ]\n }\n}\n",A1), + atom_string(Output3a,Output3b), + string_concat("[1m",Output3,Output3b), + %%string_concat(Output3,"\033\[22m",Output3c) +!. + +/** + +translate(Input1,FromLang,ToLang,Output3) :- + insertdoublebackslashbeforequote(Input1,Input), + atom_concat("export GOOGLE_APPLICATION_CREDENTIALS=\"/Users/luciangreen/Dropbox/Program Finder/possibly not working/translationmanagementsystem/Cultural Translation Tool-19XXXXXXb4.json\"\ncurl -s -X POST -H \"Content-Type: application/json\" -H \"Authorization: Bearer \"$(/Users/luciangreen/Dropbox/Program\\ Finder/possibly\\ not\\ working/translationmanagementsystem/google-cloud-sdk/bin/gcloud auth application-default print-access-token) --data \"{ + 'q': '",Input,A), + atom_concat(A,"', + 'source': '",B), + atom_concat(B,FromLang,C), + atom_concat(C,"', + 'target': '",D), + atom_concat(D,ToLang,E), + atom_concat(E,"', + 'format': 'text' +}\" \"https://translation.googleapis.com/language/translate/v2\"",F), + bash_command(F,Output1), + atom_concat("{\n \"data\": {\n \"translations\": [\n {\n \"translatedText\": \"",A1,Output1),atom_concat(Output2,"\"\n }\n ]\n }\n}\n",A1), + atom_string(Output2,Output3). + + **/ +insertdoublebackslashbeforequote(Input1,Input) :- + string_codes(Input1,Input2), + insertdoublebackslashbeforequote1(Input2,[],Input3), + string_codes(Input,Input3). +insertdoublebackslashbeforequote1([],Input,Input) :- !. 
+insertdoublebackslashbeforequote1(Input1,Input2,Input3) :- + Input1=[Input4|Input5], + Input4=39, %% quote + append(Input2,[92,39],Input6), + insertdoublebackslashbeforequote1(Input5,Input6,Input3), !. +insertdoublebackslashbeforequote1(Input1,Input2,Input3) :- + Input1=[Input4|Input5], + not(Input4=39), %% quote + append(Input2,[Input4],Input6), + insertdoublebackslashbeforequote1(Input5,Input6,Input3), !. + + +ctt2([],T,T,B,B,C,C,_,_,_) :- + !. +ctt2([[N,Word]|Words],Translation1,Translation2,List,List2,List4,List5,FromLang,ToLang,COOWithoutLangPairs) :- + + tryoutputs0([[Word,_]],COOWithoutLangPairs,[],COO2), + +%%writeln([tryoutputs0([[word,_]],cOOWithoutLangPairs,[],cOO2),tryoutputs0([[Word,_]],COOWithoutLangPairs,[],COO2)]), + sort(COO2,COO4), + + prolog_edit:open_grammar_editor(Word,FromLang,ToLang,List,List4,String4,N,COO4), + + /** + ctt-orig1-orig2.txt +[[[en,fr],['How do you do?','How are you?']]...] 2nd item is correct back-translation with cult cust +o1 o2*** +ctt-orig-tran.txt +[[[en,fr],['How are you?','Comment allez-vous?']]...] +o2 l*** + **/ + + (String4=[BT,Translation4]->true;String4=[[BT,Translation4]]), %% s4=[o2 l] + append(Translation1,[[N,Translation4]],Translation3), %% String4=[o2 l]2 + %% *********** don't add if '' + String5=[Word,BT], + tryoutputs1(String5,List,List621), %% want [o1 o2]1 + %%List62=[[FromLang],[ToLang],List621], + String6=[BT,Translation4], + tryoutputs11(String6,List4,List611), %% want [o2 l]1 + %%List61=[[FromLang],[ToLang],List611], + + ctt2(Words,Translation3,Translation2,List621,List2,List611,List5,FromLang,ToLang,COOWithoutLangPairs),!. + %%). + +%% finds unknown words, asks for their br in form "n of m: word", verify, (can go back x) append and sort, save + +fulladjective("en",'English') :- !. +fulladjective("fr",'French') :- !. +fulladjective("de",'German') :- !. +fulladjective("es",'Spanish') :- !. +fulladjective("ru",'Russian') :- !. +fulladjective(A,A) :- !. 
+
+writenotification1(A,E,F,Notification):-
+ %%working_directory(CWD, CWD),
+ atom_concat(A,E,E1),
+ atom_concat(E1,F,Notification).
+
+%% 2-arity form used by open_grammar_editor/8: the notification is passed through unchanged.
+writenotification1(Notification,Notification).
+
+writenotification2(A,FromLang1,F,Notification):-
+ working_directory(CWD, CWD),
+ fulladjective(FromLang1,FromLang2),
+ atom_concat(A,FromLang2,B),
+ %%atom_concat(B,CWD,E),
+ atom_concat(B,F,Notification).
+
+%% backtranslateuntilcorrect/11 attempts to back-translate Orig given the
+%% [original,previous back-translation] pairs List1, the [previous back-translation,
+%% previous translation] pairs List2, the past tries PastTries, FromLang, ToLang,
+%% file path E, shell command H and the editing-instruction notifications,
+%% giving the translation Output.
+
+/**
+ctt-orig1-orig2-tmp-1.txt
+[['<>','I love you.'],[['','']],[['I dote on you.', '']]]
+
+backtranslateuntilcorrect([['I love you.','I dote on you.']],[['I dote on you.','Je me passionne pour toi.']],'I love you.',[],'en','fr','tmp/ctt-orig1-orig2-tmp-1.txt','/Applications/Grammarly.app/Contents/MacOS/Grammarly tmp/ctt-orig1-orig2-tmp-1.txt --line','<>',O).
+ + + + + + + +**/ +backtranslateuntilcorrect(List1,List2,Orig,PastTries,FromLang,ToLang,E,H,Notification1,Notification2,Output) :- +%%trace, + ( + phrase_from_file(string(List6),E), + (phrase(file1(Outputs11),List6), + removenotice(Outputs11,Outputs1a), + sort(Outputs1a,Outputs1) + %%not(Outputs1=Orig) +%%writeln([outputs1,Outputs1]) + %%not(Outputs1=[]) + )-> + ( + ( %% remove extra br *** + (%% Use a correct translation from ctt-orig-tran.txt + tryoutputs0(Outputs1,List2,[],Output),not(Output=[]))-> + true + ; + + (%% Use a correct translation from a back-translation from ctt-orig1-orig2.txt + (tryoutputs(Outputs1,List1,[],Output1),tryoutputs0(Output1,List2,[],Output),not(Output=[]))-> + true; + ( + (trytranslations1(Outputs1,FromLang,ToLang, + false,Flag1,[],Outputs3,_,Output2)), + ( + Flag1=true-> + Output=Output2 + ; + (trytranslations2(Orig,Outputs3,Output3,Flag2), + ( + Flag2=true-> + Output=Output3 + ; + (openeditor(Notification1,Notification2,Orig,PastTries,Outputs1,Tries,E,H), + backtranslateuntilcorrect(List1,List2,Orig,Tries, + FromLang,ToLang,E,H,Notification1,Notification2,Output) + ) + ) + ) + ) + ) + ) + ) + ) + ; + ( + openeditorwitherror(H), + Tries=PastTries, + backtranslateuntilcorrect(List1,List2,Orig,Tries,FromLang, + ToLang,E,H,Notification1,Notification2,Output) + ) + ). + + %% Reads file E giving Outputs1, the list of sentences to try back-translating + %% Outputs1 = [[o1,pbt1]] + + %% Finds whether the sentences to try back-translating are in the list + %% [previous back-translation,previous translation] + %% ?- tryoutputs0 [[o1,pbt1],[o2,pbt2]],[[pbt2,pt2]],[],I . + %% I = [pbt2, pt2]. + + %% Finds whether the sentences to try back-translating are in the list + %% [original,previous back-translation] + /** + ?- tryoutputs [[o1,pbt1],[o2,pbt2]],[[o2,pbt2]],I . + I = [o2, pbt2]. + + ?- tryoutputs2 [o2, pbt2],[[pbt1,pt1],[pbt2,pt2],[pbt2,pt2]],I . + I = [pbt2, pt2]. 
+ **/ + + %% Finds the back-translations of the sentences to try + %% back-translating from one language to another, and returns + %% Flag=true if there is a successful back-translation Output2 + %% See ***1 +/** +*** 1 +?- trytranslations1([['I love you.-1','I love you.'],['I love you1-1','I love you1']],'en','fr',false,Flag,[],O,_,O2). +Flag = true, +O = [['I love you.', '', 'I love you tr.'], ['I love you1.', '', 'I love you1 tr.']], +O2 = ['I love you1.', 'I love you1 tr.']. + +?- trytranslations1([['I love you.-1','I love you3.']],'en','fr',false,Flag,[],O,_,O2). +Flag = false, +O = [['I love you3.', 'I love you3 fake.', 'I love you3 tr.']]. +**/ + + %% Asks if a back-translation is correct + %% trytranslations2 'Original', 'I love you3.', 'I love you3 fake.', 'I love you3 tr.' ,[a,b,c]],O . + +removenotice(Outputs11,Outputs1) :- + %%writeln([outputs11,Outputs11]), + Contents=[_,"Insert an alternative that might translate better here."], + /**((((Contents=[_,"Insert an alternative that might translate better here."]; + Contents=[_,"Insert a correct translation here",_]); + Contents=[_,"Insert a correct translation here"]) ; + Contents=_Empty), + member(Contents,Outputs11),**/ + delete(Outputs11,Contents,Outputs1), !. 
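`removenotice/2` above relies on `delete/3` unifying its template `[_, "Insert an alternative that might translate better here."]` against each pair, so any pair whose second element is the placeholder notice is dropped regardless of its first element. A minimal standalone illustration (the pair data is made up; `drop_notice/2` is an invented name):

```prolog
% delete/3 removes every element that unifies with the template, so the
% anonymous first slot matches any original sentence.
drop_notice(Pairs0, Pairs) :-
    delete(Pairs0,
           [_, "Insert an alternative that might translate better here."],
           Pairs).
```

For example, `drop_notice([["How do you do?", "Insert an alternative that might translate better here."], ["How do you do?", "How are you?"]], P)` gives `P = [["How do you do?", "How are you?"]]`.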
+%%* delete these one by one + +openeditor(Notification1,Notification2,Orig,PastTries,Outputs12,Tries,E,H) :- + writeln(Notification1), + + %%removenotice(Outputs11,Outputs12), + + wrap(Outputs12,[],Outputs1), + %%writeln([pastTries,PastTries]), %%% + append(PastTries,Outputs1,Tries), %% Output, Outputs1 x *** REMOVED [] from O1 + %%(phrase_from_file(string(FileContents1),E)->true;writeln("Error opening tmp file.")), + + append([[["<<>>"],[""]]],Tries,Tries2), +%%writeln([append([[["<<>>"],[""]]],tries,tries2),append([[["<<>>"],[""]]],Tries,Tries2)]), + FileContents1=[[[Notification1],[Orig]],Tries2,[[[Notification2],["Insert an alternative that might translate better here."]]]], + + updatefile2(FileContents1,E), + writeln("Opening editor."), + (bash_command(H,_)-> + true; + (writeln("Failed opening Editor."),abort) + ). %% open editor + +openeditorwitherror(H) :- + write("*** Your previous entry was not in the correct format. "), + writeln("Please correct this. ***"), + writeln("Opening editor."), + ( + ( + bash_command(H,_)-> + true + ; + ( + writeln("Failed opening Editor."),abort + ) + ) + ). + +wrap([],List,List) :-!. +wrap(List1,List2,List3) :- + List1=[[Item1,Item2]|List4], + append(List2,[[[Item1],[Item2]]],List5), + wrap(List4,List5,List3), !. + +wrap2([],List,List) :-!. +wrap2(List1,List2,List3) :- + List1=[[Item1,Item2,Item3,Item4]|List4], + append(List2,[[[Item1],[Item2],[[Item3],[Item4]]]],List5), + wrap2(List4,List5,List3), !. + +tryoutputs0([],_List3,Output,Output) :- !. +tryoutputs0([[Original,BT]|Outputs],List2,List3,List4) :- + ( + ( + (BT=""-> + (member([Original,Translation],List2), + append(List3,[[Original,Translation]],List5)) + -> + true + ; + ( member([BT,Translation],List2))-> + append(List3,[[BT,Translation]],List5) + ) + ) + ; + List5=List3 + ), + tryoutputs0(Outputs,List3,List5,List4) + ,!. +%%tryoutputs0([_Output|Outputs],List,Answer) :- +%% tryoutputs0(Outputs,List,Answer),!. + +tryoutputs([],_List3,Output,Output) :- !. 
+tryoutputs([Item|Outputs],List2,List3,List4) :- + tryoutputsa(Item,List2,List3,List5), + tryoutputs(Outputs,List2,List5,List4). + +tryoutputsa(_,[],List,List) :- !. +tryoutputsa([Original,BT],[List21|List22],List3,List4) :- + [Original,Translation]=List21, + (BT=""-> + append(List3,[[Original,Translation]],List5) + ; + append(List3,[[Original,BT]],List5)) + , + tryoutputsa([Original,BT],List22,List5,List4) + ,!. +tryoutputsa(Item,[_|List22],List3,List4) :- + tryoutputsa(Item,List22,List3,List4), !. + + +/** +tryoutputs([Item1],List2,Item2) :- + Item1=[Original,BT1], + ((BT1="",member([Original,BT2],List2),Item2=[Original,BT2]); + (member(Item1,List2),Item2=Item1)) + %%,Item1=[Item2,_Output] + ,!. +tryoutputs([],_List3,_Output) :- fail. +tryoutputs([_Output|Outputs],List,Answer) :- + tryoutputs(Outputs,List,Answer),!. +**/ + +/**tryoutputs2(Item1,List2,Item2) :- + Item1=[Original,PreviousBT1], + (PreviousBT1=""->PreviousBT2=Original;PreviousBT2=PreviousBT1), + member(Item2,List2), + Item2=[PreviousBT2,_Translation], + !. +**/ +/** ?- tryoutputs1([original,bt],[[original,bt],[original1,bt1]],L). +L = [[original, bt], [original1, bt1]]. **/ +tryoutputs1(Output1,Output2,Output2) :- + member(Output1,Output2),!. +tryoutputs1(Output1,Output2,Output3) :- + not(member(Output1,Output2)), + (Output2=""-> + Output3=Output2; + append(Output2,[Output1],Output3)), + !. + +/** ?- tryoutputs11([bt,t],[[bt1,t1]],L). +L = [[bt1, t1], [bt, t]]. **/ +tryoutputs11(Output1,Output2,Output2) :- + member(Output1,Output2),!. +tryoutputs11(Output1,Output2,Output3) :- + not(member(Output1,Output2)),append(Output2,[Output1],Output3),!. + +trytranslations1([],_FromLang,_ToLang,Flag,Flag,List,List,String,String) :- !. 
+/**trytranslations1([[_,String]],FromLang,ToLang,Flag1,Flag2,List1,List3,String1,String2) :- + translate1(String,FromLang,ToLang,Output2), + String4=[String,Output2], + translate1(Output2,ToLang,FromLang,Stringa), + (String=Stringa-> + (Flag2=true,String2=String4,append(List1,[[String,"",Output2]],List3)); + (Flag1=Flag2,String2=String1,append(List1,[[String,Stringa,Output2]],List3))), + !.**/ +trytranslations1([[Original,PreviousBT1]|Outputs],FromLang,ToLang,Flag1,Flag2,List1,List2,String1,String2) :- + (PreviousBT1=""->String=Original;String=PreviousBT1), + translate(String,FromLang,ToLang,Output2), + String4=[String,Output2], + translate(Output2,ToLang,FromLang,Stringa), + (String=Stringa-> + (Flag3=true,String3=String4,append(List1,[[String,"",Output2]],List3)); + (Flag3=Flag1,String3=String1),append(List1,[[String,Stringa,Output2]],List3)), + trytranslations1(Outputs,FromLang,ToLang,Flag3,Flag2,List3,List2,String3,String2),!. +trytranslations1([_|Outputs],FromLang,ToLang,Flag1,Flag2,List1,List2,String1,String2) :- + trytranslations1(Outputs,FromLang,ToLang,Flag1,Flag2,List1,List2,String1,String2),!. + +translate1("I love you.","en","fr","I love you tr."). +translate1("I love you1.","en","fr","I love you1 tr."). +translate1("I love you2.","en","fr","I love you2 tr."). +translate1("I love you tr.","fr","en","I love you."). +translate1("I love you1 tr.","fr","en","I love you1."). +translate1("I love you1 tr.","fr","en","I love you1."). +translate1("I dote on you1.","en","fr","I dote on you1 tr."). +translate1("I dote on you1 tr.","fr","en","I dote on you1 different."). + +translate1("Je t|aime.","en","fr","I love you."). +translate1("I love you.","fr","en","Je t|aime."). +translate1("I dote on you.","en","fr","Je me passionne pour toi."). +translate1("Je me passionne pour toi.","fr","en","I dote on you."). + +translate1("I love you3.","en","fr","I love you3 tr."). +translate1("I love you3 tr.","fr","en","I love you3 fake."). 
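The `translate1/4` facts above are offline stubs for the translator, and the back-translation test in `trytranslations1/9` amounts to translating, translating back, and comparing the result with the original. A hedged standalone sketch of that round-trip check (four stub facts are reproduced under the invented name `translate1_stub/4` so the snippet runs on its own; `back_translates_ok/4` is also an invented name):

```prolog
% Stub translations, copied from the translate1/4 fact table above.
translate1_stub("I love you.", "en", "fr", "I love you tr.").
translate1_stub("I love you tr.", "fr", "en", "I love you.").
translate1_stub("I love you3.", "en", "fr", "I love you3 tr.").
translate1_stub("I love you3 tr.", "fr", "en", "I love you3 fake.").

% A translation passes when translating it back yields the original sentence.
back_translates_ok(Sentence, From, To, Translation) :-
    translate1_stub(Sentence, From, To, Translation),
    translate1_stub(Translation, To, From, Sentence).
```

Here `back_translates_ok("I love you.", "en", "fr", T)` succeeds with `T = "I love you tr."`, while `back_translates_ok("I love you3.", "en", "fr", _)` fails because the round trip returns `"I love you3 fake."` instead of the original.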
+ +trytranslations2(Orig,Outputs1,Output,Flag2) :- + %%removenotice(Outputs11,Outputs1), + writeln(""), + writeln("Which of the following sentences:\n- are grammatical\n- have the same meaning as the other in the pair, and\n- have the same meaning as the original sentence?\n"), + write('Original:'),write("\t"),writeln(Orig),writeln(''), + listoutputs(1,N,Outputs1), + write("Enter 1-"),write(N),writeln(", or 0 for none:"), + repeat2(Outputs1,Output,Flag1), + conjunction(Flag1,true,Flag2). +repeat2(Outputs1,Output,Flag) :- + read_string(user_input, "\n", "\r", _End2, Input), + number_string(N1,Input), + (N1=0->Flag=false; + ( + Input2 is N1-1,length(List,Input2), + append(List,[[_Original,Backtranslation,Translation]|_],Outputs1), + Output=[Backtranslation,Translation] + )-> + true; + repeat2(Outputs1,Output,Flag) + ). +listoutputs(N1,N2,[]) :- + N2 is N1-1, !. +listoutputs(N1,N3,[[Original,Backtranslation,_Translation]|Outputs1]) :- + write(N1),write("\t\t"),write(Original),write("\n\t\t"),writeln(Backtranslation),write("\n"), + N2 is N1+1, + listoutputs(N2,N3,Outputs1). + +conjunction(true,true,true) :- !. +conjunction(_,_,false) :- !. + +%% list Trans +%% orig to choose from v + +/** +- 0. Enter from, to languages + +ctt-input.txt +[[en,fr],['How do you do?...']] + +ctt-orig1-orig2.txt +[[[en,fr],['How do you do?','How are you?']]...] 2nd item is correct back-translation with cult cust + +1. If back-translation of sentence and ctt-orig-tran.txt version exists, or a link to it exists or is the same, adds to translation if nec (and lang tag x to) ctt-orig-tran.txt + 2. 
Else, + - asks if the sentences have the same meaning with return/n options + If yes, goto 1 + If no, opens Grammarly editor to add possible from2 versions to retry + +ctt-orig1-orig2-tmp-N.txt (keep all finished tmps, open in folder by itself) +[['<<>>','How do you do?'],[[[BT:'How are you?',bt]...],[['How are you - entered?']...]]] +f2t only has f1 sentence and allows changes in grammarly - sentence (one of entered ones) with an acceptable back-translation will be added to f1f2 + +ctt-orig-tran.txt +[[[en,fr],['How are you?','Comment allez-vous?']]...] + +ctt-output.txt +[[en,fr],[['How are you?','Comment allez-vous?']...]] - complete text translated + +Note: +- Can split sentences into logic, A verb B with C x do manually +ctt-from2tmp.txt (delete just before end) +([[['How are you?'] x v working translation exists] x,[['How do you do?' (original first),...]] +- assumes Sentences have been checked with Grammarly before using the program +- lists from1 or from2 distance 1 x n away from from1 x all with same meaning index (eg 1) +- mistaken mergers will require sentences copied from screen, manually diverged later x undo feature - makes backups at each point with command to undo) x +() to (translation), from2 (back-translation) outputted +back-translation ready at this stage +**/ + +%% star 2 + +%%ctt-orig1-orig2.txt +%%[[[en,fr],['How do you do?','How are you?']]...] 2nd item is correct back-translation with cult cust + +%%ctt-orig-tran.txt +%%[[[en,fr],['How are you?','Comment allez-vous?']]...] + +%% star 3 + +%%[['<<>>','How do you do?'],[[[BT:'How are you?',bt]...],['How are you - entered?'...]]] + +%% * enter default tmp contents v + + +%%ctt-orig1-orig2.txt +%%[[[en,fr],['How do you do?','How are you?']]...] 2nd item is correct back-translation with cult cust +%%o1 o2*** +%%ctt-orig-tran.txt +%%[[[en,fr],['How are you?','Comment allez-vous?']]...] 
+%%o2 l*** + + +%% List=[o1 o2]1, List4=[o2 l]1,String4=[o2* l]2 where Word3 is o12 + +concat_list(A1,B):- + A1=[A|List], + concat_list(A,List,B),!. + +concat_list(A,[],A):-!. +concat_list(A,List,B) :- + List=[Item|Items], + string_concat(A,Item,C), + concat_list(C,Items,B). diff --git a/culturaltranslationtool/vintage cultural translation tool/edit.pl b/culturaltranslationtool/vintage cultural translation tool/edit.pl new file mode 100644 index 0000000000000000000000000000000000000000..3fa0db58b212d93ea0f125b278040d12e406918e --- /dev/null +++ b/culturaltranslationtool/vintage cultural translation tool/edit.pl @@ -0,0 +1,95 @@ +%% Multiline file1.txt will contain list of sentences which were attemptedly translated +%% Instructions at top of file1.txt to save and quit +%% with path to load and save in + +:- include(library(edit)). +:- multifile + edit:edit_command/2. +:- multifile prolog_edit:load/0. + +%%prolog_edit:load :- +%% ensure_loaded(library(swi_edit)). + +run :- %%, + %%prolog_edit:locate("/Applications/MacVim.app/Contents/MacOS/macvim-askpass",F,_L), + %%arg(1,F,A), + %% prolog_edit:locate("file1.txt",_F2,L2), + %%set_prolog_flag(editor,'gvim --line2'), + %%prolog_edit:edit_command(gvim, '%e --line%d'), + %%atom_concat(A, ' file1.txt --line',B), + %%atom_concat(B, 2 ,C), + %%shell(C), + %%prolog_edit:edit_source(L), + edit("file1.txt"), +writeln(here). + +rungrammarly :- %%, + prolog_edit:locate("/Applications/Grammarly.app/Contents/MacOS/Grammarly",F,_L), + arg(1,F,A), + %%prolog_edit:locate("file1.txt",_F2,L2), + %%set_prolog_flag(editor,'gvim --line2'), + %%prolog_edit:edit_command(gvim, '%e --line%d'), + atom_concat(A, ' file1.txt --line',C), + %%atom_concat(B, 2 ,C), + shell(C), + %%prolog_edit:edit_source(L), + %%edit("file1.txt"), +a(a), +writeln(here),!. + +a(a). +b:-%%prolog_edit: +rungrammarly. 
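As an aside for readers skimming the Prolog below: `rungrammarly` in `edit.pl` locates the Grammarly binary, appends the file name with `atom_concat`, and runs the result with `shell/1`. The same launch step can be sketched in Python — an illustrative helper, not part of the source; passing arguments as a list sidesteps the quoting problems that concatenated command strings have with spaces in paths like `/Applications/Grammarly.app/...`:

```python
import subprocess

def open_in_editor(editor_path, file_path):
    # Launch the external editor on the file and wait for it to exit
    # (mirrors the shell(C) call that rungrammarly builds with atom_concat).
    # Using an argument list means no shell quoting is needed.
    return subprocess.run([editor_path, file_path]).returncode
```

Any editor binary can stand in for the Grammarly path assumed by the Prolog.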
+ + +open_grammar_editor(Word,FromLang,ToLang,List,List4,String4,N,COO2) :- + %%downcase_atom(Word, Word2), atom_string(Word2,Word), + + prolog_edit:locate("/Applications/Grammarly.app/Contents/MacOS/Grammarly",F,_L), + %% prolog_edit:locate("/usr/bin/vi",F,_L), + arg(1,F,A), + atom_concat(A, ' ',C), + + atom_concat('tmp/ctt-orig1-orig2-tmp-',N,D), + atom_concat(D, '.txt',E), + + atom_concat(C,E,G), + atom_concat(G, ' --line',H), + + %%fulladjective(FromLang,FromLang1), + + writenotification1('<<>>',Notification1), + writenotification2('Insert "',FromLang,'" sentence with meaning of the first sentence here.',Notification2), + %%writenotification('<<>>',Notification2), + + rmoldtmpfile(E), + wrap(COO2,[],COO5), + append([[[Word],[""]]],COO5,COO4), + %%sort(COO4,COO5), + append([[["<<>>"],[""]]],COO4,COO3), + append(COO4,[[[Notification2],["Insert an alternative that might translate better here."]]],Editable1), + %%[Editable2]=Editable1, +FileContents1=[[[Notification1],[Word]],COO3,Editable1], + updatefile2(FileContents1,E), + +%% star 3 + backtranslateuntilcorrect(List,List4,Word,[],FromLang,ToLang,E,H,Notification1,Notification2,String4), %% check against List + + atom_concat('mv ',E,G1), + atom_concat(G1,' ',G2), + atom_concat(G2,'files/ctt-orig1-orig2-tmp-',G3), + atom_concat(G3,N,G4), + atom_concat(G4, '.txt',G5), + + (bash_command(G5,_)-> + true; + (writeln("Failed moving tmp file to ctt folder."),abort) + ). %% open editor + +rmoldtmpfile(E) :- + atom_concat('rm -f ',E,G1), + + (bash_command(G1,_)-> + true; + (writeln("Failed removing tmp file."),abort) + ). 
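`open_grammar_editor` names its temporary file `tmp/ctt-orig1-orig2-tmp-N.txt` and later archives it into `files/` with a shelled-out `mv`; `rmoldtmpfile` does the matching `rm -f`. A minimal Python sketch of that lifecycle (hypothetical helper names; `shutil.move` replaces the `mv` and avoids building shell strings):

```python
import os
import shutil

def tmp_name(n, base="tmp"):
    # Mirrors atom_concat('tmp/ctt-orig1-orig2-tmp-', N, ...) in the Prolog.
    return os.path.join(base, "ctt-orig1-orig2-tmp-%s.txt" % n)

def archive_tmp(n, base="tmp", dest="files"):
    # Equivalent of the 'mv tmp/... files/...' bash_command call, without a shell.
    src = tmp_name(n, base)
    dst = os.path.join(dest, os.path.basename(src))
    shutil.move(src, dst)
    return dst
```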
%% open editor diff --git a/culturaltranslationtool/vintage cultural translation tool/files/ctt-input.txt b/culturaltranslationtool/vintage cultural translation tool/files/ctt-input.txt new file mode 100644 index 0000000000000000000000000000000000000000..ac950c55e1ffb69da197af046467845bc7482d4e --- /dev/null +++ b/culturaltranslationtool/vintage cultural translation tool/files/ctt-input.txt @@ -0,0 +1 @@ +[[en],[gd],[The second variable is equal to the first variable plus one.]] \ No newline at end of file diff --git a/culturaltranslationtool/vintage cultural translation tool/files/ctt-orig-tran.txt b/culturaltranslationtool/vintage cultural translation tool/files/ctt-orig-tran.txt new file mode 100644 index 0000000000000000000000000000000000000000..3f74f3a5c1b062cb739c242f45970978b2287d31 --- /dev/null +++ b/culturaltranslationtool/vintage cultural translation tool/files/ctt-orig-tran.txt @@ -0,0 +1 @@ +[[[en],[de],[[I adore you.],[Ich verehre dich.]]],[[en],[de],[[The third variable is equal to the second variable plus one.],[Die dritte Variable ist gleich der zweiten Variablen plus eins.]]],[[en],[es],[[I adore you.],[Te adoro.]]],[[en],[es],[[The second variable is equal to the first variable with one added.],[La segunda variable es igual a la primera variable con una añadida.]]],[[en],[es],[[The third variable is equal to the second variable plus one.],[La tercera variable es igual a la segunda variable más uno.]]],[[en],[fr],[[I love you.],[Je vous adore.]]],[[en],[fr],[[The second variable is equal to the first variable plus one.],[La deuxième variable est égale à la première variable plus un.]]],[[en],[gd],[[The second variable equals the first variable and the number one.],[Tha an dàrna caochladair co-ionann ris a ’chiad chaochlaideach agus an àireamh a h-aon.]]]] \ No newline at end of file diff --git a/culturaltranslationtool/vintage cultural translation tool/files/ctt-orig1-orig2.txt b/culturaltranslationtool/vintage cultural translation 
tool/files/ctt-orig1-orig2.txt new file mode 100644 index 0000000000000000000000000000000000000000..1557cb9a3dcb30657a78036bfa9107b9df2881d8 --- /dev/null +++ b/culturaltranslationtool/vintage cultural translation tool/files/ctt-orig1-orig2.txt @@ -0,0 +1 @@ +[[[en],[de],[[I adore you.],[]]],[[en],[de],[[The third variable is equal to the second variable with one added.],[The third variable is equal to the second variable plus one.]]],[[en],[es],[[I adore you.],[]]],[[en],[es],[[The second variable is equal to the first variable with one added.],[]]],[[en],[es],[[The third variable is equal to the second variable with one added.],[The third variable is equal to the second variable plus one.]]],[[en],[fr],[[I adore you.],[I love you.]]],[[en],[fr],[[The second variable is equal to the first variable with one added.],[The second variable is equal to the first variable plus one.]]],[[en],[gd],[[The second variable is equal to the first variable plus one.],[The second variable equals the first variable and the number one.]]],[[en],[ja],[[I adore you.],[]]],[[en],[ja],[[The third variable is equal to the second variable with one added.],[The third variable is equal to the second variable plus one.]]]] \ No newline at end of file diff --git a/culturaltranslationtool/vintage cultural translation tool/files/ctt-output.txt b/culturaltranslationtool/vintage cultural translation tool/files/ctt-output.txt new file mode 100644 index 0000000000000000000000000000000000000000..72c09a8a03848b3111e4cf6abbbe4e58f890ddb5 --- /dev/null +++ b/culturaltranslationtool/vintage cultural translation tool/files/ctt-output.txt @@ -0,0 +1 @@ +Tha an dàrna caochladair co-ionann ris a ’chiad chaochlaideach agus an àireamh a h-aon. 
\ No newline at end of file diff --git a/culturaltranslationtool/vintage cultural translation tool/tmp/stub.txt b/culturaltranslationtool/vintage cultural translation tool/tmp/stub.txt new file mode 100644 index 0000000000000000000000000000000000000000..8f0f39a40ac50bee5d3fe06ecc097fda163ee113 --- /dev/null +++ b/culturaltranslationtool/vintage cultural translation tool/tmp/stub.txt @@ -0,0 +1 @@ +stub - needs tmp folder \ No newline at end of file diff --git a/culturaltranslationtool/vintage cultural translation tool/walkthrough.txt b/culturaltranslationtool/vintage cultural translation tool/walkthrough.txt new file mode 100644 index 0000000000000000000000000000000000000000..c1ef29d8962d54b9dcaf91bea58085123249e6d9 --- /dev/null +++ b/culturaltranslationtool/vintage cultural translation tool/walkthrough.txt @@ -0,0 +1,41 @@ +?- ctt. +Number of sentences to translate in files/ctt-input.txt: 1 +Number of unique sentences to translate in files/ctt-input.txt: 1 +Number of back-translation pairs in lang1->lang2: 0 +Number of translation pairs in lang1->lang2: 0 +Number of back-translations remaining to define: 1 +Number of undefined back-translations: 0 +Number of orphaned translations: 0 + +Which of the following sentences: +- are grammatical +- have the same meaning as the other in the pair, and +- have the same meaning as the original sentence? + +Original: The second variable is equal to the first variable plus one. + +1 The second variable is equal to the first variable plus one. + The second variable equals the first variable and one variable. + +Enter 1-1, or 0 for none: +|: 0 +<<>> +Opening editor. 
+
+
+[[[<<>>],[The second variable is equal to the first variable plus one.]],[[[<<>>],[]],[[The second variable is equal to the first variable plus one.],[]]],[[[The second variable is equal to the first variable plus one.],[The second variable is equal to the first variable plus the number one.]]]]
+
+
+Which of the following sentences:
+- are grammatical
+- have the same meaning as the other in the pair, and
+- have the same meaning as the original sentence?
+
+Original: The second variable is equal to the first variable plus one.
+
+1	The second variable is equal to the first variable plus the number one.
+	The second variable equals the first variable and the number one.
+
+Enter 1-1, or 0 for none:
+|: 1
+true.
\ No newline at end of file
diff --git a/culturaltranslationtool/vintage_ctt2/.DS_Store b/culturaltranslationtool/vintage_ctt2/.DS_Store
new file mode 100644
index 0000000000000000000000000000000000000000..d92f687bc5ee2522fa2eef5d8e2c9400d5485d51
Binary files /dev/null and b/culturaltranslationtool/vintage_ctt2/.DS_Store differ
diff --git a/culturaltranslationtool/vintage_ctt2/README.md b/culturaltranslationtool/vintage_ctt2/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..17479fcc079e3994ef8516405c137d8f46308ca9
--- /dev/null
+++ b/culturaltranslationtool/vintage_ctt2/README.md
@@ -0,0 +1,59 @@
+# culturaltranslationtool
+Cultural Translation Tool (CTT)
+
+I was inspired to write CTT when I worked out how to back-translate in Homework Help Club to reach a "more" correct translation. CTT gives better translations (and can be modified to give back-translations, i.e. the translation back into the original language). This tool can help businesses expand into other languages. Requires Translate Shell. If you back-translate and think a sentence either isn't grammatical or doesn't have the same meaning as the original, you can keep trying new variants, and CTT saves the correct one.
+
+Sentences are back-translated and saved in ctt-output.txt with the spaces and returns from the original.
+
+NB. - Because of a limitation in Google Translate, CTT improves translations rather than perfecting them. E.g. "I measured the dimensions of the object" is translated as "I measured the dimensions of the partner". Additional verification is required.
+- Sentences in the given texts should not contain a full stop except at the end, but you can put full stops back in afterwards.
+- Please remember that Google Translate translates by translating from English first.
+
+# Prerequisites
+
+* Please download and install SWI-Prolog for your machine at `https://www.swi-prolog.org/build/`.
+
+* You may need to install gawk using Homebrew.
+
+* Install Translate Shell on Mac, etc.
+Change the line in
+```
+culturaltranslationtool/ctt2.pl
+trans_location("../../../gawk/trans").
+```
+to the correct location of trans.
+
+# 1. Install manually
+
+Download this repository and List Prolog Interpreter.
+
+# 2. Or Install from List Prolog Package Manager (LPPM)
+
+* Download the LPPM Repository:
+
+```
+mkdir GitHub
+cd GitHub/
+git clone https://github.com/luciangreen/List-Prolog-Package-Manager.git
+cd List-Prolog-Package-Manager
+swipl
+['lppm'].
+lppm_install("luciangreen","culturaltranslationtool").
+halt
+```
+
+# Running
+
+* In Shell:
+`cd culturaltranslationtool`
+`swipl`
+
+Instructions
+
+Enter the from and to language codes and the sentences to be translated in `ctt-input.txt` before running, e.g.:
+`[["en"],["fr"],["The second variable is equal to the first variable with one added. I adore you."]]`
+
+Load in SWI-Prolog using `['../listprologinterpreter/listprolog.pl'].`, then run using `ctt2.`.
+
+Follow the prompts asking for modifications to a possible back-translation, and whether a back-translated sentence that differs from the original is grammatical and has the same meaning as the original.
You can save time using CTT instead of Google Translate because CTT uses translations with identical back translations without asking. Web site and document translations can be automated, using the saved back translations, where I recommend entering short sentences that are more likely to translate correctly and which you can reuse in other documents. You can save the `ctt-orig1-orig2.txt` (original language from input to original language with the correct back translation) and `ctt-orig-tran.txt` (back translation to translation) files separately for a particular document for fast translation into multiple languages. + diff --git a/culturaltranslationtool/vintage_ctt2/ctt2.pl b/culturaltranslationtool/vintage_ctt2/ctt2.pl new file mode 100644 index 0000000000000000000000000000000000000000..21acc19e7c180f61ca1f5b8e209ba59d20bdd789 --- /dev/null +++ b/culturaltranslationtool/vintage_ctt2/ctt2.pl @@ -0,0 +1,312 @@ +% ctt2.pl + +/* + + ctt-input.txt +["en","fr","Helloa."] + + ctt-orig1-orig2.txt +[["en","fr","Helloa.","Hellow."]] + + ctt-orig-tran.txt +[["en","fr","Hellow.","Bonjour."]] + + ctt-output.txt +["en","fr","Bonjour. "] + +*/ + +trans_location("../../../gawk/trans"). + +ctt2 :- + get_files(Ctt_input,Ctt_orig1_orig2,Ctt_orig_tran,From_lang,To_lang), + back_translate(Ctt_input,Ctt_orig1_orig2,Ctt_orig_tran,From_lang,To_lang,"",Translation,[],Ctt_orig1_orig2_1,[],Ctt_orig_tran_1), + +save_file("files/ctt-output.txt",[From_lang,To_lang,Translation]), +save_file("files/ctt-orig1-orig2.txt",Ctt_orig1_orig2_1), +save_file("files/ctt-orig-tran.txt",Ctt_orig_tran_1), +!. + +save_file(File_path,File) :- + + term_to_atom(File,String02a_b), + string_atom(String02a_c,String02a_b), + + (open_s(File_path,write,Stream1), + write(Stream1,String02a_c), + close(Stream1)),!. 
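`save_file` and the matching `get_file` (further down) persist whole Prolog terms as text, using `term_to_atom/2` on write and `atom_to_term/3` on read. A rough Python analogue of the same round trip, assuming the data is plain lists and strings (`repr` standing in for `term_to_atom`, `ast.literal_eval` for `atom_to_term`):

```python
import ast

def save_file(path, term):
    # Serialise the whole structure as one literal, like term_to_atom + write.
    with open(path, "w") as f:
        f.write(repr(term))

def get_file(path):
    # Parse the literal back into the same structure, like atom_to_term.
    with open(path) as f:
        return ast.literal_eval(f.read())
```

The point of the design in both languages is that the dictionary files (`ctt-orig1-orig2.txt`, `ctt-orig-tran.txt`) survive a restart unchanged.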
+ + +get_files(Item4,Ctt_orig1_orig2,Ctt_orig_tran,From_lang,To_lang) :- + get_file("files/ctt-input.txt",Ctt_input), + Ctt_input=[From_lang,To_lang,String1], + %split_string2(String1,["\n","\r"],List1), + string_codes(String1,Codes1), + split_on_substring117(Codes1,`\n\r`,[],List1), + %trace, + findall(Item2,(member(Item1,List1), + ((Item1="\n"->true;Item1="\r")->Item2=[Item1];( +%trace, + string_codes(Item1,Codes2), + split_on_substring117(Codes2,`.!?`,[],Item11), + +%split_string2(Item1,[".","!","?"],Item11), +join_chars_after(Item11,[".","!","?"],[],Item2)) +)),Item3), + maplist(append,[Item3],[Item4]), + findall(Item5,(member(Item5,Item4),not(((Item5="\n"->true;Item5="\r")))),Item6), + length(Item6,Length0), + write("Number of sentences to translate in files/ctt-input.txt: "), writeln(Length0), + + sort(Item6,CttInput2), + length(CttInput2,Length1),write("Number of unique sentences to translate in files/ctt-input.txt: "), writeln(Length1), + + get_file("files/ctt-orig1-orig2.txt",Ctt_orig1_orig2), + findall([A,B],member([From_lang,To_lang,A,B],Ctt_orig1_orig2),Ctt_orig1_orig22), + + length(Ctt_orig1_orig22,Length2), + + write("Number of back-translation pairs in lang1->lang2: "), writeln(Length2), + + get_file("files/ctt-orig-tran.txt",Ctt_orig_tran), + + findall([A,B],member([From_lang,To_lang,A,B],Ctt_orig_tran),Ctt_orig_tran2), + + length(Ctt_orig_tran2,Length3), + + write("Number of translation pairs in lang1->lang2: "), writeln(Length3), + + findall(Item9,member([From_lang,To_lang,Item9,_],Ctt_orig1_orig22),Item10), + +findall(Item7,member([From_lang,To_lang,_,Item7],Ctt_orig1_orig22),Item8), + +findall(Item11,member([From_lang,_,Item11],Ctt_orig_tran2),Item12), + + subtract(Item6,Item10,D1), %% + length(D1,Length),Difference is abs(Length),write("Number of back-translations remaining to define: "), writeln(Difference), + + %%towords2(CttInput1,[],CttInput2), + subtract(Item8,Item12,D2), %% Should AUN be appended to TF, " x + %%delete(D21,'',D2), + 
length(D2,Length01t),Differencet is abs(Length01t),write("Number of undefined back-translations: "), writeln(Differencet), + %%writeln([undefined,D2]), %% Print undefined + + %%delete(D31,'',D3), + subtract(Item12,Item8,D3), + length(D3,Length01t2),Differencet2 is abs(Length01t2),write("Number of orphaned translations: "), writeln(Differencet2),!. + + +get_file(File_path,File) :- + phrase_from_file_s(string(String00a),File_path), + string_codes(String02b,String00a), + atom_to_term(String02b,File,[]),!. + +bash_command(Command, Output) :- + setup_call_cleanup(process_create(path(bash), + ['-c', Command], + [stdout(pipe(Out))]), + read_string(Out, _, Output), + close(Out)). + +/* + +translate_ctt2("Hello.","en","fr","Bonjoura."). +translate_ctt2("Bonjoura.","fr","en","Hellok."). + +translate_ctt2("Helloa.","en","fr","Bonjour."). +translate_ctt2("Bonjour.","fr","en","Helloaa."). + +translate_ctt2("Helloc.","en","fr","Hellod."). +translate_ctt2("Hellod.","fr","en","Helloca."). + +translate_ctt2("Hellocc.","en","fr","Hellodc."). +translate_ctt2("Hellodc.","fr","en","Hellocca."). + +*/ + +translate_ctt2(Input,FromLang,ToLang,Output3) :- + trans_location(Trans_location), + insertdoublebackslashbeforequote(Input,Input1), + concat_list([Trans_location," ",FromLang,":",ToLang," \"",Input1,"\""],F), + +repeat, + +catch(call_with_time_limit(5, +catch( + (bash_command(F,Output1)), + _, + (writeln("Translate failed. Press c to retry."), + read_string(user_input, "\n", "\r", _,C), + C="c"->fail;abort) +) +), + time_limit_exceeded, + (writeln1("Error: translate timed out."),abort)), + + split_string(Output1,"\033","\033",Output2), + Output2=[_,Output3a|_], + atom_string(Output3a,Output3b), + string_concat("[1m",Output3,Output3b), +!. 
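`translate_ctt2` recovers the translated sentence from Translate Shell's coloured terminal output by splitting on the ESC character and stripping the leading `[1m` bold code from the second chunk. A sketch of just that extraction step, mirrored from the `split_string`/`string_concat` calls above (the escape layout is an assumption about `trans`'s default output):

```python
ESC = "\033"

def extract_translation(raw):
    # Split on ESC; the second chunk starts with the '[1m' bold code,
    # followed by the translated sentence (mirrors Output2=[_,Output3a|_]
    # and string_concat("[1m", Output3, Output3b) in the Prolog).
    chunks = raw.split(ESC)
    if len(chunks) < 2 or not chunks[1].startswith("[1m"):
        raise ValueError("unexpected trans output")
    return chunks[1][len("[1m"):]
```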
+ +back_translate([],Ctt_orig1_orig2,Ctt_orig_tran,_From_lang,_To_lang,Translation,Translation,Ctt_orig1_orig2_1,Ctt_orig1_orig2_2,Ctt_orig_tran_1,Ctt_orig_tran_2) :- + append(Ctt_orig1_orig2,Ctt_orig1_orig2_1,Ctt_orig1_orig2_2), + append(Ctt_orig_tran,Ctt_orig_tran_1,Ctt_orig_tran_2),!. + + back_translate(Ctt_input,Ctt_orig1_orig2,Ctt_orig_tran,From_lang,To_lang,Translation1,Translation2,Ctt_orig1_orig2_1,Ctt_orig1_orig2_2,Ctt_orig_tran_1,Ctt_orig_tran_2) :- + + Ctt_input=[Ctt_input1a|Ctt_input2], + remove_spaces_from_start(Ctt_input1a,"",Ctt_input1), + + back_translate2(Ctt_input1,Ctt_input1,Ctt_orig1_orig2,Ctt_orig_tran,From_lang,To_lang,Translation3,Ctt_orig1_orig2_3,Ctt_orig_tran_3), + + concat_list([Translation1,Translation3," "],Translation4), + back_translate(Ctt_input2,Ctt_orig1_orig2_3,Ctt_orig_tran_3,From_lang,To_lang,Translation4,Translation2,Ctt_orig1_orig2_1,Ctt_orig1_orig2_2,Ctt_orig_tran_1,Ctt_orig_tran_2),!. + + + + %*Ctt_orig1_orig2_1=Ctt_orig1_orig2_3,Ctt_orig_tran_1,Ctt_orig_tran_3 +%in neither dict, add to both + %)). + %update dicts in other cases + + % add space afterwards + +remove_spaces_from_start("",B,B) :- !. +remove_spaces_from_start(A,_B,A) :- + not(string_concat(" ",_C,A)), + !. +remove_spaces_from_start(A,B,D) :- + string_concat(" ",C,A), + remove_spaces_from_start(C,B,D),!. + + %back_translate2(_Ctt_input0,_Ctt_input1,Ctt_orig1_orig2_1,Ctt_orig_tran_1,_From_lang,_To_lang,Translation3,Translation3,Ctt_orig1_orig2_1,Ctt_orig_tran_1) :- !. 
+ back_translate2(Ctt_input0,Ctt_input1a,Ctt_orig1_orig2,Ctt_orig_tran,From_lang,To_lang,Translation3,Ctt_orig1_orig2_1,Ctt_orig_tran_1) :- + %trace, + + remove_spaces_from_start(Ctt_input1a,"",Ctt_input1), + + + (((Ctt_input1="\n"->true;(Ctt_input1="\r"->true;Ctt_input1="")), + Translation3=Ctt_input1, + %(string_concat(Translation1,Ctt_input1,Translation3), + Ctt_orig1_orig2=Ctt_orig1_orig2_1,Ctt_orig_tran=Ctt_orig_tran_1)->true; + + + ((member([From_lang,To_lang,Ctt_input1,Ctt_orig1_orig2_11],Ctt_orig1_orig2), + member([From_lang,To_lang,Ctt_orig1_orig2_11,Translation3],Ctt_orig_tran), + %Ctt_orig1_orig2=Ctt_orig1_orig2_1,Ctt_orig_tran=Ctt_orig_tran_1 + + append_if_needed(Ctt_orig1_orig2,[[From_lang,To_lang,Ctt_input0,%*** not earlier % here too + Ctt_orig1_orig2_11]],Ctt_orig1_orig2_1), + append_if_needed(Ctt_orig_tran,[[From_lang,To_lang,Ctt_orig1_orig2_11, % here too + Translation3]],Ctt_orig_tran_1) + +)->true; + + + ((member([From_lang,To_lang,Ctt_input1,Translation3],Ctt_orig_tran),Ctt_orig1_orig2=Ctt_orig1_orig2_1,Ctt_orig_tran=Ctt_orig_tran_1)->true; + + ((member([From_lang,To_lang,Ctt_input1,Ctt_orig1_orig2_11],Ctt_orig1_orig2), + not(member([From_lang,To_lang,Ctt_orig1_orig2_11,Translation3],Ctt_orig_tran)), + + %translate_ctt2(Ctt_orig1_orig2_11,% or ctinput + %From_lang,To_lang,Translation3), + %translate_ctt2(Translation3,% or ctinput + %To_lang,From_lang,Ctt_orig1_orig2_11), + %trace, + ((back_translate_and_check(Ctt_input0,Ctt_orig1_orig2_11,% or ctinput +From_lang,To_lang,Translation3), + + append_if_needed(Ctt_orig_tran,[[From_lang,To_lang,Ctt_orig1_orig2_11, % here too + Translation3]],Ctt_orig_tran_1),%trace, + Ctt_orig1_orig2=Ctt_orig1_orig2_1)->true; + + fail + %check_similar_sentences(Ctt_input0,Ctt_orig1_orig2,Ctt_orig_tran,From_lang,To_lang,Translation3,Ctt_orig1_orig2_1,Ctt_orig_tran_1) +) + + )->true; + + + ((not(member([From_lang,To_lang,Ctt_input1,Translation3],Ctt_orig_tran)), + + 
((not(member([From_lang,To_lang,Ctt_input1,Ctt_orig1_orig2_1],Ctt_orig1_orig2)), + + (( + %translate_ctt2(Ctt_input1,% or ctinput + %From_lang,To_lang,Translation3), + %translate_ctt2(Translation3,% or ctinput + %To_lang,From_lang,Ctt_input1), + + back_translate_and_check(Ctt_input0,Ctt_input1,% or ctinput +From_lang,To_lang,Translation3), + + append_if_needed(Ctt_orig_tran,[[From_lang,To_lang,Ctt_input1, % here too + Translation3]],Ctt_orig_tran_1), + Ctt_orig1_orig2=Ctt_orig1_orig2_1)->true; + check_similar_sentences(Ctt_input0,Ctt_orig1_orig2,Ctt_orig_tran,From_lang,To_lang,Translation3,Ctt_orig1_orig2_1,Ctt_orig_tran_1) +))))))))), + +!. + +check_similar_sentences(Ctt_input0,Ctt_orig1_orig2,Ctt_orig_tran,From_lang,To_lang,Translation3,Ctt_orig1_orig2_1,Ctt_orig_tran_1) :- + concat_list(["\n","Please enter a sentence with the same meaning as \"",Ctt_input0,"\", which can be more easily translated:"],Note1), + writeln(Note1), + read_string(user_input,"\n\r","\n\r",_,Ctt_input1b), + + %trace, + back_translate2(Ctt_input0,Ctt_input1b,Ctt_orig1_orig2,Ctt_orig_tran,From_lang,To_lang,Translation3,_Ctt_orig1_orig2_2,Ctt_orig_tran_2), + + %Ctt_orig1_orig2_2=Ctt_orig1_orig2_1, + Ctt_orig1_orig2_1=[[From_lang,To_lang,Ctt_input0,Ctt_input1b]], + Ctt_orig_tran_2=Ctt_orig_tran_1, + %append_if_needed(Ctt_orig1_orig2_2,[[From_lang,To_lang,Ctt_input0,%*** not earlier % here too + % Ctt_input1b]],Ctt_orig1_orig2_1), + + + %append_if_needed(Ctt_orig_tran_2,[[From_lang,To_lang,Ctt_input1b, % here too + %Translation3]],Ctt_orig_tran_1), + !. 
+ + +back_translate_and_check(Ctt_input0,Ctt_orig1_orig2_11,% or ctinput +From_lang,To_lang,Translation3) :- + translate_ctt2(Ctt_orig1_orig2_11,% or ctinput + From_lang,To_lang,Translation31), + translate_ctt2(Translation31,% or ctinput + To_lang,From_lang,Ctt_orig1_orig2_12), + ((Ctt_orig1_orig2_11=Ctt_orig1_orig2_12)-> + Translation3=Translation31; + (concat_list(["\n","Are the following sentences:\n- grammatical\n- have the same meaning as the other in the pair, and\n- have the same meaning as the original sentence (y/n)?\n\n", + "Original:","\t",Ctt_input0,"\n\n", + "\t\t",Ctt_orig1_orig2_11,"\n\t\t",Ctt_orig1_orig2_12],Note1), + writeln(Note1), + read_string(user_input,"\n\r","\n\r",_,YN), + %repeat, + %trace, + (YN="n"->fail;Translation3=Translation31))),!. + + + +insertdoublebackslashbeforequote(Input1,Input) :- + string_codes(Input1,Input2), + insertdoublebackslashbeforequote1(Input2,[],Input3), + string_codes(Input,Input3). +insertdoublebackslashbeforequote1([],Input,Input) :- !. +insertdoublebackslashbeforequote1(Input1,Input2,Input3) :- + Input1=[Input4|Input5], + Input4=39, %% quote + append(Input2,[92,39],Input6), + insertdoublebackslashbeforequote1(Input5,Input6,Input3), !. +insertdoublebackslashbeforequote1(Input1,Input2,Input3) :- + Input1=[Input4|Input5], + not(Input4=39), %% quote + append(Input2,[Input4],Input6), + insertdoublebackslashbeforequote1(Input5,Input6,Input3), !. + +append_if_needed(A,[B],C) :- +%trace, + (member(B,A)->C=A; + append(A,[B],C))%,notrace + ,!. 
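Stripped of the nested if-then-elses, `back_translate2` plus `back_translate_and_check` amount to: reuse a cached translation pair if one exists; otherwise translate, translate back, and accept automatically only when the round trip reproduces the input, falling back to a user confirmation. A simplified Python sketch with a pluggable `translate` callback (the names and the `en`/`fr` defaults are illustrative, not from the source):

```python
def back_translate(sentence, cache, translate, src="en", dst="fr",
                   confirm=lambda orig, round_trip: False):
    # cache: dict of source sentence -> saved translation (ctt-orig-tran role).
    # translate(text, from_lang, to_lang): machine-translation callback.
    # confirm(orig, round_trip): user check when the round trip differs.
    if sentence in cache:                 # cached pair: translate without asking
        return cache[sentence]
    translation = translate(sentence, src, dst)
    round_trip = translate(translation, dst, src)
    if round_trip == sentence or confirm(sentence, round_trip):
        cache[sentence] = translation     # save, so it is never asked again
        return translation
    raise ValueError("no acceptable translation; try rewording the sentence")
```

This is the property the README advertises: sentences whose back-translation is identical (or was accepted once) never trigger a prompt again.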
\ No newline at end of file diff --git a/culturaltranslationtool/vintage_ctt2/files/.DS_Store b/culturaltranslationtool/vintage_ctt2/files/.DS_Store new file mode 100644 index 0000000000000000000000000000000000000000..13663bec4d66b095d9586c5ff2ee714dc9ffec14 Binary files /dev/null and b/culturaltranslationtool/vintage_ctt2/files/.DS_Store differ diff --git a/culturaltranslationtool/vintage_ctt2/files/ctt-input.txt b/culturaltranslationtool/vintage_ctt2/files/ctt-input.txt new file mode 100644 index 0000000000000000000000000000000000000000..0b6986e7ee5dbef7d343d30b6cd9c059f43fa05a --- /dev/null +++ b/culturaltranslationtool/vintage_ctt2/files/ctt-input.txt @@ -0,0 +1 @@ +["en","fr","I earned the high distinction."] \ No newline at end of file diff --git a/culturaltranslationtool/vintage_ctt2/files/ctt-orig-tran.txt b/culturaltranslationtool/vintage_ctt2/files/ctt-orig-tran.txt new file mode 100644 index 0000000000000000000000000000000000000000..337b3d981907f334470b9b3db23ac7aa624869ee --- /dev/null +++ b/culturaltranslationtool/vintage_ctt2/files/ctt-orig-tran.txt @@ -0,0 +1 @@ +[["en","fr","I got H1.","J'ai H1."]] \ No newline at end of file diff --git a/culturaltranslationtool/vintage_ctt2/files/ctt-orig1-orig2.txt b/culturaltranslationtool/vintage_ctt2/files/ctt-orig1-orig2.txt new file mode 100644 index 0000000000000000000000000000000000000000..edbc039b4cc35f7e3e1566f86a6e8adfc5895b6b --- /dev/null +++ b/culturaltranslationtool/vintage_ctt2/files/ctt-orig1-orig2.txt @@ -0,0 +1 @@ +[["en","fr","I earned the high distinction.","I got H1."]] \ No newline at end of file diff --git a/culturaltranslationtool/vintage_ctt2/files/ctt-output.txt b/culturaltranslationtool/vintage_ctt2/files/ctt-output.txt new file mode 100644 index 0000000000000000000000000000000000000000..e62f38193c52a73197e184be2f15526712ed6372 --- /dev/null +++ b/culturaltranslationtool/vintage_ctt2/files/ctt-output.txt @@ -0,0 +1 @@ +["en","fr","J'ai H1. 
"] \ No newline at end of file diff --git a/databaseformulafinder/README.md b/databaseformulafinder/README.md new file mode 100644 index 0000000000000000000000000000000000000000..584a069e40d84441a9f296792a31c285a9f54c39 --- /dev/null +++ b/databaseformulafinder/README.md @@ -0,0 +1,81 @@ +# Database Formula Finder + +* Finds database formulas in terms of "and" and "or" from data. + +# Prerequisites + +* Please download and install SWI-Prolog for your machine at `https://www.swi-prolog.org/build/`. + +# 1. Install manually + +Download this repository. + +# 2. Or Install from List Prolog Package Manager (LPPM) + +* Download the LPPM Repository: + +``` +mkdir GitHub +cd GitHub/ +git clone https://github.com/luciangreen/List-Prolog-Package-Manager.git +cd List-Prolog-Package-Manager +swipl +['lppm']. +lppm_install("luciangreen","databaseformulafinder"). +halt +``` + +# Running + +* In Shell: +`cd databaseformulafinder` +`swipl` +`['databaseformulafinder.pl'].` + +``` +dbff(Columns,Result,Formula). +Columns - Column names and data +Result - The result to work the formula out from +Formula - The worked-out formula + +?- dbff([[a,1,2,3]],[1,2,3],F). +F = [a] + +?- dbff([[a, 1], [b, 1, 2]], [1],F). +F = [a, and, b] +F = [b, and, a] + +?- dbff([[a, 1], [b, 1, 2],[c,3]], [1,2,3],F). +F = [[b, or, a], or, c] +F = [b, or, [c, or, a]] + +?- dbff([[a, 1], [b, 1, 2],[c,1,3]], [1],F). 
+F = [[a, and, c], and, b] +F = [[a, or, c], and, b] +F = [a, and, [b, and, c]] +F = [a, and, [b, or, c]] +F = [[a, and, b], and, c] +F = [[a, or, b], and, c] +F = [a, and, [c, and, b]] +F = [a, and, [c, or, b]] +F = [[b, and, c], and, a] +F = [[b, or, c], and, a] +F = [b, and, [a, and, c]] +F = [b, and, [a, or, c]] +F = [[b, and, a], and, c] +F = [[b, or, a], and, c] +F = [b, and, [c, and, a]] +F = [b, and, [c, or, a]] +F = [[c, and, b], and, a] +F = [[c, or, b], and, a] +F = [c, and, [a, and, b]] +F = [c, and, [a, or, b]] +F = [[c, and, a], and, b] +F = [[c, or, a], and, b] +F = [c, and, [b, and, a]] +F = [c, and, [b, or, a]] + +?- dbff([[a, 1], [b, 1, 2],[c,1,3]], [1,3],F). +F = [[a, and, b], or, c] +F = [c, or, [a, and, b]] +``` diff --git a/databaseformulafinder/databaseformulafinder.pl b/databaseformulafinder/databaseformulafinder.pl new file mode 100644 index 0000000000000000000000000000000000000000..38ec170fa27c84085092f1cfb88afd5d2f753598 --- /dev/null +++ b/databaseformulafinder/databaseformulafinder.pl @@ -0,0 +1,67 @@ +dbff(Columns1,Result,Formula1) :- + dbff1(Columns1,Columns1,Result,Formula1). +dbff1(Columns1,Columns2,Result,Formula1) :- + member(Column,Columns1), + Column=[Name|_Rest], + delete(Columns2,Column,Columns3), + Formula2=Name, + dbff2(Columns1,Columns3,Result,Formula2,Formula3), + list(Formula3,Formula1). +dbff2(_Columns1,[],_Result,Formula,Formula). +dbff2(Columns1,Columns2,Result,Formula1,Formula2) :- + member(Column,Columns2), + Column=[Name|_Rest], + delete(Columns2,Column,Columns3), + appendlogic(Formula1,Name,Formula3), + dbff3(Columns1,Result,Formula3), + dbff2(Columns1,Columns3,Result,Formula3,Formula2). +dbff3(Columns1,Result1,[Formula1,and,Formula2]) :- + dbff3(Columns1,Result2,Formula1), + dbff3(Columns1,Result3,Formula2), + intersection1(Result2,Result3,[],Result1). +dbff3(Columns1,Result1,[Formula1,or,Formula2]) :- + dbff3(Columns1,Result2,Formula1), + dbff3(Columns1,Result3,Formula2), + union(Result2,Result3,Result1). 
+dbff3(Columns,Result1,Formula1) :- + member([Formula1|Result1],Columns). +appendlogic([Formula1,Operator,Formula2],Name2,Formula3) :- + appendlogic(Formula1,Name2,Formula4), + Formula3=[Formula4,Operator,Formula2]. +appendlogic([Formula1,Operator,Formula2],Name2,Formula3) :- + appendlogic(Formula2,Name2,Formula4), + Formula3=[Formula1,Operator,Formula4]. +appendlogic(Formula1,Name,Formula2) :- + atom(Formula1),append([Formula1],[and,Name],Formula2). +appendlogic(Formula1,Name,Formula2) :- + atom(Formula1), append([Formula1],[or,Name],Formula2). +unique1(_Remainders,[],UniqueRemainders,UniqueRemainders) :- !. +unique1(Remainders1,Remainder,UniqueRemainders1,UniqueRemainders2) :- + delete(Remainders1,Remainder,Remainders2), + append([Remainder],Remainders2,Remainders3), + unique2(Remainders3,Remainders4,Remainders5,UniqueRemainders1,UniqueRemainders3), + unique1(Remainders5,Remainders4,UniqueRemainders3,UniqueRemainders2). +unique2(Remainders1,Remainder1,Remainders2,UniqueRemainders1,UniqueRemainders2) :- + Remainders1=[Remainder2,Remainder1|Remainders2], + append(UniqueRemainders1,[Remainder2],UniqueRemainders2). +unique2(Remainders1,_Remainder1,Remainder2,UniqueRemainders1,UniqueRemainders2) :- + Remainders1=[Remainder2], + append(UniqueRemainders1,[Remainder2],UniqueRemainders2). +intersection1([],_Result1,Result2,Result2). +intersection1(Result1,Result2,Result3,Result4) :- + Result1=[Result6|Results1], + intersection2(Result6,Result2,Result3,Result5), + intersection1(Results1,Result2,Result5,Result4). +intersection2(Result1,Result2,Result7,Result3) :- + member(Result1,Result2), + append(Result7,[Result1],Result3). +intersection2(Result1,Result2,Result7,Result7) :- + not(member(Result1,Result2)). +union(Result1,Result2,Result3) :- + append(Result1,Result2,Result4), + Result4=[Result5|Results], + unique1(Results,Result5,[],Result3). +list(Formula1,Formula2) :- + atom(Formula1),Formula2=[Formula1]. +list(Formula,Formula) :- + not(atom(Formula)). 
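In `dbff3`, a formula's meaning is a set of rows: a bare column name denotes that column's values, `[F1, and, F2]` evaluates through `intersection1`, and `[F1, or, F2]` through `union`/`unique1`. A simplified Python evaluator over the README's example data (the evaluator name is illustrative; de-duplication keeps first occurrences, approximating `unique1`):

```python
def evaluate(formula, columns):
    # columns: dict of column name -> list of values, e.g. {"a": [1], "b": [1, 2]}.
    if isinstance(formula, str):
        return columns[formula]
    left, op, right = formula
    l, r = evaluate(left, columns), evaluate(right, columns)
    if op == "and":                            # intersection, keeping l's order
        return [x for x in l if x in r]
    return l + [x for x in r if x not in l]    # "or": union without duplicates
```

With `{"a": [1], "b": [1, 2], "c": [1, 3]}`, the formula `[["a", "and", "b"], "or", "c"]` evaluates to `[1, 3]`, matching the README's `dbff([[a,1],[b,1,2],[c,1,3]],[1,3],F)` example; `dbff` itself searches over column combinations for formulas whose evaluation equals the requested result.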
diff --git a/gitl/.DS_Store b/gitl/.DS_Store new file mode 100644 index 0000000000000000000000000000000000000000..5008ddfcf53c02e82d7eee2e57c38e5672ef89f6 Binary files /dev/null and b/gitl/.DS_Store differ diff --git a/gitl/LICENSE b/gitl/LICENSE new file mode 100644 index 0000000000000000000000000000000000000000..43ac320c124b6bba1157a463da11bcfebfd47839 --- /dev/null +++ b/gitl/LICENSE @@ -0,0 +1,28 @@ +BSD 3-Clause License + +Copyright (c) 2023, Lucian Green + +Redistribution and use in source and binary forms, with or without +modification, are permitted provided that the following conditions are met: + +1. Redistributions of source code must retain the above copyright notice, this + list of conditions and the following disclaimer. + +2. Redistributions in binary form must reproduce the above copyright notice, + this list of conditions and the following disclaimer in the documentation + and/or other materials provided with the distribution. + +3. Neither the name of the copyright holder nor the names of its + contributors may be used to endorse or promote products derived from + this software without specific prior written permission. + +THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" +AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE +IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE +DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE +FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL +DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR +SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER +CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, +OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE +OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. 
diff --git a/gitl/README.md b/gitl/README.md new file mode 100644 index 0000000000000000000000000000000000000000..4012f1734ff38084d113793217434c42a78e7cbe --- /dev/null +++ b/gitl/README.md @@ -0,0 +1,86 @@
+# GitL
+
+* GitL is a decentralised Git that lets you commit to, and view diffs of, a version control system in your local folder.
+
+# Getting Started
+
+Please read the following instructions on how to install the project on your computer for version control.
+
+# Prerequisites
+
+* Please download and install SWI-Prolog for your machine at `https://www.swi-prolog.org/build/`.
+
+# 1. Install manually
+
+* Download:
+* this repository and its dependencies
+* List Prolog Interpreter
+* Lucian CI/CD
+
+# 2. Or Install from List Prolog Package Manager (LPPM)
+
+* Download the LPPM Repository:
+
+```
+mkdir GitHub
+cd GitHub/
+git clone https://github.com/luciangreen/List-Prolog-Package-Manager.git
+cd List-Prolog-Package-Manager
+swipl
+['lppm'].
+lppm_install("luciangreen","gitl").
+halt.
+```
+
+# Running
+
+* In Shell:
+`cd gitl`
+`swipl`
+
+* To load the algorithm, enter:
+```
+['gitl.pl'].
+```
+
+# Instructions
+
+* In the folder `gitl_test` at the same level as `gitl`, store your repositories, e.g. `b`.
+* In the folder `gitl_data` at the same level as `gitl`, GitL stores the version control system and the diffs between versions.
+
+* `commit("b","Description of changes.").` - Commits repository `b` to the version control system and creates an HTML file with the differences between versions and the description of changes.
+
+* Dev.to article about GitL
+
+# GitL Web Service
+
+* To see a list of repositories with changes and commit some of the changed ones, load with:
+```
+gitl_server(8000).
+```
+
+* Go to `http://localhost:8000/gitl`.
+
+* Remember to edit the password in `../Philosophy/web-editor-pw.pl` before running.
+
+* To view and change repositories, load with:
+```
+['web_editor_gitl_test.pl'].
+web_editor_server(8000).
+``` + +* Go to `http://localhost:8000/webeditor`. + +# Integrate with Lucian CI/CD + +* So far, to commit changes from Lucian CI/CD, run: +`scp -pr ../../GitHub2o/ ../gitl_test/` in the `gitl` folder before committing. + +# Authors + +Lucian Green - Initial programmer - Lucian Academy + +# License + +I licensed this project under the BSD3 License - see the LICENSE.md file for details + diff --git a/gitl/diff.pl b/gitl/diff.pl new file mode 100644 index 0000000000000000000000000000000000000000..0bd734b6160bdc5829b3a1402a6f7b652f63d833 --- /dev/null +++ b/gitl/diff.pl @@ -0,0 +1,112 @@ +diff_gitl(After3,HTML3) :- +diff1_gitl(After3,HTML2), + string_concat("Diff output
<br>Key<br><font color="green">Insertion</font><br><font color="red">Deletion</font><br>
",HTML2,HTML3). + +diff1_gitl(After3,HTML3) :- + %correspondences(Corr), + + findall(HTML_a,(member(Item%[[n,comment],[Item]] + ,After3), + + (Item=[*, Name, Item_a] -> + + diff2_gitl(Name,Item_a,HTML_a); + + + + ((%trace, + not(Item=[i,_]), + not(Item=[d,_])) + %string(Item) + -> + (Item=Item2,%numbers_to_term([Item],Corr,[],Item2), + Colour="white",Change=""); + ((%trace, + Item=[i,Item2])-> + (%trace, + (Item2=[]->fail;true), + %numbers_to_term([Item3],Corr,[],Item2), + Colour="green",Change="Insertion: "); + ((%trace, + Item=[d,Item2])-> + (%trace, + (Item2=[]->fail;true), + %numbers_to_term([Item3],Corr,[],Item2), + Colour="red",Change="Deletion: ") + /*; + (Item=[[c,_],Item_a,Item_b]-> + (numbers_to_term(Item_a,Corr,[],Item2a), + numbers_to_term(Item_b,Corr,[],Item2b), + %trace, + %term_to_atom(Item2a,Item2a1), + %term_to_atom(Item2b,Item2b1), + %foldr(string_concat, + term_to_atom([Item2b,' -> ',Item2a],Item2), + Colour="yellow",Change="Change: ")) + */ + ))), + + HTML_a=["
",Change,Item2,"
"] + + )),HTML), + flatten(HTML,HTML1), + foldr(string_concat,HTML1,HTML3). + + + diff2_gitl(Name,After3,HTML3) :- + %correspondences(Corr), + + findall(["
",Change,Item2,"
"],(member(Item%[[n,comment],[Item]] + ,After3), + +%trace, + ((not(Item=[i,_]), + not(Item=[d,_])) + %string(Item) + -> + (Item=Item2,%numbers_to_term([Item],Corr,[],Item2), + Colour="white",Change=""); + ((%trace, + Item=[i,Item2])-> + (%trace, + (Item2=[]->fail;true), + %numbers_to_term([Item3],Corr,[],Item2), + Colour="green",Change="Insertion: "); + ((%trace, + Item=[d,Item2])-> + (%trace, + (Item2=[]->fail;true), + %numbers_to_term([Item3],Corr,[],Item2), + Colour="red",Change="Deletion: ") + /*; + (Item=[[c,_],Item_a,Item_b]-> + (numbers_to_term(Item_a,Corr,[],Item2a), + numbers_to_term(Item_b,Corr,[],Item2b), + %trace, + %term_to_atom(Item2a,Item2a1), + %term_to_atom(Item2b,Item2b1), + %foldr(string_concat, + term_to_atom([Item2b,' -> ',Item2a],Item2), + Colour="yellow",Change="Change: ")) + */ + )))),HTML), + flatten(HTML,HTML1), + foldr(string_concat,HTML1,HTML2), + + foldr(string_concat,["File name: ",Name,"
",HTML2],HTML3). + %term_to_atom(HTML1,HTML2), + + %time1(Time), + %diff_html_n(Diff_html_n), + %(exists_directory_s("../../lc_logs/")->true;%make_directory_s("../../lc_logs/")), + +%foldr(string_concat,["../../lc_logs/diff_html",Time,"-",Diff_html_n,".html"],File1), + +/* + Diff_html_n1 is Diff_html_n+1, + retractall(diff_html_n(_)), + assertz(diff_html_n(Diff_html_n1)), +*/ + + %save_file_s(File1,HTML3). + diff --git a/gitl/find_files.pl b/gitl/find_files.pl new file mode 100644 index 0000000000000000000000000000000000000000..614471be8ef5d71095aebe6465264ff6f8c227d5 --- /dev/null +++ b/gitl/find_files.pl @@ -0,0 +1,132 @@ +find_files(A1,Mod_time_a) :- +%trace, + +working_directory1(A,A), +working_directory1(_,A), + + +%repositories_paths(K), + +%omit_paths(Omit), + +%findall(Omit1,(member(Omit2,Omit),atom_string(Omit1,Omit2)),Omit3), +/*K01=[[A1,[A1]]], +%findall([A1,G4],(%member(K1,[A]), +trace, + %directory_files("./",F), +% delete_invisibles_etc(F,G1), + +%delete(G1,"n.txt",G11), +%findall(H,(member(H,G),not(string_concat("dot",_,H)), + +%subtract(G,Omit,G1), + +%findall(G3,(member(G2,G1),string_concat(G2,"/",G3)),G4) +%not(member(G,Omit)) + +%),K01), +%trace, +%foldr(append,K0,K01), + +working_directory1(Old_D,Old_D), + +findall(Tests1,(member([D,K31],K01), + +working_directory1(_,Old_D), + +working_directory1(_,D), +*/ +%member(K2,K31), + +%exists_directory(K2), +/* +writeln1(process_directory_ff_tests_ff_ff_ff([A1],%K31,%_G, + %Omit,% + true, + Tests1)), + */ + process_directory_ff([A1],%_G, + %Omit,% + true, + Mod_time2),%),Mod_time) + %),Mod_time2), + foldr(append,Mod_time2,Mod_time), + %foldr(append,Mod_time3,Mod_time), + sort(Mod_time,Mod_time_a), + + working_directory1(_,A) + + ,!. 
+ +process_directory_ff(K,%G, + Top_level,%Mod_time1, + Mod_time61) :- + +%G=K, +%/* +findall(K4,(member(K1,K), directory_files(K1,F), + delete_invisibles_etc(F,G), +%*/ +findall(Mod_time3,(member(H,G),%not(string_concat("dot",_,H)), + +%not(member(H,Omit)), + + +foldr(string_concat,[K1,H],H1), + +% if a file then find modification date +% if a folder then continue finding files in folder +(exists_directory(H1)-> + +(string_concat(H1,"/",H2), +process_directory_ff([H2],%[H], + false,%[],%Omit % only omit top level dirs xx + %Mod_time1, + Mod_time3) + %foldr(append,Mod_time31,Mod_time3) + ); + +( +open_string_file_s(H1,Tests31), +Mod_time3=[[H1,Tests31]] + +%time_file(H1,Mod_time4), +%trace, +%append(Mod_time1,[[H1,Mod_time4]],Mod_time3))) +%Mod_time3=[[H1,Mod_time4]] +)) + +),Mod_time5),%trace, +foldr(append,Mod_time5,Mod_time51), + +%Mod_time5=Mod_time51, + +(Top_level=true%not(Omit=[]) % at top level +-> +( +%term_to_atom(Mod_time51,Mod_time52), +Mod_time51=Mod_time52, +%string_concat(K3,"/",K1), +%foldr(string_concat,["../private2/luciancicd-data/mod_times_",K3,".txt"],K2), +K4=%K2, +Mod_time52 +%open_s(K2,write,S), +%write(S,Mod_time52),close(S) + +%writeln(["*",K2, +%Mod_time52] +%) +); +K4=Mod_time51 +) + + + +),Mod_time6), + +(%not(Omit=[])-> +Top_level=true-> +Mod_time6=Mod_time61; +foldr(append,Mod_time6,Mod_time61)), + +!. \ No newline at end of file diff --git a/gitl/gitl.pl b/gitl/gitl.pl new file mode 100644 index 0000000000000000000000000000000000000000..1c48e48246f21717947fa8c8eda37ff68c6ebe7e --- /dev/null +++ b/gitl/gitl.pl @@ -0,0 +1,185 @@ +% GitL + +% Moves a copy of a folder into the folder +% - adds a version number to the folder +% Diffs last two versions + +:-include('diff.pl'). +:-include('find_files.pl'). +:-include('../luciancicd/ci_vintage.pl'). +:-include('../luciancicd/diff-cgpt.pl'). +:-include('gitl_ws.pl'). +:-include('gitl_ws1.pl'). +:-include('../listprologinterpreter/listprolog.pl'). 
+:-include('../luciancicd/remove_end_comment.pl'). + +repository_root_path("../gitl_test/"). +gitl_data_path("../gitl_data/"). + +commit(Repository1,Label) :- + + (string_concat(Repository,"/",Repository1)->true; + Repository=Repository1), + + working_directory1(A1,A1), + + repository_root_path(Repository_root_path), + +(exists_directory_s(Repository_root_path)->true;make_directory(Repository_root_path)), + + gitl_data_path(Gitl_data_path1), + +(exists_directory_s(Gitl_data_path1)->true;make_directory(Gitl_data_path1)), + + %trace, + foldr(string_concat,[Repository_root_path,Repository,"/"],R1), + (exists_directory_s(R1)->true;make_directory(R1)), + + foldr(string_concat,[Gitl_data_path1,Repository,"/"],R21), + (exists_directory_s(R21)->true;make_directory(R21)), +%trace, + + foldr(string_concat,[Gitl_data_path1,Repository,"/","n.txt"],N_path), + (exists_file_s(N_path)-> + open_file_s(N_path,N); + (N=0,number_string(N,NS),save_file_s(N_path,NS))), + + N1 is N+1, + + (N=0->N0=N1;N0=N), + foldr(string_concat,[Gitl_data_path1,Repository,"/",N0,"/"],To_m_1), + foldr(string_concat,[Gitl_data_path1,Repository,"/",N1,"/"],To1), + + %trace, + + %(N0=0-> + + + + + ((To=To1, + find_files(R1,RTests), + (N1=1->(scp1(Repository_root_path,Repository,Gitl_data_path1,N1,R1,N_path), + + HTML=(-)); + (sd1(RTests,R1,To_m_1,Repository_root_path,Repository,Gitl_data_path1,N1,R1,N_path,To,HTML))) + +) + ->( + + %working_directory1(_,A1), + + + + %Gitl_data_path=R2, + + + + + + + foldr(string_concat,[Gitl_data_path1,Repository,"/",N1,"-",Label,".html"],HTMLP), + + working_directory1(_,A1), + (HTML=(-) ->true; + save_file_s(HTMLP,HTML)) + );(writeln("Not committed. Folders identical."),abort)),!. + +%open_string_file_s + +%save_diff(N0,R1,To_m_1,To,HTML3) :- +%trace, + +% . 
+ +scp1(Repository_root_path,Repository,Gitl_data_path1,N1,R1,N_path) :- + + (number_string(N1,N1S),save_file_s(N_path,N1S)), + +%trace, + foldr(string_concat,[Gitl_data_path1,Repository,"/",N1,"/"],R2), + (exists_directory_s(R2)->true;make_directory(R2)), + + + Scp="scp -pr ", + foldr(string_concat,[Repository_root_path,Repository,"/*"],From1), + + atomic_list_concat(B," ",From1), + atomic_list_concat(B,"\\ ",From), + + foldr(string_concat,[Gitl_data_path1,Repository,"/",N1,"/."],To_a1), + + atomic_list_concat(C," ",To_a1), + atomic_list_concat(C,"\\ ",To_a), + + foldr(string_concat,[Scp,From," ",To_a],Command), + + directory_files(R1,F), + delete_invisibles_etc(F,G), +%trace, + (G=[]->true; + shell1_s(Command)). + +sd1(RTests,R1,To_m_1,Repository_root_path,Repository,Gitl_data_path1,N1,R1,N_path,To,HTML) :- + +%trace, + + %findall(T1a,(member([T1,_],RTests),string_concat(R1,T1a,T1)),R11), + findall([T1a,BA],(member([T1,BA],RTests),string_concat(R1,T1a,T1)),R110), + + find_files(To_m_1,Tests1), + findall(T1a,(member([T1,_],Tests1),string_concat(To_m_1,T1a,T1)),T11), + findall([T1a,BA],(member([T1,BA],Tests1),string_concat(To_m_1,T1a,T1)),T110), + not(T110=R110), + + + + scp1(Repository_root_path,Repository,Gitl_data_path1,N1,R1,N_path), + + find_files(To,Tests2), + findall(T2b,(member([T2,_],Tests2),string_concat(To,T2b,T2)),T21), + + %trace, + + intersection(T11,T21,IT12), + subtract(T11,T21,D), + subtract(T21,T11,I), + %trace, + findall([A,B],(member([A1,B],Tests1),string_concat(To_m_1,A,A1)),Tests11), + findall([A,B],(member([A1,B],Tests2),string_concat(To,A,A1)),Tests21), + append(Tests11,Tests21,Tests3), + findall([A1,B],(member(A,IT12),string_concat(To_m_1,A,A1),member([A1,B],Tests1)),IT11), + findall([A1,B],(member(A,IT12),string_concat(To,A,A1),member([A1,B],Tests2)),IT123), + %trace, + length(IT11,IT11L), + numbers(IT11L,1,[],Ns), + + + findall([*,T1a,C],(member(NA,Ns),get_item_n(IT11,NA,[A1,B1]),string_concat(To_m_1,T1a,A1),get_item_n(IT123,NA,[_A2,B2]), 
+ split_string(B1,"\n\r","\n\r",IT111), + split_string(B2,"\n\r","\n\r",IT121), + %trace, + diff_gitl(IT111,IT121,C)),CA), + %trace, + + findall([*,A,[[i,B14]]],(member(A,I),%string_concat(To_m_1,A,A1), + member([A,B],Tests3),split_string(B,"\n\r","\n\r",B1), + findall([B11,"
"],member(B11,B1),B12), + flatten(B12,B13),foldr(string_concat,B13,B14)),IT11a), + findall([*,A,[[d,B14]]],(member(A,D),%string_concat(To,A,A1), + member([A,B],Tests3),split_string(B,"\n\r","\n\r",B1), + findall([B11,"
"],member(B11,B1),B12), + flatten(B12,B13),foldr(string_concat,B13,B14)),IT123a), + + foldr(append,[IT11a,IT123a,CA],C1), + + diff_gitl(C1,HTML). + + +diff_gitl(Before,After,After3) :- + %find_insertions_and_deletions_vintage_old(Before,After,Insertions,Deletions), + diff(Before,After,_,_,[],[],After3), + + %replace11_vintage(After,Insertions,[],[],After2), + %replace12_vintage(Before,After2,Deletions,[],After3), + !. \ No newline at end of file diff --git a/gitl/gitl_ws.pl b/gitl/gitl_ws.pl new file mode 100644 index 0000000000000000000000000000000000000000..be1daca9ea046a2f1758a8f8cbdabb95ba367d34 --- /dev/null +++ b/gitl/gitl_ws.pl @@ -0,0 +1,211 @@ +:- use_module(library(http/thread_httpd)). +:- use_module(library(http/http_dispatch)). +:- use_module(library(http/http_error)). +:- use_module(library(http/html_write)). + +% we need this module from the HTTP client library for http_read_data +:- use_module(library(http/http_client)). +:- http_handler('/gitl', gitl_web_form, []). + +%:-include('../SSI/ssi.pl'). +%:-include('../SSI/ssi.pl'). +%:-include('gitl1_pl.pl'). + +%:- include('files/listprolog.pl'). + +% show changes to all repos on page, allow selecting checkboxes of repos to commit or all +% have anchor, link to top +% list of repos at top have notif icon, pass icon +% refresh time + +:-dynamic num1/1. + +gitl_server(Port) :- + http_server(http_dispatch, [port(Port)]). + + /* + browse http://127.0.0.1:8000/ + This demonstrates handling POST requests + */ + + gitl_web_form(_Request) :- +%retractall(html_api_maker_or_terminal(_)), +%assertz(html_api_maker_or_terminal(html + %terminal + %)), + + retractall(num1(_)),assertz(num1(1)), + + repository_root_path(RRP),atom_string(RRP1,RRP), + working_directory(A3,A3), + working_directory(_,RRP1), + format('Content-type: text/html~n~n', []), + +data(Header,Footer), + +format(Header,[]), + +writeln("

GitL



+

All

+
+
"), +directory_files("./",F), + delete_invisibles_etc(F,G), + + findall([N,H,H1,F1],(member(H,G),atom_string(H,HH),ws_html(HH,F1),%open_string_file_s(H,F11), + get_num(N), +%(string_concat("log",_,H)->(term_to_atom(F111,F11),F111=[Success,F12], +((F1=(-))->Success1="";foldr(string_concat,["CHANGED Label: "],Success1)),foldr(string_concat,["",Success1," - "],Success2),%);(F11=F12,Success2="")), time_file(H,T),atomic_list_concat(A,"\n",F12),atomic_list_concat(A,"
",F1), + foldr(string_concat,[Success2,"", + H,"", + "

"],H1)%,writeln(H1) + ),J0), + + sort(J0,J), + %reverse(J1,J), + + findall(_,(member([_,_,H1,_],J),writeln(H1)),_), + + writeln(""), + + findall(_,(member([N,H,_,F1],J),foldr(string_concat,["

",H,"

Top
",F1,"

"],H2),writeln(H2)),_), +%Debug=off, + + %test_open_types_cases(4,Query,Types,Modes,Functions), + +%international_lucianpl([lang,"en"],Debug,Query,Types,Modes,Functions,_Result), +%p2lpconverter([file,"../private/la_com_ssi1.pl"],List3), + +%testopen_cases(8,[[n,test]],List3), + %test(1,Query,Functions,Result), + +% Form and HTML Table +%test1(Functions), +%Query=[[n,test]], + %gitl_test(List3), + %para(List3), + %international_lucianpl([lang,"en"],Debug,[[n,gitl]],List3,_Result1), +writeln("
+Top"), + + working_directory(_,A3), + + +format(Footer,[]) + + . + + %term_to_atom(Debug2,'off'), +%term_to_atom(Query2,Query1), +%term_to_atom(Functions2,Functions1), + +%international_interpret([lang,"en"],Debug2,Query2,Functions2,Result), + %%format('

========~n', []), + %%portray_clause + %portray_clause(result), + %%writeln1(Data), + +%format('

'). + + +data(Header,Footer) :- + +Header=' + + + + + GitL + + + + + + + + + + +
+ + + + + + +
+

', + +Footer='

+
+
+ +
+ +'. + +get_num(A) :- num1(A),retractall(num1(_)),A1 is A+1,assertz(num1(A1)). + + + +:- http_handler('/gitl_landing', gitl_landing_pad, []). + + gitl_landing_pad(Request) :- + member(method(post), Request), !, + http_read_data(Request, Data, []), + format('Content-type: text/html~n~n', []), + format('

', []), + %portray_clause(Data), + + +%a(Data) :- %%term_to_atom(Term,Data), + %append(Data1,[submit=_],Data), + + findall(_,(member(HHa=O,Data),atom_concat(HH,'xxA',HHa),atom_concat(HH,'xxB',HH1a),(O=on->(member(HH1a=Label,Data), + atomic_list_concat(B,"+",HH),%atomic_list_concat(B,"\\ ",HH1), + atomic_list_concat(B," ",HH1),atom_string(HH1,HH2),atom_string(Label,Label1), + (commit(HH2,Label1)->(foldr(string_concat,[HH1," (with label: \"",Label1,"\") was committed.
"],A),writeln(A));writeln([HH,not,committed]))))),_), +%Data=[%%debug='off',%%Debug1, +%query=Query1,functions=Functions1,submit=_], + +%term_to_atom(Debug2,'off'), +%term_to_atom(Query2,Query1), +%term_to_atom(Functions2,Functions1), + +%international_interpret([lang,"en"],Debug2,Query2,Functions2,Result), + %%format('

========~n', []), + %%portray_clause + %portray_clause(Result), + %%writeln1(Data), + +format('

') +%*/ +. \ No newline at end of file diff --git a/gitl/gitl_ws1.pl b/gitl/gitl_ws1.pl new file mode 100644 index 0000000000000000000000000000000000000000..115b2e06f9376db4c057cf2163903ebe627186a1 --- /dev/null +++ b/gitl/gitl_ws1.pl @@ -0,0 +1,124 @@ +% Web server - finds differences between last saved version and current changes + +ws_html(Repository1,HTML1) :- + + (string_concat(Repository,"/",Repository1)->true; + Repository=Repository1), + + working_directory1(A1,A1), + + repository_root_path(Repository_root_path), + +(exists_directory_s(Repository_root_path)->true;make_directory(Repository_root_path)), + + gitl_data_path(Gitl_data_path1), + +(exists_directory_s(Gitl_data_path1)->true;make_directory(Gitl_data_path1)), + + %trace, + foldr(string_concat,[Repository_root_path,Repository,"/"],R1), + (exists_directory_s(R1)->true;make_directory(R1)), + + foldr(string_concat,[Gitl_data_path1,Repository,"/"],R21), + (exists_directory_s(R21)->true;make_directory(R21)), +%trace, + + foldr(string_concat,[Gitl_data_path1,Repository,"/","n.txt"],N_path), + (exists_file_s(N_path)-> + (open_file_s(N_path,N), + + + + %N1 is N+1, + + %(N=0->N0=N1;N0=N), + foldr(string_concat,[Gitl_data_path1,Repository,"/",N,"/"],To_m_1), + %foldr(string_concat,[Gitl_data_path1,Repository,"/",N1,"/"],To1), + + %trace, + + %(N0=0-> + + + + + %((%To=To1, + %(N1=1->(scp1(Repository_root_path,Repository,Gitl_data_path1,N1,R1,N_path), + + %HTML=(-)); + %( + + find_files(To_m_1,Tests1) + );Tests1=[]), + + find_files(R1,RTests), + + findall([T1a,BA1],(member([T1,BA],RTests),remove_end_comments2(BA,BA1),string_concat(R1,T1a,T1)),R110), + + + To=R1, + +(sd2(R110,Tests1,RTests,R1,To_m_1,Repository_root_path,Repository,Gitl_data_path1,N,R1,N_path,To,HTML) + ->HTML1=HTML; + HTML1=(-)), + + !. 
+ +sd2(R110,Tests1,RTests,R1,To_m_1,_Repository_root_path,_Repository,_Gitl_data_path1,_N1,R1,_N_path,To,HTML) :- + +%trace, + + %findall(T1a,(member([T1,_],RTests),string_concat(R1,T1a,T1)),R11), + findall(T1a,(member([T1,_],Tests1),string_concat(To_m_1,T1a,T1)),T11), + findall([T1a,BA1],(member([T1,BA],Tests1),remove_end_comments2(BA,BA1),string_concat(To_m_1,T1a,T1)),T110), + + %trace, + + not(T110=R110), + + + + %scp1(Repository_root_path,Repository,Gitl_data_path1,N1,R1,N_path), + + %find_files(To,Tests2), + Tests2=RTests, + %trace, + findall(T2b,(member([T2,_],Tests2),string_concat(To,T2b,T2)),T21), + + %trace, + + intersection(T11,T21,IT12), + subtract(T11,T21,D), + subtract(T21,T11,I), + %trace, + findall([A,B],(member([A1,B],Tests1),string_concat(To_m_1,A,A1)),Tests11), + findall([A,B],(member([A1,B],Tests2),string_concat(To,A,A1)),Tests21), + append(Tests11,Tests21,Tests3), + findall([A1,B],(member(A,IT12),string_concat(To_m_1,A,A1),member([A1,B],Tests1)),IT11), + findall([A1,B],(member(A,IT12),string_concat(To,A,A1),member([A1,B],Tests2)),IT123), + %trace, + length(IT11,IT11L), + numbers(IT11L,1,[],Ns), + + + findall([*,T1a,C],(member(NA,Ns),get_item_n(IT11,NA,[A1,B1]),string_concat(To_m_1,T1a,A1),get_item_n(IT123,NA,[_A2,B2]), + split_string(B1,"\n\r","\n\r",IT111), + split_string(B2,"\n\r","\n\r",IT121), + %trace, + diff_gitl(IT111,IT121,C)),CA), + %trace, + + findall([*,A,[[i,B14]]],(member(A,I),%string_concat(To_m_1,A,A1), + member([A,B],Tests3),split_string(B,"\n\r","\n\r",B1), + findall([B11,"
"],member(B11,B1),B12), + flatten(B12,B13),foldr(string_concat,B13,B14)),IT11a), + findall([*,A,[[d,B14]]],(member(A,D),%string_concat(To,A,A1), + member([A,B],Tests3),split_string(B,"\n\r","\n\r",B1), + findall([B11,"
"],member(B11,B1),B12), + flatten(B12,B13),foldr(string_concat,B13,B14)),IT123a), + + foldr(append,[IT11a,IT123a,CA],C1), + + diff_gitl(C1,HTML). + + \ No newline at end of file diff --git a/gitl/web_editor_gitl_test.pl b/gitl/web_editor_gitl_test.pl new file mode 100644 index 0000000000000000000000000000000000000000..d8147c178188b3b986cd527913a75d0e095be77e --- /dev/null +++ b/gitl/web_editor_gitl_test.pl @@ -0,0 +1,3 @@ +:-include('../Philosophy/web-editor.pl'). + +first_directory('../gitl_test/'). \ No newline at end of file diff --git a/listprologinterpreter/.DS_Store b/listprologinterpreter/.DS_Store new file mode 100644 index 0000000000000000000000000000000000000000..ebc233efb8df9c6d1f327c8b71fe401a5412cc21 Binary files /dev/null and b/listprologinterpreter/.DS_Store differ diff --git a/listprologinterpreter/D2T_docs.md b/listprologinterpreter/D2T_docs.md new file mode 100644 index 0000000000000000000000000000000000000000..ef8fa3f8684404297da5882dcc00befe92b6b60d --- /dev/null +++ b/listprologinterpreter/D2T_docs.md @@ -0,0 +1,38 @@ +# Data to Types Documentation + +* Data to Types is an algorithm that converts data into a type statement, for example: + +``` +?- data_to_types([[1],"a",1],[],T). +T = [[[t, brackets], [[t, number]]], [t, string], [t, number]]. + +?- data_to_types([1,["a"],1],[],T). +T = [[t, number], [[t, brackets], [[t, string]]], [t, number]]. + +?- data_to_types([1,"a",[1]],[],T). +T = [[t, number], [t, string], [[t, brackets], [[t, number]]]]. +``` + +# Getting Started + +Please read the following instructions on how to install the project on your computer for converting data to types. + +# Prerequisites + +None. + +# Installing + +* Please download and install SWI-Prolog for your machine at https://www.swi-prolog.org/build/. + +* Download the file data_to_types.pl to your machine. + +* Use in the same way as the examples above. 
+ +# Authors + +Lucian Green - Initial programmer - Lucian Academy + +# License + +I licensed this project under the BSD3 License - see the LICENSE.md file for details \ No newline at end of file diff --git a/listprologinterpreter/LICENSE.txt b/listprologinterpreter/LICENSE.txt new file mode 100644 index 0000000000000000000000000000000000000000..3a7675925f40114cfdcd03de302df5d59b67af86 --- /dev/null +++ b/listprologinterpreter/LICENSE.txt @@ -0,0 +1,13 @@ +LICENSE + +The FreeBSD Copyright +Copyright 1992-2019 The FreeBSD Project. + +Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met: + +Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer. +Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution. +THIS SOFTWARE IS PROVIDED BY THE AUTHOR AND CONTRIBUTORS ``AS IS'' AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE AUTHOR OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. + +The views and conclusions contained in the software and documentation are those of the authors and should not be interpreted as representing official policies, either expressed or implied, of the FreeBSD Project. 
+
diff --git a/listprologinterpreter/LPI_Caveats.md b/listprologinterpreter/LPI_Caveats.md new file mode 100644 index 0000000000000000000000000000000000000000..a37d75e2507458fd129b36d3118ca3ebb02ce86e --- /dev/null +++ b/listprologinterpreter/LPI_Caveats.md @@ -0,0 +1,11 @@
+# List Prolog Caveats
+
+* Name predicates and variables Name (name), Name1 (name1) or Name_a_1 (name_a_1), not Name_a1 (name_a1).
+
+* Use findall rather than relying on semicolon-separated results for non-determinism.
+
+* member(Item,List) in List Prolog now uses the same order of arguments as Prolog. member2(List, Item) can be used in CAW.
+
+* get_item_n(List,Item_number,Item) replaces append and length (the programmer must copy this algorithm into their algorithm).
+
+* lucianpl doesn't currently support other languages.
diff --git a/listprologinterpreter/LPI_docs.md b/listprologinterpreter/LPI_docs.md new file mode 100644 index 0000000000000000000000000000000000000000..8a0891d9cbd98364bdd14f046a36fc4f38428c75 --- /dev/null +++ b/listprologinterpreter/LPI_docs.md @@ -0,0 +1,571 @@
+# List Prolog Documentation
+
+* Load List Prolog by downloading the repository from GitHub.
+
+* Please download and install SWI-Prolog for your machine.
+
+* Load the List Prolog Interpreter by typing:
+`['listprolog'].`
+
+* The interpreter is called in the form:
+`interpret(Debug,Query,Functions,Result).`
+
+Where:
+Debug - on or off for trace,
+Query - the query,
+Functions - the algorithm,
+Result - the result.
+
+* For example:
+```
+interpret(off,[[n,function],[1,1,[v,c]]],
+[
+ [[n,function],[[v,a],[v,b],[v,c]],":-",
+ [
+ [[n,+],[[v,a],[v,b],[v,c]]]
+ ]
+ ]
+]
+,Result).
+```
+
+The result of the query is:
+`Result=[[[[v,c], 2]]]` This is the list of non-deterministic results (i.e. the ones that SWI-Prolog would return after pressing `";"`), each containing a list of variable values. [] is returned if the predicate is false, and [[]] is returned if the predicate is true but there are no variable values.
+ + +# Documentation of Body Structures + +In: +``` +interpret(off,[[n,function],[1,1,[v,c]]], +[ + [[n,function],[[v,a],[v,b],[v,c]],":-", + [ + THE BODY + ] + ] +] +,Result). +``` + +* Commands may be in brackets, e.g. +`[[[n,+],[[v,a],[v,b],[v,c]]]]` + + +* Statements may be negated in the form: + +`[[n,not],[Statement]]` + +For example: + +`[[n,not],[[[n,=],[[v,n],1]]]]` + + +* Statements may be connected by the disjunctive (or): + +`[[n,or],[Statements1,Statements2]]` + +For example: + +``` +[[n,or],[[[n,is],[[v,a],1]], +[[n,is],[[v,a],2]]]] +``` + +A limitation of List Prolog is that multiple clauses should be used rather than "or" to give non-deterministic results. + +* If-then statements may either be in the form: + +`[[n,"->"],[Statements1,Statements2]]` + +This means "If Statements1 then Statements2". + +E.g. +``` +[[n,"->"],[[[n,>],[[v,a],[v,c122]]], +[[n,downpipe],[[v,c122],[v,b],[v,c]]]]] +``` + +* Or, if-then statements may be in the form: + +`[[n,"->"],[Statements1,Statements2,Statements2a]]` + +This means "If Statements1 then Statements2, else Statements2a". + +For example: + +``` +[[n,"->"],[[[n,deletea2],[[v,m],[v,h],[v,m1]]], +[[n,true]], +[[n,=],[[v,m],[v,m1]]]]] +``` + + +# Documentation of Commands + + +* `[[n,IsOp],[Variable1,Variable2,Variable3]]` e.g. `[[n,+],[1,2,[v,b]]]]` returns `[v,b]=3` where `IsOp` may be `+`,`-`,`*` or `/`. + +* `[[n,cut]]` - behaves like swipl's ! (doesn't allow backtracking forward or back past it) + +* `[[n,true]]` - behaves like true + +* `[[n,fail]]` - fails the current predicate + +* `[[n,atom],[Variable]]`, e.g. `[[n,atom],[[v,a]]]` - returns true if `[v,a]=`e.g. `'a'`, an atom + +* `[[n,string],[Variable]]`, e.g. `[[n,string],[[v,a]]]` - returns true if `[v,a]=`e.g. `"a"`, a string + +* `[[n,number],[Variable]]`, e.g. `[[n,number],[14]]` - returns true where `14` is a number + +* `[[n,letters],[Variable]]`, e.g. 
`[[n,letters],["abc"]]` - returns true where `"abc"` is letters
+
+* `[[n,IsOperator],[Variable1,Variable2]]`, where `IsOperator=is` or `IsOperator="="`, e.g. `[[n,=],[[v,a],1]]` - returns true if `[v,a]=1`
+
+* `[[n,ComparisonOperator],[Variable1,Variable2]]`, where `ComparisonOperator=">",">=","<", "=<", "=" or "=\="` e.g. `[[n,=\=],[1,2]]` - returns `not(1=2)=true`.
+
+* `[[n,equals1],[Variable1,[Variable2,Variable3]]]` e.g. `[[n,equals1],[["a","b"],[[v,a],[v,b]]]]` returns `[v,a]="a"` and `[v,b]="b"`
+
+* `[[n,equals2],[Variable1,[Variable2,Variable3]]]` e.g. `[[n,equals2],[[v,a],["a","b"]]]` returns `[v,a]=["a","b"]`
+
+* `[[n,equals3],[Variable1,Variable2]]` e.g. `[[n,equals3],[[v,a],[1,2,3]]]` returns `[v,a]=[1,2,3]`
+
+* `[[n,wrap],[Variable1,Variable2]]` e.g. `[[n,wrap],["a",[v,b]]]` returns `[v,b]=["a"]`
+
+* `[[n,unwrap],[Variable1,Variable2]]` e.g. `[[n,unwrap],[["a"],[v,b]]]` returns `[v,b]="a"`
+
+* `[[n,head],[Variable1,Variable2]]` e.g. `[[n,head],[["a","b","c"],[v,b]]]` returns `[v,b]="a"`
+
+* `[[n,tail],[Variable1,Variable2]]` e.g. `[[n,tail],[["a","b","c"],[v,b]]]` returns `[v,b]=["b","c"]`
+
+* `[[n,member],[Variable1,Variable2]]` e.g. `[[n,member],[[v,b],["a","b","c"]]]` returns `[v,b]="a"`, `[v,b]="b"` or `[v,b]="c"`, or e.g. `[[n,member],["c",["a","b","c"]]]` returns true. (Formerly member2).
+
+* `[[n,delete],[Variable1,Variable2,Variable3]]` e.g. `[[n,delete],[["a","b","b","c"],"b",[v,c]]]` returns `[v,c]=["a","c"]`
+
+* `[[n,append],[Variable1,Variable2,Variable3]]` e.g. `[[n,append],[["a","b"],["c"],[v,d]]]` returns `[v,d]=["a","b","c"]`. Note: Variable2 must be in list form, not e.g. `"c"`, or the command will fail. Wrap may wrap it in `[]`.
+
+* `[[n,stringconcat],[Variable1,Variable2,Variable3]]` e.g. `[[n,stringconcat],["hello ","john",[v,c]]]` returns `[v,c]="hello john"`.
+
+* `[[n,stringtonumber],[Variable1,Variable2]]` e.g. `[[n,stringtonumber],["3",[v,b]]]` returns `[v,b]=3`
+
+* `[[n,random],[Variable1]]` e.g.
`[[n,random],[[v,r]]]` returns e.g. `[v,r]=0.19232608946956326`
+
+* `[[n,length],[Variable1,Variable2]]` e.g. `[[n,length],[[1,2,3],[v,l]]]` returns `[v,l]=3`
+
+* `[[n,ceiling],[Variable1,Variable2]]` e.g. `[[n,ceiling],[0.19232608946956326,[v,c]]]` returns `[v,c]=1`
+
+* `[[n,date],[Year,Month,Day,Hour,Minute,Seconds]]` e.g. `[[n,date],[[v,year],[v,month],[v,day],[v,hour],[v,minute],[v,second]]]` returns e.g. `[v,year]=2019`, `[v,month]=11`, `[v,day]=6`, `[v,hour]=12`, `[v,minute]=15`, `[v,second]=20.23353409767151`.
+
+* `[[n,sqrt],[Variable1,Variable2]]` e.g. `[[n,sqrt],[4,[v,c]]]` returns `[v,c]=2`
+
+* `[[n,round],[Variable1,Variable2]]` e.g. `[[n,round],[1.5,[v,c]]]` returns `[v,c]=2`
+
+* `[[n,equals4],[Variable1,Variable2]]` e.g. `[[n,equals4],[[[v,c],"|",[v,d]],[1,2,3]]]` returns `[v,c]=1` and `[v,d]=[2,3]`. You may use either order (i.e. a=1 or 1=a). Multiple items are allowed in the head of the list, there may be lists within lists, and lists with pipes must have the same number of items in the head in each list, or no pipe in the other list.
+
+* `[[n,findall],[Variable1,Variable2,Variable3]]` e.g. `[[n,equals3],[[v,a],[1,2,3]]],[[n,findall],[[v,a1],[[n,member],[[v,a1],[v,a]]],[v,b]]]` returns `[v,b]=[1,2,3]`
+
+* `[[n,string_from_file],[Variable1,Variable2]]` e.g. `[[n,string_from_file],[[v,a],"file.txt"]]` returns `[v,a]="Hello World"`
+
+* `[[n,maplist],[Variable1,Variable2,Variable3,Variable4]]` e.g. `[[n,maplist],[[n,+],[1,2,3],0,[v,b]]]` returns `[v,b]=6`
+
+* `[[n,string_length],[Variable1,Variable2]]` e.g. `[[n,string_length],["abc",[v,b]]]` returns `[v,b]=3`
+
+* `[[n,sort],[Variable1,Variable2]]` e.g. `[[n,sort],[[1,3,2],[v,b]]]` returns `[v,b]=[1,2,3]`
+
+* `[[n,intersection],[Variable1,Variable2,Variable3]]` e.g. `[[n,intersection],[[1,3,2],[3,4,5],[v,b]]]` returns `[v,b]=[3]`
+
+* `[[n,read_string],[Variable1]]` e.g. `[[n,read_string],[[v,a]]]` asks for input and returns e.g. `[v,a]="hello"`
+
+* `[[n,writeln],[Variable1]]` e.g. 
`[[n,writeln],[[v,a]]]` writes the value of `[v,a]`, e.g. `"hello"`
+
+* `[[n,atom_string],[Variable1,Variable2]]` e.g. `[[n,atom_string],[a,[v,b]]]` returns `[v,b]="a"`, or `[[n,atom_string],[[v,b],"a"]]` returns `[v,b]=a`
+
+* (1) `[[n,call],[Function,Arguments]]` e.g. `[[n,call],[[n,member2a],[["a","b","c"],[v,b]]]]` returns `[v,b]=a`
+
+* (2) `[[n,call],[[lang,Lang],Debug,[Function,Arguments],Functions]]` e.g. `[[n,call],[[lang,same],same,[[n,member2a],[[v,b],["a","b","c"]]],
+[[[n,member2a],[[v,b],[v,a]],":-",
+	[[[n,member],[[v,b],[v,a]]]]]]]]` returns `[v,b]="a"`, where Lang may be e.g. `"en"`, etc., or `same` (meaning the same language as the parent predicate) and Debug may be `on`, `off` or `same` (meaning the same debug status as the parent predicate).
+
+* (3) `[[n,call],[[lang,Lang],Debug,[Function,Arguments],Types,Modes,Functions]]` e.g. `[[n,call],[[lang,same],same,[[n,member2a],[[v,b],["a","b","c"]]], [[[n,member2a],[[t, number], [[t, number], [t, number], [t, number]]]]],
+	[[[n,member2a],[output,input]]],
+[[[n,member2a],[[v,b],[v,a]],":-",
+	[	[[n,member],[[v,b],[v,a]]]]
+	]]]]` returns `[v,b]="a"`. (See call (2) above for possible values of Lang and Debug.)
+
+* `[[n,trace]]` switches on trace (debug) mode.
+
+* `[[n,notrace]]` switches off trace (debug) mode.
+
+* `[[n,get_lang_word],[Variable1,Variable2]]` e.g. `[[n,get_lang_word],["get_lang_word",[v,word]]]` returns `[v,word]="get_lang_word"`.
+
+* `[[n,equals4_on]]` switches on equals4 mode (in check arguments, substitute vars and update vars, but not in the equals4 command itself), e.g. for list processing in arguments.
+
+* `[[n,equals4_off]]` switches off equals4 mode, for debugging.
+
+* See also Additional LPI Commands.
+
+* See lpiverify4.pl for examples of rules (predicates without bodies) and calls to predicates.
+
+
+# Grammars
+
+* List Prolog supports grammars. For example:
+
+* Grammars may be recursive (see test 9 in lpiverify4.pl), i.e. 
they may repeat until triggering the base case:
+
+```
+test(9,[[n,grammar1],["aaa"]],
+[
+        [[n,grammar1],[[v,s]],":-",[[[n,noun],[[v,s],""]]]],
+        [[n,noun],"->",[""]],
+        [[n,noun],"->",["a",[[n,noun]]]]
+        ],
+[[]]).
+```
+
+* And:
+
+```
+test(8,[[n,grammar1],["apple"]],
+[
+        [[n,grammar1],[[v,s]],":-",
+        [
+                [[n,noun],[[v,s],""]]
+        ]
+        ],
+
+        [[n,noun],"->",["apple"]]
+],[[]]).
+```
+
+* In `[[n,noun],[[v,s],""]]`, the argument `[v,s]` is the entry string and `""` is the exit string.
+
+* In the above example, the word `"apple"` is parsed. Grammars use `"->"`, not `":-"`.
+
+* Grammars may have extra arguments, placed after the other arguments. The entry and exit string arguments are only used outside the grammar, and can be accessed, e.g. `b` in:
+```
+ [[n,lookahead],[[v,a],[v,a],[v,b]],":-",
+        [[[n,stringconcat],[[v,b],[v,d],[v,a]]]]]
+```
+Note: `":-"`, not `"->"`.
+
+
+* Recursive grammars require base case 1, e.g.
+```
+        [[n,compound212],["","",[v,t],[v,t]]],
+```
+which is triggered at the end of the string (`""`), taking and returning the extra argument `t`,
+
+and a "bottom case" (base case 2) in case the parser is not at the end of the string, e.g.:
+```
+        [[n,compound212],[[v,x],[v,x],[v,t],[v,t]]],
+```
+which is triggered if the first base case is not matched. It takes and returns `x` (the entry and exit strings) and the extra argument `t`.
+
+These base cases accompany the clause:
+```
+        [
+        [n,compound21],[[v,t],[v,u]],"->",
+        [
+                [[n,a]],
+                [[n,code],[[n,wrap],["a",[v,itemname1]]],
+                        [[n,append],[[v,t],[v,itemname1],[v,v]]]
+                ],
+                [[n,compound212],[[v,v],[v,u]]]
+        ]
+        ],
+```
+
+In it, `[[n,a]]` calls a grammar predicate called `"a"`. `[[n,code],...]` is similar to `{}` in SWI-Prolog (it allows commands to be called within a grammar). The code wraps a string and appends it to a list, before exiting code and calling the grammar predicate `compound212`. 
`v` and `u` are not entry and exit strings; they are extra arguments, handled with `t` in base cases 1 and 2 above. The start of the entry string is matched against strings when `[[n,a]]` is called, and any grammar predicates outside `[[n,code],...]` (i.e. `[[n,compound212],[[v,v],[v,u]]]`) are given the rest of the entry string (which is an exit string). This continues until either the string ends at base case 1, or the string doesn't end, is caught by base case 2 and is processed in a later clause.
+
+* Sometimes there is another recursive clause, which calls itself:
+```
+        [
+        [n,compound21],[[v,t],[v,u]],"->",
+        [
+                [[n,a]],",",
+                [[n,compound21],[[],[v,compound1name]]],
+                [[n,code],[[n,wrap],["a",[v,itemname1]]],
+                        [[n,append],[[v,t],[v,itemname1],[v,v]]],
+                        [[n,append],[[v,v],[v,compound1name],[v,u]]]
+                ]
+        ]
+        ],
+```
+
+`[[n,a]]`, a call, could be substituted with `[v,b]`; however, `[[n,a]]` would call a grammar predicate, whereas `[v,b]` would return a character.
+
+* When we need to find out what the next character is but parse the character somewhere else, we use lookahead.
+
+E.g.:
+```
+        [
+        [n,word21],[[v,t],[v,u]],"->",
+        [
+                [v,a],[[n,commaorrightbracketnext]],
+                [[n,code],[[n,letters],[[v,a]]],
+                        [[n,stringconcat],[[v,t],[v,a],[v,v]]]
+                ],
+                [[n,word212],[[v,v],[v,u]]]
+        ]
+        ],
+```
+
+With `commaorrightbracketnext` (which looks ahead for a comma or `"]"`), this clause doesn't return true at `"a"` of `"ab,c"` when it is run, and instead succeeds at `"b"`, as wanted, on another run.
+
+* Note, we can call `lookahead` as a grammar predicate:
+`[[n,lookahead],["ate"]]`
+
+even though the predicate itself is not a grammar predicate:
+```
+        [[n,lookahead],[[v,a],[v,a],[v,b]],":-",
+        [[[n,stringconcat],[[v,b],[v,d],[v,a]]]]]
+```
+where `[v,b]="ate"`.
+
+* Predicates (or predicates to modify) providing string to list (test 15), split on character(s) (test 17), `intersection` and `minus` are in lpiverify4.pl. 
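+
+For instance, `compound21` above might be called from an ordinary (non-grammar) predicate by supplying the entry string, an exit string of `""` and an initial value for the extra argument, following the pattern of `grammar1` in tests 8 and 9. This is a hypothetical sketch only (the predicate name `grammar1` and the empty-list initial accumulator are illustrative; the exact accumulator flow depends on the clauses of `compound212`):
+
+```
+        [[n,grammar1],[[v,s],[v,u]],":-",
+        [
+                [[n,compound21],[[v,s],"",[],[v,u]]]
+        ]
+        ],
+```
+
+Here `[v,s]` is the entry string, `""` requires the whole string to be consumed, and `[v,u]` receives the final value of the extra argument.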
+ +# Type Statements + +* Functions may have strong types, which check inputted values when calling a function and check all values when exiting a function. So far, any type statement with the name and arity of the function may match data for a call to that function. + +* The user may optionally enter types after the query. The following type statement tests number, string and predicate name types. + +* Note: Mode statements, described in the next section, are required after Type Statements. + +* Types with lists (0-infinite repeats of type statements) are written inside {}. There may be nested curly brackets. + +* Type Statements may be recursive (see test 23 in lpiverify4_types.pl), i.e. they may repeat until triggering the base case: + +``` +test_types_cases(23,[[n,connect_cliques],[[["a",1],[1,2],[2,"b"]],[["a",3],[3,4],[4,"b"]],[v,output]]], +[ +[[n,connect_cliques],[[t,list2],[t,list2],[t,list2]]], +[[t,item],[[t,number]]], +[[t,item],[[t,string]]], +[[t,list2],[{[t,item]}]], +[[t,list2],[{[t,list2]}]] +], + [[[n,connect_cliques],[input,input,output]]], + [[[n,connect_cliques],[[v,a],[v,b],[v,c]],":-", + [[[n,append],[[v,a],[v,b],[v,c]]]]]], +[[[[v,output],[["a",1],[1,2],[2,"b"],["a",3],[3,4],[4,"b"]]]]]). +``` + +* Also: + +``` +test_types_cases(2,[[n,function],[[v,a],[v,b],[v,c]]], +[[[n,function],[[t,number],[t,string],[t,predicatename]]]], +[[[n,function],[output,output,output]]], +[ + [[n,function],[[v,a],[v,b],[v,c]],":-", + [ + [[n,=],[[v,a],1]], + [[n,=],[[v,b],"a"]], + [[n,=],[[v,c],[n,a]]] + ]] +] +,[[[[v,a], 1],[[v,b], "a"],[[v,c], [n,a]]]]). +``` + +* The following type statement tests number types. +``` +test_types_cases(1,[[n,function],[1,1,[v,c]]], +[[[n,function],[[t,number],[t,number],[t,number]]]], +[[[n,function],[input,input,output]]], +[ + [[n,function],[[v,a],[v,b],[v,c]],":-", + [ + [[n,+],[[v,a],[v,b],[v,c]]] + ] + ] +] +,[[[[v,c], 2]]]). +``` + +* The following type statement tests number and bracket types. 
+``` +test_types_cases(3,[[n,function],[[v,a]]], +[ +[[n,function],[[[t,number]]]] +], +[[[n,function],[output]]], +[ + [[n,function],[[1]]] +] +,[[[[v,a], [1]]]]). +``` + +* The following type statement tests number and string types. +``` +test_types_cases(4,[[n,f],[[v,a],[v,b],[v,c],[v,d]]], +[[[n,f],[[t,number],[t,string],[t,number],[t,string]]]], +[[[n,f],[output,output,output,output]]], +[ + [[n,f],[1,"a",2,"b"]] +] +,[[[[v,a], 1],[[v,b], "a"],[[v,c], 2],[[v,d], "b"]]]). +``` + +* The following type statement tests unique types, number and string types. +``` +test_types_cases(5,[[n,f],[[v,a],[v,b]]], +[ + [[n,f],[[t,a],[t,b]]], + [[t,a],[[t,number]]], + [[t,b],[[t,string]]] +], +[ + [[n,f],[output,output]] +], +[ + [[n,f],[1,"a"]] +] +,[[[[v,a], 1],[[v,b], "a"]]]). +``` + +* The following type statement tests any of number and string types. +``` +test_types_cases(6,[[n,f],[[v,a]]], +[ + [[n,f],[[t,a]]], + [[t,a],[[t,number]]], + [[t,a],[[t,string]]] +], +[ + [[n,f],[output]] +], +[ + [[n,f],["a"]] +] +,[[[[v,a], "a"]]]). +``` + +* The following type statements test any types (including multiple types). +``` +[ + [[n,map],[[[t, predicatename], [[t, any]]], [t, number], [t, number]]], + + [[n,getitemn],[[t, number], {[t, any]}, [t, any]]] +], +``` + +# Mode Statements + +In the following, +``` +test_types_cases(2,[[n,function],[[v,a],[v,b],[v,c]]], +[[[n,function],[[t,number],[t,string],[t,predicatename]]]], +[[[n,function],[output,output,output]]], +[ + [[n,function],[[v,a],[v,b],[v,c]],":-", + [ + [[n,=],[[v,a],1]], + [[n,=],[[v,b],"a"]], + [[n,=],[[v,c],[n,a]]] + ]] +] +,[[[[v,a], 1],[[v,b], "a"],[[v,c], [n,a]]]]). +``` + +`[[[n,function],[output,output,output]]],` is the mode statement, which must follow the type statement (although type and mode statements together are optional). The Mode Statement specifies whether each of the variables takes input or gives output. + +# Functional List Prolog (FLP) + +* List Prolog has an optional functional mode. 
In FLP, function calls may be passed as variables and functions may have strong types. + +* Functional algorithms may be recursive (see test 7 in lpiverify4_types.pl), i.e. they may repeat until triggering the base case: + +``` +test_types_cases(7,[[n,map],[[[n,add],[[[n,add],[[[n,add],[1]]]]]],0,[v,d]]], +[ +[[n,map],[[t,map1],[t,number],[t,number]]], +[[t,map1],[[t,number]]], +[[t,map1],[[[t,predicatename],[[t,map1]]]]], +[[n,add],[[t,number],[t,number],[t,number]]], +[[n,getitemn],[[t,number],{[t,any]},[t,any]]] +], +[ + [[n,map],[input,input,output]], + + [[n,add],[input,input,output]], + + [[n,getitemn],[input,input,output]] +], +[ + [[n,map],[[v,f1],[v,n1],[v,n]],":-", + [ + [[n,number],[[v,f1]]], + [[n,add],[[v,n1],[v,f1],[v,n]]] + ] + ], + [[n,map],[[v,f1],[v,l],[v,n]],":-", + [ + [[n,equals1],[[v,f1],[[v,f11],[v,f12]]]], + [[n,=],[[v,f11],[n,add]]], + [[n,getitemn],[1,[v,f12],[v,bb]]], + [[v,f11],[[v,l],1,[v,l2]]], + [[n,map],[[v,bb],[v,l2],[v,n]]] + ] + ], + + [[n,add],[[v,a],[v,b],[v,c]],":-", + [ [[n,+],[[v,a],[v,b],[v,c]]] + ]], + + [[n,getitemn],[1,[v,b],[v,c]],":-", + [ [[n,head],[[v,b],[v,c]]] + ]], + [[n,getitemn],[[v,a],[v,b],[v,c]],":-", + [ [[n,not],[[[n,=],[[v,a],1]]]], + [[n,tail],[[v,b],[v,t]]], + [[n,-],[[v,a],1,[v,d]]], + [[n,getitemn],[[v,d],[v,t],[v,c]]] + ]] + +] + +,[[[[v,d], 4]]]). +``` + +* In the following, `[[n,function],[[[n,function2],[2]],1,1,[v,c]]]` function2 is passed as a variable. `[v,f11]` is replaced with the function name. 
+``` +%% c=f(g(2), 1, 1) +test(53,[[n,function],[[[n,function2],[2]],1,1,[v,c]]], +[ + [[n,function],[[v,f1],[v,a],[v,b],[v,c]],":-", + [ + [[n,equals1],[[v,f1],[[v,f11],[v,f12]]]], + [[n,getitemn],[1,[v,f12],[v,bb]]], + [[v,f11],[[v,bb],[v,d],[v,f]]], + [[n,+],[[v,a],[v,b],[v,e]]], + [[n,+],[[v,e],[v,f],[v,g]]], + [[n,+],[[v,g],[v,d],[v,c]]] + ] + ], + [[n,function2],[[v,bb],[v,a],[v,f]],":-", + [ + [[n,is],[[v,a],[v,bb]]], + [[n,is],[[v,f],1]] + ] + ], + + [[n,getitemn],[1,[v,b],[v,c]],":-", + [ [[n,head],[[v,b],[v,c]]] + ]], + [[n,getitemn],[[v,a],[v,b],[v,c]],":-", + [ [[n,not],[[[n,=],[[v,a],0]]]], + [[n,tail],[[v,b],[v,t]]], + [[n,-],[[v,a],1,[v,d]]], + [[n,getitemn],[[v,d],[v,t],[v,c]]] + ]] +] + +,[[[[v,c], 5]]]). +``` + +* For other examples, please see lpiverify4.pl, lpiverify4_types.pl (for examples with types, including the list type), lpiverify4_open.pl (for examples with open-ended results) and lpiverify4_open_types.pl (for examples with open-ended results with types). diff --git a/listprologinterpreter/README.md b/listprologinterpreter/README.md new file mode 100644 index 0000000000000000000000000000000000000000..b296df3c11efae9632a07d72318106818e5fd59f --- /dev/null +++ b/listprologinterpreter/README.md @@ -0,0 +1,216 @@ +# List Prolog Interpreter + +List Prolog Interpreter (LPI) makes it easier to generate List Prolog programs. It works by interpreting different versions of Prolog that are in a list format. Using an algorithm that was written in SWI-Prolog, it easily parses and runs List Prolog code. + +# Getting Started + +Please read the following instructions on how to install the project on your computer for writing code. + +# Prerequisites + +* Please download and install SWI-Prolog for your machine at `https://www.swi-prolog.org/build/`. + +* You may need to install gawk using Homebrew. + +* Install Translation Shell on Mac, etc. +Change line in +``` +culturaltranslationtool/ctt2.pl +trans_location("../../../gawk/trans"). 
+``` +to correct location of trans. + +# 1. Install manually + +Download this repository, the Languages repository (which enables List Prolog Interpreter to be run in different languages), SSI and Cultural Translation Tool. + +# 2. Or Install from List Prolog Package Manager (LPPM) + +* Download the LPPM Repository: + +``` +mkdir GitHub +cd GitHub/ +git clone https://github.com/luciangreen/List-Prolog-Package-Manager.git +cd List-Prolog-Package-Manager +swipl +['lppm']. +lppm_install("luciangreen","listprologinterpreter"). +halt +``` + +# Running + +* In Shell: +`cd listprologinterpreter` +`swipl` +`['listprolog'].` + +* Running the tests +To run all tests, enter: +`test(off,NTotal,Score).` + +To run a specific test: +`test1(off,TestNumber,Passed).` +where TestNumber is the test number from lpiverify4.pl. + +Example of an end to end test +The query `test1(Debug,1,Passed).` +where `Debug (trace)=off` +tests the following predicate: +``` +test(1,[[n,function],[1,1,[v,c]]], +[ + [[n,function],[[v,a],[v,b],[v,c]],":-", + [ + [[n,+],[[v,a],[v,b],[v,c]]] + ] + ] +] +,[[[[v,c], 2]]]). +``` +This query contains the query tested, the predicate and the required result. + +Also, the query `test1(off,7,Passed).` +``` +test(7,[[n,reverse],[[1,2,3],[],[v,l]]], +[ + [[n,reverse],[[],[v,l],[v,l]]], + [[n,reverse],[[v,l],[v,m],[v,n]],":-", + [ [[n,head],[[v,l],[v,h]]], + [[n,tail],[[v,l],[v,t]]], + [[n,wrap],[[v,h],[v,h1]]], + [[n,append],[[v,h1],[v,m],[v,o]]], + [[n,reverse],[[v,t],[v,o],[v,n]]] + ] + ] +],[[[[v,l], [3, 2, 1]]]]). +``` +tests the reverse predicate. + +To run all tests (main, types, open and open types) in any language: +``` +test_all00("en",off,NTotal,Score). +test_all00("en2",off,NTotal,Score). +``` +where "en2" is an English language with e.g. `"concatenate strings"` instead of `stringconcat` ("en", or see available language codes - see the Languages repository for instructions about how to install different languages). 
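+
+Outside the test harness, a predicate can also be run directly with `interpret/4`, using the skeleton from the List Prolog Documentation. For example, with the predicate from test 1, `Result` should unify with the required result recorded in lpiverify4.pl, `[[[[v,c], 2]]]`:
+
+```
+interpret(off,[[n,function],[1,1,[v,c]]],
+[
+        [[n,function],[[v,a],[v,b],[v,c]],":-",
+        [
+                [[n,+],[[v,a],[v,b],[v,c]]]
+        ]
+        ]
+]
+,Result).
+```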
+
+* Note 1: drag and drop contents of `test_open_and_types_open_data/` into an empty file in BBEdit (Mac) to copy and paste into Terminal for tests with input.
+
+To run a test from one of main, types, open or open types, run one of:
+```
+test_all01(test,            4,"en2",off,1,Passed).
+test_all01(test_types_cases,6,"en2",off,1,Passed).
+test_all01(testopen_cases,  3,"en2",off,1,Passed).
+test_all01(test_open_types, 5,"en2",off,1,Passed).
+```
+where 1 is replaced with the test number from
+
+* lpiverify4.pl
+* lpiverify4_types.pl
+* lpiverify4_open.pl
+* lpiverify4_open_types.pl
+
+respectively.
+
+* Run Prolog tests:
+```
+['lpiverify_pl.pl'].
+test_pl1(off,A,B).
+```
+
+* See note 1 above.
+
+To run all tests (main, types, open and open types) back-translating to and from any language:
+```
+test_all_bt00("en2",off,NTotal,Score).
+```
+
+To run a test from one of main, types, open or open types, run one of:
+```
+test_all_bt01(test,            4,"en2",off,1,Passed).
+test_all_bt01(test_types_cases,6,"en2",off,1,Passed).
+test_all_bt01(testopen_cases,  3,"en2",off,1,Passed).
+test_all_bt01(test_open_types, 5,"en2",off,1,Passed).
+```
+where 1 is replaced with the test number from
+
+* lpiverify4.pl
+* lpiverify4_types.pl
+* lpiverify4_open.pl
+* lpiverify4_open_types.pl
+
+respectively.
+
+* See note 1 above.
+
+# Documentation
+
+See List Prolog Documentation and List Prolog Caveats.
+
+# Text to Breasonings
+
+Text to Breasonings now has its own repository.
+
+# Data to Types Documentation
+
+See Data to Types Documentation.
+
+# LPI API
+
+* To run LPI on a Prolog server:
+* Move `lpi-api.pl` to the root (`/username/` or `~` on a server) of your machine.
+* Re-enter the paths to your Prolog files in it.
+* Enter `['lpi-api.pl'].` in SWI-Prolog, then `server(8000).`.
+* On a local host, access the algorithm at `http://127.0.0.1:8000`; on a remote server, replace 127.0.0.1 with your server address. 
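+
+Put together, a server session might look like the following (assuming `lpi-api.pl` has already been moved to your home directory and its paths updated, as above):
+
+```
+swipl
+?- ['lpi-api.pl'].
+?- server(8000).
+```
+
+The server then stays running in that SWI-Prolog session, and the algorithm can be accessed from a browser at `http://127.0.0.1:8000` (or your server address).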
+ +# Diagram of List Prolog Converters + +Diagram of List Prolog Converters + +``` +* Above: Cycle of Normal Prolog e.g. a(B,C):-d(E),f. to Simple List Prolog e.g. +[[f1, [a, b, c, d, e], (:-), +[[+, [a, b, f]], +[+, [c, f, g]], +[+, [d, g, h]], +[=, [e, h]]]]] +(Prolog-to-List-Prolog Converter), to List Prolog e.g. +[ +[[n,function],[[v,a],[v,b],[v,c]],":-", +[ + [[n,+],[[v,a],[v,b],[v,c]]] + ] +] +] +(Simple-List-Prolog-to-List-Prolog) back to Normal Prolog (List-Prolog-to-Prolog-Converter). +``` + +See Simple-List-Prolog-to-List-Prolog, Prolog-to-List-Prolog (includes Prolog to Simple List Prolog) and List-Prolog-to-Prolog-Converter. + +# Occurs Check + +* To return false if an equals4 variable is in terms of itself, for example: + +``` +occurs_check([v,a],[[v,a]]). +false. +``` + +* then enter `turn_occurs_check(on).`. To return `true.` above, indicating any occurrences have been ignored, enter `turn_occurs_check(off).` (the default). + +# List Prolog Interpreter with Prolog queries + +* Run List Prolog Interpreter with Prolog queries. + +# Versioning + +We will use SemVer for versioning. + +# Authors + +Lucian Green - Initial programmer - Lucian Academy + +# License + +I licensed this project under the BSD3 License - see the LICENSE.md file for details diff --git a/listprologinterpreter/a.out b/listprologinterpreter/a.out new file mode 100644 index 0000000000000000000000000000000000000000..7f0aeb54535dc8180f9f5938767725f6773e3687 Binary files /dev/null and b/listprologinterpreter/a.out differ diff --git a/listprologinterpreter/additional_lpi_commands.md b/listprologinterpreter/additional_lpi_commands.md new file mode 100644 index 0000000000000000000000000000000000000000..58ad1626f9ecb17eac7caf2baa407c5a6db740ed --- /dev/null +++ b/listprologinterpreter/additional_lpi_commands.md @@ -0,0 +1,248 @@ +# Additional LPI Commands + +* NB. Change two backslashes to one when copying. +--- + * Run C in List Prolog. + * [[n,shell_c],[[I,P,OVar]]] + * e.g. 
[[n,shell_c],["a","#include\n#include\n\nint main(void)\n{\n char str1[20];\n\n scanf(\"%19s\", str1);\n \n printf(\"%s\\n\", str1);\n return 0;\n}",[v,o]]] + * I - inputs, e.g. "a" is in file input.txt + * P - the C source file (see example above). Escape quotes. + * OVar - e.g. [v,o], the List Prolog variable to return output to. Take care to return the correct variable because it is hard to debug shell commands. + --- + * Run Prolog in List Prolog. + * [[n,shell_pl],[[I,QP,QV,P,OVar]]] + * e.g. [[n,shell_pl],[[1,1],"a","A,A1","B is A+A1,B1 is A-A1,write([B,B1]).",[v,o]]] + * I - inputs, e.g. [1,1] means A=1, A1=1 + * QP - query predicate, e.g. "a", which is used in the call and to build the predicate. + * QV - query variables, e.g. "A,A1", which are used in the call and to build the predicate. + * P - e.g. "B is A+A1,B1 is A-A1,write([B,B1]).", the body of the predicate. Convert result to an atom with term_to_atom/2 (not in example because it is not necessary). Return output from Prolog with write/1. + * OVar - e.g. [v,o], the List Prolog variable to return output to. Take care to return the correct variable because it is hard to debug shell commands. + --- + * [[n,phrase_from_file],[[[n,string],[[v,out]]],[v,path]]] + * [v,out] - replace with variable to save string to + * [v,path] - atom file path to read + --- + * [[n,text_area +],["rows=\\"4\\" style=\\"width:100%\\"","a",[v,s]]] + * "rows=\\"4\\" style=\\"width:100%\\"" - text area parameters. + * "a" - replace with value already in field + * [v,s] - text entered in field +--- + * [[n,date_time_stamp],[Year,Month,Day,Hour,Minute,Seconds,Seconds2,Variable1]], + * Converts the date and time to a number stamp +--- + * +[[n,interpret],[[lang,"en"/same],debug is on/off/same,[[n,predicate],[[v,variable],...]],Predicates]] - Legacy List Prolog Interpreter call for SSI with language code, debug or trace mode, predicate and variable query and predicates to call. 
+ * For example, https://github.com/luciangreen/Philosophy/blob/master/paraphraser1_lp.pl +--- +* In the following, ["predicate_name",[i,o]] is called as [[n,predicate_name],[[v,i],[v,o]]] + + ["assertz",[i]], + ["getz",[o]], - get value of an asserted variable + ["retractall",[o]], - e.g. retractall(a). + + * e.g. assertz(a(1)),assertz(a(2)),getz(a(A)). + * A=[1,2]. + + ["term_to_atom",[i,i]], + ["term_to_atom",[i,o]], + ["term_to_atom",[o,i]], + + ["open_file_s",[i,o]], + ["open_string_file_s",[i,o]], + + ["split_on_substring117a",[i,i,i,o]], + ["string_strings",[i,o]], + + ["working_directory",[i,i]], + ["working_directory",[o,i]], + ["directory_files",[o]], + ["exists_directory",[i]], + ["exists_file",[i]], + + ["split_string",[i,i,i,o]], + + ["number_string",[i,i]], + ["number_string",[i,o]], + ["number_string",[o,i]], + + ["get_single_char",[i]], + ["get_single_char",[o]], + + ["word1",[i]], + + ["term_to_atom",[i,o]], + ["term_to_atom",[o,i]], + ["term_to_atom",[i,i]], + + ["shell",[i]], + + ["read_password",[i]], + ["read_password",[o]], + + ["string_chars",[i,i]], + ["string_chars",[i,o]], + ["string_chars",[o,i]], + ["atom_chars",[i,i]], + ["atom_chars",[i,o]], + ["atom_chars",[o,i]], + ["atom_codes",[i,i]], + ["atom_codes",[i,o]], + ["atom_codes",[o,i]], + ["atom_concat",[i,i,i]], + ["atom_concat",[i,o,i]], + ["atom_concat",[o,i,i]], + ["atom_concat",[o,o,i]], + ["atom_concat",[i,i,o]], + ["atomic",[i]], + ["atom_length",[i,i]], + ["atom_length",[i,o]], + ["sub_atom",[i,i,i,i,i]], + ["sub_atom",[i,i,i,i,o]], + ["sub_atom",[i,i,i,o,i]], + ["sub_atom",[i,i,i,o,o]], + ["sub_atom",[i,i,o,i,i]], + ["sub_atom",[i,i,o,i,o]], + ["sub_atom",[i,i,o,o,o]], + ["sub_atom",[i,o,i,i,i]], + ["sub_atom",[i,o,i,i,o]], + ["sub_atom",[i,o,i,o,i]], + ["sub_atom",[i,o,i,o,o]], + ["sub_atom",[i,o,o,i,i]], + ["sub_atom",[i,o,o,i,o]], + ["sub_atom",[i,o,o,o,i]], + ["sub_atom",[i,o,o,o,o]], + ["sub_string",[i,i,i,i,i]], + ["sub_string",[i,i,i,i,o]], + ["sub_string",[i,i,i,o,i]], 
+ ["sub_string",[i,i,i,o,o]], + ["sub_string",[i,i,o,i,i]], + ["sub_string",[i,i,o,i,o]], + ["sub_string",[i,i,o,o,o]], + ["sub_string",[i,o,i,i,i]], + ["sub_string",[i,o,i,i,o]], + ["sub_string",[i,o,i,o,i]], + ["sub_string",[i,o,i,o,o]], + ["sub_string",[i,o,o,i,i]], + ["sub_string",[i,o,o,i,o]], + ["sub_string",[i,o,o,o,i]], + ["sub_string",[i,o,o,o,o]], + + ["char_code",[i,i]], + ["char_code",[i,o]], + ["char_code",[o,i]], + ["number_chars",[i,i]], + ["number_chars",[i,o]], + ["number_chars",[o,i]], + ["number_codes",[i,i]], + ["number_codes",[i,o]], + ["number_codes",[o,i]], + ["close",[i,i]], + ["close",[i]], + ["stream_property",[i,i]], + ["stream_property",[i,o]], + ["stream_property",[o,i]], + ["stream_property",[o,o]], + + ["at_end_of_stream",[]], + ["at_end_of_stream",[i]], + ["set_stream_position",[i,i]], + ["compound",[i]], + ["copy_term",[i,i]], + ["copy_term",[i,o]], + ["copy_term",[o,i]], + ["copy_term",[o,o]], + ["current_prolog_flag",[i,i]], + ["current_prolog_flag",[i,o]], + ["current_prolog_flag",[o,i]], + ["current_prolog_flag",[o,o]], + ["current_input",[i]], + ["current_input",[o]], + ["current_output",[i]], + ["current_output",[o]], + ["float",[i]], + ["get_byte",[i,i]], + ["get_byte",[i,o]], + ["get_byte",[i]], + ["get_byte",[o]], + ["peek_byte",[i,i]], + ["peek_byte",[i,o]], + ["peek_byte",[i]], + ["peek_byte",[o]], + ["put_byte",[i,o]], + ["put_byte",[o,o]], + ["put_byte",[i]], + ["put_byte",[o]], + + ["peek_char",[i,i]], + ["peek_char",[i,o]], + ["peek_char",[i]], + ["peek_char",[o]], + + ["peek_code",[i,i]], + ["peek_code",[i,o]], + ["peek_code",[i]], + ["peek_code",[o]], + + ["get_char",[i]], + ["get_char",[o]], + ["get_char",[i,i]], + ["get_char",[i,o]], + ["get_code",[i]], + ["get_code",[o]], + ["get_code",[i,i]], + ["get_code",[i,o]], + + ["halt",[]], + ["halt",[i]], + + ["set_prolog_flag",[i,i]], + ["integer",[i]], + ["set_input",[i]], + ["set_output",[i]], + ["open",[i,i,o,i]], + ["open",[i,i,o]], + ["read",[i,o]], + 
["read",[i,i]], + ["nonvar",[i]], + + ["sin",[i,i]], + ["sin",[i,o]], + ["cos",[i,o]], + ["cos",[i,i]], + ["atan",[i,i]], + ["atan",[i,o]], + ["log",[i,i]], + ["log",[i,o]], + ["sqrt",[i,i]], + ["sqrt",[i,o]], + + ["put_char",[i,i]], + ["put_char",[i]], + ["put_code",[i,i]], + ["put_code",[i]], + ["nl",[]], + ["nl",[i]], + + ["read_term",[i,i,i]], + ["read_term",[i,o,i]], + ["read_term",[i,i]], + ["read_term",[i,o]], + ["read",[i,i]], + ["read",[i,o]], + ["read",[i]], + ["read",[o]], + + ["write",[i,i]], + ["write",[i]], + ["writeq",[i,i]], + ["writeq",[i]], + + ["abs",[i,i]], + ["abs",[i,o]], + ["sign",[i,i]], + ["sign",[i,o]], + ["floor",[i,i]], + ["floor",[i,o]], + ["round",[i,i]], + ["round",[i,o]] diff --git a/listprologinterpreter/collect_arguments.pl b/listprologinterpreter/collect_arguments.pl new file mode 100644 index 0000000000000000000000000000000000000000..ef6879dd45a2d3bbcbc45c373ca7a871b87f50ed --- /dev/null +++ b/listprologinterpreter/collect_arguments.pl @@ -0,0 +1,208 @@ +%symbol(Symbol,Symbol) :-!. + +%%slp2lp_variables(Name1,[v,Name1]) :- predicate_or_rule_name(Name1),!. +%slp2lp_variables(Name,Name) :- !. + +/** +collect_arguments_body(Body1,Body2) :- + findall(*,(member(Statement1,Body1 +collect_arguments_body(Body1,[],Body2) :- + +**/ + +%%predicate_or_rule_name([A,B]) :- atom(A),is_list(B),!. +%predicate_or_rule_name([V_or_n,_Name]) :- (V_or_n=v->true;V_or_n=n),!.%%,atom(Name),!. +%% x: predicate_or_rule_name(V_or_n) :- (V_or_n=v->true;V_or_n=n),fail,!. + +collect_arguments_body2([],N,N):-!.%%,Body3 + + +%%collect_arguments_body2([],Body,Body) :- !. 
+collect_arguments_body2(Body1,Body2,Body3) :- + Body1=[[Statements1|Statements1a]|Statements2 + ], + + not(predicate_or_rule_name(Statements1)), + %Number1a is Number1+1, +collect_arguments_body2([Statements1],Body2,Body4), %% 2->1 + + collect_arguments_body2(Statements1a,Body4,Body5), + collect_arguments_body2(Statements2,Body5,Body3), + +%% append([Body3,Body4],Body6), +%% append([[Body6],Body5],Body2), + + + !. + + + + +collect_arguments_body2(Body1,Body2,Body3) :- +get_lang_word("n",Dbw_n), +get_lang_word("not",Dbw_not), + + Body1=[[[Dbw_n,Dbw_not],Statement]|Statements2 %% [] removed from Statement + ], + %Number1a is Number1+1, + collect_arguments_body2([Statement],Body2,Body4), + collect_arguments_body2(Statements2,Body4,Body3), + %append([Number1,%%*, + %[n,not]],Body3,Body5), + %append([Body5],Body4 + % ,Body2), + + !. + + + + +collect_arguments_body2(Body1,Body2,Body3) :- +get_lang_word("n",Dbw_n), +get_lang_word("or",Dbw_or), + Body1=[[[Dbw_n,Dbw_or],[Statements1,Statements2]]|Statements3], + %Number1a is Number1+1, + collect_arguments_body2([Statements1],Body2,Body4), + collect_arguments_body2([Statements2],Body4,Body5), + collect_arguments_body2(Statements3,Body5,Body3), + %append(Body3,Body4,Body34), + %Body6=[Number1,[n,or],Body34 + %], + %append([Body6],Body5,Body2), + !. + + +collect_arguments_body2(Body1,Body2,Body3) :- +get_lang_word("n",Dbw_n), + + Body1=[[[Dbw_n,"->"],[Statements1,Statements2]]|Statements3], + %Number1a is Number1+1, + collect_arguments_body2([Statements1],Body2,Body4), + collect_arguments_body2([Statements2],Body4,Body5), + + collect_arguments_body2(Statements3,Body5,Body3), + %append(Body3,Body4,Body34), + %Body6=[Number1,[n,"->"],Body34 + %], + %append([Body6],Body5,Body2), + + !. 
+ + + + +collect_arguments_body2(Body1,Body2,Body3) :- +get_lang_word("n",Dbw_n), + Body1=[[[Dbw_n,"->"],[Statements1,Statements2,Statements2a]]|Statements3], + %Number1a is Number1+1, + collect_arguments_body2([Statements1],Body2,Body4), + collect_arguments_body2([Statements2],Body4,Body5), + %%trace, + collect_arguments_body2([Statements2a],Body5,Body6), + collect_arguments_body2(Statements3,Body6,Body3), + %append_list2([Body3,Body4,Body5],Body345), + %Body7=[Number1,[n,"->"],Body345], + %append([Body7],Body6,Body2), + !. + + +collect_arguments_body2(Body1,Body2,Body3) :- + Body1=[Statement|Statements], + not(predicate_or_rule_name(Statement)), + collect_arguments_statement1(Statement,Body2,Body4), + collect_arguments_body2(Statements,Body4,Body3), + %append_list2([Result1,Result2],Body2), + !. + +collect_arguments_statement1(Statement,Arguments1,Arguments2) :- +get_lang_word("n",Dbw_n), + + ((Statement=[[Dbw_n,_Name],Arguments], + %trace, + recursive_collect_arguments(Arguments,Arguments1,Arguments2) + +%findall(Argument,(member(Argument,Arguments),variable_name(Argument)),Arguments3), + %append(Arguments1,Arguments3,Arguments2) + %Arguments=Result2, + %findall(Argument,(member(Argument,Arguments),(predicate_or_rule_name(Argument))),Result2), + %Result1=[[Number1,[n,Name],Result2]] + )->true; + (Statement=[[Dbw_n,_Name]], + Arguments1=Arguments2)). + +recursive_collect_arguments([],Arguments,Arguments) :- !. +recursive_collect_arguments(Statement,Arguments1,Arguments2) :- + Statement=[Statement1|Statement2], + (variable_name(Statement1)->append(Arguments1,[Statement1],Arguments3);(expression_not_var(Statement1)->Arguments1=Arguments3; + recursive_collect_arguments(Statement1,Arguments1,Arguments3))), + recursive_collect_arguments(Statement2,Arguments3,Arguments2). 
+%recursive_collect_arguments(Statement,Arguments1,Arguments2) :- + %variable_name(Statement)-> + +%/* +occurs_check(Variable1,Variables2) :- +%trace, + (occurs_check(on)-> + ((Variable1=Variables2->true; + (occurs_check2([Variable1],[Variables2])->fail;true))); + true). + + + +contains_var(_,[]) :- fail. + +contains_var(Var,Statement) :- + +(Var=Statement->true; (Statement=[Statement1|Statement2], + (Var=Statement1->true; + (contains_var(Var,Statement1)->true; + contains_var(Var,Statement2))))). + + +% if a=b(a) then fail + occurs_check2([],[]) :- true. + +occurs_check2(Variables1,Variables2) :- + +(Variables1=Variables2->fail; + +/* +((variable_name(Variables1), +not(variable_name(Variables2)), +contains_var1(Variables1,Variables2))->true; + +(variable_name(Variables2), +not(variable_name(Variables1)), +contains_var1(Variables2,Variables1)))), +*/ + ((Variables1=[Statement1a|Statement2a], +Variables2=[Statement1b|Statement2b]), + (Statement1a=Statement1b->fail; + +((variable_name(Statement1a), +not(variable_name(Statement1b)), +contains_var(Statement1a,Statement1b))->true; + +((variable_name(Statement1b), +not(variable_name(Statement1a)), +contains_var(Statement1b,Statement1a))->true; + + + occurs_check2(Statement1a,Statement1b)))), + occurs_check2(Statement2a,Statement2b))). + + + + +contains_empty([]) :- fail. +contains_empty(Statement) :- + +(is_empty(Statement)->true; (Statement=[Statement1|Statement2], + (is_empty(Statement1)->true; + (contains_empty(Statement1)->true; + contains_empty(Statement2))))). + + length_is_list(A,B) :- + is_list(A),length(A,B),!. + + diff --git a/listprologinterpreter/curly_brackets.pl b/listprologinterpreter/curly_brackets.pl new file mode 100644 index 0000000000000000000000000000000000000000..f1ed255909591ed521c92d0a9a486ee9fdbc3dac --- /dev/null +++ b/listprologinterpreter/curly_brackets.pl @@ -0,0 +1,76 @@ +conjunction_list(C, [C]) :- not(C = (_,_)). +conjunction_list(C, [P|R]) :- C = (P,Q), conjunction_list(Q, R). 
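`conjunction_list/2` above unwinds a right-nested `','/2` term such as `(a,b,c)` into the list `[a,b,c]`, and `square_to_round/2` runs the same relation the other way. A hedged Python sketch of both directions, using nested 2-tuples as stand-ins for Prolog comma terms (and assuming the individual goals are not themselves 2-tuples):

```python
def conjunction_to_list(conj):
    # ("a", ("b", "c")) -> ["a", "b", "c"]; a lone goal maps to a
    # one-element list, mirroring conjunction_list/2's two clauses.
    out = []
    while isinstance(conj, tuple) and len(conj) == 2:
        head, conj = conj
        out.append(head)
    out.append(conj)
    return out

def list_to_conjunction(items):
    # Inverse direction, as square_to_round/2 uses it: fold the list
    # back into a right-nested pair structure.
    conj = items[-1]
    for item in reversed(items[:-1]):
        conj = (item, conj)
    return conj
```

The two functions are inverses, so `list_to_conjunction(conjunction_to_list(c))` reproduces `c` for any well-formed conjunction.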
+ + +curly_square({}, []). +curly_square({C}, L) :- once(conjunction_list(C, L)). + +curly_head_taila(Head1,C1,C2) :- + curly_square(Head1,[C1|C2]). + +square_to_round(A,B) :- + conjunction_list(B,A),!. + +round_to_curly(A,B) :- + square_to_round(C,A),curly_square(B,C),!. +/* +t_or_empty([t,_]). +t_or_empty({[t,_]}). +t_or_empty([]). + + +%curly_head_taila(T,H,Ta),%,append(List1,T,List2), +%!. + +curly_head_taila(Head1,C1,C2) :- + + Head1 =.. [{}, Round_bracket_list], + + curly_head_tail1a(Round_bracket_list,[],[Head1a|Head1b]), + + (Round_bracket_list =.. [{}, _]-> (square_to_curly(Head1a,C1),( + %(Head1b =.. [{}, _]-> square_to_curly(Head1b,C2); + Head1b=C2))%) + ; + [Head1a|Head1b]=[C1|C2]),!. + + +curly_head_tail1a(T,List1,List2) :- t_or_empty(T),append(List1,[T],List2),!. + +%curly_head_tail1([T],List1,List2) :- t_or_empty(T),append(List1,%[[T]],List2),!. + +curly_head_tail1a(Round_bracket_list,List1,List2) :- + curly_head_taila(Round_bracket_list,List3,List4), + append(List1,[[List3|List4]],List2). + +curly_head_tail1a(Round_bracket_list,List1,List2) :- + + %t_or_empty(T), + + Round_bracket_list =.. [',', T, B], + %curly_head_tail(T,_,_), + append(List1,[T],List3), + (B=..[{}, _]->append(List3,[B],List2); + curly_head_tail1a(B,List3,List2)),!. +% + + +square_to_curly(A,B) :- + square_to_round(A,C),round_to_curly(C,B). + +round_to_curly(A,B) :- + B=..[{},A]. + +% square_to_round([1,2,3],A). +% A = (1, 2, 3). + +square_to_round(A,C) :- + %t_or_empty(T), + ((A=[D]%,D=T + )-> + C=(D); + (A=[D|E],%not(T=A), + square_to_round(E,F), + C=..[',',D,F])),!. + +*/ \ No newline at end of file diff --git a/listprologinterpreter/data_to_types.pl b/listprologinterpreter/data_to_types.pl new file mode 100644 index 0000000000000000000000000000000000000000..5040ef82a7d5d91c66159e25787c8108be4d5ab9 --- /dev/null +++ b/listprologinterpreter/data_to_types.pl @@ -0,0 +1,65 @@ +%% data_to_types.pl + +/** + +?- data_to_types([[1],"a",1],[],T). 
+T = [[[t, brackets], [[t, number]]], [t, string], [t, number]]. + +?- data_to_types([1,["a"],1],[],T). +T = [[t, number], [[t, brackets], [[t, string]]], [t, number]]. + +?- data_to_types([1,"a",[1]],[],T). +T = [[t, number], [t, string], [[t, brackets], [[t, number]]]]. + +**/ + +data_to_types([],Types,Types) :- !. +data_to_types(Data,Types1,Types2) :- +get_lang_word("t",T), +get_lang_word("number",Dbw_number), + number(Data),append(Types1,[[T,Dbw_number]],Types2),!. +data_to_types(Data,Types1,Types2) :- +get_lang_word("t",T), +get_lang_word("string",Dbw_string), + string(Data),append(Types1,[[T,Dbw_string]],Types2),!. +data_to_types(Data1,Types1,Types2) :- +get_lang_word("t",T), +get_lang_word("brackets",Dbw_brackets), + Data1=[[Data2]], + data_to_types(Data2,[],Types4), + Types5=[[[T,Dbw_brackets],Types4]], + append_list([Types1,Types5],Types2),!. +data_to_types(Data1,Types1,Types2) :- +get_lang_word("t",T), +get_lang_word("brackets",Dbw_brackets), + Data1=[[Data2]|Data4], + data_to_types(Data2,[],Types4), + Types5=[[[T,Dbw_brackets],Types4]], + data_to_types(Data4,[],Types6), + append_list([Types1,Types5,Types6],Types2),!. +data_to_types(Data1,Types1,Types2) :- +get_lang_word("t",T), +get_lang_word("brackets",Dbw_brackets), + Data1=[[Data2|Data3]|Data4], + data_to_types(Data2,[],Types3), + data_to_types(Data3,Types3,Types4), + Types5=[[[T,Dbw_brackets],Types4]], + data_to_types(Data4,[],Types6), + append_list([Types1,Types5,Types6],Types2),!. +data_to_types(Data1,Types1,Types2) :- + Data1=[Data2|Data3], + data_to_types(Data2,Types1,Types3), + data_to_types(Data3,Types3,Types2),!. + %%Types2=[Types4]. + +append_list(A1,B):- + append_list(A1,[],B),!. + +%/* + +append_list([],A,A):-!. +append_list(List,A,B) :- + List=[Item|Items], + append(A,Item,C), + append_list(Items,C,B). 
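The `data_to_types/3` clauses above map concrete data to type terms: numbers become `[t,number]`, strings become `[t,string]`, and nested lists are wrapped in `[t,brackets]` around the types of their contents. A Python re-sketch that reproduces the documented query results, using `int`/`float` and `str` in place of Prolog numbers and strings (the simplified `get_lang_word` indirection is dropped; `"t"` is used directly):

```python
def data_to_types(data):
    # Illustrative re-sketch of data_to_types/3: classify each item,
    # recursing into sublists and tagging them with [t, brackets].
    types = []
    for item in data:
        if isinstance(item, (int, float)):
            types.append(["t", "number"])
        elif isinstance(item, str):
            types.append(["t", "string"])
        elif isinstance(item, list):
            types.append([["t", "brackets"], data_to_types(item)])
    return types
```

This matches the examples in the comment block, e.g. `data_to_types([[1],"a",1])` gives `[[["t","brackets"],[["t","number"]]], ["t","string"], ["t","number"]]`.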
+%*/ \ No newline at end of file diff --git a/listprologinterpreter/e4_fa_get_vals.pl b/listprologinterpreter/e4_fa_get_vals.pl new file mode 100644 index 0000000000000000000000000000000000000000..a659e0af9dcf9248bcc054ce972690f1b99ded2d --- /dev/null +++ b/listprologinterpreter/e4_fa_get_vals.pl @@ -0,0 +1,358 @@ + +e4_fa_getvalues2([],Values,Values,_Vars,Flags,Flags) :- !. +e4_fa_getvalues2(VarNames1,Values1,Values2,Vars,Flags1,Flags2) :- + VarNames1=[VarName1|VarNames2], + (VarName1=[VarName2]->Flag1=true;VarName2=VarName1), + e4_fa_getvalue(VarName2,Value1,Vars), + (is_empty(Value1)->Flag2=true;(Value2=Value1,Flag2=false)), + (Flag1=true->Value3=[Value2];Value3=Value2), + append(Values1,Value3,Values3), + append(Flags1,[Flag2],Flags3), + e4_fa_getvalues2(VarNames2,Values3,Values2,Vars,Flags3,Flags2),!. + +e4_fa_getvalues(Variable1,Variable2,Value1,Value2,Vars) :- + e4_fa_getvalue(Variable1,Value1,Vars), + e4_fa_getvalue(Variable2,Value2,Vars). +e4_fa_getvalues(Variable1,Variable2,Variable3,Value1,Value2,Value3,Vars) :- + e4_fa_getvalue(Variable1,Value1,Vars), + e4_fa_getvalue(Variable2,Value2,Vars), + e4_fa_getvalue(Variable3,Value3,Vars). + + +%e4_fa_getvalue(Variable,_,_Vars) :- +% var(Variable),!. + +e4_fa_getvalue(Variable,Value,Vars) :- + ((not(isvar(Variable)),isvalstrorundef(Value),Variable=Value)->true; + (isvar(Variable),isvalstrorundef(Value),e4_fa_getvar(Variable,Value,Vars))). +%putvalue(empty,A,Vars,Vars) :- %writeln1(A), +% not(isvar(A)),!. +%putvalue(Variable,Value,Vars1,Vars1) :- +%trace, +% isvar(Variable),isvar(Value),!. +putvalue(Variable,Value,Vars1,Vars2) :- +%writeln1(putvalue(Variable,Value,Vars1,Vars2)), +%trace, + ((not(isvar(Variable)),isvalstrorundef_or_compound(Value),Variable=Value,Vars1=Vars2)->true; + (isvar(Variable),isvalstrorundef_or_compound(Value),updatevar(Variable,Value,Vars1,Vars2))),!. +isvalstrorundef_or_compound(Value):- + (compound(Value)->true; + isvalstrorundef(Value)),!. 
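`e4_fa_getvalue/3` and `e4_fa_getvar/3` above look a variable up in the interpreter's `Vars` association list, where a missing entry or an `empty` value both mean "still unbound", in which case the variable itself is returned. A minimal Python sketch of that lookup; the `[[Name, Value], ...]` shape follows the Prolog code, and the `EMPTY` sentinel is an illustrative stand-in for `is_empty/1`:

```python
EMPTY = "empty"  # stand-in for the interpreter's notion of an empty binding

def getvalue(variable, vars_list):
    # Mirrors e4_fa_getvar/3: return the bound value if a non-empty
    # binding exists, otherwise hand back the variable unchanged.
    for name, value in vars_list:
        if name == variable and value != EMPTY:
            return value
    return variable
```

So a bound variable resolves to its value, while an unknown or empty-bound variable passes through as itself, ready to be bound later by `putvalue`.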
+e4_fa_getvar(Variable,Value,Vars) :- + ((member([Variable,Value],Vars), + not(is_empty(Value)))->true; + ((aggregate_all(count,member([Variable,_Value],Vars),0)->true;%% + (member([Variable,Empty],Vars),is_empty(Empty))),Value=Variable)) +. +e4_fa_getvar(undef,undef,_Vars) :- + !. + +e4_fa_val1emptyorvalsequal(_Empty,_Value) :- true, !. + +%e4_fa_val1emptyorvalsequal(Value,Value) :- +% not(Value=empty). + + +collect_vars([],Vars,Vars) :- !. +collect_vars(Term,Vars1,Vars2) :- + not(variable_name(Term)), + Term=[Term1|Term2], + collect_vars(Term1,Vars1,Vars3), + collect_vars(Term2,Vars3,Vars2),!. +collect_vars(Term,Vars1,Vars2) :- + variable_name(Term), + append(Vars1,[Term],Vars2),!. +collect_vars(Term,Vars,Vars) :- + not(variable_name(Term)),!. + +equals4_first_args(Variable1,Variable2,First_args2) :- +%trace, + length(Variable1,Length), + +equals4_first_args1(1,Length,Variable1,Variable2,[],First_args2),!. + +equals4_first_args1(Length2,Length1,[],[],First_args,First_args) :- + Length2 is Length1+1,!. 
+equals4_first_args1(Length0,Length1,Variable1,Variable2,First_args0,First_args01) :- + Variable1=[Item1|Item1a], + Variable2=[Item2|Item2a], + %numbers(Length,1,[],N), + %findall(First_args%First_args1 + %,(member(N1,N), + %get_item_n(Variable1,N1,Item1), + %get_item_n(Variable2,N1,Item2), + %trace, + %e4_fa_match4_2([Item1],[Item2],[],First_args1), + match4_new_22([Item1],[Item2],[],First_args1%,e4 + ), + + collect_vars(Item2,[],Vars2), + %trace, + findall(Value2,(member([First_args5,Value],First_args1), + (not(member(First_args5,Vars2))->Value2=[First_args5,Value]; + (not(expression_not_var(Value)),Value2=[Value,First_args5]))),First_args6), + + %(First_args2=[]->First_args6=First_args2;First_args6=First_args2), + %maplist(append,First_args6,[First_args]), + %*append(First_args6,[First_args), + %*),First_args3), + %trace, + %maplist(append,[First_args3],[First_args4]), + %*delete(First_args3,[],First_args2), + + append(First_args0,First_args6,First_args02), + + Length2 is Length0+1, + equals4_first_args1(Length2,Length1,Item1a,Item2a,First_args02,First_args01), + + !. + /* + collect_vars(Item1,[],Vars1), + collect_vars(Item2,[],Vars2), + %trace, + findall(First_args5,(member([First_args3,Value],First_args1), + (member(First_args3,Vars1)->First_args5=[First_args3,Value]; + ((member(First_args3,Vars2),not(Vars1=[]))->First_args5=[Value,First_args3]->true; + First_args5=[First_args3,Value]))),First_args6), +% maplist(append,[First_args0],[First_args]), + %trace, + findall([First_args3,Value],(member([First_args3,Value],First_args6), + (not(member(First_args3,Vars2))->true; + (member(First_args3,Vars1),member(First_args3,Vars2))) + ),First_args0), + maplist(append,[First_args0],[First_args]) + ),First_args4), + delete(First_args4,[],First_args2), + %maplist(append,[First_args4],[First_args2]), + !. +*/ + +e4_updatevars([],_,Vars2,Vars2) :- !. 
+e4_updatevars(FirstArgs,Vars1,Vars2,Vars3) :- +%writeln1(e4_updatevars(FirstArgs,Vars1,Vars2,Vars3)), +%trace, + %get_lang_word("v",Dbw_v), + FirstArgs=[[Orig,New]|Rest], + (expression_not_var(New)->append(Vars2,[[Orig,New]],Vars5); + (%trace, + replace_vars([Orig],[],[Orig1],[],First_vars1), + + %member([New,Value],Vars1), + (not(expression_not_var(Orig1))-> + ( remember_and_turn_off_debug(Debug), +%trace, +%find_sys(Sys_name), + + + (interpretpart(match4,Orig1,New,Vars1,Vars4,_)->true;(turn_back_debug(Debug),fail)), + %trace, + collect_vars(Orig1,[],Orig2), + findall([O1,O2],(member([O1,O2],Vars4),member(O1,Orig2)),Vars6), + %subtract(Vars4,Vars1,Vars6), + %getvalue(Orig,Value,Vars4), + + turn_back_debug(Debug), + + replace_first_vars1(Vars6,First_vars1,[],Vars61), + + append(Vars2,Vars61,Vars5))); + Vars2=Vars5)), + + %append(Vars2,[[Orig,Value]],Vars4))), + e4_updatevars(Rest,Vars1,Vars5,Vars3),!. + + +/* +replace_vars(Term,Vars1,Vars2,First_vars1,First_vars2) :- + not(variable_name(Term)), + Term=[[Term1|Term1a]|Term2], + not(variable_name([Term1|Term1a])), + replace_vars([Term1a],Vars1,Vars3,First_vars1,First_vars3), + replace_vars(Term1,Vars3,Vars4,First_vars3,First_vars4), + replace_vars(Term2,Vars4,Vars2,First_vars4,First_vars2),!. + */ +%replace_vars(Term,Vars1,Vars2,First_vars1,First_vars2) :- +% replace_vars0(Term,Vars1,Vars2,First_vars1,First_vars2),!. + + +replace_vars(Term,_Vars1,X,First_vars1,First_vars2) :- + % for first vars only + replace_vars0(Term,First_vars1,First_vars2), + % for vars + replace_vars01(Term,X,First_vars2),!. + +replace_vars0([],Variable,Variable,First_vars,First_vars) :- !. + +replace_vars0(Term,First_vars,First_vars) :- + expression_not_var(Term), + !. 
+replace_vars0(Term,First_vars1,First_vars2) :- + not(variable_name(Term)), + Term=[Term1|Term2], + replace_vars0(Term1,First_vars1,First_vars3), + %Vars5=[Vars3], + replace_vars0(Term2,First_vars3,First_vars2), + %Vars5=[Vars4], + %trace, + %append(Vars1],Vars3,Vars4],Vars2), + %append(Vars1,Vars5,Vars6), + %append(Vars6,Vars4,Vars2), + %maplist(append,[[Vars1],Vars3,Vars4],Vars2), + !. +replace_vars0(Term,First_vars1,First_vars2) :- + get_lang_word("v",Dbw_v1),Dbw_v1=Dbw_v, + variable_name(Term), + %( + (member([Term,[Dbw_v,_Var_name1]],First_vars1)-> + (First_vars1=First_vars2); + ((Term=[_,'_']->Var_name2='_';find_sys(Var_name2)), + append(First_vars1,[[Term,[Dbw_v,Var_name2]]],First_vars2))),%)), + !. + +replace_first_vars1([],_First_vars,Vars,Vars) :- !. +replace_first_vars1(Vars1,First_vars,Vars2,Vars3) :- + %get_lang_word("v",Dbw_v1),Dbw_v1=Dbw_v, + Vars1=[[Var_name,Value]|Vars5], + (member([Term,Var_name],First_vars)-> + append(Vars2,[[Term,Value]],Vars4); + append(Vars2,[[Var_name,Value]],Vars4)), + replace_first_vars1(Vars5,First_vars,Vars4,Vars3),!. + +replace_first_vars2([],_First_vars,Vars,Vars) :- !. +replace_first_vars2(Vars1,First_vars,Vars2,Vars3) :- + %get_lang_word("v",Dbw_v1),Dbw_v1=Dbw_v, + Vars1=[[Var_name1,Var_name2]|Vars5], + %(member([Term1,Var_name1],First_vars)-> + %Term2=Term1;Term2=Var_name1), + replace_first_vars211(Var_name1,First_vars,Term2), + %(member([Term3,Var_name2],First_vars)-> + %Term4=Term3;Term4=Var_name2), + replace_first_vars211(Var_name2,First_vars,Term4), + append(Vars2,[[Term2,Term4]],Vars4), + replace_first_vars2(Vars5,First_vars,Vars4,Vars3),!. + + +%* + +%replace_first_vars21(Variable2,_,Vars1,X,FirstArgs1,FirstArgs2) :- +% is_list(Variable2), +% replace_first_vars211(Variable2,X,Vars1,FirstArgs1,FirstArgs2). + +replace_first_vars211(Var_name1,First_vars,Term2) :- + single_item_or_var(Var_name1), + (member([Term1,Var_name1],First_vars)-> + Term2=Term1;Term2=Var_name1), + %append(Term3,[Term2],Term4), + !. 
+ +replace_first_vars211([],_,[]) :- !. +replace_first_vars211(Variable1,First_vars,Term) :- + not(single_item_or_var(Variable1)), + Variable1=[Variable1a|Variable1b], + replace_first_vars211(Variable1a,First_vars,Value1a), + replace_first_vars211(Variable1b,First_vars,Value1b), + append([Value1a],Value1b,Term),!. + +%* + +is_single_item_or_expression_list(A) :- + not(variable_name(A)), + (single_item(A)->true; + (is_list(A),findall(B,(member(B,A),expressionnotatom(B) + %not(variable_name(B)) + ),C),length(A,L),length(C,L))), + !. + +is_single_item_or_expression_list_with_atoms(A) :- + not(variable_name(A)), + (single_item_or_atom(A)->true; + (is_list(A),findall(B,(member(B,A),expression_or_atom(B) + %not(variable_name(B)) + ),C),length(A,L),length(C,L))), + !. + + + % replace_vars0(Term,Vars1,Vars2,First_vars1,First_vars2),!. + +replace_vars01(Variable2,X,Vars1) :- + not(variable_name(Variable2)), + is_list(Variable2), + getvalue_match1(Variable2,X,Vars1). + +getvalue_match1(Variable1,Value1,Vars1) :- + get_lang_word("v",Dbw_v1),Dbw_v1=Dbw_v, + single_item_or_atom(Variable1), + (is_single_item_or_expression_list_with_atoms(Variable1)-> + Value1=Variable1; + + (member([Variable1,[Dbw_v,Var_name1]],Vars1)-> + (Value1=[Dbw_v,Var_name1]); + (find_sys(Var_name2), + Value1=[Dbw_v,Var_name2]))),!. + + +getvalue_match1([],[],_Vars1) :- !. +getvalue_match1(Variable1,Value1,Vars1) :- + not(single_item_or_atom(Variable1)), + Variable1=[Variable1a|Variable1b], + getvalue_match1(Variable1a,Value1a,Vars1), + getvalue_match1(Variable1b,Value1b,Vars1), + append([Value1a],Value1b,Value1),!. + +% get new var listing + +/* +replace_vars011(Variable2,Vars1,Vars2a,Vars2b) :- + not(variable_name(Variable2)), + is_list(Variable2), + getvalue_match11(Variable2,Vars1,Vars2a,Vars2b). 
+ +getvalue_match11(Variable1,Vars1,Vars2a,Vars2b) :- + single_item(Variable1), + (is_single_item_or_expression_list(Variable1)-> + Vars2a=Vars2b%append(Vars2a,[Variable1],Vars2b + ; + (getvalue(Variable1,Value2,Vars1), + append(Vars2a,[[Variable1,Value2]],Vars2b))),!. + +getvalue_match11([],_Vars1,Vars2,Vars2) :- !. +getvalue_match11(Variable1,Vars1,Vars2a,Vars2b) :- + not(single_item(Variable1)), + Variable1=[Variable1a|Variable1b], + getvalue_match11(Variable1a,Vars1,Vars2a,Vars2c), + getvalue_match11(Variable1b,Vars1,Vars2c,Vars2b), + %append([Value1a],Value1b,Value1),!. + !. +*/ + +replace_vars011(Variable2,_Vars1,_Vars2a,Vars2b) :- + get_lang_word("sys",Dbw_sys), + + findall([[A,C],C1],(member([[A,C],C1],Variable2), + string_concat(Dbw_sys,_N1,C)),Vars2c), + subtract(Variable2,Vars2c,Vars2b),!. + + +e4_substitutevarsA1(Variable2,Vars1,_,X,FirstArgs1,FirstArgs2) :- + is_list(Variable2), + e4_substitutevarsA2_getvalue_match1(Variable2,X,Vars1,FirstArgs1,FirstArgs2). + +e4_substitutevarsA2_getvalue_match1(Variable1,Value1,Vars1,FirstArgs1,FirstArgs3) :- + single_item_or_var(Variable1), + getvalue(Variable1,Value,Vars1), + ((is_empty(Value)-> + ((Value1=Variable1), + (isvar(Variable1)->append(FirstArgs1,[Variable1], + FirstArgs3);FirstArgs3=FirstArgs1)); + (getvalue(Variable1,Value,Vars1), + Value1=Value, + FirstArgs3=FirstArgs1))),!. + +e4_substitutevarsA2_getvalue_match1([],[],_Vars1,FirstArgs1,FirstArgs1) :- !. +e4_substitutevarsA2_getvalue_match1(Variable1,Value1,Vars1,FirstArgs1,FirstArgs2) :- + not(single_item_or_var(Variable1)), + Variable1=[Variable1a|Variable1b], + e4_substitutevarsA2_getvalue_match1(Variable1a,Value1a,Vars1,FirstArgs1,FirstArgs3), + e4_substitutevarsA2_getvalue_match1(Variable1b,Value1b,Vars1,FirstArgs3,FirstArgs2), + append([Value1a],Value1b,Value1),!. 
+ diff --git a/listprologinterpreter/expand_types.pl b/listprologinterpreter/expand_types.pl new file mode 100644 index 0000000000000000000000000000000000000000..cbba9544b5dadf326ce5a8a346e0b6d070d4f8fd --- /dev/null +++ b/listprologinterpreter/expand_types.pl @@ -0,0 +1,176 @@ +%% expand_types1.pl + +% not the following anymore, as LPI types are simple. + +% numbers(68,1,[],N),findall([N1,T3,"\n"],(member(N1,N),test_types_cases(N1,_,T,M,F,_),findall(TC,(member(T0,T),T0=[TA,TB],simplify_types(TB,[],T1),expand_types(T1,[],T2),(TB=T2->TC=y;TC=[TB,T2])),T3)),T4),writeln1(T4). + +% numbers(35,1,[],N),findall([N1,T3,"\n"],(member(N1,N),test_open_types_cases(N1,_,T,M,F),findall(TC,(member(T0,T),T0=[TA,TB],simplify_types(TB,[],T1),expand_types(T1,[],T2),(TB=T2->TC=y;TC=[TB,T2])),T3)),T4),writeln1(T4). + +/** + +?- expand_types1([[[t, number]], [t, string], [t, number]],[],T). +T = [[[t, brackets], [[t, number]]], [t, string], [t, number]]. + +**/ + +:-include('curly_brackets.pl'). +:-include('simplify_types.pl'). + +/* +expand_types(Data1,Types1,Types2) :- +get_lang_word("t",T), +get_lang_word("brackets",Dbw_brackets), + Data1=[Data2], + + not(Data2=[T,_]), + expand_types1(Data2,[],Types4), + Types5=Types4, + append_list3([Types1,Types5],Types2),!. + +expand_types(Data1,Types1,Types2) :- +get_lang_word("t",T), +get_lang_word("brackets",Dbw_brackets), + + Data1=[[Data2|Data3]|Data41], + +not([Data2|Data3]=[T,_]), + expand_types1(Data2,[],Types3), + expand_types1(Data3,Types3,Types4), + Types5=Types4, + +findall(Types61,(member(Data4,Data41), expand_types1(Data4,[],Types61)),Types62), +foldr(append,Types62,Types6), + append_list3([Types1,Types5,Types6],Types2),!. + +*/ +expand_types(Data1,Types1,Types2) :- + expand_types1(Data1,Types1,Types2). + + +expand_types1([],Types,Types) :- !. +expand_types1(Data,Types1,Types2) :- +get_lang_word("t",T), +get_lang_word("number",Dbw_number), + +Data=[T,Dbw_number], %number(Data), +append(Types1,[[T,Dbw_number]],Types2),!. 
+expand_types1(Data,Types1,Types2) :- +get_lang_word("t",T), +get_lang_word("string",Dbw_string), + +Data=[T,Dbw_string], %string(Data), +append(Types1,[[T,Dbw_string]],Types2),!. +expand_types1(Data,Types1,Types2) :- +get_lang_word("t",T), +get_lang_word("atom",Dbw_atom), + +Data=[T,Dbw_atom], %string(Data), +append(Types1,[[T,Dbw_atom]],Types2),!. +/* +expand_types1(Data1,Types1,Types2) :- +get_lang_word("t",T), +get_lang_word("brackets",Dbw_brackets), + Data1=[Data2], + + not(Data2=[T,_]), + not(Data2={[T,_]}), + expand_types1(Data2,[],Types4), + Types5=[[[T,Dbw_brackets],Types4]], + append_list3([Types1,Types5],Types2),!. +*/ +/* +expand_types1(Data1,Types1,Types2) :- +get_lang_word("t",T), +get_lang_word("brackets",Dbw_brackets), + Data1=[[Data2]|Data4], + expand_types1(Data2,[],Types4), + Types5=[[[T,Dbw_brackets],Types4]], + expand_types1(Data4,[],Types6), + append_list3([Types1,Types5,Types6],Types2),!. +*/ + +expand_types1(Data1,Types1,Types2) :- +get_lang_word("t",T), +get_lang_word("brackets",Dbw_brackets), + + Data1=[[Data2|Data3]|Data41], + +not([Data2|Data3]=[T,_]), + expand_types1([Data2],[],Types3), + expand_types1(Data3,Types3,Types4), + Types5=[[[T,Dbw_brackets],Types4]], + +findall(Types61,(member(Data4,Data41), expand_types1([Data4],[],Types61)),Types62), +foldr(append,Types62,Types6), + append_list3([Types1,Types5,Types6],Types2),!. + + expand_types1(Data1,Types1,Types2) :- +get_lang_word("t",T), +get_lang_word("list",Dbw_list), + %Data1=[Data2Data3|Data41], + +%not(Data2Data3=[T,_]), +not(Data1=[T,_]), +%trace, +curly_head_taila(Data1,Data2,Data3), +%trace, + expand_types1(Data2,[],Types3), + expand_types1(Data3,Types3,Types4), + Types5=[[[T,Dbw_list],Types4]], +%trace, +%findall(Types61,(member(Data4,Data41), +%expand_types1(Data4,[],Types6),%),Types62), +%foldr(append,Types62,Types6), + append_list3([Types1,%Types5, + Types5],Types2),!. 
+ + +expand_types1(Data1,Types1,Types2) :- +get_lang_word("t",T), +get_lang_word("list",Dbw_list), + Data1=[Data2Data3|Data41], + +not(Data2Data3=[T,_]), +%trace, +curly_head_taila(Data2Data3,Data2,Data3), + expand_types1([Data2],[],Types3), + expand_types1(Data3,Types3,Types4), + Types5=[[[T,Dbw_list],Types4]], +%trace, +findall(Types61,(member(Data4,Data41), expand_types1([Data4],[],Types61)),Types62), +foldr(append,Types62,Types6), + append_list3([Types1,Types5, + Types6],Types2),!. + + + expand_types1(Data,Types1,Types2) :- +get_lang_word("t",T), +%get_lang_word("string",Dbw_string), + +Data=[T,A], %string(Data), +append(Types1,[[T,A]],Types2),!. + + expand_types1(Data,Types1,Types2) :- +%get_lang_word("t",T), +%get_lang_word("string",Dbw_string), + +Data="|", %string(Data), +append(Types1,["|"],Types2),!. + + +expand_types1(Data1,Types1,Types2) :- + Data1=[Data2|Data3], + expand_types1(Data2,Types1,Types3), + expand_types1(Data3,Types3,Types2),!. + %%Types2=[Types4]. + +%/* +append_list3(A1,B):- + append_list3(A1,[],B),!. + +append_list3([],A,A):-!. +append_list3(List,A,B) :- + List=[Item|Items], + append(A,Item,C), + append_list3(Items,C,B),!. +%*/ \ No newline at end of file diff --git a/listprologinterpreter/expression_not_var.pl b/listprologinterpreter/expression_not_var.pl new file mode 100644 index 0000000000000000000000000000000000000000..e66d3d41827b297323411f86f00f6c0875fec4cc --- /dev/null +++ b/listprologinterpreter/expression_not_var.pl @@ -0,0 +1,27 @@ +expression_not_var(Variable2) :- + not(variable_name(Variable2)), + expression_not_var1(Variable2,start). + +expression_not_var1(Variable1,_) :- + single_item_not_var(Variable1),!. + +expression_not_var1([],_) :- !. +expression_not_var1(Variable1,Position) :- + (Position=start->not(variable_name(Variable1));true), + Variable1=[Variable1a|Variable1b], + not(variable_name(Variable1a)), + expression_not_var1(Variable1a,start), + expression_not_var1(Variable1b,non_start),!. 
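`expression_not_var/1` above succeeds only when a term contains no `[v,Name]` variable anywhere in its structure; `expression_not_var1/2` descends through both the head and tail of each list. A hedged Python equivalent of that recursive check:

```python
def is_var(term):
    # List Prolog variables have the shape ["v", Name].
    return (isinstance(term, list) and len(term) == 2
            and term[0] == "v" and isinstance(term[1], str))

def expression_not_var(term):
    # True iff no ["v", Name] variable occurs anywhere in the term,
    # mirroring expression_not_var1/2's descent into every sublist.
    if is_var(term):
        return False
    if isinstance(term, list):
        return all(expression_not_var(item) for item in term)
    return True
```

Ground data such as `[1, "a", [2]]` passes, while any embedded `["v", ...]` makes the whole term fail the check.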
+ +single_item_not_var(A) :- predicate_or_rule_name(A),!. +single_item_not_var(A) :- string(A),!. +single_item_not_var(A) :- number(A),!. +single_item_not_var(A) :- atom(A),!. +single_item_not_var(A) :- blob(A,stream),!. + +single_item_or_var(A) :- predicate_or_rule_name(A),!. +single_item_or_var(A) :- variable_name(A),!. +single_item_or_var(A) :- string(A),!. +single_item_or_var(A) :- number(A),!. +single_item_or_var(A) :- atom(A),!. +single_item_or_var(A) :- blob(A,stream),!. diff --git a/listprologinterpreter/file.txt b/listprologinterpreter/file.txt new file mode 100644 index 0000000000000000000000000000000000000000..8395beeca8e90b28f98b775d18f6fcbbd8c71907 --- /dev/null +++ b/listprologinterpreter/file.txt @@ -0,0 +1 @@ +["Name","a"]. \ No newline at end of file diff --git a/listprologinterpreter/grammar.pl b/listprologinterpreter/grammar.pl new file mode 100644 index 0000000000000000000000000000000000000000..aa08d859bfc5e9f16230948c5d28406278d06060 --- /dev/null +++ b/listprologinterpreter/grammar.pl @@ -0,0 +1,389 @@ +%% Replace with grammar part, write grammar part, write grammars to convert string to compound + +%% Replace with grammar part + +%% Make atoms into strings as soon as entered x have both, because it is annoying to enter strings as strings* - prefer atoms because they're faster, use strings only when necessary - * get rid of atoms as strings and use atoms only as names in form [n,name], v + +%% ** Need compound terms in LPI +%% convert_to_grammar_part1([[n,nouns],[[v,l1],[v,l2]]],"->",[[[n,word],[[v,noun1]]],[[n,word],[[v,noun2]]],[[n,code],[[n,append],[[v,l1],[[v,noun1],[v,noun2]],[v,l2]]]]]]],[],G). +%% convert_to_grammar_part1([[[n,noun],[[v,l1],[v,l2]],"->",[[[n,word],[[v,noun1]]]]]],[],G). +%% convert_to_grammar_part1([[[n,noun],"->",[[[n,word],[[v,noun1]]]]]],[],G). +%% convert_to_grammar_part1([[[n,noun],"->",[[[v,a]],[[n,code],[[n,=],[[v,a],"apple"]]]]]],[],G). 
+%% convert_to_grammar_part1([[[n,noun],"->",[[v,a],[[n,code],[[n,=],[[v,a],["apple"]]]]]]],[],G). +%% convert_to_grammar_part1([[[n,noun],"->",[["apple"]]]],[],G). +%% convert_to_grammar_part1([[[n,noun],"->",["apple"]]],[],G). + +%% Notes +%% Brackets around body in input v + +%% base case [] v +%% calls, terminals, code v +%% if calls - 2vp or non 2vp v + +%% No bc needed except [] v +%% Recurse after all except bc [] v + +%% code in converted code has no brackets v + +%% $->v v + +%% must have a call or terminal v + +%% terminals have vgps, undef g t->g p v + +%% can find x as terminals, can find singletons later v + +%% grammar_part,[n,noun],[v,vgp1],[v,vgp2] should be +%% grammar_part,[[n,noun],[v,vgp1],[v,vgp2]] do using list proc v + +%% append [[]] not [] to [[v v]] v + +%% Print [] around body v + +%% append list in first query args 1<->2 v + +%% ** change [v,name] to [v*,name] so new interpreter, grammar command can run them, can have multiple v, v* or groups of these used in an interpreter shell x use a marker to avoid replacing variables with values when passing algorithms to shells, or detect and leave without marker + +%% keep original length recorded to add base case in case when add variables +convert_to_grammar_part1(Grammar1,Grammar2,Grammar3,Grammar5) :- + convert_to_grammar_part11(Grammar1,Grammar2,Grammar4,[],_EndGrammar1,[],Grammar6,[],_EndGrammar2), + %%append(Grammar4,EndGrammar1,Grammar3), + %%append(Grammar6,EndGrammar2,Grammar5), + Grammar3=Grammar4, + Grammar5=Grammar6, + %%duplicate(Grammar4,[],Grammar5), %% 6->4 + !. + +convert_to_grammar_part11([],Grammar1,Grammar1,EndGrammar1,EndGrammar1,Grammar2,Grammar2,EndGrammar2,EndGrammar2) :- !. 
+convert_to_grammar_part11(Grammar1,Grammar2,Grammar3,_EndGrammar1,EndGrammar2,Grammara2,Grammara3,_EndGrammara1,EndGrammara2) :- +get_lang_word("v",Dbw_v), +get_lang_word("n",Dbw_n), +get_lang_word("vgp1",Dbw_vgp1), +get_lang_word("vgp2",Dbw_vgp2), + Grammar1=[Grammar4|Grammar5], + (Grammar4=[[Dbw_n,Name],Variables1,"->",Body1]->true; + (Grammar4=[[Dbw_n,Name],"->",Body1],Variables1=[])), + %%((maplist(no_calls,Body1))-> %% this is a base case + %%append([[v,vgp],[v,vgp]],Variables1,Variables2); + append([[Dbw_v,Dbw_vgp1],[Dbw_v,Dbw_vgp2]],Variables1,Variables3) + %%) + , + member(Item1,Body1),call_or_terminal(Item1), %% If not, operator expected. + %%append([[n,Name]],Variables2,Variables3), + Grammar6=[[Dbw_n,Name],Variables3,":-"], + convert_to_grammar_part20(Body1,1,2,2,[],Body2), + append(Grammar6,[Body2],Grammar7), %% 7 to 8 x + + %% member to check all doesn't work elsewhere, do ; to ->true; +/** +(maplist(basecasecondition(Variables3,[n,Name]),Grammar2)-> +((Variables1=[]->(Grammar9=[[n,Name],[[],[v,vgp]]],Grammar10=[[n,Name],["",[v,vgp]]],append(EndGrammar1,[[[n,Name],[[v,vgp],[v,vgp]]]],EndGrammar3),append(EndGrammara1,[[[],[[[n,Name],[[v,vgp],[v,vgp]]]]]],EndGrammara4) +);( +Grammar9=[[n,Name],[[],[v,vgp]|Variables1]],Grammar10=[[n,Name],["",[v,vgp]|Variables1]],append(EndGrammar1,[[[n,Name],[[v,vgp],[v,vgp]|Variables1]]],EndGrammar3),append(EndGrammara1,[[[],[[[n,Name],[[v,vgp],[v,vgp]|Variables1]]]]],EndGrammara4) +) + ), + append(Grammar2,[Grammar9, + Grammar10, + Grammar7],Grammar8), + append(Grammara2,[[Grammar4,[Grammar9, + Grammar10,Grammar7]]],Grammara4)); + (EndGrammar1=EndGrammar3, + append(Grammar2,[ + Grammar7],Grammar8), + append(Grammara2,[[Grammar4,[Grammar7]]],Grammara4)) + ), + **/ + append(Grammar2,[Grammar7],Grammar8), + append(Grammara2,[[Grammar4,[Grammar7]]],Grammara4), +convert_to_grammar_part11(Grammar5,Grammar8,Grammar3,_EndGrammar3,EndGrammar2,Grammara4,Grammara3,_EndGrammara4,EndGrammara2),!. 
+convert_to_grammar_part11(Grammar1,Grammar2,Grammar3,EndGrammar1,EndGrammar2,Grammara1,Grammara2,EndGrammara1,EndGrammara2) :- + Grammar1=[Grammar4|Grammar5], + (((((Grammar4=[_Name1,_Variables1,":-",_Body1]->true; + Grammar4=[_Name2,":-",_Body2])->true; + Grammar4=[_Name3,_Variables2])->true; + Grammar4=[":-",_,_])->true; + Grammar4=[_Name4])->true; + (writeln0(["Error: Grammar",Grammar4,"badly formed."]),abort)), + append(Grammar2,[Grammar4],Grammar6), + append(Grammara1,[[Grammar4,Grammar4]],Grammara4), + convert_to_grammar_part11(Grammar5,Grammar6,Grammar3,EndGrammar1,EndGrammar2,Grammara4,Grammara2,EndGrammara1,EndGrammara2),!. + +duplicate([],Grammar,Grammar) :- !. +duplicate(Grammar1,Grammar2,Grammar3) :- + Grammar1=[Grammar4|Grammar5], + append(Grammar2,[[Grammar4,Grammar4]],Grammar6), + duplicate(Grammar5,Grammar6,Grammar3). + + +basecasecondition(Variables3,[Dbw_n,Name],Item1) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, + + not((Item1=[[Dbw_n,Name],Item2|_Rest2],length(Variables3,Length),length(Item2,Length))). + +no_calls(Item) :- + not(call1(Item)),!. + +convert_to_grammar_part20(Body1,FirstVar1,SecondVar1,SecondVarParent,Body2,Body3) :- + (count_call_or_terminal(Body1,0,1,[],_I)-> + SecondVar2=SecondVar1;SecondVar2=3), + +convert_to_grammar_part2(Body1,FirstVar1,SecondVar2,SecondVarParent,Body2,Body3),!. + +count_call_or_terminal([],N,N,I,I) :- !. +count_call_or_terminal([Item|Items],N1,N2,I1,I2) :- + (call_or_terminal(Item)->(N3 is N1+1,append(I1,[Item],I3));N3 is N1,I3=I1), + count_call_or_terminal(Items,N3,N2,I3,I2),!. + + +convert_to_grammar_part2([],_FirstVar,_SecondVar,_SecondVarParent,Body,Body) :- !. 
+ +%% only want 2vp to be used if last call + + +convert_to_grammar_part2(Body1,FirstVar1,SecondVar,SecondVarParent,Body2,Body3) :- + Body1=[Item|Items], + call1(Item), + (last_call_or_terminal2(Items)-> + convert_to_grammar_part31(Body1,FirstVar1,SecondVar,SecondVarParent,Body2,Body3); + convert_to_grammar_part32(Body1,FirstVar1,SecondVar,SecondVarParent,Body2,Body3)), !. + +%% +convert_to_grammar_part2(Body1,FirstVar1,SecondVar,SecondVarParent,Body2,Body3) :- + Body1=[Item|Items], + terminal(Item), + (last_call_or_terminal2(Items)-> + convert_to_grammar_part31t(Body1,FirstVar1,SecondVar,SecondVarParent,Body2,Body3); + convert_to_grammar_part32t(Body1,FirstVar1,SecondVar,SecondVarParent,Body2,Body3)), !. + +convert_to_grammar_part2(Body1,FirstVar,SecondVar,SecondVarParent,Body2,Body3) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("code",Dbw_code), + Body1=[Item|Rest1], + Item=[[Dbw_n,Dbw_code]|Rest2], + append(Body2,Rest2,Body4), + convert_to_grammar_part2(Rest1,FirstVar,SecondVar,SecondVarParent,Body4,Body3),!. + +convert_to_grammar_part2(Body1,_FirstVar,_SecondVar,_SecondVarParent,_Body4,_Body3) :- + Body1=[Item|_Rest1], + writeln0(["Error: Item",Item,"badly formed."]),abort,!. + +convert_to_grammar_part31(Body1,FirstVar,SecondVar,SecondVarParent,Body2,Body3) :- + Body1=[Item1|Rest1], + convert_to_grammar_part311(Item1,FirstVar,SecondVarParent,Body2,Body4), + convert_to_grammar_part2(Rest1,FirstVar,SecondVar,SecondVarParent,Body4,Body3), !. + +convert_to_grammar_part311([[Dbw_n,RuleName]],FirstVar1,SecondVarParent,Body2,Body3) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, + to_variable_name(FirstVar1,FirstVarName), + to_variable_name(SecondVarParent,SecondVarParentName), + Item=[[Dbw_n,RuleName],[FirstVarName,SecondVarParentName]], + %%append([[n,grammar_part]],[Call],Item), + append(Body2,[Item],Body3),!. 
+ +convert_to_grammar_part311([[Dbw_n,RuleName]|[Variables1]],FirstVar1,SecondVarParent,Body2,Body3) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, + variables(Variables1), + to_variable_name(FirstVar1,FirstVarName), + to_variable_name(SecondVarParent,SecondVarParentName), + append([FirstVarName,SecondVarParentName],Variables1,Call), + append([[Dbw_n,RuleName]],[Call],Item2), + append(Body2,[Item2],Body3),!. + +convert_to_grammar_part32(Body1,FirstVar1,SecondVar,SecondVarParent,Body2,Body3) :- + Body1=[Item1|Rest1], + convert_to_grammar_part321(Item1,Rest1,FirstVar1,SecondVar,SecondVarParent,Body2,Body3),!. + +convert_to_grammar_part321(Item1,Rest1,FirstVar1,SecondVar1,SecondVarParent,Body2,Body3) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, + Item1=[[Dbw_n,RuleName]], + to_variable_name(FirstVar1,FirstVarName), + to_variable_name(SecondVar1,SecondVarName), + Call=[FirstVarName,SecondVarName], + append([[Dbw_n,RuleName]],[Call],Item2), + append(Body2,[Item2],Body4), + FirstVar2 is SecondVar1, + SecondVar2 is SecondVar1+1, + convert_to_grammar_part2(Rest1,FirstVar2,SecondVar2,SecondVarParent,Body4,Body3),!. + +convert_to_grammar_part321(Item1,Rest1,FirstVar1,SecondVar1,SecondVarParent,Body2,Body3) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, + Item1=[[Dbw_n,RuleName]|[Variables1]], + variables(Variables1), + to_variable_name(FirstVar1,FirstVarName), + to_variable_name(SecondVar1,SecondVarName), + append([FirstVarName,SecondVarName],Variables1,Call), + append([[Dbw_n,RuleName]],[Call],Item2), + append(Body2,[Item2],Body4), + FirstVar2 is SecondVar1, + SecondVar2 is SecondVar1+1, + convert_to_grammar_part2(Rest1,FirstVar2,SecondVar2,SecondVarParent,Body4,Body3),!. + +%% + +convert_to_grammar_part31t(Body1,FirstVar,SecondVar,SecondVarParent,Body2,Body3) :- + Body1=[Item1|Rest1], + convert_to_grammar_part311t(Item1,FirstVar,SecondVarParent,Body2,Body4), + convert_to_grammar_part2(Rest1,FirstVar,SecondVar,SecondVarParent,Body4,Body3), !. 
+ +convert_to_grammar_part311t(Item1,FirstVar1,SecondVarParent,Body2,Body3) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, + to_variable_name(FirstVar1,FirstVarName), + to_variable_name(SecondVarParent,SecondVarParentName), + Call=[Item1,FirstVarName,SecondVarParentName], + append([[Dbw_n,grammar_part]],[Call],Item2), + append(Body2,[Item2],Body3),!. + +convert_to_grammar_part32t(Body1,FirstVar1,SecondVar,SecondVarParent,Body2,Body3) :- + Body1=[Item1|Rest1], + convert_to_grammar_part321t(Item1,Rest1,FirstVar1,SecondVar,SecondVarParent,Body2,Body3),!. + +convert_to_grammar_part321t(Item1,Rest1,FirstVar1,SecondVar1,SecondVarParent,Body2,Body3) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, + to_variable_name(FirstVar1,FirstVarName), + to_variable_name(SecondVar1,SecondVarName), + Call=[Item1,FirstVarName,SecondVarName], + append([[Dbw_n,grammar_part]],[Call],Item2), + append(Body2,[Item2],Body4), + FirstVar2 is SecondVar1, + SecondVar2 is SecondVar1+1, + convert_to_grammar_part2(Rest1,FirstVar2,SecondVar2,SecondVarParent,Body4,Body3),!. + +last_call_or_terminal2([]) :- !. +last_call_or_terminal2(Body1) :- + Body1=[Item|Body2], + not(call_or_terminal(Item)), + last_call_or_terminal2(Body2),!. + +to_variable_name(Var,Name1) :- +get_lang_word("v",Dbw_v), + string_concat("vgp",Var,Name2), + get_lang_word(Name2,Dbw_vgp1), + Name1=[Dbw_v,Dbw_vgp1],!. +variable_name([Dbw_v,_Name]) :- +get_lang_word("v",Dbw_v1),Dbw_v1=Dbw_v,!. +predicate_or_rule_name([Dbw_n1,_Name]) :- %trace, +get_lang_word("n",Dbw_n),Dbw_n=Dbw_n1,!. +predicate_or_rule([[Dbw_n,_Name]|_Variables]) :- +get_lang_word("n",Dbw_n),!. +islist(A) :- list1(A,_,_),!. +variables([]) :- !. +variables(Variables1) :- + Variables1=[Variable|Variables2], + terminal(Variable),%%=[v,_VariableName], + variables(Variables2),!. +terminal([]) :- !. 
+terminal(Item1) :-
+ not(code(Item1)),
+ ((not(variable_name(Item1)),is_list(Item1))->(findall(y,(member(Item3,Item1), terminal2(Item3)),Item4),
+length(Item1,L),
+length(Item4,L));terminal2(Item1)),
+!.
+terminal2(Item):-(variable_name(Item)->true;
+(string(Item)->true;
+(number(Item)->true;
+(atom(Item))))).
+
+code(Item) :-
+get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n,
+get_lang_word("code",Dbw_code),
+ (Item=[[Dbw_n,Dbw_code]|_Rest]->true;Item=[Dbw_n,Dbw_code]),!.
+call1(Item) :-
+get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n,
+(Item=[[Dbw_n,_PredicateName]|_Variables]->true;Item=[Dbw_n,_PredicateName]),not(code(Item)),!.
+call_not_grammar([[Dbw_n,PredicateName]|_Variables]) :-
+get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n,
+get_lang_word("grammar",Dbw_grammar),
+not(PredicateName=Dbw_grammar),not(PredicateName=grammar_part),!.
+call_grammar_part([[Dbw_n,grammar_part]|_Variables]) :-
+get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n,
+%get_lang_word("grammar_part",Dbw_grammar_part1),Dbw_grammar_part1=Dbw_grammar_part,
+!.
+name([Dbw_n,_Name]):-
+get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n,!.
+call_or_terminal(Item) :-
+ terminal(Item)->true;call1(Item),!.
+
+%% write grammar part
+
+%% [[]] in args
+
+%% collect code commands to write
+
+
+%% Terminal known, Phrase2 known
+%% Terminal unknown, Phrase2 known
+%% Terminal known, Phrase2 unknown
+%% Terminal unknown, Phrase2 unknown - error
+%% if known or unknown, can be [v]
+
+%% Gets val or sets prolog var to undef, puts val in undef in var
+
+%% Doesn't support [v], use wrap(v) x - append not string concat if P1 is ["a"]
+
+getvalues2([],Values,Values,_Vars,Flags,Flags) :- !.
+getvalues2(VarNames1,Values1,Values2,Vars,Flags1,Flags2) :-
+ VarNames1=[VarName1|VarNames2],
+%trace,
+ ((isvar(VarName1),VarName1=[VarName2])->Flag1=true;(VarName2=VarName1,Flag1=false)),
+ getvalue(VarName2,Value1,Vars),
+ (is_empty(Value1)->Flag2=true;(Value2=Value1,Flag2=false)),
+ (Flag1=true->Value3=[Value2];Value3=Value2),
+ append(Values1,Value3,Values3),
+ append(Flags1,[Flag2],Flags3),
+ getvalues2(VarNames2,Values3,Values2,Vars,Flags3,Flags2),!.
+
+undefined_to_empty([],Values,Values) :- !.
+undefined_to_empty(Values1,Values2,Values3) :-
+ Values1=[Value1|Values4],
+ (var(Value1)->is_empty(Value2);Value2=Value1),
+ append(Values2,[Value2],Values5),
+ undefined_to_empty(Values4,Values5,Values3),!.
+
+getvalues_equals4(Variable1,Variable2,Value1,Value2,Vars) :-
+ (equals4(on)->(getvalue_equals41(Variable1,Value1,Vars),
+ getvalue_equals41(Variable2,Value2,Vars));
+ (getvalue(Variable1,Value1,Vars),
+ getvalue(Variable2,Value2,Vars))).
+
+getvalues_equals4(Variable1,Variable2,Variable3,Value1,Value2,Value3,Vars) :-
+ (equals4(on)->(getvalue_equals41(Variable1,Value1,Vars),
+ getvalue_equals41(Variable2,Value2,Vars),
+ getvalue_equals41(Variable3,Value3,Vars));
+ (getvalue(Variable1,Value1,Vars),
+ getvalue(Variable2,Value2,Vars),
+ getvalue(Variable3,Value3,Vars))).
+
+getvalue_equals4(Variable,Value,Vars) :-
+ (equals4(on)->getvalue_equals41(Variable,Value,Vars);
+ getvalue(Variable,Value,Vars)).
+
+getvalue_equals41(Variable,Value,Vars) :-
+ get_lang_word("v",Dbw_v),
+ remember_and_turn_off_debug(Debug),
+
+ find_sys(Sys_name),
+
+
+((interpretpart(match4,Variable,[Dbw_v,Sys_name],Vars,Vars3,_),
+ getvalue([Dbw_v,Sys_name],Value,Vars3))->true;(turn_back_debug(Debug),fail)),
+ turn_back_debug(Debug).
+
+%putvalue_equals4(empty,A,Vars,Vars) :- not(isvar(A)),!.
+putvalue_equals4(Variable,Value,Vars1,Vars2) :-
+ (equals4(on)->putvalue_equals41(Variable,Value,Vars1,Vars2);
+ putvalue(Variable,Value,Vars1,Vars2)).
+ +%putvalue_equals41(Variable,Value,Vars1,Vars1) :- +% isvar(Variable),isvar(Value),!. + +putvalue_equals41(Variable,Value,Vars1,Vars2) :- + %get_lang_word("v",Dbw_v), + remember_and_turn_off_debug(Debug), + + %find_sys(Sys_name), + + +(interpretpart(match4,Variable,Value,Vars1,Vars2,_)->true; +(turn_back_debug(Debug),fail)), + %getvalue([Dbw_v,Sys_name],Value,Vars3))->true;(turn_back_debug(Debug),fail)). + turn_back_debug(Debug). \ No newline at end of file diff --git a/listprologinterpreter/input.txt b/listprologinterpreter/input.txt new file mode 100644 index 0000000000000000000000000000000000000000..075c842d1b8cdbc5f791e44aa60e3b9639d3b452 --- /dev/null +++ b/listprologinterpreter/input.txt @@ -0,0 +1 @@ +"a" \ No newline at end of file diff --git a/listprologinterpreter/la_files.pl b/listprologinterpreter/la_files.pl new file mode 100644 index 0000000000000000000000000000000000000000..576c402aa5038c3e55ebe49892bd29071b29609e --- /dev/null +++ b/listprologinterpreter/la_files.pl @@ -0,0 +1,274 @@ +delete_invisibles_etc(F,G) :- + findall(J,(member(H,F),atom_string(H,J),not(J="."),not(J=".."),not(string_concat(".",_,J))),G),!. + + +open_file_s(Path,File_term) :- + phrase_from_file_s(string(File), Path), + string_codes(File_string,File), + term_to_atom(File_term,File_string),!. + +open_file_sh(F1,File_term) :- + (absolute_url(F1)-> + F1=Path; + (working_directory_sh(F11,F11), + string_concat_url(F11,F1,Path))), + split_string(Path,"/","/",Path1), + append(_P1,[P2],Path1), + string_concat(P2,"tmp54837",P3), + + Scp="scp -pr ", + concat_list([Scp,Path," ",P3],Command), + shell1_s(Command), + + + open_file_s(P3,File_term), + + Rm="rm -rf", + concat_list([Rm," ",P3],Command1), + shell1_s(Command1), + !. + + +open_string_file_s(Path,File_string) :- + phrase_from_file_s(string(File), Path), + string_codes(File_string,File),!. + %term_to_atom(File_term,File_string),!. 
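+
+% Example round trip using open_file_s with save_file_s (defined below).
+% Illustrative only; the filename "tmp1.txt" is made up:
+% ?- save_file_s("tmp1.txt",[a,b]), open_file_s("tmp1.txt",T).
+% T = [a,b].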
+ + +save_file_s(Path,Content_term_or_string) :- + ((compound(Content_term_or_string)->true;Content_term_or_string=[])->term_to_atom(Content_term_or_string,String);((string(Content_term_or_string)->true;(atom(Content_term_or_string)->true; + number(Content_term_or_string)) + )->Content_term_or_string=String; + (concat_list(["Error: save_file_s content not in compound, atom or string format."],Notification), + writeln0(Notification),abort))), + (open_s(Path,write,Stream), + write(Stream,String), + close(Stream)), + !. + +% working_directory_sh(_,"root@x.x.x.x:~/"). +% save_file_sh("a1.txt", [a]). + +save_file_sh(F1,File_term) :- + (absolute_url(F1)-> + F1=Path; + (working_directory_sh(F11,F11), + string_concat_url(F11,F1,Path))), + split_string(Path,"/","/",Path1), + append(_P1,[P2],Path1), + string_concat(P2,"tmp54837",P3), + + save_file_s(P3,File_term), + + Mv="rsync -avz --remove-source-files ", + concat_list([Mv,P3," ",Path],Command), + shell1_s(Command), + + !. + +directory_files_s(F1,B) :- + atom_string(F1,F2), + directory_files(F2,B),!. + + + % ssh root@46.250.240.201 + % open_file_sh("root@x.x.x.x:~/Dropbox/GitHub/",) + +% directory_files_sh("",A). 
+ +directory_files_sh(F1,B) :- + (absolute_url(F1)-> + F1=F2; + (working_directory_sh(F11,F11), + string_concat_url(F11,F1,F2))), + split_string(F2,":",":",F), + append([G],[H],F), + string_concat(K,K1,H), + string_length(K,2), + (K1=""->K11="./";K11=K1), + foldr(string_concat,["main_tmp :- catch((directory_files('",K11,"',A),term_to_atom(A,A1),write(A1)),Err,handle_error(Err)),halt.\nmain_tmp :- halt(1).\nhandle_error(_Err):-\n halt(1)."],S1), + foldr(string_concat,[G,":~/tmp54837.pl"],P1), + save_file_sh(P1,S1), + foldr(string_concat,["ssh ",G," swipl --goal=main_tmp --stand_alone=true -o tmp54837 -c tmp54837.pl"],S2), + (catch(shell1_s(S2),_,fail)-> + (foldr(string_concat,["ssh ",G," ./tmp54837"],S), + (catch(shell1_s(S,Out),_,fail)-> + ( + term_to_atom(B%Out1 + ,Out), + foldr(string_concat,["ssh ",G," rm tmp54837.pl\nssh ",G," rm tmp54837"],S3), + shell1_s(S3) + );(writeln("directory_files_sh aborted."),abort)) + );(writeln("directory_files_sh aborted."),abort)),!. + +exists_file_s(F1) :- + atom_string(F2,F1), + exists_file(F2),!. + +exists_file_sh(F1) :- + (absolute_url(F1)-> + F1=F2; + (working_directory_sh(F11,F11), + string_concat_url(F11,F1,F2))), + split_string(F2,":",":",F), + append([G],[H],F), + string_concat(K,K1,H), + string_length(K,2), + foldr(string_concat,["main_tmp :- catch(exists_file('",K1,"'),Err,handle_error(Err)),halt.\nmain_tmp :- halt(1).\nhandle_error(_Err):-\n halt(1)."],S1), + foldr(string_concat,[G,":~/tmp54837.pl"],P1), + save_file_sh(P1,S1), + foldr(string_concat,["ssh ",G," swipl --goal=main_tmp --stand_alone=true -o tmp54837 -c tmp54837.pl"],S2), + (catch(shell1_s(S2),_,fail)-> + (foldr(string_concat,["ssh ",G," ./tmp54837"],S), + (catch(shell1_s(S,_Out),_,fail)-> + ( + true, + foldr(string_concat,["ssh ",G," rm tmp54837.pl\nssh ",G," rm tmp54837"],S3), + shell1_s(S3) + );fail)) + ;(writeln("exists_file aborted."),abort)),!. + +exists_directory_s(F1) :- + atom_string(F2,F1), + exists_directory(F2),!. 
+ +exists_directory_sh(F1) :- + (absolute_url(F1)-> + F1=F2; + (working_directory_sh(F11,F11), + string_concat_url(F11,F1,F2))), + split_string(F2,":",":",F), + append([G],[H],F), + string_concat(K,K1,H), + string_length(K,2), + foldr(string_concat,["main_tmp :- catch(exists_directory('",K1,"'),Err,handle_error(Err)),halt.\nmain_tmp :- halt(1).\nhandle_error(_Err):-\n halt(1)."],S1), + foldr(string_concat,[G,":~/tmp54837.pl"],P1), + save_file_sh(P1,S1), + foldr(string_concat,["ssh ",G," swipl --goal=main_tmp --stand_alone=true -o tmp54837 -c tmp54837.pl"],S2), + (catch(shell1_s(S2),_,fail)-> + (foldr(string_concat,["ssh ",G," ./tmp54837"],S), + (catch(shell1_s(S,_Out),_,fail)-> + ( + true, + foldr(string_concat,["ssh ",G," rm tmp54837.pl\nssh ",G," rm tmp54837"],S3), + shell1_s(S3) + );fail)) + ;(writeln("exists_directory_sh aborted."),abort)),!. + +delete_file_sh(F1) :- + (absolute_url(F1)-> + F1=F2; + (working_directory_sh(F11,F11), + string_concat_url(F11,F1,F2))), + split_string(F2,":",":",F), + append([G],[H],F), + string_concat(K,K1,H), + string_length(K,2), + foldr(string_concat,["main_tmp :- catch(delete_file('",K1,"'),Err,handle_error(Err)),halt.\nmain_tmp :- halt(1).\nhandle_error(_Err):-\n halt(1)."],S1), + foldr(string_concat,[G,":~/tmp54837.pl"],P1), + save_file_sh(P1,S1), + foldr(string_concat,["ssh ",G," swipl --goal=main_tmp --stand_alone=true -o tmp54837 -c tmp54837.pl"],S2), + (catch(shell1_s(S2),_,fail)-> + (foldr(string_concat,["ssh ",G," ./tmp54837"],S), + (catch(shell1_s(S,_Out),_,fail)-> + ( + true, + foldr(string_concat,["ssh ",G," rm tmp54837.pl\nssh ",G," rm tmp54837"],S3), + shell1_s(S3) + );fail)) + ;(writeln("delete_file_sh aborted."),abort)),!. 
+ + +delete_directory_sh(F1) :- + (absolute_url(F1)-> + F1=F2; + (working_directory_sh(F11,F11), + string_concat_url(F11,F1,F2))), + split_string(F2,":",":",F), + append([G],[H],F), + string_concat(K,K1,H), + string_length(K,2), + foldr(string_concat,["main_tmp :- catch(delete_directory('",K1,"'),Err,handle_error(Err)),halt.\nmain_tmp :- halt(1).\nhandle_error(_Err):-\n halt(1)."],S1), + foldr(string_concat,[G,":~/tmp54837.pl"],P1), + save_file_sh(P1,S1), + foldr(string_concat,["ssh ",G," swipl --goal=main_tmp --stand_alone=true -o tmp54837 -c tmp54837.pl"],S2), + (catch(shell1_s(S2),_,fail)-> + (foldr(string_concat,["ssh ",G," ./tmp54837"],S), + (catch(shell1_s(S,_Out),_,fail)-> + ( + true, + foldr(string_concat,["ssh ",G," rm tmp54837.pl\nssh ",G," rm tmp54837"],S3), + shell1_s(S3) + );fail)) + ;(writeln("delete_directory_sh aborted."),abort)),!. + +make_directory_s(F1) :- + atom_string(F2,F1), + make_directory(F2),!. + +% make_directory_sh("root@x.x.x.x:~/a"). + +make_directory_sh(F1) :- + (absolute_url(F1)-> + F1=F2; + (working_directory_sh(F11,F11), + string_concat_url(F11,F1,F2))), + split_string(F2,":",":",F), + append([G],[H],F), + string_concat(K,K1,H), + string_length(K,2), + foldr(string_concat,["main_tmp :- catch(make_directory('",K1,"'),Err,handle_error(Err)),halt.\nmain_tmp :- halt(1).\nhandle_error(_Err):-\n halt(1)."],S1), + foldr(string_concat,[G,":~/tmp54837.pl"],P1), + save_file_sh(P1,S1), + foldr(string_concat,["ssh ",G," swipl --goal=main_tmp --stand_alone=true -o tmp54837 -c tmp54837.pl"],S2), + (catch(shell1_s(S2),_,fail)-> + (foldr(string_concat,["ssh ",G," ./tmp54837\nssh ",G," rm tmp54837.pl\nssh ",G," rm tmp54837" + ],S), + (catch(shell1_s(S,_Out),_,fail)-> + ( + true + );(writeln("make_directory_sh aborted."),abort)) + );(writeln("make_directory_sh aborted."),abort)),!. + +string_concat_url("",F1,F1) :-!. 
+string_concat_url(F11,F1,F2) :- + (string_concat(_F12,"/",F11)-> + string_concat(F11,F1,F2); + (foldr(string_concat,[F11,"/",F1],F2))),!. + +absolute_url(A) :- + ((split_string(A,":",":",F), + append([_G],[_H],F))->true; + (split_string(A,"/","/",Path1), + length(Path1,L),L>=2, + append([P1],_P2,Path1), + not(string_concat("/",_,P1)))),!. + +working_directory1(A1,B1) :- + (string(A1)->atom_string(A,A1);A=A1), + (string(B1)->atom_string(B,B1);B=B1), + term_to_atom(working_directory(A,B),Atom), + catch(working_directory(A,B), _, (foldr(string_concat,["Error on ",Atom%%"Error: Can't clone ",User3,"/",Repository3," repository on GitHub." + ],Text41),writeln1(Text41)%fail%abort + )),!. + +% working_directory_sh(_,"root@x.x.x.x:~/"). + +working_directory_sh(A,B) :- + ((var(A),not(var(B)))-> + (dynamic(working_directory_sh1/1), + + split_string(B,":",":",F), + append([G],[H],F), + string_concat(K,K1,H), + string_length(K,2), + (K1=""->K11="./";K11=K1), + foldr(string_concat,[G,":",K,K11],B1), + + retractall(working_directory_sh1(_)), + assertz(working_directory_sh1(B1)) + ); + ((var(A),var(B))-> + (dynamic(working_directory_sh1/1), + working_directory_sh1(A),A=B))),!. + \ No newline at end of file diff --git a/listprologinterpreter/la_maths.pl b/listprologinterpreter/la_maths.pl new file mode 100644 index 0000000000000000000000000000000000000000..8ffdd6de9d26a60bd5050bc05e3ad3280880fa09 --- /dev/null +++ b/listprologinterpreter/la_maths.pl @@ -0,0 +1,102 @@ +% la_maths.pl + +%% numbers(5,1,[],N). +%% N=[1,2,3,4,5] + +numbers(N2,N1,Numbers2) :- + (N2>=N1-> + numbers(N2,N1,[],Numbers2); + numbers(N1,N2,[],Numbers2)),!. + +numbers(N2,N1,Numbers1,Numbers2):- +numbers1(N2,N1,Numbers1,Numbers2),!. +numbers1(N2,N1,Numbers,Numbers) :- + N2 is N1-1,!. +numbers1(N2,N1,Numbers1,Numbers2) :- + N3 is N1+1, + append(Numbers1,[N1],Numbers3), + numbers1(N2,N3,Numbers3,Numbers2),!. + +%% get_item_n([a,b,c],3,Item). 
+%% Item = c + +get_item_n(A,B,C) :- + catch(get_item_n1(A,B,C),_,fail),!. + +get_item_n1([],_,[]) :-!.%,fail. +get_item_n1(Exposition,Number1,Item) :- + Number2 is Number1-1, + length(List,Number2), + append(List,[Item|_],Exposition),!. + +put_item_n(Exposition,Number1,Item2,Exposition2) :- + Number2 is Number1-1, + length(List,Number2), + append(List,[_Item1|Rest],Exposition), + append(List,[Item2|Rest],Exposition2),!. + +% put_item_ns([a,b,c,b],[[2,d],[4,e]],F). +% F = [a, d, c, e]. + +put_item_ns(A,[],A) :- !. +put_item_ns(A,BCs,D) :- + BCs=[[B,C]|BCs2], + put_item_n(A,B,C,E), + put_item_ns(E,BCs2,D). + + +% get_n_item([4,6,7],6,L). +% L = 2 + +get_n_item(A,C,L2) :- + append(B,[C|_],A),length(B,L),L2 is L+1. + +% delete_item_n([4,5,6],2,D). +% D = [4,6] +delete_item_n(A,N,D) :- + N1 is N-1,length(B,N1),append(B,[_|C],A),append(B,C,D),!. + + +sum(A,S) :- + sum(A,0,S), !. +sum([],S,S):-!. +sum(S0,S1,S2) :- + S0=[S3|S4], + S5 is S1+S3, + sum(S4,S5,S2),!. + +% number_order(1000,M). +% M = 3 + +number_order(N1,M) :- + log(N1,N),log(10,P),M is ceiling(N/P),!. + +foldr(plus,A,B) :- + foldr(plus,A,0,B). + +maximum_length(List,Maximum_length) :- + findall(L,(member(A2,List),length(A2,L)),L2), + sort(L2,L3), + append(_,[Maximum_length],L3),!. + +sub_list(List,Before_list,Sub_list,After_list) :- + append(Before_list,L1,List), + append(Sub_list,After_list,L1). + +sub_list_reverse(List,Before_list,Sub_list,After_list) :- + reverse(List,L2), + append(To_after_list,L1,L2), + reverse(To_after_list,After_list), + reverse(L1,L11), + append(Before_list,Sub_list,L11). + +all_distinct1(A) :- sort(A,B),msort(A,B). + +split_list_on_item(A,Item,Lists) :- + (is_list(Item)->Item1=Item;Item1=[Item]), + findall(BID,(((append(B,C,A),append(Item1,F,C), + split_list_on_item(F,Item1,D),delete([B,Item1],[],BI),%foldr(append,[B,Item1],BI), + foldr(append,[BI,D],BID))->true;(A=[]->BID=[];BID=[A]))),Lists1),foldr(append,Lists1,Lists). 
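+
+% Example queries for the helpers above (illustrative):
+% ?- sum([1,2,3],S).
+% S = 6.
+% ?- maximum_length([[a],[b,c],[d]],M).
+% M = 2.
+% ?- all_distinct1([1,2,3]).
+% true.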
+
+
+
diff --git a/listprologinterpreter/la_string_codes.pl b/listprologinterpreter/la_string_codes.pl
new file mode 100644
index 0000000000000000000000000000000000000000..463ca9443f0edfed3a2e090d39444c4fa3ce1902
--- /dev/null
+++ b/listprologinterpreter/la_string_codes.pl
@@ -0,0 +1,18 @@
+% la_string_codes.pl
+
+prepare_string_c(S,O) :-
+ string_codes(S,O).
+
+codes_to_string_c(C,S) :-
+ string_codes(S,C).
+
+/**
+read_string_c(Stream, Chars1, Chars2, End, Result) :-
+ read_string(Stream, Chars1, Chars2, End, Result1),
+ prepare_string(Result1,Result).
+**/
+
+write_to_stream_c(String,I,O) :-
+ prepare_string_c(String,O2),
+ append(I,O2,O).
+
diff --git a/listprologinterpreter/la_strings.pl b/listprologinterpreter/la_strings.pl
new file mode 100644
index 0000000000000000000000000000000000000000..12e7c1ce7202b2c0f8abbeba5305e975096f2b93
--- /dev/null
+++ b/listprologinterpreter/la_strings.pl
@@ -0,0 +1,643 @@
+%% la_strings.pl
+
+:- dynamic html_api_maker_or_terminal/1.
+
+:- use_module(library(pio)).
+:- use_module(library(dcg/basics)).
+
+string_strings(S,S1) :-
+ string_chars(S,S2),
+ findall(S3,(member(S4,S2),atom_string(S4,S3)),S1),!.
+
+get_until_char(S,C,Cs1,Left1) :-
+ string_strings(S,S1),
+ append(Cs,B,S1),(is_list(C)->C1=C; string_strings(C,C1)),
+ append(C1,Left,B),
+ foldr(string_concat,Cs,Cs1),
+ foldr(string_concat,Left,Left1),
+ !.
+get_until_char(S,_C,S,"") :- !.
+
+open_s(File,Mode,Stream) :-
+ atom_string(File1,File),
+ open(File1,Mode,Stream),!.
+
+string_atom(String,Atom) :-
+ atom_string(Atom,String),!.
+
+phrase_from_file_s(string(Output), String) :-
+ atom_string(String1,String),
+ phrase_from_file(string(Output), String1),!.
+
+writeln0(Term) :-
+ %term_to_atom(Term,Atom),
+%append([Term],["\n"],Term1),
+ %append_retry_back_stack([text,Term1]),!.
+ (html_api_maker_or_terminal(html)->
+ (%term_to_atom(Term,Atom),
+ writeln(Term%Atom%,[]
+ )
+ ,format('<br>\n',[]));
+ writeln(Term)),!.
+
+write0(Term) :-
+ %term_to_atom(Term,Atom),
+ %append_retry_back_stack([text,Term])
+ (html_api_maker_or_terminal(html)->
+ (%term_to_atom(Term,Atom),
+ format(Term,[])%,format('\n',[])
+ );
+ write(Term)),!.
+
+writeln1(Term) :-
+ term_to_atom(Term,Atom),
+
+%atom_concat(Atom,"\n",Atom1),
+ %append_retry_back_stack([text,Atom1]),!.
+ (html_api_maker_or_terminal(html)->
+ (%term_to_atom(Term,Atom),
+ format(Atom,[]),format('<br>\n',[]));
+ writeln(Atom)),!.
+
+ %writeln(Atom),!.
+
+write1(Term) :-
+/*
+ term_to_atom(Term,Atom),
+ append_retry_back_stack([text,Atom]),!.
+ %write(Atom),!.
+ */
+ term_to_atom(Term,Atom),
+
+%atom_concat(Atom,"\n",Atom1),
+ %append_retry_back_stack([text,Atom1]),!.
+ (html_api_maker_or_terminal(html)->
+ (%term_to_atom(Term,Atom),
+ format(Atom,[])%,format('\n',[])
+ );
+ write(Atom)),!.
+
+
+
+
+n_to_br(Term,Term1) :-
+ sub_term_types_wa([string,atom],Term,Instances),
+ findall([Add,X],(member([Add,X1],Instances),
+ atomic_list_concat(A,'\n',X1),
+ atomic_list_concat(A,'<br>',X2),
+ (atom(X1)
+ ->X2=X;(atom_string(X2,X4)),X4=X%,term_to_atom(X4,X)
+ )),X3),
+ foldr(put_sub_term_wa_ae,X3,Term,Term1),!.
+
+writeln_br(Term) :-
+ (html_api_maker_or_terminal(html)->
+ (n_to_br(Term,Term1),
+ %term_to_atom(Term1,Atom),
+ write(Term1),write('<br>\n'));
+ (%term_to_atom(Term,Atom),
+ writeln(Term))),!.
+
+write_br(Term) :-
+ (html_api_maker_or_terminal(html)->
+ (n_to_br(Term,Term1),
+ %term_to_atom(Term1,Atom),
+ write(Term1));
+ (%term_to_atom(Term,Atom),
+ write(Term))),!.
+
+
+shell1_s(Command,Out) :-
+ atom_string(Command1,Command),
+ bash_command(Command1,Out),
+ %not(Out=user:fail),
+ !.
+
+shell1_s(Command) :-
+ atom_string(Command1,Command),
+ shell1(Command1),!.
+
+shell1(Command) :-
+ (bash_command(Command,_)->
+ true;
+ (writeln0(["Failed shell1 command: ",Command]),abort)
+ ),!.
+
+bash_command(Command, Output) :-
+        setup_call_cleanup(process_create(path(bash),
+                ['-c', Command],
+                [stdout(pipe(Out))]),
+        read_string(Out, _, Output),
+        close(Out)),!.
+
+foldr(string_concat,[A,B,C],D) :-
+ not(var(A)),var(B),not(var(C)),
+ string_concat(A,G,D),
+ string_concat(B,C,G),!.
+
+% aBc=abc
+foldr(string_concat1,ABC,D) :-
+ findall(E,(member(E,ABC),var(E)),E1),
+ length(E1,1),not(var(D)),
+ member(V,ABC),var(V),!,
+ sub_term_types_wa([var],ABC,[[Ad,_]]),
+ (string(D)->string_strings(D,D11);atom_chars(D,D11)),
+ %trace,
+ findall(M,(member(M1,D11),(catch(number_string(M,M1),_,false)->true;M=M1)),M2),
+ get_sub_term_wa(M2,Ad,V),!.
+
+/*
+foldr(string_concat2,["a","b",C,D,E,"f"],"abcdef").
+
+C = D, D = "",
+E = "cde" ;
+C = "",
+D = "c",
+E = "de" ;
+C = "",
+D = "cd",
+E = "e" ;
+C = E, E = "",
+D = "cde" ;
+C = "c",
+D = "",
+E = "de"
+...
+*/ + +foldr(string_concat2,ABC,D) :- + findall(E,(member(E,ABC),var(E)),E1), + length(E1,3), + not(var(D)), + sub_term_types_wa([var],ABC,[[Ad1,_],[Ad2,_],[Ad3,_]]), + Ad1=[_,N1],Ad2=[_,N2],N2 is N1+1,Ad3=[_,N3],N3 is N2+1, + (string(D)->string_strings(D,D11);atom_chars(D,D11)), + findall(M,(member(M1,D11),(catch(number_string(M,M1),_,false)->true;M=M1)),M2), + %findall(L,(member(L1,ABC),(var(L1)->L=L1;L=[L1])),L2), + + N4 is N1-1, + length(N4L,N4), + length(N4L1,N4), + N5 is 3, + length(N5L,N5), + length(N5L1,N5), + append(N4L,N6,M2), + append(N5L,_,N6), + + append(N4L1,N61,ABC), + append(N5L1,_,N61), + + append(V1,V4,N5L), + append(V2,V3,V4), + + foldr(string_concat,V1,V11), + foldr(string_concat,V2,V21), + foldr(string_concat,V3,V31), + N5L1=[V11,V21,V31]. + + +foldr(Function,A,L,B) :- + reverse(A,C), + foldl(Function,C,L,B),!. + +foldr(append,A,B) :- + foldr(append,A,[],B),!. + +%foldr(append1,A,B) :- +% foldr(append1,A,[],B),!. + +foldr(string_concat,A,B) :- + foldr(string_concat,A,"",B),!. + +foldr(atom_concat,A,B) :- + foldr(atom_concat,A,'',B),!. + +concat_list_terms(A,B) :- + findall(X,(member(X1,A),((string(X1)->true;(atom(X1)->true;number(X1)))->X=X1;term_to_atom(X1,X)%,term_to_atom(X2,X) + )),A1), + concat_list(A1,B),!. + +concat_list([],""):-!. +concat_list(A1,B):- + %A1=[A|List], + concat_list("",A1,B),!. + +concat_list(A,List,B) :- + concat_list1(A,List,B). + %string_codes(B,B1),!. + +concat_list1(A,[],A):-!. +concat_list1(A,List,B) :- + List=[Item|Items], + %string_codes(Item,Item1), + string_concat(A,Item,C), + concat_list1(C,Items,B). + +atom_concat_list([],''):-!. +atom_concat_list(A1,B):- + %A1=[A|List], + atom_concat_list([],A1,B),!. + +atom_concat_list(A,List,B) :- + atom_concat_list1(A,List,B1), + atom_codes(B,B1),!. + +atom_concat_list1(A,[],A):-!. +atom_concat_list1(A,List,B) :- + List=[Item|Items], + atom_codes(Item,Item1), + append(A,Item1,C), + atom_concat_list1(C,Items,B). 
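+
+% Example queries (illustrative):
+% ?- concat_list(["a","b","c"],S).
+% S = "abc".
+% ?- atom_concat_list([a,b],A).
+% A = ab.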
+ +append_list(A1,B):- + %A1=[A|List], + append_list([],A1,B),!. + +append_list(A,[],A):-!. +append_list(A,List,B) :- + List=[Item|Items], + append(A,[Item],C), + append_list(C,Items,B). + +append_list2([],[]):-!. +append_list2(A1,B):- + %A1=[A|List], + append_list2([],A1,B),!. + +append_list2(A,[],A):-!. +append_list2(A,List,B) :- + List=[Item|Items], + append(A,Item,C), + append_list2(C,Items,B). + +list1(A,_,_) :- + (A=[_|_]->true;A=[]),!. + + % splits after !,?,. + +/* +split_string17(String1,List) :- + %string_codes(String2,String1), + test(17,_,Code,_), + %trace, + %writeln1([interpret(off,[[n,grammar1],[String1,Chars,[v,t]]], + % Code,A)]), + interpret(off,[[n,grammar1],[String1,[v,t]]], + Code,[[[[v,t],List]]]),!. +*/ + +% split_string1(Docs,["`"],Input1) - splits and deletes on chars + +split_string(A,B,C) :- + split_string(A,B,B,C),!. + +split_string1a(String1,Chars,List) :- + %string_codes(String2,String1), + test(116,_,Code,_), + %trace, + %writeln1([interpret(off,[[n,grammar1],[String1,Chars,[v,t]]], + % Code,A)]), + interpret(off,[[n,grammar1],[String1,Chars,[v,t]]], + Code,[[[[v,t],List]]]),!. + +/** +?- split_string2("a a",[" "],A). +A = ["a", " ", " ", " ", "a"]. +**/ + +split_string2(String1,Chars,List) :- + %string_codes(String2,String1), + test(117,_,Code,_), + %trace, + %writeln1([interpret(off,[[n,grammar1],[String1,Chars,[v,t]]], + % Code,A)]), + interpret(off,[[n,grammar1],[String1,Chars,[v,t]]], + Code,[[[[v,t],List]]]),!. + +% ?- join_chars_after(["d","e","a","c","f","b","g"],["a","b"],[],L). +% L = ["d", "ea", "c", "fb", "g"]. + +join_chars_after([],_Chars,List,List) :- !. +join_chars_after([List1],_Chars,List2,List3) :- + append(List2,[List1],List3),!. +join_chars_after(List1,Chars,List5,List2) :- + List1=[Char1,Char2|List3], + member(Char2,Chars), + string_concat(Char1,Char2,Char3), + append([Char3],List3,List4), + join_chars_after(List4,Chars,List5,List2),!. 
+join_chars_after(List1,Chars,List5,List2) :- + List1=[Char1,Char2|List3], + not(member(Char2,Chars)), + append([Char2],List3,List4), + append(List5,[Char1],List6), + join_chars_after(List4,Chars,List6,List2),!. + +% split_on_substring117a("AAABAAD","BD",[],A). +% A = ["AAA", "B", "AA", "D"]. + +split_on_substring117a(A,B,_,D) :- + string_codes(A,A1), + string_codes(B,B1), + split_on_substring117(A1,B1,[],D),!. + +% split_on_substring117a("AAABAAD","BD",A). +% A = ["AAA", "B", "AA", "D"]. + +split_on_substring117a(A,B,C) :- + split_on_substring117a(A,B,_,C),!. + + +% split_on_substring117(`AAABAAD`,`BD`,[],A). or +% ?- split_on_substring117([65,65,65,66,65,65,68],[66,68],[],A). +% A = ["AAA", "B", "AA", "D"]. + +split_on_substring117([],_A,E,E) :- !. +split_on_substring117(A,B2,E,C) :- + intersection(A,B2,[]), + string_codes(E1,E), + string_codes(A1,A), + concat_list([E1,A1],C2), + append_list([C2],C), + !. +split_on_substring117(A,B2,E,C) :- + member(B,B2), + append([B],D,A), + %trace, + split_on_substring117(D,B2,[],C1), + string_codes(E1,E), + string_codes(B1,[B]), + %trace, + (E1=""->maplist(append,[[[B1],C1]],[C]); + (%trace, + maplist(append,[[[E1,B1],C1]],[C]))), + !. +split_on_substring117(A,B,E1,C) :- + length(E,1), + append(E,D,A), + append(E1,E,E2), + split_on_substring117(D,B,E2,C),!. + +num_chars(Char,Num1,String) :- + numbers(Num1,1,[],Nums), + findall(Char,(member(_Num2,Nums)),Chars), + concat_list(Chars,String),!. + +/* +replace_new('0ab0','a','c',A). +A = "0cb0". +*/ + +replace_new(A1,Find,Replace,F) :- +string_length(Replace,Replace_l), +string_concat("%",A1,A2), +string_concat(A2,"%",A), split_string(A,Find,Find,B),findall([C,Replace],(member(C,B)),D),maplist(append,[D],[E]),concat_list(E,F1),string_concat(F2,G,F1),string_length(G,Replace_l), + string_concat("%",F3,F2), + string_concat(F,"%",F3),!. + + +% find_first(B,(member(B,[true,false]),B),D). +% D = [true] . +find_first(A) :- + A,!. 
+ +/* +find_first(A,B,B_condition,C) :- + D = (B,(B_condition->!;fail)),findall(A,D,C1), + (C1=[C|_]->true;C=[]),!. +*/ + +% findall_until_fail(B,member(B,[true,true,false,true]),B,D). +% D = [true,true] . + +findall_until_fail(A,B,B_condition,C) :- + D=(B,(B_condition->true;(!,fail))), + findall(A,D,C),!. + +find_until(A,B,C) :- + append(C,B4,A), + append([B],_C4,B4),!. + +% repeat_until_the_same(A=1,A,B is A+1,B,C). +% A=1,B = 2. + +/* +repeat_until_the_same(A,Value,B_initial,B,B_result) :- + %copy_term(B,B1), + A=Value, + %A_initial_assignment, + repeat_until_the_same1(A,B_initial,B,B_result),!. +repeat_until_the_same1(A,B_initial,B,B_result2) :- + %copy_term(A,A1), + copy_term(B_initial,B_initial1), + copy_term(B,B1), + copy_term(B,B2), + copy_term(B_result,B_result1), + A=B_initial,B1,(A=B_result->B_result=B_result2;(repeat_until_the_same1(A,B_initial1,B2,B_result2))),!. +*/ + +% repeat_until_the_same(A,1,(C=1,writeln(C)),C). +% A = C, C = D. + +% repeat_until_the_same(A,1,(random(X),C is floor(3*X),writeln(C)),C). +% 0 +% 2 +% 1 +% A = E, E = C, C = 1, +% X = 0.5791569817112253. + +/* +repeat_until_the_same(A,Value,B,B_res_var) :- + repeat_until_the_same1(A,Value,B_res_var,B,B_res_var,B_res_var),!. + +repeat_until_the_same1(A1,Value,Value,B,B_res_var,B_res_var1) :- + copy_term(A,A1), + A=Value, + B,(B_res_var=B_res_var1-> + true; + repeat_until_the_same1(A1,Value,B_res_var,B,B_res_var,B_res_var1)),!. + + + +% repeat_until_the_same(A,1,((A=1->A2=2;A2=1),writeln(A2)),A2). +% repeat until the last two results are the same. + +repeat_until_the_same(A,Value,B,B_res_var) :- + copy_term(A,A_new), + copy_term(B,B_new), + A=Value, + B, + repeat_until_the_same1(B_new,A_new,%A, + A,Value,B_res_var,B,B_res_var,B_res_var),!. 
+ +repeat_until_the_same1(B_new,A_new,%A, +A,Value,Value,B,B_res_var,B_res_var1) :- + copy_term(A,A1), + copy_term(A_new,A_new1), + copy_term(A_new,A_new2), + copy_term(B_new,B_new1), + A_new=Value, + B_new,(B_res_var=B_res_var1-> + true; + repeat_until_the_same1(B_new1,A_new2,%A2, + A1,Value,B_res_var,B,B_res_var,B_res_var1)),!. +*/ + +% repeat_until_last_two_same(generate_result(Result),Result,Result1). + +% to do: initial values, more result vars + +%repeat_until_last_two_same(Pred,Pred_res_var,Result) :- +% repeat_until_last_two_same(Pred,Pred_res_var, _,_, Result),!. + +/* +repeat_until_last_two_same(Pred,Pred_res_var,A, B, Result) :- + functor(Pred,Name,Arity), + functor(Pred_new,Name,Arity), + copy_term(Pred_res_var,Pred_res_var_new), + %copy_term(Pred_new,Pred_new1), + %copy_term(Pred_new,Pred_new2), + % Generate some result + (Pred,%generate_result(Result1), + (%false%var(Pred_res_var) + %->%var(B) + + %( + %repeat_until_last_two_same(Pred_new1,Pred_res_var,B,Pred_res_var, Result); + +% repeat_until_last_two_same(Result1). + +% to do: initial values, more result vars + +* put into shell command to work with other Prolog predicate arguments + + */ + +repeat_until_last_two_same(Result) :- + repeat_until_last_two_same(_, Result),!. + + repeat_until_last_two_same(B, Result) :- + % Generate some result + generate_result(Result1), + % Check if the last two results are the same + ( ((var(B)->fail;true), + Result1 = B, B = Result)->true; + % If not, continue repeating + repeat_until_last_two_same(Result1, Result) + ),!. + + +/* +% repeat_until_last_two_same(generate_result(Result1),Result1,R). + repeat_until_last_two_same( Pred, Result_var, Result) :- + repeat_until_last_two_same1(Pred, Result_var,_, Result),!. 
+ repeat_until_last_two_same1(Pred, Result_var, B, Result) :-
+ %copy_term(Result_var,Result_var0),
+ functor(Result_var,Name,Arity),
+ functor(Result_var0,Name,Arity),
+
+ % Generate some result
+ Pred,%generate_result(Result1),
+ % Check if the last two results are the same
+ ( ((var(B)->fail;true),
+ Result_var = B, B = Result)->true;
+ % If not, continue repeating
+ repeat_until_last_two_same1(Pred, Result_var0,Result_var, Result)
+ ),!.
+*/
+
+% Predicate to generate a result (replace this with your actual computation)
+generate_result(Result) :-
+ % For example, generate a random integer between 1 and 2
+ random(1, 3, Result),
+ writeln(Result).
+
+%not(A) :- \+(A),!.
+
+% Does not delete "\n" tokens in the result
+split_on_substring1(A,B,D) :-
+ string_codes(A,A1),
+ string_codes(B,B1),
+ split_on_substring117(A1,B1,[],D),!.
+
+% Deletes "\n" tokens that follow text in the result
+split_on_substring(A,B,_,D) :-
+ split_on_substring(A,B,D),!.
+
+split_on_substring(A,B,D) :-
+ string_codes(A,A1),
+ string_codes(B,B1),
+ split_on_substring117(A1,B1,[],D1),
+ (D1=[]->D2=[""];D1=D2),
+ %D2=D,%delete_double_newlines(D2,[],D),
+ append(_D6,B4,D2),
+ append([B5],_C4,B4),
+ not(B5="\n"),
+ delete_newlines_after_text(B4,[],D),!.
+
+split_string1(A,B,C) :-
+ split_string1(A,B,_,C),!.
+
+split_string1(A,B,_,C) :-
+ %string_strings(B,B2),
+ split_on_substring(A,B,C),
+ %findall(D,(member(D1,C1),(member(D1,B2)->D="";D=D1)),C),
+ !.
+
+split_string1b(A,B,C) :-
+ split_string1b(A,B,_,C),!.
+
+split_string1b(A,B,_,C) :-
+ string_strings(B,B2),
+ split_on_substring(A,B,C1),
+ findall(D,(member(D1,C1),(member(D1,B2)->D="";D=D1)),C),!.
+
+delete_newlines_after_text([],A,A) :- !.
+delete_newlines_after_text([A],B,C) :-
+ append(B,[A],C),!.
+delete_newlines_after_text(D2,D1,D) :-
+ D2=[T,"\n"|D4],
+ append(D5,B4,D4),
+ append([B],_C4,B4),
+ not(B="\n"),
+ length(D5,L),
+ numbers(L,1,[],Ns),
+ findall("",member(_,Ns),D51),
+ foldr(append,[D1,[T],D51],D6),
+ delete_newlines_after_text(B4,D6,D),!.
+delete_newlines_after_text(D2,D1,D) :- + D2=[T|D4], + %append(D5,B4,D4), + %append([B],_C4,B4), + %not(B="\n"), + foldr(append,[D1,[T]],D6), + delete_newlines_after_text(D4,D6,D),!. + +/* +delete_double_newlines([],D,D) :- !. +delete_double_newlines(D2,D5,D) :- + trace, + D2=["\n","\n"|D4], + append(D5,["\n"],D6), + delete_double_newlines(D4,D6,D),!. +delete_double_newlines(D2,D5,D) :- + trace, + D2=["\n"|D4], + append(D5,[],D6), + delete_double_newlines(D4,D6,D),!. +delete_double_newlines(D2,D5,D) :- + D2=[D3|D4], + append(D5,[D3],D6), + delete_double_newlines(D4,D6,D),!. +*/ + +retract_all(A) :- retractall(A). + +remove_dups([],[]) :- !. +remove_dups([Head|Tail],Result):- + member(Head,Tail), + remove_dups(Tail,Result),!. +remove_dups([Head|Tail],[Head|Result]):- + remove_dups(Tail,Result),!. + + diff --git a/listprologinterpreter/la_strings_string.pl b/listprologinterpreter/la_strings_string.pl new file mode 100644 index 0000000000000000000000000000000000000000..1baa77c179426362a791fe9ad79f6f47fa6be697 --- /dev/null +++ b/listprologinterpreter/la_strings_string.pl @@ -0,0 +1,8 @@ +string(String) --> list(String). + +list([]) --> []. +list([L|Ls]) --> [L], list(Ls). + +word1([])-->[]. +word1([A|As]) --> [A],word1(As),{%%atom_codes(A,AC), +char_type(A,alpha)},!. diff --git a/listprologinterpreter/la_terms.pl b/listprologinterpreter/la_terms.pl new file mode 100644 index 0000000000000000000000000000000000000000..e74f2ba66b2448f22d10864e4a3d69eec462324a --- /dev/null +++ b/listprologinterpreter/la_terms.pl @@ -0,0 +1,7 @@ +replace_term(A,F,R,B) :- + sub_term_wa(F,A,In), + findall([Ad,R],member([Ad,_],In),In2), + foldr(put_sub_term_wa_ae,In2,A,B),!. + +only_item(A) :- + (atom(A)->true;(number(A)->true;string(A))),!. 
\ No newline at end of file diff --git a/listprologinterpreter/languages/en2.txt b/listprologinterpreter/languages/en2.txt new file mode 100644 index 0000000000000000000000000000000000000000..6ebef65b2af705ee0d151edc5f09aadfe3accddb --- /dev/null +++ b/listprologinterpreter/languages/en2.txt @@ -0,0 +1,144 @@ +[["Press c to creep or a to abort.","Press c to creep or a to abort.","en2","Press c to creep or a to abort."],["Press c to creep, s to skip or a to abort.","Press c to creep, s to skip or a to abort.","en2","Press c to creep, s to skip or a to abort."],["Type check","Type check","en2","Type check"],["abort","abort","en2","abort"],["any","any","en2","any"],["append","append","en2","append"],["atom","atom","en2","atom"],["brackets","brackets","en2","brackets"],["call","call","en2","call"],["ceiling","ceiling","en2","ceiling"],["code","code","en2","code"],["creep","creep","en2","creep"],["cut","cut","en2","cut"],["date","date","en2","date"],["delete","delete","en2","delete"],["equals","equals","en2","equals"],["exit","exit","en2","exit"],["fail","fail","en2","fail"],["findall","find all","en2","find all"],["grammar","grammar","en2","grammar"],["head","head","en2","head"],["input type check","input type check","en2","input type check"],["is","is","en2","is"],["length","length","en2","length"],["letters","letters","en2","letters"],["list","list","en2","list"],["maplist","map list","en2","map list"],["member","member","en2","member"],["n","n","en2","n"],["not","not","en2","not"],["number","number","en2","number"],["or","or","en2","or"],["predicatename","predicate name","en2","predicate name"],["random","random","en2","random"],["round","round","en2","round"],["skip","skip","en2","skip"],["string","string","en2","string"],["string_from_file","string from file","en2","string from file"],["stringconcat","concatenate strings","en2","concatenate strings"],["string_concat","concatenate strings","en2","concatenate strings"],["stringtonumber","string to 
number","en2","string to number"],["sys","system","en2","system"],["equals4_on","equals4 on","en2","equals4 on"],["equals4_off","equals4 off","en2","equals4 off"],["t","t","en2","t"],["tail","tail","en2","tail"],["true","true","en2","true"],["unwrap","unwrap","en2","unwrap"],["v","v","en2","v"],["var","variable","en2","variable"],["vgp","grammar part variable","en2","grammar part variable"],["wrap","wrap","en2","wrap"],["input","input","en2","input"],["output","output","en2","output"],["string_length","string length","en2","string length"],["sort","sort","en2","sort"],["intersection","intersection","en2","intersection"],["read_string","read string","en2","read string"],["read_password","read password","en2","read password"],["writeln","write line","en2","write line"],["atom_string","atom string","en2","atom string"],["trace","trace","en2","trace"],["notrace","no trace","en2","no trace"],["sqrt","square root","en2","square root"],["notrace","no trace","en2","no trace"],["get_lang_word","get language word","en2","get language word"],["query_box","query box","en2","query box"], + +["on_true","on true","en2","on true"], +["go_after","go after","en2","go after"], +["on_false","on false","en2","on false"], +["go_to_predicates","go to predicates","en2","go to predicates"], +["exit_function","exit function","en2","exit function"], +["fail_function","fail function","en2","fail function"], +["findall_exit_function","exit find all function","en2","exit find all function"], +["findall_fail_function","fail find all function","en2","fail find all function"], + +["abs","absolute","en2","absolute"], + +["at_end_of_stream","at end of stream","en2","at end of stream"], + +["atan","arc tangent","en2","arc tangent"], + +["atom_chars","atom characters","en2","atom characters"], + +["atom_codes","atom codes","en2","atom codes"], + +["atom_concat","concatenate atoms","en2","concatenate atoms"], + +["atom_length","atom length","en2","atom length"], + +["atomic","atomic","en2","atomic"], + 
+["char_code","character code","en2","character code"], + +["close","close","en2","close"], + +["compound","compound","en2","compound"], + +["copy_term","copy term","en2","copy term"], + +["cos","cosine","en2","cosine"], + +["current_input","current input","en2","current input"], + +["current_output","current output","en2","current output"], + +["current_prolog_flag","current prolog flag","en2","current prolog flag"], + +["float","float","en2","float"], + +["floor","floor","en2","floor"], + +["get_byte","get byte","en2","get byte"], + +["get_char","get character","en2","get character"], + +["get_code","get code","en2","get code"], + +["halt","halt","en2","halt"], + +["integer","integer","en2","integer"], + +["log","logarithm","en2","logarithm"], + +["nl","new line","en2","new line"], + +["nonvar","non variable","en2","non variable"], + +["number_chars","number characters","en2","number characters"], + +["number_codes","number codes","en2","number codes"], + +["open","open","en2","open"], + +["peek_byte","peek byte","en2","peek byte"], + +["peek_char","peek character","en2","peek character"], + +["peek_code","peek code","en2","peek code"], + +["put_byte","put byte","en2","put byte"], + +["put_char","put character","en2","put character"], + +["put_code","put code","en2","put code"], + +["read","read","en2","read"], + +["read_password","read password","en2","read password"], + +["read_term","read term","en2","read term"], + +["round","round","en2","round"], + +["set_input","set input","en2","set input"], + +["set_output","set output","en2","set output"], + +["set_prolog_flag","set prolog flag","en2","set prolog flag"], + +["set_stream_position","set stream position","en2","set stream position"], + +["sign","sign","en2","sign"], + +["sin","sine","en2","sine"], + +["sqrt","square root","en2","square root"], + +["stream_property","stream property","en2","stream property"], + +["string_chars","string characters","en2","string characters"], + 
+["sub_atom","subatom","en2","subatom"], + +["text_area","text area","en2","text area"], + +["write","write","en2","write"], + +["word1","word 1","en2","word 1"], + +["sub_string","substring","en2","substring"], + +["interpret","interpret","en2","interpret"], + +["writeq","write q","en2","write q"], + +["shell_pl","shell prolog","en2","shell prolog"], + +["shell_c","shell c","en2","shell c"], + +["assertz","assert z","en2","assert z"], + +["retractall","retract all","en2","retract all"], + +["number_string","number string","en2","number string"], + +["get_single_char","get single character","en2","get single character"], + +["getz","get z","en2","get z"], + +["comment","comment","en2","comment"], + +["split_on_substring117a","split on substring 117 a","en2","split on substring 117 a"], + +["open_file_s","open file string","en2","open file string"], + +["open_string_file_s","open string file string","en2","open string file string"] + +] \ No newline at end of file diff --git a/listprologinterpreter/listprolog.pl b/listprologinterpreter/listprolog.pl new file mode 100644 index 0000000000000000000000000000000000000000..d2b90de75ef69c1aa4a187e3b82f407c32f8ebb5 --- /dev/null +++ b/listprologinterpreter/listprolog.pl @@ -0,0 +1,43 @@ +:-style_check(-discontiguous). +:-style_check(-singleton). + +:- include('grammar.pl'). +%%:- include('lpi_caw_commands.pl'). +:- include('listprologinterpreter1listrecursion4.pl'). +:- include('listprologinterpreter3preds5.pl'). +:- include('lpiverify4.pl'). +:- include('lpiverify4_open.pl'). +:- include('lpiverify4_types.pl'). +:- include('lpiverify4_open_types.pl'). +:- include('lpiverify_pl.pl'). +%%:- include('caw5 copy 12.pl'). +%%:- include('cawpverify.pl'). +%%:- include('rcawp.pl'). +%%:- include('rcaw.pl'). +%%:- include('texttobr2.pl'). +:- include('la_strings.pl'). +:- include('la_string_codes.pl'). +:- include('la_maths.pl'). +:- include('la_files.pl'). +:- include('la_strings_string.pl'). 
+%:- include('../Languages/lang_db_generator.pl'). % leave off, run separately through Languages +%:- include('lpiverify4-fr.pl'). +:- include('operators.pl'). +:- include('lpiverify4_test_lang_all.pl'). +:- include('lpiverify4_test_bt_lang_all.pl'). +:- include('../Languages/make_docs.pl'). +:- include('../SSI/find_pred_sm.pl'). +:- include('e4_fa_get_vals.pl'). +%:- include('equals4_first_args.pl'). +:- include('expression_not_var.pl'). +:- include('collect_arguments.pl'). +:- include('reserved_words2.pl'). +:- include('expand_types.pl'). +:- include('replace_in_term.pl'). +:- include('preds_converters_and_matrices.pl'). +%:- include('numbers_of_items_correspond.pl'). +:- include('match_get_put_vals.pl'). +%:- include('insert_cuts.pl'). +:- include('../Philosophy/sub_term_with_address.pl'). +:-include('../List-Prolog-to-Prolog-Converter/lp2pconverter2.pl'). +:-include('la_terms.pl'). \ No newline at end of file diff --git a/listprologinterpreter/listprologinterpreter1listrecursion4.pl b/listprologinterpreter/listprologinterpreter1listrecursion4.pl new file mode 100644 index 0000000000000000000000000000000000000000..220c81dd43c9110817575ae7b42d2519a3eb5480 --- /dev/null +++ b/listprologinterpreter/listprologinterpreter1listrecursion4.pl @@ -0,0 +1,3035 @@ +:- use_module(library(date)). + +:- dynamic debug/1. +:- dynamic cut/1. +:- dynamic leash1/1. +:- dynamic types/1. +:- dynamic typestatements/1. +:- dynamic modestatements/1. +:- dynamic sys/1. +:- dynamic equals4/1. +:- dynamic query_box_n/1. +:- dynamic save_debug/1. +:- dynamic saved_debug/1. + +:- dynamic lang/1. + +:- dynamic retry_back/1. +:- dynamic retry_back_stack/1. +:- dynamic retry_back_stack_n/1. +:- dynamic cumulative_or_current_text/1. +:- dynamic number_of_current_text/1. +:- dynamic html_api_maker_or_terminal/1. + +:- dynamic occurs_check/1. + +%:- dynamic assertz_functions/1. + +command_n_sols(10). 
+ +main3:- +time(( +test_all00("en",off,NTotal1,Score1), +test_all00("en2",off,NTotal2,Score2), +test_all_bt00("en2",off,NTotal3,Score3), +writeln(test_all00("en",off,NTotal1,Score1)), +writeln(test_all00("en2",off,NTotal2,Score2)), +writeln(test_all_bt00("en2",off,NTotal3,Score3)) +)). + +/** List Prolog Interpreter **/ + +interpret(Debug,Query,Functions1,Result) :- + international_interpret([lang,"en"], + Debug,Query,Functions1,Result). + +international_interpret([lang,Lang],Debug,Query,Functions1,Result) :- + retractall(lang(_)), + assertz(lang(Lang)), + interpret_1(Debug,Query,Functions1,Result). + +international_interpret([lang,Lang],Debug,Query,TypeStatements,ModeStatements,Functions1,Result) :- + retractall(lang(_)), + assertz(lang(Lang)), + interpret_1(Debug,Query,TypeStatements,ModeStatements,Functions1,Result). + + +interpret_1(Debug,Query,Functions1,Result) :- + retractall(types(_)), + assertz(types(off)), +interpret11(Debug,Query,Functions1,Result). + +interpret_1(Debug,Query,TypeStatements,ModeStatements,Functions1,Result) :- + retractall(types(_)), + assertz(types(on)), + retractall(typestatements(_)), + +%writeln(here1), + findall([A,C],(member([A,B],TypeStatements),expand_types(B,[],C)),TypeStatements1), + assertz(typestatements(TypeStatements1)), + retractall(modestatements(_)), + assertz(modestatements(ModeStatements)), +interpret11(Debug,Query,Functions1,Result). 
+ +interpret11(Debug,Query,Functions,Result) :- + ((not(lang(_Lang1)) + %var(Lang1) + )-> + (retractall(lang(_)), + assertz(lang("en"))); + true), + load_lang_db, + + query_box(Query,Query1,Functions,Functions1), + +%trace, +%writeln1(query_box(Query,Query1,Functions,Functions1)), +%%writeln1([i1]), + %%writeln1(convert_to_grammar_part1(Functions1,[],Functions2,_)), + convert_to_grammar_part1(Functions1,[],Functions2,_), + + %insert_cuts(Functions2a,Functions2), + %retractall(assertz_functions(_)), + %assertz(assertz_functions(Functions2)), + %trace, + %writeln1(convert_to_grammar_part1(Functions1,[],Functions2,_)), + %writeln1(Functions2), + %%pp3(Functions2), + %%writeln1(interpret1(Debug,Query,Functions2,Functions2,Result)), + + %writeln1(interpret1(Debug,Query1,Functions2,Functions2,Result1)), + findall(Result1,interpret1(Debug,Query1,Functions2,Functions2,Result1),Result). + +query_box(Query,Query1,Functions,Functions1) :- +get_lang_word("n",Dbw_n), + + collect_arguments_body2([Query],[],Arguments1), + msort(Arguments1,Arguments), + %trace, + find_query_box_n(Query_box_n), + (Arguments=[]-> + (Query1=[[Dbw_n,Query_box_n],[]], + append( + [ + [[Dbw_n,Query_box_n],[],":-", + [ + Query + ]] + ] + , + Functions,Functions1)); + (Query1=[[Dbw_n,Query_box_n],Arguments], + append( + [ + [[Dbw_n,Query_box_n],Arguments,":-", + [ + Query%,%[[n,trace2]] + ]] + ] + , + Functions,Functions1))). 
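query_box above wraps the top-level query in a freshly numbered query_box_N predicate and prepends it to the program, so the interpreter can run the query as an ordinary call. A rough Python sketch over the same nested-list term representation (a simplified illustration: the real code collects the query's variables with collect_arguments_body2 and numbers the box via find_query_box_n, both of which are reduced to parameters here):

```python
def query_box(query, functions, box_n=1):
    """Wrap a List Prolog query [[n,Name],Args] in a generated
    [[n,query_box_N],Args,":-",[Query]] clause prepended to the
    function list, and return the rewritten query plus program."""
    args = query[1] if len(query) > 1 else []
    head = ["n", "query_box_%d" % box_n]
    wrapper = [head, args, ":-", [query]]   # the generated clause
    return [head, args], [wrapper] + functions

new_query, program = query_box([["n", "foo"], [["v", "a"]]], [])
print(new_query)   # [['n', 'query_box_1'], [['v', 'a']]]
```

The wrapper clause lets member1's normal clause-matching machinery drive the query, instead of treating the top-level goal as a special case.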
+ +interpret1(Debug,Query,Functions1,Functions2,Result) :- +%%writeln1([i11]), + retractall(debug(_)), + assertz(debug(Debug)), + retractall(cut(_)), + assertz(cut(off)), + retractall(leash1(_)), + assertz(leash1(off)), %% Should normally be off + retractall(sys(_)), + assertz(sys(1)), + (not(equals4(_Equals4))->(retractall(equals4(_)),assertz(equals4(on)));true),%equals4(Equals4)), + %trace, + (not(save_debug(_))->(retractall(save_debug(_)),assertz(save_debug(off)));true), + + (not(occurs_check(_))->(retractall(occurs_check(_)),assertz(occurs_check(off)));true), + + +retractall(retry_back(_)), + retractall(retry_back_stack(_)), + retractall(retry_back_stack_n(_)), + retractall(cumulative_or_current_text(_)), + retractall(number_of_current_text(_)), + + assertz(retry_back(off)), % on - retry/back mode options available in trace mode + assertz(retry_back_stack([])), % on - retry/back mode options available in trace mode + assertz(retry_back_stack_n(0)), + assertz(cumulative_or_current_text(current)), + assertz(number_of_current_text(1)), + + %%writeln1(member1(Query,Functions1,Functions2,Result)), + member1(Query,Functions1,Functions2,Result). +%%member1([_,R],_,[],R). +%%member1(_,_,[],[]). +member1(_Query,_,_,[],_) :- %%writeln1(["The query",Query,"matches no predicates."]), +fail. 
+member1(Query,Functions,Functions2,Vars8) :- +%%writeln1([m1]), + (cut(off)->( + (Query=[Function,Arguments1], + (Functions2=[[Function,Arguments2,":-",Body]|_Functions3]), + length(Arguments1,Length), + length(Arguments2,Length), + + debug_call(Skip,[Function,Arguments1]), + +(( +checktypes_inputs(Function,Arguments1), + + %%writeln1(checkarguments(Arguments1,Arguments2,[],Vars1,[],FirstArgs)), + %trace, + %(Function=[n,query_box_1]->true;trace), + checkarguments(Arguments1,Arguments2,[],Vars1,[],FirstArgs), + %notrace, + %%->ca2 +%writeln1([checkarguments,"Arguments1",Arguments1,"Arguments2",Arguments2,"Vars1",Vars1,"FirstArgs",FirstArgs]), + interpretbody(Functions,Functions2,Vars1,Vars2,Body,true), + updatevars(FirstArgs,Vars2,[],Result), + %writeln1(updatevars(FirstArgs,Vars2,[],Result)), + %trace, + unique1(Result,[],Vars8) + %writeln1(unique1(Result,[],Vars8))%,notrace + )->debug_fail_fail(Skip);debug_fail(Skip,[Function,Arguments1])) + , + findresult3(Arguments1,Vars8,[],Result2), + debug_exit(Skip,[Function,Result2]), + + %trace, + %%writeln1(updatevars(FirstArgs,Vars2,[],Result)), + %trace, + + %notrace, + %%reverse(Result,[],Vars7), + ((true->%not(Result=[])-> + %%Result=[Var71|Vars72], + %%writeln1(unique1(Result,[],Vars8)), + (true +%%writeln1(["FirstArgs",FirstArgs,"Vars",Vars2,"Result",Result,"Vars7",Vars7,"Vars72",Vars72,"Var71",Var71,"Vars8",Vars8]), +%%writeln1(["Vars8",Vars8]), + %%writeln1(findresult3(Arguments1,Vars8,[],Result2)), + %trace, + +%writeln1([findresult3,"Arguments1",Arguments1,"Vars8",Vars8,"Result2",Result2]) + );( +%%writeln1(here1), + Vars8=[],Result2=[]))), +%%writeln1(["Arguments1",Arguments1,"Vars2",Vars2,"Result",Result]), + %trace, + checktypes(Function,Result2) + + ) + ; + (%%Query=[Function,_Arguments1], + %%Functions2=[[Function,_Arguments2,":-",_Body]|Functions3], %% make like previous trunk? + member11(Query,Functions,Functions2,Vars8)) + );(turncut(off),fail%%,Result=[] + )). 
+member11(Query,Functions,Functions2,Result) :- +%%writeln1([m11]), +%%writeln1(["Query",Query,"Functions",Functions,"Functions2",Functions2,"Result",Result]), + (cut(off)->( + (Query=[Function], + (Functions2=[[Function,":-",Body]|_Functions3]), + debug_call(Skip,[Function]), + Result=[], + (interpretbody(Functions,Functions2,[],_Vars2,Body,true)->debug_fail_fail(Skip);debug_fail(Skip,[Function])), + debug_exit(Skip,[Function]) + ); + (%%Query=[Function], + %%Functions2=[[Function]|Functions3], + member12(Query,Functions,Functions2,Result)) + );(turncut(off),fail)). +member12(Query,Functions,Functions2,Vars8) :- +%%writeln1([m12]), + (cut(off)->( + (Query=[Function,Arguments1], + (Functions2=[[Function,Arguments2]|_Functions3]), + length(Arguments1,Length), + length(Arguments2,Length), + +debug_call(Skip,[Function,Arguments1]), + + (( + checktypes_inputs(Function,Arguments1), + + checkarguments(Arguments1,Arguments2,[],Vars1,[],FirstArgs), + + +%%writeln1([checkarguments,"Arguments1",Arguments1,"Arguments2",Arguments2,"Vars1",Vars1,"FirstArgs",FirstArgs]), + updatevars(FirstArgs,Vars1,[],Result), + %%reverse(Result,[],Vars7), + ((%not + true->%(Result=[])-> + %%Result=[Var71|Vars72], + (%trace, + unique1(Result,[],Vars8),%notrace, + findresult3(Arguments1,Vars8,[],Result2) + );( +%%writeln1(here2), + Vars8=[],Result2=[]))))->debug_fail_fail(Skip); + debug_fail(Skip,[Function,Arguments1])), + + debug_exit(Skip,[Function,Result2]), + checktypes(Function,Result2) + + ); + (%%Query=[Function,_Arguments1], + %%Functions2=[[Function,_Arguments2]|Functions3], + member13(Query,Functions,Functions2,Vars8)) + );(turncut(off),fail)). 
+member13(Query,Functions,Functions2,Result) :- +%%writeln1([m13]), + (cut(off)->( + (Query=[Function], + (Functions2=[[Function]|_Functions3]), + debug_call(Skip,[Function]), + Result=[], + %%interpretbody(Functions,[],_Vars2,Body,true), + debug_exit(Skip,[Function]) + );%%->true; + (%%Query=[Function], + Functions2=[_Function|Functions3], + member1(Query,Functions,Functions3,Result)) + );(turncut(off),fail)). +interpret2(Query,Functions1,Functions2,Result) :- +%%writeln1(i2), +%%writeln1(["%%interpret2 Query",Query,"Functions1",Functions1,"Functions2",Functions2]), + member2(Query,Functions1,Functions2,Result). +%%member2([_,R],_,[],R). +%%member2(_,_,[],[]). +member2(_Query,_,_,[],_) :- %%writeln1(["The query",Query,"matches no predicates."]), +fail. +member2(Query,Functions,Functions2,Vars8) :- +%writeln1(member2(Query,Functions,Functions2,Vars8)), +%%writeln1([m2]), + (cut(off)->( + (%trace, + Query=[Function,Arguments1], + (Functions2=[[Function,Arguments2,":-",Body]|_Functions3]), + length(Arguments1,Length), + length(Arguments2,Length), + +debug_call(Skip,[Function,Arguments1]), + + (( + checktypes_inputs(Function,Arguments1), + + checkarguments(Arguments1,Arguments2,[],Vars1,[],FirstArgs), + + +%%writeln1([checkarguments,"Arguments1",Arguments1,"Arguments2",Arguments2,"Vars1",Vars1,"FirstArgs",FirstArgs]), + interpretbody(Functions,Functions2,Vars1,Vars2,Body,true), + updatevars(FirstArgs,Vars2,[],Result), + %trace, + unique1(Result,[],Vars8)%,notrace + )->debug_fail_fail(Skip); + debug_fail(Skip,[Function,Arguments1])), %%**arg2 change +%%writeln1(["Functions",Functions,"Functions2",Functions2,"Vars1",Vars1,"Vars2",Vars2,"Body",Body]), + %trace, + + %%reverse(Result,[],Vars7), + (true->%not(Result=[])-> + %%Result=[Var71|Vars72], + (true, + findresult3(Arguments1,Vars8,[],Result2) +%%writeln1(["Vars2",Vars2,"Result",Result]), + );( + %%writeln1(here3), + Vars8=[],Result2=[])), + debug_exit(Skip,[Function,Result2]), + checktypes(Function,Result2) + + 
);%%->true; + (%%Query=[Function,_Arguments1], + %%Functions2=[[Function,_Arguments2,":-",_Body]|Functions3], + member21(Query,Functions,Functions2,Vars8)) + );(turncut(off),fail)). +member21(Query,Functions,Functions2,Result) :- +%%writeln1([m21]), + (cut(off)->( + (Query=[Function], + (Functions2=[[Function,":-",Body]|_Functions3]), + Vars1=[], + debug_call(Skip,[Function]), + (interpretbody(Functions,Functions2,Vars1,_Vars2,Body,true)->debug_fail_fail(Skip); + debug_fail(Skip,[Function])), %%**arg2 change + debug_exit(Skip,[Function]) + );%%->true; + (%%Query=[Function], + %%Functions2=[[Function]|Functions3], + member22(Query,Functions,Functions2,Result)) + );(turncut(off),fail)). +member22(Query,Functions,Functions2,Vars8) :- +%%writeln1([m22]), + (cut(off)->( + (Query=[Function,Arguments1], + (Functions2=[[Function,Arguments2]|_Functions3]), + length(Arguments1,Length), + length(Arguments2,Length), + +debug_call(Skip,[Function,Arguments1]), + + (( + checktypes_inputs(Function,Arguments1), + + checkarguments(Arguments1,Arguments2,[],Vars1,[],FirstArgs), + + +%%writeln1([checkarguments,"Arguments1",Arguments1,"Arguments2",Arguments2,"Vars1",Vars1,"FirstArgs",FirstArgs]), + updatevars(FirstArgs,Vars1,[],Result), + %%reverse(Result,[],Vars7), + (true->%not(Result=[])-> + %%Result=[Var71|Vars72], + (%trace, + unique1(Result,[],Vars8),%notrace, + findresult3(Arguments1,Vars8,[],Result2) + );( +%%writeln1(here4), + Vars8=[],Result2=[])), + checktypes(Function,Result2) + + )->debug_fail_fail(Skip); + debug_fail(Skip,[Function,Arguments1])), %%**arg2 change + debug_exit(Skip,[Function,Result2]) + + );%%->true; + (%%Query=[Function,_Arguments1], + %%Functions2=[[Function,_Arguments2]|Functions3], + member23(Query,Functions,Functions2,Vars8)) + );(turncut(off),fail)). 
+member23(Query,Functions,Functions2,Vars8) :- +%%writeln1([m23]), + (cut(off)->( + (Query=[Function], + (Functions2=[[Function]|_Functions3]), + debug_call(Skip,[Function]), + Vars8=[], + debug_exit(Skip,[Function]) + );%%->true; + (%%Query=[Function], + Functions2=[_Function|Functions3], + member2(Query,Functions,Functions3,Vars8)) + );(turncut(off),fail)). + +turn_occurs_check(on) :- + retractall(occurs_check(_)),assertz(occurs_check(on)),!. +turn_occurs_check(off) :- + retractall(occurs_check(_)),assertz(occurs_check(off)),!. + +checkarguments(Variable1a,Variable2a,Vars1,Vars2,_,FirstArgs2) :- +%writeln1(a1checkarguments(Variable1a,Variable2a,Vars1,Vars2,_,FirstArgs2)), +%trace, +%(a1checkarguments(Variable1a,Variable2a,Vars1,Vars2,_,FirstArgs2)=a1checkarguments([[v,a],[[v,b],[v,c]]],[[[v,d],[v,e]],[v,f]],[],_346892,_346906,_346894)->trace;true), + +%(not(Variable1a=[[v,a],[v,d]])->trace;true), +%writeln([Variable1a,Variable2a]), +%(Variable2a=[[[v,a],"|",_]|_]->trace;true), + simplify(Variable1a,Variable1), + simplify(Variable2a,Variable2), + (equals4(on)->checkarguments1(Variable1,Variable2,Vars1,Vars2,[],FirstArgs2); +checkarguments2(Variable1,Variable2,Vars1,Vars2,[],FirstArgs2)), +% fail if two bs in a,b c,b in first args +%trace, +/*length(FirstArgs2,L), +findall(FA3,(member([_,FA3],FirstArgs2),not(expression_not_var(FA3))),FA4),sort(FA4,FA5), +findall(FA6,(member([_,FA6],FirstArgs2),expression_not_var(FA6)),FA7),append(FA5,FA7,FA8),length(FA8,L), +*/ +%writeln1(a2checkarguments(Variable1a,Variable2a,Vars1,Vars2,_,FirstArgs2)), + +!. 
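checkarguments above matches a call's argument list against a clause head's parameter list (with equals4 off it falls through to the checkarguments2 clauses further down): concrete values bind head variables, caller variables are recorded so results can be copied back afterwards, and two distinct constants make the match fail. A simplified Python sketch under those assumptions (flat argument lists only; the helper names are ours):

```python
def check_arguments(call_args, head_params):
    """Match a call's arguments against a clause head's parameters,
    in the spirit of the checkarguments2 clauses: values bind head
    variables, caller variables are linked in first_args for copying
    results back, and two distinct constants make the match fail."""
    def is_var(term):
        return isinstance(term, list) and len(term) == 2 and term[0] == "v"

    bindings, first_args = {}, []
    for arg, param in zip(call_args, head_params):
        if not is_var(arg) and is_var(param):
            bindings[tuple(param)] = arg              # value binds head variable
        elif is_var(arg) and not is_var(param):
            first_args.append((tuple(arg), param))    # caller var takes head's value
        elif is_var(arg) and is_var(param):
            first_args.append((tuple(arg), tuple(param)))  # link the two variables
        elif arg != param:
            return None                               # distinct constants: no match
    return bindings, first_args
```

Here returning None plays the role of the clause failing; the real predicates also thread bindings through putvalue/getvalue and support the richer structural matching of checkarguments1.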
+ +checkarguments1(Variable1,Variable2,Vars1,Vars2,_,FirstArgs2) :- +%trace, +%writeln1(checkarguments1(Variable1,Variable2,Vars1,Vars2,_,FirstArgs2)), +occurs_check(Variable1,Variable2), +%notrace, %trace, + replace_vars(Variable1,[],Variable1a,[],First_vars1), + %writeln1(replace_vars(Variable1,[],Variable1a,[],First_vars1)), + + replace_vars(Variable2,[],Variable2a,[],First_vars2), + %writeln1(replace_vars(Variable2,[],Variable2a,[],First_vars2)), + + append(First_vars1,First_vars2,First_vars3), + + %match4_21(Variable2a,Variable1a,Vars1,Vars3), + match4_new_22(Variable2a,Variable1a,Vars1,Vars3%,standard + ), + + %writeln1(match4_21(Variable2a,Variable1a,Vars1,Vars3)), +% match4_21(Arguments2,Arguments1,Vars1,Vars2), + + replace_first_vars1(Vars3,First_vars2,[],Vars2a), + %writeln1(replace_first_vars1(Vars3,First_vars2,[],Vars2a)), + + replace_vars011(Vars2a,_Variable1a,[],Vars2b), % Vars2b->Vars2 + %writeln1(replace_vars011(Vars2a,_Variable1a,[],Vars2b)), + + equals4_first_args(Variable1a,Variable2a,FirstArgs3), + %writeln1(equals4_first_args(Variable1a,Variable2a,FirstArgs3)), + + replace_first_vars1(Vars2b,First_vars1,[],Vars2), + %writeln1(replace_first_vars1(Vars2b,First_vars1,[],Vars2)), + %equals4_first_args(Vars2b,Vars2,FirstArgs3), + + replace_first_vars2(FirstArgs3,First_vars3,[],FirstArgs2), + %writeln1(replace_first_vars2(FirstArgs3,First_vars3,[],FirstArgs2)), + !. + + +checkarguments2([],[],Vars,Vars,FirstArgs,FirstArgs) :- !. 
+/* +checkarguments2(Arguments1,Arguments2,Vars1,Vars2,FirstArgs1,FirstArgs2) :- %% +%%writeln1(1), +%trace, + Arguments1=[Value|Arguments3], %% Value may be a number, string, list or tree + %expressionnotatom3(Value), + Arguments2=[Variable2|Arguments4], + %trace, + (Value=[_,'_']->true;Variable2=[_,'_']), + + %not(var(Variable2)),isvar(Variable2), + %((Value=[_,'_']->true;Variable2=[_,'_'])-> + %Vars1=Vars3; + %( + %get_lang_word("v",Dbw_v), + putvalue(Variable2,undef,%[Dbw_v,'_'], + Vars1,Vars3),%)), + checkarguments2(Arguments3,Arguments4,Vars3,Vars2,FirstArgs1,FirstArgs2),!. + checkarguments2(Arguments1,Arguments2,Vars1,Vars2,FirstArgs1,FirstArgs2) :- %% +%%writeln1(1), +trace, + Arguments1=[Value|Arguments3], %% Value may be a number, string, list or tree + %expressionnotatom3(Value), + Arguments2=[Variable2|Arguments4], + %trace, + (Value=undef->true;Variable2=undef), + + %not(var(Variable2)),isvar(Variable2), + %((Value=[_,'_']->true;Variable2=[_,'_'])-> + %Vars1=Vars3; + %( + %get_lang_word("v",Dbw_v), + Vars1=Vars3, + %putvalue(Variable2,undef,%[Dbw_v,'_'], + %Vars1,Vars3),%)), + checkarguments2(Arguments3,Arguments4,Vars3,Vars2,FirstArgs1,FirstArgs2),!. + */ +checkarguments2(Arguments1,Arguments2,Vars1,Vars2,FirstArgs1,FirstArgs2) :- %% +%%writeln1(1), +%trace, + Arguments1=[Value|Arguments3], %% Value may be a number, string, list or tree + expressionnotatom3(Value), + Arguments2=[Variable2|Arguments4], + not(var(Variable2)),isvar(Variable2), + %((Value=[_,'_']->true;Variable2=[_,'_'])-> + %Vars1=Vars3; + %( + putvalue(Variable2,Value,Vars1,Vars3),%)), + checkarguments2(Arguments3,Arguments4,Vars3,Vars2,FirstArgs1,FirstArgs2),!. 
+ +checkarguments2(Arguments1,Arguments2,Vars1,Vars2,FirstArgs1,FirstArgs2) :- %%A +%%writeln1(2), + Arguments1=[Variable|Arguments3], %% Value may be a number, string, list or tree + not(var(Variable)),isvar(Variable), + Arguments2=[Value|Arguments4], + expressionnotatom3(Value), + putvalue(Variable,Value,Vars1,Vars3), + append(FirstArgs1,[[Variable,Value]],FirstArgs3), + checkarguments2(Arguments3,Arguments4,Vars3,Vars2,FirstArgs3,FirstArgs2),!. +checkarguments2(Arguments1,Arguments2,Vars1,Vars2,FirstArgs1,FirstArgs2) :- +%%writeln1(3), + Arguments1=[Variable1|Arguments3], + not(var(Variable1)),isvar(Variable1), + Arguments2=[Variable2|Arguments4], + not(var(Variable2)),isvar(Variable2), + %trace, + (getvalue(Variable2,Value,Vars1)),%%->true);Value=empty), + %%((Value=empty->Value1=Variable2;Value1=Value))), + putvalue(Variable2,Value,Vars1,Vars3), + append(FirstArgs1,[[Variable1,Variable2]],FirstArgs3), + checkarguments2(Arguments3,Arguments4,Vars3,Vars2,FirstArgs3,FirstArgs2),!. +checkarguments2(Arguments1,Arguments2,Vars1,Vars2,FirstArgs1,FirstArgs2) :- +%%writeln1(4), + Arguments1=[Value1|Arguments3], + expressionnotatom3(Value1), + Arguments2=[Value1|Arguments4], + expressionnotatom3(Value1), + checkarguments2(Arguments3,Arguments4,Vars1,Vars2,FirstArgs1,FirstArgs2),!. + + +%% checktypes([n,f],[1,"a",[n,a]],[[[n,f],[[t,number],[t,string],[t,predicatename]]]]). +%% checktypes([n,f],[1,1,1],[[[n,f],[[[t,list],[[t,number]]]]]]). +%% checktypes([n,f],[[1]],[[[n,f],[[[t,brackets],[[t,number]]]]]]). +%% checktypes([n,f],[1,"a",2,"b"],[[[n,f],[[[t,list],[[t,number],[t,string]]]]]]). +%% checktypes([n,f],[1,"a"],[[[n,f],[[t,a],[t,b]]],[[t,a],[[t,number]]],[[t,b],[[t,string]]]]). +%% Can write your own "any" type. 
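The commented checktypes examples above can be sketched in Python (a simplified illustration, not the interpreter's implementation: the [t,list] and [t,brackets] forms and mode handling are omitted, and the helper names are ours). User-defined types such as [[t,a],[[t,number]]] expand by lookup, exactly as the last example suggests:

```python
def check_types(values, type_spec, defs=None):
    """Check a flat value list against a type statement, in the
    spirit of the checktypes examples: [t,number], [t,string],
    [t,predicatename], [t,any], plus user-defined type names that
    expand through defs."""
    defs = defs or {}

    def check_one(v, t):
        tag = t[1] if isinstance(t, list) else t
        if tag == "number":
            return isinstance(v, (int, float)) and not isinstance(v, bool)
        if tag == "string":
            return isinstance(v, str)
        if tag == "predicatename":
            return isinstance(v, list) and len(v) == 2 and v[0] == "n"
        if tag == "any":
            return True
        if tag in defs:                 # user-defined type, e.g. [t,a]
            return any(check_one(v, t2) for t2 in defs[tag])
        return False

    return len(values) == len(type_spec) and all(
        check_one(v, t) for v, t in zip(values, type_spec))

# Mirrors: checktypes([n,f],[1,"a"],[[[n,f],[[t,a],[t,b]]],
#                     [[t,a],[[t,number]]],[[t,b],[[t,string]]]]).
print(check_types([1, "a"], [["t", "a"], ["t", "b"]],
                  {"a": [["t", "number"]], "b": [["t", "string"]]}))  # → True
```

As the last comment notes, an "any" type can equally be user-defined rather than built in.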
+ + +checktypes_inputs(Function,Vars1):-%%,TypeStatements1) :- +%%trace, +%%writeln(checktypes(Function,Vars1)), + get_lang_word("n",Dbw_n), + get_lang_word("query_box",Dbw_query_box), + + (((%trace, + types(on),Function=[Dbw_n,Dbw_query_box1], + not(string_concat(Dbw_query_box,_,Dbw_query_box1)))%,notrace + )->(typestatements(TypeStatements1), + modestatements(ModeStatements1), + checktypes0_inputs(Function,Vars1,TypeStatements1,ModeStatements1));true),!. +checktypes0_inputs(Function,Vars1,_TypeStatements1,_ModeStatements1) :- + length(Vars1,L),L is 0,Vars1=[], + get_lang_word("input type check",Input_type_check), + (types(on)->debug_types_call([Function,/,~,L,Input_type_check]);true), + + + (types(on)->debug_call(Skip,[Function,Vars1]);true), + + (types(on)->debug_exit(Skip,[Function,Vars1]);true), + (types(on)->(debug_types_exit([Function,/,~,L,Input_type_check]));true),!. +checktypes0_inputs(Function,Vars1,TypeStatements1,ModeStatements1) :- + length(Vars1,L), + get_lang_word("input type check",Input_type_check), + (types(on)->(debug_types_call([Function,/,~,L,Input_type_check]));true), + + + (member([Function|[TypeStatements2]],TypeStatements1), + member([Function|[ModeStatements2]],ModeStatements1), + extract_modes1(TypeStatements2,TypeStatements3,Vars1,Vars2,ModeStatements2), + (types(on)->debug_call(Skip,[Function,Vars2]);true), + ((checktypes1(Vars2,TypeStatements3,TypeStatements3,TypeStatements1))-> + ( + (types(on)->debug_exit(Skip,[Function,Vars2]);true), + (types(on)->(debug_types_exit([Function,/,~,L,Input_type_check]));true)) + +;( + (types(on)->debug_fail(Skip,[Function,Vars1]);true), + +(types(on)->(debug_types_fail([Function,/,~,L,Input_type_check]));true)))),!. + +extract_modes1(TypeStatements1,TypeStatements3,Vars1,Vars2,ModeStatements1) :- + %%TypeStatements1=[TypeStatements2|TypeStatements3], + %%trace, + %%writeln1([TypeStatements1,ModeStatements1]), + extract_modes2(TypeStatements1,[],TypeStatements3,Vars1,[],Vars2,ModeStatements1),!. 
+ %%TypeStatements3=[TypeStatements3a|TypeStatements3]. +extract_modes2([],TypeStatements2a,TypeStatements2a,[],Vars,Vars,[]) :- !. +%%extract_modes2(_,TypeStatements2a,TypeStatements2a,[],Vars,Vars,[]) :- !. +extract_modes2(TypeStatements1,TypeStatements2a,TypeStatements3,Vars1,Vars2,Vars3,ModeStatements1) :- + get_lang_word("input",Input), + ModeStatements1=[Input|ModeStatements3], + TypeStatements1=[TypeStatements2|TypeStatements3a], + Vars1=[Vars11|Vars12], + append(TypeStatements2a,[TypeStatements2],TypeStatements4), + append(Vars2,[Vars11],Vars4), + extract_modes2(TypeStatements3a,TypeStatements4,TypeStatements3,Vars12,Vars4,Vars3,ModeStatements3),!. +extract_modes2(TypeStatements1,TypeStatements2a,TypeStatements3,Vars1,Vars2,Vars3,ModeStatements1) :- + get_lang_word("output",Output), + ModeStatements1=[Output|ModeStatements3], + TypeStatements1=[_TypeStatements2|TypeStatements3a], + Vars1=[_Vars11|Vars12], + extract_modes2(TypeStatements3a,TypeStatements2a,TypeStatements3,Vars12,Vars2,Vars3,ModeStatements3),!. + + +checktypes(Function,Vars1):-%%,TypeStatements1) :- +%%writeln(checktypes(Function,Vars1)), + get_lang_word("n",Dbw_n), + get_lang_word("query_box",Dbw_query_box), + + ((types(on),Function=[Dbw_n,Dbw_query_box1], + not(string_concat(Dbw_query_box,_,Dbw_query_box1)))%,notrace + + ->(typestatements(TypeStatements1), + checktypes0(Function,Vars1,TypeStatements1));true),!. +checktypes0(Function,Vars1,_TypeStatements1) :- + get_lang_word("Type check",Type_check), + length(Vars1,L),L is 0,Vars1=[], + (types(on)->(debug_types_call([Function,/,L,Type_check]));true), + + + (types(on)->debug_call(Skip,[Function,Vars1]);true), + + (types(on)->debug_exit(Skip,[Function,Vars1]);true), + (types(on)->(debug_types_exit([Function,/,L,Type_check]));true),!. 
+ +checktypes0(Function,Vars1,TypeStatements1) :- + get_lang_word("Type check",Type_check), + length(Vars1,L), + (types(on)->(debug_types_call([Function,/,L,Type_check]));true), + + + (types(on)->debug_call(Skip,[Function,Vars1]);true), + ((member([Function|[TypeStatements2]],TypeStatements1), + checktypes1(Vars1,TypeStatements2,TypeStatements2,TypeStatements1))-> + ( + (types(on)->debug_exit(Skip,[Function,Vars1]);true), + (types(on)->(debug_types_exit([Function,/,L,Type_check]));true)) + +;( + (types(on)->debug_fail(Skip,[Function,Vars1]);true), + +(types(on)->(debug_types_fail([Function,/,L,Type_check]));true))),!. + +checktypes1(Vars1,TypeStatements0,TypeStatements1,TypeStatements4) :- + findall(-,member("|",Vars1),A),length(A,L),not(L>=2), +checktypes10(Vars1,TypeStatements0,TypeStatements1,TypeStatements4). + +checktypes10([],[],_,_) :- !. + + checktypes10(Vars1,TypeStatements1,TypeStatements2,TypeStatements4) :- +get_lang_word("t",T), +%trace, +get_lang_word("list",Dbw_list), +%%writeln(checktypes1(Vars1,TypeStatements1,TypeStatements2,TypeStatements4)), + Vars1=[Vars2|Vars3], + list1(Vars2,_,_), + TypeStatements1=[[[T,Dbw_list]|[TypeStatements3]]|TypeStatements4a], +(types(on)->(%TypeStatements3=[TypeStatements32], +simplify_types([[[T,Dbw_list]|[TypeStatements3]]],[],TypeStatements31), +debug_call(Skip,TypeStatements31));true), + + ((checktypes3(Vars2,TypeStatements3,TypeStatements2,TypeStatements4))-> + ((types(on)->(debug_exit(Skip,{Vars2}));true), + checktypes10(Vars3,TypeStatements4a,TypeStatements2,TypeStatements4)) +; (types(on)->(debug_fail(Skip,{Vars2}));true) +) +%%not(variable_name(Vars2)), + . 
%% ** in brac as well + +checktypes10(Vars1,TypeStatements1,TypeStatements2,TypeStatements4) :- + get_lang_word("t",T),get_lang_word("list",Dbw_list), +%%writeln(checktypes1(Vars1,TypeStatements1,TypeStatements2,TypeStatements4)), + %%Vars1=[Vars2|Vars3], + %%list(Vars1,_,_),%%length(Vars1,1), + TypeStatements1=[[[T,Dbw_list]|[TypeStatements3]]|_TypeStatements4a], +(types(on)->(%TypeStatements3=[TypeStatements32], +simplify_types([[[T,Dbw_list]|[TypeStatements3]]],[],TypeStatements31),debug_call(Skip,TypeStatements31));true), + + ((checktypes3(Vars1,TypeStatements3,TypeStatements2,TypeStatements4))-> + (types(on)->debug_exit(Skip,{Vars1});true) +; (types(on)->debug_fail(Skip,{Vars1});true)). + %%checktypes1(Vars3,TypeStatements4a,TypeStatements2,TypeStatements4). %% ** in brac as well + + +checktypes10(Vars1,TypeStatements1,TypeStatements2,TypeStatements4) :- + get_lang_word("t",T),get_lang_word("brackets",Dbw_brackets), + TypeStatements1=[[[T,Dbw_brackets]|[TypeStatements3]]|TypeStatements4a], +(types(on)->(%trace,%TypeStatements3=[TypeStatements32], +simplify_types([[[T,Dbw_brackets]|[TypeStatements3]]],[],TypeStatements31),debug_call(Skip,TypeStatements31));true), + (([Vars2|Vars3]=Vars1, + checktypes10(Vars2,TypeStatements3,TypeStatements2,TypeStatements4))-> + ((types(on)->debug_exit(Skip,Vars1);true), + checktypes10(Vars3,TypeStatements4a,TypeStatements2,TypeStatements4)) +; (types(on)->debug_fail(Skip,Vars1);true)) +%%not(variable_name(Vars2)), + ,!. %% ** in brac as well + +/**checktypes1(Vars1,TypeStatements0,TypeStatements1,TypeStatements4) :- + ((number(Vars1)->true);string(Vars1)->true;Vars1=[n,_]), + %%Vars1=[Vars2|Vars3], + %%TypeStatements0=[TypeStatements2|TypeStatements3], + checktypes2(Vars1,TypeStatements0,TypeStatements1,TypeStatements4). + %%checktypes1(Vars3,TypeStatements3,TypeStatements1,TypeStatements4). 
+ +**/ + +checktypes10(Vars1,TypeStatements0,TypeStatements1,TypeStatements4) :- + Vars1=Vars3, + TypeStatements0=["|",TypeStatements3], + %checktypes2(Vars2,TypeStatements2,TypeStatements1,TypeStatements4), + %%not(variable_name(Vars2)), + checktypes10([Vars3],[TypeStatements3],TypeStatements1,TypeStatements4). +checktypes10(Vars1,TypeStatements0,TypeStatements1,TypeStatements4) :- + Vars1=[Vars2|Vars3], + TypeStatements0=[TypeStatements2|TypeStatements3], + checktypes2(Vars2,TypeStatements2,TypeStatements1,TypeStatements4), + %%not(variable_name(Vars2)), + checktypes10(Vars3,TypeStatements3,TypeStatements1,TypeStatements4). + +checktypes2(Vars,TypeStatements1,_TypeStatements2,_C) :- + get_lang_word("t",T),get_lang_word("number",Dbw_number), + +%%writeln(checktypes2(Vars,TypeStatements1,_TypeStatements2,C)), +TypeStatements1=[T,Dbw_number], +(types(on)->debug_call(Skip,[[T,Dbw_number],Vars]);true), + ((number(Vars))-> + (types(on)->debug_exit(Skip,[[T,Dbw_number],Vars]);true) +; (types(on)->debug_fail(Skip,[[T,Dbw_number],Vars]);true)). +checktypes2(Vars,TypeStatements1,_TypeStatements2,_C) :- + get_lang_word("t",T),get_lang_word("atom",Dbw_atom), + +%%writeln(checktypes2(Vars,TypeStatements1,_TypeStatements2,C)), +TypeStatements1=[T,Dbw_atom], +(types(on)->debug_call(Skip,[[T,Dbw_atom],Vars]);true), + ((atom(Vars))-> + (types(on)->debug_exit(Skip,[[T,Dbw_atom],Vars]);true) +; (types(on)->debug_fail(Skip,[[T,Dbw_atom],Vars]);true)). +checktypes2(Vars,TypeStatements1,_TypeStatements2,_) :- + get_lang_word("t",T),get_lang_word("predicatename",Dbw_predicatename), + get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, + +TypeStatements1=[T,Dbw_predicatename], +(types(on)->debug_call(Skip,[[T,Dbw_predicatename],Vars]);true), + ((Vars=[Dbw_n,_])-> + (types(on)->debug_exit(Skip,[[T,Dbw_predicatename],Vars]);true) +; (types(on)->debug_fail(Skip,[[T,Dbw_predicatename],Vars]);true)). 
+ +checktypes2(Vars,TypeStatements1,_TypeStatements2,_) :- + get_lang_word("t",T),get_lang_word("string",Dbw_string), + +TypeStatements1=[T,Dbw_string], +(types(on)->debug_call(Skip,[[T,Dbw_string],Vars]);true), + ((string(Vars))-> + (types(on)->debug_exit(Skip,[[T,Dbw_string],Vars]);true) +; (types(on)->debug_fail(Skip,[[T,Dbw_string],Vars]);true)). + +checktypes2(Vars,TypeStatements1,_TypeStatements2,_) :- + get_lang_word("t",T),get_lang_word("any",Dbw_any), + +TypeStatements1=[T,Dbw_any], +(types(on)->debug_call(Skip,[[T,Dbw_any],Vars]);true), + ((true)-> + (types(on)->debug_exit(Skip,[[T,Dbw_any],Vars]);true) +; (types(on)->debug_fail(Skip,[[T,Dbw_any],Vars]);true)). + +checktypes2(Vars,TypeStatements1,TypeStatements2,TypeStatements4) :- + get_lang_word("t",T), + get_lang_word("list",Dbw_list), + get_lang_word("brackets",Dbw_brackets), + get_lang_word("number",Dbw_number), + get_lang_word("predicatename",Dbw_predicatename), + get_lang_word("string",Dbw_string), + get_lang_word("any",Dbw_any), + +TypeStatements1=[T,Type],(not(Type=Dbw_list),not(Type=Dbw_brackets),not(Type=Dbw_number),not(Type=Dbw_predicatename),not(Type=Dbw_string),not(Type=Dbw_any)), +(types(on)->debug_call(Skip,[[T,Type],Vars]);true), + (( + %%not(variable_name(Vars)), + member([[T,Type]|[TypeStatements3]],TypeStatements4), + (checktypes10(Vars,TypeStatements3,TypeStatements2,TypeStatements4)->true; + checktypes10([Vars],TypeStatements3,TypeStatements2,TypeStatements4)))-> + (types(on)->debug_exit(Skip,[[T,Type],Vars]);true) +; (types(on)->debug_fail(Skip,[[T,Type],Vars]);true)). + +/** +checktypes2(Vars,TypeStatements1,TypeStatements2,TypeStatements4) :- + TypeStatements1=[t,Type], + member([[t,Type]|[TypeStatements3]],TypeStatements4), + checktypes1([Vars],TypeStatements3,TypeStatements2,TypeStatements4). +**/ +checktypes3([],_,_TypeStatements2,_) :- !. 
+checktypes3(Vars,TypeStatements3,TypeStatements2,TypeStatements6) :- +%%writeln(checktypes3(Vars,TypeStatements3,TypeStatements2,TypeStatements6)), +%delete(TypeStatements3,"|",TypeStatements31), +%trace, + %findall(L0,length1(TypeStatements3,TypeStatements6,[],_,0,L0),L01), + %sort(L01,L02), + %member(L,L02), + length(TypeStatements3,L), + length(L1,L), + append(L1,L2,Vars), + %%[L10]=L1, + %%TypeStatements3=[TypeStatements4|TypeStatements5], + %%findall(L10,(member(L10,L1),checktypes2(L10,TypeStatements4,TypeStatements2,TypeStatements6)),B), + checktypes10(L1,TypeStatements3,TypeStatements2,TypeStatements6), + checktypes3(L2,TypeStatements3,TypeStatements2,TypeStatements6). + +/* +length1([],_,T,T,L,L). + +length1(TypeStatements3,TypeStatements2,T1,T2,L1,L2) :- + TypeStatements3=["|",B], + length1(B,TypeStatements2,T1,T2,L1,L2). + +length1(TypeStatements3,TypeStatements2,T1,T2,L1,L2) :- + TypeStatements3=[TypeStatements1|B], + L3 is L1+1, + (undefined_type([T,Type])-> + (not(member([T,Type],T1)),append(T1,[[T,Type]],T3), + member([TypeStatements1|[TypeStatements4]],TypeStatements2), +length1(TypeStatements4,TypeStatements2,T3,T2,L3,L2)); +length1(B,TypeStatements2,T1,T2,L3,L2)). + +undefined_type([T,Type]) :- + get_lang_word("t",T), + get_lang_word("list",Dbw_list), + get_lang_word("brackets",Dbw_brackets), + get_lang_word("number",Dbw_number), + get_lang_word("predicatename",Dbw_predicatename), + get_lang_word("string",Dbw_string), + get_lang_word("any",Dbw_any), + +TypeStatements1=[T,Type], +not(Type=Dbw_list),not(Type=Dbw_brackets),not(Type=Dbw_number),not(Type=Dbw_predicatename),not(Type=Dbw_string),not(Type=Dbw_any). + +*/ + + +interpretbody(_Functions1,_Functions2,Vars,Vars,[],true) :- true.%%!. 
+ + + +/** +interpretbody(Functions0,Functions,Vars1,Vars2,Body,Result1) :- + Body=[[[n,not],Statement] + ], + + writeln1(interpretbody(Functions0,Functions,Vars1,Vars3,Statement,Result2)), + not(interpretbody(Functions0,Functions,Vars1,Vars3,Statement,Result2)), %% 2->1 + ((Result2=cut)->!;true). +**/ + +/** *** may need to uncomment +interpretbody(Functions0,Functions,Vars1,Vars2,Body,Result1) :- + Body=[Statements1|Statements2],not(Statements1=[[n,not],_]),not(predicate_or_rule_name(Statements1)), + interpretbody(Functions0,Functions,Vars1,Vars3,Statements1,Result2), + interpretbody(Functions0,Functions,Vars3,Vars2,Statements2,Result3), + %%((Result3=cut)->!;true), + logicalconjunction(Result1,Result2,Result3),!. +**/ + +interpretbody(Functions0,Functions,Vars1,Vars2,Body,Result1) :- + Body=[[Statements1|Statements1a]|Statements2 + ], + + not(predicate_or_rule_name(Statements1)), +%%writeln1(interpretbody(Functions0,Functions,Vars1,Vars3,[Statement],Result2)), + interpretbody(Functions0,Functions,Vars1,Vars3,[Statements1],Result2), %% 2->1 + %%((Result2=cut)->!;true), + + interpretbody(Functions0,Functions,Vars3,Vars4,Statements1a,Result22), %% 2->1 + %%((Result22=cut)->!;true), + interpretbody(Functions0,Functions,Vars4,Vars2,Statements2,Result3), + %%((Result3=cut)->!;true), + %%() logicalnot(Result2,Result4), +logicalconjunction(Result1a,Result2,Result22), +logicalconjunction(Result1,Result1a,Result3), + true.%%!. 
+ + + + +interpretbody(Functions0,Functions,Vars1,Vars2,Body,Result1) :- + get_lang_word("n",Dbw_n),get_lang_word("not",Dbw_not), + Body=[[[Dbw_n,Dbw_not],[Statement]]|Statements2 + ], + +debug_call(Skip,[[Dbw_n,Dbw_not]]), + ( (not(interpretbody(Functions0,Functions,Vars1,_Vars3,[Statement],Result22)))-> %% 2->1 + %%((Result22=cut)->!;true)),%%-> +debug_exit(Skip,[[Dbw_n,Dbw_not]]) +; debug_fail(Skip,[[Dbw_n,Dbw_not]])), + %%writeln1(interpretbody(Functions0,Functions,Vars1,Vars3,[Statement],Result2)), + + interpretbody(Functions0,Functions,Vars1,Vars2,Statements2,Result32), + %%((Result32=cut)->!;true), + logicalnot(Result1a,Result22), +logicalconjunction(Result1,Result1a,Result32), + true.%%!. + + + + +interpretbody(Functions0,Functions,Vars1,Vars2,Body,Result0) :- + get_lang_word("n",Dbw_n),get_lang_word("or",Dbw_or), + + Body=[[[Dbw_n,Dbw_or],[Statements1,Statements2]]|Statements3], + (interpretbody(Functions0,Functions,Vars1,Vars3,[Statements1],Result1) + %%((Result1=cut)->!;true)); %% *** changed from 1 to Result2 + %%,((Value1=cut)->!;true)) + ; + interpretbody(Functions0,Functions,Vars1,Vars3,[Statements2],Result2)),%%!. *** changed from 1 to Result2 + %%((Result2=cut)->!;true), + + interpretbody(Functions0,Functions,Vars3,Vars2,Statements3,Result3), + %%((Result3=cut)->!;true), + logicaldisjunction(Result1a,Result1,Result2), + logicalconjunction(Result0,Result1a,Result3), + true.%%!. + + + %%,((Value=cut)->!;true)). + %%(logicaldisjunction(Result1,Value1,Value2)->true;(Result1=false)). 
+ + +interpretbody(Functions0,Functions,Vars1,Vars2,Body,Result1) :- + get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, + + + Body=[[[Dbw_n,"->"],[Statements1,Statements2]]|Statements3], + (interpretbody(Functions0,Functions,Vars1,Vars3,[Statements1],Result2) + %%((Result2=cut)->!;true)) +-> + interpretbody(Functions0,Functions,Vars3,Vars4,[Statements2],Result22)), + %%((Result22=cut)->!;true))), + + interpretbody(Functions0,Functions,Vars4,Vars2,Statements3,Result3), + %%((Result3=cut)->!;true), + logicalconjunction(Result1a,Result2,Result22), + logicalconjunction(Result1,Result1a,Result3), + + true.%%!. + + + + +interpretbody(Functions0,Functions,Vars1,Vars2,Body,Result1) :- + get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, + + Body=[[[Dbw_n,"->"],[Statements1,Statements2,Statements2a]]|Statements3], + ((interpretbody(Functions0,Functions,Vars1,Vars3,[Statements1],Result2) + %%((Result2=cut)->!;true)) + -> + interpretbody(Functions0,Functions,Vars3,Vars4,[Statements2],Result22) + %%((Result22=cut)->!;true)) + ; + interpretbody(Functions0,Functions,Vars1,Vars4,[Statements2a],Result23))), + %%((Result23=cut)->!;true))), + + interpretbody(Functions0,Functions,Vars4,Vars2,Statements3,Result3), + + logicalconjunction(Result1a,Result2,Result22), + logicaldisjunction(Result1b,Result1a,Result23), + logicalconjunction(Result1,Result1b,Result3), + + %%((Result3=cut)->!;true), + true.%%!. 
+
+
+interpretbody(Functions0,Functions,Vars1,Vars2,Body,Result1) :-
+%writeln1(interpretbody(Functions0,Functions,Vars1,Vars2,Body,Result1)),
+%trace,
+ Body=[Statement|Statements],
+%%writeln1(["Functions0",Functions0,"Functions",Functions,"Statement",Statement,"Vars1",Vars1,"Vars3",Vars3,"Result2",Result2,"Cut",Cut]),
+ not(predicate_or_rule_name(Statement)),
+%trace,
+ interpretstatement1(_,Functions0,Functions,Statement,Vars1,Vars3,Result2,Cut),
+%%writeln1(["here1"]),
+%trace,
+ ((not(Cut=cut))->(Functions2=Functions);(%%trace,
+ !,turncut(on))
+ ), %% cut to interpret1/2 (assertz)
+%%writeln1(["here3"]),
+ interpretbody(Functions0,Functions2,Vars3,Vars2,Statements,Result3),
+ %%((Result3=cut)->!;true),
+%%writeln1(["here4"]),
+ logicalconjunction(Result1,Result2,Result3) ,true.%%,!.
+%%writeln1([Result1,Result2,Result3]).
+turncut(State1) :-
+ cut(State2),
+ retract(cut(State2)),
+ assertz(cut(State1)).
+turndebug(State1) :-
+ debug(State2),
+ retract(debug(State2)),
+ assertz(debug(State1)).
+turnequals4(State1) :-
+ (not(equals4(_Equals4))->(retractall(equals4(_)),assertz(equals4(on)));true),
+ equals4(State2),
+ retract(equals4(State2)),
+ assertz(equals4(State1)).
+turn_save_debug(State1) :-
+ (not(save_debug(_))->(retractall(save_debug(_)),assertz(save_debug(off)));true),
+ %save_debug(State2),
+ retractall(save_debug(_)),
+ assertz(save_debug(State1)),
+!.
+do_saved_debug(State1) :-
+ (not(saved_debug(_))->(retractall(saved_debug(_)),assertz(saved_debug([])));true),
+ %saved_debug(State2),
+ retractall(saved_debug(_)),
+ assertz(saved_debug(State1)),
+ !.
+logicaldisjunction(true,true,true) :- !.
+logicaldisjunction(true,false,true) :- !.
+logicaldisjunction(true,true,false) :- !.
+logicaldisjunction(false,false,false) :- !.
+logicalconjunction(true,true,true) :- !.
+logicalconjunction(false,false,true) :- !.
+logicalconjunction(false,true,false) :- !.
+logicalconjunction(false,false,false) :- !.
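+%% The helpers above combine statement results, with the first argument as the
+%% output; for example:
+%% ?- logicalconjunction(R,true,false).  %% R=false.
+%% ?- logicaldisjunction(R,false,true).  %% R=true.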
+logicalnot(Result1,Result2) :- + true(Result1),false(Result2). +logicalnot(Result1,Result2) :- + false(Result1),true(Result2). +true(true). +false(false). + +%%interpretstatement1(ssi,_F0,[],_,Vars,Vars,true,nocut) :- ! +%%writeln1("AND HERE!") +%% . + +interpretstatement1(ssi,_F0,_Functions,[[Dbw_n,Dbw_equals4_on]|_],Vars,Vars,true,nocut) :- %writeln(here), +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("equals4_on",Dbw_equals4_on1),Dbw_equals4_on1=Dbw_equals4_on, +turnequals4(on), +!. + +interpretstatement1(ssi,_F0,_Functions,[[Dbw_n,Dbw_equals4_off]|_],Vars,Vars,true,nocut) :- %writeln(here), +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("equals4_off",Dbw_equals4_off1),Dbw_equals4_off1=Dbw_equals4_off, +turnequals4(off), +!. + +interpretstatement1(ssi,_F0,_Functions,[[n,trace2]|_],Vars,Vars,true,nocut) :- %writeln(here), +trace,!. + +interpretstatement1(ssi,_F0,_Functions,[[Dbw_n,Dbw_trace]|_],Vars,Vars,true,nocut) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("trace",Dbw_trace1),Dbw_trace1=Dbw_trace, +turndebug(on), +!. + +interpretstatement1(ssi,_F0,_Functions,[[Dbw_n,Dbw_notrace]|_],Vars,Vars,true,nocut) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("notrace",Dbw_notrace1),Dbw_notrace1=Dbw_notrace, +turndebug(off), +!. + +interpretstatement1(ssi,_F0,_Functions,[[Dbw_n,Dbw_cut]|_],Vars,Vars,true,cut) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("cut",Dbw_cut1),Dbw_cut1=Dbw_cut,!. + +interpretstatement1(ssi,_F0,_Functions,[[Dbw_n,Dbw_true]|_],Vars,Vars,_,nocut) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("true",Dbw_true1),Dbw_true1=Dbw_true,!. +interpretstatement1(ssi,_F0,_Functions,[[Dbw_n,Dbw_fail]|_],Vars,Vars,_,nocut) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("fail",Dbw_fail1),Dbw_fail1=Dbw_fail, +fail. 
+ +/** +interpretstatement1(ssi,Functions0,Functions,[[n,not],[Statements]],Vars1,Vars2,Result,nocut) :- + not(interpretbody(Functions0,Functions,Vars1,Vars2, + Statements,Result)). + +interpretstatement1(ssi,Functions0,Functions,[[n,or],[Statement1,Statement2]],Vars1,Vars2,Result,nocut) :- + (interpretbody(Functions0,Functions,Vars1,Vars2, + Statement1,Result1); + interpretbody(Functions0,Functions,Vars1,Vars2, + Statement2,Result2)). +**/ + +interpretstatement1(ssi,_F0,_Functions,[[Dbw_n,Dbw_comment],[_Variable]],Vars,Vars,true,nocut) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("comment",Dbw_comment1),Dbw_comment1=Dbw_comment,!. + + +interpretstatement1(ssi,_F0,_Functions,[[Dbw_n,Dbw_atom],[Variable]],Vars,Vars,true,nocut) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("atom",Dbw_atom1),Dbw_atom1=Dbw_atom, + + getvalue(Variable,Value,Vars), +debug_call(Skip,[[Dbw_n,Dbw_atom],[Value]]), + (atom(Value)-> +debug_exit(Skip,[[Dbw_n,Dbw_atom],[Value]]) +; debug_fail(Skip,[[Dbw_n,Dbw_atom],[Value]])),!. + +interpretstatement1(ssi,_F0,_Functions,[[Dbw_n,Dbw_string],[Variable]],Vars,Vars,true,nocut) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("string",Dbw_string1),Dbw_string=Dbw_string1, + + getvalue(Variable,Value,Vars), +debug_call(Skip,[[Dbw_n,Dbw_string],[Value]]), + (string(Value)-> +debug_exit(Skip,[[Dbw_n,Dbw_string],[Value]]) +; debug_fail(Skip,[[Dbw_n,Dbw_string],[Value]])),!. + +interpretstatement1(ssi,_F0,_Functions,[[Dbw_n,Dbw_number],[Variable]],Vars,Vars,true,nocut) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("number",Dbw_number1),Dbw_number1=Dbw_number, + getvalue(Variable,Value,Vars), +debug_call(Skip,[[Dbw_n,Dbw_number],[Value]]), + (number(Value)-> +debug_exit(Skip,[[Dbw_n,Dbw_number],[Value]]) +; debug_fail(Skip,[[Dbw_n,Dbw_number],[Value]])),!. 
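+%% In List Prolog source, the type tests above take a single variable, e.g.
+%% [[n,atom],[[v,a]]], [[n,string],[[v,a]]] or [[n,number],[[v,a]]], succeeding
+%% when the value bound to [v,a] is of that type.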
+ +interpretstatement1(ssi,_F0,_Functions,[[Dbw_n,Dbw_letters],[Variable]],Vars,Vars,true,nocut) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("letters",Dbw_letters1),Dbw_letters1=Dbw_letters, + + getvalue(Variable,Value,Vars), +debug_call(Skip,[[Dbw_n,Dbw_letters],[Value]]), + ((string_codes(Value,Value1), + phrase(word1(Value1),_))-> +debug_exit(Skip,[[Dbw_n,Dbw_letters],[Value]]) +; debug_fail(Skip,[[Dbw_n,Dbw_letters],[Value]])),!. + +interpretstatement1(ssi,_F0,_Functions,[[Dbw_n,Dbw_variable],[Variable]],Vars,Vars,true,nocut) :- +%trace, +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("variable",Dbw_variable1), +get_lang_word("var",Dbw_variable2) +,(Dbw_variable1=Dbw_variable->true; +Dbw_variable2=Dbw_variable), + +debug_call(Skip,[[Dbw_n,Dbw_variable],[Variable]]), + getvalue(Variable,Value,Vars), + (isvar(Value)-> +debug_exit(Skip,[[Dbw_n,Dbw_variable],[Variable]]) +; debug_fail(Skip,[[Dbw_n,Dbw_variable],[Variable]])),!. + +/**interpretstatement1(ssi,_F0,_Functions,[[n,Operator],[Variable1]],Vars1,Vars2,true,nocut) :- + isop(Operator), + interpretpart(is,Variable1,Vars1,Vars2),!. +**/ + + +interpretstatement1(ssi,_F0,_Functions,[[Dbw_n,Operator],[Variable1,Variable2]],Vars1,Vars2,true,nocut) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, + isop(Operator), + %trace, + interpretpart(is,Variable1,Variable2,Vars1,Vars2),!. + +/** +interpretstatement1(ssi,_F0,_Functions,[[n,Operator],[Variable1,Variable2]],Vars1,Vars2,true,nocut) :- +%%writeln1(31), + isop(Operator), + interpretpart(is,Variable2,Variable1,Vars1,Vars2). +**/ + +interpretstatement1(ssi,_F0,_Functions,[[Dbw_n,Operator],[Variable2,Variable3,Variable1]],Vars1,Vars2,true,nocut) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, + operator(Operator), +%%writeln1(4), + %trace, +%trace, + interpretpart(isop,Operator,Variable1,Variable2,Variable3,Vars1,Vars2),!. 
+ +interpretstatement1(ssi,_F0,_Functions,[[Dbw_n,Operator],[Variable1,Variable2]],Vars1,Vars2,true,nocut) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, + comparisonoperator(Operator), +%%writeln1(4), + interpretpart(iscomparison,Operator,Variable1,Variable2,Vars1,Vars2),!. + +%%interpretstatement1(ssi,_F0,_Functions,[Variable2+Variable3,is,Variable1],Vars1,Vars2,true,nocut) :- +%%writeln1(41), + %%interpretpart(isplus,Variable1,Variable2,Variable3,Vars1,Vars2). + +/**interpretstatement1(ssi,_F0,_Functions,[[n,=],[Variable1,Variable2]],Vars1,Vars2,true,nocut) :- +%%writeln1(5), + interpretpart(is,Variable1,Variable2,Vars1,Vars2). +**/ + +interpretstatement1(ssi,_F0,_Functions,[[Dbw_n,Dbw_equals1],[Variable1,[Variable2,Variable3]]],Vars1,Vars2,true,nocut) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("equals1",Dbw_equals11),Dbw_equals11=Dbw_equals1, + +%%writeln1(5), + interpretpart(match1,Variable1,Variable2,Variable3,Vars1,Vars2). + +interpretstatement1(ssi,_F0,_Functions,[[Dbw_n,Dbw_equals2],[Variable1,[Variable2,Variable3]]],Vars1,Vars2,true,nocut) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("equals2",Dbw_equals21),Dbw_equals21=Dbw_equals2, + +%%writeln1(5), + interpretpart(match2,Variable1,Variable2,Variable3,Vars1,Vars2). + +interpretstatement1(ssi,_F0,_Functions,[[Dbw_n,Dbw_equals3],[Variable1,Variable2]],Vars1,Vars2,true,nocut) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("equals3",Dbw_equals31),Dbw_equals31=Dbw_equals3, +%%writeln1(5), + interpretpart(match3,Variable1,Variable2,Vars1,Vars2). 
+ + +interpretstatement1(ssi,_F0,_Functions,[[Dbw_n,Dbw_equals4],[Variable1,Variable2]],Vars1,Vars2,true,nocut) :- +%trace, +%writeln1(interpretstatement1(ssi,_F0,_Functions,[[Dbw_n,Dbw_equals4],[Variable1,Variable2]],Vars1,Vars2,true,nocut)), +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("equals4",Dbw_equals41),Dbw_equals41=Dbw_equals4, +get_lang_word("v",Dbw_v), +get_lang_word("sys1",Dbw_sys1), +%%writeln1(5), +%trace, + remember_and_turn_off_debug(Debug), + %trace, + (interpretpart(match4,Variable1,Variable2,Vars1,Vars5,_)->true;(turn_back_debug(Debug), + %fail + interpretpart(match4,Variable1,Variable2,Vars1,_Vars2,_))), + + interpretpart(match4,Variable1,[Dbw_v,Dbw_sys1],Vars5,Vars4,_), + + getvalue([Dbw_v,Dbw_sys1],Value3,Vars4), + + turn_back_debug(Debug), + + + interpretpart(match4,Variable1,Variable2,Vars1,Vars2,Value3),!. + +%%interpretstatement1(ssi,_F0,_Functions,[[Variable2,Variable3]=Variable1],Vars1,Vars2,true,nocut) :- +%%writeln1(51), +%% interpretpart(match,Variable1,Variable2,Variable3,Vars1,Vars2). + +interpretstatement1(ssi,_F0,_Functions,[[Dbw_n,Dbw_wrap],[Variable1,Variable2]],Vars1,Vars2,true,nocut) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("wrap",Dbw_wrap1),Dbw_wrap1=Dbw_wrap, +%%writeln1(52), wrap +%%writeln([[n,wrap],[Variable1,Variable2]]), + interpretpart(bracket1,Variable1,Variable2,Vars1,Vars2). + +interpretstatement1(ssi,_F0,_Functions,[[Dbw_n,Dbw_unwrap],[Variable1,Variable2]],Vars1,Vars2,true,nocut) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("unwrap",Dbw_unwrap1),Dbw_unwrap1=Dbw_unwrap, +%%writeln1(53), unwrap + interpretpart(bracket2,Variable1,Variable2,Vars1,Vars2). + +interpretstatement1(ssi,_F0,_Functions,[[Dbw_n,Dbw_head],[Variable1,Variable2]],Vars1,Vars2,true,nocut) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("head",Dbw_head1),Dbw_head1=Dbw_head, + +%%writeln1(6), + interpretpart(head,Variable1,Variable2,Vars1,Vars2). 
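+%% Illustrative forms only (untested): [[n,wrap],[[v,a],[v,b]]] puts the value
+%% of [v,a] inside a list, [[n,unwrap],[[v,a],[v,b]]] takes it back out, and
+%% [[n,head],[[v,a],[v,b]]] binds [v,b] to the first item of the list in [v,a].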
+ +interpretstatement1(ssi,_F0,_Functions,[[Dbw_n,Dbw_tail],[Variable1,Variable2]],Vars1,Vars2,true,nocut) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("tail",Dbw_tail1),Dbw_tail1=Dbw_tail, +%%writeln1(61), + interpretpart(tail,Variable1,Variable2,Vars1,Vars2). + +/* +interpretstatement1(non-ssi,_F0,_Functions,[[Dbw_n,Dbw_member],[Variable1,Variable2]],Vars1,Vars2,true,nocut) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("member",Dbw_member1),Dbw_member1=Dbw_member, +%%writeln1(8), + interpretpart(member2,Variable1,Variable2,Vars1,Vars2). +*/ +interpretstatement1(non-ssi,_F0,_Functions,[[Dbw_n,Dbw_member2],[Variable2,Variable1]],Vars1,Vars2,true,nocut) :- +%writeln(here), +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +%trace, +get_lang_word("member2",Dbw_member21), +get_lang_word("member",Dbw_member22), +(Dbw_member21=Dbw_member2-> + interpretpart(member2,Variable2,Variable1,Vars1,Vars2); +(Dbw_member22=Dbw_member2, +%%writeln1(8), + + interpretpart(member2,Variable1,Variable2,Vars1,Vars2))). + +interpretstatement1(non-ssi,_F0,_Functions,[[Dbw_n,Dbw_member2],[Variable1,Variable2]],Vars1,Vars2,true,nocut) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +%trace, +get_lang_word("member3",Dbw_member21),Dbw_member21=Dbw_member2, +%%writeln1(8), + interpretpart(member3,Variable1,Variable2,Vars1,Vars2). + +interpretstatement1(ssi,_F0,_Functions,[[Dbw_n,Dbw_delete],[Variable1,Variable2,Variable3]],Vars1,Vars2,true,nocut) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("delete",Dbw_delete1),Dbw_delete1=Dbw_delete, +%%writeln1(), + interpretpart(delete,Variable1,Variable2,Variable3,Vars1,Vars2). 
+%%** all in form f,[1,1,etc], including + with 0,1 + +interpretstatement1(non-ssi,_F0,_Functions,[[Dbw_n,Dbw_append],[Variable1,Variable2,Variable3]],Vars1,Vars2,true,nocut) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("append",Dbw_append1),Dbw_append1=Dbw_append, +%%writeln1(9), + %trace, + interpretpart(append,Variable1,Variable2,Variable3,Vars1,Vars2). + +/* +interpretstatement1(non-ssi,_F0,_Functions,[[Dbw_n,Dbw_stringconcat],[Variable1,Variable2,Variable3]],Vars1,Vars2,true,nocut) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("string_concat",Dbw_stringconcat1),Dbw_stringconcat1=Dbw_stringconcat, + interpretpart(stringconcat,Variable1,Variable2,Variable3,Vars1,Vars2). + */ + +interpretstatement1(non-ssi,_F0,_Functions,[[Dbw_n,Dbw_stringconcat],[Variable1,Variable2,Variable3]],Vars1,Vars2,true,nocut) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("stringconcat",Dbw_stringconcat1), +get_lang_word("string_concat",Dbw_stringconcat2), +(Dbw_stringconcat1=Dbw_stringconcat->true; +Dbw_stringconcat2=Dbw_stringconcat), + interpretpart(stringconcat,Variable1,Variable2,Variable3,Vars1,Vars2). + +/** +interpretstatement1(ssi,_F0,_Functions,[[Dbw_n,Dbw_stringconcat_1],[Variable1,Variable2,Variable3]],Vars1,Vars2,true,nocut) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("stringconcat1",Dbw_stringconcat1),Dbw_stringconcat1=Dbw_stringconcat_1, + + interpretpart(stringconcat1,Variable1,Variable2,Variable3,Vars1,Vars2). 
+**/ + +/**interpretstatement1(ssi,_F0,_Functions,[[n,grammar_part]|Variables1],Vars1,Vars2,true,nocut) :- +%%writeln1(x9), + [Variables2]=Variables1, + interpretpart(grammar_part,Variables2,Vars1,Vars2),!.**/ + +interpretstatement1(ssi,_F0,_Functions,[[Dbw_n,Dbw_stringtonumber],[Variable1,Variable2]],Vars1,Vars2,true,nocut) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("stringtonumber",Dbw_stringtonumber1),Dbw_stringtonumber1=Dbw_stringtonumber, + interpretpart(stringtonumber,Variable1,Variable2,Vars1,Vars2). + +interpretstatement1(ssi,_F0,_Functions,[[Dbw_n,Dbw_random],[Variable1]],Vars1,Vars2,true,nocut) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("random",Dbw_random1),Dbw_random1=Dbw_random, + interpretpart(random,Variable1,Vars1,Vars2). + +interpretstatement1(ssi,_F0,_Functions,[[Dbw_n,Dbw_length],[Variable1,Variable2]],Vars1,Vars2,true,nocut) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("length",Dbw_length1),Dbw_length1=Dbw_length, + interpretpart(length,Variable1,Variable2,Vars1,Vars2). + +interpretstatement1(ssi,_F0,_Functions,[[Dbw_n,Dbw_ceiling],[Variable1,Variable2]],Vars1,Vars2,true,nocut) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("ceiling",Dbw_ceiling1),Dbw_ceiling1=Dbw_ceiling, + interpretpart(ceiling,Variable1,Variable2,Vars1,Vars2). + +interpretstatement1(ssi,_F0,_Functions,[[Dbw_n,Dbw_date],[Year,Month,Day,Hour,Minute,Seconds]],Vars1,Vars2,true,nocut) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("date",Dbw_date1),Dbw_date1=Dbw_date, + interpretpart(date,Year,Month,Day,Hour,Minute,Seconds,Vars1,Vars2). + +interpretstatement1(ssi,_F0,_Functions,[[Dbw_n,Dbw_round],[N1,N2]],Vars1,Vars2,true,nocut) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("round",Dbw_round1),Dbw_round1=Dbw_round, + interpretpart(round,N1,N2,Vars1,Vars2). 
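+%% Each built-in above is dispatched to interpretpart with the command name
+%% stripped; e.g. (illustrative, untested) [[n,length],[[v,l],[v,n]]] binds
+%% [v,n] to the length of the list held in [v,l].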
+ +/*** +interpretstatement1(ssi,Functions0,_Functions,Query1,Vars1,Vars8,true,nocut) :- +%%writeln1("h1/10"), + Query1=[[n,grammar]|Arguments], + ((Arguments=[[Grammar1,Phrase1,RuleName|Variables2]], + %%[Variables3]=Variables2, + name(RuleName), + convert_to_grammar_part1(Grammar1,[],Grammar2))->true; + (Grammar2=Functions0, + ((Arguments=[[Phrase1,RuleName|Variables2]] + %%([Variables3]=Variables2->true;(Variables2=[],Variables3=[])) + )))), + +%%writeln1(["Arguments",Arguments,"Vars1",Vars1]), + +%%substitutevarsA1(Phrase,Vars1,[],Vars3,[],FirstArgs1), + +%%Vars3=[[[v,PhraseVarName],PhraseValue]], +%%Vars4=[[[v,vgp1],PhraseValue]], + + append([Phrase1],Variables2,Variables4), %% *** Should V3 be in [] v + +substitutevarsA1(Variables4,Vars1,[],Vars2,[],FirstArgs), %%% var to value, after updatevars: more vars to values, and select argument vars from latest vars +%%writeln1([substitutevarsA1,arguments,Arguments,vars1,Vars1,vars3,Vars3,firstargs,FirstArgs]), + +Vars2=[Phrase2|Vars4], +((Phrase2=[]->true;Phrase2=[_A|_B])->End=[];End=""), + (not(Vars4=[])->append([RuleName,Phrase2,End],Vars4,Vars5); + (Vars5=[RuleName,Phrase2,End])), + Query2=[[n,grammar_part],Vars5], + ((((terminal(RuleName), + + + (not(Vars4=[])->append([Phrase2,RuleName],Vars4,Vars52); + (Vars52=[Phrase2,RuleName])), + + (debug(on)->(writeln1([call,[[n,grammar],Vars52],"Press c."]),(leash1(on)->true;(not(get_single_char(97))->true;abort)));true), + + interpretpart(grammar_part,Vars5,[],Result1), + + updatevars2(FirstArgs,Result1,[],Vars51), + updatevars3(Vars2,Vars51,Vars6), + reverse(Vars6,[],Vars7), + ((not(Vars7=[])-> + %%Vars7=[Var71|Vars72], + unique1(Vars7,[],Vars8) +)->true;( +%%writeln1(here1), + Vars8=[]),strip(Vars8,[],Result2))->true)), + + (debug(on)->(writeln1([exit,[[n,grammar],Result2],"Press c."]),(leash1(on)->true;(not(get_single_char(97))->true;abort)));true) + +)->true;(not(terminal(RuleName)), + %% Bodyvars2? 
+ + (not(Vars4=[])->append([Phrase2,RuleName],Vars4,Vars52); + (Vars52=[Phrase2,RuleName])), + +(debug(on)->(writeln1([call,[[n,grammar],Vars52],"Press c."]),(leash1(on)->true;(not(get_single_char(97))->true;abort)));true), + +%% debug(on)->writeln1([call,[Function,[Vars3]]]), +%%writeln1(["Query2",Query2,"Functions0",Functions0]), + interpret2(Query2,Grammar2,Grammar2,Result1), + (debug(on)->(writeln1([exit,[[n,grammar],Vars52],"Press c."]),(leash1(on)->true;(not(get_single_char(97))->true;abort)));true), + + updatevars2(FirstArgs,Result1,[],Vars51), + updatevars3(Vars2,Vars51,Vars6), + reverse(Vars6,[],Vars7), + ((not(Vars7=[])-> + %%Vars7=[Var71|Vars72], + unique1(Vars7,[],Vars8) +);( +%%writeln1(here1), + Vars8=[])))),!. +***/ + +interpretstatement1(ssi,_Grammar,_Grammar2,[[Dbw_n,grammar_part],[Variable1,Variable2,Variable3]],Vars1,Vars2,true,nocut) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +%get_lang_word("grammar_part",Dbw_grammar_part1),Dbw_grammar_part1=Dbw_grammar_part, + + +%%writeln1("h1/10"), +%%trace,%%%%**** + interpretpart(grammar_part,[Variable1,Variable2,Variable3],Vars1,Vars2). 
+ +interpretstatement1(non-ssi,Functions0,Functions,[[Dbw_n,Dbw_findall],[Variable1,Body,Variable3]],Vars1,Vars2,true,nocut) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("findall",Dbw_findall1),Dbw_findall1=Dbw_findall, +get_lang_word("v",Dbw_v), + +%trace, + +%writeln1(interpretstatement1(non-ssi,Functions0,Functions,[[Dbw_n,Dbw_findall],[Variable1,Body,Variable3]],Vars1,Vars2,true,nocut)), + +%%writeln1(interpretstatement1(ssi,Functions0,Functions,[[n,findall],[Variable1,Body,Variable3]],Vars1,Vars2,true,nocut)), +%%writeln1("h1/10"), +%%trace,%%%%**** +%% +%trace, + debug_call(Skip,[[Dbw_n,Dbw_findall],[Variable1,Body,Variable3]]), +(( + findall(Value3,( + %%trace, + %%writeln1( interpretbody(Functions0,Functions,Vars1,Vars3,[Body],Result2)), + +%writeln1( interpretbody(Functions0,Functions,Vars1,Vars3,[Body],_Result2)), + interpretbody(Functions0,Functions,Vars1,Vars3,[Body],Result2), %% 2->1 + Result2=true, + %%((Result2=cut)->!;true), + %%trace, + %%(cut(on)->(%%notrace, + %%fail);(%%trace, + %%true)),%%notrace, + + remember_and_turn_off_debug(Debug), + %%trace, +find_sys(Sys_name), + interpretpart(match4,Variable1,[Dbw_v,Sys_name],Vars3,Vars2,_), +%%writeln1( interpretpart(match4,Variable1,[v,sys1],Vars3,Vars2,_)), +%%interpretstatement1(ssi,Functions0,Functions,[[n,equals4],[Variable1,Variable3]],Vars3,Vars2,true,nocut), + getvalue([Dbw_v,Sys_name],Value3,Vars2), + + turn_back_debug(Debug) + + ),Value3a), + putvalue(Variable3,Value3a,Vars1,Vars2) + )-> +debug_exit(Skip,[[Dbw_n,Dbw_findall],[Variable1,Body,Value3a]]) +; debug_fail(Skip,[[Dbw_n,Dbw_findall],[Variable1,Body,Variable3]])). + + +interpretstatement1(ssi,_Functions0,_Functions,[[Dbw_n,Dbw_string_from_file],[Variable1,Variable2]],Vars1,Vars2,true,nocut) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("string_from_file",Dbw_string_from_file1),Dbw_string_from_file1=Dbw_string_from_file, + interpretpart(string_from_file,Variable1,Variable2,Vars1,Vars2). 
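+
+/* Added note (not in the original source): the [[n,findall],...] clause
+   above runs Body once per solution inside findall/3, matches the
+   template Variable1 against a fresh system variable, and binds the
+   list of collected values to Variable3. A hypothetical call has the
+   shape
+
+   [[n,findall],[[v,a],Body,[v,l]]]
+
+   where Body is a single List Prolog statement that binds [v,a]; [v,l]
+   then receives the list of every value [v,a] took. Debugging is
+   temporarily switched off around each solution via
+   remember_and_turn_off_debug/1 and turn_back_debug/1. */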
+ +interpretstatement1(ssi,Functions0,Functions,[[Dbw_n,Dbw_maplist],[Variable1,Variable2,Variable3,Variable4]],Vars1,Vars2,true,nocut) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("maplist",Dbw_maplist1),Dbw_maplist1=Dbw_maplist, + + interpretpart(maplist,Functions0,Functions,Variable1,Variable2,Variable3,Variable4,Vars1,Vars2). + + +interpretstatement1(ssi,_F0,_Functions,[[Dbw_n,Dbw_string_length],[Variable1,Variable2]],Vars1,Vars2,true,nocut) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("string_length",Dbw_string_length1),Dbw_string_length1=Dbw_string_length, + interpretpart(string_length,Variable1,Variable2,Vars1,Vars2). + +interpretstatement1(ssi,_F0,_Functions,[[Dbw_n,Dbw_sort],[Variable1,Variable2]],Vars1,Vars2,true,nocut) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("sort",Dbw_sort1),Dbw_sort1=Dbw_sort, + interpretpart(sort,Variable1,Variable2,Vars1,Vars2). + +interpretstatement1(ssi,_F0,_Functions,[[Dbw_n,Dbw_intersection],[Variable1,Variable2,Variable3]],Vars1,Vars2,true,nocut) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("intersection",Dbw_intersection1),Dbw_intersection1=Dbw_intersection, + interpretpart(intersection,Variable1,Variable2,Variable3,Vars1,Vars2). + +interpretstatement1(ssi,_F0,_Functions,[[Dbw_n,Dbw_read_string],[Variable1]],Vars1,Vars2,true,nocut) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("read_string",Dbw_read_string1), +get_lang_word("read_password",Dbw_read_string2), +(Dbw_read_string1=Dbw_read_string->true;Dbw_read_string2=Dbw_read_string), + interpretpart(read_string,Variable1,Vars1,Vars2). + +interpretstatement1(ssi,_F0,_Functions,[[Dbw_n,Dbw_text_area],[Variable1,Variable2,Variable3]],Vars1,Vars2,true,nocut) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("text_area",Dbw_text_area1), +(Dbw_text_area1=Dbw_text_area), + interpretpart(text_area,Variable1,Variable2,Variable3,Vars1,Vars2). 
+ +interpretstatement1(ssi,_F0,_Functions,[[Dbw_n,Dbw_writeln],[Variable1]],Vars1,Vars2,true,nocut) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("writeln",Dbw_writeln1),Dbw_writeln1=Dbw_writeln, + interpretpart(writeln,Variable1,Vars1,Vars2). + +interpretstatement1(ssi,_F0,_Functions,[[Dbw_n,Dbw_atom_string],[Variable1,Variable2]],Vars1,Vars2,true,nocut) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("atom_string",Dbw_atom_string1),Dbw_atom_string1=Dbw_atom_string, + interpretpart(atom_string,Variable1,Variable2,Vars1,Vars2). + +interpretstatement1(ssi,_F0,_Functions,[[Dbw_n,Dbw_get_lang_word],[Variable1,Variable2]],Vars1,Vars2,true,nocut) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("get_lang_word",Dbw_get_lang_word1),Dbw_get_lang_word1=Dbw_get_lang_word, + interpretpart(get_lang_word,Variable1,Variable2,Vars1,Vars2). + + + +interpretstatement1(ssi,_F0,_Functions,[[Dbw_n,Dbw_shell],[Variable1]],Vars1,Vars2,true,nocut) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("shell",Dbw_shell1),Dbw_shell1=Dbw_shell, + interpretpart(shell,Variable1,Vars1,Vars2). + + +interpretstatement1(ssi,_F0,_Functions,[[Dbw_n,Dbw_date_time_stamp],[Y,M,D,Ho,Mi,Se,Se2,Variable1]],Vars1,Vars2,true,nocut) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("date_time_stamp",Dbw_date_time_stamp1),Dbw_date_time_stamp1=Dbw_date_time_stamp, + interpretpart(date_time_stamp,Y,M,D,Ho,Mi,Se,Se2,Variable1,Vars1,Vars2). + +interpretstatement1(ssi,_F0,_Functions,[[Dbw_n,Dbw_phrase_from_file],[[[Dbw_n,Dbw_string],[Out]],Path]],Vars1,Vars2,true,nocut) :- +%trace, +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("phrase_from_file",Dbw_phrase_from_file1),Dbw_phrase_from_file1=Dbw_phrase_from_file, +get_lang_word("string",Dbw_string1),Dbw_string1=Dbw_string, + interpretpart(phrase_from_file,Out,Path,Vars1,Vars2). 
+ +interpretstatement1(ssi,_F0,_Functions,[[Dbw_n,Dbw_assertz],[[[Dbw_n, Var], [In]]]],Vars1,Vars1,true,nocut) :- +% +%trace, +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("assertz",Dbw_assertz1),Dbw_assertz1=Dbw_assertz, + +%assertz(Var(In)) +debug_call(Skip,[[Dbw_n,Dbw_assertz],[[[Dbw_n, Var], [In]]]]), + +functor(A,Var,1),arg(1,A,In), +dynamic(Var/1), +assertz(A), + +debug_exit(Skip,[[Dbw_n,Dbw_assertz],[[[Dbw_n, Var], [In]]]]), + +!. + + + + +interpretstatement1(ssi,_F0,_Functions,[[Dbw_n,Dbw_getz],[[[Dbw_n, Var], [Variable1]]]],Vars1,Vars2,true,nocut) :- +% +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("getz",Dbw_getz1),Dbw_getz1=Dbw_getz, +%trace, + +%assertz(Var(In)) +functor(A,Var,1), +findall(Variable21,(A, +arg(1,A,Variable21)),Variable2), + + %Variable2=Value2, + getvalue(Variable2,Value2,Vars1), + %%getvalue(Value1,Value1A,Vars1), + %%isvalstr(Value1), + %%isvalstr(Value1A), + %not(is_empty(Value1)), + %expression(Value1), + %is_empty(Value2), + val1emptyorvalsequal(Value2,Value1), + %%isval(Value2), +debug_call(Skip,[[Dbw_n,Dbw_getz],[variable]]), +( putvalue(Variable1,Value2,Vars1,Vars2)-> +debug_exit(Skip,[[Dbw_n,Dbw_getz],[Value1]]) +; debug_fail(Skip,[[Dbw_n,Dbw_getz],[Value1]])). + + +%assertz(A), + + + +interpretstatement1(ssi,_F0,_Functions,[[Dbw_n,Dbw_retractall],[[[Dbw_n, Var], [In]]]],Vars1,Vars1,true,nocut) :- +%trace, +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("retractall",Dbw_retractall1),Dbw_retractall1=Dbw_retractall, + +%assertz(Var(In)) +debug_call(Skip,[[Dbw_n,Dbw_retractall],[[[Dbw_n, Var], [In]]]]), + +functor(A,Var,1),arg(1,A,In), +retractall(A), + +debug_exit(Skip,[[Dbw_n,Dbw_retractall],[[[Dbw_n, Var], [In]]]]), + +!. + +interpretstatement1(ssi,_F0,_Functions,[[Dbw_n,Dbw_word11],[Variable1]],Vars1,Vars1,true,nocut) :- +%trace, +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("word1",Dbw_word1),Dbw_word11=Dbw_word1, + %trace, + interpretpart(word1,Variable1,Vars1). 
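+
+/* Added sketch (not in the original source; the predicate name my_flag
+   is hypothetical): the assertz, getz and retractall clauses above
+   maintain a dynamic store of unary facts.
+
+   [[n,assertz],[[[n,my_flag],["on"]]]]    % stores the fact my_flag("on")
+   [[n,getz],[[[n,my_flag],[[v,x]]]]]      % collects all my_flag/1
+                                           % arguments with findall/3 and
+                                           % appears to bind [v,x] to the
+                                           % resulting list
+   [[n,retractall],[[[n,my_flag],["on"]]]] % removes the matching facts */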
+ +/* +interpretstatement1(ssi,_F0,_Functions,[[Dbw_n,Dbw_term_to_atom],[Variable1,Variable2]],Vars1,Vars2,true,nocut) :- +%trace, +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("term_to_atom",Dbw_term_to_atom1),Dbw_term_to_atom1=Dbw_term_to_atom, + %trace, + interpretpart(term_to_atom,Variable1,Variable2,Vars1,Vars2). +*/ +/* +interpretstatement1(ssi,_F0,_Functions,[[Dbw_n,Dbw_split_on_substring117a],[Variable1,Variable2,Variable3,Variable4]],Vars1,Vars2,true,nocut) :- +%trace, +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("split_on_substring117a",Dbw_split_on_substring117a1),Dbw_split_on_substring117a1=Dbw_split_on_substring117a, + %trace, + interpretpart(split_on_substring117a,Variable1,Variable2,Variable3,Variable4,Vars1,Vars2). + + +interpretstatement1(ssi,_F0,_Functions,[[Dbw_n,Dbw_string_strings],[Variable1,Variable2]],Vars1,Vars2,true,nocut) :- +%trace, +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("string_strings",Dbw_string_strings1),Dbw_string_strings1=Dbw_string_strings, + %trace, + interpretpart(string_strings,Variable1,Variable2,Vars1,Vars2). +*/ +/* +interpretstatement1(non-ssi,_F0,_Functions,[[Dbw_n,Dbw_assertz],[[[Dbw_n,Variable1],[Variable2]]]],Vars1,Vars2,true,nocut) :- +%trace, +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("assertz",Dbw_assertz1),Dbw_assertz1=Dbw_assertz, + %trace, + interpretpart(assertz,Variable1,Variable2,Vars1,Vars2). + +interpretstatement1(non-ssi,_F0,_Functions,[[Dbw_n,Dbw_retractall],[[[Dbw_n,Variable1],[Variable2]]]],Vars1,Vars2,true,nocut) :- +%trace, +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("retractall",Dbw_retractall1),Dbw_retractall1=Dbw_retractall, + %trace, + interpretpart(retractall,Variable1,Variable2,Vars1,Vars2). 
+*/ + +interpretstatement1(ssi,_F0,_Functions,[[Dbw_n,Dbw_command]|Variables2],Vars1,Vars2,true,nocut) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +%trace, +(Variables2=[]->Variables=[];Variables2=[Variables]), + +%trace, +member([Command,Args], +[ + %["phrase_from_file",[i,i]], + %["phrase_from_file",[i,i]], + + ["term_to_atom",[i,i]], + ["term_to_atom",[i,o]], + ["term_to_atom",[o,i]], + + ["open_file_s",[i,o]], + ["open_string_file_s",[i,o]], + + ["split_on_substring117a",[i,i,i,o]], + ["string_strings",[i,o]], + + ["working_directory",[i,i]], + ["working_directory",[o,i]], + ["directory_files",[o]], + ["exists_directory",[i]], + ["exists_file",[i]], + + ["split_string",[i,i,i,o]], + ["numbers",[i,i,i,o]], + ["string_atom",[i,o]], + ["string_atom",[i,i]], + ["string_atom",[o,i]], + /* + ["atom_string",[i,o]], + ["atom_string",[i,i]], + ["atom_string",[o,i]], + */ + ["string_codes",[i,o]], + ["string_codes",[i,i]], + ["string_codes",[o,i]], + ["number_string",[i,o]], + ["number_string",[i,i]], + ["number_string",[o,i]], + ["reverse",[i,o]], + ["reverse",[i,i]], + ["reverse",[o,i]], + ["get_time",[o]], + ["string_chars",[i,i]], + ["string_chars",[i,o]], + ["string_chars",[o,i]], + ["atom_chars",[i,i]], + ["atom_chars",[i,o]], + ["atom_chars",[o,i]], + ["atom_codes",[i,i]], + ["atom_codes",[i,o]], + ["atom_codes",[o,i]], + ["atom_concat",[i,i,i]], + ["atom_concat",[i,o,i]], + ["atom_concat",[o,i,i]], + ["atom_concat",[o,o,i]], + ["atom_concat",[i,i,o]], + ["atomic",[i]], + ["atom_length",[i,i]], + ["atom_length",[i,o]], + /* + ["sub_atom",[i,i,i,i,i]], + ["sub_atom",[i,i,i,i,o]], + ["sub_atom",[i,i,i,o,i]], + ["sub_atom",[i,i,i,o,o]], + ["sub_atom",[i,i,o,i,i]], + ["sub_atom",[i,i,o,i,o]], + ["sub_atom",[i,i,o,o,o]], + ["sub_atom",[i,o,i,i,i]], + ["sub_atom",[i,o,i,i,o]], + ["sub_atom",[i,o,i,o,i]], + ["sub_atom",[i,o,i,o,o]], + ["sub_atom",[i,o,o,i,i]], + ["sub_atom",[i,o,o,i,o]], + ["sub_atom",[i,o,o,o,i]], + ["sub_atom",[i,o,o,o,o]], + 
["sub_string",[i,i,i,i,i]], + ["sub_string",[i,i,i,i,o]], + ["sub_string",[i,i,i,o,i]], + ["sub_string",[i,i,i,o,o]], + ["sub_string",[i,i,o,i,i]], + ["sub_string",[i,i,o,i,o]], + ["sub_string",[i,i,o,o,o]], + ["sub_string",[i,o,i,i,i]], + ["sub_string",[i,o,i,i,o]], + ["sub_string",[i,o,i,o,i]], + ["sub_string",[i,o,i,o,o]], + ["sub_string",[i,o,o,i,i]], + ["sub_string",[i,o,o,i,o]], + ["sub_string",[i,o,o,o,i]], + ["sub_string",[i,o,o,o,o]], + */ + ["char_code",[i,i]], + ["char_code",[i,o]], + ["char_code",[o,i]], + ["number_chars",[i,i]], + ["number_chars",[i,o]], + ["number_chars",[o,i]], + ["number_codes",[i,i]], + ["number_codes",[i,o]], + ["number_codes",[o,i]], + ["close",[i,i]], + ["close",[i]], + ["stream_property",[i,i]], + ["stream_property",[i,o]], + ["stream_property",[o,i]], + ["stream_property",[o,o]], + + ["at_end_of_stream",[]], + ["at_end_of_stream",[i]], + ["set_stream_position",[i,i]], + ["compound",[i]], + ["copy_term",[i,i]], + ["copy_term",[i,o]], + ["copy_term",[o,i]], + ["copy_term",[o,o]], + ["current_prolog_flag",[i,i]], + ["current_prolog_flag",[i,o]], + ["current_prolog_flag",[o,i]], + ["current_prolog_flag",[o,o]], + ["current_input",[i]], + ["current_input",[o]], + ["current_output",[i]], + ["current_output",[o]], + ["float",[i]], + ["get_byte",[i,i]], + ["get_byte",[i,o]], + ["get_byte",[i]], + ["get_byte",[o]], + ["peek_byte",[i,i]], + ["peek_byte",[i,o]], + ["peek_byte",[i]], + ["peek_byte",[o]], + ["put_byte",[i,o]], + ["put_byte",[o,o]], + ["put_byte",[i]], + ["put_byte",[o]], + + ["peek_char",[i,i]], + ["peek_char",[i,o]], + ["peek_char",[i]], + ["peek_char",[o]], + + ["peek_code",[i,i]], + ["peek_code",[i,o]], + ["peek_code",[i]], + ["peek_code",[o]], + + ["get_char",[i]], + ["get_char",[o]], + ["get_char",[i,i]], + ["get_char",[i,o]], + ["get_code",[i]], + ["get_code",[o]], + ["get_code",[i,i]], + ["get_code",[i,o]], + + ["halt",[]], + ["halt",[i]], + + ["set_prolog_flag",[i,i]], + ["integer",[i]], + ["set_input",[i]], + 
["set_output",[i]], + ["open",[i,i,o,i]], + ["open",[i,i,o]], + ["nonvar",[i]], + + ["sin",[i,i]], + ["sin",[i,o]], + ["cos",[i,o]], + ["cos",[i,i]], + ["atan",[i,i]], + ["atan",[i,o]], + %["exp",[i,i]], + %["exp",[i,o]], + ["log",[i,i]], + ["log",[i,o]], + ["sqrt",[i,i]], + ["sqrt",[i,o]], + + ["put_char",[i,i]], + ["put_char",[i]], + ["put_code",[i,i]], + ["put_code",[i]], + ["nl",[]], + ["nl",[i]], + + ["read_term",[i,i,i]], + ["read_term",[i,o,i]], + ["read_term",[i,i]], + ["read_term",[i,o]], + ["read",[i,i]], + ["read",[i,o]], + ["read",[i]], + ["read",[o]], + + %["write_term",[i,i,i]], + %["write_term",[i,i]], + ["write",[i,i]], + ["write",[i]], + ["writeq",[i,i]], + ["writeq",[i]], + %["write_canonical",[i,i]], + %["write_canonical",[i]], + + ["abs",[i,i]], + ["abs",[i,o]], + ["sign",[i,i]], + ["sign",[i,o]], + ["floor",[i,i]], + ["floor",[i,o]], + ["round",[i,i]], + ["round",[i,o]] + %["ceiling",[i,o]] +]), + +get_lang_word(Command,Dbw_command1),Dbw_command1=Dbw_command, + +%(Dbw_command=string_chars->trace;true), + %trace, + + interpretpart(command,Command,Args,Variables,Vars1,Vars2). 
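+
+/* Added note (not in the original source): each ["Name",Modes] pair
+   above declares one supported calling pattern for a built-in command,
+   where i marks an argument that must arrive instantiated and o one the
+   call binds. For example, ["string_chars",[i,o]] covers a call of the
+   shape
+
+   [[n,string_chars],["abc",[v,c]]]
+
+   which member/2 matches to Command="string_chars", Args=[i,o], after
+   which interpretpart(command,Command,Args,Variables,Vars1,Vars2) is
+   expected to bind [v,c] to the characters of "abc". */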
+ + +interpretstatement1(non-ssi,_F0,_Functions,[[Dbw_n,Dbw_command]|Variables2],Vars1,Vars2,true,nocut) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +%trace, +(Variables2=[]->Variables=[];Variables2=[Variables]), + +%trace, +member([Command,Args], +[ + %["phrase_from_file",[i,i]], + %["phrase_from_file",[i,i]], + /*["numbers",[i,i,i,o]], + ["string_atom",[i,o]], + ["string_atom",[i,i]], + ["string_atom",[o,i]], + ["atom_string",[i,o]], + ["atom_string",[i,i]], + ["atom_string",[o,i]], + ["string_codes",[i,o]], + ["string_codes",[i,i]], + ["string_codes",[o,i]], + ["number_string",[i,o]], + ["number_string",[i,i]], + ["number_string",[o,i]], + ["reverse",[i,o]], + ["reverse",[i,i]], + ["reverse",[o,i]], + ["get_time",[o]], + ["string_chars",[i,i]], + ["string_chars",[i,o]], + ["string_chars",[o,i]], + ["atom_chars",[i,i]], + ["atom_chars",[i,o]], + ["atom_chars",[o,i]], + ["atom_codes",[i,i]], + ["atom_codes",[i,o]], + ["atom_codes",[o,i]], + ["atom_concat",[i,i,i]], + ["atom_concat",[i,o,i]], + ["atom_concat",[o,i,i]], + ["atom_concat",[o,o,i]], + ["atom_concat",[i,i,o]], + ["atomic",[i]], + ["atom_length",[i,i]], + ["atom_length",[i,o]], + */ + ["sub_atom",[i,i,i,i,i]], + ["sub_atom",[i,i,i,i,o]], + ["sub_atom",[i,i,i,o,i]], + ["sub_atom",[i,i,i,o,o]], + ["sub_atom",[i,i,o,i,i]], + ["sub_atom",[i,i,o,i,o]], + ["sub_atom",[i,i,o,o,o]], + ["sub_atom",[i,o,i,i,i]], + ["sub_atom",[i,o,i,i,o]], + ["sub_atom",[i,o,i,o,i]], + ["sub_atom",[i,o,i,o,o]], + ["sub_atom",[i,o,o,i,i]], + ["sub_atom",[i,o,o,i,o]], + ["sub_atom",[i,o,o,o,i]], + ["sub_atom",[i,o,o,o,o]], + ["sub_string",[i,i,i,i,i]], + ["sub_string",[i,i,i,i,o]], + ["sub_string",[i,i,i,o,i]], + ["sub_string",[i,i,i,o,o]], + ["sub_string",[i,i,o,i,i]], + ["sub_string",[i,i,o,i,o]], + ["sub_string",[i,i,o,o,o]], + ["sub_string",[i,o,i,i,i]], + ["sub_string",[i,o,i,i,o]], + ["sub_string",[i,o,i,o,i]], + ["sub_string",[i,o,i,o,o]], + ["sub_string",[i,o,o,i,i]], + ["sub_string",[i,o,o,i,o]], + 
["sub_string",[i,o,o,o,i]],
+	["sub_string",[i,o,o,o,o]],
+	["number_string",[i,i]],
+	["number_string",[i,o]],
+	["number_string",[o,i]],
+	["get_single_char",[i]],
+	["get_single_char",[o]]
+	/*,
+	["atom_string",[i,i]],
+	["atom_string",[i,o]],
+	["atom_string",[o,i]],
+	["char_code",[i,i]],
+	["char_code",[i,o]],
+	["char_code",[o,i]],
+	["number_chars",[i,i]],
+	["number_chars",[i,o]],
+	["number_chars",[o,i]],
+	["number_codes",[i,i]],
+	["number_codes",[i,o]],
+	["number_codes",[o,i]],
+	["close",[i,i]],
+	["close",[i]],
+	["stream_property",[i,i]],
+	["stream_property",[i,o]],
+	["stream_property",[o,i]],
+	["stream_property",[o,o]],
+
+	["at_end_of_stream",[]],
+	["at_end_of_stream",[i]],
+	["set_stream_position",[i,i]],
+	["compound",[i]],
+	["copy_term",[i,i]],
+	["copy_term",[i,o]],
+	["copy_term",[o,i]],
+	["copy_term",[o,o]],
+	["current_prolog_flag",[i,i]],
+	["current_prolog_flag",[i,o]],
+	["current_prolog_flag",[o,i]],
+	["current_prolog_flag",[o,o]],
+	["current_input",[i]],
+	["current_input",[o]],
+	["current_output",[i]],
+	["current_output",[o]],
+	["float",[i]],
+	["get_byte",[i,i]],
+	["get_byte",[i,o]],
+	["get_byte",[i]],
+	["get_byte",[o]],
+	["peek_byte",[i,i]],
+	["peek_byte",[i,o]],
+	["peek_byte",[i]],
+	["peek_byte",[o]],
+	["put_byte",[i,o]],
+	["put_byte",[o,o]],
+	["put_byte",[i]],
+	["put_byte",[o]],
+
+	["peek_char",[i,i]],
+	["peek_char",[i,o]],
+	["peek_char",[i]],
+	["peek_char",[o]],
+
+	["peek_code",[i,i]],
+	["peek_code",[i,o]],
+	["peek_code",[i]],
+	["peek_code",[o]],
+
+	["get_char",[i]],
+	["get_char",[o]],
+	["get_char",[i,i]],
+	["get_char",[i,o]],
+	["get_code",[i]],
+	["get_code",[o]],
+	["get_code",[i,i]],
+	["get_code",[i,o]],
+
+	["halt",[]],
+	["halt",[i]],
+
+	["set_prolog_flag",[i,i]],
+	["integer",[i]],
+	["set_input",[i]],
+	["set_output",[i]],
+	["open",[i,i,o,i]],
+	["open",[i,i,o]],
+	["nonvar",[i]],
+
+	["sin",[i,i]],
+	["sin",[i,o]],
+	["cos",[i,o]],
+	["cos",[i,i]],
+	["atan",[i,i]],
+	
["atan",[i,o]], + %["exp",[i,i]], + %["exp",[i,o]], + ["log",[i,i]], + ["log",[i,o]], + ["sqrt",[i,i]], + ["sqrt",[i,o]], + + ["put_char",[i,i]], + ["put_char",[i]], + ["put_code",[i,i]], + ["put_code",[i]], + ["nl",[]], + ["nl",[i]], + + ["read_term",[i,i,i]], + ["read_term",[i,o,i]], + ["read_term",[i,i]], + ["read_term",[i,o]], + ["read",[i,i]], + ["read",[i,o]], + ["read",[i]], + ["read",[o]], + + %["write_term",[i,i,i]], + %["write_term",[i,i]], + ["write",[i,i]], + ["write",[i]], + ["writeq",[i,i]], + ["writeq",[i]], + %["write_canonical",[i,i]], + %["write_canonical",[i]], + + ["abs",[i,i]], + ["abs",[i,o]], + ["sign",[i,i]], + ["sign",[i,o]], + ["floor",[i,i]], + ["floor",[i,o]], + ["round",[i,i]], + ["round",[i,o]]*/ + %["ceiling",[i,o]] +]), + +get_lang_word(Command,Dbw_command1),Dbw_command1=Dbw_command, + +%(Dbw_command=string_chars->trace;true), + %trace, + + interpretpart(command,Command,Args,Variables,Vars1,Vars2). + + %%interpretpart(findall,[Variable1,Variable3],Vars3,Vars2). + +/*** + Query1=[[n,grammar_part]|Arguments], + Arguments=[[RuleName|Variables2]], + %%(([Variables4|Rest]=Variables2->Variables3=Variables2;(Variables2=[],Variables3=[]))), + + ((not(terminal(RuleName)), +%%writeln1(["Arguments",Arguments,"Vars1",Vars1]), + substitutevarsA1(Variables2,Vars1,[],Vars3,[],FirstArgs), %%% var to value, after updatevars: more vars to values, and select argument vars from latest vars +%%writeln1([substitutevarsA1,arguments,Arguments,vars1,Vars1,vars3,Vars3,firstargs,FirstArgs]), + (not(Vars3=[])->(append([RuleName],Vars3,Vars4),Query2=[[n,grammar_part],Vars4]); + Query2=[[n,grammar_part],RuleName]), %% Bodyvars2? 
+%% debug(on)->writeln1([call,[Function,[Vars3]]]), +%%writeln1(["Query2",Query2,"Functions0",Functions0]), + %%notrace,%%**** + interpret2(Query2,Grammar,Grammar,Result1), + %%trace,%**** + updatevars2(FirstArgs,Result1,[],Vars5), + updatevars3(Vars1,Vars5,Vars6), + reverse(Vars6,[],Vars7), + ((not(Vars7=[])-> + %%Vars7=[Var71|Vars72], + unique1(Vars7,[],Vars8) +)->true;( +%%writeln1(here1), + Vars8=[]))->true)->true; +(terminal(RuleName),substitutevarsA1(Variables2,Vars1,[],Vars3,[],FirstArgs), +%%writeln1(here), %%**** +%%Vars3=[Phrase,End], +%%Vars41=[Phrase,[v,vgp]], +append([RuleName],Vars3,Vars9), +%%writeln1([vars9,Vars9]), %%%%%***** +interpretpart(grammar_part,Vars9,[],Result1), + updatevars2(FirstArgs,Result1,[],Vars5), + updatevars3(Vars3,Vars5,Vars6), + reverse(Vars6,[],Vars7), + ((not(Vars7=[])-> + %%Vars7=[Var71|Vars72], + unique1(Vars7,[],Vars8) + %%writeln1([vars8,Vars8]) %%%***** +)->true;( +%%writeln1(here1), + Vars8=[]))->true)),%%notrace, %%**** + !. +**/ + +interpretstatement1(non-ssi,Functions0,Functions,Query1,Vars1,Vars8,true,nocut) :- + + %trace, + %(Query1=[[n, flatten2], [[v, e], [v, f], [v, c]]]->trace;true),%writeln1(interpretstatement1(ssi,Functions0,_Functions,Query1,Vars1,Vars8,true,nocut)), + %trace, + %writeln(interpretstatement1(ssi,Functions0,_Functions,Query1,Vars1,Vars8,true,nocut)), +% +get_lang_word("v",Dbw_v), +%get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +%get_lang_word("call",Dbw_call1),Dbw_call1=Dbw_call, +%trace, +%%writeln1("h1/10"), + +%writeln([Functions0,Functions0]), +%find_pred_sm(Reserved_words1), + + %trace, + % ((Query1=[[Dbw_n,Dbw_call],[Function,Arguments]]%, not_reserved_word(Function,Reserved_words1) + %)->true; +%( +Query1=[Function,Arguments],%,Function=[Dbw_n1,Function_a],atom_string(Function_a,Function_s), +%(Function=[n,paraphrase2]->trace;true), +%) +%), + +%trace, + %%not(Function=[n,grammar]->true;Function=[n,grammar_part]), **** +%%writeln1(["Arguments",Arguments,"Vars1",Vars1]), + 
%%***writeln1(substitutevarsA1(Arguments,Vars1,[],Vars3,[],FirstArgs)), + Function=[Dbw_v,Function2], + not(reserved_word2(Function2)), + + getvalue(Function,Function3,Vars1), + %reserved_word(Function3), + append([Function3],[Arguments],Arguments1), + +interpretstatement1(_,Functions0,Functions,Arguments1,Vars1,Vars8,true,nocut). + + +interpretstatement1(non-ssi,_Functions0,_Functions,Query1,Vars1,Vars8,true,nocut) :- +get_lang_word("v",Dbw_v), +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("call",Dbw_call1),Dbw_call1=Dbw_call, + +%%writeln1("h1/10"), +%trace, +%find_pred_sm(Reserved_words1), + ((Query1=[[Dbw_n,Dbw_call],[[lang,Lang1],Debug1,[Function,Arguments],Functions%,Result + ]],Tm=off%, + %not(member(Dbw_call,Reserved_words1)) + )->true; + (Query1=[[Dbw_n,Dbw_call],[[lang,Lang1],Debug1,[Function,Arguments],Types,Modes,Functions%,Result + ]],Tm=on)), + + %trace, + + lang(Lang2a), + types(Types2a), + (Types2a=on->(typestatements(TypeStatements2a), + modestatements(ModeStatements2a));true), + + (Lang1=same->lang(Lang2);Lang2=Lang1), + (Debug1=same->debug(Debug2);Debug2=Debug1), + + %%not(Function=[n,grammar]->true;Function=[n,grammar_part]), **** +%%writeln1(["Arguments",Arguments,"Vars1",Vars1]), + %%***writeln1(substitutevarsA1(Arguments,Vars1,[],Vars3,[],FirstArgs)), + ((Function=[Dbw_v,F_name], + not(reserved_word2(F_name)))-> + (append([Function],Arguments,Arguments1), + substitutevarsA1(Arguments1,Vars1,[],Vars3,[],FirstArgs), + Vars3=[Function1|Vars31], + Query2=[Function1,Vars31]); + (substitutevarsA1(Arguments,Vars1,[],Vars3,[],FirstArgs), + %simplify(Vars32,Vars3), %%% var to value, after updatevars: more vars to values, and select argument vars from latest vars +%%writeln1([substitutevarsA1,arguments,Arguments,vars1,Vars1,vars3,Vars3,firstargs,FirstArgs]), + Query2=[Function,Vars3] + %not(reserved_word2(Vars3)) + )), %% Bodyvars2? 
+%% debug(on)->writeln1([call,[Function,[Vars3]]]), +%%writeln1(["Query2",Query2,"Functions0",Functions0]), + + + %interpret2(Query2,Functions0,Functions0,Result1), + +(Tm=off->international_interpret([lang,Lang2],Debug2,Query2,Functions,Result1a); + international_interpret([lang,Lang2],Debug2,Query2,Types,Modes,Functions,Result1a)), + member(Result1,Result1a), + + retractall(lang(_)), + assertz(lang(Lang2a)), + + retractall(types(_)), + assertz(types(Types2a)), + + (Types2a=on->( + retractall(typestatements(_)), + + %findall([A,C],(member([A,B],TypeStatements2a),expand_types(B,[],C)),TypeStatements2a1), + assertz(typestatements(TypeStatements2a)), + retractall(modestatements(_)), + assertz(modestatements(ModeStatements2a)));true), + + + updatevars2(FirstArgs,Result1,[],Vars5), + updatevars3(Vars1,Vars5,Vars6), + reverse(Vars6,[],Vars7), + ((not(Vars7=[])-> + %%Vars7=[Var71|Vars72], + (%trace, + unique1(Vars7,[],Vars8)%,notrace + ) +);( +%%writeln1(here1), + Vars8=[])). + +%%%% Run Prolog + +interpretstatement1(ssi,_Functions0,_Functions,[[Dbw_n,Dbw_shell_pl],[I,QP,QV,P,OVar]],Vars1,Vars2,true,nocut) :- +%trace, +%get_lang_word("v",Dbw_v), +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("shell_pl",Dbw_shell_pl1),Dbw_shell_pl1=Dbw_shell_pl, + + interpretpart(shell_pl,I,QP,QV,P,OVar,Vars1,Vars2). + + +interpretstatement1(ssi,_Functions0,_Functions,[[Dbw_n,Dbw_shell_c],[I,P,OVar]],Vars1,Vars2,true,nocut) :- +%trace, +%get_lang_word("v",Dbw_v), +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("shell_c",Dbw_shell_c1),Dbw_shell_c1=Dbw_shell_c, + + interpretpart(shell_c,I,P,OVar,Vars1,Vars2). 
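+
+/* Added note (not in the original source): the [[n,call],...] clauses
+   above run a sub-interpretation with their own language and debug
+   settings. They save the current lang/1, types/1, typestatements/1 and
+   modestatements/1 flags, run international_interpret/5 (or /7 when
+   Types and Modes are supplied, i.e. Tm=on), and re-assert the saved
+   flags afterwards so the outer interpreter resumes with its original
+   settings. */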
+ +%%% LEGACY INTERPRET FOR SSI + + + +interpretstatement1(ssi,_Functions0,_Functions,Query1,Vars1,Vars8,true,nocut) :- +get_lang_word("v",Dbw_v), +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("interpret",Dbw_interpret1),Dbw_interpret1=Dbw_interpret, + +%%writeln1("h1/10"), +%trace, +%find_pred_sm(Reserved_words1), + ((Query1=[[Dbw_n,Dbw_interpret],[[lang,Lang1],Debug1,[Function,Arguments],Functions%,Result + ]],Tm=off%, + %not(member(Dbw_call,Reserved_words1)) + )->true; + (Query1=[[Dbw_n,Dbw_interpret],[[lang,Lang1],Debug1,[Function,Arguments],Types,Modes,Functions%,Result + ]],Tm=on)), + + %trace, + + lang(Lang2a), + types(Types2a), + (Types2a=on->(typestatements(TypeStatements2a), + modestatements(ModeStatements2a));true), + + (Lang1=same->lang(Lang2);Lang2=Lang1), + (Debug1=same->debug(Debug2);Debug2=Debug1), + + %%not(Function=[n,grammar]->true;Function=[n,grammar_part]), **** +%%writeln1(["Arguments",Arguments,"Vars1",Vars1]), + %%***writeln1(substitutevarsA1(Arguments,Vars1,[],Vars3,[],FirstArgs)), + ((Function=[Dbw_v,F_name], + not(reserved_word2(F_name)))-> + (append([Function],Arguments,Arguments1), + substitutevarsA1(Arguments1,Vars1,[],Vars3,[],FirstArgs), + Vars3=[Function1|Vars31], + Query2=[Function1,Vars31]); + (substitutevarsA1(Arguments,Vars1,[],Vars3,[],FirstArgs), + %simplify(Vars32,Vars3), %%% var to value, after updatevars: more vars to values, and select argument vars from latest vars +%%writeln1([substitutevarsA1,arguments,Arguments,vars1,Vars1,vars3,Vars3,firstargs,FirstArgs]), + Query2=[Function,Vars3] + %not(reserved_word2(Vars3)) + )), %% Bodyvars2? 
+%% debug(on)->writeln1([call,[Function,[Vars3]]]), +%%writeln1(["Query2",Query2,"Functions0",Functions0]), + + + %interpret2(Query2,Functions0,Functions0,Result1), + +(Tm=off->international_interpret([lang,Lang2],Debug2,Query2,Functions,Result1a); + international_interpret([lang,Lang2],Debug2,Query2,Types,Modes,Functions,Result1a)), + member(Result1,Result1a), + + retractall(lang(_)), + assertz(lang(Lang2a)), + + retractall(types(_)), + assertz(types(Types2a)), + + (Types2a=on->( + retractall(typestatements(_)), + + %findall([A,C],(member([A,B],TypeStatements2a),expand_types(B,[],C)),TypeStatements2a1), + assertz(typestatements(TypeStatements2a)), + retractall(modestatements(_)), + assertz(modestatements(ModeStatements2a)));true), + + + updatevars2(FirstArgs,Result1,[],Vars5), + updatevars3(Vars1,Vars5,Vars6), + reverse(Vars6,[],Vars7), + ((not(Vars7=[])-> + %%Vars7=[Var71|Vars72], + (%trace, + unique1(Vars7,[],Vars8)%,notrace + ) +);( +%%writeln1(here1), + Vars8=[])). + +interpretstatement1(non-ssi,Functions0,_Functions,Query1,Vars1,Vars8,true,nocut) :- + + %trace, + %writeln(interpretstatement1(ssi,Functions0,_Functions,Query1,Vars1,Vars8,true,nocut)), + +get_lang_word("v",Dbw_v), +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("call",Dbw_call1),Dbw_call1=Dbw_call, + +%%writeln1("h1/10"), +%trace, +%writeln([Functions0,Functions0]), +%find_pred_sm(Reserved_words1), + + %trace, + ((Query1=[[Dbw_n,Dbw_call],[Function,Arguments]]%, not_reserved_word(Function,Reserved_words1) + )->true; +(Query1=[Function,Arguments] +%not(reserved_word2(Function))%,Function=[Dbw_n1,Function_a],atom_string(Function_a,Function_s), +%,not_reserved_word(Function,Reserved_words1)) +) +), + +%trace, + %%not(Function=[n,grammar]->true;Function=[n,grammar_part]), **** +%%writeln1(["Arguments",Arguments,"Vars1",Vars1]), + %%***writeln1(substitutevarsA1(Arguments,Vars1,[],Vars3,[],FirstArgs)), + ((Function=[Dbw_v,F_name], + not(reserved_word2(F_name)))-> + (%trace, + 
append([Function],Arguments,Arguments1), + %trace, + substitutevarsA1(Arguments1,Vars1,[],Vars3,[],FirstArgs), + Vars3=[Function1|Vars31], + Query2=[Function1,Vars31] + ); + (%trace, + + substitutevarsA1(Arguments,Vars1,[],Vars3,[],FirstArgs), + + Query2=[Function,Vars3], + + Function=[Dbw_n,F_name], + not(reserved_word2(F_name)) + %simplify(Vars32,Vars3), %%% var to value, after updatevars: more vars to values, and select argument vars from latest vars +%writeln1([substitutevarsA1,arguments,Arguments,vars1,Vars1,vars3,Vars3,firstargs,FirstArgs]), + + %not(reserved_word2(Vars3)) + )), %% Bodyvars2? +%(Function=[n,compound213]->%true +%trace +%;true), + %trace, +%% debug(on)->writeln1([call,[Function,[Vars3]]]), +%%writeln1(["Query2",Query2,"Functions0",Functions0]), +%trace, +%writeln1(interpret2(Query2,Functions0,Functions0,Result1)), + + interpret2(Query2,Functions0,Functions0,Result1), + + %trace, %writeln1(interpret2(Query2,Functions0,Functions0,Result1)), + %writeln1(updatevars2(FirstArgs,Result1,[],Vars5)), + + updatevars2(FirstArgs,Result1,[],Vars5), + updatevars3(Vars1,Vars5,Vars6), + reverse(Vars6,[],Vars7), + ((not(Vars7=[])-> + %%Vars7=[Var71|Vars72], + (%trace, + unique1(Vars7,[],Vars8)%,notrace + ) +);( +%%writeln1(here1), + Vars8=[])). + + + + + + + + +%%**** reverse and take first instance of each variable. + %%findresult3(Arguments,Vars6,[],Result2) +%%writeln1(["FirstArgs",FirstArgs,"Result1",Result1,"Vars5",Vars5,"Vars4",Vars4]), +%%writeln1(["Vars1:",Vars1,"Vars4:",Vars4]), +%% debug(on)->writeln1([exit,[Function,[Result2]]]). 
+interpretstatement1(non-ssi,Functions0,_Functions,Query,Vars,Vars,true,nocut) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, + +%find_pred_sm(Reserved_words1), +%trace, + Query=[Function], + + Function=[Dbw_n,F_name], + not(reserved_word2(F_name)), + %trace, + %not(reserved_word2(Function)), +%debug_call(Skip,[Function]), + (interpret2(Query,Functions0,Functions0,_Result1)-> +true%debug_exit(Skip,[Function]) +; fail%debug_fail(Skip,[Function]) +) +,!. + +not_reserved_word(Function,Reserved_words) :- + %Function=[[_,Function_a]|_] + Function=[_,Function_a],(atom(Function_a)->true;string(Function_a)),atom_string(Function_a,Function_s) +,not(member(Function_s,Reserved_words)). + + +debug_react(Status,115,true) :- Status=call, +turndebug(off), write0(" "),get_lang_word("skip",Dbw_skip),writeln0(Dbw_skip). %% skip +debug_react(_Status,97,false) :- write0(" "),get_lang_word("abort",Dbw_abort),writeln0(Dbw_abort),abort. %% abort +debug_react(Status,A,false) :- ((Status=call,not(A=115),not(A=97))->true; +(member_exit_fail(Status),not(A=97))), write0(" "),get_lang_word("creep",Dbw_creep),writeln0(Dbw_creep). %% creep + +member_exit_fail(exit). +member_exit_fail(fail). + +debug_call(Skip,FunctionArguments1) :- +%debug(On),leash1(On2), +%writeln1([debug(on),debug(On),leash1(on),leash1(On2)]), +get_lang_word("call",Dbw_call), +get_lang_word("Press c to creep, s to skip or a to abort.",Dbw_note1), +%trace, +((save_debug(on),debug(on))->(saved_debug(List1),append(List1,[[Dbw_call,FunctionArguments1,Dbw_note1]],List2), +do_saved_debug(List2));true), +((retry_back(on),debug(on))->(append_retry_back_stack([debug,[Dbw_call,FunctionArguments1,Dbw_note1]]));true), +(debug(on)->(write1([Dbw_call,FunctionArguments1,Dbw_note1]), +(leash1(on)->writeln0("");(%print_text, +get_single_char(Key),debug_react(call,Key,Skip))));Skip=false). + +debug_fail_fail(Skip) :- +(debug(on)->(Skip=true->turndebug(on);true);true). 
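+
+% debug_react/3 above maps a single key press at a debugger port to an
+% action: at a call port, key code 115 ('s') skips (debugging is switched
+% off until the matching exit), key code 97 ('a') aborts, and any other key
+% creeps to the next port. At exit and fail ports only 'a' (abort) and
+% creep apply.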
+ +debug_fail(Skip,FunctionArguments1) :- +get_lang_word("fail",Dbw_fail), +get_lang_word("Press c to creep or a to abort.",Dbw_note1), + +((save_debug(on),debug(on))->(saved_debug(List1),append(List1,[[Dbw_fail,FunctionArguments1,Dbw_note1]],List2), +do_saved_debug(List2));true), +((retry_back(on),debug(on))->(append_retry_back_stack([debug,[Dbw_fail,FunctionArguments1,Dbw_note1]]));true), +((Skip=true->turndebug(on);true),((debug(on)->(write1([Dbw_fail,FunctionArguments1,Dbw_note1]), +(leash1(on)->writeln0("");(%print_text, +get_single_char(Key),debug_react(fail,Key,_Skip))));true),fail)). + +debug_exit(Skip,FunctionResult2) :- +get_lang_word("exit",Dbw_exit), +get_lang_word("Press c to creep or a to abort.",Dbw_note1), +((save_debug(on),debug(on))->(saved_debug(List1),append(List1,[[Dbw_exit,FunctionResult2,Dbw_note1]],List2), +do_saved_debug(List2));true), +((retry_back(on),debug(on))->(append_retry_back_stack([debug,[Dbw_exit,FunctionResult2,Dbw_note1]]));true), +((Skip=true->turndebug(on);true),((debug(on)->(write1([Dbw_exit,FunctionResult2,Dbw_note1]), +(leash1(on)->writeln0("");(%print_text, +get_single_char(Key),debug_react(exit,Key,_Skip))));true))). + + +debug_types_call(FunctionArguments1) :- +get_lang_word("call",Dbw_call), +debug_types(Dbw_call,FunctionArguments1). + +debug_types(Call,FunctionArguments1) :- +((save_debug(on),debug(on))->(saved_debug(List1),append(List1,[[Call,FunctionArguments1]],List2), +do_saved_debug(List2));true), +((retry_back(on),debug(on))->(append_retry_back_stack([debug,[Call,FunctionArguments1]]));true), +(debug(on)->(writeln1([Call,FunctionArguments1]));true). 
+ +debug_types_fail(FunctionArguments1) :- +get_lang_word("fail",Dbw_fail), +((save_debug(on),debug(on))->(saved_debug(List1),append(List1,[[Dbw_fail,FunctionArguments1]],List2), +do_saved_debug(List2));true), +((retry_back(on),debug(on))->(append_retry_back_stack([debug,[Dbw_fail,FunctionArguments1]]));true), +((debug(on)->(writeln1([Dbw_fail,FunctionArguments1])) +;true),fail). + +debug_types_exit(FunctionResult2) :- +get_lang_word("exit",Dbw_exit), +debug_types(Dbw_exit,FunctionResult2). + + +word1([])-->[]. +word1([A|As]) --> [A],word1(As),{%%atom_codes(A,AC), +char_type(A,alpha)},!. +/**interpretstatement1(ssi,_Functions0, _Functions,_Query,_Vars1,_Vars2,false) :- + writeln1([false]). +**/ +/** +interpretstatement2(Value,_Vars,Value) :- + (number(Value);atom(Value)). +interpretstatement2(Variable,Vars1,Value) :- + getvalue(Variable,Value,Vars1). +interpretstatement3(A + B,Vars,Value1) :- + interpretstatement2(A,Vars,Value2), + interpretstatement2(B,Vars,Value3), + Value1 = Value2 + Value3. +interpretstatement3(Value,_Vars,Value) :- + (number(Value);atom(Value)). +interpretstatement3(Variable,Vars,Value) :- + getvalue(Variable,Value,Vars). + **/ +getvalue(Variable,Value,Vars) :- + ((not(isvar(Variable)),isvalstrorundef(Value), + %simplify(Variable,Variable2),%->true;(writeln1(simplify(Variable,Variable2)),abort)),%notrace, + Variable=Value)->true; + (isvar(Variable),isvalstrorundef(Value),getvar(Variable,Value,Vars))). +/*putvalue(Variable,Value,Vars1,Vars2) :- + ((not(isvar(Variable)),isvalstrorundef(Value),Variable=Value,Vars1=Vars2)->true; + (isvar(Variable),isvalstrorundef(Value),%trace, + updatevar(Variable,Value,Vars1,Vars2))),!. + */ +getvar(Variable,Value,Vars) :- + ((member([Variable,Value],Vars), + not(is_empty(Value)))->true; + ((aggregate_all(count,member([Variable,_Value],Vars),0)->true;%% + (member([Variable,Empty],Vars),is_empty(Empty))),(Empty=Value))) +. +getvar(undef,undef,_Vars) :- + !. 
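+
+% getvalue/3 unifies a non-variable term directly with Value, and looks a
+% [v,Name] variable up in the Vars binding list via getvar/3. getvar/3
+% returns the stored value unless it is an uninstantiated placeholder
+% (is_empty/1), in which case the placeholder itself is returned.
+% A sketch of the intended behaviour, assuming English language words so
+% that get_lang_word("v",Dbw_v) gives "v":
+%   ?- getvalue(5,V,[]).              % V = 5
+%   ?- getvalue([v,x],V,[[[v,x],3]]). % V = 3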
+%%getvar(Variable,empty,Vars) :- + %%(aggregate_all(count,member([Variable,_Value],Vars),0)->true;%%; + %%member([Variable,empty],Vars)) + %%. + +simplify(A,A) :- + (blob(A,stream)->%false + true + ;(variable_name(A)->true;(string(A)->true;(number(A)->true;(atom(A)->true;A=[]))))),!. +%simplify([A,"|",[B|B0]],[A1|[B1|B10]]) :- +% simplify(A,A1), +% simplify(B,B1), +% simplify(B0,B10),!. +simplify([A,"|",B],C) :- + simplify(A,A1), + simplify(B,B1), + ((not(isvar(B1)),is_list(B1))-> + C=[A1|B1]; + C=[A1,"|",B1]),!. +simplify(AB,[A1|B1]) :- + AB=[A|B], +%(not(is_list(B))->trace;true), +%is_list(B),not(variable_name(B)), +(AB=[_A2,"|"|B2]-> +(B2=[B3],(((is_list(B3),not(variable_name(B3)))->true;variable_name(B3))));true), + simplify(A,A1), + simplify(B,B1),!. + + +all_empty([]) :- !. +all_empty(Empty) :- is_empty(Empty),!. +all_empty([A|B]) :- + all_empty(A),all_empty(B),!. + +updatevar(undef,_Value,Vars,Vars) :- + !. +updatevar(Variable,Value,Vars1,Vars2a) :- +%writeln1(updatevar(Variable,Value,Vars1,Vars2)) +%trace, + + ((((member([Variable,A],Vars1), + %trace, + %(isvar(Variable)->Value2=Value; + (updatevar_recursive(Value,A,Value2)), + %notrace, + %all_empty(A), + delete(Vars1,[Variable,A],Vars3), + append(Vars3,[[Variable,Value2]],Vars2) + )->true; + ((not(member([Variable,Value1],Vars1)), + ((is_empty(Value1))->true;(Value1=Value)))), + append(Vars1,[[Variable,Value]],Vars2))->true; + (member([Variable,Value],Vars1),Vars2=Vars1))->true; + (undef(Variable), + append(Vars1,[[Variable,Value]],Vars2))), + + %(%variable_name(Variable)-> + %( + %/*trace, + findall([Variable1,B],(member([Variable1,C],Vars2), + replace_in_term(C,Variable,Value,B)), + Vars2a) + %*/ + %Vars2=Vars2a + %,writeln1(replace_in_term(Vars1a,Variable,Value,Vars1)) + %); + %Vars1a=Vars1), + +. + +updatevar_recursive([],[],[]) :- !. 
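+
+% simplify/2 above normalises List Prolog's ["|"] pipe notation into a
+% plain list when the tail is a concrete list. A sketch, assuming the usual
+% variable_name/1 test for [v,Name] terms:
+%   ?- simplify([1,"|",[2,3]],X).   % X = [1,2,3]
+% When the tail is still a variable such as [v,t], the pipe form is kept:
+%   ?- simplify([1,"|",[v,t]],X).   % X = [1,"|",[v,t]]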
+
+updatevar_recursive(Variable,A,Value) :-
+ (is_empty(Variable)->Value=A;
+ (is_empty(A)->Value=Variable;
+ (Variable=A->Value=Variable;
+ (Variable=[B|C],
+ A=[D|E],
+ updatevar_recursive(B,D,Value1),
+ updatevar_recursive(C,E,Value2),
+ append([Value1],Value2,Value))))),!.
+
+
+
+/**updatevars(_FirstArgs,[],Vars,Vars).
+updatevars(FirstArgs,Vars1,Vars2,Vars3) :-
+	Vars1=[[Variable1,Value]|Vars4],
+	((member([Variable2,Variable1],FirstArgs), %% removed brackets around firstargs here and two lines below
+	append(Vars2,[[Variable2,Value]],Vars5))->true;
+	(member([Variable1,_Variable2],FirstArgs),
+	append(Vars2,[[Variable1,Value]],Vars5))),
+	updatevars(FirstArgs,Vars4,Vars5,Vars3),
+	!.
+updatevars(FirstArgs,Vars1,Vars2,Vars3) :-
+	Vars1=[_Vars4|Vars5],
+	updatevars(FirstArgs,Vars5,Vars2,Vars3).**/
+
+updatevars(FirstArgs,Vars1,Vars2,Vars3) :-
+%writeln1(updatevars(FirstArgs,Vars1,Vars2,Vars3)),
+
+%trace,
+(equals4(on)->
+%(writeln1(e4_updatevars_1(FirstArgs,Vars1,Vars2,Vars3)),
+e4_updatevars(FirstArgs,Vars1,Vars2,Vars3)
+%,writeln1(e4_updatevars_2(FirstArgs,Vars1,Vars2,Vars3)))
+; updatevars1(FirstArgs,Vars1,Vars2,Vars3)),!.
+
+
+updatevars1([],_Vars1,Vars2,Vars2) :- !.
+updatevars1(FirstArgs,Vars1,Vars2,Vars3) :-
+	FirstArgs=[[Orig,New]|Rest],
+	(expressionnotatom(New)->append(Vars2,[[Orig,New]],Vars4);
+	(member([New,Value],Vars1),
+	append(Vars2,[[Orig,Value]],Vars4))),
+	updatevars1(Rest,Vars1,Vars4,Vars3),!.
+
+updatevars2(_FirstArgs,[],Vars,Vars) :- !.
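+
+% updatevar_recursive/3 merges two partially instantiated values element by
+% element: an is_empty/1 placeholder on either side takes the value from
+% the other side, equal values pass through, and the predicate fails on a
+% genuine conflict. For example, assuming [v,_] terms satisfy is_empty/1:
+%   ?- updatevar_recursive([1,[v,x]],[[v,y],2],V). % V = [1,2]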
+updatevars2(FirstArgs,Vars1,Vars2,Vars3) :- +%trace, +%writeln(updatevars2(FirstArgs,Vars1,Vars2,Vars3)), +%trace, + Vars1=[[Variable,Value]|Vars4], + (%member(Variable,FirstArgs), %% removed brackets around firstargs here and 2 line below, ** vars1 into arg in (10), check cond + append(Vars2,[[Variable,Value]],Vars5)), + updatevars2(FirstArgs,Vars4,Vars5,Vars3). + +updatevars3(Vars1,Vars2,Vars4) :- + updatevars31(Vars1,Vars2,Vars11), + updatevars32(Vars11,Vars2,Vars4). + +updatevars31(Vars1,Vars2,Vars11) :- + findall([V,Value3],(member([V,Val],Vars1), + +get_lang_word("v",Dbw_v), +get_lang_word("sys1",Dbw_sys1), +%%writeln1(5), +%trace, + remember_and_turn_off_debug(Debug), + %trace, + %(interpretpart(match4,Variable1,Variable2,Vars1,Vars5,_)->true;(turn_back_debug(Debug), + %fail + %interpretpart(match4,Variable1,Variable2,Vars1,_Vars2,_))), + + interpretpart(match4,Val,[Dbw_v,Dbw_sys1],Vars2,Vars4,_), + + getvalue([Dbw_v,Dbw_sys1],Value3,Vars4), + + turn_back_debug(Debug)),Vars11). + + + %interpretpart(match4,Variable1,Variable2,Vars1,Vars2,Value3) + %) + + %) + +updatevars32(Vars1,[],Vars1). +updatevars32(Vars1,Vars2,Vars4) :- + get_lang_word("v",Dbw_v), + Vars2=[[Variable,Value]|Vars5], + delete(Vars1,[Variable,[Dbw_v,_]],Vars6), + append(Vars6,[[Variable,Value]],Vars7), + updatevars32(Vars7,Vars5,Vars4), + !. +updatevars32(Vars1,Vars2,Vars4) :- + Vars2=[[Variable,Value]|Vars5], + append(Vars1,[[Variable,Value]],Vars6), + updatevars32(Vars6,Vars5,Vars4). +reverse([],List,List). +reverse(List1,List2,List3) :- + List1=[Head|Tail], + append([Head],List2,List4), + reverse(Tail,List4,List3). +/** +unique1([Item|Items1],Items2,Items3) :- + delete(Items1,Item,Items4), + append(Items2,[Item],Items5), + unique1(Items4,Items5,Items3). + **/ + +unique1(A,Items2,Items3) :- +%trace, +unique1a(A,Items2,Items3). +%notrace. +unique1a([],Items,Items). 
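+
+% reverse/3 above is the standard accumulator reversal: the second argument
+% collects heads in reverse order and is returned once the input list is
+% exhausted, e.g.:
+%   ?- reverse([1,2,3],[],R).  % R = [3,2,1]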
+unique1a([[Item,Val]|Items1],Items2,Items3) :- + + (member([Item,Val2],Items1)->(not(Val=Val2)-> + fail%delete(Items1,[Item,Val2],Items6) + ;true%Items1=Items6 + ); + true%Items1=Items6 + ), + %delete(Items1,Item,Items4), + %append(Items2,[Item],Items5), + delete(Items1,[Item,Val],Items4), + append(Items2,[[Item,Val]],Items5), + unique1a(Items4,Items5,Items3). + + +isvar([Dbw_v,_Value]) :- +get_lang_word("v",Dbw_v1),Dbw_v1=Dbw_v,!. +isval(Value) :- + number(Value). +isvalstr(N) :- + isval(N);string(N). +isvalempty(N) :- + isval(N);(is_empty(N)). +is_empty(N) :- + variable_name(N).%=empty. +/**isvalstrempty(N) :- + isval(N);(string(N);N=empty).**/ +isvalstrempty(N) :- + var(N),!. +isvalstrempty(N) :- + isval(N),!. +isvalstrempty(N) :- +get_lang_word("v",Dbw_v), + not(N=Dbw_v),not(N=[Dbw_v,_]), + string(N). +isvalstrempty(Empty) :- is_empty(Empty). +isvalstrempty([]). +/**isvalstrempty(N) :- + atom(N),fail,!. +**/ +isvalstrorundef(N) :- + var(N),!. +isvalstrorundef(N) :- + blob(N,stream),!. +isvalstrorundef(N) :- + not(var(N)),isval(N),!. +isvalstrorundef(N) :- + not(var(N)),expression(N),!. +undef(N) :- + var(N). +/** +expression(N) :- + isval(N);(string(N);atom(N)),!. +expression([]). +expression(empty). +expression([N]) :- + expression(N). +expression([N|Ns]):- + expression(N),expression(Ns). +**/ + +expression(Empty) :- is_empty(Empty), + !. +expression(N) :- + isval(N),!. +expression(N) :- + string(N),!. +expression(N) :- + atom(N),!. +expression([]) :- + !. +expression(N) :- + not(atom(N)), + length_is_list(N,L),L>=1, + expression2(N). +expression2([]). +expression2([N|Ns]) :- + %%( + expression3(N),%%->true;expression2(N)), + expression2(Ns). +expression3(N) :- + isval(N),!. +expression3(N) :- + string(N),!. +expression3(N) :- + atom(N),!. +expression3(N) :- + expression2(N),!. + +expressionnotatom3(N) :- + expressionnotatom(N),!. +expressionnotatom3(N) :- + get_lang_word("v",Dbw_v), + not(N=[v,_]),not(N=["v",_]),not(N=[Dbw_v,_]),expression(N),!. 
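+
+% unique1/3 removes duplicate variable-value pairs but fails when the same
+% variable is bound to two different values, e.g.:
+%   ?- unique1([[[v,a],1],[[v,a],1],[[v,b],2]],[],R).
+%      % R = [[[v,a],1],[[v,b],2]]
+%   ?- unique1([[[v,a],1],[[v,a],2]],[],R).           % fails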
+ + +expression_or_atom(N) :- + (isvalstrempty(N)->true;atom(N)),!. +expression_or_atom(N) :- + is_list(N), + length(N,L),L>=1, + expression_or_atom2(N),!. +expression_or_atom(Name) :- + predicate_or_rule_name(Name),!. +expression_or_atom2([]). +expression_or_atom2([N|Ns]) :- + (isvalstrempty(N)->true;atom(N)), + expression_or_atom2(Ns). + +expressionnotatom(N) :- + isvalstrempty(N),!. +expressionnotatom(N) :- + not(atom(N)), + length(N,L),L>=1, + expressionnotatom2(N),!. +expressionnotatom(Name) :- + predicate_or_rule_name(Name),!. +expressionnotatom2([]). +expressionnotatom2([N|Ns]) :- + isvalstrempty(N), + expressionnotatom2(Ns). + +substitutevarsA1(Arguments,Vars1,Vars2,Vars3,FirstArgs1,FirstArgs2) :- +%writeln1(substitutevarsA1(Arguments,Vars1,Vars2,Vars3,FirstArgs1,FirstArgs2)), +%(Arguments=[[v,d],[[v,a],"|",[v,b]],[v,c]]->trace;true), + %simplify(Arguments1,Arguments), +%trace,writeln(substitutevarsA1(Arguments,Vars1,Vars2,Vars3,FirstArgs1,FirstArgs2)), + (equals4(on)->e4_substitutevarsA1(Arguments,Vars1,Vars2,Vars33,FirstArgs1,FirstArgs2); + substitutevarsA11(Arguments,Vars1,Vars2,Vars33,FirstArgs1,FirstArgs2)), + + findall(Vars31,(member(Vars32,Vars33),simplify(Vars32,Vars31)),Vars3), + %substitutevarsA2(Arguments,Vars1,Vars2,Vars3,FirstArgs1,FirstArgs2), + !. + substitutevarsA11(Arguments,Vars1,Vars2,Vars3,FirstArgs1,FirstArgs2) :- + substitutevarsA2(Arguments,Vars1,Vars2,Vars3,FirstArgs1,FirstArgs2),!. + +substitutevarsA2([],_Vars1,Vars2,Vars2,FirstArgs,FirstArgs):-!. +substitutevarsA2(Arguments,Vars1,Vars2,Vars3,FirstArgs1,FirstArgs2) :- + Arguments=[Variable|Variables], + ((getvalue(Variable,Value,Vars1), + is_empty(Value))-> + ((append(Vars2,[Variable],Vars4)), + (isvar(Variable)->append(FirstArgs1,[Variable], + FirstArgs3);FirstArgs3=FirstArgs1)); + (getvalue(Variable,Value,Vars1), + append(Vars2,[Value],Vars4)), + FirstArgs3=FirstArgs1), + substitutevarsA2(Variables,Vars1,Vars4,Vars3,FirstArgs3,FirstArgs2),!. 
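+
+% substitutevarsA1/6 prepares a call's arguments: arguments whose current
+% value is still empty are passed through unchanged (and, if they are
+% [v,_] variables, recorded in FirstArgs so results can be copied back by
+% updatevars2/4 after the call), while bound arguments are replaced by
+% their values from Vars1 before the call is made.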
+
+findresult3([],_Result,Result2,Result2):-!.
+findresult3(Arguments1,Result1,Result2,Result3) :-
+	Arguments1=[Value|Arguments2],
+	expression_not_var(Value),
+	append(Result2,[Value],Result4),
+	findresult3(Arguments2,Result1,Result4,Result3),!.
+findresult3(Arguments1,Result1,Result2,Result3) :-
+	Arguments1=[Variable|Arguments2],
+
+(equals4(on)->(get_lang_word("v",Dbw_v),
+
+	remember_and_turn_off_debug(Debug),
+
+find_sys(Sys_name),
+	interpretpart(match4,Variable,[Dbw_v,Sys_name],Result1,Vars3,_),
+	turn_back_debug(Debug),
+
+	getvalue([Dbw_v,Sys_name],Value,Vars3)
+
+)
+
+;(	isvar(Variable),
+	member([Variable,Value],Result1))),
+
+	append(Result2,[Value],Result4),
+	findresult3(Arguments2,Result1,Result4,Result3),!.
+
+strip([],Result2,Result2).
+strip(Arguments1,Result2,Result3) :-
+	Arguments1=[[Variable,Value]|Arguments2],
+	isvar(Variable),
+	append(Result2,[Value],Result4),
+	strip(Arguments2,Result4,Result3).
+
+
+ remember_and_turn_off_debug(Debug) :-
+ 	debug(Debug),retractall(debug(_)),assertz(debug(off)).
+
+ turn_back_debug(Debug) :-
+ 	retractall(debug(_)),assertz(debug(Debug)).
+
+find_sys(Name2) :-
+	sys(N1),
+	concat_list(["sys",N1],Name1),
+	get_lang_word(Name1,Name2),
+	%atom_string(Name2,Name1),
+	N2 is N1+1,
+	retractall(sys(_)),
+	assertz(sys(N2)).
+
+find_query_box_n(Name2) :-
+	(query_box_n(N1)->N=N1;
+	(retractall(query_box_n(_)),
+	assertz(query_box_n(1)),
+	N=1)),
+	concat_list(["query_box_",N],Name1),
+	get_lang_word(Name1,Name2),
+	%atom_string(Name2,Name1),
+	N2 is N+1,
+	retractall(query_box_n(_)),
+	assertz(query_box_n(N2)).
+
+find_v_sys(V_sys) :-
+	get_lang_word("v",Dbw_v),
+	find_sys(Sys_name),
+	V_sys=[Dbw_v,Sys_name],!.
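+
+% find_sys/1 generates fresh internal variable names from the sys/1
+% counter ("sys1", "sys2", ...), and find_v_sys/1 wraps the result as a
+% [v,SysN] variable term. Assuming the counter starts at 1 and English
+% language words, two successive calls yield sys1 and then sys2.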
diff --git a/listprologinterpreter/listprologinterpreter3preds5.pl b/listprologinterpreter/listprologinterpreter3preds5.pl new file mode 100644 index 0000000000000000000000000000000000000000..3ae17b607d290e9dbc4670d2bbbf7f8621a34fe6 --- /dev/null +++ b/listprologinterpreter/listprologinterpreter3preds5.pl @@ -0,0 +1,1563 @@ +interpretpart(is,Variable1,Variable2,Vars1,Vars2) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("is",Dbw_is), + getvalues(Variable1,Variable2,Value1,Value2,Vars1), + %%getvalue(Value1,Value1A,Vars1), + %%isvalstr(Value1), + %%isvalstr(Value1A), + not(is_empty(Value1)), + expression(Value1), + is_empty(Value2), + val1emptyorvalsequal(Value2,Value1), + %%isval(Value2), +debug_call(Skip,[[Dbw_n,Dbw_is],[Value1,variable]]), +( putvalue(Variable2,Value1,Vars1,Vars2)-> +debug_exit(Skip,[[Dbw_n,Dbw_is],[Value1,Value1]]) +; debug_fail(Skip,[[Dbw_n,Dbw_is],[Value1,variable]])),!. + +interpretpart(is,Variable1,Variable2,Vars1,Vars2) :- +%writeln(here), +%trace, +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("is",Dbw_is), + getvalues(Variable1,Variable2,Value1,Value2,Vars1), + %%getvalue(Value1,Value1A,Vars1), + %%isvalstr(Value1), + %%isvalstr(Value1A), + is_empty(Value1), + not(is_empty(Value2)), + expression(Value2), + val1emptyorvalsequal(Value1,Value2), + %%isval(Value2), +debug_call(Skip,[[Dbw_n,Dbw_is],[variable,Value2]]), +( putvalue(Variable1,Value2,Vars1,Vars2)-> +debug_exit(Skip,[[Dbw_n,Dbw_is],[Value2,Value2]]) +; debug_fail(Skip,[[Dbw_n,Dbw_is],[variable,Value2]])),!. 
+ +interpretpart(bracket1,Variable1,Variable2,Vars1,Vars2) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("wrap",Dbw_wrap), + getvalues_equals4(Variable1,Variable2,Value1,_Value2,Vars1), +debug_call(Skip,[[Dbw_n,Dbw_wrap],[Value1,Variable2]]), +(( Value1A = [Value1], + %val1emptyorvalsequal(Value2,Value1A), + %%val1emptyorvalsequal(Value1A,Value2), + putvalue_equals4(Variable2,Value1A,Vars1,Vars2))-> +debug_exit(Skip,[[Dbw_n,Dbw_wrap],[Value1A,Value1A]]) +; debug_fail(Skip,[[Dbw_n,Dbw_wrap],[Value1,Variable2]])),!. + +interpretpart(stringtonumber,Variable2,Variable1,Vars1,Vars2) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("stringtonumber",Dbw_stringtonumber), + getvalues(Variable1,Variable2,Value1,Value2,Vars1), + %%Value1A = [Value2], +debug_call(Skip,[[Dbw_n,Dbw_stringtonumber],[Value2,value]]), + ((((Value2=""->true;is_empty(Value2))->Value1=""; + number_string(Value1A,Value2)), + val1emptyorvalsequal(Value1,Value1A), + %%val1emptyorvalsequal(Value1A,Value2), + putvalue(Variable1,Value1A,Vars1,Vars2))-> +debug_exit(Skip,[[Dbw_n,Dbw_stringtonumber],[Value2,Value1A]]) +; debug_fail(Skip,[[Dbw_n,Dbw_stringtonumber],[Value2,value]])),!. + + +interpretpart(bracket2,Variable1,Variable2,Vars1,Vars2) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("unwrap",Dbw_unwrap), + getvalues_equals4(Variable1,Variable2,Value1,Value2,Vars1), +debug_call(Skip,[[Dbw_n,Dbw_unwrap],[variable,Value2]]), + (([Value2A] = Value1, + %val1emptyorvalsequal(Value2,Value2A), + %%val1emptyorvalsequal(Value2A,Value1), + putvalue_equals4(Variable2,Value2A,Vars1,Vars2))-> + debug_exit(Skip,[[Dbw_n,Dbw_unwrap],[Value1,Value2A]]) +; debug_fail(Skip,[[Dbw_n,Dbw_unwrap],[variable,Value2]])),!. 
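+
+% bracket1 ("wrap") and bracket2 ("unwrap") above are inverses: wrap puts a
+% value into a singleton list and unwrap takes it back out, e.g. wrapping 1
+% gives [1] and unwrapping [1] gives 1. Unwrap fails if the input value is
+% not a one-element list.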
+ +interpretpart(head,Variable1,Variable2,Vars1,Vars2) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("head",Dbw_head), + getvalues_equals4(Variable1,Variable2,Value1,_Value2,Vars1), +debug_call(Skip,[[Dbw_n,Dbw_head],[Value1,variable]]), + ((Value1=[Value1A|_Rest], + %val1emptyorvalsequal(Value2,Value1A), + putvalue_equals4(Variable2,Value1A,Vars1,Vars2))-> + debug_exit(Skip,[[Dbw_n,Dbw_head],[Value1,Value1A]]) +; debug_fail(Skip,[[Dbw_n,Dbw_head],[Value1,variable]])),!. + +interpretpart(tail,Variable1,Variable2,Vars1,Vars2) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("tail",Dbw_tail), + getvalues_equals4(Variable1,Variable2,Value1,_Value2,Vars1), +debug_call(Skip,[[Dbw_n,Dbw_tail],[Value1,variable]]), + ((Value1=[_Head|Value1A], + %%removebrackets(Value1A,Value1B), + %val1emptyorvalsequal(Value2,Value1A), + putvalue_equals4(Variable2,Value1A,Vars1,Vars2))-> + debug_exit(Skip,[[Dbw_n,Dbw_tail],[Value1,Value1A]]) +; debug_fail(Skip,[[Dbw_n,Dbw_tail],[Value1,variable]])),!. + +/*interpretpart(member,Variable1,Variable2,Vars1,Vars2) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("member",Dbw_member), + getvalues_equals4(Variable1,Variable2,Value1,Value2,Vars1), +debug_call(Skip,[[Dbw_n,Dbw_member],[Value1,Value2]]), + %(((not(Value2=empty)->member(Value2,Value1), + ((member(Value3,Value1), + putvalue_equals4(Variable2,Value3,Vars1,Vars2)%%,Vars2=Vars1 + )-> + debug_exit(Skip,[[Dbw_n,Dbw_member],[Value1,Value3]]) +; debug_fail(Skip,[[Dbw_n,Dbw_member],[Value1,Value2]])),!. 
+*/ +interpretpart(member2,Variable1,Variable2,Vars1,Vars2) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("member2",Dbw_member2), +get_lang_word("v",Dbw_v), + +%trace, + getvalues_equals4(Variable1,Variable2,Value1,Value2,Vars1), + + + matrix_member(Matrix),findall(X,(member(Y,[Value1,Value2]),(contains_var([Dbw_v,_],Y)->X=o;X=i)),Z), +foldr(atom_concat,Z,'',W),(member(W,Matrix)->true;(writeln([incorrect,member2,modes,W]),abort)), + +((W=ii->true;W=io)-> ((%Value2=empty, + ((member(Value2a,Value1), + debug_call(Skip,[[Dbw_n,Dbw_member2],[Value2,Value1]]), + putvalue_equals4(Variable2,Value2a,Vars1,Vars2)))), + debug_exit(Skip,[[Dbw_n,Dbw_member2],[Value2a,Value1]])); + + + + +(W=oi,%trace, +%replace_in_term([Value2,],_%'$VAR'(_) +% ,empty,Value1A1), + command_n_sols(N), +%trace, +%writeln(Value2), + findnsols(N,Value1A1,(member(Value2,Value1A), + + %Value1A=[Value3A2|Value3A3], + %ValueIA1=[Value3A2,"|",Value3A3], + + find_v_sys(V_sys), + +%trace, + + replace_in_term(Value1A,_%'$VAR'(_) + ,empty2,Value1A2), + + convert_to_lp_pipe(Value1A2,Value1A3), + + replace_in_term(Value1A3,empty2%'$VAR'(_) + ,V_sys,Value1A1) + + ) + ,ValueA),!, + + + %val1emptyorvalsequal(Value3,Value3A), + %trace, + %Vars1=Vars2, + member(Value1a,ValueA), + putvalue_equals4(Variable1,Value1a,Vars1,Vars2), + + + debug_call(Skip,[[Dbw_n,Dbw_member2],[Value2,Value1]]), + debug_exit(Skip,[[Dbw_n,Dbw_member2],[Value2,Value1a]])); + + +(W=oo->%**** change this + + +(command_n_sols(N), + %findall([Vars2b,[Value1a,Value3a],Value1a,Value3a],( + findnsols(N,%[Value1A2, + Value2A2%] + ,(member(_Value1A,Value2A), + %replace_in_term(Value1A,_%'$VAR'(_) + %,empty,Value1A1), + + find_v_sys(V_sys), + + replace_in_term(Value2A,_%'$VAR'(_) + ,empty2,Value2A4), + + %convert_to_lp_pipe(Value1A1,Value1A2), + convert_to_lp_pipe(Value2A4,Value2A3), + + replace_in_term(Value2A3,empty2%'$VAR'(_) + ,V_sys,Value2A2) + + ) + ,ValueA),!, + %val1emptyorvalsequal(Value3,Value3A), + %trace, + %Vars1=Vars2, + 
member(%[Value1a, + Value2a%] + ,ValueA), + %putvalue_equals4(Variable1,Value1a,Vars1,Vars3),%)-> + putvalue_equals4(Variable2,Value2a,Vars1,Vars2) + ,%,Vars2a),Vars2a=[[Vars2,_,Value1a,Value3a]|Vars2d], + %findall([Vars2e,Vals2g],member([Vars2e,Vals2g,_,_],Vars2d),Vars2c1), + + %Vars2c=[[Dbw_n,Dbw_member2],[Value1,Value3],_,_,%,%Value2a + %_,_,%[Value1,Value2a] + %Vars2c1], + debug_call(Skip,[[Dbw_n,Dbw_member2],[Value1,Value1]]), + + debug_exit(Skip,[[Dbw_n,Dbw_member2],[Value2a,Value1]])))) + . +%%; %%debug_fail(Skip,[[n,member],[Value1,Value2]])),!. +%% ((debug(on)->(writeln1([fail,[[n,member],[Value1,value]],"Press c."]),(leash1(on)->true;(not(get_single_char(97))->true;abort)));true),fail))))). + +interpretpart(member3,Variable1,Variable2,Vars1,Vars2) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("member3",Dbw_member2), + getvalues_equals4(Variable1,Variable2,Value1,Value2,Vars1), + %trace, + ((%Value2=empty, + %trace, + ((member(Value1a,Value2), + + remember_and_turn_off_debug(Debug), + + (interpretpart(match4,Variable1,Value1a,Vars1,Vars2,_)->true;(turn_back_debug(Debug),fail)), + + turn_back_debug(Debug), + + + debug_call(Skip,[[Dbw_n,Dbw_member2],[Value2,Value1]]) + %putvalue(Variable1,Value1a,Vars1,Vars2) + ))), + debug_exit(Skip,[[Dbw_n,Dbw_member2],[Value2,Value1a]])). + +interpretpart(isop,Operator,Variable1,Variable2,Variable3,Vars1,Vars2) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +getvalues(Variable1,Variable2,Variable3,Value1,Value2,Value3,Vars1), + debug_call(Skip,[[Dbw_n,Operator],[Value2,Value3,variable]]), + ((isvalempty(Value1), + isval(Value2), + isval(Value3), + Expression=..[Operator,Value2,Value3], + Value1A is Expression, + val1emptyorvalsequal(Value1,Value1A), + putvalue(Variable1,Value1A,Vars1,Vars2))-> + debug_exit(Skip,[[Dbw_n,Operator],[Value2,Value3,Value1A]]) +; debug_fail(Skip,[[Dbw_n,Operator],[Value2,Value3,variable]])),!. 
+ +interpretpart(iscomparison,Operator,Variable1,Variable2,Vars1,Vars1) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, + getvalues(Variable1,Variable2,Value1,Value2,Vars1), + debug_call(Skip,[[Dbw_n,Operator],[Value1,Value2]]), + ((isval(Value1), + isval(Value2), + Expression=..[Operator,Value1,Value2], + Expression)-> + debug_exit(Skip,[[Dbw_n,Operator],[Value1,Value2]]) +; debug_fail(Skip,[[Dbw_n,Operator],[Value1,Value2]])),!. + +interpretpart(is,Variable1,Variable2,Vars1,Vars2) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, + getvalues_equals4(Variable1,Variable2,Value1,Value2,Vars1), + % not(isempty(Value1)), + % not(isempty(Value2)), + debug_call(Skip,[[Dbw_n,=],[Value1,Value2]]), + ((Value1A = Value2, + %val1emptyorvalsequal(Value1,Value1A), + putvalue_equals4(Variable1,Value1A,Vars1,Vars2))-> + debug_exit(Skip,[[Dbw_n,=],[Value1A,Value2]]) +; debug_fail(Skip,[[Dbw_n,=],[Value1,Value2]])),!. + + +interpretpart(match1,Variable1,Variable2,Variable3,Vars1,Vars2) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +getvalues_equals4(Variable1,Variable2,Variable3,Value1,_Value2,_Value3,Vars1), + Value1 = [Value2A, Value3A], + debug_call(Skip,[[Dbw_n,=],[[Value2A,Value3A],[variable1,variable2]]]), + ((%val1emptyorvalsequal(Value2,Value2A), + %val1emptyorvalsequal(Value3,Value3A), + putvalue_equals4(Variable2,Value2A,Vars1,Vars3), + putvalue_equals4(Variable3,Value3A,Vars3,Vars2))-> + debug_exit(Skip,[[Dbw_n,=],[[Value2A, Value3A],[Value2A, Value3A]]]) +; debug_fail(Skip,[[Dbw_n,=],[[Value2A,Value3A],[variable1,variable2]]])),!. 
+ +interpretpart(match2,Variable1,Variable2,Variable3,Vars1,Vars2) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, + getvalues_equals4(Variable1,Variable2,Variable3,_Value1,Value2,Value3,Vars1), + Value1A = [Value2, Value3], + debug_call(Skip,[[Dbw_n,=],[variable,[Value2,Value3]]]), + ((%val1emptyorvalsequal(Value1,Value1A), + putvalue_equals4(Variable1,Value1A,Vars1,Vars2))-> + (debug_exit(Skip,[[Dbw_n,=],[[Value2,Value3],[Value2,Value3]]]) +; debug_fail(Skip,[[Dbw_n,=],[variable,[Value2,Value3]]]))),!. + +interpretpart(match3,Variable1,Variable2,Vars1,Vars2) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, + getvalues_equals4(Variable1,Variable2,_Value1,Value2,Vars1), + Value1A = Value2, + debug_call(Skip,[[Dbw_n,=],[variable,Value2]]), + ((%val1emptyorvalsequal(Value1,Value1A), + putvalue_equals4(Variable1,Value1A,Vars1,Vars2))-> + (debug_exit(Skip,[[Dbw_n,=],[Value2,Value2]]) +; debug_fail(Skip,[[Dbw_n,=],[variable,Value2]]))),!. + +interpretpart(match4,Variable1,Variable2,Vars1,Vars2,_Note) :- +get_lang_word("v",Dbw_v1),Dbw_v1=Dbw_v, +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("equals4",Dbw_equals4), +%trace, + debug_call(Skip,[[Dbw_n,Dbw_equals4],[Variable1,Variable2]]), + %trace, + remember_and_turn_off_debug(Debug), + + ((match4_new_22(Variable1,Variable2,Vars1,Vars2%,standard + ), + +%match4_2(Variable1,Variable2,Vars1,Vars2), + + + + %trace, +find_sys(Sys_name), + match4_new_22(Variable1,[Dbw_v,Sys_name],Vars2,Vars3%,standard + ), +%%writeln1( interpretpart(match4,Variable1,[v,sys1],Vars3,Vars2,_)), +%%interpretstatement1(ssi,Functions0,Functions,[[n,equals4],[Variable1,Variable3]],Vars3,Vars2,true,nocut), + getvalue([Dbw_v,Sys_name],Value3,Vars3), + + turn_back_debug(Debug)) + + + %%Value1A = Value2, + %%((val1emptyorvalsequal(Value1,Value1A), + %%putvalue(Variable1,Value1A,Vars1,Vars2)) + -> + debug_exit(Skip,[[Dbw_n,Dbw_equals4],[Value3,Value3]]) +; (turn_back_debug(Debug), +debug_fail(Skip,[[Dbw_n,Dbw_equals4],[Variable1,Variable2]]))),!. 
+ + +interpretpart(delete,Variable1,Variable2,Variable3,Vars1,Vars2) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("delete",Dbw_delete), + getvalues_equals4(Variable1,Variable2,Variable3,Value1,Value2,_Value3,Vars1), + debug_call(Skip,[[Dbw_n,Dbw_delete],[Value1,Value2,variable3]]), + ((delete(Value1,Value2,Value3A), + %val1emptyorvalsequal(Value3,Value3A), + putvalue_equals4(Variable3,Value3A,Vars1,Vars2))-> + debug_exit(Skip,[[Dbw_n,Dbw_delete],[Value1,Value2,Value3A]]) +; debug_fail(Skip,[[Dbw_n,Dbw_delete],[Value1,Value2,variable3]])),!. + +interpretpart(append,Variable1,Variable2,Variable3,Vars1,Vars2) :- +%trace, +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("append",Dbw_append), + %trace, + %getvalues(Variable1,Variable2,Variable3,Value1,Value2,Value3,Vars1), +%trace, + getvalue_equals4(Variable1,Value11,Vars1), + getvalue_equals4(Variable2,Value21,Vars1), + getvalue_equals4(Variable3,Value31,Vars1), + +append2(Dbw_n,Dbw_append,Variable1,Variable2,Variable3,Value11,Value21,Value31,Vars1,Vars2). + +append2(Dbw_n,Dbw_append,_Variable1,_Variable2,Variable3,Value11,Value21,Value31,Vars1,Vars2) :- +%writeln(1), +(contains_empty(Value31),not(contains_empty(Value11)),not(contains_empty(Value21))), +( +Value11=Value1,Value21=Value2, +debug_call(Skip,[[Dbw_n,Dbw_append],[Value1,Value2,variable3]]), + ((append1(Value1,Value2,Value3A), + %val1emptyorvalsequal(Value3,Value3A), + %trace, + putvalue_equals4(Variable3,Value3A,Vars1,Vars2),%)-> + debug_exit(Skip,[[Dbw_n,Dbw_append],[Value1,Value2,Value3A]]) +%; debug_fail(Skip,[[Dbw_n,Dbw_append],[Value1,Value2,variable3]]) +) +) +). 
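+
+% append2/10 dispatches on which of the three arguments still contain
+% empty variables (contains_empty/1), mirroring the instantiation modes of
+% append/3: with both inputs bound it computes the concatenation (mode
+% iio, above), and the remaining clauses cover the ooi, oii, ioi, iii,
+% oio, ioo and ooo modes by enumerating splits with append1/3.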
+ +append2(Dbw_n,Dbw_append,Variable1,Variable2,_Variable3,Value11,Value21,Value31,Vars1,Vars2) :- +%writeln(2), +%trace, + (contains_empty(Value11),contains_empty(Value21),not(contains_empty(Value31))), +( +Value31=Value3, +debug_call(Skip,[[Dbw_n,Dbw_append],[variable1,variable2,Value3]]), + ((append1(Value1A,Value2A,Value3), + %val1emptyorvalsequal(Value3,Value3A), + %trace, + putvalue_equals4(Variable1,Value1A,Vars1,Vars3), + putvalue_equals4(Variable2,Value2A,Vars3,Vars2),%)-> + debug_exit(Skip,[[Dbw_n,Dbw_append],[Value1A,Value2A,Value3]]) +%; debug_fail(Skip,[[Dbw_n,Dbw_append],[Value1A,Value2A,variable3]])) +%); +))). + +append2(Dbw_n,Dbw_append,Variable1,_Variable2,_Variable3,Value11,Value21,Value31,Vars1,Vars2) :- +%writeln(3), +(contains_empty(Value11),not(contains_empty(Value21)),not(contains_empty(Value31))), +( +Value21=Value2,Value31=Value3, +debug_call(Skip,[[Dbw_n,Dbw_append],[variable1,Value2,Value3]]), + ((append1(Value1A,Value2,Value3), + %val1emptyorvalsequal(Value3,Value3A), + %trace, + putvalue_equals4(Variable1,Value1A,Vars1,Vars2),%)-> + debug_exit(Skip,[[Dbw_n,Dbw_append],[Value1A,Value2,Value3]]) +%; debug_fail(Skip,[[Dbw_n,Dbw_append],[variable1,Value2,Value3]])) +%); +))). + +append2(Dbw_n,Dbw_append,_Variable1,Variable2,_Variable3,Value11,Value21,Value31,Vars1,Vars2) :- +%writeln(4), +(contains_empty(Value21),not(contains_empty(Value11)),not(contains_empty(Value31))), +( +Value11=Value1,Value31=Value3, +debug_call(Skip,[[Dbw_n,Dbw_append],[Value1,variable2,Value3]]), + ((append1(Value1,Value2A,Value3), + %val1emptyorvalsequal(Value3,Value3A), + %trace, + putvalue_equals4(Variable2,Value2A,Vars1,Vars2),%)-> + debug_exit(Skip,[[Dbw_n,Dbw_append],[Value1,Value2A,Value3]]) +%; debug_fail(Skip,[[Dbw_n,Dbw_append],[Value1,variable2,Value3]])) +))). 
+ +append2(Dbw_n,Dbw_append,_Variable1,_Variable2,_Variable3,Value11,Value21,Value31,Vars1,Vars2) :- +%writeln(5), +(not(contains_empty(Value21)),not(contains_empty(Value11)),not(contains_empty(Value31))), +( +Value11=Value1,Value21=Value2,Value31=Value3, +debug_call(Skip,[[Dbw_n,Dbw_append],[Value1,Value2,Value3]]), + ((append1(Value1,Value2,Value3), + %val1emptyorvalsequal(Value3,Value3A), + %trace, + Vars1=Vars2, + %putvalue_equals4(Variable2,Value2A,Vars1,Vars2),%)-> + debug_exit(Skip,[[Dbw_n,Dbw_append],[Value1,Value2,Value3]]) +%; debug_fail(Skip,[[Dbw_n,Dbw_append],[Value1,variable2,Value3]])) +))). + +%oio +append2(Dbw_n,Dbw_append,Variable1,_Variable2,Variable3,Value11,Value21,Value31,Vars1,Vars2) :- +%writeln(5), +%trace, +(contains_empty(Value11),not(contains_empty(Value21)),contains_empty(Value31)), +( +%trace, +%Value11=Value1, +Value21=Value2,%Value31=Value3, +debug_call(Skip,[[Dbw_n,Dbw_append],[_Value1,Value2,_Value3]]), +command_n_sols(N), +%trace, + ((findnsols(N,[Value1A1,Value3A1],(append1(Value1A,Value2,Value3A), + find_v_sys(V_sys1), + replace_in_term(Value1A,_%'$VAR'(_) + ,V_sys1,Value1A1), + find_v_sys(V_sys2), + replace_in_term(Value3A,_%'$VAR'(_) + ,V_sys2,Value3A1)) + ,ValueA),!, + %val1emptyorvalsequal(Value3,Value3A), + %trace, + %Vars1=Vars2, + member([Value1A,Value3A],ValueA), + putvalue_equals4(Variable1,Value1A,Vars1,Vars3),%)-> + putvalue_equals4(Variable3,Value3A,Vars3,Vars2),%)-> + + %putvalue_equals4(Variable2,Value2A,Vars1,Vars2),%)-> + debug_exit(Skip,[[Dbw_n,Dbw_append],[Value1A,Value2,Value3A]]) +%; debug_fail(Skip,[[Dbw_n,Dbw_append],[Value1,variable2,Value3]])) +))). 
+ +%ioo +append2(Dbw_n,Dbw_append,_Variable1,_Variable2,Variable3,Value11,Value21,Value31,Vars1,Vars2) :- +%writeln(5), +(not(contains_empty(Value11)),contains_empty(Value21),contains_empty(Value31)), +( +%trace, +%Value11=Value1, +Value11=Value1,%Value31=Value3, +Value21=Value2, +debug_call(Skip,[[Dbw_n,Dbw_append],[Value1,Value21,Value31]]), +%command_n_sols(N), + ((%findnsols(N,[Value1A,Value3A], + find_v_sys(V_sys), + append1(Value1,_Value2A,Value3A),%ValueA), + replace_in_term(Value3A,_%'$VAR'(_) + ,V_sys,Value3A1), + Value3A1=[Value3A2|Value3A3], + Value3A4=[Value3A2,"|",Value3A3], + %val1emptyorvalsequal(Value3,Value3A), + %trace, + %Vars1=Vars2, + %member([Value1A,Value3A],ValueA), + putvalue_equals4(Variable3,Value3A4,Vars1,Vars2),%)-> + + %putvalue_equals4(Variable2,Value2A,Vars1,Vars2),%)-> + debug_exit(Skip,[[Dbw_n,Dbw_append],[Value1,Value2,Value3A4]]) +%; debug_fail(Skip,[[Dbw_n,Dbw_append],[Value1,variable2,Value3]])) +))). + +%ooo +append2(Dbw_n,Dbw_append,Variable1,_Variable2,Variable3,Value11,Value21,Value31,Vars1,Vars2) :- +get_lang_word("v",Dbw_v), +%writeln(5), +(contains_empty(Value11),contains_empty(Value21),contains_empty(Value31)), +( +%trace, +%Value11=Value1, +Value11=Value1,Value31=Value3, +Value21=Value2, +debug_call(Skip,[[Dbw_n,Dbw_append],[Value1,Value2,Value3]]), +command_n_sols(N), +%N=3, +%find_v_sys(V_sys), +replace_in_term(Value2,[Dbw_v,_],%'$VAR'(_) + _,Value22), + ((findnsols(N,[Value1A1,Value3A1], +( + append1(Value1A,Value22,Value3A), + find_v_sys(V_sys1), + replace_in_term(Value1A,_%'$VAR'(_) + ,empty2,Value1A2), + %replace_in_term(Value2A,_%'$VAR'(_) + %,empty,Value2A1), + find_v_sys(V_sys2), + replace_in_term(Value3A,_%'$VAR'(_) + ,empty2,Value3A2), + convert_to_lp_pipe(Value1A2,Value1A3), + %convert_to_lp_pipe(Value2A2,Value2A1), + convert_to_lp_pipe(Value3A2,Value3A3), + + replace_in_term(Value1A3,empty2%'$VAR'(_) + ,V_sys1,Value1A1), + + replace_in_term(Value3A3,empty2%'$VAR'(_) + ,V_sys2,Value3A1) + + ),ValueA),!, + 
member([Value1a,Value3a],ValueA), + + %Value1A1=Value1a, + %Value3A1=Value3a, + %trace, + putvalue_equals4(Variable1,Value1a,Vars1,Vars3b),%)-> + %putvalue_equals4(Variable2,Value2a,Vars3b,Vars3),%)-> + %trace, + putvalue_equals4(Variable3,Value3a,Vars3b,Vars2), + + debug_exit(Skip,[[Dbw_n,Dbw_append],[Value1a,Value2,Value3a]])) +%; debug_fail(Skip,[[Dbw_n,Dbw_append],[Value1,variable2,Value3]])) +)). + + + + + + + +interpretpart(stringconcat,Variable1,Variable2,Variable3,Vars1,Vars2) :- +%trace, +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("stringconcat",Dbw_stringconcat), + %trace, + %getvalues(Variable1,Variable2,Variable3,Value1,Value2,Value3,Vars1), + getvalue_equals4(Variable1,Value11,Vars1), + getvalue_equals4(Variable2,Value21,Vars1), + getvalue_equals4(Variable3,Value31,Vars1), + +stringconcat2(Dbw_n,Dbw_stringconcat,Variable1,Variable2,Variable3,Value11,Value21,Value31,Vars1,Vars2). + +stringconcat2(Dbw_n,Dbw_stringconcat,_Variable1,_Variable2,Variable3,Value11,Value21,Value31,Vars1,Vars2) :- +%writeln(1), +(contains_empty(Value31),not(contains_empty(Value11)),not(contains_empty(Value21))), +( +Value11=Value1,Value21=Value2, +debug_call(Skip,[[Dbw_n,Dbw_stringconcat],[Value1,Value2,variable3]]), + ((string_concat(Value1,Value2,Value3A), + %val1emptyorvalsequal(Value3,Value3A), + %trace, + putvalue_equals4(Variable3,Value3A,Vars1,Vars2),%)-> + debug_exit(Skip,[[Dbw_n,Dbw_stringconcat],[Value1,Value2,Value3A]]) +%; debug_fail(Skip,[[Dbw_n,Dbw_stringconcat],[Value1,Value2,variable3]]) +) +) +). 
+ +stringconcat2(Dbw_n,Dbw_stringconcat,Variable1,Variable2,_Variable3,Value11,Value21,Value31,Vars1,Vars2) :- +%writeln(2), +%trace, + (contains_empty(Value11),contains_empty(Value21),not(contains_empty(Value31))), +( +Value31=Value3, +debug_call(Skip,[[Dbw_n,Dbw_stringconcat],[variable1,variable2,Value3]]), + ((string_concat(Value1A,Value2A,Value3), + %val1emptyorvalsequal(Value3,Value3A), + %trace, + putvalue_equals4(Variable1,Value1A,Vars1,Vars3), + putvalue_equals4(Variable2,Value2A,Vars3,Vars2),%)-> + debug_exit(Skip,[[Dbw_n,Dbw_stringconcat],[Value1A,Value2A,Value3]]) +%; debug_fail(Skip,[[Dbw_n,Dbw_stringconcat],[Value1A,Value2A,variable3]])) +%); +))). + +stringconcat2(Dbw_n,Dbw_stringconcat,Variable1,_Variable2,_Variable3,Value11,Value21,Value31,Vars1,Vars2) :- +%writeln(3), +(contains_empty(Value11),not(contains_empty(Value21)),not(contains_empty(Value31))), +( +Value21=Value2,Value31=Value3, +debug_call(Skip,[[Dbw_n,Dbw_stringconcat],[variable1,Value2,Value3]]), + ((string_concat(Value1A,Value2,Value3), + %val1emptyorvalsequal(Value3,Value3A), + %trace, + putvalue_equals4(Variable1,Value1A,Vars1,Vars2),%)-> + debug_exit(Skip,[[Dbw_n,Dbw_stringconcat],[Value1A,Value2,Value3]]) +%; debug_fail(Skip,[[Dbw_n,Dbw_stringconcat],[variable1,Value2,Value3]])) +%); +))). + +stringconcat2(Dbw_n,Dbw_stringconcat,_Variable1,Variable2,_Variable3,Value11,Value21,Value31,Vars1,Vars2) :- +%writeln(4), +(contains_empty(Value21),not(contains_empty(Value11)),not(contains_empty(Value31))), +( +Value11=Value1,Value31=Value3, +debug_call(Skip,[[Dbw_n,Dbw_stringconcat],[Value1,variable2,Value3]]), + ((string_concat(Value1,Value2A,Value3), + %val1emptyorvalsequal(Value3,Value3A), + %trace, + putvalue_equals4(Variable2,Value2A,Vars1,Vars2),%)-> + debug_exit(Skip,[[Dbw_n,Dbw_stringconcat],[Value1,Value2A,Value3]]) +%; debug_fail(Skip,[[Dbw_n,Dbw_stringconcat],[Value1,variable2,Value3]])) +))). 
+ +stringconcat2(Dbw_n,Dbw_stringconcat,_Variable1,_Variable2,_Variable3,Value11,Value21,Value31,Vars1,Vars2) :- +%writeln(5), +(not(contains_empty(Value21)),not(contains_empty(Value11)),not(contains_empty(Value31))), +( +Value11=Value1,Value21=Value2,Value31=Value3, +debug_call(Skip,[[Dbw_n,Dbw_stringconcat],[Value1,Value2,Value3]]), + ((string_concat(Value1,Value2,Value3), + %val1emptyorvalsequal(Value3,Value3A), + %trace, + Vars1=Vars2, + %putvalue_equals4(Variable2,Value2A,Vars1,Vars2),%)-> + debug_exit(Skip,[[Dbw_n,Dbw_stringconcat],[Value1,Value2,Value3]]) +%; debug_fail(Skip,[[Dbw_n,Dbw_stringconcat],[Value1,variable2,Value3]])) +))). + + + +interpretpart(date,Year,Month,Day,Hour,Minute,Seconds,Vars1,Vars2) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("date",Dbw_date), + + getvalues(Year,Month,Day,YearValueA,MonthValueA,DayValueA,Vars1), + getvalues(Hour,Minute,Seconds,HourValueA,MinuteValueA,SecondsValueA,Vars1), + debug_call(Skip,[[Dbw_n,Dbw_date],[[variable1,variable2,variable3,variable4,variable5,variable6]]]), + ((get_time(TS),stamp_date_time(TS,date(YearValueB,MonthValueB,DayValueB,HourValueB,MinuteValueB,SecondsValueB,_A,_TZ,_False),local), + val1emptyorvalsequal(YearValueA,YearValueB), + val1emptyorvalsequal(MonthValueA,MonthValueB), + val1emptyorvalsequal(DayValueA,DayValueB), + val1emptyorvalsequal(HourValueA,HourValueB), + val1emptyorvalsequal(MinuteValueA,MinuteValueB), + val1emptyorvalsequal(SecondsValueA,SecondsValueB), + putvalue(Year,YearValueB,Vars1,Vars3), + putvalue(Month,MonthValueB,Vars3,Vars4), + putvalue(Day,DayValueB,Vars4,Vars5), + putvalue(Hour,HourValueB,Vars5,Vars6), + putvalue(Minute,MinuteValueB,Vars6,Vars7), + putvalue(Seconds,SecondsValueB,Vars7,Vars2))-> + debug_exit(Skip,[[Dbw_n,Dbw_date],[YearValueB,MonthValueB,DayValueB,HourValueB,MinuteValueB,SecondsValueB]]) +; debug_fail(Skip,[[Dbw_n,Dbw_date],[variable1,variable2,variable3,variable4,variable5,variable6]])),!. 
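+
+% The clauses below wrap SWI-Prolog built-ins. For example (sketch),
+% random/1 yields a float in the open interval (0.0,1.0), so
+% [[n,random],[[v,r]]] binds [v,r] to a random float, and
+% [[n,length],[[1,2,3],[v,l]]] binds [v,l] to 3.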
+ +interpretpart(random,Variable1,Vars1,Vars2) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("random",Dbw_random), + getvalue(Variable1,Value1,Vars1), + debug_call(Skip,[[Dbw_n,Dbw_random],[variable]]), + ((random(Value1A), + val1emptyorvalsequal(Value1,Value1A), + putvalue(Variable1,Value1A,Vars1,Vars2))-> + debug_exit(Skip,[[Dbw_n,Dbw_random],[Value1A]]) +; debug_fail(Skip,[[Dbw_n,Dbw_random],[variable]])),!. + +interpretpart(length,Variable1,Variable2,Vars1,Vars2) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("length",Dbw_length), + getvalues_equals4(Variable1,Variable2,Value1,_Value2,Vars1), + debug_call(Skip,[[Dbw_n,Dbw_length],[Value1,variable]]), + ((islist(Value1), + length(Value1,Value2A), + %val1emptyorvalsequal(Value2,Value2A), + putvalue_equals4(Variable2,Value2A,Vars1,Vars2))-> + debug_exit(Skip,[[Dbw_n,Dbw_length],[Value1,Value2A]]) +; debug_fail(Skip,[[Dbw_n,Dbw_length],[Value1,variable]])),!. + +interpretpart(ceiling,Variable1,Variable2,Vars1,Vars2) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("ceiling",Dbw_ceiling), + getvalues(Variable1,Variable2,Value1,Value2,Vars1), + debug_call(Skip,[[Dbw_n,Dbw_ceiling],[Value1,variable]]), + ((isval(Value1), + ceiling(Value1,Value2A), + val1emptyorvalsequal(Value2,Value2A), + putvalue(Variable2,Value2A,Vars1,Vars2))-> + debug_exit(Skip,[[Dbw_n,Dbw_ceiling],[Value1,Value2A]]) +; debug_fail(Skip,[[Dbw_n,Dbw_ceiling],[Value1,variable]])),!. + +interpretpart(round,Variable1,Variable2,Vars1,Vars2) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("round",Dbw_round), + getvalues(Variable1,Variable2,Value1,Value2,Vars1), + debug_call(Skip,[[Dbw_n,Dbw_round],[Value1,variable2]]), + ((Value2A is round(Value1), + val1emptyorvalsequal(Value2,Value2A), + putvalue(Variable2,Value2A,Vars1,Vars2))-> + debug_exit(Skip,[[Dbw_n,Dbw_round],[Value1,Value2A]]) +; debug_fail(Skip,[[Dbw_n,Dbw_round],[Value1,variable2]])),!. 
+
+interpretpart(string_from_file,Variable1,Variable2,Vars1,Vars2) :-
+get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n,
+get_lang_word("string_from_file",Dbw_string_from_file),
+
+ getvalues(Variable1,Variable2,Value1,Value2,Vars1),
+
+ debug_call(Skip,[[Dbw_n,Dbw_string_from_file],[variable,Value2]]),
+ %%A=..[a,1]
+ ((phrase_from_file_s(string_g(String00a),Value2),
+ string_codes(Value1A,String00a),
+%%interpretstatement1(ssi,Functions0,Functions,[[Value1,Value2]],Vars1,Vars2,true,nocut),
+
+ val1emptyorvalsequal(Value1,Value1A),
+ % bind the file contents to Variable1 (the output argument), not Variable2
+ putvalue(Variable1,Value1A,Vars1,Vars2))->
+ debug_exit(Skip,[[Dbw_n,Dbw_string_from_file],[Value1A,Value2]])
+; debug_fail(Skip,[[Dbw_n,Dbw_string_from_file],[variable,Value2]])),!.
+
+interpretpart(word1,Variable1,Vars1) :-
+get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n,
+get_lang_word("word1",Dbw_word1),
+
+ getvalue(Variable1,Value1,Vars1),
+
+ debug_call(Skip,[[Dbw_n,Dbw_word1],[variable]]),
+ %%A=..[a,1]
+ ((phrase(word1(Value1),_))->
+ %phrase_from_file_s(string_g(String00a),Value2),
+ %string_codes(Value1A,String00a),
+%%interpretstatement1(ssi,Functions0,Functions,[[Value1,Value2]],Vars1,Vars2,true,nocut),
+
+ %val1emptyorvalsequal(Value1,Value1A),
+ %putvalue(Variable2,Value1A,Vars1,Vars2))->
+ debug_exit(Skip,[[Dbw_n,Dbw_word1],[Value1]])
+; debug_fail(Skip,[[Dbw_n,Dbw_word1],[variable]])),!.
+ + +interpretpart(maplist,Functions0,Functions,Variable1,Variable2,Variable3,Variable4,Vars1,Vars2) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("maplist",Dbw_maplist), + + getvalue(Variable1,Value1,Vars1), + getvalues_equals4(Variable2,Variable3,Variable4,Value2,Value3,_Value4,Vars1), + + debug_call(Skip,[[Dbw_n,Dbw_maplist],[Value1,Value2,Value3,variable]]), + %%A=..[a,1] + (( + map(Functions0,Functions,Value1,Value2,Value3,Value4A,Vars1), + +%%interpretstatement1(ssi,Functions0,Functions,[[Value1,Value2]],Vars1,Vars2,true,nocut), + + %val1emptyorvalsequal(Value4,Value4A), + putvalue_equals4(Variable4,Value4A,Vars1,Vars2))-> + debug_exit(Skip,[[Dbw_n,Dbw_maplist],[Value1,Value2,Value3,Value4A]]) +; debug_fail(Skip,[[Dbw_n,Dbw_maplist],[Value1,Value2,Value3,variable]])),!. + +interpretpart(string_length,Variable1,Variable2,Vars1,Vars2) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("string_length",Dbw_string_length), + getvalues(Variable1,Variable2,Value1,Value2,Vars1), + debug_call(Skip,[[Dbw_n,Dbw_string_length],[Value1,variable]]), + ((string(Value1), + string_length(Value1,Value2A), + val1emptyorvalsequal(Value2,Value2A), + putvalue(Variable2,Value2A,Vars1,Vars2))-> + debug_exit(Skip,[[Dbw_n,Dbw_string_length],[Value1,Value2A]]) +; debug_fail(Skip,[[Dbw_n,Dbw_string_length],[Value1,variable]])),!. + +interpretpart(sort,Variable1,Variable2,Vars1,Vars2) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("sort",Dbw_sort), + getvalues_equals4(Variable1,Variable2,Value1,_Value2,Vars1), + debug_call(Skip,[[Dbw_n,Dbw_sort],[Value1,variable]]), + ((is_list(Value1), + sort(Value1,Value2A), + %val1emptyorvalsequal(Value2,Value2A), + putvalue_equals4(Variable2,Value2A,Vars1,Vars2))-> + debug_exit(Skip,[[Dbw_n,Dbw_sort],[Value1,Value2A]]) +; debug_fail(Skip,[[Dbw_n,Dbw_sort],[Value1,variable]])),!. 
+
+interpretpart(intersection,Variable1,Variable2,Variable3,Vars1,Vars2) :-
+get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n,
+get_lang_word("intersection",Dbw_intersection),
+ getvalues_equals4(Variable1,Variable2,Variable3,Value1,Value2,_Value3,Vars1),
+ debug_call(Skip,[[Dbw_n,Dbw_intersection],[Value1,Value2,variable]]),
+ ((is_list(Value1),is_list(Value2),
+ intersection(Value1,Value2,Value3A),
+ %val1emptyorvalsequal(Value3,Value3A),
+ putvalue_equals4(Variable3,Value3A,Vars1,Vars2))->
+ debug_exit(Skip,[[Dbw_n,Dbw_intersection],[Value1,Value2,Value3A]])
+; debug_fail(Skip,[[Dbw_n,Dbw_intersection],[Value1,Value2,variable]])),!.
+
+
+interpretpart(read_string,Variable1,Vars1,Vars2) :-
+get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n,
+get_lang_word("read_string",Dbw_read_string),
+ getvalue(Variable1,Value1,Vars1),
+ debug_call(Skip,[[Dbw_n,Dbw_read_string],[variable]]),
+ ((read_string(user_input, "\n", "\r", _End1, Value1A),
+ val1emptyorvalsequal(Value1,Value1A),
+ putvalue(Variable1,Value1A,Vars1,Vars2))->
+ debug_exit(Skip,[[Dbw_n,Dbw_read_string],[Value1A]])
+; debug_fail(Skip,[[Dbw_n,Dbw_read_string],[variable]])),!.
+
+
+interpretpart(text_area,Variable1,Variable2,Variable3,Vars1,Vars2) :-
+get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n,
+get_lang_word("text_area",Dbw_text_area),
+ getvalue(Variable1,Value1,Vars1),
+ getvalue(Variable2,Value2,Vars1),
+ getvalue(Variable3,Value3,Vars1),
+ debug_call(Skip,[[Dbw_n,Dbw_text_area],[Value1,Value2,variable]]),
+ ((writeln(Value2),
+ read_string(user_input, "\n", "\r", _End1, Value3A),
+ val1emptyorvalsequal(Value3,Value3A),
+ putvalue(Variable3,Value3A,Vars1,Vars2))->
+ % report the same argument pattern as the debug_call above
+ debug_exit(Skip,[[Dbw_n,Dbw_text_area],[Value1,Value2,Value3A]])
+; debug_fail(Skip,[[Dbw_n,Dbw_text_area],[Value1,Value2,variable]])),!.
+ +interpretpart(writeln,Variable1,Vars1,Vars1) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("writeln",Dbw_writeln), + %getvalue(Variable1,Value1,Vars1), +get_lang_word("v",Dbw_v), + + +debug_call(Skip,[[Dbw_n,Dbw_writeln],[variable]]), + ((%writeln(Value1) + + remember_and_turn_off_debug(Debug), + %%trace, +find_sys(Sys_name), + interpretpart(match4,Variable1,[Dbw_v,Sys_name],Vars1,Vars3,_), +%%writeln1( interpretpart(match4,Variable1,[v,sys1],Vars3,Vars2,_)), +%%interpretstatement1(ssi,Functions0,Functions,[[n,equals4],[Variable1,Variable3]],Vars3,Vars2,true,nocut), + getvalue([Dbw_v,Sys_name],Value3,Vars3), + + turn_back_debug(Debug), + + writeln0(Value3) + + %val1emptyorvalsequal(Value1,Value1A), + %putvalue(Variable1,Value1A,Vars1,Vars2) + )-> + debug_exit(Skip,[[Dbw_n,Dbw_writeln],[Value3]]) +; debug_fail(Skip,[[Dbw_n,Dbw_writeln],[variable]])),!. + + +interpretpart(atom_string,Variable1,Variable2,Vars1,Vars2) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("atom_string",Dbw_atom_string), + getvalues(Variable1,Variable2,Value1,Value2,Vars1), + (contains_empty(Value1)-> + (debug_call(Skip,[[Dbw_n,Dbw_atom_string],[variable,Value2]]), + ((string(Value2), + atom_string(Value1A,Value2), + val1emptyorvalsequal(Value1,Value1A), + putvalue(Variable1,Value1A,Vars1,Vars2))-> + debug_exit(Skip,[[Dbw_n,Dbw_atom_string],[Value1A,Value2]]) +; debug_fail(Skip,[[Dbw_n,Dbw_atom_string],[variable,Value2]]))); + + (debug_call(Skip,[[Dbw_n,Dbw_atom_string],[Value1,variable]]), + ((atom(Value1), + atom_string(Value1,Value2A), + val1emptyorvalsequal(Value2,Value2A), + putvalue(Variable2,Value2A,Vars1,Vars2))-> + debug_exit(Skip,[[Dbw_n,Dbw_atom_string],[Value1,Value2A]]) +; debug_fail(Skip,[[Dbw_n,Dbw_atom_string],[Value1,variable]])))) +,!. 
+ +interpretpart(get_lang_word,Variable1,Variable2,Vars1,Vars2) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("get_lang_word",Dbw_get_lang_word), + getvalues(Variable1,Variable2,Value1,Value2,Vars1), + debug_call(Skip,[[Dbw_n,Dbw_get_lang_word],[Value1,variable]]), + ((%is_list(Value1), + get_lang_word(Value1,Value2A1), + Value2A=Value2A1, + %string_atom(Value2A,Value2A1), % *** LPI only takes strings + %sort(Value1,Value2A), + val1emptyorvalsequal(Value2,Value2A), + putvalue(Variable2,Value2A,Vars1,Vars2))-> + debug_exit(Skip,[[Dbw_n,Dbw_get_lang_word],[Value1,Value2A]]) +; debug_fail(Skip,[[Dbw_n,Dbw_get_lang_word],[Value1,variable]])),!. + + + +interpretpart(shell,Variable1,Vars1,Vars1) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("shell",Dbw_shell), + getvalue(Variable1,Value1,Vars1), + debug_call(Skip,[[Dbw_n,Dbw_shell],[Value1]]), + ((%is_list(Value1), + shell1_s(Value1) + %Value2A=Value2A1, + %string_atom(Value2A,Value2A1), % *** LPI only takes strings + %sort(Value1,Value2A), + %val1emptyorvalsequal(Value2,Value2A), + %putvalue(Variable2,Value2A,Vars1,Vars2) + )-> + debug_exit(Skip,[[Dbw_n,Dbw_shell],[Value1]]) +; debug_fail(Skip,[[Dbw_n,Dbw_shell],[Value1]])),!. 
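+
+% date_time_stamp below wraps SWI-Prolog's date_time_stamp/2, which converts
+% a date/9 term to a POSIX timestamp (a float of seconds since the epoch).
+% Sketch: [[n,date_time_stamp],[2020,1,1,0,0,0.0,0,[v,s]]] binds [v,s] to
+% the stamp for 2020-01-01 00:00:00 with UTC offset 0 seconds.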
+ + + +interpretpart(date_time_stamp,Y,M,D,Ho,Mi,Se,Se2,Variable,Vars1,Vars2) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("date_time_stamp",Dbw_date_time_stamp), + + getvalues(Y,M,D,YearValueA,MonthValueA,DayValueA,Vars1), + getvalues(Ho,Mi,Se,HourValueA,MinuteValueA,SecondsValueA,Vars1), + getvalues(Se2,Variable,Se2ValueA,ValueA,Vars1), + + %getvalue(Variable1,Value1,Vars1), + debug_call(Skip,[[Dbw_n,Dbw_date_time_stamp],[YearValueA,MonthValueA,DayValueA,HourValueA,MinuteValueA,SecondsValueA,Se2ValueA,variable]]), + ((%is_list(Value1), + date_time_stamp(date(YearValueA,MonthValueA,DayValueA,HourValueA,MinuteValueA,SecondsValueA,Se2ValueA,-,-),Value1A), + %Value2A=Value2A1, + %string_atom(Value2A,Value2A1), % *** LPI only takes strings + %sort(Value1,Value2A), + val1emptyorvalsequal(ValueA,Value1A), + putvalue(Variable,Value1A,Vars1,Vars2) + )-> + debug_exit(Skip,[[Dbw_n,Dbw_date_time_stamp],[YearValueA,MonthValueA,DayValueA,HourValueA,MinuteValueA,SecondsValueA,Se2ValueA,Value1A]]) +; debug_fail(Skip,[[Dbw_n,Dbw_date_time_stamp],[YearValueA,MonthValueA,DayValueA,HourValueA,MinuteValueA,SecondsValueA,Se2ValueA,variable]])),!. + +interpretpart(phrase_from_file,Variable1,In,Vars1,Vars2) :- +%trace, +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("phrase_from_file",Dbw_phrase_from_file), + getvalue(Variable1,Value1,Vars1), + getvalue(In,Value2,Vars1), + %trace, + +%trace, + debug_call(Skip,[[Dbw_n,Dbw_phrase_from_file],[variable,Value2]]), + ((%is_list(Value1), + phrase_from_file(string(Out),Value2), + %shell1_s(Value1) + %Value2A=Value2A1, + %string_atom(Value2A,Value2A1), % *** LPI only takes strings + %sort(Value1,Value2A), + val1emptyorvalsequal(Value1,Out), + putvalue(Variable1,Out,Vars1,Vars2) + )-> + debug_exit(Skip,[[Dbw_n,Dbw_phrase_from_file],[Out,Value2]]) +; debug_fail(Skip,[[Dbw_n,Dbw_phrase_from_file],[variable,Value2]])),!. 
+ +/* +interpretpart(term_to_atom,Variable1,Variable2,Vars1,Vars2) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("term_to_atom",Dbw_term_to_atom), + getvalues(Variable1,Variable2,Value1,Value2,Vars1), + +((contains_empty(Value1),not(contains_empty(Value2)))-> + + (debug_call(Skip,[[Dbw_n,Dbw_term_to_atom],[variable,Value2]]), + ((%is_list(Value1), + term_to_atom(Value1A,Value2), + %string_atom(Value2A,Value2A1), % *** LPI only takes strings + %sort(Value1,Value2A), + val1emptyorvalsequal(Value1,Value1A), + putvalue(Variable1,Value1A,Vars1,Vars2))-> + debug_exit(Skip,[[Dbw_n,Dbw_term_to_atom],[Value1A,Value2]]) +; debug_fail(Skip,[[Dbw_n,Dbw_term_to_atom],[variable,Value2]]))); + +( (debug_call(Skip,[[Dbw_n,Dbw_term_to_atom],[Value1,variable]]), + ((%is_list(Value1), + term_to_atom(Value1,Value2A), + %string_atom(Value2A,Value2A1), % *** LPI only takes strings + %sort(Value1,Value2A), + val1emptyorvalsequal(Value2,Value2A), + putvalue(Variable2,Value2A,Vars1,Vars2))-> + debug_exit(Skip,[[Dbw_n,Dbw_term_to_atom],[Value1,Value2A]]) +; debug_fail(Skip,[[Dbw_n,Dbw_term_to_atom],[Value1,variable]]))) +)) +,!. 
+ + +interpretpart(split_on_substring117a,Variable1,Variable2,Variable3,Variable4,Vars1,Vars2) :- +%trace, +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("split_on_substring117a",Dbw_split_on_substring117a), + getvalue(Variable1,Value1,Vars1), + getvalue(Variable2,Value2,Vars1), + getvalue(Variable3,Value3,Vars1), + getvalue(Variable4,Value4,Vars1), + %trace, + +%trace, + debug_call(Skip,[[Dbw_n,Dbw_split_on_substring117a],[Value1,Value2,Value3,variable]]), + ((%is_list(Value1), + split_on_substring117a(Value1,Value2,Value3,Value41), + %shell1_s(Value1) + %Value2A=Value2A1, + %string_atom(Value2A,Value2A1), % *** LPI only takes strings + %sort(Value1,Value2A), + val1emptyorvalsequal(Value4,Value41), + putvalue(Variable4,Value41,Vars1,Vars2) + )-> + debug_exit(Skip,[[Dbw_n,Dbw_split_on_substring117a],[Value1,Value2,Value3,Value41]]) +; debug_fail(Skip,[[Dbw_n,Dbw_split_on_substring117a],[Value1,Value2,Value3,variable]])),!. + + +interpretpart(string_strings,Variable1,Variable2,Vars1,Vars2) :- +%trace, +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("string_strings",Dbw_string_strings), + getvalue(Variable1,Value1,Vars1), + getvalue(Variable2,Value2,Vars1), + %trace, + +%trace, + debug_call(Skip,[[Dbw_n,Dbw_string_strings],[Value1,variable]]), + ((%is_list(Value1), + string_strings(Value1,Value21), + %shell1_s(Value1) + %Value2A=Value2A1, + %string_atom(Value2A,Value2A1), % *** LPI only takes strings + %sort(Value1,Value2A), + val1emptyorvalsequal(Value2,Value21), + putvalue(Variable2,Value21,Vars1,Vars2) + )-> + debug_exit(Skip,[[Dbw_n,Dbw_string_strings],[Value1,Value21]]) +; debug_fail(Skip,[[Dbw_n,Dbw_string_strings],[Value1,variable]])),!. 
+*/ + +interpretpart(shell_pl,I0,QP0,QV0,P0,OVar0,Vars1,Vars2) :- +%trace, +% eg [I,QP,QV,P,OVar]=[[1,1],"a","A,A1","B is A+A1,B1 is A1-A1,write([B,B1]).",[v,o]] + find_v_sys(V_sys), + remember_and_turn_off_debug(Debug), + + interpretpart(match4,I0,V_sys,Vars1,Vars3,_), + + turn_back_debug(Debug), + + getvalue(V_sys,I1,Vars3), + term_to_atom(I1,I), + + %getvalue(I0,I,Vars1), + getvalue(QP0,QP,Vars1), + getvalue(QV0,QV,Vars1), + getvalue(P0,P,Vars1), + getvalue(OVar0,OVar,Vars1), + + +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("shell_pl",Dbw_shell_pl1),Dbw_shell_pl1=Dbw_shell_pl, + +debug_call(Skip,[[Dbw_n,Dbw_shell_pl],[I,QP,QV,P,variable]]), + +((QV="",I1=[])->Mid= [ + QP,",\n\t", + "halt.\n\n","main :- halt(1).\n", + QP, " :-","\n\t"]; + + Mid=[ "[",QV,"]=",I,",","\n\t", + QP,"(",QV,"),","\n\t", + + + "halt.\n\n","main :- halt(1).\n", + QP,"(",QV,") :-","\n\t"]), + +flatten(["#!/usr/bin/swipl -f -q\n\n",%":-include('",Go_path5,File,"').\n", +":- initialization(catch(main, Err, handle_error(Err))).\n\nhandle_error(_Err):-\n halt(1).\n\n","main :-\n\t", + + %"opt_arguments([], _, Args),","\n\t", +Mid,P],String1), + +foldr(string_concat,String1,String), + +foldr(string_concat,[%"../private2/luciancicd-testing/",Repository1b,"/",Go_path5, +"main.pl"],GP), +open_s(GP,write,S1), +write(S1,String),close(S1), + +findall([" ",I1],member(I1,I),I2),flatten(I2,I3),foldr(string_concat,I3,I4), +foldr(string_concat,["chmod +x ",GP],S31),%, +%trace, +catch(bash_command(S31,_), _, (foldr(string_concat,["Warning."%%"Error: Can't clone ",User3,"/",Repository3," repository on GitHub." + ],_Text41),%writeln1(Text4), + fail%abort + )), + foldr(string_concat,["swipl -f -q ./",GP,%" ", +I4],S3),%, +%trace, +((catch(bash_command(S3,VO), _, (foldr(string_concat,["Warning."%%"Error: Can't clone ",User3,"/",Repository3," repository on GitHub." 
+ ],_Text4),%writeln1(Text4), + fail%abort + )), term_to_atom(OVar1,VO))-> + ( val1emptyorvalsequal(OVar,OVar1), + + + remember_and_turn_off_debug(Debug1), + +find_sys(Sys_name), interpretpart(match4,OVar0,[Dbw_v,Sys_name],Vars1,Vars2fa,_), + getvalue([Dbw_v,Sys_name],Value3,Vars2fa), + interpretpart(match4,OVar1,Value3,[]%Vars1 + ,Vars2,_), + turn_back_debug(Debug1), + + %putvalue(OVar0,OVar1,Vars1,Vars2), + + +debug_exit(Skip,[[Dbw_n,Dbw_shell_pl],[I,QP,QV,P,OVar1]])); +(debug_fail(Skip,[[Dbw_n,Dbw_shell_pl],[I,QP,QV,P,variable]]),fail)),!. + +interpretpart(shell_c,I0,P0,OVar0,Vars1,Vars2) :- +%trace, +% eg [I,QP,QV,P,OVar]=[[1,1],"a","A,A1","B is A+A1,B1 is A1-A1,write([B,B1]).",[v,o]] + %find_v_sys(V_sys), + %interpretpart(match4,I0,V_sys,Vars1,Vars3,_), + getvalue(I0,I1,Vars1), + term_to_atom(I1,I), + +open_s("input.txt",write,S1T), +write(S1T,I),close(S1T), + + + %getvalue(I0,I,Vars1), + %getvalue(QP0,QP,Vars1), + %getvalue(QV0,QV,Vars1), + %trace, + getvalue(P0,P1,Vars1), + %trace, +%atomic_list_concat(P1,'\n',P2), +%atomic_list_concat(P1,'\\n',P3), +%atomic_list_concat(P4,'\"',P3), +%atomic_list_concat(P4,'"',P), +atom_string(P,P1), +open_s("program.txt",write,S1C), +write(S1C,P),close(S1C), + +foldr(string_concat,["cp program.txt program.c"],S32),%, +%trace, +catch(bash_command(S32,_), _, (foldr(string_concat,["Warning."%%"Error: Can't clone ",User3,"/",Repository3," repository on GitHub." + ],_Text42),%writeln1(Text4), + fail%abort + )), + + + getvalue(OVar0,OVar,Vars1), + + +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("shell_c",Dbw_shell_c1),Dbw_shell_c1=Dbw_shell_c, + +debug_call(Skip,[[Dbw_n,Dbw_shell_c],[I,P,variable]]), + + +foldr(string_concat,["gcc program.c"],S31),%, +%trace, +catch(bash_command(S31,_), _, (foldr(string_concat,["Warning."%%"Error: Can't clone ",User3,"/",Repository3," repository on GitHub." 
+ ],_Text43),%writeln1(Text4),
+ fail%abort
+ )),
+ % assumed invocation: run the compiled program with input.txt on stdin and
+ % parse its printed output (cf. the shell_pl clause above)
+ foldr(string_concat,["./a.out < input.txt"],S3),
+((catch(bash_command(S3,VO), _, (foldr(string_concat,["Warning."%%"Error: Can't clone ",User3,"/",Repository3," repository on GitHub."
+ ],_Text44),%writeln1(Text4),
+ fail%abort
+ )), term_to_atom(OVar1,VO))->
+ ( val1emptyorvalsequal(OVar,OVar1),
+ putvalue(OVar0,OVar1,Vars1,Vars2),
+debug_exit(Skip,[[Dbw_n,Dbw_shell_c],[I,P,OVar1]]));
+(debug_fail(Skip,[[Dbw_n,Dbw_shell_c],[I,P,variable]]),fail)),!.
+
+/*
+ bash_command1(Command, Output) :-
+ process_create(path(bash),
+ ['-c', Command],
+ [stdout(pipe(Out))]),
+ read_string(Out, _, Output),
+ close(Out).
+*/
+
+interpretpart(command,Command1,Args,Variables,Vars1,Vars2) :-
+%trace,
+get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n,
+get_lang_word("v",Dbw_v1),Dbw_v1=Dbw_v,
+get_lang_word(Command1,Dbw_command),
+%trace,
+
+findall(Value,(member(Variable,Variables),
+getvalue(Variable,Value,Vars1)),Values),
+
+/*
+(length(Variables,0)->
+(Variables=[],
+ Values=[]);
+(length(Variables,1)->
+(Variables=[Variable1],
+ getvalue(Variable1,Value1,Vars1),Values=[Value1]);
+(length(Variables,2)->
+(Variables=[Variable1,Variable2],
+ getvalues(Variable1,Variable2,Value1,Value2,Vars1),
+ Values=[Value1,Value2]);
+(length(Variables,3)->
+(Variables=[Variable1,Variable2,Variable3],
+ getvalue(Variable1,Variable2,Variable3,Value1,Value2,Value3,Vars1),
+ Values=[Value1,Value2,Value3]))))),
+*/
+
+length(Variables,VL),
+length(Args,VL),
+numbers(VL,1,[],VLN),
+
+% check modes of arguments
+forall(member(VLN1,VLN),(get_item_n(Args,VLN1,Arg),
+get_item_n(Values,VLN1,Val),
+(Arg=o->contains_var([Dbw_v,_],Val);
+not(contains_var([Dbw_v,_],Val))))),
+
+findall(Debug_variable,(member(VLN1,VLN),
+ get_item_n(Args,VLN1,Arg),get_item_n(Values,VLN1,Value),
+ (Arg=i->Debug_variable=Value;Debug_variable=variable)),
+ Debug_variables),
+
+ length(Command_vars,VL),%=[A,B,C],
+
+findall(Command_variable,(member(VLN1,VLN),
+ get_item_n(Args,VLN1,Arg),get_item_n(Values,VLN1,Value),
+ get_item_n(Command_vars,VLN1,Command_vars_val),
+ (Arg=i->Command_variable=Value;
+ Command_variable=Command_vars_val)),
+ Command_variables),
+
+
debug_call(Skip,[[Dbw_n,Dbw_command],Debug_variables]), + ((%is_list(Value1), + string_atom(Command1,Command1_atom), + %trace, + functor(Command2,Command1_atom,VL), + (length(Command_variables,0)->Command3=Command2; + (arg2(VLN,Command2,Command_variables),Command2=Command3)), + Command3, + %string_atom(Value2A,Value2A1), % *** LPI only takes strings + %sort(Value1,Value2A), + +forall(member(VLN1,VLN),(get_item_n(Args,VLN1,Arg), +get_item_n(Command_vars,VLN1,Command_vars_n), +get_item_n(Values,VLN1,Value), +(Arg=o->val1emptyorvalsequal(Value,Command_vars_n);true))), + +putvalues2(Args,Variables,Command_variables,Vars1,Vars2) + %putvalue(Variable2,Value2A,Vars1,Vars2) + ),%-> + debug_exit(Skip,[[Dbw_n,Dbw_command],Command_variables]) + ). +%; debug_fail(Skip,[[Dbw_n,Dbw_command],Debug_variables])). + +arg2([],_Command,_Args) :- !. +arg2([Arg_n|Arg_n2],Command,[Arg|Args]) :- + arg(Arg_n,Command,Arg), + arg2(Arg_n2,Command,Args). + +putvalues2([],_Variables,_Command_vars,Vars,Vars) :- !. +putvalues2([Arg|Args],[Variable|Variables],[Command_var|Command_vars],Vars1,Vars2) :- + (Arg=o->match4_new_22(Variable,Command_var,Vars1,Vars3); + Vars1=Vars3), + putvalues2(Args,Variables,Command_vars,Vars3,Vars2). 
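+
+% Note: interpretpart(command,...) above is a generic bridge to SWI-Prolog
+% built-ins. Args is a list of modes (i for input, o for output): input
+% positions are filled with interpreter values, output positions with fresh
+% Prolog variables; the goal is built with functor/3 and arg/3 (via arg2/3),
+% called, and any output bindings are copied back by putvalues2/5.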
+ + +/** +A,B,x* +A,x,B +x,A,B +A,x,y +x,A,y +x,y,A +x,y,z +A,B,C +**/ + +/** +interpretpart(stringconcat1,Terminal,Phrase2,Phrase1,Vars1,Vars2) :- +%trace, +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("stringconcat1",Dbw_stringconcat), +isvar(Terminal), +isvar(Phrase2), + getvalues(Terminal,Phrase2,Phrase1,Value1,Value2,Value3,Vars1), + +debug_call(Skip,[[Dbw_n,Dbw_stringconcat],[variable1,variable2,Value2]]), + ((string(Value3), + string_concat(Value1A,Value2A,Value3), + val1emptyorvalsequal(Value1,Value1A), + val1emptyorvalsequal(Value2,Value2A), + putvalue(Terminal,Value1A,Vars1,Vars3), + putvalue(Phrase2,Value2A,Vars3,Vars2)), + debug_exit(Skip,[[Dbw_n,Dbw_stringconcat],[Value1A,Value2A,Value3]]) +%; debug_fail(Skip,[[Dbw_n,Dbw_stringconcat],[variable1,variable2,Value3]]) +).%,!. +**/ + +/* +interpretpart(stringconcat,Terminal,Phrase2,Phrase1,Vars1,Vars2) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("stringconcat",Dbw_stringconcat), + %%Variables1=[Terminal,Phrase1,Phrase2], %% terminal can be v or "a" + ((getvalues2([Terminal,Phrase1,Phrase2], + [],[TerminalValue1,Phrase1Value1,Phrase2Value1],Vars1,[],[Flag1,Flag2,_Flag3]), %% prolog vars, list of vars, [v]=[prolog var] + %%delete(Value1,Value2,Value3A), + (Terminal=[_Value]->TerminalValue2=[TerminalValue1];TerminalValue2=TerminalValue1), + +(Terminal=""->(TerminalValue2="", + +string_concat(TerminalValue2,Phrase2Value1,Phrase1Value1))->true; + ((var(TerminalValue2)->(string_concat(TerminalValue2,Phrase2Value1,Phrase1Value1)),string_length(TerminalValue2,1) + );string_concat(TerminalValue2,Phrase2Value1,Phrase1Value1))), + + putvalue(Terminal,TerminalValue2,Vars1,Vars3), + putvalue(Phrase2,Phrase2Value1,Vars3,Vars4), + putvalue(Phrase1,Phrase1Value1,Vars4,Vars2), + (Flag1=true->TerminalValue3=variable1;TerminalValue3=TerminalValue1), + (Flag2=true->Phrase1Value3=variable2;Phrase1Value3=Phrase1Value1))-> + 
+(debug_call(Skip,[[Dbw_n,Dbw_stringconcat],[TerminalValue3,Phrase1Value3,Phrase2]]), + +debug_exit(Skip,[[Dbw_n,Dbw_stringconcat],[TerminalValue1,Phrase1Value1,Phrase2Value1]]) + ); + + (debug_call(Skip,[[Dbw_n,Dbw_stringconcat],[variable1,variable2,variable3]]), + debug_fail(Skip,[[Dbw_n,Dbw_stringconcat],[variable1,variable2,variable3]]) + )).%!. + + */ + +interpretpart(grammar_part,Variables1,Vars1,Vars2) :- +get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, +get_lang_word("grammar_part",Dbw_grammar_part), + + Variables1=[Terminal,Phrase1,Phrase2], %% terminal can be v or "a" + %%terminal(Terminal), + %trace, +grammar_part2(Dbw_n,Dbw_grammar_part,Terminal,Phrase1,Phrase2,Vars1,Vars2). + + +grammar_part2(_Dbw_n,_Dbw_grammar_part,Terminal,Phrase1,Phrase2,Vars1,Vars2) :- +%trace, + + getvalues2([Terminal,Phrase1,Phrase2], + [],[TerminalValue1,Phrase1Value1,Phrase2Value1],Vars1,[],[_Flag1,_Flag2,_Flag3]), + +(%(not(contains_var1(Terminal)), +is_list(TerminalValue1)->true; +(%(not(contains_var1(Phrase1)), +is_list(Phrase1Value1)->true; +(%(not(contains_var1(Phrase2)), +is_list(Phrase2Value1)))), + +%getvalues(Terminal,Phrase1,Phrase2,TerminalValue1,Phrase1Value1,Phrase2Value1,Vars1), + +interpretpart(append,Terminal,Phrase2,Phrase1,Vars1,Vars2). 
+ + +grammar_part2(Dbw_n,_Dbw_grammar_part,Terminal,Phrase1,Phrase2,Vars1,Vars2) :- + + getvalues2([Terminal,Phrase1,Phrase2], + [],[TerminalValue1,Phrase1Value1,Phrase2Value1],Vars1,[],[Flag1,Flag2,_Flag3]), %% prolog vars, list of vars, [v]=[prolog var] + %%delete(Value1,Value2,Value3A), + +(string(TerminalValue1)->true; +(string(Phrase1Value1)->true; +(string(Phrase2Value1)))), + + (( (Terminal=[_Value]->TerminalValue2=[TerminalValue1];TerminalValue2=TerminalValue1), + + + +((string(Phrase1Value1)->Phrase1Value1=Phrase1Value11;(number(Phrase1Value1)->number_string(Phrase1Value1,Phrase1Value11);Phrase1Value1=Phrase1Value11)), + +(Terminal=""->TerminalValue2="";true), + +(((var(TerminalValue2)->(string_concat(TerminalValue2,Phrase2Value1,Phrase1Value11)),string_length(TerminalValue2,1));string_concat(TerminalValue2,Phrase2Value1,Phrase1Value11))->true; + +string_concat(TerminalValue2,Phrase2Value1,Phrase1Value11))->true; + + + +((Phrase1Value1=[_ItemA|_ItemsA]),(Terminal=[]->(TerminalValue2=[], + +((var(TerminalValue2)->length(TerminalValue2,1);true),(append(TerminalValue2,Phrase2Value1,Phrase1Value1))))->true; + +(append(TerminalValue2,Phrase2Value1,Phrase1Value1)->true)))), + + putvalue(Terminal,TerminalValue2,Vars1,Vars3), + putvalue(Phrase2,Phrase2Value1,Vars3,Vars4), + putvalue(Phrase1,Phrase1Value1,Vars4,Vars2), + (Flag1=true->TerminalValue3=variable1;TerminalValue3=TerminalValue1), + (Flag2=true->Phrase1Value3=variable2;Phrase1Value3=Phrase1Value1))-> + (debug_call(Skip,[[Dbw_n,grammar_part],[TerminalValue3,Phrase1Value3,Phrase2]]), + debug_exit(Skip,[[Dbw_n,grammar_part],[TerminalValue1,Phrase1Value1,Phrase2Value1]])); + +% CAW requires input,input,output with "a","ab",[v,a] where [v,a]="b" + (debug_call(Skip,[[Dbw_n,grammar_part],[Terminal,Phrase1,Phrase2]]), + (debug_fail(Skip,[[Dbw_n,grammar_part],[Terminal,Phrase1,Phrase2]])))),!. 
+ + + + +getvalues(Variable1,Variable2,Value1,Value2,Vars) :- + getvalue(Variable1,Value1,Vars), + getvalue(Variable2,Value2,Vars). +getvalues(Variable1,Variable2,Variable3,Value1,Value2,Value3,Vars) :- + getvalue(Variable1,Value1,Vars), + getvalue(Variable2,Value2,Vars), + getvalue(Variable3,Value3,Vars). + /** +getvalues2(Variable1,Variable2,Value1,Value2,Vars) :- + getvalue2(Variable1,Value1,Vars), + getvalue2(Variable2,Value2,Vars). +getvalues2(Variable1,Variable2,Variable3,Value1,Value2,Value3,Vars) :- + getvalue2(Variable1,Value1,Vars), + getvalue2(Variable2,Value2,Vars), + getvalue2(Variable3,Value3,Vars). +**/ +%val1emptyorvalsequal([],_Value) :- !. +val1emptyorvalsequal(Empty,_Value) :- is_empty(Empty),!. +val1emptyorvalsequal(Value,Value) :- + not(is_empty(Value)). +val1emptyorvalsequal([Value1|Value1a],[Value2|Value2a]) :- + val1emptyorvalsequal(Value1,Value2), + val1emptyorvalsequal(Value1a,Value2a),!. +isop(Is):-get_lang_word("is",Is1),Is1=Is,!. +isop(=). +stringconcat1([],Item,Item) :- + !. +stringconcat1(Item11,Item21,Item31) :- + +replace_empty_with_empty_set( [Item11,Item21,Item31],[],[Item1,Item2,Item3]), +maplist(expression,[Item1,Item2,Item3]), + string_concat(Item1,Item2,Item3),!. + +%append1([],Item,Item). +append1(Item11,Item21,Item31) :- + +replace_empty_with_empty_set( [Item11,Item21,Item31],[],[Item1,Item2,Item3]), +%%maplist(expression,[Item1,Item2,Item3]), %% commented out 21 8 19 +/**((isvalstr(Item1),Item1A=[Item1]);(not(isvalstr(Item1)),Item1A=Item1)), + ((isvalstr(Item2),Item2A=[Item2]);(not(isvalstr(Item2)),Item2A=Item2)), + %%((isvalstr(Item3),Item3A=[Item3]);(not(isvalstr(Item3)),Item3A=Item3)), + **/ + append(Item1,Item2,Item3). +/**delete1(Item1,Item2,Item3) :- + ((isvalstr(Item1),Item1A=[Item1]);(not(isvalstr(Item1)),Item1A=Item1)), + ((isvalstr(Item2),Item2A=[Item2]);(not(isvalstr(Item2)),Item2A=Item2)), + %%((isvalstr(Item3),Item3A=[Item3]);(not(isvalstr(Item3)),Item3A=Item3)), + delete(Item1A,Item2A,Item3). 
+**/ +replace_empty_with_empty_set([],A,A). +replace_empty_with_empty_set(A,B,C) :- + A=[Item1|Items], + (var(Item1)->Item2=Item1;(is_empty(Item1)->Item2=[];Item2=Item1)), + append(B,[Item2],D), + replace_empty_with_empty_set(Items,D,C). +removebrackets([[Value]],Value) :-!. +removebrackets(Value,Value). + + +%% bc in a=bc +%%: if doesn't contain "|" in first level, then match4 list x, terminal + + + +single_item(A) :- predicate_or_rule_name(A),!. +single_item(A) :- variable_name(A),!. +single_item(A) :- A="|",fail,!. +single_item(A) :- string(A),!. +single_item(A) :- number(A),!. +single_item(A) :- blob(A,stream),!. + +%single_item(A) :- atom(A),!. +%single_item([A,B]) :- atom(A),atom(b),!. + +single_item_or_atom(A) :- predicate_or_rule_name(A),!. +single_item_or_atom(A) :- variable_name(A),!. +%single_item_or_atom(A) :- A="|",fail,!. +single_item_or_atom(A) :- string(A),!. +single_item_or_atom(A) :- number(A),!. +single_item_or_atom(A) :- atom(A),!. +single_item_or_atom(A) :- blob(A,stream),!. + +is_value_match(A) :- predicate_or_rule_name(A),!. +is_value_match(A) :- A="|",fail,!. +is_value_match(A) :- string(A),!. +is_value_match(A) :- number(A),!. +is_value_match(A) :- blob(A,stream),!. +%is_value_match(A) :- atom(A),!. +%is_value_match([A,B]) :- atom(A),atom(b),!. + +append11(Empty,A,A) :- is_empty(Empty),!. +append11(A,B,C) :- append(A,B,C). + +/** match4 + +from bracketed head: + +match4([[[v,a],[v,c]],"|",[v,b]],[[1,2],3,4],[],V). V = [[[v, a], 1], [[v, c], 2], [[v, b], [3, 4]]]. + + match4([[v,a],"|",[v,b]],[1,2,3,4],[],V). +V = [[[v, a], 1], [[v, b], [2, 3, 4]]]. + +match4([[v,a],[v,c],"|",[v,b],[v,d]],[1,2,3,4],[],V). +should be false + +match4([[[v,a]],[v,c],"|",[v,b]],[[1],2,3,4],[],V). +V = [[[v, a], 1], [[v, c], 2], [[v, b], [3, 4]]]. + + + match4([[v,a],"|",[v,b]],[[1,2],3,4],[],V). +V = [[[v, a], [1, 2]], [[v, b], [3, 4]]]. + +match4([[[v,a],"|",[v,d]],[v,c],"|",[v,b]],[[1,5],2,3,4],[],V). 
+V = [[[v, a], 1], [[v, d], [5]], [[v, c], 2], [[v, b], [3, 4]]]. + +match4([[v,a],"|",[v,b]],[[1,2],3,4],[],V). +V = [[[v, a], [1, 2]], [[v, b], [3, 4]]]. + +match4([[v,a],"|",[[v,b]]],[1,2],[],V). V = [[[v, a], 1], [[v, b], 2]]. + +match4([[v,a]],[1],[],V). +V = [[[v, a], 1]]. + +match4([[v,a],[v,b]],[1,2],[],V). +V = [[[v, a], 1], [[v, b], 2]]. + match4([[v,a],[v,b]],[[1,3],2],[],V). +V = [[[v, a], [1, 3]], [[v, b], 2]] + +**/ + +map(_,_,_F,[],L,L,_). +map(Functions0,Functions,F,L,M1,N,Vars1):- +get_lang_word("v",Dbw_v), +get_lang_word("sys1",Dbw_sys1), +not((L=[])),L=[H|T], + + interpretstatement1(_,Functions0,Functions,[F,[M1,H,[Dbw_v,Dbw_sys1]]],Vars1,Vars2,true,nocut), + getvalue([Dbw_v,Dbw_sys1],M2,Vars2), + +%%(F,(M1,H,M2)), +map(Functions0,Functions,F,T,M2,N,Vars1). + diff --git a/listprologinterpreter/lpi b/listprologinterpreter/lpi new file mode 100644 index 0000000000000000000000000000000000000000..7ce628f697e1429d54a3f71a169662ff05810162 Binary files /dev/null and b/listprologinterpreter/lpi differ diff --git a/listprologinterpreter/lpi-api.pl b/listprologinterpreter/lpi-api.pl new file mode 100644 index 0000000000000000000000000000000000000000..c60af69482e9501db103bfb9727826e79437cf4d --- /dev/null +++ b/listprologinterpreter/lpi-api.pl @@ -0,0 +1,76 @@ +:- use_module(library(http/thread_httpd)). +:- use_module(library(http/http_dispatch)). +:- use_module(library(http/http_error)). +:- use_module(library(http/html_write)). + +% we need this module from the HTTP client library for http_read_data +:- use_module(library(http/http_client)). +:- http_handler('/', web_form, []). + +:- include('files/listprolog.pl'). + +server(Port) :- + http_server(http_dispatch, [port(Port)]). 
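The `map/7` predicate above threads an accumulator through successive calls of a two-input, one-output predicate `F` over a list — in other words, a left fold. A minimal Python sketch, with the interpreter plumbing (`Functions0`, `Vars1`, `[v,sys1]`) elided:

```python
# Sketch of map/7's control flow: fold f over items, starting from acc,
# where f(acc, item) plays the role of calling F with (M1, H, Sys1) and
# reading Sys1 back as the next accumulator M2.
def lp_map(f, items, acc):
    """Left fold: thread acc through f over items."""
    for item in items:
        acc = f(acc, item)
    return acc
```

The empty-list clause `map(_,_,_F,[],L,L,_)` corresponds to returning the accumulator unchanged when `items` is exhausted.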
+ + /* + browse http://127.0.0.1:8000/ + This demonstrates handling POST requests + */ + + web_form(_Request) :- + reply_html_page( + title('List Prolog Interpreter'), + [ + form([action='/landing', method='POST'], [ + /** + p([], [ + label([for=debug],'Debug (on/off):'), + input([name=debug, type=textarea]) + ]), + **/ + p([], [ + label([for=query],'Query (e.g. [[n,reverse],[[1,2,3],[],[v,l]]]):'), + input([name=query, type=textarea]) + ]), + p([], [ + label([for=functions],'Functions (e.g. [ +[[n,reverse],[[],[v,l],[v,l]]], +[[n,reverse],[[v,l],[v,m],[v,n]],":-", +[[[n,head],[[v,l],[v,h]]], +[[n,tail],[[v,l],[v,t]]], +[[n,wrap],[[v,h],[v,h1]]], +[[n,append],[[v,h1],[v,m],[v,o]]], +[[n,reverse],[[v,t],[v,o],[v,n]]] +] +] +]):'), + input([name=functions, type=textarea]) + ]), + p([], input([name=submit, type=submit, value='Submit'], [])) + ])]). + + :- http_handler('/landing', landing_pad, []). + + landing_pad(Request) :- + member(method(post), Request), !, + http_read_data(Request, Data, []), + format('Content-type: text/html~n~n', []), + format('

', []), + %%portray_clause(Data), + + %%term_to_atom(Term,Data), + +Data=[%%debug='off',%%Debug1, +query=Query1,functions=Functions1,submit=_], + +term_to_atom(Debug2,'off'), +term_to_atom(Query2,Query1), +term_to_atom(Functions2,Functions1), + +international_interpret([lang,"en"],Debug2,Query2,Functions2,Result), + %%format('

========~n', []), + %%portray_clause + portray_clause(Result), + %%writeln1(Data), + +format('

'). \ No newline at end of file diff --git a/listprologinterpreter/lpiverify4.pl b/listprologinterpreter/lpiverify4.pl new file mode 100644 index 0000000000000000000000000000000000000000..ef803f2c23c07fc4d8260afcbca2dd885c479319 --- /dev/null +++ b/listprologinterpreter/lpiverify4.pl @@ -0,0 +1,5505 @@ +%% test(Debug[on/off],Total,Score). + +%%:- use_module(library(time)). + +%% Test cases, Debug=trace=on or off, NTotal=output=total cases, Score=output=result + +test(Debug,NTotal,Score) :- test(Debug,0,NTotal,0,Score),!. +test(_Debug,NTotal,NTotal,Score,Score) :- NTotal=251, !. +test(Debug,NTotal1,NTotal2,Score1,Score2) :- + NTotal3 is NTotal1+1, + test(NTotal3,Query,Functions,Result), + ((international_interpret([lang,"en"],Debug,Query,Functions,Result1), + %writeln1([result1,Result1]), + Result=Result1 + )->(Score3 is Score1+1,writeln0([test,NTotal3,passed]));(Score3=Score1,writeln0([test,NTotal3,failed]))), + writeln0(""), + test(Debug,NTotal3,NTotal2,Score3,Score2),!. + +%% Test individual cases, Debug=trace=on or off, N=case number, Passed=output=result + +test1(Debug,N,Passed) :- + test(N,Query,Functions,Result), + ((international_interpret([lang,"en"],Debug,Query,Functions,Result1), + %writeln1([result1,Result1]), + Result=Result1 + )->(Passed=passed,writeln0([test,N,passed]));(Passed=failed,writeln0([test,N,failed]))),!. + + +%%writeln([eg1]), +test(1,[[n,function]], +[ + [[n,function],":-", + [ + [[n,equals4_on]] + ] + ] +] +,[[]]). +%%writeln([eg2]), +test(2,[[n,function],[1,1,[v,c]]], +[ + [[n,function],[[v,a],[v,b],[v,c]],":-", + [ + [[n,+],[[v,a],[v,b],[v,d]]], + [[n,+],[[v,d],1,[v,c]]] + ] + ] +] +,[[[[v,c], 3]]]). 
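The `test/5` driver at the top of lpiverify4.pl runs numbered cases up to a fixed total, compares each interpreter result against the expected one, and tallies a score. A hedged sketch of that loop — `run_case` stands in for `international_interpret/5`:

```python
# Sketch of the test/5 scoring loop (assumed shape, not the real driver):
# run every case, print pass/fail in the same style as writeln0, and
# return (total, score).
def run_suite(cases, run_case):
    score = 0
    for n, (query, functions, expected) in enumerate(cases, start=1):
        ok = run_case(query, functions) == expected
        print(f"[test, {n}, {'passed' if ok else 'failed'}]")
        score += ok
    return len(cases), score
```

`test1/3` above is the single-case variant of the same comparison.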
+%%writeln([eg3]), +test(3,[[n,function],[1,1,[v,c]]], +[ + [[n,function],[[v,a],[v,b],[v,c]],":-", + [ + [[n,function2],[[v,d],[v,f]]], + [[n,+],[[v,a],[v,b],[v,e]]], + [[n,+],[[v,e],[v,f],[v,g]]], + [[n,+],[[v,g],[v,d],[v,c]]] + ] + ], + [[n,function2],[[v,a],[v,f]],":-", + [ + [[n,is],[[v,a],2]], + [[n,is],[[v,f],1]] + ] + ] +] +,[[[[v,c], 5]]]). +%%writeln([eg4]), +test(4,[[n,append1],[[v,a]]], +[ + [[n,append1],[[v,a]],":-", + [ + [[n,b],[[v,b]]], + [[n,c],[[v,c]]], + [[n,append],[[v,b],[v,c],[v,a]]] + ] + ], + [[n,b],[["b"]]], + [[n,c],[["c"]]] +] +,[[[[v,a], ["b", "c"]]]]). + +%%writeln([eg5]), +test(5,[[n,count],[1,[v,n]]], +[ + [[n,count],[1,2]] + +] ,[[[[v,n], 2]]]). +%%writeln([eg6]), +test(6,[[n,count],[0,[v,n]]], +[ + [[n,count],[1,2]], + [[n,count],[[v,n],[v,p]],":-", + [ + [[n,+],[[v,n],1,[v,m]]], + [[n,count],[[v,m],[v,p]]] + ] + ]/* + , + + [[n,p1],[[v,n],[v,a],[v,p]],":-", + [ + [[n,+],[[v,n],[v,a],[v,p]]] + ] + ]*/ + +] ,[[[[v,n], 2]]]). +%%writeln([eg7]), +test(7,[[n,reverse],[[1,2,3],[],[v,l]]], +[ + [[n,reverse],[[],[v,l],[v,l]]], + [[n,reverse],[[v,l],[v,m],[v,n]],":-", + [ [[n,head],[[v,l],[v,h]]], + [[n,tail],[[v,l],[v,t]]], + [[n,wrap],[[v,h],[v,h1]]], + [[n,append],[[v,h1],[v,m],[v,o]]], + [[n,reverse],[[v,t],[v,o],[v,n]]] + ] + ] +],[[[[v,l], [3, 2, 1]]]]). + +test(8,[[n,grammar1],["apple"]], +[ + [[n,grammar1],[[v,s]],":-", + [ + [[n,noun],[[v,s],""]] + ] + ], + + [[n,noun],"->",["apple"]] +],[[]]). + +test(9,[[n,grammar1],["aaa"]], +[ + [[n,grammar1],[[v,s]],":-", + [ + [[n,noun],[[v,s],""]] + ] + ], + + [[n,noun],"->",[""]], + [[n,noun],"->",["a",[[n,noun]]]] +],[[]]). + +test(10,[[n,grammar1],["aa",[v,t]]], +[ + [[n,grammar1],[[v,s],[v,t]],":-", + [ + [[n,noun],[[v,s],"",[v,t]]] + ] + ], + + [[n,noun],["b"],"->",[""]], + [[n,noun],[[v,t]],"->",["a",[[n,noun],[[v,t]]]]] +],[[[[v,t],"b"]]]). 
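Tests 8–10 use difference strings: each nonterminal takes the input and returns the unconsumed remainder, and the whole string parses when the remainder is `""`. A sketch of test 9's grammar (`noun -> "" | "a" noun`) in that style:

```python
# Sketch of the difference-string idiom in tests 9-10: a nonterminal
# returns what it did not consume; the empty production consumes nothing.
def noun(s):
    if s.startswith("a"):
        return noun(s[1:])   # consume one "a" and recurse
    return s                 # empty production

def grammar1(s):
    # the query [[n,noun],[[v,s],""]] demands an empty remainder
    return noun(s) == ""
```

Tests 10 and 11 extend the same shape with extra arguments (`[v,t]`, `[v,u]`) that are bound by the base-case production.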
+ +test(11,[[n,grammar1],["aa",[v,t],[v,u]]], +[ + [[n,grammar1],[[v,s],[v,t],[v,u]],":-", + [ + [[n,noun],[[v,s],"",[v,t],[v,u]]] + ] + ], + + [[n,noun],["b","c"],"->",[""]], + [[n,noun],[[v,t],[v,u]],"->",["a",[[n,noun],[[v,t],[v,u]]]]] +],[[[[v,t],"b"],[[v,u],"c"]]]). + +test(12,[[n,grammar1],["aa"]], +[ + [[n,grammar1],[[v,s]],":-", + [ + [[n,noun],[[v,s],""]] + ] + ], + + [[n,noun],"->",[""]], + [[n,noun],"->",["a",[[n,noun]]]] + +],[[]]). + +test(13,[[n,grammar1],["[a,a]",[v,t]]], +[ + [[n,grammar1],[[v,u],[v,t]],":-", + [ + [[n,compound],[[v,u],"",[],[v,t]]] + ] + ], + + %[[n,compound213],["","",[v,t],[v,t]]], + + [[n,compound213],[[v,u],[v,u],[v,t],[v,t]]], + + [[n,compound],[[v,t],[v,u]],"->", + ["[",[[n,compound21],[[v,t],[v,v]]],"]", + [[n,compound213],[[v,v],[v,u]]]]], + + %[[n,compound212],["","",[v,t],[v,t]]], + + [[n,compound212],[[v,u],[v,u],[v,t],[v,t]]], + + [[n,compound21],[[v,t],[v,u]],"->", + [[[n,a]],[[n,rightbracketnext]], + [[n,code],[[n,wrap],["a",[v,itemname1]]], + [[n,append],[[v,t],[v,itemname1],[v,v]]]], + [[n,compound212],[[v,v],[v,u]]]]], + + [[n,compound21],[[v,t],[v,u]],"->", + [[[n,a]],",", + [[n,compound21],[[],[v,compound1name]]], + [[n,code],[[n,wrap],["a",[v,itemname1]]], + [[n,append],[[v,t],[v,itemname1],[v,v]]], + [[n,append],[[v,v],[v,compound1name],[v,u]]]]]], + + %[[n,compound212],[[v,a],[v,a],[v,t],[v,t]]], + + %[[n,a],["",""]], + + [[n,a],"->",["a"]], + + [[n,rightbracketnext],"->", + [[[n,lookahead],["]"]]]], + + [[n,lookahead],[[v,a],[v,a],[v,b]],":-", + [[[n,stringconcat],[[v,b],[v,d],[v,a]]]]] + %[[n,a],[[v,a],[v,a]]] + +],[[[[v,t],["a","a"]]]]). 
+ +test(14,[[n,grammar1],["[a]",[v,t]]], +[ + [[n,grammar1],[[v,u],[v,t]],":-", + [ + [[n,compound],[[v,u],"",[],[v,t]]] + ] + ], + + [[n,compound213],["","",[v,t],[v,t]]], + + [[n,compound213],[[v,a],[v,a],[v,t],[v,t]]], + + [[n,compound],[[v,t],[v,u]],"->", + ["[",[[n,compound21],[[v,t],[v,v]]],"]", + [[n,compound213],[[v,v],[v,u]]]]], + + [[n,compound212],["","",[v,t],[v,t]]], + + [[n,compound212],[[v,a],[v,a],[v,t],[v,t]]], + + [[n,compound21],[[v,t],[v,u]],"->", + [[[n,a]], + [[n,code],[[n,wrap],["a",[v,itemname1]]], + [[n,append],[[v,t],[v,itemname1],[v,v]]]], + [[n,compound212],[[v,v],[v,u]]]]], + + [[n,a],["",""]], + + [[n,a],"->",["a"]], + + [[n,a],[[v,a],[v,a]]] + +],[[[[v,t],["a"]]]]). + +% String to List (Term) + +% run with swipl --stack-limit=1G + +% n,letters needs to include chars except quote + +%test(15,[[n,grammar1],["[[\"aa,]\",\"b\",a],1]",[v,t]]], +test(15, + +[[n,grammar1],[%"[[1],1]" +"[[\"aa,]\",b,\"c\",[]],1]" +,[v,t]]], + +%[[n,item],["b,""c"",[]],1]",[v,vgp3],[v,i]]], + +%test(15,[[n,grammar1],["[]"]], +%est(15,[[n,item],["\"aa,\"","",[v,t]]], +%test(15,[[n,item],["a","",[v,t]]], +%test(15,[[n,grammar1],["[a]",[v,t]]], +%%test(15,[[n,compound213],["","",[["a"],1],[v,t]]], + +[ + [[n,grammar1],[[v,u],[v,t]],":-", + [ + [[n,compound],[[v,u],"",[],[v,t]]] + %,[[n,cut]] + %%[[n,number21],[[v,u],"","",[v,t]]] + %%[[n,compound213],["","",[["a"],1],[v,t]]] + ] + ], + + %[[n,compound213],["","",[v,t],[v,t]]], + + [[n,compound213],[[v,u],[v,u],[v,t],[v,t]]], %% swapped these + + [[n,compound],[[v,t],[v,u]],"->", + ["[","]", + [[n,compound213],[[v,t],[v,u]]]]], + + [[n,compound],[[v,t],[v,u]],"->", + ["[",[[n,compound21],[[v,t],[v,v]]],"]",%[[n,code],[[n,trace2]]], + [[n,compound213],[[v,v],[v,u]]]]], + + %[[n,compound212],["","",[v,t],[v,t]]], + + [[n,compound212],[[v,u],[v,u],[v,t],[v,t]]], + + [[n,compound21],[[v,t],[v,u]],"->", + [[[n,item],[[v,i]]], + [[n,lookahead],["]"]], + [[n,code],[[n,wrap],[[v,i],[v,itemname1]]], + 
[[n,append],[[v,t],[v,itemname1],[v,v]]]], + [[n,compound212],[[v,v],[v,u]]] + %,[[n,code],[[n,cut]]] + ]], + + [[n,compound21],[[v,t],[v,u]],"->", + [[[n,item],[[v,i]]],",", + + %[[n,code],[[n,trace]]], + + [[n,compound21],[[],[v,compound1name]]], + [[n,code],[[n,wrap],[[v,i],[v,itemname1]]], + [[n,append],[[v,t],[v,itemname1],[v,v]]], + [[n,append],[[v,v],[v,compound1name],[v,u]]] + %,[[n,cut]] + ]]], + + [[n,item],[[v,t]],"->",["\"",[[n,word21],["",[v,t]]], + "\"" + %[[n,code],[[n,cut]]] + ]], + + [[n,item],[[v,t]],"->", + [[[n,number21],["",[v,u]]],[[n,code], + [[n,stringtonumber],[[v,u],[v,t]]] + %,[[n,cut]] + ]]], + +%/* + [[n,item],[[v,t]],"->",[[[n,word21_atom],["",[v,t1]]], + [[n,code],[[n,atom_string],[[v,t],[v,t1]]] + %,[[n,cut]] + ]]], % atoms +%*/ + [[n,item],[[v,t]],"->",[[[n,compound],[[],[v,t]]] + %,[[n,code],[[n,cut]]] + ]], + + %[[n,number212],["","",[v,t],[v,t]]], + + [[n,number212],[[v,u],[v,u],[v,t],[v,t]]], + + [[n,number21],[[v,t],[v,u]],"->", + [[v,a],[[n,commaorrightbracketnext]], + [[n,code], + [[n,"->"],[[ + [[n,stringtonumber],[[v,a],[v,a1]]], + [[n,number],[[v,a1]] + ]], + [[n,true]], + [[n,"->"],[ + [[n,equals4],[[v,a],"."]], + [[n,true]], + [[n,equals4],[[v,a],"-"]] + ]] + ]], + [[n,stringconcat],[[v,t],[v,a],[v,v]]]], + [[n,number212],[[v,v],[v,u]]]]], + + [[n,number21],[[v,t],[v,u]],"->", + [[v,a], + [[n,code], + [[n,"->"],[[ + [[n,stringtonumber],[[v,a],[v,a1]]], + [[n,number],[[v,a1]] + ]], + [[n,true]], + [[n,"->"],[ + [[n,equals4],[[v,a],"."]], + [[n,true]], + [[n,equals4],[[v,a],"-"]] + ]] + ]], + [[n,stringconcat],[[v,t],[v,a],[v,v]]]], + [[n,number21],["",[v,numberstring]]], + [[n,code],[[n,stringconcat], + [[v,v],[v,numberstring],[v,u]]]]]], + + %[[n,word212],["","",[v,t],[v,t]]], + + [[n,word212],[[v,u],[v,u],[v,t],[v,t]]], + + [[n,word21],[[v,t],[v,u]],"->", + [[v,a],[[n,quote_next]], + [[n,code],%[[n,letters],[[v,a]]], + [[n,not],[[[n,=],[[v,a],"\""]]]], + [[n,stringconcat],[[v,t],[v,a],[v,v]]]], + 
[[n,word212],[[v,v],[v,u]]]]], + + [[n,word21],[[v,t],[v,u]],"->", + [[v,a], + [[n,code],%[[n,letters],[[v,a]]], + [[n,not],[[[n,=],[[v,a],"\""]]]], + [[n,stringconcat],[[v,t],[v,a],[v,v]]]], + [[n,word21],["",[v,wordstring]]], + [[n,code], + [[n,stringconcat],[[v,v],[v,wordstring],[v,u]]]]]], + +%/* + %[[n,word212_atom],["","",[v,t],[v,t]]], + + [[n,word212_atom],[[v,u],[v,u],[v,t],[v,t]]], + + [[n,word21_atom],[[v,t],[v,u]],"->", + [[v,a],[[n,commaorrightbracketnext]], + [[n,code],%[[n,letters],[[v,a]]], + [[n,not],[[[n,=],[[v,a],"\""]]]], + [[n,not],[[[n,=],[[v,a],"["]]]], + [[n,not],[[[n,=],[[v,a],"]"]]]], + [[n,stringconcat],[[v,t],[v,a],[v,v]]]], + [[n,word212_atom],[[v,v],[v,u]]]]], + + [[n,word21_atom],[[v,t],[v,u]],"->", + [[v,a], + [[n,code],%[[n,letters],[[v,a]]], + [[n,not],[[[n,=],[[v,a],"\""]]]], + [[n,not],[[[n,=],[[v,a],"["]]]], + [[n,not],[[[n,=],[[v,a],"]"]]]], + [[n,stringconcat],[[v,t],[v,a],[v,v]]]], + [[n,word21_atom],["",[v,wordstring]]], + [[n,code], + [[n,stringconcat],[[v,v],[v,wordstring],[v,u]]]]]], + %*/ + [[n,commaorrightbracketnext],"->", + [[[n,lookahead],[","]]]], + + [[n,commaorrightbracketnext],"->", + [[[n,lookahead],["]"]]]], + + + [[n,quote_next],"->", + [[[n,lookahead],["\""]]]], + + + [[n,lookahead],[[v,a],[v,a],[v,b]],":-", + [[[n,stringconcat],[[v,b],[v,d],[v,a]]]]] + +%()%],[[[v,t],[["a"],1]]]). +%],[[[[v,t],[["aa,]","b",a],1]]]]). + +],[[[[v,t],[["aa,]",b,"c",[]],1]]]]). +%],[[[[v,i],b]]]). + +%],[[[[v,t],[[1],1]]]]). + +%],[[[[v,t],[[]]]]]). +%],[[[[v,t],"aa,"]]]). +%],[[[[v,t],[a]]]]). 
+ + + +%% Dukel goes to the grammar example + +test(16,[[n,grammar1],["john ate the apple"]], +[ + [[n,grammar1],[[v,u]],":-", + [ + [[n,sentence],[[v,u],""]] + ] + ], + + [[n,sentence],"->", + [[[n,subject]],[[n,verbphrase]]]], + + [[n,verbphrase],"->", + [[[n,verb]],[[n,object]]]], + + [[n,subject],["",""]], + + [[n,subject],"->",["john"," "]], + + [[n,subject],[[v,a],[v,a]]], + + [[n,verb],["",""]], + + [[n,verb],"->",["ate"," "]], + + [[n,verb],[[v,a],[v,a]]], + + [[n,object],["",""]], + + [[n,object],"->",["the"," ","apple"]], + + [[n,object],[[v,a],[v,a]]] +],[[]]). + +% Split string on ".", "?", "!" + +%% Blackl loves the grammar + +test(17,[[n,grammar1],["aaa1 ,-'! a? b! b.",[v,t]]], +%test(17,[[n,grammar1],["a? b!",[v,t]]], +%%test(15,[[n,compound213],["","",[["a"],1],[v,t]]], + +[ + [[n,grammar1],[[v,u],[v,t]],":-", + [ + [[n,compound21],[[v,u],"",[],[v,t]]] + ,[[n,cut]] + %%[[n,number21],[[v,u],"","",[v,t]]] + %%[[n,compound213],["","",[["a"],1],[v,t]]] + ] + ], + + [[n,compound213],["","",[v,t],[v,t]]], + + [[n,compound213],[[v,u],[v,u],[v,t],[v,t]]], %% swapped these + + [[n,compound],[[v,t],[v,u]],"->", + [[[n,compound21],[[v,t],[v,v]]], + [[n,compound213],[[v,v],[v,u]]]]], + + [[n,compound212],["","",[v,t],[v,t]]], + + [[n,compound212],[[v,u],[v,u],[v,t],[v,t]]], + + [[n,compound21],[[v,t],[v,u]],"->", + [[[n,item],[[v,i]]], + [[n,code],%%[[n,stringconcat],[[v,i],".",[v,i2]]], + [[n,wrap],[[v,i],[v,itemname1]]], + [[n,append],[[v,t],[v,itemname1],[v,v]]]], + [[n,compound212],[[v,v],[v,u]]] + ,[[n,code],[[n,cut]]] + ]], + + [[n,compound21],[[v,t],[v,u]],"->", + [[[n,item],[[v,i]]]," ", + [[n,compound21],[[],[v,compound1name]]], + [[n,code],%%[[n,stringconcat],[[v,i],".",[v,i2]]], + [[n,wrap],[[v,i],[v,itemname1]]], + [[n,append],[[v,t],[v,itemname1],[v,v]]], + [[n,append],[[v,v],[v,compound1name],[v,u]]] + ,[[n,cut]] + ]]], +%/** + [[n,item],[[v,t]],"->", + [[[n,number21],["",[v,t]]], + [[n,code],[[n,cut]]]]], +%**/ + 
[[n,item],[[v,t]],"->",[[[n,word21],["",[v,t]]]%, + %[[n,code],[[n,cut]]]]], + ]], + + [[n,item],[[v,t]],"->",[[[n,compound],[[],[v,t]]]%, + %[[n,code],[[n,cut]]]]], + ]], +%/** + [[n,number212],["","",[v,t],[v,t]]], + + [[n,number212],[[v,u],[v,u],[v,t],[v,t]]], + + [[n,number21],[[v,t],[v,u]],"->", + [[v,a],[[n,code],[[n,stringtonumber],[[v,a],[v,a1]]], + [[n,number],[[v,a1]]], + [[n,stringconcat],[[v,t],[v,a],[v,v]]]], + [[n,number212],[[v,v],[v,u]]]]], + + [[n,number21],[[v,t],[v,u]],"->", + [[v,a], + [[n,code],[[n,stringtonumber],[[v,a],[v,a1]]], + [[n,number],[[v,a1]]], + [[n,stringconcat],[[v,t],[v,a],[v,v]]]], + [[n,number21],["",[v,numberstring]]], + [[n,code],[[n,stringconcat], + [[v,v],[v,numberstring],[v,u]]]]]], +%**/ + [[n,word212],["","",[v,t],[v,t]]], + + [[n,word212],[[v,u],[v,u],[v,t],[v,t]]], + +/** + [[n,word21],[[v,t],[v,u]],"->", + [[v,a],[[n,code],[[n,stringtonumber],[[v,a],[v,a1]]], + [[n,number],[[v,a1]]], + [[n,stringconcat],[[v,t],[v,a],[v,v]]]], + [[n,word212],[[v,v],[v,u]]]]], +**/ + [[n,word21],[[v,t],[v,u]],"->", + [[v,a],[v,b], + [[n,code],[[n,sentencechars],[[v,a]]], + [[n,finalchar],[[v,b]]], + [[n,stringconcat],[[v,t],[v,a],[v,v1]]], + [[n,stringconcat],[[v,v1],[v,b],[v,v]]]], + [[n,word212],[[v,v],[v,u]]]]], + +/** + [[n,word21],[[v,t],[v,u]],"->", + [[v,a], + [[n,code],[[n,stringtonumber],[[v,a],[v,a1]]], + [[n,number],[[v,a1]]], + [[n,stringconcat],[[v,t],[v,a],[v,v]]]], + [[n,word21],["",[v,numberstring]]], + [[n,code],[[n,stringconcat], + [[v,v],[v,numberstring],[v,u]]]]]] + +**/ + [[n,word21],[[v,t],[v,u]],"->", + [[v,a], + [[n,code],[[n,sentencechars],[[v,a]]], + [[n,stringconcat],[[v,t],[v,a],[v,v]]]], + [[n,word21],["",[v,wordstring]]], + [[n,code], + [[n,stringconcat],[[v,v],[v,wordstring],[v,u]]]]]], + + [[n,sentencechars],[[v,c]],":-", + [[[n,letters],[[v,c]]]]], + + [[n,sentencechars],[[v,c]],":-", + [[[[n,stringtonumber],[[v,c],[v,n]]], + [[n,number],[[v,n]]]]]], + + [[n,sentencechars],[[v,c]],":-", + 
[[[n,=],[[v,c]," "]]]], + + [[n,sentencechars],[[v,c]],":-", + [[[n,=],[[v,c],","]]]], + + [[n,sentencechars],[[v,c]],":-", + [[[n,=],[[v,c],"-"]]]], + + [[n,sentencechars],[[v,c]],":-", + [[[n,=],[[v,c],"'"]]]], + + [[n,finalchar],[[v,c]],":-", + [[[n,=],[[v,c],"."]]]], + + [[n,finalchar],[[v,c]],":-", + [[[n,=],[[v,c],"!"]]]], + + [[n,finalchar],[[v,c]],":-", + [[[n,=],[[v,c],"?"]]]] + +%%],[[[v,t],[["a"],1]]]). +],[[[[v,t],["aaa1 ,-'!","a?","b!","b."]]]]). + +%% Adye is Venan + +test(18,[[n,grammar1],["what is 1+11",[v,c]]], +[ + [[n,grammar1],[[v,u],[v,c]],":-", + [ + [[n,sentence],[[v,u],"",[v,c]]] + ] + ], + + [[n,sentence],[[v,c]],"->", + [[[n,subject]],[[n,verbphrase],[[v,c]]]]], + + [[n,verbphrase],[[v,c]],"->", + [[[n,verb]],[[n,object],[[v,c]]]]], + + [[n,subject],["",""]], + + [[n,subject],"->",["what"," "]], + + [[n,subject],[[v,a],[v,a]]], + + [[n,verb],["",""]], + + [[n,verb],"->",["is"," "]], + + [[n,verb],[[v,a],[v,a]]], + + [[n,object],["","",[v,c]]], + + [[n,object],[[v,c]],"->",[[[n,item],[[v,a]]], + "+", + [[n,item],[[v,b]]], + [[n,code],[[n,+],[[v,a],[v,b],[v,c]]]]]], + + [[n,object],[[v,a],[v,a]]], + + [[n,item],[[v,t]],"->", + [[[n,number21],["",[v,u]]],[[n,code], + [[n,stringtonumber],[[v,u],[v,t]]]]]], + + [[n,number212],["","",[v,t],[v,t]]], + + [[n,number212],[[v,u],[v,u],[v,t],[v,t]]], + + [[n,number21],[[v,t],[v,u]],"->", + [[v,a],[[n,code],[[n,stringtonumber],[[v,a],[v,a1]]], + [[n,number],[[v,a1]]], + [[n,stringconcat],[[v,t],[v,a],[v,v]]]], + [[n,number212],[[v,v],[v,u]]]]], + + [[n,number21],[[v,t],[v,u]],"->", + [[v,a], + [[n,code],[[n,stringtonumber],[[v,a],[v,a1]]], + [[n,number],[[v,a1]]], + [[n,stringconcat],[[v,t],[v,a],[v,v]]]], + [[n,number21],["",[v,numberstring]]], + [[n,code],[[n,stringconcat], + [[v,v],[v,numberstring],[v,u]]]]]] + +],[[[[v,c],12]]]). 
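Test 18 embeds a semantic action (`[[n,code],[[n,+],...]]`) inside a grammar rule: after the fixed phrase is consumed, the two numbers around `+` are parsed and summed. The same computation as a plain sketch:

```python
# Sketch of test(18)'s grammar plus action: strip the fixed prefix,
# split on "+", and compute the sum as the semantic value [v,c].
def answer(s):
    prefix = "what is "
    assert s.startswith(prefix)
    a, b = s[len(prefix):].split("+")
    return int(a) + int(b)
```

So `answer("what is 1+11")` yields 12, matching the expected result `[[[[v,c],12]]]`.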
+ + +%% Inky Classic 2 + +test(19,[[n,positivityscore],[["not","you","like","a","walk"] +,["would","you","like","a","walk" +],0,[v,s]]], + +/** +test(19,[[n,positivityscore],["would1"%%,"you","like","a","walk" +,["would"%%,"you","like","a","walk" +]]], +**/ +[ +/** + [[n,positivityscore],[[v,l],[v,m]],":-", + [ [[n,not],[ + [[n,member],[[v,l],[v,m]]]]] + + ]] +**/ + + [[n,positivityscore],[[],[v,l],[v,s],[v,s]]], + [[n,positivityscore],[[v,l],[v,m],[v,s1],[v,s2]],":-", + [ [[n,head],[[v,l],[v,h]]], + [[n,tail],[[v,l],[v,t]]], + [[n,member],[[v,h],[v,m]]], + [[n,+],[[v,s1],1,[v,s3]]], + [[n,positivityscore],[[v,t],[v,m],[v,s3], + [v,s2]]] + ]], + + [[n,positivityscore],[[v,l],[v,m],[v,s1],[v,s2]],":-", + [ [[n,head],[[v,l],[v,h]]], + [[n,tail],[[v,l],[v,t]]], + [[n,not],[[[n,member],[[v,h],[v,m]]]]], + [[n,positivityscore],[[v,t],[v,m],[v,s1], + [v,s2]]]]] + + +] + +,[[[[v,s], 4]]]). +%%,[]). + +test(20,[[n,function],[1,1,[v,c]]], +[ + [[n,function],[[v,a],[v,b],[v,c]],":-", + [ + [[[n,+],[[v,a],[v,b],[v,c]]]] + ] + ] +] +,[[[[v,c], 2]]]). + +%%test(21,[[n,grammar1],["john ate"]], +test(21,[[n,grammar1],["ate",[v,t]]], +[ + + + [[n,grammar1],[[v,u],[v,t]],":-", + [ + [[n,lookahead],[[v,u],[v,t],"ate"]] %% 2 is endstring, 3 is what lookahead checks for + ] + ], +/** + [[n,sentence],"->", + [[[n,subject]], + [[n,lookahead],["ate"]], + [[n,verb]] + ]], + + [[n,verbphrase],"->", + [[[n,verb]]]], + + [[n,subject],["",""]], + + [[n,subject],"->",["john"," "]], + + [[n,subject],[[v,a],[v,a]]], + + [[n,object],["",""]], + + [[n,object],"->",["apples"]], + + [[n,object],[[v,a],[v,a]]], +**/ + + [[n,lookahead],[[v,a],[v,a],[v,b]],":-", + [[[n,stringconcat],[[v,b],[v,d],[v,a]]]]] + +] +,[[[[v,t],"ate"]]]). 
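The `lookahead/3` clause in test 21 — `[[n,stringconcat],[[v,b],[v,d],[v,a]]]` with the input passed through unchanged — succeeds exactly when `b` is a prefix of the remaining input `a`, without consuming it. A sketch of that idiom:

```python
# Sketch of the lookahead/3 idiom: check for a prefix of the remaining
# input but return the input unchanged, so nothing is consumed.
def lookahead(remaining, prefix):
    if not remaining.startswith(prefix):
        raise ValueError("lookahead failed")
    return remaining
```

Tests 13 and 15 use the same clause (`rightbracketnext`, `commaorrightbracketnext`, `quote_next`) to peek at the next delimiter before committing to a production.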
+ + +test(22,[[n,grammar1],["peter cut the pear"]], +[ + [[n,grammar1],[[v,u]],":-", + [ + [[n,sentence],[[v,u],""]] + ] + ], + + [[n,sentence],"->", + [[[n,subject]],[[n,verbphrase]]]], + + [[n,verbphrase],"->", + [[[n,verb]],[[n,object]]]], + + [[n,subject],["",""]], + + [[n,subject],"->",["john"," "]], + + [[n,subject],"->",["peter"," "]], + + [[n,subject],[[v,a],[v,a]]], + + [[n,verb],["",""]], + + [[n,verb],"->",["ate"," "]], + + [[n,verb],"->",["bought"," "]], + + [[n,verb],"->",["cut"," "]], + + [[n,verb],[[v,a],[v,a]]], + + [[n,object],"->", + ["the"," ",[[n,noun]]]], + + [[n,noun],["",""]], + + [[n,noun],"->",["apple"]], + + [[n,noun],"->",["pear"]], + + [[n,noun],"->",["peach"]], + + [[n,noun],[[v,a],[v,a]]] +],[[]]). + +%% Two Uses - PhD algorithm - agree with all of only one of two sides and give opinion + +%%do you agree with either abc (list j1) or def (list j2) given you agree with abcg (list k)? +%%what is your opinion of a given opinion of a is b? + +test(23,[[n,agree],[["a","b","c"] +,["d","e","f"],["a","b","c","g"],[v,s]]], + +[ + +%% test whether each item of jn is a member of k +%% test whether each item of jn is not a member of k + + [[n,agree],[[v,j1],[v,j2],[v,k],"You agree with j1."],":-", + [ [[n,membera1],[[v,j1],[v,k]]], + [[n,membera2],[[v,j2],[v,k]]] + ]], + + [[n,agree],[[v,j1],[v,j2],[v,k],"You agree with j2."],":-", + [ [[n,membera1],[[v,j2],[v,k]]], + [[n,membera2],[[v,j1],[v,k]]] + ]], + + [[n,membera1],[[],[v,l]]], + [[n,membera1],[[v,l],[v,m]],":-", + [ [[n,head],[[v,l],[v,h]]], + [[n,tail],[[v,l],[v,t]]], + [[n,member],[[v,h],[v,m]]], + [[n,membera1],[[v,t],[v,m]]] + ]], + + [[n,membera2],[[],[v,l]]], + [[n,membera2],[[v,l],[v,m]],":-", + [ [[n,head],[[v,l],[v,h]]], + [[n,tail],[[v,l],[v,t]]], + [[n,membera3],[[v,m],[v,h]]], + [[n,membera2],[[v,t],[v,m]]]]], + + [[n,membera3],[[],[v,l]]], + [[n,membera3],[[v,l],[v,m]],":-", + [ [[n,head],[[v,l],[v,h]]], + [[n,tail],[[v,l],[v,t]]], + [[n,not],[[[n,=],[[v,m],[v,h]]]]], + 
[[n,membera3],[[v,t],[v,m]]] + ]] + + +],[[[[v,s],"You agree with j1."]]]). + +test(24,[[n,modus_ponens],["a",[["a","b"],["c","d"],["e","f"]],[v,s]]], + +[ + [[n,modus_ponens],[[v,a],[v,ab],[v,b]],":-", + [ [[n,member],[[v,ab1],[v,ab]]], + [[n,equals1],[[v,ab1],[[v,a],[v,b]]]] + ]] + +],[[[[v,s],"b"]]]). + +%% Two Uses - original argument and algorithm (file) +%% splits on \n, removes 1 duplicate per line, returns score of duplicates + +test(25,[[n,grammar1],["aaa1 ,-'\na\nb\nb\n", +"aaa1 ,-'\na\nb\na", +[v,s]]], +%%()test(15,[[n,compound213],["","",[["a"],1],[v,t]]], +%%test(25,[[n,word21],["a\n","","",[v,t]]], +%%test(25,[[n,deletea2],[["a","b"],"a",[v,m1]]], +%%test(25,[[n,deletea2],[["a","a","b"],"a",[v,m1]]], +%%test(25,[[n,membera3],[["a","b"],"c"]], +%%test(25,[[n,positivityscore],[["a","b"],["a","b"],0,[v,m1]]], + +[ + [[n,grammar1],[[v,u],[v,t],[v,s]],":-", + [ + [[n,compound21],[[v,u],"",[],[v,u1]]], + [[n,compound21],[[v,t],"",[],[v,t1]]], + [[n,positivityscore],[[v,u1],[v,t1],0,[v,s]]] + %%[[n,membera3],[["a","b"],"a"]] + %%[[n,number21],[[v,u],"","",[v,t]]] + %%[[n,compound213],["","",[["a"],1],[v,t]]] + ] + ], + + [[n,compound213],["","",[v,t],[v,t]]], + + [[n,compound213],[[v,u],[v,u],[v,t],[v,t]]], %% swapped these + + [[n,compound],[[v,t],[v,u]],"->", + [[[n,compound21],[[v,t],[v,v]]], + [[n,compound213],[[v,v],[v,u]]]]], + + [[n,compound212],["","",[v,t],[v,t]]], + + [[n,compound212],[[v,u],[v,u],[v,t],[v,t]]], + + [[n,compound21],[[v,t],[v,u]],"->", + [[[n,item],[[v,i]]], + [[n,code],%%[[n,stringconcat],[[v,i],".",[v,i2]]], + [[n,wrap],[[v,i],[v,itemname1]]], + [[n,append],[[v,t],[v,itemname1],[v,v]]]], + [[n,compound212],[[v,v],[v,u]]]]], + + [[n,compound21],[[v,t],[v,u]],"->", + [[[n,item],[[v,i]]], + [[n,compound21],[[],[v,compound1name]]], + [[n,code],%%[[n,stringconcat],[[v,i],".",[v,i2]]], + [[n,wrap],[[v,i],[v,itemname1]]], + [[n,append],[[v,t],[v,itemname1],[v,v]]], + [[n,append],[[v,v],[v,compound1name],[v,u]]]]]], +/** + 
[[n,item],[[v,t]],"->", + [[[n,number21],["",[v,t]]]]], +**/ + [[n,item],[[v,t]],"->",[[[n,word21],["",[v,t]]]]], + + [[n,item],[[v,t]],"->",[[[n,compound],[[],[v,t]]]]], +/** + [[n,number212],["","",[v,t],[v,t]]], + + [[n,number212],[[v,u],[v,u],[v,t],[v,t]]], + + [[n,number21],[[v,t],[v,u]],"->", + [[v,a],[[n,code],[[n,stringtonumber],[[v,a],[v,a1]]], + [[n,number],[[v,a1]]], + [[n,stringconcat],[[v,t],[v,a],[v,v]]]], + [[n,number212],[[v,v],[v,u]]]]], + + [[n,number21],[[v,t],[v,u]],"->", + [[v,a], + [[n,code],[[n,stringtonumber],[[v,a],[v,a1]]], + [[n,number],[[v,a1]]], + [[n,stringconcat],[[v,t],[v,a],[v,v]]]], + [[n,number21],["",[v,numberstring]]], + [[n,code],[[n,stringconcat], + [[v,v],[v,numberstring],[v,u]]]]]], +**/ + [[n,word212],["","",[v,t],[v,t]]], + + [[n,word212],[[v,u],[v,u],[v,t],[v,t]]], + +/** + [[n,word21],[[v,t],[v,u]],"->", + [[v,a],[[n,code],[[n,stringtonumber],[[v,a],[v,a1]]], + [[n,number],[[v,a1]]], + [[n,stringconcat],[[v,t],[v,a],[v,v]]]], + [[n,word212],[[v,v],[v,u]]]]], +**/ + [[n,word21],[[v,t],[v,u]],"->", + [[v,b], + [[n,code],%%[[n,sentencechars],[[v,a]]], + [[n,finalchar],[[v,b]]] + %%[[n,stringconcat],[[v,t],[v,a],[v,v1]]], + %%[[n,stringconcat],[[v,t],[v,b],[v,v] + ], + [[n,word212],[[v,t],[v,u]]]]], + +/** + [[n,word21],[[v,t],[v,u]],"->", + [[v,a], + [[n,code],[[n,stringtonumber],[[v,a],[v,a1]]], + [[n,number],[[v,a1]]], + [[n,stringconcat],[[v,t],[v,a],[v,v]]]], + [[n,word21],["",[v,numberstring]]], + [[n,code],[[n,stringconcat], + [[v,v],[v,numberstring],[v,u]]]]]] + +**/ + [[n,word21],[[v,t],[v,u]],"->", + [[v,a],%%[[n,not_return_next]], + [[n,code],[[n,not],[[[n,=],[[v,a],"\n"]]]], + %%[[n,sentencechars],[[v,a]]], + [[n,stringconcat],[[v,t],[v,a],[v,v]]]], + [[n,word21],["",[v,wordstring]]], + [[n,code], + [[n,stringconcat],[[v,v],[v,wordstring],[v,u]]]]]], + +/** + [[n,sentencechars],[[v,c]],":-", + [[[n,letters],[[v,c]]]]], + + [[n,sentencechars],[[v,c]],":-", + [[[[n,stringtonumber],[[v,c],[v,n]]], + 
[[n,number],[[v,n]]]]]], + + [[n,sentencechars],[[v,c]],":-", + [[[n,=],[[v,c]," "]]]], + + [[n,sentencechars],[[v,c]],":-", + [[[n,=],[[v,c],","]]]], + + [[n,sentencechars],[[v,c]],":-", + [[[n,=],[[v,c],"-"]]]], + + [[n,sentencechars],[[v,c]],":-", + [[[n,=],[[v,c],"'"]]]], +**/ + [[n,finalchar],[[v,c]],":-", + [[[n,=],[[v,c],"\n"]]]], + + [[n,finalchar],[[v,c]],":-", + [[[n,=],[[v,c],""]]]], + +%% [[n,not_return_next],[[v,a],[v,a]],":-", +%% [[[n,not],[[[n,stringconcat],["\n",[v,d],[v,a]]]]]]] + + [[n,positivityscore],[[],[v,l],[v,s],[v,s]]], + [[n,positivityscore],[[v,l],[v,m],[v,s1],[v,s2]],":-", + [ [[n,head],[[v,l],[v,h]]], + [[n,tail],[[v,l],[v,t]]], + [[n,member],[[v,h],[v,m]]], + [[n,"->"],[[[n,deletea2],[[v,m],[v,h],[v,m1]]], + [[n,true]], + [[n,=],[[v,m],[v,m1]]]]], + [[n,+],[[v,s1],1,[v,s3]]], + [[n,positivityscore],[[v,t],[v,m1],[v,s3], + [v,s2]]] + ]], + + [[n,positivityscore],[[v,l],[v,m],[v,s1],[v,s2]],":-", + [ [[n,head],[[v,l],[v,h]]], + [[n,tail],[[v,l],[v,t]]], + [[n,not],[[[n,membera3],[[v,m],[v,h]]]]], + [[n,positivityscore],[[v,t],[v,m],[v,s1], + [v,s2]]]]], + + [[n,deletea2],[[],[v,l],[v,m1]],":-",[[[n,fail]]]], %%%** + [[n,deletea2],[[v,l],[v,m],[v,t]],":-", + [ [[n,head],[[v,l],[v,h]]], + [[n,tail],[[v,l],[v,t]]], + [[n,=],[[v,m],[v,h]]] + %%[[n,delete],[[v,m],[v,h],[v,m1]]]]], + ]], + + [[n,deletea2],[[v,l],[v,m],[v,m1]],":-", + [ [[n,head],[[v,l],[v,h]]], + [[n,tail],[[v,l],[v,t]]], + [[n,not],[[[n,=],[[v,m],[v,h]]]]], + %%[[n,not],[[[n,membera3],[[v,m],[v,h]]]]], + [[n,deletea2],[[v,t],[v,m],[v,m1]]] + ]], + + [[n,membera3],[[],[v,l]],":-",[[[n,fail]]]], + [[n,membera3],[[v,l],[v,m]],":-", + [ [[n,head],[[v,l],[v,h]]], + [[n,tail],[[v,l],[v,t]]], + [[[n,=],[[v,m],[v,h]]]] + ]], + + [[n,membera3],[[v,l],[v,m]],":-", + [ [[n,head],[[v,l],[v,h]]], + [[n,tail],[[v,l],[v,t]]], + [[n,not],[[[[n,=],[[v,m],[v,h]]]]]], + [[n,membera3],[[v,t],[v,m]]] + ]] + + + +%%()],[[[v,t],[["a"],1]]]). +],[[[[v,s],3]]]). +%% ],[[[[v,m1],2]]]). +%%],[]). 
+%%()],[[[[v,t],"a"]]]). + +%% Pedagogy X - write 15*2=30 As in Honours and 50*2=100 As in Masters and 250 As in PhD. - use cawp to find algorithms with input and output in same pattern as a past algorithm - question answer for cawp specs, run cawp in post order + +/** +%% - simple x spec with data - for alg, and simple x spec with data for grammars +- in text file + +- instruct to call only previously found predicates (x call first) - then constructs next predicate up as head predicate - new feature: substitutes a call x either tries existing tree or writes fresh code +**/ + +%% Two Uses - See philosophy/he should identify different writers in the exposition.pl + +test(26,[[n,append1],[["a"],["b"],[v,s]]], + +[ + [[n,append1],[[v,a],[v,b],[v,s]],":-", + [ [[n,append],[[v,a],[v,b],[v,s]]] + ]] + +],[[[[v,s],["a","b"]]]]). + + +test(27,[[n,equals11],["a","a"]], + +[ + [[n,equals11],[[v,a],[v,a]]] + +],[[]]). + +test(28,[[n,number11],[1]], + +[ + [[n,number11],[[v,a]],":-", + [ [[n,number],[[v,a]]] + ]] + +],[[]]). + +test(29,[[n,minus11],[[1,2,3],[3],[v,c]]], + +[ + [[n,minus11],[[v,a],[],[v,a]]], + [[n,minus11],[[v,a],[v,b],[v,c]],":-", + [ [[n,head],[[v,b],[v,h]]], + [[n,tail],[[v,b],[v,t]]], + [[n,delete],[[v,a],[v,h],[v,c]]], + [[n,minus11],[[v,c],[v,t],[v,c]]] + ]] + +],[[[[v,c],[1,2]]]]). + +test(30,[[n,if11],[1,[v,b]]], + +[ + [[n,if11],[[v,a],[v,b]],":-", + [ [[n,"->"],[[[n,is],[[v,a],1]], + [[n,is],[[v,b],2]], + [[n,is],[[v,b],3]]]] + ]] + +],[[[[v,b],2]]]). + + +test(31,[[n,not11],[1]], + +[ + [[n,not11],[[v,a]],":-", + [ [[n,not],[[[n,=],[[v,a],2]]]] + ]] + +],[[]]). + +test(32,[[n,or11],[1]], + +[ + [[n,or11],[[v,a]],":-", + [ [[n,or],[[[n,is],[[v,a],1]], + [[n,is],[[v,a],2]]]] + ]] + +],[[]]). + +%% Starts at 3, decreases given the lesser of A or B until reaching 1. 
+ +test(33,[[n,downpipe],[3,1,[[3,[4,2]],[2,[3,1]]]]], + +[ + [[n,downpipe],[[v,a],[v,a],[v,b]]], + [[n,downpipe],[[v,a],[v,b],[v,c]],":-", + [ [[n,member],[[v,c1],[v,c]]], + [[n,equals4],[[v,c1],[[v,a],[v,c12]]]], + [[n,equals4],[[v,c12],[[v,c121],[v,c122]]]], + [[n,"->"],[[[n,>],[[v,a],[v,c121]]], + [[n,downpipe],[[v,c121],[v,b],[v,c]]], + [[n,"->"],[[[n,>],[[v,a],[v,c122]]], + [[n,downpipe],[[v,c122],[v,b],[v,c]]], + [[n,fail]]]]]] + ]] + +],[[]]). + +%% Get item n, copies it + +test(34,[[n,getitemn],[3,[1,2,3],[v,c]]], + +[ + [[n,getitemn],[1,[v,b],[v,c]],":-", + [ [[n,head],[[v,b],[v,c]]] + ]], + [[n,getitemn],[[v,a],[v,b],[v,c]],":-", + [ [[n,not],[[[n,=],[[v,a],1]]]], + [[n,tail],[[v,b],[v,t]]], + [[n,-],[[v,a],1,[v,d]]], + [[n,getitemn],[[v,d],[v,t],[v,c]]] + ]] + +],[[[[v,c],3]]]). + +%% A shell of LPI allows manipulation of variable order, testing for e.g. identical inverse + +%% commutative not identical + +test(35,[[n,identical],[1,2]], + +[ + [[n,identical],[[v,a],[v,b]],":-", + [ [[n,+],[[v,a],[v,b],[v,c]]], + [[n,+],[[v,b],[v,a],[v,c]]] + ]] + +],[[]]). + +test(36,[[n,associative],[1,2,3]], + +[ + [[n,associative],[[v,a],[v,b],[v,c]],":-", + [ [[n,*],[[v,a],[v,b],[v,d]]], + [[n,*],[[v,d],[v,c],[v,e]]], + [[n,*],[[v,b],[v,c],[v,f]]], + [[n,*],[[v,f],[v,a],[v,e]]] + ]] + +],[[]]). + +%% audience size + +test(37,[[n,length1],[[1],0,[v,l]]], +[ + [[n,length1],[[],[v,l],[v,l]]], + [[n,length1],[[v,l],[v,m1],[v,n]],":-", + [ [[n,not],[[[n,=],[[v,l],[]]]]], + [[n,tail],[[v,l],[v,t]]], + [[n,+],[[v,m1],1,[v,m2]]], + [[n,length1],[[v,t],[v,m2],[v,n]]] + ] + ] +],[[[[v,l], 1]]]). + + +%% Are their differences equal? + +test(38,[[n,optimise1],[[[5,4],[3,2],[1,0]],[v,d]]], + +[ + [[n,optimise1],[[v,a],[v,b]],":-", + [ [[n,head],[[v,a],[v,h]]], + [[n,tail],[[v,a],[v,t]]], + [[n,equals1],[[v,h],[[v,h1],[v,h2]]]], + [[n,-],[[v,h1],[v,h2],[v,b]]], + [[n,"->"],[[[n,not],[[[n,=],[[v,t],[]]]]], + [[n,optimise1],[[v,t],[v,b]]], + [[n,true]]]] + ]] + +],[[[[v,d], 1]]]). 
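
Test 34's `getitemn` copies out the nth item (1-indexed) by repeatedly taking the tail and decrementing the index. A minimal Python sketch of the same recursion (the function name mirrors the predicate and is illustrative only, not part of the test suite):

```python
def getitemn(n, lst):
    """Return the nth item (1-indexed), mirroring test 34's recursion."""
    if n == 1:
        return lst[0]                    # base case: the head of the list
    return getitemn(n - 1, lst[1:])      # recurse on the tail with n-1

print(getitemn(3, [1, 2, 3]))  # 3, matching the expected [[v,c],3]
```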
+ + +test(39,[[n,member1a],[1,[1,2]]], + +[[[n,member1a],[[v,i1],[v,l]], ":-", + [[[n,intersection2],[[v,i1],[v,l],[],[v,m]]]]], +[[n,intersection2],[[v,a], [], [v,l], [v,l]]], +[[n,intersection2],[[v,i1], [v,l1], [v,l2], [v,l3]], ":-", + [[[n,head],[[v,l1],[v,i1]]], + [[n,tail],[[v,l1],[v,l4]]], + [[n,wrap],[[v,i1],[v,i11]]], + [[n,append],[[v,l2],[v,i11],[v,l3]]]]], + %%[[n,intersection2],[[v,i1], [v,l4], [v,l5], [v,l3]]]]], +[[n,intersection2],[[v,i1], [v,l1], [v,l2], [v,l3]], ":-", + [[[n,head],[[v,l1],[v,i2]]], + [[n,tail],[[v,l1],[v,l4]]], + [[n,not],[[[n,=],[[v,i1],[v,i2]]]]], + [[n,intersection2],[[v,i1], [v,l4], [v,l2], [v,l3]]]]]] +,[[]]). + +test(40,[[n,minus1],[[1,2,3],[1,2],[v,a]]], + +[[[n,minus1],[[v,l], [], [v,l]]], +[[n,minus1],[[v,l1], [v,l2], [v,l3]],":-", + [[[n,head],[[v,l2],[v,i1]]], + [[n,tail],[[v,l2],[v,l5]]], + [[n,delete2],[[v,l1],[v,i1],[],[v,l6]]], + [[n,minus1],[[v,l6], [v,l5], [v,l3]]]]], +[[n,delete2],[[], [v,a], [v,l], [v,l]]], +[[n,delete2],[[v,l1],[v,i1],[v,l2],[v,l3]],":-", + [[[n,head],[[v,l1],[v,i1]]], + [[n,tail],[[v,l1],[v,l5]]], + [[n,delete2],[[v,l5],[v,i1],[v,l2],[v,l3]]]]], +[[n,delete2],[[v,l1],[v,i1],[v,l2],[v,l3]],":-", + [[[n,head],[[v,l1],[v,i2]]], + [[n,tail],[[v,l1],[v,l5]]], + [[n,not],[[[n,=],[[v,i1],[v,i2]]]]], + [[n,wrap],[[v,i2],[v,i21]]], + [[n,append],[[v,l2],[v,i21],[v,l6]]], + [[n,delete2],[[v,l5],[v,i1],[v,l6],[v,l3]]]]]] + +,[[[[v,a], [3]]]]). 
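
In test 40, `delete2` removes every occurrence of an item (test 44 confirms this: deleting 1 from [1,1,2] leaves [2]), and `minus1` folds `delete2` over the subtrahend list. A short Python sketch of that pair, under the same reading:

```python
def delete2(lst, item):
    """Remove every occurrence of item, as in test 40/44's delete2."""
    return [x for x in lst if x != item]

def minus1(lst, sub):
    """List difference: delete each element of sub from lst in turn."""
    for item in sub:
        lst = delete2(lst, item)
    return lst

print(minus1([1, 2, 3], [1, 2]))  # [3], matching [[v,a],[3]]
```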
+ +test(41,[[n,part_of_string],[[1,2,3,4],[2,3]]], + +[ + [[n,part_of_string],[[], []]], + [[n,part_of_string],[[],[v,b]],":-", + [[[n,not],[[[n,=],[[v,b],[]]]]], + [[n,fail]]]], + [[n,part_of_string],[[v,a],[v,b]],":-", + [[[n,tail],[[v,a],[v,at]]], + [[n,"->"],[[[n,listhead],[[v,a],[v,b]]], + [[n,true]], + [[n,part_of_string],[[v,at],[v,b]]]]]]], + + [[n,listhead],[[v,l],[]]], + [[n,listhead],[[v,a],[v,b]],":-", + [[[n,head],[[v,a],[v,ah]]], + [[n,tail],[[v,a],[v,at]]], + [[n,head],[[v,b],[v,ah]]], + [[n,tail],[[v,b],[v,bt]]], + [[n,listhead],[[v,at],[v,bt]]] + ]] + +],[[]]). + +test(42,%[[n,or12],[[v,a]]], +[[n,findall],[[v,b],[[[n,or12],[[v,b]]]],[v,a]]], +[ + [[n,or12],[1]], + [[n,or12],[2]] + +],[[[[v,a],[1,2]]]]). + +test(43,[[n,intersection1],[[1,2,3],[3,4,5],[],[v,a]]], + +[[[n,intersection1],[[], [v,a], [v,l], [v,l]]], +[[n,intersection1],[[v,l1], [v,l2], [v,l3a], [v,l3]],":-", + [[[n,head],[[v,l1],[v,i1]]], + [[n,tail],[[v,l1],[v,l4]]], + [[n,intersection2],[[v,i1],[v,l2],[],[v,l5]]], + [[n,append],[[v,l3a],[v,l5],[v,l6]]], + [[n,intersection1],[[v,l4],[v,l2],[v,l6],[v,l3]]]]], +[[n,intersection2],[[v,a], [], [v,l], [v,l]]], +[[n,intersection2],[[v,i1], [v,l1], [v,l2], [v,l3]], ":-", + [[[n,head],[[v,l1],[v,i1]]], + [[n,tail],[[v,l1],[v,l4]]], + [[n,wrap],[[v,i1],[v,i11]]], + [[n,append],[[v,l2],[v,i11],[v,l3]]]]], + %%[[n,intersection2],[[v,i1], [v,l4], [v,l5], [v,l3]]]]], +[[n,intersection2],[[v,i1], [v,l1], [v,l2], [v,l3]], ":-", + [[[n,head],[[v,l1],[v,i2]]], + [[n,tail],[[v,l1],[v,l4]]], + [[n,not],[[[n,=],[[v,i1],[v,i2]]]]], + [[n,intersection2],[[v,i1], [v,l4], [v,l2], [v,l3]]]]]] + +,[[[[v,a], [3]]]]). 
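
Test 41's `part_of_string` checks whether one list occurs as a contiguous sublist of another: `listhead` tests for a prefix, and the main predicate slides along the list by taking tails. A minimal Python sketch of the same logic (names are illustrative):

```python
def listhead(a, b):
    """True when b is a prefix of a (test 41's listhead)."""
    return a[:len(b)] == b

def part_of_string(a, b):
    """True when b occurs as a contiguous sublist of a."""
    if not a:
        return b == []   # empty pattern matches an empty list; otherwise fail
    return listhead(a, b) or part_of_string(a[1:], b)

print(part_of_string([1, 2, 3, 4], [2, 3]))  # True
```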
+ +test(44,[[n,delete2],[[1,1,2],1,[],[v,a]]], + +[[[n,delete2],[[], [v,a], [v,l], [v,l]]], +[[n,delete2],[[v,l1],[v,i1],[v,l2],[v,l3]],":-", + [[[n,head],[[v,l1],[v,i1]]], + [[n,tail],[[v,l1],[v,l5]]], + [[n,delete2],[[v,l5],[v,i1],[v,l2],[v,l3]]]]], +[[n,delete2],[[v,l1],[v,i1],[v,l2],[v,l3]],":-", + [[[n,head],[[v,l1],[v,i2]]], + [[n,tail],[[v,l1],[v,l5]]], + [[n,not],[[[n,=],[[v,i1],[v,i2]]]]], + [[n,wrap],[[v,i2],[v,i21]]], + [[n,append],[[v,l2],[v,i21],[v,l6]]], + [[n,delete2],[[v,l5],[v,i1],[v,l6],[v,l3]]]]]] + +,[[[[v,a], [2]]]]). + +%% confidence - when a person produces a certain amount of work, they will be fulfilled + +test(45,[[n,greaterthan],[3,2]], + +[ +[[n,greaterthan],[[v,a],[v,b]],":-", + [[[n,>],[[v,a],[v,b]]]]] + +],[[]]). + +%% did - check a box true + +test(46,[[n,conjunction],["true","false",[v,c]]], + +[ +[[n,conjunction],["true","true","true"]], +[[n,conjunction],[[v,a],[v,b],"false"],":-", + [[[n,not],[[[[n,=],[[v,a],"true"]], + [[n,=],[[v,b],"true"]]]]]]]] + +,[[[[v,c], "false"]]]). + +%% have - I had the collection of 1D items + +test(47,[[n,sum],[[3,1,2],0,[v,l]]], +[ + [[n,sum],[[],[v,l],[v,l]]], + [[n,sum],[[v,l],[v,m1],[v,n]],":-", + [ [[n,not],[[[n,=],[[v,l],[]]]]], + [[n,head],[[v,l],[v,h]]], + [[n,tail],[[v,l],[v,t]]], + [[n,+],[[v,m1],[v,h],[v,m2]]], + [[n,sum],[[v,t],[v,m2],[v,n]]] + ] + ] +],[[[[v,l], 6]]]). 
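
Test 47's `sum` is the standard accumulator pattern: the running total is threaded through the recursion and returned at the base case. The same shape in Python, as a sketch:

```python
def sum1(lst, acc=0):
    """Accumulator-style sum, mirroring test 47's sum/3."""
    if not lst:
        return acc                       # base case: return the accumulator
    return sum1(lst[1:], acc + lst[0])   # add the head, recurse on the tail

print(sum1([3, 1, 2]))  # 6, matching [[v,l],6]
```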
+ +%% I see to sort + +test(48,[[n,sort0],[[9,4,8,2,1,5,7,6,3,10],[v,l]]], +[ + [[n,sort0],[[v,l],[v,n]],":-", + [ [[n,sort1],[[v,l],[],[v,n]]] + ] + ], + [[n,sort1],[[],[v,l],[v,l]]], + [[n,sort1],[[v,l],[v,m1],[v,n]],":-", + [ [[n,not],[[[n,=],[[v,l],[]]]]], + [[n,head],[[v,l],[v,h]]], + [[n,tail],[[v,l],[v,t]]], + [[n,maximum],[[v,t],[v,h],[v,m2],[],[v,r]]], + [[n,wrap],[[v,m2],[v,m3]]], + [[n,append],[[v,m1],[v,m3],[v,m4]]], + [[n,sort1],[[v,r],[v,m4],[v,n]]] + ] + ], + [[n,maximum],[[],[v,l],[v,l],[v,r],[v,r]]], + [[n,maximum],[[v,l],[v,m1],[v,n],[v,r1],[v,r2]],":-", + [ [[n,not],[[[n,=],[[v,l],[]]]]], + [[n,head],[[v,l],[v,h]]], + [[n,tail],[[v,l],[v,t]]], + [[n,"->"],[[[n,>=],[[v,m1],[v,h]]], + [[[n,=],[[v,m2],[v,m1]]], + [[n,wrap],[[v,h],[v,h2]]], + [[n,append],[[v,r1],[v,h2],[v,r3]]]], + [[[n,=],[[v,m2],[v,h]]], + [[n,wrap],[[v,m1],[v,m12]]], + [[n,append],[[v,r1],[v,m12],[v,r3]]]]]], + [[n,maximum],[[v,t],[v,m2],[v,n],[v,r3],[v,r2]]] + ] + ] +],[[[[v,l], [10,9,8,7,6,5,4,3,2,1]]]]). + +%% the aim of going to a place is reaching local maximum height + +test(49,[[n,maximum0],[[2,1,3,5,-1],[v,m]]], +[ + [[n,maximum0],[[v,l],[v,m]],":-", + [ [[n,head],[[v,l],[v,h]]], + [[n,tail],[[v,l],[v,t]]], + [[n,maximum],[[v,t],[v,h],[v,m],[],[v,r]]] + ] + ], + [[n,maximum],[[],[v,l],[v,l],[v,r],[v,r]]], + [[n,maximum],[[v,l],[v,m1],[v,n],[v,r1],[v,r2]],":-", + [ [[n,not],[[[n,=],[[v,l],[]]]]], + [[n,head],[[v,l],[v,h]]], + [[n,tail],[[v,l],[v,t]]], + [[n,"->"],[[[n,>=],[[v,m1],[v,h]]], + [[[n,=],[[v,m2],[v,m1]]], + [[n,wrap],[[v,h],[v,h2]]], + [[n,append],[[v,r1],[v,h2],[v,r3]]]], + [[[[n,=],[[v,m2],[v,h]]]], + [[n,wrap],[[v,m1],[v,m12]]], + [[n,append],[[v,r1],[v,m12],[v,r3]]]]]], + [[n,maximum],[[v,t],[v,m2],[v,n],[v,r3],[v,r2]]] + ] + ] +],[[[[v,m], 5]]]). 
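
Test 48's `sort0` is a selection sort into descending order: `maximum` splits the list into its largest element and the remainder, and `sort1` repeats that until the list is empty. A Python sketch of the same two-predicate structure (illustrative only):

```python
def maximum(lst, m, rest):
    """Split lst into its maximum (seeded with m) and the remaining items."""
    if not lst:
        return m, rest
    h, t = lst[0], lst[1:]
    if m >= h:
        return maximum(t, m, rest + [h])  # keep the current maximum
    return maximum(t, h, rest + [m])      # h is the new maximum

def sort0(lst):
    """Descending selection sort, as in test 48: extract the maximum each pass."""
    out = []
    while lst:
        m, rest = maximum(lst[1:], lst[0], [])
        out.append(m)
        lst = rest
    return out

print(sort0([9, 4, 8, 2, 1, 5, 7, 6, 3, 10]))  # [10, 9, 8, 7, 6, 5, 4, 3, 2, 1]
```

Test 49's `maximum0` reuses the same `maximum` helper to return only the largest element.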
+ +%% the tutor gave the mark for either answer + +test(50,[[n,disjunction],["true","false",[v,c]]], + +[ +[[n,disjunction],["false","false","false"]], +[[n,disjunction],[[v,a],[v,b],"true"],":-", + [[[n,not],[[[[n,=],[[v,a],"false"]], + [[n,=],[[v,b],"false"]]]]]]]] + +,[[[[v,c], "true"]]]). + +test(51,[[n,expressionnotheadache],["true",[v,c]]], + +[ +[[n,expressionnotheadache],["true","true"]], +[[n,expressionnotheadache],[[v,a],"false"],":-", + [[[n,not],[[[[n,=],[[v,a],"true"]]]]]]] ] +,[[[[v,c], "true"]]]). + +test(52,[[n,mainrole],[7,[v,c]]], + +[ +[[n,mainrole],[7,"mainrole"]], +[[n,mainrole],[[v,shortcourses],"false"],":-", + [[[n,not],[[[[n,=],[[v,shortcourses],7]]]]]]] ] +,[[[[v,c], "mainrole"]]]). + +%% c=f(g(2), 1, 1) +test(53,[[n,function],[[[n,function2],[2]],1,1,[v,c]]], +%%test(53,[[n,getitemn],[1,[1,2,3],[v,bb]]], +[ + [[n,function],[[v,f1],[v,a],[v,b],[v,c]],":-", + [ + [[n,equals1],[[v,f1],[[v,f11],[v,f12]]]], + [[n,getitemn],[1,[v,f12],[v,bb]]], + [[v,f11],[[v,bb],[v,d],[v,f]]], + [[n,+],[[v,a],[v,b],[v,e]]], + [[n,+],[[v,e],[v,f],[v,g]]], + [[n,+],[[v,g],[v,d],[v,c]]] + ] + ], + [[n,function2],[[v,bb],[v,a],[v,f]],":-", + [ + [[n,is],[[v,a],[v,bb]]], + [[n,is],[[v,f],1]] + ] + ], + + [[n,getitemn],[1,[v,b],[v,c]],":-", + [ [[n,head],[[v,b],[v,c]]] + ]], + [[n,getitemn],[[v,a],[v,b],[v,c]],":-", + [ [[n,not],[[[n,=],[[v,a],1]]]], + [[n,tail],[[v,b],[v,t]]], + [[n,-],[[v,a],1,[v,d]]], + [[n,getitemn],[[v,d],[v,t],[v,c]]] + ]] +] + +,[[[[v,c], 5]]]). +%%,[[[[v,bb], 1]]]). 
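
Test 53 computes c = f(g(2), 1, 1): the argument `[[n,function2],[2]]` is a (predicate, arguments) pair that is unpacked with `equals1`/`getitemn` and called through the variable `[v,f11]`, and the results feed a chain of additions (e = a+b, g = e+f, c = g+d). A Python reading of the same flow, as a sketch:

```python
def function2(bb):
    """The passed-in predicate from test 53: returns (d, f) = (bb, 1)."""
    return bb, 1

def function(f1, a, b):
    """Unpack a (function, args) pair, call it, then chain the additions."""
    fn, args = f1
    d, f = fn(args[0])     # call the passed function on its first argument
    return a + b + f + d   # e = a+b; g = e+f; c = g+d

print(function((function2, [2]), 1, 1))  # 5, matching [[v,c],5]
```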
+ +test(54,[[n,function],[[[n,function2],[2]],1,1,[v,c]]], +[ + [[n,function],[[v,f1],[v,a],[v,b],[v,c]],":-", + [ + [[n,equals1],[[v,f1],[[v,f11],[v,f12]]]], + [[n,getitemn],[1,[v,f12],[v,bb]]], + [[v,f11],[[v,bb],[v,d],[v,f]]], + [[n,+],[[v,a],[v,b],[v,e]]], + [[n,+],[[v,e],[v,f],[v,g]]], + [[n,+],[[v,g],[v,d],[v,c]]] + ] + ], + [[n,function2],[[v,bb],[v,a],[v,f]],":-", + [ + [[n,is],[[v,a],[v,bb]]], + [[n,is],[[v,f],1]] + ] + ], + + [[n,getitemn],[1,[v,b],[v,c]],":-", + [ [[n,head],[[v,b],[v,c]]] + ]], + [[n,getitemn],[[v,a],[v,b],[v,c]],":-", + [ [[n,not],[[[n,=],[[v,a],1]]]], + [[n,tail],[[v,b],[v,t]]], + [[n,-],[[v,a],1,[v,d]]], + [[n,getitemn],[[v,d],[v,t],[v,c]]] + ]] +] + +,[[[[v,c], 5]]]). + +test(55,[[n,test1],[[v,c]]], + +[ +[[n,test1],[1]], +[[n,test2],[2]]] +,[[[[v, c], 1]]]). + +test(56,[[n,map],[[n,add],[1,2,3],0,[v,d]]], +[ + [[n,map],[[v,f],[],[v,l],[v,l]]], + [[n,map],[[v,f],[v,l],[v,m1],[v,n]],":-", + [ [[n,not],[[[n,=],[[v,l],[]]]]], + [[n,head],[[v,l],[v,h]]], + [[n,tail],[[v,l],[v,t]]], + [[v,f],[[v,m1],[v,h],[v,m2]]], + [[n,map],[[v,f],[v,t],[v,m2],[v,n]]] + ] + ], + + [[n,add],[[v,a],[v,b],[v,c]],":-", + [ [[n,+],[[v,a],[v,b],[v,c]]] + ] + ] +] + +,[[[[v,d], 6]]]). + +%% later: (test 58) omit if [v,f] fails + +test(57,[[n,findall1],[[n,plusone],[1,2,3],[],[v,d]]], +[ + [[n,findall1],[[v,f],[],[v,l],[v,l]]], + [[n,findall1],[[v,f],[v,l],[v,m1],[v,n]],":-", + [ [[n,not],[[[n,=],[[v,l],[]]]]], + [[n,head],[[v,l],[v,h]]], + [[n,tail],[[v,l],[v,t]]], + [[v,f],[[v,h],[v,m2]]], + [[n,wrap],[[v,m2],[v,m3]]], + [[n,append],[[v,m1],[v,m3],[v,m4]]], + [[n,findall1],[[v,f],[v,t],[v,m4],[v,n]]] + ] + ], + + [[n,plusone],[[v,a],[v,c]],":-", + [ [[n,+],[[v,a],1,[v,c]]] + ] + ] +] + +,[[[[v,d], [2,3,4]]]]). 
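
Tests 56 and 57 are the higher-order workhorses: `map` folds a binary predicate through the list, and `findall1` maps a unary predicate over it, wrapping and appending each result. A Python sketch of test 57's map-style `findall1` (names illustrative):

```python
def plusone(a):
    """The mapped predicate from test 57."""
    return a + 1

def findall1(f, lst):
    """Test 57's findall1: map f over lst via head/tail recursion."""
    if not lst:
        return []
    return [f(lst[0])] + findall1(f, lst[1:])

print(findall1(plusone, [1, 2, 3]))  # [2, 3, 4], matching [[v,d],[2,3,4]]
```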
+ + +test(58,[[n,findall1],[[n,a_to_c],["a","b","a"],[],[v,d]]], +[ + [[n,findall1],[[v,f],[],[v,l],[v,l]]], + [[n,findall1],[[v,f],[v,l],[v,m1],[v,n]],":-", + [ [[n,not],[[[n,=],[[v,l],[]]]]], + [[n,head],[[v,l],[v,h]]], + [[n,tail],[[v,l],[v,t]]], + %[[n,trace2]], + [[n,"->"],[[[v,f],[[v,h],[v,m2]]], + [ [[n,wrap],[[v,m2],[v,m3]]], + [[n,append],[[v,m1],[v,m3],[v,m4]]] + ], + [ + [[n,=],[[v,m1],[v,m4]]] + ]]], + [[n,findall1],[[v,f],[v,t],[v,m4],[v,n]]] + + ] + ], + + [[n,a_to_c],["a","c"] + ] +] + +,[[[[v,d], ["c","c"]]]]). + + +test(59,[[n,count],[1,[v,n]]], +[ + [[n,count],[1,2],":-",[[[n,cut]]]], + [[n,count],[1,3]] + + +] ,[[[[v,n], 2]]]). + +test(60,[[n,a]], +[ +[[n,a],":-", +[ + +[[n,interweaving1],[[["select,dash"],["neiey,person"],["neiey,person"]],[["select,dash"],["neiey,person"],["neiey,person"]],[],[["select,dash"],["neiey,person"],["neiey,person"]]]], + +[[n,duplicates],[[["select,dash"],["neiey,person"],["neiey,person"],["neiey,person"],["neiey,person"]],[["select,dash"],["neiey,person"],["neiey,person"],["neiey,person"],["neiey,person"]],[],[["select,dash"],["neiey,person"],["neiey,person"],["neiey,person"],["neiey,person"]]]], + +[[n,minus1],[[["select,dash"],["neiey,person"],["neiey,person"],["neiey,person"],["neiey,person"]],[["select,dash"],["neiey,person"],["neiey,person"],["neiey,person"],["neiey,person"]],[]]], + +[[n,reverse],[[["select,dash"],["neiey,person"],["neiey,person"]],[],[["neiey,person"],["neiey,person"],["select,dash"]]]], + +[[n,interweaving1],[[["neiey,person"],["neiey,person"],["select,dash"]],[["hipaa,square"],["releases,up"],["hipaa,square"]],[],[]]], + +[[n,append1],[[],[["hipaa,square"],["releases,up"],["hipaa,square"]],[["hipaa,square"],["releases,up"],["hipaa,square"]]]], + +[[n,minus1],[[["hipaa,square"],["releases,up"],["hipaa,square"]],[["select,dash"],["neiey,person"],["neiey,person"]],[["hipaa,square"],["releases,up"],["hipaa,square"]]]] + +]], +[[n,reverse],[[],[v,l],[v,l]]], 
+[[n,reverse],[[v,l],[v,m],[v,n]],":-",[[[n,head],[[v,l],[v,h]]],[[n,tail],[[v,l],[v,t]]],[[n,wrap],[[v,h],[v,h1]]],[[n,append],[[v,h1],[v,m],[v,o]]],[[n,reverse],[[v,t],[v,o],[v,n]]]]], +[[n,interweaving1],[[],[v,a],[v,l],[v,l]]], +[[n,interweaving1],[[v,l1],[v,l2],[v,l3a],[v,l3]],":-",[[[n,head],[[v,l1],[v,i1]]],[[n,tail],[[v,l1],[v,l4]]],[[n,interweaving2],[[v,i1],[v,l2],[],[v,l5]]],[[n,append],[[v,l3a],[v,l5],[v,l6]]],[[n,interweaving1],[[v,l4],[v,l2],[v,l6],[v,l3]]]]], +[[n,interweaving2],[[v,a],[],[v,l],[v,l]]], +[[n,interweaving2],[[v,i1],[v,l1],[v,l2],[v,l3]],":-",[[[n,head],[[v,l1],[v,i1]]],[[n,tail],[[v,l1],[v,l4]]],[[n,wrap],[[v,i1],[v,i11]]],[[n,append],[[v,l2],[v,i11],[v,l3]]]]],%%[[n,interweaving2],[[v,i1],[v,l4],[v,l5],[v,l3]]]]], +[[n,interweaving2],[[v,i1],[v,l1],[v,l2],[v,l3]],":-",[[[n,head],[[v,l1],[v,i2]]],[[n,tail],[[v,l1],[v,l4]]],[[n,not],[[[n,=],[[v,i1],[v,i2]]]]],[[n,interweaving2],[[v,i1],[v,l4],[v,l2],[v,l3]]]]], +[[n,append1],[[v,b],[v,c],[v,a]],":-",[[[n,append],[[v,b],[v,c],[v,a]]]]], +[[n,minus1],[[v,l],[],[v,l]]], +[[n,minus1],[[v,l1],[v,l2],[v,l3]],":-",[[[n,head],[[v,l2],[v,i1]]],[[n,tail],[[v,l2],[v,l5]]],[[n,delete2],[[v,l1],[v,i1],[],[v,l6]]],[[n,minus1],[[v,l6],[v,l5],[v,l3]]]]], +[[n,delete2],[[],[v,a],[v,l],[v,l]]], +[[n,delete2],[[v,l1],[v,i1],[v,l2],[v,l3]],":-",[[[n,head],[[v,l1],[v,i1]]],[[n,tail],[[v,l1],[v,l5]]],[[n,delete2],[[v,l5],[v,i1],[v,l2],[v,l3]]]]], +[[n,delete2],[[v,l1],[v,i1],[v,l2],[v,l3]],":-",[[[n,head],[[v,l1],[v,i2]]],[[n,tail],[[v,l1],[v,l5]]],[[n,not],[[[n,=],[[v,i1],[v,i2]]]]],[[n,wrap],[[v,i2],[v,i21]]],[[n,append],[[v,l2],[v,i21],[v,l6]]],[[n,delete2],[[v,l5],[v,i1],[v,l6],[v,l3]]]]], +[[n,mutuallyexclusive],[[],[v,l]]], +[[n,mutuallyexclusive],[[v,l],[v,m]],":-",[[[n,head],[[v,l],[v,h]]],[[n,tail],[[v,l],[v,t]]],[[n,membera3],[[v,m],[v,h]]],[[n,mutuallyexclusive],[[v,t],[v,m]]]]], +[[n,membera3],[[],[v,l]]], 
+[[n,membera3],[[v,l],[v,m]],":-",[[[n,head],[[v,l],[v,h]]],[[n,tail],[[v,l],[v,t]]],[[n,not],[[[n,=],[[v,m],[v,h]]]]],[[n,membera3],[[v,t],[v,m]]]]], +[[n,duplicates],[[],[v,l],[v,s],[v,s]]], +[[n,duplicates],[[v,l],[v,m],[v,s1],[v,s2]],":-",[[[n,head],[[v,l],[v,h]]],[[n,tail],[[v,l],[v,t]]],[[n,member],[[v,h],[v,m]]],[[n,"->"],[[[n,deletea2],[[v,m],[v,h],[v,m1]]],[[n,true]],[[n,=],[[v,m],[v,m1]]]]],[[n,wrap],[[v,h],[v,h1]]],[[n,append],[[v,s1],[v,h1],[v,s3]]],[[n,duplicates],[[v,t],[v,m1],[v,s3],[v,s2]]]]], +[[n,duplicates],[[v,l],[v,m],[v,s1],[v,s2]],":-",[[[n,head],[[v,l],[v,h]]],[[n,tail],[[v,l],[v,t]]],[[n,not],[[[n,membera4],[[v,m],[v,h]]]]],[[n,duplicates],[[v,t],[v,m],[v,s1],[v,s2]]]]], +[[n,deletea2],[[],[v,l],[v,m1]],":-",[[[n,fail]]]], +[[n,deletea2],[[v,l],[v,m],[v,t]],":-",[[[n,head],[[v,l],[v,h]]],[[n,tail],[[v,l],[v,t]]],[[n,=],[[v,m],[v,h]]]]], +[[n,deletea2],[[v,l],[v,m],[v,m1]],":-",[[[n,head],[[v,l],[v,h]]],[[n,tail],[[v,l],[v,t]]],[[n,not],[[[n,=],[[v,m],[v,h]]]]],[[n,deletea2],[[v,t],[v,m],[v,m1]]]]], +[[n,membera4],[[],[v,l]],":-",[[[n,fail]]]], +[[n,membera4],[[v,l],[v,h]],":-",[[[n,head],[[v,l],[v,h]]]]], +[[n,membera4],[[v,l],[v,m]],":-",[[[n,head],[[v,l],[v,h]]],[[n,tail],[[v,l],[v,t]]],[[n,not],[[[n,=],[[v,m],[v,h]]]]],[[n,membera4],[[v,t],[v,m]]]]], +[[n,part_of_string],[[],[]]], +[[n,part_of_string],[[],[v,b]],":-",[[[n,not],[[[n,=],[[v,b],[]]]]],[[n,fail]]]], +%%[[n,part_of_string],[[v,a],[v,b]],":-",[[[n,tail],[[v,a],[v,at]]],[[n,"->"],[[[n,listhead],[[v,a],[v,b]]],[[[n,true]]],[[[n,part_of_string],[[v,at],[v,b]]]]]]]], +[[n,part_of_string],[[v,a],[v,b]],":-",[[[n,tail],[[v,a],[v,at]]],[[n,"->"],[[[[n,listhead],[[v,a],[v,b]]]],[[[n,true]]],[[[n,part_of_string],[[v,at],[v,b]]]]]]]], +[[n,listhead],[[v,l],[]]], +[[n,listhead],[[v,a],[v,b]],":-",[[[n,head],[[v,a],[v,ah]]],[[n,tail],[[v,a],[v,at]]],[[n,head],[[v,b],[v,ah]]],[[n,tail],[[v,b],[v,bt]]],[[n,listhead],[[v,at],[v,bt]]]]], 
+[[n,listhead],[[v,a],[v,b]],":-",[[[n,head],[[v,a],[v,ah]]],[[n,tail],[[v,a],[v,at]]],[[n,head],[[v,b],[v,ah]]],[[n,tail],[[v,b],[v,bt]]],[[n,listhead],[[v,at],[v,bt]]]]] +],[[]]). + +test(61,[[n,add],[[1,2,3],3,[],[v,l]]], +[ + [[n,add],[[],[v,th],[v,l],[v,l]]], + [[n,add],[[v,l],[v,th],[v,m],[v,n]],":-", + [ [[n,head],[[v,l],[v,h]]], + [[n,tail],[[v,l],[v,t]]], + [[n,+],[[v,h],[v,th],[v,h0]]], + [[n,wrap],[[v,h0],[v,h1]]], + [[n,append],[[v,m],[v,h1],[v,o]]], + [[n,add],[[v,t],[v,th],[v,o],[v,n]]] + ] + ] +],[[[[v,l], [4,5,6]]]]). + +test(62,[[n,add],[[1],[2,3],[v,l]]], +[ + [[n,add],[[],[v,l],[v,l]]], + [[n,add],[[v,l],[v,m],[v,n]],":-", + [ [[n,head],[[v,l],[v,h]]], + [[n,tail],[[v,l],[v,t]]], + [[n,wrap],[[v,h],[v,h1]]], + [[n,append],[[v,h1],[v,m],[v,o]]], + [[n,add],[[v,t],[v,o],[v,n]]] + ] + ] +],[[[[v, l], [1,2,3]]]]). + +test(63,[[n,add],[1,[v,b]]], +[ + [[n,add],[2,3]], + [[n,add],[1,[v,b]],":-", + [ [[n,add],[2,[v,b]]]]] +],[[[[v, b], 3]]]). + + +test(64,[[n,add0],[[1,2],[v,b]]], +[ + [[n,add2],[[v,a],[v,b]],":-", + [ [[n,=],[[v,a],[]]], + [[n,=],[[v,b],[]]]]], + [[n,add3],[[v,a],[v,b]],":-", + [ [[n,tail],[[v,a],[v,b]]]]], + + [[n,add0],[[v,a],[v,b]],":-", + [ [[n,1],[[v,a],[v,c]]], + [[n,=],[[v,c],[v,b]]]]], + + [[n,1],[[v,a],[v,b]],":-", + [ [[n,add2],[[v,a],[v,c]]], + [[n,=],[[v,c],[v,b]]]]], + [[n,1],[[v,a],[v,b]],":-", + [ [[n,add3],[[v,a],[v,c]]], + [[n,1],[[v,c],[v,d]]], + [[n,=],[[v,d],[v,b]]]]] +],[[[[v, b], []]]]). + +test(65,[[n,add0],[[1],[v,b]]], +[[[n,add3],[[v,a],[v,b]],":-",[[[n,tail],[[v,a],[v,b]]]]],[[n,add0],[[v,a],[v,b]],":-",[[[n,add3],[[v,a],[v,c]]],[[n,=],[[v,c],[v,b]]]]]] +,[[[[v, b], []]]]). + +/** +%%[[1],[2,3],[1,2,3]]],[[],[1,2,3],[1,2,3]]] + +test(63,[[n,add],[1,2,[v,l]]], +[ + [[n,add],[[v,a],[v,b],[v,c]],":-", + [ [[n,+],[[v,a],[v,b],[v,c]]]]], + [[n,add],[[v,a],[v,b],[v,c]],":-", + [ [[n,-],[[v,a],[v,b],[v,c]]]]] +],[[[[v, l], 3]], [[[v, l], -1]]]). 
+ +test(64,[[n,add],[[1,2,3],3,[],[v,l],[v,t],[v,t],[v,th],[v,th],[v,o],[v,o]]], +[ + [[n,add],[[],[v,th],[v,l],[v,l],[v,t],[v,t],[v,th],[v,th],[v,o],[v,o]]], + [[n,add],[[v,l],[v,th],[v,m],[v,n],[v,t],[v,th],[v,o]],":-", + [ [[n,head],[[v,l],[v,h]]], + [[n,tail],[[v,l],[v,t]]], + [[n,+],[[v,h],[v,th],[v,h0]]], + [[n,wrap],[[v,h0],[v,h1]]], + [[n,append],[[v,m],[v,h1],[v,o]]], + [[n,add],[[v,t],[v,th],[v,o],[v,n],[v,t],[v,t],[v,th],[v,th],[v,o],[v,o]]] + ] + ] +],[[[[v,l], [4,5,6]],[[v,t],888],[[v,th],888],[[v,o],888]]]). + +%% do separate i,o to group of last 3 vars +%% separate i,o + + +test(65,[[n,add3],[[v,a],[v,b]]], +[ +[[n,add1],[1]], +[[n,add2],[[v,a],[v,b]],":-", +[ [[n,+],[[v,a],1,[v,b]]]]], +[[n,add3],[[v,a],[v,b]],":-", +[ [[n,+],[[v,a],1,[v,b]]]]], + + +%% give functional function base case name as arg, which it can move around using cawp not cawmp + +%% fibonacci + + +%% change back lpi, cawp verify +**/ + +test(66,[[n,addorsubtract1],[2,1,1]], +[ + [[n,addorsubtract1],[[v,a],[v,b],[v,c]],":-", + [ %%[[n,or],[[[n,addorsubtract2],[[v,a],[v,b],[v,c]]], + %%[[n,true]], + [[n,addorsubtract2],[[v,a],[v,b],[v,d]]],%%] + [[n,=],[[v,d],[v,c]]] + ] + ], + [[n,addorsubtract2],[[v,a],[v,b],[v,c]],":-", + [ [[n,+],[[v,a],[v,b],[v,d]]],%%] + [[n,=],[[v,d],[v,c]]] + ] + ], + [[n,addorsubtract2],[[v,a],[v,b],[v,c]],":-", + [ [[n,-],[[v,a],[v,b],[v,d]]],%%] + [[n,=],[[v,d],[v,c]]] + ] + ] +],[[]]). + +test(67,[[n,addorsubtract1],[2,1,1]], +[ + [[n,addorsubtract1],[[v,a],[v,b],[v,c]],":-", + [ [[n,or],[[[n,addorsubtract2],[[v,a],[v,b],[v,c]]], + %%[[n,true]], + [[n,addorsubtract3],[[v,a],[v,b],[v,c]]]]] + ] + ], + [[n,addorsubtract2],[[v,a],[v,b],[v,c]],":-", + [ [[n,+],[[v,a],[v,b],[v,c]]] + ] + ], + [[n,addorsubtract3],[[v,a],[v,b],[v,c]],":-", + [ [[n,-],[[v,a],[v,b],[v,c]]] + ] + ] +],[[]]). 
+ + +test(68,[[n,addorsubtract1],[2,1,1]], +[ + [[n,addorsubtract1],[[v,a],[v,b],[v,c]],":-", + [ [[n,"->"],[[[n,addorsubtract2],[[v,a],[v,b],[v,c]]], + [[n,true]], + [[n,addorsubtract3],[[v,a],[v,b],[v,c]]]]] + ] + ], + [[n,addorsubtract2],[[v,a],[v,b],[v,c]],":-", + [ [[n,+],[[v,a],[v,b],[v,c]]] + ] + ], + [[n,addorsubtract3],[[v,a],[v,b],[v,c]],":-", + [ [[n,-],[[v,a],[v,b],[v,c]]] + ] + ] +],[[]]). + +test(69,[[n,add0],[2,1]], +[ + [[n,add0],[[v,a],[v,b]],":-", + [[[n,1],[[v,a],[v,c]]], + [[n,=],[[v,c],[v,b]]]]], + [[n,1],[[v,a],[v,b]],":-", + [ [[n,+],[[v,a],1,[v,c]]], + [[n,=],[[v,c],[v,b]]]]], + + [[n,1],[[v,a],[v,b]],":-", + [ [[n,-],[[v,a],1,[v,c]]], + [[n,=],[[v,c],[v,b]]]]] +],[[]]). + +test(70,[[n,add0],[1,2]], +[ + [[n,a2],[[v,a],[v,b]],":-", + [ [[n,+],[[v,a],1,[v,c]]], + [[n,=],[[v,c],[v,b]]]]], + + [[n,a3],[[v,a],[v,b]],":-", + [ [[n,-],[[v,a],1,[v,c]]], + [[n,=],[[v,c],[v,b]]]]], + + [[n,add0],[[v,a],[v,b]],":-", + [ [[n,1],[[v,a],[v,b]]]]], + + [[n,1],[[v,a],[v,b]],":-", + [ [[n,a2],[[v,a],[v,c]]], + [[n,=],[[v,c],[v,b]]]]], + + [[n,1],[[v,a],[v,b]],":-", + [ [[n,a3],[[v,a],[v,c]]], + [[n,=],[[v,c],[v,b]]]]] +],[[]]). + +test(71,[[n,add0],[1,2]], +[ + [[n,add0],[[v,a],[v,b]],":-", + [ [[n,1],[[v,a],[v,b]]]]], + + [[n,1],[[v,a],[v,b]],":-", + [ [[n,+],[[v,a],1,[v,c]]], + [[n,=],[[v,c],[v,b]]]]] +],[[]]). + +test(72,[[n,add0],[1,[v,b]]], +[ + [[n,add0],[[v,a],[v,b]],":-", + [ [[n,1],[[v,a],[v,b]]]]], + + [[n,1],[[v,a],[v,b]],":-", + [ [[n,+],[[v,a],1,[v,c]]], + [[n,=],[[v,c],[v,b]]]]] +],[[[[v,b],2]]]). + +test(73,[[n,add0],[1,1,[v,c]]], +[ + [[n,add0],[[v,a],[v,b],[v,c]],":-", + [ [[n,1],[[v,a],[v,b],[v,c]]]]], + + [[n,1],[[v,a],[v,b],[v,c]],":-", + [ [[n,+],[[v,a],[v,b],[v,d]]], + [[n,=],[[v,d],[v,c]]]]] +],[[[[v,c],2]]]). 
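
Test 68 exercises `"->"` with a fallback: with c already bound to 1, `addorsubtract2` (a+b=c) fails, so the else branch `addorsubtract3` (a-b=c) is tried. One way to read that in Python, treating each branch as a check against the bound value (an interpretation, not the interpreter's actual mechanism):

```python
def addorsubtract1(a, b, c):
    """Test 68's if-then-else: check a+b==c, else fall back to a-b==c."""
    if a + b == c:        # the "->" condition (addorsubtract2)
        return True       # then-branch: true
    return a - b == c     # else-branch: addorsubtract3

print(addorsubtract1(2, 1, 1))  # True, via the subtraction branch
```

Tests 66 and 67 cover the same ground with backtracking over two clauses and with `or` respectively.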
+ +test(74,[[n,add0],[[1,2],[v,c]]], +[ %% Result +[[n,add2],[[v,a],[v,b]],":-", + [[[n,=],[[v,a],[]]], + [[n,=],[[v,b],[]]]]], +[[n,add3],[[v,a],[v,b]],":-", + [[[n,tail],[[v,a],[v,b]]]]], +[[n,add0],[[v,a],[v,b]],":-", + [[[n,add2],[[v,a],[v,c]]], + [[n,=],[[v,c],[v,b]]]]], +[[n,add0],[[v,a],[v,b]],":-", + [[[n,add3],[[v,a],[v,c]]], + [[n,add0],[[v,c],[v,d]]], + [[n,=],[[v,d],[v,b]]]]]] +,[[[[v,c],[]]]]). + +test(75,[[n,add0],[[],[v,c]]], +[[[n,add2],[[v,a],[v,b]],":-", + [[[n,=],[[v,a],[]]], + [[n,=],[[v,b],[]]]]], +[[n,add0],[[v,a],[v,b]],":-", + [[[n,add2],[[v,a],[v,c]]], + [[n,=],[[v,a],[v,b]]]]]], +[[[[v,c],[]]]]). + +test(76,[[n,implies2],[1,[v,b]]], + +[ + [[n,implies2],[[v,a],[v,b]],":-", + [ [[n,"->"],[[[n,is],[[v,a],1]], + [[n,is],[[v,b],2]]]] + ]] + +],[[[[v,b],2]]]). + +test(77,[[n,findall1],[[1,2,3],[v,b]]], + +[ + [[n,findall1],[[v,a],[v,b]],":-", + [ [[n,findall],[[v,a1],[[n,member],[[v,a1],[v,a]]], + [v,b]]] + ]] + +],[[[[v,b],[1,2,3]]]]). + +/** +test(77a,[[n,member2a],[[1,2,3],[v,b]]], + +[ + [[n,member2a],[[v,a],[v,b]],":-", + [[[n,member],[[v,a],[v,b]]] + ]] + + +],[[[[v,b],1]],[[[v,b],2]],[[[v,b],3]]]). + +**/ + +test(78,[[n,maplist1],[[1,2,3],[v,b]]], + +[ + [[n,maplist1],[[v,a],[v,b]],":-", + [ [[n,maplist],[[n,+],[v,a],0,[v,b]]] + ]] + + +],[[[[v,b],6]]]). + + +test(79,[[n,equals41],[[1,2,3],[v,b]]], + +[ + [[n,equals41],[[v,a],[v,b]],":-", + [ [[n,equals4],[[v,a],[[v,b],"|",[v,c]]]] + ]] + +],[[[[v,b],1]]]). + +test(80,[[n,equals41],[[v,a],[v,d],[v,c],[v,b]]], + +[ + [[n,equals41],[[v,a],[v,d],[v,c],[v,b]],":-", + [ [[n,equals4],[[[1,5],2,3,4],[[[v,a],"|",[v,d]],[v,c],"|",[v,b]]]] + ]] + +],[[[[v,a],1],[[v,b],[3,4]],[[v,c],2],[[v,d],[5]]]]). + + +test(81,[[n,equals41],[[v,a],[v,c],[v,b]]], + +[ + [[n,equals41],[[v,a],[v,c],[v,b]],":-", + [ [[n,equals4],[[[[v,a],[v,c]],"|",[v,b]],[[1,2],3,4]]] + ]] + +],[[[[v,a],1],[[v,b],[3,4]],[[v,c],2]]]). 
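
Tests 79 to 81 introduce `equals4`'s `"|"` head/tail patterns. The closest native Python analogue is starred unpacking; a sketch of the flat pattern from test 82 and the nested pattern from test 80 (function names are illustrative):

```python
def split_head_tail(lst):
    """The [A|B] pattern of equals4 (test 82) as iterable unpacking."""
    a, *b = lst
    return a, b

def nested_split(lst):
    """The [[A|D],C|B] pattern from test 80."""
    (a, *d), c, *b = lst
    return a, d, c, b

print(split_head_tail([1, 2, 3, 4]))    # (1, [2, 3, 4])
print(nested_split([[1, 5], 2, 3, 4]))  # (1, [5], 2, [3, 4])
```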
+ +test(82,[[n,equals41],[[v,a],[v,b]]], + +[ + [[n,equals41],[[v,a],[v,b]],":-", + [ [[n,equals4],[[[v,a],"|",[v,b]],[1,2,3,4]]] + ]] + +],[[[[v, a], 1], [[v, b], [2, 3, 4]]]]). + +test(83,[[n,equals41]], + +[ + [[n,equals41],":-", + [ [[n,equals4],[[[v,a],[v,c],"|",[v,b],[v,d]],[1,2,3,4]]] + ]] + +],[]). + +test(84,[[n,equals41],[[v,a],[v,c],[v,b]]], + +[ + [[n,equals41],[[v,a],[v,c],[v,b]],":-", + [ [[n,equals4],[[[[v,a]],[v,c],"|",[v,b]],[[1],2,3,4]]] + ]] + +],[[[[v,a],1],[[v,b],[3,4]],[[v,c],2]]]). + +test(85,[[n,equals41],[[v,a],[v,b]]], + +[ + [[n,equals41],[[v,a],[v,b]],":-", + [ [[n,equals4],[[[v,a],"|",[v,b]],[[1,2],3,4]]] + ]] + +],[[[[v, a], [1, 2]], [[v, b], [3, 4]]]]). + +test(86,[[n,equals41],[[v,a],[v,b]]], + +[ + [[n,equals41],[[v,a],[v,b]],":-", + [ [[n,equals4],[[[v,a],"|",[[v,b]]],[1,2]]] + ]] + +],[[[[v, a], 1], [[v, b], 2]]]). + +test(87,[[n,equals41],[[v,a]]], + +[ + [[n,equals41],[[v,a]],":-", + [ [[n,equals4],[[[v,a]],[1]]] + ]] + +],[[[[v, a], 1]]]). + +test(88,[[n,equals41],[[v,a],[v,b]]], + +[ + [[n,equals41],[[v,a],[v,b]],":-", + [ [[n,equals4],[[[v,a],[v,b]],[1,2]]] + ]] + +],[[[[v, a], 1], [[v, b], 2]]]). + +test(89,[[n,equals41],[[v,a],[v,b]]], + +[ + [[n,equals41],[[v,a],[v,b]],":-", + [ [[n,equals4],[[[v,a],[v,b]],[[1,3],2]]] + ]] + +],[[[[v, a], [1, 3]], [[v, b], 2]]]). + + +test(90,[[n,equals41]], + +[ + [[n,equals41],":-", + [ [[n,equals4],[[[v,a],[v,c],"|",[v,b],"|",[v,d]],[1,2,3,4]]] + ]] + +],[]). + +test(91,[[n,equals41],[[1,2,3]]], + +[ + [[n,equals41],[[v,a]],":-", + [ [[n,equals4],[[v,a],[1,2,3]]] + ]] + +],[[]]). + +test(92,[[n,equals41],[[v,a],[v,b],[v,d]]], + +[ + [[n,equals41],[[v,a],[v,b],[v,d]],":-", + [ [[n,equals4],[[[v,a],"|",[[v,b],"|",[v,d]]],[1,2,3,4]]] + ]] + +],[[[[v, a], 1], [[v, b], 2],[[v, d], [3,4]]]]). + +test(93,[[n,maplist1],[[[1],[2],[3]],[v,b]]], + +[ + [[n,maplist1],[[v,a],[v,b]],":-", + [ [[n,maplist],[[n,append],[v,a],[],[v,b]]] + ]] + + +],[[[[v,b],[1,2,3]]]]). 
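
Tests 78 and 93 use `maplist` as a left fold: test 78 folds `+` over [1,2,3] from 0 to get 6, and test 93 folds `append` over [[1],[2],[3]] from [] to flatten one level. Both collapse to `reduce` in Python, as a sketch:

```python
from functools import reduce

def maplist_fold(f, lst, init):
    """maplist as a left fold, matching tests 78 and 93."""
    return reduce(f, lst, init)

print(maplist_fold(lambda acc, x: acc + x, [1, 2, 3], 0))        # 6
print(maplist_fold(lambda acc, x: acc + x, [[1], [2], [3]], []))  # [1, 2, 3]
```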
+ +test(94,[[n,maplist1],[[[[1]],[[2]],[[3]]],[v,b]]], + +[ + [[n,maplist1],[[v,a],[v,b]],":-", + [ [[n,maplist],[[n,append],[v,a],[],[v,b]]] + ]] + + +],[[[[v,b],[[1],[2],[3]]]]]). + +test(95,[[n,findall1],[[1,2,3],[v,b]]], + +[ + [[n,findall1],[[v,a],[v,b]],":-", + [ [[n,findall],[[[v,a1],[v,a1]],[[n,member],[[v,a1],[v,a]]], + [v,b]]] + ]] + +],[[[[v,b],[[1,1],[2,2],[3,3]]]]]). + + +test(96,[[n,equals41],[1,[v,b]]], + +[ + [[n,equals41],[[v,a],[v,b]],":-", + [ [[n,equals4],[[v,b],[[v,a],[v,a]]]] + ]] + +],[[[[v, b], [1,1]]]]). + + +test(97,[[n,equals41],[[v,a]]], + +[ + [[n,equals41],[[v,a]],":-", + [ [[n,equals4],[[v,a],[1,2,3]]] + ]] + +],[[[[v,a],[1,2,3]]]]). + +test(98,[[n,equals41],[[[1,2],3,4],[v,a],[v,b]]], + +[ + [[n,equals41],[[v,c],[v,a],[v,b]],":-", + [ [[n,equals4],[[[v,a],"|",[v,b]],[v,c]]] + ]] + +],[[[[v, a], [1, 2]], [[v, b], [3, 4]]]]). + +test(99,[[n,equals41],[1,[v,b]]], + +[ + [[n,equals41],[[v,a],[v,b]],":-", + [ [[n,equals4],[[v,b],[[[v,a],[v,a]],[v,a]]]] + ]] + +],[[[[v, b], [[1,1],1]]]]). + + +test(100,[[n,equals41],[1,[v,c],[v,b]]], + +[ + [[n,equals41],[[v,a],[v,c],[v,b]],":-", + [ [[n,equals4],[[[v,c],"|",[v,b]],[[[v,a],[v,a]],[v,a]]]] + ]] + +],[[[[v,b],[1]],[[v,c],[1,1]]]]). + + +test(101,[[n,equals41],[1,[v,c],[v,b]]], + +[ + [[n,equals41],[[v,a],[v,c],[v,b]],":-", + [ [[n,equals4],[[[[v,a],[v,a]],[v,a]],[[v,c],"|",[v,b]]]] + ]] + +],[[[[v,b],[1]],[[v,c],[1,1]]]]). + + +test(102,[[n,equals41],[1,[2,3],[v,b1],[v,b2],[v,b3]]], + +[ + [[n,equals41],[[v,a],[v,d],[v,b1],[v,b2],[v,b3]],":-", + [ [[n,equals4],[[[v,a],"|",[v,d]],[[v,b1],[v,b2],[v,b3]]]] + ]] + +],[[[[v, b1], 1],[[v,b2],2],[[v,b3],3]]]). + +test(103,[[n,equals41],[1,[2,3],[v,b1],[v,b2],[v,b3]]], + +[ + [[n,equals41],[[v,a],[v,d],[v,b1],[v,b2],[v,b3]],":-", + [ [[n,equals4],[[[v,b1],[v,b2],[v,b3]],[[v,a],"|",[v,d]]]] + ]] + +],[[[[v, b1], 1],[[v,b2],2],[[v,b3],3]]]). 
+ +test(104,[[n,findall1],[[[1,2],[3,4]],[v,b]]],%[[[1,11,111],[2,22,222],[3,33,333]],[v,b]]], + +[ + [[n,findall1],[[v,a],[v,b]],":-", + [ [[n,findall],[[v,b1],[[[n,member],[[v,a1],[v,a]]], + + [[n,findall],[[v,a3],[[[n,member],[[v,a2],[v,a1]]], + [[n,+],[[v,a2],5,[v,a3]]]], + [v,b1]]]], + + [v,b]]] + ]] + +],[[[[v,b],[[6,7],[8,9]] +%[[1,11,111],[2,22,222],[3,33,333]] +]]]). + + +test(105,[[n,member2a],[[v,b],[1,11,111]]], + +[ + [[n,member2a],[[v,b],[v,a]],":-", + [ [[n,member],[[v,b],[v,a]]],[[n,cut]]] + ] + +],[[[[v,b],1]]]). + +/** + +%% Need to analyse body, test whether cut is after a statement, cut results + +test(105a,[[n,findall1],[[[1,11,111],[2,22,222],[3,33,333]],[v,b]]], + +[ + [[n,findall1],[[v,a],[v,b]],":-", + [ [[n,findall],[[v,b1],[[[n,member],[[v,a],[v,a1]]], + [[n,cut]], + + [[n,findall],[[v,a2],[[n,member],[[v,a1],[v,a2]]], + [v,b1]]] + ], + + [v,b]]] + ]] + +],[[[[v,b],[[1,11,111]]]]]). + +test(105b,[[n,findall1],[[1,2,3],[v,b]]], + +[ + [[n,findall1],[[v,a],[v,b]],":-", + [ [[n,findall],[[[v,a1],[v,a1]],[[[n,member],[[v,a],[v,a1]]],[[n,cut]]], + [v,b]]] + ]] + +],[[[[v,b],[[1,1]]]]]). + + +**/ + +test(106,[[n,call1a],[[v,b],[1,11,111]]], + +[ + [[n,call1a],[[v,b],[v,a]],":-", + [ [[n,call],[[n,member2a],[[v,b],[v,a]]]]] + ], + + [[n,member2a],[[v,b],[v,a]],":-", + [ [[n,member],[[v,b],[v,a]]],[[n,cut]]] + ] + +],[[[[v,b],1]]]). + +test(107,[[n,call1b],[[1,11,111],[v,b]]], + +[ + [[n,call1b],[[v,a],[v,b]],":-", + [ + + %[[n,equals4_on]], called algorithm isn't treated with equals4 anyway + + [[n,call],[[lang,same],same,[[n,member2a],[[v,a],[v,b]]], +[[[n,member2a],[[v,a],[v,b]],":-", + [ [[n,member],[[v,b],[v,a]]],[[n,cut]]] + ]]]], + + [[n,equals4_on]] +]] + +],[[[[v,b],1]]]). 
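+
+/**
+A sketch (not part of the test suite): test 104's nested findall in
+standard Prolog, adding 5 to every element of every sublist:
+
+?- findall(B1,
+           ( member(A1, [[1,2],[3,4]]),
+             findall(A3, ( member(A2, A1), A3 is A2 + 5 ), B1) ),
+           B).
+B = [[6,7],[8,9]].
+**/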
+ + + +test(108,[[n,call1b],[[1,11,111],[v,b]]], + %[[[n,call1b],[[[t,brackets],[[t,number],[t,number],[t,number]]],[t,number]]]], + %[[[n,call1b],[input,output]]], + +[ + %[[n,equals4_off]], + [[n,call1b],[[v,a],[v,b]],":-", + [ [[n,call],[[lang,same],same,[[n,member2a],[[v,a],[v,b]]], + [[[n,member2a],[[[t,number],[t,number],[t,number]],[t,number]]]], + [[[n,member2a],[input,output]]], + +[[[n,member2a],[[v,a],[v,b]],":-", + [ [[n,member],[[v,b],[v,a]]],[[n,cut]]] + ]]]], + [[n,cut]], + [[n,equals4_on]] + ]] + +],[[[[v,b],1]]]). + + +test(109,[[n,middle],[2,[v,b]]], +[ + [[n,middle],[[v,a],[v,b]],":-", + [ [[n,/],[[v,a],2,[v,b]]] + ]] + +],[[[[v,b],1]]]). + +test(110,[[n,level_with],[170,[v,b]]], +[ + [[n,level_with],[[v,a],[v,a]]] + +],[[[[v,b],170]]]). + +test(111,[[n,tra_las],[5,[v,a]]], +[ + [[n,tra_las],[[v,n],[v,a]],":-", + [ [[n,las],[[v,n],[],[v,b]]], + [[n,append],[["tra"],[v,b],[v,a]]] + ]], + + [[n,las],[0,[v,l],[v,l]]], + [[n,las],[[v,l],[v,m],[v,n]],":-", + [ [[n,-],[[v,l],1,[v,h]]], + [[n,append],[[v,m],["la"],[v,o]]], + [[n,las],[[v,h],[v,o],[v,n]]] + ] + ] + +],[[[[v,a],["tra","la","la","la","la","la"]]]]). + +test(112,[[n,final_gong],[5,[v,a]]], +[ + [[n,final_gong],[[v,n],[v,a]],":-", + [ [[n,-],[[v,n],1,[v,n1]]], + [[n,dashes],[[v,n1],[],[v,b]]], + [[n,append],[[v,b],["gong"],[v,a]]] + ]], + + [[n,dashes],[0,[v,l],[v,l]]], + [[n,dashes],[[v,l],[v,m],[v,n]],":-", + [ [[n,-],[[v,l],1,[v,h]]], + [[n,append],[[v,m],["-"],[v,o]]], + [[n,dashes],[[v,h],[v,o],[v,n]]] + ] + ] + +],[[[[v,a],["-","-","-","-","gong"]]]]). + +test(113,[[n,bedroom_to_garden],["bedroom",[v,b]]], +[ + [[n,bedroom_to_garden],["bedroom","garden"]] + +],[[[[v,b],"garden"]]]). 
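+
+/**
+A sketch (not part of the test suite): test 111's las/3 is a counting
+loop over an accumulator. In standard Prolog, with an L > 0 guard added
+so the recursive clause does not retry after the base case:
+
+las(0, L, L).
+las(L, M, N) :- L > 0, H is L - 1, append(M, ["la"], O), las(H, O, N).
+
+?- las(5, [], B), append(["tra"], B, A).
+A = ["tra","la","la","la","la","la"].
+**/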
+ +test(114,[[n,stop_at_top],[["-","-","-","top"],[v,a]]], +[ + [[n,stop_at_top],[[],"fail"]], + [[n,stop_at_top],[[v,l],"success"],":-", + [ [[n,head],[[v,l],"top"]] + ]], + [[n,stop_at_top],[[v,l],[v,n]],":-", + [ [[n,head],[[v,l],"-"]], + [[n,tail],[[v,l],[v,t]]], + [[n,stop_at_top],[[v,t],[v,n]]] + ] + ] + +],[[[[v,a],"success"]]]). + + +% Program finder 6 3 21 + +test(115,[[n,function],[[["n1","a"]],[["a",5]],[],[v,result]]], +[[[n,function],[[],[v,inputs2],[v,output],[v,output]]],[[n,function],[[v,input1],[v,inputs2],[v,inputs3],[v,output]],":-",[[[n,head],[[v,input1],[v,head]]],[[n,tail],[[v,input1],[v,tail]]],[[n,equals1],[[v,head],[[v,a],[v,b]]]],[[[n,string],[[v,a]]],[[n,string],[[v,b]]]],[[n,head],[[v,inputs2],[v,head1]]],[[n,tail],[[v,inputs2],[v,tail1]]],[[n,equals1],[[v,head1],[[v,b],[v,c]]]],[[[n,number],[[v,c]]]],[[n,equals2],[[v,item1],[[v,a],[v,c]]]],[[n,wrap],[[v,item1],[v,item1a]]],[[n,append],[[v,inputs3],[v,item1a],[v,item2]]],[[n,function],[[v,tail],[v,tail1],[v,item2],[v,output]]]]]] + +,[[[[v,result],[["n1", 5]]]]]). + +% Split after ".","!","?", producing "" if one of these characters is at the start + +test(116,[[n,grammar1],[".aaa.bbb.",[".","?"],[v,t]]], +%test(17,[[n,grammar1],["aaa1 ,-'! a? b! 
b.",[v,t]]], +%%test(15,[[n,compound213],["","",[["a"],1],[v,t]]], + +[ + [[n,grammar1],[[v,u],[v,cs],[v,t]],":-", + [ + [[n,compound21],[[v,u],"",[v,cs],[],[v,t]]] + %%[[n,number21],[[v,u],"","",[v,t]]] + %%[[n,compound213],["","",[["a"],1],[v,t]]] + ] + ], + + [[n,compound213],["","",[v,t],[v,t]]], + + [[n,compound213],[[v,u],[v,u],[v,t],[v,t]]], %% swapped these + + [[n,compound],[[v,cs],[v,t],[v,u]],"->", + [[[n,compound21],[[v,cs],[v,t],[v,v]]], + [[n,compound213],[[v,v],[v,u]]]]], + + [[n,compound212],["","",[v,t],[v,t]]], + + [[n,compound212],[[v,u],[v,u],[v,t],[v,t]]], + + [[n,compound21],["","",[v,cs],[],[""]]], + + [[n,compound21],[[v,cs],[v,t],[v,u]],"->", + [[[n,item],[[v,i],[v,cs]]], + [[n,code],%%[[n,stringconcat],[[v,i],".",[v,i2]]], + [[n,wrap],[[v,i],[v,itemname1]]], + [[n,append],[[v,t],[v,itemname1],[v,v]]]], + [[n,compound212],[[v,v],[v,u]]]]], + + [[n,compound21],[[v,cs],[v,t],[v,u]],"->", + [[[n,item],[[v,i],[v,cs]]],%" ", + [[n,compound21],[[v,cs],[],[v,compound1name]]], + [[n,code],%%[[n,stringconcat],[[v,i],".",[v,i2]]], + [[n,wrap],[[v,i],[v,itemname1]]], + [[n,append],[[v,t],[v,itemname1],[v,v]]], + [[n,append],[[v,v],[v,compound1name],[v,u]]]]]], +/** + [[n,item],[[v,t]],"->", + [[[n,number21],["",[v,t]]]]], +**/ + [[n,item],[[v,t],[v,cs]],"->",[[[n,word21],[[v,cs],"",[v,t]]]]], + + [[n,item],[[v,t],[v,cs]],"->",[[[n,compound],[[v,cs],[],[v,t]]]]], +/** + [[n,number212],["","",[v,t],[v,t]]], + + [[n,number212],[[v,u],[v,u],[v,t],[v,t]]], + + [[n,number21],[[v,t],[v,u]],"->", + [[v,a],[[n,code],[[n,stringtonumber],[[v,a],[v,a1]]], + [[n,number],[[v,a1]]], + [[n,stringconcat],[[v,t],[v,a],[v,v]]]], + [[n,number212],[[v,v],[v,u]]]]], + + [[n,number21],[[v,t],[v,u]],"->", + [[v,a], + [[n,code],[[n,stringtonumber],[[v,a],[v,a1]]], + [[n,number],[[v,a1]]], + [[n,stringconcat],[[v,t],[v,a],[v,v]]]], + [[n,number21],["",[v,numberstring]]], + [[n,code],[[n,stringconcat], + [[v,v],[v,numberstring],[v,u]]]]]], +**/ + 
[[n,word212],["","",[v,t],[v,t]]], + + [[n,word212],[[v,u],[v,u],[v,t],[v,t]]], + + [[n,word213],["","",[v,t],[v,t]]], + +/** + [[n,word21],[[v,t],[v,u]],"->", + [[v,a],[[n,code],[[n,stringtonumber],[[v,a],[v,a1]]], + [[n,number],[[v,a1]]], + [[n,stringconcat],[[v,t],[v,a],[v,v]]]], + [[n,word212],[[v,v],[v,u]]]]], +**/ + [[n,word21],[[v,cs],[v,t],[v,u]],"->", + [%[v,a], + [v,b],%[[n,lookahead1],[[v,cs]]], + [[n,code],%[[n,sentencechars],[[v,a]]], + [[n,finalchar],[[v,b],[v,cs]]] + %[[n,stringconcat],[[v,t],[v,a],[v,v1]]], + %[[n,stringconcat],[[v,t],[v,b],[v,v]]] + ], + [[n,word212],[[v,t],[v,u]]]]], + + [[n,word21],[[v,cs],[v,t],[v,u]],"->", + [%[v,a], + [v,b],%[[n,lookahead1],[[v,cs]]], + [[n,code],%[[n,sentencechars],[[v,a]]], + %[[n,trace]], + [[n,sentencechars],[[v,b],[v,cs]]], + [[n,stringconcat],[[v,t],[v,b],[v,v1]]] + %[[n,stringconcat],[[v,t],[v,b],[v,v]]] + ], + [[n,word213],[[v,v1],[v,u]]]]], + +/** nothing in string + [[n,word21],[[v,cs],[v,t],[v,u]],"->", + [%[v,a], + "",%[[n,lookahead],[[v,c]]], + %[[n,code],%[[n,sentencechars],[[v,a]]], + %[[n,finalchar],[[v,b],[v,cs]]] + %[[n,stringconcat],[[v,t],[v,a],[v,v1]]], + %[[n,stringconcat],[[v,t],[v,b],[v,v]]] + %], + %[[n,code],%[[n,sentencechars],[[v,b],[v,cs]]], + %[[n,stringconcat],[[v,t],[v,b],[v,v]]], + %[[n,not],[[n,finalchar],[[v,c],[v,cs]]]]], + [[n,word212],[[v,t],[v,u]]]]], +**/ +/** + [[n,word21],[[v,cs],[v,t],[v,u]],"->", + [%[v,a], + %[[n,code],[[n,trace]]], + [v,b],[[n,lookahead],[[v,c]]], + [[n,code],[[n,sentencechars],[[v,b],[v,cs]]], + [[n,stringconcat],[[v,t],[v,b],[v,v]]], + [[n,not],[[n,finalchar],[[v,c],[v,cs]]]]] + %[[n,finalchar],[[v,b],[v,cs]]] + %[[n,stringconcat],[[v,t],[v,a],[v,v1]]], + %[[n,stringconcat],[[v,t],[v,b],[v,v]]] + , + [[n,word212],[[v,v],[v,u]]]]], +**/ +/** + [[n,word21],[[v,t],[v,u]],"->", + [[v,a], + [[n,code],[[n,stringtonumber],[[v,a],[v,a1]]], + [[n,number],[[v,a1]]], + [[n,stringconcat],[[v,t],[v,a],[v,v]]]], + [[n,word21],["",[v,numberstring]]], + 
[[n,code],[[n,stringconcat], + [[v,v],[v,numberstring],[v,u]]]]]] + +**/ + [[n,word21],[[v,cs],[v,t],[v,u]],"->", + [[v,a], + [[n,code],[[n,sentencechars],[[v,a],[v,cs]]], + [[n,stringconcat],[[v,t],[v,a],[v,v]]]], + [[n,word21],[[v,cs],"",[v,wordstring]]], + [[n,code], + [[n,stringconcat],[[v,v],[v,wordstring],[v,u]]]]]], + + [[n,sentencechars],[[v,c],[v,cs]],":-", + [[[n,not],[[[n,member],[[v,c],[v,cs]]]]]]], + +/** [[n,sentencechars],[[v,c]],":-", + [[[n,letters],[[v,c]]]]], + + [[n,sentencechars],[[v,c]],":-", + [[[[n,stringtonumber],[[v,c],[v,n]]], + [[n,number],[[v,n]]]]]], + + [[n,sentencechars],[[v,c]],":-", + [[[n,=],[[v,c]," "]]]], + + [[n,sentencechars],[[v,c]],":-", + [[[n,=],[[v,c],","]]]], + + [[n,sentencechars],[[v,c]],":-", + [[[n,=],[[v,c],"-"]]]], + + [[n,sentencechars],[[v,c]],":-", + [[[n,=],[[v,c],"'"]]]], + + [[n,finalchar],[[v,c]],":-", + [[[n,=],[[v,c],"."]]]], + +**/ + [[n,finalchar],[[v,c],[v,cs]],":-", + [[[n,member],[[v,c],[v,cs]]]]], + +/** + [[n,lookahead1],[[v,c],[v,cs]],":-", %? + [[[n,member],[[v,cs],[v,c]]], + [[n,lookahead],[[v,c]]]]], + + [[n,finalchar],[[v,c]],":-", + [[[n,=],[[v,c],"!"]]]], + + [[n,finalchar],[[v,c]],":-", + [[[n,=],[[v,c],"?"]]]] + **/ + + [[n,lookahead],[[v,a],[v,a],[v,b]],":-", + [[[n,stringconcat],[[v,b],[v,d],[v,a]]]]] + +%%],[[[v,t],[["a"],1]]]). +%],[[[[v,t],["aaa1 ,-'!","a?","b!","b."]]]]). +],[[[[v,t],["","aaa","bbb"]]]]). + +test(117,[[n,grammar1],["a a. a ",[" ","."],[v,t]]], +%test(17,[[n,grammar1],["aaa1 ,-'! a? b! 
b.",[v,t]]], +%%test(15,[[n,compound213],["","",[["a"],1],[v,t]]], + +[ + [[n,grammar1],[[v,u],[v,cs],[v,t]],":-", + [ + [[n,compound21],[[v,u],"",[v,cs],[],[v,t]]] + %%[[n,number21],[[v,u],"","",[v,t]]] + %%[[n,compound213],["","",[["a"],1],[v,t]]] + ] + ], + + %[[n,compound213],["","",[v,t],[v,t]]], + + [[n,compound213],[[v,u],[v,u],[v,t],[v,t]]], %% swapped these + + [[n,compound],[[v,cs],[v,t],[v,u]],"->", + [[[n,compound21],[[v,cs],[v,t],[v,v]]], + [[n,compound213],[[v,v],[v,u]]]]], + + %[[n,compound212],["","",[v,t],[v,t]]], + + [[n,compound212],[[v,u],[v,u],[v,t],[v,t]]], + + [[n,compound21],["","",[v,cs],[],[""]]], + + [[n,compound21],[[v,cs],[v,t],[v,u]],"->", + [[[n,item],[[v,i],[v,cs]]], + [[n,code],%%[[n,stringconcat],[[v,i],".",[v,i2]]], + [[n,wrap],[[v,i],[v,itemname1]]], + [[n,append],[[v,t],[v,itemname1],[v,v]]]], + [[n,compound212],[[v,v],[v,u]]]]], + + [[n,compound21],[[v,cs],[v,t],[v,u]],"->", + [[[n,item],[[v,i],[v,cs]]],%" ", + [[n,compound21],[[v,cs],[],[v,compound1name]]], + [[n,code],%%[[n,stringconcat],[[v,i],".",[v,i2]]], + [[n,wrap],[[v,i],[v,itemname1]]], + [[n,append],[[v,t],[v,itemname1],[v,v]]], + [[n,append],[[v,v],[v,compound1name],[v,u]]]]]], +/** + [[n,item],[[v,t]],"->", + [[[n,number21],["",[v,t]]]]], +**/ + [[n,item],[[v,t],[v,cs]],"->",[[[n,word21],[[v,cs],"",[v,t]]]]], + + [[n,item],[[v,t],[v,cs]],"->",[[[n,compound],[[v,cs],[],[v,t]]]]], +/** + [[n,number212],["","",[v,t],[v,t]]], + + [[n,number212],[[v,u],[v,u],[v,t],[v,t]]], + + [[n,number21],[[v,t],[v,u]],"->", + [[v,a],[[n,code],[[n,stringtonumber],[[v,a],[v,a1]]], + [[n,number],[[v,a1]]], + [[n,stringconcat],[[v,t],[v,a],[v,v]]]], + [[n,number212],[[v,v],[v,u]]]]], + + [[n,number21],[[v,t],[v,u]],"->", + [[v,a], + [[n,code],[[n,stringtonumber],[[v,a],[v,a1]]], + [[n,number],[[v,a1]]], + [[n,stringconcat],[[v,t],[v,a],[v,v]]]], + [[n,number21],["",[v,numberstring]]], + [[n,code],[[n,stringconcat], + [[v,v],[v,numberstring],[v,u]]]]]], +**/ + 
[[n,word212],["","",[v,t],[v,t]]], + + [[n,word212],[[v,u],[v,u],[v,t],[v,t]]], + + [[n,word213],["","",[v,t],[v,t]]], + +/** + [[n,word21],[[v,t],[v,u]],"->", + [[v,a],[[n,code],[[n,stringtonumber],[[v,a],[v,a1]]], + [[n,number],[[v,a1]]], + [[n,stringconcat],[[v,t],[v,a],[v,v]]]], + [[n,word212],[[v,v],[v,u]]]]], +**/ + [[n,word21],[[v,cs],[v,t],[v,u]],"->", + [%[v,a], + [v,b],%[[n,lookahead1],[[v,cs]]], + [[n,code],%[[n,sentencechars],[[v,a]]], + [[n,finalchar],[[v,b],[v,cs]]], + [[n,stringconcat],[[v,t],[v,b],[v,v1]]] + %[[n,stringconcat],[[v,t],[v,b],[v,v]]] + ], + [[n,word212],[[v,v1],[v,u]]]]], + + [[n,word21],[[v,cs],[v,t],[v,u]],"->", + [%[v,a], + [v,b],[[n,lookahead],[[v,c]]], + [[n,code], + %[[n,trace]], + [[n,finalchar],[[v,c],[v,cs]]], + % + [[n,sentencechars],[[v,b],[v,cs]]], + [[n,stringconcat],[[v,t],[v,b],[v,v1]]] + %[[n,stringconcat],[[v,t],[v,b],[v,v]]] + ], + [[n,word212],[[v,v1],[v,u]]]]], + + [[n,word21],[[v,cs],[v,t],[v,u]],"->", + [%[v,a], + [v,b],%[[n,lookahead],[[v,c]]], + [[n,code], + %[[n,trace]], + %[[n,sentencechars],[[v,c],[v,cs]]], + % + [[n,sentencechars],[[v,b],[v,cs]]], + [[n,stringconcat],[[v,t],[v,b],[v,v1]]] + %[[n,stringconcat],[[v,t],[v,b],[v,v]]] + ], + [[n,word213],[[v,v1],[v,u]]]]], + +/** nothing in string + [[n,word21],[[v,cs],[v,t],[v,u]],"->", + [%[v,a], + "",%[[n,lookahead],[[v,c]]], + %[[n,code],%[[n,sentencechars],[[v,a]]], + %[[n,finalchar],[[v,b],[v,cs]]] + %[[n,stringconcat],[[v,t],[v,a],[v,v1]]], + %[[n,stringconcat],[[v,t],[v,b],[v,v]]] + %], + %[[n,code],%[[n,sentencechars],[[v,b],[v,cs]]], + %[[n,stringconcat],[[v,t],[v,b],[v,v]]], + %[[n,not],[[n,finalchar],[[v,c],[v,cs]]]]], + [[n,word212],[[v,t],[v,u]]]]], +**/ +/** + [[n,word21],[[v,cs],[v,t],[v,u]],"->", + [%[v,a], + %[[n,code],[[n,trace]]], + [v,b],[[n,lookahead],[[v,c]]], + [[n,code],[[n,sentencechars],[[v,b],[v,cs]]], + [[n,stringconcat],[[v,t],[v,b],[v,v]]], + [[n,not],[[n,finalchar],[[v,c],[v,cs]]]]] + %[[n,finalchar],[[v,b],[v,cs]]] + 
%[[n,stringconcat],[[v,t],[v,a],[v,v1]]], + %[[n,stringconcat],[[v,t],[v,b],[v,v]]] + , + [[n,word212],[[v,v],[v,u]]]]], +**/ +/** + [[n,word21],[[v,t],[v,u]],"->", + [[v,a], + [[n,code],[[n,stringtonumber],[[v,a],[v,a1]]], + [[n,number],[[v,a1]]], + [[n,stringconcat],[[v,t],[v,a],[v,v]]]], + [[n,word21],["",[v,numberstring]]], + [[n,code],[[n,stringconcat], + [[v,v],[v,numberstring],[v,u]]]]]] + +**/ + [[n,word21],[[v,cs],[v,t],[v,u]],"->", + [[v,a], + % lookahead sent + %[[n,lookahead],[[v,c]]], + [[n,code],[[n,sentencechars],[[v,a],[v,cs]]], + [[n,stringconcat],[[v,t],[v,a],[v,v]]]], + [[n,word21],[[v,cs],"",[v,wordstring]]], + [[n,code], + [[n,stringconcat],[[v,v],[v,wordstring],[v,u]]]]]], + + [[n,sentencechars],[[v,c],[v,cs]],":-", + [[[n,not],[[[n,member],[[v,c],[v,cs]]]]]]], + +/** [[n,sentencechars],[[v,c]],":-", + [[[n,letters],[[v,c]]]]], + + [[n,sentencechars],[[v,c]],":-", + [[[[n,stringtonumber],[[v,c],[v,n]]], + [[n,number],[[v,n]]]]]], + + [[n,sentencechars],[[v,c]],":-", + [[[n,=],[[v,c]," "]]]], + + [[n,sentencechars],[[v,c]],":-", + [[[n,=],[[v,c],","]]]], + + [[n,sentencechars],[[v,c]],":-", + [[[n,=],[[v,c],"-"]]]], + + [[n,sentencechars],[[v,c]],":-", + [[[n,=],[[v,c],"'"]]]], + + [[n,finalchar],[[v,c]],":-", + [[[n,=],[[v,c],"."]]]], + +**/ + [[n,finalchar],[[v,c],[v,cs]],":-", + [[[n,member],[[v,c],[v,cs]]]]], + +/** + [[n,lookahead1],[[v,c],[v,cs]],":-", %? + [[[n,member],[[v,cs],[v,c]]], + [[n,lookahead],[[v,c]]]]], + + [[n,finalchar],[[v,c]],":-", + [[[n,=],[[v,c],"!"]]]], + + [[n,finalchar],[[v,c]],":-", + [[[n,=],[[v,c],"?"]]]] + **/ + + [[n,lookahead],[[v,a],[v,a],[v,b]],":-", + [%[[n,trace]], + [[n,stringconcat],[[v,b],[v,d],[v,a]]], + +[[n,string_length],[[v,b],1]] + + %[[n,trace]], + %[[n,string_length],[[v,b],1]]]] + ]] + +%%],[[[v,t],[["a"],1]]]). +%],[[[[v,t],["aaa1 ,-'!","a?","b!","b."]]]]). +],[[[[v,t],["a", " ", " ", " ", "a", ".", " ", "a", " "]]]]). 
+ +% a types state machine + +% v1 checked by type1 etc + +/** + [[[n,connect_cliques],[[t,list2],[t,list2],[t,list2]]], + [[t,item],[[t,number]]], + [[t,item],[[t,string]]], + [[t,list2],[[[t,list],[[t,item]]]]], + [[t,list2],[[[t,list],[[t,list2]]]]]], + [[[n,connect_cliques],[input,input,output]]], + +**/ + +% list type accepts whole not partial repeats of types + +/** +test(118,[[n,connect_cliques_types],[[["a",1],[1,2],[2,"b"]],[["a",3],[3,4],[4,"b"]],[["a",1],[1,2],[2,"b"],["a",3],[3,4],[4,"b"]]]], + +[ + [[n,connect_cliques_types],[[v,vtp1],[v,vtp2],[v,vtp3]],":-", +[ [[n,list2_types],[[v,vtp1]]], + [[n,list2_types],[[v,vtp2]]], + [[n,list2_types],[[v,vtp3]]]]], % not in check types on entry, just exit + +% t->n + +[[[n,types],[[v,vt],[[v,vtp1]]]],":-", + +% like curr type checker +% put up with non (what would appear in lpi) type checking code in trace, will also affect findall, maplist, forall, intersection, etc. +% - i.e. don't use trace1, notrace1 +% x only trace1s wanted type checking trace statements x trace all if on x trace1 switched off whenever unnecessary + +*** + + +[[n,item_types],[[v,vtp1]],":-", +[ [[n,number_types],[[v,vtp1]]]]], + +[[n,item_types],[[v,vtp1]],":-", +[ [[n,string_types],[[v,vtp1]]]]], + +[[n,list2_types],[[v,vtp1]],":-", +[ [[n,list_types],[[v,vtp1]]]]], + + +[[n,list_types],[[v,vtp1]],":-", +[ %[[n,brackets_type],[[[[t,list_types],[[t,item_types]]]],[[v,vtp1]]]] ** % make types args rather than in sm +[[n,list_type],[[[t,item_types]]]],[[v,vtp1]]]] ** % make types args rather than in sm + +[[n,brackets_type],[[[v,vt1]],[[v,vtp1]]],":-", % could be vt1 vt2 vtp1 vtp2 + +[ [[n,notrace1]], +% repeats items until finished + [[n,list_types_1],[[v,vt1],[v,vtp1]]], % ["a",1] + [[n,trace1]], + [[n,list_types],[[n,list_types],[[v,vtp1]]]]]], + +[[n,list_types_1],[[]]], +[[n,list_types_1],[[v,vtp1]],":-", +[ [[n,equals4],[[v,vtp1],[[[v,vtp2],"|",[v,vtp3]]]]], + [[n,trace1]], + [[n,item_types],[[v,vtp2]]], + [[n,notrace1]], + 
[[n,list_types_1],[[v,vtp3]]]]], + +[[n,list2_types],[[v,vtp1]],":-", +[ [[n,list2_types],[[v,vtp1]]]]], + +[[n,trace1],":-", +[ [[n,trace]]]], + +[[n,notrace1],":-", +[ [[n,notrace]]]] + +],[[]]). +**/ + +% trace on for writelns + + + +%test(118,[[n,extract_modes2],[[[t,string],[t,string]],[],[v,typestatements3],["yes","yes"],[],[v,vars3],[input,output]]], +test(118,%[[n,types],["on"]], +[[n,query_pred]], +%[[n,is_list],[[1,2,3]]], +%[[n,extractmodes2],[[[t,string]],[],[v,typestatements3],["yes"],[],[v,vars3],[out]]], + +/*[[n,checktypes_inputs], + +[[n,pred],["yes","yes"], + +[[[n,pred],[[t,string],[t,string]]]], + +[[[n,pred],[in,out]]]]], + + +[[n,checktypes_inputs], + +[["n","want_baby"],["yes","yes","yes","yes"], + +[[["n","want_baby"],[["t","string"],["t","string"],["t","string"],["t","string"]]]], + +[[["n","want_baby"],["in","in","in","out"]]]]], +*/ +%/** *** THESE +%/* +[ + [[n,query_pred],":-", + [[[n,equals4_on]], + +[[n,checktypes_inputs], + +[[n,want_baby],["yes"], + +[[[n,want_baby],[[t,string]]]], + +[[[n,want_baby],[in]]]]], + + [[n,equals4_on]] + +]], +%*/ +%**/ + +% the type checker sm is better than the type command anyway because it will work with skip and retry in trace +% - use normal trace, notrace on checktypes, works with skip, retry (trace before checktypes, if exits or fails, turns off trace) +% later: $ trace status to display + + +[[n,types],["off"]], % need assertz command in ssi, not in lpi + +[[n,checktypes_inputs],[[v,function],[v,vars1],[v,typestatements1],[v,modestatements1]],":-", % need these last 2 vars for out check as well +[ + [[n,"->"],[[[n,types],["on"]], + %[[n,typestatements],[[v,typestatements1]]],[[n,modestatements],[[v,modestatements1]]], + [[n,checktypes0_inputs],[[v,function],[v,vars1],[v,typestatements1],[v,modestatements1]]],[[n,true]]]], + [[n,cut]] +]], +[[n,checktypes0_inputs],[[v,function],[v,vars1],[v,typestatements1],[v,modestatements1]],":-", +[ + [[n,length],[[v,vars1],[v,l]]], + [[n,is],[[v,l],0]], + 
[[n,equals4],[[v,vars1],[]]], + [[n,get_lang_word],["in type check",[v,input_type_check]]], % need this command + [[n,"->"],[[[n,types],["on"]],[[n,debug_types_call],[[[v,function],"/","~",[v,l],[v,input_type_check]]]],[[n,true]]]], + [[n,"->"],[[[n,types],["on"]],[[n,debug_call],[[v,skip],[[v,function],[v,vars1]]]],[[n,true]]]], + [[n,"->"],[[[n,types],["on"]],[[n,debug_exit],[[v,skip],[[v,function],[v,vars1]]]],[[n,true]]]], + [[n,"->"],[[[n,types],["on"]],[[[n,debug_types_exit],[[[v,function],"/","~",[v,l],[v,input_type_check]]]]],[[n,true]]]], + [[n,cut]] +]], +[[n,checktypes0_inputs],[[v,function],[v,vars1],[v,typestatements1],[v,modestatements1]],":-", +[ + [[n,length],[[v,vars1],[v,l]]], + [[n,get_lang_word],["in type check",[v,input_type_check]]], + [[n,"->"],[[[n,types],["on"]],[[[n,debug_types_call],[[[v,function],"/","~",[v,l],[v,input_type_check]]]]],[[n,true]]]], + + %[[n,trace]], + [[[[n,member3],[[[v,function],"|",[[v,typestatements2]]],[v,typestatements1]]],[[n,member3],[[[v,function],"|",[[v,modestatements2]]],[v,modestatements1]]],[[n,extractmodes1],[[v,typestatements2],[v,typestatements3],[v,vars1],[v,vars2],[v,modestatements2]]],[[n,"->"],[[[n,types],["on"]],[[n,debug_call],[[v,skip],[[v,function],[v,vars2]]]],[[n,true]]]],[[n,"->"],[[[[n,checktypes1],[[v,vars2],[v,typestatements3],[v,typestatements3],[v,typestatements1]]]],[[[[n,"->"],[[[n,types],["on"]],[[n,debug_exit],[[v,skip],[[v,function],[v,vars2]]]],[[n,true]]]],[[n,"->"],[[[n,types],["on"]],[[[n,debug_types_exit],[[[v,function],"/","~",[v,l],[v,input_type_check]]]]],[[n,true]]]]]],[[[[n,"->"],[[[n,types],["on"]],[[n,debug_fail],[[v,skip],[[v,function],[v,vars1]]]],[[n,true]]]],[[n,"->"],[[[n,types],["on"]],[[[n,debug_types_fail],[[[v,function],"/","~",[v,l],[v,input_type_check]]]]],[[n,true]]]]]]]]]], + [[n,cut]] +]], +[[n,extractmodes1],[[v,typestatements1],[v,typestatements3],[v,vars1],[v,vars2],[v,modestatements1]],":-", +[ + 
[[n,extractmodes2],[[v,typestatements1],[],[v,typestatements3],[v,vars1],[],[v,vars2],[v,modestatements1]]], + [[n,cut]] +]], + +[[n,extractmodes2],[[],[v,typestatements2a],[v,typestatements2a],[],[v,vars],[v,vars],[]],":-", +[ + [[n,cut]] +]], +[[n,extractmodes2],[[v,typestatements1],[v,typestatements2a],[v,typestatements3],[v,vars1],[v,vars2],[v,vars3],[v,modestatements1]],":-", +[ + [[n,get_lang_word],["in",[v,in]]], + + %[[n,trace]], + %[[n,writeln],[[[v,typestatements1],[v,typestatements2a],[v,typestatements3],[v,vars1],[v,vars2],[v,vars3],[v,modestatements1]]]], + + %[[n,writeln],[[v,modestatements1]]], + + [[n,equals4],[[v,modestatements1],[[v,in],"|",[v,modestatements3]]]], + [[n,equals4],[[v,typestatements1],[[v,typestatements2],"|",[v,typestatements3a]]]], + [[n,equals4],[[v,vars1],[[v,vars11],"|",[v,vars12]]]], + + %[[n,trace]], + [[n,append],[[v,typestatements2a],[[v,typestatements2]],[v,typestatements4]]], + [[n,append],[[v,vars2],[[v,vars11]],[v,vars4]]], + [[n,extractmodes2],[[v,typestatements3a],[v,typestatements4],[v,typestatements3],[v,vars12],[v,vars4],[v,vars3],[v,modestatements3]]] + ,[[n,cut]] %** +]], +[[n,extractmodes2],[[v,typestatements1],[v,typestatements2a],[v,typestatements3],[v,vars1],[v,vars2],[v,vars3],[v,modestatements1]],":-", +[ + [[n,get_lang_word],["out",[v,out]]], + [[n,equals4],[[v,modestatements1],[[v,out],"|",[v,modestatements3]]]], + [[n,equals4],[[v,typestatements1],[[v,typestatements2],"|",[v,typestatements3a]]]], + [[n,equals4],[[v,vars1],[[v,vars11],"|",[v,vars12]]]], + [[n,extractmodes2],[[v,typestatements3a],[v,typestatements2a],[v,typestatements3],[v,vars12],[v,vars2],[v,vars3],[v,modestatements3]]], + [[n,cut]] +]], +[[n,checktypes],[[v,function],[v,vars1],[v,typestatements1],[v,modestatements1]],":-", +[ + [[n,"->"],[[[n,types],["on"]],[[[[n,typestatements],[[v,typestatements1]]],[[n,checktypes0],[[v,function],[v,vars1],[v,typestatements1]]]]],[[n,true]]]], + [[n,cut]] +]], 
+[[n,checktypes0],[[v,function],[v,vars1],[v,typestatements1]],":-", +[ + [[n,get_lang_word],["Type check",[v,type_check]]], + [[n,length],[[v,vars1],[v,l]]], + [[n,is],[[v,l],0]], + [[n,equals4],[[v,vars1],[]]], + [[n,"->"],[[[n,types],["on"]],[[[n,debug_types_call],[[[v,function],"/",[v,l],[v,type_check]]]]],[[n,true]]]], + [[n,"->"],[[[n,types],["on"]],[[n,debug_call],[[v,skip],[[v,function],[v,vars1]]]],[[n,true]]]], + [[n,"->"],[[[n,types],["on"]],[[n,debug_exit],[[v,skip],[[v,function],[v,vars1]]]],[[n,true]]]], + [[n,"->"],[[[n,types],["on"]],[[[n,debug_types_exit],[[[v,function],"/",[v,l],[v,type_check]]]]],[[n,true]]]], + [[n,cut]] +]], +[[n,checktypes0],[[v,function],[v,vars1],[v,typestatements1]],":-", +[ + [[n,get_lang_word],["Type check",[v,type_check]]], + [[n,length],[[v,vars1],[v,l]]], + [[n,"->"],[[[n,types],["on"]],[[[n,debug_types_call],[[[v,function],"/",[v,l],[v,type_check]]]]],[[n,true]]]], + [[n,"->"],[[[n,types],["on"]],[[n,debug_call],[[v,skip],[[v,function],[v,vars1]]]],[[n,true]]]], + [[n,"->"],[[[[[n,member3],[[[v,function],"|",[[v,typestatements2]]],[v,typestatements1]]],[[n,checktypes1],[[v,vars1],[v,typestatements2],[v,typestatements2],[v,typestatements1]]]]],[[[[n,"->"],[[[n,types],["on"]],[[n,debug_exit],[[v,skip],[[v,function],[v,vars1]]]],[[n,true]]]],[[n,"->"],[[[n,types],["on"]],[[[n,debug_types_exit],[[[v,function],"/",[v,l],[v,type_check]]]]],[[n,true]]]]]],[[[[n,"->"],[[[n,types],["on"]],[[n,debug_fail],[[v,skip],[[v,function],[v,vars1]]]],[[n,true]]]],[[n,"->"],[[[n,types],["on"]],[[[n,debug_types_fail],[[[v,function],"/",[v,l],[v,type_check]]]]],[[n,true]]]]]]]], + [[n,cut]] +]], +[[n,checktypes1],[[],[],[v,u1],[v,u2]],":-", +[ + [[n,cut]] +]], +[[n,checktypes1],[[v,vars1],[v,typestatements1],[v,typestatements2],[v,typestatements4]],":-", +[ + [[n,get_lang_word],["t",[v,t]]], + [[n,get_lang_word],["list",[v,dbw_list]]], + [[n,equals4],[[v,vars1],[[v,vars2],"|",[v,vars3]]]], + + %[[n,trace]], + + [[n,is_list],[[v,vars2]]], + 
+ %[[n,writeln],[[[v,typestatements1]]]], + [[n,equals4],[[v,typestatements1],[[[[v,t],[v,dbw_list]],"|",[[v,typestatements3]]],"|",[v,typestatements4a]]]], + [[n,"->"],[[[n,types],["on"]],[[[n,debug_call],[[v,skip],[[[v,t],[v,dbw_list]],[v,typestatements3]]]]],[[n,true]]]], + [[n,"->"],[[[[n,checktypes3],[[v,vars2],[v,typestatements3],[v,typestatements2],[v,typestatements4]]]],[[[[n,"->"],[[[n,types],["on"]],[[[n,debug_exit],[[v,skip],[[[v,t],[v,dbw_list]],[v,vars2]]]]],[[n,true]]]],[[n,checktypes1],[[v,vars3],[v,typestatements4a],[v,typestatements2],[v,typestatements4]]]]],[[n,"->"],[[[n,types],["on"]],[[[n,debug_fail],[[v,skip],[[[v,t],[v,dbw_list]],[v,vars2]]]]],[[n,true]]]]]] +]], +[[n,checktypes1],[[v,vars1],[v,typestatements1],[v,typestatements2],[v,typestatements4]],":-", +[ + [[n,get_lang_word],["t",[v,t]]], + [[n,get_lang_word],["list",[v,dbw_list]]], + [[n,equals4],[[v,typestatements1],[[[[v,t],[v,dbw_list]],"|",[[v,typestatements3]]],"|",[v,typestatements4a]]]], + [[n,"->"],[[[n,types],["on"]],[[[n,debug_call],[[v,skip],[[[v,t],[v,dbw_list]],[v,typestatements3]]]]],[[n,true]]]], + [[n,"->"],[[[[n,checktypes3],[[v,vars1],[v,typestatements3],[v,typestatements2],[v,typestatements4]]]],[[n,"->"],[[[n,types],["on"]],[[n,debug_exit],[[v,skip],[[[v,t],[v,dbw_list]],[v,vars1]]]],[[n,true]]]],[[n,"->"],[[[n,types],["on"]],[[n,debug_fail],[[v,skip],[[[v,t],[v,dbw_list]],[v,vars1]]]],[[n,true]]]]]] +]], +[[n,checktypes1],[[v,vars1],[v,typestatements1],[v,typestatements2],[v,typestatements4]],":-", +[ + [[n,get_lang_word],["t",[v,t]]], + [[n,get_lang_word],["brackets",[v,dbw_brackets]]], + [[n,equals4],[[v,typestatements1],[[[[v,t],[v,dbw_brackets]],"|",[[v,typestatements3]]],"|",[v,typestatements4a]]]], + [[n,"->"],[[[n,types],["on"]],[[n,debug_call],[[v,skip],[[[v,t],[v,dbw_brackets]],[v,typestatements3]]]],[[n,true]]]], + 
[[n,"->"],[[[[[n,equals4],[[v,vars1],[[v,vars2],"|",[v,vars3]]]],[[n,checktypes1],[[v,vars2],[v,typestatements3],[v,typestatements2],[v,typestatements4]]]]],[[[[n,"->"],[[[n,types],["on"]],[[n,debug_exit],[[v,skip],[[[v,t],[v,dbw_brackets]],[v,vars1]]]],[[n,true]]]],[[n,checktypes1],[[v,vars3],[v,typestatements4a],[v,typestatements2],[v,typestatements4]]]]],[[n,"->"],[[[n,types],["on"]],[[n,debug_fail],[[v,skip],[[[v,t],[v,dbw_brackets]],[v,vars1]]]],[[n,true]]]]]], + [[n,cut]] +]], +[[n,checktypes1],[[v,vars1],[v,typestatements0],[v,typestatements1],[v,typestatements4]],":-", +[ + [[n,equals4],[[v,vars1],[[v,vars2],"|",[v,vars3]]]], + [[n,equals4],[[v,typestatements0],[[v,typestatements2],"|",[v,typestatements3]]]], + [[n,checktypes2],[[v,vars2],[v,typestatements2],[v,typestatements1],[v,typestatements4]]], + [[n,checktypes1],[[v,vars3],[v,typestatements3],[v,typestatements1],[v,typestatements4]]] +]], +[[n,checktypes2],[[v,vars],[v,typestatements1],[v,typestatements2],[v,c]],":-", +[ + [[n,get_lang_word],["t",[v,t]]], + [[n,get_lang_word],["number",[v,dbw_number]]], + [[n,equals4],[[v,typestatements1],[[v,t],[v,dbw_number]]]], + [[n,"->"],[[[n,types],["on"]],[[n,debug_call],[[v,skip],[[[v,t],[v,dbw_number]],[v,vars]]]],[[n,true]]]], + [[n,"->"],[[[[n,number],[[v,vars]]]],[[n,"->"],[[[n,types],["on"]],[[n,debug_exit],[[v,skip],[[[v,t],[v,dbw_number]],[v,vars]]]],[[n,true]]]],[[n,"->"],[[[n,types],["on"]],[[n,debug_fail],[[v,skip],[[[v,t],[v,dbw_number]],[v,vars]]]],[[n,true]]]]]] +]], +[[n,checktypes2],[[v,vars],[v,typestatements1],[v,typestatements2],[v,u1]],":-", +[ + [[n,get_lang_word],["t",[v,t]]], + [[n,get_lang_word],["predicatename",[v,dbw_predicatename]]], + [[n,get_lang_word],["n",[v,dbw_n1]]], + [[n,=],[[v,dbw_n1],[v,dbw_n]]], + [[n,equals4],[[v,typestatements1],[[v,t],[v,dbw_predicatename]]]], + [[n,"->"],[[[n,types],["on"]],[[n,debug_call],[[v,skip],[[[v,t],[v,dbw_predicatename]],[v,vars]]]],[[n,true]]]], + 
[[n,"->"],[[[[n,equals4],[[v,vars],[[[v,dbw_n],[v,u2]]]]]],[[n,"->"],[[[n,types],["on"]],[[n,debug_exit],[[v,skip],[[[v,t],[v,dbw_predicatename]],[v,vars]]]],[[n,true]]]],[[n,"->"],[[[n,types],["on"]],[[n,debug_fail],[[v,skip],[[v,t],[v,dbw_predicatename]],[v,vars]]],[[n,true]]]]]] +]], +[[n,checktypes2],[[v,vars],[v,typestatements1],[v,typestatements2],[v,u1]],":-", +[ + [[n,get_lang_word],["t",[v,t]]], + [[n,get_lang_word],["string",[v,dbw_string]]], + + %[[n,trace]], **** + + [[n,equals4],[[v,typestatements1],[[v,t],[v,dbw_string]]]], + [[n,"->"],[[[n,types],["on"]],[[n,debug_call],[[v,skip],[[[v,t],[v,dbw_string]],[v,vars]]]],[[n,true]]]], + [[n,"->"],[[[[n,string],[[v,vars]]]],[[n,"->"],[[[n,types],["on"]],[[n,debug_exit],[[v,skip],[[[v,t],[v,dbw_string]],[v,vars]]]],[[n,true]]]],[[n,"->"],[[[n,types],["on"]],[[n,debug_fail],[[v,skip],[[[v,t],[v,dbw_string]],[v,vars]]]],[[n,true]]]]]] +]], +[[n,checktypes2],[[v,vars],[v,typestatements1],[v,typestatements2],[v,u1]],":-", +[ + [[n,get_lang_word],["t",[v,t]]], + [[n,get_lang_word],["any",[v,dbw_any]]], + [[n,equals4],[[v,typestatements1],[[v,t],[v,dbw_any]]]], + [[n,"->"],[[[n,types],["on"]],[[n,debug_call],[[v,skip],[[[v,t],[v,dbw_any]],[v,vars]]]],[[n,true]]]], + [[n,"->"],[[[[n,true]]],[[n,"->"],[[[n,types],["on"]],[[n,debug_exit],[[v,skip],[[[v,t],[v,dbw_any]],[v,vars]]]],[[n,true]]]],[[n,"->"],[[[n,types],["on"]],[[n,debug_fail],[[v,skip],[[[v,t],[v,dbw_any]],[v,vars]]]],[[n,true]]]]]] +]], +[[n,checktypes2],[[v,vars],[v,typestatements1],[v,typestatements2],[v,typestatements4]],":-", +[ + [[n,get_lang_word],["t",[v,t]]], + [[n,get_lang_word],["list",[v,dbw_list]]], + [[n,get_lang_word],["brackets",[v,dbw_brackets]]], + [[n,get_lang_word],["number",[v,dbw_number]]], + [[n,get_lang_word],["predicatename",[v,dbw_predicatename]]], + [[n,get_lang_word],["string",[v,dbw_string]]], + [[n,get_lang_word],["any",[v,dbw_any]]], + [[n,equals4],[[v,typestatements1],[[v,t],[v,type]]]], + 
[[[[n,not],[[[n,=],[[v,type],[v,dbw_list]]]]],[[n,not],[[[n,=],[[v,type],[v,dbw_brackets]]]]],[[n,not],[[[n,=],[[v,type],[v,dbw_number]]]]],[[n,not],[[[n,=],[[v,type],[v,dbw_predicatename]]]]],[[n,not],[[[n,=],[[v,type],[v,dbw_string]]]]],[[n,not],[[[n,=],[[v,type],[v,dbw_any]]]]]]], + [[n,"->"],[[[n,types],["on"]],[[n,debug_call],[[v,skip],[[[v,t],[v,type]],[v,vars]]]],[[n,true]]]], + [[n,"->"],[[[[[n,member3],[[[[v,t],[v,type]],"|",[[v,typestatements3]]],[v,typestatements4]]],[[n,"->"],[[[n,checktypes1],[[v,vars],[v,typestatements3],[v,typestatements2],[v,typestatements4]]],[[n,true]],[[n,checktypes1],[[[v,vars]],[v,typestatements3],[v,typestatements2],[v,typestatements4]]]]]]],[[n,"->"],[[[n,types],["on"]],[[n,debug_exit],[[v,skip],[[[v,t],[v,type]],[v,vars]]]],[[n,true]]]],[[n,"->"],[[[n,types],["on"]],[[n,debug_fail],[[v,skip],[[[v,t],[v,type]],[v,vars]]]],[[n,true]]]]]] +]], +[[n,checktypes3],[[],[v,u1],[v,typestatements2],[v,u2]],":-", +[ + [[n,cut]] +]], +[[n,checktypes3],[[v,vars],[v,typestatements3],[v,typestatements2],[v,typestatements6]],":-", +[ + [[n,length],[[v,typestatements3],[v,l]]], + [[n,length],[[v,l1],[v,l]]], + [[n,append],[[v,l1],[v,l2],[v,vars]]], + [[n,checktypes1],[[v,l1],[v,typestatements3],[v,typestatements2],[v,typestatements6]]], + [[n,checktypes3],[[v,l2],[v,typestatements3],[v,typestatements2],[v,typestatements6]]], + [[n,cut]] +]], + +[[n,debug_call],[[v,skip],[[v,function],[v,vars1]]],":-",[ +[[n,writeln],[["debug_call",[v,function],[v,vars1]]]] +]], +[[n,debug_exit],[[v,skip],[[v,function],[v,vars1]]],":-",[ +[[n,writeln],[["debug_exit",[v,function],[v,vars1]]]] +]], +[[n,debug_fail],[[v,skip],[[v,function],[v,vars1]]],":-",[ +[[n,writeln],[["debug_fail",[v,function],[v,vars1]]]], +[[n,fail]] +]], +[[n,debug_types_call],[[v,function]],":-",[ +[[n,writeln],[["debug_types_call",[v,function]]]] +]], +[[n,debug_types_exit],[[v,function]],":-",[ +[[n,writeln],[["debug_types_exit",[v,function]]]] +]], 
+[[n,debug_types_fail],[[v,function]],":-",[ +[[n,writeln],[["debug_types_fail",[v,function]]]], +[[n,fail]] +]], + + [[n,is_list],[[v,var]],":-", + [[[n,=],[[v,var],[]]]]], + + [[n,is_list],[[v,var]],":-", + [[[n,equals4],[[v,var],[[v,v1],"|",[v,v2]]]]]] + + +], +[[]]). + +test(119,[[n,count],[2]], +[ + [[n,count],[[v,n]],":-", + [ + [[n,=],[[v,n],1]] + ] + ], + [[n,count],[[v,n]],":-", + [ + [[n,=],[[v,n],2]] + ] + ], + [[n,count],[[v,n]],":-", + [ + [[n,=],[[v,n],3]] + ] + ] +] ,[[]]). + + +test(120,[[n,function],[1,[v,b],2,[v,a]]], +[ + [[n,function],[[v,a],[v,a],[v,b],[v,b]],":-", + [ + [[n,true]] + ] + ] +] +, [[[[v,a],2],[[v,b],1]]]). + +test(121,[[n,append1],[[v,a]]], +[ + [[n,append1],[[v,a]],":-", + [ + [[n,a],[[v,a]]] + ] + ], + [[n,a],[["a"]],":-", + [ + [[n,true]] + ]] +] +,[[[[v,a], ["a"]]]]). + + +test(122,[[n,equals4_on1]], +%test(122,[[n,compound],["[],1]",[v,u],["aa,]",b,"c",[]],[v,t]]], + +[ +/* + [[n,compound],["[]","",[],[v,t]],":-", + [ + [[n,compound],[[v,u],"",[],[v,t]]] + ] + ], +*/ +/* + [[n,compound213],["","",[v,t],[v,t]]], + + [[n,compound213],[[v,u],[v,u],[v,t],[v,t]]], + [[n,a],[[v,u],[v,u],[v,t],[v,t]]], + + [[n,compound],[[v,t],[v,u]],"->", + ["[",[[n,a],[[v,t],[v,v]]], + "]", + [[n,compound213],[[v,v],[v,u]]]]] + +], +*/ + [[n,equals4_on1],":-", + [[[n,equals4_on]]]] +], + +%[[[[v,u], ",1]"],[[v,t], ["aa,]",b,"c",[]]]]]). +[[]]). + + + +test(123,[[n,equals41],[[[v,b],"|",[v,c]]]], + +[ + [[n,equals41],[[1,2,3]]] + +],[[[[v,b],1],[[v,c],[2,3]]]]). + + +test(124,[[n,equals41],[[[[v,a],"|",[v,d]],[v,c],"|",[v,b]]]], + +[ + [[n,equals41],[[[1,5],2,3,4]]] + +],[[[[v,a],1],[[v,b],[3,4]],[[v,c],2],[[v,d],[5]]]]). + + +test(125,[[n,equals41],[[[[v,a],[v,c]],"|",[v,b]]]], + +[ + [[n,equals41],[[[1,2],3,4]]] + +],[[[[v,a],1],[[v,b],[3,4]],[[v,c],2]]]). + +test(126,[[n,equals41],[[[v,a],"|",[v,b]]]], + +[ + [[n,equals41],[[1,2,3,4]]] + +],[[[[v, a], 1], [[v, b], [2, 3, 4]]]]). 
+ +test(127,[[n,equals41],[[[v,a],[v,c],"|",[v,b],[v,d]]]], + +[ + [[n,equals41],[[1,2,3,4]]] + +],[]). + +test(128,[[n,equals41],[[[[v,a]],[v,c],"|",[v,b]]]], + +[ + [[n,equals41],[[[1],2,3,4]]] + +],[[[[v,a],1],[[v,b],[3,4]],[[v,c],2]]]). + +test(129,[[n,equals41],[[[v,a],"|",[v,b]]]], + +[ + [[n,equals41],[[[1,2],3,4]]] + +],[[[[v, a], [1, 2]], [[v, b], [3, 4]]]]). + +test(130,[[n,equals41],[[[v,a],"|",[[v,b]]]]], + +[ + [[n,equals41],[[1,2]]] + +],[[[[v, a], 1], [[v, b], 2]]]). + +test(131,[[n,equals41],[[[v,a]]]], + +[ + [[n,equals41],[[1]]] + +],[[[[v, a], 1]]]). + +test(132,[[n,equals41],[[[v,a],[v,b]]]], + +[ + [[n,equals41],[[1,2]]] + +],[[[[v, a], 1], [[v, b], 2]]]). + +test(133,[[n,equals41],[[[v,a],[v,b]]]], + +[ + [[n,equals41],[[[1,3],2]]] + +],[[[[v, a], [1, 3]], [[v, b], 2]]]). + + +test(134,[[n,equals41],[[[v,a],[v,c],"|",[v,b],"|",[v,d]]]], + +[ + [[n,equals41],[[1,2,3,4]]] + +],[]). + +test(135,[[n,equals41],[[1,2,3]]], + +[ + [[n,equals41],[[1,2,3]]] + +],[[]]). + +test(136,[[n,equals41],[[[v,a],"|",[[v,b],"|",[v,d]]]]], + +[ + [[n,equals41],[[1,2,3,4]]] + +],[[[[v, a], 1], [[v, b], 2],[[v, d], [3,4]]]]). + + + + +test(137,[[n,equals41],[[v,b]]], + +[ + [[n,equals41],[[v,b]],":-", + [ [[n,equals42],[[[v,b],"|",[v,c]]]] +% [ [[n,equals42],[[v,b]]] + ]], + + [[n,equals42],[[1,2,3]]] + + +],[[[[v,b],1]]]). + +test(138,[[n,equals41]], +[ + [[n,equals41],":-", + [[[n,equals4_on]]]] +], + +[[]]). + +test(139,[[n,append1],[[v,a],[v,d]]], +[ + [[n,append1],[[v,a],[v,d]],":-", + [ + %[[n,equals4_on]], + [[n,b],[[v,b]]], + [[n,c],[[v,c]]], + [[n,append],[[[v,b],[v,c]],[[v,c]],[[v,a],"|",[v,d]]]] + %[[n,equals4_off]] + ] + ], + [[n,b],["b"]], + [[n,c],["c"]] +] +,[[[[v,a], "b"],[[v,d], ["c", "c"]]]]). + +test(140,[[n,equals41],[[1,2,3],[v,b]]], + +[ + [[n,equals41],[[v,a],[v,b]],":-", + [ [[n,wrap],[[[v,a]],[v,b]]] +% [ [[n,equals42],[[v,b]]] + ]] + +],[[[[v,b],[[[1,2,3]]]]]]). 
+ +test(141,[[n,equals41],[[1,2,3],[v,b]]], + +[ + [[n,equals41],[[v,a],[v,b]],":-", + [ [[n,head],[[[v,a]],[v,b]]] +% [ [[n,equals42],[[v,b]]] + ]] + +],[[[[v,b],[1,2,3]]]]). + +test(142,[[n,equals41],[[1,2,3],[v,b]]], + +[ + [[n,equals41],[[v,a],[v,b]],":-", + [ [[n,tail],[[[v,a]],[v,b]]] +% [ [[n,equals42],[[v,b]]] + ]] + +],[[[[v,b],[]]]]). + +test(143,[[n,equals41],[[1,2,3],[v,b]]], + +[ + [[n,equals41],[[v,a],[v,b]],":-", + [ [[n,member],[[v,b],[[v,a]]]], + [[n,member],[[1,2,3],[[v,a]]]] +% [ [[n,equals42],[[v,b]]] + ]] + +],[[[[v,b],[1,2,3]]]]). + +test(144,[[n,equals41],[[1,2,3],[v,b]]], + +[ + [[n,equals41],[[v,a],[v,b]],":-", + [ [[n,member],[[v,b],[[v,a]]]] +% [ [[n,equals42],[[v,b]]] + ]] + +],[[[[v,b],[1,2,3]]]]). + +test(145,[[n,equals41],[[[1,2,3]],[v,c]]], + +[ + [[n,equals41],[[v,a],[v,c]],":-", + [ [[n,member3],[[1,"|",[v,c]],[v,a]] +% [ [[n,equals42],[[v,b]]] + ]]] + +],[[[[v,c],[2,3]]]]). + +test(146,[[n,equals41],[[[1,2,3],4,5],[v,c]]], + +[ + [[n,equals41],[[v,a],[v,c]],":-", + [ %[[n,=],[[v,a],[[1,2,3],"|",[v,c]]]], + %[[n,is],[[v,a],[[1,2,3],"|",[v,c]]]], + [[n,equals3],[[[1,2,3],"|",[v,c]],[v,a]]] +% [ [[n,equals42],[[v,b]]] + ]] + +],[[[[v,c],[4,5]]]]). + +test(147,[[n,equals41],[[v,c]]], + +[ + [[n,equals41],[[v,c]],":-", + [ [[n,equals2],[[v,c],[1,[4,5]]]] +% [ [[n,equals42],[[v,b]]] + ]] + +],[[[[v,c],[1,[4,5]]]]]). + +test(148,[[n,equals41],[[v,c]]], + +[ + [[n,equals41],[[v,a]],":-", + [ [[n,equals3],[[[v,b],"|",[v,a]],[4,5,6]]] +% [ [[n,equals42],[[v,b]]] + ]] + +],[[[[v,c],[5,6]]]]). + +test(149,[[n,equals41],[[[1,2,3]],[v,b],[v,c]]], + +[ + [[n,equals41],[[v,a],[v,b],[v,c]],":-", + [ [[n,delete],[[v,a],[1],[[v,b],"|",[v,c]]]] +% [ [[n,equals42],[[v,b]]] + ]] + +],[[[[v,b],[1,2,3]],[[v,c],[]]]]). + + +test(150,[[n,equals41],[[[4,5,6]],[v,c]]], + +[ + [[n,equals41],[[v,a],[v,b]],":-", + [ [[n,maplist],[[n,append],[v,a],[1,2,3],[[v,b],"|",[v,d]]]] + ]] + +],[[[[v,c],1]]]). 
+ +test(151,[[n,equals41],[[[6,2,3],[5]],[v,c]]], + +[ + [[n,equals41],[[v,a],[v,c]],":-", + [ [[n,sort],[[v,a],[[5],"|",[v,c]]]] +% [ [[n,equals42],[[v,b]]] + ]] + +],[[[[v,c],[[6,2,3]]]]]). + +test(152,[[n,equals41],[[6,2,3],[1,2,3],[v,c]]], + +[ + [[n,equals41],[[v,a],[v,b],[v,c]],":-", + [ [[n,intersection],[[v,a],[v,b],[[v,c],"|",[v,d]]]] +% [ [[n,equals42],[[v,b]]] + ]] + +],[[[[v,c],2]]]). + +test(153,[[n,equals41],[[[4,5,6]],[v,c]]], + +[ + [[n,equals41],[[v,a],[v,b]],":-", + [ [[n,maplist],[[n,append],[v,a],[1,2,3],[[v,b],"|",[2,3,4,5,6]]]] + ]] + +],[[[[v,c],1]]]). + + +test(154,[[n,equals41],[[[4,5,6]],[v,c]]], + +[ + [[n,equals41],[[v,a],[v,d]],":-", + [ [[n,maplist],[[n,append],[v,a],[1,2,3],[1,"|",[v,d]]]] + ]] + +],[[[[v,c],[2,3,4,5,6]]]]). + +test(155,[[n,equals41],[[v,a]]], + +[ + [[n,equals41],[[v,a]],":-", + [ [[n,equals4],[[v,a],[1,"|",[2,3]]]] + ]] + +],[[[[v,a],[1,2,3]]]]). + + +test(156,[[n,equals42],[[v,a],[[v,b],[v,c]]]], + +[ + [[n,equals42],[[[v,d],[v,e]],[v,f]],":-", + [ + [[n,equals4],[[[v,d],[v,e],[v,f]],[1,2,[3,4]]]] + ]] + +],[[[[v,a],[1,2]],[[v,b],3],[[v,c],4]]]). + +test(157,[[n,equals41],[[v,a]]], + +[ + [[n,equals41],[[[n,b],c]]] + +],[[[[v,a],[[n,b],c]]]]). + + +test(158,[[n,equals41],[[v,a]]], + +[ + [[n,equals41],[[v,a]],":-", + [ [[n,equals42],[[v,a]]] + ]], + + [[n,equals41],[1]] + +],[[[[v,a],1]]]). + + +test(159,[[n,equals41]], +[ + [[n,equals41],":-", + [[[n,equals4_on]]]] +], + +[[]]). + +% formerly test 1 + +test(160,[[n,function],[1,1,[v,c]]], +[ + [[n,function],[[v,a],[v,b],[v,c]],":-", + [ + %[[n,equals4_off]], + [[n,+],[[v,a],[v,b],[v,c]]] + ] + ] +] +,[[[[v,c], 2]]]). + +test(161,[[n,function],[1,1,[v,c]]], +[ + [[n,function],[[v,a],[v,b],[v,c]],":-", + [ + %[[n,trace2]], + [[n,function2],[[v,a],[v,b],[v,f]]], + [[n,function2],[[v,f],[v,f],[v,c]]] + + ] + ], + [[n,function2],[[v,a],[v,f],[v,h]],":-", + [ + [[n,+],[[v,a],[v,f],[v,h]]] + ] + ] +] +,[[[[v,c], 4]]]). 
+ + +test(162,[[n,equals41],[[v,a],[v,b],[v,c]]], + +[ + [[n,equals41],[[v,a],[v,b],[v,c]],":-", + [ [[n,equals42],[[v,a],[[v,b],[v,c]]]] + ]], + + [[n,equals42],[[[v,d],[v,e]],[v,f]],":-", + [ %[[n,trace2]], + [[n,equals4],[[[v,d],[v,e],[v,f]],[1,2,[3,4]]]] + ]] + + %[[n,equals42],[[1,2,[3,4]]]] + +],[[[[v,a],[1,2]],[[v,b],3],[[v,c],4]]]). + +test(163,[[n,function1],[[v,a],[v,a],[v,b],[v,b]]], +[ + [[n,function1],[[v,b],[v,b],[v,a],[v,a]],":-", + [ + [[n,function],[1,[v,b],2,[v,a]]] + ] + ], + + [[n,function],[[v,a],[v,a],[v,b],[v,b]],":-", + [ + [[n,true]] + ] + ] + +] +,[[[[v,a],1],[[v,b],2]]]). + + +test(164,[[n,function1],[[v,a],[v,a],[v,b],[v,b]]], +[ + [[n,function1],[[v,b],[v,a],[v,a],[v,b]],":-", % should fail + [ + [[n,function],[1,[v,b],2,[v,a]]] + ] + ], + + [[n,function],[[v,a],[v,a],[v,b],[v,b]],":-", + [ + [[n,true]] + ] + ] + +] +,[]). + +test(165,[[[n,equals4],[[v,a],[1,2]]],[[n,equals4],[[v,b],[0,"|",[v,a]]]]], +[], +[[[[v,a],[1,2]],[[v,b],[0,1,2]]]]). + +test(166,[[[n,equals4],[[v,a],0]],[[n,equals4],[[v,b],[[v,a],"|",[1,2]]]]], +[], +[[[[v,a],0],[[v,b],[0,1,2]]]]). + +test(167,[[n,equals4],[[[v, a], "|", [v, b]], [1, "|", 2]]], +[], +[[[[v, a], 1], [[v, b], 2]]]). + +test(168,[[n,function1],[[v,a],"|",[v,b]]], +[ + [[n,function1],[1,"|",2]] +] +,[[[[v, a], 1], [[v, b], 2]]]). + +test(169,[[n,function1],[[v,a],[v,b]]], +[ + [[n,function1],[[v,a],[v,b]],":-", + [ + [[n,equals4],[[[v,a],[v,a]],[[1,[v,b]],[1,1]]]] + ]] +] +,[[[[v, a], [1,1]], [[v, b], 1]]]). + +test(170,[[n,function1],[[v,a],[v,a]]], +[ + [[n,function1],[[1,[v,b]],[1,1]]] +] +,[[[[v, a], [1,1]]]]). + +test(171,[[n,function1],[[[]]]], +[ + [[n,function1],[[]]] +] +,[]). + +test(172,[[n,equals4],[[],[[]]]], +[], +[]). + +test(173,[[n,equals4],[[[v,d],[v,d],[v,a],[v,b],[v,a]], +[[v,e],[v,c],1,[v,c],[v,b]]]], +[], +[[[[v, a], 1], [[v, b], 1], [[v, c], 1], [[v, d], 1], [[v, e], 1]]]). 
+ + +test(174,[[n,function1],[[v,e],[v,c],[v,a],[v,c],[v,a]]], +[ + [[n,function1],[[v,d],[v,d],1,[v,b],[v,b]]] +] +,[[[[v, a], 1], [[v, c], 1], [[v, e], 1]]]). + +test(175,[[n,equals4], + [ + [[[v,e1],[v,e2]] ,[v,c],[v,a],[v,c],[v,a],[v,e1]], + [[v,d],[v,d],[[1,1],[1,1]],[v,b],[v,b],[[v,f],[v,f2]]] + ]], +[], +[[[[v,a],[[1,1],[1,1]]],[[v,b],[[1,1],[1,1]]],[[v,c],[[1,1],[1,1]]],[[v,d],[[1,1],[1,1]]],[[v,e1],[1,1]],[[v,e2],[1,1]],[[v,f],1],[[v,f2],1]]]). + + + +test(176,[[n,function1], + [[[v,e1],[v,e2]] ,[v,c],[v,a],[v,c],[v,a],[v,e1]]], + +[ + [[n,function1], + [[v,d] ,[v,d],[[1,1],[1,1]],[v,b],[v,b],[[v,f],[v,f2]]]] +], +[[[[v, a], [[1,1],[1,1]]], [[v, c], [[1,1],[1,1]]], [[v, e1], [1,1]],[[v, e2], [1,1]]]]). + +test(177,[[n,equals4], + [[[[v,e1],[v,e2]] ,"|",[[v,c],[v,a],[v,c],[v,a],[v,e1]]], + [[v,d] ,"|",[[v,d],[[1,1],[1,1]],[v,b],[v,b],[[v,f],[v,f2]]]]]], +[], +[[[[v,a],[[1,1],[1,1]]],[[v,b],[[1,1],[1,1]]],[[v,c],[[1,1],[1,1]]],[[v,d],[[1,1],[1,1]]],[[v,e1],[1,1]],[[v,e2],[1,1]],[[v,f],1],[[v,f2],1]]]). + + +test(178,[[n,function1], + [[[v,e1],[v,e2]] ,"|",[[v,c],[v,a], [v,c],[v,a],[v,e1]]]], + +[ + [[n,function1], + [[v,d], "|",[[v,d],[[1,1],[1,1]],[v,b],[v,b],[[v,f],[v,f2]]]]] +], +[[[[v, a], [[1,1],[1,1]]], [[v, c], [[1,1],[1,1]]], [[v, e1], [1,1]],[[v, e2], [1,1]]]]). + +test(179,[[n,add0]], +[ +/* + [[n,add0],":-", + [ [[n,add1],[[v,a]]], + [[n,writeln],["a"]]]], + + [[n,add1],[[v,a]],":-", + [ [[n,add2],[[v,a]]]]], + */ + [[n,add0],":-", + [ %[[n,trace2]], + [[n,fail]]]] +] +,[]). + +test(180,[[n,add0]], +[ +/* + [[n,add0],":-", + [ [[n,add1],[[v,a]]], + [[n,writeln],["a"]]]], + + [[n,add1],[[v,a]],":-", + [ [[n,add2],[[v,a]]]]], + */ + [[n,add0],":-", + [ %[[n,trace2]], + [[n,true]]]] +] +,[[]]). 
+ + +test(181,[[n,1],[[1,2],[v,b]]], +[ + %[[n,query_box],[[v,a],[v,b]],":-", + %[ [[n,1],[[v,a],[v,b]]]]], + + [[n,add2],[[v,a],[v,b]],":-", + [ [[n,=],[[v,a],[]]], + [[n,=],[[v,b],[]]]]], + [[n,add3],[[v,a],[v,b]],":-", + [ [[n,tail],[[v,a],[v,b]]]]], + + %[[n,add0],[[v,a],[v,b]],":-", + %[ [[n,1],[[v,a],[v,c]]], + % [[n,=],[[v,c],[v,b]]]]], + + [[n,1],[[v,a],[v,b]],":-", + [ [[n,add2],[[v,a],[v,c]]], + [[n,=],[[v,c],[v,b]]]]], + [[n,1],[[v,a],[v,b]],":-", + [ [[n,add3],[[v,a],[v,c]]], + [[n,1],[[v,c],[v,d]]], + [[n,=],[[v,d],[v,b]]]]] +],[[[[v, b], []]]]). + +test(182,[[n,member_try],[[1,2,3],[v,b]]], +[ + [[n,member_try],[[v,a],[v,b]],":-", + [ + [[n,member],[[v,b],[v,a]]], + [[n,equals4],[[v,b],2]]] + ] +] + +,[[[[v, b], 2]]]). + + +test(183,[[n,not1]], +[ + [[n,not1],":-", + [ + [[n,not],[[[n,equals4],[3,2]]]]%, + %[[n,true]] + ]] +] + +,[[]]). + + +test(184,[[n,not1]], +[ + [[n,not1],":-", + [ + [[n,not],[[[[n,equals4],[3,2]],[[n,equals4],[3,2]]]]], + [[n,true]] + ]] +] + +,[[]]). + + +test(185,[[n,not1]], +[ + [[n,not1],":-", + [ + [[n,not],[[[[n,equals4],[2,2]],[[n,equals4],[2,2]]]]], + [[n,true]] + ]] +] + +,[]). + +test(186,[[n,brackets1]], +[ + [[n,brackets1],":-", + [ + %[[n,true]], + %[[[n,true]],[[n,true]]] + %[[n,true]], + [[[n,true]],%[[n,true]], + [[[n,true]],[[n,true]]]] + ]] +] + +,[[]]). + +test(187,[[n,brackets1]], +[ + [[n,brackets1],":-", + [ + %[[n,true]], + %[[[n,true]],[[n,true]]] + %[[n,true]], + %*[[[n,writeln],["1"]],%[[n,true]], + %*[[[n,writeln],["2"]],[[n,writeln],["3"]]], + %[[n,writeln],["3.5"]], + [[[n,writeln],["4"]]%,[[n,writeln],["5"]] + ], + [[[n,writeln],["6"]],[[n,writeln],["7"]]] + ]] +] + +,[[]]). + +test(188,[[n,brackets1]], +[ + [[n,brackets1],":-", + [ + %[[n,true]], + %[[[n,true]],[[n,true]]] + %[[n,true]], + [[[n,true]],%[[n,true]], + [[[n,true]]%,[[n,trace2]] + ]] + ]] +] + +,[[]]). + +test(189,[[n,brackets1]], +[ + [[n,brackets1],":-", + [ [[n,true]] + ]] +] + +,[[]]). + +test(190,[[n,true]], +[] + +,[[]]). 
+ +test(191,[[n,not1]], +[ + [[n,not1],":-", + [ [[n,not],[[[n,member],["not",["would","you","like","a","walk"]]]]] + ]] +] + +,[[]]). + +/* +test(192,[[n,a]], +[ + [[n,a],":-", + [ [[[n,writeln],["1"]], + + [[n,writeln],["2"]]], + + + [[n,writeln],["3"]], + [[n,false]] + ]] +] + +,[]). + +*/ + +test(192,[[n,findall1],[[1,2,3],[v,b]]],%[[[1,11,111],[2,22,222],[3,33,333]],[v,b]]], + +[ + [[n,findall1],[[v,a],[v,b]],":-", + [ [[n,findall],[[v,b1],[[[n,member],[[v,a1],[v,a]]], + + %[[n,findall],[[v,a3],[[[n,member],[[v,a1],[v,a2]]], + [[n,+],[[v,a1],2,[v,b1]]]], + %[v,b1]]]], + + [v,b]]] + ]] + +],[[[[v,b],[3,4,5] +%[[1,11,111],[2,22,222],[3,33,333]] +]]]). + +test(193,[[n,findall1],[[[[1,2,3,4]]],[v,b]]],%[[[[1],[2]],[[3],[4]]],[v,b]]],%[[[1,11,111],[2,22,222],[3,33,333]],[v,b]]], + +[ + [[n,findall1],[[v,a],[v,b]],":-", + [ [[n,true]], + [[n,findall],[[v,b1],[[[n,member],[[v,a1],[v,a]]], + + [[n,findall],[[v,b2],[[[n,member],[[v,a2],[v,a1]]], + + [[n,findall],[[v,a4],[[[n,member],[[v,a3],[v,a2]]], + [[n,+],[[v,a3],4,[v,a4]]]], + [v,b2]]]], + [v,b1]]]], + + [v,b]]] + ]] + +],[[[[v,b],[[[5,6,7,8]]]%[[[5],[6]],[[7],[8]]] +%[[1,11,111],[2,22,222],[3,33,333]] +]]]). + + +test(194,[[n,cut1],[[v,a]]],%[[[[1],[2]],[[3],[4]]],[v,b]]],%[[[1,11,111],[2,22,222],[3,33,333]],[v,b]]], + +[ + [[n,cut1],[[v,a]],":-", + [[[n,findall],[[v,a1],[[[n,member],[[v,a1],[1,2]]]%, + %,[[n,cut]] + ],[v,a]]]]] + +],[[[[v,a],[1,2]%[[[5],[6]],[[7],[8]]] +%[[1,11,111],[2,22,222],[3,33,333]] +]]]). + + +test(195,%[[n,or12],[[v,a]]], +[[n,findall],[[v,b],[[[n,or12],[[v,b]]]],[v,a]]], +[ + [[n,or12],[1],":-",[[[n,true]]]], + [[n,or12],[2],":-",[[[n,fail]]]], + [[n,or12],[3],":-",[[[n,fail]]]] + +],[[[[v,a],[1]]]]). + +test(196,%[[n,or12],[[v,a]]], +[[n,findall],[[v,b],[[[n,or12],[[v,b]]]],[v,a]]], +[ + [[n,or12],[1]] + +],[[[[v,a],[1]]]]). + +test(197,%[[n,or12],[[v,a]]], +[[n,findall],[[v,b],[[[n,member],[[v,b],[1]]]],[v,a]]], +[ + +],[[[[v,a],[1]]]]). 
+ +test(198, +[[n,function]], + +[[[n,function],":-",[[[n,not],[[[[n,false]],[[n,false]]]]],[[n,true]]]]], + +[[]]). + +test(199, +[[n,function],[1,1,[v,a]]], +[[[n,function],[[v,a],[v,b],[v,c]],":-",[[[[n,+],[[v,a],[v,b],[v,d]]],[[n,+],[[v,d],[v,d],[v,e]]]],[[n,-],[[v,e],[v,e],[v,c]]]]]] +, +[[[[v,a],0]]]). + +test(200, +[[n,function]], +[[[n,function],":-",[[[n,or],[[[n,false]],[[n,true]]]],[[n,true]]]]], +[[]]). + +test(201, +[[n,function]], +[[[n,function],":-",[[[n,"->"],[[[n,true]],[[n,true]]]],[[n,true]]]]], +[[]]). + +test(202, +[[n,function]], +[[[n,function],":-",[[[n,"->"],[[[n,true]],[[n,true]],[[n,true]]]],[[n,true]]]]], +[[]]). + +test(203, + +%a( +%[[n,apply_all_to_all],[["m","vine"],[v,inventory4]]],% + +[[n,traverse],[2,6]], %2,8 +%[[n,apply_all_to_all],[["vine","c","k"],[v,inventory4]]], +[ +%[[n,rainforest],[[[1,9,["*"]],[2,9,["*"]],[3,9,["*"]],[4,9,["*"]],[5,9,["*"]],[6,9,["*"]],[1,8,["*"]],[2,8,["m"]],[3,8,["vine"]],[4,8,[]],[5,8,[]],[6,8,["*"]],[1,7,["*"]],[2,7,["*"]],[3,7,["*"]],[4,7,["*"]],[5,7,["*"]],[6,7,["*"]]]]], + +[[n,rainforest],[[[1,9,[ ]],[2,9,[ ]],[3,9,["*" ]],[4,9,["*" ]],[5,9,["*" ]], + [1,8,[ ]],[2,8,["*" ]],[3,8,[ ]],[4,8,[ ]],[5,8,["*" ]], + [1,7,[ ]],[2,7,["*" ]],[3,7,[ ]],[4,7,["*" ]],[5,7,[ ]], + [1,6,["*" ]],[2,6,[ ]],[3,6,["k" ]],[4,6,["*" ]],[5,6,["*" ]], + [1,5,["*" ]],[2,5,[ ]],[3,5,[ ]],[4,5,["c" ]],[5,5,["*" ]], + [1,4,[ ]],[2,4,["*" ]],[3,4,[ ]],[4,4,[ ]],[5,4,["*" ]], + [1,3,["*" ]],[2,3,[ ]],[3,3,[ ]],[4,3,["*" ]],[5,3,[ ]], + [1,2,["*" ]],[2,2,[ ]],[3,2,["vine" ]],[4,2,["*" ]],[5,2,[ ]], + [1,1,[ ]],[2,1,["*" ]],[3,1,["*" ]],[4,1,["*" ]],[5,1,[ ]]]]], + +[[n,traverse],[[v,x],[v,y]],":-", +[ + [[n,"->"], + [ + [[n,traverse],[[v,x],[v,y],[],[v,c],[],[v,a],"no",[v,b]]], + + [ + [[n,true]] + ], + + [ + [[n,false]] + ] + ]]%, + %,[[n,cut]] +]], +[[n,traverse],[[v,'_x'],[v,'_y'],[v,explored],[v,explored],[v,inventory],[v,inventory],[v,e],[v,e]],":-", +[ + [[n,"->"], + [ + [[n,equals4],[[v,e],"e"]], + + [[n,true]], + + 
[[n,fail]] + ]]%, + %,[[n,cut]] +]], +[[n,traverse],[[v,x],[v,y],[v,explored1],[v,explored2],[v,inventory1],[v,inventory2],[v,e1],[v,e2]],":-", +[ + [[n,rainforest],[[v,map]]], + [[n,member],[[[v,x],[v,y],[v,cell]],[v,map]]], + [[n,"->"], + [ + [[n,"->"], + [ + [[n,equals4],[[v,cell],["*"]]], + + [ + [[n,true]] + ], + + [ + [[n,member],[[[v,x],[v,y]],[v,explored1]]] + ] + ]], + + [ + [ + [[n,equals4],[[v,explored1],[v,explored2]]], + [[n,equals4],[[v,inventory1],[v,inventory2]]], + [[n,=],[[v,e2],[v,e1]]] + %,[[n,cut]] speeds up in ssi but crashes lpi + ] + ], + + [ + [ + [[n,writeln],[[[v,x],[v,y]]]], + [[n,"->"], + [ + [[n,equals4],[[v,cell],[]]], + + [ + [[n,equals4],[[v,inventory4a],[v,inventory1]]] + ], + + [ + [ + [[n,equals4],[[v,cell],[[v,item]]]], + [[n,append],[[v,inventory1],[[v,item]],[v,inventory3]]], + [[n,apply_all_to_all],[[v,inventory3],[v,inventory4]]], + [[n,equals4],[[v,inventory4a],[v,inventory4]]] + ] + ] + ]], + [[n,writeln],[[v,inventory4a]]], + [[n,"->"], + [ + [[n,member],["e",[v,inventory4a]]], + + [ + [ + [[n,writeln],["Game Over"]], + [[n,equals4],[[v,explored1],[v,explored2]]], + [[n,equals4],[[v,inventory2],[v,inventory4a]]], + [[n,=],[[v,e2],"e"]]%, + %,[[n,cut]] + %,[[n,trace]] + ] + ], + + [ + [ + [[n,append],[[v,explored1],[[[v,x],[v,y]]],[v,explored3]]], + [[n,-],[[v,x],1,[v,xm1]]], + [[n,-],[[v,y],1,[v,ym1]]], + [[n,+],[[v,x],1,[v,xp1]]], + [[n,+],[[v,y],1,[v,yp1]]], + [[n,traverse],[[v,xm1],[v,y],[v,explored3],[v,explored4],[v,inventory4a],[v,inventory5],[v,e1],[v,e3]]], + [[n,traverse],[[v,x],[v,ym1],[v,explored4],[v,explored5],[v,inventory5],[v,inventory6],[v,e3],[v,e4]]], + [[n,traverse],[[v,xp1],[v,y],[v,explored5],[v,explored6],[v,inventory6],[v,inventory7],[v,e4],[v,e5]]], + [[n,traverse],[[v,x],[v,yp1],[v,explored6],[v,explored2],[v,inventory7],[v,inventory2],[v,e5],[v,e2]]] + ] + ] + ]] + ] + ] + ]] +%,[[n,cut]] +]], +[[n,apply_all_to_all],[[v,inventory1],[v,inventory2]],":-", +[ +%[[n,trace2]], + [[n,findall], + [ + 
[v,item3], + + [ + [[n,member],[[v,item1],[v,inventory1]]], + [[n,member],[[v,item2],[v,inventory1]]], + [[n,not],[[[[n,equals4],[[v,item1],[v,item2]]]]]], + [[n,apply],[[v,item1],[v,item2],[v,item3]]], + [[n,not],[[[[n,member],[[v,item3],[v,inventory1]]]]]] + ], + + [v,addeditems] + ]], + [[n,"->"], + [ + [[n,equals4],[[v,addeditems],[]]], + + [ + [[n,equals4],[[v,inventory1],[v,inventory2]]] + ], + + [ + [ + [[n,append],[[v,inventory1],[v,addeditems],[v,inventory3]]], + [[n,apply_all_to_all],[[v,inventory3],[v,inventory2]]] + ] + ] + ]] +%,[[n,cut]] +]], +[[n,apply],["k","c","m"]], +[[n,apply],["m","vine","e"]] +], +[[]]). +%[[[[v,inventory4],["vine","c","k","m","e"]]]]). + +test(204, +[[n,function]], +[[[n,function],":-",[[[n,writeln],["Hello World!"]]]]], +[[]]). + +test(205,[[n,findall1],[[[1,2],[1,4]],[v,b]]],%[[[1,11,111],[2,22,222],[3,33,333]],[v,b]]], + +[ + [[n,findall1],[[v,a],[v,b]],":-", + [ [[n,findall],[[v,b1],[[[n,member],[[v,a1],[v,a]]], + + [[n,findall],[[v,a2],[[[n,member],[[v,a2],[v,a1]]], + %[[n,+],[[v,a2],5,[v,a3]]], + %[[n,/],[[v,a3],2,[v,a4]]], + %[[n,ceiling],[[v,a4],[v,a5]]], + [[n,equals4],[[v,a2],1]] + ], + [v,b1]]]], + + [v,b]]] + ]] + +],[[[[v,b],[[1],[1]] +%[[1,11,111],[2,22,222],[3,33,333]] +]]]). + +test(206,%[[n,or12],[[v,a]]], +[[n,findall],[[v,b],[[[n,or12],[1,[v,b]]]],[v,a]]], +[ + [[n,or12],[1,1]], + [[n,or12],[2,2]], + [[n,or12],[1,3]] + +],[[[[v,a],[1,3]]]]). + + +test(207,[[n,findall1],[[1,2],[v,b1]]],%[[[1,11,111],[2,22,222],[3,33,333]],[v,b]]], + +[ + [[n,findall1],[[v,a],[v,b1]],":-", + [ %[[n,findall],[[v,b1],[[[n,member],[[v,a],[v,a1]]], + + [[n,findall],[[v,a3],[[[n,member],[[v,a2],[v,a]]], + [[n,+],[[v,a2],5,[v,a3]]], + [[n,/],[[v,a3],2,[v,a4]]], + [[n,ceiling],[[v,a4],[v,a5]]], + [[n,equals4],[[v,a4],[v,a5]]]] + , + [v,b1]]]] + + %[v,b]]] + ] + +],[[[[v,b1],[6] +%[[1,11,111],[2,22,222],[3,33,333]] +]]]). 
+ +%/* +test(208,[[n,reverse1],[[1,2,3],[],[v,c],[v,b],[v,g]]],%[[[1,11,111],[2,22,222],[3,33,333]],[v,b]]], +[[[n,reverse1],[[],[v,a],[v,c],[v,b],[v,a]]], +[[n,reverse1],[[v,a],[v,b],[v,c],[v,f],[v,g]],":-",[[[n,head],[[v,a],[v,b1]]],[[n,tail],[[v,a],[v,c1]]],[[n,wrap],[[v,b1],[v,d]]],[[n,append],[[v,d],[v,b],[v,f]]],[[n,reverse1],[[v,c1],[v,f],[v,h],[v,j],[v,g]]]]]] +,[[[[v,b],[1]],[[v,g],[3,2,1]]]]). +%*/ + +test(209,[[n,a],[["a","b"],"",[v,a]]], + +[ % string concat +[[n,a],[[],[v,a],[v,a]]], +[[n,a],[[[v,e],"|",[v,d]],[v,b],[v,c]],":-", +[ + %[[n,equals4],[]], + [[n,stringconcat],[[v,b],[v,e],[v,f]]], + [[n,a],[[v,d],[v,f],[v,c]]] +]] +], + +[[[[v,a],"ab"]]]). + + + +test(210,[[n,grammar1],[[apple]]], +[ + [[n,grammar1],[[v,s]],":-", + [ + [[n,noun],[[v,s],[]]] + ] + ], + + [[n,noun],"->",[[apple]]] +],[[]]). + + + +test(211,[[n,grammar1],[[apple,banana]]], +[ + [[n,grammar1],[[v,s]],":-", + [ + [[n,noun],[[v,s],[]]] + ] + ], + + [[n,noun],"->",[[apple,banana]]] +],[[]]). + + +test(212,[[n,grammar1],[[apple,banana,carrot]]], +[ + [[n,grammar1],[[v,s]],":-", + [ + [[n,noun],[[v,s],[]]] + ] + ], + + [[n,noun],"->",[[apple,banana],[carrot]]] +],[[]]). + + +test(213,[[n,reverse1],[[1,2,3],[],[v,a]]], +[ +[[n,reverse1],[[],[v,a],[v,a]]], +[[n,reverse1],[[[v,a],"|",[v,d]],[v,b],[v,c]],":-", +[ + [[n,reverse1],[[v,d],[[v,a],"|",[v,b]],[v,c]]] +]] +],[[[[v,a],[3,2,1]]]]). + + +test(214,[[n,append1],[[1,2,3],[],[v,a]]], +[ +[[n,append1],[[],[v,a],[v,a]]], +[[n,append1],[[[v,a],"|",[v,d]],[v,b],[[v,a],"|",[v,c]]],":-", +[ + [[n,append1],[[v,d],[v,b],[v,c]]] +]] +],[[[[v,a],[1,2,3]]]]). + +% passes with occurs_check(on). +/* +test(215,[[n,function1],[[v,a]]], +[ + [[n,function1],[[[v,a]]],":-", + [ + [[n,true]]]] +] +,[]). +*/ + +test(215,[[n,findall],[[[v,a],[v,b]],[[n,append],[[v,a],[v,b],[1,2,3]]],[v,c]]], +[],[[[[v,c],[[[],[1,2,3]],[[1],[2,3]],[[1,2],[3]],[[1,2,3],[]]]]]]). 
+ +/* +% cut deletes cp data back to and including pred id call + +test(216,[[n,count],[1,[v,n]]], +[ + [[n,count],[1,[v,a]],":-",[[[n,member],[[2,3],[v,a]]],[[n,cut]],[[n,fail]]]], + [[n,count],[1,3],":-",[[[n,cut]]]] + + +] ,[[[[v,n], 2]]]). + +*/ + +test(216, + +%a( +% x[[n,apply_all_to_all],[["m","vine"],[v,inventory4]]],% +[[n,traverse],[2,8]], %2,8 or 2,6 +%[[n,apply_all_to_all],[["k","c","m","vine"],[v,inventory4]]], +%[[n,traverse],[3,9,empty,empty,empty,empty]], +[ +[[n,rainforest],[[[1,9,["*"]],[2,9,["*"]],[3,9,["*"]],[4,9,["*"]],[5,9,["*"]],[6,9,["*"]],[7,9,["*"]],[1,8,["*"]],[2,8,["k"]],[3,8,["c"]],[4,8,["vine"]],[5,8,[]],[6,8,[]],[7,8,["*"]],[1,7,["*"]],[2,7,["*"]],[3,7,["*"]],[4,7,["*"]],[5,7,["*"]],[6,7,["*"]],[7,7,["*"]]]]], +[[n,traverse],[[v,x],[v,y]],":-", +[ + [[n,"->"], + [ + [[n,traverse],[[v,x],[v,y],[],[v,c],[],[v,a],"no",[v,b]]], + + [ + [[n,true]] + ], + + [ + [[n,false]] + ] + ]]%, + %[[n,cut]] +]], +[[n,traverse],[[v,'_x'],[v,'_y'],[v,explored],[v,explored],[v,inventory],[v,inventory],[v,e],[v,e]],":-", +[ + [[n,"->"], + [ + [[n,equals4],[[v,e],"e"]], + + [[n,true]], + + [[n,fail]] + ]]%, + %[[n,cut]] +]], +[[n,traverse],[[v,x],[v,y],[v,explored1],[v,explored2],[v,inventory1],[v,inventory2],[v,e1],[v,e2]],":-", +[ + [[n,rainforest],[[v,map]]], + [[n,member],[[[v,x],[v,y],[v,cell]],[v,map]]], + [[n,"->"], + [ + [[n,"->"], + [ + [[n,equals4],[[v,cell],["*"]]], + + [ + [[n,true]] + ], + + [ + [[n,member],[[[v,x],[v,y]],[v,explored1]]] + ] + ]], + + [ + [ + [[n,equals4],[[v,explored1],[v,explored2]]], + [[n,equals4],[[v,inventory1],[v,inventory2]]], + [[n,=],[[v,e2],[v,e1]]] + ] + ], + + [ + [ + [[n,writeln],[[[v,x],[v,y]]]], + [[n,"->"], + [ + [[n,equals4],[[v,cell],[]]], + + [ + [[n,equals4],[[v,inventory4a],[v,inventory1]]] + ], + + [ + [ + [[n,equals4],[[v,cell],[[v,item]]]], + [[n,append],[[v,inventory1],[[v,item]],[v,inventory3]]], + [[n,apply_all_to_all],[[v,inventory3],[v,inventory4]]], + 
[[n,equals4],[[v,inventory4a],[v,inventory4]]] + ] + ] + ]], + [[n,writeln],[[v,inventory4a]]], + [[n,"->"], + [ + [[n,member],["e",[v,inventory4a]]], + + [ + [ + [[n,writeln],["Game Over"]], + [[n,equals4],[[v,explored1],[v,explored2]]], + [[n,equals4],[[v,inventory2],[v,inventory4a]]], + [[n,=],[[v,e2],"e"]]%, + %,[[n,cut]] + %,[[n,trace]] + ] + ], + + [ + [ + [[n,append],[[v,explored1],[[[v,x],[v,y]]],[v,explored3]]], + [[n,-],[[v,x],1,[v,xm1]]], + [[n,-],[[v,y],1,[v,ym1]]], + [[n,+],[[v,x],1,[v,xp1]]], + [[n,+],[[v,y],1,[v,yp1]]], + [[n,traverse],[[v,xm1],[v,y],[v,explored3],[v,explored4],[v,inventory4a],[v,inventory5],[v,e1],[v,e3]]], + [[n,traverse],[[v,x],[v,ym1],[v,explored4],[v,explored5],[v,inventory5],[v,inventory6],[v,e3],[v,e4]]], + [[n,traverse],[[v,xp1],[v,y],[v,explored5],[v,explored6],[v,inventory6],[v,inventory7],[v,e4],[v,e5]]], + [[n,traverse],[[v,x],[v,yp1],[v,explored6],[v,explored2],[v,inventory7],[v,inventory2],[v,e5],[v,e2]]] + ] + ] + ]] + ] + ] + ]] +]], +[[n,apply_all_to_all],[[v,inventory1],[v,inventory2]],":-", +[ + [[n,findall], + [ + [v,item3], + + [ + [[n,member],[[v,item1],[v,inventory1]]], + [[n,member],[[v,item2],[v,inventory1]]], + [[n,not],[[[[n,equals4],[[v,item1],[v,item2]]]]]], + [[n,apply],[[v,item1],[v,item2],[v,item3]]], + [[n,not],[[[[n,member],[[v,item3],[v,inventory1]]]]]] + ], + + [v,addeditems] + ]], + [[n,"->"], + [ + [[n,equals4],[[v,addeditems],[]]], + + [ + [[n,equals4],[[v,inventory1],[v,inventory2]]] + ], + + [ + [ + [[n,append],[[v,inventory1],[v,addeditems],[v,inventory3]]], + [[n,apply_all_to_all],[[v,inventory3],[v,inventory2]]] + ] + ] + ]] +]], +[[n,apply],["k","c","m"]], +[[n,apply],["m","vine","e"]] +], +[[]] +%[[[[v,inventory4],["k","c","m","vine","e"]]]] +). + +/* +test(217,[[n,a],[[v,a]]], +[[[n,a],[[v,a]],":-",[[[n,b],[[v,a]]]]], +[[n,b],[empty],":-",[[[n,true]]]] +%[[n,b],[2],":-",[[[n,true]]]]], +], +[[[[v,a],empty]]]). + +test(218,[[n,member],[[1,2],[v,a]]], +[], +[[[[v,a],1]]]). 
+ +test(219,[[n,a],[[v,a]]], +[[[n,a],[[v,a]],":-",[[[n,b],[[v,a]]]]], +[[n,b],[empty],":-",[[[n,true]]]] +%[[n,b],[2],":-",[[[n,true]]]]], +], +[[[[v,a],empty]]]). + +*/ + +test(217, + +%a( +% x[[n,apply_all_to_all],[["m","vine"],[v,inventory4]]],% +[[n,traverse],[2,6]], %2,8 or 2,6 +%[[n,apply_all_to_all],[["k","c","m","vine"],[v,inventory4]]], +%[[n,traverse],[3,9,empty,empty,empty,empty]], +[ +/*[[n,rainforest],[[ +[1,9,["*"]],[2,9,["*"]],[3,9,["*"]],[4,9,["*"]],[5,9,["*"]],[6,9,["*"]],[7,9,["*"]], +[1,8,["*"]],[2,8,["k"]],[3,8,["c"]],[4,8,["vine"]],[5,8,[]],[6,8,[]],[7,8,["*"]], +[1,7,["*"]],[2,7,["*"]],[3,7,["*"]],[4,7,["*"]],[5,7,["*"]],[6,7,["*"]],[7,7,["*"]]]]], +*/ +%/* +[[n,rainforest],[[[1,9,[ ]],[2,9,[ ]],[3,9,["*" ]],[4,9,["*" ]],[5,9,["*" ]], + [1,8,[ ]],[2,8,["*" ]],[3,8,[ ]],[4,8,[ ]],[5,8,["*" ]], + [1,7,[ ]],[2,7,["*" ]],[3,7,[ ]],[4,7,["*" ]],[5,7,[ ]], + [1,6,["*" ]],[2,6,[ ]],[3,6,["k" ]],[4,6,["*" ]],[5,6,["*" ]], + [1,5,["*" ]],[2,5,[ ]],[3,5,[ ]],[4,5,["c" ]],[5,5,["*" ]], + [1,4,[ ]],[2,4,["*" ]],[3,4,[ ]],[4,4,[ ]],[5,4,["*" ]], + [1,3,["*" ]],[2,3,[ ]],[3,3,[ ]],[4,3,["*" ]],[5,3,[ ]], + [1,2,["*" ]],[2,2,[ ]],[3,2,["vine" ]],[4,2,["*" ]],[5,2,[ ]], + [1,1,[ ]],[2,1,["*" ]],[3,1,["*" ]],[4,1,["*" ]],[5,1,[ ]]]]], +%*/ +[[n,traverse],[[v,x],[v,y]],":-", +[ + [[n,"->"], + [ + [[n,traverse],[[v,x],[v,y],[],[v,c],[],[v,a],"no",[v,b]]], + + [ + [[n,true]] + ], + + [ + [[n,false]] + ] + ]]%, + %[[n,cut]] +]], +[[n,traverse],[[v,'_x'],[v,'_y'],[v,explored],[v,explored],[v,inventory],[v,inventory],[v,e],[v,e]],":-", +[ + [[n,"->"], + [ + [[n,equals4],[[v,e],"e"]], + + [[n,true]], + + [[n,fail]] + ]]%, + %[[n,cut]] +]], +[[n,traverse],[[v,x],[v,y],[v,explored1],[v,explored2],[v,inventory1],[v,inventory2],[v,e1],[v,e2]],":-", +[ + [[n,rainforest],[[v,map]]], + [[n,member],[[[v,x],[v,y],[v,cell]],[v,map]]], + [[n,"->"], + [ + [[n,"->"], + [ + [[n,equals4],[[v,cell],["*"]]], + + [ + [[n,true]] + ], + + [ + [[n,member],[[[v,x],[v,y]],[v,explored1]]] 
+ ] + ]], + + [ + [ + [[n,equals4],[[v,explored1],[v,explored2]]], + [[n,equals4],[[v,inventory1],[v,inventory2]]], + [[n,=],[[v,e2],[v,e1]]] + ] + ], + + [ + [ + [[n,writeln],[[[v,x],[v,y]]]], + [[n,"->"], + [ + [[n,equals4],[[v,cell],[]]], + + [ + [[n,equals4],[[v,inventory4a],[v,inventory1]]] + ], + + [ + [ + [[n,equals4],[[v,cell],[[v,item]]]], + [[n,append],[[v,inventory1],[[v,item]],[v,inventory3]]], + [[n,apply_all_to_all],[[v,inventory3],[v,inventory4]]], + [[n,equals4],[[v,inventory4a],[v,inventory4]]] + ] + ] + ]], + [[n,writeln],[[v,inventory4a]]], + [[n,"->"], + [ + [[n,member],["e",[v,inventory4a]]], + + [ + [ + [[n,writeln],["Game Over"]], + [[n,equals4],[[v,explored1],[v,explored2]]], + [[n,equals4],[[v,inventory2],[v,inventory4a]]], + [[n,=],[[v,e2],"e"]]%, + %,[[n,trace]] + ] + ], + + [ + [ + [[n,append],[[v,explored1],[[[v,x],[v,y]]],[v,explored3]]], + [[n,-],[[v,x],1,[v,xm1]]], + [[n,-],[[v,y],1,[v,ym1]]], + [[n,+],[[v,x],1,[v,xp1]]], + [[n,+],[[v,y],1,[v,yp1]]], + [[n,traverse],[[v,xm1],[v,y],[v,explored3],[v,explored4],[v,inventory4a],[v,inventory5],[v,e1],[v,e3]]], + [[n,traverse],[[v,x],[v,ym1],[v,explored4],[v,explored5],[v,inventory5],[v,inventory6],[v,e3],[v,e4]]], + [[n,traverse],[[v,xp1],[v,y],[v,explored5],[v,explored6],[v,inventory6],[v,inventory7],[v,e4],[v,e5]]], + [[n,traverse],[[v,x],[v,yp1],[v,explored6],[v,explored2],[v,inventory7],[v,inventory2],[v,e5],[v,e2]]] + ] + ] + ]] + ] + ] + ]] +]], +[[n,apply_all_to_all],[[v,inventory1],[v,inventory2]],":-", +[ + [[n,findall], + [ + [v,item3], + + [ + [[n,member],[[v,item1],[v,inventory1]]], + [[n,member],[[v,item2],[v,inventory1]]], + [[n,not],[[[[n,equals4],[[v,item1],[v,item2]]]]]], + [[n,apply],[[v,item1],[v,item2],[v,item3]]], + [[n,not],[[[[n,member],[[v,item3],[v,inventory1]]]]]] + ], + + [v,addeditems] + ]], + [[n,"->"], + [ + [[n,equals4],[[v,addeditems],[]]], + + [ + [[n,equals4],[[v,inventory1],[v,inventory2]]] + ], + + [ + [ + 
[[n,append],[[v,inventory1],[v,addeditems],[v,inventory3]]], + [[n,apply_all_to_all],[[v,inventory3],[v,inventory2]]] + ] + ] + ]] +]], +[[n,apply],["k","c","m"]], +[[n,apply],["m","vine","e"]] +], +[[]] +%[[[[v,inventory4],["k","c","m","vine","e"]]]] +). + +test(218,[[n,findall1],[[[1,2],[3,4]],[v,b]]],%[[[1,11,111],[2,22,222],[3,33,333]],[v,b]]], + +[ + [[n,findall1],[[v,a],[v,b]],":-", + [ [[n,findall],[[v,b1],[[[n,member],[[v,a1],[v,a]]], + + [[n,findall],[[v,a3],[[[n,member],[[v,a2],[v,a1]]], + [[n,a],[[v,a2],[v,a3]]]], + [v,b1]]]], + + [v,b]]] + ]], + + [[n,a],[1,6]], + [[n,a],[2,7]], + [[n,a],[3,8]], + [[n,a],[4,9]] + +],[[[[v,b],[[6,7],[8,9]] +%[[1,11,111],[2,22,222],[3,33,333]] +]]]). + + +test(219,[[n,findall],[[[v,a],[v,b]],[[n,stringconcat],[[v,a],[v,b],"abc"]],[v,c]]], +[],[[[[v,c],[["","abc"],["a","bc"],["ab","c"],["abc",""]]]]]). + + +test(220,[[n,stringconcat],["a","b","ab"]], +[], +[[]]). + +test(221,[[n,stringconcat],["a","b",[v,a]]], +[], +[[[[v,a],"ab"]]]). + +test(222,[[n,stringconcat],["a",[v,a],"ab"]], +[], +[[[[v,a],"b"]]]). + +test(223,[[n,stringconcat],[[v,a],"b","ab"]], +[], +[[[[v,a],"a"]]]). + + +test(224,[[n,append],[["a"],["b"],["a","b"]]], +[], +[[]]). + +test(225,[[n,append],[["a"],["b"],[v,a]]], +[], +[[[[v,a],["a","b"]]]]). + +test(226,[[n,append],[["a"],[v,a],["a","b"]]], +[], +[[[[v,a],["b"]]]]). + +test(227,[[n,append],[[v,a],["b"],["a","b"]]], +[], +[[[[v,a],["a"]]]]). + + +test(228,[[n,findall],[[[v,a],[v,b]],[[n,append],[[v,a],[v,b],["a","b","c"]]],[v,c]]], +[],[[[[v,c],[[[],["a","b","c"]],[["a"],["b","c"]],[["a","b"],["c"]],[["a","b","c"],[]]]]]]). + +test(229,[[n,findall],[[v,a],[[n,member],[[v,a],["a","b"]]],[v,c]]], +[],[[[[v,c],["a","b"]]]]). + +test(230,[[n,append],[["a"],[v,a],[v,b]]], +[], +[[[[v,b],["a","|",[v,_]]]]]). 
+ +test(231,[[n,findall],[[[v,a],[v,c]],[[n,append],[[v,a],["a","b"],[v,c]]],[v,d]]], +[], +[[[[v,d],[[[],["a","b"]],[[[v,_]],[[v,_],"a","b"]],[[[v,_],[v,_]],[[v,_],[v,_],"a","b"]],[[[v,_],[v,_],[v,_]],[[v,_],[v,_],[v,_],"a","b"]],[[[v,_],[v,_],[v,_],[v,_]],[[v,_],[v,_],[v,_],[v,_],"a","b"]],[[[v,_],[v,_],[v,_],[v,_],[v,_]],[[v,_],[v,_],[v,_],[v,_],[v,_],"a","b"]],[[[v,_],[v,_],[v,_],[v,_],[v,_],[v,_]],[[v,_],[v,_],[v,_],[v,_],[v,_],[v,_],"a","b"]],[[[v,_],[v,_],[v,_],[v,_],[v,_],[v,_],[v,_]],[[v,_],[v,_],[v,_],[v,_],[v,_],[v,_],[v,_],"a","b"]],[[[v,_],[v,_],[v,_],[v,_],[v,_],[v,_],[v,_],[v,_]],[[v,_],[v,_],[v,_],[v,_],[v,_],[v,_],[v,_],[v,_],"a","b"]],[[[v,_],[v,_],[v,_],[v,_],[v,_],[v,_],[v,_],[v,_],[v,_]],[[v,_],[v,_],[v,_],[v,_],[v,_],[v,_],[v,_],[v,_],[v,_],"a","b"]]]]]]). + +test(232,[[n,member],["a",["a","b"]]], +[], +[[]]). + +test(233,[[n,findall],[[v,a],[[n,member],["a",[v,a]]],[v,d]]], +[], +[[[[v,d],[["a","|",[v,_]],[[v,_],"a","|",[v,_]],[[v,_],[v,_],"a","|",[v,_]],[[v,_],[v,_],[v,_],"a","|",[v,_]],[[v,_],[v,_],[v,_],[v,_],"a","|",[v,_]],[[v,_],[v,_],[v,_],[v,_],[v,_],"a","|",[v,_]],[[v,_],[v,_],[v,_],[v,_],[v,_],[v,_],"a","|",[v,_]],[[v,_],[v,_],[v,_],[v,_],[v,_],[v,_],[v,_],"a","|",[v,_]],[[v,_],[v,_],[v,_],[v,_],[v,_],[v,_],[v,_],[v,_],"a","|",[v,_]],[[v,_],[v,_],[v,_],[v,_],[v,_],[v,_],[v,_],[v,_],[v,_],"a","|",[v,_]]]]]]). + +test(234,[[n,member],[[v,a],["a","b"]]], +[], +[[[[v,a],"a"]]]). 
+ +test(235,[[n,findall],[[[v,a],[v,b]],[[n,member],[[v,b],[v,a]]],[v,d]]], +[], +[[[[v,d],[[[v,_],[[v,_],"|",[v,_]]],[[v,_],[[v,_],[v,_],"|",[v,_]]],[[v,_],[[v,_],[v,_],[v,_],"|",[v,_]]],[[v,_],[[v,_],[v,_],[v,_],[v,_],"|",[v,_]]],[[v,_],[[v,_],[v,_],[v,_],[v,_],[v,_],"|",[v,_]]],[[v,_],[[v,_],[v,_],[v,_],[v,_],[v,_],[v,_],"|",[v,_]]],[[v,_],[[v,_],[v,_],[v,_],[v,_],[v,_],[v,_],[v,_],"|",[v,_]]],[[v,_],[[v,_],[v,_],[v,_],[v,_],[v,_],[v,_],[v,_],[v,_],"|",[v,_]]],[[v,_],[[v,_],[v,_],[v,_],[v,_],[v,_],[v,_],[v,_],[v,_],[v,_],"|",[v,_]]],[[v,_],[[v,_],[v,_],[v,_],[v,_],[v,_],[v,_],[v,_],[v,_],[v,_],[v,_],"|",[v,_]]]]]]]). + +test(236,[[n,findall],[[[v,a],[v,b],[v,c]],[[n,append],[[v,a],["a"],[[v,b],"|",[v,c]]]],[v,d]]], +[], +[[[[v,d],[[[],"a",[]],[[[v,_]],[v,_],["a"]],[[[v,_],[v,_]],[v,_],[[v,_],"a"]],[[[v,_],[v,_],[v,_]],[v,_],[[v,_],[v,_],"a"]],[[[v,_],[v,_],[v,_],[v,_]],[v,_],[[v,_],[v,_],[v,_],"a"]],[[[v,_],[v,_],[v,_],[v,_],[v,_]],[v,_],[[v,_],[v,_],[v,_],[v,_],"a"]],[[[v,_],[v,_],[v,_],[v,_],[v,_],[v,_]],[v,_],[[v,_],[v,_],[v,_],[v,_],[v,_],"a"]],[[[v,_],[v,_],[v,_],[v,_],[v,_],[v,_],[v,_]],[v,_],[[v,_],[v,_],[v,_],[v,_],[v,_],[v,_],"a"]],[[[v,_],[v,_],[v,_],[v,_],[v,_],[v,_],[v,_],[v,_]],[v,_],[[v,_],[v,_],[v,_],[v,_],[v,_],[v,_],[v,_],"a"]],[[[v,_],[v,_],[v,_],[v,_],[v,_],[v,_],[v,_],[v,_],[v,_]],[v,_],[[v,_],[v,_],[v,_],[v,_],[v,_],[v,_],[v,_],[v,_],"a"]]]]]]). 
+ +test(237,[[n,findall],[[[v,a],[v,b],[v,c]],[[n,append],[[v,a],["b",[v,b]],[v,c]]],[v,d]]], +[], +[[[[v,d],[[[],[v,_],["b",[v,_]]],[[[v,_]],[v,_],[[v,_],"b",[v,_]]],[[[v,_],[v,_]],[v,_],[[v,_],[v,_],"b",[v,_]]],[[[v,_],[v,_],[v,_]],[v,_],[[v,_],[v,_],[v,_],"b",[v,_]]],[[[v,_],[v,_],[v,_],[v,_]],[v,_],[[v,_],[v,_],[v,_],[v,_],"b",[v,_]]],[[[v,_],[v,_],[v,_],[v,_],[v,_]],[v,_],[[v,_],[v,_],[v,_],[v,_],[v,_],"b",[v,_]]],[[[v,_],[v,_],[v,_],[v,_],[v,_],[v,_]],[v,_],[[v,_],[v,_],[v,_],[v,_],[v,_],[v,_],"b",[v,_]]],[[[v,_],[v,_],[v,_],[v,_],[v,_],[v,_],[v,_]],[v,_],[[v,_],[v,_],[v,_],[v,_],[v,_],[v,_],[v,_],"b",[v,_]]],[[[v,_],[v,_],[v,_],[v,_],[v,_],[v,_],[v,_],[v,_]],[v,_],[[v,_],[v,_],[v,_],[v,_],[v,_],[v,_],[v,_],[v,_],"b",[v,_]]],[[[v,_],[v,_],[v,_],[v,_],[v,_],[v,_],[v,_],[v,_],[v,_]],[v,_],[[v,_],[v,_],[v,_],[v,_],[v,_],[v,_],[v,_],[v,_],[v,_],"b",[v,_]]]]]]]). + +test(238,[[n,findall],[[[v,a],[v,b],[v,c]],[[n,append],[[v,a],[v,b],[v,c]]],[v,d]]], +[], +[[[[v,d],[[[],[v,_],[v,_]],[[[v,_]],[v,_],[[v,_],"|",[v,_]]],[[[v,_],[v,_]],[v,_],[[v,_],[v,_],"|",[v,_]]],[[[v,_],[v,_],[v,_]],[v,_],[[v,_],[v,_],[v,_],"|",[v,_]]],[[[v,_],[v,_],[v,_],[v,_]],[v,_],[[v,_],[v,_],[v,_],[v,_],"|",[v,_]]],[[[v,_],[v,_],[v,_],[v,_],[v,_]],[v,_],[[v,_],[v,_],[v,_],[v,_],[v,_],"|",[v,_]]],[[[v,_],[v,_],[v,_],[v,_],[v,_],[v,_]],[v,_],[[v,_],[v,_],[v,_],[v,_],[v,_],[v,_],"|",[v,_]]],[[[v,_],[v,_],[v,_],[v,_],[v,_],[v,_],[v,_]],[v,_],[[v,_],[v,_],[v,_],[v,_],[v,_],[v,_],[v,_],"|",[v,_]]],[[[v,_],[v,_],[v,_],[v,_],[v,_],[v,_],[v,_],[v,_]],[v,_],[[v,_],[v,_],[v,_],[v,_],[v,_],[v,_],[v,_],[v,_],"|",[v,_]]],[[[v,_],[v,_],[v,_],[v,_],[v,_],[v,_],[v,_],[v,_],[v,_]],[v,_],[[v,_],[v,_],[v,_],[v,_],[v,_],[v,_],[v,_],[v,_],[v,_],"|",[v,_]]]]]]]). + +test(239,[[n,append],[[v,a],["b",[v,b]],[[v,c]]]], +[], +[]). + +test(240,[[n,append],[[v,a],["b"],[[v,c]]]], +[], +[[[[v,a],[]],[[v,c],"b"]]]). 
+ + +test(241,%[[n,reverse1],[[v,d],[empty,empty,empty],[3,2,1]]],% +[[n,reverse1],[[v,a],[],[3, +2,1]]], +[ +[[n,reverse1],[[],[v,a],[v,a]],":-",[[[n,cut]]]], +[[n,reverse1],[[[v,a],"|",[v,d]],[v,b],[v,c]],":-", +[ + [[n,reverse1],[[v,d],[[v,a],"|",[v,b]],[v,c]]], + [[n,cut]] +]] +],[[[[v,a],[1,2,3 +]]]]). + +test(242,%[[n,reverse1],[[v,d],[empty,empty,empty],[3,2,1]]],% +[[n,append1],[[v,a],[],[1,2,3]]], +[ +[[n,append1],[[],[v,a],[v,a]],":-",[[[n,cut]]]], +[[n,append1],[[[v,a],"|",[v,d]],[v,b],[[v,a],"|",[v,c]]],":-", +[ + [[n,append1],[[v,d],[v,b],[v,c]]], + [[n,cut]] +]] +],[[[[v,a],[1,2,3 +]]]]). + +test(243, +[[n,back_propagate],[[v,a]]], +[ +[[n,back_propagate],[[v,a]],":-", +[ + [[n,equals4],[[v,a],[v,b]]], + [[n,equals4],[[v,b],1]] +]] +],[[[[v,a],1]]]). + +% similar to test 56 + +%test(244,[[n,foldl1],[[n,add],[1,2,3],0,[v,d]]], +%test(244,[[n,var],[[v,a]]], +%test(244,[[n,string_chars1],[[v,a],[v,d]]], +%test(244,[[n,string_concat1],["a","",[v,d]]], +%test(244,[[n,append1],[[a],[],[v,d]]], +%test(244,[[n,string_chars1],["ab",[v,d]]], +test(244,[[n,foldl1],[[n,stringconcata1],["a","b"],"",[v,d]]], +%test(244,[[n,foldl1],[[n,add],[1,2,3],0,[v,d]]], +[ + [[n,foldl1],[[v,f],[],[v,l],[v,l]],":-",[[[n,cut]]]], + [[n,foldl1],[[v,f],[[v,h],"|",[v,t]],[v,m1],[v,na]],":-", + [ [[v,f],[[v,m1],[v,h],[v,m2]]], + [[n,foldl1],[[v,f],[v,t],[v,m2],[v,na]]] + ] + ], + + [[n,add],[[v,a],[v,b],[v,c]],":-", + [ [[n,+],[[v,a],[v,b],[v,c]]] + ] + ], + + +[[n,stringconcata1],[[v,a],[v,b],[v,c]],":-", +[ + [[n,"->"], + [ + [[n,"->"], + [ + [ + [[n,var],[[v,a]]], + [[n,not],[[[n,var],[[v,b]]]]], + [[n,var],[[v,c]]] + ], + + [[n,true]], + + [ + [ + [[n,not],[[[n,var],[[v,a]]]]], + [[n,var],[[v,b]]], + [[n,var],[[v,c]]] + ] + ] + ]], + + [[n,fail]], + + [ + [[n,stringchars1],[[v,a],[v,a1]]], + [[n,stringchars1],[[v,b],[v,b1]]], + [[n,stringchars1],[[v,c],[v,c1]]], + %[[n,writeln],[[v,c1]]], + %[[n,trace2]], + [[n,appenda1],[[v,a1],[v,b1],[v,c1]]], + 
[[n,stringchars1],[[v,a],[v,a1]]], + [[n,stringchars1],[[v,b],[v,b1]]], + [[n,stringchars1],[[v,c],[v,c1]]] + ] + ]] +]], +[[n,stringchars1],[[v,a],[v,b]],":-", +[ + + [[n,"->"], + [[ + + [[n,var],[[v,a]]], + [[n,var],[[v,b]]] + ], + [[n,true]], + + [[n,string_chars],[[v,a],[v,b]]] + ]] +]], +%[[n,stringchars1],[[v,a],[v,b]],":-", +%[ +% [[n,stringchars],[[v,a],[v,b]]] +%]], +[[n,appenda1],[[],[v,a],[v,a]]], +[[n,appenda1],[[[v,a],"|",[v,d]],[v,b],[[v,a],"|",[v,c]]],":-", +[ + [[n,appenda1],[[v,d],[v,b],[v,c]]] +]] + +] + +%,[[[[v,d], 6]]]). +%,[[]]). +%,[[]]). +%,[[[[v,d], "a"]]]). +%,[[[[v,d], [a]]]]). +%,[[[[v,d], [a]]]]). +,[[[[v,d], "ab"]]]). +%,[[[[v,d], 6]]]). + +/* + +*/ + +test(245,[[[n,equals4],[[v,a],"abc"]],[[n,string_chars],[[v,a],[[v,x],[v,y],[v,z]]]]], + +[] + +,[[[[v,a], "abc"],[[v,x], a],[[v,y], b],[[v,z], c]]]). + +/* +test(246,[[n,a],[[v,b]]], +%test(244,[[n,foldl1],[[n,add],[1,2,3],0,[v,d]]], +[ + [[n,a],[[v,b]],":-", + [ [[n,not],[[[[n,c],[v,c]]]]] + ]] +] +,[[]]). + +test(247,[[n,a],[[v,b]]], +%test(244,[[n,foldl1],[[n,add],[1,2,3],0,[v,d]]], +[ + [[n,a],[[v,b]],":-", + [ [[n,"->"],[[[[n,c],[[v,c]]]],[[[n,c],[[v,c]]]],[[[n,c],[[v,c]]]]]] + ]] +] +,[[]]). + + +test(248,[[n,a],[[v,b]]], +%test(244,[[n,foldl1],[[n,add],[1,2,3],0,[v,d]]], +[ + [[n,a],[[v,b]],":-", + [ [[n,"->"],[[[[n,c],[[v,c]]]],[[[n,c],[[v,c]]]],[[n,"->"],[[[[n,c],[[v,c]]]],[[[n,c],[[v,c]]]],[[[n,c],[[v,c]]]]]]]] + ]] +] +,[[]]). + +test(249,[[n,conjunction],["true","false",[v,c]]], + +[ + [[n,findall1],[[v,f],[v,l],[v,m1],[v,n]],":-", + [ [[n,"->"],[[[v,f],[[v,h],[v,m2]]], + [ [[n,wrap],[[v,m2],[v,m3]]], + [[n,append],[[v,m1],[v,m3],[v,m4]]] + ], + [ + [[n,=],[[v,m1],[v,m4]]] + ]]], + [[n,findall1],[[v,f],[v,t],[v,m4],[v,n]]] + + ] + ]] + +,[[[[v,c], "false"]]]). 
+ +test(250,[[n,a]], + +[[[n,a],":-", + [ + [ + [[n,member3],[[[v,function],"|",[[v,typestatements2]]],[v,typestatements1]]], + [[n,member3],[[[v,function],"|",[[v,modestatements2]]],[v,modestatements1]]], + [[n,extract_modes1],[[v,typestatements2],[v,typestatements3],[v,vars1],[v,vars2],[v,modestatements2]]], + [[n,"->"], + [ + [[n,types],["on"]], + + [[n,debug_call],[[v,skip],[[v,function],[v,vars2]]]], + + [[n,true]] + ]], + [[n,"->"], + [ + [ + [[n,checktypes1],[[v,vars2],[v,typestatements3],[v,typestatements3],[v,typestatements1]]] + ], + + [ + [ + [[n,"->"], + [ + [[n,types],["on"]], + + [[n,debug_exit],[[v,skip],[[v,function],[v,vars2]]]], + + [[n,true]] + ]], + [[n,"->"], + [ + [[n,types],["on"]], + + [ + [[n,debug_types_exit],[[[v,function],"/","~",[v,l],[v,input_type_check]]]] + ], + + [[n,true]] + ]] + ] + ], + + [ + [ + [[n,"->"], + [ + [[n,types],["on"]], + + [[n,debug_fail],[[v,skip],[[v,function],[v,vars1]]]], + + [[n,true]] + ]], + [[n,"->"], + [ + [[n,types],["on"]], + + [ + [[n,debug_types_fail],[[[v,function],"/","~",[v,l],[v,input_type_check]]]] + ], + + [[n,true]] + ]] + ] + ] + ]] + ] + ]]], +[[]]). + +test(251,[[n,a]], + +[ +[[n,checktypes1],[[v,vars1],[v,typestatements1],[v,typestatements2],[v,typestatements4]],":-", +[ + + [ + [[n,not],[[[n,=],[[v,type],[v,dbw_list]]]]], + [[n,not],[[[n,=],[[v,type],[v,dbw_brackets]]]]] + ] + + ,[[n,not],[[[n,=],[[v,type],[v,dbw_list]]]]] + + ]]], +[[]]). + + +test(252,[[n,a]], + +[ +[[n,checktypes1],[[v,vars1],[v,typestatements1],[v,typestatements2],[v,typestatements4]],":-", +[ + + [[ + [[n,true]], + [[n,true]] + ]] + + ,[[n,true]] + + ]]], +[[]]). + +test(253,[[n,a]], + +[ +[[n,checktypes1],[[v,vars1],[v,typestatements1],[v,typestatements2],[v,typestatements4]],":-", +[ + + [[n,"->"], + [ + [[n,types],["on"]], + + [[n,checktypes0_inputs],[[v,function],[v,vars1],[v,typestatements1],[v,modestatements1]]], + + [[n,true]] + ]]]]], + +[[]]). 
+*/ + +/* +test(246,%[[n,or12],[[v,a]]], +[[n,findall],[[v,b],[[n,findall],[[v,b],[[[n,or12],[[v,b]]]],[v,a]]],[v,a]]], +[ + [[n,or12],[1]], + [[n,or12],[2]] + +],[[[[v,a],[1,2]]]]). +*/ + +/* +test(246,[[n,test]], + +[ +[[n,test],":-", +[ + [[n,member],[[1,2],[v,a]]], + [[n,member],[[1,2],3]] +]] +],[[[[v,a],[1,2]]]]). +*/ + +%test(247,[[[n,equals4],[[v,a],[]]]][[n,findall],[[v,d],[[n,append1],[[v,a],[v,b],[v,d]]], +%test(244,[[n,string_concat1],["a","",[v,d]]], +%test(244,[[n,append1],[[a],[],[v,d]]], +%test(244,[[n,string_chars1],["ab",[v,d]]], +%test(244,[[n,foldl1],[[n,string_concat1],["a","b"],"",[v,d]]], +%test(244,[[n,foldl1],[[n,add],[1,2,3],0,[v,d]]], +%[ +/* +[[n,string_concat1],[[v,a],[v,b],[v,c]],":-", +[ + + %[[n,string_chars1],[[v,c],[v,c1]]], + %[[n,equals4],[[v,c],[v,c1]]], + %[[n,writeln],[[v,c1]]], + %[[n,trace2]], + [[n,append1],[[a],[],[v,c1]]]]], + + +[[n,append1],[[],[v,a],[v,a]]], +[[n,append1],[[[v,a],"|",[v,d]],[v,b],[[v,a],"|",[v,c]]],":-", +[ + [[n,append1],[[v,d],[v,b],[v,c]]] +]] + + +],[[[[v,a],[1,2]]]]). + +*/ +/* +test(247,[[n,findall],[[[v,a],[v,b]],[[n,append200],[[v,a],[v,b],["a","b","c"]]],[v,c]]], +[ +[[n,append200],[[],[v,a],[v,a]],":-",[[[n,cut]]]], +[[n,append200],[[[v,a],"|",[v,d]],[v,b],[[v,a],"|",[v,c]]],":-", +[ + [[n,append200],[[v,d],[v,b],[v,c]]] + ,[[n,cut]] +]] + +],[[[[v,c],[[[],["a","b","c"]],[["a"],["b","c"]],[["a","b"],["c"]],[["a","b","c"],[]]]]]]). + +test(248,[[n,test2]], + +[ +[[n,test2],":-", +[ + [[n,a],[1,[v,t]]], + [[n,b],[[v,t]]]%, + %[[n,cut]] +]], +[[n,test2],":-", +[ + [[n,=],[[v,t],[v,false]]], + [[n,b],[[v,t]]] +]], +[[n,a],[1,true]], +[[n,b],[true],":-", +[ + [[n,writeln],[true]] +]], +[[n,b],[false],":-", +[ + [[n,writeln],[false]] +]] +], +[[]]). 
+
+*/
+
+test(246,[[n,test2]],
+
+[
+[[n,test2],":-",
+[
+ [[n,a],[1,[v,t]]],
+ [[n,b],[[v,t]]]
+]],
+[[n,test2],":-",
+[
+ [[n,=],[[v,t],[v,false]]],
+ [[n,b],[[v,t]]]
+]],
+[[n,a],[1,true]],
+[[n,b],[true],":-",
+[
+ [[n,writeln],[true]]
+]],
+[[n,b],[false],":-",
+[
+ [[n,writeln],[false]]
+]]
+],
+
+[[]]).
+
+% Tests 247 and 248: In Prolog, the goal inside not/1 and the antecedent of ->/2 and ->/3 have a cut placed after them if needed (that is, when there is no choice point, or a cut does not already follow the last choice point).
+
+% This is accomplished by editing the state machine (SM) for the algorithm in LPI and SSI, and converting the SM back to List Prolog in LPI.
+
+/*
+test(247,[[n,test]],
+
+[
+[[n,test],":-",
+[
+ [[n,not],[[[[n,member],[[1,2],[v,a]]]
+ %,[[n,cut]]
+ ]]]
+]]],
+
+[]).
+
+test(248,[[n,test],[[v,b]]],
+
+[
+[[n,test],[[v,b]],":-",
+[
+ [[n,findall],[[v,a],
+ [[n,"->"],[[[[n,member],[[1,2],[v,a]]]
+ %,[[n,cut]]
+ ],
+ [[n,writeln],["1"]],
+ [[n,writeln],["2"]]
+ ]]
+ ,[v,b]]]
+]]],
+
+[[[[v,b],[1]]]]).
+*/
+
+test(247,[[n,test2],[[v,a]]],
+
+[
+[[n,test2],[[v,a]],":-",
+[
+ [[n,shell_c],["a","#include <stdio.h>\n#include <string.h>\n\nint main(void)\n{\n char str1[20];\n\n scanf(\"%19s\", str1);\n \n printf(\"%s\\n\", str1);\n return 0;\n}",[v,a]]]
+]]
+],
+
+
+[[[[v,a],"a"]]]).
+
+%[0, [6, 7]], [6, []], [7, [6, 7]]
+
+/*
+test(248,[[n,w]],
+[[[n,w],":-",[[[n,i]],[[n,e]]]],
+[[n,e],":-",[[[n,i]],[[n,e]]]],
+[[n,i]]
+],
+[[]]).
+
+
+test(248,[[n,w]],
+[[[n,w],":-",[[[n,y]],[[n,g]]]],
+[[n,y],":-",[[[n,r]],[[n,b]]]],
+[[n,g],":-",[[[n,p]]]],
+[[n,r]],
+[[n,b]],
+[[n,p],":-",[[[n,w]]]]
+],
+[[]]).
+%object(cube3,[["yellow",["red","blue"]],["red",[]],["blue",[]],["white",["yellow","green"]],["green",["purple"]],["purple",["white"]]],
+
+test(249,[[n,w]],
+[[[n,w],":-",[[[n,r]],[[n,b]]]],
+[[n,r],":-",[[[n,w]]]],
+[[n,b],":-",[[[n,w]]]]
+],
+[[]]).
+*/
+
+test(248,[[n,function],[[v,'_'],[1],[v,c]]],
+[
+ [[n,function],[[v,a],[v,b],[v,c]],":-",
+ [
+ [[n,append],[[v,a],[v,b],[v,c]]]
+ ]
+ ]
+]
+,[[[[v,c], [1]]]]).
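+
+% The cut insertion described for tests 247 and 248 can be sketched as a
+% commented illustration only (it mirrors the %,[[n,cut]] markers in the
+% commented-out tests above and is not itself part of the test suite):
+%
+% before editing the state machine, the antecedent of "->" is
+% [[n,"->"],[[[[n,member],[[1,2],[v,a]]]],
+% [[n,writeln],["1"]],[[n,writeln],["2"]]]]
+% and after converting the SM back to List Prolog a cut follows it:
+% [[n,"->"],[[[[n,member],[[1,2],[v,a]]],[[n,cut]]],
+% [[n,writeln],["1"]],[[n,writeln],["2"]]]]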
+ +test(249,[[n,function],[1,2]], +[ + [[n,function],[[v,'_'],[v,'_']]] +] +,[[]]). + +/* +test(250,[[n,function],[[v,a]]], +[ + [[n,function],[[v,a]],":-", + [ + [[n,not],[[n,false] + ]] + ] + ] + +] +,[[]]). + +*/ + + +test(250,[[n,grammar1],[["a","a","a"]]], +[ + [[n,grammar1],[[v,s]],":-", + [ + [[n,noun],[[v,s],[]]] + ] + ], + + [[n,noun],"->",[[]]], + [[n,noun],"->",[["a"],[[n,noun]]]] +],[[]]). + + + +test(251,[[n,grammar1],[["b","c","b","c"]]], +[ + [[n,grammar1],[[v,s]],":-", + [ + [[n,2],[[v,s],[]]] + ] + ], + /* + [[n,1],"->",[[]]], + [[n,1],"->",[["a"],[[n,2]],[[n,1]] + ]], + */ + [[n,2],"->",[[]]], + [[n,2],"->",[["b"],[[n,3]],[[n,2]]]], + [[n,3],"->",[[]]], + [[n,3],"->",[["c"],[[n,3]]]] + +],[[]]). + + diff --git a/listprologinterpreter/lpiverify4_open.pl b/listprologinterpreter/lpiverify4_open.pl new file mode 100644 index 0000000000000000000000000000000000000000..f374b3239ad066aece37f1eb2ffbf5e19b03f8b6 --- /dev/null +++ b/listprologinterpreter/lpiverify4_open.pl @@ -0,0 +1,596 @@ +%% Test cases, Debug=trace=on or off, N=output=result +testopen(Debug,NTotal) :- testopen(Debug,0,NTotal),!. +testopen(_Debug,NTotal,NTotal) :- NTotal=8, !. +testopen(Debug,NTotal1,NTotal2) :- + NTotal3 is NTotal1+1, + testopen_cases(NTotal3,Query,Functions), + ((international_interpret([lang,"en"],Debug,Query,Functions,Result),not(Result=[]))->(writeln0([test,NTotal3,result,Result]),writeln0([test,NTotal3,passed]));(writeln0([test,NTotal3,failed]))), + writeln0(""), + testopen(Debug,NTotal3,NTotal2),!. + +%% Test individual cases, Debug=trace=on or off, N=case number +testopen1(Debug,N) :- + testopen_cases(N,Query,Functions), +((international_interpret([lang,"en"],Debug,Query,Functions,Result),not(Result=[]))->(writeln0([test,N,result,Result]),writeln0([test,N,passed]));(writeln0([test,N,failed]))),!. 
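+
+% Usage sketch (based on the harness predicates defined above):
+% ?- testopen(off,NTotal). % run all open test cases without tracing
+% ?- testopen1(on,3). % trace open test case 3 only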
+ +testopen_cases(1,[[n,datetime],[[v,year],[v,month],[v,day],[v,hour],[v,minute],[v,second]]], + +[ +[[n,datetime],[[v,y],[v,m],[v,d],[v,h],[v,mi],[v,s]],":-", + [[[n,date],[[v,y],[v,m],[v,d],[v,h],[v,mi],[v,s]]]]]] +). + +testopen_cases(2,[[n,algwriter],[[v,na]]], +[ + [[n,algwriter],[[v,na]],":-", + [ [[n,makerandomlist],[3,[],[v,r1]]], + [[n,makerandomlist],[3,[],[v,r2]]], + [[n,wrap],[[v,r1],[v,nb1]]], + [[n,wrap],[[v,r2],[v,nb2]]], + [[n,append],[[v,nb1],[v,nb2],[v,nb3]]], + [[n,randomfns],[8,[v,nb3],[v,na]]] + ]], + + [[n,makerandomlist],[0,[v,a],[v,a]]], + [[n,makerandomlist],[[v,a],[v,c1],[v,c]],":-", + [ [[n,not],[[[n,=],[[v,a],0]]]], + [[n,random],[[v,r]]], + [[n,*],[[v,r],5,[v,r1]]], + [[n,ceiling],[[v,r1],[v,n1]]], + [[n,wrap],[[v,n1],[v,n2]]], + [[n,append],[[v,c1],[v,n2],[v,nb3]]], + [[n,-],[[v,a],1,[v,d]]], + [[n,makerandomlist],[[v,d],[v,nb3],[v,c]]] + ]], + + [[n,randomfns],[0, + [v,a],[v,a]]], + [[n,randomfns],[[v,a],[v,b], + [v,c]],":-", + [ [[n,not],[[[n,=],[[v,a],0]]]], + [[n,randomlist],[[v,b],[v,na1]]], + [[n,randomlist],[[v,b],[v,na2]]], + [[n,randomfn],[[v,na1],[v,na2],[v,nb]]], + [[n,wrap],[[v,nb],[v,nb2]]], + [[n,append],[[v,b],[v,nb2],[v,nb3]]], + [[n,tail],[[v,b],[v,t]]], + [[n,-],[[v,a],1,[v,d]]], + [[n,randomfns],[[v,d], + [v,nb3],[v,c]]] + ]], + + [[n,randomlist],[[v,b],[v,na]],":-", + [ [[n,random],[[v,r]]], + [[n,length],[[v,b],[v,bl]]], + [[n,*],[[v,r],[v,bl],[v,n]]], + [[n,ceiling],[[v,n],[v,n1]]], + [[n,getitemn],[[v,n1],[v,b],[v,na]]] + ]], + + [[n,getitemn],[0,[v,a],[]]], + [[n,getitemn],[1,[v,b],[v,c]],":-", + [ [[n,head],[[v,b],[v,c]]] + ]], + [[n,getitemn],[[v,a],[v,b],[v,c]],":-", + [ [[n,not],[[[n,=],[[v,a],1]]]], + [[n,tail],[[v,b],[v,t]]], + [[n,-],[[v,a],1,[v,d]]], + [[n,getitemn],[[v,d],[v,t],[v,c]]] + ]], + + [[n,randomfn],[[v,a1],[v,a2],[v,b]],":-", + [ + [[n,random],[[v,r]]], + [[n,*],[[v,r],9,[v,n]]], + [[n,ceiling],[[v,n],[v,n1]]], + [[n,fna],[[v,n1],[v,a1],[v,a2],[v,b]]] + ]], + + 
[[n,fna],[1,[v,a1],[v,a2],[v,b]],":-", + [ + [[n,reverse],[[v,a1],[],[v,b]]] + ]], + + [[n,fna],[2,[v,a1],[v,a2],[v,b]],":-", + [ + [[n,sort0],[[v,a1],[v,b]]] + ]], + + [[n,fna],[3,[v,a1],[v,a2],[v,b]],":-", + [ + [[n,findall1],[[n,dividebyfour],[v,a1],[],[v,b]]] + ]], + + [[n,fna],[4,[v,a1],[v,a2],[v,b]],":-", + [ + [[n,append1],[[v,a1],[v,a2],[v,b]]] + ]], + + [[n,fna],[5,[v,a1],[v,a2],[v,b]],":-", + [ + [[n,findall1],[[n,plusone],[v,a1],[],[v,b]]] + ]], + + [[n,fna],[6,[v,a1],[v,a2],[v,b]],":-", + [ + [[n,findall1],[[n,plustwo],[v,a1],[],[v,b]]] + ]], + + [[n,fna],[7,[v,a1],[v,a2],[v,b]],":-", + [ + [[n,findall1],[[n,multiplybytwo],[v,a1],[],[v,b]]] + ]], + + [[n,fna],[8,[v,a1],[v,a2],[v,b]],":-", + [ + [[n,minus1],[[v,a1],[v,a2],[v,b]]] + ]], + + [[n,fna],[9,[v,a1],[v,a2],[v,b]],":-", + [ + [[n,intersection1],[[v,a1],[v,a2],[],[v,b]]] + ]], + + [[n,reverse],[[],[v,l],[v,l]]], + [[n,reverse],[[v,l],[v,m],[v,n]],":-", + [ [[n,head],[[v,l],[v,h]]], + [[n,tail],[[v,l],[v,t]]], + [[n,wrap],[[v,h],[v,h1]]], + [[n,append],[[v,h1],[v,m],[v,o]]], + [[n,reverse],[[v,t],[v,o],[v,n]]] + ] + ], + + [[n,sort0],[[v,l],[v,n]],":-", + [ [[n,sort1],[[v,l],[],[v,n]]] + ] + ], + [[n,sort1],[[],[v,l],[v,l]]], + [[n,sort1],[[v,l],[v,m1],[v,n]],":-", + [ [[n,not],[[[n,=],[[v,l],[]]]]], + [[n,head],[[v,l],[v,h]]], + [[n,tail],[[v,l],[v,t]]], + [[n,maximum],[[v,t],[v,h],[v,m2],[],[v,r]]], + [[n,wrap],[[v,m2],[v,m3]]], + [[n,append],[[v,m1],[v,m3],[v,m4]]], + [[n,sort1],[[v,r],[v,m4],[v,n]]] + ] + ], + [[n,maximum],[[],[v,l],[v,l],[v,r],[v,r]]], + [[n,maximum],[[v,l],[v,m1],[v,n],[v,r1],[v,r2]],":-", + [ [[n,not],[[[n,=],[[v,l],[]]]]], + [[n,head],[[v,l],[v,h]]], + [[n,tail],[[v,l],[v,t]]], + [[n,"->"],[[[n,>=],[[v,m1],[v,h]]], + [[[n,=],[[v,m2],[v,m1]]], + [[n,wrap],[[v,h],[v,h2]]], + [[n,append],[[v,r1],[v,h2],[v,r3]]]], + [[[[n,=],[[v,m2],[v,h]]]], + [[n,wrap],[[v,m1],[v,m12]]], + [[n,append],[[v,r1],[v,m12],[v,r3]]]]]], + [[n,maximum],[[v,t],[v,m2],[v,n],[v,r3],[v,r2]]] + ] + ], + + 
[[n,map],[[v,f],[],[v,l],[v,l]]], + [[n,map],[[v,f],[v,l],[v,m1],[v,n]],":-", + [ [[n,not],[[[n,=],[[v,l],[]]]]], + [[n,head],[[v,l],[v,h]]], + [[n,tail],[[v,l],[v,t]]], + [[v,f],[[v,m1],[v,h],[v,m2]]], + [[n,map],[[v,f],[v,t],[v,m2],[v,n]]] + ] + ], + + [[n,add],[[v,a],[v,b],[v,c]],":-", + [ [[n,+],[[v,a],[v,b],[v,c]]] + ] + ], + + [[n,findall1],[[v,f],[],[v,l],[v,l]]], + [[n,findall1],[[v,f],[v,l],[v,m1],[v,n]],":-", + [ [[n,not],[[[n,=],[[v,l],[]]]]], + [[n,head],[[v,l],[v,h]]], + [[n,tail],[[v,l],[v,t]]], + [[n,"->"],[[[v,f],[[v,h],[v,m2]]], + [ [[n,wrap],[[v,m2],[v,m3]]], + [[n,append],[[v,m1],[v,m3],[v,m4]]] + ], + [ + [[n,=],[[v,m1],[v,m4]]] + ]]], + [[n,findall1],[[v,f],[v,t],[v,m4],[v,n]]] + + ] + ], + + [[n,plusone],[[v,a],[v,c]],":-", + [ [[n,+],[[v,a],1,[v,c]]] + ] + ], + + [[n,plustwo],[[v,a],[v,c]],":-", + [ [[n,+],[[v,a],2,[v,c]]] + ] + ], + + [[n,multiplybytwo],[[v,a],[v,c]],":-", + [ [[n,*],[[v,a],2,[v,c]]] + ] + ], + + [[n,dividebyfour],[[v,a],[v,c]],":-", + [ [[n,/],[[v,a],4,[v,c]]] + ] + ], + +[[n,intersection1],[[], [v,a], [v,l], [v,l]]], +[[n,intersection1],[[v,l1], [v,l2], [v,l3a], [v,l3]],":-", + [[[n,head],[[v,l1],[v,i1]]], + [[n,tail],[[v,l1],[v,l4]]], + [[n,intersection2],[[v,i1],[v,l2],[],[v,l5]]], + [[n,append],[[v,l3a],[v,l5],[v,l6]]], + [[n,intersection1],[[v,l4],[v,l2],[v,l6],[v,l3]]]]], +[[n,intersection2],[[v,a], [], [v,l], [v,l]]], +[[n,intersection2],[[v,i1], [v,l1], [v,l2], [v,l3]], ":-", + [[[n,head],[[v,l1],[v,i1]]], + [[n,tail],[[v,l1],[v,l4]]], + [[n,wrap],[[v,i1],[v,i11]]], + [[n,append],[[v,l2],[v,i11],[v,l5]]], + [[n,intersection2],[[v,i1], [v,l4], [v,l5], [v,l3]]]]], +[[n,intersection2],[[v,i1], [v,l1], [v,l2], [v,l3]], ":-", + [[[n,head],[[v,l1],[v,i2]]], + [[n,tail],[[v,l1],[v,l4]]], + [[n,not],[[[n,=],[[v,i1],[v,i2]]]]], + [[n,intersection2],[[v,i1], [v,l4], [v,l2], [v,l3]]]]], + + [[n,append1],[[v,b],[v,c],[v,a]],":-", + [ + [[n,append],[[v,b],[v,c],[v,a]]] + ] + ], + +[[n,minus1],[[v,l], [], [v,l]]], 
+[[n,minus1],[[v,l1], [v,l2], [v,l3]],":-", + [[[n,head],[[v,l2],[v,i1]]], + [[n,tail],[[v,l2],[v,l5]]], + [[n,delete2],[[v,l1],[v,i1],[],[v,l6]]], + [[n,minus1],[[v,l6], [v,l5], [v,l3]]]]], +[[n,delete2],[[], [v,a], [v,l], [v,l]]], +[[n,delete2],[[v,l1],[v,i1],[v,l2],[v,l3]],":-", + [[[n,head],[[v,l1],[v,i1]]], + [[n,tail],[[v,l1],[v,l5]]], + [[n,delete2],[[v,l5],[v,i1],[v,l2],[v,l3]]]]], +[[n,delete2],[[v,l1],[v,i1],[v,l2],[v,l3]],":-", + [[[n,head],[[v,l1],[v,i2]]], + [[n,tail],[[v,l1],[v,l5]]], + [[n,not],[[[n,=],[[v,i1],[v,i2]]]]], + [[n,wrap],[[v,i2],[v,i21]]], + [[n,append],[[v,l2],[v,i21],[v,l6]]], + [[n,delete2],[[v,l5],[v,i1],[v,l6],[v,l3]]]]] +]). + + +testopen_cases(3,[[n,episode_character],[[v,a]]], + +[ + [[n,episode_character],[[v,ds3]],":-", + [ + [[n,findall],[[v,ds2], + [ + [[n,member],[[v,l1],[1,2,3,4]]], + [[n,findall],[[v,ds1], + [ + [[n,member],[[v,l2],[1,2,3,4]]], + [[n,findall],[[[v,l1],[v,l2],[v,l3],[v,d]], + [ + [[n,member],[[v,l3],[1,2,3,4]]], + [[n,equals4],[[v,line],["Level",[v,l1],[v,l2],[v,l3], + "Please write a detail."]]], + [[n,writeln],[[v,line]]], + [[n,read_string],[[v,d]]]], + + [v,ds1]]]], + %[[n,=],[[v,item1],[v,item1b]]]], + [v,ds2]]]] + , + %[[n,=],[[v,item1],[v,item1b]]]], + [v,ds3]]] + + ]] +]).%,[[[[v,a],"success"]]]). + + + +testopen_cases(4,[[n,test]], + +[[[n,test],":-",[[[n,findall],[[[v,x1],[v,y1],[v,s]],[[[n,member],[[v,y1],[1,2]]],[[n,member],[[v,x1],[a,b]]],[[n,writeln],["Enter cell"]],[[n,writeln],[[v,y1]]],[[n,writeln],[[v,x1]]],[[n,read_string],[[v,s]]]],[v,z]]],[[n,findall],[["",[v,z1],""],[[[n,member],[[v,y2],[1,2]]],[[n,findall],[["",[v,s2],""],[[[n,member],[[v,x2],[a,b]]],[[n,member],[[[v,x2],[v,y2],[v,s2]],[v,z]]]],[v,z1]]]],[v,z2]]],[[n,equals4],[[v,z3],["",[v,z2],"
"]]],[[n,flatten1],[[v,z3],[v,z4]]],[[n,concat_list],[[v,z4],[v,z5]]],[[n,writeln],[[v,z5]]]]],[[n,flatten1],[[v,a],[v,b]],":-",[[[n,flatten2],[[v,a],[],[v,b]]]]],[[n,flatten2],[[],[v,b],[v,b]]],[[n,flatten2],[[v,a],[v,b],[v,c]],":-",[[[n,"->"],[[[n,not],[[[n,"->"],[[[n,equals4],[[v,a],[[v,a1],"|",[v,a2]]]],[[n,true]],[[n,equals4],[[v,a],[]]]]]]],[[n,append],[[v,b],[[v,a]],[v,c]]],[[[n,equals4],[[v,a],[[v,d],"|",[v,e]]]],[[n,flatten2],[[v,d],[v,b],[v,f]]],[[n,flatten2],[[v,e],[v,f],[v,c]]]]]]]],[[n,concat_list],[[v,a1],[v,b]],":-",[[[n,equals4],[[v,a1],[[v,a],"|",[v,list]]]],[[n,concat_list],[[v,a],[v,list],[v,b]]],[[n,cut]]]],[[n,concat_list],[[v,a],[],[v,a]],":-",[[[n,cut]]]],[[n,concat_list],[[v,a],[v,list],[v,b]],":-",[[[n,equals4],[[v,list],[[v,item],"|",[v,items]]]],[[n,stringconcat],[[v,a],[v,item],[v,c]]],[[n,concat_list],[[v,c],[v,items],[v,b]]]]]]). + + +% 5 - form to enter data, save + +testopen_cases(5,[[n,test]], +%[[n,list_to_string],["a",[v,file_terma]]], + +[[[n,test],":-",[[[n,writeln],["Enter your name:"]], +%[[n,trace2]], +[[n,read_string],[[v,s]]], +[[n,write_file],['file.txt',["Name",[v,s]]]] +]], +[[n,write_file],[[v,path],[v,file_term]],":-", +[ + [[n,list_to_string],[[v,file_term],[v,file_terma]]], + [[n,stringconcat],[[v,file_terma],".",[v,file_termb]]], + [[n,open],[[v,path],write,[v,stream1]]], + [[n,write],[[v,stream1],[v,file_termb]]], + [[n,close],[[v,stream1]]] + %,[[n,cut]] +]], + + +[[n,list_to_string],[[v,a],[v,b]],":-", +[ + [[n,"->"], + [ + [[n,string],[[v,a]]], + + [[n,wrap_if_string],[[v,a],[v,b]]] + ]], + [[n,cut]] +]], + +[[n,list_to_string],[[v,a],[v,b]],":-", +[ + [[n,list_to_string],[[v,a],"",[v,b]]] + %,[[n,cut]] +]], +[[n,list_to_string],["",[v,b],[v,c]],":-", +[ + [[n,concat],[[v,b],"\"\"","",[v,c]]] + %,[[n,cut]] +]], +[[n,list_to_string],[[],[v,b],[v,c]],":-", +[ + [[n,concat],["[",[v,b],"]",[v,c]]] + %,[[n,cut]] +]], +[[n,list_to_string],[[v,a],"",[v,c]],":-", +[ + [ + [[n,"->"], + [ + [[n,atom],[[v,a]]], + + 
[[n,true]], + + [ + [[n,number],[[v,a]]] + ] + ]], + [[n,atom_string],[[v,a],[v,c]]] + ] + %,[[n,cut]] +]], +[[n,list_to_string],[[v,a],[v,b],[v,c]],":-", +[ + [[n,"->"], + [ + [ + [[n,not],[[[n,"->"],[[[n,equals4],[[v,a],[[v,a1],"|",[v,a2]]]],[[n,true]], + [[n,equals4],[[v,a],[]]]]] + ]] + ], + + [ + [[n,"->"], + [ + [[n,equals4],[[v,b],""]], + + [[n,equals4],[[v,g],""]], + + [[n,equals4],[[v,g],[","]]] + ]], + [[n,concat],[[v,b],[v,g],[v,a],[v,c]]] + ], + + [ + [[n,equals4],[[v,a],[[v,d],"|",[v,e]]]], + [[n,wrap_if_string],[[v,d],[v,d1]]], + %[[n,trace2]], + [[n,list_to_string],[[v,d1],"",[v,f]]], + [[n,"->"], + [ + [[n,equals4],[[v,b],""]], + + [ + [[n,equals4],[[v,g],""]], + [[n,equals4],[[v,f],[v,f2]]] + ], + + [ + [[n,equals4],[[v,g],","]],[[n,equals4],[[v,f],[v,f2]]] + ] + ]], + [[n,concat],[[v,b],[v,g],[v,f2],[v,f1]]], + [[n,list_to_string],[[v,e],[v,f1],[v,c]]] + ] + ]] + %,[[n,cut]] +]], +[[n,wrap_if_string],[[v,a],[v,b]],":-", +[ + [[n,"->"], + [ + [ + [[n,not],[[[n,equals4],[[v,a],""]]]], + [[n,string],[[v,a]]] + ], + + [[n,maplist],[[n,string_concat],["\"",[v,a],"\""],"",[v,b]]], + + [[n,equals4],[[v,a],[v,b]]] + ]] + %,[[n,cut]] +]], +[[n,concat],[[v,b],[v,g],[v,a],[v,c]],":-", +[ + [[n,maplist],[[n,string_concat],[[v,b],[v,g],[v,a]],"",[v,c]]] + %,[[n,cut]] +]] +] + +). 
+ +% 6 - load data + +testopen_cases(6,[[n,test]], + +[ +[[n,test],":-",[[[n,open_file],['file.txt',["Name",[v,t1]]]], +[[n,list_to_string],[[v,t1],[v,t]]], +[[n,writeln],[["Name",[v,t]]]]]], + +[[n,open_file],[[v,path],[v,file_term]],":-", +[ + [[n,open],[[v,path],read,[v,stream]]], + [[n,read],[[v,stream],[v,file_term]]], + [[n,close],[[v,stream]]] + %[[n,atom_string],[[v,file_term1],[v,file_term2]]], + + %,[[n,cut]] +]], + + +[[n,list_to_string],[[v,a],[v,b]],":-", +[ + [[n,"->"], + [ + [[n,string],[[v,a]]], + + [[n,wrap_if_string],[[v,a],[v,b]]] + ]], + [[n,cut]] +]], + +[[n,list_to_string],[[v,a],[v,b]],":-", +[ + [[n,list_to_string],[[v,a],"",[v,b]]] + %,[[n,cut]] +]], +[[n,list_to_string],["",[v,b],[v,c]],":-", +[ + [[n,concat],[[v,b],"\"\"","",[v,c]]] + %,[[n,cut]] +]], +[[n,list_to_string],[[],[v,b],[v,c]],":-", +[ + [[n,concat],["[",[v,b],"]",[v,c]]] + %,[[n,cut]] +]], +[[n,list_to_string],[[v,a],"",[v,c]],":-", +[ + [ + [[n,"->"], + [ + [[n,atom],[[v,a]]], + + [[n,true]], + + [ + [[n,number],[[v,a]]] + ] + ]], + [[n,atom_string],[[v,a],[v,c]]] + ] + %,[[n,cut]] +]], +[[n,list_to_string],[[v,a],[v,b],[v,c]],":-", +[ + [[n,"->"], + [ + [ + [[n,not],[[[n,"->"],[[[n,equals4],[[v,a],[[v,a1],"|",[v,a2]]]],[[n,true]], + [[n,equals4],[[v,a],[]]]]] + ]] + ], + + [ + [[n,"->"], + [ + [[n,equals4],[[v,b],""]], + + [[n,equals4],[[v,g],""]], + + [[n,equals4],[[v,g],[","]]] + ]], + [[n,concat],[[v,b],[v,g],[v,a],[v,c]]] + ], + + [ + [[n,equals4],[[v,a],[[v,d],"|",[v,e]]]], + [[n,wrap_if_string],[[v,d],[v,d1]]], + %[[n,trace2]], + [[n,list_to_string],[[v,d1],"",[v,f]]], + [[n,"->"], + [ + [[n,equals4],[[v,b],""]], + + [ + [[n,equals4],[[v,g],""]], + [[n,equals4],[[v,f],[v,f2]]] + ], + + [ + [[n,equals4],[[v,g],","]],[[n,equals4],[[v,f],[v,f2]]] + ] + ]], + [[n,concat],[[v,b],[v,g],[v,f2],[v,f1]]], + [[n,list_to_string],[[v,e],[v,f1],[v,c]]] + ] + ]] + %,[[n,cut]] +]], +[[n,wrap_if_string],[[v,a],[v,b]],":-", +[ + [[n,"->"], + [ + [ + 
[[n,not],[[[n,equals4],[[v,a],""]]]], + [[n,string],[[v,a]]] + ], + + [[n,maplist],[[n,string_concat],["\"",[v,a],"\""],"",[v,b]]], + + [[n,equals4],[[v,a],[v,b]]] + ]] + %,[[n,cut]] +]], +[[n,concat],[[v,b],[v,g],[v,a],[v,c]],":-", +[ + [[n,maplist],[[n,string_concat],[[v,b],[v,g],[v,a]],"",[v,c]]] + %,[[n,cut]] +]] +] + +). + + +testopen_cases(7,[[n,test]], +%[[n,list_to_string],["a",[v,file_terma]]], + +[[[n,test],":-",[ +%[[n,trace2]], +[[n,read_password +],[[v,s]]] +]]]). + + +testopen_cases(8,[[n,test]], +%[[n,list_to_string],["a",[v,file_terma]]], + +[[[n,test],":-",[ +%[[n,trace2]], +[[n,text_area +],["rows=\"4\" style=\"width:100%\"","a",[v,s]]], +[[n,equals4],[[v,s],"a"]] +]]]). diff --git a/listprologinterpreter/lpiverify4_open_types.pl b/listprologinterpreter/lpiverify4_open_types.pl new file mode 100644 index 0000000000000000000000000000000000000000..bd2aa4c08e344a449abc200419224fc83fd3ab5d --- /dev/null +++ b/listprologinterpreter/lpiverify4_open_types.pl @@ -0,0 +1,813 @@ +%% test_open_types(Debug[on/off],Total,Score). + +%%:- use_module(library(time)). + +%% Test cases, Debug=trace=on or off, NTotal=output=total cases, Score=output=result + +test_open_types(Debug,NTotal,Score) :- test_open_types(Debug,0,NTotal,0,Score),!. +test_open_types(_Debug,NTotal,NTotal,Score,Score) :- NTotal=35, !. +test_open_types(Debug,NTotal1,NTotal2,Score1,Score2) :- + NTotal3 is NTotal1+1, + test_open_types_cases(NTotal3,Query,Types,Modes,Functions), + ((international_interpret([lang,"en"],Debug,Query,Types,Modes,Functions,Result),not(Result=[]))->(Score3 is Score1+1,writeln0([test_open_types,NTotal3,result,Result]),writeln0([test_open_types,NTotal3,passed]));(Score3=Score1,writeln0([test_open_types,NTotal3,failed]))), + writeln0(""), + test_open_types(Debug,NTotal3,NTotal2,Score3,Score2),!. 
+ +%% Test individual cases, Debug=trace=on or off, N=case number, Passed=output=result + +test_open_types1(Debug,N,Passed) :- + test_open_types_cases(N,Query,Types,Modes,Functions), + (((international_interpret([lang,"en"],Debug,Query,Types,Modes,Functions,Result),not(Result=[]))%%writeln(Result1), + %%Result=Result1 + )->(Passed=passed,writeln0([test_open_types,N,result,Result]),writeln0([test_open_types,N,passed]));(Passed=failed,writeln0([test_open_types,N,failed]))),!. + + +test_open_types_cases(1,[[n,true_vs_good],[[v,t],[v,g]]], + +[ +[[n,true_vs_good],[[[[t,number],[t,number]],[[t,number],[t,number]],[[t,number],[t,number]],[[t,number],[t,number]],[[t,number],[t,number]],[[t,number],[t,number]]],[[[t,number],[t,number]],[[t,number],[t,number]],[[t,number],[t,number]],[[t,number],[t,number]],[[t,number],[t,number]],[[t,number],[t,number]]]]], +[[n,random1],[[t,number],[t,number],[t,number]]] +] + +,[[[n,true_vs_good],[output,output]], +[[n,random1],[input,input,output]]], +[ + [[n,true_vs_good],[[v,t],[v,g]],":-", + [ + [[n,random1],[0.1,4.6,[v,y1]]], + [[n,random1],[[v,y1],4.7,[v,y2]]], + [[n,random1],[[v,y2],4.8,[v,y3]]], + [[n,random1],[[v,y3],4.9,[v,y4]]], + + [[n,equals2],[[v,c11],[0,0]]], + [[n,wrap],[[v,c11],[v,c12]]], + + [[n,equals2],[[v,c21],[1,[v,y1]]]], + [[n,wrap],[[v,c21],[v,c22]]], + [[n,append],[[v,c12],[v,c22],[v,c23]]], + + [[n,equals2],[[v,c31],[2,[v,y2]]]], + [[n,wrap],[[v,c31],[v,c32]]], + [[n,append],[[v,c23],[v,c32],[v,c33]]], + + [[n,equals2],[[v,c41],[3,[v,y3]]]], + [[n,wrap],[[v,c41],[v,c42]]], + [[n,append],[[v,c33],[v,c42],[v,c43]]], + + [[n,equals2],[[v,c51],[4,[v,y4]]]], + [[n,wrap],[[v,c51],[v,c52]]], + [[n,append],[[v,c43],[v,c52],[v,c53]]], + + [[n,equals2],[[v,c61],[5,5]]], + [[n,wrap],[[v,c61],[v,c62]]], + [[n,append],[[v,c53],[v,c62],[v,g]]], + + [[n,equals3],[[v,t],[[0,0],[1,1], + [2,2],[3,3],[4,4],[5,5]]] + ]]], + + [[n,random1],[[v,a1],[v,a2],[v,n5]],":-", + [ + [[n,-],[[v,a2],[v,a1],[v,a3]]], + [[n,random],[[v,n1]]], + 
[[n,*],[[v,a3],[v,n1],[v,n2]]], + [[n,+],[[v,n2],[v,a1],[v,n21]]], + [[n,*],[10,[v,n21],[v,n3]]], + [[n,round],[[v,n3],[v,n4]]], + [[n,/],[[v,n4],10,[v,n5]]] + ]] +]). + +test_open_types_cases(2,[[n,true_vs_good],[[[n,a],[1]],1,[v,g2]]], + + [ +[[n,true_vs_good],[[[t,predicatename],[[t,number]]],[t,number],[t,number]]] +], + +[[[n,true_vs_good],[input,input,output]]], + + + +[ + [[n,true_vs_good],[[v,f1],[v,l],[v,n]],":-", + [ + [[n,equals3],[[v,n],1] + + ]] +]]). + +test_open_types_cases(3,[[n,function],[[v,a]]], +[ +[[n,function],[[[t,number]]]] +], +[[[n,function],[output]]], +[ + [[n,function],[[1]]] +] +). + + +% ["Computational English","COMPUTATIONAL ENGLISH by Lucian Green Drawing connections 3 of 4.txt",0,algorithms,"23. *I prepared to cultivate people. I did this by writing 16 250 breasoning areas of study influenced by Plato’s forms about Popology. First, I equated Plato’s forms with Lucianic Popology, by equating people with objects. Second, I equated the names of Plato’s forms with an agreed with argument, by writing simulations of people are in people’s minds. Third, I equated the functions of Plato’s forms with a positive argument, by writing people are stronger than objects. In this way, I prepared to cultivate people by writing 16 250 breasoning areas of study influenced by Plato’s forms about Popology."] + +% read do you create the person, do you switch them on to existing for the rest of their life? + + +test_open_types_cases(4,[[n,cultivate_person],[[v,a],[v,b]]], + [[[n,cultivate_person],[[t,string],[t,string]]]], + [[[n,cultivate_person],[output,output]]], + +[ + [[n,cultivate_person],[[v,a],[v,b]],":-", + [ + %% do you create the person + + [[n,writeln],["Do you create the person?"]], + [[n,read_string],[[v,a]]], + + %% do you switch them on to existing for the rest of their life? 
+ [[n,writeln],["Do you switch them on to existing for the rest of their life?"]], + + [[n,read_string],[[v,b]]] +/* +, + [[n,writeln],["Do you switch them on to existing for the rest of their life?"]], + + [[n,read_string],[[v,c]]] + */ + + + ]] +]). + +% ["Fundamentals of Meditation and Meditation Indicators","FUNDAMENTALS OF MEDITATION by Lucian Green Blue Nature 4 of 4.txt",0,algorithms,"31. I prepared to give the song I am not a Peach and medicine degree away. I did this by giving the woman the anti-depression song and anti-depression degree. First, I gave the woman the anti-depression song. Second, I gave the woman the anti-depression degree. Third, I observed her as happy. In this way, I prepared to give the song I am not a Peach and medicine degree away by giving the woman the anti-depression song and anti-depression degree."] + +% Prevents rumination + +test_open_types_cases(5,[[n,prevent_rumination],[[]]], + [[[n,prevent_rumination],[{[t,string]}]]], + [[[n,prevent_rumination],[input]]], + +[ + [[n,prevent_rumination],[[v,list]],":-", + [ + + [[n,writeln],["What would you like to think about?"]], + [[n,read_string],[[v,item]]], + + [[n,"->"],[[[n,member],[[v,item],[v,list]]], + + [[[n,writeln],["Please do not think of"]], + [[n,writeln],[[v,item]]], + [[n,writeln],["again."]], + [[n,=],[[v,list],[v,list2]]]], + + [[[n,wrap],[[v,item],[v,item2]]], + [[n,append],[[v,list],[v,item2],[v,list2]]]]]], + + [[n,writeln],["Have you finished thinking (y/n)?"]], + [[n,read_string],[[v,y_n]]], + + [[n,"->"],[[[n,=],[[v,y_n],"y"]], + + [[n,true]], + + [[n,prevent_rumination],[[v,list2]]]]] + ]] +]). + +% ["Fundamentals of Meditation and Meditation Indicators","FUNDAMENTALS OF MEDITATION by Lucian Green Yellow God 2 of 4.txt",0,algorithms,"12. 
I prepared to put through an A with a “negatable pressure cup appearance”, in other words negatably but in a way that is protected by meditation, placing a medical question on oneself (thinking of a dental drill, the medical question and a conclusion) for a child to be conceived, a job to be earned or an H1 to be supported. I did this by holding the dog model, like the pressure cup. First, I picked up the dog model. Second, I held it. Third, I placed it on the ground. In this way, I prepared to put through an A with a “negatable pressure cup appearance” by holding the dog model, like the pressure cup."] + +% Text to breasoning checklist + +test_open_types_cases(6,[[n,t2b_checklist],[[v,a],[v,b],[v,c]]], + [[[n,t2b_checklist],[[t,string],[t,string],[t,string]]]], + [[[n,t2b_checklist],[output,output,output]]], + +[ + [[n,t2b_checklist],[[v,a],[v,b],[v,c]],":-", + [ + [[n,writeln],["Do you study education?"]], + [[n,read_string],[[v,a]]], + + [[n,writeln],["Do you study medicine?"]], + [[n,read_string],[[v,b]]], + + [[n,writeln],["Do you study meditation?"]], + [[n,read_string],[[v,c]]] + ]] +]). + +% ["Fundamentals of Pedagogy and Pedagogy Indicators","FUNDAMENTALS OF PEDAGOGY by Lucian Green Part of Room 1 of 4.txt",0,algorithms,"6. The disabilities teacher student prepared to assess a “done-up” assignment (with a short story containing 64 departmental perspectives about it) and a “seen-as” version of “A” quality written by the student. He did this by placing the bird model in the hole. First, he lifted the bird model up. Second, he walked to the hole. Third, he placed it in the hole. 
In this way, the disabilities teacher student prepared to assess a “done-up” assignment and a “seen-as” version of “A” quality written by the student by placing the bird model in the hole."] + +test_open_types_cases(7,[[n,episode_character],[[v,a]]], + [ +[[n,episode_character],[[t,loop0]]], +[[t,loop0],[[[t,loop1],[t,loop1],[t,loop1]]]], +[[t,loop1],[[[t,loop2],[t,loop2],[t,loop2]]]], +[[t,loop2],[[[t,loop3],[t,loop3],[t,loop3]]]], +[[t,loop3],[[[t,items],[t,items],[t,items]]]], +[[t,items],[[t,number],[t,number],[t,number],[t,number],[t,string]]] +], + [[[n,episode_character],[output]]], + +[ + [[n,episode_character],[[v,ds4]],":-", + [ + [[n,findall],[[v,ds3], + [ + [[n,member],[[v,l0],[10,11,12%,4 + ]]], + [[n,findall],[[v,ds2], + [ + [[n,member],[[v,l1],[1,2,3%,4 + ]]], + [[n,findall],[[v,ds1], + [ + [[n,member],[[v,l2],[%1,2, + %3, + 4,5,6 + ]]], + [[n,findall],[[[v,l0],[v,l1],[v,l2],[v,l3],[v,d]], + [ + [[n,member],[[v,l3],[7,8,9%5,6%1,2,3,4 + ]]], + [[n,equals4],[[v,line],["Level",[v,l0],[v,l1],[v,l2],[v,l3], + "Please write a detail."]]], + [[n,writeln],[[v,line]]], + [[n,read_string],[[v,d]]]], + + [v,ds1]]]], + [v,ds2]]]] + , + [v,ds3]]]], + [v,ds4]]] + ]] +]). + +% I want to swim if I am hot + +test_open_types_cases(8,[[n,swim],[[v,c]]], + [[[n,swim],[[t,string]]]], + [[[n,swim],[output]]], + +[ + [[n,swim],[[v,c]],":-", + [ + [[n,writeln],["Are you hot (y/n)?"]], + [[n,read_string],[[v,a]]], + + [[n,"->"],[[[n,=],[[v,a],"y"]], + [[[n,writeln],["Would you like to go for a swim?"]], + [[n,read_string],[[v,b]]], + [[n,"->"],[[[n,=],[[v,b],"y"]], + [[n,=],[[v,c],"swim"]], + [[n,=],[[v,c],"no swim"]]]]], + + [[n,=],[[v,c],"no swim"]]]] + ]] +]). 
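+
+% Test case 7 above builds its nested detail list by crossing loop levels
+% with findall/3 over member/2.  A hedged plain-Prolog sketch of the same
+% cross-product shape with two levels, minus the read_string prompts
+% (levels/1 is an illustrative name):
+%
+% levels(Ds) :-
+%     findall(Ds1,
+%         ( member(L1,[1,2,3]),
+%           findall([L1,L2],member(L2,[4,5,6]),Ds1) ),
+%         Ds).
+%
+% ?- levels(Ds).
+% Ds = [[[1,4],[1,5],[1,6]],[[2,4],[2,5],[2,6]],[[3,4],[3,5],[3,6]]].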
+ + +test_open_types_cases(9,[[n,recognise_above_waist],[[v,c]]], + [[[n,recognise_above_waist],[[t,string]]], + [[n,person],[[t,string],[t,string],[t,string],[t,string]]]], + [[[n,recognise_above_waist],[output]], + [[n,person],[input,input,input,output]]], + +[ + [[n,recognise_above_waist],[[v,d1]],":-", + [ + [[n,writeln],["Does the person have a male chest (y/n)?"]], + [[n,read_string],[[v,a]]], + [[n,"->"],[[[n,=],[[v,a],"y"]], + [[n,=],[[v,a1],"male chest"]], + [[n,=],[[v,a1],"female chest"]]]], + + [[n,writeln],["Does the person have a male face (y/n)?"]], + [[n,read_string],[[v,b]]], + [[n,"->"],[[[n,=],[[v,b],"y"]], + [[n,=],[[v,b1],"male face"]], + [[n,=],[[v,b1],"female face"]]]], + + [[n,writeln],["What colour hair does the person have (brown/blonde/auburn)?"]], + [[n,read_string],[[v,c]]], + [[n,"->"],[[[n,=],[[v,c],"brown"]], + [[n,=],[[v,c1],"brown hair"]], + [[n,"->"],[[[n,=],[[v,c],"blonde"]], + [[n,=],[[v,c1],"blonde hair"]], + [[n,=],[[v,c1],"auburn hair"]]]]]], + + [[n,"->"],[[[n,person],[[v,a1],[v,b1],[v,c1],[v,d1]]], + [[n,true]], + [[n,=],[[v,d1],"Unknown"]]]] + ]], + + [[n,person],["male chest","male face","brown hair","Harry"]], + [[n,person],["female chest","female face","blonde hair","Susan"]], + [[n,person],["female chest","female face","auburn hair","April"]] +]). + + +test_open_types_cases(10,[[n,want_me],[[v,c]]], + [[[n,want_me],[[t,string]]]], + [[[n,want_me],[output]]], + +[ + [[n,want_me],[[v,a1]],":-", + [ + [[n,writeln],["Do you want me for 1-food, 2-activity, 3-toy or 4-not want me?"]], + [[n,read_string],[[v,a]]], + [[n,"->"],[[[n,=],[[v,a],"4"]], + [[n,=],[[v,a1],"no"]], + [[n,=],[[v,a1],"yes"]]]] + ]] + +]). + +% ["Fundamentals of Pedagogy and Pedagogy Indicators","PEDAGOGY INDICATORS by Lucian Green Aigs for Pedagogy Helper 2 of 3.txt",0,algorithms,"24. *The pedagogy helper was in natural law. The pedagogy aigs helper was legally right. The self examined the water line at different times. 
The self was above it at those times. The self offered the Aigs service."]
+
+% Check whether the algorithm/argument is classified under natural law.
+
+test_open_types_cases(11,[[n,natural_law],[[v,a1]]],
+ [[[n,natural_law],[[t,string]]]],
+ [[[n,natural_law],[output]]],
+
+[
+ [[n,natural_law],[[v,a1]],":-",
+ [
+ [[n,writeln],["Is the idea from pedagogy, meditation or medicine (y/n)?"]],
+ [[n,read_string],[[v,a]]],
+ [[n,"->"],[[[n,=],[[v,a],"y"]],
+ [[n,=],[[v,a1],"true"]],
+ [[[n,writeln],["Does the idea not contain genetic modification of an organism, not include nuclear energy, and is compatible with natural-law (y/n)?"]],
+ [[n,read_string],[[v,b]]],
+ [[n,"->"],[[[n,=],[[v,b],"y"]],
+ [[n,=],[[v,a1],"true"]],
+ [[n,=],[[v,a1],"false"]]]]]]]
+ ]]
+]).
+
+% ["Medicine","MEDICINE by Lucian Green Doctor Sutra 1 of 4.txt",0,algorithms,"3. *The Vag prepared to keep on his own line. He did this by checking his indicator of his health. First, he stepped onto the line. Second, he walked along it. Third, he left the line. In this way, the Vag prepared to keep on his own line by checking his indicator of his health."]
+
+% Use headache medicine if you have a stress, not pathological headache.
+
+test_open_types_cases(12,[[n,headache_medicine],[[v,a1]]],
+ [[[n,headache_medicine],[[t,string]]]],
+ [[[n,headache_medicine],[output]]],
+
+[
+ [[n,headache_medicine],[[v,a1]],":-",
+ [
+ [[n,writeln],["Do you have a stress, not pathological headache (y/n)?"]],
+ [[n,read_string],[[v,a]]],
+ [[n,"->"],[[[n,=],[[v,a],"y"]],
+ [[n,=],[[v,a1],"true"]],
+ [[n,=],[[v,a1],"false"]]]]
+ ]]
+]).
+
+% ["Short Arguments","Competition.txt",0,algorithms,"7. *I liked breasonings and equality - and economic freedom. I performed better by using the daily regimen to go to church (play the note). Confidence blocks and blocks from lack of practice were cleared. I maintained a high level of performance. I functioned (played the note) positively. 
"] + +% Would you like the same as someone else? + +test_open_types_cases(13,[[n,same],[[v,a1]]], + [[[n,same],[[t,string]]]], + [[[n,same],[output]]], + +[ + [[n,same],[[v,a1]],":-", + [ + [[n,writeln],["Would you like the same as someone else? What is it?"]], + [[n,read_string],[[v,a1]]] + ]] +]). + +% ["Mind Reading","Mr other times 8.txt",0,algorithms,"[""Green, L 2021, Mr other times 8, Lucian Academy Press, Melbourne."",""Green, L 2021"",1,""*Mr other times 8"] + +% What would you like to remember? + +test_open_types_cases(14,[[n,remember],[[v,a1]]], + [[[n,remember],[[t,string]]]], + [[[n,remember],[output]]], + +[ + [[n,remember],[[v,a1]],":-", + [ + [[n,writeln],["What would you like to remember?"]], + [[n,read_string],[[v,a1]]], + [[n,writeln],["Press to display the memory?"]], + [[n,read_string],[[v,a2]]], + [[n,writeln],[[v,a1]]] + ]] +]). + +% ["Medicine","MEDICINE by Lucian Green Panic attack prevented by deep breathing and sutra 1 of 4.txt",0,algorithms,"1a. *I prepared to identify and prevent class distinctions. I did this by writing the Box song argument. First, I wrote about the box. Second, I wrote about the specific. Third, I wrote about the general. In this way, I prepared to identify and prevent class distinctions by writing the Box song argument."] + +% Is the simulated intelligence a life form? + +test_open_types_cases(15,[[n,life],[[v,a1]]], + [[[n,life],[[t,string]]]], + [[[n,life],[output]]], + +[ + [[n,life],[[v,a3]],":-", + [ + [[n,writeln],["Does the entity feel (y/n)?"]], + [[n,read_string],[[v,a1]]], + [[n,writeln],["Does the entity have human thoughts (y/n)?"]], + [[n,read_string],[[v,a2]]], + [[n,"->"],[[[[n,=],[[v,a1],"y"]],[[n,=],[[v,a2],"y"]]], + [[n,=],[[v,a3],"true"]], + [[n,=],[[v,a3],"false"]]]] + + ]] +]). + +% ["Medicine","MEDICINE by Lucian Green Quantum Box and Prayer 3 of 4.txt",0,algorithms,"27. *I prepared to make sure that my day in the rooms was fine. 
I did this by enjoying dialogue with the quantum box/prayer character. First, I mentioned the first visible level of matter in the object to the character. Second, I listened to the character negate that the level was problematic (say that it was fine). Third, I repeated this for all the visible levels of matter in the object. In this way, I prepared to make sure that my day in the rooms was fine by enjoying dialogue with the quantum box/prayer character."] + +test_open_types_cases(16,[[n,fifty_algorithms],[[v,a1]]], + [[[n,fifty_algorithms],[[t,string]]]], + [[[n,fifty_algorithms],[output]]], + +[ + [[n,fifty_algorithms],[[v,a3]],":-", + [ + [[n,writeln],["What will you use the fifty algorithms for (e.g. politician, professor, actor or musician)?"]], + [[n,read_string],[[v,a3]]] + + ]] +]). + +% ["Short Arguments","Professor Algorithm - Student.txt",0,algorithms,"6. *I prepared to smile. I did this by symbolising the verb. First, I enjoyed the song. Second, I rummaged in the Christmas sack. Third, I pulled the theatrical mask out of the sack."] + +test_open_types_cases(17,[[n,prepare_to_smile],[[v,a1]]], + [[[n,prepare_to_smile],[[t,string]]]], + [[[n,prepare_to_smile],[output]]], + +[ + [[n,prepare_to_smile],[[v,a3]],":-", + [ + [[n,writeln],["What do you find touching, lovely or inspiring about the person?"]], + [[n,read_string],[[v,a3]]] + + ]] +]). + +% ["Time Travel","Meditate to Time Travel 4.txt",0,algorithms,"49. *I meditated to avoid insider trading by time travelling."] + +test_open_types_cases(18,[[n,detect_insider_trading],[[v,a1]]], + [[[n,detect_insider_trading],[[t,string]]]], + [[[n,detect_insider_trading],[output]]], + +[ + [[n,detect_insider_trading],[[v,a3]],":-", + [ + [[n,writeln],["After learning insider information, did you trade?"]], + [[n,read_string],[[v,a3]]] + + ]] +]). + +% ["Fundamentals of Pedagogy and Pedagogy Indicators","FUNDAMENTALS OF PEDAGOGY by Lucian Green X Dimension 2 of 4.txt",0,algorithms,"17. 
*I prepared to confirm the ability to breason in meditation. I did this by unblocking not wanting to write breasonings in meditation. First, I studied Nietzsche in Arts. Second, I studied Creative Writing. Third, I studied Education. In this way, I prepared to confirm the ability to breason in meditation by unblocking not wanting to write breasonings in meditation."] + +test_open_types_cases(19,[[n,breason],[[v,a1]]], + [[[n,breason],[[t,string]]]], + [[[n,breason],[output]]], + +[ + [[n,breason],[[v,a3]],":-", + [ + [[n,writeln],["If I told you a single way of satisfying the spiritual requirements to have a child, earn a high distinction and prevent quantum ailments such as headaches (where it would not be possible to do these things in other ways) would you be interested?"]], + [[n,read_string],[[v,a3]]] + + ]] +]). + +% ["Short Arguments","Simulated Intelligence 2.txt",0,algorithms,"14. *The man added to the simulation. God (the man) took care of people in subsets of the simulation. God found the subset of the simulation. It was the place. The man recorded them."] + +test_open_types_cases(20,[[n,add_to_simulation],[[v,a1]]], + [[[n,add_to_simulation],[[t,string]]]], + [[[n,add_to_simulation],[output]]], + +[ + [[n,add_to_simulation],[[v,a3]],":-", + [ + [[n,writeln],["Do you add text to breasonings to the simulation?"]], + [[n,read_string],[[v,a3]]] + + ]] +]). + +% ["Fundamentals of Pedagogy and Pedagogy Indicators","FUNDAMENTALS OF PEDAGOGY by Lucian Green X Dimension 3 of 4.txt",0,algorithms,"29. *I prepared to help Earth avoid catastrophe. I did this by stating that I am peaceful. First, I made vegan food available. Second, I guided the number of children per family. Third, I recommended green transport. In this way, I prepared to help Earth avoid catastrophe by stating that I am peaceful."] + +% How will you make sure that the food tastes delicious? + +/** + +?- test_open_types1(off,21,R). +How will you make sure that the food tastes delicious? 
+|: I will use softer, fresh ingredients and use the best type of cold pressed extra virgin olive oil when cooking. +[test_open_types,21,result,[[[[v,a1],I will use softer, fresh ingredients and use the best type of cold pressed extra virgin olive oil when cooking.]]]] +[test_open_types,21,passed] +R = passed. + +**/ + +test_open_types_cases(21,[[n,delicious],[[v,a1]]], + [[[n,delicious],[[t,string]]]], + [[[n,delicious],[output]]], + +[ + [[n,delicious],[[v,a3]],":-", + [ + [[n,writeln],["How will you make sure that the food tastes delicious?"]], + [[n,read_string],[[v,a3]]] + + ]] +]). + +% ["Creating and Helping Pedagogues","CREATE AND HELP PEDAGOGUES by Lucian Green Areas of Study to Create a Pedagogue 1 of 1.txt",0,algorithms,"1. *A peer should create a Pedagogue by writing 30 areas of study with 5 As, per student before they have the professor algorithm breasoned out for him or her. Have spiritual questions and answers set up to expand these breasonings, e.g. use the ways of thinking like breasonings, etc."] + +% Department algorithm filer + +/** +?- test_open_types1(off,22,R). +What department is the algorithm from? +|: Popology +What subject is the algorithm from? +|: Interpreter +What is a short description of the algorithm (2-3 words)? +|: Verification of input, and output using a verify script. +**/ + +test_open_types_cases(22,[[n,department_filer],[[v,a1],[v,a2],[v,a3]]], + [[[n,department_filer],[[t,string],[t,string],[t,string]]]], + [[[n,department_filer],[output,output,output]]], + +[ + [[n,department_filer],[[v,a1],[v,a2],[v,a3]],":-", + [ + [[n,writeln],["What department is the algorithm from?"]], + [[n,read_string],[[v,a1]]], + [[n,writeln],["What subject is the algorithm from?"]], + [[n,read_string],[[v,a2]]], + [[n,writeln],["What is a short description of the algorithm (2-3 words)?"]], + [[n,read_string],[[v,a3]]] + ]] +]). + +% ["Short Arguments","Green_Sutra.txt",0,algorithms,"8. *I prepared to notice the Lucian Effect. 
I did this by teaching others. First, I taught the person. Second, they taught someone else. Third, I noticed the positive effects."] + +test_open_types_cases(23,[[n,lucian_effect],[[v,a1]]], + [[[n,lucian_effect],[[t,string]]]], + [[[n,lucian_effect],[output]]], + +[ + [[n,lucian_effect],[[v,a1]],":-", + [ + [[n,writeln],["Would you like to silently repeat the Lucian mantra for twenty minutes twice per day, letting your thoughts become lighter and forgetting your stress?"]], + [[n,read_string],[[v,a1]]] + ]] +]). + +% ["Fundamentals of Pedagogy and Pedagogy Indicators","FUNDAMENTALS OF PEDAGOGY by Lucian Green Y Dimension 2 of 4.txt",0,algorithms,"19. *I prepared to grow. I did this by eating the pear. First, I shelled it. Second, I sliced it. Third, I ate it. In this way, I prepared to grow by eating the pear."] + +test_open_types_cases(24,[[n,chinese_longevity_herbs],[[v,a1]]], + [[[n,chinese_longevity_herbs],[[t,string]]]], + [[[n,chinese_longevity_herbs],[output]]], + +[ + [[n,chinese_longevity_herbs],[[v,a1]],":-", + [ + [[n,writeln],["Would you like to have Reishi Mushroom, Goji berries, Ginseng, He-Shou-Wu, Gotu Kola and Schisandra each day for longevity?"]], + [[n,read_string],[[v,a1]]] + ]] +]). 
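+
+% Most of the cases above reduce to "print a question, read a string,
+% branch on the answer".  A hedged plain-Prolog equivalent of that pattern
+% using SWI-Prolog's read_line_to_string/2 (ask/2 and yes_no/4 are
+% illustrative names, not part of the interpreter):
+%
+% :- use_module(library(readutil)).
+%
+% ask(Question,Answer) :-
+%     writeln(Question),
+%     read_line_to_string(user_input,Answer).
+%
+% yes_no(Question,Then,Else,Result) :-
+%     ask(Question,A),
+%     ( A == "y" -> Result = Then ; Result = Else ).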
+ + + +test_open_types_cases(25,[[n,episode_character],[[v,a]]], + [ +[[n,episode_character],[[t,loop0]]], +[[t,loop0],[[[t,loop1],[t,loop1]]]], +[[t,loop1],[[[t,loop2],[t,loop2]]]], +[[t,loop2],[[[t,loop3],[t,loop3]]]], +[[t,loop3],[[[t,items],[t,items]]]], +[[t,items],[[t,number],[t,number],[t,number],[t,number],[t,string]]] +], + [[[n,episode_character],[output]]], + +[ + [[n,episode_character],[[v,ds3]],":-", + [ + [[n,findall],[[v,ds2], + [ + [[n,member],[[v,l1],[1,2]]], + [[n,findall],[[v,ds21], + [ + [[n,member],[[v,l11],[3,4]]], + [[n,findall],[[v,ds1], + [ + [[n,member],[[v,l2],[5,6]]], + [[n,findall],[[[v,l1],[v,l11],[v,l2],[v,l3],[v,d]], + [ + [[n,member],[[v,l3],[7,8]]], + [[n,equals4],[[v,line],["Level",[v,l1],[v,l11],[v,l2],[v,l3], + "Please write a detail."]]], + [[n,writeln],[[v,line]]], + [[n,read_string],[[v,d]]]], + + [v,ds1]]]], + [v,ds21]]]], + [v,ds2]]]] + , + [v,ds3]]] + + ]] +]). + + +test_open_types_cases(26,[[n,episode_character],[[v,a]]], + [ +[[n,episode_character],[[t,loop0]]], +[[t,loop0],[[[t,loop1],[t,loop1]]]], +[[t,loop1],[[[t,items],[t,items]]]], +[[t,items],[[t,number],[t,number],[t,string]]] +], + [[[n,episode_character],[output]]], + +[ + [[n,episode_character],[[v,ds3]],":-", + [ + [[n,findall],[[v,ds2], + [ + [[n,member],[[v,l1],[1,2]]], + [[n,findall],[[[v,l1],[v,l2],[v,d]], + [ + [[n,member],[[v,l2],[3,4]]], + [[n,equals4],[[v,line],["Level",[v,l1],[v,l2], + "Please write a detail."]]], + [[n,writeln],[[v,line]]], + [[n,read_string],[[v,d]]]], + + [v,ds2]]]] + , + [v,ds3]]] + + ]] +]). 
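+
+% In cases 25 and 26 above, each [[t,loopN],[[[t,loopN1],...]]] declaration
+% peels one level of nesting off the findall result, bottoming out at
+% [t,items].  So case 26's loop0 -> loop1 -> items chain types a two-level
+% list whose leaves match [number,number,string].  A hedged plain-Prolog
+% check of that leaf shape (items_shape/1 is an illustrative name):
+%
+% items_shape([L1,L2,D]) :- number(L1), number(L2), string(D).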
+ +% ["Fundamentals of Meditation and Meditation Indicators","FUNDAMENTALS OF MEDITATION by Lucian Green Pranayama 4 of 4.txt",0,algorithms,"Soma"] + +test_open_types_cases(27,[[n,soma],[[v,a1]]], + [[[n,soma],[[t,string]]]], + [[[n,soma],[output]]], + +[ + [[n,soma],[[v,a1]],":-", + [ + [[n,writeln],["Will you spiritually drink the soma each morning to stop digestive system pops from practising the sutra?"]], + [[n,read_string],[[v,a1]]] + ]] +]). + +% ["Fundamentals of Pedagogy and Pedagogy Indicators","PEDAGOGY INDICATORS by Lucian Green Fewer Stillbirths 3 of 3.txt",0,algorithms,"Fewer Stillbirths 3 of 3"] + +test_open_types_cases(28,[[n,fewer_stillbirths],[[v,a1]]], + [[[n,fewer_stillbirths],[[t,string]]]], + [[[n,fewer_stillbirths],[output]]], + +[ + [[n,fewer_stillbirths],[[v,a1]],":-", + [ + [[n,writeln],["Will you follow the instructions and breason out at least 80 breasonings before conception to prevent stillbirth?"]], + [[n,read_string],[[v,a1]]] + ]] +]). + +% ["Computational English","COMPUTATIONAL ENGLISH by Lucian Green Analysing characteristics of arguments 4 of 4.txt",0,algorithms,"[""Green, L 2021, Analysing characteristics of arguments 4 of 4, Lucian Academy Press, Melbourne."",""Green, L 2021"",1,""COMPUTATIONAL ENGLISH"] + +test_open_types_cases(29,[[n,properties_of_arguments],[[v,a1],[v,a2]]], + [[[n,properties_of_arguments],[[t,string],[t,string]]]], + [[[n,properties_of_arguments],[output,output]]], + +[ + [[n,properties_of_arguments],[[v,a1],[v,a2]],":-", + [ + [[n,writeln],["What is the conclusion?"]], + [[n,read_string],[[v,a1]]], + + [[n,writeln],["What is the reason?"]], + [[n,read_string],[[v,a2]]] + ]] +]). + +% ["Fundamentals of Meditation and Meditation Indicators","FUNDAMENTALS OF MEDITATION by Lucian Green Meditation Teacher Sutra 1 of 4.txt",0,algorithms,"8. I prepared to keep the positive gifts. I did this by differentiating between apples and oranges. First, I looked at the apple. Second, I looked at the orange. 
Third, I found differences between them. In this way, I prepared to keep the positive gifts by differentiating between apples and oranges."] + +% * I prepared to keep the positive gifts. + +test_open_types_cases(30,[[n,keep_positive_gifts],[[v,a1]]], + [[[n,keep_positive_gifts],[[t,string]]]], + [[[n,keep_positive_gifts],[output]]], + +[ + [[n,keep_positive_gifts],[[v,a1]],":-", + [ + [[n,writeln],["Do you keep the positive gifts, the apple, banana and orange?"]], + [[n,read_string],[[v,a1]]] + ]] +]). + +% * I did this by differentiating between apples and oranges. + +test_open_types_cases(31,[[n,differentiate],[[v,a1]]], + [[[n,differentiate],[[t,string]]]], + [[[n,differentiate],[output]]], + +[ + [[n,differentiate],[[v,a1]],":-", + [ + [[n,writeln],["What is the first type?"]], + [[n,read_string],[[v,a2]]], + + [[n,writeln],["What is the second type?"]], + [[n,read_string],[[v,a3]]], + + [[n,"->"],[[[n,equals4],[[v,a2],[v,a3]]], + [[n,equals4],[[v,a1],"The first and second types are the same"]], + [[n,equals4],[[v,a1],"The first and second types are not the same"]]]] + + ]] +]). + +% ["Fundamentals of Meditation and Meditation Indicators","FUNDAMENTALS OF MEDITATION by Lucian Green Appearances 4 of 4.txt",0,algorithms,"31. I prepared to enter the room in the heartland. I did this by writing the Room Essay Press Release. First, I wrote that 250 breasonings expanded to 50 As. Second, I wrote that a breasoned out pop song expanded to 50 As. Third, I wrote the classical music composition contained 5 pop songs. In this way, I prepared to enter the room in the heartland by writing the Room Essay Press Release."] + +% * Second, I wrote that a breasoned out pop song expanded to 50 As. 
+ +test_open_types_cases(32,[[n,product],[[v,a1],[v,a2]]], + [[[n,product],[[t,string],[t,string]]]], + [[[n,product],[output,output]]], + +[ + [[n,product],[[v,a1],[v,a2]],":-", + [ + [[n,writeln],["What is your product?"]], + [[n,read_string],[[v,a1]]], + + [[n,writeln],["Do you have 50 As for it?"]], + [[n,read_string],[[v,a2]]] + ]] +]). + +% * Third, I wrote the classical music composition contained 5 pop songs. + +test_open_types_cases(33,[[n,music_form_number],[[v,a]]], + [[[n,music_form_number],[[t,number]]]], + [[[n,music_form_number],[output]]], + +[ + [[n,music_form_number],[[v,a]],":-", + [ + [[n,writeln],["How many form sections should your composition have?"]], + [[n,read_string],[[v,a1]]], + [[n,stringtonumber],[[v,a1],[v,a]]] + ]] +]). + +% * Prevents + +test_open_types_cases(34,[[n,prevent],[[v,a1],[v,a2]]], + [[[n,prevent],[[t,string],[t,string]]]], + [[[n,prevent],[output,output]]], + +[ + [[n,prevent],[[v,a1],[v,a2]],":-", + [ + [[n,writeln],["What will you prevent?"]], + [[n,read_string],[[v,a1]]], + + [[n,writeln],["What will you prevent it with?"]], + [[n,read_string],[[v,a2]]] + ]] +]). + +% * love + +test_open_types_cases(35,[[n,love],[[v,a1],[v,a2]]], + [[[n,love],[[t,string],[t,string]]]], + [[[n,love],[output,output]]], + +[ + [[n,love],["y","y"],":-", + [ + [[n,writeln],["Do you love your partner? (y/n)"]], + [[n,read_string],["y"]], + + [[n,writeln],["Do they love you? (y/n)"]], + [[n,read_string],["y"]] + ]] +]). + + + + diff --git a/listprologinterpreter/lpiverify4_test_bt_lang_all.pl b/listprologinterpreter/lpiverify4_test_bt_lang_all.pl new file mode 100644 index 0000000000000000000000000000000000000000..df6c0a862d89d30c0b80be0a38e946e526a96a62 --- /dev/null +++ b/listprologinterpreter/lpiverify4_test_bt_lang_all.pl @@ -0,0 +1,143 @@ +%:-include('../Languages/make_docs.pl'). + +%:-include('lpiverify4.pl'). +%:-include('lpiverify4_types.pl'). +%:-include('lpiverify4_open.pl'). +%:-include('lpiverify4_open_types.pl'). 
+
+% test (NTotal3,Query,Functions,Result)
+% test_types_cases (NTotal3,Query,Types,Modes,Functions,Result)
+% testopen_cases (N, Query,Functions)
+% test_open_types_cases (NTotal3,Query,Types,Modes,Functions)
+
+%% test_all_bt00("en2",off,NTotal,Score).
+
+test_all_bt00(Lang,Debug,NTotal,Score) :-
+ retractall(lang(_)),
+ assertz(lang(Lang)),
+
+ test_all_bt0(test,4,Lang,Debug,NT1,S1),
+ writeln0([lpiverify4,S1,/,NT1,passed]),
+ writeln0(""), writeln0(""),
+
+ test_all_bt0(test_types_cases,6,Lang,Debug,NT2,S2),
+ writeln0([lpiverify4_types,S2,/,NT2,passed]),
+ writeln0(""), writeln0(""),
+
+ test_all_bt0(testopen_cases,3,Lang,Debug,NT3,S3),
+ writeln0([lpiverify4_open,S3,/,NT3,passed]),
+ writeln0(""), writeln0(""),
+
+ test_all_bt0(test_open_types_cases,5,Lang,Debug,NT4,S4),
+ writeln0([lpiverify4_open_types,S4,/,NT4,passed]),
+ writeln0(""), writeln0(""),
+
+ NTotal is NT1+NT2+NT3+NT4,
+ Score is S1+S2+S3+S4.
+
+test_all_bt0(Test,Arity,Lang,Debug,NTotal,Score) :-
+ functor(Test2,Test,Arity),
+ findall(Test2,(Test2),B),length(B,NTotal1),
+test_all_bt0(Test,Arity,Lang,Debug,0,NTotal,0,Score,NTotal1),!.
+test_all_bt0(_Test,_Arity,_Lang,_Debug,NTotal,NTotal,Score,Score,NTotal) :-
+%NTotal=105,
+ !.
+test_all_bt0(Test,Arity,Lang,Debug,NTotal1,NTotal2,Score1,Score2,NTotal4) :-
+ NTotal3 is NTotal1+1,
+ test_all_bt000(Test,Debug,NTotal3,Score1,Score3,Lang),
+ writeln0(""),
+ test_all_bt0(Test,Arity,Lang,Debug,NTotal3,NTotal2,Score3,Score2,NTotal4),!.
+
+%% test_all_bt01 individual cases, Debug=trace=on or off, N=case number, Passed=output=result
+
+%% test_all_bt01(test, 4,"en2",off,1,Passed).
+%% test_all_bt01(test_types_cases,6,"en2",off,1,Passed).
+%% test_all_bt01(testopen_cases, 3,"en2",off,1,Passed).
+%% test_all_bt01(test_open_types_cases, 5,"en2",off,1,Passed). 
+ +test_all_bt01(Test,_Arity,Lang,Debug,NTotal3,Passed) :- + test_all_bt000(Test,Debug,NTotal3,0,Passed1,Lang), + (Passed1=1->Passed=passed;Passed=failed), + /** + ((international_interpret([lang,"en"],Debug,Query,Functions,Result1),%%writeln(Result1), + Result=Result1 + )->(Passed=passed,writeln([test_all_bt0,N,passed]));(Passed=failed,writeln([test_all_bt0,N,failed]))), +**/ + !. + +test_all_bt000(test,Debug,NTotal3,Score1,Score3,Lang) :- + test(NTotal3,Query,Functions,Result), + trans_alg1(Query,"en",Lang,Query1), + (Query=Query1->true%writeln("Query=Query1") + ;(writeln0("not(Query=Query1)"),abort)), + trans_alg1(Functions,"en",Lang,Functions1), + (Debug=on->writeln1(Functions);true), + (Debug=on->writeln1(Functions1);true), + (Functions=Functions1->true%writeln("Functions=Functions1") + ;(writeln0("not(Functions=Functions1)"),abort)), + trans_alg1(Result,"en",Lang,Result1), + (Result1=_Result11->true%writeln("Result1=Result11") + ;(writeln0("not(Result1=Result11)"),abort)), + (international_interpret([lang,"en"],Debug,Query1,Functions1,Result1) + %%writeln1(Result2 + ->(Score3 is Score1+1,writeln0([test,NTotal3,passed]));(Score3=Score1,writeln0([test,NTotal3,failed]))). 
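+
+% The clause above checks that translating the query and function list from
+% English into Lang and back (via trans_alg1/4, defined at the end of this
+% file) is the identity, aborting otherwise.  The check reduces to this
+% hedged plain-Prolog sketch (round_trip_ok/3 is an illustrative name and
+% translate/4 stands in for trans_alg/4):
+%
+% round_trip_ok(Term,Lang,Term1) :-
+%     translate(Term,"en",Lang,Term2),
+%     translate(Term2,Lang,"en",Term1),
+%     ( Term == Term1 -> true
+%     ; writeln("translation round trip failed"), abort ).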
+
+test_all_bt000(test_types_cases,Debug,NTotal3,Score1,Score3,Lang) :-
+
+test_types_cases(NTotal3,Query,Types,Modes,Functions,Result),
+
+ trans_alg1(Query,"en",Lang,Query1),
+
+ findall([F1|Types00],(member([F1|Types003],Types),
+
+ expand_types1(Types003,[],Types00)),Types004),
+
+ %Types004=[[[n, find_record], [[t, brackets], [[[t, list], [[t, number], [t, string]]], [t, number], [t, string]]]]],
+
+ trans_alg1(Types004,"en",Lang,Types005),%,expand_types1(Types002,[],Types003),simplify_types(Types003,[],Types00)
+ %),Types1),
+
+ %simplify_types(Types01,[],Types1),%findall
+ findall([F|Types100],(member([F|Types101],Types005),simplify_types(Types101,[],Types100)),Types1),
+
+ trans_alg1(Modes,"en",Lang,Modes1),
+ trans_alg1(Functions,"en",Lang,Functions1),
+ (Debug=on->writeln1(Functions1);true),
+ trans_alg1(Result,"en",Lang,Result1),
+
+(international_interpret([lang,"en"],Debug,Query1,Types1,Modes1,Functions1,Result1)->(Score3 is Score1+1,writeln0([test_types,NTotal3,passed]));(Score3=Score1,writeln0([test_types,NTotal3,failed]))).
+
+test_all_bt000(testopen_cases,Debug,NTotal3,Score1,Score3,Lang) :-
+ testopen_cases(NTotal3,Query,Functions),
+ trans_alg1(Query,"en",Lang,Query1),
+ trans_alg1(Functions,"en",Lang,Functions1),
+ (Debug=on->writeln1(Functions1);true),
+ ((international_interpret([lang,"en"],Debug,Query1,Functions1,Result),not(Result=[]))->(Score3 is Score1+1,writeln0([testopen,NTotal3,result,Result]),writeln0([testopen,NTotal3,passed]));(Score3=Score1,writeln0([testopen,NTotal3,failed]))).
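+
+% Note: the test_all_bt000/6 clauses above check that translation is
+% lossless: trans_alg1/4 translates each term from "en" to Lang and back,
+% and the callers abort if the round trip is not the identity. A minimal
+% sketch of the round-trip check (identity_translation/2 is illustrative
+% only and is not used elsewhere in this file):
+%
+% identity_translation(Term,Lang) :-
+%     trans_alg(Term,"en",Lang,Translated),
+%     trans_alg(Translated,Lang,"en",Term).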
+
+test_all_bt000(test_open_types_cases,Debug,NTotal3,Score1,Score3,Lang) :-
+ test_open_types_cases(NTotal3,Query,Types,Modes,Functions),
+ trans_alg1(Query,"en",Lang,Query1),
+ %trans_alg1(Types,"en",Lang,Types1),
+
+ findall([F1|Types00],(member([F1|Types003],Types),
+
+ expand_types1(Types003,[],Types00)),Types004),
+
+ %Types004=[[[n, find_record], [[t, brackets], [[[t, list], [[t, number], [t, string]]], [t, number], [t, string]]]]],
+
+ trans_alg1(Types004,"en",Lang,Types005),%,expand_types1(Types002,[],Types003),simplify_types(Types003,[],Types00)
+ %),Types1),
+
+ %simplify_types(Types01,[],Types1),%findall
+ findall([F|Types100],(member([F|Types101],Types005),simplify_types(Types101,[],Types100)),Types1),
+
+ trans_alg1(Modes,"en",Lang,Modes1),
+ trans_alg1(Functions,"en",Lang,Functions1),
+ (Debug=on->writeln1(Functions1);true),
+
+
+((international_interpret([lang,"en"],Debug,Query1,Types1,Modes1,Functions1,Result),not(Result=[]))->(Score3 is Score1+1,writeln0([test_open_types,NTotal3,result,Result]),writeln0([test_open_types,NTotal3,passed]));(Score3=Score1,writeln0([test_open_types,NTotal3,failed]))).
+
+trans_alg1(Query,"en",Lang,Query1) :-
+ trans_alg(Query,"en",Lang,Query2),
+ trans_alg(Query2,Lang,"en",Query1).
diff --git a/listprologinterpreter/lpiverify4_test_lang_all.pl b/listprologinterpreter/lpiverify4_test_lang_all.pl
new file mode 100644
index 0000000000000000000000000000000000000000..54604676675893d7af193b6188c1b30092f01a74
--- /dev/null
+++ b/listprologinterpreter/lpiverify4_test_lang_all.pl
@@ -0,0 +1,134 @@
+%:-include('../Languages/make_docs.pl').
+
+%:-include('lpiverify4.pl').
+%:-include('lpiverify4_types.pl').
+%:-include('lpiverify4_open.pl').
+%:-include('lpiverify4_open_types.pl').
+
+% test (NTotal3,Query,Functions,Result)
+% test_types_cases (NTotal3,Query,Types,Modes,Functions,Result)
+% testopen_cases (NTotal3,Query,Functions)
+% test_open_types_cases (NTotal3,Query,Types,Modes,Functions)
+
+%% test_all00("en2",off,NTotal,Score).
+
+test_all00(Lang,Debug,NTotal,Score) :-
+ retractall(lang(_)),
+ assertz(lang(Lang)),
+
+ test_all0(test,4,Lang,Debug,NT1,S1),
+ writeln0([lpiverify4,S1,/,NT1,passed]),
+ writeln0(""), writeln0(""),
+
+ test_all0(test_types_cases,6,Lang,Debug,NT2,S2),
+ writeln0([lpiverify4_types,S2,/,NT2,passed]),
+ writeln0(""), writeln0(""),
+
+ test_all0(testopen_cases,3,Lang,Debug,NT3,S3),
+ writeln0([lpiverify4_open,S3,/,NT3,passed]),
+ writeln0(""), writeln0(""),
+
+ test_all0(test_open_types_cases,5,Lang,Debug,NT4,S4),
+ writeln0([lpiverify4_open_types,S4,/,NT4,passed]),
+ writeln0(""), writeln0(""),
+
+ NTotal is NT1+NT2+NT3+NT4,
+ Score is S1+S2+S3+S4.
+
+test_all0(Test,Arity,Lang,Debug,NTotal,Score) :-
+ functor(Test2,Test,Arity),
+ findall(Test2,(Test2),B),length(B,NTotal1),
+test_all0(Test,Arity,Lang,Debug,0,NTotal,0,Score,NTotal1),!.
+test_all0(_Test,_Arity,_Lang,_Debug,NTotal,NTotal,Score,Score,NTotal) :-
+%NTotal=105,
+ !.
+test_all0(Test,Arity,Lang,Debug,NTotal1,NTotal2,Score1,Score2,NTotal4) :-
+ NTotal3 is NTotal1+1,
+ test_all000(Test,Debug,NTotal3,Score1,Score3,Lang),
+ writeln0(""),
+ test_all0(Test,Arity,Lang,Debug,NTotal3,NTotal2,Score3,Score2,NTotal4),!.
+
+%% test_all01 individual cases, Debug=trace=on or off, N=case number, Passed=output=result
+
+%% test_all01(test, 4,"en2",off,1,Passed).
+%% test_all01(test_types_cases,6,"en2",off,1,Passed).
+%% test_all01(testopen_cases, 3,"en2",off,1,Passed).
+%% test_all01(test_open_types_cases,5,"en2",off,1,Passed).
+
+test_all01(Test,_Arity,Lang,Debug,NTotal3,Passed) :-
+ test_all000(Test,Debug,NTotal3,0,Passed1,Lang),
+ (Passed1=1->Passed=passed;Passed=failed),
+ /**
+ ((international_interpret([lang,"en"],Debug,Query,Functions,Result1),%%writeln(Result1),
+ Result=Result1
+ )->(Passed=passed,writeln([test_all0,N,passed]));(Passed=failed,writeln([test_all0,N,failed]))),
+**/
+ !.
+ +test_all000(test,Debug,NTotal3,Score1,Score3,Lang) :- + test(NTotal3,Query,Functions,Result), + trans_alg(Query,"en",Lang,Query1), + trans_alg(Functions,"en",Lang,Functions1), + (Debug=on->writeln1(Functions);true), + (Debug=on->writeln1(Functions1);true), + trans_alg(Result,"en",Lang,Result1), + ((international_interpret([lang,Lang],Debug,Query1,Functions1,Result1) + %,writeln1(Result2) + ) + ->(Score3 is Score1+1,writeln0([test,NTotal3,passed]));(Score3=Score1,writeln0([test,NTotal3,failed]))). + +test_all000(test_types_cases,Debug,NTotal3,Score1,Score3,Lang) :- + +test_types_cases(NTotal3,Query,Types,Modes,Functions,Result), + + trans_alg(Query,"en",Lang,Query1), + + retractall(lang(_)), + assertz(lang("en")), + + +%/* + findall([F1|Types00],(member([F1|Types003],Types), + + expand_types1(Types003,[],Types00)),Types004), + + %Types004=[[[n, find_record], [[t, brackets], [[[t, list], [[t, number], [t, string]]], [t, number], [t, string]]]]], + + trans_alg(Types004,"en",Lang,Types005),%,expand_types1(Types002,[],Types003),simplify_types(Types003,[],Types00) + %),Types1), + + %simplify_types(Types01,[],Types1),%findall + findall([F|Types100],(member([F|Types101],Types005), + + %retractall(lang(_)), + %assertz(lang("en")), + + simplify_types(Types101,[],Types100)),Types1), + + %*/ + %trans_alg(Types,"en",Lang,Types1), + trans_alg(Modes,"en",Lang,Modes1), + trans_alg(Functions,"en",Lang,Functions1), + (Debug=on->writeln1(Functions1);true), + trans_alg(Result,"en",Lang,Result1), + +(international_interpret([lang,Lang],Debug,Query1,Types1,Modes1,Functions1,Result1)->(Score3 is Score1+1,writeln0([test_types,NTotal3,passed]));(Score3=Score1,writeln0([test_types,NTotal3,failed]))). 
+
+test_all000(testopen_cases,Debug,NTotal3,Score1,Score3,Lang) :-
+ testopen_cases(NTotal3,Query,Functions),
+ trans_alg(Query,"en",Lang,Query1),
+ trans_alg(Functions,"en",Lang,Functions1),
+ (Debug=on->writeln1(Functions1);true),
+ ((international_interpret([lang,Lang],Debug,Query1,Functions1,Result),not(Result=[]))->(Score3 is Score1+1,writeln0([testopen,NTotal3,result,Result]),writeln0([testopen,NTotal3,passed]));(Score3=Score1,writeln0([testopen,NTotal3,failed]))).
+
+test_all000(test_open_types_cases,Debug,NTotal3,Score1,Score3,Lang) :-
+ test_open_types_cases(NTotal3,Query,Types,Modes,Functions),
+ trans_alg(Query,"en",Lang,Query1),
+ trans_alg(Types,"en",Lang,Types1),
+ trans_alg(Modes,"en",Lang,Modes1),
+ trans_alg(Functions,"en",Lang,Functions1),
+ (Debug=on->writeln1(Functions1);true),
+
+
+((international_interpret([lang,Lang],Debug,Query1,Types1,Modes1,Functions1,Result),not(Result=[]))->(Score3 is Score1+1,writeln0([test_open_types,NTotal3,result,Result]),writeln0([test_open_types,NTotal3,passed]));(Score3=Score1,writeln0([test_open_types,NTotal3,failed]))).
+
diff --git a/listprologinterpreter/lpiverify4_types.pl b/listprologinterpreter/lpiverify4_types.pl
new file mode 100644
index 0000000000000000000000000000000000000000..f8e0419d15c9fc57d968e099c5adde1eccd7663e
--- /dev/null
+++ b/listprologinterpreter/lpiverify4_types.pl
@@ -0,0 +1,1429 @@
+%% test_types(Debug[on/off],Total,Score).
+
+%%:- use_module(library(time)).
+
+%% Test cases, Debug=trace=on or off, NTotal=output=total cases, Score=output=result
+
+test_types(Debug,NTotal,Score) :- test_types(Debug,0,NTotal,0,Score),!.
+test_types(_Debug,NTotal,NTotal,Score,Score) :- NTotal=70, !.
+test_types(Debug,NTotal1,NTotal2,Score1,Score2) :- + NTotal3 is NTotal1+1, + test_types_cases(NTotal3,Query,Types,Modes,Functions,Result), + (international_interpret([lang,"en"],Debug,Query,Types,Modes,Functions,Result)->(Score3 is Score1+1,writeln0([test_types,NTotal3,passed]));(Score3=Score1,writeln0([test_types,NTotal3,failed]))), + writeln0(""), + test_types(Debug,NTotal3,NTotal2,Score3,Score2),!. + +%% Test individual cases, Debug=trace=on or off, N=case number, Passed=output=result + +test_types1(Debug,N,Passed) :- + test_types_cases(N,Query,Types,Modes,Functions,Result), + ((international_interpret([lang,"en"],Debug,Query,Types,Modes,Functions,Result1),%writeln(Result1), + Result=Result1 + )->(Passed=passed,writeln0([test_types,N,passed]));(Passed=failed,writeln0([test_types,N,failed]))),!. + + +%%writeln([eg1]), +test_types_cases(1,[[n,function],[1,1,[v,c]]], +[[[n,function],[[t,number],[t,number],[t,number]]]], +[[[n,function],[input,input,output]]], +[ + [[n,function],[[v,a],[v,b],[v,c]],":-", + [ + [[n,+],[[v,a],[v,b],[v,c]]] + ] + ] +] +,[[[[v,c], 2]]]). + +test_types_cases(2,[[n,function],[[v,a],[v,b],[v,c]]], +[[[n,function],[[t,number],[t,string],[t,predicatename]]]], +[[[n,function],[output,output,output]]], +[ + [[n,function],[[v,a],[v,b],[v,c]],":-", + [ + [[n,=],[[v,a],1]], + [[n,=],[[v,b],"a"]], + [[n,=],[[v,c],[n,a]]] + ]] +] +,[[[[v,a], 1],[[v,b], "a"],[[v,c], [n,a]]]]). + +test_types_cases(3,[[n,function],[[v,a]]], +[ +[[n,function],[[[t,number]]]] +], +[[[n,function],[output]]], +[ + [[n,function],[[1]]] +] +,[[[[v,a], [1]]]]). + +test_types_cases(4,[[n,f],[[v,a],[v,b],[v,c],[v,d]]], +[[[n,f],[[t,number],[t,string],[t,number],[t,string]]]], +[[[n,f],[output,output,output,output]]], +[ + [[n,f],[1,"a",2,"b"]] +] +,[[[[v,a], 1],[[v,b], "a"],[[v,c], 2],[[v,d], "b"]]]). 
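+
+% For orientation, test case 2 above corresponds to roughly the following
+% ordinary Prolog (an informal rendering, not part of the test suite):
+%
+% function(A,B,C) :- A = 1, B = "a", C = a.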
+ +test_types_cases(5,[[n,f],[[v,a],[v,b]]], +[ + [[n,f],[[t,a],[t,b]]], + [[t,a],[[t,number]]], + [[t,b],[[t,string]]] +], +[ + [[n,f],[output,output]] +], +[ + [[n,f],[1,"a"]] +] +,[[[[v,a], 1],[[v,b], "a"]]]). + +test_types_cases(6,[[n,f],[[v,a]]], +[ + [[n,f],[[t,a]]], + [[t,a],[[t,number]]], + [[t,a],[[t,string]]] +], +[ + [[n,f],[output]] +], +[ + [[n,f],["a"]] +] +,[[[[v,a], "a"]]]). + +test_types_cases(7,[[n,map],[[[n,add],[[[n,add],[[[n,add],[1]]]]]],0,[v,d]]], +%test_types_cases(7,[[n,map],[1,0,[v,d]]], +[ +[[n,map],[[t,map1],[t,number],[t,number]]], +[[t,map1],[[t,number]]], +[[t,map1],[[[t,predicatename],[[t,map1]]]]], +[[n,add],[[t,number],[t,number],[t,number]]], +[[n,getitemn],[[t,number],{[t,any]},[t,any]]] +], +[ + [[n,map],[input,input,output]], + + [[n,add],[input,input,output]], + + [[n,getitemn],[input,input,output]] +], +[ + [[n,map],[[v,f1],[v,n1],[v,n]],":-", + [ + [[n,number],[[v,f1]]], + [[n,add],[[v,n1],[v,f1],[v,n]]] + ] + ], + [[n,map],[[v,f1],[v,l],[v,n]],":-", + [ + [[n,equals1],[[v,f1],[[v,f11],[v,f12]]]], + [[n,=],[[v,f11],[n,add]]], + [[n,getitemn],[1,[v,f12],[v,bb]]], + [[v,f11],[[v,l],1,[v,l2]]], + [[n,map],[[v,bb],[v,l2],[v,n]]] + ] + ], + + [[n,add],[[v,a],[v,b],[v,c]],":-", + [ [[n,+],[[v,a],[v,b],[v,c]]] + ]], + + [[n,getitemn],[1,[v,b],[v,c]],":-", + [ [[n,head],[[v,b],[v,c]]] + ]], + [[n,getitemn],[[v,a],[v,b],[v,c]],":-", + [ [[n,not],[[[n,=],[[v,a],1]]]], + [[n,tail],[[v,b],[v,t]]], + [[n,-],[[v,a],1,[v,d]]], + [[n,getitemn],[[v,d],[v,t],[v,c]]] + ]] + +] + +,[[[[v,d], 4]]]). +%%,[[[[v,bb], 1]]]). + + +test_types_cases(8,[[n,f],[[v,d],[v,a],[v,c]]], +[ +[[n,f],[[t,number],{[t,number],[t,string]},[t,number]]] +], +[[[n,f],[output,output,output]]], +[ + [[n,f],[1,[1,"a",2,"b"],1]] +] +,[[[[v,a],[1,"a",2,"b"]],[[v,c],1],[[v,d],1]]]). + +test_types_cases(9,[[n,f],[1,"a"]], +[ + [[n,f],[[t,a],[t,b]]], + [[t,a],[[t,number]]], + [[t,b],[[t,string]]] +], +[ + [[n,f],[input,input]] +], +[ + [[n,f],[1,"a"]] +] +,[[]]). 
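+
+% getitemn/3 in test case 7 above is an nth-element helper built from
+% head/tail and recursion; in ordinary Prolog it is roughly the following
+% (an informal rendering, not part of the test suite):
+%
+% getitemn(1,[X|_],X).
+% getitemn(N,[_|T],X) :- not(N = 1), M is N-1, getitemn(M,T,X).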
+ +test_types_cases(10,[[n,f],["a"]], +[ + [[n,f],[[t,a]]], + [[t,a],[[t,number]]], + [[t,a],[[t,string]]] +], +[ + [[n,f],[input]] +], +[ + [[n,f],["a"]] +] +,[[]]). + +test_types_cases(11,[[n,call1b],[[v,b],[1,11,111]]], + [ +[[n,call1b],[[t,number], [[t,number],[t,number],[t,number]]]] +], + [[[n,call1b],[output,input]]], + +[ + [[n,call1b],[[v,b],[v,a]],":-", + [ [[n,call],[[lang,same],same,[[n,member2a],[[v,b],[v,a]]], + [[[n,member2a],[[t,number],[[t,number],[t,number],[t,number]]]]], + [[[n,member2a],[output,input]]], + +[[[n,member2a],[[v,b],[v,a]],":-", + [ [[n,member],[[v,b],[v,a]]],[[n,cut]]] + ]]]], + [[n,cut]]]] + +],[[[[v,b],1]]]). + + +test_types_cases(12,[[n,call1b],[[v,b],[1,11,111]]], + [ +[[n,call1b],[[t,number],[[t,number],[t,number],[t,number]]]] +], + [[[n,call1b],[output,input]]], + +[ + [[n,call1b],[[v,b],[v,a]],":-", + [ [[n,call],[[lang,same],same,[[n,member2a],[[v,b],[v,a]]], + %[[[n,member2a],[[[t,brackets],[[t,number],[t,number],[t,number]]],[t,number]]]], + %[[[n,member2a],[input,output]]], + +[[[n,member2a],[[v,b],[v,a]],":-", + [ [[n,member],[[v,b],[v,a]]],[[n,cut]]] + ]]]], + [[n,cut]]]] + +],[[[[v,b],1]]]). + + + +test_types_cases(13,[[n,person],["not-care",[v,output]]], + [[[n,person],[[t,string],[t,string]]]], + [[[n,person],[input,output]]], + +[ + [[n,person],["care","care"]], + [[n,person],[[v,a],"justice to care"]] + +],[[[[v,output],"justice to care"]]]). + +% ["Computational English","COMPUTATIONAL ENGLISH by Lucian Green Dereconstruction 3 of 4.txt",0,algorithms,"24. *I prepared to be an expert on the brain. I did this by writing about neuroscience. First I wrote about food. Second, I wrote about activity. Third, I wrote about sleep. 
In this way, I prepared to be an expert on the brain by writing about neuroscience."] + +test_types_cases(14,[[n,neuroscience],["**","***",[v,output]]], + [[[n,neuroscience],[[t,string],[t,string],[t,number]]]], + [[[n,neuroscience],[input,input,output]]], + +[ + [[n,neuroscience],[[v,a],[v,b],[v,c]],":-", + [ [[n,string_length],[[v,a],[v,a1]]], + [[n,string_length],[[v,b],[v,b1]]], + [[n,+],[[v,a1],[v,b1],[v,c]]] + ]] + +],[[[[v,output],5]]]). + +% ["Fundamentals of Meditation and Meditation Indicators","FUNDAMENTALS OF MEDITATION by Lucian Green Blue Nature 1 of 4.txt",0,algorithms,"9. *I prepared to connect together main points from cliques. I did this by writing on something interesting to do with the song. First, I identified the topic. Second, I constructed an aphohedron from all the songs parts. Third, I thought of interconnections between clique nodes from the randomly broken down aphohedron. In this way, I prepared to connect together main points from cliques by writing on something interesting to do with the song."] + +test_types_cases(15,[[n,connect_cliques],[[["a",1],[1,2],[2,"b"]],[["a",3],[3,4],[4,"b"]],[v,output]]], + [ +[[n,connect_cliques],[[t,list2],[t,list2],[t,list2]]], +[[t,list2],[{[t,set]}]], +[[t,set],[{[t,item]}]], +[[t,item],[[t,number]]], +[[t,item],[[t,string]]] +], + [[[n,connect_cliques],[input,input,output]]], + +[ + [[n,connect_cliques],[[v,a],[v,b],[v,c]],":-", + [ [[n,append],[[v,a],[v,b],[v,c]]] + ]] +],[[[[v,output],[["a",1],[1,2],[2,"b"],["a",3],[3,4],[4,"b"]]]]]). + +/** +test_types_cases(16,[[n,a],[[1,[1,[1]]]]], + [[[n,a],[[t,b2]]], + [[t,b2],[[t,number]]], + [[t,b2],[[t,number],[t,b2]]]], + [[[n,a],[input]]], + +[ + [[n,a],[[v,a]],":-", + [ [[n,true]] + ]] +],[[]]). +**/ + +% ["Fundamentals of Meditation and Meditation Indicators","FUNDAMENTALS OF MEDITATION by Lucian Green Children, H1, Earning Jobs and Protection in Jobs 3 of 4.txt",0,algorithms,"24. *I prepared to watch the insect eat a fruit. 
I did this by feeding it the raspberry. First, I lifted the raspberry on a fork. Second, I placed it in the airlock. Third, I unlocked the airlocks den side to feed the raspberry to the mosquitoes. In this way, I prepared to watch the insect eat a fruit by feeding it the raspberry."] + + +test_types_cases(16,[[n,insect_food],["food",[v,stomach]]], + [[[n,insect_food],[[t,string],[t,string]]]], + [[[n,insect_food],[input,output]]], + +[ + [[n,insect_food],[[v,mouth],[v,mouth]]] +],[[[[v,stomach],"food"]]]). + +% ["Computational English","COMPUTATIONAL ENGLISH by Lucian Green Drawing connections 1 of 4.txt",0,algorithms,"9. I prepared to distance myself from *. I did this by shelling the snow pea. First, I read that Winston Churchill asked that if funding was diverted from arts then what would we be fighting for? Second, I determined that arts was the conclusion from defence, not vice versa. Third, I determined that arts is necessary rather than defence. In this way, I prepared to distance myself from * by shelling the snow pea."] + +test_types_cases(17,[[n,distance_myself],[2,[v,distance]]], + [[[n,distance_myself],[[t,number],[t,number]]]], + [[[n,distance_myself],[input,output]]], + + [[[n,distance_myself],[[v,a],[v,b]],":-", + [ [[n,*],[[v,a],2,[v,b]]] + ]] +],[[[[v,distance],4]]]). + +% ["Short Arguments","Nut_and_Bolt.txt",0,algorithms,"13. *I prepared to want the baby. I did this by synthesising the chemistry of the reproductive system with the nut and bolt. First, I found the baby. Second, I found the parents. 
Third, I bolted the baby to the parents."] + +test_types_cases(18,[[n,want_baby],["yes","yes","yes",[v,result]]], + [[[n,want_baby],[[t,string],[t,string],[t,string],[t,string]]]], + [[[n,want_baby],[input,input,input,output]]], + + [[[n,want_baby],[[v,money],[v,as],[v,parents_in_academia],[v,result]],":-", + [ [[n,=],[[v,money],"yes"]], + [[n,=],[[v,as],"yes"]], + [[n,=],[[v,parents_in_academia],"yes"]], + [[n,=],["yes",[v,result]]] + + ]] +],[[[[v,result],"yes"]]]). + +% ["Short Arguments","Competition 3.txt",0,algorithms,"28. I explored losing as well. I agreed with the competition. I found the competitor. * I saw he was weaker. I agreed with (was stronger than) him."] + +test_types_cases(19,[[n,saw_weaker],[1,0,[v,result]]], + [[[n,saw_weaker],[[t,number],[t,number],[t,number]]]], + [[[n,saw_weaker],[input,input,output]]], + + [[[n,saw_weaker],[[v,my_value],[v,his_value],[v,result]],":-", + [ [[n,>],[[v,my_value],[v,his_value]]], + [[n,=],[1,[v,result]]] + + ]] +],[[[[v,result],1]]]). + + +% ["Time Travel","Interesting histories to visit 4.txt",0,algorithms,"*Interesting histories to visit 4"] + +test_types_cases(20,[[n,visit_staged_history],["us",[v,staged_history]]], + [[[n,visit_staged_history],[[t,string],[t,string]]]], + [[[n,visit_staged_history],[input,output]]], + +[ + [[n,visit_staged_history],[[v,now],[v,now]]] +],[[[[v,staged_history],"us"]]]). + +% ["Fundamentals of Pedagogy and Pedagogy Indicators","FUNDAMENTALS OF PEDAGOGY by Lucian Green Time to Prepare 2 of 4.txt",0,algorithms,"11. *The teacher prepared to represent being interested in a lesson by dotting it on. He did this by climbing the rope ladder. First, he found the correct ladder. Second, he tested that the ladder was about to start. Third, he climbed the ladder with his arms and legs. 
In this way, the teacher prepared to represent being interested in a lesson by dotting it on by climbing the rope ladder."] + + +test_types_cases(21,[[n,memorise_point],["point",[v,memory_out]]], + [[[n,memorise_point],[[t,string],[t,string]]]], + [[[n,memorise_point],[input,output]]], + +[ + [[n,memorise_point],[[v,memory_in],[v,memory_in]]] +],[[[[v,memory_out],"point"]]]). + +% ["Fundamentals of Meditation and Meditation Indicators","FUNDAMENTALS OF MEDITATION by Lucian Green Hours Prayer 1 of 4.txt",0,algorithms,"8. *I prepared to endorse Nietzsches brilliance. I did this by writing Alexius Meinongs probable comments on the Medicine blog. First, I called it Anarchy 3. Second, I liked brilliance. Third, I liked Nietzsches brilliance. In this way, I prepared to endorse Nietzsches brilliance by writing Alexius Meinongs probable comments on the Medicine blog."] + +% trope chain + +test_types_cases(22,[[n,function],[[["a","b"],["b","c"]]]], + [ +[[n,function],[{[t,list2]}]], +[[t,list2],[[t,string],[t,string]]], +[[n,reverse],[{[t,list2]},{[t,list2]},{[t,list2]}]], +[[n,function2],[{[t,list2]},[t,string],[t,string]]], +[[n,length],[{[t,list2]},[t,number],[t,number]]] +], + + [[[n,function],[input]], + [[n,reverse],[input,input,output]], + [[n,function2],[input,input,input]], + [[n,length],[input,input,output]]], + +[[[n,function],[[v,a]],":-", +[[[n,length],[[v,a],0,[v,b]]], +[[n,=],[[v,b],1]]]], + +[[n,function],[[v,a]],":-", +[[[n,head],[[v,a],[v,d]]], +[[n,equals1],[[v,d],[[v,e],[v,f]]]], +[[n,reverse],[[v,a],[],[v,a1]]], +[[n,head],[[v,a1],[v,d1]]], +[[n,equals1],[[v,d1],[[v,e1],[v,f1]]]], +[[n,function2],[[v,a],[v,f],[v,f1]]] +%,[[n,cut]] +]], + +[[n,reverse],[[],[v,l],[v,l]]], + +[[n,reverse],[[v,l],[v,m],[v,n]],":-", +[[[n,head],[[v,l],[v,h]]], +[[n,tail],[[v,l],[v,t]]], +[[n,wrap],[[v,h],[v,h1]]], +[[n,append],[[v,h1],[v,m],[v,o]]], +[[n,reverse],[[v,t],[v,o],[v,n]]]]], + +[[n,function2],[[v,a],[v,b],[v,f]],":-", +[[[n,member],[[v,d],[v,a]]], 
+[[n,equals1],[[v,d],[[v,b],[v,f]]]] +%,[[n,cut]] +]], + +[[n,function2],[[v,a],[v,b],[v,c]],":-", +[[[n,member],[[v,d],[v,a]]], +[[n,equals1],[[v,d],[[v,b],[v,f]]]], +[[n,function2],[[v,d],[v,f],[v,c]]]]], + +[[n,length],[[],[v,l],[v,l]]], + +[[n,length],[[v,l],[v,m1],[v,n]],":-", +[[[n,not],[[[n,=],[[v,l],[]]]]], +[[n,tail],[[v,l],[v,t]]], +[[n,+],[[v,m1],1,[v,m2]]], +[[n,length],[[v,t],[v,m2],[v,n]]]]]], +[[]]). + + +% recursive types + +test_types_cases(23,[[n,connect_cliques],[[["a",1],[1,2],[2,"b"]],[["a",3],[3,4],[4,"b"]],[v,output]]], + [ +[[n,connect_cliques],[[t,list2],[t,list2],[t,list2]]], +[[t,item],[[t,number]]], +[[t,item],[[t,string]]], +[[t,list2],[{[t,item]}]], +[[t,list2],[{[t,list2]}]] +], + [[[n,connect_cliques],[input,input,output]]], + +[ + [[n,connect_cliques],[[v,a],[v,b],[v,c]],":-", + [ [[n,append],[[v,a],[v,b],[v,c]]] + ]] +],[[[[v,output],[["a",1],[1,2],[2,"b"],["a",3],[3,4],[4,"b"]]]]]). + + +% ["Mind Reading","Mr other times 7.txt",0,algorithms,"57. *I responsibly chose an ontological value (side of the car that the steering wheel was on in the particular car) by mind reading the other time."] + +% Aus, UK - left hand traffic, US - right hand traffic + +test_types_cases(24,[[n,hand_traffic],["Australia",[v,a1]]], + [[[n,hand_traffic],[[t,string],[t,string]]]], + [[[n,hand_traffic],[input,output]]], + +[ + [[n,hand_traffic],["Australia","left"]], + [[n,hand_traffic],["UK","left"]], + [[n,hand_traffic],["US","right"]] + +],[[[[v,a1],"left"]]]). + +% ["Computational English","COMPUTATIONAL ENGLISH by Lucian Green Finite Data will be a Solution in Conglish 2 of 4.txt",0,algorithms,"17. *I prepared to judge the way the other person was speaking. I did this by watching the diareasoner identify the speech rate in her partner. First, I counted the number of words over the time. Second, I counted the number of minutes. Third, I calculated the speech rate to equal the number of words divided by the number of minutes. 
In this way, I prepared to judge the way the other person was speaking by watching the diareasoner identify the speech rate in her partner."] + +test_types_cases(25,[[n,way_of_speaking_a],[["high-pitched","smiling"],[v,way]]], + [ +[[n,way_of_speaking_a],[{[t,string]},{[t,string]}]], +[[n,way_of_speaking],[[t,string],[t,string]]] +], + [[[n,way_of_speaking_a],[input,output]], + [[n,way_of_speaking],[input,output]]], + +[ + [[n,way_of_speaking_a],[[v,properties],[v,expression]],":-", + [[[n,equals4],[[v,properties],[[v,item1a],[v,item1b]]]], + [[n,findall],[[v,item2], + [[[n,way_of_speaking],[[v,item1a],[v,item2]]]], + %[[n,=],[[v,item1],[v,item1a]]]], + [v,items2a]]], + [[n,sort],[[v,items2a],[v,items2a1]]], + [[n,findall],[[v,item2], + [[[n,way_of_speaking],[[v,item1b],[v,item2]]]], + %[[n,=],[[v,item1],[v,item1b]]]], + [v,items2b]]], + [[n,sort],[[v,items2b],[v,items2b1]]], + [[n,intersection],[[v,items2a1],[v,items2b1],[v,expression]]] + ]], + [[n,way_of_speaking],["high-pitched","happy"]], + [[n,way_of_speaking],["high-pitched","unhappy"]], + [[n,way_of_speaking],["low-pitched","angry"]], + [[n,way_of_speaking],["smiling","happy"]], + [[n,way_of_speaking],["frowning","sad"]] +],[[[[v,way],["happy"]]]]). + +% ["Time Travel","Space Flight 2.txt",0,algorithms,"32. *The space ship algorithm automated meditation before space jumps and when it detected pedagogy help."] + +% The time around the time to time travel to was mind read. + +% choose >= 10 time units after the projectile has passed. + +test_types_cases(26,[[n,choose_time],[[-15,-10,-5,0,5,10,15],[v,time]]], + [ +[[n,choose_time],[{[t,number]},[t,number]]] +], + [[[n,choose_time],[input,output]]], + +[ + [[n,choose_time],[[v,a],10],":-", + [ [[n,member],[10,[v,a]]] + ]] +],[[[[v,time],10]]]). + + +% ["Time Travel","Technologies in Times 1.txt",0,algorithms,"63. *The workings of DNA and RNA were examined in cloning for medicine."] + +% The sequences were the same. 
+ +test_types_cases(27,[[n,same],[[1,2,3,4,5,6],[1,2,3,4,5,6]]], + [ +[[n,same],[{[t,number]},{[t,number]}]] +], + [[[n,same],[input,input]]], + +[ + [[n,same],[[v,sequence1],[v,sequence1]]] +],[[]]). + +% ["Lecturer","Lecturer.txt",0,algorithms,"2. *I found what the person aimed for. I wrote on hermeneutics. I identified the discourse. I grouped the topics into ideologies. I grouped the ideas into ontologies."] + +test_types_cases(28,[[n,aimed],[[["bulls-eye","red"],["outer-ring","blue"]],"bulls-eye",[v,object]]], + [ +[[n,aimed],[{{[t,string],[t,string]}},[t,string],[t,string]]] +], + [[[n,aimed],[input,input,output]]], + +[ + [[n,aimed],[[v,a],[v,b],[v,c]],":-", + [ [[n,member],[[v,d],[v,a]]], + [[n,equals4],[[v,d],[[v,b],[v,c]]]] + ]] +],[[[[v,object],"red"]]]). + +% ["Computational English","COMPUTATIONAL ENGLISH by Lucian Green Analysing characteristics of arguments 1 of 4.txt",0,algorithms,"Do the premises work in all cases?"] + +test_types_cases(29,[[n,verify_modus_ponens],[["a",["a","b"],"b"]]], + [ +[[n,verify_modus_ponens],[[[t,string],{[t,string],[t,string]},[t,string]]]] +], + [[[n,verify_modus_ponens],[input]]], + +[ + [[n,verify_modus_ponens],[[v,a]],":-", + [ [[n,equals4],[[v,a],[[v,a1],[[v,a1],[v,b1]],[v,b1]]]] + ]] +],[[]]). + + +% ["Short Arguments","Rebreasoning.txt",0,algorithms,"8. I prepared to verify the connection. I did this by moving on to the next point. First, I breasoned out the first point. Second, I breasoned out the connection to the second point. Third, I moved on to the second point."] + +% I prepared to work the connection out. + +test_types_cases(30,[[n,work_modus_ponens_out],[["a","b"],[v,mp]]], + [ +[[n,work_modus_ponens_out],[{[t,string]},{[t,string]}]] +], + [[[n,work_modus_ponens_out],[input,output]]], + +[ + [[n,work_modus_ponens_out],[[v,a],[v,c]],":-", + [ [[n,equals4],[[v,a],[[v,a1],[v,b1]]]], + [[n,equals4],[[v,c],[[v,a1],"->",[v,b1]]]] + ]] +],[[[[v,mp],["a","->","b"]]]]). 
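+
+% Test case 30 above rewrites a premise pair as an implication by
+% destructuring with equals4; in ordinary Prolog, roughly the following
+% (an informal rendering, not part of the test suite):
+%
+% work_modus_ponens_out([A,B],[A,"->",B]).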
+ +% ["Computational English","COMPUTATIONAL ENGLISH by Lucian Green Exploring opposites in Hamlet 2 of 4.txt",0,algorithms,"11. *I prepared to experience the art forms of God (the master). I did this by trusting God (the master). First, I trusted the art of the master. Second, I trusted the music of the master. Third, I trusted the architecture of the master. In this way, I prepared to experience the art forms of God (the master) by trusting God (the master)."] + +test_types_cases(31,[[n,art],[["I","ate","apple"],[v,art_form]]], + +[[[n,art],[{[t,string]},[[t,string],{[t,string]}]]]] +, + [[[n,art],[input,output]]], + +[ + [[n,art],[[v,a],[v,c]],":-", + [ [[n,equals4],[[v,a],[[v,a1],[v,b1],[v,c1]]]], + [[n,equals4],[[v,c],[[v,b1],[[v,a1],[v,c1]]]]] + ]] +],[[[[v,art_form],["ate",["I","apple"]]]]]). + +% ["Fundamentals of Pedagogy and Pedagogy Indicators","FUNDAMENTALS OF PEDAGOGY by Lucian Green Time to Do 3 of 4.txt",0,algorithms,"28. The fun park visitor prepared to ride the helter skelter. He did this by licking the chocolate from his finger. First, he started from the base of his finger. Second, he spiraled his tongue upwards, licking all the chocolate from his finger on the way. Third, he stopped when he reached the top. 
In this way, the fun park visitor prepared to ride the helter skelter by licking the chocolate from his finger."] + +% Triangle train line + +test_types_cases(32,[[n,triangle_train1],["Canterbury","Bambury"]], + [ +[[n,triangle_train1],[[t,string],[t,string]]], +[[n,triangle_train],[[t,string],[t,string]]], +[[n,link],[[t,string],[t,string]]] +], + [[[n,triangle_train1],[input,input]], + [[n,triangle_train],[input,output]], + [[n,link],[input,output]]], + +[ + [[n,triangle_train1],[[v,a],[v,b]],":-", + [ [[n,triangle_train],[[v,a],[v,b]]]]], + [[n,triangle_train],[[v,a],[v,b]],":-", + [ [[n,link],[[v,a],[v,b]]]]], + [[n,triangle_train],[[v,a],[v,b]],":-", + [ [[n,link],[[v,a],[v,c]]], + [[n,triangle_train],[[v,c],[v,b]]] + ]], + %[[n,link],["Canterbury","Bambury"]], + [[n,link],["Canterbury","Avignon"]], + [[n,link],["Bambury","Canterbury"]], + %[[n,link],["Bambury","Avignon"]], + [[n,link],["Avignon","Bambury"]] + %[[n,link],["Avignon","Canterbury"]] + +],[[]]). + + +test_types_cases(33,[[n,wear],[["hat","head"],[v,c]]], + [ +[[n,wear],[[[t,string],[t,string]],[[t,string],[t,string]]]] +], + [[[n,wear],[input,output]]], + +[ + [[n,wear],[[v,c],[v,c]],":-", + [ + [[n,equals4],[[v,c],["hat","head"]]] + ]] +],[[[[v,c],["hat","head"]]]]). + +% ["Fundamentals of Pedagogy and Pedagogy Indicators","FUNDAMENTALS OF PEDAGOGY by Lucian Green Breathsonings 4 of 4.txt",0,algorithms,"41. I loved planet meditation (books). I did this by holding the reflection (philosophy) retreat. First, I covered texts. Second, I covered retreat details. Third, I gave presents out. 
In this way, I prepared to love planet meditation (books) by holding the reflection (philosophy) retreat."] + +%% travelling 10 space units and 15 time units in the maximum jump of 1 space unit and 1 time unit takes 10 space jumps and 15 time jumps + +test_types_cases(34,[[n,space_time_jump],[[10,15],[v,c]]], + [ +[[n,space_time_jump],[[[t,number],[t,number]],[[t,number],[t,number]]]] +], + [[[n,space_time_jump],[input,output]]], + +[ + [[n,space_time_jump],[[v,c],[v,c]]] +],[[[[v,c],[10,15]]]]). + + +% ["Short Arguments","Medicine - Quantum Box of Circulatory System 1.txt",0,algorithms,"5. *I used cardiovascular activity to maintain circulatory system flow."] + +test_types_cases(35,[[n,circulation1],["heart1","cells"]], + [[[n,circulation1],[[t,string],[t,string]]], + [[n,circulation],[[t,string],[t,string]]], + [[n,link],[[t,string],[t,string]]]], + [[[n,circulation1],[input,input]], + [[n,circulation],[input,output]], + [[n,link],[input,output]]], + +[ + [[n,circulation1],[[v,a],[v,b]],":-", + [ [[n,circulation],[[v,a],[v,b]]]]], + [[n,circulation],[[v,a],[v,b]],":-", + [ [[n,link],[[v,a],[v,b]]]]], + [[n,circulation],[[v,a],[v,b]],":-", + [ [[n,link],[[v,a],[v,c]]], + [[n,circulation],[[v,c],[v,b]]] + ]], + %[[n,link],["Canterbury","Bambury"]], + [[n,link],["heart1","lungs"]], + [[n,link],["lungs","heart2"]], + %[[n,link],["Bambury","Avignon"]], + [[n,link],["heart2","cells"]], + [[n,link],["cells","heart1"]] + +],[[]]). + +% ["Computational English","COMPUTATIONAL ENGLISH by Lucian Green Philosophical Computational English 4 of 4.txt",0,algorithms,"32. *I prepared to buy products that I added value to. I did this by breasoning out 5 As per day for sales. First, I trialled the product. Second, I found a new use for the product. Third I used the product for thus new use. 
In this way, I prepared to buy products that I added value to by breasoning out 5 As per day for sales."] + +test_types_cases(36,[[n,stock_because_buy],["word processor",[v,c]]], + [[[n,stock_because_buy],[[t,string],[t,string]]]], + [[[n,stock_because_buy],[input,output]]], + +[ + [[n,stock_because_buy],[[v,c],[v,c]]] +],[[[[v,c],"word processor"]]]). + +% ["Lecturer","Lecturer - Recordings Pedagogy.txt",0,algorithms,"6. *The necessary amount of work didn't become a headache. I prevented the muscle ache from recordings. I used the quantum box to prevent muscle aches. This included headaches. I prevented body aches becoming headaches."] + +% The number of As of work equals the number of As in medicine + +test_types_cases(37,[[n,medicine_as],[3,[v,medicine_as]]], + [[[n,medicine_as],[[t,number],[t,number]]]], + [[[n,medicine_as],[input,output]]], + +[ + [[n,medicine_as],[[v,as],[v,as]]] +],[[[[v,medicine_as],3]]]). + +% ["Medicine","MEDICINE by Lucian Green Heart 2 of 4.txt",0,algorithms,"17. *I prepared to go running. I did this by flexing the ball of my foot. First, I stood up. Second, I leant against a wall. Third, I performed a calf stretch. In this way, I prepared to go running by flexing the ball of my foot."] + +test_types_cases(38,[[n,run_checklist],["true","true","true",[v,run]]], + [[[n,run_checklist],[[t,string],[t,string],[t,string],[t,string]]]], + [[[n,run_checklist],[input,input,input,output]]], + +[ + [[n,run_checklist],[[v,stretches],[v,water],[v,gear],[v,run]],":-", + [ [[n,=],[[v,stretches],"true"]], + [[n,=],[[v,water],"true"]], + [[n,=],[[v,gear],"true"]], + [[n,=],[[v,run],"true"]] + ]] +],[[[[v,run],"true"]]]). + + +% ["Fundamentals of Pedagogy and Pedagogy Indicators","FUNDAMENTALS OF PEDAGOGY by Lucian Green Time to Prepare 3 of 4.txt",0,algorithms,"27. *The bottler prepared to put a cork in the bottle. He did this by closing the refrigerator door. First, he pushed the door with his hand. Second, he lifted the latch. 
Third, he closed the door. In this way, the bottler prepared to put a cork in the bottle by closing the refrigerator door."] + +% fill or empty a bottle + +test_types_cases(39,[[n,fill_or_empty_bottle],["nothing",[v,a2]]], + [[[n,fill_or_empty_bottle],[[t,string],[t,string]]]], + [[[n,fill_or_empty_bottle],[input,output]]], + +[ + [[n,fill_or_empty_bottle],["nothing","liquid"]], + [[n,fill_or_empty_bottle],["liquid","nothing"]] +],[[[[v,a2],"liquid"]]]). + +% ["Lecturer","Lecturer Culturology.txt",0,algorithms,"3. *Reverse CAW was guessing the input and output. Culturology is good. I applied back-translation to an algorithm. I found that reversing the algorithm resulted in the same result as the original. Reverse interpret was CAW."] + +test_types_cases(40,[[n,guess_io],["+",[v,a2],[v,a3]]], + [ +[[n,guess_io],[[t,string],{[t,item]},{[t,item]}]], +[[t,item],[[t,number]]], +[[t,item],[[t,string]]] +], + [[[n,guess_io],[input,output,output]]], + +[ + [[n,guess_io],["+",[1,1],[2]]], + [[n,guess_io],["-",[1,1],[0]]] +],[[[[v,a2],[1,1]],[[v,a3],[2]]]]). + +% ["Computational English","COMPUTATIONAL ENGLISH by Lucian Green Order in Conglish 2 of 4.txt",0,algorithms,"15. *I prepared to order the Conglish subjects. I did this by observing the marriage. First, I observed the partner place the ring on his or her partners finger. Second, I observed the couple say their vows. Third, I observed the couple sign the wedding register. In this way, I prepared to order the Conglish subjects by observing the marriage."] + +% Order strings by length + +test_types_cases(41,[[n,order_strings],[["***","*","**"],[v,ordered_strings]]], + [ +[[n,order_strings],[{[t,string]},{[t,string]}]] +], + [[[n,order_strings],[input,output]]], + +[ + [[n,order_strings],[[v,strings],[v,ordered_strings]],":-", + [ [[n,sort],[[v,strings],[v,ordered_strings]]]]] +],[[[[v,ordered_strings],["*", "**", "***"]]]]). + +% ["Medicine","MEDICINE by Lucian Green 250 Breasonings 1 of 4.txt",0,algorithms,"5. 
I prepared to listen to the classical music, which had an expanse of 50 As. I did this by listening to classical music. First, I found the record. Second, I played it on the gramophone. Third, I listened to the classical music. In this way, I prepared to listen to the classical music, which had an expanse of 50 As by listening to classical music."] + +test_types_cases(42,[[n,as_expanse],[[1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20,21,22,23,24,25,26,27,28,29,30,31,32,33,34,35,36,37,38,39,40,41,42,43,44,45,46,47,48,49,50]]], +[ +[[n,as_expanse],[{[t,number]}]] +], + [[[n,as_expanse],[input]]], +[ + [[n,as_expanse],[[v,numbers]],":-", + [ [[n,findall],[[v,num],[[[n,member],[[v,num],[v,numbers]]], + [[n,number],[[v,num]]]],[v,num2]]]]] +],[[]]). + +test_types_cases(43,[[n,is_classical],[[2,3,5,6,7,10,11]]], +[ +[[n,is_classical],[[t,numbers]]], +[[n,is_set1],[[t,numbers],[t,numbers]]], +[[t,numbers],[{[t,number]}]] +], + [[[n,is_classical],[input]], + [[n,is_set1],[input,input]]], +[ + [[n,is_classical],[[v,numbers]],":-", + [ %[[n,trace2]], + [[n,is_set1],[[v,numbers],[2,3,5,6,7,10,11]]]]], + + [[n,is_set1],[[v,set1],[v,set2]],":-", + [ [[n,sort],[[v,set1],[v,set3]]], + [[n,sort],[[v,set2],[v,set3]]] + ]] + +],[[]]). + +test_types_cases(44,[[n,find_record],[[[1,"a"],[2,"b"]],1,[v,r]]], +[ +[[n,find_record],[{[[t,number],[t,string]]},[t,number],[t,string]]] +], + [[[n,find_record],[input,input,output]]], +[ + [[n,find_record],[[v,pairs],[v,num],[v,rec]],":-", + [ %[[n,trace2]], + [[n,member],[[[v,num],[v,rec]],[v,pairs]]]]] +] + +,[[[[v,r],"a"]]]). 
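Case 44 above is a plain key lookup: `member/2` finds the pair whose first element equals the given number and returns its record. A minimal Python sketch of the same lookup (illustration only, not repository code; the function name and the None-for-failure convention are mine — `member/2` would simply fail):

```python
# Sketch of what test_types_cases(44) encodes with
# [[n,member],[[[v,num],[v,rec]],[v,pairs]]]: find the record paired
# with a given key, in clause order.
def find_record(pairs, num):
    for key, rec in pairs:
        if key == num:
            return rec
    return None  # member/2 fails; here we signal failure with None


result = find_record([(1, "a"), (2, "b")], 1)
```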
+ + +test_types_cases(45,[[n,play_gramophone],[[1,2,3,4,5],2,[v,p]]], +[ +[[n,play_gramophone],[{[t,number]},[t,number],{[t,number]}]] +], + [[[n,play_gramophone],[input,input,output]]], +[ + [[n,play_gramophone],[[v,tracks],[v,first_track],[v,rest]],":-", + [ %[[n,trace2]], + [[n,equals4],[[[v,a],"|",[v,b]],[v,tracks]]], + [[n,"->"],[[[n,equals4],[[v,a],[v,first_track]]], + [[n,equals4],[[v,rest],[v,tracks]]], + [[n,play_gramophone],[[v,b],[v,first_track],[v,rest]]] + ]] +]]] + +,[[[[v,p],[2,3,4,5]]]]). + +/* + Sales +1. Inner child + +1. The product was spoon-fed to the customer. +*/ + +test_types_cases(46,[[n,spoon_feed],[1]], +[[[n,spoon_feed],[[t,number]]]], + [[[n,spoon_feed],[input]]], +[ + + [[n,spoon_feed],[5]], + [[n,spoon_feed],[[v,n1]],":-", + [ %[[n,trace2]], + [[n,+],[[v,n1],1,[v,n2]]], + [[n,spoon_feed],[[v,n2]]]]] +] +,[[]]). + +% ["Computational English","COMPUTATIONAL ENGLISH by Lucian Green Perspectives 1 of 4.txt",0,algorithms,"1. The first technique can be used to give a perspective on a text. For example, given the reason 'X is younger than Y' the perspective gives the conclusion 'X was likely to have been looked after by Y'."] + +test_types_cases(47,[[n,greater_than],[2,1]], +[[[n,greater_than],[[t,number],[t,number]]]], + [[[n,greater_than],[input,input]]], +[ + + [[n,greater_than],[[v,n1],[v,n2]],":-", + [ %[[n,trace2]], + [[n,>],[[v,n1],[v,n2]]]]] +] +,[[]]). + +% ["Short Arguments","Rebreathsoning.txt",0,algorithms,"2. *I prepared to side with the pole. I did this by observing meantness (sic). First, I found the statement to be unwavering through time. Second, I found it to be unwavering in relation to other statements. 
Third, I found it to be unwavering in relation with other people."] + +test_types_cases(48,[[n,pole],[["a",[1,2,3]],["b",[4,5,6]],3,[v,a_or_b]]], +[ +[[n,pole],[[t,pole],[t,pole],[t,number],[t,string]]], +[[t,pole],[[t,string],{[t,number]}]] +], + [[[n,pole],[input,input,input,output]]], +[ + + [[n,pole],[[v,pole1],[v,pole2],[v,person],[v,pole_name]],":-", + [ %[[n,trace2]], + [[n,equals4],[[v,pole1],[[v,pole_name],[v,list1]]]], + [[n,member],[[v,person],[v,list1]]]]], + + [[n,pole],[[v,pole1],[v,pole2],[v,person],[v,pole_name]],":-", + [ %[[n,trace2]], + [[n,equals4],[[v,pole2],[[v,pole_name],[v,list1]]]], + [[n,member],[[v,person],[v,list1]]]]] + +] +,[[[[v,a_or_b],"a"]]]). + +% I did this by observing meantness (sic). + +test_types_cases(49,[[n,meantness],[[["a","b"],["b","c"]],["a","b","c"]]], +[ +[[n,meantness],[[t,lists_string],[t,list_string]]], +[[t,lists_string],[{[t,list_string]}]], +[[t,list_string],[{[t,string]}]] +], + [[[n,meantness],[input,input]]], +[ + + [[n,meantness],[[v,lists],[]]], + [[n,meantness],[[v,lists],[v,list]],":-", + [ %[[n,trace2]], + [[n,equals4],[[v,list],[[v,head],"|",[v,tail]]]], + [[n,member],[[v,list1],[v,lists]]], + [[n,member],[[v,head],[v,list1]]], + %[[n,delete],[[v,lists],[v,list1],[v,lists2]]], + [[n,meantness],[[v,lists],[v,tail]] + ]]] +] +,[[]]). + +% First, I found the statement to be unwavering through time. + +test_types_cases(50,[[n,unwavering],[[1,1,1,1]]], +[ +[[n,unwavering],[[t,list_number]]], +[[t,list_number],[{[t,number]}]], +[[n,1],[[t,number]]] +], + +[[[n,unwavering],[input]], +[[n,1],[input]]], +[ + + [[n,unwavering],[[]]], + [[n,unwavering],[[v,list]],":-", + [ %[[n,trace2]], + [[n,equals4],[[v,list],[[v,head],"|",[v,tail]]]], + [[n,1],[[v,head]]], + [[n,unwavering],[[v,tail]]]]], + + [[n,1],[1]] +] +,[[]]). + +% Second, I found it to be unwavering in relation to other statements. 
+ +test_types_cases(51,[[n,unwavering_list],[[[1,1,1,1],[1,1,1,1],[1,1,1,1]]]], +[ +[[n,unwavering_list],[{[t,list_number]}]], +[[n,unwavering],[[t,list_number]]], +[[t,list_number],[{[t,number]}]], +[[n,1],[[t,number]]] +], + +[[[n,unwavering_list],[input]], +[[n,unwavering],[input]], +[[n,1],[input]]], +[ + + [[n,unwavering_list],[[]]], + [[n,unwavering_list],[[v,list]],":-", + [ %[[n,trace2]], + [[n,equals4],[[v,list],[[v,head],"|",[v,tail]]]], + [[n,unwavering],[[v,head]]], + [[n,unwavering_list],[[v,tail]]]]], + + [[n,unwavering],[[]]], + [[n,unwavering],[[v,list]],":-", + [ %[[n,trace2]], + [[n,equals4],[[v,list],[[v,head],"|",[v,tail]]]], + [[n,1],[[v,head]]], + [[n,unwavering],[[v,tail]]]]], + + [[n,1],[1]] +] +,[[]]). + + +% Third, I found it to be unwavering in relation with other people. + +test_types_cases(52,[[n,unwavering_people],%[[[[1],[1]],[[1],[1]]]]], +[[[[[1,1,1,1],[1,1,1,1],[1,1,1,1]],[[1,1,1,1],[1,1,1,1],[1,1,1,1]]],[[[1,1,1,1],[1,1,1,1],[1,1,1,1]],[[1,1,1,1],[1,1,1,1],[1,1,1,1]]]]]], +[ +[[n,unwavering_people],[[t,unwavering_people1]]], +[[t,unwavering_people1],[[t,number]]], +[[t,unwavering_people1],[{[t,unwavering_people1]}]], +[[n,unwavering],[{[t,number]}]], +[[n,1],[[t,number]]] +], + +[[[n,unwavering_people],[input]], +[[n,unwavering],[input]], +[[n,1],[input]]], +[ + + [[n,unwavering_people],[[]]], + + [[n,unwavering_people],[[v,list]],":-", + [ %[[n,trace2]], + [[n,equals4],[[v,list],[[v,head],"|",[v,tail]]]], + [[n,unwavering],[[v,head]]], + [[n,unwavering_people],[[v,tail]]]]], + + [[n,unwavering_people],[[v,list]],":-", + [ %[[n,trace2]], + [[n,equals4],[[v,list],[[v,head],"|",[v,tail]]]], + [[n,not],[[[n,unwavering],[[v,head]]]]], + [[n,unwavering_people],[[v,head]]], + [[n,unwavering_people],[[v,tail]]]]], + + [[n,unwavering],[[]]], + + [[n,unwavering],[[v,list]],":-", + [ %[[n,trace2]], + [[n,equals4],[[v,list],[[v,head],"|",[v,tail]]]], + [[n,1],[[v,head]]], + [[n,unwavering],[[v,tail]]] + %[[n,cut]] + ]], + + [[n,1],[1]] +] 
+,[[]]). + +% ["Fundamentals of Meditation and Meditation Indicators","FUNDAMENTALS OF MEDITATION by Lucian Green Heads of State 4 of 4.txt",0,algorithms,"39. *I prepared to enjoy subsidised accreditation. I did this by agreeing with the government. First, I read the government policy. Second, I verified that it was a good idea. Third, I agreed with it. In this way, I prepared to enjoy subsidised accreditation by agreeing with the government."] + +test_types_cases(53,[[n,subsidised_accreditation],[1,30]], +[[[n,subsidised_accreditation],[[t,number],[t,number]]]], + [[[n,subsidised_accreditation],[input,input]]], +[ + + [[n,subsidised_accreditation],[1,1]], + [[n,subsidised_accreditation],[1,30]], + [[n,subsidised_accreditation],[1,100]], + [[n,subsidised_accreditation],[1,400]] +] +,[[]]). + +% I did this by agreeing with the government. + + +test_types_cases(54,[[n,agree_with_government],[1,["a"],[v,end]]], +[ +[[n,agree_with_government],[[t,number],{[t,string]},[[t,number],[t,string]]]] +], + [[[n,agree_with_government],[input,input,output]]], +[ + + [[n,agree_with_government],[[v,policy],[v,part],[v,end]],":-", + [ + [[n,wrap],[[v,policy],[v,policy1]]], + [[n,append],[[v,policy1],[v,part],[v,end]]] + ] + ] +] +,[[[[v,end],[1,"a"]]]]). + +% First, I read the government policy. + +test_types_cases(55,[[n,read_policy],[[n,+],1,1,[v,a]]], +[[[n,read_policy],[[t,predicatename],[t,number],[t,number],[t,number]]]], + [[[n,read_policy],[input,input,input,output]]], +[ + + [[n,read_policy],[[v,pred_name],[v,var1],[v,var2],[v,var3]],":-", + [ + %[[n,trace2]], + [[v,pred_name],[[v,var1],[v,var2],[v,var3]]] + ]] +] +,[[[[v,a],2]]]). + +% ["Short Arguments","Two_Types.txt",0,algorithms,"1. I prepared to sell the artificial breasts. I did this by stating that the breast cooled and stored the milk. First, I read the temperature of the breast. Second, I read how much milk it stored. 
Third, I simulated these."] + +test_types_cases(56,[[n,amount_earned],[2,2,[v,a]]], +[[[n,amount_earned],[[t,number],[t,number],[t,number]]]], + [[[n,amount_earned],[input,input,output]]], +[ + + [[n,amount_earned],[[v,n1],[v,n2],[v,n3]],":-", + [ %[[n,trace2]], + [[n,*],[[v,n1],[v,n2],[v,n3]]]]] +] +,[[[[v,a],4]]]). + +% I did this by stating that the breast *cooled and stored the milk. + +test_types_cases(57,[[n,cool],[[-4.1,-4,-4.2],-5,-4]], +[ +[[n,cool],[{[t,number]},[t,number],[t,number]]] +], + [[[n,cool],[input,input,input]]], +[ + + [[n,cool],[[],[v,n2],[v,n3]]], + [[n,cool],[[v,list],[v,n2],[v,n3]],":-", + [ %[[n,trace2]], + [[n,equals4],[[v,list],[[v,head],"|",[v,tail]]]], + [[n,>=],[[v,head],[v,n2]]], + [[n,>=],[[v,n3],[v,head]]], + [[n,cool],[[v,tail],[v,n2],[v,n3]]]]] +] +,[[]]). + +% I did this by stating that the breast cooled and *stored the milk. + +test_types_cases(58,[[n,store],[["milk"],"milk"]], +[ +[[n,store],[{[t,string]},[t,string]]] +], + [[[n,store],[input,input]]], +[ + + [[n,store],[[[v,a]],[v,a]]] +] +,[[]]). + + +% ["Fundamentals of Meditation and Meditation Indicators","FUNDAMENTALS OF MEDITATION by Lucian Green Appearances 4 of 4.txt",0,algorithms,"31. I prepared to enter the room in the heartland. I did this by writing the Room Essay Press Release. First, I wrote that 250 breasonings expanded to 50 As. Second, I wrote that a breasoned out pop song expanded to 50 As. Third, I wrote the classical music composition contained 5 pop songs. In this way, I prepared to enter the room in the heartland by writing the Room Essay Press Release."] + +% * I prepared to enter the room in the heartland. 
+ +% Follows a path with directions along the paths + +test_types_cases(59,[[n,tour_heartland],[[1,2]]], + + [ +[[n,tour_heartland],[{[t,number]}]], +[[n,tour_heartland1],[{{[t,number]}},{[t,number]}]], +[[n,heartland],[{{[t,number]}}]] +], + + [[[n,tour_heartland],[input]], + [[n,tour_heartland1],[input,input]], + [[n,heartland],[output]]], +[ + + [[n,tour_heartland],[[v,path]],":-", + [ + [[n,heartland],[[v,heartland]]], + [[n,tour_heartland1],[[v,heartland],[v,path]]] + ]], + + [[n,tour_heartland1],[[v,heartland],[]]], + [[n,tour_heartland1],[[v,heartland],[[v,single_step]]]], + [[n,tour_heartland1],[[v,heartland],[v,path]],":-", + [ + [[n,equals4],[[v,path],[[v,curr_step],[v,next_step],"|",[v,rest]]]], + [[n,member],[[v,step],[v,heartland]]], + [[n,equals4],[[v,step],[[v,curr_step],"|",[v,links_to]]]], + [[n,member],[[v,next_step],[v,links_to]]], + [[n,equals4],[[v,next],[[v,next_step],"|",[v,rest]]]], + [[n,tour_heartland1],[[v,heartland],[v,next]]] + ]], + + [[n,heartland],[[[1, % node number + 2,3 % links to + ],[2,3],[3]]]] + +] +,[[]]). + +% * I did this by writing the Room Essay Press Release. + +test_types_cases(60,[[n,find_in_room],["newspaper",[v,x],[v,y]]], + + [ +[[n,find_in_room],[[t,string],[t,number],[t,number]]], +[[n,room],[{{[t,number],[t,number],[t,string]}}]] +], + + [[[n,find_in_room],[input,output,output]], + [[n,room],[output]]], + +[ + [[n,find_in_room],[[v,string],[v,x],[v,y]],":-", + [ + [[n,room],[[v,room]]], + [[n,member],[[[v,x],[v,y],[v,string]],[v,room]]] + ]], + + [[n,room],[ + [ + [1,3,""],[2,3,"newspaper"],[3,3,""], + [1,2,""],[2,2,""],[3,2,""], + [1,1,"door"],[2,1,""],[3,1,""] + ] + ]] + +] +,[[[[v,x],2],[[v,y],3]]]). + + +% * First, I wrote that 250 breasonings expanded to 50 As. 
+ +test_types_cases(61,[[n,return],[250,4000,[v,return]]], + + [[[n,return],[[t,number],[t,number],[t,number]]]], + + [[[n,return],[input,input,output]]], + +[ + [[n,return],[[v,take],[v,give],[v,return]],":-", + [ + [[n,/],[[v,give],[v,take],[v,return]]] + ]] +] +,[[[[v,return],16]]]). + +% ["Mind Reading","Mr Cryptography 3.txt",0,algorithms,"51. The robot was classed disabled (rather, superabled) in human terms, so was modified to be human-like when interpreting messages following decryption."] + +test_types_cases(62,[[n,text2b_as_per_business_hour],[[v,brs]]], + + [[[n,text2b_as_per_business_hour],[[t,number]]]], + + [[[n,text2b_as_per_business_hour],[output]]], + +[ + [[n,text2b_as_per_business_hour],[[v,brs]],":-", + [ + [[n,*],[80,% breasonings per A + 100, % As per week allowed in Text to Breasonings + [v,a1]]], % br per week + + [[n,/],[[v,a1],% br per week + 7, % days + [v,a2]]], % br per day + + [[n,/],[[v,a2],% br per day + 8, % business hours per day + [v,a3]]], % breasonings per hour + + [[n,ceiling],[[v,a3],[v,brs]]] + ]] +] +,[[[[v,brs],143]]]). + +% ["Short Arguments","God Algorithm.txt",0,algorithms,"5. I prepared to ask why why was. I did this by stating that I knew the result of the God algorithm was why. First, I noticed the occurrence of the event. Second, I read the result of the God algorithm. Third, I worked out that the result explained the occurrence."] + +% * I did this by stating that I knew the result of the God algorithm was why. + +test_types_cases(63,[[n,each_topic],[[v,return]]], + + [[[n,each_topic],[[t,number]]]], + + [[[n,each_topic],[output]]], + +[ + [[n,each_topic],[[v,return]],":-", + [ + [[n,*],[4,4000,[v,return]]] + ]] +] +,[[[[v,return],16000]]]). + +% ["Medicine","MEDICINE by Lucian Green Head of State Head Ache Prevention 4 of 4.txt",0,algorithms,"32. I prepared to observe the people like my lecturer friend and meditation student was a doctor and friend. I did this by observing the hansard. First, I found the hansard. 
Second, I observed him listen to the politician. Third, I observed him take notes. In this way, I prepared to observe the people like my lecturer friend and meditation student was a doctor and friend by observing the hansard."] + +% * I prepared to observe the people like my lecturer friend and meditation student was a doctor and friend. + +test_types_cases(64,[[n,characters],["Tom",[v,characters]]], + + [ +[[n,characters],[[t,string],{[t,string]}]], +[[n,c],[[t,string],[t,string]]] +], + + [[[n,characters],[input,output]], + [[n,c],[input,output]]], + +[ + [[n,characters],[[v,person],[v,characters]],":-", + [ + [[n,findall],[[v,c],[[[n,c],[[v,person],[v,c]]]],[v,characters]]] + ]], + + [[n,c],["Tom","Chef"]], + [[n,c],["Tom","Baker"]], + [[n,c],["Tom","Writer"]], + [[n,c],["John","Writer"]] + +] +,[[[[v,characters],["Chef","Baker","Writer"]]]]). + + +test_types_cases(65, +[[n,characters],[["a",[["a"],"a"]]]], +%[[n,characters],[[["a"],"a"]]], + [ +[[n,characters],[[[t,string],[[[t,string]],[t,string]]]]] +], + %[[[n,characters],[[[t,brackets],[[[t,brackets],[[t,string]]],[t,string]]]]]], + + [[[n,characters],[input]]], + +[ + [[n,characters],[[v,person]]] + +] +,[[]]). + +test_types_cases(66, +[[n,characters],[["a",[["a"],"a"]]]], +%[[n,characters],[[["a"],"a"]]], + [ +[[n,characters],[{[t,string],{{[t,string]},[t,string]}}]] +], + %[[[n,characters],[[[t,brackets],[[[t,brackets],[[t,string]]],[t,string]]]]]], + + [[[n,characters],[input]]], + +[ + [[n,characters],[[v,person]]] + +] +,[[]]). + + +test_types_cases(67, +[[n,characters],[[["a",["a"]],"a"]]], +%[[n,characters],[[["a",["a"]]]]], + [ +[[n,characters],[[[[t,string],[[t,string]]],[t,string]]]] +], + %[[[n,characters],[[[t,brackets],[[[t,brackets],[[t,string],[[t,brackets],[[t,string]]]]]]]]]], + + [[[n,characters],[input]]], + +[ + [[n,characters],[[v,person]]] + +] +,[[]]). 
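Case 64 above collects every character associated with a person by running `findall` over the `c/2` facts in clause order. A rough Python equivalent (the `facts` list mirrors the clauses shown; the function name is an illustrative assumption, not repository code):

```python
# Sketch of test_types_cases(64): findall of all c such that c(person, c)
# holds, preserving clause order.
facts = [("Tom", "Chef"), ("Tom", "Baker"), ("Tom", "Writer"),
         ("John", "Writer")]

def characters(person):
    return [c for p, c in facts if p == person]
```

As in the expected result for case 64, `characters("Tom")` yields the three characters in clause order.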
+ + +test_types_cases(68, +[[n,characters],[[["a",["a"]],"a"]]], +%[[n,characters],[[["a",["a"]]]]], + [ +[[n,characters],[{{[t,string],{[t,string]}},[t,string]}]] +], + %[[[n,characters],[[[t,brackets],[[[t,brackets],[[t,string],[[t,brackets],[[t,string]]]]]]]]]], + + [[[n,characters],[input]]], + +[ + [[n,characters],[[v,person]]] + +] +,[[]]). + +test_types_cases(69, +[[n,characters],[[a]]], +%[[n,characters],[[["a"],"a"]]], + [ +[[n,characters],[{[t,atom]}]] +], + %[[[n,characters],[[[t,brackets],[[[t,brackets],[[t,string]]],[t,string]]]]]], + + [[[n,characters],[input]]], + +[ + [[n,characters],[[v,person]]] + +] +,[[]]). + +test_types_cases(70, +[[n,characters],[[a],b]], +%[[n,characters],[[["a"],"a"]]], + [ +[[n,characters],[{[t,atom]},[t,atom]]] +], + %[[[n,characters],[[[t,brackets],[[[t,brackets],[[t,string]]],[t,string]]]]]], + + [[[n,characters],[input,input]]], + +[ + [[n,characters],[[v,person],[v,p2]]] + +] +,[[]]). + +test_types_cases(71, +[[n,numbers],[[1,2,3]]], +[ +[[n,numbers],[[t,number1]]], +[[t,number1],[[]]], +[[t,number1],[[[t,number],"|",[t,number1]]]] +], +[[[n,numbers],[input]]], +[ + [[n,numbers],[[v,_]]] + +] +,[[]]). + +test_types_cases(72, +[[n,string_numbers],[["a",2,3]]], +[ +[[n,string_numbers],[[t,string_numbers1]]], +[[t,string_numbers1],[[[t,string],"|",{[t,number]}]]] +], +[[[n,string_numbers],[input]]], +[ + [[n,string_numbers],[[v,_]]] + +] +,[[]]). + +test_types_cases(73, +[[n,atom_any],[["a",2,"b",1]]], +[ +[[n,atom_any],[[t,atom_any1]]], +[[t,atom_any1],[{[t,string],[t,any]}]] +], +[[[n,atom_any],[input]]], +[ + [[n,atom_any],[[v,_]]] + +] +,[[]]). + + +test_types_cases(74, +[[n,atom_any],[["a",2,3,"b",1,1]]], +[ +[[n,atom_any],[[t,atom_any1]]], +[[t,atom_any1],[{[t,string],[t,number],[t,number]}]] +%[[t,any1],[[t,number]]], +%[[t,any1],[[t,number],[t,number]]] +], +[[[n,atom_any],[input]]], +[ + [[n,atom_any],[[v,_]]] + +] +,[[]]). 
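Cases 71–74 above exercise recursive type definitions, e.g. case 71's `number1`, which is either `[]` or a number followed by another `number1`. A simplified Python sketch of checking a value against just that one type (my own simplification — the interpreter's checker handles arbitrary user-defined types, not only this list shape):

```python
# Sketch of the recursive type in test_types_cases(71):
#   number1 ::= []  |  [[t,number],"|",[t,number1]]
def is_number_list(xs):
    if xs == []:
        return True                 # [[t,number1],[[]]]
    head, *tail = xs
    # head must be a number, tail must again satisfy number1
    return isinstance(head, (int, float)) and is_number_list(tail)
```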
+ + +test_types_cases(75, +[[n,list1],[[4,b,"c","c"]]], +%[[n,list1],[[4,b,"c",b,"c",5,d,"e",d,"e"]]], +[ +[[n,list1],[[t,1]]], +[[t,1],[[]]], +[[t,1],[[[t,number],"|",[t,2]],"|",[t,1]]], +[[t,2],[[]]], +[[t,2],[[[t,atom],"|",[t,3]],"|",[t,2]]], +[[t,3],[[]]], +[[t,3],[[t,string],"|",[t,3]]]], + +[[[n,list1],[input]]], + +[[[n,list1],[[v,_]]]] + +,[[]]). diff --git a/listprologinterpreter/lpiverify_pl.pl b/listprologinterpreter/lpiverify_pl.pl new file mode 100644 index 0000000000000000000000000000000000000000..b93853e1c9afaa5aa5afd96c6deaf7c03a98cbc3 --- /dev/null +++ b/listprologinterpreter/lpiverify_pl.pl @@ -0,0 +1,51 @@ +%% test_pl(Debug[on/off],Total,Score). + +%%:- use_module(library(time)). + +%% test_pl cases, Debug=trace=on or off, NTotal=output=total cases, Score=output=result + +test_pl(Debug,NTotal,Score) :- test_pl(Debug,0,NTotal,0,Score),!. +test_pl(_Debug,NTotal,NTotal,Score,Score) :- NTotal=1, !. +test_pl(Debug,NTotal1,NTotal2,Score1,Score2) :- + NTotal3 is NTotal1+1, + test_pl(NTotal3,Query,Functions,Result), + p2lpconverter_command([string,Query],Query1), + p2lpconverter([string,Functions],Functions1), + p2lpconverter_term([string,Result],Result2), + ((international_interpret([lang,"en"],Debug,Query1,Functions1,Result1), + %writeln1([result1,Result1]), + Result2=Result1 + )->(Score3 is Score1+1,writeln0([test_pl,NTotal3,passed]));(Score3=Score1,writeln0([test_pl,NTotal3,failed]))), + writeln0(""), + test_pl(Debug,NTotal3,NTotal2,Score3,Score2),!. 
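`test_pl/5` above threads a case counter and a score through the recursion, printing `passed` or `failed` per case. A hypothetical Python sketch of that scoring loop (the function name and the `(expected, actual)` pairing are assumptions for illustration; the real predicate obtains `actual` by running the interpreter on each case):

```python
# Sketch of the test_pl/5 accumulator pattern: number the cases from 1,
# count a pass when the expected result equals the actual result, and
# return (total, score) like NTotal and Score.
def run_cases(cases):
    total, score = 0, 0
    for n, (expected, actual) in enumerate(cases, start=1):
        total = n
        if expected == actual:
            score += 1
            print(["test_pl", n, "passed"])
        else:
            print(["test_pl", n, "failed"])
    return total, score
```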
+ +%% test_pl individual cases, Debug=trace=on or off, N=case number, Passed=output=result + +test_pl1(Debug,N,Passed) :- + test_pl(N,Query,Functions,Result), + p2lpconverter_command([string,Query],Query1), + p2lpconverter([string,Functions],Functions1), +%trace, + p2lpconverter_term([string,Result],Result2), + ((international_interpret([lang,"en"],Debug,Query1,Functions1,Result1), + %writeln1([result1,Result1]), + Result2=Result1 + )->(Passed=passed,writeln0([test_pl,N,passed]));(Passed=failed,writeln0([test_pl,N,failed]))),!. + +p2lpconverter_command([Type,In],Out) :- + string_concat("a:-",In,In1), + p2lpconverter([Type,In1],Out1), + Out1=[[[n,a],":-",Out]],!. +p2lpconverter_term([Type,In],Out) :- + foldr(string_concat,["a:-",In,"."],In1), + p2lpconverter([Type,In1],Out1), + Out1=[[[n,a],":-",Out]],!. + + +test_pl(1,"string_chars(\"aabb\",A),findall(A1,(member(A2,A),atom_string(A2,A1)),A3),c(A3,A4).", +"c-->[]. +c-->[\"a\"],d,c. +d-->[]. +d-->[\"b\"],d.", + +"[[A,[a,a,b,b]],[A3,[\"a\",\"a\",\"b\",\"b\"]],[A4,[\"a\",\"a\",\"b\",\"b\"]]]"). diff --git a/listprologinterpreter/match_get_put_vals.pl b/listprologinterpreter/match_get_put_vals.pl new file mode 100644 index 0000000000000000000000000000000000000000..848a7a3adca0981540ec5f7f589811750b259243 --- /dev/null +++ b/listprologinterpreter/match_get_put_vals.pl @@ -0,0 +1,238 @@ +% test1(off,1,R). + + +match4_new_22(Variable1,Variable2,Vars1,Vars2%,Standard_or_e4 +) :- + +%writeln(match4_new_22(Variable1,Variable2,Vars1,Vars2)), +%trace, + (match4_new_220([Variable1],[Variable2],Vars1,Vars2%,Standard_or_e4 + )->true; + match4_new_220([Variable2],[Variable1],Vars1,Vars2%,Standard_or_e4 + )),!. 
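The verify case `test_pl(1,...)` above parses `"aabb"` with the grammar `c-->[]. c-->["a"],d,c. d-->[]. d-->["b"],d.` A sketch of that grammar as backtracking recursive descent in Python (illustrative only; encoding each nonterminal as a generator of reachable positions is my choice, not how the interpreter executes DCGs):

```python
# Each nonterminal yields every input position it can reach from i,
# mirroring DCG backtracking over the character list.
def d(s, i):
    # d --> [].      d --> ["b"], d.
    yield i
    if i < len(s) and s[i] == "b":
        yield from d(s, i + 1)

def c(s, i):
    # c --> [].      c --> ["a"], d, c.
    yield i
    if i < len(s) and s[i] == "a":
        for j in d(s, i + 1):
            yield from c(s, j)

def accepts(s):
    # the string is in the language if some parse consumes all of it
    return any(i == len(s) for i in c(s, 0))
```

For `"aabb"`, one successful derivation is `c -> "a" d c`, with the inner `c -> "a" d c` consuming the second `"a"` and `d` consuming both `"b"`s.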
+ +match4_new_220(Variable1,Variable2,Vars1,Vars6%,Standard_or_e4 +) :- +%trace, + match4_new_222(Variable1,Variable2,Vars1,Vars2%,Standard_or_e4 + ), + + ((subtract(Vars1,Vars2,[]), + subtract(Vars2,Vars1,[]))->Vars6=Vars2; + + ( + get_lang_word("v",Dbw_v1),Dbw_v1=Dbw_v, + + find_sys(Sys_name1), + match4_new_222(Variable1,[Dbw_v,Sys_name1],Vars2,Vars3%,Standard_or_e4 + ), + getvalue([Dbw_v,Sys_name1],Value3,Vars3), + + match4_new_222(Value3,Variable2,Vars2,Vars4%,Standard_or_e4 + ), + + find_sys(Sys_name2), + match4_new_222(Variable2,[Dbw_v,Sys_name2],Vars4,Vars5%,Standard_or_e4 + ), + getvalue([Dbw_v,Sys_name2],Value31,Vars5), + + match4_new_222(Variable1,Value31,Vars4,Vars7%,Standard_or_e4 + ), + + match4_new_220(Variable1,Variable2,Vars7,Vars6%,Standard_or_e4 + )))%,! + . + + +match4_new_222(Variable1,Variable2,Vars1,Vars2%,Standard_or_e4 +) :- +%trace, +occurs_check(Variable1,Variable2), +%notrace, + match4_new([Variable1],[Variable2],Vars1,Vars3%,Standard_or_e4 + ), + findall([V,Val2],(member([V,Val1],Vars3), + simplify(Val1,Val2)),Vars2), + !. + + +match4_new(S1,S2,V1,V2%,_Standard_or_e4 +) :- + match_get_vals(S1,[],S3,V1), + simplify(S3,S5), + match_get_vals(S2,[],S4,V1), + simplify(S4,S6), + match_put_vals(S5,S6,V1,V2%,_Standard_or_e4 + ),!. + + +match_get_vals([],S1,S1,_) :- !. + +match_get_vals(Statement,S1,S2,Vars) :- + +%variable_name(Statement)-> +%match_get_val(Statement,Value,Vars), +%append(S1,[Value],S2)); + (Statement=[Statement1|Statement2], + (variable_name(Statement1)-> +(match_get_val(Statement1,Value1,Vars), +append(S1,[Value1],S3)); + (single_item_or_var(Statement1)-> + (Value1=Statement1, +append(S1,[Value1],S3)); + (match_get_vals(Statement1,[],S31,Vars), + S3=[S31]))), + match_get_vals(Statement2,[],S4,Vars)), + foldr(append,[S3,S4],S5), + %trace, + %S6=[S5], + %(S1=[]->S2=S5; + foldr(append,[S1,S5],S2),!. + +match_get_val([_,'_'],undef,_Vars) :- !. 
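`match_get_vals/4` above walks a statement, substituting each bound variable's value and leaving single items and unbound variables alone. A stripped-down Python analogue (assumes variables are plain dictionary keys, which is a simplification of the `[v,Name]` representation used in the diff):

```python
# Sketch of the match_get_vals idea: recursively copy a nested term,
# replacing any element that has a binding in vars with its value.
def get_vals(term, vars):
    if isinstance(term, list):
        return [get_vals(t, vars) for t in term]
    return vars.get(term, term)  # unbound names pass through unchanged
```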
+ +match_get_val(Variable,Value,Vars) :- + (member([Variable,Value],Vars)->true; + Variable=Value),!. + + + + + + +match_put_vals([],[],Vars,Vars%,_Standard_or_e4 +) :- %trace, +!. + +match_put_vals([],Variables2,Vars1,Vars2%,Standard_or_e4 +) :- + Variables2=[Statement1b|Statement2b], + (Statement1b="|"-> +match_put_vals([[]],Statement2b,Vars1,Vars2%,Standard_or_e4 +)),!. + +match_put_vals(Variables1,[],Vars1,Vars2%,Standard_or_e4 +) :- + Variables1=[Statement1a|Statement2a], + (Statement1a="|"-> +match_put_vals(Statement2a,[[]],Vars1,Vars2%,Standard_or_e4 +)),!. + +match_put_vals(Variables1,Variables2,Vars1,Vars2%,Standard_or_e4 +) :- + +%(Variables1=Variables2->fail; + +Variables1=[Statement1a|Statement2a], +Variables2=[Statement1b|Statement2b], + %(Statement1a=Statement1b->fail; + +((Statement1a="|",Statement1b="|") -> +match_put_vals([Statement2a],[Statement2b],Vars1,Vars2%,Standard_or_e4 +); +(Statement1a="|"-> +match_put_vals(Statement2a,[Variables2],Vars1,Vars2%,Standard_or_e4 +); +(Statement1b="|"-> +match_put_vals([Variables1],Statement2b,Vars1,Vars2%,Standard_or_e4 +); + +% v=_1 if undef +% vsys x _ + +% v=[multi items] - insert vals in multi items - |, simplify, what if v is defined & in following +% v=v +% v=item +% item=item + +% +% +(%Vars1=Vars3, +( +((Statement1a=undef->true;Statement1b=undef)-> +Vars1=Vars3; + +(((single_item_or_var(Statement1a)%,Vars1=Vars2 +)->true; +(single_item_or_var(Statement1b)%,Vars1=Vars2 +))-> + match4_new1(Statement1a,Statement1b,Vars1,Vars3%,Standard_or_e4 + ) + ; +( +%contains_var1(Statement1a,Statement1b))->true; + +%((variable_name(Statement1b), +%not(variable_name(Statement1a)), +%contains_var1(Statement1b,Statement1a))->true; + + + match_put_vals(Statement1a,Statement1b,Vars1,Vars3%,Standard_or_e4 + )))) + ), + %Vars3=Vars4, + match_put_vals(Statement2a,Statement2b,Vars3,Vars2%,Standard_or_e4 + )) + ))),!. 
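`match_put_vals/4` above gives `"|"` special treatment so that `[Head,"|",Tail]` patterns can match partial lists, as in the `equals4` calls throughout the test cases. A minimal Python sketch of just the head/tail split (a hypothetical helper; the real predicate also unifies both sides and recurses into subterms):

```python
# Sketch of the [H1,...,Hn,"|",Tail] convention: everything before "|"
# is a fixed prefix, the single element after it is the rest of the list.
def split_bar(pattern):
    if "|" in pattern:
        i = pattern.index("|")
        return pattern[:i], pattern[i + 1]
    return pattern, []
```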
+ + + + + +match4_new1(Statement1,Statement2,Vars1,Vars2%,_Standard_or_e4 +) :- + %get_lang_word("v",Dbw_v), %(variable_name(Statement1)->(find_sys(Name1),Statement1a=[Dbw_v,Name1]);%Statement1a=Statement1), + + variable_name(Statement1), + variable_name(Statement2), + + %(Standard_or_e4=standard-> + %putvalue(Statement1,empty,Vars1,Vars2); + + %getvalue_new2(Statement1,Value1,Vars1,Standard_or_e4), + %getvalue_new2(Statement2,Value2,Vars1,Standard_or_e4), + (Statement1=Statement2->Vars1=Vars2; + putvalue(Statement2,Statement1,Vars1,Vars2)),!. + +match4_new1(Statement1,Statement2,Vars1,Vars2%,_Standard_or_e4 +) :- + not(variable_name(Statement1)), + variable_name(Statement2), + %getvalue_new(Statement1,Value1,Vars1), + %getvalue_new2(Statement2,Value2,Vars1,Standard_or_e4), + + putvalue(Statement2,Statement1,Vars1,Vars2). + %match4_new2(Statement1,Value2,Vars1,Vars2,Standard_or_e4). + +match4_new1(Statement1,Statement2,Vars1,Vars2%,_Standard_or_e4 +) :- + variable_name(Statement1), + not(variable_name(Statement2)), + %getvalue_new2(Statement1,Value1,Vars1,Standard_or_e4), + %getvalue_new(Statement2,Value2,Vars1), + + putvalue(Statement1,Statement2,Vars1,Vars2). + %match4_new2(Value1,Statement2,Vars1,Vars2,Standard_or_e4). + +match4_new1(Statement1,Statement2,Vars1,Vars1%,_Standard_or_e4 +) :- + not(variable_name(Statement1)), + not(variable_name(Statement2)), + %getvalue_new(Statement1,Value1,Vars1), + %getvalue_new(Statement2,Value2,Vars1), + +Statement1=Statement2. %match4_new2(Statement1,Statement2,Vars1,Vars2,Standard_or_e4). + + +% A=B (B=B) + +/* +getvalue_new2(Variable,Value,Vars,_) :- + (not(variable_name(Variable))->Variable=Value; + ((member([Variable,Value],Vars), + not(variable_name(Value)))->true;% is this required in getvalue_new1 + (%not(member([Variable,_Value],Vars)), + (%Standard_or_e4=standard-> + %Value=empty; + Variable=Value)))),!. 
+*/ \ No newline at end of file diff --git a/listprologinterpreter/operators.pl b/listprologinterpreter/operators.pl new file mode 100644 index 0000000000000000000000000000000000000000..14279025e80f69872d406c834249762e15f4728f --- /dev/null +++ b/listprologinterpreter/operators.pl @@ -0,0 +1,13 @@ +%% operators.pl + +operator(+) :- !. +operator(-) :- !. +operator(*) :- !. +operator(/) :- !. + +comparisonoperator(>). +comparisonoperator(>=). +comparisonoperator(<). +comparisonoperator(=<). +%%comparisonoperator(=). +comparisonoperator(=\=). diff --git a/listprologinterpreter/preds_converters_and_matrices.pl b/listprologinterpreter/preds_converters_and_matrices.pl new file mode 100644 index 0000000000000000000000000000000000000000..06c1744cfea38307593cec1aec64b61d90b3a8b6 --- /dev/null +++ b/listprologinterpreter/preds_converters_and_matrices.pl @@ -0,0 +1,30 @@ +convert_to_lp_pipe(empty2,empty2) :- !. +convert_to_lp_pipe([Dbw_v,A],[Dbw_v,A]) :- + get_lang_word("v",Dbw_v), + !. +convert_to_lp_pipe(Value1A,Value1A) :- + foldr(append,[Value1A],[],_). + +convert_to_lp_pipe(Value1A,Value1A1) :- + command_n_sols(N), + numbers(N,1,[],N1), + member(N2,N1), + length(L,N2), + append(L,A,Value1A), + %length(A,1), + %foldr(append,[A],[],_), + %var(A), + A=empty2, + foldr(append,[L,["|"],[A]],[],Value1A1). + +matrix([iii, +iio, +ioi, +ioo, +oii, +oio, +ooi, +ooo]). + +matrix_member([ii,oi,io,oo]). 
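`matrix/1` above enumerates the eight input/output mode triples. They are exactly {i,o}^3 in lexicographic order, which a short Python check makes explicit (illustration only, not repository code):

```python
# The eight mode triples iii..ooo in matrix/1 are the cartesian product
# of {i, o} with itself three times, in product order.
from itertools import product

modes = ["".join(p) for p in product("io", repeat=3)]
```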
+ diff --git a/listprologinterpreter/program.c b/listprologinterpreter/program.c new file mode 100644 index 0000000000000000000000000000000000000000..d76c027f1c886f2e9504dde5ae14f0d24282d42b --- /dev/null +++ b/listprologinterpreter/program.c @@ -0,0 +1,12 @@ +#include <stdio.h> +#include <stdlib.h> + +int main(void) +{ + char str1[20]; + + scanf("%19s", str1); + + printf("%s\n", str1); + return 0; +} \ No newline at end of file diff --git a/listprologinterpreter/program.txt b/listprologinterpreter/program.txt new file mode 100644 index 0000000000000000000000000000000000000000..d76c027f1c886f2e9504dde5ae14f0d24282d42b --- /dev/null +++ b/listprologinterpreter/program.txt @@ -0,0 +1,12 @@ +#include <stdio.h> +#include <stdlib.h> + +int main(void) +{ + char str1[20]; + + scanf("%19s", str1); + + printf("%s\n", str1); + return 0; +} \ No newline at end of file diff --git a/listprologinterpreter/replace_in_term.pl b/listprologinterpreter/replace_in_term.pl new file mode 100644 index 0000000000000000000000000000000000000000..d29b90057da2262bb9edaad1a6e579c5f120c7b3 --- /dev/null +++ b/listprologinterpreter/replace_in_term.pl @@ -0,0 +1,28 @@ +%/* +match1(A,B) :- + (var(A)->A1=empty1;A1=A), + (var(B)->B1=empty1;B1=B), + A1=B1,!. + +replace_in_term(A,B,_,A) :- (not(var(A))->A=[];not(var(B))),!. + +replace_in_term(Statement,A,B,Term2) :- + +(match1(A,Statement)->Term2=B; (Statement=[Statement1|Statement2], + ((match1(A,Statement1)->Statement1_a=B; + replace_in_term(Statement1,A,B,Statement1_a)), + replace_in_term(Statement2,A,B,Statement2_a))), + (isvar(Statement2_a)->Statement2_b=["|",Statement2_a]; + Statement2_b=Statement2_a), +append([Statement1_a],Statement2_b,Term2)),!. + +replace_in_term(A,B,_,A) :- not(match1(A,B)),!. + + +replace_empty_with_undefined(Values,Values_u) :- + get_lang_word("v",Dbw_v), + findall(Values_u1,(member(Value,Values),replace_in_term(Value,[Dbw_v,_],_,Values_u1)),Values_u),!.
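To illustrate `replace_in_term/4` above: it walks a nested list, replacing each subterm that `match1/2` equates with the search term. A sketch of an expected query (assuming `isvar/1`, defined elsewhere in the interpreter, fails for non-variable terms):

```prolog
% Replace every occurrence of x with y in a nested list:
% ?- replace_in_term([x, [a, x], b], x, y, T).
% Expected under the clauses above: T = [y, [a, y], b].
```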
+replace_undefined_with_empty(Values,Values_e) :- +find_v_sys(V_sys), + findall(Values_e1,(member(Value,Values),replace_in_term(Value,_,V_sys,Values_e1)),Values_e),!. +%*/ \ No newline at end of file diff --git a/listprologinterpreter/reserved_words2.pl b/listprologinterpreter/reserved_words2.pl new file mode 100644 index 0000000000000000000000000000000000000000..1ef609d863de3afa8af9b8615d88752d413536b6 --- /dev/null +++ b/listprologinterpreter/reserved_words2.pl @@ -0,0 +1,3 @@ +reserved_words2(["+","-","*","/","abort","any","append","atom","brackets","call","ceiling","code","creep","cut","date","delete","equals1","equals2","equals3","equals4","equals4_on","equals4_off","exit","fail","findall","grammar","head","is","length","letters","list","member","member2","member3","n","not","number","or","predicatename","random","round","skip","string","string_from_file","stringconcat","stringtonumber","sys","findall_sys","t","tail","true","unwrap","v","variable","vgp","wrap","input","output","string_length","sort","intersection","read_string","writeln","atom_string","trace","notrace","sqrt","notrace","get_lang_word" +,"on_true","go_after","on_false","go_to_predicates", +"exit_function","fail_function","findall_exit_function","findall_fail_function"]). \ No newline at end of file diff --git a/listprologinterpreter/run_all.pl b/listprologinterpreter/run_all.pl new file mode 100644 index 0000000000000000000000000000000000000000..969e42e69161c47409ba28b05833196987796b19 --- /dev/null +++ b/listprologinterpreter/run_all.pl @@ -0,0 +1,139 @@ +%% test_run_all(Debug[on/off],Total,Score). + +%%:- use_module(library(time)). + +%% test_run_all cases, Debug=trace=on or off, NTotal=output=total cases, Score=output=result + +test_run_all(Debug,NTotal,Score) :- test_run_all(Debug,0,NTotal,0,Score),!. +test_run_all(_Debug,NTotal,NTotal,Score,Score) :- NTotal=1, !. 
+test_run_all(Debug,NTotal1,NTotal2,Score1,Score2) :- +%% Finds the alg given i/o + NTotal3 is NTotal1+1, + functions(Functions), + test_run_all(NTotal3,Query,Function1,Result), + ((findall([Function,R],(member(Function,Functions),international_interpret([lang,"en"],Debug,Query,Function,R)),Rs),member([Function1,Result],Rs)) + %%writeln1(Result2 + ->(Score3 is Score1+1,writeln0([test_run_all,NTotal3,passed]));(Score3=Score1,writeln0([test_run_all,NTotal3,failed]))), + writeln0(""), + test_run_all(Debug,NTotal3,NTotal2,Score3,Score2),!. + +%% test_run_all individual cases, Debug=trace=on or off, N=case number, Passed=output=result + +test1(Debug,N,Passed) :- + functions(Functions), + test_run_all(N,Query,Function1,Result), + (((findall([Function,R],(member(Function,Functions),international_interpret([lang,"en"],Debug,Query,Function,R) + + %%,writeln1(R) + + ),Rs),%%writeln(Rs), + %%writeln1([Function1,Result,Rs]), + %%member([Function1,Result],Rs)), + member([Function1,Result],Rs)) + %%Function1=Function10,Result=Result0 + %%Result=Result1 + )->(Passed=passed,writeln0([test_run_all,N,passed]));(Passed=failed,writeln([test_run_all,N,failed]))),!. 
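The `test1/3` entry point above can be called per case; a sketch of a session (assuming the interpreter and its `functions/1` data are loaded; the outcome depends on which function groups are uncommented):

```prolog
% Run test case 2 (the duplicates algorithm) without tracing:
% ?- test1(off, 2, Passed).
% Passed is bound to passed or failed depending on the result.
```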
+ +functions([ +/** +[%% reverse +[[n,a],[[],[v,l],[v,l]]], +[[n,a],[[v,l],[v,m],[v,n]],":-",[[[n,head],[[v,l],[v,h]]],[[n,tail],[[v,l],[v,t]]],[[n,wrap],[[v,h],[v,h1]]],[[n,append],[[v,h1],[v,m],[v,o]]],[[n,a],[[v,t],[v,o],[v,n]]]]] +], +[ %% intersection +[[n,a],[[],[v,a],[v,l],[v,l]]], +[[n,a],[[v,l1],[v,l2],[v,l3a],[v,l3]],":-",[[[n,head],[[v,l1],[v,i1]]],[[n,tail],[[v,l1],[v,l4]]],[[n,intersection2],[[v,i1],[v,l2],[],[v,l5]]],[[n,append],[[v,l3a],[v,l5],[v,l6]]],[[n,a],[[v,l4],[v,l2],[v,l6],[v,l3]]]]], +[[n,intersection2],[[v,a],[],[v,l],[v,l]]], +[[n,intersection2],[[v,i1],[v,l1],[v,l2],[v,l3]],":-",[[[n,head],[[v,l1],[v,i1]]],[[n,tail],[[v,l1],[v,l4]]],[[n,wrap],[[v,i1],[v,i11]]],[[n,append],[[v,l2],[v,i11],[v,l3]]]]],%%,[[n,intersection2],[[v,i1],[v,l4],[v,l5],[v,l3]]]]], +[[n,intersection2],[[v,i1],[v,l1],[v,l2],[v,l3]],":-",[[[n,head],[[v,l1],[v,i2]]],[[n,tail],[[v,l1],[v,l4]]],[[n,not],[[[n,=],[[v,i1],[v,i2]]]]],[[n,intersection2],[[v,i1],[v,l4],[v,l2],[v,l3]]]]] +], +[ +[[n,append1],[[v,b],[v,c],[v,a]],":-",[[[n,append],[[v,b],[v,c],[v,a]]]]] +], +[ +[[n,minus1],[[v,l],[],[v,l]]], +[[n,minus1],[[v,l1],[v,l2],[v,l3]],":-",[[[n,head],[[v,l2],[v,i1]]],[[n,tail],[[v,l2],[v,l5]]],[[n,delete2],[[v,l1],[v,i1],[],[v,l6]]],[[n,minus1],[[v,l6],[v,l5],[v,l3]]]]], +[[n,delete2],[[],[v,a],[v,l],[v,l]]], +[[n,delete2],[[v,l1],[v,i1],[v,l2],[v,l3]],":-",[[[n,head],[[v,l1],[v,i1]]],[[n,tail],[[v,l1],[v,l5]]],[[n,delete2],[[v,l5],[v,i1],[v,l2],[v,l3]]]]], +[[n,delete2],[[v,l1],[v,i1],[v,l2],[v,l3]],":-",[[[n,head],[[v,l1],[v,i2]]],[[n,tail],[[v,l1],[v,l5]]],[[n,not],[[[n,=],[[v,i1],[v,i2]]]]],[[n,wrap],[[v,i2],[v,i21]]],[[n,append],[[v,l2],[v,i21],[v,l6]]],[[n,delete2],[[v,l5],[v,i1],[v,l6],[v,l3]]]]] +], +[ +[[n,mutuallyexclusive],[[],[v,l]]], +[[n,mutuallyexclusive],[[v,l],[v,m]],":-",[[[n,head],[[v,l],[v,h]]],[[n,tail],[[v,l],[v,t]]],[[n,membera3],[[v,m],[v,h]]],[[n,mutuallyexclusive],[[v,t],[v,m]]]]], +[[n,membera3],[[],[v,l]]], 
+[[n,membera3],[[v,l],[v,m]],":-",[[[n,head],[[v,l],[v,h]]],[[n,tail],[[v,l],[v,t]]],[[n,not],[[[n,=],[[v,m],[v,h]]]]],[[n,membera3],[[v,t],[v,m]]]]] +], +**/ +[ % duplicates +[[n,a],[[],[v,l],[v,s],[v,s]]], +[[n,a],[[v,l],[v,m],[v,s1],[v,s2]],":-",[[[n,head],[[v,l],[v,h]]],[[n,tail],[[v,l],[v,t]]],[[n,member],[[v,h],[v,m]]],[[n,"->"],[[[n,deletea2],[[v,m],[v,h],[v,m1]]],[[n,true]],[[n,=],[[v,m],[v,m1]]]]],[[n,wrap],[[v,h],[v,h1]]],[[n,append],[[v,s1],[v,h1],[v,s3]]],[[n,a],[[v,t],[v,m1],[v,s3],[v,s2]]]]], +[[n,a],[[v,l],[v,m],[v,s1],[v,s2]],":-",[[[n,head],[[v,l],[v,h]]],[[n,tail],[[v,l],[v,t]]],[[n,not],[[[n,membera4],[[v,m],[v,h]]]]],[[n,a],[[v,t],[v,m],[v,s1],[v,s2]]]]], +[[n,deletea2],[[],[v,l],[v,m1]],":-",[[[n,fail]]]], +[[n,deletea2],[[v,l],[v,m],[v,t]],":-",[[[n,head],[[v,l],[v,h]]],[[n,tail],[[v,l],[v,t]]],[[n,=],[[v,m],[v,h]]]]], +[[n,deletea2],[[v,l],[v,m],[v,m1]],":-",[[[n,head],[[v,l],[v,h]]],[[n,tail],[[v,l],[v,t]]],[[n,not],[[[n,=],[[v,m],[v,h]]]]],[[n,deletea2],[[v,t],[v,m],[v,m1]]]]],[[n,membera4],[[],[v,l]],":-",[[[n,fail]]]], +[[n,membera4],[[v,l],[v,h]],":-",[[[n,head],[[v,l],[v,h]]]]], +[[n,membera4],[[v,l],[v,m]],":-",[[[n,head],[[v,l],[v,h]]],[[n,tail],[[v,l],[v,t]]],[[n,not],[[[n,=],[[v,m],[v,h]]]]],[[n,membera4],[[v,t],[v,m]]]]] +] + +/**, +[[n,substring],[[],[]]], +[[n,substring],[[],[v,b]],":-",[[[n,not],[[[n,=],[[v,b],[]]]]],[[n,fail]]]], + +[[n,substring],[[],[]]], +[[n,substring],[[],[v,b]],":-",[[[n,not],[[[n,=],[[v,b],[]]]]],[[n,fail]]]], +%%[[n,substring],[[v,a],[v,b]],":-",[[[n,tail],[[v,a],[v,at]]],[[n,"->"],[[[n,listhead],[[v,a],[v,b]]],[[[n,true]]],[[[n,substring],[[v,at],[v,b]]]]]]]], +[[n,substring],[[v,a],[v,b]],":-",[[[n,tail],[[v,a],[v,at]]],[[n,"->"],[[[[n,listhead],[[v,a],[v,b]]]],[[[n,true]]],[[[n,substring],[[v,at],[v,b]]]]]]]] +], +[ +[[n,listhead],[[v,l],[]]], 
+[[n,listhead],[[v,a],[v,b]],":-",[[[n,head],[[v,a],[v,ah]]],[[n,tail],[[v,a],[v,at]]],[[n,head],[[v,b],[v,ah]]],[[n,tail],[[v,b],[v,bt]]],[[n,listhead],[[v,at],[v,bt]]]]]%%, +%%[[n,listhead],[[v,a],[v,b]],":-",[[[n,head],[[v,a],[v,ah]]],[[n,tail],[[v,a],[v,at]]],[[n,head],[[v,b],[v,ah]]],[[n,tail],[[v,b],[v,bt]]],[[n,listhead],[[v,at],[v,bt]]]]] +] +**/ +]). + + +%%writeln([eg1]), +test_run_all(1,[[n,a],[[["select,dash"],["neiey,person"],["neiey,person"]],[["select,dash"],["neiey,person"],["neiey,person"]],[],[v,c]]], + +[ % intersection +[[n,a],[[],[v,a],[v,l],[v,l]]], +[[n,a],[[v,l1],[v,l2],[v,l3a],[v,l3]],":-",[[[n,head],[[v,l1],[v,i1]]],[[n,tail],[[v,l1],[v,l4]]],[[n,intersection2],[[v,i1],[v,l2],[],[v,l5]]],[[n,append],[[v,l3a],[v,l5],[v,l6]]],[[n,a],[[v,l4],[v,l2],[v,l6],[v,l3]]]]], +[[n,intersection2],[[v,a],[],[v,l],[v,l]]], +[[n,intersection2],[[v,i1],[v,l1],[v,l2],[v,l3]],":-",[[[n,head],[[v,l1],[v,i1]]],[[n,tail],[[v,l1],[v,l4]]],[[n,wrap],[[v,i1],[v,i11]]],[[n,append],[[v,l2],[v,i11],[v,l3]]]]],%%[[n,intersection2],[[v,i1],[v,l4],[v,l5],[v,l3]]]]], +[[n,intersection2],[[v,i1],[v,l1],[v,l2],[v,l3]],":-",[[[n,head],[[v,l1],[v,i2]]],[[n,tail],[[v,l1],[v,l4]]],[[n,not],[[[n,=],[[v,i1],[v,i2]]]]],[[n,intersection2],[[v,i1],[v,l4],[v,l2],[v,l3]]]]] +] +,[[[[v,c], [["select,dash"],["neiey,person"],["neiey,person"]]]]]). 
+ + +test_run_all(2,[[n,a],[[["select,dash"],["neiey,person"],["neiey,person"],["neiey,person"],["neiey,person"]],[["select,dash"],["neiey,person"],["neiey,person"],["neiey,person"],["neiey,person"]],[],[v,c]]], + +[ % duplicates +[[n,a],[[],[v,l],[v,s],[v,s]]], +[[n,a],[[v,l],[v,m],[v,s1],[v,s2]],":-",[[[n,head],[[v,l],[v,h]]],[[n,tail],[[v,l],[v,t]]],[[n,member],[[v,h],[v,m]]],[[n,"->"],[[[n,deletea2],[[v,m],[v,h],[v,m1]]],[[n,true]],[[n,=],[[v,m],[v,m1]]]]],[[n,wrap],[[v,h],[v,h1]]],[[n,append],[[v,s1],[v,h1],[v,s3]]],[[n,a],[[v,t],[v,m1],[v,s3],[v,s2]]]]], +[[n,a],[[v,l],[v,m],[v,s1],[v,s2]],":-",[[[n,head],[[v,l],[v,h]]],[[n,tail],[[v,l],[v,t]]],[[n,not],[[[n,membera4],[[v,m],[v,h]]]]],[[n,a],[[v,t],[v,m],[v,s1],[v,s2]]]]], +[[n,deletea2],[[],[v,l],[v,m1]],":-",[[[n,fail]]]], +[[n,deletea2],[[v,l],[v,m],[v,t]],":-",[[[n,head],[[v,l],[v,h]]],[[n,tail],[[v,l],[v,t]]],[[n,=],[[v,m],[v,h]]]]], +[[n,deletea2],[[v,l],[v,m],[v,m1]],":-",[[[n,head],[[v,l],[v,h]]],[[n,tail],[[v,l],[v,t]]],[[n,not],[[[n,=],[[v,m],[v,h]]]]],[[n,deletea2],[[v,t],[v,m],[v,m1]]]]], +[[n,membera4],[[],[v,l]],":-",[[[n,fail]]]], +[[n,membera4],[[v,l],[v,h]],":-",[[[n,head],[[v,l],[v,h]]]]], +[[n,membera4],[[v,l],[v,m]],":-",[[[n,head],[[v,l],[v,h]]],[[n,tail],[[v,l],[v,t]]],[[n,not],[[[n,=],[[v,m],[v,h]]]]],[[n,membera4],[[v,t],[v,m]]]]] +], + +[[[[v,c],[["select,dash"],["neiey,person"],["neiey,person"],["neiey,person"],["neiey,person"]]]]]). 
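A further case could follow the same `test_run_all(N, Query, Functions, Result)` shape; for instance (a hypothetical case number and data, reusing the accumulator-based reverse predicate from the commented-out block in `functions/1`):

```prolog
% Hypothetical extra case: reverse [1,2,3] into [3,2,1].
test_run_all(3,[[n,a],[[1,2,3],[],[v,c]]],
[
[[n,a],[[],[v,l],[v,l]]],
[[n,a],[[v,l],[v,m],[v,n]],":-",[[[n,head],[[v,l],[v,h]]],[[n,tail],[[v,l],[v,t]]],[[n,wrap],[[v,h],[v,h1]]],[[n,append],[[v,h1],[v,m],[v,o]]],[[n,a],[[v,t],[v,o],[v,n]]]]]
],
[[[[v,c],[3,2,1]]]]).
```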
+ +/** +[[n,minus1],[[["select,dash"],["neiey,person"],["neiey,person"],["neiey,person"],["neiey,person"]],[["select,dash"],["neiey,person"],["neiey,person"],["neiey,person"],["neiey,person"]],[]]], + +[[n,reverse],[[["select,dash"],["neiey,person"],["neiey,person"]],[],[["neiey,person"],["neiey,person"],["select,dash"]]]], + +[[n,intersection1],[[["neiey,person"],["neiey,person"],["select,dash"]],[["hipaa,square"],["releases,up"],["hipaa,square"]],[],[]]], + +[[n,append1],[[],[["hipaa,square"],["releases,up"],["hipaa,square"]],[["hipaa,square"],["releases,up"],["hipaa,square"]]]], + +[[n,minus1],[[["hipaa,square"],["releases,up"],["hipaa,square"]],[["select,dash"],["neiey,person"],["neiey,person"]],[["hipaa,square"],["releases,up"],["hipaa,square"]]]] + + +] +,[[[[v,c], 2]]]). +**/ \ No newline at end of file diff --git a/listprologinterpreter/simplify_types.pl b/listprologinterpreter/simplify_types.pl new file mode 100644 index 0000000000000000000000000000000000000000..c835238716a5a07e372ce1650bf2c28dd11a8de8 --- /dev/null +++ b/listprologinterpreter/simplify_types.pl @@ -0,0 +1,110 @@ +%% simplify_types.pl + +/** + +?- simplify_types([[[t, brackets], [[t, number]]], [t, string], [t, number]],[],T). +T = [[[t, number]], [t, string], [t, number]]. + +**/ + + +simplify_types([],Types,Types) :- !. +simplify_types(Data,Types1,Types2) :- +get_lang_word("t",T), +get_lang_word("number",Dbw_number), +Data=[T,Dbw_number], +% number(Data), +append(Types1,[[T,Dbw_number]],Types2),!. +simplify_types(Data,Types1,Types2) :- +get_lang_word("t",T), +get_lang_word("string",Dbw_string), +Data=[T,Dbw_string], + %string(Data), + append(Types1,[[T,Dbw_string]],Types2),!. +simplify_types(Data,Types1,Types2) :- +get_lang_word("t",T), +get_lang_word("atom",Dbw_atom), +Data=[T,Dbw_atom], + %string(Data), + append(Types1,[[T,Dbw_atom]],Types2),!. 
+simplify_types(Data,Types1,Types2) :- +%get_lang_word("t",T), +%get_lang_word("atom",Dbw_atom), +Data="|", + %string(Data), + append(Types1,["|"],Types2),!. +simplify_types(Data1,Types1,Types2) :- +get_lang_word("t",T), +get_lang_word("brackets",Dbw_brackets), + Data1=[[[T,Dbw_brackets],Types4]|Types6], + simplify_types(Types4,[],Data2), + Types5=[Data2], + append_list3([Types1,Types5],Types2a), + +simplify_types(Types6,Types2a,Types2),!. + + +simplify_types(Data1,Types1,Types2) :- +get_lang_word("t",T), +get_lang_word("list",Dbw_list), + Data1=[[[T,Dbw_list],Types4]|Types6], +%trace, + simplify_types(Types4,[],Data2),%[Data2|Data2a]), + +(Data2=[Data2a]->Types5=[{Data2a}]; +(Data2=[Data2a|Data2b]-> +(square_to_round([Data2a|Data2b],Types52), +round_to_curly(Types52,Types51), +Types5=[Types51]);false)), + append_list3([Types1,Types5],Types2a), + simplify_types(Types6,Types2a,Types2),!. + + /* +simplify_types(Data1,Types1,Types2) :- +get_lang_word("t",T), +get_lang_word("brackets",Dbw_brackets), + Data1=[[T,Dbw_brackets]|Types4], + simplify_types(Types4,[],Data2), + Types5=[[Data2],Data4], + simplify_types(Data4,[],Types6), + append_list3([Types1,Types5,Types6],Types2),!. +*/ + +/* +simplify_types(Data1,Types1,Types2) :- +get_lang_word("t",T), +get_lang_word("brackets",Dbw_brackets), + Data1=[[Data2|Data3]|Data4], + simplify_types(Data2,[],Types3), + simplify_types(Data3,Types3,Types4), + Types5=[[[T,Dbw_brackets],Types4]], + simplify_types(Data4,[],Types6), + append_list3([Types1,Types5,Types6],Types2),!. + +*/ + + +simplify_types(Data,Types1,Types2) :- +get_lang_word("t",T), +%get_lang_word("string",Dbw_string), +Data=[T,A], + %string(Data), + append(Types1,[[T,A]],Types2),!. + + +simplify_types(Data1,Types1,Types2) :- + Data1=[Data2|Data3], + simplify_types(Data2,Types1,Types3), + simplify_types(Data3,Types3,Types2),!. + %%Types2=[Types4]. + +/* +append_list3(A1,B):- + append_list3(A1,[],B),!. + +append_list3([],A,A):-!. 
+append_list3(List,A,B) :- + List=[Item|Items], + append(A,Item,C), + append_list3(Items,C,B). +*/ \ No newline at end of file diff --git a/listprologinterpreter/test_open_and_types_open_data/.DS_Store b/listprologinterpreter/test_open_and_types_open_data/.DS_Store new file mode 100644 index 0000000000000000000000000000000000000000..5008ddfcf53c02e82d7eee2e57c38e5672ef89f6 Binary files /dev/null and b/listprologinterpreter/test_open_and_types_open_data/.DS_Store differ diff --git a/listprologinterpreter/test_open_and_types_open_data/test_open_data test 3.txt b/listprologinterpreter/test_open_and_types_open_data/test_open_data test 3.txt new file mode 100644 index 0000000000000000000000000000000000000000..6badb1f6969f3e6cc99e27838d61f1f010d15b54 --- /dev/null +++ b/listprologinterpreter/test_open_and_types_open_data/test_open_data test 3.txt @@ -0,0 +1,64 @@ +1 +2 +3 +4 +5 +6 +7 +8 +9 +10 +11 +12 +13 +14 +15 +16 +17 +18 +19 +20 +21 +22 +23 +24 +25 +26 +27 +28 +29 +30 +31 +32 +33 +34 +35 +36 +37 +38 +39 +40 +41 +42 +43 +44 +45 +46 +47 +48 +49 +50 +51 +52 +53 +54 +55 +56 +57 +58 +59 +60 +61 +62 +63 +64 \ No newline at end of file diff --git a/listprologinterpreter/test_open_and_types_open_data/test_open_data test 4.txt b/listprologinterpreter/test_open_and_types_open_data/test_open_data test 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..b17865743d773986327f6057bbd171107f1cfd6e --- /dev/null +++ b/listprologinterpreter/test_open_and_types_open_data/test_open_data test 4.txt @@ -0,0 +1,4 @@ +1 +2 +3 +4 \ No newline at end of file diff --git a/listprologinterpreter/test_open_and_types_open_data/test_open_data test 5.txt b/listprologinterpreter/test_open_and_types_open_data/test_open_data test 5.txt new file mode 100644 index 0000000000000000000000000000000000000000..2e65efe2a145dda7ee51d1741299f848e5bf752e --- /dev/null +++ b/listprologinterpreter/test_open_and_types_open_data/test_open_data test 5.txt @@ -0,0 +1 @@ +a \ No newline at end of 
file diff --git a/listprologinterpreter/test_open_and_types_open_data/test_open_data test 7.txt b/listprologinterpreter/test_open_and_types_open_data/test_open_data test 7.txt new file mode 100644 index 0000000000000000000000000000000000000000..2e65efe2a145dda7ee51d1741299f848e5bf752e --- /dev/null +++ b/listprologinterpreter/test_open_and_types_open_data/test_open_data test 7.txt @@ -0,0 +1 @@ +a \ No newline at end of file diff --git a/listprologinterpreter/test_open_and_types_open_data/test_open_data test 8.txt b/listprologinterpreter/test_open_and_types_open_data/test_open_data test 8.txt new file mode 100644 index 0000000000000000000000000000000000000000..2e65efe2a145dda7ee51d1741299f848e5bf752e --- /dev/null +++ b/listprologinterpreter/test_open_and_types_open_data/test_open_data test 8.txt @@ -0,0 +1 @@ +a \ No newline at end of file diff --git a/listprologinterpreter/test_open_and_types_open_data/test_types_open_data test 10.txt b/listprologinterpreter/test_open_and_types_open_data/test_types_open_data test 10.txt new file mode 100644 index 0000000000000000000000000000000000000000..56a6051ca2b02b04ef92d5150c9ef600403cb1de --- /dev/null +++ b/listprologinterpreter/test_open_and_types_open_data/test_types_open_data test 10.txt @@ -0,0 +1 @@ +1 \ No newline at end of file diff --git a/listprologinterpreter/test_open_and_types_open_data/test_types_open_data test 11.txt b/listprologinterpreter/test_open_and_types_open_data/test_types_open_data test 11.txt new file mode 100644 index 0000000000000000000000000000000000000000..1e354a9f18e7fcaf2e8eae412fe7e73c92f72bcb --- /dev/null +++ b/listprologinterpreter/test_open_and_types_open_data/test_types_open_data test 11.txt @@ -0,0 +1,2 @@ +pedagogy +y \ No newline at end of file diff --git a/listprologinterpreter/test_open_and_types_open_data/test_types_open_data test 12.txt b/listprologinterpreter/test_open_and_types_open_data/test_types_open_data test 12.txt new file mode 100644 index 
0000000000000000000000000000000000000000..e25f1814e51579d5f55c0f1fe0135ddb28a47f4a --- /dev/null +++ b/listprologinterpreter/test_open_and_types_open_data/test_types_open_data test 12.txt @@ -0,0 +1 @@ +y \ No newline at end of file diff --git a/listprologinterpreter/test_open_and_types_open_data/test_types_open_data test 13.txt b/listprologinterpreter/test_open_and_types_open_data/test_types_open_data test 13.txt new file mode 100644 index 0000000000000000000000000000000000000000..2e65efe2a145dda7ee51d1741299f848e5bf752e --- /dev/null +++ b/listprologinterpreter/test_open_and_types_open_data/test_types_open_data test 13.txt @@ -0,0 +1 @@ +a \ No newline at end of file diff --git a/listprologinterpreter/test_open_and_types_open_data/test_types_open_data test 14.txt b/listprologinterpreter/test_open_and_types_open_data/test_types_open_data test 14.txt new file mode 100644 index 0000000000000000000000000000000000000000..78981922613b2afb6025042ff6bd878ac1994e85 --- /dev/null +++ b/listprologinterpreter/test_open_and_types_open_data/test_types_open_data test 14.txt @@ -0,0 +1 @@ +a diff --git a/listprologinterpreter/test_open_and_types_open_data/test_types_open_data test 15.txt b/listprologinterpreter/test_open_and_types_open_data/test_types_open_data test 15.txt new file mode 100644 index 0000000000000000000000000000000000000000..e0914d6807d07381e4301341f90363fdc83a5b5d --- /dev/null +++ b/listprologinterpreter/test_open_and_types_open_data/test_types_open_data test 15.txt @@ -0,0 +1,2 @@ +y +y \ No newline at end of file diff --git a/listprologinterpreter/test_open_and_types_open_data/test_types_open_data test 16.txt b/listprologinterpreter/test_open_and_types_open_data/test_types_open_data test 16.txt new file mode 100644 index 0000000000000000000000000000000000000000..45cad29e7bb1a772465934d1ca6ea6b9302525d3 --- /dev/null +++ b/listprologinterpreter/test_open_and_types_open_data/test_types_open_data test 16.txt @@ -0,0 +1 @@ +professor \ No newline at end of file 
diff --git a/listprologinterpreter/test_open_and_types_open_data/test_types_open_data test 17.txt b/listprologinterpreter/test_open_and_types_open_data/test_types_open_data test 17.txt new file mode 100644 index 0000000000000000000000000000000000000000..2e65efe2a145dda7ee51d1741299f848e5bf752e --- /dev/null +++ b/listprologinterpreter/test_open_and_types_open_data/test_types_open_data test 17.txt @@ -0,0 +1 @@ +a \ No newline at end of file diff --git a/listprologinterpreter/test_open_and_types_open_data/test_types_open_data test 18.txt b/listprologinterpreter/test_open_and_types_open_data/test_types_open_data test 18.txt new file mode 100644 index 0000000000000000000000000000000000000000..ef073cc45ccc66c921adc7ccc6221e38aa54ae17 --- /dev/null +++ b/listprologinterpreter/test_open_and_types_open_data/test_types_open_data test 18.txt @@ -0,0 +1 @@ +n \ No newline at end of file diff --git a/listprologinterpreter/test_open_and_types_open_data/test_types_open_data test 19.txt b/listprologinterpreter/test_open_and_types_open_data/test_types_open_data test 19.txt new file mode 100644 index 0000000000000000000000000000000000000000..e25f1814e51579d5f55c0f1fe0135ddb28a47f4a --- /dev/null +++ b/listprologinterpreter/test_open_and_types_open_data/test_types_open_data test 19.txt @@ -0,0 +1 @@ +y \ No newline at end of file diff --git a/listprologinterpreter/test_open_and_types_open_data/test_types_open_data test 20.txt b/listprologinterpreter/test_open_and_types_open_data/test_types_open_data test 20.txt new file mode 100644 index 0000000000000000000000000000000000000000..e25f1814e51579d5f55c0f1fe0135ddb28a47f4a --- /dev/null +++ b/listprologinterpreter/test_open_and_types_open_data/test_types_open_data test 20.txt @@ -0,0 +1 @@ +y \ No newline at end of file diff --git a/listprologinterpreter/test_open_and_types_open_data/test_types_open_data test 21.txt b/listprologinterpreter/test_open_and_types_open_data/test_types_open_data test 21.txt new file mode 100644 index 
0000000000000000000000000000000000000000..2e65efe2a145dda7ee51d1741299f848e5bf752e --- /dev/null +++ b/listprologinterpreter/test_open_and_types_open_data/test_types_open_data test 21.txt @@ -0,0 +1 @@ +a \ No newline at end of file diff --git a/listprologinterpreter/test_open_and_types_open_data/test_types_open_data test 22.txt b/listprologinterpreter/test_open_and_types_open_data/test_types_open_data test 22.txt new file mode 100644 index 0000000000000000000000000000000000000000..1d5782d280f27c703a7937260a13358cb4048810 --- /dev/null +++ b/listprologinterpreter/test_open_and_types_open_data/test_types_open_data test 22.txt @@ -0,0 +1,3 @@ +a +a +a \ No newline at end of file diff --git a/listprologinterpreter/test_open_and_types_open_data/test_types_open_data test 23.txt b/listprologinterpreter/test_open_and_types_open_data/test_types_open_data test 23.txt new file mode 100644 index 0000000000000000000000000000000000000000..e25f1814e51579d5f55c0f1fe0135ddb28a47f4a --- /dev/null +++ b/listprologinterpreter/test_open_and_types_open_data/test_types_open_data test 23.txt @@ -0,0 +1 @@ +y \ No newline at end of file diff --git a/listprologinterpreter/test_open_and_types_open_data/test_types_open_data test 24.txt b/listprologinterpreter/test_open_and_types_open_data/test_types_open_data test 24.txt new file mode 100644 index 0000000000000000000000000000000000000000..e25f1814e51579d5f55c0f1fe0135ddb28a47f4a --- /dev/null +++ b/listprologinterpreter/test_open_and_types_open_data/test_types_open_data test 24.txt @@ -0,0 +1 @@ +y \ No newline at end of file diff --git a/listprologinterpreter/test_open_and_types_open_data/test_types_open_data test 25.txt b/listprologinterpreter/test_open_and_types_open_data/test_types_open_data test 25.txt new file mode 100644 index 0000000000000000000000000000000000000000..889b95267cd0649e729e90b6e16a416092425c01 --- /dev/null +++ b/listprologinterpreter/test_open_and_types_open_data/test_types_open_data test 25.txt @@ -0,0 +1,16 @@ +1 +2 
+3 +4 +5 +6 +7 +8 +9 +10 +11 +12 +13 +14 +15 +16 \ No newline at end of file diff --git a/listprologinterpreter/test_open_and_types_open_data/test_types_open_data test 26.txt b/listprologinterpreter/test_open_and_types_open_data/test_types_open_data test 26.txt new file mode 100644 index 0000000000000000000000000000000000000000..b17865743d773986327f6057bbd171107f1cfd6e --- /dev/null +++ b/listprologinterpreter/test_open_and_types_open_data/test_types_open_data test 26.txt @@ -0,0 +1,4 @@ +1 +2 +3 +4 \ No newline at end of file diff --git a/listprologinterpreter/test_open_and_types_open_data/test_types_open_data test 27.txt b/listprologinterpreter/test_open_and_types_open_data/test_types_open_data test 27.txt new file mode 100644 index 0000000000000000000000000000000000000000..e25f1814e51579d5f55c0f1fe0135ddb28a47f4a --- /dev/null +++ b/listprologinterpreter/test_open_and_types_open_data/test_types_open_data test 27.txt @@ -0,0 +1 @@ +y \ No newline at end of file diff --git a/listprologinterpreter/test_open_and_types_open_data/test_types_open_data test 28.txt b/listprologinterpreter/test_open_and_types_open_data/test_types_open_data test 28.txt new file mode 100644 index 0000000000000000000000000000000000000000..e25f1814e51579d5f55c0f1fe0135ddb28a47f4a --- /dev/null +++ b/listprologinterpreter/test_open_and_types_open_data/test_types_open_data test 28.txt @@ -0,0 +1 @@ +y \ No newline at end of file diff --git a/listprologinterpreter/test_open_and_types_open_data/test_types_open_data test 29.txt b/listprologinterpreter/test_open_and_types_open_data/test_types_open_data test 29.txt new file mode 100644 index 0000000000000000000000000000000000000000..7a754f414cd8ac6c069ac2e25baff0f55bff8b4a --- /dev/null +++ b/listprologinterpreter/test_open_and_types_open_data/test_types_open_data test 29.txt @@ -0,0 +1,2 @@ +1 +2 \ No newline at end of file diff --git a/listprologinterpreter/test_open_and_types_open_data/test_types_open_data test 30.txt 
b/listprologinterpreter/test_open_and_types_open_data/test_types_open_data test 30.txt new file mode 100644 index 0000000000000000000000000000000000000000..e25f1814e51579d5f55c0f1fe0135ddb28a47f4a --- /dev/null +++ b/listprologinterpreter/test_open_and_types_open_data/test_types_open_data test 30.txt @@ -0,0 +1 @@ +y \ No newline at end of file diff --git a/listprologinterpreter/test_open_and_types_open_data/test_types_open_data test 31.txt b/listprologinterpreter/test_open_and_types_open_data/test_types_open_data test 31.txt new file mode 100644 index 0000000000000000000000000000000000000000..0bc999a3e6aa2c0a9be8d7750b47540bab82fc84 --- /dev/null +++ b/listprologinterpreter/test_open_and_types_open_data/test_types_open_data test 31.txt @@ -0,0 +1,2 @@ +1 +1 \ No newline at end of file diff --git a/listprologinterpreter/test_open_and_types_open_data/test_types_open_data test 32.txt b/listprologinterpreter/test_open_and_types_open_data/test_types_open_data test 32.txt new file mode 100644 index 0000000000000000000000000000000000000000..d3c6d002235b168024840827ee62bce1e80a9d12 --- /dev/null +++ b/listprologinterpreter/test_open_and_types_open_data/test_types_open_data test 32.txt @@ -0,0 +1,2 @@ +1 +y \ No newline at end of file diff --git a/listprologinterpreter/test_open_and_types_open_data/test_types_open_data test 33.txt b/listprologinterpreter/test_open_and_types_open_data/test_types_open_data test 33.txt new file mode 100644 index 0000000000000000000000000000000000000000..56a6051ca2b02b04ef92d5150c9ef600403cb1de --- /dev/null +++ b/listprologinterpreter/test_open_and_types_open_data/test_types_open_data test 33.txt @@ -0,0 +1 @@ +1 \ No newline at end of file diff --git a/listprologinterpreter/test_open_and_types_open_data/test_types_open_data test 34.txt b/listprologinterpreter/test_open_and_types_open_data/test_types_open_data test 34.txt new file mode 100644 index 0000000000000000000000000000000000000000..0bc999a3e6aa2c0a9be8d7750b47540bab82fc84 --- 
/dev/null +++ b/listprologinterpreter/test_open_and_types_open_data/test_types_open_data test 34.txt @@ -0,0 +1,2 @@ +1 +1 \ No newline at end of file diff --git a/listprologinterpreter/test_open_and_types_open_data/test_types_open_data test 35.txt b/listprologinterpreter/test_open_and_types_open_data/test_types_open_data test 35.txt new file mode 100644 index 0000000000000000000000000000000000000000..e0914d6807d07381e4301341f90363fdc83a5b5d --- /dev/null +++ b/listprologinterpreter/test_open_and_types_open_data/test_types_open_data test 35.txt @@ -0,0 +1,2 @@ +y +y \ No newline at end of file diff --git a/listprologinterpreter/test_open_and_types_open_data/test_types_open_data test 4.txt b/listprologinterpreter/test_open_and_types_open_data/test_types_open_data test 4.txt new file mode 100644 index 0000000000000000000000000000000000000000..0a207c060e61f3b88eaee0a8cd0696f46fb155eb --- /dev/null +++ b/listprologinterpreter/test_open_and_types_open_data/test_types_open_data test 4.txt @@ -0,0 +1,2 @@ +a +b \ No newline at end of file diff --git a/listprologinterpreter/test_open_and_types_open_data/test_types_open_data test 5.txt b/listprologinterpreter/test_open_and_types_open_data/test_types_open_data test 5.txt new file mode 100644 index 0000000000000000000000000000000000000000..acee907170080a92f0f7a3680093842be18d276c --- /dev/null +++ b/listprologinterpreter/test_open_and_types_open_data/test_types_open_data test 5.txt @@ -0,0 +1,6 @@ +a +n +a +n +b +y \ No newline at end of file diff --git a/listprologinterpreter/test_open_and_types_open_data/test_types_open_data test 6.txt b/listprologinterpreter/test_open_and_types_open_data/test_types_open_data test 6.txt new file mode 100644 index 0000000000000000000000000000000000000000..d923c65b16ef6ed1ac05ebbac3dad26d271d2b31 --- /dev/null +++ b/listprologinterpreter/test_open_and_types_open_data/test_types_open_data test 6.txt @@ -0,0 +1,3 @@ +y +y +y \ No newline at end of file diff --git 
a/listprologinterpreter/test_open_and_types_open_data/test_types_open_data test 7.txt b/listprologinterpreter/test_open_and_types_open_data/test_types_open_data test 7.txt new file mode 100644 index 0000000000000000000000000000000000000000..c172c4d1b52aa9634ad5b154116d63bb169ec13b --- /dev/null +++ b/listprologinterpreter/test_open_and_types_open_data/test_types_open_data test 7.txt @@ -0,0 +1,81 @@ +1 +2 +3 +4 +5 +6 +7 +8 +9 +10 +11 +12 +13 +14 +15 +16 +17 +18 +19 +20 +21 +22 +23 +24 +25 +26 +27 +28 +29 +30 +31 +32 +33 +34 +35 +36 +37 +38 +39 +40 +41 +42 +43 +44 +45 +46 +47 +48 +49 +50 +51 +52 +53 +54 +55 +56 +57 +58 +59 +60 +61 +62 +63 +64 +65 +66 +67 +68 +69 +70 +71 +72 +73 +74 +75 +76 +77 +78 +79 +80 +81 \ No newline at end of file diff --git a/listprologinterpreter/test_open_and_types_open_data/test_types_open_data test 8.txt b/listprologinterpreter/test_open_and_types_open_data/test_types_open_data test 8.txt new file mode 100644 index 0000000000000000000000000000000000000000..e0914d6807d07381e4301341f90363fdc83a5b5d --- /dev/null +++ b/listprologinterpreter/test_open_and_types_open_data/test_types_open_data test 8.txt @@ -0,0 +1,2 @@ +y +y \ No newline at end of file diff --git a/listprologinterpreter/test_open_and_types_open_data/test_types_open_data test 9.txt b/listprologinterpreter/test_open_and_types_open_data/test_types_open_data test 9.txt new file mode 100644 index 0000000000000000000000000000000000000000..f65afebce6c52f53971743cd8f08d1c869513734 --- /dev/null +++ b/listprologinterpreter/test_open_and_types_open_data/test_types_open_data test 9.txt @@ -0,0 +1,3 @@ +y +y +brown \ No newline at end of file diff --git a/luciancicd/.DS_Store b/luciancicd/.DS_Store new file mode 100644 index 0000000000000000000000000000000000000000..80580da7b5db051d18b15f148cd3a545cfa9eace Binary files /dev/null and b/luciancicd/.DS_Store differ diff --git a/luciancicd/LICENSE b/luciancicd/LICENSE new file mode 100644 index 
0000000000000000000000000000000000000000..43ac320c124b6bba1157a463da11bcfebfd47839 --- /dev/null +++ b/luciancicd/LICENSE @@ -0,0 +1,28 @@ +BSD 3-Clause License + +Copyright (c) 2023, Lucian Green + +Redistribution and use in source and binary forms, with or without +modification, are permitted provided that the following conditions are met: + +1. Redistributions of source code must retain the above copyright notice, this + list of conditions and the following disclaimer. + +2. Redistributions in binary form must reproduce the above copyright notice, + this list of conditions and the following disclaimer in the documentation + and/or other materials provided with the distribution. + +3. Neither the name of the copyright holder nor the names of its + contributors may be used to endorse or promote products derived from + this software without specific prior written permission. + +THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" +AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE +IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE +DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE +FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL +DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR +SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER +CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, +OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE +OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. 
diff --git a/luciancicd/README.md b/luciancicd/README.md new file mode 100644 index 0000000000000000000000000000000000000000..f9095f9a06904659ecdb5a00580d6fa46cf5715a --- /dev/null +++ b/luciancicd/README.md @@ -0,0 +1,170 @@ +# Lucian CI/CD + +![761B2A85-6DA4-4EA7-9F20-8CCB2DC60D28](https://user-images.githubusercontent.com/15845542/234572372-8446f119-6151-4ea8-844b-4df89f605143.jpeg) + +* Image of different coloured pipelines + +* Single-user continuous integration and continuous deployment. Integrates (merges changed repositories from combinations of changes), builds the repositories that depend on these repositories and tests the predicates implicated, where tests are in comments before each predicate. + +* Bottom-up Version: Lucian CI/CD can now check sets of repositories bottom-up, which means there are up to seven possible changes to a set of current predicates to find a working combination. The current predicates are each taken in depth-first, post-order, or in sets of clauses or predicates involved in loops. + +* Programming languages other than Prolog aren't fully supported by the bottom-up version yet (although the non-bottom-up version supports them). + +# Getting Started + +Please read the following instructions on installing the project on your computer for automatic testing. + +# Prerequisites + +* Please download and install SWI-Prolog for your machine at `https://www.swi-prolog.org/build/`. + +# 1. Install manually + +* Download: +* this repository and its dependencies +* List Prolog to Prolog Converter + +# 2. Or Install from List Prolog Package Manager (LPPM) + +* Download the LPPM Repository: + +``` +mkdir GitHub +cd GitHub/ +git clone https://github.com/luciangreen/List-Prolog-Package-Manager.git +cd List-Prolog-Package-Manager +swipl +['lppm']. +lppm_install("luciangreen","luciancicd"). +halt +``` + +# Running + +* In Shell: +`cd luciancicd` +`swipl` + +* To load the algorithm, enter: +``` +['luciancicd.pl']. 
+``` + +# Instructions and troubleshooting + +* To correct failures, run Lucian CI/CD on algorithms either all at once or one predicate at a time with its dependencies, then make the necessary changes and run it again. + +* Lucian CI/CD helps you refactor and debug, and points to what needs debugging. It builds the algorithm from only the parts supported by the tests. This method requires checking that the code from before is as complete as possible, and checking the tests, the code afterwards and the algorithm's result. + +* Store your repositories, e.g. `b`, in the folder `GitHub2` (specified in `settings.pl`; not to be confused with `GitHub`, the folder housing your GitHub repositories). These folders contain Prolog (`*.pl`) and other files. + +* Learn Prolog with `family.pl` and search for Prolog tutorials. Examine the examples in `luciancicdverify.pl` and write examples to help test and debug algorithms. The following instructions will help you use Lucian CI/CD more easily. + +* In `settings.pl`, change the LPPM user (your GitHub username), the repositories, any omitted folders, the output folder, `fail_if_greater_than_n_changes1` (used to find more combinations of lines to test) and the time limit for running algorithms. + +* If necessary, modify tests in the Prolog files in the repositories, for example: + +``` +% remove_and_find_item_number([a,b,c],2,c,N2). +% N2 = 2. +% remove_and_find_item_number([a,b,b,c],2,c,N3). +% N3 = 3. +``` + +* Write a `main_file.txt` in the main folder of each repository, e.g.: + +``` +[ + ["c.pl", + [[c,2],[d,3]] + ], + ["c1.pl", + [[c1,2],[d1,3]] + ] +] +``` + +* `main_file.txt` contains the current main file in the repository and its possible main predicate names and arities (the number of arguments). If a repository contains no main files to test, enter `[]`. + +* Note: Dependencies are in `List-Prolog-Package-Manager/lppm_registry.txt` (LPPM) in the form `[[User,Repository,Description,Dependencies], etc]`, e.g. 
+ +``` +["luciangreen","Daily-Regimen","Scripts for daily meditation, bot prep, uni and breasoning.", +[ + ["luciangreen","listprologinterpreter"], + ["luciangreen","Languages"], + ["luciangreen","culturaltranslationtool"], + ["luciangreen","Algorithm-Writer-with-Lists"], + ["luciangreen","Text-to-Breasonings"], + ["luciangreen","mindreader"] +]] +``` + +* Lucian CI/CD only returns an overall successful result if all dependencies connected to a repository, their main files and predicates, and each level in the bottom-up order pass all tests for each predicate. + +* Lucian CI/CD works with: + * Prolog files + * Text data files + +* Check that `main_file.txt` lists all main files (files with main predicates) and all main predicates. + +* Please ensure that each repository set loads the necessary files. + +* `set_up_luciancicd.` - Records the modification times of the repositories in `repositories_paths1//1` (from `settings.pl`). In other words, it saves the files without testing them. + +* `luciancicd.` - Tests repositories with changed modification dates. Run before committing changes. + +* For more info, see a Dev.to article about Lucian CI/CD. + +* Note: Once all tests are successful, the files in the temporary folder `../private2/luciancicd-testing/` are moved into the repository. To undo this, enter `move_to_repository_or_back.`. + +* Once the files are in the repository, you can commit the changes. + +* Note: A notification such as `"Cannot find "../private2/luciancicd-cicd-tests/tests_a.txt"` means the repository `"a"` is connected to a changed file through dependencies, but Lucian CI/CD can't install it. + +* Lucian Academy - Lucian CI/CD Demo + +# Lucian CI/CD Web Service + +* To list available logs and diff files, enter `luciancicd_server(8000).`, then go to `http://localhost:8000/luciancicd`. + +# Caveats + +* Causes of the "1 or more tests failed." error include the first algorithm entered after deleting `Github_lc/tests_*x*.txt` not passing all tests. 
Programs with uninstantiated variables, etc., also cause the error. + +* `writeln` and other commands that don't produce checkable output are not kept unless they are listed in `keep1/1` in the `keep.pl` register. + +* Generated code currently loses formatting (newlines, etc.). Lucian CI/CD will pretty-print the code in a later version. + +* Increase N in `fail_if_greater_than_n_changes1(N).` in `settings.pl` (or set the `fail_if_greater_than_n_changes1` overrider value in `luciancicdverify.pl`) to the N reported as `[Increase Max to,N]` in the test log, to allow more changes to be tested. + +* Before testing, Lucian CI/CD backs up `GitHub2o` and `Github_lc` in `gh2_tmp2`, and `GitHub2` in `gh2_tmp` (at the same level as `GitHub`; see `settings.pl`). + +* If there are problems with testing, remove the contents of `GitHub2`, `GitHub_lc` and `GitHub2o`. + +# Compiling Lucian CI/CD for better performance + +* Enter the command to run, e.g. `luciancicd`, after `main:-` in `main.pl`. + +* In bash, compile Lucian CI/CD with `swipl --goal=main --stand_alone=true -o luciancicd -c luciancicd.pl`. + +* In bash, run it with `./luciancicd`. + +# Tests + +* Running the tests: +To run all tests, enter: +`lc_test(NTotal,Score).` + +To run a specific test: +`lc_test1(TestNumber,Passed).` +where `TestNumber` is the test number from `luciancicdverify.pl`. 
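+* As an illustration of the tests-in-comments convention, a tested predicate and its tests sit together in the source file. The definition below is a hypothetical sketch that satisfies the example tests shown earlier, not the repository's actual code: + +``` +% remove_and_find_item_number([a,b,c],2,c,N2). +% N2 = 2. +% remove_and_find_item_number([a,b,b,c],2,c,N3). +% N3 = 3. +remove_and_find_item_number(List,Remove_N,Item,N) :- + % remove the element at index Remove_N, then + % return the index of Item in the remainder + nth1(Remove_N,List,_,Rest), + nth1(N,Rest,Item),!. +``` + +Lucian CI/CD reads the comment lines as queries with expected bindings, runs them against the predicate, and keeps only combinations of changes for which they pass.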
+ +# Authors + +Lucian Green - Initial programmer - Lucian Academy + +# License + +I licensed this project under the BSD3 License - see the LICENSE.md file for details + diff --git a/luciancicd/check_non_var.pl b/luciancicd/check_non_var.pl new file mode 100644 index 0000000000000000000000000000000000000000..85d461bb644c2743cb9efd358b2f979ce3008c93 --- /dev/null +++ b/luciancicd/check_non_var.pl @@ -0,0 +1,126 @@ +lp2p1_a(A,B) :- + sub_term_wa([v,_],A,A1), + findall([A2,A3],(member([A2,A4],A1), + ((not(A4=[v,'_']), + A4=[v,A5], + atom_concat('_',_,A5), + get_cnv(A6),A3=[v,A6] + )->true;A4=A3)),A7), + %trace, + foldr(put_sub_term_wa_ae,A7,A,A8), + lp2p1([[[n, a], ":-", [A8]]],C), + string_concat("a:-",B1,C), + string_concat(B2,".\n",B1), + foldr(string_concat,["(",B2,")"],B), + %square_to_round([B2],B), + !. + +% (c(1,A),A=5) +% -> (c(1,A),not(var(A)),A=5) +check_non_var(C1,C2) :- + + dynamic(cnvn/1), + retractall(cnvn(_)), + assertz(cnvn(1)), + + dynamic(cnv/1), + retractall(cnv(_)), + assertz(cnv([])), + + term_to_atom(C1,C110), + foldr(string_concat,["a:-",C110,"."],C111), + fastp2lp2(C111,C112), + C112=[[[n, a], ":-", C11]], + + % [[[n, c], [1, [v, '_449476'], [v, c]]], [[n, =], [[v, '_449476'], 1]], [[n, =], [[v, c], 2]]] + +%trace, + %square_to_round(C11,C1), + C11=[C12|C13], + + (C13=[]->lp2p1_a(C12,C2); + ( + + %functor(C12,Name,Arity), + %numbers(Arity,1,[],Ns), + + C12=[Name, Args], + %length(Args,Arity), + + %functor(C1A,Name,Arity), + %* needs to get from A not _ + findall(CNV41,(%member(N,Ns),arg(N,C12,Arg),%copy_term(Arg,Arg1), + member(Arg,Args), + (Arg=[v,_]-> + (get_cnv(CNV4), + %arg(N,C1A,CNV4), + cnv(CNV6), + retractall(cnv(_)), + %delete(CNV6,[Arg,CNV4],CNV61), + append(CNV6,[[Arg,CNV4]],CNV7), + assertz(cnv(CNV7)), + CNV41=[v,CNV4]); + CNV41=Arg)%arg(N,C1A,Arg)) + %cnv(CNV1), + ),CC1), + %foldr(append,CC1,CC2), + %append(CC3,[_],CC2), + %foldr(append,[[Name,"("],CC3,["),"]],CC31), + %foldr(string_concat,CC31,CC4), + CC4=[Name, CC1], + 
%dynamic(cnv/1), + %retractall(cnv(_)), + %assertz(cnv([])), + cnv(CNV9), + %trace, + findall([[n,not],[[[n,var],[[v,CNV8]]]]],(%member(N,Ns),arg(N,C12,Arg),%copy_term(Arg,Arg1), + %var(Arg),%cnv(CNV1), + member([Arg,CNV8],CNV9), %trace, + not((%not(var(Arg)), + % [[n, =], [[v, '_449476'], 1]] + (member([[n, =],[Arg,B2]],C13)->true; + member([[n, equals4],[Arg,B2]],C13)), + B2=[v,_] + %append(CNV1,[not(var(Arg))],CNV2)%retractall(cnv(_)), + %assertz(cnv(CNV2)) + ))),CC5), + %foldr(append,Ns1,CC5), + %foldr(string_concat,Ns11,CC5), + + %cnv(CNV3), + findall([[n, =],[[v,CNV10],Num1]],(%member(Arg,C13),%N,Ns),arg(N,C13,CNV11=Num),% + (member([[n, =],[CNV11,Num]],C13)->true; + member([[n, equals4],[CNV11,Num]],C13)), +%copy_term(Arg,Arg1), + %var(Arg),%cnv(CNV1), + member([CNV11,CNV10],CNV9), + + + ((Num=[v,A5], + atom_concat('_',_,A5))->Num1=[v,'_'];Num1=Num) + %append(CNV1,[not(var(Arg))],CNV2)%retractall(cnv(_)), + %assertz(cnv(CNV2)) + ),CC7), + %trace, + %foldr(append,Ns2,CC7), + %foldr(string_concat,Ns21,CC6), + %string_concat(CC7,",",CC6), + C21=[[CC4],CC5,CC7], + foldr(append,C21,C23), + + lp2p1([[[n, a], ":-", C23]],C), + string_concat("a:-",B1,C), + string_concat(C22,".\n",B1), + foldr(string_concat,["(",C22,")"],C2) + %square_to_round([C22],C2) + )), + %square_to_round(C111,C2), + !. + +get_cnv(CNV42) :- + cnvn(CNV4), + CNV41 is CNV4+96, + char_code(CNV42,CNV41), + retractall(cnvn(_)), + CNV5 is CNV4+1, + assertz(cnvn(CNV5)). 
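The header comment of `check_non_var.pl` gives the intended transformation: `(c(1,A),A=5)` becomes `(c(1,A),not(var(A)),A=5)`, so the equality is only tried once the variable is bound. A minimal sketch of the idea over plain Prolog terms (hypothetical; the actual `check_non_var/2` above works on List Prolog terms and renamed variables):

```prolog
% Insert a not(var(V)) guard before each V = Value goal in a
% conjunction, so the unification acts as a check on an existing
% binding rather than silently creating a new one.
guard_vars((G1,G2),(H1,H2)) :- !,
    guard_vars(G1,H1), guard_vars(G2,H2).
guard_vars(V=X,(not(var(V)),V=X)) :- !.
guard_vars(G,G).
```

For example, `guard_vars((c(1,A),A=5),T)` binds `T` to `(c(1,A),(not(var(A)),A=5))`.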
diff --git a/luciancicd/check_non_var0.pl b/luciancicd/check_non_var0.pl new file mode 100644 index 0000000000000000000000000000000000000000..6205d99610a346bf688da6b970fb37bbb9cfbe65 --- /dev/null +++ b/luciancicd/check_non_var0.pl @@ -0,0 +1,142 @@ +/* +lp2p1_a(A,B) :- + sub_term_wa([v,_],A,A1), + findall([A2,A3],(member([A2,A4],A1), + ((not(A4=[v,'_']), + A4=[v,A5], + atom_concat('_',_,A5), + get_cnv(A6),A3=[v,A6] + )->true;A4=A3)),A7), + %trace, + foldr(put_sub_term_wa_ae,A7,A,A8), + lp2p1([[[n, a], ":-", [A8]]],C), + string_concat("a:-",B1,C), + string_concat(B2,".\n",B1), + foldr(string_concat,["(",B2,")"],B), + %square_to_round([B2],B), + !. +*/ + +no_vars(A16) :- + + foldr(string_concat,["a:-",A16,"."],C111), + fastp2lp2(C111,C112), + C112=[[[n, a], ":-", [[[n, _],V]]]], + not(member([v,_],V)),!. + +% (c(1,A),A=5) +% -> (c(1,A),not(var(A)),A=5) +check_non_var0(C1,C2) :- + + dynamic(cnvn/1), + retractall(cnvn(_)), + assertz(cnvn(1)), + + dynamic(cnv/1), + retractall(cnv(_)), + assertz(cnv([])), + + + %term_to_atom(C1,C110), + foldr(string_concat,["a:-",C1,"."],C111), + fastp2lp2(C111,C112), + C112=[[[n, a], ":-", C11]], + + % [[[n, c], [1, [v, '_449476'], [v, c]]], [[n, =], [[v, '_449476'], 1]], [[n, =], [[v, c], 2]]] + +%trace, + %square_to_round(C11,C1), + C11=[C12|C13], + + (C13=[]->lp2p1_a(C12,C2); + ( + + %functor(C12,Name,Arity), + %numbers(Arity,1,[],Ns), + + (C12=[Name]-> + C23=C12; + ( + C12=[Name, Args], + %length(Args,Arity), + + %functor(C1A,Name,Arity), + %* needs to get from A not _ + findall(CNV41,(%member(N,Ns),arg(N,C12,Arg),%copy_term(Arg,Arg1), + member(Arg,Args), + (Arg=[v,_]-> + (get_cnv(CNV4), + %arg(N,C1A,CNV4), + cnv(CNV6), + retractall(cnv(_)), + %delete(CNV6,[Arg,CNV4],CNV61), + append(CNV6,[[Arg,CNV4]],CNV7), + assertz(cnv(CNV7)), + CNV41=[v,CNV4]); + CNV41=Arg)%arg(N,C1A,Arg)) + %cnv(CNV1), + ),CC1), + %foldr(append,CC1,CC2), + %append(CC3,[_],CC2), + %foldr(append,[[Name,"("],CC3,["),"]],CC31), + %foldr(string_concat,CC31,CC4), + 
CC4=[Name, CC1], + %dynamic(cnv/1), + %retractall(cnv(_)), + %assertz(cnv([])), + cnv(CNV9), + %trace, + + /*findall([[n,not],[[[n,var],[[v,CNV8]]]]],(%member(N,Ns),arg(N,C12,Arg),%copy_term(Arg,Arg1), + %var(Arg),%cnv(CNV1), + member([Arg,CNV8],CNV9), %trace, + not((%not(var(Arg)), + % [[n, =], [[v, '_449476'], 1]] + member([[n, =],[Arg,B2]],C13),B2=[v,_] + %append(CNV1,[not(var(Arg))],CNV2)%retractall(cnv(_)), + %assertz(cnv(CNV2)) + ))),CC5), + %foldr(append,Ns1,CC5), + %foldr(string_concat,Ns11,CC5), +*/ +CC5=[], + %cnv(CNV3), + findall([[n, =],[[v,CNV10],Num1]],(%member(Arg,C13),%N,Ns),arg(N,C13,CNV11=Num),% + (member([[n, =],[CNV11,Num]],C13)->true; + member([[n, equals4],[CNV11,Num]],C13)), +%copy_term(Arg,Arg1), + %var(Arg),%cnv(CNV1), + member([CNV11,CNV10],CNV9), + + + ((Num=[v,A5], + atom_concat('_',_,A5))->Num1=[v,'_'];Num1=Num) + %append(CNV1,[not(var(Arg))],CNV2)%retractall(cnv(_)), + %assertz(cnv(CNV2)) + ),CC7), + %trace, + %foldr(append,Ns2,CC7), + %foldr(string_concat,Ns21,CC6), + %string_concat(CC7,",",CC6), + C21=[[CC4],CC5,CC7], + foldr(append,C21,C23) + ), + + lp2p1([[[n, a], ":-", C23]],C), + string_concat("a:-",B1,C), + string_concat(C2,".\n",B1) + %foldr(string_concat,["(",C22,")"],C2) + %square_to_round([C22],C2) + ))), + %square_to_round(C111,C2), + !. + + /* +get_cnv(CNV42) :- + cnvn(CNV4), + CNV41 is CNV4+96, + char_code(CNV42,CNV41), + retractall(cnvn(_)), + CNV5 is CNV4+1, + assertz(cnvn(CNV5)). 
+*/ \ No newline at end of file diff --git a/luciancicd/ci.pl b/luciancicd/ci.pl new file mode 100644 index 0000000000000000000000000000000000000000..df54fea0b0c7b976c2eae39a80bf38382dfbb07c --- /dev/null +++ b/luciancicd/ci.pl @@ -0,0 +1,830 @@ +/* +continuous integration (with diff x) + +p2lp +comments even where spaces are x ignore comments xx check comments checked for in all spaces +- move to after command xxx +detect changed preds +[1,2,3],[1,2,4,3] - notice insertions, test +[1,2,3],[1,3] - notice deletions, test +[1,2,3],[1,4,3] - x, as above +- try combos of preds, starting with no changes +convert to pl, test +copy to build folder if passes + +*/ + +:-include('../Prolog-to-List-Prolog/p2lpconverter.pl'). +:-include('../List-Prolog-to-Prolog-Converter/lp2pconverter.pl'). +%:-include('luciancicd.pl'). +:-dynamic term_to_numbers1/1. +%:-dynamic term_to_numbers2/1. +:-dynamic changes/1. +:-dynamic correspondences/1. + +/*cicd(Path) :- + merge(Path), + build_and_test. +*/ + +get_file(Type,File1,S1) :- + (Type=file->(exists_file_s(File1)-> + (fastp2lp(File1,S1) + %p2lpconverter([Type,File1],S1) + ->true; + open_string_file_s(File1,S10), + lines_to_comments(S10,S1))); + fastp2lp(File1,S1) + %p2lpconverter([Type,File1],S1) + ),!. + +/* +set_up_merge(Type,File1) :- + get_file(Type,File1,S1), + %p2lpconverter([Type,File1],S1), + pp0(S1,S2),S2=S3,%term_to_atom(S2,S3), + open_s("test.lp",write,S21), + write(S21,S3),close(S21). 
+*/ + +% build saves over gh (which is moved to lc) with a command + +merge(K11,File1,Path1,Tests) :- +%writeln(merge(K11,File1,Path1,Tests)),`` +%trace,%trace, +%%%%%%%%%*****Change gh2 to gh + %foldr(string_concat,["../../Github2/",K11,"/",File1],Path11), + get_file(file,Path1,S1), + %p2lpconverter([Type,File1],S1), + %pp0(S1,S2),S2=S3,%term_to_atom(S2,S3) + %trace, + split_string(Path1,"/","/",P2), + append(_,[P3],P2), + + working_directory1(A2,A2), + + home_dir(A1), + + working_directory1(_,A1), + + + %working_directory1(_,"../../Github_lc/"), +foldr(string_concat,["../../Github_lc/tests_",K11,".txt"%"/",File1 + ],File2), + + foldr(string_concat,["[[n,comment],[[""File delimiter"",""../../Github_lc/",K11,""",""",P3,"""]]]"],String0), + catch(open_file_s(File2,[_,Old_S11]),_,Old_S11=["[",String0,"]"]), + + foldr(string_concat,Old_S11,Old_S112), + + term_to_atom(Old_S1121,Old_S112), + %trace, + split_into_lp_files(Old_S1121,Old_S113), + + foldr(string_concat,["../../Github_lc/",K11],PZ), + %trace, + foldr(string_concat,[P3],FZ), + + (once(member([[[n, comment], [["File delimiter", PZ, FZ]]]|Old_S1],Old_S113))->true;Old_S1=[]), + + %term_to_atom(Old_S114,Old_S1141), + %findall([AT1,",\n"],(member(AT1,Old_S11)),AT12),flatten(AT12,AT1x),%)),AT12), + %append(AT14,[_],AT1x), + %foldr(string_concat,Old_S1141,AT135), + %foldr(string_concat,["[",AT135,"]"],AT132), + %term_to_atom(AT134,AT135), + %foldr(append,AT131,AT133), + %trace, + %pp0(AT133,AT134), + %Old_S114=[_|Old_S1], + %split_string(AT134,"\n","\n",AT13) + %,trace + working_directory1(_,A2), + + %trace, + %open_file_s(File2,Old_S1), + (S1=Old_S1-> + (%trace, + writeln(["Files",K11,"/",File1,"and in",File2,"are identical"]), + ci_fail(Ci_fail), + append(Ci_fail,[1],Ci_fail1), + retractall(ci_fail(_)), + assertz(ci_fail(Ci_fail1)) + %Tests=[]%fail%abort + %Tests=[[K11,File1,Old_S1,S1]]%fail%abort + );( + ci_fail(Ci_fail), + append(Ci_fail,[0],Ci_fail1), + retractall(ci_fail(_)), + assertz(ci_fail(Ci_fail1)) + 
)),%trace, + foldr(string_concat,["../../Github_lc/",K11],K12), + %trace, + %term_to_atom(String01,String0), + %(append(String01,_,Old_S1)->Old_S1=Old_S10; + %append([String01],Old_S1,Old_S10)), + Old_S1=Old_S10, + %(append(String01,_,S1)->S1=S10; + %append([String01],S1,S10)), + S1=S10, + %trace, + Tests=[[K12,File1,Old_S10,S10]],!. + +merge2(Old_S1,S1,T3) :- +%trace, + %open_s("test.lp",write,S21), + %write(S21,S3),close(S21), + retractall(term_to_numbers1(_)), + assertz(term_to_numbers1(1)), + %trace, + term_to_numbers(term_to_numbers1,Old_S1,[],Corr,[],N1), + term_to_numbers(term_to_numbers1,S1,Corr,Corr2,[],N2), + retractall(correspondences(_)), + %trace, + assertz(correspondences(Corr2)), + %diff_group_combos(N1,N2,C), + diff_combos_vintage(N1,N2,C), + findall(T,(member(C1,C),numbers_to_term(C1,Corr2,[],T0)%,lp2p1(T0,T) + ,T=T0 + ),T1), + delete(T1,[],T31), + + %subtract(Combos411,[[]],Combos412), + /* + findall(Combos413,(member(Combos413,Combos412), + sort(Combos413,Combos414), + sort(Before,Before1), + not(Combos414=Before1)),Combos41), + */ + sort(T31,Combos41), + %subtract(Combos413,[Old_S1],Combos41), + sort_by_length(Combos41,T3), + + !. + +kept([[n,A1],A1a]) :- + keep(A),length(A1a,L),member([A1,L],A),!. +%kept([_,[[n,A1],A1a]]) :- +% keep(A),length(A1a,L),member([A1,L],A),!. 
+ +merge21(Old_S11,S11,T3) :- + %keep(Kept), +%writeln1(merge21(Old_S11,S11,T3)), +%trace, + retractall(term_to_numbers1(_)), + assertz(term_to_numbers1(1)), + + findall(A,(member([_,A1],Old_S11),term_to_atom(A1,A)),Old_S1), + findall(A,(member([_,A1],S11),term_to_atom(A1,A)),S1), + term_to_numbers(term_to_numbers1,Old_S1,[],Corr,[],N1), + %trace, + term_to_numbers(term_to_numbers1,S1,Corr,Corr2,[],N2), + + + length(Old_S11,Old_S11L), + numbers(Old_S11L,1,[],Old_S11N), + %trace, + findall([A,C],(member(Old_S11N1,Old_S11N),get_item_n(Old_S11,Old_S11N1,[A1,A1a]), + (kept([A1,A1a])%=[n, Command] + ->A=_;A=A1), + get_item_n(N1,Old_S11N1,C)),N11), + + length(S11,S11L), + numbers(S11L,1,[],S11N), + findall([A,C],(member(S11N1,S11N),get_item_n(S11,S11N1,[A1,A1a]), + (kept([A1,A1a])->A=_;A=A1),get_item_n(N2,S11N1,C)),N21), + append(N11,N21,N31), + + + retractall(correspondences(_)), + assertz(correspondences(Corr2)), + diff_group_combos1(N1,N2,C000), + %trace, + (C000=[C]->true;C000=C), + %trace, + findall(T2,(member(C1,C), + (string(C1)-> + (numbers_to_term(C1,Corr2,T), + member([N32,C1],N31), + not(T=[]),T2=[[N32,T]] + ); + (C1=[[c,_],O,N]-> + ( + findall([N32,T],(member(C2,O), + numbers_to_term(C2,Corr2,T), + member([N32,C2],N31), + not(T=[])),O111), + + findall([N32,T],(member(C2,N), + numbers_to_term(C2,Corr2,T), + member([N32,C2],N31), + not(T=[])),N111), + %trace, + %writeln1(merge_files3(O111,N111,T2)), + merge_files3(O111,N111,T2) + ) + + ))),T31), + foldr(append,T31,T32), + sort1(T32,T3), + !. + +numbers_to_term(C1,Corr2,T01) :- +numbers_to_term([C1],Corr2,[],T0),T0=[T02], + term_to_atom(T01,T02),!. + %T3=[T4|_], + %open_s("test.pl",write,S22), + %write(S22,T4),close(S22). 
+ +/* +merge2a(Old_S1,S1,T3) :- + %open_s("test.lp",write,S21), + %write(S21,S3),close(S21), + retractall(term_to_numbers1(_)), + assertz(term_to_numbers1(1)), + term_to_numbers(Old_S1,[],Corr,[],N1), + term_to_numbers(S1,Corr,Corr2,[],N2), + diff_group_combos(N1,N2,C), + findall(T,(member(C1,C),numbers_to_term(C1,Corr2,[],T)%, + %lp2p1(T0,T) + ),T1), + delete(T1,[],T3),!. +*/ + +get_token_number(_N1,S1,C1,_N,N2) :- +%writeln(get_token_number(N1,S1,N,N2)), + %trace, + %findall(*,(member(AAA,N1),)) + findall([SI,N3],member([S1,N3],C1),B), + (catch((append(_,[[_S2,N4]],B), + get_base_token_number(N4,N)),_,false)->true;N="0"),%)),S2), + +/* +findall(S1xx,(member(S1x,N1),(number(S1x)->number_string(S1x,S1xx);%S1x=S1xx), + get_base_token_number(S1x,S1xx))),S2), + (number(S1)->number_string(S1,S1xxx);%S1x=S1xx), + (trace,get_base_token_number(S1,S1xxx))), + findall(S2,member(S1xxx,S2),S3), + */ + length(B,L), + %N=N2,!. + foldr(string_concat,[N,".",L,"x"],N2),!. + +get_base_token_number(S1x,S1) :- + split_string(S1x,".x",".x",[S1|_]),!. + +term_to_numbers(_,[],C,C,N,N) :- !. +term_to_numbers(term_to_numbers1,S,C1,C2,N1,N2) :- + S=[S1|S2], + %trace, + (member([S1,N],C1)-> + (%C1=C3, + %N=N2A); + get_token_number(_N1,S1,C1,_N,N2A), + append(C1,[[S1,N2A]],C3)); + (term_to_numbers1(N2A1), + retractall(term_to_numbers1(_)), + N4 is N2A1+1, + assertz(term_to_numbers1(N4)), + number_string(N2A1,N2AA), + append(C1,[[S1,N2AA]],C3), + %N2AA=N2A + foldr(string_concat,[N2AA,".",0,"x"],N2A) + )), + %trace, + %(member(N2A,N1)->(%trace, + %get_token_number(N1,N2A,N2A,N2A2));N2A2=N2A), + append(N1,[N2A],N3), + term_to_numbers(term_to_numbers1,S2,C3,C2,N3,N2),!. + + + /* +term_to_numbers(term_to_numbers2,S,C1,C2,N1,N2) :- + S=[S1|S2], + (member([S1,N],C1)-> + C1=C3; + (term_to_numbers2(N), + retractall(term_to_numbers2(_)), + N4 is N+1, + assertz(term_to_numbers2(N4)), + append(C1,[[S1,N]],C3))), + append(N1,[N],N3), + term_to_numbers(S2,C3,C2,N3,N2),!. 
+*/ +numbers_to_term([],_,T,T) :- !. +numbers_to_term(SN,C1,T1,T2) :- +%trace, + SN=[SN1|SN2], + %SN1=SN3, + get_base_token_number(SN1,SN3), + member([S1,SN3],C1), + append(T1,[S1],T3), + numbers_to_term(SN2,C1,T3,T2),!. + + +/* + +X: + +diff_combos([1,2,3],[1,2,4,3],C). +C = [[1, 2, 4, 3], [1, 2, 3]]. + +diff_combos([1,2,3],[1,3],C). +C = [[1, 2, 3], [1, 3]]. + +diff_combos([1,2,3],[1,5,2,4,3],C). +C = [[1, 5, 2, 4, 3], [1, 5, 2, 3], [1, 2, 4, 3], [1, 2, 3]]. + +diff_combos([1,2,3,4,5],[1,3,5],C). +C = [[1, 2, 3, 4, 5], [1, 2, 3, 5], [1, 3, 4, 5], [1, 3, 5]]. + +diff_combos([1,3,4,5],[1,2,3,5],C). +C = [[1, 2, 3, 4, 5], [1, 2, 3, 5], [1, 3, 4, 5], [1, 3, 5]]. + +diff_combos([1,4,6,5],[1,5],C). +C = [[1, 4, 5], [1, 4, 6, 5], [1, 5], [1, 6, 5]]. + +diff_combos([4,6,5],[5],C). +C = [[4, 5], [4, 6, 5], [5], [6, 5]]. + +diff_combos([5],[4,5],C). +C = [[4, 5], [5]]. + +--- + +do all in one go without splitting on \n xx +don't worry about 121 131 repeating units, just group + +[1, 2, [d, 5], 1, [i, 4]] + +- join same types between non i,d +- change i,d in same space to c (change) + +*/ + +%/* + +% use in find_insertions_and_deletions, +% but need to differentiate same values [1,1] as 1_1 and 1_2 + +/* +subtract1(A,B,C) :- + subtract1(A,B,[],C),!. + +subtract1([],_,B,B) :- !. +subtract1(A,B,C,G) :- + A=[D|E], + (member(D,B)-> + (delete(B,D,F), + append(C,[D],H)); + (F=B,C=H)), + subtract1(E,F,H,G),!. + +subtract2(A,B,C) :- + subtract2(A,B,[],C),!. + +subtract2([],_,B,B) :- !. +subtract2(A,B,C,G) :- + A=[[D,D1]|E], + (member([D,_],B)-> + (delete(B,[D,_],F), + append(C,[[D,D1]],H)); + (F=B,C=H)), + subtract2(E,F,H,G),!. +*/ + +subtract2(A,B,C) :- + subtract2(A,B,[],C),!. + +subtract2([],_,B,B) :- !. +subtract2(A,B,C,G) :- + A=[[D,D1]|E], + (member([_,D1],B)-> + (delete(B,[_,D1],F), + append(C,[[D,D1]],H)); + (F=B,C=H)), + subtract2(E,F,H,G),!. + +/* + +differentiate(A,B) :- + differentiate(A,[],_,[],B),!. +differentiate([],Corrs,Corrs,Diffs,Diffs) :- !. 
+differentiate(List,Corrs1,Corrs2,Diffs1,Diffs2) :- + List=[A|B], + findall([N,A],member([N,A],Corrs1),C), + (C=[]->(N1=1,append(Corrs1,[[N1,A]],Corrs3)); + (sort(C,C1),append(_,[[N2,_]],C1), + N1 is N2+1,append(Corrs1,[[N1,A]],Corrs3))), + append(Diffs1,[[A,N1]],Diffs3), + differentiate(B,Corrs3,Corrs2,Diffs3,Diffs2). + +%subtract2(A,B,C) :- +break_into_tokens(A,B) :- + string_codes(A,A1), + split_on_substring117(A1, `#@~%$?-+*^,()|.:;=_/[]<>{}\n\r\s\t\\"!\``,[],B),!. +*/ + +fail_if_greater_than_n_changes(After3) :- +%trace, +(fail_if_greater_than_n_changes2(N1)->N=N1; +fail_if_greater_than_n_changes1(N)), +%trace, + findall(A,(member(A,After3),not(string(A))),B), + length(B,L),(L=<N->true;(writeln2(["Increase Max to",L]),fail)). + + + +diff_group_combos(A,A,[A]) :- !. +diff_group_combos(Before,After,Combos4) :- + %length(Before,BL),length(After,AL), + %((A>10->true;B>10)->fail;true), +/* + sort(After0,After01),sub* + retractall(term_to_numbers2(_)), + assertz(term_to_numbers2(-1000000)), + term_to_numbers(term_to_numbers2,Old_S1,[],Corr,[],N1), + term_to_numbers(term_to_numbers1,S1,Corr,Corr2,[],N2), + diff_group_combos(N1,N2,C), + findall(T,(member(C1,C),numbers_to_term(C1,Corr2,[],T0), +*/ + + retractall(changes(_)), + assertz(changes(1)), + %differentiate(Before,Before0), + %differentiate(After,After0), + %find_insertions_and_deletions(Before,After,Insertions,Deletions), + %trace, + diff(Before,After,_,_,[],[],After31), + + + %replace11(After,Insertions,[],After2), + %replace12(Before,After2,Deletions,[],After31), + join_and_change(After31,[],After3), + %trace, + %save_diff_html(After3), + fail_if_greater_than_n_changes(After3), + %length(After3,L) + findall([[i,_],I],(member([[i,_NA],I],After3)%,not(number(NA)) + ),Insertions1), + findall([[d,_],D],(member([[d,_NB],D],After3)%,not(number(NB)) + ),Deletions1), + %findall([c,C],(member([[c,_],C],After3)),Changes1), + findall(Combos,find_combos1(Insertions1,Deletions1,%Changes1, + Combos),Combos2), + 
findall(Combos10,(member(Combos3,Combos2), + find_combos3(After3,Combos3,[],Combos1), + flatten(Combos1,Combos10)),Combos4), + %subtract(Combos411,[[]],Combos412), + /* + findall(Combos413,(member(Combos413,Combos412), + sort(Combos413,Combos414), + sort(Before,Before1), + not(Combos414=Before1)),Combos41), + */ + %sort(Combos412,Combos413), + %subtract(Combos413,[Before],Combos41), + %sort_by_length(Combos41,Combos4), + (Combos4=[]->fail;true), + %length(Combos4,L), + %trace, + %(L>100->fail;true), + !. +%diff_group_combos(_Before,_After,[]). +diff_group_combos(_Before,After,[After]). + + + +diff_group_combos1(A,A,[A]) :- !. +diff_group_combos1(Before,After,Combos4) :- + + retractall(changes(_)), + assertz(changes(1)), + %differentiate(Before,Before0), + %differentiate(After,After0), + %find_insertions_and_deletions(Before,After,Insertions,Deletions), + + diff(Before,After,_,_,[],[],After31), + + + %replace11(After,Insertions,[],After2), + %replace12(Before,After2,Deletions,[],After31), + %trace, + join_and_change(After31,[],After3), + fail_if_greater_than_n_changes(After3), + %trace, + + findall(A1,(member(A,After3), + (string(A)->A1=A; + (A=[[c,_],O,N]-> + A1=A;%[O,N]; + A=[[_, _], E]-> + A1=E))),Combos4),%A2), + %flatten(A2,Combos4), + + %findall(A,member([[_,_NA],A],After3),Combos4), + !. +%diff_group_combos(_Before,_After,[]). +diff_group_combos1(_Before,After,[After]). + +sort_by_length(A,F) :- + findall([L,B],(member(B,A),length(B,L)),C),sort(C,D),findall(E,member([_,E],D),F). + +%join_and_change(After31,After3) :- + +i_or_d([i,_]). +i_or_d([d,_]). + +%group(-,After4,After4) :- !. +group(L,After4,After41) :- + findall(A,(member([L,A],After4)),After42), + (not(After42=[])-> + After41=[[[L,-] + ,After42]]; + After41=[]),!. 
+ +find_change(%After40, +After41,After42,After43) :- + ((not(After41=[]),not(After42=[]))-> + (After41=[[_,A1]],After42=[[_,A2]], + changes(N), + N1 is N+1, + retractall(changes(_)), + assertz(changes(N1)), + %append(A1,A2,A3),After43=[[[c,N],A3]] + After43=[[[c,N],A1,A2]%,[[d,N],A2] + ]); + append(After41,After42,After43)),!. + +join_and_change([],%_Insertions, +After,After) :- !. +join_and_change(After,%Insertions, +After2,After3) :- + After=[After4|After5], + not(i_or_d(After4)), + append(After2,[After4],After6), + join_and_change(After5,%Insertions, +After6,After3),!. + +/* + append(After4,After5,After), + not(After4=[]), + %not(i_or_d(After4)), + %forall(member(After45,After4),not(i_or_d(After45)))-> + + append(After53,_After54,After5), + not(After53=[]), + (i_or_d(After53)), + append(After4,After53,After500), + (After500=[]->After2=After3; + append(After2,After4,After6), + join_and_change(After5,After6,After3)),!. +*/ +join_and_change(After,%Insertions, +After2,After3) :- + %After=[After4|After5], + + append(After4,After5,After), + not(After4=[]), + %forall(member(After45,After4),not(i_or_d(After45)))-> + + append(After53,_After54,After5), + (After5=[]->true;(not(After53=[]), + %append(After51,_After52,After54), + %not(After51=[]), + [After531]=After53, + not(i_or_d(After531)))), + + %group(-,After53,After40), + group(i,After4,After41), + group(d,After4,After42), + find_change(%After40, + After41,After42,After43), + + %After53=[], + %After43=After4)), + + foldr(append,[After2,%After4, + %After50, + After43],After6), + %After50=After5, + %;(append(After2,After4,After6), + %*x append(After53,After54,After50))), + %After50=After5)), + join_and_change(After5,After6,After3),!. + +join_and_change(A,%_Insertions, +After1,After2) :- append(After1,A,After2),!. + +/* +diff_combos(A,A,[]) :- !. 
+diff_combos(Before,After,Combos4) :- + find_insertions_and_deletions(Before,After,Insertions,Deletions), + replace11(After,Insertions,[],After2), + replace12(Before,After2,Deletions,[],After3), + findall(Combos,find_combos1(Insertions,Deletions,Combos),Combos2), + findall(Combos1,(member(Combos3,Combos2), + find_combos3(After3,Combos3,[],Combos1)),Combos41), + sort(Combos41,Combos4),!. +*/ + +/* +replace11([],_Insertions,After,After) :- !. +replace11(After,Insertions,After2,After3) :- + After=[After4|After5], + (member(After4,Insertions)-> + After7=[i,After4]; + After7=After4), + append(After2,[After7],After6), + replace11(After5,Insertions,After6,After3),!. + +replace12(_,After,[],_After1,After) :- + %append(After1,[A],After2), + !. + +replace12(Before,After,Deletions,After2,After3) :- + %Before=[B|Bs], + After=[[i,A]|As], + append(After2,[[i,A]],After4), + replace12(Before,As,Deletions,After4,After3),!. + +replace12([],[],_Deletions,After,After) :- + %append(After1,[A],After2), + !. +%replace12([A],[A],_Deletions,After1,After2) :- +% append(After1,[A],After2),!. +replace12(Before,After,Deletions,After2,After3) :- + append(After4,After5,After), + not(After4=[]), + append(After51,After52,After5), + + append(After4,Before5,Before), + append(Before53,Before54,Before5), + append(After51,Before52,Before54), + %After=[After4|After5], + %not(Before53=[]), + (length(Before53,1)->Before53=[Before55]; + Before53=Before55), + (true%member(Before53,Deletions) + -> + (Before53=[]->After7=[]; + + (is_list(Before55)-> + findall([d,B],member(B,Before55),After7); + After7=[[d,Before55]]) + ); + + After7=Before55), + (After7=[]-> + foldr(append,[After2,After4],After6); + foldr(append,[After2,After4,After7],After6)), + replace12(Before52,After52,Deletions,After6,After3). 
+replace12(Before,After,Deletions,After2,After3) :- + %trace,%append(After4,After5,After), + %(After4=[]), + append(After51,After52,After), + + %append(After4,Before5,Before), + append(Before53,Before54,Before), + append(After51,Before52,Before54), + %After=[After4|After5], + %not(Before53=[]), + %not(length(Before53,1)),%-> + not(Before53=[]),%-> + Before53=[Before55],%; + %Before53=Before55), + (true%member(Before53,Deletions) + -> + (Before53=[]->After7=[]; + + (is_list(Before55)-> + findall([d,B],member(B,Before55),After7); + After7=[[d,Before55]]) + ); + + After7=Before55), + (After7=[]-> + foldr(append,[After2%,After4 + ],After6); + foldr(append,[After2,%After4, + After7],After6)), + replace12(Before52,After52,Deletions,After6,After3). +*/ + +% find_insertions_and_deletions([1,2,3],[1,2,4,3],In,D). +% In = [4], +% D = []. + +% find_insertions_and_deletions([1,2,3],[1,3],In,D). +% In = [], +% D = [2]. + +/* +find_insertions_and_deletions(Before,After,Insertions,Deletions) :- + subtract(After,Before,Insertions), + subtract(Before,After,Deletions). +*/ + +find_combos1(Insertions,Deletions,%Changes, +Combos) :- + %findall([i,In],member(In,Insertions),In1), + %findall([d,De],member(De,Deletions),De1), + %findall([c,Ch],member(Ch,Changes),Ch1), + foldr(append,[Insertions,Deletions%,Changes + ],Ops), + find_combos2(Ops,[],Combos). + +find_combos2([],Combos,Combos). +find_combos2(Ops,Combos1,Combos2) :- + Ops=[Op|Ops1], + %trace, + %member(Switch,[on,off]), + (Op=[[i,_],_]->(S=[on,off + ]%,S2=n + );(Op=[[d,_],_],S=[on,off + ]%,S2=_) + )), + member(Switch,S), + append(Combos1,[[Op,Switch]],Combos3), + find_combos2(Ops1,Combos3,Combos2). + +%findall([Op,Switch],(member(Op,Ops),member(Switch,[on,off])),Switches). +% ((member(In,Insertions)->true;In=[]), +% (member(De,Deletions)->true;De=[]), + +find_combos3([],_Combos,Combos,Combos) :- !. 
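+
+% A usage sketch for find_combos2 above (ops in the shape built by
+% find_combos1, i.e. [[i,_],_] or [[d,_],_]): each op is paired with an
+% on/off switch, so N ops yield 2^N combos on backtracking, e.g.
+% ?- findall(C,find_combos2([[[i,1],a],[[d,2],b]],[],C),Cs).
+% Cs = [[[[[i,1],a],on],[[[d,2],b],on]],
+% [[[[i,1],a],on],[[[d,2],b],off]],
+% [[[[i,1],a],off],[[[d,2],b],on]],
+% [[[[i,1],a],off],[[[d,2],b],off]]].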
+ +find_combos3(After,Combos,Combos1,Combos2) :- + After=[Item1|After2], + %trace, + (Item1=[_Type,N1,N2],member(Item,[N1,N2])), + append(Combos1,Item,Combos3), + find_combos3(After2,Combos,Combos3,Combos2). + + +find_combos3(After,Combos,Combos1,Combos2) :- + After=[Item1|After2], + not(Item1=[_,_,_]), + %trace, + %((Item1=[Type,N1,N2],member(Item,[N1,N2]))->true; + ((Item1=[Type,N], + member([[Type,N],Switch],Combos), + ((%Type=i, + Switch=on)->Item=[N];Item=[]))->true;(%number + string(Item1)->Item=[Item1];Item=[]) + ), + append(Combos1,Item,Combos3), + find_combos3(After2,Combos,Combos3,Combos2). +/* +find_combos2([],Combos,Combos). +find_combos2(Ops,Combos1,Combos2) :- +%trace, + Ops=[[[Op,N1],A]|Ops1], + % for n, i or d + %member([_,A],Ops), + %changes(N), + %member(Op,N), + (number(N1)-> + (S3=[i,d], + member(Switch,S3), + S2=_); + + ((Op=i->(S=[on,off + ],S2=n);(S=[on%,off + ],S2=_)), + member(Switch,S%[on,off + %] + ))), + append(Combos1,[[[[Op,S2],A],Switch]],Combos3), + find_combos2(Ops1,Combos3,Combos2). + +%findall([Op,Switch],(member(Op,Ops),member(Switch,[on,off])),Switches). +% ((member(In,Insertions)->true;In=[]), +% (member(De,Deletions)->true;De=[]), + +find_combos3([],_Combos,Combos,Combos) :- !. +find_combos3(After,Combos,Combos1,Combos2) :- +%trace, + After=[Item1|After2], + (Item1=[Type1,N1], %T=N, i or d + member([[Type,N],Switch],Combos),%trace, + Type=[T,_],Type1=[T,_],(Type1=[_,ID],(%)-> + number(N)->(ID=i->Switch1=on;(ID=d->Switch1=off));Switch=Switch1), + ((%Type=i, + Switch1=on)->Item=[N];Item=[]))->true;Item=[Item1]), + append(Combos1,Item,Combos3), + find_combos3(After2,Combos,Combos3,Combos2),!. + +* +find_combos3([],_Combos,Combos,Combos) :- !. 
+find_combos3(After,Combos,Combos1,Combos2) :- +%trace, + After=[Item1|After2], + ((Item1=[Type1,N], + member([[Type,N],Switch],Combos),%trace, + Type=[T,_],Type1=[T,_],(Type1=[_,n]->number(N1);true), + ((%Type=i, + Switch=on)->Item=[N];Item=[]))->true;Item=[Item1]), + append(Combos1,Item,Combos3), + find_combos3(After2,Combos,Combos3,Combos2),!. +*/ \ No newline at end of file diff --git a/luciancicd/ci3.pl b/luciancicd/ci3.pl new file mode 100644 index 0000000000000000000000000000000000000000..c970d43c1b068dcc953ea257164b0d076afff205 --- /dev/null +++ b/luciancicd/ci3.pl @@ -0,0 +1,396 @@ +/* + +ci3.pl + +- use diff group combos to find changed preds - setting to make interpred comments perm v included with space changes +- do by file + - (use a version of dgc that returns i as well as d for changes to get out code to check) +- within changed groups, separates into different pred name, arity +- remove comments within preds (x make perm), change interpred comments to permanent (to delete manually later if nec) x delete interline comments xxx they will be included with space changes +- use diff group combos (dgc) with tokens not strings to compare old and new preds (use a version of dgc that returns i as well as d for changes x) +- bu +- run tests to find smallest set of changes needed to each pred + +x delete interline comments +or convert them to lp to see if they are comm, then keep them x they are with the changes + +*/ + +:- use_module(library(date)). + +%:-include('luciancicd.pl'). +:-include('../Prolog-to-List-Prolog/pretty_print_lp2p.pl'). +%:-include('../Alg_to_Types/find_dependencies.pl'). +%:-dynamic merges_files/1. +%:-dynamic merges_preds/1. 
+ +ci :- + +working_directory1(A,A), + + retractall(home_dir(_)),assertz(home_dir(A)), + retractall(ci_fail(_)),assertz(ci_fail([])), + +(exists_directory('../../Github_lc')->true;make_directory('../../Github_lc')), + +repositories_paths(K), + +omit_paths(Omit), + +%findall(Omit1,(member(Omit2,Omit),atom_string(Omit1,Omit2)),Omit3), +findall([K1,G4],(member(K1,K), directory_files(K1,F), + delete_invisibles_etc(F,G), + +%retractall(merges_files(_)), +%assertz(merges_files([])), +%retractall(merges_preds(_)), +%assertz(merges_preds([])), +% merge_files contains the file data from the repository. +% Its predicates need to be updated and saved. + +% merges_preds contains the name and arity of updated preds x +%findall(H,(member(H,G),not(string_concat("dot",_,H)), + +subtract(G,Omit,G1), + +findall(G3,(member(G2,G1),string_concat(G2,"/",G3)),G4) +%not(member(G,Omit)) + +),K01), +%trace, +%foldr(append,K0,K01), + +working_directory1(Old_D,Old_D), + +findall(Tests1,(member([D,K31],K01), + +%trace, + +working_directory1(_,Old_D), + +working_directory1(_,D), + +%member(K2,K31), + +%exists_directory(K2), +%trace, +process_directory_merge(K31,%_G, + %Omit,% + true, + Tests11), + + % get reverse order of preds + + % go through them in order, ", for each combo make changes to original, test until find the simplest working version + %trace, + %find_pred_order(Tests11,Ordered_pred_nums), + + prepare_repositories(Tests11,Tests1) %xx delimit files, combine in luciancicd, find combos in multireps x reps, take apart on file delimiter comments, find shortest combos first, test + + %process_merge_preds(Tests11,Ordered_pred_nums,Tests1) + + + %process_merge_preds() + ),Tests2), + foldr(append,Tests2,Tests), + %trace, + retractall(lc_tests(_)), + assertz(lc_tests(Tests)) + + ,working_directory1(_,A) + +,%writeln("All tests were successful."), + +!. 
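+
+% Usage sketch: ci/0 is the entry point. It assumes repositories_paths/1
+% and omit_paths/1 are defined elsewhere in this repository (they are read
+% above), walks each configured repository, prepares the merge tests and
+% asserts the collected tests in lc_tests/1:
+% ?- ci.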
+ +%/* + + +prepare_repositories(Tests,T3%,Ordered_pred_nums +) :- %* in a separate file.pl, process dirs rec'ly +%trace, + + +findall([Tests12,Tokens2,Tokens1], +%findall(Tests143,( +(member([Tests12,C],Tests), + %Tests=[[Tests12,C]], + term_to_atom(Tests14,C),%),Tests142), + %foldr(append,Tests142,Tests14), + + %findall([[[n,comment],["File delimiter",P,F]],O],=( + %trace, + findall(O1, + ( + member([P,F,O,N],Tests14), + %member(A,[1,2]), + %(%A= 1-> + %( + (O=[]->O1=O; + append([[[n,comment],[["File delimiter",P,F]]]],O,O1)) + %[P,F,O,_N],Tests14),N2), + %foldr(append,O1,Functions2), + %term_to_atom(Functions2,String2), + %break_into_tokens(String2,Tokens2), + %delete(O1,[],Functions21), + ),Tokens2a), + foldr(append,Tokens2a,O2), + pp0_1(O2,String2), + %term_to_atom(String2,String21), + split_string(String2,"\n\r","\n\r",Tokens2), + %),%; + %(%A=2, + %findall([[[n,comment],["File delimiter",P,F]],N],=( + %[P,F,_O,N]=Tests14, + %member([P,F,O,N],Tests14), + findall(N1, + ( + member([P,F,O,N],Tests14), + (N=[]->N1=N; + append([[[n,comment],[["File delimiter",P,F]]]],N,N1)) + %[P,F,_O,N],Tests14),N1), + %foldr(append,N1,Functions1), + %term_to_atom(Functions1,String1), + %break_into_tokens(String1,Tokens1) + %delete(N1,[],Functions11), + ),Tokens1a), + foldr(append,Tokens1a,N2), + pp0_1(N2,String1), + %term_to_atom(String1,String11), + split_string(String1,"\n\r","\n\r",Tokens1) + ), +T3),!. + %merge2(Tokens2,Tokens1,T4), + %T3=[["../../Github_lc/tests.txt"%_Tests12 + %,Tokens2,Tokens1%T4 + %]]. 
+ +%trace, +/*for a repo: +- find reverse ordered preds, find changed, not inserted or deleted preds - (no combos x) - save changed preds as go, find min alg pred set (don't find preds' combos until needed by the minimum macrocombos x) +for each changed pred: +find i,d,c parts' combos +run pred tests on repo, and deps, and determine minimum version of pred, save in pre-build folder +go to build +*/ +%A=[1], +%findall(B,(member(Functions1,N2),%Debug=off,member(B,A),test(B,Q,F,R),query_box(Q,Query1,F,Functions1), +/*convert_to_grammar_part1(Functions1,[],Functions2,_),add_line_numbers_to_algorithm1(Functions2,Functions2a),find_pred_numbers(Functions2a,[],Pred_numbers),find_state_machine1(Functions2a,Functions3,Pred_numbers), + + +find_pred_numbers_dependencies(Functions3,[],Functions2a,Pred_numbers), + +order_preds_bottom_up(1,Functions2a,[],Ordered_pred_nums) +. +*/ + +process_directory_merge(K,%G, + Top_level,%Tests1, + Tests61) :- +working_directory1(A0,A0), + +%G=K, +%/* +findall(K4,(member(K1,K), +working_directory1(_,A0), +%exists_directory(K1), +directory_files(K1,F), + delete_invisibles_etc(F,G), +%*/ +findall(Tests3,(member(H,G),not(string_concat("dot",_,H)), + +%not(member(H,Omit)), + + +foldr(string_concat,[K1,H],H1), + +% if a file then find modification date +% if a folder then continue finding files in folder +(exists_directory(H1)-> + +(string_concat(H1,"/",H2), +process_directory_merge([H2],%[H], + false,%[],%Omit % only omit top level dirs xx + %Tests1, + Tests3) + %foldr(append,Tests31,Tests3) + ); + +(true%string_concat(_,".pl",H) +-> + +(%trace, +%trace, +find_merge(K1,H,H1,Tests3) +%p2lpconverter([file,H1],LP), + +%time_file(H1,Tests4), +%trace, +%append(Tests1,[[H1,Tests4]],Tests3))) +%Tests3=[[H1,Tests]] +) +; +(Tests3=[] +) +)) + +),Tests5),%trace, +foldr(append,Tests5,Tests51), + +%Tests5=Tests51, + +(%true% +Top_level=true%not(Omit=[]) % at top level +-> +( % folder/file, pl +findall([T1,',\n'],(member([T,TT,TTT,TTTT 
+],Tests51),term_to_atom(TTT,TTT1),term_to_atom(TTTT,TTTT1), +foldr(atom_concat,["[","\"",T,"\"",",","\"",TT,"\"",",",TTT1,",",TTTT1, +"]"],T1)%term_to_atom(T,T1) +),T2), +%trace, +flatten(T2,TT2), +foldr(atom_concat,TT2,T21), +(T2=[]->T6=[];(find_sl_2(T21,T6)%string_concat(T4,T5,T21),%trace, +%string_length(T5,2), +%foldr(string_concat,["[","\n",T4,"\n","]"],T6) +)), +%term_to_atom(Tests51,Tests52), +%trace, +string_concat(K3,"/",K1), +foldr(string_concat,["../../Github_lc/tests_",K3,".txt"],K2), +%trace, +K4=[K2,T6] +%open_s(K2,write,S), +%write(S,Tests52),close(S) + +%writeln(["*",K2, +%Tests52] +% +); +(%trace, +K4=Tests51) +) + + + +),Tests6), +%trace, +(%not(Omit=[])-> +Top_level=true-> +Tests6=Tests61; +(%trace, +foldr(append,Tests6,Tests61))), + +!. + + /* +test_b(Tests) :- + +working_directory1(A,A), + +working_directory1(_,'../../GitHub2/'), + +[K1,H,_H1]=["Philosophy/", "4 2 23.pl", "Philosophy/4 2 23.pl"], + string_concat(K11,"/",K1), + +LP=[[[n, c], ["%r(NA)."]], [[n, c], ["%NA=2."]]%, [[n, c], ["% r([a],N2)."]], [[n, c], ["% N2 = 3."]] +,[[n,r]] +], + +(find_merge2(H,K11,LP,Tests)->true;working_directory1(_,A)). +*/ + +find_merge(K1,H,H1,Tests) :- + + string_concat(K11,"/",K1), + + merge(K11,H,H1,Tests). + +%catch(call_with_time_limit(0.005, + %p2lpconverter([file,H1],LP),%),_,false), + %,writeln1(Result2) + + %find_merge2(H,K11,LP,Tests). + +/* +find_merge2(H,K11,LP,Tests) :- + + %trace, + findall(N1,(member([[n,N]|_],LP), + string_strings(N,N1)),Ns), + + findall([K11,H,F2],(member([[n,comment%c + ],[Comment]],LP), + string_strings(Comment,C), + member(N2,Ns), + append(_A,B,C), + append(N2,Dx,B), + %trace, + append(Ex,Dx1,Dx), + %append(_Ex1,Dx2,Dx1), + append(["."],_N21,Dx1), + %trace, + flatten([N2,Ex%,"." + ],N2Ex), + foldr(string_concat,N2Ex,F), + + % the answer is A= ... "." 
or " " + % in this, just 1 answer + %trace, + reverse(Ex,D),append(E2,_E3,D),reverse(E2,E31),(append([","],E5,E31)->true;append(["("],E5,E31)),append(E6,E7,E5),append([")"],_,E7), + %trace, + %member(N21,Ns), + + member([[n,comment%c + ],[Comment1]],LP), + string_strings(Comment1,C1), + + append(_A1,Bx,C1), + append(E6,Dxx,Bx), + append(E61,Dxx1,Dxx), + %trace, + (append(["."],_Exx,Dxx1)%->true; + %append([],Exx,Dxx1) + ), + %trace, + %writeln([_A1,Bx,E6,Dxx,E61,Dxx1]), + %flatten([]) + foldr(string_concat,E61,E612), + sub_string(E612,_,_,_,"="), + %trace, + flatten([E6,E612%,Exx%,"." + ],E6Exx), + foldr(string_concat,E6Exx,F1), + %trace, + %term_to_atom((F,F1),F00), + %term_to_atom(F0,F00), + %term_to_atom(F10,F1), + foldr(string_concat,["(",F,",",F1,")"],F2) + %atom_string(F0,F), + %atom_string(F10,F1), + %F2=%F0% + %(F0,F10) + ),Tests),!. + + + % is K11 just d or gh/d x d + +%[K11,H,] +%[["d","a.pl",(a(B),B=1)]] +*/ + +/* +process_merge_preds(Tests11,Ordered_pred_nums,Tests1) :- + Tests11=[B,Tests12],term_to_atom(C,Tests12), + findall(A,(member([P,F,O,N],C) + + ),_). +*/ + +find_sl_2(T21,T6) :- + find_sl_21(T21,T6),!. +find_sl_21(T21,T6) :- + string_concat(T4,T5,T21),%trace, + string_length(T5,2), + foldr(string_concat,["[","\n",T4,"\n","]"],T6). diff --git a/luciancicd/ci_vintage.pl b/luciancicd/ci_vintage.pl new file mode 100644 index 0000000000000000000000000000000000000000..66d1d9feaa9326591efe28db284cbe2635cddf48 --- /dev/null +++ b/luciancicd/ci_vintage.pl @@ -0,0 +1,252 @@ +/* +continuous integration (with diff x) + +p2lp +comments even where spaces are x ignore comments xx check comments checked for in all spaces +- move to after command xxx +detect changed preds +[1,2,3],[1,2,4,3] - notice insertions, test +[1,2,3],[1,3] - notice deletions, test +[1,2,3],[1,4,3] - x, as above +- try combos of preds, starting with no changes +convert to pl, test +copy to build folder if passes + + +diff_combos_vintage([1,2,3],[1,2,4,3],C). +C = [[1, 2, 4, 3], [1, 2, 3]]. 
+ +diff_combos_vintage([1,2,3],[1,3],C). +C = [[1, 2, 3], [1, 3]]. + +diff_combos_vintage([1,2,3],[1,5,2,4,3],C). +C = [[1, 5, 2, 4, 3], [1, 5, 2, 3], [1, 2, 4, 3], [1, 2, 3]]. + +diff_combos_vintage([1,2,3,4,5],[1,3,5],C). +C = [[1, 2, 3, 4, 5], [1, 2, 3, 5], [1, 3, 4, 5], [1, 3, 5]]. + +diff_combos_vintage([1,3,4,5],[1,2,3,5],C). +C = [[1, 2, 3, 4, 5], [1, 2, 3, 5], [1, 3, 4, 5], [1, 3, 5]]. + +diff_combos_vintage([1,4,6,5],[1,5],C). +C = [[1, 4, 5], [1, 4, 6, 5], [1, 5], [1, 6, 5]]. + +diff_combos_vintage([4,6,5],[5],C). +C = [[4, 5], [4, 6, 5], [5], [6, 5]]. + +diff_combos_vintage([5],[4,5],C). +C = [[4, 5], [5]]. + +*/ +diff_combos_vintage(A,A,[A]) :- !. + +diff_combos_vintage(Before,After,Combos4) :- + %trace, + find_insertions_and_deletions_vintage(Before,After,Insertions1,Deletions1,Permanent_insertions), + %trace, +diff(Before,After,Insertions1,Deletions1,Insertions,Deletions,Permanent_insertions,[],After3), + + %replace11_vintage(After,Insertions,Permanent_insertions,[],After2), + %trace, + %append(Before,["*"],Before1), + %append(After2,["*"],After21), + %replace12_vintage(Before,After2,Deletions,[],After3), + %delete(After31,"*",After3), + %save_diff_html(After3), + fail_if_greater_than_n_changes(After3), + %trace, + findall(Combos,find_combos1_vintage(Insertions,Deletions,Permanent_insertions,Combos),Combos2), + findall(Combos1,(member(Combos3,Combos2), + find_combos3_vintage(After3,Combos3,[],Combos1)),Combos41), + sort(Combos41,Combos4),!. + +diff_combos_vintage(_Before,After,[After]) :- !. + +/* +replace11_vintage([],_Insertions,_Permanent_insertions,After,After) :- !. +replace11_vintage(After,Insertions,Permanent_insertions,After2,After3) :- + After=[After4|After5], + (member(After4,Insertions)-> + After7=[i,After4]; + (member(After4,Permanent_insertions)-> + After7=[p,After4]; + After7=After4)), + append(After2,[After7],After6), + replace11_vintage(After5,Insertions,Permanent_insertions,After6,After3),!. 
+ +replace12_vintage(_,After,[],_After1,After) :- + %append(After1,[A],After2), + !. + +replace12_vintage(Before,After,Deletions,After2,After3) :- + %Before=[B|Bs], + %trace, + + After=[[I,A]|As], + %writeln([*,A]), + %(A="9.0x"->trace;true), + (I=i->true;I=p), + append(After2,[[I,A]],After4), + replace12_vintage(Before,As,Deletions,After4,After3),!. + +replace12_vintage([],[],_Deletions,After,After) :- + %append(After1,[A],After2), + !. +%replace12([A],[A],_Deletions,After1,After2) :- +% append(After1,[A],After2),!. +replace12_vintage(Before,After,Deletions,After2,After3) :- + append(After4,After5,After), + not(After4=[]), + append(After51,After52,After5), + + append(After4,Before5,Before), + append(Before53,Before54,Before5), + append(After51,Before52,Before54), + %After=[After4|After5], + %not(Before53=[]), + (length(Before53,1)->Before53=[Before55]; + Before53=Before55), + (true%member(Before53,Deletions) + -> + (Before53=[]->After7=[]; + + (is_list(Before55)-> + findall([d,B],member(B,Before55),After7); + After7=[[d,Before55]]) + ); + + After7=Before55), + (After7=[]-> + foldr(append,[After2,After4],After6); + foldr(append,[After2,After4,After7],After6)), + replace12_vintage(Before52,After52,Deletions,After6,After3). 
+replace12_vintage(Before,After,Deletions,After2,After3) :- + %trace,%append(After4,After5,After), + %(After4=[]), + append(After51,After52,After), + + %append(After4,Before5,Before), + append(Before53,Before54,Before), + append(After51,Before52,Before54), + %After=[After4|After5], + %not(Before53=[]), + %not(length(Before53,1)),%-> + not(Before53=[]),%-> + Before53=[Before55],%; + %Before53=Before55), + (true%member(Before53,Deletions) + -> + (Before53=[]->After7=[]; + + (is_list(Before55)-> + findall([d,B],member(B,Before55),After7); + After7=[[d,Before55]]) + ); + + After7=Before55), + (After7=[]-> + foldr(append,[After2%,After4 + ],After6); + foldr(append,[After2,%After4, + After7],After6)), + replace12_vintage(Before52,After52,Deletions,After6,After3). +*/ + +% find_insertions_and_deletions_vintage([1,2,3],[1,2,4,3],In,D). +% In = [4], +% D = []. + +% find_insertions_and_deletions_vintage([1,2,3],[1,3],In,D). +% In = [], +% D = [2]. + +find_insertions_and_deletions_vintage(Before,After,Insertions1,Deletions1,Permanent_insertions2) :- +%trace, + %/* + correspondences(Corr), + keep1(Kept), + findall(A1,(member(A1,After),get_base_token_number(A1,A10),member([Comm,A10],Corr), +%(( +(string_concat(Comm1,",",Comm)->true;Comm1=Comm),catch(term_to_atom(A2,Comm1),_,fail),A2=[[_,Name],Args],length(Args,Arity),member([Name,Arity],Kept) +%)->true;Comm=",") +),Permanent_insertions1), +sort(Permanent_insertions1,Permanent_insertions), + %Permanent_insertions=[], + %()subtract(After,After1,After2), + %*/ + + findall([M21,M11],(member(M21,After),get_base_token_number(M21,M11)),After2), + findall([M22,M12],(member(M22,Before),get_base_token_number(M22,M12)),Before2), + + %findall(P11,(member([P11,P12],M31),member([P13,P14],M32),not(P12=P14)),After11), + %sort(After11,After1), + %findall(P11,(member([P11,P12],M32),member([P13,P14],M31),not(P12=P14)),Before11), +% sort(Before11,Before1), +%trace, + %subtract(M31,M32,After1), + %subtract(M32,M31,Before1), + 
subtract_civ(After2,Before2,[],After1), + subtract_civ(Before2,After2,[],Before1), + + %Before1=Insertions, + %After1=Deletions. + + append(Before,After,Both), + %trace, + findall(A1,(member(A1,Both),%A1=A10, + get_base_token_number(A1,A10), + member([",",A10],Corr)),%)->A11=[A1]; + A11%=[] + ), + + subtract(After1,Permanent_insertions,Insertions), + subtract(Before1,Permanent_insertions,Deletions), + + subtract(Permanent_insertions,A11,Permanent_insertions2), + union(A11,Insertions,Insertions1), + subtract(Deletions,A11,Deletions1). +/* +find_insertions_and_deletions_vintage_old(Before,After,Insertions,Deletions) :- + subtract(After,Before,Insertions), + subtract(Before,After,Deletions),!. +*/ +subtract_civ([],_M32,A,A) :- !. +subtract_civ(M31,M32,A1,A2) :- + M31=[[C,D]|E], + (member([_,D],M32)-> + A1=A3; + append(A1,[C]%[[C,D]] + ,A3)), + subtract_civ(E,M32,A3,A2). + + +find_combos1_vintage(Insertions,Deletions,Permanent_insertions,Combos) :- + findall([i,In],member(In,Insertions),In1), + findall([d,De],member(De,Deletions),De1), + findall([p,Pe],member(Pe,Permanent_insertions),Pe1), + foldr(append,[In1,De1,Pe1],Ops), + find_combos2_vintage(Ops,[],Combos). + +find_combos2_vintage([],Combos,Combos). +find_combos2_vintage(Ops,Combos1,Combos2) :- + Ops=[Op|Ops1], + (Op=[p,_]->Switch=on; + member(Switch,[on,off])), + append(Combos1,[[Op,Switch]],Combos3), + find_combos2_vintage(Ops1,Combos3,Combos2). + +%findall([Op,Switch],(member(Op,Ops),member(Switch,[on,off])),Switches). +% ((member(In,Insertions)->true;In=[]), +% (member(De,Deletions)->true;De=[]), + +find_combos3_vintage([],_Combos,Combos,Combos) :- !. 
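+
+% A usage sketch for find_combos2_vintage above: permanent insertions
+% ([p,_] ops) are always switched on, while insertion and deletion ops
+% backtrack over on and off; e.g. the first solution of
+% ?- find_combos2_vintage([[i,4],[d,2],[p,7]],[],C).
+% C = [[[i,4],on],[[d,2],on],[[p,7],on]].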
+find_combos3_vintage(After,Combos,Combos1,Combos2) :-
+ After=[Item1|After2],
+ ((Item1=[Type,N],
+ (catch(get_base_token_number(N,N1),_,false)->true;true),
+ (member([[Type,N1],Switch],Combos)->N2=N1;
+ (member([[Type,N],Switch],Combos),N2=N)),
+ ((%Type=i,
+ Switch=on)->Item=[N2];Item=[]))->true;Item=[Item1]),
+ append(Combos1,Item,Combos3),
+ find_combos3_vintage(After2,Combos,Combos3,Combos2),!. 
diff --git a/luciancicd/diff-cgpt.pl b/luciancicd/diff-cgpt.pl
new file mode 100644
index 0000000000000000000000000000000000000000..4d9c9e25d68cca5c5850c5ef31013f8e7816d345
--- /dev/null
+++ b/luciancicd/diff-cgpt.pl
@@ -0,0 +1,93 @@
+% Base case: diff of empty lists is empty
+diff([], [], [], [], _, L, L).
+
+% Case 1: Head elements are the same
+diff([X|Xs], [X|Ys], Ins, Del, PI, List1, List2) :-
+ (member(X,PI)->append(List1,[[p,X]],List3);
+ append(List1,[X],List3)),
+ diff(Xs, Ys, Ins, Del, PI, List3, List2),
+ !.
+
+% Case 2: Head elements are different - Y (from the second list) is an insertion
+diff(Xs, [Y|Ys], [Y1|Ins], Del, PI, List1, List2) :-
+ (member(Y,PI)->(append(List1,[[p,Y]],List3),
+ Y1=[]);
+ (append(List1,[[i,Y]],List3),
+ Y1=Y)),
+ diff(Xs, Ys, Ins, Del, PI, List3, List2),
+ %append(Ins,[Y1],Ins1),
+ !.
+
+% Case 3: Head elements are different - X (from the first list) is a deletion
+diff([X|Xs], Ys, Ins, [X1|Del], PI, List1, List2) :-
+ (member(X,PI)->(append(List1,[[p,X]],List3),
+ X1=[]);
+ (append(List1,[[d,X]],List3),
+ X1=X)),
+ diff(Xs, Ys, Ins, Del, PI, List3, List2),
+ %append(Del,[X1],Del1),
+ !.
+
+
+%%%%%%%%%
+
+% Use base number unless a member of I1,D1
+already_member1(X1,X2,Insertions1,Deletions1,
+ X) :-
+ get_base_token_number(X1,%X%
+ X11
+ ),
+ get_base_token_number(X2,%X%
+ X21
+ ),
+ X11=X21,
+ ((member(X1,Insertions1)->true;member(X1,Deletions1))->X=X1;
+ (X=X11)),
+ !.
+already_member2(X1,Insertions1,X) :-
+ get_base_token_number(X1,X11),
+ ((member(X1,Insertions1))->X=X1;
+ (X=X11)),!. 
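+
+% A worked example for diff/7 above, with no permanent insertions (PI=[]):
+% matching heads pass through unchanged and the extra token in the second
+% list is tagged as an insertion.
+% ?- diff([1,2,3],[1,2,4,3],In,De,[],[],Out).
+% In = [4], De = [], Out = [1, 2, [i, 4], 3].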
+
+diff(A, B, C, D, E, F, G, H, J) :-
+ diff_a(A, B, C, D, E1, F1, G, H, J),
+ delete(E1,[],E),
+ delete(F1,[],F).
+
+% Base case: diff_a of empty lists is empty
+diff_a([], [], _,_,[], [], _, L, L).
+
+% Case 1: Head elements are the same
+diff_a([X1|Xs], [X2|Ys], Insertions1,Deletions1,Ins, Del, PI, List1, List2) :-
+%trace,
+ already_member1(X1,X2,Insertions1,Deletions1,
+ X),
+ ((member(X3,PI),(catch(get_base_token_number(X3,X),_,false)->true;X3=X))->append(List1,[[p,X3]],List3);
+ append(List1,[X],List3)),
+ diff_a(Xs, Ys, Insertions1,Deletions1,Ins, Del, PI, List3, List2),
+ !.
+
+% Case 2: Head elements are different - Y (from the second list) is an insertion
+diff_a(Xs, [Y2|Ys], Insertions1,Deletions1,[Y1|Ins], Del, PI, List1, List2) :-
+ already_member2(Y2,Insertions1,Y),
+ ((member(Y3,PI),(catch(get_base_token_number(Y3,Y),_,false)->true;Y3=Y))->(append(List1,[[p,Y3]],List3),
+ Y1=[]);
+ (append(List1,[[i,Y]],List3),
+ Y1=Y)),
+ diff_a(Xs, Ys, Insertions1,Deletions1,Ins, Del, PI, List3, List2),
+ %append(Ins,[Y1],Ins1),
+ !.
+
+% Case 3: Head elements are different - X (from the first list) is a deletion
+diff_a([X2|Xs], Ys, Insertions1,Deletions1,Ins, [X1|Del], PI, List1, List2) :-
+ already_member2(X2,Deletions1,X),
+ ((member(X3,PI),(catch(get_base_token_number(X3,X),_,false)->true;X3=X))->(append(List1,[[p,X3]],List3),
+ X1=[]);
+ (append(List1,[[d,X]],List3),
+ X1=X)),
+ diff_a(Xs, Ys, Insertions1,Deletions1,Ins, Del, PI, List3, List2),
+ %append(Del,[X1],Del1),
+ !. 
+ + + diff --git a/luciancicd/find_dependencies.pl b/luciancicd/find_dependencies.pl new file mode 100644 index 0000000000000000000000000000000000000000..f3bab129b17420cc760c640fc1c0ba95d1ed7c55 --- /dev/null +++ b/luciancicd/find_dependencies.pl @@ -0,0 +1,719 @@ +% find_dependencies - finds dependencies of predicates, orders bottom up + +% Deps - after ordering sm, finds deps of sm pred nums + +% modes not types + +% modes will help add commands to sm, for conversion to c + +:-include('../Philosophy/sub_term_with_address.pl'). +%:-include('../SSI/ssi.pl'). %XXX +%:-dynamic resort_n/1. +find_dependencies(Dep99_name,Dep99_arity,F,F2 +,Functions2b,Pred_numbers2) :- +%trace, +%writeln1(find_dependencies(Dep99_name,Dep99_arity,F,F2,Functions2b,Pred_numbers2)), +%trace, +/* +A=[15 +], +findall(%[B, +Functions2b%] +,(_Debug=off,member(B,A), +*/ +%trace, +%test(248% +%15 +%,Q,F,_R), +numbers(Dep99_arity,1,[],V2), + + +length(Q4,Dep99_arity), +(Q4=[]->Q5=[[n,Dep99_name]]; +Q5=[[n,Dep99_name],Q4|_]), + +findall([v,V3],(member(V1,V2),atom_concat("a",V1,V3)),V), +%trace, +(V=[]->Q=[[n,Dep99_name]]; +Q=[[n,Dep99_name],V]), +query_box(Q,_Query1,F,Functions1), +%trace, +%get_n_item(Functions1,Q,N2), x +%N2=1,%XXX +findall(N2,(((Q5=[NG],Q51=[NG|_])->true;Q51=Q5),%writeln([*,Q51,Q5]), +get_n_item(Functions1,Q51,N2)),N2s), +findall(N22,(member(N21,N2s),N22 is N21-1),N22s),%*N1 is N2-1, + +convert_to_grammar_part1(Functions1,[],Functions2,_),add_line_numbers_to_algorithm1(Functions2,Functions2a),find_pred_numbers_to_cut(Functions2a,[],Pred_numbers),find_state_machine1(Functions2a,Functions3,Pred_numbers), +%trace, +a_to_m2(N22s,Functions3%2a%3 +,Pred_numbers, +Functions2b), + +%trace, +%/* +findall([NF,Arity,PN2],(member(PN1,Functions2b),member([NF,Arity,PN2],Pred_numbers),%member(PN1% +(PN1=[loop1,PN11]->member(PN12,PN11);PN12=PN1),member(PN12 +,PN2)),Pred_numbers21),%Pred_numbers21=Pred_numbers2,% +sort(Pred_numbers21,Pred_numbers2), +%Pred_numbers21=Pred_numbers2, +%*/ 
+%Pred_numbers2=Pred_numbers, +%trace, + +findall(NFAR,(member([NF,Arity,_],Pred_numbers2), +length(Args,Arity), +(Args=[]->NFAR=[NF|Rest];NFAR=[NF,Args|Rest]), +member(NFAR,F)%,length(Args,Arity) +),F21),%F21=F2,% +sort(F21,F2), +!. +%a_and_m_to_clp(Functions3,Functions2b,Functions2c), +%lp_to_c(Functions2c,Functions2d) + +%),Functions2d).%find_pred_numbers_dependencies2(Functions3,Functions2b),lucianpl(Debug,Q,F,R1),Functions2b=[[_|_]|Functions2c],lucianpl(off,Q,Functions2c,R2),(R1=R2->Result=success;Result=fail),writeln([B,Result])),C),sort(C,C1),writeln(C1). + +% could run in lp or compile and run in c (taking alg file as argument) + +find_pred_numbers_to_cut(Functions2a,Functions2ab,Pred_numbers) :- + find_pred_numbers(Functions2a,Functions2ab,Pred_numbers),!. + +a_to_m(N1,Functions1,Pred_numbers, +Ordered_pred_nums1) :- +%trace, +%find pred nums in sm (done) + +% find occurrence of pred calls in preds regardless of clause, for finding modes in bottom-up order +% find order in terms of pred name, arity +% x modify find_pred_numbers + +find_pred_numbers_dependencies(Functions1,[],Functions2a,Pred_numbers), +N=0, +%member([N,P],Functions2a), +%delete(Functions2a,[N,P],F), + +% order_preds_bottom_up1_post_order_dfs(_L1,[N],Functions2a,[N],Ordered_pred_nums0,[N]), +% Ordered_pred_nums0=[0, [1, 2, [loop, 1]]] +% In test 7, query box (predicate 0) and predicate 1 are called, where predicate 1 calls predicate 2 and itself. 
+
+
+%order_preds_bottom_up1_post_order_dfs(1,Functions2a,[],Ordered_pred_nums0),
+%trace,
+order_preds_bottom_up1_post_order_dfs(_L1,[N],Functions2a,[],Ordered_pred_nums01,[N]),
+
+(Ordered_pred_nums01=[Ordered_pred_nums02]->true;
+Ordered_pred_nums01=Ordered_pred_nums02),
+
+%flatten_except_loops1(Ordered_pred_nums02,Ordered_pred_nums0),
+Ordered_pred_nums02=Ordered_pred_nums0,
+%foldr(append,Ordered_pred_nums01,Ordered_pred_nums0),
+
+
+/*
+% find min, max
+
+findall(L,member([L,_]),L1),
+sort(L1,L2),
+append([Min_L],Rest,L2),
+append(_,[Max_L],Rest),
+*/
+
+% find_groups([0, [1, 2, [loop, 1]]], [0], A).
+
+% A = [[loop1, 1], 2, 0]
+
+% In post-order depth-first search, the order of the predicates to test is: predicate 1 (which calls itself), then predicates 2 and 0.
+% loop1 denotes a group of predicates that are in a loop, so they have to be tested separately, with their own combinations of changes
+%trace, % debugging call, disabled
+find_groups(Ordered_pred_nums0,[N1],Ordered_pred_nums11,true),
+%reverse(Ordered_pred_nums11,Ordered_pred_nums13),
+list_to_set(Ordered_pred_nums11,Ordered_pred_nums14),
+%reverse(Ordered_pred_nums12,Ordered_pred_nums14),
+remove_dups_from_loops(Ordered_pred_nums14,Ordered_pred_nums15),
+
+%trace,
+findall(Ordered_pred_nums19,(member(Ordered_pred_nums16,Ordered_pred_nums15),
+(Ordered_pred_nums16=[loop1,Ordered_pred_nums17]->(list_to_set(Ordered_pred_nums17,Ordered_pred_nums18),Ordered_pred_nums19=[loop1,Ordered_pred_nums18]);Ordered_pred_nums19=Ordered_pred_nums16
+)),Ordered_pred_nums20),
+
+delete(Ordered_pred_nums20,loop,Ordered_pred_nums21),
+
+ findall(E,(member(F,Ordered_pred_nums21),
+ (F=[loop1,[A1]]->E=A1;E=F)),Ordered_pred_nums1),
+%flatten(Ordered_pred_nums0,Ordered_pred_nums1),
+/* bfs:
+foldr(append,Ordered_pred_nums0,Ordered_pred_nums),
+sort(Ordered_pred_nums,Ordered_pred_nums2),
+reverse(Ordered_pred_nums2,Ordered_pred_nums3),
+findall(B,member([_,B],Ordered_pred_nums3),Ordered_pred_nums4),
+flatten(Ordered_pred_nums4,Ordered_pred_nums1),
+*/
+!. 
+ +%alg_to_modes(Ordered_pred_nums1,Functions1,[],_Var_modes,[],_Functions_with_modes) + + +%find_pred_numbers_dependencies(Functions1,_Reserved_words,Pred_numbers,Functions2a) :- + +% pred group y is called by pred group z +% don't need name, arity + +%find_pred_numbers_dependencies2(Functions1,Functions2a,Pred_numbers). + +% *** do in sm form instead to find deps +% - attribute calls in preds to groups of preds + + +%find_pred_numbers_dependencies(Algorithm1,Algorithm2,Pred_numbers) :- + +%findall(Pred_nums,member([_Name,_Arity,Pred_nums],Pred_numbers),Pred_numbers1), + %find_pred_numbers_dependencies(_Algorithm1,[],_Deps,_Pred_numbers),!. + +% * need p, clause number in deps + +find_pred_numbers_dependencies([],Deps,Deps,_) :- !. +find_pred_numbers_dependencies(Algorithm1,Deps1,Deps2,Pred_numbers) :- + Algorithm1=[Function1|Functions], + (Function1=[Number,_Name,_Arguments1,_Symbol1,Body1] + ->%symbol(Symbol1,Symbol2), + (%trace, + find_deps3(Body1,%Body2, + %[],Body2, + Pred_numbers, + Deps3), + foldr(append,Deps3,Deps31), + append(Deps1,[[Number, + %Name,Arguments1,Symbol1,Body2 + Deps31]],Deps4)), + find_pred_numbers_dependencies(Functions,Deps4,Deps2,Pred_numbers)),!. + + +find_deps3(Body1,%Body2,Body3, +Pred_numbers, +Deps) :- + + findall(Pred_nums,(member([_Number,[_Dbw_on_true,_Statements1_number],[_Dbw_go_after,_Statements2_number],[_Dbw_on_false,_Return_line_false],[_Dbw_go_to_predicates,_Predicates],[_Dbw_n_or_v1,F]|Arguments1],Body1), + foldr(append,Arguments1,Arguments), + length(Arguments,Arity), + get_lang_word("n",Dbw_n1),Dbw_n1=Dbw_n, + member([[Dbw_n,F],Arity,Pred_nums],Pred_numbers) + ),Deps1), + sort(Deps1,Deps),!. + +/* +order_preds_bottom_up_post_order_dfs(_L,Functions,Ordered_pred_nums1,Ordered_pred_nums2) :- +%trace, +Functions=[[N,P]|F], + +append(Ordered_pred_nums1,[%[L, +N%] +],Ordered_pred_nums3), + findall(Ordered_pred_nums21,order_preds_bottom_up1_post_order_dfs(_L1,P,F,Ordered_pred_nums3,Ordered_pred_nums21),Ordered_pred_nums2),!. 
+ */ + +% program may have unconnected preds, causing a bug + +%order_preds_bottom_up1_post_order_dfs(_,[],_%[[N, []]] +%,Ordered_pred_nums,Ordered_pred_nums) :- + %append(Ordered_pred_nums1,[N],Ordered_pred_nums2), +% fail,!. + +order_preds_bottom_up1_post_order_dfs(_,_,[],Ordered_pred_nums,Ordered_pred_nums,_) :- !. +order_preds_bottom_up1_post_order_dfs(_L,N,Functions,Ordered_pred_nums1,Ordered_pred_nums2,Pre_order1) :- +%L3 is L+1, +% assign a level, record min, max levels + +%findall(Ordered_pred_nums8,( +%member(P4,P3),%),Ordered_pred_nums1a), + +%foldr(append,[Ordered_pred_nums1,Ordered_pred_nums1a],Ordered_pred_nums1b), + +%findall(Ordered_pred_nums31, +%(member([N2,P4],Functions),%L2 is L-1, +(member(P2,N),member([P2,P3],Functions), + +order_preds_bottom_up1_post_order_dfs2(P3,P2,Functions,Ordered_pred_nums1,Ordered_pred_nums3,Pre_order1) +%append([P3],Ordered_pred_nums3,Ordered_pred_nums11) + +), + +%(P3=[]->Ordered_pred_nums3=P2; +%delete(Functions,[P2,P3],Functions2),%member(P1,P), +%order_preds_bottom_up1_post_order_dfs(_L3,P3,Functions,[],Ordered_pred_nums3))) +%append(Ordered_pred_nums3,[N]%[[L,N]] +%,Ordered_pred_nums11), + +foldr(append,[Ordered_pred_nums1,Ordered_pred_nums3],Ordered_pred_nums2). + +%foldr(append,Ordered_pred_nums3,Ordered_pred_nums5), +%(N=[]->Ordered_pred_nums8=Ordered_pred_nums2;append(Ordered_pred_nums8,[N,P3]%[[L3,P3]] +%,Ordered_pred_nums2)) + +%),Ordered_pred_nums2) + +order_preds_bottom_up1_post_order_dfs2(P3,_P2,_Functions,_Ordered_pred_nums1,Ordered_pred_nums3,_) :- +P3=[],Ordered_pred_nums3=[],!. + +order_preds_bottom_up1_post_order_dfs2(P3,_P2,Functions,_Ordered_pred_nums1,Ordered_pred_nums3,Pre_order1) :- +member(P4,P3),flatten(Pre_order1,Ordered_pred_nums2), +member(P4,Ordered_pred_nums2), +append(P5,P6,P3), +append([P4],P7,P6), +order_preds_bottom_up1_post_order_dfs2(P7,_P21,Functions,[],Ordered_pred_nums31,Pre_order1), +foldr(append,[P5,[[loop,P4]],Ordered_pred_nums31],Ordered_pred_nums3),!. 
+%delete(Functions,[P2,P3],Functions2),%member(P1,P), + +order_preds_bottom_up1_post_order_dfs2(P3,_P2,Functions,_Ordered_pred_nums1,Ordered_pred_nums32,Pre_order1) :- +findall(Ordered_pred_nums31, +(member(P4,P3),append(Pre_order1,[P4],Pre_order2),order_preds_bottom_up1_post_order_dfs(_L3,[P4],Functions,[],Ordered_pred_nums3,Pre_order2),append([P4],Ordered_pred_nums3,Ordered_pred_nums31)),Ordered_pred_nums32). + + +find_groups([],Ordered_pred_nums,Ordered_pred_nums,_) :- !. + + +find_groups(A,Ordered_pred_nums1,Ordered_pred_nums2,First) :- + (number(A)->A=B;(A=[B],number(B))), + (First=true->Ordered_pred_nums1=Ordered_pred_nums2; + (member(B,Ordered_pred_nums1)->Ordered_pred_nums1=Ordered_pred_nums2; + append([B],Ordered_pred_nums1,Ordered_pred_nums2))), + !. + +find_groups(Ordered_pred_nums0,Ordered_pred_nums1,Ordered_pred_nums22,_) :- + Ordered_pred_nums0=[Ordered_pred_nums3|Ordered_pred_nums4], + %findall(Ordered_pred_nums2,( + %member(Ordered_pred_nums41,Ordered_pred_nums4), +%append([Ordered_pred_nums3],Ordered_pred_nums1,Ordered_pred_nums5), +find_groups2(Ordered_pred_nums3,Ordered_pred_nums4,Ordered_pred_nums1,Ordered_pred_nums22). + +find_groups2(_,[],Ordered_pred_nums,Ordered_pred_nums) :- !. +%find_groups2(_,[A],Ordered_pred_nums1,Ordered_pred_nums2) :- +% append([A],Ordered_pred_nums1,Ordered_pred_nums2),!. +find_groups2(_,A,Ordered_pred_nums1,Ordered_pred_nums2) :- + (number(A)->A=B;(A=[B],number(B))), + append([B],Ordered_pred_nums1,Ordered_pred_nums2),!. 
+find_groups2(Ordered_pred_nums3,Ordered_pred_nums4,Ordered_pred_nums1,Ordered_pred_nums22) :- + +%writeln(find_groups2(Ordered_pred_nums3,Ordered_pred_nums4,Ordered_pred_nums1,Ordered_pred_nums22)), + + Ordered_pred_nums4=[Ordered_pred_nums41|Ordered_pred_nums42], + (contains_loop(Ordered_pred_nums3,Ordered_pred_nums41,[],_)-> + %append(P5,P6,Ordered_pred_nums4), +%append([Ordered_pred_nums41],P7,P6), +%P71=[%Ordered_pred_nums3| +%P7], +%trace, + ((Ordered_pred_nums4=[Ordered_pred_nums43]->true; + Ordered_pred_nums4=Ordered_pred_nums43), + + ((Ordered_pred_nums43=[[AN|AN2]|_],number(AN))-> + (find_groups([Ordered_pred_nums3,[AN|AN2]],[]%Ordered_pred_nums1 + ,%[],%Ordered_pred_nums24, + Ordered_pred_nums25,true), + subtract(Ordered_pred_nums25,Ordered_pred_nums1,Ordered_pred_nums225), + +foldr(append,[Ordered_pred_nums225, +Ordered_pred_nums1],Ordered_pred_nums24)) + + ; + + (%trace, + %trace, + %find_groups_replace_loops(%Ordered_pred_nums3, + %Ordered_pred_nums1,Ordered_pred_nums1a),%P71,%Ordered_pred_nums1, + in_or_exiting_loop(Ordered_pred_nums3,Ordered_pred_nums43,[],In_loop,[],%Ordered_pred_nums1, + Exiting_loop%,[],Rest_of_preds + ), + %notrace, +find_groups_replace_loops(%Ordered_pred_nums3, + In_loop,%P71,%Ordered_pred_nums1, + Ordered_pred_nums311), +find_groups_replace_loops(%Ordered_pred_nums3, + Exiting_loop,%P71,%Ordered_pred_nums1, + Ordered_pred_nums312), + flatten(Ordered_pred_nums311,Ordered_pred_nums321), + flatten(Ordered_pred_nums312,Ordered_pred_nums322), + append(Ordered_pred_nums321,[Ordered_pred_nums3],Ordered_pred_nums323), + +%trace, +subtract(Ordered_pred_nums322,Ordered_pred_nums1,Ordered_pred_nums324), +%subtract(Ordered_pred_nums1,Ordered_pred_nums322,Ordered_pred_nums111), +%trace, +insert_loop1([loop1,Ordered_pred_nums323],Ordered_pred_nums1,Ordered_pred_nums1b), +list_to_set(Ordered_pred_nums1b,Ordered_pred_nums1c), +foldr(append,[%Rest_of_preds, +Ordered_pred_nums324,%[[loop1,Ordered_pred_nums323]], +Ordered_pred_nums1c%P5, 
+],Ordered_pred_nums251), +%,notrace +%reverse(Ordered_pred_nums251,Ordered_pred_nums261), +%list_to_set(Ordered_pred_nums261,Ordered_pred_nums271), +list_to_set(Ordered_pred_nums251,Ordered_pred_nums24) +%reverse(Ordered_pred_nums271,Ordered_pred_nums24) +%,notrace +)), + +find_groups([Ordered_pred_nums3|Ordered_pred_nums42],Ordered_pred_nums24,Ordered_pred_nums22,true) + +); +(%(Ordered_pred_nums3=3->trace;true),% +(member(Ordered_pred_nums3,Ordered_pred_nums1)->Ordered_pred_nums1=Ordered_pred_nums5; +(number(Ordered_pred_nums3)->append([Ordered_pred_nums3],Ordered_pred_nums1,Ordered_pred_nums5); +(Ordered_pred_nums3=[Ordered_pred_nums31|_]-> +(number(Ordered_pred_nums31)->append([Ordered_pred_nums31],Ordered_pred_nums1,Ordered_pred_nums5); +Ordered_pred_nums1=Ordered_pred_nums5)); +Ordered_pred_nums1=Ordered_pred_nums5)), +%notrace, +find_groups(Ordered_pred_nums41,Ordered_pred_nums5,Ordered_pred_nums23,false), +find_groups2(Ordered_pred_nums3,Ordered_pred_nums42,Ordered_pred_nums23,Ordered_pred_nums22) +)). +/* + +%append(Ordered_pred_nums1,[[loop1,Ordered_pred_nums3]],Ordered_pred_nums2)); + (%trace, + %append([Ordered_pred_nums3],Ordered_pred_nums1,Ordered_pred_nums5), +findall(Ordered_pred_nums22,(member(Ordered_pred_nums412,Ordered_pred_nums4), +find_groups(Ordered_pred_nums412,[]%Ordered_pred_nums5 +,Ordered_pred_nums22)),Ordered_pred_nums21), +%trace, + (foldr(append, + Ordered_pred_nums21,Ordered_pred_nums23)%->true; + %Ordered_pred_nums21=Ordered_pred_nums23 + ), + +foldr(append,[Ordered_pred_nums23,[Ordered_pred_nums3],Ordered_pred_nums1],Ordered_pred_nums2) + %flatten( + +)), + %flatten(%foldr(append, + %Ordered_pred_nums2,Ordered_pred_nums22). + Ordered_pred_nums2=Ordered_pred_nums22. + */ + +contains_loop(A, [loop, A], B, C) :- append(B,[A],C),!. 
+ +contains_loop(Ordered_pred_nums1,Ordered_pred_nums2,P1,P2) :- + (Ordered_pred_nums2=[loop,Ordered_pred_nums1]->P1=P2; + (Ordered_pred_nums2=[Ordered_pred_nums3|Ordered_pred_nums4], + member(Ordered_pred_nums41,Ordered_pred_nums4), + append(P1,[Ordered_pred_nums3],P3), + contains_loop(Ordered_pred_nums1,Ordered_pred_nums41,P3,P2))). + +not_contains_loop(A, [loop, A], B, B) :- !. + +not_contains_loop(Ordered_pred_nums1,Ordered_pred_nums2,P1,P21) :- +%trace, + recursive_reverse(Ordered_pred_nums2,Ordered_pred_nums21), + not_contains_loop1(Ordered_pred_nums1,Ordered_pred_nums21,P1,P21). + %reverse(P2,P22), + %list_to_set(P22,P23), + %reverse(P23,P21). + +not_contains_loop1(Ordered_pred_nums1,Ordered_pred_nums2,P1,P21) :- +%writeln([ordered_pred_nums2,Ordered_pred_nums2]), + ((Ordered_pred_nums2=[_Ordered_pred_nums1x,loop]%->true; + %(Ordered_pred_nums2=loop%->true; + %Ordered_pred_nums2=[loop] + )->fail;%P1=P2; + ((Ordered_pred_nums2=A,number(A))->append(P1,[A],P21); + (append(Ordered_pred_nums4,[Ordered_pred_nums3],Ordered_pred_nums2), + %reverse(Ordered_pred_nums4,Ordered_pred_nums42), + findall(P3,(member(Ordered_pred_nums41,Ordered_pred_nums4), + %reverse(Ordered_pred_nums41,Ordered_pred_nums42), + + not_contains_loop1(Ordered_pred_nums1,Ordered_pred_nums41,[],P3)),P31), + %retractall(resort_n(_)), + %assertz(resort_n(1)), + %append(P31,[P1],P33), + resort(P31,P32), + %reverse(P31,P32), + %trace, + foldr(append,[ + %P1, + P32%[Ordered_pred_nums3 + ,[Ordered_pred_nums3]],P2), + %notrace, + flatten(P2,P21) + ))). + +resort(P31,P32) :- + member([A|B],P31),member([A|C],P31),not(B=C), + subtract(P31,[[A|B],[A|C]],P33), + foldr(append,[[A],B,C],D), + append(P33,[D],P32),!. +resort(A,B) :- + member([C|D],A),member(E,A),append(F,[C],E), + subtract(A,[[C|D],E],G), + foldr(append,[F,[C],D],H), + append(G,[H],B),!. +resort(A,A) :- !. + +recursive_reverse(A,B) :- + recursive_reverse(A,[],B). + +recursive_reverse([],A,A) :- !. 
+recursive_reverse(A,B,C) :- + A=[D|E], + (not(is_list(D))->D=D1; + (flatten(D,D)->reverse(D,D1); + recursive_reverse(D,[],D1))), + append([D1],B,D2), + recursive_reverse(E,D2,C). + +remove_dups_from_loops(A,B) :- + remove_dups_in_loops_from_rest(A,C), + remove_dups_in_loops(C,B). + %trace, + %findall(E,(member(F,D), + %(F=[loop1,[A1]]->E=A1;E=F)),B). + +remove_dups_in_loops_from_rest(A,C) :- + findall(D,member([loop1,D],A),E), + flatten(E,F), + subtract(A,F,C). + +remove_dups_in_loops(C,B) :- + sub_term_wa([loop1,_],C,A), + remove_dups_in_loops1(A,[],D), + foldr(put_sub_term_wa_ae,D,C,B). + +remove_dups_in_loops1([],A,A) :- !. +remove_dups_in_loops1(A,B,C) :- + A=[[Add,[loop1,List]]|D], + remove_dups_in_loops1(List,D,[],E), + append(B,[[Add,[loop1,List]]],F), + remove_dups_in_loops1(E,F,C). + +remove_dups_in_loops1(_,[],F,F) :- !. +remove_dups_in_loops1(List1,D,F,E) :- + D=[[Add,[loop1,List2]]|G], + subtract(List2,List1,List3), + append(F,[[Add,[loop1,List3]]],H), + remove_dups_in_loops1(List1,G,H,E). + +/* +resort(A,B) :- + %length(A,C),numbers(C,1,[],N), + resort_n(NDs), + findall(N-D,(member(D,A), + (member(N-D,NDs)->true;get_resort_n(N))),E), + + +get_resort_n(A) :- + resort_n(A), + A1 is A+1, + retractall(resort_n(_)), + assertz(resort_n(A1)). 
+*/ + /* +not_contains_loop2a(Ordered_pred_nums1,Ordered_pred_nums2,P1,P23,First) :- + (Ordered_pred_nums2=[loop,Ordered_pred_nums1]->P23=[]; + (Ordered_pred_nums2=[[loop,Ordered_pred_nums1]|_]->fail; + (Ordered_pred_nums2=[]->true;%P1=P2; + %((Ordered_pred_nums2=[A|_],number(A))->P2=A;%P1=P2;%append(P1,[A],P2); + ((Ordered_pred_nums2=[Ordered_pred_nums21]->true; + Ordered_pred_nums2=Ordered_pred_nums21), + Ordered_pred_nums21=[_Ordered_pred_nums3|Ordered_pred_nums4], + findall(P21,(member(Ordered_pred_nums41,Ordered_pred_nums4), + (Ordered_pred_nums41=[loop,_]->fail; +((Ordered_pred_nums41=A,number(A),First=false +)->P21=A; %(number(Ordered_pred_nums41)->Ordered_pred_nums43=[Ordered_pred_nums41];Ordered_pred_nums43=[]), + %(append(Ordered_pred_nums41 + %,[Ordered_pred_nums3] + %,P3), + %delete(Ordered_pred_nums4,Ordered_pred_nums41,Ordered_pred_nums412), + ((get_n_item(Ordered_pred_nums4,Ordered_pred_nums41,1)->First=true;First=false), + not_contains_loop(Ordered_pred_nums1,[Ordered_pred_nums41],[], + P21,First))))),P22), + foldr(append,[P1,P22],P2), + flatten(P2,P23))))). +*/ +find_groups_replace_loops(A,%B, + C) :- + %trace, + sub_term_wa([loop,_],A,D), + findall(E,member([E,_],D),F), + foldl(delete_sub_term_wa,[F],A,C). + +%in_or_exiting_loop(_,[],In_loop,In_loop,Exiting_loop,Exiting_loop) :- !. +in_or_exiting_loop(Ordered_pred_nums3,P71,In_loop1,In_loop2,Exiting_loop1,Exiting_loop2%,Rest_of_preds1,Rest_of_preds2 +) :- + %P71=[P72|P73], + (contains_loop(Ordered_pred_nums3,P71,[],P)-> + append(In_loop1,[P],In_loop2); + (%P=[], + In_loop1=In_loop2)), + %Exiting_loop1=Exiting_loop2); + (not_contains_loop(Ordered_pred_nums3,P71,[],P1%,false + ), + %In_loop1=In_loop2, + /* + flatten(P71,P71F), + subtract(P71F,[Ordered_pred_nums3,P,loop],P1), + append(Exiting_loop1,[P1],Exiting_loop2). + */ + %Exiting_loop1=Exiting_loop2). 
+ %trace, + %P71=[P72|P73], + %( + %contains_loop(Ordered_pred_nums3,P71,[],_P2)-> + %(find_groups([Ordered_pred_nums3,[P71]]%P73] + %,[%P72 + %],%Ordered_pred_nums24, + %P2,true), + %find_groups2(Ordered_pred_nums3,P71,[],%Ordered_pred_nums24, + %P2), +%find_groups2(Ordered_pred_nums3,Ordered_pred_nums42,Ordered_pred_nums23,Ordered_pred_nums22) + + append(Exiting_loop1,[P1],Exiting_loop2) + ).%); + %(find_groups_replace_loops(P71,P74), + %append(Rest_of_preds1,[P74%P72|P74 + %],Rest_of_preds2))). + + +contains_loop1(A) :- + member([loop,_],A),!.%,flatten(A,B),member(loop,B),!. + +%flatten_except_loops1([],[]) :- !. +%flatten_except_loops1([A],[A]) :- !. +flatten_except_loops1(A,B) :- + flatten_except_loops2(A,[],B,false). + %flatten_except_loops2([A],[],B1), + %flatten_except_loops2(C,B1,B),!. + +flatten_except_loops2([],B,B,_) :- !. +flatten_except_loops2(A,B,C,First) :- + (not(is_list(A))->append([],[A],C); + (A=[D|E], + (D=[loop,_]->(append([],[D],F)%,Flag=nowrap + ); + ((D=[D1],number(D1))->(append([],[D1],F)%,Flag=nowrap + ); + (flatten_except_loops2(D,[],F,true)%,Flag=wrap + ))), + %(Flag=wrap->append([],[F],C2);append([],F,C2)), + flatten_except_loops2(E,F,C1,false), + (First=true%Flag=wrap + ->append(B,[C1],C);append(B,C1,C)))),!. + + /* + findall([E,Ad],member([Ad,[loop,E]],D),F), + foldr(put_sub_term_wa_ae,F, + A, C1), + (C1=A->C1=C; + find_groups_replace_loops(C1,C)). +*/ + + + +put_sub_term_wa_ae([E,A],B,C) :- + put_sub_term_wa(A,E,B,C),!. + + +order_preds_bottom_up_bfs(L,Functions,Ordered_pred_nums1,Ordered_pred_nums2) :- + +Functions=[[N,P]|F], + +append(Ordered_pred_nums1,[[L,N]],Ordered_pred_nums3), + order_preds_bottom_up1(L,P,F,Ordered_pred_nums3,Ordered_pred_nums2),!. + + +% program may have unconnected preds, causing a bug + +order_preds_bottom_up1(_,_,[],Ordered_pred_nums,Ordered_pred_nums) :- !. 
+order_preds_bottom_up1(L,N,Functions,Ordered_pred_nums1,Ordered_pred_nums2) :- +L3 is L+1, +append(Ordered_pred_nums1,[[L,N]],Ordered_pred_nums11), +% assign a level, record min, max levels +findall(Ordered_pred_nums8,(member(P2,N),member([P2,P3],Functions), +%member(P4,P3),%),Ordered_pred_nums1a), + +%foldr(append,[Ordered_pred_nums1,Ordered_pred_nums1a],Ordered_pred_nums1b), + +%findall(Ordered_pred_nums3, +%(member([N2,P4],Functions),%L2 is L-1, +delete(Functions,[P2,P3],Functions2),%member(P1,P), +order_preds_bottom_up1(L3,P3,Functions2,[],Ordered_pred_nums3), +foldr(append,Ordered_pred_nums3,Ordered_pred_nums5), +append([[L3,P3]],Ordered_pred_nums5,Ordered_pred_nums7), +append(Ordered_pred_nums11,Ordered_pred_nums7,Ordered_pred_nums8) + +),Ordered_pred_nums2),!.%),_Ordered_pred_nums4), +/* +% what about sort + +delete(Functions,[N,P],Functions2), +%*** +%trace, +%foldr(append,Ordered_pred_nums4,Ordered_pred_nums5), + + +findall(Ordered_pred_nums6, +(member([N2,P1],Functions),member(N,P1),L2 is L+1,delete(Functions2,[N2,P1],Functions3), +order_preds_bottom_up1(L2,N2,Functions3,[],Ordered_pred_nums6)),Ordered_pred_nums7), + +findall([N2,P1],(not(member([N2,P1],Functions2)), +member(N,P1)),Functions3), + +foldr(append,[Ordered_pred_nums1b,Ordered_pred_nums7],Ordered_pred_nums2). +*/ + +% delete unconnected preds +%order_preds_bottom_up(L100,Functions3,[],Ordered_pred_nums2) + + +%find bottom up order of preds + +% - delete 1 in 1-1 (cycle) + +%find_dependencies(SM1,[],Deps). + +%find_dependencies(SM1,Deps1,Deps2) :- + + + +%alg_to_modes(Ordered_pred_nums1,Functions1,Functions_with_modes1,Functions_with_modes2) :- + + + /* +alg_to_modes([],_,Functions_with_modes,Functions_with_modes) :- !. 
+alg_to_modes(Ordered_pred_nums1,Functions1,Functions_with_modes1,Functions_with_modes2) :- + Ordered_pred_nums1=[[_,Number]%Function1 + |Functions], + + (member([Number,Name,Arguments1,Symbol1,Body1],Function1) + ->%symbol(Symbol1,Symbol2), + ( + alg_to_modes3(Body1,%Body2, + %[],Body2, + [],%var Modes 1 + _Var_modes,%var Modes 2 + [],%Modes1 + Modes), + + % *** do header as well + + append(Functions_with_modes1,[[Number,Name,Arguments1,Symbol1,Modes]],Functions_with_modes5)), + alg_to_modes(Functions,Functions1,Functions_with_modes4,Functions_with_modes2). + */ +/* +alg_to_modes2(Body1,%Body2, + %[],Body2, + Var_modes1,%var Modes + Var_modes2,%var Modes + Modes1,%Modes1 + Modes2) :- + */ + + +insert_loop1([loop1,Ordered_pred_nums323],[],Ordered_pred_nums1b) :- + Ordered_pred_nums1b=[[loop1,Ordered_pred_nums323]],!. +insert_loop1([loop1,Ordered_pred_nums323],Ordered_pred_nums1,Ordered_pred_nums1b) :- + append(_,[Index],Ordered_pred_nums323), + ((append(A,B,Ordered_pred_nums1),append([[loop1,D]],C,B), + append(_,[Index],D))->(union(D,Ordered_pred_nums323,E),%subtract(Ordered_pred_nums1,[[loop1,D]],E), + foldr(append,[A,[[loop1,E]],C],Ordered_pred_nums1b)); + + ((append(A,B,Ordered_pred_nums1),append([Index],C,B))->(union([Index],Ordered_pred_nums323,E),%subtract(Ordered_pred_nums1,[Index],E), + foldr(append,[A,[[loop1,E]],C],Ordered_pred_nums1b)))),!. + diff --git a/luciancicd/find_dependencies2-cgpt1.pl b/luciancicd/find_dependencies2-cgpt1.pl new file mode 100644 index 0000000000000000000000000000000000000000..b57e4c5a711c496fc77ff3228e3ac705fee31650 --- /dev/null +++ b/luciancicd/find_dependencies2-cgpt1.pl @@ -0,0 +1,506 @@ +% O=cube,traverse1(O,A,B),object(O,_,A). 
+ + +a_to_m2(N1,Functions1,Pred_numbers, +Ordered_pred_nums1) :- +%trace, +%find pred nums in sm (done) + +% find occurrence of pred calls in preds regardless of clause, for finding modes in bottom-up order +% find order in terms of pred name, arity +% x modify find_pred_numbers + +find_pred_numbers_dependencies(Functions1,[],Functions2a,Pred_numbers), +%N=0, +%member([N,P],Functions2a), +%delete(Functions2a,[N,P],F), + +% order_preds_bottom_up1_post_order_dfs(_L1,[N],Functions2a,[N],Ordered_pred_nums0,[N]), +% Ordered_pred_nums0=[0, [1, 2, [loop, 1]]] +% In test 7, query box (predicate 0) and predicate 1 are called, where predicate 1 calls predicate 2 and itself. + + +%order_preds_bottom_up1_post_order_dfs(1,Functions2a,[],Ordered_pred_nums0), +%trace, + + find_deps(N1,Functions2a,Ordered_pred_nums1). + + +%dfs_post_order(N1,Functions2a,[],Ordered_pred_nums1,[],_All1). + + +%order_preds_bottom_up1_post_order_dfs(_L1,[N],Functions2a,[],Ordered_pred_nums01,[N]). + + + +object(cube,[["yellow",["red","blue"]],["red",[]],["blue",[]],[0,["yellow","green"]],["green",["purple"]],["purple",[]]], +["red", "blue", "yellow", "purple", "green", 0]). + +object(cube1,[[0,[0]]], +[0]). + +object(cube2,[[0,["red"]],["red",[0]]], +[[loop1, [0, "red"]]]). + +object(cube3,[["yellow",["red","blue"]],["red",[]],["blue",[]],[0,["yellow","green"]],["green",["purple"]],["purple",[0]]], +["red","blue","yellow",[loop1,[0,"green","purple"]]]). + +object(cube4,[[0,["red","blue"]],["red",[0]],["blue",[0]]], +[[loop1, [0, "blue", "red"]]]). + +object(cube5,[[0,["red","blue"]],["red",[0]],["blue",[]]], +["blue", [loop1, [0, "red"]]]). + + +object(cube6,[[0,["red","blue"]],["red",["orange"]],["orange",[0]],["blue",[]]], +["blue", [loop1, [0, "orange", "red"]]]). + +object(cube7,[[0,["red"]],["red",["orange"]],["orange",[0]]], +[[loop1, [0, "orange", "red"]]]). 
+ + +object(t15,[[0, [1]], [1, [3, 4]], [2, []], [3, [2]], [4, [2, 6, 7]], [5, []], [6, [5, 8, 9, 10, 11, 24]], [7, [6, 7, 8, 9, 10, 11]], [8, [16, 17]], [9, [13, 14]], [10, [19, 20]], [11, [3, 4]], [12, []], [13, [12, 21, 22]], [14, [13, 14]], [15, []], [16, [15, 23]], [17, [16, 17]], [18, []], [19, [18, 21, 22]], [20, [19, 20]], [21, [24]], [22, [24]], [23, [24]], [24, []]], + +[18, 24, 22, 21, 19, 20, 10, 12, 13, 14, 9, 5, 23, 15, 16, 17, 8, [loop1, [6, 7, 11, 4]], 2, 3, 1, 0] +). + +object(t151,[[0, [6, 7]], [6, []], [7, [6, 7]]], +[6,7,0]). + +object(t152,[[0, [6, 7]], [6, []], [7, [6, 7,0]]], +[6, [loop1, [0, 7]]]). + +object(t153,[[0, [6, 7]], [6, []], [7, [8, 7,0]],[8,[6]]], +%[6, [loop1, [0, 7]]]). +[6,8, [loop1, [0, 7]]]). + +%traverse(cube,Items,A). +% L = ["red", "blue", "yellow", "purple", "green", "white"]. + +%/* + + +traverse%(%Object, +%Items11%,All1 +%) + :- + findall(Items1,(object(Object,Items2,Items1), + %writeln1(dfs_post_order("white",Items2,[],Items1,[],All1)), + dfs_post_order0(0,Items2,[],Items12,[],_All1), + (Items1=Items12->S=success;S=fail), + nl,writeln1([Object,S,"\n",Items1,"\n",Items12])),Items11), + writeln1(Items11). + +traverse1(Object,Items1,All1) :- + object(Object,Items2,_), + dfs_post_order0(0,Items2,[],Items1,[],All1). 
+ +%*/ + +dfs_post_order0(Curr,Items_all,Items2,Items31,Items2_all1,Items3_all1) :- + +dfs_post_order(Curr,Items_all,Items2,Items31,Items2_all1,Items3_all1) +/*, + + +list_to_set(Items32,Ordered_pred_nums14), +%reverse(Ordered_pred_nums12,Ordered_pred_nums14), +reverse(Ordered_pred_nums14,Ordered_pred_nums141), +remove_dups_from_loops(Ordered_pred_nums141,Ordered_pred_nums151), +reverse(Ordered_pred_nums151,Ordered_pred_nums15), +%trace, +findall(Ordered_pred_nums19,(member(Ordered_pred_nums16,Ordered_pred_nums15), +(Ordered_pred_nums16=[loop1,Ordered_pred_nums17]->(list_to_set(Ordered_pred_nums17,Ordered_pred_nums18),Ordered_pred_nums19=[loop1,Ordered_pred_nums18]);Ordered_pred_nums19=Ordered_pred_nums16 +)),Ordered_pred_nums20), + +%delete(Ordered_pred_nums20,loop,Ordered_pred_nums21), + + findall(E,(member(F,Ordered_pred_nums20), + (F=[loop1,[A1]]->E=A1;E=F)),Items31), + +! +*/ +. +dfs_post_order(Curr,Items_all,Items2,Items31,Items2_all1,Items3_all1) :- + member([Curr,Items],Items_all), + %delete(Items_all,[Curr,Items],Items_all1), + Items_all=Items_all1, + dfs_post_order2(Curr,Items,Items_all1,[] + ,Items6,Items2_all1,Items6_all1), + + %trace, + (member(Curr,Items6_all1)-> + (Items6=Items3, + Items6_all1=Items3_all1); + (%trace, + append(Items6,[Curr],Items3), + append(Items6_all1,[Curr],Items3_all1))), + + %trace, + findall(C,(member(C,Items3),not(C=[loop1,_])),A1), + findall([loop1,A],member([loop1,A],Items3),A2), + append(A1,A2,A3), + append(Items2,A3,Items31), + !. +dfs_post_order([],_,Items2,Items2,Items2_all1,Items2_all1) :- !. + +dfs_post_order2(_,[],_,Items,Items,Items_all1,Items_all1) :- !. +dfs_post_order2(_,_,[],Items,Items,Items_all1,Items_all1) :- !. 
+dfs_post_order2(Curr,Item7,Items_all,Items2,Items8,Items2_all1,Items8_all1) :- + Item7=[Item1|Items3], + %if item1 has been covered, end + (member(Item1,Items2_all1)-> + (Items2=Items8, + Items2_all1=Items8_all1); + + (%trace, + (contains_loop_dfs1(Curr,Item1,Items_all,[%Item1%Curr, + %cycle(Item1, Cycles,Noncycles) :- + + ],Items41,[],Not_on_line) + %,writeln([not_on_line,Not_on_line]) + )-> + (append(Items3,Not_on_line,Items31), + %if loop item + %dfs_post_order(Item1,Items_all,[],Items41,Items2_all1,Items4_all1), + length(Items41,L), + (L=<1->append(Items2,Items41,Items4); + ( + %trace, + + sub_term_wa([loop1,_],Items2,A), + findall(C,(member(C,A),C=[_Add,[loop1,B]],not(intersection(Items41,B)=[])),D), + + findall(F,member([_,[_,F]],D),G), + append([Items41],G,J), + foldr(append,J,J1), + sort(J1,H), + + findall(K,member([K,_],D),M), + delete_sub_term_wa(M, Items2, E), + + append(E,[[loop1,H + ]],Items4) + + %*/ + )), + %append(Items2,[[loop1,Items41 + %]],Items4) + %), + append(Items2_all1,Items41,Items43_all1 + ), + sort(Items43_all1,Items4_all1) + ); + %if non loop item + (Items3=Items31, + dfs_post_order(Item1,Items_all,Items2,Items4,Items2_all1,Items4_all1))), + dfs_post_order2(Curr,Items31,Items_all,Items4,Items8,Items4_all1,Items8_all1)),!. + +contains_loop_dfs1(Curr1,Curr,Items_all +,Items2_all1,Items61_all1,Not_on_line1,Not_on_line21) :- +contains_loop_dfs(Curr1,Curr,Items_all +,Items2_all1,Items6_all1,Not_on_line1,Not_on_line2), +%not +%trace, +subtract(Not_on_line2,Items6_all1,Not_on_line21), +subtract(Items6_all1,Not_on_line21,Items61_all1), +(member(Curr,Items61_all1)%->true; +%member(Curr,Items6_all1) +),!. 
+ +contains_loop_dfs(Curr1,Curr,Items_all +,Items2_all1,Items6_all1,Not_on_line1,Not_on_line2) :- + member([Curr,Items],Items_all), + + (member(Curr,Items2_all1)-> + (Items2_all1=Items7_all1, + Not_on_line1=Not_on_line3); + %findall(Items7_all1, + contains_loop_dfs2(Curr1,Items,Items_all%,Items2,Items6 + ,Items2_all1,Items7_all1,Not_on_line1,Not_on_line3)), + %),Items71_all1), + %trace, + %/* + + %trace, + %writeln(lead_to_empty_list(Items,Items_all,[],_)), + (true%,lead_to_empty_list(Items,Items_all,[],_) + ->( + %writeln(lead_to_empty_list(Items,Items_all,[],_)), + %trace, + append([Curr|Items],Not_on_line3,Not_on_line2) + %Items7_all1=Items6_all1 + ); + Not_on_line3=Not_on_line2), + + (member(Curr,Items7_all1)-> + Items7_all1=Items6_all1; + append(Items7_all1,[Curr],Items6_all1))%); + %Items2_all1=Items6_all1), + %*/ + %flatten(Items6_all1,Items61_all1), + %append(Items6_all1,[Curr],Items3_all1), + %append(Items6,[Curr],Items3), + . +%contains_loop_dfs([],_,Items2,Items2,Items2_all1,Items2_all1) :- !. + +% collect loops, path in cont loop + +% list all arms of loop + +contains_loop_dfs2(_C,[],_%,Items,Items +,A,A,B,B%Items_all1,Items_all1 +% +) :- %append(B1,[C],B2), +!%,fail +. +contains_loop_dfs2(Curr,Item7,Items_all%,Items2,Items8 +,Items2_all1,Items8_all1,Not_on_line1,Not_on_line2 +) :- +%trace, + (member(Curr,Item7)->%->fail; + %Items2_all1=Items6_all1;%->true;member(Curr1,) + %trace, + %* find 8 etc with lead to empty list + (member(Curr,Items2_all1)-> + (Items2_all1=Items8_all1, + Not_on_line1=Not_on_line2 + ); + (append(Items2_all1,[Curr],Items8_all1), + Not_on_line1=Not_on_line2)); + +% !. 
+%contains_loop_dfs2(Curr,Item7,Items_all%,Items2,Items8 +%,Items2_all1,Items8_all1 +%) :- +%writeln1(contains_loop_dfs2(Curr,Item7,Items_all%,Items2,Items8 +%,Items2_all1,Items8_all1)), + %findall([Items4_all1,Items3],( + %member(Item1,Item7),% + ( + %(intersection(Items0,[Curr],[])-> + + + Item7=[Item1|Items3], + %trace, + %(Item1=Item7-> + + %(member(Item1,Items2_all1)-> + %Items2_all1=Items62_all1; + %append(Items2_all1,[Item1],Items62_all1)); + + %delete(Item7,Item1,Items3), + %if item1 has been covered, end + + %( + contains_loop_dfs(Curr,Item1,Items_all%,Items2,Items4 + ,Items2_all1 + ,Items6_all1,Not_on_line1,Not_on_line3),%->true;*) + + %append()%),Items5_all1), + %findall(A,member([A,_],Items5_all1),Items51_all1), findall(A,member([_,A],Items5_all1),Items31), + %flatten(Items31,Items32), + %append(Items2_all1,Items4_all1,Items6_all1),%->true; + flatten(Items6_all1,Items61_all1), + %writeln(Items61_all1), + ((%false, + Items2_all1=Items61_all1 + %member(Item1,Items61_all1)%true%trace, + %intersection(Item7,Items61_all1,[]) + )-> + + %member(Item1,Items61_all1)-> + (Items61_all1=Items8_all1, + Not_on_line3=Not_on_line2); + %append(Items61_all1,[Item1],Items62_all1)), + contains_loop_dfs2(Curr,Items3,Items_all%,Items4,Items8 + ,Items61_all1,Items8_all1,Not_on_line3,Not_on_line2))))%; + + %(%member(Items61_all1,Items61_all1)-> + %Items61_all1=Items8_all1 + %append([],%Items61_all1, + %Item7,Items63_all1), + %sort(Items63_all1,Items8_all1) +%)))) + . + +lead_to_empty_list([],_,A,A) :- !. +lead_to_empty_list(Items,Items_all,A,B) :- + Items=[Items1|Items2], + (member(Items1,A)->fail; + (append(A,[Items1],C), + member([Items1,Items3],Items_all), + lead_to_empty_list(Items3,Items_all,C,D), + lead_to_empty_list(Items2,Items_all,D,B))). + + +% cycles +path(Node, Node, Tree, Cycles1, A4) :- + findall(Cycles2,path00(Node, Node, Tree, Cycles1, Cycles2),Cycles3), + flatten(Cycles3,A3),list_to_set(A3,A4), + not(A4=[]),!. 
+ + +% Predicate to check if there's a path from Start to End +path00(Start, End, Tree, Visited, [End|Visited]) :- + member([Start, Ends], Tree), + member(End, Ends), + \+ member(End, Visited). + +path00(Start, End, Tree, Visited, Path) :- + member([Start, Ends], Tree), + member(Next, Ends), + %edge(Start, Next), + \+ member(Next, Visited), + path00(Next, End, Tree, [Next|Visited], Path). + +% Predicate to check if there's a cycle starting and ending at Node +cycle(Node, Tree,Cycles,Noncycles) :- + % cycles + %trace, + path(Node, Node, Tree, [], Cycles), + % non-cycles + %trace, + path1([Node],%Cycles, + Node, + Tree,[],A1), + flatten(A1,A2), + list_to_set(A2,A3), + subtract(A3,Cycles,Noncycles). + +% non-cycles +path1(Cycles,First,Tree,A0,A5) :- +%trace, + findall(A1,path10(Cycles,First,Tree,A0,A1),A2), + flatten(A2,A3),list_to_set(A3,A4), + subtract(A4,[First],A5) + %A4=A5 + . + +%path10(Stop,First,_Tree,A,A) :- + %member(First,Stop), + %*append(A0,[First],A1), + %!. +path10(_Stop,First,Tree,A0,A1) :- + member([First,[]],Tree), + %trace, + %member(Second1,Second), + append(A0,[First],A1). + %path1(Second1,A,Tree,A2,A1). +path10(Stop,First,Tree,A0,A1) :- + member([First,Second],Tree), + member(Second1,Second), + append(A0,[First],A2), + append(Stop,[First],Stop1), + (member(Second1,Stop) + %(Second1=Stop + ->A2=A1; + path10(Stop1,Second1,Tree,A2,A1)). 
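`path/5` and `path00/5` above enumerate the paths that lead back to the start node; the union of the nodes on those paths is what `cycle/4` reports as `Cycles`. A minimal Python sketch of the same idea (the helper name and dict representation are mine, not the file's):

```python
# Sketch: every node lying on some cycle through `start`, by DFS over a
# graph given as {node: [successors]} (cf. path/5 and path00/5 above).
def cycle_members(tree, start):
    members = set()
    def dfs(node, path):
        for nxt in tree.get(node, []):
            if nxt == start:
                members.update(path + [node])   # closed a cycle back to start
            elif nxt not in path and nxt != node:
                dfs(nxt, path + [node])         # extend the acyclic prefix
    dfs(start, [])
    return members
```

For example, `cycle_members({0: ["red", "blue"], "red": [0], "blue": []}, 0)` returns `{0, "red"}`, the members of the loop in test 7, while an acyclic graph yields the empty set.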
+
+
+% find_deps/3 accepts either a single start node or a list of start nodes
+find_deps(Nodes0,Tree,Deps) :-
+ (is_list(Nodes0)->Nodes=Nodes0;Nodes=[Nodes0]),
+ findall(Deps2,(member(Node,Nodes),
+ cycle1(Node,Tree,[],Deps2)),Deps1a),
+ foldr(append,Deps1a,Deps1),
+
+
+list_to_set(Deps1,Ordered_pred_nums14),
+%reverse(Ordered_pred_nums12,Ordered_pred_nums14),
+reverse(Ordered_pred_nums14,Ordered_pred_nums141),
+remove_dups_from_loops(Ordered_pred_nums141,Ordered_pred_nums151),
+reverse(Ordered_pred_nums151,Ordered_pred_nums15),
+%trace,
+findall(Ordered_pred_nums19,(member(Ordered_pred_nums16,Ordered_pred_nums15),
+(Ordered_pred_nums16=[loop1,Ordered_pred_nums17]->(list_to_set(Ordered_pred_nums17,Ordered_pred_nums18),Ordered_pred_nums19=[loop1,Ordered_pred_nums18]);Ordered_pred_nums19=Ordered_pred_nums16
+)),Ordered_pred_nums20),
+
+%delete(Ordered_pred_nums20,loop,Ordered_pred_nums21),
+
+ findall(E,(member(F,Ordered_pred_nums20),
+ (F=[loop1,[A1]]->E=A1;E=F)),Deps2),
+ delete(Deps2,[loop1,[]],Deps).
+
+cycle1(Node,Tree,Deps1,Deps2) :-
+ (cycle(Node, Tree,Cycles,Noncycles)->
+ (%append(Deps1,[[loop1,Cycles]],Deps3),
+ %findall(A,(member(A,Noncycles),
+ cycle2(Noncycles,Tree,% Cycles1,Noncycles1,
+ Deps1,Deps3),
+ (Cycles=[A]->
+ (member(A,Deps3)->Deps3=Deps2;
+ append(Deps3,[A],Deps2));
+ append(Deps3,[[loop1,Cycles]],Deps2)));
+ (% find noncycles
+ %trace,
+ path1([Node],Node,Tree,[],Noncycles),
+ %append(Deps1,[Node],Deps3),
+ cycle2(Noncycles,Tree,% Cycles1,Noncycles1,
+ Deps1,Deps3),
+ (member(Node,Deps3)->
+ Deps3=Deps2;
+ append(Deps3,[Node],Deps2))
+ )),!.
+
+cycle2([], _,Deps,Deps) :- !.
+cycle2(Noncycles, Tree,Deps1,Deps2) :-
+ Noncycles=[A|B],
+ cycle1(A,Tree,Deps1,Deps3),
+ (member(A,Deps3)->Deps3=Deps31;
+ append(Deps3,[A],Deps31)),
+ cycle2(B, Tree,Deps31,Deps2).
+
+test(N) :-
+ tests(Tests),
+ member([N,Tree,Ans],Tests),
+ find_deps(0,Tree,Deps),
+ (Deps=Ans->S=success;S=fail),
+ %nl,
+writeln1([N,S,"\n",result,Deps,"\n",answer,Ans]),!.
+ +tests([ + [1,[[0,[]]], + [0] ], + [2,[[0, [0]]], + [0] ], + [3, + [["yellow",["red","blue"]],["red",[]],["blue",[]],[0,["yellow","green"]],["green",["purple"]],["purple",[]]], +["red", "blue", "yellow", "purple", "green", 0] + ], + [4,[[0,["red"]],["red",[0]]], +[[loop1, [0, "red"]]]], + [5,[["yellow",["red","blue"]],["red",[]],["blue",[]],[0,["yellow","green"]],["green",["purple"]],["purple",[0]]], + ["red","blue","yellow",[loop1,[0,"purple","green"]]]], + + [6,[[0,["red","blue"]],["red",[0]],["blue",[0]]], +[[loop1, [0,"red","blue"]]]], + + [7,[[0,["red","blue"]],["red",[0]],["blue",[]]], +["blue", [loop1, [0, "red"]]]], + + [8,[[0,["red","blue"]],["red",["orange"]],["orange",[0]],["blue",[]]], + ["blue", [loop1, [0, "orange", "red"]]]], + +[9,[[0, [1]], [1, [3, 4]], [2, []], [3, [2]], [4, [2, 6, 7]], [5, []], [6, [5, 8, 9, 10, 11, 24]], [7, [6, 7, 8, 9, 10, 11]], [8, [16, 17]], [9, [13, 14]], [10, [19, 20]], [11, [3, 4]], [12, []], [13, [12, 21, 22]], [14, [13, 14]], [15, []], [16, [15, 23]], [17, [16, 17]], [18, []], [19, [18, 21, 22]], [20, [19, 20]], [21, [24]], [22, [24]], [23, [24]], [24, []]], + +[2,3,5,15,24,23,16,17,8,12,21,22,13,14,9,18,19,20,10,[loop1,[7,4,11,6]],1,0]], + +[10,[[0, [6, 7]], [6, []], [7, [6, 7]]], +[6,7,0]], + +[11,[[0, [6, 7]], [6, []], [7, [6, 7,0]]], +[6, [loop1, [0, 7]]]], + +[12,[[0, [6, 7]], [6, []], [7, [8, 7,0]],[8,[6]]], +[6,8, [loop1, [0, 7]]]]] +). +tests :- + tests(Tests), + findall(_,( + member([N,Tree,Ans],Tests), + find_deps(0,Tree,Deps), + (Deps=Ans->S=success;S=fail), + nl, writeln1([N,S,"\n",result,Deps,"\n",answer,Ans])),_),!. 
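The answers in the `tests/1` table above have a characteristic shape: callees before callers, with mutually recursive predicates collapsed into a single `[loop1, Members]` group. That is the same shape as the condensation produced by Tarjan's strongly connected components algorithm, which emits each component only after every component it calls. The sketch below is not the file's algorithm, just a compact reference implementation (names are mine) that reproduces several of the test answers; member order inside a loop group may differ from the Prolog's.

```python
# Sketch: Tarjan's SCCs over {caller: [callees]}, emitted callees-first.
# A multi-predicate component becomes ["loop1", members], mirroring the
# result shape of find_deps/3.
def bottom_up_order(graph, start):
    index, low, on_stack = {}, {}, set()
    stack, order, counter = [], [], [0]

    def strongconnect(v):
        index[v] = low[v] = counter[0]
        counter[0] += 1
        stack.append(v)
        on_stack.add(v)
        for w in graph.get(v, []):
            if w not in index:
                strongconnect(w)
                low[v] = min(low[v], low[w])
            elif w in on_stack:                 # back edge: w is on a cycle with v
                low[v] = min(low[v], index[w])
        if low[v] == index[v]:                  # v is the root of a component
            scc = []
            while True:
                w = stack.pop()
                on_stack.discard(w)
                scc.append(w)
                if w == v:
                    break
            scc.reverse()
            order.append(scc[0] if len(scc) == 1 else ["loop1", scc])

    strongconnect(start)
    return order
```

For test 7, `bottom_up_order({0: ["red", "blue"], "red": [0], "blue": []}, 0)` gives `["blue", ["loop1", [0, "red"]]]`, and for test 2's self-loop `[[0, [0]]]` it gives `[0]`, matching the expected answers.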
+
+
+
diff --git a/luciancicd/find_tests_from_repos.pl b/luciancicd/find_tests_from_repos.pl
new file mode 100644
index 0000000000000000000000000000000000000000..94b1a34b8bca9ce42e3df4731af52a5b58e3ee13
--- /dev/null
+++ b/luciancicd/find_tests_from_repos.pl
@@ -0,0 +1,478 @@
+/*
+
+Find tests from repos:
+- converts pl to lp
+- finds the list of predicate names
+- finds examples (egs) in comments
+
+- finds an example such as
+
+% beginnings_middles_ends(10,5,BME).
+% [[1,[1,2,3,4,5]],[2,[1,2,3,4,5]],[3,[1,2,3,4,5]],[4,[1,2,3,4,5]],[5,[1,2,3,4,5]],[6,[1,2,3,4,5]],[7,[1,2,3,4,5]],[8,[1,2,3,4,5]],[9,[1,2,3,4,5]],[10,[1,2,3,4,5]]]
+
+or
+
+% phrase(sentence(A),[the,john,read,to,sera]).
+
+- extracts the query and answers
+
+- warns if no answers are given
+
+- warns if the file can't be converted
+
+- moves the old cicd.txt to a new location
+
+- later: find whether all given predicates have examples,
+- whether predicates called by predicates have examples (warn if they don't),
+- and warn if there is no example at all
+
+* which file should it load?
+
+*/
+
+:- use_module(library(date)).
+
+%:-include('luciancicd.pl').
+%:-include('../Prolog-to-List-Prolog/p2lpconverter.pl').
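+
+% Sketch of the extracted test format (see find_tests2/4 below): a
+% comment pair such as
+%   %r(NA).
+%   %NA=2.
+% in "Repository/file.pl" is collected as an entry of the form
+%   ["Repository","file.pl","(r(NA),NA=2)"]
+% i.e. the example query conjoined with its expected bindings.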
+ +find_tests_from_repos :- + +working_directory1(A,A), + +(exists_directory('../private2/luciancicd-cicd-tests')->true;make_directory('../private2/luciancicd-cicd-tests')), + +repositories_paths(K), + +omit_paths(Omit), + +%findall(Omit1,(member(Omit2,Omit),atom_string(Omit1,Omit2)),Omit3), +findall([K1,G4],(member(K1,K), directory_files(K1,F), + delete_invisibles_etc(F,G), + +%findall(H,(member(H,G),not(string_concat("dot",_,H)), + +subtract(G,Omit,G1), + +findall(G3,(member(G2,G1),string_concat(G2,"/",G3)),G4) +%not(member(G,Omit)) + +),K01), +%trace, +%foldr(append,K0,K01), + +working_directory1(Old_D,Old_D), +%trace, +findall(Tests1,(member([D,K31],K01), + +working_directory1(_,Old_D), + +working_directory1(_,D), + +%member(K2,K31), + +%exists_directory(K2), + +process_directory_tests(K31,%_G, + %Omit,% + true, + Tests1)%),Tests) + ),Tests2), + foldr(append,Tests2,Tests), + + working_directory1(_,A), + + (exists_directory('../private2/luciancicd-cicd-tests')-> + + ( time1(Time), + foldr(string_concat,["../private2/luciancicd-cicd-tests",Time,"/"],Folder1), + %concat_list3(File1,[".txt"],File2), + +mv_lc("../private2/luciancicd-cicd-tests/",Folder1) + %foldr(string_concat,["rsync -av --exclude=\".*\" ../private2/luciancicd-cicd-tests/ ",Folder1],Command314), + %catch(bash_command(Command314,_), _, (foldr(string_concat,["Warning."%%"Error: Can't clone ",User3,"/",Repository3," repository on GitHub."],Text41),writeln1(Text41),abort)) + ); + ( + %exists_directory('../private2/luciancicd-cicd-tests')->true; +%make_directory('../private2/luciancicd-cicd-tests') +true)), + +findall(_,(member([K21,Tests521],Tests), +open_s(K21,write,S21), +write(S21,Tests521),close(S21) +),_), +%writeln("All tests were successful."), + +!. 
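+
+% find_tests_from_repos/0 (sketch of the flow above): for each
+% configured repository path, minus the omitted directories, it
+% collects tests with process_directory_tests/3, archives the
+% previous ../private2/luciancicd-cicd-tests/ folder under a
+% timestamped name, and writes one tests_<Repository>.txt file per
+% repository.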
+ +process_directory_tests(K,%G, + Top_level,%Tests1, + Tests61) :- + +%G=K, +%/* +findall(K4,(member(K1,K), %exists_directory(K1), +directory_files(K1,F), + delete_invisibles_etc(F,G), +%*/ +findall(Tests3,(member(H,G),not(string_concat("dot",_,H)), + +%not(member(H,Omit)), + + +foldr(string_concat,[K1,H],H1), + +% if a file then find modification date +% if a folder then continue finding files in folder +(exists_directory(H1)-> + +(string_concat(H1,"/",H2), +process_directory_tests([H2],%[H], + false,%[],%Omit % only omit top level dirs xx + %Tests1, + Tests3) + %foldr(append,Tests31,Tests3) + ); + +(string_concat(_,".pl",H)-> + +( + +find_tests(K1,H,H1,Tests3) +%p2lpconverter([file,H1],LP), + +%time_file(H1,Tests4), +%trace, +%append(Tests1,[[H1,Tests4]],Tests3))) +%Tests3=[[H1,Tests]] +) +; +(Tests3=[] +) +)) + +),Tests5),%trace, +foldr(append,Tests5,Tests51), + +%Tests5=Tests51, + +(%true% +Top_level=true%not(Omit=[]) % at top level +-> +( +findall([T1,',\n'],(member([T,TT,TTT],Tests51),foldr(atom_concat,["[","\"",T,"\"",",","\"",TT,"\"",",",TTT,"]"],T1)%term_to_atom(T,T1) +),T2), +flatten(T2,TT2), +foldr(atom_concat,TT2,T21), +(T2=[]->T6=[];(string_concat(T4,T5,T21),string_length(T5,2), +foldr(string_concat,["[","\n",T4,"\n","]"],T6))), +%term_to_atom(Tests51,Tests52), +string_concat(K3,"/",K1), +foldr(string_concat,["../private2/luciancicd-cicd-tests/tests_",K3,".txt"],K2), +K4=[K2,T6] +%open_s(K2,write,S), +%write(S,Tests52),close(S) + +%writeln(["*",K2, +%Tests52] +% +); +(%trace, +K4=Tests51) +) + + + +),Tests6), + +(%not(Omit=[])-> +Top_level=true-> +Tests6=Tests61; +(%trace, +foldr(append,Tests6,Tests61))), + +!. 
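+
+% process_directory_tests/3 (sketch): recurses into each
+% subdirectory, calls find_tests/4 on every .pl file, and at the top
+% level formats the collected tests for writing to
+% ../private2/luciancicd-cicd-tests/tests_<Repository>.txt.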
+ +/* +test_a(Tests) :- + +working_directory1(A,A), + +working_directory1(_,'../../GitHub2/'), + +[K1,H,_H1]=["Philosophy/", "4 2 23.pl", "Philosophy/4 2 23.pl"], + string_concat(K11,"/",K1), + +LP=[[[n, c], ["%r(NA)."]], [[n, c], ["%NA=2."]]%, [[n, c], ["% r([a],N2)."]], [[n, c], ["% N2 = 3."]] +,[[n,r]] +], + +(find_tests2(H,K11,LP,Tests)->true;working_directory1(_,A)). +*/ + +find_tests(K1,H,H1,Tests) :- + + string_concat(K11,"/",K1), + + +%catch(call_with_time_limit(0.005, +%trace, +%p2lpconverter([file,H1],LP),%),_,false), + fastp2lp(H1,LP1), +%trace, + find_tests2(%H1, + H,K11,LP1,Tests). + +fastp2lp(H1,LP1) :- +%trace, +(string_concat(_,".pl",H1) +-> +(time_file(H1,T), +open_string_file_s(H1,F), +remove_end_comments1(H1,F2), +%string_concat(F1,"\n% ",F)->true;%F2 = F; +%(string_concat(F,"\n% ",F2), + +%atomic_list_concat(F3,'\n\n',F2), +%atomic_list_concat(F3,'',F4), + +save_file_s(H1,F2) +,set_time_file(H1,[],[modified(T)]) + +);true), + +foldr(string_concat,["#!/usr/bin/swipl -g main -q\n\n",":-include('../GitHub/Prolog-to-List-Prolog/p2lpconverter.pl').\n","handle_error(_Err):-\n halt(1).\n","main :-\n catch((p2lpconverter([file,\"",H1,"\"],LP),term_to_atom(LP,LP1), write(LP1)),Err, handle_error(Err)), nl,\n halt.\n","main :- halt(1).\n"],String), + +%trace, + fastp2lp1(String,LP1). + +fastp2lp2(H1,LP1) :- + +%string_concat(H10,"\n%",H1), +%pwd, +foldr(string_concat,["#!/usr/bin/swipl -g main -q\n\n",":-include('../GitHub/Prolog-to-List-Prolog/p2lpconverter.pl').\n","handle_error(_Err):-\n halt(1).\n","main :-\n catch((p2lpconverter([string,\"",H1,"\"],LP),term_to_atom(LP,LP1), write(LP1)),Err, handle_error(Err)), nl,\n halt.\n","main :- halt(1).\n"],String), + +working_directory1(A,A), + +%writeln([*,A]), + +working_directory1(_,"../"), + + + fastp2lp1(String,LP1), + + working_directory1(_,A). 
+ +fastp2lp1(String,LP1) :- + +%trace, +%working_directory1(_,A), +foldr(string_concat,[%"../private2/luciancicd-testing/",Repository1b,"/",Go_path5, +"tmp.pl"],GP), +%string_concat(Go_path,"testcicd.pl",GP), +open_s(GP,write,S1), +write(S1,String),close(S1), +foldr(string_concat,["chmod +x ",GP,"\n","swipl -g main -q ./",GP],S3),%, + +(catch(bash_command(S3,LP), _, (foldr(string_concat,["Warning."%%"Error: Can't clone ",User3,"/",Repository3," repository on GitHub." + ],_Text4),%writeln1(Text4), + fail%abort + ))-> + ( +%trace, working_directory1(A,A), +% writeln([*,A]), + +delete_tmp, + term_to_atom(LP1,LP) + );(%writeln("Fatal error on converting Prolog to List Prolog."), + delete_tmp,fail)). + +delete_tmp:- +foldr(string_concat,[%"scp -pr ../../Github_lc/ ", + "rm -f tmp.pl" + %Folder1 + ],Command315), + catch(bash_command(Command315,_), _, (foldr(string_concat,["Warning."%%"Error: Can't clone ",User3,"/",Repository3," repository on GitHub." + ],_Text42)%,writeln1(Text42)%,abort + )), + +foldr(string_concat,[%"scp -pr ../../Github_lc/ ", + "rm -f luciancicd/tmp.pl" + %Folder1 + ],Command316), + catch(bash_command(Command316,_), _, (foldr(string_concat,["Warning."%%"Error: Can't clone ",User3,"/",Repository3," repository on GitHub." + ],_Text42)%,writeln1(Text42)%,abort + )). + %,writeln1(Result2) + +/* +find_tests21(%H1, +H,K11,LP,Tests) :- + + %foldr(string_concat,[H1%,"/",K11 + %],K12), + %trace, + findall(N1,(member([[n,N]|_],LP), + string_strings(N,N1)),Ns), + + findall([K11,H,F2],(append(_,LP1,LP), + append([[[n,comment%c + ],[Comment]]],LP2,LP1), + + %findall([K11,H,F2],(member([[n,comment%c + %],[Comment]],LP), + + + string_strings(Comment,C), + member(N2,Ns), + append(_A,B,C), + append(N2,Dx,B), + %trace, + append(Ex,Dx1,Dx), + %append(_Ex1,Dx2,Dx1), + append(["."],_N21,Dx1), + %trace, + flatten([N2,Ex%,"." + ],N2Ex), + foldr(string_concat,N2Ex,F), + + % the answer is A= ... "." 
or " " + % in this, just 1 answer + %trace, + reverse(Ex,D),append(E2,_E3,D),reverse(E2,E31),(append([","],E5,E31)->true;append(["("],E5,E31)),append(E6,E7,E5),append([")"],_,E7), + %trace, + %member(N21,Ns), + + append([[[n,comment%c + ],[Comment1]]],_,LP2), + + %member([[n,comment%c + %],[Comment1]],LP), + + string_strings(Comment1,C1), + + append(_A1,Bx,C1), + append(E6,Dxx,Bx), + append(E61,Dxx1,Dxx), + %trace, + (append(["."],_Exx,Dxx1)%->true; + %append([],Exx,Dxx1) + ), + %trace, + %writeln([_A1,Bx,E6,Dxx,E61,Dxx1]), + %flatten([]) + foldr(string_concat,E61,E612), + sub_string(E612,_,_,_,"="), + %trace, + flatten([E6,E612%,Exx%,"." + ],E6Exx), + foldr(string_concat,E6Exx,F1), + %trace, + %term_to_atom((F,F1),F00), + %term_to_atom(F0,F00), + %term_to_atom(F10,F1), + +%trace, + foldr(string_concat,["(",F,",",F1,")"],F2), + foldr(string_concat,["a:-",F2,"."],F3), + working_directory1(WD1,WD1), + home_dir1(HD), + working_directory1(_,HD), + fastp2lp2(F3,_), + working_directory1(_,WD1) + %atom_string(F0,F), + %atom_string(F10,F1), + %F2=%F0% + %(F0,F10) + ),Tests1),sort(Tests1,Tests),!. 
+*/ + + % is K11 just d or gh/d x d + +%[K11,H,] +%[["d","a.pl",(a(B),B=1)]] +%/* +find_tests2(H,K11,LP,Tests) :- + + %findall(N1,(member([[n,N]|_],LP), + %string_strings(N,N1)),Ns), + working_directory1(BHD,BHD), +home_dir1(HD), + working_directory1(_,HD), + + findall([K11,H,F2],(append(_,LP1,LP), + append([[[n,comment%c + ],[Comment]]],LP2,LP1), + +%findall(Tests1,(member([[n,comment],[A]],LP), +string_strings(Comment,A12), +not(member("=",A12)), +truncate_comment(A12,A11), +reverse(A11,A13), +truncate_full_stop(A13,A14), +reverse(A14,A15), +foldr(string_concat,A15,A16), +%term_to_atom(A1,A16), +%trace, +((not(member("(",A12))->true;(no_vars(A16)->true; +string_concat("not",_,A16)))-> +foldr(string_concat,[%"(", +A16],A1); +( + append([[[n,comment%c + ],[Comment1]]],_,LP2), + +string_strings(Comment1,A121), +member("=",A121), +truncate_comment(A121,A111), +reverse(A111,A131), +truncate_full_stop(A131,A141), +reverse(A141,A151), +foldr(string_concat,A151,A161), +%term_to_atom(A110,A161), + +foldr(string_concat,[%"(", +A16,",",A161 +%,")" +],A1))), +check_non_var0(A1,F21), +foldr(string_concat,["(", +F21,")"],F2) +),Tests1),sort(Tests1,Tests), + working_directory1(_,BHD), +!. + +/* +findall(B,(member([[n,comment],[A]],LP), +string_strings(A,A12), +truncate_comment(A12,A11), +reverse(A11,A13), +truncate_full_stop(A13,A14), +reverse(A14,A15), +foldr(string_concat,A15,A16), +term_to_atom(A1,A16), + +((functor(A1,(=),2),arg(1,A1,N),arg(2,A1,Ans),B=[ans,N,=,Ans])->true; +(functor(A1,N,Ar),numbers(Ar,1,[],ArN),findall(ArN2,(member(ArN1,ArN),arg(ArN1,A1,ArN2),var(ArN2)),ArN3), +B=[A1,ArN3]))),C), +findall([K11,H,F2],(member([ans,N,=,Ans],C), +member([A1,Ans],C), +F=A1, +F1=(N=Ans), +foldr(string_concat,["(",F,",",F1,")"],F2)),Tests),!. +%*/ +truncate_comment(["%"|A13],A11) :- + truncate_comment(A13,A11),!. +truncate_comment([" "|A13],A11) :- + truncate_comment(A13,A11),!. +truncate_comment(A14,A14) :- + A14=[A12|A13], + not(A12="%"),not(A12=" "),!. 
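+
+% truncate_comment/2 strips leading "%" and space characters from a
+% list of one-character strings, e.g.
+% ?- truncate_comment(["%"," ","r","(","N",")"],A).
+% A = ["r","(","N",")"].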
+ +truncate_full_stop(["."|A13],A11) :- + truncate_full_stop(A13,A11),!. +truncate_full_stop([" "|A13],A11) :- + truncate_full_stop(A13,A11),!. +truncate_full_stop(A14,A14) :- + A14=[A12|A13], + not(A12="."),not(A12=" "),!. +%*/ diff --git a/luciancicd/keep.pl b/luciancicd/keep.pl new file mode 100644 index 0000000000000000000000000000000000000000..58619e4f5128660f70109c6046a57b1d3c31c630 --- /dev/null +++ b/luciancicd/keep.pl @@ -0,0 +1,47 @@ +keep( +[ +[comment,1] +] +). + +keep1( +[ +%[comment,1], +%/*, +[writeln,1], +[write,1], +[cut,0], +[true,0], +[fail,0], +[false,0], +[halt,0], +[halt,1], +[trace,0], +[notrace,0], +[equals4_on,0], +[equals4_off,0], +[assertz,1], +[atomic,1], +[close,2], +[close,1], +[at_end_of_stream,0], +[at_end_of_stream,1], +[compound,1], +[float,1], +[set_prolog_flag,2], +[integer,1], +[set_input,1], +[set_output,1], +[nonvar,1], +[put_char,2], +[put_char,1], +[put_code,2], +[put_code,1], +[nl,0], +[nl,1], +[write,2], +[write,1], +[writeq,2], +[writeq,1] +%*/ +]). 
\ No newline at end of file diff --git a/luciancicd/lc b/luciancicd/lc new file mode 100644 index 0000000000000000000000000000000000000000..512a89ec615a5107e10d72ff870f0d799bd02116 Binary files /dev/null and b/luciancicd/lc differ diff --git a/luciancicd/luciancicd.pl b/luciancicd/luciancicd.pl new file mode 100644 index 0000000000000000000000000000000000000000..5a8b8f131548f7e860eb59e846d5914fd2c8eb56 --- /dev/null +++ b/luciancicd/luciancicd.pl @@ -0,0 +1,1984 @@ +% luciancicd.pl + +/* + +CI/CD for LPPM repositories + +- records modification date for each file +- command that resets modification dates to that of files x +- installs repositories in which it or a dependency repository has been modified +- tests each repository above at a certain time each day +- can turn off or manually run install/tests for certain repositories (have tests if needed) +- emails results of failures +- only allows commits if tests have been run + +NB: +- ignore invisible files +- stores data in private2/luciancicd-data folder - needs to be created +- stores modification dates in separate files for each repository +- uses LPPM dependencies + +Later: +- converts pl to lp and checks if code has changed or if it is the same or if just comments have changed +- requires saving the last version of the lp code +- a pl pretty printer based on lp + +*/ + +% - records modification date for each file + +:-include('../SSI/ssi.pl'). +:-include('../listprologinterpreter/la_files.pl'). +:-include('../List-Prolog-Package-Manager/lppm.pl'). +%:-include('lppm_install_luciancicd.pl'). +:-include('find_tests_from_repos.pl'). +:-include('ci.pl'). +:-include('ci3.pl'). +%:-include('save_diff_html.pl'). +:-include('move_to_repository_or_back.pl'). +:-include('luciancicd_ws.pl'). +:-include('find_dependencies.pl'). +%:-include('find_dependencies2.pl'). +:-include('settings.pl'). +:-include('find_dependencies2-cgpt1.pl'). +:-include('ci_vintage.pl'). +:-include('keep.pl'). +:-include('check_non_var.pl'). 
+:-include('remove_end_comment.pl'). +:-include('luciancicdverify.pl'). +:-include('luciancicdverify1.pl'). +:-include('../gitl/find_files.pl'). +%:-include('../gitl/gitl.pl'). +:-include('diff-cgpt.pl'). +:-include('merge3.pl'). +:-include('luciancicd_ws1.pl'). +:-include('../gitl/gitl_ws1.pl'). +%:-include('../gitl/diff.pl'). +:-include('main.pl'). +:-include('check_non_var0.pl'). + +:-dynamic lc_tests/1. +:-dynamic home_dir/1. +:-dynamic home_dir1/1. +:-dynamic ci_fail/1. +:-dynamic time1/1. +:-dynamic log/1. +:-dynamic pred_list/1. +:-dynamic pred_list_v/1. +:-dynamic success/1. +:-dynamic success1/1. +:-dynamic success_tmp/1. +:-dynamic test_n/1. +%:-dynamic diff_html_n/1. +:-dynamic tests_preds3/1. +:-dynamic fail_if_greater_than_n_changes2/1. +:-dynamic c/1. +:-dynamic ci_end/1. + +%:-dynamic dep99_na/1. + +%:-dynamic lc_mode/1. + + +set_up_luciancicd :- + + +get_time1, + +check_repositories_paths, + +working_directory1(A1,A1), + + +modification_dates(Mod_times), + +clear_mod_dates, + +findall(_,(member([K2,Mod_time52],Mod_times), +open_s(K2,write,S), +write(S,Mod_time52),close(S) +),_),!, + + + +%A1="../../Github_lc/", %working_directory1(_,"../../Github_lc/"), + + %working_directory1(_,A1), + + + %retractall(home_dir(_)),assertz(home_dir(A1)), +%retractall(home_dir(_)),assertz(home_dir(_)) + %retractall(ci_fail(_)),assertz(ci_fail([])), +retractall(ci_end(_)), +assertz(ci_end(false)), + +ci, +ci_end, +working_directory1(_,A1) +. + +% Mode = "token", "line" or "predicate" + +%luciancicd(Mode) :- + +% retractall(lc_mode(_)),assertz(lc_mode(Mode)), +% luciancicd. 
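+
+% Entry points (sketch): set_up_luciancicd/0 snapshots file
+% modification dates and runs CI; luciancicd/0 then regenerates the
+% tests, compares current modification dates with the snapshot, and
+% installs and tests only the repositories (and their dependents)
+% that have changed.
+%   ?- set_up_luciancicd.
+%   ?- luciancicd.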
+ + +luciancicd :- + + working_directory1(A1000,A1000), + + + retractall(success1(_)),assertz(success1(_)), + + gh_init2, + + retractall(diff_html_n(_)), + assertz(diff_html_n(1)), + + retractall(test_n(_)), + assertz(test_n(0)), + + retractall(success_tmp(_)), + assertz(success_tmp([])), + + (time1(_T1)->true;get_time1), + + check_repositories_paths, + %(lc_mode(_)->true; + %(retractall(lc_mode(_)),assertz(lc_mode("line")))), + + working_directory1(A1z,A1z), + + find_tests_from_repos, + + working_directory1(_,A1z), + + retractall(log(_)),assertz(log("")), + + retractall(success(_)),assertz(success(0)), + retractall(ci_fail(_)),assertz(ci_fail([])), + + lppm_get_registry_luciancicd(LPPM_registry_term1), + +(exists_directory('../private2')->true;make_directory('../private2')), + +(exists_directory('../private2/luciancicd-data')->true;make_directory('../private2/luciancicd-data')), + +directory_files('../private2/luciancicd-data/',F), + delete_invisibles_etc(F,G), + +findall([F1,Mod_times12], +(member(F2,G),string_concat('../private2/luciancicd-data/',F2,F1), +open_file_s(F1,Mod_times1), +term_to_atom(Mod_times1,Mod_times12)),Mod_times11), + +modification_dates(Mod_times2), + +%trace, + %msort(Mod_times11, Sorted1), + %msort(Mod_times2, Sorted2), + subtract(Mod_times2,Mod_times11,New), + + working_directory1(A1,A1), + + retractall(home_dir(_)),assertz(home_dir(A1)), +retractall(ci_end(_)), +assertz(ci_end(false)), + +ci, +working_directory1(_,A1), + +( %Sorted1=Sorted2 + (%trace, + (New=[]->true;(ci_fail(Ci_fail),forall(member(Ci_fail1,Ci_fail),Ci_fail1=1)))) +->( +writeln2("There are no modifications to repositories to test.")); +% if +( +%trace, + +findall(Repository1,(member([Path,_],New), +string_concat(Path1,".txt",Path), +string_concat("../private2/luciancicd-data/mod_times_",Repository1,Path1)),Repositories), +%trace, +findall([Repository1,Dependencies5],(member(Repository1,Repositories), +%trace, 
+find_all_depending_luciancicd(LPPM_registry_term1,Repository1,[],Dependencies5) +%flatten(Dependencies42,Dependencies41), +%sort(Dependencies41,Dependencies5) +),Dependencies6), + +findall(Dependencies5,(member([Repository1,Dependencies5],Dependencies6)),Dependencies8), +flatten(Dependencies8,Dependencies83), + + + sort(Dependencies83,Dependencies9), + + +%trace, + +%(findall(Results%[Repository1,T4] + + %BD='../../Github_lc/build', +%(exists_directory(BD)->true;make_directory(BD)), + +%working_directory1(BD,BD), + +LCTD="../private2/luciancicd-testing", + + +%trace, +findall(Dependencies990%Results +,(member(Repository1,Dependencies9), + + +working_directory1(_,A1), +%trace, + (success(1)->fail;true), + %success_tmp(Tmp31),(forall(member(Tmp4,Tmp31),Tmp4=1)->true;fail), + + foldr(string_concat,["rm -rf ../private2/luciancicd-testing/"],Command312), + catch(bash_command(Command312,_), _, (foldr(string_concat,["Warning."%%"Error: Can't clone ",User3,"/",Repository3," repository on GitHub." 
+ ],Text412),writeln1(Text412),abort + )), + +(exists_directory_s(LCTD)->true;make_directory_s(LCTD)), + + +user(User1), + +find_all_dependencies(LPPM_registry_term1,%[[User1,Repository1]],%%,Description,Dependencies1 + [[User1,Repository1]%|Dependencies1 + ],[],Dependencies1a) + , + %)), + % time_limit_exceeded, + % (concat_list(["Error: Cycle in lppm_registry.txt: ",Dependencies1],Note_a),writeln(Note_a),abort)), + + append([[User1,Repository1%%,Description,Dependencies1 + ]],Dependencies1a,Dependencies2), + +findall(D21,member([_,D21],Dependencies2),D22), +append(Dependencies9,D22,D23), + sort(D23,Dependencies990) + + ),Dependencies991), + + flatten(Dependencies991,Dependencies992), + sort(Dependencies992,Dependencies99), + %trace, + lc_tests(Lc_tests), + +%trace, +%(( +findall([Tokens2,Tokens1] +,(member(Repository1a,Dependencies99), + %trace, + working_directory1(_,A1), + +foldr(string_concat,["../../Github_lc/tests_",Repository1a,".txt"],K211), + +%trace, + %open_file_s + %trace, + member(%file, + [K211|File2A1],Lc_tests), + %File2A1=[_,Content1], + %findall(*,(member([P,Tokens_i,Tokens_f],File2A1), + File2A1=[Tokens2,Tokens1]),Tokens3), + + %trace, + + findall(%[ + AT2z%,",\n"] + ,(member([AT2,_],Tokens3),foldr(string_concat,AT2,AT2z1), + term_to_atom(AT2z,AT2z1%AT232 + )),AT22),%flatten(AT22,AT2x),%)),AT22), + %append(AT24,[_],AT2x), + %foldr(string_concat,AT24,AT235), + %foldr(string_concat,["[",AT235,"]"],AT232), + %term_to_atom(AT231,AT22%AT232 + %), + foldr(append,AT22%AT231 + ,AT233), + %trace, +%trace, + + findall(%[ + AT1z%,",\n"] + ,(member([_,AT1],Tokens3),foldr(string_concat,AT1,AT1z1), + term_to_atom(AT1z,AT1z1%AT132 + )),AT12),%flatten(AT12,AT1x),%)),AT12), + %append(AT14,[_],AT1x), + %foldr(string_concat,AT14,AT135), + %foldr(string_concat,["[",AT135,"]"],AT132), + %term_to_atom(AT131,AT12%AT132 + %), + foldr(append,AT12%AT131 + ,AT133), + + % id changed repos xx, get Tests - run tests from main file in Repositories + + % run find deps + + % 
find names, arities + + %trace, +%pwd, + + %append(AT233,AT133,AT333), +%trace, + findall(AT233C,(member(AT233A1,AT233),(AT233A1=[[n, comment], [["File delimiter", _, _]]]->AT233C=AT233A1; + ((AT233A1=[N, _],(N=[n, comment]->true;N=":-"))->fail; + AT233C=[o,AT233A1]))),AT233A), + findall(AT133C,(member(AT133A1,AT133),(AT133A1=[[n, comment], [["File delimiter", _, _]]]->AT133C=AT133A1;AT133C=[n,AT133A1])),AT133A), +%trace, +%merge_files(AT233A,AT133A,AT333AF), +%trace, +%merge21(AT233A,AT133A,AT333A), +%merge_files(AT233A,AT133A,AT333A), +%trace, +merge_files1a(AT233A,AT133A,AT333A), +%merge3(AT233A,AT133A,AT333A), +%AT133A=AT333A, + +%trace, +findall(AT333C,(member(AT333A1,AT333A),(AT333A1=[[n, comment], [["File delimiter", _, _]]]->AT333C=AT333A1; +AT333A1=[_,AT333C])),AT333AD), + + +%trace, +%findall(AT333C,(member(AT333A1,AT333AF),(AT333A1=[[n, comment], [["File delimiter", _, _]]]->AT333C=AT333A1; +%AT333A1=[_,AT333C])),AT333AG), +%trace, +%trace, +%pred_list(PL), +%writeln(pred_list(PL)), +% +%trace, +%trace, +get_order(AT333AD,AT333B), + % * merge, copy of new or old from start, into files, place same name, arity preds together + % put same pred name arity together or at end if new + % use split into lp files + + %trace, + working_directory1(_,A1), + + findall(H,( + + member(Dep99,Dependencies99), %* make future depends 99s [dep99] + + %(Dep99="b"->trace;true), + + %Dep991=[Dep99], + %trace, + + read_main_file(Dep99,H%_,Dep99_name,Dep99_arity + )),H1), + + foldr(append,H1,H2), + sort(H2,H3), + %trace, + findall(Tests_a,(member(Repository1b1,Dependencies99), +foldr(string_concat,["../private2/luciancicd-cicd-tests/tests_",Repository1b1,".txt"],Test_script_path), +(catch(open_file_s(Test_script_path,Tests_a),_, +(writeln2(["Cannot find",Test_script_path]),fail%,abort +)))),Tests_b), +foldr(append,Tests_b,Tests),%-> + + retractall(pred_list(_)), + assertz(pred_list([]%Dependencies7d + )), + + %retractall(dep99_na(_)), + %assertz(dep99_na([])), + + %trace, + 
findall(_,( + member([Dep99,_,Dep99_name,Dep99_arity],H3), +%trace, +%writeln(member([*,Dep99,_,Dep99_name,Dep99_arity])), + %dep99_na(Dep99_na), +/* + delete_dep99_na([]%Dep99_na + ,AT333DA,AT333), +%trace, + append(Dep99_na,[[Dep99_name,Dep99_arity]],Dep99_na1), + assertz(dep99_na(Dep99_na1%Dependencies7d + )), +*/ +%trace, + %pred_list(PL1), +delete_repeated_preds(AT333AD,AT333AE), +%trace, +find_dependencies(Dep99_name,Dep99_arity,AT333AE,AT333,Dependencies7d,Pred_numbers0), +%get_order(AT333,AT333B), + +%trace, + %length(AT333,AT333L), + %numbers(AT333L,1,[],AT333N), + + % New ones + + (false%PL1=[] % true - t1-8, false - t9 + ->(AT333AH=AT333A,AT333AH1=AT333,AT333AD1=AT333,AT333AD2=AT333); + (AT333AH=AT333A, + AT333AH1=AT333A,AT333AD1=AT333AD,AT333AD2=AT333)), + + %trace, + + length(AT333AH1,AT333L), + numbers(AT333L,1,[],AT333N3), + +%trace, + + findall(AT233N1,(member(AT233N1,AT333N3), + get_item_n(AT333AH,AT233N1,AT233N2), + member(AT233N2,AT133A)),AT233N1a), + %[1, 3, 4, 11, 12, 13, 14, 15, 16] + %AT233N1a=AT233N, + sort(AT233N1a,AT233N), + % + %trace, + + findall(AT233N1,(member(AT233N1,AT333N3), + get_item_n(AT333AH,AT233N1,AT233N2), + not(AT233N2=[[n, comment], [["File delimiter", _, _]]]), + + /* + ((member(AT233N2,AT133A), + % + AT233N2=[_,[NZ|_]], + %not + ((NZ=[n, comment]->true;NZ=":-"%,member([_,[NZ|_]],AT133A) + )))->true;%(*/ + member(AT233N2,AT233A) + /*not((member(AT233N2,AT133A), + % + AT233N2=[_,[NZ|_]], + %not + ((NZ=[n, comment]->true;NZ=":-"%,member([_,[NZ|_]],AT133A) + )))) + */ + ),AT233N_old1a), + %AT233N_old1a=AT233N_old, + sort(AT233N_old1a,AT233N_old), + % [1, 2, 4, 5, 6, 7, 8, 9, 10] + %length(AT233,AT233L) + %trace, + %numbers(AT233L,1,[],AT233N), + /* + + + + ((member([Pred_name1|Rest2],AT1331), + pred_rest(Arity1,Rest2,Lines2))-> + (append(AT333,)(T10,T11,[],T12), + delete(AT1331,[[[n, comment], [["File delimiter", PZ, FZ]]]|T11],AT1333)); + (T12=T10,AT1331=AT1333)), + append(AT333,[[[n, comment], [["File delimiter", PZ, 
FZ]]]|T12],AT3332), + merge_files3(AT2333,AT1333,AT3332,AT3331). + +pred_rest(Arity1,Rest) :- +*/ + + % group clauses + + delete(Pred_numbers0,[[n, query_box_1], _, _],Pred_numbers), + group_clauses(Dependencies7d,Pred_numbers,Dependencies7d1), + + %length(AT133,AT133L), + %numbers(AT133L,1,[],AT133N), + + % assign old or new labels to deps + %trace, + findall(LD1,(member(Dependencies7d2,Dependencies7d1), + (Dependencies7d2=[loop1,Loop1a]-> + (findall([ON,CN,PN],(member(Loop1b,Loop1a),Loop1b=[CN,PN],(((member(PN,AT233N),member(PN,AT233N_old))->member(ON,[new,old]);(member(PN,AT233N)->ON=new;ON=old)))),Loop1c),LD1=[loop1,Loop1c]); + (Dependencies7d2=[CN,PN],(((member(PN,AT233N),member(PN,AT233N_old))->member(ON,[new,old]);(member(PN,AT233N))->ON=new;ON=old),LD1=[ON,CN,PN])))),Dependencies7d5), + +%trace, + %(once(member([[n, comment], 1, Comment_pred_ns3],Pred_numbers))->true;Comment_pred_ns3=[]), + + (once(member([":-", 1, Includes_pred_ns],Pred_numbers))->true; + Includes_pred_ns=[]), + +% Find new comments + +%(Test_n1=2->trace;true), + + +%findall(AT133N1,(member(AT133N1,AT333N3), +%get_item_n(AT333,AT133N1,[[n,comment]|_])),Comment_pred_ns), +%trace, + findall(AT233N1,(member(AT233N1,AT333N3), + get_item_n(AT333AH,AT233N1,AT233N2), + member(AT233N2,AT133A), + (AT233N2=[[n,comment]|_]->true;(AT233N2=[_,[[n,comment]|_]]%->true; + %AT233N2=[_,[":-"|_]]) + ))),Comment_pred_ns1), + %trace, + sort(Comment_pred_ns1,Comment_pred_ns), + + append(Comment_pred_ns,Includes_pred_ns,Comment_pred_ns2), +%findall(Comment_pred_n,(member(Comment_pred_n,Comment_pred_ns),(member(Comment_pred_n,AT233N))),Comment_pred_ns2), + + % group into old, new clauses, loops + +%trace, + /* + findall(LD1A,(member(Dependencies7d2,Dependencies7d3), + (Dependencies7d2=[loop1,Loop1a]->LD1A=[loop1,Loop1a];LD1A=Dependencies7d2)),Dependencies7d5), + */ + %foldr(append,Dependencies7d51,Dependencies7d5), + + %trace, + group_into_clauses1(Comment_pred_ns2,Dependencies7d5,[],Dependencies7d4), + + +/* + % 
delete(Dependencies7d1,Comment_pred_ns,LD2) - delete all comments + findall(LD1,(member(Dependencies7d2,Dependencies7d4),* + (Dependencies7d2=[loop1,Loop1a]-> + (findall(Loop1b,(member(Loop1b,Loop1a),Loop1b=[ON,CN,PN],not(member(PN,Comment_pred_ns))),Loop1c),LD1=[loop1,Loop1c]); + (Dependencies7d2=[ON,CN,PN],not(member(PN,Comment_pred_ns))->LD1=Dependencies7d2)))),LD21), +*/ + + % Choose predicates to test +%trace, + +(Dependencies7d4=[]->Dependencies7d6=[]; +append([[[old,Old_a],[new,New_a]]],Dependencies7d6,Dependencies7d4)), + findall([new,_,Comment_pred_ns21],member(Comment_pred_ns21,Comment_pred_ns2),Comment_pred_ns22),%* + + append(New_a,Comment_pred_ns22,Comment_pred_ns23), + + append(Old_a,Comment_pred_ns22,Comment_pred_ns24), + + append([[[old,Comment_pred_ns24],[new,Comment_pred_ns23]]],Dependencies7d6,Dependencies7d7), + +%trace, + + findall(_,( + %trace, + + %success_tmp(Tmp32),(forall(member(Tmp4,Tmp32),Tmp4=1)->true;fail), + %trace, + + +%trace, + append(Curr_preds,_,Dependencies7d7%LD21 + ), + not(Curr_preds=[]), + +%trace, +%writeln1(append(Curr_preds,_,Dependencies7d7)), + + (success(1)->fail;true), + %trace, +%writeln( append(Curr_preds,_,Dependencies7d7)), + %trace, + + %length(Curr_preds,Curr_preds_L), + %length(Dependencies7d7,Dependencies7d7_L), + %writeln(append(Curr_preds_L,_,Dependencies7d7_L)), + + %(Curr_preds_L=2->trace;true), + /* + %append(Curr_preds,Comment_pred_ns2,LD4) - append new comments ** ld4 has no loops xx +findall(LD31,(member(LD3,Dependencies7d4),LD3=[ON,CN,PN],(member(PN,Curr_preds)->LD31=LD3; + + (LD3=[loop1,Loop1a]-> + (findall(Loop1b,(member(Loop1b,Loop1a),Loop1b=[ON,CN,PN],member(PN,Comment_pred_ns2)),Loop1c),LD31=[loop1,Loop1c]); +(LD3=[ON,CN,PN],member(PN,Comment_pred_ns2)->LD31=LD3)) + +)),LD4), +*/ + %Curr_preds=[Curr_preds2], + append(_Curr_preds1,[Curr_pred_n],Curr_preds), + % cpn can include loop1 + + + + %findall(LD51,(member([[old,_],[new,New_a]],Curr_preds%LD4 + 
%),member(get_item_n(Dependencies7d,LD5,LD51)),AT1331), +%loop* x +%trace, + Curr_pred_n=[[old,Old_a4],[new,New_a4]], +%trace, +%trace, + %list_to_set(Old_a,Old_a1), + %list_to_set(New_a,New_a1), + findall([LD6,LD7,LD8],member([LD7,LD8,LD6],Old_a4),Old_a2), + sort(Old_a2,Old_a3), + findall([LD7,LD8,LD6],member([LD6,LD7,LD8],Old_a3),Old_a1), + + findall([LD6,LD7,LD8],member([LD7,LD8,LD6],New_a4),New_a2), + sort(New_a2,New_a3), + findall([LD7,LD8,LD6],member([LD6,LD7,LD8],New_a3),New_a1), + + +%trace, + (true%c(i)%false%false%true t1-8, false t-T9 + ->( + + findall(LD52,(%member(LD51,Old_a%LD4 + %), + member([_,LD5a,LD5],Old_a1),(true%var(LD5a) + ->get_item_n_catch(AT333AD1,LD5,LD52b);get_item_n_catch(AT333AD2,LD5a,LD52b)),((LD52b=[[n,PName]|Rest_x]%,not(PName=comment) + )->(foldr(string_concat,["a",LD5,"_",PName],PName2),LD52=[[n,PName2]|Rest_x]);LD52=LD52b)),AT2331c), +%trace, + findall(LD52,(%member(LD51,New_a%LD4 + %), + member([_,LD5a,LD5],New_a1),(true%var(LD5a) + ->get_item_n_catch(AT333AD1,LD5,LD52b);get_item_n_catch(AT333AD2,LD5a,LD52b)),((LD52b=[[n,PName]|Rest_x]%,not(PName=comment) + )->(foldr(string_concat,["a",LD5,"_",PName],PName2),LD52=[[n,PName2]|Rest_x]);LD52=LD52b)),AT1331c) + ) +;( +findall(LD52,( + %), + member([_,_,LD5],Old_a1),get_item_n(AT333AD,LD5,LD52)),AT2331c), +%trace, + findall(LD52,(%member(LD51,New_a%LD4 + %), + member([_,_,LD5],New_a1),get_item_n(AT333AD,LD5,LD52)),AT1331c) + )), + %trace, + %list_to_set1(AT2331c1,AT2331c), + %list_to_set1(AT1331c1,AT1331c), + +%loop* x + +/* + findall(LD52,(member(Old_a1,Curr_preds1),member(LD51,Old_a1%LD4 + ),member([_,_,LD5],LD51),member(get_item_n(Dependencies7d,LD5,LD52)),AT2331), + + + findall(LD52,(member(New_a1,Curr_preds1),member(LD51,New_a1%LD4 + ),member([_,_,LD5],LD51),member(get_item_n(Dependencies7d,LD5,LD52)),AT1331), +*/ + + % merge, build, test each level of preds, saving progress + + % merge only curr pred, put back in list, save changes + + % comments + %pwd, + %trace, + %trace, + 
writeln(here1), + time(pp0_1(AT2331c,AT234)), + %term_to_atom(AT234,AT2341), + split_string(AT234,"\n","\n",AT23), + + time(pp0_1(AT1331c,AT134)), + %term_to_atom(AT134,AT1341), + split_string(AT134,"\n","\n",AT13), + + %,trace + %)->true;(writeln("fault"),fail)), + + +%trace, +% +%writeln(merge2(AT23,AT13,T4)), +%trace, +writeln(here1a), + time(merge2(AT23,AT13,T4)),%),T5), + + %findall(XXX,(member(XXX1,T4),foldr(string_concat,XXX1,XXX2),catch(term_to_atom(XXX3,XXX4),_,fail),%pp0(XXX3,XXX4), + %lp2p1(XXX4,XXX),nl),XXX3), + %writeln(XXX3), + %trace, + %writeln(merge2(AT23,AT13,T4)), +%trace, +% get all files, choose ones that are deps of a rep + + (success(1)->fail;true), + %success_tmp(Tmp33),(forall(member(Tmp4,Tmp33),Tmp4=1)->true;fail), +%trace, + /* + foldr(string_concat,["rm -rf ../private2/luciancicd-testing/"],Command3), + catch(bash_command(Command3,_), _, (foldr(string_concat,["Warning."%%"Error: Can't clone ",User3,"/",Repository3," repository on GitHub." + ],Text4),writeln1(Text4),abort + )), +*/ + % find deps + %trace, + + findall(_%Results%[PZ,FZ,T10] + ,( + %writeln(member(T44,T4)), + member(T44,T4), +%trace,writeln([*,T44]), + (success(1)->fail;true), +%trace, + % (Curr_preds_L=2->trace;true), + %foldr(string_concat,T44,T48), + pred_list(Pred_list2), + + foldr(string_concat,T44,T451), + + catch(term_to_atom(T49,T451),_,fail),%not(T49=[]), + %writeln([*,T49]), + %trace, + %not((forall(member(XY,T49),(XY=[[n,comment]|_]->true;XY=[":-"|_])))), + %pp0(T49,T47), + +%trace, + %put_in_nums(T49,AT333,T491), % leave exact comments, includes x +%trace, + findall([_,T4911],(member(T4911,T49)%,process_subtract([_,T4911],T49112) + ),T491), + + append(Pred_list2,T491,T4731), +%trace, + put_in_order(T4731,AT333B,T47), % leave exact comments, includes x + + T47=T471, + findall(XXX1,(member([XXX3,[[n,PName]|Rest_x]],T471),foldr(string_concat,["a",XXX3,"_",PName],PName2),XXX1=[_,[[n,PName2]|Rest_x]]),T471A), + %writeln1([t471,T471]), + %sort(T47,T471), % leave 
comments, includes x + findall(T472,member([_,T472],T471),T473), % strip nums + + writeln(here2), + %trace, + working_directory1(Here2,Here2), + + home_dir(Here2a), + working_directory1(_,Here2a), + + time((term_to_atom(T473,T4731a), + foldr(string_concat,["#!/usr/bin/swipl -g main -q\n\n",":-include('../Prolog-to-List-Prolog/p2lpconverter.pl').\n","handle_error(_Err):-\n halt(1).\n","main :-\n catch((pp0_3(",T4731a,",T501),term_to_atom(T501,T50), write(T50)),Err, handle_error(Err)), nl,\n halt.\n","main :- halt(1).\n"],String_pp0_3), + +foldr(string_concat,[%"../private2/luciancicd-testing/",Repository1b,"/",Go_path5, +"tmp.pl"],GP_pp0_3), +%string_concat(Go_path,"testcicd.pl",GP), +open_s(GP_pp0_3,write,S1_pp0_3), +write(S1_pp0_3,String_pp0_3),close(S1_pp0_3), +foldr(string_concat,["chmod +x ",GP_pp0_3,"\n","swipl -g main -q ./",GP_pp0_3],S3_pp0_3),%, + +((catch(bash_command(S3_pp0_3,T502), _, (foldr(string_concat,["Warning."%%"Error: Can't clone ",User3,"/",Repository3," repository on GitHub." 
+ ],_),%writeln1(Text4), + fail%abort + )),term_to_atom(T503,T502))%sd2(R110,Tests1,RTests,R1,To_m_1,Repository_root_path,Repository,Gitl_data_path1,N,R1,N_path,To,HTML) + ->T50=T503; + writeln("Couldn't run pp0_3."),fail),%HTML1="Identical"), +delete_tmp +)), + working_directory1(_,Here2), + + %time(pp0_3(T473,T50)), + + + %pp0_1(T46,T47), + %term_to_atom(AT234,AT2341), + split_string(T50,"\n","\n",T45), + + + %writeln(member(T44,T4)), + +/* + pred_list(Pred_list2), + %trace, + ((%false, + not(Pred_list2=[]),not(Pred_list2="[]"), + not(T44=[]),not(T44="[]"))-> + + (Pred_list2=["["|Pred_list222], + append(Pred_list223,["]"],Pred_list222), + T44=["["|T444], + foldr(append,[["["],Pred_list223,[","],T444],T45),notrace); + + %foldr(append,[Pred_list2,[Poss_comma],T44],T45); + (foldr(append,[Pred_list2,T44],T45))), + notrace, % assuming T44 is a list of preds + */ + + findall([T51,"\n"],member(T51,T45),T522),%append(T522,[_],T52), + flatten(T522,T53), + foldr(string_concat,T53,T5), + term_to_atom(T7,T5),split_into_lp_files(T7,T8), + + (success(1)->fail;true), + %success_tmp(Tmp34),(forall(member(Tmp4,Tmp34),Tmp4=1)->true;fail), + + writeln2(""),writeln2("**********"), +writeln2(["Installing Combination"]), + + test_n(Test_n), + Test_n1 is Test_n+1, + retractall(test_n(_)), + assertz(test_n(Test_n1)), + %writeln([test_n1,Test_n1]), + + %test_n(Test_n0), + %(Test_n0=1->trace;true), + + %(Test_n1=5->trace;true), + + findall(_,(member([[[n, comment], [["File delimiter", PZ, FZ]]]|T10],T8), + + (success(1)->fail;true), + %success_tmp(Tmp35),(forall(member(Tmp4,Tmp35),Tmp4=1)->true;fail), + + + writeln2(""),%writeln("**********"), +writeln2(["Installing",PZ, FZ%Repository1 +]), + + %pwd, + working_directory1(_,A1), +%pwd, + + %writeln(["Installing dependency path",PZ,"file"%dependency" + %,FZ]), + + % path + %trace, + string_concat("../../Github_lc/",PZ1,PZ), + foldr(string_concat,[LCTD,"/",PZ1%,"/" + ],K212), + + % only want some reps files + 
(exists_directory_s(LCTD)->true;make_directory_s(LCTD)), + %(exists_directory_s(K212)->true;make_directory_s(K212)), + make_directory_recursive_s(LCTD,PZ1), + + %trace, + + working_directory1(_,K212), + %trace, + % clear dir *** +%<<<<<<< Updated upstream +%======= + %pp_lp2p0(T10,T11), +%>>>>>>> Stashed changes + lp2p1(T10,T11), + %findall(_,(member([K2,Mod_time52],Mod_times), +open_s(FZ,write,S0), +write(S0,T11),close(S0) + +%writeln([write(FZ,T11)]) + +%),_),!. + + ),_%T6 + ), + %*** + %),_), + + % take apart, save repos + % delete build/rep afterwards + +%get needed reps +%findall(Results,(member(Repository1,Dependencies9), + +%(Repository1="b"->trace;true), + +%member([Repository1,Dependencies7],Dependencies6), +%findall(_,(member(Repository1,Dependencies7), +%writeln(["Installing required repository",Repository1]), + +%lppm_install_luciancicd(LPPM_registry_term1,"luciangreen",Repository1),%),_), + +%trace, +%pwd, +%notrace, +% test non-interactive algorithms +%trace, + + (success(1)->fail;true), + %success_tmp(Tmp36),(forall(member(Tmp4,Tmp36),Tmp4=1)->true;fail), + +writeln2(["Running tests"]), +%trace, +findall(H4,(member(Repository1b,Dependencies99), + +findall([Repository1b,Main_file1],member([Repository1b,Main_file1,_,_],H3),H4)),H5), +%writeln([member(Repository1b,Dependencies99)]), + +foldr(append,H5,H61), +sort(H61,H6), +%trace, +%Repository1b=Dep99, +%trace, +findall(Results2,(member([_,_Main_file],H6),%member(Repository1b,Dependencies99), + +(success(1)->fail;true), + +working_directory1(_,A1), + +/*findall(Tests_a,(member([Repository1b1,_],H6), +foldr(string_concat,["../private2/luciancicd-cicd-tests/tests_",Repository1b1,".txt"],Test_script_path), +(catch(open_file_s(Test_script_path,Tests_a),_, +(writeln2(["Cannot find",Test_script_path]),fail%,abort +)))),Tests_b),*/ +%trace, +%foldr(append,Tests_b,Tests),%-> + +(( +%trace, +%working_directory1(_,A1), %*** +working_directory1(A,A), + + +%trace, +%T473=AT3331c, +append(AT2331c,AT1331c,AT3331c), 
+%trace, +tests_pred2(Tests,AT3331c,Tests01), + +%Tests=Tests01, +sort1(Tests01,Tests0), +%writeln([tests0,Tests0]), +%notrace +%trace, + +findall(Result,(member([Go_path1,File,Command],Tests0), +working_directory1(_,A), +%trace, +check_non_var(Command,Command1), +Repository1b=Go_path1, +%trace, +(true->%tests_pred(AT1331c,Command)-> +( + +%foldr(string_concat,["../private2/luciancicd-testing/",%Repository1, +%Go_path1],Go_path), + +foldr(string_concat,["../private2/luciancicd-testing/",Repository1b,"/"],_Go_path3), + +%foldr(string_concat,["../private2/luciancicd-testing/",%Repository1, +%"/", +%Go_path1,"/main_file.txt"],Go_path2), + +%(catch(open_file_s(Go_path2,[Main_file1,_,_]),_, +%(writeln2(["Cannot find",Test_script_path]),(writeln(["Missing main_file.txt in " ,Go_path1,"/"]),abort)%,abort +%))), +%trace, + +%read_main_file(Repository1b,Main_file1%,_,_ +%), + +%atom_string(Main_file1,Main_file), + +((working_directory1(_,A), + +%trace, %*** + %(exists_directory_s(LCTD)->true;make_directory_s(LCTD)), + + make_directory_recursive_s(LCTD,Go_path1), + +working_directory1(_,LCTD), +%working_directory1(_,Go_path), +working_directory1(_,Go_path1), + +% *** Change path to swipl if necessary +%trace, + +%term_to_atom(Command2,Command1), + +/* +string_concat(Repository1b,Go_path1a,Go_path1), +split_string(Go_path1a,"/","/",Go_path3), +(Go_path3=[_Go_path4]->Go_path5="";(Go_path3=[_|Go_path6],atomic_list_concat(Go_path6,'/',Go_path7),string_concat(Go_path7,"/",Go_path5))), +*/ + +%:-initialization(catch(call_with_time_limit(1,main),Err,handle_error(Err))). 
+ +time_limit(Time_limit), + +foldr(string_concat,["#!/usr/bin/swipl -g main -q\n\n",":-include('../",Repository1b,"/",%Go_path5, +File%File +,"').\n","handle_error(_Err):-\n halt(1).\n","main :-\n catch(call_with_time_limit(",Time_limit,",(",Command1,")), Err, handle_error(Err)), nl,\n halt.\n","main :- halt(1).\n"],String), +%trace, +%working_directory1(_,A), +foldr(string_concat,[%"../private2/luciancicd-testing/",Repository1b,"/",Go_path5, +"testcicd.pl"],GP), +%string_concat(Go_path,"testcicd.pl",GP), +open_s(GP,write,S1), +write(S1,String),close(S1), +foldr(string_concat,["chmod +x ",GP,"\n","swipl -g main -q ./",GP],S3),%, + + %(Test_n0=5->trace;true), + +/* +catch(call_with_time_limit(7,bash_command(S3,_)),_,(foldr(string_concat,["Warning."%%"Error: Can't clone ",User3,"/",Repository3," repository on GitHub." + ],_Text4),%writeln1(Text4), + fail%abort + )) +*/ +%/* +catch(bash_command(S3,_), _, (foldr(string_concat,["Warning."%%"Error: Can't clone ",User3,"/",Repository3," repository on GitHub." 
+ ],_Text4),%writeln1(Text4), + fail%abort + )) + %*/ +%Command +)->((Result=success, +%trace, + +retractall(pred_list(_)), +%trace, +assertz(pred_list(T471A)) +%) + +));(Result=fail%,trace +)),%trace, +writeln12([Go_path1,File,Command1,Result]) +);Result=fail) +),Results2) + +,flatten(Results2,Results2a),(forall(member(Result,Results2a),Result=success)->(retractall(success(_)),assertz(success(1)));true) + +%, (Test_n0=4->trace;true) + + + +%; +%true +))) + +,Results3) + + + +%,flatten(Results3,Results3a),(forall(member(Result,Results3a),Result=success)->(success_tmp(Tmp),append(Tmp,[1],Tmp1),retractall(success_tmp(_)),assertz(success_tmp(Tmp1))); +%(success_tmp(Tmp),append(Tmp,[0],Tmp1),retractall(success_tmp(_)),assertz(success_tmp(Tmp1)))) + +,flatten(Results3,Results3a),(forall(member(Result,Results3a),Result=success)->(retractall(success(_)),assertz(success(1)));true) + + +%),_) +),_Results1), + +%trace, +/* + pred_list_v(T8), + pred_list(Pred_list), + append(Pred_list,T8,Pred_list2), + retractall(pred_list(_)), + assertz(pred_list(Pred_list2%Dependencies7d + )) + */ + (success(0)->(writeln2("Current predicate set failed."),retractall(success(_)),assertz(success(1)),fail%,abort,working_directory1(_,A1) + );(writeln2("Current predicate set passed."),%trace, + %leash(-all), + %trace,sleep(0.5) + retractall(success(_)),assertz(success(0)) + )) + +),Result4) + +,length(Dependencies7d7,Dependencies7d7L), +length(Result4,Dependencies7d7L) + +),Result5), + +%trace, +%flatten(Results1,Results2), +%Results2=Results21, +%findall(Result4,(member(Result4,Results2),not(var(Result4))),Results21), + + + %success_tmp(_Tmp37),(true%forall(member(Tmp4,Tmp37),Tmp4=1) + %->(retractall(success(_)),assertz(success(1))); +%(retractall(success(_)),assertz(success(0)))), + +((length(H3,H3L), +length(Result5,H3L)) +%success(1)%(forall(member(Result3,Results21),(not(var(Result3)),Result3=success))%,not(Results21=[]) +-> + +% Only save mod times if all tests passed 
+(working_directory1(_,A1), +/* + foldr(string_concat,["rm -rf ../private2/luciancicd-data/"],Command31), + catch(bash_command(Command31,_), _, (foldr(string_concat,["Warning."%%"Error: Can't clone ",User3,"/",Repository3," repository on GitHub." + ],Text41),writeln1(Text41),abort + )), + +(exists_directory('../private2/luciancicd-data')->true;make_directory('../private2/luciancicd-data')), +*/ +findall(_,(member([K21,Mod_time521],Mod_times2), +open_s(K21,write,S21), +write(S21,Mod_time521),close(S21) +),_), + + + + +move_to_repository_or_back, + +retractall(ci_end(_)), +assertz(ci_end(true)), + +ci, +ci_end, + +%pwd, + +remove_end_comment, + +writeln2("All tests were successful."), +home_dir(HD), +working_directory1(_,HD) +,S001=0,retractall(success1(_)),assertz(success1(S001)) + +) +;((true%not(Results21=[]) +->(working_directory1(_,A1),remove_end_comment, + + +writeln2("1 or more tests failed.") +,S001=1,retractall(success1(_)),assertz(success1(S001)) + +);true)) +))), + +working_directory1(_,A1), + +%success(S000), +working_directory1(A22,A22), + +repositories_paths([Path]), +working_directory1(_,Path), + (exists_directory_s("../lc_logs/")->true;make_directory_s("../lc_logs/")), +%trace, +log(Log), +term_to_atom(Log,Log1), +%Log1=[Log], +time1(Time),foldr(string_concat,["../lc_logs/log",Time,".txt"],Log_file_name), +open_s(Log_file_name,write,S21T), +write(S21T,[S001,Log1]),close(S21T), + + + + retractall(ci_end(_)), + assertz(ci_end(false)), +%trace, +luciancicd_ws1, + + + + retractall(time1(_)), + +working_directory1(_,A22) +. + + +make_directory_recursive_s(LCTD,PZ1) :- + split_string(PZ1,"/","/",PZ2), + delete(PZ2,"",PZ3), + make_directory_recursive_s(LCTD,"",%PZ4, + PZ3),!. + +make_directory_recursive_s(_,_,%_, +[]) :- !. 
+make_directory_recursive_s(LCTD,PZ5,%PZ4, +PZ) :- + PZ=[PZ6|PZ7], + foldr(string_concat,[LCTD,"/",PZ5,PZ6%,"/" + ],K212), + (exists_directory_s(K212)->true;make_directory_s(K212)), + foldr(string_concat,[PZ5,"/",PZ6,"/" + ],PZ8), + make_directory_recursive_s(LCTD,PZ8,%PZ4, +PZ7),!. + +truncate_path(P1,P2,P3) :- + string_strings(P1,L1), + reverse(L1,L2), + append(L3,L4,L2), + append(["/"],L5,L4), + foldr(append,[["/"],L5],L6), + reverse(L6,L7), + foldr(string_concat,L7,P2), + reverse(L3,L8), + foldr(string_concat,L8,P3),!. + +ci_end:- + + + lc_tests(Tests), + + home_dir(AAA), + working_directory1(_,AAA), + + + +working_directory1(Old_D1,Old_D1), +working_directory1(_,"../../Github_lc/"), + + +foldr(string_concat,[%"scp -pr ../../Github_lc/ ", + "rm -f ../Github_lc/* ../Github_lc/*/* ../Github_lc/*/*/* ../Github_lc/*/*/*/*" + %Folder1 + ],Command315), + catch(bash_command(Command315,_), _, (foldr(string_concat,["Warning."%%"Error: Can't clone ",User3,"/",Repository3," repository on GitHub." + ],_Text42)%,writeln1(Text42)%,abort + )), + + working_directory1(_,Old_D1), + + % The modified Prolog programs are saved +% - reset dirs, make folders x files have been cleaned from folders +%trace, +findall(_,(member([K21|Tests521],Tests), +term_to_atom(Tests521,Tests522), +open_s(K21,write,S21), +write(S21,Tests522),close(S21) +),_), + +modification_dates(Mod_times), + +clear_mod_dates, + +findall(_,(member([K2,Mod_time52],Mod_times), +open_s(K2,write,S), +write(S,Mod_time52),close(S) +),_),!. + + +repositories_paths(Paths) :- + (ci_end(true)-> + output_path(Paths); + ( + repositories_paths1(Paths1), + findall(Paths2,(member(Paths3,Paths1), + ((string_concat(_Paths4,"/",Paths3), + Paths2=Paths3)->true; + string_concat(Paths3,"/",Paths2))),Paths))),!. + +omit_paths(Paths) :- + omit_paths1(Paths1), + findall(Paths2,(member(Paths3,Paths1), + ((string_concat(Paths2,"/",Paths3))->true; + (Paths3=Paths2))),Paths),!. 
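A hedged sketch of how `make_directory_recursive_s/2` above behaves; the directory names below are hypothetical, and the exact intermediate paths may contain doubled slashes (harmless on POSIX) because of how `PZ8` is concatenated:

```prolog
% make_directory_recursive_s/2 splits a slash-separated relative path and
% creates each intermediate directory under the base directory LCTD,
% skipping any component that already exists.
%
% Hypothetical example:
% ?- make_directory_recursive_s("build", "reports/2024/cicd").
% % creates build/reports, then build/reports/2024,
% % then build/reports/2024/cicd (each only if missing).
```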
+ +output_path(Paths) :- + output_path1(Paths1), + findall(Paths2,(member(Paths3,Paths1), + ((string_concat(_Paths4,"/",Paths3), + Paths2=Paths3)->true; + string_concat(Paths3,"/",Paths2))),Paths),!. + +modification_dates(Mod_time) :- + +working_directory1(A,A), + +(exists_directory('../private2/luciancicd-data')->true;make_directory('../private2/luciancicd-data')), + +repositories_paths(K), + +omit_paths(Omit), + +%findall(Omit1,(member(Omit2,Omit),atom_string(Omit1,Omit2)),Omit3), +findall([K1,G4],(member(K1,K), directory_files(K1,F), + delete_invisibles_etc(F,G), + +%findall(H,(member(H,G),not(string_concat("dot",_,H)), + +subtract(G,Omit,G1), + +findall(G3,(member(G2,G1),string_concat(G2,"/",G3)),G4) +%not(member(G,Omit)) + +),K01), +%trace, +%foldr(append,K0,K01), + +working_directory1(Old_D,Old_D), + +findall(Mod_time1,(member([D,K31],K01), + +working_directory1(_,Old_D), + +working_directory1(_,D), + +%member(K2,K31), + +%exists_directory(K2), + +process_directory(K31,%_G, + %Omit,% + true, + Mod_time1)%),Mod_time) + ),Mod_time2), + foldr(append,Mod_time2,Mod_time), + + working_directory1(_,A) + + ,!. 
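A sketch of the data shape `modification_dates/1` returns, assuming the repository layout used elsewhere in this file (the concrete paths and timestamps below are illustrative only):

```prolog
% modification_dates/1 walks each repositories_paths/1 entry (minus
% omit_paths/1 entries) and, for each top-level repository, pairs the
% per-repository cache file name with an atom encoding the
% [FilePath, ModTime] list collected by process_directory/3.
%
% Illustrative result shape:
% ?- modification_dates(M).
% % M = [["../private2/luciancicd-data/mod_times_SomeRepo.txt",
% %       '[["SomeRepo/main.pl",1700000000.0], ...]'],
% %      ...].
```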
+ +process_directory(K,%G, + Top_level,%Mod_time1, + Mod_time61) :- + +%G=K, +%/* +findall(K4,(member(K1,K), directory_files(K1,F), + delete_invisibles_etc(F,G), +%*/ +findall(Mod_time3,(member(H,G),not(string_concat("dot",_,H)), + +%not(member(H,Omit)), + + +foldr(string_concat,[K1,H],H1), + +% if a file then find modification date +% if a folder then continue finding files in folder +(exists_directory(H1)-> + +(string_concat(H1,"/",H2), +process_directory([H2],%[H], + false,%[],%Omit % only omit top level dirs xx + %Mod_time1, + Mod_time3) + %foldr(append,Mod_time31,Mod_time3) + ); + +(time_file(H1,Mod_time4), +%trace, +%append(Mod_time1,[[H1,Mod_time4]],Mod_time3))) +Mod_time3=[[H1,Mod_time4]])) + +),Mod_time5),%trace, +foldr(append,Mod_time5,Mod_time51), + +%Mod_time5=Mod_time51, + +(Top_level=true%not(Omit=[]) % at top level +-> +( +term_to_atom(Mod_time51,Mod_time52), +string_concat(K3,"/",K1), +foldr(string_concat,["../private2/luciancicd-data/mod_times_",K3,".txt"],K2), +K4=[K2,Mod_time52] +%open_s(K2,write,S), +%write(S,Mod_time52),close(S) + +%writeln(["*",K2, +%Mod_time52] +%) +); +K4=Mod_time51 +) + + + +),Mod_time6), + +(%not(Omit=[])-> +Top_level=true-> +Mod_time6=Mod_time61; +foldr(append,Mod_time6,Mod_time61)), + +!. + + + %find_all_depending_luciancicd(LPPM_registry_term1,Repository1,Dependencies,Dependencies) :- !. + find_all_depending_luciancicd(LPPM_registry_term1,Repository1,Dependencies7,Dependencies72) :- + find_all_depending_luciancicd(LPPM_registry_term1,Repository1,Dependencies7,[],Dependencies72),!. 
+ find_all_depending_luciancicd(LPPM_registry_term1,Repository1,Dependencies7,D,Dependencies72) :- +((member([User1,Repository1,_Description1,_Dependencies1],LPPM_registry_term1), +not(member(Repository1,D)))-> +(findall(Dependencies5,(member([User1,Repository2,_Description,Dependencies2],LPPM_registry_term1), +member([User1,Repository1],Dependencies2), +append(D,[Repository1],D2), +find_all_depending_luciancicd(LPPM_registry_term1,Repository2,[],D2,Dependencies4), +foldr(append,[Dependencies7,Dependencies4],Dependencies5) + +),Dependencies3), +append([Repository1],Dependencies3,Dependencies6), +flatten(Dependencies6,Dependencies72)); +Dependencies72=[]), +%flatten(Dependencies71,Dependencies72), +!. + +%**** change later +lppm_get_registry_luciancicd(LPPM_registry_term1) :- + catch(phrase_from_file_s(string(LPPM_registry_string), "../List-Prolog-Package-Manager/lppm_registry.txt"),_,(writeln1("Error: Cannot find ../List-Prolog-Package-Manager/lppm_registry.txt"),abort)), + +term_to_atom(LPPM_registry_term1,LPPM_registry_string). + +working_directory1(A1,B1) :- + (string(A1)->atom_string(A,A1);A=A1), + (string(B1)->atom_string(B,B1);B=B1), + term_to_atom(working_directory(A,B),Atom), + catch(working_directory(A,B), _, (foldr(string_concat,["Error on ",Atom%%"Error: Can't clone ",User3,"/",Repository3," repository on GitHub." + ],Text41),writeln1(Text41)%fail%abort + )),!. + +split_into_lp_files(T7,T10) :- + split_into_lp_files(T7,[],_T8,[],T9), + delete(T9,[],T10),!. + +split_into_lp_files([],B1,_B2,C1,C2) :- + append(C1,[B1],C2),!. +split_into_lp_files(A,B,C,B1,C1) :- + A=[D|E], + not(D=[[n,comment],[["File delimiter",_P,_F1]]]), + append(B,[D],F), + split_into_lp_files(E,F,C,B1,C1),!. +split_into_lp_files(A,B,C,B1,C1) :- + A=[D|E], + D=[[n,comment],[["File delimiter",_P,_F1]]], + append(B1,[B],B2), + split_into_lp_files(E,[D],C,B2,C1),!. + +/* +split_into_lp_files1(T7,T10) :- + split_into_lp_files1(T7,[],_T8,[],T9), + delete(T9,[],T10),!. 
+ +split_into_lp_files1([],B1,_B2,C1,C2) :- + append(C1,[B1],C2),!. +split_into_lp_files1(A,B,C,B1,C1) :- + A=[D|E], + not(D=[[n,comment],[["File delimiter",_P,_F1]]]), + append(B,[D],F), + split_into_lp_files1(E,F,C,B1,C1),!. +split_into_lp_files1(A,B,C,B1,C1) :- + A=[D|E], + D=[[n,comment],[["File delimiter",_P,_F1]]], + append(B1,[B],B2), + split_into_lp_files1(E,[D],C,B2,C1),!. +*/ + +pp0_1(A,B):- + ((%trace, + %false% + pp0_2(A,B)) + ->true; + (%trace, + %delete(A,[],A1), + lines_to_comments(A,B))). + +%lines_to_comments([],[]) :- !. +lines_to_comments(A,_) :- member([],A),writeln("Error in main_file.txt, or other."),abort. +lines_to_comments(A,B) :- + %term_to_atom(A,A1), + split_string(A,"\n\r","\n\r",C), + findall([[n,comment],[D]],member(D,C),B). + +clear_mod_dates :- + +working_directory1(A1,A1), +working_directory1(_,"../private2/luciancicd-data/"), + +foldr(string_concat,[%"scp -pr ../../Github_lc/ ", + "rm -f ../luciancicd-data/*" + %Folder1 + ],Command315), + catch(bash_command(Command315,_), _, (foldr(string_concat,["Warning."%%"Error: Can't clone ",User3,"/",Repository3," repository on GitHub." + ],_Text42)%,writeln1(Text42)%,abort + )), + + +working_directory1(_,A1). + +check_repositories_paths :- + repositories_paths(Paths), + (not(Paths=[_])-> + (writeln2("Only one repository path can be processed at a time."),abort); + true),!. + +get_time1 :- + + get_time(TS),stamp_date_time(TS,date(Year,Month,Day,Hour1,Minute1,Seconda,_A,_TZ,_False),local), + foldr(string_concat,["-",Year,"-",Month,"-",Day,"-",Hour1,"-",Minute1,"-",Seconda],Time), + + retractall(time1(_)), + assertz(time1(Time)). + +writeln2(A) :- writeln12(A). +/* +log(B),foldr(string_concat,[B,A,"\n"],C), + retractall(log(_)), + assertz(log(C)). +*/ +writeln12(A) :- log(B),term_to_atom(A,A1),foldr(string_concat,[B,A1,"\n"],C), + retractall(log(_)), + assertz(log(C)), + writeln1(A). 
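An illustrative example of `split_into_lp_files/2` (defined above): it groups a flat list of List-Prolog terms into per-file chunks, starting a new chunk at each `"File delimiter"` comment and deleting empty chunks. The terms below are made up for illustration:

```prolog
% ?- split_into_lp_files(
%        [[[n,comment],[["File delimiter","p/","a.pl"]]], [[n,a]],
%         [[n,comment],[["File delimiter","p/","b.pl"]]], [[n,b]]], T).
% T = [[[[n,comment],[["File delimiter","p/","a.pl"]]], [[n,a]]],
%      [[[n,comment],[["File delimiter","p/","b.pl"]]], [[n,b]]]].
%
% Each resulting chunk begins with its own "File delimiter" comment, so it
% can later be written back to FZ under PZ (see the install loop above).
```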
+ +group_clauses(Dependencies7,Pred_numbers,Clause_Ns3) :- + length(Pred_numbers,Pred_numbers_L), + numbers(Pred_numbers_L,1,[],Pred_numbers_Ns), + findall([Pred_numbers_N,A,B,C],(member(Pred_numbers_N,Pred_numbers_Ns),get_item_n(Pred_numbers,Pred_numbers_N,[A,B,C])),Pred_numbers2), + + findall(D3,(member(D,Dependencies7), + (D=[loop1,D1]->(findall([N1,D2],(member(D2,D1), + member([N1,A,B,C],Pred_numbers2),member(D2,C)),ND1),D3=[loop1,ND1]); + (member([N,A,B,C],Pred_numbers2),member(D,C),D3=[N,D]))),Clause_Ns), + + collect_clauses_in_loops(Clause_Ns,Clause_Ns1), + move_non_loop_clauses_to_loop(Clause_Ns1,Clause_Ns3),!. + + %findall(M,(member(G,Clause_Ns2), + %(G=[loop1,H]->(findall(J,member([_,J],H),L),M=[loop1,L]);G=[_,M])),Clause_Ns3),!. + +collect_clauses_in_loops(C,B) :- + sub_term_wa([loop1,_],C,A), + collect_clauses_in_loops1(A,[],D), + foldr(put_sub_term_wa_ae,D,C,B). + +collect_clauses_in_loops1([],A,A) :- !. +collect_clauses_in_loops1(A,B,C) :- + A=[[Add,[loop1,List]]|D], + collect_clauses_in_loops1(List,D,[],E), + append(B,[[Add,[loop1,List]]],F), + collect_clauses_in_loops1(E,F,C). + +collect_clauses_in_loops1(_,[],F,F) :- !. +collect_clauses_in_loops1(List1,D,F,E) :- + D=[[Add,[loop1,List2]]|G], + %subtract(List2,List1,List3), +foldr(collect_clauses_in_loops2,List1,List2,List3), + + append(F,[[Add,[loop1,List3]]],H), + collect_clauses_in_loops1(List1,G,H,E). + +collect_clauses_in_loops2([N1,A],NDs1,NDs2) :- + (member([N1,_],NDs1)->delete(NDs1,[N1,_A2],NDs2); + append(NDs1,[[N1,A]],NDs2)). + + +move_non_loop_clauses_to_loop(C,B) :- + %sub_term_wa([loop1,_],C,A), + %findall([loop1,D],(member([loop1,D],C)),A), + move_non_loop_clauses_to_loop1(C,C,B1%,[],D%,[],NLC + ),delete(B1,["&del"],B). + % delete non loop clauses + %foldr(put_sub_term_wa_ae,D,C,B). + +move_non_loop_clauses_to_loop1([],A,A%,D,D%,NLC,NLC +) :- !. 
+move_non_loop_clauses_to_loop1(C,A1,A2%,D1,D2%,NLC1,NLC2 +) :- + C=[E|F], + (E=[loop1,_]->(A1=A2);%get_n_item(A1,E,N),put_item_n(A1,N,append(A1,[E],A2)); + (move_non_loop_clauses_to_loop2(E,A1,A1,A3), + move_non_loop_clauses_to_loop1(F,A3,A2))).%,D1,D2%,NLC1,NLC2 + + +move_non_loop_clauses_to_loop2(_,[],A,A) :- !. +move_non_loop_clauses_to_loop2(E,A1,A2,A4) :- + E=[N,E1], + A1=[%[Add, + F%[loop1,B]%] + |C], + (F=[loop1,B]-> + (member([N,_E2],B)->(append(B,[[N,E1]],B1),%delete(A1,[loop1,B],A3), + get_n_item(A2,F,N1),put_item_n(A2,N1,[loop1,B1],A31), + get_n_item(A31,[N,E1],N2),put_item_n(A31,N2,["&del"],A3) + %append(A2,[%[Add, + %[loop1,B1]]%] + %,A4) + ); + (A2=A3%append(A2,[%[Add, + %[loop1,B]]%] + %,A3) + %move_non_loop_clauses_to_loop2(E,C,A3,A4)))); + )); + A2=A3%append(A2,[F],A3) + ), + move_non_loop_clauses_to_loop2(E,C,A3,A4). + +% [[loop1, [[new, 1, 9], [new, 3, 15], [new, 2, 12], [new, 1, 8]]]] +% [[[old, []], [new, [[new, 1, 9], [new, 3, 15], [new, 2, 12], [new, 1, 8]]]]] + + +%group_into_clauses1(A,B,C):-forall(member(C,A),A=[loop1,_]),append(A,B,C),!. +group_into_clauses1(Comment_pred_ns,A,B,F) :- +%trace, + reverse(A,A1), + group_into_clauses10(Comment_pred_ns,A1,B,F1), + reverse(F1,F). + +group_into_clauses10(_,[],A,A) :- !. +group_into_clauses10(Comment_pred_ns,A,B,F) :- + A=[C|D], + (C=[loop1,E]->( + H=[[old,[]],[new,E]], + append(B,[H],B1),G=D + %append(D,E,B11),G=D, + %group_into_clauses1(Comment_pred_ns,B11,[],B1) + ); + (C=[_ON,CN,PN], + (member(PN,Comment_pred_ns)-> + (G=D,H=[[old,[]],[new,[C]]]); + (append(D,A + ,G2),list_to_set(G2,G1),group_into_clauses([_, CN, _], G1, G1, G%[] + , []%A + , H1), + group_into_old_new(H1,H))), + append(B,[H],B1))), + group_into_clauses10(Comment_pred_ns,G,B1,F). + +group_into_clauses(_,[], +NDs1,NDs1,NDs2,NDs2) :- !. 
+group_into_clauses([A2,N1,B2],NDs0, +NDs1,NDs2,NDs3,NDs4) :- + NDs0=[[A3,N10,B3]|NDs01],%member([_,N1,_],NDs1), + ((N10=N1,member([A3,N1,B3],NDs1))->(delete(NDs1,[A3,N1,B3],NDs20), + append(NDs3,[[A3,N1,B3]],NDs40)); + (NDs3=NDs40,NDs1=NDs20%append(NDs1,[[A2,N1,B2]],NDs20) + )), + group_into_clauses([A2,N1,B2],NDs01, + NDs20,NDs2,NDs40,NDs4),!. + + +group_into_old_new(H1,H) :- + findall([old,A,B],member([old,A,B],H1),H2), + findall([new,A,B],member([new,A,B],H1),H3), + H=[[old,H2],[new,H3]]. + +/* +tests_pred(AT1331c,Command) :- + member([N|VE],AT1331c), + N=[_,N1], + ((VE=[V]->true;(VE=[V|_]))->length(V,Arity);Arity=0), + square_to_round(List,Command), + functor(Item,N1,Arity), + member(Item,List),!. +*/ + +tests_pred2(Tests,AT3331ca,Tests0) :- + findall(X,(member(X1,AT3331ca),process_subtract([_,X1],[_,X])),AT3331c), + %writeln2(["Contains predicates: "]), + retractall(tests_preds3(_)), + assertz(tests_preds3([])), +findall([Go_path1,File,Command1],(member([N|VE],AT3331c), + N=[_,N1], + ((VE=[V]->true;(VE=[V|_],not(V=":-")))->length(V,Arity);Arity=0), + member([Go_path1,File,Command1],Tests), + %trace, + (not(Command)=Command1;Command=Command1), + square_to_round(List,Command), + functor(Item,N1,Arity), + (not(N1=comment)-> + (tests_preds3(TP), + append(TP,[[N1,Arity]],TP1), + retractall(tests_preds3(_)), + assertz(tests_preds3(TP1))) + ;true), + %((Item=[Item1]->Item2=[Item1|_];Item2=Item),writeln([*,member(Item2,List)]), + member(Item,List)),Tests0), + tests_preds3(TP2), + sort(TP2,TP3), + %trace, + %(TP3=[]->fail;true), + writeln2(["Contains predicates: ",TP3]),%writeln2(""), + !. + + +%find_first_pred(Dep99,H%File,Dep99_name,Dep99_arity +%) :- +%read_main_file(Dep99,G), +%findall([B,C],member([_,B,C],G),H),!. +%[A,B,C]=[File,Dep99_name,Dep99_arity],!. 
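A small illustrative example of `group_into_old_new/2` above, which partitions a mixed list of `[old,...]`/`[new,...]` triples into a single `[[old,Olds],[new,News]]` pair while preserving the original order within each group:

```prolog
% ?- group_into_old_new([[new,1,9],[old,2,3],[new,3,15]], H).
% H = [[old,[[old,2,3]]], [new,[[new,1,9],[new,3,15]]]].
```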
+ +read_main_file(Dep99,G) :- +working_directory1(A1,A1), + +repositories_paths([Path]), + +foldr(string_concat,[Path,Dep99, +%"/", +%Go_path1, +"/main_file.txt"],Go_path2), + +(catch(open_file_s(Go_path2,A),_, +(writeln2(["Cannot find",Go_path2]),(writeln(["Missing main_file.txt in " ,Dep99,"/"]),abort)%,abort +))), + +findall([Dep99,B,C,D],(member([B,E],A),member([C,D],E)),G), +%atom_string(Dep99_name1,Dep99_name), + +working_directory1(_,A1), +!. + +%merge_files(_AT233,AT3331,AT3331) :- !. +merge_files(AT233,AT133,AT3331) :- + split_into_lp_files(AT233,AT2331), + split_into_lp_files(AT133,AT1331), + merge_files2(AT2331,AT1331,[],AT333), + foldr(append,AT333,AT3332), + % keep separate tests (sometimes with duplicates %A=1), remove other dups + %trace, + AT3332=AT3331. + %list_to_set1(AT3332,AT3331). + %list_to_set(AT3332,AT3331). + +merge_files2([],AT1331,AT333,AT3331) :- + append(AT333,AT1331,AT3331),!. +merge_files2(AT2331,AT1331,AT333,AT3331) :- + AT2331=[[[[n, comment], [["File delimiter", PZ, FZ]]]|T10]|AT2333], + %merge3(T10) + (member([[[n, comment], [["File delimiter", PZ, FZ]]]|T11],AT1331)-> + (append(T10,T11,T12), + %merge_files3(T10,T11,T12), + delete(AT1331,[[[n, comment], [["File delimiter", PZ, FZ]]]|T11],AT1333)); + (T12=T10,AT1331=AT1333)), + append(AT333,[[[[n, comment], [["File delimiter", PZ, FZ]]]|T12]],AT3332), + %*/ + merge_files2(AT2333,AT1333,AT3332,AT3331). + +/* +merge_files2([],AT1331,AT333,AT3331) :- + append(AT333,AT1331,AT3331),!. 
+merge_files2(AT2331,AT1331,AT333,AT3331) :- + AT2331=[[[[n, comment], [["File delimiter", PZ, FZ]]]|T10]|AT2333], + (member([[[n, comment], [["File delimiter", PZ, FZ]]]|T11],AT1331)-> + (not(string_concat(_,".pl",FZ))-> + (delete(AT1331,[[[n, comment], [["File delimiter", PZ, FZ]]]|T10],AT1333), + append(AT333,[[[[n, comment], [["File delimiter", PZ, FZ]]]|T12]],AT3332), + merge_files2(AT2333,AT1333,AT3332,AT3331)); + + (append(T10,T11,T12), + %merge_files3(T10,T11,T12), + delete(AT1331,[[[n, comment], [["File delimiter", PZ, FZ]]]|T11],AT1333), + append(AT333,[[[[n, comment], [["File delimiter", PZ, FZ]]]|T12]],AT3332), + merge_files2(AT2333,AT1333,AT3332,AT3331))); + (T12=T10,AT1331=AT1333, + append(AT333,[[[[n, comment], [["File delimiter", PZ, FZ]]]|T12]],AT3332), + merge_files2(AT2333,AT1333,AT3332,AT3331))) + .*/ + +merge_files3([],AT1331,AT1331%,AT3331 +) :- + %append(AT333,AT1331,AT13331), + !. +merge_files3(AT2331,AT1331,AT333%,AT3331 +) :- +%writeln1(merge_files3(AT2331,AT1331,AT333)), + AT2331=[[_N0,[Pred_name1|Rest1]]|_AT2333], + pred_rest(Arity1,Rest1,_Lines1), + findall([_N,[Pred_name1|Rest3]],(member([_N2,[Pred_name1|Rest3]],AT2331), + pred_rest(Arity1,Rest3,_Lines3)),Ps), + subtract(AT2331,Ps,Ps2), + %((%trace, +% append(C2,D2,AT2331),append(Ps,E2,D2),foldr(append,[C2,E2],Ps2))->true;AT2331=Ps2),!, + + reverse(AT1331,AT1332), + ((append(A,B,AT1332), + append([[N1,[Pred_name1|Rest4]]],C,B),!, + pred_rest(Arity1,Rest4,_Lines2) + )-> + (%trace, + reverse(A,A1),reverse(C,C1),foldr(append,[C1,[[N1,[Pred_name1|Rest4]]],Ps,A1],AT1334)); + append(AT1331,Ps,AT1334)), + merge_files3(Ps2,AT1334,AT333),!.%,AT3331) :- + +put_in_nums(T49,AT333,T491) :- % leave exact comments, includes x + /*findall(*,(member([Pred_name1|Rest1],T49), + pred_rest(Arity1,Rest1,_Lines2), + get_n_item(AT333,) + ))) + */ +/* + length(AT333,AT333L), + numbers(AT333L,1,[],AT333N), + findall([AT333N1,AT333Item],(member(AT333N1,AT333N), + get_item_n(AT333,AT333N1,AT333Item), + 
member(AT333Item,T49)),T491),!. +*/ +%/* + %length(T49,T49L), + %numbers(T49L,1,[],T49N), + findall([T49N1,T49A],(member(T49A,T49), + once(get_n_item(AT333,T49A,T49N1)) + %member(AT333Item,T49) + ),T491),%sort(T492,T491), + !. +%*/ + +%/* +subtract1([],_,B,B) :- !. +subtract1(A,A1,B,C) :- + A=[D|E], + process_subtract(D,D4), + %foldr(string_concat,["a",_D1,"_",D2],D), + (member(D4,A1)-> + (delete(A1,D4,F), + B=G); + (append(B,[D],G),A1=F)), + subtract1(E,F,G,C). + +process_subtract(D,D4) :- + D=[N1, [[n, D0]|D01]], + string_strings(D0,D3), + append(["a"],D5,D3), + append(_D6,D7,D5 + ), + append(["_"],D2,D7), + foldr(atom_concat,D2,D41), + D4=[N1, [[n, D41]|D01]],!. + %*/ + + +process_put_in_order(B1,B,Num):- + string_strings(B1,D3), + append(["a"],D5,D3), + append(Num1,D7,D5 + ), + append(["_"],D2,D7), + foldr(string_concat,Num1,Num2), + number_string(Num,Num2), + foldr(atom_concat,D2,B),!. + +% leave exact comments, includes x + +% leave comments as AT333B, put rest in order +put_in_order(T4721,AT333B,T47) :- +%trace, +%writeln1(put_in_order(T4721,AT333B,T47)), + findall([A, [N|C]],(member([A, [N|C]],AT333B), + (N=[n,comment]->true;N=":-")),AT333BA), + subtract1(T4721,AT333BA,[],T472), + + findall(B1,(member([_, [[n,B12]|C]],T472), + +process_put_in_order(B12,B,Num), + (%false,B=comment,once(member([A,[[n,B]|C]],AT333B)))->B1=[A,[[n,B]|C]]; + ((once(member([Num,[[n,B]|_C1]],AT333B))%,append(C1,_,C) + )->B1=[Num,[[n,B]|C]]))),D1), + /* + findall([n,B],member([_, [[n,B]|C]],T472),B2), + sort(B2,B3),%length(B3,B3L), + %numbers(B3L,1,[],Ns), + findall(X,(member(X2,B3),findall(X3,(member(X1,T472))) + findall([N1,[[n,B]|C]],(member(N1,Ns),get_item_n(T472,N1,[_, [[n,B]|C]])),B5), + %findall(B3,(member([_, [[n,B]|C]],T472),member([n,B]))) + + %findall([A,[[n,B]|C]],member([A, [[n,B]|C]],AT333B),B21), + findall([n,B],member([A, [[n,B]|C]],AT333B),B21), + sort(B21,B31),length(B31,B31L), + numbers(B31L,1,[],Ns1), + 
findall([N11,B41],(member(N11,Ns1),get_item_n(AT333B,N11,B41)),B51), + %findall(B31,(member([_, [[n,B]|C]],T472),member([n,B]))) + + findall(B1,(member([N1, [[n,B]|C]],B5), +(%false,B=comment,once(member([A,[[n,B]|C]],AT333B)))->B1=[A,[[n,B]|C]]; + ((%trace, + once(member([N1,[A,[[n,B]|_C1]]],B51))%,append(C1,_,C) + )->B1=[A,[[n,B]|C]]))),D1), + */ + append(AT333BA,D1,D), + findall(E,member([E,_],D),F), + sort(F,G), + findall([H,J],(member(H,G),member([H,J],D)),T471), + sort(T471,T47),!. + + +list_to_set1(A,B) :- + list_to_set1(A,[],B),!. + +list_to_set1([],A,A) :- !. +/*list_to_set1(A,B,C) :- + A=[D|E], + (E=[]->append(B,[D],C); + ((D=[[n,comment],[String]],string(String),string_strings(String,C1),contains_assignment(C1))-> + (E=[E1|E2], + ((E1=[[n,comment],[String1]],string(String1),string_strings(String1,C2),contains_assignment(C2))->(append(B,[D],G),list_to_set1(E2,G,C));(append(B,[D],G),list_to_set1(E,G,C)))); + + (%E=F,% + ((append(E3,E4,E), + append([D],E5,E4), + foldr(append,[E3,E5],F))->true; + E=F), + %delete(E,D,F), + append(B,[D],G), + list_to_set1(F,G,C)))),!. + +contains_assignment(C1) :- +append(_A1,Bx,C1),append(_E6,Dxx,Bx),append(E61,Dxx1,Dxx),(append(["."],_Exx,Dxx1)),foldr(string_concat,E61,E612),sub_string(E612,_,_,_,"="). +%subtract1(A,B,C) :- +% subtract1(A,B,[],C),!. +*/ +/* +subtract1(A,[],A) :- !. +subtract1(A,B,G) :- + B=[D|E], + ((append(C2,D2,A),append([D],E2,D2),!)->true;A=E2),subtract1(E2,E,H),foldr(append,[C2,H],G). + %subtract1(F,E,G),!. +*/ + +/* +subtract2(A,[],A) :- !. +subtract2(A,B,G) :- + B=[D|E], + + (D=[[n,comment]|_]-> + A=C;%append(A,[D],G);%)(((append(C2,D2,A),append([D],E2,D2),!)->true;(C2=[],A=E2)),subtract2(E2,E,H),foldr(append,[C2,H],G)); + delete(A,D,C)),subtract2(C,E,G),!. + */ + +/* +sublist(D,A,F) :- + append(C2,D2,A),append([D],E2,D2),foldr(append,[C2,E2],F),!. +*/ +sort1(Tests01,Tests0) :- + sort1(Tests01,[],Tests0),!. +sort1([],B,B) :- !. 
+sort1(A,B,C) :- + A=[D|E], + delete(E,D,F), + append(B,[D],G), + sort1(F,G,C). + +get_order(AT333,AT333B) :- +%trace, +%writeln1(get_order(AT333,AT333B)), + findall(AT333C,(member(AT333D,AT333), + ((AT333D=[N|_],(N=[n, comment]->true;N=":-"))-> + AT333C=AT333D; + (((%trace, + AT333D=[[n, B],Args| _],not(var(Args1)),(not(Args1=":-")))->true;(AT333D=[[n, B]| _],Args=[]))-> + (((%trace, + not(Args=":-"),not(Args=[]))->Args1=[Args];Args1=[]), + AT333C=[[n, B]|Args1])))),AT333E), + %list_to_set(AT333E,AT333F), + AT333E=AT333F, + length(AT333F,AT333FL), + numbers(AT333FL,1,[],AT333FN), + findall([N,AT333G],(member(N,AT333FN), + get_item_n(AT333F,N,AT333G)),AT333B),!. + +/* +get_order2(AT233,AT133,AT333B) :- + get_order(AT333,AT2331), + get_order(AT133,AT1331), + delete(AT1331,[_,[["File delimiter", _, _]]],AT1332), + !. +*/ +%/* +delete_dep99_na([],AT333,AT333) :- !. +delete_dep99_na(Dep99_na,AT333DA,AT333) :- +%trace, + Dep99_na=[[N,A|_]|Dep99_na1], + length(A,L), + length(Args,L), + C=[N,Args|_], + delete(AT333DA,C,AT333DA1), + delete_dep99_na(Dep99_na1,AT333DA1,AT333),!. +%*/ + +delete_repeated_preds(AT333,AT333AB) :- +%trace, + pred_list(PL), + + (PL=[]->AT333=AT333AB; + (findall(PL1,member([_,PL1],PL),PL2), + subtract(AT333,PL2,AT333A), + delete_dep99_na(PL2,AT333A,AT333AB))),!. + +get_item_n_catch(A,B,C) :- + catch(get_item_n(A,B,C),_,(writeln1("Instantiation error."),abort)),!. \ No newline at end of file diff --git a/luciancicd/luciancicd_ws.pl b/luciancicd/luciancicd_ws.pl new file mode 100644 index 0000000000000000000000000000000000000000..4144c230fbc2cb397764ed75f78c67a4390d2a6f --- /dev/null +++ b/luciancicd/luciancicd_ws.pl @@ -0,0 +1,152 @@ +:- use_module(library(http/thread_httpd)). +:- use_module(library(http/http_dispatch)). +:- use_module(library(http/http_error)). +:- use_module(library(http/html_write)). + +% we need this module from the HTTP client library for http_read_data +:- use_module(library(http/http_client)). 
+:- http_handler('/luciancicd', luciancicd_web_form, []). + +%:-include('../SSI/ssi.pl'). +%:-include('../SSI/ssi.pl'). +%:-include('luciancicd1_pl.pl'). + +%:- include('files/listprolog.pl'). + +:-dynamic num1/1. + +luciancicd_server(Port) :- + http_server(http_dispatch, [port(Port)]). + + /* + browse http://127.0.0.1:8000/ + This demonstrates handling POST requests + */ + + luciancicd_web_form(_Request) :- + %html_api_maker_or_terminal(HAMOT), +retractall(html_api_maker_or_terminal(_)), +assertz(html_api_maker_or_terminal(html + %terminal + )), + + retractall(num1(_)),assertz(num1(1)), + + + working_directory(A3,A3), + working_directory(_,'../../lc_logs/'), + format('Content-type: text/html~n~n', []), + +data(Header,Footer), + +format(Header,[]), + +writeln("

Lucian CI/CD



"), +directory_files("./",F), + delete_invisibles_etc(F,G), + + findall([T,N,H,H1,F1],(member(H,G),once((open_string_file_s(H,F11), +(string_concat("log",_,H)->(term_to_atom(F111,F11),F111=[Success,F12],(Success= 0->Success1="PASSED";Success1="FAILED"),foldr(string_concat,["",Success1," - "],Success2));(F11=F12,Success2="")), time_file(H,T),atomic_list_concat(A,"\n",F12),atomic_list_concat(A,"
",F1), + get_num(N),foldr(string_concat,[Success2,"", + H,"", + "

"],H1)%,writeln(H1) + ))),J0), + + sort(J0,J1), + reverse(J1,J), + + findall(_,(member([_,_,_,H1,_],J),writeln(H1)),_), + + findall(_,(member([_,N,H,_,F1],J),foldr(string_concat,["

",H,"

Top
",F1,"

"],H2),writeln(H2)),_), +%Debug=off, + + %test_open_types_cases(4,Query,Types,Modes,Functions), + +%international_lucianpl([lang,"en"],Debug,Query,Types,Modes,Functions,_Result), +%p2lpconverter([file,"../private/la_com_ssi1.pl"],List3), + +%testopen_cases(8,[[n,test]],List3), + %test(1,Query,Functions,Result), + +% Form and HTML Table +%test1(Functions), +%Query=[[n,test]], + %luciancicd_test(List3), + %para(List3), + %international_lucianpl([lang,"en"],Debug,[[n,luciancicd]],List3,_Result1), + working_directory(_,A3), + + +format(Footer,[]), + +retractall(html_api_maker_or_terminal(_)), +assertz(html_api_maker_or_terminal(terminal + %terminal + )) + . + + %term_to_atom(Debug2,'off'), +%term_to_atom(Query2,Query1), +%term_to_atom(Functions2,Functions1), + +%international_interpret([lang,"en"],Debug2,Query2,Functions2,Result), + %%format('

========~n', []), + %%portray_clause + %portray_clause(result), + %%writeln1(Data), + +%format('

'). + + +data(Header,Footer) :- + +Header=' + + + + + Lucian CI/CD + + + + + + + +
+ + + + + + +
+

', + +Footer='

+
+
+ +
+ +'. + +get_num(A) :- num1(A),retractall(num1(_)),A1 is A+1,assertz(num1(A1)). \ No newline at end of file diff --git a/luciancicd/luciancicd_ws1.pl b/luciancicd/luciancicd_ws1.pl new file mode 100644 index 0000000000000000000000000000000000000000..4d8e7f73f531a70f3e833fea669821653fd234eb --- /dev/null +++ b/luciancicd/luciancicd_ws1.pl @@ -0,0 +1,87 @@ +luciancicd_ws1 :- +%trace, +home_dir1(A), +working_directory1(_,A), + +repositories_paths([RP]), + +output_path([OP]), + +foldr(string_concat,[RP],To_m_1), +foldr(string_concat,[OP],R1), + + + find_files(To_m_1,Tests1), + %);Tests1=[]), + + + find_files(R1,RTests), + + working_directory1(_,A), + +findall([T1a,BA1],(member([T1,BA],RTests),remove_end_comments2(BA,BA1),string_concat(R1,T1a,T1)),R110), + + + To=R1, + +term_to_atom(R110,R1101), +term_to_atom(Tests1,Tests11), +term_to_atom(RTests,RTests1), +term_to_atom(R1,R11), +term_to_atom(To_m_1,To_m_11), +%term_to_atom(Repository_root_path,Repository_root_path1), +%term_to_atom(Repository,Repository1), +%term_to_atom(Gitl_data_path1,Gitl_data_path11), +%term_to_atom(N,N1), +%term_to_atom(R1,R1A), +%term_to_atom(N_path,N_path1), +term_to_atom(To,To1), + +%trace, + +foldr(string_concat,["#!/usr/bin/swipl -g main -q\n\n",":-include('../gitl/gitl.pl').\n","handle_error(_Err):-\n halt(1).\n","main :-\n catch((sd2(",R1101,",",Tests11,",",RTests1,",",R11,",",To_m_11,",_,_,_,_,_,_,",To1,",HTML),term_to_atom(HTML,HTML1), write(HTML1)),Err, handle_error(Err)), nl,\n halt.\n","main :- halt(1).\n"],String), + +foldr(string_concat,[%"../private2/luciancicd-testing/",Repository1b,"/",Go_path5, +"tmp.pl"],GP), +%string_concat(Go_path,"testcicd.pl",GP), +open_s(GP,write,S1), +write(S1,String),close(S1), +foldr(string_concat,["chmod +x ",GP,"\n","swipl -g main -q ./",GP],S3),%, + +((catch(bash_command(S3,HTML), _, (foldr(string_concat,["Warning."%%"Error: Can't clone ",User3,"/",Repository3," repository on GitHub." 
+ ],_Text4),%writeln1(Text4), + fail%abort + )),term_to_atom(HTML2,HTML))%sd2(R110,Tests1,RTests,R1,To_m_1,Repository_root_path,Repository,Gitl_data_path1,N,R1,N_path,To,HTML) + ->HTML1=HTML2; + HTML1="Identical"), +delete_tmp, + working_directory1(_,A), + + time1(Time), + %diff_html_n(Diff_html_n), + +working_directory1(A1,A1), + +%repositories_paths([RP]), + +working_directory1(_,RP), + + (exists_directory_s("../lc_logs/")->true;make_directory_s("../lc_logs/")), + +foldr(string_concat,["../lc_logs/diff_html",Time,%"-",Diff_html_n, +".html"],File1), + + %Diff_html_n1 is Diff_html_n+1, + %retractall(diff_html_n(_)), + %assertz(diff_html_n(Diff_html_n1)), + + + string_concat("Diff output
These are the changes.

"%Key
Insertion

Deletion

" + ,HTML1,HTML3), + save_file_s(File1,HTML3), + + working_directory1(_,A1), + + + + !. \ No newline at end of file diff --git a/luciancicd/luciancicdverify.pl b/luciancicd/luciancicdverify.pl new file mode 100644 index 0000000000000000000000000000000000000000..f509f84a4d741e893167cdf9e10dee4b0b123907 --- /dev/null +++ b/luciancicd/luciancicdverify.pl @@ -0,0 +1,246 @@ +% move gh2 contents to tmp folder +% copy each test folder set to gh2 (strings in this file) +% run lc, test against right result +% move tmp back to gh2 + +% a command to delete lc/_x.txt before each test + +% test each lc feature + + +%% lc_test(Total,Score). + +%%:- use_module(library(time)). + +%% Test cases, NTotal=output=total cases, Score=output=result + +% * check if lc fails + +% - reuse deps in tests - say this in lc verify.pl + +lc_test(NTotal,Score) :- + gh_init(false), + findall(_,(lc_test0(_N,_At_start,_Max,_CICD,_Start_files,_End_files)),B),length(B,NTotal), + lc_test(0,NTotal,0,Score,[],List), + findall(_,(member(L,List),writeln(L),nl),_), + !. +lc_test(NTotal,NTotal,Score,Score,List,List) :- !. +lc_test(NTotal1,NTotal2,Score1,Score2,List1,List2) :- + NTotal3 is NTotal1+1, + %gh2tmp, + ((lc_test0(NTotal3,At_start,Max,CICD,Start_files,End_files), + %writeln(***),pwd, + %gh_init(At_start), + luciancicd(At_start,Max,CICD,Start_files,End_files), + sleep(2) + %writeln1([result1,Result1]), + %Result=Result1 + )->(Score3 is Score1+1,append(List1,[[lc_test,NTotal3,passed]],List3));(Score3=Score1,append(List1,[[lc_test,NTotal3,failed]],List3))), + writeln0(""), + %tmp2gh, + lc_test(NTotal3,NTotal2,Score3,Score2,List3,List2),!. 
+ +%% lc_test individual cases, Debug=trace=on or off, N=case number, Passed=output=result + +lc_test1(N,Passed) :- + %gh2tmp, + ((lc_test0(N,At_start,Max,CICD,Start_files,End_files), + gh_init(At_start), + luciancicd(At_start,Max,CICD,Start_files,End_files) + %writeln1([result1,Result1]), + %Result=Result1 + )->(Passed=passed,writeln0([lc_test,N,passed]));(Passed=failed,writeln0([lc_test,N,failed]))), + %tmp2gh, +!. + +lc_test0(1,true, % true - delete config, false - use last test's config + 7, % fail_if_greater_than_n_changes1 overrider + i, % i or d - ci (testing) or cd (changing) +% Start_files +[["c/c.pl","%c(A).\n%A=1.\nc(1).\n%d(A).\n%A=1.\nd(A):-A=1."], +["c/main_file.txt","[[\"c.pl\",[[c,1],[d,1]]]]"]], +% End_files +[["c/c.pl","%c(A).\n%A=1.\nc(1).\n%d(A).\n%A=1.\nd(A):-A=1."], +["c/main_file.txt","[[\"c.pl\",[[c,1],[d,1]]]]"]] +). + +%/* +lc_test0(2,true,7,i, +[["c/c.pl","%a([a,b,c],[],A).\n%A = [a, b, c].\na([],A,A):-!.\na(A,B,C):-A=[D|E],append(B,[D],F),a(E,F,C),!."], +["c/main_file.txt","[[\"c.pl\",[[a,3]]]]"]], +[["c/c.pl","%a([a,b,c],[],A).\n%A = [a, b, c].\na([],A,A):-!.\na(A,B,C):-A=[D|E],append(B,[D],F),a(E,F,C),!."], +["c/main_file.txt","[[\"c.pl\",[[a,3]]]]"]] +). +%*/ + +lc_test0(3,true,7,i, +[["c/c.pl","%a(A).\n%A=1.\na(1).\n%b(A).\n%A=1.\nb(1).\n%c(A).\n%A=1.\nc(1)."], +["c/main_file.txt","[[\"c.pl\",[[a,1],[b,1],[c,1]]]]"]], +[["c/c.pl","%a(A).\n%A=1.\na(1).\n%b(A).\n%A=1.\nb(1).\n%c(A).\n%A=1.\nc(1)."], +["c/main_file.txt","[[\"c.pl\",[[a,1],[b,1],[c,1]]]]"]] +). 
+ +lc_test0(4,true,7,i, +[["c/a.pl",":-include('b.pl').\n:-include('c.pl').\n%a(A).\n%A=1.\na(1)."], +["c/b.pl","%b(A).\n%A=1.\nb(1)."], +["c/c.pl","%c(A).\n%A=1.\nc(1)."], +["c/main_file.txt","[[\"a.pl\",[[a,1]]],[\"b.pl\",[[b,1]]],[\"c.pl\",[[c,1]]]]"]], +[["c/a.pl",":-include('b.pl').\n:-include('c.pl').\n%a(A).\n%A=1.\na(1)."], +["c/b.pl","%b(A).\n%A=1.\nb(1)."], +["c/c.pl","%c(A).\n%A=1.\nc(1)."], +["c/main_file.txt","[[\"a.pl\",[[a,1]]],[\"b.pl\",[[b,1]]],[\"c.pl\",[[c,1]]]]"]] +). + +%/* +lc_test0(5,true,7,i, +[["d/d.pl",":-include('../e/e.pl').\n%d(A).\n%A=1.\nd(A):-e(A)."], +["e/e.pl","%e(A).\n%A=1.\ne(1)."], +["d/main_file.txt","[[\"d.pl\",[[d,1]]]]"], +["e/main_file.txt","[]"]], +[["d/d.pl",":-include('../e/e.pl').\n%d(A).\n%A=1.\nd(A):-e(A)."], +["e/e.pl","%e(A).\n%A=1.\ne(1)."], +["d/main_file.txt","[[\"d.pl\",[[d,1]]]]"], +["e/main_file.txt","[]"]] +). +%*/ +lc_test0(6,true,7,i, +[["c/c.pl","%a(0,A).\n%A=1.\n%a(1,A).\n%A=1.\na(1,1):-!.\na(A,B):-b(A,B).\n%b(0,A).\n%A=1.\nb(A,B):-c(A,B).\n%c(0,A).\n%A=1.\nc(A,B):-C is A+1,a(C,B)."], +["c/main_file.txt","[[\"c.pl\",[[a,2]]]]"]], +[["c/c.pl","%a(0,A).\n%A=1.\n%a(1,A).\n%A=1.\na(1,1):-!.\na(A,B):-b(A,B).\n%b(0,A).\n%A=1.\nb(A,B):-c(A,B).\n%c(0,A).\n%A=1.\nc(A,B):-C is A+1,a(C,B)."], +["c/main_file.txt","[[\"c.pl\",[[a,2]]]]"]] +). +%*/ +lc_test0(7,true,7,i, +[["c/c.pl","%c(A).\n%A=1.\nc(1).\nc(2)."], +["c/main_file.txt","[[\"c.pl\",[[c,1]]]]"]], +[["c/c.pl","%c(A).\n%A=1.\nc(1)."], +["c/main_file.txt","[[\"c.pl\",[[c,1]]]]"]] +). + +lc_test0(8,true,7,i, +[["c/c.pl","%c(A).\n%A=2.\nc(C):-A=1,B=1,C is A+B."], +["c/main_file.txt","[[\"c.pl\",[[c,1]]]]"]], +[["c/c.pl","%c(A).\n%A=2.\nc(C):-A=1,B=1,C is A+B."], +["c/main_file.txt","[[\"c.pl\",[[c,1]]]]"]] +). +%/* + +% Finds A=2,B=1 from A=2,B=2 and A=1,B=1 (test 8). 
+ +lc_test0(9,false,16,d, +[["c/c.pl","%c(A).\n%A=3.\nc(C):-A=2,B=2,C is A+B."], +["c/main_file.txt","[[\"c.pl\",[[c,1]]]]"]], +[["c/c.pl","%c(A).\n%A=3.\nc(C):-A=2,B=1,C is A+B."], +["c/main_file.txt","[[\"c.pl\",[[c,1]]]]"]] +). +%*/ + +lc_test0(10,true,7,i, +[["c/c.pl","%c1(A).\n%A=1.\nc1(C):-d(C).\n%c2(A).\n%A=2.\nc2(C):-d(C).\n%d(A).\n%A=1.\n%d(A).\n%A=2.\nd(1).\nd(2)."], +["c/main_file.txt","[[\"c.pl\",[[c1,1],[c2,1]]]]"]], +[["c/c.pl","%c1(A).\n%A=1.\nc1(C):-d(C).\n%c2(A).\n%A=2.\nc2(C):-d(C).\n%d(A).\n%A=1.\n%d(A).\n%A=2.\nd(1).\nd(2)."], +["c/main_file.txt","[[\"c.pl\",[[c1,1],[c2,1]]]]"]] +). + +lc_test0(11,true,7,i, +[["c/c.pl","%c1(A).\n%A=1.\nc1(1).\n%c1(A).\n%A=2.\nc1(2)."], +["c/main_file.txt","[[\"c.pl\",[[c1,1]]]]"]], +[["c/c.pl","%c1(A).\n%A=1.\nc1(1).\n%c1(A).\n%A=2.\nc1(2)."], +["c/main_file.txt","[[\"c.pl\",[[c1,1]]]]"]] +). + +lc_test0(12,true,7,i, +[["c/c.pl","%c(A).\n%A=1.\nc(1):-writeln(1),writeln(1)."], +["c/main_file.txt","[[\"c.pl\",[[c,1]]]]"]], +[["c/c.pl","%c(A).\n%A=1.\nc(1):-writeln(1),writeln(1)."], +["c/main_file.txt","[[\"c.pl\",[[c,1]]]]"]] +). + +lc_test0(13,true,7,i, +[["c/c.pl","%c1(1,A).\n%A=1.\n%c1(2,A).\n%A=2.\nc1(A,A)."], +["c/main_file.txt","[[\"c.pl\",[[c1,2]]]]"]], +[["c/c.pl","%c1(1,A).\n%A=1.\n%c1(2,A).\n%A=2.\nc1(A,A)."], +["c/main_file.txt","[[\"c.pl\",[[c1,2]]]]"]] +). + +lc_test0(14,true,7,i, +[["d/d.pl",":-include('../e/e.pl').\n%d(A).\n%A=1.\nd(A):-e(A)."], +["e/e.pl","%e(A).\n%A=1.\ne(A):-f(A).\n%f(A).\n%A=1.\nf(1)."], +["d/main_file.txt","[[\"d.pl\",[[d,1]]]]"], +["e/main_file.txt","[]"]], +[["d/d.pl",":-include('../e/e.pl').\n%d(A).\n%A=1.\nd(A):-e(A)."], +["e/e.pl","%e(A).\n%A=1.\ne(A):-f(A).\n%f(A).\n%A=1.\nf(1)."], +["d/main_file.txt","[[\"d.pl\",[[d,1]]]]"], +["e/main_file.txt","[]"]] +). 
+ +lc_test0(15,true,7,i, +[["d/d.pl",":-include('../e/e.pl').\n%d(A).\n%A=1.\nd(A):-e(A).\n%d(A).\n%A=2."], +["e/e.pl","%e(A).\n%A=1.\ne(A):-f(A).\n%f(A).\n%A=1.\nf(1).\n%f(A).\n%A=2.\nf(2)."], +["d/main_file.txt","[[\"d.pl\",[[d,1]]]]"], +["e/main_file.txt","[]"]], +[["d/d.pl",":-include('../e/e.pl').\n%d(A).\n%A=1.\nd(A):-e(A).\n%d(A).\n%A=2."], +["e/e.pl","%e(A).\n%A=1.\ne(A):-f(A).\n%f(A).\n%A=1.\nf(1).\n%f(A).\n%A=2.\nf(2)."], +["d/main_file.txt","[[\"d.pl\",[[d,1]]]]"], +["e/main_file.txt","[]"]] +). + +lc_test0(16,true,7,i, +[["c/c.pl","%d(1,A,2,B).\n%A=1,B=2.\nd(A,A,B,B)."], +["c/main_file.txt","[[\"c.pl\",[[d,4]]]]"]], +[["c/c.pl","%d(1,A,2,B).\n%A=1,B=2.\nd(A,A,B,B)."], +["c/main_file.txt","[[\"c.pl\",[[d,4]]]]"]] +). + +lc_test0(17,true,7,i, +[["c/c.pl","%d.\nd."], +["c/main_file.txt","[[\"c.pl\",[[d,0]]]]"]], +[["c/c.pl","%d.\nd."], +["c/main_file.txt","[[\"c.pl\",[[d,0]]]]"]] +). + +lc_test0(18,true,7,i, +[["c/c.pl","%d(1).\nd(1)."], +["c/main_file.txt","[[\"c.pl\",[[d,1]]]]"]], +[["c/c.pl","%d(1).\nd(1)."], +["c/main_file.txt","[[\"c.pl\",[[d,1]]]]"]] +). + +lc_test0(19,true,7,i, +[["c/c.pl","%not(d).\nd:-false."], +["c/main_file.txt","[[\"c.pl\",[[d,0]]]]"]], +[["c/c.pl","%not(d).\nd:-false."], +["c/main_file.txt","[[\"c.pl\",[[d,0]]]]"]] +). + +%/* +lc_test0(20,true,7,i, +[["c/c.pl","%not(d(1)).\nd(1):-false."], +["c/main_file.txt","[[\"c.pl\",[[d,1]]]]"]], +[["c/c.pl","%not(d(1)).\nd(1):-false."], +["c/main_file.txt","[[\"c.pl\",[[d,1]]]]"]] +). +%*/ + +%/* +lc_test0(21,true,7,i, +[["c/c.pl","%a.\na:-d1(1).\na:-not(d2(1)).\n%d1(A).\n%A=1.\nd1(1).\n%not(d2(1)).\nd2(1):-false."], +["c/main_file.txt","[[\"c.pl\",[[a,0]]]]"]], +[["c/c.pl","%a.\na:-d1(1).\na:-not((d2(1))).\n%d1(A).\n%A=1.\nd1(1).\n%not(d2(1)).\nd2(1):-false."], +["c/main_file.txt","[[\"c.pl\",[[a,0]]]]"]] +). 
+%*/ + +lc_test0(22,true,7,i, +[["c/c.pl","%a(A).\n%A=[1].\na(A):-findall(B,b(B),A).\n%b(A).\n%A=1.\nb(1)."], +["c/main_file.txt","[[\"c.pl\",[[a,1]]]]"]], +[["c/c.pl","%a(A).\n%A=[1].\na(A):-findall(B,b(B),A).\n%b(A).\n%A=1.\nb(1)."], +["c/main_file.txt","[[\"c.pl\",[[a,1]]]]"]] +). + +lc_test0(23,false,15,d, +[["c/c.pl","%a(A).\n%A=[2,1].\na(A):-findall(B,b(B),A).\n%b(A).\n%A=1.\n%b(A).\n%A=2.\nb(2)."], +["c/main_file.txt","[[\"c.pl\",[[a,1]]]]"]], +[["c/c.pl","%a(A).\n%A=[2,1].\na(A):-findall(B,b(B),A).\n%b(A).\n%A=1.\n%b(A).\n%A=2.\nb(2).\nb(1)."], +["c/main_file.txt","[[\"c.pl\",[[a,1]]]]"]] +). diff --git a/luciancicd/luciancicdverify1.pl b/luciancicd/luciancicdverify1.pl new file mode 100644 index 0000000000000000000000000000000000000000..e46f4d4a789401bf5be0ad85ad434306fc47d5ee --- /dev/null +++ b/luciancicd/luciancicdverify1.pl @@ -0,0 +1,216 @@ +gh_init2 :- + + (exists_file_s("luciancicd.pl")->true;(writeln("Error: Please quit Prolog, reload and run luciancicd in its folder."),abort)), + +working_directory1(A1000,A1000), + (time1(_T1)->true;get_time1), + + (home_dir1(HD)->true;true), + +(var(HD)->(retractall(home_dir1(_)),assertz(home_dir1(A1000)));true), + +repositories_paths([RP_1]), + + (exists_directory_s(RP_1)->true;make_directory_s(RP_1)), + +working_directory1(_,A1000), + +output_path([OP_1]), + + (exists_directory_s(OP_1)->true;make_directory_s(OP_1)), + +(exists_directory_s("../private2/")->true;make_directory_recursive_s("./","../private2")), + + +working_directory1(_,A1000). 
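The verifier above can be driven directly from the SWI-Prolog top level. A minimal sketch, assuming the Lucian CI/CD loader and this verifier file have already been consulted (the exact load sequence is an assumption):

```
% Run every lc_test0/6 case; NTotal is bound to the number of
% cases and Score to the number that passed.
?- lc_test(NTotal, Score).

% Re-run a single case by number, e.g. case 8;
% Passed is bound to passed or failed.
?- lc_test1(8, Passed).
```

Cases with CICD mode `d` (e.g. 9 and 23) exercise the change-merging (CD) path, so their End_files intentionally differ from their Start_files.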
+ +gh_init(At_start) :- +working_directory1(A1000,A1000), + + gh_init2, +output_path([OP_1]), + + (exists_directory_s("../private2/luciancicd-cicd-tests")->true;make_directory_recursive_s("./","../private2/luciancicd-cicd-tests")), + +working_directory1(_,A1000), + + working_directory1(A,A), + repositories_paths1([Path]), + working_directory1(_,Path), + + (exists_directory('../gh2_tmp2')-> + (time1(T),string_concat('../gh2_tmp2',T,O2),string_concat(O2,"/",O3),%working_directory1(_,Path), + + O4=O3);(make_directory('../gh2_tmp2'),O4="../gh2_tmp2/")),%make_directory_s(O)), + %string_concat("../../",Path2,Path), + %string_concat(Path3,"/",Path2), + string_concat("../../",OP_2,OP_1), + string_concat(OP_3,"/",OP_2), + foldr(string_concat,[O4,%"/", + OP_2],PX11), + foldr(string_concat,[O4,%"/", + "Github_lc/"],PX21), + %mv_lc(PX,O4), + + + (exists_directory(PX11)->true; + (%trace, + make_directory_recursive_s("./",PX11))), + (exists_directory(PX21)->true; + (%trace, + make_directory_recursive_s("./",PX21))), + + %trace, + foldr(string_concat,["scp -r ../",OP_3,"/ ",O4,"",OP_3,"/."],PX1), + foldr(string_concat,["scp -r ","../Github_lc/"," ",O4,"","Github_lc/","."],PX2), + %mv_lc(PX,O4), + + catch(bash_command(PX1,_), _, (foldr(string_concat,["Warning."%%"Error: Can't clone ",User3,"/",Repository3," repository on GitHub." + ],_Text42)%,writeln1(Text42)%,abort + )), + + catch(bash_command(PX2,_), _, (foldr(string_concat,["Warning."%%"Error: Can't clone ",User3,"/",Repository3," repository on GitHub." + ],_Text42)%,writeln1(Text42)%,abort + )), + +foldr(string_concat,[%"scp -pr ../../Github_lc/ ", + "rm -rf ../",OP_2,"*" + %Folder1 + ],Command315), + catch(bash_command(Command315,_), _, (foldr(string_concat,["Warning."%%"Error: Can't clone ",User3,"/",Repository3," repository on GitHub." 
+ ],_Text42)%,writeln1(Text42)%,abort + )), + + + (At_start=true-> + (foldr(string_concat,[%"scp -pr ../../Github_lc/ ", + "rm -rf ../Github_lc/*" + %Folder1 + ],Command316), + catch(bash_command(Command316,_), _, (foldr(string_concat,["Warning."%%"Error: Can't clone ",User3,"/",Repository3," repository on GitHub." + ],_Text42)%,writeln1(Text42)%,abort + )));true), + working_directory1(_,A), + !. + +gh2tmp :- +%trace, + working_directory1(A,A), + (time1(_T1)->true;get_time1), + repositories_paths1([Path]), + %trace, + working_directory1(_,Path), + %trace, + %pwd, +%(exists_directory('../gh2_tmp')->true;make_directory('../gh2_tmp')), + (exists_directory('../gh2_tmp')-> + %trace, + (time1(T),string_concat('../gh2_tmp',T,O2),string_concat(O2,"/",O3),%working_directory1(_,Path), + O4=O3,make_directory(O4));(%trace, + O4="../gh2_tmp/"),make_directory_s(O4)),%(O)), + %trace, + foldr(string_concat, ["scp -pr ./ ",O4,"."],O41), + + catch(bash_command(O41, O42), _, (foldr(string_concat, ["Warning."], _), writeln1("Couldn't back up repositories."), abort)), + +%trace, + + string_concat("../",Path1,Path), + string_concat("../",Path2,Path1), + working_directory1(_,"../"), + + foldr(string_concat, ["rm -rf ",Path2],O43), + + catch(bash_command(O43, O44), _, (foldr(string_concat, ["Warning."], _), writeln1("Couldn't back up repositories."), abort)), +%trace, +%pwd, + foldr(string_concat, ["mkdir ",Path2],O45), + + catch(bash_command(O45, O46), _, (foldr(string_concat, ["Warning."], _), writeln1("Couldn't back up repositories."), abort)), + %mv_lc("./",O4), + %rm_lc("../gh2_tmp/*"), + %trace, + %mv_lc("./","../gh2_tmp/"), + %rm_lc("./*"), + working_directory1(_,A),!. + +tmp2gh :- !. 
+ + +% rm after +luciancicd(At_start,Max,CICD,Start_files,End_files) :- +%trace, + retractall(c(_)), + assertz(c(CICD)), + retractall(fail_if_greater_than_n_changes2(_)), + assertz(fail_if_greater_than_n_changes2(Max)), + (At_start=true-> + % overwrites existing tests_c.txt, leaves the new one behind for bug checking + gh2tmp;true), + working_directory1(A,A), + repositories_paths1([Path]), + working_directory1(_,Path), + (At_start=true-> + % overwrites existing tests_c.txt, leaves the new one behind for bug checking + (foldr(string_concat,[%"scp -pr ../../Github_lc/ ", + "rm -f ../Github_lc/*" + %Folder1 + ],Command315), + catch(bash_command(Command315,_), _, (foldr(string_concat,["Warning."%%"Error: Can't clone ",User3,"/",Repository3," repository on GitHub." + ],_Text42)%,writeln1(Text42)%,abort + )) + );true), + + findall(_,(member([File_name,Contents],Start_files), + truncate_path(File_name,P,F2), + working_directory1(A1,A1), + (exists_directory(P)->true; + make_directory_recursive_s("./",P)), + working_directory1(_,P), + save_file_s(F2,Contents), + working_directory1(_,A1) + ),_), + working_directory1(_,A), + + + luciancicd, + + (success1(0)-> + ( + output_path([O]), + working_directory1(_,O), + find_files("./",F), + + findall([File_name,Contents1],(member([File_name,Contents],F), + remove_end_comments2(Contents,Contents1)),F0), + + %trace, + + findall([File_name1,Contents],(member([File_name,Contents],End_files), + (string_concat("./",_,File_name)->File_name1=File_name; + (%trace, + string_concat("./",File_name,File_name1)))),End_files1), + + working_directory1(_,A), + + (At_start=true-> + % overwrites existing tests_c.txt, leaves the new one behind for bug checking + tmp2gh;true), + + retractall(fail_if_greater_than_n_changes2(_)), + + msort(F0,F1A), + msort(End_files1,F1B), + + writeln1(["Result",F1A]), + writeln1(["Correct Result",F1B]), + F1A=F1B + );(working_directory1(_,A), + (At_start=true-> + % overwrites existing tests_c.txt, leaves the new one 
behind for bug checking + tmp2gh;true), + + fail)), + + + !. diff --git a/luciancicd/main.pl b/luciancicd/main.pl new file mode 100644 index 0000000000000000000000000000000000000000..96dbd7da8514b1cb982181c77e5c107a82b588e7 --- /dev/null +++ b/luciancicd/main.pl @@ -0,0 +1,3 @@ +main :- time(lc_test(A,B)). + +%time((lc_test1(8,A),sleep(1),lc_test1(9,B))),!. diff --git a/luciancicd/merge3.pl b/luciancicd/merge3.pl new file mode 100644 index 0000000000000000000000000000000000000000..168c2f715165aeab6ffc28c73cddc70e1d71109c --- /dev/null +++ b/luciancicd/merge3.pl @@ -0,0 +1,35 @@ +%abc,ab to aabbc + +:-dynamic(merge31/1). + +% merge3([a,a,b,a],[a,b,c],A). +% A = [a, a, a, b, b, a, c]. + +merge3(A,B,Q) :- + retractall(merge31(_)), + assertz(merge31(A)), + findall([[Y],Z],(member(Y,B),Y=[_,[N|_]|_],Y1=[_,[N|_]|_],findall(Y1,(merge31(A1),member(Y1,A1),delete(A1,Y1,A2),retractall(merge31(_)), + assertz(merge31(A2))),Z)),Q1), + foldr(append,Q1,Q2), + foldr(append,Q2,Q3), + merge31(R),foldr(append,[Q3,R],Q),!. + +merge_files1a(A,B,C) :- + split_into_lp_files(A,AT2331), + split_into_lp_files(B,AT1331), + merge_files2b(AT2331,AT1331,[],AT333), + foldr(append,AT333,C),!. + +merge_files2b([],B,C,D) :- append(C,B,D),!. +merge_files2b(A,B,C,D) :- + A=[[[[n, comment], [["File delimiter", PZ, FZ]]]|T10]|E], + (member([[[n, comment], [["File delimiter", PZ, FZ]]]|T11],B)-> + (merge3(T10,T11,T12), + %merge_files3(T10,T11,T12), + delete(B,[[[n, comment], [["File delimiter", PZ, FZ]]]|T11],B1)); + (T12=T10,B=B1)), + append(C,[[[[n, comment], [["File delimiter", PZ, FZ]]]|T12]],A2), + %*/ + merge_files2b(E,B1,A2,D),!. 
+ + diff --git a/luciancicd/move_to_repository_or_back.pl b/luciancicd/move_to_repository_or_back.pl new file mode 100644 index 0000000000000000000000000000000000000000..d8e7152c34f94a6d59c4e3e4e6b809555fb659c2 --- /dev/null +++ b/luciancicd/move_to_repository_or_back.pl @@ -0,0 +1,88 @@ +% mv luciancicd-testing->luciancicd-testing tmp +% mv gh*->luciancicd-testing +% mv luciancicd-testing tmp->gh* + +move_to_repository_or_back :- + + working_directory1(AAA,AAA), + + %retractall(home_dir(_)),assertz(home_dir(AAA)), + + LCTD="../private2/luciancicd-testing/", + LCTD2="../private2/luciancicd-testing-tmp/", + repositories_paths(K), + output_path([O]), + %rm_lc(O), + (exists_directory_s(O)-> + (time1(T),string_concat(O1,"/",O),string_concat(O1,T,O2),string_concat(O2,"/",O3),mv_lc(O,O3));make_directory_s(O)), + +/* omit_paths(Omit), + + (exists_directory_s(LCTD2)->true;make_directory_s(LCTD2)), + + +findall([K1,G4],(member(K1,K), directory_files(K1,F), + delete_invisibles_etc(F,G), + +%findall(H,(member(H,G),not(string_concat("dot",_,H)), + +subtract(G,Omit,G1), + +findall([G3,G31],(member(G2,G1),foldr(string_concat,[LCTD,K1,G2,"/"],LCTDa), +foldr(string_concat,[LCTD2,K1,G2,"/"],LCTD2a), +foldr(string_concat,[K1,G2,"/"],K1a), +*/ + %mv_lc(LCTDa,LCTD2a), + %mv_lc(K1a,LCTDa), + %mv_lc(LCTD2a,K1a) + mv_lc(LCTD,LCTD2), + +findall(_,(member(K1,K), + %mv_lc(K1,LCTD), + mv_lc(LCTD2,O), + + string_concat(O,"*/testcicd.pl",R1), + + rm_lc(R1) + + ),_), +%),_G4) +%findall(G3,(member(G2,G1),foldr(string_concat,[LCTD2,G2,"/"],G3)),LCTD2_G4), + + + %mv_lc(R,LCTD), + %mv_lc(LCTD2,R) + +%not(member(G,Omit)) + +%),_K01), + + rm_lc(LCTD2), + + + working_directory1(_,AAA),!. 
+ + + + +mv_lc(From,To) :- + foldr(string_concat,[%"scp -pr ../../Github_lc/ ", + "scp -pr " + %"rsync -av --ignore-existing --remove-source-files --exclude=\".*\" " + ,From,%"../../Github_lc/ ", + " ", + To,%," && \\","\n","rsync -av --delete `mktemp -d`/ ",From + "\n","rm -rf ",From,"*"],Command314), + catch(bash_command(Command314,_), _, (foldr(string_concat,["Warning."%%"Error: Can't clone ",User3,"/",Repository3," repository on GitHub." + ],Text41),writeln1(Text41),abort + )),!. + +rm_lc(Item) :- + +foldr(string_concat,[%"scp -pr ../../Github_lc/ ", + "rm -rf ",Item + %Folder1 + ],Command315), + catch(bash_command(Command315,_), _, (foldr(string_concat,["Warning."%%"Error: Can't clone ",User3,"/",Repository3," repository on GitHub." + ],_Text42)%,writeln1(Text42)%,abort + )),!. diff --git a/luciancicd/remove_end_comment.pl b/luciancicd/remove_end_comment.pl new file mode 100644 index 0000000000000000000000000000000000000000..ffdae0c16576483037e3f9c1e012f38065520e4e --- /dev/null +++ b/luciancicd/remove_end_comment.pl @@ -0,0 +1,158 @@ +remove_end_comment :- + +working_directory1(A,A), + +%(exists_directory('../private2/luciancicd-data')->true;make_directory('../private2/luciancicd-data')), + +repositories_paths(K), + +omit_paths(Omit), + +%findall(Omit1,(member(Omit2,Omit),atom_string(Omit1,Omit2)),Omit3), +findall([K1,G4],(member(K1,K), directory_files(K1,F), + delete_invisibles_etc(F,G), + +%findall(H,(member(H,G),not(string_concat("dot",_,H)), + +subtract(G,Omit,G1), + +findall(G3,(member(G2,G1),string_concat(G2,"/",G3)),G4) +%not(member(G,Omit)) + +),K01), +%trace, +%foldr(append,K0,K01), + +working_directory1(Old_D,Old_D), + +findall(Mod_time1,(member([D,K31],K01), + +working_directory1(_,Old_D), + +working_directory1(_,D), + +%member(K2,K31), + +%exists_directory(K2), + +process_directory_remove_end_comment(K31,%_G, + %Omit,% + true, + Mod_time1)%),Mod_time) + ),Mod_time2), + foldr(append,Mod_time2,Mod_time), + + %/* + %trace, 
+findall(_,(member([_,Tests521],Mod_time), +term_to_atom(Tests523,Tests521), +member([K21,Tests522,T],Tests523), +open_s(K21,write,S21), +write(S21,Tests522),close(S21) +,set_time_file(K21,[],[modified(T)]) +),_), +%*/ + +/* + +[["c/main_file.txt","[[""c.pl"",[[c,3]]]] +"],["c/c.pl","%c(1,1,A). +%A=2. +%c(1,2,A). +%A=3. +c(A,B,B4):-B4 is A+B."]] + +findall([F1,Mod_times12], +(member(F2,G),string_concat('../private2/luciancicd-data/',F2,F1), +open_file_s(F1,Mod_times1), +term_to_atom(Mod_times1,Mod_times12)),Mod_times11), +*/ + + working_directory1(_,A) + + + ,!. + +process_directory_remove_end_comment(K,%G, + Top_level,%Mod_time1, + Mod_time61) :- + +%G=K, +%/* +findall(K4,(member(K1,K), directory_files(K1,F), + delete_invisibles_etc(F,G), +%*/ +findall(Mod_time3,(member(H,G),%not(string_concat("dot",_,H)), + +%not(member(H,Omit)), + + +foldr(string_concat,[K1,H],H1), + +% if a file then find modification date +% if a folder then continue finding files in folder +(exists_directory(H1)-> + +(string_concat(H1,"/",H2), +process_directory_remove_end_comment([H2],%[H], + false,%[],%Omit % only omit top level dirs xx + %Mod_time1, + Mod_time3) + %foldr(append,Mod_time31,Mod_time3) + ); + +(time_file(H1,T), +(string_concat(_,".pl",H1)-> +remove_end_comments1(H1,Mod_time4); +open_string_file_s(H1,Mod_time4)), +%trace, +%append(Mod_time1,[[H1,Mod_time4]],Mod_time3))) +Mod_time3=[[H1,Mod_time4,T]])) + +),Mod_time5),%trace, +foldr(append,Mod_time5,Mod_time51), + +%Mod_time5=Mod_time51, + +(Top_level=true%not(Omit=[]) % at top level +-> +( +term_to_atom(Mod_time51,Mod_time52), +string_concat(K3,"/",K1), +foldr(string_concat,[K3,".txt"],K2), +K4=[K2,Mod_time52] +%open_s(K2,write,S), +%write(S,Mod_time52),close(S) + +%writeln(["*",K2, +%Mod_time52] +%) +); +K4=Mod_time51 +) + + + +),Mod_time6), + +(%not(Omit=[])-> +Top_level=true-> +Mod_time6=Mod_time61; +foldr(append,Mod_time6,Mod_time61)), + +!. 
+ +remove_end_comments1(H1,Mod_time4) :- + open_string_file_s(H1,Mod_time5), + remove_end_comments2(Mod_time5,Mod_time4). + +remove_end_comments2(Mod_time5,Mod_time4) :- + string_concat(Mod_time6,"\n% ",Mod_time5), + remove_end_comments2(Mod_time6,Mod_time4). +remove_end_comments2(Mod_time5,Mod_time4) :- + string_concat(Mod_time6,"\n",Mod_time5), + remove_end_comments2(Mod_time6,Mod_time4). +remove_end_comments2(Mod_time4,Mod_time4) :- + not(string_concat(_,"\n% ",Mod_time4)), + not(string_concat(_,"\n",Mod_time4)),!. + diff --git a/luciancicd/settings.pl b/luciancicd/settings.pl new file mode 100644 index 0000000000000000000000000000000000000000..79db4b03620591ba42ab8952acc04551f2d71578 --- /dev/null +++ b/luciancicd/settings.pl @@ -0,0 +1,22 @@ +user("luciangreen"). + +% Note: the following may or may not end in "/". + +repositories_paths1([ +"../../GitHub2/" +]). + +omit_paths1([ +"private2" +%"b" % omits GitHub2/b/ +]). + +output_path1([ +"../../GitHub2o/" +]). + +fail_if_greater_than_n_changes1(%7 +20 +). + +time_limit(1). \ No newline at end of file diff --git a/mindreader/.DS_Store b/mindreader/.DS_Store new file mode 100644 index 0000000000000000000000000000000000000000..5008ddfcf53c02e82d7eee2e57c38e5672ef89f6 Binary files /dev/null and b/mindreader/.DS_Store differ diff --git a/mindreader/LICENSE b/mindreader/LICENSE new file mode 100644 index 0000000000000000000000000000000000000000..d925e8c4aa63a274f76a5c22de79d0ada99b0da7 --- /dev/null +++ b/mindreader/LICENSE @@ -0,0 +1,29 @@ +BSD 3-Clause License + +Copyright (c) 2019, Lucian Green +All rights reserved. + +Redistribution and use in source and binary forms, with or without +modification, are permitted provided that the following conditions are met: + +1. Redistributions of source code must retain the above copyright notice, this + list of conditions and the following disclaimer. + +2. 
Redistributions in binary form must reproduce the above copyright notice, + this list of conditions and the following disclaimer in the documentation + and/or other materials provided with the distribution. + +3. Neither the name of the copyright holder nor the names of its + contributors may be used to endorse or promote products derived from + this software without specific prior written permission. + +THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" +AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE +IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE +DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE +FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL +DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR +SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER +CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, +OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE +OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. diff --git a/mindreader/README.md b/mindreader/README.md new file mode 100644 index 0000000000000000000000000000000000000000..01e99b86db432db010eaa1253a38376a98b142ca --- /dev/null +++ b/mindreader/README.md @@ -0,0 +1,173 @@ +Notice + +Mindreader produces vague mind reading results that remind you of your thoughts but are not actually your thoughts. + +# Mindreader + +* Contains Prolog programs that can be used by an individual to project spiritual images and read their mind using meditation. +* See also the Essay-Helper repository and the Music-Composer repository. +* Build a daily mind reading "framework" allowing you to run Grammar Logic (GL) (details that form the content of high distinctions) and earn money by breasoning them out.
+ +# Getting Started + +Please read the following instructions for installing the project on your computer for mind reading. + +# Prerequisites + +* Please download and install SWI-Prolog for your machine at `https://www.swi-prolog.org/build/`. + +# 1. Install manually + +Download this repository, the List Prolog Interpreter repository and the Text to Breasonings repository. + +# 2. Or Install from List Prolog Package Manager (LPPM) + +* Download the LPPM repository: + +``` +mkdir GitHub +cd GitHub/ +git clone https://github.com/luciangreen/List-Prolog-Package-Manager.git +cd List-Prolog-Package-Manager +swipl +['lppm']. +lppm_install("luciangreen","mindreader"). +halt. +``` + +# Caution + +- Before running the texttobr-based mind reader, think of two radio buttons put on recordings, put through with prayer, nut and bolt, quantum box prayer 1, 1, 0.5 cm and 1, 1, 0.5 cm. + +- Follow the instructions in https://github.com/luciangreen/Text-to-Breasonings/blob/master/Instructions_for_Using_texttobr(2).pl.txt when using texttobr, texttobr2 or the mind reader to avoid medical problems. + +Please follow the instructions in init.txt to initialise any new mind-reading algorithms. + +# Running Mind Reader from Decision Tree + +`cd mindreader` + +``` +['../Music-Composer/mindreadtestmusiccomposer-unusual-mr-tree.pl']. +mind_read(A,[option1,option2]). +``` + +# Also to run + +Additional instructions for preparing to run the screen projector: +Breason out the songs used to help project pixels. +In Shell: +`cd mindreader` +In swipl: +``` +['../Text-to-Breasonings/text_to_breasonings.pl']. +texttobr2(u,"file-songs.txt",u,u,false,false,false,false,false,false). +texttobr(u,"file-songs.txt",u,u). +``` +In Shell: +`cd ../mindreader` +In swipl: +``` +['postsong']. +postsong(10). %% For ten songs +``` +Write a spiritual letter to your federal government (with a done-up A per sentence) to activate the technology.
+ +In Shell: +`cd mindreader` +In swipl: + +``` +['../Text-to-Breasonings/text_to_breasonings.pl']. +texttobr2(u,"letter.txt",u,u,false,false,false,false,false,false). +texttobr2(u,"file-letter.txt",u,u,false,false,false,false,false,false). +``` + +In the following, 1 is the number of times the person has a creation A, and 80 is the number of years the person has daily connections for mind reading. If the person is a bot you can't see, it will work. +``` +N is 1*16+80*365*16,texttobr2(N). +``` + +Display a spiritual screen with 'A' above the screen. +``` +['screen.pl']. +sectest0. +``` + +# Running + +Recognise the objects you are thinking of: +In Shell: +`cd ../mindreader` +In swipl: + +``` +['mindreadtestobj-12scrn-2']. +sectest(1). +``` +Replace 1 with the number of times to repeat the algorithm. + +Recognise more objects you are thinking of: +``` +['mindreadtestobj-12scrn-2medit']. +sectest(1). +``` +Replace 1 with the number of times to repeat the algorithm. + +Recognise the characters you are thinking of: +``` +['mindreadtestobj-12scrn-2chars']. +sectest(1). +``` +Replace 1 with the number of times to repeat the algorithm. + +* If necessary, repeat the "arem" mantra the whole time the mind reading computation is running to ensure the best results. + +# Mind Reading Tree + +* The Mind Reading Tree algorithm converts a list of strings into a decision tree and removes nodes with a single child. +``` +['make_mind_reading_tree4 working1.pl']. + +make_mind_reading_tree4(["aqa","awx","awy"],Tree),writeln1(Tree). + +Tree = [[1,"a",2],[2,"q",[-,"aqa"]],[2,"w",9],[9,"x",[-,"awx"]],[9,"y",[-,"awy"]]] +``` + +* This is accomplished by first converting the list of strings into a decision tree: +``` +string_to_list1(["aqa","awx","awy"],1,_,[],Options2), +maplist(append,[Options2],[Tree]),writeln1(Tree).
+ +Tree = [[1,"a",2],[2,"q",3],[3,"a",[-,"aqa"]],[1,"a",5],[5,"w",6],[6,"x",[-,"awx"]],[1,"a",8],[8,"w",9],[9,"y",[-,"awy"]]] +``` + +* Merging the branches of the decision tree: +``` +merge_lists_a([1],[[1,"a",2],[2,"q",3],[3,"a",[-,"aqa"]],[1,"a",5],[5,"w",6],[6,"x",[-,"awx"]],[1,"a",8],[8,"w",9],[9,"y",[-,"awy"]]],[],MergedTree),writeln1(MergedTree). + +MergedTree = [[1,"a",2],[2,"q",3],[2,"w",9],[3,"a",[-,"aqa"]],[9,"x",[-,"awx"]],[9,"y",[-,"awy"]]] +``` + +* And removing nodes with a single child: +``` +remove_chains_of_one_child_a([1],[[1,"a",2],[2,"q",3],[2,"w",9],[3,"a",[-,"aqa"]],[9,"x",[-,"awx"]],[9,"y",[-,"awy"]]],[],ShortTree),writeln1(ShortTree). + +ShortTree = [[1,"a",2],[2,"q",[-,"aqa"]],[2,"w",9],[9,"x",[-,"awx"]],[9,"y",[-,"awy"]]] +``` + +* (Transition [2,"w",9] is followed by 2 children, so is kept.) + +# Versioning + +We will use SemVer for versioning. + +# Authors + +Lucian Green - Initial programmer - Lucian Academy + +# License + +I licensed this project under the BSD3 License - see the LICENSE.md file for details + diff --git a/mindreader/file-letter.txt b/mindreader/file-letter.txt new file mode 100644 index 0000000000000000000000000000000000000000..d9d68502a713d799c3a78a1f2d9b95159351b730 --- /dev/null +++ b/mindreader/file-letter.txt @@ -0,0 +1,13560 @@ + Call: (8) interpret(off, [[n, f], [["a", "b"], ["a", "b"], [v, d]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], _G14) + Unify: (8) interpret(off, [[n, f], [["a", "b"], ["a", "b"], [v, d]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], _G14) + Call: (9) convert_to_grammar_part1([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [], _G5551) + Unify: (9) convert_to_grammar_part1([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [], _G5551) + Call: (10) convert_to_grammar_part11([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [], _G5551, [], _G5553) + Unify: (10) 
convert_to_grammar_part11([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [], _G5551, [], _G5553) + Call: (11) [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]]=[_G5542|_G5543] + Exit: (11) [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]]=[[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]] + Call: (11) [[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n, =], [[...|...]|...]]]]=[[n, _G5551], _G5554, "->", _G5563] + Fail: (11) [[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n, =], [[...|...]|...]]]]=[[n, _G5551], _G5554, "->", _G5563] + Redo: (10) convert_to_grammar_part11([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [], _G5554, [], _G5556) + Call: (11) [[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n, =], [[...|...]|...]]]]=[[n, _G5551], "->", _G5560] + Fail: (11) [[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n, =], [[...|...]|...]]]]=[[n, _G5551], "->", _G5560] + Redo: (10) convert_to_grammar_part11([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [], _G5551, [], _G5553) + Unify: (10) convert_to_grammar_part11([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [], _G5551, [], _G5553) + Call: (11) [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]]=[_G5542|_G5543] + Exit: (11) [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]]=[[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]] + Call: (11) [[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n, =], [[...|...]|...]]]]=[_G5545, _G5548, ":-", _G5557] + Exit: (11) [[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n, =], [[...|...]|...]]]]=[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n, =], [[...|...]|...]]]] + Call: (11) true + Exit: (11) true + Call: (11) true + Exit: (11) true + Call: (11) true + Exit: (11) true + Call: (11) true + Exit: (11) true + Call: (11) lists:append([], [[[n, f], [[v, c], [v, b], [v|...]], ":-", [[[...|...]|...]]]], _G5572) + Unify: (11) 
lists:append([], [[[n, f], [[v, c], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, f], [[v, c], [v, b], [v|...]], ":-", [[[...|...]|...]]]]) + Exit: (11) lists:append([], [[[n, f], [[v, c], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, f], [[v, c], [v, b], [v|...]], ":-", [[[...|...]|...]]]]) + Call: (11) convert_to_grammar_part11([], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], _G5572, [], _G5574) + Unify: (11) convert_to_grammar_part11([], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [], []) + Exit: (11) convert_to_grammar_part11([], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [], []) + Exit: (10) convert_to_grammar_part11([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [], []) + Call: (10) lists:append([[[n, f], [[v, c], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [], _G5572) + Unify: (10) lists:append([[[n, f], [[v, c], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [], [[[n, f], [[v, c], [v, b], [v|...]], ":-", [[[...|...]|...]]]|_G5564]) + Exit: (10) lists:append([[[n, f], [[v, c], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [], [[[n, f], [[v, c], [v, b], [v|...]], ":-", [[[...|...]|...]]]]) + Exit: (9) convert_to_grammar_part1([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]]) + Call: (9) interpret1(off, [[n, f], [["a", "b"], ["a", "b"], [v, d]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], _G14) + Unify: (9) interpret1(off, [[n, f], [["a", "b"], ["a", "b"], [v, d]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", 
[[[n|...], [...|...]]]]], _G14) +^ Call: (10) retractall(debug(_G5566)) +^ Exit: (10) retractall(debug(_G5566)) +^ Call: (10) assertz(debug(off)) +^ Exit: (10) assertz(debug(off)) +^ Call: (10) retractall(cut(_G5570)) +^ Exit: (10) retractall(cut(_G5570)) +^ Call: (10) assertz(cut(off)) +^ Exit: (10) assertz(cut(off)) + Call: (10) member1([[n, f], [["a", "b"], ["a", "b"], [v, d]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], _G14) + Unify: (10) member1([[n, f], [["a", "b"], ["a", "b"], [v, d]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], _G14) + Call: (11) cut(off) + Unify: (11) cut(off) + Exit: (11) cut(off) + Call: (11) [[n, f], [["a", "b"], ["a", "b"], [v, d]]]=[_G5574, _G5577] + Exit: (11) [[n, f], [["a", "b"], ["a", "b"], [v, d]]]=[[n, f], [["a", "b"], ["a", "b"], [v, d]]] + Call: (11) [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]]=[[[n, f], _G5586, ":-", _G5595]|_G5581] + Exit: (11) [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]]=[[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]] + Call: (11) length([["a", "b"], ["a", "b"], [v, d]], _G5606) + Unify: (11) length([["a", "b"], ["a", "b"], [v, d]], _G5606) + Exit: (11) length([["a", "b"], ["a", "b"], [v, d]], 3) + Call: (11) length([[v, c], [v, b], [v, d]], 3) + Unify: (11) length([[v, c], [v, b], [v, d]], 3) + Unify: (11) length([[v, c], [v, b], [v, d]], 3) + Exit: (11) length([[v, c], [v, b], [v, d]], 3) + Call: (11) [n, f]=[n, grammar] + Fail: (11) [n, f]=[n, grammar] + Redo: (10) member1([[n, f], [["a", "b"], ["a", "b"], [v, d]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], _G14) + Call: (11) [n, f]=[n, grammar_part] + Fail: (11) [n, f]=[n, grammar_part] + Redo: (10) member1([[n, f], 
[["a", "b"], ["a", "b"], [v, d]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], _G14) + Call: (11) checkarguments([["a", "b"], ["a", "b"], [v, d]], [[v, c], [v, b], [v, d]], [], _G5608, [], _G5610) + Unify: (11) checkarguments([["a", "b"], ["a", "b"], [v, d]], [[v, c], [v, b], [v, d]], [], _G5608, [], _G5610) + Call: (12) [["a", "b"], ["a", "b"], [v, d]]=[_G5598|_G5599] + Exit: (12) [["a", "b"], ["a", "b"], [v, d]]=[["a", "b"], ["a", "b"], [v, d]] + Call: (12) expressionnotatom(["a", "b"]) + Unify: (12) expressionnotatom(["a", "b"]) + Call: (13) isvalstrempty(["a", "b"]) + Unify: (13) isvalstrempty(["a", "b"]) + Call: (14) var(["a", "b"]) + Fail: (14) var(["a", "b"]) + Redo: (13) isvalstrempty(["a", "b"]) + Unify: (13) isvalstrempty(["a", "b"]) + Call: (14) isval(["a", "b"]) + Unify: (14) isval(["a", "b"]) + Call: (15) number(["a", "b"]) + Fail: (15) number(["a", "b"]) + Fail: (14) isval(["a", "b"]) + Redo: (13) isvalstrempty(["a", "b"]) + Unify: (13) isvalstrempty(["a", "b"]) + Call: (14) string(["a", "b"]) + Fail: (14) string(["a", "b"]) + Fail: (13) isvalstrempty(["a", "b"]) + Redo: (12) expressionnotatom(["a", "b"]) + Unify: (12) expressionnotatom(["a", "b"]) +^ Call: (13) not(atom(["a", "b"])) +^ Unify: (13) not(user:atom(["a", "b"])) +^ Exit: (13) not(user:atom(["a", "b"])) + Call: (13) length(["a", "b"], _G5614) + Unify: (13) length(["a", "b"], _G5614) + Exit: (13) length(["a", "b"], 2) + Call: (13) 2>=1 + Exit: (13) 2>=1 + Call: (13) expressionnotatom2(["a", "b"]) + Unify: (13) expressionnotatom2(["a", "b"]) + Call: (14) isvalstrempty("a") + Unify: (14) isvalstrempty("a") + Call: (15) var("a") + Fail: (15) var("a") + Redo: (14) isvalstrempty("a") + Unify: (14) isvalstrempty("a") + Call: (15) isval("a") + Unify: (15) isval("a") + Call: (16) number("a") + Fail: (16) number("a") + Fail: (15) isval("a") + Redo: (14) isvalstrempty("a") + Unify: (14) isvalstrempty("a") + 
Call: (15) string("a") + Exit: (15) string("a") + Exit: (14) isvalstrempty("a") + Call: (14) expressionnotatom2(["b"]) + Unify: (14) expressionnotatom2(["b"]) + Call: (15) isvalstrempty("b") + Unify: (15) isvalstrempty("b") + Call: (16) var("b") + Fail: (16) var("b") + Redo: (15) isvalstrempty("b") + Unify: (15) isvalstrempty("b") + Call: (16) isval("b") + Unify: (16) isval("b") + Call: (17) number("b") + Fail: (17) number("b") + Fail: (16) isval("b") + Redo: (15) isvalstrempty("b") + Unify: (15) isvalstrempty("b") + Call: (16) string("b") + Exit: (16) string("b") + Exit: (15) isvalstrempty("b") + Call: (15) expressionnotatom2([]) + Unify: (15) expressionnotatom2([]) + Exit: (15) expressionnotatom2([]) + Exit: (14) expressionnotatom2(["b"]) + Exit: (13) expressionnotatom2(["a", "b"]) + Exit: (12) expressionnotatom(["a", "b"]) + Call: (12) [[v, c], [v, b], [v, d]]=[_G5606|_G5607] + Exit: (12) [[v, c], [v, b], [v, d]]=[[v, c], [v, b], [v, d]] +^ Call: (12) not(var([v, c])) +^ Unify: (12) not(user:var([v, c])) +^ Exit: (12) not(user:var([v, c])) + Call: (12) isvar([v, c]) + Unify: (12) isvar([v, c]) + Exit: (12) isvar([v, c]) + Call: (12) putvalue([v, c], ["a", "b"], [], _G5624) + Unify: (12) putvalue([v, c], ["a", "b"], [], _G5624) +^ Call: (13) not(isvar([v, c])) +^ Unify: (13) not(user:isvar([v, c])) + Call: (14) isvar([v, c]) + Unify: (14) isvar([v, c]) + Exit: (14) isvar([v, c]) +^ Fail: (13) not(user:isvar([v, c])) + Redo: (12) putvalue([v, c], ["a", "b"], [], _G5624) + Call: (13) isvar([v, c]) + Unify: (13) isvar([v, c]) + Exit: (13) isvar([v, c]) + Call: (13) isvalstrorundef(["a", "b"]) + Unify: (13) isvalstrorundef(["a", "b"]) + Call: (14) var(["a", "b"]) + Fail: (14) var(["a", "b"]) + Redo: (13) isvalstrorundef(["a", "b"]) + Unify: (13) isvalstrorundef(["a", "b"]) +^ Call: (14) not(var(["a", "b"])) +^ Unify: (14) not(user:var(["a", "b"])) +^ Exit: (14) not(user:var(["a", "b"])) + Call: (14) isval(["a", "b"]) + Unify: (14) isval(["a", "b"]) + Call: (15) 
number(["a", "b"]) + Fail: (15) number(["a", "b"]) + Fail: (14) isval(["a", "b"]) + Redo: (13) isvalstrorundef(["a", "b"]) + Unify: (13) isvalstrorundef(["a", "b"]) +^ Call: (14) not(var(["a", "b"])) +^ Unify: (14) not(user:var(["a", "b"])) +^ Exit: (14) not(user:var(["a", "b"])) + Call: (14) expression(["a", "b"]) + Unify: (14) expression(["a", "b"]) + Call: (15) isval(["a", "b"]) + Unify: (15) isval(["a", "b"]) + Call: (16) number(["a", "b"]) + Fail: (16) number(["a", "b"]) + Fail: (15) isval(["a", "b"]) + Redo: (14) expression(["a", "b"]) + Unify: (14) expression(["a", "b"]) + Call: (15) string(["a", "b"]) + Fail: (15) string(["a", "b"]) + Redo: (14) expression(["a", "b"]) + Unify: (14) expression(["a", "b"]) + Call: (15) atom(["a", "b"]) + Fail: (15) atom(["a", "b"]) + Redo: (14) expression(["a", "b"]) + Unify: (14) expression(["a", "b"]) +^ Call: (15) not(atom(["a", "b"])) +^ Unify: (15) not(user:atom(["a", "b"])) +^ Exit: (15) not(user:atom(["a", "b"])) + Call: (15) length(["a", "b"], _G5632) + Unify: (15) length(["a", "b"], _G5632) + Exit: (15) length(["a", "b"], 2) + Call: (15) 2>=1 + Exit: (15) 2>=1 + Call: (15) expression2(["a", "b"]) + Unify: (15) expression2(["a", "b"]) + Call: (16) expression3("a") + Unify: (16) expression3("a") + Call: (17) isval("a") + Unify: (17) isval("a") + Call: (18) number("a") + Fail: (18) number("a") + Fail: (17) isval("a") + Redo: (16) expression3("a") + Unify: (16) expression3("a") + Call: (17) string("a") + Exit: (17) string("a") + Exit: (16) expression3("a") + Call: (16) expression2(["b"]) + Unify: (16) expression2(["b"]) + Call: (17) expression3("b") + Unify: (17) expression3("b") + Call: (18) isval("b") + Unify: (18) isval("b") + Call: (19) number("b") + Fail: (19) number("b") + Fail: (18) isval("b") + Redo: (17) expression3("b") + Unify: (17) expression3("b") + Call: (18) string("b") + Exit: (18) string("b") + Exit: (17) expression3("b") + Call: (17) expression2([]) + Unify: (17) expression2([]) + Exit: (17) 
expression2([]) + Exit: (16) expression2(["b"]) + Exit: (15) expression2(["a", "b"]) + Exit: (14) expression(["a", "b"]) + Exit: (13) isvalstrorundef(["a", "b"]) + Call: (13) updatevar([v, c], ["a", "b"], [], _G5634) + Unify: (13) updatevar([v, c], ["a", "b"], [], _G5634) + Call: (14) lists:member([[v, c], empty], []) + Fail: (14) lists:member([[v, c], empty], []) + Redo: (13) updatevar([v, c], ["a", "b"], [], _G5634) +^ Call: (14) not(member([[v, c], _G5630], [])) +^ Unify: (14) not(user:member([[v, c], _G5630], [])) +^ Exit: (14) not(user:member([[v, c], _G5630], [])) + Call: (14) _G5630=empty + Exit: (14) empty=empty + Call: (14) true + Exit: (14) true + Call: (14) lists:append([], [[[v, c], ["a", "b"]]], _G5654) + Unify: (14) lists:append([], [[[v, c], ["a", "b"]]], [[[v, c], ["a", "b"]]]) + Exit: (14) lists:append([], [[[v, c], ["a", "b"]]], [[[v, c], ["a", "b"]]]) + Call: (14) true + Exit: (14) true + Call: (14) true + Exit: (14) true + Exit: (13) updatevar([v, c], ["a", "b"], [], [[[v, c], ["a", "b"]]]) + Exit: (12) putvalue([v, c], ["a", "b"], [], [[[v, c], ["a", "b"]]]) + Call: (12) checkarguments([["a", "b"], [v, d]], [[v, b], [v, d]], [[[v, c], ["a", "b"]]], _G5655, [], _G5657) + Unify: (12) checkarguments([["a", "b"], [v, d]], [[v, b], [v, d]], [[[v, c], ["a", "b"]]], _G5655, [], _G5657) + Call: (13) [["a", "b"], [v, d]]=[_G5645|_G5646] + Exit: (13) [["a", "b"], [v, d]]=[["a", "b"], [v, d]] + Call: (13) expressionnotatom(["a", "b"]) + Unify: (13) expressionnotatom(["a", "b"]) + Call: (14) isvalstrempty(["a", "b"]) + Unify: (14) isvalstrempty(["a", "b"]) + Call: (15) var(["a", "b"]) + Fail: (15) var(["a", "b"]) + Redo: (14) isvalstrempty(["a", "b"]) + Unify: (14) isvalstrempty(["a", "b"]) + Call: (15) isval(["a", "b"]) + Unify: (15) isval(["a", "b"]) + Call: (16) number(["a", "b"]) + Fail: (16) number(["a", "b"]) + Fail: (15) isval(["a", "b"]) + Redo: (14) isvalstrempty(["a", "b"]) + Unify: (14) isvalstrempty(["a", "b"]) + Call: (15) string(["a", "b"]) + 
Fail: (15) string(["a", "b"]) + Fail: (14) isvalstrempty(["a", "b"]) + Redo: (13) expressionnotatom(["a", "b"]) + Unify: (13) expressionnotatom(["a", "b"]) +^ Call: (14) not(atom(["a", "b"])) +^ Unify: (14) not(user:atom(["a", "b"])) +^ Exit: (14) not(user:atom(["a", "b"])) + Call: (14) length(["a", "b"], _G5661) + Unify: (14) length(["a", "b"], _G5661) + Exit: (14) length(["a", "b"], 2) + Call: (14) 2>=1 + Exit: (14) 2>=1 + Call: (14) expressionnotatom2(["a", "b"]) + Unify: (14) expressionnotatom2(["a", "b"]) + Call: (15) isvalstrempty("a") + Unify: (15) isvalstrempty("a") + Call: (16) var("a") + Fail: (16) var("a") + Redo: (15) isvalstrempty("a") + Unify: (15) isvalstrempty("a") + Call: (16) isval("a") + Unify: (16) isval("a") + Call: (17) number("a") + Fail: (17) number("a") + Fail: (16) isval("a") + Redo: (15) isvalstrempty("a") + Unify: (15) isvalstrempty("a") + Call: (16) string("a") + Exit: (16) string("a") + Exit: (15) isvalstrempty("a") + Call: (15) expressionnotatom2(["b"]) + Unify: (15) expressionnotatom2(["b"]) + Call: (16) isvalstrempty("b") + Unify: (16) isvalstrempty("b") + Call: (17) var("b") + Fail: (17) var("b") + Redo: (16) isvalstrempty("b") + Unify: (16) isvalstrempty("b") + Call: (17) isval("b") + Unify: (17) isval("b") + Call: (18) number("b") + Fail: (18) number("b") + Fail: (17) isval("b") + Redo: (16) isvalstrempty("b") + Unify: (16) isvalstrempty("b") + Call: (17) string("b") + Exit: (17) string("b") + Exit: (16) isvalstrempty("b") + Call: (16) expressionnotatom2([]) + Unify: (16) expressionnotatom2([]) + Exit: (16) expressionnotatom2([]) + Exit: (15) expressionnotatom2(["b"]) + Exit: (14) expressionnotatom2(["a", "b"]) + Exit: (13) expressionnotatom(["a", "b"]) + Call: (13) [[v, b], [v, d]]=[_G5653|_G5654] + Exit: (13) [[v, b], [v, d]]=[[v, b], [v, d]] +^ Call: (13) not(var([v, b])) +^ Unify: (13) not(user:var([v, b])) +^ Exit: (13) not(user:var([v, b])) + Call: (13) isvar([v, b]) + Unify: (13) isvar([v, b]) + Exit: (13) isvar([v, b]) + 
Call: (13) putvalue([v, b], ["a", "b"], [[[v, c], ["a", "b"]]], _G5671) + Unify: (13) putvalue([v, b], ["a", "b"], [[[v, c], ["a", "b"]]], _G5671) +^ Call: (14) not(isvar([v, b])) +^ Unify: (14) not(user:isvar([v, b])) + Call: (15) isvar([v, b]) + Unify: (15) isvar([v, b]) + Exit: (15) isvar([v, b]) +^ Fail: (14) not(user:isvar([v, b])) + Redo: (13) putvalue([v, b], ["a", "b"], [[[v, c], ["a", "b"]]], _G5671) + Call: (14) isvar([v, b]) + Unify: (14) isvar([v, b]) + Exit: (14) isvar([v, b]) + Call: (14) isvalstrorundef(["a", "b"]) + Unify: (14) isvalstrorundef(["a", "b"]) + Call: (15) var(["a", "b"]) + Fail: (15) var(["a", "b"]) + Redo: (14) isvalstrorundef(["a", "b"]) + Unify: (14) isvalstrorundef(["a", "b"]) +^ Call: (15) not(var(["a", "b"])) +^ Unify: (15) not(user:var(["a", "b"])) +^ Exit: (15) not(user:var(["a", "b"])) + Call: (15) isval(["a", "b"]) + Unify: (15) isval(["a", "b"]) + Call: (16) number(["a", "b"]) + Fail: (16) number(["a", "b"]) + Fail: (15) isval(["a", "b"]) + Redo: (14) isvalstrorundef(["a", "b"]) + Unify: (14) isvalstrorundef(["a", "b"]) +^ Call: (15) not(var(["a", "b"])) +^ Unify: (15) not(user:var(["a", "b"])) +^ Exit: (15) not(user:var(["a", "b"])) + Call: (15) expression(["a", "b"]) + Unify: (15) expression(["a", "b"]) + Call: (16) isval(["a", "b"]) + Unify: (16) isval(["a", "b"]) + Call: (17) number(["a", "b"]) + Fail: (17) number(["a", "b"]) + Fail: (16) isval(["a", "b"]) + Redo: (15) expression(["a", "b"]) + Unify: (15) expression(["a", "b"]) + Call: (16) string(["a", "b"]) + Fail: (16) string(["a", "b"]) + Redo: (15) expression(["a", "b"]) + Unify: (15) expression(["a", "b"]) + Call: (16) atom(["a", "b"]) + Fail: (16) atom(["a", "b"]) + Redo: (15) expression(["a", "b"]) + Unify: (15) expression(["a", "b"]) +^ Call: (16) not(atom(["a", "b"])) +^ Unify: (16) not(user:atom(["a", "b"])) +^ Exit: (16) not(user:atom(["a", "b"])) + Call: (16) length(["a", "b"], _G5679) + Unify: (16) length(["a", "b"], _G5679) + Exit: (16) length(["a", "b"], 
2) + Call: (16) 2>=1 + Exit: (16) 2>=1 + Call: (16) expression2(["a", "b"]) + Unify: (16) expression2(["a", "b"]) + Call: (17) expression3("a") + Unify: (17) expression3("a") + Call: (18) isval("a") + Unify: (18) isval("a") + Call: (19) number("a") + Fail: (19) number("a") + Fail: (18) isval("a") + Redo: (17) expression3("a") + Unify: (17) expression3("a") + Call: (18) string("a") + Exit: (18) string("a") + Exit: (17) expression3("a") + Call: (17) expression2(["b"]) + Unify: (17) expression2(["b"]) + Call: (18) expression3("b") + Unify: (18) expression3("b") + Call: (19) isval("b") + Unify: (19) isval("b") + Call: (20) number("b") + Fail: (20) number("b") + Fail: (19) isval("b") + Redo: (18) expression3("b") + Unify: (18) expression3("b") + Call: (19) string("b") + Exit: (19) string("b") + Exit: (18) expression3("b") + Call: (18) expression2([]) + Unify: (18) expression2([]) + Exit: (18) expression2([]) + Exit: (17) expression2(["b"]) + Exit: (16) expression2(["a", "b"]) + Exit: (15) expression(["a", "b"]) + Exit: (14) isvalstrorundef(["a", "b"]) + Call: (14) updatevar([v, b], ["a", "b"], [[[v, c], ["a", "b"]]], _G5681) + Unify: (14) updatevar([v, b], ["a", "b"], [[[v, c], ["a", "b"]]], _G5681) + Call: (15) lists:member([[v, b], empty], [[[v, c], ["a", "b"]]]) + Unify: (15) lists:member([[v, b], empty], [[[v, c], ["a", "b"]]]) + Fail: (15) lists:member([[v, b], empty], [[[v, c], ["a", "b"]]]) + Redo: (14) updatevar([v, b], ["a", "b"], [[[v, c], ["a", "b"]]], _G5681) +^ Call: (15) not(member([[v, b], _G5677], [[[v, c], ["a", "b"]]])) +^ Unify: (15) not(user:member([[v, b], _G5677], [[[v, c], ["a", "b"]]])) +^ Exit: (15) not(user:member([[v, b], _G5677], [[[v, c], ["a", "b"]]])) + Call: (15) _G5677=empty + Exit: (15) empty=empty + Call: (15) true + Exit: (15) true + Call: (15) lists:append([[[v, c], ["a", "b"]]], [[[v, b], ["a", "b"]]], _G5701) + Unify: (15) lists:append([[[v, c], ["a", "b"]]], [[[v, b], ["a", "b"]]], [[[v, c], ["a", "b"]]|_G5693]) + Exit: (15) 
lists:append([[[v, c], ["a", "b"]]], [[[v, b], ["a", "b"]]], [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]]]) + Call: (15) true + Exit: (15) true + Call: (15) true + Exit: (15) true + Exit: (14) updatevar([v, b], ["a", "b"], [[[v, c], ["a", "b"]]], [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]]]) + Exit: (13) putvalue([v, b], ["a", "b"], [[[v, c], ["a", "b"]]], [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]]]) + Call: (13) checkarguments([[v, d]], [[v, d]], [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]]], _G5705, [], _G5707) + Unify: (13) checkarguments([[v, d]], [[v, d]], [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]]], _G5705, [], _G5707) + Call: (14) [[v, d]]=[_G5695|_G5696] + Exit: (14) [[v, d]]=[[v, d]] + Call: (14) expressionnotatom([v, d]) + Unify: (14) expressionnotatom([v, d]) + Call: (15) isvalstrempty([v, d]) + Unify: (15) isvalstrempty([v, d]) + Call: (16) var([v, d]) + Fail: (16) var([v, d]) + Redo: (15) isvalstrempty([v, d]) + Unify: (15) isvalstrempty([v, d]) + Call: (16) isval([v, d]) + Unify: (16) isval([v, d]) + Call: (17) number([v, d]) + Fail: (17) number([v, d]) + Fail: (16) isval([v, d]) + Redo: (15) isvalstrempty([v, d]) + Unify: (15) isvalstrempty([v, d]) + Call: (16) string([v, d]) + Fail: (16) string([v, d]) + Fail: (15) isvalstrempty([v, d]) + Redo: (14) expressionnotatom([v, d]) + Unify: (14) expressionnotatom([v, d]) +^ Call: (15) not(atom([v, d])) +^ Unify: (15) not(user:atom([v, d])) +^ Exit: (15) not(user:atom([v, d])) + Call: (15) length([v, d], _G5711) + Unify: (15) length([v, d], _G5711) + Exit: (15) length([v, d], 2) + Call: (15) 2>=1 + Exit: (15) 2>=1 + Call: (15) expressionnotatom2([v, d]) + Unify: (15) expressionnotatom2([v, d]) + Call: (16) isvalstrempty(v) + Unify: (16) isvalstrempty(v) + Call: (17) var(v) + Fail: (17) var(v) + Redo: (16) isvalstrempty(v) + Unify: (16) isvalstrempty(v) + Call: (17) isval(v) + Unify: (17) isval(v) + Call: (18) number(v) + Fail: (18) number(v) + Fail: (17) isval(v) + Redo: (16) isvalstrempty(v) + Unify: (16) 
isvalstrempty(v) + Call: (17) string(v) + Fail: (17) string(v) + Fail: (16) isvalstrempty(v) + Fail: (15) expressionnotatom2([v, d]) + Redo: (14) expressionnotatom([v, d]) + Unify: (14) expressionnotatom([v, d]) + Call: (15) predicate_or_rule_name([v, d]) + Fail: (15) predicate_or_rule_name([v, d]) + Fail: (14) expressionnotatom([v, d]) + Redo: (13) checkarguments([[v, d]], [[v, d]], [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]]], _G5705, [], _G5707) + Unify: (13) checkarguments([[v, d]], [[v, d]], [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]]], _G5705, [], _G5707) + Call: (14) [[v, d]]=[_G5695|_G5696] + Exit: (14) [[v, d]]=[[v, d]] +^ Call: (14) not(var([v, d])) +^ Unify: (14) not(user:var([v, d])) +^ Exit: (14) not(user:var([v, d])) + Call: (14) isvar([v, d]) + Unify: (14) isvar([v, d]) + Exit: (14) isvar([v, d]) + Call: (14) [[v, d]]=[_G5703|_G5704] + Exit: (14) [[v, d]]=[[v, d]] + Call: (14) expressionnotatom([v, d]) + Unify: (14) expressionnotatom([v, d]) + Call: (15) isvalstrempty([v, d]) + Unify: (15) isvalstrempty([v, d]) + Call: (16) var([v, d]) + Fail: (16) var([v, d]) + Redo: (15) isvalstrempty([v, d]) + Unify: (15) isvalstrempty([v, d]) + Call: (16) isval([v, d]) + Unify: (16) isval([v, d]) + Call: (17) number([v, d]) + Fail: (17) number([v, d]) + Fail: (16) isval([v, d]) + Redo: (15) isvalstrempty([v, d]) + Unify: (15) isvalstrempty([v, d]) + Call: (16) string([v, d]) + Fail: (16) string([v, d]) + Fail: (15) isvalstrempty([v, d]) + Redo: (14) expressionnotatom([v, d]) + Unify: (14) expressionnotatom([v, d]) +^ Call: (15) not(atom([v, d])) +^ Unify: (15) not(user:atom([v, d])) +^ Exit: (15) not(user:atom([v, d])) + Call: (15) length([v, d], _G5719) + Unify: (15) length([v, d], _G5719) + Exit: (15) length([v, d], 2) + Call: (15) 2>=1 + Exit: (15) 2>=1 + Call: (15) expressionnotatom2([v, d]) + Unify: (15) expressionnotatom2([v, d]) + Call: (16) isvalstrempty(v) + Unify: (16) isvalstrempty(v) + Call: (17) var(v) + Fail: (17) var(v) + Redo: (16) 
isvalstrempty(v) + Unify: (16) isvalstrempty(v) + Call: (17) isval(v) + Unify: (17) isval(v) + Call: (18) number(v) + Fail: (18) number(v) + Fail: (17) isval(v) + Redo: (16) isvalstrempty(v) + Unify: (16) isvalstrempty(v) + Call: (17) string(v) + Fail: (17) string(v) + Fail: (16) isvalstrempty(v) + Fail: (15) expressionnotatom2([v, d]) + Redo: (14) expressionnotatom([v, d]) + Unify: (14) expressionnotatom([v, d]) + Call: (15) predicate_or_rule_name([v, d]) + Fail: (15) predicate_or_rule_name([v, d]) + Fail: (14) expressionnotatom([v, d]) + Redo: (13) checkarguments([[v, d]], [[v, d]], [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]]], _G5705, [], _G5707) + Unify: (13) checkarguments([[v, d]], [[v, d]], [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]]], _G5705, [], _G5707) + Call: (14) [[v, d]]=[_G5695|_G5696] + Exit: (14) [[v, d]]=[[v, d]] +^ Call: (14) not(var([v, d])) +^ Unify: (14) not(user:var([v, d])) +^ Exit: (14) not(user:var([v, d])) + Call: (14) isvar([v, d]) + Unify: (14) isvar([v, d]) + Exit: (14) isvar([v, d]) + Call: (14) [[v, d]]=[_G5703|_G5704] + Exit: (14) [[v, d]]=[[v, d]] +^ Call: (14) not(var([v, d])) +^ Unify: (14) not(user:var([v, d])) +^ Exit: (14) not(user:var([v, d])) + Call: (14) isvar([v, d]) + Unify: (14) isvar([v, d]) + Exit: (14) isvar([v, d]) + Call: (14) getvalue([v, d], _G5719, [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]]]) + Unify: (14) getvalue([v, d], _G5719, [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]]]) +^ Call: (15) not(isvar([v, d])) +^ Unify: (15) not(user:isvar([v, d])) + Call: (16) isvar([v, d]) + Unify: (16) isvar([v, d]) + Exit: (16) isvar([v, d]) +^ Fail: (15) not(user:isvar([v, d])) + Redo: (14) getvalue([v, d], _G5719, [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]]]) + Call: (15) isvar([v, d]) + Unify: (15) isvar([v, d]) + Exit: (15) isvar([v, d]) + Call: (15) isvalstrorundef(_G5718) + Unify: (15) isvalstrorundef(_G5718) + Call: (16) var(_G5718) + Exit: (16) var(_G5718) + Exit: (15) isvalstrorundef(_G5718) + Call: (15) getvar([v, d], 
_G5719, [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]]]) + Unify: (15) getvar([v, d], _G5719, [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]]]) + Call: (16) lists:member([[v, d], _G5714], [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]]]) + Unify: (16) lists:member([[v, d], _G5714], [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]]]) + Fail: (16) lists:member([[v, d], _G5714], [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]]]) + Redo: (15) getvar([v, d], _G5719, [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]]]) + Unify: (15) getvar([v, d], empty, [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]]]) +^ Call: (16) not(member([[v, d], _G5717], [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]]])) +^ Unify: (16) not(user:member([[v, d], _G5717], [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]]])) +^ Exit: (16) not(user:member([[v, d], _G5717], [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]]])) + Call: (16) true + Exit: (16) true + Exit: (15) getvar([v, d], empty, [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]]]) + Exit: (14) getvalue([v, d], empty, [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]]]) + Call: (14) true + Exit: (14) true + Call: (14) putvalue([v, d], empty, [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]]], _G5733) + Unify: (14) putvalue([v, d], empty, [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]]], _G5733) +^ Call: (15) not(isvar([v, d])) +^ Unify: (15) not(user:isvar([v, d])) + Call: (16) isvar([v, d]) + Unify: (16) isvar([v, d]) + Exit: (16) isvar([v, d]) +^ Fail: (15) not(user:isvar([v, d])) + Redo: (14) putvalue([v, d], empty, [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]]], _G5733) + Call: (15) isvar([v, d]) + Unify: (15) isvar([v, d]) + Exit: (15) isvar([v, d]) + Call: (15) isvalstrorundef(empty) + Unify: (15) isvalstrorundef(empty) + Call: (16) var(empty) + Fail: (16) var(empty) + Redo: (15) isvalstrorundef(empty) + Unify: (15) isvalstrorundef(empty) +^ Call: (16) not(var(empty)) +^ Unify: (16) not(user:var(empty)) +^ Exit: (16) not(user:var(empty)) + Call: (16) isval(empty) + Unify: (16) isval(empty) + Call: (17) 
number(empty) + Fail: (17) number(empty) + Fail: (16) isval(empty) + Redo: (15) isvalstrorundef(empty) + Unify: (15) isvalstrorundef(empty) +^ Call: (16) not(var(empty)) +^ Unify: (16) not(user:var(empty)) +^ Exit: (16) not(user:var(empty)) + Call: (16) expression(empty) + Unify: (16) expression(empty) + Exit: (16) expression(empty) + Exit: (15) isvalstrorundef(empty) + Call: (15) updatevar([v, d], empty, [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]]], _G5738) + Unify: (15) updatevar([v, d], empty, [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]]], _G5738) + Call: (16) lists:member([[v, d], empty], [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]]]) + Unify: (16) lists:member([[v, d], empty], [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]]]) + Fail: (16) lists:member([[v, d], empty], [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]]]) + Redo: (15) updatevar([v, d], empty, [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]]], _G5738) +^ Call: (16) not(member([[v, d], _G5734], [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]]])) +^ Unify: (16) not(user:member([[v, d], _G5734], [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]]])) +^ Exit: (16) not(user:member([[v, d], _G5734], [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]]])) + Call: (16) _G5734=empty + Exit: (16) empty=empty + Call: (16) true + Exit: (16) true + Call: (16) lists:append([[[v, c], ["a", "b"]], [[v, b], ["a", "b"]]], [[[v, d], empty]], _G5758) + Unify: (16) lists:append([[[v, c], ["a", "b"]], [[v, b], ["a", "b"]]], [[[v, d], empty]], [[[v, c], ["a", "b"]]|_G5750]) + Exit: (16) lists:append([[[v, c], ["a", "b"]], [[v, b], ["a", "b"]]], [[[v, d], empty]], [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]], [[v, d], empty]]) + Call: (16) true + Exit: (16) true + Call: (16) true + Exit: (16) true + Exit: (15) updatevar([v, d], empty, [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]]], [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]], [[v, d], empty]]) + Exit: (14) putvalue([v, d], empty, [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]]], [[[v, c], ["a", "b"]], [[v, b], ["a", 
"b"]], [[v, d], empty]]) + Call: (14) lists:append([], [[[v, d], [v, d]]], _G5773) + Unify: (14) lists:append([], [[[v, d], [v, d]]], [[[v, d], [v, d]]]) + Exit: (14) lists:append([], [[[v, d], [v, d]]], [[[v, d], [v, d]]]) + Call: (14) checkarguments([], [], [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]], [[v, d], empty]], _G5774, [[[v, d], [v, d]]], _G5776) + Unify: (14) checkarguments([], [], [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]], [[v, d], empty]], [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]], [[v, d], empty]], [[[v, d], [v, d]]], [[[v, d], [v, d]]]) + Exit: (14) checkarguments([], [], [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]], [[v, d], empty]], [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]], [[v, d], empty]], [[[v, d], [v, d]]], [[[v, d], [v, d]]]) + Exit: (13) checkarguments([[v, d]], [[v, d]], [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]]], [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]], [[v, d], empty]], [], [[[v, d], [v, d]]]) + Exit: (12) checkarguments([["a", "b"], [v, d]], [[v, b], [v, d]], [[[v, c], ["a", "b"]]], [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]], [[v, d], empty]], [], [[[v, d], [v, d]]]) + Exit: (11) checkarguments([["a", "b"], ["a", "b"], [v, d]], [[v, c], [v, b], [v, d]], [], [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]], [[v, d], empty]], [], [[[v, d], [v, d]]]) + Call: (11) debug(on) + Fail: (11) debug(on) + Redo: (10) member1([[n, f], [["a", "b"], ["a", "b"], [v, d]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], _G14) + Call: (11) true + Exit: (11) true + Call: (11) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]], [[v, d], empty]], _G5774, [[[n, =], [[v, d], [v, c]]]], true) + Unify: (11) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", 
[[[n|...], [...|...]]]]], [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]], [[v, d], empty]], _G5774, [[[n, =], [[v, d], [v, c]]]], true) + Call: (12) [[[n, =], [[v, d], [v, c]]]]=[_G5764|_G5765] + Exit: (12) [[[n, =], [[v, d], [v, c]]]]=[[[n, =], [[v, d], [v, c]]]] +^ Call: (12) not(predicate_or_rule_name([[n, =], [[v, d], [v, c]]])) +^ Unify: (12) not(user:predicate_or_rule_name([[n, =], [[v, d], [v, c]]])) + Call: (13) predicate_or_rule_name([[n, =], [[v, d], [v, c]]]) + Fail: (13) predicate_or_rule_name([[n, =], [[v, d], [v, c]]]) +^ Exit: (12) not(user:predicate_or_rule_name([[n, =], [[v, d], [v, c]]])) + Call: (12) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]], [[v, d], empty]], _G5782, [[n, =], [[v, d], [v, c]]], _G5784) + Unify: (12) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]], [[v, d], empty]], _G5782, [[n, =], [[v, d], [v, c]]], _G5784) + Call: (13) [[n, =], [[v, d], [v, c]]]=[_G5772|_G5773] + Exit: (13) [[n, =], [[v, d], [v, c]]]=[[n, =], [[v, d], [v, c]]] +^ Call: (13) not(predicate_or_rule_name([n, =])) +^ Unify: (13) not(user:predicate_or_rule_name([n, =])) + Call: (14) predicate_or_rule_name([n, =]) + Unify: (14) predicate_or_rule_name([n, =]) + Exit: (14) predicate_or_rule_name([n, =]) +^ Fail: (13) not(user:predicate_or_rule_name([n, =])) + Redo: (12) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]], [[v, d], empty]], _G5782, [[n, =], [[v, d], [v, c]]], _G5784) + Unify: (12) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], 
[...|...]]]]], [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]], [[v, d], empty]], _G5782, [[n, =], [[v, d], [v, c]]], _G5784) + Call: (13) [[n, =], [[v, d], [v, c]]]=[[not, [_G5781]]|_G5773] + Fail: (13) [[n, =], [[v, d], [v, c]]]=[[not, [_G5781]]|_G5773] + Redo: (12) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]], [[v, d], empty]], _G5782, [[n, =], [[v, d], [v, c]]], _G5784) + Unify: (12) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]], [[v, d], empty]], _G5782, [[n, =], [[v, d], [v, c]]], _G5784) + Call: (13) [[n, =], [[v, d], [v, c]]]=[[_G5775], or, [_G5784]] + Fail: (13) [[n, =], [[v, d], [v, c]]]=[[_G5775], or, [_G5784]] + Redo: (12) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]], [[v, d], empty]], _G5782, [[n, =], [[v, d], [v, c]]], _G5784) + Unify: (12) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]], [[v, d], empty]], _G5782, [[n, =], [[v, d], [v, c]]], _G5784) + Call: (13) [[n, =], [[v, d], [v, c]]]=[_G5772|_G5773] + Exit: (13) [[n, =], [[v, d], [v, c]]]=[[n, =], [[v, d], [v, c]]] +^ Call: (13) not(predicate_or_rule_name([n, =])) +^ Unify: (13) not(user:predicate_or_rule_name([n, =])) + Call: (14) predicate_or_rule_name([n, =]) + Unify: (14) predicate_or_rule_name([n, =]) + Exit: (14) predicate_or_rule_name([n, =]) +^ Fail: (13) not(user:predicate_or_rule_name([n, =])) + Fail: (12) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, 
b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]], [[v, d], empty]], _G5782, [[n, =], [[v, d], [v, c]]], _G5784) + Redo: (11) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]], [[v, d], empty]], _G5774, [[[n, =], [[v, d], [v, c]]]], true) + Unify: (11) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]], [[v, d], empty]], _G5774, [[[n, =], [[v, d], [v, c]]]], true) + Call: (12) [[[n, =], [[v, d], [v, c]]]]=[[not, [_G5773]]|_G5765] + Fail: (12) [[[n, =], [[v, d], [v, c]]]]=[[not, [_G5773]]|_G5765] + Redo: (11) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]], [[v, d], empty]], _G5774, [[[n, =], [[v, d], [v, c]]]], true) + Unify: (11) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]], [[v, d], empty]], _G5774, [[[n, =], [[v, d], [v, c]]]], true) + Call: (12) [[[n, =], [[v, d], [v, c]]]]=[[_G5767], or, [_G5776]] + Fail: (12) [[[n, =], [[v, d], [v, c]]]]=[[_G5767], or, [_G5776]] + Redo: (11) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]], [[v, d], empty]], _G5774, [[[n, =], [[v, d], [v, c]]]], true) + Unify: (11) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]], [[v, d], empty]], 
_G5774, [[[n, =], [[v, d], [v, c]]]], true) + Call: (12) [[[n, =], [[v, d], [v, c]]]]=[_G5764|_G5765] + Exit: (12) [[[n, =], [[v, d], [v, c]]]]=[[[n, =], [[v, d], [v, c]]]] +^ Call: (12) not(predicate_or_rule_name([[n, =], [[v, d], [v, c]]])) +^ Unify: (12) not(user:predicate_or_rule_name([[n, =], [[v, d], [v, c]]])) + Call: (13) predicate_or_rule_name([[n, =], [[v, d], [v, c]]]) + Fail: (13) predicate_or_rule_name([[n, =], [[v, d], [v, c]]]) +^ Exit: (12) not(user:predicate_or_rule_name([[n, =], [[v, d], [v, c]]])) + Call: (12) interpretstatement1([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[n, =], [[v, d], [v, c]]], [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]], [[v, d], empty]], _G5783, _G5784, _G5785) + Unify: (12) interpretstatement1([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[n, =], [[v, d], [v, c]]], [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]], [[v, d], empty]], _G5783, true, nocut) + Call: (13) isop(=) + Unify: (13) isop(=) + Exit: (13) isop(=) + Call: (13) interpretpart(is, [v, d], [v, c], [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]], [[v, d], empty]], _G5783) + Unify: (13) interpretpart(is, [v, d], [v, c], [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]], [[v, d], empty]], _G5783) + Call: (14) getvalues([v, d], [v, c], _G5781, _G5782, [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]], [[v, d], empty]]) + Unify: (14) getvalues([v, d], [v, c], _G5781, _G5782, [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]], [[v, d], empty]]) + Call: (15) getvalue([v, d], _G5780, [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]], [[v, d], empty]]) + Unify: (15) getvalue([v, d], _G5780, [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]], [[v, d], empty]]) +^ Call: (16) not(isvar([v, d])) +^ Unify: (16) not(user:isvar([v, d])) + Call: (17) isvar([v, d]) + Unify: (17) isvar([v, d]) + Exit: (17) isvar([v, d]) +^ Fail: (16) 
not(user:isvar([v, d])) + Redo: (15) getvalue([v, d], _G5780, [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]], [[v, d], empty]]) + Call: (16) isvar([v, d]) + Unify: (16) isvar([v, d]) + Exit: (16) isvar([v, d]) + Call: (16) isvalstrorundef(_G5779) + Unify: (16) isvalstrorundef(_G5779) + Call: (17) var(_G5779) + Exit: (17) var(_G5779) + Exit: (16) isvalstrorundef(_G5779) + Call: (16) getvar([v, d], _G5780, [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]], [[v, d], empty]]) + Unify: (16) getvar([v, d], _G5780, [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]], [[v, d], empty]]) + Call: (17) lists:member([[v, d], _G5775], [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]], [[v, d], empty]]) + Unify: (17) lists:member([[v, d], _G5775], [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]], [[v, d], empty]]) + Exit: (17) lists:member([[v, d], empty], [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]], [[v, d], empty]]) +^ Call: (17) not(empty=empty) +^ Unify: (17) not(user: (empty=empty)) +^ Fail: (17) not(user: (empty=empty)) + Redo: (16) getvar([v, d], _G5780, [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]], [[v, d], empty]]) + Unify: (16) getvar([v, d], empty, [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]], [[v, d], empty]]) +^ Call: (17) not(member([[v, d], _G5778], [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]], [[v, d], empty]])) +^ Unify: (17) not(user:member([[v, d], _G5778], [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]], [[v, d], empty]])) +^ Fail: (17) not(user:member([[v, d], _G5778], [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]], [[v, d], empty]])) + Redo: (16) getvar([v, d], empty, [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]], [[v, d], empty]]) + Call: (17) lists:member([[v, d], empty], [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]], [[v, d], empty]]) + Unify: (17) lists:member([[v, d], empty], [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]], [[v, d], empty]]) + Exit: (17) lists:member([[v, d], empty], [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]], [[v, d], empty]]) + Exit: (16) getvar([v, d], empty, [[[v, c], ["a", 
"b"]], [[v, b], ["a", "b"]], [[v, d], empty]]) + Exit: (15) getvalue([v, d], empty, [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]], [[v, d], empty]]) + Call: (15) getvalue([v, c], _G5786, [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]], [[v, d], empty]]) + Unify: (15) getvalue([v, c], _G5786, [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]], [[v, d], empty]]) +^ Call: (16) not(isvar([v, c])) +^ Unify: (16) not(user:isvar([v, c])) + Call: (17) isvar([v, c]) + Unify: (17) isvar([v, c]) + Exit: (17) isvar([v, c]) +^ Fail: (16) not(user:isvar([v, c])) + Redo: (15) getvalue([v, c], _G5786, [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]], [[v, d], empty]]) + Call: (16) isvar([v, c]) + Unify: (16) isvar([v, c]) + Exit: (16) isvar([v, c]) + Call: (16) isvalstrorundef(_G5785) + Unify: (16) isvalstrorundef(_G5785) + Call: (17) var(_G5785) + Exit: (17) var(_G5785) + Exit: (16) isvalstrorundef(_G5785) + Call: (16) getvar([v, c], _G5786, [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]], [[v, d], empty]]) + Unify: (16) getvar([v, c], _G5786, [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]], [[v, d], empty]]) + Call: (17) lists:member([[v, c], _G5781], [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]], [[v, d], empty]]) + Unify: (17) lists:member([[v, c], _G5781], [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]], [[v, d], empty]]) + Exit: (17) lists:member([[v, c], ["a", "b"]], [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]], [[v, d], empty]]) +^ Call: (17) not(["a", "b"]=empty) +^ Unify: (17) not(user: (["a", "b"]=empty)) +^ Exit: (17) not(user: (["a", "b"]=empty)) + Exit: (16) getvar([v, c], ["a", "b"], [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]], [[v, d], empty]]) + Exit: (15) getvalue([v, c], ["a", "b"], [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]], [[v, d], empty]]) + Exit: (14) getvalues([v, d], [v, c], empty, ["a", "b"], [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]], [[v, d], empty]]) + Call: (14) expression(empty) + Unify: (14) expression(empty) + Exit: (14) expression(empty) + Call: (14) expression(["a", "b"]) + 
Unify: (14) expression(["a", "b"]) + Call: (15) isval(["a", "b"]) + Unify: (15) isval(["a", "b"]) + Call: (16) number(["a", "b"]) + Fail: (16) number(["a", "b"]) + Fail: (15) isval(["a", "b"]) + Redo: (14) expression(["a", "b"]) + Unify: (14) expression(["a", "b"]) + Call: (15) string(["a", "b"]) + Fail: (15) string(["a", "b"]) + Redo: (14) expression(["a", "b"]) + Unify: (14) expression(["a", "b"]) + Call: (15) atom(["a", "b"]) + Fail: (15) atom(["a", "b"]) + Redo: (14) expression(["a", "b"]) + Unify: (14) expression(["a", "b"]) +^ Call: (15) not(atom(["a", "b"])) +^ Unify: (15) not(user:atom(["a", "b"])) +^ Exit: (15) not(user:atom(["a", "b"])) + Call: (15) length(["a", "b"], _G5803) + Unify: (15) length(["a", "b"], _G5803) + Exit: (15) length(["a", "b"], 2) + Call: (15) 2>=1 + Exit: (15) 2>=1 + Call: (15) expression2(["a", "b"]) + Unify: (15) expression2(["a", "b"]) + Call: (16) expression3("a") + Unify: (16) expression3("a") + Call: (17) isval("a") + Unify: (17) isval("a") + Call: (18) number("a") + Fail: (18) number("a") + Fail: (17) isval("a") + Redo: (16) expression3("a") + Unify: (16) expression3("a") + Call: (17) string("a") + Exit: (17) string("a") + Exit: (16) expression3("a") + Call: (16) expression2(["b"]) + Unify: (16) expression2(["b"]) + Call: (17) expression3("b") + Unify: (17) expression3("b") + Call: (18) isval("b") + Unify: (18) isval("b") + Call: (19) number("b") + Fail: (19) number("b") + Fail: (18) isval("b") + Redo: (17) expression3("b") + Unify: (17) expression3("b") + Call: (18) string("b") + Exit: (18) string("b") + Exit: (17) expression3("b") + Call: (17) expression2([]) + Unify: (17) expression2([]) + Exit: (17) expression2([]) + Exit: (16) expression2(["b"]) + Exit: (15) expression2(["a", "b"]) + Exit: (14) expression(["a", "b"]) + Call: (14) val1emptyorvalsequal(empty, ["a", "b"]) + Unify: (14) val1emptyorvalsequal(empty, ["a", "b"]) + Exit: (14) val1emptyorvalsequal(empty, ["a", "b"]) + Call: (14) putvalue([v, d], ["a", "b"], [[[v, 
c], ["a", "b"]], [[v, b], ["a", "b"]], [[v, d], empty]], _G5805) + Unify: (14) putvalue([v, d], ["a", "b"], [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]], [[v, d], empty]], _G5805) +^ Call: (15) not(isvar([v, d])) +^ Unify: (15) not(user:isvar([v, d])) + Call: (16) isvar([v, d]) + Unify: (16) isvar([v, d]) + Exit: (16) isvar([v, d]) +^ Fail: (15) not(user:isvar([v, d])) + Redo: (14) putvalue([v, d], ["a", "b"], [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]], [[v, d], empty]], _G5805) + Call: (15) isvar([v, d]) + Unify: (15) isvar([v, d]) + Exit: (15) isvar([v, d]) + Call: (15) isvalstrorundef(["a", "b"]) + Unify: (15) isvalstrorundef(["a", "b"]) + Call: (16) var(["a", "b"]) + Fail: (16) var(["a", "b"]) + Redo: (15) isvalstrorundef(["a", "b"]) + Unify: (15) isvalstrorundef(["a", "b"]) +^ Call: (16) not(var(["a", "b"])) +^ Unify: (16) not(user:var(["a", "b"])) +^ Exit: (16) not(user:var(["a", "b"])) + Call: (16) isval(["a", "b"]) + Unify: (16) isval(["a", "b"]) + Call: (17) number(["a", "b"]) + Fail: (17) number(["a", "b"]) + Fail: (16) isval(["a", "b"]) + Redo: (15) isvalstrorundef(["a", "b"]) + Unify: (15) isvalstrorundef(["a", "b"]) +^ Call: (16) not(var(["a", "b"])) +^ Unify: (16) not(user:var(["a", "b"])) +^ Exit: (16) not(user:var(["a", "b"])) + Call: (16) expression(["a", "b"]) + Unify: (16) expression(["a", "b"]) + Call: (17) isval(["a", "b"]) + Unify: (17) isval(["a", "b"]) + Call: (18) number(["a", "b"]) + Fail: (18) number(["a", "b"]) + Fail: (17) isval(["a", "b"]) + Redo: (16) expression(["a", "b"]) + Unify: (16) expression(["a", "b"]) + Call: (17) string(["a", "b"]) + Fail: (17) string(["a", "b"]) + Redo: (16) expression(["a", "b"]) + Unify: (16) expression(["a", "b"]) + Call: (17) atom(["a", "b"]) + Fail: (17) atom(["a", "b"]) + Redo: (16) expression(["a", "b"]) + Unify: (16) expression(["a", "b"]) +^ Call: (17) not(atom(["a", "b"])) +^ Unify: (17) not(user:atom(["a", "b"])) +^ Exit: (17) not(user:atom(["a", "b"])) + Call: (17) length(["a", "b"], _G5813) + 
Unify: (17) length(["a", "b"], _G5813) + Exit: (17) length(["a", "b"], 2) + Call: (17) 2>=1 + Exit: (17) 2>=1 + Call: (17) expression2(["a", "b"]) + Unify: (17) expression2(["a", "b"]) + Call: (18) expression3("a") + Unify: (18) expression3("a") + Call: (19) isval("a") + Unify: (19) isval("a") + Call: (20) number("a") + Fail: (20) number("a") + Fail: (19) isval("a") + Redo: (18) expression3("a") + Unify: (18) expression3("a") + Call: (19) string("a") + Exit: (19) string("a") + Exit: (18) expression3("a") + Call: (18) expression2(["b"]) + Unify: (18) expression2(["b"]) + Call: (19) expression3("b") + Unify: (19) expression3("b") + Call: (20) isval("b") + Unify: (20) isval("b") + Call: (21) number("b") + Fail: (21) number("b") + Fail: (20) isval("b") + Redo: (19) expression3("b") + Unify: (19) expression3("b") + Call: (20) string("b") + Exit: (20) string("b") + Exit: (19) expression3("b") + Call: (19) expression2([]) + Unify: (19) expression2([]) + Exit: (19) expression2([]) + Exit: (18) expression2(["b"]) + Exit: (17) expression2(["a", "b"]) + Exit: (16) expression(["a", "b"]) + Exit: (15) isvalstrorundef(["a", "b"]) + Call: (15) updatevar([v, d], ["a", "b"], [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]], [[v, d], empty]], _G5815) + Unify: (15) updatevar([v, d], ["a", "b"], [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]], [[v, d], empty]], _G5815) + Call: (16) lists:member([[v, d], empty], [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]], [[v, d], empty]]) + Unify: (16) lists:member([[v, d], empty], [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]], [[v, d], empty]]) + Exit: (16) lists:member([[v, d], empty], [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]], [[v, d], empty]]) + Call: (16) lists:delete([[[v, c], ["a", "b"]], [[v, b], ["a", "b"]], [[v, d], empty]], [[v, d], empty], _G5826) + Unify: (16) lists:delete([[[v, c], ["a", "b"]], [[v, b], ["a", "b"]], [[v, d], empty]], [[v, d], empty], _G5826) + Exit: (16) lists:delete([[[v, c], ["a", "b"]], [[v, b], ["a", "b"]], [[v, d], empty]], 
[[v, d], empty], [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]]]) + Call: (16) lists:append([[[v, c], ["a", "b"]], [[v, b], ["a", "b"]]], [[[v, d], ["a", "b"]]], _G5841) + Unify: (16) lists:append([[[v, c], ["a", "b"]], [[v, b], ["a", "b"]]], [[[v, d], ["a", "b"]]], [[[v, c], ["a", "b"]]|_G5833]) + Exit: (16) lists:append([[[v, c], ["a", "b"]], [[v, b], ["a", "b"]]], [[[v, d], ["a", "b"]]], [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]], [[v, d], ["a", "b"]]]) + Call: (16) true + Exit: (16) true + Call: (16) true + Exit: (16) true + Call: (16) true + Exit: (16) true + Exit: (15) updatevar([v, d], ["a", "b"], [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]], [[v, d], empty]], [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]], [[v, d], ["a", "b"]]]) + Exit: (14) putvalue([v, d], ["a", "b"], [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]], [[v, d], empty]], [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]], [[v, d], ["a", "b"]]]) + Call: (14) debug(on) + Fail: (14) debug(on) + Redo: (13) interpretpart(is, [v, d], [v, c], [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]], [[v, d], empty]], [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]], [[v, d], ["a", "b"]]]) + Call: (14) true + Exit: (14) true + Call: (14) debug(on) + Fail: (14) debug(on) + Redo: (13) interpretpart(is, [v, d], [v, c], [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]], [[v, d], empty]], [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]], [[v, d], ["a", "b"]]]) + Call: (14) true + Exit: (14) true + Exit: (13) interpretpart(is, [v, d], [v, c], [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]], [[v, d], empty]], [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]], [[v, d], ["a", "b"]]]) + Exit: (12) interpretstatement1([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[n, =], [[v, d], [v, c]]], [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]], [[v, d], empty]], [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]], [[v, d], ["a", "b"]]], true, nocut) +^ Call: (12) not(nocut=cut) +^ Unify: (12) 
not(user: (nocut=cut)) +^ Exit: (12) not(user: (nocut=cut)) + Call: (12) _G5851=[[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]] + Exit: (12) [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]]=[[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]] + Call: (12) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]], [[v, d], ["a", "b"]]], _G5854, [], _G5856) + Unify: (12) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]], [[v, d], ["a", "b"]]], [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]], [[v, d], ["a", "b"]]], [], true) + Exit: (12) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]], [[v, d], ["a", "b"]]], [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]], [[v, d], ["a", "b"]]], [], true) + Call: (12) logicalconjunction(true, true, true) + Unify: (12) logicalconjunction(true, true, true) + Call: (13) true(true) + Unify: (13) true(true) + Exit: (13) true(true) + Call: (13) true(true) + Unify: (13) true(true) + Exit: (13) true(true) + Exit: (12) logicalconjunction(true, true, true) + Exit: (11) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]], [[v, d], empty]], [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]], [[v, d], ["a", "b"]]], [[[n, =], [[v, d], [v, c]]]], true) + Call: (11) updatevars([[[v, d], [v, d]]], [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]], [[v, d], ["a", "b"]]], [], _G5854) + Unify: (11) updatevars([[[v, d], [v, d]]], [[[v, c], ["a", "b"]], [[v, b], ["a", 
"b"]], [[v, d], ["a", "b"]]], [], _G5854) + Call: (12) [[[v, d], [v, d]]]=[[_G5847, _G5850]|_G5845] + Exit: (12) [[[v, d], [v, d]]]=[[[v, d], [v, d]]] + Call: (12) expressionnotatom([v, d]) + Unify: (12) expressionnotatom([v, d]) + Call: (13) isvalstrempty([v, d]) + Unify: (13) isvalstrempty([v, d]) + Call: (14) var([v, d]) + Fail: (14) var([v, d]) + Redo: (13) isvalstrempty([v, d]) + Unify: (13) isvalstrempty([v, d]) + Call: (14) isval([v, d]) + Unify: (14) isval([v, d]) + Call: (15) number([v, d]) + Fail: (15) number([v, d]) + Fail: (14) isval([v, d]) + Redo: (13) isvalstrempty([v, d]) + Unify: (13) isvalstrempty([v, d]) + Call: (14) string([v, d]) + Fail: (14) string([v, d]) + Fail: (13) isvalstrempty([v, d]) + Redo: (12) expressionnotatom([v, d]) + Unify: (12) expressionnotatom([v, d]) +^ Call: (13) not(atom([v, d])) +^ Unify: (13) not(user:atom([v, d])) +^ Exit: (13) not(user:atom([v, d])) + Call: (13) length([v, d], _G5866) + Unify: (13) length([v, d], _G5866) + Exit: (13) length([v, d], 2) + Call: (13) 2>=1 + Exit: (13) 2>=1 + Call: (13) expressionnotatom2([v, d]) + Unify: (13) expressionnotatom2([v, d]) + Call: (14) isvalstrempty(v) + Unify: (14) isvalstrempty(v) + Call: (15) var(v) + Fail: (15) var(v) + Redo: (14) isvalstrempty(v) + Unify: (14) isvalstrempty(v) + Call: (15) isval(v) + Unify: (15) isval(v) + Call: (16) number(v) + Fail: (16) number(v) + Fail: (15) isval(v) + Redo: (14) isvalstrempty(v) + Unify: (14) isvalstrempty(v) + Call: (15) string(v) + Fail: (15) string(v) + Fail: (14) isvalstrempty(v) + Fail: (13) expressionnotatom2([v, d]) + Redo: (12) expressionnotatom([v, d]) + Unify: (12) expressionnotatom([v, d]) + Call: (13) predicate_or_rule_name([v, d]) + Fail: (13) predicate_or_rule_name([v, d]) + Fail: (12) expressionnotatom([v, d]) + Redo: (11) updatevars([[[v, d], [v, d]]], [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]], [[v, d], ["a", "b"]]], [], _G5863) + Call: (12) lists:member([[v, d], _G5856], [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]], 
[[v, d], ["a", "b"]]]) + Unify: (12) lists:member([[v, d], _G5856], [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]], [[v, d], ["a", "b"]]]) + Exit: (12) lists:member([[v, d], ["a", "b"]], [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]], [[v, d], ["a", "b"]]]) + Call: (12) lists:append([], [[[v, d], ["a", "b"]]], _G5877) + Unify: (12) lists:append([], [[[v, d], ["a", "b"]]], [[[v, d], ["a", "b"]]]) + Exit: (12) lists:append([], [[[v, d], ["a", "b"]]], [[[v, d], ["a", "b"]]]) + Call: (12) updatevars([], [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]], [[v, d], ["a", "b"]]], [[[v, d], ["a", "b"]]], _G5878) + Unify: (12) updatevars([], [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]], [[v, d], ["a", "b"]]], [[[v, d], ["a", "b"]]], [[[v, d], ["a", "b"]]]) + Exit: (12) updatevars([], [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]], [[v, d], ["a", "b"]]], [[[v, d], ["a", "b"]]], [[[v, d], ["a", "b"]]]) + Exit: (11) updatevars([[[v, d], [v, d]]], [[[v, c], ["a", "b"]], [[v, b], ["a", "b"]], [[v, d], ["a", "b"]]], [], [[[v, d], ["a", "b"]]]) +^ Call: (11) not([[[v, d], ["a", "b"]]]=[]) +^ Unify: (11) not(user: ([[[v, d], ["a", "b"]]]=[])) +^ Exit: (11) not(user: ([[[v, d], ["a", "b"]]]=[])) + Call: (11) unique1([[[v, d], ["a", "b"]]], [], _G14) + Unify: (11) unique1([[[v, d], ["a", "b"]]], [], _G14) + Call: (12) lists:delete([], [[v, d], ["a", "b"]], _G5883) + Unify: (12) lists:delete([], [[v, d], ["a", "b"]], []) + Exit: (12) lists:delete([], [[v, d], ["a", "b"]], []) + Call: (12) lists:append([], [[[v, d], ["a", "b"]]], _G5886) + Unify: (12) lists:append([], [[[v, d], ["a", "b"]]], [[[v, d], ["a", "b"]]]) + Exit: (12) lists:append([], [[[v, d], ["a", "b"]]], [[[v, d], ["a", "b"]]]) + Call: (12) unique1([], [[[v, d], ["a", "b"]]], _G14) + Unify: (12) unique1([], [[[v, d], ["a", "b"]]], [[[v, d], ["a", "b"]]]) + Exit: (12) unique1([], [[[v, d], ["a", "b"]]], [[[v, d], ["a", "b"]]]) + Exit: (11) unique1([[[v, d], ["a", "b"]]], [], [[[v, d], ["a", "b"]]]) + Call: (11) findresult3([["a", "b"], 
["a", "b"], [v, d]], [[[v, d], ["a", "b"]]], [], _G5887) + Unify: (11) findresult3([["a", "b"], ["a", "b"], [v, d]], [[[v, d], ["a", "b"]]], [], _G5887) + Call: (12) [["a", "b"], ["a", "b"], [v, d]]=[_G5877|_G5878] + Exit: (12) [["a", "b"], ["a", "b"], [v, d]]=[["a", "b"], ["a", "b"], [v, d]] + Call: (12) expressionnotatom(["a", "b"]) + Unify: (12) expressionnotatom(["a", "b"]) + Call: (13) isvalstrempty(["a", "b"]) + Unify: (13) isvalstrempty(["a", "b"]) + Call: (14) var(["a", "b"]) + Fail: (14) var(["a", "b"]) + Redo: (13) isvalstrempty(["a", "b"]) + Unify: (13) isvalstrempty(["a", "b"]) + Call: (14) isval(["a", "b"]) + Unify: (14) isval(["a", "b"]) + Call: (15) number(["a", "b"]) + Fail: (15) number(["a", "b"]) + Fail: (14) isval(["a", "b"]) + Redo: (13) isvalstrempty(["a", "b"]) + Unify: (13) isvalstrempty(["a", "b"]) + Call: (14) string(["a", "b"]) + Fail: (14) string(["a", "b"]) + Fail: (13) isvalstrempty(["a", "b"]) + Redo: (12) expressionnotatom(["a", "b"]) + Unify: (12) expressionnotatom(["a", "b"]) +^ Call: (13) not(atom(["a", "b"])) +^ Unify: (13) not(user:atom(["a", "b"])) +^ Exit: (13) not(user:atom(["a", "b"])) + Call: (13) length(["a", "b"], _G5893) + Unify: (13) length(["a", "b"], _G5893) + Exit: (13) length(["a", "b"], 2) + Call: (13) 2>=1 + Exit: (13) 2>=1 + Call: (13) expressionnotatom2(["a", "b"]) + Unify: (13) expressionnotatom2(["a", "b"]) + Call: (14) isvalstrempty("a") + Unify: (14) isvalstrempty("a") + Call: (15) var("a") + Fail: (15) var("a") + Redo: (14) isvalstrempty("a") + Unify: (14) isvalstrempty("a") + Call: (15) isval("a") + Unify: (15) isval("a") + Call: (16) number("a") + Fail: (16) number("a") + Fail: (15) isval("a") + Redo: (14) isvalstrempty("a") + Unify: (14) isvalstrempty("a") + Call: (15) string("a") + Exit: (15) string("a") + Exit: (14) isvalstrempty("a") + Call: (14) expressionnotatom2(["b"]) + Unify: (14) expressionnotatom2(["b"]) + Call: (15) isvalstrempty("b") + Unify: (15) isvalstrempty("b") + Call: (16) var("b") + 
Fail: (16) var("b") + Redo: (15) isvalstrempty("b") + Unify: (15) isvalstrempty("b") + Call: (16) isval("b") + Unify: (16) isval("b") + Call: (17) number("b") + Fail: (17) number("b") + Fail: (16) isval("b") + Redo: (15) isvalstrempty("b") + Unify: (15) isvalstrempty("b") + Call: (16) string("b") + Exit: (16) string("b") + Exit: (15) isvalstrempty("b") + Call: (15) expressionnotatom2([]) + Unify: (15) expressionnotatom2([]) + Exit: (15) expressionnotatom2([]) + Exit: (14) expressionnotatom2(["b"]) + Exit: (13) expressionnotatom2(["a", "b"]) + Exit: (12) expressionnotatom(["a", "b"]) + Call: (12) lists:append([], [["a", "b"]], _G5897) + Unify: (12) lists:append([], [["a", "b"]], [["a", "b"]]) + Exit: (12) lists:append([], [["a", "b"]], [["a", "b"]]) + Call: (12) findresult3([["a", "b"], [v, d]], [[[v, d], ["a", "b"]]], [["a", "b"]], _G5898) + Unify: (12) findresult3([["a", "b"], [v, d]], [[[v, d], ["a", "b"]]], [["a", "b"]], _G5898) + Call: (13) [["a", "b"], [v, d]]=[_G5888|_G5889] + Exit: (13) [["a", "b"], [v, d]]=[["a", "b"], [v, d]] + Call: (13) expressionnotatom(["a", "b"]) + Unify: (13) expressionnotatom(["a", "b"]) + Call: (14) isvalstrempty(["a", "b"]) + Unify: (14) isvalstrempty(["a", "b"]) + Call: (15) var(["a", "b"]) + Fail: (15) var(["a", "b"]) + Redo: (14) isvalstrempty(["a", "b"]) + Unify: (14) isvalstrempty(["a", "b"]) + Call: (15) isval(["a", "b"]) + Unify: (15) isval(["a", "b"]) + Call: (16) number(["a", "b"]) + Fail: (16) number(["a", "b"]) + Fail: (15) isval(["a", "b"]) + Redo: (14) isvalstrempty(["a", "b"]) + Unify: (14) isvalstrempty(["a", "b"]) + Call: (15) string(["a", "b"]) + Fail: (15) string(["a", "b"]) + Fail: (14) isvalstrempty(["a", "b"]) + Redo: (13) expressionnotatom(["a", "b"]) + Unify: (13) expressionnotatom(["a", "b"]) +^ Call: (14) not(atom(["a", "b"])) +^ Unify: (14) not(user:atom(["a", "b"])) +^ Exit: (14) not(user:atom(["a", "b"])) + Call: (14) length(["a", "b"], _G5904) + Unify: (14) length(["a", "b"], _G5904) + Exit: (14) 
length(["a", "b"], 2) + Call: (14) 2>=1 + Exit: (14) 2>=1 + Call: (14) expressionnotatom2(["a", "b"]) + Unify: (14) expressionnotatom2(["a", "b"]) + Call: (15) isvalstrempty("a") + Unify: (15) isvalstrempty("a") + Call: (16) var("a") + Fail: (16) var("a") + Redo: (15) isvalstrempty("a") + Unify: (15) isvalstrempty("a") + Call: (16) isval("a") + Unify: (16) isval("a") + Call: (17) number("a") + Fail: (17) number("a") + Fail: (16) isval("a") + Redo: (15) isvalstrempty("a") + Unify: (15) isvalstrempty("a") + Call: (16) string("a") + Exit: (16) string("a") + Exit: (15) isvalstrempty("a") + Call: (15) expressionnotatom2(["b"]) + Unify: (15) expressionnotatom2(["b"]) + Call: (16) isvalstrempty("b") + Unify: (16) isvalstrempty("b") + Call: (17) var("b") + Fail: (17) var("b") + Redo: (16) isvalstrempty("b") + Unify: (16) isvalstrempty("b") + Call: (17) isval("b") + Unify: (17) isval("b") + Call: (18) number("b") + Fail: (18) number("b") + Fail: (17) isval("b") + Redo: (16) isvalstrempty("b") + Unify: (16) isvalstrempty("b") + Call: (17) string("b") + Exit: (17) string("b") + Exit: (16) isvalstrempty("b") + Call: (16) expressionnotatom2([]) + Unify: (16) expressionnotatom2([]) + Exit: (16) expressionnotatom2([]) + Exit: (15) expressionnotatom2(["b"]) + Exit: (14) expressionnotatom2(["a", "b"]) + Exit: (13) expressionnotatom(["a", "b"]) + Call: (13) lists:append([["a", "b"]], [["a", "b"]], _G5908) + Unify: (13) lists:append([["a", "b"]], [["a", "b"]], [["a", "b"]|_G5900]) + Exit: (13) lists:append([["a", "b"]], [["a", "b"]], [["a", "b"], ["a", "b"]]) + Call: (13) findresult3([[v, d]], [[[v, d], ["a", "b"]]], [["a", "b"], ["a", "b"]], _G5912) + Unify: (13) findresult3([[v, d]], [[[v, d], ["a", "b"]]], [["a", "b"], ["a", "b"]], _G5912) + Call: (14) [[v, d]]=[_G5902|_G5903] + Exit: (14) [[v, d]]=[[v, d]] + Call: (14) expressionnotatom([v, d]) + Unify: (14) expressionnotatom([v, d]) + Call: (15) isvalstrempty([v, d]) + Unify: (15) isvalstrempty([v, d]) + Call: (16) var([v, d]) + 
Fail: (16) var([v, d]) + Redo: (15) isvalstrempty([v, d]) + Unify: (15) isvalstrempty([v, d]) + Call: (16) isval([v, d]) + Unify: (16) isval([v, d]) + Call: (17) number([v, d]) + Fail: (17) number([v, d]) + Fail: (16) isval([v, d]) + Redo: (15) isvalstrempty([v, d]) + Unify: (15) isvalstrempty([v, d]) + Call: (16) string([v, d]) + Fail: (16) string([v, d]) + Fail: (15) isvalstrempty([v, d]) + Redo: (14) expressionnotatom([v, d]) + Unify: (14) expressionnotatom([v, d]) +^ Call: (15) not(atom([v, d])) +^ Unify: (15) not(user:atom([v, d])) +^ Exit: (15) not(user:atom([v, d])) + Call: (15) length([v, d], _G5918) + Unify: (15) length([v, d], _G5918) + Exit: (15) length([v, d], 2) + Call: (15) 2>=1 + Exit: (15) 2>=1 + Call: (15) expressionnotatom2([v, d]) + Unify: (15) expressionnotatom2([v, d]) + Call: (16) isvalstrempty(v) + Unify: (16) isvalstrempty(v) + Call: (17) var(v) + Fail: (17) var(v) + Redo: (16) isvalstrempty(v) + Unify: (16) isvalstrempty(v) + Call: (17) isval(v) + Unify: (17) isval(v) + Call: (18) number(v) + Fail: (18) number(v) + Fail: (17) isval(v) + Redo: (16) isvalstrempty(v) + Unify: (16) isvalstrempty(v) + Call: (17) string(v) + Fail: (17) string(v) + Fail: (16) isvalstrempty(v) + Fail: (15) expressionnotatom2([v, d]) + Redo: (14) expressionnotatom([v, d]) + Unify: (14) expressionnotatom([v, d]) + Call: (15) predicate_or_rule_name([v, d]) + Fail: (15) predicate_or_rule_name([v, d]) + Fail: (14) expressionnotatom([v, d]) + Redo: (13) findresult3([[v, d]], [[[v, d], ["a", "b"]]], [["a", "b"], ["a", "b"]], _G5912) + Unify: (13) findresult3([[v, d]], [[[v, d], ["a", "b"]]], [["a", "b"], ["a", "b"]], _G5912) + Call: (14) [[v, d]]=[_G5902|_G5903] + Exit: (14) [[v, d]]=[[v, d]] + Call: (14) isvar([v, d]) + Unify: (14) isvar([v, d]) + Exit: (14) isvar([v, d]) + Call: (14) lists:member([[v, d], _G5908], [[[v, d], ["a", "b"]]]) + Unify: (14) lists:member([[v, d], _G5908], [[[v, d], ["a", "b"]]]) + Exit: (14) lists:member([[v, d], ["a", "b"]], [[[v, d], ["a", 
"b"]]]) + Call: (14) lists:append([["a", "b"], ["a", "b"]], [["a", "b"]], _G5923) + Unify: (14) lists:append([["a", "b"], ["a", "b"]], [["a", "b"]], [["a", "b"]|_G5915]) + Exit: (14) lists:append([["a", "b"], ["a", "b"]], [["a", "b"]], [["a", "b"], ["a", "b"], ["a", "b"]]) + Call: (14) findresult3([], [[[v, d], ["a", "b"]]], [["a", "b"], ["a", "b"], ["a", "b"]], _G5930) + Unify: (14) findresult3([], [[[v, d], ["a", "b"]]], [["a", "b"], ["a", "b"], ["a", "b"]], [["a", "b"], ["a", "b"], ["a", "b"]]) + Exit: (14) findresult3([], [[[v, d], ["a", "b"]]], [["a", "b"], ["a", "b"], ["a", "b"]], [["a", "b"], ["a", "b"], ["a", "b"]]) + Exit: (13) findresult3([[v, d]], [[[v, d], ["a", "b"]]], [["a", "b"], ["a", "b"]], [["a", "b"], ["a", "b"], ["a", "b"]]) + Exit: (12) findresult3([["a", "b"], [v, d]], [[[v, d], ["a", "b"]]], [["a", "b"]], [["a", "b"], ["a", "b"], ["a", "b"]]) + Exit: (11) findresult3([["a", "b"], ["a", "b"], [v, d]], [[[v, d], ["a", "b"]]], [], [["a", "b"], ["a", "b"], ["a", "b"]]) + Call: (11) debug(on) + Fail: (11) debug(on) + Redo: (10) member1([[n, f], [["a", "b"], ["a", "b"], [v, d]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, d], ["a", "b"]]]) + Call: (11) true + Exit: (11) true + Call: (11) true + Exit: (11) true + Exit: (10) member1([[n, f], [["a", "b"], ["a", "b"], [v, d]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, d], ["a", "b"]]]) + Exit: (9) interpret1(off, [[n, f], [["a", "b"], ["a", "b"], [v, d]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, d], ["a", "b"]]]) + Exit: (8) interpret(off, [[n, f], [["a", "b"], ["a", "b"], [v, d]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, d], ["a", "b"]]]) + + Call: (8) interpret(off, [[n, 
f], [["c", "a"], ["b", "a"], [v, d]]], [[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]], _G14) + Unify: (8) interpret(off, [[n, f], [["c", "a"], ["b", "a"], [v, d]]], [[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]], _G14) + Call: (9) convert_to_grammar_part1([[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]], [], _G5516) + Unify: (9) convert_to_grammar_part1([[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]], [], _G5516) + Call: (10) convert_to_grammar_part11([[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]], [], _G5516, [], _G5518) + Unify: (10) convert_to_grammar_part11([[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]], [], _G5516, [], _G5518) + Call: (11) [[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]]=[_G5507|_G5508] + Exit: (11) [[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]]=[[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]] + Call: (11) [[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n, =], [[...|...]|...]]]]=[[n, _G5516], _G5519, "->", _G5528] + Fail: (11) [[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n, =], [[...|...]|...]]]]=[[n, _G5516], _G5519, "->", _G5528] + Redo: (10) convert_to_grammar_part11([[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]], [], _G5519, [], _G5521) + Call: (11) [[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n, =], [[...|...]|...]]]]=[[n, _G5516], "->", _G5525] + Fail: (11) [[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n, =], [[...|...]|...]]]]=[[n, _G5516], "->", _G5525] + Redo: (10) convert_to_grammar_part11([[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]], [], _G5516, [], _G5518) + Unify: (10) convert_to_grammar_part11([[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]], [], _G5516, [], _G5518) + Call: (11) [[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]]=[_G5507|_G5508] + Exit: (11) [[[n, f], [[v, b], [v, c], [v, d]], 
":-", [[[n|...], [...|...]]]]]=[[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]] + Call: (11) [[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n, =], [[...|...]|...]]]]=[_G5510, _G5513, ":-", _G5522] + Exit: (11) [[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n, =], [[...|...]|...]]]]=[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n, =], [[...|...]|...]]]] + Call: (11) true + Exit: (11) true + Call: (11) true + Exit: (11) true + Call: (11) true + Exit: (11) true + Call: (11) true + Exit: (11) true + Call: (11) lists:append([], [[[n, f], [[v, b], [v, c], [v|...]], ":-", [[[...|...]|...]]]], _G5537) + Unify: (11) lists:append([], [[[n, f], [[v, b], [v, c], [v|...]], ":-", [[[...|...]|...]]]], [[[n, f], [[v, b], [v, c], [v|...]], ":-", [[[...|...]|...]]]]) + Exit: (11) lists:append([], [[[n, f], [[v, b], [v, c], [v|...]], ":-", [[[...|...]|...]]]], [[[n, f], [[v, b], [v, c], [v|...]], ":-", [[[...|...]|...]]]]) + Call: (11) convert_to_grammar_part11([], [[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]], _G5537, [], _G5539) + Unify: (11) convert_to_grammar_part11([], [[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]], [], []) + Exit: (11) convert_to_grammar_part11([], [[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]], [], []) + Exit: (10) convert_to_grammar_part11([[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]], [], [[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]], [], []) + Call: (10) lists:append([[[n, f], [[v, b], [v, c], [v|...]], ":-", [[[...|...]|...]]]], [], _G5537) + Unify: (10) lists:append([[[n, f], [[v, b], [v, c], [v|...]], ":-", [[[...|...]|...]]]], [], [[[n, f], [[v, b], [v, c], [v|...]], ":-", [[[...|...]|...]]]|_G5529]) + Exit: (10) lists:append([[[n, f], [[v, b], [v, c], [v|...]], ":-", [[[...|...]|...]]]], [], [[[n, f], [[v, 
b], [v, c], [v|...]], ":-", [[[...|...]|...]]]]) + Exit: (9) convert_to_grammar_part1([[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]], [], [[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]]) + Call: (9) interpret1(off, [[n, f], [["c", "a"], ["b", "a"], [v, d]]], [[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]], _G14) + Unify: (9) interpret1(off, [[n, f], [["c", "a"], ["b", "a"], [v, d]]], [[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]], _G14) +^ Call: (10) retractall(debug(_G5531)) +^ Exit: (10) retractall(debug(_G5531)) +^ Call: (10) assertz(debug(off)) +^ Exit: (10) assertz(debug(off)) +^ Call: (10) retractall(cut(_G5535)) +^ Exit: (10) retractall(cut(_G5535)) +^ Call: (10) assertz(cut(off)) +^ Exit: (10) assertz(cut(off)) + Call: (10) member1([[n, f], [["c", "a"], ["b", "a"], [v, d]]], [[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]], _G14) + Unify: (10) member1([[n, f], [["c", "a"], ["b", "a"], [v, d]]], [[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]], _G14) + Call: (11) cut(off) + Unify: (11) cut(off) + Exit: (11) cut(off) + Call: (11) [[n, f], [["c", "a"], ["b", "a"], [v, d]]]=[_G5539, _G5542] + Exit: (11) [[n, f], [["c", "a"], ["b", "a"], [v, d]]]=[[n, f], [["c", "a"], ["b", "a"], [v, d]]] + Call: (11) [[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]]=[[[n, f], _G5551, ":-", _G5560]|_G5546] + Exit: (11) [[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]]=[[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]] + Call: (11) length([["c", "a"], ["b", "a"], [v, d]], _G5571) + Unify: (11) length([["c", "a"], ["b", "a"], [v, d]], _G5571) + Exit: 
(11) length([["c", "a"], ["b", "a"], [v, d]], 3) + Call: (11) length([[v, b], [v, c], [v, d]], 3) + Unify: (11) length([[v, b], [v, c], [v, d]], 3) + Unify: (11) length([[v, b], [v, c], [v, d]], 3) + Exit: (11) length([[v, b], [v, c], [v, d]], 3) + Call: (11) [n, f]=[n, grammar] + Fail: (11) [n, f]=[n, grammar] + Redo: (10) member1([[n, f], [["c", "a"], ["b", "a"], [v, d]]], [[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]], _G14) + Call: (11) [n, f]=[n, grammar_part] + Fail: (11) [n, f]=[n, grammar_part] + Redo: (10) member1([[n, f], [["c", "a"], ["b", "a"], [v, d]]], [[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]], _G14) + Call: (11) checkarguments([["c", "a"], ["b", "a"], [v, d]], [[v, b], [v, c], [v, d]], [], _G5573, [], _G5575) + Unify: (11) checkarguments([["c", "a"], ["b", "a"], [v, d]], [[v, b], [v, c], [v, d]], [], _G5573, [], _G5575) + Call: (12) [["c", "a"], ["b", "a"], [v, d]]=[_G5563|_G5564] + Exit: (12) [["c", "a"], ["b", "a"], [v, d]]=[["c", "a"], ["b", "a"], [v, d]] + Call: (12) expressionnotatom(["c", "a"]) + Unify: (12) expressionnotatom(["c", "a"]) + Call: (13) isvalstrempty(["c", "a"]) + Unify: (13) isvalstrempty(["c", "a"]) + Call: (14) var(["c", "a"]) + Fail: (14) var(["c", "a"]) + Redo: (13) isvalstrempty(["c", "a"]) + Unify: (13) isvalstrempty(["c", "a"]) + Call: (14) isval(["c", "a"]) + Unify: (14) isval(["c", "a"]) + Call: (15) number(["c", "a"]) + Fail: (15) number(["c", "a"]) + Fail: (14) isval(["c", "a"]) + Redo: (13) isvalstrempty(["c", "a"]) + Unify: (13) isvalstrempty(["c", "a"]) + Call: (14) string(["c", "a"]) + Fail: (14) string(["c", "a"]) + Fail: (13) isvalstrempty(["c", "a"]) + Redo: (12) expressionnotatom(["c", "a"]) + Unify: (12) expressionnotatom(["c", "a"]) +^ Call: (13) not(atom(["c", "a"])) +^ Unify: (13) not(user:atom(["c", "a"])) +^ Exit: (13) 
not(user:atom(["c", "a"])) + Call: (13) length(["c", "a"], _G5579) + Unify: (13) length(["c", "a"], _G5579) + Exit: (13) length(["c", "a"], 2) + Call: (13) 2>=1 + Exit: (13) 2>=1 + Call: (13) expressionnotatom2(["c", "a"]) + Unify: (13) expressionnotatom2(["c", "a"]) + Call: (14) isvalstrempty("c") + Unify: (14) isvalstrempty("c") + Call: (15) var("c") + Fail: (15) var("c") + Redo: (14) isvalstrempty("c") + Unify: (14) isvalstrempty("c") + Call: (15) isval("c") + Unify: (15) isval("c") + Call: (16) number("c") + Fail: (16) number("c") + Fail: (15) isval("c") + Redo: (14) isvalstrempty("c") + Unify: (14) isvalstrempty("c") + Call: (15) string("c") + Exit: (15) string("c") + Exit: (14) isvalstrempty("c") + Call: (14) expressionnotatom2(["a"]) + Unify: (14) expressionnotatom2(["a"]) + Call: (15) isvalstrempty("a") + Unify: (15) isvalstrempty("a") + Call: (16) var("a") + Fail: (16) var("a") + Redo: (15) isvalstrempty("a") + Unify: (15) isvalstrempty("a") + Call: (16) isval("a") + Unify: (16) isval("a") + Call: (17) number("a") + Fail: (17) number("a") + Fail: (16) isval("a") + Redo: (15) isvalstrempty("a") + Unify: (15) isvalstrempty("a") + Call: (16) string("a") + Exit: (16) string("a") + Exit: (15) isvalstrempty("a") + Call: (15) expressionnotatom2([]) + Unify: (15) expressionnotatom2([]) + Exit: (15) expressionnotatom2([]) + Exit: (14) expressionnotatom2(["a"]) + Exit: (13) expressionnotatom2(["c", "a"]) + Exit: (12) expressionnotatom(["c", "a"]) + Call: (12) [[v, b], [v, c], [v, d]]=[_G5571|_G5572] + Exit: (12) [[v, b], [v, c], [v, d]]=[[v, b], [v, c], [v, d]] +^ Call: (12) not(var([v, b])) +^ Unify: (12) not(user:var([v, b])) +^ Exit: (12) not(user:var([v, b])) + Call: (12) isvar([v, b]) + Unify: (12) isvar([v, b]) + Exit: (12) isvar([v, b]) + Call: (12) putvalue([v, b], ["c", "a"], [], _G5589) + Unify: (12) putvalue([v, b], ["c", "a"], [], _G5589) +^ Call: (13) not(isvar([v, b])) +^ Unify: (13) not(user:isvar([v, b])) + Call: (14) isvar([v, b]) + Unify: (14) 
isvar([v, b]) + Exit: (14) isvar([v, b]) +^ Fail: (13) not(user:isvar([v, b])) + Redo: (12) putvalue([v, b], ["c", "a"], [], _G5589) + Call: (13) isvar([v, b]) + Unify: (13) isvar([v, b]) + Exit: (13) isvar([v, b]) + Call: (13) isvalstrorundef(["c", "a"]) + Unify: (13) isvalstrorundef(["c", "a"]) + Call: (14) var(["c", "a"]) + Fail: (14) var(["c", "a"]) + Redo: (13) isvalstrorundef(["c", "a"]) + Unify: (13) isvalstrorundef(["c", "a"]) +^ Call: (14) not(var(["c", "a"])) +^ Unify: (14) not(user:var(["c", "a"])) +^ Exit: (14) not(user:var(["c", "a"])) + Call: (14) isval(["c", "a"]) + Unify: (14) isval(["c", "a"]) + Call: (15) number(["c", "a"]) + Fail: (15) number(["c", "a"]) + Fail: (14) isval(["c", "a"]) + Redo: (13) isvalstrorundef(["c", "a"]) + Unify: (13) isvalstrorundef(["c", "a"]) +^ Call: (14) not(var(["c", "a"])) +^ Unify: (14) not(user:var(["c", "a"])) +^ Exit: (14) not(user:var(["c", "a"])) + Call: (14) expression(["c", "a"]) + Unify: (14) expression(["c", "a"]) + Call: (15) isval(["c", "a"]) + Unify: (15) isval(["c", "a"]) + Call: (16) number(["c", "a"]) + Fail: (16) number(["c", "a"]) + Fail: (15) isval(["c", "a"]) + Redo: (14) expression(["c", "a"]) + Unify: (14) expression(["c", "a"]) + Call: (15) string(["c", "a"]) + Fail: (15) string(["c", "a"]) + Redo: (14) expression(["c", "a"]) + Unify: (14) expression(["c", "a"]) + Call: (15) atom(["c", "a"]) + Fail: (15) atom(["c", "a"]) + Redo: (14) expression(["c", "a"]) + Unify: (14) expression(["c", "a"]) +^ Call: (15) not(atom(["c", "a"])) +^ Unify: (15) not(user:atom(["c", "a"])) +^ Exit: (15) not(user:atom(["c", "a"])) + Call: (15) length(["c", "a"], _G5597) + Unify: (15) length(["c", "a"], _G5597) + Exit: (15) length(["c", "a"], 2) + Call: (15) 2>=1 + Exit: (15) 2>=1 + Call: (15) expression2(["c", "a"]) + Unify: (15) expression2(["c", "a"]) + Call: (16) expression3("c") + Unify: (16) expression3("c") + Call: (17) isval("c") + Unify: (17) isval("c") + Call: (18) number("c") + Fail: (18) number("c") + Fail: 
(17) isval("c") + Redo: (16) expression3("c") + Unify: (16) expression3("c") + Call: (17) string("c") + Exit: (17) string("c") + Exit: (16) expression3("c") + Call: (16) expression2(["a"]) + Unify: (16) expression2(["a"]) + Call: (17) expression3("a") + Unify: (17) expression3("a") + Call: (18) isval("a") + Unify: (18) isval("a") + Call: (19) number("a") + Fail: (19) number("a") + Fail: (18) isval("a") + Redo: (17) expression3("a") + Unify: (17) expression3("a") + Call: (18) string("a") + Exit: (18) string("a") + Exit: (17) expression3("a") + Call: (17) expression2([]) + Unify: (17) expression2([]) + Exit: (17) expression2([]) + Exit: (16) expression2(["a"]) + Exit: (15) expression2(["c", "a"]) + Exit: (14) expression(["c", "a"]) + Exit: (13) isvalstrorundef(["c", "a"]) + Call: (13) updatevar([v, b], ["c", "a"], [], _G5599) + Unify: (13) updatevar([v, b], ["c", "a"], [], _G5599) + Call: (14) lists:member([[v, b], empty], []) + Fail: (14) lists:member([[v, b], empty], []) + Redo: (13) updatevar([v, b], ["c", "a"], [], _G5599) +^ Call: (14) not(member([[v, b], _G5595], [])) +^ Unify: (14) not(user:member([[v, b], _G5595], [])) +^ Exit: (14) not(user:member([[v, b], _G5595], [])) + Call: (14) _G5595=empty + Exit: (14) empty=empty + Call: (14) true + Exit: (14) true + Call: (14) lists:append([], [[[v, b], ["c", "a"]]], _G5619) + Unify: (14) lists:append([], [[[v, b], ["c", "a"]]], [[[v, b], ["c", "a"]]]) + Exit: (14) lists:append([], [[[v, b], ["c", "a"]]], [[[v, b], ["c", "a"]]]) + Call: (14) true + Exit: (14) true + Call: (14) true + Exit: (14) true + Exit: (13) updatevar([v, b], ["c", "a"], [], [[[v, b], ["c", "a"]]]) + Exit: (12) putvalue([v, b], ["c", "a"], [], [[[v, b], ["c", "a"]]]) + Call: (12) checkarguments([["b", "a"], [v, d]], [[v, c], [v, d]], [[[v, b], ["c", "a"]]], _G5620, [], _G5622) + Unify: (12) checkarguments([["b", "a"], [v, d]], [[v, c], [v, d]], [[[v, b], ["c", "a"]]], _G5620, [], _G5622) + Call: (13) [["b", "a"], [v, d]]=[_G5610|_G5611] + Exit: 
(13) [["b", "a"], [v, d]]=[["b", "a"], [v, d]] + Call: (13) expressionnotatom(["b", "a"]) + Unify: (13) expressionnotatom(["b", "a"]) + Call: (14) isvalstrempty(["b", "a"]) + Unify: (14) isvalstrempty(["b", "a"]) + Call: (15) var(["b", "a"]) + Fail: (15) var(["b", "a"]) + Redo: (14) isvalstrempty(["b", "a"]) + Unify: (14) isvalstrempty(["b", "a"]) + Call: (15) isval(["b", "a"]) + Unify: (15) isval(["b", "a"]) + Call: (16) number(["b", "a"]) + Fail: (16) number(["b", "a"]) + Fail: (15) isval(["b", "a"]) + Redo: (14) isvalstrempty(["b", "a"]) + Unify: (14) isvalstrempty(["b", "a"]) + Call: (15) string(["b", "a"]) + Fail: (15) string(["b", "a"]) + Fail: (14) isvalstrempty(["b", "a"]) + Redo: (13) expressionnotatom(["b", "a"]) + Unify: (13) expressionnotatom(["b", "a"]) +^ Call: (14) not(atom(["b", "a"])) +^ Unify: (14) not(user:atom(["b", "a"])) +^ Exit: (14) not(user:atom(["b", "a"])) + Call: (14) length(["b", "a"], _G5626) + Unify: (14) length(["b", "a"], _G5626) + Exit: (14) length(["b", "a"], 2) + Call: (14) 2>=1 + Exit: (14) 2>=1 + Call: (14) expressionnotatom2(["b", "a"]) + Unify: (14) expressionnotatom2(["b", "a"]) + Call: (15) isvalstrempty("b") + Unify: (15) isvalstrempty("b") + Call: (16) var("b") + Fail: (16) var("b") + Redo: (15) isvalstrempty("b") + Unify: (15) isvalstrempty("b") + Call: (16) isval("b") + Unify: (16) isval("b") + Call: (17) number("b") + Fail: (17) number("b") + Fail: (16) isval("b") + Redo: (15) isvalstrempty("b") + Unify: (15) isvalstrempty("b") + Call: (16) string("b") + Exit: (16) string("b") + Exit: (15) isvalstrempty("b") + Call: (15) expressionnotatom2(["a"]) + Unify: (15) expressionnotatom2(["a"]) + Call: (16) isvalstrempty("a") + Unify: (16) isvalstrempty("a") + Call: (17) var("a") + Fail: (17) var("a") + Redo: (16) isvalstrempty("a") + Unify: (16) isvalstrempty("a") + Call: (17) isval("a") + Unify: (17) isval("a") + Call: (18) number("a") + Fail: (18) number("a") + Fail: (17) isval("a") + Redo: (16) isvalstrempty("a") + Unify: 
(16) isvalstrempty("a") + Call: (17) string("a") + Exit: (17) string("a") + Exit: (16) isvalstrempty("a") + Call: (16) expressionnotatom2([]) + Unify: (16) expressionnotatom2([]) + Exit: (16) expressionnotatom2([]) + Exit: (15) expressionnotatom2(["a"]) + Exit: (14) expressionnotatom2(["b", "a"]) + Exit: (13) expressionnotatom(["b", "a"]) + Call: (13) [[v, c], [v, d]]=[_G5618|_G5619] + Exit: (13) [[v, c], [v, d]]=[[v, c], [v, d]] +^ Call: (13) not(var([v, c])) +^ Unify: (13) not(user:var([v, c])) +^ Exit: (13) not(user:var([v, c])) + Call: (13) isvar([v, c]) + Unify: (13) isvar([v, c]) + Exit: (13) isvar([v, c]) + Call: (13) putvalue([v, c], ["b", "a"], [[[v, b], ["c", "a"]]], _G5636) + Unify: (13) putvalue([v, c], ["b", "a"], [[[v, b], ["c", "a"]]], _G5636) +^ Call: (14) not(isvar([v, c])) +^ Unify: (14) not(user:isvar([v, c])) + Call: (15) isvar([v, c]) + Unify: (15) isvar([v, c]) + Exit: (15) isvar([v, c]) +^ Fail: (14) not(user:isvar([v, c])) + Redo: (13) putvalue([v, c], ["b", "a"], [[[v, b], ["c", "a"]]], _G5636) + Call: (14) isvar([v, c]) + Unify: (14) isvar([v, c]) + Exit: (14) isvar([v, c]) + Call: (14) isvalstrorundef(["b", "a"]) + Unify: (14) isvalstrorundef(["b", "a"]) + Call: (15) var(["b", "a"]) + Fail: (15) var(["b", "a"]) + Redo: (14) isvalstrorundef(["b", "a"]) + Unify: (14) isvalstrorundef(["b", "a"]) +^ Call: (15) not(var(["b", "a"])) +^ Unify: (15) not(user:var(["b", "a"])) +^ Exit: (15) not(user:var(["b", "a"])) + Call: (15) isval(["b", "a"]) + Unify: (15) isval(["b", "a"]) + Call: (16) number(["b", "a"]) + Fail: (16) number(["b", "a"]) + Fail: (15) isval(["b", "a"]) + Redo: (14) isvalstrorundef(["b", "a"]) + Unify: (14) isvalstrorundef(["b", "a"]) +^ Call: (15) not(var(["b", "a"])) +^ Unify: (15) not(user:var(["b", "a"])) +^ Exit: (15) not(user:var(["b", "a"])) + Call: (15) expression(["b", "a"]) + Unify: (15) expression(["b", "a"]) + Call: (16) isval(["b", "a"]) + Unify: (16) isval(["b", "a"]) + Call: (17) number(["b", "a"]) + Fail: (17) 
number(["b", "a"]) + Fail: (16) isval(["b", "a"]) + Redo: (15) expression(["b", "a"]) + Unify: (15) expression(["b", "a"]) + Call: (16) string(["b", "a"]) + Fail: (16) string(["b", "a"]) + Redo: (15) expression(["b", "a"]) + Unify: (15) expression(["b", "a"]) + Call: (16) atom(["b", "a"]) + Fail: (16) atom(["b", "a"]) + Redo: (15) expression(["b", "a"]) + Unify: (15) expression(["b", "a"]) +^ Call: (16) not(atom(["b", "a"])) +^ Unify: (16) not(user:atom(["b", "a"])) +^ Exit: (16) not(user:atom(["b", "a"])) + Call: (16) length(["b", "a"], _G5644) + Unify: (16) length(["b", "a"], _G5644) + Exit: (16) length(["b", "a"], 2) + Call: (16) 2>=1 + Exit: (16) 2>=1 + Call: (16) expression2(["b", "a"]) + Unify: (16) expression2(["b", "a"]) + Call: (17) expression3("b") + Unify: (17) expression3("b") + Call: (18) isval("b") + Unify: (18) isval("b") + Call: (19) number("b") + Fail: (19) number("b") + Fail: (18) isval("b") + Redo: (17) expression3("b") + Unify: (17) expression3("b") + Call: (18) string("b") + Exit: (18) string("b") + Exit: (17) expression3("b") + Call: (17) expression2(["a"]) + Unify: (17) expression2(["a"]) + Call: (18) expression3("a") + Unify: (18) expression3("a") + Call: (19) isval("a") + Unify: (19) isval("a") + Call: (20) number("a") + Fail: (20) number("a") + Fail: (19) isval("a") + Redo: (18) expression3("a") + Unify: (18) expression3("a") + Call: (19) string("a") + Exit: (19) string("a") + Exit: (18) expression3("a") + Call: (18) expression2([]) + Unify: (18) expression2([]) + Exit: (18) expression2([]) + Exit: (17) expression2(["a"]) + Exit: (16) expression2(["b", "a"]) + Exit: (15) expression(["b", "a"]) + Exit: (14) isvalstrorundef(["b", "a"]) + Call: (14) updatevar([v, c], ["b", "a"], [[[v, b], ["c", "a"]]], _G5646) + Unify: (14) updatevar([v, c], ["b", "a"], [[[v, b], ["c", "a"]]], _G5646) + Call: (15) lists:member([[v, c], empty], [[[v, b], ["c", "a"]]]) + Unify: (15) lists:member([[v, c], empty], [[[v, b], ["c", "a"]]]) + Fail: (15) 
lists:member([[v, c], empty], [[[v, b], ["c", "a"]]]) + Redo: (14) updatevar([v, c], ["b", "a"], [[[v, b], ["c", "a"]]], _G5646) +^ Call: (15) not(member([[v, c], _G5642], [[[v, b], ["c", "a"]]])) +^ Unify: (15) not(user:member([[v, c], _G5642], [[[v, b], ["c", "a"]]])) +^ Exit: (15) not(user:member([[v, c], _G5642], [[[v, b], ["c", "a"]]])) + Call: (15) _G5642=empty + Exit: (15) empty=empty + Call: (15) true + Exit: (15) true + Call: (15) lists:append([[[v, b], ["c", "a"]]], [[[v, c], ["b", "a"]]], _G5666) + Unify: (15) lists:append([[[v, b], ["c", "a"]]], [[[v, c], ["b", "a"]]], [[[v, b], ["c", "a"]]|_G5658]) + Exit: (15) lists:append([[[v, b], ["c", "a"]]], [[[v, c], ["b", "a"]]], [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]]]) + Call: (15) true + Exit: (15) true + Call: (15) true + Exit: (15) true + Exit: (14) updatevar([v, c], ["b", "a"], [[[v, b], ["c", "a"]]], [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]]]) + Exit: (13) putvalue([v, c], ["b", "a"], [[[v, b], ["c", "a"]]], [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]]]) + Call: (13) checkarguments([[v, d]], [[v, d]], [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]]], _G5670, [], _G5672) + Unify: (13) checkarguments([[v, d]], [[v, d]], [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]]], _G5670, [], _G5672) + Call: (14) [[v, d]]=[_G5660|_G5661] + Exit: (14) [[v, d]]=[[v, d]] + Call: (14) expressionnotatom([v, d]) + Unify: (14) expressionnotatom([v, d]) + Call: (15) isvalstrempty([v, d]) + Unify: (15) isvalstrempty([v, d]) + Call: (16) var([v, d]) + Fail: (16) var([v, d]) + Redo: (15) isvalstrempty([v, d]) + Unify: (15) isvalstrempty([v, d]) + Call: (16) isval([v, d]) + Unify: (16) isval([v, d]) + Call: (17) number([v, d]) + Fail: (17) number([v, d]) + Fail: (16) isval([v, d]) + Redo: (15) isvalstrempty([v, d]) + Unify: (15) isvalstrempty([v, d]) + Call: (16) string([v, d]) + Fail: (16) string([v, d]) + Fail: (15) isvalstrempty([v, d]) + Redo: (14) expressionnotatom([v, d]) + Unify: (14) expressionnotatom([v, d]) +^ Call: (15) 
not(atom([v, d])) +^ Unify: (15) not(user:atom([v, d])) +^ Exit: (15) not(user:atom([v, d])) + Call: (15) length([v, d], _G5676) + Unify: (15) length([v, d], _G5676) + Exit: (15) length([v, d], 2) + Call: (15) 2>=1 + Exit: (15) 2>=1 + Call: (15) expressionnotatom2([v, d]) + Unify: (15) expressionnotatom2([v, d]) + Call: (16) isvalstrempty(v) + Unify: (16) isvalstrempty(v) + Call: (17) var(v) + Fail: (17) var(v) + Redo: (16) isvalstrempty(v) + Unify: (16) isvalstrempty(v) + Call: (17) isval(v) + Unify: (17) isval(v) + Call: (18) number(v) + Fail: (18) number(v) + Fail: (17) isval(v) + Redo: (16) isvalstrempty(v) + Unify: (16) isvalstrempty(v) + Call: (17) string(v) + Fail: (17) string(v) + Fail: (16) isvalstrempty(v) + Fail: (15) expressionnotatom2([v, d]) + Redo: (14) expressionnotatom([v, d]) + Unify: (14) expressionnotatom([v, d]) + Call: (15) predicate_or_rule_name([v, d]) + Fail: (15) predicate_or_rule_name([v, d]) + Fail: (14) expressionnotatom([v, d]) + Redo: (13) checkarguments([[v, d]], [[v, d]], [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]]], _G5670, [], _G5672) + Unify: (13) checkarguments([[v, d]], [[v, d]], [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]]], _G5670, [], _G5672) + Call: (14) [[v, d]]=[_G5660|_G5661] + Exit: (14) [[v, d]]=[[v, d]] +^ Call: (14) not(var([v, d])) +^ Unify: (14) not(user:var([v, d])) +^ Exit: (14) not(user:var([v, d])) + Call: (14) isvar([v, d]) + Unify: (14) isvar([v, d]) + Exit: (14) isvar([v, d]) + Call: (14) [[v, d]]=[_G5668|_G5669] + Exit: (14) [[v, d]]=[[v, d]] + Call: (14) expressionnotatom([v, d]) + Unify: (14) expressionnotatom([v, d]) + Call: (15) isvalstrempty([v, d]) + Unify: (15) isvalstrempty([v, d]) + Call: (16) var([v, d]) + Fail: (16) var([v, d]) + Redo: (15) isvalstrempty([v, d]) + Unify: (15) isvalstrempty([v, d]) + Call: (16) isval([v, d]) + Unify: (16) isval([v, d]) + Call: (17) number([v, d]) + Fail: (17) number([v, d]) + Fail: (16) isval([v, d]) + Redo: (15) isvalstrempty([v, d]) + Unify: (15) 
isvalstrempty([v, d]) + Call: (16) string([v, d]) + Fail: (16) string([v, d]) + Fail: (15) isvalstrempty([v, d]) + Redo: (14) expressionnotatom([v, d]) + Unify: (14) expressionnotatom([v, d]) +^ Call: (15) not(atom([v, d])) +^ Unify: (15) not(user:atom([v, d])) +^ Exit: (15) not(user:atom([v, d])) + Call: (15) length([v, d], _G5684) + Unify: (15) length([v, d], _G5684) + Exit: (15) length([v, d], 2) + Call: (15) 2>=1 + Exit: (15) 2>=1 + Call: (15) expressionnotatom2([v, d]) + Unify: (15) expressionnotatom2([v, d]) + Call: (16) isvalstrempty(v) + Unify: (16) isvalstrempty(v) + Call: (17) var(v) + Fail: (17) var(v) + Redo: (16) isvalstrempty(v) + Unify: (16) isvalstrempty(v) + Call: (17) isval(v) + Unify: (17) isval(v) + Call: (18) number(v) + Fail: (18) number(v) + Fail: (17) isval(v) + Redo: (16) isvalstrempty(v) + Unify: (16) isvalstrempty(v) + Call: (17) string(v) + Fail: (17) string(v) + Fail: (16) isvalstrempty(v) + Fail: (15) expressionnotatom2([v, d]) + Redo: (14) expressionnotatom([v, d]) + Unify: (14) expressionnotatom([v, d]) + Call: (15) predicate_or_rule_name([v, d]) + Fail: (15) predicate_or_rule_name([v, d]) + Fail: (14) expressionnotatom([v, d]) + Redo: (13) checkarguments([[v, d]], [[v, d]], [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]]], _G5670, [], _G5672) + Unify: (13) checkarguments([[v, d]], [[v, d]], [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]]], _G5670, [], _G5672) + Call: (14) [[v, d]]=[_G5660|_G5661] + Exit: (14) [[v, d]]=[[v, d]] +^ Call: (14) not(var([v, d])) +^ Unify: (14) not(user:var([v, d])) +^ Exit: (14) not(user:var([v, d])) + Call: (14) isvar([v, d]) + Unify: (14) isvar([v, d]) + Exit: (14) isvar([v, d]) + Call: (14) [[v, d]]=[_G5668|_G5669] + Exit: (14) [[v, d]]=[[v, d]] +^ Call: (14) not(var([v, d])) +^ Unify: (14) not(user:var([v, d])) +^ Exit: (14) not(user:var([v, d])) + Call: (14) isvar([v, d]) + Unify: (14) isvar([v, d]) + Exit: (14) isvar([v, d]) + Call: (14) getvalue([v, d], _G5684, [[[v, b], ["c", "a"]], [[v, c], ["b", 
"a"]]]) + Unify: (14) getvalue([v, d], _G5684, [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]]]) +^ Call: (15) not(isvar([v, d])) +^ Unify: (15) not(user:isvar([v, d])) + Call: (16) isvar([v, d]) + Unify: (16) isvar([v, d]) + Exit: (16) isvar([v, d]) +^ Fail: (15) not(user:isvar([v, d])) + Redo: (14) getvalue([v, d], _G5684, [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]]]) + Call: (15) isvar([v, d]) + Unify: (15) isvar([v, d]) + Exit: (15) isvar([v, d]) + Call: (15) isvalstrorundef(_G5683) + Unify: (15) isvalstrorundef(_G5683) + Call: (16) var(_G5683) + Exit: (16) var(_G5683) + Exit: (15) isvalstrorundef(_G5683) + Call: (15) getvar([v, d], _G5684, [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]]]) + Unify: (15) getvar([v, d], _G5684, [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]]]) + Call: (16) lists:member([[v, d], _G5679], [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]]]) + Unify: (16) lists:member([[v, d], _G5679], [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]]]) + Fail: (16) lists:member([[v, d], _G5679], [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]]]) + Redo: (15) getvar([v, d], _G5684, [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]]]) + Unify: (15) getvar([v, d], empty, [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]]]) +^ Call: (16) not(member([[v, d], _G5682], [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]]])) +^ Unify: (16) not(user:member([[v, d], _G5682], [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]]])) +^ Exit: (16) not(user:member([[v, d], _G5682], [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]]])) + Call: (16) true + Exit: (16) true + Exit: (15) getvar([v, d], empty, [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]]]) + Exit: (14) getvalue([v, d], empty, [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]]]) + Call: (14) true + Exit: (14) true + Call: (14) putvalue([v, d], empty, [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]]], _G5698) + Unify: (14) putvalue([v, d], empty, [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]]], _G5698) +^ Call: (15) not(isvar([v, d])) +^ Unify: (15) not(user:isvar([v, d])) + Call: (16) 
isvar([v, d]) + Unify: (16) isvar([v, d]) + Exit: (16) isvar([v, d]) +^ Fail: (15) not(user:isvar([v, d])) + Redo: (14) putvalue([v, d], empty, [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]]], _G5698) + Call: (15) isvar([v, d]) + Unify: (15) isvar([v, d]) + Exit: (15) isvar([v, d]) + Call: (15) isvalstrorundef(empty) + Unify: (15) isvalstrorundef(empty) + Call: (16) var(empty) + Fail: (16) var(empty) + Redo: (15) isvalstrorundef(empty) + Unify: (15) isvalstrorundef(empty) +^ Call: (16) not(var(empty)) +^ Unify: (16) not(user:var(empty)) +^ Exit: (16) not(user:var(empty)) + Call: (16) isval(empty) + Unify: (16) isval(empty) + Call: (17) number(empty) + Fail: (17) number(empty) + Fail: (16) isval(empty) + Redo: (15) isvalstrorundef(empty) + Unify: (15) isvalstrorundef(empty) +^ Call: (16) not(var(empty)) +^ Unify: (16) not(user:var(empty)) +^ Exit: (16) not(user:var(empty)) + Call: (16) expression(empty) + Unify: (16) expression(empty) + Exit: (16) expression(empty) + Exit: (15) isvalstrorundef(empty) + Call: (15) updatevar([v, d], empty, [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]]], _G5703) + Unify: (15) updatevar([v, d], empty, [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]]], _G5703) + Call: (16) lists:member([[v, d], empty], [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]]]) + Unify: (16) lists:member([[v, d], empty], [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]]]) + Fail: (16) lists:member([[v, d], empty], [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]]]) + Redo: (15) updatevar([v, d], empty, [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]]], _G5703) +^ Call: (16) not(member([[v, d], _G5699], [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]]])) +^ Unify: (16) not(user:member([[v, d], _G5699], [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]]])) +^ Exit: (16) not(user:member([[v, d], _G5699], [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]]])) + Call: (16) _G5699=empty + Exit: (16) empty=empty + Call: (16) true + Exit: (16) true + Call: (16) lists:append([[[v, b], ["c", "a"]], [[v, c], ["b", "a"]]], 
[[[v, d], empty]], _G5723) + Unify: (16) lists:append([[[v, b], ["c", "a"]], [[v, c], ["b", "a"]]], [[[v, d], empty]], [[[v, b], ["c", "a"]]|_G5715]) + Exit: (16) lists:append([[[v, b], ["c", "a"]], [[v, c], ["b", "a"]]], [[[v, d], empty]], [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]], [[v, d], empty]]) + Call: (16) true + Exit: (16) true + Call: (16) true + Exit: (16) true + Exit: (15) updatevar([v, d], empty, [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]]], [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]], [[v, d], empty]]) + Exit: (14) putvalue([v, d], empty, [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]]], [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]], [[v, d], empty]]) + Call: (14) lists:append([], [[[v, d], [v, d]]], _G5738) + Unify: (14) lists:append([], [[[v, d], [v, d]]], [[[v, d], [v, d]]]) + Exit: (14) lists:append([], [[[v, d], [v, d]]], [[[v, d], [v, d]]]) + Call: (14) checkarguments([], [], [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]], [[v, d], empty]], _G5739, [[[v, d], [v, d]]], _G5741) + Unify: (14) checkarguments([], [], [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]], [[v, d], empty]], [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]], [[v, d], empty]], [[[v, d], [v, d]]], [[[v, d], [v, d]]]) + Exit: (14) checkarguments([], [], [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]], [[v, d], empty]], [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]], [[v, d], empty]], [[[v, d], [v, d]]], [[[v, d], [v, d]]]) + Exit: (13) checkarguments([[v, d]], [[v, d]], [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]]], [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]], [[v, d], empty]], [], [[[v, d], [v, d]]]) + Exit: (12) checkarguments([["b", "a"], [v, d]], [[v, c], [v, d]], [[[v, b], ["c", "a"]]], [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]], [[v, d], empty]], [], [[[v, d], [v, d]]]) + Exit: (11) checkarguments([["c", "a"], ["b", "a"], [v, d]], [[v, b], [v, c], [v, d]], [], [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]], [[v, d], empty]], [], [[[v, d], [v, d]]]) + Call: (11) debug(on) + Fail: (11) debug(on) + 
Redo: (10) member1([[n, f], [["c", "a"], ["b", "a"], [v, d]]], [[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]], _G14) + Call: (11) true + Exit: (11) true + Call: (11) interpretbody([[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]], [[v, d], empty]], _G5739, [[[n, =], [[v, d], [v, c]]]], true) + Unify: (11) interpretbody([[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]], [[v, d], empty]], _G5739, [[[n, =], [[v, d], [v, c]]]], true) + Call: (12) [[[n, =], [[v, d], [v, c]]]]=[_G5729|_G5730] + Exit: (12) [[[n, =], [[v, d], [v, c]]]]=[[[n, =], [[v, d], [v, c]]]] +^ Call: (12) not(predicate_or_rule_name([[n, =], [[v, d], [v, c]]])) +^ Unify: (12) not(user:predicate_or_rule_name([[n, =], [[v, d], [v, c]]])) + Call: (13) predicate_or_rule_name([[n, =], [[v, d], [v, c]]]) + Fail: (13) predicate_or_rule_name([[n, =], [[v, d], [v, c]]]) +^ Exit: (12) not(user:predicate_or_rule_name([[n, =], [[v, d], [v, c]]])) + Call: (12) interpretbody([[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]], [[v, d], empty]], _G5747, [[n, =], [[v, d], [v, c]]], _G5749) + Unify: (12) interpretbody([[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]], [[v, d], empty]], _G5747, [[n, =], [[v, d], [v, c]]], _G5749) + Call: (13) [[n, =], [[v, d], [v, c]]]=[_G5737|_G5738] + Exit: (13) [[n, =], [[v, d], [v, c]]]=[[n, =], [[v, d], [v, c]]] +^ Call: (13) not(predicate_or_rule_name([n, =])) +^ Unify: (13) 
not(user:predicate_or_rule_name([n, =])) + Call: (14) predicate_or_rule_name([n, =]) + Unify: (14) predicate_or_rule_name([n, =]) + Exit: (14) predicate_or_rule_name([n, =]) +^ Fail: (13) not(user:predicate_or_rule_name([n, =])) + Redo: (12) interpretbody([[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]], [[v, d], empty]], _G5747, [[n, =], [[v, d], [v, c]]], _G5749) + Unify: (12) interpretbody([[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]], [[v, d], empty]], _G5747, [[n, =], [[v, d], [v, c]]], _G5749) + Call: (13) [[n, =], [[v, d], [v, c]]]=[[not, [_G5746]]|_G5738] + Fail: (13) [[n, =], [[v, d], [v, c]]]=[[not, [_G5746]]|_G5738] + Redo: (12) interpretbody([[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]], [[v, d], empty]], _G5747, [[n, =], [[v, d], [v, c]]], _G5749) + Unify: (12) interpretbody([[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]], [[v, d], empty]], _G5747, [[n, =], [[v, d], [v, c]]], _G5749) + Call: (13) [[n, =], [[v, d], [v, c]]]=[[_G5740], or, [_G5749]] + Fail: (13) [[n, =], [[v, d], [v, c]]]=[[_G5740], or, [_G5749]] + Redo: (12) interpretbody([[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]], [[v, d], empty]], _G5747, [[n, =], [[v, d], [v, c]]], _G5749) + Unify: (12) interpretbody([[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], 
[...|...]]]]], [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]], [[v, d], empty]], _G5747, [[n, =], [[v, d], [v, c]]], _G5749) + Call: (13) [[n, =], [[v, d], [v, c]]]=[_G5737|_G5738] + Exit: (13) [[n, =], [[v, d], [v, c]]]=[[n, =], [[v, d], [v, c]]] +^ Call: (13) not(predicate_or_rule_name([n, =])) +^ Unify: (13) not(user:predicate_or_rule_name([n, =])) + Call: (14) predicate_or_rule_name([n, =]) + Unify: (14) predicate_or_rule_name([n, =]) + Exit: (14) predicate_or_rule_name([n, =]) +^ Fail: (13) not(user:predicate_or_rule_name([n, =])) + Fail: (12) interpretbody([[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]], [[v, d], empty]], _G5747, [[n, =], [[v, d], [v, c]]], _G5749) + Redo: (11) interpretbody([[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]], [[v, d], empty]], _G5739, [[[n, =], [[v, d], [v, c]]]], true) + Unify: (11) interpretbody([[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]], [[v, d], empty]], _G5739, [[[n, =], [[v, d], [v, c]]]], true) + Call: (12) [[[n, =], [[v, d], [v, c]]]]=[[not, [_G5738]]|_G5730] + Fail: (12) [[[n, =], [[v, d], [v, c]]]]=[[not, [_G5738]]|_G5730] + Redo: (11) interpretbody([[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]], [[v, d], empty]], _G5739, [[[n, =], [[v, d], [v, c]]]], true) + Unify: (11) interpretbody([[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]], [[v, d], empty]], _G5739, [[[n, =], 
[[v, d], [v, c]]]], true) + Call: (12) [[[n, =], [[v, d], [v, c]]]]=[[_G5732], or, [_G5741]] + Fail: (12) [[[n, =], [[v, d], [v, c]]]]=[[_G5732], or, [_G5741]] + Redo: (11) interpretbody([[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]], [[v, d], empty]], _G5739, [[[n, =], [[v, d], [v, c]]]], true) + Unify: (11) interpretbody([[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]], [[v, d], empty]], _G5739, [[[n, =], [[v, d], [v, c]]]], true) + Call: (12) [[[n, =], [[v, d], [v, c]]]]=[_G5729|_G5730] + Exit: (12) [[[n, =], [[v, d], [v, c]]]]=[[[n, =], [[v, d], [v, c]]]] +^ Call: (12) not(predicate_or_rule_name([[n, =], [[v, d], [v, c]]])) +^ Unify: (12) not(user:predicate_or_rule_name([[n, =], [[v, d], [v, c]]])) + Call: (13) predicate_or_rule_name([[n, =], [[v, d], [v, c]]]) + Fail: (13) predicate_or_rule_name([[n, =], [[v, d], [v, c]]]) +^ Exit: (12) not(user:predicate_or_rule_name([[n, =], [[v, d], [v, c]]])) + Call: (12) interpretstatement1([[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]], [[n, =], [[v, d], [v, c]]], [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]], [[v, d], empty]], _G5748, _G5749, _G5750) + Unify: (12) interpretstatement1([[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]], [[n, =], [[v, d], [v, c]]], [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]], [[v, d], empty]], _G5748, true, nocut) + Call: (13) isop(=) + Unify: (13) isop(=) + Exit: (13) isop(=) + Call: (13) interpretpart(is, [v, d], [v, c], [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]], [[v, d], empty]], _G5748) + Unify: (13) interpretpart(is, [v, d], [v, c], [[[v, b], ["c", "a"]], 
[[v, c], ["b", "a"]], [[v, d], empty]], _G5748) + Call: (14) getvalues([v, d], [v, c], _G5746, _G5747, [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]], [[v, d], empty]]) + Unify: (14) getvalues([v, d], [v, c], _G5746, _G5747, [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]], [[v, d], empty]]) + Call: (15) getvalue([v, d], _G5745, [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]], [[v, d], empty]]) + Unify: (15) getvalue([v, d], _G5745, [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]], [[v, d], empty]]) +^ Call: (16) not(isvar([v, d])) +^ Unify: (16) not(user:isvar([v, d])) + Call: (17) isvar([v, d]) + Unify: (17) isvar([v, d]) + Exit: (17) isvar([v, d]) +^ Fail: (16) not(user:isvar([v, d])) + Redo: (15) getvalue([v, d], _G5745, [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]], [[v, d], empty]]) + Call: (16) isvar([v, d]) + Unify: (16) isvar([v, d]) + Exit: (16) isvar([v, d]) + Call: (16) isvalstrorundef(_G5744) + Unify: (16) isvalstrorundef(_G5744) + Call: (17) var(_G5744) + Exit: (17) var(_G5744) + Exit: (16) isvalstrorundef(_G5744) + Call: (16) getvar([v, d], _G5745, [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]], [[v, d], empty]]) + Unify: (16) getvar([v, d], _G5745, [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]], [[v, d], empty]]) + Call: (17) lists:member([[v, d], _G5740], [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]], [[v, d], empty]]) + Unify: (17) lists:member([[v, d], _G5740], [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]], [[v, d], empty]]) + Exit: (17) lists:member([[v, d], empty], [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]], [[v, d], empty]]) +^ Call: (17) not(empty=empty) +^ Unify: (17) not(user: (empty=empty)) +^ Fail: (17) not(user: (empty=empty)) + Redo: (16) getvar([v, d], _G5745, [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]], [[v, d], empty]]) + Unify: (16) getvar([v, d], empty, [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]], [[v, d], empty]]) +^ Call: (17) not(member([[v, d], _G5743], [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]], [[v, d], empty]])) +^ Unify: (17) not(user:member([[v, d], 
_G5743], [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]], [[v, d], empty]])) +^ Fail: (17) not(user:member([[v, d], _G5743], [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]], [[v, d], empty]])) + Redo: (16) getvar([v, d], empty, [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]], [[v, d], empty]]) + Call: (17) lists:member([[v, d], empty], [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]], [[v, d], empty]]) + Unify: (17) lists:member([[v, d], empty], [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]], [[v, d], empty]]) + Exit: (17) lists:member([[v, d], empty], [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]], [[v, d], empty]]) + Exit: (16) getvar([v, d], empty, [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]], [[v, d], empty]]) + Exit: (15) getvalue([v, d], empty, [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]], [[v, d], empty]]) + Call: (15) getvalue([v, c], _G5751, [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]], [[v, d], empty]]) + Unify: (15) getvalue([v, c], _G5751, [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]], [[v, d], empty]]) +^ Call: (16) not(isvar([v, c])) +^ Unify: (16) not(user:isvar([v, c])) + Call: (17) isvar([v, c]) + Unify: (17) isvar([v, c]) + Exit: (17) isvar([v, c]) +^ Fail: (16) not(user:isvar([v, c])) + Redo: (15) getvalue([v, c], _G5751, [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]], [[v, d], empty]]) + Call: (16) isvar([v, c]) + Unify: (16) isvar([v, c]) + Exit: (16) isvar([v, c]) + Call: (16) isvalstrorundef(_G5750) + Unify: (16) isvalstrorundef(_G5750) + Call: (17) var(_G5750) + Exit: (17) var(_G5750) + Exit: (16) isvalstrorundef(_G5750) + Call: (16) getvar([v, c], _G5751, [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]], [[v, d], empty]]) + Unify: (16) getvar([v, c], _G5751, [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]], [[v, d], empty]]) + Call: (17) lists:member([[v, c], _G5746], [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]], [[v, d], empty]]) + Unify: (17) lists:member([[v, c], _G5746], [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]], [[v, d], empty]]) + Exit: (17) lists:member([[v, c], ["b", 
"a"]], [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]], [[v, d], empty]]) +^ Call: (17) not(["b", "a"]=empty) +^ Unify: (17) not(user: (["b", "a"]=empty)) +^ Exit: (17) not(user: (["b", "a"]=empty)) + Exit: (16) getvar([v, c], ["b", "a"], [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]], [[v, d], empty]]) + Exit: (15) getvalue([v, c], ["b", "a"], [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]], [[v, d], empty]]) + Exit: (14) getvalues([v, d], [v, c], empty, ["b", "a"], [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]], [[v, d], empty]]) + Call: (14) expression(empty) + Unify: (14) expression(empty) + Exit: (14) expression(empty) + Call: (14) expression(["b", "a"]) + Unify: (14) expression(["b", "a"]) + Call: (15) isval(["b", "a"]) + Unify: (15) isval(["b", "a"]) + Call: (16) number(["b", "a"]) + Fail: (16) number(["b", "a"]) + Fail: (15) isval(["b", "a"]) + Redo: (14) expression(["b", "a"]) + Unify: (14) expression(["b", "a"]) + Call: (15) string(["b", "a"]) + Fail: (15) string(["b", "a"]) + Redo: (14) expression(["b", "a"]) + Unify: (14) expression(["b", "a"]) + Call: (15) atom(["b", "a"]) + Fail: (15) atom(["b", "a"]) + Redo: (14) expression(["b", "a"]) + Unify: (14) expression(["b", "a"]) +^ Call: (15) not(atom(["b", "a"])) +^ Unify: (15) not(user:atom(["b", "a"])) +^ Exit: (15) not(user:atom(["b", "a"])) + Call: (15) length(["b", "a"], _G5768) + Unify: (15) length(["b", "a"], _G5768) + Exit: (15) length(["b", "a"], 2) + Call: (15) 2>=1 + Exit: (15) 2>=1 + Call: (15) expression2(["b", "a"]) + Unify: (15) expression2(["b", "a"]) + Call: (16) expression3("b") + Unify: (16) expression3("b") + Call: (17) isval("b") + Unify: (17) isval("b") + Call: (18) number("b") + Fail: (18) number("b") + Fail: (17) isval("b") + Redo: (16) expression3("b") + Unify: (16) expression3("b") + Call: (17) string("b") + Exit: (17) string("b") + Exit: (16) expression3("b") + Call: (16) expression2(["a"]) + Unify: (16) expression2(["a"]) + Call: (17) expression3("a") + Unify: (17) expression3("a") + Call: 
(18) isval("a") + Unify: (18) isval("a") + Call: (19) number("a") + Fail: (19) number("a") + Fail: (18) isval("a") + Redo: (17) expression3("a") + Unify: (17) expression3("a") + Call: (18) string("a") + Exit: (18) string("a") + Exit: (17) expression3("a") + Call: (17) expression2([]) + Unify: (17) expression2([]) + Exit: (17) expression2([]) + Exit: (16) expression2(["a"]) + Exit: (15) expression2(["b", "a"]) + Exit: (14) expression(["b", "a"]) + Call: (14) val1emptyorvalsequal(empty, ["b", "a"]) + Unify: (14) val1emptyorvalsequal(empty, ["b", "a"]) + Exit: (14) val1emptyorvalsequal(empty, ["b", "a"]) + Call: (14) putvalue([v, d], ["b", "a"], [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]], [[v, d], empty]], _G5770) + Unify: (14) putvalue([v, d], ["b", "a"], [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]], [[v, d], empty]], _G5770) +^ Call: (15) not(isvar([v, d])) +^ Unify: (15) not(user:isvar([v, d])) + Call: (16) isvar([v, d]) + Unify: (16) isvar([v, d]) + Exit: (16) isvar([v, d]) +^ Fail: (15) not(user:isvar([v, d])) + Redo: (14) putvalue([v, d], ["b", "a"], [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]], [[v, d], empty]], _G5770) + Call: (15) isvar([v, d]) + Unify: (15) isvar([v, d]) + Exit: (15) isvar([v, d]) + Call: (15) isvalstrorundef(["b", "a"]) + Unify: (15) isvalstrorundef(["b", "a"]) + Call: (16) var(["b", "a"]) + Fail: (16) var(["b", "a"]) + Redo: (15) isvalstrorundef(["b", "a"]) + Unify: (15) isvalstrorundef(["b", "a"]) +^ Call: (16) not(var(["b", "a"])) +^ Unify: (16) not(user:var(["b", "a"])) +^ Exit: (16) not(user:var(["b", "a"])) + Call: (16) isval(["b", "a"]) + Unify: (16) isval(["b", "a"]) + Call: (17) number(["b", "a"]) + Fail: (17) number(["b", "a"]) + Fail: (16) isval(["b", "a"]) + Redo: (15) isvalstrorundef(["b", "a"]) + Unify: (15) isvalstrorundef(["b", "a"]) +^ Call: (16) not(var(["b", "a"])) +^ Unify: (16) not(user:var(["b", "a"])) +^ Exit: (16) not(user:var(["b", "a"])) + Call: (16) expression(["b", "a"]) + Unify: (16) expression(["b", "a"]) + 
Call: (17) isval(["b", "a"]) + Unify: (17) isval(["b", "a"]) + Call: (18) number(["b", "a"]) + Fail: (18) number(["b", "a"]) + Fail: (17) isval(["b", "a"]) + Redo: (16) expression(["b", "a"]) + Unify: (16) expression(["b", "a"]) + Call: (17) string(["b", "a"]) + Fail: (17) string(["b", "a"]) + Redo: (16) expression(["b", "a"]) + Unify: (16) expression(["b", "a"]) + Call: (17) atom(["b", "a"]) + Fail: (17) atom(["b", "a"]) + Redo: (16) expression(["b", "a"]) + Unify: (16) expression(["b", "a"]) +^ Call: (17) not(atom(["b", "a"])) +^ Unify: (17) not(user:atom(["b", "a"])) +^ Exit: (17) not(user:atom(["b", "a"])) + Call: (17) length(["b", "a"], _G5778) + Unify: (17) length(["b", "a"], _G5778) + Exit: (17) length(["b", "a"], 2) + Call: (17) 2>=1 + Exit: (17) 2>=1 + Call: (17) expression2(["b", "a"]) + Unify: (17) expression2(["b", "a"]) + Call: (18) expression3("b") + Unify: (18) expression3("b") + Call: (19) isval("b") + Unify: (19) isval("b") + Call: (20) number("b") + Fail: (20) number("b") + Fail: (19) isval("b") + Redo: (18) expression3("b") + Unify: (18) expression3("b") + Call: (19) string("b") + Exit: (19) string("b") + Exit: (18) expression3("b") + Call: (18) expression2(["a"]) + Unify: (18) expression2(["a"]) + Call: (19) expression3("a") + Unify: (19) expression3("a") + Call: (20) isval("a") + Unify: (20) isval("a") + Call: (21) number("a") + Fail: (21) number("a") + Fail: (20) isval("a") + Redo: (19) expression3("a") + Unify: (19) expression3("a") + Call: (20) string("a") + Exit: (20) string("a") + Exit: (19) expression3("a") + Call: (19) expression2([]) + Unify: (19) expression2([]) + Exit: (19) expression2([]) + Exit: (18) expression2(["a"]) + Exit: (17) expression2(["b", "a"]) + Exit: (16) expression(["b", "a"]) + Exit: (15) isvalstrorundef(["b", "a"]) + Call: (15) updatevar([v, d], ["b", "a"], [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]], [[v, d], empty]], _G5780) + Unify: (15) updatevar([v, d], ["b", "a"], [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]], 
[[v, d], empty]], _G5780) + Call: (16) lists:member([[v, d], empty], [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]], [[v, d], empty]]) + Unify: (16) lists:member([[v, d], empty], [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]], [[v, d], empty]]) + Exit: (16) lists:member([[v, d], empty], [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]], [[v, d], empty]]) + Call: (16) lists:delete([[[v, b], ["c", "a"]], [[v, c], ["b", "a"]], [[v, d], empty]], [[v, d], empty], _G5791) + Unify: (16) lists:delete([[[v, b], ["c", "a"]], [[v, c], ["b", "a"]], [[v, d], empty]], [[v, d], empty], _G5791) + Exit: (16) lists:delete([[[v, b], ["c", "a"]], [[v, c], ["b", "a"]], [[v, d], empty]], [[v, d], empty], [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]]]) + Call: (16) lists:append([[[v, b], ["c", "a"]], [[v, c], ["b", "a"]]], [[[v, d], ["b", "a"]]], _G5806) + Unify: (16) lists:append([[[v, b], ["c", "a"]], [[v, c], ["b", "a"]]], [[[v, d], ["b", "a"]]], [[[v, b], ["c", "a"]]|_G5798]) + Exit: (16) lists:append([[[v, b], ["c", "a"]], [[v, c], ["b", "a"]]], [[[v, d], ["b", "a"]]], [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]], [[v, d], ["b", "a"]]]) + Call: (16) true + Exit: (16) true + Call: (16) true + Exit: (16) true + Call: (16) true + Exit: (16) true + Exit: (15) updatevar([v, d], ["b", "a"], [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]], [[v, d], empty]], [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]], [[v, d], ["b", "a"]]]) + Exit: (14) putvalue([v, d], ["b", "a"], [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]], [[v, d], empty]], [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]], [[v, d], ["b", "a"]]]) + Call: (14) debug(on) + Fail: (14) debug(on) + Redo: (13) interpretpart(is, [v, d], [v, c], [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]], [[v, d], empty]], [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]], [[v, d], ["b", "a"]]]) + Call: (14) true + Exit: (14) true + Call: (14) debug(on) + Fail: (14) debug(on) + Redo: (13) interpretpart(is, [v, d], [v, c], [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]], [[v, d], empty]], [[[v, 
b], ["c", "a"]], [[v, c], ["b", "a"]], [[v, d], ["b", "a"]]]) + Call: (14) true + Exit: (14) true + Exit: (13) interpretpart(is, [v, d], [v, c], [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]], [[v, d], empty]], [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]], [[v, d], ["b", "a"]]]) + Exit: (12) interpretstatement1([[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]], [[n, =], [[v, d], [v, c]]], [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]], [[v, d], empty]], [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]], [[v, d], ["b", "a"]]], true, nocut) +^ Call: (12) not(nocut=cut) +^ Unify: (12) not(user: (nocut=cut)) +^ Exit: (12) not(user: (nocut=cut)) + Call: (12) _G5816=[[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]] + Exit: (12) [[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]]=[[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]] + Call: (12) interpretbody([[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]], [[v, d], ["b", "a"]]], _G5819, [], _G5821) + Unify: (12) interpretbody([[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]], [[v, d], ["b", "a"]]], [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]], [[v, d], ["b", "a"]]], [], true) + Exit: (12) interpretbody([[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]], [[v, d], ["b", "a"]]], [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]], [[v, d], ["b", "a"]]], [], true) + Call: (12) logicalconjunction(true, true, true) + Unify: (12) logicalconjunction(true, true, true) + Call: (13) true(true) + Unify: (13) true(true) + Exit: (13) true(true) 
+ Call: (13) true(true) + Unify: (13) true(true) + Exit: (13) true(true) + Exit: (12) logicalconjunction(true, true, true) + Exit: (11) interpretbody([[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]], [[v, d], empty]], [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]], [[v, d], ["b", "a"]]], [[[n, =], [[v, d], [v, c]]]], true) + Call: (11) updatevars([[[v, d], [v, d]]], [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]], [[v, d], ["b", "a"]]], [], _G5819) + Unify: (11) updatevars([[[v, d], [v, d]]], [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]], [[v, d], ["b", "a"]]], [], _G5819) + Call: (12) [[[v, d], [v, d]]]=[[_G5812, _G5815]|_G5810] + Exit: (12) [[[v, d], [v, d]]]=[[[v, d], [v, d]]] + Call: (12) expressionnotatom([v, d]) + Unify: (12) expressionnotatom([v, d]) + Call: (13) isvalstrempty([v, d]) + Unify: (13) isvalstrempty([v, d]) + Call: (14) var([v, d]) + Fail: (14) var([v, d]) + Redo: (13) isvalstrempty([v, d]) + Unify: (13) isvalstrempty([v, d]) + Call: (14) isval([v, d]) + Unify: (14) isval([v, d]) + Call: (15) number([v, d]) + Fail: (15) number([v, d]) + Fail: (14) isval([v, d]) + Redo: (13) isvalstrempty([v, d]) + Unify: (13) isvalstrempty([v, d]) + Call: (14) string([v, d]) + Fail: (14) string([v, d]) + Fail: (13) isvalstrempty([v, d]) + Redo: (12) expressionnotatom([v, d]) + Unify: (12) expressionnotatom([v, d]) +^ Call: (13) not(atom([v, d])) +^ Unify: (13) not(user:atom([v, d])) +^ Exit: (13) not(user:atom([v, d])) + Call: (13) length([v, d], _G5831) + Unify: (13) length([v, d], _G5831) + Exit: (13) length([v, d], 2) + Call: (13) 2>=1 + Exit: (13) 2>=1 + Call: (13) expressionnotatom2([v, d]) + Unify: (13) expressionnotatom2([v, d]) + Call: (14) isvalstrempty(v) + Unify: (14) isvalstrempty(v) + Call: (15) var(v) + Fail: (15) var(v) + Redo: (14) isvalstrempty(v) + Unify: (14) isvalstrempty(v) + Call: (15) isval(v) + Unify: (15) isval(v) + 
Call: (16) number(v) + Fail: (16) number(v) + Fail: (15) isval(v) + Redo: (14) isvalstrempty(v) + Unify: (14) isvalstrempty(v) + Call: (15) string(v) + Fail: (15) string(v) + Fail: (14) isvalstrempty(v) + Fail: (13) expressionnotatom2([v, d]) + Redo: (12) expressionnotatom([v, d]) + Unify: (12) expressionnotatom([v, d]) + Call: (13) predicate_or_rule_name([v, d]) + Fail: (13) predicate_or_rule_name([v, d]) + Fail: (12) expressionnotatom([v, d]) + Redo: (11) updatevars([[[v, d], [v, d]]], [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]], [[v, d], ["b", "a"]]], [], _G5828) + Call: (12) lists:member([[v, d], _G5821], [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]], [[v, d], ["b", "a"]]]) + Unify: (12) lists:member([[v, d], _G5821], [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]], [[v, d], ["b", "a"]]]) + Exit: (12) lists:member([[v, d], ["b", "a"]], [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]], [[v, d], ["b", "a"]]]) + Call: (12) lists:append([], [[[v, d], ["b", "a"]]], _G5842) + Unify: (12) lists:append([], [[[v, d], ["b", "a"]]], [[[v, d], ["b", "a"]]]) + Exit: (12) lists:append([], [[[v, d], ["b", "a"]]], [[[v, d], ["b", "a"]]]) + Call: (12) updatevars([], [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]], [[v, d], ["b", "a"]]], [[[v, d], ["b", "a"]]], _G5843) + Unify: (12) updatevars([], [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]], [[v, d], ["b", "a"]]], [[[v, d], ["b", "a"]]], [[[v, d], ["b", "a"]]]) + Exit: (12) updatevars([], [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]], [[v, d], ["b", "a"]]], [[[v, d], ["b", "a"]]], [[[v, d], ["b", "a"]]]) + Exit: (11) updatevars([[[v, d], [v, d]]], [[[v, b], ["c", "a"]], [[v, c], ["b", "a"]], [[v, d], ["b", "a"]]], [], [[[v, d], ["b", "a"]]]) +^ Call: (11) not([[[v, d], ["b", "a"]]]=[]) +^ Unify: (11) not(user: ([[[v, d], ["b", "a"]]]=[])) +^ Exit: (11) not(user: ([[[v, d], ["b", "a"]]]=[])) + Call: (11) unique1([[[v, d], ["b", "a"]]], [], _G14) + Unify: (11) unique1([[[v, d], ["b", "a"]]], [], _G14) + Call: (12) lists:delete([], [[v, d], ["b", 
"a"]], _G5848) + Unify: (12) lists:delete([], [[v, d], ["b", "a"]], []) + Exit: (12) lists:delete([], [[v, d], ["b", "a"]], []) + Call: (12) lists:append([], [[[v, d], ["b", "a"]]], _G5851) + Unify: (12) lists:append([], [[[v, d], ["b", "a"]]], [[[v, d], ["b", "a"]]]) + Exit: (12) lists:append([], [[[v, d], ["b", "a"]]], [[[v, d], ["b", "a"]]]) + Call: (12) unique1([], [[[v, d], ["b", "a"]]], _G14) + Unify: (12) unique1([], [[[v, d], ["b", "a"]]], [[[v, d], ["b", "a"]]]) + Exit: (12) unique1([], [[[v, d], ["b", "a"]]], [[[v, d], ["b", "a"]]]) + Exit: (11) unique1([[[v, d], ["b", "a"]]], [], [[[v, d], ["b", "a"]]]) + Call: (11) findresult3([["c", "a"], ["b", "a"], [v, d]], [[[v, d], ["b", "a"]]], [], _G5852) + Unify: (11) findresult3([["c", "a"], ["b", "a"], [v, d]], [[[v, d], ["b", "a"]]], [], _G5852) + Call: (12) [["c", "a"], ["b", "a"], [v, d]]=[_G5842|_G5843] + Exit: (12) [["c", "a"], ["b", "a"], [v, d]]=[["c", "a"], ["b", "a"], [v, d]] + Call: (12) expressionnotatom(["c", "a"]) + Unify: (12) expressionnotatom(["c", "a"]) + Call: (13) isvalstrempty(["c", "a"]) + Unify: (13) isvalstrempty(["c", "a"]) + Call: (14) var(["c", "a"]) + Fail: (14) var(["c", "a"]) + Redo: (13) isvalstrempty(["c", "a"]) + Unify: (13) isvalstrempty(["c", "a"]) + Call: (14) isval(["c", "a"]) + Unify: (14) isval(["c", "a"]) + Call: (15) number(["c", "a"]) + Fail: (15) number(["c", "a"]) + Fail: (14) isval(["c", "a"]) + Redo: (13) isvalstrempty(["c", "a"]) + Unify: (13) isvalstrempty(["c", "a"]) + Call: (14) string(["c", "a"]) + Fail: (14) string(["c", "a"]) + Fail: (13) isvalstrempty(["c", "a"]) + Redo: (12) expressionnotatom(["c", "a"]) + Unify: (12) expressionnotatom(["c", "a"]) +^ Call: (13) not(atom(["c", "a"])) +^ Unify: (13) not(user:atom(["c", "a"])) +^ Exit: (13) not(user:atom(["c", "a"])) + Call: (13) length(["c", "a"], _G5858) + Unify: (13) length(["c", "a"], _G5858) + Exit: (13) length(["c", "a"], 2) + Call: (13) 2>=1 + Exit: (13) 2>=1 + Call: (13) expressionnotatom2(["c", "a"]) 
+ Unify: (13) expressionnotatom2(["c", "a"]) + Call: (14) isvalstrempty("c") + Unify: (14) isvalstrempty("c") + Call: (15) var("c") + Fail: (15) var("c") + Redo: (14) isvalstrempty("c") + Unify: (14) isvalstrempty("c") + Call: (15) isval("c") + Unify: (15) isval("c") + Call: (16) number("c") + Fail: (16) number("c") + Fail: (15) isval("c") + Redo: (14) isvalstrempty("c") + Unify: (14) isvalstrempty("c") + Call: (15) string("c") + Exit: (15) string("c") + Exit: (14) isvalstrempty("c") + Call: (14) expressionnotatom2(["a"]) + Unify: (14) expressionnotatom2(["a"]) + Call: (15) isvalstrempty("a") + Unify: (15) isvalstrempty("a") + Call: (16) var("a") + Fail: (16) var("a") + Redo: (15) isvalstrempty("a") + Unify: (15) isvalstrempty("a") + Call: (16) isval("a") + Unify: (16) isval("a") + Call: (17) number("a") + Fail: (17) number("a") + Fail: (16) isval("a") + Redo: (15) isvalstrempty("a") + Unify: (15) isvalstrempty("a") + Call: (16) string("a") + Exit: (16) string("a") + Exit: (15) isvalstrempty("a") + Call: (15) expressionnotatom2([]) + Unify: (15) expressionnotatom2([]) + Exit: (15) expressionnotatom2([]) + Exit: (14) expressionnotatom2(["a"]) + Exit: (13) expressionnotatom2(["c", "a"]) + Exit: (12) expressionnotatom(["c", "a"]) + Call: (12) lists:append([], [["c", "a"]], _G5862) + Unify: (12) lists:append([], [["c", "a"]], [["c", "a"]]) + Exit: (12) lists:append([], [["c", "a"]], [["c", "a"]]) + Call: (12) findresult3([["b", "a"], [v, d]], [[[v, d], ["b", "a"]]], [["c", "a"]], _G5863) + Unify: (12) findresult3([["b", "a"], [v, d]], [[[v, d], ["b", "a"]]], [["c", "a"]], _G5863) + Call: (13) [["b", "a"], [v, d]]=[_G5853|_G5854] + Exit: (13) [["b", "a"], [v, d]]=[["b", "a"], [v, d]] + Call: (13) expressionnotatom(["b", "a"]) + Unify: (13) expressionnotatom(["b", "a"]) + Call: (14) isvalstrempty(["b", "a"]) + Unify: (14) isvalstrempty(["b", "a"]) + Call: (15) var(["b", "a"]) + Fail: (15) var(["b", "a"]) + Redo: (14) isvalstrempty(["b", "a"]) + Unify: (14) 
isvalstrempty(["b", "a"]) + Call: (15) isval(["b", "a"]) + Unify: (15) isval(["b", "a"]) + Call: (16) number(["b", "a"]) + Fail: (16) number(["b", "a"]) + Fail: (15) isval(["b", "a"]) + Redo: (14) isvalstrempty(["b", "a"]) + Unify: (14) isvalstrempty(["b", "a"]) + Call: (15) string(["b", "a"]) + Fail: (15) string(["b", "a"]) + Fail: (14) isvalstrempty(["b", "a"]) + Redo: (13) expressionnotatom(["b", "a"]) + Unify: (13) expressionnotatom(["b", "a"]) +^ Call: (14) not(atom(["b", "a"])) +^ Unify: (14) not(user:atom(["b", "a"])) +^ Exit: (14) not(user:atom(["b", "a"])) + Call: (14) length(["b", "a"], _G5869) + Unify: (14) length(["b", "a"], _G5869) + Exit: (14) length(["b", "a"], 2) + Call: (14) 2>=1 + Exit: (14) 2>=1 + Call: (14) expressionnotatom2(["b", "a"]) + Unify: (14) expressionnotatom2(["b", "a"]) + Call: (15) isvalstrempty("b") + Unify: (15) isvalstrempty("b") + Call: (16) var("b") + Fail: (16) var("b") + Redo: (15) isvalstrempty("b") + Unify: (15) isvalstrempty("b") + Call: (16) isval("b") + Unify: (16) isval("b") + Call: (17) number("b") + Fail: (17) number("b") + Fail: (16) isval("b") + Redo: (15) isvalstrempty("b") + Unify: (15) isvalstrempty("b") + Call: (16) string("b") + Exit: (16) string("b") + Exit: (15) isvalstrempty("b") + Call: (15) expressionnotatom2(["a"]) + Unify: (15) expressionnotatom2(["a"]) + Call: (16) isvalstrempty("a") + Unify: (16) isvalstrempty("a") + Call: (17) var("a") + Fail: (17) var("a") + Redo: (16) isvalstrempty("a") + Unify: (16) isvalstrempty("a") + Call: (17) isval("a") + Unify: (17) isval("a") + Call: (18) number("a") + Fail: (18) number("a") + Fail: (17) isval("a") + Redo: (16) isvalstrempty("a") + Unify: (16) isvalstrempty("a") + Call: (17) string("a") + Exit: (17) string("a") + Exit: (16) isvalstrempty("a") + Call: (16) expressionnotatom2([]) + Unify: (16) expressionnotatom2([]) + Exit: (16) expressionnotatom2([]) + Exit: (15) expressionnotatom2(["a"]) + Exit: (14) expressionnotatom2(["b", "a"]) + Exit: (13) 
expressionnotatom(["b", "a"]) + Call: (13) lists:append([["c", "a"]], [["b", "a"]], _G5873) + Unify: (13) lists:append([["c", "a"]], [["b", "a"]], [["c", "a"]|_G5865]) + Exit: (13) lists:append([["c", "a"]], [["b", "a"]], [["c", "a"], ["b", "a"]]) + Call: (13) findresult3([[v, d]], [[[v, d], ["b", "a"]]], [["c", "a"], ["b", "a"]], _G5877) + Unify: (13) findresult3([[v, d]], [[[v, d], ["b", "a"]]], [["c", "a"], ["b", "a"]], _G5877) + Call: (14) [[v, d]]=[_G5867|_G5868] + Exit: (14) [[v, d]]=[[v, d]] + Call: (14) expressionnotatom([v, d]) + Unify: (14) expressionnotatom([v, d]) + Call: (15) isvalstrempty([v, d]) + Unify: (15) isvalstrempty([v, d]) + Call: (16) var([v, d]) + Fail: (16) var([v, d]) + Redo: (15) isvalstrempty([v, d]) + Unify: (15) isvalstrempty([v, d]) + Call: (16) isval([v, d]) + Unify: (16) isval([v, d]) + Call: (17) number([v, d]) + Fail: (17) number([v, d]) + Fail: (16) isval([v, d]) + Redo: (15) isvalstrempty([v, d]) + Unify: (15) isvalstrempty([v, d]) + Call: (16) string([v, d]) + Fail: (16) string([v, d]) + Fail: (15) isvalstrempty([v, d]) + Redo: (14) expressionnotatom([v, d]) + Unify: (14) expressionnotatom([v, d]) +^ Call: (15) not(atom([v, d])) +^ Unify: (15) not(user:atom([v, d])) +^ Exit: (15) not(user:atom([v, d])) + Call: (15) length([v, d], _G5883) + Unify: (15) length([v, d], _G5883) + Exit: (15) length([v, d], 2) + Call: (15) 2>=1 + Exit: (15) 2>=1 + Call: (15) expressionnotatom2([v, d]) + Unify: (15) expressionnotatom2([v, d]) + Call: (16) isvalstrempty(v) + Unify: (16) isvalstrempty(v) + Call: (17) var(v) + Fail: (17) var(v) + Redo: (16) isvalstrempty(v) + Unify: (16) isvalstrempty(v) + Call: (17) isval(v) + Unify: (17) isval(v) + Call: (18) number(v) + Fail: (18) number(v) + Fail: (17) isval(v) + Redo: (16) isvalstrempty(v) + Unify: (16) isvalstrempty(v) + Call: (17) string(v) + Fail: (17) string(v) + Fail: (16) isvalstrempty(v) + Fail: (15) expressionnotatom2([v, d]) + Redo: (14) expressionnotatom([v, d]) + Unify: (14) 
expressionnotatom([v, d]) + Call: (15) predicate_or_rule_name([v, d]) + Fail: (15) predicate_or_rule_name([v, d]) + Fail: (14) expressionnotatom([v, d]) + Redo: (13) findresult3([[v, d]], [[[v, d], ["b", "a"]]], [["c", "a"], ["b", "a"]], _G5877) + Unify: (13) findresult3([[v, d]], [[[v, d], ["b", "a"]]], [["c", "a"], ["b", "a"]], _G5877) + Call: (14) [[v, d]]=[_G5867|_G5868] + Exit: (14) [[v, d]]=[[v, d]] + Call: (14) isvar([v, d]) + Unify: (14) isvar([v, d]) + Exit: (14) isvar([v, d]) + Call: (14) lists:member([[v, d], _G5873], [[[v, d], ["b", "a"]]]) + Unify: (14) lists:member([[v, d], _G5873], [[[v, d], ["b", "a"]]]) + Exit: (14) lists:member([[v, d], ["b", "a"]], [[[v, d], ["b", "a"]]]) + Call: (14) lists:append([["c", "a"], ["b", "a"]], [["b", "a"]], _G5888) + Unify: (14) lists:append([["c", "a"], ["b", "a"]], [["b", "a"]], [["c", "a"]|_G5880]) + Exit: (14) lists:append([["c", "a"], ["b", "a"]], [["b", "a"]], [["c", "a"], ["b", "a"], ["b", "a"]]) + Call: (14) findresult3([], [[[v, d], ["b", "a"]]], [["c", "a"], ["b", "a"], ["b", "a"]], _G5895) + Unify: (14) findresult3([], [[[v, d], ["b", "a"]]], [["c", "a"], ["b", "a"], ["b", "a"]], [["c", "a"], ["b", "a"], ["b", "a"]]) + Exit: (14) findresult3([], [[[v, d], ["b", "a"]]], [["c", "a"], ["b", "a"], ["b", "a"]], [["c", "a"], ["b", "a"], ["b", "a"]]) + Exit: (13) findresult3([[v, d]], [[[v, d], ["b", "a"]]], [["c", "a"], ["b", "a"]], [["c", "a"], ["b", "a"], ["b", "a"]]) + Exit: (12) findresult3([["b", "a"], [v, d]], [[[v, d], ["b", "a"]]], [["c", "a"]], [["c", "a"], ["b", "a"], ["b", "a"]]) + Exit: (11) findresult3([["c", "a"], ["b", "a"], [v, d]], [[[v, d], ["b", "a"]]], [], [["c", "a"], ["b", "a"], ["b", "a"]]) + Call: (11) debug(on) + Fail: (11) debug(on) + Redo: (10) member1([[n, f], [["c", "a"], ["b", "a"], [v, d]]], [[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, d], ["b", "a"]]]) + Call: (11) true + Exit: (11) 
true + Call: (11) true + Exit: (11) true + Exit: (10) member1([[n, f], [["c", "a"], ["b", "a"], [v, d]]], [[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, d], ["b", "a"]]]) + Exit: (9) interpret1(off, [[n, f], [["c", "a"], ["b", "a"], [v, d]]], [[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, d], ["b", "a"]]]) + Exit: (8) interpret(off, [[n, f], [["c", "a"], ["b", "a"], [v, d]]], [[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, d], ["b", "a"]]]) + + Call: (8) interpret(off, [[n, f], [["b", "c"], ["a", "b"], [v, d]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], _G14) + Unify: (8) interpret(off, [[n, f], [["b", "c"], ["a", "b"], [v, d]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], _G14) + Call: (9) convert_to_grammar_part1([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [], _G5551) + Unify: (9) convert_to_grammar_part1([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [], _G5551) + Call: (10) convert_to_grammar_part11([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [], _G5551, [], _G5553) + Unify: (10) convert_to_grammar_part11([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [], _G5551, [], _G5553) + Call: (11) [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]]=[_G5542|_G5543] + Exit: (11) [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]]=[[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]] + Call: (11) [[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n, =], [[...|...]|...]]]]=[[n, _G5551], _G5554, "->", _G5563] + Fail: (11) [[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n, =], [[...|...]|...]]]]=[[n, _G5551], _G5554, "->", _G5563] + Redo: (10) convert_to_grammar_part11([[[n, f], [[v, c], [v, b], 
[v, d]], ":-", [[[n|...], [...|...]]]]], [], _G5554, [], _G5556) + Call: (11) [[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n, =], [[...|...]|...]]]]=[[n, _G5551], "->", _G5560] + Fail: (11) [[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n, =], [[...|...]|...]]]]=[[n, _G5551], "->", _G5560] + Redo: (10) convert_to_grammar_part11([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [], _G5551, [], _G5553) + Unify: (10) convert_to_grammar_part11([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [], _G5551, [], _G5553) + Call: (11) [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]]=[_G5542|_G5543] + Exit: (11) [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]]=[[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]] + Call: (11) [[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n, =], [[...|...]|...]]]]=[_G5545, _G5548, ":-", _G5557] + Exit: (11) [[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n, =], [[...|...]|...]]]]=[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n, =], [[...|...]|...]]]] + Call: (11) true + Exit: (11) true + Call: (11) true + Exit: (11) true + Call: (11) true + Exit: (11) true + Call: (11) true + Exit: (11) true + Call: (11) lists:append([], [[[n, f], [[v, c], [v, b], [v|...]], ":-", [[[...|...]|...]]]], _G5572) + Unify: (11) lists:append([], [[[n, f], [[v, c], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, f], [[v, c], [v, b], [v|...]], ":-", [[[...|...]|...]]]]) + Exit: (11) lists:append([], [[[n, f], [[v, c], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, f], [[v, c], [v, b], [v|...]], ":-", [[[...|...]|...]]]]) + Call: (11) convert_to_grammar_part11([], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], _G5572, [], _G5574) + Unify: (11) convert_to_grammar_part11([], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [], []) + Exit: (11) convert_to_grammar_part11([], [[[n, f], [[v, 
c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [], []) + Exit: (10) convert_to_grammar_part11([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [], []) + Call: (10) lists:append([[[n, f], [[v, c], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [], _G5572) + Unify: (10) lists:append([[[n, f], [[v, c], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [], [[[n, f], [[v, c], [v, b], [v|...]], ":-", [[[...|...]|...]]]|_G5564]) + Exit: (10) lists:append([[[n, f], [[v, c], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [], [[[n, f], [[v, c], [v, b], [v|...]], ":-", [[[...|...]|...]]]]) + Exit: (9) convert_to_grammar_part1([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]]) + Call: (9) interpret1(off, [[n, f], [["b", "c"], ["a", "b"], [v, d]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], _G14) + Unify: (9) interpret1(off, [[n, f], [["b", "c"], ["a", "b"], [v, d]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], _G14) +^ Call: (10) retractall(debug(_G5566)) +^ Exit: (10) retractall(debug(_G5566)) +^ Call: (10) assertz(debug(off)) +^ Exit: (10) assertz(debug(off)) +^ Call: (10) retractall(cut(_G5570)) +^ Exit: (10) retractall(cut(_G5570)) +^ Call: (10) assertz(cut(off)) +^ Exit: (10) assertz(cut(off)) + Call: (10) member1([[n, f], [["b", "c"], ["a", "b"], [v, d]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], _G14) + Unify: (10) member1([[n, f], [["b", "c"], ["a", "b"], [v, d]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, 
b], [v, d]], ":-", [[[n|...], [...|...]]]]], _G14) + Call: (11) cut(off) + Unify: (11) cut(off) + Exit: (11) cut(off) + Call: (11) [[n, f], [["b", "c"], ["a", "b"], [v, d]]]=[_G5574, _G5577] + Exit: (11) [[n, f], [["b", "c"], ["a", "b"], [v, d]]]=[[n, f], [["b", "c"], ["a", "b"], [v, d]]] + Call: (11) [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]]=[[[n, f], _G5586, ":-", _G5595]|_G5581] + Exit: (11) [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]]=[[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]] + Call: (11) length([["b", "c"], ["a", "b"], [v, d]], _G5606) + Unify: (11) length([["b", "c"], ["a", "b"], [v, d]], _G5606) + Exit: (11) length([["b", "c"], ["a", "b"], [v, d]], 3) + Call: (11) length([[v, c], [v, b], [v, d]], 3) + Unify: (11) length([[v, c], [v, b], [v, d]], 3) + Unify: (11) length([[v, c], [v, b], [v, d]], 3) + Exit: (11) length([[v, c], [v, b], [v, d]], 3) + Call: (11) [n, f]=[n, grammar] + Fail: (11) [n, f]=[n, grammar] + Redo: (10) member1([[n, f], [["b", "c"], ["a", "b"], [v, d]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], _G14) + Call: (11) [n, f]=[n, grammar_part] + Fail: (11) [n, f]=[n, grammar_part] + Redo: (10) member1([[n, f], [["b", "c"], ["a", "b"], [v, d]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], _G14) + Call: (11) checkarguments([["b", "c"], ["a", "b"], [v, d]], [[v, c], [v, b], [v, d]], [], _G5608, [], _G5610) + Unify: (11) checkarguments([["b", "c"], ["a", "b"], [v, d]], [[v, c], [v, b], [v, d]], [], _G5608, [], _G5610) + Call: (12) [["b", "c"], ["a", "b"], [v, d]]=[_G5598|_G5599] + Exit: (12) [["b", "c"], ["a", "b"], [v, d]]=[["b", "c"], ["a", "b"], [v, d]] + Call: (12) expressionnotatom(["b", "c"]) + Unify: (12) expressionnotatom(["b", "c"]) + Call: (13) isvalstrempty(["b", "c"]) + Unify: (13) 
isvalstrempty(["b", "c"]) + Call: (14) var(["b", "c"]) + Fail: (14) var(["b", "c"]) + Redo: (13) isvalstrempty(["b", "c"]) + Unify: (13) isvalstrempty(["b", "c"]) + Call: (14) isval(["b", "c"]) + Unify: (14) isval(["b", "c"]) + Call: (15) number(["b", "c"]) + Fail: (15) number(["b", "c"]) + Fail: (14) isval(["b", "c"]) + Redo: (13) isvalstrempty(["b", "c"]) + Unify: (13) isvalstrempty(["b", "c"]) + Call: (14) string(["b", "c"]) + Fail: (14) string(["b", "c"]) + Fail: (13) isvalstrempty(["b", "c"]) + Redo: (12) expressionnotatom(["b", "c"]) + Unify: (12) expressionnotatom(["b", "c"]) +^ Call: (13) not(atom(["b", "c"])) +^ Unify: (13) not(user:atom(["b", "c"])) +^ Exit: (13) not(user:atom(["b", "c"])) + Call: (13) length(["b", "c"], _G5614) + Unify: (13) length(["b", "c"], _G5614) + Exit: (13) length(["b", "c"], 2) + Call: (13) 2>=1 + Exit: (13) 2>=1 + Call: (13) expressionnotatom2(["b", "c"]) + Unify: (13) expressionnotatom2(["b", "c"]) + Call: (14) isvalstrempty("b") + Unify: (14) isvalstrempty("b") + Call: (15) var("b") + Fail: (15) var("b") + Redo: (14) isvalstrempty("b") + Unify: (14) isvalstrempty("b") + Call: (15) isval("b") + Unify: (15) isval("b") + Call: (16) number("b") + Fail: (16) number("b") + Fail: (15) isval("b") + Redo: (14) isvalstrempty("b") + Unify: (14) isvalstrempty("b") + Call: (15) string("b") + Exit: (15) string("b") + Exit: (14) isvalstrempty("b") + Call: (14) expressionnotatom2(["c"]) + Unify: (14) expressionnotatom2(["c"]) + Call: (15) isvalstrempty("c") + Unify: (15) isvalstrempty("c") + Call: (16) var("c") + Fail: (16) var("c") + Redo: (15) isvalstrempty("c") + Unify: (15) isvalstrempty("c") + Call: (16) isval("c") + Unify: (16) isval("c") + Call: (17) number("c") + Fail: (17) number("c") + Fail: (16) isval("c") + Redo: (15) isvalstrempty("c") + Unify: (15) isvalstrempty("c") + Call: (16) string("c") + Exit: (16) string("c") + Exit: (15) isvalstrempty("c") + Call: (15) expressionnotatom2([]) + Unify: (15) expressionnotatom2([]) + Exit: 
(15) expressionnotatom2([]) + Exit: (14) expressionnotatom2(["c"]) + Exit: (13) expressionnotatom2(["b", "c"]) + Exit: (12) expressionnotatom(["b", "c"]) + Call: (12) [[v, c], [v, b], [v, d]]=[_G5606|_G5607] + Exit: (12) [[v, c], [v, b], [v, d]]=[[v, c], [v, b], [v, d]] +^ Call: (12) not(var([v, c])) +^ Unify: (12) not(user:var([v, c])) +^ Exit: (12) not(user:var([v, c])) + Call: (12) isvar([v, c]) + Unify: (12) isvar([v, c]) + Exit: (12) isvar([v, c]) + Call: (12) putvalue([v, c], ["b", "c"], [], _G5624) + Unify: (12) putvalue([v, c], ["b", "c"], [], _G5624) +^ Call: (13) not(isvar([v, c])) +^ Unify: (13) not(user:isvar([v, c])) + Call: (14) isvar([v, c]) + Unify: (14) isvar([v, c]) + Exit: (14) isvar([v, c]) +^ Fail: (13) not(user:isvar([v, c])) + Redo: (12) putvalue([v, c], ["b", "c"], [], _G5624) + Call: (13) isvar([v, c]) + Unify: (13) isvar([v, c]) + Exit: (13) isvar([v, c]) + Call: (13) isvalstrorundef(["b", "c"]) + Unify: (13) isvalstrorundef(["b", "c"]) + Call: (14) var(["b", "c"]) + Fail: (14) var(["b", "c"]) + Redo: (13) isvalstrorundef(["b", "c"]) + Unify: (13) isvalstrorundef(["b", "c"]) +^ Call: (14) not(var(["b", "c"])) +^ Unify: (14) not(user:var(["b", "c"])) +^ Exit: (14) not(user:var(["b", "c"])) + Call: (14) isval(["b", "c"]) + Unify: (14) isval(["b", "c"]) + Call: (15) number(["b", "c"]) + Fail: (15) number(["b", "c"]) + Fail: (14) isval(["b", "c"]) + Redo: (13) isvalstrorundef(["b", "c"]) + Unify: (13) isvalstrorundef(["b", "c"]) +^ Call: (14) not(var(["b", "c"])) +^ Unify: (14) not(user:var(["b", "c"])) +^ Exit: (14) not(user:var(["b", "c"])) + Call: (14) expression(["b", "c"]) + Unify: (14) expression(["b", "c"]) + Call: (15) isval(["b", "c"]) + Unify: (15) isval(["b", "c"]) + Call: (16) number(["b", "c"]) + Fail: (16) number(["b", "c"]) + Fail: (15) isval(["b", "c"]) + Redo: (14) expression(["b", "c"]) + Unify: (14) expression(["b", "c"]) + Call: (15) string(["b", "c"]) + Fail: (15) string(["b", "c"]) + Redo: (14) expression(["b", "c"]) + 
Unify: (14) expression(["b", "c"]) + Call: (15) atom(["b", "c"]) + Fail: (15) atom(["b", "c"]) + Redo: (14) expression(["b", "c"]) + Unify: (14) expression(["b", "c"]) +^ Call: (15) not(atom(["b", "c"])) +^ Unify: (15) not(user:atom(["b", "c"])) +^ Exit: (15) not(user:atom(["b", "c"])) + Call: (15) length(["b", "c"], _G5632) + Unify: (15) length(["b", "c"], _G5632) + Exit: (15) length(["b", "c"], 2) + Call: (15) 2>=1 + Exit: (15) 2>=1 + Call: (15) expression2(["b", "c"]) + Unify: (15) expression2(["b", "c"]) + Call: (16) expression3("b") + Unify: (16) expression3("b") + Call: (17) isval("b") + Unify: (17) isval("b") + Call: (18) number("b") + Fail: (18) number("b") + Fail: (17) isval("b") + Redo: (16) expression3("b") + Unify: (16) expression3("b") + Call: (17) string("b") + Exit: (17) string("b") + Exit: (16) expression3("b") + Call: (16) expression2(["c"]) + Unify: (16) expression2(["c"]) + Call: (17) expression3("c") + Unify: (17) expression3("c") + Call: (18) isval("c") + Unify: (18) isval("c") + Call: (19) number("c") + Fail: (19) number("c") + Fail: (18) isval("c") + Redo: (17) expression3("c") + Unify: (17) expression3("c") + Call: (18) string("c") + Exit: (18) string("c") + Exit: (17) expression3("c") + Call: (17) expression2([]) + Unify: (17) expression2([]) + Exit: (17) expression2([]) + Exit: (16) expression2(["c"]) + Exit: (15) expression2(["b", "c"]) + Exit: (14) expression(["b", "c"]) + Exit: (13) isvalstrorundef(["b", "c"]) + Call: (13) updatevar([v, c], ["b", "c"], [], _G5634) + Unify: (13) updatevar([v, c], ["b", "c"], [], _G5634) + Call: (14) lists:member([[v, c], empty], []) + Fail: (14) lists:member([[v, c], empty], []) + Redo: (13) updatevar([v, c], ["b", "c"], [], _G5634) +^ Call: (14) not(member([[v, c], _G5630], [])) +^ Unify: (14) not(user:member([[v, c], _G5630], [])) +^ Exit: (14) not(user:member([[v, c], _G5630], [])) + Call: (14) _G5630=empty + Exit: (14) empty=empty + Call: (14) true + Exit: (14) true + Call: (14) lists:append([], 
[[[v, c], ["b", "c"]]], _G5654) + Unify: (14) lists:append([], [[[v, c], ["b", "c"]]], [[[v, c], ["b", "c"]]]) + Exit: (14) lists:append([], [[[v, c], ["b", "c"]]], [[[v, c], ["b", "c"]]]) + Call: (14) true + Exit: (14) true + Call: (14) true + Exit: (14) true + Exit: (13) updatevar([v, c], ["b", "c"], [], [[[v, c], ["b", "c"]]]) + Exit: (12) putvalue([v, c], ["b", "c"], [], [[[v, c], ["b", "c"]]]) + Call: (12) checkarguments([["a", "b"], [v, d]], [[v, b], [v, d]], [[[v, c], ["b", "c"]]], _G5655, [], _G5657) + Unify: (12) checkarguments([["a", "b"], [v, d]], [[v, b], [v, d]], [[[v, c], ["b", "c"]]], _G5655, [], _G5657) + Call: (13) [["a", "b"], [v, d]]=[_G5645|_G5646] + Exit: (13) [["a", "b"], [v, d]]=[["a", "b"], [v, d]] + Call: (13) expressionnotatom(["a", "b"]) + Unify: (13) expressionnotatom(["a", "b"]) + Call: (14) isvalstrempty(["a", "b"]) + Unify: (14) isvalstrempty(["a", "b"]) + Call: (15) var(["a", "b"]) + Fail: (15) var(["a", "b"]) + Redo: (14) isvalstrempty(["a", "b"]) + Unify: (14) isvalstrempty(["a", "b"]) + Call: (15) isval(["a", "b"]) + Unify: (15) isval(["a", "b"]) + Call: (16) number(["a", "b"]) + Fail: (16) number(["a", "b"]) + Fail: (15) isval(["a", "b"]) + Redo: (14) isvalstrempty(["a", "b"]) + Unify: (14) isvalstrempty(["a", "b"]) + Call: (15) string(["a", "b"]) + Fail: (15) string(["a", "b"]) + Fail: (14) isvalstrempty(["a", "b"]) + Redo: (13) expressionnotatom(["a", "b"]) + Unify: (13) expressionnotatom(["a", "b"]) +^ Call: (14) not(atom(["a", "b"])) +^ Unify: (14) not(user:atom(["a", "b"])) +^ Exit: (14) not(user:atom(["a", "b"])) + Call: (14) length(["a", "b"], _G5661) + Unify: (14) length(["a", "b"], _G5661) + Exit: (14) length(["a", "b"], 2) + Call: (14) 2>=1 + Exit: (14) 2>=1 + Call: (14) expressionnotatom2(["a", "b"]) + Unify: (14) expressionnotatom2(["a", "b"]) + Call: (15) isvalstrempty("a") + Unify: (15) isvalstrempty("a") + Call: (16) var("a") + Fail: (16) var("a") + Redo: (15) isvalstrempty("a") + Unify: (15) isvalstrempty("a") + 
Call: (16) isval("a") + Unify: (16) isval("a") + Call: (17) number("a") + Fail: (17) number("a") + Fail: (16) isval("a") + Redo: (15) isvalstrempty("a") + Unify: (15) isvalstrempty("a") + Call: (16) string("a") + Exit: (16) string("a") + Exit: (15) isvalstrempty("a") + Call: (15) expressionnotatom2(["b"]) + Unify: (15) expressionnotatom2(["b"]) + Call: (16) isvalstrempty("b") + Unify: (16) isvalstrempty("b") + Call: (17) var("b") + Fail: (17) var("b") + Redo: (16) isvalstrempty("b") + Unify: (16) isvalstrempty("b") + Call: (17) isval("b") + Unify: (17) isval("b") + Call: (18) number("b") + Fail: (18) number("b") + Fail: (17) isval("b") + Redo: (16) isvalstrempty("b") + Unify: (16) isvalstrempty("b") + Call: (17) string("b") + Exit: (17) string("b") + Exit: (16) isvalstrempty("b") + Call: (16) expressionnotatom2([]) + Unify: (16) expressionnotatom2([]) + Exit: (16) expressionnotatom2([]) + Exit: (15) expressionnotatom2(["b"]) + Exit: (14) expressionnotatom2(["a", "b"]) + Exit: (13) expressionnotatom(["a", "b"]) + Call: (13) [[v, b], [v, d]]=[_G5653|_G5654] + Exit: (13) [[v, b], [v, d]]=[[v, b], [v, d]] +^ Call: (13) not(var([v, b])) +^ Unify: (13) not(user:var([v, b])) +^ Exit: (13) not(user:var([v, b])) + Call: (13) isvar([v, b]) + Unify: (13) isvar([v, b]) + Exit: (13) isvar([v, b]) + Call: (13) putvalue([v, b], ["a", "b"], [[[v, c], ["b", "c"]]], _G5671) + Unify: (13) putvalue([v, b], ["a", "b"], [[[v, c], ["b", "c"]]], _G5671) +^ Call: (14) not(isvar([v, b])) +^ Unify: (14) not(user:isvar([v, b])) + Call: (15) isvar([v, b]) + Unify: (15) isvar([v, b]) + Exit: (15) isvar([v, b]) +^ Fail: (14) not(user:isvar([v, b])) + Redo: (13) putvalue([v, b], ["a", "b"], [[[v, c], ["b", "c"]]], _G5671) + Call: (14) isvar([v, b]) + Unify: (14) isvar([v, b]) + Exit: (14) isvar([v, b]) + Call: (14) isvalstrorundef(["a", "b"]) + Unify: (14) isvalstrorundef(["a", "b"]) + Call: (15) var(["a", "b"]) + Fail: (15) var(["a", "b"]) + Redo: (14) isvalstrorundef(["a", "b"]) + Unify: (14) 
isvalstrorundef(["a", "b"]) +^ Call: (15) not(var(["a", "b"])) +^ Unify: (15) not(user:var(["a", "b"])) +^ Exit: (15) not(user:var(["a", "b"])) + Call: (15) isval(["a", "b"]) + Unify: (15) isval(["a", "b"]) + Call: (16) number(["a", "b"]) + Fail: (16) number(["a", "b"]) + Fail: (15) isval(["a", "b"]) + Redo: (14) isvalstrorundef(["a", "b"]) + Unify: (14) isvalstrorundef(["a", "b"]) +^ Call: (15) not(var(["a", "b"])) +^ Unify: (15) not(user:var(["a", "b"])) +^ Exit: (15) not(user:var(["a", "b"])) + Call: (15) expression(["a", "b"]) + Unify: (15) expression(["a", "b"]) + Call: (16) isval(["a", "b"]) + Unify: (16) isval(["a", "b"]) + Call: (17) number(["a", "b"]) + Fail: (17) number(["a", "b"]) + Fail: (16) isval(["a", "b"]) + Redo: (15) expression(["a", "b"]) + Unify: (15) expression(["a", "b"]) + Call: (16) string(["a", "b"]) + Fail: (16) string(["a", "b"]) + Redo: (15) expression(["a", "b"]) + Unify: (15) expression(["a", "b"]) + Call: (16) atom(["a", "b"]) + Fail: (16) atom(["a", "b"]) + Redo: (15) expression(["a", "b"]) + Unify: (15) expression(["a", "b"]) +^ Call: (16) not(atom(["a", "b"])) +^ Unify: (16) not(user:atom(["a", "b"])) +^ Exit: (16) not(user:atom(["a", "b"])) + Call: (16) length(["a", "b"], _G5679) + Unify: (16) length(["a", "b"], _G5679) + Exit: (16) length(["a", "b"], 2) + Call: (16) 2>=1 + Exit: (16) 2>=1 + Call: (16) expression2(["a", "b"]) + Unify: (16) expression2(["a", "b"]) + Call: (17) expression3("a") + Unify: (17) expression3("a") + Call: (18) isval("a") + Unify: (18) isval("a") + Call: (19) number("a") + Fail: (19) number("a") + Fail: (18) isval("a") + Redo: (17) expression3("a") + Unify: (17) expression3("a") + Call: (18) string("a") + Exit: (18) string("a") + Exit: (17) expression3("a") + Call: (17) expression2(["b"]) + Unify: (17) expression2(["b"]) + Call: (18) expression3("b") + Unify: (18) expression3("b") + Call: (19) isval("b") + Unify: (19) isval("b") + Call: (20) number("b") + Fail: (20) number("b") + Fail: (19) isval("b") + 
Redo: (18) expression3("b") + Unify: (18) expression3("b") + Call: (19) string("b") + Exit: (19) string("b") + Exit: (18) expression3("b") + Call: (18) expression2([]) + Unify: (18) expression2([]) + Exit: (18) expression2([]) + Exit: (17) expression2(["b"]) + Exit: (16) expression2(["a", "b"]) + Exit: (15) expression(["a", "b"]) + Exit: (14) isvalstrorundef(["a", "b"]) + Call: (14) updatevar([v, b], ["a", "b"], [[[v, c], ["b", "c"]]], _G5681) + Unify: (14) updatevar([v, b], ["a", "b"], [[[v, c], ["b", "c"]]], _G5681) + Call: (15) lists:member([[v, b], empty], [[[v, c], ["b", "c"]]]) + Unify: (15) lists:member([[v, b], empty], [[[v, c], ["b", "c"]]]) + Fail: (15) lists:member([[v, b], empty], [[[v, c], ["b", "c"]]]) + Redo: (14) updatevar([v, b], ["a", "b"], [[[v, c], ["b", "c"]]], _G5681) +^ Call: (15) not(member([[v, b], _G5677], [[[v, c], ["b", "c"]]])) +^ Unify: (15) not(user:member([[v, b], _G5677], [[[v, c], ["b", "c"]]])) +^ Exit: (15) not(user:member([[v, b], _G5677], [[[v, c], ["b", "c"]]])) + Call: (15) _G5677=empty + Exit: (15) empty=empty + Call: (15) true + Exit: (15) true + Call: (15) lists:append([[[v, c], ["b", "c"]]], [[[v, b], ["a", "b"]]], _G5701) + Unify: (15) lists:append([[[v, c], ["b", "c"]]], [[[v, b], ["a", "b"]]], [[[v, c], ["b", "c"]]|_G5693]) + Exit: (15) lists:append([[[v, c], ["b", "c"]]], [[[v, b], ["a", "b"]]], [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]]]) + Call: (15) true + Exit: (15) true + Call: (15) true + Exit: (15) true + Exit: (14) updatevar([v, b], ["a", "b"], [[[v, c], ["b", "c"]]], [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]]]) + Exit: (13) putvalue([v, b], ["a", "b"], [[[v, c], ["b", "c"]]], [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]]]) + Call: (13) checkarguments([[v, d]], [[v, d]], [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]]], _G5705, [], _G5707) + Unify: (13) checkarguments([[v, d]], [[v, d]], [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]]], _G5705, [], _G5707) + Call: (14) [[v, d]]=[_G5695|_G5696] + Exit: (14) [[v, 
d]]=[[v, d]] + Call: (14) expressionnotatom([v, d]) + Unify: (14) expressionnotatom([v, d]) + Call: (15) isvalstrempty([v, d]) + Unify: (15) isvalstrempty([v, d]) + Call: (16) var([v, d]) + Fail: (16) var([v, d]) + Redo: (15) isvalstrempty([v, d]) + Unify: (15) isvalstrempty([v, d]) + Call: (16) isval([v, d]) + Unify: (16) isval([v, d]) + Call: (17) number([v, d]) + Fail: (17) number([v, d]) + Fail: (16) isval([v, d]) + Redo: (15) isvalstrempty([v, d]) + Unify: (15) isvalstrempty([v, d]) + Call: (16) string([v, d]) + Fail: (16) string([v, d]) + Fail: (15) isvalstrempty([v, d]) + Redo: (14) expressionnotatom([v, d]) + Unify: (14) expressionnotatom([v, d]) +^ Call: (15) not(atom([v, d])) +^ Unify: (15) not(user:atom([v, d])) +^ Exit: (15) not(user:atom([v, d])) + Call: (15) length([v, d], _G5711) + Unify: (15) length([v, d], _G5711) + Exit: (15) length([v, d], 2) + Call: (15) 2>=1 + Exit: (15) 2>=1 + Call: (15) expressionnotatom2([v, d]) + Unify: (15) expressionnotatom2([v, d]) + Call: (16) isvalstrempty(v) + Unify: (16) isvalstrempty(v) + Call: (17) var(v) + Fail: (17) var(v) + Redo: (16) isvalstrempty(v) + Unify: (16) isvalstrempty(v) + Call: (17) isval(v) + Unify: (17) isval(v) + Call: (18) number(v) + Fail: (18) number(v) + Fail: (17) isval(v) + Redo: (16) isvalstrempty(v) + Unify: (16) isvalstrempty(v) + Call: (17) string(v) + Fail: (17) string(v) + Fail: (16) isvalstrempty(v) + Fail: (15) expressionnotatom2([v, d]) + Redo: (14) expressionnotatom([v, d]) + Unify: (14) expressionnotatom([v, d]) + Call: (15) predicate_or_rule_name([v, d]) + Fail: (15) predicate_or_rule_name([v, d]) + Fail: (14) expressionnotatom([v, d]) + Redo: (13) checkarguments([[v, d]], [[v, d]], [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]]], _G5705, [], _G5707) + Unify: (13) checkarguments([[v, d]], [[v, d]], [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]]], _G5705, [], _G5707) + Call: (14) [[v, d]]=[_G5695|_G5696] + Exit: (14) [[v, d]]=[[v, d]] +^ Call: (14) not(var([v, d])) +^ Unify: (14) 
not(user:var([v, d])) +^ Exit: (14) not(user:var([v, d])) + Call: (14) isvar([v, d]) + Unify: (14) isvar([v, d]) + Exit: (14) isvar([v, d]) + Call: (14) [[v, d]]=[_G5703|_G5704] + Exit: (14) [[v, d]]=[[v, d]] + Call: (14) expressionnotatom([v, d]) + Unify: (14) expressionnotatom([v, d]) + Call: (15) isvalstrempty([v, d]) + Unify: (15) isvalstrempty([v, d]) + Call: (16) var([v, d]) + Fail: (16) var([v, d]) + Redo: (15) isvalstrempty([v, d]) + Unify: (15) isvalstrempty([v, d]) + Call: (16) isval([v, d]) + Unify: (16) isval([v, d]) + Call: (17) number([v, d]) + Fail: (17) number([v, d]) + Fail: (16) isval([v, d]) + Redo: (15) isvalstrempty([v, d]) + Unify: (15) isvalstrempty([v, d]) + Call: (16) string([v, d]) + Fail: (16) string([v, d]) + Fail: (15) isvalstrempty([v, d]) + Redo: (14) expressionnotatom([v, d]) + Unify: (14) expressionnotatom([v, d]) +^ Call: (15) not(atom([v, d])) +^ Unify: (15) not(user:atom([v, d])) +^ Exit: (15) not(user:atom([v, d])) + Call: (15) length([v, d], _G5719) + Unify: (15) length([v, d], _G5719) + Exit: (15) length([v, d], 2) + Call: (15) 2>=1 + Exit: (15) 2>=1 + Call: (15) expressionnotatom2([v, d]) + Unify: (15) expressionnotatom2([v, d]) + Call: (16) isvalstrempty(v) + Unify: (16) isvalstrempty(v) + Call: (17) var(v) + Fail: (17) var(v) + Redo: (16) isvalstrempty(v) + Unify: (16) isvalstrempty(v) + Call: (17) isval(v) + Unify: (17) isval(v) + Call: (18) number(v) + Fail: (18) number(v) + Fail: (17) isval(v) + Redo: (16) isvalstrempty(v) + Unify: (16) isvalstrempty(v) + Call: (17) string(v) + Fail: (17) string(v) + Fail: (16) isvalstrempty(v) + Fail: (15) expressionnotatom2([v, d]) + Redo: (14) expressionnotatom([v, d]) + Unify: (14) expressionnotatom([v, d]) + Call: (15) predicate_or_rule_name([v, d]) + Fail: (15) predicate_or_rule_name([v, d]) + Fail: (14) expressionnotatom([v, d]) + Redo: (13) checkarguments([[v, d]], [[v, d]], [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]]], _G5705, [], _G5707) + Unify: (13) checkarguments([[v, d]], 
[[v, d]], [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]]], _G5705, [], _G5707) + Call: (14) [[v, d]]=[_G5695|_G5696] + Exit: (14) [[v, d]]=[[v, d]] +^ Call: (14) not(var([v, d])) +^ Unify: (14) not(user:var([v, d])) +^ Exit: (14) not(user:var([v, d])) + Call: (14) isvar([v, d]) + Unify: (14) isvar([v, d]) + Exit: (14) isvar([v, d]) + Call: (14) [[v, d]]=[_G5703|_G5704] + Exit: (14) [[v, d]]=[[v, d]] +^ Call: (14) not(var([v, d])) +^ Unify: (14) not(user:var([v, d])) +^ Exit: (14) not(user:var([v, d])) + Call: (14) isvar([v, d]) + Unify: (14) isvar([v, d]) + Exit: (14) isvar([v, d]) + Call: (14) getvalue([v, d], _G5719, [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]]]) + Unify: (14) getvalue([v, d], _G5719, [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]]]) +^ Call: (15) not(isvar([v, d])) +^ Unify: (15) not(user:isvar([v, d])) + Call: (16) isvar([v, d]) + Unify: (16) isvar([v, d]) + Exit: (16) isvar([v, d]) +^ Fail: (15) not(user:isvar([v, d])) + Redo: (14) getvalue([v, d], _G5719, [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]]]) + Call: (15) isvar([v, d]) + Unify: (15) isvar([v, d]) + Exit: (15) isvar([v, d]) + Call: (15) isvalstrorundef(_G5718) + Unify: (15) isvalstrorundef(_G5718) + Call: (16) var(_G5718) + Exit: (16) var(_G5718) + Exit: (15) isvalstrorundef(_G5718) + Call: (15) getvar([v, d], _G5719, [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]]]) + Unify: (15) getvar([v, d], _G5719, [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]]]) + Call: (16) lists:member([[v, d], _G5714], [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]]]) + Unify: (16) lists:member([[v, d], _G5714], [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]]]) + Fail: (16) lists:member([[v, d], _G5714], [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]]]) + Redo: (15) getvar([v, d], _G5719, [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]]]) + Unify: (15) getvar([v, d], empty, [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]]]) +^ Call: (16) not(member([[v, d], _G5717], [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]]])) +^ Unify: (16) 
not(user:member([[v, d], _G5717], [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]]])) +^ Exit: (16) not(user:member([[v, d], _G5717], [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]]])) + Call: (16) true + Exit: (16) true + Exit: (15) getvar([v, d], empty, [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]]]) + Exit: (14) getvalue([v, d], empty, [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]]]) + Call: (14) true + Exit: (14) true + Call: (14) putvalue([v, d], empty, [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]]], _G5733) + Unify: (14) putvalue([v, d], empty, [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]]], _G5733) +^ Call: (15) not(isvar([v, d])) +^ Unify: (15) not(user:isvar([v, d])) + Call: (16) isvar([v, d]) + Unify: (16) isvar([v, d]) + Exit: (16) isvar([v, d]) +^ Fail: (15) not(user:isvar([v, d])) + Redo: (14) putvalue([v, d], empty, [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]]], _G5733) + Call: (15) isvar([v, d]) + Unify: (15) isvar([v, d]) + Exit: (15) isvar([v, d]) + Call: (15) isvalstrorundef(empty) + Unify: (15) isvalstrorundef(empty) + Call: (16) var(empty) + Fail: (16) var(empty) + Redo: (15) isvalstrorundef(empty) + Unify: (15) isvalstrorundef(empty) +^ Call: (16) not(var(empty)) +^ Unify: (16) not(user:var(empty)) +^ Exit: (16) not(user:var(empty)) + Call: (16) isval(empty) + Unify: (16) isval(empty) + Call: (17) number(empty) + Fail: (17) number(empty) + Fail: (16) isval(empty) + Redo: (15) isvalstrorundef(empty) + Unify: (15) isvalstrorundef(empty) +^ Call: (16) not(var(empty)) +^ Unify: (16) not(user:var(empty)) +^ Exit: (16) not(user:var(empty)) + Call: (16) expression(empty) + Unify: (16) expression(empty) + Exit: (16) expression(empty) + Exit: (15) isvalstrorundef(empty) + Call: (15) updatevar([v, d], empty, [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]]], _G5738) + Unify: (15) updatevar([v, d], empty, [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]]], _G5738) + Call: (16) lists:member([[v, d], empty], [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]]]) + Unify: (16) lists:member([[v, 
d], empty], [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]]]) + Fail: (16) lists:member([[v, d], empty], [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]]]) + Redo: (15) updatevar([v, d], empty, [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]]], _G5738) +^ Call: (16) not(member([[v, d], _G5734], [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]]])) +^ Unify: (16) not(user:member([[v, d], _G5734], [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]]])) +^ Exit: (16) not(user:member([[v, d], _G5734], [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]]])) + Call: (16) _G5734=empty + Exit: (16) empty=empty + Call: (16) true + Exit: (16) true + Call: (16) lists:append([[[v, c], ["b", "c"]], [[v, b], ["a", "b"]]], [[[v, d], empty]], _G5758) + Unify: (16) lists:append([[[v, c], ["b", "c"]], [[v, b], ["a", "b"]]], [[[v, d], empty]], [[[v, c], ["b", "c"]]|_G5750]) + Exit: (16) lists:append([[[v, c], ["b", "c"]], [[v, b], ["a", "b"]]], [[[v, d], empty]], [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]], [[v, d], empty]]) + Call: (16) true + Exit: (16) true + Call: (16) true + Exit: (16) true + Exit: (15) updatevar([v, d], empty, [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]]], [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]], [[v, d], empty]]) + Exit: (14) putvalue([v, d], empty, [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]]], [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]], [[v, d], empty]]) + Call: (14) lists:append([], [[[v, d], [v, d]]], _G5773) + Unify: (14) lists:append([], [[[v, d], [v, d]]], [[[v, d], [v, d]]]) + Exit: (14) lists:append([], [[[v, d], [v, d]]], [[[v, d], [v, d]]]) + Call: (14) checkarguments([], [], [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]], [[v, d], empty]], _G5774, [[[v, d], [v, d]]], _G5776) + Unify: (14) checkarguments([], [], [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]], [[v, d], empty]], [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]], [[v, d], empty]], [[[v, d], [v, d]]], [[[v, d], [v, d]]]) + Exit: (14) checkarguments([], [], [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]], [[v, d], empty]], [[[v, c], ["b", 
"c"]], [[v, b], ["a", "b"]], [[v, d], empty]], [[[v, d], [v, d]]], [[[v, d], [v, d]]]) + Exit: (13) checkarguments([[v, d]], [[v, d]], [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]]], [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]], [[v, d], empty]], [], [[[v, d], [v, d]]]) + Exit: (12) checkarguments([["a", "b"], [v, d]], [[v, b], [v, d]], [[[v, c], ["b", "c"]]], [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]], [[v, d], empty]], [], [[[v, d], [v, d]]]) + Exit: (11) checkarguments([["b", "c"], ["a", "b"], [v, d]], [[v, c], [v, b], [v, d]], [], [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]], [[v, d], empty]], [], [[[v, d], [v, d]]]) + Call: (11) debug(on) + Fail: (11) debug(on) + Redo: (10) member1([[n, f], [["b", "c"], ["a", "b"], [v, d]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], _G14) + Call: (11) true + Exit: (11) true + Call: (11) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]], [[v, d], empty]], _G5774, [[[n, =], [[v, d], [v, c]]]], true) + Unify: (11) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]], [[v, d], empty]], _G5774, [[[n, =], [[v, d], [v, c]]]], true) + Call: (12) [[[n, =], [[v, d], [v, c]]]]=[_G5764|_G5765] + Exit: (12) [[[n, =], [[v, d], [v, c]]]]=[[[n, =], [[v, d], [v, c]]]] +^ Call: (12) not(predicate_or_rule_name([[n, =], [[v, d], [v, c]]])) +^ Unify: (12) not(user:predicate_or_rule_name([[n, =], [[v, d], [v, c]]])) + Call: (13) predicate_or_rule_name([[n, =], [[v, d], [v, c]]]) + Fail: (13) predicate_or_rule_name([[n, =], [[v, d], [v, c]]]) +^ Exit: (12) not(user:predicate_or_rule_name([[n, =], [[v, d], [v, c]]])) + Call: (12) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], 
":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]], [[v, d], empty]], _G5782, [[n, =], [[v, d], [v, c]]], _G5784) + Unify: (12) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]], [[v, d], empty]], _G5782, [[n, =], [[v, d], [v, c]]], _G5784) + Call: (13) [[n, =], [[v, d], [v, c]]]=[_G5772|_G5773] + Exit: (13) [[n, =], [[v, d], [v, c]]]=[[n, =], [[v, d], [v, c]]] +^ Call: (13) not(predicate_or_rule_name([n, =])) +^ Unify: (13) not(user:predicate_or_rule_name([n, =])) + Call: (14) predicate_or_rule_name([n, =]) + Unify: (14) predicate_or_rule_name([n, =]) + Exit: (14) predicate_or_rule_name([n, =]) +^ Fail: (13) not(user:predicate_or_rule_name([n, =])) + Redo: (12) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]], [[v, d], empty]], _G5782, [[n, =], [[v, d], [v, c]]], _G5784) + Unify: (12) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]], [[v, d], empty]], _G5782, [[n, =], [[v, d], [v, c]]], _G5784) + Call: (13) [[n, =], [[v, d], [v, c]]]=[[not, [_G5781]]|_G5773] + Fail: (13) [[n, =], [[v, d], [v, c]]]=[[not, [_G5781]]|_G5773] + Redo: (12) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]], [[v, d], empty]], _G5782, [[n, =], [[v, d], [v, c]]], _G5784) + Unify: (12) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], 
[[[v, c], ["b", "c"]], [[v, b], ["a", "b"]], [[v, d], empty]], _G5782, [[n, =], [[v, d], [v, c]]], _G5784) + Call: (13) [[n, =], [[v, d], [v, c]]]=[[_G5775], or, [_G5784]] + Fail: (13) [[n, =], [[v, d], [v, c]]]=[[_G5775], or, [_G5784]] + Redo: (12) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]], [[v, d], empty]], _G5782, [[n, =], [[v, d], [v, c]]], _G5784) + Unify: (12) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]], [[v, d], empty]], _G5782, [[n, =], [[v, d], [v, c]]], _G5784) + Call: (13) [[n, =], [[v, d], [v, c]]]=[_G5772|_G5773] + Exit: (13) [[n, =], [[v, d], [v, c]]]=[[n, =], [[v, d], [v, c]]] +^ Call: (13) not(predicate_or_rule_name([n, =])) +^ Unify: (13) not(user:predicate_or_rule_name([n, =])) + Call: (14) predicate_or_rule_name([n, =]) + Unify: (14) predicate_or_rule_name([n, =]) + Exit: (14) predicate_or_rule_name([n, =]) +^ Fail: (13) not(user:predicate_or_rule_name([n, =])) + Fail: (12) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]], [[v, d], empty]], _G5782, [[n, =], [[v, d], [v, c]]], _G5784) + Redo: (11) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]], [[v, d], empty]], _G5774, [[[n, =], [[v, d], [v, c]]]], true) + Unify: (11) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]], [[v, d], empty]], _G5774, [[[n, =], [[v, d], [v, c]]]], 
true) + Call: (12) [[[n, =], [[v, d], [v, c]]]]=[[not, [_G5773]]|_G5765] + Fail: (12) [[[n, =], [[v, d], [v, c]]]]=[[not, [_G5773]]|_G5765] + Redo: (11) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]], [[v, d], empty]], _G5774, [[[n, =], [[v, d], [v, c]]]], true) + Unify: (11) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]], [[v, d], empty]], _G5774, [[[n, =], [[v, d], [v, c]]]], true) + Call: (12) [[[n, =], [[v, d], [v, c]]]]=[[_G5767], or, [_G5776]] + Fail: (12) [[[n, =], [[v, d], [v, c]]]]=[[_G5767], or, [_G5776]] + Redo: (11) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]], [[v, d], empty]], _G5774, [[[n, =], [[v, d], [v, c]]]], true) + Unify: (11) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]], [[v, d], empty]], _G5774, [[[n, =], [[v, d], [v, c]]]], true) + Call: (12) [[[n, =], [[v, d], [v, c]]]]=[_G5764|_G5765] + Exit: (12) [[[n, =], [[v, d], [v, c]]]]=[[[n, =], [[v, d], [v, c]]]] +^ Call: (12) not(predicate_or_rule_name([[n, =], [[v, d], [v, c]]])) +^ Unify: (12) not(user:predicate_or_rule_name([[n, =], [[v, d], [v, c]]])) + Call: (13) predicate_or_rule_name([[n, =], [[v, d], [v, c]]]) + Fail: (13) predicate_or_rule_name([[n, =], [[v, d], [v, c]]]) +^ Exit: (12) not(user:predicate_or_rule_name([[n, =], [[v, d], [v, c]]])) + Call: (12) interpretstatement1([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], 
[...|...]]]]], [[n, =], [[v, d], [v, c]]], [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]], [[v, d], empty]], _G5783, _G5784, _G5785) + Unify: (12) interpretstatement1([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[n, =], [[v, d], [v, c]]], [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]], [[v, d], empty]], _G5783, true, nocut) + Call: (13) isop(=) + Unify: (13) isop(=) + Exit: (13) isop(=) + Call: (13) interpretpart(is, [v, d], [v, c], [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]], [[v, d], empty]], _G5783) + Unify: (13) interpretpart(is, [v, d], [v, c], [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]], [[v, d], empty]], _G5783) + Call: (14) getvalues([v, d], [v, c], _G5781, _G5782, [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]], [[v, d], empty]]) + Unify: (14) getvalues([v, d], [v, c], _G5781, _G5782, [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]], [[v, d], empty]]) + Call: (15) getvalue([v, d], _G5780, [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]], [[v, d], empty]]) + Unify: (15) getvalue([v, d], _G5780, [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]], [[v, d], empty]]) +^ Call: (16) not(isvar([v, d])) +^ Unify: (16) not(user:isvar([v, d])) + Call: (17) isvar([v, d]) + Unify: (17) isvar([v, d]) + Exit: (17) isvar([v, d]) +^ Fail: (16) not(user:isvar([v, d])) + Redo: (15) getvalue([v, d], _G5780, [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]], [[v, d], empty]]) + Call: (16) isvar([v, d]) + Unify: (16) isvar([v, d]) + Exit: (16) isvar([v, d]) + Call: (16) isvalstrorundef(_G5779) + Unify: (16) isvalstrorundef(_G5779) + Call: (17) var(_G5779) + Exit: (17) var(_G5779) + Exit: (16) isvalstrorundef(_G5779) + Call: (16) getvar([v, d], _G5780, [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]], [[v, d], empty]]) + Unify: (16) getvar([v, d], _G5780, [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]], [[v, d], empty]]) + Call: (17) lists:member([[v, d], _G5775], [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]], [[v, d], empty]]) + 
Unify: (17) lists:member([[v, d], _G5775], [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]], [[v, d], empty]]) + Exit: (17) lists:member([[v, d], empty], [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]], [[v, d], empty]]) +^ Call: (17) not(empty=empty) +^ Unify: (17) not(user: (empty=empty)) +^ Fail: (17) not(user: (empty=empty)) + Redo: (16) getvar([v, d], _G5780, [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]], [[v, d], empty]]) + Unify: (16) getvar([v, d], empty, [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]], [[v, d], empty]]) +^ Call: (17) not(member([[v, d], _G5778], [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]], [[v, d], empty]])) +^ Unify: (17) not(user:member([[v, d], _G5778], [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]], [[v, d], empty]])) +^ Fail: (17) not(user:member([[v, d], _G5778], [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]], [[v, d], empty]])) + Redo: (16) getvar([v, d], empty, [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]], [[v, d], empty]]) + Call: (17) lists:member([[v, d], empty], [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]], [[v, d], empty]]) + Unify: (17) lists:member([[v, d], empty], [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]], [[v, d], empty]]) + Exit: (17) lists:member([[v, d], empty], [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]], [[v, d], empty]]) + Exit: (16) getvar([v, d], empty, [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]], [[v, d], empty]]) + Exit: (15) getvalue([v, d], empty, [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]], [[v, d], empty]]) + Call: (15) getvalue([v, c], _G5786, [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]], [[v, d], empty]]) + Unify: (15) getvalue([v, c], _G5786, [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]], [[v, d], empty]]) +^ Call: (16) not(isvar([v, c])) +^ Unify: (16) not(user:isvar([v, c])) + Call: (17) isvar([v, c]) + Unify: (17) isvar([v, c]) + Exit: (17) isvar([v, c]) +^ Fail: (16) not(user:isvar([v, c])) + Redo: (15) getvalue([v, c], _G5786, [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]], [[v, d], empty]]) + Call: (16) isvar([v, c]) + Unify: 
(16) isvar([v, c]) + Exit: (16) isvar([v, c]) + Call: (16) isvalstrorundef(_G5785) + Unify: (16) isvalstrorundef(_G5785) + Call: (17) var(_G5785) + Exit: (17) var(_G5785) + Exit: (16) isvalstrorundef(_G5785) + Call: (16) getvar([v, c], _G5786, [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]], [[v, d], empty]]) + Unify: (16) getvar([v, c], _G5786, [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]], [[v, d], empty]]) + Call: (17) lists:member([[v, c], _G5781], [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]], [[v, d], empty]]) + Unify: (17) lists:member([[v, c], _G5781], [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]], [[v, d], empty]]) + Exit: (17) lists:member([[v, c], ["b", "c"]], [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]], [[v, d], empty]]) +^ Call: (17) not(["b", "c"]=empty) +^ Unify: (17) not(user: (["b", "c"]=empty)) +^ Exit: (17) not(user: (["b", "c"]=empty)) + Exit: (16) getvar([v, c], ["b", "c"], [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]], [[v, d], empty]]) + Exit: (15) getvalue([v, c], ["b", "c"], [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]], [[v, d], empty]]) + Exit: (14) getvalues([v, d], [v, c], empty, ["b", "c"], [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]], [[v, d], empty]]) + Call: (14) expression(empty) + Unify: (14) expression(empty) + Exit: (14) expression(empty) + Call: (14) expression(["b", "c"]) + Unify: (14) expression(["b", "c"]) + Call: (15) isval(["b", "c"]) + Unify: (15) isval(["b", "c"]) + Call: (16) number(["b", "c"]) + Fail: (16) number(["b", "c"]) + Fail: (15) isval(["b", "c"]) + Redo: (14) expression(["b", "c"]) + Unify: (14) expression(["b", "c"]) + Call: (15) string(["b", "c"]) + Fail: (15) string(["b", "c"]) + Redo: (14) expression(["b", "c"]) + Unify: (14) expression(["b", "c"]) + Call: (15) atom(["b", "c"]) + Fail: (15) atom(["b", "c"]) + Redo: (14) expression(["b", "c"]) + Unify: (14) expression(["b", "c"]) +^ Call: (15) not(atom(["b", "c"])) +^ Unify: (15) not(user:atom(["b", "c"])) +^ Exit: (15) not(user:atom(["b", "c"])) + Call: (15) length(["b", 
"c"], _G5803) + Unify: (15) length(["b", "c"], _G5803) + Exit: (15) length(["b", "c"], 2) + Call: (15) 2>=1 + Exit: (15) 2>=1 + Call: (15) expression2(["b", "c"]) + Unify: (15) expression2(["b", "c"]) + Call: (16) expression3("b") + Unify: (16) expression3("b") + Call: (17) isval("b") + Unify: (17) isval("b") + Call: (18) number("b") + Fail: (18) number("b") + Fail: (17) isval("b") + Redo: (16) expression3("b") + Unify: (16) expression3("b") + Call: (17) string("b") + Exit: (17) string("b") + Exit: (16) expression3("b") + Call: (16) expression2(["c"]) + Unify: (16) expression2(["c"]) + Call: (17) expression3("c") + Unify: (17) expression3("c") + Call: (18) isval("c") + Unify: (18) isval("c") + Call: (19) number("c") + Fail: (19) number("c") + Fail: (18) isval("c") + Redo: (17) expression3("c") + Unify: (17) expression3("c") + Call: (18) string("c") + Exit: (18) string("c") + Exit: (17) expression3("c") + Call: (17) expression2([]) + Unify: (17) expression2([]) + Exit: (17) expression2([]) + Exit: (16) expression2(["c"]) + Exit: (15) expression2(["b", "c"]) + Exit: (14) expression(["b", "c"]) + Call: (14) val1emptyorvalsequal(empty, ["b", "c"]) + Unify: (14) val1emptyorvalsequal(empty, ["b", "c"]) + Exit: (14) val1emptyorvalsequal(empty, ["b", "c"]) + Call: (14) putvalue([v, d], ["b", "c"], [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]], [[v, d], empty]], _G5805) + Unify: (14) putvalue([v, d], ["b", "c"], [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]], [[v, d], empty]], _G5805) +^ Call: (15) not(isvar([v, d])) +^ Unify: (15) not(user:isvar([v, d])) + Call: (16) isvar([v, d]) + Unify: (16) isvar([v, d]) + Exit: (16) isvar([v, d]) +^ Fail: (15) not(user:isvar([v, d])) + Redo: (14) putvalue([v, d], ["b", "c"], [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]], [[v, d], empty]], _G5805) + Call: (15) isvar([v, d]) + Unify: (15) isvar([v, d]) + Exit: (15) isvar([v, d]) + Call: (15) isvalstrorundef(["b", "c"]) + Unify: (15) isvalstrorundef(["b", "c"]) + Call: (16) var(["b", "c"]) + 
Fail: (16) var(["b", "c"]) + Redo: (15) isvalstrorundef(["b", "c"]) + Unify: (15) isvalstrorundef(["b", "c"]) +^ Call: (16) not(var(["b", "c"])) +^ Unify: (16) not(user:var(["b", "c"])) +^ Exit: (16) not(user:var(["b", "c"])) + Call: (16) isval(["b", "c"]) + Unify: (16) isval(["b", "c"]) + Call: (17) number(["b", "c"]) + Fail: (17) number(["b", "c"]) + Fail: (16) isval(["b", "c"]) + Redo: (15) isvalstrorundef(["b", "c"]) + Unify: (15) isvalstrorundef(["b", "c"]) +^ Call: (16) not(var(["b", "c"])) +^ Unify: (16) not(user:var(["b", "c"])) +^ Exit: (16) not(user:var(["b", "c"])) + Call: (16) expression(["b", "c"]) + Unify: (16) expression(["b", "c"]) + Call: (17) isval(["b", "c"]) + Unify: (17) isval(["b", "c"]) + Call: (18) number(["b", "c"]) + Fail: (18) number(["b", "c"]) + Fail: (17) isval(["b", "c"]) + Redo: (16) expression(["b", "c"]) + Unify: (16) expression(["b", "c"]) + Call: (17) string(["b", "c"]) + Fail: (17) string(["b", "c"]) + Redo: (16) expression(["b", "c"]) + Unify: (16) expression(["b", "c"]) + Call: (17) atom(["b", "c"]) + Fail: (17) atom(["b", "c"]) + Redo: (16) expression(["b", "c"]) + Unify: (16) expression(["b", "c"]) +^ Call: (17) not(atom(["b", "c"])) +^ Unify: (17) not(user:atom(["b", "c"])) +^ Exit: (17) not(user:atom(["b", "c"])) + Call: (17) length(["b", "c"], _G5813) + Unify: (17) length(["b", "c"], _G5813) + Exit: (17) length(["b", "c"], 2) + Call: (17) 2>=1 + Exit: (17) 2>=1 + Call: (17) expression2(["b", "c"]) + Unify: (17) expression2(["b", "c"]) + Call: (18) expression3("b") + Unify: (18) expression3("b") + Call: (19) isval("b") + Unify: (19) isval("b") + Call: (20) number("b") + Fail: (20) number("b") + Fail: (19) isval("b") + Redo: (18) expression3("b") + Unify: (18) expression3("b") + Call: (19) string("b") + Exit: (19) string("b") + Exit: (18) expression3("b") + Call: (18) expression2(["c"]) + Unify: (18) expression2(["c"]) + Call: (19) expression3("c") + Unify: (19) expression3("c") + Call: (20) isval("c") + Unify: (20) 
isval("c") + Call: (21) number("c") + Fail: (21) number("c") + Fail: (20) isval("c") + Redo: (19) expression3("c") + Unify: (19) expression3("c") + Call: (20) string("c") + Exit: (20) string("c") + Exit: (19) expression3("c") + Call: (19) expression2([]) + Unify: (19) expression2([]) + Exit: (19) expression2([]) + Exit: (18) expression2(["c"]) + Exit: (17) expression2(["b", "c"]) + Exit: (16) expression(["b", "c"]) + Exit: (15) isvalstrorundef(["b", "c"]) + Call: (15) updatevar([v, d], ["b", "c"], [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]], [[v, d], empty]], _G5815) + Unify: (15) updatevar([v, d], ["b", "c"], [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]], [[v, d], empty]], _G5815) + Call: (16) lists:member([[v, d], empty], [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]], [[v, d], empty]]) + Unify: (16) lists:member([[v, d], empty], [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]], [[v, d], empty]]) + Exit: (16) lists:member([[v, d], empty], [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]], [[v, d], empty]]) + Call: (16) lists:delete([[[v, c], ["b", "c"]], [[v, b], ["a", "b"]], [[v, d], empty]], [[v, d], empty], _G5826) + Unify: (16) lists:delete([[[v, c], ["b", "c"]], [[v, b], ["a", "b"]], [[v, d], empty]], [[v, d], empty], _G5826) + Exit: (16) lists:delete([[[v, c], ["b", "c"]], [[v, b], ["a", "b"]], [[v, d], empty]], [[v, d], empty], [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]]]) + Call: (16) lists:append([[[v, c], ["b", "c"]], [[v, b], ["a", "b"]]], [[[v, d], ["b", "c"]]], _G5841) + Unify: (16) lists:append([[[v, c], ["b", "c"]], [[v, b], ["a", "b"]]], [[[v, d], ["b", "c"]]], [[[v, c], ["b", "c"]]|_G5833]) + Exit: (16) lists:append([[[v, c], ["b", "c"]], [[v, b], ["a", "b"]]], [[[v, d], ["b", "c"]]], [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]], [[v, d], ["b", "c"]]]) + Call: (16) true + Exit: (16) true + Call: (16) true + Exit: (16) true + Call: (16) true + Exit: (16) true + Exit: (15) updatevar([v, d], ["b", "c"], [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]], [[v, d], empty]], 
[[[v, c], ["b", "c"]], [[v, b], ["a", "b"]], [[v, d], ["b", "c"]]]) + Exit: (14) putvalue([v, d], ["b", "c"], [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]], [[v, d], empty]], [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]], [[v, d], ["b", "c"]]]) + Call: (14) debug(on) + Fail: (14) debug(on) + Redo: (13) interpretpart(is, [v, d], [v, c], [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]], [[v, d], empty]], [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]], [[v, d], ["b", "c"]]]) + Call: (14) true + Exit: (14) true + Call: (14) debug(on) + Fail: (14) debug(on) + Redo: (13) interpretpart(is, [v, d], [v, c], [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]], [[v, d], empty]], [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]], [[v, d], ["b", "c"]]]) + Call: (14) true + Exit: (14) true + Exit: (13) interpretpart(is, [v, d], [v, c], [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]], [[v, d], empty]], [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]], [[v, d], ["b", "c"]]]) + Exit: (12) interpretstatement1([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[n, =], [[v, d], [v, c]]], [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]], [[v, d], empty]], [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]], [[v, d], ["b", "c"]]], true, nocut) +^ Call: (12) not(nocut=cut) +^ Unify: (12) not(user: (nocut=cut)) +^ Exit: (12) not(user: (nocut=cut)) + Call: (12) _G5851=[[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]] + Exit: (12) [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]]=[[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]] + Call: (12) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]], [[v, d], ["b", "c"]]], _G5854, [], _G5856) + Unify: (12) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, 
d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]], [[v, d], ["b", "c"]]], [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]], [[v, d], ["b", "c"]]], [], true) + Exit: (12) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]], [[v, d], ["b", "c"]]], [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]], [[v, d], ["b", "c"]]], [], true) + Call: (12) logicalconjunction(true, true, true) + Unify: (12) logicalconjunction(true, true, true) + Call: (13) true(true) + Unify: (13) true(true) + Exit: (13) true(true) + Call: (13) true(true) + Unify: (13) true(true) + Exit: (13) true(true) + Exit: (12) logicalconjunction(true, true, true) + Exit: (11) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]], [[v, d], empty]], [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]], [[v, d], ["b", "c"]]], [[[n, =], [[v, d], [v, c]]]], true) + Call: (11) updatevars([[[v, d], [v, d]]], [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]], [[v, d], ["b", "c"]]], [], _G5854) + Unify: (11) updatevars([[[v, d], [v, d]]], [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]], [[v, d], ["b", "c"]]], [], _G5854) + Call: (12) [[[v, d], [v, d]]]=[[_G5847, _G5850]|_G5845] + Exit: (12) [[[v, d], [v, d]]]=[[[v, d], [v, d]]] + Call: (12) expressionnotatom([v, d]) + Unify: (12) expressionnotatom([v, d]) + Call: (13) isvalstrempty([v, d]) + Unify: (13) isvalstrempty([v, d]) + Call: (14) var([v, d]) + Fail: (14) var([v, d]) + Redo: (13) isvalstrempty([v, d]) + Unify: (13) isvalstrempty([v, d]) + Call: (14) isval([v, d]) + Unify: (14) isval([v, d]) + Call: (15) number([v, d]) + Fail: (15) number([v, d]) + Fail: (14) isval([v, d]) + Redo: (13) isvalstrempty([v, d]) + Unify: (13) isvalstrempty([v, d]) + Call: (14) string([v, d]) + Fail: (14) 
string([v, d]) + Fail: (13) isvalstrempty([v, d]) + Redo: (12) expressionnotatom([v, d]) + Unify: (12) expressionnotatom([v, d]) +^ Call: (13) not(atom([v, d])) +^ Unify: (13) not(user:atom([v, d])) +^ Exit: (13) not(user:atom([v, d])) + Call: (13) length([v, d], _G5866) + Unify: (13) length([v, d], _G5866) + Exit: (13) length([v, d], 2) + Call: (13) 2>=1 + Exit: (13) 2>=1 + Call: (13) expressionnotatom2([v, d]) + Unify: (13) expressionnotatom2([v, d]) + Call: (14) isvalstrempty(v) + Unify: (14) isvalstrempty(v) + Call: (15) var(v) + Fail: (15) var(v) + Redo: (14) isvalstrempty(v) + Unify: (14) isvalstrempty(v) + Call: (15) isval(v) + Unify: (15) isval(v) + Call: (16) number(v) + Fail: (16) number(v) + Fail: (15) isval(v) + Redo: (14) isvalstrempty(v) + Unify: (14) isvalstrempty(v) + Call: (15) string(v) + Fail: (15) string(v) + Fail: (14) isvalstrempty(v) + Fail: (13) expressionnotatom2([v, d]) + Redo: (12) expressionnotatom([v, d]) + Unify: (12) expressionnotatom([v, d]) + Call: (13) predicate_or_rule_name([v, d]) + Fail: (13) predicate_or_rule_name([v, d]) + Fail: (12) expressionnotatom([v, d]) + Redo: (11) updatevars([[[v, d], [v, d]]], [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]], [[v, d], ["b", "c"]]], [], _G5863) + Call: (12) lists:member([[v, d], _G5856], [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]], [[v, d], ["b", "c"]]]) + Unify: (12) lists:member([[v, d], _G5856], [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]], [[v, d], ["b", "c"]]]) + Exit: (12) lists:member([[v, d], ["b", "c"]], [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]], [[v, d], ["b", "c"]]]) + Call: (12) lists:append([], [[[v, d], ["b", "c"]]], _G5877) + Unify: (12) lists:append([], [[[v, d], ["b", "c"]]], [[[v, d], ["b", "c"]]]) + Exit: (12) lists:append([], [[[v, d], ["b", "c"]]], [[[v, d], ["b", "c"]]]) + Call: (12) updatevars([], [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]], [[v, d], ["b", "c"]]], [[[v, d], ["b", "c"]]], _G5878) + Unify: (12) updatevars([], [[[v, c], ["b", "c"]], [[v, b], ["a", 
"b"]], [[v, d], ["b", "c"]]], [[[v, d], ["b", "c"]]], [[[v, d], ["b", "c"]]]) + Exit: (12) updatevars([], [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]], [[v, d], ["b", "c"]]], [[[v, d], ["b", "c"]]], [[[v, d], ["b", "c"]]]) + Exit: (11) updatevars([[[v, d], [v, d]]], [[[v, c], ["b", "c"]], [[v, b], ["a", "b"]], [[v, d], ["b", "c"]]], [], [[[v, d], ["b", "c"]]]) +^ Call: (11) not([[[v, d], ["b", "c"]]]=[]) +^ Unify: (11) not(user: ([[[v, d], ["b", "c"]]]=[])) +^ Exit: (11) not(user: ([[[v, d], ["b", "c"]]]=[])) + Call: (11) unique1([[[v, d], ["b", "c"]]], [], _G14) + Unify: (11) unique1([[[v, d], ["b", "c"]]], [], _G14) + Call: (12) lists:delete([], [[v, d], ["b", "c"]], _G5883) + Unify: (12) lists:delete([], [[v, d], ["b", "c"]], []) + Exit: (12) lists:delete([], [[v, d], ["b", "c"]], []) + Call: (12) lists:append([], [[[v, d], ["b", "c"]]], _G5886) + Unify: (12) lists:append([], [[[v, d], ["b", "c"]]], [[[v, d], ["b", "c"]]]) + Exit: (12) lists:append([], [[[v, d], ["b", "c"]]], [[[v, d], ["b", "c"]]]) + Call: (12) unique1([], [[[v, d], ["b", "c"]]], _G14) + Unify: (12) unique1([], [[[v, d], ["b", "c"]]], [[[v, d], ["b", "c"]]]) + Exit: (12) unique1([], [[[v, d], ["b", "c"]]], [[[v, d], ["b", "c"]]]) + Exit: (11) unique1([[[v, d], ["b", "c"]]], [], [[[v, d], ["b", "c"]]]) + Call: (11) findresult3([["b", "c"], ["a", "b"], [v, d]], [[[v, d], ["b", "c"]]], [], _G5887) + Unify: (11) findresult3([["b", "c"], ["a", "b"], [v, d]], [[[v, d], ["b", "c"]]], [], _G5887) + Call: (12) [["b", "c"], ["a", "b"], [v, d]]=[_G5877|_G5878] + Exit: (12) [["b", "c"], ["a", "b"], [v, d]]=[["b", "c"], ["a", "b"], [v, d]] + Call: (12) expressionnotatom(["b", "c"]) + Unify: (12) expressionnotatom(["b", "c"]) + Call: (13) isvalstrempty(["b", "c"]) + Unify: (13) isvalstrempty(["b", "c"]) + Call: (14) var(["b", "c"]) + Fail: (14) var(["b", "c"]) + Redo: (13) isvalstrempty(["b", "c"]) + Unify: (13) isvalstrempty(["b", "c"]) + Call: (14) isval(["b", "c"]) + Unify: (14) isval(["b", "c"]) + Call: 
(15) number(["b", "c"]) + Fail: (15) number(["b", "c"]) + Fail: (14) isval(["b", "c"]) + Redo: (13) isvalstrempty(["b", "c"]) + Unify: (13) isvalstrempty(["b", "c"]) + Call: (14) string(["b", "c"]) + Fail: (14) string(["b", "c"]) + Fail: (13) isvalstrempty(["b", "c"]) + Redo: (12) expressionnotatom(["b", "c"]) + Unify: (12) expressionnotatom(["b", "c"]) +^ Call: (13) not(atom(["b", "c"])) +^ Unify: (13) not(user:atom(["b", "c"])) +^ Exit: (13) not(user:atom(["b", "c"])) + Call: (13) length(["b", "c"], _G5893) + Unify: (13) length(["b", "c"], _G5893) + Exit: (13) length(["b", "c"], 2) + Call: (13) 2>=1 + Exit: (13) 2>=1 + Call: (13) expressionnotatom2(["b", "c"]) + Unify: (13) expressionnotatom2(["b", "c"]) + Call: (14) isvalstrempty("b") + Unify: (14) isvalstrempty("b") + Call: (15) var("b") + Fail: (15) var("b") + Redo: (14) isvalstrempty("b") + Unify: (14) isvalstrempty("b") + Call: (15) isval("b") + Unify: (15) isval("b") + Call: (16) number("b") + Fail: (16) number("b") + Fail: (15) isval("b") + Redo: (14) isvalstrempty("b") + Unify: (14) isvalstrempty("b") + Call: (15) string("b") + Exit: (15) string("b") + Exit: (14) isvalstrempty("b") + Call: (14) expressionnotatom2(["c"]) + Unify: (14) expressionnotatom2(["c"]) + Call: (15) isvalstrempty("c") + Unify: (15) isvalstrempty("c") + Call: (16) var("c") + Fail: (16) var("c") + Redo: (15) isvalstrempty("c") + Unify: (15) isvalstrempty("c") + Call: (16) isval("c") + Unify: (16) isval("c") + Call: (17) number("c") + Fail: (17) number("c") + Fail: (16) isval("c") + Redo: (15) isvalstrempty("c") + Unify: (15) isvalstrempty("c") + Call: (16) string("c") + Exit: (16) string("c") + Exit: (15) isvalstrempty("c") + Call: (15) expressionnotatom2([]) + Unify: (15) expressionnotatom2([]) + Exit: (15) expressionnotatom2([]) + Exit: (14) expressionnotatom2(["c"]) + Exit: (13) expressionnotatom2(["b", "c"]) + Exit: (12) expressionnotatom(["b", "c"]) + Call: (12) lists:append([], [["b", "c"]], _G5897) + Unify: (12) 
lists:append([], [["b", "c"]], [["b", "c"]]) + Exit: (12) lists:append([], [["b", "c"]], [["b", "c"]]) + Call: (12) findresult3([["a", "b"], [v, d]], [[[v, d], ["b", "c"]]], [["b", "c"]], _G5898) + Unify: (12) findresult3([["a", "b"], [v, d]], [[[v, d], ["b", "c"]]], [["b", "c"]], _G5898) + Call: (13) [["a", "b"], [v, d]]=[_G5888|_G5889] + Exit: (13) [["a", "b"], [v, d]]=[["a", "b"], [v, d]] + Call: (13) expressionnotatom(["a", "b"]) + Unify: (13) expressionnotatom(["a", "b"]) + Call: (14) isvalstrempty(["a", "b"]) + Unify: (14) isvalstrempty(["a", "b"]) + Call: (15) var(["a", "b"]) + Fail: (15) var(["a", "b"]) + Redo: (14) isvalstrempty(["a", "b"]) + Unify: (14) isvalstrempty(["a", "b"]) + Call: (15) isval(["a", "b"]) + Unify: (15) isval(["a", "b"]) + Call: (16) number(["a", "b"]) + Fail: (16) number(["a", "b"]) + Fail: (15) isval(["a", "b"]) + Redo: (14) isvalstrempty(["a", "b"]) + Unify: (14) isvalstrempty(["a", "b"]) + Call: (15) string(["a", "b"]) + Fail: (15) string(["a", "b"]) + Fail: (14) isvalstrempty(["a", "b"]) + Redo: (13) expressionnotatom(["a", "b"]) + Unify: (13) expressionnotatom(["a", "b"]) +^ Call: (14) not(atom(["a", "b"])) +^ Unify: (14) not(user:atom(["a", "b"])) +^ Exit: (14) not(user:atom(["a", "b"])) + Call: (14) length(["a", "b"], _G5904) + Unify: (14) length(["a", "b"], _G5904) + Exit: (14) length(["a", "b"], 2) + Call: (14) 2>=1 + Exit: (14) 2>=1 + Call: (14) expressionnotatom2(["a", "b"]) + Unify: (14) expressionnotatom2(["a", "b"]) + Call: (15) isvalstrempty("a") + Unify: (15) isvalstrempty("a") + Call: (16) var("a") + Fail: (16) var("a") + Redo: (15) isvalstrempty("a") + Unify: (15) isvalstrempty("a") + Call: (16) isval("a") + Unify: (16) isval("a") + Call: (17) number("a") + Fail: (17) number("a") + Fail: (16) isval("a") + Redo: (15) isvalstrempty("a") + Unify: (15) isvalstrempty("a") + Call: (16) string("a") + Exit: (16) string("a") + Exit: (15) isvalstrempty("a") + Call: (15) expressionnotatom2(["b"]) + Unify: (15) 
expressionnotatom2(["b"]) + Call: (16) isvalstrempty("b") + Unify: (16) isvalstrempty("b") + Call: (17) var("b") + Fail: (17) var("b") + Redo: (16) isvalstrempty("b") + Unify: (16) isvalstrempty("b") + Call: (17) isval("b") + Unify: (17) isval("b") + Call: (18) number("b") + Fail: (18) number("b") + Fail: (17) isval("b") + Redo: (16) isvalstrempty("b") + Unify: (16) isvalstrempty("b") + Call: (17) string("b") + Exit: (17) string("b") + Exit: (16) isvalstrempty("b") + Call: (16) expressionnotatom2([]) + Unify: (16) expressionnotatom2([]) + Exit: (16) expressionnotatom2([]) + Exit: (15) expressionnotatom2(["b"]) + Exit: (14) expressionnotatom2(["a", "b"]) + Exit: (13) expressionnotatom(["a", "b"]) + Call: (13) lists:append([["b", "c"]], [["a", "b"]], _G5908) + Unify: (13) lists:append([["b", "c"]], [["a", "b"]], [["b", "c"]|_G5900]) + Exit: (13) lists:append([["b", "c"]], [["a", "b"]], [["b", "c"], ["a", "b"]]) + Call: (13) findresult3([[v, d]], [[[v, d], ["b", "c"]]], [["b", "c"], ["a", "b"]], _G5912) + Unify: (13) findresult3([[v, d]], [[[v, d], ["b", "c"]]], [["b", "c"], ["a", "b"]], _G5912) + Call: (14) [[v, d]]=[_G5902|_G5903] + Exit: (14) [[v, d]]=[[v, d]] + Call: (14) expressionnotatom([v, d]) + Unify: (14) expressionnotatom([v, d]) + Call: (15) isvalstrempty([v, d]) + Unify: (15) isvalstrempty([v, d]) + Call: (16) var([v, d]) + Fail: (16) var([v, d]) + Redo: (15) isvalstrempty([v, d]) + Unify: (15) isvalstrempty([v, d]) + Call: (16) isval([v, d]) + Unify: (16) isval([v, d]) + Call: (17) number([v, d]) + Fail: (17) number([v, d]) + Fail: (16) isval([v, d]) + Redo: (15) isvalstrempty([v, d]) + Unify: (15) isvalstrempty([v, d]) + Call: (16) string([v, d]) + Fail: (16) string([v, d]) + Fail: (15) isvalstrempty([v, d]) + Redo: (14) expressionnotatom([v, d]) + Unify: (14) expressionnotatom([v, d]) +^ Call: (15) not(atom([v, d])) +^ Unify: (15) not(user:atom([v, d])) +^ Exit: (15) not(user:atom([v, d])) + Call: (15) length([v, d], _G5918) + Unify: (15) length([v, 
d], _G5918) + Exit: (15) length([v, d], 2) + Call: (15) 2>=1 + Exit: (15) 2>=1 + Call: (15) expressionnotatom2([v, d]) + Unify: (15) expressionnotatom2([v, d]) + Call: (16) isvalstrempty(v) + Unify: (16) isvalstrempty(v) + Call: (17) var(v) + Fail: (17) var(v) + Redo: (16) isvalstrempty(v) + Unify: (16) isvalstrempty(v) + Call: (17) isval(v) + Unify: (17) isval(v) + Call: (18) number(v) + Fail: (18) number(v) + Fail: (17) isval(v) + Redo: (16) isvalstrempty(v) + Unify: (16) isvalstrempty(v) + Call: (17) string(v) + Fail: (17) string(v) + Fail: (16) isvalstrempty(v) + Fail: (15) expressionnotatom2([v, d]) + Redo: (14) expressionnotatom([v, d]) + Unify: (14) expressionnotatom([v, d]) + Call: (15) predicate_or_rule_name([v, d]) + Fail: (15) predicate_or_rule_name([v, d]) + Fail: (14) expressionnotatom([v, d]) + Redo: (13) findresult3([[v, d]], [[[v, d], ["b", "c"]]], [["b", "c"], ["a", "b"]], _G5912) + Unify: (13) findresult3([[v, d]], [[[v, d], ["b", "c"]]], [["b", "c"], ["a", "b"]], _G5912) + Call: (14) [[v, d]]=[_G5902|_G5903] + Exit: (14) [[v, d]]=[[v, d]] + Call: (14) isvar([v, d]) + Unify: (14) isvar([v, d]) + Exit: (14) isvar([v, d]) + Call: (14) lists:member([[v, d], _G5908], [[[v, d], ["b", "c"]]]) + Unify: (14) lists:member([[v, d], _G5908], [[[v, d], ["b", "c"]]]) + Exit: (14) lists:member([[v, d], ["b", "c"]], [[[v, d], ["b", "c"]]]) + Call: (14) lists:append([["b", "c"], ["a", "b"]], [["b", "c"]], _G5923) + Unify: (14) lists:append([["b", "c"], ["a", "b"]], [["b", "c"]], [["b", "c"]|_G5915]) + Exit: (14) lists:append([["b", "c"], ["a", "b"]], [["b", "c"]], [["b", "c"], ["a", "b"], ["b", "c"]]) + Call: (14) findresult3([], [[[v, d], ["b", "c"]]], [["b", "c"], ["a", "b"], ["b", "c"]], _G5930) + Unify: (14) findresult3([], [[[v, d], ["b", "c"]]], [["b", "c"], ["a", "b"], ["b", "c"]], [["b", "c"], ["a", "b"], ["b", "c"]]) + Exit: (14) findresult3([], [[[v, d], ["b", "c"]]], [["b", "c"], ["a", "b"], ["b", "c"]], [["b", "c"], ["a", "b"], ["b", "c"]]) + Exit: 
(13) findresult3([[v, d]], [[[v, d], ["b", "c"]]], [["b", "c"], ["a", "b"]], [["b", "c"], ["a", "b"], ["b", "c"]]) + Exit: (12) findresult3([["a", "b"], [v, d]], [[[v, d], ["b", "c"]]], [["b", "c"]], [["b", "c"], ["a", "b"], ["b", "c"]]) + Exit: (11) findresult3([["b", "c"], ["a", "b"], [v, d]], [[[v, d], ["b", "c"]]], [], [["b", "c"], ["a", "b"], ["b", "c"]]) + Call: (11) debug(on) + Fail: (11) debug(on) + Redo: (10) member1([[n, f], [["b", "c"], ["a", "b"], [v, d]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, d], ["b", "c"]]]) + Call: (11) true + Exit: (11) true + Call: (11) true + Exit: (11) true + Exit: (10) member1([[n, f], [["b", "c"], ["a", "b"], [v, d]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, d], ["b", "c"]]]) + Exit: (9) interpret1(off, [[n, f], [["b", "c"], ["a", "b"], [v, d]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, d], ["b", "c"]]]) + Exit: (8) interpret(off, [[n, f], [["b", "c"], ["a", "b"], [v, d]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, d], ["b", "c"]]]) + + Call: (8) interpret(off, [[n, f], [["b", "a"], ["b", "a"], [v, d]]], [[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], _G14) + Unify: (8) interpret(off, [[n, f], [["b", "a"], ["b", "a"], [v, d]]], [[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], _G14) + Call: (9) convert_to_grammar_part1([[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [], _G5551) + Unify: (9) convert_to_grammar_part1([[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [], _G5551) + Call: (10) convert_to_grammar_part11([[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [], _G5551, [], _G5553) + Unify: (10) 
convert_to_grammar_part11([[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [], _G5551, [], _G5553) + Call: (11) [[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]]=[_G5542|_G5543] + Exit: (11) [[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]]=[[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]] + Call: (11) [[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n, =], [[...|...]|...]]]]=[[n, _G5551], _G5554, "->", _G5563] + Fail: (11) [[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n, =], [[...|...]|...]]]]=[[n, _G5551], _G5554, "->", _G5563] + Redo: (10) convert_to_grammar_part11([[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [], _G5554, [], _G5556) + Call: (11) [[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n, =], [[...|...]|...]]]]=[[n, _G5551], "->", _G5560] + Fail: (11) [[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n, =], [[...|...]|...]]]]=[[n, _G5551], "->", _G5560] + Redo: (10) convert_to_grammar_part11([[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [], _G5551, [], _G5553) + Unify: (10) convert_to_grammar_part11([[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [], _G5551, [], _G5553) + Call: (11) [[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]]=[_G5542|_G5543] + Exit: (11) [[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]]=[[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]] + Call: (11) [[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n, =], [[...|...]|...]]]]=[_G5545, _G5548, ":-", _G5557] + Exit: (11) [[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n, =], [[...|...]|...]]]]=[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n, =], [[...|...]|...]]]] + Call: (11) true + Exit: (11) true + Call: (11) true + Exit: (11) true + Call: (11) true + Exit: (11) true + Call: (11) true + Exit: (11) true + Call: (11) lists:append([], [[[n, f], [[v, b], [v, a], [v|...]], ":-", [[[...|...]|...]]]], _G5572) + Unify: (11) 
lists:append([], [[[n, f], [[v, b], [v, a], [v|...]], ":-", [[[...|...]|...]]]], [[[n, f], [[v, b], [v, a], [v|...]], ":-", [[[...|...]|...]]]]) + Exit: (11) lists:append([], [[[n, f], [[v, b], [v, a], [v|...]], ":-", [[[...|...]|...]]]], [[[n, f], [[v, b], [v, a], [v|...]], ":-", [[[...|...]|...]]]]) + Call: (11) convert_to_grammar_part11([], [[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], _G5572, [], _G5574) + Unify: (11) convert_to_grammar_part11([], [[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [], []) + Exit: (11) convert_to_grammar_part11([], [[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [], []) + Exit: (10) convert_to_grammar_part11([[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [], [[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [], []) + Call: (10) lists:append([[[n, f], [[v, b], [v, a], [v|...]], ":-", [[[...|...]|...]]]], [], _G5572) + Unify: (10) lists:append([[[n, f], [[v, b], [v, a], [v|...]], ":-", [[[...|...]|...]]]], [], [[[n, f], [[v, b], [v, a], [v|...]], ":-", [[[...|...]|...]]]|_G5564]) + Exit: (10) lists:append([[[n, f], [[v, b], [v, a], [v|...]], ":-", [[[...|...]|...]]]], [], [[[n, f], [[v, b], [v, a], [v|...]], ":-", [[[...|...]|...]]]]) + Exit: (9) convert_to_grammar_part1([[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [], [[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]]) + Call: (9) interpret1(off, [[n, f], [["b", "a"], ["b", "a"], [v, d]]], [[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], _G14) + Unify: (9) interpret1(off, [[n, f], [["b", "a"], ["b", "a"], [v, d]]], [[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, b], [v, a], [v, d]], ":-", 
[[[n|...], [...|...]]]]], _G14) +^ Call: (10) retractall(debug(_G5566)) +^ Exit: (10) retractall(debug(_G5566)) +^ Call: (10) assertz(debug(off)) +^ Exit: (10) assertz(debug(off)) +^ Call: (10) retractall(cut(_G5570)) +^ Exit: (10) retractall(cut(_G5570)) +^ Call: (10) assertz(cut(off)) +^ Exit: (10) assertz(cut(off)) + Call: (10) member1([[n, f], [["b", "a"], ["b", "a"], [v, d]]], [[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], _G14) + Unify: (10) member1([[n, f], [["b", "a"], ["b", "a"], [v, d]]], [[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], _G14) + Call: (11) cut(off) + Unify: (11) cut(off) + Exit: (11) cut(off) + Call: (11) [[n, f], [["b", "a"], ["b", "a"], [v, d]]]=[_G5574, _G5577] + Exit: (11) [[n, f], [["b", "a"], ["b", "a"], [v, d]]]=[[n, f], [["b", "a"], ["b", "a"], [v, d]]] + Call: (11) [[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]]=[[[n, f], _G5586, ":-", _G5595]|_G5581] + Exit: (11) [[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]]=[[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]] + Call: (11) length([["b", "a"], ["b", "a"], [v, d]], _G5606) + Unify: (11) length([["b", "a"], ["b", "a"], [v, d]], _G5606) + Exit: (11) length([["b", "a"], ["b", "a"], [v, d]], 3) + Call: (11) length([[v, b], [v, a], [v, d]], 3) + Unify: (11) length([[v, b], [v, a], [v, d]], 3) + Unify: (11) length([[v, b], [v, a], [v, d]], 3) + Exit: (11) length([[v, b], [v, a], [v, d]], 3) + Call: (11) [n, f]=[n, grammar] + Fail: (11) [n, f]=[n, grammar] + Redo: (10) member1([[n, f], [["b", "a"], ["b", "a"], [v, d]]], [[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], _G14) + Call: (11) [n, f]=[n, grammar_part] + Fail: (11) [n, f]=[n, grammar_part] + Redo: (10) member1([[n, f], 
[["b", "a"], ["b", "a"], [v, d]]], [[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], _G14) + Call: (11) checkarguments([["b", "a"], ["b", "a"], [v, d]], [[v, b], [v, a], [v, d]], [], _G5608, [], _G5610) + Unify: (11) checkarguments([["b", "a"], ["b", "a"], [v, d]], [[v, b], [v, a], [v, d]], [], _G5608, [], _G5610) + Call: (12) [["b", "a"], ["b", "a"], [v, d]]=[_G5598|_G5599] + Exit: (12) [["b", "a"], ["b", "a"], [v, d]]=[["b", "a"], ["b", "a"], [v, d]] + Call: (12) expressionnotatom(["b", "a"]) + Unify: (12) expressionnotatom(["b", "a"]) + Call: (13) isvalstrempty(["b", "a"]) + Unify: (13) isvalstrempty(["b", "a"]) + Call: (14) var(["b", "a"]) + Fail: (14) var(["b", "a"]) + Redo: (13) isvalstrempty(["b", "a"]) + Unify: (13) isvalstrempty(["b", "a"]) + Call: (14) isval(["b", "a"]) + Unify: (14) isval(["b", "a"]) + Call: (15) number(["b", "a"]) + Fail: (15) number(["b", "a"]) + Fail: (14) isval(["b", "a"]) + Redo: (13) isvalstrempty(["b", "a"]) + Unify: (13) isvalstrempty(["b", "a"]) + Call: (14) string(["b", "a"]) + Fail: (14) string(["b", "a"]) + Fail: (13) isvalstrempty(["b", "a"]) + Redo: (12) expressionnotatom(["b", "a"]) + Unify: (12) expressionnotatom(["b", "a"]) +^ Call: (13) not(atom(["b", "a"])) +^ Unify: (13) not(user:atom(["b", "a"])) +^ Exit: (13) not(user:atom(["b", "a"])) + Call: (13) length(["b", "a"], _G5614) + Unify: (13) length(["b", "a"], _G5614) + Exit: (13) length(["b", "a"], 2) + Call: (13) 2>=1 + Exit: (13) 2>=1 + Call: (13) expressionnotatom2(["b", "a"]) + Unify: (13) expressionnotatom2(["b", "a"]) + Call: (14) isvalstrempty("b") + Unify: (14) isvalstrempty("b") + Call: (15) var("b") + Fail: (15) var("b") + Redo: (14) isvalstrempty("b") + Unify: (14) isvalstrempty("b") + Call: (15) isval("b") + Unify: (15) isval("b") + Call: (16) number("b") + Fail: (16) number("b") + Fail: (15) isval("b") + Redo: (14) isvalstrempty("b") + Unify: (14) isvalstrempty("b") + 
Call: (15) string("b") + Exit: (15) string("b") + Exit: (14) isvalstrempty("b") + Call: (14) expressionnotatom2(["a"]) + Unify: (14) expressionnotatom2(["a"]) + Call: (15) isvalstrempty("a") + Unify: (15) isvalstrempty("a") + Call: (16) var("a") + Fail: (16) var("a") + Redo: (15) isvalstrempty("a") + Unify: (15) isvalstrempty("a") + Call: (16) isval("a") + Unify: (16) isval("a") + Call: (17) number("a") + Fail: (17) number("a") + Fail: (16) isval("a") + Redo: (15) isvalstrempty("a") + Unify: (15) isvalstrempty("a") + Call: (16) string("a") + Exit: (16) string("a") + Exit: (15) isvalstrempty("a") + Call: (15) expressionnotatom2([]) + Unify: (15) expressionnotatom2([]) + Exit: (15) expressionnotatom2([]) + Exit: (14) expressionnotatom2(["a"]) + Exit: (13) expressionnotatom2(["b", "a"]) + Exit: (12) expressionnotatom(["b", "a"]) + Call: (12) [[v, b], [v, a], [v, d]]=[_G5606|_G5607] + Exit: (12) [[v, b], [v, a], [v, d]]=[[v, b], [v, a], [v, d]] +^ Call: (12) not(var([v, b])) +^ Unify: (12) not(user:var([v, b])) +^ Exit: (12) not(user:var([v, b])) + Call: (12) isvar([v, b]) + Unify: (12) isvar([v, b]) + Exit: (12) isvar([v, b]) + Call: (12) putvalue([v, b], ["b", "a"], [], _G5624) + Unify: (12) putvalue([v, b], ["b", "a"], [], _G5624) +^ Call: (13) not(isvar([v, b])) +^ Unify: (13) not(user:isvar([v, b])) + Call: (14) isvar([v, b]) + Unify: (14) isvar([v, b]) + Exit: (14) isvar([v, b]) +^ Fail: (13) not(user:isvar([v, b])) + Redo: (12) putvalue([v, b], ["b", "a"], [], _G5624) + Call: (13) isvar([v, b]) + Unify: (13) isvar([v, b]) + Exit: (13) isvar([v, b]) + Call: (13) isvalstrorundef(["b", "a"]) + Unify: (13) isvalstrorundef(["b", "a"]) + Call: (14) var(["b", "a"]) + Fail: (14) var(["b", "a"]) + Redo: (13) isvalstrorundef(["b", "a"]) + Unify: (13) isvalstrorundef(["b", "a"]) +^ Call: (14) not(var(["b", "a"])) +^ Unify: (14) not(user:var(["b", "a"])) +^ Exit: (14) not(user:var(["b", "a"])) + Call: (14) isval(["b", "a"]) + Unify: (14) isval(["b", "a"]) + Call: (15) 
number(["b", "a"]) + Fail: (15) number(["b", "a"]) + Fail: (14) isval(["b", "a"]) + Redo: (13) isvalstrorundef(["b", "a"]) + Unify: (13) isvalstrorundef(["b", "a"]) +^ Call: (14) not(var(["b", "a"])) +^ Unify: (14) not(user:var(["b", "a"])) +^ Exit: (14) not(user:var(["b", "a"])) + Call: (14) expression(["b", "a"]) + Unify: (14) expression(["b", "a"]) + Call: (15) isval(["b", "a"]) + Unify: (15) isval(["b", "a"]) + Call: (16) number(["b", "a"]) + Fail: (16) number(["b", "a"]) + Fail: (15) isval(["b", "a"]) + Redo: (14) expression(["b", "a"]) + Unify: (14) expression(["b", "a"]) + Call: (15) string(["b", "a"]) + Fail: (15) string(["b", "a"]) + Redo: (14) expression(["b", "a"]) + Unify: (14) expression(["b", "a"]) + Call: (15) atom(["b", "a"]) + Fail: (15) atom(["b", "a"]) + Redo: (14) expression(["b", "a"]) + Unify: (14) expression(["b", "a"]) +^ Call: (15) not(atom(["b", "a"])) +^ Unify: (15) not(user:atom(["b", "a"])) +^ Exit: (15) not(user:atom(["b", "a"])) + Call: (15) length(["b", "a"], _G5632) + Unify: (15) length(["b", "a"], _G5632) + Exit: (15) length(["b", "a"], 2) + Call: (15) 2>=1 + Exit: (15) 2>=1 + Call: (15) expression2(["b", "a"]) + Unify: (15) expression2(["b", "a"]) + Call: (16) expression3("b") + Unify: (16) expression3("b") + Call: (17) isval("b") + Unify: (17) isval("b") + Call: (18) number("b") + Fail: (18) number("b") + Fail: (17) isval("b") + Redo: (16) expression3("b") + Unify: (16) expression3("b") + Call: (17) string("b") + Exit: (17) string("b") + Exit: (16) expression3("b") + Call: (16) expression2(["a"]) + Unify: (16) expression2(["a"]) + Call: (17) expression3("a") + Unify: (17) expression3("a") + Call: (18) isval("a") + Unify: (18) isval("a") + Call: (19) number("a") + Fail: (19) number("a") + Fail: (18) isval("a") + Redo: (17) expression3("a") + Unify: (17) expression3("a") + Call: (18) string("a") + Exit: (18) string("a") + Exit: (17) expression3("a") + Call: (17) expression2([]) + Unify: (17) expression2([]) + Exit: (17) 
expression2([]) + Exit: (16) expression2(["a"]) + Exit: (15) expression2(["b", "a"]) + Exit: (14) expression(["b", "a"]) + Exit: (13) isvalstrorundef(["b", "a"]) + Call: (13) updatevar([v, b], ["b", "a"], [], _G5634) + Unify: (13) updatevar([v, b], ["b", "a"], [], _G5634) + Call: (14) lists:member([[v, b], empty], []) + Fail: (14) lists:member([[v, b], empty], []) + Redo: (13) updatevar([v, b], ["b", "a"], [], _G5634) +^ Call: (14) not(member([[v, b], _G5630], [])) +^ Unify: (14) not(user:member([[v, b], _G5630], [])) +^ Exit: (14) not(user:member([[v, b], _G5630], [])) + Call: (14) _G5630=empty + Exit: (14) empty=empty + Call: (14) true + Exit: (14) true + Call: (14) lists:append([], [[[v, b], ["b", "a"]]], _G5654) + Unify: (14) lists:append([], [[[v, b], ["b", "a"]]], [[[v, b], ["b", "a"]]]) + Exit: (14) lists:append([], [[[v, b], ["b", "a"]]], [[[v, b], ["b", "a"]]]) + Call: (14) true + Exit: (14) true + Call: (14) true + Exit: (14) true + Exit: (13) updatevar([v, b], ["b", "a"], [], [[[v, b], ["b", "a"]]]) + Exit: (12) putvalue([v, b], ["b", "a"], [], [[[v, b], ["b", "a"]]]) + Call: (12) checkarguments([["b", "a"], [v, d]], [[v, a], [v, d]], [[[v, b], ["b", "a"]]], _G5655, [], _G5657) + Unify: (12) checkarguments([["b", "a"], [v, d]], [[v, a], [v, d]], [[[v, b], ["b", "a"]]], _G5655, [], _G5657) + Call: (13) [["b", "a"], [v, d]]=[_G5645|_G5646] + Exit: (13) [["b", "a"], [v, d]]=[["b", "a"], [v, d]] + Call: (13) expressionnotatom(["b", "a"]) + Unify: (13) expressionnotatom(["b", "a"]) + Call: (14) isvalstrempty(["b", "a"]) + Unify: (14) isvalstrempty(["b", "a"]) + Call: (15) var(["b", "a"]) + Fail: (15) var(["b", "a"]) + Redo: (14) isvalstrempty(["b", "a"]) + Unify: (14) isvalstrempty(["b", "a"]) + Call: (15) isval(["b", "a"]) + Unify: (15) isval(["b", "a"]) + Call: (16) number(["b", "a"]) + Fail: (16) number(["b", "a"]) + Fail: (15) isval(["b", "a"]) + Redo: (14) isvalstrempty(["b", "a"]) + Unify: (14) isvalstrempty(["b", "a"]) + Call: (15) string(["b", "a"]) + 
Fail: (15) string(["b", "a"]) + Fail: (14) isvalstrempty(["b", "a"]) + Redo: (13) expressionnotatom(["b", "a"]) + Unify: (13) expressionnotatom(["b", "a"]) +^ Call: (14) not(atom(["b", "a"])) +^ Unify: (14) not(user:atom(["b", "a"])) +^ Exit: (14) not(user:atom(["b", "a"])) + Call: (14) length(["b", "a"], _G5661) + Unify: (14) length(["b", "a"], _G5661) + Exit: (14) length(["b", "a"], 2) + Call: (14) 2>=1 + Exit: (14) 2>=1 + Call: (14) expressionnotatom2(["b", "a"]) + Unify: (14) expressionnotatom2(["b", "a"]) + Call: (15) isvalstrempty("b") + Unify: (15) isvalstrempty("b") + Call: (16) var("b") + Fail: (16) var("b") + Redo: (15) isvalstrempty("b") + Unify: (15) isvalstrempty("b") + Call: (16) isval("b") + Unify: (16) isval("b") + Call: (17) number("b") + Fail: (17) number("b") + Fail: (16) isval("b") + Redo: (15) isvalstrempty("b") + Unify: (15) isvalstrempty("b") + Call: (16) string("b") + Exit: (16) string("b") + Exit: (15) isvalstrempty("b") + Call: (15) expressionnotatom2(["a"]) + Unify: (15) expressionnotatom2(["a"]) + Call: (16) isvalstrempty("a") + Unify: (16) isvalstrempty("a") + Call: (17) var("a") + Fail: (17) var("a") + Redo: (16) isvalstrempty("a") + Unify: (16) isvalstrempty("a") + Call: (17) isval("a") + Unify: (17) isval("a") + Call: (18) number("a") + Fail: (18) number("a") + Fail: (17) isval("a") + Redo: (16) isvalstrempty("a") + Unify: (16) isvalstrempty("a") + Call: (17) string("a") + Exit: (17) string("a") + Exit: (16) isvalstrempty("a") + Call: (16) expressionnotatom2([]) + Unify: (16) expressionnotatom2([]) + Exit: (16) expressionnotatom2([]) + Exit: (15) expressionnotatom2(["a"]) + Exit: (14) expressionnotatom2(["b", "a"]) + Exit: (13) expressionnotatom(["b", "a"]) + Call: (13) [[v, a], [v, d]]=[_G5653|_G5654] + Exit: (13) [[v, a], [v, d]]=[[v, a], [v, d]] +^ Call: (13) not(var([v, a])) +^ Unify: (13) not(user:var([v, a])) +^ Exit: (13) not(user:var([v, a])) + Call: (13) isvar([v, a]) + Unify: (13) isvar([v, a]) + Exit: (13) isvar([v, a]) + 
Call: (13) putvalue([v, a], ["b", "a"], [[[v, b], ["b", "a"]]], _G5671) + Unify: (13) putvalue([v, a], ["b", "a"], [[[v, b], ["b", "a"]]], _G5671) +^ Call: (14) not(isvar([v, a])) +^ Unify: (14) not(user:isvar([v, a])) + Call: (15) isvar([v, a]) + Unify: (15) isvar([v, a]) + Exit: (15) isvar([v, a]) +^ Fail: (14) not(user:isvar([v, a])) + Redo: (13) putvalue([v, a], ["b", "a"], [[[v, b], ["b", "a"]]], _G5671) + Call: (14) isvar([v, a]) + Unify: (14) isvar([v, a]) + Exit: (14) isvar([v, a]) + Call: (14) isvalstrorundef(["b", "a"]) + Unify: (14) isvalstrorundef(["b", "a"]) + Call: (15) var(["b", "a"]) + Fail: (15) var(["b", "a"]) + Redo: (14) isvalstrorundef(["b", "a"]) + Unify: (14) isvalstrorundef(["b", "a"]) +^ Call: (15) not(var(["b", "a"])) +^ Unify: (15) not(user:var(["b", "a"])) +^ Exit: (15) not(user:var(["b", "a"])) + Call: (15) isval(["b", "a"]) + Unify: (15) isval(["b", "a"]) + Call: (16) number(["b", "a"]) + Fail: (16) number(["b", "a"]) + Fail: (15) isval(["b", "a"]) + Redo: (14) isvalstrorundef(["b", "a"]) + Unify: (14) isvalstrorundef(["b", "a"]) +^ Call: (15) not(var(["b", "a"])) +^ Unify: (15) not(user:var(["b", "a"])) +^ Exit: (15) not(user:var(["b", "a"])) + Call: (15) expression(["b", "a"]) + Unify: (15) expression(["b", "a"]) + Call: (16) isval(["b", "a"]) + Unify: (16) isval(["b", "a"]) + Call: (17) number(["b", "a"]) + Fail: (17) number(["b", "a"]) + Fail: (16) isval(["b", "a"]) + Redo: (15) expression(["b", "a"]) + Unify: (15) expression(["b", "a"]) + Call: (16) string(["b", "a"]) + Fail: (16) string(["b", "a"]) + Redo: (15) expression(["b", "a"]) + Unify: (15) expression(["b", "a"]) + Call: (16) atom(["b", "a"]) + Fail: (16) atom(["b", "a"]) + Redo: (15) expression(["b", "a"]) + Unify: (15) expression(["b", "a"]) +^ Call: (16) not(atom(["b", "a"])) +^ Unify: (16) not(user:atom(["b", "a"])) +^ Exit: (16) not(user:atom(["b", "a"])) + Call: (16) length(["b", "a"], _G5679) + Unify: (16) length(["b", "a"], _G5679) + Exit: (16) length(["b", "a"], 
2) + Call: (16) 2>=1 + Exit: (16) 2>=1 + Call: (16) expression2(["b", "a"]) + Unify: (16) expression2(["b", "a"]) + Call: (17) expression3("b") + Unify: (17) expression3("b") + Call: (18) isval("b") + Unify: (18) isval("b") + Call: (19) number("b") + Fail: (19) number("b") + Fail: (18) isval("b") + Redo: (17) expression3("b") + Unify: (17) expression3("b") + Call: (18) string("b") + Exit: (18) string("b") + Exit: (17) expression3("b") + Call: (17) expression2(["a"]) + Unify: (17) expression2(["a"]) + Call: (18) expression3("a") + Unify: (18) expression3("a") + Call: (19) isval("a") + Unify: (19) isval("a") + Call: (20) number("a") + Fail: (20) number("a") + Fail: (19) isval("a") + Redo: (18) expression3("a") + Unify: (18) expression3("a") + Call: (19) string("a") + Exit: (19) string("a") + Exit: (18) expression3("a") + Call: (18) expression2([]) + Unify: (18) expression2([]) + Exit: (18) expression2([]) + Exit: (17) expression2(["a"]) + Exit: (16) expression2(["b", "a"]) + Exit: (15) expression(["b", "a"]) + Exit: (14) isvalstrorundef(["b", "a"]) + Call: (14) updatevar([v, a], ["b", "a"], [[[v, b], ["b", "a"]]], _G5681) + Unify: (14) updatevar([v, a], ["b", "a"], [[[v, b], ["b", "a"]]], _G5681) + Call: (15) lists:member([[v, a], empty], [[[v, b], ["b", "a"]]]) + Unify: (15) lists:member([[v, a], empty], [[[v, b], ["b", "a"]]]) + Fail: (15) lists:member([[v, a], empty], [[[v, b], ["b", "a"]]]) + Redo: (14) updatevar([v, a], ["b", "a"], [[[v, b], ["b", "a"]]], _G5681) +^ Call: (15) not(member([[v, a], _G5677], [[[v, b], ["b", "a"]]])) +^ Unify: (15) not(user:member([[v, a], _G5677], [[[v, b], ["b", "a"]]])) +^ Exit: (15) not(user:member([[v, a], _G5677], [[[v, b], ["b", "a"]]])) + Call: (15) _G5677=empty + Exit: (15) empty=empty + Call: (15) true + Exit: (15) true + Call: (15) lists:append([[[v, b], ["b", "a"]]], [[[v, a], ["b", "a"]]], _G5701) + Unify: (15) lists:append([[[v, b], ["b", "a"]]], [[[v, a], ["b", "a"]]], [[[v, b], ["b", "a"]]|_G5693]) + Exit: (15) 
lists:append([[[v, b], ["b", "a"]]], [[[v, a], ["b", "a"]]], [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]]]) + Call: (15) true + Exit: (15) true + Call: (15) true + Exit: (15) true + Exit: (14) updatevar([v, a], ["b", "a"], [[[v, b], ["b", "a"]]], [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]]]) + Exit: (13) putvalue([v, a], ["b", "a"], [[[v, b], ["b", "a"]]], [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]]]) + Call: (13) checkarguments([[v, d]], [[v, d]], [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]]], _G5705, [], _G5707) + Unify: (13) checkarguments([[v, d]], [[v, d]], [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]]], _G5705, [], _G5707) + Call: (14) [[v, d]]=[_G5695|_G5696] + Exit: (14) [[v, d]]=[[v, d]] + Call: (14) expressionnotatom([v, d]) + Unify: (14) expressionnotatom([v, d]) + Call: (15) isvalstrempty([v, d]) + Unify: (15) isvalstrempty([v, d]) + Call: (16) var([v, d]) + Fail: (16) var([v, d]) + Redo: (15) isvalstrempty([v, d]) + Unify: (15) isvalstrempty([v, d]) + Call: (16) isval([v, d]) + Unify: (16) isval([v, d]) + Call: (17) number([v, d]) + Fail: (17) number([v, d]) + Fail: (16) isval([v, d]) + Redo: (15) isvalstrempty([v, d]) + Unify: (15) isvalstrempty([v, d]) + Call: (16) string([v, d]) + Fail: (16) string([v, d]) + Fail: (15) isvalstrempty([v, d]) + Redo: (14) expressionnotatom([v, d]) + Unify: (14) expressionnotatom([v, d]) +^ Call: (15) not(atom([v, d])) +^ Unify: (15) not(user:atom([v, d])) +^ Exit: (15) not(user:atom([v, d])) + Call: (15) length([v, d], _G5711) + Unify: (15) length([v, d], _G5711) + Exit: (15) length([v, d], 2) + Call: (15) 2>=1 + Exit: (15) 2>=1 + Call: (15) expressionnotatom2([v, d]) + Unify: (15) expressionnotatom2([v, d]) + Call: (16) isvalstrempty(v) + Unify: (16) isvalstrempty(v) + Call: (17) var(v) + Fail: (17) var(v) + Redo: (16) isvalstrempty(v) + Unify: (16) isvalstrempty(v) + Call: (17) isval(v) + Unify: (17) isval(v) + Call: (18) number(v) + Fail: (18) number(v) + Fail: (17) isval(v) + Redo: (16) isvalstrempty(v) + Unify: (16) 
isvalstrempty(v) + Call: (17) string(v) + Fail: (17) string(v) + Fail: (16) isvalstrempty(v) + Fail: (15) expressionnotatom2([v, d]) + Redo: (14) expressionnotatom([v, d]) + Unify: (14) expressionnotatom([v, d]) + Call: (15) predicate_or_rule_name([v, d]) + Fail: (15) predicate_or_rule_name([v, d]) + Fail: (14) expressionnotatom([v, d]) + Redo: (13) checkarguments([[v, d]], [[v, d]], [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]]], _G5705, [], _G5707) + Unify: (13) checkarguments([[v, d]], [[v, d]], [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]]], _G5705, [], _G5707) + Call: (14) [[v, d]]=[_G5695|_G5696] + Exit: (14) [[v, d]]=[[v, d]] +^ Call: (14) not(var([v, d])) +^ Unify: (14) not(user:var([v, d])) +^ Exit: (14) not(user:var([v, d])) + Call: (14) isvar([v, d]) + Unify: (14) isvar([v, d]) + Exit: (14) isvar([v, d]) + Call: (14) [[v, d]]=[_G5703|_G5704] + Exit: (14) [[v, d]]=[[v, d]] + Call: (14) expressionnotatom([v, d]) + Unify: (14) expressionnotatom([v, d]) + Call: (15) isvalstrempty([v, d]) + Unify: (15) isvalstrempty([v, d]) + Call: (16) var([v, d]) + Fail: (16) var([v, d]) + Redo: (15) isvalstrempty([v, d]) + Unify: (15) isvalstrempty([v, d]) + Call: (16) isval([v, d]) + Unify: (16) isval([v, d]) + Call: (17) number([v, d]) + Fail: (17) number([v, d]) + Fail: (16) isval([v, d]) + Redo: (15) isvalstrempty([v, d]) + Unify: (15) isvalstrempty([v, d]) + Call: (16) string([v, d]) + Fail: (16) string([v, d]) + Fail: (15) isvalstrempty([v, d]) + Redo: (14) expressionnotatom([v, d]) + Unify: (14) expressionnotatom([v, d]) +^ Call: (15) not(atom([v, d])) +^ Unify: (15) not(user:atom([v, d])) +^ Exit: (15) not(user:atom([v, d])) + Call: (15) length([v, d], _G5719) + Unify: (15) length([v, d], _G5719) + Exit: (15) length([v, d], 2) + Call: (15) 2>=1 + Exit: (15) 2>=1 + Call: (15) expressionnotatom2([v, d]) + Unify: (15) expressionnotatom2([v, d]) + Call: (16) isvalstrempty(v) + Unify: (16) isvalstrempty(v) + Call: (17) var(v) + Fail: (17) var(v) + Redo: (16) 
isvalstrempty(v) + Unify: (16) isvalstrempty(v) + Call: (17) isval(v) + Unify: (17) isval(v) + Call: (18) number(v) + Fail: (18) number(v) + Fail: (17) isval(v) + Redo: (16) isvalstrempty(v) + Unify: (16) isvalstrempty(v) + Call: (17) string(v) + Fail: (17) string(v) + Fail: (16) isvalstrempty(v) + Fail: (15) expressionnotatom2([v, d]) + Redo: (14) expressionnotatom([v, d]) + Unify: (14) expressionnotatom([v, d]) + Call: (15) predicate_or_rule_name([v, d]) + Fail: (15) predicate_or_rule_name([v, d]) + Fail: (14) expressionnotatom([v, d]) + Redo: (13) checkarguments([[v, d]], [[v, d]], [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]]], _G5705, [], _G5707) + Unify: (13) checkarguments([[v, d]], [[v, d]], [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]]], _G5705, [], _G5707) + Call: (14) [[v, d]]=[_G5695|_G5696] + Exit: (14) [[v, d]]=[[v, d]] +^ Call: (14) not(var([v, d])) +^ Unify: (14) not(user:var([v, d])) +^ Exit: (14) not(user:var([v, d])) + Call: (14) isvar([v, d]) + Unify: (14) isvar([v, d]) + Exit: (14) isvar([v, d]) + Call: (14) [[v, d]]=[_G5703|_G5704] + Exit: (14) [[v, d]]=[[v, d]] +^ Call: (14) not(var([v, d])) +^ Unify: (14) not(user:var([v, d])) +^ Exit: (14) not(user:var([v, d])) + Call: (14) isvar([v, d]) + Unify: (14) isvar([v, d]) + Exit: (14) isvar([v, d]) + Call: (14) getvalue([v, d], _G5719, [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]]]) + Unify: (14) getvalue([v, d], _G5719, [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]]]) +^ Call: (15) not(isvar([v, d])) +^ Unify: (15) not(user:isvar([v, d])) + Call: (16) isvar([v, d]) + Unify: (16) isvar([v, d]) + Exit: (16) isvar([v, d]) +^ Fail: (15) not(user:isvar([v, d])) + Redo: (14) getvalue([v, d], _G5719, [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]]]) + Call: (15) isvar([v, d]) + Unify: (15) isvar([v, d]) + Exit: (15) isvar([v, d]) + Call: (15) isvalstrorundef(_G5718) + Unify: (15) isvalstrorundef(_G5718) + Call: (16) var(_G5718) + Exit: (16) var(_G5718) + Exit: (15) isvalstrorundef(_G5718) + Call: (15) getvar([v, d], 
_G5719, [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]]]) + Unify: (15) getvar([v, d], _G5719, [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]]]) + Call: (16) lists:member([[v, d], _G5714], [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]]]) + Unify: (16) lists:member([[v, d], _G5714], [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]]]) + Fail: (16) lists:member([[v, d], _G5714], [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]]]) + Redo: (15) getvar([v, d], _G5719, [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]]]) + Unify: (15) getvar([v, d], empty, [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]]]) +^ Call: (16) not(member([[v, d], _G5717], [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]]])) +^ Unify: (16) not(user:member([[v, d], _G5717], [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]]])) +^ Exit: (16) not(user:member([[v, d], _G5717], [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]]])) + Call: (16) true + Exit: (16) true + Exit: (15) getvar([v, d], empty, [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]]]) + Exit: (14) getvalue([v, d], empty, [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]]]) + Call: (14) true + Exit: (14) true + Call: (14) putvalue([v, d], empty, [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]]], _G5733) + Unify: (14) putvalue([v, d], empty, [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]]], _G5733) +^ Call: (15) not(isvar([v, d])) +^ Unify: (15) not(user:isvar([v, d])) + Call: (16) isvar([v, d]) + Unify: (16) isvar([v, d]) + Exit: (16) isvar([v, d]) +^ Fail: (15) not(user:isvar([v, d])) + Redo: (14) putvalue([v, d], empty, [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]]], _G5733) + Call: (15) isvar([v, d]) + Unify: (15) isvar([v, d]) + Exit: (15) isvar([v, d]) + Call: (15) isvalstrorundef(empty) + Unify: (15) isvalstrorundef(empty) + Call: (16) var(empty) + Fail: (16) var(empty) + Redo: (15) isvalstrorundef(empty) + Unify: (15) isvalstrorundef(empty) +^ Call: (16) not(var(empty)) +^ Unify: (16) not(user:var(empty)) +^ Exit: (16) not(user:var(empty)) + Call: (16) isval(empty) + Unify: (16) isval(empty) + Call: (17) 
number(empty) + Fail: (17) number(empty) + Fail: (16) isval(empty) + Redo: (15) isvalstrorundef(empty) + Unify: (15) isvalstrorundef(empty) +^ Call: (16) not(var(empty)) +^ Unify: (16) not(user:var(empty)) +^ Exit: (16) not(user:var(empty)) + Call: (16) expression(empty) + Unify: (16) expression(empty) + Exit: (16) expression(empty) + Exit: (15) isvalstrorundef(empty) + Call: (15) updatevar([v, d], empty, [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]]], _G5738) + Unify: (15) updatevar([v, d], empty, [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]]], _G5738) + Call: (16) lists:member([[v, d], empty], [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]]]) + Unify: (16) lists:member([[v, d], empty], [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]]]) + Fail: (16) lists:member([[v, d], empty], [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]]]) + Redo: (15) updatevar([v, d], empty, [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]]], _G5738) +^ Call: (16) not(member([[v, d], _G5734], [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]]])) +^ Unify: (16) not(user:member([[v, d], _G5734], [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]]])) +^ Exit: (16) not(user:member([[v, d], _G5734], [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]]])) + Call: (16) _G5734=empty + Exit: (16) empty=empty + Call: (16) true + Exit: (16) true + Call: (16) lists:append([[[v, b], ["b", "a"]], [[v, a], ["b", "a"]]], [[[v, d], empty]], _G5758) + Unify: (16) lists:append([[[v, b], ["b", "a"]], [[v, a], ["b", "a"]]], [[[v, d], empty]], [[[v, b], ["b", "a"]]|_G5750]) + Exit: (16) lists:append([[[v, b], ["b", "a"]], [[v, a], ["b", "a"]]], [[[v, d], empty]], [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]], [[v, d], empty]]) + Call: (16) true + Exit: (16) true + Call: (16) true + Exit: (16) true + Exit: (15) updatevar([v, d], empty, [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]]], [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]], [[v, d], empty]]) + Exit: (14) putvalue([v, d], empty, [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]]], [[[v, b], ["b", "a"]], [[v, a], ["b", 
"a"]], [[v, d], empty]]) + Call: (14) lists:append([], [[[v, d], [v, d]]], _G5773) + Unify: (14) lists:append([], [[[v, d], [v, d]]], [[[v, d], [v, d]]]) + Exit: (14) lists:append([], [[[v, d], [v, d]]], [[[v, d], [v, d]]]) + Call: (14) checkarguments([], [], [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]], [[v, d], empty]], _G5774, [[[v, d], [v, d]]], _G5776) + Unify: (14) checkarguments([], [], [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]], [[v, d], empty]], [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]], [[v, d], empty]], [[[v, d], [v, d]]], [[[v, d], [v, d]]]) + Exit: (14) checkarguments([], [], [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]], [[v, d], empty]], [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]], [[v, d], empty]], [[[v, d], [v, d]]], [[[v, d], [v, d]]]) + Exit: (13) checkarguments([[v, d]], [[v, d]], [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]]], [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]], [[v, d], empty]], [], [[[v, d], [v, d]]]) + Exit: (12) checkarguments([["b", "a"], [v, d]], [[v, a], [v, d]], [[[v, b], ["b", "a"]]], [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]], [[v, d], empty]], [], [[[v, d], [v, d]]]) + Exit: (11) checkarguments([["b", "a"], ["b", "a"], [v, d]], [[v, b], [v, a], [v, d]], [], [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]], [[v, d], empty]], [], [[[v, d], [v, d]]]) + Call: (11) debug(on) + Fail: (11) debug(on) + Redo: (10) member1([[n, f], [["b", "a"], ["b", "a"], [v, d]]], [[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], _G14) + Call: (11) true + Exit: (11) true + Call: (11) interpretbody([[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]], [[v, d], empty]], _G5774, [[[n, =], [[v, d], [v, b]]]], true) + Unify: (11) interpretbody([[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, b], [v, a], [v, d]], ":-", 
[[[n|...], [...|...]]]]], [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]], [[v, d], empty]], _G5774, [[[n, =], [[v, d], [v, b]]]], true) + Call: (12) [[[n, =], [[v, d], [v, b]]]]=[_G5764|_G5765] + Exit: (12) [[[n, =], [[v, d], [v, b]]]]=[[[n, =], [[v, d], [v, b]]]] +^ Call: (12) not(predicate_or_rule_name([[n, =], [[v, d], [v, b]]])) +^ Unify: (12) not(user:predicate_or_rule_name([[n, =], [[v, d], [v, b]]])) + Call: (13) predicate_or_rule_name([[n, =], [[v, d], [v, b]]]) + Fail: (13) predicate_or_rule_name([[n, =], [[v, d], [v, b]]]) +^ Exit: (12) not(user:predicate_or_rule_name([[n, =], [[v, d], [v, b]]])) + Call: (12) interpretbody([[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]], [[v, d], empty]], _G5782, [[n, =], [[v, d], [v, b]]], _G5784) + Unify: (12) interpretbody([[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]], [[v, d], empty]], _G5782, [[n, =], [[v, d], [v, b]]], _G5784) + Call: (13) [[n, =], [[v, d], [v, b]]]=[_G5772|_G5773] + Exit: (13) [[n, =], [[v, d], [v, b]]]=[[n, =], [[v, d], [v, b]]] +^ Call: (13) not(predicate_or_rule_name([n, =])) +^ Unify: (13) not(user:predicate_or_rule_name([n, =])) + Call: (14) predicate_or_rule_name([n, =]) + Unify: (14) predicate_or_rule_name([n, =]) + Exit: (14) predicate_or_rule_name([n, =]) +^ Fail: (13) not(user:predicate_or_rule_name([n, =])) + Redo: (12) interpretbody([[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]], [[v, d], empty]], _G5782, [[n, =], [[v, d], [v, b]]], _G5784) + Unify: (12) interpretbody([[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], 
[...|...]]]]], [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]], [[v, d], empty]], _G5782, [[n, =], [[v, d], [v, b]]], _G5784) + Call: (13) [[n, =], [[v, d], [v, b]]]=[[not, [_G5781]]|_G5773] + Fail: (13) [[n, =], [[v, d], [v, b]]]=[[not, [_G5781]]|_G5773] + Redo: (12) interpretbody([[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]], [[v, d], empty]], _G5782, [[n, =], [[v, d], [v, b]]], _G5784) + Unify: (12) interpretbody([[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]], [[v, d], empty]], _G5782, [[n, =], [[v, d], [v, b]]], _G5784) + Call: (13) [[n, =], [[v, d], [v, b]]]=[[_G5775], or, [_G5784]] + Fail: (13) [[n, =], [[v, d], [v, b]]]=[[_G5775], or, [_G5784]] + Redo: (12) interpretbody([[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]], [[v, d], empty]], _G5782, [[n, =], [[v, d], [v, b]]], _G5784) + Unify: (12) interpretbody([[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]], [[v, d], empty]], _G5782, [[n, =], [[v, d], [v, b]]], _G5784) + Call: (13) [[n, =], [[v, d], [v, b]]]=[_G5772|_G5773] + Exit: (13) [[n, =], [[v, d], [v, b]]]=[[n, =], [[v, d], [v, b]]] +^ Call: (13) not(predicate_or_rule_name([n, =])) +^ Unify: (13) not(user:predicate_or_rule_name([n, =])) + Call: (14) predicate_or_rule_name([n, =]) + Unify: (14) predicate_or_rule_name([n, =]) + Exit: (14) predicate_or_rule_name([n, =]) +^ Fail: (13) not(user:predicate_or_rule_name([n, =])) + Fail: (12) interpretbody([[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, b], [v, 
a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]], [[v, d], empty]], _G5782, [[n, =], [[v, d], [v, b]]], _G5784) + Redo: (11) interpretbody([[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]], [[v, d], empty]], _G5774, [[[n, =], [[v, d], [v, b]]]], true) + Unify: (11) interpretbody([[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]], [[v, d], empty]], _G5774, [[[n, =], [[v, d], [v, b]]]], true) + Call: (12) [[[n, =], [[v, d], [v, b]]]]=[[not, [_G5773]]|_G5765] + Fail: (12) [[[n, =], [[v, d], [v, b]]]]=[[not, [_G5773]]|_G5765] + Redo: (11) interpretbody([[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]], [[v, d], empty]], _G5774, [[[n, =], [[v, d], [v, b]]]], true) + Unify: (11) interpretbody([[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]], [[v, d], empty]], _G5774, [[[n, =], [[v, d], [v, b]]]], true) + Call: (12) [[[n, =], [[v, d], [v, b]]]]=[[_G5767], or, [_G5776]] + Fail: (12) [[[n, =], [[v, d], [v, b]]]]=[[_G5767], or, [_G5776]] + Redo: (11) interpretbody([[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]], [[v, d], empty]], _G5774, [[[n, =], [[v, d], [v, b]]]], true) + Unify: (11) interpretbody([[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]], [[v, d], empty]], 
_G5774, [[[n, =], [[v, d], [v, b]]]], true) + Call: (12) [[[n, =], [[v, d], [v, b]]]]=[_G5764|_G5765] + Exit: (12) [[[n, =], [[v, d], [v, b]]]]=[[[n, =], [[v, d], [v, b]]]] +^ Call: (12) not(predicate_or_rule_name([[n, =], [[v, d], [v, b]]])) +^ Unify: (12) not(user:predicate_or_rule_name([[n, =], [[v, d], [v, b]]])) + Call: (13) predicate_or_rule_name([[n, =], [[v, d], [v, b]]]) + Fail: (13) predicate_or_rule_name([[n, =], [[v, d], [v, b]]]) +^ Exit: (12) not(user:predicate_or_rule_name([[n, =], [[v, d], [v, b]]])) + Call: (12) interpretstatement1([[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[n, =], [[v, d], [v, b]]], [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]], [[v, d], empty]], _G5783, _G5784, _G5785) + Unify: (12) interpretstatement1([[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[n, =], [[v, d], [v, b]]], [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]], [[v, d], empty]], _G5783, true, nocut) + Call: (13) isop(=) + Unify: (13) isop(=) + Exit: (13) isop(=) + Call: (13) interpretpart(is, [v, d], [v, b], [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]], [[v, d], empty]], _G5783) + Unify: (13) interpretpart(is, [v, d], [v, b], [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]], [[v, d], empty]], _G5783) + Call: (14) getvalues([v, d], [v, b], _G5781, _G5782, [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]], [[v, d], empty]]) + Unify: (14) getvalues([v, d], [v, b], _G5781, _G5782, [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]], [[v, d], empty]]) + Call: (15) getvalue([v, d], _G5780, [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]], [[v, d], empty]]) + Unify: (15) getvalue([v, d], _G5780, [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]], [[v, d], empty]]) +^ Call: (16) not(isvar([v, d])) +^ Unify: (16) not(user:isvar([v, d])) + Call: (17) isvar([v, d]) + Unify: (17) isvar([v, d]) + Exit: (17) isvar([v, d]) +^ Fail: (16) 
not(user:isvar([v, d])) + Redo: (15) getvalue([v, d], _G5780, [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]], [[v, d], empty]]) + Call: (16) isvar([v, d]) + Unify: (16) isvar([v, d]) + Exit: (16) isvar([v, d]) + Call: (16) isvalstrorundef(_G5779) + Unify: (16) isvalstrorundef(_G5779) + Call: (17) var(_G5779) + Exit: (17) var(_G5779) + Exit: (16) isvalstrorundef(_G5779) + Call: (16) getvar([v, d], _G5780, [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]], [[v, d], empty]]) + Unify: (16) getvar([v, d], _G5780, [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]], [[v, d], empty]]) + Call: (17) lists:member([[v, d], _G5775], [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]], [[v, d], empty]]) + Unify: (17) lists:member([[v, d], _G5775], [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]], [[v, d], empty]]) + Exit: (17) lists:member([[v, d], empty], [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]], [[v, d], empty]]) +^ Call: (17) not(empty=empty) +^ Unify: (17) not(user: (empty=empty)) +^ Fail: (17) not(user: (empty=empty)) + Redo: (16) getvar([v, d], _G5780, [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]], [[v, d], empty]]) + Unify: (16) getvar([v, d], empty, [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]], [[v, d], empty]]) +^ Call: (17) not(member([[v, d], _G5778], [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]], [[v, d], empty]])) +^ Unify: (17) not(user:member([[v, d], _G5778], [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]], [[v, d], empty]])) +^ Fail: (17) not(user:member([[v, d], _G5778], [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]], [[v, d], empty]])) + Redo: (16) getvar([v, d], empty, [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]], [[v, d], empty]]) + Call: (17) lists:member([[v, d], empty], [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]], [[v, d], empty]]) + Unify: (17) lists:member([[v, d], empty], [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]], [[v, d], empty]]) + Exit: (17) lists:member([[v, d], empty], [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]], [[v, d], empty]]) + Exit: (16) getvar([v, d], empty, [[[v, b], ["b", 
"a"]], [[v, a], ["b", "a"]], [[v, d], empty]]) + Exit: (15) getvalue([v, d], empty, [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]], [[v, d], empty]]) + Call: (15) getvalue([v, b], _G5786, [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]], [[v, d], empty]]) + Unify: (15) getvalue([v, b], _G5786, [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]], [[v, d], empty]]) +^ Call: (16) not(isvar([v, b])) +^ Unify: (16) not(user:isvar([v, b])) + Call: (17) isvar([v, b]) + Unify: (17) isvar([v, b]) + Exit: (17) isvar([v, b]) +^ Fail: (16) not(user:isvar([v, b])) + Redo: (15) getvalue([v, b], _G5786, [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]], [[v, d], empty]]) + Call: (16) isvar([v, b]) + Unify: (16) isvar([v, b]) + Exit: (16) isvar([v, b]) + Call: (16) isvalstrorundef(_G5785) + Unify: (16) isvalstrorundef(_G5785) + Call: (17) var(_G5785) + Exit: (17) var(_G5785) + Exit: (16) isvalstrorundef(_G5785) + Call: (16) getvar([v, b], _G5786, [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]], [[v, d], empty]]) + Unify: (16) getvar([v, b], _G5786, [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]], [[v, d], empty]]) + Call: (17) lists:member([[v, b], _G5781], [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]], [[v, d], empty]]) + Unify: (17) lists:member([[v, b], _G5781], [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]], [[v, d], empty]]) + Exit: (17) lists:member([[v, b], ["b", "a"]], [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]], [[v, d], empty]]) +^ Call: (17) not(["b", "a"]=empty) +^ Unify: (17) not(user: (["b", "a"]=empty)) +^ Exit: (17) not(user: (["b", "a"]=empty)) + Exit: (16) getvar([v, b], ["b", "a"], [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]], [[v, d], empty]]) + Exit: (15) getvalue([v, b], ["b", "a"], [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]], [[v, d], empty]]) + Exit: (14) getvalues([v, d], [v, b], empty, ["b", "a"], [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]], [[v, d], empty]]) + Call: (14) expression(empty) + Unify: (14) expression(empty) + Exit: (14) expression(empty) + Call: (14) expression(["b", "a"]) + 
Unify: (14) expression(["b", "a"]) + Call: (15) isval(["b", "a"]) + Unify: (15) isval(["b", "a"]) + Call: (16) number(["b", "a"]) + Fail: (16) number(["b", "a"]) + Fail: (15) isval(["b", "a"]) + Redo: (14) expression(["b", "a"]) + Unify: (14) expression(["b", "a"]) + Call: (15) string(["b", "a"]) + Fail: (15) string(["b", "a"]) + Redo: (14) expression(["b", "a"]) + Unify: (14) expression(["b", "a"]) + Call: (15) atom(["b", "a"]) + Fail: (15) atom(["b", "a"]) + Redo: (14) expression(["b", "a"]) + Unify: (14) expression(["b", "a"]) +^ Call: (15) not(atom(["b", "a"])) +^ Unify: (15) not(user:atom(["b", "a"])) +^ Exit: (15) not(user:atom(["b", "a"])) + Call: (15) length(["b", "a"], _G5803) + Unify: (15) length(["b", "a"], _G5803) + Exit: (15) length(["b", "a"], 2) + Call: (15) 2>=1 + Exit: (15) 2>=1 + Call: (15) expression2(["b", "a"]) + Unify: (15) expression2(["b", "a"]) + Call: (16) expression3("b") + Unify: (16) expression3("b") + Call: (17) isval("b") + Unify: (17) isval("b") + Call: (18) number("b") + Fail: (18) number("b") + Fail: (17) isval("b") + Redo: (16) expression3("b") + Unify: (16) expression3("b") + Call: (17) string("b") + Exit: (17) string("b") + Exit: (16) expression3("b") + Call: (16) expression2(["a"]) + Unify: (16) expression2(["a"]) + Call: (17) expression3("a") + Unify: (17) expression3("a") + Call: (18) isval("a") + Unify: (18) isval("a") + Call: (19) number("a") + Fail: (19) number("a") + Fail: (18) isval("a") + Redo: (17) expression3("a") + Unify: (17) expression3("a") + Call: (18) string("a") + Exit: (18) string("a") + Exit: (17) expression3("a") + Call: (17) expression2([]) + Unify: (17) expression2([]) + Exit: (17) expression2([]) + Exit: (16) expression2(["a"]) + Exit: (15) expression2(["b", "a"]) + Exit: (14) expression(["b", "a"]) + Call: (14) val1emptyorvalsequal(empty, ["b", "a"]) + Unify: (14) val1emptyorvalsequal(empty, ["b", "a"]) + Exit: (14) val1emptyorvalsequal(empty, ["b", "a"]) + Call: (14) putvalue([v, d], ["b", "a"], [[[v, 
b], ["b", "a"]], [[v, a], ["b", "a"]], [[v, d], empty]], _G5805) + Unify: (14) putvalue([v, d], ["b", "a"], [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]], [[v, d], empty]], _G5805) +^ Call: (15) not(isvar([v, d])) +^ Unify: (15) not(user:isvar([v, d])) + Call: (16) isvar([v, d]) + Unify: (16) isvar([v, d]) + Exit: (16) isvar([v, d]) +^ Fail: (15) not(user:isvar([v, d])) + Redo: (14) putvalue([v, d], ["b", "a"], [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]], [[v, d], empty]], _G5805) + Call: (15) isvar([v, d]) + Unify: (15) isvar([v, d]) + Exit: (15) isvar([v, d]) + Call: (15) isvalstrorundef(["b", "a"]) + Unify: (15) isvalstrorundef(["b", "a"]) + Call: (16) var(["b", "a"]) + Fail: (16) var(["b", "a"]) + Redo: (15) isvalstrorundef(["b", "a"]) + Unify: (15) isvalstrorundef(["b", "a"]) +^ Call: (16) not(var(["b", "a"])) +^ Unify: (16) not(user:var(["b", "a"])) +^ Exit: (16) not(user:var(["b", "a"])) + Call: (16) isval(["b", "a"]) + Unify: (16) isval(["b", "a"]) + Call: (17) number(["b", "a"]) + Fail: (17) number(["b", "a"]) + Fail: (16) isval(["b", "a"]) + Redo: (15) isvalstrorundef(["b", "a"]) + Unify: (15) isvalstrorundef(["b", "a"]) +^ Call: (16) not(var(["b", "a"])) +^ Unify: (16) not(user:var(["b", "a"])) +^ Exit: (16) not(user:var(["b", "a"])) + Call: (16) expression(["b", "a"]) + Unify: (16) expression(["b", "a"]) + Call: (17) isval(["b", "a"]) + Unify: (17) isval(["b", "a"]) + Call: (18) number(["b", "a"]) + Fail: (18) number(["b", "a"]) + Fail: (17) isval(["b", "a"]) + Redo: (16) expression(["b", "a"]) + Unify: (16) expression(["b", "a"]) + Call: (17) string(["b", "a"]) + Fail: (17) string(["b", "a"]) + Redo: (16) expression(["b", "a"]) + Unify: (16) expression(["b", "a"]) + Call: (17) atom(["b", "a"]) + Fail: (17) atom(["b", "a"]) + Redo: (16) expression(["b", "a"]) + Unify: (16) expression(["b", "a"]) +^ Call: (17) not(atom(["b", "a"])) +^ Unify: (17) not(user:atom(["b", "a"])) +^ Exit: (17) not(user:atom(["b", "a"])) + Call: (17) length(["b", "a"], _G5813) + 
Unify: (17) length(["b", "a"], _G5813) + Exit: (17) length(["b", "a"], 2) + Call: (17) 2>=1 + Exit: (17) 2>=1 + Call: (17) expression2(["b", "a"]) + Unify: (17) expression2(["b", "a"]) + Call: (18) expression3("b") + Unify: (18) expression3("b") + Call: (19) isval("b") + Unify: (19) isval("b") + Call: (20) number("b") + Fail: (20) number("b") + Fail: (19) isval("b") + Redo: (18) expression3("b") + Unify: (18) expression3("b") + Call: (19) string("b") + Exit: (19) string("b") + Exit: (18) expression3("b") + Call: (18) expression2(["a"]) + Unify: (18) expression2(["a"]) + Call: (19) expression3("a") + Unify: (19) expression3("a") + Call: (20) isval("a") + Unify: (20) isval("a") + Call: (21) number("a") + Fail: (21) number("a") + Fail: (20) isval("a") + Redo: (19) expression3("a") + Unify: (19) expression3("a") + Call: (20) string("a") + Exit: (20) string("a") + Exit: (19) expression3("a") + Call: (19) expression2([]) + Unify: (19) expression2([]) + Exit: (19) expression2([]) + Exit: (18) expression2(["a"]) + Exit: (17) expression2(["b", "a"]) + Exit: (16) expression(["b", "a"]) + Exit: (15) isvalstrorundef(["b", "a"]) + Call: (15) updatevar([v, d], ["b", "a"], [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]], [[v, d], empty]], _G5815) + Unify: (15) updatevar([v, d], ["b", "a"], [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]], [[v, d], empty]], _G5815) + Call: (16) lists:member([[v, d], empty], [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]], [[v, d], empty]]) + Unify: (16) lists:member([[v, d], empty], [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]], [[v, d], empty]]) + Exit: (16) lists:member([[v, d], empty], [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]], [[v, d], empty]]) + Call: (16) lists:delete([[[v, b], ["b", "a"]], [[v, a], ["b", "a"]], [[v, d], empty]], [[v, d], empty], _G5826) + Unify: (16) lists:delete([[[v, b], ["b", "a"]], [[v, a], ["b", "a"]], [[v, d], empty]], [[v, d], empty], _G5826) + Exit: (16) lists:delete([[[v, b], ["b", "a"]], [[v, a], ["b", "a"]], [[v, d], empty]], 
[[v, d], empty], [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]]]) + Call: (16) lists:append([[[v, b], ["b", "a"]], [[v, a], ["b", "a"]]], [[[v, d], ["b", "a"]]], _G5841) + Unify: (16) lists:append([[[v, b], ["b", "a"]], [[v, a], ["b", "a"]]], [[[v, d], ["b", "a"]]], [[[v, b], ["b", "a"]]|_G5833]) + Exit: (16) lists:append([[[v, b], ["b", "a"]], [[v, a], ["b", "a"]]], [[[v, d], ["b", "a"]]], [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]], [[v, d], ["b", "a"]]]) + Call: (16) true + Exit: (16) true + Call: (16) true + Exit: (16) true + Call: (16) true + Exit: (16) true + Exit: (15) updatevar([v, d], ["b", "a"], [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]], [[v, d], empty]], [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]], [[v, d], ["b", "a"]]]) + Exit: (14) putvalue([v, d], ["b", "a"], [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]], [[v, d], empty]], [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]], [[v, d], ["b", "a"]]]) + Call: (14) debug(on) + Fail: (14) debug(on) + Redo: (13) interpretpart(is, [v, d], [v, b], [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]], [[v, d], empty]], [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]], [[v, d], ["b", "a"]]]) + Call: (14) true + Exit: (14) true + Call: (14) debug(on) + Fail: (14) debug(on) + Redo: (13) interpretpart(is, [v, d], [v, b], [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]], [[v, d], empty]], [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]], [[v, d], ["b", "a"]]]) + Call: (14) true + Exit: (14) true + Exit: (13) interpretpart(is, [v, d], [v, b], [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]], [[v, d], empty]], [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]], [[v, d], ["b", "a"]]]) + Exit: (12) interpretstatement1([[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[n, =], [[v, d], [v, b]]], [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]], [[v, d], empty]], [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]], [[v, d], ["b", "a"]]], true, nocut) +^ Call: (12) not(nocut=cut) +^ Unify: (12) 
not(user: (nocut=cut)) +^ Exit: (12) not(user: (nocut=cut)) + Call: (12) _G5851=[[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]] + Exit: (12) [[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]]=[[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]] + Call: (12) interpretbody([[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]], [[v, d], ["b", "a"]]], _G5854, [], _G5856) + Unify: (12) interpretbody([[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]], [[v, d], ["b", "a"]]], [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]], [[v, d], ["b", "a"]]], [], true) + Exit: (12) interpretbody([[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]], [[v, d], ["b", "a"]]], [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]], [[v, d], ["b", "a"]]], [], true) + Call: (12) logicalconjunction(true, true, true) + Unify: (12) logicalconjunction(true, true, true) + Call: (13) true(true) + Unify: (13) true(true) + Exit: (13) true(true) + Call: (13) true(true) + Unify: (13) true(true) + Exit: (13) true(true) + Exit: (12) logicalconjunction(true, true, true) + Exit: (11) interpretbody([[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]], [[v, d], empty]], [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]], [[v, d], ["b", "a"]]], [[[n, =], [[v, d], [v, b]]]], true) + Call: (11) updatevars([[[v, d], [v, d]]], [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]], [[v, d], ["b", "a"]]], [], _G5854) + Unify: (11) updatevars([[[v, d], [v, d]]], [[[v, b], ["b", "a"]], [[v, a], ["b", 
"a"]], [[v, d], ["b", "a"]]], [], _G5854) + Call: (12) [[[v, d], [v, d]]]=[[_G5847, _G5850]|_G5845] + Exit: (12) [[[v, d], [v, d]]]=[[[v, d], [v, d]]] + Call: (12) expressionnotatom([v, d]) + Unify: (12) expressionnotatom([v, d]) + Call: (13) isvalstrempty([v, d]) + Unify: (13) isvalstrempty([v, d]) + Call: (14) var([v, d]) + Fail: (14) var([v, d]) + Redo: (13) isvalstrempty([v, d]) + Unify: (13) isvalstrempty([v, d]) + Call: (14) isval([v, d]) + Unify: (14) isval([v, d]) + Call: (15) number([v, d]) + Fail: (15) number([v, d]) + Fail: (14) isval([v, d]) + Redo: (13) isvalstrempty([v, d]) + Unify: (13) isvalstrempty([v, d]) + Call: (14) string([v, d]) + Fail: (14) string([v, d]) + Fail: (13) isvalstrempty([v, d]) + Redo: (12) expressionnotatom([v, d]) + Unify: (12) expressionnotatom([v, d]) +^ Call: (13) not(atom([v, d])) +^ Unify: (13) not(user:atom([v, d])) +^ Exit: (13) not(user:atom([v, d])) + Call: (13) length([v, d], _G5866) + Unify: (13) length([v, d], _G5866) + Exit: (13) length([v, d], 2) + Call: (13) 2>=1 + Exit: (13) 2>=1 + Call: (13) expressionnotatom2([v, d]) + Unify: (13) expressionnotatom2([v, d]) + Call: (14) isvalstrempty(v) + Unify: (14) isvalstrempty(v) + Call: (15) var(v) + Fail: (15) var(v) + Redo: (14) isvalstrempty(v) + Unify: (14) isvalstrempty(v) + Call: (15) isval(v) + Unify: (15) isval(v) + Call: (16) number(v) + Fail: (16) number(v) + Fail: (15) isval(v) + Redo: (14) isvalstrempty(v) + Unify: (14) isvalstrempty(v) + Call: (15) string(v) + Fail: (15) string(v) + Fail: (14) isvalstrempty(v) + Fail: (13) expressionnotatom2([v, d]) + Redo: (12) expressionnotatom([v, d]) + Unify: (12) expressionnotatom([v, d]) + Call: (13) predicate_or_rule_name([v, d]) + Fail: (13) predicate_or_rule_name([v, d]) + Fail: (12) expressionnotatom([v, d]) + Redo: (11) updatevars([[[v, d], [v, d]]], [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]], [[v, d], ["b", "a"]]], [], _G5863) + Call: (12) lists:member([[v, d], _G5856], [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]], 
[[v, d], ["b", "a"]]]) + Unify: (12) lists:member([[v, d], _G5856], [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]], [[v, d], ["b", "a"]]]) + Exit: (12) lists:member([[v, d], ["b", "a"]], [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]], [[v, d], ["b", "a"]]]) + Call: (12) lists:append([], [[[v, d], ["b", "a"]]], _G5877) + Unify: (12) lists:append([], [[[v, d], ["b", "a"]]], [[[v, d], ["b", "a"]]]) + Exit: (12) lists:append([], [[[v, d], ["b", "a"]]], [[[v, d], ["b", "a"]]]) + Call: (12) updatevars([], [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]], [[v, d], ["b", "a"]]], [[[v, d], ["b", "a"]]], _G5878) + Unify: (12) updatevars([], [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]], [[v, d], ["b", "a"]]], [[[v, d], ["b", "a"]]], [[[v, d], ["b", "a"]]]) + Exit: (12) updatevars([], [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]], [[v, d], ["b", "a"]]], [[[v, d], ["b", "a"]]], [[[v, d], ["b", "a"]]]) + Exit: (11) updatevars([[[v, d], [v, d]]], [[[v, b], ["b", "a"]], [[v, a], ["b", "a"]], [[v, d], ["b", "a"]]], [], [[[v, d], ["b", "a"]]]) +^ Call: (11) not([[[v, d], ["b", "a"]]]=[]) +^ Unify: (11) not(user: ([[[v, d], ["b", "a"]]]=[])) +^ Exit: (11) not(user: ([[[v, d], ["b", "a"]]]=[])) + Call: (11) unique1([[[v, d], ["b", "a"]]], [], _G14) + Unify: (11) unique1([[[v, d], ["b", "a"]]], [], _G14) + Call: (12) lists:delete([], [[v, d], ["b", "a"]], _G5883) + Unify: (12) lists:delete([], [[v, d], ["b", "a"]], []) + Exit: (12) lists:delete([], [[v, d], ["b", "a"]], []) + Call: (12) lists:append([], [[[v, d], ["b", "a"]]], _G5886) + Unify: (12) lists:append([], [[[v, d], ["b", "a"]]], [[[v, d], ["b", "a"]]]) + Exit: (12) lists:append([], [[[v, d], ["b", "a"]]], [[[v, d], ["b", "a"]]]) + Call: (12) unique1([], [[[v, d], ["b", "a"]]], _G14) + Unify: (12) unique1([], [[[v, d], ["b", "a"]]], [[[v, d], ["b", "a"]]]) + Exit: (12) unique1([], [[[v, d], ["b", "a"]]], [[[v, d], ["b", "a"]]]) + Exit: (11) unique1([[[v, d], ["b", "a"]]], [], [[[v, d], ["b", "a"]]]) + Call: (11) findresult3([["b", "a"], 
["b", "a"], [v, d]], [[[v, d], ["b", "a"]]], [], _G5887) + Unify: (11) findresult3([["b", "a"], ["b", "a"], [v, d]], [[[v, d], ["b", "a"]]], [], _G5887) + Call: (12) [["b", "a"], ["b", "a"], [v, d]]=[_G5877|_G5878] + Exit: (12) [["b", "a"], ["b", "a"], [v, d]]=[["b", "a"], ["b", "a"], [v, d]] + Call: (12) expressionnotatom(["b", "a"]) + Unify: (12) expressionnotatom(["b", "a"]) + Call: (13) isvalstrempty(["b", "a"]) + Unify: (13) isvalstrempty(["b", "a"]) + Call: (14) var(["b", "a"]) + Fail: (14) var(["b", "a"]) + Redo: (13) isvalstrempty(["b", "a"]) + Unify: (13) isvalstrempty(["b", "a"]) + Call: (14) isval(["b", "a"]) + Unify: (14) isval(["b", "a"]) + Call: (15) number(["b", "a"]) + Fail: (15) number(["b", "a"]) + Fail: (14) isval(["b", "a"]) + Redo: (13) isvalstrempty(["b", "a"]) + Unify: (13) isvalstrempty(["b", "a"]) + Call: (14) string(["b", "a"]) + Fail: (14) string(["b", "a"]) + Fail: (13) isvalstrempty(["b", "a"]) + Redo: (12) expressionnotatom(["b", "a"]) + Unify: (12) expressionnotatom(["b", "a"]) +^ Call: (13) not(atom(["b", "a"])) +^ Unify: (13) not(user:atom(["b", "a"])) +^ Exit: (13) not(user:atom(["b", "a"])) + Call: (13) length(["b", "a"], _G5893) + Unify: (13) length(["b", "a"], _G5893) + Exit: (13) length(["b", "a"], 2) + Call: (13) 2>=1 + Exit: (13) 2>=1 + Call: (13) expressionnotatom2(["b", "a"]) + Unify: (13) expressionnotatom2(["b", "a"]) + Call: (14) isvalstrempty("b") + Unify: (14) isvalstrempty("b") + Call: (15) var("b") + Fail: (15) var("b") + Redo: (14) isvalstrempty("b") + Unify: (14) isvalstrempty("b") + Call: (15) isval("b") + Unify: (15) isval("b") + Call: (16) number("b") + Fail: (16) number("b") + Fail: (15) isval("b") + Redo: (14) isvalstrempty("b") + Unify: (14) isvalstrempty("b") + Call: (15) string("b") + Exit: (15) string("b") + Exit: (14) isvalstrempty("b") + Call: (14) expressionnotatom2(["a"]) + Unify: (14) expressionnotatom2(["a"]) + Call: (15) isvalstrempty("a") + Unify: (15) isvalstrempty("a") + Call: (16) var("a") + 
Fail: (16) var("a") + Redo: (15) isvalstrempty("a") + Unify: (15) isvalstrempty("a") + Call: (16) isval("a") + Unify: (16) isval("a") + Call: (17) number("a") + Fail: (17) number("a") + Fail: (16) isval("a") + Redo: (15) isvalstrempty("a") + Unify: (15) isvalstrempty("a") + Call: (16) string("a") + Exit: (16) string("a") + Exit: (15) isvalstrempty("a") + Call: (15) expressionnotatom2([]) + Unify: (15) expressionnotatom2([]) + Exit: (15) expressionnotatom2([]) + Exit: (14) expressionnotatom2(["a"]) + Exit: (13) expressionnotatom2(["b", "a"]) + Exit: (12) expressionnotatom(["b", "a"]) + Call: (12) lists:append([], [["b", "a"]], _G5897) + Unify: (12) lists:append([], [["b", "a"]], [["b", "a"]]) + Exit: (12) lists:append([], [["b", "a"]], [["b", "a"]]) + Call: (12) findresult3([["b", "a"], [v, d]], [[[v, d], ["b", "a"]]], [["b", "a"]], _G5898) + Unify: (12) findresult3([["b", "a"], [v, d]], [[[v, d], ["b", "a"]]], [["b", "a"]], _G5898) + Call: (13) [["b", "a"], [v, d]]=[_G5888|_G5889] + Exit: (13) [["b", "a"], [v, d]]=[["b", "a"], [v, d]] + Call: (13) expressionnotatom(["b", "a"]) + Unify: (13) expressionnotatom(["b", "a"]) + Call: (14) isvalstrempty(["b", "a"]) + Unify: (14) isvalstrempty(["b", "a"]) + Call: (15) var(["b", "a"]) + Fail: (15) var(["b", "a"]) + Redo: (14) isvalstrempty(["b", "a"]) + Unify: (14) isvalstrempty(["b", "a"]) + Call: (15) isval(["b", "a"]) + Unify: (15) isval(["b", "a"]) + Call: (16) number(["b", "a"]) + Fail: (16) number(["b", "a"]) + Fail: (15) isval(["b", "a"]) + Redo: (14) isvalstrempty(["b", "a"]) + Unify: (14) isvalstrempty(["b", "a"]) + Call: (15) string(["b", "a"]) + Fail: (15) string(["b", "a"]) + Fail: (14) isvalstrempty(["b", "a"]) + Redo: (13) expressionnotatom(["b", "a"]) + Unify: (13) expressionnotatom(["b", "a"]) +^ Call: (14) not(atom(["b", "a"])) +^ Unify: (14) not(user:atom(["b", "a"])) +^ Exit: (14) not(user:atom(["b", "a"])) + Call: (14) length(["b", "a"], _G5904) + Unify: (14) length(["b", "a"], _G5904) + Exit: (14) 
length(["b", "a"], 2) + Call: (14) 2>=1 + Exit: (14) 2>=1 + Call: (14) expressionnotatom2(["b", "a"]) + Unify: (14) expressionnotatom2(["b", "a"]) + Call: (15) isvalstrempty("b") + Unify: (15) isvalstrempty("b") + Call: (16) var("b") + Fail: (16) var("b") + Redo: (15) isvalstrempty("b") + Unify: (15) isvalstrempty("b") + Call: (16) isval("b") + Unify: (16) isval("b") + Call: (17) number("b") + Fail: (17) number("b") + Fail: (16) isval("b") + Redo: (15) isvalstrempty("b") + Unify: (15) isvalstrempty("b") + Call: (16) string("b") + Exit: (16) string("b") + Exit: (15) isvalstrempty("b") + Call: (15) expressionnotatom2(["a"]) + Unify: (15) expressionnotatom2(["a"]) + Call: (16) isvalstrempty("a") + Unify: (16) isvalstrempty("a") + Call: (17) var("a") + Fail: (17) var("a") + Redo: (16) isvalstrempty("a") + Unify: (16) isvalstrempty("a") + Call: (17) isval("a") + Unify: (17) isval("a") + Call: (18) number("a") + Fail: (18) number("a") + Fail: (17) isval("a") + Redo: (16) isvalstrempty("a") + Unify: (16) isvalstrempty("a") + Call: (17) string("a") + Exit: (17) string("a") + Exit: (16) isvalstrempty("a") + Call: (16) expressionnotatom2([]) + Unify: (16) expressionnotatom2([]) + Exit: (16) expressionnotatom2([]) + Exit: (15) expressionnotatom2(["a"]) + Exit: (14) expressionnotatom2(["b", "a"]) + Exit: (13) expressionnotatom(["b", "a"]) + Call: (13) lists:append([["b", "a"]], [["b", "a"]], _G5908) + Unify: (13) lists:append([["b", "a"]], [["b", "a"]], [["b", "a"]|_G5900]) + Exit: (13) lists:append([["b", "a"]], [["b", "a"]], [["b", "a"], ["b", "a"]]) + Call: (13) findresult3([[v, d]], [[[v, d], ["b", "a"]]], [["b", "a"], ["b", "a"]], _G5912) + Unify: (13) findresult3([[v, d]], [[[v, d], ["b", "a"]]], [["b", "a"], ["b", "a"]], _G5912) + Call: (14) [[v, d]]=[_G5902|_G5903] + Exit: (14) [[v, d]]=[[v, d]] + Call: (14) expressionnotatom([v, d]) + Unify: (14) expressionnotatom([v, d]) + Call: (15) isvalstrempty([v, d]) + Unify: (15) isvalstrempty([v, d]) + Call: (16) var([v, d]) + 
Fail: (16) var([v, d]) + Redo: (15) isvalstrempty([v, d]) + Unify: (15) isvalstrempty([v, d]) + Call: (16) isval([v, d]) + Unify: (16) isval([v, d]) + Call: (17) number([v, d]) + Fail: (17) number([v, d]) + Fail: (16) isval([v, d]) + Redo: (15) isvalstrempty([v, d]) + Unify: (15) isvalstrempty([v, d]) + Call: (16) string([v, d]) + Fail: (16) string([v, d]) + Fail: (15) isvalstrempty([v, d]) + Redo: (14) expressionnotatom([v, d]) + Unify: (14) expressionnotatom([v, d]) +^ Call: (15) not(atom([v, d])) +^ Unify: (15) not(user:atom([v, d])) +^ Exit: (15) not(user:atom([v, d])) + Call: (15) length([v, d], _G5918) + Unify: (15) length([v, d], _G5918) + Exit: (15) length([v, d], 2) + Call: (15) 2>=1 + Exit: (15) 2>=1 + Call: (15) expressionnotatom2([v, d]) + Unify: (15) expressionnotatom2([v, d]) + Call: (16) isvalstrempty(v) + Unify: (16) isvalstrempty(v) + Call: (17) var(v) + Fail: (17) var(v) + Redo: (16) isvalstrempty(v) + Unify: (16) isvalstrempty(v) + Call: (17) isval(v) + Unify: (17) isval(v) + Call: (18) number(v) + Fail: (18) number(v) + Fail: (17) isval(v) + Redo: (16) isvalstrempty(v) + Unify: (16) isvalstrempty(v) + Call: (17) string(v) + Fail: (17) string(v) + Fail: (16) isvalstrempty(v) + Fail: (15) expressionnotatom2([v, d]) + Redo: (14) expressionnotatom([v, d]) + Unify: (14) expressionnotatom([v, d]) + Call: (15) predicate_or_rule_name([v, d]) + Fail: (15) predicate_or_rule_name([v, d]) + Fail: (14) expressionnotatom([v, d]) + Redo: (13) findresult3([[v, d]], [[[v, d], ["b", "a"]]], [["b", "a"], ["b", "a"]], _G5912) + Unify: (13) findresult3([[v, d]], [[[v, d], ["b", "a"]]], [["b", "a"], ["b", "a"]], _G5912) + Call: (14) [[v, d]]=[_G5902|_G5903] + Exit: (14) [[v, d]]=[[v, d]] + Call: (14) isvar([v, d]) + Unify: (14) isvar([v, d]) + Exit: (14) isvar([v, d]) + Call: (14) lists:member([[v, d], _G5908], [[[v, d], ["b", "a"]]]) + Unify: (14) lists:member([[v, d], _G5908], [[[v, d], ["b", "a"]]]) + Exit: (14) lists:member([[v, d], ["b", "a"]], [[[v, d], ["b", 
"a"]]]) + Call: (14) lists:append([["b", "a"], ["b", "a"]], [["b", "a"]], _G5923) + Unify: (14) lists:append([["b", "a"], ["b", "a"]], [["b", "a"]], [["b", "a"]|_G5915]) + Exit: (14) lists:append([["b", "a"], ["b", "a"]], [["b", "a"]], [["b", "a"], ["b", "a"], ["b", "a"]]) + Call: (14) findresult3([], [[[v, d], ["b", "a"]]], [["b", "a"], ["b", "a"], ["b", "a"]], _G5930) + Unify: (14) findresult3([], [[[v, d], ["b", "a"]]], [["b", "a"], ["b", "a"], ["b", "a"]], [["b", "a"], ["b", "a"], ["b", "a"]]) + Exit: (14) findresult3([], [[[v, d], ["b", "a"]]], [["b", "a"], ["b", "a"], ["b", "a"]], [["b", "a"], ["b", "a"], ["b", "a"]]) + Exit: (13) findresult3([[v, d]], [[[v, d], ["b", "a"]]], [["b", "a"], ["b", "a"]], [["b", "a"], ["b", "a"], ["b", "a"]]) + Exit: (12) findresult3([["b", "a"], [v, d]], [[[v, d], ["b", "a"]]], [["b", "a"]], [["b", "a"], ["b", "a"], ["b", "a"]]) + Exit: (11) findresult3([["b", "a"], ["b", "a"], [v, d]], [[[v, d], ["b", "a"]]], [], [["b", "a"], ["b", "a"], ["b", "a"]]) + Call: (11) debug(on) + Fail: (11) debug(on) + Redo: (10) member1([[n, f], [["b", "a"], ["b", "a"], [v, d]]], [[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, d], ["b", "a"]]]) + Call: (11) true + Exit: (11) true + Call: (11) true + Exit: (11) true + Exit: (10) member1([[n, f], [["b", "a"], ["b", "a"], [v, d]]], [[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, d], ["b", "a"]]]) + Exit: (9) interpret1(off, [[n, f], [["b", "a"], ["b", "a"], [v, d]]], [[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, d], ["b", "a"]]]) + Exit: (8) interpret(off, [[n, f], [["b", "a"], ["b", "a"], [v, d]]], [[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, d], ["b", "a"]]]) + + Call: (8) interpret(off, [[n, 
f], [["c", "b"], ["b", "a"], [v, d]]], [[[n, f], [[v, c], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], _G14) + Unify: (8) interpret(off, [[n, f], [["c", "b"], ["b", "a"], [v, d]]], [[[n, f], [[v, c], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], _G14) + Call: (9) convert_to_grammar_part1([[[n, f], [[v, c], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [], _G5551) + Unify: (9) convert_to_grammar_part1([[[n, f], [[v, c], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [], _G5551) + Call: (10) convert_to_grammar_part11([[[n, f], [[v, c], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [], _G5551, [], _G5553) + Unify: (10) convert_to_grammar_part11([[[n, f], [[v, c], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [], _G5551, [], _G5553) + Call: (11) [[[n, f], [[v, c], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]]=[_G5542|_G5543] + Exit: (11) [[[n, f], [[v, c], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]]=[[[n, f], [[v, c], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]] + Call: (11) [[n, f], [[v, c], [v, a], [v, d]], ":-", [[[n, =], [[...|...]|...]]]]=[[n, _G5551], _G5554, "->", _G5563] + Fail: (11) [[n, f], [[v, c], [v, a], [v, d]], ":-", [[[n, =], [[...|...]|...]]]]=[[n, _G5551], _G5554, "->", _G5563] + Redo: (10) convert_to_grammar_part11([[[n, f], [[v, c], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [], _G5554, [], _G5556) + Call: (11) [[n, f], [[v, c], [v, a], [v, d]], ":-", [[[n, =], [[...|...]|...]]]]=[[n, _G5551], "->", _G5560] + Fail: (11) [[n, f], [[v, c], [v, a], [v, d]], ":-", [[[n, =], [[...|...]|...]]]]=[[n, _G5551], "->", _G5560] + Redo: (10) convert_to_grammar_part11([[[n, f], [[v, c], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [], _G5551, [], _G5553) + Unify: (10) convert_to_grammar_part11([[[n, f], [[v, c], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [], _G5551, [], _G5553) + Call: (11) [[[n, f], [[v, c], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]]=[_G5542|_G5543] + Exit: (11) [[[n, f], [[v, c], [v, a], [v, d]], 
":-", [[[n|...], [...|...]]]]]=[[[n, f], [[v, c], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]] + Call: (11) [[n, f], [[v, c], [v, a], [v, d]], ":-", [[[n, =], [[...|...]|...]]]]=[_G5545, _G5548, ":-", _G5557] + Exit: (11) [[n, f], [[v, c], [v, a], [v, d]], ":-", [[[n, =], [[...|...]|...]]]]=[[n, f], [[v, c], [v, a], [v, d]], ":-", [[[n, =], [[...|...]|...]]]] + Call: (11) true + Exit: (11) true + Call: (11) true + Exit: (11) true + Call: (11) true + Exit: (11) true + Call: (11) true + Exit: (11) true + Call: (11) lists:append([], [[[n, f], [[v, c], [v, a], [v|...]], ":-", [[[...|...]|...]]]], _G5572) + Unify: (11) lists:append([], [[[n, f], [[v, c], [v, a], [v|...]], ":-", [[[...|...]|...]]]], [[[n, f], [[v, c], [v, a], [v|...]], ":-", [[[...|...]|...]]]]) + Exit: (11) lists:append([], [[[n, f], [[v, c], [v, a], [v|...]], ":-", [[[...|...]|...]]]], [[[n, f], [[v, c], [v, a], [v|...]], ":-", [[[...|...]|...]]]]) + Call: (11) convert_to_grammar_part11([], [[[n, f], [[v, c], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], _G5572, [], _G5574) + Unify: (11) convert_to_grammar_part11([], [[[n, f], [[v, c], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [], []) + Exit: (11) convert_to_grammar_part11([], [[[n, f], [[v, c], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [], []) + Exit: (10) convert_to_grammar_part11([[[n, f], [[v, c], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [], [[[n, f], [[v, c], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [], []) + Call: (10) lists:append([[[n, f], [[v, c], [v, a], [v|...]], ":-", [[[...|...]|...]]]], [], _G5572) + Unify: (10) lists:append([[[n, f], [[v, c], [v, a], [v|...]], ":-", [[[...|...]|...]]]], [], [[[n, f], [[v, c], [v, a], [v|...]], ":-", [[[...|...]|...]]]|_G5564]) + Exit: (10) lists:append([[[n, f], [[v, c], [v, a], [v|...]], ":-", [[[...|...]|...]]]], [], [[[n, f], [[v, 
c], [v, a], [v|...]], ":-", [[[...|...]|...]]]]) + Exit: (9) convert_to_grammar_part1([[[n, f], [[v, c], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [], [[[n, f], [[v, c], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]]) + Call: (9) interpret1(off, [[n, f], [["c", "b"], ["b", "a"], [v, d]]], [[[n, f], [[v, c], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], _G14) + Unify: (9) interpret1(off, [[n, f], [["c", "b"], ["b", "a"], [v, d]]], [[[n, f], [[v, c], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], _G14) +^ Call: (10) retractall(debug(_G5566)) +^ Exit: (10) retractall(debug(_G5566)) +^ Call: (10) assertz(debug(off)) +^ Exit: (10) assertz(debug(off)) +^ Call: (10) retractall(cut(_G5570)) +^ Exit: (10) retractall(cut(_G5570)) +^ Call: (10) assertz(cut(off)) +^ Exit: (10) assertz(cut(off)) + Call: (10) member1([[n, f], [["c", "b"], ["b", "a"], [v, d]]], [[[n, f], [[v, c], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], _G14) + Unify: (10) member1([[n, f], [["c", "b"], ["b", "a"], [v, d]]], [[[n, f], [[v, c], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], _G14) + Call: (11) cut(off) + Unify: (11) cut(off) + Exit: (11) cut(off) + Call: (11) [[n, f], [["c", "b"], ["b", "a"], [v, d]]]=[_G5574, _G5577] + Exit: (11) [[n, f], [["c", "b"], ["b", "a"], [v, d]]]=[[n, f], [["c", "b"], ["b", "a"], [v, d]]] + Call: (11) [[[n, f], [[v, c], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]]=[[[n, f], _G5586, ":-", _G5595]|_G5581] + Exit: (11) [[[n, f], [[v, c], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]]=[[[n, f], [[v, c], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]] + Call: (11) length([["c", "b"], ["b", "a"], [v, d]], _G5606) + Unify: (11) length([["c", "b"], ["b", "a"], [v, d]], _G5606) + Exit: 
(11) length([["c", "b"], ["b", "a"], [v, d]], 3) + Call: (11) length([[v, c], [v, a], [v, d]], 3) + Unify: (11) length([[v, c], [v, a], [v, d]], 3) + Unify: (11) length([[v, c], [v, a], [v, d]], 3) + Exit: (11) length([[v, c], [v, a], [v, d]], 3) + Call: (11) [n, f]=[n, grammar] + Fail: (11) [n, f]=[n, grammar] + Redo: (10) member1([[n, f], [["c", "b"], ["b", "a"], [v, d]]], [[[n, f], [[v, c], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], _G14) + Call: (11) [n, f]=[n, grammar_part] + Fail: (11) [n, f]=[n, grammar_part] + Redo: (10) member1([[n, f], [["c", "b"], ["b", "a"], [v, d]]], [[[n, f], [[v, c], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], _G14) + Call: (11) checkarguments([["c", "b"], ["b", "a"], [v, d]], [[v, c], [v, a], [v, d]], [], _G5608, [], _G5610) + Unify: (11) checkarguments([["c", "b"], ["b", "a"], [v, d]], [[v, c], [v, a], [v, d]], [], _G5608, [], _G5610) + Call: (12) [["c", "b"], ["b", "a"], [v, d]]=[_G5598|_G5599] + Exit: (12) [["c", "b"], ["b", "a"], [v, d]]=[["c", "b"], ["b", "a"], [v, d]] + Call: (12) expressionnotatom(["c", "b"]) + Unify: (12) expressionnotatom(["c", "b"]) + Call: (13) isvalstrempty(["c", "b"]) + Unify: (13) isvalstrempty(["c", "b"]) + Call: (14) var(["c", "b"]) + Fail: (14) var(["c", "b"]) + Redo: (13) isvalstrempty(["c", "b"]) + Unify: (13) isvalstrempty(["c", "b"]) + Call: (14) isval(["c", "b"]) + Unify: (14) isval(["c", "b"]) + Call: (15) number(["c", "b"]) + Fail: (15) number(["c", "b"]) + Fail: (14) isval(["c", "b"]) + Redo: (13) isvalstrempty(["c", "b"]) + Unify: (13) isvalstrempty(["c", "b"]) + Call: (14) string(["c", "b"]) + Fail: (14) string(["c", "b"]) + Fail: (13) isvalstrempty(["c", "b"]) + Redo: (12) expressionnotatom(["c", "b"]) + Unify: (12) expressionnotatom(["c", "b"]) +^ Call: (13) not(atom(["c", "b"])) +^ Unify: (13) not(user:atom(["c", "b"])) +^ Exit: (13) 
not(user:atom(["c", "b"])) + Call: (13) length(["c", "b"], _G5614) + Unify: (13) length(["c", "b"], _G5614) + Exit: (13) length(["c", "b"], 2) + Call: (13) 2>=1 + Exit: (13) 2>=1 + Call: (13) expressionnotatom2(["c", "b"]) + Unify: (13) expressionnotatom2(["c", "b"]) + Call: (14) isvalstrempty("c") + Unify: (14) isvalstrempty("c") + Call: (15) var("c") + Fail: (15) var("c") + Redo: (14) isvalstrempty("c") + Unify: (14) isvalstrempty("c") + Call: (15) isval("c") + Unify: (15) isval("c") + Call: (16) number("c") + Fail: (16) number("c") + Fail: (15) isval("c") + Redo: (14) isvalstrempty("c") + Unify: (14) isvalstrempty("c") + Call: (15) string("c") + Exit: (15) string("c") + Exit: (14) isvalstrempty("c") + Call: (14) expressionnotatom2(["b"]) + Unify: (14) expressionnotatom2(["b"]) + Call: (15) isvalstrempty("b") + Unify: (15) isvalstrempty("b") + Call: (16) var("b") + Fail: (16) var("b") + Redo: (15) isvalstrempty("b") + Unify: (15) isvalstrempty("b") + Call: (16) isval("b") + Unify: (16) isval("b") + Call: (17) number("b") + Fail: (17) number("b") + Fail: (16) isval("b") + Redo: (15) isvalstrempty("b") + Unify: (15) isvalstrempty("b") + Call: (16) string("b") + Exit: (16) string("b") + Exit: (15) isvalstrempty("b") + Call: (15) expressionnotatom2([]) + Unify: (15) expressionnotatom2([]) + Exit: (15) expressionnotatom2([]) + Exit: (14) expressionnotatom2(["b"]) + Exit: (13) expressionnotatom2(["c", "b"]) + Exit: (12) expressionnotatom(["c", "b"]) + Call: (12) [[v, c], [v, a], [v, d]]=[_G5606|_G5607] + Exit: (12) [[v, c], [v, a], [v, d]]=[[v, c], [v, a], [v, d]] +^ Call: (12) not(var([v, c])) +^ Unify: (12) not(user:var([v, c])) +^ Exit: (12) not(user:var([v, c])) + Call: (12) isvar([v, c]) + Unify: (12) isvar([v, c]) + Exit: (12) isvar([v, c]) + Call: (12) putvalue([v, c], ["c", "b"], [], _G5624) + Unify: (12) putvalue([v, c], ["c", "b"], [], _G5624) +^ Call: (13) not(isvar([v, c])) +^ Unify: (13) not(user:isvar([v, c])) + Call: (14) isvar([v, c]) + Unify: (14) 
isvar([v, c]) + Exit: (14) isvar([v, c]) +^ Fail: (13) not(user:isvar([v, c])) + Redo: (12) putvalue([v, c], ["c", "b"], [], _G5624) + Call: (13) isvar([v, c]) + Unify: (13) isvar([v, c]) + Exit: (13) isvar([v, c]) + Call: (13) isvalstrorundef(["c", "b"]) + Unify: (13) isvalstrorundef(["c", "b"]) + Call: (14) var(["c", "b"]) + Fail: (14) var(["c", "b"]) + Redo: (13) isvalstrorundef(["c", "b"]) + Unify: (13) isvalstrorundef(["c", "b"]) +^ Call: (14) not(var(["c", "b"])) +^ Unify: (14) not(user:var(["c", "b"])) +^ Exit: (14) not(user:var(["c", "b"])) + Call: (14) isval(["c", "b"]) + Unify: (14) isval(["c", "b"]) + Call: (15) number(["c", "b"]) + Fail: (15) number(["c", "b"]) + Fail: (14) isval(["c", "b"]) + Redo: (13) isvalstrorundef(["c", "b"]) + Unify: (13) isvalstrorundef(["c", "b"]) +^ Call: (14) not(var(["c", "b"])) +^ Unify: (14) not(user:var(["c", "b"])) +^ Exit: (14) not(user:var(["c", "b"])) + Call: (14) expression(["c", "b"]) + Unify: (14) expression(["c", "b"]) + Call: (15) isval(["c", "b"]) + Unify: (15) isval(["c", "b"]) + Call: (16) number(["c", "b"]) + Fail: (16) number(["c", "b"]) + Fail: (15) isval(["c", "b"]) + Redo: (14) expression(["c", "b"]) + Unify: (14) expression(["c", "b"]) + Call: (15) string(["c", "b"]) + Fail: (15) string(["c", "b"]) + Redo: (14) expression(["c", "b"]) + Unify: (14) expression(["c", "b"]) + Call: (15) atom(["c", "b"]) + Fail: (15) atom(["c", "b"]) + Redo: (14) expression(["c", "b"]) + Unify: (14) expression(["c", "b"]) +^ Call: (15) not(atom(["c", "b"])) +^ Unify: (15) not(user:atom(["c", "b"])) +^ Exit: (15) not(user:atom(["c", "b"])) + Call: (15) length(["c", "b"], _G5632) + Unify: (15) length(["c", "b"], _G5632) + Exit: (15) length(["c", "b"], 2) + Call: (15) 2>=1 + Exit: (15) 2>=1 + Call: (15) expression2(["c", "b"]) + Unify: (15) expression2(["c", "b"]) + Call: (16) expression3("c") + Unify: (16) expression3("c") + Call: (17) isval("c") + Unify: (17) isval("c") + Call: (18) number("c") + Fail: (18) number("c") + Fail: 
(17) isval("c") + Redo: (16) expression3("c") + Unify: (16) expression3("c") + Call: (17) string("c") + Exit: (17) string("c") + Exit: (16) expression3("c") + Call: (16) expression2(["b"]) + Unify: (16) expression2(["b"]) + Call: (17) expression3("b") + Unify: (17) expression3("b") + Call: (18) isval("b") + Unify: (18) isval("b") + Call: (19) number("b") + Fail: (19) number("b") + Fail: (18) isval("b") + Redo: (17) expression3("b") + Unify: (17) expression3("b") + Call: (18) string("b") + Exit: (18) string("b") + Exit: (17) expression3("b") + Call: (17) expression2([]) + Unify: (17) expression2([]) + Exit: (17) expression2([]) + Exit: (16) expression2(["b"]) + Exit: (15) expression2(["c", "b"]) + Exit: (14) expression(["c", "b"]) + Exit: (13) isvalstrorundef(["c", "b"]) + Call: (13) updatevar([v, c], ["c", "b"], [], _G5634) + Unify: (13) updatevar([v, c], ["c", "b"], [], _G5634) + Call: (14) lists:member([[v, c], empty], []) + Fail: (14) lists:member([[v, c], empty], []) + Redo: (13) updatevar([v, c], ["c", "b"], [], _G5634) +^ Call: (14) not(member([[v, c], _G5630], [])) +^ Unify: (14) not(user:member([[v, c], _G5630], [])) +^ Exit: (14) not(user:member([[v, c], _G5630], [])) + Call: (14) _G5630=empty + Exit: (14) empty=empty + Call: (14) true + Exit: (14) true + Call: (14) lists:append([], [[[v, c], ["c", "b"]]], _G5654) + Unify: (14) lists:append([], [[[v, c], ["c", "b"]]], [[[v, c], ["c", "b"]]]) + Exit: (14) lists:append([], [[[v, c], ["c", "b"]]], [[[v, c], ["c", "b"]]]) + Call: (14) true + Exit: (14) true + Call: (14) true + Exit: (14) true + Exit: (13) updatevar([v, c], ["c", "b"], [], [[[v, c], ["c", "b"]]]) + Exit: (12) putvalue([v, c], ["c", "b"], [], [[[v, c], ["c", "b"]]]) + Call: (12) checkarguments([["b", "a"], [v, d]], [[v, a], [v, d]], [[[v, c], ["c", "b"]]], _G5655, [], _G5657) + Unify: (12) checkarguments([["b", "a"], [v, d]], [[v, a], [v, d]], [[[v, c], ["c", "b"]]], _G5655, [], _G5657) + Call: (13) [["b", "a"], [v, d]]=[_G5645|_G5646] + Exit: 
(13) [["b", "a"], [v, d]]=[["b", "a"], [v, d]] + Call: (13) expressionnotatom(["b", "a"]) + Unify: (13) expressionnotatom(["b", "a"]) + Call: (14) isvalstrempty(["b", "a"]) + Unify: (14) isvalstrempty(["b", "a"]) + Call: (15) var(["b", "a"]) + Fail: (15) var(["b", "a"]) + Redo: (14) isvalstrempty(["b", "a"]) + Unify: (14) isvalstrempty(["b", "a"]) + Call: (15) isval(["b", "a"]) + Unify: (15) isval(["b", "a"]) + Call: (16) number(["b", "a"]) + Fail: (16) number(["b", "a"]) + Fail: (15) isval(["b", "a"]) + Redo: (14) isvalstrempty(["b", "a"]) + Unify: (14) isvalstrempty(["b", "a"]) + Call: (15) string(["b", "a"]) + Fail: (15) string(["b", "a"]) + Fail: (14) isvalstrempty(["b", "a"]) + Redo: (13) expressionnotatom(["b", "a"]) + Unify: (13) expressionnotatom(["b", "a"]) +^ Call: (14) not(atom(["b", "a"])) +^ Unify: (14) not(user:atom(["b", "a"])) +^ Exit: (14) not(user:atom(["b", "a"])) + Call: (14) length(["b", "a"], _G5661) + Unify: (14) length(["b", "a"], _G5661) + Exit: (14) length(["b", "a"], 2) + Call: (14) 2>=1 + Exit: (14) 2>=1 + Call: (14) expressionnotatom2(["b", "a"]) + Unify: (14) expressionnotatom2(["b", "a"]) + Call: (15) isvalstrempty("b") + Unify: (15) isvalstrempty("b") + Call: (16) var("b") + Fail: (16) var("b") + Redo: (15) isvalstrempty("b") + Unify: (15) isvalstrempty("b") + Call: (16) isval("b") + Unify: (16) isval("b") + Call: (17) number("b") + Fail: (17) number("b") + Fail: (16) isval("b") + Redo: (15) isvalstrempty("b") + Unify: (15) isvalstrempty("b") + Call: (16) string("b") + Exit: (16) string("b") + Exit: (15) isvalstrempty("b") + Call: (15) expressionnotatom2(["a"]) + Unify: (15) expressionnotatom2(["a"]) + Call: (16) isvalstrempty("a") + Unify: (16) isvalstrempty("a") + Call: (17) var("a") + Fail: (17) var("a") + Redo: (16) isvalstrempty("a") + Unify: (16) isvalstrempty("a") + Call: (17) isval("a") + Unify: (17) isval("a") + Call: (18) number("a") + Fail: (18) number("a") + Fail: (17) isval("a") + Redo: (16) isvalstrempty("a") + Unify: 
(16) isvalstrempty("a") + Call: (17) string("a") + Exit: (17) string("a") + Exit: (16) isvalstrempty("a") + Call: (16) expressionnotatom2([]) + Unify: (16) expressionnotatom2([]) + Exit: (16) expressionnotatom2([]) + Exit: (15) expressionnotatom2(["a"]) + Exit: (14) expressionnotatom2(["b", "a"]) + Exit: (13) expressionnotatom(["b", "a"]) + Call: (13) [[v, a], [v, d]]=[_G5653|_G5654] + Exit: (13) [[v, a], [v, d]]=[[v, a], [v, d]] +^ Call: (13) not(var([v, a])) +^ Unify: (13) not(user:var([v, a])) +^ Exit: (13) not(user:var([v, a])) + Call: (13) isvar([v, a]) + Unify: (13) isvar([v, a]) + Exit: (13) isvar([v, a]) + Call: (13) putvalue([v, a], ["b", "a"], [[[v, c], ["c", "b"]]], _G5671) + Unify: (13) putvalue([v, a], ["b", "a"], [[[v, c], ["c", "b"]]], _G5671) +^ Call: (14) not(isvar([v, a])) +^ Unify: (14) not(user:isvar([v, a])) + Call: (15) isvar([v, a]) + Unify: (15) isvar([v, a]) + Exit: (15) isvar([v, a]) +^ Fail: (14) not(user:isvar([v, a])) + Redo: (13) putvalue([v, a], ["b", "a"], [[[v, c], ["c", "b"]]], _G5671) + Call: (14) isvar([v, a]) + Unify: (14) isvar([v, a]) + Exit: (14) isvar([v, a]) + Call: (14) isvalstrorundef(["b", "a"]) + Unify: (14) isvalstrorundef(["b", "a"]) + Call: (15) var(["b", "a"]) + Fail: (15) var(["b", "a"]) + Redo: (14) isvalstrorundef(["b", "a"]) + Unify: (14) isvalstrorundef(["b", "a"]) +^ Call: (15) not(var(["b", "a"])) +^ Unify: (15) not(user:var(["b", "a"])) +^ Exit: (15) not(user:var(["b", "a"])) + Call: (15) isval(["b", "a"]) + Unify: (15) isval(["b", "a"]) + Call: (16) number(["b", "a"]) + Fail: (16) number(["b", "a"]) + Fail: (15) isval(["b", "a"]) + Redo: (14) isvalstrorundef(["b", "a"]) + Unify: (14) isvalstrorundef(["b", "a"]) +^ Call: (15) not(var(["b", "a"])) +^ Unify: (15) not(user:var(["b", "a"])) +^ Exit: (15) not(user:var(["b", "a"])) + Call: (15) expression(["b", "a"]) + Unify: (15) expression(["b", "a"]) + Call: (16) isval(["b", "a"]) + Unify: (16) isval(["b", "a"]) + Call: (17) number(["b", "a"]) + Fail: (17) 
number(["b", "a"]) + Fail: (16) isval(["b", "a"]) + Redo: (15) expression(["b", "a"]) + Unify: (15) expression(["b", "a"]) + Call: (16) string(["b", "a"]) + Fail: (16) string(["b", "a"]) + Redo: (15) expression(["b", "a"]) + Unify: (15) expression(["b", "a"]) + Call: (16) atom(["b", "a"]) + Fail: (16) atom(["b", "a"]) + Redo: (15) expression(["b", "a"]) + Unify: (15) expression(["b", "a"]) +^ Call: (16) not(atom(["b", "a"])) +^ Unify: (16) not(user:atom(["b", "a"])) +^ Exit: (16) not(user:atom(["b", "a"])) + Call: (16) length(["b", "a"], _G5679) + Unify: (16) length(["b", "a"], _G5679) + Exit: (16) length(["b", "a"], 2) + Call: (16) 2>=1 + Exit: (16) 2>=1 + Call: (16) expression2(["b", "a"]) + Unify: (16) expression2(["b", "a"]) + Call: (17) expression3("b") + Unify: (17) expression3("b") + Call: (18) isval("b") + Unify: (18) isval("b") + Call: (19) number("b") + Fail: (19) number("b") + Fail: (18) isval("b") + Redo: (17) expression3("b") + Unify: (17) expression3("b") + Call: (18) string("b") + Exit: (18) string("b") + Exit: (17) expression3("b") + Call: (17) expression2(["a"]) + Unify: (17) expression2(["a"]) + Call: (18) expression3("a") + Unify: (18) expression3("a") + Call: (19) isval("a") + Unify: (19) isval("a") + Call: (20) number("a") + Fail: (20) number("a") + Fail: (19) isval("a") + Redo: (18) expression3("a") + Unify: (18) expression3("a") + Call: (19) string("a") + Exit: (19) string("a") + Exit: (18) expression3("a") + Call: (18) expression2([]) + Unify: (18) expression2([]) + Exit: (18) expression2([]) + Exit: (17) expression2(["a"]) + Exit: (16) expression2(["b", "a"]) + Exit: (15) expression(["b", "a"]) + Exit: (14) isvalstrorundef(["b", "a"]) + Call: (14) updatevar([v, a], ["b", "a"], [[[v, c], ["c", "b"]]], _G5681) + Unify: (14) updatevar([v, a], ["b", "a"], [[[v, c], ["c", "b"]]], _G5681) + Call: (15) lists:member([[v, a], empty], [[[v, c], ["c", "b"]]]) + Unify: (15) lists:member([[v, a], empty], [[[v, c], ["c", "b"]]]) + Fail: (15) 
lists:member([[v, a], empty], [[[v, c], ["c", "b"]]]) + Redo: (14) updatevar([v, a], ["b", "a"], [[[v, c], ["c", "b"]]], _G5681) +^ Call: (15) not(member([[v, a], _G5677], [[[v, c], ["c", "b"]]])) +^ Unify: (15) not(user:member([[v, a], _G5677], [[[v, c], ["c", "b"]]])) +^ Exit: (15) not(user:member([[v, a], _G5677], [[[v, c], ["c", "b"]]])) + Call: (15) _G5677=empty + Exit: (15) empty=empty + Call: (15) true + Exit: (15) true + Call: (15) lists:append([[[v, c], ["c", "b"]]], [[[v, a], ["b", "a"]]], _G5701) + Unify: (15) lists:append([[[v, c], ["c", "b"]]], [[[v, a], ["b", "a"]]], [[[v, c], ["c", "b"]]|_G5693]) + Exit: (15) lists:append([[[v, c], ["c", "b"]]], [[[v, a], ["b", "a"]]], [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]]]) + Call: (15) true + Exit: (15) true + Call: (15) true + Exit: (15) true + Exit: (14) updatevar([v, a], ["b", "a"], [[[v, c], ["c", "b"]]], [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]]]) + Exit: (13) putvalue([v, a], ["b", "a"], [[[v, c], ["c", "b"]]], [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]]]) + Call: (13) checkarguments([[v, d]], [[v, d]], [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]]], _G5705, [], _G5707) + Unify: (13) checkarguments([[v, d]], [[v, d]], [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]]], _G5705, [], _G5707) + Call: (14) [[v, d]]=[_G5695|_G5696] + Exit: (14) [[v, d]]=[[v, d]] + Call: (14) expressionnotatom([v, d]) + Unify: (14) expressionnotatom([v, d]) + Call: (15) isvalstrempty([v, d]) + Unify: (15) isvalstrempty([v, d]) + Call: (16) var([v, d]) + Fail: (16) var([v, d]) + Redo: (15) isvalstrempty([v, d]) + Unify: (15) isvalstrempty([v, d]) + Call: (16) isval([v, d]) + Unify: (16) isval([v, d]) + Call: (17) number([v, d]) + Fail: (17) number([v, d]) + Fail: (16) isval([v, d]) + Redo: (15) isvalstrempty([v, d]) + Unify: (15) isvalstrempty([v, d]) + Call: (16) string([v, d]) + Fail: (16) string([v, d]) + Fail: (15) isvalstrempty([v, d]) + Redo: (14) expressionnotatom([v, d]) + Unify: (14) expressionnotatom([v, d]) +^ Call: (15) 
not(atom([v, d])) +^ Unify: (15) not(user:atom([v, d])) +^ Exit: (15) not(user:atom([v, d])) + Call: (15) length([v, d], _G5711) + Unify: (15) length([v, d], _G5711) + Exit: (15) length([v, d], 2) + Call: (15) 2>=1 + Exit: (15) 2>=1 + Call: (15) expressionnotatom2([v, d]) + Unify: (15) expressionnotatom2([v, d]) + Call: (16) isvalstrempty(v) + Unify: (16) isvalstrempty(v) + Call: (17) var(v) + Fail: (17) var(v) + Redo: (16) isvalstrempty(v) + Unify: (16) isvalstrempty(v) + Call: (17) isval(v) + Unify: (17) isval(v) + Call: (18) number(v) + Fail: (18) number(v) + Fail: (17) isval(v) + Redo: (16) isvalstrempty(v) + Unify: (16) isvalstrempty(v) + Call: (17) string(v) + Fail: (17) string(v) + Fail: (16) isvalstrempty(v) + Fail: (15) expressionnotatom2([v, d]) + Redo: (14) expressionnotatom([v, d]) + Unify: (14) expressionnotatom([v, d]) + Call: (15) predicate_or_rule_name([v, d]) + Fail: (15) predicate_or_rule_name([v, d]) + Fail: (14) expressionnotatom([v, d]) + Redo: (13) checkarguments([[v, d]], [[v, d]], [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]]], _G5705, [], _G5707) + Unify: (13) checkarguments([[v, d]], [[v, d]], [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]]], _G5705, [], _G5707) + Call: (14) [[v, d]]=[_G5695|_G5696] + Exit: (14) [[v, d]]=[[v, d]] +^ Call: (14) not(var([v, d])) +^ Unify: (14) not(user:var([v, d])) +^ Exit: (14) not(user:var([v, d])) + Call: (14) isvar([v, d]) + Unify: (14) isvar([v, d]) + Exit: (14) isvar([v, d]) + Call: (14) [[v, d]]=[_G5703|_G5704] + Exit: (14) [[v, d]]=[[v, d]] + Call: (14) expressionnotatom([v, d]) + Unify: (14) expressionnotatom([v, d]) + Call: (15) isvalstrempty([v, d]) + Unify: (15) isvalstrempty([v, d]) + Call: (16) var([v, d]) + Fail: (16) var([v, d]) + Redo: (15) isvalstrempty([v, d]) + Unify: (15) isvalstrempty([v, d]) + Call: (16) isval([v, d]) + Unify: (16) isval([v, d]) + Call: (17) number([v, d]) + Fail: (17) number([v, d]) + Fail: (16) isval([v, d]) + Redo: (15) isvalstrempty([v, d]) + Unify: (15) 
isvalstrempty([v, d]) + Call: (16) string([v, d]) + Fail: (16) string([v, d]) + Fail: (15) isvalstrempty([v, d]) + Redo: (14) expressionnotatom([v, d]) + Unify: (14) expressionnotatom([v, d]) +^ Call: (15) not(atom([v, d])) +^ Unify: (15) not(user:atom([v, d])) +^ Exit: (15) not(user:atom([v, d])) + Call: (15) length([v, d], _G5719) + Unify: (15) length([v, d], _G5719) + Exit: (15) length([v, d], 2) + Call: (15) 2>=1 + Exit: (15) 2>=1 + Call: (15) expressionnotatom2([v, d]) + Unify: (15) expressionnotatom2([v, d]) + Call: (16) isvalstrempty(v) + Unify: (16) isvalstrempty(v) + Call: (17) var(v) + Fail: (17) var(v) + Redo: (16) isvalstrempty(v) + Unify: (16) isvalstrempty(v) + Call: (17) isval(v) + Unify: (17) isval(v) + Call: (18) number(v) + Fail: (18) number(v) + Fail: (17) isval(v) + Redo: (16) isvalstrempty(v) + Unify: (16) isvalstrempty(v) + Call: (17) string(v) + Fail: (17) string(v) + Fail: (16) isvalstrempty(v) + Fail: (15) expressionnotatom2([v, d]) + Redo: (14) expressionnotatom([v, d]) + Unify: (14) expressionnotatom([v, d]) + Call: (15) predicate_or_rule_name([v, d]) + Fail: (15) predicate_or_rule_name([v, d]) + Fail: (14) expressionnotatom([v, d]) + Redo: (13) checkarguments([[v, d]], [[v, d]], [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]]], _G5705, [], _G5707) + Unify: (13) checkarguments([[v, d]], [[v, d]], [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]]], _G5705, [], _G5707) + Call: (14) [[v, d]]=[_G5695|_G5696] + Exit: (14) [[v, d]]=[[v, d]] +^ Call: (14) not(var([v, d])) +^ Unify: (14) not(user:var([v, d])) +^ Exit: (14) not(user:var([v, d])) + Call: (14) isvar([v, d]) + Unify: (14) isvar([v, d]) + Exit: (14) isvar([v, d]) + Call: (14) [[v, d]]=[_G5703|_G5704] + Exit: (14) [[v, d]]=[[v, d]] +^ Call: (14) not(var([v, d])) +^ Unify: (14) not(user:var([v, d])) +^ Exit: (14) not(user:var([v, d])) + Call: (14) isvar([v, d]) + Unify: (14) isvar([v, d]) + Exit: (14) isvar([v, d]) + Call: (14) getvalue([v, d], _G5719, [[[v, c], ["c", "b"]], [[v, a], ["b", 
"a"]]]) + Unify: (14) getvalue([v, d], _G5719, [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]]]) +^ Call: (15) not(isvar([v, d])) +^ Unify: (15) not(user:isvar([v, d])) + Call: (16) isvar([v, d]) + Unify: (16) isvar([v, d]) + Exit: (16) isvar([v, d]) +^ Fail: (15) not(user:isvar([v, d])) + Redo: (14) getvalue([v, d], _G5719, [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]]]) + Call: (15) isvar([v, d]) + Unify: (15) isvar([v, d]) + Exit: (15) isvar([v, d]) + Call: (15) isvalstrorundef(_G5718) + Unify: (15) isvalstrorundef(_G5718) + Call: (16) var(_G5718) + Exit: (16) var(_G5718) + Exit: (15) isvalstrorundef(_G5718) + Call: (15) getvar([v, d], _G5719, [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]]]) + Unify: (15) getvar([v, d], _G5719, [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]]]) + Call: (16) lists:member([[v, d], _G5714], [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]]]) + Unify: (16) lists:member([[v, d], _G5714], [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]]]) + Fail: (16) lists:member([[v, d], _G5714], [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]]]) + Redo: (15) getvar([v, d], _G5719, [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]]]) + Unify: (15) getvar([v, d], empty, [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]]]) +^ Call: (16) not(member([[v, d], _G5717], [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]]])) +^ Unify: (16) not(user:member([[v, d], _G5717], [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]]])) +^ Exit: (16) not(user:member([[v, d], _G5717], [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]]])) + Call: (16) true + Exit: (16) true + Exit: (15) getvar([v, d], empty, [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]]]) + Exit: (14) getvalue([v, d], empty, [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]]]) + Call: (14) true + Exit: (14) true + Call: (14) putvalue([v, d], empty, [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]]], _G5733) + Unify: (14) putvalue([v, d], empty, [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]]], _G5733) +^ Call: (15) not(isvar([v, d])) +^ Unify: (15) not(user:isvar([v, d])) + Call: (16) 
isvar([v, d]) + Unify: (16) isvar([v, d]) + Exit: (16) isvar([v, d]) +^ Fail: (15) not(user:isvar([v, d])) + Redo: (14) putvalue([v, d], empty, [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]]], _G5733) + Call: (15) isvar([v, d]) + Unify: (15) isvar([v, d]) + Exit: (15) isvar([v, d]) + Call: (15) isvalstrorundef(empty) + Unify: (15) isvalstrorundef(empty) + Call: (16) var(empty) + Fail: (16) var(empty) + Redo: (15) isvalstrorundef(empty) + Unify: (15) isvalstrorundef(empty) +^ Call: (16) not(var(empty)) +^ Unify: (16) not(user:var(empty)) +^ Exit: (16) not(user:var(empty)) + Call: (16) isval(empty) + Unify: (16) isval(empty) + Call: (17) number(empty) + Fail: (17) number(empty) + Fail: (16) isval(empty) + Redo: (15) isvalstrorundef(empty) + Unify: (15) isvalstrorundef(empty) +^ Call: (16) not(var(empty)) +^ Unify: (16) not(user:var(empty)) +^ Exit: (16) not(user:var(empty)) + Call: (16) expression(empty) + Unify: (16) expression(empty) + Exit: (16) expression(empty) + Exit: (15) isvalstrorundef(empty) + Call: (15) updatevar([v, d], empty, [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]]], _G5738) + Unify: (15) updatevar([v, d], empty, [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]]], _G5738) + Call: (16) lists:member([[v, d], empty], [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]]]) + Unify: (16) lists:member([[v, d], empty], [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]]]) + Fail: (16) lists:member([[v, d], empty], [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]]]) + Redo: (15) updatevar([v, d], empty, [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]]], _G5738) +^ Call: (16) not(member([[v, d], _G5734], [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]]])) +^ Unify: (16) not(user:member([[v, d], _G5734], [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]]])) +^ Exit: (16) not(user:member([[v, d], _G5734], [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]]])) + Call: (16) _G5734=empty + Exit: (16) empty=empty + Call: (16) true + Exit: (16) true + Call: (16) lists:append([[[v, c], ["c", "b"]], [[v, a], ["b", "a"]]], 
[[[v, d], empty]], _G5758) + Unify: (16) lists:append([[[v, c], ["c", "b"]], [[v, a], ["b", "a"]]], [[[v, d], empty]], [[[v, c], ["c", "b"]]|_G5750]) + Exit: (16) lists:append([[[v, c], ["c", "b"]], [[v, a], ["b", "a"]]], [[[v, d], empty]], [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]], [[v, d], empty]]) + Call: (16) true + Exit: (16) true + Call: (16) true + Exit: (16) true + Exit: (15) updatevar([v, d], empty, [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]]], [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]], [[v, d], empty]]) + Exit: (14) putvalue([v, d], empty, [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]]], [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]], [[v, d], empty]]) + Call: (14) lists:append([], [[[v, d], [v, d]]], _G5773) + Unify: (14) lists:append([], [[[v, d], [v, d]]], [[[v, d], [v, d]]]) + Exit: (14) lists:append([], [[[v, d], [v, d]]], [[[v, d], [v, d]]]) + Call: (14) checkarguments([], [], [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]], [[v, d], empty]], _G5774, [[[v, d], [v, d]]], _G5776) + Unify: (14) checkarguments([], [], [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]], [[v, d], empty]], [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]], [[v, d], empty]], [[[v, d], [v, d]]], [[[v, d], [v, d]]]) + Exit: (14) checkarguments([], [], [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]], [[v, d], empty]], [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]], [[v, d], empty]], [[[v, d], [v, d]]], [[[v, d], [v, d]]]) + Exit: (13) checkarguments([[v, d]], [[v, d]], [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]]], [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]], [[v, d], empty]], [], [[[v, d], [v, d]]]) + Exit: (12) checkarguments([["b", "a"], [v, d]], [[v, a], [v, d]], [[[v, c], ["c", "b"]]], [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]], [[v, d], empty]], [], [[[v, d], [v, d]]]) + Exit: (11) checkarguments([["c", "b"], ["b", "a"], [v, d]], [[v, c], [v, a], [v, d]], [], [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]], [[v, d], empty]], [], [[[v, d], [v, d]]]) + Call: (11) debug(on) + Fail: (11) debug(on) + 
Redo: (10) member1([[n, f], [["c", "b"], ["b", "a"], [v, d]]], [[[n, f], [[v, c], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], _G14) + Call: (11) true + Exit: (11) true + Call: (11) interpretbody([[[n, f], [[v, c], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]], [[v, d], empty]], _G5774, [[[n, =], [[v, d], [v, a]]]], true) + Unify: (11) interpretbody([[[n, f], [[v, c], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]], [[v, d], empty]], _G5774, [[[n, =], [[v, d], [v, a]]]], true) + Call: (12) [[[n, =], [[v, d], [v, a]]]]=[_G5764|_G5765] + Exit: (12) [[[n, =], [[v, d], [v, a]]]]=[[[n, =], [[v, d], [v, a]]]] +^ Call: (12) not(predicate_or_rule_name([[n, =], [[v, d], [v, a]]])) +^ Unify: (12) not(user:predicate_or_rule_name([[n, =], [[v, d], [v, a]]])) + Call: (13) predicate_or_rule_name([[n, =], [[v, d], [v, a]]]) + Fail: (13) predicate_or_rule_name([[n, =], [[v, d], [v, a]]]) +^ Exit: (12) not(user:predicate_or_rule_name([[n, =], [[v, d], [v, a]]])) + Call: (12) interpretbody([[[n, f], [[v, c], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]], [[v, d], empty]], _G5782, [[n, =], [[v, d], [v, a]]], _G5784) + Unify: (12) interpretbody([[[n, f], [[v, c], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]], [[v, d], empty]], _G5782, [[n, =], [[v, d], [v, a]]], _G5784) + Call: (13) [[n, =], [[v, d], [v, a]]]=[_G5772|_G5773] + Exit: (13) [[n, =], [[v, d], [v, a]]]=[[n, =], [[v, d], [v, a]]] +^ Call: (13) not(predicate_or_rule_name([n, =])) +^ Unify: (13) 
not(user:predicate_or_rule_name([n, =])) + Call: (14) predicate_or_rule_name([n, =]) + Unify: (14) predicate_or_rule_name([n, =]) + Exit: (14) predicate_or_rule_name([n, =]) +^ Fail: (13) not(user:predicate_or_rule_name([n, =])) + Redo: (12) interpretbody([[[n, f], [[v, c], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]], [[v, d], empty]], _G5782, [[n, =], [[v, d], [v, a]]], _G5784) + Unify: (12) interpretbody([[[n, f], [[v, c], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]], [[v, d], empty]], _G5782, [[n, =], [[v, d], [v, a]]], _G5784) + Call: (13) [[n, =], [[v, d], [v, a]]]=[[not, [_G5781]]|_G5773] + Fail: (13) [[n, =], [[v, d], [v, a]]]=[[not, [_G5781]]|_G5773] + Redo: (12) interpretbody([[[n, f], [[v, c], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]], [[v, d], empty]], _G5782, [[n, =], [[v, d], [v, a]]], _G5784) + Unify: (12) interpretbody([[[n, f], [[v, c], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]], [[v, d], empty]], _G5782, [[n, =], [[v, d], [v, a]]], _G5784) + Call: (13) [[n, =], [[v, d], [v, a]]]=[[_G5775], or, [_G5784]] + Fail: (13) [[n, =], [[v, d], [v, a]]]=[[_G5775], or, [_G5784]] + Redo: (12) interpretbody([[[n, f], [[v, c], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]], [[v, d], empty]], _G5782, [[n, =], [[v, d], [v, a]]], _G5784) + Unify: (12) interpretbody([[[n, f], [[v, c], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, a], [v, d]], ":-", [[[n|...], 
[...|...]]]]], [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]], [[v, d], empty]], _G5782, [[n, =], [[v, d], [v, a]]], _G5784) + Call: (13) [[n, =], [[v, d], [v, a]]]=[_G5772|_G5773] + Exit: (13) [[n, =], [[v, d], [v, a]]]=[[n, =], [[v, d], [v, a]]] +^ Call: (13) not(predicate_or_rule_name([n, =])) +^ Unify: (13) not(user:predicate_or_rule_name([n, =])) + Call: (14) predicate_or_rule_name([n, =]) + Unify: (14) predicate_or_rule_name([n, =]) + Exit: (14) predicate_or_rule_name([n, =]) +^ Fail: (13) not(user:predicate_or_rule_name([n, =])) + Fail: (12) interpretbody([[[n, f], [[v, c], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]], [[v, d], empty]], _G5782, [[n, =], [[v, d], [v, a]]], _G5784) + Redo: (11) interpretbody([[[n, f], [[v, c], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]], [[v, d], empty]], _G5774, [[[n, =], [[v, d], [v, a]]]], true) + Unify: (11) interpretbody([[[n, f], [[v, c], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]], [[v, d], empty]], _G5774, [[[n, =], [[v, d], [v, a]]]], true) + Call: (12) [[[n, =], [[v, d], [v, a]]]]=[[not, [_G5773]]|_G5765] + Fail: (12) [[[n, =], [[v, d], [v, a]]]]=[[not, [_G5773]]|_G5765] + Redo: (11) interpretbody([[[n, f], [[v, c], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]], [[v, d], empty]], _G5774, [[[n, =], [[v, d], [v, a]]]], true) + Unify: (11) interpretbody([[[n, f], [[v, c], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]], [[v, d], empty]], _G5774, [[[n, =], 
[[v, d], [v, a]]]], true) + Call: (12) [[[n, =], [[v, d], [v, a]]]]=[[_G5767], or, [_G5776]] + Fail: (12) [[[n, =], [[v, d], [v, a]]]]=[[_G5767], or, [_G5776]] + Redo: (11) interpretbody([[[n, f], [[v, c], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]], [[v, d], empty]], _G5774, [[[n, =], [[v, d], [v, a]]]], true) + Unify: (11) interpretbody([[[n, f], [[v, c], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]], [[v, d], empty]], _G5774, [[[n, =], [[v, d], [v, a]]]], true) + Call: (12) [[[n, =], [[v, d], [v, a]]]]=[_G5764|_G5765] + Exit: (12) [[[n, =], [[v, d], [v, a]]]]=[[[n, =], [[v, d], [v, a]]]] +^ Call: (12) not(predicate_or_rule_name([[n, =], [[v, d], [v, a]]])) +^ Unify: (12) not(user:predicate_or_rule_name([[n, =], [[v, d], [v, a]]])) + Call: (13) predicate_or_rule_name([[n, =], [[v, d], [v, a]]]) + Fail: (13) predicate_or_rule_name([[n, =], [[v, d], [v, a]]]) +^ Exit: (12) not(user:predicate_or_rule_name([[n, =], [[v, d], [v, a]]])) + Call: (12) interpretstatement1([[[n, f], [[v, c], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[n, =], [[v, d], [v, a]]], [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]], [[v, d], empty]], _G5783, _G5784, _G5785) + Unify: (12) interpretstatement1([[[n, f], [[v, c], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[n, =], [[v, d], [v, a]]], [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]], [[v, d], empty]], _G5783, true, nocut) + Call: (13) isop(=) + Unify: (13) isop(=) + Exit: (13) isop(=) + Call: (13) interpretpart(is, [v, d], [v, a], [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]], [[v, d], empty]], _G5783) + Unify: (13) interpretpart(is, [v, d], [v, a], [[[v, c], ["c", "b"]], 
[[v, a], ["b", "a"]], [[v, d], empty]], _G5783) + Call: (14) getvalues([v, d], [v, a], _G5781, _G5782, [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]], [[v, d], empty]]) + Unify: (14) getvalues([v, d], [v, a], _G5781, _G5782, [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]], [[v, d], empty]]) + Call: (15) getvalue([v, d], _G5780, [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]], [[v, d], empty]]) + Unify: (15) getvalue([v, d], _G5780, [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]], [[v, d], empty]]) +^ Call: (16) not(isvar([v, d])) +^ Unify: (16) not(user:isvar([v, d])) + Call: (17) isvar([v, d]) + Unify: (17) isvar([v, d]) + Exit: (17) isvar([v, d]) +^ Fail: (16) not(user:isvar([v, d])) + Redo: (15) getvalue([v, d], _G5780, [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]], [[v, d], empty]]) + Call: (16) isvar([v, d]) + Unify: (16) isvar([v, d]) + Exit: (16) isvar([v, d]) + Call: (16) isvalstrorundef(_G5779) + Unify: (16) isvalstrorundef(_G5779) + Call: (17) var(_G5779) + Exit: (17) var(_G5779) + Exit: (16) isvalstrorundef(_G5779) + Call: (16) getvar([v, d], _G5780, [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]], [[v, d], empty]]) + Unify: (16) getvar([v, d], _G5780, [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]], [[v, d], empty]]) + Call: (17) lists:member([[v, d], _G5775], [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]], [[v, d], empty]]) + Unify: (17) lists:member([[v, d], _G5775], [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]], [[v, d], empty]]) + Exit: (17) lists:member([[v, d], empty], [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]], [[v, d], empty]]) +^ Call: (17) not(empty=empty) +^ Unify: (17) not(user: (empty=empty)) +^ Fail: (17) not(user: (empty=empty)) + Redo: (16) getvar([v, d], _G5780, [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]], [[v, d], empty]]) + Unify: (16) getvar([v, d], empty, [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]], [[v, d], empty]]) +^ Call: (17) not(member([[v, d], _G5778], [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]], [[v, d], empty]])) +^ Unify: (17) not(user:member([[v, d], 
_G5778], [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]], [[v, d], empty]])) +^ Fail: (17) not(user:member([[v, d], _G5778], [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]], [[v, d], empty]])) + Redo: (16) getvar([v, d], empty, [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]], [[v, d], empty]]) + Call: (17) lists:member([[v, d], empty], [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]], [[v, d], empty]]) + Unify: (17) lists:member([[v, d], empty], [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]], [[v, d], empty]]) + Exit: (17) lists:member([[v, d], empty], [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]], [[v, d], empty]]) + Exit: (16) getvar([v, d], empty, [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]], [[v, d], empty]]) + Exit: (15) getvalue([v, d], empty, [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]], [[v, d], empty]]) + Call: (15) getvalue([v, a], _G5786, [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]], [[v, d], empty]]) + Unify: (15) getvalue([v, a], _G5786, [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]], [[v, d], empty]]) +^ Call: (16) not(isvar([v, a])) +^ Unify: (16) not(user:isvar([v, a])) + Call: (17) isvar([v, a]) + Unify: (17) isvar([v, a]) + Exit: (17) isvar([v, a]) +^ Fail: (16) not(user:isvar([v, a])) + Redo: (15) getvalue([v, a], _G5786, [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]], [[v, d], empty]]) + Call: (16) isvar([v, a]) + Unify: (16) isvar([v, a]) + Exit: (16) isvar([v, a]) + Call: (16) isvalstrorundef(_G5785) + Unify: (16) isvalstrorundef(_G5785) + Call: (17) var(_G5785) + Exit: (17) var(_G5785) + Exit: (16) isvalstrorundef(_G5785) + Call: (16) getvar([v, a], _G5786, [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]], [[v, d], empty]]) + Unify: (16) getvar([v, a], _G5786, [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]], [[v, d], empty]]) + Call: (17) lists:member([[v, a], _G5781], [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]], [[v, d], empty]]) + Unify: (17) lists:member([[v, a], _G5781], [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]], [[v, d], empty]]) + Exit: (17) lists:member([[v, a], ["b", 
"a"]], [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]], [[v, d], empty]]) +^ Call: (17) not(["b", "a"]=empty) +^ Unify: (17) not(user: (["b", "a"]=empty)) +^ Exit: (17) not(user: (["b", "a"]=empty)) + Exit: (16) getvar([v, a], ["b", "a"], [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]], [[v, d], empty]]) + Exit: (15) getvalue([v, a], ["b", "a"], [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]], [[v, d], empty]]) + Exit: (14) getvalues([v, d], [v, a], empty, ["b", "a"], [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]], [[v, d], empty]]) + Call: (14) expression(empty) + Unify: (14) expression(empty) + Exit: (14) expression(empty) + Call: (14) expression(["b", "a"]) + Unify: (14) expression(["b", "a"]) + Call: (15) isval(["b", "a"]) + Unify: (15) isval(["b", "a"]) + Call: (16) number(["b", "a"]) + Fail: (16) number(["b", "a"]) + Fail: (15) isval(["b", "a"]) + Redo: (14) expression(["b", "a"]) + Unify: (14) expression(["b", "a"]) + Call: (15) string(["b", "a"]) + Fail: (15) string(["b", "a"]) + Redo: (14) expression(["b", "a"]) + Unify: (14) expression(["b", "a"]) + Call: (15) atom(["b", "a"]) + Fail: (15) atom(["b", "a"]) + Redo: (14) expression(["b", "a"]) + Unify: (14) expression(["b", "a"]) +^ Call: (15) not(atom(["b", "a"])) +^ Unify: (15) not(user:atom(["b", "a"])) +^ Exit: (15) not(user:atom(["b", "a"])) + Call: (15) length(["b", "a"], _G5803) + Unify: (15) length(["b", "a"], _G5803) + Exit: (15) length(["b", "a"], 2) + Call: (15) 2>=1 + Exit: (15) 2>=1 + Call: (15) expression2(["b", "a"]) + Unify: (15) expression2(["b", "a"]) + Call: (16) expression3("b") + Unify: (16) expression3("b") + Call: (17) isval("b") + Unify: (17) isval("b") + Call: (18) number("b") + Fail: (18) number("b") + Fail: (17) isval("b") + Redo: (16) expression3("b") + Unify: (16) expression3("b") + Call: (17) string("b") + Exit: (17) string("b") + Exit: (16) expression3("b") + Call: (16) expression2(["a"]) + Unify: (16) expression2(["a"]) + Call: (17) expression3("a") + Unify: (17) expression3("a") + Call: 
(18) isval("a") + Unify: (18) isval("a") + Call: (19) number("a") + Fail: (19) number("a") + Fail: (18) isval("a") + Redo: (17) expression3("a") + Unify: (17) expression3("a") + Call: (18) string("a") + Exit: (18) string("a") + Exit: (17) expression3("a") + Call: (17) expression2([]) + Unify: (17) expression2([]) + Exit: (17) expression2([]) + Exit: (16) expression2(["a"]) + Exit: (15) expression2(["b", "a"]) + Exit: (14) expression(["b", "a"]) + Call: (14) val1emptyorvalsequal(empty, ["b", "a"]) + Unify: (14) val1emptyorvalsequal(empty, ["b", "a"]) + Exit: (14) val1emptyorvalsequal(empty, ["b", "a"]) + Call: (14) putvalue([v, d], ["b", "a"], [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]], [[v, d], empty]], _G5805) + Unify: (14) putvalue([v, d], ["b", "a"], [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]], [[v, d], empty]], _G5805) +^ Call: (15) not(isvar([v, d])) +^ Unify: (15) not(user:isvar([v, d])) + Call: (16) isvar([v, d]) + Unify: (16) isvar([v, d]) + Exit: (16) isvar([v, d]) +^ Fail: (15) not(user:isvar([v, d])) + Redo: (14) putvalue([v, d], ["b", "a"], [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]], [[v, d], empty]], _G5805) + Call: (15) isvar([v, d]) + Unify: (15) isvar([v, d]) + Exit: (15) isvar([v, d]) + Call: (15) isvalstrorundef(["b", "a"]) + Unify: (15) isvalstrorundef(["b", "a"]) + Call: (16) var(["b", "a"]) + Fail: (16) var(["b", "a"]) + Redo: (15) isvalstrorundef(["b", "a"]) + Unify: (15) isvalstrorundef(["b", "a"]) +^ Call: (16) not(var(["b", "a"])) +^ Unify: (16) not(user:var(["b", "a"])) +^ Exit: (16) not(user:var(["b", "a"])) + Call: (16) isval(["b", "a"]) + Unify: (16) isval(["b", "a"]) + Call: (17) number(["b", "a"]) + Fail: (17) number(["b", "a"]) + Fail: (16) isval(["b", "a"]) + Redo: (15) isvalstrorundef(["b", "a"]) + Unify: (15) isvalstrorundef(["b", "a"]) +^ Call: (16) not(var(["b", "a"])) +^ Unify: (16) not(user:var(["b", "a"])) +^ Exit: (16) not(user:var(["b", "a"])) + Call: (16) expression(["b", "a"]) + Unify: (16) expression(["b", "a"]) + 
Call: (17) isval(["b", "a"]) + Unify: (17) isval(["b", "a"]) + Call: (18) number(["b", "a"]) + Fail: (18) number(["b", "a"]) + Fail: (17) isval(["b", "a"]) + Redo: (16) expression(["b", "a"]) + Unify: (16) expression(["b", "a"]) + Call: (17) string(["b", "a"]) + Fail: (17) string(["b", "a"]) + Redo: (16) expression(["b", "a"]) + Unify: (16) expression(["b", "a"]) + Call: (17) atom(["b", "a"]) + Fail: (17) atom(["b", "a"]) + Redo: (16) expression(["b", "a"]) + Unify: (16) expression(["b", "a"]) +^ Call: (17) not(atom(["b", "a"])) +^ Unify: (17) not(user:atom(["b", "a"])) +^ Exit: (17) not(user:atom(["b", "a"])) + Call: (17) length(["b", "a"], _G5813) + Unify: (17) length(["b", "a"], _G5813) + Exit: (17) length(["b", "a"], 2) + Call: (17) 2>=1 + Exit: (17) 2>=1 + Call: (17) expression2(["b", "a"]) + Unify: (17) expression2(["b", "a"]) + Call: (18) expression3("b") + Unify: (18) expression3("b") + Call: (19) isval("b") + Unify: (19) isval("b") + Call: (20) number("b") + Fail: (20) number("b") + Fail: (19) isval("b") + Redo: (18) expression3("b") + Unify: (18) expression3("b") + Call: (19) string("b") + Exit: (19) string("b") + Exit: (18) expression3("b") + Call: (18) expression2(["a"]) + Unify: (18) expression2(["a"]) + Call: (19) expression3("a") + Unify: (19) expression3("a") + Call: (20) isval("a") + Unify: (20) isval("a") + Call: (21) number("a") + Fail: (21) number("a") + Fail: (20) isval("a") + Redo: (19) expression3("a") + Unify: (19) expression3("a") + Call: (20) string("a") + Exit: (20) string("a") + Exit: (19) expression3("a") + Call: (19) expression2([]) + Unify: (19) expression2([]) + Exit: (19) expression2([]) + Exit: (18) expression2(["a"]) + Exit: (17) expression2(["b", "a"]) + Exit: (16) expression(["b", "a"]) + Exit: (15) isvalstrorundef(["b", "a"]) + Call: (15) updatevar([v, d], ["b", "a"], [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]], [[v, d], empty]], _G5815) + Unify: (15) updatevar([v, d], ["b", "a"], [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]], 
[[v, d], empty]], _G5815) + Call: (16) lists:member([[v, d], empty], [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]], [[v, d], empty]]) + Unify: (16) lists:member([[v, d], empty], [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]], [[v, d], empty]]) + Exit: (16) lists:member([[v, d], empty], [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]], [[v, d], empty]]) + Call: (16) lists:delete([[[v, c], ["c", "b"]], [[v, a], ["b", "a"]], [[v, d], empty]], [[v, d], empty], _G5826) + Unify: (16) lists:delete([[[v, c], ["c", "b"]], [[v, a], ["b", "a"]], [[v, d], empty]], [[v, d], empty], _G5826) + Exit: (16) lists:delete([[[v, c], ["c", "b"]], [[v, a], ["b", "a"]], [[v, d], empty]], [[v, d], empty], [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]]]) + Call: (16) lists:append([[[v, c], ["c", "b"]], [[v, a], ["b", "a"]]], [[[v, d], ["b", "a"]]], _G5841) + Unify: (16) lists:append([[[v, c], ["c", "b"]], [[v, a], ["b", "a"]]], [[[v, d], ["b", "a"]]], [[[v, c], ["c", "b"]]|_G5833]) + Exit: (16) lists:append([[[v, c], ["c", "b"]], [[v, a], ["b", "a"]]], [[[v, d], ["b", "a"]]], [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]], [[v, d], ["b", "a"]]]) + Call: (16) true + Exit: (16) true + Call: (16) true + Exit: (16) true + Call: (16) true + Exit: (16) true + Exit: (15) updatevar([v, d], ["b", "a"], [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]], [[v, d], empty]], [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]], [[v, d], ["b", "a"]]]) + Exit: (14) putvalue([v, d], ["b", "a"], [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]], [[v, d], empty]], [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]], [[v, d], ["b", "a"]]]) + Call: (14) debug(on) + Fail: (14) debug(on) + Redo: (13) interpretpart(is, [v, d], [v, a], [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]], [[v, d], empty]], [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]], [[v, d], ["b", "a"]]]) + Call: (14) true + Exit: (14) true + Call: (14) debug(on) + Fail: (14) debug(on) + Redo: (13) interpretpart(is, [v, d], [v, a], [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]], [[v, d], empty]], [[[v, 
c], ["c", "b"]], [[v, a], ["b", "a"]], [[v, d], ["b", "a"]]]) + Call: (14) true + Exit: (14) true + Exit: (13) interpretpart(is, [v, d], [v, a], [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]], [[v, d], empty]], [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]], [[v, d], ["b", "a"]]]) + Exit: (12) interpretstatement1([[[n, f], [[v, c], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[n, =], [[v, d], [v, a]]], [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]], [[v, d], empty]], [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]], [[v, d], ["b", "a"]]], true, nocut) +^ Call: (12) not(nocut=cut) +^ Unify: (12) not(user: (nocut=cut)) +^ Exit: (12) not(user: (nocut=cut)) + Call: (12) _G5851=[[[n, f], [[v, c], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]] + Exit: (12) [[[n, f], [[v, c], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]]=[[[n, f], [[v, c], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]] + Call: (12) interpretbody([[[n, f], [[v, c], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]], [[v, d], ["b", "a"]]], _G5854, [], _G5856) + Unify: (12) interpretbody([[[n, f], [[v, c], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]], [[v, d], ["b", "a"]]], [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]], [[v, d], ["b", "a"]]], [], true) + Exit: (12) interpretbody([[[n, f], [[v, c], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]], [[v, d], ["b", "a"]]], [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]], [[v, d], ["b", "a"]]], [], true) + Call: (12) logicalconjunction(true, true, true) + Unify: (12) logicalconjunction(true, true, true) + Call: (13) true(true) + Unify: (13) true(true) + Exit: (13) true(true) 
+ Call: (13) true(true) + Unify: (13) true(true) + Exit: (13) true(true) + Exit: (12) logicalconjunction(true, true, true) + Exit: (11) interpretbody([[[n, f], [[v, c], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]], [[v, d], empty]], [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]], [[v, d], ["b", "a"]]], [[[n, =], [[v, d], [v, a]]]], true) + Call: (11) updatevars([[[v, d], [v, d]]], [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]], [[v, d], ["b", "a"]]], [], _G5854) + Unify: (11) updatevars([[[v, d], [v, d]]], [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]], [[v, d], ["b", "a"]]], [], _G5854) + Call: (12) [[[v, d], [v, d]]]=[[_G5847, _G5850]|_G5845] + Exit: (12) [[[v, d], [v, d]]]=[[[v, d], [v, d]]] + Call: (12) expressionnotatom([v, d]) + Unify: (12) expressionnotatom([v, d]) + Call: (13) isvalstrempty([v, d]) + Unify: (13) isvalstrempty([v, d]) + Call: (14) var([v, d]) + Fail: (14) var([v, d]) + Redo: (13) isvalstrempty([v, d]) + Unify: (13) isvalstrempty([v, d]) + Call: (14) isval([v, d]) + Unify: (14) isval([v, d]) + Call: (15) number([v, d]) + Fail: (15) number([v, d]) + Fail: (14) isval([v, d]) + Redo: (13) isvalstrempty([v, d]) + Unify: (13) isvalstrempty([v, d]) + Call: (14) string([v, d]) + Fail: (14) string([v, d]) + Fail: (13) isvalstrempty([v, d]) + Redo: (12) expressionnotatom([v, d]) + Unify: (12) expressionnotatom([v, d]) +^ Call: (13) not(atom([v, d])) +^ Unify: (13) not(user:atom([v, d])) +^ Exit: (13) not(user:atom([v, d])) + Call: (13) length([v, d], _G5866) + Unify: (13) length([v, d], _G5866) + Exit: (13) length([v, d], 2) + Call: (13) 2>=1 + Exit: (13) 2>=1 + Call: (13) expressionnotatom2([v, d]) + Unify: (13) expressionnotatom2([v, d]) + Call: (14) isvalstrempty(v) + Unify: (14) isvalstrempty(v) + Call: (15) var(v) + Fail: (15) var(v) + Redo: (14) isvalstrempty(v) + Unify: (14) isvalstrempty(v) + Call: (15) isval(v) + Unify: (15) isval(v) + 
Call: (16) number(v) + Fail: (16) number(v) + Fail: (15) isval(v) + Redo: (14) isvalstrempty(v) + Unify: (14) isvalstrempty(v) + Call: (15) string(v) + Fail: (15) string(v) + Fail: (14) isvalstrempty(v) + Fail: (13) expressionnotatom2([v, d]) + Redo: (12) expressionnotatom([v, d]) + Unify: (12) expressionnotatom([v, d]) + Call: (13) predicate_or_rule_name([v, d]) + Fail: (13) predicate_or_rule_name([v, d]) + Fail: (12) expressionnotatom([v, d]) + Redo: (11) updatevars([[[v, d], [v, d]]], [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]], [[v, d], ["b", "a"]]], [], _G5863) + Call: (12) lists:member([[v, d], _G5856], [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]], [[v, d], ["b", "a"]]]) + Unify: (12) lists:member([[v, d], _G5856], [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]], [[v, d], ["b", "a"]]]) + Exit: (12) lists:member([[v, d], ["b", "a"]], [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]], [[v, d], ["b", "a"]]]) + Call: (12) lists:append([], [[[v, d], ["b", "a"]]], _G5877) + Unify: (12) lists:append([], [[[v, d], ["b", "a"]]], [[[v, d], ["b", "a"]]]) + Exit: (12) lists:append([], [[[v, d], ["b", "a"]]], [[[v, d], ["b", "a"]]]) + Call: (12) updatevars([], [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]], [[v, d], ["b", "a"]]], [[[v, d], ["b", "a"]]], _G5878) + Unify: (12) updatevars([], [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]], [[v, d], ["b", "a"]]], [[[v, d], ["b", "a"]]], [[[v, d], ["b", "a"]]]) + Exit: (12) updatevars([], [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]], [[v, d], ["b", "a"]]], [[[v, d], ["b", "a"]]], [[[v, d], ["b", "a"]]]) + Exit: (11) updatevars([[[v, d], [v, d]]], [[[v, c], ["c", "b"]], [[v, a], ["b", "a"]], [[v, d], ["b", "a"]]], [], [[[v, d], ["b", "a"]]]) +^ Call: (11) not([[[v, d], ["b", "a"]]]=[]) +^ Unify: (11) not(user: ([[[v, d], ["b", "a"]]]=[])) +^ Exit: (11) not(user: ([[[v, d], ["b", "a"]]]=[])) + Call: (11) unique1([[[v, d], ["b", "a"]]], [], _G14) + Unify: (11) unique1([[[v, d], ["b", "a"]]], [], _G14) + Call: (12) lists:delete([], [[v, d], ["b", 
"a"]], _G5883) + Unify: (12) lists:delete([], [[v, d], ["b", "a"]], []) + Exit: (12) lists:delete([], [[v, d], ["b", "a"]], []) + Call: (12) lists:append([], [[[v, d], ["b", "a"]]], _G5886) + Unify: (12) lists:append([], [[[v, d], ["b", "a"]]], [[[v, d], ["b", "a"]]]) + Exit: (12) lists:append([], [[[v, d], ["b", "a"]]], [[[v, d], ["b", "a"]]]) + Call: (12) unique1([], [[[v, d], ["b", "a"]]], _G14) + Unify: (12) unique1([], [[[v, d], ["b", "a"]]], [[[v, d], ["b", "a"]]]) + Exit: (12) unique1([], [[[v, d], ["b", "a"]]], [[[v, d], ["b", "a"]]]) + Exit: (11) unique1([[[v, d], ["b", "a"]]], [], [[[v, d], ["b", "a"]]]) + Call: (11) findresult3([["c", "b"], ["b", "a"], [v, d]], [[[v, d], ["b", "a"]]], [], _G5887) + Unify: (11) findresult3([["c", "b"], ["b", "a"], [v, d]], [[[v, d], ["b", "a"]]], [], _G5887) + Call: (12) [["c", "b"], ["b", "a"], [v, d]]=[_G5877|_G5878] + Exit: (12) [["c", "b"], ["b", "a"], [v, d]]=[["c", "b"], ["b", "a"], [v, d]] + Call: (12) expressionnotatom(["c", "b"]) + Unify: (12) expressionnotatom(["c", "b"]) + Call: (13) isvalstrempty(["c", "b"]) + Unify: (13) isvalstrempty(["c", "b"]) + Call: (14) var(["c", "b"]) + Fail: (14) var(["c", "b"]) + Redo: (13) isvalstrempty(["c", "b"]) + Unify: (13) isvalstrempty(["c", "b"]) + Call: (14) isval(["c", "b"]) + Unify: (14) isval(["c", "b"]) + Call: (15) number(["c", "b"]) + Fail: (15) number(["c", "b"]) + Fail: (14) isval(["c", "b"]) + Redo: (13) isvalstrempty(["c", "b"]) + Unify: (13) isvalstrempty(["c", "b"]) + Call: (14) string(["c", "b"]) + Fail: (14) string(["c", "b"]) + Fail: (13) isvalstrempty(["c", "b"]) + Redo: (12) expressionnotatom(["c", "b"]) + Unify: (12) expressionnotatom(["c", "b"]) +^ Call: (13) not(atom(["c", "b"])) +^ Unify: (13) not(user:atom(["c", "b"])) +^ Exit: (13) not(user:atom(["c", "b"])) + Call: (13) length(["c", "b"], _G5893) + Unify: (13) length(["c", "b"], _G5893) + Exit: (13) length(["c", "b"], 2) + Call: (13) 2>=1 + Exit: (13) 2>=1 + Call: (13) expressionnotatom2(["c", "b"]) 
+ Unify: (13) expressionnotatom2(["c", "b"]) + Call: (14) isvalstrempty("c") + Unify: (14) isvalstrempty("c") + Call: (15) var("c") + Fail: (15) var("c") + Redo: (14) isvalstrempty("c") + Unify: (14) isvalstrempty("c") + Call: (15) isval("c") + Unify: (15) isval("c") + Call: (16) number("c") + Fail: (16) number("c") + Fail: (15) isval("c") + Redo: (14) isvalstrempty("c") + Unify: (14) isvalstrempty("c") + Call: (15) string("c") + Exit: (15) string("c") + Exit: (14) isvalstrempty("c") + Call: (14) expressionnotatom2(["b"]) + Unify: (14) expressionnotatom2(["b"]) + Call: (15) isvalstrempty("b") + Unify: (15) isvalstrempty("b") + Call: (16) var("b") + Fail: (16) var("b") + Redo: (15) isvalstrempty("b") + Unify: (15) isvalstrempty("b") + Call: (16) isval("b") + Unify: (16) isval("b") + Call: (17) number("b") + Fail: (17) number("b") + Fail: (16) isval("b") + Redo: (15) isvalstrempty("b") + Unify: (15) isvalstrempty("b") + Call: (16) string("b") + Exit: (16) string("b") + Exit: (15) isvalstrempty("b") + Call: (15) expressionnotatom2([]) + Unify: (15) expressionnotatom2([]) + Exit: (15) expressionnotatom2([]) + Exit: (14) expressionnotatom2(["b"]) + Exit: (13) expressionnotatom2(["c", "b"]) + Exit: (12) expressionnotatom(["c", "b"]) + Call: (12) lists:append([], [["c", "b"]], _G5897) + Unify: (12) lists:append([], [["c", "b"]], [["c", "b"]]) + Exit: (12) lists:append([], [["c", "b"]], [["c", "b"]]) + Call: (12) findresult3([["b", "a"], [v, d]], [[[v, d], ["b", "a"]]], [["c", "b"]], _G5898) + Unify: (12) findresult3([["b", "a"], [v, d]], [[[v, d], ["b", "a"]]], [["c", "b"]], _G5898) + Call: (13) [["b", "a"], [v, d]]=[_G5888|_G5889] + Exit: (13) [["b", "a"], [v, d]]=[["b", "a"], [v, d]] + Call: (13) expressionnotatom(["b", "a"]) + Unify: (13) expressionnotatom(["b", "a"]) + Call: (14) isvalstrempty(["b", "a"]) + Unify: (14) isvalstrempty(["b", "a"]) + Call: (15) var(["b", "a"]) + Fail: (15) var(["b", "a"]) + Redo: (14) isvalstrempty(["b", "a"]) + Unify: (14) 
isvalstrempty(["b", "a"]) + Call: (15) isval(["b", "a"]) + Unify: (15) isval(["b", "a"]) + Call: (16) number(["b", "a"]) + Fail: (16) number(["b", "a"]) + Fail: (15) isval(["b", "a"]) + Redo: (14) isvalstrempty(["b", "a"]) + Unify: (14) isvalstrempty(["b", "a"]) + Call: (15) string(["b", "a"]) + Fail: (15) string(["b", "a"]) + Fail: (14) isvalstrempty(["b", "a"]) + Redo: (13) expressionnotatom(["b", "a"]) + Unify: (13) expressionnotatom(["b", "a"]) +^ Call: (14) not(atom(["b", "a"])) +^ Unify: (14) not(user:atom(["b", "a"])) +^ Exit: (14) not(user:atom(["b", "a"])) + Call: (14) length(["b", "a"], _G5904) + Unify: (14) length(["b", "a"], _G5904) + Exit: (14) length(["b", "a"], 2) + Call: (14) 2>=1 + Exit: (14) 2>=1 + Call: (14) expressionnotatom2(["b", "a"]) + Unify: (14) expressionnotatom2(["b", "a"]) + Call: (15) isvalstrempty("b") + Unify: (15) isvalstrempty("b") + Call: (16) var("b") + Fail: (16) var("b") + Redo: (15) isvalstrempty("b") + Unify: (15) isvalstrempty("b") + Call: (16) isval("b") + Unify: (16) isval("b") + Call: (17) number("b") + Fail: (17) number("b") + Fail: (16) isval("b") + Redo: (15) isvalstrempty("b") + Unify: (15) isvalstrempty("b") + Call: (16) string("b") + Exit: (16) string("b") + Exit: (15) isvalstrempty("b") + Call: (15) expressionnotatom2(["a"]) + Unify: (15) expressionnotatom2(["a"]) + Call: (16) isvalstrempty("a") + Unify: (16) isvalstrempty("a") + Call: (17) var("a") + Fail: (17) var("a") + Redo: (16) isvalstrempty("a") + Unify: (16) isvalstrempty("a") + Call: (17) isval("a") + Unify: (17) isval("a") + Call: (18) number("a") + Fail: (18) number("a") + Fail: (17) isval("a") + Redo: (16) isvalstrempty("a") + Unify: (16) isvalstrempty("a") + Call: (17) string("a") + Exit: (17) string("a") + Exit: (16) isvalstrempty("a") + Call: (16) expressionnotatom2([]) + Unify: (16) expressionnotatom2([]) + Exit: (16) expressionnotatom2([]) + Exit: (15) expressionnotatom2(["a"]) + Exit: (14) expressionnotatom2(["b", "a"]) + Exit: (13) 
expressionnotatom(["b", "a"]) + Call: (13) lists:append([["c", "b"]], [["b", "a"]], _G5908) + Unify: (13) lists:append([["c", "b"]], [["b", "a"]], [["c", "b"]|_G5900]) + Exit: (13) lists:append([["c", "b"]], [["b", "a"]], [["c", "b"], ["b", "a"]]) + Call: (13) findresult3([[v, d]], [[[v, d], ["b", "a"]]], [["c", "b"], ["b", "a"]], _G5912) + Unify: (13) findresult3([[v, d]], [[[v, d], ["b", "a"]]], [["c", "b"], ["b", "a"]], _G5912) + Call: (14) [[v, d]]=[_G5902|_G5903] + Exit: (14) [[v, d]]=[[v, d]] + Call: (14) expressionnotatom([v, d]) + Unify: (14) expressionnotatom([v, d]) + Call: (15) isvalstrempty([v, d]) + Unify: (15) isvalstrempty([v, d]) + Call: (16) var([v, d]) + Fail: (16) var([v, d]) + Redo: (15) isvalstrempty([v, d]) + Unify: (15) isvalstrempty([v, d]) + Call: (16) isval([v, d]) + Unify: (16) isval([v, d]) + Call: (17) number([v, d]) + Fail: (17) number([v, d]) + Fail: (16) isval([v, d]) + Redo: (15) isvalstrempty([v, d]) + Unify: (15) isvalstrempty([v, d]) + Call: (16) string([v, d]) + Fail: (16) string([v, d]) + Fail: (15) isvalstrempty([v, d]) + Redo: (14) expressionnotatom([v, d]) + Unify: (14) expressionnotatom([v, d]) +^ Call: (15) not(atom([v, d])) +^ Unify: (15) not(user:atom([v, d])) +^ Exit: (15) not(user:atom([v, d])) + Call: (15) length([v, d], _G5918) + Unify: (15) length([v, d], _G5918) + Exit: (15) length([v, d], 2) + Call: (15) 2>=1 + Exit: (15) 2>=1 + Call: (15) expressionnotatom2([v, d]) + Unify: (15) expressionnotatom2([v, d]) + Call: (16) isvalstrempty(v) + Unify: (16) isvalstrempty(v) + Call: (17) var(v) + Fail: (17) var(v) + Redo: (16) isvalstrempty(v) + Unify: (16) isvalstrempty(v) + Call: (17) isval(v) + Unify: (17) isval(v) + Call: (18) number(v) + Fail: (18) number(v) + Fail: (17) isval(v) + Redo: (16) isvalstrempty(v) + Unify: (16) isvalstrempty(v) + Call: (17) string(v) + Fail: (17) string(v) + Fail: (16) isvalstrempty(v) + Fail: (15) expressionnotatom2([v, d]) + Redo: (14) expressionnotatom([v, d]) + Unify: (14) 
expressionnotatom([v, d]) + Call: (15) predicate_or_rule_name([v, d]) + Fail: (15) predicate_or_rule_name([v, d]) + Fail: (14) expressionnotatom([v, d]) + Redo: (13) findresult3([[v, d]], [[[v, d], ["b", "a"]]], [["c", "b"], ["b", "a"]], _G5912) + Unify: (13) findresult3([[v, d]], [[[v, d], ["b", "a"]]], [["c", "b"], ["b", "a"]], _G5912) + Call: (14) [[v, d]]=[_G5902|_G5903] + Exit: (14) [[v, d]]=[[v, d]] + Call: (14) isvar([v, d]) + Unify: (14) isvar([v, d]) + Exit: (14) isvar([v, d]) + Call: (14) lists:member([[v, d], _G5908], [[[v, d], ["b", "a"]]]) + Unify: (14) lists:member([[v, d], _G5908], [[[v, d], ["b", "a"]]]) + Exit: (14) lists:member([[v, d], ["b", "a"]], [[[v, d], ["b", "a"]]]) + Call: (14) lists:append([["c", "b"], ["b", "a"]], [["b", "a"]], _G5923) + Unify: (14) lists:append([["c", "b"], ["b", "a"]], [["b", "a"]], [["c", "b"]|_G5915]) + Exit: (14) lists:append([["c", "b"], ["b", "a"]], [["b", "a"]], [["c", "b"], ["b", "a"], ["b", "a"]]) + Call: (14) findresult3([], [[[v, d], ["b", "a"]]], [["c", "b"], ["b", "a"], ["b", "a"]], _G5930) + Unify: (14) findresult3([], [[[v, d], ["b", "a"]]], [["c", "b"], ["b", "a"], ["b", "a"]], [["c", "b"], ["b", "a"], ["b", "a"]]) + Exit: (14) findresult3([], [[[v, d], ["b", "a"]]], [["c", "b"], ["b", "a"], ["b", "a"]], [["c", "b"], ["b", "a"], ["b", "a"]]) + Exit: (13) findresult3([[v, d]], [[[v, d], ["b", "a"]]], [["c", "b"], ["b", "a"]], [["c", "b"], ["b", "a"], ["b", "a"]]) + Exit: (12) findresult3([["b", "a"], [v, d]], [[[v, d], ["b", "a"]]], [["c", "b"]], [["c", "b"], ["b", "a"], ["b", "a"]]) + Exit: (11) findresult3([["c", "b"], ["b", "a"], [v, d]], [[[v, d], ["b", "a"]]], [], [["c", "b"], ["b", "a"], ["b", "a"]]) + Call: (11) debug(on) + Fail: (11) debug(on) + Redo: (10) member1([[n, f], [["c", "b"], ["b", "a"], [v, d]]], [[[n, f], [[v, c], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, d], ["b", "a"]]]) + Call: (11) true + Exit: (11) 
true + Call: (11) true + Exit: (11) true + Exit: (10) member1([[n, f], [["c", "b"], ["b", "a"], [v, d]]], [[[n, f], [[v, c], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, d], ["b", "a"]]]) + Exit: (9) interpret1(off, [[n, f], [["c", "b"], ["b", "a"], [v, d]]], [[[n, f], [[v, c], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, d], ["b", "a"]]]) + Exit: (8) interpret(off, [[n, f], [["c", "b"], ["b", "a"], [v, d]]], [[[n, f], [[v, c], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, d], ["b", "a"]]]) + + Call: (8) interpret(off, [[n, f], [["a", "c"], ["a", "b"], [v, d]]], [[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], _G15) + Unify: (8) interpret(off, [[n, f], [["a", "c"], ["a", "b"], [v, d]]], [[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], _G15) + Call: (9) convert_to_grammar_part1([[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [], _G5552) + Unify: (9) convert_to_grammar_part1([[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [], _G5552) + Call: (10) convert_to_grammar_part11([[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [], _G5552, [], _G5554) + Unify: (10) convert_to_grammar_part11([[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [], _G5552, [], _G5554) + Call: (11) [[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]]=[_G5543|_G5544] + Exit: (11) [[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]]=[[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]] + Call: (11) [[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n, =], [[...|...]|...]]]]=[[n, _G5552], _G5555, "->", _G5564] + Fail: (11) [[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n, =], [[...|...]|...]]]]=[[n, _G5552], _G5555, "->", _G5564] + Redo: (10) convert_to_grammar_part11([[[n, f], [[v, b], [v, a], 
[v, d]], ":-", [[[n|...], [...|...]]]]], [], _G5555, [], _G5557) + Call: (11) [[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n, =], [[...|...]|...]]]]=[[n, _G5552], "->", _G5561] + Fail: (11) [[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n, =], [[...|...]|...]]]]=[[n, _G5552], "->", _G5561] + Redo: (10) convert_to_grammar_part11([[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [], _G5552, [], _G5554) + Unify: (10) convert_to_grammar_part11([[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [], _G5552, [], _G5554) + Call: (11) [[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]]=[_G5543|_G5544] + Exit: (11) [[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]]=[[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]] + Call: (11) [[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n, =], [[...|...]|...]]]]=[_G5546, _G5549, ":-", _G5558] + Exit: (11) [[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n, =], [[...|...]|...]]]]=[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n, =], [[...|...]|...]]]] + Call: (11) true + Exit: (11) true + Call: (11) true + Exit: (11) true + Call: (11) true + Exit: (11) true + Call: (11) true + Exit: (11) true + Call: (11) lists:append([], [[[n, f], [[v, b], [v, a], [v|...]], ":-", [[[...|...]|...]]]], _G5573) + Unify: (11) lists:append([], [[[n, f], [[v, b], [v, a], [v|...]], ":-", [[[...|...]|...]]]], [[[n, f], [[v, b], [v, a], [v|...]], ":-", [[[...|...]|...]]]]) + Exit: (11) lists:append([], [[[n, f], [[v, b], [v, a], [v|...]], ":-", [[[...|...]|...]]]], [[[n, f], [[v, b], [v, a], [v|...]], ":-", [[[...|...]|...]]]]) + Call: (11) convert_to_grammar_part11([], [[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], _G5573, [], _G5575) + Unify: (11) convert_to_grammar_part11([], [[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [], []) + Exit: (11) convert_to_grammar_part11([], [[[n, f], [[v, 
b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [], []) + Exit: (10) convert_to_grammar_part11([[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [], [[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [], []) + Call: (10) lists:append([[[n, f], [[v, b], [v, a], [v|...]], ":-", [[[...|...]|...]]]], [], _G5573) + Unify: (10) lists:append([[[n, f], [[v, b], [v, a], [v|...]], ":-", [[[...|...]|...]]]], [], [[[n, f], [[v, b], [v, a], [v|...]], ":-", [[[...|...]|...]]]|_G5565]) + Exit: (10) lists:append([[[n, f], [[v, b], [v, a], [v|...]], ":-", [[[...|...]|...]]]], [], [[[n, f], [[v, b], [v, a], [v|...]], ":-", [[[...|...]|...]]]]) + Exit: (9) convert_to_grammar_part1([[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [], [[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]]) + Call: (9) interpret1(off, [[n, f], [["a", "c"], ["a", "b"], [v, d]]], [[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], _G15) + Unify: (9) interpret1(off, [[n, f], [["a", "c"], ["a", "b"], [v, d]]], [[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], _G15) +^ Call: (10) retractall(debug(_G5567)) +^ Exit: (10) retractall(debug(_G5567)) +^ Call: (10) assertz(debug(off)) +^ Exit: (10) assertz(debug(off)) +^ Call: (10) retractall(cut(_G5571)) +^ Exit: (10) retractall(cut(_G5571)) +^ Call: (10) assertz(cut(off)) +^ Exit: (10) assertz(cut(off)) + Call: (10) member1([[n, f], [["a", "c"], ["a", "b"], [v, d]]], [[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], _G15) + Unify: (10) member1([[n, f], [["a", "c"], ["a", "b"], [v, d]]], [[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, b], [v, 
a], [v, d]], ":-", [[[n|...], [...|...]]]]], _G15) + Call: (11) cut(off) + Unify: (11) cut(off) + Exit: (11) cut(off) + Call: (11) [[n, f], [["a", "c"], ["a", "b"], [v, d]]]=[_G5575, _G5578] + Exit: (11) [[n, f], [["a", "c"], ["a", "b"], [v, d]]]=[[n, f], [["a", "c"], ["a", "b"], [v, d]]] + Call: (11) [[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]]=[[[n, f], _G5587, ":-", _G5596]|_G5582] + Exit: (11) [[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]]=[[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]] + Call: (11) length([["a", "c"], ["a", "b"], [v, d]], _G5607) + Unify: (11) length([["a", "c"], ["a", "b"], [v, d]], _G5607) + Exit: (11) length([["a", "c"], ["a", "b"], [v, d]], 3) + Call: (11) length([[v, b], [v, a], [v, d]], 3) + Unify: (11) length([[v, b], [v, a], [v, d]], 3) + Unify: (11) length([[v, b], [v, a], [v, d]], 3) + Exit: (11) length([[v, b], [v, a], [v, d]], 3) + Call: (11) [n, f]=[n, grammar] + Fail: (11) [n, f]=[n, grammar] + Redo: (10) member1([[n, f], [["a", "c"], ["a", "b"], [v, d]]], [[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], _G15) + Call: (11) [n, f]=[n, grammar_part] + Fail: (11) [n, f]=[n, grammar_part] + Redo: (10) member1([[n, f], [["a", "c"], ["a", "b"], [v, d]]], [[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], _G15) + Call: (11) checkarguments([["a", "c"], ["a", "b"], [v, d]], [[v, b], [v, a], [v, d]], [], _G5609, [], _G5611) + Unify: (11) checkarguments([["a", "c"], ["a", "b"], [v, d]], [[v, b], [v, a], [v, d]], [], _G5609, [], _G5611) + Call: (12) [["a", "c"], ["a", "b"], [v, d]]=[_G5599|_G5600] + Exit: (12) [["a", "c"], ["a", "b"], [v, d]]=[["a", "c"], ["a", "b"], [v, d]] + Call: (12) expressionnotatom(["a", "c"]) + Unify: (12) expressionnotatom(["a", "c"]) + Call: (13) isvalstrempty(["a", "c"]) + Unify: (13) 
isvalstrempty(["a", "c"]) + Call: (14) var(["a", "c"]) + Fail: (14) var(["a", "c"]) + Redo: (13) isvalstrempty(["a", "c"]) + Unify: (13) isvalstrempty(["a", "c"]) + Call: (14) isval(["a", "c"]) + Unify: (14) isval(["a", "c"]) + Call: (15) number(["a", "c"]) + Fail: (15) number(["a", "c"]) + Fail: (14) isval(["a", "c"]) + Redo: (13) isvalstrempty(["a", "c"]) + Unify: (13) isvalstrempty(["a", "c"]) + Call: (14) string(["a", "c"]) + Fail: (14) string(["a", "c"]) + Fail: (13) isvalstrempty(["a", "c"]) + Redo: (12) expressionnotatom(["a", "c"]) + Unify: (12) expressionnotatom(["a", "c"]) +^ Call: (13) not(atom(["a", "c"])) +^ Unify: (13) not(user:atom(["a", "c"])) +^ Exit: (13) not(user:atom(["a", "c"])) + Call: (13) length(["a", "c"], _G5615) + Unify: (13) length(["a", "c"], _G5615) + Exit: (13) length(["a", "c"], 2) + Call: (13) 2>=1 + Exit: (13) 2>=1 + Call: (13) expressionnotatom2(["a", "c"]) + Unify: (13) expressionnotatom2(["a", "c"]) + Call: (14) isvalstrempty("a") + Unify: (14) isvalstrempty("a") + Call: (15) var("a") + Fail: (15) var("a") + Redo: (14) isvalstrempty("a") + Unify: (14) isvalstrempty("a") + Call: (15) isval("a") + Unify: (15) isval("a") + Call: (16) number("a") + Fail: (16) number("a") + Fail: (15) isval("a") + Redo: (14) isvalstrempty("a") + Unify: (14) isvalstrempty("a") + Call: (15) string("a") + Exit: (15) string("a") + Exit: (14) isvalstrempty("a") + Call: (14) expressionnotatom2(["c"]) + Unify: (14) expressionnotatom2(["c"]) + Call: (15) isvalstrempty("c") + Unify: (15) isvalstrempty("c") + Call: (16) var("c") + Fail: (16) var("c") + Redo: (15) isvalstrempty("c") + Unify: (15) isvalstrempty("c") + Call: (16) isval("c") + Unify: (16) isval("c") + Call: (17) number("c") + Fail: (17) number("c") + Fail: (16) isval("c") + Redo: (15) isvalstrempty("c") + Unify: (15) isvalstrempty("c") + Call: (16) string("c") + Exit: (16) string("c") + Exit: (15) isvalstrempty("c") + Call: (15) expressionnotatom2([]) + Unify: (15) expressionnotatom2([]) + Exit: 
(15) expressionnotatom2([]) + Exit: (14) expressionnotatom2(["c"]) + Exit: (13) expressionnotatom2(["a", "c"]) + Exit: (12) expressionnotatom(["a", "c"]) + Call: (12) [[v, b], [v, a], [v, d]]=[_G5607|_G5608] + Exit: (12) [[v, b], [v, a], [v, d]]=[[v, b], [v, a], [v, d]] +^ Call: (12) not(var([v, b])) +^ Unify: (12) not(user:var([v, b])) +^ Exit: (12) not(user:var([v, b])) + Call: (12) isvar([v, b]) + Unify: (12) isvar([v, b]) + Exit: (12) isvar([v, b]) + Call: (12) putvalue([v, b], ["a", "c"], [], _G5625) + Unify: (12) putvalue([v, b], ["a", "c"], [], _G5625) +^ Call: (13) not(isvar([v, b])) +^ Unify: (13) not(user:isvar([v, b])) + Call: (14) isvar([v, b]) + Unify: (14) isvar([v, b]) + Exit: (14) isvar([v, b]) +^ Fail: (13) not(user:isvar([v, b])) + Redo: (12) putvalue([v, b], ["a", "c"], [], _G5625) + Call: (13) isvar([v, b]) + Unify: (13) isvar([v, b]) + Exit: (13) isvar([v, b]) + Call: (13) isvalstrorundef(["a", "c"]) + Unify: (13) isvalstrorundef(["a", "c"]) + Call: (14) var(["a", "c"]) + Fail: (14) var(["a", "c"]) + Redo: (13) isvalstrorundef(["a", "c"]) + Unify: (13) isvalstrorundef(["a", "c"]) +^ Call: (14) not(var(["a", "c"])) +^ Unify: (14) not(user:var(["a", "c"])) +^ Exit: (14) not(user:var(["a", "c"])) + Call: (14) isval(["a", "c"]) + Unify: (14) isval(["a", "c"]) + Call: (15) number(["a", "c"]) + Fail: (15) number(["a", "c"]) + Fail: (14) isval(["a", "c"]) + Redo: (13) isvalstrorundef(["a", "c"]) + Unify: (13) isvalstrorundef(["a", "c"]) +^ Call: (14) not(var(["a", "c"])) +^ Unify: (14) not(user:var(["a", "c"])) +^ Exit: (14) not(user:var(["a", "c"])) + Call: (14) expression(["a", "c"]) + Unify: (14) expression(["a", "c"]) + Call: (15) isval(["a", "c"]) + Unify: (15) isval(["a", "c"]) + Call: (16) number(["a", "c"]) + Fail: (16) number(["a", "c"]) + Fail: (15) isval(["a", "c"]) + Redo: (14) expression(["a", "c"]) + Unify: (14) expression(["a", "c"]) + Call: (15) string(["a", "c"]) + Fail: (15) string(["a", "c"]) + Redo: (14) expression(["a", "c"]) + 
Unify: (14) expression(["a", "c"]) + Call: (15) atom(["a", "c"]) + Fail: (15) atom(["a", "c"]) + Redo: (14) expression(["a", "c"]) + Unify: (14) expression(["a", "c"]) +^ Call: (15) not(atom(["a", "c"])) +^ Unify: (15) not(user:atom(["a", "c"])) +^ Exit: (15) not(user:atom(["a", "c"])) + Call: (15) length(["a", "c"], _G5633) + Unify: (15) length(["a", "c"], _G5633) + Exit: (15) length(["a", "c"], 2) + Call: (15) 2>=1 + Exit: (15) 2>=1 + Call: (15) expression2(["a", "c"]) + Unify: (15) expression2(["a", "c"]) + Call: (16) expression3("a") + Unify: (16) expression3("a") + Call: (17) isval("a") + Unify: (17) isval("a") + Call: (18) number("a") + Fail: (18) number("a") + Fail: (17) isval("a") + Redo: (16) expression3("a") + Unify: (16) expression3("a") + Call: (17) string("a") + Exit: (17) string("a") + Exit: (16) expression3("a") + Call: (16) expression2(["c"]) + Unify: (16) expression2(["c"]) + Call: (17) expression3("c") + Unify: (17) expression3("c") + Call: (18) isval("c") + Unify: (18) isval("c") + Call: (19) number("c") + Fail: (19) number("c") + Fail: (18) isval("c") + Redo: (17) expression3("c") + Unify: (17) expression3("c") + Call: (18) string("c") + Exit: (18) string("c") + Exit: (17) expression3("c") + Call: (17) expression2([]) + Unify: (17) expression2([]) + Exit: (17) expression2([]) + Exit: (16) expression2(["c"]) + Exit: (15) expression2(["a", "c"]) + Exit: (14) expression(["a", "c"]) + Exit: (13) isvalstrorundef(["a", "c"]) + Call: (13) updatevar([v, b], ["a", "c"], [], _G5635) + Unify: (13) updatevar([v, b], ["a", "c"], [], _G5635) + Call: (14) lists:member([[v, b], empty], []) + Fail: (14) lists:member([[v, b], empty], []) + Redo: (13) updatevar([v, b], ["a", "c"], [], _G5635) +^ Call: (14) not(member([[v, b], _G5631], [])) +^ Unify: (14) not(user:member([[v, b], _G5631], [])) +^ Exit: (14) not(user:member([[v, b], _G5631], [])) + Call: (14) _G5631=empty + Exit: (14) empty=empty + Call: (14) true + Exit: (14) true + Call: (14) lists:append([], 
[[[v, b], ["a", "c"]]], _G5655) + Unify: (14) lists:append([], [[[v, b], ["a", "c"]]], [[[v, b], ["a", "c"]]]) + Exit: (14) lists:append([], [[[v, b], ["a", "c"]]], [[[v, b], ["a", "c"]]]) + Call: (14) true + Exit: (14) true + Call: (14) true + Exit: (14) true + Exit: (13) updatevar([v, b], ["a", "c"], [], [[[v, b], ["a", "c"]]]) + Exit: (12) putvalue([v, b], ["a", "c"], [], [[[v, b], ["a", "c"]]]) + Call: (12) checkarguments([["a", "b"], [v, d]], [[v, a], [v, d]], [[[v, b], ["a", "c"]]], _G5656, [], _G5658) + Unify: (12) checkarguments([["a", "b"], [v, d]], [[v, a], [v, d]], [[[v, b], ["a", "c"]]], _G5656, [], _G5658) + Call: (13) [["a", "b"], [v, d]]=[_G5646|_G5647] + Exit: (13) [["a", "b"], [v, d]]=[["a", "b"], [v, d]] + Call: (13) expressionnotatom(["a", "b"]) + Unify: (13) expressionnotatom(["a", "b"]) + Call: (14) isvalstrempty(["a", "b"]) + Unify: (14) isvalstrempty(["a", "b"]) + Call: (15) var(["a", "b"]) + Fail: (15) var(["a", "b"]) + Redo: (14) isvalstrempty(["a", "b"]) + Unify: (14) isvalstrempty(["a", "b"]) + Call: (15) isval(["a", "b"]) + Unify: (15) isval(["a", "b"]) + Call: (16) number(["a", "b"]) + Fail: (16) number(["a", "b"]) + Fail: (15) isval(["a", "b"]) + Redo: (14) isvalstrempty(["a", "b"]) + Unify: (14) isvalstrempty(["a", "b"]) + Call: (15) string(["a", "b"]) + Fail: (15) string(["a", "b"]) + Fail: (14) isvalstrempty(["a", "b"]) + Redo: (13) expressionnotatom(["a", "b"]) + Unify: (13) expressionnotatom(["a", "b"]) +^ Call: (14) not(atom(["a", "b"])) +^ Unify: (14) not(user:atom(["a", "b"])) +^ Exit: (14) not(user:atom(["a", "b"])) + Call: (14) length(["a", "b"], _G5662) + Unify: (14) length(["a", "b"], _G5662) + Exit: (14) length(["a", "b"], 2) + Call: (14) 2>=1 + Exit: (14) 2>=1 + Call: (14) expressionnotatom2(["a", "b"]) + Unify: (14) expressionnotatom2(["a", "b"]) + Call: (15) isvalstrempty("a") + Unify: (15) isvalstrempty("a") + Call: (16) var("a") + Fail: (16) var("a") + Redo: (15) isvalstrempty("a") + Unify: (15) isvalstrempty("a") + 
Call: (16) isval("a") + Unify: (16) isval("a") + Call: (17) number("a") + Fail: (17) number("a") + Fail: (16) isval("a") + Redo: (15) isvalstrempty("a") + Unify: (15) isvalstrempty("a") + Call: (16) string("a") + Exit: (16) string("a") + Exit: (15) isvalstrempty("a") + Call: (15) expressionnotatom2(["b"]) + Unify: (15) expressionnotatom2(["b"]) + Call: (16) isvalstrempty("b") + Unify: (16) isvalstrempty("b") + Call: (17) var("b") + Fail: (17) var("b") + Redo: (16) isvalstrempty("b") + Unify: (16) isvalstrempty("b") + Call: (17) isval("b") + Unify: (17) isval("b") + Call: (18) number("b") + Fail: (18) number("b") + Fail: (17) isval("b") + Redo: (16) isvalstrempty("b") + Unify: (16) isvalstrempty("b") + Call: (17) string("b") + Exit: (17) string("b") + Exit: (16) isvalstrempty("b") + Call: (16) expressionnotatom2([]) + Unify: (16) expressionnotatom2([]) + Exit: (16) expressionnotatom2([]) + Exit: (15) expressionnotatom2(["b"]) + Exit: (14) expressionnotatom2(["a", "b"]) + Exit: (13) expressionnotatom(["a", "b"]) + Call: (13) [[v, a], [v, d]]=[_G5654|_G5655] + Exit: (13) [[v, a], [v, d]]=[[v, a], [v, d]] +^ Call: (13) not(var([v, a])) +^ Unify: (13) not(user:var([v, a])) +^ Exit: (13) not(user:var([v, a])) + Call: (13) isvar([v, a]) + Unify: (13) isvar([v, a]) + Exit: (13) isvar([v, a]) + Call: (13) putvalue([v, a], ["a", "b"], [[[v, b], ["a", "c"]]], _G5672) + Unify: (13) putvalue([v, a], ["a", "b"], [[[v, b], ["a", "c"]]], _G5672) +^ Call: (14) not(isvar([v, a])) +^ Unify: (14) not(user:isvar([v, a])) + Call: (15) isvar([v, a]) + Unify: (15) isvar([v, a]) + Exit: (15) isvar([v, a]) +^ Fail: (14) not(user:isvar([v, a])) + Redo: (13) putvalue([v, a], ["a", "b"], [[[v, b], ["a", "c"]]], _G5672) + Call: (14) isvar([v, a]) + Unify: (14) isvar([v, a]) + Exit: (14) isvar([v, a]) + Call: (14) isvalstrorundef(["a", "b"]) + Unify: (14) isvalstrorundef(["a", "b"]) + Call: (15) var(["a", "b"]) + Fail: (15) var(["a", "b"]) + Redo: (14) isvalstrorundef(["a", "b"]) + Unify: (14) 
isvalstrorundef(["a", "b"]) +^ Call: (15) not(var(["a", "b"])) +^ Unify: (15) not(user:var(["a", "b"])) +^ Exit: (15) not(user:var(["a", "b"])) + Call: (15) isval(["a", "b"]) + Unify: (15) isval(["a", "b"]) + Call: (16) number(["a", "b"]) + Fail: (16) number(["a", "b"]) + Fail: (15) isval(["a", "b"]) + Redo: (14) isvalstrorundef(["a", "b"]) + Unify: (14) isvalstrorundef(["a", "b"]) +^ Call: (15) not(var(["a", "b"])) +^ Unify: (15) not(user:var(["a", "b"])) +^ Exit: (15) not(user:var(["a", "b"])) + Call: (15) expression(["a", "b"]) + Unify: (15) expression(["a", "b"]) + Call: (16) isval(["a", "b"]) + Unify: (16) isval(["a", "b"]) + Call: (17) number(["a", "b"]) + Fail: (17) number(["a", "b"]) + Fail: (16) isval(["a", "b"]) + Redo: (15) expression(["a", "b"]) + Unify: (15) expression(["a", "b"]) + Call: (16) string(["a", "b"]) + Fail: (16) string(["a", "b"]) + Redo: (15) expression(["a", "b"]) + Unify: (15) expression(["a", "b"]) + Call: (16) atom(["a", "b"]) + Fail: (16) atom(["a", "b"]) + Redo: (15) expression(["a", "b"]) + Unify: (15) expression(["a", "b"]) +^ Call: (16) not(atom(["a", "b"])) +^ Unify: (16) not(user:atom(["a", "b"])) +^ Exit: (16) not(user:atom(["a", "b"])) + Call: (16) length(["a", "b"], _G5680) + Unify: (16) length(["a", "b"], _G5680) + Exit: (16) length(["a", "b"], 2) + Call: (16) 2>=1 + Exit: (16) 2>=1 + Call: (16) expression2(["a", "b"]) + Unify: (16) expression2(["a", "b"]) + Call: (17) expression3("a") + Unify: (17) expression3("a") + Call: (18) isval("a") + Unify: (18) isval("a") + Call: (19) number("a") + Fail: (19) number("a") + Fail: (18) isval("a") + Redo: (17) expression3("a") + Unify: (17) expression3("a") + Call: (18) string("a") + Exit: (18) string("a") + Exit: (17) expression3("a") + Call: (17) expression2(["b"]) + Unify: (17) expression2(["b"]) + Call: (18) expression3("b") + Unify: (18) expression3("b") + Call: (19) isval("b") + Unify: (19) isval("b") + Call: (20) number("b") + Fail: (20) number("b") + Fail: (19) isval("b") + 
Redo: (18) expression3("b") + Unify: (18) expression3("b") + Call: (19) string("b") + Exit: (19) string("b") + Exit: (18) expression3("b") + Call: (18) expression2([]) + Unify: (18) expression2([]) + Exit: (18) expression2([]) + Exit: (17) expression2(["b"]) + Exit: (16) expression2(["a", "b"]) + Exit: (15) expression(["a", "b"]) + Exit: (14) isvalstrorundef(["a", "b"]) + Call: (14) updatevar([v, a], ["a", "b"], [[[v, b], ["a", "c"]]], _G5682) + Unify: (14) updatevar([v, a], ["a", "b"], [[[v, b], ["a", "c"]]], _G5682) + Call: (15) lists:member([[v, a], empty], [[[v, b], ["a", "c"]]]) + Unify: (15) lists:member([[v, a], empty], [[[v, b], ["a", "c"]]]) + Fail: (15) lists:member([[v, a], empty], [[[v, b], ["a", "c"]]]) + Redo: (14) updatevar([v, a], ["a", "b"], [[[v, b], ["a", "c"]]], _G5682) +^ Call: (15) not(member([[v, a], _G5678], [[[v, b], ["a", "c"]]])) +^ Unify: (15) not(user:member([[v, a], _G5678], [[[v, b], ["a", "c"]]])) +^ Exit: (15) not(user:member([[v, a], _G5678], [[[v, b], ["a", "c"]]])) + Call: (15) _G5678=empty + Exit: (15) empty=empty + Call: (15) true + Exit: (15) true + Call: (15) lists:append([[[v, b], ["a", "c"]]], [[[v, a], ["a", "b"]]], _G5702) + Unify: (15) lists:append([[[v, b], ["a", "c"]]], [[[v, a], ["a", "b"]]], [[[v, b], ["a", "c"]]|_G5694]) + Exit: (15) lists:append([[[v, b], ["a", "c"]]], [[[v, a], ["a", "b"]]], [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]]]) + Call: (15) true + Exit: (15) true + Call: (15) true + Exit: (15) true + Exit: (14) updatevar([v, a], ["a", "b"], [[[v, b], ["a", "c"]]], [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]]]) + Exit: (13) putvalue([v, a], ["a", "b"], [[[v, b], ["a", "c"]]], [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]]]) + Call: (13) checkarguments([[v, d]], [[v, d]], [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]]], _G5706, [], _G5708) + Unify: (13) checkarguments([[v, d]], [[v, d]], [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]]], _G5706, [], _G5708) + Call: (14) [[v, d]]=[_G5696|_G5697] + Exit: (14) [[v, 
d]]=[[v, d]] + Call: (14) expressionnotatom([v, d]) + Unify: (14) expressionnotatom([v, d]) + Call: (15) isvalstrempty([v, d]) + Unify: (15) isvalstrempty([v, d]) + Call: (16) var([v, d]) + Fail: (16) var([v, d]) + Redo: (15) isvalstrempty([v, d]) + Unify: (15) isvalstrempty([v, d]) + Call: (16) isval([v, d]) + Unify: (16) isval([v, d]) + Call: (17) number([v, d]) + Fail: (17) number([v, d]) + Fail: (16) isval([v, d]) + Redo: (15) isvalstrempty([v, d]) + Unify: (15) isvalstrempty([v, d]) + Call: (16) string([v, d]) + Fail: (16) string([v, d]) + Fail: (15) isvalstrempty([v, d]) + Redo: (14) expressionnotatom([v, d]) + Unify: (14) expressionnotatom([v, d]) +^ Call: (15) not(atom([v, d])) +^ Unify: (15) not(user:atom([v, d])) +^ Exit: (15) not(user:atom([v, d])) + Call: (15) length([v, d], _G5712) + Unify: (15) length([v, d], _G5712) + Exit: (15) length([v, d], 2) + Call: (15) 2>=1 + Exit: (15) 2>=1 + Call: (15) expressionnotatom2([v, d]) + Unify: (15) expressionnotatom2([v, d]) + Call: (16) isvalstrempty(v) + Unify: (16) isvalstrempty(v) + Call: (17) var(v) + Fail: (17) var(v) + Redo: (16) isvalstrempty(v) + Unify: (16) isvalstrempty(v) + Call: (17) isval(v) + Unify: (17) isval(v) + Call: (18) number(v) + Fail: (18) number(v) + Fail: (17) isval(v) + Redo: (16) isvalstrempty(v) + Unify: (16) isvalstrempty(v) + Call: (17) string(v) + Fail: (17) string(v) + Fail: (16) isvalstrempty(v) + Fail: (15) expressionnotatom2([v, d]) + Redo: (14) expressionnotatom([v, d]) + Unify: (14) expressionnotatom([v, d]) + Call: (15) predicate_or_rule_name([v, d]) + Fail: (15) predicate_or_rule_name([v, d]) + Fail: (14) expressionnotatom([v, d]) + Redo: (13) checkarguments([[v, d]], [[v, d]], [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]]], _G5706, [], _G5708) + Unify: (13) checkarguments([[v, d]], [[v, d]], [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]]], _G5706, [], _G5708) + Call: (14) [[v, d]]=[_G5696|_G5697] + Exit: (14) [[v, d]]=[[v, d]] +^ Call: (14) not(var([v, d])) +^ Unify: (14) 
not(user:var([v, d])) +^ Exit: (14) not(user:var([v, d])) + Call: (14) isvar([v, d]) + Unify: (14) isvar([v, d]) + Exit: (14) isvar([v, d]) + Call: (14) [[v, d]]=[_G5704|_G5705] + Exit: (14) [[v, d]]=[[v, d]] + Call: (14) expressionnotatom([v, d]) + Unify: (14) expressionnotatom([v, d]) + Call: (15) isvalstrempty([v, d]) + Unify: (15) isvalstrempty([v, d]) + Call: (16) var([v, d]) + Fail: (16) var([v, d]) + Redo: (15) isvalstrempty([v, d]) + Unify: (15) isvalstrempty([v, d]) + Call: (16) isval([v, d]) + Unify: (16) isval([v, d]) + Call: (17) number([v, d]) + Fail: (17) number([v, d]) + Fail: (16) isval([v, d]) + Redo: (15) isvalstrempty([v, d]) + Unify: (15) isvalstrempty([v, d]) + Call: (16) string([v, d]) + Fail: (16) string([v, d]) + Fail: (15) isvalstrempty([v, d]) + Redo: (14) expressionnotatom([v, d]) + Unify: (14) expressionnotatom([v, d]) +^ Call: (15) not(atom([v, d])) +^ Unify: (15) not(user:atom([v, d])) +^ Exit: (15) not(user:atom([v, d])) + Call: (15) length([v, d], _G5720) + Unify: (15) length([v, d], _G5720) + Exit: (15) length([v, d], 2) + Call: (15) 2>=1 + Exit: (15) 2>=1 + Call: (15) expressionnotatom2([v, d]) + Unify: (15) expressionnotatom2([v, d]) + Call: (16) isvalstrempty(v) + Unify: (16) isvalstrempty(v) + Call: (17) var(v) + Fail: (17) var(v) + Redo: (16) isvalstrempty(v) + Unify: (16) isvalstrempty(v) + Call: (17) isval(v) + Unify: (17) isval(v) + Call: (18) number(v) + Fail: (18) number(v) + Fail: (17) isval(v) + Redo: (16) isvalstrempty(v) + Unify: (16) isvalstrempty(v) + Call: (17) string(v) + Fail: (17) string(v) + Fail: (16) isvalstrempty(v) + Fail: (15) expressionnotatom2([v, d]) + Redo: (14) expressionnotatom([v, d]) + Unify: (14) expressionnotatom([v, d]) + Call: (15) predicate_or_rule_name([v, d]) + Fail: (15) predicate_or_rule_name([v, d]) + Fail: (14) expressionnotatom([v, d]) + Redo: (13) checkarguments([[v, d]], [[v, d]], [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]]], _G5706, [], _G5708) + Unify: (13) checkarguments([[v, d]], 
[[v, d]], [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]]], _G5706, [], _G5708) + Call: (14) [[v, d]]=[_G5696|_G5697] + Exit: (14) [[v, d]]=[[v, d]] +^ Call: (14) not(var([v, d])) +^ Unify: (14) not(user:var([v, d])) +^ Exit: (14) not(user:var([v, d])) + Call: (14) isvar([v, d]) + Unify: (14) isvar([v, d]) + Exit: (14) isvar([v, d]) + Call: (14) [[v, d]]=[_G5704|_G5705] + Exit: (14) [[v, d]]=[[v, d]] +^ Call: (14) not(var([v, d])) +^ Unify: (14) not(user:var([v, d])) +^ Exit: (14) not(user:var([v, d])) + Call: (14) isvar([v, d]) + Unify: (14) isvar([v, d]) + Exit: (14) isvar([v, d]) + Call: (14) getvalue([v, d], _G5720, [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]]]) + Unify: (14) getvalue([v, d], _G5720, [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]]]) +^ Call: (15) not(isvar([v, d])) +^ Unify: (15) not(user:isvar([v, d])) + Call: (16) isvar([v, d]) + Unify: (16) isvar([v, d]) + Exit: (16) isvar([v, d]) +^ Fail: (15) not(user:isvar([v, d])) + Redo: (14) getvalue([v, d], _G5720, [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]]]) + Call: (15) isvar([v, d]) + Unify: (15) isvar([v, d]) + Exit: (15) isvar([v, d]) + Call: (15) isvalstrorundef(_G5719) + Unify: (15) isvalstrorundef(_G5719) + Call: (16) var(_G5719) + Exit: (16) var(_G5719) + Exit: (15) isvalstrorundef(_G5719) + Call: (15) getvar([v, d], _G5720, [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]]]) + Unify: (15) getvar([v, d], _G5720, [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]]]) + Call: (16) lists:member([[v, d], _G5715], [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]]]) + Unify: (16) lists:member([[v, d], _G5715], [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]]]) + Fail: (16) lists:member([[v, d], _G5715], [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]]]) + Redo: (15) getvar([v, d], _G5720, [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]]]) + Unify: (15) getvar([v, d], empty, [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]]]) +^ Call: (16) not(member([[v, d], _G5718], [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]]])) +^ Unify: (16) 
not(user:member([[v, d], _G5718], [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]]])) +^ Exit: (16) not(user:member([[v, d], _G5718], [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]]])) + Call: (16) true + Exit: (16) true + Exit: (15) getvar([v, d], empty, [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]]]) + Exit: (14) getvalue([v, d], empty, [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]]]) + Call: (14) true + Exit: (14) true + Call: (14) putvalue([v, d], empty, [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]]], _G5734) + Unify: (14) putvalue([v, d], empty, [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]]], _G5734) +^ Call: (15) not(isvar([v, d])) +^ Unify: (15) not(user:isvar([v, d])) + Call: (16) isvar([v, d]) + Unify: (16) isvar([v, d]) + Exit: (16) isvar([v, d]) +^ Fail: (15) not(user:isvar([v, d])) + Redo: (14) putvalue([v, d], empty, [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]]], _G5734) + Call: (15) isvar([v, d]) + Unify: (15) isvar([v, d]) + Exit: (15) isvar([v, d]) + Call: (15) isvalstrorundef(empty) + Unify: (15) isvalstrorundef(empty) + Call: (16) var(empty) + Fail: (16) var(empty) + Redo: (15) isvalstrorundef(empty) + Unify: (15) isvalstrorundef(empty) +^ Call: (16) not(var(empty)) +^ Unify: (16) not(user:var(empty)) +^ Exit: (16) not(user:var(empty)) + Call: (16) isval(empty) + Unify: (16) isval(empty) + Call: (17) number(empty) + Fail: (17) number(empty) + Fail: (16) isval(empty) + Redo: (15) isvalstrorundef(empty) + Unify: (15) isvalstrorundef(empty) +^ Call: (16) not(var(empty)) +^ Unify: (16) not(user:var(empty)) +^ Exit: (16) not(user:var(empty)) + Call: (16) expression(empty) + Unify: (16) expression(empty) + Exit: (16) expression(empty) + Exit: (15) isvalstrorundef(empty) + Call: (15) updatevar([v, d], empty, [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]]], _G5739) + Unify: (15) updatevar([v, d], empty, [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]]], _G5739) + Call: (16) lists:member([[v, d], empty], [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]]]) + Unify: (16) lists:member([[v, 
d], empty], [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]]]) + Fail: (16) lists:member([[v, d], empty], [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]]]) + Redo: (15) updatevar([v, d], empty, [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]]], _G5739) +^ Call: (16) not(member([[v, d], _G5735], [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]]])) +^ Unify: (16) not(user:member([[v, d], _G5735], [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]]])) +^ Exit: (16) not(user:member([[v, d], _G5735], [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]]])) + Call: (16) _G5735=empty + Exit: (16) empty=empty + Call: (16) true + Exit: (16) true + Call: (16) lists:append([[[v, b], ["a", "c"]], [[v, a], ["a", "b"]]], [[[v, d], empty]], _G5759) + Unify: (16) lists:append([[[v, b], ["a", "c"]], [[v, a], ["a", "b"]]], [[[v, d], empty]], [[[v, b], ["a", "c"]]|_G5751]) + Exit: (16) lists:append([[[v, b], ["a", "c"]], [[v, a], ["a", "b"]]], [[[v, d], empty]], [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]], [[v, d], empty]]) + Call: (16) true + Exit: (16) true + Call: (16) true + Exit: (16) true + Exit: (15) updatevar([v, d], empty, [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]]], [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]], [[v, d], empty]]) + Exit: (14) putvalue([v, d], empty, [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]]], [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]], [[v, d], empty]]) + Call: (14) lists:append([], [[[v, d], [v, d]]], _G5774) + Unify: (14) lists:append([], [[[v, d], [v, d]]], [[[v, d], [v, d]]]) + Exit: (14) lists:append([], [[[v, d], [v, d]]], [[[v, d], [v, d]]]) + Call: (14) checkarguments([], [], [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]], [[v, d], empty]], _G5775, [[[v, d], [v, d]]], _G5777) + Unify: (14) checkarguments([], [], [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]], [[v, d], empty]], [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]], [[v, d], empty]], [[[v, d], [v, d]]], [[[v, d], [v, d]]]) + Exit: (14) checkarguments([], [], [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]], [[v, d], empty]], [[[v, b], ["a", 
"c"]], [[v, a], ["a", "b"]], [[v, d], empty]], [[[v, d], [v, d]]], [[[v, d], [v, d]]]) + Exit: (13) checkarguments([[v, d]], [[v, d]], [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]]], [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]], [[v, d], empty]], [], [[[v, d], [v, d]]]) + Exit: (12) checkarguments([["a", "b"], [v, d]], [[v, a], [v, d]], [[[v, b], ["a", "c"]]], [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]], [[v, d], empty]], [], [[[v, d], [v, d]]]) + Exit: (11) checkarguments([["a", "c"], ["a", "b"], [v, d]], [[v, b], [v, a], [v, d]], [], [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]], [[v, d], empty]], [], [[[v, d], [v, d]]]) + Call: (11) debug(on) + Fail: (11) debug(on) + Redo: (10) member1([[n, f], [["a", "c"], ["a", "b"], [v, d]]], [[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], _G15) + Call: (11) true + Exit: (11) true + Call: (11) interpretbody([[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]], [[v, d], empty]], _G5775, [[[n, =], [[v, d], [v, b]]]], true) + Unify: (11) interpretbody([[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]], [[v, d], empty]], _G5775, [[[n, =], [[v, d], [v, b]]]], true) + Call: (12) [[[n, =], [[v, d], [v, b]]]]=[_G5765|_G5766] + Exit: (12) [[[n, =], [[v, d], [v, b]]]]=[[[n, =], [[v, d], [v, b]]]] +^ Call: (12) not(predicate_or_rule_name([[n, =], [[v, d], [v, b]]])) +^ Unify: (12) not(user:predicate_or_rule_name([[n, =], [[v, d], [v, b]]])) + Call: (13) predicate_or_rule_name([[n, =], [[v, d], [v, b]]]) + Fail: (13) predicate_or_rule_name([[n, =], [[v, d], [v, b]]]) +^ Exit: (12) not(user:predicate_or_rule_name([[n, =], [[v, d], [v, b]]])) + Call: (12) interpretbody([[[n, f], [[v, b], [v, a], [v, d]], 
":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]], [[v, d], empty]], _G5783, [[n, =], [[v, d], [v, b]]], _G5785) + Unify: (12) interpretbody([[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]], [[v, d], empty]], _G5783, [[n, =], [[v, d], [v, b]]], _G5785) + Call: (13) [[n, =], [[v, d], [v, b]]]=[_G5773|_G5774] + Exit: (13) [[n, =], [[v, d], [v, b]]]=[[n, =], [[v, d], [v, b]]] +^ Call: (13) not(predicate_or_rule_name([n, =])) +^ Unify: (13) not(user:predicate_or_rule_name([n, =])) + Call: (14) predicate_or_rule_name([n, =]) + Unify: (14) predicate_or_rule_name([n, =]) + Exit: (14) predicate_or_rule_name([n, =]) +^ Fail: (13) not(user:predicate_or_rule_name([n, =])) + Redo: (12) interpretbody([[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]], [[v, d], empty]], _G5783, [[n, =], [[v, d], [v, b]]], _G5785) + Unify: (12) interpretbody([[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]], [[v, d], empty]], _G5783, [[n, =], [[v, d], [v, b]]], _G5785) + Call: (13) [[n, =], [[v, d], [v, b]]]=[[not, [_G5782]]|_G5774] + Fail: (13) [[n, =], [[v, d], [v, b]]]=[[not, [_G5782]]|_G5774] + Redo: (12) interpretbody([[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]], [[v, d], empty]], _G5783, [[n, =], [[v, d], [v, b]]], _G5785) + Unify: (12) interpretbody([[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], 
[[[v, b], ["a", "c"]], [[v, a], ["a", "b"]], [[v, d], empty]], _G5783, [[n, =], [[v, d], [v, b]]], _G5785) + Call: (13) [[n, =], [[v, d], [v, b]]]=[[_G5776], or, [_G5785]] + Fail: (13) [[n, =], [[v, d], [v, b]]]=[[_G5776], or, [_G5785]] + Redo: (12) interpretbody([[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]], [[v, d], empty]], _G5783, [[n, =], [[v, d], [v, b]]], _G5785) + Unify: (12) interpretbody([[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]], [[v, d], empty]], _G5783, [[n, =], [[v, d], [v, b]]], _G5785) + Call: (13) [[n, =], [[v, d], [v, b]]]=[_G5773|_G5774] + Exit: (13) [[n, =], [[v, d], [v, b]]]=[[n, =], [[v, d], [v, b]]] +^ Call: (13) not(predicate_or_rule_name([n, =])) +^ Unify: (13) not(user:predicate_or_rule_name([n, =])) + Call: (14) predicate_or_rule_name([n, =]) + Unify: (14) predicate_or_rule_name([n, =]) + Exit: (14) predicate_or_rule_name([n, =]) +^ Fail: (13) not(user:predicate_or_rule_name([n, =])) + Fail: (12) interpretbody([[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]], [[v, d], empty]], _G5783, [[n, =], [[v, d], [v, b]]], _G5785) + Redo: (11) interpretbody([[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]], [[v, d], empty]], _G5775, [[[n, =], [[v, d], [v, b]]]], true) + Unify: (11) interpretbody([[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]], [[v, d], empty]], _G5775, [[[n, =], [[v, d], [v, b]]]], 
true) + Call: (12) [[[n, =], [[v, d], [v, b]]]]=[[not, [_G5774]]|_G5766] + Fail: (12) [[[n, =], [[v, d], [v, b]]]]=[[not, [_G5774]]|_G5766] + Redo: (11) interpretbody([[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]], [[v, d], empty]], _G5775, [[[n, =], [[v, d], [v, b]]]], true) + Unify: (11) interpretbody([[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]], [[v, d], empty]], _G5775, [[[n, =], [[v, d], [v, b]]]], true) + Call: (12) [[[n, =], [[v, d], [v, b]]]]=[[_G5768], or, [_G5777]] + Fail: (12) [[[n, =], [[v, d], [v, b]]]]=[[_G5768], or, [_G5777]] + Redo: (11) interpretbody([[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]], [[v, d], empty]], _G5775, [[[n, =], [[v, d], [v, b]]]], true) + Unify: (11) interpretbody([[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]], [[v, d], empty]], _G5775, [[[n, =], [[v, d], [v, b]]]], true) + Call: (12) [[[n, =], [[v, d], [v, b]]]]=[_G5765|_G5766] + Exit: (12) [[[n, =], [[v, d], [v, b]]]]=[[[n, =], [[v, d], [v, b]]]] +^ Call: (12) not(predicate_or_rule_name([[n, =], [[v, d], [v, b]]])) +^ Unify: (12) not(user:predicate_or_rule_name([[n, =], [[v, d], [v, b]]])) + Call: (13) predicate_or_rule_name([[n, =], [[v, d], [v, b]]]) + Fail: (13) predicate_or_rule_name([[n, =], [[v, d], [v, b]]]) +^ Exit: (12) not(user:predicate_or_rule_name([[n, =], [[v, d], [v, b]]])) + Call: (12) interpretstatement1([[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], 
[...|...]]]]], [[n, =], [[v, d], [v, b]]], [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]], [[v, d], empty]], _G5784, _G5785, _G5786) + Unify: (12) interpretstatement1([[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[n, =], [[v, d], [v, b]]], [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]], [[v, d], empty]], _G5784, true, nocut) + Call: (13) isop(=) + Unify: (13) isop(=) + Exit: (13) isop(=) + Call: (13) interpretpart(is, [v, d], [v, b], [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]], [[v, d], empty]], _G5784) + Unify: (13) interpretpart(is, [v, d], [v, b], [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]], [[v, d], empty]], _G5784) + Call: (14) getvalues([v, d], [v, b], _G5782, _G5783, [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]], [[v, d], empty]]) + Unify: (14) getvalues([v, d], [v, b], _G5782, _G5783, [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]], [[v, d], empty]]) + Call: (15) getvalue([v, d], _G5781, [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]], [[v, d], empty]]) + Unify: (15) getvalue([v, d], _G5781, [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]], [[v, d], empty]]) +^ Call: (16) not(isvar([v, d])) +^ Unify: (16) not(user:isvar([v, d])) + Call: (17) isvar([v, d]) + Unify: (17) isvar([v, d]) + Exit: (17) isvar([v, d]) +^ Fail: (16) not(user:isvar([v, d])) + Redo: (15) getvalue([v, d], _G5781, [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]], [[v, d], empty]]) + Call: (16) isvar([v, d]) + Unify: (16) isvar([v, d]) + Exit: (16) isvar([v, d]) + Call: (16) isvalstrorundef(_G5780) + Unify: (16) isvalstrorundef(_G5780) + Call: (17) var(_G5780) + Exit: (17) var(_G5780) + Exit: (16) isvalstrorundef(_G5780) + Call: (16) getvar([v, d], _G5781, [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]], [[v, d], empty]]) + Unify: (16) getvar([v, d], _G5781, [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]], [[v, d], empty]]) + Call: (17) lists:member([[v, d], _G5776], [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]], [[v, d], empty]]) + 
Unify: (17) lists:member([[v, d], _G5776], [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]], [[v, d], empty]]) + Exit: (17) lists:member([[v, d], empty], [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]], [[v, d], empty]]) +^ Call: (17) not(empty=empty) +^ Unify: (17) not(user: (empty=empty)) +^ Fail: (17) not(user: (empty=empty)) + Redo: (16) getvar([v, d], _G5781, [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]], [[v, d], empty]]) + Unify: (16) getvar([v, d], empty, [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]], [[v, d], empty]]) +^ Call: (17) not(member([[v, d], _G5779], [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]], [[v, d], empty]])) +^ Unify: (17) not(user:member([[v, d], _G5779], [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]], [[v, d], empty]])) +^ Fail: (17) not(user:member([[v, d], _G5779], [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]], [[v, d], empty]])) + Redo: (16) getvar([v, d], empty, [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]], [[v, d], empty]]) + Call: (17) lists:member([[v, d], empty], [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]], [[v, d], empty]]) + Unify: (17) lists:member([[v, d], empty], [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]], [[v, d], empty]]) + Exit: (17) lists:member([[v, d], empty], [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]], [[v, d], empty]]) + Exit: (16) getvar([v, d], empty, [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]], [[v, d], empty]]) + Exit: (15) getvalue([v, d], empty, [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]], [[v, d], empty]]) + Call: (15) getvalue([v, b], _G5787, [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]], [[v, d], empty]]) + Unify: (15) getvalue([v, b], _G5787, [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]], [[v, d], empty]]) +^ Call: (16) not(isvar([v, b])) +^ Unify: (16) not(user:isvar([v, b])) + Call: (17) isvar([v, b]) + Unify: (17) isvar([v, b]) + Exit: (17) isvar([v, b]) +^ Fail: (16) not(user:isvar([v, b])) + Redo: (15) getvalue([v, b], _G5787, [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]], [[v, d], empty]]) + Call: (16) isvar([v, b]) + Unify: 
(16) isvar([v, b]) + Exit: (16) isvar([v, b]) + Call: (16) isvalstrorundef(_G5786) + Unify: (16) isvalstrorundef(_G5786) + Call: (17) var(_G5786) + Exit: (17) var(_G5786) + Exit: (16) isvalstrorundef(_G5786) + Call: (16) getvar([v, b], _G5787, [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]], [[v, d], empty]]) + Unify: (16) getvar([v, b], _G5787, [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]], [[v, d], empty]]) + Call: (17) lists:member([[v, b], _G5782], [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]], [[v, d], empty]]) + Unify: (17) lists:member([[v, b], _G5782], [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]], [[v, d], empty]]) + Exit: (17) lists:member([[v, b], ["a", "c"]], [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]], [[v, d], empty]]) +^ Call: (17) not(["a", "c"]=empty) +^ Unify: (17) not(user: (["a", "c"]=empty)) +^ Exit: (17) not(user: (["a", "c"]=empty)) + Exit: (16) getvar([v, b], ["a", "c"], [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]], [[v, d], empty]]) + Exit: (15) getvalue([v, b], ["a", "c"], [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]], [[v, d], empty]]) + Exit: (14) getvalues([v, d], [v, b], empty, ["a", "c"], [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]], [[v, d], empty]]) + Call: (14) expression(empty) + Unify: (14) expression(empty) + Exit: (14) expression(empty) + Call: (14) expression(["a", "c"]) + Unify: (14) expression(["a", "c"]) + Call: (15) isval(["a", "c"]) + Unify: (15) isval(["a", "c"]) + Call: (16) number(["a", "c"]) + Fail: (16) number(["a", "c"]) + Fail: (15) isval(["a", "c"]) + Redo: (14) expression(["a", "c"]) + Unify: (14) expression(["a", "c"]) + Call: (15) string(["a", "c"]) + Fail: (15) string(["a", "c"]) + Redo: (14) expression(["a", "c"]) + Unify: (14) expression(["a", "c"]) + Call: (15) atom(["a", "c"]) + Fail: (15) atom(["a", "c"]) + Redo: (14) expression(["a", "c"]) + Unify: (14) expression(["a", "c"]) +^ Call: (15) not(atom(["a", "c"])) +^ Unify: (15) not(user:atom(["a", "c"])) +^ Exit: (15) not(user:atom(["a", "c"])) + Call: (15) length(["a", 
"c"], _G5804) + Unify: (15) length(["a", "c"], _G5804) + Exit: (15) length(["a", "c"], 2) + Call: (15) 2>=1 + Exit: (15) 2>=1 + Call: (15) expression2(["a", "c"]) + Unify: (15) expression2(["a", "c"]) + Call: (16) expression3("a") + Unify: (16) expression3("a") + Call: (17) isval("a") + Unify: (17) isval("a") + Call: (18) number("a") + Fail: (18) number("a") + Fail: (17) isval("a") + Redo: (16) expression3("a") + Unify: (16) expression3("a") + Call: (17) string("a") + Exit: (17) string("a") + Exit: (16) expression3("a") + Call: (16) expression2(["c"]) + Unify: (16) expression2(["c"]) + Call: (17) expression3("c") + Unify: (17) expression3("c") + Call: (18) isval("c") + Unify: (18) isval("c") + Call: (19) number("c") + Fail: (19) number("c") + Fail: (18) isval("c") + Redo: (17) expression3("c") + Unify: (17) expression3("c") + Call: (18) string("c") + Exit: (18) string("c") + Exit: (17) expression3("c") + Call: (17) expression2([]) + Unify: (17) expression2([]) + Exit: (17) expression2([]) + Exit: (16) expression2(["c"]) + Exit: (15) expression2(["a", "c"]) + Exit: (14) expression(["a", "c"]) + Call: (14) val1emptyorvalsequal(empty, ["a", "c"]) + Unify: (14) val1emptyorvalsequal(empty, ["a", "c"]) + Exit: (14) val1emptyorvalsequal(empty, ["a", "c"]) + Call: (14) putvalue([v, d], ["a", "c"], [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]], [[v, d], empty]], _G5806) + Unify: (14) putvalue([v, d], ["a", "c"], [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]], [[v, d], empty]], _G5806) +^ Call: (15) not(isvar([v, d])) +^ Unify: (15) not(user:isvar([v, d])) + Call: (16) isvar([v, d]) + Unify: (16) isvar([v, d]) + Exit: (16) isvar([v, d]) +^ Fail: (15) not(user:isvar([v, d])) + Redo: (14) putvalue([v, d], ["a", "c"], [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]], [[v, d], empty]], _G5806) + Call: (15) isvar([v, d]) + Unify: (15) isvar([v, d]) + Exit: (15) isvar([v, d]) + Call: (15) isvalstrorundef(["a", "c"]) + Unify: (15) isvalstrorundef(["a", "c"]) + Call: (16) var(["a", "c"]) + 
Fail: (16) var(["a", "c"]) + Redo: (15) isvalstrorundef(["a", "c"]) + Unify: (15) isvalstrorundef(["a", "c"]) +^ Call: (16) not(var(["a", "c"])) +^ Unify: (16) not(user:var(["a", "c"])) +^ Exit: (16) not(user:var(["a", "c"])) + Call: (16) isval(["a", "c"]) + Unify: (16) isval(["a", "c"]) + Call: (17) number(["a", "c"]) + Fail: (17) number(["a", "c"]) + Fail: (16) isval(["a", "c"]) + Redo: (15) isvalstrorundef(["a", "c"]) + Unify: (15) isvalstrorundef(["a", "c"]) +^ Call: (16) not(var(["a", "c"])) +^ Unify: (16) not(user:var(["a", "c"])) +^ Exit: (16) not(user:var(["a", "c"])) + Call: (16) expression(["a", "c"]) + Unify: (16) expression(["a", "c"]) + Call: (17) isval(["a", "c"]) + Unify: (17) isval(["a", "c"]) + Call: (18) number(["a", "c"]) + Fail: (18) number(["a", "c"]) + Fail: (17) isval(["a", "c"]) + Redo: (16) expression(["a", "c"]) + Unify: (16) expression(["a", "c"]) + Call: (17) string(["a", "c"]) + Fail: (17) string(["a", "c"]) + Redo: (16) expression(["a", "c"]) + Unify: (16) expression(["a", "c"]) + Call: (17) atom(["a", "c"]) + Fail: (17) atom(["a", "c"]) + Redo: (16) expression(["a", "c"]) + Unify: (16) expression(["a", "c"]) +^ Call: (17) not(atom(["a", "c"])) +^ Unify: (17) not(user:atom(["a", "c"])) +^ Exit: (17) not(user:atom(["a", "c"])) + Call: (17) length(["a", "c"], _G5814) + Unify: (17) length(["a", "c"], _G5814) + Exit: (17) length(["a", "c"], 2) + Call: (17) 2>=1 + Exit: (17) 2>=1 + Call: (17) expression2(["a", "c"]) + Unify: (17) expression2(["a", "c"]) + Call: (18) expression3("a") + Unify: (18) expression3("a") + Call: (19) isval("a") + Unify: (19) isval("a") + Call: (20) number("a") + Fail: (20) number("a") + Fail: (19) isval("a") + Redo: (18) expression3("a") + Unify: (18) expression3("a") + Call: (19) string("a") + Exit: (19) string("a") + Exit: (18) expression3("a") + Call: (18) expression2(["c"]) + Unify: (18) expression2(["c"]) + Call: (19) expression3("c") + Unify: (19) expression3("c") + Call: (20) isval("c") + Unify: (20) 
isval("c") + Call: (21) number("c") + Fail: (21) number("c") + Fail: (20) isval("c") + Redo: (19) expression3("c") + Unify: (19) expression3("c") + Call: (20) string("c") + Exit: (20) string("c") + Exit: (19) expression3("c") + Call: (19) expression2([]) + Unify: (19) expression2([]) + Exit: (19) expression2([]) + Exit: (18) expression2(["c"]) + Exit: (17) expression2(["a", "c"]) + Exit: (16) expression(["a", "c"]) + Exit: (15) isvalstrorundef(["a", "c"]) + Call: (15) updatevar([v, d], ["a", "c"], [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]], [[v, d], empty]], _G5816) + Unify: (15) updatevar([v, d], ["a", "c"], [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]], [[v, d], empty]], _G5816) + Call: (16) lists:member([[v, d], empty], [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]], [[v, d], empty]]) + Unify: (16) lists:member([[v, d], empty], [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]], [[v, d], empty]]) + Exit: (16) lists:member([[v, d], empty], [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]], [[v, d], empty]]) + Call: (16) lists:delete([[[v, b], ["a", "c"]], [[v, a], ["a", "b"]], [[v, d], empty]], [[v, d], empty], _G5827) + Unify: (16) lists:delete([[[v, b], ["a", "c"]], [[v, a], ["a", "b"]], [[v, d], empty]], [[v, d], empty], _G5827) + Exit: (16) lists:delete([[[v, b], ["a", "c"]], [[v, a], ["a", "b"]], [[v, d], empty]], [[v, d], empty], [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]]]) + Call: (16) lists:append([[[v, b], ["a", "c"]], [[v, a], ["a", "b"]]], [[[v, d], ["a", "c"]]], _G5842) + Unify: (16) lists:append([[[v, b], ["a", "c"]], [[v, a], ["a", "b"]]], [[[v, d], ["a", "c"]]], [[[v, b], ["a", "c"]]|_G5834]) + Exit: (16) lists:append([[[v, b], ["a", "c"]], [[v, a], ["a", "b"]]], [[[v, d], ["a", "c"]]], [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]], [[v, d], ["a", "c"]]]) + Call: (16) true + Exit: (16) true + Call: (16) true + Exit: (16) true + Call: (16) true + Exit: (16) true + Exit: (15) updatevar([v, d], ["a", "c"], [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]], [[v, d], empty]], 
[[[v, b], ["a", "c"]], [[v, a], ["a", "b"]], [[v, d], ["a", "c"]]]) + Exit: (14) putvalue([v, d], ["a", "c"], [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]], [[v, d], empty]], [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]], [[v, d], ["a", "c"]]]) + Call: (14) debug(on) + Fail: (14) debug(on) + Redo: (13) interpretpart(is, [v, d], [v, b], [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]], [[v, d], empty]], [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]], [[v, d], ["a", "c"]]]) + Call: (14) true + Exit: (14) true + Call: (14) debug(on) + Fail: (14) debug(on) + Redo: (13) interpretpart(is, [v, d], [v, b], [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]], [[v, d], empty]], [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]], [[v, d], ["a", "c"]]]) + Call: (14) true + Exit: (14) true + Exit: (13) interpretpart(is, [v, d], [v, b], [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]], [[v, d], empty]], [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]], [[v, d], ["a", "c"]]]) + Exit: (12) interpretstatement1([[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[n, =], [[v, d], [v, b]]], [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]], [[v, d], empty]], [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]], [[v, d], ["a", "c"]]], true, nocut) +^ Call: (12) not(nocut=cut) +^ Unify: (12) not(user: (nocut=cut)) +^ Exit: (12) not(user: (nocut=cut)) + Call: (12) _G5852=[[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]] + Exit: (12) [[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]]=[[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]] + Call: (12) interpretbody([[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]], [[v, d], ["a", "c"]]], _G5855, [], _G5857) + Unify: (12) interpretbody([[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, b], [v, a], [v, 
d]], ":-", [[[n|...], [...|...]]]]], [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]], [[v, d], ["a", "c"]]], [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]], [[v, d], ["a", "c"]]], [], true) + Exit: (12) interpretbody([[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]], [[v, d], ["a", "c"]]], [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]], [[v, d], ["a", "c"]]], [], true) + Call: (12) logicalconjunction(true, true, true) + Unify: (12) logicalconjunction(true, true, true) + Call: (13) true(true) + Unify: (13) true(true) + Exit: (13) true(true) + Call: (13) true(true) + Unify: (13) true(true) + Exit: (13) true(true) + Exit: (12) logicalconjunction(true, true, true) + Exit: (11) interpretbody([[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]], [[v, d], empty]], [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]], [[v, d], ["a", "c"]]], [[[n, =], [[v, d], [v, b]]]], true) + Call: (11) updatevars([[[v, d], [v, d]]], [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]], [[v, d], ["a", "c"]]], [], _G5855) + Unify: (11) updatevars([[[v, d], [v, d]]], [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]], [[v, d], ["a", "c"]]], [], _G5855) + Call: (12) [[[v, d], [v, d]]]=[[_G5848, _G5851]|_G5846] + Exit: (12) [[[v, d], [v, d]]]=[[[v, d], [v, d]]] + Call: (12) expressionnotatom([v, d]) + Unify: (12) expressionnotatom([v, d]) + Call: (13) isvalstrempty([v, d]) + Unify: (13) isvalstrempty([v, d]) + Call: (14) var([v, d]) + Fail: (14) var([v, d]) + Redo: (13) isvalstrempty([v, d]) + Unify: (13) isvalstrempty([v, d]) + Call: (14) isval([v, d]) + Unify: (14) isval([v, d]) + Call: (15) number([v, d]) + Fail: (15) number([v, d]) + Fail: (14) isval([v, d]) + Redo: (13) isvalstrempty([v, d]) + Unify: (13) isvalstrempty([v, d]) + Call: (14) string([v, d]) + Fail: (14) 
string([v, d]) + Fail: (13) isvalstrempty([v, d]) + Redo: (12) expressionnotatom([v, d]) + Unify: (12) expressionnotatom([v, d]) +^ Call: (13) not(atom([v, d])) +^ Unify: (13) not(user:atom([v, d])) +^ Exit: (13) not(user:atom([v, d])) + Call: (13) length([v, d], _G5867) + Unify: (13) length([v, d], _G5867) + Exit: (13) length([v, d], 2) + Call: (13) 2>=1 + Exit: (13) 2>=1 + Call: (13) expressionnotatom2([v, d]) + Unify: (13) expressionnotatom2([v, d]) + Call: (14) isvalstrempty(v) + Unify: (14) isvalstrempty(v) + Call: (15) var(v) + Fail: (15) var(v) + Redo: (14) isvalstrempty(v) + Unify: (14) isvalstrempty(v) + Call: (15) isval(v) + Unify: (15) isval(v) + Call: (16) number(v) + Fail: (16) number(v) + Fail: (15) isval(v) + Redo: (14) isvalstrempty(v) + Unify: (14) isvalstrempty(v) + Call: (15) string(v) + Fail: (15) string(v) + Fail: (14) isvalstrempty(v) + Fail: (13) expressionnotatom2([v, d]) + Redo: (12) expressionnotatom([v, d]) + Unify: (12) expressionnotatom([v, d]) + Call: (13) predicate_or_rule_name([v, d]) + Fail: (13) predicate_or_rule_name([v, d]) + Fail: (12) expressionnotatom([v, d]) + Redo: (11) updatevars([[[v, d], [v, d]]], [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]], [[v, d], ["a", "c"]]], [], _G5864) + Call: (12) lists:member([[v, d], _G5857], [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]], [[v, d], ["a", "c"]]]) + Unify: (12) lists:member([[v, d], _G5857], [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]], [[v, d], ["a", "c"]]]) + Exit: (12) lists:member([[v, d], ["a", "c"]], [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]], [[v, d], ["a", "c"]]]) + Call: (12) lists:append([], [[[v, d], ["a", "c"]]], _G5878) + Unify: (12) lists:append([], [[[v, d], ["a", "c"]]], [[[v, d], ["a", "c"]]]) + Exit: (12) lists:append([], [[[v, d], ["a", "c"]]], [[[v, d], ["a", "c"]]]) + Call: (12) updatevars([], [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]], [[v, d], ["a", "c"]]], [[[v, d], ["a", "c"]]], _G5879) + Unify: (12) updatevars([], [[[v, b], ["a", "c"]], [[v, a], ["a", 
"b"]], [[v, d], ["a", "c"]]], [[[v, d], ["a", "c"]]], [[[v, d], ["a", "c"]]]) + Exit: (12) updatevars([], [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]], [[v, d], ["a", "c"]]], [[[v, d], ["a", "c"]]], [[[v, d], ["a", "c"]]]) + Exit: (11) updatevars([[[v, d], [v, d]]], [[[v, b], ["a", "c"]], [[v, a], ["a", "b"]], [[v, d], ["a", "c"]]], [], [[[v, d], ["a", "c"]]]) +^ Call: (11) not([[[v, d], ["a", "c"]]]=[]) +^ Unify: (11) not(user: ([[[v, d], ["a", "c"]]]=[])) +^ Exit: (11) not(user: ([[[v, d], ["a", "c"]]]=[])) + Call: (11) unique1([[[v, d], ["a", "c"]]], [], _G15) + Unify: (11) unique1([[[v, d], ["a", "c"]]], [], _G15) + Call: (12) lists:delete([], [[v, d], ["a", "c"]], _G5884) + Unify: (12) lists:delete([], [[v, d], ["a", "c"]], []) + Exit: (12) lists:delete([], [[v, d], ["a", "c"]], []) + Call: (12) lists:append([], [[[v, d], ["a", "c"]]], _G5887) + Unify: (12) lists:append([], [[[v, d], ["a", "c"]]], [[[v, d], ["a", "c"]]]) + Exit: (12) lists:append([], [[[v, d], ["a", "c"]]], [[[v, d], ["a", "c"]]]) + Call: (12) unique1([], [[[v, d], ["a", "c"]]], _G15) + Unify: (12) unique1([], [[[v, d], ["a", "c"]]], [[[v, d], ["a", "c"]]]) + Exit: (12) unique1([], [[[v, d], ["a", "c"]]], [[[v, d], ["a", "c"]]]) + Exit: (11) unique1([[[v, d], ["a", "c"]]], [], [[[v, d], ["a", "c"]]]) + Call: (11) findresult3([["a", "c"], ["a", "b"], [v, d]], [[[v, d], ["a", "c"]]], [], _G5888) + Unify: (11) findresult3([["a", "c"], ["a", "b"], [v, d]], [[[v, d], ["a", "c"]]], [], _G5888) + Call: (12) [["a", "c"], ["a", "b"], [v, d]]=[_G5878|_G5879] + Exit: (12) [["a", "c"], ["a", "b"], [v, d]]=[["a", "c"], ["a", "b"], [v, d]] + Call: (12) expressionnotatom(["a", "c"]) + Unify: (12) expressionnotatom(["a", "c"]) + Call: (13) isvalstrempty(["a", "c"]) + Unify: (13) isvalstrempty(["a", "c"]) + Call: (14) var(["a", "c"]) + Fail: (14) var(["a", "c"]) + Redo: (13) isvalstrempty(["a", "c"]) + Unify: (13) isvalstrempty(["a", "c"]) + Call: (14) isval(["a", "c"]) + Unify: (14) isval(["a", "c"]) + Call: 
(15) number(["a", "c"]) + Fail: (15) number(["a", "c"]) + Fail: (14) isval(["a", "c"]) + Redo: (13) isvalstrempty(["a", "c"]) + Unify: (13) isvalstrempty(["a", "c"]) + Call: (14) string(["a", "c"]) + Fail: (14) string(["a", "c"]) + Fail: (13) isvalstrempty(["a", "c"]) + Redo: (12) expressionnotatom(["a", "c"]) + Unify: (12) expressionnotatom(["a", "c"]) +^ Call: (13) not(atom(["a", "c"])) +^ Unify: (13) not(user:atom(["a", "c"])) +^ Exit: (13) not(user:atom(["a", "c"])) + Call: (13) length(["a", "c"], _G5894) + Unify: (13) length(["a", "c"], _G5894) + Exit: (13) length(["a", "c"], 2) + Call: (13) 2>=1 + Exit: (13) 2>=1 + Call: (13) expressionnotatom2(["a", "c"]) + Unify: (13) expressionnotatom2(["a", "c"]) + Call: (14) isvalstrempty("a") + Unify: (14) isvalstrempty("a") + Call: (15) var("a") + Fail: (15) var("a") + Redo: (14) isvalstrempty("a") + Unify: (14) isvalstrempty("a") + Call: (15) isval("a") + Unify: (15) isval("a") + Call: (16) number("a") + Fail: (16) number("a") + Fail: (15) isval("a") + Redo: (14) isvalstrempty("a") + Unify: (14) isvalstrempty("a") + Call: (15) string("a") + Exit: (15) string("a") + Exit: (14) isvalstrempty("a") + Call: (14) expressionnotatom2(["c"]) + Unify: (14) expressionnotatom2(["c"]) + Call: (15) isvalstrempty("c") + Unify: (15) isvalstrempty("c") + Call: (16) var("c") + Fail: (16) var("c") + Redo: (15) isvalstrempty("c") + Unify: (15) isvalstrempty("c") + Call: (16) isval("c") + Unify: (16) isval("c") + Call: (17) number("c") + Fail: (17) number("c") + Fail: (16) isval("c") + Redo: (15) isvalstrempty("c") + Unify: (15) isvalstrempty("c") + Call: (16) string("c") + Exit: (16) string("c") + Exit: (15) isvalstrempty("c") + Call: (15) expressionnotatom2([]) + Unify: (15) expressionnotatom2([]) + Exit: (15) expressionnotatom2([]) + Exit: (14) expressionnotatom2(["c"]) + Exit: (13) expressionnotatom2(["a", "c"]) + Exit: (12) expressionnotatom(["a", "c"]) + Call: (12) lists:append([], [["a", "c"]], _G5898) + Unify: (12) 
lists:append([], [["a", "c"]], [["a", "c"]]) + Exit: (12) lists:append([], [["a", "c"]], [["a", "c"]]) + Call: (12) findresult3([["a", "b"], [v, d]], [[[v, d], ["a", "c"]]], [["a", "c"]], _G5899) + Unify: (12) findresult3([["a", "b"], [v, d]], [[[v, d], ["a", "c"]]], [["a", "c"]], _G5899) + Call: (13) [["a", "b"], [v, d]]=[_G5889|_G5890] + Exit: (13) [["a", "b"], [v, d]]=[["a", "b"], [v, d]] + Call: (13) expressionnotatom(["a", "b"]) + Unify: (13) expressionnotatom(["a", "b"]) + Call: (14) isvalstrempty(["a", "b"]) + Unify: (14) isvalstrempty(["a", "b"]) + Call: (15) var(["a", "b"]) + Fail: (15) var(["a", "b"]) + Redo: (14) isvalstrempty(["a", "b"]) + Unify: (14) isvalstrempty(["a", "b"]) + Call: (15) isval(["a", "b"]) + Unify: (15) isval(["a", "b"]) + Call: (16) number(["a", "b"]) + Fail: (16) number(["a", "b"]) + Fail: (15) isval(["a", "b"]) + Redo: (14) isvalstrempty(["a", "b"]) + Unify: (14) isvalstrempty(["a", "b"]) + Call: (15) string(["a", "b"]) + Fail: (15) string(["a", "b"]) + Fail: (14) isvalstrempty(["a", "b"]) + Redo: (13) expressionnotatom(["a", "b"]) + Unify: (13) expressionnotatom(["a", "b"]) +^ Call: (14) not(atom(["a", "b"])) +^ Unify: (14) not(user:atom(["a", "b"])) +^ Exit: (14) not(user:atom(["a", "b"])) + Call: (14) length(["a", "b"], _G5905) + Unify: (14) length(["a", "b"], _G5905) + Exit: (14) length(["a", "b"], 2) + Call: (14) 2>=1 + Exit: (14) 2>=1 + Call: (14) expressionnotatom2(["a", "b"]) + Unify: (14) expressionnotatom2(["a", "b"]) + Call: (15) isvalstrempty("a") + Unify: (15) isvalstrempty("a") + Call: (16) var("a") + Fail: (16) var("a") + Redo: (15) isvalstrempty("a") + Unify: (15) isvalstrempty("a") + Call: (16) isval("a") + Unify: (16) isval("a") + Call: (17) number("a") + Fail: (17) number("a") + Fail: (16) isval("a") + Redo: (15) isvalstrempty("a") + Unify: (15) isvalstrempty("a") + Call: (16) string("a") + Exit: (16) string("a") + Exit: (15) isvalstrempty("a") + Call: (15) expressionnotatom2(["b"]) + Unify: (15) 
expressionnotatom2(["b"]) + Call: (16) isvalstrempty("b") + Unify: (16) isvalstrempty("b") + Call: (17) var("b") + Fail: (17) var("b") + Redo: (16) isvalstrempty("b") + Unify: (16) isvalstrempty("b") + Call: (17) isval("b") + Unify: (17) isval("b") + Call: (18) number("b") + Fail: (18) number("b") + Fail: (17) isval("b") + Redo: (16) isvalstrempty("b") + Unify: (16) isvalstrempty("b") + Call: (17) string("b") + Exit: (17) string("b") + Exit: (16) isvalstrempty("b") + Call: (16) expressionnotatom2([]) + Unify: (16) expressionnotatom2([]) + Exit: (16) expressionnotatom2([]) + Exit: (15) expressionnotatom2(["b"]) + Exit: (14) expressionnotatom2(["a", "b"]) + Exit: (13) expressionnotatom(["a", "b"]) + Call: (13) lists:append([["a", "c"]], [["a", "b"]], _G5909) + Unify: (13) lists:append([["a", "c"]], [["a", "b"]], [["a", "c"]|_G5901]) + Exit: (13) lists:append([["a", "c"]], [["a", "b"]], [["a", "c"], ["a", "b"]]) + Call: (13) findresult3([[v, d]], [[[v, d], ["a", "c"]]], [["a", "c"], ["a", "b"]], _G5913) + Unify: (13) findresult3([[v, d]], [[[v, d], ["a", "c"]]], [["a", "c"], ["a", "b"]], _G5913) + Call: (14) [[v, d]]=[_G5903|_G5904] + Exit: (14) [[v, d]]=[[v, d]] + Call: (14) expressionnotatom([v, d]) + Unify: (14) expressionnotatom([v, d]) + Call: (15) isvalstrempty([v, d]) + Unify: (15) isvalstrempty([v, d]) + Call: (16) var([v, d]) + Fail: (16) var([v, d]) + Redo: (15) isvalstrempty([v, d]) + Unify: (15) isvalstrempty([v, d]) + Call: (16) isval([v, d]) + Unify: (16) isval([v, d]) + Call: (17) number([v, d]) + Fail: (17) number([v, d]) + Fail: (16) isval([v, d]) + Redo: (15) isvalstrempty([v, d]) + Unify: (15) isvalstrempty([v, d]) + Call: (16) string([v, d]) + Fail: (16) string([v, d]) + Fail: (15) isvalstrempty([v, d]) + Redo: (14) expressionnotatom([v, d]) + Unify: (14) expressionnotatom([v, d]) +^ Call: (15) not(atom([v, d])) +^ Unify: (15) not(user:atom([v, d])) +^ Exit: (15) not(user:atom([v, d])) + Call: (15) length([v, d], _G5919) + Unify: (15) length([v, 
d], _G5919) + Exit: (15) length([v, d], 2) + Call: (15) 2>=1 + Exit: (15) 2>=1 + Call: (15) expressionnotatom2([v, d]) + Unify: (15) expressionnotatom2([v, d]) + Call: (16) isvalstrempty(v) + Unify: (16) isvalstrempty(v) + Call: (17) var(v) + Fail: (17) var(v) + Redo: (16) isvalstrempty(v) + Unify: (16) isvalstrempty(v) + Call: (17) isval(v) + Unify: (17) isval(v) + Call: (18) number(v) + Fail: (18) number(v) + Fail: (17) isval(v) + Redo: (16) isvalstrempty(v) + Unify: (16) isvalstrempty(v) + Call: (17) string(v) + Fail: (17) string(v) + Fail: (16) isvalstrempty(v) + Fail: (15) expressionnotatom2([v, d]) + Redo: (14) expressionnotatom([v, d]) + Unify: (14) expressionnotatom([v, d]) + Call: (15) predicate_or_rule_name([v, d]) + Fail: (15) predicate_or_rule_name([v, d]) + Fail: (14) expressionnotatom([v, d]) + Redo: (13) findresult3([[v, d]], [[[v, d], ["a", "c"]]], [["a", "c"], ["a", "b"]], _G5913) + Unify: (13) findresult3([[v, d]], [[[v, d], ["a", "c"]]], [["a", "c"], ["a", "b"]], _G5913) + Call: (14) [[v, d]]=[_G5903|_G5904] + Exit: (14) [[v, d]]=[[v, d]] + Call: (14) isvar([v, d]) + Unify: (14) isvar([v, d]) + Exit: (14) isvar([v, d]) + Call: (14) lists:member([[v, d], _G5909], [[[v, d], ["a", "c"]]]) + Unify: (14) lists:member([[v, d], _G5909], [[[v, d], ["a", "c"]]]) + Exit: (14) lists:member([[v, d], ["a", "c"]], [[[v, d], ["a", "c"]]]) + Call: (14) lists:append([["a", "c"], ["a", "b"]], [["a", "c"]], _G5924) + Unify: (14) lists:append([["a", "c"], ["a", "b"]], [["a", "c"]], [["a", "c"]|_G5916]) + Exit: (14) lists:append([["a", "c"], ["a", "b"]], [["a", "c"]], [["a", "c"], ["a", "b"], ["a", "c"]]) + Call: (14) findresult3([], [[[v, d], ["a", "c"]]], [["a", "c"], ["a", "b"], ["a", "c"]], _G5931) + Unify: (14) findresult3([], [[[v, d], ["a", "c"]]], [["a", "c"], ["a", "b"], ["a", "c"]], [["a", "c"], ["a", "b"], ["a", "c"]]) + Exit: (14) findresult3([], [[[v, d], ["a", "c"]]], [["a", "c"], ["a", "b"], ["a", "c"]], [["a", "c"], ["a", "b"], ["a", "c"]]) + Exit: 
(13) findresult3([[v, d]], [[[v, d], ["a", "c"]]], [["a", "c"], ["a", "b"]], [["a", "c"], ["a", "b"], ["a", "c"]]) + Exit: (12) findresult3([["a", "b"], [v, d]], [[[v, d], ["a", "c"]]], [["a", "c"]], [["a", "c"], ["a", "b"], ["a", "c"]]) + Exit: (11) findresult3([["a", "c"], ["a", "b"], [v, d]], [[[v, d], ["a", "c"]]], [], [["a", "c"], ["a", "b"], ["a", "c"]]) + Call: (11) debug(on) + Fail: (11) debug(on) + Redo: (10) member1([[n, f], [["a", "c"], ["a", "b"], [v, d]]], [[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, d], ["a", "c"]]]) + Call: (11) true + Exit: (11) true + Call: (11) true + Exit: (11) true + Exit: (10) member1([[n, f], [["a", "c"], ["a", "b"], [v, d]]], [[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, d], ["a", "c"]]]) + Exit: (9) interpret1(off, [[n, f], [["a", "c"], ["a", "b"], [v, d]]], [[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, d], ["a", "c"]]]) + Exit: (8) interpret(off, [[n, f], [["a", "c"], ["a", "b"], [v, d]]], [[[n, f], [[v, b], [v, a], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, d], ["a", "c"]]]) + + Call: (8) interpret(off, [[n, f], [["a", "b"], ["b", "c"], [v, d]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], _G15) + Unify: (8) interpret(off, [[n, f], [["a", "b"], ["b", "c"], [v, d]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], _G15) + Call: (9) convert_to_grammar_part1([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [], _G5552) + Unify: (9) convert_to_grammar_part1([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [], _G5552) + Call: (10) convert_to_grammar_part11([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [], _G5552, [], _G5554) + Unify: (10) 
convert_to_grammar_part11([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [], _G5552, [], _G5554) + Call: (11) [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]]=[_G5543|_G5544] + Exit: (11) [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]]=[[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]] + Call: (11) [[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n, =], [[...|...]|...]]]]=[[n, _G5552], _G5555, "->", _G5564] + Fail: (11) [[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n, =], [[...|...]|...]]]]=[[n, _G5552], _G5555, "->", _G5564] + Redo: (10) convert_to_grammar_part11([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [], _G5555, [], _G5557) + Call: (11) [[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n, =], [[...|...]|...]]]]=[[n, _G5552], "->", _G5561] + Fail: (11) [[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n, =], [[...|...]|...]]]]=[[n, _G5552], "->", _G5561] + Redo: (10) convert_to_grammar_part11([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [], _G5552, [], _G5554) + Unify: (10) convert_to_grammar_part11([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [], _G5552, [], _G5554) + Call: (11) [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]]=[_G5543|_G5544] + Exit: (11) [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]]=[[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]] + Call: (11) [[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n, =], [[...|...]|...]]]]=[_G5546, _G5549, ":-", _G5558] + Exit: (11) [[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n, =], [[...|...]|...]]]]=[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n, =], [[...|...]|...]]]] + Call: (11) true + Exit: (11) true + Call: (11) true + Exit: (11) true + Call: (11) true + Exit: (11) true + Call: (11) true + Exit: (11) true + Call: (11) lists:append([], [[[n, f], [[v, c], [v, b], [v|...]], ":-", [[[...|...]|...]]]], _G5573) + Unify: (11) 
lists:append([], [[[n, f], [[v, c], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, f], [[v, c], [v, b], [v|...]], ":-", [[[...|...]|...]]]]) + Exit: (11) lists:append([], [[[n, f], [[v, c], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, f], [[v, c], [v, b], [v|...]], ":-", [[[...|...]|...]]]]) + Call: (11) convert_to_grammar_part11([], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], _G5573, [], _G5575) + Unify: (11) convert_to_grammar_part11([], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [], []) + Exit: (11) convert_to_grammar_part11([], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [], []) + Exit: (10) convert_to_grammar_part11([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [], []) + Call: (10) lists:append([[[n, f], [[v, c], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [], _G5573) + Unify: (10) lists:append([[[n, f], [[v, c], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [], [[[n, f], [[v, c], [v, b], [v|...]], ":-", [[[...|...]|...]]]|_G5565]) + Exit: (10) lists:append([[[n, f], [[v, c], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [], [[[n, f], [[v, c], [v, b], [v|...]], ":-", [[[...|...]|...]]]]) + Exit: (9) convert_to_grammar_part1([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]]) + Call: (9) interpret1(off, [[n, f], [["a", "b"], ["b", "c"], [v, d]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], _G15) + Unify: (9) interpret1(off, [[n, f], [["a", "b"], ["b", "c"], [v, d]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", 
[[[n|...], [...|...]]]]], _G15) +^ Call: (10) retractall(debug(_G5567)) +^ Exit: (10) retractall(debug(_G5567)) +^ Call: (10) assertz(debug(off)) +^ Exit: (10) assertz(debug(off)) +^ Call: (10) retractall(cut(_G5571)) +^ Exit: (10) retractall(cut(_G5571)) +^ Call: (10) assertz(cut(off)) +^ Exit: (10) assertz(cut(off)) + Call: (10) member1([[n, f], [["a", "b"], ["b", "c"], [v, d]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], _G15) + Unify: (10) member1([[n, f], [["a", "b"], ["b", "c"], [v, d]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], _G15) + Call: (11) cut(off) + Unify: (11) cut(off) + Exit: (11) cut(off) + Call: (11) [[n, f], [["a", "b"], ["b", "c"], [v, d]]]=[_G5575, _G5578] + Exit: (11) [[n, f], [["a", "b"], ["b", "c"], [v, d]]]=[[n, f], [["a", "b"], ["b", "c"], [v, d]]] + Call: (11) [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]]=[[[n, f], _G5587, ":-", _G5596]|_G5582] + Exit: (11) [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]]=[[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]] + Call: (11) length([["a", "b"], ["b", "c"], [v, d]], _G5607) + Unify: (11) length([["a", "b"], ["b", "c"], [v, d]], _G5607) + Exit: (11) length([["a", "b"], ["b", "c"], [v, d]], 3) + Call: (11) length([[v, c], [v, b], [v, d]], 3) + Unify: (11) length([[v, c], [v, b], [v, d]], 3) + Unify: (11) length([[v, c], [v, b], [v, d]], 3) + Exit: (11) length([[v, c], [v, b], [v, d]], 3) + Call: (11) [n, f]=[n, grammar] + Fail: (11) [n, f]=[n, grammar] + Redo: (10) member1([[n, f], [["a", "b"], ["b", "c"], [v, d]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], _G15) + Call: (11) [n, f]=[n, grammar_part] + Fail: (11) [n, f]=[n, grammar_part] + Redo: (10) member1([[n, f], 
[["a", "b"], ["b", "c"], [v, d]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], _G15) + Call: (11) checkarguments([["a", "b"], ["b", "c"], [v, d]], [[v, c], [v, b], [v, d]], [], _G5609, [], _G5611) + Unify: (11) checkarguments([["a", "b"], ["b", "c"], [v, d]], [[v, c], [v, b], [v, d]], [], _G5609, [], _G5611) + Call: (12) [["a", "b"], ["b", "c"], [v, d]]=[_G5599|_G5600] + Exit: (12) [["a", "b"], ["b", "c"], [v, d]]=[["a", "b"], ["b", "c"], [v, d]] + Call: (12) expressionnotatom(["a", "b"]) + Unify: (12) expressionnotatom(["a", "b"]) + Call: (13) isvalstrempty(["a", "b"]) + Unify: (13) isvalstrempty(["a", "b"]) + Call: (14) var(["a", "b"]) + Fail: (14) var(["a", "b"]) + Redo: (13) isvalstrempty(["a", "b"]) + Unify: (13) isvalstrempty(["a", "b"]) + Call: (14) isval(["a", "b"]) + Unify: (14) isval(["a", "b"]) + Call: (15) number(["a", "b"]) + Fail: (15) number(["a", "b"]) + Fail: (14) isval(["a", "b"]) + Redo: (13) isvalstrempty(["a", "b"]) + Unify: (13) isvalstrempty(["a", "b"]) + Call: (14) string(["a", "b"]) + Fail: (14) string(["a", "b"]) + Fail: (13) isvalstrempty(["a", "b"]) + Redo: (12) expressionnotatom(["a", "b"]) + Unify: (12) expressionnotatom(["a", "b"]) +^ Call: (13) not(atom(["a", "b"])) +^ Unify: (13) not(user:atom(["a", "b"])) +^ Exit: (13) not(user:atom(["a", "b"])) + Call: (13) length(["a", "b"], _G5615) + Unify: (13) length(["a", "b"], _G5615) + Exit: (13) length(["a", "b"], 2) + Call: (13) 2>=1 + Exit: (13) 2>=1 + Call: (13) expressionnotatom2(["a", "b"]) + Unify: (13) expressionnotatom2(["a", "b"]) + Call: (14) isvalstrempty("a") + Unify: (14) isvalstrempty("a") + Call: (15) var("a") + Fail: (15) var("a") + Redo: (14) isvalstrempty("a") + Unify: (14) isvalstrempty("a") + Call: (15) isval("a") + Unify: (15) isval("a") + Call: (16) number("a") + Fail: (16) number("a") + Fail: (15) isval("a") + Redo: (14) isvalstrempty("a") + Unify: (14) isvalstrempty("a") + 
Call: (15) string("a") + Exit: (15) string("a") + Exit: (14) isvalstrempty("a") + Call: (14) expressionnotatom2(["b"]) + Unify: (14) expressionnotatom2(["b"]) + Call: (15) isvalstrempty("b") + Unify: (15) isvalstrempty("b") + Call: (16) var("b") + Fail: (16) var("b") + Redo: (15) isvalstrempty("b") + Unify: (15) isvalstrempty("b") + Call: (16) isval("b") + Unify: (16) isval("b") + Call: (17) number("b") + Fail: (17) number("b") + Fail: (16) isval("b") + Redo: (15) isvalstrempty("b") + Unify: (15) isvalstrempty("b") + Call: (16) string("b") + Exit: (16) string("b") + Exit: (15) isvalstrempty("b") + Call: (15) expressionnotatom2([]) + Unify: (15) expressionnotatom2([]) + Exit: (15) expressionnotatom2([]) + Exit: (14) expressionnotatom2(["b"]) + Exit: (13) expressionnotatom2(["a", "b"]) + Exit: (12) expressionnotatom(["a", "b"]) + Call: (12) [[v, c], [v, b], [v, d]]=[_G5607|_G5608] + Exit: (12) [[v, c], [v, b], [v, d]]=[[v, c], [v, b], [v, d]] +^ Call: (12) not(var([v, c])) +^ Unify: (12) not(user:var([v, c])) +^ Exit: (12) not(user:var([v, c])) + Call: (12) isvar([v, c]) + Unify: (12) isvar([v, c]) + Exit: (12) isvar([v, c]) + Call: (12) putvalue([v, c], ["a", "b"], [], _G5625) + Unify: (12) putvalue([v, c], ["a", "b"], [], _G5625) +^ Call: (13) not(isvar([v, c])) +^ Unify: (13) not(user:isvar([v, c])) + Call: (14) isvar([v, c]) + Unify: (14) isvar([v, c]) + Exit: (14) isvar([v, c]) +^ Fail: (13) not(user:isvar([v, c])) + Redo: (12) putvalue([v, c], ["a", "b"], [], _G5625) + Call: (13) isvar([v, c]) + Unify: (13) isvar([v, c]) + Exit: (13) isvar([v, c]) + Call: (13) isvalstrorundef(["a", "b"]) + Unify: (13) isvalstrorundef(["a", "b"]) + Call: (14) var(["a", "b"]) + Fail: (14) var(["a", "b"]) + Redo: (13) isvalstrorundef(["a", "b"]) + Unify: (13) isvalstrorundef(["a", "b"]) +^ Call: (14) not(var(["a", "b"])) +^ Unify: (14) not(user:var(["a", "b"])) +^ Exit: (14) not(user:var(["a", "b"])) + Call: (14) isval(["a", "b"]) + Unify: (14) isval(["a", "b"]) + Call: (15) 
number(["a", "b"]) + Fail: (15) number(["a", "b"]) + Fail: (14) isval(["a", "b"]) + Redo: (13) isvalstrorundef(["a", "b"]) + Unify: (13) isvalstrorundef(["a", "b"]) +^ Call: (14) not(var(["a", "b"])) +^ Unify: (14) not(user:var(["a", "b"])) +^ Exit: (14) not(user:var(["a", "b"])) + Call: (14) expression(["a", "b"]) + Unify: (14) expression(["a", "b"]) + Call: (15) isval(["a", "b"]) + Unify: (15) isval(["a", "b"]) + Call: (16) number(["a", "b"]) + Fail: (16) number(["a", "b"]) + Fail: (15) isval(["a", "b"]) + Redo: (14) expression(["a", "b"]) + Unify: (14) expression(["a", "b"]) + Call: (15) string(["a", "b"]) + Fail: (15) string(["a", "b"]) + Redo: (14) expression(["a", "b"]) + Unify: (14) expression(["a", "b"]) + Call: (15) atom(["a", "b"]) + Fail: (15) atom(["a", "b"]) + Redo: (14) expression(["a", "b"]) + Unify: (14) expression(["a", "b"]) +^ Call: (15) not(atom(["a", "b"])) +^ Unify: (15) not(user:atom(["a", "b"])) +^ Exit: (15) not(user:atom(["a", "b"])) + Call: (15) length(["a", "b"], _G5633) + Unify: (15) length(["a", "b"], _G5633) + Exit: (15) length(["a", "b"], 2) + Call: (15) 2>=1 + Exit: (15) 2>=1 + Call: (15) expression2(["a", "b"]) + Unify: (15) expression2(["a", "b"]) + Call: (16) expression3("a") + Unify: (16) expression3("a") + Call: (17) isval("a") + Unify: (17) isval("a") + Call: (18) number("a") + Fail: (18) number("a") + Fail: (17) isval("a") + Redo: (16) expression3("a") + Unify: (16) expression3("a") + Call: (17) string("a") + Exit: (17) string("a") + Exit: (16) expression3("a") + Call: (16) expression2(["b"]) + Unify: (16) expression2(["b"]) + Call: (17) expression3("b") + Unify: (17) expression3("b") + Call: (18) isval("b") + Unify: (18) isval("b") + Call: (19) number("b") + Fail: (19) number("b") + Fail: (18) isval("b") + Redo: (17) expression3("b") + Unify: (17) expression3("b") + Call: (18) string("b") + Exit: (18) string("b") + Exit: (17) expression3("b") + Call: (17) expression2([]) + Unify: (17) expression2([]) + Exit: (17) 
expression2([]) + Exit: (16) expression2(["b"]) + Exit: (15) expression2(["a", "b"]) + Exit: (14) expression(["a", "b"]) + Exit: (13) isvalstrorundef(["a", "b"]) + Call: (13) updatevar([v, c], ["a", "b"], [], _G5635) + Unify: (13) updatevar([v, c], ["a", "b"], [], _G5635) + Call: (14) lists:member([[v, c], empty], []) + Fail: (14) lists:member([[v, c], empty], []) + Redo: (13) updatevar([v, c], ["a", "b"], [], _G5635) +^ Call: (14) not(member([[v, c], _G5631], [])) +^ Unify: (14) not(user:member([[v, c], _G5631], [])) +^ Exit: (14) not(user:member([[v, c], _G5631], [])) + Call: (14) _G5631=empty + Exit: (14) empty=empty + Call: (14) true + Exit: (14) true + Call: (14) lists:append([], [[[v, c], ["a", "b"]]], _G5655) + Unify: (14) lists:append([], [[[v, c], ["a", "b"]]], [[[v, c], ["a", "b"]]]) + Exit: (14) lists:append([], [[[v, c], ["a", "b"]]], [[[v, c], ["a", "b"]]]) + Call: (14) true + Exit: (14) true + Call: (14) true + Exit: (14) true + Exit: (13) updatevar([v, c], ["a", "b"], [], [[[v, c], ["a", "b"]]]) + Exit: (12) putvalue([v, c], ["a", "b"], [], [[[v, c], ["a", "b"]]]) + Call: (12) checkarguments([["b", "c"], [v, d]], [[v, b], [v, d]], [[[v, c], ["a", "b"]]], _G5656, [], _G5658) + Unify: (12) checkarguments([["b", "c"], [v, d]], [[v, b], [v, d]], [[[v, c], ["a", "b"]]], _G5656, [], _G5658) + Call: (13) [["b", "c"], [v, d]]=[_G5646|_G5647] + Exit: (13) [["b", "c"], [v, d]]=[["b", "c"], [v, d]] + Call: (13) expressionnotatom(["b", "c"]) + Unify: (13) expressionnotatom(["b", "c"]) + Call: (14) isvalstrempty(["b", "c"]) + Unify: (14) isvalstrempty(["b", "c"]) + Call: (15) var(["b", "c"]) + Fail: (15) var(["b", "c"]) + Redo: (14) isvalstrempty(["b", "c"]) + Unify: (14) isvalstrempty(["b", "c"]) + Call: (15) isval(["b", "c"]) + Unify: (15) isval(["b", "c"]) + Call: (16) number(["b", "c"]) + Fail: (16) number(["b", "c"]) + Fail: (15) isval(["b", "c"]) + Redo: (14) isvalstrempty(["b", "c"]) + Unify: (14) isvalstrempty(["b", "c"]) + Call: (15) string(["b", "c"]) + 
Fail: (15) string(["b", "c"]) + Fail: (14) isvalstrempty(["b", "c"]) + Redo: (13) expressionnotatom(["b", "c"]) + Unify: (13) expressionnotatom(["b", "c"]) +^ Call: (14) not(atom(["b", "c"])) +^ Unify: (14) not(user:atom(["b", "c"])) +^ Exit: (14) not(user:atom(["b", "c"])) + Call: (14) length(["b", "c"], _G5662) + Unify: (14) length(["b", "c"], _G5662) + Exit: (14) length(["b", "c"], 2) + Call: (14) 2>=1 + Exit: (14) 2>=1 + Call: (14) expressionnotatom2(["b", "c"]) + Unify: (14) expressionnotatom2(["b", "c"]) + Call: (15) isvalstrempty("b") + Unify: (15) isvalstrempty("b") + Call: (16) var("b") + Fail: (16) var("b") + Redo: (15) isvalstrempty("b") + Unify: (15) isvalstrempty("b") + Call: (16) isval("b") + Unify: (16) isval("b") + Call: (17) number("b") + Fail: (17) number("b") + Fail: (16) isval("b") + Redo: (15) isvalstrempty("b") + Unify: (15) isvalstrempty("b") + Call: (16) string("b") + Exit: (16) string("b") + Exit: (15) isvalstrempty("b") + Call: (15) expressionnotatom2(["c"]) + Unify: (15) expressionnotatom2(["c"]) + Call: (16) isvalstrempty("c") + Unify: (16) isvalstrempty("c") + Call: (17) var("c") + Fail: (17) var("c") + Redo: (16) isvalstrempty("c") + Unify: (16) isvalstrempty("c") + Call: (17) isval("c") + Unify: (17) isval("c") + Call: (18) number("c") + Fail: (18) number("c") + Fail: (17) isval("c") + Redo: (16) isvalstrempty("c") + Unify: (16) isvalstrempty("c") + Call: (17) string("c") + Exit: (17) string("c") + Exit: (16) isvalstrempty("c") + Call: (16) expressionnotatom2([]) + Unify: (16) expressionnotatom2([]) + Exit: (16) expressionnotatom2([]) + Exit: (15) expressionnotatom2(["c"]) + Exit: (14) expressionnotatom2(["b", "c"]) + Exit: (13) expressionnotatom(["b", "c"]) + Call: (13) [[v, b], [v, d]]=[_G5654|_G5655] + Exit: (13) [[v, b], [v, d]]=[[v, b], [v, d]] +^ Call: (13) not(var([v, b])) +^ Unify: (13) not(user:var([v, b])) +^ Exit: (13) not(user:var([v, b])) + Call: (13) isvar([v, b]) + Unify: (13) isvar([v, b]) + Exit: (13) isvar([v, b]) + 
Call: (13) putvalue([v, b], ["b", "c"], [[[v, c], ["a", "b"]]], _G5672) + Unify: (13) putvalue([v, b], ["b", "c"], [[[v, c], ["a", "b"]]], _G5672) +^ Call: (14) not(isvar([v, b])) +^ Unify: (14) not(user:isvar([v, b])) + Call: (15) isvar([v, b]) + Unify: (15) isvar([v, b]) + Exit: (15) isvar([v, b]) +^ Fail: (14) not(user:isvar([v, b])) + Redo: (13) putvalue([v, b], ["b", "c"], [[[v, c], ["a", "b"]]], _G5672) + Call: (14) isvar([v, b]) + Unify: (14) isvar([v, b]) + Exit: (14) isvar([v, b]) + Call: (14) isvalstrorundef(["b", "c"]) + Unify: (14) isvalstrorundef(["b", "c"]) + Call: (15) var(["b", "c"]) + Fail: (15) var(["b", "c"]) + Redo: (14) isvalstrorundef(["b", "c"]) + Unify: (14) isvalstrorundef(["b", "c"]) +^ Call: (15) not(var(["b", "c"])) +^ Unify: (15) not(user:var(["b", "c"])) +^ Exit: (15) not(user:var(["b", "c"])) + Call: (15) isval(["b", "c"]) + Unify: (15) isval(["b", "c"]) + Call: (16) number(["b", "c"]) + Fail: (16) number(["b", "c"]) + Fail: (15) isval(["b", "c"]) + Redo: (14) isvalstrorundef(["b", "c"]) + Unify: (14) isvalstrorundef(["b", "c"]) +^ Call: (15) not(var(["b", "c"])) +^ Unify: (15) not(user:var(["b", "c"])) +^ Exit: (15) not(user:var(["b", "c"])) + Call: (15) expression(["b", "c"]) + Unify: (15) expression(["b", "c"]) + Call: (16) isval(["b", "c"]) + Unify: (16) isval(["b", "c"]) + Call: (17) number(["b", "c"]) + Fail: (17) number(["b", "c"]) + Fail: (16) isval(["b", "c"]) + Redo: (15) expression(["b", "c"]) + Unify: (15) expression(["b", "c"]) + Call: (16) string(["b", "c"]) + Fail: (16) string(["b", "c"]) + Redo: (15) expression(["b", "c"]) + Unify: (15) expression(["b", "c"]) + Call: (16) atom(["b", "c"]) + Fail: (16) atom(["b", "c"]) + Redo: (15) expression(["b", "c"]) + Unify: (15) expression(["b", "c"]) +^ Call: (16) not(atom(["b", "c"])) +^ Unify: (16) not(user:atom(["b", "c"])) +^ Exit: (16) not(user:atom(["b", "c"])) + Call: (16) length(["b", "c"], _G5680) + Unify: (16) length(["b", "c"], _G5680) + Exit: (16) length(["b", "c"], 
2) + Call: (16) 2>=1 + Exit: (16) 2>=1 + Call: (16) expression2(["b", "c"]) + Unify: (16) expression2(["b", "c"]) + Call: (17) expression3("b") + Unify: (17) expression3("b") + Call: (18) isval("b") + Unify: (18) isval("b") + Call: (19) number("b") + Fail: (19) number("b") + Fail: (18) isval("b") + Redo: (17) expression3("b") + Unify: (17) expression3("b") + Call: (18) string("b") + Exit: (18) string("b") + Exit: (17) expression3("b") + Call: (17) expression2(["c"]) + Unify: (17) expression2(["c"]) + Call: (18) expression3("c") + Unify: (18) expression3("c") + Call: (19) isval("c") + Unify: (19) isval("c") + Call: (20) number("c") + Fail: (20) number("c") + Fail: (19) isval("c") + Redo: (18) expression3("c") + Unify: (18) expression3("c") + Call: (19) string("c") + Exit: (19) string("c") + Exit: (18) expression3("c") + Call: (18) expression2([]) + Unify: (18) expression2([]) + Exit: (18) expression2([]) + Exit: (17) expression2(["c"]) + Exit: (16) expression2(["b", "c"]) + Exit: (15) expression(["b", "c"]) + Exit: (14) isvalstrorundef(["b", "c"]) + Call: (14) updatevar([v, b], ["b", "c"], [[[v, c], ["a", "b"]]], _G5682) + Unify: (14) updatevar([v, b], ["b", "c"], [[[v, c], ["a", "b"]]], _G5682) + Call: (15) lists:member([[v, b], empty], [[[v, c], ["a", "b"]]]) + Unify: (15) lists:member([[v, b], empty], [[[v, c], ["a", "b"]]]) + Fail: (15) lists:member([[v, b], empty], [[[v, c], ["a", "b"]]]) + Redo: (14) updatevar([v, b], ["b", "c"], [[[v, c], ["a", "b"]]], _G5682) +^ Call: (15) not(member([[v, b], _G5678], [[[v, c], ["a", "b"]]])) +^ Unify: (15) not(user:member([[v, b], _G5678], [[[v, c], ["a", "b"]]])) +^ Exit: (15) not(user:member([[v, b], _G5678], [[[v, c], ["a", "b"]]])) + Call: (15) _G5678=empty + Exit: (15) empty=empty + Call: (15) true + Exit: (15) true + Call: (15) lists:append([[[v, c], ["a", "b"]]], [[[v, b], ["b", "c"]]], _G5702) + Unify: (15) lists:append([[[v, c], ["a", "b"]]], [[[v, b], ["b", "c"]]], [[[v, c], ["a", "b"]]|_G5694]) + Exit: (15) 
lists:append([[[v, c], ["a", "b"]]], [[[v, b], ["b", "c"]]], [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]]]) + Call: (15) true + Exit: (15) true + Call: (15) true + Exit: (15) true + Exit: (14) updatevar([v, b], ["b", "c"], [[[v, c], ["a", "b"]]], [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]]]) + Exit: (13) putvalue([v, b], ["b", "c"], [[[v, c], ["a", "b"]]], [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]]]) + Call: (13) checkarguments([[v, d]], [[v, d]], [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]]], _G5706, [], _G5708) + Unify: (13) checkarguments([[v, d]], [[v, d]], [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]]], _G5706, [], _G5708) + Call: (14) [[v, d]]=[_G5696|_G5697] + Exit: (14) [[v, d]]=[[v, d]] + Call: (14) expressionnotatom([v, d]) + Unify: (14) expressionnotatom([v, d]) + Call: (15) isvalstrempty([v, d]) + Unify: (15) isvalstrempty([v, d]) + Call: (16) var([v, d]) + Fail: (16) var([v, d]) + Redo: (15) isvalstrempty([v, d]) + Unify: (15) isvalstrempty([v, d]) + Call: (16) isval([v, d]) + Unify: (16) isval([v, d]) + Call: (17) number([v, d]) + Fail: (17) number([v, d]) + Fail: (16) isval([v, d]) + Redo: (15) isvalstrempty([v, d]) + Unify: (15) isvalstrempty([v, d]) + Call: (16) string([v, d]) + Fail: (16) string([v, d]) + Fail: (15) isvalstrempty([v, d]) + Redo: (14) expressionnotatom([v, d]) + Unify: (14) expressionnotatom([v, d]) +^ Call: (15) not(atom([v, d])) +^ Unify: (15) not(user:atom([v, d])) +^ Exit: (15) not(user:atom([v, d])) + Call: (15) length([v, d], _G5712) + Unify: (15) length([v, d], _G5712) + Exit: (15) length([v, d], 2) + Call: (15) 2>=1 + Exit: (15) 2>=1 + Call: (15) expressionnotatom2([v, d]) + Unify: (15) expressionnotatom2([v, d]) + Call: (16) isvalstrempty(v) + Unify: (16) isvalstrempty(v) + Call: (17) var(v) + Fail: (17) var(v) + Redo: (16) isvalstrempty(v) + Unify: (16) isvalstrempty(v) + Call: (17) isval(v) + Unify: (17) isval(v) + Call: (18) number(v) + Fail: (18) number(v) + Fail: (17) isval(v) + Redo: (16) isvalstrempty(v) + Unify: (16) 
isvalstrempty(v) + Call: (17) string(v) + Fail: (17) string(v) + Fail: (16) isvalstrempty(v) + Fail: (15) expressionnotatom2([v, d]) + Redo: (14) expressionnotatom([v, d]) + Unify: (14) expressionnotatom([v, d]) + Call: (15) predicate_or_rule_name([v, d]) + Fail: (15) predicate_or_rule_name([v, d]) + Fail: (14) expressionnotatom([v, d]) + Redo: (13) checkarguments([[v, d]], [[v, d]], [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]]], _G5706, [], _G5708) + Unify: (13) checkarguments([[v, d]], [[v, d]], [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]]], _G5706, [], _G5708) + Call: (14) [[v, d]]=[_G5696|_G5697] + Exit: (14) [[v, d]]=[[v, d]] +^ Call: (14) not(var([v, d])) +^ Unify: (14) not(user:var([v, d])) +^ Exit: (14) not(user:var([v, d])) + Call: (14) isvar([v, d]) + Unify: (14) isvar([v, d]) + Exit: (14) isvar([v, d]) + Call: (14) [[v, d]]=[_G5704|_G5705] + Exit: (14) [[v, d]]=[[v, d]] + Call: (14) expressionnotatom([v, d]) + Unify: (14) expressionnotatom([v, d]) + Call: (15) isvalstrempty([v, d]) + Unify: (15) isvalstrempty([v, d]) + Call: (16) var([v, d]) + Fail: (16) var([v, d]) + Redo: (15) isvalstrempty([v, d]) + Unify: (15) isvalstrempty([v, d]) + Call: (16) isval([v, d]) + Unify: (16) isval([v, d]) + Call: (17) number([v, d]) + Fail: (17) number([v, d]) + Fail: (16) isval([v, d]) + Redo: (15) isvalstrempty([v, d]) + Unify: (15) isvalstrempty([v, d]) + Call: (16) string([v, d]) + Fail: (16) string([v, d]) + Fail: (15) isvalstrempty([v, d]) + Redo: (14) expressionnotatom([v, d]) + Unify: (14) expressionnotatom([v, d]) +^ Call: (15) not(atom([v, d])) +^ Unify: (15) not(user:atom([v, d])) +^ Exit: (15) not(user:atom([v, d])) + Call: (15) length([v, d], _G5720) + Unify: (15) length([v, d], _G5720) + Exit: (15) length([v, d], 2) + Call: (15) 2>=1 + Exit: (15) 2>=1 + Call: (15) expressionnotatom2([v, d]) + Unify: (15) expressionnotatom2([v, d]) + Call: (16) isvalstrempty(v) + Unify: (16) isvalstrempty(v) + Call: (17) var(v) + Fail: (17) var(v) + Redo: (16) 
isvalstrempty(v) + Unify: (16) isvalstrempty(v) + Call: (17) isval(v) + Unify: (17) isval(v) + Call: (18) number(v) + Fail: (18) number(v) + Fail: (17) isval(v) + Redo: (16) isvalstrempty(v) + Unify: (16) isvalstrempty(v) + Call: (17) string(v) + Fail: (17) string(v) + Fail: (16) isvalstrempty(v) + Fail: (15) expressionnotatom2([v, d]) + Redo: (14) expressionnotatom([v, d]) + Unify: (14) expressionnotatom([v, d]) + Call: (15) predicate_or_rule_name([v, d]) + Fail: (15) predicate_or_rule_name([v, d]) + Fail: (14) expressionnotatom([v, d]) + Redo: (13) checkarguments([[v, d]], [[v, d]], [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]]], _G5706, [], _G5708) + Unify: (13) checkarguments([[v, d]], [[v, d]], [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]]], _G5706, [], _G5708) + Call: (14) [[v, d]]=[_G5696|_G5697] + Exit: (14) [[v, d]]=[[v, d]] +^ Call: (14) not(var([v, d])) +^ Unify: (14) not(user:var([v, d])) +^ Exit: (14) not(user:var([v, d])) + Call: (14) isvar([v, d]) + Unify: (14) isvar([v, d]) + Exit: (14) isvar([v, d]) + Call: (14) [[v, d]]=[_G5704|_G5705] + Exit: (14) [[v, d]]=[[v, d]] +^ Call: (14) not(var([v, d])) +^ Unify: (14) not(user:var([v, d])) +^ Exit: (14) not(user:var([v, d])) + Call: (14) isvar([v, d]) + Unify: (14) isvar([v, d]) + Exit: (14) isvar([v, d]) + Call: (14) getvalue([v, d], _G5720, [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]]]) + Unify: (14) getvalue([v, d], _G5720, [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]]]) +^ Call: (15) not(isvar([v, d])) +^ Unify: (15) not(user:isvar([v, d])) + Call: (16) isvar([v, d]) + Unify: (16) isvar([v, d]) + Exit: (16) isvar([v, d]) +^ Fail: (15) not(user:isvar([v, d])) + Redo: (14) getvalue([v, d], _G5720, [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]]]) + Call: (15) isvar([v, d]) + Unify: (15) isvar([v, d]) + Exit: (15) isvar([v, d]) + Call: (15) isvalstrorundef(_G5719) + Unify: (15) isvalstrorundef(_G5719) + Call: (16) var(_G5719) + Exit: (16) var(_G5719) + Exit: (15) isvalstrorundef(_G5719) + Call: (15) getvar([v, d], 
_G5720, [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]]]) + Unify: (15) getvar([v, d], _G5720, [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]]]) + Call: (16) lists:member([[v, d], _G5715], [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]]]) + Unify: (16) lists:member([[v, d], _G5715], [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]]]) + Fail: (16) lists:member([[v, d], _G5715], [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]]]) + Redo: (15) getvar([v, d], _G5720, [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]]]) + Unify: (15) getvar([v, d], empty, [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]]]) +^ Call: (16) not(member([[v, d], _G5718], [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]]])) +^ Unify: (16) not(user:member([[v, d], _G5718], [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]]])) +^ Exit: (16) not(user:member([[v, d], _G5718], [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]]])) + Call: (16) true + Exit: (16) true + Exit: (15) getvar([v, d], empty, [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]]]) + Exit: (14) getvalue([v, d], empty, [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]]]) + Call: (14) true + Exit: (14) true + Call: (14) putvalue([v, d], empty, [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]]], _G5734) + Unify: (14) putvalue([v, d], empty, [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]]], _G5734) +^ Call: (15) not(isvar([v, d])) +^ Unify: (15) not(user:isvar([v, d])) + Call: (16) isvar([v, d]) + Unify: (16) isvar([v, d]) + Exit: (16) isvar([v, d]) +^ Fail: (15) not(user:isvar([v, d])) + Redo: (14) putvalue([v, d], empty, [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]]], _G5734) + Call: (15) isvar([v, d]) + Unify: (15) isvar([v, d]) + Exit: (15) isvar([v, d]) + Call: (15) isvalstrorundef(empty) + Unify: (15) isvalstrorundef(empty) + Call: (16) var(empty) + Fail: (16) var(empty) + Redo: (15) isvalstrorundef(empty) + Unify: (15) isvalstrorundef(empty) +^ Call: (16) not(var(empty)) +^ Unify: (16) not(user:var(empty)) +^ Exit: (16) not(user:var(empty)) + Call: (16) isval(empty) + Unify: (16) isval(empty) + Call: (17) 
number(empty) + Fail: (17) number(empty) + Fail: (16) isval(empty) + Redo: (15) isvalstrorundef(empty) + Unify: (15) isvalstrorundef(empty) +^ Call: (16) not(var(empty)) +^ Unify: (16) not(user:var(empty)) +^ Exit: (16) not(user:var(empty)) + Call: (16) expression(empty) + Unify: (16) expression(empty) + Exit: (16) expression(empty) + Exit: (15) isvalstrorundef(empty) + Call: (15) updatevar([v, d], empty, [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]]], _G5739) + Unify: (15) updatevar([v, d], empty, [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]]], _G5739) + Call: (16) lists:member([[v, d], empty], [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]]]) + Unify: (16) lists:member([[v, d], empty], [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]]]) + Fail: (16) lists:member([[v, d], empty], [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]]]) + Redo: (15) updatevar([v, d], empty, [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]]], _G5739) +^ Call: (16) not(member([[v, d], _G5735], [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]]])) +^ Unify: (16) not(user:member([[v, d], _G5735], [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]]])) +^ Exit: (16) not(user:member([[v, d], _G5735], [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]]])) + Call: (16) _G5735=empty + Exit: (16) empty=empty + Call: (16) true + Exit: (16) true + Call: (16) lists:append([[[v, c], ["a", "b"]], [[v, b], ["b", "c"]]], [[[v, d], empty]], _G5759) + Unify: (16) lists:append([[[v, c], ["a", "b"]], [[v, b], ["b", "c"]]], [[[v, d], empty]], [[[v, c], ["a", "b"]]|_G5751]) + Exit: (16) lists:append([[[v, c], ["a", "b"]], [[v, b], ["b", "c"]]], [[[v, d], empty]], [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]], [[v, d], empty]]) + Call: (16) true + Exit: (16) true + Call: (16) true + Exit: (16) true + Exit: (15) updatevar([v, d], empty, [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]]], [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]], [[v, d], empty]]) + Exit: (14) putvalue([v, d], empty, [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]]], [[[v, c], ["a", "b"]], [[v, b], ["b", 
"c"]], [[v, d], empty]]) + Call: (14) lists:append([], [[[v, d], [v, d]]], _G5774) + Unify: (14) lists:append([], [[[v, d], [v, d]]], [[[v, d], [v, d]]]) + Exit: (14) lists:append([], [[[v, d], [v, d]]], [[[v, d], [v, d]]]) + Call: (14) checkarguments([], [], [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]], [[v, d], empty]], _G5775, [[[v, d], [v, d]]], _G5777) + Unify: (14) checkarguments([], [], [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]], [[v, d], empty]], [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]], [[v, d], empty]], [[[v, d], [v, d]]], [[[v, d], [v, d]]]) + Exit: (14) checkarguments([], [], [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]], [[v, d], empty]], [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]], [[v, d], empty]], [[[v, d], [v, d]]], [[[v, d], [v, d]]]) + Exit: (13) checkarguments([[v, d]], [[v, d]], [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]]], [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]], [[v, d], empty]], [], [[[v, d], [v, d]]]) + Exit: (12) checkarguments([["b", "c"], [v, d]], [[v, b], [v, d]], [[[v, c], ["a", "b"]]], [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]], [[v, d], empty]], [], [[[v, d], [v, d]]]) + Exit: (11) checkarguments([["a", "b"], ["b", "c"], [v, d]], [[v, c], [v, b], [v, d]], [], [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]], [[v, d], empty]], [], [[[v, d], [v, d]]]) + Call: (11) debug(on) + Fail: (11) debug(on) + Redo: (10) member1([[n, f], [["a", "b"], ["b", "c"], [v, d]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], _G15) + Call: (11) true + Exit: (11) true + Call: (11) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]], [[v, d], empty]], _G5775, [[[n, =], [[v, d], [v, c]]]], true) + Unify: (11) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", 
[[[n|...], [...|...]]]]], [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]], [[v, d], empty]], _G5775, [[[n, =], [[v, d], [v, c]]]], true) + Call: (12) [[[n, =], [[v, d], [v, c]]]]=[_G5765|_G5766] + Exit: (12) [[[n, =], [[v, d], [v, c]]]]=[[[n, =], [[v, d], [v, c]]]] +^ Call: (12) not(predicate_or_rule_name([[n, =], [[v, d], [v, c]]])) +^ Unify: (12) not(user:predicate_or_rule_name([[n, =], [[v, d], [v, c]]])) + Call: (13) predicate_or_rule_name([[n, =], [[v, d], [v, c]]]) + Fail: (13) predicate_or_rule_name([[n, =], [[v, d], [v, c]]]) +^ Exit: (12) not(user:predicate_or_rule_name([[n, =], [[v, d], [v, c]]])) + Call: (12) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]], [[v, d], empty]], _G5783, [[n, =], [[v, d], [v, c]]], _G5785) + Unify: (12) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]], [[v, d], empty]], _G5783, [[n, =], [[v, d], [v, c]]], _G5785) + Call: (13) [[n, =], [[v, d], [v, c]]]=[_G5773|_G5774] + Exit: (13) [[n, =], [[v, d], [v, c]]]=[[n, =], [[v, d], [v, c]]] +^ Call: (13) not(predicate_or_rule_name([n, =])) +^ Unify: (13) not(user:predicate_or_rule_name([n, =])) + Call: (14) predicate_or_rule_name([n, =]) + Unify: (14) predicate_or_rule_name([n, =]) + Exit: (14) predicate_or_rule_name([n, =]) +^ Fail: (13) not(user:predicate_or_rule_name([n, =])) + Redo: (12) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]], [[v, d], empty]], _G5783, [[n, =], [[v, d], [v, c]]], _G5785) + Unify: (12) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], 
[...|...]]]]], [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]], [[v, d], empty]], _G5783, [[n, =], [[v, d], [v, c]]], _G5785) + Call: (13) [[n, =], [[v, d], [v, c]]]=[[not, [_G5782]]|_G5774] + Fail: (13) [[n, =], [[v, d], [v, c]]]=[[not, [_G5782]]|_G5774] + Redo: (12) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]], [[v, d], empty]], _G5783, [[n, =], [[v, d], [v, c]]], _G5785) + Unify: (12) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]], [[v, d], empty]], _G5783, [[n, =], [[v, d], [v, c]]], _G5785) + Call: (13) [[n, =], [[v, d], [v, c]]]=[[_G5776], or, [_G5785]] + Fail: (13) [[n, =], [[v, d], [v, c]]]=[[_G5776], or, [_G5785]] + Redo: (12) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]], [[v, d], empty]], _G5783, [[n, =], [[v, d], [v, c]]], _G5785) + Unify: (12) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]], [[v, d], empty]], _G5783, [[n, =], [[v, d], [v, c]]], _G5785) + Call: (13) [[n, =], [[v, d], [v, c]]]=[_G5773|_G5774] + Exit: (13) [[n, =], [[v, d], [v, c]]]=[[n, =], [[v, d], [v, c]]] +^ Call: (13) not(predicate_or_rule_name([n, =])) +^ Unify: (13) not(user:predicate_or_rule_name([n, =])) + Call: (14) predicate_or_rule_name([n, =]) + Unify: (14) predicate_or_rule_name([n, =]) + Exit: (14) predicate_or_rule_name([n, =]) +^ Fail: (13) not(user:predicate_or_rule_name([n, =])) + Fail: (12) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, 
b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]], [[v, d], empty]], _G5783, [[n, =], [[v, d], [v, c]]], _G5785) + Redo: (11) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]], [[v, d], empty]], _G5775, [[[n, =], [[v, d], [v, c]]]], true) + Unify: (11) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]], [[v, d], empty]], _G5775, [[[n, =], [[v, d], [v, c]]]], true) + Call: (12) [[[n, =], [[v, d], [v, c]]]]=[[not, [_G5774]]|_G5766] + Fail: (12) [[[n, =], [[v, d], [v, c]]]]=[[not, [_G5774]]|_G5766] + Redo: (11) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]], [[v, d], empty]], _G5775, [[[n, =], [[v, d], [v, c]]]], true) + Unify: (11) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]], [[v, d], empty]], _G5775, [[[n, =], [[v, d], [v, c]]]], true) + Call: (12) [[[n, =], [[v, d], [v, c]]]]=[[_G5768], or, [_G5777]] + Fail: (12) [[[n, =], [[v, d], [v, c]]]]=[[_G5768], or, [_G5777]] + Redo: (11) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]], [[v, d], empty]], _G5775, [[[n, =], [[v, d], [v, c]]]], true) + Unify: (11) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]], [[v, d], empty]], 
_G5775, [[[n, =], [[v, d], [v, c]]]], true) + Call: (12) [[[n, =], [[v, d], [v, c]]]]=[_G5765|_G5766] + Exit: (12) [[[n, =], [[v, d], [v, c]]]]=[[[n, =], [[v, d], [v, c]]]] +^ Call: (12) not(predicate_or_rule_name([[n, =], [[v, d], [v, c]]])) +^ Unify: (12) not(user:predicate_or_rule_name([[n, =], [[v, d], [v, c]]])) + Call: (13) predicate_or_rule_name([[n, =], [[v, d], [v, c]]]) + Fail: (13) predicate_or_rule_name([[n, =], [[v, d], [v, c]]]) +^ Exit: (12) not(user:predicate_or_rule_name([[n, =], [[v, d], [v, c]]])) + Call: (12) interpretstatement1([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[n, =], [[v, d], [v, c]]], [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]], [[v, d], empty]], _G5784, _G5785, _G5786) + Unify: (12) interpretstatement1([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[n, =], [[v, d], [v, c]]], [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]], [[v, d], empty]], _G5784, true, nocut) + Call: (13) isop(=) + Unify: (13) isop(=) + Exit: (13) isop(=) + Call: (13) interpretpart(is, [v, d], [v, c], [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]], [[v, d], empty]], _G5784) + Unify: (13) interpretpart(is, [v, d], [v, c], [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]], [[v, d], empty]], _G5784) + Call: (14) getvalues([v, d], [v, c], _G5782, _G5783, [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]], [[v, d], empty]]) + Unify: (14) getvalues([v, d], [v, c], _G5782, _G5783, [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]], [[v, d], empty]]) + Call: (15) getvalue([v, d], _G5781, [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]], [[v, d], empty]]) + Unify: (15) getvalue([v, d], _G5781, [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]], [[v, d], empty]]) +^ Call: (16) not(isvar([v, d])) +^ Unify: (16) not(user:isvar([v, d])) + Call: (17) isvar([v, d]) + Unify: (17) isvar([v, d]) + Exit: (17) isvar([v, d]) +^ Fail: (16) 
not(user:isvar([v, d])) + Redo: (15) getvalue([v, d], _G5781, [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]], [[v, d], empty]]) + Call: (16) isvar([v, d]) + Unify: (16) isvar([v, d]) + Exit: (16) isvar([v, d]) + Call: (16) isvalstrorundef(_G5780) + Unify: (16) isvalstrorundef(_G5780) + Call: (17) var(_G5780) + Exit: (17) var(_G5780) + Exit: (16) isvalstrorundef(_G5780) + Call: (16) getvar([v, d], _G5781, [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]], [[v, d], empty]]) + Unify: (16) getvar([v, d], _G5781, [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]], [[v, d], empty]]) + Call: (17) lists:member([[v, d], _G5776], [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]], [[v, d], empty]]) + Unify: (17) lists:member([[v, d], _G5776], [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]], [[v, d], empty]]) + Exit: (17) lists:member([[v, d], empty], [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]], [[v, d], empty]]) +^ Call: (17) not(empty=empty) +^ Unify: (17) not(user: (empty=empty)) +^ Fail: (17) not(user: (empty=empty)) + Redo: (16) getvar([v, d], _G5781, [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]], [[v, d], empty]]) + Unify: (16) getvar([v, d], empty, [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]], [[v, d], empty]]) +^ Call: (17) not(member([[v, d], _G5779], [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]], [[v, d], empty]])) +^ Unify: (17) not(user:member([[v, d], _G5779], [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]], [[v, d], empty]])) +^ Fail: (17) not(user:member([[v, d], _G5779], [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]], [[v, d], empty]])) + Redo: (16) getvar([v, d], empty, [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]], [[v, d], empty]]) + Call: (17) lists:member([[v, d], empty], [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]], [[v, d], empty]]) + Unify: (17) lists:member([[v, d], empty], [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]], [[v, d], empty]]) + Exit: (17) lists:member([[v, d], empty], [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]], [[v, d], empty]]) + Exit: (16) getvar([v, d], empty, [[[v, c], ["a", 
"b"]], [[v, b], ["b", "c"]], [[v, d], empty]]) + Exit: (15) getvalue([v, d], empty, [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]], [[v, d], empty]]) + Call: (15) getvalue([v, c], _G5787, [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]], [[v, d], empty]]) + Unify: (15) getvalue([v, c], _G5787, [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]], [[v, d], empty]]) +^ Call: (16) not(isvar([v, c])) +^ Unify: (16) not(user:isvar([v, c])) + Call: (17) isvar([v, c]) + Unify: (17) isvar([v, c]) + Exit: (17) isvar([v, c]) +^ Fail: (16) not(user:isvar([v, c])) + Redo: (15) getvalue([v, c], _G5787, [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]], [[v, d], empty]]) + Call: (16) isvar([v, c]) + Unify: (16) isvar([v, c]) + Exit: (16) isvar([v, c]) + Call: (16) isvalstrorundef(_G5786) + Unify: (16) isvalstrorundef(_G5786) + Call: (17) var(_G5786) + Exit: (17) var(_G5786) + Exit: (16) isvalstrorundef(_G5786) + Call: (16) getvar([v, c], _G5787, [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]], [[v, d], empty]]) + Unify: (16) getvar([v, c], _G5787, [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]], [[v, d], empty]]) + Call: (17) lists:member([[v, c], _G5782], [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]], [[v, d], empty]]) + Unify: (17) lists:member([[v, c], _G5782], [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]], [[v, d], empty]]) + Exit: (17) lists:member([[v, c], ["a", "b"]], [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]], [[v, d], empty]]) +^ Call: (17) not(["a", "b"]=empty) +^ Unify: (17) not(user: (["a", "b"]=empty)) +^ Exit: (17) not(user: (["a", "b"]=empty)) + Exit: (16) getvar([v, c], ["a", "b"], [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]], [[v, d], empty]]) + Exit: (15) getvalue([v, c], ["a", "b"], [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]], [[v, d], empty]]) + Exit: (14) getvalues([v, d], [v, c], empty, ["a", "b"], [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]], [[v, d], empty]]) + Call: (14) expression(empty) + Unify: (14) expression(empty) + Exit: (14) expression(empty) + Call: (14) expression(["a", "b"]) + 
Unify: (14) expression(["a", "b"]) + Call: (15) isval(["a", "b"]) + Unify: (15) isval(["a", "b"]) + Call: (16) number(["a", "b"]) + Fail: (16) number(["a", "b"]) + Fail: (15) isval(["a", "b"]) + Redo: (14) expression(["a", "b"]) + Unify: (14) expression(["a", "b"]) + Call: (15) string(["a", "b"]) + Fail: (15) string(["a", "b"]) + Redo: (14) expression(["a", "b"]) + Unify: (14) expression(["a", "b"]) + Call: (15) atom(["a", "b"]) + Fail: (15) atom(["a", "b"]) + Redo: (14) expression(["a", "b"]) + Unify: (14) expression(["a", "b"]) +^ Call: (15) not(atom(["a", "b"])) +^ Unify: (15) not(user:atom(["a", "b"])) +^ Exit: (15) not(user:atom(["a", "b"])) + Call: (15) length(["a", "b"], _G5804) + Unify: (15) length(["a", "b"], _G5804) + Exit: (15) length(["a", "b"], 2) + Call: (15) 2>=1 + Exit: (15) 2>=1 + Call: (15) expression2(["a", "b"]) + Unify: (15) expression2(["a", "b"]) + Call: (16) expression3("a") + Unify: (16) expression3("a") + Call: (17) isval("a") + Unify: (17) isval("a") + Call: (18) number("a") + Fail: (18) number("a") + Fail: (17) isval("a") + Redo: (16) expression3("a") + Unify: (16) expression3("a") + Call: (17) string("a") + Exit: (17) string("a") + Exit: (16) expression3("a") + Call: (16) expression2(["b"]) + Unify: (16) expression2(["b"]) + Call: (17) expression3("b") + Unify: (17) expression3("b") + Call: (18) isval("b") + Unify: (18) isval("b") + Call: (19) number("b") + Fail: (19) number("b") + Fail: (18) isval("b") + Redo: (17) expression3("b") + Unify: (17) expression3("b") + Call: (18) string("b") + Exit: (18) string("b") + Exit: (17) expression3("b") + Call: (17) expression2([]) + Unify: (17) expression2([]) + Exit: (17) expression2([]) + Exit: (16) expression2(["b"]) + Exit: (15) expression2(["a", "b"]) + Exit: (14) expression(["a", "b"]) + Call: (14) val1emptyorvalsequal(empty, ["a", "b"]) + Unify: (14) val1emptyorvalsequal(empty, ["a", "b"]) + Exit: (14) val1emptyorvalsequal(empty, ["a", "b"]) + Call: (14) putvalue([v, d], ["a", "b"], [[[v, 
c], ["a", "b"]], [[v, b], ["b", "c"]], [[v, d], empty]], _G5806) + Unify: (14) putvalue([v, d], ["a", "b"], [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]], [[v, d], empty]], _G5806) +^ Call: (15) not(isvar([v, d])) +^ Unify: (15) not(user:isvar([v, d])) + Call: (16) isvar([v, d]) + Unify: (16) isvar([v, d]) + Exit: (16) isvar([v, d]) +^ Fail: (15) not(user:isvar([v, d])) + Redo: (14) putvalue([v, d], ["a", "b"], [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]], [[v, d], empty]], _G5806) + Call: (15) isvar([v, d]) + Unify: (15) isvar([v, d]) + Exit: (15) isvar([v, d]) + Call: (15) isvalstrorundef(["a", "b"]) + Unify: (15) isvalstrorundef(["a", "b"]) + Call: (16) var(["a", "b"]) + Fail: (16) var(["a", "b"]) + Redo: (15) isvalstrorundef(["a", "b"]) + Unify: (15) isvalstrorundef(["a", "b"]) +^ Call: (16) not(var(["a", "b"])) +^ Unify: (16) not(user:var(["a", "b"])) +^ Exit: (16) not(user:var(["a", "b"])) + Call: (16) isval(["a", "b"]) + Unify: (16) isval(["a", "b"]) + Call: (17) number(["a", "b"]) + Fail: (17) number(["a", "b"]) + Fail: (16) isval(["a", "b"]) + Redo: (15) isvalstrorundef(["a", "b"]) + Unify: (15) isvalstrorundef(["a", "b"]) +^ Call: (16) not(var(["a", "b"])) +^ Unify: (16) not(user:var(["a", "b"])) +^ Exit: (16) not(user:var(["a", "b"])) + Call: (16) expression(["a", "b"]) + Unify: (16) expression(["a", "b"]) + Call: (17) isval(["a", "b"]) + Unify: (17) isval(["a", "b"]) + Call: (18) number(["a", "b"]) + Fail: (18) number(["a", "b"]) + Fail: (17) isval(["a", "b"]) + Redo: (16) expression(["a", "b"]) + Unify: (16) expression(["a", "b"]) + Call: (17) string(["a", "b"]) + Fail: (17) string(["a", "b"]) + Redo: (16) expression(["a", "b"]) + Unify: (16) expression(["a", "b"]) + Call: (17) atom(["a", "b"]) + Fail: (17) atom(["a", "b"]) + Redo: (16) expression(["a", "b"]) + Unify: (16) expression(["a", "b"]) +^ Call: (17) not(atom(["a", "b"])) +^ Unify: (17) not(user:atom(["a", "b"])) +^ Exit: (17) not(user:atom(["a", "b"])) + Call: (17) length(["a", "b"], _G5814) + 
Unify: (17) length(["a", "b"], _G5814) + Exit: (17) length(["a", "b"], 2) + Call: (17) 2>=1 + Exit: (17) 2>=1 + Call: (17) expression2(["a", "b"]) + Unify: (17) expression2(["a", "b"]) + Call: (18) expression3("a") + Unify: (18) expression3("a") + Call: (19) isval("a") + Unify: (19) isval("a") + Call: (20) number("a") + Fail: (20) number("a") + Fail: (19) isval("a") + Redo: (18) expression3("a") + Unify: (18) expression3("a") + Call: (19) string("a") + Exit: (19) string("a") + Exit: (18) expression3("a") + Call: (18) expression2(["b"]) + Unify: (18) expression2(["b"]) + Call: (19) expression3("b") + Unify: (19) expression3("b") + Call: (20) isval("b") + Unify: (20) isval("b") + Call: (21) number("b") + Fail: (21) number("b") + Fail: (20) isval("b") + Redo: (19) expression3("b") + Unify: (19) expression3("b") + Call: (20) string("b") + Exit: (20) string("b") + Exit: (19) expression3("b") + Call: (19) expression2([]) + Unify: (19) expression2([]) + Exit: (19) expression2([]) + Exit: (18) expression2(["b"]) + Exit: (17) expression2(["a", "b"]) + Exit: (16) expression(["a", "b"]) + Exit: (15) isvalstrorundef(["a", "b"]) + Call: (15) updatevar([v, d], ["a", "b"], [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]], [[v, d], empty]], _G5816) + Unify: (15) updatevar([v, d], ["a", "b"], [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]], [[v, d], empty]], _G5816) + Call: (16) lists:member([[v, d], empty], [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]], [[v, d], empty]]) + Unify: (16) lists:member([[v, d], empty], [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]], [[v, d], empty]]) + Exit: (16) lists:member([[v, d], empty], [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]], [[v, d], empty]]) + Call: (16) lists:delete([[[v, c], ["a", "b"]], [[v, b], ["b", "c"]], [[v, d], empty]], [[v, d], empty], _G5827) + Unify: (16) lists:delete([[[v, c], ["a", "b"]], [[v, b], ["b", "c"]], [[v, d], empty]], [[v, d], empty], _G5827) + Exit: (16) lists:delete([[[v, c], ["a", "b"]], [[v, b], ["b", "c"]], [[v, d], empty]], 
[[v, d], empty], [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]]]) + Call: (16) lists:append([[[v, c], ["a", "b"]], [[v, b], ["b", "c"]]], [[[v, d], ["a", "b"]]], _G5842) + Unify: (16) lists:append([[[v, c], ["a", "b"]], [[v, b], ["b", "c"]]], [[[v, d], ["a", "b"]]], [[[v, c], ["a", "b"]]|_G5834]) + Exit: (16) lists:append([[[v, c], ["a", "b"]], [[v, b], ["b", "c"]]], [[[v, d], ["a", "b"]]], [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]], [[v, d], ["a", "b"]]]) + Call: (16) true + Exit: (16) true + Call: (16) true + Exit: (16) true + Call: (16) true + Exit: (16) true + Exit: (15) updatevar([v, d], ["a", "b"], [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]], [[v, d], empty]], [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]], [[v, d], ["a", "b"]]]) + Exit: (14) putvalue([v, d], ["a", "b"], [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]], [[v, d], empty]], [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]], [[v, d], ["a", "b"]]]) + Call: (14) debug(on) + Fail: (14) debug(on) + Redo: (13) interpretpart(is, [v, d], [v, c], [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]], [[v, d], empty]], [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]], [[v, d], ["a", "b"]]]) + Call: (14) true + Exit: (14) true + Call: (14) debug(on) + Fail: (14) debug(on) + Redo: (13) interpretpart(is, [v, d], [v, c], [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]], [[v, d], empty]], [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]], [[v, d], ["a", "b"]]]) + Call: (14) true + Exit: (14) true + Exit: (13) interpretpart(is, [v, d], [v, c], [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]], [[v, d], empty]], [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]], [[v, d], ["a", "b"]]]) + Exit: (12) interpretstatement1([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[n, =], [[v, d], [v, c]]], [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]], [[v, d], empty]], [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]], [[v, d], ["a", "b"]]], true, nocut) +^ Call: (12) not(nocut=cut) +^ Unify: (12) 
not(user: (nocut=cut)) +^ Exit: (12) not(user: (nocut=cut)) + Call: (12) _G5852=[[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]] + Exit: (12) [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]]=[[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]] + Call: (12) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]], [[v, d], ["a", "b"]]], _G5855, [], _G5857) + Unify: (12) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]], [[v, d], ["a", "b"]]], [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]], [[v, d], ["a", "b"]]], [], true) + Exit: (12) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]], [[v, d], ["a", "b"]]], [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]], [[v, d], ["a", "b"]]], [], true) + Call: (12) logicalconjunction(true, true, true) + Unify: (12) logicalconjunction(true, true, true) + Call: (13) true(true) + Unify: (13) true(true) + Exit: (13) true(true) + Call: (13) true(true) + Unify: (13) true(true) + Exit: (13) true(true) + Exit: (12) logicalconjunction(true, true, true) + Exit: (11) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]], [[v, d], empty]], [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]], [[v, d], ["a", "b"]]], [[[n, =], [[v, d], [v, c]]]], true) + Call: (11) updatevars([[[v, d], [v, d]]], [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]], [[v, d], ["a", "b"]]], [], _G5855) + Unify: (11) updatevars([[[v, d], [v, d]]], [[[v, c], ["a", "b"]], [[v, b], ["b", 
"c"]], [[v, d], ["a", "b"]]], [], _G5855) + Call: (12) [[[v, d], [v, d]]]=[[_G5848, _G5851]|_G5846] + Exit: (12) [[[v, d], [v, d]]]=[[[v, d], [v, d]]] + Call: (12) expressionnotatom([v, d]) + Unify: (12) expressionnotatom([v, d]) + Call: (13) isvalstrempty([v, d]) + Unify: (13) isvalstrempty([v, d]) + Call: (14) var([v, d]) + Fail: (14) var([v, d]) + Redo: (13) isvalstrempty([v, d]) + Unify: (13) isvalstrempty([v, d]) + Call: (14) isval([v, d]) + Unify: (14) isval([v, d]) + Call: (15) number([v, d]) + Fail: (15) number([v, d]) + Fail: (14) isval([v, d]) + Redo: (13) isvalstrempty([v, d]) + Unify: (13) isvalstrempty([v, d]) + Call: (14) string([v, d]) + Fail: (14) string([v, d]) + Fail: (13) isvalstrempty([v, d]) + Redo: (12) expressionnotatom([v, d]) + Unify: (12) expressionnotatom([v, d]) +^ Call: (13) not(atom([v, d])) +^ Unify: (13) not(user:atom([v, d])) +^ Exit: (13) not(user:atom([v, d])) + Call: (13) length([v, d], _G5867) + Unify: (13) length([v, d], _G5867) + Exit: (13) length([v, d], 2) + Call: (13) 2>=1 + Exit: (13) 2>=1 + Call: (13) expressionnotatom2([v, d]) + Unify: (13) expressionnotatom2([v, d]) + Call: (14) isvalstrempty(v) + Unify: (14) isvalstrempty(v) + Call: (15) var(v) + Fail: (15) var(v) + Redo: (14) isvalstrempty(v) + Unify: (14) isvalstrempty(v) + Call: (15) isval(v) + Unify: (15) isval(v) + Call: (16) number(v) + Fail: (16) number(v) + Fail: (15) isval(v) + Redo: (14) isvalstrempty(v) + Unify: (14) isvalstrempty(v) + Call: (15) string(v) + Fail: (15) string(v) + Fail: (14) isvalstrempty(v) + Fail: (13) expressionnotatom2([v, d]) + Redo: (12) expressionnotatom([v, d]) + Unify: (12) expressionnotatom([v, d]) + Call: (13) predicate_or_rule_name([v, d]) + Fail: (13) predicate_or_rule_name([v, d]) + Fail: (12) expressionnotatom([v, d]) + Redo: (11) updatevars([[[v, d], [v, d]]], [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]], [[v, d], ["a", "b"]]], [], _G5864) + Call: (12) lists:member([[v, d], _G5857], [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]], 
[[v, d], ["a", "b"]]]) + Unify: (12) lists:member([[v, d], _G5857], [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]], [[v, d], ["a", "b"]]]) + Exit: (12) lists:member([[v, d], ["a", "b"]], [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]], [[v, d], ["a", "b"]]]) + Call: (12) lists:append([], [[[v, d], ["a", "b"]]], _G5878) + Unify: (12) lists:append([], [[[v, d], ["a", "b"]]], [[[v, d], ["a", "b"]]]) + Exit: (12) lists:append([], [[[v, d], ["a", "b"]]], [[[v, d], ["a", "b"]]]) + Call: (12) updatevars([], [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]], [[v, d], ["a", "b"]]], [[[v, d], ["a", "b"]]], _G5879) + Unify: (12) updatevars([], [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]], [[v, d], ["a", "b"]]], [[[v, d], ["a", "b"]]], [[[v, d], ["a", "b"]]]) + Exit: (12) updatevars([], [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]], [[v, d], ["a", "b"]]], [[[v, d], ["a", "b"]]], [[[v, d], ["a", "b"]]]) + Exit: (11) updatevars([[[v, d], [v, d]]], [[[v, c], ["a", "b"]], [[v, b], ["b", "c"]], [[v, d], ["a", "b"]]], [], [[[v, d], ["a", "b"]]]) +^ Call: (11) not([[[v, d], ["a", "b"]]]=[]) +^ Unify: (11) not(user: ([[[v, d], ["a", "b"]]]=[])) +^ Exit: (11) not(user: ([[[v, d], ["a", "b"]]]=[])) + Call: (11) unique1([[[v, d], ["a", "b"]]], [], _G15) + Unify: (11) unique1([[[v, d], ["a", "b"]]], [], _G15) + Call: (12) lists:delete([], [[v, d], ["a", "b"]], _G5884) + Unify: (12) lists:delete([], [[v, d], ["a", "b"]], []) + Exit: (12) lists:delete([], [[v, d], ["a", "b"]], []) + Call: (12) lists:append([], [[[v, d], ["a", "b"]]], _G5887) + Unify: (12) lists:append([], [[[v, d], ["a", "b"]]], [[[v, d], ["a", "b"]]]) + Exit: (12) lists:append([], [[[v, d], ["a", "b"]]], [[[v, d], ["a", "b"]]]) + Call: (12) unique1([], [[[v, d], ["a", "b"]]], _G15) + Unify: (12) unique1([], [[[v, d], ["a", "b"]]], [[[v, d], ["a", "b"]]]) + Exit: (12) unique1([], [[[v, d], ["a", "b"]]], [[[v, d], ["a", "b"]]]) + Exit: (11) unique1([[[v, d], ["a", "b"]]], [], [[[v, d], ["a", "b"]]]) + Call: (11) findresult3([["a", "b"], 
["b", "c"], [v, d]], [[[v, d], ["a", "b"]]], [], _G5888) + Unify: (11) findresult3([["a", "b"], ["b", "c"], [v, d]], [[[v, d], ["a", "b"]]], [], _G5888) + Call: (12) [["a", "b"], ["b", "c"], [v, d]]=[_G5878|_G5879] + Exit: (12) [["a", "b"], ["b", "c"], [v, d]]=[["a", "b"], ["b", "c"], [v, d]] + Call: (12) expressionnotatom(["a", "b"]) + Unify: (12) expressionnotatom(["a", "b"]) + Call: (13) isvalstrempty(["a", "b"]) + Unify: (13) isvalstrempty(["a", "b"]) + Call: (14) var(["a", "b"]) + Fail: (14) var(["a", "b"]) + Redo: (13) isvalstrempty(["a", "b"]) + Unify: (13) isvalstrempty(["a", "b"]) + Call: (14) isval(["a", "b"]) + Unify: (14) isval(["a", "b"]) + Call: (15) number(["a", "b"]) + Fail: (15) number(["a", "b"]) + Fail: (14) isval(["a", "b"]) + Redo: (13) isvalstrempty(["a", "b"]) + Unify: (13) isvalstrempty(["a", "b"]) + Call: (14) string(["a", "b"]) + Fail: (14) string(["a", "b"]) + Fail: (13) isvalstrempty(["a", "b"]) + Redo: (12) expressionnotatom(["a", "b"]) + Unify: (12) expressionnotatom(["a", "b"]) +^ Call: (13) not(atom(["a", "b"])) +^ Unify: (13) not(user:atom(["a", "b"])) +^ Exit: (13) not(user:atom(["a", "b"])) + Call: (13) length(["a", "b"], _G5894) + Unify: (13) length(["a", "b"], _G5894) + Exit: (13) length(["a", "b"], 2) + Call: (13) 2>=1 + Exit: (13) 2>=1 + Call: (13) expressionnotatom2(["a", "b"]) + Unify: (13) expressionnotatom2(["a", "b"]) + Call: (14) isvalstrempty("a") + Unify: (14) isvalstrempty("a") + Call: (15) var("a") + Fail: (15) var("a") + Redo: (14) isvalstrempty("a") + Unify: (14) isvalstrempty("a") + Call: (15) isval("a") + Unify: (15) isval("a") + Call: (16) number("a") + Fail: (16) number("a") + Fail: (15) isval("a") + Redo: (14) isvalstrempty("a") + Unify: (14) isvalstrempty("a") + Call: (15) string("a") + Exit: (15) string("a") + Exit: (14) isvalstrempty("a") + Call: (14) expressionnotatom2(["b"]) + Unify: (14) expressionnotatom2(["b"]) + Call: (15) isvalstrempty("b") + Unify: (15) isvalstrempty("b") + Call: (16) var("b") + 
Fail: (16) var("b") + Redo: (15) isvalstrempty("b") + Unify: (15) isvalstrempty("b") + Call: (16) isval("b") + Unify: (16) isval("b") + Call: (17) number("b") + Fail: (17) number("b") + Fail: (16) isval("b") + Redo: (15) isvalstrempty("b") + Unify: (15) isvalstrempty("b") + Call: (16) string("b") + Exit: (16) string("b") + Exit: (15) isvalstrempty("b") + Call: (15) expressionnotatom2([]) + Unify: (15) expressionnotatom2([]) + Exit: (15) expressionnotatom2([]) + Exit: (14) expressionnotatom2(["b"]) + Exit: (13) expressionnotatom2(["a", "b"]) + Exit: (12) expressionnotatom(["a", "b"]) + Call: (12) lists:append([], [["a", "b"]], _G5898) + Unify: (12) lists:append([], [["a", "b"]], [["a", "b"]]) + Exit: (12) lists:append([], [["a", "b"]], [["a", "b"]]) + Call: (12) findresult3([["b", "c"], [v, d]], [[[v, d], ["a", "b"]]], [["a", "b"]], _G5899) + Unify: (12) findresult3([["b", "c"], [v, d]], [[[v, d], ["a", "b"]]], [["a", "b"]], _G5899) + Call: (13) [["b", "c"], [v, d]]=[_G5889|_G5890] + Exit: (13) [["b", "c"], [v, d]]=[["b", "c"], [v, d]] + Call: (13) expressionnotatom(["b", "c"]) + Unify: (13) expressionnotatom(["b", "c"]) + Call: (14) isvalstrempty(["b", "c"]) + Unify: (14) isvalstrempty(["b", "c"]) + Call: (15) var(["b", "c"]) + Fail: (15) var(["b", "c"]) + Redo: (14) isvalstrempty(["b", "c"]) + Unify: (14) isvalstrempty(["b", "c"]) + Call: (15) isval(["b", "c"]) + Unify: (15) isval(["b", "c"]) + Call: (16) number(["b", "c"]) + Fail: (16) number(["b", "c"]) + Fail: (15) isval(["b", "c"]) + Redo: (14) isvalstrempty(["b", "c"]) + Unify: (14) isvalstrempty(["b", "c"]) + Call: (15) string(["b", "c"]) + Fail: (15) string(["b", "c"]) + Fail: (14) isvalstrempty(["b", "c"]) + Redo: (13) expressionnotatom(["b", "c"]) + Unify: (13) expressionnotatom(["b", "c"]) +^ Call: (14) not(atom(["b", "c"])) +^ Unify: (14) not(user:atom(["b", "c"])) +^ Exit: (14) not(user:atom(["b", "c"])) + Call: (14) length(["b", "c"], _G5905) + Unify: (14) length(["b", "c"], _G5905) + Exit: (14) 
length(["b", "c"], 2) + Call: (14) 2>=1 + Exit: (14) 2>=1 + Call: (14) expressionnotatom2(["b", "c"]) + Unify: (14) expressionnotatom2(["b", "c"]) + Call: (15) isvalstrempty("b") + Unify: (15) isvalstrempty("b") + Call: (16) var("b") + Fail: (16) var("b") + Redo: (15) isvalstrempty("b") + Unify: (15) isvalstrempty("b") + Call: (16) isval("b") + Unify: (16) isval("b") + Call: (17) number("b") + Fail: (17) number("b") + Fail: (16) isval("b") + Redo: (15) isvalstrempty("b") + Unify: (15) isvalstrempty("b") + Call: (16) string("b") + Exit: (16) string("b") + Exit: (15) isvalstrempty("b") + Call: (15) expressionnotatom2(["c"]) + Unify: (15) expressionnotatom2(["c"]) + Call: (16) isvalstrempty("c") + Unify: (16) isvalstrempty("c") + Call: (17) var("c") + Fail: (17) var("c") + Redo: (16) isvalstrempty("c") + Unify: (16) isvalstrempty("c") + Call: (17) isval("c") + Unify: (17) isval("c") + Call: (18) number("c") + Fail: (18) number("c") + Fail: (17) isval("c") + Redo: (16) isvalstrempty("c") + Unify: (16) isvalstrempty("c") + Call: (17) string("c") + Exit: (17) string("c") + Exit: (16) isvalstrempty("c") + Call: (16) expressionnotatom2([]) + Unify: (16) expressionnotatom2([]) + Exit: (16) expressionnotatom2([]) + Exit: (15) expressionnotatom2(["c"]) + Exit: (14) expressionnotatom2(["b", "c"]) + Exit: (13) expressionnotatom(["b", "c"]) + Call: (13) lists:append([["a", "b"]], [["b", "c"]], _G5909) + Unify: (13) lists:append([["a", "b"]], [["b", "c"]], [["a", "b"]|_G5901]) + Exit: (13) lists:append([["a", "b"]], [["b", "c"]], [["a", "b"], ["b", "c"]]) + Call: (13) findresult3([[v, d]], [[[v, d], ["a", "b"]]], [["a", "b"], ["b", "c"]], _G5913) + Unify: (13) findresult3([[v, d]], [[[v, d], ["a", "b"]]], [["a", "b"], ["b", "c"]], _G5913) + Call: (14) [[v, d]]=[_G5903|_G5904] + Exit: (14) [[v, d]]=[[v, d]] + Call: (14) expressionnotatom([v, d]) + Unify: (14) expressionnotatom([v, d]) + Call: (15) isvalstrempty([v, d]) + Unify: (15) isvalstrempty([v, d]) + Call: (16) var([v, d]) + 
Fail: (16) var([v, d]) + Redo: (15) isvalstrempty([v, d]) + Unify: (15) isvalstrempty([v, d]) + Call: (16) isval([v, d]) + Unify: (16) isval([v, d]) + Call: (17) number([v, d]) + Fail: (17) number([v, d]) + Fail: (16) isval([v, d]) + Redo: (15) isvalstrempty([v, d]) + Unify: (15) isvalstrempty([v, d]) + Call: (16) string([v, d]) + Fail: (16) string([v, d]) + Fail: (15) isvalstrempty([v, d]) + Redo: (14) expressionnotatom([v, d]) + Unify: (14) expressionnotatom([v, d]) +^ Call: (15) not(atom([v, d])) +^ Unify: (15) not(user:atom([v, d])) +^ Exit: (15) not(user:atom([v, d])) + Call: (15) length([v, d], _G5919) + Unify: (15) length([v, d], _G5919) + Exit: (15) length([v, d], 2) + Call: (15) 2>=1 + Exit: (15) 2>=1 + Call: (15) expressionnotatom2([v, d]) + Unify: (15) expressionnotatom2([v, d]) + Call: (16) isvalstrempty(v) + Unify: (16) isvalstrempty(v) + Call: (17) var(v) + Fail: (17) var(v) + Redo: (16) isvalstrempty(v) + Unify: (16) isvalstrempty(v) + Call: (17) isval(v) + Unify: (17) isval(v) + Call: (18) number(v) + Fail: (18) number(v) + Fail: (17) isval(v) + Redo: (16) isvalstrempty(v) + Unify: (16) isvalstrempty(v) + Call: (17) string(v) + Fail: (17) string(v) + Fail: (16) isvalstrempty(v) + Fail: (15) expressionnotatom2([v, d]) + Redo: (14) expressionnotatom([v, d]) + Unify: (14) expressionnotatom([v, d]) + Call: (15) predicate_or_rule_name([v, d]) + Fail: (15) predicate_or_rule_name([v, d]) + Fail: (14) expressionnotatom([v, d]) + Redo: (13) findresult3([[v, d]], [[[v, d], ["a", "b"]]], [["a", "b"], ["b", "c"]], _G5913) + Unify: (13) findresult3([[v, d]], [[[v, d], ["a", "b"]]], [["a", "b"], ["b", "c"]], _G5913) + Call: (14) [[v, d]]=[_G5903|_G5904] + Exit: (14) [[v, d]]=[[v, d]] + Call: (14) isvar([v, d]) + Unify: (14) isvar([v, d]) + Exit: (14) isvar([v, d]) + Call: (14) lists:member([[v, d], _G5909], [[[v, d], ["a", "b"]]]) + Unify: (14) lists:member([[v, d], _G5909], [[[v, d], ["a", "b"]]]) + Exit: (14) lists:member([[v, d], ["a", "b"]], [[[v, d], ["a", 
"b"]]]) + Call: (14) lists:append([["a", "b"], ["b", "c"]], [["a", "b"]], _G5924) + Unify: (14) lists:append([["a", "b"], ["b", "c"]], [["a", "b"]], [["a", "b"]|_G5916]) + Exit: (14) lists:append([["a", "b"], ["b", "c"]], [["a", "b"]], [["a", "b"], ["b", "c"], ["a", "b"]]) + Call: (14) findresult3([], [[[v, d], ["a", "b"]]], [["a", "b"], ["b", "c"], ["a", "b"]], _G5931) + Unify: (14) findresult3([], [[[v, d], ["a", "b"]]], [["a", "b"], ["b", "c"], ["a", "b"]], [["a", "b"], ["b", "c"], ["a", "b"]]) + Exit: (14) findresult3([], [[[v, d], ["a", "b"]]], [["a", "b"], ["b", "c"], ["a", "b"]], [["a", "b"], ["b", "c"], ["a", "b"]]) + Exit: (13) findresult3([[v, d]], [[[v, d], ["a", "b"]]], [["a", "b"], ["b", "c"]], [["a", "b"], ["b", "c"], ["a", "b"]]) + Exit: (12) findresult3([["b", "c"], [v, d]], [[[v, d], ["a", "b"]]], [["a", "b"]], [["a", "b"], ["b", "c"], ["a", "b"]]) + Exit: (11) findresult3([["a", "b"], ["b", "c"], [v, d]], [[[v, d], ["a", "b"]]], [], [["a", "b"], ["b", "c"], ["a", "b"]]) + Call: (11) debug(on) + Fail: (11) debug(on) + Redo: (10) member1([[n, f], [["a", "b"], ["b", "c"], [v, d]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, d], ["a", "b"]]]) + Call: (11) true + Exit: (11) true + Call: (11) true + Exit: (11) true + Exit: (10) member1([[n, f], [["a", "b"], ["b", "c"], [v, d]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, d], ["a", "b"]]]) + Exit: (9) interpret1(off, [[n, f], [["a", "b"], ["b", "c"], [v, d]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, d], ["a", "b"]]]) + Exit: (8) interpret(off, [[n, f], [["a", "b"], ["b", "c"], [v, d]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, d], ["a", "b"]]]) + + Call: (8) interpret(off, [[n, 
f], [["b", "c"], ["b", "a"], [v, d]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], _G14) + Unify: (8) interpret(off, [[n, f], [["b", "c"], ["b", "a"], [v, d]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], _G14) + Call: (9) convert_to_grammar_part1([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [], _G5551) + Unify: (9) convert_to_grammar_part1([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [], _G5551) + Call: (10) convert_to_grammar_part11([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [], _G5551, [], _G5553) + Unify: (10) convert_to_grammar_part11([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [], _G5551, [], _G5553) + Call: (11) [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]]=[_G5542|_G5543] + Exit: (11) [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]]=[[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]] + Call: (11) [[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n, =], [[...|...]|...]]]]=[[n, _G5551], _G5554, "->", _G5563] + Fail: (11) [[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n, =], [[...|...]|...]]]]=[[n, _G5551], _G5554, "->", _G5563] + Redo: (10) convert_to_grammar_part11([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [], _G5554, [], _G5556) + Call: (11) [[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n, =], [[...|...]|...]]]]=[[n, _G5551], "->", _G5560] + Fail: (11) [[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n, =], [[...|...]|...]]]]=[[n, _G5551], "->", _G5560] + Redo: (10) convert_to_grammar_part11([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [], _G5551, [], _G5553) + Unify: (10) convert_to_grammar_part11([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [], _G5551, [], _G5553) + Call: (11) [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]]=[_G5542|_G5543] + Exit: (11) [[[n, f], [[v, c], [v, b], [v, d]], 
":-", [[[n|...], [...|...]]]]]=[[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]] + Call: (11) [[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n, =], [[...|...]|...]]]]=[_G5545, _G5548, ":-", _G5557] + Exit: (11) [[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n, =], [[...|...]|...]]]]=[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n, =], [[...|...]|...]]]] + Call: (11) true + Exit: (11) true + Call: (11) true + Exit: (11) true + Call: (11) true + Exit: (11) true + Call: (11) true + Exit: (11) true + Call: (11) lists:append([], [[[n, f], [[v, c], [v, b], [v|...]], ":-", [[[...|...]|...]]]], _G5572) + Unify: (11) lists:append([], [[[n, f], [[v, c], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, f], [[v, c], [v, b], [v|...]], ":-", [[[...|...]|...]]]]) + Exit: (11) lists:append([], [[[n, f], [[v, c], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, f], [[v, c], [v, b], [v|...]], ":-", [[[...|...]|...]]]]) + Call: (11) convert_to_grammar_part11([], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], _G5572, [], _G5574) + Unify: (11) convert_to_grammar_part11([], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [], []) + Exit: (11) convert_to_grammar_part11([], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [], []) + Exit: (10) convert_to_grammar_part11([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [], []) + Call: (10) lists:append([[[n, f], [[v, c], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [], _G5572) + Unify: (10) lists:append([[[n, f], [[v, c], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [], [[[n, f], [[v, c], [v, b], [v|...]], ":-", [[[...|...]|...]]]|_G5564]) + Exit: (10) lists:append([[[n, f], [[v, c], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [], [[[n, f], [[v, 
c], [v, b], [v|...]], ":-", [[[...|...]|...]]]]) + Exit: (9) convert_to_grammar_part1([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]]) + Call: (9) interpret1(off, [[n, f], [["b", "c"], ["b", "a"], [v, d]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], _G14) + Unify: (9) interpret1(off, [[n, f], [["b", "c"], ["b", "a"], [v, d]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], _G14) +^ Call: (10) retractall(debug(_G5566)) +^ Exit: (10) retractall(debug(_G5566)) +^ Call: (10) assertz(debug(off)) +^ Exit: (10) assertz(debug(off)) +^ Call: (10) retractall(cut(_G5570)) +^ Exit: (10) retractall(cut(_G5570)) +^ Call: (10) assertz(cut(off)) +^ Exit: (10) assertz(cut(off)) + Call: (10) member1([[n, f], [["b", "c"], ["b", "a"], [v, d]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], _G14) + Unify: (10) member1([[n, f], [["b", "c"], ["b", "a"], [v, d]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], _G14) + Call: (11) cut(off) + Unify: (11) cut(off) + Exit: (11) cut(off) + Call: (11) [[n, f], [["b", "c"], ["b", "a"], [v, d]]]=[_G5574, _G5577] + Exit: (11) [[n, f], [["b", "c"], ["b", "a"], [v, d]]]=[[n, f], [["b", "c"], ["b", "a"], [v, d]]] + Call: (11) [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]]=[[[n, f], _G5586, ":-", _G5595]|_G5581] + Exit: (11) [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]]=[[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]] + Call: (11) length([["b", "c"], ["b", "a"], [v, d]], _G5606) + Unify: (11) length([["b", "c"], ["b", "a"], [v, d]], _G5606) + Exit: 
(11) length([["b", "c"], ["b", "a"], [v, d]], 3) + Call: (11) length([[v, c], [v, b], [v, d]], 3) + Unify: (11) length([[v, c], [v, b], [v, d]], 3) + Unify: (11) length([[v, c], [v, b], [v, d]], 3) + Exit: (11) length([[v, c], [v, b], [v, d]], 3) + Call: (11) [n, f]=[n, grammar] + Fail: (11) [n, f]=[n, grammar] + Redo: (10) member1([[n, f], [["b", "c"], ["b", "a"], [v, d]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], _G14) + Call: (11) [n, f]=[n, grammar_part] + Fail: (11) [n, f]=[n, grammar_part] + Redo: (10) member1([[n, f], [["b", "c"], ["b", "a"], [v, d]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], _G14) + Call: (11) checkarguments([["b", "c"], ["b", "a"], [v, d]], [[v, c], [v, b], [v, d]], [], _G5608, [], _G5610) + Unify: (11) checkarguments([["b", "c"], ["b", "a"], [v, d]], [[v, c], [v, b], [v, d]], [], _G5608, [], _G5610) + Call: (12) [["b", "c"], ["b", "a"], [v, d]]=[_G5598|_G5599] + Exit: (12) [["b", "c"], ["b", "a"], [v, d]]=[["b", "c"], ["b", "a"], [v, d]] + Call: (12) expressionnotatom(["b", "c"]) + Unify: (12) expressionnotatom(["b", "c"]) + Call: (13) isvalstrempty(["b", "c"]) + Unify: (13) isvalstrempty(["b", "c"]) + Call: (14) var(["b", "c"]) + Fail: (14) var(["b", "c"]) + Redo: (13) isvalstrempty(["b", "c"]) + Unify: (13) isvalstrempty(["b", "c"]) + Call: (14) isval(["b", "c"]) + Unify: (14) isval(["b", "c"]) + Call: (15) number(["b", "c"]) + Fail: (15) number(["b", "c"]) + Fail: (14) isval(["b", "c"]) + Redo: (13) isvalstrempty(["b", "c"]) + Unify: (13) isvalstrempty(["b", "c"]) + Call: (14) string(["b", "c"]) + Fail: (14) string(["b", "c"]) + Fail: (13) isvalstrempty(["b", "c"]) + Redo: (12) expressionnotatom(["b", "c"]) + Unify: (12) expressionnotatom(["b", "c"]) +^ Call: (13) not(atom(["b", "c"])) +^ Unify: (13) not(user:atom(["b", "c"])) +^ Exit: (13) 
not(user:atom(["b", "c"])) + Call: (13) length(["b", "c"], _G5614) + Unify: (13) length(["b", "c"], _G5614) + Exit: (13) length(["b", "c"], 2) + Call: (13) 2>=1 + Exit: (13) 2>=1 + Call: (13) expressionnotatom2(["b", "c"]) + Unify: (13) expressionnotatom2(["b", "c"]) + Call: (14) isvalstrempty("b") + Unify: (14) isvalstrempty("b") + Call: (15) var("b") + Fail: (15) var("b") + Redo: (14) isvalstrempty("b") + Unify: (14) isvalstrempty("b") + Call: (15) isval("b") + Unify: (15) isval("b") + Call: (16) number("b") + Fail: (16) number("b") + Fail: (15) isval("b") + Redo: (14) isvalstrempty("b") + Unify: (14) isvalstrempty("b") + Call: (15) string("b") + Exit: (15) string("b") + Exit: (14) isvalstrempty("b") + Call: (14) expressionnotatom2(["c"]) + Unify: (14) expressionnotatom2(["c"]) + Call: (15) isvalstrempty("c") + Unify: (15) isvalstrempty("c") + Call: (16) var("c") + Fail: (16) var("c") + Redo: (15) isvalstrempty("c") + Unify: (15) isvalstrempty("c") + Call: (16) isval("c") + Unify: (16) isval("c") + Call: (17) number("c") + Fail: (17) number("c") + Fail: (16) isval("c") + Redo: (15) isvalstrempty("c") + Unify: (15) isvalstrempty("c") + Call: (16) string("c") + Exit: (16) string("c") + Exit: (15) isvalstrempty("c") + Call: (15) expressionnotatom2([]) + Unify: (15) expressionnotatom2([]) + Exit: (15) expressionnotatom2([]) + Exit: (14) expressionnotatom2(["c"]) + Exit: (13) expressionnotatom2(["b", "c"]) + Exit: (12) expressionnotatom(["b", "c"]) + Call: (12) [[v, c], [v, b], [v, d]]=[_G5606|_G5607] + Exit: (12) [[v, c], [v, b], [v, d]]=[[v, c], [v, b], [v, d]] +^ Call: (12) not(var([v, c])) +^ Unify: (12) not(user:var([v, c])) +^ Exit: (12) not(user:var([v, c])) + Call: (12) isvar([v, c]) + Unify: (12) isvar([v, c]) + Exit: (12) isvar([v, c]) + Call: (12) putvalue([v, c], ["b", "c"], [], _G5624) + Unify: (12) putvalue([v, c], ["b", "c"], [], _G5624) +^ Call: (13) not(isvar([v, c])) +^ Unify: (13) not(user:isvar([v, c])) + Call: (14) isvar([v, c]) + Unify: (14) 
isvar([v, c]) + Exit: (14) isvar([v, c]) +^ Fail: (13) not(user:isvar([v, c])) + Redo: (12) putvalue([v, c], ["b", "c"], [], _G5624) + Call: (13) isvar([v, c]) + Unify: (13) isvar([v, c]) + Exit: (13) isvar([v, c]) + Call: (13) isvalstrorundef(["b", "c"]) + Unify: (13) isvalstrorundef(["b", "c"]) + Call: (14) var(["b", "c"]) + Fail: (14) var(["b", "c"]) + Redo: (13) isvalstrorundef(["b", "c"]) + Unify: (13) isvalstrorundef(["b", "c"]) +^ Call: (14) not(var(["b", "c"])) +^ Unify: (14) not(user:var(["b", "c"])) +^ Exit: (14) not(user:var(["b", "c"])) + Call: (14) isval(["b", "c"]) + Unify: (14) isval(["b", "c"]) + Call: (15) number(["b", "c"]) + Fail: (15) number(["b", "c"]) + Fail: (14) isval(["b", "c"]) + Redo: (13) isvalstrorundef(["b", "c"]) + Unify: (13) isvalstrorundef(["b", "c"]) +^ Call: (14) not(var(["b", "c"])) +^ Unify: (14) not(user:var(["b", "c"])) +^ Exit: (14) not(user:var(["b", "c"])) + Call: (14) expression(["b", "c"]) + Unify: (14) expression(["b", "c"]) + Call: (15) isval(["b", "c"]) + Unify: (15) isval(["b", "c"]) + Call: (16) number(["b", "c"]) + Fail: (16) number(["b", "c"]) + Fail: (15) isval(["b", "c"]) + Redo: (14) expression(["b", "c"]) + Unify: (14) expression(["b", "c"]) + Call: (15) string(["b", "c"]) + Fail: (15) string(["b", "c"]) + Redo: (14) expression(["b", "c"]) + Unify: (14) expression(["b", "c"]) + Call: (15) atom(["b", "c"]) + Fail: (15) atom(["b", "c"]) + Redo: (14) expression(["b", "c"]) + Unify: (14) expression(["b", "c"]) +^ Call: (15) not(atom(["b", "c"])) +^ Unify: (15) not(user:atom(["b", "c"])) +^ Exit: (15) not(user:atom(["b", "c"])) + Call: (15) length(["b", "c"], _G5632) + Unify: (15) length(["b", "c"], _G5632) + Exit: (15) length(["b", "c"], 2) + Call: (15) 2>=1 + Exit: (15) 2>=1 + Call: (15) expression2(["b", "c"]) + Unify: (15) expression2(["b", "c"]) + Call: (16) expression3("b") + Unify: (16) expression3("b") + Call: (17) isval("b") + Unify: (17) isval("b") + Call: (18) number("b") + Fail: (18) number("b") + Fail: 
(17) isval("b") + Redo: (16) expression3("b") + Unify: (16) expression3("b") + Call: (17) string("b") + Exit: (17) string("b") + Exit: (16) expression3("b") + Call: (16) expression2(["c"]) + Unify: (16) expression2(["c"]) + Call: (17) expression3("c") + Unify: (17) expression3("c") + Call: (18) isval("c") + Unify: (18) isval("c") + Call: (19) number("c") + Fail: (19) number("c") + Fail: (18) isval("c") + Redo: (17) expression3("c") + Unify: (17) expression3("c") + Call: (18) string("c") + Exit: (18) string("c") + Exit: (17) expression3("c") + Call: (17) expression2([]) + Unify: (17) expression2([]) + Exit: (17) expression2([]) + Exit: (16) expression2(["c"]) + Exit: (15) expression2(["b", "c"]) + Exit: (14) expression(["b", "c"]) + Exit: (13) isvalstrorundef(["b", "c"]) + Call: (13) updatevar([v, c], ["b", "c"], [], _G5634) + Unify: (13) updatevar([v, c], ["b", "c"], [], _G5634) + Call: (14) lists:member([[v, c], empty], []) + Fail: (14) lists:member([[v, c], empty], []) + Redo: (13) updatevar([v, c], ["b", "c"], [], _G5634) +^ Call: (14) not(member([[v, c], _G5630], [])) +^ Unify: (14) not(user:member([[v, c], _G5630], [])) +^ Exit: (14) not(user:member([[v, c], _G5630], [])) + Call: (14) _G5630=empty + Exit: (14) empty=empty + Call: (14) true + Exit: (14) true + Call: (14) lists:append([], [[[v, c], ["b", "c"]]], _G5654) + Unify: (14) lists:append([], [[[v, c], ["b", "c"]]], [[[v, c], ["b", "c"]]]) + Exit: (14) lists:append([], [[[v, c], ["b", "c"]]], [[[v, c], ["b", "c"]]]) + Call: (14) true + Exit: (14) true + Call: (14) true + Exit: (14) true + Exit: (13) updatevar([v, c], ["b", "c"], [], [[[v, c], ["b", "c"]]]) + Exit: (12) putvalue([v, c], ["b", "c"], [], [[[v, c], ["b", "c"]]]) + Call: (12) checkarguments([["b", "a"], [v, d]], [[v, b], [v, d]], [[[v, c], ["b", "c"]]], _G5655, [], _G5657) + Unify: (12) checkarguments([["b", "a"], [v, d]], [[v, b], [v, d]], [[[v, c], ["b", "c"]]], _G5655, [], _G5657) + Call: (13) [["b", "a"], [v, d]]=[_G5645|_G5646] + Exit: 
(13) [["b", "a"], [v, d]]=[["b", "a"], [v, d]] + Call: (13) expressionnotatom(["b", "a"]) + Unify: (13) expressionnotatom(["b", "a"]) + Call: (14) isvalstrempty(["b", "a"]) + Unify: (14) isvalstrempty(["b", "a"]) + Call: (15) var(["b", "a"]) + Fail: (15) var(["b", "a"]) + Redo: (14) isvalstrempty(["b", "a"]) + Unify: (14) isvalstrempty(["b", "a"]) + Call: (15) isval(["b", "a"]) + Unify: (15) isval(["b", "a"]) + Call: (16) number(["b", "a"]) + Fail: (16) number(["b", "a"]) + Fail: (15) isval(["b", "a"]) + Redo: (14) isvalstrempty(["b", "a"]) + Unify: (14) isvalstrempty(["b", "a"]) + Call: (15) string(["b", "a"]) + Fail: (15) string(["b", "a"]) + Fail: (14) isvalstrempty(["b", "a"]) + Redo: (13) expressionnotatom(["b", "a"]) + Unify: (13) expressionnotatom(["b", "a"]) +^ Call: (14) not(atom(["b", "a"])) +^ Unify: (14) not(user:atom(["b", "a"])) +^ Exit: (14) not(user:atom(["b", "a"])) + Call: (14) length(["b", "a"], _G5661) + Unify: (14) length(["b", "a"], _G5661) + Exit: (14) length(["b", "a"], 2) + Call: (14) 2>=1 + Exit: (14) 2>=1 + Call: (14) expressionnotatom2(["b", "a"]) + Unify: (14) expressionnotatom2(["b", "a"]) + Call: (15) isvalstrempty("b") + Unify: (15) isvalstrempty("b") + Call: (16) var("b") + Fail: (16) var("b") + Redo: (15) isvalstrempty("b") + Unify: (15) isvalstrempty("b") + Call: (16) isval("b") + Unify: (16) isval("b") + Call: (17) number("b") + Fail: (17) number("b") + Fail: (16) isval("b") + Redo: (15) isvalstrempty("b") + Unify: (15) isvalstrempty("b") + Call: (16) string("b") + Exit: (16) string("b") + Exit: (15) isvalstrempty("b") + Call: (15) expressionnotatom2(["a"]) + Unify: (15) expressionnotatom2(["a"]) + Call: (16) isvalstrempty("a") + Unify: (16) isvalstrempty("a") + Call: (17) var("a") + Fail: (17) var("a") + Redo: (16) isvalstrempty("a") + Unify: (16) isvalstrempty("a") + Call: (17) isval("a") + Unify: (17) isval("a") + Call: (18) number("a") + Fail: (18) number("a") + Fail: (17) isval("a") + Redo: (16) isvalstrempty("a") + Unify: 
(16) isvalstrempty("a") + Call: (17) string("a") + Exit: (17) string("a") + Exit: (16) isvalstrempty("a") + Call: (16) expressionnotatom2([]) + Unify: (16) expressionnotatom2([]) + Exit: (16) expressionnotatom2([]) + Exit: (15) expressionnotatom2(["a"]) + Exit: (14) expressionnotatom2(["b", "a"]) + Exit: (13) expressionnotatom(["b", "a"]) + Call: (13) [[v, b], [v, d]]=[_G5653|_G5654] + Exit: (13) [[v, b], [v, d]]=[[v, b], [v, d]] +^ Call: (13) not(var([v, b])) +^ Unify: (13) not(user:var([v, b])) +^ Exit: (13) not(user:var([v, b])) + Call: (13) isvar([v, b]) + Unify: (13) isvar([v, b]) + Exit: (13) isvar([v, b]) + Call: (13) putvalue([v, b], ["b", "a"], [[[v, c], ["b", "c"]]], _G5671) + Unify: (13) putvalue([v, b], ["b", "a"], [[[v, c], ["b", "c"]]], _G5671) +^ Call: (14) not(isvar([v, b])) +^ Unify: (14) not(user:isvar([v, b])) + Call: (15) isvar([v, b]) + Unify: (15) isvar([v, b]) + Exit: (15) isvar([v, b]) +^ Fail: (14) not(user:isvar([v, b])) + Redo: (13) putvalue([v, b], ["b", "a"], [[[v, c], ["b", "c"]]], _G5671) + Call: (14) isvar([v, b]) + Unify: (14) isvar([v, b]) + Exit: (14) isvar([v, b]) + Call: (14) isvalstrorundef(["b", "a"]) + Unify: (14) isvalstrorundef(["b", "a"]) + Call: (15) var(["b", "a"]) + Fail: (15) var(["b", "a"]) + Redo: (14) isvalstrorundef(["b", "a"]) + Unify: (14) isvalstrorundef(["b", "a"]) +^ Call: (15) not(var(["b", "a"])) +^ Unify: (15) not(user:var(["b", "a"])) +^ Exit: (15) not(user:var(["b", "a"])) + Call: (15) isval(["b", "a"]) + Unify: (15) isval(["b", "a"]) + Call: (16) number(["b", "a"]) + Fail: (16) number(["b", "a"]) + Fail: (15) isval(["b", "a"]) + Redo: (14) isvalstrorundef(["b", "a"]) + Unify: (14) isvalstrorundef(["b", "a"]) +^ Call: (15) not(var(["b", "a"])) +^ Unify: (15) not(user:var(["b", "a"])) +^ Exit: (15) not(user:var(["b", "a"])) + Call: (15) expression(["b", "a"]) + Unify: (15) expression(["b", "a"]) + Call: (16) isval(["b", "a"]) + Unify: (16) isval(["b", "a"]) + Call: (17) number(["b", "a"]) + Fail: (17) 
number(["b", "a"]) + Fail: (16) isval(["b", "a"]) + Redo: (15) expression(["b", "a"]) + Unify: (15) expression(["b", "a"]) + Call: (16) string(["b", "a"]) + Fail: (16) string(["b", "a"]) + Redo: (15) expression(["b", "a"]) + Unify: (15) expression(["b", "a"]) + Call: (16) atom(["b", "a"]) + Fail: (16) atom(["b", "a"]) + Redo: (15) expression(["b", "a"]) + Unify: (15) expression(["b", "a"]) +^ Call: (16) not(atom(["b", "a"])) +^ Unify: (16) not(user:atom(["b", "a"])) +^ Exit: (16) not(user:atom(["b", "a"])) + Call: (16) length(["b", "a"], _G5679) + Unify: (16) length(["b", "a"], _G5679) + Exit: (16) length(["b", "a"], 2) + Call: (16) 2>=1 + Exit: (16) 2>=1 + Call: (16) expression2(["b", "a"]) + Unify: (16) expression2(["b", "a"]) + Call: (17) expression3("b") + Unify: (17) expression3("b") + Call: (18) isval("b") + Unify: (18) isval("b") + Call: (19) number("b") + Fail: (19) number("b") + Fail: (18) isval("b") + Redo: (17) expression3("b") + Unify: (17) expression3("b") + Call: (18) string("b") + Exit: (18) string("b") + Exit: (17) expression3("b") + Call: (17) expression2(["a"]) + Unify: (17) expression2(["a"]) + Call: (18) expression3("a") + Unify: (18) expression3("a") + Call: (19) isval("a") + Unify: (19) isval("a") + Call: (20) number("a") + Fail: (20) number("a") + Fail: (19) isval("a") + Redo: (18) expression3("a") + Unify: (18) expression3("a") + Call: (19) string("a") + Exit: (19) string("a") + Exit: (18) expression3("a") + Call: (18) expression2([]) + Unify: (18) expression2([]) + Exit: (18) expression2([]) + Exit: (17) expression2(["a"]) + Exit: (16) expression2(["b", "a"]) + Exit: (15) expression(["b", "a"]) + Exit: (14) isvalstrorundef(["b", "a"]) + Call: (14) updatevar([v, b], ["b", "a"], [[[v, c], ["b", "c"]]], _G5681) + Unify: (14) updatevar([v, b], ["b", "a"], [[[v, c], ["b", "c"]]], _G5681) + Call: (15) lists:member([[v, b], empty], [[[v, c], ["b", "c"]]]) + Unify: (15) lists:member([[v, b], empty], [[[v, c], ["b", "c"]]]) + Fail: (15) 
lists:member([[v, b], empty], [[[v, c], ["b", "c"]]]) + Redo: (14) updatevar([v, b], ["b", "a"], [[[v, c], ["b", "c"]]], _G5681) +^ Call: (15) not(member([[v, b], _G5677], [[[v, c], ["b", "c"]]])) +^ Unify: (15) not(user:member([[v, b], _G5677], [[[v, c], ["b", "c"]]])) +^ Exit: (15) not(user:member([[v, b], _G5677], [[[v, c], ["b", "c"]]])) + Call: (15) _G5677=empty + Exit: (15) empty=empty + Call: (15) true + Exit: (15) true + Call: (15) lists:append([[[v, c], ["b", "c"]]], [[[v, b], ["b", "a"]]], _G5701) + Unify: (15) lists:append([[[v, c], ["b", "c"]]], [[[v, b], ["b", "a"]]], [[[v, c], ["b", "c"]]|_G5693]) + Exit: (15) lists:append([[[v, c], ["b", "c"]]], [[[v, b], ["b", "a"]]], [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]]]) + Call: (15) true + Exit: (15) true + Call: (15) true + Exit: (15) true + Exit: (14) updatevar([v, b], ["b", "a"], [[[v, c], ["b", "c"]]], [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]]]) + Exit: (13) putvalue([v, b], ["b", "a"], [[[v, c], ["b", "c"]]], [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]]]) + Call: (13) checkarguments([[v, d]], [[v, d]], [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]]], _G5705, [], _G5707) + Unify: (13) checkarguments([[v, d]], [[v, d]], [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]]], _G5705, [], _G5707) + Call: (14) [[v, d]]=[_G5695|_G5696] + Exit: (14) [[v, d]]=[[v, d]] + Call: (14) expressionnotatom([v, d]) + Unify: (14) expressionnotatom([v, d]) + Call: (15) isvalstrempty([v, d]) + Unify: (15) isvalstrempty([v, d]) + Call: (16) var([v, d]) + Fail: (16) var([v, d]) + Redo: (15) isvalstrempty([v, d]) + Unify: (15) isvalstrempty([v, d]) + Call: (16) isval([v, d]) + Unify: (16) isval([v, d]) + Call: (17) number([v, d]) + Fail: (17) number([v, d]) + Fail: (16) isval([v, d]) + Redo: (15) isvalstrempty([v, d]) + Unify: (15) isvalstrempty([v, d]) + Call: (16) string([v, d]) + Fail: (16) string([v, d]) + Fail: (15) isvalstrempty([v, d]) + Redo: (14) expressionnotatom([v, d]) + Unify: (14) expressionnotatom([v, d]) +^ Call: (15) 
not(atom([v, d])) +^ Unify: (15) not(user:atom([v, d])) +^ Exit: (15) not(user:atom([v, d])) + Call: (15) length([v, d], _G5711) + Unify: (15) length([v, d], _G5711) + Exit: (15) length([v, d], 2) + Call: (15) 2>=1 + Exit: (15) 2>=1 + Call: (15) expressionnotatom2([v, d]) + Unify: (15) expressionnotatom2([v, d]) + Call: (16) isvalstrempty(v) + Unify: (16) isvalstrempty(v) + Call: (17) var(v) + Fail: (17) var(v) + Redo: (16) isvalstrempty(v) + Unify: (16) isvalstrempty(v) + Call: (17) isval(v) + Unify: (17) isval(v) + Call: (18) number(v) + Fail: (18) number(v) + Fail: (17) isval(v) + Redo: (16) isvalstrempty(v) + Unify: (16) isvalstrempty(v) + Call: (17) string(v) + Fail: (17) string(v) + Fail: (16) isvalstrempty(v) + Fail: (15) expressionnotatom2([v, d]) + Redo: (14) expressionnotatom([v, d]) + Unify: (14) expressionnotatom([v, d]) + Call: (15) predicate_or_rule_name([v, d]) + Fail: (15) predicate_or_rule_name([v, d]) + Fail: (14) expressionnotatom([v, d]) + Redo: (13) checkarguments([[v, d]], [[v, d]], [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]]], _G5705, [], _G5707) + Unify: (13) checkarguments([[v, d]], [[v, d]], [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]]], _G5705, [], _G5707) + Call: (14) [[v, d]]=[_G5695|_G5696] + Exit: (14) [[v, d]]=[[v, d]] +^ Call: (14) not(var([v, d])) +^ Unify: (14) not(user:var([v, d])) +^ Exit: (14) not(user:var([v, d])) + Call: (14) isvar([v, d]) + Unify: (14) isvar([v, d]) + Exit: (14) isvar([v, d]) + Call: (14) [[v, d]]=[_G5703|_G5704] + Exit: (14) [[v, d]]=[[v, d]] + Call: (14) expressionnotatom([v, d]) + Unify: (14) expressionnotatom([v, d]) + Call: (15) isvalstrempty([v, d]) + Unify: (15) isvalstrempty([v, d]) + Call: (16) var([v, d]) + Fail: (16) var([v, d]) + Redo: (15) isvalstrempty([v, d]) + Unify: (15) isvalstrempty([v, d]) + Call: (16) isval([v, d]) + Unify: (16) isval([v, d]) + Call: (17) number([v, d]) + Fail: (17) number([v, d]) + Fail: (16) isval([v, d]) + Redo: (15) isvalstrempty([v, d]) + Unify: (15) 
isvalstrempty([v, d]) + Call: (16) string([v, d]) + Fail: (16) string([v, d]) + Fail: (15) isvalstrempty([v, d]) + Redo: (14) expressionnotatom([v, d]) + Unify: (14) expressionnotatom([v, d]) +^ Call: (15) not(atom([v, d])) +^ Unify: (15) not(user:atom([v, d])) +^ Exit: (15) not(user:atom([v, d])) + Call: (15) length([v, d], _G5719) + Unify: (15) length([v, d], _G5719) + Exit: (15) length([v, d], 2) + Call: (15) 2>=1 + Exit: (15) 2>=1 + Call: (15) expressionnotatom2([v, d]) + Unify: (15) expressionnotatom2([v, d]) + Call: (16) isvalstrempty(v) + Unify: (16) isvalstrempty(v) + Call: (17) var(v) + Fail: (17) var(v) + Redo: (16) isvalstrempty(v) + Unify: (16) isvalstrempty(v) + Call: (17) isval(v) + Unify: (17) isval(v) + Call: (18) number(v) + Fail: (18) number(v) + Fail: (17) isval(v) + Redo: (16) isvalstrempty(v) + Unify: (16) isvalstrempty(v) + Call: (17) string(v) + Fail: (17) string(v) + Fail: (16) isvalstrempty(v) + Fail: (15) expressionnotatom2([v, d]) + Redo: (14) expressionnotatom([v, d]) + Unify: (14) expressionnotatom([v, d]) + Call: (15) predicate_or_rule_name([v, d]) + Fail: (15) predicate_or_rule_name([v, d]) + Fail: (14) expressionnotatom([v, d]) + Redo: (13) checkarguments([[v, d]], [[v, d]], [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]]], _G5705, [], _G5707) + Unify: (13) checkarguments([[v, d]], [[v, d]], [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]]], _G5705, [], _G5707) + Call: (14) [[v, d]]=[_G5695|_G5696] + Exit: (14) [[v, d]]=[[v, d]] +^ Call: (14) not(var([v, d])) +^ Unify: (14) not(user:var([v, d])) +^ Exit: (14) not(user:var([v, d])) + Call: (14) isvar([v, d]) + Unify: (14) isvar([v, d]) + Exit: (14) isvar([v, d]) + Call: (14) [[v, d]]=[_G5703|_G5704] + Exit: (14) [[v, d]]=[[v, d]] +^ Call: (14) not(var([v, d])) +^ Unify: (14) not(user:var([v, d])) +^ Exit: (14) not(user:var([v, d])) + Call: (14) isvar([v, d]) + Unify: (14) isvar([v, d]) + Exit: (14) isvar([v, d]) + Call: (14) getvalue([v, d], _G5719, [[[v, c], ["b", "c"]], [[v, b], ["b", 
"a"]]]) + Unify: (14) getvalue([v, d], _G5719, [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]]]) +^ Call: (15) not(isvar([v, d])) +^ Unify: (15) not(user:isvar([v, d])) + Call: (16) isvar([v, d]) + Unify: (16) isvar([v, d]) + Exit: (16) isvar([v, d]) +^ Fail: (15) not(user:isvar([v, d])) + Redo: (14) getvalue([v, d], _G5719, [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]]]) + Call: (15) isvar([v, d]) + Unify: (15) isvar([v, d]) + Exit: (15) isvar([v, d]) + Call: (15) isvalstrorundef(_G5718) + Unify: (15) isvalstrorundef(_G5718) + Call: (16) var(_G5718) + Exit: (16) var(_G5718) + Exit: (15) isvalstrorundef(_G5718) + Call: (15) getvar([v, d], _G5719, [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]]]) + Unify: (15) getvar([v, d], _G5719, [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]]]) + Call: (16) lists:member([[v, d], _G5714], [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]]]) + Unify: (16) lists:member([[v, d], _G5714], [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]]]) + Fail: (16) lists:member([[v, d], _G5714], [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]]]) + Redo: (15) getvar([v, d], _G5719, [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]]]) + Unify: (15) getvar([v, d], empty, [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]]]) +^ Call: (16) not(member([[v, d], _G5717], [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]]])) +^ Unify: (16) not(user:member([[v, d], _G5717], [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]]])) +^ Exit: (16) not(user:member([[v, d], _G5717], [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]]])) + Call: (16) true + Exit: (16) true + Exit: (15) getvar([v, d], empty, [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]]]) + Exit: (14) getvalue([v, d], empty, [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]]]) + Call: (14) true + Exit: (14) true + Call: (14) putvalue([v, d], empty, [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]]], _G5733) + Unify: (14) putvalue([v, d], empty, [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]]], _G5733) +^ Call: (15) not(isvar([v, d])) +^ Unify: (15) not(user:isvar([v, d])) + Call: (16) 
isvar([v, d]) + Unify: (16) isvar([v, d]) + Exit: (16) isvar([v, d]) +^ Fail: (15) not(user:isvar([v, d])) + Redo: (14) putvalue([v, d], empty, [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]]], _G5733) + Call: (15) isvar([v, d]) + Unify: (15) isvar([v, d]) + Exit: (15) isvar([v, d]) + Call: (15) isvalstrorundef(empty) + Unify: (15) isvalstrorundef(empty) + Call: (16) var(empty) + Fail: (16) var(empty) + Redo: (15) isvalstrorundef(empty) + Unify: (15) isvalstrorundef(empty) +^ Call: (16) not(var(empty)) +^ Unify: (16) not(user:var(empty)) +^ Exit: (16) not(user:var(empty)) + Call: (16) isval(empty) + Unify: (16) isval(empty) + Call: (17) number(empty) + Fail: (17) number(empty) + Fail: (16) isval(empty) + Redo: (15) isvalstrorundef(empty) + Unify: (15) isvalstrorundef(empty) +^ Call: (16) not(var(empty)) +^ Unify: (16) not(user:var(empty)) +^ Exit: (16) not(user:var(empty)) + Call: (16) expression(empty) + Unify: (16) expression(empty) + Exit: (16) expression(empty) + Exit: (15) isvalstrorundef(empty) + Call: (15) updatevar([v, d], empty, [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]]], _G5738) + Unify: (15) updatevar([v, d], empty, [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]]], _G5738) + Call: (16) lists:member([[v, d], empty], [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]]]) + Unify: (16) lists:member([[v, d], empty], [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]]]) + Fail: (16) lists:member([[v, d], empty], [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]]]) + Redo: (15) updatevar([v, d], empty, [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]]], _G5738) +^ Call: (16) not(member([[v, d], _G5734], [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]]])) +^ Unify: (16) not(user:member([[v, d], _G5734], [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]]])) +^ Exit: (16) not(user:member([[v, d], _G5734], [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]]])) + Call: (16) _G5734=empty + Exit: (16) empty=empty + Call: (16) true + Exit: (16) true + Call: (16) lists:append([[[v, c], ["b", "c"]], [[v, b], ["b", "a"]]], 
[[[v, d], empty]], _G5758) + Unify: (16) lists:append([[[v, c], ["b", "c"]], [[v, b], ["b", "a"]]], [[[v, d], empty]], [[[v, c], ["b", "c"]]|_G5750]) + Exit: (16) lists:append([[[v, c], ["b", "c"]], [[v, b], ["b", "a"]]], [[[v, d], empty]], [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]], [[v, d], empty]]) + Call: (16) true + Exit: (16) true + Call: (16) true + Exit: (16) true + Exit: (15) updatevar([v, d], empty, [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]]], [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]], [[v, d], empty]]) + Exit: (14) putvalue([v, d], empty, [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]]], [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]], [[v, d], empty]]) + Call: (14) lists:append([], [[[v, d], [v, d]]], _G5773) + Unify: (14) lists:append([], [[[v, d], [v, d]]], [[[v, d], [v, d]]]) + Exit: (14) lists:append([], [[[v, d], [v, d]]], [[[v, d], [v, d]]]) + Call: (14) checkarguments([], [], [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]], [[v, d], empty]], _G5774, [[[v, d], [v, d]]], _G5776) + Unify: (14) checkarguments([], [], [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]], [[v, d], empty]], [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]], [[v, d], empty]], [[[v, d], [v, d]]], [[[v, d], [v, d]]]) + Exit: (14) checkarguments([], [], [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]], [[v, d], empty]], [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]], [[v, d], empty]], [[[v, d], [v, d]]], [[[v, d], [v, d]]]) + Exit: (13) checkarguments([[v, d]], [[v, d]], [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]]], [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]], [[v, d], empty]], [], [[[v, d], [v, d]]]) + Exit: (12) checkarguments([["b", "a"], [v, d]], [[v, b], [v, d]], [[[v, c], ["b", "c"]]], [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]], [[v, d], empty]], [], [[[v, d], [v, d]]]) + Exit: (11) checkarguments([["b", "c"], ["b", "a"], [v, d]], [[v, c], [v, b], [v, d]], [], [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]], [[v, d], empty]], [], [[[v, d], [v, d]]]) + Call: (11) debug(on) + Fail: (11) debug(on) + 
Redo: (10) member1([[n, f], [["b", "c"], ["b", "a"], [v, d]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], _G14) + Call: (11) true + Exit: (11) true + Call: (11) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]], [[v, d], empty]], _G5774, [[[n, =], [[v, d], [v, c]]]], true) + Unify: (11) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]], [[v, d], empty]], _G5774, [[[n, =], [[v, d], [v, c]]]], true) + Call: (12) [[[n, =], [[v, d], [v, c]]]]=[_G5764|_G5765] + Exit: (12) [[[n, =], [[v, d], [v, c]]]]=[[[n, =], [[v, d], [v, c]]]] +^ Call: (12) not(predicate_or_rule_name([[n, =], [[v, d], [v, c]]])) +^ Unify: (12) not(user:predicate_or_rule_name([[n, =], [[v, d], [v, c]]])) + Call: (13) predicate_or_rule_name([[n, =], [[v, d], [v, c]]]) + Fail: (13) predicate_or_rule_name([[n, =], [[v, d], [v, c]]]) +^ Exit: (12) not(user:predicate_or_rule_name([[n, =], [[v, d], [v, c]]])) + Call: (12) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]], [[v, d], empty]], _G5782, [[n, =], [[v, d], [v, c]]], _G5784) + Unify: (12) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]], [[v, d], empty]], _G5782, [[n, =], [[v, d], [v, c]]], _G5784) + Call: (13) [[n, =], [[v, d], [v, c]]]=[_G5772|_G5773] + Exit: (13) [[n, =], [[v, d], [v, c]]]=[[n, =], [[v, d], [v, c]]] +^ Call: (13) not(predicate_or_rule_name([n, =])) +^ Unify: (13) 
not(user:predicate_or_rule_name([n, =])) + Call: (14) predicate_or_rule_name([n, =]) + Unify: (14) predicate_or_rule_name([n, =]) + Exit: (14) predicate_or_rule_name([n, =]) +^ Fail: (13) not(user:predicate_or_rule_name([n, =])) + Redo: (12) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]], [[v, d], empty]], _G5782, [[n, =], [[v, d], [v, c]]], _G5784) + Unify: (12) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]], [[v, d], empty]], _G5782, [[n, =], [[v, d], [v, c]]], _G5784) + Call: (13) [[n, =], [[v, d], [v, c]]]=[[not, [_G5781]]|_G5773] + Fail: (13) [[n, =], [[v, d], [v, c]]]=[[not, [_G5781]]|_G5773] + Redo: (12) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]], [[v, d], empty]], _G5782, [[n, =], [[v, d], [v, c]]], _G5784) + Unify: (12) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]], [[v, d], empty]], _G5782, [[n, =], [[v, d], [v, c]]], _G5784) + Call: (13) [[n, =], [[v, d], [v, c]]]=[[_G5775], or, [_G5784]] + Fail: (13) [[n, =], [[v, d], [v, c]]]=[[_G5775], or, [_G5784]] + Redo: (12) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]], [[v, d], empty]], _G5782, [[n, =], [[v, d], [v, c]]], _G5784) + Unify: (12) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], 
[...|...]]]]], [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]], [[v, d], empty]], _G5782, [[n, =], [[v, d], [v, c]]], _G5784) + Call: (13) [[n, =], [[v, d], [v, c]]]=[_G5772|_G5773] + Exit: (13) [[n, =], [[v, d], [v, c]]]=[[n, =], [[v, d], [v, c]]] +^ Call: (13) not(predicate_or_rule_name([n, =])) +^ Unify: (13) not(user:predicate_or_rule_name([n, =])) + Call: (14) predicate_or_rule_name([n, =]) + Unify: (14) predicate_or_rule_name([n, =]) + Exit: (14) predicate_or_rule_name([n, =]) +^ Fail: (13) not(user:predicate_or_rule_name([n, =])) + Fail: (12) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]], [[v, d], empty]], _G5782, [[n, =], [[v, d], [v, c]]], _G5784) + Redo: (11) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]], [[v, d], empty]], _G5774, [[[n, =], [[v, d], [v, c]]]], true) + Unify: (11) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]], [[v, d], empty]], _G5774, [[[n, =], [[v, d], [v, c]]]], true) + Call: (12) [[[n, =], [[v, d], [v, c]]]]=[[not, [_G5773]]|_G5765] + Fail: (12) [[[n, =], [[v, d], [v, c]]]]=[[not, [_G5773]]|_G5765] + Redo: (11) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]], [[v, d], empty]], _G5774, [[[n, =], [[v, d], [v, c]]]], true) + Unify: (11) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]], [[v, d], empty]], _G5774, [[[n, =], 
[[v, d], [v, c]]]], true) + Call: (12) [[[n, =], [[v, d], [v, c]]]]=[[_G5767], or, [_G5776]] + Fail: (12) [[[n, =], [[v, d], [v, c]]]]=[[_G5767], or, [_G5776]] + Redo: (11) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]], [[v, d], empty]], _G5774, [[[n, =], [[v, d], [v, c]]]], true) + Unify: (11) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]], [[v, d], empty]], _G5774, [[[n, =], [[v, d], [v, c]]]], true) + Call: (12) [[[n, =], [[v, d], [v, c]]]]=[_G5764|_G5765] + Exit: (12) [[[n, =], [[v, d], [v, c]]]]=[[[n, =], [[v, d], [v, c]]]] +^ Call: (12) not(predicate_or_rule_name([[n, =], [[v, d], [v, c]]])) +^ Unify: (12) not(user:predicate_or_rule_name([[n, =], [[v, d], [v, c]]])) + Call: (13) predicate_or_rule_name([[n, =], [[v, d], [v, c]]]) + Fail: (13) predicate_or_rule_name([[n, =], [[v, d], [v, c]]]) +^ Exit: (12) not(user:predicate_or_rule_name([[n, =], [[v, d], [v, c]]])) + Call: (12) interpretstatement1([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[n, =], [[v, d], [v, c]]], [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]], [[v, d], empty]], _G5783, _G5784, _G5785) + Unify: (12) interpretstatement1([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[n, =], [[v, d], [v, c]]], [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]], [[v, d], empty]], _G5783, true, nocut) + Call: (13) isop(=) + Unify: (13) isop(=) + Exit: (13) isop(=) + Call: (13) interpretpart(is, [v, d], [v, c], [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]], [[v, d], empty]], _G5783) + Unify: (13) interpretpart(is, [v, d], [v, c], [[[v, c], ["b", "c"]], 
[[v, b], ["b", "a"]], [[v, d], empty]], _G5783) + Call: (14) getvalues([v, d], [v, c], _G5781, _G5782, [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]], [[v, d], empty]]) + Unify: (14) getvalues([v, d], [v, c], _G5781, _G5782, [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]], [[v, d], empty]]) + Call: (15) getvalue([v, d], _G5780, [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]], [[v, d], empty]]) + Unify: (15) getvalue([v, d], _G5780, [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]], [[v, d], empty]]) +^ Call: (16) not(isvar([v, d])) +^ Unify: (16) not(user:isvar([v, d])) + Call: (17) isvar([v, d]) + Unify: (17) isvar([v, d]) + Exit: (17) isvar([v, d]) +^ Fail: (16) not(user:isvar([v, d])) + Redo: (15) getvalue([v, d], _G5780, [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]], [[v, d], empty]]) + Call: (16) isvar([v, d]) + Unify: (16) isvar([v, d]) + Exit: (16) isvar([v, d]) + Call: (16) isvalstrorundef(_G5779) + Unify: (16) isvalstrorundef(_G5779) + Call: (17) var(_G5779) + Exit: (17) var(_G5779) + Exit: (16) isvalstrorundef(_G5779) + Call: (16) getvar([v, d], _G5780, [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]], [[v, d], empty]]) + Unify: (16) getvar([v, d], _G5780, [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]], [[v, d], empty]]) + Call: (17) lists:member([[v, d], _G5775], [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]], [[v, d], empty]]) + Unify: (17) lists:member([[v, d], _G5775], [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]], [[v, d], empty]]) + Exit: (17) lists:member([[v, d], empty], [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]], [[v, d], empty]]) +^ Call: (17) not(empty=empty) +^ Unify: (17) not(user: (empty=empty)) +^ Fail: (17) not(user: (empty=empty)) + Redo: (16) getvar([v, d], _G5780, [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]], [[v, d], empty]]) + Unify: (16) getvar([v, d], empty, [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]], [[v, d], empty]]) +^ Call: (17) not(member([[v, d], _G5778], [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]], [[v, d], empty]])) +^ Unify: (17) not(user:member([[v, d], 
_G5778], [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]], [[v, d], empty]])) +^ Fail: (17) not(user:member([[v, d], _G5778], [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]], [[v, d], empty]])) + Redo: (16) getvar([v, d], empty, [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]], [[v, d], empty]]) + Call: (17) lists:member([[v, d], empty], [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]], [[v, d], empty]]) + Unify: (17) lists:member([[v, d], empty], [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]], [[v, d], empty]]) + Exit: (17) lists:member([[v, d], empty], [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]], [[v, d], empty]]) + Exit: (16) getvar([v, d], empty, [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]], [[v, d], empty]]) + Exit: (15) getvalue([v, d], empty, [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]], [[v, d], empty]]) + Call: (15) getvalue([v, c], _G5786, [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]], [[v, d], empty]]) + Unify: (15) getvalue([v, c], _G5786, [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]], [[v, d], empty]]) +^ Call: (16) not(isvar([v, c])) +^ Unify: (16) not(user:isvar([v, c])) + Call: (17) isvar([v, c]) + Unify: (17) isvar([v, c]) + Exit: (17) isvar([v, c]) +^ Fail: (16) not(user:isvar([v, c])) + Redo: (15) getvalue([v, c], _G5786, [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]], [[v, d], empty]]) + Call: (16) isvar([v, c]) + Unify: (16) isvar([v, c]) + Exit: (16) isvar([v, c]) + Call: (16) isvalstrorundef(_G5785) + Unify: (16) isvalstrorundef(_G5785) + Call: (17) var(_G5785) + Exit: (17) var(_G5785) + Exit: (16) isvalstrorundef(_G5785) + Call: (16) getvar([v, c], _G5786, [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]], [[v, d], empty]]) + Unify: (16) getvar([v, c], _G5786, [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]], [[v, d], empty]]) + Call: (17) lists:member([[v, c], _G5781], [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]], [[v, d], empty]]) + Unify: (17) lists:member([[v, c], _G5781], [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]], [[v, d], empty]]) + Exit: (17) lists:member([[v, c], ["b", 
"c"]], [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]], [[v, d], empty]]) +^ Call: (17) not(["b", "c"]=empty) +^ Unify: (17) not(user: (["b", "c"]=empty)) +^ Exit: (17) not(user: (["b", "c"]=empty)) + Exit: (16) getvar([v, c], ["b", "c"], [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]], [[v, d], empty]]) + Exit: (15) getvalue([v, c], ["b", "c"], [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]], [[v, d], empty]]) + Exit: (14) getvalues([v, d], [v, c], empty, ["b", "c"], [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]], [[v, d], empty]]) + Call: (14) expression(empty) + Unify: (14) expression(empty) + Exit: (14) expression(empty) + Call: (14) expression(["b", "c"]) + Unify: (14) expression(["b", "c"]) + Call: (15) isval(["b", "c"]) + Unify: (15) isval(["b", "c"]) + Call: (16) number(["b", "c"]) + Fail: (16) number(["b", "c"]) + Fail: (15) isval(["b", "c"]) + Redo: (14) expression(["b", "c"]) + Unify: (14) expression(["b", "c"]) + Call: (15) string(["b", "c"]) + Fail: (15) string(["b", "c"]) + Redo: (14) expression(["b", "c"]) + Unify: (14) expression(["b", "c"]) + Call: (15) atom(["b", "c"]) + Fail: (15) atom(["b", "c"]) + Redo: (14) expression(["b", "c"]) + Unify: (14) expression(["b", "c"]) +^ Call: (15) not(atom(["b", "c"])) +^ Unify: (15) not(user:atom(["b", "c"])) +^ Exit: (15) not(user:atom(["b", "c"])) + Call: (15) length(["b", "c"], _G5803) + Unify: (15) length(["b", "c"], _G5803) + Exit: (15) length(["b", "c"], 2) + Call: (15) 2>=1 + Exit: (15) 2>=1 + Call: (15) expression2(["b", "c"]) + Unify: (15) expression2(["b", "c"]) + Call: (16) expression3("b") + Unify: (16) expression3("b") + Call: (17) isval("b") + Unify: (17) isval("b") + Call: (18) number("b") + Fail: (18) number("b") + Fail: (17) isval("b") + Redo: (16) expression3("b") + Unify: (16) expression3("b") + Call: (17) string("b") + Exit: (17) string("b") + Exit: (16) expression3("b") + Call: (16) expression2(["c"]) + Unify: (16) expression2(["c"]) + Call: (17) expression3("c") + Unify: (17) expression3("c") + Call: 
(18) isval("c") + Unify: (18) isval("c") + Call: (19) number("c") + Fail: (19) number("c") + Fail: (18) isval("c") + Redo: (17) expression3("c") + Unify: (17) expression3("c") + Call: (18) string("c") + Exit: (18) string("c") + Exit: (17) expression3("c") + Call: (17) expression2([]) + Unify: (17) expression2([]) + Exit: (17) expression2([]) + Exit: (16) expression2(["c"]) + Exit: (15) expression2(["b", "c"]) + Exit: (14) expression(["b", "c"]) + Call: (14) val1emptyorvalsequal(empty, ["b", "c"]) + Unify: (14) val1emptyorvalsequal(empty, ["b", "c"]) + Exit: (14) val1emptyorvalsequal(empty, ["b", "c"]) + Call: (14) putvalue([v, d], ["b", "c"], [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]], [[v, d], empty]], _G5805) + Unify: (14) putvalue([v, d], ["b", "c"], [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]], [[v, d], empty]], _G5805) +^ Call: (15) not(isvar([v, d])) +^ Unify: (15) not(user:isvar([v, d])) + Call: (16) isvar([v, d]) + Unify: (16) isvar([v, d]) + Exit: (16) isvar([v, d]) +^ Fail: (15) not(user:isvar([v, d])) + Redo: (14) putvalue([v, d], ["b", "c"], [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]], [[v, d], empty]], _G5805) + Call: (15) isvar([v, d]) + Unify: (15) isvar([v, d]) + Exit: (15) isvar([v, d]) + Call: (15) isvalstrorundef(["b", "c"]) + Unify: (15) isvalstrorundef(["b", "c"]) + Call: (16) var(["b", "c"]) + Fail: (16) var(["b", "c"]) + Redo: (15) isvalstrorundef(["b", "c"]) + Unify: (15) isvalstrorundef(["b", "c"]) +^ Call: (16) not(var(["b", "c"])) +^ Unify: (16) not(user:var(["b", "c"])) +^ Exit: (16) not(user:var(["b", "c"])) + Call: (16) isval(["b", "c"]) + Unify: (16) isval(["b", "c"]) + Call: (17) number(["b", "c"]) + Fail: (17) number(["b", "c"]) + Fail: (16) isval(["b", "c"]) + Redo: (15) isvalstrorundef(["b", "c"]) + Unify: (15) isvalstrorundef(["b", "c"]) +^ Call: (16) not(var(["b", "c"])) +^ Unify: (16) not(user:var(["b", "c"])) +^ Exit: (16) not(user:var(["b", "c"])) + Call: (16) expression(["b", "c"]) + Unify: (16) expression(["b", "c"]) + 
Call: (17) isval(["b", "c"]) + Unify: (17) isval(["b", "c"]) + Call: (18) number(["b", "c"]) + Fail: (18) number(["b", "c"]) + Fail: (17) isval(["b", "c"]) + Redo: (16) expression(["b", "c"]) + Unify: (16) expression(["b", "c"]) + Call: (17) string(["b", "c"]) + Fail: (17) string(["b", "c"]) + Redo: (16) expression(["b", "c"]) + Unify: (16) expression(["b", "c"]) + Call: (17) atom(["b", "c"]) + Fail: (17) atom(["b", "c"]) + Redo: (16) expression(["b", "c"]) + Unify: (16) expression(["b", "c"]) +^ Call: (17) not(atom(["b", "c"])) +^ Unify: (17) not(user:atom(["b", "c"])) +^ Exit: (17) not(user:atom(["b", "c"])) + Call: (17) length(["b", "c"], _G5813) + Unify: (17) length(["b", "c"], _G5813) + Exit: (17) length(["b", "c"], 2) + Call: (17) 2>=1 + Exit: (17) 2>=1 + Call: (17) expression2(["b", "c"]) + Unify: (17) expression2(["b", "c"]) + Call: (18) expression3("b") + Unify: (18) expression3("b") + Call: (19) isval("b") + Unify: (19) isval("b") + Call: (20) number("b") + Fail: (20) number("b") + Fail: (19) isval("b") + Redo: (18) expression3("b") + Unify: (18) expression3("b") + Call: (19) string("b") + Exit: (19) string("b") + Exit: (18) expression3("b") + Call: (18) expression2(["c"]) + Unify: (18) expression2(["c"]) + Call: (19) expression3("c") + Unify: (19) expression3("c") + Call: (20) isval("c") + Unify: (20) isval("c") + Call: (21) number("c") + Fail: (21) number("c") + Fail: (20) isval("c") + Redo: (19) expression3("c") + Unify: (19) expression3("c") + Call: (20) string("c") + Exit: (20) string("c") + Exit: (19) expression3("c") + Call: (19) expression2([]) + Unify: (19) expression2([]) + Exit: (19) expression2([]) + Exit: (18) expression2(["c"]) + Exit: (17) expression2(["b", "c"]) + Exit: (16) expression(["b", "c"]) + Exit: (15) isvalstrorundef(["b", "c"]) + Call: (15) updatevar([v, d], ["b", "c"], [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]], [[v, d], empty]], _G5815) + Unify: (15) updatevar([v, d], ["b", "c"], [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]], 
[[v, d], empty]], _G5815) + Call: (16) lists:member([[v, d], empty], [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]], [[v, d], empty]]) + Unify: (16) lists:member([[v, d], empty], [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]], [[v, d], empty]]) + Exit: (16) lists:member([[v, d], empty], [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]], [[v, d], empty]]) + Call: (16) lists:delete([[[v, c], ["b", "c"]], [[v, b], ["b", "a"]], [[v, d], empty]], [[v, d], empty], _G5826) + Unify: (16) lists:delete([[[v, c], ["b", "c"]], [[v, b], ["b", "a"]], [[v, d], empty]], [[v, d], empty], _G5826) + Exit: (16) lists:delete([[[v, c], ["b", "c"]], [[v, b], ["b", "a"]], [[v, d], empty]], [[v, d], empty], [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]]]) + Call: (16) lists:append([[[v, c], ["b", "c"]], [[v, b], ["b", "a"]]], [[[v, d], ["b", "c"]]], _G5841) + Unify: (16) lists:append([[[v, c], ["b", "c"]], [[v, b], ["b", "a"]]], [[[v, d], ["b", "c"]]], [[[v, c], ["b", "c"]]|_G5833]) + Exit: (16) lists:append([[[v, c], ["b", "c"]], [[v, b], ["b", "a"]]], [[[v, d], ["b", "c"]]], [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]], [[v, d], ["b", "c"]]]) + Call: (16) true + Exit: (16) true + Call: (16) true + Exit: (16) true + Call: (16) true + Exit: (16) true + Exit: (15) updatevar([v, d], ["b", "c"], [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]], [[v, d], empty]], [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]], [[v, d], ["b", "c"]]]) + Exit: (14) putvalue([v, d], ["b", "c"], [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]], [[v, d], empty]], [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]], [[v, d], ["b", "c"]]]) + Call: (14) debug(on) + Fail: (14) debug(on) + Redo: (13) interpretpart(is, [v, d], [v, c], [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]], [[v, d], empty]], [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]], [[v, d], ["b", "c"]]]) + Call: (14) true + Exit: (14) true + Call: (14) debug(on) + Fail: (14) debug(on) + Redo: (13) interpretpart(is, [v, d], [v, c], [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]], [[v, d], empty]], [[[v, 
c], ["b", "c"]], [[v, b], ["b", "a"]], [[v, d], ["b", "c"]]]) + Call: (14) true + Exit: (14) true + Exit: (13) interpretpart(is, [v, d], [v, c], [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]], [[v, d], empty]], [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]], [[v, d], ["b", "c"]]]) + Exit: (12) interpretstatement1([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[n, =], [[v, d], [v, c]]], [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]], [[v, d], empty]], [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]], [[v, d], ["b", "c"]]], true, nocut) +^ Call: (12) not(nocut=cut) +^ Unify: (12) not(user: (nocut=cut)) +^ Exit: (12) not(user: (nocut=cut)) + Call: (12) _G5851=[[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]] + Exit: (12) [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]]=[[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]] + Call: (12) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]], [[v, d], ["b", "c"]]], _G5854, [], _G5856) + Unify: (12) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]], [[v, d], ["b", "c"]]], [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]], [[v, d], ["b", "c"]]], [], true) + Exit: (12) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]], [[v, d], ["b", "c"]]], [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]], [[v, d], ["b", "c"]]], [], true) + Call: (12) logicalconjunction(true, true, true) + Unify: (12) logicalconjunction(true, true, true) + Call: (13) true(true) + Unify: (13) true(true) + Exit: (13) true(true) 
+ Call: (13) true(true) + Unify: (13) true(true) + Exit: (13) true(true) + Exit: (12) logicalconjunction(true, true, true) + Exit: (11) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]], [[v, d], empty]], [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]], [[v, d], ["b", "c"]]], [[[n, =], [[v, d], [v, c]]]], true) + Call: (11) updatevars([[[v, d], [v, d]]], [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]], [[v, d], ["b", "c"]]], [], _G5854) + Unify: (11) updatevars([[[v, d], [v, d]]], [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]], [[v, d], ["b", "c"]]], [], _G5854) + Call: (12) [[[v, d], [v, d]]]=[[_G5847, _G5850]|_G5845] + Exit: (12) [[[v, d], [v, d]]]=[[[v, d], [v, d]]] + Call: (12) expressionnotatom([v, d]) + Unify: (12) expressionnotatom([v, d]) + Call: (13) isvalstrempty([v, d]) + Unify: (13) isvalstrempty([v, d]) + Call: (14) var([v, d]) + Fail: (14) var([v, d]) + Redo: (13) isvalstrempty([v, d]) + Unify: (13) isvalstrempty([v, d]) + Call: (14) isval([v, d]) + Unify: (14) isval([v, d]) + Call: (15) number([v, d]) + Fail: (15) number([v, d]) + Fail: (14) isval([v, d]) + Redo: (13) isvalstrempty([v, d]) + Unify: (13) isvalstrempty([v, d]) + Call: (14) string([v, d]) + Fail: (14) string([v, d]) + Fail: (13) isvalstrempty([v, d]) + Redo: (12) expressionnotatom([v, d]) + Unify: (12) expressionnotatom([v, d]) +^ Call: (13) not(atom([v, d])) +^ Unify: (13) not(user:atom([v, d])) +^ Exit: (13) not(user:atom([v, d])) + Call: (13) length([v, d], _G5866) + Unify: (13) length([v, d], _G5866) + Exit: (13) length([v, d], 2) + Call: (13) 2>=1 + Exit: (13) 2>=1 + Call: (13) expressionnotatom2([v, d]) + Unify: (13) expressionnotatom2([v, d]) + Call: (14) isvalstrempty(v) + Unify: (14) isvalstrempty(v) + Call: (15) var(v) + Fail: (15) var(v) + Redo: (14) isvalstrempty(v) + Unify: (14) isvalstrempty(v) + Call: (15) isval(v) + Unify: (15) isval(v) + 
Call: (16) number(v) + Fail: (16) number(v) + Fail: (15) isval(v) + Redo: (14) isvalstrempty(v) + Unify: (14) isvalstrempty(v) + Call: (15) string(v) + Fail: (15) string(v) + Fail: (14) isvalstrempty(v) + Fail: (13) expressionnotatom2([v, d]) + Redo: (12) expressionnotatom([v, d]) + Unify: (12) expressionnotatom([v, d]) + Call: (13) predicate_or_rule_name([v, d]) + Fail: (13) predicate_or_rule_name([v, d]) + Fail: (12) expressionnotatom([v, d]) + Redo: (11) updatevars([[[v, d], [v, d]]], [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]], [[v, d], ["b", "c"]]], [], _G5863) + Call: (12) lists:member([[v, d], _G5856], [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]], [[v, d], ["b", "c"]]]) + Unify: (12) lists:member([[v, d], _G5856], [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]], [[v, d], ["b", "c"]]]) + Exit: (12) lists:member([[v, d], ["b", "c"]], [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]], [[v, d], ["b", "c"]]]) + Call: (12) lists:append([], [[[v, d], ["b", "c"]]], _G5877) + Unify: (12) lists:append([], [[[v, d], ["b", "c"]]], [[[v, d], ["b", "c"]]]) + Exit: (12) lists:append([], [[[v, d], ["b", "c"]]], [[[v, d], ["b", "c"]]]) + Call: (12) updatevars([], [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]], [[v, d], ["b", "c"]]], [[[v, d], ["b", "c"]]], _G5878) + Unify: (12) updatevars([], [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]], [[v, d], ["b", "c"]]], [[[v, d], ["b", "c"]]], [[[v, d], ["b", "c"]]]) + Exit: (12) updatevars([], [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]], [[v, d], ["b", "c"]]], [[[v, d], ["b", "c"]]], [[[v, d], ["b", "c"]]]) + Exit: (11) updatevars([[[v, d], [v, d]]], [[[v, c], ["b", "c"]], [[v, b], ["b", "a"]], [[v, d], ["b", "c"]]], [], [[[v, d], ["b", "c"]]]) +^ Call: (11) not([[[v, d], ["b", "c"]]]=[]) +^ Unify: (11) not(user: ([[[v, d], ["b", "c"]]]=[])) +^ Exit: (11) not(user: ([[[v, d], ["b", "c"]]]=[])) + Call: (11) unique1([[[v, d], ["b", "c"]]], [], _G14) + Unify: (11) unique1([[[v, d], ["b", "c"]]], [], _G14) + Call: (12) lists:delete([], [[v, d], ["b", 
"c"]], _G5883) + Unify: (12) lists:delete([], [[v, d], ["b", "c"]], []) + Exit: (12) lists:delete([], [[v, d], ["b", "c"]], []) + Call: (12) lists:append([], [[[v, d], ["b", "c"]]], _G5886) + Unify: (12) lists:append([], [[[v, d], ["b", "c"]]], [[[v, d], ["b", "c"]]]) + Exit: (12) lists:append([], [[[v, d], ["b", "c"]]], [[[v, d], ["b", "c"]]]) + Call: (12) unique1([], [[[v, d], ["b", "c"]]], _G14) + Unify: (12) unique1([], [[[v, d], ["b", "c"]]], [[[v, d], ["b", "c"]]]) + Exit: (12) unique1([], [[[v, d], ["b", "c"]]], [[[v, d], ["b", "c"]]]) + Exit: (11) unique1([[[v, d], ["b", "c"]]], [], [[[v, d], ["b", "c"]]]) + Call: (11) findresult3([["b", "c"], ["b", "a"], [v, d]], [[[v, d], ["b", "c"]]], [], _G5887) + Unify: (11) findresult3([["b", "c"], ["b", "a"], [v, d]], [[[v, d], ["b", "c"]]], [], _G5887) + Call: (12) [["b", "c"], ["b", "a"], [v, d]]=[_G5877|_G5878] + Exit: (12) [["b", "c"], ["b", "a"], [v, d]]=[["b", "c"], ["b", "a"], [v, d]] + Call: (12) expressionnotatom(["b", "c"]) + Unify: (12) expressionnotatom(["b", "c"]) + Call: (13) isvalstrempty(["b", "c"]) + Unify: (13) isvalstrempty(["b", "c"]) + Call: (14) var(["b", "c"]) + Fail: (14) var(["b", "c"]) + Redo: (13) isvalstrempty(["b", "c"]) + Unify: (13) isvalstrempty(["b", "c"]) + Call: (14) isval(["b", "c"]) + Unify: (14) isval(["b", "c"]) + Call: (15) number(["b", "c"]) + Fail: (15) number(["b", "c"]) + Fail: (14) isval(["b", "c"]) + Redo: (13) isvalstrempty(["b", "c"]) + Unify: (13) isvalstrempty(["b", "c"]) + Call: (14) string(["b", "c"]) + Fail: (14) string(["b", "c"]) + Fail: (13) isvalstrempty(["b", "c"]) + Redo: (12) expressionnotatom(["b", "c"]) + Unify: (12) expressionnotatom(["b", "c"]) +^ Call: (13) not(atom(["b", "c"])) +^ Unify: (13) not(user:atom(["b", "c"])) +^ Exit: (13) not(user:atom(["b", "c"])) + Call: (13) length(["b", "c"], _G5893) + Unify: (13) length(["b", "c"], _G5893) + Exit: (13) length(["b", "c"], 2) + Call: (13) 2>=1 + Exit: (13) 2>=1 + Call: (13) expressionnotatom2(["b", "c"]) 
+ Unify: (13) expressionnotatom2(["b", "c"]) + Call: (14) isvalstrempty("b") + Unify: (14) isvalstrempty("b") + Call: (15) var("b") + Fail: (15) var("b") + Redo: (14) isvalstrempty("b") + Unify: (14) isvalstrempty("b") + Call: (15) isval("b") + Unify: (15) isval("b") + Call: (16) number("b") + Fail: (16) number("b") + Fail: (15) isval("b") + Redo: (14) isvalstrempty("b") + Unify: (14) isvalstrempty("b") + Call: (15) string("b") + Exit: (15) string("b") + Exit: (14) isvalstrempty("b") + Call: (14) expressionnotatom2(["c"]) + Unify: (14) expressionnotatom2(["c"]) + Call: (15) isvalstrempty("c") + Unify: (15) isvalstrempty("c") + Call: (16) var("c") + Fail: (16) var("c") + Redo: (15) isvalstrempty("c") + Unify: (15) isvalstrempty("c") + Call: (16) isval("c") + Unify: (16) isval("c") + Call: (17) number("c") + Fail: (17) number("c") + Fail: (16) isval("c") + Redo: (15) isvalstrempty("c") + Unify: (15) isvalstrempty("c") + Call: (16) string("c") + Exit: (16) string("c") + Exit: (15) isvalstrempty("c") + Call: (15) expressionnotatom2([]) + Unify: (15) expressionnotatom2([]) + Exit: (15) expressionnotatom2([]) + Exit: (14) expressionnotatom2(["c"]) + Exit: (13) expressionnotatom2(["b", "c"]) + Exit: (12) expressionnotatom(["b", "c"]) + Call: (12) lists:append([], [["b", "c"]], _G5897) + Unify: (12) lists:append([], [["b", "c"]], [["b", "c"]]) + Exit: (12) lists:append([], [["b", "c"]], [["b", "c"]]) + Call: (12) findresult3([["b", "a"], [v, d]], [[[v, d], ["b", "c"]]], [["b", "c"]], _G5898) + Unify: (12) findresult3([["b", "a"], [v, d]], [[[v, d], ["b", "c"]]], [["b", "c"]], _G5898) + Call: (13) [["b", "a"], [v, d]]=[_G5888|_G5889] + Exit: (13) [["b", "a"], [v, d]]=[["b", "a"], [v, d]] + Call: (13) expressionnotatom(["b", "a"]) + Unify: (13) expressionnotatom(["b", "a"]) + Call: (14) isvalstrempty(["b", "a"]) + Unify: (14) isvalstrempty(["b", "a"]) + Call: (15) var(["b", "a"]) + Fail: (15) var(["b", "a"]) + Redo: (14) isvalstrempty(["b", "a"]) + Unify: (14) 
isvalstrempty(["b", "a"]) + Call: (15) isval(["b", "a"]) + Unify: (15) isval(["b", "a"]) + Call: (16) number(["b", "a"]) + Fail: (16) number(["b", "a"]) + Fail: (15) isval(["b", "a"]) + Redo: (14) isvalstrempty(["b", "a"]) + Unify: (14) isvalstrempty(["b", "a"]) + Call: (15) string(["b", "a"]) + Fail: (15) string(["b", "a"]) + Fail: (14) isvalstrempty(["b", "a"]) + Redo: (13) expressionnotatom(["b", "a"]) + Unify: (13) expressionnotatom(["b", "a"]) +^ Call: (14) not(atom(["b", "a"])) +^ Unify: (14) not(user:atom(["b", "a"])) +^ Exit: (14) not(user:atom(["b", "a"])) + Call: (14) length(["b", "a"], _G5904) + Unify: (14) length(["b", "a"], _G5904) + Exit: (14) length(["b", "a"], 2) + Call: (14) 2>=1 + Exit: (14) 2>=1 + Call: (14) expressionnotatom2(["b", "a"]) + Unify: (14) expressionnotatom2(["b", "a"]) + Call: (15) isvalstrempty("b") + Unify: (15) isvalstrempty("b") + Call: (16) var("b") + Fail: (16) var("b") + Redo: (15) isvalstrempty("b") + Unify: (15) isvalstrempty("b") + Call: (16) isval("b") + Unify: (16) isval("b") + Call: (17) number("b") + Fail: (17) number("b") + Fail: (16) isval("b") + Redo: (15) isvalstrempty("b") + Unify: (15) isvalstrempty("b") + Call: (16) string("b") + Exit: (16) string("b") + Exit: (15) isvalstrempty("b") + Call: (15) expressionnotatom2(["a"]) + Unify: (15) expressionnotatom2(["a"]) + Call: (16) isvalstrempty("a") + Unify: (16) isvalstrempty("a") + Call: (17) var("a") + Fail: (17) var("a") + Redo: (16) isvalstrempty("a") + Unify: (16) isvalstrempty("a") + Call: (17) isval("a") + Unify: (17) isval("a") + Call: (18) number("a") + Fail: (18) number("a") + Fail: (17) isval("a") + Redo: (16) isvalstrempty("a") + Unify: (16) isvalstrempty("a") + Call: (17) string("a") + Exit: (17) string("a") + Exit: (16) isvalstrempty("a") + Call: (16) expressionnotatom2([]) + Unify: (16) expressionnotatom2([]) + Exit: (16) expressionnotatom2([]) + Exit: (15) expressionnotatom2(["a"]) + Exit: (14) expressionnotatom2(["b", "a"]) + Exit: (13) 
expressionnotatom(["b", "a"]) + Call: (13) lists:append([["b", "c"]], [["b", "a"]], _G5908) + Unify: (13) lists:append([["b", "c"]], [["b", "a"]], [["b", "c"]|_G5900]) + Exit: (13) lists:append([["b", "c"]], [["b", "a"]], [["b", "c"], ["b", "a"]]) + Call: (13) findresult3([[v, d]], [[[v, d], ["b", "c"]]], [["b", "c"], ["b", "a"]], _G5912) + Unify: (13) findresult3([[v, d]], [[[v, d], ["b", "c"]]], [["b", "c"], ["b", "a"]], _G5912) + Call: (14) [[v, d]]=[_G5902|_G5903] + Exit: (14) [[v, d]]=[[v, d]] + Call: (14) expressionnotatom([v, d]) + Unify: (14) expressionnotatom([v, d]) + Call: (15) isvalstrempty([v, d]) + Unify: (15) isvalstrempty([v, d]) + Call: (16) var([v, d]) + Fail: (16) var([v, d]) + Redo: (15) isvalstrempty([v, d]) + Unify: (15) isvalstrempty([v, d]) + Call: (16) isval([v, d]) + Unify: (16) isval([v, d]) + Call: (17) number([v, d]) + Fail: (17) number([v, d]) + Fail: (16) isval([v, d]) + Redo: (15) isvalstrempty([v, d]) + Unify: (15) isvalstrempty([v, d]) + Call: (16) string([v, d]) + Fail: (16) string([v, d]) + Fail: (15) isvalstrempty([v, d]) + Redo: (14) expressionnotatom([v, d]) + Unify: (14) expressionnotatom([v, d]) +^ Call: (15) not(atom([v, d])) +^ Unify: (15) not(user:atom([v, d])) +^ Exit: (15) not(user:atom([v, d])) + Call: (15) length([v, d], _G5918) + Unify: (15) length([v, d], _G5918) + Exit: (15) length([v, d], 2) + Call: (15) 2>=1 + Exit: (15) 2>=1 + Call: (15) expressionnotatom2([v, d]) + Unify: (15) expressionnotatom2([v, d]) + Call: (16) isvalstrempty(v) + Unify: (16) isvalstrempty(v) + Call: (17) var(v) + Fail: (17) var(v) + Redo: (16) isvalstrempty(v) + Unify: (16) isvalstrempty(v) + Call: (17) isval(v) + Unify: (17) isval(v) + Call: (18) number(v) + Fail: (18) number(v) + Fail: (17) isval(v) + Redo: (16) isvalstrempty(v) + Unify: (16) isvalstrempty(v) + Call: (17) string(v) + Fail: (17) string(v) + Fail: (16) isvalstrempty(v) + Fail: (15) expressionnotatom2([v, d]) + Redo: (14) expressionnotatom([v, d]) + Unify: (14) 
expressionnotatom([v, d]) + Call: (15) predicate_or_rule_name([v, d]) + Fail: (15) predicate_or_rule_name([v, d]) + Fail: (14) expressionnotatom([v, d]) + Redo: (13) findresult3([[v, d]], [[[v, d], ["b", "c"]]], [["b", "c"], ["b", "a"]], _G5912) + Unify: (13) findresult3([[v, d]], [[[v, d], ["b", "c"]]], [["b", "c"], ["b", "a"]], _G5912) + Call: (14) [[v, d]]=[_G5902|_G5903] + Exit: (14) [[v, d]]=[[v, d]] + Call: (14) isvar([v, d]) + Unify: (14) isvar([v, d]) + Exit: (14) isvar([v, d]) + Call: (14) lists:member([[v, d], _G5908], [[[v, d], ["b", "c"]]]) + Unify: (14) lists:member([[v, d], _G5908], [[[v, d], ["b", "c"]]]) + Exit: (14) lists:member([[v, d], ["b", "c"]], [[[v, d], ["b", "c"]]]) + Call: (14) lists:append([["b", "c"], ["b", "a"]], [["b", "c"]], _G5923) + Unify: (14) lists:append([["b", "c"], ["b", "a"]], [["b", "c"]], [["b", "c"]|_G5915]) + Exit: (14) lists:append([["b", "c"], ["b", "a"]], [["b", "c"]], [["b", "c"], ["b", "a"], ["b", "c"]]) + Call: (14) findresult3([], [[[v, d], ["b", "c"]]], [["b", "c"], ["b", "a"], ["b", "c"]], _G5930) + Unify: (14) findresult3([], [[[v, d], ["b", "c"]]], [["b", "c"], ["b", "a"], ["b", "c"]], [["b", "c"], ["b", "a"], ["b", "c"]]) + Exit: (14) findresult3([], [[[v, d], ["b", "c"]]], [["b", "c"], ["b", "a"], ["b", "c"]], [["b", "c"], ["b", "a"], ["b", "c"]]) + Exit: (13) findresult3([[v, d]], [[[v, d], ["b", "c"]]], [["b", "c"], ["b", "a"]], [["b", "c"], ["b", "a"], ["b", "c"]]) + Exit: (12) findresult3([["b", "a"], [v, d]], [[[v, d], ["b", "c"]]], [["b", "c"]], [["b", "c"], ["b", "a"], ["b", "c"]]) + Exit: (11) findresult3([["b", "c"], ["b", "a"], [v, d]], [[[v, d], ["b", "c"]]], [], [["b", "c"], ["b", "a"], ["b", "c"]]) + Call: (11) debug(on) + Fail: (11) debug(on) + Redo: (10) member1([[n, f], [["b", "c"], ["b", "a"], [v, d]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, d], ["b", "c"]]]) + Call: (11) true + Exit: (11) 
true + Call: (11) true + Exit: (11) true + Exit: (10) member1([[n, f], [["b", "c"], ["b", "a"], [v, d]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, d], ["b", "c"]]]) + Exit: (9) interpret1(off, [[n, f], [["b", "c"], ["b", "a"], [v, d]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, d], ["b", "c"]]]) + Exit: (8) interpret(off, [[n, f], [["b", "c"], ["b", "a"], [v, d]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, d], ["b", "c"]]]) + + Call: (8) interpret(off, [[n, f], [["c", "b"], ["c", "a"], [v, d]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], _G15) + Unify: (8) interpret(off, [[n, f], [["c", "b"], ["c", "a"], [v, d]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], _G15) + Call: (9) convert_to_grammar_part1([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [], _G5552) + Unify: (9) convert_to_grammar_part1([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [], _G5552) + Call: (10) convert_to_grammar_part11([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [], _G5552, [], _G5554) + Unify: (10) convert_to_grammar_part11([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [], _G5552, [], _G5554) + Call: (11) [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]]=[_G5543|_G5544] + Exit: (11) [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]]=[[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]] + Call: (11) [[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n, =], [[...|...]|...]]]]=[[n, _G5552], _G5555, "->", _G5564] + Fail: (11) [[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n, =], [[...|...]|...]]]]=[[n, _G5552], _G5555, "->", _G5564] + Redo: (10) convert_to_grammar_part11([[[n, f], [[v, c], [v, b], 
[v, d]], ":-", [[[n|...], [...|...]]]]], [], _G5555, [], _G5557) + Call: (11) [[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n, =], [[...|...]|...]]]]=[[n, _G5552], "->", _G5561] + Fail: (11) [[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n, =], [[...|...]|...]]]]=[[n, _G5552], "->", _G5561] + Redo: (10) convert_to_grammar_part11([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [], _G5552, [], _G5554) + Unify: (10) convert_to_grammar_part11([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [], _G5552, [], _G5554) + Call: (11) [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]]=[_G5543|_G5544] + Exit: (11) [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]]=[[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]] + Call: (11) [[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n, =], [[...|...]|...]]]]=[_G5546, _G5549, ":-", _G5558] + Exit: (11) [[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n, =], [[...|...]|...]]]]=[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n, =], [[...|...]|...]]]] + Call: (11) true + Exit: (11) true + Call: (11) true + Exit: (11) true + Call: (11) true + Exit: (11) true + Call: (11) true + Exit: (11) true + Call: (11) lists:append([], [[[n, f], [[v, c], [v, b], [v|...]], ":-", [[[...|...]|...]]]], _G5573) + Unify: (11) lists:append([], [[[n, f], [[v, c], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, f], [[v, c], [v, b], [v|...]], ":-", [[[...|...]|...]]]]) + Exit: (11) lists:append([], [[[n, f], [[v, c], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, f], [[v, c], [v, b], [v|...]], ":-", [[[...|...]|...]]]]) + Call: (11) convert_to_grammar_part11([], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], _G5573, [], _G5575) + Unify: (11) convert_to_grammar_part11([], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [], []) + Exit: (11) convert_to_grammar_part11([], [[[n, f], [[v, 
c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [], []) + Exit: (10) convert_to_grammar_part11([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [], []) + Call: (10) lists:append([[[n, f], [[v, c], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [], _G5573) + Unify: (10) lists:append([[[n, f], [[v, c], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [], [[[n, f], [[v, c], [v, b], [v|...]], ":-", [[[...|...]|...]]]|_G5565]) + Exit: (10) lists:append([[[n, f], [[v, c], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [], [[[n, f], [[v, c], [v, b], [v|...]], ":-", [[[...|...]|...]]]]) + Exit: (9) convert_to_grammar_part1([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]]) + Call: (9) interpret1(off, [[n, f], [["c", "b"], ["c", "a"], [v, d]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], _G15) + Unify: (9) interpret1(off, [[n, f], [["c", "b"], ["c", "a"], [v, d]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], _G15) +^ Call: (10) retractall(debug(_G5567)) +^ Exit: (10) retractall(debug(_G5567)) +^ Call: (10) assertz(debug(off)) +^ Exit: (10) assertz(debug(off)) +^ Call: (10) retractall(cut(_G5571)) +^ Exit: (10) retractall(cut(_G5571)) +^ Call: (10) assertz(cut(off)) +^ Exit: (10) assertz(cut(off)) + Call: (10) member1([[n, f], [["c", "b"], ["c", "a"], [v, d]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], _G15) + Unify: (10) member1([[n, f], [["c", "b"], ["c", "a"], [v, d]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, 
b], [v, d]], ":-", [[[n|...], [...|...]]]]], _G15) + Call: (11) cut(off) + Unify: (11) cut(off) + Exit: (11) cut(off) + Call: (11) [[n, f], [["c", "b"], ["c", "a"], [v, d]]]=[_G5575, _G5578] + Exit: (11) [[n, f], [["c", "b"], ["c", "a"], [v, d]]]=[[n, f], [["c", "b"], ["c", "a"], [v, d]]] + Call: (11) [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]]=[[[n, f], _G5587, ":-", _G5596]|_G5582] + Exit: (11) [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]]=[[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]] + Call: (11) length([["c", "b"], ["c", "a"], [v, d]], _G5607) + Unify: (11) length([["c", "b"], ["c", "a"], [v, d]], _G5607) + Exit: (11) length([["c", "b"], ["c", "a"], [v, d]], 3) + Call: (11) length([[v, c], [v, b], [v, d]], 3) + Unify: (11) length([[v, c], [v, b], [v, d]], 3) + Unify: (11) length([[v, c], [v, b], [v, d]], 3) + Exit: (11) length([[v, c], [v, b], [v, d]], 3) + Call: (11) [n, f]=[n, grammar] + Fail: (11) [n, f]=[n, grammar] + Redo: (10) member1([[n, f], [["c", "b"], ["c", "a"], [v, d]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], _G15) + Call: (11) [n, f]=[n, grammar_part] + Fail: (11) [n, f]=[n, grammar_part] + Redo: (10) member1([[n, f], [["c", "b"], ["c", "a"], [v, d]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], _G15) + Call: (11) checkarguments([["c", "b"], ["c", "a"], [v, d]], [[v, c], [v, b], [v, d]], [], _G5609, [], _G5611) + Unify: (11) checkarguments([["c", "b"], ["c", "a"], [v, d]], [[v, c], [v, b], [v, d]], [], _G5609, [], _G5611) + Call: (12) [["c", "b"], ["c", "a"], [v, d]]=[_G5599|_G5600] + Exit: (12) [["c", "b"], ["c", "a"], [v, d]]=[["c", "b"], ["c", "a"], [v, d]] + Call: (12) expressionnotatom(["c", "b"]) + Unify: (12) expressionnotatom(["c", "b"]) + Call: (13) isvalstrempty(["c", "b"]) + Unify: (13) 
isvalstrempty(["c", "b"]) + Call: (14) var(["c", "b"]) + Fail: (14) var(["c", "b"]) + Redo: (13) isvalstrempty(["c", "b"]) + Unify: (13) isvalstrempty(["c", "b"]) + Call: (14) isval(["c", "b"]) + Unify: (14) isval(["c", "b"]) + Call: (15) number(["c", "b"]) + Fail: (15) number(["c", "b"]) + Fail: (14) isval(["c", "b"]) + Redo: (13) isvalstrempty(["c", "b"]) + Unify: (13) isvalstrempty(["c", "b"]) + Call: (14) string(["c", "b"]) + Fail: (14) string(["c", "b"]) + Fail: (13) isvalstrempty(["c", "b"]) + Redo: (12) expressionnotatom(["c", "b"]) + Unify: (12) expressionnotatom(["c", "b"]) +^ Call: (13) not(atom(["c", "b"])) +^ Unify: (13) not(user:atom(["c", "b"])) +^ Exit: (13) not(user:atom(["c", "b"])) + Call: (13) length(["c", "b"], _G5615) + Unify: (13) length(["c", "b"], _G5615) + Exit: (13) length(["c", "b"], 2) + Call: (13) 2>=1 + Exit: (13) 2>=1 + Call: (13) expressionnotatom2(["c", "b"]) + Unify: (13) expressionnotatom2(["c", "b"]) + Call: (14) isvalstrempty("c") + Unify: (14) isvalstrempty("c") + Call: (15) var("c") + Fail: (15) var("c") + Redo: (14) isvalstrempty("c") + Unify: (14) isvalstrempty("c") + Call: (15) isval("c") + Unify: (15) isval("c") + Call: (16) number("c") + Fail: (16) number("c") + Fail: (15) isval("c") + Redo: (14) isvalstrempty("c") + Unify: (14) isvalstrempty("c") + Call: (15) string("c") + Exit: (15) string("c") + Exit: (14) isvalstrempty("c") + Call: (14) expressionnotatom2(["b"]) + Unify: (14) expressionnotatom2(["b"]) + Call: (15) isvalstrempty("b") + Unify: (15) isvalstrempty("b") + Call: (16) var("b") + Fail: (16) var("b") + Redo: (15) isvalstrempty("b") + Unify: (15) isvalstrempty("b") + Call: (16) isval("b") + Unify: (16) isval("b") + Call: (17) number("b") + Fail: (17) number("b") + Fail: (16) isval("b") + Redo: (15) isvalstrempty("b") + Unify: (15) isvalstrempty("b") + Call: (16) string("b") + Exit: (16) string("b") + Exit: (15) isvalstrempty("b") + Call: (15) expressionnotatom2([]) + Unify: (15) expressionnotatom2([]) + Exit: 
(15) expressionnotatom2([]) + Exit: (14) expressionnotatom2(["b"]) + Exit: (13) expressionnotatom2(["c", "b"]) + Exit: (12) expressionnotatom(["c", "b"]) + Call: (12) [[v, c], [v, b], [v, d]]=[_G5607|_G5608] + Exit: (12) [[v, c], [v, b], [v, d]]=[[v, c], [v, b], [v, d]] +^ Call: (12) not(var([v, c])) +^ Unify: (12) not(user:var([v, c])) +^ Exit: (12) not(user:var([v, c])) + Call: (12) isvar([v, c]) + Unify: (12) isvar([v, c]) + Exit: (12) isvar([v, c]) + Call: (12) putvalue([v, c], ["c", "b"], [], _G5625) + Unify: (12) putvalue([v, c], ["c", "b"], [], _G5625) +^ Call: (13) not(isvar([v, c])) +^ Unify: (13) not(user:isvar([v, c])) + Call: (14) isvar([v, c]) + Unify: (14) isvar([v, c]) + Exit: (14) isvar([v, c]) +^ Fail: (13) not(user:isvar([v, c])) + Redo: (12) putvalue([v, c], ["c", "b"], [], _G5625) + Call: (13) isvar([v, c]) + Unify: (13) isvar([v, c]) + Exit: (13) isvar([v, c]) + Call: (13) isvalstrorundef(["c", "b"]) + Unify: (13) isvalstrorundef(["c", "b"]) + Call: (14) var(["c", "b"]) + Fail: (14) var(["c", "b"]) + Redo: (13) isvalstrorundef(["c", "b"]) + Unify: (13) isvalstrorundef(["c", "b"]) +^ Call: (14) not(var(["c", "b"])) +^ Unify: (14) not(user:var(["c", "b"])) +^ Exit: (14) not(user:var(["c", "b"])) + Call: (14) isval(["c", "b"]) + Unify: (14) isval(["c", "b"]) + Call: (15) number(["c", "b"]) + Fail: (15) number(["c", "b"]) + Fail: (14) isval(["c", "b"]) + Redo: (13) isvalstrorundef(["c", "b"]) + Unify: (13) isvalstrorundef(["c", "b"]) +^ Call: (14) not(var(["c", "b"])) +^ Unify: (14) not(user:var(["c", "b"])) +^ Exit: (14) not(user:var(["c", "b"])) + Call: (14) expression(["c", "b"]) + Unify: (14) expression(["c", "b"]) + Call: (15) isval(["c", "b"]) + Unify: (15) isval(["c", "b"]) + Call: (16) number(["c", "b"]) + Fail: (16) number(["c", "b"]) + Fail: (15) isval(["c", "b"]) + Redo: (14) expression(["c", "b"]) + Unify: (14) expression(["c", "b"]) + Call: (15) string(["c", "b"]) + Fail: (15) string(["c", "b"]) + Redo: (14) expression(["c", "b"]) + 
Unify: (14) expression(["c", "b"]) + Call: (15) atom(["c", "b"]) + Fail: (15) atom(["c", "b"]) + Redo: (14) expression(["c", "b"]) + Unify: (14) expression(["c", "b"]) +^ Call: (15) not(atom(["c", "b"])) +^ Unify: (15) not(user:atom(["c", "b"])) +^ Exit: (15) not(user:atom(["c", "b"])) + Call: (15) length(["c", "b"], _G5633) + Unify: (15) length(["c", "b"], _G5633) + Exit: (15) length(["c", "b"], 2) + Call: (15) 2>=1 + Exit: (15) 2>=1 + Call: (15) expression2(["c", "b"]) + Unify: (15) expression2(["c", "b"]) + Call: (16) expression3("c") + Unify: (16) expression3("c") + Call: (17) isval("c") + Unify: (17) isval("c") + Call: (18) number("c") + Fail: (18) number("c") + Fail: (17) isval("c") + Redo: (16) expression3("c") + Unify: (16) expression3("c") + Call: (17) string("c") + Exit: (17) string("c") + Exit: (16) expression3("c") + Call: (16) expression2(["b"]) + Unify: (16) expression2(["b"]) + Call: (17) expression3("b") + Unify: (17) expression3("b") + Call: (18) isval("b") + Unify: (18) isval("b") + Call: (19) number("b") + Fail: (19) number("b") + Fail: (18) isval("b") + Redo: (17) expression3("b") + Unify: (17) expression3("b") + Call: (18) string("b") + Exit: (18) string("b") + Exit: (17) expression3("b") + Call: (17) expression2([]) + Unify: (17) expression2([]) + Exit: (17) expression2([]) + Exit: (16) expression2(["b"]) + Exit: (15) expression2(["c", "b"]) + Exit: (14) expression(["c", "b"]) + Exit: (13) isvalstrorundef(["c", "b"]) + Call: (13) updatevar([v, c], ["c", "b"], [], _G5635) + Unify: (13) updatevar([v, c], ["c", "b"], [], _G5635) + Call: (14) lists:member([[v, c], empty], []) + Fail: (14) lists:member([[v, c], empty], []) + Redo: (13) updatevar([v, c], ["c", "b"], [], _G5635) +^ Call: (14) not(member([[v, c], _G5631], [])) +^ Unify: (14) not(user:member([[v, c], _G5631], [])) +^ Exit: (14) not(user:member([[v, c], _G5631], [])) + Call: (14) _G5631=empty + Exit: (14) empty=empty + Call: (14) true + Exit: (14) true + Call: (14) lists:append([], 
[[[v, c], ["c", "b"]]], _G5655) + Unify: (14) lists:append([], [[[v, c], ["c", "b"]]], [[[v, c], ["c", "b"]]]) + Exit: (14) lists:append([], [[[v, c], ["c", "b"]]], [[[v, c], ["c", "b"]]]) + Call: (14) true + Exit: (14) true + Call: (14) true + Exit: (14) true + Exit: (13) updatevar([v, c], ["c", "b"], [], [[[v, c], ["c", "b"]]]) + Exit: (12) putvalue([v, c], ["c", "b"], [], [[[v, c], ["c", "b"]]]) + Call: (12) checkarguments([["c", "a"], [v, d]], [[v, b], [v, d]], [[[v, c], ["c", "b"]]], _G5656, [], _G5658) + Unify: (12) checkarguments([["c", "a"], [v, d]], [[v, b], [v, d]], [[[v, c], ["c", "b"]]], _G5656, [], _G5658) + Call: (13) [["c", "a"], [v, d]]=[_G5646|_G5647] + Exit: (13) [["c", "a"], [v, d]]=[["c", "a"], [v, d]] + Call: (13) expressionnotatom(["c", "a"]) + Unify: (13) expressionnotatom(["c", "a"]) + Call: (14) isvalstrempty(["c", "a"]) + Unify: (14) isvalstrempty(["c", "a"]) + Call: (15) var(["c", "a"]) + Fail: (15) var(["c", "a"]) + Redo: (14) isvalstrempty(["c", "a"]) + Unify: (14) isvalstrempty(["c", "a"]) + Call: (15) isval(["c", "a"]) + Unify: (15) isval(["c", "a"]) + Call: (16) number(["c", "a"]) + Fail: (16) number(["c", "a"]) + Fail: (15) isval(["c", "a"]) + Redo: (14) isvalstrempty(["c", "a"]) + Unify: (14) isvalstrempty(["c", "a"]) + Call: (15) string(["c", "a"]) + Fail: (15) string(["c", "a"]) + Fail: (14) isvalstrempty(["c", "a"]) + Redo: (13) expressionnotatom(["c", "a"]) + Unify: (13) expressionnotatom(["c", "a"]) +^ Call: (14) not(atom(["c", "a"])) +^ Unify: (14) not(user:atom(["c", "a"])) +^ Exit: (14) not(user:atom(["c", "a"])) + Call: (14) length(["c", "a"], _G5662) + Unify: (14) length(["c", "a"], _G5662) + Exit: (14) length(["c", "a"], 2) + Call: (14) 2>=1 + Exit: (14) 2>=1 + Call: (14) expressionnotatom2(["c", "a"]) + Unify: (14) expressionnotatom2(["c", "a"]) + Call: (15) isvalstrempty("c") + Unify: (15) isvalstrempty("c") + Call: (16) var("c") + Fail: (16) var("c") + Redo: (15) isvalstrempty("c") + Unify: (15) isvalstrempty("c") + 
Call: (16) isval("c") + Unify: (16) isval("c") + Call: (17) number("c") + Fail: (17) number("c") + Fail: (16) isval("c") + Redo: (15) isvalstrempty("c") + Unify: (15) isvalstrempty("c") + Call: (16) string("c") + Exit: (16) string("c") + Exit: (15) isvalstrempty("c") + Call: (15) expressionnotatom2(["a"]) + Unify: (15) expressionnotatom2(["a"]) + Call: (16) isvalstrempty("a") + Unify: (16) isvalstrempty("a") + Call: (17) var("a") + Fail: (17) var("a") + Redo: (16) isvalstrempty("a") + Unify: (16) isvalstrempty("a") + Call: (17) isval("a") + Unify: (17) isval("a") + Call: (18) number("a") + Fail: (18) number("a") + Fail: (17) isval("a") + Redo: (16) isvalstrempty("a") + Unify: (16) isvalstrempty("a") + Call: (17) string("a") + Exit: (17) string("a") + Exit: (16) isvalstrempty("a") + Call: (16) expressionnotatom2([]) + Unify: (16) expressionnotatom2([]) + Exit: (16) expressionnotatom2([]) + Exit: (15) expressionnotatom2(["a"]) + Exit: (14) expressionnotatom2(["c", "a"]) + Exit: (13) expressionnotatom(["c", "a"]) + Call: (13) [[v, b], [v, d]]=[_G5654|_G5655] + Exit: (13) [[v, b], [v, d]]=[[v, b], [v, d]] +^ Call: (13) not(var([v, b])) +^ Unify: (13) not(user:var([v, b])) +^ Exit: (13) not(user:var([v, b])) + Call: (13) isvar([v, b]) + Unify: (13) isvar([v, b]) + Exit: (13) isvar([v, b]) + Call: (13) putvalue([v, b], ["c", "a"], [[[v, c], ["c", "b"]]], _G5672) + Unify: (13) putvalue([v, b], ["c", "a"], [[[v, c], ["c", "b"]]], _G5672) +^ Call: (14) not(isvar([v, b])) +^ Unify: (14) not(user:isvar([v, b])) + Call: (15) isvar([v, b]) + Unify: (15) isvar([v, b]) + Exit: (15) isvar([v, b]) +^ Fail: (14) not(user:isvar([v, b])) + Redo: (13) putvalue([v, b], ["c", "a"], [[[v, c], ["c", "b"]]], _G5672) + Call: (14) isvar([v, b]) + Unify: (14) isvar([v, b]) + Exit: (14) isvar([v, b]) + Call: (14) isvalstrorundef(["c", "a"]) + Unify: (14) isvalstrorundef(["c", "a"]) + Call: (15) var(["c", "a"]) + Fail: (15) var(["c", "a"]) + Redo: (14) isvalstrorundef(["c", "a"]) + Unify: (14) 
isvalstrorundef(["c", "a"]) +^ Call: (15) not(var(["c", "a"])) +^ Unify: (15) not(user:var(["c", "a"])) +^ Exit: (15) not(user:var(["c", "a"])) + Call: (15) isval(["c", "a"]) + Unify: (15) isval(["c", "a"]) + Call: (16) number(["c", "a"]) + Fail: (16) number(["c", "a"]) + Fail: (15) isval(["c", "a"]) + Redo: (14) isvalstrorundef(["c", "a"]) + Unify: (14) isvalstrorundef(["c", "a"]) +^ Call: (15) not(var(["c", "a"])) +^ Unify: (15) not(user:var(["c", "a"])) +^ Exit: (15) not(user:var(["c", "a"])) + Call: (15) expression(["c", "a"]) + Unify: (15) expression(["c", "a"]) + Call: (16) isval(["c", "a"]) + Unify: (16) isval(["c", "a"]) + Call: (17) number(["c", "a"]) + Fail: (17) number(["c", "a"]) + Fail: (16) isval(["c", "a"]) + Redo: (15) expression(["c", "a"]) + Unify: (15) expression(["c", "a"]) + Call: (16) string(["c", "a"]) + Fail: (16) string(["c", "a"]) + Redo: (15) expression(["c", "a"]) + Unify: (15) expression(["c", "a"]) + Call: (16) atom(["c", "a"]) + Fail: (16) atom(["c", "a"]) + Redo: (15) expression(["c", "a"]) + Unify: (15) expression(["c", "a"]) +^ Call: (16) not(atom(["c", "a"])) +^ Unify: (16) not(user:atom(["c", "a"])) +^ Exit: (16) not(user:atom(["c", "a"])) + Call: (16) length(["c", "a"], _G5680) + Unify: (16) length(["c", "a"], _G5680) + Exit: (16) length(["c", "a"], 2) + Call: (16) 2>=1 + Exit: (16) 2>=1 + Call: (16) expression2(["c", "a"]) + Unify: (16) expression2(["c", "a"]) + Call: (17) expression3("c") + Unify: (17) expression3("c") + Call: (18) isval("c") + Unify: (18) isval("c") + Call: (19) number("c") + Fail: (19) number("c") + Fail: (18) isval("c") + Redo: (17) expression3("c") + Unify: (17) expression3("c") + Call: (18) string("c") + Exit: (18) string("c") + Exit: (17) expression3("c") + Call: (17) expression2(["a"]) + Unify: (17) expression2(["a"]) + Call: (18) expression3("a") + Unify: (18) expression3("a") + Call: (19) isval("a") + Unify: (19) isval("a") + Call: (20) number("a") + Fail: (20) number("a") + Fail: (19) isval("a") + 
Redo: (18) expression3("a") + Unify: (18) expression3("a") + Call: (19) string("a") + Exit: (19) string("a") + Exit: (18) expression3("a") + Call: (18) expression2([]) + Unify: (18) expression2([]) + Exit: (18) expression2([]) + Exit: (17) expression2(["a"]) + Exit: (16) expression2(["c", "a"]) + Exit: (15) expression(["c", "a"]) + Exit: (14) isvalstrorundef(["c", "a"]) + Call: (14) updatevar([v, b], ["c", "a"], [[[v, c], ["c", "b"]]], _G5682) + Unify: (14) updatevar([v, b], ["c", "a"], [[[v, c], ["c", "b"]]], _G5682) + Call: (15) lists:member([[v, b], empty], [[[v, c], ["c", "b"]]]) + Unify: (15) lists:member([[v, b], empty], [[[v, c], ["c", "b"]]]) + Fail: (15) lists:member([[v, b], empty], [[[v, c], ["c", "b"]]]) + Redo: (14) updatevar([v, b], ["c", "a"], [[[v, c], ["c", "b"]]], _G5682) +^ Call: (15) not(member([[v, b], _G5678], [[[v, c], ["c", "b"]]])) +^ Unify: (15) not(user:member([[v, b], _G5678], [[[v, c], ["c", "b"]]])) +^ Exit: (15) not(user:member([[v, b], _G5678], [[[v, c], ["c", "b"]]])) + Call: (15) _G5678=empty + Exit: (15) empty=empty + Call: (15) true + Exit: (15) true + Call: (15) lists:append([[[v, c], ["c", "b"]]], [[[v, b], ["c", "a"]]], _G5702) + Unify: (15) lists:append([[[v, c], ["c", "b"]]], [[[v, b], ["c", "a"]]], [[[v, c], ["c", "b"]]|_G5694]) + Exit: (15) lists:append([[[v, c], ["c", "b"]]], [[[v, b], ["c", "a"]]], [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]]]) + Call: (15) true + Exit: (15) true + Call: (15) true + Exit: (15) true + Exit: (14) updatevar([v, b], ["c", "a"], [[[v, c], ["c", "b"]]], [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]]]) + Exit: (13) putvalue([v, b], ["c", "a"], [[[v, c], ["c", "b"]]], [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]]]) + Call: (13) checkarguments([[v, d]], [[v, d]], [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]]], _G5706, [], _G5708) + Unify: (13) checkarguments([[v, d]], [[v, d]], [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]]], _G5706, [], _G5708) + Call: (14) [[v, d]]=[_G5696|_G5697] + Exit: (14) [[v, 
d]]=[[v, d]] + Call: (14) expressionnotatom([v, d]) + Unify: (14) expressionnotatom([v, d]) + Call: (15) isvalstrempty([v, d]) + Unify: (15) isvalstrempty([v, d]) + Call: (16) var([v, d]) + Fail: (16) var([v, d]) + Redo: (15) isvalstrempty([v, d]) + Unify: (15) isvalstrempty([v, d]) + Call: (16) isval([v, d]) + Unify: (16) isval([v, d]) + Call: (17) number([v, d]) + Fail: (17) number([v, d]) + Fail: (16) isval([v, d]) + Redo: (15) isvalstrempty([v, d]) + Unify: (15) isvalstrempty([v, d]) + Call: (16) string([v, d]) + Fail: (16) string([v, d]) + Fail: (15) isvalstrempty([v, d]) + Redo: (14) expressionnotatom([v, d]) + Unify: (14) expressionnotatom([v, d]) +^ Call: (15) not(atom([v, d])) +^ Unify: (15) not(user:atom([v, d])) +^ Exit: (15) not(user:atom([v, d])) + Call: (15) length([v, d], _G5712) + Unify: (15) length([v, d], _G5712) + Exit: (15) length([v, d], 2) + Call: (15) 2>=1 + Exit: (15) 2>=1 + Call: (15) expressionnotatom2([v, d]) + Unify: (15) expressionnotatom2([v, d]) + Call: (16) isvalstrempty(v) + Unify: (16) isvalstrempty(v) + Call: (17) var(v) + Fail: (17) var(v) + Redo: (16) isvalstrempty(v) + Unify: (16) isvalstrempty(v) + Call: (17) isval(v) + Unify: (17) isval(v) + Call: (18) number(v) + Fail: (18) number(v) + Fail: (17) isval(v) + Redo: (16) isvalstrempty(v) + Unify: (16) isvalstrempty(v) + Call: (17) string(v) + Fail: (17) string(v) + Fail: (16) isvalstrempty(v) + Fail: (15) expressionnotatom2([v, d]) + Redo: (14) expressionnotatom([v, d]) + Unify: (14) expressionnotatom([v, d]) + Call: (15) predicate_or_rule_name([v, d]) + Fail: (15) predicate_or_rule_name([v, d]) + Fail: (14) expressionnotatom([v, d]) + Redo: (13) checkarguments([[v, d]], [[v, d]], [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]]], _G5706, [], _G5708) + Unify: (13) checkarguments([[v, d]], [[v, d]], [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]]], _G5706, [], _G5708) + Call: (14) [[v, d]]=[_G5696|_G5697] + Exit: (14) [[v, d]]=[[v, d]] +^ Call: (14) not(var([v, d])) +^ Unify: (14) 
not(user:var([v, d])) +^ Exit: (14) not(user:var([v, d])) + Call: (14) isvar([v, d]) + Unify: (14) isvar([v, d]) + Exit: (14) isvar([v, d]) + Call: (14) [[v, d]]=[_G5704|_G5705] + Exit: (14) [[v, d]]=[[v, d]] + Call: (14) expressionnotatom([v, d]) + Unify: (14) expressionnotatom([v, d]) + Call: (15) isvalstrempty([v, d]) + Unify: (15) isvalstrempty([v, d]) + Call: (16) var([v, d]) + Fail: (16) var([v, d]) + Redo: (15) isvalstrempty([v, d]) + Unify: (15) isvalstrempty([v, d]) + Call: (16) isval([v, d]) + Unify: (16) isval([v, d]) + Call: (17) number([v, d]) + Fail: (17) number([v, d]) + Fail: (16) isval([v, d]) + Redo: (15) isvalstrempty([v, d]) + Unify: (15) isvalstrempty([v, d]) + Call: (16) string([v, d]) + Fail: (16) string([v, d]) + Fail: (15) isvalstrempty([v, d]) + Redo: (14) expressionnotatom([v, d]) + Unify: (14) expressionnotatom([v, d]) +^ Call: (15) not(atom([v, d])) +^ Unify: (15) not(user:atom([v, d])) +^ Exit: (15) not(user:atom([v, d])) + Call: (15) length([v, d], _G5720) + Unify: (15) length([v, d], _G5720) + Exit: (15) length([v, d], 2) + Call: (15) 2>=1 + Exit: (15) 2>=1 + Call: (15) expressionnotatom2([v, d]) + Unify: (15) expressionnotatom2([v, d]) + Call: (16) isvalstrempty(v) + Unify: (16) isvalstrempty(v) + Call: (17) var(v) + Fail: (17) var(v) + Redo: (16) isvalstrempty(v) + Unify: (16) isvalstrempty(v) + Call: (17) isval(v) + Unify: (17) isval(v) + Call: (18) number(v) + Fail: (18) number(v) + Fail: (17) isval(v) + Redo: (16) isvalstrempty(v) + Unify: (16) isvalstrempty(v) + Call: (17) string(v) + Fail: (17) string(v) + Fail: (16) isvalstrempty(v) + Fail: (15) expressionnotatom2([v, d]) + Redo: (14) expressionnotatom([v, d]) + Unify: (14) expressionnotatom([v, d]) + Call: (15) predicate_or_rule_name([v, d]) + Fail: (15) predicate_or_rule_name([v, d]) + Fail: (14) expressionnotatom([v, d]) + Redo: (13) checkarguments([[v, d]], [[v, d]], [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]]], _G5706, [], _G5708) + Unify: (13) checkarguments([[v, d]], 
[[v, d]], [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]]], _G5706, [], _G5708) + Call: (14) [[v, d]]=[_G5696|_G5697] + Exit: (14) [[v, d]]=[[v, d]] +^ Call: (14) not(var([v, d])) +^ Unify: (14) not(user:var([v, d])) +^ Exit: (14) not(user:var([v, d])) + Call: (14) isvar([v, d]) + Unify: (14) isvar([v, d]) + Exit: (14) isvar([v, d]) + Call: (14) [[v, d]]=[_G5704|_G5705] + Exit: (14) [[v, d]]=[[v, d]] +^ Call: (14) not(var([v, d])) +^ Unify: (14) not(user:var([v, d])) +^ Exit: (14) not(user:var([v, d])) + Call: (14) isvar([v, d]) + Unify: (14) isvar([v, d]) + Exit: (14) isvar([v, d]) + Call: (14) getvalue([v, d], _G5720, [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]]]) + Unify: (14) getvalue([v, d], _G5720, [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]]]) +^ Call: (15) not(isvar([v, d])) +^ Unify: (15) not(user:isvar([v, d])) + Call: (16) isvar([v, d]) + Unify: (16) isvar([v, d]) + Exit: (16) isvar([v, d]) +^ Fail: (15) not(user:isvar([v, d])) + Redo: (14) getvalue([v, d], _G5720, [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]]]) + Call: (15) isvar([v, d]) + Unify: (15) isvar([v, d]) + Exit: (15) isvar([v, d]) + Call: (15) isvalstrorundef(_G5719) + Unify: (15) isvalstrorundef(_G5719) + Call: (16) var(_G5719) + Exit: (16) var(_G5719) + Exit: (15) isvalstrorundef(_G5719) + Call: (15) getvar([v, d], _G5720, [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]]]) + Unify: (15) getvar([v, d], _G5720, [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]]]) + Call: (16) lists:member([[v, d], _G5715], [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]]]) + Unify: (16) lists:member([[v, d], _G5715], [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]]]) + Fail: (16) lists:member([[v, d], _G5715], [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]]]) + Redo: (15) getvar([v, d], _G5720, [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]]]) + Unify: (15) getvar([v, d], empty, [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]]]) +^ Call: (16) not(member([[v, d], _G5718], [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]]])) +^ Unify: (16) 
not(user:member([[v, d], _G5718], [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]]])) +^ Exit: (16) not(user:member([[v, d], _G5718], [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]]])) + Call: (16) true + Exit: (16) true + Exit: (15) getvar([v, d], empty, [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]]]) + Exit: (14) getvalue([v, d], empty, [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]]]) + Call: (14) true + Exit: (14) true + Call: (14) putvalue([v, d], empty, [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]]], _G5734) + Unify: (14) putvalue([v, d], empty, [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]]], _G5734) +^ Call: (15) not(isvar([v, d])) +^ Unify: (15) not(user:isvar([v, d])) + Call: (16) isvar([v, d]) + Unify: (16) isvar([v, d]) + Exit: (16) isvar([v, d]) +^ Fail: (15) not(user:isvar([v, d])) + Redo: (14) putvalue([v, d], empty, [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]]], _G5734) + Call: (15) isvar([v, d]) + Unify: (15) isvar([v, d]) + Exit: (15) isvar([v, d]) + Call: (15) isvalstrorundef(empty) + Unify: (15) isvalstrorundef(empty) + Call: (16) var(empty) + Fail: (16) var(empty) + Redo: (15) isvalstrorundef(empty) + Unify: (15) isvalstrorundef(empty) +^ Call: (16) not(var(empty)) +^ Unify: (16) not(user:var(empty)) +^ Exit: (16) not(user:var(empty)) + Call: (16) isval(empty) + Unify: (16) isval(empty) + Call: (17) number(empty) + Fail: (17) number(empty) + Fail: (16) isval(empty) + Redo: (15) isvalstrorundef(empty) + Unify: (15) isvalstrorundef(empty) +^ Call: (16) not(var(empty)) +^ Unify: (16) not(user:var(empty)) +^ Exit: (16) not(user:var(empty)) + Call: (16) expression(empty) + Unify: (16) expression(empty) + Exit: (16) expression(empty) + Exit: (15) isvalstrorundef(empty) + Call: (15) updatevar([v, d], empty, [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]]], _G5739) + Unify: (15) updatevar([v, d], empty, [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]]], _G5739) + Call: (16) lists:member([[v, d], empty], [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]]]) + Unify: (16) lists:member([[v, 
d], empty], [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]]]) + Fail: (16) lists:member([[v, d], empty], [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]]]) + Redo: (15) updatevar([v, d], empty, [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]]], _G5739) +^ Call: (16) not(member([[v, d], _G5735], [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]]])) +^ Unify: (16) not(user:member([[v, d], _G5735], [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]]])) +^ Exit: (16) not(user:member([[v, d], _G5735], [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]]])) + Call: (16) _G5735=empty + Exit: (16) empty=empty + Call: (16) true + Exit: (16) true + Call: (16) lists:append([[[v, c], ["c", "b"]], [[v, b], ["c", "a"]]], [[[v, d], empty]], _G5759) + Unify: (16) lists:append([[[v, c], ["c", "b"]], [[v, b], ["c", "a"]]], [[[v, d], empty]], [[[v, c], ["c", "b"]]|_G5751]) + Exit: (16) lists:append([[[v, c], ["c", "b"]], [[v, b], ["c", "a"]]], [[[v, d], empty]], [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]], [[v, d], empty]]) + Call: (16) true + Exit: (16) true + Call: (16) true + Exit: (16) true + Exit: (15) updatevar([v, d], empty, [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]]], [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]], [[v, d], empty]]) + Exit: (14) putvalue([v, d], empty, [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]]], [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]], [[v, d], empty]]) + Call: (14) lists:append([], [[[v, d], [v, d]]], _G5774) + Unify: (14) lists:append([], [[[v, d], [v, d]]], [[[v, d], [v, d]]]) + Exit: (14) lists:append([], [[[v, d], [v, d]]], [[[v, d], [v, d]]]) + Call: (14) checkarguments([], [], [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]], [[v, d], empty]], _G5775, [[[v, d], [v, d]]], _G5777) + Unify: (14) checkarguments([], [], [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]], [[v, d], empty]], [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]], [[v, d], empty]], [[[v, d], [v, d]]], [[[v, d], [v, d]]]) + Exit: (14) checkarguments([], [], [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]], [[v, d], empty]], [[[v, c], ["c", 
"b"]], [[v, b], ["c", "a"]], [[v, d], empty]], [[[v, d], [v, d]]], [[[v, d], [v, d]]]) + Exit: (13) checkarguments([[v, d]], [[v, d]], [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]]], [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]], [[v, d], empty]], [], [[[v, d], [v, d]]]) + Exit: (12) checkarguments([["c", "a"], [v, d]], [[v, b], [v, d]], [[[v, c], ["c", "b"]]], [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]], [[v, d], empty]], [], [[[v, d], [v, d]]]) + Exit: (11) checkarguments([["c", "b"], ["c", "a"], [v, d]], [[v, c], [v, b], [v, d]], [], [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]], [[v, d], empty]], [], [[[v, d], [v, d]]]) + Call: (11) debug(on) + Fail: (11) debug(on) + Redo: (10) member1([[n, f], [["c", "b"], ["c", "a"], [v, d]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], _G15) + Call: (11) true + Exit: (11) true + Call: (11) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]], [[v, d], empty]], _G5775, [[[n, =], [[v, d], [v, c]]]], true) + Unify: (11) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]], [[v, d], empty]], _G5775, [[[n, =], [[v, d], [v, c]]]], true) + Call: (12) [[[n, =], [[v, d], [v, c]]]]=[_G5765|_G5766] + Exit: (12) [[[n, =], [[v, d], [v, c]]]]=[[[n, =], [[v, d], [v, c]]]] +^ Call: (12) not(predicate_or_rule_name([[n, =], [[v, d], [v, c]]])) +^ Unify: (12) not(user:predicate_or_rule_name([[n, =], [[v, d], [v, c]]])) + Call: (13) predicate_or_rule_name([[n, =], [[v, d], [v, c]]]) + Fail: (13) predicate_or_rule_name([[n, =], [[v, d], [v, c]]]) +^ Exit: (12) not(user:predicate_or_rule_name([[n, =], [[v, d], [v, c]]])) + Call: (12) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], 
":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]], [[v, d], empty]], _G5783, [[n, =], [[v, d], [v, c]]], _G5785) + Unify: (12) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]], [[v, d], empty]], _G5783, [[n, =], [[v, d], [v, c]]], _G5785) + Call: (13) [[n, =], [[v, d], [v, c]]]=[_G5773|_G5774] + Exit: (13) [[n, =], [[v, d], [v, c]]]=[[n, =], [[v, d], [v, c]]] +^ Call: (13) not(predicate_or_rule_name([n, =])) +^ Unify: (13) not(user:predicate_or_rule_name([n, =])) + Call: (14) predicate_or_rule_name([n, =]) + Unify: (14) predicate_or_rule_name([n, =]) + Exit: (14) predicate_or_rule_name([n, =]) +^ Fail: (13) not(user:predicate_or_rule_name([n, =])) + Redo: (12) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]], [[v, d], empty]], _G5783, [[n, =], [[v, d], [v, c]]], _G5785) + Unify: (12) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]], [[v, d], empty]], _G5783, [[n, =], [[v, d], [v, c]]], _G5785) + Call: (13) [[n, =], [[v, d], [v, c]]]=[[not, [_G5782]]|_G5774] + Fail: (13) [[n, =], [[v, d], [v, c]]]=[[not, [_G5782]]|_G5774] + Redo: (12) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]], [[v, d], empty]], _G5783, [[n, =], [[v, d], [v, c]]], _G5785) + Unify: (12) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], 
[[[v, c], ["c", "b"]], [[v, b], ["c", "a"]], [[v, d], empty]], _G5783, [[n, =], [[v, d], [v, c]]], _G5785) + Call: (13) [[n, =], [[v, d], [v, c]]]=[[_G5776], or, [_G5785]] + Fail: (13) [[n, =], [[v, d], [v, c]]]=[[_G5776], or, [_G5785]] + Redo: (12) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]], [[v, d], empty]], _G5783, [[n, =], [[v, d], [v, c]]], _G5785) + Unify: (12) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]], [[v, d], empty]], _G5783, [[n, =], [[v, d], [v, c]]], _G5785) + Call: (13) [[n, =], [[v, d], [v, c]]]=[_G5773|_G5774] + Exit: (13) [[n, =], [[v, d], [v, c]]]=[[n, =], [[v, d], [v, c]]] +^ Call: (13) not(predicate_or_rule_name([n, =])) +^ Unify: (13) not(user:predicate_or_rule_name([n, =])) + Call: (14) predicate_or_rule_name([n, =]) + Unify: (14) predicate_or_rule_name([n, =]) + Exit: (14) predicate_or_rule_name([n, =]) +^ Fail: (13) not(user:predicate_or_rule_name([n, =])) + Fail: (12) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]], [[v, d], empty]], _G5783, [[n, =], [[v, d], [v, c]]], _G5785) + Redo: (11) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]], [[v, d], empty]], _G5775, [[[n, =], [[v, d], [v, c]]]], true) + Unify: (11) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]], [[v, d], empty]], _G5775, [[[n, =], [[v, d], [v, c]]]], 
true) + Call: (12) [[[n, =], [[v, d], [v, c]]]]=[[not, [_G5774]]|_G5766] + Fail: (12) [[[n, =], [[v, d], [v, c]]]]=[[not, [_G5774]]|_G5766] + Redo: (11) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]], [[v, d], empty]], _G5775, [[[n, =], [[v, d], [v, c]]]], true) + Unify: (11) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]], [[v, d], empty]], _G5775, [[[n, =], [[v, d], [v, c]]]], true) + Call: (12) [[[n, =], [[v, d], [v, c]]]]=[[_G5768], or, [_G5777]] + Fail: (12) [[[n, =], [[v, d], [v, c]]]]=[[_G5768], or, [_G5777]] + Redo: (11) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]], [[v, d], empty]], _G5775, [[[n, =], [[v, d], [v, c]]]], true) + Unify: (11) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]], [[v, d], empty]], _G5775, [[[n, =], [[v, d], [v, c]]]], true) + Call: (12) [[[n, =], [[v, d], [v, c]]]]=[_G5765|_G5766] + Exit: (12) [[[n, =], [[v, d], [v, c]]]]=[[[n, =], [[v, d], [v, c]]]] +^ Call: (12) not(predicate_or_rule_name([[n, =], [[v, d], [v, c]]])) +^ Unify: (12) not(user:predicate_or_rule_name([[n, =], [[v, d], [v, c]]])) + Call: (13) predicate_or_rule_name([[n, =], [[v, d], [v, c]]]) + Fail: (13) predicate_or_rule_name([[n, =], [[v, d], [v, c]]]) +^ Exit: (12) not(user:predicate_or_rule_name([[n, =], [[v, d], [v, c]]])) + Call: (12) interpretstatement1([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], 
[...|...]]]]], [[n, =], [[v, d], [v, c]]], [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]], [[v, d], empty]], _G5784, _G5785, _G5786) + Unify: (12) interpretstatement1([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[n, =], [[v, d], [v, c]]], [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]], [[v, d], empty]], _G5784, true, nocut) + Call: (13) isop(=) + Unify: (13) isop(=) + Exit: (13) isop(=) + Call: (13) interpretpart(is, [v, d], [v, c], [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]], [[v, d], empty]], _G5784) + Unify: (13) interpretpart(is, [v, d], [v, c], [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]], [[v, d], empty]], _G5784) + Call: (14) getvalues([v, d], [v, c], _G5782, _G5783, [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]], [[v, d], empty]]) + Unify: (14) getvalues([v, d], [v, c], _G5782, _G5783, [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]], [[v, d], empty]]) + Call: (15) getvalue([v, d], _G5781, [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]], [[v, d], empty]]) + Unify: (15) getvalue([v, d], _G5781, [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]], [[v, d], empty]]) +^ Call: (16) not(isvar([v, d])) +^ Unify: (16) not(user:isvar([v, d])) + Call: (17) isvar([v, d]) + Unify: (17) isvar([v, d]) + Exit: (17) isvar([v, d]) +^ Fail: (16) not(user:isvar([v, d])) + Redo: (15) getvalue([v, d], _G5781, [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]], [[v, d], empty]]) + Call: (16) isvar([v, d]) + Unify: (16) isvar([v, d]) + Exit: (16) isvar([v, d]) + Call: (16) isvalstrorundef(_G5780) + Unify: (16) isvalstrorundef(_G5780) + Call: (17) var(_G5780) + Exit: (17) var(_G5780) + Exit: (16) isvalstrorundef(_G5780) + Call: (16) getvar([v, d], _G5781, [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]], [[v, d], empty]]) + Unify: (16) getvar([v, d], _G5781, [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]], [[v, d], empty]]) + Call: (17) lists:member([[v, d], _G5776], [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]], [[v, d], empty]]) + 
Unify: (17) lists:member([[v, d], _G5776], [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]], [[v, d], empty]]) + Exit: (17) lists:member([[v, d], empty], [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]], [[v, d], empty]]) +^ Call: (17) not(empty=empty) +^ Unify: (17) not(user: (empty=empty)) +^ Fail: (17) not(user: (empty=empty)) + Redo: (16) getvar([v, d], _G5781, [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]], [[v, d], empty]]) + Unify: (16) getvar([v, d], empty, [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]], [[v, d], empty]]) +^ Call: (17) not(member([[v, d], _G5779], [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]], [[v, d], empty]])) +^ Unify: (17) not(user:member([[v, d], _G5779], [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]], [[v, d], empty]])) +^ Fail: (17) not(user:member([[v, d], _G5779], [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]], [[v, d], empty]])) + Redo: (16) getvar([v, d], empty, [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]], [[v, d], empty]]) + Call: (17) lists:member([[v, d], empty], [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]], [[v, d], empty]]) + Unify: (17) lists:member([[v, d], empty], [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]], [[v, d], empty]]) + Exit: (17) lists:member([[v, d], empty], [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]], [[v, d], empty]]) + Exit: (16) getvar([v, d], empty, [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]], [[v, d], empty]]) + Exit: (15) getvalue([v, d], empty, [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]], [[v, d], empty]]) + Call: (15) getvalue([v, c], _G5787, [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]], [[v, d], empty]]) + Unify: (15) getvalue([v, c], _G5787, [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]], [[v, d], empty]]) +^ Call: (16) not(isvar([v, c])) +^ Unify: (16) not(user:isvar([v, c])) + Call: (17) isvar([v, c]) + Unify: (17) isvar([v, c]) + Exit: (17) isvar([v, c]) +^ Fail: (16) not(user:isvar([v, c])) + Redo: (15) getvalue([v, c], _G5787, [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]], [[v, d], empty]]) + Call: (16) isvar([v, c]) + Unify: 
(16) isvar([v, c]) + Exit: (16) isvar([v, c]) + Call: (16) isvalstrorundef(_G5786) + Unify: (16) isvalstrorundef(_G5786) + Call: (17) var(_G5786) + Exit: (17) var(_G5786) + Exit: (16) isvalstrorundef(_G5786) + Call: (16) getvar([v, c], _G5787, [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]], [[v, d], empty]]) + Unify: (16) getvar([v, c], _G5787, [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]], [[v, d], empty]]) + Call: (17) lists:member([[v, c], _G5782], [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]], [[v, d], empty]]) + Unify: (17) lists:member([[v, c], _G5782], [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]], [[v, d], empty]]) + Exit: (17) lists:member([[v, c], ["c", "b"]], [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]], [[v, d], empty]]) +^ Call: (17) not(["c", "b"]=empty) +^ Unify: (17) not(user: (["c", "b"]=empty)) +^ Exit: (17) not(user: (["c", "b"]=empty)) + Exit: (16) getvar([v, c], ["c", "b"], [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]], [[v, d], empty]]) + Exit: (15) getvalue([v, c], ["c", "b"], [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]], [[v, d], empty]]) + Exit: (14) getvalues([v, d], [v, c], empty, ["c", "b"], [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]], [[v, d], empty]]) + Call: (14) expression(empty) + Unify: (14) expression(empty) + Exit: (14) expression(empty) + Call: (14) expression(["c", "b"]) + Unify: (14) expression(["c", "b"]) + Call: (15) isval(["c", "b"]) + Unify: (15) isval(["c", "b"]) + Call: (16) number(["c", "b"]) + Fail: (16) number(["c", "b"]) + Fail: (15) isval(["c", "b"]) + Redo: (14) expression(["c", "b"]) + Unify: (14) expression(["c", "b"]) + Call: (15) string(["c", "b"]) + Fail: (15) string(["c", "b"]) + Redo: (14) expression(["c", "b"]) + Unify: (14) expression(["c", "b"]) + Call: (15) atom(["c", "b"]) + Fail: (15) atom(["c", "b"]) + Redo: (14) expression(["c", "b"]) + Unify: (14) expression(["c", "b"]) +^ Call: (15) not(atom(["c", "b"])) +^ Unify: (15) not(user:atom(["c", "b"])) +^ Exit: (15) not(user:atom(["c", "b"])) + Call: (15) length(["c", 
"b"], _G5804) + Unify: (15) length(["c", "b"], _G5804) + Exit: (15) length(["c", "b"], 2) + Call: (15) 2>=1 + Exit: (15) 2>=1 + Call: (15) expression2(["c", "b"]) + Unify: (15) expression2(["c", "b"]) + Call: (16) expression3("c") + Unify: (16) expression3("c") + Call: (17) isval("c") + Unify: (17) isval("c") + Call: (18) number("c") + Fail: (18) number("c") + Fail: (17) isval("c") + Redo: (16) expression3("c") + Unify: (16) expression3("c") + Call: (17) string("c") + Exit: (17) string("c") + Exit: (16) expression3("c") + Call: (16) expression2(["b"]) + Unify: (16) expression2(["b"]) + Call: (17) expression3("b") + Unify: (17) expression3("b") + Call: (18) isval("b") + Unify: (18) isval("b") + Call: (19) number("b") + Fail: (19) number("b") + Fail: (18) isval("b") + Redo: (17) expression3("b") + Unify: (17) expression3("b") + Call: (18) string("b") + Exit: (18) string("b") + Exit: (17) expression3("b") + Call: (17) expression2([]) + Unify: (17) expression2([]) + Exit: (17) expression2([]) + Exit: (16) expression2(["b"]) + Exit: (15) expression2(["c", "b"]) + Exit: (14) expression(["c", "b"]) + Call: (14) val1emptyorvalsequal(empty, ["c", "b"]) + Unify: (14) val1emptyorvalsequal(empty, ["c", "b"]) + Exit: (14) val1emptyorvalsequal(empty, ["c", "b"]) + Call: (14) putvalue([v, d], ["c", "b"], [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]], [[v, d], empty]], _G5806) + Unify: (14) putvalue([v, d], ["c", "b"], [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]], [[v, d], empty]], _G5806) +^ Call: (15) not(isvar([v, d])) +^ Unify: (15) not(user:isvar([v, d])) + Call: (16) isvar([v, d]) + Unify: (16) isvar([v, d]) + Exit: (16) isvar([v, d]) +^ Fail: (15) not(user:isvar([v, d])) + Redo: (14) putvalue([v, d], ["c", "b"], [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]], [[v, d], empty]], _G5806) + Call: (15) isvar([v, d]) + Unify: (15) isvar([v, d]) + Exit: (15) isvar([v, d]) + Call: (15) isvalstrorundef(["c", "b"]) + Unify: (15) isvalstrorundef(["c", "b"]) + Call: (16) var(["c", "b"]) + 
Fail: (16) var(["c", "b"]) + Redo: (15) isvalstrorundef(["c", "b"]) + Unify: (15) isvalstrorundef(["c", "b"]) +^ Call: (16) not(var(["c", "b"])) +^ Unify: (16) not(user:var(["c", "b"])) +^ Exit: (16) not(user:var(["c", "b"])) + Call: (16) isval(["c", "b"]) + Unify: (16) isval(["c", "b"]) + Call: (17) number(["c", "b"]) + Fail: (17) number(["c", "b"]) + Fail: (16) isval(["c", "b"]) + Redo: (15) isvalstrorundef(["c", "b"]) + Unify: (15) isvalstrorundef(["c", "b"]) +^ Call: (16) not(var(["c", "b"])) +^ Unify: (16) not(user:var(["c", "b"])) +^ Exit: (16) not(user:var(["c", "b"])) + Call: (16) expression(["c", "b"]) + Unify: (16) expression(["c", "b"]) + Call: (17) isval(["c", "b"]) + Unify: (17) isval(["c", "b"]) + Call: (18) number(["c", "b"]) + Fail: (18) number(["c", "b"]) + Fail: (17) isval(["c", "b"]) + Redo: (16) expression(["c", "b"]) + Unify: (16) expression(["c", "b"]) + Call: (17) string(["c", "b"]) + Fail: (17) string(["c", "b"]) + Redo: (16) expression(["c", "b"]) + Unify: (16) expression(["c", "b"]) + Call: (17) atom(["c", "b"]) + Fail: (17) atom(["c", "b"]) + Redo: (16) expression(["c", "b"]) + Unify: (16) expression(["c", "b"]) +^ Call: (17) not(atom(["c", "b"])) +^ Unify: (17) not(user:atom(["c", "b"])) +^ Exit: (17) not(user:atom(["c", "b"])) + Call: (17) length(["c", "b"], _G5814) + Unify: (17) length(["c", "b"], _G5814) + Exit: (17) length(["c", "b"], 2) + Call: (17) 2>=1 + Exit: (17) 2>=1 + Call: (17) expression2(["c", "b"]) + Unify: (17) expression2(["c", "b"]) + Call: (18) expression3("c") + Unify: (18) expression3("c") + Call: (19) isval("c") + Unify: (19) isval("c") + Call: (20) number("c") + Fail: (20) number("c") + Fail: (19) isval("c") + Redo: (18) expression3("c") + Unify: (18) expression3("c") + Call: (19) string("c") + Exit: (19) string("c") + Exit: (18) expression3("c") + Call: (18) expression2(["b"]) + Unify: (18) expression2(["b"]) + Call: (19) expression3("b") + Unify: (19) expression3("b") + Call: (20) isval("b") + Unify: (20) 
isval("b") + Call: (21) number("b") + Fail: (21) number("b") + Fail: (20) isval("b") + Redo: (19) expression3("b") + Unify: (19) expression3("b") + Call: (20) string("b") + Exit: (20) string("b") + Exit: (19) expression3("b") + Call: (19) expression2([]) + Unify: (19) expression2([]) + Exit: (19) expression2([]) + Exit: (18) expression2(["b"]) + Exit: (17) expression2(["c", "b"]) + Exit: (16) expression(["c", "b"]) + Exit: (15) isvalstrorundef(["c", "b"]) + Call: (15) updatevar([v, d], ["c", "b"], [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]], [[v, d], empty]], _G5816) + Unify: (15) updatevar([v, d], ["c", "b"], [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]], [[v, d], empty]], _G5816) + Call: (16) lists:member([[v, d], empty], [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]], [[v, d], empty]]) + Unify: (16) lists:member([[v, d], empty], [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]], [[v, d], empty]]) + Exit: (16) lists:member([[v, d], empty], [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]], [[v, d], empty]]) + Call: (16) lists:delete([[[v, c], ["c", "b"]], [[v, b], ["c", "a"]], [[v, d], empty]], [[v, d], empty], _G5827) + Unify: (16) lists:delete([[[v, c], ["c", "b"]], [[v, b], ["c", "a"]], [[v, d], empty]], [[v, d], empty], _G5827) + Exit: (16) lists:delete([[[v, c], ["c", "b"]], [[v, b], ["c", "a"]], [[v, d], empty]], [[v, d], empty], [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]]]) + Call: (16) lists:append([[[v, c], ["c", "b"]], [[v, b], ["c", "a"]]], [[[v, d], ["c", "b"]]], _G5842) + Unify: (16) lists:append([[[v, c], ["c", "b"]], [[v, b], ["c", "a"]]], [[[v, d], ["c", "b"]]], [[[v, c], ["c", "b"]]|_G5834]) + Exit: (16) lists:append([[[v, c], ["c", "b"]], [[v, b], ["c", "a"]]], [[[v, d], ["c", "b"]]], [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]], [[v, d], ["c", "b"]]]) + Call: (16) true + Exit: (16) true + Call: (16) true + Exit: (16) true + Call: (16) true + Exit: (16) true + Exit: (15) updatevar([v, d], ["c", "b"], [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]], [[v, d], empty]], 
[[[v, c], ["c", "b"]], [[v, b], ["c", "a"]], [[v, d], ["c", "b"]]]) + Exit: (14) putvalue([v, d], ["c", "b"], [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]], [[v, d], empty]], [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]], [[v, d], ["c", "b"]]]) + Call: (14) debug(on) + Fail: (14) debug(on) + Redo: (13) interpretpart(is, [v, d], [v, c], [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]], [[v, d], empty]], [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]], [[v, d], ["c", "b"]]]) + Call: (14) true + Exit: (14) true + Call: (14) debug(on) + Fail: (14) debug(on) + Redo: (13) interpretpart(is, [v, d], [v, c], [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]], [[v, d], empty]], [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]], [[v, d], ["c", "b"]]]) + Call: (14) true + Exit: (14) true + Exit: (13) interpretpart(is, [v, d], [v, c], [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]], [[v, d], empty]], [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]], [[v, d], ["c", "b"]]]) + Exit: (12) interpretstatement1([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[n, =], [[v, d], [v, c]]], [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]], [[v, d], empty]], [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]], [[v, d], ["c", "b"]]], true, nocut) +^ Call: (12) not(nocut=cut) +^ Unify: (12) not(user: (nocut=cut)) +^ Exit: (12) not(user: (nocut=cut)) + Call: (12) _G5852=[[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]] + Exit: (12) [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]]=[[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]] + Call: (12) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]], [[v, d], ["c", "b"]]], _G5855, [], _G5857) + Unify: (12) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, 
d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]], [[v, d], ["c", "b"]]], [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]], [[v, d], ["c", "b"]]], [], true) + Exit: (12) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]], [[v, d], ["c", "b"]]], [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]], [[v, d], ["c", "b"]]], [], true) + Call: (12) logicalconjunction(true, true, true) + Unify: (12) logicalconjunction(true, true, true) + Call: (13) true(true) + Unify: (13) true(true) + Exit: (13) true(true) + Call: (13) true(true) + Unify: (13) true(true) + Exit: (13) true(true) + Exit: (12) logicalconjunction(true, true, true) + Exit: (11) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]], [[v, d], empty]], [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]], [[v, d], ["c", "b"]]], [[[n, =], [[v, d], [v, c]]]], true) + Call: (11) updatevars([[[v, d], [v, d]]], [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]], [[v, d], ["c", "b"]]], [], _G5855) + Unify: (11) updatevars([[[v, d], [v, d]]], [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]], [[v, d], ["c", "b"]]], [], _G5855) + Call: (12) [[[v, d], [v, d]]]=[[_G5848, _G5851]|_G5846] + Exit: (12) [[[v, d], [v, d]]]=[[[v, d], [v, d]]] + Call: (12) expressionnotatom([v, d]) + Unify: (12) expressionnotatom([v, d]) + Call: (13) isvalstrempty([v, d]) + Unify: (13) isvalstrempty([v, d]) + Call: (14) var([v, d]) + Fail: (14) var([v, d]) + Redo: (13) isvalstrempty([v, d]) + Unify: (13) isvalstrempty([v, d]) + Call: (14) isval([v, d]) + Unify: (14) isval([v, d]) + Call: (15) number([v, d]) + Fail: (15) number([v, d]) + Fail: (14) isval([v, d]) + Redo: (13) isvalstrempty([v, d]) + Unify: (13) isvalstrempty([v, d]) + Call: (14) string([v, d]) + Fail: (14) 
string([v, d]) + Fail: (13) isvalstrempty([v, d]) + Redo: (12) expressionnotatom([v, d]) + Unify: (12) expressionnotatom([v, d]) +^ Call: (13) not(atom([v, d])) +^ Unify: (13) not(user:atom([v, d])) +^ Exit: (13) not(user:atom([v, d])) + Call: (13) length([v, d], _G5867) + Unify: (13) length([v, d], _G5867) + Exit: (13) length([v, d], 2) + Call: (13) 2>=1 + Exit: (13) 2>=1 + Call: (13) expressionnotatom2([v, d]) + Unify: (13) expressionnotatom2([v, d]) + Call: (14) isvalstrempty(v) + Unify: (14) isvalstrempty(v) + Call: (15) var(v) + Fail: (15) var(v) + Redo: (14) isvalstrempty(v) + Unify: (14) isvalstrempty(v) + Call: (15) isval(v) + Unify: (15) isval(v) + Call: (16) number(v) + Fail: (16) number(v) + Fail: (15) isval(v) + Redo: (14) isvalstrempty(v) + Unify: (14) isvalstrempty(v) + Call: (15) string(v) + Fail: (15) string(v) + Fail: (14) isvalstrempty(v) + Fail: (13) expressionnotatom2([v, d]) + Redo: (12) expressionnotatom([v, d]) + Unify: (12) expressionnotatom([v, d]) + Call: (13) predicate_or_rule_name([v, d]) + Fail: (13) predicate_or_rule_name([v, d]) + Fail: (12) expressionnotatom([v, d]) + Redo: (11) updatevars([[[v, d], [v, d]]], [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]], [[v, d], ["c", "b"]]], [], _G5864) + Call: (12) lists:member([[v, d], _G5857], [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]], [[v, d], ["c", "b"]]]) + Unify: (12) lists:member([[v, d], _G5857], [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]], [[v, d], ["c", "b"]]]) + Exit: (12) lists:member([[v, d], ["c", "b"]], [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]], [[v, d], ["c", "b"]]]) + Call: (12) lists:append([], [[[v, d], ["c", "b"]]], _G5878) + Unify: (12) lists:append([], [[[v, d], ["c", "b"]]], [[[v, d], ["c", "b"]]]) + Exit: (12) lists:append([], [[[v, d], ["c", "b"]]], [[[v, d], ["c", "b"]]]) + Call: (12) updatevars([], [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]], [[v, d], ["c", "b"]]], [[[v, d], ["c", "b"]]], _G5879) + Unify: (12) updatevars([], [[[v, c], ["c", "b"]], [[v, b], ["c", 
"a"]], [[v, d], ["c", "b"]]], [[[v, d], ["c", "b"]]], [[[v, d], ["c", "b"]]]) + Exit: (12) updatevars([], [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]], [[v, d], ["c", "b"]]], [[[v, d], ["c", "b"]]], [[[v, d], ["c", "b"]]]) + Exit: (11) updatevars([[[v, d], [v, d]]], [[[v, c], ["c", "b"]], [[v, b], ["c", "a"]], [[v, d], ["c", "b"]]], [], [[[v, d], ["c", "b"]]]) +^ Call: (11) not([[[v, d], ["c", "b"]]]=[]) +^ Unify: (11) not(user: ([[[v, d], ["c", "b"]]]=[])) +^ Exit: (11) not(user: ([[[v, d], ["c", "b"]]]=[])) + Call: (11) unique1([[[v, d], ["c", "b"]]], [], _G15) + Unify: (11) unique1([[[v, d], ["c", "b"]]], [], _G15) + Call: (12) lists:delete([], [[v, d], ["c", "b"]], _G5884) + Unify: (12) lists:delete([], [[v, d], ["c", "b"]], []) + Exit: (12) lists:delete([], [[v, d], ["c", "b"]], []) + Call: (12) lists:append([], [[[v, d], ["c", "b"]]], _G5887) + Unify: (12) lists:append([], [[[v, d], ["c", "b"]]], [[[v, d], ["c", "b"]]]) + Exit: (12) lists:append([], [[[v, d], ["c", "b"]]], [[[v, d], ["c", "b"]]]) + Call: (12) unique1([], [[[v, d], ["c", "b"]]], _G15) + Unify: (12) unique1([], [[[v, d], ["c", "b"]]], [[[v, d], ["c", "b"]]]) + Exit: (12) unique1([], [[[v, d], ["c", "b"]]], [[[v, d], ["c", "b"]]]) + Exit: (11) unique1([[[v, d], ["c", "b"]]], [], [[[v, d], ["c", "b"]]]) + Call: (11) findresult3([["c", "b"], ["c", "a"], [v, d]], [[[v, d], ["c", "b"]]], [], _G5888) + Unify: (11) findresult3([["c", "b"], ["c", "a"], [v, d]], [[[v, d], ["c", "b"]]], [], _G5888) + Call: (12) [["c", "b"], ["c", "a"], [v, d]]=[_G5878|_G5879] + Exit: (12) [["c", "b"], ["c", "a"], [v, d]]=[["c", "b"], ["c", "a"], [v, d]] + Call: (12) expressionnotatom(["c", "b"]) + Unify: (12) expressionnotatom(["c", "b"]) + Call: (13) isvalstrempty(["c", "b"]) + Unify: (13) isvalstrempty(["c", "b"]) + Call: (14) var(["c", "b"]) + Fail: (14) var(["c", "b"]) + Redo: (13) isvalstrempty(["c", "b"]) + Unify: (13) isvalstrempty(["c", "b"]) + Call: (14) isval(["c", "b"]) + Unify: (14) isval(["c", "b"]) + Call: 
(15) number(["c", "b"]) + Fail: (15) number(["c", "b"]) + Fail: (14) isval(["c", "b"]) + Redo: (13) isvalstrempty(["c", "b"]) + Unify: (13) isvalstrempty(["c", "b"]) + Call: (14) string(["c", "b"]) + Fail: (14) string(["c", "b"]) + Fail: (13) isvalstrempty(["c", "b"]) + Redo: (12) expressionnotatom(["c", "b"]) + Unify: (12) expressionnotatom(["c", "b"]) +^ Call: (13) not(atom(["c", "b"])) +^ Unify: (13) not(user:atom(["c", "b"])) +^ Exit: (13) not(user:atom(["c", "b"])) + Call: (13) length(["c", "b"], _G5894) + Unify: (13) length(["c", "b"], _G5894) + Exit: (13) length(["c", "b"], 2) + Call: (13) 2>=1 + Exit: (13) 2>=1 + Call: (13) expressionnotatom2(["c", "b"]) + Unify: (13) expressionnotatom2(["c", "b"]) + Call: (14) isvalstrempty("c") + Unify: (14) isvalstrempty("c") + Call: (15) var("c") + Fail: (15) var("c") + Redo: (14) isvalstrempty("c") + Unify: (14) isvalstrempty("c") + Call: (15) isval("c") + Unify: (15) isval("c") + Call: (16) number("c") + Fail: (16) number("c") + Fail: (15) isval("c") + Redo: (14) isvalstrempty("c") + Unify: (14) isvalstrempty("c") + Call: (15) string("c") + Exit: (15) string("c") + Exit: (14) isvalstrempty("c") + Call: (14) expressionnotatom2(["b"]) + Unify: (14) expressionnotatom2(["b"]) + Call: (15) isvalstrempty("b") + Unify: (15) isvalstrempty("b") + Call: (16) var("b") + Fail: (16) var("b") + Redo: (15) isvalstrempty("b") + Unify: (15) isvalstrempty("b") + Call: (16) isval("b") + Unify: (16) isval("b") + Call: (17) number("b") + Fail: (17) number("b") + Fail: (16) isval("b") + Redo: (15) isvalstrempty("b") + Unify: (15) isvalstrempty("b") + Call: (16) string("b") + Exit: (16) string("b") + Exit: (15) isvalstrempty("b") + Call: (15) expressionnotatom2([]) + Unify: (15) expressionnotatom2([]) + Exit: (15) expressionnotatom2([]) + Exit: (14) expressionnotatom2(["b"]) + Exit: (13) expressionnotatom2(["c", "b"]) + Exit: (12) expressionnotatom(["c", "b"]) + Call: (12) lists:append([], [["c", "b"]], _G5898) + Unify: (12) 
lists:append([], [["c", "b"]], [["c", "b"]]) + Exit: (12) lists:append([], [["c", "b"]], [["c", "b"]]) + Call: (12) findresult3([["c", "a"], [v, d]], [[[v, d], ["c", "b"]]], [["c", "b"]], _G5899) + Unify: (12) findresult3([["c", "a"], [v, d]], [[[v, d], ["c", "b"]]], [["c", "b"]], _G5899) + Call: (13) [["c", "a"], [v, d]]=[_G5889|_G5890] + Exit: (13) [["c", "a"], [v, d]]=[["c", "a"], [v, d]] + Call: (13) expressionnotatom(["c", "a"]) + Unify: (13) expressionnotatom(["c", "a"]) + Call: (14) isvalstrempty(["c", "a"]) + Unify: (14) isvalstrempty(["c", "a"]) + Call: (15) var(["c", "a"]) + Fail: (15) var(["c", "a"]) + Redo: (14) isvalstrempty(["c", "a"]) + Unify: (14) isvalstrempty(["c", "a"]) + Call: (15) isval(["c", "a"]) + Unify: (15) isval(["c", "a"]) + Call: (16) number(["c", "a"]) + Fail: (16) number(["c", "a"]) + Fail: (15) isval(["c", "a"]) + Redo: (14) isvalstrempty(["c", "a"]) + Unify: (14) isvalstrempty(["c", "a"]) + Call: (15) string(["c", "a"]) + Fail: (15) string(["c", "a"]) + Fail: (14) isvalstrempty(["c", "a"]) + Redo: (13) expressionnotatom(["c", "a"]) + Unify: (13) expressionnotatom(["c", "a"]) +^ Call: (14) not(atom(["c", "a"])) +^ Unify: (14) not(user:atom(["c", "a"])) +^ Exit: (14) not(user:atom(["c", "a"])) + Call: (14) length(["c", "a"], _G5905) + Unify: (14) length(["c", "a"], _G5905) + Exit: (14) length(["c", "a"], 2) + Call: (14) 2>=1 + Exit: (14) 2>=1 + Call: (14) expressionnotatom2(["c", "a"]) + Unify: (14) expressionnotatom2(["c", "a"]) + Call: (15) isvalstrempty("c") + Unify: (15) isvalstrempty("c") + Call: (16) var("c") + Fail: (16) var("c") + Redo: (15) isvalstrempty("c") + Unify: (15) isvalstrempty("c") + Call: (16) isval("c") + Unify: (16) isval("c") + Call: (17) number("c") + Fail: (17) number("c") + Fail: (16) isval("c") + Redo: (15) isvalstrempty("c") + Unify: (15) isvalstrempty("c") + Call: (16) string("c") + Exit: (16) string("c") + Exit: (15) isvalstrempty("c") + Call: (15) expressionnotatom2(["a"]) + Unify: (15) 
expressionnotatom2(["a"]) + Call: (16) isvalstrempty("a") + Unify: (16) isvalstrempty("a") + Call: (17) var("a") + Fail: (17) var("a") + Redo: (16) isvalstrempty("a") + Unify: (16) isvalstrempty("a") + Call: (17) isval("a") + Unify: (17) isval("a") + Call: (18) number("a") + Fail: (18) number("a") + Fail: (17) isval("a") + Redo: (16) isvalstrempty("a") + Unify: (16) isvalstrempty("a") + Call: (17) string("a") + Exit: (17) string("a") + Exit: (16) isvalstrempty("a") + Call: (16) expressionnotatom2([]) + Unify: (16) expressionnotatom2([]) + Exit: (16) expressionnotatom2([]) + Exit: (15) expressionnotatom2(["a"]) + Exit: (14) expressionnotatom2(["c", "a"]) + Exit: (13) expressionnotatom(["c", "a"]) + Call: (13) lists:append([["c", "b"]], [["c", "a"]], _G5909) + Unify: (13) lists:append([["c", "b"]], [["c", "a"]], [["c", "b"]|_G5901]) + Exit: (13) lists:append([["c", "b"]], [["c", "a"]], [["c", "b"], ["c", "a"]]) + Call: (13) findresult3([[v, d]], [[[v, d], ["c", "b"]]], [["c", "b"], ["c", "a"]], _G5913) + Unify: (13) findresult3([[v, d]], [[[v, d], ["c", "b"]]], [["c", "b"], ["c", "a"]], _G5913) + Call: (14) [[v, d]]=[_G5903|_G5904] + Exit: (14) [[v, d]]=[[v, d]] + Call: (14) expressionnotatom([v, d]) + Unify: (14) expressionnotatom([v, d]) + Call: (15) isvalstrempty([v, d]) + Unify: (15) isvalstrempty([v, d]) + Call: (16) var([v, d]) + Fail: (16) var([v, d]) + Redo: (15) isvalstrempty([v, d]) + Unify: (15) isvalstrempty([v, d]) + Call: (16) isval([v, d]) + Unify: (16) isval([v, d]) + Call: (17) number([v, d]) + Fail: (17) number([v, d]) + Fail: (16) isval([v, d]) + Redo: (15) isvalstrempty([v, d]) + Unify: (15) isvalstrempty([v, d]) + Call: (16) string([v, d]) + Fail: (16) string([v, d]) + Fail: (15) isvalstrempty([v, d]) + Redo: (14) expressionnotatom([v, d]) + Unify: (14) expressionnotatom([v, d]) +^ Call: (15) not(atom([v, d])) +^ Unify: (15) not(user:atom([v, d])) +^ Exit: (15) not(user:atom([v, d])) + Call: (15) length([v, d], _G5919) + Unify: (15) length([v, 
d], _G5919) + Exit: (15) length([v, d], 2) + Call: (15) 2>=1 + Exit: (15) 2>=1 + Call: (15) expressionnotatom2([v, d]) + Unify: (15) expressionnotatom2([v, d]) + Call: (16) isvalstrempty(v) + Unify: (16) isvalstrempty(v) + Call: (17) var(v) + Fail: (17) var(v) + Redo: (16) isvalstrempty(v) + Unify: (16) isvalstrempty(v) + Call: (17) isval(v) + Unify: (17) isval(v) + Call: (18) number(v) + Fail: (18) number(v) + Fail: (17) isval(v) + Redo: (16) isvalstrempty(v) + Unify: (16) isvalstrempty(v) + Call: (17) string(v) + Fail: (17) string(v) + Fail: (16) isvalstrempty(v) + Fail: (15) expressionnotatom2([v, d]) + Redo: (14) expressionnotatom([v, d]) + Unify: (14) expressionnotatom([v, d]) + Call: (15) predicate_or_rule_name([v, d]) + Fail: (15) predicate_or_rule_name([v, d]) + Fail: (14) expressionnotatom([v, d]) + Redo: (13) findresult3([[v, d]], [[[v, d], ["c", "b"]]], [["c", "b"], ["c", "a"]], _G5913) + Unify: (13) findresult3([[v, d]], [[[v, d], ["c", "b"]]], [["c", "b"], ["c", "a"]], _G5913) + Call: (14) [[v, d]]=[_G5903|_G5904] + Exit: (14) [[v, d]]=[[v, d]] + Call: (14) isvar([v, d]) + Unify: (14) isvar([v, d]) + Exit: (14) isvar([v, d]) + Call: (14) lists:member([[v, d], _G5909], [[[v, d], ["c", "b"]]]) + Unify: (14) lists:member([[v, d], _G5909], [[[v, d], ["c", "b"]]]) + Exit: (14) lists:member([[v, d], ["c", "b"]], [[[v, d], ["c", "b"]]]) + Call: (14) lists:append([["c", "b"], ["c", "a"]], [["c", "b"]], _G5924) + Unify: (14) lists:append([["c", "b"], ["c", "a"]], [["c", "b"]], [["c", "b"]|_G5916]) + Exit: (14) lists:append([["c", "b"], ["c", "a"]], [["c", "b"]], [["c", "b"], ["c", "a"], ["c", "b"]]) + Call: (14) findresult3([], [[[v, d], ["c", "b"]]], [["c", "b"], ["c", "a"], ["c", "b"]], _G5931) + Unify: (14) findresult3([], [[[v, d], ["c", "b"]]], [["c", "b"], ["c", "a"], ["c", "b"]], [["c", "b"], ["c", "a"], ["c", "b"]]) + Exit: (14) findresult3([], [[[v, d], ["c", "b"]]], [["c", "b"], ["c", "a"], ["c", "b"]], [["c", "b"], ["c", "a"], ["c", "b"]]) + Exit: 
(13) findresult3([[v, d]], [[[v, d], ["c", "b"]]], [["c", "b"], ["c", "a"]], [["c", "b"], ["c", "a"], ["c", "b"]]) + Exit: (12) findresult3([["c", "a"], [v, d]], [[[v, d], ["c", "b"]]], [["c", "b"]], [["c", "b"], ["c", "a"], ["c", "b"]]) + Exit: (11) findresult3([["c", "b"], ["c", "a"], [v, d]], [[[v, d], ["c", "b"]]], [], [["c", "b"], ["c", "a"], ["c", "b"]]) + Call: (11) debug(on) + Fail: (11) debug(on) + Redo: (10) member1([[n, f], [["c", "b"], ["c", "a"], [v, d]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, d], ["c", "b"]]]) + Call: (11) true + Exit: (11) true + Call: (11) true + Exit: (11) true + Exit: (10) member1([[n, f], [["c", "b"], ["c", "a"], [v, d]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, d], ["c", "b"]]]) + Exit: (9) interpret1(off, [[n, f], [["c", "b"], ["c", "a"], [v, d]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, d], ["c", "b"]]]) + Exit: (8) interpret(off, [[n, f], [["c", "b"], ["c", "a"], [v, d]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, d], ["c", "b"]]]) + + Call: (8) interpret(off, [[n, f], [["b", "a"], ["c", "a"], [v, d]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], _G18) + Unify: (8) interpret(off, [[n, f], [["b", "a"], ["c", "a"], [v, d]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], _G18) + Call: (9) convert_to_grammar_part1([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [], _G5555) + Unify: (9) convert_to_grammar_part1([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [], _G5555) + Call: (10) convert_to_grammar_part11([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [], _G5555, [], _G5557) + Unify: (10) 
convert_to_grammar_part11([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [], _G5555, [], _G5557) + Call: (11) [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]]=[_G5546|_G5547] + Exit: (11) [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]]=[[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]] + Call: (11) [[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n, =], [[...|...]|...]]]]=[[n, _G5555], _G5558, "->", _G5567] + Fail: (11) [[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n, =], [[...|...]|...]]]]=[[n, _G5555], _G5558, "->", _G5567] + Redo: (10) convert_to_grammar_part11([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [], _G5558, [], _G5560) + Call: (11) [[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n, =], [[...|...]|...]]]]=[[n, _G5555], "->", _G5564] + Fail: (11) [[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n, =], [[...|...]|...]]]]=[[n, _G5555], "->", _G5564] + Redo: (10) convert_to_grammar_part11([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [], _G5555, [], _G5557) + Unify: (10) convert_to_grammar_part11([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [], _G5555, [], _G5557) + Call: (11) [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]]=[_G5546|_G5547] + Exit: (11) [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]]=[[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]] + Call: (11) [[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n, =], [[...|...]|...]]]]=[_G5549, _G5552, ":-", _G5561] + Exit: (11) [[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n, =], [[...|...]|...]]]]=[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n, =], [[...|...]|...]]]] + Call: (11) true + Exit: (11) true + Call: (11) true + Exit: (11) true + Call: (11) true + Exit: (11) true + Call: (11) true + Exit: (11) true + Call: (11) lists:append([], [[[n, f], [[v, c], [v, b], [v|...]], ":-", [[[...|...]|...]]]], _G5576) + Unify: (11) 
lists:append([], [[[n, f], [[v, c], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, f], [[v, c], [v, b], [v|...]], ":-", [[[...|...]|...]]]]) + Exit: (11) lists:append([], [[[n, f], [[v, c], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [[[n, f], [[v, c], [v, b], [v|...]], ":-", [[[...|...]|...]]]]) + Call: (11) convert_to_grammar_part11([], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], _G5576, [], _G5578) + Unify: (11) convert_to_grammar_part11([], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [], []) + Exit: (11) convert_to_grammar_part11([], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [], []) + Exit: (10) convert_to_grammar_part11([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [], []) + Call: (10) lists:append([[[n, f], [[v, c], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [], _G5576) + Unify: (10) lists:append([[[n, f], [[v, c], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [], [[[n, f], [[v, c], [v, b], [v|...]], ":-", [[[...|...]|...]]]|_G5568]) + Exit: (10) lists:append([[[n, f], [[v, c], [v, b], [v|...]], ":-", [[[...|...]|...]]]], [], [[[n, f], [[v, c], [v, b], [v|...]], ":-", [[[...|...]|...]]]]) + Exit: (9) convert_to_grammar_part1([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]]) + Call: (9) interpret1(off, [[n, f], [["b", "a"], ["c", "a"], [v, d]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], _G18) + Unify: (9) interpret1(off, [[n, f], [["b", "a"], ["c", "a"], [v, d]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", 
[[[n|...], [...|...]]]]], _G18) +^ Call: (10) retractall(debug(_G5570)) +^ Exit: (10) retractall(debug(_G5570)) +^ Call: (10) assertz(debug(off)) +^ Exit: (10) assertz(debug(off)) +^ Call: (10) retractall(cut(_G5574)) +^ Exit: (10) retractall(cut(_G5574)) +^ Call: (10) assertz(cut(off)) +^ Exit: (10) assertz(cut(off)) + Call: (10) member1([[n, f], [["b", "a"], ["c", "a"], [v, d]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], _G18) + Unify: (10) member1([[n, f], [["b", "a"], ["c", "a"], [v, d]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], _G18) + Call: (11) cut(off) + Unify: (11) cut(off) + Exit: (11) cut(off) + Call: (11) [[n, f], [["b", "a"], ["c", "a"], [v, d]]]=[_G5578, _G5581] + Exit: (11) [[n, f], [["b", "a"], ["c", "a"], [v, d]]]=[[n, f], [["b", "a"], ["c", "a"], [v, d]]] + Call: (11) [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]]=[[[n, f], _G5590, ":-", _G5599]|_G5585] + Exit: (11) [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]]=[[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]] + Call: (11) length([["b", "a"], ["c", "a"], [v, d]], _G5610) + Unify: (11) length([["b", "a"], ["c", "a"], [v, d]], _G5610) + Exit: (11) length([["b", "a"], ["c", "a"], [v, d]], 3) + Call: (11) length([[v, c], [v, b], [v, d]], 3) + Unify: (11) length([[v, c], [v, b], [v, d]], 3) + Unify: (11) length([[v, c], [v, b], [v, d]], 3) + Exit: (11) length([[v, c], [v, b], [v, d]], 3) + Call: (11) [n, f]=[n, grammar] + Fail: (11) [n, f]=[n, grammar] + Redo: (10) member1([[n, f], [["b", "a"], ["c", "a"], [v, d]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], _G18) + Call: (11) [n, f]=[n, grammar_part] + Fail: (11) [n, f]=[n, grammar_part] + Redo: (10) member1([[n, f], 
[["b", "a"], ["c", "a"], [v, d]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], _G18) + Call: (11) checkarguments([["b", "a"], ["c", "a"], [v, d]], [[v, c], [v, b], [v, d]], [], _G5612, [], _G5614) + Unify: (11) checkarguments([["b", "a"], ["c", "a"], [v, d]], [[v, c], [v, b], [v, d]], [], _G5612, [], _G5614) + Call: (12) [["b", "a"], ["c", "a"], [v, d]]=[_G5602|_G5603] + Exit: (12) [["b", "a"], ["c", "a"], [v, d]]=[["b", "a"], ["c", "a"], [v, d]] + Call: (12) expressionnotatom(["b", "a"]) + Unify: (12) expressionnotatom(["b", "a"]) + Call: (13) isvalstrempty(["b", "a"]) + Unify: (13) isvalstrempty(["b", "a"]) + Call: (14) var(["b", "a"]) + Fail: (14) var(["b", "a"]) + Redo: (13) isvalstrempty(["b", "a"]) + Unify: (13) isvalstrempty(["b", "a"]) + Call: (14) isval(["b", "a"]) + Unify: (14) isval(["b", "a"]) + Call: (15) number(["b", "a"]) + Fail: (15) number(["b", "a"]) + Fail: (14) isval(["b", "a"]) + Redo: (13) isvalstrempty(["b", "a"]) + Unify: (13) isvalstrempty(["b", "a"]) + Call: (14) string(["b", "a"]) + Fail: (14) string(["b", "a"]) + Fail: (13) isvalstrempty(["b", "a"]) + Redo: (12) expressionnotatom(["b", "a"]) + Unify: (12) expressionnotatom(["b", "a"]) +^ Call: (13) not(atom(["b", "a"])) +^ Unify: (13) not(user:atom(["b", "a"])) +^ Exit: (13) not(user:atom(["b", "a"])) + Call: (13) length(["b", "a"], _G5618) + Unify: (13) length(["b", "a"], _G5618) + Exit: (13) length(["b", "a"], 2) + Call: (13) 2>=1 + Exit: (13) 2>=1 + Call: (13) expressionnotatom2(["b", "a"]) + Unify: (13) expressionnotatom2(["b", "a"]) + Call: (14) isvalstrempty("b") + Unify: (14) isvalstrempty("b") + Call: (15) var("b") + Fail: (15) var("b") + Redo: (14) isvalstrempty("b") + Unify: (14) isvalstrempty("b") + Call: (15) isval("b") + Unify: (15) isval("b") + Call: (16) number("b") + Fail: (16) number("b") + Fail: (15) isval("b") + Redo: (14) isvalstrempty("b") + Unify: (14) isvalstrempty("b") + 
Call: (15) string("b") + Exit: (15) string("b") + Exit: (14) isvalstrempty("b") + Call: (14) expressionnotatom2(["a"]) + Unify: (14) expressionnotatom2(["a"]) + Call: (15) isvalstrempty("a") + Unify: (15) isvalstrempty("a") + Call: (16) var("a") + Fail: (16) var("a") + Redo: (15) isvalstrempty("a") + Unify: (15) isvalstrempty("a") + Call: (16) isval("a") + Unify: (16) isval("a") + Call: (17) number("a") + Fail: (17) number("a") + Fail: (16) isval("a") + Redo: (15) isvalstrempty("a") + Unify: (15) isvalstrempty("a") + Call: (16) string("a") + Exit: (16) string("a") + Exit: (15) isvalstrempty("a") + Call: (15) expressionnotatom2([]) + Unify: (15) expressionnotatom2([]) + Exit: (15) expressionnotatom2([]) + Exit: (14) expressionnotatom2(["a"]) + Exit: (13) expressionnotatom2(["b", "a"]) + Exit: (12) expressionnotatom(["b", "a"]) + Call: (12) [[v, c], [v, b], [v, d]]=[_G5610|_G5611] + Exit: (12) [[v, c], [v, b], [v, d]]=[[v, c], [v, b], [v, d]] +^ Call: (12) not(var([v, c])) +^ Unify: (12) not(user:var([v, c])) +^ Exit: (12) not(user:var([v, c])) + Call: (12) isvar([v, c]) + Unify: (12) isvar([v, c]) + Exit: (12) isvar([v, c]) + Call: (12) putvalue([v, c], ["b", "a"], [], _G5628) + Unify: (12) putvalue([v, c], ["b", "a"], [], _G5628) +^ Call: (13) not(isvar([v, c])) +^ Unify: (13) not(user:isvar([v, c])) + Call: (14) isvar([v, c]) + Unify: (14) isvar([v, c]) + Exit: (14) isvar([v, c]) +^ Fail: (13) not(user:isvar([v, c])) + Redo: (12) putvalue([v, c], ["b", "a"], [], _G5628) + Call: (13) isvar([v, c]) + Unify: (13) isvar([v, c]) + Exit: (13) isvar([v, c]) + Call: (13) isvalstrorundef(["b", "a"]) + Unify: (13) isvalstrorundef(["b", "a"]) + Call: (14) var(["b", "a"]) + Fail: (14) var(["b", "a"]) + Redo: (13) isvalstrorundef(["b", "a"]) + Unify: (13) isvalstrorundef(["b", "a"]) +^ Call: (14) not(var(["b", "a"])) +^ Unify: (14) not(user:var(["b", "a"])) +^ Exit: (14) not(user:var(["b", "a"])) + Call: (14) isval(["b", "a"]) + Unify: (14) isval(["b", "a"]) + Call: (15) 
number(["b", "a"]) + Fail: (15) number(["b", "a"]) + Fail: (14) isval(["b", "a"]) + Redo: (13) isvalstrorundef(["b", "a"]) + Unify: (13) isvalstrorundef(["b", "a"]) +^ Call: (14) not(var(["b", "a"])) +^ Unify: (14) not(user:var(["b", "a"])) +^ Exit: (14) not(user:var(["b", "a"])) + Call: (14) expression(["b", "a"]) + Unify: (14) expression(["b", "a"]) + Call: (15) isval(["b", "a"]) + Unify: (15) isval(["b", "a"]) + Call: (16) number(["b", "a"]) + Fail: (16) number(["b", "a"]) + Fail: (15) isval(["b", "a"]) + Redo: (14) expression(["b", "a"]) + Unify: (14) expression(["b", "a"]) + Call: (15) string(["b", "a"]) + Fail: (15) string(["b", "a"]) + Redo: (14) expression(["b", "a"]) + Unify: (14) expression(["b", "a"]) + Call: (15) atom(["b", "a"]) + Fail: (15) atom(["b", "a"]) + Redo: (14) expression(["b", "a"]) + Unify: (14) expression(["b", "a"]) +^ Call: (15) not(atom(["b", "a"])) +^ Unify: (15) not(user:atom(["b", "a"])) +^ Exit: (15) not(user:atom(["b", "a"])) + Call: (15) length(["b", "a"], _G5636) + Unify: (15) length(["b", "a"], _G5636) + Exit: (15) length(["b", "a"], 2) + Call: (15) 2>=1 + Exit: (15) 2>=1 + Call: (15) expression2(["b", "a"]) + Unify: (15) expression2(["b", "a"]) + Call: (16) expression3("b") + Unify: (16) expression3("b") + Call: (17) isval("b") + Unify: (17) isval("b") + Call: (18) number("b") + Fail: (18) number("b") + Fail: (17) isval("b") + Redo: (16) expression3("b") + Unify: (16) expression3("b") + Call: (17) string("b") + Exit: (17) string("b") + Exit: (16) expression3("b") + Call: (16) expression2(["a"]) + Unify: (16) expression2(["a"]) + Call: (17) expression3("a") + Unify: (17) expression3("a") + Call: (18) isval("a") + Unify: (18) isval("a") + Call: (19) number("a") + Fail: (19) number("a") + Fail: (18) isval("a") + Redo: (17) expression3("a") + Unify: (17) expression3("a") + Call: (18) string("a") + Exit: (18) string("a") + Exit: (17) expression3("a") + Call: (17) expression2([]) + Unify: (17) expression2([]) + Exit: (17) 
expression2([]) + Exit: (16) expression2(["a"]) + Exit: (15) expression2(["b", "a"]) + Exit: (14) expression(["b", "a"]) + Exit: (13) isvalstrorundef(["b", "a"]) + Call: (13) updatevar([v, c], ["b", "a"], [], _G5638) + Unify: (13) updatevar([v, c], ["b", "a"], [], _G5638) + Call: (14) lists:member([[v, c], empty], []) + Fail: (14) lists:member([[v, c], empty], []) + Redo: (13) updatevar([v, c], ["b", "a"], [], _G5638) +^ Call: (14) not(member([[v, c], _G5634], [])) +^ Unify: (14) not(user:member([[v, c], _G5634], [])) +^ Exit: (14) not(user:member([[v, c], _G5634], [])) + Call: (14) _G5634=empty + Exit: (14) empty=empty + Call: (14) true + Exit: (14) true + Call: (14) lists:append([], [[[v, c], ["b", "a"]]], _G5658) + Unify: (14) lists:append([], [[[v, c], ["b", "a"]]], [[[v, c], ["b", "a"]]]) + Exit: (14) lists:append([], [[[v, c], ["b", "a"]]], [[[v, c], ["b", "a"]]]) + Call: (14) true + Exit: (14) true + Call: (14) true + Exit: (14) true + Exit: (13) updatevar([v, c], ["b", "a"], [], [[[v, c], ["b", "a"]]]) + Exit: (12) putvalue([v, c], ["b", "a"], [], [[[v, c], ["b", "a"]]]) + Call: (12) checkarguments([["c", "a"], [v, d]], [[v, b], [v, d]], [[[v, c], ["b", "a"]]], _G5659, [], _G5661) + Unify: (12) checkarguments([["c", "a"], [v, d]], [[v, b], [v, d]], [[[v, c], ["b", "a"]]], _G5659, [], _G5661) + Call: (13) [["c", "a"], [v, d]]=[_G5649|_G5650] + Exit: (13) [["c", "a"], [v, d]]=[["c", "a"], [v, d]] + Call: (13) expressionnotatom(["c", "a"]) + Unify: (13) expressionnotatom(["c", "a"]) + Call: (14) isvalstrempty(["c", "a"]) + Unify: (14) isvalstrempty(["c", "a"]) + Call: (15) var(["c", "a"]) + Fail: (15) var(["c", "a"]) + Redo: (14) isvalstrempty(["c", "a"]) + Unify: (14) isvalstrempty(["c", "a"]) + Call: (15) isval(["c", "a"]) + Unify: (15) isval(["c", "a"]) + Call: (16) number(["c", "a"]) + Fail: (16) number(["c", "a"]) + Fail: (15) isval(["c", "a"]) + Redo: (14) isvalstrempty(["c", "a"]) + Unify: (14) isvalstrempty(["c", "a"]) + Call: (15) string(["c", "a"]) + 
Fail: (15) string(["c", "a"]) + Fail: (14) isvalstrempty(["c", "a"]) + Redo: (13) expressionnotatom(["c", "a"]) + Unify: (13) expressionnotatom(["c", "a"]) +^ Call: (14) not(atom(["c", "a"])) +^ Unify: (14) not(user:atom(["c", "a"])) +^ Exit: (14) not(user:atom(["c", "a"])) + Call: (14) length(["c", "a"], _G5665) + Unify: (14) length(["c", "a"], _G5665) + Exit: (14) length(["c", "a"], 2) + Call: (14) 2>=1 + Exit: (14) 2>=1 + Call: (14) expressionnotatom2(["c", "a"]) + Unify: (14) expressionnotatom2(["c", "a"]) + Call: (15) isvalstrempty("c") + Unify: (15) isvalstrempty("c") + Call: (16) var("c") + Fail: (16) var("c") + Redo: (15) isvalstrempty("c") + Unify: (15) isvalstrempty("c") + Call: (16) isval("c") + Unify: (16) isval("c") + Call: (17) number("c") + Fail: (17) number("c") + Fail: (16) isval("c") + Redo: (15) isvalstrempty("c") + Unify: (15) isvalstrempty("c") + Call: (16) string("c") + Exit: (16) string("c") + Exit: (15) isvalstrempty("c") + Call: (15) expressionnotatom2(["a"]) + Unify: (15) expressionnotatom2(["a"]) + Call: (16) isvalstrempty("a") + Unify: (16) isvalstrempty("a") + Call: (17) var("a") + Fail: (17) var("a") + Redo: (16) isvalstrempty("a") + Unify: (16) isvalstrempty("a") + Call: (17) isval("a") + Unify: (17) isval("a") + Call: (18) number("a") + Fail: (18) number("a") + Fail: (17) isval("a") + Redo: (16) isvalstrempty("a") + Unify: (16) isvalstrempty("a") + Call: (17) string("a") + Exit: (17) string("a") + Exit: (16) isvalstrempty("a") + Call: (16) expressionnotatom2([]) + Unify: (16) expressionnotatom2([]) + Exit: (16) expressionnotatom2([]) + Exit: (15) expressionnotatom2(["a"]) + Exit: (14) expressionnotatom2(["c", "a"]) + Exit: (13) expressionnotatom(["c", "a"]) + Call: (13) [[v, b], [v, d]]=[_G5657|_G5658] + Exit: (13) [[v, b], [v, d]]=[[v, b], [v, d]] +^ Call: (13) not(var([v, b])) +^ Unify: (13) not(user:var([v, b])) +^ Exit: (13) not(user:var([v, b])) + Call: (13) isvar([v, b]) + Unify: (13) isvar([v, b]) + Exit: (13) isvar([v, b]) + 
Call: (13) putvalue([v, b], ["c", "a"], [[[v, c], ["b", "a"]]], _G5675) + Unify: (13) putvalue([v, b], ["c", "a"], [[[v, c], ["b", "a"]]], _G5675) +^ Call: (14) not(isvar([v, b])) +^ Unify: (14) not(user:isvar([v, b])) + Call: (15) isvar([v, b]) + Unify: (15) isvar([v, b]) + Exit: (15) isvar([v, b]) +^ Fail: (14) not(user:isvar([v, b])) + Redo: (13) putvalue([v, b], ["c", "a"], [[[v, c], ["b", "a"]]], _G5675) + Call: (14) isvar([v, b]) + Unify: (14) isvar([v, b]) + Exit: (14) isvar([v, b]) + Call: (14) isvalstrorundef(["c", "a"]) + Unify: (14) isvalstrorundef(["c", "a"]) + Call: (15) var(["c", "a"]) + Fail: (15) var(["c", "a"]) + Redo: (14) isvalstrorundef(["c", "a"]) + Unify: (14) isvalstrorundef(["c", "a"]) +^ Call: (15) not(var(["c", "a"])) +^ Unify: (15) not(user:var(["c", "a"])) +^ Exit: (15) not(user:var(["c", "a"])) + Call: (15) isval(["c", "a"]) + Unify: (15) isval(["c", "a"]) + Call: (16) number(["c", "a"]) + Fail: (16) number(["c", "a"]) + Fail: (15) isval(["c", "a"]) + Redo: (14) isvalstrorundef(["c", "a"]) + Unify: (14) isvalstrorundef(["c", "a"]) +^ Call: (15) not(var(["c", "a"])) +^ Unify: (15) not(user:var(["c", "a"])) +^ Exit: (15) not(user:var(["c", "a"])) + Call: (15) expression(["c", "a"]) + Unify: (15) expression(["c", "a"]) + Call: (16) isval(["c", "a"]) + Unify: (16) isval(["c", "a"]) + Call: (17) number(["c", "a"]) + Fail: (17) number(["c", "a"]) + Fail: (16) isval(["c", "a"]) + Redo: (15) expression(["c", "a"]) + Unify: (15) expression(["c", "a"]) + Call: (16) string(["c", "a"]) + Fail: (16) string(["c", "a"]) + Redo: (15) expression(["c", "a"]) + Unify: (15) expression(["c", "a"]) + Call: (16) atom(["c", "a"]) + Fail: (16) atom(["c", "a"]) + Redo: (15) expression(["c", "a"]) + Unify: (15) expression(["c", "a"]) +^ Call: (16) not(atom(["c", "a"])) +^ Unify: (16) not(user:atom(["c", "a"])) +^ Exit: (16) not(user:atom(["c", "a"])) + Call: (16) length(["c", "a"], _G5683) + Unify: (16) length(["c", "a"], _G5683) + Exit: (16) length(["c", "a"], 
2) + Call: (16) 2>=1 + Exit: (16) 2>=1 + Call: (16) expression2(["c", "a"]) + Unify: (16) expression2(["c", "a"]) + Call: (17) expression3("c") + Unify: (17) expression3("c") + Call: (18) isval("c") + Unify: (18) isval("c") + Call: (19) number("c") + Fail: (19) number("c") + Fail: (18) isval("c") + Redo: (17) expression3("c") + Unify: (17) expression3("c") + Call: (18) string("c") + Exit: (18) string("c") + Exit: (17) expression3("c") + Call: (17) expression2(["a"]) + Unify: (17) expression2(["a"]) + Call: (18) expression3("a") + Unify: (18) expression3("a") + Call: (19) isval("a") + Unify: (19) isval("a") + Call: (20) number("a") + Fail: (20) number("a") + Fail: (19) isval("a") + Redo: (18) expression3("a") + Unify: (18) expression3("a") + Call: (19) string("a") + Exit: (19) string("a") + Exit: (18) expression3("a") + Call: (18) expression2([]) + Unify: (18) expression2([]) + Exit: (18) expression2([]) + Exit: (17) expression2(["a"]) + Exit: (16) expression2(["c", "a"]) + Exit: (15) expression(["c", "a"]) + Exit: (14) isvalstrorundef(["c", "a"]) + Call: (14) updatevar([v, b], ["c", "a"], [[[v, c], ["b", "a"]]], _G5685) + Unify: (14) updatevar([v, b], ["c", "a"], [[[v, c], ["b", "a"]]], _G5685) + Call: (15) lists:member([[v, b], empty], [[[v, c], ["b", "a"]]]) + Unify: (15) lists:member([[v, b], empty], [[[v, c], ["b", "a"]]]) + Fail: (15) lists:member([[v, b], empty], [[[v, c], ["b", "a"]]]) + Redo: (14) updatevar([v, b], ["c", "a"], [[[v, c], ["b", "a"]]], _G5685) +^ Call: (15) not(member([[v, b], _G5681], [[[v, c], ["b", "a"]]])) +^ Unify: (15) not(user:member([[v, b], _G5681], [[[v, c], ["b", "a"]]])) +^ Exit: (15) not(user:member([[v, b], _G5681], [[[v, c], ["b", "a"]]])) + Call: (15) _G5681=empty + Exit: (15) empty=empty + Call: (15) true + Exit: (15) true + Call: (15) lists:append([[[v, c], ["b", "a"]]], [[[v, b], ["c", "a"]]], _G5705) + Unify: (15) lists:append([[[v, c], ["b", "a"]]], [[[v, b], ["c", "a"]]], [[[v, c], ["b", "a"]]|_G5697]) + Exit: (15) 
lists:append([[[v, c], ["b", "a"]]], [[[v, b], ["c", "a"]]], [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]]]) + Call: (15) true + Exit: (15) true + Call: (15) true + Exit: (15) true + Exit: (14) updatevar([v, b], ["c", "a"], [[[v, c], ["b", "a"]]], [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]]]) + Exit: (13) putvalue([v, b], ["c", "a"], [[[v, c], ["b", "a"]]], [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]]]) + Call: (13) checkarguments([[v, d]], [[v, d]], [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]]], _G5709, [], _G5711) + Unify: (13) checkarguments([[v, d]], [[v, d]], [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]]], _G5709, [], _G5711) + Call: (14) [[v, d]]=[_G5699|_G5700] + Exit: (14) [[v, d]]=[[v, d]] + Call: (14) expressionnotatom([v, d]) + Unify: (14) expressionnotatom([v, d]) + Call: (15) isvalstrempty([v, d]) + Unify: (15) isvalstrempty([v, d]) + Call: (16) var([v, d]) + Fail: (16) var([v, d]) + Redo: (15) isvalstrempty([v, d]) + Unify: (15) isvalstrempty([v, d]) + Call: (16) isval([v, d]) + Unify: (16) isval([v, d]) + Call: (17) number([v, d]) + Fail: (17) number([v, d]) + Fail: (16) isval([v, d]) + Redo: (15) isvalstrempty([v, d]) + Unify: (15) isvalstrempty([v, d]) + Call: (16) string([v, d]) + Fail: (16) string([v, d]) + Fail: (15) isvalstrempty([v, d]) + Redo: (14) expressionnotatom([v, d]) + Unify: (14) expressionnotatom([v, d]) +^ Call: (15) not(atom([v, d])) +^ Unify: (15) not(user:atom([v, d])) +^ Exit: (15) not(user:atom([v, d])) + Call: (15) length([v, d], _G5715) + Unify: (15) length([v, d], _G5715) + Exit: (15) length([v, d], 2) + Call: (15) 2>=1 + Exit: (15) 2>=1 + Call: (15) expressionnotatom2([v, d]) + Unify: (15) expressionnotatom2([v, d]) + Call: (16) isvalstrempty(v) + Unify: (16) isvalstrempty(v) + Call: (17) var(v) + Fail: (17) var(v) + Redo: (16) isvalstrempty(v) + Unify: (16) isvalstrempty(v) + Call: (17) isval(v) + Unify: (17) isval(v) + Call: (18) number(v) + Fail: (18) number(v) + Fail: (17) isval(v) + Redo: (16) isvalstrempty(v) + Unify: (16) 
isvalstrempty(v) + Call: (17) string(v) + Fail: (17) string(v) + Fail: (16) isvalstrempty(v) + Fail: (15) expressionnotatom2([v, d]) + Redo: (14) expressionnotatom([v, d]) + Unify: (14) expressionnotatom([v, d]) + Call: (15) predicate_or_rule_name([v, d]) + Fail: (15) predicate_or_rule_name([v, d]) + Fail: (14) expressionnotatom([v, d]) + Redo: (13) checkarguments([[v, d]], [[v, d]], [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]]], _G5709, [], _G5711) + Unify: (13) checkarguments([[v, d]], [[v, d]], [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]]], _G5709, [], _G5711) + Call: (14) [[v, d]]=[_G5699|_G5700] + Exit: (14) [[v, d]]=[[v, d]] +^ Call: (14) not(var([v, d])) +^ Unify: (14) not(user:var([v, d])) +^ Exit: (14) not(user:var([v, d])) + Call: (14) isvar([v, d]) + Unify: (14) isvar([v, d]) + Exit: (14) isvar([v, d]) + Call: (14) [[v, d]]=[_G5707|_G5708] + Exit: (14) [[v, d]]=[[v, d]] + Call: (14) expressionnotatom([v, d]) + Unify: (14) expressionnotatom([v, d]) + Call: (15) isvalstrempty([v, d]) + Unify: (15) isvalstrempty([v, d]) + Call: (16) var([v, d]) + Fail: (16) var([v, d]) + Redo: (15) isvalstrempty([v, d]) + Unify: (15) isvalstrempty([v, d]) + Call: (16) isval([v, d]) + Unify: (16) isval([v, d]) + Call: (17) number([v, d]) + Fail: (17) number([v, d]) + Fail: (16) isval([v, d]) + Redo: (15) isvalstrempty([v, d]) + Unify: (15) isvalstrempty([v, d]) + Call: (16) string([v, d]) + Fail: (16) string([v, d]) + Fail: (15) isvalstrempty([v, d]) + Redo: (14) expressionnotatom([v, d]) + Unify: (14) expressionnotatom([v, d]) +^ Call: (15) not(atom([v, d])) +^ Unify: (15) not(user:atom([v, d])) +^ Exit: (15) not(user:atom([v, d])) + Call: (15) length([v, d], _G5723) + Unify: (15) length([v, d], _G5723) + Exit: (15) length([v, d], 2) + Call: (15) 2>=1 + Exit: (15) 2>=1 + Call: (15) expressionnotatom2([v, d]) + Unify: (15) expressionnotatom2([v, d]) + Call: (16) isvalstrempty(v) + Unify: (16) isvalstrempty(v) + Call: (17) var(v) + Fail: (17) var(v) + Redo: (16) 
isvalstrempty(v) + Unify: (16) isvalstrempty(v) + Call: (17) isval(v) + Unify: (17) isval(v) + Call: (18) number(v) + Fail: (18) number(v) + Fail: (17) isval(v) + Redo: (16) isvalstrempty(v) + Unify: (16) isvalstrempty(v) + Call: (17) string(v) + Fail: (17) string(v) + Fail: (16) isvalstrempty(v) + Fail: (15) expressionnotatom2([v, d]) + Redo: (14) expressionnotatom([v, d]) + Unify: (14) expressionnotatom([v, d]) + Call: (15) predicate_or_rule_name([v, d]) + Fail: (15) predicate_or_rule_name([v, d]) + Fail: (14) expressionnotatom([v, d]) + Redo: (13) checkarguments([[v, d]], [[v, d]], [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]]], _G5709, [], _G5711) + Unify: (13) checkarguments([[v, d]], [[v, d]], [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]]], _G5709, [], _G5711) + Call: (14) [[v, d]]=[_G5699|_G5700] + Exit: (14) [[v, d]]=[[v, d]] +^ Call: (14) not(var([v, d])) +^ Unify: (14) not(user:var([v, d])) +^ Exit: (14) not(user:var([v, d])) + Call: (14) isvar([v, d]) + Unify: (14) isvar([v, d]) + Exit: (14) isvar([v, d]) + Call: (14) [[v, d]]=[_G5707|_G5708] + Exit: (14) [[v, d]]=[[v, d]] +^ Call: (14) not(var([v, d])) +^ Unify: (14) not(user:var([v, d])) +^ Exit: (14) not(user:var([v, d])) + Call: (14) isvar([v, d]) + Unify: (14) isvar([v, d]) + Exit: (14) isvar([v, d]) + Call: (14) getvalue([v, d], _G5723, [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]]]) + Unify: (14) getvalue([v, d], _G5723, [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]]]) +^ Call: (15) not(isvar([v, d])) +^ Unify: (15) not(user:isvar([v, d])) + Call: (16) isvar([v, d]) + Unify: (16) isvar([v, d]) + Exit: (16) isvar([v, d]) +^ Fail: (15) not(user:isvar([v, d])) + Redo: (14) getvalue([v, d], _G5723, [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]]]) + Call: (15) isvar([v, d]) + Unify: (15) isvar([v, d]) + Exit: (15) isvar([v, d]) + Call: (15) isvalstrorundef(_G5722) + Unify: (15) isvalstrorundef(_G5722) + Call: (16) var(_G5722) + Exit: (16) var(_G5722) + Exit: (15) isvalstrorundef(_G5722) + Call: (15) getvar([v, d], 
_G5723, [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]]]) + Unify: (15) getvar([v, d], _G5723, [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]]]) + Call: (16) lists:member([[v, d], _G5718], [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]]]) + Unify: (16) lists:member([[v, d], _G5718], [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]]]) + Fail: (16) lists:member([[v, d], _G5718], [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]]]) + Redo: (15) getvar([v, d], _G5723, [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]]]) + Unify: (15) getvar([v, d], empty, [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]]]) +^ Call: (16) not(member([[v, d], _G5721], [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]]])) +^ Unify: (16) not(user:member([[v, d], _G5721], [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]]])) +^ Exit: (16) not(user:member([[v, d], _G5721], [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]]])) + Call: (16) true + Exit: (16) true + Exit: (15) getvar([v, d], empty, [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]]]) + Exit: (14) getvalue([v, d], empty, [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]]]) + Call: (14) true + Exit: (14) true + Call: (14) putvalue([v, d], empty, [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]]], _G5737) + Unify: (14) putvalue([v, d], empty, [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]]], _G5737) +^ Call: (15) not(isvar([v, d])) +^ Unify: (15) not(user:isvar([v, d])) + Call: (16) isvar([v, d]) + Unify: (16) isvar([v, d]) + Exit: (16) isvar([v, d]) +^ Fail: (15) not(user:isvar([v, d])) + Redo: (14) putvalue([v, d], empty, [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]]], _G5737) + Call: (15) isvar([v, d]) + Unify: (15) isvar([v, d]) + Exit: (15) isvar([v, d]) + Call: (15) isvalstrorundef(empty) + Unify: (15) isvalstrorundef(empty) + Call: (16) var(empty) + Fail: (16) var(empty) + Redo: (15) isvalstrorundef(empty) + Unify: (15) isvalstrorundef(empty) +^ Call: (16) not(var(empty)) +^ Unify: (16) not(user:var(empty)) +^ Exit: (16) not(user:var(empty)) + Call: (16) isval(empty) + Unify: (16) isval(empty) + Call: (17) 
number(empty) + Fail: (17) number(empty) + Fail: (16) isval(empty) + Redo: (15) isvalstrorundef(empty) + Unify: (15) isvalstrorundef(empty) +^ Call: (16) not(var(empty)) +^ Unify: (16) not(user:var(empty)) +^ Exit: (16) not(user:var(empty)) + Call: (16) expression(empty) + Unify: (16) expression(empty) + Exit: (16) expression(empty) + Exit: (15) isvalstrorundef(empty) + Call: (15) updatevar([v, d], empty, [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]]], _G5742) + Unify: (15) updatevar([v, d], empty, [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]]], _G5742) + Call: (16) lists:member([[v, d], empty], [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]]]) + Unify: (16) lists:member([[v, d], empty], [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]]]) + Fail: (16) lists:member([[v, d], empty], [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]]]) + Redo: (15) updatevar([v, d], empty, [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]]], _G5742) +^ Call: (16) not(member([[v, d], _G5738], [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]]])) +^ Unify: (16) not(user:member([[v, d], _G5738], [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]]])) +^ Exit: (16) not(user:member([[v, d], _G5738], [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]]])) + Call: (16) _G5738=empty + Exit: (16) empty=empty + Call: (16) true + Exit: (16) true + Call: (16) lists:append([[[v, c], ["b", "a"]], [[v, b], ["c", "a"]]], [[[v, d], empty]], _G5762) + Unify: (16) lists:append([[[v, c], ["b", "a"]], [[v, b], ["c", "a"]]], [[[v, d], empty]], [[[v, c], ["b", "a"]]|_G5754]) + Exit: (16) lists:append([[[v, c], ["b", "a"]], [[v, b], ["c", "a"]]], [[[v, d], empty]], [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]], [[v, d], empty]]) + Call: (16) true + Exit: (16) true + Call: (16) true + Exit: (16) true + Exit: (15) updatevar([v, d], empty, [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]]], [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]], [[v, d], empty]]) + Exit: (14) putvalue([v, d], empty, [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]]], [[[v, c], ["b", "a"]], [[v, b], ["c", 
"a"]], [[v, d], empty]]) + Call: (14) lists:append([], [[[v, d], [v, d]]], _G5777) + Unify: (14) lists:append([], [[[v, d], [v, d]]], [[[v, d], [v, d]]]) + Exit: (14) lists:append([], [[[v, d], [v, d]]], [[[v, d], [v, d]]]) + Call: (14) checkarguments([], [], [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]], [[v, d], empty]], _G5778, [[[v, d], [v, d]]], _G5780) + Unify: (14) checkarguments([], [], [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]], [[v, d], empty]], [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]], [[v, d], empty]], [[[v, d], [v, d]]], [[[v, d], [v, d]]]) + Exit: (14) checkarguments([], [], [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]], [[v, d], empty]], [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]], [[v, d], empty]], [[[v, d], [v, d]]], [[[v, d], [v, d]]]) + Exit: (13) checkarguments([[v, d]], [[v, d]], [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]]], [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]], [[v, d], empty]], [], [[[v, d], [v, d]]]) + Exit: (12) checkarguments([["c", "a"], [v, d]], [[v, b], [v, d]], [[[v, c], ["b", "a"]]], [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]], [[v, d], empty]], [], [[[v, d], [v, d]]]) + Exit: (11) checkarguments([["b", "a"], ["c", "a"], [v, d]], [[v, c], [v, b], [v, d]], [], [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]], [[v, d], empty]], [], [[[v, d], [v, d]]]) + Call: (11) debug(on) + Fail: (11) debug(on) + Redo: (10) member1([[n, f], [["b", "a"], ["c", "a"], [v, d]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], _G18) + Call: (11) true + Exit: (11) true + Call: (11) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]], [[v, d], empty]], _G5778, [[[n, =], [[v, d], [v, c]]]], true) + Unify: (11) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", 
[[[n|...], [...|...]]]]], [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]], [[v, d], empty]], _G5778, [[[n, =], [[v, d], [v, c]]]], true) + Call: (12) [[[n, =], [[v, d], [v, c]]]]=[_G5768|_G5769] + Exit: (12) [[[n, =], [[v, d], [v, c]]]]=[[[n, =], [[v, d], [v, c]]]] +^ Call: (12) not(predicate_or_rule_name([[n, =], [[v, d], [v, c]]])) +^ Unify: (12) not(user:predicate_or_rule_name([[n, =], [[v, d], [v, c]]])) + Call: (13) predicate_or_rule_name([[n, =], [[v, d], [v, c]]]) + Fail: (13) predicate_or_rule_name([[n, =], [[v, d], [v, c]]]) +^ Exit: (12) not(user:predicate_or_rule_name([[n, =], [[v, d], [v, c]]])) + Call: (12) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]], [[v, d], empty]], _G5786, [[n, =], [[v, d], [v, c]]], _G5788) + Unify: (12) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]], [[v, d], empty]], _G5786, [[n, =], [[v, d], [v, c]]], _G5788) + Call: (13) [[n, =], [[v, d], [v, c]]]=[_G5776|_G5777] + Exit: (13) [[n, =], [[v, d], [v, c]]]=[[n, =], [[v, d], [v, c]]] +^ Call: (13) not(predicate_or_rule_name([n, =])) +^ Unify: (13) not(user:predicate_or_rule_name([n, =])) + Call: (14) predicate_or_rule_name([n, =]) + Unify: (14) predicate_or_rule_name([n, =]) + Exit: (14) predicate_or_rule_name([n, =]) +^ Fail: (13) not(user:predicate_or_rule_name([n, =])) + Redo: (12) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]], [[v, d], empty]], _G5786, [[n, =], [[v, d], [v, c]]], _G5788) + Unify: (12) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], 
[...|...]]]]], [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]], [[v, d], empty]], _G5786, [[n, =], [[v, d], [v, c]]], _G5788) + Call: (13) [[n, =], [[v, d], [v, c]]]=[[not, [_G5785]]|_G5777] + Fail: (13) [[n, =], [[v, d], [v, c]]]=[[not, [_G5785]]|_G5777] + Redo: (12) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]], [[v, d], empty]], _G5786, [[n, =], [[v, d], [v, c]]], _G5788) + Unify: (12) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]], [[v, d], empty]], _G5786, [[n, =], [[v, d], [v, c]]], _G5788) + Call: (13) [[n, =], [[v, d], [v, c]]]=[[_G5779], or, [_G5788]] + Fail: (13) [[n, =], [[v, d], [v, c]]]=[[_G5779], or, [_G5788]] + Redo: (12) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]], [[v, d], empty]], _G5786, [[n, =], [[v, d], [v, c]]], _G5788) + Unify: (12) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]], [[v, d], empty]], _G5786, [[n, =], [[v, d], [v, c]]], _G5788) + Call: (13) [[n, =], [[v, d], [v, c]]]=[_G5776|_G5777] + Exit: (13) [[n, =], [[v, d], [v, c]]]=[[n, =], [[v, d], [v, c]]] +^ Call: (13) not(predicate_or_rule_name([n, =])) +^ Unify: (13) not(user:predicate_or_rule_name([n, =])) + Call: (14) predicate_or_rule_name([n, =]) + Unify: (14) predicate_or_rule_name([n, =]) + Exit: (14) predicate_or_rule_name([n, =]) +^ Fail: (13) not(user:predicate_or_rule_name([n, =])) + Fail: (12) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, 
b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]], [[v, d], empty]], _G5786, [[n, =], [[v, d], [v, c]]], _G5788) + Redo: (11) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]], [[v, d], empty]], _G5778, [[[n, =], [[v, d], [v, c]]]], true) + Unify: (11) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]], [[v, d], empty]], _G5778, [[[n, =], [[v, d], [v, c]]]], true) + Call: (12) [[[n, =], [[v, d], [v, c]]]]=[[not, [_G5777]]|_G5769] + Fail: (12) [[[n, =], [[v, d], [v, c]]]]=[[not, [_G5777]]|_G5769] + Redo: (11) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]], [[v, d], empty]], _G5778, [[[n, =], [[v, d], [v, c]]]], true) + Unify: (11) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]], [[v, d], empty]], _G5778, [[[n, =], [[v, d], [v, c]]]], true) + Call: (12) [[[n, =], [[v, d], [v, c]]]]=[[_G5771], or, [_G5780]] + Fail: (12) [[[n, =], [[v, d], [v, c]]]]=[[_G5771], or, [_G5780]] + Redo: (11) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]], [[v, d], empty]], _G5778, [[[n, =], [[v, d], [v, c]]]], true) + Unify: (11) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]], [[v, d], empty]], 
_G5778, [[[n, =], [[v, d], [v, c]]]], true) + Call: (12) [[[n, =], [[v, d], [v, c]]]]=[_G5768|_G5769] + Exit: (12) [[[n, =], [[v, d], [v, c]]]]=[[[n, =], [[v, d], [v, c]]]] +^ Call: (12) not(predicate_or_rule_name([[n, =], [[v, d], [v, c]]])) +^ Unify: (12) not(user:predicate_or_rule_name([[n, =], [[v, d], [v, c]]])) + Call: (13) predicate_or_rule_name([[n, =], [[v, d], [v, c]]]) + Fail: (13) predicate_or_rule_name([[n, =], [[v, d], [v, c]]]) +^ Exit: (12) not(user:predicate_or_rule_name([[n, =], [[v, d], [v, c]]])) + Call: (12) interpretstatement1([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[n, =], [[v, d], [v, c]]], [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]], [[v, d], empty]], _G5787, _G5788, _G5789) + Unify: (12) interpretstatement1([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[n, =], [[v, d], [v, c]]], [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]], [[v, d], empty]], _G5787, true, nocut) + Call: (13) isop(=) + Unify: (13) isop(=) + Exit: (13) isop(=) + Call: (13) interpretpart(is, [v, d], [v, c], [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]], [[v, d], empty]], _G5787) + Unify: (13) interpretpart(is, [v, d], [v, c], [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]], [[v, d], empty]], _G5787) + Call: (14) getvalues([v, d], [v, c], _G5785, _G5786, [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]], [[v, d], empty]]) + Unify: (14) getvalues([v, d], [v, c], _G5785, _G5786, [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]], [[v, d], empty]]) + Call: (15) getvalue([v, d], _G5784, [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]], [[v, d], empty]]) + Unify: (15) getvalue([v, d], _G5784, [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]], [[v, d], empty]]) +^ Call: (16) not(isvar([v, d])) +^ Unify: (16) not(user:isvar([v, d])) + Call: (17) isvar([v, d]) + Unify: (17) isvar([v, d]) + Exit: (17) isvar([v, d]) +^ Fail: (16) 
not(user:isvar([v, d])) + Redo: (15) getvalue([v, d], _G5784, [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]], [[v, d], empty]]) + Call: (16) isvar([v, d]) + Unify: (16) isvar([v, d]) + Exit: (16) isvar([v, d]) + Call: (16) isvalstrorundef(_G5783) + Unify: (16) isvalstrorundef(_G5783) + Call: (17) var(_G5783) + Exit: (17) var(_G5783) + Exit: (16) isvalstrorundef(_G5783) + Call: (16) getvar([v, d], _G5784, [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]], [[v, d], empty]]) + Unify: (16) getvar([v, d], _G5784, [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]], [[v, d], empty]]) + Call: (17) lists:member([[v, d], _G5779], [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]], [[v, d], empty]]) + Unify: (17) lists:member([[v, d], _G5779], [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]], [[v, d], empty]]) + Exit: (17) lists:member([[v, d], empty], [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]], [[v, d], empty]]) +^ Call: (17) not(empty=empty) +^ Unify: (17) not(user: (empty=empty)) +^ Fail: (17) not(user: (empty=empty)) + Redo: (16) getvar([v, d], _G5784, [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]], [[v, d], empty]]) + Unify: (16) getvar([v, d], empty, [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]], [[v, d], empty]]) +^ Call: (17) not(member([[v, d], _G5782], [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]], [[v, d], empty]])) +^ Unify: (17) not(user:member([[v, d], _G5782], [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]], [[v, d], empty]])) +^ Fail: (17) not(user:member([[v, d], _G5782], [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]], [[v, d], empty]])) + Redo: (16) getvar([v, d], empty, [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]], [[v, d], empty]]) + Call: (17) lists:member([[v, d], empty], [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]], [[v, d], empty]]) + Unify: (17) lists:member([[v, d], empty], [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]], [[v, d], empty]]) + Exit: (17) lists:member([[v, d], empty], [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]], [[v, d], empty]]) + Exit: (16) getvar([v, d], empty, [[[v, c], ["b", 
"a"]], [[v, b], ["c", "a"]], [[v, d], empty]]) + Exit: (15) getvalue([v, d], empty, [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]], [[v, d], empty]]) + Call: (15) getvalue([v, c], _G5790, [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]], [[v, d], empty]]) + Unify: (15) getvalue([v, c], _G5790, [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]], [[v, d], empty]]) +^ Call: (16) not(isvar([v, c])) +^ Unify: (16) not(user:isvar([v, c])) + Call: (17) isvar([v, c]) + Unify: (17) isvar([v, c]) + Exit: (17) isvar([v, c]) +^ Fail: (16) not(user:isvar([v, c])) + Redo: (15) getvalue([v, c], _G5790, [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]], [[v, d], empty]]) + Call: (16) isvar([v, c]) + Unify: (16) isvar([v, c]) + Exit: (16) isvar([v, c]) + Call: (16) isvalstrorundef(_G5789) + Unify: (16) isvalstrorundef(_G5789) + Call: (17) var(_G5789) + Exit: (17) var(_G5789) + Exit: (16) isvalstrorundef(_G5789) + Call: (16) getvar([v, c], _G5790, [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]], [[v, d], empty]]) + Unify: (16) getvar([v, c], _G5790, [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]], [[v, d], empty]]) + Call: (17) lists:member([[v, c], _G5785], [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]], [[v, d], empty]]) + Unify: (17) lists:member([[v, c], _G5785], [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]], [[v, d], empty]]) + Exit: (17) lists:member([[v, c], ["b", "a"]], [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]], [[v, d], empty]]) +^ Call: (17) not(["b", "a"]=empty) +^ Unify: (17) not(user: (["b", "a"]=empty)) +^ Exit: (17) not(user: (["b", "a"]=empty)) + Exit: (16) getvar([v, c], ["b", "a"], [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]], [[v, d], empty]]) + Exit: (15) getvalue([v, c], ["b", "a"], [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]], [[v, d], empty]]) + Exit: (14) getvalues([v, d], [v, c], empty, ["b", "a"], [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]], [[v, d], empty]]) + Call: (14) expression(empty) + Unify: (14) expression(empty) + Exit: (14) expression(empty) + Call: (14) expression(["b", "a"]) + 
Unify: (14) expression(["b", "a"]) + Call: (15) isval(["b", "a"]) + Unify: (15) isval(["b", "a"]) + Call: (16) number(["b", "a"]) + Fail: (16) number(["b", "a"]) + Fail: (15) isval(["b", "a"]) + Redo: (14) expression(["b", "a"]) + Unify: (14) expression(["b", "a"]) + Call: (15) string(["b", "a"]) + Fail: (15) string(["b", "a"]) + Redo: (14) expression(["b", "a"]) + Unify: (14) expression(["b", "a"]) + Call: (15) atom(["b", "a"]) + Fail: (15) atom(["b", "a"]) + Redo: (14) expression(["b", "a"]) + Unify: (14) expression(["b", "a"]) +^ Call: (15) not(atom(["b", "a"])) +^ Unify: (15) not(user:atom(["b", "a"])) +^ Exit: (15) not(user:atom(["b", "a"])) + Call: (15) length(["b", "a"], _G5807) + Unify: (15) length(["b", "a"], _G5807) + Exit: (15) length(["b", "a"], 2) + Call: (15) 2>=1 + Exit: (15) 2>=1 + Call: (15) expression2(["b", "a"]) + Unify: (15) expression2(["b", "a"]) + Call: (16) expression3("b") + Unify: (16) expression3("b") + Call: (17) isval("b") + Unify: (17) isval("b") + Call: (18) number("b") + Fail: (18) number("b") + Fail: (17) isval("b") + Redo: (16) expression3("b") + Unify: (16) expression3("b") + Call: (17) string("b") + Exit: (17) string("b") + Exit: (16) expression3("b") + Call: (16) expression2(["a"]) + Unify: (16) expression2(["a"]) + Call: (17) expression3("a") + Unify: (17) expression3("a") + Call: (18) isval("a") + Unify: (18) isval("a") + Call: (19) number("a") + Fail: (19) number("a") + Fail: (18) isval("a") + Redo: (17) expression3("a") + Unify: (17) expression3("a") + Call: (18) string("a") + Exit: (18) string("a") + Exit: (17) expression3("a") + Call: (17) expression2([]) + Unify: (17) expression2([]) + Exit: (17) expression2([]) + Exit: (16) expression2(["a"]) + Exit: (15) expression2(["b", "a"]) + Exit: (14) expression(["b", "a"]) + Call: (14) val1emptyorvalsequal(empty, ["b", "a"]) + Unify: (14) val1emptyorvalsequal(empty, ["b", "a"]) + Exit: (14) val1emptyorvalsequal(empty, ["b", "a"]) + Call: (14) putvalue([v, d], ["b", "a"], [[[v, 
c], ["b", "a"]], [[v, b], ["c", "a"]], [[v, d], empty]], _G5809) + Unify: (14) putvalue([v, d], ["b", "a"], [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]], [[v, d], empty]], _G5809) +^ Call: (15) not(isvar([v, d])) +^ Unify: (15) not(user:isvar([v, d])) + Call: (16) isvar([v, d]) + Unify: (16) isvar([v, d]) + Exit: (16) isvar([v, d]) +^ Fail: (15) not(user:isvar([v, d])) + Redo: (14) putvalue([v, d], ["b", "a"], [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]], [[v, d], empty]], _G5809) + Call: (15) isvar([v, d]) + Unify: (15) isvar([v, d]) + Exit: (15) isvar([v, d]) + Call: (15) isvalstrorundef(["b", "a"]) + Unify: (15) isvalstrorundef(["b", "a"]) + Call: (16) var(["b", "a"]) + Fail: (16) var(["b", "a"]) + Redo: (15) isvalstrorundef(["b", "a"]) + Unify: (15) isvalstrorundef(["b", "a"]) +^ Call: (16) not(var(["b", "a"])) +^ Unify: (16) not(user:var(["b", "a"])) +^ Exit: (16) not(user:var(["b", "a"])) + Call: (16) isval(["b", "a"]) + Unify: (16) isval(["b", "a"]) + Call: (17) number(["b", "a"]) + Fail: (17) number(["b", "a"]) + Fail: (16) isval(["b", "a"]) + Redo: (15) isvalstrorundef(["b", "a"]) + Unify: (15) isvalstrorundef(["b", "a"]) +^ Call: (16) not(var(["b", "a"])) +^ Unify: (16) not(user:var(["b", "a"])) +^ Exit: (16) not(user:var(["b", "a"])) + Call: (16) expression(["b", "a"]) + Unify: (16) expression(["b", "a"]) + Call: (17) isval(["b", "a"]) + Unify: (17) isval(["b", "a"]) + Call: (18) number(["b", "a"]) + Fail: (18) number(["b", "a"]) + Fail: (17) isval(["b", "a"]) + Redo: (16) expression(["b", "a"]) + Unify: (16) expression(["b", "a"]) + Call: (17) string(["b", "a"]) + Fail: (17) string(["b", "a"]) + Redo: (16) expression(["b", "a"]) + Unify: (16) expression(["b", "a"]) + Call: (17) atom(["b", "a"]) + Fail: (17) atom(["b", "a"]) + Redo: (16) expression(["b", "a"]) + Unify: (16) expression(["b", "a"]) +^ Call: (17) not(atom(["b", "a"])) +^ Unify: (17) not(user:atom(["b", "a"])) +^ Exit: (17) not(user:atom(["b", "a"])) + Call: (17) length(["b", "a"], _G5817) + 
Unify: (17) length(["b", "a"], _G5817) + Exit: (17) length(["b", "a"], 2) + Call: (17) 2>=1 + Exit: (17) 2>=1 + Call: (17) expression2(["b", "a"]) + Unify: (17) expression2(["b", "a"]) + Call: (18) expression3("b") + Unify: (18) expression3("b") + Call: (19) isval("b") + Unify: (19) isval("b") + Call: (20) number("b") + Fail: (20) number("b") + Fail: (19) isval("b") + Redo: (18) expression3("b") + Unify: (18) expression3("b") + Call: (19) string("b") + Exit: (19) string("b") + Exit: (18) expression3("b") + Call: (18) expression2(["a"]) + Unify: (18) expression2(["a"]) + Call: (19) expression3("a") + Unify: (19) expression3("a") + Call: (20) isval("a") + Unify: (20) isval("a") + Call: (21) number("a") + Fail: (21) number("a") + Fail: (20) isval("a") + Redo: (19) expression3("a") + Unify: (19) expression3("a") + Call: (20) string("a") + Exit: (20) string("a") + Exit: (19) expression3("a") + Call: (19) expression2([]) + Unify: (19) expression2([]) + Exit: (19) expression2([]) + Exit: (18) expression2(["a"]) + Exit: (17) expression2(["b", "a"]) + Exit: (16) expression(["b", "a"]) + Exit: (15) isvalstrorundef(["b", "a"]) + Call: (15) updatevar([v, d], ["b", "a"], [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]], [[v, d], empty]], _G5819) + Unify: (15) updatevar([v, d], ["b", "a"], [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]], [[v, d], empty]], _G5819) + Call: (16) lists:member([[v, d], empty], [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]], [[v, d], empty]]) + Unify: (16) lists:member([[v, d], empty], [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]], [[v, d], empty]]) + Exit: (16) lists:member([[v, d], empty], [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]], [[v, d], empty]]) + Call: (16) lists:delete([[[v, c], ["b", "a"]], [[v, b], ["c", "a"]], [[v, d], empty]], [[v, d], empty], _G5830) + Unify: (16) lists:delete([[[v, c], ["b", "a"]], [[v, b], ["c", "a"]], [[v, d], empty]], [[v, d], empty], _G5830) + Exit: (16) lists:delete([[[v, c], ["b", "a"]], [[v, b], ["c", "a"]], [[v, d], empty]], 
[[v, d], empty], [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]]]) + Call: (16) lists:append([[[v, c], ["b", "a"]], [[v, b], ["c", "a"]]], [[[v, d], ["b", "a"]]], _G5845) + Unify: (16) lists:append([[[v, c], ["b", "a"]], [[v, b], ["c", "a"]]], [[[v, d], ["b", "a"]]], [[[v, c], ["b", "a"]]|_G5837]) + Exit: (16) lists:append([[[v, c], ["b", "a"]], [[v, b], ["c", "a"]]], [[[v, d], ["b", "a"]]], [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]], [[v, d], ["b", "a"]]]) + Call: (16) true + Exit: (16) true + Call: (16) true + Exit: (16) true + Call: (16) true + Exit: (16) true + Exit: (15) updatevar([v, d], ["b", "a"], [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]], [[v, d], empty]], [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]], [[v, d], ["b", "a"]]]) + Exit: (14) putvalue([v, d], ["b", "a"], [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]], [[v, d], empty]], [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]], [[v, d], ["b", "a"]]]) + Call: (14) debug(on) + Fail: (14) debug(on) + Redo: (13) interpretpart(is, [v, d], [v, c], [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]], [[v, d], empty]], [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]], [[v, d], ["b", "a"]]]) + Call: (14) true + Exit: (14) true + Call: (14) debug(on) + Fail: (14) debug(on) + Redo: (13) interpretpart(is, [v, d], [v, c], [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]], [[v, d], empty]], [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]], [[v, d], ["b", "a"]]]) + Call: (14) true + Exit: (14) true + Exit: (13) interpretpart(is, [v, d], [v, c], [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]], [[v, d], empty]], [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]], [[v, d], ["b", "a"]]]) + Exit: (12) interpretstatement1([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[n, =], [[v, d], [v, c]]], [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]], [[v, d], empty]], [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]], [[v, d], ["b", "a"]]], true, nocut) +^ Call: (12) not(nocut=cut) +^ Unify: (12) 
not(user: (nocut=cut)) +^ Exit: (12) not(user: (nocut=cut)) + Call: (12) _G5855=[[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]] + Exit: (12) [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]]=[[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]] + Call: (12) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]], [[v, d], ["b", "a"]]], _G5858, [], _G5860) + Unify: (12) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]], [[v, d], ["b", "a"]]], [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]], [[v, d], ["b", "a"]]], [], true) + Exit: (12) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]], [[v, d], ["b", "a"]]], [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]], [[v, d], ["b", "a"]]], [], true) + Call: (12) logicalconjunction(true, true, true) + Unify: (12) logicalconjunction(true, true, true) + Call: (13) true(true) + Unify: (13) true(true) + Exit: (13) true(true) + Call: (13) true(true) + Unify: (13) true(true) + Exit: (13) true(true) + Exit: (12) logicalconjunction(true, true, true) + Exit: (11) interpretbody([[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]], [[v, d], empty]], [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]], [[v, d], ["b", "a"]]], [[[n, =], [[v, d], [v, c]]]], true) + Call: (11) updatevars([[[v, d], [v, d]]], [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]], [[v, d], ["b", "a"]]], [], _G5858) + Unify: (11) updatevars([[[v, d], [v, d]]], [[[v, c], ["b", "a"]], [[v, b], ["c", 
"a"]], [[v, d], ["b", "a"]]], [], _G5858) + Call: (12) [[[v, d], [v, d]]]=[[_G5851, _G5854]|_G5849] + Exit: (12) [[[v, d], [v, d]]]=[[[v, d], [v, d]]] + Call: (12) expressionnotatom([v, d]) + Unify: (12) expressionnotatom([v, d]) + Call: (13) isvalstrempty([v, d]) + Unify: (13) isvalstrempty([v, d]) + Call: (14) var([v, d]) + Fail: (14) var([v, d]) + Redo: (13) isvalstrempty([v, d]) + Unify: (13) isvalstrempty([v, d]) + Call: (14) isval([v, d]) + Unify: (14) isval([v, d]) + Call: (15) number([v, d]) + Fail: (15) number([v, d]) + Fail: (14) isval([v, d]) + Redo: (13) isvalstrempty([v, d]) + Unify: (13) isvalstrempty([v, d]) + Call: (14) string([v, d]) + Fail: (14) string([v, d]) + Fail: (13) isvalstrempty([v, d]) + Redo: (12) expressionnotatom([v, d]) + Unify: (12) expressionnotatom([v, d]) +^ Call: (13) not(atom([v, d])) +^ Unify: (13) not(user:atom([v, d])) +^ Exit: (13) not(user:atom([v, d])) + Call: (13) length([v, d], _G5870) + Unify: (13) length([v, d], _G5870) + Exit: (13) length([v, d], 2) + Call: (13) 2>=1 + Exit: (13) 2>=1 + Call: (13) expressionnotatom2([v, d]) + Unify: (13) expressionnotatom2([v, d]) + Call: (14) isvalstrempty(v) + Unify: (14) isvalstrempty(v) + Call: (15) var(v) + Fail: (15) var(v) + Redo: (14) isvalstrempty(v) + Unify: (14) isvalstrempty(v) + Call: (15) isval(v) + Unify: (15) isval(v) + Call: (16) number(v) + Fail: (16) number(v) + Fail: (15) isval(v) + Redo: (14) isvalstrempty(v) + Unify: (14) isvalstrempty(v) + Call: (15) string(v) + Fail: (15) string(v) + Fail: (14) isvalstrempty(v) + Fail: (13) expressionnotatom2([v, d]) + Redo: (12) expressionnotatom([v, d]) + Unify: (12) expressionnotatom([v, d]) + Call: (13) predicate_or_rule_name([v, d]) + Fail: (13) predicate_or_rule_name([v, d]) + Fail: (12) expressionnotatom([v, d]) + Redo: (11) updatevars([[[v, d], [v, d]]], [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]], [[v, d], ["b", "a"]]], [], _G5867) + Call: (12) lists:member([[v, d], _G5860], [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]], 
[[v, d], ["b", "a"]]]) + Unify: (12) lists:member([[v, d], _G5860], [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]], [[v, d], ["b", "a"]]]) + Exit: (12) lists:member([[v, d], ["b", "a"]], [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]], [[v, d], ["b", "a"]]]) + Call: (12) lists:append([], [[[v, d], ["b", "a"]]], _G5881) + Unify: (12) lists:append([], [[[v, d], ["b", "a"]]], [[[v, d], ["b", "a"]]]) + Exit: (12) lists:append([], [[[v, d], ["b", "a"]]], [[[v, d], ["b", "a"]]]) + Call: (12) updatevars([], [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]], [[v, d], ["b", "a"]]], [[[v, d], ["b", "a"]]], _G5882) + Unify: (12) updatevars([], [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]], [[v, d], ["b", "a"]]], [[[v, d], ["b", "a"]]], [[[v, d], ["b", "a"]]]) + Exit: (12) updatevars([], [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]], [[v, d], ["b", "a"]]], [[[v, d], ["b", "a"]]], [[[v, d], ["b", "a"]]]) + Exit: (11) updatevars([[[v, d], [v, d]]], [[[v, c], ["b", "a"]], [[v, b], ["c", "a"]], [[v, d], ["b", "a"]]], [], [[[v, d], ["b", "a"]]]) +^ Call: (11) not([[[v, d], ["b", "a"]]]=[]) +^ Unify: (11) not(user: ([[[v, d], ["b", "a"]]]=[])) +^ Exit: (11) not(user: ([[[v, d], ["b", "a"]]]=[])) + Call: (11) unique1([[[v, d], ["b", "a"]]], [], _G18) + Unify: (11) unique1([[[v, d], ["b", "a"]]], [], _G18) + Call: (12) lists:delete([], [[v, d], ["b", "a"]], _G5887) + Unify: (12) lists:delete([], [[v, d], ["b", "a"]], []) + Exit: (12) lists:delete([], [[v, d], ["b", "a"]], []) + Call: (12) lists:append([], [[[v, d], ["b", "a"]]], _G5890) + Unify: (12) lists:append([], [[[v, d], ["b", "a"]]], [[[v, d], ["b", "a"]]]) + Exit: (12) lists:append([], [[[v, d], ["b", "a"]]], [[[v, d], ["b", "a"]]]) + Call: (12) unique1([], [[[v, d], ["b", "a"]]], _G18) + Unify: (12) unique1([], [[[v, d], ["b", "a"]]], [[[v, d], ["b", "a"]]]) + Exit: (12) unique1([], [[[v, d], ["b", "a"]]], [[[v, d], ["b", "a"]]]) + Exit: (11) unique1([[[v, d], ["b", "a"]]], [], [[[v, d], ["b", "a"]]]) + Call: (11) findresult3([["b", "a"], 
["c", "a"], [v, d]], [[[v, d], ["b", "a"]]], [], _G5891) + Unify: (11) findresult3([["b", "a"], ["c", "a"], [v, d]], [[[v, d], ["b", "a"]]], [], _G5891) + Call: (12) [["b", "a"], ["c", "a"], [v, d]]=[_G5881|_G5882] + Exit: (12) [["b", "a"], ["c", "a"], [v, d]]=[["b", "a"], ["c", "a"], [v, d]] + Call: (12) expressionnotatom(["b", "a"]) + Unify: (12) expressionnotatom(["b", "a"]) + Call: (13) isvalstrempty(["b", "a"]) + Unify: (13) isvalstrempty(["b", "a"]) + Call: (14) var(["b", "a"]) + Fail: (14) var(["b", "a"]) + Redo: (13) isvalstrempty(["b", "a"]) + Unify: (13) isvalstrempty(["b", "a"]) + Call: (14) isval(["b", "a"]) + Unify: (14) isval(["b", "a"]) + Call: (15) number(["b", "a"]) + Fail: (15) number(["b", "a"]) + Fail: (14) isval(["b", "a"]) + Redo: (13) isvalstrempty(["b", "a"]) + Unify: (13) isvalstrempty(["b", "a"]) + Call: (14) string(["b", "a"]) + Fail: (14) string(["b", "a"]) + Fail: (13) isvalstrempty(["b", "a"]) + Redo: (12) expressionnotatom(["b", "a"]) + Unify: (12) expressionnotatom(["b", "a"]) +^ Call: (13) not(atom(["b", "a"])) +^ Unify: (13) not(user:atom(["b", "a"])) +^ Exit: (13) not(user:atom(["b", "a"])) + Call: (13) length(["b", "a"], _G5897) + Unify: (13) length(["b", "a"], _G5897) + Exit: (13) length(["b", "a"], 2) + Call: (13) 2>=1 + Exit: (13) 2>=1 + Call: (13) expressionnotatom2(["b", "a"]) + Unify: (13) expressionnotatom2(["b", "a"]) + Call: (14) isvalstrempty("b") + Unify: (14) isvalstrempty("b") + Call: (15) var("b") + Fail: (15) var("b") + Redo: (14) isvalstrempty("b") + Unify: (14) isvalstrempty("b") + Call: (15) isval("b") + Unify: (15) isval("b") + Call: (16) number("b") + Fail: (16) number("b") + Fail: (15) isval("b") + Redo: (14) isvalstrempty("b") + Unify: (14) isvalstrempty("b") + Call: (15) string("b") + Exit: (15) string("b") + Exit: (14) isvalstrempty("b") + Call: (14) expressionnotatom2(["a"]) + Unify: (14) expressionnotatom2(["a"]) + Call: (15) isvalstrempty("a") + Unify: (15) isvalstrempty("a") + Call: (16) var("a") + 
Fail: (16) var("a") + Redo: (15) isvalstrempty("a") + Unify: (15) isvalstrempty("a") + Call: (16) isval("a") + Unify: (16) isval("a") + Call: (17) number("a") + Fail: (17) number("a") + Fail: (16) isval("a") + Redo: (15) isvalstrempty("a") + Unify: (15) isvalstrempty("a") + Call: (16) string("a") + Exit: (16) string("a") + Exit: (15) isvalstrempty("a") + Call: (15) expressionnotatom2([]) + Unify: (15) expressionnotatom2([]) + Exit: (15) expressionnotatom2([]) + Exit: (14) expressionnotatom2(["a"]) + Exit: (13) expressionnotatom2(["b", "a"]) + Exit: (12) expressionnotatom(["b", "a"]) + Call: (12) lists:append([], [["b", "a"]], _G5901) + Unify: (12) lists:append([], [["b", "a"]], [["b", "a"]]) + Exit: (12) lists:append([], [["b", "a"]], [["b", "a"]]) + Call: (12) findresult3([["c", "a"], [v, d]], [[[v, d], ["b", "a"]]], [["b", "a"]], _G5902) + Unify: (12) findresult3([["c", "a"], [v, d]], [[[v, d], ["b", "a"]]], [["b", "a"]], _G5902) + Call: (13) [["c", "a"], [v, d]]=[_G5892|_G5893] + Exit: (13) [["c", "a"], [v, d]]=[["c", "a"], [v, d]] + Call: (13) expressionnotatom(["c", "a"]) + Unify: (13) expressionnotatom(["c", "a"]) + Call: (14) isvalstrempty(["c", "a"]) + Unify: (14) isvalstrempty(["c", "a"]) + Call: (15) var(["c", "a"]) + Fail: (15) var(["c", "a"]) + Redo: (14) isvalstrempty(["c", "a"]) + Unify: (14) isvalstrempty(["c", "a"]) + Call: (15) isval(["c", "a"]) + Unify: (15) isval(["c", "a"]) + Call: (16) number(["c", "a"]) + Fail: (16) number(["c", "a"]) + Fail: (15) isval(["c", "a"]) + Redo: (14) isvalstrempty(["c", "a"]) + Unify: (14) isvalstrempty(["c", "a"]) + Call: (15) string(["c", "a"]) + Fail: (15) string(["c", "a"]) + Fail: (14) isvalstrempty(["c", "a"]) + Redo: (13) expressionnotatom(["c", "a"]) + Unify: (13) expressionnotatom(["c", "a"]) +^ Call: (14) not(atom(["c", "a"])) +^ Unify: (14) not(user:atom(["c", "a"])) +^ Exit: (14) not(user:atom(["c", "a"])) + Call: (14) length(["c", "a"], _G5908) + Unify: (14) length(["c", "a"], _G5908) + Exit: (14) 
length(["c", "a"], 2) + Call: (14) 2>=1 + Exit: (14) 2>=1 + Call: (14) expressionnotatom2(["c", "a"]) + Unify: (14) expressionnotatom2(["c", "a"]) + Call: (15) isvalstrempty("c") + Unify: (15) isvalstrempty("c") + Call: (16) var("c") + Fail: (16) var("c") + Redo: (15) isvalstrempty("c") + Unify: (15) isvalstrempty("c") + Call: (16) isval("c") + Unify: (16) isval("c") + Call: (17) number("c") + Fail: (17) number("c") + Fail: (16) isval("c") + Redo: (15) isvalstrempty("c") + Unify: (15) isvalstrempty("c") + Call: (16) string("c") + Exit: (16) string("c") + Exit: (15) isvalstrempty("c") + Call: (15) expressionnotatom2(["a"]) + Unify: (15) expressionnotatom2(["a"]) + Call: (16) isvalstrempty("a") + Unify: (16) isvalstrempty("a") + Call: (17) var("a") + Fail: (17) var("a") + Redo: (16) isvalstrempty("a") + Unify: (16) isvalstrempty("a") + Call: (17) isval("a") + Unify: (17) isval("a") + Call: (18) number("a") + Fail: (18) number("a") + Fail: (17) isval("a") + Redo: (16) isvalstrempty("a") + Unify: (16) isvalstrempty("a") + Call: (17) string("a") + Exit: (17) string("a") + Exit: (16) isvalstrempty("a") + Call: (16) expressionnotatom2([]) + Unify: (16) expressionnotatom2([]) + Exit: (16) expressionnotatom2([]) + Exit: (15) expressionnotatom2(["a"]) + Exit: (14) expressionnotatom2(["c", "a"]) + Exit: (13) expressionnotatom(["c", "a"]) + Call: (13) lists:append([["b", "a"]], [["c", "a"]], _G5912) + Unify: (13) lists:append([["b", "a"]], [["c", "a"]], [["b", "a"]|_G5904]) + Exit: (13) lists:append([["b", "a"]], [["c", "a"]], [["b", "a"], ["c", "a"]]) + Call: (13) findresult3([[v, d]], [[[v, d], ["b", "a"]]], [["b", "a"], ["c", "a"]], _G5916) + Unify: (13) findresult3([[v, d]], [[[v, d], ["b", "a"]]], [["b", "a"], ["c", "a"]], _G5916) + Call: (14) [[v, d]]=[_G5906|_G5907] + Exit: (14) [[v, d]]=[[v, d]] + Call: (14) expressionnotatom([v, d]) + Unify: (14) expressionnotatom([v, d]) + Call: (15) isvalstrempty([v, d]) + Unify: (15) isvalstrempty([v, d]) + Call: (16) var([v, d]) + 
Fail: (16) var([v, d]) + Redo: (15) isvalstrempty([v, d]) + Unify: (15) isvalstrempty([v, d]) + Call: (16) isval([v, d]) + Unify: (16) isval([v, d]) + Call: (17) number([v, d]) + Fail: (17) number([v, d]) + Fail: (16) isval([v, d]) + Redo: (15) isvalstrempty([v, d]) + Unify: (15) isvalstrempty([v, d]) + Call: (16) string([v, d]) + Fail: (16) string([v, d]) + Fail: (15) isvalstrempty([v, d]) + Redo: (14) expressionnotatom([v, d]) + Unify: (14) expressionnotatom([v, d]) +^ Call: (15) not(atom([v, d])) +^ Unify: (15) not(user:atom([v, d])) +^ Exit: (15) not(user:atom([v, d])) + Call: (15) length([v, d], _G5922) + Unify: (15) length([v, d], _G5922) + Exit: (15) length([v, d], 2) + Call: (15) 2>=1 + Exit: (15) 2>=1 + Call: (15) expressionnotatom2([v, d]) + Unify: (15) expressionnotatom2([v, d]) + Call: (16) isvalstrempty(v) + Unify: (16) isvalstrempty(v) + Call: (17) var(v) + Fail: (17) var(v) + Redo: (16) isvalstrempty(v) + Unify: (16) isvalstrempty(v) + Call: (17) isval(v) + Unify: (17) isval(v) + Call: (18) number(v) + Fail: (18) number(v) + Fail: (17) isval(v) + Redo: (16) isvalstrempty(v) + Unify: (16) isvalstrempty(v) + Call: (17) string(v) + Fail: (17) string(v) + Fail: (16) isvalstrempty(v) + Fail: (15) expressionnotatom2([v, d]) + Redo: (14) expressionnotatom([v, d]) + Unify: (14) expressionnotatom([v, d]) + Call: (15) predicate_or_rule_name([v, d]) + Fail: (15) predicate_or_rule_name([v, d]) + Fail: (14) expressionnotatom([v, d]) + Redo: (13) findresult3([[v, d]], [[[v, d], ["b", "a"]]], [["b", "a"], ["c", "a"]], _G5916) + Unify: (13) findresult3([[v, d]], [[[v, d], ["b", "a"]]], [["b", "a"], ["c", "a"]], _G5916) + Call: (14) [[v, d]]=[_G5906|_G5907] + Exit: (14) [[v, d]]=[[v, d]] + Call: (14) isvar([v, d]) + Unify: (14) isvar([v, d]) + Exit: (14) isvar([v, d]) + Call: (14) lists:member([[v, d], _G5912], [[[v, d], ["b", "a"]]]) + Unify: (14) lists:member([[v, d], _G5912], [[[v, d], ["b", "a"]]]) + Exit: (14) lists:member([[v, d], ["b", "a"]], [[[v, d], ["b", 
"a"]]]) + Call: (14) lists:append([["b", "a"], ["c", "a"]], [["b", "a"]], _G5927) + Unify: (14) lists:append([["b", "a"], ["c", "a"]], [["b", "a"]], [["b", "a"]|_G5919]) + Exit: (14) lists:append([["b", "a"], ["c", "a"]], [["b", "a"]], [["b", "a"], ["c", "a"], ["b", "a"]]) + Call: (14) findresult3([], [[[v, d], ["b", "a"]]], [["b", "a"], ["c", "a"], ["b", "a"]], _G5934) + Unify: (14) findresult3([], [[[v, d], ["b", "a"]]], [["b", "a"], ["c", "a"], ["b", "a"]], [["b", "a"], ["c", "a"], ["b", "a"]]) + Exit: (14) findresult3([], [[[v, d], ["b", "a"]]], [["b", "a"], ["c", "a"], ["b", "a"]], [["b", "a"], ["c", "a"], ["b", "a"]]) + Exit: (13) findresult3([[v, d]], [[[v, d], ["b", "a"]]], [["b", "a"], ["c", "a"]], [["b", "a"], ["c", "a"], ["b", "a"]]) + Exit: (12) findresult3([["c", "a"], [v, d]], [[[v, d], ["b", "a"]]], [["b", "a"]], [["b", "a"], ["c", "a"], ["b", "a"]]) + Exit: (11) findresult3([["b", "a"], ["c", "a"], [v, d]], [[[v, d], ["b", "a"]]], [], [["b", "a"], ["c", "a"], ["b", "a"]]) + Call: (11) debug(on) + Fail: (11) debug(on) + Redo: (10) member1([[n, f], [["b", "a"], ["c", "a"], [v, d]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, d], ["b", "a"]]]) + Call: (11) true + Exit: (11) true + Call: (11) true + Exit: (11) true + Exit: (10) member1([[n, f], [["b", "a"], ["c", "a"], [v, d]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, d], ["b", "a"]]]) + Exit: (9) interpret1(off, [[n, f], [["b", "a"], ["c", "a"], [v, d]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, d], ["b", "a"]]]) + Exit: (8) interpret(off, [[n, f], [["b", "a"], ["c", "a"], [v, d]]], [[[n, f], [[v, c], [v, b], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[v, d], ["b", "a"]]]) + diff --git 
a/mindreader/file-songs.txt b/mindreader/file-songs.txt new file mode 100644 index 0000000000000000000000000000000000000000..1275e30b5f152bfad6e36eb6441dce46102b4895 --- /dev/null +++ b/mindreader/file-songs.txt @@ -0,0 +1,12892 @@ +format=1 tracks=6 division=384 + +BA 1 CR 0 TR 0 CH 16 Text type 2: "Produced by Mind Reading Music Composer by Lucian Academy" +BA 1 CR 0 TR 0 CH 16 Text type 3: "Wilb el writes on the mir ac le." +BA 1 CR 0 TR 0 CH 1 Channel volume 127 +BA 1 CR 0 TR 0 CH 16 Tempo 63.00009 +BA 1 CR 0 TR 1 CH 1 Instrument 1 +BA 1 CR 0 TR 1 CH 1 NT R 0 von=127 voff=0 +BA 5 CR 0 TR 1 CH 1 NT E- 1/2 voff=0 +BA 5 CR 1/2 TR 1 CH 1 NT E- 1/2 voff=0 +BA 5 CR 1 TR 1 CH 1 NT C- 1/2 voff=0 +BA 5 CR 1+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 5 CR 2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 5 CR 2+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 5 CR 3 TR 1 CH 1 NT C- 1/2 voff=0 +BA 5 CR 3+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 6 CR 0 TR 1 CH 1 NT F- 1/2 voff=0 +BA 6 CR 1/2 TR 1 CH 1 NT F- 1/2 voff=0 +BA 6 CR 1 TR 1 CH 1 NT C- 1/2 voff=0 +BA 6 CR 1+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 6 CR 2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 6 CR 2+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 6 CR 3 TR 1 CH 1 NT C- 1/2 voff=0 +BA 7 CR 0 TR 1 CH 1 NT E- 1/2 voff=0 +BA 7 CR 1/2 TR 1 CH 1 NT E- 1/2 voff=0 +BA 7 CR 1 TR 1 CH 1 NT C- 1/2 voff=0 +BA 7 CR 1+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 7 CR 2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 8 CR 0 TR 1 CH 1 NT F- 1/2 voff=0 +BA 8 CR 1/2 TR 1 CH 1 NT F- 1/2 voff=0 +BA 8 CR 1 TR 1 CH 1 NT C- 1/2 voff=0 +BA 8 CR 1+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 8 CR 2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 8 CR 2+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 13 CR 0 TR 1 CH 1 NT E- 1/2 voff=0 +BA 13 CR 1/2 TR 1 CH 1 NT E- 1/2 voff=0 +BA 13 CR 1 TR 1 CH 1 NT C- 1/2 voff=0 +BA 13 CR 1+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 13 CR 2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 13 CR 2+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 14 CR 0 TR 1 CH 1 NT F- 1/2 voff=0 +BA 14 CR 1/2 TR 1 CH 1 NT F- 1/2 voff=0 +BA 14 CR 1 TR 1 CH 1 NT C- 1/2 voff=0 +BA 14 CR 1+1/2 TR 1 CH 
1 NT C- 1/2 voff=0 +BA 14 CR 2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 14 CR 2+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 15 CR 0 TR 1 CH 1 NT E- 1/2 voff=0 +BA 15 CR 1/2 TR 1 CH 1 NT E- 1/2 voff=0 +BA 15 CR 1 TR 1 CH 1 NT C- 1/2 voff=0 +BA 15 CR 1+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 15 CR 2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 15 CR 2+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 15 CR 3 TR 1 CH 1 NT C- 1/2 voff=0 +BA 16 CR 0 TR 1 CH 1 NT F- 1/2 voff=0 +BA 16 CR 1/2 TR 1 CH 1 NT F- 1/2 voff=0 +BA 16 CR 1 TR 1 CH 1 NT C- 1/2 voff=0 +BA 16 CR 1+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 17 CR 0 TR 1 CH 1 NT A- 1/2 voff=0 +BA 17 CR 1/2 TR 1 CH 1 NT A- 1/2 voff=0 +BA 17 CR 1 TR 1 CH 1 NT E- 1/2 voff=0 +BA 17 CR 1+1/2 TR 1 CH 1 NT E- 1/2 voff=0 +BA 17 CR 2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 17 CR 2+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 18 CR 0 TR 1 CH 1 NT C- 1/2 voff=0 +BA 18 CR 1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 18 CR 1 TR 1 CH 1 NT C- 1/2 voff=0 +BA 18 CR 1+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 18 CR 2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 18 CR 2+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 19 CR 0 TR 1 CH 1 NT A- 1/2 voff=0 +BA 19 CR 1/2 TR 1 CH 1 NT A- 1/2 voff=0 +BA 19 CR 1 TR 1 CH 1 NT E- 1/2 voff=0 +BA 19 CR 1+1/2 TR 1 CH 1 NT E- 1/2 voff=0 +BA 19 CR 2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 19 CR 2+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 20 CR 0 TR 1 CH 1 NT C- 1/2 voff=0 +BA 20 CR 1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 20 CR 1 TR 1 CH 1 NT C- 1/2 voff=0 +BA 20 CR 1+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 20 CR 2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 20 CR 2+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 25 CR 0 TR 1 CH 1 NT E- 1/2 voff=0 +BA 25 CR 1/2 TR 1 CH 1 NT E- 1/2 voff=0 +BA 25 CR 1 TR 1 CH 1 NT E- 1/2 voff=0 +BA 25 CR 1+1/2 TR 1 CH 1 NT E- 1/2 voff=0 +BA 25 CR 2 TR 1 CH 1 NT E- 1/2 voff=0 +BA 25 CR 2+1/2 TR 1 CH 1 NT E- 1/2 voff=0 +BA 26 CR 0 TR 1 CH 1 NT A- 1/2 voff=0 +BA 26 CR 1/2 TR 1 CH 1 NT A- 1/2 voff=0 +BA 26 CR 1 TR 1 CH 1 NT A- 1/2 voff=0 +BA 26 CR 1+1/2 TR 1 CH 1 NT A- 1/2 voff=0 +BA 26 CR 2 TR 1 CH 1 NT A- 1/2 voff=0 +BA 26 CR 2+1/2 TR 1 CH 1 NT A- 
1/2 voff=0 +BA 27 CR 0 TR 1 CH 1 NT E- 1/2 voff=0 +BA 27 CR 1/2 TR 1 CH 1 NT E- 1/2 voff=0 +BA 27 CR 1 TR 1 CH 1 NT E- 1/2 voff=0 +BA 27 CR 1+1/2 TR 1 CH 1 NT E- 1/2 voff=0 +BA 27 CR 2 TR 1 CH 1 NT E- 1/2 voff=0 +BA 28 CR 0 TR 1 CH 1 NT A- 1/2 voff=0 +BA 28 CR 1/2 TR 1 CH 1 NT A- 1/2 voff=0 +BA 28 CR 1 TR 1 CH 1 NT A- 1/2 voff=0 +BA 28 CR 1+1/2 TR 1 CH 1 NT A- 1/2 voff=0 +BA 28 CR 2 TR 1 CH 1 NT A- 1/2 voff=0 +BA 28 CR 2+1/2 TR 1 CH 1 NT A- 1/2 voff=0 +BA 29 CR 0 TR 1 CH 1 NT E- 1/2 voff=0 +BA 29 CR 1/2 TR 1 CH 1 NT E- 1/2 voff=0 +BA 29 CR 1 TR 1 CH 1 NT E- 1/2 voff=0 +BA 29 CR 1+1/2 TR 1 CH 1 NT E- 1/2 voff=0 +BA 29 CR 2 TR 1 CH 1 NT E- 1/2 voff=0 +BA 29 CR 2+1/2 TR 1 CH 1 NT E- 1/2 voff=0 +BA 30 CR 0 TR 1 CH 1 NT A- 1/2 voff=0 +BA 30 CR 1/2 TR 1 CH 1 NT A- 1/2 voff=0 +BA 30 CR 1 TR 1 CH 1 NT A- 1/2 voff=0 +BA 30 CR 1+1/2 TR 1 CH 1 NT A- 1/2 voff=0 +BA 30 CR 2 TR 1 CH 1 NT A- 1/2 voff=0 +BA 30 CR 2+1/2 TR 1 CH 1 NT A- 1/2 voff=0 +BA 31 CR 0 TR 1 CH 1 NT E- 1/2 voff=0 +BA 31 CR 1/2 TR 1 CH 1 NT E- 1/2 voff=0 +BA 31 CR 1 TR 1 CH 1 NT E- 1/2 voff=0 +BA 31 CR 1+1/2 TR 1 CH 1 NT E- 1/2 voff=0 +BA 31 CR 2 TR 1 CH 1 NT E- 1/2 voff=0 +BA 32 CR 0 TR 1 CH 1 NT A- 1/2 voff=0 +BA 32 CR 1/2 TR 1 CH 1 NT A- 1/2 voff=0 +BA 32 CR 1 TR 1 CH 1 NT A- 1/2 voff=0 +BA 32 CR 1+1/2 TR 1 CH 1 NT A- 1/2 voff=0 +BA 32 CR 2 TR 1 CH 1 NT A- 1/2 voff=0 +BA 32 CR 2+1/2 TR 1 CH 1 NT A- 1/2 voff=0 +BA 33 CR 0 TR 1 CH 1 NT E- 1/2 voff=0 +BA 33 CR 1/2 TR 1 CH 1 NT E- 1/2 voff=0 +BA 33 CR 1 TR 1 CH 1 NT E- 1/2 voff=0 +BA 33 CR 1+1/2 TR 1 CH 1 NT E- 1/2 voff=0 +BA 33 CR 2 TR 1 CH 1 NT E- 1/2 voff=0 +BA 33 CR 2+1/2 TR 1 CH 1 NT E- 1/2 voff=0 +BA 34 CR 0 TR 1 CH 1 NT A- 1/2 voff=0 +BA 34 CR 1/2 TR 1 CH 1 NT A- 1/2 voff=0 +BA 34 CR 1 TR 1 CH 1 NT A- 1/2 voff=0 +BA 34 CR 1+1/2 TR 1 CH 1 NT A- 1/2 voff=0 +BA 34 CR 2 TR 1 CH 1 NT A- 1/2 voff=0 +BA 34 CR 2+1/2 TR 1 CH 1 NT A- 1/2 voff=0 +BA 35 CR 0 TR 1 CH 1 NT E- 1/2 voff=0 +BA 35 CR 1/2 TR 1 CH 1 NT E- 1/2 voff=0 +BA 35 CR 1 TR 1 CH 1 NT E- 1/2 voff=0 +BA 
35 CR 1+1/2 TR 1 CH 1 NT E- 1/2 voff=0 +BA 35 CR 2 TR 1 CH 1 NT E- 1/2 voff=0 +BA 36 CR 0 TR 1 CH 1 NT A- 1/2 voff=0 +BA 36 CR 1/2 TR 1 CH 1 NT A- 1/2 voff=0 +BA 36 CR 1 TR 1 CH 1 NT A- 1/2 voff=0 +BA 36 CR 1+1/2 TR 1 CH 1 NT A- 1/2 voff=0 +BA 36 CR 2 TR 1 CH 1 NT A- 1/2 voff=0 +BA 36 CR 2+1/2 TR 1 CH 1 NT A- 1/2 voff=0 +BA 1 CR 0 TR 2 CH 2 Instrument 88 +BA 1 CR 0 TR 2 CH 2 NT R 0 von=127 voff=0 +BA 1 CR 0 TR 2 CH 2 NT A- 1/2 voff=0 +BA 1 CR 1/2 TR 2 CH 2 NT A- 1/2 voff=0 +BA 1 CR 1 TR 2 CH 2 NT F- 1/2 voff=0 +BA 1 CR 1+1/2 TR 2 CH 2 NT F- 1/2 voff=0 +BA 1 CR 2 TR 2 CH 2 NT G- 1/2 voff=0 +BA 1 CR 2+1/2 TR 2 CH 2 NT G- 1/2 voff=0 +BA 1 CR 3 TR 2 CH 2 NT G- 1/2 voff=0 +BA 1 CR 3+1/2 TR 2 CH 2 NT G- 1/2 voff=0 +BA 2 CR 0 TR 2 CH 2 NT C- 1/2 voff=0 +BA 2 CR 1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 2 CR 1 TR 2 CH 2 NT G#- 1/2 voff=0 +BA 2 CR 1+1/2 TR 2 CH 2 NT G#- 1/2 voff=0 +BA 2 CR 2 TR 2 CH 2 NT E- 1/2 voff=0 +BA 2 CR 2+1/2 TR 2 CH 2 NT E- 1/2 voff=0 +BA 2 CR 3 TR 2 CH 2 NT E- 1/2 voff=0 +BA 2 CR 3+1/2 TR 2 CH 2 NT E- 1/2 voff=0 +BA 3 CR 0 TR 2 CH 2 NT A- 1/2 voff=0 +BA 3 CR 1/2 TR 2 CH 2 NT A- 1/2 voff=0 +BA 3 CR 1 TR 2 CH 2 NT F- 1/2 voff=0 +BA 3 CR 1+1/2 TR 2 CH 2 NT F- 1/2 voff=0 +BA 3 CR 2 TR 2 CH 2 NT G- 1/2 voff=0 +BA 3 CR 2+1/2 TR 2 CH 2 NT G- 1/2 voff=0 +BA 3 CR 3 TR 2 CH 2 NT G- 1/2 voff=0 +BA 3 CR 3+1/2 TR 2 CH 2 NT G- 1/2 voff=0 +BA 4 CR 0 TR 2 CH 2 NT C- 1/2 voff=0 +BA 4 CR 1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 4 CR 1 TR 2 CH 2 NT G#- 1/2 voff=0 +BA 4 CR 1+1/2 TR 2 CH 2 NT G#- 1/2 voff=0 +BA 4 CR 2 TR 2 CH 2 NT E- 1/2 voff=0 +BA 4 CR 2+1/2 TR 2 CH 2 NT E- 1/2 voff=0 +BA 4 CR 3 TR 2 CH 2 NT E- 1/2 voff=0 +BA 4 CR 3+1/2 TR 2 CH 2 NT E- 1/2 voff=0 +BA 5 CR 0 TR 2 CH 2 NT E- 1/2 voff=0 +BA 5 CR 1/2 TR 2 CH 2 NT E- 1/2 voff=0 +BA 5 CR 1 TR 2 CH 2 NT C- 1/2 voff=0 +BA 5 CR 1+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 5 CR 2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 5 CR 2+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 5 CR 3 TR 2 CH 2 NT C- 1/2 voff=0 +BA 5 CR 3+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 6 
CR 0 TR 2 CH 2 NT F- 1/2 voff=0 +BA 6 CR 1/2 TR 2 CH 2 NT F- 1/2 voff=0 +BA 6 CR 1 TR 2 CH 2 NT C- 1/2 voff=0 +BA 6 CR 1+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 6 CR 2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 6 CR 2+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 6 CR 3 TR 2 CH 2 NT C- 1/2 voff=0 +BA 6 CR 3+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 7 CR 0 TR 2 CH 2 NT E- 1/2 voff=0 +BA 7 CR 1/2 TR 2 CH 2 NT E- 1/2 voff=0 +BA 7 CR 1 TR 2 CH 2 NT C- 1/2 voff=0 +BA 7 CR 1+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 7 CR 2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 7 CR 2+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 7 CR 3 TR 2 CH 2 NT C- 1/2 voff=0 +BA 7 CR 3+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 8 CR 0 TR 2 CH 2 NT F- 1/2 voff=0 +BA 8 CR 1/2 TR 2 CH 2 NT F- 1/2 voff=0 +BA 8 CR 1 TR 2 CH 2 NT C- 1/2 voff=0 +BA 8 CR 1+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 8 CR 2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 8 CR 2+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 8 CR 3 TR 2 CH 2 NT C- 1/2 voff=0 +BA 8 CR 3+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 9 CR 0 TR 2 CH 2 NT C- 1/2 voff=0 +BA 9 CR 1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 9 CR 1 TR 2 CH 2 NT C- 1/2 voff=0 +BA 9 CR 1+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 9 CR 2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 9 CR 2+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 9 CR 3 TR 2 CH 2 NT C- 1/2 voff=0 +BA 9 CR 3+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 10 CR 0 TR 2 CH 2 NT F- 1/2 voff=0 +BA 10 CR 1/2 TR 2 CH 2 NT F- 1/2 voff=0 +BA 10 CR 1 TR 2 CH 2 NT C#- 1/2 voff=0 +BA 10 CR 1+1/2 TR 2 CH 2 NT C#- 1/2 voff=0 +BA 10 CR 2 TR 2 CH 2 NT A- 1/2 voff=0 +BA 10 CR 2+1/2 TR 2 CH 2 NT A- 1/2 voff=0 +BA 10 CR 3 TR 2 CH 2 NT A- 1/2 voff=0 +BA 10 CR 3+1/2 TR 2 CH 2 NT A- 1/2 voff=0 +BA 11 CR 0 TR 2 CH 2 NT C- 1/2 voff=0 +BA 11 CR 1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 11 CR 1 TR 2 CH 2 NT C- 1/2 voff=0 +BA 11 CR 1+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 11 CR 2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 11 CR 2+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 11 CR 3 TR 2 CH 2 NT C- 1/2 voff=0 +BA 11 CR 3+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 12 CR 0 TR 2 CH 2 NT F- 1/2 voff=0 +BA 12 CR 1/2 TR 2 CH 2 NT F- 1/2 voff=0 
+BA 12 CR 1 TR 2 CH 2 NT C#- 1/2 voff=0 +BA 12 CR 1+1/2 TR 2 CH 2 NT C#- 1/2 voff=0 +BA 12 CR 2 TR 2 CH 2 NT A- 1/2 voff=0 +BA 12 CR 2+1/2 TR 2 CH 2 NT A- 1/2 voff=0 +BA 12 CR 3 TR 2 CH 2 NT A- 1/2 voff=0 +BA 12 CR 3+1/2 TR 2 CH 2 NT A- 1/2 voff=0 +BA 17 CR 0 TR 2 CH 2 NT A- 1/2 voff=0 +BA 17 CR 1/2 TR 2 CH 2 NT A- 1/2 voff=0 +BA 17 CR 1 TR 2 CH 2 NT E- 1/2 voff=0 +BA 17 CR 1+1/2 TR 2 CH 2 NT E- 1/2 voff=0 +BA 17 CR 2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 17 CR 2+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 17 CR 3 TR 2 CH 2 NT C- 1/2 voff=0 +BA 17 CR 3+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 18 CR 0 TR 2 CH 2 NT C- 1/2 voff=0 +BA 18 CR 1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 18 CR 1 TR 2 CH 2 NT C- 1/2 voff=0 +BA 18 CR 1+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 18 CR 2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 18 CR 2+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 18 CR 3 TR 2 CH 2 NT C- 1/2 voff=0 +BA 18 CR 3+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 19 CR 0 TR 2 CH 2 NT A- 1/2 voff=0 +BA 19 CR 1/2 TR 2 CH 2 NT A- 1/2 voff=0 +BA 19 CR 1 TR 2 CH 2 NT E- 1/2 voff=0 +BA 19 CR 1+1/2 TR 2 CH 2 NT E- 1/2 voff=0 +BA 19 CR 2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 19 CR 2+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 19 CR 3 TR 2 CH 2 NT C- 1/2 voff=0 +BA 19 CR 3+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 20 CR 0 TR 2 CH 2 NT C- 1/2 voff=0 +BA 20 CR 1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 20 CR 1 TR 2 CH 2 NT C- 1/2 voff=0 +BA 20 CR 1+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 20 CR 2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 20 CR 2+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 20 CR 3 TR 2 CH 2 NT C- 1/2 voff=0 +BA 20 CR 3+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 1 CR 0 TR 3 CH 3 Instrument 23 +BA 1 CR 0 TR 3 CH 3 NT R 0 von=127 voff=0 +BA 5 CR 0 TR 3 CH 3 NT E- 1/2 voff=0 +BA 5 CR 1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 5 CR 1 TR 3 CH 3 NT C- 1/2 voff=0 +BA 5 CR 1+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 5 CR 2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 5 CR 2+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 5 CR 3 TR 3 CH 3 NT C- 1/2 voff=0 +BA 5 CR 3+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 6 CR 0 TR 3 CH 3 NT F- 1/2 voff=0 +BA 6 CR 1/2 
TR 3 CH 3 NT F- 1/2 voff=0 +BA 6 CR 1 TR 3 CH 3 NT C- 1/2 voff=0 +BA 6 CR 1+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 6 CR 2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 6 CR 2+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 6 CR 3 TR 3 CH 3 NT C- 1/2 voff=0 +BA 6 CR 3+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 7 CR 0 TR 3 CH 3 NT E- 1/2 voff=0 +BA 7 CR 1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 7 CR 1 TR 3 CH 3 NT C- 1/2 voff=0 +BA 7 CR 1+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 7 CR 2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 7 CR 2+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 7 CR 3 TR 3 CH 3 NT C- 1/2 voff=0 +BA 7 CR 3+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 8 CR 0 TR 3 CH 3 NT F- 1/2 voff=0 +BA 8 CR 1/2 TR 3 CH 3 NT F- 1/2 voff=0 +BA 8 CR 1 TR 3 CH 3 NT C- 1/2 voff=0 +BA 8 CR 1+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 8 CR 2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 8 CR 2+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 8 CR 3 TR 3 CH 3 NT C- 1/2 voff=0 +BA 8 CR 3+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 9 CR 0 TR 3 CH 3 NT C- 1/2 voff=0 +BA 9 CR 1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 9 CR 1 TR 3 CH 3 NT C- 1/2 voff=0 +BA 9 CR 1+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 9 CR 2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 9 CR 2+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 9 CR 3 TR 3 CH 3 NT C- 1/2 voff=0 +BA 9 CR 3+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 10 CR 0 TR 3 CH 3 NT F- 1/2 voff=0 +BA 10 CR 1/2 TR 3 CH 3 NT F- 1/2 voff=0 +BA 10 CR 1 TR 3 CH 3 NT C#- 1/2 voff=0 +BA 10 CR 1+1/2 TR 3 CH 3 NT C#- 1/2 voff=0 +BA 10 CR 2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 10 CR 2+1/2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 10 CR 3 TR 3 CH 3 NT A- 1/2 voff=0 +BA 10 CR 3+1/2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 11 CR 0 TR 3 CH 3 NT C- 1/2 voff=0 +BA 11 CR 1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 11 CR 1 TR 3 CH 3 NT C- 1/2 voff=0 +BA 11 CR 1+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 11 CR 2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 11 CR 2+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 11 CR 3 TR 3 CH 3 NT C- 1/2 voff=0 +BA 11 CR 3+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 12 CR 0 TR 3 CH 3 NT F- 1/2 voff=0 +BA 12 CR 1/2 TR 3 CH 3 NT F- 1/2 voff=0 +BA 12 CR 1 TR 3 CH 3 NT C#- 1/2 voff=0 +BA 
12 CR 1+1/2 TR 3 CH 3 NT C#- 1/2 voff=0 +BA 12 CR 2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 12 CR 2+1/2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 12 CR 3 TR 3 CH 3 NT A- 1/2 voff=0 +BA 12 CR 3+1/2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 13 CR 0 TR 3 CH 3 NT E- 1/2 voff=0 +BA 13 CR 1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 13 CR 1 TR 3 CH 3 NT C- 1/2 voff=0 +BA 13 CR 1+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 13 CR 2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 13 CR 2+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 13 CR 3 TR 3 CH 3 NT C- 1/2 voff=0 +BA 13 CR 3+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 14 CR 0 TR 3 CH 3 NT F- 1/2 voff=0 +BA 14 CR 1/2 TR 3 CH 3 NT F- 1/2 voff=0 +BA 14 CR 1 TR 3 CH 3 NT C- 1/2 voff=0 +BA 14 CR 1+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 14 CR 2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 14 CR 2+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 14 CR 3 TR 3 CH 3 NT C- 1/2 voff=0 +BA 14 CR 3+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 15 CR 0 TR 3 CH 3 NT E- 1/2 voff=0 +BA 15 CR 1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 15 CR 1 TR 3 CH 3 NT C- 1/2 voff=0 +BA 15 CR 1+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 15 CR 2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 15 CR 2+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 15 CR 3 TR 3 CH 3 NT C- 1/2 voff=0 +BA 15 CR 3+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 16 CR 0 TR 3 CH 3 NT F- 1/2 voff=0 +BA 16 CR 1/2 TR 3 CH 3 NT F- 1/2 voff=0 +BA 16 CR 1 TR 3 CH 3 NT C- 1/2 voff=0 +BA 16 CR 1+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 16 CR 2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 16 CR 2+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 16 CR 3 TR 3 CH 3 NT C- 1/2 voff=0 +BA 16 CR 3+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 17 CR 0 TR 3 CH 3 NT A- 1/2 voff=0 +BA 17 CR 1/2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 17 CR 1 TR 3 CH 3 NT E- 1/2 voff=0 +BA 17 CR 1+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 17 CR 2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 17 CR 2+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 17 CR 3 TR 3 CH 3 NT C- 1/2 voff=0 +BA 17 CR 3+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 18 CR 0 TR 3 CH 3 NT C- 1/2 voff=0 +BA 18 CR 1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 18 CR 1 TR 3 CH 3 NT C- 1/2 voff=0 +BA 18 CR 1+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 
18 CR 2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 18 CR 2+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 18 CR 3 TR 3 CH 3 NT C- 1/2 voff=0 +BA 18 CR 3+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 19 CR 0 TR 3 CH 3 NT A- 1/2 voff=0 +BA 19 CR 1/2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 19 CR 1 TR 3 CH 3 NT E- 1/2 voff=0 +BA 19 CR 1+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 19 CR 2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 19 CR 2+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 19 CR 3 TR 3 CH 3 NT C- 1/2 voff=0 +BA 19 CR 3+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 20 CR 0 TR 3 CH 3 NT C- 1/2 voff=0 +BA 20 CR 1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 20 CR 1 TR 3 CH 3 NT C- 1/2 voff=0 +BA 20 CR 1+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 20 CR 2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 20 CR 2+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 20 CR 3 TR 3 CH 3 NT C- 1/2 voff=0 +BA 20 CR 3+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 1 CR 0 TR 4 CH 4 Instrument 81 +BA 1 CR 0 TR 4 CH 4 NT R 0 von=127 voff=0 +BA 13 CR 0 TR 4 CH 4 NT E- 1/2 voff=0 +BA 13 CR 1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 13 CR 1 TR 4 CH 4 NT C- 1/2 voff=0 +BA 13 CR 1+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 13 CR 2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 13 CR 2+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 13 CR 3 TR 4 CH 4 NT C- 1/2 voff=0 +BA 13 CR 3+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 14 CR 0 TR 4 CH 4 NT F- 1/2 voff=0 +BA 14 CR 1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 14 CR 1 TR 4 CH 4 NT C- 1/2 voff=0 +BA 14 CR 1+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 14 CR 2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 14 CR 2+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 14 CR 3 TR 4 CH 4 NT C- 1/2 voff=0 +BA 14 CR 3+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 15 CR 0 TR 4 CH 4 NT E- 1/2 voff=0 +BA 15 CR 1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 15 CR 1 TR 4 CH 4 NT C- 1/2 voff=0 +BA 15 CR 1+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 15 CR 2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 15 CR 2+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 15 CR 3 TR 4 CH 4 NT C- 1/2 voff=0 +BA 15 CR 3+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 16 CR 0 TR 4 CH 4 NT F- 1/2 voff=0 +BA 16 CR 1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 16 CR 1 TR 4 CH 4 NT C- 1/2 voff=0 +BA 16 CR 
1+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 16 CR 2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 16 CR 2+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 16 CR 3 TR 4 CH 4 NT C- 1/2 voff=0 +BA 16 CR 3+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 17 CR 0 TR 4 CH 4 NT A- 1/2 voff=0 +BA 17 CR 1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 17 CR 1 TR 4 CH 4 NT E- 1/2 voff=0 +BA 17 CR 1+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 17 CR 2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 17 CR 2+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 17 CR 3 TR 4 CH 4 NT C- 1/2 voff=0 +BA 17 CR 3+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 18 CR 0 TR 4 CH 4 NT C- 1/2 voff=0 +BA 18 CR 1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 18 CR 1 TR 4 CH 4 NT C- 1/2 voff=0 +BA 18 CR 1+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 18 CR 2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 18 CR 2+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 18 CR 3 TR 4 CH 4 NT C- 1/2 voff=0 +BA 18 CR 3+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 19 CR 0 TR 4 CH 4 NT A- 1/2 voff=0 +BA 19 CR 1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 19 CR 1 TR 4 CH 4 NT E- 1/2 voff=0 +BA 19 CR 1+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 19 CR 2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 19 CR 2+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 19 CR 3 TR 4 CH 4 NT C- 1/2 voff=0 +BA 19 CR 3+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 20 CR 0 TR 4 CH 4 NT C- 1/2 voff=0 +BA 20 CR 1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 20 CR 1 TR 4 CH 4 NT C- 1/2 voff=0 +BA 20 CR 1+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 20 CR 2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 20 CR 2+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 20 CR 3 TR 4 CH 4 NT C- 1/2 voff=0 +BA 20 CR 3+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 25 CR 0 TR 4 CH 4 NT E- 1/2 voff=0 +BA 25 CR 1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 25 CR 1 TR 4 CH 4 NT E- 1/2 voff=0 +BA 25 CR 1+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 25 CR 2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 25 CR 2+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 25 CR 3 TR 4 CH 4 NT E- 1/2 voff=0 +BA 25 CR 3+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 26 CR 0 TR 4 CH 4 NT A- 1/2 voff=0 +BA 26 CR 1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 26 CR 1 TR 4 CH 4 NT A- 1/2 voff=0 +BA 26 CR 1+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 26 CR 
2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 26 CR 2+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 26 CR 3 TR 4 CH 4 NT A- 1/2 voff=0 +BA 26 CR 3+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 27 CR 0 TR 4 CH 4 NT E- 1/2 voff=0 +BA 27 CR 1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 27 CR 1 TR 4 CH 4 NT E- 1/2 voff=0 +BA 27 CR 1+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 27 CR 2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 27 CR 2+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 27 CR 3 TR 4 CH 4 NT E- 1/2 voff=0 +BA 27 CR 3+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 28 CR 0 TR 4 CH 4 NT A- 1/2 voff=0 +BA 28 CR 1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 28 CR 1 TR 4 CH 4 NT A- 1/2 voff=0 +BA 28 CR 1+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 28 CR 2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 28 CR 2+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 28 CR 3 TR 4 CH 4 NT A- 1/2 voff=0 +BA 28 CR 3+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 29 CR 0 TR 4 CH 4 NT E- 1/2 voff=0 +BA 29 CR 1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 29 CR 1 TR 4 CH 4 NT E- 1/2 voff=0 +BA 29 CR 1+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 29 CR 2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 29 CR 2+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 29 CR 3 TR 4 CH 4 NT E- 1/2 voff=0 +BA 29 CR 3+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 30 CR 0 TR 4 CH 4 NT A- 1/2 voff=0 +BA 30 CR 1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 30 CR 1 TR 4 CH 4 NT A- 1/2 voff=0 +BA 30 CR 1+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 30 CR 2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 30 CR 2+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 30 CR 3 TR 4 CH 4 NT A- 1/2 voff=0 +BA 30 CR 3+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 31 CR 0 TR 4 CH 4 NT E- 1/2 voff=0 +BA 31 CR 1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 31 CR 1 TR 4 CH 4 NT E- 1/2 voff=0 +BA 31 CR 1+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 31 CR 2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 31 CR 2+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 31 CR 3 TR 4 CH 4 NT E- 1/2 voff=0 +BA 31 CR 3+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 32 CR 0 TR 4 CH 4 NT A- 1/2 voff=0 +BA 32 CR 1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 32 CR 1 TR 4 CH 4 NT A- 1/2 voff=0 +BA 32 CR 1+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 32 CR 2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 32 CR 
2+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 32 CR 3 TR 4 CH 4 NT A- 1/2 voff=0 +BA 32 CR 3+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 33 CR 0 TR 4 CH 4 NT E- 1/2 voff=0 +BA 33 CR 1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 33 CR 1 TR 4 CH 4 NT E- 1/2 voff=0 +BA 33 CR 1+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 33 CR 2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 33 CR 2+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 33 CR 3 TR 4 CH 4 NT E- 1/2 voff=0 +BA 33 CR 3+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 34 CR 0 TR 4 CH 4 NT A- 1/2 voff=0 +BA 34 CR 1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 34 CR 1 TR 4 CH 4 NT A- 1/2 voff=0 +BA 34 CR 1+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 34 CR 2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 34 CR 2+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 34 CR 3 TR 4 CH 4 NT A- 1/2 voff=0 +BA 34 CR 3+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 35 CR 0 TR 4 CH 4 NT E- 1/2 voff=0 +BA 35 CR 1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 35 CR 1 TR 4 CH 4 NT E- 1/2 voff=0 +BA 35 CR 1+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 35 CR 2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 35 CR 2+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 35 CR 3 TR 4 CH 4 NT E- 1/2 voff=0 +BA 35 CR 3+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 36 CR 0 TR 4 CH 4 NT A- 1/2 voff=0 +BA 36 CR 1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 36 CR 1 TR 4 CH 4 NT A- 1/2 voff=0 +BA 36 CR 1+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 36 CR 2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 36 CR 2+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 36 CR 3 TR 4 CH 4 NT A- 1/2 voff=0 +BA 36 CR 3+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 1 CR 0 TR 5 CH 5 Instrument 20 +BA 1 CR 0 TR 5 CH 5 NT R 0 von=127 voff=0 +BA 5 CR 0 TR 5 CH 5 NT E- 1/2 voff=0 +BA 5 CR 0 TR 5 CH 5 NT G- 1/2 voff=0 +BA 5 CR 0 TR 5 CH 5 NT B- 1/2 voff=0 +BA 5 CR 1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 5 CR 1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 5 CR 1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 5 CR 1 TR 5 CH 5 NT C- 1/2 voff=0 +BA 5 CR 1 TR 5 CH 5 NT E- 1/2 voff=0 +BA 5 CR 1 TR 5 CH 5 NT G- 1/2 voff=0 +BA 5 CR 1+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 5 CR 1+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 5 CR 1+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 5 CR 2 TR 5 CH 5 NT C- 
1/2 voff=0 +BA 5 CR 2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 5 CR 2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 5 CR 2+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 5 CR 2+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 5 CR 2+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 5 CR 3 TR 5 CH 5 NT C- 1/2 voff=0 +BA 5 CR 3 TR 5 CH 5 NT E- 1/2 voff=0 +BA 5 CR 3 TR 5 CH 5 NT G- 1/2 voff=0 +BA 5 CR 3+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 5 CR 3+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 5 CR 3+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 6 CR 0 TR 5 CH 5 NT F- 1/2 voff=0 +BA 6 CR 0 TR 5 CH 5 NT A- 1/2 voff=0 +BA 6 CR 0 TR 5 CH 5 NT C- 1/2 voff=0 +BA 6 CR 1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 6 CR 1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 6 CR 1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 6 CR 1 TR 5 CH 5 NT C- 1/2 voff=0 +BA 6 CR 1 TR 5 CH 5 NT E- 1/2 voff=0 +BA 6 CR 1 TR 5 CH 5 NT G- 1/2 voff=0 +BA 6 CR 1+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 6 CR 1+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 6 CR 1+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 6 CR 2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 6 CR 2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 6 CR 2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 6 CR 2+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 6 CR 2+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 6 CR 2+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 6 CR 3 TR 5 CH 5 NT C- 1/2 voff=0 +BA 6 CR 3 TR 5 CH 5 NT E- 1/2 voff=0 +BA 6 CR 3 TR 5 CH 5 NT G- 1/2 voff=0 +BA 6 CR 3+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 6 CR 3+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 6 CR 3+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 7 CR 0 TR 5 CH 5 NT E- 1/2 voff=0 +BA 7 CR 0 TR 5 CH 5 NT G- 1/2 voff=0 +BA 7 CR 0 TR 5 CH 5 NT B- 1/2 voff=0 +BA 7 CR 1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 7 CR 1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 7 CR 1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 7 CR 1 TR 5 CH 5 NT C- 1/2 voff=0 +BA 7 CR 1 TR 5 CH 5 NT E- 1/2 voff=0 +BA 7 CR 1 TR 5 CH 5 NT G- 1/2 voff=0 +BA 7 CR 1+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 7 CR 1+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 7 CR 1+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 7 CR 2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 7 CR 2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 7 CR 2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 
7 CR 2+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 7 CR 2+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 7 CR 2+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 7 CR 3 TR 5 CH 5 NT C- 1/2 voff=0 +BA 7 CR 3 TR 5 CH 5 NT E- 1/2 voff=0 +BA 7 CR 3 TR 5 CH 5 NT G- 1/2 voff=0 +BA 7 CR 3+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 7 CR 3+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 7 CR 3+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 8 CR 0 TR 5 CH 5 NT F- 1/2 voff=0 +BA 8 CR 0 TR 5 CH 5 NT A- 1/2 voff=0 +BA 8 CR 0 TR 5 CH 5 NT C- 1/2 voff=0 +BA 8 CR 1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 8 CR 1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 8 CR 1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 8 CR 1 TR 5 CH 5 NT C- 1/2 voff=0 +BA 8 CR 1 TR 5 CH 5 NT E- 1/2 voff=0 +BA 8 CR 1 TR 5 CH 5 NT G- 1/2 voff=0 +BA 8 CR 1+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 8 CR 1+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 8 CR 1+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 8 CR 2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 8 CR 2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 8 CR 2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 8 CR 2+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 8 CR 2+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 8 CR 2+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 8 CR 3 TR 5 CH 5 NT C- 1/2 voff=0 +BA 8 CR 3 TR 5 CH 5 NT E- 1/2 voff=0 +BA 8 CR 3 TR 5 CH 5 NT G- 1/2 voff=0 +BA 8 CR 3+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 8 CR 3+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 8 CR 3+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 9 CR 0 TR 5 CH 5 NT C- 1/2 voff=0 +BA 9 CR 0 TR 5 CH 5 NT E- 1/2 voff=0 +BA 9 CR 0 TR 5 CH 5 NT G- 1/2 voff=0 +BA 9 CR 1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 9 CR 1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 9 CR 1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 9 CR 1 TR 5 CH 5 NT C- 1/2 voff=0 +BA 9 CR 1 TR 5 CH 5 NT E- 1/2 voff=0 +BA 9 CR 1 TR 5 CH 5 NT G- 1/2 voff=0 +BA 9 CR 1+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 9 CR 1+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 9 CR 1+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 9 CR 2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 9 CR 2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 9 CR 2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 9 CR 2+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 9 CR 2+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 9 CR 
2+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 9 CR 3 TR 5 CH 5 NT C- 1/2 voff=0 +BA 9 CR 3 TR 5 CH 5 NT E- 1/2 voff=0 +BA 9 CR 3 TR 5 CH 5 NT G- 1/2 voff=0 +BA 9 CR 3+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 9 CR 3+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 9 CR 3+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 10 CR 0 TR 5 CH 5 NT F- 1/2 voff=0 +BA 10 CR 0 TR 5 CH 5 NT A- 1/2 voff=0 +BA 10 CR 0 TR 5 CH 5 NT C- 1/2 voff=0 +BA 10 CR 1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 10 CR 1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 10 CR 1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 10 CR 1 TR 5 CH 5 NT C#- 1/2 voff=0 +BA 10 CR 1 TR 5 CH 5 NT F- 1/2 voff=0 +BA 10 CR 1 TR 5 CH 5 NT G- 1/2 voff=0 +BA 10 CR 1+1/2 TR 5 CH 5 NT C#- 1/2 voff=0 +BA 10 CR 1+1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 10 CR 1+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 10 CR 2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 10 CR 2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 10 CR 2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 10 CR 2+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 10 CR 2+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 10 CR 2+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 10 CR 3 TR 5 CH 5 NT A- 1/2 voff=0 +BA 10 CR 3 TR 5 CH 5 NT C- 1/2 voff=0 +BA 10 CR 3 TR 5 CH 5 NT E- 1/2 voff=0 +BA 10 CR 3+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 10 CR 3+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 10 CR 3+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 11 CR 0 TR 5 CH 5 NT C- 1/2 voff=0 +BA 11 CR 0 TR 5 CH 5 NT E- 1/2 voff=0 +BA 11 CR 0 TR 5 CH 5 NT G- 1/2 voff=0 +BA 11 CR 1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 11 CR 1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 11 CR 1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 11 CR 1 TR 5 CH 5 NT C- 1/2 voff=0 +BA 11 CR 1 TR 5 CH 5 NT E- 1/2 voff=0 +BA 11 CR 1 TR 5 CH 5 NT G- 1/2 voff=0 +BA 11 CR 1+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 11 CR 1+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 11 CR 1+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 11 CR 2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 11 CR 2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 11 CR 2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 11 CR 2+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 11 CR 2+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 11 CR 2+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 11 CR 3 TR 
5 CH 5 NT C- 1/2 voff=0 +BA 11 CR 3 TR 5 CH 5 NT E- 1/2 voff=0 +BA 11 CR 3 TR 5 CH 5 NT G- 1/2 voff=0 +BA 11 CR 3+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 11 CR 3+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 11 CR 3+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 12 CR 0 TR 5 CH 5 NT F- 1/2 voff=0 +BA 12 CR 0 TR 5 CH 5 NT A- 1/2 voff=0 +BA 12 CR 0 TR 5 CH 5 NT C- 1/2 voff=0 +BA 12 CR 1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 12 CR 1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 12 CR 1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 12 CR 1 TR 5 CH 5 NT C#- 1/2 voff=0 +BA 12 CR 1 TR 5 CH 5 NT F- 1/2 voff=0 +BA 12 CR 1 TR 5 CH 5 NT G- 1/2 voff=0 +BA 12 CR 1+1/2 TR 5 CH 5 NT C#- 1/2 voff=0 +BA 12 CR 1+1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 12 CR 1+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 12 CR 2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 12 CR 2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 12 CR 2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 12 CR 2+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 12 CR 2+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 12 CR 2+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 12 CR 3 TR 5 CH 5 NT A- 1/2 voff=0 +BA 12 CR 3 TR 5 CH 5 NT C- 1/2 voff=0 +BA 12 CR 3 TR 5 CH 5 NT E- 1/2 voff=0 +BA 12 CR 3+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 12 CR 3+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 12 CR 3+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 25 CR 0 TR 5 CH 5 NT E- 1/2 voff=0 +BA 25 CR 0 TR 5 CH 5 NT G- 1/2 voff=0 +BA 25 CR 0 TR 5 CH 5 NT B- 1/2 voff=0 +BA 25 CR 1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 25 CR 1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 25 CR 1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 25 CR 1 TR 5 CH 5 NT E- 1/2 voff=0 +BA 25 CR 1 TR 5 CH 5 NT G- 1/2 voff=0 +BA 25 CR 1 TR 5 CH 5 NT B- 1/2 voff=0 +BA 25 CR 1+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 25 CR 1+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 25 CR 1+1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 25 CR 2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 25 CR 2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 25 CR 2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 25 CR 2+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 25 CR 2+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 25 CR 2+1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 25 CR 3 TR 5 CH 5 NT E- 1/2 voff=0 +BA 25 CR 3 TR 5 
CH 5 NT G- 1/2 voff=0 +BA 25 CR 3 TR 5 CH 5 NT B- 1/2 voff=0 +BA 25 CR 3+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 25 CR 3+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 25 CR 3+1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 26 CR 0 TR 5 CH 5 NT A- 1/2 voff=0 +BA 26 CR 0 TR 5 CH 5 NT C- 1/2 voff=0 +BA 26 CR 0 TR 5 CH 5 NT E- 1/2 voff=0 +BA 26 CR 1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 26 CR 1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 26 CR 1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 26 CR 1 TR 5 CH 5 NT A- 1/2 voff=0 +BA 26 CR 1 TR 5 CH 5 NT C- 1/2 voff=0 +BA 26 CR 1 TR 5 CH 5 NT E- 1/2 voff=0 +BA 26 CR 1+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 26 CR 1+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 26 CR 1+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 26 CR 2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 26 CR 2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 26 CR 2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 26 CR 2+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 26 CR 2+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 26 CR 2+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 26 CR 3 TR 5 CH 5 NT A- 1/2 voff=0 +BA 26 CR 3 TR 5 CH 5 NT C- 1/2 voff=0 +BA 26 CR 3 TR 5 CH 5 NT E- 1/2 voff=0 +BA 26 CR 3+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 26 CR 3+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 26 CR 3+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 27 CR 0 TR 5 CH 5 NT E- 1/2 voff=0 +BA 27 CR 0 TR 5 CH 5 NT G- 1/2 voff=0 +BA 27 CR 0 TR 5 CH 5 NT B- 1/2 voff=0 +BA 27 CR 1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 27 CR 1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 27 CR 1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 27 CR 1 TR 5 CH 5 NT E- 1/2 voff=0 +BA 27 CR 1 TR 5 CH 5 NT G- 1/2 voff=0 +BA 27 CR 1 TR 5 CH 5 NT B- 1/2 voff=0 +BA 27 CR 1+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 27 CR 1+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 27 CR 1+1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 27 CR 2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 27 CR 2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 27 CR 2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 27 CR 2+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 27 CR 2+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 27 CR 2+1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 27 CR 3 TR 5 CH 5 NT E- 1/2 voff=0 +BA 27 CR 3 TR 5 CH 5 NT G- 1/2 voff=0 +BA 27 CR 3 TR 5 CH 5 
NT B- 1/2 voff=0 +BA 27 CR 3+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 27 CR 3+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 27 CR 3+1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 28 CR 0 TR 5 CH 5 NT A- 1/2 voff=0 +BA 28 CR 0 TR 5 CH 5 NT C- 1/2 voff=0 +BA 28 CR 0 TR 5 CH 5 NT E- 1/2 voff=0 +BA 28 CR 1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 28 CR 1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 28 CR 1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 28 CR 1 TR 5 CH 5 NT A- 1/2 voff=0 +BA 28 CR 1 TR 5 CH 5 NT C- 1/2 voff=0 +BA 28 CR 1 TR 5 CH 5 NT E- 1/2 voff=0 +BA 28 CR 1+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 28 CR 1+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 28 CR 1+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 28 CR 2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 28 CR 2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 28 CR 2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 28 CR 2+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 28 CR 2+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 28 CR 2+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 28 CR 3 TR 5 CH 5 NT A- 1/2 voff=0 +BA 28 CR 3 TR 5 CH 5 NT C- 1/2 voff=0 +BA 28 CR 3 TR 5 CH 5 NT E- 1/2 voff=0 +BA 28 CR 3+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 28 CR 3+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 28 CR 3+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 29 CR 0 TR 5 CH 5 NT E- 1/2 voff=0 +BA 29 CR 0 TR 5 CH 5 NT G- 1/2 voff=0 +BA 29 CR 0 TR 5 CH 5 NT B- 1/2 voff=0 +BA 29 CR 1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 29 CR 1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 29 CR 1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 29 CR 1 TR 5 CH 5 NT E- 1/2 voff=0 +BA 29 CR 1 TR 5 CH 5 NT G- 1/2 voff=0 +BA 29 CR 1 TR 5 CH 5 NT B- 1/2 voff=0 +BA 29 CR 1+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 29 CR 1+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 29 CR 1+1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 29 CR 2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 29 CR 2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 29 CR 2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 29 CR 2+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 29 CR 2+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 29 CR 2+1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 29 CR 3 TR 5 CH 5 NT E- 1/2 voff=0 +BA 29 CR 3 TR 5 CH 5 NT G- 1/2 voff=0 +BA 29 CR 3 TR 5 CH 5 NT B- 1/2 voff=0 +BA 29 CR 3+1/2 TR 5 CH 5 
NT E- 1/2 voff=0 +BA 29 CR 3+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 29 CR 3+1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 30 CR 0 TR 5 CH 5 NT A- 1/2 voff=0 +BA 30 CR 0 TR 5 CH 5 NT C- 1/2 voff=0 +BA 30 CR 0 TR 5 CH 5 NT E- 1/2 voff=0 +BA 30 CR 1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 30 CR 1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 30 CR 1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 30 CR 1 TR 5 CH 5 NT A- 1/2 voff=0 +BA 30 CR 1 TR 5 CH 5 NT C- 1/2 voff=0 +BA 30 CR 1 TR 5 CH 5 NT E- 1/2 voff=0 +BA 30 CR 1+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 30 CR 1+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 30 CR 1+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 30 CR 2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 30 CR 2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 30 CR 2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 30 CR 2+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 30 CR 2+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 30 CR 2+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 30 CR 3 TR 5 CH 5 NT A- 1/2 voff=0 +BA 30 CR 3 TR 5 CH 5 NT C- 1/2 voff=0 +BA 30 CR 3 TR 5 CH 5 NT E- 1/2 voff=0 +BA 30 CR 3+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 30 CR 3+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 30 CR 3+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 31 CR 0 TR 5 CH 5 NT E- 1/2 voff=0 +BA 31 CR 0 TR 5 CH 5 NT G- 1/2 voff=0 +BA 31 CR 0 TR 5 CH 5 NT B- 1/2 voff=0 +BA 31 CR 1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 31 CR 1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 31 CR 1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 31 CR 1 TR 5 CH 5 NT E- 1/2 voff=0 +BA 31 CR 1 TR 5 CH 5 NT G- 1/2 voff=0 +BA 31 CR 1 TR 5 CH 5 NT B- 1/2 voff=0 +BA 31 CR 1+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 31 CR 1+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 31 CR 1+1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 31 CR 2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 31 CR 2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 31 CR 2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 31 CR 2+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 31 CR 2+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 31 CR 2+1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 31 CR 3 TR 5 CH 5 NT E- 1/2 voff=0 +BA 31 CR 3 TR 5 CH 5 NT G- 1/2 voff=0 +BA 31 CR 3 TR 5 CH 5 NT B- 1/2 voff=0 +BA 31 CR 3+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 31 CR 3+1/2 TR 5 CH 5 
NT G- 1/2 voff=0 +BA 31 CR 3+1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 32 CR 0 TR 5 CH 5 NT A- 1/2 voff=0 +BA 32 CR 0 TR 5 CH 5 NT C- 1/2 voff=0 +BA 32 CR 0 TR 5 CH 5 NT E- 1/2 voff=0 +BA 32 CR 1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 32 CR 1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 32 CR 1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 32 CR 1 TR 5 CH 5 NT A- 1/2 voff=0 +BA 32 CR 1 TR 5 CH 5 NT C- 1/2 voff=0 +BA 32 CR 1 TR 5 CH 5 NT E- 1/2 voff=0 +BA 32 CR 1+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 32 CR 1+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 32 CR 1+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 32 CR 2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 32 CR 2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 32 CR 2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 32 CR 2+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 32 CR 2+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 32 CR 2+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 32 CR 3 TR 5 CH 5 NT A- 1/2 voff=0 +BA 32 CR 3 TR 5 CH 5 NT C- 1/2 voff=0 +BA 32 CR 3 TR 5 CH 5 NT E- 1/2 voff=0 +BA 32 CR 3+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 32 CR 3+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 32 CR 3+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 33 CR 0 TR 5 CH 5 NT E- 1/2 voff=0 +BA 33 CR 0 TR 5 CH 5 NT G- 1/2 voff=0 +BA 33 CR 0 TR 5 CH 5 NT B- 1/2 voff=0 +BA 33 CR 1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 33 CR 1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 33 CR 1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 33 CR 1 TR 5 CH 5 NT E- 1/2 voff=0 +BA 33 CR 1 TR 5 CH 5 NT G- 1/2 voff=0 +BA 33 CR 1 TR 5 CH 5 NT B- 1/2 voff=0 +BA 33 CR 1+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 33 CR 1+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 33 CR 1+1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 33 CR 2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 33 CR 2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 33 CR 2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 33 CR 2+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 33 CR 2+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 33 CR 2+1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 33 CR 3 TR 5 CH 5 NT E- 1/2 voff=0 +BA 33 CR 3 TR 5 CH 5 NT G- 1/2 voff=0 +BA 33 CR 3 TR 5 CH 5 NT B- 1/2 voff=0 +BA 33 CR 3+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 33 CR 3+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 33 CR 3+1/2 TR 5 CH 5 
NT B- 1/2 voff=0 +BA 34 CR 0 TR 5 CH 5 NT A- 1/2 voff=0 +BA 34 CR 0 TR 5 CH 5 NT C- 1/2 voff=0 +BA 34 CR 0 TR 5 CH 5 NT E- 1/2 voff=0 +BA 34 CR 1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 34 CR 1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 34 CR 1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 34 CR 1 TR 5 CH 5 NT A- 1/2 voff=0 +BA 34 CR 1 TR 5 CH 5 NT C- 1/2 voff=0 +BA 34 CR 1 TR 5 CH 5 NT E- 1/2 voff=0 +BA 34 CR 1+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 34 CR 1+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 34 CR 1+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 34 CR 2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 34 CR 2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 34 CR 2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 34 CR 2+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 34 CR 2+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 34 CR 2+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 34 CR 3 TR 5 CH 5 NT A- 1/2 voff=0 +BA 34 CR 3 TR 5 CH 5 NT C- 1/2 voff=0 +BA 34 CR 3 TR 5 CH 5 NT E- 1/2 voff=0 +BA 34 CR 3+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 34 CR 3+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 34 CR 3+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 35 CR 0 TR 5 CH 5 NT E- 1/2 voff=0 +BA 35 CR 0 TR 5 CH 5 NT G- 1/2 voff=0 +BA 35 CR 0 TR 5 CH 5 NT B- 1/2 voff=0 +BA 35 CR 1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 35 CR 1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 35 CR 1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 35 CR 1 TR 5 CH 5 NT E- 1/2 voff=0 +BA 35 CR 1 TR 5 CH 5 NT G- 1/2 voff=0 +BA 35 CR 1 TR 5 CH 5 NT B- 1/2 voff=0 +BA 35 CR 1+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 35 CR 1+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 35 CR 1+1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 35 CR 2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 35 CR 2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 35 CR 2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 35 CR 2+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 35 CR 2+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 35 CR 2+1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 35 CR 3 TR 5 CH 5 NT E- 1/2 voff=0 +BA 35 CR 3 TR 5 CH 5 NT G- 1/2 voff=0 +BA 35 CR 3 TR 5 CH 5 NT B- 1/2 voff=0 +BA 35 CR 3+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 35 CR 3+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 35 CR 3+1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 36 CR 0 TR 5 CH 5 NT 
A- 1/2 voff=0 +BA 36 CR 0 TR 5 CH 5 NT C- 1/2 voff=0 +BA 36 CR 0 TR 5 CH 5 NT E- 1/2 voff=0 +BA 36 CR 1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 36 CR 1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 36 CR 1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 36 CR 1 TR 5 CH 5 NT A- 1/2 voff=0 +BA 36 CR 1 TR 5 CH 5 NT C- 1/2 voff=0 +BA 36 CR 1 TR 5 CH 5 NT E- 1/2 voff=0 +BA 36 CR 1+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 36 CR 1+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 36 CR 1+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 36 CR 2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 36 CR 2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 36 CR 2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 36 CR 2+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 36 CR 2+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 36 CR 2+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 36 CR 3 TR 5 CH 5 NT A- 1/2 voff=0 +BA 36 CR 3 TR 5 CH 5 NT C- 1/2 voff=0 +BA 36 CR 3 TR 5 CH 5 NT E- 1/2 voff=0 +BA 36 CR 3+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 36 CR 3+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 36 CR 3+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 37 CR 0 TR 0 CH 16 End of track +Wilb el writes on the mir ac le. + +Wilb el writes on the mir ac le. +Syd ess makes the breaths on er. +The ess ence has C. +The oth er is Don ur. + +Add ney makes the writ er. +Duk ess has the writ er. +The ess ence is in Luc ur. +One loves Count an. + +Le ye feels the test er. +Dix ur makes the spell er. +The oth er is Dix ney. +Whist ess reads on Whist ney. + +Ack ney has the ess ence. +Syd ard writes on Le ney. +The check er finds C. +The draw er feels Whist ess. + + +format=1 tracks=8 division=384 + +BA 1 CR 0 TR 0 CH 16 Text type 2: "Produced by Mind Reading Music Composer by Lucian Academy" +BA 1 CR 0 TR 0 CH 16 Text type 3: "Brae ess nur tures Duk ice." 
+BA 1 CR 0 TR 0 CH 1 Channel volume 127 +BA 1 CR 0 TR 0 CH 16 Tempo 63.00009 +BA 1 CR 0 TR 1 CH 1 Instrument 1 +BA 1 CR 0 TR 1 CH 1 NT R 0 von=127 voff=0 +BA 5 CR 0 TR 1 CH 1 NT F- 1/2 voff=0 +BA 5 CR 1/2 TR 1 CH 1 NT F- 1/2 voff=0 +BA 5 CR 1 TR 1 CH 1 NT G- 1/2 voff=0 +BA 5 CR 1+1/2 TR 1 CH 1 NT G- 1/2 voff=0 +BA 5 CR 2 TR 1 CH 1 NT G- 1/2 voff=0 +BA 5 CR 2+1/2 TR 1 CH 1 NT G- 1/2 voff=0 +BA 6 CR 0 TR 1 CH 1 NT C- 1/2 voff=0 +BA 6 CR 1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 6 CR 1 TR 1 CH 1 NT D#- 1/2 voff=0 +BA 6 CR 1+1/2 TR 1 CH 1 NT D#- 1/2 voff=0 +BA 6 CR 2 TR 1 CH 1 NT D- 1/2 voff=0 +BA 6 CR 2+1/2 TR 1 CH 1 NT D- 1/2 voff=0 +BA 6 CR 3 TR 1 CH 1 NT D- 1/2 voff=0 +BA 7 CR 0 TR 1 CH 1 NT F- 1/2 voff=0 +BA 7 CR 1/2 TR 1 CH 1 NT F- 1/2 voff=0 +BA 7 CR 1 TR 1 CH 1 NT G- 1/2 voff=0 +BA 7 CR 1+1/2 TR 1 CH 1 NT G- 1/2 voff=0 +BA 7 CR 2 TR 1 CH 1 NT G- 1/2 voff=0 +BA 7 CR 2+1/2 TR 1 CH 1 NT G- 1/2 voff=0 +BA 7 CR 3 TR 1 CH 1 NT G- 1/2 voff=0 +BA 8 CR 0 TR 1 CH 1 NT C- 1/2 voff=0 +BA 8 CR 1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 8 CR 1 TR 1 CH 1 NT D#- 1/2 voff=0 +BA 8 CR 1+1/2 TR 1 CH 1 NT D#- 1/2 voff=0 +BA 8 CR 2 TR 1 CH 1 NT D- 1/2 voff=0 +BA 8 CR 2+1/2 TR 1 CH 1 NT D- 1/2 voff=0 +BA 13 CR 0 TR 1 CH 1 NT F- 1/2 voff=0 +BA 13 CR 1/2 TR 1 CH 1 NT F- 1/2 voff=0 +BA 13 CR 1 TR 1 CH 1 NT G- 1/2 voff=0 +BA 13 CR 1+1/2 TR 1 CH 1 NT G- 1/2 voff=0 +BA 13 CR 2 TR 1 CH 1 NT G- 1/2 voff=0 +BA 13 CR 2+1/2 TR 1 CH 1 NT G- 1/2 voff=0 +BA 13 CR 3 TR 1 CH 1 NT G- 1/2 voff=0 +BA 14 CR 0 TR 1 CH 1 NT C- 1/2 voff=0 +BA 14 CR 1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 14 CR 1 TR 1 CH 1 NT D#- 1/2 voff=0 +BA 14 CR 1+1/2 TR 1 CH 1 NT D#- 1/2 voff=0 +BA 14 CR 2 TR 1 CH 1 NT D- 1/2 voff=0 +BA 14 CR 2+1/2 TR 1 CH 1 NT D- 1/2 voff=0 +BA 14 CR 3 TR 1 CH 1 NT D- 1/2 voff=0 +BA 15 CR 0 TR 1 CH 1 NT F- 1/2 voff=0 +BA 15 CR 1/2 TR 1 CH 1 NT F- 1/2 voff=0 +BA 15 CR 1 TR 1 CH 1 NT G- 1/2 voff=0 +BA 15 CR 1+1/2 TR 1 CH 1 NT G- 1/2 voff=0 +BA 15 CR 2 TR 1 CH 1 NT G- 1/2 voff=0 +BA 15 CR 2+1/2 TR 1 CH 1 NT G- 1/2 voff=0 
+BA 16 CR 0 TR 1 CH 1 NT C- 1/2 voff=0 +BA 16 CR 1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 16 CR 1 TR 1 CH 1 NT D#- 1/2 voff=0 +BA 16 CR 1+1/2 TR 1 CH 1 NT D#- 1/2 voff=0 +BA 16 CR 2 TR 1 CH 1 NT D- 1/2 voff=0 +BA 16 CR 2+1/2 TR 1 CH 1 NT D- 1/2 voff=0 +BA 17 CR 0 TR 1 CH 1 NT A- 1/2 voff=0 +BA 17 CR 1/2 TR 1 CH 1 NT A- 1/2 voff=0 +BA 17 CR 1 TR 1 CH 1 NT B- 1/2 voff=0 +BA 17 CR 1+1/2 TR 1 CH 1 NT B- 1/2 voff=0 +BA 17 CR 2 TR 1 CH 1 NT F- 1/2 voff=0 +BA 17 CR 2+1/2 TR 1 CH 1 NT F- 1/2 voff=0 +BA 17 CR 3 TR 1 CH 1 NT F- 1/2 voff=0 +BA 18 CR 0 TR 1 CH 1 NT A#- 1/2 voff=0 +BA 18 CR 1/2 TR 1 CH 1 NT A#- 1/2 voff=0 +BA 18 CR 1 TR 1 CH 1 NT A- 1/2 voff=0 +BA 18 CR 1+1/2 TR 1 CH 1 NT A- 1/2 voff=0 +BA 18 CR 2 TR 1 CH 1 NT A- 1/2 voff=0 +BA 19 CR 0 TR 1 CH 1 NT A- 1/2 voff=0 +BA 19 CR 1/2 TR 1 CH 1 NT A- 1/2 voff=0 +BA 19 CR 1 TR 1 CH 1 NT B- 1/2 voff=0 +BA 19 CR 1+1/2 TR 1 CH 1 NT B- 1/2 voff=0 +BA 19 CR 2 TR 1 CH 1 NT F- 1/2 voff=0 +BA 20 CR 0 TR 1 CH 1 NT A#- 1/2 voff=0 +BA 20 CR 1/2 TR 1 CH 1 NT A#- 1/2 voff=0 +BA 20 CR 1 TR 1 CH 1 NT A- 1/2 voff=0 +BA 20 CR 1+1/2 TR 1 CH 1 NT A- 1/2 voff=0 +BA 20 CR 2 TR 1 CH 1 NT A- 1/2 voff=0 +BA 20 CR 2+1/2 TR 1 CH 1 NT A- 1/2 voff=0 +BA 25 CR 0 TR 1 CH 1 NT E- 1/2 voff=0 +BA 25 CR 1/2 TR 1 CH 1 NT E- 1/2 voff=0 +BA 25 CR 1 TR 1 CH 1 NT D#- 1/2 voff=0 +BA 25 CR 1+1/2 TR 1 CH 1 NT D#- 1/2 voff=0 +BA 25 CR 2 TR 1 CH 1 NT B- 1/2 voff=0 +BA 25 CR 2+1/2 TR 1 CH 1 NT B- 1/2 voff=0 +BA 26 CR 0 TR 1 CH 1 NT E- 1/2 voff=0 +BA 26 CR 1/2 TR 1 CH 1 NT E- 1/2 voff=0 +BA 26 CR 1 TR 1 CH 1 NT D#- 1/2 voff=0 +BA 26 CR 1+1/2 TR 1 CH 1 NT D#- 1/2 voff=0 +BA 26 CR 2 TR 1 CH 1 NT D- 1/2 voff=0 +BA 27 CR 0 TR 1 CH 1 NT E- 1/2 voff=0 +BA 27 CR 1/2 TR 1 CH 1 NT E- 1/2 voff=0 +BA 27 CR 1 TR 1 CH 1 NT D#- 1/2 voff=0 +BA 27 CR 1+1/2 TR 1 CH 1 NT D#- 1/2 voff=0 +BA 27 CR 2 TR 1 CH 1 NT B- 1/2 voff=0 +BA 27 CR 2+1/2 TR 1 CH 1 NT B- 1/2 voff=0 +BA 28 CR 0 TR 1 CH 1 NT E- 1/2 voff=0 +BA 28 CR 1/2 TR 1 CH 1 NT E- 1/2 voff=0 +BA 28 CR 1 TR 1 CH 1 NT D#- 1/2 voff=0 +BA 
28 CR 1+1/2 TR 1 CH 1 NT D#- 1/2 voff=0 +BA 28 CR 2 TR 1 CH 1 NT D- 1/2 voff=0 +BA 28 CR 2+1/2 TR 1 CH 1 NT D- 1/2 voff=0 +BA 29 CR 0 TR 1 CH 1 NT E- 1/2 voff=0 +BA 29 CR 1/2 TR 1 CH 1 NT E- 1/2 voff=0 +BA 29 CR 1 TR 1 CH 1 NT D#- 1/2 voff=0 +BA 29 CR 1+1/2 TR 1 CH 1 NT D#- 1/2 voff=0 +BA 29 CR 2 TR 1 CH 1 NT B- 1/2 voff=0 +BA 29 CR 2+1/2 TR 1 CH 1 NT B- 1/2 voff=0 +BA 30 CR 0 TR 1 CH 1 NT E- 1/2 voff=0 +BA 30 CR 1/2 TR 1 CH 1 NT E- 1/2 voff=0 +BA 30 CR 1 TR 1 CH 1 NT D#- 1/2 voff=0 +BA 30 CR 1+1/2 TR 1 CH 1 NT D#- 1/2 voff=0 +BA 30 CR 2 TR 1 CH 1 NT D- 1/2 voff=0 +BA 31 CR 0 TR 1 CH 1 NT E- 1/2 voff=0 +BA 31 CR 1/2 TR 1 CH 1 NT E- 1/2 voff=0 +BA 31 CR 1 TR 1 CH 1 NT D#- 1/2 voff=0 +BA 31 CR 1+1/2 TR 1 CH 1 NT D#- 1/2 voff=0 +BA 31 CR 2 TR 1 CH 1 NT B- 1/2 voff=0 +BA 31 CR 2+1/2 TR 1 CH 1 NT B- 1/2 voff=0 +BA 32 CR 0 TR 1 CH 1 NT E- 1/2 voff=0 +BA 32 CR 1/2 TR 1 CH 1 NT E- 1/2 voff=0 +BA 32 CR 1 TR 1 CH 1 NT D#- 1/2 voff=0 +BA 32 CR 1+1/2 TR 1 CH 1 NT D#- 1/2 voff=0 +BA 32 CR 2 TR 1 CH 1 NT D- 1/2 voff=0 +BA 32 CR 2+1/2 TR 1 CH 1 NT D- 1/2 voff=0 +BA 33 CR 0 TR 1 CH 1 NT E- 1/2 voff=0 +BA 33 CR 1/2 TR 1 CH 1 NT E- 1/2 voff=0 +BA 33 CR 1 TR 1 CH 1 NT D#- 1/2 voff=0 +BA 33 CR 1+1/2 TR 1 CH 1 NT D#- 1/2 voff=0 +BA 33 CR 2 TR 1 CH 1 NT B- 1/2 voff=0 +BA 33 CR 2+1/2 TR 1 CH 1 NT B- 1/2 voff=0 +BA 34 CR 0 TR 1 CH 1 NT E- 1/2 voff=0 +BA 34 CR 1/2 TR 1 CH 1 NT E- 1/2 voff=0 +BA 34 CR 1 TR 1 CH 1 NT D#- 1/2 voff=0 +BA 34 CR 1+1/2 TR 1 CH 1 NT D#- 1/2 voff=0 +BA 34 CR 2 TR 1 CH 1 NT D- 1/2 voff=0 +BA 35 CR 0 TR 1 CH 1 NT E- 1/2 voff=0 +BA 35 CR 1/2 TR 1 CH 1 NT E- 1/2 voff=0 +BA 35 CR 1 TR 1 CH 1 NT D#- 1/2 voff=0 +BA 35 CR 1+1/2 TR 1 CH 1 NT D#- 1/2 voff=0 +BA 35 CR 2 TR 1 CH 1 NT B- 1/2 voff=0 +BA 35 CR 2+1/2 TR 1 CH 1 NT B- 1/2 voff=0 +BA 36 CR 0 TR 1 CH 1 NT E- 1/2 voff=0 +BA 36 CR 1/2 TR 1 CH 1 NT E- 1/2 voff=0 +BA 36 CR 1 TR 1 CH 1 NT D#- 1/2 voff=0 +BA 36 CR 1+1/2 TR 1 CH 1 NT D#- 1/2 voff=0 +BA 36 CR 2 TR 1 CH 1 NT D- 1/2 voff=0 +BA 36 CR 2+1/2 TR 1 CH 1 NT D- 1/2 
voff=0 +BA 1 CR 0 TR 2 CH 2 Instrument 82 +BA 1 CR 0 TR 2 CH 2 NT R 0 von=127 voff=0 +BA 5 CR 0 TR 2 CH 2 NT F- 1/2 voff=0 +BA 5 CR 1/2 TR 2 CH 2 NT F- 1/2 voff=0 +BA 5 CR 1 TR 2 CH 2 NT G- 1/2 voff=0 +BA 5 CR 1+1/2 TR 2 CH 2 NT G- 1/2 voff=0 +BA 5 CR 2 TR 2 CH 2 NT G- 1/2 voff=0 +BA 5 CR 2+1/2 TR 2 CH 2 NT G- 1/2 voff=0 +BA 5 CR 3 TR 2 CH 2 NT G- 1/2 voff=0 +BA 5 CR 3+1/2 TR 2 CH 2 NT G- 1/2 voff=0 +BA 6 CR 0 TR 2 CH 2 NT C- 1/2 voff=0 +BA 6 CR 1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 6 CR 1 TR 2 CH 2 NT D#- 1/2 voff=0 +BA 6 CR 1+1/2 TR 2 CH 2 NT D#- 1/2 voff=0 +BA 6 CR 2 TR 2 CH 2 NT D- 1/2 voff=0 +BA 6 CR 2+1/2 TR 2 CH 2 NT D- 1/2 voff=0 +BA 6 CR 3 TR 2 CH 2 NT D- 1/2 voff=0 +BA 6 CR 3+1/2 TR 2 CH 2 NT D- 1/2 voff=0 +BA 7 CR 0 TR 2 CH 2 NT F- 1/2 voff=0 +BA 7 CR 1/2 TR 2 CH 2 NT F- 1/2 voff=0 +BA 7 CR 1 TR 2 CH 2 NT G- 1/2 voff=0 +BA 7 CR 1+1/2 TR 2 CH 2 NT G- 1/2 voff=0 +BA 7 CR 2 TR 2 CH 2 NT G- 1/2 voff=0 +BA 7 CR 2+1/2 TR 2 CH 2 NT G- 1/2 voff=0 +BA 7 CR 3 TR 2 CH 2 NT G- 1/2 voff=0 +BA 7 CR 3+1/2 TR 2 CH 2 NT G- 1/2 voff=0 +BA 8 CR 0 TR 2 CH 2 NT C- 1/2 voff=0 +BA 8 CR 1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 8 CR 1 TR 2 CH 2 NT D#- 1/2 voff=0 +BA 8 CR 1+1/2 TR 2 CH 2 NT D#- 1/2 voff=0 +BA 8 CR 2 TR 2 CH 2 NT D- 1/2 voff=0 +BA 8 CR 2+1/2 TR 2 CH 2 NT D- 1/2 voff=0 +BA 8 CR 3 TR 2 CH 2 NT D- 1/2 voff=0 +BA 8 CR 3+1/2 TR 2 CH 2 NT D- 1/2 voff=0 +BA 9 CR 0 TR 2 CH 2 NT D- 1/2 voff=0 +BA 9 CR 1/2 TR 2 CH 2 NT D- 1/2 voff=0 +BA 9 CR 1 TR 2 CH 2 NT C- 1/2 voff=0 +BA 9 CR 1+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 9 CR 2 TR 2 CH 2 NT B- 1/2 voff=0 +BA 9 CR 2+1/2 TR 2 CH 2 NT B- 1/2 voff=0 +BA 9 CR 3 TR 2 CH 2 NT B- 1/2 voff=0 +BA 9 CR 3+1/2 TR 2 CH 2 NT B- 1/2 voff=0 +BA 10 CR 0 TR 2 CH 2 NT E- 1/2 voff=0 +BA 10 CR 1/2 TR 2 CH 2 NT E- 1/2 voff=0 +BA 10 CR 1 TR 2 CH 2 NT F#- 1/2 voff=0 +BA 10 CR 1+1/2 TR 2 CH 2 NT F#- 1/2 voff=0 +BA 10 CR 2 TR 2 CH 2 NT A- 1/2 voff=0 +BA 10 CR 2+1/2 TR 2 CH 2 NT A- 1/2 voff=0 +BA 10 CR 3 TR 2 CH 2 NT A- 1/2 voff=0 +BA 10 CR 3+1/2 TR 2 CH 2 NT A- 1/2 
voff=0 +BA 11 CR 0 TR 2 CH 2 NT D- 1/2 voff=0 +BA 11 CR 1/2 TR 2 CH 2 NT D- 1/2 voff=0 +BA 11 CR 1 TR 2 CH 2 NT C- 1/2 voff=0 +BA 11 CR 1+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 11 CR 2 TR 2 CH 2 NT B- 1/2 voff=0 +BA 11 CR 2+1/2 TR 2 CH 2 NT B- 1/2 voff=0 +BA 11 CR 3 TR 2 CH 2 NT B- 1/2 voff=0 +BA 11 CR 3+1/2 TR 2 CH 2 NT B- 1/2 voff=0 +BA 12 CR 0 TR 2 CH 2 NT E- 1/2 voff=0 +BA 12 CR 1/2 TR 2 CH 2 NT E- 1/2 voff=0 +BA 12 CR 1 TR 2 CH 2 NT F#- 1/2 voff=0 +BA 12 CR 1+1/2 TR 2 CH 2 NT F#- 1/2 voff=0 +BA 12 CR 2 TR 2 CH 2 NT A- 1/2 voff=0 +BA 12 CR 2+1/2 TR 2 CH 2 NT A- 1/2 voff=0 +BA 12 CR 3 TR 2 CH 2 NT A- 1/2 voff=0 +BA 12 CR 3+1/2 TR 2 CH 2 NT A- 1/2 voff=0 +BA 21 CR 0 TR 2 CH 2 NT A- 1/2 voff=0 +BA 21 CR 1/2 TR 2 CH 2 NT A- 1/2 voff=0 +BA 21 CR 1 TR 2 CH 2 NT E- 1/2 voff=0 +BA 21 CR 1+1/2 TR 2 CH 2 NT E- 1/2 voff=0 +BA 21 CR 2 TR 2 CH 2 NT E- 1/2 voff=0 +BA 21 CR 2+1/2 TR 2 CH 2 NT E- 1/2 voff=0 +BA 21 CR 3 TR 2 CH 2 NT E- 1/2 voff=0 +BA 21 CR 3+1/2 TR 2 CH 2 NT E- 1/2 voff=0 +BA 22 CR 0 TR 2 CH 2 NT A- 1/2 voff=0 +BA 22 CR 1/2 TR 2 CH 2 NT A- 1/2 voff=0 +BA 22 CR 1 TR 2 CH 2 NT D- 1/2 voff=0 +BA 22 CR 1+1/2 TR 2 CH 2 NT D- 1/2 voff=0 +BA 22 CR 2 TR 2 CH 2 NT E- 1/2 voff=0 +BA 22 CR 2+1/2 TR 2 CH 2 NT E- 1/2 voff=0 +BA 22 CR 3 TR 2 CH 2 NT E- 1/2 voff=0 +BA 22 CR 3+1/2 TR 2 CH 2 NT E- 1/2 voff=0 +BA 23 CR 0 TR 2 CH 2 NT A- 1/2 voff=0 +BA 23 CR 1/2 TR 2 CH 2 NT A- 1/2 voff=0 +BA 23 CR 1 TR 2 CH 2 NT E- 1/2 voff=0 +BA 23 CR 1+1/2 TR 2 CH 2 NT E- 1/2 voff=0 +BA 23 CR 2 TR 2 CH 2 NT E- 1/2 voff=0 +BA 23 CR 2+1/2 TR 2 CH 2 NT E- 1/2 voff=0 +BA 23 CR 3 TR 2 CH 2 NT E- 1/2 voff=0 +BA 23 CR 3+1/2 TR 2 CH 2 NT E- 1/2 voff=0 +BA 24 CR 0 TR 2 CH 2 NT A- 1/2 voff=0 +BA 24 CR 1/2 TR 2 CH 2 NT A- 1/2 voff=0 +BA 24 CR 1 TR 2 CH 2 NT D- 1/2 voff=0 +BA 24 CR 1+1/2 TR 2 CH 2 NT D- 1/2 voff=0 +BA 24 CR 2 TR 2 CH 2 NT E- 1/2 voff=0 +BA 24 CR 2+1/2 TR 2 CH 2 NT E- 1/2 voff=0 +BA 24 CR 3 TR 2 CH 2 NT E- 1/2 voff=0 +BA 24 CR 3+1/2 TR 2 CH 2 NT E- 1/2 voff=0 +BA 1 CR 0 TR 3 CH 3 Instrument 26 
+BA 1 CR 0 TR 3 CH 3 NT R 0 von=127 voff=0 +BA 1 CR 0 TR 3 CH 3 NT B- 1/2 voff=0 +BA 1 CR 1/2 TR 3 CH 3 NT B- 1/2 voff=0 +BA 1 CR 1 TR 3 CH 3 NT E- 1/2 voff=0 +BA 1 CR 1+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 1 CR 2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 1 CR 2+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 1 CR 3 TR 3 CH 3 NT E- 1/2 voff=0 +BA 1 CR 3+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 2 CR 0 TR 3 CH 3 NT A- 1/2 voff=0 +BA 2 CR 1/2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 2 CR 1 TR 3 CH 3 NT D#- 1/2 voff=0 +BA 2 CR 1+1/2 TR 3 CH 3 NT D#- 1/2 voff=0 +BA 2 CR 2 TR 3 CH 3 NT F- 1/2 voff=0 +BA 2 CR 2+1/2 TR 3 CH 3 NT F- 1/2 voff=0 +BA 2 CR 3 TR 3 CH 3 NT F- 1/2 voff=0 +BA 2 CR 3+1/2 TR 3 CH 3 NT F- 1/2 voff=0 +BA 3 CR 0 TR 3 CH 3 NT B- 1/2 voff=0 +BA 3 CR 1/2 TR 3 CH 3 NT B- 1/2 voff=0 +BA 3 CR 1 TR 3 CH 3 NT E- 1/2 voff=0 +BA 3 CR 1+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 3 CR 2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 3 CR 2+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 3 CR 3 TR 3 CH 3 NT E- 1/2 voff=0 +BA 3 CR 3+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 4 CR 0 TR 3 CH 3 NT A- 1/2 voff=0 +BA 4 CR 1/2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 4 CR 1 TR 3 CH 3 NT D#- 1/2 voff=0 +BA 4 CR 1+1/2 TR 3 CH 3 NT D#- 1/2 voff=0 +BA 4 CR 2 TR 3 CH 3 NT F- 1/2 voff=0 +BA 4 CR 2+1/2 TR 3 CH 3 NT F- 1/2 voff=0 +BA 4 CR 3 TR 3 CH 3 NT F- 1/2 voff=0 +BA 4 CR 3+1/2 TR 3 CH 3 NT F- 1/2 voff=0 +BA 13 CR 0 TR 3 CH 3 NT F- 1/2 voff=0 +BA 13 CR 1/2 TR 3 CH 3 NT F- 1/2 voff=0 +BA 13 CR 1 TR 3 CH 3 NT G- 1/2 voff=0 +BA 13 CR 1+1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 13 CR 2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 13 CR 2+1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 13 CR 3 TR 3 CH 3 NT G- 1/2 voff=0 +BA 13 CR 3+1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 14 CR 0 TR 3 CH 3 NT C- 1/2 voff=0 +BA 14 CR 1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 14 CR 1 TR 3 CH 3 NT D#- 1/2 voff=0 +BA 14 CR 1+1/2 TR 3 CH 3 NT D#- 1/2 voff=0 +BA 14 CR 2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 14 CR 2+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 14 CR 3 TR 3 CH 3 NT D- 1/2 voff=0 +BA 14 CR 3+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 15 CR 0 TR 3 CH 3 NT 
F- 1/2 voff=0 +BA 15 CR 1/2 TR 3 CH 3 NT F- 1/2 voff=0 +BA 15 CR 1 TR 3 CH 3 NT G- 1/2 voff=0 +BA 15 CR 1+1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 15 CR 2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 15 CR 2+1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 15 CR 3 TR 3 CH 3 NT G- 1/2 voff=0 +BA 15 CR 3+1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 16 CR 0 TR 3 CH 3 NT C- 1/2 voff=0 +BA 16 CR 1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 16 CR 1 TR 3 CH 3 NT D#- 1/2 voff=0 +BA 16 CR 1+1/2 TR 3 CH 3 NT D#- 1/2 voff=0 +BA 16 CR 2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 16 CR 2+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 16 CR 3 TR 3 CH 3 NT D- 1/2 voff=0 +BA 16 CR 3+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 21 CR 0 TR 3 CH 3 NT A- 1/2 voff=0 +BA 21 CR 1/2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 21 CR 1 TR 3 CH 3 NT E- 1/2 voff=0 +BA 21 CR 1+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 21 CR 2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 21 CR 2+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 21 CR 3 TR 3 CH 3 NT E- 1/2 voff=0 +BA 21 CR 3+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 22 CR 0 TR 3 CH 3 NT A- 1/2 voff=0 +BA 22 CR 1/2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 22 CR 1 TR 3 CH 3 NT D- 1/2 voff=0 +BA 22 CR 1+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 22 CR 2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 22 CR 2+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 22 CR 3 TR 3 CH 3 NT E- 1/2 voff=0 +BA 22 CR 3+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 23 CR 0 TR 3 CH 3 NT A- 1/2 voff=0 +BA 23 CR 1/2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 23 CR 1 TR 3 CH 3 NT E- 1/2 voff=0 +BA 23 CR 1+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 23 CR 2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 23 CR 2+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 23 CR 3 TR 3 CH 3 NT E- 1/2 voff=0 +BA 23 CR 3+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 24 CR 0 TR 3 CH 3 NT A- 1/2 voff=0 +BA 24 CR 1/2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 24 CR 1 TR 3 CH 3 NT D- 1/2 voff=0 +BA 24 CR 1+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 24 CR 2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 24 CR 2+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 24 CR 3 TR 3 CH 3 NT E- 1/2 voff=0 +BA 24 CR 3+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 1 CR 0 TR 4 CH 4 Instrument 53 +BA 1 CR 0 TR 4 CH 4 NT R 0 
von=127 voff=0 +BA 9 CR 0 TR 4 CH 4 NT D- 1/2 voff=0 +BA 9 CR 1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 9 CR 1 TR 4 CH 4 NT C- 1/2 voff=0 +BA 9 CR 1+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 9 CR 2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 9 CR 2+1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 9 CR 3 TR 4 CH 4 NT B- 1/2 voff=0 +BA 9 CR 3+1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 10 CR 0 TR 4 CH 4 NT E- 1/2 voff=0 +BA 10 CR 1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 10 CR 1 TR 4 CH 4 NT F#- 1/2 voff=0 +BA 10 CR 1+1/2 TR 4 CH 4 NT F#- 1/2 voff=0 +BA 10 CR 2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 10 CR 2+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 10 CR 3 TR 4 CH 4 NT A- 1/2 voff=0 +BA 10 CR 3+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 11 CR 0 TR 4 CH 4 NT D- 1/2 voff=0 +BA 11 CR 1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 11 CR 1 TR 4 CH 4 NT C- 1/2 voff=0 +BA 11 CR 1+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 11 CR 2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 11 CR 2+1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 11 CR 3 TR 4 CH 4 NT B- 1/2 voff=0 +BA 11 CR 3+1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 12 CR 0 TR 4 CH 4 NT E- 1/2 voff=0 +BA 12 CR 1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 12 CR 1 TR 4 CH 4 NT F#- 1/2 voff=0 +BA 12 CR 1+1/2 TR 4 CH 4 NT F#- 1/2 voff=0 +BA 12 CR 2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 12 CR 2+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 12 CR 3 TR 4 CH 4 NT A- 1/2 voff=0 +BA 12 CR 3+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 13 CR 0 TR 4 CH 4 NT F- 1/2 voff=0 +BA 13 CR 1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 13 CR 1 TR 4 CH 4 NT G- 1/2 voff=0 +BA 13 CR 1+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 13 CR 2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 13 CR 2+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 13 CR 3 TR 4 CH 4 NT G- 1/2 voff=0 +BA 13 CR 3+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 14 CR 0 TR 4 CH 4 NT C- 1/2 voff=0 +BA 14 CR 1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 14 CR 1 TR 4 CH 4 NT D#- 1/2 voff=0 +BA 14 CR 1+1/2 TR 4 CH 4 NT D#- 1/2 voff=0 +BA 14 CR 2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 14 CR 2+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 14 CR 3 TR 4 CH 4 NT D- 1/2 voff=0 +BA 14 CR 3+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 15 CR 0 TR 4 CH 4 NT F- 
1/2 voff=0 +BA 15 CR 1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 15 CR 1 TR 4 CH 4 NT G- 1/2 voff=0 +BA 15 CR 1+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 15 CR 2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 15 CR 2+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 15 CR 3 TR 4 CH 4 NT G- 1/2 voff=0 +BA 15 CR 3+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 16 CR 0 TR 4 CH 4 NT C- 1/2 voff=0 +BA 16 CR 1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 16 CR 1 TR 4 CH 4 NT D#- 1/2 voff=0 +BA 16 CR 1+1/2 TR 4 CH 4 NT D#- 1/2 voff=0 +BA 16 CR 2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 16 CR 2+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 16 CR 3 TR 4 CH 4 NT D- 1/2 voff=0 +BA 16 CR 3+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 17 CR 0 TR 4 CH 4 NT A- 1/2 voff=0 +BA 17 CR 1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 17 CR 1 TR 4 CH 4 NT B- 1/2 voff=0 +BA 17 CR 1+1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 17 CR 2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 17 CR 2+1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 17 CR 3 TR 4 CH 4 NT F- 1/2 voff=0 +BA 17 CR 3+1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 18 CR 0 TR 4 CH 4 NT A#- 1/2 voff=0 +BA 18 CR 1/2 TR 4 CH 4 NT A#- 1/2 voff=0 +BA 18 CR 1 TR 4 CH 4 NT A- 1/2 voff=0 +BA 18 CR 1+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 18 CR 2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 18 CR 2+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 18 CR 3 TR 4 CH 4 NT A- 1/2 voff=0 +BA 18 CR 3+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 19 CR 0 TR 4 CH 4 NT A- 1/2 voff=0 +BA 19 CR 1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 19 CR 1 TR 4 CH 4 NT B- 1/2 voff=0 +BA 19 CR 1+1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 19 CR 2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 19 CR 2+1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 19 CR 3 TR 4 CH 4 NT F- 1/2 voff=0 +BA 19 CR 3+1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 20 CR 0 TR 4 CH 4 NT A#- 1/2 voff=0 +BA 20 CR 1/2 TR 4 CH 4 NT A#- 1/2 voff=0 +BA 20 CR 1 TR 4 CH 4 NT A- 1/2 voff=0 +BA 20 CR 1+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 20 CR 2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 20 CR 2+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 20 CR 3 TR 4 CH 4 NT A- 1/2 voff=0 +BA 20 CR 3+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 21 CR 0 TR 4 CH 4 NT A- 1/2 voff=0 +BA 21 CR 1/2 TR 4 CH 4 NT 
A- 1/2 voff=0 +BA 21 CR 1 TR 4 CH 4 NT E- 1/2 voff=0 +BA 21 CR 1+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 21 CR 2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 21 CR 2+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 21 CR 3 TR 4 CH 4 NT E- 1/2 voff=0 +BA 21 CR 3+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 22 CR 0 TR 4 CH 4 NT A- 1/2 voff=0 +BA 22 CR 1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 22 CR 1 TR 4 CH 4 NT D- 1/2 voff=0 +BA 22 CR 1+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 22 CR 2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 22 CR 2+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 22 CR 3 TR 4 CH 4 NT E- 1/2 voff=0 +BA 22 CR 3+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 23 CR 0 TR 4 CH 4 NT A- 1/2 voff=0 +BA 23 CR 1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 23 CR 1 TR 4 CH 4 NT E- 1/2 voff=0 +BA 23 CR 1+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 23 CR 2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 23 CR 2+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 23 CR 3 TR 4 CH 4 NT E- 1/2 voff=0 +BA 23 CR 3+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 24 CR 0 TR 4 CH 4 NT A- 1/2 voff=0 +BA 24 CR 1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 24 CR 1 TR 4 CH 4 NT D- 1/2 voff=0 +BA 24 CR 1+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 24 CR 2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 24 CR 2+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 24 CR 3 TR 4 CH 4 NT E- 1/2 voff=0 +BA 24 CR 3+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 1 CR 0 TR 5 CH 5 Instrument 29 +BA 1 CR 0 TR 5 CH 5 NT R 0 von=127 voff=0 +BA 1 CR 0 TR 5 CH 5 NT B- 1/2 voff=0 +BA 1 CR 0 TR 5 CH 5 NT D- 1/2 voff=0 +BA 1 CR 0 TR 5 CH 5 NT F- 1/2 voff=0 +BA 1 CR 1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 1 CR 1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 1 CR 1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 1 CR 1 TR 5 CH 5 NT F- 1/2 voff=0 +BA 1 CR 1 TR 5 CH 5 NT A- 1/2 voff=0 +BA 1 CR 1 TR 5 CH 5 NT C- 1/2 voff=0 +BA 1 CR 1+1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 1 CR 1+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 1 CR 1+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 1 CR 2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 1 CR 2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 1 CR 2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 1 CR 2+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 1 CR 2+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 1 CR 
2+1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 1 CR 3 TR 5 CH 5 NT E- 1/2 voff=0 +BA 1 CR 3 TR 5 CH 5 NT G- 1/2 voff=0 +BA 1 CR 3 TR 5 CH 5 NT B- 1/2 voff=0 +BA 1 CR 3+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 1 CR 3+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 1 CR 3+1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 2 CR 0 TR 5 CH 5 NT A- 1/2 voff=0 +BA 2 CR 0 TR 5 CH 5 NT C- 1/2 voff=0 +BA 2 CR 0 TR 5 CH 5 NT E- 1/2 voff=0 +BA 2 CR 1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 2 CR 1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 2 CR 1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 2 CR 1 TR 5 CH 5 NT G- 1/2 voff=0 +BA 2 CR 1 TR 5 CH 5 NT B- 1/2 voff=0 +BA 2 CR 1 TR 5 CH 5 NT D- 1/2 voff=0 +BA 2 CR 1+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 2 CR 1+1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 2 CR 1+1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 2 CR 2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 2 CR 2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 2 CR 2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 2 CR 2+1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 2 CR 2+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 2 CR 2+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 2 CR 3 TR 5 CH 5 NT F- 1/2 voff=0 +BA 2 CR 3 TR 5 CH 5 NT A- 1/2 voff=0 +BA 2 CR 3 TR 5 CH 5 NT C- 1/2 voff=0 +BA 2 CR 3+1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 2 CR 3+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 2 CR 3+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 3 CR 0 TR 5 CH 5 NT B- 1/2 voff=0 +BA 3 CR 0 TR 5 CH 5 NT D- 1/2 voff=0 +BA 3 CR 0 TR 5 CH 5 NT F- 1/2 voff=0 +BA 3 CR 1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 3 CR 1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 3 CR 1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 3 CR 1 TR 5 CH 5 NT F- 1/2 voff=0 +BA 3 CR 1 TR 5 CH 5 NT A- 1/2 voff=0 +BA 3 CR 1 TR 5 CH 5 NT C- 1/2 voff=0 +BA 3 CR 1+1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 3 CR 1+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 3 CR 1+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 3 CR 2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 3 CR 2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 3 CR 2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 3 CR 2+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 3 CR 2+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 3 CR 2+1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 3 CR 3 TR 5 CH 5 NT E- 1/2 voff=0 +BA 3 CR 3 TR 5 CH 5 
NT G- 1/2 voff=0 +BA 3 CR 3 TR 5 CH 5 NT B- 1/2 voff=0 +BA 3 CR 3+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 3 CR 3+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 3 CR 3+1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 4 CR 0 TR 5 CH 5 NT A- 1/2 voff=0 +BA 4 CR 0 TR 5 CH 5 NT C- 1/2 voff=0 +BA 4 CR 0 TR 5 CH 5 NT E- 1/2 voff=0 +BA 4 CR 1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 4 CR 1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 4 CR 1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 4 CR 1 TR 5 CH 5 NT G- 1/2 voff=0 +BA 4 CR 1 TR 5 CH 5 NT B- 1/2 voff=0 +BA 4 CR 1 TR 5 CH 5 NT D- 1/2 voff=0 +BA 4 CR 1+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 4 CR 1+1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 4 CR 1+1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 4 CR 2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 4 CR 2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 4 CR 2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 4 CR 2+1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 4 CR 2+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 4 CR 2+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 4 CR 3 TR 5 CH 5 NT F- 1/2 voff=0 +BA 4 CR 3 TR 5 CH 5 NT A- 1/2 voff=0 +BA 4 CR 3 TR 5 CH 5 NT C- 1/2 voff=0 +BA 4 CR 3+1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 4 CR 3+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 4 CR 3+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 9 CR 0 TR 5 CH 5 NT D- 1/2 voff=0 +BA 9 CR 0 TR 5 CH 5 NT F- 1/2 voff=0 +BA 9 CR 0 TR 5 CH 5 NT A- 1/2 voff=0 +BA 9 CR 1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 9 CR 1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 9 CR 1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 9 CR 1 TR 5 CH 5 NT E- 1/2 voff=0 +BA 9 CR 1 TR 5 CH 5 NT G- 1/2 voff=0 +BA 9 CR 1 TR 5 CH 5 NT B- 1/2 voff=0 +BA 9 CR 1+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 9 CR 1+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 9 CR 1+1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 9 CR 2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 9 CR 2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 9 CR 2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 9 CR 2+1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 9 CR 2+1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 9 CR 2+1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 9 CR 3 TR 5 CH 5 NT B- 1/2 voff=0 +BA 9 CR 3 TR 5 CH 5 NT D- 1/2 voff=0 +BA 9 CR 3 TR 5 CH 5 NT F- 1/2 voff=0 +BA 9 CR 3+1/2 TR 5 CH 5 NT B- 1/2 
voff=0 +BA 9 CR 3+1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 9 CR 3+1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 10 CR 0 TR 5 CH 5 NT E- 1/2 voff=0 +BA 10 CR 0 TR 5 CH 5 NT G- 1/2 voff=0 +BA 10 CR 0 TR 5 CH 5 NT B- 1/2 voff=0 +BA 10 CR 1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 10 CR 1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 10 CR 1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 10 CR 1 TR 5 CH 5 NT F#- 1/2 voff=0 +BA 10 CR 1 TR 5 CH 5 NT A- 1/2 voff=0 +BA 10 CR 1 TR 5 CH 5 NT C- 1/2 voff=0 +BA 10 CR 1+1/2 TR 5 CH 5 NT F#- 1/2 voff=0 +BA 10 CR 1+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 10 CR 1+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 10 CR 2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 10 CR 2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 10 CR 2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 10 CR 2+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 10 CR 2+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 10 CR 2+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 10 CR 3 TR 5 CH 5 NT A- 1/2 voff=0 +BA 10 CR 3 TR 5 CH 5 NT C- 1/2 voff=0 +BA 10 CR 3 TR 5 CH 5 NT E- 1/2 voff=0 +BA 10 CR 3+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 10 CR 3+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 10 CR 3+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 11 CR 0 TR 5 CH 5 NT D- 1/2 voff=0 +BA 11 CR 0 TR 5 CH 5 NT F- 1/2 voff=0 +BA 11 CR 0 TR 5 CH 5 NT A- 1/2 voff=0 +BA 11 CR 1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 11 CR 1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 11 CR 1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 11 CR 1 TR 5 CH 5 NT E- 1/2 voff=0 +BA 11 CR 1 TR 5 CH 5 NT G- 1/2 voff=0 +BA 11 CR 1 TR 5 CH 5 NT B- 1/2 voff=0 +BA 11 CR 1+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 11 CR 1+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 11 CR 1+1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 11 CR 2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 11 CR 2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 11 CR 2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 11 CR 2+1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 11 CR 2+1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 11 CR 2+1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 11 CR 3 TR 5 CH 5 NT B- 1/2 voff=0 +BA 11 CR 3 TR 5 CH 5 NT D- 1/2 voff=0 +BA 11 CR 3 TR 5 CH 5 NT F- 1/2 voff=0 +BA 11 CR 3+1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 11 CR 3+1/2 TR 5 CH 5 NT D- 1/2 
voff=0 +BA 11 CR 3+1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 12 CR 0 TR 5 CH 5 NT E- 1/2 voff=0 +BA 12 CR 0 TR 5 CH 5 NT G- 1/2 voff=0 +BA 12 CR 0 TR 5 CH 5 NT B- 1/2 voff=0 +BA 12 CR 1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 12 CR 1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 12 CR 1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 12 CR 1 TR 5 CH 5 NT F#- 1/2 voff=0 +BA 12 CR 1 TR 5 CH 5 NT A- 1/2 voff=0 +BA 12 CR 1 TR 5 CH 5 NT C- 1/2 voff=0 +BA 12 CR 1+1/2 TR 5 CH 5 NT F#- 1/2 voff=0 +BA 12 CR 1+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 12 CR 1+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 12 CR 2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 12 CR 2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 12 CR 2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 12 CR 2+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 12 CR 2+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 12 CR 2+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 12 CR 3 TR 5 CH 5 NT A- 1/2 voff=0 +BA 12 CR 3 TR 5 CH 5 NT C- 1/2 voff=0 +BA 12 CR 3 TR 5 CH 5 NT E- 1/2 voff=0 +BA 12 CR 3+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 12 CR 3+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 12 CR 3+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 13 CR 0 TR 5 CH 5 NT F- 1/2 voff=0 +BA 13 CR 0 TR 5 CH 5 NT A- 1/2 voff=0 +BA 13 CR 0 TR 5 CH 5 NT C- 1/2 voff=0 +BA 13 CR 1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 13 CR 1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 13 CR 1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 13 CR 1 TR 5 CH 5 NT G- 1/2 voff=0 +BA 13 CR 1 TR 5 CH 5 NT B- 1/2 voff=0 +BA 13 CR 1 TR 5 CH 5 NT D- 1/2 voff=0 +BA 13 CR 1+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 13 CR 1+1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 13 CR 1+1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 13 CR 2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 13 CR 2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 13 CR 2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 13 CR 2+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 13 CR 2+1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 13 CR 2+1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 13 CR 3 TR 5 CH 5 NT G- 1/2 voff=0 +BA 13 CR 3 TR 5 CH 5 NT B- 1/2 voff=0 +BA 13 CR 3 TR 5 CH 5 NT D- 1/2 voff=0 +BA 13 CR 3+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 13 CR 3+1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 13 CR 3+1/2 TR 5 CH 5 NT D- 
1/2 voff=0 +BA 14 CR 0 TR 5 CH 5 NT C- 1/2 voff=0 +BA 14 CR 0 TR 5 CH 5 NT E- 1/2 voff=0 +BA 14 CR 0 TR 5 CH 5 NT G- 1/2 voff=0 +BA 14 CR 1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 14 CR 1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 14 CR 1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 14 CR 1 TR 5 CH 5 NT D#- 1/2 voff=0 +BA 14 CR 1 TR 5 CH 5 NT G- 1/2 voff=0 +BA 14 CR 1 TR 5 CH 5 NT A- 1/2 voff=0 +BA 14 CR 1+1/2 TR 5 CH 5 NT D#- 1/2 voff=0 +BA 14 CR 1+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 14 CR 1+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 14 CR 2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 14 CR 2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 14 CR 2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 14 CR 2+1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 14 CR 2+1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 14 CR 2+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 14 CR 3 TR 5 CH 5 NT D- 1/2 voff=0 +BA 14 CR 3 TR 5 CH 5 NT F- 1/2 voff=0 +BA 14 CR 3 TR 5 CH 5 NT A- 1/2 voff=0 +BA 14 CR 3+1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 14 CR 3+1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 14 CR 3+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 15 CR 0 TR 5 CH 5 NT F- 1/2 voff=0 +BA 15 CR 0 TR 5 CH 5 NT A- 1/2 voff=0 +BA 15 CR 0 TR 5 CH 5 NT C- 1/2 voff=0 +BA 15 CR 1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 15 CR 1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 15 CR 1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 15 CR 1 TR 5 CH 5 NT G- 1/2 voff=0 +BA 15 CR 1 TR 5 CH 5 NT B- 1/2 voff=0 +BA 15 CR 1 TR 5 CH 5 NT D- 1/2 voff=0 +BA 15 CR 1+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 15 CR 1+1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 15 CR 1+1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 15 CR 2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 15 CR 2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 15 CR 2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 15 CR 2+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 15 CR 2+1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 15 CR 2+1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 15 CR 3 TR 5 CH 5 NT G- 1/2 voff=0 +BA 15 CR 3 TR 5 CH 5 NT B- 1/2 voff=0 +BA 15 CR 3 TR 5 CH 5 NT D- 1/2 voff=0 +BA 15 CR 3+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 15 CR 3+1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 15 CR 3+1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 16 CR 0 TR 5 CH 5 NT C- 
1/2 voff=0 +BA 16 CR 0 TR 5 CH 5 NT E- 1/2 voff=0 +BA 16 CR 0 TR 5 CH 5 NT G- 1/2 voff=0 +BA 16 CR 1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 16 CR 1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 16 CR 1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 16 CR 1 TR 5 CH 5 NT D#- 1/2 voff=0 +BA 16 CR 1 TR 5 CH 5 NT G- 1/2 voff=0 +BA 16 CR 1 TR 5 CH 5 NT A- 1/2 voff=0 +BA 16 CR 1+1/2 TR 5 CH 5 NT D#- 1/2 voff=0 +BA 16 CR 1+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 16 CR 1+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 16 CR 2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 16 CR 2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 16 CR 2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 16 CR 2+1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 16 CR 2+1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 16 CR 2+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 16 CR 3 TR 5 CH 5 NT D- 1/2 voff=0 +BA 16 CR 3 TR 5 CH 5 NT F- 1/2 voff=0 +BA 16 CR 3 TR 5 CH 5 NT A- 1/2 voff=0 +BA 16 CR 3+1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 16 CR 3+1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 16 CR 3+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 17 CR 0 TR 5 CH 5 NT A- 1/2 voff=0 +BA 17 CR 0 TR 5 CH 5 NT C- 1/2 voff=0 +BA 17 CR 0 TR 5 CH 5 NT E- 1/2 voff=0 +BA 17 CR 1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 17 CR 1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 17 CR 1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 17 CR 1 TR 5 CH 5 NT C- 1/2 voff=0 +BA 17 CR 1 TR 5 CH 5 NT E- 1/2 voff=0 +BA 17 CR 1 TR 5 CH 5 NT G- 1/2 voff=0 +BA 17 CR 1+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 17 CR 1+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 17 CR 1+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 17 CR 2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 17 CR 2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 17 CR 2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 17 CR 2+1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 17 CR 2+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 17 CR 2+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 17 CR 3 TR 5 CH 5 NT F- 1/2 voff=0 +BA 17 CR 3 TR 5 CH 5 NT A- 1/2 voff=0 +BA 17 CR 3 TR 5 CH 5 NT C- 1/2 voff=0 +BA 17 CR 3+1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 17 CR 3+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 17 CR 3+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 18 CR 0 TR 5 CH 5 NT A#- 1/2 voff=0 +BA 18 CR 0 TR 5 CH 5 NT D- 
1/2 voff=0 +BA 18 CR 0 TR 5 CH 5 NT F- 1/2 voff=0 +BA 18 CR 1/2 TR 5 CH 5 NT A#- 1/2 voff=0 +BA 18 CR 1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 18 CR 1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 18 CR 1 TR 5 CH 5 NT E- 1/2 voff=0 +BA 18 CR 1 TR 5 CH 5 NT G- 1/2 voff=0 +BA 18 CR 1 TR 5 CH 5 NT B- 1/2 voff=0 +BA 18 CR 1+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 18 CR 1+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 18 CR 1+1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 18 CR 2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 18 CR 2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 18 CR 2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 18 CR 2+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 18 CR 2+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 18 CR 2+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 18 CR 3 TR 5 CH 5 NT A- 1/2 voff=0 +BA 18 CR 3 TR 5 CH 5 NT C- 1/2 voff=0 +BA 18 CR 3 TR 5 CH 5 NT E- 1/2 voff=0 +BA 18 CR 3+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 18 CR 3+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 18 CR 3+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 19 CR 0 TR 5 CH 5 NT A- 1/2 voff=0 +BA 19 CR 0 TR 5 CH 5 NT C- 1/2 voff=0 +BA 19 CR 0 TR 5 CH 5 NT E- 1/2 voff=0 +BA 19 CR 1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 19 CR 1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 19 CR 1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 19 CR 1 TR 5 CH 5 NT C- 1/2 voff=0 +BA 19 CR 1 TR 5 CH 5 NT E- 1/2 voff=0 +BA 19 CR 1 TR 5 CH 5 NT G- 1/2 voff=0 +BA 19 CR 1+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 19 CR 1+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 19 CR 1+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 19 CR 2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 19 CR 2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 19 CR 2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 19 CR 2+1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 19 CR 2+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 19 CR 2+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 19 CR 3 TR 5 CH 5 NT F- 1/2 voff=0 +BA 19 CR 3 TR 5 CH 5 NT A- 1/2 voff=0 +BA 19 CR 3 TR 5 CH 5 NT C- 1/2 voff=0 +BA 19 CR 3+1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 19 CR 3+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 19 CR 3+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 20 CR 0 TR 5 CH 5 NT A#- 1/2 voff=0 +BA 20 CR 0 TR 5 CH 5 NT D- 1/2 voff=0 +BA 20 CR 0 TR 5 CH 5 NT F- 
1/2 voff=0 +BA 20 CR 1/2 TR 5 CH 5 NT A#- 1/2 voff=0 +BA 20 CR 1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 20 CR 1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 20 CR 1 TR 5 CH 5 NT E- 1/2 voff=0 +BA 20 CR 1 TR 5 CH 5 NT G- 1/2 voff=0 +BA 20 CR 1 TR 5 CH 5 NT B- 1/2 voff=0 +BA 20 CR 1+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 20 CR 1+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 20 CR 1+1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 20 CR 2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 20 CR 2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 20 CR 2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 20 CR 2+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 20 CR 2+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 20 CR 2+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 20 CR 3 TR 5 CH 5 NT A- 1/2 voff=0 +BA 20 CR 3 TR 5 CH 5 NT C- 1/2 voff=0 +BA 20 CR 3 TR 5 CH 5 NT E- 1/2 voff=0 +BA 20 CR 3+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 20 CR 3+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 20 CR 3+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 25 CR 0 TR 5 CH 5 NT E- 1/2 voff=0 +BA 25 CR 0 TR 5 CH 5 NT G- 1/2 voff=0 +BA 25 CR 0 TR 5 CH 5 NT B- 1/2 voff=0 +BA 25 CR 1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 25 CR 1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 25 CR 1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 25 CR 1 TR 5 CH 5 NT D#- 1/2 voff=0 +BA 25 CR 1 TR 5 CH 5 NT G- 1/2 voff=0 +BA 25 CR 1 TR 5 CH 5 NT A- 1/2 voff=0 +BA 25 CR 1+1/2 TR 5 CH 5 NT D#- 1/2 voff=0 +BA 25 CR 1+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 25 CR 1+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 25 CR 2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 25 CR 2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 25 CR 2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 25 CR 2+1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 25 CR 2+1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 25 CR 2+1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 25 CR 3 TR 5 CH 5 NT B- 1/2 voff=0 +BA 25 CR 3 TR 5 CH 5 NT D- 1/2 voff=0 +BA 25 CR 3 TR 5 CH 5 NT F- 1/2 voff=0 +BA 25 CR 3+1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 25 CR 3+1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 25 CR 3+1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 26 CR 0 TR 5 CH 5 NT E- 1/2 voff=0 +BA 26 CR 0 TR 5 CH 5 NT G- 1/2 voff=0 +BA 26 CR 0 TR 5 CH 5 NT B- 1/2 voff=0 +BA 26 CR 1/2 TR 5 CH 5 NT E- 
1/2 voff=0 +BA 26 CR 1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 26 CR 1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 26 CR 1 TR 5 CH 5 NT F#- 1/2 voff=0 +BA 26 CR 1 TR 5 CH 5 NT A- 1/2 voff=0 +BA 26 CR 1 TR 5 CH 5 NT C- 1/2 voff=0 +BA 26 CR 1+1/2 TR 5 CH 5 NT F#- 1/2 voff=0 +BA 26 CR 1+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 26 CR 1+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 26 CR 2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 26 CR 2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 26 CR 2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 26 CR 2+1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 26 CR 2+1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 26 CR 2+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 26 CR 3 TR 5 CH 5 NT D- 1/2 voff=0 +BA 26 CR 3 TR 5 CH 5 NT F- 1/2 voff=0 +BA 26 CR 3 TR 5 CH 5 NT A- 1/2 voff=0 +BA 26 CR 3+1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 26 CR 3+1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 26 CR 3+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 27 CR 0 TR 5 CH 5 NT E- 1/2 voff=0 +BA 27 CR 0 TR 5 CH 5 NT G- 1/2 voff=0 +BA 27 CR 0 TR 5 CH 5 NT B- 1/2 voff=0 +BA 27 CR 1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 27 CR 1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 27 CR 1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 27 CR 1 TR 5 CH 5 NT D#- 1/2 voff=0 +BA 27 CR 1 TR 5 CH 5 NT G- 1/2 voff=0 +BA 27 CR 1 TR 5 CH 5 NT A- 1/2 voff=0 +BA 27 CR 1+1/2 TR 5 CH 5 NT D#- 1/2 voff=0 +BA 27 CR 1+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 27 CR 1+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 27 CR 2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 27 CR 2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 27 CR 2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 27 CR 2+1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 27 CR 2+1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 27 CR 2+1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 27 CR 3 TR 5 CH 5 NT B- 1/2 voff=0 +BA 27 CR 3 TR 5 CH 5 NT D- 1/2 voff=0 +BA 27 CR 3 TR 5 CH 5 NT F- 1/2 voff=0 +BA 27 CR 3+1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 27 CR 3+1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 27 CR 3+1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 28 CR 0 TR 5 CH 5 NT E- 1/2 voff=0 +BA 28 CR 0 TR 5 CH 5 NT G- 1/2 voff=0 +BA 28 CR 0 TR 5 CH 5 NT B- 1/2 voff=0 +BA 28 CR 1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 28 CR 1/2 TR 5 CH 5 NT 
G- 1/2 voff=0 +BA 28 CR 1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 28 CR 1 TR 5 CH 5 NT F#- 1/2 voff=0 +BA 28 CR 1 TR 5 CH 5 NT A- 1/2 voff=0 +BA 28 CR 1 TR 5 CH 5 NT C- 1/2 voff=0 +BA 28 CR 1+1/2 TR 5 CH 5 NT F#- 1/2 voff=0 +BA 28 CR 1+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 28 CR 1+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 28 CR 2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 28 CR 2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 28 CR 2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 28 CR 2+1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 28 CR 2+1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 28 CR 2+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 28 CR 3 TR 5 CH 5 NT D- 1/2 voff=0 +BA 28 CR 3 TR 5 CH 5 NT F- 1/2 voff=0 +BA 28 CR 3 TR 5 CH 5 NT A- 1/2 voff=0 +BA 28 CR 3+1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 28 CR 3+1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 28 CR 3+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 29 CR 0 TR 5 CH 5 NT E- 1/2 voff=0 +BA 29 CR 0 TR 5 CH 5 NT G- 1/2 voff=0 +BA 29 CR 0 TR 5 CH 5 NT B- 1/2 voff=0 +BA 29 CR 1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 29 CR 1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 29 CR 1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 29 CR 1 TR 5 CH 5 NT D#- 1/2 voff=0 +BA 29 CR 1 TR 5 CH 5 NT G- 1/2 voff=0 +BA 29 CR 1 TR 5 CH 5 NT A- 1/2 voff=0 +BA 29 CR 1+1/2 TR 5 CH 5 NT D#- 1/2 voff=0 +BA 29 CR 1+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 29 CR 1+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 29 CR 2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 29 CR 2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 29 CR 2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 29 CR 2+1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 29 CR 2+1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 29 CR 2+1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 29 CR 3 TR 5 CH 5 NT B- 1/2 voff=0 +BA 29 CR 3 TR 5 CH 5 NT D- 1/2 voff=0 +BA 29 CR 3 TR 5 CH 5 NT F- 1/2 voff=0 +BA 29 CR 3+1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 29 CR 3+1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 29 CR 3+1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 30 CR 0 TR 5 CH 5 NT E- 1/2 voff=0 +BA 30 CR 0 TR 5 CH 5 NT G- 1/2 voff=0 +BA 30 CR 0 TR 5 CH 5 NT B- 1/2 voff=0 +BA 30 CR 1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 30 CR 1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 30 CR 1/2 TR 5 CH 5 
NT B- 1/2 voff=0 +BA 30 CR 1 TR 5 CH 5 NT F#- 1/2 voff=0 +BA 30 CR 1 TR 5 CH 5 NT A- 1/2 voff=0 +BA 30 CR 1 TR 5 CH 5 NT C- 1/2 voff=0 +BA 30 CR 1+1/2 TR 5 CH 5 NT F#- 1/2 voff=0 +BA 30 CR 1+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 30 CR 1+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 30 CR 2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 30 CR 2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 30 CR 2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 30 CR 2+1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 30 CR 2+1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 30 CR 2+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 30 CR 3 TR 5 CH 5 NT D- 1/2 voff=0 +BA 30 CR 3 TR 5 CH 5 NT F- 1/2 voff=0 +BA 30 CR 3 TR 5 CH 5 NT A- 1/2 voff=0 +BA 30 CR 3+1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 30 CR 3+1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 30 CR 3+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 31 CR 0 TR 5 CH 5 NT E- 1/2 voff=0 +BA 31 CR 0 TR 5 CH 5 NT G- 1/2 voff=0 +BA 31 CR 0 TR 5 CH 5 NT B- 1/2 voff=0 +BA 31 CR 1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 31 CR 1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 31 CR 1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 31 CR 1 TR 5 CH 5 NT D#- 1/2 voff=0 +BA 31 CR 1 TR 5 CH 5 NT G- 1/2 voff=0 +BA 31 CR 1 TR 5 CH 5 NT A- 1/2 voff=0 +BA 31 CR 1+1/2 TR 5 CH 5 NT D#- 1/2 voff=0 +BA 31 CR 1+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 31 CR 1+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 31 CR 2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 31 CR 2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 31 CR 2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 31 CR 2+1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 31 CR 2+1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 31 CR 2+1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 31 CR 3 TR 5 CH 5 NT B- 1/2 voff=0 +BA 31 CR 3 TR 5 CH 5 NT D- 1/2 voff=0 +BA 31 CR 3 TR 5 CH 5 NT F- 1/2 voff=0 +BA 31 CR 3+1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 31 CR 3+1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 31 CR 3+1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 32 CR 0 TR 5 CH 5 NT E- 1/2 voff=0 +BA 32 CR 0 TR 5 CH 5 NT G- 1/2 voff=0 +BA 32 CR 0 TR 5 CH 5 NT B- 1/2 voff=0 +BA 32 CR 1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 32 CR 1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 32 CR 1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 32 CR 1 TR 5 CH 5 
NT F#- 1/2 voff=0 +BA 32 CR 1 TR 5 CH 5 NT A- 1/2 voff=0 +BA 32 CR 1 TR 5 CH 5 NT C- 1/2 voff=0 +BA 32 CR 1+1/2 TR 5 CH 5 NT F#- 1/2 voff=0 +BA 32 CR 1+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 32 CR 1+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 32 CR 2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 32 CR 2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 32 CR 2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 32 CR 2+1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 32 CR 2+1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 32 CR 2+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 32 CR 3 TR 5 CH 5 NT D- 1/2 voff=0 +BA 32 CR 3 TR 5 CH 5 NT F- 1/2 voff=0 +BA 32 CR 3 TR 5 CH 5 NT A- 1/2 voff=0 +BA 32 CR 3+1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 32 CR 3+1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 32 CR 3+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 33 CR 0 TR 5 CH 5 NT E- 1/2 voff=0 +BA 33 CR 0 TR 5 CH 5 NT G- 1/2 voff=0 +BA 33 CR 0 TR 5 CH 5 NT B- 1/2 voff=0 +BA 33 CR 1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 33 CR 1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 33 CR 1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 33 CR 1 TR 5 CH 5 NT D#- 1/2 voff=0 +BA 33 CR 1 TR 5 CH 5 NT G- 1/2 voff=0 +BA 33 CR 1 TR 5 CH 5 NT A- 1/2 voff=0 +BA 33 CR 1+1/2 TR 5 CH 5 NT D#- 1/2 voff=0 +BA 33 CR 1+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 33 CR 1+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 33 CR 2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 33 CR 2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 33 CR 2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 33 CR 2+1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 33 CR 2+1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 33 CR 2+1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 33 CR 3 TR 5 CH 5 NT B- 1/2 voff=0 +BA 33 CR 3 TR 5 CH 5 NT D- 1/2 voff=0 +BA 33 CR 3 TR 5 CH 5 NT F- 1/2 voff=0 +BA 33 CR 3+1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 33 CR 3+1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 33 CR 3+1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 34 CR 0 TR 5 CH 5 NT E- 1/2 voff=0 +BA 34 CR 0 TR 5 CH 5 NT G- 1/2 voff=0 +BA 34 CR 0 TR 5 CH 5 NT B- 1/2 voff=0 +BA 34 CR 1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 34 CR 1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 34 CR 1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 34 CR 1 TR 5 CH 5 NT F#- 1/2 voff=0 +BA 34 CR 1 TR 5 CH 5 
NT A- 1/2 voff=0 +BA 34 CR 1 TR 5 CH 5 NT C- 1/2 voff=0 +BA 34 CR 1+1/2 TR 5 CH 5 NT F#- 1/2 voff=0 +BA 34 CR 1+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 34 CR 1+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 34 CR 2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 34 CR 2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 34 CR 2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 34 CR 2+1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 34 CR 2+1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 34 CR 2+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 34 CR 3 TR 5 CH 5 NT D- 1/2 voff=0 +BA 34 CR 3 TR 5 CH 5 NT F- 1/2 voff=0 +BA 34 CR 3 TR 5 CH 5 NT A- 1/2 voff=0 +BA 34 CR 3+1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 34 CR 3+1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 34 CR 3+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 35 CR 0 TR 5 CH 5 NT E- 1/2 voff=0 +BA 35 CR 0 TR 5 CH 5 NT G- 1/2 voff=0 +BA 35 CR 0 TR 5 CH 5 NT B- 1/2 voff=0 +BA 35 CR 1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 35 CR 1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 35 CR 1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 35 CR 1 TR 5 CH 5 NT D#- 1/2 voff=0 +BA 35 CR 1 TR 5 CH 5 NT G- 1/2 voff=0 +BA 35 CR 1 TR 5 CH 5 NT A- 1/2 voff=0 +BA 35 CR 1+1/2 TR 5 CH 5 NT D#- 1/2 voff=0 +BA 35 CR 1+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 35 CR 1+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 35 CR 2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 35 CR 2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 35 CR 2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 35 CR 2+1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 35 CR 2+1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 35 CR 2+1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 35 CR 3 TR 5 CH 5 NT B- 1/2 voff=0 +BA 35 CR 3 TR 5 CH 5 NT D- 1/2 voff=0 +BA 35 CR 3 TR 5 CH 5 NT F- 1/2 voff=0 +BA 35 CR 3+1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 35 CR 3+1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 35 CR 3+1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 36 CR 0 TR 5 CH 5 NT E- 1/2 voff=0 +BA 36 CR 0 TR 5 CH 5 NT G- 1/2 voff=0 +BA 36 CR 0 TR 5 CH 5 NT B- 1/2 voff=0 +BA 36 CR 1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 36 CR 1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 36 CR 1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 36 CR 1 TR 5 CH 5 NT F#- 1/2 voff=0 +BA 36 CR 1 TR 5 CH 5 NT A- 1/2 voff=0 +BA 36 CR 1 TR 5 CH 5 
NT C- 1/2 voff=0 +BA 36 CR 1+1/2 TR 5 CH 5 NT F#- 1/2 voff=0 +BA 36 CR 1+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 36 CR 1+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 36 CR 2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 36 CR 2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 36 CR 2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 36 CR 2+1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 36 CR 2+1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 36 CR 2+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 36 CR 3 TR 5 CH 5 NT D- 1/2 voff=0 +BA 36 CR 3 TR 5 CH 5 NT F- 1/2 voff=0 +BA 36 CR 3 TR 5 CH 5 NT A- 1/2 voff=0 +BA 36 CR 3+1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 36 CR 3+1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 36 CR 3+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 1 CR 0 TR 6 CH 6 Instrument 79 +BA 1 CR 0 TR 6 CH 6 NT R 0 von=127 voff=0 +BA 5 CR 0 TR 6 CH 6 NT F- 1/2 voff=0 +BA 5 CR 0 TR 6 CH 6 NT A- 1/2 voff=0 +BA 5 CR 0 TR 6 CH 6 NT C- 1/2 voff=0 +BA 5 CR 1/2 TR 6 CH 6 NT F- 1/2 voff=0 +BA 5 CR 1/2 TR 6 CH 6 NT A- 1/2 voff=0 +BA 5 CR 1/2 TR 6 CH 6 NT C- 1/2 voff=0 +BA 5 CR 1 TR 6 CH 6 NT G- 1/2 voff=0 +BA 5 CR 1 TR 6 CH 6 NT B- 1/2 voff=0 +BA 5 CR 1 TR 6 CH 6 NT D- 1/2 voff=0 +BA 5 CR 1+1/2 TR 6 CH 6 NT G- 1/2 voff=0 +BA 5 CR 1+1/2 TR 6 CH 6 NT B- 1/2 voff=0 +BA 5 CR 1+1/2 TR 6 CH 6 NT D- 1/2 voff=0 +BA 5 CR 2 TR 6 CH 6 NT G- 1/2 voff=0 +BA 5 CR 2 TR 6 CH 6 NT B- 1/2 voff=0 +BA 5 CR 2 TR 6 CH 6 NT D- 1/2 voff=0 +BA 5 CR 2+1/2 TR 6 CH 6 NT G- 1/2 voff=0 +BA 5 CR 2+1/2 TR 6 CH 6 NT B- 1/2 voff=0 +BA 5 CR 2+1/2 TR 6 CH 6 NT D- 1/2 voff=0 +BA 5 CR 3 TR 6 CH 6 NT G- 1/2 voff=0 +BA 5 CR 3 TR 6 CH 6 NT B- 1/2 voff=0 +BA 5 CR 3 TR 6 CH 6 NT D- 1/2 voff=0 +BA 5 CR 3+1/2 TR 6 CH 6 NT G- 1/2 voff=0 +BA 5 CR 3+1/2 TR 6 CH 6 NT B- 1/2 voff=0 +BA 5 CR 3+1/2 TR 6 CH 6 NT D- 1/2 voff=0 +BA 6 CR 0 TR 6 CH 6 NT C- 1/2 voff=0 +BA 6 CR 0 TR 6 CH 6 NT E- 1/2 voff=0 +BA 6 CR 0 TR 6 CH 6 NT G- 1/2 voff=0 +BA 6 CR 1/2 TR 6 CH 6 NT C- 1/2 voff=0 +BA 6 CR 1/2 TR 6 CH 6 NT E- 1/2 voff=0 +BA 6 CR 1/2 TR 6 CH 6 NT G- 1/2 voff=0 +BA 6 CR 1 TR 6 CH 6 NT D#- 1/2 voff=0 +BA 6 CR 1 TR 6 CH 6 NT G- 1/2 voff=0 +BA 6 CR 1 TR 6 
CH 6 NT A- 1/2 voff=0 +BA 6 CR 1+1/2 TR 6 CH 6 NT D#- 1/2 voff=0 +BA 6 CR 1+1/2 TR 6 CH 6 NT G- 1/2 voff=0 +BA 6 CR 1+1/2 TR 6 CH 6 NT A- 1/2 voff=0 +BA 6 CR 2 TR 6 CH 6 NT D- 1/2 voff=0 +BA 6 CR 2 TR 6 CH 6 NT F- 1/2 voff=0 +BA 6 CR 2 TR 6 CH 6 NT A- 1/2 voff=0 +BA 6 CR 2+1/2 TR 6 CH 6 NT D- 1/2 voff=0 +BA 6 CR 2+1/2 TR 6 CH 6 NT F- 1/2 voff=0 +BA 6 CR 2+1/2 TR 6 CH 6 NT A- 1/2 voff=0 +BA 6 CR 3 TR 6 CH 6 NT D- 1/2 voff=0 +BA 6 CR 3 TR 6 CH 6 NT F- 1/2 voff=0 +BA 6 CR 3 TR 6 CH 6 NT A- 1/2 voff=0 +BA 6 CR 3+1/2 TR 6 CH 6 NT D- 1/2 voff=0 +BA 6 CR 3+1/2 TR 6 CH 6 NT F- 1/2 voff=0 +BA 6 CR 3+1/2 TR 6 CH 6 NT A- 1/2 voff=0 +BA 7 CR 0 TR 6 CH 6 NT F- 1/2 voff=0 +BA 7 CR 0 TR 6 CH 6 NT A- 1/2 voff=0 +BA 7 CR 0 TR 6 CH 6 NT C- 1/2 voff=0 +BA 7 CR 1/2 TR 6 CH 6 NT F- 1/2 voff=0 +BA 7 CR 1/2 TR 6 CH 6 NT A- 1/2 voff=0 +BA 7 CR 1/2 TR 6 CH 6 NT C- 1/2 voff=0 +BA 7 CR 1 TR 6 CH 6 NT G- 1/2 voff=0 +BA 7 CR 1 TR 6 CH 6 NT B- 1/2 voff=0 +BA 7 CR 1 TR 6 CH 6 NT D- 1/2 voff=0 +BA 7 CR 1+1/2 TR 6 CH 6 NT G- 1/2 voff=0 +BA 7 CR 1+1/2 TR 6 CH 6 NT B- 1/2 voff=0 +BA 7 CR 1+1/2 TR 6 CH 6 NT D- 1/2 voff=0 +BA 7 CR 2 TR 6 CH 6 NT G- 1/2 voff=0 +BA 7 CR 2 TR 6 CH 6 NT B- 1/2 voff=0 +BA 7 CR 2 TR 6 CH 6 NT D- 1/2 voff=0 +BA 7 CR 2+1/2 TR 6 CH 6 NT G- 1/2 voff=0 +BA 7 CR 2+1/2 TR 6 CH 6 NT B- 1/2 voff=0 +BA 7 CR 2+1/2 TR 6 CH 6 NT D- 1/2 voff=0 +BA 7 CR 3 TR 6 CH 6 NT G- 1/2 voff=0 +BA 7 CR 3 TR 6 CH 6 NT B- 1/2 voff=0 +BA 7 CR 3 TR 6 CH 6 NT D- 1/2 voff=0 +BA 7 CR 3+1/2 TR 6 CH 6 NT G- 1/2 voff=0 +BA 7 CR 3+1/2 TR 6 CH 6 NT B- 1/2 voff=0 +BA 7 CR 3+1/2 TR 6 CH 6 NT D- 1/2 voff=0 +BA 8 CR 0 TR 6 CH 6 NT C- 1/2 voff=0 +BA 8 CR 0 TR 6 CH 6 NT E- 1/2 voff=0 +BA 8 CR 0 TR 6 CH 6 NT G- 1/2 voff=0 +BA 8 CR 1/2 TR 6 CH 6 NT C- 1/2 voff=0 +BA 8 CR 1/2 TR 6 CH 6 NT E- 1/2 voff=0 +BA 8 CR 1/2 TR 6 CH 6 NT G- 1/2 voff=0 +BA 8 CR 1 TR 6 CH 6 NT D#- 1/2 voff=0 +BA 8 CR 1 TR 6 CH 6 NT G- 1/2 voff=0 +BA 8 CR 1 TR 6 CH 6 NT A- 1/2 voff=0 +BA 8 CR 1+1/2 TR 6 CH 6 NT D#- 1/2 voff=0 +BA 8 CR 1+1/2 TR 6 CH 6 
NT G- 1/2 voff=0 +BA 8 CR 1+1/2 TR 6 CH 6 NT A- 1/2 voff=0 +BA 8 CR 2 TR 6 CH 6 NT D- 1/2 voff=0 +BA 8 CR 2 TR 6 CH 6 NT F- 1/2 voff=0 +BA 8 CR 2 TR 6 CH 6 NT A- 1/2 voff=0 +BA 8 CR 2+1/2 TR 6 CH 6 NT D- 1/2 voff=0 +BA 8 CR 2+1/2 TR 6 CH 6 NT F- 1/2 voff=0 +BA 8 CR 2+1/2 TR 6 CH 6 NT A- 1/2 voff=0 +BA 8 CR 3 TR 6 CH 6 NT D- 1/2 voff=0 +BA 8 CR 3 TR 6 CH 6 NT F- 1/2 voff=0 +BA 8 CR 3 TR 6 CH 6 NT A- 1/2 voff=0 +BA 8 CR 3+1/2 TR 6 CH 6 NT D- 1/2 voff=0 +BA 8 CR 3+1/2 TR 6 CH 6 NT F- 1/2 voff=0 +BA 8 CR 3+1/2 TR 6 CH 6 NT A- 1/2 voff=0 +BA 13 CR 0 TR 6 CH 6 NT F- 1/2 voff=0 +BA 13 CR 0 TR 6 CH 6 NT A- 1/2 voff=0 +BA 13 CR 0 TR 6 CH 6 NT C- 1/2 voff=0 +BA 13 CR 1/2 TR 6 CH 6 NT F- 1/2 voff=0 +BA 13 CR 1/2 TR 6 CH 6 NT A- 1/2 voff=0 +BA 13 CR 1/2 TR 6 CH 6 NT C- 1/2 voff=0 +BA 13 CR 1 TR 6 CH 6 NT G- 1/2 voff=0 +BA 13 CR 1 TR 6 CH 6 NT B- 1/2 voff=0 +BA 13 CR 1 TR 6 CH 6 NT D- 1/2 voff=0 +BA 13 CR 1+1/2 TR 6 CH 6 NT G- 1/2 voff=0 +BA 13 CR 1+1/2 TR 6 CH 6 NT B- 1/2 voff=0 +BA 13 CR 1+1/2 TR 6 CH 6 NT D- 1/2 voff=0 +BA 13 CR 2 TR 6 CH 6 NT G- 1/2 voff=0 +BA 13 CR 2 TR 6 CH 6 NT B- 1/2 voff=0 +BA 13 CR 2 TR 6 CH 6 NT D- 1/2 voff=0 +BA 13 CR 2+1/2 TR 6 CH 6 NT G- 1/2 voff=0 +BA 13 CR 2+1/2 TR 6 CH 6 NT B- 1/2 voff=0 +BA 13 CR 2+1/2 TR 6 CH 6 NT D- 1/2 voff=0 +BA 13 CR 3 TR 6 CH 6 NT G- 1/2 voff=0 +BA 13 CR 3 TR 6 CH 6 NT B- 1/2 voff=0 +BA 13 CR 3 TR 6 CH 6 NT D- 1/2 voff=0 +BA 13 CR 3+1/2 TR 6 CH 6 NT G- 1/2 voff=0 +BA 13 CR 3+1/2 TR 6 CH 6 NT B- 1/2 voff=0 +BA 13 CR 3+1/2 TR 6 CH 6 NT D- 1/2 voff=0 +BA 14 CR 0 TR 6 CH 6 NT C- 1/2 voff=0 +BA 14 CR 0 TR 6 CH 6 NT E- 1/2 voff=0 +BA 14 CR 0 TR 6 CH 6 NT G- 1/2 voff=0 +BA 14 CR 1/2 TR 6 CH 6 NT C- 1/2 voff=0 +BA 14 CR 1/2 TR 6 CH 6 NT E- 1/2 voff=0 +BA 14 CR 1/2 TR 6 CH 6 NT G- 1/2 voff=0 +BA 14 CR 1 TR 6 CH 6 NT D#- 1/2 voff=0 +BA 14 CR 1 TR 6 CH 6 NT G- 1/2 voff=0 +BA 14 CR 1 TR 6 CH 6 NT A- 1/2 voff=0 +BA 14 CR 1+1/2 TR 6 CH 6 NT D#- 1/2 voff=0 +BA 14 CR 1+1/2 TR 6 CH 6 NT G- 1/2 voff=0 +BA 14 CR 1+1/2 TR 6 CH 6 NT A- 1/2 
voff=0 +BA 14 CR 2 TR 6 CH 6 NT D- 1/2 voff=0 +BA 14 CR 2 TR 6 CH 6 NT F- 1/2 voff=0 +BA 14 CR 2 TR 6 CH 6 NT A- 1/2 voff=0 +BA 14 CR 2+1/2 TR 6 CH 6 NT D- 1/2 voff=0 +BA 14 CR 2+1/2 TR 6 CH 6 NT F- 1/2 voff=0 +BA 14 CR 2+1/2 TR 6 CH 6 NT A- 1/2 voff=0 +BA 14 CR 3 TR 6 CH 6 NT D- 1/2 voff=0 +BA 14 CR 3 TR 6 CH 6 NT F- 1/2 voff=0 +BA 14 CR 3 TR 6 CH 6 NT A- 1/2 voff=0 +BA 14 CR 3+1/2 TR 6 CH 6 NT D- 1/2 voff=0 +BA 14 CR 3+1/2 TR 6 CH 6 NT F- 1/2 voff=0 +BA 14 CR 3+1/2 TR 6 CH 6 NT A- 1/2 voff=0 +BA 15 CR 0 TR 6 CH 6 NT F- 1/2 voff=0 +BA 15 CR 0 TR 6 CH 6 NT A- 1/2 voff=0 +BA 15 CR 0 TR 6 CH 6 NT C- 1/2 voff=0 +BA 15 CR 1/2 TR 6 CH 6 NT F- 1/2 voff=0 +BA 15 CR 1/2 TR 6 CH 6 NT A- 1/2 voff=0 +BA 15 CR 1/2 TR 6 CH 6 NT C- 1/2 voff=0 +BA 15 CR 1 TR 6 CH 6 NT G- 1/2 voff=0 +BA 15 CR 1 TR 6 CH 6 NT B- 1/2 voff=0 +BA 15 CR 1 TR 6 CH 6 NT D- 1/2 voff=0 +BA 15 CR 1+1/2 TR 6 CH 6 NT G- 1/2 voff=0 +BA 15 CR 1+1/2 TR 6 CH 6 NT B- 1/2 voff=0 +BA 15 CR 1+1/2 TR 6 CH 6 NT D- 1/2 voff=0 +BA 15 CR 2 TR 6 CH 6 NT G- 1/2 voff=0 +BA 15 CR 2 TR 6 CH 6 NT B- 1/2 voff=0 +BA 15 CR 2 TR 6 CH 6 NT D- 1/2 voff=0 +BA 15 CR 2+1/2 TR 6 CH 6 NT G- 1/2 voff=0 +BA 15 CR 2+1/2 TR 6 CH 6 NT B- 1/2 voff=0 +BA 15 CR 2+1/2 TR 6 CH 6 NT D- 1/2 voff=0 +BA 15 CR 3 TR 6 CH 6 NT G- 1/2 voff=0 +BA 15 CR 3 TR 6 CH 6 NT B- 1/2 voff=0 +BA 15 CR 3 TR 6 CH 6 NT D- 1/2 voff=0 +BA 15 CR 3+1/2 TR 6 CH 6 NT G- 1/2 voff=0 +BA 15 CR 3+1/2 TR 6 CH 6 NT B- 1/2 voff=0 +BA 15 CR 3+1/2 TR 6 CH 6 NT D- 1/2 voff=0 +BA 16 CR 0 TR 6 CH 6 NT C- 1/2 voff=0 +BA 16 CR 0 TR 6 CH 6 NT E- 1/2 voff=0 +BA 16 CR 0 TR 6 CH 6 NT G- 1/2 voff=0 +BA 16 CR 1/2 TR 6 CH 6 NT C- 1/2 voff=0 +BA 16 CR 1/2 TR 6 CH 6 NT E- 1/2 voff=0 +BA 16 CR 1/2 TR 6 CH 6 NT G- 1/2 voff=0 +BA 16 CR 1 TR 6 CH 6 NT D#- 1/2 voff=0 +BA 16 CR 1 TR 6 CH 6 NT G- 1/2 voff=0 +BA 16 CR 1 TR 6 CH 6 NT A- 1/2 voff=0 +BA 16 CR 1+1/2 TR 6 CH 6 NT D#- 1/2 voff=0 +BA 16 CR 1+1/2 TR 6 CH 6 NT G- 1/2 voff=0 +BA 16 CR 1+1/2 TR 6 CH 6 NT A- 1/2 voff=0 +BA 16 CR 2 TR 6 CH 6 NT D- 1/2 
voff=0 +BA 16 CR 2 TR 6 CH 6 NT F- 1/2 voff=0 +BA 16 CR 2 TR 6 CH 6 NT A- 1/2 voff=0 +BA 16 CR 2+1/2 TR 6 CH 6 NT D- 1/2 voff=0 +BA 16 CR 2+1/2 TR 6 CH 6 NT F- 1/2 voff=0 +BA 16 CR 2+1/2 TR 6 CH 6 NT A- 1/2 voff=0 +BA 16 CR 3 TR 6 CH 6 NT D- 1/2 voff=0 +BA 16 CR 3 TR 6 CH 6 NT F- 1/2 voff=0 +BA 16 CR 3 TR 6 CH 6 NT A- 1/2 voff=0 +BA 16 CR 3+1/2 TR 6 CH 6 NT D- 1/2 voff=0 +BA 16 CR 3+1/2 TR 6 CH 6 NT F- 1/2 voff=0 +BA 16 CR 3+1/2 TR 6 CH 6 NT A- 1/2 voff=0 +BA 17 CR 0 TR 6 CH 6 NT A- 1/2 voff=0 +BA 17 CR 0 TR 6 CH 6 NT C- 1/2 voff=0 +BA 17 CR 0 TR 6 CH 6 NT E- 1/2 voff=0 +BA 17 CR 1/2 TR 6 CH 6 NT A- 1/2 voff=0 +BA 17 CR 1/2 TR 6 CH 6 NT C- 1/2 voff=0 +BA 17 CR 1/2 TR 6 CH 6 NT E- 1/2 voff=0 +BA 17 CR 1 TR 6 CH 6 NT C- 1/2 voff=0 +BA 17 CR 1 TR 6 CH 6 NT E- 1/2 voff=0 +BA 17 CR 1 TR 6 CH 6 NT G- 1/2 voff=0 +BA 17 CR 1+1/2 TR 6 CH 6 NT C- 1/2 voff=0 +BA 17 CR 1+1/2 TR 6 CH 6 NT E- 1/2 voff=0 +BA 17 CR 1+1/2 TR 6 CH 6 NT G- 1/2 voff=0 +BA 17 CR 2 TR 6 CH 6 NT F- 1/2 voff=0 +BA 17 CR 2 TR 6 CH 6 NT A- 1/2 voff=0 +BA 17 CR 2 TR 6 CH 6 NT C- 1/2 voff=0 +BA 17 CR 2+1/2 TR 6 CH 6 NT F- 1/2 voff=0 +BA 17 CR 2+1/2 TR 6 CH 6 NT A- 1/2 voff=0 +BA 17 CR 2+1/2 TR 6 CH 6 NT C- 1/2 voff=0 +BA 17 CR 3 TR 6 CH 6 NT F- 1/2 voff=0 +BA 17 CR 3 TR 6 CH 6 NT A- 1/2 voff=0 +BA 17 CR 3 TR 6 CH 6 NT C- 1/2 voff=0 +BA 17 CR 3+1/2 TR 6 CH 6 NT F- 1/2 voff=0 +BA 17 CR 3+1/2 TR 6 CH 6 NT A- 1/2 voff=0 +BA 17 CR 3+1/2 TR 6 CH 6 NT C- 1/2 voff=0 +BA 18 CR 0 TR 6 CH 6 NT A#- 1/2 voff=0 +BA 18 CR 0 TR 6 CH 6 NT D- 1/2 voff=0 +BA 18 CR 0 TR 6 CH 6 NT F- 1/2 voff=0 +BA 18 CR 1/2 TR 6 CH 6 NT A#- 1/2 voff=0 +BA 18 CR 1/2 TR 6 CH 6 NT D- 1/2 voff=0 +BA 18 CR 1/2 TR 6 CH 6 NT F- 1/2 voff=0 +BA 18 CR 1 TR 6 CH 6 NT E- 1/2 voff=0 +BA 18 CR 1 TR 6 CH 6 NT G- 1/2 voff=0 +BA 18 CR 1 TR 6 CH 6 NT B- 1/2 voff=0 +BA 18 CR 1+1/2 TR 6 CH 6 NT E- 1/2 voff=0 +BA 18 CR 1+1/2 TR 6 CH 6 NT G- 1/2 voff=0 +BA 18 CR 1+1/2 TR 6 CH 6 NT B- 1/2 voff=0 +BA 18 CR 2 TR 6 CH 6 NT A- 1/2 voff=0 +BA 18 CR 2 TR 6 CH 6 NT C- 1/2 
voff=0 +BA 18 CR 2 TR 6 CH 6 NT E- 1/2 voff=0 +BA 18 CR 2+1/2 TR 6 CH 6 NT A- 1/2 voff=0 +BA 18 CR 2+1/2 TR 6 CH 6 NT C- 1/2 voff=0 +BA 18 CR 2+1/2 TR 6 CH 6 NT E- 1/2 voff=0 +BA 18 CR 3 TR 6 CH 6 NT A- 1/2 voff=0 +BA 18 CR 3 TR 6 CH 6 NT C- 1/2 voff=0 +BA 18 CR 3 TR 6 CH 6 NT E- 1/2 voff=0 +BA 18 CR 3+1/2 TR 6 CH 6 NT A- 1/2 voff=0 +BA 18 CR 3+1/2 TR 6 CH 6 NT C- 1/2 voff=0 +BA 18 CR 3+1/2 TR 6 CH 6 NT E- 1/2 voff=0 +BA 19 CR 0 TR 6 CH 6 NT A- 1/2 voff=0 +BA 19 CR 0 TR 6 CH 6 NT C- 1/2 voff=0 +BA 19 CR 0 TR 6 CH 6 NT E- 1/2 voff=0 +BA 19 CR 1/2 TR 6 CH 6 NT A- 1/2 voff=0 +BA 19 CR 1/2 TR 6 CH 6 NT C- 1/2 voff=0 +BA 19 CR 1/2 TR 6 CH 6 NT E- 1/2 voff=0 +BA 19 CR 1 TR 6 CH 6 NT C- 1/2 voff=0 +BA 19 CR 1 TR 6 CH 6 NT E- 1/2 voff=0 +BA 19 CR 1 TR 6 CH 6 NT G- 1/2 voff=0 +BA 19 CR 1+1/2 TR 6 CH 6 NT C- 1/2 voff=0 +BA 19 CR 1+1/2 TR 6 CH 6 NT E- 1/2 voff=0 +BA 19 CR 1+1/2 TR 6 CH 6 NT G- 1/2 voff=0 +BA 19 CR 2 TR 6 CH 6 NT F- 1/2 voff=0 +BA 19 CR 2 TR 6 CH 6 NT A- 1/2 voff=0 +BA 19 CR 2 TR 6 CH 6 NT C- 1/2 voff=0 +BA 19 CR 2+1/2 TR 6 CH 6 NT F- 1/2 voff=0 +BA 19 CR 2+1/2 TR 6 CH 6 NT A- 1/2 voff=0 +BA 19 CR 2+1/2 TR 6 CH 6 NT C- 1/2 voff=0 +BA 19 CR 3 TR 6 CH 6 NT F- 1/2 voff=0 +BA 19 CR 3 TR 6 CH 6 NT A- 1/2 voff=0 +BA 19 CR 3 TR 6 CH 6 NT C- 1/2 voff=0 +BA 19 CR 3+1/2 TR 6 CH 6 NT F- 1/2 voff=0 +BA 19 CR 3+1/2 TR 6 CH 6 NT A- 1/2 voff=0 +BA 19 CR 3+1/2 TR 6 CH 6 NT C- 1/2 voff=0 +BA 20 CR 0 TR 6 CH 6 NT A#- 1/2 voff=0 +BA 20 CR 0 TR 6 CH 6 NT D- 1/2 voff=0 +BA 20 CR 0 TR 6 CH 6 NT F- 1/2 voff=0 +BA 20 CR 1/2 TR 6 CH 6 NT A#- 1/2 voff=0 +BA 20 CR 1/2 TR 6 CH 6 NT D- 1/2 voff=0 +BA 20 CR 1/2 TR 6 CH 6 NT F- 1/2 voff=0 +BA 20 CR 1 TR 6 CH 6 NT E- 1/2 voff=0 +BA 20 CR 1 TR 6 CH 6 NT G- 1/2 voff=0 +BA 20 CR 1 TR 6 CH 6 NT B- 1/2 voff=0 +BA 20 CR 1+1/2 TR 6 CH 6 NT E- 1/2 voff=0 +BA 20 CR 1+1/2 TR 6 CH 6 NT G- 1/2 voff=0 +BA 20 CR 1+1/2 TR 6 CH 6 NT B- 1/2 voff=0 +BA 20 CR 2 TR 6 CH 6 NT A- 1/2 voff=0 +BA 20 CR 2 TR 6 CH 6 NT C- 1/2 voff=0 +BA 20 CR 2 TR 6 CH 6 NT E- 1/2 
voff=0 +BA 20 CR 2+1/2 TR 6 CH 6 NT A- 1/2 voff=0 +BA 20 CR 2+1/2 TR 6 CH 6 NT C- 1/2 voff=0 +BA 20 CR 2+1/2 TR 6 CH 6 NT E- 1/2 voff=0 +BA 20 CR 3 TR 6 CH 6 NT A- 1/2 voff=0 +BA 20 CR 3 TR 6 CH 6 NT C- 1/2 voff=0 +BA 20 CR 3 TR 6 CH 6 NT E- 1/2 voff=0 +BA 20 CR 3+1/2 TR 6 CH 6 NT A- 1/2 voff=0 +BA 20 CR 3+1/2 TR 6 CH 6 NT C- 1/2 voff=0 +BA 20 CR 3+1/2 TR 6 CH 6 NT E- 1/2 voff=0 +BA 1 CR 0 TR 7 CH 7 Instrument 97 +BA 1 CR 0 TR 7 CH 7 NT R 0 von=127 voff=0 +BA 1 CR 0 TR 7 CH 7 NT B- 1/2 voff=0 +BA 1 CR 0 TR 7 CH 7 NT D- 1/2 voff=0 +BA 1 CR 0 TR 7 CH 7 NT F- 1/2 voff=0 +BA 1 CR 1/2 TR 7 CH 7 NT B- 1/2 voff=0 +BA 1 CR 1/2 TR 7 CH 7 NT D- 1/2 voff=0 +BA 1 CR 1/2 TR 7 CH 7 NT F- 1/2 voff=0 +BA 1 CR 1 TR 7 CH 7 NT F- 1/2 voff=0 +BA 1 CR 1 TR 7 CH 7 NT A- 1/2 voff=0 +BA 1 CR 1 TR 7 CH 7 NT C- 1/2 voff=0 +BA 1 CR 1+1/2 TR 7 CH 7 NT F- 1/2 voff=0 +BA 1 CR 1+1/2 TR 7 CH 7 NT A- 1/2 voff=0 +BA 1 CR 1+1/2 TR 7 CH 7 NT C- 1/2 voff=0 +BA 1 CR 2 TR 7 CH 7 NT E- 1/2 voff=0 +BA 1 CR 2 TR 7 CH 7 NT G- 1/2 voff=0 +BA 1 CR 2 TR 7 CH 7 NT B- 1/2 voff=0 +BA 1 CR 2+1/2 TR 7 CH 7 NT E- 1/2 voff=0 +BA 1 CR 2+1/2 TR 7 CH 7 NT G- 1/2 voff=0 +BA 1 CR 2+1/2 TR 7 CH 7 NT B- 1/2 voff=0 +BA 1 CR 3 TR 7 CH 7 NT E- 1/2 voff=0 +BA 1 CR 3 TR 7 CH 7 NT G- 1/2 voff=0 +BA 1 CR 3 TR 7 CH 7 NT B- 1/2 voff=0 +BA 1 CR 3+1/2 TR 7 CH 7 NT E- 1/2 voff=0 +BA 1 CR 3+1/2 TR 7 CH 7 NT G- 1/2 voff=0 +BA 1 CR 3+1/2 TR 7 CH 7 NT B- 1/2 voff=0 +BA 2 CR 0 TR 7 CH 7 NT A- 1/2 voff=0 +BA 2 CR 0 TR 7 CH 7 NT C- 1/2 voff=0 +BA 2 CR 0 TR 7 CH 7 NT E- 1/2 voff=0 +BA 2 CR 1/2 TR 7 CH 7 NT A- 1/2 voff=0 +BA 2 CR 1/2 TR 7 CH 7 NT C- 1/2 voff=0 +BA 2 CR 1/2 TR 7 CH 7 NT E- 1/2 voff=0 +BA 2 CR 1 TR 7 CH 7 NT G- 1/2 voff=0 +BA 2 CR 1 TR 7 CH 7 NT B- 1/2 voff=0 +BA 2 CR 1 TR 7 CH 7 NT D- 1/2 voff=0 +BA 2 CR 1+1/2 TR 7 CH 7 NT G- 1/2 voff=0 +BA 2 CR 1+1/2 TR 7 CH 7 NT B- 1/2 voff=0 +BA 2 CR 1+1/2 TR 7 CH 7 NT D- 1/2 voff=0 +BA 2 CR 2 TR 7 CH 7 NT F- 1/2 voff=0 +BA 2 CR 2 TR 7 CH 7 NT A- 1/2 voff=0 +BA 2 CR 2 TR 7 CH 7 NT C- 1/2 
voff=0 +BA 2 CR 2+1/2 TR 7 CH 7 NT F- 1/2 voff=0 +BA 2 CR 2+1/2 TR 7 CH 7 NT A- 1/2 voff=0 +BA 2 CR 2+1/2 TR 7 CH 7 NT C- 1/2 voff=0 +BA 2 CR 3 TR 7 CH 7 NT F- 1/2 voff=0 +BA 2 CR 3 TR 7 CH 7 NT A- 1/2 voff=0 +BA 2 CR 3 TR 7 CH 7 NT C- 1/2 voff=0 +BA 2 CR 3+1/2 TR 7 CH 7 NT F- 1/2 voff=0 +BA 2 CR 3+1/2 TR 7 CH 7 NT A- 1/2 voff=0 +BA 2 CR 3+1/2 TR 7 CH 7 NT C- 1/2 voff=0 +BA 3 CR 0 TR 7 CH 7 NT B- 1/2 voff=0 +BA 3 CR 0 TR 7 CH 7 NT D- 1/2 voff=0 +BA 3 CR 0 TR 7 CH 7 NT F- 1/2 voff=0 +BA 3 CR 1/2 TR 7 CH 7 NT B- 1/2 voff=0 +BA 3 CR 1/2 TR 7 CH 7 NT D- 1/2 voff=0 +BA 3 CR 1/2 TR 7 CH 7 NT F- 1/2 voff=0 +BA 3 CR 1 TR 7 CH 7 NT F- 1/2 voff=0 +BA 3 CR 1 TR 7 CH 7 NT A- 1/2 voff=0 +BA 3 CR 1 TR 7 CH 7 NT C- 1/2 voff=0 +BA 3 CR 1+1/2 TR 7 CH 7 NT F- 1/2 voff=0 +BA 3 CR 1+1/2 TR 7 CH 7 NT A- 1/2 voff=0 +BA 3 CR 1+1/2 TR 7 CH 7 NT C- 1/2 voff=0 +BA 3 CR 2 TR 7 CH 7 NT E- 1/2 voff=0 +BA 3 CR 2 TR 7 CH 7 NT G- 1/2 voff=0 +BA 3 CR 2 TR 7 CH 7 NT B- 1/2 voff=0 +BA 3 CR 2+1/2 TR 7 CH 7 NT E- 1/2 voff=0 +BA 3 CR 2+1/2 TR 7 CH 7 NT G- 1/2 voff=0 +BA 3 CR 2+1/2 TR 7 CH 7 NT B- 1/2 voff=0 +BA 3 CR 3 TR 7 CH 7 NT E- 1/2 voff=0 +BA 3 CR 3 TR 7 CH 7 NT G- 1/2 voff=0 +BA 3 CR 3 TR 7 CH 7 NT B- 1/2 voff=0 +BA 3 CR 3+1/2 TR 7 CH 7 NT E- 1/2 voff=0 +BA 3 CR 3+1/2 TR 7 CH 7 NT G- 1/2 voff=0 +BA 3 CR 3+1/2 TR 7 CH 7 NT B- 1/2 voff=0 +BA 4 CR 0 TR 7 CH 7 NT A- 1/2 voff=0 +BA 4 CR 0 TR 7 CH 7 NT C- 1/2 voff=0 +BA 4 CR 0 TR 7 CH 7 NT E- 1/2 voff=0 +BA 4 CR 1/2 TR 7 CH 7 NT A- 1/2 voff=0 +BA 4 CR 1/2 TR 7 CH 7 NT C- 1/2 voff=0 +BA 4 CR 1/2 TR 7 CH 7 NT E- 1/2 voff=0 +BA 4 CR 1 TR 7 CH 7 NT G- 1/2 voff=0 +BA 4 CR 1 TR 7 CH 7 NT B- 1/2 voff=0 +BA 4 CR 1 TR 7 CH 7 NT D- 1/2 voff=0 +BA 4 CR 1+1/2 TR 7 CH 7 NT G- 1/2 voff=0 +BA 4 CR 1+1/2 TR 7 CH 7 NT B- 1/2 voff=0 +BA 4 CR 1+1/2 TR 7 CH 7 NT D- 1/2 voff=0 +BA 4 CR 2 TR 7 CH 7 NT F- 1/2 voff=0 +BA 4 CR 2 TR 7 CH 7 NT A- 1/2 voff=0 +BA 4 CR 2 TR 7 CH 7 NT C- 1/2 voff=0 +BA 4 CR 2+1/2 TR 7 CH 7 NT F- 1/2 voff=0 +BA 4 CR 2+1/2 TR 7 CH 7 NT A- 1/2 voff=0 
+BA 4 CR 2+1/2 TR 7 CH 7 NT C- 1/2 voff=0 +BA 4 CR 3 TR 7 CH 7 NT F- 1/2 voff=0 +BA 4 CR 3 TR 7 CH 7 NT A- 1/2 voff=0 +BA 4 CR 3 TR 7 CH 7 NT C- 1/2 voff=0 +BA 4 CR 3+1/2 TR 7 CH 7 NT F- 1/2 voff=0 +BA 4 CR 3+1/2 TR 7 CH 7 NT A- 1/2 voff=0 +BA 4 CR 3+1/2 TR 7 CH 7 NT C- 1/2 voff=0 +BA 13 CR 0 TR 7 CH 7 NT F- 1/2 voff=0 +BA 13 CR 0 TR 7 CH 7 NT A- 1/2 voff=0 +BA 13 CR 0 TR 7 CH 7 NT C- 1/2 voff=0 +BA 13 CR 1/2 TR 7 CH 7 NT F- 1/2 voff=0 +BA 13 CR 1/2 TR 7 CH 7 NT A- 1/2 voff=0 +BA 13 CR 1/2 TR 7 CH 7 NT C- 1/2 voff=0 +BA 13 CR 1 TR 7 CH 7 NT G- 1/2 voff=0 +BA 13 CR 1 TR 7 CH 7 NT B- 1/2 voff=0 +BA 13 CR 1 TR 7 CH 7 NT D- 1/2 voff=0 +BA 13 CR 1+1/2 TR 7 CH 7 NT G- 1/2 voff=0 +BA 13 CR 1+1/2 TR 7 CH 7 NT B- 1/2 voff=0 +BA 13 CR 1+1/2 TR 7 CH 7 NT D- 1/2 voff=0 +BA 13 CR 2 TR 7 CH 7 NT G- 1/2 voff=0 +BA 13 CR 2 TR 7 CH 7 NT B- 1/2 voff=0 +BA 13 CR 2 TR 7 CH 7 NT D- 1/2 voff=0 +BA 13 CR 2+1/2 TR 7 CH 7 NT G- 1/2 voff=0 +BA 13 CR 2+1/2 TR 7 CH 7 NT B- 1/2 voff=0 +BA 13 CR 2+1/2 TR 7 CH 7 NT D- 1/2 voff=0 +BA 13 CR 3 TR 7 CH 7 NT G- 1/2 voff=0 +BA 13 CR 3 TR 7 CH 7 NT B- 1/2 voff=0 +BA 13 CR 3 TR 7 CH 7 NT D- 1/2 voff=0 +BA 13 CR 3+1/2 TR 7 CH 7 NT G- 1/2 voff=0 +BA 13 CR 3+1/2 TR 7 CH 7 NT B- 1/2 voff=0 +BA 13 CR 3+1/2 TR 7 CH 7 NT D- 1/2 voff=0 +BA 14 CR 0 TR 7 CH 7 NT C- 1/2 voff=0 +BA 14 CR 0 TR 7 CH 7 NT E- 1/2 voff=0 +BA 14 CR 0 TR 7 CH 7 NT G- 1/2 voff=0 +BA 14 CR 1/2 TR 7 CH 7 NT C- 1/2 voff=0 +BA 14 CR 1/2 TR 7 CH 7 NT E- 1/2 voff=0 +BA 14 CR 1/2 TR 7 CH 7 NT G- 1/2 voff=0 +BA 14 CR 1 TR 7 CH 7 NT D#- 1/2 voff=0 +BA 14 CR 1 TR 7 CH 7 NT G- 1/2 voff=0 +BA 14 CR 1 TR 7 CH 7 NT A- 1/2 voff=0 +BA 14 CR 1+1/2 TR 7 CH 7 NT D#- 1/2 voff=0 +BA 14 CR 1+1/2 TR 7 CH 7 NT G- 1/2 voff=0 +BA 14 CR 1+1/2 TR 7 CH 7 NT A- 1/2 voff=0 +BA 14 CR 2 TR 7 CH 7 NT D- 1/2 voff=0 +BA 14 CR 2 TR 7 CH 7 NT F- 1/2 voff=0 +BA 14 CR 2 TR 7 CH 7 NT A- 1/2 voff=0 +BA 14 CR 2+1/2 TR 7 CH 7 NT D- 1/2 voff=0 +BA 14 CR 2+1/2 TR 7 CH 7 NT F- 1/2 voff=0 +BA 14 CR 2+1/2 TR 7 CH 7 NT A- 1/2 voff=0 +BA 
14 CR 3 TR 7 CH 7 NT D- 1/2 voff=0 +BA 14 CR 3 TR 7 CH 7 NT F- 1/2 voff=0 +BA 14 CR 3 TR 7 CH 7 NT A- 1/2 voff=0 +BA 14 CR 3+1/2 TR 7 CH 7 NT D- 1/2 voff=0 +BA 14 CR 3+1/2 TR 7 CH 7 NT F- 1/2 voff=0 +BA 14 CR 3+1/2 TR 7 CH 7 NT A- 1/2 voff=0 +BA 15 CR 0 TR 7 CH 7 NT F- 1/2 voff=0 +BA 15 CR 0 TR 7 CH 7 NT A- 1/2 voff=0 +BA 15 CR 0 TR 7 CH 7 NT C- 1/2 voff=0 +BA 15 CR 1/2 TR 7 CH 7 NT F- 1/2 voff=0 +BA 15 CR 1/2 TR 7 CH 7 NT A- 1/2 voff=0 +BA 15 CR 1/2 TR 7 CH 7 NT C- 1/2 voff=0 +BA 15 CR 1 TR 7 CH 7 NT G- 1/2 voff=0 +BA 15 CR 1 TR 7 CH 7 NT B- 1/2 voff=0 +BA 15 CR 1 TR 7 CH 7 NT D- 1/2 voff=0 +BA 15 CR 1+1/2 TR 7 CH 7 NT G- 1/2 voff=0 +BA 15 CR 1+1/2 TR 7 CH 7 NT B- 1/2 voff=0 +BA 15 CR 1+1/2 TR 7 CH 7 NT D- 1/2 voff=0 +BA 15 CR 2 TR 7 CH 7 NT G- 1/2 voff=0 +BA 15 CR 2 TR 7 CH 7 NT B- 1/2 voff=0 +BA 15 CR 2 TR 7 CH 7 NT D- 1/2 voff=0 +BA 15 CR 2+1/2 TR 7 CH 7 NT G- 1/2 voff=0 +BA 15 CR 2+1/2 TR 7 CH 7 NT B- 1/2 voff=0 +BA 15 CR 2+1/2 TR 7 CH 7 NT D- 1/2 voff=0 +BA 15 CR 3 TR 7 CH 7 NT G- 1/2 voff=0 +BA 15 CR 3 TR 7 CH 7 NT B- 1/2 voff=0 +BA 15 CR 3 TR 7 CH 7 NT D- 1/2 voff=0 +BA 15 CR 3+1/2 TR 7 CH 7 NT G- 1/2 voff=0 +BA 15 CR 3+1/2 TR 7 CH 7 NT B- 1/2 voff=0 +BA 15 CR 3+1/2 TR 7 CH 7 NT D- 1/2 voff=0 +BA 16 CR 0 TR 7 CH 7 NT C- 1/2 voff=0 +BA 16 CR 0 TR 7 CH 7 NT E- 1/2 voff=0 +BA 16 CR 0 TR 7 CH 7 NT G- 1/2 voff=0 +BA 16 CR 1/2 TR 7 CH 7 NT C- 1/2 voff=0 +BA 16 CR 1/2 TR 7 CH 7 NT E- 1/2 voff=0 +BA 16 CR 1/2 TR 7 CH 7 NT G- 1/2 voff=0 +BA 16 CR 1 TR 7 CH 7 NT D#- 1/2 voff=0 +BA 16 CR 1 TR 7 CH 7 NT G- 1/2 voff=0 +BA 16 CR 1 TR 7 CH 7 NT A- 1/2 voff=0 +BA 16 CR 1+1/2 TR 7 CH 7 NT D#- 1/2 voff=0 +BA 16 CR 1+1/2 TR 7 CH 7 NT G- 1/2 voff=0 +BA 16 CR 1+1/2 TR 7 CH 7 NT A- 1/2 voff=0 +BA 16 CR 2 TR 7 CH 7 NT D- 1/2 voff=0 +BA 16 CR 2 TR 7 CH 7 NT F- 1/2 voff=0 +BA 16 CR 2 TR 7 CH 7 NT A- 1/2 voff=0 +BA 16 CR 2+1/2 TR 7 CH 7 NT D- 1/2 voff=0 +BA 16 CR 2+1/2 TR 7 CH 7 NT F- 1/2 voff=0 +BA 16 CR 2+1/2 TR 7 CH 7 NT A- 1/2 voff=0 +BA 16 CR 3 TR 7 CH 7 NT D- 1/2 voff=0 +BA 16 
CR 3 TR 7 CH 7 NT F- 1/2 voff=0 +BA 16 CR 3 TR 7 CH 7 NT A- 1/2 voff=0 +BA 16 CR 3+1/2 TR 7 CH 7 NT D- 1/2 voff=0 +BA 16 CR 3+1/2 TR 7 CH 7 NT F- 1/2 voff=0 +BA 16 CR 3+1/2 TR 7 CH 7 NT A- 1/2 voff=0 +BA 17 CR 0 TR 7 CH 7 NT A- 1/2 voff=0 +BA 17 CR 0 TR 7 CH 7 NT C- 1/2 voff=0 +BA 17 CR 0 TR 7 CH 7 NT E- 1/2 voff=0 +BA 17 CR 1/2 TR 7 CH 7 NT A- 1/2 voff=0 +BA 17 CR 1/2 TR 7 CH 7 NT C- 1/2 voff=0 +BA 17 CR 1/2 TR 7 CH 7 NT E- 1/2 voff=0 +BA 17 CR 1 TR 7 CH 7 NT C- 1/2 voff=0 +BA 17 CR 1 TR 7 CH 7 NT E- 1/2 voff=0 +BA 17 CR 1 TR 7 CH 7 NT G- 1/2 voff=0 +BA 17 CR 1+1/2 TR 7 CH 7 NT C- 1/2 voff=0 +BA 17 CR 1+1/2 TR 7 CH 7 NT E- 1/2 voff=0 +BA 17 CR 1+1/2 TR 7 CH 7 NT G- 1/2 voff=0 +BA 17 CR 2 TR 7 CH 7 NT F- 1/2 voff=0 +BA 17 CR 2 TR 7 CH 7 NT A- 1/2 voff=0 +BA 17 CR 2 TR 7 CH 7 NT C- 1/2 voff=0 +BA 17 CR 2+1/2 TR 7 CH 7 NT F- 1/2 voff=0 +BA 17 CR 2+1/2 TR 7 CH 7 NT A- 1/2 voff=0 +BA 17 CR 2+1/2 TR 7 CH 7 NT C- 1/2 voff=0 +BA 17 CR 3 TR 7 CH 7 NT F- 1/2 voff=0 +BA 17 CR 3 TR 7 CH 7 NT A- 1/2 voff=0 +BA 17 CR 3 TR 7 CH 7 NT C- 1/2 voff=0 +BA 17 CR 3+1/2 TR 7 CH 7 NT F- 1/2 voff=0 +BA 17 CR 3+1/2 TR 7 CH 7 NT A- 1/2 voff=0 +BA 17 CR 3+1/2 TR 7 CH 7 NT C- 1/2 voff=0 +BA 18 CR 0 TR 7 CH 7 NT A#- 1/2 voff=0 +BA 18 CR 0 TR 7 CH 7 NT D- 1/2 voff=0 +BA 18 CR 0 TR 7 CH 7 NT F- 1/2 voff=0 +BA 18 CR 1/2 TR 7 CH 7 NT A#- 1/2 voff=0 +BA 18 CR 1/2 TR 7 CH 7 NT D- 1/2 voff=0 +BA 18 CR 1/2 TR 7 CH 7 NT F- 1/2 voff=0 +BA 18 CR 1 TR 7 CH 7 NT E- 1/2 voff=0 +BA 18 CR 1 TR 7 CH 7 NT G- 1/2 voff=0 +BA 18 CR 1 TR 7 CH 7 NT B- 1/2 voff=0 +BA 18 CR 1+1/2 TR 7 CH 7 NT E- 1/2 voff=0 +BA 18 CR 1+1/2 TR 7 CH 7 NT G- 1/2 voff=0 +BA 18 CR 1+1/2 TR 7 CH 7 NT B- 1/2 voff=0 +BA 18 CR 2 TR 7 CH 7 NT A- 1/2 voff=0 +BA 18 CR 2 TR 7 CH 7 NT C- 1/2 voff=0 +BA 18 CR 2 TR 7 CH 7 NT E- 1/2 voff=0 +BA 18 CR 2+1/2 TR 7 CH 7 NT A- 1/2 voff=0 +BA 18 CR 2+1/2 TR 7 CH 7 NT C- 1/2 voff=0 +BA 18 CR 2+1/2 TR 7 CH 7 NT E- 1/2 voff=0 +BA 18 CR 3 TR 7 CH 7 NT A- 1/2 voff=0 +BA 18 CR 3 TR 7 CH 7 NT C- 1/2 voff=0 +BA 18 CR 
3 TR 7 CH 7 NT E- 1/2 voff=0 +BA 18 CR 3+1/2 TR 7 CH 7 NT A- 1/2 voff=0 +BA 18 CR 3+1/2 TR 7 CH 7 NT C- 1/2 voff=0 +BA 18 CR 3+1/2 TR 7 CH 7 NT E- 1/2 voff=0 +BA 19 CR 0 TR 7 CH 7 NT A- 1/2 voff=0 +BA 19 CR 0 TR 7 CH 7 NT C- 1/2 voff=0 +BA 19 CR 0 TR 7 CH 7 NT E- 1/2 voff=0 +BA 19 CR 1/2 TR 7 CH 7 NT A- 1/2 voff=0 +BA 19 CR 1/2 TR 7 CH 7 NT C- 1/2 voff=0 +BA 19 CR 1/2 TR 7 CH 7 NT E- 1/2 voff=0 +BA 19 CR 1 TR 7 CH 7 NT C- 1/2 voff=0 +BA 19 CR 1 TR 7 CH 7 NT E- 1/2 voff=0 +BA 19 CR 1 TR 7 CH 7 NT G- 1/2 voff=0 +BA 19 CR 1+1/2 TR 7 CH 7 NT C- 1/2 voff=0 +BA 19 CR 1+1/2 TR 7 CH 7 NT E- 1/2 voff=0 +BA 19 CR 1+1/2 TR 7 CH 7 NT G- 1/2 voff=0 +BA 19 CR 2 TR 7 CH 7 NT F- 1/2 voff=0 +BA 19 CR 2 TR 7 CH 7 NT A- 1/2 voff=0 +BA 19 CR 2 TR 7 CH 7 NT C- 1/2 voff=0 +BA 19 CR 2+1/2 TR 7 CH 7 NT F- 1/2 voff=0 +BA 19 CR 2+1/2 TR 7 CH 7 NT A- 1/2 voff=0 +BA 19 CR 2+1/2 TR 7 CH 7 NT C- 1/2 voff=0 +BA 19 CR 3 TR 7 CH 7 NT F- 1/2 voff=0 +BA 19 CR 3 TR 7 CH 7 NT A- 1/2 voff=0 +BA 19 CR 3 TR 7 CH 7 NT C- 1/2 voff=0 +BA 19 CR 3+1/2 TR 7 CH 7 NT F- 1/2 voff=0 +BA 19 CR 3+1/2 TR 7 CH 7 NT A- 1/2 voff=0 +BA 19 CR 3+1/2 TR 7 CH 7 NT C- 1/2 voff=0 +BA 20 CR 0 TR 7 CH 7 NT A#- 1/2 voff=0 +BA 20 CR 0 TR 7 CH 7 NT D- 1/2 voff=0 +BA 20 CR 0 TR 7 CH 7 NT F- 1/2 voff=0 +BA 20 CR 1/2 TR 7 CH 7 NT A#- 1/2 voff=0 +BA 20 CR 1/2 TR 7 CH 7 NT D- 1/2 voff=0 +BA 20 CR 1/2 TR 7 CH 7 NT F- 1/2 voff=0 +BA 20 CR 1 TR 7 CH 7 NT E- 1/2 voff=0 +BA 20 CR 1 TR 7 CH 7 NT G- 1/2 voff=0 +BA 20 CR 1 TR 7 CH 7 NT B- 1/2 voff=0 +BA 20 CR 1+1/2 TR 7 CH 7 NT E- 1/2 voff=0 +BA 20 CR 1+1/2 TR 7 CH 7 NT G- 1/2 voff=0 +BA 20 CR 1+1/2 TR 7 CH 7 NT B- 1/2 voff=0 +BA 20 CR 2 TR 7 CH 7 NT A- 1/2 voff=0 +BA 20 CR 2 TR 7 CH 7 NT C- 1/2 voff=0 +BA 20 CR 2 TR 7 CH 7 NT E- 1/2 voff=0 +BA 20 CR 2+1/2 TR 7 CH 7 NT A- 1/2 voff=0 +BA 20 CR 2+1/2 TR 7 CH 7 NT C- 1/2 voff=0 +BA 20 CR 2+1/2 TR 7 CH 7 NT E- 1/2 voff=0 +BA 20 CR 3 TR 7 CH 7 NT A- 1/2 voff=0 +BA 20 CR 3 TR 7 CH 7 NT C- 1/2 voff=0 +BA 20 CR 3 TR 7 CH 7 NT E- 1/2 voff=0 +BA 20 CR 
3+1/2 TR 7 CH 7 NT A- 1/2 voff=0 +BA 20 CR 3+1/2 TR 7 CH 7 NT C- 1/2 voff=0 +BA 20 CR 3+1/2 TR 7 CH 7 NT E- 1/2 voff=0 +BA 25 CR 0 TR 7 CH 7 NT E- 1/2 voff=0 +BA 25 CR 0 TR 7 CH 7 NT G- 1/2 voff=0 +BA 25 CR 0 TR 7 CH 7 NT B- 1/2 voff=0 +BA 25 CR 1/2 TR 7 CH 7 NT E- 1/2 voff=0 +BA 25 CR 1/2 TR 7 CH 7 NT G- 1/2 voff=0 +BA 25 CR 1/2 TR 7 CH 7 NT B- 1/2 voff=0 +BA 25 CR 1 TR 7 CH 7 NT D#- 1/2 voff=0 +BA 25 CR 1 TR 7 CH 7 NT G- 1/2 voff=0 +BA 25 CR 1 TR 7 CH 7 NT A- 1/2 voff=0 +BA 25 CR 1+1/2 TR 7 CH 7 NT D#- 1/2 voff=0 +BA 25 CR 1+1/2 TR 7 CH 7 NT G- 1/2 voff=0 +BA 25 CR 1+1/2 TR 7 CH 7 NT A- 1/2 voff=0 +BA 25 CR 2 TR 7 CH 7 NT B- 1/2 voff=0 +BA 25 CR 2 TR 7 CH 7 NT D- 1/2 voff=0 +BA 25 CR 2 TR 7 CH 7 NT F- 1/2 voff=0 +BA 25 CR 2+1/2 TR 7 CH 7 NT B- 1/2 voff=0 +BA 25 CR 2+1/2 TR 7 CH 7 NT D- 1/2 voff=0 +BA 25 CR 2+1/2 TR 7 CH 7 NT F- 1/2 voff=0 +BA 25 CR 3 TR 7 CH 7 NT B- 1/2 voff=0 +BA 25 CR 3 TR 7 CH 7 NT D- 1/2 voff=0 +BA 25 CR 3 TR 7 CH 7 NT F- 1/2 voff=0 +BA 25 CR 3+1/2 TR 7 CH 7 NT B- 1/2 voff=0 +BA 25 CR 3+1/2 TR 7 CH 7 NT D- 1/2 voff=0 +BA 25 CR 3+1/2 TR 7 CH 7 NT F- 1/2 voff=0 +BA 26 CR 0 TR 7 CH 7 NT E- 1/2 voff=0 +BA 26 CR 0 TR 7 CH 7 NT G- 1/2 voff=0 +BA 26 CR 0 TR 7 CH 7 NT B- 1/2 voff=0 +BA 26 CR 1/2 TR 7 CH 7 NT E- 1/2 voff=0 +BA 26 CR 1/2 TR 7 CH 7 NT G- 1/2 voff=0 +BA 26 CR 1/2 TR 7 CH 7 NT B- 1/2 voff=0 +BA 26 CR 1 TR 7 CH 7 NT F#- 1/2 voff=0 +BA 26 CR 1 TR 7 CH 7 NT A- 1/2 voff=0 +BA 26 CR 1 TR 7 CH 7 NT C- 1/2 voff=0 +BA 26 CR 1+1/2 TR 7 CH 7 NT F#- 1/2 voff=0 +BA 26 CR 1+1/2 TR 7 CH 7 NT A- 1/2 voff=0 +BA 26 CR 1+1/2 TR 7 CH 7 NT C- 1/2 voff=0 +BA 26 CR 2 TR 7 CH 7 NT D- 1/2 voff=0 +BA 26 CR 2 TR 7 CH 7 NT F- 1/2 voff=0 +BA 26 CR 2 TR 7 CH 7 NT A- 1/2 voff=0 +BA 26 CR 2+1/2 TR 7 CH 7 NT D- 1/2 voff=0 +BA 26 CR 2+1/2 TR 7 CH 7 NT F- 1/2 voff=0 +BA 26 CR 2+1/2 TR 7 CH 7 NT A- 1/2 voff=0 +BA 26 CR 3 TR 7 CH 7 NT D- 1/2 voff=0 +BA 26 CR 3 TR 7 CH 7 NT F- 1/2 voff=0 +BA 26 CR 3 TR 7 CH 7 NT A- 1/2 voff=0 +BA 26 CR 3+1/2 TR 7 CH 7 NT D- 1/2 voff=0 +BA 26 
CR 3+1/2 TR 7 CH 7 NT F- 1/2 voff=0 +BA 26 CR 3+1/2 TR 7 CH 7 NT A- 1/2 voff=0 +BA 27 CR 0 TR 7 CH 7 NT E- 1/2 voff=0 +BA 27 CR 0 TR 7 CH 7 NT G- 1/2 voff=0 +BA 27 CR 0 TR 7 CH 7 NT B- 1/2 voff=0 +BA 27 CR 1/2 TR 7 CH 7 NT E- 1/2 voff=0 +BA 27 CR 1/2 TR 7 CH 7 NT G- 1/2 voff=0 +BA 27 CR 1/2 TR 7 CH 7 NT B- 1/2 voff=0 +BA 27 CR 1 TR 7 CH 7 NT D#- 1/2 voff=0 +BA 27 CR 1 TR 7 CH 7 NT G- 1/2 voff=0 +BA 27 CR 1 TR 7 CH 7 NT A- 1/2 voff=0 +BA 27 CR 1+1/2 TR 7 CH 7 NT D#- 1/2 voff=0 +BA 27 CR 1+1/2 TR 7 CH 7 NT G- 1/2 voff=0 +BA 27 CR 1+1/2 TR 7 CH 7 NT A- 1/2 voff=0 +BA 27 CR 2 TR 7 CH 7 NT B- 1/2 voff=0 +BA 27 CR 2 TR 7 CH 7 NT D- 1/2 voff=0 +BA 27 CR 2 TR 7 CH 7 NT F- 1/2 voff=0 +BA 27 CR 2+1/2 TR 7 CH 7 NT B- 1/2 voff=0 +BA 27 CR 2+1/2 TR 7 CH 7 NT D- 1/2 voff=0 +BA 27 CR 2+1/2 TR 7 CH 7 NT F- 1/2 voff=0 +BA 27 CR 3 TR 7 CH 7 NT B- 1/2 voff=0 +BA 27 CR 3 TR 7 CH 7 NT D- 1/2 voff=0 +BA 27 CR 3 TR 7 CH 7 NT F- 1/2 voff=0 +BA 27 CR 3+1/2 TR 7 CH 7 NT B- 1/2 voff=0 +BA 27 CR 3+1/2 TR 7 CH 7 NT D- 1/2 voff=0 +BA 27 CR 3+1/2 TR 7 CH 7 NT F- 1/2 voff=0 +BA 28 CR 0 TR 7 CH 7 NT E- 1/2 voff=0 +BA 28 CR 0 TR 7 CH 7 NT G- 1/2 voff=0 +BA 28 CR 0 TR 7 CH 7 NT B- 1/2 voff=0 +BA 28 CR 1/2 TR 7 CH 7 NT E- 1/2 voff=0 +BA 28 CR 1/2 TR 7 CH 7 NT G- 1/2 voff=0 +BA 28 CR 1/2 TR 7 CH 7 NT B- 1/2 voff=0 +BA 28 CR 1 TR 7 CH 7 NT F#- 1/2 voff=0 +BA 28 CR 1 TR 7 CH 7 NT A- 1/2 voff=0 +BA 28 CR 1 TR 7 CH 7 NT C- 1/2 voff=0 +BA 28 CR 1+1/2 TR 7 CH 7 NT F#- 1/2 voff=0 +BA 28 CR 1+1/2 TR 7 CH 7 NT A- 1/2 voff=0 +BA 28 CR 1+1/2 TR 7 CH 7 NT C- 1/2 voff=0 +BA 28 CR 2 TR 7 CH 7 NT D- 1/2 voff=0 +BA 28 CR 2 TR 7 CH 7 NT F- 1/2 voff=0 +BA 28 CR 2 TR 7 CH 7 NT A- 1/2 voff=0 +BA 28 CR 2+1/2 TR 7 CH 7 NT D- 1/2 voff=0 +BA 28 CR 2+1/2 TR 7 CH 7 NT F- 1/2 voff=0 +BA 28 CR 2+1/2 TR 7 CH 7 NT A- 1/2 voff=0 +BA 28 CR 3 TR 7 CH 7 NT D- 1/2 voff=0 +BA 28 CR 3 TR 7 CH 7 NT F- 1/2 voff=0 +BA 28 CR 3 TR 7 CH 7 NT A- 1/2 voff=0 +BA 28 CR 3+1/2 TR 7 CH 7 NT D- 1/2 voff=0 +BA 28 CR 3+1/2 TR 7 CH 7 NT F- 1/2 voff=0 +BA 
28 CR 3+1/2 TR 7 CH 7 NT A- 1/2 voff=0 +BA 29 CR 0 TR 7 CH 7 NT E- 1/2 voff=0 +BA 29 CR 0 TR 7 CH 7 NT G- 1/2 voff=0 +BA 29 CR 0 TR 7 CH 7 NT B- 1/2 voff=0 +BA 29 CR 1/2 TR 7 CH 7 NT E- 1/2 voff=0 +BA 29 CR 1/2 TR 7 CH 7 NT G- 1/2 voff=0 +BA 29 CR 1/2 TR 7 CH 7 NT B- 1/2 voff=0 +BA 29 CR 1 TR 7 CH 7 NT D#- 1/2 voff=0 +BA 29 CR 1 TR 7 CH 7 NT G- 1/2 voff=0 +BA 29 CR 1 TR 7 CH 7 NT A- 1/2 voff=0 +BA 29 CR 1+1/2 TR 7 CH 7 NT D#- 1/2 voff=0 +BA 29 CR 1+1/2 TR 7 CH 7 NT G- 1/2 voff=0 +BA 29 CR 1+1/2 TR 7 CH 7 NT A- 1/2 voff=0 +BA 29 CR 2 TR 7 CH 7 NT B- 1/2 voff=0 +BA 29 CR 2 TR 7 CH 7 NT D- 1/2 voff=0 +BA 29 CR 2 TR 7 CH 7 NT F- 1/2 voff=0 +BA 29 CR 2+1/2 TR 7 CH 7 NT B- 1/2 voff=0 +BA 29 CR 2+1/2 TR 7 CH 7 NT D- 1/2 voff=0 +BA 29 CR 2+1/2 TR 7 CH 7 NT F- 1/2 voff=0 +BA 29 CR 3 TR 7 CH 7 NT B- 1/2 voff=0 +BA 29 CR 3 TR 7 CH 7 NT D- 1/2 voff=0 +BA 29 CR 3 TR 7 CH 7 NT F- 1/2 voff=0 +BA 29 CR 3+1/2 TR 7 CH 7 NT B- 1/2 voff=0 +BA 29 CR 3+1/2 TR 7 CH 7 NT D- 1/2 voff=0 +BA 29 CR 3+1/2 TR 7 CH 7 NT F- 1/2 voff=0 +BA 30 CR 0 TR 7 CH 7 NT E- 1/2 voff=0 +BA 30 CR 0 TR 7 CH 7 NT G- 1/2 voff=0 +BA 30 CR 0 TR 7 CH 7 NT B- 1/2 voff=0 +BA 30 CR 1/2 TR 7 CH 7 NT E- 1/2 voff=0 +BA 30 CR 1/2 TR 7 CH 7 NT G- 1/2 voff=0 +BA 30 CR 1/2 TR 7 CH 7 NT B- 1/2 voff=0 +BA 30 CR 1 TR 7 CH 7 NT F#- 1/2 voff=0 +BA 30 CR 1 TR 7 CH 7 NT A- 1/2 voff=0 +BA 30 CR 1 TR 7 CH 7 NT C- 1/2 voff=0 +BA 30 CR 1+1/2 TR 7 CH 7 NT F#- 1/2 voff=0 +BA 30 CR 1+1/2 TR 7 CH 7 NT A- 1/2 voff=0 +BA 30 CR 1+1/2 TR 7 CH 7 NT C- 1/2 voff=0 +BA 30 CR 2 TR 7 CH 7 NT D- 1/2 voff=0 +BA 30 CR 2 TR 7 CH 7 NT F- 1/2 voff=0 +BA 30 CR 2 TR 7 CH 7 NT A- 1/2 voff=0 +BA 30 CR 2+1/2 TR 7 CH 7 NT D- 1/2 voff=0 +BA 30 CR 2+1/2 TR 7 CH 7 NT F- 1/2 voff=0 +BA 30 CR 2+1/2 TR 7 CH 7 NT A- 1/2 voff=0 +BA 30 CR 3 TR 7 CH 7 NT D- 1/2 voff=0 +BA 30 CR 3 TR 7 CH 7 NT F- 1/2 voff=0 +BA 30 CR 3 TR 7 CH 7 NT A- 1/2 voff=0 +BA 30 CR 3+1/2 TR 7 CH 7 NT D- 1/2 voff=0 +BA 30 CR 3+1/2 TR 7 CH 7 NT F- 1/2 voff=0 +BA 30 CR 3+1/2 TR 7 CH 7 NT A- 1/2 voff=0 
+BA 31 CR 0 TR 7 CH 7 NT E- 1/2 voff=0 +BA 31 CR 0 TR 7 CH 7 NT G- 1/2 voff=0 +BA 31 CR 0 TR 7 CH 7 NT B- 1/2 voff=0 +BA 31 CR 1/2 TR 7 CH 7 NT E- 1/2 voff=0 +BA 31 CR 1/2 TR 7 CH 7 NT G- 1/2 voff=0 +BA 31 CR 1/2 TR 7 CH 7 NT B- 1/2 voff=0 +BA 31 CR 1 TR 7 CH 7 NT D#- 1/2 voff=0 +BA 31 CR 1 TR 7 CH 7 NT G- 1/2 voff=0 +BA 31 CR 1 TR 7 CH 7 NT A- 1/2 voff=0 +BA 31 CR 1+1/2 TR 7 CH 7 NT D#- 1/2 voff=0 +BA 31 CR 1+1/2 TR 7 CH 7 NT G- 1/2 voff=0 +BA 31 CR 1+1/2 TR 7 CH 7 NT A- 1/2 voff=0 +BA 31 CR 2 TR 7 CH 7 NT B- 1/2 voff=0 +BA 31 CR 2 TR 7 CH 7 NT D- 1/2 voff=0 +BA 31 CR 2 TR 7 CH 7 NT F- 1/2 voff=0 +BA 31 CR 2+1/2 TR 7 CH 7 NT B- 1/2 voff=0 +BA 31 CR 2+1/2 TR 7 CH 7 NT D- 1/2 voff=0 +BA 31 CR 2+1/2 TR 7 CH 7 NT F- 1/2 voff=0 +BA 31 CR 3 TR 7 CH 7 NT B- 1/2 voff=0 +BA 31 CR 3 TR 7 CH 7 NT D- 1/2 voff=0 +BA 31 CR 3 TR 7 CH 7 NT F- 1/2 voff=0 +BA 31 CR 3+1/2 TR 7 CH 7 NT B- 1/2 voff=0 +BA 31 CR 3+1/2 TR 7 CH 7 NT D- 1/2 voff=0 +BA 31 CR 3+1/2 TR 7 CH 7 NT F- 1/2 voff=0 +BA 32 CR 0 TR 7 CH 7 NT E- 1/2 voff=0 +BA 32 CR 0 TR 7 CH 7 NT G- 1/2 voff=0 +BA 32 CR 0 TR 7 CH 7 NT B- 1/2 voff=0 +BA 32 CR 1/2 TR 7 CH 7 NT E- 1/2 voff=0 +BA 32 CR 1/2 TR 7 CH 7 NT G- 1/2 voff=0 +BA 32 CR 1/2 TR 7 CH 7 NT B- 1/2 voff=0 +BA 32 CR 1 TR 7 CH 7 NT F#- 1/2 voff=0 +BA 32 CR 1 TR 7 CH 7 NT A- 1/2 voff=0 +BA 32 CR 1 TR 7 CH 7 NT C- 1/2 voff=0 +BA 32 CR 1+1/2 TR 7 CH 7 NT F#- 1/2 voff=0 +BA 32 CR 1+1/2 TR 7 CH 7 NT A- 1/2 voff=0 +BA 32 CR 1+1/2 TR 7 CH 7 NT C- 1/2 voff=0 +BA 32 CR 2 TR 7 CH 7 NT D- 1/2 voff=0 +BA 32 CR 2 TR 7 CH 7 NT F- 1/2 voff=0 +BA 32 CR 2 TR 7 CH 7 NT A- 1/2 voff=0 +BA 32 CR 2+1/2 TR 7 CH 7 NT D- 1/2 voff=0 +BA 32 CR 2+1/2 TR 7 CH 7 NT F- 1/2 voff=0 +BA 32 CR 2+1/2 TR 7 CH 7 NT A- 1/2 voff=0 +BA 32 CR 3 TR 7 CH 7 NT D- 1/2 voff=0 +BA 32 CR 3 TR 7 CH 7 NT F- 1/2 voff=0 +BA 32 CR 3 TR 7 CH 7 NT A- 1/2 voff=0 +BA 32 CR 3+1/2 TR 7 CH 7 NT D- 1/2 voff=0 +BA 32 CR 3+1/2 TR 7 CH 7 NT F- 1/2 voff=0 +BA 32 CR 3+1/2 TR 7 CH 7 NT A- 1/2 voff=0 +BA 33 CR 0 TR 7 CH 7 NT E- 1/2 voff=0 
+BA 33 CR 0 TR 7 CH 7 NT G- 1/2 voff=0 +BA 33 CR 0 TR 7 CH 7 NT B- 1/2 voff=0 +BA 33 CR 1/2 TR 7 CH 7 NT E- 1/2 voff=0 +BA 33 CR 1/2 TR 7 CH 7 NT G- 1/2 voff=0 +BA 33 CR 1/2 TR 7 CH 7 NT B- 1/2 voff=0 +BA 33 CR 1 TR 7 CH 7 NT D#- 1/2 voff=0 +BA 33 CR 1 TR 7 CH 7 NT G- 1/2 voff=0 +BA 33 CR 1 TR 7 CH 7 NT A- 1/2 voff=0 +BA 33 CR 1+1/2 TR 7 CH 7 NT D#- 1/2 voff=0 +BA 33 CR 1+1/2 TR 7 CH 7 NT G- 1/2 voff=0 +BA 33 CR 1+1/2 TR 7 CH 7 NT A- 1/2 voff=0 +BA 33 CR 2 TR 7 CH 7 NT B- 1/2 voff=0 +BA 33 CR 2 TR 7 CH 7 NT D- 1/2 voff=0 +BA 33 CR 2 TR 7 CH 7 NT F- 1/2 voff=0 +BA 33 CR 2+1/2 TR 7 CH 7 NT B- 1/2 voff=0 +BA 33 CR 2+1/2 TR 7 CH 7 NT D- 1/2 voff=0 +BA 33 CR 2+1/2 TR 7 CH 7 NT F- 1/2 voff=0 +BA 33 CR 3 TR 7 CH 7 NT B- 1/2 voff=0 +BA 33 CR 3 TR 7 CH 7 NT D- 1/2 voff=0 +BA 33 CR 3 TR 7 CH 7 NT F- 1/2 voff=0 +BA 33 CR 3+1/2 TR 7 CH 7 NT B- 1/2 voff=0 +BA 33 CR 3+1/2 TR 7 CH 7 NT D- 1/2 voff=0 +BA 33 CR 3+1/2 TR 7 CH 7 NT F- 1/2 voff=0 +BA 34 CR 0 TR 7 CH 7 NT E- 1/2 voff=0 +BA 34 CR 0 TR 7 CH 7 NT G- 1/2 voff=0 +BA 34 CR 0 TR 7 CH 7 NT B- 1/2 voff=0 +BA 34 CR 1/2 TR 7 CH 7 NT E- 1/2 voff=0 +BA 34 CR 1/2 TR 7 CH 7 NT G- 1/2 voff=0 +BA 34 CR 1/2 TR 7 CH 7 NT B- 1/2 voff=0 +BA 34 CR 1 TR 7 CH 7 NT F#- 1/2 voff=0 +BA 34 CR 1 TR 7 CH 7 NT A- 1/2 voff=0 +BA 34 CR 1 TR 7 CH 7 NT C- 1/2 voff=0 +BA 34 CR 1+1/2 TR 7 CH 7 NT F#- 1/2 voff=0 +BA 34 CR 1+1/2 TR 7 CH 7 NT A- 1/2 voff=0 +BA 34 CR 1+1/2 TR 7 CH 7 NT C- 1/2 voff=0 +BA 34 CR 2 TR 7 CH 7 NT D- 1/2 voff=0 +BA 34 CR 2 TR 7 CH 7 NT F- 1/2 voff=0 +BA 34 CR 2 TR 7 CH 7 NT A- 1/2 voff=0 +BA 34 CR 2+1/2 TR 7 CH 7 NT D- 1/2 voff=0 +BA 34 CR 2+1/2 TR 7 CH 7 NT F- 1/2 voff=0 +BA 34 CR 2+1/2 TR 7 CH 7 NT A- 1/2 voff=0 +BA 34 CR 3 TR 7 CH 7 NT D- 1/2 voff=0 +BA 34 CR 3 TR 7 CH 7 NT F- 1/2 voff=0 +BA 34 CR 3 TR 7 CH 7 NT A- 1/2 voff=0 +BA 34 CR 3+1/2 TR 7 CH 7 NT D- 1/2 voff=0 +BA 34 CR 3+1/2 TR 7 CH 7 NT F- 1/2 voff=0 +BA 34 CR 3+1/2 TR 7 CH 7 NT A- 1/2 voff=0 +BA 35 CR 0 TR 7 CH 7 NT E- 1/2 voff=0 +BA 35 CR 0 TR 7 CH 7 NT G- 1/2 voff=0 
+BA 35 CR 0 TR 7 CH 7 NT B- 1/2 voff=0 +BA 35 CR 1/2 TR 7 CH 7 NT E- 1/2 voff=0 +BA 35 CR 1/2 TR 7 CH 7 NT G- 1/2 voff=0 +BA 35 CR 1/2 TR 7 CH 7 NT B- 1/2 voff=0 +BA 35 CR 1 TR 7 CH 7 NT D#- 1/2 voff=0 +BA 35 CR 1 TR 7 CH 7 NT G- 1/2 voff=0 +BA 35 CR 1 TR 7 CH 7 NT A- 1/2 voff=0 +BA 35 CR 1+1/2 TR 7 CH 7 NT D#- 1/2 voff=0 +BA 35 CR 1+1/2 TR 7 CH 7 NT G- 1/2 voff=0 +BA 35 CR 1+1/2 TR 7 CH 7 NT A- 1/2 voff=0 +BA 35 CR 2 TR 7 CH 7 NT B- 1/2 voff=0 +BA 35 CR 2 TR 7 CH 7 NT D- 1/2 voff=0 +BA 35 CR 2 TR 7 CH 7 NT F- 1/2 voff=0 +BA 35 CR 2+1/2 TR 7 CH 7 NT B- 1/2 voff=0 +BA 35 CR 2+1/2 TR 7 CH 7 NT D- 1/2 voff=0 +BA 35 CR 2+1/2 TR 7 CH 7 NT F- 1/2 voff=0 +BA 35 CR 3 TR 7 CH 7 NT B- 1/2 voff=0 +BA 35 CR 3 TR 7 CH 7 NT D- 1/2 voff=0 +BA 35 CR 3 TR 7 CH 7 NT F- 1/2 voff=0 +BA 35 CR 3+1/2 TR 7 CH 7 NT B- 1/2 voff=0 +BA 35 CR 3+1/2 TR 7 CH 7 NT D- 1/2 voff=0 +BA 35 CR 3+1/2 TR 7 CH 7 NT F- 1/2 voff=0 +BA 36 CR 0 TR 7 CH 7 NT E- 1/2 voff=0 +BA 36 CR 0 TR 7 CH 7 NT G- 1/2 voff=0 +BA 36 CR 0 TR 7 CH 7 NT B- 1/2 voff=0 +BA 36 CR 1/2 TR 7 CH 7 NT E- 1/2 voff=0 +BA 36 CR 1/2 TR 7 CH 7 NT G- 1/2 voff=0 +BA 36 CR 1/2 TR 7 CH 7 NT B- 1/2 voff=0 +BA 36 CR 1 TR 7 CH 7 NT F#- 1/2 voff=0 +BA 36 CR 1 TR 7 CH 7 NT A- 1/2 voff=0 +BA 36 CR 1 TR 7 CH 7 NT C- 1/2 voff=0 +BA 36 CR 1+1/2 TR 7 CH 7 NT F#- 1/2 voff=0 +BA 36 CR 1+1/2 TR 7 CH 7 NT A- 1/2 voff=0 +BA 36 CR 1+1/2 TR 7 CH 7 NT C- 1/2 voff=0 +BA 36 CR 2 TR 7 CH 7 NT D- 1/2 voff=0 +BA 36 CR 2 TR 7 CH 7 NT F- 1/2 voff=0 +BA 36 CR 2 TR 7 CH 7 NT A- 1/2 voff=0 +BA 36 CR 2+1/2 TR 7 CH 7 NT D- 1/2 voff=0 +BA 36 CR 2+1/2 TR 7 CH 7 NT F- 1/2 voff=0 +BA 36 CR 2+1/2 TR 7 CH 7 NT A- 1/2 voff=0 +BA 36 CR 3 TR 7 CH 7 NT D- 1/2 voff=0 +BA 36 CR 3 TR 7 CH 7 NT F- 1/2 voff=0 +BA 36 CR 3 TR 7 CH 7 NT A- 1/2 voff=0 +BA 36 CR 3+1/2 TR 7 CH 7 NT D- 1/2 voff=0 +BA 36 CR 3+1/2 TR 7 CH 7 NT F- 1/2 voff=0 +BA 36 CR 3+1/2 TR 7 CH 7 NT A- 1/2 voff=0 +BA 37 CR 0 TR 0 CH 16 End of track +Brae ess nur tures Duk ice. + +Brae ess nur tures Duk ice. 
+Count ae lifts the mir ac le. +The draw er reads on Duk ice. +The choos er feels Duk ice. + +Whist ess goes to the test er. +Don an is the breaths on er. +The draw er has Skye el. +Nei ney writes on Skye ess. + +Duk el writes on the spell er. +Brae el is Ven ler. +Ven ler needs Ven ler. +Ven ler reads on Add ae. + +Wilb ee lifts the draw er. +Le ard has Brae ler. +Whist ice writes on All ard. +Dix an nur tures Syd ler. + + +format=1 tracks=5 division=384 + +BA 1 CR 0 TR 0 CH 16 Text type 2: "Produced by Mind Reading Music Composer by Lucian Academy" +BA 1 CR 0 TR 0 CH 16 Text type 3: "Nei el moves to Add ice." +BA 1 CR 0 TR 0 CH 1 Channel volume 127 +BA 1 CR 0 TR 0 CH 16 Tempo 63.00009 +BA 1 CR 0 TR 1 CH 1 Instrument 1 +BA 1 CR 0 TR 1 CH 1 NT R 0 von=127 voff=0 +BA 5 CR 0 TR 1 CH 1 NT C- 1/2 voff=0 +BA 5 CR 1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 5 CR 1 TR 1 CH 1 NT C- 1/2 voff=0 +BA 5 CR 1+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 5 CR 2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 5 CR 2+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 6 CR 0 TR 1 CH 1 NT C- 1/2 voff=0 +BA 6 CR 1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 6 CR 1 TR 1 CH 1 NT C- 1/2 voff=0 +BA 6 CR 1+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 6 CR 2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 6 CR 2+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 7 CR 0 TR 1 CH 1 NT C- 1/2 voff=0 +BA 7 CR 1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 7 CR 1 TR 1 CH 1 NT C- 1/2 voff=0 +BA 7 CR 1+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 7 CR 2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 7 CR 2+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 8 CR 0 TR 1 CH 1 NT C- 1/2 voff=0 +BA 8 CR 1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 8 CR 1 TR 1 CH 1 NT C- 1/2 voff=0 +BA 8 CR 1+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 8 CR 2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 8 CR 2+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 8 CR 3 TR 1 CH 1 NT C- 1/2 voff=0 +BA 13 CR 0 TR 1 CH 1 NT C- 1/2 voff=0 +BA 13 CR 1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 13 CR 1 TR 1 CH 1 NT C- 1/2 voff=0 +BA 13 CR 1+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 13 CR 2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 13 CR 2+1/2 TR 1 CH 1 NT C- 1/2 
voff=0 +BA 14 CR 0 TR 1 CH 1 NT C- 1/2 voff=0 +BA 14 CR 1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 14 CR 1 TR 1 CH 1 NT C- 1/2 voff=0 +BA 14 CR 1+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 14 CR 2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 15 CR 0 TR 1 CH 1 NT C- 1/2 voff=0 +BA 15 CR 1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 15 CR 1 TR 1 CH 1 NT C- 1/2 voff=0 +BA 15 CR 1+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 15 CR 2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 15 CR 2+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 16 CR 0 TR 1 CH 1 NT C- 1/2 voff=0 +BA 16 CR 1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 16 CR 1 TR 1 CH 1 NT C- 1/2 voff=0 +BA 16 CR 1+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 16 CR 2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 17 CR 0 TR 1 CH 1 NT C- 1/2 voff=0 +BA 17 CR 1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 17 CR 1 TR 1 CH 1 NT C- 1/2 voff=0 +BA 17 CR 1+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 17 CR 2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 17 CR 2+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 17 CR 3 TR 1 CH 1 NT C- 1/2 voff=0 +BA 18 CR 0 TR 1 CH 1 NT C- 1/2 voff=0 +BA 18 CR 1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 18 CR 1 TR 1 CH 1 NT D- 1/2 voff=0 +BA 18 CR 1+1/2 TR 1 CH 1 NT D- 1/2 voff=0 +BA 18 CR 2 TR 1 CH 1 NT D- 1/2 voff=0 +BA 18 CR 2+1/2 TR 1 CH 1 NT D- 1/2 voff=0 +BA 18 CR 3 TR 1 CH 1 NT D- 1/2 voff=0 +BA 19 CR 0 TR 1 CH 1 NT C- 1/2 voff=0 +BA 19 CR 1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 19 CR 1 TR 1 CH 1 NT C- 1/2 voff=0 +BA 19 CR 1+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 19 CR 2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 20 CR 0 TR 1 CH 1 NT C- 1/2 voff=0 +BA 20 CR 1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 20 CR 1 TR 1 CH 1 NT D- 1/2 voff=0 +BA 20 CR 1+1/2 TR 1 CH 1 NT D- 1/2 voff=0 +BA 20 CR 2 TR 1 CH 1 NT D- 1/2 voff=0 +BA 25 CR 0 TR 1 CH 1 NT C- 1/2 voff=0 +BA 25 CR 1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 25 CR 1 TR 1 CH 1 NT C- 1/2 voff=0 +BA 25 CR 1+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 25 CR 2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 25 CR 2+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 26 CR 0 TR 1 CH 1 NT F- 1/2 voff=0 +BA 26 CR 1/2 TR 1 CH 1 NT F- 1/2 voff=0 +BA 26 CR 1 TR 1 CH 1 NT G- 1/2 voff=0 +BA 26 CR 1+1/2 
TR 1 CH 1 NT G- 1/2 voff=0 +BA 26 CR 2 TR 1 CH 1 NT A- 1/2 voff=0 +BA 26 CR 2+1/2 TR 1 CH 1 NT A- 1/2 voff=0 +BA 27 CR 0 TR 1 CH 1 NT C- 1/2 voff=0 +BA 27 CR 1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 27 CR 1 TR 1 CH 1 NT C- 1/2 voff=0 +BA 27 CR 1+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 27 CR 2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 27 CR 2+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 27 CR 3 TR 1 CH 1 NT C- 1/2 voff=0 +BA 28 CR 0 TR 1 CH 1 NT F- 1/2 voff=0 +BA 28 CR 1/2 TR 1 CH 1 NT F- 1/2 voff=0 +BA 28 CR 1 TR 1 CH 1 NT G- 1/2 voff=0 +BA 28 CR 1+1/2 TR 1 CH 1 NT G- 1/2 voff=0 +BA 28 CR 2 TR 1 CH 1 NT A- 1/2 voff=0 +BA 28 CR 2+1/2 TR 1 CH 1 NT A- 1/2 voff=0 +BA 29 CR 0 TR 1 CH 1 NT C- 1/2 voff=0 +BA 29 CR 1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 29 CR 1 TR 1 CH 1 NT C- 1/2 voff=0 +BA 29 CR 1+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 29 CR 2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 29 CR 2+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 29 CR 3 TR 1 CH 1 NT C- 1/2 voff=0 +BA 30 CR 0 TR 1 CH 1 NT C- 1/2 voff=0 +BA 30 CR 1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 30 CR 1 TR 1 CH 1 NT D- 1/2 voff=0 +BA 30 CR 1+1/2 TR 1 CH 1 NT D- 1/2 voff=0 +BA 30 CR 2 TR 1 CH 1 NT D- 1/2 voff=0 +BA 30 CR 2+1/2 TR 1 CH 1 NT D- 1/2 voff=0 +BA 30 CR 3 TR 1 CH 1 NT D- 1/2 voff=0 +BA 31 CR 0 TR 1 CH 1 NT C- 1/2 voff=0 +BA 31 CR 1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 31 CR 1 TR 1 CH 1 NT C- 1/2 voff=0 +BA 31 CR 1+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 31 CR 2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 32 CR 0 TR 1 CH 1 NT C- 1/2 voff=0 +BA 32 CR 1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 32 CR 1 TR 1 CH 1 NT D- 1/2 voff=0 +BA 32 CR 1+1/2 TR 1 CH 1 NT D- 1/2 voff=0 +BA 32 CR 2 TR 1 CH 1 NT D- 1/2 voff=0 +BA 1 CR 0 TR 2 CH 2 Instrument 36 +BA 1 CR 0 TR 2 CH 2 NT R 0 von=127 voff=0 +BA 1 CR 0 TR 2 CH 2 NT E- 1/2 voff=0 +BA 1 CR 1/2 TR 2 CH 2 NT E- 1/2 voff=0 +BA 1 CR 1 TR 2 CH 2 NT A- 1/2 voff=0 +BA 1 CR 1+1/2 TR 2 CH 2 NT A- 1/2 voff=0 +BA 1 CR 2 TR 2 CH 2 NT A- 1/2 voff=0 +BA 1 CR 2+1/2 TR 2 CH 2 NT A- 1/2 voff=0 +BA 1 CR 3 TR 2 CH 2 NT A- 1/2 voff=0 +BA 1 CR 3+1/2 TR 2 CH 2 NT A- 1/2 voff=0 
+BA 2 CR 0 TR 2 CH 2 NT C- 1/2 voff=0 +BA 2 CR 1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 2 CR 1 TR 2 CH 2 NT C- 1/2 voff=0 +BA 2 CR 1+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 2 CR 2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 2 CR 2+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 2 CR 3 TR 2 CH 2 NT C- 1/2 voff=0 +BA 2 CR 3+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 3 CR 0 TR 2 CH 2 NT E- 1/2 voff=0 +BA 3 CR 1/2 TR 2 CH 2 NT E- 1/2 voff=0 +BA 3 CR 1 TR 2 CH 2 NT A- 1/2 voff=0 +BA 3 CR 1+1/2 TR 2 CH 2 NT A- 1/2 voff=0 +BA 3 CR 2 TR 2 CH 2 NT A- 1/2 voff=0 +BA 3 CR 2+1/2 TR 2 CH 2 NT A- 1/2 voff=0 +BA 3 CR 3 TR 2 CH 2 NT A- 1/2 voff=0 +BA 3 CR 3+1/2 TR 2 CH 2 NT A- 1/2 voff=0 +BA 4 CR 0 TR 2 CH 2 NT C- 1/2 voff=0 +BA 4 CR 1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 4 CR 1 TR 2 CH 2 NT C- 1/2 voff=0 +BA 4 CR 1+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 4 CR 2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 4 CR 2+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 4 CR 3 TR 2 CH 2 NT C- 1/2 voff=0 +BA 4 CR 3+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 5 CR 0 TR 2 CH 2 NT C- 1/2 voff=0 +BA 5 CR 1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 5 CR 1 TR 2 CH 2 NT C- 1/2 voff=0 +BA 5 CR 1+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 5 CR 2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 5 CR 2+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 5 CR 3 TR 2 CH 2 NT C- 1/2 voff=0 +BA 5 CR 3+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 6 CR 0 TR 2 CH 2 NT C- 1/2 voff=0 +BA 6 CR 1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 6 CR 1 TR 2 CH 2 NT C- 1/2 voff=0 +BA 6 CR 1+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 6 CR 2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 6 CR 2+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 6 CR 3 TR 2 CH 2 NT C- 1/2 voff=0 +BA 6 CR 3+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 7 CR 0 TR 2 CH 2 NT C- 1/2 voff=0 +BA 7 CR 1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 7 CR 1 TR 2 CH 2 NT C- 1/2 voff=0 +BA 7 CR 1+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 7 CR 2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 7 CR 2+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 7 CR 3 TR 2 CH 2 NT C- 1/2 voff=0 +BA 7 CR 3+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 8 CR 0 TR 2 CH 2 NT C- 1/2 voff=0 +BA 8 CR 1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 8 CR 1 TR 
2 CH 2 NT C- 1/2 voff=0 +BA 8 CR 1+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 8 CR 2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 8 CR 2+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 8 CR 3 TR 2 CH 2 NT C- 1/2 voff=0 +BA 8 CR 3+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 13 CR 0 TR 2 CH 2 NT C- 1/2 voff=0 +BA 13 CR 1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 13 CR 1 TR 2 CH 2 NT C- 1/2 voff=0 +BA 13 CR 1+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 13 CR 2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 13 CR 2+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 13 CR 3 TR 2 CH 2 NT C- 1/2 voff=0 +BA 13 CR 3+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 14 CR 0 TR 2 CH 2 NT C- 1/2 voff=0 +BA 14 CR 1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 14 CR 1 TR 2 CH 2 NT C- 1/2 voff=0 +BA 14 CR 1+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 14 CR 2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 14 CR 2+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 14 CR 3 TR 2 CH 2 NT C- 1/2 voff=0 +BA 14 CR 3+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 15 CR 0 TR 2 CH 2 NT C- 1/2 voff=0 +BA 15 CR 1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 15 CR 1 TR 2 CH 2 NT C- 1/2 voff=0 +BA 15 CR 1+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 15 CR 2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 15 CR 2+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 15 CR 3 TR 2 CH 2 NT C- 1/2 voff=0 +BA 15 CR 3+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 16 CR 0 TR 2 CH 2 NT C- 1/2 voff=0 +BA 16 CR 1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 16 CR 1 TR 2 CH 2 NT C- 1/2 voff=0 +BA 16 CR 1+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 16 CR 2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 16 CR 2+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 16 CR 3 TR 2 CH 2 NT C- 1/2 voff=0 +BA 16 CR 3+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 17 CR 0 TR 2 CH 2 NT C- 1/2 voff=0 +BA 17 CR 1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 17 CR 1 TR 2 CH 2 NT C- 1/2 voff=0 +BA 17 CR 1+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 17 CR 2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 17 CR 2+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 17 CR 3 TR 2 CH 2 NT C- 1/2 voff=0 +BA 17 CR 3+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 18 CR 0 TR 2 CH 2 NT C- 1/2 voff=0 +BA 18 CR 1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 18 CR 1 TR 2 CH 2 NT D- 1/2 voff=0 +BA 18 CR 1+1/2 TR 2 CH 
2 NT D- 1/2 voff=0 +BA 18 CR 2 TR 2 CH 2 NT D- 1/2 voff=0 +BA 18 CR 2+1/2 TR 2 CH 2 NT D- 1/2 voff=0 +BA 18 CR 3 TR 2 CH 2 NT D- 1/2 voff=0 +BA 18 CR 3+1/2 TR 2 CH 2 NT D- 1/2 voff=0 +BA 19 CR 0 TR 2 CH 2 NT C- 1/2 voff=0 +BA 19 CR 1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 19 CR 1 TR 2 CH 2 NT C- 1/2 voff=0 +BA 19 CR 1+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 19 CR 2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 19 CR 2+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 19 CR 3 TR 2 CH 2 NT C- 1/2 voff=0 +BA 19 CR 3+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 20 CR 0 TR 2 CH 2 NT C- 1/2 voff=0 +BA 20 CR 1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 20 CR 1 TR 2 CH 2 NT D- 1/2 voff=0 +BA 20 CR 1+1/2 TR 2 CH 2 NT D- 1/2 voff=0 +BA 20 CR 2 TR 2 CH 2 NT D- 1/2 voff=0 +BA 20 CR 2+1/2 TR 2 CH 2 NT D- 1/2 voff=0 +BA 20 CR 3 TR 2 CH 2 NT D- 1/2 voff=0 +BA 20 CR 3+1/2 TR 2 CH 2 NT D- 1/2 voff=0 +BA 21 CR 0 TR 2 CH 2 NT C- 1/2 voff=0 +BA 21 CR 1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 21 CR 1 TR 2 CH 2 NT C- 1/2 voff=0 +BA 21 CR 1+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 21 CR 2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 21 CR 2+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 21 CR 3 TR 2 CH 2 NT C- 1/2 voff=0 +BA 21 CR 3+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 22 CR 0 TR 2 CH 2 NT C- 1/2 voff=0 +BA 22 CR 1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 22 CR 1 TR 2 CH 2 NT C- 1/2 voff=0 +BA 22 CR 1+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 22 CR 2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 22 CR 2+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 22 CR 3 TR 2 CH 2 NT C- 1/2 voff=0 +BA 22 CR 3+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 23 CR 0 TR 2 CH 2 NT C- 1/2 voff=0 +BA 23 CR 1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 23 CR 1 TR 2 CH 2 NT C- 1/2 voff=0 +BA 23 CR 1+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 23 CR 2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 23 CR 2+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 23 CR 3 TR 2 CH 2 NT C- 1/2 voff=0 +BA 23 CR 3+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 24 CR 0 TR 2 CH 2 NT C- 1/2 voff=0 +BA 24 CR 1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 24 CR 1 TR 2 CH 2 NT C- 1/2 voff=0 +BA 24 CR 1+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 24 CR 2 TR 2 CH 2 NT 
C- 1/2 voff=0 +BA 24 CR 2+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 24 CR 3 TR 2 CH 2 NT C- 1/2 voff=0 +BA 24 CR 3+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 29 CR 0 TR 2 CH 2 NT C- 1/2 voff=0 +BA 29 CR 1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 29 CR 1 TR 2 CH 2 NT C- 1/2 voff=0 +BA 29 CR 1+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 29 CR 2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 29 CR 2+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 29 CR 3 TR 2 CH 2 NT C- 1/2 voff=0 +BA 29 CR 3+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 30 CR 0 TR 2 CH 2 NT C- 1/2 voff=0 +BA 30 CR 1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 30 CR 1 TR 2 CH 2 NT D- 1/2 voff=0 +BA 30 CR 1+1/2 TR 2 CH 2 NT D- 1/2 voff=0 +BA 30 CR 2 TR 2 CH 2 NT D- 1/2 voff=0 +BA 30 CR 2+1/2 TR 2 CH 2 NT D- 1/2 voff=0 +BA 30 CR 3 TR 2 CH 2 NT D- 1/2 voff=0 +BA 30 CR 3+1/2 TR 2 CH 2 NT D- 1/2 voff=0 +BA 31 CR 0 TR 2 CH 2 NT C- 1/2 voff=0 +BA 31 CR 1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 31 CR 1 TR 2 CH 2 NT C- 1/2 voff=0 +BA 31 CR 1+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 31 CR 2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 31 CR 2+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 31 CR 3 TR 2 CH 2 NT C- 1/2 voff=0 +BA 31 CR 3+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 32 CR 0 TR 2 CH 2 NT C- 1/2 voff=0 +BA 32 CR 1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 32 CR 1 TR 2 CH 2 NT D- 1/2 voff=0 +BA 32 CR 1+1/2 TR 2 CH 2 NT D- 1/2 voff=0 +BA 32 CR 2 TR 2 CH 2 NT D- 1/2 voff=0 +BA 32 CR 2+1/2 TR 2 CH 2 NT D- 1/2 voff=0 +BA 32 CR 3 TR 2 CH 2 NT D- 1/2 voff=0 +BA 32 CR 3+1/2 TR 2 CH 2 NT D- 1/2 voff=0 +BA 33 CR 0 TR 2 CH 2 NT C- 1/2 voff=0 +BA 33 CR 1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 33 CR 1 TR 2 CH 2 NT C- 1/2 voff=0 +BA 33 CR 1+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 33 CR 2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 33 CR 2+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 33 CR 3 TR 2 CH 2 NT C- 1/2 voff=0 +BA 33 CR 3+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 34 CR 0 TR 2 CH 2 NT D#- 1/2 voff=0 +BA 34 CR 1/2 TR 2 CH 2 NT D#- 1/2 voff=0 +BA 34 CR 1 TR 2 CH 2 NT F- 1/2 voff=0 +BA 34 CR 1+1/2 TR 2 CH 2 NT F- 1/2 voff=0 +BA 34 CR 2 TR 2 CH 2 NT G- 1/2 voff=0 +BA 34 CR 2+1/2 TR 2 CH 2 
NT G- 1/2 voff=0 +BA 34 CR 3 TR 2 CH 2 NT G- 1/2 voff=0 +BA 34 CR 3+1/2 TR 2 CH 2 NT G- 1/2 voff=0 +BA 35 CR 0 TR 2 CH 2 NT C- 1/2 voff=0 +BA 35 CR 1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 35 CR 1 TR 2 CH 2 NT C- 1/2 voff=0 +BA 35 CR 1+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 35 CR 2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 35 CR 2+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 35 CR 3 TR 2 CH 2 NT C- 1/2 voff=0 +BA 35 CR 3+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 36 CR 0 TR 2 CH 2 NT D#- 1/2 voff=0 +BA 36 CR 1/2 TR 2 CH 2 NT D#- 1/2 voff=0 +BA 36 CR 1 TR 2 CH 2 NT F- 1/2 voff=0 +BA 36 CR 1+1/2 TR 2 CH 2 NT F- 1/2 voff=0 +BA 36 CR 2 TR 2 CH 2 NT G- 1/2 voff=0 +BA 36 CR 2+1/2 TR 2 CH 2 NT G- 1/2 voff=0 +BA 36 CR 3 TR 2 CH 2 NT G- 1/2 voff=0 +BA 36 CR 3+1/2 TR 2 CH 2 NT G- 1/2 voff=0 +BA 1 CR 0 TR 3 CH 3 Instrument 113 +BA 1 CR 0 TR 3 CH 3 NT R 0 von=127 voff=0 +BA 1 CR 0 TR 3 CH 3 NT E- 1/2 voff=0 +BA 1 CR 0 TR 3 CH 3 NT F- 1/2 voff=0 +BA 1 CR 0 TR 3 CH 3 NT G- 1/2 voff=0 +BA 1 CR 1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 1 CR 1/2 TR 3 CH 3 NT F- 1/2 voff=0 +BA 1 CR 1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 1 CR 1 TR 3 CH 3 NT A- 1/2 voff=0 +BA 1 CR 1 TR 3 CH 3 NT B- 1/2 voff=0 +BA 1 CR 1 TR 3 CH 3 NT C- 1/2 voff=0 +BA 1 CR 1+1/2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 1 CR 1+1/2 TR 3 CH 3 NT B- 1/2 voff=0 +BA 1 CR 1+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 1 CR 2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 1 CR 2 TR 3 CH 3 NT B- 1/2 voff=0 +BA 1 CR 2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 1 CR 2+1/2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 1 CR 2+1/2 TR 3 CH 3 NT B- 1/2 voff=0 +BA 1 CR 2+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 1 CR 3 TR 3 CH 3 NT A- 1/2 voff=0 +BA 1 CR 3 TR 3 CH 3 NT B- 1/2 voff=0 +BA 1 CR 3 TR 3 CH 3 NT C- 1/2 voff=0 +BA 1 CR 3+1/2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 1 CR 3+1/2 TR 3 CH 3 NT B- 1/2 voff=0 +BA 1 CR 3+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 2 CR 0 TR 3 CH 3 NT C- 1/2 voff=0 +BA 2 CR 0 TR 3 CH 3 NT D- 1/2 voff=0 +BA 2 CR 0 TR 3 CH 3 NT E- 1/2 voff=0 +BA 2 CR 1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 2 CR 1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 2 CR 1/2 TR 3 
CH 3 NT E- 1/2 voff=0 +BA 2 CR 1 TR 3 CH 3 NT C- 1/2 voff=0 +BA 2 CR 1 TR 3 CH 3 NT D- 1/2 voff=0 +BA 2 CR 1 TR 3 CH 3 NT E- 1/2 voff=0 +BA 2 CR 1+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 2 CR 1+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 2 CR 1+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 2 CR 2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 2 CR 2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 2 CR 2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 2 CR 2+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 2 CR 2+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 2 CR 2+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 2 CR 3 TR 3 CH 3 NT C- 1/2 voff=0 +BA 2 CR 3 TR 3 CH 3 NT D- 1/2 voff=0 +BA 2 CR 3 TR 3 CH 3 NT E- 1/2 voff=0 +BA 2 CR 3+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 2 CR 3+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 2 CR 3+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 3 CR 0 TR 3 CH 3 NT E- 1/2 voff=0 +BA 3 CR 0 TR 3 CH 3 NT F- 1/2 voff=0 +BA 3 CR 0 TR 3 CH 3 NT G- 1/2 voff=0 +BA 3 CR 1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 3 CR 1/2 TR 3 CH 3 NT F- 1/2 voff=0 +BA 3 CR 1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 3 CR 1 TR 3 CH 3 NT A- 1/2 voff=0 +BA 3 CR 1 TR 3 CH 3 NT B- 1/2 voff=0 +BA 3 CR 1 TR 3 CH 3 NT C- 1/2 voff=0 +BA 3 CR 1+1/2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 3 CR 1+1/2 TR 3 CH 3 NT B- 1/2 voff=0 +BA 3 CR 1+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 3 CR 2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 3 CR 2 TR 3 CH 3 NT B- 1/2 voff=0 +BA 3 CR 2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 3 CR 2+1/2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 3 CR 2+1/2 TR 3 CH 3 NT B- 1/2 voff=0 +BA 3 CR 2+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 3 CR 3 TR 3 CH 3 NT A- 1/2 voff=0 +BA 3 CR 3 TR 3 CH 3 NT B- 1/2 voff=0 +BA 3 CR 3 TR 3 CH 3 NT C- 1/2 voff=0 +BA 3 CR 3+1/2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 3 CR 3+1/2 TR 3 CH 3 NT B- 1/2 voff=0 +BA 3 CR 3+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 4 CR 0 TR 3 CH 3 NT C- 1/2 voff=0 +BA 4 CR 0 TR 3 CH 3 NT D- 1/2 voff=0 +BA 4 CR 0 TR 3 CH 3 NT E- 1/2 voff=0 +BA 4 CR 1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 4 CR 1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 4 CR 1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 4 CR 1 TR 3 CH 3 NT C- 1/2 voff=0 +BA 4 CR 1 TR 3 CH 3 NT D- 1/2 
voff=0 +BA 4 CR 1 TR 3 CH 3 NT E- 1/2 voff=0 +BA 4 CR 1+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 4 CR 1+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 4 CR 1+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 4 CR 2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 4 CR 2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 4 CR 2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 4 CR 2+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 4 CR 2+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 4 CR 2+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 4 CR 3 TR 3 CH 3 NT C- 1/2 voff=0 +BA 4 CR 3 TR 3 CH 3 NT D- 1/2 voff=0 +BA 4 CR 3 TR 3 CH 3 NT E- 1/2 voff=0 +BA 4 CR 3+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 4 CR 3+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 4 CR 3+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 9 CR 0 TR 3 CH 3 NT B- 1/2 voff=0 +BA 9 CR 0 TR 3 CH 3 NT C- 1/2 voff=0 +BA 9 CR 0 TR 3 CH 3 NT D- 1/2 voff=0 +BA 9 CR 1/2 TR 3 CH 3 NT B- 1/2 voff=0 +BA 9 CR 1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 9 CR 1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 9 CR 1 TR 3 CH 3 NT E- 1/2 voff=0 +BA 9 CR 1 TR 3 CH 3 NT F- 1/2 voff=0 +BA 9 CR 1 TR 3 CH 3 NT G- 1/2 voff=0 +BA 9 CR 1+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 9 CR 1+1/2 TR 3 CH 3 NT F- 1/2 voff=0 +BA 9 CR 1+1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 9 CR 2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 9 CR 2 TR 3 CH 3 NT B- 1/2 voff=0 +BA 9 CR 2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 9 CR 2+1/2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 9 CR 2+1/2 TR 3 CH 3 NT B- 1/2 voff=0 +BA 9 CR 2+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 9 CR 3 TR 3 CH 3 NT A- 1/2 voff=0 +BA 9 CR 3 TR 3 CH 3 NT B- 1/2 voff=0 +BA 9 CR 3 TR 3 CH 3 NT C- 1/2 voff=0 +BA 9 CR 3+1/2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 9 CR 3+1/2 TR 3 CH 3 NT B- 1/2 voff=0 +BA 9 CR 3+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 10 CR 0 TR 3 CH 3 NT C- 1/2 voff=0 +BA 10 CR 0 TR 3 CH 3 NT D- 1/2 voff=0 +BA 10 CR 0 TR 3 CH 3 NT E- 1/2 voff=0 +BA 10 CR 1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 10 CR 1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 10 CR 1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 10 CR 1 TR 3 CH 3 NT C- 1/2 voff=0 +BA 10 CR 1 TR 3 CH 3 NT D- 1/2 voff=0 +BA 10 CR 1 TR 3 CH 3 NT E- 1/2 voff=0 +BA 10 CR 1+1/2 TR 3 CH 3 NT C- 1/2 
voff=0 +BA 10 CR 1+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 10 CR 1+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 10 CR 2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 10 CR 2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 10 CR 2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 10 CR 2+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 10 CR 2+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 10 CR 2+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 10 CR 3 TR 3 CH 3 NT C- 1/2 voff=0 +BA 10 CR 3 TR 3 CH 3 NT D- 1/2 voff=0 +BA 10 CR 3 TR 3 CH 3 NT E- 1/2 voff=0 +BA 10 CR 3+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 10 CR 3+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 10 CR 3+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 11 CR 0 TR 3 CH 3 NT B- 1/2 voff=0 +BA 11 CR 0 TR 3 CH 3 NT C- 1/2 voff=0 +BA 11 CR 0 TR 3 CH 3 NT D- 1/2 voff=0 +BA 11 CR 1/2 TR 3 CH 3 NT B- 1/2 voff=0 +BA 11 CR 1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 11 CR 1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 11 CR 1 TR 3 CH 3 NT E- 1/2 voff=0 +BA 11 CR 1 TR 3 CH 3 NT F- 1/2 voff=0 +BA 11 CR 1 TR 3 CH 3 NT G- 1/2 voff=0 +BA 11 CR 1+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 11 CR 1+1/2 TR 3 CH 3 NT F- 1/2 voff=0 +BA 11 CR 1+1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 11 CR 2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 11 CR 2 TR 3 CH 3 NT B- 1/2 voff=0 +BA 11 CR 2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 11 CR 2+1/2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 11 CR 2+1/2 TR 3 CH 3 NT B- 1/2 voff=0 +BA 11 CR 2+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 11 CR 3 TR 3 CH 3 NT A- 1/2 voff=0 +BA 11 CR 3 TR 3 CH 3 NT B- 1/2 voff=0 +BA 11 CR 3 TR 3 CH 3 NT C- 1/2 voff=0 +BA 11 CR 3+1/2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 11 CR 3+1/2 TR 3 CH 3 NT B- 1/2 voff=0 +BA 11 CR 3+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 12 CR 0 TR 3 CH 3 NT C- 1/2 voff=0 +BA 12 CR 0 TR 3 CH 3 NT D- 1/2 voff=0 +BA 12 CR 0 TR 3 CH 3 NT E- 1/2 voff=0 +BA 12 CR 1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 12 CR 1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 12 CR 1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 12 CR 1 TR 3 CH 3 NT C- 1/2 voff=0 +BA 12 CR 1 TR 3 CH 3 NT D- 1/2 voff=0 +BA 12 CR 1 TR 3 CH 3 NT E- 1/2 voff=0 +BA 12 CR 1+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 12 CR 1+1/2 TR 3 CH 3 NT D- 1/2 
voff=0 +BA 12 CR 1+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 12 CR 2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 12 CR 2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 12 CR 2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 12 CR 2+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 12 CR 2+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 12 CR 2+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 12 CR 3 TR 3 CH 3 NT C- 1/2 voff=0 +BA 12 CR 3 TR 3 CH 3 NT D- 1/2 voff=0 +BA 12 CR 3 TR 3 CH 3 NT E- 1/2 voff=0 +BA 12 CR 3+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 12 CR 3+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 12 CR 3+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 17 CR 0 TR 3 CH 3 NT C- 1/2 voff=0 +BA 17 CR 0 TR 3 CH 3 NT D- 1/2 voff=0 +BA 17 CR 0 TR 3 CH 3 NT E- 1/2 voff=0 +BA 17 CR 1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 17 CR 1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 17 CR 1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 17 CR 1 TR 3 CH 3 NT C- 1/2 voff=0 +BA 17 CR 1 TR 3 CH 3 NT D- 1/2 voff=0 +BA 17 CR 1 TR 3 CH 3 NT E- 1/2 voff=0 +BA 17 CR 1+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 17 CR 1+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 17 CR 1+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 17 CR 2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 17 CR 2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 17 CR 2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 17 CR 2+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 17 CR 2+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 17 CR 2+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 17 CR 3 TR 3 CH 3 NT C- 1/2 voff=0 +BA 17 CR 3 TR 3 CH 3 NT D- 1/2 voff=0 +BA 17 CR 3 TR 3 CH 3 NT E- 1/2 voff=0 +BA 17 CR 3+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 17 CR 3+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 17 CR 3+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 18 CR 0 TR 3 CH 3 NT C- 1/2 voff=0 +BA 18 CR 0 TR 3 CH 3 NT D- 1/2 voff=0 +BA 18 CR 0 TR 3 CH 3 NT E- 1/2 voff=0 +BA 18 CR 1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 18 CR 1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 18 CR 1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 18 CR 1 TR 3 CH 3 NT D- 1/2 voff=0 +BA 18 CR 1 TR 3 CH 3 NT E- 1/2 voff=0 +BA 18 CR 1 TR 3 CH 3 NT F- 1/2 voff=0 +BA 18 CR 1+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 18 CR 1+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 18 CR 1+1/2 TR 3 CH 3 NT F- 1/2 
voff=0 +BA 18 CR 2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 18 CR 2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 18 CR 2 TR 3 CH 3 NT F- 1/2 voff=0 +BA 18 CR 2+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 18 CR 2+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 18 CR 2+1/2 TR 3 CH 3 NT F- 1/2 voff=0 +BA 18 CR 3 TR 3 CH 3 NT D- 1/2 voff=0 +BA 18 CR 3 TR 3 CH 3 NT E- 1/2 voff=0 +BA 18 CR 3 TR 3 CH 3 NT F- 1/2 voff=0 +BA 18 CR 3+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 18 CR 3+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 18 CR 3+1/2 TR 3 CH 3 NT F- 1/2 voff=0 +BA 19 CR 0 TR 3 CH 3 NT C- 1/2 voff=0 +BA 19 CR 0 TR 3 CH 3 NT D- 1/2 voff=0 +BA 19 CR 0 TR 3 CH 3 NT E- 1/2 voff=0 +BA 19 CR 1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 19 CR 1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 19 CR 1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 19 CR 1 TR 3 CH 3 NT C- 1/2 voff=0 +BA 19 CR 1 TR 3 CH 3 NT D- 1/2 voff=0 +BA 19 CR 1 TR 3 CH 3 NT E- 1/2 voff=0 +BA 19 CR 1+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 19 CR 1+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 19 CR 1+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 19 CR 2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 19 CR 2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 19 CR 2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 19 CR 2+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 19 CR 2+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 19 CR 2+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 19 CR 3 TR 3 CH 3 NT C- 1/2 voff=0 +BA 19 CR 3 TR 3 CH 3 NT D- 1/2 voff=0 +BA 19 CR 3 TR 3 CH 3 NT E- 1/2 voff=0 +BA 19 CR 3+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 19 CR 3+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 19 CR 3+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 20 CR 0 TR 3 CH 3 NT C- 1/2 voff=0 +BA 20 CR 0 TR 3 CH 3 NT D- 1/2 voff=0 +BA 20 CR 0 TR 3 CH 3 NT E- 1/2 voff=0 +BA 20 CR 1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 20 CR 1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 20 CR 1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 20 CR 1 TR 3 CH 3 NT D- 1/2 voff=0 +BA 20 CR 1 TR 3 CH 3 NT E- 1/2 voff=0 +BA 20 CR 1 TR 3 CH 3 NT F- 1/2 voff=0 +BA 20 CR 1+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 20 CR 1+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 20 CR 1+1/2 TR 3 CH 3 NT F- 1/2 voff=0 +BA 20 CR 2 TR 3 CH 3 NT D- 1/2 
voff=0 +BA 20 CR 2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 20 CR 2 TR 3 CH 3 NT F- 1/2 voff=0 +BA 20 CR 2+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 20 CR 2+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 20 CR 2+1/2 TR 3 CH 3 NT F- 1/2 voff=0 +BA 20 CR 3 TR 3 CH 3 NT D- 1/2 voff=0 +BA 20 CR 3 TR 3 CH 3 NT E- 1/2 voff=0 +BA 20 CR 3 TR 3 CH 3 NT F- 1/2 voff=0 +BA 20 CR 3+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 20 CR 3+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 20 CR 3+1/2 TR 3 CH 3 NT F- 1/2 voff=0 +BA 29 CR 0 TR 3 CH 3 NT C- 1/2 voff=0 +BA 29 CR 0 TR 3 CH 3 NT D- 1/2 voff=0 +BA 29 CR 0 TR 3 CH 3 NT E- 1/2 voff=0 +BA 29 CR 1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 29 CR 1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 29 CR 1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 29 CR 1 TR 3 CH 3 NT C- 1/2 voff=0 +BA 29 CR 1 TR 3 CH 3 NT D- 1/2 voff=0 +BA 29 CR 1 TR 3 CH 3 NT E- 1/2 voff=0 +BA 29 CR 1+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 29 CR 1+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 29 CR 1+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 29 CR 2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 29 CR 2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 29 CR 2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 29 CR 2+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 29 CR 2+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 29 CR 2+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 29 CR 3 TR 3 CH 3 NT C- 1/2 voff=0 +BA 29 CR 3 TR 3 CH 3 NT D- 1/2 voff=0 +BA 29 CR 3 TR 3 CH 3 NT E- 1/2 voff=0 +BA 29 CR 3+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 29 CR 3+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 29 CR 3+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 30 CR 0 TR 3 CH 3 NT C- 1/2 voff=0 +BA 30 CR 0 TR 3 CH 3 NT D- 1/2 voff=0 +BA 30 CR 0 TR 3 CH 3 NT E- 1/2 voff=0 +BA 30 CR 1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 30 CR 1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 30 CR 1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 30 CR 1 TR 3 CH 3 NT D- 1/2 voff=0 +BA 30 CR 1 TR 3 CH 3 NT E- 1/2 voff=0 +BA 30 CR 1 TR 3 CH 3 NT F- 1/2 voff=0 +BA 30 CR 1+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 30 CR 1+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 30 CR 1+1/2 TR 3 CH 3 NT F- 1/2 voff=0 +BA 30 CR 2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 30 CR 2 TR 3 CH 3 NT E- 1/2 
voff=0 +BA 30 CR 2 TR 3 CH 3 NT F- 1/2 voff=0 +BA 30 CR 2+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 30 CR 2+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 30 CR 2+1/2 TR 3 CH 3 NT F- 1/2 voff=0 +BA 30 CR 3 TR 3 CH 3 NT D- 1/2 voff=0 +BA 30 CR 3 TR 3 CH 3 NT E- 1/2 voff=0 +BA 30 CR 3 TR 3 CH 3 NT F- 1/2 voff=0 +BA 30 CR 3+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 30 CR 3+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 30 CR 3+1/2 TR 3 CH 3 NT F- 1/2 voff=0 +BA 31 CR 0 TR 3 CH 3 NT C- 1/2 voff=0 +BA 31 CR 0 TR 3 CH 3 NT D- 1/2 voff=0 +BA 31 CR 0 TR 3 CH 3 NT E- 1/2 voff=0 +BA 31 CR 1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 31 CR 1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 31 CR 1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 31 CR 1 TR 3 CH 3 NT C- 1/2 voff=0 +BA 31 CR 1 TR 3 CH 3 NT D- 1/2 voff=0 +BA 31 CR 1 TR 3 CH 3 NT E- 1/2 voff=0 +BA 31 CR 1+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 31 CR 1+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 31 CR 1+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 31 CR 2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 31 CR 2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 31 CR 2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 31 CR 2+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 31 CR 2+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 31 CR 2+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 31 CR 3 TR 3 CH 3 NT C- 1/2 voff=0 +BA 31 CR 3 TR 3 CH 3 NT D- 1/2 voff=0 +BA 31 CR 3 TR 3 CH 3 NT E- 1/2 voff=0 +BA 31 CR 3+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 31 CR 3+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 31 CR 3+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 32 CR 0 TR 3 CH 3 NT C- 1/2 voff=0 +BA 32 CR 0 TR 3 CH 3 NT D- 1/2 voff=0 +BA 32 CR 0 TR 3 CH 3 NT E- 1/2 voff=0 +BA 32 CR 1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 32 CR 1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 32 CR 1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 32 CR 1 TR 3 CH 3 NT D- 1/2 voff=0 +BA 32 CR 1 TR 3 CH 3 NT E- 1/2 voff=0 +BA 32 CR 1 TR 3 CH 3 NT F- 1/2 voff=0 +BA 32 CR 1+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 32 CR 1+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 32 CR 1+1/2 TR 3 CH 3 NT F- 1/2 voff=0 +BA 32 CR 2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 32 CR 2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 32 CR 2 TR 3 CH 3 NT F- 1/2 
voff=0 +BA 32 CR 2+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 32 CR 2+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 32 CR 2+1/2 TR 3 CH 3 NT F- 1/2 voff=0 +BA 32 CR 3 TR 3 CH 3 NT D- 1/2 voff=0 +BA 32 CR 3 TR 3 CH 3 NT E- 1/2 voff=0 +BA 32 CR 3 TR 3 CH 3 NT F- 1/2 voff=0 +BA 32 CR 3+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 32 CR 3+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 32 CR 3+1/2 TR 3 CH 3 NT F- 1/2 voff=0 +BA 33 CR 0 TR 3 CH 3 NT C- 1/2 voff=0 +BA 33 CR 0 TR 3 CH 3 NT D- 1/2 voff=0 +BA 33 CR 0 TR 3 CH 3 NT E- 1/2 voff=0 +BA 33 CR 1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 33 CR 1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 33 CR 1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 33 CR 1 TR 3 CH 3 NT C- 1/2 voff=0 +BA 33 CR 1 TR 3 CH 3 NT D- 1/2 voff=0 +BA 33 CR 1 TR 3 CH 3 NT E- 1/2 voff=0 +BA 33 CR 1+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 33 CR 1+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 33 CR 1+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 33 CR 2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 33 CR 2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 33 CR 2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 33 CR 2+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 33 CR 2+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 33 CR 2+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 33 CR 3 TR 3 CH 3 NT C- 1/2 voff=0 +BA 33 CR 3 TR 3 CH 3 NT D- 1/2 voff=0 +BA 33 CR 3 TR 3 CH 3 NT E- 1/2 voff=0 +BA 33 CR 3+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 33 CR 3+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 33 CR 3+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 34 CR 0 TR 3 CH 3 NT D#- 1/2 voff=0 +BA 34 CR 0 TR 3 CH 3 NT F- 1/2 voff=0 +BA 34 CR 0 TR 3 CH 3 NT G- 1/2 voff=0 +BA 34 CR 1/2 TR 3 CH 3 NT D#- 1/2 voff=0 +BA 34 CR 1/2 TR 3 CH 3 NT F- 1/2 voff=0 +BA 34 CR 1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 34 CR 1 TR 3 CH 3 NT F- 1/2 voff=0 +BA 34 CR 1 TR 3 CH 3 NT G- 1/2 voff=0 +BA 34 CR 1 TR 3 CH 3 NT A- 1/2 voff=0 +BA 34 CR 1+1/2 TR 3 CH 3 NT F- 1/2 voff=0 +BA 34 CR 1+1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 34 CR 1+1/2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 34 CR 2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 34 CR 2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 34 CR 2 TR 3 CH 3 NT B- 1/2 voff=0 +BA 34 CR 2+1/2 TR 3 CH 3 NT G- 
1/2 voff=0 +BA 34 CR 2+1/2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 34 CR 2+1/2 TR 3 CH 3 NT B- 1/2 voff=0 +BA 34 CR 3 TR 3 CH 3 NT G- 1/2 voff=0 +BA 34 CR 3 TR 3 CH 3 NT A- 1/2 voff=0 +BA 34 CR 3 TR 3 CH 3 NT B- 1/2 voff=0 +BA 34 CR 3+1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 34 CR 3+1/2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 34 CR 3+1/2 TR 3 CH 3 NT B- 1/2 voff=0 +BA 35 CR 0 TR 3 CH 3 NT C- 1/2 voff=0 +BA 35 CR 0 TR 3 CH 3 NT D- 1/2 voff=0 +BA 35 CR 0 TR 3 CH 3 NT E- 1/2 voff=0 +BA 35 CR 1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 35 CR 1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 35 CR 1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 35 CR 1 TR 3 CH 3 NT C- 1/2 voff=0 +BA 35 CR 1 TR 3 CH 3 NT D- 1/2 voff=0 +BA 35 CR 1 TR 3 CH 3 NT E- 1/2 voff=0 +BA 35 CR 1+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 35 CR 1+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 35 CR 1+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 35 CR 2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 35 CR 2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 35 CR 2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 35 CR 2+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 35 CR 2+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 35 CR 2+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 35 CR 3 TR 3 CH 3 NT C- 1/2 voff=0 +BA 35 CR 3 TR 3 CH 3 NT D- 1/2 voff=0 +BA 35 CR 3 TR 3 CH 3 NT E- 1/2 voff=0 +BA 35 CR 3+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 35 CR 3+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 35 CR 3+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 36 CR 0 TR 3 CH 3 NT D#- 1/2 voff=0 +BA 36 CR 0 TR 3 CH 3 NT F- 1/2 voff=0 +BA 36 CR 0 TR 3 CH 3 NT G- 1/2 voff=0 +BA 36 CR 1/2 TR 3 CH 3 NT D#- 1/2 voff=0 +BA 36 CR 1/2 TR 3 CH 3 NT F- 1/2 voff=0 +BA 36 CR 1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 36 CR 1 TR 3 CH 3 NT F- 1/2 voff=0 +BA 36 CR 1 TR 3 CH 3 NT G- 1/2 voff=0 +BA 36 CR 1 TR 3 CH 3 NT A- 1/2 voff=0 +BA 36 CR 1+1/2 TR 3 CH 3 NT F- 1/2 voff=0 +BA 36 CR 1+1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 36 CR 1+1/2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 36 CR 2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 36 CR 2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 36 CR 2 TR 3 CH 3 NT B- 1/2 voff=0 +BA 36 CR 2+1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 36 CR 2+1/2 TR 3 CH 3 NT 
A- 1/2 voff=0 +BA 36 CR 2+1/2 TR 3 CH 3 NT B- 1/2 voff=0 +BA 36 CR 3 TR 3 CH 3 NT G- 1/2 voff=0 +BA 36 CR 3 TR 3 CH 3 NT A- 1/2 voff=0 +BA 36 CR 3 TR 3 CH 3 NT B- 1/2 voff=0 +BA 36 CR 3+1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 36 CR 3+1/2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 36 CR 3+1/2 TR 3 CH 3 NT B- 1/2 voff=0 +BA 1 CR 0 TR 4 CH 4 Instrument 95 +BA 1 CR 0 TR 4 CH 4 NT R 0 von=127 voff=0 +BA 1 CR 0 TR 4 CH 4 NT E- 1/2 voff=0 +BA 1 CR 0 TR 4 CH 4 NT F- 1/2 voff=0 +BA 1 CR 0 TR 4 CH 4 NT G- 1/2 voff=0 +BA 1 CR 1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 1 CR 1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 1 CR 1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 1 CR 1 TR 4 CH 4 NT A- 1/2 voff=0 +BA 1 CR 1 TR 4 CH 4 NT B- 1/2 voff=0 +BA 1 CR 1 TR 4 CH 4 NT C- 1/2 voff=0 +BA 1 CR 1+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 1 CR 1+1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 1 CR 1+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 1 CR 2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 1 CR 2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 1 CR 2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 1 CR 2+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 1 CR 2+1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 1 CR 2+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 1 CR 3 TR 4 CH 4 NT A- 1/2 voff=0 +BA 1 CR 3 TR 4 CH 4 NT B- 1/2 voff=0 +BA 1 CR 3 TR 4 CH 4 NT C- 1/2 voff=0 +BA 1 CR 3+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 1 CR 3+1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 1 CR 3+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 2 CR 0 TR 4 CH 4 NT C- 1/2 voff=0 +BA 2 CR 0 TR 4 CH 4 NT D- 1/2 voff=0 +BA 2 CR 0 TR 4 CH 4 NT E- 1/2 voff=0 +BA 2 CR 1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 2 CR 1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 2 CR 1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 2 CR 1 TR 4 CH 4 NT C- 1/2 voff=0 +BA 2 CR 1 TR 4 CH 4 NT D- 1/2 voff=0 +BA 2 CR 1 TR 4 CH 4 NT E- 1/2 voff=0 +BA 2 CR 1+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 2 CR 1+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 2 CR 1+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 2 CR 2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 2 CR 2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 2 CR 2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 2 CR 2+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 2 CR 2+1/2 TR 4 CH 4 NT D- 1/2 
voff=0 +BA 2 CR 2+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 2 CR 3 TR 4 CH 4 NT C- 1/2 voff=0 +BA 2 CR 3 TR 4 CH 4 NT D- 1/2 voff=0 +BA 2 CR 3 TR 4 CH 4 NT E- 1/2 voff=0 +BA 2 CR 3+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 2 CR 3+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 2 CR 3+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 3 CR 0 TR 4 CH 4 NT E- 1/2 voff=0 +BA 3 CR 0 TR 4 CH 4 NT F- 1/2 voff=0 +BA 3 CR 0 TR 4 CH 4 NT G- 1/2 voff=0 +BA 3 CR 1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 3 CR 1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 3 CR 1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 3 CR 1 TR 4 CH 4 NT A- 1/2 voff=0 +BA 3 CR 1 TR 4 CH 4 NT B- 1/2 voff=0 +BA 3 CR 1 TR 4 CH 4 NT C- 1/2 voff=0 +BA 3 CR 1+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 3 CR 1+1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 3 CR 1+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 3 CR 2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 3 CR 2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 3 CR 2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 3 CR 2+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 3 CR 2+1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 3 CR 2+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 3 CR 3 TR 4 CH 4 NT A- 1/2 voff=0 +BA 3 CR 3 TR 4 CH 4 NT B- 1/2 voff=0 +BA 3 CR 3 TR 4 CH 4 NT C- 1/2 voff=0 +BA 3 CR 3+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 3 CR 3+1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 3 CR 3+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 4 CR 0 TR 4 CH 4 NT C- 1/2 voff=0 +BA 4 CR 0 TR 4 CH 4 NT D- 1/2 voff=0 +BA 4 CR 0 TR 4 CH 4 NT E- 1/2 voff=0 +BA 4 CR 1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 4 CR 1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 4 CR 1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 4 CR 1 TR 4 CH 4 NT C- 1/2 voff=0 +BA 4 CR 1 TR 4 CH 4 NT D- 1/2 voff=0 +BA 4 CR 1 TR 4 CH 4 NT E- 1/2 voff=0 +BA 4 CR 1+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 4 CR 1+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 4 CR 1+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 4 CR 2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 4 CR 2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 4 CR 2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 4 CR 2+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 4 CR 2+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 4 CR 2+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 4 CR 3 TR 4 CH 4 NT C- 1/2 voff=0 +BA 
4 CR 3 TR 4 CH 4 NT D- 1/2 voff=0 +BA 4 CR 3 TR 4 CH 4 NT E- 1/2 voff=0 +BA 4 CR 3+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 4 CR 3+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 4 CR 3+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 5 CR 0 TR 4 CH 4 NT C- 1/2 voff=0 +BA 5 CR 0 TR 4 CH 4 NT D- 1/2 voff=0 +BA 5 CR 0 TR 4 CH 4 NT E- 1/2 voff=0 +BA 5 CR 1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 5 CR 1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 5 CR 1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 5 CR 1 TR 4 CH 4 NT C- 1/2 voff=0 +BA 5 CR 1 TR 4 CH 4 NT D- 1/2 voff=0 +BA 5 CR 1 TR 4 CH 4 NT E- 1/2 voff=0 +BA 5 CR 1+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 5 CR 1+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 5 CR 1+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 5 CR 2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 5 CR 2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 5 CR 2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 5 CR 2+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 5 CR 2+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 5 CR 2+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 5 CR 3 TR 4 CH 4 NT C- 1/2 voff=0 +BA 5 CR 3 TR 4 CH 4 NT D- 1/2 voff=0 +BA 5 CR 3 TR 4 CH 4 NT E- 1/2 voff=0 +BA 5 CR 3+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 5 CR 3+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 5 CR 3+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 6 CR 0 TR 4 CH 4 NT C- 1/2 voff=0 +BA 6 CR 0 TR 4 CH 4 NT D- 1/2 voff=0 +BA 6 CR 0 TR 4 CH 4 NT E- 1/2 voff=0 +BA 6 CR 1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 6 CR 1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 6 CR 1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 6 CR 1 TR 4 CH 4 NT C- 1/2 voff=0 +BA 6 CR 1 TR 4 CH 4 NT D- 1/2 voff=0 +BA 6 CR 1 TR 4 CH 4 NT E- 1/2 voff=0 +BA 6 CR 1+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 6 CR 1+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 6 CR 1+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 6 CR 2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 6 CR 2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 6 CR 2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 6 CR 2+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 6 CR 2+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 6 CR 2+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 6 CR 3 TR 4 CH 4 NT C- 1/2 voff=0 +BA 6 CR 3 TR 4 CH 4 NT D- 1/2 voff=0 +BA 6 CR 3 TR 4 CH 4 NT E- 1/2 voff=0 +BA 6 CR 3+1/2 TR 4 
CH 4 NT C- 1/2 voff=0 +BA 6 CR 3+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 6 CR 3+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 7 CR 0 TR 4 CH 4 NT C- 1/2 voff=0 +BA 7 CR 0 TR 4 CH 4 NT D- 1/2 voff=0 +BA 7 CR 0 TR 4 CH 4 NT E- 1/2 voff=0 +BA 7 CR 1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 7 CR 1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 7 CR 1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 7 CR 1 TR 4 CH 4 NT C- 1/2 voff=0 +BA 7 CR 1 TR 4 CH 4 NT D- 1/2 voff=0 +BA 7 CR 1 TR 4 CH 4 NT E- 1/2 voff=0 +BA 7 CR 1+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 7 CR 1+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 7 CR 1+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 7 CR 2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 7 CR 2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 7 CR 2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 7 CR 2+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 7 CR 2+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 7 CR 2+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 7 CR 3 TR 4 CH 4 NT C- 1/2 voff=0 +BA 7 CR 3 TR 4 CH 4 NT D- 1/2 voff=0 +BA 7 CR 3 TR 4 CH 4 NT E- 1/2 voff=0 +BA 7 CR 3+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 7 CR 3+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 7 CR 3+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 8 CR 0 TR 4 CH 4 NT C- 1/2 voff=0 +BA 8 CR 0 TR 4 CH 4 NT D- 1/2 voff=0 +BA 8 CR 0 TR 4 CH 4 NT E- 1/2 voff=0 +BA 8 CR 1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 8 CR 1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 8 CR 1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 8 CR 1 TR 4 CH 4 NT C- 1/2 voff=0 +BA 8 CR 1 TR 4 CH 4 NT D- 1/2 voff=0 +BA 8 CR 1 TR 4 CH 4 NT E- 1/2 voff=0 +BA 8 CR 1+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 8 CR 1+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 8 CR 1+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 8 CR 2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 8 CR 2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 8 CR 2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 8 CR 2+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 8 CR 2+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 8 CR 2+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 8 CR 3 TR 4 CH 4 NT C- 1/2 voff=0 +BA 8 CR 3 TR 4 CH 4 NT D- 1/2 voff=0 +BA 8 CR 3 TR 4 CH 4 NT E- 1/2 voff=0 +BA 8 CR 3+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 8 CR 3+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 8 CR 3+1/2 TR 4 CH 4 NT 
E- 1/2 voff=0 +BA 25 CR 0 TR 4 CH 4 NT C- 1/2 voff=0 +BA 25 CR 0 TR 4 CH 4 NT D- 1/2 voff=0 +BA 25 CR 0 TR 4 CH 4 NT E- 1/2 voff=0 +BA 25 CR 1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 25 CR 1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 25 CR 1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 25 CR 1 TR 4 CH 4 NT C- 1/2 voff=0 +BA 25 CR 1 TR 4 CH 4 NT D- 1/2 voff=0 +BA 25 CR 1 TR 4 CH 4 NT E- 1/2 voff=0 +BA 25 CR 1+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 25 CR 1+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 25 CR 1+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 25 CR 2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 25 CR 2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 25 CR 2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 25 CR 2+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 25 CR 2+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 25 CR 2+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 25 CR 3 TR 4 CH 4 NT C- 1/2 voff=0 +BA 25 CR 3 TR 4 CH 4 NT D- 1/2 voff=0 +BA 25 CR 3 TR 4 CH 4 NT E- 1/2 voff=0 +BA 25 CR 3+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 25 CR 3+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 25 CR 3+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 26 CR 0 TR 4 CH 4 NT F- 1/2 voff=0 +BA 26 CR 0 TR 4 CH 4 NT G- 1/2 voff=0 +BA 26 CR 0 TR 4 CH 4 NT A- 1/2 voff=0 +BA 26 CR 1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 26 CR 1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 26 CR 1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 26 CR 1 TR 4 CH 4 NT G- 1/2 voff=0 +BA 26 CR 1 TR 4 CH 4 NT A- 1/2 voff=0 +BA 26 CR 1 TR 4 CH 4 NT B- 1/2 voff=0 +BA 26 CR 1+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 26 CR 1+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 26 CR 1+1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 26 CR 2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 26 CR 2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 26 CR 2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 26 CR 2+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 26 CR 2+1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 26 CR 2+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 26 CR 3 TR 4 CH 4 NT A- 1/2 voff=0 +BA 26 CR 3 TR 4 CH 4 NT B- 1/2 voff=0 +BA 26 CR 3 TR 4 CH 4 NT C- 1/2 voff=0 +BA 26 CR 3+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 26 CR 3+1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 26 CR 3+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 27 CR 0 TR 4 CH 4 NT C- 
1/2 voff=0 +BA 27 CR 0 TR 4 CH 4 NT D- 1/2 voff=0 +BA 27 CR 0 TR 4 CH 4 NT E- 1/2 voff=0 +BA 27 CR 1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 27 CR 1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 27 CR 1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 27 CR 1 TR 4 CH 4 NT C- 1/2 voff=0 +BA 27 CR 1 TR 4 CH 4 NT D- 1/2 voff=0 +BA 27 CR 1 TR 4 CH 4 NT E- 1/2 voff=0 +BA 27 CR 1+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 27 CR 1+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 27 CR 1+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 27 CR 2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 27 CR 2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 27 CR 2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 27 CR 2+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 27 CR 2+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 27 CR 2+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 27 CR 3 TR 4 CH 4 NT C- 1/2 voff=0 +BA 27 CR 3 TR 4 CH 4 NT D- 1/2 voff=0 +BA 27 CR 3 TR 4 CH 4 NT E- 1/2 voff=0 +BA 27 CR 3+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 27 CR 3+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 27 CR 3+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 28 CR 0 TR 4 CH 4 NT F- 1/2 voff=0 +BA 28 CR 0 TR 4 CH 4 NT G- 1/2 voff=0 +BA 28 CR 0 TR 4 CH 4 NT A- 1/2 voff=0 +BA 28 CR 1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 28 CR 1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 28 CR 1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 28 CR 1 TR 4 CH 4 NT G- 1/2 voff=0 +BA 28 CR 1 TR 4 CH 4 NT A- 1/2 voff=0 +BA 28 CR 1 TR 4 CH 4 NT B- 1/2 voff=0 +BA 28 CR 1+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 28 CR 1+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 28 CR 1+1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 28 CR 2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 28 CR 2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 28 CR 2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 28 CR 2+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 28 CR 2+1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 28 CR 2+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 28 CR 3 TR 4 CH 4 NT A- 1/2 voff=0 +BA 28 CR 3 TR 4 CH 4 NT B- 1/2 voff=0 +BA 28 CR 3 TR 4 CH 4 NT C- 1/2 voff=0 +BA 28 CR 3+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 28 CR 3+1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 28 CR 3+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 37 CR 0 TR 0 CH 16 End of track +Nei el moves to Add ice. 
+ +Nei el moves to Add ice. +Ven ess is the gramm ar. +The graph er makes Add ice. +The choos er is in Skye ald. + +Ven ald feels the oth er. +Don ess nur tures one. +The test er loves Ack el. +Nei ess has Don ur. + +Dix ye has the trans lat or. +Don an is in the test er. +Add ice feels Nei ald. +Don an is Nei ess. + +Le ur loves the graph er. +Ven ae nur tures Le ice. +The breaths on er forms Don an. +Don an writes on Add ice. + + +format=1 tracks=4 division=384 + +BA 1 CR 0 TR 0 CH 16 Text type 2: "Produced by Mind Reading Music Composer by Lucian Academy" +BA 1 CR 0 TR 0 CH 16 Text type 3: "Whist ler loves Skye ess." +BA 1 CR 0 TR 0 CH 1 Channel volume 127 +BA 1 CR 0 TR 0 CH 16 Tempo 63.00009 +BA 1 CR 0 TR 1 CH 1 Instrument 1 +BA 1 CR 0 TR 1 CH 1 NT R 0 von=127 voff=0 +BA 5 CR 0 TR 1 CH 1 NT F- 1/2 voff=0 +BA 5 CR 1/2 TR 1 CH 1 NT F- 1/2 voff=0 +BA 5 CR 1 TR 1 CH 1 NT G- 1/2 voff=0 +BA 5 CR 1+1/2 TR 1 CH 1 NT G- 1/2 voff=0 +BA 5 CR 2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 6 CR 0 TR 1 CH 1 NT C- 1/2 voff=0 +BA 6 CR 1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 6 CR 1 TR 1 CH 1 NT C- 1/2 voff=0 +BA 6 CR 1+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 6 CR 2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 7 CR 0 TR 1 CH 1 NT F- 1/2 voff=0 +BA 7 CR 1/2 TR 1 CH 1 NT F- 1/2 voff=0 +BA 7 CR 1 TR 1 CH 1 NT G- 1/2 voff=0 +BA 7 CR 1+1/2 TR 1 CH 1 NT G- 1/2 voff=0 +BA 7 CR 2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 7 CR 2+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 8 CR 0 TR 1 CH 1 NT C- 1/2 voff=0 +BA 8 CR 1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 8 CR 1 TR 1 CH 1 NT C- 1/2 voff=0 +BA 8 CR 1+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 8 CR 2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 8 CR 2+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 8 CR 3 TR 1 CH 1 NT C- 1/2 voff=0 +BA 13 CR 0 TR 1 CH 1 NT F- 1/2 voff=0 +BA 13 CR 1/2 TR 1 CH 1 NT F- 1/2 voff=0 +BA 13 CR 1 TR 1 CH 1 NT G- 1/2 voff=0 +BA 13 CR 1+1/2 TR 1 CH 1 NT G- 1/2 voff=0 +BA 13 CR 2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 14 CR 0 TR 1 CH 1 NT C- 1/2 voff=0 +BA 14 CR 1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 14 CR 1 TR 1 CH 1 NT C- 
1/2 voff=0 +BA 14 CR 1+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 14 CR 2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 14 CR 2+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 15 CR 0 TR 1 CH 1 NT F- 1/2 voff=0 +BA 15 CR 1/2 TR 1 CH 1 NT F- 1/2 voff=0 +BA 15 CR 1 TR 1 CH 1 NT G- 1/2 voff=0 +BA 15 CR 1+1/2 TR 1 CH 1 NT G- 1/2 voff=0 +BA 15 CR 2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 15 CR 2+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 16 CR 0 TR 1 CH 1 NT C- 1/2 voff=0 +BA 16 CR 1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 16 CR 1 TR 1 CH 1 NT C- 1/2 voff=0 +BA 16 CR 1+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 16 CR 2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 16 CR 2+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 16 CR 3 TR 1 CH 1 NT C- 1/2 voff=0 +BA 17 CR 0 TR 1 CH 1 NT C- 1/2 voff=0 +BA 17 CR 1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 17 CR 1 TR 1 CH 1 NT C- 1/2 voff=0 +BA 17 CR 1+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 17 CR 2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 17 CR 2+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 17 CR 3 TR 1 CH 1 NT C- 1/2 voff=0 +BA 18 CR 0 TR 1 CH 1 NT C- 1/2 voff=0 +BA 18 CR 1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 18 CR 1 TR 1 CH 1 NT C- 1/2 voff=0 +BA 18 CR 1+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 18 CR 2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 18 CR 2+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 19 CR 0 TR 1 CH 1 NT C- 1/2 voff=0 +BA 19 CR 1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 19 CR 1 TR 1 CH 1 NT C- 1/2 voff=0 +BA 19 CR 1+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 19 CR 2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 19 CR 2+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 20 CR 0 TR 1 CH 1 NT C- 1/2 voff=0 +BA 20 CR 1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 20 CR 1 TR 1 CH 1 NT C- 1/2 voff=0 +BA 20 CR 1+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 20 CR 2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 25 CR 0 TR 1 CH 1 NT C- 1/2 voff=0 +BA 25 CR 1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 25 CR 1 TR 1 CH 1 NT C- 1/2 voff=0 +BA 25 CR 1+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 25 CR 2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 26 CR 0 TR 1 CH 1 NT G- 1/2 voff=0 +BA 26 CR 1/2 TR 1 CH 1 NT G- 1/2 voff=0 +BA 26 CR 1 TR 1 CH 1 NT C- 1/2 voff=0 +BA 26 CR 1+1/2 TR 1 CH 1 NT C- 1/2 voff=0 
+BA 26 CR 2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 27 CR 0 TR 1 CH 1 NT C- 1/2 voff=0 +BA 27 CR 1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 27 CR 1 TR 1 CH 1 NT C- 1/2 voff=0 +BA 27 CR 1+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 27 CR 2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 27 CR 2+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 27 CR 3 TR 1 CH 1 NT C- 1/2 voff=0 +BA 28 CR 0 TR 1 CH 1 NT G- 1/2 voff=0 +BA 28 CR 1/2 TR 1 CH 1 NT G- 1/2 voff=0 +BA 28 CR 1 TR 1 CH 1 NT C- 1/2 voff=0 +BA 28 CR 1+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 28 CR 2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 29 CR 0 TR 1 CH 1 NT C- 1/2 voff=0 +BA 29 CR 1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 29 CR 1 TR 1 CH 1 NT C- 1/2 voff=0 +BA 29 CR 1+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 29 CR 2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 29 CR 2+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 29 CR 3 TR 1 CH 1 NT C- 1/2 voff=0 +BA 30 CR 0 TR 1 CH 1 NT C- 1/2 voff=0 +BA 30 CR 1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 30 CR 1 TR 1 CH 1 NT C- 1/2 voff=0 +BA 30 CR 1+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 30 CR 2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 30 CR 2+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 31 CR 0 TR 1 CH 1 NT C- 1/2 voff=0 +BA 31 CR 1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 31 CR 1 TR 1 CH 1 NT C- 1/2 voff=0 +BA 31 CR 1+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 31 CR 2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 31 CR 2+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 32 CR 0 TR 1 CH 1 NT C- 1/2 voff=0 +BA 32 CR 1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 32 CR 1 TR 1 CH 1 NT C- 1/2 voff=0 +BA 32 CR 1+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 32 CR 2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 1 CR 0 TR 2 CH 2 Instrument 13 +BA 1 CR 0 TR 2 CH 2 NT R 0 von=127 voff=0 +BA 9 CR 0 TR 2 CH 2 NT C- 1/2 voff=0 +BA 9 CR 1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 9 CR 1 TR 2 CH 2 NT C- 1/2 voff=0 +BA 9 CR 1+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 9 CR 2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 9 CR 2+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 9 CR 3 TR 2 CH 2 NT C- 1/2 voff=0 +BA 9 CR 3+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 10 CR 0 TR 2 CH 2 NT C- 1/2 voff=0 +BA 10 CR 1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 10 CR 1 TR 2 CH 2 NT C- 
1/2 voff=0 +BA 10 CR 1+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 10 CR 2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 10 CR 2+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 10 CR 3 TR 2 CH 2 NT C- 1/2 voff=0 +BA 10 CR 3+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 11 CR 0 TR 2 CH 2 NT C- 1/2 voff=0 +BA 11 CR 1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 11 CR 1 TR 2 CH 2 NT C- 1/2 voff=0 +BA 11 CR 1+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 11 CR 2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 11 CR 2+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 11 CR 3 TR 2 CH 2 NT C- 1/2 voff=0 +BA 11 CR 3+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 12 CR 0 TR 2 CH 2 NT C- 1/2 voff=0 +BA 12 CR 1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 12 CR 1 TR 2 CH 2 NT C- 1/2 voff=0 +BA 12 CR 1+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 12 CR 2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 12 CR 2+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 12 CR 3 TR 2 CH 2 NT C- 1/2 voff=0 +BA 12 CR 3+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 33 CR 0 TR 2 CH 2 NT C- 1/2 voff=0 +BA 33 CR 1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 33 CR 1 TR 2 CH 2 NT C- 1/2 voff=0 +BA 33 CR 1+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 33 CR 2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 33 CR 2+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 33 CR 3 TR 2 CH 2 NT C- 1/2 voff=0 +BA 33 CR 3+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 34 CR 0 TR 2 CH 2 NT C- 1/2 voff=0 +BA 34 CR 1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 34 CR 1 TR 2 CH 2 NT C- 1/2 voff=0 +BA 34 CR 1+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 34 CR 2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 34 CR 2+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 34 CR 3 TR 2 CH 2 NT C- 1/2 voff=0 +BA 34 CR 3+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 35 CR 0 TR 2 CH 2 NT C- 1/2 voff=0 +BA 35 CR 1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 35 CR 1 TR 2 CH 2 NT C- 1/2 voff=0 +BA 35 CR 1+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 35 CR 2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 35 CR 2+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 35 CR 3 TR 2 CH 2 NT C- 1/2 voff=0 +BA 35 CR 3+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 36 CR 0 TR 2 CH 2 NT C- 1/2 voff=0 +BA 36 CR 1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 36 CR 1 TR 2 CH 2 NT C- 1/2 voff=0 +BA 36 CR 1+1/2 TR 2 CH 2 NT C- 
1/2 voff=0 +BA 36 CR 2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 36 CR 2+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 36 CR 3 TR 2 CH 2 NT C- 1/2 voff=0 +BA 36 CR 3+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 1 CR 0 TR 3 CH 3 Instrument 65 +BA 1 CR 0 TR 3 CH 3 NT R 0 von=127 voff=0 +BA 1 CR 0 TR 3 CH 3 NT C- 1/2 voff=0 +BA 1 CR 0 TR 3 CH 3 NT D- 1/2 voff=0 +BA 1 CR 0 TR 3 CH 3 NT E- 1/2 voff=0 +BA 1 CR 1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 1 CR 1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 1 CR 1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 1 CR 1 TR 3 CH 3 NT C- 1/2 voff=0 +BA 1 CR 1 TR 3 CH 3 NT D- 1/2 voff=0 +BA 1 CR 1 TR 3 CH 3 NT E- 1/2 voff=0 +BA 1 CR 1+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 1 CR 1+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 1 CR 1+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 1 CR 2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 1 CR 2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 1 CR 2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 1 CR 2+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 1 CR 2+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 1 CR 2+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 1 CR 3 TR 3 CH 3 NT C- 1/2 voff=0 +BA 1 CR 3 TR 3 CH 3 NT D- 1/2 voff=0 +BA 1 CR 3 TR 3 CH 3 NT E- 1/2 voff=0 +BA 1 CR 3+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 1 CR 3+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 1 CR 3+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 2 CR 0 TR 3 CH 3 NT C- 1/2 voff=0 +BA 2 CR 0 TR 3 CH 3 NT D- 1/2 voff=0 +BA 2 CR 0 TR 3 CH 3 NT E- 1/2 voff=0 +BA 2 CR 1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 2 CR 1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 2 CR 1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 2 CR 1 TR 3 CH 3 NT C- 1/2 voff=0 +BA 2 CR 1 TR 3 CH 3 NT D- 1/2 voff=0 +BA 2 CR 1 TR 3 CH 3 NT E- 1/2 voff=0 +BA 2 CR 1+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 2 CR 1+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 2 CR 1+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 2 CR 2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 2 CR 2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 2 CR 2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 2 CR 2+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 2 CR 2+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 2 CR 2+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 2 CR 3 TR 3 CH 3 NT C- 1/2 voff=0 +BA 2 CR 3 TR 3 CH 3 NT D- 1/2 voff=0 
+BA 2 CR 3 TR 3 CH 3 NT E- 1/2 voff=0 +BA 2 CR 3+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 2 CR 3+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 2 CR 3+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 3 CR 0 TR 3 CH 3 NT C- 1/2 voff=0 +BA 3 CR 0 TR 3 CH 3 NT D- 1/2 voff=0 +BA 3 CR 0 TR 3 CH 3 NT E- 1/2 voff=0 +BA 3 CR 1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 3 CR 1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 3 CR 1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 3 CR 1 TR 3 CH 3 NT C- 1/2 voff=0 +BA 3 CR 1 TR 3 CH 3 NT D- 1/2 voff=0 +BA 3 CR 1 TR 3 CH 3 NT E- 1/2 voff=0 +BA 3 CR 1+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 3 CR 1+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 3 CR 1+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 3 CR 2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 3 CR 2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 3 CR 2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 3 CR 2+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 3 CR 2+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 3 CR 2+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 3 CR 3 TR 3 CH 3 NT C- 1/2 voff=0 +BA 3 CR 3 TR 3 CH 3 NT D- 1/2 voff=0 +BA 3 CR 3 TR 3 CH 3 NT E- 1/2 voff=0 +BA 3 CR 3+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 3 CR 3+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 3 CR 3+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 4 CR 0 TR 3 CH 3 NT C- 1/2 voff=0 +BA 4 CR 0 TR 3 CH 3 NT D- 1/2 voff=0 +BA 4 CR 0 TR 3 CH 3 NT E- 1/2 voff=0 +BA 4 CR 1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 4 CR 1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 4 CR 1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 4 CR 1 TR 3 CH 3 NT C- 1/2 voff=0 +BA 4 CR 1 TR 3 CH 3 NT D- 1/2 voff=0 +BA 4 CR 1 TR 3 CH 3 NT E- 1/2 voff=0 +BA 4 CR 1+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 4 CR 1+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 4 CR 1+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 4 CR 2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 4 CR 2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 4 CR 2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 4 CR 2+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 4 CR 2+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 4 CR 2+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 4 CR 3 TR 3 CH 3 NT C- 1/2 voff=0 +BA 4 CR 3 TR 3 CH 3 NT D- 1/2 voff=0 +BA 4 CR 3 TR 3 CH 3 NT E- 1/2 voff=0 +BA 4 CR 3+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 4 CR 
3+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 4 CR 3+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 9 CR 0 TR 3 CH 3 NT C- 1/2 voff=0 +BA 9 CR 0 TR 3 CH 3 NT D- 1/2 voff=0 +BA 9 CR 0 TR 3 CH 3 NT E- 1/2 voff=0 +BA 9 CR 1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 9 CR 1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 9 CR 1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 9 CR 1 TR 3 CH 3 NT C- 1/2 voff=0 +BA 9 CR 1 TR 3 CH 3 NT D- 1/2 voff=0 +BA 9 CR 1 TR 3 CH 3 NT E- 1/2 voff=0 +BA 9 CR 1+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 9 CR 1+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 9 CR 1+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 9 CR 2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 9 CR 2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 9 CR 2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 9 CR 2+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 9 CR 2+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 9 CR 2+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 9 CR 3 TR 3 CH 3 NT C- 1/2 voff=0 +BA 9 CR 3 TR 3 CH 3 NT D- 1/2 voff=0 +BA 9 CR 3 TR 3 CH 3 NT E- 1/2 voff=0 +BA 9 CR 3+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 9 CR 3+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 9 CR 3+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 10 CR 0 TR 3 CH 3 NT C- 1/2 voff=0 +BA 10 CR 0 TR 3 CH 3 NT D- 1/2 voff=0 +BA 10 CR 0 TR 3 CH 3 NT E- 1/2 voff=0 +BA 10 CR 1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 10 CR 1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 10 CR 1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 10 CR 1 TR 3 CH 3 NT C- 1/2 voff=0 +BA 10 CR 1 TR 3 CH 3 NT D- 1/2 voff=0 +BA 10 CR 1 TR 3 CH 3 NT E- 1/2 voff=0 +BA 10 CR 1+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 10 CR 1+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 10 CR 1+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 10 CR 2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 10 CR 2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 10 CR 2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 10 CR 2+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 10 CR 2+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 10 CR 2+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 10 CR 3 TR 3 CH 3 NT C- 1/2 voff=0 +BA 10 CR 3 TR 3 CH 3 NT D- 1/2 voff=0 +BA 10 CR 3 TR 3 CH 3 NT E- 1/2 voff=0 +BA 10 CR 3+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 10 CR 3+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 10 CR 3+1/2 TR 3 CH 3 NT E- 1/2 
voff=0 +BA 11 CR 0 TR 3 CH 3 NT C- 1/2 voff=0 +BA 11 CR 0 TR 3 CH 3 NT D- 1/2 voff=0 +BA 11 CR 0 TR 3 CH 3 NT E- 1/2 voff=0 +BA 11 CR 1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 11 CR 1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 11 CR 1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 11 CR 1 TR 3 CH 3 NT C- 1/2 voff=0 +BA 11 CR 1 TR 3 CH 3 NT D- 1/2 voff=0 +BA 11 CR 1 TR 3 CH 3 NT E- 1/2 voff=0 +BA 11 CR 1+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 11 CR 1+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 11 CR 1+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 11 CR 2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 11 CR 2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 11 CR 2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 11 CR 2+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 11 CR 2+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 11 CR 2+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 11 CR 3 TR 3 CH 3 NT C- 1/2 voff=0 +BA 11 CR 3 TR 3 CH 3 NT D- 1/2 voff=0 +BA 11 CR 3 TR 3 CH 3 NT E- 1/2 voff=0 +BA 11 CR 3+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 11 CR 3+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 11 CR 3+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 12 CR 0 TR 3 CH 3 NT C- 1/2 voff=0 +BA 12 CR 0 TR 3 CH 3 NT D- 1/2 voff=0 +BA 12 CR 0 TR 3 CH 3 NT E- 1/2 voff=0 +BA 12 CR 1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 12 CR 1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 12 CR 1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 12 CR 1 TR 3 CH 3 NT C- 1/2 voff=0 +BA 12 CR 1 TR 3 CH 3 NT D- 1/2 voff=0 +BA 12 CR 1 TR 3 CH 3 NT E- 1/2 voff=0 +BA 12 CR 1+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 12 CR 1+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 12 CR 1+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 12 CR 2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 12 CR 2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 12 CR 2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 12 CR 2+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 12 CR 2+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 12 CR 2+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 12 CR 3 TR 3 CH 3 NT C- 1/2 voff=0 +BA 12 CR 3 TR 3 CH 3 NT D- 1/2 voff=0 +BA 12 CR 3 TR 3 CH 3 NT E- 1/2 voff=0 +BA 12 CR 3+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 12 CR 3+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 12 CR 3+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 13 CR 0 TR 3 CH 3 NT F- 1/2 
voff=0 +BA 13 CR 0 TR 3 CH 3 NT G- 1/2 voff=0 +BA 13 CR 0 TR 3 CH 3 NT A- 1/2 voff=0 +BA 13 CR 1/2 TR 3 CH 3 NT F- 1/2 voff=0 +BA 13 CR 1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 13 CR 1/2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 13 CR 1 TR 3 CH 3 NT A#- 1/2 voff=0 +BA 13 CR 1 TR 3 CH 3 NT C- 1/2 voff=0 +BA 13 CR 1 TR 3 CH 3 NT D- 1/2 voff=0 +BA 13 CR 1+1/2 TR 3 CH 3 NT A#- 1/2 voff=0 +BA 13 CR 1+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 13 CR 1+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 13 CR 2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 13 CR 2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 13 CR 2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 13 CR 2+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 13 CR 2+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 13 CR 2+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 13 CR 3 TR 3 CH 3 NT C- 1/2 voff=0 +BA 13 CR 3 TR 3 CH 3 NT D- 1/2 voff=0 +BA 13 CR 3 TR 3 CH 3 NT E- 1/2 voff=0 +BA 13 CR 3+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 13 CR 3+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 13 CR 3+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 14 CR 0 TR 3 CH 3 NT C- 1/2 voff=0 +BA 14 CR 0 TR 3 CH 3 NT D- 1/2 voff=0 +BA 14 CR 0 TR 3 CH 3 NT E- 1/2 voff=0 +BA 14 CR 1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 14 CR 1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 14 CR 1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 14 CR 1 TR 3 CH 3 NT C- 1/2 voff=0 +BA 14 CR 1 TR 3 CH 3 NT D- 1/2 voff=0 +BA 14 CR 1 TR 3 CH 3 NT E- 1/2 voff=0 +BA 14 CR 1+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 14 CR 1+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 14 CR 1+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 14 CR 2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 14 CR 2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 14 CR 2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 14 CR 2+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 14 CR 2+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 14 CR 2+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 14 CR 3 TR 3 CH 3 NT C- 1/2 voff=0 +BA 14 CR 3 TR 3 CH 3 NT D- 1/2 voff=0 +BA 14 CR 3 TR 3 CH 3 NT E- 1/2 voff=0 +BA 14 CR 3+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 14 CR 3+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 14 CR 3+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 15 CR 0 TR 3 CH 3 NT F- 1/2 voff=0 +BA 15 CR 0 TR 3 CH 3 NT G- 1/2 
voff=0 +BA 15 CR 0 TR 3 CH 3 NT A- 1/2 voff=0 +BA 15 CR 1/2 TR 3 CH 3 NT F- 1/2 voff=0 +BA 15 CR 1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 15 CR 1/2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 15 CR 1 TR 3 CH 3 NT A#- 1/2 voff=0 +BA 15 CR 1 TR 3 CH 3 NT C- 1/2 voff=0 +BA 15 CR 1 TR 3 CH 3 NT D- 1/2 voff=0 +BA 15 CR 1+1/2 TR 3 CH 3 NT A#- 1/2 voff=0 +BA 15 CR 1+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 15 CR 1+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 15 CR 2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 15 CR 2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 15 CR 2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 15 CR 2+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 15 CR 2+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 15 CR 2+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 15 CR 3 TR 3 CH 3 NT C- 1/2 voff=0 +BA 15 CR 3 TR 3 CH 3 NT D- 1/2 voff=0 +BA 15 CR 3 TR 3 CH 3 NT E- 1/2 voff=0 +BA 15 CR 3+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 15 CR 3+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 15 CR 3+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 16 CR 0 TR 3 CH 3 NT C- 1/2 voff=0 +BA 16 CR 0 TR 3 CH 3 NT D- 1/2 voff=0 +BA 16 CR 0 TR 3 CH 3 NT E- 1/2 voff=0 +BA 16 CR 1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 16 CR 1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 16 CR 1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 16 CR 1 TR 3 CH 3 NT C- 1/2 voff=0 +BA 16 CR 1 TR 3 CH 3 NT D- 1/2 voff=0 +BA 16 CR 1 TR 3 CH 3 NT E- 1/2 voff=0 +BA 16 CR 1+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 16 CR 1+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 16 CR 1+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 16 CR 2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 16 CR 2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 16 CR 2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 16 CR 2+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 16 CR 2+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 16 CR 2+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 16 CR 3 TR 3 CH 3 NT C- 1/2 voff=0 +BA 16 CR 3 TR 3 CH 3 NT D- 1/2 voff=0 +BA 16 CR 3 TR 3 CH 3 NT E- 1/2 voff=0 +BA 16 CR 3+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 16 CR 3+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 16 CR 3+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 21 CR 0 TR 3 CH 3 NT G- 1/2 voff=0 +BA 21 CR 0 TR 3 CH 3 NT A- 1/2 voff=0 +BA 21 CR 0 TR 3 CH 3 NT B- 1/2 
voff=0 +BA 21 CR 1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 21 CR 1/2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 21 CR 1/2 TR 3 CH 3 NT B- 1/2 voff=0 +BA 21 CR 1 TR 3 CH 3 NT A- 1/2 voff=0 +BA 21 CR 1 TR 3 CH 3 NT B- 1/2 voff=0 +BA 21 CR 1 TR 3 CH 3 NT C- 1/2 voff=0 +BA 21 CR 1+1/2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 21 CR 1+1/2 TR 3 CH 3 NT B- 1/2 voff=0 +BA 21 CR 1+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 21 CR 2 TR 3 CH 3 NT B- 1/2 voff=0 +BA 21 CR 2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 21 CR 2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 21 CR 2+1/2 TR 3 CH 3 NT B- 1/2 voff=0 +BA 21 CR 2+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 21 CR 2+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 21 CR 3 TR 3 CH 3 NT B- 1/2 voff=0 +BA 21 CR 3 TR 3 CH 3 NT C- 1/2 voff=0 +BA 21 CR 3 TR 3 CH 3 NT D- 1/2 voff=0 +BA 21 CR 3+1/2 TR 3 CH 3 NT B- 1/2 voff=0 +BA 21 CR 3+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 21 CR 3+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 22 CR 0 TR 3 CH 3 NT C- 1/2 voff=0 +BA 22 CR 0 TR 3 CH 3 NT D- 1/2 voff=0 +BA 22 CR 0 TR 3 CH 3 NT E- 1/2 voff=0 +BA 22 CR 1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 22 CR 1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 22 CR 1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 22 CR 1 TR 3 CH 3 NT C- 1/2 voff=0 +BA 22 CR 1 TR 3 CH 3 NT D- 1/2 voff=0 +BA 22 CR 1 TR 3 CH 3 NT E- 1/2 voff=0 +BA 22 CR 1+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 22 CR 1+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 22 CR 1+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 22 CR 2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 22 CR 2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 22 CR 2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 22 CR 2+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 22 CR 2+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 22 CR 2+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 22 CR 3 TR 3 CH 3 NT C- 1/2 voff=0 +BA 22 CR 3 TR 3 CH 3 NT D- 1/2 voff=0 +BA 22 CR 3 TR 3 CH 3 NT E- 1/2 voff=0 +BA 22 CR 3+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 22 CR 3+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 22 CR 3+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 23 CR 0 TR 3 CH 3 NT G- 1/2 voff=0 +BA 23 CR 0 TR 3 CH 3 NT A- 1/2 voff=0 +BA 23 CR 0 TR 3 CH 3 NT B- 1/2 voff=0 +BA 23 CR 1/2 TR 3 CH 3 NT G- 1/2 
voff=0 +BA 23 CR 1/2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 23 CR 1/2 TR 3 CH 3 NT B- 1/2 voff=0 +BA 23 CR 1 TR 3 CH 3 NT A- 1/2 voff=0 +BA 23 CR 1 TR 3 CH 3 NT B- 1/2 voff=0 +BA 23 CR 1 TR 3 CH 3 NT C- 1/2 voff=0 +BA 23 CR 1+1/2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 23 CR 1+1/2 TR 3 CH 3 NT B- 1/2 voff=0 +BA 23 CR 1+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 23 CR 2 TR 3 CH 3 NT B- 1/2 voff=0 +BA 23 CR 2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 23 CR 2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 23 CR 2+1/2 TR 3 CH 3 NT B- 1/2 voff=0 +BA 23 CR 2+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 23 CR 2+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 23 CR 3 TR 3 CH 3 NT B- 1/2 voff=0 +BA 23 CR 3 TR 3 CH 3 NT C- 1/2 voff=0 +BA 23 CR 3 TR 3 CH 3 NT D- 1/2 voff=0 +BA 23 CR 3+1/2 TR 3 CH 3 NT B- 1/2 voff=0 +BA 23 CR 3+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 23 CR 3+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 24 CR 0 TR 3 CH 3 NT C- 1/2 voff=0 +BA 24 CR 0 TR 3 CH 3 NT D- 1/2 voff=0 +BA 24 CR 0 TR 3 CH 3 NT E- 1/2 voff=0 +BA 24 CR 1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 24 CR 1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 24 CR 1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 24 CR 1 TR 3 CH 3 NT C- 1/2 voff=0 +BA 24 CR 1 TR 3 CH 3 NT D- 1/2 voff=0 +BA 24 CR 1 TR 3 CH 3 NT E- 1/2 voff=0 +BA 24 CR 1+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 24 CR 1+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 24 CR 1+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 24 CR 2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 24 CR 2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 24 CR 2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 24 CR 2+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 24 CR 2+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 24 CR 2+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 24 CR 3 TR 3 CH 3 NT C- 1/2 voff=0 +BA 24 CR 3 TR 3 CH 3 NT D- 1/2 voff=0 +BA 24 CR 3 TR 3 CH 3 NT E- 1/2 voff=0 +BA 24 CR 3+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 24 CR 3+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 24 CR 3+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 37 CR 0 TR 0 CH 16 End of track +Whist ler loves Skye ess. + +Whist ler loves Skye ess. +Count el has Dix ney. +Skye ur writes on Count ney. +The comb in er is Skye ess. 
+ +Syd ard feels Nei an. +Wilb ice lifts the choos er. +The choos er finds Dix ney. +The count er moves to Don ler. + +Whist ur reads on the oth er. +Le ae writes on Count ney. +The gramm ar forms Le ler. +Luc ice has Nei an. + +Ven ur is Skye ur. +Whist ler is Nei el. +The mir ac le lifts Count ald. +Count ee has Nei an. + + +format=1 tracks=7 division=384 + +BA 1 CR 0 TR 0 CH 16 Text type 2: "Produced by Mind Reading Music Composer by Lucian Academy" +BA 1 CR 0 TR 0 CH 16 Text type 3: "Ack ard finds the spell er." +BA 1 CR 0 TR 0 CH 1 Channel volume 127 +BA 1 CR 0 TR 0 CH 16 Tempo 63.00009 +BA 1 CR 0 TR 1 CH 1 Instrument 1 +BA 1 CR 0 TR 1 CH 1 NT R 0 von=127 voff=0 +BA 5 CR 0 TR 1 CH 1 NT E- 1/2 voff=0 +BA 5 CR 1/2 TR 1 CH 1 NT E- 1/2 voff=0 +BA 5 CR 1 TR 1 CH 1 NT E- 1/2 voff=0 +BA 5 CR 1+1/2 TR 1 CH 1 NT E- 1/2 voff=0 +BA 5 CR 2 TR 1 CH 1 NT E- 1/2 voff=0 +BA 5 CR 2+1/2 TR 1 CH 1 NT E- 1/2 voff=0 +BA 6 CR 0 TR 1 CH 1 NT A- 1/2 voff=0 +BA 6 CR 1/2 TR 1 CH 1 NT A- 1/2 voff=0 +BA 6 CR 1 TR 1 CH 1 NT G- 1/2 voff=0 +BA 6 CR 1+1/2 TR 1 CH 1 NT G- 1/2 voff=0 +BA 6 CR 2 TR 1 CH 1 NT D- 1/2 voff=0 +BA 6 CR 2+1/2 TR 1 CH 1 NT D- 1/2 voff=0 +BA 6 CR 3 TR 1 CH 1 NT D- 1/2 voff=0 +BA 6 CR 3+1/2 TR 1 CH 1 NT D- 1/2 voff=0 +BA 7 CR 0 TR 1 CH 1 NT E- 1/2 voff=0 +BA 7 CR 1/2 TR 1 CH 1 NT E- 1/2 voff=0 +BA 7 CR 1 TR 1 CH 1 NT E- 1/2 voff=0 +BA 7 CR 1+1/2 TR 1 CH 1 NT E- 1/2 voff=0 +BA 7 CR 2 TR 1 CH 1 NT E- 1/2 voff=0 +BA 7 CR 2+1/2 TR 1 CH 1 NT E- 1/2 voff=0 +BA 8 CR 0 TR 1 CH 1 NT A- 1/2 voff=0 +BA 8 CR 1/2 TR 1 CH 1 NT A- 1/2 voff=0 +BA 8 CR 1 TR 1 CH 1 NT G- 1/2 voff=0 +BA 8 CR 1+1/2 TR 1 CH 1 NT G- 1/2 voff=0 +BA 8 CR 2 TR 1 CH 1 NT D- 1/2 voff=0 +BA 8 CR 2+1/2 TR 1 CH 1 NT D- 1/2 voff=0 +BA 8 CR 3 TR 1 CH 1 NT D- 1/2 voff=0 +BA 13 CR 0 TR 1 CH 1 NT E- 1/2 voff=0 +BA 13 CR 1/2 TR 1 CH 1 NT E- 1/2 voff=0 +BA 13 CR 1 TR 1 CH 1 NT E- 1/2 voff=0 +BA 13 CR 1+1/2 TR 1 CH 1 NT E- 1/2 voff=0 +BA 13 CR 2 TR 1 CH 1 NT E- 1/2 voff=0 +BA 13 CR 2+1/2 TR 1 CH 1 NT E- 1/2 voff=0 +BA 14 CR 
0 TR 1 CH 1 NT A- 1/2 voff=0 +BA 14 CR 1/2 TR 1 CH 1 NT A- 1/2 voff=0 +BA 14 CR 1 TR 1 CH 1 NT G- 1/2 voff=0 +BA 14 CR 1+1/2 TR 1 CH 1 NT G- 1/2 voff=0 +BA 14 CR 2 TR 1 CH 1 NT D- 1/2 voff=0 +BA 14 CR 2+1/2 TR 1 CH 1 NT D- 1/2 voff=0 +BA 14 CR 3 TR 1 CH 1 NT D- 1/2 voff=0 +BA 15 CR 0 TR 1 CH 1 NT E- 1/2 voff=0 +BA 15 CR 1/2 TR 1 CH 1 NT E- 1/2 voff=0 +BA 15 CR 1 TR 1 CH 1 NT E- 1/2 voff=0 +BA 15 CR 1+1/2 TR 1 CH 1 NT E- 1/2 voff=0 +BA 15 CR 2 TR 1 CH 1 NT E- 1/2 voff=0 +BA 15 CR 2+1/2 TR 1 CH 1 NT E- 1/2 voff=0 +BA 15 CR 3 TR 1 CH 1 NT E- 1/2 voff=0 +BA 16 CR 0 TR 1 CH 1 NT A- 1/2 voff=0 +BA 16 CR 1/2 TR 1 CH 1 NT A- 1/2 voff=0 +BA 16 CR 1 TR 1 CH 1 NT G- 1/2 voff=0 +BA 16 CR 1+1/2 TR 1 CH 1 NT G- 1/2 voff=0 +BA 16 CR 2 TR 1 CH 1 NT D- 1/2 voff=0 +BA 16 CR 2+1/2 TR 1 CH 1 NT D- 1/2 voff=0 +BA 16 CR 3 TR 1 CH 1 NT D- 1/2 voff=0 +BA 17 CR 0 TR 1 CH 1 NT B- 1/2 voff=0 +BA 17 CR 1/2 TR 1 CH 1 NT B- 1/2 voff=0 +BA 17 CR 1 TR 1 CH 1 NT A- 1/2 voff=0 +BA 17 CR 1+1/2 TR 1 CH 1 NT A- 1/2 voff=0 +BA 18 CR 0 TR 1 CH 1 NT G- 1/2 voff=0 +BA 18 CR 1/2 TR 1 CH 1 NT G- 1/2 voff=0 +BA 18 CR 1 TR 1 CH 1 NT D- 1/2 voff=0 +BA 18 CR 1+1/2 TR 1 CH 1 NT D- 1/2 voff=0 +BA 18 CR 2 TR 1 CH 1 NT D- 1/2 voff=0 +BA 18 CR 2+1/2 TR 1 CH 1 NT D- 1/2 voff=0 +BA 19 CR 0 TR 1 CH 1 NT B- 1/2 voff=0 +BA 19 CR 1/2 TR 1 CH 1 NT B- 1/2 voff=0 +BA 19 CR 1 TR 1 CH 1 NT A- 1/2 voff=0 +BA 19 CR 1+1/2 TR 1 CH 1 NT A- 1/2 voff=0 +BA 19 CR 2 TR 1 CH 1 NT D- 1/2 voff=0 +BA 19 CR 2+1/2 TR 1 CH 1 NT D- 1/2 voff=0 +BA 20 CR 0 TR 1 CH 1 NT G- 1/2 voff=0 +BA 20 CR 1/2 TR 1 CH 1 NT G- 1/2 voff=0 +BA 20 CR 1 TR 1 CH 1 NT D- 1/2 voff=0 +BA 20 CR 1+1/2 TR 1 CH 1 NT D- 1/2 voff=0 +BA 20 CR 2 TR 1 CH 1 NT D- 1/2 voff=0 +BA 20 CR 2+1/2 TR 1 CH 1 NT D- 1/2 voff=0 +BA 20 CR 3 TR 1 CH 1 NT D- 1/2 voff=0 +BA 25 CR 0 TR 1 CH 1 NT D- 1/2 voff=0 +BA 25 CR 1/2 TR 1 CH 1 NT D- 1/2 voff=0 +BA 25 CR 1 TR 1 CH 1 NT C- 1/2 voff=0 +BA 25 CR 1+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 25 CR 2 TR 1 CH 1 NT F- 1/2 voff=0 +BA 26 CR 0 TR 1 CH 1 NT 
A#- 1/2 voff=0 +BA 26 CR 1/2 TR 1 CH 1 NT A#- 1/2 voff=0 +BA 26 CR 1 TR 1 CH 1 NT A- 1/2 voff=0 +BA 26 CR 1+1/2 TR 1 CH 1 NT A- 1/2 voff=0 +BA 26 CR 2 TR 1 CH 1 NT B- 1/2 voff=0 +BA 26 CR 2+1/2 TR 1 CH 1 NT B- 1/2 voff=0 +BA 26 CR 3 TR 1 CH 1 NT B- 1/2 voff=0 +BA 27 CR 0 TR 1 CH 1 NT D- 1/2 voff=0 +BA 27 CR 1/2 TR 1 CH 1 NT D- 1/2 voff=0 +BA 27 CR 1 TR 1 CH 1 NT C- 1/2 voff=0 +BA 27 CR 1+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 27 CR 2 TR 1 CH 1 NT F- 1/2 voff=0 +BA 27 CR 2+1/2 TR 1 CH 1 NT F- 1/2 voff=0 +BA 28 CR 0 TR 1 CH 1 NT A#- 1/2 voff=0 +BA 28 CR 1/2 TR 1 CH 1 NT A#- 1/2 voff=0 +BA 28 CR 1 TR 1 CH 1 NT A- 1/2 voff=0 +BA 28 CR 1+1/2 TR 1 CH 1 NT A- 1/2 voff=0 +BA 28 CR 2 TR 1 CH 1 NT B- 1/2 voff=0 +BA 28 CR 2+1/2 TR 1 CH 1 NT B- 1/2 voff=0 +BA 28 CR 3 TR 1 CH 1 NT B- 1/2 voff=0 +BA 29 CR 0 TR 1 CH 1 NT D- 1/2 voff=0 +BA 29 CR 1/2 TR 1 CH 1 NT D- 1/2 voff=0 +BA 29 CR 1 TR 1 CH 1 NT C- 1/2 voff=0 +BA 29 CR 1+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 29 CR 2 TR 1 CH 1 NT F- 1/2 voff=0 +BA 30 CR 0 TR 1 CH 1 NT A#- 1/2 voff=0 +BA 30 CR 1/2 TR 1 CH 1 NT A#- 1/2 voff=0 +BA 30 CR 1 TR 1 CH 1 NT A- 1/2 voff=0 +BA 30 CR 1+1/2 TR 1 CH 1 NT A- 1/2 voff=0 +BA 30 CR 2 TR 1 CH 1 NT B- 1/2 voff=0 +BA 30 CR 2+1/2 TR 1 CH 1 NT B- 1/2 voff=0 +BA 30 CR 3 TR 1 CH 1 NT B- 1/2 voff=0 +BA 31 CR 0 TR 1 CH 1 NT D- 1/2 voff=0 +BA 31 CR 1/2 TR 1 CH 1 NT D- 1/2 voff=0 +BA 31 CR 1 TR 1 CH 1 NT C- 1/2 voff=0 +BA 31 CR 1+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 31 CR 2 TR 1 CH 1 NT F- 1/2 voff=0 +BA 31 CR 2+1/2 TR 1 CH 1 NT F- 1/2 voff=0 +BA 32 CR 0 TR 1 CH 1 NT A#- 1/2 voff=0 +BA 32 CR 1/2 TR 1 CH 1 NT A#- 1/2 voff=0 +BA 32 CR 1 TR 1 CH 1 NT A- 1/2 voff=0 +BA 32 CR 1+1/2 TR 1 CH 1 NT A- 1/2 voff=0 +BA 32 CR 2 TR 1 CH 1 NT B- 1/2 voff=0 +BA 32 CR 2+1/2 TR 1 CH 1 NT B- 1/2 voff=0 +BA 32 CR 3 TR 1 CH 1 NT B- 1/2 voff=0 +BA 33 CR 0 TR 1 CH 1 NT D- 1/2 voff=0 +BA 33 CR 1/2 TR 1 CH 1 NT D- 1/2 voff=0 +BA 33 CR 1 TR 1 CH 1 NT C- 1/2 voff=0 +BA 33 CR 1+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 33 CR 2 TR 1 CH 1 NT F- 1/2 
voff=0 +BA 34 CR 0 TR 1 CH 1 NT A#- 1/2 voff=0 +BA 34 CR 1/2 TR 1 CH 1 NT A#- 1/2 voff=0 +BA 34 CR 1 TR 1 CH 1 NT A- 1/2 voff=0 +BA 34 CR 1+1/2 TR 1 CH 1 NT A- 1/2 voff=0 +BA 34 CR 2 TR 1 CH 1 NT B- 1/2 voff=0 +BA 34 CR 2+1/2 TR 1 CH 1 NT B- 1/2 voff=0 +BA 34 CR 3 TR 1 CH 1 NT B- 1/2 voff=0 +BA 35 CR 0 TR 1 CH 1 NT D- 1/2 voff=0 +BA 35 CR 1/2 TR 1 CH 1 NT D- 1/2 voff=0 +BA 35 CR 1 TR 1 CH 1 NT C- 1/2 voff=0 +BA 35 CR 1+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 35 CR 2 TR 1 CH 1 NT F- 1/2 voff=0 +BA 35 CR 2+1/2 TR 1 CH 1 NT F- 1/2 voff=0 +BA 36 CR 0 TR 1 CH 1 NT A#- 1/2 voff=0 +BA 36 CR 1/2 TR 1 CH 1 NT A#- 1/2 voff=0 +BA 36 CR 1 TR 1 CH 1 NT A- 1/2 voff=0 +BA 36 CR 1+1/2 TR 1 CH 1 NT A- 1/2 voff=0 +BA 36 CR 2 TR 1 CH 1 NT B- 1/2 voff=0 +BA 36 CR 2+1/2 TR 1 CH 1 NT B- 1/2 voff=0 +BA 36 CR 3 TR 1 CH 1 NT B- 1/2 voff=0 +BA 1 CR 0 TR 2 CH 2 Instrument 43 +BA 1 CR 0 TR 2 CH 2 NT R 0 von=127 voff=0 +BA 17 CR 0 TR 2 CH 2 NT B- 1/2 voff=0 +BA 17 CR 1/2 TR 2 CH 2 NT B- 1/2 voff=0 +BA 17 CR 1 TR 2 CH 2 NT A- 1/2 voff=0 +BA 17 CR 1+1/2 TR 2 CH 2 NT A- 1/2 voff=0 +BA 17 CR 2 TR 2 CH 2 NT D- 1/2 voff=0 +BA 17 CR 2+1/2 TR 2 CH 2 NT D- 1/2 voff=0 +BA 17 CR 3 TR 2 CH 2 NT D- 1/2 voff=0 +BA 17 CR 3+1/2 TR 2 CH 2 NT D- 1/2 voff=0 +BA 18 CR 0 TR 2 CH 2 NT G- 1/2 voff=0 +BA 18 CR 1/2 TR 2 CH 2 NT G- 1/2 voff=0 +BA 18 CR 1 TR 2 CH 2 NT D- 1/2 voff=0 +BA 18 CR 1+1/2 TR 2 CH 2 NT D- 1/2 voff=0 +BA 18 CR 2 TR 2 CH 2 NT D- 1/2 voff=0 +BA 18 CR 2+1/2 TR 2 CH 2 NT D- 1/2 voff=0 +BA 18 CR 3 TR 2 CH 2 NT D- 1/2 voff=0 +BA 18 CR 3+1/2 TR 2 CH 2 NT D- 1/2 voff=0 +BA 19 CR 0 TR 2 CH 2 NT B- 1/2 voff=0 +BA 19 CR 1/2 TR 2 CH 2 NT B- 1/2 voff=0 +BA 19 CR 1 TR 2 CH 2 NT A- 1/2 voff=0 +BA 19 CR 1+1/2 TR 2 CH 2 NT A- 1/2 voff=0 +BA 19 CR 2 TR 2 CH 2 NT D- 1/2 voff=0 +BA 19 CR 2+1/2 TR 2 CH 2 NT D- 1/2 voff=0 +BA 19 CR 3 TR 2 CH 2 NT D- 1/2 voff=0 +BA 19 CR 3+1/2 TR 2 CH 2 NT D- 1/2 voff=0 +BA 20 CR 0 TR 2 CH 2 NT G- 1/2 voff=0 +BA 20 CR 1/2 TR 2 CH 2 NT G- 1/2 voff=0 +BA 20 CR 1 TR 2 CH 2 NT D- 1/2 voff=0 +BA 
20 CR 1+1/2 TR 2 CH 2 NT D- 1/2 voff=0 +BA 20 CR 2 TR 2 CH 2 NT D- 1/2 voff=0 +BA 20 CR 2+1/2 TR 2 CH 2 NT D- 1/2 voff=0 +BA 20 CR 3 TR 2 CH 2 NT D- 1/2 voff=0 +BA 20 CR 3+1/2 TR 2 CH 2 NT D- 1/2 voff=0 +BA 21 CR 0 TR 2 CH 2 NT D- 1/2 voff=0 +BA 21 CR 1/2 TR 2 CH 2 NT D- 1/2 voff=0 +BA 21 CR 1 TR 2 CH 2 NT D- 1/2 voff=0 +BA 21 CR 1+1/2 TR 2 CH 2 NT D- 1/2 voff=0 +BA 21 CR 2 TR 2 CH 2 NT D- 1/2 voff=0 +BA 21 CR 2+1/2 TR 2 CH 2 NT D- 1/2 voff=0 +BA 21 CR 3 TR 2 CH 2 NT D- 1/2 voff=0 +BA 21 CR 3+1/2 TR 2 CH 2 NT D- 1/2 voff=0 +BA 22 CR 0 TR 2 CH 2 NT G- 1/2 voff=0 +BA 22 CR 1/2 TR 2 CH 2 NT G- 1/2 voff=0 +BA 22 CR 1 TR 2 CH 2 NT D- 1/2 voff=0 +BA 22 CR 1+1/2 TR 2 CH 2 NT D- 1/2 voff=0 +BA 22 CR 2 TR 2 CH 2 NT D- 1/2 voff=0 +BA 22 CR 2+1/2 TR 2 CH 2 NT D- 1/2 voff=0 +BA 22 CR 3 TR 2 CH 2 NT D- 1/2 voff=0 +BA 22 CR 3+1/2 TR 2 CH 2 NT D- 1/2 voff=0 +BA 23 CR 0 TR 2 CH 2 NT D- 1/2 voff=0 +BA 23 CR 1/2 TR 2 CH 2 NT D- 1/2 voff=0 +BA 23 CR 1 TR 2 CH 2 NT D- 1/2 voff=0 +BA 23 CR 1+1/2 TR 2 CH 2 NT D- 1/2 voff=0 +BA 23 CR 2 TR 2 CH 2 NT D- 1/2 voff=0 +BA 23 CR 2+1/2 TR 2 CH 2 NT D- 1/2 voff=0 +BA 23 CR 3 TR 2 CH 2 NT D- 1/2 voff=0 +BA 23 CR 3+1/2 TR 2 CH 2 NT D- 1/2 voff=0 +BA 24 CR 0 TR 2 CH 2 NT G- 1/2 voff=0 +BA 24 CR 1/2 TR 2 CH 2 NT G- 1/2 voff=0 +BA 24 CR 1 TR 2 CH 2 NT D- 1/2 voff=0 +BA 24 CR 1+1/2 TR 2 CH 2 NT D- 1/2 voff=0 +BA 24 CR 2 TR 2 CH 2 NT D- 1/2 voff=0 +BA 24 CR 2+1/2 TR 2 CH 2 NT D- 1/2 voff=0 +BA 24 CR 3 TR 2 CH 2 NT D- 1/2 voff=0 +BA 24 CR 3+1/2 TR 2 CH 2 NT D- 1/2 voff=0 +BA 1 CR 0 TR 3 CH 3 Instrument 47 +BA 1 CR 0 TR 3 CH 3 NT R 0 von=127 voff=0 +BA 1 CR 0 TR 3 CH 3 NT G- 1/2 voff=0 +BA 1 CR 1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 1 CR 1 TR 3 CH 3 NT A- 1/2 voff=0 +BA 1 CR 1+1/2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 1 CR 2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 1 CR 2+1/2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 1 CR 3 TR 3 CH 3 NT A- 1/2 voff=0 +BA 1 CR 3+1/2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 2 CR 0 TR 3 CH 3 NT D- 1/2 voff=0 +BA 2 CR 1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 2 CR 1 TR 3 CH 
3 NT E- 1/2 voff=0 +BA 2 CR 1+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 2 CR 2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 2 CR 2+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 2 CR 3 TR 3 CH 3 NT E- 1/2 voff=0 +BA 2 CR 3+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 3 CR 0 TR 3 CH 3 NT G- 1/2 voff=0 +BA 3 CR 1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 3 CR 1 TR 3 CH 3 NT A- 1/2 voff=0 +BA 3 CR 1+1/2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 3 CR 2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 3 CR 2+1/2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 3 CR 3 TR 3 CH 3 NT A- 1/2 voff=0 +BA 3 CR 3+1/2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 4 CR 0 TR 3 CH 3 NT D- 1/2 voff=0 +BA 4 CR 1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 4 CR 1 TR 3 CH 3 NT E- 1/2 voff=0 +BA 4 CR 1+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 4 CR 2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 4 CR 2+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 4 CR 3 TR 3 CH 3 NT E- 1/2 voff=0 +BA 4 CR 3+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 9 CR 0 TR 3 CH 3 NT D- 1/2 voff=0 +BA 9 CR 1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 9 CR 1 TR 3 CH 3 NT C#- 1/2 voff=0 +BA 9 CR 1+1/2 TR 3 CH 3 NT C#- 1/2 voff=0 +BA 9 CR 2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 9 CR 2+1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 9 CR 3 TR 3 CH 3 NT G- 1/2 voff=0 +BA 9 CR 3+1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 10 CR 0 TR 3 CH 3 NT C- 1/2 voff=0 +BA 10 CR 1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 10 CR 1 TR 3 CH 3 NT F- 1/2 voff=0 +BA 10 CR 1+1/2 TR 3 CH 3 NT F- 1/2 voff=0 +BA 10 CR 2 TR 3 CH 3 NT B- 1/2 voff=0 +BA 10 CR 2+1/2 TR 3 CH 3 NT B- 1/2 voff=0 +BA 10 CR 3 TR 3 CH 3 NT B- 1/2 voff=0 +BA 10 CR 3+1/2 TR 3 CH 3 NT B- 1/2 voff=0 +BA 11 CR 0 TR 3 CH 3 NT D- 1/2 voff=0 +BA 11 CR 1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 11 CR 1 TR 3 CH 3 NT C#- 1/2 voff=0 +BA 11 CR 1+1/2 TR 3 CH 3 NT C#- 1/2 voff=0 +BA 11 CR 2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 11 CR 2+1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 11 CR 3 TR 3 CH 3 NT G- 1/2 voff=0 +BA 11 CR 3+1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 12 CR 0 TR 3 CH 3 NT C- 1/2 voff=0 +BA 12 CR 1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 12 CR 1 TR 3 CH 3 NT F- 1/2 voff=0 +BA 12 CR 1+1/2 TR 3 CH 3 NT F- 1/2 voff=0 +BA 12 
CR 2 TR 3 CH 3 NT B- 1/2 voff=0 +BA 12 CR 2+1/2 TR 3 CH 3 NT B- 1/2 voff=0 +BA 12 CR 3 TR 3 CH 3 NT B- 1/2 voff=0 +BA 12 CR 3+1/2 TR 3 CH 3 NT B- 1/2 voff=0 +BA 13 CR 0 TR 3 CH 3 NT E- 1/2 voff=0 +BA 13 CR 1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 13 CR 1 TR 3 CH 3 NT E- 1/2 voff=0 +BA 13 CR 1+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 13 CR 2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 13 CR 2+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 13 CR 3 TR 3 CH 3 NT E- 1/2 voff=0 +BA 13 CR 3+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 14 CR 0 TR 3 CH 3 NT A- 1/2 voff=0 +BA 14 CR 1/2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 14 CR 1 TR 3 CH 3 NT G- 1/2 voff=0 +BA 14 CR 1+1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 14 CR 2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 14 CR 2+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 14 CR 3 TR 3 CH 3 NT D- 1/2 voff=0 +BA 14 CR 3+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 15 CR 0 TR 3 CH 3 NT E- 1/2 voff=0 +BA 15 CR 1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 15 CR 1 TR 3 CH 3 NT E- 1/2 voff=0 +BA 15 CR 1+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 15 CR 2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 15 CR 2+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 15 CR 3 TR 3 CH 3 NT E- 1/2 voff=0 +BA 15 CR 3+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 16 CR 0 TR 3 CH 3 NT A- 1/2 voff=0 +BA 16 CR 1/2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 16 CR 1 TR 3 CH 3 NT G- 1/2 voff=0 +BA 16 CR 1+1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 16 CR 2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 16 CR 2+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 16 CR 3 TR 3 CH 3 NT D- 1/2 voff=0 +BA 16 CR 3+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 17 CR 0 TR 3 CH 3 NT B- 1/2 voff=0 +BA 17 CR 1/2 TR 3 CH 3 NT B- 1/2 voff=0 +BA 17 CR 1 TR 3 CH 3 NT A- 1/2 voff=0 +BA 17 CR 1+1/2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 17 CR 2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 17 CR 2+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 17 CR 3 TR 3 CH 3 NT D- 1/2 voff=0 +BA 17 CR 3+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 18 CR 0 TR 3 CH 3 NT G- 1/2 voff=0 +BA 18 CR 1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 18 CR 1 TR 3 CH 3 NT D- 1/2 voff=0 +BA 18 CR 1+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 18 CR 2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 18 CR 
2+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 18 CR 3 TR 3 CH 3 NT D- 1/2 voff=0 +BA 18 CR 3+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 19 CR 0 TR 3 CH 3 NT B- 1/2 voff=0 +BA 19 CR 1/2 TR 3 CH 3 NT B- 1/2 voff=0 +BA 19 CR 1 TR 3 CH 3 NT A- 1/2 voff=0 +BA 19 CR 1+1/2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 19 CR 2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 19 CR 2+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 19 CR 3 TR 3 CH 3 NT D- 1/2 voff=0 +BA 19 CR 3+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 20 CR 0 TR 3 CH 3 NT G- 1/2 voff=0 +BA 20 CR 1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 20 CR 1 TR 3 CH 3 NT D- 1/2 voff=0 +BA 20 CR 1+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 20 CR 2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 20 CR 2+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 20 CR 3 TR 3 CH 3 NT D- 1/2 voff=0 +BA 20 CR 3+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 21 CR 0 TR 3 CH 3 NT D- 1/2 voff=0 +BA 21 CR 1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 21 CR 1 TR 3 CH 3 NT D- 1/2 voff=0 +BA 21 CR 1+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 21 CR 2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 21 CR 2+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 21 CR 3 TR 3 CH 3 NT D- 1/2 voff=0 +BA 21 CR 3+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 22 CR 0 TR 3 CH 3 NT G- 1/2 voff=0 +BA 22 CR 1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 22 CR 1 TR 3 CH 3 NT D- 1/2 voff=0 +BA 22 CR 1+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 22 CR 2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 22 CR 2+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 22 CR 3 TR 3 CH 3 NT D- 1/2 voff=0 +BA 22 CR 3+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 23 CR 0 TR 3 CH 3 NT D- 1/2 voff=0 +BA 23 CR 1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 23 CR 1 TR 3 CH 3 NT D- 1/2 voff=0 +BA 23 CR 1+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 23 CR 2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 23 CR 2+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 23 CR 3 TR 3 CH 3 NT D- 1/2 voff=0 +BA 23 CR 3+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 24 CR 0 TR 3 CH 3 NT G- 1/2 voff=0 +BA 24 CR 1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 24 CR 1 TR 3 CH 3 NT D- 1/2 voff=0 +BA 24 CR 1+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 24 CR 2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 24 CR 2+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 24 CR 
3 TR 3 CH 3 NT D- 1/2 voff=0 +BA 24 CR 3+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 1 CR 0 TR 4 CH 4 Instrument 8 +BA 1 CR 0 TR 4 CH 4 NT R 0 von=127 voff=0 +BA 5 CR 0 TR 4 CH 4 NT E- 1/2 voff=0 +BA 5 CR 0 TR 4 CH 4 NT G- 1/2 voff=0 +BA 5 CR 0 TR 4 CH 4 NT B- 1/2 voff=0 +BA 5 CR 1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 5 CR 1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 5 CR 1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 5 CR 1 TR 4 CH 4 NT E- 1/2 voff=0 +BA 5 CR 1 TR 4 CH 4 NT G- 1/2 voff=0 +BA 5 CR 1 TR 4 CH 4 NT B- 1/2 voff=0 +BA 5 CR 1+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 5 CR 1+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 5 CR 1+1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 5 CR 2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 5 CR 2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 5 CR 2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 5 CR 2+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 5 CR 2+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 5 CR 2+1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 5 CR 3 TR 4 CH 4 NT E- 1/2 voff=0 +BA 5 CR 3 TR 4 CH 4 NT G- 1/2 voff=0 +BA 5 CR 3 TR 4 CH 4 NT B- 1/2 voff=0 +BA 5 CR 3+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 5 CR 3+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 5 CR 3+1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 6 CR 0 TR 4 CH 4 NT A- 1/2 voff=0 +BA 6 CR 0 TR 4 CH 4 NT C- 1/2 voff=0 +BA 6 CR 0 TR 4 CH 4 NT E- 1/2 voff=0 +BA 6 CR 1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 6 CR 1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 6 CR 1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 6 CR 1 TR 4 CH 4 NT D- 1/2 voff=0 +BA 6 CR 1 TR 4 CH 4 NT F- 1/2 voff=0 +BA 6 CR 1 TR 4 CH 4 NT A- 1/2 voff=0 +BA 6 CR 1+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 6 CR 1+1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 6 CR 1+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 6 CR 2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 6 CR 2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 6 CR 2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 6 CR 2+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 6 CR 2+1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 6 CR 2+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 6 CR 3 TR 4 CH 4 NT D- 1/2 voff=0 +BA 6 CR 3 TR 4 CH 4 NT F- 1/2 voff=0 +BA 6 CR 3 TR 4 CH 4 NT A- 1/2 voff=0 +BA 6 CR 3+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 6 CR 3+1/2 TR 4 CH 4 
NT F- 1/2 voff=0 +BA 6 CR 3+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 7 CR 0 TR 4 CH 4 NT E- 1/2 voff=0 +BA 7 CR 0 TR 4 CH 4 NT G- 1/2 voff=0 +BA 7 CR 0 TR 4 CH 4 NT B- 1/2 voff=0 +BA 7 CR 1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 7 CR 1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 7 CR 1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 7 CR 1 TR 4 CH 4 NT E- 1/2 voff=0 +BA 7 CR 1 TR 4 CH 4 NT G- 1/2 voff=0 +BA 7 CR 1 TR 4 CH 4 NT B- 1/2 voff=0 +BA 7 CR 1+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 7 CR 1+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 7 CR 1+1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 7 CR 2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 7 CR 2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 7 CR 2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 7 CR 2+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 7 CR 2+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 7 CR 2+1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 7 CR 3 TR 4 CH 4 NT E- 1/2 voff=0 +BA 7 CR 3 TR 4 CH 4 NT G- 1/2 voff=0 +BA 7 CR 3 TR 4 CH 4 NT B- 1/2 voff=0 +BA 7 CR 3+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 7 CR 3+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 7 CR 3+1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 8 CR 0 TR 4 CH 4 NT A- 1/2 voff=0 +BA 8 CR 0 TR 4 CH 4 NT C- 1/2 voff=0 +BA 8 CR 0 TR 4 CH 4 NT E- 1/2 voff=0 +BA 8 CR 1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 8 CR 1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 8 CR 1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 8 CR 1 TR 4 CH 4 NT D- 1/2 voff=0 +BA 8 CR 1 TR 4 CH 4 NT F- 1/2 voff=0 +BA 8 CR 1 TR 4 CH 4 NT A- 1/2 voff=0 +BA 8 CR 1+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 8 CR 1+1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 8 CR 1+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 8 CR 2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 8 CR 2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 8 CR 2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 8 CR 2+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 8 CR 2+1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 8 CR 2+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 8 CR 3 TR 4 CH 4 NT D- 1/2 voff=0 +BA 8 CR 3 TR 4 CH 4 NT F- 1/2 voff=0 +BA 8 CR 3 TR 4 CH 4 NT A- 1/2 voff=0 +BA 8 CR 3+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 8 CR 3+1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 8 CR 3+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 13 CR 0 TR 4 CH 4 NT E- 1/2 
voff=0 +BA 13 CR 0 TR 4 CH 4 NT G- 1/2 voff=0 +BA 13 CR 0 TR 4 CH 4 NT B- 1/2 voff=0 +BA 13 CR 1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 13 CR 1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 13 CR 1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 13 CR 1 TR 4 CH 4 NT E- 1/2 voff=0 +BA 13 CR 1 TR 4 CH 4 NT G- 1/2 voff=0 +BA 13 CR 1 TR 4 CH 4 NT B- 1/2 voff=0 +BA 13 CR 1+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 13 CR 1+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 13 CR 1+1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 13 CR 2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 13 CR 2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 13 CR 2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 13 CR 2+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 13 CR 2+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 13 CR 2+1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 13 CR 3 TR 4 CH 4 NT E- 1/2 voff=0 +BA 13 CR 3 TR 4 CH 4 NT G- 1/2 voff=0 +BA 13 CR 3 TR 4 CH 4 NT B- 1/2 voff=0 +BA 13 CR 3+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 13 CR 3+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 13 CR 3+1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 14 CR 0 TR 4 CH 4 NT A- 1/2 voff=0 +BA 14 CR 0 TR 4 CH 4 NT C- 1/2 voff=0 +BA 14 CR 0 TR 4 CH 4 NT E- 1/2 voff=0 +BA 14 CR 1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 14 CR 1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 14 CR 1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 14 CR 1 TR 4 CH 4 NT D- 1/2 voff=0 +BA 14 CR 1 TR 4 CH 4 NT F- 1/2 voff=0 +BA 14 CR 1 TR 4 CH 4 NT A- 1/2 voff=0 +BA 14 CR 1+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 14 CR 1+1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 14 CR 1+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 14 CR 2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 14 CR 2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 14 CR 2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 14 CR 2+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 14 CR 2+1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 14 CR 2+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 14 CR 3 TR 4 CH 4 NT D- 1/2 voff=0 +BA 14 CR 3 TR 4 CH 4 NT F- 1/2 voff=0 +BA 14 CR 3 TR 4 CH 4 NT A- 1/2 voff=0 +BA 14 CR 3+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 14 CR 3+1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 14 CR 3+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 15 CR 0 TR 4 CH 4 NT E- 1/2 voff=0 +BA 15 CR 0 TR 4 CH 4 NT G- 1/2 
voff=0 +BA 15 CR 0 TR 4 CH 4 NT B- 1/2 voff=0 +BA 15 CR 1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 15 CR 1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 15 CR 1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 15 CR 1 TR 4 CH 4 NT E- 1/2 voff=0 +BA 15 CR 1 TR 4 CH 4 NT G- 1/2 voff=0 +BA 15 CR 1 TR 4 CH 4 NT B- 1/2 voff=0 +BA 15 CR 1+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 15 CR 1+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 15 CR 1+1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 15 CR 2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 15 CR 2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 15 CR 2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 15 CR 2+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 15 CR 2+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 15 CR 2+1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 15 CR 3 TR 4 CH 4 NT E- 1/2 voff=0 +BA 15 CR 3 TR 4 CH 4 NT G- 1/2 voff=0 +BA 15 CR 3 TR 4 CH 4 NT B- 1/2 voff=0 +BA 15 CR 3+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 15 CR 3+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 15 CR 3+1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 16 CR 0 TR 4 CH 4 NT A- 1/2 voff=0 +BA 16 CR 0 TR 4 CH 4 NT C- 1/2 voff=0 +BA 16 CR 0 TR 4 CH 4 NT E- 1/2 voff=0 +BA 16 CR 1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 16 CR 1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 16 CR 1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 16 CR 1 TR 4 CH 4 NT D- 1/2 voff=0 +BA 16 CR 1 TR 4 CH 4 NT F- 1/2 voff=0 +BA 16 CR 1 TR 4 CH 4 NT A- 1/2 voff=0 +BA 16 CR 1+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 16 CR 1+1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 16 CR 1+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 16 CR 2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 16 CR 2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 16 CR 2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 16 CR 2+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 16 CR 2+1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 16 CR 2+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 16 CR 3 TR 4 CH 4 NT D- 1/2 voff=0 +BA 16 CR 3 TR 4 CH 4 NT F- 1/2 voff=0 +BA 16 CR 3 TR 4 CH 4 NT A- 1/2 voff=0 +BA 16 CR 3+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 16 CR 3+1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 16 CR 3+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 21 CR 0 TR 4 CH 4 NT D- 1/2 voff=0 +BA 21 CR 0 TR 4 CH 4 NT F- 1/2 voff=0 +BA 21 CR 0 TR 4 CH 4 NT A- 1/2 
voff=0 +BA 21 CR 1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 21 CR 1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 21 CR 1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 21 CR 1 TR 4 CH 4 NT D- 1/2 voff=0 +BA 21 CR 1 TR 4 CH 4 NT F- 1/2 voff=0 +BA 21 CR 1 TR 4 CH 4 NT A- 1/2 voff=0 +BA 21 CR 1+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 21 CR 1+1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 21 CR 1+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 21 CR 2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 21 CR 2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 21 CR 2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 21 CR 2+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 21 CR 2+1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 21 CR 2+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 21 CR 3 TR 4 CH 4 NT D- 1/2 voff=0 +BA 21 CR 3 TR 4 CH 4 NT F- 1/2 voff=0 +BA 21 CR 3 TR 4 CH 4 NT A- 1/2 voff=0 +BA 21 CR 3+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 21 CR 3+1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 21 CR 3+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 22 CR 0 TR 4 CH 4 NT G- 1/2 voff=0 +BA 22 CR 0 TR 4 CH 4 NT B- 1/2 voff=0 +BA 22 CR 0 TR 4 CH 4 NT D- 1/2 voff=0 +BA 22 CR 1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 22 CR 1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 22 CR 1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 22 CR 1 TR 4 CH 4 NT C- 1/2 voff=0 +BA 22 CR 1 TR 4 CH 4 NT E- 1/2 voff=0 +BA 22 CR 1 TR 4 CH 4 NT G- 1/2 voff=0 +BA 22 CR 1+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 22 CR 1+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 22 CR 1+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 22 CR 2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 22 CR 2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 22 CR 2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 22 CR 2+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 22 CR 2+1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 22 CR 2+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 22 CR 3 TR 4 CH 4 NT D- 1/2 voff=0 +BA 22 CR 3 TR 4 CH 4 NT F- 1/2 voff=0 +BA 22 CR 3 TR 4 CH 4 NT A- 1/2 voff=0 +BA 22 CR 3+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 22 CR 3+1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 22 CR 3+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 23 CR 0 TR 4 CH 4 NT D- 1/2 voff=0 +BA 23 CR 0 TR 4 CH 4 NT F- 1/2 voff=0 +BA 23 CR 0 TR 4 CH 4 NT A- 1/2 voff=0 +BA 23 CR 1/2 TR 4 CH 4 NT D- 1/2 
voff=0 +BA 23 CR 1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 23 CR 1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 23 CR 1 TR 4 CH 4 NT D- 1/2 voff=0 +BA 23 CR 1 TR 4 CH 4 NT F- 1/2 voff=0 +BA 23 CR 1 TR 4 CH 4 NT A- 1/2 voff=0 +BA 23 CR 1+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 23 CR 1+1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 23 CR 1+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 23 CR 2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 23 CR 2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 23 CR 2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 23 CR 2+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 23 CR 2+1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 23 CR 2+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 23 CR 3 TR 4 CH 4 NT D- 1/2 voff=0 +BA 23 CR 3 TR 4 CH 4 NT F- 1/2 voff=0 +BA 23 CR 3 TR 4 CH 4 NT A- 1/2 voff=0 +BA 23 CR 3+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 23 CR 3+1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 23 CR 3+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 24 CR 0 TR 4 CH 4 NT G- 1/2 voff=0 +BA 24 CR 0 TR 4 CH 4 NT B- 1/2 voff=0 +BA 24 CR 0 TR 4 CH 4 NT D- 1/2 voff=0 +BA 24 CR 1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 24 CR 1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 24 CR 1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 24 CR 1 TR 4 CH 4 NT C- 1/2 voff=0 +BA 24 CR 1 TR 4 CH 4 NT E- 1/2 voff=0 +BA 24 CR 1 TR 4 CH 4 NT G- 1/2 voff=0 +BA 24 CR 1+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 24 CR 1+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 24 CR 1+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 24 CR 2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 24 CR 2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 24 CR 2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 24 CR 2+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 24 CR 2+1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 24 CR 2+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 24 CR 3 TR 4 CH 4 NT D- 1/2 voff=0 +BA 24 CR 3 TR 4 CH 4 NT F- 1/2 voff=0 +BA 24 CR 3 TR 4 CH 4 NT A- 1/2 voff=0 +BA 24 CR 3+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 24 CR 3+1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 24 CR 3+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 25 CR 0 TR 4 CH 4 NT D- 1/2 voff=0 +BA 25 CR 0 TR 4 CH 4 NT F- 1/2 voff=0 +BA 25 CR 0 TR 4 CH 4 NT A- 1/2 voff=0 +BA 25 CR 1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 25 CR 1/2 TR 4 CH 4 NT F- 1/2 
voff=0 +BA 25 CR 1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 25 CR 1 TR 4 CH 4 NT F- 1/2 voff=0 +BA 25 CR 1 TR 4 CH 4 NT A- 1/2 voff=0 +BA 25 CR 1 TR 4 CH 4 NT C- 1/2 voff=0 +BA 25 CR 1+1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 25 CR 1+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 25 CR 1+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 25 CR 2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 25 CR 2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 25 CR 2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 25 CR 2+1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 25 CR 2+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 25 CR 2+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 25 CR 3 TR 4 CH 4 NT F- 1/2 voff=0 +BA 25 CR 3 TR 4 CH 4 NT A- 1/2 voff=0 +BA 25 CR 3 TR 4 CH 4 NT C- 1/2 voff=0 +BA 25 CR 3+1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 25 CR 3+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 25 CR 3+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 26 CR 0 TR 4 CH 4 NT A#- 1/2 voff=0 +BA 26 CR 0 TR 4 CH 4 NT D- 1/2 voff=0 +BA 26 CR 0 TR 4 CH 4 NT F- 1/2 voff=0 +BA 26 CR 1/2 TR 4 CH 4 NT A#- 1/2 voff=0 +BA 26 CR 1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 26 CR 1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 26 CR 1 TR 4 CH 4 NT C#- 1/2 voff=0 +BA 26 CR 1 TR 4 CH 4 NT F- 1/2 voff=0 +BA 26 CR 1 TR 4 CH 4 NT G- 1/2 voff=0 +BA 26 CR 1+1/2 TR 4 CH 4 NT C#- 1/2 voff=0 +BA 26 CR 1+1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 26 CR 1+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 26 CR 2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 26 CR 2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 26 CR 2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 26 CR 2+1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 26 CR 2+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 26 CR 2+1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 26 CR 3 TR 4 CH 4 NT B- 1/2 voff=0 +BA 26 CR 3 TR 4 CH 4 NT D- 1/2 voff=0 +BA 26 CR 3 TR 4 CH 4 NT F- 1/2 voff=0 +BA 26 CR 3+1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 26 CR 3+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 26 CR 3+1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 27 CR 0 TR 4 CH 4 NT D- 1/2 voff=0 +BA 27 CR 0 TR 4 CH 4 NT F- 1/2 voff=0 +BA 27 CR 0 TR 4 CH 4 NT A- 1/2 voff=0 +BA 27 CR 1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 27 CR 1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 27 CR 1/2 TR 4 CH 4 NT A- 
1/2 voff=0 +BA 27 CR 1 TR 4 CH 4 NT F- 1/2 voff=0 +BA 27 CR 1 TR 4 CH 4 NT A- 1/2 voff=0 +BA 27 CR 1 TR 4 CH 4 NT C- 1/2 voff=0 +BA 27 CR 1+1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 27 CR 1+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 27 CR 1+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 27 CR 2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 27 CR 2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 27 CR 2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 27 CR 2+1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 27 CR 2+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 27 CR 2+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 27 CR 3 TR 4 CH 4 NT F- 1/2 voff=0 +BA 27 CR 3 TR 4 CH 4 NT A- 1/2 voff=0 +BA 27 CR 3 TR 4 CH 4 NT C- 1/2 voff=0 +BA 27 CR 3+1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 27 CR 3+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 27 CR 3+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 28 CR 0 TR 4 CH 4 NT A#- 1/2 voff=0 +BA 28 CR 0 TR 4 CH 4 NT D- 1/2 voff=0 +BA 28 CR 0 TR 4 CH 4 NT F- 1/2 voff=0 +BA 28 CR 1/2 TR 4 CH 4 NT A#- 1/2 voff=0 +BA 28 CR 1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 28 CR 1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 28 CR 1 TR 4 CH 4 NT C#- 1/2 voff=0 +BA 28 CR 1 TR 4 CH 4 NT F- 1/2 voff=0 +BA 28 CR 1 TR 4 CH 4 NT G- 1/2 voff=0 +BA 28 CR 1+1/2 TR 4 CH 4 NT C#- 1/2 voff=0 +BA 28 CR 1+1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 28 CR 1+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 28 CR 2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 28 CR 2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 28 CR 2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 28 CR 2+1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 28 CR 2+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 28 CR 2+1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 28 CR 3 TR 4 CH 4 NT B- 1/2 voff=0 +BA 28 CR 3 TR 4 CH 4 NT D- 1/2 voff=0 +BA 28 CR 3 TR 4 CH 4 NT F- 1/2 voff=0 +BA 28 CR 3+1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 28 CR 3+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 28 CR 3+1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 29 CR 0 TR 4 CH 4 NT D- 1/2 voff=0 +BA 29 CR 0 TR 4 CH 4 NT F- 1/2 voff=0 +BA 29 CR 0 TR 4 CH 4 NT A- 1/2 voff=0 +BA 29 CR 1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 29 CR 1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 29 CR 1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 29 CR 1 TR 4 CH 4 NT F- 
1/2 voff=0 +BA 29 CR 1 TR 4 CH 4 NT A- 1/2 voff=0 +BA 29 CR 1 TR 4 CH 4 NT C- 1/2 voff=0 +BA 29 CR 1+1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 29 CR 1+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 29 CR 1+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 29 CR 2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 29 CR 2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 29 CR 2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 29 CR 2+1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 29 CR 2+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 29 CR 2+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 29 CR 3 TR 4 CH 4 NT F- 1/2 voff=0 +BA 29 CR 3 TR 4 CH 4 NT A- 1/2 voff=0 +BA 29 CR 3 TR 4 CH 4 NT C- 1/2 voff=0 +BA 29 CR 3+1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 29 CR 3+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 29 CR 3+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 30 CR 0 TR 4 CH 4 NT A#- 1/2 voff=0 +BA 30 CR 0 TR 4 CH 4 NT D- 1/2 voff=0 +BA 30 CR 0 TR 4 CH 4 NT F- 1/2 voff=0 +BA 30 CR 1/2 TR 4 CH 4 NT A#- 1/2 voff=0 +BA 30 CR 1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 30 CR 1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 30 CR 1 TR 4 CH 4 NT C#- 1/2 voff=0 +BA 30 CR 1 TR 4 CH 4 NT F- 1/2 voff=0 +BA 30 CR 1 TR 4 CH 4 NT G- 1/2 voff=0 +BA 30 CR 1+1/2 TR 4 CH 4 NT C#- 1/2 voff=0 +BA 30 CR 1+1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 30 CR 1+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 30 CR 2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 30 CR 2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 30 CR 2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 30 CR 2+1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 30 CR 2+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 30 CR 2+1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 30 CR 3 TR 4 CH 4 NT B- 1/2 voff=0 +BA 30 CR 3 TR 4 CH 4 NT D- 1/2 voff=0 +BA 30 CR 3 TR 4 CH 4 NT F- 1/2 voff=0 +BA 30 CR 3+1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 30 CR 3+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 30 CR 3+1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 31 CR 0 TR 4 CH 4 NT D- 1/2 voff=0 +BA 31 CR 0 TR 4 CH 4 NT F- 1/2 voff=0 +BA 31 CR 0 TR 4 CH 4 NT A- 1/2 voff=0 +BA 31 CR 1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 31 CR 1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 31 CR 1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 31 CR 1 TR 4 CH 4 NT F- 1/2 voff=0 +BA 31 CR 1 TR 4 CH 4 NT A- 
1/2 voff=0 +BA 31 CR 1 TR 4 CH 4 NT C- 1/2 voff=0 +BA 31 CR 1+1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 31 CR 1+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 31 CR 1+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 31 CR 2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 31 CR 2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 31 CR 2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 31 CR 2+1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 31 CR 2+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 31 CR 2+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 31 CR 3 TR 4 CH 4 NT F- 1/2 voff=0 +BA 31 CR 3 TR 4 CH 4 NT A- 1/2 voff=0 +BA 31 CR 3 TR 4 CH 4 NT C- 1/2 voff=0 +BA 31 CR 3+1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 31 CR 3+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 31 CR 3+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 32 CR 0 TR 4 CH 4 NT A#- 1/2 voff=0 +BA 32 CR 0 TR 4 CH 4 NT D- 1/2 voff=0 +BA 32 CR 0 TR 4 CH 4 NT F- 1/2 voff=0 +BA 32 CR 1/2 TR 4 CH 4 NT A#- 1/2 voff=0 +BA 32 CR 1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 32 CR 1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 32 CR 1 TR 4 CH 4 NT C#- 1/2 voff=0 +BA 32 CR 1 TR 4 CH 4 NT F- 1/2 voff=0 +BA 32 CR 1 TR 4 CH 4 NT G- 1/2 voff=0 +BA 32 CR 1+1/2 TR 4 CH 4 NT C#- 1/2 voff=0 +BA 32 CR 1+1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 32 CR 1+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 32 CR 2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 32 CR 2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 32 CR 2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 32 CR 2+1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 32 CR 2+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 32 CR 2+1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 32 CR 3 TR 4 CH 4 NT B- 1/2 voff=0 +BA 32 CR 3 TR 4 CH 4 NT D- 1/2 voff=0 +BA 32 CR 3 TR 4 CH 4 NT F- 1/2 voff=0 +BA 32 CR 3+1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 32 CR 3+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 32 CR 3+1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 33 CR 0 TR 4 CH 4 NT D- 1/2 voff=0 +BA 33 CR 0 TR 4 CH 4 NT F- 1/2 voff=0 +BA 33 CR 0 TR 4 CH 4 NT A- 1/2 voff=0 +BA 33 CR 1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 33 CR 1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 33 CR 1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 33 CR 1 TR 4 CH 4 NT F- 1/2 voff=0 +BA 33 CR 1 TR 4 CH 4 NT A- 1/2 voff=0 +BA 33 CR 1 TR 4 CH 4 NT C- 
1/2 voff=0 +BA 33 CR 1+1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 33 CR 1+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 33 CR 1+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 33 CR 2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 33 CR 2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 33 CR 2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 33 CR 2+1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 33 CR 2+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 33 CR 2+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 33 CR 3 TR 4 CH 4 NT F- 1/2 voff=0 +BA 33 CR 3 TR 4 CH 4 NT A- 1/2 voff=0 +BA 33 CR 3 TR 4 CH 4 NT C- 1/2 voff=0 +BA 33 CR 3+1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 33 CR 3+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 33 CR 3+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 34 CR 0 TR 4 CH 4 NT A#- 1/2 voff=0 +BA 34 CR 0 TR 4 CH 4 NT D- 1/2 voff=0 +BA 34 CR 0 TR 4 CH 4 NT F- 1/2 voff=0 +BA 34 CR 1/2 TR 4 CH 4 NT A#- 1/2 voff=0 +BA 34 CR 1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 34 CR 1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 34 CR 1 TR 4 CH 4 NT C#- 1/2 voff=0 +BA 34 CR 1 TR 4 CH 4 NT F- 1/2 voff=0 +BA 34 CR 1 TR 4 CH 4 NT G- 1/2 voff=0 +BA 34 CR 1+1/2 TR 4 CH 4 NT C#- 1/2 voff=0 +BA 34 CR 1+1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 34 CR 1+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 34 CR 2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 34 CR 2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 34 CR 2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 34 CR 2+1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 34 CR 2+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 34 CR 2+1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 34 CR 3 TR 4 CH 4 NT B- 1/2 voff=0 +BA 34 CR 3 TR 4 CH 4 NT D- 1/2 voff=0 +BA 34 CR 3 TR 4 CH 4 NT F- 1/2 voff=0 +BA 34 CR 3+1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 34 CR 3+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 34 CR 3+1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 35 CR 0 TR 4 CH 4 NT D- 1/2 voff=0 +BA 35 CR 0 TR 4 CH 4 NT F- 1/2 voff=0 +BA 35 CR 0 TR 4 CH 4 NT A- 1/2 voff=0 +BA 35 CR 1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 35 CR 1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 35 CR 1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 35 CR 1 TR 4 CH 4 NT F- 1/2 voff=0 +BA 35 CR 1 TR 4 CH 4 NT A- 1/2 voff=0 +BA 35 CR 1 TR 4 CH 4 NT C- 1/2 voff=0 +BA 35 CR 1+1/2 TR 4 CH 4 NT 
F- 1/2 voff=0 +BA 35 CR 1+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 35 CR 1+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 35 CR 2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 35 CR 2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 35 CR 2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 35 CR 2+1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 35 CR 2+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 35 CR 2+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 35 CR 3 TR 4 CH 4 NT F- 1/2 voff=0 +BA 35 CR 3 TR 4 CH 4 NT A- 1/2 voff=0 +BA 35 CR 3 TR 4 CH 4 NT C- 1/2 voff=0 +BA 35 CR 3+1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 35 CR 3+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 35 CR 3+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 36 CR 0 TR 4 CH 4 NT A#- 1/2 voff=0 +BA 36 CR 0 TR 4 CH 4 NT D- 1/2 voff=0 +BA 36 CR 0 TR 4 CH 4 NT F- 1/2 voff=0 +BA 36 CR 1/2 TR 4 CH 4 NT A#- 1/2 voff=0 +BA 36 CR 1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 36 CR 1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 36 CR 1 TR 4 CH 4 NT C#- 1/2 voff=0 +BA 36 CR 1 TR 4 CH 4 NT F- 1/2 voff=0 +BA 36 CR 1 TR 4 CH 4 NT G- 1/2 voff=0 +BA 36 CR 1+1/2 TR 4 CH 4 NT C#- 1/2 voff=0 +BA 36 CR 1+1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 36 CR 1+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 36 CR 2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 36 CR 2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 36 CR 2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 36 CR 2+1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 36 CR 2+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 36 CR 2+1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 36 CR 3 TR 4 CH 4 NT B- 1/2 voff=0 +BA 36 CR 3 TR 4 CH 4 NT D- 1/2 voff=0 +BA 36 CR 3 TR 4 CH 4 NT F- 1/2 voff=0 +BA 36 CR 3+1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 36 CR 3+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 36 CR 3+1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 1 CR 0 TR 5 CH 5 Instrument 95 +BA 1 CR 0 TR 5 CH 5 NT R 0 von=127 voff=0 +BA 1 CR 0 TR 5 CH 5 NT G- 1/2 voff=0 +BA 1 CR 0 TR 5 CH 5 NT B- 1/2 voff=0 +BA 1 CR 0 TR 5 CH 5 NT D- 1/2 voff=0 +BA 1 CR 1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 1 CR 1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 1 CR 1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 1 CR 1 TR 5 CH 5 NT A#- 1/2 voff=0 +BA 1 CR 1 TR 5 CH 5 NT D- 1/2 voff=0 +BA 1 CR 1 TR 5 CH 5 NT F- 1/2 
voff=0 +BA 1 CR 1+1/2 TR 5 CH 5 NT A#- 1/2 voff=0 +BA 1 CR 1+1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 1 CR 1+1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 1 CR 2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 1 CR 2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 1 CR 2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 1 CR 2+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 1 CR 2+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 1 CR 2+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 1 CR 3 TR 5 CH 5 NT A- 1/2 voff=0 +BA 1 CR 3 TR 5 CH 5 NT C- 1/2 voff=0 +BA 1 CR 3 TR 5 CH 5 NT E- 1/2 voff=0 +BA 1 CR 3+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 1 CR 3+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 1 CR 3+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 2 CR 0 TR 5 CH 5 NT D- 1/2 voff=0 +BA 2 CR 0 TR 5 CH 5 NT F- 1/2 voff=0 +BA 2 CR 0 TR 5 CH 5 NT A- 1/2 voff=0 +BA 2 CR 1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 2 CR 1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 2 CR 1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 2 CR 1 TR 5 CH 5 NT E- 1/2 voff=0 +BA 2 CR 1 TR 5 CH 5 NT G- 1/2 voff=0 +BA 2 CR 1 TR 5 CH 5 NT B- 1/2 voff=0 +BA 2 CR 1+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 2 CR 1+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 2 CR 1+1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 2 CR 2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 2 CR 2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 2 CR 2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 2 CR 2+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 2 CR 2+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 2 CR 2+1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 2 CR 3 TR 5 CH 5 NT E- 1/2 voff=0 +BA 2 CR 3 TR 5 CH 5 NT G- 1/2 voff=0 +BA 2 CR 3 TR 5 CH 5 NT B- 1/2 voff=0 +BA 2 CR 3+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 2 CR 3+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 2 CR 3+1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 3 CR 0 TR 5 CH 5 NT G- 1/2 voff=0 +BA 3 CR 0 TR 5 CH 5 NT B- 1/2 voff=0 +BA 3 CR 0 TR 5 CH 5 NT D- 1/2 voff=0 +BA 3 CR 1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 3 CR 1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 3 CR 1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 3 CR 1 TR 5 CH 5 NT A#- 1/2 voff=0 +BA 3 CR 1 TR 5 CH 5 NT D- 1/2 voff=0 +BA 3 CR 1 TR 5 CH 5 NT F- 1/2 voff=0 +BA 3 CR 1+1/2 TR 5 CH 5 NT A#- 1/2 voff=0 +BA 3 CR 1+1/2 TR 5 CH 5 NT D- 1/2 
voff=0 +BA 3 CR 1+1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 3 CR 2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 3 CR 2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 3 CR 2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 3 CR 2+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 3 CR 2+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 3 CR 2+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 3 CR 3 TR 5 CH 5 NT A- 1/2 voff=0 +BA 3 CR 3 TR 5 CH 5 NT C- 1/2 voff=0 +BA 3 CR 3 TR 5 CH 5 NT E- 1/2 voff=0 +BA 3 CR 3+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 3 CR 3+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 3 CR 3+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 4 CR 0 TR 5 CH 5 NT D- 1/2 voff=0 +BA 4 CR 0 TR 5 CH 5 NT F- 1/2 voff=0 +BA 4 CR 0 TR 5 CH 5 NT A- 1/2 voff=0 +BA 4 CR 1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 4 CR 1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 4 CR 1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 4 CR 1 TR 5 CH 5 NT E- 1/2 voff=0 +BA 4 CR 1 TR 5 CH 5 NT G- 1/2 voff=0 +BA 4 CR 1 TR 5 CH 5 NT B- 1/2 voff=0 +BA 4 CR 1+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 4 CR 1+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 4 CR 1+1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 4 CR 2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 4 CR 2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 4 CR 2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 4 CR 2+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 4 CR 2+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 4 CR 2+1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 4 CR 3 TR 5 CH 5 NT E- 1/2 voff=0 +BA 4 CR 3 TR 5 CH 5 NT G- 1/2 voff=0 +BA 4 CR 3 TR 5 CH 5 NT B- 1/2 voff=0 +BA 4 CR 3+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 4 CR 3+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 4 CR 3+1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 9 CR 0 TR 5 CH 5 NT D- 1/2 voff=0 +BA 9 CR 0 TR 5 CH 5 NT F- 1/2 voff=0 +BA 9 CR 0 TR 5 CH 5 NT A- 1/2 voff=0 +BA 9 CR 1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 9 CR 1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 9 CR 1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 9 CR 1 TR 5 CH 5 NT A- 1/2 voff=0 +BA 9 CR 1 TR 5 CH 5 NT C- 1/2 voff=0 +BA 9 CR 1 TR 5 CH 5 NT E- 1/2 voff=0 +BA 9 CR 1+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 9 CR 1+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 9 CR 1+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 9 CR 2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 
9 CR 2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 9 CR 2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 9 CR 2+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 9 CR 2+1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 9 CR 2+1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 9 CR 3 TR 5 CH 5 NT G- 1/2 voff=0 +BA 9 CR 3 TR 5 CH 5 NT B- 1/2 voff=0 +BA 9 CR 3 TR 5 CH 5 NT D- 1/2 voff=0 +BA 9 CR 3+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 9 CR 3+1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 9 CR 3+1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 10 CR 0 TR 5 CH 5 NT C- 1/2 voff=0 +BA 10 CR 0 TR 5 CH 5 NT E- 1/2 voff=0 +BA 10 CR 0 TR 5 CH 5 NT G- 1/2 voff=0 +BA 10 CR 1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 10 CR 1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 10 CR 1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 10 CR 1 TR 5 CH 5 NT F- 1/2 voff=0 +BA 10 CR 1 TR 5 CH 5 NT A- 1/2 voff=0 +BA 10 CR 1 TR 5 CH 5 NT C- 1/2 voff=0 +BA 10 CR 1+1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 10 CR 1+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 10 CR 1+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 10 CR 2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 10 CR 2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 10 CR 2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 10 CR 2+1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 10 CR 2+1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 10 CR 2+1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 10 CR 3 TR 5 CH 5 NT B- 1/2 voff=0 +BA 10 CR 3 TR 5 CH 5 NT D- 1/2 voff=0 +BA 10 CR 3 TR 5 CH 5 NT F- 1/2 voff=0 +BA 10 CR 3+1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 10 CR 3+1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 10 CR 3+1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 11 CR 0 TR 5 CH 5 NT D- 1/2 voff=0 +BA 11 CR 0 TR 5 CH 5 NT F- 1/2 voff=0 +BA 11 CR 0 TR 5 CH 5 NT A- 1/2 voff=0 +BA 11 CR 1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 11 CR 1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 11 CR 1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 11 CR 1 TR 5 CH 5 NT A- 1/2 voff=0 +BA 11 CR 1 TR 5 CH 5 NT C- 1/2 voff=0 +BA 11 CR 1 TR 5 CH 5 NT E- 1/2 voff=0 +BA 11 CR 1+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 11 CR 1+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 11 CR 1+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 11 CR 2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 11 CR 2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 11 CR 2 TR 5 CH 
5 NT D- 1/2 voff=0 +BA 11 CR 2+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 11 CR 2+1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 11 CR 2+1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 11 CR 3 TR 5 CH 5 NT G- 1/2 voff=0 +BA 11 CR 3 TR 5 CH 5 NT B- 1/2 voff=0 +BA 11 CR 3 TR 5 CH 5 NT D- 1/2 voff=0 +BA 11 CR 3+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 11 CR 3+1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 11 CR 3+1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 12 CR 0 TR 5 CH 5 NT C- 1/2 voff=0 +BA 12 CR 0 TR 5 CH 5 NT E- 1/2 voff=0 +BA 12 CR 0 TR 5 CH 5 NT G- 1/2 voff=0 +BA 12 CR 1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 12 CR 1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 12 CR 1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 12 CR 1 TR 5 CH 5 NT F- 1/2 voff=0 +BA 12 CR 1 TR 5 CH 5 NT A- 1/2 voff=0 +BA 12 CR 1 TR 5 CH 5 NT C- 1/2 voff=0 +BA 12 CR 1+1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 12 CR 1+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 12 CR 1+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 12 CR 2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 12 CR 2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 12 CR 2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 12 CR 2+1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 12 CR 2+1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 12 CR 2+1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 12 CR 3 TR 5 CH 5 NT B- 1/2 voff=0 +BA 12 CR 3 TR 5 CH 5 NT D- 1/2 voff=0 +BA 12 CR 3 TR 5 CH 5 NT F- 1/2 voff=0 +BA 12 CR 3+1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 12 CR 3+1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 12 CR 3+1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 25 CR 0 TR 5 CH 5 NT D- 1/2 voff=0 +BA 25 CR 0 TR 5 CH 5 NT F- 1/2 voff=0 +BA 25 CR 0 TR 5 CH 5 NT A- 1/2 voff=0 +BA 25 CR 1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 25 CR 1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 25 CR 1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 25 CR 1 TR 5 CH 5 NT F- 1/2 voff=0 +BA 25 CR 1 TR 5 CH 5 NT A- 1/2 voff=0 +BA 25 CR 1 TR 5 CH 5 NT C- 1/2 voff=0 +BA 25 CR 1+1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 25 CR 1+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 25 CR 1+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 25 CR 2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 25 CR 2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 25 CR 2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 25 CR 2+1/2 TR 5 CH 
5 NT F- 1/2 voff=0 +BA 25 CR 2+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 25 CR 2+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 25 CR 3 TR 5 CH 5 NT F- 1/2 voff=0 +BA 25 CR 3 TR 5 CH 5 NT A- 1/2 voff=0 +BA 25 CR 3 TR 5 CH 5 NT C- 1/2 voff=0 +BA 25 CR 3+1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 25 CR 3+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 25 CR 3+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 26 CR 0 TR 5 CH 5 NT A#- 1/2 voff=0 +BA 26 CR 0 TR 5 CH 5 NT D- 1/2 voff=0 +BA 26 CR 0 TR 5 CH 5 NT F- 1/2 voff=0 +BA 26 CR 1/2 TR 5 CH 5 NT A#- 1/2 voff=0 +BA 26 CR 1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 26 CR 1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 26 CR 1 TR 5 CH 5 NT C#- 1/2 voff=0 +BA 26 CR 1 TR 5 CH 5 NT F- 1/2 voff=0 +BA 26 CR 1 TR 5 CH 5 NT G- 1/2 voff=0 +BA 26 CR 1+1/2 TR 5 CH 5 NT C#- 1/2 voff=0 +BA 26 CR 1+1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 26 CR 1+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 26 CR 2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 26 CR 2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 26 CR 2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 26 CR 2+1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 26 CR 2+1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 26 CR 2+1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 26 CR 3 TR 5 CH 5 NT B- 1/2 voff=0 +BA 26 CR 3 TR 5 CH 5 NT D- 1/2 voff=0 +BA 26 CR 3 TR 5 CH 5 NT F- 1/2 voff=0 +BA 26 CR 3+1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 26 CR 3+1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 26 CR 3+1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 27 CR 0 TR 5 CH 5 NT D- 1/2 voff=0 +BA 27 CR 0 TR 5 CH 5 NT F- 1/2 voff=0 +BA 27 CR 0 TR 5 CH 5 NT A- 1/2 voff=0 +BA 27 CR 1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 27 CR 1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 27 CR 1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 27 CR 1 TR 5 CH 5 NT F- 1/2 voff=0 +BA 27 CR 1 TR 5 CH 5 NT A- 1/2 voff=0 +BA 27 CR 1 TR 5 CH 5 NT C- 1/2 voff=0 +BA 27 CR 1+1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 27 CR 1+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 27 CR 1+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 27 CR 2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 27 CR 2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 27 CR 2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 27 CR 2+1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 27 CR 2+1/2 TR 5 
CH 5 NT A- 1/2 voff=0 +BA 27 CR 2+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 27 CR 3 TR 5 CH 5 NT F- 1/2 voff=0 +BA 27 CR 3 TR 5 CH 5 NT A- 1/2 voff=0 +BA 27 CR 3 TR 5 CH 5 NT C- 1/2 voff=0 +BA 27 CR 3+1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 27 CR 3+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 27 CR 3+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 28 CR 0 TR 5 CH 5 NT A#- 1/2 voff=0 +BA 28 CR 0 TR 5 CH 5 NT D- 1/2 voff=0 +BA 28 CR 0 TR 5 CH 5 NT F- 1/2 voff=0 +BA 28 CR 1/2 TR 5 CH 5 NT A#- 1/2 voff=0 +BA 28 CR 1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 28 CR 1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 28 CR 1 TR 5 CH 5 NT C#- 1/2 voff=0 +BA 28 CR 1 TR 5 CH 5 NT F- 1/2 voff=0 +BA 28 CR 1 TR 5 CH 5 NT G- 1/2 voff=0 +BA 28 CR 1+1/2 TR 5 CH 5 NT C#- 1/2 voff=0 +BA 28 CR 1+1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 28 CR 1+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 28 CR 2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 28 CR 2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 28 CR 2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 28 CR 2+1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 28 CR 2+1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 28 CR 2+1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 28 CR 3 TR 5 CH 5 NT B- 1/2 voff=0 +BA 28 CR 3 TR 5 CH 5 NT D- 1/2 voff=0 +BA 28 CR 3 TR 5 CH 5 NT F- 1/2 voff=0 +BA 28 CR 3+1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 28 CR 3+1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 28 CR 3+1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 29 CR 0 TR 5 CH 5 NT D- 1/2 voff=0 +BA 29 CR 0 TR 5 CH 5 NT F- 1/2 voff=0 +BA 29 CR 0 TR 5 CH 5 NT A- 1/2 voff=0 +BA 29 CR 1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 29 CR 1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 29 CR 1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 29 CR 1 TR 5 CH 5 NT F- 1/2 voff=0 +BA 29 CR 1 TR 5 CH 5 NT A- 1/2 voff=0 +BA 29 CR 1 TR 5 CH 5 NT C- 1/2 voff=0 +BA 29 CR 1+1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 29 CR 1+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 29 CR 1+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 29 CR 2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 29 CR 2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 29 CR 2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 29 CR 2+1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 29 CR 2+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 29 CR 2+1/2 
TR 5 CH 5 NT C- 1/2 voff=0 +BA 29 CR 3 TR 5 CH 5 NT F- 1/2 voff=0 +BA 29 CR 3 TR 5 CH 5 NT A- 1/2 voff=0 +BA 29 CR 3 TR 5 CH 5 NT C- 1/2 voff=0 +BA 29 CR 3+1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 29 CR 3+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 29 CR 3+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 30 CR 0 TR 5 CH 5 NT A#- 1/2 voff=0 +BA 30 CR 0 TR 5 CH 5 NT D- 1/2 voff=0 +BA 30 CR 0 TR 5 CH 5 NT F- 1/2 voff=0 +BA 30 CR 1/2 TR 5 CH 5 NT A#- 1/2 voff=0 +BA 30 CR 1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 30 CR 1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 30 CR 1 TR 5 CH 5 NT C#- 1/2 voff=0 +BA 30 CR 1 TR 5 CH 5 NT F- 1/2 voff=0 +BA 30 CR 1 TR 5 CH 5 NT G- 1/2 voff=0 +BA 30 CR 1+1/2 TR 5 CH 5 NT C#- 1/2 voff=0 +BA 30 CR 1+1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 30 CR 1+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 30 CR 2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 30 CR 2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 30 CR 2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 30 CR 2+1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 30 CR 2+1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 30 CR 2+1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 30 CR 3 TR 5 CH 5 NT B- 1/2 voff=0 +BA 30 CR 3 TR 5 CH 5 NT D- 1/2 voff=0 +BA 30 CR 3 TR 5 CH 5 NT F- 1/2 voff=0 +BA 30 CR 3+1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 30 CR 3+1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 30 CR 3+1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 31 CR 0 TR 5 CH 5 NT D- 1/2 voff=0 +BA 31 CR 0 TR 5 CH 5 NT F- 1/2 voff=0 +BA 31 CR 0 TR 5 CH 5 NT A- 1/2 voff=0 +BA 31 CR 1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 31 CR 1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 31 CR 1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 31 CR 1 TR 5 CH 5 NT F- 1/2 voff=0 +BA 31 CR 1 TR 5 CH 5 NT A- 1/2 voff=0 +BA 31 CR 1 TR 5 CH 5 NT C- 1/2 voff=0 +BA 31 CR 1+1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 31 CR 1+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 31 CR 1+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 31 CR 2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 31 CR 2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 31 CR 2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 31 CR 2+1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 31 CR 2+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 31 CR 2+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 31 CR 3 
TR 5 CH 5 NT F- 1/2 voff=0 +BA 31 CR 3 TR 5 CH 5 NT A- 1/2 voff=0 +BA 31 CR 3 TR 5 CH 5 NT C- 1/2 voff=0 +BA 31 CR 3+1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 31 CR 3+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 31 CR 3+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 32 CR 0 TR 5 CH 5 NT A#- 1/2 voff=0 +BA 32 CR 0 TR 5 CH 5 NT D- 1/2 voff=0 +BA 32 CR 0 TR 5 CH 5 NT F- 1/2 voff=0 +BA 32 CR 1/2 TR 5 CH 5 NT A#- 1/2 voff=0 +BA 32 CR 1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 32 CR 1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 32 CR 1 TR 5 CH 5 NT C#- 1/2 voff=0 +BA 32 CR 1 TR 5 CH 5 NT F- 1/2 voff=0 +BA 32 CR 1 TR 5 CH 5 NT G- 1/2 voff=0 +BA 32 CR 1+1/2 TR 5 CH 5 NT C#- 1/2 voff=0 +BA 32 CR 1+1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 32 CR 1+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 32 CR 2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 32 CR 2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 32 CR 2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 32 CR 2+1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 32 CR 2+1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 32 CR 2+1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 32 CR 3 TR 5 CH 5 NT B- 1/2 voff=0 +BA 32 CR 3 TR 5 CH 5 NT D- 1/2 voff=0 +BA 32 CR 3 TR 5 CH 5 NT F- 1/2 voff=0 +BA 32 CR 3+1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 32 CR 3+1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 32 CR 3+1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 33 CR 0 TR 5 CH 5 NT D- 1/2 voff=0 +BA 33 CR 0 TR 5 CH 5 NT F- 1/2 voff=0 +BA 33 CR 0 TR 5 CH 5 NT A- 1/2 voff=0 +BA 33 CR 1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 33 CR 1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 33 CR 1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 33 CR 1 TR 5 CH 5 NT F- 1/2 voff=0 +BA 33 CR 1 TR 5 CH 5 NT A- 1/2 voff=0 +BA 33 CR 1 TR 5 CH 5 NT C- 1/2 voff=0 +BA 33 CR 1+1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 33 CR 1+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 33 CR 1+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 33 CR 2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 33 CR 2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 33 CR 2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 33 CR 2+1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 33 CR 2+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 33 CR 2+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 33 CR 3 TR 5 CH 5 NT F- 1/2 voff=0 +BA 33 CR 3 
TR 5 CH 5 NT A- 1/2 voff=0 +BA 33 CR 3 TR 5 CH 5 NT C- 1/2 voff=0 +BA 33 CR 3+1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 33 CR 3+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 33 CR 3+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 34 CR 0 TR 5 CH 5 NT A#- 1/2 voff=0 +BA 34 CR 0 TR 5 CH 5 NT D- 1/2 voff=0 +BA 34 CR 0 TR 5 CH 5 NT F- 1/2 voff=0 +BA 34 CR 1/2 TR 5 CH 5 NT A#- 1/2 voff=0 +BA 34 CR 1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 34 CR 1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 34 CR 1 TR 5 CH 5 NT C#- 1/2 voff=0 +BA 34 CR 1 TR 5 CH 5 NT F- 1/2 voff=0 +BA 34 CR 1 TR 5 CH 5 NT G- 1/2 voff=0 +BA 34 CR 1+1/2 TR 5 CH 5 NT C#- 1/2 voff=0 +BA 34 CR 1+1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 34 CR 1+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 34 CR 2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 34 CR 2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 34 CR 2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 34 CR 2+1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 34 CR 2+1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 34 CR 2+1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 34 CR 3 TR 5 CH 5 NT B- 1/2 voff=0 +BA 34 CR 3 TR 5 CH 5 NT D- 1/2 voff=0 +BA 34 CR 3 TR 5 CH 5 NT F- 1/2 voff=0 +BA 34 CR 3+1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 34 CR 3+1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 34 CR 3+1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 35 CR 0 TR 5 CH 5 NT D- 1/2 voff=0 +BA 35 CR 0 TR 5 CH 5 NT F- 1/2 voff=0 +BA 35 CR 0 TR 5 CH 5 NT A- 1/2 voff=0 +BA 35 CR 1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 35 CR 1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 35 CR 1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 35 CR 1 TR 5 CH 5 NT F- 1/2 voff=0 +BA 35 CR 1 TR 5 CH 5 NT A- 1/2 voff=0 +BA 35 CR 1 TR 5 CH 5 NT C- 1/2 voff=0 +BA 35 CR 1+1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 35 CR 1+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 35 CR 1+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 35 CR 2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 35 CR 2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 35 CR 2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 35 CR 2+1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 35 CR 2+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 35 CR 2+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 35 CR 3 TR 5 CH 5 NT F- 1/2 voff=0 +BA 35 CR 3 TR 5 CH 5 NT A- 1/2 voff=0 +BA 35 CR 3 
TR 5 CH 5 NT C- 1/2 voff=0 +BA 35 CR 3+1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 35 CR 3+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 35 CR 3+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 36 CR 0 TR 5 CH 5 NT A#- 1/2 voff=0 +BA 36 CR 0 TR 5 CH 5 NT D- 1/2 voff=0 +BA 36 CR 0 TR 5 CH 5 NT F- 1/2 voff=0 +BA 36 CR 1/2 TR 5 CH 5 NT A#- 1/2 voff=0 +BA 36 CR 1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 36 CR 1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 36 CR 1 TR 5 CH 5 NT C#- 1/2 voff=0 +BA 36 CR 1 TR 5 CH 5 NT F- 1/2 voff=0 +BA 36 CR 1 TR 5 CH 5 NT G- 1/2 voff=0 +BA 36 CR 1+1/2 TR 5 CH 5 NT C#- 1/2 voff=0 +BA 36 CR 1+1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 36 CR 1+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 36 CR 2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 36 CR 2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 36 CR 2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 36 CR 2+1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 36 CR 2+1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 36 CR 2+1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 36 CR 3 TR 5 CH 5 NT B- 1/2 voff=0 +BA 36 CR 3 TR 5 CH 5 NT D- 1/2 voff=0 +BA 36 CR 3 TR 5 CH 5 NT F- 1/2 voff=0 +BA 36 CR 3+1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 36 CR 3+1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 36 CR 3+1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 1 CR 0 TR 6 CH 6 Instrument 98 +BA 1 CR 0 TR 6 CH 6 NT R 0 von=127 voff=0 +BA 1 CR 0 TR 6 CH 6 NT G- 1/2 voff=0 +BA 1 CR 0 TR 6 CH 6 NT B- 1/2 voff=0 +BA 1 CR 0 TR 6 CH 6 NT D- 1/2 voff=0 +BA 1 CR 1/2 TR 6 CH 6 NT G- 1/2 voff=0 +BA 1 CR 1/2 TR 6 CH 6 NT B- 1/2 voff=0 +BA 1 CR 1/2 TR 6 CH 6 NT D- 1/2 voff=0 +BA 1 CR 1 TR 6 CH 6 NT A#- 1/2 voff=0 +BA 1 CR 1 TR 6 CH 6 NT D- 1/2 voff=0 +BA 1 CR 1 TR 6 CH 6 NT F- 1/2 voff=0 +BA 1 CR 1+1/2 TR 6 CH 6 NT A#- 1/2 voff=0 +BA 1 CR 1+1/2 TR 6 CH 6 NT D- 1/2 voff=0 +BA 1 CR 1+1/2 TR 6 CH 6 NT F- 1/2 voff=0 +BA 1 CR 2 TR 6 CH 6 NT A- 1/2 voff=0 +BA 1 CR 2 TR 6 CH 6 NT C- 1/2 voff=0 +BA 1 CR 2 TR 6 CH 6 NT E- 1/2 voff=0 +BA 1 CR 2+1/2 TR 6 CH 6 NT A- 1/2 voff=0 +BA 1 CR 2+1/2 TR 6 CH 6 NT C- 1/2 voff=0 +BA 1 CR 2+1/2 TR 6 CH 6 NT E- 1/2 voff=0 +BA 1 CR 3 TR 6 CH 6 NT A- 1/2 voff=0 +BA 1 CR 3 TR 6 CH 6 NT C- 
1/2 voff=0 +BA 1 CR 3 TR 6 CH 6 NT E- 1/2 voff=0 +BA 1 CR 3+1/2 TR 6 CH 6 NT A- 1/2 voff=0 +BA 1 CR 3+1/2 TR 6 CH 6 NT C- 1/2 voff=0 +BA 1 CR 3+1/2 TR 6 CH 6 NT E- 1/2 voff=0 +BA 2 CR 0 TR 6 CH 6 NT D- 1/2 voff=0 +BA 2 CR 0 TR 6 CH 6 NT F- 1/2 voff=0 +BA 2 CR 0 TR 6 CH 6 NT A- 1/2 voff=0 +BA 2 CR 1/2 TR 6 CH 6 NT D- 1/2 voff=0 +BA 2 CR 1/2 TR 6 CH 6 NT F- 1/2 voff=0 +BA 2 CR 1/2 TR 6 CH 6 NT A- 1/2 voff=0 +BA 2 CR 1 TR 6 CH 6 NT E- 1/2 voff=0 +BA 2 CR 1 TR 6 CH 6 NT G- 1/2 voff=0 +BA 2 CR 1 TR 6 CH 6 NT B- 1/2 voff=0 +BA 2 CR 1+1/2 TR 6 CH 6 NT E- 1/2 voff=0 +BA 2 CR 1+1/2 TR 6 CH 6 NT G- 1/2 voff=0 +BA 2 CR 1+1/2 TR 6 CH 6 NT B- 1/2 voff=0 +BA 2 CR 2 TR 6 CH 6 NT E- 1/2 voff=0 +BA 2 CR 2 TR 6 CH 6 NT G- 1/2 voff=0 +BA 2 CR 2 TR 6 CH 6 NT B- 1/2 voff=0 +BA 2 CR 2+1/2 TR 6 CH 6 NT E- 1/2 voff=0 +BA 2 CR 2+1/2 TR 6 CH 6 NT G- 1/2 voff=0 +BA 2 CR 2+1/2 TR 6 CH 6 NT B- 1/2 voff=0 +BA 2 CR 3 TR 6 CH 6 NT E- 1/2 voff=0 +BA 2 CR 3 TR 6 CH 6 NT G- 1/2 voff=0 +BA 2 CR 3 TR 6 CH 6 NT B- 1/2 voff=0 +BA 2 CR 3+1/2 TR 6 CH 6 NT E- 1/2 voff=0 +BA 2 CR 3+1/2 TR 6 CH 6 NT G- 1/2 voff=0 +BA 2 CR 3+1/2 TR 6 CH 6 NT B- 1/2 voff=0 +BA 3 CR 0 TR 6 CH 6 NT G- 1/2 voff=0 +BA 3 CR 0 TR 6 CH 6 NT B- 1/2 voff=0 +BA 3 CR 0 TR 6 CH 6 NT D- 1/2 voff=0 +BA 3 CR 1/2 TR 6 CH 6 NT G- 1/2 voff=0 +BA 3 CR 1/2 TR 6 CH 6 NT B- 1/2 voff=0 +BA 3 CR 1/2 TR 6 CH 6 NT D- 1/2 voff=0 +BA 3 CR 1 TR 6 CH 6 NT A#- 1/2 voff=0 +BA 3 CR 1 TR 6 CH 6 NT D- 1/2 voff=0 +BA 3 CR 1 TR 6 CH 6 NT F- 1/2 voff=0 +BA 3 CR 1+1/2 TR 6 CH 6 NT A#- 1/2 voff=0 +BA 3 CR 1+1/2 TR 6 CH 6 NT D- 1/2 voff=0 +BA 3 CR 1+1/2 TR 6 CH 6 NT F- 1/2 voff=0 +BA 3 CR 2 TR 6 CH 6 NT A- 1/2 voff=0 +BA 3 CR 2 TR 6 CH 6 NT C- 1/2 voff=0 +BA 3 CR 2 TR 6 CH 6 NT E- 1/2 voff=0 +BA 3 CR 2+1/2 TR 6 CH 6 NT A- 1/2 voff=0 +BA 3 CR 2+1/2 TR 6 CH 6 NT C- 1/2 voff=0 +BA 3 CR 2+1/2 TR 6 CH 6 NT E- 1/2 voff=0 +BA 3 CR 3 TR 6 CH 6 NT A- 1/2 voff=0 +BA 3 CR 3 TR 6 CH 6 NT C- 1/2 voff=0 +BA 3 CR 3 TR 6 CH 6 NT E- 1/2 voff=0 +BA 3 CR 3+1/2 TR 6 CH 6 NT A- 1/2 
voff=0 +BA 3 CR 3+1/2 TR 6 CH 6 NT C- 1/2 voff=0 +BA 3 CR 3+1/2 TR 6 CH 6 NT E- 1/2 voff=0 +BA 4 CR 0 TR 6 CH 6 NT D- 1/2 voff=0 +BA 4 CR 0 TR 6 CH 6 NT F- 1/2 voff=0 +BA 4 CR 0 TR 6 CH 6 NT A- 1/2 voff=0 +BA 4 CR 1/2 TR 6 CH 6 NT D- 1/2 voff=0 +BA 4 CR 1/2 TR 6 CH 6 NT F- 1/2 voff=0 +BA 4 CR 1/2 TR 6 CH 6 NT A- 1/2 voff=0 +BA 4 CR 1 TR 6 CH 6 NT E- 1/2 voff=0 +BA 4 CR 1 TR 6 CH 6 NT G- 1/2 voff=0 +BA 4 CR 1 TR 6 CH 6 NT B- 1/2 voff=0 +BA 4 CR 1+1/2 TR 6 CH 6 NT E- 1/2 voff=0 +BA 4 CR 1+1/2 TR 6 CH 6 NT G- 1/2 voff=0 +BA 4 CR 1+1/2 TR 6 CH 6 NT B- 1/2 voff=0 +BA 4 CR 2 TR 6 CH 6 NT E- 1/2 voff=0 +BA 4 CR 2 TR 6 CH 6 NT G- 1/2 voff=0 +BA 4 CR 2 TR 6 CH 6 NT B- 1/2 voff=0 +BA 4 CR 2+1/2 TR 6 CH 6 NT E- 1/2 voff=0 +BA 4 CR 2+1/2 TR 6 CH 6 NT G- 1/2 voff=0 +BA 4 CR 2+1/2 TR 6 CH 6 NT B- 1/2 voff=0 +BA 4 CR 3 TR 6 CH 6 NT E- 1/2 voff=0 +BA 4 CR 3 TR 6 CH 6 NT G- 1/2 voff=0 +BA 4 CR 3 TR 6 CH 6 NT B- 1/2 voff=0 +BA 4 CR 3+1/2 TR 6 CH 6 NT E- 1/2 voff=0 +BA 4 CR 3+1/2 TR 6 CH 6 NT G- 1/2 voff=0 +BA 4 CR 3+1/2 TR 6 CH 6 NT B- 1/2 voff=0 +BA 13 CR 0 TR 6 CH 6 NT E- 1/2 voff=0 +BA 13 CR 0 TR 6 CH 6 NT G- 1/2 voff=0 +BA 13 CR 0 TR 6 CH 6 NT B- 1/2 voff=0 +BA 13 CR 1/2 TR 6 CH 6 NT E- 1/2 voff=0 +BA 13 CR 1/2 TR 6 CH 6 NT G- 1/2 voff=0 +BA 13 CR 1/2 TR 6 CH 6 NT B- 1/2 voff=0 +BA 13 CR 1 TR 6 CH 6 NT E- 1/2 voff=0 +BA 13 CR 1 TR 6 CH 6 NT G- 1/2 voff=0 +BA 13 CR 1 TR 6 CH 6 NT B- 1/2 voff=0 +BA 13 CR 1+1/2 TR 6 CH 6 NT E- 1/2 voff=0 +BA 13 CR 1+1/2 TR 6 CH 6 NT G- 1/2 voff=0 +BA 13 CR 1+1/2 TR 6 CH 6 NT B- 1/2 voff=0 +BA 13 CR 2 TR 6 CH 6 NT E- 1/2 voff=0 +BA 13 CR 2 TR 6 CH 6 NT G- 1/2 voff=0 +BA 13 CR 2 TR 6 CH 6 NT B- 1/2 voff=0 +BA 13 CR 2+1/2 TR 6 CH 6 NT E- 1/2 voff=0 +BA 13 CR 2+1/2 TR 6 CH 6 NT G- 1/2 voff=0 +BA 13 CR 2+1/2 TR 6 CH 6 NT B- 1/2 voff=0 +BA 13 CR 3 TR 6 CH 6 NT E- 1/2 voff=0 +BA 13 CR 3 TR 6 CH 6 NT G- 1/2 voff=0 +BA 13 CR 3 TR 6 CH 6 NT B- 1/2 voff=0 +BA 13 CR 3+1/2 TR 6 CH 6 NT E- 1/2 voff=0 +BA 13 CR 3+1/2 TR 6 CH 6 NT G- 1/2 voff=0 +BA 13 CR 3+1/2 TR 
6 CH 6 NT B- 1/2 voff=0 +BA 14 CR 0 TR 6 CH 6 NT A- 1/2 voff=0 +BA 14 CR 0 TR 6 CH 6 NT C- 1/2 voff=0 +BA 14 CR 0 TR 6 CH 6 NT E- 1/2 voff=0 +BA 14 CR 1/2 TR 6 CH 6 NT A- 1/2 voff=0 +BA 14 CR 1/2 TR 6 CH 6 NT C- 1/2 voff=0 +BA 14 CR 1/2 TR 6 CH 6 NT E- 1/2 voff=0 +BA 14 CR 1 TR 6 CH 6 NT D- 1/2 voff=0 +BA 14 CR 1 TR 6 CH 6 NT F- 1/2 voff=0 +BA 14 CR 1 TR 6 CH 6 NT A- 1/2 voff=0 +BA 14 CR 1+1/2 TR 6 CH 6 NT D- 1/2 voff=0 +BA 14 CR 1+1/2 TR 6 CH 6 NT F- 1/2 voff=0 +BA 14 CR 1+1/2 TR 6 CH 6 NT A- 1/2 voff=0 +BA 14 CR 2 TR 6 CH 6 NT D- 1/2 voff=0 +BA 14 CR 2 TR 6 CH 6 NT F- 1/2 voff=0 +BA 14 CR 2 TR 6 CH 6 NT A- 1/2 voff=0 +BA 14 CR 2+1/2 TR 6 CH 6 NT D- 1/2 voff=0 +BA 14 CR 2+1/2 TR 6 CH 6 NT F- 1/2 voff=0 +BA 14 CR 2+1/2 TR 6 CH 6 NT A- 1/2 voff=0 +BA 14 CR 3 TR 6 CH 6 NT D- 1/2 voff=0 +BA 14 CR 3 TR 6 CH 6 NT F- 1/2 voff=0 +BA 14 CR 3 TR 6 CH 6 NT A- 1/2 voff=0 +BA 14 CR 3+1/2 TR 6 CH 6 NT D- 1/2 voff=0 +BA 14 CR 3+1/2 TR 6 CH 6 NT F- 1/2 voff=0 +BA 14 CR 3+1/2 TR 6 CH 6 NT A- 1/2 voff=0 +BA 15 CR 0 TR 6 CH 6 NT E- 1/2 voff=0 +BA 15 CR 0 TR 6 CH 6 NT G- 1/2 voff=0 +BA 15 CR 0 TR 6 CH 6 NT B- 1/2 voff=0 +BA 15 CR 1/2 TR 6 CH 6 NT E- 1/2 voff=0 +BA 15 CR 1/2 TR 6 CH 6 NT G- 1/2 voff=0 +BA 15 CR 1/2 TR 6 CH 6 NT B- 1/2 voff=0 +BA 15 CR 1 TR 6 CH 6 NT E- 1/2 voff=0 +BA 15 CR 1 TR 6 CH 6 NT G- 1/2 voff=0 +BA 15 CR 1 TR 6 CH 6 NT B- 1/2 voff=0 +BA 15 CR 1+1/2 TR 6 CH 6 NT E- 1/2 voff=0 +BA 15 CR 1+1/2 TR 6 CH 6 NT G- 1/2 voff=0 +BA 15 CR 1+1/2 TR 6 CH 6 NT B- 1/2 voff=0 +BA 15 CR 2 TR 6 CH 6 NT E- 1/2 voff=0 +BA 15 CR 2 TR 6 CH 6 NT G- 1/2 voff=0 +BA 15 CR 2 TR 6 CH 6 NT B- 1/2 voff=0 +BA 15 CR 2+1/2 TR 6 CH 6 NT E- 1/2 voff=0 +BA 15 CR 2+1/2 TR 6 CH 6 NT G- 1/2 voff=0 +BA 15 CR 2+1/2 TR 6 CH 6 NT B- 1/2 voff=0 +BA 15 CR 3 TR 6 CH 6 NT E- 1/2 voff=0 +BA 15 CR 3 TR 6 CH 6 NT G- 1/2 voff=0 +BA 15 CR 3 TR 6 CH 6 NT B- 1/2 voff=0 +BA 15 CR 3+1/2 TR 6 CH 6 NT E- 1/2 voff=0 +BA 15 CR 3+1/2 TR 6 CH 6 NT G- 1/2 voff=0 +BA 15 CR 3+1/2 TR 6 CH 6 NT B- 1/2 voff=0 +BA 16 CR 0 TR 6 CH 
6 NT A- 1/2 voff=0 +BA 16 CR 0 TR 6 CH 6 NT C- 1/2 voff=0 +BA 16 CR 0 TR 6 CH 6 NT E- 1/2 voff=0 +BA 16 CR 1/2 TR 6 CH 6 NT A- 1/2 voff=0 +BA 16 CR 1/2 TR 6 CH 6 NT C- 1/2 voff=0 +BA 16 CR 1/2 TR 6 CH 6 NT E- 1/2 voff=0 +BA 16 CR 1 TR 6 CH 6 NT D- 1/2 voff=0 +BA 16 CR 1 TR 6 CH 6 NT F- 1/2 voff=0 +BA 16 CR 1 TR 6 CH 6 NT A- 1/2 voff=0 +BA 16 CR 1+1/2 TR 6 CH 6 NT D- 1/2 voff=0 +BA 16 CR 1+1/2 TR 6 CH 6 NT F- 1/2 voff=0 +BA 16 CR 1+1/2 TR 6 CH 6 NT A- 1/2 voff=0 +BA 16 CR 2 TR 6 CH 6 NT D- 1/2 voff=0 +BA 16 CR 2 TR 6 CH 6 NT F- 1/2 voff=0 +BA 16 CR 2 TR 6 CH 6 NT A- 1/2 voff=0 +BA 16 CR 2+1/2 TR 6 CH 6 NT D- 1/2 voff=0 +BA 16 CR 2+1/2 TR 6 CH 6 NT F- 1/2 voff=0 +BA 16 CR 2+1/2 TR 6 CH 6 NT A- 1/2 voff=0 +BA 16 CR 3 TR 6 CH 6 NT D- 1/2 voff=0 +BA 16 CR 3 TR 6 CH 6 NT F- 1/2 voff=0 +BA 16 CR 3 TR 6 CH 6 NT A- 1/2 voff=0 +BA 16 CR 3+1/2 TR 6 CH 6 NT D- 1/2 voff=0 +BA 16 CR 3+1/2 TR 6 CH 6 NT F- 1/2 voff=0 +BA 16 CR 3+1/2 TR 6 CH 6 NT A- 1/2 voff=0 +BA 21 CR 0 TR 6 CH 6 NT D- 1/2 voff=0 +BA 21 CR 0 TR 6 CH 6 NT F- 1/2 voff=0 +BA 21 CR 0 TR 6 CH 6 NT A- 1/2 voff=0 +BA 21 CR 1/2 TR 6 CH 6 NT D- 1/2 voff=0 +BA 21 CR 1/2 TR 6 CH 6 NT F- 1/2 voff=0 +BA 21 CR 1/2 TR 6 CH 6 NT A- 1/2 voff=0 +BA 21 CR 1 TR 6 CH 6 NT D- 1/2 voff=0 +BA 21 CR 1 TR 6 CH 6 NT F- 1/2 voff=0 +BA 21 CR 1 TR 6 CH 6 NT A- 1/2 voff=0 +BA 21 CR 1+1/2 TR 6 CH 6 NT D- 1/2 voff=0 +BA 21 CR 1+1/2 TR 6 CH 6 NT F- 1/2 voff=0 +BA 21 CR 1+1/2 TR 6 CH 6 NT A- 1/2 voff=0 +BA 21 CR 2 TR 6 CH 6 NT D- 1/2 voff=0 +BA 21 CR 2 TR 6 CH 6 NT F- 1/2 voff=0 +BA 21 CR 2 TR 6 CH 6 NT A- 1/2 voff=0 +BA 21 CR 2+1/2 TR 6 CH 6 NT D- 1/2 voff=0 +BA 21 CR 2+1/2 TR 6 CH 6 NT F- 1/2 voff=0 +BA 21 CR 2+1/2 TR 6 CH 6 NT A- 1/2 voff=0 +BA 21 CR 3 TR 6 CH 6 NT D- 1/2 voff=0 +BA 21 CR 3 TR 6 CH 6 NT F- 1/2 voff=0 +BA 21 CR 3 TR 6 CH 6 NT A- 1/2 voff=0 +BA 21 CR 3+1/2 TR 6 CH 6 NT D- 1/2 voff=0 +BA 21 CR 3+1/2 TR 6 CH 6 NT F- 1/2 voff=0 +BA 21 CR 3+1/2 TR 6 CH 6 NT A- 1/2 voff=0 +BA 22 CR 0 TR 6 CH 6 NT G- 1/2 voff=0 +BA 22 CR 0 TR 6 CH 6 NT 
B- 1/2 voff=0 +BA 22 CR 0 TR 6 CH 6 NT D- 1/2 voff=0 +BA 22 CR 1/2 TR 6 CH 6 NT G- 1/2 voff=0 +BA 22 CR 1/2 TR 6 CH 6 NT B- 1/2 voff=0 +BA 22 CR 1/2 TR 6 CH 6 NT D- 1/2 voff=0 +BA 22 CR 1 TR 6 CH 6 NT C- 1/2 voff=0 +BA 22 CR 1 TR 6 CH 6 NT E- 1/2 voff=0 +BA 22 CR 1 TR 6 CH 6 NT G- 1/2 voff=0 +BA 22 CR 1+1/2 TR 6 CH 6 NT C- 1/2 voff=0 +BA 22 CR 1+1/2 TR 6 CH 6 NT E- 1/2 voff=0 +BA 22 CR 1+1/2 TR 6 CH 6 NT G- 1/2 voff=0 +BA 22 CR 2 TR 6 CH 6 NT D- 1/2 voff=0 +BA 22 CR 2 TR 6 CH 6 NT F- 1/2 voff=0 +BA 22 CR 2 TR 6 CH 6 NT A- 1/2 voff=0 +BA 22 CR 2+1/2 TR 6 CH 6 NT D- 1/2 voff=0 +BA 22 CR 2+1/2 TR 6 CH 6 NT F- 1/2 voff=0 +BA 22 CR 2+1/2 TR 6 CH 6 NT A- 1/2 voff=0 +BA 22 CR 3 TR 6 CH 6 NT D- 1/2 voff=0 +BA 22 CR 3 TR 6 CH 6 NT F- 1/2 voff=0 +BA 22 CR 3 TR 6 CH 6 NT A- 1/2 voff=0 +BA 22 CR 3+1/2 TR 6 CH 6 NT D- 1/2 voff=0 +BA 22 CR 3+1/2 TR 6 CH 6 NT F- 1/2 voff=0 +BA 22 CR 3+1/2 TR 6 CH 6 NT A- 1/2 voff=0 +BA 23 CR 0 TR 6 CH 6 NT D- 1/2 voff=0 +BA 23 CR 0 TR 6 CH 6 NT F- 1/2 voff=0 +BA 23 CR 0 TR 6 CH 6 NT A- 1/2 voff=0 +BA 23 CR 1/2 TR 6 CH 6 NT D- 1/2 voff=0 +BA 23 CR 1/2 TR 6 CH 6 NT F- 1/2 voff=0 +BA 23 CR 1/2 TR 6 CH 6 NT A- 1/2 voff=0 +BA 23 CR 1 TR 6 CH 6 NT D- 1/2 voff=0 +BA 23 CR 1 TR 6 CH 6 NT F- 1/2 voff=0 +BA 23 CR 1 TR 6 CH 6 NT A- 1/2 voff=0 +BA 23 CR 1+1/2 TR 6 CH 6 NT D- 1/2 voff=0 +BA 23 CR 1+1/2 TR 6 CH 6 NT F- 1/2 voff=0 +BA 23 CR 1+1/2 TR 6 CH 6 NT A- 1/2 voff=0 +BA 23 CR 2 TR 6 CH 6 NT D- 1/2 voff=0 +BA 23 CR 2 TR 6 CH 6 NT F- 1/2 voff=0 +BA 23 CR 2 TR 6 CH 6 NT A- 1/2 voff=0 +BA 23 CR 2+1/2 TR 6 CH 6 NT D- 1/2 voff=0 +BA 23 CR 2+1/2 TR 6 CH 6 NT F- 1/2 voff=0 +BA 23 CR 2+1/2 TR 6 CH 6 NT A- 1/2 voff=0 +BA 23 CR 3 TR 6 CH 6 NT D- 1/2 voff=0 +BA 23 CR 3 TR 6 CH 6 NT F- 1/2 voff=0 +BA 23 CR 3 TR 6 CH 6 NT A- 1/2 voff=0 +BA 23 CR 3+1/2 TR 6 CH 6 NT D- 1/2 voff=0 +BA 23 CR 3+1/2 TR 6 CH 6 NT F- 1/2 voff=0 +BA 23 CR 3+1/2 TR 6 CH 6 NT A- 1/2 voff=0 +BA 24 CR 0 TR 6 CH 6 NT G- 1/2 voff=0 +BA 24 CR 0 TR 6 CH 6 NT B- 1/2 voff=0 +BA 24 CR 0 TR 6 CH 6 NT D- 
1/2 voff=0 +BA 24 CR 1/2 TR 6 CH 6 NT G- 1/2 voff=0 +BA 24 CR 1/2 TR 6 CH 6 NT B- 1/2 voff=0 +BA 24 CR 1/2 TR 6 CH 6 NT D- 1/2 voff=0 +BA 24 CR 1 TR 6 CH 6 NT C- 1/2 voff=0 +BA 24 CR 1 TR 6 CH 6 NT E- 1/2 voff=0 +BA 24 CR 1 TR 6 CH 6 NT G- 1/2 voff=0 +BA 24 CR 1+1/2 TR 6 CH 6 NT C- 1/2 voff=0 +BA 24 CR 1+1/2 TR 6 CH 6 NT E- 1/2 voff=0 +BA 24 CR 1+1/2 TR 6 CH 6 NT G- 1/2 voff=0 +BA 24 CR 2 TR 6 CH 6 NT D- 1/2 voff=0 +BA 24 CR 2 TR 6 CH 6 NT F- 1/2 voff=0 +BA 24 CR 2 TR 6 CH 6 NT A- 1/2 voff=0 +BA 24 CR 2+1/2 TR 6 CH 6 NT D- 1/2 voff=0 +BA 24 CR 2+1/2 TR 6 CH 6 NT F- 1/2 voff=0 +BA 24 CR 2+1/2 TR 6 CH 6 NT A- 1/2 voff=0 +BA 24 CR 3 TR 6 CH 6 NT D- 1/2 voff=0 +BA 24 CR 3 TR 6 CH 6 NT F- 1/2 voff=0 +BA 24 CR 3 TR 6 CH 6 NT A- 1/2 voff=0 +BA 24 CR 3+1/2 TR 6 CH 6 NT D- 1/2 voff=0 +BA 24 CR 3+1/2 TR 6 CH 6 NT F- 1/2 voff=0 +BA 24 CR 3+1/2 TR 6 CH 6 NT A- 1/2 voff=0 +BA 37 CR 0 TR 0 CH 16 End of track +Ack ard finds the spell er. + +Ack ard finds the spell er. +Count el is in the trans lat or. +The runn er makes Syd ard. +The trans lat or loves All el. + +Syd ice goes to Nei ur. +Add ee loves the comb in er. +The choos er is in Syd ard. +The spell er is in Nei ur. + +Ven ard feels one. +Le an reads on Ack ler. +The runn er makes Luc an. +The draw er reads on Duk ney. + +Don ye has Black ney. +Syd ard is in the choos er. +The ess ence forms Ack ler. +The ess ence is in Add ur. + + +format=1 tracks=6 division=384 + +BA 1 CR 0 TR 0 CH 16 Text type 2: "Produced by Mind Reading Music Composer by Lucian Academy" +BA 1 CR 0 TR 0 CH 16 Text type 3: "Le el moves to the comb in er." 
+BA 1 CR 0 TR 0 CH 1 Channel volume 127 +BA 1 CR 0 TR 0 CH 16 Tempo 63.00009 +BA 1 CR 0 TR 1 CH 1 Instrument 1 +BA 1 CR 0 TR 1 CH 1 NT R 0 von=127 voff=0 +BA 5 CR 0 TR 1 CH 1 NT C- 1/2 voff=0 +BA 5 CR 1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 5 CR 1 TR 1 CH 1 NT C- 1/2 voff=0 +BA 5 CR 1+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 5 CR 2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 5 CR 2+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 5 CR 3 TR 1 CH 1 NT C- 1/2 voff=0 +BA 5 CR 3+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 6 CR 0 TR 1 CH 1 NT C- 1/2 voff=0 +BA 6 CR 1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 6 CR 1 TR 1 CH 1 NT C- 1/2 voff=0 +BA 6 CR 1+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 6 CR 2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 6 CR 2+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 6 CR 3 TR 1 CH 1 NT C- 1/2 voff=0 +BA 7 CR 0 TR 1 CH 1 NT C- 1/2 voff=0 +BA 7 CR 1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 7 CR 1 TR 1 CH 1 NT C- 1/2 voff=0 +BA 7 CR 1+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 7 CR 2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 7 CR 2+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 7 CR 3 TR 1 CH 1 NT C- 1/2 voff=0 +BA 8 CR 0 TR 1 CH 1 NT C- 1/2 voff=0 +BA 8 CR 1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 8 CR 1 TR 1 CH 1 NT C- 1/2 voff=0 +BA 8 CR 1+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 8 CR 2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 8 CR 2+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 13 CR 0 TR 1 CH 1 NT C- 1/2 voff=0 +BA 13 CR 1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 13 CR 1 TR 1 CH 1 NT C- 1/2 voff=0 +BA 13 CR 1+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 13 CR 2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 14 CR 0 TR 1 CH 1 NT C- 1/2 voff=0 +BA 14 CR 1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 14 CR 1 TR 1 CH 1 NT C- 1/2 voff=0 +BA 14 CR 1+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 14 CR 2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 15 CR 0 TR 1 CH 1 NT C- 1/2 voff=0 +BA 15 CR 1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 15 CR 1 TR 1 CH 1 NT C- 1/2 voff=0 +BA 15 CR 1+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 15 CR 2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 15 CR 2+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 15 CR 3 TR 1 CH 1 NT C- 1/2 voff=0 +BA 16 CR 0 TR 1 CH 1 NT C- 1/2 voff=0 +BA 16 CR 1/2 
TR 1 CH 1 NT C- 1/2 voff=0 +BA 16 CR 1 TR 1 CH 1 NT C- 1/2 voff=0 +BA 16 CR 1+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 16 CR 2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 16 CR 2+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 17 CR 0 TR 1 CH 1 NT C- 1/2 voff=0 +BA 17 CR 1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 17 CR 1 TR 1 CH 1 NT C- 1/2 voff=0 +BA 17 CR 1+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 17 CR 2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 17 CR 2+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 18 CR 0 TR 1 CH 1 NT F- 1/2 voff=0 +BA 18 CR 1/2 TR 1 CH 1 NT F- 1/2 voff=0 +BA 18 CR 1 TR 1 CH 1 NT G- 1/2 voff=0 +BA 18 CR 1+1/2 TR 1 CH 1 NT G- 1/2 voff=0 +BA 18 CR 2 TR 1 CH 1 NT G- 1/2 voff=0 +BA 19 CR 0 TR 1 CH 1 NT C- 1/2 voff=0 +BA 19 CR 1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 19 CR 1 TR 1 CH 1 NT C- 1/2 voff=0 +BA 19 CR 1+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 19 CR 2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 19 CR 2+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 20 CR 0 TR 1 CH 1 NT F- 1/2 voff=0 +BA 20 CR 1/2 TR 1 CH 1 NT F- 1/2 voff=0 +BA 20 CR 1 TR 1 CH 1 NT G- 1/2 voff=0 +BA 20 CR 1+1/2 TR 1 CH 1 NT G- 1/2 voff=0 +BA 20 CR 2 TR 1 CH 1 NT G- 1/2 voff=0 +BA 25 CR 0 TR 1 CH 1 NT C- 1/2 voff=0 +BA 25 CR 1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 25 CR 1 TR 1 CH 1 NT C- 1/2 voff=0 +BA 25 CR 1+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 25 CR 2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 25 CR 2+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 26 CR 0 TR 1 CH 1 NT D- 1/2 voff=0 +BA 26 CR 1/2 TR 1 CH 1 NT D- 1/2 voff=0 +BA 26 CR 1 TR 1 CH 1 NT E- 1/2 voff=0 +BA 26 CR 1+1/2 TR 1 CH 1 NT E- 1/2 voff=0 +BA 26 CR 2 TR 1 CH 1 NT E- 1/2 voff=0 +BA 26 CR 2+1/2 TR 1 CH 1 NT E- 1/2 voff=0 +BA 27 CR 0 TR 1 CH 1 NT C- 1/2 voff=0 +BA 27 CR 1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 27 CR 1 TR 1 CH 1 NT C- 1/2 voff=0 +BA 27 CR 1+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 27 CR 2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 28 CR 0 TR 1 CH 1 NT D- 1/2 voff=0 +BA 28 CR 1/2 TR 1 CH 1 NT D- 1/2 voff=0 +BA 28 CR 1 TR 1 CH 1 NT E- 1/2 voff=0 +BA 28 CR 1+1/2 TR 1 CH 1 NT E- 1/2 voff=0 +BA 28 CR 2 TR 1 CH 1 NT E- 1/2 voff=0 +BA 28 CR 2+1/2 TR 1 CH 1 NT 
E- 1/2 voff=0 +BA 29 CR 0 TR 1 CH 1 NT C- 1/2 voff=0 +BA 29 CR 1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 29 CR 1 TR 1 CH 1 NT C- 1/2 voff=0 +BA 29 CR 1+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 29 CR 2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 29 CR 2+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 30 CR 0 TR 1 CH 1 NT F- 1/2 voff=0 +BA 30 CR 1/2 TR 1 CH 1 NT F- 1/2 voff=0 +BA 30 CR 1 TR 1 CH 1 NT G- 1/2 voff=0 +BA 30 CR 1+1/2 TR 1 CH 1 NT G- 1/2 voff=0 +BA 30 CR 2 TR 1 CH 1 NT G- 1/2 voff=0 +BA 31 CR 0 TR 1 CH 1 NT C- 1/2 voff=0 +BA 31 CR 1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 31 CR 1 TR 1 CH 1 NT C- 1/2 voff=0 +BA 31 CR 1+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 31 CR 2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 31 CR 2+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 32 CR 0 TR 1 CH 1 NT F- 1/2 voff=0 +BA 32 CR 1/2 TR 1 CH 1 NT F- 1/2 voff=0 +BA 32 CR 1 TR 1 CH 1 NT G- 1/2 voff=0 +BA 32 CR 1+1/2 TR 1 CH 1 NT G- 1/2 voff=0 +BA 32 CR 2 TR 1 CH 1 NT G- 1/2 voff=0 +BA 1 CR 0 TR 2 CH 2 Instrument 39 +BA 1 CR 0 TR 2 CH 2 NT R 0 von=127 voff=0 +BA 1 CR 0 TR 2 CH 2 NT C- 1/2 voff=0 +BA 1 CR 1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 1 CR 1 TR 2 CH 2 NT D- 1/2 voff=0 +BA 1 CR 1+1/2 TR 2 CH 2 NT D- 1/2 voff=0 +BA 1 CR 2 TR 2 CH 2 NT G- 1/2 voff=0 +BA 1 CR 2+1/2 TR 2 CH 2 NT G- 1/2 voff=0 +BA 1 CR 3 TR 2 CH 2 NT G- 1/2 voff=0 +BA 1 CR 3+1/2 TR 2 CH 2 NT G- 1/2 voff=0 +BA 2 CR 0 TR 2 CH 2 NT C- 1/2 voff=0 +BA 2 CR 1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 2 CR 1 TR 2 CH 2 NT C- 1/2 voff=0 +BA 2 CR 1+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 2 CR 2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 2 CR 2+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 2 CR 3 TR 2 CH 2 NT C- 1/2 voff=0 +BA 2 CR 3+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 3 CR 0 TR 2 CH 2 NT C- 1/2 voff=0 +BA 3 CR 1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 3 CR 1 TR 2 CH 2 NT D- 1/2 voff=0 +BA 3 CR 1+1/2 TR 2 CH 2 NT D- 1/2 voff=0 +BA 3 CR 2 TR 2 CH 2 NT G- 1/2 voff=0 +BA 3 CR 2+1/2 TR 2 CH 2 NT G- 1/2 voff=0 +BA 3 CR 3 TR 2 CH 2 NT G- 1/2 voff=0 +BA 3 CR 3+1/2 TR 2 CH 2 NT G- 1/2 voff=0 +BA 4 CR 0 TR 2 CH 2 NT C- 1/2 voff=0 +BA 4 CR 1/2 TR 2 CH 2 NT 
C- 1/2 voff=0 +BA 4 CR 1 TR 2 CH 2 NT C- 1/2 voff=0 +BA 4 CR 1+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 4 CR 2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 4 CR 2+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 4 CR 3 TR 2 CH 2 NT C- 1/2 voff=0 +BA 4 CR 3+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 5 CR 0 TR 2 CH 2 NT C- 1/2 voff=0 +BA 5 CR 1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 5 CR 1 TR 2 CH 2 NT C- 1/2 voff=0 +BA 5 CR 1+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 5 CR 2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 5 CR 2+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 5 CR 3 TR 2 CH 2 NT C- 1/2 voff=0 +BA 5 CR 3+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 6 CR 0 TR 2 CH 2 NT C- 1/2 voff=0 +BA 6 CR 1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 6 CR 1 TR 2 CH 2 NT C- 1/2 voff=0 +BA 6 CR 1+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 6 CR 2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 6 CR 2+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 6 CR 3 TR 2 CH 2 NT C- 1/2 voff=0 +BA 6 CR 3+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 7 CR 0 TR 2 CH 2 NT C- 1/2 voff=0 +BA 7 CR 1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 7 CR 1 TR 2 CH 2 NT C- 1/2 voff=0 +BA 7 CR 1+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 7 CR 2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 7 CR 2+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 7 CR 3 TR 2 CH 2 NT C- 1/2 voff=0 +BA 7 CR 3+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 8 CR 0 TR 2 CH 2 NT C- 1/2 voff=0 +BA 8 CR 1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 8 CR 1 TR 2 CH 2 NT C- 1/2 voff=0 +BA 8 CR 1+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 8 CR 2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 8 CR 2+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 8 CR 3 TR 2 CH 2 NT C- 1/2 voff=0 +BA 8 CR 3+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 21 CR 0 TR 2 CH 2 NT G- 1/2 voff=0 +BA 21 CR 1/2 TR 2 CH 2 NT G- 1/2 voff=0 +BA 21 CR 1 TR 2 CH 2 NT A- 1/2 voff=0 +BA 21 CR 1+1/2 TR 2 CH 2 NT A- 1/2 voff=0 +BA 21 CR 2 TR 2 CH 2 NT D- 1/2 voff=0 +BA 21 CR 2+1/2 TR 2 CH 2 NT D- 1/2 voff=0 +BA 21 CR 3 TR 2 CH 2 NT D- 1/2 voff=0 +BA 21 CR 3+1/2 TR 2 CH 2 NT D- 1/2 voff=0 +BA 22 CR 0 TR 2 CH 2 NT F- 1/2 voff=0 +BA 22 CR 1/2 TR 2 CH 2 NT F- 1/2 voff=0 +BA 22 CR 1 TR 2 CH 2 NT A#- 1/2 voff=0 +BA 22 CR 1+1/2 TR 2 CH 2 
NT A#- 1/2 voff=0 +BA 22 CR 2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 22 CR 2+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 22 CR 3 TR 2 CH 2 NT C- 1/2 voff=0 +BA 22 CR 3+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 23 CR 0 TR 2 CH 2 NT G- 1/2 voff=0 +BA 23 CR 1/2 TR 2 CH 2 NT G- 1/2 voff=0 +BA 23 CR 1 TR 2 CH 2 NT A- 1/2 voff=0 +BA 23 CR 1+1/2 TR 2 CH 2 NT A- 1/2 voff=0 +BA 23 CR 2 TR 2 CH 2 NT D- 1/2 voff=0 +BA 23 CR 2+1/2 TR 2 CH 2 NT D- 1/2 voff=0 +BA 23 CR 3 TR 2 CH 2 NT D- 1/2 voff=0 +BA 23 CR 3+1/2 TR 2 CH 2 NT D- 1/2 voff=0 +BA 24 CR 0 TR 2 CH 2 NT F- 1/2 voff=0 +BA 24 CR 1/2 TR 2 CH 2 NT F- 1/2 voff=0 +BA 24 CR 1 TR 2 CH 2 NT A#- 1/2 voff=0 +BA 24 CR 1+1/2 TR 2 CH 2 NT A#- 1/2 voff=0 +BA 24 CR 2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 24 CR 2+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 24 CR 3 TR 2 CH 2 NT C- 1/2 voff=0 +BA 24 CR 3+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 25 CR 0 TR 2 CH 2 NT C- 1/2 voff=0 +BA 25 CR 1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 25 CR 1 TR 2 CH 2 NT C- 1/2 voff=0 +BA 25 CR 1+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 25 CR 2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 25 CR 2+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 25 CR 3 TR 2 CH 2 NT C- 1/2 voff=0 +BA 25 CR 3+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 26 CR 0 TR 2 CH 2 NT D- 1/2 voff=0 +BA 26 CR 1/2 TR 2 CH 2 NT D- 1/2 voff=0 +BA 26 CR 1 TR 2 CH 2 NT E- 1/2 voff=0 +BA 26 CR 1+1/2 TR 2 CH 2 NT E- 1/2 voff=0 +BA 26 CR 2 TR 2 CH 2 NT E- 1/2 voff=0 +BA 26 CR 2+1/2 TR 2 CH 2 NT E- 1/2 voff=0 +BA 26 CR 3 TR 2 CH 2 NT E- 1/2 voff=0 +BA 26 CR 3+1/2 TR 2 CH 2 NT E- 1/2 voff=0 +BA 27 CR 0 TR 2 CH 2 NT C- 1/2 voff=0 +BA 27 CR 1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 27 CR 1 TR 2 CH 2 NT C- 1/2 voff=0 +BA 27 CR 1+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 27 CR 2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 27 CR 2+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 27 CR 3 TR 2 CH 2 NT C- 1/2 voff=0 +BA 27 CR 3+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 28 CR 0 TR 2 CH 2 NT D- 1/2 voff=0 +BA 28 CR 1/2 TR 2 CH 2 NT D- 1/2 voff=0 +BA 28 CR 1 TR 2 CH 2 NT E- 1/2 voff=0 +BA 28 CR 1+1/2 TR 2 CH 2 NT E- 1/2 voff=0 +BA 28 CR 2 TR 2 CH 2 
NT E- 1/2 voff=0 +BA 28 CR 2+1/2 TR 2 CH 2 NT E- 1/2 voff=0 +BA 28 CR 3 TR 2 CH 2 NT E- 1/2 voff=0 +BA 28 CR 3+1/2 TR 2 CH 2 NT E- 1/2 voff=0 +BA 1 CR 0 TR 3 CH 3 Instrument 48 +BA 1 CR 0 TR 3 CH 3 NT R 0 von=127 voff=0 +BA 9 CR 0 TR 3 CH 3 NT C- 1/2 voff=0 +BA 9 CR 1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 9 CR 1 TR 3 CH 3 NT C- 1/2 voff=0 +BA 9 CR 1+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 9 CR 2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 9 CR 2+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 9 CR 3 TR 3 CH 3 NT C- 1/2 voff=0 +BA 9 CR 3+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 10 CR 0 TR 3 CH 3 NT G- 1/2 voff=0 +BA 10 CR 1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 10 CR 1 TR 3 CH 3 NT C- 1/2 voff=0 +BA 10 CR 1+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 10 CR 2 TR 3 CH 3 NT F- 1/2 voff=0 +BA 10 CR 2+1/2 TR 3 CH 3 NT F- 1/2 voff=0 +BA 10 CR 3 TR 3 CH 3 NT F- 1/2 voff=0 +BA 10 CR 3+1/2 TR 3 CH 3 NT F- 1/2 voff=0 +BA 11 CR 0 TR 3 CH 3 NT C- 1/2 voff=0 +BA 11 CR 1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 11 CR 1 TR 3 CH 3 NT C- 1/2 voff=0 +BA 11 CR 1+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 11 CR 2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 11 CR 2+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 11 CR 3 TR 3 CH 3 NT C- 1/2 voff=0 +BA 11 CR 3+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 12 CR 0 TR 3 CH 3 NT G- 1/2 voff=0 +BA 12 CR 1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 12 CR 1 TR 3 CH 3 NT C- 1/2 voff=0 +BA 12 CR 1+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 12 CR 2 TR 3 CH 3 NT F- 1/2 voff=0 +BA 12 CR 2+1/2 TR 3 CH 3 NT F- 1/2 voff=0 +BA 12 CR 3 TR 3 CH 3 NT F- 1/2 voff=0 +BA 12 CR 3+1/2 TR 3 CH 3 NT F- 1/2 voff=0 +BA 13 CR 0 TR 3 CH 3 NT C- 1/2 voff=0 +BA 13 CR 1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 13 CR 1 TR 3 CH 3 NT C- 1/2 voff=0 +BA 13 CR 1+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 13 CR 2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 13 CR 2+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 13 CR 3 TR 3 CH 3 NT C- 1/2 voff=0 +BA 13 CR 3+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 14 CR 0 TR 3 CH 3 NT C- 1/2 voff=0 +BA 14 CR 1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 14 CR 1 TR 3 CH 3 NT C- 1/2 voff=0 +BA 14 CR 1+1/2 TR 3 CH 3 NT C- 1/2 
voff=0 +BA 14 CR 2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 14 CR 2+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 14 CR 3 TR 3 CH 3 NT C- 1/2 voff=0 +BA 14 CR 3+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 15 CR 0 TR 3 CH 3 NT C- 1/2 voff=0 +BA 15 CR 1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 15 CR 1 TR 3 CH 3 NT C- 1/2 voff=0 +BA 15 CR 1+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 15 CR 2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 15 CR 2+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 15 CR 3 TR 3 CH 3 NT C- 1/2 voff=0 +BA 15 CR 3+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 16 CR 0 TR 3 CH 3 NT C- 1/2 voff=0 +BA 16 CR 1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 16 CR 1 TR 3 CH 3 NT C- 1/2 voff=0 +BA 16 CR 1+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 16 CR 2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 16 CR 2+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 16 CR 3 TR 3 CH 3 NT C- 1/2 voff=0 +BA 16 CR 3+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 21 CR 0 TR 3 CH 3 NT G- 1/2 voff=0 +BA 21 CR 1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 21 CR 1 TR 3 CH 3 NT A- 1/2 voff=0 +BA 21 CR 1+1/2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 21 CR 2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 21 CR 2+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 21 CR 3 TR 3 CH 3 NT D- 1/2 voff=0 +BA 21 CR 3+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 22 CR 0 TR 3 CH 3 NT F- 1/2 voff=0 +BA 22 CR 1/2 TR 3 CH 3 NT F- 1/2 voff=0 +BA 22 CR 1 TR 3 CH 3 NT A#- 1/2 voff=0 +BA 22 CR 1+1/2 TR 3 CH 3 NT A#- 1/2 voff=0 +BA 22 CR 2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 22 CR 2+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 22 CR 3 TR 3 CH 3 NT C- 1/2 voff=0 +BA 22 CR 3+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 23 CR 0 TR 3 CH 3 NT G- 1/2 voff=0 +BA 23 CR 1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 23 CR 1 TR 3 CH 3 NT A- 1/2 voff=0 +BA 23 CR 1+1/2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 23 CR 2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 23 CR 2+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 23 CR 3 TR 3 CH 3 NT D- 1/2 voff=0 +BA 23 CR 3+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 24 CR 0 TR 3 CH 3 NT F- 1/2 voff=0 +BA 24 CR 1/2 TR 3 CH 3 NT F- 1/2 voff=0 +BA 24 CR 1 TR 3 CH 3 NT A#- 1/2 voff=0 +BA 24 CR 1+1/2 TR 3 CH 3 NT A#- 1/2 voff=0 +BA 24 CR 2 TR 3 CH 3 NT C- 1/2 
voff=0 +BA 24 CR 2+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 24 CR 3 TR 3 CH 3 NT C- 1/2 voff=0 +BA 24 CR 3+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 25 CR 0 TR 3 CH 3 NT C- 1/2 voff=0 +BA 25 CR 1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 25 CR 1 TR 3 CH 3 NT C- 1/2 voff=0 +BA 25 CR 1+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 25 CR 2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 25 CR 2+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 25 CR 3 TR 3 CH 3 NT C- 1/2 voff=0 +BA 25 CR 3+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 26 CR 0 TR 3 CH 3 NT D- 1/2 voff=0 +BA 26 CR 1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 26 CR 1 TR 3 CH 3 NT E- 1/2 voff=0 +BA 26 CR 1+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 26 CR 2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 26 CR 2+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 26 CR 3 TR 3 CH 3 NT E- 1/2 voff=0 +BA 26 CR 3+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 27 CR 0 TR 3 CH 3 NT C- 1/2 voff=0 +BA 27 CR 1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 27 CR 1 TR 3 CH 3 NT C- 1/2 voff=0 +BA 27 CR 1+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 27 CR 2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 27 CR 2+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 27 CR 3 TR 3 CH 3 NT C- 1/2 voff=0 +BA 27 CR 3+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 28 CR 0 TR 3 CH 3 NT D- 1/2 voff=0 +BA 28 CR 1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 28 CR 1 TR 3 CH 3 NT E- 1/2 voff=0 +BA 28 CR 1+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 28 CR 2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 28 CR 2+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 28 CR 3 TR 3 CH 3 NT E- 1/2 voff=0 +BA 28 CR 3+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 1 CR 0 TR 4 CH 4 Instrument 119 +BA 1 CR 0 TR 4 CH 4 NT R 0 von=127 voff=0 +BA 1 CR 0 TR 4 CH 4 NT C- 1/2 voff=0 +BA 1 CR 1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 1 CR 1 TR 4 CH 4 NT D- 1/2 voff=0 +BA 1 CR 1+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 1 CR 2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 1 CR 2+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 1 CR 3 TR 4 CH 4 NT G- 1/2 voff=0 +BA 1 CR 3+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 2 CR 0 TR 4 CH 4 NT C- 1/2 voff=0 +BA 2 CR 1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 2 CR 1 TR 4 CH 4 NT C- 1/2 voff=0 +BA 2 CR 1+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 2 CR 
2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 2 CR 2+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 2 CR 3 TR 4 CH 4 NT C- 1/2 voff=0 +BA 2 CR 3+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 3 CR 0 TR 4 CH 4 NT C- 1/2 voff=0 +BA 3 CR 1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 3 CR 1 TR 4 CH 4 NT D- 1/2 voff=0 +BA 3 CR 1+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 3 CR 2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 3 CR 2+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 3 CR 3 TR 4 CH 4 NT G- 1/2 voff=0 +BA 3 CR 3+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 4 CR 0 TR 4 CH 4 NT C- 1/2 voff=0 +BA 4 CR 1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 4 CR 1 TR 4 CH 4 NT C- 1/2 voff=0 +BA 4 CR 1+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 4 CR 2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 4 CR 2+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 4 CR 3 TR 4 CH 4 NT C- 1/2 voff=0 +BA 4 CR 3+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 9 CR 0 TR 4 CH 4 NT C- 1/2 voff=0 +BA 9 CR 1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 9 CR 1 TR 4 CH 4 NT C- 1/2 voff=0 +BA 9 CR 1+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 9 CR 2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 9 CR 2+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 9 CR 3 TR 4 CH 4 NT C- 1/2 voff=0 +BA 9 CR 3+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 10 CR 0 TR 4 CH 4 NT G- 1/2 voff=0 +BA 10 CR 1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 10 CR 1 TR 4 CH 4 NT C- 1/2 voff=0 +BA 10 CR 1+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 10 CR 2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 10 CR 2+1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 10 CR 3 TR 4 CH 4 NT F- 1/2 voff=0 +BA 10 CR 3+1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 11 CR 0 TR 4 CH 4 NT C- 1/2 voff=0 +BA 11 CR 1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 11 CR 1 TR 4 CH 4 NT C- 1/2 voff=0 +BA 11 CR 1+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 11 CR 2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 11 CR 2+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 11 CR 3 TR 4 CH 4 NT C- 1/2 voff=0 +BA 11 CR 3+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 12 CR 0 TR 4 CH 4 NT G- 1/2 voff=0 +BA 12 CR 1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 12 CR 1 TR 4 CH 4 NT C- 1/2 voff=0 +BA 12 CR 1+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 12 CR 2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 12 CR 2+1/2 TR 4 CH 4 NT F- 1/2 
voff=0 +BA 12 CR 3 TR 4 CH 4 NT F- 1/2 voff=0 +BA 12 CR 3+1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 17 CR 0 TR 4 CH 4 NT C- 1/2 voff=0 +BA 17 CR 1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 17 CR 1 TR 4 CH 4 NT C- 1/2 voff=0 +BA 17 CR 1+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 17 CR 2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 17 CR 2+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 17 CR 3 TR 4 CH 4 NT C- 1/2 voff=0 +BA 17 CR 3+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 18 CR 0 TR 4 CH 4 NT F- 1/2 voff=0 +BA 18 CR 1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 18 CR 1 TR 4 CH 4 NT G- 1/2 voff=0 +BA 18 CR 1+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 18 CR 2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 18 CR 2+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 18 CR 3 TR 4 CH 4 NT G- 1/2 voff=0 +BA 18 CR 3+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 19 CR 0 TR 4 CH 4 NT C- 1/2 voff=0 +BA 19 CR 1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 19 CR 1 TR 4 CH 4 NT C- 1/2 voff=0 +BA 19 CR 1+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 19 CR 2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 19 CR 2+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 19 CR 3 TR 4 CH 4 NT C- 1/2 voff=0 +BA 19 CR 3+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 20 CR 0 TR 4 CH 4 NT F- 1/2 voff=0 +BA 20 CR 1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 20 CR 1 TR 4 CH 4 NT G- 1/2 voff=0 +BA 20 CR 1+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 20 CR 2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 20 CR 2+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 20 CR 3 TR 4 CH 4 NT G- 1/2 voff=0 +BA 20 CR 3+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 21 CR 0 TR 4 CH 4 NT G- 1/2 voff=0 +BA 21 CR 1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 21 CR 1 TR 4 CH 4 NT A- 1/2 voff=0 +BA 21 CR 1+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 21 CR 2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 21 CR 2+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 21 CR 3 TR 4 CH 4 NT D- 1/2 voff=0 +BA 21 CR 3+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 22 CR 0 TR 4 CH 4 NT F- 1/2 voff=0 +BA 22 CR 1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 22 CR 1 TR 4 CH 4 NT A#- 1/2 voff=0 +BA 22 CR 1+1/2 TR 4 CH 4 NT A#- 1/2 voff=0 +BA 22 CR 2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 22 CR 2+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 22 CR 3 TR 4 CH 4 NT C- 1/2 
voff=0 +BA 22 CR 3+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 23 CR 0 TR 4 CH 4 NT G- 1/2 voff=0 +BA 23 CR 1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 23 CR 1 TR 4 CH 4 NT A- 1/2 voff=0 +BA 23 CR 1+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 23 CR 2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 23 CR 2+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 23 CR 3 TR 4 CH 4 NT D- 1/2 voff=0 +BA 23 CR 3+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 24 CR 0 TR 4 CH 4 NT F- 1/2 voff=0 +BA 24 CR 1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 24 CR 1 TR 4 CH 4 NT A#- 1/2 voff=0 +BA 24 CR 1+1/2 TR 4 CH 4 NT A#- 1/2 voff=0 +BA 24 CR 2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 24 CR 2+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 24 CR 3 TR 4 CH 4 NT C- 1/2 voff=0 +BA 24 CR 3+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 29 CR 0 TR 4 CH 4 NT C- 1/2 voff=0 +BA 29 CR 1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 29 CR 1 TR 4 CH 4 NT C- 1/2 voff=0 +BA 29 CR 1+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 29 CR 2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 29 CR 2+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 29 CR 3 TR 4 CH 4 NT C- 1/2 voff=0 +BA 29 CR 3+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 30 CR 0 TR 4 CH 4 NT F- 1/2 voff=0 +BA 30 CR 1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 30 CR 1 TR 4 CH 4 NT G- 1/2 voff=0 +BA 30 CR 1+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 30 CR 2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 30 CR 2+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 30 CR 3 TR 4 CH 4 NT G- 1/2 voff=0 +BA 30 CR 3+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 31 CR 0 TR 4 CH 4 NT C- 1/2 voff=0 +BA 31 CR 1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 31 CR 1 TR 4 CH 4 NT C- 1/2 voff=0 +BA 31 CR 1+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 31 CR 2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 31 CR 2+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 31 CR 3 TR 4 CH 4 NT C- 1/2 voff=0 +BA 31 CR 3+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 32 CR 0 TR 4 CH 4 NT F- 1/2 voff=0 +BA 32 CR 1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 32 CR 1 TR 4 CH 4 NT G- 1/2 voff=0 +BA 32 CR 1+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 32 CR 2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 32 CR 2+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 32 CR 3 TR 4 CH 4 NT G- 1/2 voff=0 +BA 32 CR 3+1/2 TR 4 CH 4 NT G- 
1/2 voff=0 +BA 33 CR 0 TR 4 CH 4 NT C- 1/2 voff=0 +BA 33 CR 1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 33 CR 1 TR 4 CH 4 NT C- 1/2 voff=0 +BA 33 CR 1+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 33 CR 2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 33 CR 2+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 33 CR 3 TR 4 CH 4 NT C- 1/2 voff=0 +BA 33 CR 3+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 34 CR 0 TR 4 CH 4 NT C- 1/2 voff=0 +BA 34 CR 1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 34 CR 1 TR 4 CH 4 NT C- 1/2 voff=0 +BA 34 CR 1+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 34 CR 2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 34 CR 2+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 34 CR 3 TR 4 CH 4 NT C- 1/2 voff=0 +BA 34 CR 3+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 35 CR 0 TR 4 CH 4 NT C- 1/2 voff=0 +BA 35 CR 1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 35 CR 1 TR 4 CH 4 NT C- 1/2 voff=0 +BA 35 CR 1+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 35 CR 2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 35 CR 2+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 35 CR 3 TR 4 CH 4 NT C- 1/2 voff=0 +BA 35 CR 3+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 36 CR 0 TR 4 CH 4 NT C- 1/2 voff=0 +BA 36 CR 1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 36 CR 1 TR 4 CH 4 NT C- 1/2 voff=0 +BA 36 CR 1+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 36 CR 2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 36 CR 2+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 36 CR 3 TR 4 CH 4 NT C- 1/2 voff=0 +BA 36 CR 3+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 1 CR 0 TR 5 CH 5 Instrument 69 +BA 1 CR 0 TR 5 CH 5 NT R 0 von=127 voff=0 +BA 5 CR 0 TR 5 CH 5 NT C- 1/2 voff=0 +BA 5 CR 0 TR 5 CH 5 NT D- 1/2 voff=0 +BA 5 CR 0 TR 5 CH 5 NT E- 1/2 voff=0 +BA 5 CR 1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 5 CR 1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 5 CR 1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 5 CR 1 TR 5 CH 5 NT C- 1/2 voff=0 +BA 5 CR 1 TR 5 CH 5 NT D- 1/2 voff=0 +BA 5 CR 1 TR 5 CH 5 NT E- 1/2 voff=0 +BA 5 CR 1+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 5 CR 1+1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 5 CR 1+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 5 CR 2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 5 CR 2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 5 CR 2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 5 CR 2+1/2 TR 
5 CH 5 NT C- 1/2 voff=0 +BA 5 CR 2+1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 5 CR 2+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 5 CR 3 TR 5 CH 5 NT C- 1/2 voff=0 +BA 5 CR 3 TR 5 CH 5 NT D- 1/2 voff=0 +BA 5 CR 3 TR 5 CH 5 NT E- 1/2 voff=0 +BA 5 CR 3+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 5 CR 3+1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 5 CR 3+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 6 CR 0 TR 5 CH 5 NT C- 1/2 voff=0 +BA 6 CR 0 TR 5 CH 5 NT D- 1/2 voff=0 +BA 6 CR 0 TR 5 CH 5 NT E- 1/2 voff=0 +BA 6 CR 1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 6 CR 1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 6 CR 1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 6 CR 1 TR 5 CH 5 NT C- 1/2 voff=0 +BA 6 CR 1 TR 5 CH 5 NT D- 1/2 voff=0 +BA 6 CR 1 TR 5 CH 5 NT E- 1/2 voff=0 +BA 6 CR 1+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 6 CR 1+1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 6 CR 1+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 6 CR 2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 6 CR 2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 6 CR 2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 6 CR 2+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 6 CR 2+1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 6 CR 2+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 6 CR 3 TR 5 CH 5 NT C- 1/2 voff=0 +BA 6 CR 3 TR 5 CH 5 NT D- 1/2 voff=0 +BA 6 CR 3 TR 5 CH 5 NT E- 1/2 voff=0 +BA 6 CR 3+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 6 CR 3+1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 6 CR 3+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 7 CR 0 TR 5 CH 5 NT C- 1/2 voff=0 +BA 7 CR 0 TR 5 CH 5 NT D- 1/2 voff=0 +BA 7 CR 0 TR 5 CH 5 NT E- 1/2 voff=0 +BA 7 CR 1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 7 CR 1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 7 CR 1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 7 CR 1 TR 5 CH 5 NT C- 1/2 voff=0 +BA 7 CR 1 TR 5 CH 5 NT D- 1/2 voff=0 +BA 7 CR 1 TR 5 CH 5 NT E- 1/2 voff=0 +BA 7 CR 1+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 7 CR 1+1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 7 CR 1+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 7 CR 2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 7 CR 2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 7 CR 2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 7 CR 2+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 7 CR 2+1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 7 CR 2+1/2 TR 5 CH 5 
NT E- 1/2 voff=0 +BA 7 CR 3 TR 5 CH 5 NT C- 1/2 voff=0 +BA 7 CR 3 TR 5 CH 5 NT D- 1/2 voff=0 +BA 7 CR 3 TR 5 CH 5 NT E- 1/2 voff=0 +BA 7 CR 3+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 7 CR 3+1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 7 CR 3+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 8 CR 0 TR 5 CH 5 NT C- 1/2 voff=0 +BA 8 CR 0 TR 5 CH 5 NT D- 1/2 voff=0 +BA 8 CR 0 TR 5 CH 5 NT E- 1/2 voff=0 +BA 8 CR 1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 8 CR 1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 8 CR 1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 8 CR 1 TR 5 CH 5 NT C- 1/2 voff=0 +BA 8 CR 1 TR 5 CH 5 NT D- 1/2 voff=0 +BA 8 CR 1 TR 5 CH 5 NT E- 1/2 voff=0 +BA 8 CR 1+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 8 CR 1+1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 8 CR 1+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 8 CR 2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 8 CR 2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 8 CR 2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 8 CR 2+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 8 CR 2+1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 8 CR 2+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 8 CR 3 TR 5 CH 5 NT C- 1/2 voff=0 +BA 8 CR 3 TR 5 CH 5 NT D- 1/2 voff=0 +BA 8 CR 3 TR 5 CH 5 NT E- 1/2 voff=0 +BA 8 CR 3+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 8 CR 3+1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 8 CR 3+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 17 CR 0 TR 5 CH 5 NT C- 1/2 voff=0 +BA 17 CR 0 TR 5 CH 5 NT D- 1/2 voff=0 +BA 17 CR 0 TR 5 CH 5 NT E- 1/2 voff=0 +BA 17 CR 1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 17 CR 1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 17 CR 1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 17 CR 1 TR 5 CH 5 NT C- 1/2 voff=0 +BA 17 CR 1 TR 5 CH 5 NT D- 1/2 voff=0 +BA 17 CR 1 TR 5 CH 5 NT E- 1/2 voff=0 +BA 17 CR 1+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 17 CR 1+1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 17 CR 1+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 17 CR 2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 17 CR 2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 17 CR 2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 17 CR 2+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 17 CR 2+1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 17 CR 2+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 17 CR 3 TR 5 CH 5 NT C- 1/2 voff=0 +BA 17 CR 3 TR 5 
CH 5 NT D- 1/2 voff=0 +BA 17 CR 3 TR 5 CH 5 NT E- 1/2 voff=0 +BA 17 CR 3+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 17 CR 3+1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 17 CR 3+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 18 CR 0 TR 5 CH 5 NT F- 1/2 voff=0 +BA 18 CR 0 TR 5 CH 5 NT G- 1/2 voff=0 +BA 18 CR 0 TR 5 CH 5 NT A- 1/2 voff=0 +BA 18 CR 1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 18 CR 1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 18 CR 1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 18 CR 1 TR 5 CH 5 NT G- 1/2 voff=0 +BA 18 CR 1 TR 5 CH 5 NT A- 1/2 voff=0 +BA 18 CR 1 TR 5 CH 5 NT B- 1/2 voff=0 +BA 18 CR 1+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 18 CR 1+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 18 CR 1+1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 18 CR 2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 18 CR 2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 18 CR 2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 18 CR 2+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 18 CR 2+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 18 CR 2+1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 18 CR 3 TR 5 CH 5 NT G- 1/2 voff=0 +BA 18 CR 3 TR 5 CH 5 NT A- 1/2 voff=0 +BA 18 CR 3 TR 5 CH 5 NT B- 1/2 voff=0 +BA 18 CR 3+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 18 CR 3+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 18 CR 3+1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 19 CR 0 TR 5 CH 5 NT C- 1/2 voff=0 +BA 19 CR 0 TR 5 CH 5 NT D- 1/2 voff=0 +BA 19 CR 0 TR 5 CH 5 NT E- 1/2 voff=0 +BA 19 CR 1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 19 CR 1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 19 CR 1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 19 CR 1 TR 5 CH 5 NT C- 1/2 voff=0 +BA 19 CR 1 TR 5 CH 5 NT D- 1/2 voff=0 +BA 19 CR 1 TR 5 CH 5 NT E- 1/2 voff=0 +BA 19 CR 1+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 19 CR 1+1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 19 CR 1+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 19 CR 2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 19 CR 2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 19 CR 2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 19 CR 2+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 19 CR 2+1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 19 CR 2+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 19 CR 3 TR 5 CH 5 NT C- 1/2 voff=0 +BA 19 CR 3 TR 5 CH 5 NT D- 1/2 voff=0 +BA 19 CR 3 TR 5 CH 5 
NT E- 1/2 voff=0 +BA 19 CR 3+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 19 CR 3+1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 19 CR 3+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 20 CR 0 TR 5 CH 5 NT F- 1/2 voff=0 +BA 20 CR 0 TR 5 CH 5 NT G- 1/2 voff=0 +BA 20 CR 0 TR 5 CH 5 NT A- 1/2 voff=0 +BA 20 CR 1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 20 CR 1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 20 CR 1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 20 CR 1 TR 5 CH 5 NT G- 1/2 voff=0 +BA 20 CR 1 TR 5 CH 5 NT A- 1/2 voff=0 +BA 20 CR 1 TR 5 CH 5 NT B- 1/2 voff=0 +BA 20 CR 1+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 20 CR 1+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 20 CR 1+1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 20 CR 2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 20 CR 2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 20 CR 2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 20 CR 2+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 20 CR 2+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 20 CR 2+1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 20 CR 3 TR 5 CH 5 NT G- 1/2 voff=0 +BA 20 CR 3 TR 5 CH 5 NT A- 1/2 voff=0 +BA 20 CR 3 TR 5 CH 5 NT B- 1/2 voff=0 +BA 20 CR 3+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 20 CR 3+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 20 CR 3+1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 21 CR 0 TR 5 CH 5 NT G- 1/2 voff=0 +BA 21 CR 0 TR 5 CH 5 NT A- 1/2 voff=0 +BA 21 CR 0 TR 5 CH 5 NT B- 1/2 voff=0 +BA 21 CR 1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 21 CR 1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 21 CR 1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 21 CR 1 TR 5 CH 5 NT A- 1/2 voff=0 +BA 21 CR 1 TR 5 CH 5 NT B- 1/2 voff=0 +BA 21 CR 1 TR 5 CH 5 NT C- 1/2 voff=0 +BA 21 CR 1+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 21 CR 1+1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 21 CR 1+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 21 CR 2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 21 CR 2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 21 CR 2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 21 CR 2+1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 21 CR 2+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 21 CR 2+1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 21 CR 3 TR 5 CH 5 NT D- 1/2 voff=0 +BA 21 CR 3 TR 5 CH 5 NT E- 1/2 voff=0 +BA 21 CR 3 TR 5 CH 5 NT F- 1/2 voff=0 +BA 21 CR 3+1/2 TR 5 CH 5 
NT D- 1/2 voff=0 +BA 21 CR 3+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 21 CR 3+1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 22 CR 0 TR 5 CH 5 NT F- 1/2 voff=0 +BA 22 CR 0 TR 5 CH 5 NT G- 1/2 voff=0 +BA 22 CR 0 TR 5 CH 5 NT A- 1/2 voff=0 +BA 22 CR 1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 22 CR 1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 22 CR 1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 22 CR 1 TR 5 CH 5 NT A#- 1/2 voff=0 +BA 22 CR 1 TR 5 CH 5 NT C- 1/2 voff=0 +BA 22 CR 1 TR 5 CH 5 NT D- 1/2 voff=0 +BA 22 CR 1+1/2 TR 5 CH 5 NT A#- 1/2 voff=0 +BA 22 CR 1+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 22 CR 1+1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 22 CR 2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 22 CR 2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 22 CR 2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 22 CR 2+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 22 CR 2+1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 22 CR 2+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 22 CR 3 TR 5 CH 5 NT C- 1/2 voff=0 +BA 22 CR 3 TR 5 CH 5 NT D- 1/2 voff=0 +BA 22 CR 3 TR 5 CH 5 NT E- 1/2 voff=0 +BA 22 CR 3+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 22 CR 3+1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 22 CR 3+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 23 CR 0 TR 5 CH 5 NT G- 1/2 voff=0 +BA 23 CR 0 TR 5 CH 5 NT A- 1/2 voff=0 +BA 23 CR 0 TR 5 CH 5 NT B- 1/2 voff=0 +BA 23 CR 1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 23 CR 1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 23 CR 1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 23 CR 1 TR 5 CH 5 NT A- 1/2 voff=0 +BA 23 CR 1 TR 5 CH 5 NT B- 1/2 voff=0 +BA 23 CR 1 TR 5 CH 5 NT C- 1/2 voff=0 +BA 23 CR 1+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 23 CR 1+1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 23 CR 1+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 23 CR 2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 23 CR 2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 23 CR 2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 23 CR 2+1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 23 CR 2+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 23 CR 2+1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 23 CR 3 TR 5 CH 5 NT D- 1/2 voff=0 +BA 23 CR 3 TR 5 CH 5 NT E- 1/2 voff=0 +BA 23 CR 3 TR 5 CH 5 NT F- 1/2 voff=0 +BA 23 CR 3+1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 23 CR 3+1/2 TR 5 CH 
5 NT E- 1/2 voff=0 +BA 23 CR 3+1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 24 CR 0 TR 5 CH 5 NT F- 1/2 voff=0 +BA 24 CR 0 TR 5 CH 5 NT G- 1/2 voff=0 +BA 24 CR 0 TR 5 CH 5 NT A- 1/2 voff=0 +BA 24 CR 1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 24 CR 1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 24 CR 1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 24 CR 1 TR 5 CH 5 NT A#- 1/2 voff=0 +BA 24 CR 1 TR 5 CH 5 NT C- 1/2 voff=0 +BA 24 CR 1 TR 5 CH 5 NT D- 1/2 voff=0 +BA 24 CR 1+1/2 TR 5 CH 5 NT A#- 1/2 voff=0 +BA 24 CR 1+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 24 CR 1+1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 24 CR 2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 24 CR 2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 24 CR 2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 24 CR 2+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 24 CR 2+1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 24 CR 2+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 24 CR 3 TR 5 CH 5 NT C- 1/2 voff=0 +BA 24 CR 3 TR 5 CH 5 NT D- 1/2 voff=0 +BA 24 CR 3 TR 5 CH 5 NT E- 1/2 voff=0 +BA 24 CR 3+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 24 CR 3+1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 24 CR 3+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 25 CR 0 TR 5 CH 5 NT C- 1/2 voff=0 +BA 25 CR 0 TR 5 CH 5 NT D- 1/2 voff=0 +BA 25 CR 0 TR 5 CH 5 NT E- 1/2 voff=0 +BA 25 CR 1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 25 CR 1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 25 CR 1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 25 CR 1 TR 5 CH 5 NT C- 1/2 voff=0 +BA 25 CR 1 TR 5 CH 5 NT D- 1/2 voff=0 +BA 25 CR 1 TR 5 CH 5 NT E- 1/2 voff=0 +BA 25 CR 1+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 25 CR 1+1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 25 CR 1+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 25 CR 2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 25 CR 2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 25 CR 2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 25 CR 2+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 25 CR 2+1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 25 CR 2+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 25 CR 3 TR 5 CH 5 NT C- 1/2 voff=0 +BA 25 CR 3 TR 5 CH 5 NT D- 1/2 voff=0 +BA 25 CR 3 TR 5 CH 5 NT E- 1/2 voff=0 +BA 25 CR 3+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 25 CR 3+1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 25 CR 3+1/2 TR 5 
CH 5 NT E- 1/2 voff=0 +BA 26 CR 0 TR 5 CH 5 NT D- 1/2 voff=0 +BA 26 CR 0 TR 5 CH 5 NT E- 1/2 voff=0 +BA 26 CR 0 TR 5 CH 5 NT F- 1/2 voff=0 +BA 26 CR 1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 26 CR 1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 26 CR 1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 26 CR 1 TR 5 CH 5 NT E- 1/2 voff=0 +BA 26 CR 1 TR 5 CH 5 NT F- 1/2 voff=0 +BA 26 CR 1 TR 5 CH 5 NT G- 1/2 voff=0 +BA 26 CR 1+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 26 CR 1+1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 26 CR 1+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 26 CR 2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 26 CR 2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 26 CR 2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 26 CR 2+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 26 CR 2+1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 26 CR 2+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 26 CR 3 TR 5 CH 5 NT E- 1/2 voff=0 +BA 26 CR 3 TR 5 CH 5 NT F- 1/2 voff=0 +BA 26 CR 3 TR 5 CH 5 NT G- 1/2 voff=0 +BA 26 CR 3+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 26 CR 3+1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 26 CR 3+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 27 CR 0 TR 5 CH 5 NT C- 1/2 voff=0 +BA 27 CR 0 TR 5 CH 5 NT D- 1/2 voff=0 +BA 27 CR 0 TR 5 CH 5 NT E- 1/2 voff=0 +BA 27 CR 1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 27 CR 1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 27 CR 1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 27 CR 1 TR 5 CH 5 NT C- 1/2 voff=0 +BA 27 CR 1 TR 5 CH 5 NT D- 1/2 voff=0 +BA 27 CR 1 TR 5 CH 5 NT E- 1/2 voff=0 +BA 27 CR 1+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 27 CR 1+1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 27 CR 1+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 27 CR 2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 27 CR 2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 27 CR 2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 27 CR 2+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 27 CR 2+1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 27 CR 2+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 27 CR 3 TR 5 CH 5 NT C- 1/2 voff=0 +BA 27 CR 3 TR 5 CH 5 NT D- 1/2 voff=0 +BA 27 CR 3 TR 5 CH 5 NT E- 1/2 voff=0 +BA 27 CR 3+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 27 CR 3+1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 27 CR 3+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 28 CR 0 TR 5 CH 5 
NT D- 1/2 voff=0 +BA 28 CR 0 TR 5 CH 5 NT E- 1/2 voff=0 +BA 28 CR 0 TR 5 CH 5 NT F- 1/2 voff=0 +BA 28 CR 1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 28 CR 1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 28 CR 1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 28 CR 1 TR 5 CH 5 NT E- 1/2 voff=0 +BA 28 CR 1 TR 5 CH 5 NT F- 1/2 voff=0 +BA 28 CR 1 TR 5 CH 5 NT G- 1/2 voff=0 +BA 28 CR 1+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 28 CR 1+1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 28 CR 1+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 28 CR 2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 28 CR 2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 28 CR 2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 28 CR 2+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 28 CR 2+1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 28 CR 2+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 28 CR 3 TR 5 CH 5 NT E- 1/2 voff=0 +BA 28 CR 3 TR 5 CH 5 NT F- 1/2 voff=0 +BA 28 CR 3 TR 5 CH 5 NT G- 1/2 voff=0 +BA 28 CR 3+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 28 CR 3+1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 28 CR 3+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 29 CR 0 TR 5 CH 5 NT C- 1/2 voff=0 +BA 29 CR 0 TR 5 CH 5 NT D- 1/2 voff=0 +BA 29 CR 0 TR 5 CH 5 NT E- 1/2 voff=0 +BA 29 CR 1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 29 CR 1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 29 CR 1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 29 CR 1 TR 5 CH 5 NT C- 1/2 voff=0 +BA 29 CR 1 TR 5 CH 5 NT D- 1/2 voff=0 +BA 29 CR 1 TR 5 CH 5 NT E- 1/2 voff=0 +BA 29 CR 1+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 29 CR 1+1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 29 CR 1+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 29 CR 2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 29 CR 2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 29 CR 2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 29 CR 2+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 29 CR 2+1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 29 CR 2+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 29 CR 3 TR 5 CH 5 NT C- 1/2 voff=0 +BA 29 CR 3 TR 5 CH 5 NT D- 1/2 voff=0 +BA 29 CR 3 TR 5 CH 5 NT E- 1/2 voff=0 +BA 29 CR 3+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 29 CR 3+1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 29 CR 3+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 30 CR 0 TR 5 CH 5 NT F- 1/2 voff=0 +BA 30 CR 0 TR 5 CH 5 NT 
G- 1/2 voff=0 +BA 30 CR 0 TR 5 CH 5 NT A- 1/2 voff=0 +BA 30 CR 1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 30 CR 1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 30 CR 1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 30 CR 1 TR 5 CH 5 NT G- 1/2 voff=0 +BA 30 CR 1 TR 5 CH 5 NT A- 1/2 voff=0 +BA 30 CR 1 TR 5 CH 5 NT B- 1/2 voff=0 +BA 30 CR 1+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 30 CR 1+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 30 CR 1+1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 30 CR 2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 30 CR 2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 30 CR 2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 30 CR 2+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 30 CR 2+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 30 CR 2+1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 30 CR 3 TR 5 CH 5 NT G- 1/2 voff=0 +BA 30 CR 3 TR 5 CH 5 NT A- 1/2 voff=0 +BA 30 CR 3 TR 5 CH 5 NT B- 1/2 voff=0 +BA 30 CR 3+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 30 CR 3+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 30 CR 3+1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 31 CR 0 TR 5 CH 5 NT C- 1/2 voff=0 +BA 31 CR 0 TR 5 CH 5 NT D- 1/2 voff=0 +BA 31 CR 0 TR 5 CH 5 NT E- 1/2 voff=0 +BA 31 CR 1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 31 CR 1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 31 CR 1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 31 CR 1 TR 5 CH 5 NT C- 1/2 voff=0 +BA 31 CR 1 TR 5 CH 5 NT D- 1/2 voff=0 +BA 31 CR 1 TR 5 CH 5 NT E- 1/2 voff=0 +BA 31 CR 1+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 31 CR 1+1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 31 CR 1+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 31 CR 2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 31 CR 2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 31 CR 2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 31 CR 2+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 31 CR 2+1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 31 CR 2+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 31 CR 3 TR 5 CH 5 NT C- 1/2 voff=0 +BA 31 CR 3 TR 5 CH 5 NT D- 1/2 voff=0 +BA 31 CR 3 TR 5 CH 5 NT E- 1/2 voff=0 +BA 31 CR 3+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 31 CR 3+1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 31 CR 3+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 32 CR 0 TR 5 CH 5 NT F- 1/2 voff=0 +BA 32 CR 0 TR 5 CH 5 NT G- 1/2 voff=0 +BA 32 CR 0 TR 5 CH 5 NT A- 
1/2 voff=0 +BA 32 CR 1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 32 CR 1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 32 CR 1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 32 CR 1 TR 5 CH 5 NT G- 1/2 voff=0 +BA 32 CR 1 TR 5 CH 5 NT A- 1/2 voff=0 +BA 32 CR 1 TR 5 CH 5 NT B- 1/2 voff=0 +BA 32 CR 1+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 32 CR 1+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 32 CR 1+1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 32 CR 2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 32 CR 2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 32 CR 2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 32 CR 2+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 32 CR 2+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 32 CR 2+1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 32 CR 3 TR 5 CH 5 NT G- 1/2 voff=0 +BA 32 CR 3 TR 5 CH 5 NT A- 1/2 voff=0 +BA 32 CR 3 TR 5 CH 5 NT B- 1/2 voff=0 +BA 32 CR 3+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 32 CR 3+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 32 CR 3+1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 33 CR 0 TR 5 CH 5 NT C- 1/2 voff=0 +BA 33 CR 0 TR 5 CH 5 NT D- 1/2 voff=0 +BA 33 CR 0 TR 5 CH 5 NT E- 1/2 voff=0 +BA 33 CR 1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 33 CR 1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 33 CR 1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 33 CR 1 TR 5 CH 5 NT C- 1/2 voff=0 +BA 33 CR 1 TR 5 CH 5 NT D- 1/2 voff=0 +BA 33 CR 1 TR 5 CH 5 NT E- 1/2 voff=0 +BA 33 CR 1+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 33 CR 1+1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 33 CR 1+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 33 CR 2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 33 CR 2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 33 CR 2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 33 CR 2+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 33 CR 2+1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 33 CR 2+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 33 CR 3 TR 5 CH 5 NT C- 1/2 voff=0 +BA 33 CR 3 TR 5 CH 5 NT D- 1/2 voff=0 +BA 33 CR 3 TR 5 CH 5 NT E- 1/2 voff=0 +BA 33 CR 3+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 33 CR 3+1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 33 CR 3+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 34 CR 0 TR 5 CH 5 NT C- 1/2 voff=0 +BA 34 CR 0 TR 5 CH 5 NT D- 1/2 voff=0 +BA 34 CR 0 TR 5 CH 5 NT E- 1/2 voff=0 +BA 34 CR 1/2 TR 5 CH 5 NT C- 
1/2 voff=0 +BA 34 CR 1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 34 CR 1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 34 CR 1 TR 5 CH 5 NT C- 1/2 voff=0 +BA 34 CR 1 TR 5 CH 5 NT D- 1/2 voff=0 +BA 34 CR 1 TR 5 CH 5 NT E- 1/2 voff=0 +BA 34 CR 1+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 34 CR 1+1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 34 CR 1+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 34 CR 2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 34 CR 2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 34 CR 2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 34 CR 2+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 34 CR 2+1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 34 CR 2+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 34 CR 3 TR 5 CH 5 NT C- 1/2 voff=0 +BA 34 CR 3 TR 5 CH 5 NT D- 1/2 voff=0 +BA 34 CR 3 TR 5 CH 5 NT E- 1/2 voff=0 +BA 34 CR 3+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 34 CR 3+1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 34 CR 3+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 35 CR 0 TR 5 CH 5 NT C- 1/2 voff=0 +BA 35 CR 0 TR 5 CH 5 NT D- 1/2 voff=0 +BA 35 CR 0 TR 5 CH 5 NT E- 1/2 voff=0 +BA 35 CR 1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 35 CR 1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 35 CR 1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 35 CR 1 TR 5 CH 5 NT C- 1/2 voff=0 +BA 35 CR 1 TR 5 CH 5 NT D- 1/2 voff=0 +BA 35 CR 1 TR 5 CH 5 NT E- 1/2 voff=0 +BA 35 CR 1+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 35 CR 1+1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 35 CR 1+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 35 CR 2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 35 CR 2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 35 CR 2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 35 CR 2+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 35 CR 2+1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 35 CR 2+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 35 CR 3 TR 5 CH 5 NT C- 1/2 voff=0 +BA 35 CR 3 TR 5 CH 5 NT D- 1/2 voff=0 +BA 35 CR 3 TR 5 CH 5 NT E- 1/2 voff=0 +BA 35 CR 3+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 35 CR 3+1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 35 CR 3+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 36 CR 0 TR 5 CH 5 NT C- 1/2 voff=0 +BA 36 CR 0 TR 5 CH 5 NT D- 1/2 voff=0 +BA 36 CR 0 TR 5 CH 5 NT E- 1/2 voff=0 +BA 36 CR 1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 36 CR 1/2 TR 5 CH 5 NT D- 
1/2 voff=0
+BA 36 CR 1/2 TR 5 CH 5 NT E- 1/2 voff=0
+BA 36 CR 1 TR 5 CH 5 NT C- 1/2 voff=0
+BA 36 CR 1 TR 5 CH 5 NT D- 1/2 voff=0
+BA 36 CR 1 TR 5 CH 5 NT E- 1/2 voff=0
+BA 36 CR 1+1/2 TR 5 CH 5 NT C- 1/2 voff=0
+BA 36 CR 1+1/2 TR 5 CH 5 NT D- 1/2 voff=0
+BA 36 CR 1+1/2 TR 5 CH 5 NT E- 1/2 voff=0
+BA 36 CR 2 TR 5 CH 5 NT C- 1/2 voff=0
+BA 36 CR 2 TR 5 CH 5 NT D- 1/2 voff=0
+BA 36 CR 2 TR 5 CH 5 NT E- 1/2 voff=0
+BA 36 CR 2+1/2 TR 5 CH 5 NT C- 1/2 voff=0
+BA 36 CR 2+1/2 TR 5 CH 5 NT D- 1/2 voff=0
+BA 36 CR 2+1/2 TR 5 CH 5 NT E- 1/2 voff=0
+BA 36 CR 3 TR 5 CH 5 NT C- 1/2 voff=0
+BA 36 CR 3 TR 5 CH 5 NT D- 1/2 voff=0
+BA 36 CR 3 TR 5 CH 5 NT E- 1/2 voff=0
+BA 36 CR 3+1/2 TR 5 CH 5 NT C- 1/2 voff=0
+BA 36 CR 3+1/2 TR 5 CH 5 NT D- 1/2 voff=0
+BA 36 CR 3+1/2 TR 5 CH 5 NT E- 1/2 voff=0
+BA 37 CR 0 TR 0 CH 16 End of track
+Le el moves to the comb in er.
+
+Le el moves to the comb in er.
+Count ur is in the draw er.
+The gramm ar writes on Dix ler.
+All el is in All el.
+
+Count ald is All ur.
+Ack ae loves Duk ee.
+The updat er reads on Count ard.
+The choos er finds Duk ee.
+
+Count ald has the spell er.
+Count ae makes All el.
+The spell er is Brae ur.
+Luc ess needs All ur.
+
+Duk ess loves the updat er.
+Brae ice lifts the ess ence.
+Ven an needs Luc ess.
+The runn er writes on C.
+
+
+format=1 tracks=5 division=384
+
+BA 1 CR 0 TR 0 CH 16 Text type 2: "Produced by Mind Reading Music Composer by Lucian Academy"
+BA 1 CR 0 TR 0 CH 16 Text type 3: "Don ice is in the ess ence."
+BA 1 CR 0 TR 0 CH 1 Channel volume 127 +BA 1 CR 0 TR 0 CH 16 Tempo 63.00009 +BA 1 CR 0 TR 1 CH 1 Instrument 1 +BA 1 CR 0 TR 1 CH 1 NT R 0 von=127 voff=0 +BA 5 CR 0 TR 1 CH 1 NT G- 1/2 voff=0 +BA 5 CR 1/2 TR 1 CH 1 NT G- 1/2 voff=0 +BA 5 CR 1 TR 1 CH 1 NT E- 1/2 voff=0 +BA 5 CR 1+1/2 TR 1 CH 1 NT E- 1/2 voff=0 +BA 5 CR 2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 5 CR 2+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 5 CR 3 TR 1 CH 1 NT C- 1/2 voff=0 +BA 6 CR 0 TR 1 CH 1 NT C- 1/2 voff=0 +BA 6 CR 1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 6 CR 1 TR 1 CH 1 NT C- 1/2 voff=0 +BA 6 CR 1+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 6 CR 2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 7 CR 0 TR 1 CH 1 NT G- 1/2 voff=0 +BA 7 CR 1/2 TR 1 CH 1 NT G- 1/2 voff=0 +BA 7 CR 1 TR 1 CH 1 NT E- 1/2 voff=0 +BA 7 CR 1+1/2 TR 1 CH 1 NT E- 1/2 voff=0 +BA 7 CR 2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 8 CR 0 TR 1 CH 1 NT C- 1/2 voff=0 +BA 8 CR 1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 8 CR 1 TR 1 CH 1 NT C- 1/2 voff=0 +BA 8 CR 1+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 8 CR 2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 8 CR 2+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 13 CR 0 TR 1 CH 1 NT G- 1/2 voff=0 +BA 13 CR 1/2 TR 1 CH 1 NT G- 1/2 voff=0 +BA 13 CR 1 TR 1 CH 1 NT E- 1/2 voff=0 +BA 13 CR 1+1/2 TR 1 CH 1 NT E- 1/2 voff=0 +BA 13 CR 2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 13 CR 2+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 13 CR 3 TR 1 CH 1 NT C- 1/2 voff=0 +BA 14 CR 0 TR 1 CH 1 NT C- 1/2 voff=0 +BA 14 CR 1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 14 CR 1 TR 1 CH 1 NT C- 1/2 voff=0 +BA 14 CR 1+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 14 CR 2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 14 CR 2+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 15 CR 0 TR 1 CH 1 NT G- 1/2 voff=0 +BA 15 CR 1/2 TR 1 CH 1 NT G- 1/2 voff=0 +BA 15 CR 1 TR 1 CH 1 NT E- 1/2 voff=0 +BA 15 CR 1+1/2 TR 1 CH 1 NT E- 1/2 voff=0 +BA 15 CR 2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 16 CR 0 TR 1 CH 1 NT C- 1/2 voff=0 +BA 16 CR 1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 16 CR 1 TR 1 CH 1 NT C- 1/2 voff=0 +BA 16 CR 1+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 16 CR 2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 17 CR 
0 TR 1 CH 1 NT F- 1/2 voff=0 +BA 17 CR 1/2 TR 1 CH 1 NT F- 1/2 voff=0 +BA 17 CR 1 TR 1 CH 1 NT G- 1/2 voff=0 +BA 17 CR 1+1/2 TR 1 CH 1 NT G- 1/2 voff=0 +BA 17 CR 2 TR 1 CH 1 NT A- 1/2 voff=0 +BA 17 CR 2+1/2 TR 1 CH 1 NT A- 1/2 voff=0 +BA 17 CR 3 TR 1 CH 1 NT A- 1/2 voff=0 +BA 18 CR 0 TR 1 CH 1 NT D- 1/2 voff=0 +BA 18 CR 1/2 TR 1 CH 1 NT D- 1/2 voff=0 +BA 18 CR 1 TR 1 CH 1 NT B- 1/2 voff=0 +BA 18 CR 1+1/2 TR 1 CH 1 NT B- 1/2 voff=0 +BA 18 CR 2 TR 1 CH 1 NT G- 1/2 voff=0 +BA 19 CR 0 TR 1 CH 1 NT F- 1/2 voff=0 +BA 19 CR 1/2 TR 1 CH 1 NT F- 1/2 voff=0 +BA 19 CR 1 TR 1 CH 1 NT G- 1/2 voff=0 +BA 19 CR 1+1/2 TR 1 CH 1 NT G- 1/2 voff=0 +BA 19 CR 2 TR 1 CH 1 NT A- 1/2 voff=0 +BA 20 CR 0 TR 1 CH 1 NT D- 1/2 voff=0 +BA 20 CR 1/2 TR 1 CH 1 NT D- 1/2 voff=0 +BA 20 CR 1 TR 1 CH 1 NT B- 1/2 voff=0 +BA 20 CR 1+1/2 TR 1 CH 1 NT B- 1/2 voff=0 +BA 20 CR 2 TR 1 CH 1 NT G- 1/2 voff=0 +BA 20 CR 2+1/2 TR 1 CH 1 NT G- 1/2 voff=0 +BA 25 CR 0 TR 1 CH 1 NT G- 1/2 voff=0 +BA 25 CR 1/2 TR 1 CH 1 NT G- 1/2 voff=0 +BA 25 CR 1 TR 1 CH 1 NT E- 1/2 voff=0 +BA 25 CR 1+1/2 TR 1 CH 1 NT E- 1/2 voff=0 +BA 25 CR 2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 25 CR 2+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 26 CR 0 TR 1 CH 1 NT F- 1/2 voff=0 +BA 26 CR 1/2 TR 1 CH 1 NT F- 1/2 voff=0 +BA 26 CR 1 TR 1 CH 1 NT F- 1/2 voff=0 +BA 26 CR 1+1/2 TR 1 CH 1 NT F- 1/2 voff=0 +BA 26 CR 2 TR 1 CH 1 NT F- 1/2 voff=0 +BA 26 CR 2+1/2 TR 1 CH 1 NT F- 1/2 voff=0 +BA 27 CR 0 TR 1 CH 1 NT G- 1/2 voff=0 +BA 27 CR 1/2 TR 1 CH 1 NT G- 1/2 voff=0 +BA 27 CR 1 TR 1 CH 1 NT E- 1/2 voff=0 +BA 27 CR 1+1/2 TR 1 CH 1 NT E- 1/2 voff=0 +BA 27 CR 2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 27 CR 2+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 28 CR 0 TR 1 CH 1 NT F- 1/2 voff=0 +BA 28 CR 1/2 TR 1 CH 1 NT F- 1/2 voff=0 +BA 28 CR 1 TR 1 CH 1 NT F- 1/2 voff=0 +BA 28 CR 1+1/2 TR 1 CH 1 NT F- 1/2 voff=0 +BA 28 CR 2 TR 1 CH 1 NT F- 1/2 voff=0 +BA 28 CR 2+1/2 TR 1 CH 1 NT F- 1/2 voff=0 +BA 28 CR 3 TR 1 CH 1 NT F- 1/2 voff=0 +BA 29 CR 0 TR 1 CH 1 NT F- 1/2 voff=0 +BA 29 CR 1/2 TR 1 CH 1 NT 
F- 1/2 voff=0 +BA 29 CR 1 TR 1 CH 1 NT G- 1/2 voff=0 +BA 29 CR 1+1/2 TR 1 CH 1 NT G- 1/2 voff=0 +BA 29 CR 2 TR 1 CH 1 NT A- 1/2 voff=0 +BA 29 CR 2+1/2 TR 1 CH 1 NT A- 1/2 voff=0 +BA 29 CR 3 TR 1 CH 1 NT A- 1/2 voff=0 +BA 30 CR 0 TR 1 CH 1 NT D- 1/2 voff=0 +BA 30 CR 1/2 TR 1 CH 1 NT D- 1/2 voff=0 +BA 30 CR 1 TR 1 CH 1 NT B- 1/2 voff=0 +BA 30 CR 1+1/2 TR 1 CH 1 NT B- 1/2 voff=0 +BA 30 CR 2 TR 1 CH 1 NT G- 1/2 voff=0 +BA 31 CR 0 TR 1 CH 1 NT F- 1/2 voff=0 +BA 31 CR 1/2 TR 1 CH 1 NT F- 1/2 voff=0 +BA 31 CR 1 TR 1 CH 1 NT G- 1/2 voff=0 +BA 31 CR 1+1/2 TR 1 CH 1 NT G- 1/2 voff=0 +BA 31 CR 2 TR 1 CH 1 NT A- 1/2 voff=0 +BA 32 CR 0 TR 1 CH 1 NT D- 1/2 voff=0 +BA 32 CR 1/2 TR 1 CH 1 NT D- 1/2 voff=0 +BA 32 CR 1 TR 1 CH 1 NT B- 1/2 voff=0 +BA 32 CR 1+1/2 TR 1 CH 1 NT B- 1/2 voff=0 +BA 32 CR 2 TR 1 CH 1 NT G- 1/2 voff=0 +BA 32 CR 2+1/2 TR 1 CH 1 NT G- 1/2 voff=0 +BA 1 CR 0 TR 2 CH 2 Instrument 101 +BA 1 CR 0 TR 2 CH 2 NT R 0 von=127 voff=0 +BA 1 CR 0 TR 2 CH 2 NT C- 1/2 voff=0 +BA 1 CR 1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 1 CR 1 TR 2 CH 2 NT C- 1/2 voff=0 +BA 1 CR 1+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 1 CR 2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 1 CR 2+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 1 CR 3 TR 2 CH 2 NT C- 1/2 voff=0 +BA 1 CR 3+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 2 CR 0 TR 2 CH 2 NT C- 1/2 voff=0 +BA 2 CR 1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 2 CR 1 TR 2 CH 2 NT C- 1/2 voff=0 +BA 2 CR 1+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 2 CR 2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 2 CR 2+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 2 CR 3 TR 2 CH 2 NT C- 1/2 voff=0 +BA 2 CR 3+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 3 CR 0 TR 2 CH 2 NT C- 1/2 voff=0 +BA 3 CR 1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 3 CR 1 TR 2 CH 2 NT C- 1/2 voff=0 +BA 3 CR 1+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 3 CR 2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 3 CR 2+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 3 CR 3 TR 2 CH 2 NT C- 1/2 voff=0 +BA 3 CR 3+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 4 CR 0 TR 2 CH 2 NT C- 1/2 voff=0 +BA 4 CR 1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 4 CR 1 TR 2 CH 2 NT 
C- 1/2 voff=0 +BA 4 CR 1+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 4 CR 2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 4 CR 2+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 4 CR 3 TR 2 CH 2 NT C- 1/2 voff=0 +BA 4 CR 3+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 5 CR 0 TR 2 CH 2 NT G- 1/2 voff=0 +BA 5 CR 1/2 TR 2 CH 2 NT G- 1/2 voff=0 +BA 5 CR 1 TR 2 CH 2 NT E- 1/2 voff=0 +BA 5 CR 1+1/2 TR 2 CH 2 NT E- 1/2 voff=0 +BA 5 CR 2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 5 CR 2+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 5 CR 3 TR 2 CH 2 NT C- 1/2 voff=0 +BA 5 CR 3+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 6 CR 0 TR 2 CH 2 NT C- 1/2 voff=0 +BA 6 CR 1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 6 CR 1 TR 2 CH 2 NT C- 1/2 voff=0 +BA 6 CR 1+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 6 CR 2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 6 CR 2+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 6 CR 3 TR 2 CH 2 NT C- 1/2 voff=0 +BA 6 CR 3+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 7 CR 0 TR 2 CH 2 NT G- 1/2 voff=0 +BA 7 CR 1/2 TR 2 CH 2 NT G- 1/2 voff=0 +BA 7 CR 1 TR 2 CH 2 NT E- 1/2 voff=0 +BA 7 CR 1+1/2 TR 2 CH 2 NT E- 1/2 voff=0 +BA 7 CR 2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 7 CR 2+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 7 CR 3 TR 2 CH 2 NT C- 1/2 voff=0 +BA 7 CR 3+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 8 CR 0 TR 2 CH 2 NT C- 1/2 voff=0 +BA 8 CR 1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 8 CR 1 TR 2 CH 2 NT C- 1/2 voff=0 +BA 8 CR 1+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 8 CR 2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 8 CR 2+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 8 CR 3 TR 2 CH 2 NT C- 1/2 voff=0 +BA 8 CR 3+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 21 CR 0 TR 2 CH 2 NT G- 1/2 voff=0 +BA 21 CR 1/2 TR 2 CH 2 NT G- 1/2 voff=0 +BA 21 CR 1 TR 2 CH 2 NT D#- 1/2 voff=0 +BA 21 CR 1+1/2 TR 2 CH 2 NT D#- 1/2 voff=0 +BA 21 CR 2 TR 2 CH 2 NT F- 1/2 voff=0 +BA 21 CR 2+1/2 TR 2 CH 2 NT F- 1/2 voff=0 +BA 21 CR 3 TR 2 CH 2 NT F- 1/2 voff=0 +BA 21 CR 3+1/2 TR 2 CH 2 NT F- 1/2 voff=0 +BA 22 CR 0 TR 2 CH 2 NT A#- 1/2 voff=0 +BA 22 CR 1/2 TR 2 CH 2 NT A#- 1/2 voff=0 +BA 22 CR 1 TR 2 CH 2 NT G- 1/2 voff=0 +BA 22 CR 1+1/2 TR 2 CH 2 NT G- 1/2 voff=0 +BA 22 CR 2 TR 2 CH 
2 NT G- 1/2 voff=0 +BA 22 CR 2+1/2 TR 2 CH 2 NT G- 1/2 voff=0 +BA 22 CR 3 TR 2 CH 2 NT G- 1/2 voff=0 +BA 22 CR 3+1/2 TR 2 CH 2 NT G- 1/2 voff=0 +BA 23 CR 0 TR 2 CH 2 NT G- 1/2 voff=0 +BA 23 CR 1/2 TR 2 CH 2 NT G- 1/2 voff=0 +BA 23 CR 1 TR 2 CH 2 NT D#- 1/2 voff=0 +BA 23 CR 1+1/2 TR 2 CH 2 NT D#- 1/2 voff=0 +BA 23 CR 2 TR 2 CH 2 NT F- 1/2 voff=0 +BA 23 CR 2+1/2 TR 2 CH 2 NT F- 1/2 voff=0 +BA 23 CR 3 TR 2 CH 2 NT F- 1/2 voff=0 +BA 23 CR 3+1/2 TR 2 CH 2 NT F- 1/2 voff=0 +BA 24 CR 0 TR 2 CH 2 NT A#- 1/2 voff=0 +BA 24 CR 1/2 TR 2 CH 2 NT A#- 1/2 voff=0 +BA 24 CR 1 TR 2 CH 2 NT G- 1/2 voff=0 +BA 24 CR 1+1/2 TR 2 CH 2 NT G- 1/2 voff=0 +BA 24 CR 2 TR 2 CH 2 NT G- 1/2 voff=0 +BA 24 CR 2+1/2 TR 2 CH 2 NT G- 1/2 voff=0 +BA 24 CR 3 TR 2 CH 2 NT G- 1/2 voff=0 +BA 24 CR 3+1/2 TR 2 CH 2 NT G- 1/2 voff=0 +BA 1 CR 0 TR 3 CH 3 Instrument 69 +BA 1 CR 0 TR 3 CH 3 NT R 0 von=127 voff=0 +BA 1 CR 0 TR 3 CH 3 NT C- 1/2 voff=0 +BA 1 CR 0 TR 3 CH 3 NT E- 1/2 voff=0 +BA 1 CR 0 TR 3 CH 3 NT G- 1/2 voff=0 +BA 1 CR 1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 1 CR 1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 1 CR 1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 1 CR 1 TR 3 CH 3 NT C- 1/2 voff=0 +BA 1 CR 1 TR 3 CH 3 NT E- 1/2 voff=0 +BA 1 CR 1 TR 3 CH 3 NT G- 1/2 voff=0 +BA 1 CR 1+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 1 CR 1+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 1 CR 1+1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 1 CR 2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 1 CR 2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 1 CR 2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 1 CR 2+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 1 CR 2+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 1 CR 2+1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 1 CR 3 TR 3 CH 3 NT C- 1/2 voff=0 +BA 1 CR 3 TR 3 CH 3 NT E- 1/2 voff=0 +BA 1 CR 3 TR 3 CH 3 NT G- 1/2 voff=0 +BA 1 CR 3+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 1 CR 3+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 1 CR 3+1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 2 CR 0 TR 3 CH 3 NT C- 1/2 voff=0 +BA 2 CR 0 TR 3 CH 3 NT E- 1/2 voff=0 +BA 2 CR 0 TR 3 CH 3 NT G- 1/2 voff=0 +BA 2 CR 1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 2 CR 1/2 
TR 3 CH 3 NT E- 1/2 voff=0 +BA 2 CR 1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 2 CR 1 TR 3 CH 3 NT C- 1/2 voff=0 +BA 2 CR 1 TR 3 CH 3 NT E- 1/2 voff=0 +BA 2 CR 1 TR 3 CH 3 NT G- 1/2 voff=0 +BA 2 CR 1+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 2 CR 1+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 2 CR 1+1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 2 CR 2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 2 CR 2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 2 CR 2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 2 CR 2+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 2 CR 2+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 2 CR 2+1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 2 CR 3 TR 3 CH 3 NT C- 1/2 voff=0 +BA 2 CR 3 TR 3 CH 3 NT E- 1/2 voff=0 +BA 2 CR 3 TR 3 CH 3 NT G- 1/2 voff=0 +BA 2 CR 3+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 2 CR 3+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 2 CR 3+1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 3 CR 0 TR 3 CH 3 NT C- 1/2 voff=0 +BA 3 CR 0 TR 3 CH 3 NT E- 1/2 voff=0 +BA 3 CR 0 TR 3 CH 3 NT G- 1/2 voff=0 +BA 3 CR 1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 3 CR 1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 3 CR 1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 3 CR 1 TR 3 CH 3 NT C- 1/2 voff=0 +BA 3 CR 1 TR 3 CH 3 NT E- 1/2 voff=0 +BA 3 CR 1 TR 3 CH 3 NT G- 1/2 voff=0 +BA 3 CR 1+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 3 CR 1+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 3 CR 1+1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 3 CR 2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 3 CR 2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 3 CR 2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 3 CR 2+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 3 CR 2+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 3 CR 2+1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 3 CR 3 TR 3 CH 3 NT C- 1/2 voff=0 +BA 3 CR 3 TR 3 CH 3 NT E- 1/2 voff=0 +BA 3 CR 3 TR 3 CH 3 NT G- 1/2 voff=0 +BA 3 CR 3+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 3 CR 3+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 3 CR 3+1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 4 CR 0 TR 3 CH 3 NT C- 1/2 voff=0 +BA 4 CR 0 TR 3 CH 3 NT E- 1/2 voff=0 +BA 4 CR 0 TR 3 CH 3 NT G- 1/2 voff=0 +BA 4 CR 1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 4 CR 1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 4 CR 1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 4 CR 1 TR 3 CH 3 NT 
C- 1/2 voff=0 +BA 4 CR 1 TR 3 CH 3 NT E- 1/2 voff=0 +BA 4 CR 1 TR 3 CH 3 NT G- 1/2 voff=0 +BA 4 CR 1+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 4 CR 1+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 4 CR 1+1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 4 CR 2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 4 CR 2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 4 CR 2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 4 CR 2+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 4 CR 2+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 4 CR 2+1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 4 CR 3 TR 3 CH 3 NT C- 1/2 voff=0 +BA 4 CR 3 TR 3 CH 3 NT E- 1/2 voff=0 +BA 4 CR 3 TR 3 CH 3 NT G- 1/2 voff=0 +BA 4 CR 3+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 4 CR 3+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 4 CR 3+1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 5 CR 0 TR 3 CH 3 NT G- 1/2 voff=0 +BA 5 CR 0 TR 3 CH 3 NT B- 1/2 voff=0 +BA 5 CR 0 TR 3 CH 3 NT D- 1/2 voff=0 +BA 5 CR 1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 5 CR 1/2 TR 3 CH 3 NT B- 1/2 voff=0 +BA 5 CR 1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 5 CR 1 TR 3 CH 3 NT E- 1/2 voff=0 +BA 5 CR 1 TR 3 CH 3 NT G- 1/2 voff=0 +BA 5 CR 1 TR 3 CH 3 NT B- 1/2 voff=0 +BA 5 CR 1+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 5 CR 1+1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 5 CR 1+1/2 TR 3 CH 3 NT B- 1/2 voff=0 +BA 5 CR 2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 5 CR 2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 5 CR 2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 5 CR 2+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 5 CR 2+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 5 CR 2+1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 5 CR 3 TR 3 CH 3 NT C- 1/2 voff=0 +BA 5 CR 3 TR 3 CH 3 NT E- 1/2 voff=0 +BA 5 CR 3 TR 3 CH 3 NT G- 1/2 voff=0 +BA 5 CR 3+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 5 CR 3+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 5 CR 3+1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 6 CR 0 TR 3 CH 3 NT C- 1/2 voff=0 +BA 6 CR 0 TR 3 CH 3 NT E- 1/2 voff=0 +BA 6 CR 0 TR 3 CH 3 NT G- 1/2 voff=0 +BA 6 CR 1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 6 CR 1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 6 CR 1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 6 CR 1 TR 3 CH 3 NT C- 1/2 voff=0 +BA 6 CR 1 TR 3 CH 3 NT E- 1/2 voff=0 +BA 6 CR 1 TR 3 CH 3 NT G- 1/2 voff=0 
+BA 6 CR 1+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 6 CR 1+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 6 CR 1+1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 6 CR 2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 6 CR 2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 6 CR 2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 6 CR 2+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 6 CR 2+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 6 CR 2+1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 6 CR 3 TR 3 CH 3 NT C- 1/2 voff=0 +BA 6 CR 3 TR 3 CH 3 NT E- 1/2 voff=0 +BA 6 CR 3 TR 3 CH 3 NT G- 1/2 voff=0 +BA 6 CR 3+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 6 CR 3+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 6 CR 3+1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 7 CR 0 TR 3 CH 3 NT G- 1/2 voff=0 +BA 7 CR 0 TR 3 CH 3 NT B- 1/2 voff=0 +BA 7 CR 0 TR 3 CH 3 NT D- 1/2 voff=0 +BA 7 CR 1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 7 CR 1/2 TR 3 CH 3 NT B- 1/2 voff=0 +BA 7 CR 1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 7 CR 1 TR 3 CH 3 NT E- 1/2 voff=0 +BA 7 CR 1 TR 3 CH 3 NT G- 1/2 voff=0 +BA 7 CR 1 TR 3 CH 3 NT B- 1/2 voff=0 +BA 7 CR 1+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 7 CR 1+1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 7 CR 1+1/2 TR 3 CH 3 NT B- 1/2 voff=0 +BA 7 CR 2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 7 CR 2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 7 CR 2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 7 CR 2+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 7 CR 2+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 7 CR 2+1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 7 CR 3 TR 3 CH 3 NT C- 1/2 voff=0 +BA 7 CR 3 TR 3 CH 3 NT E- 1/2 voff=0 +BA 7 CR 3 TR 3 CH 3 NT G- 1/2 voff=0 +BA 7 CR 3+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 7 CR 3+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 7 CR 3+1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 8 CR 0 TR 3 CH 3 NT C- 1/2 voff=0 +BA 8 CR 0 TR 3 CH 3 NT E- 1/2 voff=0 +BA 8 CR 0 TR 3 CH 3 NT G- 1/2 voff=0 +BA 8 CR 1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 8 CR 1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 8 CR 1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 8 CR 1 TR 3 CH 3 NT C- 1/2 voff=0 +BA 8 CR 1 TR 3 CH 3 NT E- 1/2 voff=0 +BA 8 CR 1 TR 3 CH 3 NT G- 1/2 voff=0 +BA 8 CR 1+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 8 CR 1+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 8 
CR 1+1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 8 CR 2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 8 CR 2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 8 CR 2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 8 CR 2+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 8 CR 2+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 8 CR 2+1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 8 CR 3 TR 3 CH 3 NT C- 1/2 voff=0 +BA 8 CR 3 TR 3 CH 3 NT E- 1/2 voff=0 +BA 8 CR 3 TR 3 CH 3 NT G- 1/2 voff=0 +BA 8 CR 3+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 8 CR 3+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 8 CR 3+1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 9 CR 0 TR 3 CH 3 NT C- 1/2 voff=0 +BA 9 CR 0 TR 3 CH 3 NT E- 1/2 voff=0 +BA 9 CR 0 TR 3 CH 3 NT G- 1/2 voff=0 +BA 9 CR 1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 9 CR 1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 9 CR 1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 9 CR 1 TR 3 CH 3 NT G#- 1/2 voff=0 +BA 9 CR 1 TR 3 CH 3 NT C- 1/2 voff=0 +BA 9 CR 1 TR 3 CH 3 NT D- 1/2 voff=0 +BA 9 CR 1+1/2 TR 3 CH 3 NT G#- 1/2 voff=0 +BA 9 CR 1+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 9 CR 1+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 9 CR 2 TR 3 CH 3 NT F- 1/2 voff=0 +BA 9 CR 2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 9 CR 2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 9 CR 2+1/2 TR 3 CH 3 NT F- 1/2 voff=0 +BA 9 CR 2+1/2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 9 CR 2+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 9 CR 3 TR 3 CH 3 NT F- 1/2 voff=0 +BA 9 CR 3 TR 3 CH 3 NT A- 1/2 voff=0 +BA 9 CR 3 TR 3 CH 3 NT C- 1/2 voff=0 +BA 9 CR 3+1/2 TR 3 CH 3 NT F- 1/2 voff=0 +BA 9 CR 3+1/2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 9 CR 3+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 10 CR 0 TR 3 CH 3 NT C- 1/2 voff=0 +BA 10 CR 0 TR 3 CH 3 NT E- 1/2 voff=0 +BA 10 CR 0 TR 3 CH 3 NT G- 1/2 voff=0 +BA 10 CR 1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 10 CR 1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 10 CR 1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 10 CR 1 TR 3 CH 3 NT C- 1/2 voff=0 +BA 10 CR 1 TR 3 CH 3 NT E- 1/2 voff=0 +BA 10 CR 1 TR 3 CH 3 NT G- 1/2 voff=0 +BA 10 CR 1+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 10 CR 1+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 10 CR 1+1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 10 CR 2 TR 3 CH 3 NT C- 1/2 voff=0 
+BA 10 CR 2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 10 CR 2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 10 CR 2+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 10 CR 2+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 10 CR 2+1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 10 CR 3 TR 3 CH 3 NT C- 1/2 voff=0 +BA 10 CR 3 TR 3 CH 3 NT E- 1/2 voff=0 +BA 10 CR 3 TR 3 CH 3 NT G- 1/2 voff=0 +BA 10 CR 3+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 10 CR 3+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 10 CR 3+1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 11 CR 0 TR 3 CH 3 NT C- 1/2 voff=0 +BA 11 CR 0 TR 3 CH 3 NT E- 1/2 voff=0 +BA 11 CR 0 TR 3 CH 3 NT G- 1/2 voff=0 +BA 11 CR 1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 11 CR 1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 11 CR 1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 11 CR 1 TR 3 CH 3 NT G#- 1/2 voff=0 +BA 11 CR 1 TR 3 CH 3 NT C- 1/2 voff=0 +BA 11 CR 1 TR 3 CH 3 NT D- 1/2 voff=0 +BA 11 CR 1+1/2 TR 3 CH 3 NT G#- 1/2 voff=0 +BA 11 CR 1+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 11 CR 1+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 11 CR 2 TR 3 CH 3 NT F- 1/2 voff=0 +BA 11 CR 2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 11 CR 2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 11 CR 2+1/2 TR 3 CH 3 NT F- 1/2 voff=0 +BA 11 CR 2+1/2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 11 CR 2+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 11 CR 3 TR 3 CH 3 NT F- 1/2 voff=0 +BA 11 CR 3 TR 3 CH 3 NT A- 1/2 voff=0 +BA 11 CR 3 TR 3 CH 3 NT C- 1/2 voff=0 +BA 11 CR 3+1/2 TR 3 CH 3 NT F- 1/2 voff=0 +BA 11 CR 3+1/2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 11 CR 3+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 12 CR 0 TR 3 CH 3 NT C- 1/2 voff=0 +BA 12 CR 0 TR 3 CH 3 NT E- 1/2 voff=0 +BA 12 CR 0 TR 3 CH 3 NT G- 1/2 voff=0 +BA 12 CR 1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 12 CR 1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 12 CR 1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 12 CR 1 TR 3 CH 3 NT C- 1/2 voff=0 +BA 12 CR 1 TR 3 CH 3 NT E- 1/2 voff=0 +BA 12 CR 1 TR 3 CH 3 NT G- 1/2 voff=0 +BA 12 CR 1+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 12 CR 1+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 12 CR 1+1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 12 CR 2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 12 CR 2 TR 3 CH 3 NT E- 1/2 voff=0 
+BA 12 CR 2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 12 CR 2+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 12 CR 2+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 12 CR 2+1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 12 CR 3 TR 3 CH 3 NT C- 1/2 voff=0 +BA 12 CR 3 TR 3 CH 3 NT E- 1/2 voff=0 +BA 12 CR 3 TR 3 CH 3 NT G- 1/2 voff=0 +BA 12 CR 3+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 12 CR 3+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 12 CR 3+1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 17 CR 0 TR 3 CH 3 NT F- 1/2 voff=0 +BA 17 CR 0 TR 3 CH 3 NT A- 1/2 voff=0 +BA 17 CR 0 TR 3 CH 3 NT C- 1/2 voff=0 +BA 17 CR 1/2 TR 3 CH 3 NT F- 1/2 voff=0 +BA 17 CR 1/2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 17 CR 1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 17 CR 1 TR 3 CH 3 NT G- 1/2 voff=0 +BA 17 CR 1 TR 3 CH 3 NT B- 1/2 voff=0 +BA 17 CR 1 TR 3 CH 3 NT D- 1/2 voff=0 +BA 17 CR 1+1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 17 CR 1+1/2 TR 3 CH 3 NT B- 1/2 voff=0 +BA 17 CR 1+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 17 CR 2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 17 CR 2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 17 CR 2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 17 CR 2+1/2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 17 CR 2+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 17 CR 2+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 17 CR 3 TR 3 CH 3 NT A- 1/2 voff=0 +BA 17 CR 3 TR 3 CH 3 NT C- 1/2 voff=0 +BA 17 CR 3 TR 3 CH 3 NT E- 1/2 voff=0 +BA 17 CR 3+1/2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 17 CR 3+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 17 CR 3+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 18 CR 0 TR 3 CH 3 NT D- 1/2 voff=0 +BA 18 CR 0 TR 3 CH 3 NT F- 1/2 voff=0 +BA 18 CR 0 TR 3 CH 3 NT A- 1/2 voff=0 +BA 18 CR 1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 18 CR 1/2 TR 3 CH 3 NT F- 1/2 voff=0 +BA 18 CR 1/2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 18 CR 1 TR 3 CH 3 NT A#- 1/2 voff=0 +BA 18 CR 1 TR 3 CH 3 NT D- 1/2 voff=0 +BA 18 CR 1 TR 3 CH 3 NT F- 1/2 voff=0 +BA 18 CR 1+1/2 TR 3 CH 3 NT A#- 1/2 voff=0 +BA 18 CR 1+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 18 CR 1+1/2 TR 3 CH 3 NT F- 1/2 voff=0 +BA 18 CR 2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 18 CR 2 TR 3 CH 3 NT B- 1/2 voff=0 +BA 18 CR 2 TR 3 CH 3 NT D- 1/2 voff=0 
+BA 18 CR 2+1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 18 CR 2+1/2 TR 3 CH 3 NT B- 1/2 voff=0 +BA 18 CR 2+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 18 CR 3 TR 3 CH 3 NT G- 1/2 voff=0 +BA 18 CR 3 TR 3 CH 3 NT B- 1/2 voff=0 +BA 18 CR 3 TR 3 CH 3 NT D- 1/2 voff=0 +BA 18 CR 3+1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 18 CR 3+1/2 TR 3 CH 3 NT B- 1/2 voff=0 +BA 18 CR 3+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 19 CR 0 TR 3 CH 3 NT F- 1/2 voff=0 +BA 19 CR 0 TR 3 CH 3 NT A- 1/2 voff=0 +BA 19 CR 0 TR 3 CH 3 NT C- 1/2 voff=0 +BA 19 CR 1/2 TR 3 CH 3 NT F- 1/2 voff=0 +BA 19 CR 1/2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 19 CR 1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 19 CR 1 TR 3 CH 3 NT G- 1/2 voff=0 +BA 19 CR 1 TR 3 CH 3 NT B- 1/2 voff=0 +BA 19 CR 1 TR 3 CH 3 NT D- 1/2 voff=0 +BA 19 CR 1+1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 19 CR 1+1/2 TR 3 CH 3 NT B- 1/2 voff=0 +BA 19 CR 1+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 19 CR 2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 19 CR 2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 19 CR 2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 19 CR 2+1/2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 19 CR 2+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 19 CR 2+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 19 CR 3 TR 3 CH 3 NT A- 1/2 voff=0 +BA 19 CR 3 TR 3 CH 3 NT C- 1/2 voff=0 +BA 19 CR 3 TR 3 CH 3 NT E- 1/2 voff=0 +BA 19 CR 3+1/2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 19 CR 3+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 19 CR 3+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 20 CR 0 TR 3 CH 3 NT D- 1/2 voff=0 +BA 20 CR 0 TR 3 CH 3 NT F- 1/2 voff=0 +BA 20 CR 0 TR 3 CH 3 NT A- 1/2 voff=0 +BA 20 CR 1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 20 CR 1/2 TR 3 CH 3 NT F- 1/2 voff=0 +BA 20 CR 1/2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 20 CR 1 TR 3 CH 3 NT A#- 1/2 voff=0 +BA 20 CR 1 TR 3 CH 3 NT D- 1/2 voff=0 +BA 20 CR 1 TR 3 CH 3 NT F- 1/2 voff=0 +BA 20 CR 1+1/2 TR 3 CH 3 NT A#- 1/2 voff=0 +BA 20 CR 1+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 20 CR 1+1/2 TR 3 CH 3 NT F- 1/2 voff=0 +BA 20 CR 2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 20 CR 2 TR 3 CH 3 NT B- 1/2 voff=0 +BA 20 CR 2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 20 CR 2+1/2 TR 3 CH 3 NT G- 1/2 
voff=0 +BA 20 CR 2+1/2 TR 3 CH 3 NT B- 1/2 voff=0 +BA 20 CR 2+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 20 CR 3 TR 3 CH 3 NT G- 1/2 voff=0 +BA 20 CR 3 TR 3 CH 3 NT B- 1/2 voff=0 +BA 20 CR 3 TR 3 CH 3 NT D- 1/2 voff=0 +BA 20 CR 3+1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 20 CR 3+1/2 TR 3 CH 3 NT B- 1/2 voff=0 +BA 20 CR 3+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 29 CR 0 TR 3 CH 3 NT F- 1/2 voff=0 +BA 29 CR 0 TR 3 CH 3 NT A- 1/2 voff=0 +BA 29 CR 0 TR 3 CH 3 NT C- 1/2 voff=0 +BA 29 CR 1/2 TR 3 CH 3 NT F- 1/2 voff=0 +BA 29 CR 1/2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 29 CR 1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 29 CR 1 TR 3 CH 3 NT G- 1/2 voff=0 +BA 29 CR 1 TR 3 CH 3 NT B- 1/2 voff=0 +BA 29 CR 1 TR 3 CH 3 NT D- 1/2 voff=0 +BA 29 CR 1+1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 29 CR 1+1/2 TR 3 CH 3 NT B- 1/2 voff=0 +BA 29 CR 1+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 29 CR 2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 29 CR 2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 29 CR 2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 29 CR 2+1/2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 29 CR 2+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 29 CR 2+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 29 CR 3 TR 3 CH 3 NT A- 1/2 voff=0 +BA 29 CR 3 TR 3 CH 3 NT C- 1/2 voff=0 +BA 29 CR 3 TR 3 CH 3 NT E- 1/2 voff=0 +BA 29 CR 3+1/2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 29 CR 3+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 29 CR 3+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 30 CR 0 TR 3 CH 3 NT D- 1/2 voff=0 +BA 30 CR 0 TR 3 CH 3 NT F- 1/2 voff=0 +BA 30 CR 0 TR 3 CH 3 NT A- 1/2 voff=0 +BA 30 CR 1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 30 CR 1/2 TR 3 CH 3 NT F- 1/2 voff=0 +BA 30 CR 1/2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 30 CR 1 TR 3 CH 3 NT A#- 1/2 voff=0 +BA 30 CR 1 TR 3 CH 3 NT D- 1/2 voff=0 +BA 30 CR 1 TR 3 CH 3 NT F- 1/2 voff=0 +BA 30 CR 1+1/2 TR 3 CH 3 NT A#- 1/2 voff=0 +BA 30 CR 1+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 30 CR 1+1/2 TR 3 CH 3 NT F- 1/2 voff=0 +BA 30 CR 2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 30 CR 2 TR 3 CH 3 NT B- 1/2 voff=0 +BA 30 CR 2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 30 CR 2+1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 30 CR 2+1/2 TR 3 CH 3 NT B- 
1/2 voff=0 +BA 30 CR 2+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 30 CR 3 TR 3 CH 3 NT G- 1/2 voff=0 +BA 30 CR 3 TR 3 CH 3 NT B- 1/2 voff=0 +BA 30 CR 3 TR 3 CH 3 NT D- 1/2 voff=0 +BA 30 CR 3+1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 30 CR 3+1/2 TR 3 CH 3 NT B- 1/2 voff=0 +BA 30 CR 3+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 31 CR 0 TR 3 CH 3 NT F- 1/2 voff=0 +BA 31 CR 0 TR 3 CH 3 NT A- 1/2 voff=0 +BA 31 CR 0 TR 3 CH 3 NT C- 1/2 voff=0 +BA 31 CR 1/2 TR 3 CH 3 NT F- 1/2 voff=0 +BA 31 CR 1/2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 31 CR 1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 31 CR 1 TR 3 CH 3 NT G- 1/2 voff=0 +BA 31 CR 1 TR 3 CH 3 NT B- 1/2 voff=0 +BA 31 CR 1 TR 3 CH 3 NT D- 1/2 voff=0 +BA 31 CR 1+1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 31 CR 1+1/2 TR 3 CH 3 NT B- 1/2 voff=0 +BA 31 CR 1+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 31 CR 2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 31 CR 2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 31 CR 2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 31 CR 2+1/2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 31 CR 2+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 31 CR 2+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 31 CR 3 TR 3 CH 3 NT A- 1/2 voff=0 +BA 31 CR 3 TR 3 CH 3 NT C- 1/2 voff=0 +BA 31 CR 3 TR 3 CH 3 NT E- 1/2 voff=0 +BA 31 CR 3+1/2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 31 CR 3+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 31 CR 3+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 32 CR 0 TR 3 CH 3 NT D- 1/2 voff=0 +BA 32 CR 0 TR 3 CH 3 NT F- 1/2 voff=0 +BA 32 CR 0 TR 3 CH 3 NT A- 1/2 voff=0 +BA 32 CR 1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 32 CR 1/2 TR 3 CH 3 NT F- 1/2 voff=0 +BA 32 CR 1/2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 32 CR 1 TR 3 CH 3 NT A#- 1/2 voff=0 +BA 32 CR 1 TR 3 CH 3 NT D- 1/2 voff=0 +BA 32 CR 1 TR 3 CH 3 NT F- 1/2 voff=0 +BA 32 CR 1+1/2 TR 3 CH 3 NT A#- 1/2 voff=0 +BA 32 CR 1+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 32 CR 1+1/2 TR 3 CH 3 NT F- 1/2 voff=0 +BA 32 CR 2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 32 CR 2 TR 3 CH 3 NT B- 1/2 voff=0 +BA 32 CR 2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 32 CR 2+1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 32 CR 2+1/2 TR 3 CH 3 NT B- 1/2 voff=0 +BA 32 CR 2+1/2 TR 3 CH 3 NT 
D- 1/2 voff=0 +BA 32 CR 3 TR 3 CH 3 NT G- 1/2 voff=0 +BA 32 CR 3 TR 3 CH 3 NT B- 1/2 voff=0 +BA 32 CR 3 TR 3 CH 3 NT D- 1/2 voff=0 +BA 32 CR 3+1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 32 CR 3+1/2 TR 3 CH 3 NT B- 1/2 voff=0 +BA 32 CR 3+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 1 CR 0 TR 4 CH 4 Instrument 22 +BA 1 CR 0 TR 4 CH 4 NT R 0 von=127 voff=0 +BA 5 CR 0 TR 4 CH 4 NT G- 1/2 voff=0 +BA 5 CR 0 TR 4 CH 4 NT B- 1/2 voff=0 +BA 5 CR 0 TR 4 CH 4 NT D- 1/2 voff=0 +BA 5 CR 1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 5 CR 1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 5 CR 1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 5 CR 1 TR 4 CH 4 NT E- 1/2 voff=0 +BA 5 CR 1 TR 4 CH 4 NT G- 1/2 voff=0 +BA 5 CR 1 TR 4 CH 4 NT B- 1/2 voff=0 +BA 5 CR 1+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 5 CR 1+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 5 CR 1+1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 5 CR 2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 5 CR 2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 5 CR 2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 5 CR 2+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 5 CR 2+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 5 CR 2+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 5 CR 3 TR 4 CH 4 NT C- 1/2 voff=0 +BA 5 CR 3 TR 4 CH 4 NT E- 1/2 voff=0 +BA 5 CR 3 TR 4 CH 4 NT G- 1/2 voff=0 +BA 5 CR 3+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 5 CR 3+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 5 CR 3+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 6 CR 0 TR 4 CH 4 NT C- 1/2 voff=0 +BA 6 CR 0 TR 4 CH 4 NT E- 1/2 voff=0 +BA 6 CR 0 TR 4 CH 4 NT G- 1/2 voff=0 +BA 6 CR 1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 6 CR 1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 6 CR 1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 6 CR 1 TR 4 CH 4 NT C- 1/2 voff=0 +BA 6 CR 1 TR 4 CH 4 NT E- 1/2 voff=0 +BA 6 CR 1 TR 4 CH 4 NT G- 1/2 voff=0 +BA 6 CR 1+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 6 CR 1+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 6 CR 1+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 6 CR 2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 6 CR 2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 6 CR 2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 6 CR 2+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 6 CR 2+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 6 CR 2+1/2 TR 4 CH 4 NT G- 1/2 
voff=0 +BA 6 CR 3 TR 4 CH 4 NT C- 1/2 voff=0 +BA 6 CR 3 TR 4 CH 4 NT E- 1/2 voff=0 +BA 6 CR 3 TR 4 CH 4 NT G- 1/2 voff=0 +BA 6 CR 3+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 6 CR 3+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 6 CR 3+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 7 CR 0 TR 4 CH 4 NT G- 1/2 voff=0 +BA 7 CR 0 TR 4 CH 4 NT B- 1/2 voff=0 +BA 7 CR 0 TR 4 CH 4 NT D- 1/2 voff=0 +BA 7 CR 1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 7 CR 1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 7 CR 1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 7 CR 1 TR 4 CH 4 NT E- 1/2 voff=0 +BA 7 CR 1 TR 4 CH 4 NT G- 1/2 voff=0 +BA 7 CR 1 TR 4 CH 4 NT B- 1/2 voff=0 +BA 7 CR 1+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 7 CR 1+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 7 CR 1+1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 7 CR 2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 7 CR 2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 7 CR 2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 7 CR 2+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 7 CR 2+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 7 CR 2+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 7 CR 3 TR 4 CH 4 NT C- 1/2 voff=0 +BA 7 CR 3 TR 4 CH 4 NT E- 1/2 voff=0 +BA 7 CR 3 TR 4 CH 4 NT G- 1/2 voff=0 +BA 7 CR 3+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 7 CR 3+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 7 CR 3+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 8 CR 0 TR 4 CH 4 NT C- 1/2 voff=0 +BA 8 CR 0 TR 4 CH 4 NT E- 1/2 voff=0 +BA 8 CR 0 TR 4 CH 4 NT G- 1/2 voff=0 +BA 8 CR 1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 8 CR 1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 8 CR 1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 8 CR 1 TR 4 CH 4 NT C- 1/2 voff=0 +BA 8 CR 1 TR 4 CH 4 NT E- 1/2 voff=0 +BA 8 CR 1 TR 4 CH 4 NT G- 1/2 voff=0 +BA 8 CR 1+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 8 CR 1+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 8 CR 1+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 8 CR 2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 8 CR 2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 8 CR 2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 8 CR 2+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 8 CR 2+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 8 CR 2+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 8 CR 3 TR 4 CH 4 NT C- 1/2 voff=0 +BA 8 CR 3 TR 4 CH 4 NT E- 1/2 voff=0 +BA 8 CR 
3 TR 4 CH 4 NT G- 1/2 voff=0 +BA 8 CR 3+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 8 CR 3+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 8 CR 3+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 9 CR 0 TR 4 CH 4 NT C- 1/2 voff=0 +BA 9 CR 0 TR 4 CH 4 NT E- 1/2 voff=0 +BA 9 CR 0 TR 4 CH 4 NT G- 1/2 voff=0 +BA 9 CR 1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 9 CR 1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 9 CR 1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 9 CR 1 TR 4 CH 4 NT G#- 1/2 voff=0 +BA 9 CR 1 TR 4 CH 4 NT C- 1/2 voff=0 +BA 9 CR 1 TR 4 CH 4 NT D- 1/2 voff=0 +BA 9 CR 1+1/2 TR 4 CH 4 NT G#- 1/2 voff=0 +BA 9 CR 1+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 9 CR 1+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 9 CR 2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 9 CR 2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 9 CR 2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 9 CR 2+1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 9 CR 2+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 9 CR 2+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 9 CR 3 TR 4 CH 4 NT F- 1/2 voff=0 +BA 9 CR 3 TR 4 CH 4 NT A- 1/2 voff=0 +BA 9 CR 3 TR 4 CH 4 NT C- 1/2 voff=0 +BA 9 CR 3+1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 9 CR 3+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 9 CR 3+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 10 CR 0 TR 4 CH 4 NT C- 1/2 voff=0 +BA 10 CR 0 TR 4 CH 4 NT E- 1/2 voff=0 +BA 10 CR 0 TR 4 CH 4 NT G- 1/2 voff=0 +BA 10 CR 1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 10 CR 1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 10 CR 1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 10 CR 1 TR 4 CH 4 NT C- 1/2 voff=0 +BA 10 CR 1 TR 4 CH 4 NT E- 1/2 voff=0 +BA 10 CR 1 TR 4 CH 4 NT G- 1/2 voff=0 +BA 10 CR 1+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 10 CR 1+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 10 CR 1+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 10 CR 2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 10 CR 2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 10 CR 2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 10 CR 2+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 10 CR 2+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 10 CR 2+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 10 CR 3 TR 4 CH 4 NT C- 1/2 voff=0 +BA 10 CR 3 TR 4 CH 4 NT E- 1/2 voff=0 +BA 10 CR 3 TR 4 CH 4 NT G- 1/2 voff=0 +BA 10 CR 3+1/2 TR 4 CH 4 NT C- 1/2 
voff=0 +BA 10 CR 3+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 10 CR 3+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 11 CR 0 TR 4 CH 4 NT C- 1/2 voff=0 +BA 11 CR 0 TR 4 CH 4 NT E- 1/2 voff=0 +BA 11 CR 0 TR 4 CH 4 NT G- 1/2 voff=0 +BA 11 CR 1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 11 CR 1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 11 CR 1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 11 CR 1 TR 4 CH 4 NT G#- 1/2 voff=0 +BA 11 CR 1 TR 4 CH 4 NT C- 1/2 voff=0 +BA 11 CR 1 TR 4 CH 4 NT D- 1/2 voff=0 +BA 11 CR 1+1/2 TR 4 CH 4 NT G#- 1/2 voff=0 +BA 11 CR 1+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 11 CR 1+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 11 CR 2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 11 CR 2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 11 CR 2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 11 CR 2+1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 11 CR 2+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 11 CR 2+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 11 CR 3 TR 4 CH 4 NT F- 1/2 voff=0 +BA 11 CR 3 TR 4 CH 4 NT A- 1/2 voff=0 +BA 11 CR 3 TR 4 CH 4 NT C- 1/2 voff=0 +BA 11 CR 3+1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 11 CR 3+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 11 CR 3+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 12 CR 0 TR 4 CH 4 NT C- 1/2 voff=0 +BA 12 CR 0 TR 4 CH 4 NT E- 1/2 voff=0 +BA 12 CR 0 TR 4 CH 4 NT G- 1/2 voff=0 +BA 12 CR 1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 12 CR 1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 12 CR 1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 12 CR 1 TR 4 CH 4 NT C- 1/2 voff=0 +BA 12 CR 1 TR 4 CH 4 NT E- 1/2 voff=0 +BA 12 CR 1 TR 4 CH 4 NT G- 1/2 voff=0 +BA 12 CR 1+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 12 CR 1+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 12 CR 1+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 12 CR 2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 12 CR 2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 12 CR 2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 12 CR 2+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 12 CR 2+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 12 CR 2+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 12 CR 3 TR 4 CH 4 NT C- 1/2 voff=0 +BA 12 CR 3 TR 4 CH 4 NT E- 1/2 voff=0 +BA 12 CR 3 TR 4 CH 4 NT G- 1/2 voff=0 +BA 12 CR 3+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 12 CR 3+1/2 TR 4 CH 4 NT E- 
1/2 voff=0 +BA 12 CR 3+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 13 CR 0 TR 4 CH 4 NT G- 1/2 voff=0 +BA 13 CR 0 TR 4 CH 4 NT B- 1/2 voff=0 +BA 13 CR 0 TR 4 CH 4 NT D- 1/2 voff=0 +BA 13 CR 1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 13 CR 1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 13 CR 1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 13 CR 1 TR 4 CH 4 NT E- 1/2 voff=0 +BA 13 CR 1 TR 4 CH 4 NT G- 1/2 voff=0 +BA 13 CR 1 TR 4 CH 4 NT B- 1/2 voff=0 +BA 13 CR 1+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 13 CR 1+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 13 CR 1+1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 13 CR 2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 13 CR 2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 13 CR 2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 13 CR 2+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 13 CR 2+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 13 CR 2+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 13 CR 3 TR 4 CH 4 NT C- 1/2 voff=0 +BA 13 CR 3 TR 4 CH 4 NT E- 1/2 voff=0 +BA 13 CR 3 TR 4 CH 4 NT G- 1/2 voff=0 +BA 13 CR 3+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 13 CR 3+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 13 CR 3+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 14 CR 0 TR 4 CH 4 NT C- 1/2 voff=0 +BA 14 CR 0 TR 4 CH 4 NT E- 1/2 voff=0 +BA 14 CR 0 TR 4 CH 4 NT G- 1/2 voff=0 +BA 14 CR 1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 14 CR 1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 14 CR 1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 14 CR 1 TR 4 CH 4 NT C- 1/2 voff=0 +BA 14 CR 1 TR 4 CH 4 NT E- 1/2 voff=0 +BA 14 CR 1 TR 4 CH 4 NT G- 1/2 voff=0 +BA 14 CR 1+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 14 CR 1+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 14 CR 1+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 14 CR 2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 14 CR 2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 14 CR 2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 14 CR 2+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 14 CR 2+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 14 CR 2+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 14 CR 3 TR 4 CH 4 NT C- 1/2 voff=0 +BA 14 CR 3 TR 4 CH 4 NT E- 1/2 voff=0 +BA 14 CR 3 TR 4 CH 4 NT G- 1/2 voff=0 +BA 14 CR 3+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 14 CR 3+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 14 CR 3+1/2 TR 4 CH 4 NT G- 
1/2 voff=0 +BA 15 CR 0 TR 4 CH 4 NT G- 1/2 voff=0 +BA 15 CR 0 TR 4 CH 4 NT B- 1/2 voff=0 +BA 15 CR 0 TR 4 CH 4 NT D- 1/2 voff=0 +BA 15 CR 1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 15 CR 1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 15 CR 1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 15 CR 1 TR 4 CH 4 NT E- 1/2 voff=0 +BA 15 CR 1 TR 4 CH 4 NT G- 1/2 voff=0 +BA 15 CR 1 TR 4 CH 4 NT B- 1/2 voff=0 +BA 15 CR 1+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 15 CR 1+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 15 CR 1+1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 15 CR 2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 15 CR 2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 15 CR 2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 15 CR 2+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 15 CR 2+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 15 CR 2+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 15 CR 3 TR 4 CH 4 NT C- 1/2 voff=0 +BA 15 CR 3 TR 4 CH 4 NT E- 1/2 voff=0 +BA 15 CR 3 TR 4 CH 4 NT G- 1/2 voff=0 +BA 15 CR 3+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 15 CR 3+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 15 CR 3+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 16 CR 0 TR 4 CH 4 NT C- 1/2 voff=0 +BA 16 CR 0 TR 4 CH 4 NT E- 1/2 voff=0 +BA 16 CR 0 TR 4 CH 4 NT G- 1/2 voff=0 +BA 16 CR 1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 16 CR 1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 16 CR 1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 16 CR 1 TR 4 CH 4 NT C- 1/2 voff=0 +BA 16 CR 1 TR 4 CH 4 NT E- 1/2 voff=0 +BA 16 CR 1 TR 4 CH 4 NT G- 1/2 voff=0 +BA 16 CR 1+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 16 CR 1+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 16 CR 1+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 16 CR 2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 16 CR 2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 16 CR 2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 16 CR 2+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 16 CR 2+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 16 CR 2+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 16 CR 3 TR 4 CH 4 NT C- 1/2 voff=0 +BA 16 CR 3 TR 4 CH 4 NT E- 1/2 voff=0 +BA 16 CR 3 TR 4 CH 4 NT G- 1/2 voff=0 +BA 16 CR 3+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 16 CR 3+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 16 CR 3+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 17 CR 0 TR 4 CH 4 NT F- 1/2 
voff=0 +BA 17 CR 0 TR 4 CH 4 NT A- 1/2 voff=0 +BA 17 CR 0 TR 4 CH 4 NT C- 1/2 voff=0 +BA 17 CR 1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 17 CR 1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 17 CR 1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 17 CR 1 TR 4 CH 4 NT G- 1/2 voff=0 +BA 17 CR 1 TR 4 CH 4 NT B- 1/2 voff=0 +BA 17 CR 1 TR 4 CH 4 NT D- 1/2 voff=0 +BA 17 CR 1+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 17 CR 1+1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 17 CR 1+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 17 CR 2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 17 CR 2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 17 CR 2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 17 CR 2+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 17 CR 2+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 17 CR 2+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 17 CR 3 TR 4 CH 4 NT A- 1/2 voff=0 +BA 17 CR 3 TR 4 CH 4 NT C- 1/2 voff=0 +BA 17 CR 3 TR 4 CH 4 NT E- 1/2 voff=0 +BA 17 CR 3+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 17 CR 3+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 17 CR 3+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 18 CR 0 TR 4 CH 4 NT D- 1/2 voff=0 +BA 18 CR 0 TR 4 CH 4 NT F- 1/2 voff=0 +BA 18 CR 0 TR 4 CH 4 NT A- 1/2 voff=0 +BA 18 CR 1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 18 CR 1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 18 CR 1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 18 CR 1 TR 4 CH 4 NT A#- 1/2 voff=0 +BA 18 CR 1 TR 4 CH 4 NT D- 1/2 voff=0 +BA 18 CR 1 TR 4 CH 4 NT F- 1/2 voff=0 +BA 18 CR 1+1/2 TR 4 CH 4 NT A#- 1/2 voff=0 +BA 18 CR 1+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 18 CR 1+1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 18 CR 2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 18 CR 2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 18 CR 2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 18 CR 2+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 18 CR 2+1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 18 CR 2+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 18 CR 3 TR 4 CH 4 NT G- 1/2 voff=0 +BA 18 CR 3 TR 4 CH 4 NT B- 1/2 voff=0 +BA 18 CR 3 TR 4 CH 4 NT D- 1/2 voff=0 +BA 18 CR 3+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 18 CR 3+1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 18 CR 3+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 19 CR 0 TR 4 CH 4 NT F- 1/2 voff=0 +BA 19 CR 0 TR 4 CH 4 NT A- 1/2 
voff=0 +BA 19 CR 0 TR 4 CH 4 NT C- 1/2 voff=0 +BA 19 CR 1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 19 CR 1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 19 CR 1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 19 CR 1 TR 4 CH 4 NT G- 1/2 voff=0 +BA 19 CR 1 TR 4 CH 4 NT B- 1/2 voff=0 +BA 19 CR 1 TR 4 CH 4 NT D- 1/2 voff=0 +BA 19 CR 1+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 19 CR 1+1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 19 CR 1+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 19 CR 2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 19 CR 2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 19 CR 2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 19 CR 2+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 19 CR 2+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 19 CR 2+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 19 CR 3 TR 4 CH 4 NT A- 1/2 voff=0 +BA 19 CR 3 TR 4 CH 4 NT C- 1/2 voff=0 +BA 19 CR 3 TR 4 CH 4 NT E- 1/2 voff=0 +BA 19 CR 3+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 19 CR 3+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 19 CR 3+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 20 CR 0 TR 4 CH 4 NT D- 1/2 voff=0 +BA 20 CR 0 TR 4 CH 4 NT F- 1/2 voff=0 +BA 20 CR 0 TR 4 CH 4 NT A- 1/2 voff=0 +BA 20 CR 1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 20 CR 1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 20 CR 1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 20 CR 1 TR 4 CH 4 NT A#- 1/2 voff=0 +BA 20 CR 1 TR 4 CH 4 NT D- 1/2 voff=0 +BA 20 CR 1 TR 4 CH 4 NT F- 1/2 voff=0 +BA 20 CR 1+1/2 TR 4 CH 4 NT A#- 1/2 voff=0 +BA 20 CR 1+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 20 CR 1+1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 20 CR 2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 20 CR 2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 20 CR 2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 20 CR 2+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 20 CR 2+1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 20 CR 2+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 20 CR 3 TR 4 CH 4 NT G- 1/2 voff=0 +BA 20 CR 3 TR 4 CH 4 NT B- 1/2 voff=0 +BA 20 CR 3 TR 4 CH 4 NT D- 1/2 voff=0 +BA 20 CR 3+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 20 CR 3+1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 20 CR 3+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 25 CR 0 TR 4 CH 4 NT G- 1/2 voff=0 +BA 25 CR 0 TR 4 CH 4 NT B- 1/2 voff=0 +BA 25 CR 0 TR 4 CH 4 NT D- 1/2 
voff=0 +BA 25 CR 1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 25 CR 1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 25 CR 1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 25 CR 1 TR 4 CH 4 NT E- 1/2 voff=0 +BA 25 CR 1 TR 4 CH 4 NT G- 1/2 voff=0 +BA 25 CR 1 TR 4 CH 4 NT B- 1/2 voff=0 +BA 25 CR 1+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 25 CR 1+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 25 CR 1+1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 25 CR 2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 25 CR 2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 25 CR 2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 25 CR 2+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 25 CR 2+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 25 CR 2+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 25 CR 3 TR 4 CH 4 NT C- 1/2 voff=0 +BA 25 CR 3 TR 4 CH 4 NT E- 1/2 voff=0 +BA 25 CR 3 TR 4 CH 4 NT G- 1/2 voff=0 +BA 25 CR 3+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 25 CR 3+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 25 CR 3+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 26 CR 0 TR 4 CH 4 NT F- 1/2 voff=0 +BA 26 CR 0 TR 4 CH 4 NT A- 1/2 voff=0 +BA 26 CR 0 TR 4 CH 4 NT C- 1/2 voff=0 +BA 26 CR 1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 26 CR 1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 26 CR 1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 26 CR 1 TR 4 CH 4 NT F- 1/2 voff=0 +BA 26 CR 1 TR 4 CH 4 NT A- 1/2 voff=0 +BA 26 CR 1 TR 4 CH 4 NT C- 1/2 voff=0 +BA 26 CR 1+1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 26 CR 1+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 26 CR 1+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 26 CR 2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 26 CR 2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 26 CR 2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 26 CR 2+1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 26 CR 2+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 26 CR 2+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 26 CR 3 TR 4 CH 4 NT F- 1/2 voff=0 +BA 26 CR 3 TR 4 CH 4 NT A- 1/2 voff=0 +BA 26 CR 3 TR 4 CH 4 NT C- 1/2 voff=0 +BA 26 CR 3+1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 26 CR 3+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 26 CR 3+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 27 CR 0 TR 4 CH 4 NT G- 1/2 voff=0 +BA 27 CR 0 TR 4 CH 4 NT B- 1/2 voff=0 +BA 27 CR 0 TR 4 CH 4 NT D- 1/2 voff=0 +BA 27 CR 1/2 TR 4 CH 4 NT G- 1/2 
voff=0 +BA 27 CR 1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 27 CR 1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 27 CR 1 TR 4 CH 4 NT E- 1/2 voff=0 +BA 27 CR 1 TR 4 CH 4 NT G- 1/2 voff=0 +BA 27 CR 1 TR 4 CH 4 NT B- 1/2 voff=0 +BA 27 CR 1+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 27 CR 1+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 27 CR 1+1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 27 CR 2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 27 CR 2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 27 CR 2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 27 CR 2+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 27 CR 2+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 27 CR 2+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 27 CR 3 TR 4 CH 4 NT C- 1/2 voff=0 +BA 27 CR 3 TR 4 CH 4 NT E- 1/2 voff=0 +BA 27 CR 3 TR 4 CH 4 NT G- 1/2 voff=0 +BA 27 CR 3+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 27 CR 3+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 27 CR 3+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 28 CR 0 TR 4 CH 4 NT F- 1/2 voff=0 +BA 28 CR 0 TR 4 CH 4 NT A- 1/2 voff=0 +BA 28 CR 0 TR 4 CH 4 NT C- 1/2 voff=0 +BA 28 CR 1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 28 CR 1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 28 CR 1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 28 CR 1 TR 4 CH 4 NT F- 1/2 voff=0 +BA 28 CR 1 TR 4 CH 4 NT A- 1/2 voff=0 +BA 28 CR 1 TR 4 CH 4 NT C- 1/2 voff=0 +BA 28 CR 1+1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 28 CR 1+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 28 CR 1+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 28 CR 2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 28 CR 2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 28 CR 2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 28 CR 2+1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 28 CR 2+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 28 CR 2+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 28 CR 3 TR 4 CH 4 NT F- 1/2 voff=0 +BA 28 CR 3 TR 4 CH 4 NT A- 1/2 voff=0 +BA 28 CR 3 TR 4 CH 4 NT C- 1/2 voff=0 +BA 28 CR 3+1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 28 CR 3+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 28 CR 3+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 29 CR 0 TR 4 CH 4 NT F- 1/2 voff=0 +BA 29 CR 0 TR 4 CH 4 NT A- 1/2 voff=0 +BA 29 CR 0 TR 4 CH 4 NT C- 1/2 voff=0 +BA 29 CR 1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 29 CR 1/2 TR 4 CH 4 NT A- 1/2 
voff=0 +BA 29 CR 1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 29 CR 1 TR 4 CH 4 NT G- 1/2 voff=0 +BA 29 CR 1 TR 4 CH 4 NT B- 1/2 voff=0 +BA 29 CR 1 TR 4 CH 4 NT D- 1/2 voff=0 +BA 29 CR 1+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 29 CR 1+1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 29 CR 1+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 29 CR 2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 29 CR 2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 29 CR 2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 29 CR 2+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 29 CR 2+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 29 CR 2+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 29 CR 3 TR 4 CH 4 NT A- 1/2 voff=0 +BA 29 CR 3 TR 4 CH 4 NT C- 1/2 voff=0 +BA 29 CR 3 TR 4 CH 4 NT E- 1/2 voff=0 +BA 29 CR 3+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 29 CR 3+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 29 CR 3+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 30 CR 0 TR 4 CH 4 NT D- 1/2 voff=0 +BA 30 CR 0 TR 4 CH 4 NT F- 1/2 voff=0 +BA 30 CR 0 TR 4 CH 4 NT A- 1/2 voff=0 +BA 30 CR 1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 30 CR 1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 30 CR 1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 30 CR 1 TR 4 CH 4 NT A#- 1/2 voff=0 +BA 30 CR 1 TR 4 CH 4 NT D- 1/2 voff=0 +BA 30 CR 1 TR 4 CH 4 NT F- 1/2 voff=0 +BA 30 CR 1+1/2 TR 4 CH 4 NT A#- 1/2 voff=0 +BA 30 CR 1+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 30 CR 1+1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 30 CR 2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 30 CR 2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 30 CR 2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 30 CR 2+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 30 CR 2+1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 30 CR 2+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 30 CR 3 TR 4 CH 4 NT G- 1/2 voff=0 +BA 30 CR 3 TR 4 CH 4 NT B- 1/2 voff=0 +BA 30 CR 3 TR 4 CH 4 NT D- 1/2 voff=0 +BA 30 CR 3+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 30 CR 3+1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 30 CR 3+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 31 CR 0 TR 4 CH 4 NT F- 1/2 voff=0 +BA 31 CR 0 TR 4 CH 4 NT A- 1/2 voff=0 +BA 31 CR 0 TR 4 CH 4 NT C- 1/2 voff=0 +BA 31 CR 1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 31 CR 1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 31 CR 1/2 TR 4 CH 4 NT C- 1/2 
voff=0 +BA 31 CR 1 TR 4 CH 4 NT G- 1/2 voff=0 +BA 31 CR 1 TR 4 CH 4 NT B- 1/2 voff=0 +BA 31 CR 1 TR 4 CH 4 NT D- 1/2 voff=0 +BA 31 CR 1+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 31 CR 1+1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 31 CR 1+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 31 CR 2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 31 CR 2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 31 CR 2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 31 CR 2+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 31 CR 2+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 31 CR 2+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 31 CR 3 TR 4 CH 4 NT A- 1/2 voff=0 +BA 31 CR 3 TR 4 CH 4 NT C- 1/2 voff=0 +BA 31 CR 3 TR 4 CH 4 NT E- 1/2 voff=0 +BA 31 CR 3+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 31 CR 3+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 31 CR 3+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 32 CR 0 TR 4 CH 4 NT D- 1/2 voff=0 +BA 32 CR 0 TR 4 CH 4 NT F- 1/2 voff=0 +BA 32 CR 0 TR 4 CH 4 NT A- 1/2 voff=0 +BA 32 CR 1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 32 CR 1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 32 CR 1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 32 CR 1 TR 4 CH 4 NT A#- 1/2 voff=0 +BA 32 CR 1 TR 4 CH 4 NT D- 1/2 voff=0 +BA 32 CR 1 TR 4 CH 4 NT F- 1/2 voff=0 +BA 32 CR 1+1/2 TR 4 CH 4 NT A#- 1/2 voff=0 +BA 32 CR 1+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 32 CR 1+1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 32 CR 2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 32 CR 2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 32 CR 2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 32 CR 2+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 32 CR 2+1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 32 CR 2+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 32 CR 3 TR 4 CH 4 NT G- 1/2 voff=0 +BA 32 CR 3 TR 4 CH 4 NT B- 1/2 voff=0 +BA 32 CR 3 TR 4 CH 4 NT D- 1/2 voff=0 +BA 32 CR 3+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 32 CR 3+1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 32 CR 3+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 33 CR 0 TR 4 CH 4 NT F- 1/2 voff=0 +BA 33 CR 0 TR 4 CH 4 NT A- 1/2 voff=0 +BA 33 CR 0 TR 4 CH 4 NT C- 1/2 voff=0 +BA 33 CR 1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 33 CR 1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 33 CR 1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 33 CR 1 TR 4 CH 4 NT D- 1/2 
voff=0 +BA 33 CR 1 TR 4 CH 4 NT F- 1/2 voff=0 +BA 33 CR 1 TR 4 CH 4 NT A- 1/2 voff=0 +BA 33 CR 1+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 33 CR 1+1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 33 CR 1+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 33 CR 2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 33 CR 2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 33 CR 2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 33 CR 2+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 33 CR 2+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 33 CR 2+1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 33 CR 3 TR 4 CH 4 NT E- 1/2 voff=0 +BA 33 CR 3 TR 4 CH 4 NT G- 1/2 voff=0 +BA 33 CR 3 TR 4 CH 4 NT B- 1/2 voff=0 +BA 33 CR 3+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 33 CR 3+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 33 CR 3+1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 34 CR 0 TR 4 CH 4 NT A- 1/2 voff=0 +BA 34 CR 0 TR 4 CH 4 NT C- 1/2 voff=0 +BA 34 CR 0 TR 4 CH 4 NT E- 1/2 voff=0 +BA 34 CR 1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 34 CR 1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 34 CR 1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 34 CR 1 TR 4 CH 4 NT F- 1/2 voff=0 +BA 34 CR 1 TR 4 CH 4 NT A- 1/2 voff=0 +BA 34 CR 1 TR 4 CH 4 NT C- 1/2 voff=0 +BA 34 CR 1+1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 34 CR 1+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 34 CR 1+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 34 CR 2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 34 CR 2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 34 CR 2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 34 CR 2+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 34 CR 2+1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 34 CR 2+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 34 CR 3 TR 4 CH 4 NT G- 1/2 voff=0 +BA 34 CR 3 TR 4 CH 4 NT B- 1/2 voff=0 +BA 34 CR 3 TR 4 CH 4 NT D- 1/2 voff=0 +BA 34 CR 3+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 34 CR 3+1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 34 CR 3+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 35 CR 0 TR 4 CH 4 NT F- 1/2 voff=0 +BA 35 CR 0 TR 4 CH 4 NT A- 1/2 voff=0 +BA 35 CR 0 TR 4 CH 4 NT C- 1/2 voff=0 +BA 35 CR 1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 35 CR 1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 35 CR 1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 35 CR 1 TR 4 CH 4 NT D- 1/2 voff=0 +BA 35 CR 1 TR 4 CH 4 NT F- 1/2 
voff=0 +BA 35 CR 1 TR 4 CH 4 NT A- 1/2 voff=0 +BA 35 CR 1+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 35 CR 1+1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 35 CR 1+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 35 CR 2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 35 CR 2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 35 CR 2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 35 CR 2+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 35 CR 2+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 35 CR 2+1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 35 CR 3 TR 4 CH 4 NT E- 1/2 voff=0 +BA 35 CR 3 TR 4 CH 4 NT G- 1/2 voff=0 +BA 35 CR 3 TR 4 CH 4 NT B- 1/2 voff=0 +BA 35 CR 3+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 35 CR 3+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 35 CR 3+1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 36 CR 0 TR 4 CH 4 NT A- 1/2 voff=0 +BA 36 CR 0 TR 4 CH 4 NT C- 1/2 voff=0 +BA 36 CR 0 TR 4 CH 4 NT E- 1/2 voff=0 +BA 36 CR 1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 36 CR 1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 36 CR 1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 36 CR 1 TR 4 CH 4 NT F- 1/2 voff=0 +BA 36 CR 1 TR 4 CH 4 NT A- 1/2 voff=0 +BA 36 CR 1 TR 4 CH 4 NT C- 1/2 voff=0 +BA 36 CR 1+1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 36 CR 1+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 36 CR 1+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 36 CR 2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 36 CR 2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 36 CR 2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 36 CR 2+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 36 CR 2+1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 36 CR 2+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 36 CR 3 TR 4 CH 4 NT G- 1/2 voff=0 +BA 36 CR 3 TR 4 CH 4 NT B- 1/2 voff=0 +BA 36 CR 3 TR 4 CH 4 NT D- 1/2 voff=0 +BA 36 CR 3+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 36 CR 3+1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 36 CR 3+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 37 CR 0 TR 0 CH 16 End of track +Don ice is in the ess ence. + +Don ice is in the ess ence. +Dix ald finds Dix ard. +Count ee goes to C. +The count er is Ack el. + +Nei ler goes to the graph er. +Add ler is the graph er. +Count ee forms Wilb an. +Add el feels Dix ard. + +Luc el is in the ess ence. +Black ur is Black ard. +Dix ard moves to C. 
+The choos er is Black ard. + +Dix ae is in Ack ard. +Add ald nur tures Black ard. +Count ee writes on Dix ard. +The trans lat or finds Black ard. + + +Syd ney feels one. + +Syd ney feels one. +All ee is the choos er. +The breaths on er is Whist ald. +The writ er loves Wilb ess. + +Whist ur needs the graph er. +Duk an nur tures Whist ald. +The test er feels Syd an. +The updat er lifts Whist ald. + +Ven ler nur tures the breaths on er. +Don ae is the writ er. +The breaths on er feels Black ess. +One lifts Add ney. + +Brae ur forms Wilb ney. +All ney writes on Whist ald. +The runn er goes to Whist ald. +Whist ae makes Brae ard. + + +format=1 tracks=6 division=384 + +BA 1 CR 0 TR 0 CH 16 Text type 2: "Produced by Mind Reading Music Composer by Lucian Academy" +BA 1 CR 0 TR 0 CH 16 Text type 3: "Syd ney feels one." +BA 1 CR 0 TR 0 CH 1 Channel volume 127 +BA 1 CR 0 TR 0 CH 16 Tempo 63.00009 +BA 1 CR 0 TR 1 CH 1 Instrument 1 +BA 1 CR 0 TR 1 CH 1 NT R 0 von=127 voff=0 +BA 5 CR 0 TR 1 CH 1 NT E- 1/2 voff=0 +BA 5 CR 1/2 TR 1 CH 1 NT E- 1/2 voff=0 +BA 5 CR 1 TR 1 CH 1 NT G- 1/2 voff=0 +BA 5 CR 1+1/2 TR 1 CH 1 NT G- 1/2 voff=0 +BA 6 CR 0 TR 1 CH 1 NT F- 1/2 voff=0 +BA 6 CR 1/2 TR 1 CH 1 NT F- 1/2 voff=0 +BA 6 CR 1 TR 1 CH 1 NT A#- 1/2 voff=0 +BA 6 CR 1+1/2 TR 1 CH 1 NT A#- 1/2 voff=0 +BA 6 CR 2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 6 CR 2+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 7 CR 0 TR 1 CH 1 NT E- 1/2 voff=0 +BA 7 CR 1/2 TR 1 CH 1 NT E- 1/2 voff=0 +BA 7 CR 1 TR 1 CH 1 NT G- 1/2 voff=0 +BA 7 CR 1+1/2 TR 1 CH 1 NT G- 1/2 voff=0 +BA 7 CR 2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 7 CR 2+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 7 CR 3 TR 1 CH 1 NT C- 1/2 voff=0 +BA 8 CR 0 TR 1 CH 1 NT F- 1/2 voff=0 +BA 8 CR 1/2 TR 1 CH 1 NT F- 1/2 voff=0 +BA 8 CR 1 TR 1 CH 1 NT A#- 1/2 voff=0 +BA 8 CR 1+1/2 TR 1 CH 1 NT A#- 1/2 voff=0 +BA 8 CR 2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 8 CR 2+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 13 CR 0 TR 1 CH 1 NT E- 1/2 voff=0 +BA 13 CR 1/2 TR 1 CH 1 NT E- 1/2 voff=0 +BA 13 CR 1 TR 1 CH 1 NT G- 
1/2 voff=0 +BA 13 CR 1+1/2 TR 1 CH 1 NT G- 1/2 voff=0 +BA 13 CR 2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 13 CR 2+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 14 CR 0 TR 1 CH 1 NT F- 1/2 voff=0 +BA 14 CR 1/2 TR 1 CH 1 NT F- 1/2 voff=0 +BA 14 CR 1 TR 1 CH 1 NT A#- 1/2 voff=0 +BA 14 CR 1+1/2 TR 1 CH 1 NT A#- 1/2 voff=0 +BA 14 CR 2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 14 CR 2+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 15 CR 0 TR 1 CH 1 NT E- 1/2 voff=0 +BA 15 CR 1/2 TR 1 CH 1 NT E- 1/2 voff=0 +BA 15 CR 1 TR 1 CH 1 NT G- 1/2 voff=0 +BA 15 CR 1+1/2 TR 1 CH 1 NT G- 1/2 voff=0 +BA 15 CR 2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 15 CR 2+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 16 CR 0 TR 1 CH 1 NT F- 1/2 voff=0 +BA 16 CR 1/2 TR 1 CH 1 NT F- 1/2 voff=0 +BA 16 CR 1 TR 1 CH 1 NT A#- 1/2 voff=0 +BA 16 CR 1+1/2 TR 1 CH 1 NT A#- 1/2 voff=0 +BA 16 CR 2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 16 CR 2+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 17 CR 0 TR 1 CH 1 NT C- 1/2 voff=0 +BA 17 CR 1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 17 CR 1 TR 1 CH 1 NT F#- 1/2 voff=0 +BA 17 CR 1+1/2 TR 1 CH 1 NT F#- 1/2 voff=0 +BA 17 CR 2 TR 1 CH 1 NT E- 1/2 voff=0 +BA 17 CR 2+1/2 TR 1 CH 1 NT E- 1/2 voff=0 +BA 17 CR 3 TR 1 CH 1 NT E- 1/2 voff=0 +BA 17 CR 3+1/2 TR 1 CH 1 NT E- 1/2 voff=0 +BA 18 CR 0 TR 1 CH 1 NT A- 1/2 voff=0 +BA 18 CR 1/2 TR 1 CH 1 NT A- 1/2 voff=0 +BA 18 CR 1 TR 1 CH 1 NT G- 1/2 voff=0 +BA 18 CR 1+1/2 TR 1 CH 1 NT G- 1/2 voff=0 +BA 18 CR 2 TR 1 CH 1 NT G- 1/2 voff=0 +BA 18 CR 2+1/2 TR 1 CH 1 NT G- 1/2 voff=0 +BA 19 CR 0 TR 1 CH 1 NT C- 1/2 voff=0 +BA 19 CR 1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 19 CR 1 TR 1 CH 1 NT F#- 1/2 voff=0 +BA 19 CR 1+1/2 TR 1 CH 1 NT F#- 1/2 voff=0 +BA 19 CR 2 TR 1 CH 1 NT E- 1/2 voff=0 +BA 19 CR 2+1/2 TR 1 CH 1 NT E- 1/2 voff=0 +BA 19 CR 3 TR 1 CH 1 NT E- 1/2 voff=0 +BA 20 CR 0 TR 1 CH 1 NT A- 1/2 voff=0 +BA 20 CR 1/2 TR 1 CH 1 NT A- 1/2 voff=0 +BA 20 CR 1 TR 1 CH 1 NT G- 1/2 voff=0 +BA 20 CR 1+1/2 TR 1 CH 1 NT G- 1/2 voff=0 +BA 25 CR 0 TR 1 CH 1 NT D- 1/2 voff=0 +BA 25 CR 1/2 TR 1 CH 1 NT D- 1/2 voff=0 +BA 25 CR 1 TR 1 CH 1 NT C- 
1/2 voff=0 +BA 25 CR 1+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 25 CR 2 TR 1 CH 1 NT F- 1/2 voff=0 +BA 26 CR 0 TR 1 CH 1 NT A#- 1/2 voff=0 +BA 26 CR 1/2 TR 1 CH 1 NT A#- 1/2 voff=0 +BA 26 CR 1 TR 1 CH 1 NT F- 1/2 voff=0 +BA 26 CR 1+1/2 TR 1 CH 1 NT F- 1/2 voff=0 +BA 26 CR 2 TR 1 CH 1 NT E- 1/2 voff=0 +BA 26 CR 2+1/2 TR 1 CH 1 NT E- 1/2 voff=0 +BA 27 CR 0 TR 1 CH 1 NT D- 1/2 voff=0 +BA 27 CR 1/2 TR 1 CH 1 NT D- 1/2 voff=0 +BA 27 CR 1 TR 1 CH 1 NT C- 1/2 voff=0 +BA 27 CR 1+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 27 CR 2 TR 1 CH 1 NT F- 1/2 voff=0 +BA 27 CR 2+1/2 TR 1 CH 1 NT F- 1/2 voff=0 +BA 27 CR 3 TR 1 CH 1 NT F- 1/2 voff=0 +BA 28 CR 0 TR 1 CH 1 NT A#- 1/2 voff=0 +BA 28 CR 1/2 TR 1 CH 1 NT A#- 1/2 voff=0 +BA 28 CR 1 TR 1 CH 1 NT F- 1/2 voff=0 +BA 28 CR 1+1/2 TR 1 CH 1 NT F- 1/2 voff=0 +BA 28 CR 2 TR 1 CH 1 NT E- 1/2 voff=0 +BA 29 CR 0 TR 1 CH 1 NT C- 1/2 voff=0 +BA 29 CR 1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 29 CR 1 TR 1 CH 1 NT F#- 1/2 voff=0 +BA 29 CR 1+1/2 TR 1 CH 1 NT F#- 1/2 voff=0 +BA 29 CR 2 TR 1 CH 1 NT E- 1/2 voff=0 +BA 29 CR 2+1/2 TR 1 CH 1 NT E- 1/2 voff=0 +BA 29 CR 3 TR 1 CH 1 NT E- 1/2 voff=0 +BA 29 CR 3+1/2 TR 1 CH 1 NT E- 1/2 voff=0 +BA 30 CR 0 TR 1 CH 1 NT A- 1/2 voff=0 +BA 30 CR 1/2 TR 1 CH 1 NT A- 1/2 voff=0 +BA 30 CR 1 TR 1 CH 1 NT G- 1/2 voff=0 +BA 30 CR 1+1/2 TR 1 CH 1 NT G- 1/2 voff=0 +BA 30 CR 2 TR 1 CH 1 NT G- 1/2 voff=0 +BA 30 CR 2+1/2 TR 1 CH 1 NT G- 1/2 voff=0 +BA 31 CR 0 TR 1 CH 1 NT C- 1/2 voff=0 +BA 31 CR 1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 31 CR 1 TR 1 CH 1 NT F#- 1/2 voff=0 +BA 31 CR 1+1/2 TR 1 CH 1 NT F#- 1/2 voff=0 +BA 31 CR 2 TR 1 CH 1 NT E- 1/2 voff=0 +BA 31 CR 2+1/2 TR 1 CH 1 NT E- 1/2 voff=0 +BA 31 CR 3 TR 1 CH 1 NT E- 1/2 voff=0 +BA 32 CR 0 TR 1 CH 1 NT A- 1/2 voff=0 +BA 32 CR 1/2 TR 1 CH 1 NT A- 1/2 voff=0 +BA 32 CR 1 TR 1 CH 1 NT G- 1/2 voff=0 +BA 32 CR 1+1/2 TR 1 CH 1 NT G- 1/2 voff=0 +BA 1 CR 0 TR 2 CH 2 Instrument 101 +BA 1 CR 0 TR 2 CH 2 NT R 0 von=127 voff=0 +BA 1 CR 0 TR 2 CH 2 NT B- 1/2 voff=0 +BA 1 CR 1/2 TR 2 CH 2 NT B- 1/2 
voff=0 +BA 1 CR 1 TR 2 CH 2 NT A#- 1/2 voff=0 +BA 1 CR 1+1/2 TR 2 CH 2 NT A#- 1/2 voff=0 +BA 1 CR 2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 1 CR 2+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 1 CR 3 TR 2 CH 2 NT C- 1/2 voff=0 +BA 1 CR 3+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 2 CR 0 TR 2 CH 2 NT F- 1/2 voff=0 +BA 2 CR 1/2 TR 2 CH 2 NT F- 1/2 voff=0 +BA 2 CR 1 TR 2 CH 2 NT A#- 1/2 voff=0 +BA 2 CR 1+1/2 TR 2 CH 2 NT A#- 1/2 voff=0 +BA 2 CR 2 TR 2 CH 2 NT E- 1/2 voff=0 +BA 2 CR 2+1/2 TR 2 CH 2 NT E- 1/2 voff=0 +BA 2 CR 3 TR 2 CH 2 NT E- 1/2 voff=0 +BA 2 CR 3+1/2 TR 2 CH 2 NT E- 1/2 voff=0 +BA 3 CR 0 TR 2 CH 2 NT B- 1/2 voff=0 +BA 3 CR 1/2 TR 2 CH 2 NT B- 1/2 voff=0 +BA 3 CR 1 TR 2 CH 2 NT A#- 1/2 voff=0 +BA 3 CR 1+1/2 TR 2 CH 2 NT A#- 1/2 voff=0 +BA 3 CR 2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 3 CR 2+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 3 CR 3 TR 2 CH 2 NT C- 1/2 voff=0 +BA 3 CR 3+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 4 CR 0 TR 2 CH 2 NT F- 1/2 voff=0 +BA 4 CR 1/2 TR 2 CH 2 NT F- 1/2 voff=0 +BA 4 CR 1 TR 2 CH 2 NT A#- 1/2 voff=0 +BA 4 CR 1+1/2 TR 2 CH 2 NT A#- 1/2 voff=0 +BA 4 CR 2 TR 2 CH 2 NT E- 1/2 voff=0 +BA 4 CR 2+1/2 TR 2 CH 2 NT E- 1/2 voff=0 +BA 4 CR 3 TR 2 CH 2 NT E- 1/2 voff=0 +BA 4 CR 3+1/2 TR 2 CH 2 NT E- 1/2 voff=0 +BA 5 CR 0 TR 2 CH 2 NT E- 1/2 voff=0 +BA 5 CR 1/2 TR 2 CH 2 NT E- 1/2 voff=0 +BA 5 CR 1 TR 2 CH 2 NT G- 1/2 voff=0 +BA 5 CR 1+1/2 TR 2 CH 2 NT G- 1/2 voff=0 +BA 5 CR 2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 5 CR 2+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 5 CR 3 TR 2 CH 2 NT C- 1/2 voff=0 +BA 5 CR 3+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 6 CR 0 TR 2 CH 2 NT F- 1/2 voff=0 +BA 6 CR 1/2 TR 2 CH 2 NT F- 1/2 voff=0 +BA 6 CR 1 TR 2 CH 2 NT A#- 1/2 voff=0 +BA 6 CR 1+1/2 TR 2 CH 2 NT A#- 1/2 voff=0 +BA 6 CR 2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 6 CR 2+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 6 CR 3 TR 2 CH 2 NT C- 1/2 voff=0 +BA 6 CR 3+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 7 CR 0 TR 2 CH 2 NT E- 1/2 voff=0 +BA 7 CR 1/2 TR 2 CH 2 NT E- 1/2 voff=0 +BA 7 CR 1 TR 2 CH 2 NT G- 1/2 voff=0 +BA 7 CR 1+1/2 TR 2 CH 2 NT G- 1/2 
voff=0 +BA 7 CR 2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 7 CR 2+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 7 CR 3 TR 2 CH 2 NT C- 1/2 voff=0 +BA 7 CR 3+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 8 CR 0 TR 2 CH 2 NT F- 1/2 voff=0 +BA 8 CR 1/2 TR 2 CH 2 NT F- 1/2 voff=0 +BA 8 CR 1 TR 2 CH 2 NT A#- 1/2 voff=0 +BA 8 CR 1+1/2 TR 2 CH 2 NT A#- 1/2 voff=0 +BA 8 CR 2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 8 CR 2+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 8 CR 3 TR 2 CH 2 NT C- 1/2 voff=0 +BA 8 CR 3+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 9 CR 0 TR 2 CH 2 NT C- 1/2 voff=0 +BA 9 CR 1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 9 CR 1 TR 2 CH 2 NT F#- 1/2 voff=0 +BA 9 CR 1+1/2 TR 2 CH 2 NT F#- 1/2 voff=0 +BA 9 CR 2 TR 2 CH 2 NT A- 1/2 voff=0 +BA 9 CR 2+1/2 TR 2 CH 2 NT A- 1/2 voff=0 +BA 9 CR 3 TR 2 CH 2 NT A- 1/2 voff=0 +BA 9 CR 3+1/2 TR 2 CH 2 NT A- 1/2 voff=0 +BA 10 CR 0 TR 2 CH 2 NT D- 1/2 voff=0 +BA 10 CR 1/2 TR 2 CH 2 NT D- 1/2 voff=0 +BA 10 CR 1 TR 2 CH 2 NT F- 1/2 voff=0 +BA 10 CR 1+1/2 TR 2 CH 2 NT F- 1/2 voff=0 +BA 10 CR 2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 10 CR 2+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 10 CR 3 TR 2 CH 2 NT C- 1/2 voff=0 +BA 10 CR 3+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 11 CR 0 TR 2 CH 2 NT C- 1/2 voff=0 +BA 11 CR 1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 11 CR 1 TR 2 CH 2 NT F#- 1/2 voff=0 +BA 11 CR 1+1/2 TR 2 CH 2 NT F#- 1/2 voff=0 +BA 11 CR 2 TR 2 CH 2 NT A- 1/2 voff=0 +BA 11 CR 2+1/2 TR 2 CH 2 NT A- 1/2 voff=0 +BA 11 CR 3 TR 2 CH 2 NT A- 1/2 voff=0 +BA 11 CR 3+1/2 TR 2 CH 2 NT A- 1/2 voff=0 +BA 12 CR 0 TR 2 CH 2 NT D- 1/2 voff=0 +BA 12 CR 1/2 TR 2 CH 2 NT D- 1/2 voff=0 +BA 12 CR 1 TR 2 CH 2 NT F- 1/2 voff=0 +BA 12 CR 1+1/2 TR 2 CH 2 NT F- 1/2 voff=0 +BA 12 CR 2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 12 CR 2+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 12 CR 3 TR 2 CH 2 NT C- 1/2 voff=0 +BA 12 CR 3+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 13 CR 0 TR 2 CH 2 NT E- 1/2 voff=0 +BA 13 CR 1/2 TR 2 CH 2 NT E- 1/2 voff=0 +BA 13 CR 1 TR 2 CH 2 NT G- 1/2 voff=0 +BA 13 CR 1+1/2 TR 2 CH 2 NT G- 1/2 voff=0 +BA 13 CR 2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 13 CR 
2+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 13 CR 3 TR 2 CH 2 NT C- 1/2 voff=0 +BA 13 CR 3+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 14 CR 0 TR 2 CH 2 NT F- 1/2 voff=0 +BA 14 CR 1/2 TR 2 CH 2 NT F- 1/2 voff=0 +BA 14 CR 1 TR 2 CH 2 NT A#- 1/2 voff=0 +BA 14 CR 1+1/2 TR 2 CH 2 NT A#- 1/2 voff=0 +BA 14 CR 2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 14 CR 2+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 14 CR 3 TR 2 CH 2 NT C- 1/2 voff=0 +BA 14 CR 3+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 15 CR 0 TR 2 CH 2 NT E- 1/2 voff=0 +BA 15 CR 1/2 TR 2 CH 2 NT E- 1/2 voff=0 +BA 15 CR 1 TR 2 CH 2 NT G- 1/2 voff=0 +BA 15 CR 1+1/2 TR 2 CH 2 NT G- 1/2 voff=0 +BA 15 CR 2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 15 CR 2+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 15 CR 3 TR 2 CH 2 NT C- 1/2 voff=0 +BA 15 CR 3+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 16 CR 0 TR 2 CH 2 NT F- 1/2 voff=0 +BA 16 CR 1/2 TR 2 CH 2 NT F- 1/2 voff=0 +BA 16 CR 1 TR 2 CH 2 NT A#- 1/2 voff=0 +BA 16 CR 1+1/2 TR 2 CH 2 NT A#- 1/2 voff=0 +BA 16 CR 2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 16 CR 2+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 16 CR 3 TR 2 CH 2 NT C- 1/2 voff=0 +BA 16 CR 3+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 17 CR 0 TR 2 CH 2 NT C- 1/2 voff=0 +BA 17 CR 1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 17 CR 1 TR 2 CH 2 NT F#- 1/2 voff=0 +BA 17 CR 1+1/2 TR 2 CH 2 NT F#- 1/2 voff=0 +BA 17 CR 2 TR 2 CH 2 NT E- 1/2 voff=0 +BA 17 CR 2+1/2 TR 2 CH 2 NT E- 1/2 voff=0 +BA 17 CR 3 TR 2 CH 2 NT E- 1/2 voff=0 +BA 17 CR 3+1/2 TR 2 CH 2 NT E- 1/2 voff=0 +BA 18 CR 0 TR 2 CH 2 NT A- 1/2 voff=0 +BA 18 CR 1/2 TR 2 CH 2 NT A- 1/2 voff=0 +BA 18 CR 1 TR 2 CH 2 NT G- 1/2 voff=0 +BA 18 CR 1+1/2 TR 2 CH 2 NT G- 1/2 voff=0 +BA 18 CR 2 TR 2 CH 2 NT G- 1/2 voff=0 +BA 18 CR 2+1/2 TR 2 CH 2 NT G- 1/2 voff=0 +BA 18 CR 3 TR 2 CH 2 NT G- 1/2 voff=0 +BA 18 CR 3+1/2 TR 2 CH 2 NT G- 1/2 voff=0 +BA 19 CR 0 TR 2 CH 2 NT C- 1/2 voff=0 +BA 19 CR 1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 19 CR 1 TR 2 CH 2 NT F#- 1/2 voff=0 +BA 19 CR 1+1/2 TR 2 CH 2 NT F#- 1/2 voff=0 +BA 19 CR 2 TR 2 CH 2 NT E- 1/2 voff=0 +BA 19 CR 2+1/2 TR 2 CH 2 NT E- 1/2 voff=0 
+BA 19 CR 3 TR 2 CH 2 NT E- 1/2 voff=0 +BA 19 CR 3+1/2 TR 2 CH 2 NT E- 1/2 voff=0 +BA 20 CR 0 TR 2 CH 2 NT A- 1/2 voff=0 +BA 20 CR 1/2 TR 2 CH 2 NT A- 1/2 voff=0 +BA 20 CR 1 TR 2 CH 2 NT G- 1/2 voff=0 +BA 20 CR 1+1/2 TR 2 CH 2 NT G- 1/2 voff=0 +BA 20 CR 2 TR 2 CH 2 NT G- 1/2 voff=0 +BA 20 CR 2+1/2 TR 2 CH 2 NT G- 1/2 voff=0 +BA 20 CR 3 TR 2 CH 2 NT G- 1/2 voff=0 +BA 20 CR 3+1/2 TR 2 CH 2 NT G- 1/2 voff=0 +BA 25 CR 0 TR 2 CH 2 NT D- 1/2 voff=0 +BA 25 CR 1/2 TR 2 CH 2 NT D- 1/2 voff=0 +BA 25 CR 1 TR 2 CH 2 NT C- 1/2 voff=0 +BA 25 CR 1+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 25 CR 2 TR 2 CH 2 NT F- 1/2 voff=0 +BA 25 CR 2+1/2 TR 2 CH 2 NT F- 1/2 voff=0 +BA 25 CR 3 TR 2 CH 2 NT F- 1/2 voff=0 +BA 25 CR 3+1/2 TR 2 CH 2 NT F- 1/2 voff=0 +BA 26 CR 0 TR 2 CH 2 NT A#- 1/2 voff=0 +BA 26 CR 1/2 TR 2 CH 2 NT A#- 1/2 voff=0 +BA 26 CR 1 TR 2 CH 2 NT F- 1/2 voff=0 +BA 26 CR 1+1/2 TR 2 CH 2 NT F- 1/2 voff=0 +BA 26 CR 2 TR 2 CH 2 NT E- 1/2 voff=0 +BA 26 CR 2+1/2 TR 2 CH 2 NT E- 1/2 voff=0 +BA 26 CR 3 TR 2 CH 2 NT E- 1/2 voff=0 +BA 26 CR 3+1/2 TR 2 CH 2 NT E- 1/2 voff=0 +BA 27 CR 0 TR 2 CH 2 NT D- 1/2 voff=0 +BA 27 CR 1/2 TR 2 CH 2 NT D- 1/2 voff=0 +BA 27 CR 1 TR 2 CH 2 NT C- 1/2 voff=0 +BA 27 CR 1+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 27 CR 2 TR 2 CH 2 NT F- 1/2 voff=0 +BA 27 CR 2+1/2 TR 2 CH 2 NT F- 1/2 voff=0 +BA 27 CR 3 TR 2 CH 2 NT F- 1/2 voff=0 +BA 27 CR 3+1/2 TR 2 CH 2 NT F- 1/2 voff=0 +BA 28 CR 0 TR 2 CH 2 NT A#- 1/2 voff=0 +BA 28 CR 1/2 TR 2 CH 2 NT A#- 1/2 voff=0 +BA 28 CR 1 TR 2 CH 2 NT F- 1/2 voff=0 +BA 28 CR 1+1/2 TR 2 CH 2 NT F- 1/2 voff=0 +BA 28 CR 2 TR 2 CH 2 NT E- 1/2 voff=0 +BA 28 CR 2+1/2 TR 2 CH 2 NT E- 1/2 voff=0 +BA 28 CR 3 TR 2 CH 2 NT E- 1/2 voff=0 +BA 28 CR 3+1/2 TR 2 CH 2 NT E- 1/2 voff=0 +BA 29 CR 0 TR 2 CH 2 NT C- 1/2 voff=0 +BA 29 CR 1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 29 CR 1 TR 2 CH 2 NT F#- 1/2 voff=0 +BA 29 CR 1+1/2 TR 2 CH 2 NT F#- 1/2 voff=0 +BA 29 CR 2 TR 2 CH 2 NT E- 1/2 voff=0 +BA 29 CR 2+1/2 TR 2 CH 2 NT E- 1/2 voff=0 +BA 29 CR 3 TR 2 CH 2 NT E- 1/2 
voff=0 +BA 29 CR 3+1/2 TR 2 CH 2 NT E- 1/2 voff=0 +BA 30 CR 0 TR 2 CH 2 NT A- 1/2 voff=0 +BA 30 CR 1/2 TR 2 CH 2 NT A- 1/2 voff=0 +BA 30 CR 1 TR 2 CH 2 NT G- 1/2 voff=0 +BA 30 CR 1+1/2 TR 2 CH 2 NT G- 1/2 voff=0 +BA 30 CR 2 TR 2 CH 2 NT G- 1/2 voff=0 +BA 30 CR 2+1/2 TR 2 CH 2 NT G- 1/2 voff=0 +BA 30 CR 3 TR 2 CH 2 NT G- 1/2 voff=0 +BA 30 CR 3+1/2 TR 2 CH 2 NT G- 1/2 voff=0 +BA 31 CR 0 TR 2 CH 2 NT C- 1/2 voff=0 +BA 31 CR 1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 31 CR 1 TR 2 CH 2 NT F#- 1/2 voff=0 +BA 31 CR 1+1/2 TR 2 CH 2 NT F#- 1/2 voff=0 +BA 31 CR 2 TR 2 CH 2 NT E- 1/2 voff=0 +BA 31 CR 2+1/2 TR 2 CH 2 NT E- 1/2 voff=0 +BA 31 CR 3 TR 2 CH 2 NT E- 1/2 voff=0 +BA 31 CR 3+1/2 TR 2 CH 2 NT E- 1/2 voff=0 +BA 32 CR 0 TR 2 CH 2 NT A- 1/2 voff=0 +BA 32 CR 1/2 TR 2 CH 2 NT A- 1/2 voff=0 +BA 32 CR 1 TR 2 CH 2 NT G- 1/2 voff=0 +BA 32 CR 1+1/2 TR 2 CH 2 NT G- 1/2 voff=0 +BA 32 CR 2 TR 2 CH 2 NT G- 1/2 voff=0 +BA 32 CR 2+1/2 TR 2 CH 2 NT G- 1/2 voff=0 +BA 32 CR 3 TR 2 CH 2 NT G- 1/2 voff=0 +BA 32 CR 3+1/2 TR 2 CH 2 NT G- 1/2 voff=0 +BA 33 CR 0 TR 2 CH 2 NT E- 1/2 voff=0 +BA 33 CR 1/2 TR 2 CH 2 NT E- 1/2 voff=0 +BA 33 CR 1 TR 2 CH 2 NT E- 1/2 voff=0 +BA 33 CR 1+1/2 TR 2 CH 2 NT E- 1/2 voff=0 +BA 33 CR 2 TR 2 CH 2 NT E- 1/2 voff=0 +BA 33 CR 2+1/2 TR 2 CH 2 NT E- 1/2 voff=0 +BA 33 CR 3 TR 2 CH 2 NT E- 1/2 voff=0 +BA 33 CR 3+1/2 TR 2 CH 2 NT E- 1/2 voff=0 +BA 34 CR 0 TR 2 CH 2 NT A- 1/2 voff=0 +BA 34 CR 1/2 TR 2 CH 2 NT A- 1/2 voff=0 +BA 34 CR 1 TR 2 CH 2 NT D- 1/2 voff=0 +BA 34 CR 1+1/2 TR 2 CH 2 NT D- 1/2 voff=0 +BA 34 CR 2 TR 2 CH 2 NT D- 1/2 voff=0 +BA 34 CR 2+1/2 TR 2 CH 2 NT D- 1/2 voff=0 +BA 34 CR 3 TR 2 CH 2 NT D- 1/2 voff=0 +BA 34 CR 3+1/2 TR 2 CH 2 NT D- 1/2 voff=0 +BA 35 CR 0 TR 2 CH 2 NT E- 1/2 voff=0 +BA 35 CR 1/2 TR 2 CH 2 NT E- 1/2 voff=0 +BA 35 CR 1 TR 2 CH 2 NT E- 1/2 voff=0 +BA 35 CR 1+1/2 TR 2 CH 2 NT E- 1/2 voff=0 +BA 35 CR 2 TR 2 CH 2 NT E- 1/2 voff=0 +BA 35 CR 2+1/2 TR 2 CH 2 NT E- 1/2 voff=0 +BA 35 CR 3 TR 2 CH 2 NT E- 1/2 voff=0 +BA 35 CR 3+1/2 TR 2 CH 2 NT E- 
1/2 voff=0 +BA 36 CR 0 TR 2 CH 2 NT A- 1/2 voff=0 +BA 36 CR 1/2 TR 2 CH 2 NT A- 1/2 voff=0 +BA 36 CR 1 TR 2 CH 2 NT D- 1/2 voff=0 +BA 36 CR 1+1/2 TR 2 CH 2 NT D- 1/2 voff=0 +BA 36 CR 2 TR 2 CH 2 NT D- 1/2 voff=0 +BA 36 CR 2+1/2 TR 2 CH 2 NT D- 1/2 voff=0 +BA 36 CR 3 TR 2 CH 2 NT D- 1/2 voff=0 +BA 36 CR 3+1/2 TR 2 CH 2 NT D- 1/2 voff=0 +BA 1 CR 0 TR 3 CH 3 Instrument 78 +BA 1 CR 0 TR 3 CH 3 NT R 0 von=127 voff=0 +BA 1 CR 0 TR 3 CH 3 NT B- 1/2 voff=0 +BA 1 CR 0 TR 3 CH 3 NT D- 1/2 voff=0 +BA 1 CR 0 TR 3 CH 3 NT F- 1/2 voff=0 +BA 1 CR 1/2 TR 3 CH 3 NT B- 1/2 voff=0 +BA 1 CR 1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 1 CR 1/2 TR 3 CH 3 NT F- 1/2 voff=0 +BA 1 CR 1 TR 3 CH 3 NT D- 1/2 voff=0 +BA 1 CR 1 TR 3 CH 3 NT F- 1/2 voff=0 +BA 1 CR 1 TR 3 CH 3 NT A- 1/2 voff=0 +BA 1 CR 1+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 1 CR 1+1/2 TR 3 CH 3 NT F- 1/2 voff=0 +BA 1 CR 1+1/2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 1 CR 2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 1 CR 2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 1 CR 2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 1 CR 2+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 1 CR 2+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 1 CR 2+1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 1 CR 3 TR 3 CH 3 NT C- 1/2 voff=0 +BA 1 CR 3 TR 3 CH 3 NT E- 1/2 voff=0 +BA 1 CR 3 TR 3 CH 3 NT G- 1/2 voff=0 +BA 1 CR 3+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 1 CR 3+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 1 CR 3+1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 2 CR 0 TR 3 CH 3 NT F- 1/2 voff=0 +BA 2 CR 0 TR 3 CH 3 NT A- 1/2 voff=0 +BA 2 CR 0 TR 3 CH 3 NT C- 1/2 voff=0 +BA 2 CR 1/2 TR 3 CH 3 NT F- 1/2 voff=0 +BA 2 CR 1/2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 2 CR 1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 2 CR 1 TR 3 CH 3 NT A#- 1/2 voff=0 +BA 2 CR 1 TR 3 CH 3 NT D- 1/2 voff=0 +BA 2 CR 1 TR 3 CH 3 NT F- 1/2 voff=0 +BA 2 CR 1+1/2 TR 3 CH 3 NT A#- 1/2 voff=0 +BA 2 CR 1+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 2 CR 1+1/2 TR 3 CH 3 NT F- 1/2 voff=0 +BA 2 CR 2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 2 CR 2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 2 CR 2 TR 3 CH 3 NT B- 1/2 voff=0 +BA 2 CR 2+1/2 TR 3 CH 3 NT E- 1/2 
voff=0 +BA 2 CR 2+1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 2 CR 2+1/2 TR 3 CH 3 NT B- 1/2 voff=0 +BA 2 CR 3 TR 3 CH 3 NT E- 1/2 voff=0 +BA 2 CR 3 TR 3 CH 3 NT G- 1/2 voff=0 +BA 2 CR 3 TR 3 CH 3 NT B- 1/2 voff=0 +BA 2 CR 3+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 2 CR 3+1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 2 CR 3+1/2 TR 3 CH 3 NT B- 1/2 voff=0 +BA 3 CR 0 TR 3 CH 3 NT B- 1/2 voff=0 +BA 3 CR 0 TR 3 CH 3 NT D- 1/2 voff=0 +BA 3 CR 0 TR 3 CH 3 NT F- 1/2 voff=0 +BA 3 CR 1/2 TR 3 CH 3 NT B- 1/2 voff=0 +BA 3 CR 1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 3 CR 1/2 TR 3 CH 3 NT F- 1/2 voff=0 +BA 3 CR 1 TR 3 CH 3 NT D- 1/2 voff=0 +BA 3 CR 1 TR 3 CH 3 NT F- 1/2 voff=0 +BA 3 CR 1 TR 3 CH 3 NT A- 1/2 voff=0 +BA 3 CR 1+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 3 CR 1+1/2 TR 3 CH 3 NT F- 1/2 voff=0 +BA 3 CR 1+1/2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 3 CR 2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 3 CR 2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 3 CR 2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 3 CR 2+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 3 CR 2+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 3 CR 2+1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 3 CR 3 TR 3 CH 3 NT C- 1/2 voff=0 +BA 3 CR 3 TR 3 CH 3 NT E- 1/2 voff=0 +BA 3 CR 3 TR 3 CH 3 NT G- 1/2 voff=0 +BA 3 CR 3+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 3 CR 3+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 3 CR 3+1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 4 CR 0 TR 3 CH 3 NT F- 1/2 voff=0 +BA 4 CR 0 TR 3 CH 3 NT A- 1/2 voff=0 +BA 4 CR 0 TR 3 CH 3 NT C- 1/2 voff=0 +BA 4 CR 1/2 TR 3 CH 3 NT F- 1/2 voff=0 +BA 4 CR 1/2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 4 CR 1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 4 CR 1 TR 3 CH 3 NT A#- 1/2 voff=0 +BA 4 CR 1 TR 3 CH 3 NT D- 1/2 voff=0 +BA 4 CR 1 TR 3 CH 3 NT F- 1/2 voff=0 +BA 4 CR 1+1/2 TR 3 CH 3 NT A#- 1/2 voff=0 +BA 4 CR 1+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 4 CR 1+1/2 TR 3 CH 3 NT F- 1/2 voff=0 +BA 4 CR 2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 4 CR 2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 4 CR 2 TR 3 CH 3 NT B- 1/2 voff=0 +BA 4 CR 2+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 4 CR 2+1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 4 CR 2+1/2 TR 3 CH 3 NT B- 1/2 
voff=0 +BA 4 CR 3 TR 3 CH 3 NT E- 1/2 voff=0 +BA 4 CR 3 TR 3 CH 3 NT G- 1/2 voff=0 +BA 4 CR 3 TR 3 CH 3 NT B- 1/2 voff=0 +BA 4 CR 3+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 4 CR 3+1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 4 CR 3+1/2 TR 3 CH 3 NT B- 1/2 voff=0 +BA 9 CR 0 TR 3 CH 3 NT C- 1/2 voff=0 +BA 9 CR 0 TR 3 CH 3 NT E- 1/2 voff=0 +BA 9 CR 0 TR 3 CH 3 NT G- 1/2 voff=0 +BA 9 CR 1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 9 CR 1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 9 CR 1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 9 CR 1 TR 3 CH 3 NT D#- 1/2 voff=0 +BA 9 CR 1 TR 3 CH 3 NT G- 1/2 voff=0 +BA 9 CR 1 TR 3 CH 3 NT A- 1/2 voff=0 +BA 9 CR 1+1/2 TR 3 CH 3 NT D#- 1/2 voff=0 +BA 9 CR 1+1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 9 CR 1+1/2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 9 CR 2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 9 CR 2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 9 CR 2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 9 CR 2+1/2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 9 CR 2+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 9 CR 2+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 9 CR 3 TR 3 CH 3 NT A- 1/2 voff=0 +BA 9 CR 3 TR 3 CH 3 NT C- 1/2 voff=0 +BA 9 CR 3 TR 3 CH 3 NT E- 1/2 voff=0 +BA 9 CR 3+1/2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 9 CR 3+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 9 CR 3+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 10 CR 0 TR 3 CH 3 NT D- 1/2 voff=0 +BA 10 CR 0 TR 3 CH 3 NT F- 1/2 voff=0 +BA 10 CR 0 TR 3 CH 3 NT A- 1/2 voff=0 +BA 10 CR 1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 10 CR 1/2 TR 3 CH 3 NT F- 1/2 voff=0 +BA 10 CR 1/2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 10 CR 1 TR 3 CH 3 NT C#- 1/2 voff=0 +BA 10 CR 1 TR 3 CH 3 NT F- 1/2 voff=0 +BA 10 CR 1 TR 3 CH 3 NT G- 1/2 voff=0 +BA 10 CR 1+1/2 TR 3 CH 3 NT C#- 1/2 voff=0 +BA 10 CR 1+1/2 TR 3 CH 3 NT F- 1/2 voff=0 +BA 10 CR 1+1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 10 CR 2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 10 CR 2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 10 CR 2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 10 CR 2+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 10 CR 2+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 10 CR 2+1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 10 CR 3 TR 3 CH 3 NT C- 1/2 voff=0 +BA 10 CR 3 TR 3 CH 3 
NT E- 1/2 voff=0 +BA 10 CR 3 TR 3 CH 3 NT G- 1/2 voff=0 +BA 10 CR 3+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 10 CR 3+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 10 CR 3+1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 11 CR 0 TR 3 CH 3 NT C- 1/2 voff=0 +BA 11 CR 0 TR 3 CH 3 NT E- 1/2 voff=0 +BA 11 CR 0 TR 3 CH 3 NT G- 1/2 voff=0 +BA 11 CR 1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 11 CR 1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 11 CR 1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 11 CR 1 TR 3 CH 3 NT D#- 1/2 voff=0 +BA 11 CR 1 TR 3 CH 3 NT G- 1/2 voff=0 +BA 11 CR 1 TR 3 CH 3 NT A- 1/2 voff=0 +BA 11 CR 1+1/2 TR 3 CH 3 NT D#- 1/2 voff=0 +BA 11 CR 1+1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 11 CR 1+1/2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 11 CR 2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 11 CR 2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 11 CR 2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 11 CR 2+1/2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 11 CR 2+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 11 CR 2+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 11 CR 3 TR 3 CH 3 NT A- 1/2 voff=0 +BA 11 CR 3 TR 3 CH 3 NT C- 1/2 voff=0 +BA 11 CR 3 TR 3 CH 3 NT E- 1/2 voff=0 +BA 11 CR 3+1/2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 11 CR 3+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 11 CR 3+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 12 CR 0 TR 3 CH 3 NT D- 1/2 voff=0 +BA 12 CR 0 TR 3 CH 3 NT F- 1/2 voff=0 +BA 12 CR 0 TR 3 CH 3 NT A- 1/2 voff=0 +BA 12 CR 1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 12 CR 1/2 TR 3 CH 3 NT F- 1/2 voff=0 +BA 12 CR 1/2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 12 CR 1 TR 3 CH 3 NT C#- 1/2 voff=0 +BA 12 CR 1 TR 3 CH 3 NT F- 1/2 voff=0 +BA 12 CR 1 TR 3 CH 3 NT G- 1/2 voff=0 +BA 12 CR 1+1/2 TR 3 CH 3 NT C#- 1/2 voff=0 +BA 12 CR 1+1/2 TR 3 CH 3 NT F- 1/2 voff=0 +BA 12 CR 1+1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 12 CR 2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 12 CR 2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 12 CR 2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 12 CR 2+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 12 CR 2+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 12 CR 2+1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 12 CR 3 TR 3 CH 3 NT C- 1/2 voff=0 +BA 12 CR 3 TR 3 CH 3 NT E- 1/2 voff=0 +BA 12 CR 3 TR 3 CH 3 
NT G- 1/2 voff=0 +BA 12 CR 3+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 12 CR 3+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 12 CR 3+1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 13 CR 0 TR 3 CH 3 NT E- 1/2 voff=0 +BA 13 CR 0 TR 3 CH 3 NT G- 1/2 voff=0 +BA 13 CR 0 TR 3 CH 3 NT B- 1/2 voff=0 +BA 13 CR 1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 13 CR 1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 13 CR 1/2 TR 3 CH 3 NT B- 1/2 voff=0 +BA 13 CR 1 TR 3 CH 3 NT F#- 1/2 voff=0 +BA 13 CR 1 TR 3 CH 3 NT A- 1/2 voff=0 +BA 13 CR 1 TR 3 CH 3 NT C- 1/2 voff=0 +BA 13 CR 1+1/2 TR 3 CH 3 NT F#- 1/2 voff=0 +BA 13 CR 1+1/2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 13 CR 1+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 13 CR 2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 13 CR 2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 13 CR 2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 13 CR 2+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 13 CR 2+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 13 CR 2+1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 13 CR 3 TR 3 CH 3 NT C- 1/2 voff=0 +BA 13 CR 3 TR 3 CH 3 NT E- 1/2 voff=0 +BA 13 CR 3 TR 3 CH 3 NT G- 1/2 voff=0 +BA 13 CR 3+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 13 CR 3+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 13 CR 3+1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 14 CR 0 TR 3 CH 3 NT F- 1/2 voff=0 +BA 14 CR 0 TR 3 CH 3 NT A- 1/2 voff=0 +BA 14 CR 0 TR 3 CH 3 NT C- 1/2 voff=0 +BA 14 CR 1/2 TR 3 CH 3 NT F- 1/2 voff=0 +BA 14 CR 1/2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 14 CR 1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 14 CR 1 TR 3 CH 3 NT C- 1/2 voff=0 +BA 14 CR 1 TR 3 CH 3 NT E- 1/2 voff=0 +BA 14 CR 1 TR 3 CH 3 NT G- 1/2 voff=0 +BA 14 CR 1+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 14 CR 1+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 14 CR 1+1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 14 CR 2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 14 CR 2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 14 CR 2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 14 CR 2+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 14 CR 2+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 14 CR 2+1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 14 CR 3 TR 3 CH 3 NT C- 1/2 voff=0 +BA 14 CR 3 TR 3 CH 3 NT E- 1/2 voff=0 +BA 14 CR 3 TR 3 CH 3 NT G- 1/2 voff=0 +BA 14 CR 3+1/2 TR 3 CH 
3 NT C- 1/2 voff=0 +BA 14 CR 3+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 14 CR 3+1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 15 CR 0 TR 3 CH 3 NT E- 1/2 voff=0 +BA 15 CR 0 TR 3 CH 3 NT G- 1/2 voff=0 +BA 15 CR 0 TR 3 CH 3 NT B- 1/2 voff=0 +BA 15 CR 1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 15 CR 1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 15 CR 1/2 TR 3 CH 3 NT B- 1/2 voff=0 +BA 15 CR 1 TR 3 CH 3 NT F#- 1/2 voff=0 +BA 15 CR 1 TR 3 CH 3 NT A- 1/2 voff=0 +BA 15 CR 1 TR 3 CH 3 NT C- 1/2 voff=0 +BA 15 CR 1+1/2 TR 3 CH 3 NT F#- 1/2 voff=0 +BA 15 CR 1+1/2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 15 CR 1+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 15 CR 2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 15 CR 2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 15 CR 2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 15 CR 2+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 15 CR 2+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 15 CR 2+1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 15 CR 3 TR 3 CH 3 NT C- 1/2 voff=0 +BA 15 CR 3 TR 3 CH 3 NT E- 1/2 voff=0 +BA 15 CR 3 TR 3 CH 3 NT G- 1/2 voff=0 +BA 15 CR 3+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 15 CR 3+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 15 CR 3+1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 16 CR 0 TR 3 CH 3 NT F- 1/2 voff=0 +BA 16 CR 0 TR 3 CH 3 NT A- 1/2 voff=0 +BA 16 CR 0 TR 3 CH 3 NT C- 1/2 voff=0 +BA 16 CR 1/2 TR 3 CH 3 NT F- 1/2 voff=0 +BA 16 CR 1/2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 16 CR 1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 16 CR 1 TR 3 CH 3 NT C- 1/2 voff=0 +BA 16 CR 1 TR 3 CH 3 NT E- 1/2 voff=0 +BA 16 CR 1 TR 3 CH 3 NT G- 1/2 voff=0 +BA 16 CR 1+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 16 CR 1+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 16 CR 1+1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 16 CR 2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 16 CR 2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 16 CR 2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 16 CR 2+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 16 CR 2+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 16 CR 2+1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 16 CR 3 TR 3 CH 3 NT C- 1/2 voff=0 +BA 16 CR 3 TR 3 CH 3 NT E- 1/2 voff=0 +BA 16 CR 3 TR 3 CH 3 NT G- 1/2 voff=0 +BA 16 CR 3+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 16 CR 3+1/2 TR 3 
CH 3 NT E- 1/2 voff=0 +BA 16 CR 3+1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 17 CR 0 TR 3 CH 3 NT C- 1/2 voff=0 +BA 17 CR 0 TR 3 CH 3 NT E- 1/2 voff=0 +BA 17 CR 0 TR 3 CH 3 NT G- 1/2 voff=0 +BA 17 CR 1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 17 CR 1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 17 CR 1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 17 CR 1 TR 3 CH 3 NT F#- 1/2 voff=0 +BA 17 CR 1 TR 3 CH 3 NT A- 1/2 voff=0 +BA 17 CR 1 TR 3 CH 3 NT C- 1/2 voff=0 +BA 17 CR 1+1/2 TR 3 CH 3 NT F#- 1/2 voff=0 +BA 17 CR 1+1/2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 17 CR 1+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 17 CR 2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 17 CR 2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 17 CR 2 TR 3 CH 3 NT B- 1/2 voff=0 +BA 17 CR 2+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 17 CR 2+1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 17 CR 2+1/2 TR 3 CH 3 NT B- 1/2 voff=0 +BA 17 CR 3 TR 3 CH 3 NT E- 1/2 voff=0 +BA 17 CR 3 TR 3 CH 3 NT G- 1/2 voff=0 +BA 17 CR 3 TR 3 CH 3 NT B- 1/2 voff=0 +BA 17 CR 3+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 17 CR 3+1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 17 CR 3+1/2 TR 3 CH 3 NT B- 1/2 voff=0 +BA 18 CR 0 TR 3 CH 3 NT A- 1/2 voff=0 +BA 18 CR 0 TR 3 CH 3 NT C- 1/2 voff=0 +BA 18 CR 0 TR 3 CH 3 NT E- 1/2 voff=0 +BA 18 CR 1/2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 18 CR 1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 18 CR 1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 18 CR 1 TR 3 CH 3 NT G#- 1/2 voff=0 +BA 18 CR 1 TR 3 CH 3 NT C- 1/2 voff=0 +BA 18 CR 1 TR 3 CH 3 NT D- 1/2 voff=0 +BA 18 CR 1+1/2 TR 3 CH 3 NT G#- 1/2 voff=0 +BA 18 CR 1+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 18 CR 1+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 18 CR 2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 18 CR 2 TR 3 CH 3 NT B- 1/2 voff=0 +BA 18 CR 2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 18 CR 2+1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 18 CR 2+1/2 TR 3 CH 3 NT B- 1/2 voff=0 +BA 18 CR 2+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 18 CR 3 TR 3 CH 3 NT G- 1/2 voff=0 +BA 18 CR 3 TR 3 CH 3 NT B- 1/2 voff=0 +BA 18 CR 3 TR 3 CH 3 NT D- 1/2 voff=0 +BA 18 CR 3+1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 18 CR 3+1/2 TR 3 CH 3 NT B- 1/2 voff=0 +BA 18 CR 3+1/2 
TR 3 CH 3 NT D- 1/2 voff=0 +BA 19 CR 0 TR 3 CH 3 NT C- 1/2 voff=0 +BA 19 CR 0 TR 3 CH 3 NT E- 1/2 voff=0 +BA 19 CR 0 TR 3 CH 3 NT G- 1/2 voff=0 +BA 19 CR 1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 19 CR 1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 19 CR 1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 19 CR 1 TR 3 CH 3 NT F#- 1/2 voff=0 +BA 19 CR 1 TR 3 CH 3 NT A- 1/2 voff=0 +BA 19 CR 1 TR 3 CH 3 NT C- 1/2 voff=0 +BA 19 CR 1+1/2 TR 3 CH 3 NT F#- 1/2 voff=0 +BA 19 CR 1+1/2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 19 CR 1+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 19 CR 2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 19 CR 2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 19 CR 2 TR 3 CH 3 NT B- 1/2 voff=0 +BA 19 CR 2+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 19 CR 2+1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 19 CR 2+1/2 TR 3 CH 3 NT B- 1/2 voff=0 +BA 19 CR 3 TR 3 CH 3 NT E- 1/2 voff=0 +BA 19 CR 3 TR 3 CH 3 NT G- 1/2 voff=0 +BA 19 CR 3 TR 3 CH 3 NT B- 1/2 voff=0 +BA 19 CR 3+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 19 CR 3+1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 19 CR 3+1/2 TR 3 CH 3 NT B- 1/2 voff=0 +BA 20 CR 0 TR 3 CH 3 NT A- 1/2 voff=0 +BA 20 CR 0 TR 3 CH 3 NT C- 1/2 voff=0 +BA 20 CR 0 TR 3 CH 3 NT E- 1/2 voff=0 +BA 20 CR 1/2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 20 CR 1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 20 CR 1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 20 CR 1 TR 3 CH 3 NT G#- 1/2 voff=0 +BA 20 CR 1 TR 3 CH 3 NT C- 1/2 voff=0 +BA 20 CR 1 TR 3 CH 3 NT D- 1/2 voff=0 +BA 20 CR 1+1/2 TR 3 CH 3 NT G#- 1/2 voff=0 +BA 20 CR 1+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 20 CR 1+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 20 CR 2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 20 CR 2 TR 3 CH 3 NT B- 1/2 voff=0 +BA 20 CR 2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 20 CR 2+1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 20 CR 2+1/2 TR 3 CH 3 NT B- 1/2 voff=0 +BA 20 CR 2+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 20 CR 3 TR 3 CH 3 NT G- 1/2 voff=0 +BA 20 CR 3 TR 3 CH 3 NT B- 1/2 voff=0 +BA 20 CR 3 TR 3 CH 3 NT D- 1/2 voff=0 +BA 20 CR 3+1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 20 CR 3+1/2 TR 3 CH 3 NT B- 1/2 voff=0 +BA 20 CR 3+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 21 CR 0 
TR 3 CH 3 NT G- 1/2 voff=0 +BA 21 CR 0 TR 3 CH 3 NT B- 1/2 voff=0 +BA 21 CR 0 TR 3 CH 3 NT D- 1/2 voff=0 +BA 21 CR 1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 21 CR 1/2 TR 3 CH 3 NT B- 1/2 voff=0 +BA 21 CR 1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 21 CR 1 TR 3 CH 3 NT A#- 1/2 voff=0 +BA 21 CR 1 TR 3 CH 3 NT D- 1/2 voff=0 +BA 21 CR 1 TR 3 CH 3 NT F- 1/2 voff=0 +BA 21 CR 1+1/2 TR 3 CH 3 NT A#- 1/2 voff=0 +BA 21 CR 1+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 21 CR 1+1/2 TR 3 CH 3 NT F- 1/2 voff=0 +BA 21 CR 2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 21 CR 2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 21 CR 2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 21 CR 2+1/2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 21 CR 2+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 21 CR 2+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 21 CR 3 TR 3 CH 3 NT A- 1/2 voff=0 +BA 21 CR 3 TR 3 CH 3 NT C- 1/2 voff=0 +BA 21 CR 3 TR 3 CH 3 NT E- 1/2 voff=0 +BA 21 CR 3+1/2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 21 CR 3+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 21 CR 3+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 22 CR 0 TR 3 CH 3 NT D- 1/2 voff=0 +BA 22 CR 0 TR 3 CH 3 NT F- 1/2 voff=0 +BA 22 CR 0 TR 3 CH 3 NT A- 1/2 voff=0 +BA 22 CR 1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 22 CR 1/2 TR 3 CH 3 NT F- 1/2 voff=0 +BA 22 CR 1/2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 22 CR 1 TR 3 CH 3 NT D- 1/2 voff=0 +BA 22 CR 1 TR 3 CH 3 NT F- 1/2 voff=0 +BA 22 CR 1 TR 3 CH 3 NT A- 1/2 voff=0 +BA 22 CR 1+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 22 CR 1+1/2 TR 3 CH 3 NT F- 1/2 voff=0 +BA 22 CR 1+1/2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 22 CR 2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 22 CR 2 TR 3 CH 3 NT F- 1/2 voff=0 +BA 22 CR 2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 22 CR 2+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 22 CR 2+1/2 TR 3 CH 3 NT F- 1/2 voff=0 +BA 22 CR 2+1/2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 22 CR 3 TR 3 CH 3 NT D- 1/2 voff=0 +BA 22 CR 3 TR 3 CH 3 NT F- 1/2 voff=0 +BA 22 CR 3 TR 3 CH 3 NT A- 1/2 voff=0 +BA 22 CR 3+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 22 CR 3+1/2 TR 3 CH 3 NT F- 1/2 voff=0 +BA 22 CR 3+1/2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 23 CR 0 TR 3 CH 3 NT G- 1/2 voff=0 +BA 23 CR 0 TR 
3 CH 3 NT B- 1/2 voff=0 +BA 23 CR 0 TR 3 CH 3 NT D- 1/2 voff=0 +BA 23 CR 1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 23 CR 1/2 TR 3 CH 3 NT B- 1/2 voff=0 +BA 23 CR 1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 23 CR 1 TR 3 CH 3 NT A#- 1/2 voff=0 +BA 23 CR 1 TR 3 CH 3 NT D- 1/2 voff=0 +BA 23 CR 1 TR 3 CH 3 NT F- 1/2 voff=0 +BA 23 CR 1+1/2 TR 3 CH 3 NT A#- 1/2 voff=0 +BA 23 CR 1+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 23 CR 1+1/2 TR 3 CH 3 NT F- 1/2 voff=0 +BA 23 CR 2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 23 CR 2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 23 CR 2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 23 CR 2+1/2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 23 CR 2+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 23 CR 2+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 23 CR 3 TR 3 CH 3 NT A- 1/2 voff=0 +BA 23 CR 3 TR 3 CH 3 NT C- 1/2 voff=0 +BA 23 CR 3 TR 3 CH 3 NT E- 1/2 voff=0 +BA 23 CR 3+1/2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 23 CR 3+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 23 CR 3+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 24 CR 0 TR 3 CH 3 NT D- 1/2 voff=0 +BA 24 CR 0 TR 3 CH 3 NT F- 1/2 voff=0 +BA 24 CR 0 TR 3 CH 3 NT A- 1/2 voff=0 +BA 24 CR 1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 24 CR 1/2 TR 3 CH 3 NT F- 1/2 voff=0 +BA 24 CR 1/2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 24 CR 1 TR 3 CH 3 NT D- 1/2 voff=0 +BA 24 CR 1 TR 3 CH 3 NT F- 1/2 voff=0 +BA 24 CR 1 TR 3 CH 3 NT A- 1/2 voff=0 +BA 24 CR 1+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 24 CR 1+1/2 TR 3 CH 3 NT F- 1/2 voff=0 +BA 24 CR 1+1/2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 24 CR 2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 24 CR 2 TR 3 CH 3 NT F- 1/2 voff=0 +BA 24 CR 2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 24 CR 2+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 24 CR 2+1/2 TR 3 CH 3 NT F- 1/2 voff=0 +BA 24 CR 2+1/2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 24 CR 3 TR 3 CH 3 NT D- 1/2 voff=0 +BA 24 CR 3 TR 3 CH 3 NT F- 1/2 voff=0 +BA 24 CR 3 TR 3 CH 3 NT A- 1/2 voff=0 +BA 24 CR 3+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 24 CR 3+1/2 TR 3 CH 3 NT F- 1/2 voff=0 +BA 24 CR 3+1/2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 29 CR 0 TR 3 CH 3 NT C- 1/2 voff=0 +BA 29 CR 0 TR 3 CH 3 NT E- 1/2 voff=0 +BA 29 CR 0 TR 3 
CH 3 NT G- 1/2 voff=0 +BA 29 CR 1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 29 CR 1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 29 CR 1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 29 CR 1 TR 3 CH 3 NT F#- 1/2 voff=0 +BA 29 CR 1 TR 3 CH 3 NT A- 1/2 voff=0 +BA 29 CR 1 TR 3 CH 3 NT C- 1/2 voff=0 +BA 29 CR 1+1/2 TR 3 CH 3 NT F#- 1/2 voff=0 +BA 29 CR 1+1/2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 29 CR 1+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 29 CR 2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 29 CR 2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 29 CR 2 TR 3 CH 3 NT B- 1/2 voff=0 +BA 29 CR 2+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 29 CR 2+1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 29 CR 2+1/2 TR 3 CH 3 NT B- 1/2 voff=0 +BA 29 CR 3 TR 3 CH 3 NT E- 1/2 voff=0 +BA 29 CR 3 TR 3 CH 3 NT G- 1/2 voff=0 +BA 29 CR 3 TR 3 CH 3 NT B- 1/2 voff=0 +BA 29 CR 3+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 29 CR 3+1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 29 CR 3+1/2 TR 3 CH 3 NT B- 1/2 voff=0 +BA 30 CR 0 TR 3 CH 3 NT A- 1/2 voff=0 +BA 30 CR 0 TR 3 CH 3 NT C- 1/2 voff=0 +BA 30 CR 0 TR 3 CH 3 NT E- 1/2 voff=0 +BA 30 CR 1/2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 30 CR 1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 30 CR 1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 30 CR 1 TR 3 CH 3 NT G#- 1/2 voff=0 +BA 30 CR 1 TR 3 CH 3 NT C- 1/2 voff=0 +BA 30 CR 1 TR 3 CH 3 NT D- 1/2 voff=0 +BA 30 CR 1+1/2 TR 3 CH 3 NT G#- 1/2 voff=0 +BA 30 CR 1+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 30 CR 1+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 30 CR 2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 30 CR 2 TR 3 CH 3 NT B- 1/2 voff=0 +BA 30 CR 2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 30 CR 2+1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 30 CR 2+1/2 TR 3 CH 3 NT B- 1/2 voff=0 +BA 30 CR 2+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 30 CR 3 TR 3 CH 3 NT G- 1/2 voff=0 +BA 30 CR 3 TR 3 CH 3 NT B- 1/2 voff=0 +BA 30 CR 3 TR 3 CH 3 NT D- 1/2 voff=0 +BA 30 CR 3+1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 30 CR 3+1/2 TR 3 CH 3 NT B- 1/2 voff=0 +BA 30 CR 3+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 31 CR 0 TR 3 CH 3 NT C- 1/2 voff=0 +BA 31 CR 0 TR 3 CH 3 NT E- 1/2 voff=0 +BA 31 CR 0 TR 3 CH 3 NT G- 1/2 voff=0 +BA 31 CR 1/2 TR 
3 CH 3 NT C- 1/2 voff=0 +BA 31 CR 1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 31 CR 1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 31 CR 1 TR 3 CH 3 NT F#- 1/2 voff=0 +BA 31 CR 1 TR 3 CH 3 NT A- 1/2 voff=0 +BA 31 CR 1 TR 3 CH 3 NT C- 1/2 voff=0 +BA 31 CR 1+1/2 TR 3 CH 3 NT F#- 1/2 voff=0 +BA 31 CR 1+1/2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 31 CR 1+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 31 CR 2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 31 CR 2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 31 CR 2 TR 3 CH 3 NT B- 1/2 voff=0 +BA 31 CR 2+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 31 CR 2+1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 31 CR 2+1/2 TR 3 CH 3 NT B- 1/2 voff=0 +BA 31 CR 3 TR 3 CH 3 NT E- 1/2 voff=0 +BA 31 CR 3 TR 3 CH 3 NT G- 1/2 voff=0 +BA 31 CR 3 TR 3 CH 3 NT B- 1/2 voff=0 +BA 31 CR 3+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 31 CR 3+1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 31 CR 3+1/2 TR 3 CH 3 NT B- 1/2 voff=0 +BA 32 CR 0 TR 3 CH 3 NT A- 1/2 voff=0 +BA 32 CR 0 TR 3 CH 3 NT C- 1/2 voff=0 +BA 32 CR 0 TR 3 CH 3 NT E- 1/2 voff=0 +BA 32 CR 1/2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 32 CR 1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 32 CR 1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 32 CR 1 TR 3 CH 3 NT G#- 1/2 voff=0 +BA 32 CR 1 TR 3 CH 3 NT C- 1/2 voff=0 +BA 32 CR 1 TR 3 CH 3 NT D- 1/2 voff=0 +BA 32 CR 1+1/2 TR 3 CH 3 NT G#- 1/2 voff=0 +BA 32 CR 1+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 32 CR 1+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 32 CR 2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 32 CR 2 TR 3 CH 3 NT B- 1/2 voff=0 +BA 32 CR 2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 32 CR 2+1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 32 CR 2+1/2 TR 3 CH 3 NT B- 1/2 voff=0 +BA 32 CR 2+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 32 CR 3 TR 3 CH 3 NT G- 1/2 voff=0 +BA 32 CR 3 TR 3 CH 3 NT B- 1/2 voff=0 +BA 32 CR 3 TR 3 CH 3 NT D- 1/2 voff=0 +BA 32 CR 3+1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 32 CR 3+1/2 TR 3 CH 3 NT B- 1/2 voff=0 +BA 32 CR 3+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 1 CR 0 TR 4 CH 4 Instrument 29 +BA 1 CR 0 TR 4 CH 4 NT R 0 von=127 voff=0 +BA 1 CR 0 TR 4 CH 4 NT B- 1/2 voff=0 +BA 1 CR 0 TR 4 CH 4 NT D- 1/2 voff=0 +BA 1 CR 0 TR 4 CH 
4 NT F- 1/2 voff=0 +BA 1 CR 1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 1 CR 1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 1 CR 1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 1 CR 1 TR 4 CH 4 NT D- 1/2 voff=0 +BA 1 CR 1 TR 4 CH 4 NT F- 1/2 voff=0 +BA 1 CR 1 TR 4 CH 4 NT A- 1/2 voff=0 +BA 1 CR 1+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 1 CR 1+1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 1 CR 1+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 1 CR 2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 1 CR 2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 1 CR 2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 1 CR 2+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 1 CR 2+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 1 CR 2+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 1 CR 3 TR 4 CH 4 NT C- 1/2 voff=0 +BA 1 CR 3 TR 4 CH 4 NT E- 1/2 voff=0 +BA 1 CR 3 TR 4 CH 4 NT G- 1/2 voff=0 +BA 1 CR 3+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 1 CR 3+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 1 CR 3+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 2 CR 0 TR 4 CH 4 NT F- 1/2 voff=0 +BA 2 CR 0 TR 4 CH 4 NT A- 1/2 voff=0 +BA 2 CR 0 TR 4 CH 4 NT C- 1/2 voff=0 +BA 2 CR 1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 2 CR 1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 2 CR 1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 2 CR 1 TR 4 CH 4 NT A#- 1/2 voff=0 +BA 2 CR 1 TR 4 CH 4 NT D- 1/2 voff=0 +BA 2 CR 1 TR 4 CH 4 NT F- 1/2 voff=0 +BA 2 CR 1+1/2 TR 4 CH 4 NT A#- 1/2 voff=0 +BA 2 CR 1+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 2 CR 1+1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 2 CR 2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 2 CR 2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 2 CR 2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 2 CR 2+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 2 CR 2+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 2 CR 2+1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 2 CR 3 TR 4 CH 4 NT E- 1/2 voff=0 +BA 2 CR 3 TR 4 CH 4 NT G- 1/2 voff=0 +BA 2 CR 3 TR 4 CH 4 NT B- 1/2 voff=0 +BA 2 CR 3+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 2 CR 3+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 2 CR 3+1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 3 CR 0 TR 4 CH 4 NT B- 1/2 voff=0 +BA 3 CR 0 TR 4 CH 4 NT D- 1/2 voff=0 +BA 3 CR 0 TR 4 CH 4 NT F- 1/2 voff=0 +BA 3 CR 1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 3 CR 1/2 TR 4 CH 4 NT D- 
1/2 voff=0 +BA 3 CR 1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 3 CR 1 TR 4 CH 4 NT D- 1/2 voff=0 +BA 3 CR 1 TR 4 CH 4 NT F- 1/2 voff=0 +BA 3 CR 1 TR 4 CH 4 NT A- 1/2 voff=0 +BA 3 CR 1+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 3 CR 1+1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 3 CR 1+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 3 CR 2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 3 CR 2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 3 CR 2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 3 CR 2+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 3 CR 2+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 3 CR 2+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 3 CR 3 TR 4 CH 4 NT C- 1/2 voff=0 +BA 3 CR 3 TR 4 CH 4 NT E- 1/2 voff=0 +BA 3 CR 3 TR 4 CH 4 NT G- 1/2 voff=0 +BA 3 CR 3+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 3 CR 3+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 3 CR 3+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 4 CR 0 TR 4 CH 4 NT F- 1/2 voff=0 +BA 4 CR 0 TR 4 CH 4 NT A- 1/2 voff=0 +BA 4 CR 0 TR 4 CH 4 NT C- 1/2 voff=0 +BA 4 CR 1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 4 CR 1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 4 CR 1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 4 CR 1 TR 4 CH 4 NT A#- 1/2 voff=0 +BA 4 CR 1 TR 4 CH 4 NT D- 1/2 voff=0 +BA 4 CR 1 TR 4 CH 4 NT F- 1/2 voff=0 +BA 4 CR 1+1/2 TR 4 CH 4 NT A#- 1/2 voff=0 +BA 4 CR 1+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 4 CR 1+1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 4 CR 2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 4 CR 2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 4 CR 2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 4 CR 2+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 4 CR 2+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 4 CR 2+1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 4 CR 3 TR 4 CH 4 NT E- 1/2 voff=0 +BA 4 CR 3 TR 4 CH 4 NT G- 1/2 voff=0 +BA 4 CR 3 TR 4 CH 4 NT B- 1/2 voff=0 +BA 4 CR 3+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 4 CR 3+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 4 CR 3+1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 5 CR 0 TR 4 CH 4 NT E- 1/2 voff=0 +BA 5 CR 0 TR 4 CH 4 NT G- 1/2 voff=0 +BA 5 CR 0 TR 4 CH 4 NT B- 1/2 voff=0 +BA 5 CR 1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 5 CR 1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 5 CR 1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 5 CR 1 TR 4 CH 4 NT F#- 1/2 voff=0 
+BA 5 CR 1 TR 4 CH 4 NT A- 1/2 voff=0 +BA 5 CR 1 TR 4 CH 4 NT C- 1/2 voff=0 +BA 5 CR 1+1/2 TR 4 CH 4 NT F#- 1/2 voff=0 +BA 5 CR 1+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 5 CR 1+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 5 CR 2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 5 CR 2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 5 CR 2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 5 CR 2+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 5 CR 2+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 5 CR 2+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 5 CR 3 TR 4 CH 4 NT C- 1/2 voff=0 +BA 5 CR 3 TR 4 CH 4 NT E- 1/2 voff=0 +BA 5 CR 3 TR 4 CH 4 NT G- 1/2 voff=0 +BA 5 CR 3+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 5 CR 3+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 5 CR 3+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 6 CR 0 TR 4 CH 4 NT F- 1/2 voff=0 +BA 6 CR 0 TR 4 CH 4 NT A- 1/2 voff=0 +BA 6 CR 0 TR 4 CH 4 NT C- 1/2 voff=0 +BA 6 CR 1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 6 CR 1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 6 CR 1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 6 CR 1 TR 4 CH 4 NT C- 1/2 voff=0 +BA 6 CR 1 TR 4 CH 4 NT E- 1/2 voff=0 +BA 6 CR 1 TR 4 CH 4 NT G- 1/2 voff=0 +BA 6 CR 1+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 6 CR 1+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 6 CR 1+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 6 CR 2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 6 CR 2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 6 CR 2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 6 CR 2+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 6 CR 2+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 6 CR 2+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 6 CR 3 TR 4 CH 4 NT C- 1/2 voff=0 +BA 6 CR 3 TR 4 CH 4 NT E- 1/2 voff=0 +BA 6 CR 3 TR 4 CH 4 NT G- 1/2 voff=0 +BA 6 CR 3+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 6 CR 3+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 6 CR 3+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 7 CR 0 TR 4 CH 4 NT E- 1/2 voff=0 +BA 7 CR 0 TR 4 CH 4 NT G- 1/2 voff=0 +BA 7 CR 0 TR 4 CH 4 NT B- 1/2 voff=0 +BA 7 CR 1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 7 CR 1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 7 CR 1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 7 CR 1 TR 4 CH 4 NT F#- 1/2 voff=0 +BA 7 CR 1 TR 4 CH 4 NT A- 1/2 voff=0 +BA 7 CR 1 TR 4 CH 4 NT C- 1/2 voff=0 +BA 7 CR 
1+1/2 TR 4 CH 4 NT F#- 1/2 voff=0 +BA 7 CR 1+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 7 CR 1+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 7 CR 2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 7 CR 2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 7 CR 2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 7 CR 2+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 7 CR 2+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 7 CR 2+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 7 CR 3 TR 4 CH 4 NT C- 1/2 voff=0 +BA 7 CR 3 TR 4 CH 4 NT E- 1/2 voff=0 +BA 7 CR 3 TR 4 CH 4 NT G- 1/2 voff=0 +BA 7 CR 3+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 7 CR 3+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 7 CR 3+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 8 CR 0 TR 4 CH 4 NT F- 1/2 voff=0 +BA 8 CR 0 TR 4 CH 4 NT A- 1/2 voff=0 +BA 8 CR 0 TR 4 CH 4 NT C- 1/2 voff=0 +BA 8 CR 1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 8 CR 1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 8 CR 1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 8 CR 1 TR 4 CH 4 NT C- 1/2 voff=0 +BA 8 CR 1 TR 4 CH 4 NT E- 1/2 voff=0 +BA 8 CR 1 TR 4 CH 4 NT G- 1/2 voff=0 +BA 8 CR 1+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 8 CR 1+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 8 CR 1+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 8 CR 2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 8 CR 2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 8 CR 2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 8 CR 2+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 8 CR 2+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 8 CR 2+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 8 CR 3 TR 4 CH 4 NT C- 1/2 voff=0 +BA 8 CR 3 TR 4 CH 4 NT E- 1/2 voff=0 +BA 8 CR 3 TR 4 CH 4 NT G- 1/2 voff=0 +BA 8 CR 3+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 8 CR 3+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 8 CR 3+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 9 CR 0 TR 4 CH 4 NT C- 1/2 voff=0 +BA 9 CR 0 TR 4 CH 4 NT E- 1/2 voff=0 +BA 9 CR 0 TR 4 CH 4 NT G- 1/2 voff=0 +BA 9 CR 1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 9 CR 1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 9 CR 1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 9 CR 1 TR 4 CH 4 NT D#- 1/2 voff=0 +BA 9 CR 1 TR 4 CH 4 NT G- 1/2 voff=0 +BA 9 CR 1 TR 4 CH 4 NT A- 1/2 voff=0 +BA 9 CR 1+1/2 TR 4 CH 4 NT D#- 1/2 voff=0 +BA 9 CR 1+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 9 CR 
1+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 9 CR 2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 9 CR 2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 9 CR 2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 9 CR 2+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 9 CR 2+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 9 CR 2+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 9 CR 3 TR 4 CH 4 NT A- 1/2 voff=0 +BA 9 CR 3 TR 4 CH 4 NT C- 1/2 voff=0 +BA 9 CR 3 TR 4 CH 4 NT E- 1/2 voff=0 +BA 9 CR 3+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 9 CR 3+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 9 CR 3+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 10 CR 0 TR 4 CH 4 NT D- 1/2 voff=0 +BA 10 CR 0 TR 4 CH 4 NT F- 1/2 voff=0 +BA 10 CR 0 TR 4 CH 4 NT A- 1/2 voff=0 +BA 10 CR 1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 10 CR 1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 10 CR 1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 10 CR 1 TR 4 CH 4 NT C#- 1/2 voff=0 +BA 10 CR 1 TR 4 CH 4 NT F- 1/2 voff=0 +BA 10 CR 1 TR 4 CH 4 NT G- 1/2 voff=0 +BA 10 CR 1+1/2 TR 4 CH 4 NT C#- 1/2 voff=0 +BA 10 CR 1+1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 10 CR 1+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 10 CR 2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 10 CR 2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 10 CR 2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 10 CR 2+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 10 CR 2+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 10 CR 2+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 10 CR 3 TR 4 CH 4 NT C- 1/2 voff=0 +BA 10 CR 3 TR 4 CH 4 NT E- 1/2 voff=0 +BA 10 CR 3 TR 4 CH 4 NT G- 1/2 voff=0 +BA 10 CR 3+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 10 CR 3+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 10 CR 3+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 11 CR 0 TR 4 CH 4 NT C- 1/2 voff=0 +BA 11 CR 0 TR 4 CH 4 NT E- 1/2 voff=0 +BA 11 CR 0 TR 4 CH 4 NT G- 1/2 voff=0 +BA 11 CR 1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 11 CR 1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 11 CR 1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 11 CR 1 TR 4 CH 4 NT D#- 1/2 voff=0 +BA 11 CR 1 TR 4 CH 4 NT G- 1/2 voff=0 +BA 11 CR 1 TR 4 CH 4 NT A- 1/2 voff=0 +BA 11 CR 1+1/2 TR 4 CH 4 NT D#- 1/2 voff=0 +BA 11 CR 1+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 11 CR 1+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 11 CR 2 TR 4 
CH 4 NT A- 1/2 voff=0 +BA 11 CR 2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 11 CR 2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 11 CR 2+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 11 CR 2+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 11 CR 2+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 11 CR 3 TR 4 CH 4 NT A- 1/2 voff=0 +BA 11 CR 3 TR 4 CH 4 NT C- 1/2 voff=0 +BA 11 CR 3 TR 4 CH 4 NT E- 1/2 voff=0 +BA 11 CR 3+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 11 CR 3+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 11 CR 3+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 12 CR 0 TR 4 CH 4 NT D- 1/2 voff=0 +BA 12 CR 0 TR 4 CH 4 NT F- 1/2 voff=0 +BA 12 CR 0 TR 4 CH 4 NT A- 1/2 voff=0 +BA 12 CR 1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 12 CR 1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 12 CR 1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 12 CR 1 TR 4 CH 4 NT C#- 1/2 voff=0 +BA 12 CR 1 TR 4 CH 4 NT F- 1/2 voff=0 +BA 12 CR 1 TR 4 CH 4 NT G- 1/2 voff=0 +BA 12 CR 1+1/2 TR 4 CH 4 NT C#- 1/2 voff=0 +BA 12 CR 1+1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 12 CR 1+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 12 CR 2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 12 CR 2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 12 CR 2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 12 CR 2+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 12 CR 2+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 12 CR 2+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 12 CR 3 TR 4 CH 4 NT C- 1/2 voff=0 +BA 12 CR 3 TR 4 CH 4 NT E- 1/2 voff=0 +BA 12 CR 3 TR 4 CH 4 NT G- 1/2 voff=0 +BA 12 CR 3+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 12 CR 3+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 12 CR 3+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 17 CR 0 TR 4 CH 4 NT C- 1/2 voff=0 +BA 17 CR 0 TR 4 CH 4 NT E- 1/2 voff=0 +BA 17 CR 0 TR 4 CH 4 NT G- 1/2 voff=0 +BA 17 CR 1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 17 CR 1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 17 CR 1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 17 CR 1 TR 4 CH 4 NT F#- 1/2 voff=0 +BA 17 CR 1 TR 4 CH 4 NT A- 1/2 voff=0 +BA 17 CR 1 TR 4 CH 4 NT C- 1/2 voff=0 +BA 17 CR 1+1/2 TR 4 CH 4 NT F#- 1/2 voff=0 +BA 17 CR 1+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 17 CR 1+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 17 CR 2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 17 CR 2 TR 4 
CH 4 NT G- 1/2 voff=0 +BA 17 CR 2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 17 CR 2+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 17 CR 2+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 17 CR 2+1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 17 CR 3 TR 4 CH 4 NT E- 1/2 voff=0 +BA 17 CR 3 TR 4 CH 4 NT G- 1/2 voff=0 +BA 17 CR 3 TR 4 CH 4 NT B- 1/2 voff=0 +BA 17 CR 3+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 17 CR 3+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 17 CR 3+1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 18 CR 0 TR 4 CH 4 NT A- 1/2 voff=0 +BA 18 CR 0 TR 4 CH 4 NT C- 1/2 voff=0 +BA 18 CR 0 TR 4 CH 4 NT E- 1/2 voff=0 +BA 18 CR 1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 18 CR 1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 18 CR 1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 18 CR 1 TR 4 CH 4 NT G#- 1/2 voff=0 +BA 18 CR 1 TR 4 CH 4 NT C- 1/2 voff=0 +BA 18 CR 1 TR 4 CH 4 NT D- 1/2 voff=0 +BA 18 CR 1+1/2 TR 4 CH 4 NT G#- 1/2 voff=0 +BA 18 CR 1+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 18 CR 1+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 18 CR 2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 18 CR 2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 18 CR 2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 18 CR 2+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 18 CR 2+1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 18 CR 2+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 18 CR 3 TR 4 CH 4 NT G- 1/2 voff=0 +BA 18 CR 3 TR 4 CH 4 NT B- 1/2 voff=0 +BA 18 CR 3 TR 4 CH 4 NT D- 1/2 voff=0 +BA 18 CR 3+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 18 CR 3+1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 18 CR 3+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 19 CR 0 TR 4 CH 4 NT C- 1/2 voff=0 +BA 19 CR 0 TR 4 CH 4 NT E- 1/2 voff=0 +BA 19 CR 0 TR 4 CH 4 NT G- 1/2 voff=0 +BA 19 CR 1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 19 CR 1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 19 CR 1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 19 CR 1 TR 4 CH 4 NT F#- 1/2 voff=0 +BA 19 CR 1 TR 4 CH 4 NT A- 1/2 voff=0 +BA 19 CR 1 TR 4 CH 4 NT C- 1/2 voff=0 +BA 19 CR 1+1/2 TR 4 CH 4 NT F#- 1/2 voff=0 +BA 19 CR 1+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 19 CR 1+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 19 CR 2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 19 CR 2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 19 CR 2 TR 4 
CH 4 NT B- 1/2 voff=0 +BA 19 CR 2+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 19 CR 2+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 19 CR 2+1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 19 CR 3 TR 4 CH 4 NT E- 1/2 voff=0 +BA 19 CR 3 TR 4 CH 4 NT G- 1/2 voff=0 +BA 19 CR 3 TR 4 CH 4 NT B- 1/2 voff=0 +BA 19 CR 3+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 19 CR 3+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 19 CR 3+1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 20 CR 0 TR 4 CH 4 NT A- 1/2 voff=0 +BA 20 CR 0 TR 4 CH 4 NT C- 1/2 voff=0 +BA 20 CR 0 TR 4 CH 4 NT E- 1/2 voff=0 +BA 20 CR 1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 20 CR 1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 20 CR 1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 20 CR 1 TR 4 CH 4 NT G#- 1/2 voff=0 +BA 20 CR 1 TR 4 CH 4 NT C- 1/2 voff=0 +BA 20 CR 1 TR 4 CH 4 NT D- 1/2 voff=0 +BA 20 CR 1+1/2 TR 4 CH 4 NT G#- 1/2 voff=0 +BA 20 CR 1+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 20 CR 1+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 20 CR 2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 20 CR 2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 20 CR 2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 20 CR 2+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 20 CR 2+1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 20 CR 2+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 20 CR 3 TR 4 CH 4 NT G- 1/2 voff=0 +BA 20 CR 3 TR 4 CH 4 NT B- 1/2 voff=0 +BA 20 CR 3 TR 4 CH 4 NT D- 1/2 voff=0 +BA 20 CR 3+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 20 CR 3+1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 20 CR 3+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 29 CR 0 TR 4 CH 4 NT C- 1/2 voff=0 +BA 29 CR 0 TR 4 CH 4 NT E- 1/2 voff=0 +BA 29 CR 0 TR 4 CH 4 NT G- 1/2 voff=0 +BA 29 CR 1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 29 CR 1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 29 CR 1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 29 CR 1 TR 4 CH 4 NT F#- 1/2 voff=0 +BA 29 CR 1 TR 4 CH 4 NT A- 1/2 voff=0 +BA 29 CR 1 TR 4 CH 4 NT C- 1/2 voff=0 +BA 29 CR 1+1/2 TR 4 CH 4 NT F#- 1/2 voff=0 +BA 29 CR 1+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 29 CR 1+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 29 CR 2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 29 CR 2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 29 CR 2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 29 CR 2+1/2 
TR 4 CH 4 NT E- 1/2 voff=0 +BA 29 CR 2+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 29 CR 2+1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 29 CR 3 TR 4 CH 4 NT E- 1/2 voff=0 +BA 29 CR 3 TR 4 CH 4 NT G- 1/2 voff=0 +BA 29 CR 3 TR 4 CH 4 NT B- 1/2 voff=0 +BA 29 CR 3+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 29 CR 3+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 29 CR 3+1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 30 CR 0 TR 4 CH 4 NT A- 1/2 voff=0 +BA 30 CR 0 TR 4 CH 4 NT C- 1/2 voff=0 +BA 30 CR 0 TR 4 CH 4 NT E- 1/2 voff=0 +BA 30 CR 1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 30 CR 1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 30 CR 1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 30 CR 1 TR 4 CH 4 NT G#- 1/2 voff=0 +BA 30 CR 1 TR 4 CH 4 NT C- 1/2 voff=0 +BA 30 CR 1 TR 4 CH 4 NT D- 1/2 voff=0 +BA 30 CR 1+1/2 TR 4 CH 4 NT G#- 1/2 voff=0 +BA 30 CR 1+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 30 CR 1+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 30 CR 2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 30 CR 2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 30 CR 2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 30 CR 2+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 30 CR 2+1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 30 CR 2+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 30 CR 3 TR 4 CH 4 NT G- 1/2 voff=0 +BA 30 CR 3 TR 4 CH 4 NT B- 1/2 voff=0 +BA 30 CR 3 TR 4 CH 4 NT D- 1/2 voff=0 +BA 30 CR 3+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 30 CR 3+1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 30 CR 3+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 31 CR 0 TR 4 CH 4 NT C- 1/2 voff=0 +BA 31 CR 0 TR 4 CH 4 NT E- 1/2 voff=0 +BA 31 CR 0 TR 4 CH 4 NT G- 1/2 voff=0 +BA 31 CR 1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 31 CR 1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 31 CR 1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 31 CR 1 TR 4 CH 4 NT F#- 1/2 voff=0 +BA 31 CR 1 TR 4 CH 4 NT A- 1/2 voff=0 +BA 31 CR 1 TR 4 CH 4 NT C- 1/2 voff=0 +BA 31 CR 1+1/2 TR 4 CH 4 NT F#- 1/2 voff=0 +BA 31 CR 1+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 31 CR 1+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 31 CR 2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 31 CR 2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 31 CR 2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 31 CR 2+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 31 CR 
2+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 31 CR 2+1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 31 CR 3 TR 4 CH 4 NT E- 1/2 voff=0 +BA 31 CR 3 TR 4 CH 4 NT G- 1/2 voff=0 +BA 31 CR 3 TR 4 CH 4 NT B- 1/2 voff=0 +BA 31 CR 3+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 31 CR 3+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 31 CR 3+1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 32 CR 0 TR 4 CH 4 NT A- 1/2 voff=0 +BA 32 CR 0 TR 4 CH 4 NT C- 1/2 voff=0 +BA 32 CR 0 TR 4 CH 4 NT E- 1/2 voff=0 +BA 32 CR 1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 32 CR 1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 32 CR 1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 32 CR 1 TR 4 CH 4 NT G#- 1/2 voff=0 +BA 32 CR 1 TR 4 CH 4 NT C- 1/2 voff=0 +BA 32 CR 1 TR 4 CH 4 NT D- 1/2 voff=0 +BA 32 CR 1+1/2 TR 4 CH 4 NT G#- 1/2 voff=0 +BA 32 CR 1+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 32 CR 1+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 32 CR 2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 32 CR 2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 32 CR 2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 32 CR 2+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 32 CR 2+1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 32 CR 2+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 32 CR 3 TR 4 CH 4 NT G- 1/2 voff=0 +BA 32 CR 3 TR 4 CH 4 NT B- 1/2 voff=0 +BA 32 CR 3 TR 4 CH 4 NT D- 1/2 voff=0 +BA 32 CR 3+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 32 CR 3+1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 32 CR 3+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 33 CR 0 TR 4 CH 4 NT E- 1/2 voff=0 +BA 33 CR 0 TR 4 CH 4 NT G- 1/2 voff=0 +BA 33 CR 0 TR 4 CH 4 NT B- 1/2 voff=0 +BA 33 CR 1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 33 CR 1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 33 CR 1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 33 CR 1 TR 4 CH 4 NT E- 1/2 voff=0 +BA 33 CR 1 TR 4 CH 4 NT G- 1/2 voff=0 +BA 33 CR 1 TR 4 CH 4 NT B- 1/2 voff=0 +BA 33 CR 1+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 33 CR 1+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 33 CR 1+1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 33 CR 2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 33 CR 2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 33 CR 2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 33 CR 2+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 33 CR 2+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 33 
CR 2+1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 33 CR 3 TR 4 CH 4 NT E- 1/2 voff=0 +BA 33 CR 3 TR 4 CH 4 NT G- 1/2 voff=0 +BA 33 CR 3 TR 4 CH 4 NT B- 1/2 voff=0 +BA 33 CR 3+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 33 CR 3+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 33 CR 3+1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 34 CR 0 TR 4 CH 4 NT A- 1/2 voff=0 +BA 34 CR 0 TR 4 CH 4 NT C- 1/2 voff=0 +BA 34 CR 0 TR 4 CH 4 NT E- 1/2 voff=0 +BA 34 CR 1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 34 CR 1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 34 CR 1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 34 CR 1 TR 4 CH 4 NT B- 1/2 voff=0 +BA 34 CR 1 TR 4 CH 4 NT D- 1/2 voff=0 +BA 34 CR 1 TR 4 CH 4 NT F- 1/2 voff=0 +BA 34 CR 1+1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 34 CR 1+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 34 CR 1+1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 34 CR 2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 34 CR 2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 34 CR 2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 34 CR 2+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 34 CR 2+1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 34 CR 2+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 34 CR 3 TR 4 CH 4 NT D- 1/2 voff=0 +BA 34 CR 3 TR 4 CH 4 NT F- 1/2 voff=0 +BA 34 CR 3 TR 4 CH 4 NT A- 1/2 voff=0 +BA 34 CR 3+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 34 CR 3+1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 34 CR 3+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 35 CR 0 TR 4 CH 4 NT E- 1/2 voff=0 +BA 35 CR 0 TR 4 CH 4 NT G- 1/2 voff=0 +BA 35 CR 0 TR 4 CH 4 NT B- 1/2 voff=0 +BA 35 CR 1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 35 CR 1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 35 CR 1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 35 CR 1 TR 4 CH 4 NT E- 1/2 voff=0 +BA 35 CR 1 TR 4 CH 4 NT G- 1/2 voff=0 +BA 35 CR 1 TR 4 CH 4 NT B- 1/2 voff=0 +BA 35 CR 1+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 35 CR 1+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 35 CR 1+1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 35 CR 2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 35 CR 2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 35 CR 2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 35 CR 2+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 35 CR 2+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 35 CR 2+1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 35 
CR 3 TR 4 CH 4 NT E- 1/2 voff=0 +BA 35 CR 3 TR 4 CH 4 NT G- 1/2 voff=0 +BA 35 CR 3 TR 4 CH 4 NT B- 1/2 voff=0 +BA 35 CR 3+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 35 CR 3+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 35 CR 3+1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 36 CR 0 TR 4 CH 4 NT A- 1/2 voff=0 +BA 36 CR 0 TR 4 CH 4 NT C- 1/2 voff=0 +BA 36 CR 0 TR 4 CH 4 NT E- 1/2 voff=0 +BA 36 CR 1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 36 CR 1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 36 CR 1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 36 CR 1 TR 4 CH 4 NT B- 1/2 voff=0 +BA 36 CR 1 TR 4 CH 4 NT D- 1/2 voff=0 +BA 36 CR 1 TR 4 CH 4 NT F- 1/2 voff=0 +BA 36 CR 1+1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 36 CR 1+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 36 CR 1+1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 36 CR 2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 36 CR 2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 36 CR 2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 36 CR 2+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 36 CR 2+1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 36 CR 2+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 36 CR 3 TR 4 CH 4 NT D- 1/2 voff=0 +BA 36 CR 3 TR 4 CH 4 NT F- 1/2 voff=0 +BA 36 CR 3 TR 4 CH 4 NT A- 1/2 voff=0 +BA 36 CR 3+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 36 CR 3+1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 36 CR 3+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 1 CR 0 TR 5 CH 5 Instrument 72 +BA 1 CR 0 TR 5 CH 5 NT R 0 von=127 voff=0 +BA 1 CR 0 TR 5 CH 5 NT B- 1/2 voff=0 +BA 1 CR 0 TR 5 CH 5 NT D- 1/2 voff=0 +BA 1 CR 0 TR 5 CH 5 NT F- 1/2 voff=0 +BA 1 CR 1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 1 CR 1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 1 CR 1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 1 CR 1 TR 5 CH 5 NT D- 1/2 voff=0 +BA 1 CR 1 TR 5 CH 5 NT F- 1/2 voff=0 +BA 1 CR 1 TR 5 CH 5 NT A- 1/2 voff=0 +BA 1 CR 1+1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 1 CR 1+1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 1 CR 1+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 1 CR 2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 1 CR 2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 1 CR 2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 1 CR 2+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 1 CR 2+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 1 CR 2+1/2 TR 5 CH 5 NT G- 
1/2 voff=0 +BA 1 CR 3 TR 5 CH 5 NT C- 1/2 voff=0 +BA 1 CR 3 TR 5 CH 5 NT E- 1/2 voff=0 +BA 1 CR 3 TR 5 CH 5 NT G- 1/2 voff=0 +BA 1 CR 3+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 1 CR 3+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 1 CR 3+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 2 CR 0 TR 5 CH 5 NT F- 1/2 voff=0 +BA 2 CR 0 TR 5 CH 5 NT A- 1/2 voff=0 +BA 2 CR 0 TR 5 CH 5 NT C- 1/2 voff=0 +BA 2 CR 1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 2 CR 1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 2 CR 1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 2 CR 1 TR 5 CH 5 NT A#- 1/2 voff=0 +BA 2 CR 1 TR 5 CH 5 NT D- 1/2 voff=0 +BA 2 CR 1 TR 5 CH 5 NT F- 1/2 voff=0 +BA 2 CR 1+1/2 TR 5 CH 5 NT A#- 1/2 voff=0 +BA 2 CR 1+1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 2 CR 1+1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 2 CR 2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 2 CR 2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 2 CR 2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 2 CR 2+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 2 CR 2+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 2 CR 2+1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 2 CR 3 TR 5 CH 5 NT E- 1/2 voff=0 +BA 2 CR 3 TR 5 CH 5 NT G- 1/2 voff=0 +BA 2 CR 3 TR 5 CH 5 NT B- 1/2 voff=0 +BA 2 CR 3+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 2 CR 3+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 2 CR 3+1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 3 CR 0 TR 5 CH 5 NT B- 1/2 voff=0 +BA 3 CR 0 TR 5 CH 5 NT D- 1/2 voff=0 +BA 3 CR 0 TR 5 CH 5 NT F- 1/2 voff=0 +BA 3 CR 1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 3 CR 1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 3 CR 1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 3 CR 1 TR 5 CH 5 NT D- 1/2 voff=0 +BA 3 CR 1 TR 5 CH 5 NT F- 1/2 voff=0 +BA 3 CR 1 TR 5 CH 5 NT A- 1/2 voff=0 +BA 3 CR 1+1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 3 CR 1+1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 3 CR 1+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 3 CR 2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 3 CR 2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 3 CR 2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 3 CR 2+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 3 CR 2+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 3 CR 2+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 3 CR 3 TR 5 CH 5 NT C- 1/2 voff=0 +BA 3 CR 3 TR 5 CH 5 NT E- 1/2 voff=0 
+BA 3 CR 3 TR 5 CH 5 NT G- 1/2 voff=0 +BA 3 CR 3+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 3 CR 3+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 3 CR 3+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 4 CR 0 TR 5 CH 5 NT F- 1/2 voff=0 +BA 4 CR 0 TR 5 CH 5 NT A- 1/2 voff=0 +BA 4 CR 0 TR 5 CH 5 NT C- 1/2 voff=0 +BA 4 CR 1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 4 CR 1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 4 CR 1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 4 CR 1 TR 5 CH 5 NT A#- 1/2 voff=0 +BA 4 CR 1 TR 5 CH 5 NT D- 1/2 voff=0 +BA 4 CR 1 TR 5 CH 5 NT F- 1/2 voff=0 +BA 4 CR 1+1/2 TR 5 CH 5 NT A#- 1/2 voff=0 +BA 4 CR 1+1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 4 CR 1+1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 4 CR 2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 4 CR 2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 4 CR 2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 4 CR 2+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 4 CR 2+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 4 CR 2+1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 4 CR 3 TR 5 CH 5 NT E- 1/2 voff=0 +BA 4 CR 3 TR 5 CH 5 NT G- 1/2 voff=0 +BA 4 CR 3 TR 5 CH 5 NT B- 1/2 voff=0 +BA 4 CR 3+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 4 CR 3+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 4 CR 3+1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 5 CR 0 TR 5 CH 5 NT E- 1/2 voff=0 +BA 5 CR 0 TR 5 CH 5 NT G- 1/2 voff=0 +BA 5 CR 0 TR 5 CH 5 NT B- 1/2 voff=0 +BA 5 CR 1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 5 CR 1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 5 CR 1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 5 CR 1 TR 5 CH 5 NT F#- 1/2 voff=0 +BA 5 CR 1 TR 5 CH 5 NT A- 1/2 voff=0 +BA 5 CR 1 TR 5 CH 5 NT C- 1/2 voff=0 +BA 5 CR 1+1/2 TR 5 CH 5 NT F#- 1/2 voff=0 +BA 5 CR 1+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 5 CR 1+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 5 CR 2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 5 CR 2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 5 CR 2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 5 CR 2+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 5 CR 2+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 5 CR 2+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 5 CR 3 TR 5 CH 5 NT C- 1/2 voff=0 +BA 5 CR 3 TR 5 CH 5 NT E- 1/2 voff=0 +BA 5 CR 3 TR 5 CH 5 NT G- 1/2 voff=0 +BA 5 CR 3+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 5 
CR 3+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 5 CR 3+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 6 CR 0 TR 5 CH 5 NT F- 1/2 voff=0 +BA 6 CR 0 TR 5 CH 5 NT A- 1/2 voff=0 +BA 6 CR 0 TR 5 CH 5 NT C- 1/2 voff=0 +BA 6 CR 1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 6 CR 1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 6 CR 1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 6 CR 1 TR 5 CH 5 NT C- 1/2 voff=0 +BA 6 CR 1 TR 5 CH 5 NT E- 1/2 voff=0 +BA 6 CR 1 TR 5 CH 5 NT G- 1/2 voff=0 +BA 6 CR 1+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 6 CR 1+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 6 CR 1+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 6 CR 2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 6 CR 2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 6 CR 2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 6 CR 2+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 6 CR 2+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 6 CR 2+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 6 CR 3 TR 5 CH 5 NT C- 1/2 voff=0 +BA 6 CR 3 TR 5 CH 5 NT E- 1/2 voff=0 +BA 6 CR 3 TR 5 CH 5 NT G- 1/2 voff=0 +BA 6 CR 3+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 6 CR 3+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 6 CR 3+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 7 CR 0 TR 5 CH 5 NT E- 1/2 voff=0 +BA 7 CR 0 TR 5 CH 5 NT G- 1/2 voff=0 +BA 7 CR 0 TR 5 CH 5 NT B- 1/2 voff=0 +BA 7 CR 1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 7 CR 1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 7 CR 1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 7 CR 1 TR 5 CH 5 NT F#- 1/2 voff=0 +BA 7 CR 1 TR 5 CH 5 NT A- 1/2 voff=0 +BA 7 CR 1 TR 5 CH 5 NT C- 1/2 voff=0 +BA 7 CR 1+1/2 TR 5 CH 5 NT F#- 1/2 voff=0 +BA 7 CR 1+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 7 CR 1+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 7 CR 2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 7 CR 2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 7 CR 2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 7 CR 2+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 7 CR 2+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 7 CR 2+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 7 CR 3 TR 5 CH 5 NT C- 1/2 voff=0 +BA 7 CR 3 TR 5 CH 5 NT E- 1/2 voff=0 +BA 7 CR 3 TR 5 CH 5 NT G- 1/2 voff=0 +BA 7 CR 3+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 7 CR 3+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 7 CR 3+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 8 CR 0 
TR 5 CH 5 NT F- 1/2 voff=0 +BA 8 CR 0 TR 5 CH 5 NT A- 1/2 voff=0 +BA 8 CR 0 TR 5 CH 5 NT C- 1/2 voff=0 +BA 8 CR 1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 8 CR 1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 8 CR 1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 8 CR 1 TR 5 CH 5 NT C- 1/2 voff=0 +BA 8 CR 1 TR 5 CH 5 NT E- 1/2 voff=0 +BA 8 CR 1 TR 5 CH 5 NT G- 1/2 voff=0 +BA 8 CR 1+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 8 CR 1+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 8 CR 1+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 8 CR 2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 8 CR 2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 8 CR 2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 8 CR 2+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 8 CR 2+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 8 CR 2+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 8 CR 3 TR 5 CH 5 NT C- 1/2 voff=0 +BA 8 CR 3 TR 5 CH 5 NT E- 1/2 voff=0 +BA 8 CR 3 TR 5 CH 5 NT G- 1/2 voff=0 +BA 8 CR 3+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 8 CR 3+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 8 CR 3+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 9 CR 0 TR 5 CH 5 NT C- 1/2 voff=0 +BA 9 CR 0 TR 5 CH 5 NT E- 1/2 voff=0 +BA 9 CR 0 TR 5 CH 5 NT G- 1/2 voff=0 +BA 9 CR 1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 9 CR 1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 9 CR 1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 9 CR 1 TR 5 CH 5 NT D#- 1/2 voff=0 +BA 9 CR 1 TR 5 CH 5 NT G- 1/2 voff=0 +BA 9 CR 1 TR 5 CH 5 NT A- 1/2 voff=0 +BA 9 CR 1+1/2 TR 5 CH 5 NT D#- 1/2 voff=0 +BA 9 CR 1+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 9 CR 1+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 9 CR 2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 9 CR 2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 9 CR 2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 9 CR 2+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 9 CR 2+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 9 CR 2+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 9 CR 3 TR 5 CH 5 NT A- 1/2 voff=0 +BA 9 CR 3 TR 5 CH 5 NT C- 1/2 voff=0 +BA 9 CR 3 TR 5 CH 5 NT E- 1/2 voff=0 +BA 9 CR 3+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 9 CR 3+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 9 CR 3+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 10 CR 0 TR 5 CH 5 NT D- 1/2 voff=0 +BA 10 CR 0 TR 5 CH 5 NT F- 1/2 voff=0 +BA 10 CR 0 TR 5 CH 5 
NT A- 1/2 voff=0 +BA 10 CR 1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 10 CR 1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 10 CR 1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 10 CR 1 TR 5 CH 5 NT C#- 1/2 voff=0 +BA 10 CR 1 TR 5 CH 5 NT F- 1/2 voff=0 +BA 10 CR 1 TR 5 CH 5 NT G- 1/2 voff=0 +BA 10 CR 1+1/2 TR 5 CH 5 NT C#- 1/2 voff=0 +BA 10 CR 1+1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 10 CR 1+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 10 CR 2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 10 CR 2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 10 CR 2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 10 CR 2+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 10 CR 2+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 10 CR 2+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 10 CR 3 TR 5 CH 5 NT C- 1/2 voff=0 +BA 10 CR 3 TR 5 CH 5 NT E- 1/2 voff=0 +BA 10 CR 3 TR 5 CH 5 NT G- 1/2 voff=0 +BA 10 CR 3+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 10 CR 3+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 10 CR 3+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 11 CR 0 TR 5 CH 5 NT C- 1/2 voff=0 +BA 11 CR 0 TR 5 CH 5 NT E- 1/2 voff=0 +BA 11 CR 0 TR 5 CH 5 NT G- 1/2 voff=0 +BA 11 CR 1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 11 CR 1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 11 CR 1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 11 CR 1 TR 5 CH 5 NT D#- 1/2 voff=0 +BA 11 CR 1 TR 5 CH 5 NT G- 1/2 voff=0 +BA 11 CR 1 TR 5 CH 5 NT A- 1/2 voff=0 +BA 11 CR 1+1/2 TR 5 CH 5 NT D#- 1/2 voff=0 +BA 11 CR 1+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 11 CR 1+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 11 CR 2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 11 CR 2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 11 CR 2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 11 CR 2+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 11 CR 2+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 11 CR 2+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 11 CR 3 TR 5 CH 5 NT A- 1/2 voff=0 +BA 11 CR 3 TR 5 CH 5 NT C- 1/2 voff=0 +BA 11 CR 3 TR 5 CH 5 NT E- 1/2 voff=0 +BA 11 CR 3+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 11 CR 3+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 11 CR 3+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 12 CR 0 TR 5 CH 5 NT D- 1/2 voff=0 +BA 12 CR 0 TR 5 CH 5 NT F- 1/2 voff=0 +BA 12 CR 0 TR 5 CH 5 NT A- 1/2 voff=0 +BA 12 CR 1/2 TR 5 CH 
5 NT D- 1/2 voff=0 +BA 12 CR 1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 12 CR 1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 12 CR 1 TR 5 CH 5 NT C#- 1/2 voff=0 +BA 12 CR 1 TR 5 CH 5 NT F- 1/2 voff=0 +BA 12 CR 1 TR 5 CH 5 NT G- 1/2 voff=0 +BA 12 CR 1+1/2 TR 5 CH 5 NT C#- 1/2 voff=0 +BA 12 CR 1+1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 12 CR 1+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 12 CR 2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 12 CR 2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 12 CR 2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 12 CR 2+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 12 CR 2+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 12 CR 2+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 12 CR 3 TR 5 CH 5 NT C- 1/2 voff=0 +BA 12 CR 3 TR 5 CH 5 NT E- 1/2 voff=0 +BA 12 CR 3 TR 5 CH 5 NT G- 1/2 voff=0 +BA 12 CR 3+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 12 CR 3+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 12 CR 3+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 13 CR 0 TR 5 CH 5 NT E- 1/2 voff=0 +BA 13 CR 0 TR 5 CH 5 NT G- 1/2 voff=0 +BA 13 CR 0 TR 5 CH 5 NT B- 1/2 voff=0 +BA 13 CR 1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 13 CR 1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 13 CR 1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 13 CR 1 TR 5 CH 5 NT F#- 1/2 voff=0 +BA 13 CR 1 TR 5 CH 5 NT A- 1/2 voff=0 +BA 13 CR 1 TR 5 CH 5 NT C- 1/2 voff=0 +BA 13 CR 1+1/2 TR 5 CH 5 NT F#- 1/2 voff=0 +BA 13 CR 1+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 13 CR 1+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 13 CR 2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 13 CR 2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 13 CR 2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 13 CR 2+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 13 CR 2+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 13 CR 2+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 13 CR 3 TR 5 CH 5 NT C- 1/2 voff=0 +BA 13 CR 3 TR 5 CH 5 NT E- 1/2 voff=0 +BA 13 CR 3 TR 5 CH 5 NT G- 1/2 voff=0 +BA 13 CR 3+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 13 CR 3+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 13 CR 3+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 14 CR 0 TR 5 CH 5 NT F- 1/2 voff=0 +BA 14 CR 0 TR 5 CH 5 NT A- 1/2 voff=0 +BA 14 CR 0 TR 5 CH 5 NT C- 1/2 voff=0 +BA 14 CR 1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 14 CR 1/2 TR 5 
CH 5 NT A- 1/2 voff=0 +BA 14 CR 1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 14 CR 1 TR 5 CH 5 NT C- 1/2 voff=0 +BA 14 CR 1 TR 5 CH 5 NT E- 1/2 voff=0 +BA 14 CR 1 TR 5 CH 5 NT G- 1/2 voff=0 +BA 14 CR 1+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 14 CR 1+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 14 CR 1+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 14 CR 2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 14 CR 2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 14 CR 2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 14 CR 2+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 14 CR 2+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 14 CR 2+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 14 CR 3 TR 5 CH 5 NT C- 1/2 voff=0 +BA 14 CR 3 TR 5 CH 5 NT E- 1/2 voff=0 +BA 14 CR 3 TR 5 CH 5 NT G- 1/2 voff=0 +BA 14 CR 3+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 14 CR 3+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 14 CR 3+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 15 CR 0 TR 5 CH 5 NT E- 1/2 voff=0 +BA 15 CR 0 TR 5 CH 5 NT G- 1/2 voff=0 +BA 15 CR 0 TR 5 CH 5 NT B- 1/2 voff=0 +BA 15 CR 1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 15 CR 1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 15 CR 1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 15 CR 1 TR 5 CH 5 NT F#- 1/2 voff=0 +BA 15 CR 1 TR 5 CH 5 NT A- 1/2 voff=0 +BA 15 CR 1 TR 5 CH 5 NT C- 1/2 voff=0 +BA 15 CR 1+1/2 TR 5 CH 5 NT F#- 1/2 voff=0 +BA 15 CR 1+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 15 CR 1+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 15 CR 2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 15 CR 2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 15 CR 2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 15 CR 2+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 15 CR 2+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 15 CR 2+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 15 CR 3 TR 5 CH 5 NT C- 1/2 voff=0 +BA 15 CR 3 TR 5 CH 5 NT E- 1/2 voff=0 +BA 15 CR 3 TR 5 CH 5 NT G- 1/2 voff=0 +BA 15 CR 3+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 15 CR 3+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 15 CR 3+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 16 CR 0 TR 5 CH 5 NT F- 1/2 voff=0 +BA 16 CR 0 TR 5 CH 5 NT A- 1/2 voff=0 +BA 16 CR 0 TR 5 CH 5 NT C- 1/2 voff=0 +BA 16 CR 1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 16 CR 1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 16 CR 1/2 TR 5 
CH 5 NT C- 1/2 voff=0 +BA 16 CR 1 TR 5 CH 5 NT C- 1/2 voff=0 +BA 16 CR 1 TR 5 CH 5 NT E- 1/2 voff=0 +BA 16 CR 1 TR 5 CH 5 NT G- 1/2 voff=0 +BA 16 CR 1+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 16 CR 1+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 16 CR 1+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 16 CR 2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 16 CR 2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 16 CR 2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 16 CR 2+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 16 CR 2+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 16 CR 2+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 16 CR 3 TR 5 CH 5 NT C- 1/2 voff=0 +BA 16 CR 3 TR 5 CH 5 NT E- 1/2 voff=0 +BA 16 CR 3 TR 5 CH 5 NT G- 1/2 voff=0 +BA 16 CR 3+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 16 CR 3+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 16 CR 3+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 17 CR 0 TR 5 CH 5 NT C- 1/2 voff=0 +BA 17 CR 0 TR 5 CH 5 NT E- 1/2 voff=0 +BA 17 CR 0 TR 5 CH 5 NT G- 1/2 voff=0 +BA 17 CR 1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 17 CR 1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 17 CR 1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 17 CR 1 TR 5 CH 5 NT F#- 1/2 voff=0 +BA 17 CR 1 TR 5 CH 5 NT A- 1/2 voff=0 +BA 17 CR 1 TR 5 CH 5 NT C- 1/2 voff=0 +BA 17 CR 1+1/2 TR 5 CH 5 NT F#- 1/2 voff=0 +BA 17 CR 1+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 17 CR 1+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 17 CR 2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 17 CR 2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 17 CR 2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 17 CR 2+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 17 CR 2+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 17 CR 2+1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 17 CR 3 TR 5 CH 5 NT E- 1/2 voff=0 +BA 17 CR 3 TR 5 CH 5 NT G- 1/2 voff=0 +BA 17 CR 3 TR 5 CH 5 NT B- 1/2 voff=0 +BA 17 CR 3+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 17 CR 3+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 17 CR 3+1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 18 CR 0 TR 5 CH 5 NT A- 1/2 voff=0 +BA 18 CR 0 TR 5 CH 5 NT C- 1/2 voff=0 +BA 18 CR 0 TR 5 CH 5 NT E- 1/2 voff=0 +BA 18 CR 1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 18 CR 1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 18 CR 1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 18 CR 1 TR 5 CH 
5 NT G#- 1/2 voff=0 +BA 18 CR 1 TR 5 CH 5 NT C- 1/2 voff=0 +BA 18 CR 1 TR 5 CH 5 NT D- 1/2 voff=0 +BA 18 CR 1+1/2 TR 5 CH 5 NT G#- 1/2 voff=0 +BA 18 CR 1+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 18 CR 1+1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 18 CR 2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 18 CR 2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 18 CR 2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 18 CR 2+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 18 CR 2+1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 18 CR 2+1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 18 CR 3 TR 5 CH 5 NT G- 1/2 voff=0 +BA 18 CR 3 TR 5 CH 5 NT B- 1/2 voff=0 +BA 18 CR 3 TR 5 CH 5 NT D- 1/2 voff=0 +BA 18 CR 3+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 18 CR 3+1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 18 CR 3+1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 19 CR 0 TR 5 CH 5 NT C- 1/2 voff=0 +BA 19 CR 0 TR 5 CH 5 NT E- 1/2 voff=0 +BA 19 CR 0 TR 5 CH 5 NT G- 1/2 voff=0 +BA 19 CR 1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 19 CR 1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 19 CR 1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 19 CR 1 TR 5 CH 5 NT F#- 1/2 voff=0 +BA 19 CR 1 TR 5 CH 5 NT A- 1/2 voff=0 +BA 19 CR 1 TR 5 CH 5 NT C- 1/2 voff=0 +BA 19 CR 1+1/2 TR 5 CH 5 NT F#- 1/2 voff=0 +BA 19 CR 1+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 19 CR 1+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 19 CR 2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 19 CR 2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 19 CR 2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 19 CR 2+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 19 CR 2+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 19 CR 2+1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 19 CR 3 TR 5 CH 5 NT E- 1/2 voff=0 +BA 19 CR 3 TR 5 CH 5 NT G- 1/2 voff=0 +BA 19 CR 3 TR 5 CH 5 NT B- 1/2 voff=0 +BA 19 CR 3+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 19 CR 3+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 19 CR 3+1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 20 CR 0 TR 5 CH 5 NT A- 1/2 voff=0 +BA 20 CR 0 TR 5 CH 5 NT C- 1/2 voff=0 +BA 20 CR 0 TR 5 CH 5 NT E- 1/2 voff=0 +BA 20 CR 1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 20 CR 1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 20 CR 1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 20 CR 1 TR 5 CH 5 NT G#- 1/2 voff=0 +BA 20 CR 1 TR 5 CH 
5 NT C- 1/2 voff=0 +BA 20 CR 1 TR 5 CH 5 NT D- 1/2 voff=0 +BA 20 CR 1+1/2 TR 5 CH 5 NT G#- 1/2 voff=0 +BA 20 CR 1+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 20 CR 1+1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 20 CR 2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 20 CR 2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 20 CR 2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 20 CR 2+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 20 CR 2+1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 20 CR 2+1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 20 CR 3 TR 5 CH 5 NT G- 1/2 voff=0 +BA 20 CR 3 TR 5 CH 5 NT B- 1/2 voff=0 +BA 20 CR 3 TR 5 CH 5 NT D- 1/2 voff=0 +BA 20 CR 3+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 20 CR 3+1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 20 CR 3+1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 25 CR 0 TR 5 CH 5 NT D- 1/2 voff=0 +BA 25 CR 0 TR 5 CH 5 NT F- 1/2 voff=0 +BA 25 CR 0 TR 5 CH 5 NT A- 1/2 voff=0 +BA 25 CR 1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 25 CR 1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 25 CR 1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 25 CR 1 TR 5 CH 5 NT F- 1/2 voff=0 +BA 25 CR 1 TR 5 CH 5 NT A- 1/2 voff=0 +BA 25 CR 1 TR 5 CH 5 NT C- 1/2 voff=0 +BA 25 CR 1+1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 25 CR 1+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 25 CR 1+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 25 CR 2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 25 CR 2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 25 CR 2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 25 CR 2+1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 25 CR 2+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 25 CR 2+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 25 CR 3 TR 5 CH 5 NT F- 1/2 voff=0 +BA 25 CR 3 TR 5 CH 5 NT A- 1/2 voff=0 +BA 25 CR 3 TR 5 CH 5 NT C- 1/2 voff=0 +BA 25 CR 3+1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 25 CR 3+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 25 CR 3+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 26 CR 0 TR 5 CH 5 NT A#- 1/2 voff=0 +BA 26 CR 0 TR 5 CH 5 NT D- 1/2 voff=0 +BA 26 CR 0 TR 5 CH 5 NT F- 1/2 voff=0 +BA 26 CR 1/2 TR 5 CH 5 NT A#- 1/2 voff=0 +BA 26 CR 1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 26 CR 1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 26 CR 1 TR 5 CH 5 NT C#- 1/2 voff=0 +BA 26 CR 1 TR 5 CH 5 NT F- 1/2 voff=0 +BA 26 CR 1 TR 5 CH 
5 NT G- 1/2 voff=0 +BA 26 CR 1+1/2 TR 5 CH 5 NT C#- 1/2 voff=0 +BA 26 CR 1+1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 26 CR 1+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 26 CR 2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 26 CR 2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 26 CR 2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 26 CR 2+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 26 CR 2+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 26 CR 2+1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 26 CR 3 TR 5 CH 5 NT E- 1/2 voff=0 +BA 26 CR 3 TR 5 CH 5 NT G- 1/2 voff=0 +BA 26 CR 3 TR 5 CH 5 NT B- 1/2 voff=0 +BA 26 CR 3+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 26 CR 3+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 26 CR 3+1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 27 CR 0 TR 5 CH 5 NT D- 1/2 voff=0 +BA 27 CR 0 TR 5 CH 5 NT F- 1/2 voff=0 +BA 27 CR 0 TR 5 CH 5 NT A- 1/2 voff=0 +BA 27 CR 1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 27 CR 1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 27 CR 1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 27 CR 1 TR 5 CH 5 NT F- 1/2 voff=0 +BA 27 CR 1 TR 5 CH 5 NT A- 1/2 voff=0 +BA 27 CR 1 TR 5 CH 5 NT C- 1/2 voff=0 +BA 27 CR 1+1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 27 CR 1+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 27 CR 1+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 27 CR 2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 27 CR 2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 27 CR 2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 27 CR 2+1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 27 CR 2+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 27 CR 2+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 27 CR 3 TR 5 CH 5 NT F- 1/2 voff=0 +BA 27 CR 3 TR 5 CH 5 NT A- 1/2 voff=0 +BA 27 CR 3 TR 5 CH 5 NT C- 1/2 voff=0 +BA 27 CR 3+1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 27 CR 3+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 27 CR 3+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 28 CR 0 TR 5 CH 5 NT A#- 1/2 voff=0 +BA 28 CR 0 TR 5 CH 5 NT D- 1/2 voff=0 +BA 28 CR 0 TR 5 CH 5 NT F- 1/2 voff=0 +BA 28 CR 1/2 TR 5 CH 5 NT A#- 1/2 voff=0 +BA 28 CR 1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 28 CR 1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 28 CR 1 TR 5 CH 5 NT C#- 1/2 voff=0 +BA 28 CR 1 TR 5 CH 5 NT F- 1/2 voff=0 +BA 28 CR 1 TR 5 CH 5 NT G- 1/2 voff=0 +BA 28 CR 1+1/2 TR 5 
CH 5 NT C#- 1/2 voff=0 +BA 28 CR 1+1/2 TR 5 CH 5 NT F- 1/2 voff=0 +BA 28 CR 1+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 28 CR 2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 28 CR 2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 28 CR 2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 28 CR 2+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 28 CR 2+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 28 CR 2+1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 28 CR 3 TR 5 CH 5 NT E- 1/2 voff=0 +BA 28 CR 3 TR 5 CH 5 NT G- 1/2 voff=0 +BA 28 CR 3 TR 5 CH 5 NT B- 1/2 voff=0 +BA 28 CR 3+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 28 CR 3+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 28 CR 3+1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 29 CR 0 TR 5 CH 5 NT C- 1/2 voff=0 +BA 29 CR 0 TR 5 CH 5 NT E- 1/2 voff=0 +BA 29 CR 0 TR 5 CH 5 NT G- 1/2 voff=0 +BA 29 CR 1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 29 CR 1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 29 CR 1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 29 CR 1 TR 5 CH 5 NT F#- 1/2 voff=0 +BA 29 CR 1 TR 5 CH 5 NT A- 1/2 voff=0 +BA 29 CR 1 TR 5 CH 5 NT C- 1/2 voff=0 +BA 29 CR 1+1/2 TR 5 CH 5 NT F#- 1/2 voff=0 +BA 29 CR 1+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 29 CR 1+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 29 CR 2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 29 CR 2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 29 CR 2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 29 CR 2+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 29 CR 2+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 29 CR 2+1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 29 CR 3 TR 5 CH 5 NT E- 1/2 voff=0 +BA 29 CR 3 TR 5 CH 5 NT G- 1/2 voff=0 +BA 29 CR 3 TR 5 CH 5 NT B- 1/2 voff=0 +BA 29 CR 3+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 29 CR 3+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 29 CR 3+1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 30 CR 0 TR 5 CH 5 NT A- 1/2 voff=0 +BA 30 CR 0 TR 5 CH 5 NT C- 1/2 voff=0 +BA 30 CR 0 TR 5 CH 5 NT E- 1/2 voff=0 +BA 30 CR 1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 30 CR 1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 30 CR 1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 30 CR 1 TR 5 CH 5 NT G#- 1/2 voff=0 +BA 30 CR 1 TR 5 CH 5 NT C- 1/2 voff=0 +BA 30 CR 1 TR 5 CH 5 NT D- 1/2 voff=0 +BA 30 CR 1+1/2 TR 5 CH 5 NT G#- 1/2 voff=0 +BA 30 CR 1+1/2 
TR 5 CH 5 NT C- 1/2 voff=0 +BA 30 CR 1+1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 30 CR 2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 30 CR 2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 30 CR 2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 30 CR 2+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 30 CR 2+1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 30 CR 2+1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 30 CR 3 TR 5 CH 5 NT G- 1/2 voff=0 +BA 30 CR 3 TR 5 CH 5 NT B- 1/2 voff=0 +BA 30 CR 3 TR 5 CH 5 NT D- 1/2 voff=0 +BA 30 CR 3+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 30 CR 3+1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 30 CR 3+1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 31 CR 0 TR 5 CH 5 NT C- 1/2 voff=0 +BA 31 CR 0 TR 5 CH 5 NT E- 1/2 voff=0 +BA 31 CR 0 TR 5 CH 5 NT G- 1/2 voff=0 +BA 31 CR 1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 31 CR 1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 31 CR 1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 31 CR 1 TR 5 CH 5 NT F#- 1/2 voff=0 +BA 31 CR 1 TR 5 CH 5 NT A- 1/2 voff=0 +BA 31 CR 1 TR 5 CH 5 NT C- 1/2 voff=0 +BA 31 CR 1+1/2 TR 5 CH 5 NT F#- 1/2 voff=0 +BA 31 CR 1+1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 31 CR 1+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 31 CR 2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 31 CR 2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 31 CR 2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 31 CR 2+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 31 CR 2+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 31 CR 2+1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 31 CR 3 TR 5 CH 5 NT E- 1/2 voff=0 +BA 31 CR 3 TR 5 CH 5 NT G- 1/2 voff=0 +BA 31 CR 3 TR 5 CH 5 NT B- 1/2 voff=0 +BA 31 CR 3+1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 31 CR 3+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 31 CR 3+1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 32 CR 0 TR 5 CH 5 NT A- 1/2 voff=0 +BA 32 CR 0 TR 5 CH 5 NT C- 1/2 voff=0 +BA 32 CR 0 TR 5 CH 5 NT E- 1/2 voff=0 +BA 32 CR 1/2 TR 5 CH 5 NT A- 1/2 voff=0 +BA 32 CR 1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 32 CR 1/2 TR 5 CH 5 NT E- 1/2 voff=0 +BA 32 CR 1 TR 5 CH 5 NT G#- 1/2 voff=0 +BA 32 CR 1 TR 5 CH 5 NT C- 1/2 voff=0 +BA 32 CR 1 TR 5 CH 5 NT D- 1/2 voff=0 +BA 32 CR 1+1/2 TR 5 CH 5 NT G#- 1/2 voff=0 +BA 32 CR 1+1/2 TR 5 CH 5 NT C- 1/2 voff=0 +BA 32 CR 
1+1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 32 CR 2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 32 CR 2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 32 CR 2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 32 CR 2+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 32 CR 2+1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 32 CR 2+1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 32 CR 3 TR 5 CH 5 NT G- 1/2 voff=0 +BA 32 CR 3 TR 5 CH 5 NT B- 1/2 voff=0 +BA 32 CR 3 TR 5 CH 5 NT D- 1/2 voff=0 +BA 32 CR 3+1/2 TR 5 CH 5 NT G- 1/2 voff=0 +BA 32 CR 3+1/2 TR 5 CH 5 NT B- 1/2 voff=0 +BA 32 CR 3+1/2 TR 5 CH 5 NT D- 1/2 voff=0 +BA 37 CR 0 TR 0 CH 16 End of track +format=1 tracks=5 division=384 + +BA 1 CR 0 TR 0 CH 16 Text type 2: "Produced by Mind Reading Music Composer by Lucian Academy" +BA 1 CR 0 TR 0 CH 16 Text type 3: "Dix ess writes on the test er." +BA 1 CR 0 TR 0 CH 1 Channel volume 127 +BA 1 CR 0 TR 0 CH 16 Tempo 63.00009 +BA 1 CR 0 TR 1 CH 1 Instrument 1 +BA 1 CR 0 TR 1 CH 1 NT R 0 von=127 voff=0 +BA 5 CR 0 TR 1 CH 1 NT C- 1/2 voff=0 +BA 5 CR 1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 5 CR 1 TR 1 CH 1 NT C- 1/2 voff=0 +BA 5 CR 1+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 5 CR 2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 5 CR 2+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 5 CR 3 TR 1 CH 1 NT C- 1/2 voff=0 +BA 6 CR 0 TR 1 CH 1 NT G- 1/2 voff=0 +BA 6 CR 1/2 TR 1 CH 1 NT G- 1/2 voff=0 +BA 6 CR 1 TR 1 CH 1 NT A- 1/2 voff=0 +BA 6 CR 1+1/2 TR 1 CH 1 NT A- 1/2 voff=0 +BA 6 CR 2 TR 1 CH 1 NT A- 1/2 voff=0 +BA 6 CR 2+1/2 TR 1 CH 1 NT A- 1/2 voff=0 +BA 6 CR 3 TR 1 CH 1 NT A- 1/2 voff=0 +BA 7 CR 0 TR 1 CH 1 NT C- 1/2 voff=0 +BA 7 CR 1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 7 CR 1 TR 1 CH 1 NT C- 1/2 voff=0 +BA 7 CR 1+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 7 CR 2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 7 CR 2+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 8 CR 0 TR 1 CH 1 NT G- 1/2 voff=0 +BA 8 CR 1/2 TR 1 CH 1 NT G- 1/2 voff=0 +BA 8 CR 1 TR 1 CH 1 NT A- 1/2 voff=0 +BA 8 CR 1+1/2 TR 1 CH 1 NT A- 1/2 voff=0 +BA 8 CR 2 TR 1 CH 1 NT A- 1/2 voff=0 +BA 8 CR 2+1/2 TR 1 CH 1 NT A- 1/2 voff=0 +BA 8 CR 3 TR 1 CH 1 NT A- 1/2 voff=0 +BA 13 CR 0 TR 1 CH 1 
NT C- 1/2 voff=0 +BA 13 CR 1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 13 CR 1 TR 1 CH 1 NT C- 1/2 voff=0 +BA 13 CR 1+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 13 CR 2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 13 CR 2+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 14 CR 0 TR 1 CH 1 NT G- 1/2 voff=0 +BA 14 CR 1/2 TR 1 CH 1 NT G- 1/2 voff=0 +BA 14 CR 1 TR 1 CH 1 NT A- 1/2 voff=0 +BA 14 CR 1+1/2 TR 1 CH 1 NT A- 1/2 voff=0 +BA 14 CR 2 TR 1 CH 1 NT A- 1/2 voff=0 +BA 15 CR 0 TR 1 CH 1 NT C- 1/2 voff=0 +BA 15 CR 1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 15 CR 1 TR 1 CH 1 NT C- 1/2 voff=0 +BA 15 CR 1+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 15 CR 2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 15 CR 2+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 15 CR 3 TR 1 CH 1 NT C- 1/2 voff=0 +BA 16 CR 0 TR 1 CH 1 NT G- 1/2 voff=0 +BA 16 CR 1/2 TR 1 CH 1 NT G- 1/2 voff=0 +BA 16 CR 1 TR 1 CH 1 NT A- 1/2 voff=0 +BA 16 CR 1+1/2 TR 1 CH 1 NT A- 1/2 voff=0 +BA 16 CR 2 TR 1 CH 1 NT A- 1/2 voff=0 +BA 16 CR 2+1/2 TR 1 CH 1 NT A- 1/2 voff=0 +BA 17 CR 0 TR 1 CH 1 NT C- 1/2 voff=0 +BA 17 CR 1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 17 CR 1 TR 1 CH 1 NT C- 1/2 voff=0 +BA 17 CR 1+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 17 CR 2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 17 CR 2+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 17 CR 3 TR 1 CH 1 NT C- 1/2 voff=0 +BA 18 CR 0 TR 1 CH 1 NT C- 1/2 voff=0 +BA 18 CR 1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 18 CR 1 TR 1 CH 1 NT C- 1/2 voff=0 +BA 18 CR 1+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 18 CR 2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 19 CR 0 TR 1 CH 1 NT C- 1/2 voff=0 +BA 19 CR 1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 19 CR 1 TR 1 CH 1 NT C- 1/2 voff=0 +BA 19 CR 1+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 19 CR 2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 19 CR 2+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 20 CR 0 TR 1 CH 1 NT C- 1/2 voff=0 +BA 20 CR 1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 20 CR 1 TR 1 CH 1 NT C- 1/2 voff=0 +BA 20 CR 1+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 20 CR 2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 20 CR 2+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 20 CR 3 TR 1 CH 1 NT C- 1/2 voff=0 +BA 25 CR 0 TR 1 CH 1 NT G- 1/2 voff=0 
+BA 25 CR 1/2 TR 1 CH 1 NT G- 1/2 voff=0 +BA 25 CR 1 TR 1 CH 1 NT A- 1/2 voff=0 +BA 25 CR 1+1/2 TR 1 CH 1 NT A- 1/2 voff=0 +BA 25 CR 2 TR 1 CH 1 NT B- 1/2 voff=0 +BA 25 CR 2+1/2 TR 1 CH 1 NT B- 1/2 voff=0 +BA 26 CR 0 TR 1 CH 1 NT D- 1/2 voff=0 +BA 26 CR 1/2 TR 1 CH 1 NT D- 1/2 voff=0 +BA 26 CR 1 TR 1 CH 1 NT G- 1/2 voff=0 +BA 26 CR 1+1/2 TR 1 CH 1 NT G- 1/2 voff=0 +BA 26 CR 2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 26 CR 2+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 26 CR 3 TR 1 CH 1 NT C- 1/2 voff=0 +BA 27 CR 0 TR 1 CH 1 NT G- 1/2 voff=0 +BA 27 CR 1/2 TR 1 CH 1 NT G- 1/2 voff=0 +BA 27 CR 1 TR 1 CH 1 NT A- 1/2 voff=0 +BA 27 CR 1+1/2 TR 1 CH 1 NT A- 1/2 voff=0 +BA 27 CR 2 TR 1 CH 1 NT B- 1/2 voff=0 +BA 27 CR 2+1/2 TR 1 CH 1 NT B- 1/2 voff=0 +BA 28 CR 0 TR 1 CH 1 NT D- 1/2 voff=0 +BA 28 CR 1/2 TR 1 CH 1 NT D- 1/2 voff=0 +BA 28 CR 1 TR 1 CH 1 NT G- 1/2 voff=0 +BA 28 CR 1+1/2 TR 1 CH 1 NT G- 1/2 voff=0 +BA 28 CR 2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 29 CR 0 TR 1 CH 1 NT C- 1/2 voff=0 +BA 29 CR 1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 29 CR 1 TR 1 CH 1 NT C- 1/2 voff=0 +BA 29 CR 1+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 29 CR 2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 29 CR 2+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 29 CR 3 TR 1 CH 1 NT C- 1/2 voff=0 +BA 30 CR 0 TR 1 CH 1 NT C- 1/2 voff=0 +BA 30 CR 1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 30 CR 1 TR 1 CH 1 NT C- 1/2 voff=0 +BA 30 CR 1+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 30 CR 2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 31 CR 0 TR 1 CH 1 NT C- 1/2 voff=0 +BA 31 CR 1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 31 CR 1 TR 1 CH 1 NT C- 1/2 voff=0 +BA 31 CR 1+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 31 CR 2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 31 CR 2+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 32 CR 0 TR 1 CH 1 NT C- 1/2 voff=0 +BA 32 CR 1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 32 CR 1 TR 1 CH 1 NT C- 1/2 voff=0 +BA 32 CR 1+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 32 CR 2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 32 CR 2+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 32 CR 3 TR 1 CH 1 NT C- 1/2 voff=0 +BA 1 CR 0 TR 2 CH 2 Instrument 16 +BA 1 CR 0 TR 2 CH 2 
NT R 0 von=127 voff=0 +BA 5 CR 0 TR 2 CH 2 NT C- 1/2 voff=0 +BA 5 CR 1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 5 CR 1 TR 2 CH 2 NT C- 1/2 voff=0 +BA 5 CR 1+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 5 CR 2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 5 CR 2+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 5 CR 3 TR 2 CH 2 NT C- 1/2 voff=0 +BA 5 CR 3+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 6 CR 0 TR 2 CH 2 NT G- 1/2 voff=0 +BA 6 CR 1/2 TR 2 CH 2 NT G- 1/2 voff=0 +BA 6 CR 1 TR 2 CH 2 NT A- 1/2 voff=0 +BA 6 CR 1+1/2 TR 2 CH 2 NT A- 1/2 voff=0 +BA 6 CR 2 TR 2 CH 2 NT A- 1/2 voff=0 +BA 6 CR 2+1/2 TR 2 CH 2 NT A- 1/2 voff=0 +BA 6 CR 3 TR 2 CH 2 NT A- 1/2 voff=0 +BA 6 CR 3+1/2 TR 2 CH 2 NT A- 1/2 voff=0 +BA 7 CR 0 TR 2 CH 2 NT C- 1/2 voff=0 +BA 7 CR 1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 7 CR 1 TR 2 CH 2 NT C- 1/2 voff=0 +BA 7 CR 1+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 7 CR 2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 7 CR 2+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 7 CR 3 TR 2 CH 2 NT C- 1/2 voff=0 +BA 7 CR 3+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 8 CR 0 TR 2 CH 2 NT G- 1/2 voff=0 +BA 8 CR 1/2 TR 2 CH 2 NT G- 1/2 voff=0 +BA 8 CR 1 TR 2 CH 2 NT A- 1/2 voff=0 +BA 8 CR 1+1/2 TR 2 CH 2 NT A- 1/2 voff=0 +BA 8 CR 2 TR 2 CH 2 NT A- 1/2 voff=0 +BA 8 CR 2+1/2 TR 2 CH 2 NT A- 1/2 voff=0 +BA 8 CR 3 TR 2 CH 2 NT A- 1/2 voff=0 +BA 8 CR 3+1/2 TR 2 CH 2 NT A- 1/2 voff=0 +BA 9 CR 0 TR 2 CH 2 NT C- 1/2 voff=0 +BA 9 CR 1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 9 CR 1 TR 2 CH 2 NT C- 1/2 voff=0 +BA 9 CR 1+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 9 CR 2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 9 CR 2+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 9 CR 3 TR 2 CH 2 NT C- 1/2 voff=0 +BA 9 CR 3+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 10 CR 0 TR 2 CH 2 NT C- 1/2 voff=0 +BA 10 CR 1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 10 CR 1 TR 2 CH 2 NT C- 1/2 voff=0 +BA 10 CR 1+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 10 CR 2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 10 CR 2+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 10 CR 3 TR 2 CH 2 NT C- 1/2 voff=0 +BA 10 CR 3+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 11 CR 0 TR 2 CH 2 NT C- 1/2 voff=0 +BA 11 CR 1/2 TR 2 CH 
2 NT C- 1/2 voff=0 +BA 11 CR 1 TR 2 CH 2 NT C- 1/2 voff=0 +BA 11 CR 1+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 11 CR 2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 11 CR 2+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 11 CR 3 TR 2 CH 2 NT C- 1/2 voff=0 +BA 11 CR 3+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 12 CR 0 TR 2 CH 2 NT C- 1/2 voff=0 +BA 12 CR 1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 12 CR 1 TR 2 CH 2 NT C- 1/2 voff=0 +BA 12 CR 1+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 12 CR 2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 12 CR 2+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 12 CR 3 TR 2 CH 2 NT C- 1/2 voff=0 +BA 12 CR 3+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 13 CR 0 TR 2 CH 2 NT C- 1/2 voff=0 +BA 13 CR 1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 13 CR 1 TR 2 CH 2 NT C- 1/2 voff=0 +BA 13 CR 1+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 13 CR 2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 13 CR 2+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 13 CR 3 TR 2 CH 2 NT C- 1/2 voff=0 +BA 13 CR 3+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 14 CR 0 TR 2 CH 2 NT G- 1/2 voff=0 +BA 14 CR 1/2 TR 2 CH 2 NT G- 1/2 voff=0 +BA 14 CR 1 TR 2 CH 2 NT A- 1/2 voff=0 +BA 14 CR 1+1/2 TR 2 CH 2 NT A- 1/2 voff=0 +BA 14 CR 2 TR 2 CH 2 NT A- 1/2 voff=0 +BA 14 CR 2+1/2 TR 2 CH 2 NT A- 1/2 voff=0 +BA 14 CR 3 TR 2 CH 2 NT A- 1/2 voff=0 +BA 14 CR 3+1/2 TR 2 CH 2 NT A- 1/2 voff=0 +BA 15 CR 0 TR 2 CH 2 NT C- 1/2 voff=0 +BA 15 CR 1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 15 CR 1 TR 2 CH 2 NT C- 1/2 voff=0 +BA 15 CR 1+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 15 CR 2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 15 CR 2+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 15 CR 3 TR 2 CH 2 NT C- 1/2 voff=0 +BA 15 CR 3+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 16 CR 0 TR 2 CH 2 NT G- 1/2 voff=0 +BA 16 CR 1/2 TR 2 CH 2 NT G- 1/2 voff=0 +BA 16 CR 1 TR 2 CH 2 NT A- 1/2 voff=0 +BA 16 CR 1+1/2 TR 2 CH 2 NT A- 1/2 voff=0 +BA 16 CR 2 TR 2 CH 2 NT A- 1/2 voff=0 +BA 16 CR 2+1/2 TR 2 CH 2 NT A- 1/2 voff=0 +BA 16 CR 3 TR 2 CH 2 NT A- 1/2 voff=0 +BA 16 CR 3+1/2 TR 2 CH 2 NT A- 1/2 voff=0 +BA 17 CR 0 TR 2 CH 2 NT C- 1/2 voff=0 +BA 17 CR 1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 17 CR 1 TR 2 CH 2 NT 
C- 1/2 voff=0 +BA 17 CR 1+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 17 CR 2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 17 CR 2+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 17 CR 3 TR 2 CH 2 NT C- 1/2 voff=0 +BA 17 CR 3+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 18 CR 0 TR 2 CH 2 NT C- 1/2 voff=0 +BA 18 CR 1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 18 CR 1 TR 2 CH 2 NT C- 1/2 voff=0 +BA 18 CR 1+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 18 CR 2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 18 CR 2+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 18 CR 3 TR 2 CH 2 NT C- 1/2 voff=0 +BA 18 CR 3+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 19 CR 0 TR 2 CH 2 NT C- 1/2 voff=0 +BA 19 CR 1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 19 CR 1 TR 2 CH 2 NT C- 1/2 voff=0 +BA 19 CR 1+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 19 CR 2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 19 CR 2+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 19 CR 3 TR 2 CH 2 NT C- 1/2 voff=0 +BA 19 CR 3+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 20 CR 0 TR 2 CH 2 NT C- 1/2 voff=0 +BA 20 CR 1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 20 CR 1 TR 2 CH 2 NT C- 1/2 voff=0 +BA 20 CR 1+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 20 CR 2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 20 CR 2+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 20 CR 3 TR 2 CH 2 NT C- 1/2 voff=0 +BA 20 CR 3+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 21 CR 0 TR 2 CH 2 NT C- 1/2 voff=0 +BA 21 CR 1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 21 CR 1 TR 2 CH 2 NT C- 1/2 voff=0 +BA 21 CR 1+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 21 CR 2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 21 CR 2+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 21 CR 3 TR 2 CH 2 NT C- 1/2 voff=0 +BA 21 CR 3+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 22 CR 0 TR 2 CH 2 NT C- 1/2 voff=0 +BA 22 CR 1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 22 CR 1 TR 2 CH 2 NT F- 1/2 voff=0 +BA 22 CR 1+1/2 TR 2 CH 2 NT F- 1/2 voff=0 +BA 22 CR 2 TR 2 CH 2 NT G- 1/2 voff=0 +BA 22 CR 2+1/2 TR 2 CH 2 NT G- 1/2 voff=0 +BA 22 CR 3 TR 2 CH 2 NT G- 1/2 voff=0 +BA 22 CR 3+1/2 TR 2 CH 2 NT G- 1/2 voff=0 +BA 23 CR 0 TR 2 CH 2 NT C- 1/2 voff=0 +BA 23 CR 1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 23 CR 1 TR 2 CH 2 NT C- 1/2 voff=0 +BA 23 CR 1+1/2 TR 2 CH 2 NT 
C- 1/2 voff=0 +BA 23 CR 2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 23 CR 2+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 23 CR 3 TR 2 CH 2 NT C- 1/2 voff=0 +BA 23 CR 3+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 24 CR 0 TR 2 CH 2 NT C- 1/2 voff=0 +BA 24 CR 1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 24 CR 1 TR 2 CH 2 NT F- 1/2 voff=0 +BA 24 CR 1+1/2 TR 2 CH 2 NT F- 1/2 voff=0 +BA 24 CR 2 TR 2 CH 2 NT G- 1/2 voff=0 +BA 24 CR 2+1/2 TR 2 CH 2 NT G- 1/2 voff=0 +BA 24 CR 3 TR 2 CH 2 NT G- 1/2 voff=0 +BA 24 CR 3+1/2 TR 2 CH 2 NT G- 1/2 voff=0 +BA 25 CR 0 TR 2 CH 2 NT G- 1/2 voff=0 +BA 25 CR 1/2 TR 2 CH 2 NT G- 1/2 voff=0 +BA 25 CR 1 TR 2 CH 2 NT A- 1/2 voff=0 +BA 25 CR 1+1/2 TR 2 CH 2 NT A- 1/2 voff=0 +BA 25 CR 2 TR 2 CH 2 NT B- 1/2 voff=0 +BA 25 CR 2+1/2 TR 2 CH 2 NT B- 1/2 voff=0 +BA 25 CR 3 TR 2 CH 2 NT B- 1/2 voff=0 +BA 25 CR 3+1/2 TR 2 CH 2 NT B- 1/2 voff=0 +BA 26 CR 0 TR 2 CH 2 NT D- 1/2 voff=0 +BA 26 CR 1/2 TR 2 CH 2 NT D- 1/2 voff=0 +BA 26 CR 1 TR 2 CH 2 NT G- 1/2 voff=0 +BA 26 CR 1+1/2 TR 2 CH 2 NT G- 1/2 voff=0 +BA 26 CR 2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 26 CR 2+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 26 CR 3 TR 2 CH 2 NT C- 1/2 voff=0 +BA 26 CR 3+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 27 CR 0 TR 2 CH 2 NT G- 1/2 voff=0 +BA 27 CR 1/2 TR 2 CH 2 NT G- 1/2 voff=0 +BA 27 CR 1 TR 2 CH 2 NT A- 1/2 voff=0 +BA 27 CR 1+1/2 TR 2 CH 2 NT A- 1/2 voff=0 +BA 27 CR 2 TR 2 CH 2 NT B- 1/2 voff=0 +BA 27 CR 2+1/2 TR 2 CH 2 NT B- 1/2 voff=0 +BA 27 CR 3 TR 2 CH 2 NT B- 1/2 voff=0 +BA 27 CR 3+1/2 TR 2 CH 2 NT B- 1/2 voff=0 +BA 28 CR 0 TR 2 CH 2 NT D- 1/2 voff=0 +BA 28 CR 1/2 TR 2 CH 2 NT D- 1/2 voff=0 +BA 28 CR 1 TR 2 CH 2 NT G- 1/2 voff=0 +BA 28 CR 1+1/2 TR 2 CH 2 NT G- 1/2 voff=0 +BA 28 CR 2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 28 CR 2+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 28 CR 3 TR 2 CH 2 NT C- 1/2 voff=0 +BA 28 CR 3+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 29 CR 0 TR 2 CH 2 NT C- 1/2 voff=0 +BA 29 CR 1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 29 CR 1 TR 2 CH 2 NT C- 1/2 voff=0 +BA 29 CR 1+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 29 CR 2 TR 2 CH 2 NT C- 
1/2 voff=0 +BA 29 CR 2+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 29 CR 3 TR 2 CH 2 NT C- 1/2 voff=0 +BA 29 CR 3+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 30 CR 0 TR 2 CH 2 NT C- 1/2 voff=0 +BA 30 CR 1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 30 CR 1 TR 2 CH 2 NT C- 1/2 voff=0 +BA 30 CR 1+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 30 CR 2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 30 CR 2+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 30 CR 3 TR 2 CH 2 NT C- 1/2 voff=0 +BA 30 CR 3+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 31 CR 0 TR 2 CH 2 NT C- 1/2 voff=0 +BA 31 CR 1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 31 CR 1 TR 2 CH 2 NT C- 1/2 voff=0 +BA 31 CR 1+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 31 CR 2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 31 CR 2+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 31 CR 3 TR 2 CH 2 NT C- 1/2 voff=0 +BA 31 CR 3+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 32 CR 0 TR 2 CH 2 NT C- 1/2 voff=0 +BA 32 CR 1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 32 CR 1 TR 2 CH 2 NT C- 1/2 voff=0 +BA 32 CR 1+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 32 CR 2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 32 CR 2+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 32 CR 3 TR 2 CH 2 NT C- 1/2 voff=0 +BA 32 CR 3+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 1 CR 0 TR 3 CH 3 Instrument 6 +BA 1 CR 0 TR 3 CH 3 NT R 0 von=127 voff=0 +BA 21 CR 0 TR 3 CH 3 NT C- 1/2 voff=0 +BA 21 CR 1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 21 CR 1 TR 3 CH 3 NT C- 1/2 voff=0 +BA 21 CR 1+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 21 CR 2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 21 CR 2+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 21 CR 3 TR 3 CH 3 NT C- 1/2 voff=0 +BA 21 CR 3+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 22 CR 0 TR 3 CH 3 NT C- 1/2 voff=0 +BA 22 CR 1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 22 CR 1 TR 3 CH 3 NT F- 1/2 voff=0 +BA 22 CR 1+1/2 TR 3 CH 3 NT F- 1/2 voff=0 +BA 22 CR 2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 22 CR 2+1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 22 CR 3 TR 3 CH 3 NT G- 1/2 voff=0 +BA 22 CR 3+1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 23 CR 0 TR 3 CH 3 NT C- 1/2 voff=0 +BA 23 CR 1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 23 CR 1 TR 3 CH 3 NT C- 1/2 voff=0 +BA 23 CR 1+1/2 TR 3 CH 3 NT C- 1/2 
voff=0 +BA 23 CR 2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 23 CR 2+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 23 CR 3 TR 3 CH 3 NT C- 1/2 voff=0 +BA 23 CR 3+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 24 CR 0 TR 3 CH 3 NT C- 1/2 voff=0 +BA 24 CR 1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 24 CR 1 TR 3 CH 3 NT F- 1/2 voff=0 +BA 24 CR 1+1/2 TR 3 CH 3 NT F- 1/2 voff=0 +BA 24 CR 2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 24 CR 2+1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 24 CR 3 TR 3 CH 3 NT G- 1/2 voff=0 +BA 24 CR 3+1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 33 CR 0 TR 3 CH 3 NT C- 1/2 voff=0 +BA 33 CR 1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 33 CR 1 TR 3 CH 3 NT D- 1/2 voff=0 +BA 33 CR 1+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 33 CR 2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 33 CR 2+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 33 CR 3 TR 3 CH 3 NT D- 1/2 voff=0 +BA 33 CR 3+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 34 CR 0 TR 3 CH 3 NT C- 1/2 voff=0 +BA 34 CR 1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 34 CR 1 TR 3 CH 3 NT C- 1/2 voff=0 +BA 34 CR 1+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 34 CR 2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 34 CR 2+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 34 CR 3 TR 3 CH 3 NT C- 1/2 voff=0 +BA 34 CR 3+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 35 CR 0 TR 3 CH 3 NT C- 1/2 voff=0 +BA 35 CR 1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 35 CR 1 TR 3 CH 3 NT D- 1/2 voff=0 +BA 35 CR 1+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 35 CR 2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 35 CR 2+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 35 CR 3 TR 3 CH 3 NT D- 1/2 voff=0 +BA 35 CR 3+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 36 CR 0 TR 3 CH 3 NT C- 1/2 voff=0 +BA 36 CR 1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 36 CR 1 TR 3 CH 3 NT C- 1/2 voff=0 +BA 36 CR 1+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 36 CR 2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 36 CR 2+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 36 CR 3 TR 3 CH 3 NT C- 1/2 voff=0 +BA 36 CR 3+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 1 CR 0 TR 4 CH 4 Instrument 30 +BA 1 CR 0 TR 4 CH 4 NT R 0 von=127 voff=0 +BA 13 CR 0 TR 4 CH 4 NT C- 1/2 voff=0 +BA 13 CR 0 TR 4 CH 4 NT D- 1/2 voff=0 +BA 13 CR 0 TR 4 CH 4 NT E- 1/2 voff=0 +BA 
13 CR 1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 13 CR 1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 13 CR 1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 13 CR 1 TR 4 CH 4 NT C- 1/2 voff=0 +BA 13 CR 1 TR 4 CH 4 NT D- 1/2 voff=0 +BA 13 CR 1 TR 4 CH 4 NT E- 1/2 voff=0 +BA 13 CR 1+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 13 CR 1+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 13 CR 1+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 13 CR 2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 13 CR 2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 13 CR 2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 13 CR 2+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 13 CR 2+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 13 CR 2+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 13 CR 3 TR 4 CH 4 NT C- 1/2 voff=0 +BA 13 CR 3 TR 4 CH 4 NT D- 1/2 voff=0 +BA 13 CR 3 TR 4 CH 4 NT E- 1/2 voff=0 +BA 13 CR 3+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 13 CR 3+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 13 CR 3+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 14 CR 0 TR 4 CH 4 NT G- 1/2 voff=0 +BA 14 CR 0 TR 4 CH 4 NT A- 1/2 voff=0 +BA 14 CR 0 TR 4 CH 4 NT B- 1/2 voff=0 +BA 14 CR 1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 14 CR 1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 14 CR 1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 14 CR 1 TR 4 CH 4 NT A- 1/2 voff=0 +BA 14 CR 1 TR 4 CH 4 NT B- 1/2 voff=0 +BA 14 CR 1 TR 4 CH 4 NT C- 1/2 voff=0 +BA 14 CR 1+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 14 CR 1+1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 14 CR 1+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 14 CR 2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 14 CR 2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 14 CR 2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 14 CR 2+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 14 CR 2+1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 14 CR 2+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 14 CR 3 TR 4 CH 4 NT A- 1/2 voff=0 +BA 14 CR 3 TR 4 CH 4 NT B- 1/2 voff=0 +BA 14 CR 3 TR 4 CH 4 NT C- 1/2 voff=0 +BA 14 CR 3+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 14 CR 3+1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 14 CR 3+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 15 CR 0 TR 4 CH 4 NT C- 1/2 voff=0 +BA 15 CR 0 TR 4 CH 4 NT D- 1/2 voff=0 +BA 15 CR 0 TR 4 CH 4 NT E- 1/2 voff=0 +BA 15 CR 1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 15 
CR 1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 15 CR 1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 15 CR 1 TR 4 CH 4 NT C- 1/2 voff=0 +BA 15 CR 1 TR 4 CH 4 NT D- 1/2 voff=0 +BA 15 CR 1 TR 4 CH 4 NT E- 1/2 voff=0 +BA 15 CR 1+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 15 CR 1+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 15 CR 1+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 15 CR 2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 15 CR 2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 15 CR 2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 15 CR 2+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 15 CR 2+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 15 CR 2+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 15 CR 3 TR 4 CH 4 NT C- 1/2 voff=0 +BA 15 CR 3 TR 4 CH 4 NT D- 1/2 voff=0 +BA 15 CR 3 TR 4 CH 4 NT E- 1/2 voff=0 +BA 15 CR 3+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 15 CR 3+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 15 CR 3+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 16 CR 0 TR 4 CH 4 NT G- 1/2 voff=0 +BA 16 CR 0 TR 4 CH 4 NT A- 1/2 voff=0 +BA 16 CR 0 TR 4 CH 4 NT B- 1/2 voff=0 +BA 16 CR 1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 16 CR 1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 16 CR 1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 16 CR 1 TR 4 CH 4 NT A- 1/2 voff=0 +BA 16 CR 1 TR 4 CH 4 NT B- 1/2 voff=0 +BA 16 CR 1 TR 4 CH 4 NT C- 1/2 voff=0 +BA 16 CR 1+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 16 CR 1+1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 16 CR 1+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 16 CR 2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 16 CR 2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 16 CR 2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 16 CR 2+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 16 CR 2+1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 16 CR 2+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 16 CR 3 TR 4 CH 4 NT A- 1/2 voff=0 +BA 16 CR 3 TR 4 CH 4 NT B- 1/2 voff=0 +BA 16 CR 3 TR 4 CH 4 NT C- 1/2 voff=0 +BA 16 CR 3+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 16 CR 3+1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 16 CR 3+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 17 CR 0 TR 4 CH 4 NT C- 1/2 voff=0 +BA 17 CR 0 TR 4 CH 4 NT D- 1/2 voff=0 +BA 17 CR 0 TR 4 CH 4 NT E- 1/2 voff=0 +BA 17 CR 1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 17 CR 1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 17 CR 
1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 17 CR 1 TR 4 CH 4 NT C- 1/2 voff=0 +BA 17 CR 1 TR 4 CH 4 NT D- 1/2 voff=0 +BA 17 CR 1 TR 4 CH 4 NT E- 1/2 voff=0 +BA 17 CR 1+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 17 CR 1+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 17 CR 1+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 17 CR 2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 17 CR 2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 17 CR 2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 17 CR 2+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 17 CR 2+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 17 CR 2+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 17 CR 3 TR 4 CH 4 NT C- 1/2 voff=0 +BA 17 CR 3 TR 4 CH 4 NT D- 1/2 voff=0 +BA 17 CR 3 TR 4 CH 4 NT E- 1/2 voff=0 +BA 17 CR 3+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 17 CR 3+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 17 CR 3+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 18 CR 0 TR 4 CH 4 NT C- 1/2 voff=0 +BA 18 CR 0 TR 4 CH 4 NT D- 1/2 voff=0 +BA 18 CR 0 TR 4 CH 4 NT E- 1/2 voff=0 +BA 18 CR 1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 18 CR 1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 18 CR 1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 18 CR 1 TR 4 CH 4 NT C- 1/2 voff=0 +BA 18 CR 1 TR 4 CH 4 NT D- 1/2 voff=0 +BA 18 CR 1 TR 4 CH 4 NT E- 1/2 voff=0 +BA 18 CR 1+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 18 CR 1+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 18 CR 1+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 18 CR 2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 18 CR 2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 18 CR 2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 18 CR 2+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 18 CR 2+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 18 CR 2+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 18 CR 3 TR 4 CH 4 NT C- 1/2 voff=0 +BA 18 CR 3 TR 4 CH 4 NT D- 1/2 voff=0 +BA 18 CR 3 TR 4 CH 4 NT E- 1/2 voff=0 +BA 18 CR 3+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 18 CR 3+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 18 CR 3+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 19 CR 0 TR 4 CH 4 NT C- 1/2 voff=0 +BA 19 CR 0 TR 4 CH 4 NT D- 1/2 voff=0 +BA 19 CR 0 TR 4 CH 4 NT E- 1/2 voff=0 +BA 19 CR 1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 19 CR 1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 19 CR 1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 19 CR 1 
TR 4 CH 4 NT C- 1/2 voff=0 +BA 19 CR 1 TR 4 CH 4 NT D- 1/2 voff=0 +BA 19 CR 1 TR 4 CH 4 NT E- 1/2 voff=0 +BA 19 CR 1+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 19 CR 1+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 19 CR 1+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 19 CR 2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 19 CR 2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 19 CR 2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 19 CR 2+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 19 CR 2+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 19 CR 2+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 19 CR 3 TR 4 CH 4 NT C- 1/2 voff=0 +BA 19 CR 3 TR 4 CH 4 NT D- 1/2 voff=0 +BA 19 CR 3 TR 4 CH 4 NT E- 1/2 voff=0 +BA 19 CR 3+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 19 CR 3+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 19 CR 3+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 20 CR 0 TR 4 CH 4 NT C- 1/2 voff=0 +BA 20 CR 0 TR 4 CH 4 NT D- 1/2 voff=0 +BA 20 CR 0 TR 4 CH 4 NT E- 1/2 voff=0 +BA 20 CR 1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 20 CR 1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 20 CR 1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 20 CR 1 TR 4 CH 4 NT C- 1/2 voff=0 +BA 20 CR 1 TR 4 CH 4 NT D- 1/2 voff=0 +BA 20 CR 1 TR 4 CH 4 NT E- 1/2 voff=0 +BA 20 CR 1+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 20 CR 1+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 20 CR 1+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 20 CR 2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 20 CR 2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 20 CR 2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 20 CR 2+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 20 CR 2+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 20 CR 2+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 20 CR 3 TR 4 CH 4 NT C- 1/2 voff=0 +BA 20 CR 3 TR 4 CH 4 NT D- 1/2 voff=0 +BA 20 CR 3 TR 4 CH 4 NT E- 1/2 voff=0 +BA 20 CR 3+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 20 CR 3+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 20 CR 3+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 21 CR 0 TR 4 CH 4 NT C- 1/2 voff=0 +BA 21 CR 0 TR 4 CH 4 NT D- 1/2 voff=0 +BA 21 CR 0 TR 4 CH 4 NT E- 1/2 voff=0 +BA 21 CR 1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 21 CR 1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 21 CR 1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 21 CR 1 TR 4 CH 4 NT C- 1/2 voff=0 +BA 21 CR 1 TR 4 
CH 4 NT D- 1/2 voff=0 +BA 21 CR 1 TR 4 CH 4 NT E- 1/2 voff=0 +BA 21 CR 1+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 21 CR 1+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 21 CR 1+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 21 CR 2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 21 CR 2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 21 CR 2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 21 CR 2+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 21 CR 2+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 21 CR 2+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 21 CR 3 TR 4 CH 4 NT C- 1/2 voff=0 +BA 21 CR 3 TR 4 CH 4 NT D- 1/2 voff=0 +BA 21 CR 3 TR 4 CH 4 NT E- 1/2 voff=0 +BA 21 CR 3+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 21 CR 3+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 21 CR 3+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 22 CR 0 TR 4 CH 4 NT C- 1/2 voff=0 +BA 22 CR 0 TR 4 CH 4 NT D- 1/2 voff=0 +BA 22 CR 0 TR 4 CH 4 NT E- 1/2 voff=0 +BA 22 CR 1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 22 CR 1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 22 CR 1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 22 CR 1 TR 4 CH 4 NT F- 1/2 voff=0 +BA 22 CR 1 TR 4 CH 4 NT G- 1/2 voff=0 +BA 22 CR 1 TR 4 CH 4 NT A- 1/2 voff=0 +BA 22 CR 1+1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 22 CR 1+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 22 CR 1+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 22 CR 2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 22 CR 2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 22 CR 2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 22 CR 2+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 22 CR 2+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 22 CR 2+1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 22 CR 3 TR 4 CH 4 NT G- 1/2 voff=0 +BA 22 CR 3 TR 4 CH 4 NT A- 1/2 voff=0 +BA 22 CR 3 TR 4 CH 4 NT B- 1/2 voff=0 +BA 22 CR 3+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 22 CR 3+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 22 CR 3+1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 23 CR 0 TR 4 CH 4 NT C- 1/2 voff=0 +BA 23 CR 0 TR 4 CH 4 NT D- 1/2 voff=0 +BA 23 CR 0 TR 4 CH 4 NT E- 1/2 voff=0 +BA 23 CR 1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 23 CR 1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 23 CR 1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 23 CR 1 TR 4 CH 4 NT C- 1/2 voff=0 +BA 23 CR 1 TR 4 CH 4 NT D- 1/2 voff=0 +BA 23 CR 1 TR 4 CH 4 
NT E- 1/2 voff=0 +BA 23 CR 1+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 23 CR 1+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 23 CR 1+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 23 CR 2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 23 CR 2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 23 CR 2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 23 CR 2+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 23 CR 2+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 23 CR 2+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 23 CR 3 TR 4 CH 4 NT C- 1/2 voff=0 +BA 23 CR 3 TR 4 CH 4 NT D- 1/2 voff=0 +BA 23 CR 3 TR 4 CH 4 NT E- 1/2 voff=0 +BA 23 CR 3+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 23 CR 3+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 23 CR 3+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 24 CR 0 TR 4 CH 4 NT C- 1/2 voff=0 +BA 24 CR 0 TR 4 CH 4 NT D- 1/2 voff=0 +BA 24 CR 0 TR 4 CH 4 NT E- 1/2 voff=0 +BA 24 CR 1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 24 CR 1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 24 CR 1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 24 CR 1 TR 4 CH 4 NT F- 1/2 voff=0 +BA 24 CR 1 TR 4 CH 4 NT G- 1/2 voff=0 +BA 24 CR 1 TR 4 CH 4 NT A- 1/2 voff=0 +BA 24 CR 1+1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 24 CR 1+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 24 CR 1+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 24 CR 2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 24 CR 2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 24 CR 2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 24 CR 2+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 24 CR 2+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 24 CR 2+1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 24 CR 3 TR 4 CH 4 NT G- 1/2 voff=0 +BA 24 CR 3 TR 4 CH 4 NT A- 1/2 voff=0 +BA 24 CR 3 TR 4 CH 4 NT B- 1/2 voff=0 +BA 24 CR 3+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 24 CR 3+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 24 CR 3+1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 25 CR 0 TR 4 CH 4 NT G- 1/2 voff=0 +BA 25 CR 0 TR 4 CH 4 NT A- 1/2 voff=0 +BA 25 CR 0 TR 4 CH 4 NT B- 1/2 voff=0 +BA 25 CR 1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 25 CR 1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 25 CR 1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 25 CR 1 TR 4 CH 4 NT A- 1/2 voff=0 +BA 25 CR 1 TR 4 CH 4 NT B- 1/2 voff=0 +BA 25 CR 1 TR 4 CH 4 NT C- 1/2 voff=0 +BA 25 CR 1+1/2 TR 4 CH 4 
NT A- 1/2 voff=0 +BA 25 CR 1+1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 25 CR 1+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 25 CR 2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 25 CR 2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 25 CR 2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 25 CR 2+1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 25 CR 2+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 25 CR 2+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 25 CR 3 TR 4 CH 4 NT B- 1/2 voff=0 +BA 25 CR 3 TR 4 CH 4 NT C- 1/2 voff=0 +BA 25 CR 3 TR 4 CH 4 NT D- 1/2 voff=0 +BA 25 CR 3+1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 25 CR 3+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 25 CR 3+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 26 CR 0 TR 4 CH 4 NT D- 1/2 voff=0 +BA 26 CR 0 TR 4 CH 4 NT E- 1/2 voff=0 +BA 26 CR 0 TR 4 CH 4 NT F- 1/2 voff=0 +BA 26 CR 1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 26 CR 1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 26 CR 1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 26 CR 1 TR 4 CH 4 NT G- 1/2 voff=0 +BA 26 CR 1 TR 4 CH 4 NT A- 1/2 voff=0 +BA 26 CR 1 TR 4 CH 4 NT B- 1/2 voff=0 +BA 26 CR 1+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 26 CR 1+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 26 CR 1+1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 26 CR 2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 26 CR 2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 26 CR 2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 26 CR 2+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 26 CR 2+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 26 CR 2+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 26 CR 3 TR 4 CH 4 NT C- 1/2 voff=0 +BA 26 CR 3 TR 4 CH 4 NT D- 1/2 voff=0 +BA 26 CR 3 TR 4 CH 4 NT E- 1/2 voff=0 +BA 26 CR 3+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 26 CR 3+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 26 CR 3+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 27 CR 0 TR 4 CH 4 NT G- 1/2 voff=0 +BA 27 CR 0 TR 4 CH 4 NT A- 1/2 voff=0 +BA 27 CR 0 TR 4 CH 4 NT B- 1/2 voff=0 +BA 27 CR 1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 27 CR 1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 27 CR 1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 27 CR 1 TR 4 CH 4 NT A- 1/2 voff=0 +BA 27 CR 1 TR 4 CH 4 NT B- 1/2 voff=0 +BA 27 CR 1 TR 4 CH 4 NT C- 1/2 voff=0 +BA 27 CR 1+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 27 CR 1+1/2 TR 4 CH 4 
NT B- 1/2 voff=0 +BA 27 CR 1+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 27 CR 2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 27 CR 2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 27 CR 2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 27 CR 2+1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 27 CR 2+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 27 CR 2+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 27 CR 3 TR 4 CH 4 NT B- 1/2 voff=0 +BA 27 CR 3 TR 4 CH 4 NT C- 1/2 voff=0 +BA 27 CR 3 TR 4 CH 4 NT D- 1/2 voff=0 +BA 27 CR 3+1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 27 CR 3+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 27 CR 3+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 28 CR 0 TR 4 CH 4 NT D- 1/2 voff=0 +BA 28 CR 0 TR 4 CH 4 NT E- 1/2 voff=0 +BA 28 CR 0 TR 4 CH 4 NT F- 1/2 voff=0 +BA 28 CR 1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 28 CR 1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 28 CR 1/2 TR 4 CH 4 NT F- 1/2 voff=0 +BA 28 CR 1 TR 4 CH 4 NT G- 1/2 voff=0 +BA 28 CR 1 TR 4 CH 4 NT A- 1/2 voff=0 +BA 28 CR 1 TR 4 CH 4 NT B- 1/2 voff=0 +BA 28 CR 1+1/2 TR 4 CH 4 NT G- 1/2 voff=0 +BA 28 CR 1+1/2 TR 4 CH 4 NT A- 1/2 voff=0 +BA 28 CR 1+1/2 TR 4 CH 4 NT B- 1/2 voff=0 +BA 28 CR 2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 28 CR 2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 28 CR 2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 28 CR 2+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 28 CR 2+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 28 CR 2+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 28 CR 3 TR 4 CH 4 NT C- 1/2 voff=0 +BA 28 CR 3 TR 4 CH 4 NT D- 1/2 voff=0 +BA 28 CR 3 TR 4 CH 4 NT E- 1/2 voff=0 +BA 28 CR 3+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 28 CR 3+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 28 CR 3+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 29 CR 0 TR 4 CH 4 NT C- 1/2 voff=0 +BA 29 CR 0 TR 4 CH 4 NT D- 1/2 voff=0 +BA 29 CR 0 TR 4 CH 4 NT E- 1/2 voff=0 +BA 29 CR 1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 29 CR 1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 29 CR 1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 29 CR 1 TR 4 CH 4 NT C- 1/2 voff=0 +BA 29 CR 1 TR 4 CH 4 NT D- 1/2 voff=0 +BA 29 CR 1 TR 4 CH 4 NT E- 1/2 voff=0 +BA 29 CR 1+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 29 CR 1+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 29 CR 1+1/2 TR 4 CH 4 
NT E- 1/2 voff=0 +BA 29 CR 2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 29 CR 2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 29 CR 2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 29 CR 2+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 29 CR 2+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 29 CR 2+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 29 CR 3 TR 4 CH 4 NT C- 1/2 voff=0 +BA 29 CR 3 TR 4 CH 4 NT D- 1/2 voff=0 +BA 29 CR 3 TR 4 CH 4 NT E- 1/2 voff=0 +BA 29 CR 3+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 29 CR 3+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 29 CR 3+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 30 CR 0 TR 4 CH 4 NT C- 1/2 voff=0 +BA 30 CR 0 TR 4 CH 4 NT D- 1/2 voff=0 +BA 30 CR 0 TR 4 CH 4 NT E- 1/2 voff=0 +BA 30 CR 1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 30 CR 1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 30 CR 1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 30 CR 1 TR 4 CH 4 NT C- 1/2 voff=0 +BA 30 CR 1 TR 4 CH 4 NT D- 1/2 voff=0 +BA 30 CR 1 TR 4 CH 4 NT E- 1/2 voff=0 +BA 30 CR 1+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 30 CR 1+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 30 CR 1+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 30 CR 2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 30 CR 2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 30 CR 2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 30 CR 2+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 30 CR 2+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 30 CR 2+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 30 CR 3 TR 4 CH 4 NT C- 1/2 voff=0 +BA 30 CR 3 TR 4 CH 4 NT D- 1/2 voff=0 +BA 30 CR 3 TR 4 CH 4 NT E- 1/2 voff=0 +BA 30 CR 3+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 30 CR 3+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 30 CR 3+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 31 CR 0 TR 4 CH 4 NT C- 1/2 voff=0 +BA 31 CR 0 TR 4 CH 4 NT D- 1/2 voff=0 +BA 31 CR 0 TR 4 CH 4 NT E- 1/2 voff=0 +BA 31 CR 1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 31 CR 1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 31 CR 1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 31 CR 1 TR 4 CH 4 NT C- 1/2 voff=0 +BA 31 CR 1 TR 4 CH 4 NT D- 1/2 voff=0 +BA 31 CR 1 TR 4 CH 4 NT E- 1/2 voff=0 +BA 31 CR 1+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 31 CR 1+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 31 CR 1+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 31 CR 2 TR 4 CH 4 NT 
C- 1/2 voff=0 +BA 31 CR 2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 31 CR 2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 31 CR 2+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 31 CR 2+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 31 CR 2+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 31 CR 3 TR 4 CH 4 NT C- 1/2 voff=0 +BA 31 CR 3 TR 4 CH 4 NT D- 1/2 voff=0 +BA 31 CR 3 TR 4 CH 4 NT E- 1/2 voff=0 +BA 31 CR 3+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 31 CR 3+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 31 CR 3+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 32 CR 0 TR 4 CH 4 NT C- 1/2 voff=0 +BA 32 CR 0 TR 4 CH 4 NT D- 1/2 voff=0 +BA 32 CR 0 TR 4 CH 4 NT E- 1/2 voff=0 +BA 32 CR 1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 32 CR 1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 32 CR 1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 32 CR 1 TR 4 CH 4 NT C- 1/2 voff=0 +BA 32 CR 1 TR 4 CH 4 NT D- 1/2 voff=0 +BA 32 CR 1 TR 4 CH 4 NT E- 1/2 voff=0 +BA 32 CR 1+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 32 CR 1+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 32 CR 1+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 32 CR 2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 32 CR 2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 32 CR 2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 32 CR 2+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 32 CR 2+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 32 CR 2+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 32 CR 3 TR 4 CH 4 NT C- 1/2 voff=0 +BA 32 CR 3 TR 4 CH 4 NT D- 1/2 voff=0 +BA 32 CR 3 TR 4 CH 4 NT E- 1/2 voff=0 +BA 32 CR 3+1/2 TR 4 CH 4 NT C- 1/2 voff=0 +BA 32 CR 3+1/2 TR 4 CH 4 NT D- 1/2 voff=0 +BA 32 CR 3+1/2 TR 4 CH 4 NT E- 1/2 voff=0 +BA 37 CR 0 TR 0 CH 16 End of track +Dix ess writes on the test er. + +Dix ess writes on the test er. +Ack an moves to the gramm ar. +The draw er forms Don ler. +The test er goes to Nei ler. + +Le ald makes the draw er. +Brae ye forms All an. +The oth er moves to Wilb el. +The choos er forms Syd el. + +Ack ur is in the oth er. +Nei ye feels Luc ard. +Luc ard writes on Luc ard. +The trans lat or is Luc ard. + +Count an makes the updat er. +Black ee writes on the oth er. +The ess ence finds Whist el. +Ack ess needs All an. 
+ + +format=1 tracks=4 division=384 + +BA 1 CR 0 TR 0 CH 16 Text type 2: "Produced by Mind Reading Music Composer by Lucian Academy" +BA 1 CR 0 TR 0 CH 16 Text type 3: "Don ee loves the runn er." +BA 1 CR 0 TR 0 CH 1 Channel volume 127 +BA 1 CR 0 TR 0 CH 16 Tempo 63.00009 +BA 1 CR 0 TR 1 CH 1 Instrument 1 +BA 1 CR 0 TR 1 CH 1 NT R 0 von=127 voff=0 +BA 5 CR 0 TR 1 CH 1 NT E- 1/2 voff=0 +BA 5 CR 1/2 TR 1 CH 1 NT E- 1/2 voff=0 +BA 5 CR 1 TR 1 CH 1 NT B- 1/2 voff=0 +BA 5 CR 1+1/2 TR 1 CH 1 NT B- 1/2 voff=0 +BA 5 CR 2 TR 1 CH 1 NT F- 1/2 voff=0 +BA 5 CR 2+1/2 TR 1 CH 1 NT F- 1/2 voff=0 +BA 6 CR 0 TR 1 CH 1 NT A#- 1/2 voff=0 +BA 6 CR 1/2 TR 1 CH 1 NT A#- 1/2 voff=0 +BA 6 CR 1 TR 1 CH 1 NT C- 1/2 voff=0 +BA 6 CR 1+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 6 CR 2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 6 CR 2+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 7 CR 0 TR 1 CH 1 NT E- 1/2 voff=0 +BA 7 CR 1/2 TR 1 CH 1 NT E- 1/2 voff=0 +BA 7 CR 1 TR 1 CH 1 NT B- 1/2 voff=0 +BA 7 CR 1+1/2 TR 1 CH 1 NT B- 1/2 voff=0 +BA 7 CR 2 TR 1 CH 1 NT F- 1/2 voff=0 +BA 7 CR 2+1/2 TR 1 CH 1 NT F- 1/2 voff=0 +BA 8 CR 0 TR 1 CH 1 NT A#- 1/2 voff=0 +BA 8 CR 1/2 TR 1 CH 1 NT A#- 1/2 voff=0 +BA 8 CR 1 TR 1 CH 1 NT C- 1/2 voff=0 +BA 8 CR 1+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 8 CR 2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 8 CR 2+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 8 CR 3 TR 1 CH 1 NT C- 1/2 voff=0 +BA 13 CR 0 TR 1 CH 1 NT E- 1/2 voff=0 +BA 13 CR 1/2 TR 1 CH 1 NT E- 1/2 voff=0 +BA 13 CR 1 TR 1 CH 1 NT B- 1/2 voff=0 +BA 13 CR 1+1/2 TR 1 CH 1 NT B- 1/2 voff=0 +BA 13 CR 2 TR 1 CH 1 NT F- 1/2 voff=0 +BA 13 CR 2+1/2 TR 1 CH 1 NT F- 1/2 voff=0 +BA 14 CR 0 TR 1 CH 1 NT A#- 1/2 voff=0 +BA 14 CR 1/2 TR 1 CH 1 NT A#- 1/2 voff=0 +BA 14 CR 1 TR 1 CH 1 NT C- 1/2 voff=0 +BA 14 CR 1+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 14 CR 2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 14 CR 2+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 15 CR 0 TR 1 CH 1 NT E- 1/2 voff=0 +BA 15 CR 1/2 TR 1 CH 1 NT E- 1/2 voff=0 +BA 15 CR 1 TR 1 CH 1 NT B- 1/2 voff=0 +BA 15 CR 1+1/2 TR 1 CH 1 NT B- 1/2 voff=0 +BA 
15 CR 2 TR 1 CH 1 NT F- 1/2 voff=0 +BA 15 CR 2+1/2 TR 1 CH 1 NT F- 1/2 voff=0 +BA 16 CR 0 TR 1 CH 1 NT A#- 1/2 voff=0 +BA 16 CR 1/2 TR 1 CH 1 NT A#- 1/2 voff=0 +BA 16 CR 1 TR 1 CH 1 NT C- 1/2 voff=0 +BA 16 CR 1+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 16 CR 2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 16 CR 2+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 17 CR 0 TR 1 CH 1 NT D- 1/2 voff=0 +BA 17 CR 1/2 TR 1 CH 1 NT D- 1/2 voff=0 +BA 17 CR 1 TR 1 CH 1 NT F- 1/2 voff=0 +BA 17 CR 1+1/2 TR 1 CH 1 NT F- 1/2 voff=0 +BA 18 CR 0 TR 1 CH 1 NT A#- 1/2 voff=0 +BA 18 CR 1/2 TR 1 CH 1 NT A#- 1/2 voff=0 +BA 18 CR 1 TR 1 CH 1 NT C- 1/2 voff=0 +BA 18 CR 1+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 18 CR 2 TR 1 CH 1 NT D- 1/2 voff=0 +BA 18 CR 2+1/2 TR 1 CH 1 NT D- 1/2 voff=0 +BA 19 CR 0 TR 1 CH 1 NT D- 1/2 voff=0 +BA 19 CR 1/2 TR 1 CH 1 NT D- 1/2 voff=0 +BA 19 CR 1 TR 1 CH 1 NT F- 1/2 voff=0 +BA 19 CR 1+1/2 TR 1 CH 1 NT F- 1/2 voff=0 +BA 19 CR 2 TR 1 CH 1 NT F- 1/2 voff=0 +BA 20 CR 0 TR 1 CH 1 NT A#- 1/2 voff=0 +BA 20 CR 1/2 TR 1 CH 1 NT A#- 1/2 voff=0 +BA 20 CR 1 TR 1 CH 1 NT C- 1/2 voff=0 +BA 20 CR 1+1/2 TR 1 CH 1 NT C- 1/2 voff=0 +BA 20 CR 2 TR 1 CH 1 NT D- 1/2 voff=0 +BA 20 CR 2+1/2 TR 1 CH 1 NT D- 1/2 voff=0 +BA 25 CR 0 TR 1 CH 1 NT F- 1/2 voff=0 +BA 25 CR 1/2 TR 1 CH 1 NT F- 1/2 voff=0 +BA 25 CR 1 TR 1 CH 1 NT A#- 1/2 voff=0 +BA 25 CR 1+1/2 TR 1 CH 1 NT A#- 1/2 voff=0 +BA 25 CR 2 TR 1 CH 1 NT E- 1/2 voff=0 +BA 25 CR 2+1/2 TR 1 CH 1 NT E- 1/2 voff=0 +BA 26 CR 0 TR 1 CH 1 NT A- 1/2 voff=0 +BA 26 CR 1/2 TR 1 CH 1 NT A- 1/2 voff=0 +BA 26 CR 1 TR 1 CH 1 NT B- 1/2 voff=0 +BA 26 CR 1+1/2 TR 1 CH 1 NT B- 1/2 voff=0 +BA 26 CR 2 TR 1 CH 1 NT B- 1/2 voff=0 +BA 26 CR 2+1/2 TR 1 CH 1 NT B- 1/2 voff=0 +BA 26 CR 3 TR 1 CH 1 NT B- 1/2 voff=0 +BA 27 CR 0 TR 1 CH 1 NT F- 1/2 voff=0 +BA 27 CR 1/2 TR 1 CH 1 NT F- 1/2 voff=0 +BA 27 CR 1 TR 1 CH 1 NT A#- 1/2 voff=0 +BA 27 CR 1+1/2 TR 1 CH 1 NT A#- 1/2 voff=0 +BA 27 CR 2 TR 1 CH 1 NT E- 1/2 voff=0 +BA 27 CR 2+1/2 TR 1 CH 1 NT E- 1/2 voff=0 +BA 27 CR 3 TR 1 CH 1 NT E- 1/2 voff=0 +BA 28 
CR 0 TR 1 CH 1 NT A- 1/2 voff=0 +BA 28 CR 1/2 TR 1 CH 1 NT A- 1/2 voff=0 +BA 28 CR 1 TR 1 CH 1 NT B- 1/2 voff=0 +BA 28 CR 1+1/2 TR 1 CH 1 NT B- 1/2 voff=0 +BA 28 CR 2 TR 1 CH 1 NT B- 1/2 voff=0 +BA 29 CR 0 TR 1 CH 1 NT F- 1/2 voff=0 +BA 29 CR 1/2 TR 1 CH 1 NT F- 1/2 voff=0 +BA 29 CR 1 TR 1 CH 1 NT A#- 1/2 voff=0 +BA 29 CR 1+1/2 TR 1 CH 1 NT A#- 1/2 voff=0 +BA 29 CR 2 TR 1 CH 1 NT E- 1/2 voff=0 +BA 29 CR 2+1/2 TR 1 CH 1 NT E- 1/2 voff=0 +BA 30 CR 0 TR 1 CH 1 NT A- 1/2 voff=0 +BA 30 CR 1/2 TR 1 CH 1 NT A- 1/2 voff=0 +BA 30 CR 1 TR 1 CH 1 NT B- 1/2 voff=0 +BA 30 CR 1+1/2 TR 1 CH 1 NT B- 1/2 voff=0 +BA 30 CR 2 TR 1 CH 1 NT B- 1/2 voff=0 +BA 30 CR 2+1/2 TR 1 CH 1 NT B- 1/2 voff=0 +BA 30 CR 3 TR 1 CH 1 NT B- 1/2 voff=0 +BA 31 CR 0 TR 1 CH 1 NT F- 1/2 voff=0 +BA 31 CR 1/2 TR 1 CH 1 NT F- 1/2 voff=0 +BA 31 CR 1 TR 1 CH 1 NT A#- 1/2 voff=0 +BA 31 CR 1+1/2 TR 1 CH 1 NT A#- 1/2 voff=0 +BA 31 CR 2 TR 1 CH 1 NT E- 1/2 voff=0 +BA 31 CR 2+1/2 TR 1 CH 1 NT E- 1/2 voff=0 +BA 31 CR 3 TR 1 CH 1 NT E- 1/2 voff=0 +BA 32 CR 0 TR 1 CH 1 NT A- 1/2 voff=0 +BA 32 CR 1/2 TR 1 CH 1 NT A- 1/2 voff=0 +BA 32 CR 1 TR 1 CH 1 NT B- 1/2 voff=0 +BA 32 CR 1+1/2 TR 1 CH 1 NT B- 1/2 voff=0 +BA 32 CR 2 TR 1 CH 1 NT B- 1/2 voff=0 +BA 33 CR 0 TR 1 CH 1 NT F- 1/2 voff=0 +BA 33 CR 1/2 TR 1 CH 1 NT F- 1/2 voff=0 +BA 33 CR 1 TR 1 CH 1 NT A#- 1/2 voff=0 +BA 33 CR 1+1/2 TR 1 CH 1 NT A#- 1/2 voff=0 +BA 33 CR 2 TR 1 CH 1 NT E- 1/2 voff=0 +BA 33 CR 2+1/2 TR 1 CH 1 NT E- 1/2 voff=0 +BA 34 CR 0 TR 1 CH 1 NT A- 1/2 voff=0 +BA 34 CR 1/2 TR 1 CH 1 NT A- 1/2 voff=0 +BA 34 CR 1 TR 1 CH 1 NT B- 1/2 voff=0 +BA 34 CR 1+1/2 TR 1 CH 1 NT B- 1/2 voff=0 +BA 34 CR 2 TR 1 CH 1 NT B- 1/2 voff=0 +BA 34 CR 2+1/2 TR 1 CH 1 NT B- 1/2 voff=0 +BA 34 CR 3 TR 1 CH 1 NT B- 1/2 voff=0 +BA 35 CR 0 TR 1 CH 1 NT F- 1/2 voff=0 +BA 35 CR 1/2 TR 1 CH 1 NT F- 1/2 voff=0 +BA 35 CR 1 TR 1 CH 1 NT A#- 1/2 voff=0 +BA 35 CR 1+1/2 TR 1 CH 1 NT A#- 1/2 voff=0 +BA 35 CR 2 TR 1 CH 1 NT E- 1/2 voff=0 +BA 35 CR 2+1/2 TR 1 CH 1 NT E- 1/2 voff=0 +BA 35 CR 3 TR 
1 CH 1 NT E- 1/2 voff=0 +BA 36 CR 0 TR 1 CH 1 NT A- 1/2 voff=0 +BA 36 CR 1/2 TR 1 CH 1 NT A- 1/2 voff=0 +BA 36 CR 1 TR 1 CH 1 NT B- 1/2 voff=0 +BA 36 CR 1+1/2 TR 1 CH 1 NT B- 1/2 voff=0 +BA 36 CR 2 TR 1 CH 1 NT B- 1/2 voff=0 +BA 1 CR 0 TR 2 CH 2 Instrument 41 +BA 1 CR 0 TR 2 CH 2 NT R 0 von=127 voff=0 +BA 1 CR 0 TR 2 CH 2 NT G- 1/2 voff=0 +BA 1 CR 1/2 TR 2 CH 2 NT G- 1/2 voff=0 +BA 1 CR 1 TR 2 CH 2 NT D- 1/2 voff=0 +BA 1 CR 1+1/2 TR 2 CH 2 NT D- 1/2 voff=0 +BA 1 CR 2 TR 2 CH 2 NT D- 1/2 voff=0 +BA 1 CR 2+1/2 TR 2 CH 2 NT D- 1/2 voff=0 +BA 1 CR 3 TR 2 CH 2 NT D- 1/2 voff=0 +BA 1 CR 3+1/2 TR 2 CH 2 NT D- 1/2 voff=0 +BA 2 CR 0 TR 2 CH 2 NT G- 1/2 voff=0 +BA 2 CR 1/2 TR 2 CH 2 NT G- 1/2 voff=0 +BA 2 CR 1 TR 2 CH 2 NT C#- 1/2 voff=0 +BA 2 CR 1+1/2 TR 2 CH 2 NT C#- 1/2 voff=0 +BA 2 CR 2 TR 2 CH 2 NT E- 1/2 voff=0 +BA 2 CR 2+1/2 TR 2 CH 2 NT E- 1/2 voff=0 +BA 2 CR 3 TR 2 CH 2 NT E- 1/2 voff=0 +BA 2 CR 3+1/2 TR 2 CH 2 NT E- 1/2 voff=0 +BA 3 CR 0 TR 2 CH 2 NT G- 1/2 voff=0 +BA 3 CR 1/2 TR 2 CH 2 NT G- 1/2 voff=0 +BA 3 CR 1 TR 2 CH 2 NT D- 1/2 voff=0 +BA 3 CR 1+1/2 TR 2 CH 2 NT D- 1/2 voff=0 +BA 3 CR 2 TR 2 CH 2 NT D- 1/2 voff=0 +BA 3 CR 2+1/2 TR 2 CH 2 NT D- 1/2 voff=0 +BA 3 CR 3 TR 2 CH 2 NT D- 1/2 voff=0 +BA 3 CR 3+1/2 TR 2 CH 2 NT D- 1/2 voff=0 +BA 4 CR 0 TR 2 CH 2 NT G- 1/2 voff=0 +BA 4 CR 1/2 TR 2 CH 2 NT G- 1/2 voff=0 +BA 4 CR 1 TR 2 CH 2 NT C#- 1/2 voff=0 +BA 4 CR 1+1/2 TR 2 CH 2 NT C#- 1/2 voff=0 +BA 4 CR 2 TR 2 CH 2 NT E- 1/2 voff=0 +BA 4 CR 2+1/2 TR 2 CH 2 NT E- 1/2 voff=0 +BA 4 CR 3 TR 2 CH 2 NT E- 1/2 voff=0 +BA 4 CR 3+1/2 TR 2 CH 2 NT E- 1/2 voff=0 +BA 5 CR 0 TR 2 CH 2 NT E- 1/2 voff=0 +BA 5 CR 1/2 TR 2 CH 2 NT E- 1/2 voff=0 +BA 5 CR 1 TR 2 CH 2 NT B- 1/2 voff=0 +BA 5 CR 1+1/2 TR 2 CH 2 NT B- 1/2 voff=0 +BA 5 CR 2 TR 2 CH 2 NT F- 1/2 voff=0 +BA 5 CR 2+1/2 TR 2 CH 2 NT F- 1/2 voff=0 +BA 5 CR 3 TR 2 CH 2 NT F- 1/2 voff=0 +BA 5 CR 3+1/2 TR 2 CH 2 NT F- 1/2 voff=0 +BA 6 CR 0 TR 2 CH 2 NT A#- 1/2 voff=0 +BA 6 CR 1/2 TR 2 CH 2 NT A#- 1/2 voff=0 +BA 6 CR 1 TR 2 CH 2 
NT C- 1/2 voff=0 +BA 6 CR 1+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 6 CR 2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 6 CR 2+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 6 CR 3 TR 2 CH 2 NT C- 1/2 voff=0 +BA 6 CR 3+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 7 CR 0 TR 2 CH 2 NT E- 1/2 voff=0 +BA 7 CR 1/2 TR 2 CH 2 NT E- 1/2 voff=0 +BA 7 CR 1 TR 2 CH 2 NT B- 1/2 voff=0 +BA 7 CR 1+1/2 TR 2 CH 2 NT B- 1/2 voff=0 +BA 7 CR 2 TR 2 CH 2 NT F- 1/2 voff=0 +BA 7 CR 2+1/2 TR 2 CH 2 NT F- 1/2 voff=0 +BA 7 CR 3 TR 2 CH 2 NT F- 1/2 voff=0 +BA 7 CR 3+1/2 TR 2 CH 2 NT F- 1/2 voff=0 +BA 8 CR 0 TR 2 CH 2 NT A#- 1/2 voff=0 +BA 8 CR 1/2 TR 2 CH 2 NT A#- 1/2 voff=0 +BA 8 CR 1 TR 2 CH 2 NT C- 1/2 voff=0 +BA 8 CR 1+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 8 CR 2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 8 CR 2+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 8 CR 3 TR 2 CH 2 NT C- 1/2 voff=0 +BA 8 CR 3+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 17 CR 0 TR 2 CH 2 NT D- 1/2 voff=0 +BA 17 CR 1/2 TR 2 CH 2 NT D- 1/2 voff=0 +BA 17 CR 1 TR 2 CH 2 NT F- 1/2 voff=0 +BA 17 CR 1+1/2 TR 2 CH 2 NT F- 1/2 voff=0 +BA 17 CR 2 TR 2 CH 2 NT F- 1/2 voff=0 +BA 17 CR 2+1/2 TR 2 CH 2 NT F- 1/2 voff=0 +BA 17 CR 3 TR 2 CH 2 NT F- 1/2 voff=0 +BA 17 CR 3+1/2 TR 2 CH 2 NT F- 1/2 voff=0 +BA 18 CR 0 TR 2 CH 2 NT A#- 1/2 voff=0 +BA 18 CR 1/2 TR 2 CH 2 NT A#- 1/2 voff=0 +BA 18 CR 1 TR 2 CH 2 NT C- 1/2 voff=0 +BA 18 CR 1+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 18 CR 2 TR 2 CH 2 NT D- 1/2 voff=0 +BA 18 CR 2+1/2 TR 2 CH 2 NT D- 1/2 voff=0 +BA 18 CR 3 TR 2 CH 2 NT D- 1/2 voff=0 +BA 18 CR 3+1/2 TR 2 CH 2 NT D- 1/2 voff=0 +BA 19 CR 0 TR 2 CH 2 NT D- 1/2 voff=0 +BA 19 CR 1/2 TR 2 CH 2 NT D- 1/2 voff=0 +BA 19 CR 1 TR 2 CH 2 NT F- 1/2 voff=0 +BA 19 CR 1+1/2 TR 2 CH 2 NT F- 1/2 voff=0 +BA 19 CR 2 TR 2 CH 2 NT F- 1/2 voff=0 +BA 19 CR 2+1/2 TR 2 CH 2 NT F- 1/2 voff=0 +BA 19 CR 3 TR 2 CH 2 NT F- 1/2 voff=0 +BA 19 CR 3+1/2 TR 2 CH 2 NT F- 1/2 voff=0 +BA 20 CR 0 TR 2 CH 2 NT A#- 1/2 voff=0 +BA 20 CR 1/2 TR 2 CH 2 NT A#- 1/2 voff=0 +BA 20 CR 1 TR 2 CH 2 NT C- 1/2 voff=0 +BA 20 CR 1+1/2 TR 2 CH 2 NT C- 1/2 
voff=0 +BA 20 CR 2 TR 2 CH 2 NT D- 1/2 voff=0 +BA 20 CR 2+1/2 TR 2 CH 2 NT D- 1/2 voff=0 +BA 20 CR 3 TR 2 CH 2 NT D- 1/2 voff=0 +BA 20 CR 3+1/2 TR 2 CH 2 NT D- 1/2 voff=0 +BA 21 CR 0 TR 2 CH 2 NT D- 1/2 voff=0 +BA 21 CR 1/2 TR 2 CH 2 NT D- 1/2 voff=0 +BA 21 CR 1 TR 2 CH 2 NT C- 1/2 voff=0 +BA 21 CR 1+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 21 CR 2 TR 2 CH 2 NT F- 1/2 voff=0 +BA 21 CR 2+1/2 TR 2 CH 2 NT F- 1/2 voff=0 +BA 21 CR 3 TR 2 CH 2 NT F- 1/2 voff=0 +BA 21 CR 3+1/2 TR 2 CH 2 NT F- 1/2 voff=0 +BA 22 CR 0 TR 2 CH 2 NT A#- 1/2 voff=0 +BA 22 CR 1/2 TR 2 CH 2 NT A#- 1/2 voff=0 +BA 22 CR 1 TR 2 CH 2 NT C- 1/2 voff=0 +BA 22 CR 1+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 22 CR 2 TR 2 CH 2 NT F- 1/2 voff=0 +BA 22 CR 2+1/2 TR 2 CH 2 NT F- 1/2 voff=0 +BA 22 CR 3 TR 2 CH 2 NT F- 1/2 voff=0 +BA 22 CR 3+1/2 TR 2 CH 2 NT F- 1/2 voff=0 +BA 23 CR 0 TR 2 CH 2 NT D- 1/2 voff=0 +BA 23 CR 1/2 TR 2 CH 2 NT D- 1/2 voff=0 +BA 23 CR 1 TR 2 CH 2 NT C- 1/2 voff=0 +BA 23 CR 1+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 23 CR 2 TR 2 CH 2 NT F- 1/2 voff=0 +BA 23 CR 2+1/2 TR 2 CH 2 NT F- 1/2 voff=0 +BA 23 CR 3 TR 2 CH 2 NT F- 1/2 voff=0 +BA 23 CR 3+1/2 TR 2 CH 2 NT F- 1/2 voff=0 +BA 24 CR 0 TR 2 CH 2 NT A#- 1/2 voff=0 +BA 24 CR 1/2 TR 2 CH 2 NT A#- 1/2 voff=0 +BA 24 CR 1 TR 2 CH 2 NT C- 1/2 voff=0 +BA 24 CR 1+1/2 TR 2 CH 2 NT C- 1/2 voff=0 +BA 24 CR 2 TR 2 CH 2 NT F- 1/2 voff=0 +BA 24 CR 2+1/2 TR 2 CH 2 NT F- 1/2 voff=0 +BA 24 CR 3 TR 2 CH 2 NT F- 1/2 voff=0 +BA 24 CR 3+1/2 TR 2 CH 2 NT F- 1/2 voff=0 +BA 1 CR 0 TR 3 CH 3 Instrument 59 +BA 1 CR 0 TR 3 CH 3 NT R 0 von=127 voff=0 +BA 1 CR 0 TR 3 CH 3 NT G- 1/2 voff=0 +BA 1 CR 0 TR 3 CH 3 NT B- 1/2 voff=0 +BA 1 CR 0 TR 3 CH 3 NT D- 1/2 voff=0 +BA 1 CR 1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 1 CR 1/2 TR 3 CH 3 NT B- 1/2 voff=0 +BA 1 CR 1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 1 CR 1 TR 3 CH 3 NT C- 1/2 voff=0 +BA 1 CR 1 TR 3 CH 3 NT E- 1/2 voff=0 +BA 1 CR 1 TR 3 CH 3 NT G- 1/2 voff=0 +BA 1 CR 1+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 1 CR 1+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 1 CR 
1+1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 1 CR 2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 1 CR 2 TR 3 CH 3 NT F- 1/2 voff=0 +BA 1 CR 2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 1 CR 2+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 1 CR 2+1/2 TR 3 CH 3 NT F- 1/2 voff=0 +BA 1 CR 2+1/2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 1 CR 3 TR 3 CH 3 NT D- 1/2 voff=0 +BA 1 CR 3 TR 3 CH 3 NT F- 1/2 voff=0 +BA 1 CR 3 TR 3 CH 3 NT A- 1/2 voff=0 +BA 1 CR 3+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 1 CR 3+1/2 TR 3 CH 3 NT F- 1/2 voff=0 +BA 1 CR 3+1/2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 2 CR 0 TR 3 CH 3 NT G- 1/2 voff=0 +BA 2 CR 0 TR 3 CH 3 NT B- 1/2 voff=0 +BA 2 CR 0 TR 3 CH 3 NT D- 1/2 voff=0 +BA 2 CR 1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 2 CR 1/2 TR 3 CH 3 NT B- 1/2 voff=0 +BA 2 CR 1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 2 CR 1 TR 3 CH 3 NT D- 1/2 voff=0 +BA 2 CR 1 TR 3 CH 3 NT F- 1/2 voff=0 +BA 2 CR 1 TR 3 CH 3 NT A- 1/2 voff=0 +BA 2 CR 1+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 2 CR 1+1/2 TR 3 CH 3 NT F- 1/2 voff=0 +BA 2 CR 1+1/2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 2 CR 2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 2 CR 2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 2 CR 2 TR 3 CH 3 NT B- 1/2 voff=0 +BA 2 CR 2+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 2 CR 2+1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 2 CR 2+1/2 TR 3 CH 3 NT B- 1/2 voff=0 +BA 2 CR 3 TR 3 CH 3 NT E- 1/2 voff=0 +BA 2 CR 3 TR 3 CH 3 NT G- 1/2 voff=0 +BA 2 CR 3 TR 3 CH 3 NT B- 1/2 voff=0 +BA 2 CR 3+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 2 CR 3+1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 2 CR 3+1/2 TR 3 CH 3 NT B- 1/2 voff=0 +BA 3 CR 0 TR 3 CH 3 NT G- 1/2 voff=0 +BA 3 CR 0 TR 3 CH 3 NT B- 1/2 voff=0 +BA 3 CR 0 TR 3 CH 3 NT D- 1/2 voff=0 +BA 3 CR 1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 3 CR 1/2 TR 3 CH 3 NT B- 1/2 voff=0 +BA 3 CR 1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 3 CR 1 TR 3 CH 3 NT C- 1/2 voff=0 +BA 3 CR 1 TR 3 CH 3 NT E- 1/2 voff=0 +BA 3 CR 1 TR 3 CH 3 NT G- 1/2 voff=0 +BA 3 CR 1+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 3 CR 1+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 3 CR 1+1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 3 CR 2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 3 CR 2 TR 3 CH 3 
NT F- 1/2 voff=0 +BA 3 CR 2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 3 CR 2+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 3 CR 2+1/2 TR 3 CH 3 NT F- 1/2 voff=0 +BA 3 CR 2+1/2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 3 CR 3 TR 3 CH 3 NT D- 1/2 voff=0 +BA 3 CR 3 TR 3 CH 3 NT F- 1/2 voff=0 +BA 3 CR 3 TR 3 CH 3 NT A- 1/2 voff=0 +BA 3 CR 3+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 3 CR 3+1/2 TR 3 CH 3 NT F- 1/2 voff=0 +BA 3 CR 3+1/2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 4 CR 0 TR 3 CH 3 NT G- 1/2 voff=0 +BA 4 CR 0 TR 3 CH 3 NT B- 1/2 voff=0 +BA 4 CR 0 TR 3 CH 3 NT D- 1/2 voff=0 +BA 4 CR 1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 4 CR 1/2 TR 3 CH 3 NT B- 1/2 voff=0 +BA 4 CR 1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 4 CR 1 TR 3 CH 3 NT D- 1/2 voff=0 +BA 4 CR 1 TR 3 CH 3 NT F- 1/2 voff=0 +BA 4 CR 1 TR 3 CH 3 NT A- 1/2 voff=0 +BA 4 CR 1+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 4 CR 1+1/2 TR 3 CH 3 NT F- 1/2 voff=0 +BA 4 CR 1+1/2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 4 CR 2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 4 CR 2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 4 CR 2 TR 3 CH 3 NT B- 1/2 voff=0 +BA 4 CR 2+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 4 CR 2+1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 4 CR 2+1/2 TR 3 CH 3 NT B- 1/2 voff=0 +BA 4 CR 3 TR 3 CH 3 NT E- 1/2 voff=0 +BA 4 CR 3 TR 3 CH 3 NT G- 1/2 voff=0 +BA 4 CR 3 TR 3 CH 3 NT B- 1/2 voff=0 +BA 4 CR 3+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 4 CR 3+1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 4 CR 3+1/2 TR 3 CH 3 NT B- 1/2 voff=0 +BA 13 CR 0 TR 3 CH 3 NT E- 1/2 voff=0 +BA 13 CR 0 TR 3 CH 3 NT G- 1/2 voff=0 +BA 13 CR 0 TR 3 CH 3 NT B- 1/2 voff=0 +BA 13 CR 1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 13 CR 1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 13 CR 1/2 TR 3 CH 3 NT B- 1/2 voff=0 +BA 13 CR 1 TR 3 CH 3 NT A#- 1/2 voff=0 +BA 13 CR 1 TR 3 CH 3 NT D- 1/2 voff=0 +BA 13 CR 1 TR 3 CH 3 NT F- 1/2 voff=0 +BA 13 CR 1+1/2 TR 3 CH 3 NT A#- 1/2 voff=0 +BA 13 CR 1+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 13 CR 1+1/2 TR 3 CH 3 NT F- 1/2 voff=0 +BA 13 CR 2 TR 3 CH 3 NT F- 1/2 voff=0 +BA 13 CR 2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 13 CR 2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 13 CR 2+1/2 TR 
3 CH 3 NT F- 1/2 voff=0 +BA 13 CR 2+1/2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 13 CR 2+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 13 CR 3 TR 3 CH 3 NT F- 1/2 voff=0 +BA 13 CR 3 TR 3 CH 3 NT A- 1/2 voff=0 +BA 13 CR 3 TR 3 CH 3 NT C- 1/2 voff=0 +BA 13 CR 3+1/2 TR 3 CH 3 NT F- 1/2 voff=0 +BA 13 CR 3+1/2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 13 CR 3+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 14 CR 0 TR 3 CH 3 NT A#- 1/2 voff=0 +BA 14 CR 0 TR 3 CH 3 NT D- 1/2 voff=0 +BA 14 CR 0 TR 3 CH 3 NT F- 1/2 voff=0 +BA 14 CR 1/2 TR 3 CH 3 NT A#- 1/2 voff=0 +BA 14 CR 1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 14 CR 1/2 TR 3 CH 3 NT F- 1/2 voff=0 +BA 14 CR 1 TR 3 CH 3 NT C- 1/2 voff=0 +BA 14 CR 1 TR 3 CH 3 NT E- 1/2 voff=0 +BA 14 CR 1 TR 3 CH 3 NT G- 1/2 voff=0 +BA 14 CR 1+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 14 CR 1+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 14 CR 1+1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 14 CR 2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 14 CR 2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 14 CR 2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 14 CR 2+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 14 CR 2+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 14 CR 2+1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 14 CR 3 TR 3 CH 3 NT C- 1/2 voff=0 +BA 14 CR 3 TR 3 CH 3 NT E- 1/2 voff=0 +BA 14 CR 3 TR 3 CH 3 NT G- 1/2 voff=0 +BA 14 CR 3+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 14 CR 3+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 14 CR 3+1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 15 CR 0 TR 3 CH 3 NT E- 1/2 voff=0 +BA 15 CR 0 TR 3 CH 3 NT G- 1/2 voff=0 +BA 15 CR 0 TR 3 CH 3 NT B- 1/2 voff=0 +BA 15 CR 1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 15 CR 1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 15 CR 1/2 TR 3 CH 3 NT B- 1/2 voff=0 +BA 15 CR 1 TR 3 CH 3 NT A#- 1/2 voff=0 +BA 15 CR 1 TR 3 CH 3 NT D- 1/2 voff=0 +BA 15 CR 1 TR 3 CH 3 NT F- 1/2 voff=0 +BA 15 CR 1+1/2 TR 3 CH 3 NT A#- 1/2 voff=0 +BA 15 CR 1+1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 15 CR 1+1/2 TR 3 CH 3 NT F- 1/2 voff=0 +BA 15 CR 2 TR 3 CH 3 NT F- 1/2 voff=0 +BA 15 CR 2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 15 CR 2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 15 CR 2+1/2 TR 3 CH 3 NT F- 1/2 voff=0 +BA 15 CR 2+1/2 
TR 3 CH 3 NT A- 1/2 voff=0 +BA 15 CR 2+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 15 CR 3 TR 3 CH 3 NT F- 1/2 voff=0 +BA 15 CR 3 TR 3 CH 3 NT A- 1/2 voff=0 +BA 15 CR 3 TR 3 CH 3 NT C- 1/2 voff=0 +BA 15 CR 3+1/2 TR 3 CH 3 NT F- 1/2 voff=0 +BA 15 CR 3+1/2 TR 3 CH 3 NT A- 1/2 voff=0 +BA 15 CR 3+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 16 CR 0 TR 3 CH 3 NT A#- 1/2 voff=0 +BA 16 CR 0 TR 3 CH 3 NT D- 1/2 voff=0 +BA 16 CR 0 TR 3 CH 3 NT F- 1/2 voff=0 +BA 16 CR 1/2 TR 3 CH 3 NT A#- 1/2 voff=0 +BA 16 CR 1/2 TR 3 CH 3 NT D- 1/2 voff=0 +BA 16 CR 1/2 TR 3 CH 3 NT F- 1/2 voff=0 +BA 16 CR 1 TR 3 CH 3 NT C- 1/2 voff=0 +BA 16 CR 1 TR 3 CH 3 NT E- 1/2 voff=0 +BA 16 CR 1 TR 3 CH 3 NT G- 1/2 voff=0 +BA 16 CR 1+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 16 CR 1+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 16 CR 1+1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 16 CR 2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 16 CR 2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 16 CR 2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 16 CR 2+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 16 CR 2+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 16 CR 2+1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 16 CR 3 TR 3 CH 3 NT C- 1/2 voff=0 +BA 16 CR 3 TR 3 CH 3 NT E- 1/2 voff=0 +BA 16 CR 3 TR 3 CH 3 NT G- 1/2 voff=0 +BA 16 CR 3+1/2 TR 3 CH 3 NT C- 1/2 voff=0 +BA 16 CR 3+1/2 TR 3 CH 3 NT E- 1/2 voff=0 +BA 16 CR 3+1/2 TR 3 CH 3 NT G- 1/2 voff=0 +BA 37 CR 0 TR 0 CH 16 End of track +Don ee loves the runn er. + +Don ee loves the runn er. +Black ye nur tures Don ice. +The choos er is Skye an. +The oth er writes on Whist ee. + +Dix ney finds the test er. +Le ard finds the spell er. +The gramm ar is Dix ler. +All ye moves to Duk ess. + +Le ice makes one. +Le ard is the gramm ar. +Don ice is Duk ald. +The spell er loves Le ney. + +Count el makes the choos er. +Luc ye reads on the oth er. +The spell er nur tures Le an. +Black ard is Duk ur. 
+ + diff --git a/mindreader/file.txt b/mindreader/file.txt new file mode 100644 index 0000000000000000000000000000000000000000..c365b9bbd551d5027885fba7f6a97d3b65a1f457 --- /dev/null +++ b/mindreader/file.txt @@ -0,0 +1,855 @@ +[trace] ?- caw00(off,[n,f],[[[n,append],2,1],[[n,delete],2,1],[[n,head],1,1],[[n,tail],1,1],[[n,member],1,1]],2,8,_InputVarList,_OutputVarList,[],_Program2,Ps). + Call: (8) caw00(off, [n, f], [[[n, append], 2, 1], [[n, delete], 2, 1], [[n, head], 1, 1], [[n, tail], 1, 1], [[n, member], 1, 1]], 2, 8, _16552, _16554, [], _16558, _16560) + Unify: (8) caw00(off, [n, f], [[[n, append], 2, 1], [[n, delete], 2, 1], [[n, head], 1, 1], [[n, tail], 1, 1], [[n, member], 1, 1]], 2, 8, _16552, _16554, [], _16558, _16560) + Call: (9) repeat + Exit: (9) repeat + Call: (9) randvars(2, 2, [], _17132) + Unify: (9) randvars(2, 2, [], _17132) + Call: (10) random:random(_17126) + Unify: (10) random:random(_17126) + Call: (11) _17126 is random_float + Exit: (11) 0.3814023340532047 is random_float + Exit: (10) random:random(0.3814023340532047) + Call: (10) _17148 is round(97+0.3814023340532047*2) + Exit: (10) 98 is round(97+0.3814023340532047*2) + Call: (10) char_code(_17148, 98) + Exit: (10) char_code(b, 98) + Call: (10) _17160=[v, b] + Exit: (10) [v, b]=[v, b] + Call: (10) lists:member([v, b], []) + Fail: (10) lists:member([v, b], []) + Redo: (9) randvars(2, 2, [], _17166) + Call: (10) lists:append([], [[v, b]], _17170) + Unify: (10) lists:append([], [[v, b]], [[v, b]]) + Exit: (10) lists:append([], [[v, b]], [[v, b]]) + Call: (10) _17172 is 2+ -1 + Exit: (10) 1 is 2+ -1 + Call: (10) randvars(1, 2, [[v, b]], _17178) + Unify: (10) randvars(1, 2, [[v, b]], _17178) + Call: (11) random:random(_17172) + Unify: (11) random:random(_17172) + Call: (12) _17172 is random_float + Exit: (12) 0.8758702925960795 is random_float + Exit: (11) random:random(0.8758702925960795) + Call: (11) _17194 is round(97+0.8758702925960795*2) + Exit: (11) 99 is round(97+0.8758702925960795*2) 
+ Call: (11) char_code(_17194, 99) + Exit: (11) char_code(c, 99) + Call: (11) _17206=[v, c] + Exit: (11) [v, c]=[v, c] + Call: (11) lists:member([v, c], [[v, b]]) + Unify: (11) lists:member([v, c], [[v, b]]) + Fail: (11) lists:member([v, c], [[v, b]]) + Redo: (10) randvars(1, 2, [[v, b]], _17212) + Call: (11) lists:append([[v, b]], [[v, c]], _17216) + Unify: (11) lists:append([[v, b]], [[v, c]], [[v, b]|_17200]) + Exit: (11) lists:append([[v, b]], [[v, c]], [[v, b], [v, c]]) + Call: (11) _17224 is 1+ -1 + Exit: (11) 0 is 1+ -1 + Call: (11) randvars(0, 2, [[v, b], [v, c]], _17230) + Unify: (11) randvars(0, 2, [[v, b], [v, c]], [[v, b], [v, c]]) + Exit: (11) randvars(0, 2, [[v, b], [v, c]], [[v, b], [v, c]]) + Exit: (10) randvars(1, 2, [[v, b]], [[v, b], [v, c]]) + Exit: (9) randvars(2, 2, [], [[v, b], [v, c]]) + Call: (9) populatevars([[v, b], [v, c]], 2, [], _17230) + Unify: (9) populatevars([[v, b], [v, c]], 2, [], _17230) + Call: (10) randvars2(2, 2, [], _17230) + Unify: (10) randvars2(2, 2, [], _17230) + Call: (11) random:random(_17224) + Unify: (11) random:random(_17224) + Call: (12) _17224 is random_float + Exit: (12) 0.9905487027815254 is random_float + Exit: (11) random:random(0.9905487027815254) + Call: (11) _17246 is round(97+0.9905487027815254*2) + Exit: (11) 99 is round(97+0.9905487027815254*2) + Call: (11) char_code(_17246, 99) + Exit: (11) char_code(c, 99) + Call: (11) atom_string(c, _17248) + Exit: (11) atom_string(c, "c") + Call: (11) lists:member("c", []) + Fail: (11) lists:member("c", []) + Redo: (10) randvars2(2, 2, [], _17258) + Call: (11) lists:append([], ["c"], _17262) + Unify: (11) lists:append([], ["c"], ["c"]) + Exit: (11) lists:append([], ["c"], ["c"]) + Call: (11) _17264 is 2+ -1 + Exit: (11) 1 is 2+ -1 + Call: (11) randvars2(1, 2, ["c"], _17270) + Unify: (11) randvars2(1, 2, ["c"], _17270) + Call: (12) random:random(_17264) + Unify: (12) random:random(_17264) + Call: (13) _17264 is random_float + Exit: (13) 0.5970644307576888 is 
random_float + Exit: (12) random:random(0.5970644307576888) + Call: (12) _17286 is round(97+0.5970644307576888*2) + Exit: (12) 98 is round(97+0.5970644307576888*2) + Call: (12) char_code(_17286, 98) + Exit: (12) char_code(b, 98) + Call: (12) atom_string(b, _17288) + Exit: (12) atom_string(b, "b") + Call: (12) lists:member("b", ["c"]) + Unify: (12) lists:member("b", ["c"]) + Fail: (12) lists:member("b", ["c"]) + Redo: (11) randvars2(1, 2, ["c"], _17298) + Call: (12) lists:append(["c"], ["b"], _17302) + Unify: (12) lists:append(["c"], ["b"], ["c"|_17286]) + Exit: (12) lists:append(["c"], ["b"], ["c", "b"]) + Call: (12) _17310 is 1+ -1 + Exit: (12) 0 is 1+ -1 + Call: (12) randvars2(0, 2, ["c", "b"], _17316) + Unify: (12) randvars2(0, 2, ["c", "b"], ["c", "b"]) + Exit: (12) randvars2(0, 2, ["c", "b"], ["c", "b"]) + Exit: (11) randvars2(1, 2, ["c"], ["c", "b"]) + Exit: (10) randvars2(2, 2, [], ["c", "b"]) + Call: (10) lists:append([], [[[v, b], ["c", "b"]]], _17332) + Unify: (10) lists:append([], [[[v, b], ["c", "b"]]], [[[v, b], ["c", "b"]]]) + Exit: (10) lists:append([], [[[v, b], ["c", "b"]]], [[[v, b], ["c", "b"]]]) + Call: (10) populatevars([[v, c]], 2, [[[v, b], ["c", "b"]]], _17334) + Unify: (10) populatevars([[v, c]], 2, [[[v, b], ["c", "b"]]], _17334) + Call: (11) randvars2(2, 2, [], _17334) + Unify: (11) randvars2(2, 2, [], _17334) + Call: (12) random:random(_17328) + Unify: (12) random:random(_17328) + Call: (13) _17328 is random_float + Exit: (13) 0.08156498097205207 is random_float + Exit: (12) random:random(0.08156498097205207) + Call: (12) _17350 is round(97+0.08156498097205207*2) + Exit: (12) 97 is round(97+0.08156498097205207*2) + Call: (12) char_code(_17350, 97) + Exit: (12) char_code(a, 97) + Call: (12) atom_string(a, _17352) + Exit: (12) atom_string(a, "a") + Call: (12) lists:member("a", []) + Fail: (12) lists:member("a", []) + Redo: (11) randvars2(2, 2, [], _17362) + Call: (12) lists:append([], ["a"], _17366) + Unify: (12) lists:append([], ["a"], 
["a"]) + Exit: (12) lists:append([], ["a"], ["a"]) + Call: (12) _17368 is 2+ -1 + Exit: (12) 1 is 2+ -1 + Call: (12) randvars2(1, 2, ["a"], _17374) + Unify: (12) randvars2(1, 2, ["a"], _17374) + Call: (13) random:random(_17368) + Unify: (13) random:random(_17368) + Call: (14) _17368 is random_float + Exit: (14) 0.0934625988191671 is random_float + Exit: (13) random:random(0.0934625988191671) + Call: (13) _17390 is round(97+0.0934625988191671*2) + Exit: (13) 97 is round(97+0.0934625988191671*2) + Call: (13) char_code(_17390, 97) + Exit: (13) char_code(a, 97) + Call: (13) atom_string(a, _17392) + Exit: (13) atom_string(a, "a") + Call: (13) lists:member("a", ["a"]) + Unify: (13) lists:member("a", ["a"]) + Exit: (13) lists:member("a", ["a"]) + Call: (13) randvars2(1, 2, ["a"], _17402) + Unify: (13) randvars2(1, 2, ["a"], _17402) + Call: (14) random:random(_17396) + Unify: (14) random:random(_17396) + Call: (15) _17396 is random_float + Exit: (15) 0.10844423614956468 is random_float + Exit: (14) random:random(0.10844423614956468) + Call: (14) _17418 is round(97+0.10844423614956468*2) + Exit: (14) 97 is round(97+0.10844423614956468*2) + Call: (14) char_code(_17418, 97) + Exit: (14) char_code(a, 97) + Call: (14) atom_string(a, _17420) + Exit: (14) atom_string(a, "a") + Call: (14) lists:member("a", ["a"]) + Unify: (14) lists:member("a", ["a"]) + Exit: (14) lists:member("a", ["a"]) + Call: (14) randvars2(1, 2, ["a"], _17430) + Unify: (14) randvars2(1, 2, ["a"], _17430) + Call: (15) random:random(_17424) + Unify: (15) random:random(_17424) + Call: (16) _17424 is random_float + Exit: (16) 0.15626511108000463 is random_float + Exit: (15) random:random(0.15626511108000463) + Call: (15) _17446 is round(97+0.15626511108000463*2) + Exit: (15) 97 is round(97+0.15626511108000463*2) + Call: (15) char_code(_17446, 97) + Exit: (15) char_code(a, 97) + Call: (15) atom_string(a, _17448) + Exit: (15) atom_string(a, "a") + Call: (15) lists:member("a", ["a"]) + Unify: (15) lists:member("a", 
["a"]) + Exit: (15) lists:member("a", ["a"]) + Call: (15) randvars2(1, 2, ["a"], _17458) + Unify: (15) randvars2(1, 2, ["a"], _17458) + Call: (16) random:random(_17452) + Unify: (16) random:random(_17452) + Call: (17) _17452 is random_float + Exit: (17) 0.23425412348298377 is random_float + Exit: (16) random:random(0.23425412348298377) + Call: (16) _17474 is round(97+0.23425412348298377*2) + Exit: (16) 97 is round(97+0.23425412348298377*2) + Call: (16) char_code(_17474, 97) + Exit: (16) char_code(a, 97) + Call: (16) atom_string(a, _17476) + Exit: (16) atom_string(a, "a") + Call: (16) lists:member("a", ["a"]) + Unify: (16) lists:member("a", ["a"]) + Exit: (16) lists:member("a", ["a"]) + Call: (16) randvars2(1, 2, ["a"], _17486) + Unify: (16) randvars2(1, 2, ["a"], _17486) + Call: (17) random:random(_17480) + Unify: (17) random:random(_17480) + Call: (18) _17480 is random_float + Exit: (18) 0.2395771174504641 is random_float + Exit: (17) random:random(0.2395771174504641) + Call: (17) _17502 is round(97+0.2395771174504641*2) + Exit: (17) 97 is round(97+0.2395771174504641*2) + Call: (17) char_code(_17502, 97) + Exit: (17) char_code(a, 97) + Call: (17) atom_string(a, _17504) + Exit: (17) atom_string(a, "a") + Call: (17) lists:member("a", ["a"]) + Unify: (17) lists:member("a", ["a"]) + Exit: (17) lists:member("a", ["a"]) + Call: (17) randvars2(1, 2, ["a"], _17514) + Unify: (17) randvars2(1, 2, ["a"], _17514) + Call: (18) random:random(_17508) + Unify: (18) random:random(_17508) + Call: (19) _17508 is random_float + Exit: (19) 0.5822983868661542 is random_float + Exit: (18) random:random(0.5822983868661542) + Call: (18) _17530 is round(97+0.5822983868661542*2) + Exit: (18) 98 is round(97+0.5822983868661542*2) + Call: (18) char_code(_17530, 98) + Exit: (18) char_code(b, 98) + Call: (18) atom_string(b, _17532) + Exit: (18) atom_string(b, "b") + Call: (18) lists:member("b", ["a"]) + Unify: (18) lists:member("b", ["a"]) + Fail: (18) lists:member("b", ["a"]) + Redo: (17) 
randvars2(1, 2, ["a"], _17542) + Call: (18) lists:append(["a"], ["b"], _17546) + Unify: (18) lists:append(["a"], ["b"], ["a"|_17530]) + Exit: (18) lists:append(["a"], ["b"], ["a", "b"]) + Call: (18) _17554 is 1+ -1 + Exit: (18) 0 is 1+ -1 + Call: (18) randvars2(0, 2, ["a", "b"], _17560) + Unify: (18) randvars2(0, 2, ["a", "b"], ["a", "b"]) + Exit: (18) randvars2(0, 2, ["a", "b"], ["a", "b"]) + Exit: (17) randvars2(1, 2, ["a"], ["a", "b"]) + Exit: (16) randvars2(1, 2, ["a"], ["a", "b"]) + Exit: (15) randvars2(1, 2, ["a"], ["a", "b"]) + Exit: (14) randvars2(1, 2, ["a"], ["a", "b"]) + Exit: (13) randvars2(1, 2, ["a"], ["a", "b"]) + Exit: (12) randvars2(1, 2, ["a"], ["a", "b"]) + Exit: (11) randvars2(2, 2, [], ["a", "b"]) + Call: (11) lists:append([[[v, b], ["c", "b"]]], [[[v, c], ["a", "b"]]], _17576) + Unify: (11) lists:append([[[v, b], ["c", "b"]]], [[[v, c], ["a", "b"]]], [[[v, b], ["c", "b"]]|_17560]) + Exit: (11) lists:append([[[v, b], ["c", "b"]]], [[[v, c], ["a", "b"]]], [[[v, b], ["c", "b"]], [[v, c], ["a", "b"]]]) + Call: (11) populatevars([], 2, [[[v, b], ["c", "b"]], [[v, c], ["a", "b"]]], _17584) + Unify: (11) populatevars([], 2, [[[v, b], ["c", "b"]], [[v, c], ["a", "b"]]], [[[v, b], ["c", "b"]], [[v, c], ["a", "b"]]]) + Exit: (11) populatevars([], 2, [[[v, b], ["c", "b"]], [[v, c], ["a", "b"]]], [[[v, b], ["c", "b"]], [[v, c], ["a", "b"]]]) + Exit: (10) populatevars([[v, c]], 2, [[[v, b], ["c", "b"]]], [[[v, b], ["c", "b"]], [[v, c], ["a", "b"]]]) + Exit: (9) populatevars([[v, b], [v, c]], 2, [], [[[v, b], ["c", "b"]], [[v, c], ["a", "b"]]]) + Call: (9) _17590 is 2+1+97 + Exit: (9) 100 is 2+1+97 + Call: (9) char_code(_17590, 100) + Exit: (9) char_code(d, 100) + Call: (9) _17620=[[[v, d], 1]] + Exit: (9) [[[v, d], 1]]=[[[v, d], 1]] +^ Call: (9) retractall(debug(_17606)) +^ Exit: (9) retractall(debug(_17606)) +^ Call: (9) assertz(debug(off)) +^ Exit: (9) assertz(debug(off)) +^ Call: (9) retractall(totalvars(_17614)) +^ Exit: (9) 
retractall(totalvars(_17614)) +^ Call: (9) assertz(totalvars(8)) +^ Exit: (9) assertz(totalvars(8)) + Call: (9) caw0([n, f], [[[n, append], 2, 1], [[n, delete], 2, 1], [[n, head], 1, 1], [[n, tail], 1, 1], [[n, member], 1, 1]], 2, [[[v, b], ["c", "b"]], [[v, c], ["a", "b"]]], [[[v, d], 1]], [], _17648, _17650) + Unify: (9) caw0([n, f], [[[n, append], 2, 1], [[n, delete], 2, 1], [[n, head], 1, 1], [[n, tail], 1, 1], [[n, member], 1, 1]], 2, [[[v, b], ["c", "b"]], [[v, c], ["a", "b"]]], [[[v, d], 1]], [], _17648, _17650) + Call: (10) varnames([[[v, b], ["c", "b"]], [[v, c], ["a", "b"]]], [], _17640, [], _17644) + Unify: (10) varnames([[[v, b], ["c", "b"]], [[v, c], ["a", "b"]]], [], _17640, [], _17644) + Call: (11) [[[v, b], ["c", "b"]], [[v, c], ["a", "b"]]]=[_17622|_17624] + Exit: (11) [[[v, b], ["c", "b"]], [[v, c], ["a", "b"]]]=[[[v, b], ["c", "b"]], [[v, c], ["a", "b"]]] + Call: (11) [[v, b], ["c", "b"]]=[_17628, _17634] + Exit: (11) [[v, b], ["c", "b"]]=[[v, b], ["c", "b"]] + Call: (11) lists:append([], [[v, b]], _17664) + Unify: (11) lists:append([], [[v, b]], [[v, b]]) + Exit: (11) lists:append([], [[v, b]], [[v, b]]) + Call: (11) lists:append([], [["c", "b"]], _17670) + Unify: (11) lists:append([], [["c", "b"]], [["c", "b"]]) + Exit: (11) lists:append([], [["c", "b"]], [["c", "b"]]) + Call: (11) varnames([[[v, c], ["a", "b"]]], [[v, b]], _17670, [["c", "b"]], _17674) + Unify: (11) varnames([[[v, c], ["a", "b"]]], [[v, b]], _17670, [["c", "b"]], _17674) + Call: (12) [[[v, c], ["a", "b"]]]=[_17652|_17654] + Exit: (12) [[[v, c], ["a", "b"]]]=[[[v, c], ["a", "b"]]] + Call: (12) [[v, c], ["a", "b"]]=[_17658, _17664] + Exit: (12) [[v, c], ["a", "b"]]=[[v, c], ["a", "b"]] + Call: (12) lists:append([[v, b]], [[v, c]], _17694) + Unify: (12) lists:append([[v, b]], [[v, c]], [[v, b]|_17678]) + Exit: (12) lists:append([[v, b]], [[v, c]], [[v, b], [v, c]]) + Call: (12) lists:append([["c", "b"]], [["a", "b"]], _17706) + Unify: (12) lists:append([["c", "b"]], [["a", "b"]], 
[["c", "b"]|_17690]) + Exit: (12) lists:append([["c", "b"]], [["a", "b"]], [["c", "b"], ["a", "b"]]) + Call: (12) varnames([], [[v, b], [v, c]], _17712, [["c", "b"], ["a", "b"]], _17716) + Unify: (12) varnames([], [[v, b], [v, c]], [[v, b], [v, c]], [["c", "b"], ["a", "b"]], [["c", "b"], ["a", "b"]]) + Exit: (12) varnames([], [[v, b], [v, c]], [[v, b], [v, c]], [["c", "b"], ["a", "b"]], [["c", "b"], ["a", "b"]]) + Exit: (11) varnames([[[v, c], ["a", "b"]]], [[v, b]], [[v, b], [v, c]], [["c", "b"]], [["c", "b"], ["a", "b"]]) + Exit: (10) varnames([[[v, b], ["c", "b"]], [[v, c], ["a", "b"]]], [], [[v, b], [v, c]], [], [["c", "b"], ["a", "b"]]) + Call: (10) varnames([[[v, d], 1]], [], _17712, [], _17716) + Unify: (10) varnames([[[v, d], 1]], [], _17712, [], _17716) + Call: (11) [[[v, d], 1]]=[_17694|_17696] + Exit: (11) [[[v, d], 1]]=[[[v, d], 1]] + Call: (11) [[v, d], 1]=[_17700, _17706] + Exit: (11) [[v, d], 1]=[[v, d], 1] + Call: (11) lists:append([], [[v, d]], _17736) + Unify: (11) lists:append([], [[v, d]], [[v, d]]) + Exit: (11) lists:append([], [[v, d]], [[v, d]]) + Call: (11) lists:append([], [1], _17742) + Unify: (11) lists:append([], [1], [1]) + Exit: (11) lists:append([], [1], [1]) + Call: (11) varnames([], [[v, d]], _17742, [1], _17746) + Unify: (11) varnames([], [[v, d]], [[v, d]], [1], [1]) + Exit: (11) varnames([], [[v, d]], [[v, d]], [1], [1]) + Exit: (10) varnames([[[v, d], 1]], [], [[v, d]], [], [1]) +^ Call: (10) retractall(outputvars(_17724)) +^ Exit: (10) retractall(outputvars(_17724)) +^ Call: (10) assertz(outputvars([[v, d]])) +^ Exit: (10) assertz(outputvars([[v, d]])) + Call: (10) lists:append([[v, b], [v, c]], [[v, d]], _17750) + Unify: (10) lists:append([[v, b], [v, c]], [[v, d]], [[v, b]|_17734]) + Exit: (10) lists:append([[v, b], [v, c]], [[v, d]], [[v, b], [v, c], [v, d]]) + Call: (10) lists:append([["c", "b"], ["a", "b"]], [[v, d]], _17762) + Unify: (10) lists:append([["c", "b"], ["a", "b"]], [[v, d]], [["c", "b"]|_17746]) + Exit: (10) 
lists:append([["c", "b"], ["a", "b"]], [[v, d]], [["c", "b"], ["a", "b"], [v, d]]) + Call: (10) _17782=[[n, f], [["c", "b"], ["a", "b"], [v, d]]] + Exit: (10) [[n, f], [["c", "b"], ["a", "b"], [v, d]]]=[[n, f], [["c", "b"], ["a", "b"], [v, d]]] + Call: (10) caw1([[n, f], [["c", "b"], ["a", "b"], [v, d]]], [n, f], [[[n, append], 2, 1], [[n, delete], 2, 1], [[n, head], 1, 1], [[n, tail], 1, 1], [[n, member], 1, 1]], 2, [[v, b], [v, c], [v, d]], [[v, b], [v, c]], [[v, b], [v, c]], _17796, [[[v, d], 1]], [[v, d]], [], _17804, [], _17808) + Unify: (10) caw1([[n, f], [["c", "b"], ["a", "b"], [v, d]]], [n, f], [[[n, append], 2, 1], [[n, delete], 2, 1], [[n, head], 1, 1], [[n, tail], 1, 1], [[n, member], 1, 1]], 2, [[v, b], [v, c], [v, d]], [[v, b], [v, c]], [[v, b], [v, c]], _17796, [[[v, d], 1]], [[v, d]], [], _17804, [], _17808) + Call: (11) _17788 is 2+ -1 + Exit: (11) 1 is 2+ -1 + Call: (11) addrules0([[v, b], [v, c]], [[v, d]], [[v, d]], [], _17796) + Unify: (11) addrules0([[v, b], [v, c]], [[v, d]], [[v, d]], [], _17796) + Call: (12) [[v, d]]=[_17774|_17776] + Exit: (12) [[v, d]]=[[v, d]] + Call: (12) random:random_member(_17794, [[v, b], [v, c]]) + Unify: (12) random:random_member(_17794, [[v, b], [v, c]]) + Call: (13) error:must_be(list, [[v, b], [v, c]]) + Unify: (13) error:must_be(list, [[v, b], [v, c]]) + Exit: (13) error:must_be(list, [[v, b], [v, c]]) + Call: (13) length([[v, b], [v, c]], _17796) + Unify: (13) length([[v, b], [v, c]], _17796) + Exit: (13) length([[v, b], [v, c]], 2) + Call: (13) 2>0 + Exit: (13) 2>0 + Call: (13) _17798 is random(2) + Exit: (13) 0 is random(2) + Call: (13) lists:nth0(0, [[v, b], [v, c]], _17802) + Unify: (13) lists:nth0(0, [[v, b], [v, c]], _17802) + Exit: (13) lists:nth0(0, [[v, b], [v, c]], [v, b]) + Exit: (12) random:random_member([v, b], [[v, b], [v, c]]) + Call: (12) random:random_member([v, d], [[v, d]]) + Unify: (12) random:random_member([v, d], [[v, d]]) + Call: (13) error:must_be(list, [[v, d]]) + Unify: (13) 
error:must_be(list, [[v, d]]) + Exit: (13) error:must_be(list, [[v, d]]) + Call: (13) length([[v, d]], _17800) + Unify: (13) length([[v, d]], _17800) + Exit: (13) length([[v, d]], 1) + Call: (13) 1>0 + Exit: (13) 1>0 + Call: (13) _17802 is random(1) + Exit: (13) 0 is random(1) + Call: (13) lists:nth0(0, [[v, d]], [v, d]) + Unify: (13) lists:nth0(0, [[v, d]], [v, d]) + Exit: (13) lists:nth0(0, [[v, d]], [v, d]) + Exit: (12) random:random_member([v, d], [[v, d]]) + Call: (12) lists:append([], [[[n, =], [[v, d], [v, b]]]], _17848) + Unify: (12) lists:append([], [[[n, =], [[v, d], [v, b]]]], [[[n, =], [[v, d], [v, b]]]]) + Exit: (12) lists:append([], [[[n, =], [[v, d], [v, b]]]], [[[n, =], [[v, d], [v, b]]]]) + Call: (12) addrules0([[v, b], [v, c]], [[v, d]], [], [[[n, =], [[v, d], [v, b]]]], _17852) + Unify: (12) addrules0([[v, b], [v, c]], [[v, d]], [], [[[n, =], [[v, d], [v, b]]]], [[[n, =], [[v, d], [v, b]]]]) + Exit: (12) addrules0([[v, b], [v, c]], [[v, d]], [], [[[n, =], [[v, d], [v, b]]]], [[[n, =], [[v, d], [v, b]]]]) + Exit: (11) addrules0([[v, b], [v, c]], [[v, d]], [[v, d]], [], [[[n, =], [[v, d], [v, b]]]]) + Call: (11) lists:append([], [[[n, =], [[v, d], [v, b]]]], _17848) + Unify: (11) lists:append([], [[[n, =], [[v, d], [v, b]]]], [[[n, =], [[v, d], [v, b]]]]) + Exit: (11) lists:append([], [[[n, =], [[v, d], [v, b]]]], [[[n, =], [[v, d], [v, b]]]]) + Call: (11) lists:append([[v, b], [v, c]], [[v, d]], _17848) + Unify: (11) lists:append([[v, b], [v, c]], [[v, d]], [[v, b]|_17832]) + Exit: (11) lists:append([[v, b], [v, c]], [[v, d]], [[v, b], [v, c], [v, d]]) + Call: (11) _17892=[[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]] + Exit: (11) [[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]]=[[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]] + Call: (11) debug(_17892) + Unify: (11) debug(off) + Exit: (11) debug(off) + Call: (11) interpret(off, [[n, f], [["c", "b"], ["a", "b"], [v, d]]], [[[n, f], [[v, b], 
[v, c], [v, d]], ":-", [[[n|...], [...|...]]]]], _17898) + Unify: (11) interpret(off, [[n, f], [["c", "b"], ["a", "b"], [v, d]]], [[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]], _17898) + Call: (12) convert_to_grammar_part1([[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]], [], _17896) + Unify: (12) convert_to_grammar_part1([[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]], [], _17896) + Call: (13) convert_to_grammar_part11([[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]], [], _17896, [], _17900) + Unify: (13) convert_to_grammar_part11([[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]], [], _17896, [], _17900) + Call: (14) [[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]]=[_17878|_17880] + Exit: (14) [[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]]=[[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]] + Call: (14) [[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n, =], [[...|...]|...]]]]=[[n, _17896], _17902, "->", _17920] + Fail: (14) [[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n, =], [[...|...]|...]]]]=[[n, _17896], _17902, "->", _17920] + Redo: (13) convert_to_grammar_part11([[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]], [], _17902, [], _17906) + Call: (14) [[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n, =], [[...|...]|...]]]]=[[n, _17896], "->", _17914] + Fail: (14) [[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n, =], [[...|...]|...]]]]=[[n, _17896], "->", _17914] + Redo: (13) convert_to_grammar_part11([[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]], [], _17896, [], _17900) + Unify: (13) convert_to_grammar_part11([[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]], [], _17896, [], _17900) + Call: (14) [[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]]=[_17878|_17880] + Exit: (14) [[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]]=[[[n, f], [[v, b], 
[v, c], [v, d]], ":-", [[[n|...], [...|...]]]]] + Call: (14) [[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n, =], [[...|...]|...]]]]=[_17884, _17890, ":-", _17908] + Exit: (14) [[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n, =], [[...|...]|...]]]]=[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n, =], [[...|...]|...]]]] + Call: (14) true + Exit: (14) true + Call: (14) true + Exit: (14) true + Call: (14) true + Exit: (14) true + Call: (14) true + Exit: (14) true + Call: (14) lists:append([], [[[n, f], [[v, b], [v, c], [v|...]], ":-", [[[...|...]|...]]]], _17938) + Unify: (14) lists:append([], [[[n, f], [[v, b], [v, c], [v|...]], ":-", [[[...|...]|...]]]], [[[n, f], [[v, b], [v, c], [v|...]], ":-", [[[...|...]|...]]]]) + Exit: (14) lists:append([], [[[n, f], [[v, b], [v, c], [v|...]], ":-", [[[...|...]|...]]]], [[[n, f], [[v, b], [v, c], [v|...]], ":-", [[[...|...]|...]]]]) + Call: (14) convert_to_grammar_part11([], [[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]], _17938, [], _17942) + Unify: (14) convert_to_grammar_part11([], [[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]], [], []) + Exit: (14) convert_to_grammar_part11([], [[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]], [], []) + Exit: (13) convert_to_grammar_part11([[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]], [], [[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]], [], []) + Call: (13) lists:append([[[n, f], [[v, b], [v, c], [v|...]], ":-", [[[...|...]|...]]]], [], _17938) + Unify: (13) lists:append([[[n, f], [[v, b], [v, c], [v|...]], ":-", [[[...|...]|...]]]], [], [[[n, f], [[v, b], [v, c], [v|...]], ":-", [[[...|...]|...]]]|_17922]) + Exit: (13) lists:append([[[n, f], [[v, b], [v, c], [v|...]], ":-", [[[...|...]|...]]]], [], [[[n, f], [[v, b], [v, c], [v|...]], ":-", [[[...|...]|...]]]]) + 
Exit: (12) convert_to_grammar_part1([[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]], [], [[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]]) + Call: (12) interpret1(off, [[n, f], [["c", "b"], ["a", "b"], [v, d]]], [[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]], _17948) + Unify: (12) interpret1(off, [[n, f], [["c", "b"], ["a", "b"], [v, d]]], [[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]], _17948) +^ Call: (13) retractall(debug(_17926)) +^ Exit: (13) retractall(debug(_17926)) +^ Call: (13) assertz(debug(off)) +^ Exit: (13) assertz(debug(off)) +^ Call: (13) retractall(cut(_17934)) +^ Exit: (13) retractall(cut(_17934)) +^ Call: (13) assertz(cut(off)) +^ Exit: (13) assertz(cut(off)) + Call: (13) member1([[n, f], [["c", "b"], ["a", "b"], [v, d]]], [[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]], _17962) + Unify: (13) member1([[n, f], [["c", "b"], ["a", "b"], [v, d]]], [[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]], _17962) + Call: (14) cut(off) + Unify: (14) cut(off) + Exit: (14) cut(off) + Call: (14) [[n, f], [["c", "b"], ["a", "b"], [v, d]]]=[_17942, _17948] + Exit: (14) [[n, f], [["c", "b"], ["a", "b"], [v, d]]]=[[n, f], [["c", "b"], ["a", "b"], [v, d]]] + Call: (14) [[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]]=[[[n, f], _17966, ":-", _17984]|_17956] + Exit: (14) [[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]]=[[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]] + Call: (14) length([["c", "b"], ["a", "b"], [v, d]], _18006) + Unify: (14) length([["c", "b"], ["a", "b"], [v, d]], _18006) + Exit: (14) length([["c", "b"], ["a", "b"], [v, 
d]], 3) + Call: (14) length([[v, b], [v, c], [v, d]], 3) + Unify: (14) length([[v, b], [v, c], [v, d]], 3) + Unify: (14) length([[v, b], [v, c], [v, d]], 3) + Exit: (14) length([[v, b], [v, c], [v, d]], 3) + Call: (14) [n, f]=[n, grammar] + Fail: (14) [n, f]=[n, grammar] + Redo: (13) member1([[n, f], [["c", "b"], ["a", "b"], [v, d]]], [[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]], _18010) + Call: (14) [n, f]=[n, grammar_part] + Fail: (14) [n, f]=[n, grammar_part] + Redo: (13) member1([[n, f], [["c", "b"], ["a", "b"], [v, d]]], [[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]], [[[n, f], [[v, b], [v, c], [v, d]], ":-", [[[n|...], [...|...]]]]], _18010) + Call: (14) checkarguments([["c", "b"], ["a", "b"], [v, d]], [[v, b], [v, c], [v, d]], [], _18010, [], _18014) + Unify: (14) checkarguments([["c", "b"], ["a", "b"], [v, d]], [[v, b], [v, c], [v, d]], [], _18010, [], _18014) + Call: (15) [["c", "b"], ["a", "b"], [v, d]]=[_17990|_17992] + Exit: (15) [["c", "b"], ["a", "b"], [v, d]]=[["c", "b"], ["a", "b"], [v, d]] + Call: (15) expressionnotatom(["c", "b"]) + Unify: (15) expressionnotatom(["c", "b"]) + Call: (16) isvalstrempty(["c", "b"]) + Unify: (16) isvalstrempty(["c", "b"]) + Call: (17) var(["c", "b"]) + Fail: (17) var(["c", "b"]) + Redo: (16) isvalstrempty(["c", "b"]) + Unify: (16) isvalstrempty(["c", "b"]) + Call: (17) isval(["c", "b"]) + Unify: (17) isval(["c", "b"]) + Fail: (17) isval(["c", "b"]) + Redo: (16) isvalstrempty(["c", "b"]) + Unify: (16) isvalstrempty(["c", "b"]) + Fail: (16) isvalstrempty(["c", "b"]) + Redo: (15) expressionnotatom(["c", "b"]) + Unify: (15) expressionnotatom(["c", "b"]) +^ Call: (16) not(atom(["c", "b"])) +^ Unify: (16) not(user:atom(["c", "b"])) +^ Exit: (16) not(user:atom(["c", "b"])) + Call: (16) length(["c", "b"], _18022) + Unify: (16) length(["c", "b"], _18022) + Exit: (16) length(["c", "b"], 2) + Call: (16) 2>=1 + Exit: (16) 
2>=1 + Call: (16) expressionnotatom2(["c", "b"]) + Unify: (16) expressionnotatom2(["c", "b"]) + Call: (17) isvalstrempty("c") + Unify: (17) isvalstrempty("c") + Call: (18) var("c") + Fail: (18) var("c") + Redo: (17) isvalstrempty("c") + Unify: (17) isvalstrempty("c") + Call: (18) isval("c") + Unify: (18) isval("c") + Fail: (18) isval("c") + Redo: (17) isvalstrempty("c") + Unify: (17) isvalstrempty("c") + Exit: (17) isvalstrempty("c") + Call: (17) expressionnotatom2(["b"]) + Unify: (17) expressionnotatom2(["b"]) + Call: (18) isvalstrempty("b") + Unify: (18) isvalstrempty("b") + Call: (19) var("b") + Fail: (19) var("b") + Redo: (18) isvalstrempty("b") + Unify: (18) isvalstrempty("b") + Call: (19) isval("b") + Unify: (19) isval("b") + Fail: (19) isval("b") + Redo: (18) isvalstrempty("b") + Unify: (18) isvalstrempty("b") + Exit: (18) isvalstrempty("b") + Call: (18) expressionnotatom2([]) + Unify: (18) expressionnotatom2([]) + Exit: (18) expressionnotatom2([]) + Exit: (17) expressionnotatom2(["b"]) + Exit: (16) expressionnotatom2(["c", "b"]) + Exit: (15) expressionnotatom(["c", "b"]) + Call: (15) [[v, b], [v, c], [v, d]]=[_18006|_18008] + Exit: (15) [[v, b], [v, c], [v, d]]=[[v, b], [v, c], [v, d]] +^ Call: (15) not(var([v, b])) +^ Unify: (15) not(user:var([v, b])) +^ Exit: (15) not(user:var([v, b])) + Call: (15) isvar([v, b]) + Unify: (15) isvar([v, b]) + Exit: (15) isvar([v, b]) + Call: (15) putvalue([v, b], ["c", "b"], [], _18042) + Unify: (15) putvalue([v, b], ["c", "b"], [], _18042) +^ Call: (16) not(isvar([v, b])) +^ Unify: (16) not(user:isvar([v, b])) + Call: (17) isvar([v, b]) + Unify: (17) isvar([v, b]) + Exit: (17) isvar([v, b]) +^ Fail: (16) not(user:isvar([v, b])) + Redo: (15) putvalue([v, b], ["c", "b"], [], _18042) + Call: (16) isvar([v, b]) + Unify: (16) isvar([v, b]) + Exit: (16) isvar([v, b]) + Call: (16) isvalstrorundef(["c", "b"]) + Unify: (16) isvalstrorundef(["c", "b"]) + Call: (17) var(["c", "b"]) + Fail: (17) var(["c", "b"]) + Redo: (16) 
isvalstrorundef(["c", "b"]) + Unify: (16) isvalstrorundef(["c", "b"]) +^ Call: (17) not(var(["c", "b"])) +^ Unify: (17) not(user:var(["c", "b"])) +^ Exit: (17) not(user:var(["c", "b"])) + Call: (17) isval(["c", "b"]) + Unify: (17) isval(["c", "b"]) + Fail: (17) isval(["c", "b"]) + Redo: (16) isvalstrorundef(["c", "b"]) + Unify: (16) isvalstrorundef(["c", "b"]) +^ Call: (17) not(var(["c", "b"])) +^ Unify: (17) not(user:var(["c", "b"])) +^ Exit: (17) not(user:var(["c", "b"])) + Call: (17) expression(["c", "b"]) + Unify: (17) expression(["c", "b"]) + Call: (18) isval(["c", "b"]) + Unify: (18) isval(["c", "b"]) + Fail: (18) isval(["c", "b"]) + Redo: (17) expression(["c", "b"]) + Unify: (17) expression(["c", "b"]) + Unify: (17) expression(["c", "b"]) + Unify: (17) expression(["c", "b"]) +^ Call: (18) not(atom(["c", "b"])) +^ Unify: (18) not(user:atom(["c", "b"])) +^ Exit: (18) not(user:atom(["c", "b"])) + Call: (18) length(["c", "b"], _18058) + Unify: (18) length(["c", "b"], _18058) + Exit: (18) length(["c", "b"], 2) + Call: (18) 2>=1 + Exit: (18) 2>=1 + Call: (18) expression2(["c", "b"]) + Unify: (18) expression2(["c", "b"]) + Call: (19) expression3("c") + Unify: (19) expression3("c") + Call: (20) isval("c") + Unify: (20) isval("c") + Fail: (20) isval("c") + Redo: (19) expression3("c") + Unify: (19) expression3("c") + Exit: (19) expression3("c") + Call: (19) expression2(["b"]) + Unify: (19) expression2(["b"]) + Call: (20) expression3("b") + Unify: (20) expression3("b") + Call: (21) isval("b") + Unify: (21) isval("b") + Fail: (21) isval("b") + Redo: (20) expression3("b") + Unify: (20) expression3("b") + Exit: (20) expression3("b") + Call: (20) expression2([]) + Unify: (20) expression2([]) + Exit: (20) expression2([]) + Exit: (19) expression2(["b"]) + Exit: (18) expression2(["c", "b"]) + Exit: (17) expression(["c", "b"]) + Exit: (16) isvalstrorundef(["c", "b"]) + Call: (16) updatevar([v, b], ["c", "b"], [], _18062) + Unify: (16) updatevar([v, b], ["c", "b"], [], _18062) 
+ Call: (17) lists:member([[v, b], empty], []) + Fail: (17) lists:member([[v, b], empty], []) + Redo: (16) updatevar([v, b], ["c", "b"], [], _18062) +^ Call: (17) not(member([[v, b], _18054], [])) +^ Unify: (17) not(user:member([[v, b], _18054], [])) +^ Exit: (17) not(user:member([[v, b], _18054], [])) + Call: (17) _18054=empty + Exit: (17) empty=empty + Call: (17) true + Exit: (17) true + Call: (17) lists:append([], [[[v, b], ["c", "b"]]], _18102) + Unify: (17) lists:append([], [[[v, b], ["c", "b"]]], [[[v, b], ["c", "b"]]]) + Exit: (17) lists:append([], [[[v, b], ["c", "b"]]], [[[v, b], ["c", "b"]]]) + Call: (17) true + Exit: (17) true + Call: (17) true + Exit: (17) true + Exit: (16) updatevar([v, b], ["c", "b"], [], [[[v, b], ["c", "b"]]]) + Exit: (15) putvalue([v, b], ["c", "b"], [], [[[v, b], ["c", "b"]]]) + Call: (15) checkarguments([["a", "b"], [v, d]], [[v, c], [v, d]], [[[v, b], ["c", "b"]]], _18104, [], _18108) + Unify: (15) checkarguments([["a", "b"], [v, d]], [[v, c], [v, d]], [[[v, b], ["c", "b"]]], _18104, [], _18108) + Call: (16) [["a", "b"], [v, d]]=[_18084|_18086] + Exit: (16) [["a", "b"], [v, d]]=[["a", "b"], [v, d]] + Call: (16) expressionnotatom(["a", "b"]) + Unify: (16) expressionnotatom(["a", "b"]) + Call: (17) isvalstrempty(["a", "b"]) + Unify: (17) isvalstrempty(["a", "b"]) + Call: (18) var(["a", "b"]) + Fail: (18) var(["a", "b"]) + Redo: (17) isvalstrempty(["a", "b"]) + Unify: (17) isvalstrempty(["a", "b"]) + Call: (18) isval(["a", "b"]) + Unify: (18) isval(["a", "b"]) + Fail: (18) isval(["a", "b"]) + Redo: (17) isvalstrempty(["a", "b"]) + Unify: (17) isvalstrempty(["a", "b"]) + Fail: (17) isvalstrempty(["a", "b"]) + Redo: (16) expressionnotatom(["a", "b"]) + Unify: (16) expressionnotatom(["a", "b"]) +^ Call: (17) not(atom(["a", "b"])) +^ Unify: (17) not(user:atom(["a", "b"])) +^ Exit: (17) not(user:atom(["a", "b"])) + Call: (17) length(["a", "b"], _18116) + Unify: (17) length(["a", "b"], _18116) + Exit: (17) length(["a", "b"], 2) + Call: 
(17) 2>=1 + Exit: (17) 2>=1 + Call: (17) expressionnotatom2(["a", "b"]) + Unify: (17) expressionnotatom2(["a", "b"]) + Call: (18) isvalstrempty("a") + Unify: (18) isvalstrempty("a") + Call: (19) var("a") + Fail: (19) var("a") + Redo: (18) isvalstrempty("a") + Unify: (18) isvalstrempty("a") + Call: (19) isval("a") + Unify: (19) isval("a") + Fail: (19) isval("a") + Redo: (18) isvalstrempty("a") + Unify: (18) isvalstrempty("a") + Exit: (18) isvalstrempty("a") + Call: (18) expressionnotatom2(["b"]) + Unify: (18) expressionnotatom2(["b"]) + Call: (19) isvalstrempty("b") + Unify: (19) isvalstrempty("b") + Call: (20) var("b") + Fail: (20) var("b") + Redo: (19) isvalstrempty("b") + Unify: (19) isvalstrempty("b") + Call: (20) isval("b") + Unify: (20) isval("b") + Fail: (20) isval("b") + Redo: (19) isvalstrempty("b") + Unify: (19) isvalstrempty("b") + Exit: (19) isvalstrempty("b") + Call: (19) expressionnotatom2([]) + Unify: (19) expressionnotatom2([]) + Exit: (19) expressionnotatom2([]) + Exit: (18) expressionnotatom2(["b"]) + Exit: (17) expressionnotatom2(["a", "b"]) + Exit: (16) expressionnotatom(["a", "b"]) + Call: (16) [[v, c], [v, d]]=[_18100|_18102] + Exit: (16) [[v, c], [v, d]]=[[v, c], [v, d]] +^ Call: (16) not(var([v, c])) +^ Unify: (16) not(user:var([v, c])) +^ Exit: (16) not(user:var([v, c])) + Call: (16) isvar([v, c]) + Unify: (16) isvar([v, c]) + Exit: (16) isvar([v, c]) + Call: (16) putvalue([v, c], ["a", "b"], [[[v, b], ["c", "b"]]], _18136) + Unify: (16) putvalue([v, c], ["a", "b"], [[[v, b], ["c", "b"]]], _18136) +^ Call: (17) not(isvar([v, c])) +^ Unify: (17) not(user:isvar([v, c])) + Call: (18) isvar([v, c]) + Unify: (18) isvar([v, c]) + Exit: (18) isvar([v, c]) +^ Fail: (17) not(user:isvar([v, c])) + Redo: (16) putvalue([v, c], ["a", "b"], [[[v, b], ["c", "b"]]], _18136) + Call: (17) isvar([v, c]) + Unify: (17) isvar([v, c]) + Exit: (17) isvar([v, c]) + Call: (17) isvalstrorundef(["a", "b"]) + Unify: (17) isvalstrorundef(["a", "b"]) + Call: (18) 
var(["a", "b"]) + Fail: (18) var(["a", "b"]) + Redo: (17) isvalstrorundef(["a", "b"]) + Unify: (17) isvalstrorundef(["a", "b"]) +^ Call: (18) not(var(["a", "b"])) +^ Unify: (18) not(user:var(["a", "b"])) +^ Exit: (18) not(user:var(["a", "b"])) + Call: (18) isval(["a", "b"]) + Unify: (18) isval(["a", "b"]) + Fail: (18) isval(["a", "b"]) + Redo: (17) isvalstrorundef(["a", "b"]) + Unify: (17) isvalstrorundef(["a", "b"]) +^ Call: (18) not(var(["a", "b"])) +^ Unify: (18) not(user:var(["a", "b"])) +^ Exit: (18) not(user:var(["a", "b"])) + Call: (18) expression(["a", "b"]) + Unify: (18) expression(["a", "b"]) + Call: (19) isval(["a", "b"]) + Unify: (19) isval(["a", "b"]) + Fail: (19) isval(["a", "b"]) + Redo: (18) expression(["a", "b"]) + Unify: (18) expression(["a", "b"]) + Unify: (18) expression(["a", "b"]) + Unify: (18) expression(["a", "b"]) +^ Call: (19) not(atom(["a", "b"])) +^ Unify: (19) not(user:atom(["a", "b"])) +^ Exit: (19) not(user:atom(["a", "b"])) + Call: (19) length(["a", "b"], _18152) + Unify: (19) length(["a", "b"], _18152) + Exit: (19) length(["a", "b"], 2) + Call: (19) 2>=1 + Exit: (19) 2>=1 + Call: (19) expression2(["a", "b"]) + Unify: (19) expression2(["a", "b"]) + Call: (20) expression3("a") + Unify: (20) expression3("a") + Call: (21) isval("a") + Unify: (21) isval("a") + Fail: (21) isval("a") + Redo: (20) expression3("a") + Unify: (20) expression3("a") + Exit: (20) expression3("a") + Call: (20) expression2(["b"]) + Unify: (20) expression2(["b"]) + Call: (21) expression3("b") + Unify: (21) expression3("b") + Call: (22) isval("b") + Unify: (22) isval("b") + Fail: (22) isval("b") + Redo: (21) expression3("b") + Unify: (21) expression3("b") + Exit: (21) expression3("b") + Call: (21) expression2([]) + Unify: (21) expression2([]) + Exit: (21) expression2([]) + Exit: (20) expression2(["b"]) + Exit: (19) expression2(["a", "b"]) + Exit: (18) expression(["a", "b"]) + Exit: (17) isvalstrorundef(["a", "b"]) + Call: (17) updatevar([v, c], ["a", "b"], [[[v, b], 
["c", "b"]]], _18156) + Unify: (17) updatevar([v, c], ["a", "b"], [[[v, b], ["c", "b"]]], _18156) + Call: (18) lists:member([[v, c], empty], [[[v, b], ["c", "b"]]]) + Unify: (18) lists:member([[v, c], empty], [[[v, b], ["c", "b"]]]) + Fail: (18) lists:member([[v, c], empty], [[[v, b], ["c", "b"]]]) + Redo: (17) updatevar([v, c], ["a", "b"], [[[v, b], ["c", "b"]]], _18156) +^ Call: (18) not(member([[v, c], _18148], [[[v, b], ["c", "b"]]])) +^ Unify: (18) not(user:member([[v, c], _18148], [[[v, b], ["c", "b"]]])) +^ Exit: (18) not(user:member([[v, c], _18148], [[[v, b], ["c", "b"]]])) + Call: (18) _18148=empty + Exit: (18) empty=empty + Call: (18) true + Exit: (18) true + Call: (18) lists:append([[[v, b], ["c", "b"]]], [[[v, c], ["a", "b"]]], _18196) + Unify: (18) lists:append([[[v, b], ["c", "b"]]], [[[v, c], ["a", "b"]]], [[[v, b], ["c", "b"]]|_18180]) + Exit: (18) lists:append([[[v, b], ["c", "b"]]], [[[v, c], ["a", "b"]]], [[[v, b], ["c", "b"]], [[v, c], ["a", "b"]]]) + Call: (18) true + Exit: (18) true + Call: (18) true + Exit: (18) true + Exit: (17) updatevar([v, c], ["a", "b"], [[[v, b], ["c", "b"]]], [[[v, b], ["c", "b"]], [[v, c], ["a", "b"]]]) + Exit: (16) putvalue([v, c], ["a", "b"], [[[v, b], ["c", "b"]]], [[[v, b], ["c", "b"]], [[v, c], ["a", "b"]]]) + Call: (16) checkarguments([[v, d]], [[v, d]], [[[v, b], ["c", "b"]], [[v, c], ["a", "b"]]], _18204, [], _18208) + Unify: (16) checkarguments([[v, d]], [[v, d]], [[[v, b], ["c", "b"]], [[v, c], ["a", "b"]]], _18204, [], _18208) + Call: (17) [[v, d]]=[_18184|_18186] + Exit: (17) [[v, d]]=[[v, d]] + Call: (17) expressionnotatom([v, d]) + Unify: (17) expressionnotatom([v, d]) + Call: (18) isvalstrempty([v, d]) + Unify: (18) isvalstrempty([v, d]) + Call: (19) var([v, d]) + Fail: (19) var([v, d]) + Redo: (18) isvalstrempty([v, d]) + Unify: (18) isvalstrempty([v, d]) + Call: (19) isval([v, d]) + Unify: (19) isval([v, d]) + Fail: (19) isval([v, d]) + Redo: (18) isvalstrempty([v, d]) + Unify: (18) isvalstrempty([v, 
d]) + Fail: (18) isvalstrempty([v, d]) + Redo: (17) expressionnotatom([v, d]) + Unify: (17) expressionnotatom([v, d]) +^ Call: (18) not(atom([v, d])) +^ Unify: (18) not(user:atom([v, d])) +^ Exit: (18) not(user:atom([v, d])) + Call: (18) length([v, d], _18216) + Unify: (18) length([v, d], _18216) + Exit: (18) length([v, d], 2) + Call: (18) 2>=1 + Exit: (18) 2>=1 + Call: (18) expressionnotatom2([v, d]) + Unify: (18) expressionnotatom2([v, d]) + Call: (19) isvalstrempty(v) + Unify: (19) isvalstrempty(v) + Call: (20) var(v) + Fail: (20) var(v) + Redo: (19) isvalstrempty(v) + Unify: (19) isvalstrempty(v) + Call: (20) isval(v) + Unify: (20) isval(v) + Fail: (20) isval(v) + Redo: (19) isvalstrempty(v) + Unify: (19) isvalstrempty(v) + Fail: (19) isvalstrempty(v) + Fail: (18) expressionnotatom2([v, d]) + Redo: (17) expressionnotatom([v, d]) + Unify: (17) expressionnotatom([v, d]) + Call: (18) predicate_or_rule_name([v, d]) + Fail: (18) predicate_or_rule_name([v, d]) + Fail: (17) expressionnotatom([v, d]) + Redo: (16) checkarguments([[v, d]], [[v, d]], [[[v, b], ["c", "b"]], [[v, c], ["a", "b"]]], _18204, [], _18208) + Unify: (16) checkarguments([[v, d]], [[v, d]], [[[v, b], ["c", "b"]], [[v, c], ["a", "b"]]], _18204, [], _18208) + Call: (17) [[v, d]]=[_18184|_18186] + Exit: (17) [[v, d]]=[[v, d]] +^ Call: (17) not(var([v, d])) +^ Unify: (17) not(user:var([v, d])) +^ Exit: (17) not(user:var([v, d])) + Call: (17) isvar([v, d]) + Unify: (17) isvar([v, d]) + Exit: (17) isvar([v, d]) + Call: (17) [[v, d]]=[_18200|_18202] + Exit: (17) [[v, d]]=[[v, d]] + Call: (17) expressionnotatom([v, d]) + Unify: (17) expressionnotatom([v, d]) + Call: (18) isvalstrempty([v, d]) + Unify: (18) isvalstrempty([v, d]) + Call: (19) var([v, d]) + Fail: (19) var([v, d]) + Redo: (18) isvalstrempty([v, d]) + Unify: (18) isvalstrempty([v, d]) + Call: (19) isval([v, d]) + Unify: (19) isval([v, d]) + Fail: (19) isval([v, d]) + Redo: (18) isvalstrempty([v, d]) + Unify: (18) isvalstrempty([v, d]) + Fail: 
(18) isvalstrempty([v, d]) + Redo: (17) expressionnotatom([v, d]) + Unify: (17) expressionnotatom([v, d]) diff --git a/mindreader/init.txt b/mindreader/init.txt new file mode 100644 index 0000000000000000000000000000000000000000..c1846123c31986d037bfd719e214d26b4e92fc3b --- /dev/null +++ b/mindreader/init.txt @@ -0,0 +1,161 @@ +%% Instructions to initialise each algorithm before running for the first time. + +%% Before pasting each line, to avoid medical issues and make it work, think of two radio buttons put on recordings, put through with prayer, nut and bolt, quantum box prayer 1, 1, 0.5 cm and 1, 1, 0.5 cm. + +%% Enter each line one at a time, not together + + +%% Once per day + +%% Load texttobr2, texttobr + +%% In shell +cd Text-to-Breasonings +swipl +['text_to_breasonings.pl']. + +%% texttobr2(2). %% Head of State to enable mind reading + +%% 50 done up As for training, 50 done up As on my side +F="../mindreader/file.txt",time((N is 100*4, M is 4000, texttobr2(N,F,u,M,false,false,false,false,false,false),texttobr(N,F,u,M))). + +%% Important: please switch off algorithm after each use with Head of State with +%% texttobr2(2). + +%% Once per algorithm + +%% Load 250 breasonings texttobr2qb - already done + + +texttobr2(3). %% indicates training radio button, radio button to graciously give and graciously give 250 breasonings for training +texttobr2(3). %% indicates training on our side radio button, radio button to graciously give and graciously give 250 breasonings for training on our side + +%% FOR OBJECTS: + +%% 250s for dotting yourself on speaking and dotting yourself receiving, and yourself speaking and yourself receiving + +texttobr2(2). + +%% Do 250 breasonings for binary oppositions for front, behind, above and below object being recognised, and protection and off for each one + +texttobr2(1). +texttobr2(1). +texttobr2(1). +texttobr2(1). + +texttobr2(4). +texttobr2(4). 
+ +%% Do 250s for quarter master and object binary oppositions x's for left, right, front, back, top, and bottom object being recognised, with protection and off for each one + +texttobr2(1). +texttobr2(1). +texttobr2(1). +texttobr2(1). +texttobr2(1). +texttobr2(1). + +texttobr2(6). +texttobr2(6). + +%% 100 As for 1 (turned on)*2 (to and from computer)*3 (rb, itself (already done), graciously give or blame, radio button for graciously give or blame)*5 (5 letters to recognise) + +texttobr2(3). %% * 5 + +%% 100 As for 2 (protections and turned off)*2 (to and from computer)*3 (rb, itself (already done), graciously give or blame, radio button for graciously give or blame)*5 (5 objects to recognise) + +N is 3*5,texttobr2(N). +N is 3*5,texttobr2(N). + +%% FOR CHARACTERS: + +%% 250s for dotting yourself on speaking and dotting yourself receiving, and yourself speaking and yourself receiving + +texttobr2(2). + +%% Do 250 breasonings for binary oppositions for front, behind, above and below object being recognised, and protection and off for each one + +texttobr2(1). +texttobr2(1). +texttobr2(1). +texttobr2(1). + +texttobr2(4). +texttobr2(4). + +%% Do 250s for quarter master and object binary oppositions x's for left, right, front, back, top, and bottom object being recognised, with protection and off for each one + +texttobr2(1). +texttobr2(1). +texttobr2(1). +texttobr2(1). +texttobr2(1). +texttobr2(1). + +texttobr2(6). +texttobr2(6). + +%% 100 As for 1 (turned on)*2 (to and from computer)*3 (rb, itself (already done), graciously give or blame, radio button for graciously give or blame)*26 (26 letters to recognise) +%% Breason out pixels in e.g. https://www.fontspace.com/lucian-academy/lucianshandbitmap + +texttobr2(3). %% * 26 + +%% 100 As for 2 (protections and turned off)*2 (to and from computer)*3 (rb, itself (already done), graciously give or blame, radio button for graciously give or blame)*26 (26 letters to recognise) + +N is 3*26,texttobr2(N). 
+N is 3*26,texttobr2(N). + +%% also breason out and dot on (parts of, i.e. squares and rectangles in) characters before line above and breason out and dot on when recognising and saying object (with all objects having different breasonings) + +%% to recognise, think of stream of consciousness breasonings, the appearance of an object, words appearing and tricky appearances that mean something else + +%% teach subjects these skills telepathically one by one + +%% Before using, think 2 radio buttons put on recordings put through with prayer, nut and bolt, quantum box prayer, 1 1 0.5, 1 1 0.5 cm + +%% The algorithm has at times projected and identified my thought correctly but many of the figures are too similar + +%% Sometimes I had the feeling of a quick representation taken from me before I could speak, people "talking for me" (in any case, a false reading) and other options appearing above mine, and sometimes breasonings I said "stickily" matching a different option from what I meant registered as the option + + + +%%** + +%% FOR PROJECTING VOXELS: + +%% 250s for dotting yourself on projecting and dotting yourself seeing, and yourself projecting and yourself seeing + +texttobr2(2). + +%% Do 250 breasonings for binary oppositions for front, behind, above and below object being projected, and protection and off for each one + +texttobr2(1). +texttobr2(1). +texttobr2(1). +texttobr2(1). + +texttobr2(4). +texttobr2(4). + +%% Do 250s for quarter master and object binary oppositions x's for left, right, front, back, top, and bottom object being projected, with protection and off for each one + +texttobr2(1). +texttobr2(1). +texttobr2(1). +texttobr2(1). +texttobr2(1). +texttobr2(1). + +texttobr2(6). +texttobr2(6). 
+ +%% 100 As for 1 (turned on)*2 (to and from computer)*3 (rb, itself (already done), graciously give or blame, radio button for graciously give or blame)*10 (10 colours to project) + +%% black, white, red, orange, yellow, green, blue, purple, brown, grey + +texttobr2(3). %% * 10 + +%% 100 As for 2 (protections and turned off)*2 (to and from computer)*3 (rb, itself (already done), graciously give or blame, radio button for graciously give or blame)*5 (5 objects to recognise) + +N is 3*10,texttobr2(N). +N is 3*10,texttobr2(N). diff --git a/mindreader/letter.txt b/mindreader/letter.txt new file mode 100644 index 0000000000000000000000000000000000000000..5b8125df180ca66dedef6af5dfcc7e4ab3a112d6 --- /dev/null +++ b/mindreader/letter.txt @@ -0,0 +1,10 @@ +I would like the spiritual screen, please. +I would like no headaches from it, please. +I would like no depression from it, please. +I would like no hallucinations from it, please. +It is good. +It is good. +It is good. +It is good. +It is good. +It is good. \ No newline at end of file diff --git a/mindreader/make_mind_reading_tree4 working1.pl b/mindreader/make_mind_reading_tree4 working1.pl new file mode 100644 index 0000000000000000000000000000000000000000..e29f8c28e522028b6ff262fd0ff3123394daf983 --- /dev/null +++ b/mindreader/make_mind_reading_tree4 working1.pl @@ -0,0 +1,725 @@ +/** +WORKING + +make_mind_reading_tree4(["abc","abd"], +[[1, "b", 3], [3, "c", [-, "abc"]], [3, "d", [-, "abd"]]]). + +make_mind_reading_tree4(["aaa","aab","acc"], +[[1,"a",2],[2,"a",3],[2,"c",[-,"acc"]],[3,"a",[-,"aaa"]],[3,"b",[-,"aab"]]]). + +make_mind_reading_tree4(["cca","ccb"], +[[1, "c", 3], [3, "a", [-, "cca"]], [3, "b", [-, "ccb"]]]). + + + +make_mind_reading_tree4(["acc","bcc"], +[[1, "a", [-, "acc"]], [1, "b", [-, "bcc"]]]). + +make_mind_reading_tree4(["ac","bc"], +[[1, "a", [-, "ac"]], [1, "b", [-, "bc"]]]). 
+ +make_mind_reading_tree4(["aqa","awx","awy"], +[[1,"a",2],[2,"q",[-,"aqa"]],[2,"w",9],[9,"x",[-,"awx"]],[9,"y",[-,"awy"]]] +). + +make_mind_reading_tree4(["ca","cb"], +[[1, "c", 2], [2, "a", [-, "ca"]], [2, "b", [-, "cb"]]]). + +make_mind_reading_tree4(["abc","ade"], +[[1, "a", 2], [2, "b", [-, "abc"]], [2, "d", [-, "ade"]]]). + + +make_mind_reading_tree4(["cccbbb"], +[[1, "b", [-, "cccbbb"]]]). + +make_mind_reading_tree4(["cccbbb","fsfs"], +[[1, "c", [-, "cccbbb"]], [1, "f", [-, "fsfs"]]]). + +make_mind_reading_tree4(["abcde"], +[[1, "e", [-, "abcde"]]]). + +make_mind_reading_tree4(["abcde","afgde"], +[[1, "a", 2], [2, "b", [-, "abcde"]], [2, "f", [-, "afgde"]]]). + + +make_mind_reading_tree4(["aaaabbbbcccc","aaaaddddeeee"], +[[1,"a",5],[5,"b",[-,"aaaabbbbcccc"]],[5,"d",[-,"aaaaddddeeee"]]]). + +make_mind_reading_tree4(["aaaccc","bbbccc"], +[[1, "a", [-, "aaaccc"]], [1, "b", [-, "bbbccc"]]]). + +**/ + +:-include('../listprologinterpreter/la_strings.pl'). +%:- include('../listprologinterpreter/la_strings.pl'). + +make_mind_reading_tree4(Options0,Options3) :- + sort(Options0,Options1), + abort_if_empty_string(Options1), + string_to_list1(Options1,1,_,[],Options2), + maplist(append,[Options2],[Options2a]), +%writeln1(merge_lists1a(Options2,Options2a,Options3a)), + make_mind_reading_tree4_a(Options2a,Options3). + +/** +make_mind_reading_tree4_a1(Options2a,Options3) :- + make_mind_reading_tree4_a(Options2a,Options3a), + make_mind_reading_tree4_a(Options3a,Options3b), + (Options3a=Options3b->Options3=Options3a; + make_mind_reading_tree4_a1(Options3b,Options3)). +**/ + +make_mind_reading_tree4_a(Options2a,Options3) :- + %trace, + merge_lists_a([1],Options2a,[],%%Options2a, + Options3a), +%writeln1(merge_lists_a([1],Options2a,[],%%Options2a, +% Options3a)), + sort(Options3a,Options3c), + %trace, + remove_chains_of_one_child_a([1],Options3c,[],Options3b), + %writeln1(remove_chains_of_one_child_a([1],Options3c,[],Options3b)), + sort(Options3b,Options3),!. 
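
For readers following the Prolog, the pipeline above (sort the options, expand each string into letter transitions, merge shared prefixes, then remove chains of one child) amounts to building a letter trie and collapsing runs of non-distinguishing letters. The sketch below is a hypothetical Python reconstruction of that output shape, not the author's code: `make_tree` and `chain_leaf` are my names, state numbering differs from the Prolog's (which keeps numbers left over from the uncollapsed chain), and one edge case diverges (a single unforked option is keyed by its *last* letter in the Prolog, by its first distinguishing letter here).

```python
# Hypothetical reconstruction (a sketch, not the author's Prolog) of the
# tree shape make_mind_reading_tree4/2 produces: transitions
# [state, letter, next] with leaves marked ["-", word].

def make_tree(words):
    words = sorted(set(words))
    if "" in words:
        raise ValueError('cannot mind read ""')  # cf. abort_if_empty_string/1
    # Build a plain letter trie; a leaf stores the completed word.
    # Assumes all options have the same length (a restriction the Prolog
    # file's own notes also state: "* strings must be same length").
    trie = {}
    for w in words:
        node = trie
        for ch in w[:-1]:
            node = node.setdefault(ch, {})
        node[w[-1]] = w
    transitions, next_state = [], [1]

    def chain_leaf(node):
        # If node is a single-child chain ending in a leaf, return that word.
        while isinstance(node, dict) and len(node) == 1:
            (node,) = node.values()
        return node if isinstance(node, str) else None

    def emit(state, node):
        for ch in sorted(node):
            child = node[ch]
            if isinstance(child, str):                     # direct leaf
                transitions.append([state, ch, ["-", child]])
            elif (word := chain_leaf(child)) is not None:  # chain to a leaf
                transitions.append([state, ch, ["-", word]])
            else:
                # Collapse the single-child run down to the next fork,
                # keeping the last letter before the fork, as in the
                # documented example ["abc","abd"] -> [[_, "b", _], ...].
                letter, fork = ch, child
                while len(fork) == 1:
                    ((letter, fork),) = fork.items()
                next_state[0] += 1
                s = next_state[0]
                transitions.append([state, letter, s])
                emit(s, fork)

    emit(1, trie)
    return transitions
```

For example, `make_tree(["ca","cb"])` reproduces the documented `[[1,"c",2],[2,"a",[-,"ca"]],[2,"b",[-,"cb"]]]` shape (with `"-"` as a string rather than a Prolog atom).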
+ + +string_to_list1([],N,N,Options,Options) :- !. +string_to_list1(Options1,N1,N2,Options2a,Options2b) :- + Options1=[B|Rest], + %%findall(C,(member(B,Options1), + string_to_list2(B,B,N1,N3,Options2a,Options2c),%%),Options2). + %%trace, + [Options2c]=Options2d, +%%trace, + string_to_list1(Rest,N3,N2,[],Options2e), + %%trace, + Options2d=[[[_,D1,D2]|D3]], + Options2f=[[[1,D1,D2]|D3]], + append(Options2f,Options2e,Options2b). + +string_to_list2(B,B1,N1,N2,A1,A2) :- + string_concat(D,"",B), + string_length(D,1), + append(A1,[[N1,D,[-,B1]]],A2), + N2 is N1 + 1,!. +string_to_list2(A,B1,N1,N2,B,C) :- + string_concat(D,E,A), + string_length(D,1), + N3 is N1 + 1, + append(B,[[N1,D,N3]],F), + %%maplist(append,[[B,D]],[F]), + string_to_list2(E,B1,N3,N2,F,C). + + +merge_lists_a([],Options1,Options2,Options3) :- + append(Options1,Options2,Options3),!. %*? +%merge_lists_a([[-,A]],Options0,Options1,Options2) :- +% member([N,A1,[-,A]],Options0), +% append(Options1,[[N,A1,[-,A]]],Options2),!. +%%writeln1(merge_lists_a([-,_],_,Options,Options)),!. +merge_lists_a(N1,Options1,Options2, +Options3) :- + %trace, + N1=[M1|Ms], + abort_if_gone_past_error(M1,Options1), +%writeln1([n1,N1]), + % If all As lead to the same letter then merge them + findall(A,(member([M1,A,_N2],Options1)),A1), + sort(A1,A11), + % next n1 + merge_lists_a1(M1,A11,Options1,[],Options31,[],N21), + append(Options2,Options31,Options32), + append(Ms,N21,M21), + sort(M21,M2), + %M21=M2, + merge_lists_a(M2,Options32,[],Options3). + %sort(Options32,Options3). + + +merge_lists_a1(_,[],Options1,Options2,Options3,N,N) :- + append(Options1,Options2,Options3),!. %*? 
+merge_lists_a1(N1,A1,Options1,Options2,Options3,NA61,NA7) :- + %(N1=2->trace;true), + %writeln1(merge_lists_a1(N1,A1,Options1,Options2,Options3,NA61,NA7)), + %notrace, + A1=[A2|A3], + findall([N2,A5,N3],(member([N1,A2,N2],Options1), + member([N2,A5,N3],Options1)),A4), + %findall([-,_],(member([N1,A2,[-,_]],Options1)),A41), + findall([N1,A2,N2],(member([N1,A2,N2],Options1)),A6), + (%trace, + (merge_lists_a2(A4) + %->true;(length(A41,L),length(Options1,L) + )-> + ( + + %% merge N1 A2 N*, change N* in other states + %trace, + merge_lists_a3(A6,Options1,Options4), + findall(N2,(member([N1,A2,N2],Options4)),NA6)); + %(trace, + (%trace, + %subtract(A6, + merge_lists_a3(A6,Options1,Options4), + %NA6=[] + findall(N2,(member([N1,A2,N2],Options4)),NA6) %** + )), + append(Options2,Options4,Options5), + append(NA61,NA6,NA62), + %Options2=Options4), + %trace, + %delete(Options1,[N1,A2,N2],Options1a), + merge_lists_a1(N1,A3,Options5,[],Options3,NA62,NA7). + %%sort(Options31,Options3). + %*?o4 +%end state + +merge_lists_a2(A4) :- + A4=[[_N1,A,_N2]|A5], + merge_lists_a22(A,A5). +merge_lists_a22(_A,[]) :- !. +merge_lists_a22(A,A5) :- + A5=[[_N1,A,_N2]|A6], + merge_lists_a22(A,A6). + +merge_lists_a3(A6,Options1,%Options2, +Options3) :- + A6=[[N1,A,N2]|A8], + delete(Options1,[N1,A,_],Options1a), + %%(member([N1,A,N2],Options2)->(trace,Options2=Options2a,A9=A6); + (append(Options1a,[[N1,A,N2]],Options1aa)%%,A9=A8 + ),%), + merge_lists_a4(N2,A8,Options1aa,[],%Options2a, + Options3). + + +merge_lists_a4(_N2,[],Options1,Options2,Options3 +) :- + append(Options1,Options2,Options3),!. 
+merge_lists_a4(N2,A8,Options1,Options2,Options3) :- +%trace, +%writeln1(merge_lists_a4(N2,A8,Options1,Options2,Options3)), + A8=[[N1,A,N3]|A9], + delete(Options1,[N1,A,N3],Options2aa), + %%append(Options2,[[N1,A,N2]],Options2a), + %trace, + merge_lists_a5(N2,N3,Options2aa,[],Options4,[],Options5), + % remove Options4=[n from o5 + findall(N4,(member([N4,_,_],Options4)),N41), + subtract1(Options5,N41,[],Options45), + append(Options2,Options45,Options245), + %union(Options4,Options5,Options45), + %writeln1(Options45), +%writeln1( merge_lists_a5(N2,N3,Options1,Options2,Options4,[],Options5)), + merge_lists_a4(N2,A9,Options245,[],%***Options4, + Options31), + %(N2=2->trace;true), + append(Options31,Options4,Options3). + %(Options4=[[2, "a", 3], [2, "a", 6]]->true;notrace). + %append(Options45,Options31,Options3). + +% point second in pair to changed first state +merge_lists_a5(_N2,_N3,[],Options1,Options1,Options2,Options2) :- !. +merge_lists_a5(N2,N3,Options1,Options2,Options4,Options5,Options6) :- +%(N2=3->trace;true), + Options1=[[N3,A,N4]|A9], + delete(Options2,[N3,A,N4],Options2aa), + %trace, + append(Options2aa,[[N2,A,N4]],Options2a), + merge_lists_a5(N2,N3,A9,Options2a,Options4,Options5,Options6). +merge_lists_a5(N2,N3,Options1,Options2,Options4,Options5,Options6) :- + Options1=[[N2,A,N4]|A9], + delete(Options2,[N3,A,N4],Options2aa), + %trace, + append(Options2aa,[[N2,A,N4]],Options2a), + merge_lists_a5(N2,N3,A9,Options2a,Options4,Options5,Options6). +merge_lists_a5(N2,N3,Options1,Options2,Options4,Options5,Options6) :- + Options1=[[N31,A,N4]|A9], + not(N2=N31), + append(Options5,[[N31,A,N4]],Options5a), + merge_lists_a5(N2,N3,A9,Options2,Options4,Options5a,Options6). + +subtract1([],_N41,Options45,Options45) :- !. +subtract1(Options5,N41,Options451,Options45) :- + Options5=[[N42,_,_]|Options51], + member(N42,N41), + subtract1(Options51,N41,Options451,Options45). 
+subtract1(Options5,N41,Options451,Options45) :- + Options5=[[N42,A,B]|Options51], + not(member(N42,N41)), + append(Options451,[[N42,A,B]],Options452), + subtract1(Options51,N41,Options452,Options45). + +subtract2([],_N41,Options45,Options45) :- !. +subtract2(Options5,N41,Options451,Options45) :- + Options5=[[_,N42,_]|Options51], + member(N42,N41), + subtract2(Options51,N41,Options451,Options45). +subtract2(Options5,N41,Options451,Options45) :- + Options5=[[N42,A,B]|Options51], + not(member(A,N41)), + append(Options451,[[N42,A,B]],Options452), + subtract2(Options51,N41,Options452,Options45). + +abort_if_gone_past_error(M1,Options1) :- + %trace, + ((member([M1,A,N22],Options1),member([M1,A,N23],Options1),not(N22=N23),not(N22=[-,_]),N23=[-,_])->(term_to_atom([M1,A,N22],N221),term_to_atom([M1,A,N23],N231),concat_list(["Error: Conflicting branches ",N221," and ",N231,"."],W),writeln1(W),abort);true),!. + +abort_if_empty_string(Options1) :- + (member("",Options1)->(writeln1("Error: Cannot mind read \"\"."),abort);true),!. + +/** + + +merge_lists1a([],Options,Options) :- !. +merge_lists1a(Options1,Options2a,Options2b) :- + Options1=[Options4|Options5], + merge_lists1c(Options4,Options5,Options2a,Options2c), + merge_lists1a(Options5,Options2c,Options2b). + +merge_lists1c(_,[],Options,Options) :- !. +merge_lists1c(Options4,Options5,Options2a,Options2b) :- + Options5=[Options5a|Options5b], + merge_lists1b(Options4,Options5a,Options2a,Options2c), + merge_lists1c(Options4,Options5b,Options2c,Options2b). + +merge_lists1b([],[],Options,Options) :- !. +merge_lists1b(Options1,Options2,Options3a,Options3b) :- + Options1=[Options4|Options5], + Options2=[Options6|Options7], +%% trace, + merge_lists2(Options4,Options6,Options3a,Options3c), +%%writeln1( merge_lists2(Options4,Options6,Options3a,Options3c)), + merge_lists1b(Options5,Options7,Options3c,Options3b). + +**/ + +/** +merge_lists1([],Options,Options) :- !. 
+merge_lists1(Options1,Options3a,Options3b) :- + Options1=[Options4|Options5], + %%Options2=[Options6|Options7], +%% trace, + merge_lists1b(Options4,Options1,[],Options3c), + append(Options3a,Options3c,Options3d), +%%writeln1( merge_lists2(Options4,Options6,Options3a,Options3c)), + merge_lists1(Options5,Options3d,Options3b). + +merge_lists1b([],Options,Options) :- !. +merge_lists1b(Options1,Options3a,Options3b) :- + Options1=[Options4|Options5], + %%Options2=[Options6|Options7], +%% trace, + merge_lists2(Options4,Options1,[],Options3c), + append(Options3a,Options3c,Options3d), +%%writeln1( merge_lists2(Options4,Options6,Options3a,Options3c)), + merge_lists1b(Options5,Options3d,Options3b). + +**/ + +/** + +[debug] ?- merge_lists2([1, "a", 2],[4, "a", 5],[[[1, "a", 2], [2, "b", 3], [3, "c",-]], [[4, "a", 5], [5, "d", 6], [6, "e",-]]],O). +O = [[[1, "a", 2], [2, "b", 3], [3, "c", -]], [[1, "a", 5], [5, "d", 6], [6, "e", -]]]. + +[debug] ?- merge_lists2([2, "b",3],[5, "b", 6],[[[1, "a", 2], [2, "b", 3], [3, "c",-]], [[1, "a", 5], [5, "b", 6], [6, "e",-]]],O). +O = [[[1, "a", 2], [2, "b", 3], [3, "c", -]], [[1, "a", 2], [2, "b", 6], [6, "e", -]]]. + +remove [1, "a", 2],flatten +does reordering cause a bug? 
+() replace in place, redo each time changed + +merge if no more than 1 parent of each node before it +remove chains of 1-children +flatten, sort at end + +* strings must be same length + +- same parents, etc in next level x whole thing + +debug each pred + +fore and post cut in shortening (no middle cut) +- if middle cut (where abd acd->ad ad), find minimum combos of intermediate letters between forks with no conflicting (not the same items in the same order/same place) + - letter for letter x + - a single new level x + - cut only if necessary (leave alone) x + - a single new letter v + + continually merge and shorten until returns the same + +**/ + +/** +same_parents([N1,_A1,_N2],[N4,_A2,_N3],Options61) :- + findall([B1,B2,N1],member([B1,B2,N1],Options61),C1), + findall([B3,B4,N4],member([B3,B4,N4],Options61),C2), + subtract(C2,C1,[]). + %% and vv? +**/ +/** +merge_lists2([N1,A1,_N2],Options2,Options61,Options9) :- + Options2=[N4,A2,N3], + (A1=A2-> + (findall(Options52,(%%member(Options61,Options6), + ((member([N4,A,N3],Options61), + no_more_than_one_parent_of_each_node_before1([N4,A,N3],Options61), + same_parents([N1,A1,_N21],[N4,A2,N3],Options61), + delete(Options61,[N4,A,N3],Options5), + append([[N1,A,N3]],Options5,Options52))->true; + Options61=Options52) + ),[Options8]), + writeln1(Options8), +%%trace, +%% merge back until forward fork with a predicate +%%trace, + writeln1( merge_back_until_forward_fork(1,N4,N1,Options8,[],Options9a)),merge_back_until_forward_fork(1,N4,N1,Options8,[],Options9a), + sort(Options9a,Options9) + %%Options8=Options9 + ); + Options61=Options9). 
+**/ + /** +merge_lists2([N1,A1,_N2],Options2,Options61,Options9) :- + Options2=[N4,A2,N3], + (A1=A2-> + (findall(Options52,(%member(Options61,Options6), + ((member([N4,A,N3],Options61), + delete(Options61,[N4,A,N3],Options5), + append([[N1,A,N3]],Options5,Options52))->true; + Options61=Options52) + ),[Options8]), + writeln(Options8), +%%trace, + findall(Options7,(member(Options81,Options8), + ( + (member([N41,A10,N4],Options81), + delete(Options81,[N41,A10,N4],Options51), + append([[N41,A10,N1]],Options51,Options7))->true; + Options81=Options7 + )),Options9)); + Options61=Options9). + **/ +%%no_more_than_one_parent_of_each_node_before1([1,_A,_N3],_Options61) :- !. +/** +no_more_than_one_parent_of_each_node_before1([N4,_A,_N3],Options61) :- + %%member([_N2,_A1,N4],Options61), + %%trace, + findall([N4,_,N5],(member([N4,_,N5],Options61),N4=1),B), + not(B=[]),%%->true;fail),%%B=[]), + %%(B=[_]->true;fail), + !. +no_more_than_one_parent_of_each_node_before1([N4,_A,_N3],Options61) :- + %%member([N2,A1,N4],Options61), + findall([N5,_,N4],(member([N5,_,N4],Options61)),[_]),%%,not(N2=N5) + findall([N2,A1,N4],(member([N2,A1,N4],Options61),no_more_than_one_parent_of_each_node_before1([N2,A1,N4],Options61)),[_]) + ,!. + + +get_c(Name2) :- + c(N1), + concat_list(["c",N1],Name1), + atom_string(Name2,Name1), + N2 is N1+1, + retractall(c(_)), + assertz(c(N2)). + +group_by_same_destination1(B,Options8,Options11,Options12) :- + %%B=[[A1,A2,A3]|B2], +findall([A1,A2,A3,A4,A5],(member([A1,A2,A3],B),member([A3,A4,A5],Options8)),C1), + sort(C1,C2), + findall([[A1,A2,C],[C,A4,A5]],(member([A1,A2,A3,A4,A5],C2), + get_c(C)),Options11a), + maplist(append,[Options11a],[Options11]), + + findall([[A1,A2,A3],[A3,A4,A5]],(member([A1,A2,A3,A4,A5],C2)),Options12a), + maplist(append,[Options12a],[Options12]). 
+ %%group_by_same_destination2(B1,Options8,[],Options12), + +%%group_by_same_destination2(B1,Options8,Options12a,Options12) :- + + +%%merge_back_until_forward_fork(1,1,_,Options1,Options2,Options3) :- +%% append(Options1,Options2,Options3), +%% !. +merge_back_until_forward_fork(_N0,N4,_N1,Options8,Options9,Options10) :- + (findall([N41,A,N6],(member([N41,_A10,N4],Options8), + (member([N41,A,N6],Options8),not(N6=N4))),B), + length(B,L),L>1), + group_by_same_destination1(B,Options8,Options11,Options12), + %%append(Options8,Options9,Options10). + subtract(Options8,Options12,Options13), + maplist(append,[Options13,Options11,Options9],[Options10]). + +merge_back_until_forward_fork(N0,N4,_N1,Options8,Options9,Options10) :- + (findall([N41,_,N6],(member([N41,A10,N4],Options8), + member([N41,_,N6],Options8),not(N6=N4)),[_])-> + (member([N41,A10,N4],Options8), + %%trace, + delete(Options8,[N41,A10,N4],Options51), + append(Options51,[[N41,A10,N4]],Options7), + merge_back_until_forward_fork(N0,N41,N4,Options7,Options9,Options10)); + %%Options9=Options10). + append(Options8,Options9,Options10)). + **/ +%% put [-,*] in +/** +merge_back_until_forward_fork1(_,[],Options7,Options7,N41,N41) :- !. +merge_back_until_forward_fork1(N1,Options8,Options72,Options7,_N43,N42) :- + Options8=[[N41,A10,N4]|Rest], + delete(Options8,[N41,A10,N4],Options51), + append(Options51,[[N41,A10,N1]],Options73), + append(Options72,Options73,Options74), + merge_back_until_forward_fork2(Rest,Options74,Options7,N41,N42). + +merge_back_until_forward_fork2([],Options7,Options7,N41,N41) :- !. +merge_back_until_forward_fork2(_,_Options71,_Options72,_N41,_N42) :- fail. + +ad +ac +bc + +**/ + +remove_chains_of_one_child_a([],Options1,Options2,Options3) :- + append(Options1,Options2,Options3),!. %*? 
+remove_chains_of_one_child_a(N1,Options0,Options1,Options2) :- + N1=[[-,A1]|Ms], + member([N,A1,[-,A]],Options0), + append(Options1,[[N,A1,[-,A]]],Options21), + remove_chains_of_one_child_a(Ms,Options21,[],Options2),!. + +remove_chains_of_one_child_a(N1,Options1,Options2, +Options3) :- + N1=[M1|Ms], + findall(A,(member([M1,A,_N2],Options1)),A1), + sort(A1,A11), + length(A11,A11L), + (A11L>1->Switch=left;Switch=right), + remove_chains_of_one_child_a1(Switch,M1,A11,Options1,[],Options31,[],N21), + append(Options2,Options31,Options32), + append(Ms,N21,M21), + sort(M21,M2), + remove_chains_of_one_child_a(M2,Options32,[],Options3). + +remove_chains_of_one_child_a1(_,_,[],Options1,Options2,Options3,N,N) :- + append(Options1,Options2,Options3),!. %*? +remove_chains_of_one_child_a1(Switch,N1,A1,Options1,Options2,Options3,_,NA7) :- + A1=[A2|A3], + findall([N1,A2,N2],(member([N1,A2,N2],Options1)),A6), + + remove_chains_of_one_child_a3(Switch,A6,Options1,[],Options4), + findall(N2,(member([N1,A2,N2],Options4)),NA6), + append(Options2,Options4,Options5), + remove_chains_of_one_child_a1(Switch,N1,A3,Options5,[],Options3,NA6,NA7). + + + + +remove_chains_of_one_child_a3(_,[],Options1,Options2,Options3 +) :- + append(Options1,Options2,Options3),!. +remove_chains_of_one_child_a3(Switch,A8,Options1,Options2,Options3) :- +%trace, + A8=[[N0,A,N1]|A9], + member([N0,A,N1],Options1), %% ** delete + delete(Options1,[N0,A,N1],Options6), + %trace, +%%writeln1( remove_chains_of_one_children2(N1,N3,Options1,[],_Options4)), + remove_chains_of_one_child_a4(N0,A,N3,Options1,[],Options41), + Options41=[_|Options42], + %writeln([options41,Options41]), + + subtract(Options6,Options42,Options6a), + % * if N0=n3 there are a and b etc. 
+ % could add a state on to the start x + %%append(Options4,[[N0,A,N3]],Options4a), + %trace, + member([_N01,A2,N3],Options1), + %trace, + (Switch=left->[[N0,A,N3]]=Options4a;[[N0,A2,N3]]=Options4a), + + %[[N0,A2,N3]]=Options4a, + %delete(Options6,[_N0a1,A2,N3],Options6a), + append(Options2,Options4a,Options5), + %remove_chains_of_one_children1(N1,Options6a,Options5,Options3). + + remove_chains_of_one_child_a3(Switch,A9,Options6a,Options5,Options31), + (N3=[-,_]->Options31=Options3; + remove_chains_of_one_child_a([N3],Options31,[],Options3)) + %,notrace + . + +remove_chains_of_one_child_a4(N1,A,N3,Options1,Options2,Options3) :- +%trace, + (N1=[-,_]->(N3=N1,Options2=Options3); + (member([N1,A,N2],Options1),member([N2,_,N22],Options1),member([N2,_,N23],Options1),not(N22=N23),N3=N2,append(Options2,[[N1,A,N2]],Options3))),!. +remove_chains_of_one_child_a4(N1,A1,N3,Options1,Options2,Options3) :- + member([N1,A2,N2],Options1), + append(Options2,[[N1,A2,N2]],Options31), + remove_chains_of_one_child_a5(N2,A1,N3,Options1,Options31,Options3). + +remove_chains_of_one_child_a5(N1,_A1,N3,Options1,Options2,Options3) :- +%trace, + (N1=[-,_]->(N3=N1,Options2=Options3); + (member([N1,A,N2],Options1),member([N2,_,N22],Options1),member([N2,_,N23],Options1),not(N22=N23),N3=N2,append(Options2,[[N1,A,N2]],Options3))),!. +remove_chains_of_one_child_a5(N1,A1,N3,Options1,Options2,Options3) :- + member([N1,A2,N2],Options1), + append(Options2,[[N1,A2,N2]],Options31), + remove_chains_of_one_child_a5(N2,A1,N3,Options1,Options31,Options3). + + /** +remove_chains_of_one_children1([-,_],Options1,Options2,Options3) :- +%%trace, + append(Options1,Options2,Options3),!. +remove_chains_of_one_children1(N0,Options1,Options2,Options3) :- + not(member([N0,_A,_N1],Options1)), + append(Options1,Options2,Options3),!. +**/ + +/** +remove_chains_of_one_children0(N3) :- +Options1=[[1,"a",2],[2,"b",[-,"ab"]],[1,"c",4],[4,"d",[-,"cd"]]], +trace, + remove_chains_of_one_children2(1,N3,Options1,[],_Options41). 
+ **/ + +/** +remove_chains_of_one_children1(N0,Options1,Options2,Options3) :- + +%trace, + %%remove_chains_of_one_children2(N0,NM1,Options1,[],_Options413), + %%((NM1=[-,_]->true;(member([-,String],Options1),string_length(String,NM1)))->fail;true), + + + %trace, + findall([N0,A,N1],(member([N0,A,N1],Options1), + remove_chains_of_one_children2(N1,N3,Options1,[],_Options41)),NN2), + + subtract(Options1,NN2,Options6), + + findall([N0,A,N1,A4,N3],(member([N0,A,N1],Options1), + remove_chains_of_one_children2(N1,N3,Options1,[],_Options412), + member([_N5,A4,N3],Options1)),NN3), + + %trace, + findall(A4,(member([N0,A,N1,A4,N3],NN3)),NN5), + sort(NN5,NN51), + findall([N0,A412,N3],(member(A41,NN51),member([N0,A412,N1,A41,N3],NN3)),NN6), +writeln1(NN6), + +%trace, + length(NN6,NN6L), + ((NN6L>1,NN6=[[NN6M1,NN6M2,_]|NN6R])->((member([NN6M1,NN6M2,_],NN6R),delete(NN6R,[NN6M1,NN6M2,_],NN62))-> + NN61=NN62;NN61=NN6 + );NN61=NN6), + + +%trace, + findall(A,(member([N0,A,N1,A4,N3],NN3)),NN7), + sort(NN7,NN71), + findall([N0,A12,N3],(member(A1,NN71),member([N0,A1,N1,A12,N3],NN3)),NN8), +writeln1(NN8), + +%trace, + + length(NN8,NN8L), + ((NN8L>1,NN8=[[NN8M1,NN8M2,_]|NN8R])->((member([NN8M1,NN8M2,_],NN8R),delete(NN8R,[NN8M1,NN8M2,_],NN82))-> + NN81=NN82;NN81=NN8 + );NN81=NN8), +%trace, +append(NN61,NN81,Options4a), +%findall(_,(member(Options4c,Options4b), +%%writeln1(remove_chains_of_one_children2(N1,N3,Options1,[],_Options4)), + %%append(Options4,[[N0,A,N3]],Options4a), + + append(Options2,Options4a,NNA), + +%trace, + + findall(NND,(member([_NNB,_NNC,NND],NNA)),NNE),sort(NNE,NNE1), + findall(NNI,(member(NNE2,NNE1),findall([NNF,NNG,NNE2],(member([NNF,NNG,NNE2],NNA),!),[NNI])),NNH), + NNH=Options5, + + remove_chains_of_one_children1(N0,Options6,Options5,Options3). 
+ +**/ +/** +remove_chains_of_one_children1(N0,Options1,Options2,Options3) :- +%trace, + member([N0,A,N1],Options1), + delete(Options1,[N0,A,N1],Options6), + %trace, +%%writeln1( remove_chains_of_one_children2(N1,N3,Options1,[],_Options4)), + remove_chains_of_one_children2(N0,N3,Options1,[],_Options41), + % * if N0=n3 there are a and b etc. + % could add a state on to the start x + %%append(Options4,[[N0,A,N3]],Options4a), + %trace, + member([_N01,A2,N3],Options1), + %trace, + [[N0,A2,N3]]=Options4a, + delete(Options6,[_N0a1,A2,N3],Options6a), + append(Options2,Options4a,Options5), + remove_chains_of_one_children1(N1,Options6a,Options5,Options3). +**/ +%remove_chains_of_one_children2([-,A],[-,A],Options1,Options2,Options3) :- +% append(Options1,Options2,Options3),!. + +%remove_chains_of_one_children2(N1,N1,Options1,_Options2,_Options3) :- + %(member([N1,_,N22],Options1),member([N1,_,N23],Options1),not(N22=N23)),!. + /** +remove_chains_of_one_children2(N1,N3,Options1,_Options2,_Options3) :- + (N1=[-,_]->N3=N1; + (member([N1,_A,N2],Options1),member([N2,_,N22],Options1),member([N2,_,N23],Options1),not(N22=N23),N3=N2)),!. +remove_chains_of_one_children2(N1,N3,Options1,_Options2,_Options3) :- + member([N1,_,N2],Options1), + remove_chains_of_one_children2(N2,N3,Options1,_Options21,_Options31). + **/ +/** +remove_chains_of_one_children2(N1,N3,Options1,Options2,Options3) :- + findall([N1,A,N2],member([N1,A,N2],Options1),Options4), + (not(Options4=[_])-> + + %%remove_chains_of_one_children3(Options2,Options4,Options3) + + (%%append(Options2,Options4,Options5), + Options2=Options5, + +%% (member([N2,_,N22],Options1),member([N2,_,N23],Options1),not(N22=N23)), + +remove_chains_of_one_children11(Options4,Options5,Options3))), +Options4=[[N3,_,_]|_]. + %remove_chains_of_one_children1(N2,Options4,Options5,Options3), + %writeln1([options4,Options4]), + %Options4=[[N3,_,_]|_] + %) + %;fail).%%,!. 
+ **/ + /** +remove_chains_of_one_children2(N1,N3,Options1,Options2,Options3) :- + findall([N1,A,N2],member([N1,A,N2],Options1),[_]), + member([N1,A,N2],Options1), + delete(Options1,[N1,A,N2],Options4), +writeln1(remove_chains_of_one_children2(N1,N3,Options1,Options2,Options3)), + %(member([N2,_,N22],Options1),member([N2,_,N23],Options1),not(N22=N23)), + remove_chains_of_one_children2(N2,N3,Options4,Options2,Options3). + +remove_chains_of_one_children11([],Options,Options) :- !. +remove_chains_of_one_children11(Options4,Options5,Options3) :- + Options4=[[_,_,N2]|N2s], + remove_chains_of_one_children1(N2,Options4,Options5,Options41), + remove_chains_of_one_children11(N2s,Options41,Options3). +**/ +/** +remove_chains_of_one_children3(Options2,Options4,Options3) :- + member(B, + (append(Options2,[[N1,A,N2]],Options5), + remove_chains_of_one_children1(N2,Options4,Options5,Options3 +**/ + +equals_empty_list([]). + +get_item_n(Exposition,Number1,Item) :- + Number2 is Number1-1, + length(List,Number2), + append(List,[Item|_],Exposition). diff --git a/mindreader/mindreadtestobj-12scrn-2.pl b/mindreader/mindreadtestobj-12scrn-2.pl new file mode 100644 index 0000000000000000000000000000000000000000..aa7896b4899f2151cb496477e4ce86f1a5121b6a --- /dev/null +++ b/mindreader/mindreadtestobj-12scrn-2.pl @@ -0,0 +1,80 @@ +%%use_module(library(pio)). + +:- use_module(library(date)). +:- include('mindreadtestshared'). +:- include('../Text-to-Breasonings/texttobr2qb.pl'). + +%%sectest0 :- +%%repeat,sectest,sectest0. +sectest(S) :- sectest(7,0,S). +sectest(0,S,S):-!. 
+sectest(N,S1,S2):- + writeln(["The computer will think of one of the following thoughts when I let you know."]), + texttobr2(2), %% for 100 As for screen to display white background + texttobr2(2), %% for 100 As for screen to display A + sleep(5), + writeln(["seed"]), + texttobr2(2), %% for 100 As for screen to display white background + texttobr2(2), %% for 100 As for screen to display A + sleep(5), + writeln(["water"]), + texttobr2(2), %% for 100 As for screen to display white background + texttobr2(2), %% for 100 As for screen to display A + sleep(5), + writeln(["dam"]), + texttobr2(2), %% for 100 As for screen to display white background + texttobr2(2), %% for 100 As for screen to display A + sleep(5), + writeln(["redblackduck"]), + texttobr2(2), %% for 100 As for screen to display white background + texttobr2(2), %% for 100 As for screen to display A + sleep(5), + writeln(["lorelle"]), + texttobr2(2), %% for 100 As for screen to display white background + texttobr2(2), %% for 100 As for screen to display A + sleep(5), +/** writeln(["adrian"]), + texttobr2(2), %% for 100 As for screen to display white background + texttobr2(2), %% for 100 As for screen to display A + sleep(5), + **/ + writeln(["The computer will think of one of the following thoughts in 5 seconds.", "seed","water","dam","redblackduck","lorelle"/**,"adrian" + **/ + ]), + texttobr2(2), %% for 100 As for screen to display white background + texttobr2(2), %% for 100 As for screen to display A + sleep(5), + writeln(["Now"]), + sleep(1), + random_member(A,["seed","water","dam","redblackduck","lorelle" + /**,"adrian" + **/ + ]), + %%texttobr2(2), %% for 100 As for screen to display black border + texttobr2(2), %% for 100 As for screen to display white background + texttobr2(2), %% for 100 As for screen to display A + sleep(5), + writeln(["Think of the object the computer thought of in 1 second.", "seed","water","dam","redblackduck","lorelle"/**,"adrian" + **/ + ]), + texttobr2(2), %% for 100 As for 
screen to display white background + texttobr2(2), %% for 100 As for screen to display A + sleep(1), + %% Test thought here. + %%writeln(["Now"]), + %%get_time(TS),stamp_date_time(TS,T,local),writeln([dateandtime,T]), + trialy2_15("seed",R1), + trialy2_15("water",R2), + trialy2_15("dam",R3), + trialy2_15("redblackduck",R4), + trialy2_15("lorelle",R5), + R=[R1,R2,R3,R4,R5],%%,R6,R7,R8,R9,R10,R11,R12,R13,R14,R15,R16,R17,R18,R19,R20,R21,R22,R23,R24,R25,R26,R27], + sort(R,RA), + reverse(RA,RB), + RB=[[_,Item]|_Rest],writeln(["Computer thought",A,"You thought",Item]), + texttobr2(2), %% for 100 As for screen to display white background + texttobr2(2), %% for 100 As for screen to display A + sleep(15), + N1 is N-1,(A=Item->S3 is S1+1;S3=S1), + sectest(N1,S3,S2). + %%read_string(user_input,1,_). diff --git a/mindreader/mindreadtestobj-12scrn-2alpha.pl b/mindreader/mindreadtestobj-12scrn-2alpha.pl new file mode 100644 index 0000000000000000000000000000000000000000..4b688a04a8f5faa966c3c313506df8ea08f0efbd --- /dev/null +++ b/mindreader/mindreadtestobj-12scrn-2alpha.pl @@ -0,0 +1,49 @@ +%%use_module(library(pio)). + +:- use_module(library(date)). +:- include('mindreadtestshared'). +:- include('../Text-to-Breasonings/texttobr2qb.pl'). + +%%sectest0 :- +%%repeat,sectest,sectest0. +%%sectest(S) :- sectest(7,0,S). +sectest(0):-!. 
+sectest(N):- + %%writeln(["Think of the lowercase letter of the alphabet or a space that the computer thought of in 1 second."]), + %%sleep(1), + %% Test thought here., 0.1 s then repeat M times + %%writeln(["Now"]), + %%get_time(TS),stamp_date_time(TS,T,local),writeln([dateandtime,T]), + trialy2_81("a",R1), + trialy2_81("b",R2), + trialy2_81("c",R3), + trialy2_81("d",R4), + trialy2_81("e",R5), + trialy2_81("f",R6), + trialy2_81("g",R7), + trialy2_81("h",R8), + trialy2_81("i",R9), + trialy2_81("j",R10), + trialy2_81("k",R11), + trialy2_81("l",R12), + trialy2_81("m",R13), + trialy2_81("n",R14), + trialy2_81("o",R15), + trialy2_81("p",R16), + trialy2_81("q",R17), + trialy2_81("r",R18), + trialy2_81("s",R19), + trialy2_81("t",R20), + trialy2_81("u",R21), + trialy2_81("v",R22), + trialy2_81("w",R23), + trialy2_81("x",R24), + trialy2_81("y",R25), + trialy2_81("z",R26), + trialy2_81(" ",R27), + R=[R1,R2,R3,R4,R5,R6,R7,R8,R9,R10,R11,R12,R13,R14,R15,R16,R17,R18,R19,R20,R21,R22,R23,R24,R25,R26,R27], + sort(R,RA), + reverse(RA,RB), + RB=[[_,Item]|_Rest],writeln([Item,RB]), + N1 is N-1, + sectest(N1). diff --git a/mindreader/mindreadtestobj-12scrn-2chars.pl b/mindreader/mindreadtestobj-12scrn-2chars.pl new file mode 100644 index 0000000000000000000000000000000000000000..3d3cbbf9cf6243ec616fd4d911fd3eda29242420 --- /dev/null +++ b/mindreader/mindreadtestobj-12scrn-2chars.pl @@ -0,0 +1,80 @@ +%%use_module(library(pio)). + +:- use_module(library(date)). +:- include('mindreadtestshared'). +:- include('../Text-to-Breasonings/texttobr2qb.pl'). + +%%sectest0 :- +%%repeat,sectest,sectest0. +sectest(S) :- sectest(7,0,S). +sectest(0,S,S):-!. 
+sectest(N,S1,S2):- + writeln(["The computer will think of one of the following thoughts when I let you know."]), + texttobr2(2), %% for 100 As for screen to display white background + texttobr2(2), %% for 100 As for screen to display A + sleep(5), + writeln(["a"]), + texttobr2(2), %% for 100 As for screen to display white background + texttobr2(2), %% for 100 As for screen to display A + sleep(5), + writeln(["r"]), + texttobr2(2), %% for 100 As for screen to display white background + texttobr2(2), %% for 100 As for screen to display A + sleep(5), + writeln(["t"]), + texttobr2(2), %% for 100 As for screen to display white background + texttobr2(2), %% for 100 As for screen to display A + sleep(5), + writeln(["e"]), + texttobr2(2), %% for 100 As for screen to display white background + texttobr2(2), %% for 100 As for screen to display A + sleep(5), + writeln(["s"]), + texttobr2(2), %% for 100 As for screen to display white background + texttobr2(2), %% for 100 As for screen to display A + sleep(5), +/** writeln(["adrian"]), + texttobr2(2), %% for 100 As for screen to display white background + texttobr2(2), %% for 100 As for screen to display A + sleep(5), + **/ + writeln(["The computer will think of one of the following thoughts in 5 seconds.", "a","r","t","e","s"/**,"adrian" + **/ + ]), + texttobr2(2), %% for 100 As for screen to display white background + texttobr2(2), %% for 100 As for screen to display A + sleep(5), + writeln(["Now"]), + sleep(1), + random_member(A,["a","r","t","e","s" + /**,"adrian" + **/ + ]), + %%texttobr2(2), %% for 100 As for screen to display black border + texttobr2(2), %% for 100 As for screen to display white background + texttobr2(2), %% for 100 As for screen to display A + sleep(5), + writeln(["Think of the object the computer thought of in 1 second.", "a","r","t","e","s"/**,"adrian" + **/ + ]), + texttobr2(2), %% for 100 As for screen to display white background + texttobr2(2), %% for 100 As for screen to display A + sleep(1), + %% 
Test thought here. + %%writeln(["Now"]), + %%get_time(TS),stamp_date_time(TS,T,local),writeln([dateandtime,T]), + trialy2_15("a",R1), + trialy2_15("r",R2), + trialy2_15("t",R3), + trialy2_15("e",R4), + trialy2_15("s",R5), + R=[R1,R2,R3,R4,R5],%%,R6,R7,R8,R9,R10,R11,R12,R13,R14,R15,R16,R17,R18,R19,R20,R21,R22,R23,R24,R25,R26,R27], + sort(R,RA), + reverse(RA,RB), + RB=[[_,Item]|_Rest],writeln(["Computer thought",A,"You thought",Item]), + texttobr2(2), %% for 100 As for screen to display white background + texttobr2(2), %% for 100 As for screen to display A + sleep(15), + N1 is N-1,(A=Item->S3 is S1+1;S3=S1), + sectest(N1,S3,S2). + %%read_string(user_input,1,_). diff --git a/mindreader/mindreadtestobj-12scrn-2medit.pl b/mindreader/mindreadtestobj-12scrn-2medit.pl new file mode 100644 index 0000000000000000000000000000000000000000..bb475db8b95b5eeb8826324aad3a42029a832cab --- /dev/null +++ b/mindreader/mindreadtestobj-12scrn-2medit.pl @@ -0,0 +1,80 @@ +%%use_module(library(pio)). + +:- use_module(library(date)). +:- include('mindreadtestshared'). +:- include('../Text-to-Breasonings/texttobr2qb.pl'). + +%%sectest0 :- +%%repeat,sectest,sectest0. +sectest(S) :- sectest(7,0,S). +sectest(0,S,S):-!. 
+sectest(N,S1,S2):- + writeln(["The computer will think of one of the following thoughts when I let you know."]), + texttobr2(2), %% for 100 As for screen to display white background + texttobr2(2), %% for 100 As for screen to display A + sleep(5), + writeln(["characterbreasoner"]), + texttobr2(2), %% for 100 As for screen to display white background + texttobr2(2), %% for 100 As for screen to display A + sleep(5), + writeln(["1451"]), + texttobr2(2), %% for 100 As for screen to display white background + texttobr2(2), %% for 100 As for screen to display A + sleep(5), + writeln(["texttobr2"]), + texttobr2(2), %% for 100 As for screen to display white background + texttobr2(2), %% for 100 As for screen to display A + sleep(5), + writeln(["daily_regimen_meditation"]), + texttobr2(2), %% for 100 As for screen to display white background + texttobr2(2), %% for 100 As for screen to display A + sleep(5), + writeln(["list_prolog_interpreter"]), + texttobr2(2), %% for 100 As for screen to display white background + texttobr2(2), %% for 100 As for screen to display A + sleep(5), +/** writeln(["adrian"]), + texttobr2(2), %% for 100 As for screen to display white background + texttobr2(2), %% for 100 As for screen to display A + sleep(5), + **/ + writeln(["The computer will think of one of the following thoughts in 5 seconds.", "characterbreasoner","1451","texttobr2","daily_regimen_meditation","list_prolog_interpreter"/**,"adrian" + **/ + ]), + texttobr2(2), %% for 100 As for screen to display white background + texttobr2(2), %% for 100 As for screen to display A + sleep(5), + writeln(["Now"]), + sleep(1), + random_member(A,["characterbreasoner","1451","texttobr2","daily_regimen_meditation","list_prolog_interpreter" + /**,"adrian" + **/ + ]), + %%texttobr2(2), %% for 100 As for screen to display black border + texttobr2(2), %% for 100 As for screen to display white background + texttobr2(2), %% for 100 As for screen to display A + sleep(5), + writeln(["Think of the object the 
computer thought of in 1 second.", "characterbreasoner","1451","texttobr2","daily_regimen_meditation","list_prolog_interpreter"/**,"adrian" + **/ + ]), + texttobr2(2), %% for 100 As for screen to display white background + texttobr2(2), %% for 100 As for screen to display A + sleep(1), + %% Test thought here. + %%writeln(["Now"]), + %%get_time(TS),stamp_date_time(TS,T,local),writeln([dateandtime,T]), + trialy2_15("characterbreasoner",R1), + trialy2_15("1451",R2), + trialy2_15("texttobr2",R3), + trialy2_15("daily_regimen_meditation",R4), + trialy2_15("list_prolog_interpreter",R5), + R=[R1,R2,R3,R4,R5],%%,R6,R7,R8,R9,R10,R11,R12,R13,R14,R15,R16,R17,R18,R19,R20,R21,R22,R23,R24,R25,R26,R27], + sort(R,RA), + reverse(RA,RB), + RB=[[_,Item]|_Rest],writeln(["Computer thought",A,"You thought",Item]), + texttobr2(2), %% for 100 As for screen to display white background + texttobr2(2), %% for 100 As for screen to display A + sleep(15), + N1 is N-1,(A=Item->S3 is S1+1;S3=S1), + sectest(N1,S3,S2). + %%read_string(user_input,1,_). diff --git a/mindreader/mindreadtestshared.pl b/mindreader/mindreadtestshared.pl new file mode 100644 index 0000000000000000000000000000000000000000..49147e17831f7b793bad0183bd0505afaef4ba43 --- /dev/null +++ b/mindreader/mindreadtestshared.pl @@ -0,0 +1,422 @@ +%% Name, DOB, Date learned, psych appointment month=1 or 2, psych appointment day, thoughts count + +sectest0 :- +sectest([first,last,dobd,dobm,doby,daylearned,monthlearned,yearlearned,1,0,16]), +sectest([first,last,dobd,dobm,doby,daylearned,monthlearned,yearlearned,1,0,16]). + +sectest1 :- +sectest([first,last,dobd,dobm,doby,daylearned,monthlearned,yearlearned,1,0,16]), +sectest([first,last,dobd,dobm,doby,daylearned,monthlearned,yearlearned,1,0,16]). 
+ +find_time(H,M,S) :- + trialy2_15("0",H11), + trialy2_15("1",H12), + trialy2_15("2",H13), + H1L=[H11,H12,H13], + sort(H1L,H1A), + reverse(H1A,H1B), + H1B=[[_,H1]|_Rest1], + + (H1="2"->( + trialy2_30("0",H21), + trialy2_30("1",H22), + trialy2_30("2",H23), + trialy2_30("3",H24), + H2L=[H21,H22,H23,H24], + sort(H2L,H2A), + reverse(H2A,H2B), + H2B=[[_,H2]|_Rest2] + ) + ;( + trialy2_30("0",H21), + trialy2_30("1",H22), + trialy2_30("2",H23), + trialy2_30("3",H24), + trialy2_30("4",H25), + trialy2_30("5",H26), + trialy2_30("6",H27), + trialy2_30("7",H28), + trialy2_30("8",H29), + trialy2_30("9",H210), + + H2L=[H21,H22,H23,H24,H25, + H26,H27,H28,H29,H210], + sort(H2L,H2A), + reverse(H2A,H2B), + H2B=[[_,H2]|_Rest2] + )), + + trialy2_15("0",M11), + trialy2_15("1",M12), + trialy2_15("2",M13), + trialy2_15("3",M14), + trialy2_15("4",M15), + trialy2_15("5",M16), + M1L=[M11,M12,M13,M14,M15,M16], + sort(M1L,M1A), + reverse(M1A,M1B), + M1B=[[_,M1]|_Rest3], + + trialy2_30("0",M21), + trialy2_30("1",M22), + trialy2_30("2",M23), + trialy2_30("3",M24), + trialy2_30("4",M25), + trialy2_30("5",M26), + trialy2_30("6",M27), + trialy2_30("7",M28), + trialy2_30("8",M29), + trialy2_30("9",M210), + M2L=[M21,M22,M23,M24,M25,M26,M27,M28,M29,M210], + sort(M2L,M2A), + reverse(M2A,M2B), + M2B=[[_,M2]|_Rest4], + + trialy2_15("0",S11), + trialy2_15("1",S12), + trialy2_15("2",S13), + trialy2_15("3",S14), + trialy2_15("4",S15), + trialy2_15("5",S16), + S1L=[S11,S12,S13,S14,S15,S16], + sort(S1L,S1A), + reverse(S1A,S1B), + S1B=[[_,S1]|_Rest5], + + trialy2_30("0",S21), + trialy2_30("1",S22), + trialy2_30("2",S23), + trialy2_30("3",S24), + trialy2_30("4",S25), + trialy2_30("5",S26), + trialy2_30("6",S27), + trialy2_30("7",S28), + trialy2_30("8",S29), + trialy2_30("9",S210), + S2L=[S21,S22,S23,S24,S25,S26,S27,S28,S29,S210], + sort(S2L,S2A), + reverse(S2A,S2B), + S2B=[[_,S2]|_Rest6], + + string_concat(H1,H2,H), + string_concat(M1,M2,M), + string_concat(S1,S2,S). 
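Aside (not part of the patch): the sort-then-reverse-then-take-head pattern is repeated a dozen times inside find_time/3. It can be collapsed into a single helper — `pick_best/2` is a hypothetical name — using sort/4 with a descending order, which also keeps duplicate counts:

```prolog
% Hypothetical helper: pick the Label whose [Count, Label] pair has the
% highest Count. sort/4 with @>= sorts descending and keeps duplicates.
pick_best(Pairs, Best) :-
    sort(0, @>=, Pairs, [[_, Best] | _]).
```

With this, e.g. `sort(H1L,H1A), reverse(H1A,H1B), H1B=[[_,H1]|_Rest1]` reduces to `pick_best(H1L, H1)`.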
+ +trialy2_6(Label,RA) :- + %%writeln([testing,Label]), + trialy1(R1), + trialy1(R2), + trialy1(R3), + trialy1(R4), + trialy1(R5), + trialy1(R6), /** + trialy1(R7), + trialy1(R8), + trialy1(R9), + trialy1(R10), + trialy1(R11), + trialy1(R12), + trialy1(R13), + trialy1(R14), + trialy1(R15), + trialy1(R16), + trialy1(R17), + trialy1(R18), + trialy1(R19), + trialy1(R20), + trialy1(R21), + trialy1(R22), + trialy1(R23), + trialy1(R24), + trialy1(R25), + trialy1(R26), + trialy1(R27), + trialy1(R28), + trialy1(R29), + trialy1(R30), **/ + R=[R1,R2,R3,R4,R5,R6 /**,R7,R8,R9,R10, + R11,R12,R13,R14,R15,R16,R17,R18,R19,R20, + R21,R22,R23,R24,R25,R26,R27,R28,R29,R30 **/ + ], + %%(member(true,R)->( + aggregate_all(count, member(true,R), Count), + RA=[Count,Label].%%,writeln([Label,Count,"/10"]));true). + +trialy2_15(Label,RA) :- + %%writeln([testing,Label]), + trialy1(R1), + trialy1(R2), + trialy1(R3), + trialy1(R4), + trialy1(R5), + trialy1(R6), + trialy1(R7), + trialy1(R8), + trialy1(R9), + trialy1(R10), + trialy1(R11), + trialy1(R12), + trialy1(R13), + trialy1(R14), + trialy1(R15), + /** + trialy1(R16), + trialy1(R17), + trialy1(R18), + trialy1(R19), + trialy1(R20), + trialy1(R21), + trialy1(R22), + trialy1(R23), + trialy1(R24), + trialy1(R25), + trialy1(R26), + trialy1(R27), + trialy1(R28), + trialy1(R29), + trialy1(R30),**/ + R=[R1,R2,R3,R4,R5,R6,R7,R8,R9,R10, + R11,R12,R13,R14,R15 /**,R16,R17,R18,R19,R20, + R21,R22,R23,R24,R25,R26,R27,R28,R29,R30 + **/], + %%(member(true,R)->( + aggregate_all(count, member(true,R), Count), + RA=[Count,Label].%%,writeln([Label,Count,"/10"]));true). 
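Aside (not part of the patch): trialy2_6/2 and trialy2_15/2 above, and trialy2_30/2 and trialy2_81/2 below, differ only in how many trialy1/1 calls they unroll. A single parameterised sketch — `trialy2_n/3` is a hypothetical name, and trialy1/1 is stubbed with a coin flip here so the snippet runs standalone:

```prolog
% Stub for trialy1/1 (assumption: the real one is defined in this file).
trialy1(R) :- random(X), ( X > 0.5 -> R = true ; R = fail ).

% Run trialy1/1 N times and count the successes, generalising the
% unrolled trialy2_6 / trialy2_15 / trialy2_30 / trialy2_81 predicates.
trialy2_n(N, Label, [Count, Label]) :-
    findall(R, (between(1, N, _), trialy1(R)), Rs),
    aggregate_all(count, member(true, Rs), Count).
```

Usage: `trialy2_n(15, "seed", RA)` plays the same role as `trialy2_15("seed", RA)`.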
+ +trialy2_30(Label,RA) :- + %%writeln([testing,Label]), + trialy1(R1), + trialy1(R2), + trialy1(R3), + trialy1(R4), + trialy1(R5), + trialy1(R6), + trialy1(R7), + trialy1(R8), + trialy1(R9), + trialy1(R10), + trialy1(R11), + trialy1(R12), + trialy1(R13), + trialy1(R14), + trialy1(R15), + trialy1(R16), + trialy1(R17), + trialy1(R18), + trialy1(R19), + trialy1(R20), + trialy1(R21), + trialy1(R22), + trialy1(R23), + trialy1(R24), + trialy1(R25), + trialy1(R26), + trialy1(R27), + trialy1(R28), + trialy1(R29), + trialy1(R30), + R=[R1,R2,R3,R4,R5,R6,R7,R8,R9,R10, + R11,R12,R13,R14,R15,R16,R17,R18,R19,R20, + R21,R22,R23,R24,R25,R26,R27,R28,R29,R30], + %%(member(true,R)->( + aggregate_all(count, member(true,R), Count), + RA=[Count,Label].%%,writeln([Label,Count,"/10"]));true). + +trialy2_81(Label,RA) :- + %%writeln([testing,Label]), + trialy1(R1), + trialy1(R2), + trialy1(R3), + trialy1(R4), + trialy1(R5), + trialy1(R6), + trialy1(R7), + trialy1(R8), + trialy1(R9), + trialy1(R10), + trialy1(R11), + trialy1(R12), + trialy1(R13), + trialy1(R14), + trialy1(R15), + trialy1(R16), + trialy1(R17), + trialy1(R18), + trialy1(R19), + trialy1(R20), + trialy1(R21), + trialy1(R22), + trialy1(R23), + trialy1(R24), + trialy1(R25), + trialy1(R26), + trialy1(R27), + trialy1(R28), + trialy1(R29), + trialy1(R30), + trialy1(R31), + trialy1(R32), + trialy1(R33), + trialy1(R34), + trialy1(R35), + trialy1(R36), + trialy1(R37), + trialy1(R38), + trialy1(R39), + trialy1(R40), + trialy1(R41), + trialy1(R42), + trialy1(R43), + trialy1(R44), + trialy1(R45), + trialy1(R46), + trialy1(R47), + trialy1(R48), + trialy1(R49), + trialy1(R50), + trialy1(R51), + trialy1(R52), + trialy1(R53), + trialy1(R54), + trialy1(R55), + trialy1(R56), + trialy1(R57), + trialy1(R58), + trialy1(R59), + trialy1(R60), + trialy1(R61), + trialy1(R62), + trialy1(R63), + trialy1(R64), + trialy1(R65), + trialy1(R66), + trialy1(R67), + trialy1(R68), + trialy1(R69), + trialy1(R70), + trialy1(R71), + trialy1(R72), + trialy1(R73), + 
trialy1(R74), + trialy1(R75), + trialy1(R76), + trialy1(R77), + trialy1(R78), + trialy1(R79), + trialy1(R80), + trialy1(R81), + R=[ + R1,R2,R3,R4,R5,R6,R7,R8,R9,R10, + R11,R12,R13,R14,R15,R16,R17,R18,R19,R20, + R21,R22,R23,R24,R25,R26,R27,R28,R29,R30, + R31,R32,R33,R34,R35,R36,R37,R38,R39,R40, + R41,R42,R43,R44,R45,R46,R47,R48,R49,R50, + R51,R52,R53,R54,R55,R56,R57,R58,R59,R60, + R61,R62,R63,R64,R65,R66,R67,R68,R69,R70, + R71,R72,R73,R74,R75,R76,R77,R78,R79,R80, + R81 + ], + %%(member(true,R)->( + aggregate_all(count, member(true,R), Count), + RA=[Count,Label].%%,writeln([Label,Count,"/10"]));true). + +trialy1(R1) :- + %%control11(A1), + trial0(A22), %% Control + sum(A22,0,S22), + mean(S22,A1), + trial0(A21), %% Test 1 + sum(A21,0,S02), + mean(S02,A2), + (A1>A2->R1=true;R1=fail). + +trial0(S3) :- N is 10, %trial1(N,[],S),trial01(S,S3). +catch( + (trial1(N,[],S),trial01(S,S3)), + _, + (trial0(S3)%,writeln(S3) + ) + ). + +trial01(S1,S3) :- + sort(S1,S), + %%midpoint(S,MP), + halves(S,H1,H2), + midpoint(H1,Q1), + midpoint(H2,Q3), + IQR is Q3-Q1, + sum(S,0,S02), + mean(S02,Mean), + furthestfrommean(S,Mean,V), + D1 is 1.5*IQR, + D2 is V-Mean, + (D2>D1->(delete(S,V,S2),trial01(S2,S3));S=S3). + +trial1(0,A,A) :- !. +trial1(N,A,B) :- mindreadtest(S), append(A,[S],A2), + N1 is N-1,trial1(N1,A2,B). + +midpoint(S,MP) :- + length(S,L), + A is mod(L,2), + (A is 0-> + (M1 is L/2, M2 is M1+1,N1 is M1-1,N2 is M2-1,length(N11,N1),length(N21,N2),append(N11,[N12|_Rest1],S),append(N21,[N22|_Rest2],S),MP is (N12+N22)/2) + ; + (L2 is L+1, M1 is L2/2, N1 is M1-1,length(N11,N1),append(N11,[MP|_Rest],S))). + +halves(S,H1,H2) :- + length(S,L), + A is mod(L,2), + (A is 0-> + (M1 is L/2,length(H1,M1),append(H1,H2,S)) + ; + (L2 is L-1,M1 is L2/2,length(H1,M1),append(H1,[_|H2],S))). + +sum([],S,S):-!. +sum(S0,S1,S2) :- + S0=[S3|S4], + S5 is S1+S3, + sum(S4,S5,S2). + +mean(Sum,Mean) :- + Mean is Sum/2. 
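Aside (not part of the patch): mean/2 above divides by a hard-coded 2, although trial0/1 feeds it the sum of ten samples from trial1/3 and trial01/2 feeds it lists whose length shrinks as outliers are removed. A length-aware sketch — `mean_of/2` is a hypothetical replacement taking the sample list directly:

```prolog
% Mean over a list of samples, dividing by the actual sample count.
mean_of(Samples, Mean) :-
    sum_list(Samples, Sum),
    length(Samples, N),
    N > 0,
    Mean is Sum / N.
```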
+ +furthestfrommean(S,Mean,V) :- + absdiffmean(S,Mean,[],D), + sort(D,D1), + reverse(D1,[[_,V]|_Rest]). + +absdiffmean([],_M,D,D) :- !. +absdiffmean(S,M,D1,D2) :- + S=[S1|S2], + S3 is abs(S1-M), + append(D1,[[S3,S1]],D3), + absdiffmean(S2,M,D3,D2). + +mindreadtest(Sec) :- + %% 250 br for characters to be br out with 10 br each from person to me - do when initial 250 br test done and doing 10 br test + %%comment(fiftyastest), + %%random(X),X1 is 10*X, X2 is floor(X1), (X2=<2 -> ( + %%texttobr,writeln(['true test']), %%); %% use breasonings breasoned out by computer for not by me, for job medicine for "me", at last time point + %%true), %% leave last time point blank + %%**texttobr2(640);true),%% make an A to detect reaction to gracious giving or blame of in following + get_time(TimeStamp1), + %%phrase_from_file(string(_String), 'file.txt'), + texttobr2(2), %% 100 As for answer (must be br before this on same day) + %% is gracious giving or blame + get_time(TimeStamp2), + %%comment(turnoffas), + Sec is TimeStamp2 - TimeStamp1. + +shell1(Command) :- + (bash_command(Command,_)-> + true; + (writeln(["Failed shell1 command: ",Command]),abort) + ),!. + +bash_command(Command, Output) :- + setup_call_cleanup(process_create(path(bash), + ['-c', Command], + [stdout(pipe(Out))]), + read_string(Out, _, Output), + close(Out)). diff --git a/mindreader/mr_tree.pl b/mindreader/mr_tree.pl new file mode 100644 index 0000000000000000000000000000000000000000..43f22c622d56b4cbb60a21b5de6fb4ad8d49ec28 --- /dev/null +++ b/mindreader/mr_tree.pl @@ -0,0 +1,421 @@ +%% mr_tree.pl + +/* +findbest(R,R) :-!. +findbest2(R,Item):- + sort(R,RA), + reverse(RA,RB), + RB=[[_,Item]|_Rest]. + + concat_list2A(A1,B):- + A1=[A|List], + concat_list2A(A,List,B),!. + +concat_list2A(A,[],A):-!. +concat_list2A(A,List,B) :- + List=[Item|Items], + string_concat(A,Item,C), + concat_list2A(C,Items,B). +*/ + +mind_read(Item,[Item]) :- !. 
+mind_read(Item,List0) :- + +(lists_of_same_length1(List0)->mind_read_a(Item,List0); +%% segment decision trees + +mind_read_b(Item,List0)). +%% 1 decision tree + + +mind_read_a(Item,List0) :- + +/** + findall(L,(member(Item,List0),string_length(Item,L)),Ls1), + sort(Ls1,Ls2), + reverse(Ls2,Ls3), + Ls3=[Maximum_length|_], + numbers(Maximum_length,1,[],Numbers1), +**/ + + mind_read_a_1(List0,[],Item). + +%mind_read_a_1([List0],Item1_a,Item1_a2) :- +% append(Item1_a,[List0],Item1_a2),!. +mind_read_a_1(List0,Item1_a,Item1_a2) :- + %List0=[List0_1|_Rest], + (maplist(equals_empty_list,List0)->(%trace,%append(Item1_a,[Item],Item1_a2)); + Item1_a=Item1_a2); +( + %trace, +findall(Item2_a,(member(List0_1,List0),get_item_n(List0_1,1,Item2_a)),Item2_a1), + sort(Item2_a1,Item2_a2), + + findall(D1,(member(D2,Item2_a2), + term_to_atom(D2,D3),string_atom(D1,D3)),List1), +%trace, + + %List1=A, + findall(B,(member(C,List1),string_concat(C," 01",B)),List2), + findall(B,(member(C,List2),(number(C)->number_string(C,B)->true;((atom(C)->atom_string(C,B))->true;(string(C),C=B)))),List3), + + %trace, + + minimise_strings1(List3,List4,Map), + +%writeln1(minimise_strings1(List3,List4,Map)), + % findall(B,(member(C,List13),string_concat(C," 01",B)),List), + + %notrace, + %trace, + %writeln1(make_mind_reading_tree4(List,Tree)), + make_mind_reading_tree4(List4,Tree), + %writeln1(make_mind_reading_tree4-here1(List,Tree)), + +%writeln1(mind_read2(1,Tree,Item1)), + mind_read2(1,Tree,Item1), +writeln1(mind_read2(1,Tree,Item1)), +writeln(""), + %trace, + %string_concat(Item3," 01",Item1), + find_mapped_item(Item1,Item2,Map), + term_to_atom(Item,Item2), + + (findall(Rest1,(member(Item1_b,List0),%get_item_b(Item1_b,Number,Item1_b1),%Number2 is Number+1,get_item_b(Item1_b,Number2,Item1_b2) + Item1_b=[Item|Rest1]),Item1_b2), + append(Item1_a,[Item],Item1_a3), + mind_read_a_1(Item1_b2,Item1_a3,Item1_a2)))). 
+ +mind_read_b(Item,List0) :- + findall(D1,(member(D2,List0),term_to_atom(D2,D3),string_atom(D1,D3)),List1), +%trace, + + %List1=A, + findall(B,(member(C,List1),string_concat(C," 01",B)),List2), + findall(B,(member(C,List2),(number(C)->number_string(C,B)->true;((atom(C)->atom_string(C,B))->true;(string(C),C=B)))),List3), + + %trace, + + minimise_strings1(List3,List4,Map), + +%writeln1(minimise_strings1(List3,List4,Map)), + % findall(B,(member(C,List13),string_concat(C," 01",B)),List), + + %notrace, + %trace, + %writeln1(make_mind_reading_tree4(List,Tree)), + make_mind_reading_tree4(List4,Tree), + %writeln1(make_mind_reading_tree4-here1(List,Tree)), + +%writeln1(mind_read2(1,Tree,Item1)), + mind_read2(1,Tree,Item1), +writeln1(mind_read2(1,Tree,Item1)), +writeln(""), + %trace, + %string_concat(Item3," 01",Item1), + find_mapped_item(Item1,Item2,Map), + term_to_atom(Item,Item2). + +mind_read2(N1,Tree1,Item1) :- + findall(Option,member([N1,Option,N2],Tree1),Options), + findall([N1,Option,N2],member([N1,Option,N2],Tree1),Options2), + %subtract(Tree1,Options,Tree2), + mind_read10(Item2,Options), + mind_read3(N1,Options2,Options,Tree1,Item2,Item1). + + +numbers(N2,N1,Numbers,Numbers) :- + N2 is N1-1,!. +numbers(N2,N1,Numbers1,Numbers2) :- + N3 is N1+1, + append(Numbers1,[N1],Numbers3), + numbers(N2,N3,Numbers3,Numbers2). + +mind_read3(N1,_,_,Tree1,Item2,Item1) :- + member([N1,Item2,[-,Item1]],Tree1),!. +mind_read3(N1,Options2,_Options,Tree1,Item2,Item1) :- +%trace, + %subtract2(Tree1,Options,[],Tree2), + member([N1,Item2,N2],Options2), + mind_read2(N2,Tree1,Item1). + +mind_read10("",[]) :- !. +mind_read10(Item,[Item]) :- + writeln1([item,Item]),!. +mind_read10(Item,List) :- +writeln1([list,List]), +%trace, +%%catch( + (trialy2(List,R1), + findbest(R1,Item), + writeln1([item,Item])) + %_, + %mind_read10(Item,List) + %) + , +!. + %%random_member(Item,List),!. 
+ +mind_read100(Item,List) :- + length(List,L), + Trials is 3*L, + trialy22(List,Trials,[],R1), + findbest(R1,Item),!. + +mind_read_instruments(Instrument,_) :- + instruments_a(Instruments), + %notrace, + %writeln1(mind_read2(1,Instruments,Item1)), + + mind_read2(1,Instruments,Item1),%->trace;trace), + %writeln1(mind_read2-here3(1,Instruments,Item1)), + + %trace, + string_concat(Item2," 01",Item1), + term_to_atom(Instrument,Item2). + +trialy2([],R) :- + R=[[_,['C']]]. + %%writeln([[],in,trialy2]),abort. +trialy2(List,R) :- +%%writeln([list,List]), +%%notrace, + length(List,Length), + ((Length=<9-> + findr4(R4), + number_string(R4,R4A), + formr5([R4A],9,Length,R5), + findr(R5,List,R)); + (Length=<99-> + findr4(R41), + findr4(R42), + formr5([R41,R42],99,Length,R5), + findr(R5,List,R)); + (Length=<999-> + findr4(R41), + findr4(R42), + findr4(R43), + formr5([R41,R42,R43],999,Length,R5), + findr(R5,List,R)); + fail), + %%writeln([r,R]),trace. + true. + +findr4(R4) :- + List1=[0,1,2,3,4,5,6,7,8,9], + Trials is 30, +%catch( + (trialy22(List1,Trials,[],R1), + findbest2(R1,R4) + %writeln1([item,Item]) + ) + %_, + %findr4(R4) + %) + . + %%number_string(R3,R2), +formr5(RList,Upper,Length,R5) :- + %%findall(D,(member(C,RList),floor(C,D)),RList2), + concat_list2A(RList,R5A), + number_string(R5B,R5A), + R51 is floor((R5B/Upper)*Length), + (R5B=Upper->R5 is R51-1;R5=R51). +findr(R4,List,R) :- + %%floor(R4,R4A), + length(A,R4), + append(A,[R|_],List). + + %%random_member(A,List), + %%R=[[_,A]]. + + /** + length(List,L), + Trials is L*3, + trialy22(List,Trials,[],R).**/ + +trialy22([],_,R,R) :- !. +trialy22(List,Trials,RA,RB) :- + List=[Item|Items], + trialy21(Item,Trials,R1), + append(RA,[R1],RC), + trialy22(Items,Trials,RC,RB),!. + +trialy21(Label,Trials,RA) :- + trialy3(Trials,[],R), + aggregate_all(count, member(true,R), Count), + RA=[Count,Label]. + +trialy3(0,R,R) :-!. 
+trialy3(Trials1,RA,RB) :- + trialy1(R1), + append(RA,[R1],RC), + Trials2 is Trials1-1, + trialy3(Trials2,RC,RB),!. + +%% try other nouns +trialy1(R1) :- + %%control11(A1), + trial0(A22), %% Control + sum(A22,0,S22), + mean(S22,A1), + trial0(A21), %% Test 1 + sum(A21,0,S02), + mean(S02,A2), + (A1>A2->R1=true;R1=fail). + +trial0(S3) :- N is 10, +catch( + (trial1(N,[],S),trial01(S,S3)), + _, + trial0(S3)). +trial01(S1,S3) :- + sort(S1,S), + %%midpoint(S,MP), + halves(S,H1,H2), + midpoint(H1,Q1), + midpoint(H2,Q3), + IQR is Q3-Q1, + sum(S,0,S02), + mean(S02,Mean), + furthestfrommean(S,Mean,V), + D1 is 1.5*IQR, + D2 is V-Mean, + (D2>D1->(delete(S,V,S2),trial01(S2,S3));S=S3). + +trial1(0,A,A) :- !. +trial1(N,A,B) :- mindreadtest(S), append(A,[S],A2), + N1 is N-1,trial1(N1,A2,B). + +midpoint(S,MP) :- + length(S,L), + A is mod(L,2), + (A is 0-> + (M1 is L/2, M2 is M1+1,N1 is M1-1,N2 is M2-1,length(N11,N1),length(N21,N2),append(N11,[N12|_Rest1],S),append(N21,[N22|_Rest2],S),MP is (N12+N22)/2) + ; + (L2 is L+1, M1 is L2/2, N1 is M1-1,length(N11,N1),append(N11,[MP|_Rest],S))). + +halves(S,H1,H2) :- + length(S,L), + A is mod(L,2), + (A is 0-> + (M1 is L/2,length(H1,M1),append(H1,H2,S)) + ; + (L2 is L-1,M1 is L2/2,length(H1,M1),append(H1,[_|H2],S))). + +sum([],S,S):-!. +sum(S0,S1,S2) :- + S0=[S3|S4], + S5 is S1+S3, + sum(S4,S5,S2). + +mean(Sum,Mean) :- + Mean is Sum/2. + +furthestfrommean(S,Mean,V) :- + absdiffmean(S,Mean,[],D), + sort(D,D1), + reverse(D1,[[_,V]|_Rest]). + +absdiffmean([],_M,D,D) :- !. +absdiffmean(S,M,D1,D2) :- + S=[S1|S2], + S3 is abs(S1-M), + append(D1,[[S3,S1]],D3), + absdiffmean(S2,M,D3,D2). 
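Aside (not part of the patch): trial01/2's outlier filter computes quartiles by the median-of-halves method via midpoint/2 and halves/3. A standalone restatement of that quartile/IQR step is below; note it uses msort/2, which keeps duplicate timings, whereas trial01/2's sort/2 silently drops duplicates before the statistics are taken.

```prolog
% Interquartile range by the median-of-halves method: split the sorted
% samples into lower and upper halves (dropping the middle element when
% the length is odd, as halves/3 does) and take each half's median.
iqr(Samples, IQR) :-
    msort(Samples, S),              % sort, keeping duplicates
    length(S, L), H is L // 2,
    length(H1, H), append(H1, Rest, S),
    (   0 is L mod 2 -> H2 = Rest ; Rest = [_|H2] ),
    median(H1, Q1), median(H2, Q3),
    IQR is Q3 - Q1.

% Median of an already-sorted list.
median(S, M) :-
    length(S, L), I is L // 2,
    (   0 is L mod 2
    ->  nth0(I, S, A), J is I - 1, nth0(J, S, B), M is (A + B) / 2
    ;   nth0(I, S, M)
    ).
```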
+ +mindreadtest(Sec) :- + %% 250 br for characters to be br out with 10 br each from person to me - do when initial 250 br test done and doing 10 br test + %%comment(fiftyastest), + %%random(X),X1 is 10*X, X2 is floor(X1), (X2=<2 -> ( + %%texttobr,writeln(['true test']), %%); %% use breasonings breasoned out by computer for not by me, for job medicine for "me", at last time point + %%true), %% leave last time point blank + %%**texttobr2(640);true),%% make an A to detect reaction to gracious giving or blame of in following + get_time(TimeStamp1), + %%phrase_from_file(string(_String), 'file.txt'), + texttobr2(2), %% 100 As for answer (must be br before this on same day) + %% is gracious giving or blame + get_time(TimeStamp2), + %%comment(turnoffas), + Sec is TimeStamp2 - TimeStamp1. + +lists_of_same_length1(List0) :- + list1(List0,_,_), + List0=[Item1|_], + %term_to_atom(Item1,Item2), + (list1(Item1,_,_)->Item2=Item1;fail),%[Item1]=Item2), + length(Item2,Item2_l), + lists_of_same_length2(List0,Item2_l). + +lists_of_same_length2([],_Item2_l) :- !. +lists_of_same_length2(List0,Item2_l) :- + List0=[Item1|Rest], + (list1(Item1,_,_)->Item2=Item1;fail),%[Item1]=Item2), + %term_to_atom(Item1,Item2), + length(Item2,Item2_l), + lists_of_same_length2(Rest,Item2_l). + +minimise_strings1([List0],[List0],[[List0,List0]]) :- + string(List0),!. +minimise_strings1(List0,A,Map) :- + sort(List0,List1), + findall(L,(member(Item,List1),string_length(Item,L)),Ls1), + sort(Ls1,Ls2), + reverse(Ls2,Ls3), + Ls3=[Maximum_length|_], + numbers(Maximum_length,1,[],Numbers1), + minimise_strings11(Numbers1,_Numbers2,List1,List2,First_part), + string_length(First_part,First_part_l), + Maximum_length2 is Maximum_length-First_part_l+1, + numbers(Maximum_length2,1,[],Numbers3), + minimise_strings2(Numbers3,List2,First_part,A,Map). + +minimise_strings11(Numbers,Numbers,[],[],_) :- !. 
+minimise_strings11(Numbers1,Numbers2,List1,List2,First_part) :- + Numbers1=[Number|Numbers3], + findall(Item2,(member(Item1,List1),string_concat(Item2,_,Item1), + string_length(Item2,Number)),Item3), + Item3=[First_part2|_], + string_concat(First_part1,Char,First_part2), + string_length(Char,1), + sort(Item3,Item3a), + length(Item3,Item3_l), + length(List1,List1_l), + (Item3_l=List1_l-> + (Number2 is Number-1, + findall(Item4,(member(Item1,List1),string_concat(Item2,Item4,Item1), + string_length(Item2,Number2)),Item4a), + (length(Item3a,1)-> + minimise_strings11(Numbers3,Numbers2,List1,List2,First_part); + (Numbers2=Numbers1,List2=Item4a,First_part=First_part1))); + (Numbers2=Numbers1,List2=List1,First_part=First_part1)). + + +minimise_strings2([],_List1,_,_A,_Map) :- !. +minimise_strings2(Numbers1,List1,First_part,A1,Map1) :- + Numbers1=[Number|Numbers2], + findall(Item3a,(member(Item1,List1),string_concat(Item2,_,Item1), + string_length(Item2,Number),string_concat(Item2," 01",Item3a)),Item3), + sort(Item3,Item31), +%trace, +findall([Item3a,Item1a],(member(Item1,List1),string_concat(Item2,_,Item1), + string_length(Item2,Number),string_concat(Item2," 01",Item3a),string_concat(First_part,Item1,Item1a)),Item4), + length(Item31,Length1), + length(Item3,Length2), + (Length1=Length2->(A1=Item3,Map1=Item4); + minimise_strings2(Numbers2,List1,First_part,A1,Map1)). + +find_mapped_item(Item3,Item2,Map) :- + member([Item3,Item4],Map), + string_concat(Item2," 01",Item4). + +too_long1(List4) :- + too_long2(List4,0,N), + N>=250. + +too_long2([],N,N) :- !. +too_long2(List4,N1,N2) :- + List4=[Item|Rest], + string_length(Item,Length), + N3 is N1+Length, + too_long2(Rest,N3,N2). + diff --git a/mindreader/postsong.pl b/mindreader/postsong.pl new file mode 100644 index 0000000000000000000000000000000000000000..c5c3aad3c4af6e32edf5abeb18d27bab4bc26f1a --- /dev/null +++ b/mindreader/postsong.pl @@ -0,0 +1,32 @@ +:- include('../Text-to-Breasonings/text_to_breasonings.pl'). 
+ +postsong(0) :- !. +postsong(N1) :- + texttobr2(3), %% give self breasonings + texttobr2(20), %%Feature 1 + texttobr2(20), %%Updates + texttobr2(20), %%Feature 2 + texttobr2(20), %%Updates + texttobr2(20), %%Feature 3 + texttobr2(20), %%Updates + texttobr2(100), %%Icon + texttobr2(20), %%Updates + texttobr2(32), %%Lyrics + texttobr2(36), %%Music + texttobr2(20), %%Updates + + texttobr2(2), %%Medicine + texttobr2(20), %%Updates + texttobr2(2), %%Sales + texttobr2(20), %%Updates + texttobr2(2), %%Marketing + texttobr2(20), %%Updates + + texttobr2(2), %%Graciously give or blame listener for colour imagery + texttobr2(20), %%Updates + + texttobr2(2), %%Play song + texttobr2(2), + + N2 is N1-1, + postsong(N2). \ No newline at end of file diff --git a/mindreader/screen.pl b/mindreader/screen.pl new file mode 100644 index 0000000000000000000000000000000000000000..532a5b7e1c203e148a8b8c5b403b2dc5acc3c551 --- /dev/null +++ b/mindreader/screen.pl @@ -0,0 +1,116 @@ +%% screen.pl + +%% Breason out 10 scrnsongs + +:- include('../Text-to-Breasonings/text_to_breasonings.pl'). + +%% run postsong/postsong. +%% before sectest0. 
+ +sectest0 :- + %% Pixel grid is 10 cm above centre of MacBook Air camera + %% Pixels are 1,1,1 mm, facing west + %% Pixels are on for 5 seconds + %% 0,0 is origin + + %% For Cosmology to work: + texttobr2(1000), %% 100 done-up As * 10 songs + texttobr2(2), %% dot on graciously give, graciously give + texttobr2(20), %% spiritually play 10 songs to light pixel + + /**light([ + [0,0,black],[1,0,white], + [0,1,black],[1,1,white], + [0,2,black],[1,2,white], + [0,3,black],[1,3,white], + [0,4,black],[1,4,white], + [0,5,black],[1,5,white], + [0,6,black],[1,6,white], + [0,7,black],[1,7,white], + [0,8,black],[1,8,white], + [0,9,black],[1,9,white] + ]).**/ + + /**light([[1,9,white],[2,9,white],[3,9,black],[4,9,white],[5,9,white], + [1,8,white],[2,8,white],[3,8,black],[4,8,white],[5,8,white], + [1,7,white],[2,7,black],[3,7,white],[4,7,black],[5,7,white], + [1,6,white],[2,6,black],[3,6,white],[4,6,black],[5,6,white], + [1,5,white],[2,5,black],[3,5,white],[4,5,black],[5,5,white], + [1,4,black],[2,4,white],[3,4,white],[4,4,white],[5,4,black], + [1,3,black],[2,3,white],[3,3,white],[4,3,white],[5,3,black], + [1,2,white],[2,2,white],[3,2,white],[4,2,white],[5,2,white], + [1,1,white],[2,1,white],[3,1,white],[4,1,white],[5,1,white]]). +**/ + +%% 1 black, 2 white, 3 red, 4 orange, 5 yellow, 6 green, 7 blue, 8 purple, 9 brown, 10 grey + random(X),Y is round(10*X), + light(Y,[[1,9,0],[2,9,0],[3,9,1],[4,9,0],[5,9,0], + [1,8,0],[2,8,0],[3,8,1],[4,8,0],[5,8,0], + [1,7,0],[2,7,1],[3,7,0],[4,7,1],[5,7,0], + [1,6,0],[2,6,1],[3,6,0],[4,6,1],[5,6,0], + [1,5,0],[2,5,1],[3,5,0],[4,5,1],[5,5,0], + [1,4,1],[2,4,0],[3,4,0],[4,4,0],[5,4,1], + [1,3,1],[2,3,0],[3,3,0],[4,3,0],[5,3,1], + [1,2,0],[2,2,0],[3,2,0],[4,2,0],[5,2,0], + [1,1,0],[2,1,0],[3,1,0],[4,1,0],[5,1,0]]), + + writeln("Displaying 'A' 10 cm above MacBook Air's camera"), + sleep(5), + writeln(["1 black, 2 white, 3 red, 4 orange, 5 yellow, 6 green, 7 blue, 8 purple, 9 brown, 10 grey\nColour of Letter A=",Y]). 
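Aside (not part of the patch): the numeric colour codes used by sectest0/0 are documented only in a comment; a hypothetical lookup table makes them queryable. Note also that `random(X), Y is round(10*X)` ranges over 0..10, and Y=0 has no named colour; `random_between(1, 10, Y)` would draw uniformly from exactly the named codes.

```prolog
% Colour codes per the comment in screen.pl:
% 1 black, 2 white, 3 red, 4 orange, 5 yellow,
% 6 green, 7 blue, 8 purple, 9 brown, 10 grey.
colour_name(1, black).   colour_name(2, white).  colour_name(3, red).
colour_name(4, orange).  colour_name(5, yellow). colour_name(6, green).
colour_name(7, blue).    colour_name(8, purple). colour_name(9, brown).
colour_name(10, grey).
```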
+ +/** +[] [] [1,2] [] [] +[] [] [1,2] [] [] +[] [1] [] [2] [] +[] [1] [] [2] [] +[] [1,3] [3] [2,3] [] +[1] [] [] [] [2] +[1] [] [] [] [2] +[] [] [] [] [] +[] [] [] [] [] +**/ + + +/** +%% Takes too long, run mindreadtestmusiccomposer-unusual.pl or postsong + +prep :- + ttb2(["scrnsong2019620131426.87270498275757.txt", +"scrnsong2019620131426.87270498275757lyrics.txt", +"scrnsong2019620131428.09975290298462.txt", +"scrnsong2019620131428.09975290298462lyrics.txt", +"scrnsong2019620131430.95198392868042.txt", +"scrnsong2019620131430.95198392868042lyrics.txt", +"scrnsong2019620131432.72644400596619.txt", +"scrnsong2019620131432.72644400596619lyrics.txt", +"scrnsong2019620131435.184492111206055.txt", +"scrnsong2019620131435.184492111206055lyrics.txt", +"scrnsong2019620131437.28680896759033.txt", +"scrnsong2019620131437.28680896759033lyrics.txt", +"scrnsong2019620131438.37197399139404.txt", +"scrnsong2019620131438.37197399139404lyrics.txt", +"scrnsong2019620131439.7610969543457.txt", +"scrnsong2019620131439.7610969543457lyrics.txt", +"scrnsong2019620131442.2528030872345.txt", +"scrnsong2019620131442.2528030872345lyrics.txt", +"scrnsong2019620131443.59552311897278.txt", +"scrnsong2019620131443.59552311897278lyrics.txt"]). + +ttb2([]) :- !. +ttb2(List1) :- + List1=[Item|List2], + texttobr2(8,Item,u,2000), texttobr(8,Item,u,2000), + ttb2(List2). +**/ + +light(_,[]) :- !. +light(WhitetoColour,Pixels1) :- + Pixels1=[Pixel|Pixels2], + Pixel=[_X,_Y,Colour1], + (Colour1=2-> Colour2=WhitetoColour;Colour2=Colour1), + texttobr2(1000), %% 100 done-up As * 10 songs + texttobr2(2), %% dot on graciously give, graciously give + texttobr2(20), %% spiritually play 10 songs to light pixel + light(WhitetoColour,Pixels2). 
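
The white-to-colour substitution inside `light/2` is interleaved with the `texttobr2/1` calls; a minimal sketch that isolates just that substitution step, using `format/2` as a hypothetical stand-in for the side effects, may help when checking a pixel list by hand:

```prolog
%% Sketch only: report the colour each pixel would take, mapping
%% colour code 2 (white) to WhiteToColour exactly as light/2 does,
%% and leaving every other code unchanged.
show_light(_, []) :- !.
show_light(WhiteToColour, [[X,Y,Colour1]|Pixels]) :-
    (Colour1=2 -> Colour2=WhiteToColour ; Colour2=Colour1),
    format("pixel (~w,~w) -> colour ~w~n", [X, Y, Colour2]),
    show_light(WhiteToColour, Pixels).
```

It mirrors the recursion and the `Colour1=2` test of `light/2`, so the two walk a pixel list identically.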
+ + %%*** With letter to politics, \ No newline at end of file diff --git a/qa_db_finder/.DS_Store b/qa_db_finder/.DS_Store new file mode 100644 index 0000000000000000000000000000000000000000..5008ddfcf53c02e82d7eee2e57c38e5672ef89f6 Binary files /dev/null and b/qa_db_finder/.DS_Store differ diff --git a/qa_db_finder/LICENSE b/qa_db_finder/LICENSE new file mode 100644 index 0000000000000000000000000000000000000000..3a27c90f1c2986ac0fb73b4ae956c6205343dfdd --- /dev/null +++ b/qa_db_finder/LICENSE @@ -0,0 +1,29 @@ +BSD 3-Clause License + +Copyright (c) 2020, Lucian Green +All rights reserved. + +Redistribution and use in source and binary forms, with or without +modification, are permitted provided that the following conditions are met: + +1. Redistributions of source code must retain the above copyright notice, this + list of conditions and the following disclaimer. + +2. Redistributions in binary form must reproduce the above copyright notice, + this list of conditions and the following disclaimer in the documentation + and/or other materials provided with the distribution. + +3. Neither the name of the copyright holder nor the names of its + contributors may be used to endorse or promote products derived from + this software without specific prior written permission. + +THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" +AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE +IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE +DISCLAIMED. 
IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE +FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL +DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR +SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER +CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, +OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE +OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. diff --git a/qa_db_finder/README.md b/qa_db_finder/README.md new file mode 100644 index 0000000000000000000000000000000000000000..f12199e43049f7cf7bfa1ef1bc85e7b5e2057676 --- /dev/null +++ b/qa_db_finder/README.md @@ -0,0 +1,59 @@ +# qa_db_finder +Question Answering Database-Style Algorithm Finder + +Question Answering Database-Style Algorithm Finder is a SWI-Prolog algorithm that finds database style list-based algorithms with question answering, for example: + +``` +e.g. for cultural translation tool, the input lists of tuples: orig-trans, trans-bt where orig-bt have same meaning and input orig and output translation: + a(OT,TB,BO,O,T):- + member(A1,OT),A1=[O,T], + member(A2,TB),A2=[T,B], + member(A3,BO),A3=[B,O]. + + which is run as: + ?- a([[a,b]],[[b,a]],[[a,a]],a,T). + T = b. +``` + +# Getting Started + +Please read the following instructions on how to install the project on your computer for writing algorithms. + +# Prerequisites + +* Please download and install SWI-Prolog for your machine at `https://www.swi-prolog.org/build/`. + +# 1. Install manually + +Download this repository. + +# 2. Or Install from List Prolog Package Manager (LPPM) + +* Download the LPPM Repository: + +``` +mkdir GitHub +cd GitHub/ +git clone https://github.com/luciangreen/List-Prolog-Package-Manager.git +cd List-Prolog-Package-Manager +swipl +['lppm']. +lppm_install("luciangreen","qa_db_finder"). 
+halt +``` + +# Running + +* In Shell: +`cd qa_db_finder` +`swipl` +`['qa_db_finder.pl'].` +`qa_db_finder(Algorithm).` + +# Authors + +Lucian Green - Initial programmer - Lucian Academy + +# License + +I licensed this project under the BSD3 License - see the LICENSE.md file for details diff --git a/qa_db_finder/qa_db_finder.pl b/qa_db_finder/qa_db_finder.pl new file mode 100644 index 0000000000000000000000000000000000000000..8c5d943d310720574cddb559df5f33365a46555c --- /dev/null +++ b/qa_db_finder/qa_db_finder.pl @@ -0,0 +1,101 @@ +%% qa_db_finder.pl +%% Finds database style list-based algorithms with question answering. + +%% Asks "What is the input variable?" +%% Asks "What variable is the input variable linked to?" + +%% Asks "What variable is this last variable linked to?" +%% Asks "Is this the final output?" + +/** +e.g. orig-trans, trans-bt where orig-bt have same meaning + a(OT,TB,BO,O,T):- + member(A1,OT),A1=[O,T], + member(A2,TB),A2=[T,B], + member(A3,BO),A3=[B,O]. + +What is the input variable? +|: O +What variable is this last variable linked to? +|: T +Is this the final variable? (y/n) +|: n +What variable is this last variable linked to? +|: B +Is this the final variable? (y/n) +|: n +What variable is this last variable linked to? +|: O +Is this the final variable? (y/n) +|: y +What is the final output variable? +|: T +A = "a(OT,TB,BO,O,T):-member(A1,OT),A1=[O,T],member(A2,TB),A2=[T,B],member(A3,BO),A3=[B,O].". + +a(OT,TB,BO,O,T):-member(A1,OT),A1=[O,T],member(A2,TB),A2=[T,B],member(A3,BO),A3=[B,O]. +?- a([[a,b]],[[b,a]],[[a,a]],a,T). +T = b. 
+ +**/ + +qa_db_finder(Algorithm) :- + writeln("What is the input variable?"), + read_string(user_input, "\n", "\r", _End, I), + + writeln("What variable is this last variable linked to?"), + read_string(user_input, "\n", "\r", _End2, N), + + concat_list(["member(A1,",I,N,"),A1=[",I,",",N,"],"],Algorithm1), + Vars1=[I,N], + + repeat1(2,N,Algorithm1,Algorithm2,Vars1,Vars2), + + string_concat(Algorithm3,",",Algorithm2), + + writeln("What is the final output variable?"), + read_string(user_input, "\n", "\r", _End3, O), + %%trace, + + find_header_args1(Vars2,"",HA1), + %%string_concat(HA2,",",HA1), + + concat_list(["a(",HA1,I,",",O,"):-"],Algorithm4), + concat_list([Algorithm4,Algorithm3,"."],Algorithm). + + + +repeat1(M1,N,Algorithm1,Algorithm2,Vars1,Vars2) :- + writeln("Is this the final variable? (y/n)"), + read_string(user_input, "\n", "\r", _End, Q), + + (Q="y"->(Algorithm2=Algorithm1,Vars2=Vars1); + writeln("What variable is this last variable linked to?"), + read_string(user_input, "\n", "\r", _End2, V), + + concat_list(["member(A",M1,",",N,V,"),A",M1,"=[",N,",",V,"],"],Algorithm1a), + append(Vars1,[V],Vars3), + M2 is M1+1, + string_concat(Algorithm1,Algorithm1a,Algorithm1b), + repeat1(M2,V,Algorithm1b,Algorithm2,Vars3,Vars2) + + ). + + +find_header_args1([_],HA,HA) :- !. +find_header_args1(Vars,HA1,HA2) :- + Vars=[_|F], + Vars=[A,B|_], + concat_list([A,B,","],D), + string_concat(HA1,D,E), + find_header_args1(F,E,HA2). + + +concat_list(A1,B):- + A1=[A|List], + concat_list(A,List,B),!. + +concat_list(A,[],A):-!. +concat_list(A,List,B) :- + List=[Item|Items], + string_concat(A,Item,C), + concat_list(C,Items,B). 
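
The string that `qa_db_finder/1` returns is ordinary Prolog source; once consulted, the example clause from the header comment runs as shown in the transcript:

```prolog
%% The generated clause from the session transcript above,
%% consulted as ordinary Prolog: each member/2 call follows one
%% link of the variable chain O -> T -> B -> O.
a(OT,TB,BO,O,T) :-
    member(A1,OT), A1=[O,T],
    member(A2,TB), A2=[T,B],
    member(A3,BO), A3=[B,O].

%% ?- a([[a,b]],[[b,a]],[[a,a]],a,T).
%% T = b.
```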
diff --git a/t2ab/.DS_Store b/t2ab/.DS_Store new file mode 100644 index 0000000000000000000000000000000000000000..5008ddfcf53c02e82d7eee2e57c38e5672ef89f6 Binary files /dev/null and b/t2ab/.DS_Store differ diff --git a/t2ab/LICENSE b/t2ab/LICENSE new file mode 100644 index 0000000000000000000000000000000000000000..488c8af2bf1b4900944f435d53d0a67b24474aa1 --- /dev/null +++ b/t2ab/LICENSE @@ -0,0 +1,29 @@ +BSD 3-Clause License + +Copyright (c) 2022, Lucian Green +All rights reserved. + +Redistribution and use in source and binary forms, with or without +modification, are permitted provided that the following conditions are met: + +1. Redistributions of source code must retain the above copyright notice, this + list of conditions and the following disclaimer. + +2. Redistributions in binary form must reproduce the above copyright notice, + this list of conditions and the following disclaimer in the documentation + and/or other materials provided with the distribution. + +3. Neither the name of the copyright holder nor the names of its + contributors may be used to endorse or promote products derived from + this software without specific prior written permission. + +THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" +AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE +IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE +DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE +FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL +DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR +SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER +CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, +OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE +OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. 
diff --git a/t2ab/README.md b/t2ab/README.md new file mode 100644 index 0000000000000000000000000000000000000000..3536ae4f426cf374250b847dc0c69ecdbae0f96c --- /dev/null +++ b/t2ab/README.md @@ -0,0 +1,53 @@ +# t2ab - text to algorithm breasoning + +Text to Algorithm Breasoning is a SWI-Prolog algorithm that prepares to breason out file.txt (in the Text to Breasonings folder) with an algorithm for each word's object. Breasoning is thinking of the x, y and z dimensions of objects, necessary for meeting grades. + +# Getting Started + +Please read the following instructions on how to install the project on your computer for breasoning out algorithms. + +# Prerequisites + +* Please download and install SWI-Prolog for your machine at `https://www.swi-prolog.org/build/`. + +# 1. Install manually + +Download this repository, the List Prolog Interpreter Repository and the Text to Breasonings Repository. + +# 2. Or Install from List Prolog Package Manager (LPPM) + +* Download the LPPM Repository: + +``` +mkdir GitHub +cd GitHub/ +git clone https://github.com/luciangreen/List-Prolog-Package-Manager.git +cd List-Prolog-Package-Manager +swipl +['lppm']. +lppm_install("luciangreen","t2ab"). +halt +``` + +# Running + +* In Shell: +`cd t2ab` +`swipl` +`['t2ab.pl'].` + +* Before running texttobr, think of two radio buttons put on recordings, put through with prayer, nut and bolt, quantum box prayer 1, 1, 0.5 cm and 1, 1, 0.5 cm. + +* Follow instructions in https://github.com/luciangreen/Text-to-Breasonings/blob/master/Instructions_for_Using_texttobr(2).pl.txt when using texttoalg, texttobr, or texttobr2 to avoid medical problems. 
+ +* Enter the following to breason out object algorithms: +`t2ab(u,u,u,u).` + + +# Authors + +Lucian Green - Initial programmer - Lucian Academy + +# License + +I licensed this project under the BSD3 License - see the LICENSE.md file for details diff --git a/t2ab/algdict1.txt b/t2ab/algdict1.txt new file mode 100644 index 0000000000000000000000000000000000000000..9497233cb543aeb45fc36cc594428ba9204a11da --- /dev/null +++ b/t2ab/algdict1.txt @@ -0,0 +1 @@ +[["a","equals"],["acid","minus"],["aircraft","plus"],["album","write"],["algae","minus"],["algorithm","write"],["amount","plus"],["and","and"],["animal","write"],["animals","write"],["ant","minus"],["antibody","minus"],["apple","write"],["arc","write"],["arm","plus"],["artichoke","write"],["artwork","write"],["asterisk","multiply"],["atom","write"],["b","equals"],["baby","write"],["backslash","write"],["bacterium","write"],["bag","write"],["ball","plus"],["balloon","plus"],["balustrade","write"],["banana","write"],["barbeque","write"],["barnacle","write"],["barrel","write"],["base","write"],["bath","write"],["baton","write"],["battery","write"],["bean","write"],["bear","write"],["bed","minus"],["bee","write"],["beetle","plus"],["bell","write"],["berry","minus"],["bird","write"],["biscuit","write"],["blade","write"],["blanket","write"],["blindfold","write"],["blood","write"],["boat","write"],["bone","write"],["book","write"],["boot","write"],["bottle","minus"],["bowl","write"],["box","multiply"],["bra","write"],["bracket","write"],["brain","and"],["bread","write"],["breast","write"],["bridge","write"],["bristle","write"],["brow","write"],["brush","write"],["bubble","minus"],["buckle","write"],["building","write"],["bull","write"],["bun","write"],["buoy","write"],["burger","write"],["butter","write"],["c","equals"],["cake","write"],["calf","write"],["candle","write"],["caper","write"],["capsicum","write"],["capsule","write"],["car","plus"],["caramel","write"],["carrot","write"],["castle","write"],["cat","write"],["c
d","ls"],["cello","write"],["chair","write"],["chalk","write"],["cheese","write"],["cheeseburger","write"],["cherry","write"],["chicken","write"],["child","write"],["chisel","write"],["chocolate","minus"],["chopstick","plus"],["circle","write"],["city","write"],["claimed","write"],["claw","plus"],["clipper","minus"],["clippers","minus"],["cloud","write"],["clove","write"],["coat","write"],["coconut","write"],["coin","plus"],["coit","write"],["colander","write"],["collar","write"],["column","write"],["company","write"],["computer","write"],["condom","write"],["cone","minus"],["continent","write"],["cookie","write"],["coral","write"],["cork","equals"],["counter","write"],["country","write"],["cow","write"],["crane","write"],["cream","write"],["crystal","write"],["cup","minus"],["cupcake","minus"],["cushion","plus"],["cuticle","write"],["cycle","write"],["cylinder","minus"],["cymbal","write"],["daffodil","write"],["daisy","write"],["danish","write"],["dash","minus"],["day","write"],["delta","write"],["desk","write"],["diamond","write"],["dinosaur","write"],["discus","write"],["dish","write"],["disinfectant","write"],["dog","write"],["doll","write"],["dollarsign","plus"],["dollarsymbol","plus"],["door","plus"],["dot","write"],["dove","write"],["down","write"],["dreamballoon","write"],["dress","write"],["drum","plus"],["duck","write"],["duckling","write"],["ear","equals"],["earlobe","write"],["egg","write"],["eightrectangle","foldr(string_concat)"],["elbow","write"],["electron","minus"],["embryo","write"],["equals","equals"],["exclamationmark","cut"],["eye","write"],["f","write"],["fabric","minus"],["face","equals"],["fat","write"],["feather","write"],["fibre","write"],["fifteenrectangle","write"],["finger","write"],["fire","minus"],["fish","plus"],["fivecrectangle","write"],["fiverectangle","write"],["flask","minus"],["flowchart","write"],["flower","plus"],["folder","write"],["foot","plus"],["football","plus"],["forearm","plus"],["forehead","write"],["fork","write"],["f
ortyrectangle","write"],["fourrectangle","foldr(string_concat)"],["frankfurt","write"],["frog","write"],["fuel","write"],["fullstop","halt"],["function","write"],["fungus","write"],["funnel","minus"],["garbage","write"],["gate","and"],["gazebo","write"],["germ","minus"],["gherkin","minus"],["giant","write"],["giraffe","write"],["gland","write"],["glass","minus"],["glasses","write"],["glove","write"],["goat","write"],["government","write"],["greaterthan","greater_than"],["grid","write"],["guitar","write"],["hair","plus"],["hammer","plus"],["hand","plus"],["handle","write"],["happy","write"],["happyface","write"],["harlequin","write"],["harp","minus"],["hash","divide"],["hat","equals"],["hay","write"],["hcl","write"],["head","write"],["headset","write"],["heart","plus"],["heater","write"],["hedgehog","write"],["hemisphere","write"],["hexagon","write"],["hill","write"],["hinge","write"],["hole","minus"],["hook","write"],["hoop","write"],["horn","write"],["horse","write"],["house","write"],["hug","write"],["i","write"],["ibis","write"],["icecream","minus"],["ink","write"],["inky","write"],["insect","write"],["intestines","minus"],["iphone","minus"],["iron","write"],["island","write"],["j","equals"],["jacket","write"],["jar","equals"],["jaw","write"],["jewel","write"],["jug","plus"],["juice","minus"],["k","write"],["key","write"],["keyboard","plus"],["kidney","write"],["king","write"],["knee","write"],["knife","write"],["ladder","write"],["land","write"],["lane","write"],["lapel","write"],["laptop","write"],["lassoo","write"],["lawn","write"],["leaf","write"],["leash","plus"],["left","write"],["leftbracket","left_bracket"],["leg","write"],["lemon","write"],["lens","write"],["lessthan","less_than"],["letter","write"],["letterbox","write"],["lettuce","write"],["light","write"],["line","write"],["lion","write"],["liquid","minus"],["lock","write"],["log","write"],["lolly","write"],["lowera","write"],["lubricant","write"],["lung","minus"],["lycra","write"],["lymph","write"],[
"m","write"],["magazine","write"],["mantelpiece","write"],["map","write"],["mask","write"],["mat","write"],["mayonnaise","write"],["maze","write"],["medication","write"],["meditator","write"],["mercury","write"],["metal","write"],["microphone","write"],["milk","minus"],["mineral","write"],["minus","minus"],["mirror","equals"],["monkey","write"],["monthly","write"],["moon","plus"],["mop","write"],["mosquito","write"],["mountain","write"],["mouse","write"],["moustache","write"],["mouth","minus"],["muesli","write"],["muffin","write"],["multiplication","multiply"],["multiply","multiply"],["n","equals"],["nappy","write"],["nebula","write"],["neck","write"],["nectar","write"],["needle","write"],["nerve","plus"],["nest","write"],["neuron","write"],["ninerectangle","write"],["noodle","write"],["nose","equals"],["nostril","minus"],["note","write"],["notequals","not_equals"],["nrectangle","foldr(append)"],["number","write"],["nut","minus"],["oat","minus"],["octahedron","write"],["oil","minus"],["one","plus"],["operatingsystem","and"],["or","or"],["orange","write"],["organisation","write"],["oven","write"],["owl","write"],["oxygen","minus"],["paint","write"],["painting","write"],["pamphlet","write"],["pan","write"],["pants","write"],["paper","write"],["path","plus"],["pathway","write"],["patty","write"],["paw","plus"],["pea","minus"],["peach","write"],["peacock","write"],["pear","minus"],["pedal","plus"],["peg","write"],["pen","write"],["pencil","write"],["pencilsharpener","minus"],["pepper","write"],["perfume","minus"],["person","intersection"],["petal","write"],["phone","write"],["photon","write"],["pig","write"],["pigeon","write"],["pillow","write"],["pin","write"],["pip","write"],["pipe","write"],["pipette","minus"],["plane","plus"],["planet","write"],["plant","plus"],["plate","minus"],["platelet","plus"],["plectrum","plus"],["plus","subtract"],["pocket","write"],["pomegranate","write"],["pony","write"],["pot","minus"],["potato","write"],["potbelly","write"],["printer","wr
ite"],["priscilla","write"],["puppet","write"],["puppy","write"],["questionmark","query"],["quince","write"],["rabbit","write"],["rack","write"],["raisin","minus"],["rake","write"],["rhinoceros","write"],["ribbon","write"],["right","write"],["ring","write"],["river","write"],["road","plus"],["robe","write"],["rocket","write"],["room","plus"],["rope","write"],["rose","plus"],["ruler","write"],["rung","write"],["s","write"],["sadface","write"],["salt","write"],["sand","equals"],["sandwich","write"],["sauce","write"],["saucer","write"],["sausage","write"],["scissors","write"],["scone","write"],["scoop","minus"],["sculpture","write"],["seat","equals"],["secateurs","write"],["seed","plus"],["semicircle","plus"],["semolina","write"],["sevenrectangle","foldr(string_concat)"],["seventyrectangle","foldr(string_concat)"],["shaker","plus"],["shed","write"],["shelf","write"],["ship","plus"],["shirt","write"],["shoe","plus"],["shoelaces","plus"],["shoulders","write"],["shuttlecock","write"],["sieve","write"],["silicon","write"],["six","write"],["sixrectangle","foldr(string_concat)"],["skewer","write"],["ski","minus"],["slash","equals"],["sleep","write"],["sleeve","write"],["smile","write"],["smoke","minus"],["snake","write"],["snow","write"],["sock","plus"],["sofa","write"],["software","write"],["space","write"],["spacecraft","write"],["spaghetti","write"],["spatula","minus"],["spear","write"],["speechballoon","write"],["spider","write"],["spinach","write"],["spine","write"],["sponge","write"],["spoon","minus"],["square","write"],["squash","minus"],["stage","plus"],["stake","minus"],["stall","plus"],["star","plus"],["startingline","write"],["steam","minus"],["step","plus"],["stomach","minus"],["stone","equals"],["stop","halt"],["strap","plus"],["straw","minus"],["street","plus"],["sugar","minus"],["suitcase","write"],["sultana","minus"],["sun","write"],["swan","write"],["swings","plus"],["sword","equals"],["syringe","write"],["t","equals"],["table","write"],["tablet","write"],["
tadpole","write"],["tape","minus"],["tar","minus"],["tear","write"],["telephone","write"],["tenrectangle","write"],["tent","write"],["testtube","minus"],["thirtyrectangle","write"],["three","write"],["threebox","multiply"],["threerectangle","foldr(string_concat)"],["threshold","write"],["tick","plus"],["tie","write"],["time","write"],["tinsel","write"],["toast","write"],["toe","write"],["toenail","write"],["tofu","write"],["tomato","minus"],["tongue","equals"],["tooth","plus"],["toothbrush","minus"],["towel","write"],["tower","write"],["town","write"],["train","write"],["tram","write"],["tray","write"],["tree","plus"],["triangle","write"],["trousers","write"],["truck","write"],["trumpet","write"],["trunk","write"],["tube","minus"],["turtle","write"],["tuxedo","write"],["twelverectangle","foldr(string_concat)"],["twentyrectangle","foldr(string_concat)"],["two","plus"],["twobox","plus"],["tworectangle","string_concat"],["underpants","write"],["underscore","write"],["university","write"],["up","write"],["uppert","write"],["urine","write"],["v","write"],["variable","equals"],["vehicle","write"],["viola","write"],["violin","plus"],["virus","minus"],["volcano","write"],["wallaby","write"],["warthog","write"],["water","minus"],["waterchick","plus"],["whale","write"],["wheat","write"],["wheelbarrow","write"],["whisk","plus"],["whiteboard","write"],["wigwam","write"],["windpipe","write"],["wood","minus"],["world","write"],["worm","write"],["write","write"],["x","equals"],["xcorrection","delete"],["xline","write"],["xylophone","write"],["yacht","write"],["yline","plus"],["zero","equals"],["zline","plus"],["zucchini","write"]] \ No newline at end of file diff --git a/t2ab/algdict2.txt b/t2ab/algdict2.txt new file mode 100644 index 0000000000000000000000000000000000000000..40de7f8c6e122a5a1bfed09ddeae442e9dce6c35 --- /dev/null +++ b/t2ab/algdict2.txt @@ -0,0 +1 @@ +[["(",""],["and","a,b"],["cut","!"],["delete","delete(A,B,C)"],["divide","A is B/C"],["enter object name (without 
spaces), if different for noodle","|: write"],["enter object name (without spaces), if different for turtle","|: write"],["enter object name (without spaces), if different for tuxedo","|: write"],["enter object name (without spaces), if different for yacht",""],["equals","A=B"],["foldr(append)","foldr(append,A,B)"],["foldr(string_concat)","foldr(string_concat,A,B)"],["greater_than","A>B"],["halt","halt"],["intersection","intersection2(A,B,C):-intersection2(A,B,[],C),!. intersection2([],_,A,A):-!. intersection2([A1|A2],B,C,D):- member1(A1,B), intersection2(A2,B,[A1|C],D). intersection2([_|A2],B,C,D):- intersection2(A2,B,C,D). member1(A,[A|_]):-!. member1(A,[_|B2]):-member1(A,B2)."],["left_bracket","A=[B]"],["less_than","Atrue; + (N1=u,N=1)), + + ((Filex1=u,Filex="../Text-to-Breasonings/file.txt")->true; + Filex=Filex1), + + ((number(M1),M=M1)->true; + M=all), %% If m1 is undefined or all then m=all + + t2ab_prep(List1,BrDict03,AlgDict_x,AlgDict,Filex,Stringx1,M), + + retractall(t2ab_brDict03(_)), + assertz(t2ab_brDict03(BrDict03)), + + retractall(t2ab_algDict_x(_)), + assertz(t2ab_algDict_x(AlgDict_x)), + + retractall(t2ab_algDict(_)), + assertz(t2ab_algDict(AlgDict)), + + retractall(t2ab_algString(_)), + assertz(t2ab_algString([])), + t2ab_br2(List1,N),%,BrDict03,BrDict2,AlgDict_x,AlgDict_x2,AlgDict,AlgDict2,N,[],AlgString), + + t2ab_brDict031(BrDict2), + t2ab_algDict_x1(AlgDict_x2), + t2ab_algDict1(AlgDict2), + t2ab_algString1(AlgString), + + %writeln("Press to save work:"),read_string(user_input,"\n","\r",_,_), + + sort(BrDict2,BrDict3), + (BrDict03=BrDict3->true; + (open_s("../Text-to-Breasonings/brdict1.txt",write,Stream), +%% string_codes(BrDict3), + write(Stream,BrDict3), + close(Stream))), + + sort(AlgDict_x2,AlgDict_x3), + (AlgDict_x=AlgDict_x3->true; + (open_s("../t2ab/algdict1.txt",write,Stream2), +%% string_codes(BrDict3), + term_to_atom(AlgDict_x3,AlgDict_x31), + write(Stream2,AlgDict_x31), + close(Stream2))), + + sort(AlgDict2,AlgDict3), + 
(AlgDict=AlgDict3->true; + (open_s("../t2ab/algdict2.txt",write,Stream3), + term_to_atom(AlgDict3,AlgDict31), +%% string_codes(BrDict3), + write(Stream3,AlgDict31), + close(Stream3))), + + length(List1,List1_length_a), + Dividend_a is ceiling(List1_length_a/250), + Dividend_b is Dividend_a*3, % for graciously giving + texttobr2_a(Dividend_b,meditation), + texttobr2_a(Dividend_b,medicine), + texttobr2_a(Dividend_b,pedagogy), + + flatten(AlgString,AlgString1), + foldr(string_concat,AlgString1,"",AlgString2), + %writeln1(AlgString2), + texttobr2(u,u,AlgString2,u,[auto,Auto]), + + !. + +%% Truncates the list if m is not undefined and m is greater than or equal to the length of string0 +truncate(List1,M,String0) :- + ((number(M),length(String0,M), + append(String0,_,List1))->true; + String0=List1),!. + +t2ab_prep(List,BrDict03,AlgDict_x,AlgDict,Filex,Stringx1,M) :- + phrase_from_file_s(string(BrDict0), "../Text-to-Breasonings/brdict1.txt"), + %%Chars="’", + SepandPad="&#@~%`$?-+*^,()|.:;=_/[]<>{}\n\r\s\t\\\"!'0123456789", + %%split_string(BrDict0,SepandPad,SepandPad,BrDict01), +%%writeln([brDict0,BrDict0]), +%%writeln([brdict1]), + splitfurther(BrDict0,BrDict01), +%%writeln([brDict01,BrDict01]), + %%char_code(Escape,27), + %%delete(BrDict01,[Escape,_,_,_,_],BrDict021), +%%writeln([brDict021,BrDict021]), + %%char_code(Apostrophe,8217), + %%delete(BrDict021,[Apostrophe,_,_,_,_],BrDict02), +%%writeln([brDict02,BrDict02]), + sort(BrDict01,BrDict03), +%%writeln([brDict03,BrDict03]), + length(BrDict03,Length0),write("Number of words in dictionary: "), writeln(Length0), + + %%writeln(''), + %%writeln([brdict2]), + phrase_from_file_s(string(BrDict0t), "../t2ab/algdict1.txt"), + %%Chars="’", + %%split_string(BrDict0,SepandPad,SepandPad,BrDict01), +%%writeln([brDict0,BrDict0]), +% splitfurthert(BrDict0t,BrDict01t), +string_atom(BrDict0t,Atom),atom_to_term(Atom,BrDict01t,_), + +%%writeln([brDict01,BrDict01]), + %%delete(BrDict01t,[Escape,_,_,_,_],BrDict021t), 
+%%writeln([brDict021,BrDict021]), + %%delete(BrDict021t,[Apostrophe,_,_,_,_],BrDict02t), +%%writeln([brDict02,BrDict02]), + sort(BrDict01t,AlgDict_x), + +% br_freq %B=AlgDict_x,A=BrDict03,findall([DL,C,"\n"],(member([C,_,_,_],B),findall(_,member([_,C],A),D),length(D,DL)),E),sort(E,F),reverse(F,G),writeln([br_freq,G]), + +%%writeln([brDict03,BrDict03]), + length(AlgDict_x,Length0t),write("Number of unique algorithm names in dictionary: "), writeln(Length0t), + + %trace, + + phrase_from_file_s(string(AlgDict0), "../t2ab/algdict2.txt"), + %%Chars="’", + %%split_string(BrDict0,SepandPad,SepandPad,BrDict01), +%%writeln([brDict0,BrDict0]), +%%writeln([brdict1]), +string_atom(AlgDict0,Atom1),atom_to_term(Atom1,AlgDict01,_), + %%splitfurther(BrDict0,BrDict01), +%%writeln([brDict01,BrDict01]), + %%char_code(Escape,27), + %%delete(BrDict01,[Escape,_,_,_,_],BrDict021), +%%writeln([brDict021,BrDict021]), + %%char_code(Apostrophe,8217), + %%delete(BrDict021,[Apostrophe,_,_,_,_],BrDict02), +%%writeln([brDict02,BrDict02]), + sort(AlgDict01,AlgDict), +%%writeln([brDict03,BrDict03]), + length(AlgDict,Length01),write("Number of algorithms in dictionary: "), writeln(Length01), + + + + + ((Stringx1=u, + phrase_from_file_s(string(String001), Filex))->true; + String001=Stringx1), + + process_t2ab(String001,String00), + + split_string(String00,SepandPad,SepandPad,List1), + %%split_string_onnonletter(String00,List1), + + truncate(List1,M,List), + + /**replace0(String0,[8221, 8220], 34, SepandPad, M, String1), + replace0(String1,[8216, 8217], 39, SepandPad, M, String2), + replace0(String2,[8259, 8211, 8212], 45, SepandPad, M, String3), + replace0(String3,[160], 32, SepandPad, M, List), + **/ + +%%atom_codes(Atom999,String),writeln([atom999,Atom999]), + +%%writeln([list,List]), + %%delete(List,Escape,List11), +%%writeln([list11,List11]), + %%delete(List11,Apostrophe,List1), +%%writeln([list1,List1]), + length(List,Length1),write("Number of words to breason out in file: "), 
writeln(Length1), + sort(List,List2), +%%writeln([list2,List2]), + length(List2,Length2),write("Number of unique words in file: "), writeln(Length2), + + +(complete_display(true)-> + ((Stringx1=u, %% Use file, not string as input. + + %%maplist(downcase_atom, List2, List3), + maplist(string_lower, List2, List3), + +%%writeln([list3,List3]), + towords3(BrDict03,[],BrDict04,[],_ObjectNames,[],AllUsedNames), + + findall(BrDict04tt,(member([BrDict04tt,_],AlgDict_x)),BrDict04t), + findall(BrDict04tt0,(member([_,BrDict04tt0],AlgDict_x)),BrDict04t0), + findall(BrDict04tt1,(member([BrDict04tt1,_],AlgDict)),BrDict04t1), + %towords2(AlgDict_x,[],BrDict04t), + +%%writeln([brDict04,BrDict04]), + subtract(List3,BrDict04,D1), +%%writeln([list3,brDict04,d1,List3,BrDict04,D1]), +%%writeln(["subtract(BrDict04,List3,D1).",List3,BrDict04,D1]), + length(D1,Length01),Difference is abs(Length01),write("Number of words remaining to define: "), writeln(Difference), + + subtract(AllUsedNames,BrDict04t,D2), + %%delete(D21,'',D2), + length(D2,Length01t),Differencet is abs(Length01t),write("Number of undefined algorithm names: "), writeln(Differencet), + %% writeln([undefinedbreasonings,D2]), %% Print undefined breasonings + + %%delete(D31,'',D3), + subtract(BrDict04t,AllUsedNames,D3), + length(D3,Length01t2),Differencet2 is abs(Length01t2),write("Number of orphaned algorithm names: "), writeln(Differencet2), + + + + /* + subtract(List3,BrDict04,D1), +%%writeln([list3,brDict04,d1,List3,BrDict04,D1]), +%%writeln(["subtract(BrDict04,List3,D1).",List3,BrDict04,D1]), + length(D1,Length01),Difference is abs(Length01),write("Number of algorithms remaining to define: "), writeln(Difference), +*/ + + subtract(BrDict04t0,BrDict04t1,D21), + %%delete(D21,'',D2), + length(D21,Length01t1),Differencet1 is abs(Length01t1),write("Number of undefined algorithms: "), writeln(Differencet1), + %% writeln([undefinedbreasonings,D2]), %% Print undefined breasonings + + %%delete(D31,'',D3), + 
subtract(BrDict04t1,BrDict04t0,D31), + length(D31,Length01t21),Differencet21 is abs(Length01t21),write("Number of orphaned algorithms: "), writeln(Differencet21) + + %%,writeln([orphanedbreasonings,D3]) %% Print orphaned breasonings + + + + + +)->true;(string(Filex),writeln("Number of words, unique words, words remaining to define, undefined algorithm names, orphaned algorithm names, undefined algorithms and orphaned algorithms skipped for speed when breasoning out a string.")));true) + +,!. + +%n2(N) :-n(N). +t2ab_brDict031(BrDict2) :- t2ab_brDict03(BrDict2). +t2ab_algDict_x1(BrDict03t2) :- t2ab_algDict_x(BrDict03t2). +t2ab_algDict1(BrDict03t2) :- t2ab_algDict(BrDict03t2). +t2ab_algString1(BrDict03t2) :- t2ab_algString(BrDict03t2). + + +t2ab_br2(List1,N):-%_,A,A,B,B,C,C,0,L,L) :- !. +%br2(List1,BrDict03,BrDict2,AlgDict_x,AlgDict_x2,AlgDict,AlgDict2,N1,L1,L2) :- + + length(NL,N), + findall(_,(member(_,NL), + (auto(on)-> + concurrent_maplist(t2ab_br,List1,_); + maplist(t2ab_br,List1,_))),_),!. + %br(List1,BrDict03,BrDict21,AlgDict_x,AlgDict_x21,AlgDict,AlgDict21,L1,L3), +% N2 is N1-1, + %br2(List1,BrDict21,BrDict2,AlgDict_x21,AlgDict_x2,AlgDict21,AlgDict2,N2,L3,L2),!. + +towords2([],A,A) :- !. +towords2(BrDict03,A,B) :- + BrDict03=[[Word,_,_,_]|Rest], + %%atom_string(Atom,Word), + append(A,[Word],C), + towords2(Rest,C,B). + +towords2a([],A,A) :- !. +towords2a(BrDict03,A,B) :- + BrDict03=[[Word,_]|Rest], + %%atom_string(Atom,Word), + append(A,[Word],C), + towords2a(Rest,C,B). + +towords3([],A,A,C,C,D,D) :- !. +towords3(BrDict03,A,B,D,E,G,H) :- + BrDict03=[[Word1,Word2]|Rest], + (Word2=""->append(G,[Word1],I)->true; + append(G,[Word2],I)), + append(A,[Word1],C), + append(D,[Word2],F), + towords3(Rest,C,B,F,E,I,H). + +string(String) --> list(String). + +list([]) --> []. +list([L|Ls]) --> [L], list(Ls). + +splitfurther(BrDict01,N) :- + phrase(file0(N),BrDict01). + +file0(N) --> "[", file(N), "]", !. +file0([]) --> []. + +%%file([]) --> []. 
+file([L|Ls]) --> entry(L),",", +%%{writeln(L)}, %%*** +file(Ls), !. %% file(Ls),{M=[Ls]})), !. %%, {writeln(["l",L])},",", file(Ls), {writeln(["ls",Ls])},!. %%, {append(L,Ls,M)}, !. +file([L]) --> entry(L), +%%{writeln(L)}, +!. %%(entry(L),{M=L});{M=[],(writeln("Warning - Entry in incorrect format.") +%%,abort +%%)}, !. + +entry([Word2,Word4]) --> + "[", word(Word), {string_codes(Word2,Word),string(Word2)}, + ",", + word(Word3), {string_codes(Word4,Word3),string(Word4)}, + "]". + +splitfurthert(BrDict01,N) :- + phrase(file0t(N),BrDict01). + +file0t(N) --> "[", filet(N), "]", !. +file0t([]) --> []. + +%%file([]) --> []. +filet([L|Ls]) --> entryt(L),",", +%%{writeln(L)}, %%*** +filet(Ls), !. %% file(Ls),{M=[Ls]})), !. %%, {writeln(["l",L])},",", file(Ls), {writeln(["ls",Ls])},!. %%, {append(L,Ls,M)}, !. +filet([L]) --> entryt(L), +%%{writeln(L)}, +!. %%(entry(L),{M=L});{M=[],(writeln("Warning - Entry in incorrect format.") +%%,abort +%%)}, !. + +entryt([Word2,X3,Y3,Z3]) --> + "[", word(Word), {string_codes(Word2,Word),string(Word2)}, + ",", + digits(X),",",{atom_codes(X2,X),atom_number(X2,X3),number(X3)}, + digits(Y),",",{atom_codes(Y2,Y),atom_number(Y2,Y3),number(Y3)}, + digits(Z),{atom_codes(Z2,Z),atom_number(Z2,Z3),number(Z3)}, + "]". + +word([X|Xs]) --> [X], {char_type(X,csymf)->true;(X=27->true;X=8217)}, word(Xs), !. +%%word([X]) --> [X], {char_type(X,csymf);(X=27;X=8217)}, !. +word([]) --> []. + +digits([X|Xs]) --> [X], {(char_type(X,digit)->true;(string_codes(Word2,[X]),Word2="."))}, digits(Xs), !. +%%digits([X]) --> [X], {(char_type(X,digit);(string_codes(Word2,[X]),Word2="."))}, !. +digits([]) --> []. + +t2ab_br(Word,_):-%[],B,B,C,C,D,D,L,L) :- +% !. 
+%br([Word|Words],BrDict,BrDict2,AlgDict4,AlgDict5,AlgDict6,AlgDict7,AlgString1,AlgString2) :- + downcase_atom(Word, Word2), atom_string(Word2,Word3), + + /* + words_to_read(WR1), + (WR1>0->(writeln(WR1),write(Word), + texttobr2(3),nl,sleep(0.12), + WR2 is WR1-1, + retractall(words_to_read(_)), + assertz(words_to_read(WR2))); + true), + */ + + /**member([Word3,X,Y,Z],AlgDict4) -> %% This feature is a bug because words in brdict2 shouldn't necessarily be the words in brdict1 + %%(append(BrDict,[[Word3,""]],BrDict3), BrDict3t=AlgDict4, + %%br(Words,BrDict3,BrDict2,BrDict3t,AlgDict5)) + %%; + %%(**/ + + %%(member([Word3,X,Y,Z],AlgDict4) -> %% This feature is a bug because words in brdict1 should correspond to those in brdict2 + %%(atom_concat("The breasoning for ", Word3, P1), + %%atom_concat(P1, " is defined. Enter object name (without spaces), if different for ", Prompt)); + %Prompt="Enter object name (without spaces), if different for "), + + %%writeln([word3,Word3]), + + %trace, + t2ab_brDict031(BrDict), + t2ab_algString1(AlgString1), + + (member([Word3,String4],BrDict)-> + (BrDict3=BrDict,AlgString1=AlgString3, + (String4=""->String41=Word3;String41=String4), + String5=String41); + ((repeat, + write("Enter object name (without spaces), if different for "), writeln(Word3),(auto(on)->String2="plus";read_string(user_input, "\n", "\r", _End2, String2)),split_string(String2, "", " ", String3),String3=[String4]), + %%*brth(Word3,_Brth), + +(String4=""->String5=Word3;String5=String4), + append(BrDict,[[Word3,String5]],BrDict3), + append(AlgString1,[[Word3," ",String5," "]],AlgString3) + %texttobr2(1,u,String5,1) + ) + ), + + t2ab_algDict_x1(AlgDict4), + + downcase_atom(String5, String52), atom_string(String52,String53), +%trace, + (member([String53,X1],AlgDict4)-> + (AlgDict41=AlgDict4,AlgString3=AlgString4, + (X1=""->StringX1=String53;StringX1=X1), + String51=StringX1); + + ((repeat, + write("Enter algorithm name for "), 
writeln(String53),(auto(on)->String21="write";read_string(user_input, "\n", "\r", _, String21)),split_string(String21, "", " ", String31),String31=[String411]), + %%*brth(Word3,_Brth), + +(String411=""->String51=String53;String51=String411), + append(AlgDict4,[[String53,String51]],AlgDict41), + append(AlgString3,[String51," "],AlgString4) + %texttobr2(1,u,String51,1) + )), + + downcase_atom(String51, String521), + atom_string(String521,String531), + + t2ab_algDict1(AlgDict6), + (member([String531,_Y1],AlgDict6)-> + (AlgDict61=AlgDict6,AlgString4=AlgString5); + + ((repeat, + write("Enter Prolog algorithm for "), writeln(String531),(auto(on)->String1="writeln(A)";read_string(user_input, "\n", "\r", _End, String1)), + split_string(String1, "\n\r", "\n\r", String),%Values=[X1,Y1,Z1],number_string(X,X1),number_string(Y,Y1),number_string(Z,Z1)), + (String=""->String11=String531;String11=String), + + append(AlgDict6,[[String531,String11]],AlgDict61), + append(AlgString4,[String11,"\n\n"],AlgString5) + ))), + %%*brth(String53,_Brth2), + %%write("br(\'"),write(Word3),writeln("\',)."), +%% writeln([Word3,X,Y,Z]), + %%write(' '), + +retractall(t2ab_brDict03(_)), +assertz(t2ab_brDict03(BrDict3)), + +retractall(t2ab_algDict_x(_)), +assertz(t2ab_algDict_x(AlgDict41)), + +retractall(t2ab_algDict(_)), +assertz(t2ab_algDict(AlgDict61)), + +retractall(t2ab_algString(_)), +assertz(t2ab_algString(AlgString5)), + +!. + + +%br(Words,BrDict3,BrDict2,AlgDict41,AlgDict5,AlgDict61,AlgDict7,AlgString5,AlgString2). + %%). + +%% finds unknown words, asks for their br in form "n of m: word", verify, (can go back x) append and sort, save + +%/* +process_t2ab(A,C) :- + replace_t2ab(Replacements), + atom_string(A1,A), + replace1_t2ab(Replacements,A1,D1), + atom_string(D1,C),!. + +replace1_t2ab([],A,A) :- !. +replace1_t2ab(Replacements,A,D) :- + Replacements=[[B,C]|G], + atomic_list_concat(E,B,A), + atomic_list_concat(E,C,F), + replace1_t2ab(G,F,D),!. 
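+% Note on the replace1_t2ab/3 clauses above: they use SWI-Prolog's
+% atomic_list_concat/3 in both modes -- called with the atom bound it
+% splits on the separator B, and called with the list bound it joins
+% with the replacement C. A minimal sketch of one replacement step
+% (the query and bindings are illustrative, not part of the original
+% source; they assume SWI-Prolog):
+%
+% ?- atomic_list_concat(E, x, axb),  % split 'axb' on 'x'
+%    atomic_list_concat(E, y, F).    % rejoin with 'y'
+% E = [a, b],
+% F = ayb.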
+ + replace_t2ab([['\\',''],['–',' '],['—',' '],['“','\''],['”','\''],['‘','\''],['’','\''],['⁃','-']]). +%*/ \ No newline at end of file diff --git a/texttoalg/.DS_Store b/texttoalg/.DS_Store new file mode 100644 index 0000000000000000000000000000000000000000..5008ddfcf53c02e82d7eee2e57c38e5672ef89f6 Binary files /dev/null and b/texttoalg/.DS_Store differ diff --git a/texttoalg/LICENSE b/texttoalg/LICENSE new file mode 100644 index 0000000000000000000000000000000000000000..d925e8c4aa63a274f76a5c22de79d0ada99b0da7 --- /dev/null +++ b/texttoalg/LICENSE @@ -0,0 +1,29 @@ +BSD 3-Clause License + +Copyright (c) 2019, Lucian Green +All rights reserved. + +Redistribution and use in source and binary forms, with or without +modification, are permitted provided that the following conditions are met: + +1. Redistributions of source code must retain the above copyright notice, this + list of conditions and the following disclaimer. + +2. Redistributions in binary form must reproduce the above copyright notice, + this list of conditions and the following disclaimer in the documentation + and/or other materials provided with the distribution. + +3. Neither the name of the copyright holder nor the names of its + contributors may be used to endorse or promote products derived from + this software without specific prior written permission. + +THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" +AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE +IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE +DISCLAIMED. 
IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE +FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL +DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR +SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER +CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, +OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE +OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. diff --git a/texttoalg/README.md b/texttoalg/README.md new file mode 100644 index 0000000000000000000000000000000000000000..d4b66173baea209781059d5c6b49f643a08caa52 --- /dev/null +++ b/texttoalg/README.md @@ -0,0 +1,63 @@ +# texttoalg +Text to Algorithm + +Text to Algorithm is a SWI-Prolog algorithm that prepares to breason out file.txt with an algorithm for each word, producing a pastable Shell script file1a.txt, mainly used for postgraduate Pedagogy musings. Breasoning is thinking of the x, y and z dimensions of objects, necessary for meeting grades. + +# Getting Started + +Please read the following instructions on how to install the project on your computer for breasoning out algorithms. + +# Prerequisites + +* Please download and install SWI-Prolog for your machine at `https://www.swi-prolog.org/build/`. + +# 1. Install manually + +Download this repository, the List Prolog Interpreter Repository and the Text to Breasonings Repository. + +# 2. Or Install from List Prolog Package Manager (LPPM) + +* Download the LPPM Repository: + +``` +mkdir GitHub +cd GitHub/ +git clone https://github.com/luciangreen/List-Prolog-Package-Manager.git +cd List-Prolog-Package-Manager +swipl +['lppm']. +lppm_install("luciangreen","texttoalg"). 
+halt
+```
+
+# Running
+
+* In Shell:
+`cd texttoalg`
+`swipl`
+`['../Text-to-Breasonings/text_to_breasonings.pl'].`
+`['texttoalg'].`
+
+* Enter the following to breason out Breasonings, Breathsonings, Room, PartOfRoom, Direction, ObjectToPrepare and ObjectToFinish. (Note: this takes quite a while.)
+`texttoalg(u,u,u,u,true,true,true,true,true,true).`
+
+* Enter the following to breason out the Breasonings and Breathsonings only:
+`texttoalg(u,u,u,u,true,false,false,false,false,false).`
+
+* Before running texttobr, think of two radio buttons put on recordings, put through with prayer, nut and bolt, quantum box prayer 1, 1, 0.5 cm and 1, 1, 0.5 cm.
+
+* Follow the instructions in https://github.com/luciangreen/Text-to-Breasonings/blob/master/Instructions_for_Using_texttobr(2).pl.txt when using texttoalg, texttobr, or texttobr2 to avoid medical problems.
+
+* Before breasoning, breason out algdict1.txt and algdict2.txt to allow breasoning multiple instances by dragging them from Finder (Mac) to an empty BBEdit window for file.txt, then enter:
+`['../Text-to-Breasonings/text_to_breasonings.pl'].`
+`texttobr2(u,u,u,u,true,false,false,false,false,false),texttobr(u,u,u,u).`
+
+* Not recommended (due to idiosyncrasies of Shell, breasoning out the dictionaries in the previous step may have to suffice): copy and paste the contents of file1a.txt into a Terminal window (Mac), one to a few lines at a time, to breason out algorithms for all instances of words.
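+
+* The run steps above can be collected into one session (a sketch only: it assumes texttoalg and Text-to-Breasonings are sibling folders, as in the install steps, and the bracketed goals are entered at the SWI-Prolog prompt):
+
+```
+cd texttoalg
+swipl
+['../Text-to-Breasonings/text_to_breasonings.pl'].
+['texttoalg'].
+texttoalg(u,u,u,u,true,false,false,false,false,false).
+halt.
+```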
+ +# Authors + +Lucian Green - Initial programmer - Lucian Academy + +# License + +I licensed this project under the BSD3 License - see the LICENSE.md file for details diff --git a/texttoalg/algdict1.txt b/texttoalg/algdict1.txt new file mode 100644 index 0000000000000000000000000000000000000000..3edc66973b21c3dafa0bcad2b02ac7b0e8cb041c --- /dev/null +++ b/texttoalg/algdict1.txt @@ -0,0 +1 @@ +[["a",""],["algorithms","modusponens"],["all","equals"],["and","modusponens"],["are","equals"],["arguments","modusponens"],["as","equals"],["attended","associative"],["b","b"],["c","equals"],["certain","equals"],["computer","member1a"],["confidence","greaterthan"],["conjunction","modusponens"],["copied","getitemn"],["did","conjunction"],["duplicate","modusponens"],["finds","modusponens"],["first","equals"],["fred","downpipe"],["has","modusponens"],["have","sum"],["he","substring"],["in","modusponens"],["items","modusponens"],["lectures","length"],["line","equals"],["lines","equals"],["member","modusponens"],["members","modusponens"],["minus",""],["new","append"],["not","not"],["number","number"],["of","modusponens"],["on","modusponens"],["or",""],["original","equals"],["pair","modusponens"],["returns","equals"],["science","minus1"],["second","equals"],["see","sort0"],["set","equals"],["someone","identical"],["splits","splitintosentences"],["tests","equals"],["the","equals"],["to","maximum0"],["tutor","disjunction"],["tutorials","optimise1"],["value","equals"],["values","equals"],["was","intersection1"],["whether","if"],["which","if"],["write","equals"],["year","delete2"]] \ No newline at end of file diff --git a/texttoalg/algdict2.txt b/texttoalg/algdict2.txt new file mode 100644 index 0000000000000000000000000000000000000000..9a1a4ebaf7f08c19689e05a54a84bfb8be45e7b4 --- /dev/null +++ b/texttoalg/algdict2.txt @@ -0,0 +1 @@ 
+[["a",[[n,function],[1,1,[v,c]]],[[[n,function],[[v,a],[v,b],[v,c]],":-",[[[n,+],[[v,a],[v,b],[v,c]]]]]],[[[[v,c],2]]]],["append",[[n,append1],[["a"],["b"],[v,s]]],[[[n,append1],[[v,a],[v,b],[v,s]],":-",[[[n,append],[[v,a],[v,b],[v,s]]]]]],[[[[v,s],["a","b"]]]]],["associative",[[n,associative],[1,2,3]],[[[n,associative],[[v,a],[v,b],[v,c]],":-",[[[n,*],[[v,a],[v,b],[v,d]]],[[n,*],[[v,d],[v,c],[v,e]]],[[n,*],[[v,b],[v,c],[v,f]]],[[n,*],[[v,f],[v,a],[v,e]]]]]],[[]]],["b","[]","[]","[]"],["conjunction",[[n,conjunction],["true","false",[v,c]]],[[[n,conjunction],["true","true","true"]],[[n,conjunction],[[v,a],[v,b],"false"],":-",[[[n,not],[[[[n,=],[[v,a],"true"]],[[n,=],[[v,b],"true"]]]]]]]],[[[[v,c],"false"]]]],["delete2",[[n,delete2],[[1,1,2],1,[],[v,a]]],[[[n,delete2],[[],[v,a],[v,l],[v,l]]],[[n,delete2],[[v,l1],[v,i1],[v,l2],[v,l3]],":-",[[[n,head],[[v,l1],[v,i1]]],[[n,tail],[[v,l1],[v,l5]]],[[n,delete2],[[v,l5],[v,i1],[v,l2],[v,l3]]]]],[[n,delete2],[[v,l1],[v,i1],[v,l2],[v,l3]],":-",[[[n,head],[[v,l1],[v,i2]]],[[n,tail],[[v,l1],[v,l5]]],[[n,not],[[[n,=],[[v,i1],[v,i2]]]]],[[n,wrap],[[v,i2],[v,i21]]],[[n,append],[[v,l2],[v,i21],[v,l6]]],[[n,delete2],[[v,l5],[v,i1],[v,l6],[v,l3]]]]]],[[[[v,a],[2]]]]],["disjunction",[[n,disjunction],["true","false",[v,c]]],[[[n,disjunction],["false","false","false"]],[[n,disjunction],[[v,a],[v,b],"true"],":-",[[[n,not],[[[[n,=],[[v,a],"false"]],[[n,=],[[v,b],"false"]]]]]]]],[[[[v,c],"true"]]]],["downpipe",[[n,downpipe],[3,1,[[3,[4,2]],[2,[3,1]]]]],[[[n,downpipe],[[v,a],[v,a],[v,b]]],[[n,downpipe],[[v,a],[v,b],[v,c]],":-",[[[n,member],[[v,c1],[v,c]]],[[n,equals1],[[v,c1],[[v,c11],[v,c12]]]],[[n,equals1],[[v,c12],[[v,c121],[v,c122]]]],[[n,"->"],[[[n,>],[[v,a],[v,c121]]],[[n,downpipe],[[v,c121],[v,b],[v,c]]],[[n,"->"],[[[n,>],[[v,a],[v,c122]]],[[n,downpipe],[[v,c122],[v,b],[v,c]]],[[n,fail]]]]]]]]],[[]]],["equals",[[n,equals11],["a","a"]],[[[n,equals11],[[v,a],[v,a]]]],[[]]],["getitemn",[[n,getitemn],[3,[1,2,3],[v,c]]],[[[n,getitemn],[1,
[v,b],[v,c]],":-",[[[n,head],[[v,b],[v,c]]]]],[[n,getitemn],[[v,a],[v,b],[v,c]],":-",[[[n,not],[[[n,=],[[v,a],0]]]],[[n,tail],[[v,b],[v,t]]],[[n,-],[[v,a],1,[v,d]]],[[n,getitemn],[[v,d],[v,t],[v,c]]]]]],[[[[v,c],3]]]],["greaterthan",[[n,greaterthan],[3,2]],[[[n,greaterthan],[[v,a],[v,b]],":-",[[[n,>],[[v,a],[v,b]]]]]],[[]]],["identical",[[n,identical],[1,2]],[[[n,identical],[[v,a],[v,b]],":-",[[[n,+],[[v,a],[v,b],[v,c]]],[[n,+],[[v,b],[v,a],[v,c]]]]]],[[]]],["if",[[n,if11],[1,[v,b]]],[[[n,if11],[[v,a],[v,b]],":-",[[[n,"->"],[[[n,is],[[v,a],1]],[[n,is],[[v,b],2]],[[n,is],[[v,b],3]]]]]]],[[[[v,b],2]]]],["intersection1",[[n,intersection1],[[1,2,3],[3,4,5],[],[v,a]]],[[[n,intersection1],[[],[v,a],[v,l],[v,l]]],[[n,intersection1],[[v,l1],[v,l2],[v,l3a],[v,l3]],":-",[[[n,head],[[v,l1],[v,i1]]],[[n,tail],[[v,l1],[v,l4]]],[[n,intersection2],[[v,i1],[v,l2],[],[v,l5]]],[[n,append],[[v,l3a],[v,l5],[v,l6]]],[[n,intersection1],[[v,l4],[v,l2],[v,l6],[v,l3]]]]],[[n,intersection2],[[v,a],[],[v,l],[v,l]]],[[n,intersection2],[[v,i1],[v,l1],[v,l2],[v,l3]],":-",[[[n,head],[[v,l1],[v,i1]]],[[n,tail],[[v,l1],[v,l4]]],[[n,wrap],[[v,i1],[v,i11]]],[[n,append],[[v,l2],[v,i11],[v,l5]]],[[n,intersection2],[[v,i1],[v,l4],[v,l5],[v,l3]]]]],[[n,intersection2],[[v,i1],[v,l1],[v,l2],[v,l3]],":-",[[[n,head],[[v,l1],[v,i2]]],[[n,tail],[[v,l1],[v,l4]]],[[n,not],[[[n,=],[[v,i1],[v,i2]]]]],[[n,intersection2],[[v,i1],[v,l4],[v,l2],[v,l3]]]]]],[[[[v,a],[3]]]]],["length",[[n,length],[[1],0,[v,l]]],[[[n,length],[[],[v,l],[v,l]]],[[n,length],[[v,l],[v,m1],[v,n]],":-",[[[n,not],[[[n,=],[[v,l],[]]]]],[[n,tail],[[v,l],[v,t]]],[[n,+],[[v,m1],1,[v,m2]]],[[n,length],[[v,t],[v,m2],[v,n]]]]]],[[[[v,l],1]]]],["maximum0",[[n,maximum0],[[2,1,3,5,-1],[v,m]]],[[[n,maximum0],[[v,l],[v,m]],":-",[[[n,head],[[v,l],[v,h]]],[[n,tail],[[v,l],[v,t]]],[[n,maximum],[[v,t],[v,h],[v,m],[],[v,r]]]]],[[n,maximum],[[],[v,l],[v,l],[v,r],[v,r]]],[[n,maximum],[[v,l],[v,m1],[v,n],[v,r1],[v,r2]],":-",[[[n,not],[[[n,=],[[v,l],[]]]]],[[n,head
],[[v,l],[v,h]]],[[n,tail],[[v,l],[v,t]]],[[n,"->"],[[[n,>=],[[v,m1],[v,h]]],[[[n,=],[[v,m2],[v,m1]]],[[n,wrap],[[v,h],[v,h2]]],[[n,append],[[v,r1],[v,h2],[v,r3]]]],[[[[n,=],[[v,m2],[v,h]]]],[[n,wrap],[[v,m1],[v,m12]]],[[n,append],[[v,r1],[v,m12],[v,r3]]]]]],[[n,maximum],[[v,t],[v,m2],[v,n],[v,r3],[v,r2]]]]]],[[[[v,m],5]]]],["member1a",[[n,member1a],[1,[1,2]]],[[[n,member1a],[[v,i1],[v,l]],":-",[[[n,intersection2],[[v,i1],[v,l],[],[v,m]]]]],[[n,intersection2],[[v,a],[],[v,l],[v,l]]],[[n,intersection2],[[v,i1],[v,l1],[v,l2],[v,l3]],":-",[[[n,head],[[v,l1],[v,i1]]],[[n,tail],[[v,l1],[v,l4]]],[[n,wrap],[[v,i1],[v,i11]]],[[n,append],[[v,l2],[v,i11],[v,l5]]],[[n,intersection2],[[v,i1],[v,l4],[v,l5],[v,l3]]]]],[[n,intersection2],[[v,i1],[v,l1],[v,l2],[v,l3]],":-",[[[n,head],[[v,l1],[v,i2]]],[[n,tail],[[v,l1],[v,l4]]],[[n,not],[[[n,=],[[v,i1],[v,i2]]]]],[[n,intersection2],[[v,i1],[v,l4],[v,l2],[v,l3]]]]]],[[]]],["minus",[[n,minus11],[[1,2,3],[3],[v,c]]],[[[n,minus11],[[v,a],[],[v,a]]],[[n,minus11],[[v,a],[v,b],[v,c]],":-",[[[n,head],[[v,b],[v,h]]],[[n,tail],[[v,b],[v,t]]],[[n,delete],[[v,a],[v,h],[v,c]]],[[n,minus11],[[v,c],[v,t],[v,c]]]]]],[[[[v,c],[1,2]]]]],["minus1",[[n,minus1],[[1,2,3],[1,2],[v,a]]],[[[n,minus1],[[v,l],[],[v,l]]],[[n,minus1],[[v,l1],[v,l2],[v,l3]],":-",[[[n,head],[[v,l2],[v,i1]]],[[n,tail],[[v,l2],[v,l5]]],[[n,delete2],[[v,l1],[v,i1],[],[v,l6]]],[[n,minus1],[[v,l6],[v,l5],[v,l3]]]]],[[n,delete2],[[],[v,a],[v,l],[v,l]]],[[n,delete2],[[v,l1],[v,i1],[v,l2],[v,l3]],":-",[[[n,head],[[v,l1],[v,i1]]],[[n,tail],[[v,l1],[v,l5]]],[[n,delete2],[[v,l5],[v,i1],[v,l2],[v,l3]]]]],[[n,delete2],[[v,l1],[v,i1],[v,l2],[v,l3]],":-",[[[n,head],[[v,l1],[v,i2]]],[[n,tail],[[v,l1],[v,l5]]],[[n,not],[[[n,=],[[v,i1],[v,i2]]]]],[[n,wrap],[[v,i2],[v,i21]]],[[n,append],[[v,l2],[v,i21],[v,l6]]],[[n,delete2],[[v,l5],[v,i1],[v,l6],[v,l3]]]]]],[[[[v,a],[3]]]]],["modusponens",[[n,modusponens],["a",[["a","b"],["c","d"],["e","f"]],[v,s]]],[[[n,modusponens],[[v,a],[v,ab],[v,b]],":-",[[[n,
member],[[v,ab1],[v,ab]]],[[n,equals1],[[v,ab1],[[v,a],[v,b]]]]]]],[[[[v,s],"b"]]]],["not",[[n,not11],[1]],[[[n,not11],[[v,a]],":-",[[[n,not],[[[n,=],[[v,a],2]]]]]]],[[]]],["number",[[n,number11],[1]],[[[n,number11],[[v,a]],":-",[[[n,number],[[v,a]]]]]],[[]]],["optimise1",[[n,optimise1],[[[5,4],[3,2],[1,0]],[v,d]]],[[[n,optimise1],[[v,a],[v,b]],":-",[[[n,head],[[v,a],[v,h]]],[[n,tail],[[v,a],[v,t]]],[[n,equals1],[[v,h],[[v,h1],[v,h2]]]],[[n,-],[[v,h1],[v,h2],[v,b]]],[[n,"->"],[[[n,not],[[[n,=],[[v,t],[]]]]],[[n,optimise1],[[v,t],[v,b]]],[[n,true]]]]]]],[[[[v,d],1]]]],["or",[[n,or11],[1]],[[[n,or11],[[v,a]],":-",[[[n,or],[[[n,is],[[v,a],1]],[[n,is],[[v,a],2]]]]]]],[[]]],["sort0",[[n,sort0],[[9,4,8,2,1,5,7,6,3,10],[v,l]]],[[[n,sort0],[[v,l],[v,n]],":-",[[[n,sort1],[[v,l],[],[v,n]]]]],[[n,sort1],[[],[v,l],[v,l]]],[[n,sort1],[[v,l],[v,m1],[v,n]],":-",[[[n,not],[[[n,=],[[v,l],[]]]]],[[n,head],[[v,l],[v,h]]],[[n,tail],[[v,l],[v,t]]],[[n,maximum],[[v,t],[v,h],[v,m2],[],[v,r]]],[[n,wrap],[[v,m2],[v,m3]]],[[n,append],[[v,m1],[v,m3],[v,m4]]],[[n,sort1],[[v,r],[v,m4],[v,n]]]]],[[n,maximum],[[],[v,l],[v,l],[v,r],[v,r]]],[[n,maximum],[[v,l],[v,m1],[v,n],[v,r1],[v,r2]],":-",[[[n,not],[[[n,=],[[v,l],[]]]]],[[n,head],[[v,l],[v,h]]],[[n,tail],[[v,l],[v,t]]],[[n,"->"],[[[n,>=],[[v,m1],[v,h]]],[[[n,=],[[v,m2],[v,m1]]],[[n,wrap],[[v,h],[v,h2]]],[[n,append],[[v,r1],[v,h2],[v,r3]]]],[[[[n,=],[[v,m2],[v,h]]]],[[n,wrap],[[v,m1],[v,m12]]],[[n,append],[[v,r1],[v,m12],[v,r3]]]]]],[[n,maximum],[[v,t],[v,m2],[v,n],[v,r3],[v,r2]]]]]],[[[[v,l],[10,9,8,7,6,5,4,3,2,1]]]]],["splitintosentences",[[n,splitintosentences],["aaa1 ,-'! a? b! 
b.",[v,t]]],[[[n,splitintosentences],[[v,u],[v,t]],":-",[[[n,compound21],[[v,u],"",[],[v,t]]]]],[[n,compound212],["","",[v,t],[v,t]]],[[n,compound212],[[v,u],[v,u],[v,t],[v,t]]],[[n,compound21],[[v,t],[v,u]],"->",[[[n,item],[[v,i]]],[[n,code],[[n,wrap],[[v,i],[v,itemname1]]],[[n,append],[[v,t],[v,itemname1],[v,v]]]],[[n,compound212],[[v,v],[v,u]]]]],[[n,compound21],[[v,t],[v,u]],"->",[[[n,item],[[v,i]]]," ",[[n,compound21],[[],[v,compound1name]]],[[n,code],[[n,wrap],[[v,i],[v,itemname1]]],[[n,append],[[v,t],[v,itemname1],[v,v]]],[[n,append],[[v,v],[v,compound1name],[v,u]]]]]],[[n,item],[[v,t]],"->",[[[n,word21],["",[v,t]]]]],[[n,item],[[v,t]],"->",[[[n,compound],[[],[v,t]]]]],[[n,word212],["","",[v,t],[v,t]]],[[n,word212],[[v,u],[v,u],[v,t],[v,t]]],[[n,word21],[[v,t],[v,u]],"->",[[v,a],[v,b],[[n,code],[[n,sentencechars],[[v,a]]],[[n,finalchar],[[v,b]]],[[n,stringconcat],[[v,t],[v,a],[v,v1]]],[[n,stringconcat],[[v,v1],[v,b],[v,v]]]],[[n,word212],[[v,v],[v,u]]]]],[[n,word21],[[v,t],[v,u]],"->",[[v,a],[[n,code],[[n,sentencechars],[[v,a]]],[[n,stringconcat],[[v,t],[v,a],[v,v]]]],[[n,word21],["",[v,wordstring]]],[[n,code],[[n,stringconcat],[[v,v],[v,wordstring],[v,u]]]]]],[[n,sentencechars],[[v,c]],":-",[[[n,letters],[[v,c]]]]],[[n,sentencechars],[[v,c]],":-",[[[[n,stringtonumber],[[v,c],[v,n]]],[[n,number],[[v,n]]]]]],[[n,sentencechars],[[v,c]],":-",[[[n,=],[[v,c]," "]]]],[[n,sentencechars],[[v,c]],":-",[[[n,=],[[v,c],","]]]],[[n,sentencechars],[[v,c]],":-",[[[n,=],[[v,c],"-"]]]],[[n,sentencechars],[[v,c]],":-",[[[n,=],[[v,c],"'"]]]],[[n,finalchar],[[v,c]],":-",[[[n,=],[[v,c],"."]]]],[[n,finalchar],[[v,c]],":-",[[[n,=],[[v,c],"!"]]]],[[n,finalchar],[[v,c]],":-",[[[n,=],[[v,c],"?"]]]]],[[[[v,t],["aaa1 
,-'!","a?","b!","b."]]]]],["substring",[[n,substring],[[1,2,3,4],[2,3]]],[[[n,substring],[[],[]]],[[n,substring],[[],[v,b]],":-",[[[n,not],[[[n,=],[[v,b],[]]]]],[[n,fail]]]],[[n,substring],[[v,a],[v,b]],":-",[[[n,head],[[v,a],[v,ah]]],[[n,tail],[[v,a],[v,at]]],[[n,"->"],[[[n,listhead],[[v,at],[v,bt]]],[[n,true]],[[n,substring],[[v,at],[v,b]]]]]]],[[n,listhead],[[],[v,l]]],[[n,listhead],[[v,a],[v,b]],":-",[[[n,head],[[v,a],[v,ah]]],[[n,tail],[[v,a],[v,at]]],[[n,head],[[v,b],[v,ah]]],[[n,tail],[[v,b],[v,bt]]],[[n,listhead],[[v,at],[v,bt]]]]]],[[]]],["sum",[[n,sum],[[3,1,2],0,[v,l]]],[[[n,sum],[[],[v,l],[v,l]]],[[n,sum],[[v,l],[v,m1],[v,n]],":-",[[[n,not],[[[n,=],[[v,l],[]]]]],[[n,head],[[v,l],[v,h]]],[[n,tail],[[v,l],[v,t]]],[[n,+],[[v,m1],[v,h],[v,m2]]],[[n,sum],[[v,t],[v,m2],[v,n]]]]]],[[[[v,l],6]]]]] \ No newline at end of file diff --git a/texttoalg/algdict3.txt b/texttoalg/algdict3.txt new file mode 100644 index 0000000000000000000000000000000000000000..ef84a24a77de2fb906cdb9361175b08795de0926 --- /dev/null +++ b/texttoalg/algdict3.txt @@ -0,0 +1 @@ +[[and,append],[hand,append],[up,plus],[word,append],[world,append]] \ No newline at end of file diff --git a/texttoalg/file.txt b/texttoalg/file.txt new file mode 100644 index 0000000000000000000000000000000000000000..3f155ba22fa221d9e5bc93a497f15e4ba964c66a --- /dev/null +++ b/texttoalg/file.txt @@ -0,0 +1 @@ +[[a,1,1.5,0 brontosaurus \ No newline at end of file diff --git a/texttoalg/t2alg3.pl b/texttoalg/t2alg3.pl new file mode 100644 index 0000000000000000000000000000000000000000..188c58f6069416b0ea87711b4ca7661be243f7b7 --- /dev/null +++ b/texttoalg/t2alg3.pl @@ -0,0 +1,472 @@ +:- include('../listprologinterpreter/la_strings.pl'). +%:- include('../Text-to-Breasonings/mergetexttobrdict.pl'). +:- include('../Text-to-Breasonings/texttobr2qb.pl'). +:- include('../Text-to-Breasonings/texttobr.pl'). +t2alg3(N1,Filex1,Stringx1,M1) :- + t2alg3(N1,Filex1,Stringx1,M1,false,false,false,false,false,false,0). 
+t2alg3(N1,Filex1,Stringx1,M1,Brth,Room,PartOfRoom,Direction, + + ObjectToPrepare,ObjectToFinish) :- + t2alg3(N1,Filex1,Stringx1,M1,Brth,Room, + PartOfRoom,Direction,ObjectToPrepare,ObjectToFinish,0). + + % don't run +t2alg3(N1,Filex1,Stringx1,M1,Brth,Room,PartOfRoom,Direction,ObjectToPrepare,ObjectToFinish,Words_to_read) :- + %retractall(complete_display(_)), + %assertz(complete_display(false)), + + retractall(words_to_read(_)), + assertz(words_to_read(Words_to_read)), + + ((number(N1),N=N1)->true; + (N1=u,N=1)), + + ((Filex1=u,Filex=%"file.txt"% + "../Text-to-Breasonings/file.txt" + )->true; + Filex=Filex1), + + ((number(M1),M=M1)->true; + M=all), %% If m1 is undefined or all then m=all + + prep(List1,BrDict03,BrDict03t,Filex,Stringx1,M,Brth,BrthDict03,Room,RoomDict03,PartOfRoom,PartOfRoomDict03,Direction,DirectionDict03,ObjectToPrepare,ObjectToPrepareDict03,ObjectToFinish,ObjectToFinishDict03), + br2(List1,BrDict03,BrDict2,BrDict03t,BrDict03t2,N,Brth,BrthDict03,BrthDict04,Room,RoomDict03,RoomDict04,PartOfRoom,PartOfRoomDict03,PartOfRoomDict04,Direction,DirectionDict03,DirectionDict04,ObjectToPrepare,ObjectToPrepareDict03,ObjectToPrepareDict04,ObjectToFinish,ObjectToFinishDict03,ObjectToFinishDict04), + sort(BrDict2,BrDict3), + (BrDict03=BrDict3->true; + %(open_s("brdict1.txt",write,Stream), + (open_s("../Text-to-Breasonings/brdict1.txt",write,Stream), +% string_codes(BrDict3), + write(Stream,BrDict3), + close(Stream))), + + sort(BrDict03t2,BrDict03t3), + (BrDict03t=BrDict03t3->true; + (open_s("algdict3.txt",write,Stream2), +%% string_codes(BrDict3), + write(Stream2,BrDict03t3), + close(Stream2))), + + /* ((Brth=true, + sort(BrthDict04,BrthDict044), + (BrthDict03=BrthDict044->true; + (open_s("../Text-to-Breasonings/brthdict.txt",write,Stream3), +%% string_codes(BrDict3), + write(Stream3,BrthDict044), + close(Stream3))))->true;true), + + ((Room=true, + sort(RoomDict04,RoomDict044), + (RoomDict04=RoomDict044->true; + 
(open_s("../Text-to-Breasonings/roomdict.txt",write,Stream4), +%% string_codes(BrDict3), + write(Stream4,RoomDict044), + close(Stream4))))->true;true), + + ((PartOfRoom=true, + sort(PartOfRoomDict04,PartOfRoomDict044), + (PartOfRoomDict04=PartOfRoomDict044->true; + (open_s("../Text-to-Breasonings/partofroomdict.txt",write,Stream5), +%% string_codes(BrDict3), + write(Stream5,PartOfRoomDict044), + close(Stream5))))->true;true), + + ((Direction=true, + sort(DirectionDict04,DirectionDict044), + (DirectionDict04=DirectionDict044->true; + (open_s("../Text-to-Breasonings/directiondict.txt",write,Stream6), +%% string_codes(BrDict3), + write(Stream6,DirectionDict044), + close(Stream6))))->true;true), + + ((ObjectToPrepare=true, + sort(ObjectToPrepareDict04,ObjectToPrepareDict044), + (ObjectToPrepareDict04=ObjectToPrepareDict044->true; + (open_s("../Text-to-Breasonings/objecttopreparedict.txt",write,Stream7), +%% string_codes(BrDict3), + write(Stream7,ObjectToPrepareDict044), + close(Stream7))))->true;true), + + ((ObjectToFinish=true, + sort(ObjectToFinishDict04,ObjectToFinishDict044), + (ObjectToFinishDict04=ObjectToFinishDict044->true; + (open_s("../Text-to-Breasonings/objecttofinishdict.txt",write,Stream8), +%% string_codes(BrDict3), + write(Stream8,ObjectToFinishDict044), + close(Stream8))))->true;true), + + */ + length(List1,List1_length_a), + Dividend_a is ceiling(List1_length_a/250), + Dividend_b is Dividend_a*3, % for graciously giving + texttobr2_a(Dividend_b,meditation), + texttobr2_a(Dividend_b,medicine), + texttobr2_a(Dividend_b,pedagogy), + + !. + + +replace0(Input,Find,Replace,SepandPad,M,Output0) :- + replace00(Input,Find,Replace,SepandPad,[],Output1), + truncate(Output1,M,Output0),!. +replace00([],_Find,_Replace,_SepandPad,Input,Input) :- !. 
+replace00(Input1,Find,Replace,SepandPad,Input2,Input3) :- + Input1=[Input4|Input5], + string_codes(Input4,String1), + replace1(String1,Find,Replace,[],Input7), + string_codes(Output,Input7), + split_string(Output,SepandPad,SepandPad,Input8), + append(Input2,Input8,Input9), + replace00(Input5,Find,Replace,SepandPad,Input9,Input3), !. +replace1([],_Find,_Replace,Input,Input) :- !. +replace1(Input1,Find,Replace,Input2,Input3) :- + Input1=[Input4|Input5], + member(Input4,Find), + append(Input2,[Replace],Input6), + replace1(Input5,Find,Replace,Input6,Input3), !. +replace1(Input1,Find,Replace,Input2,Input3) :- + Input1=[Input4|Input5], + not(member(Input4,Find)), + append(Input2,[Input4],Input6), + replace1(Input5,Find,Replace,Input6,Input3), !. + +split_string_onnonletter(String00,List1) :- + string_codes(String00,String1), + split_string_onnonletter(String1,[],List0), + string_codes(List0,List2), + split_string(List2," "," ",List1),!. +split_string_onnonletter([],Input,Input) :- !. +split_string_onnonletter(Input1,Input2,Input3) :- + Input1=[Input4|Input5], + not(char_type(Input4,alpha)), + append(Input2,[32],Input6), + split_string_onnonletter(Input5,Input6,Input3), !. +split_string_onnonletter(Input1,Input2,Input3) :- + Input1=[Input4|Input5], + char_type(Input4,alpha), + append(Input2,[Input4],Input6), + split_string_onnonletter(Input5,Input6,Input3), !. + +%% Truncates the list if m is not undefined and m is greater than or equal to the length of string0 +truncate(List1,M,String0) :- + ((number(M),length(String0,M), + append(String0,_,List1))->true; + String0=List1),!. 
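+
+% truncate/3 above keeps only the first M words when M is a number
+% (via length/2 generating an M-element prefix that append/3 matches
+% against the list) and passes the list through unchanged otherwise,
+% e.g. when M is the atom all. Illustrative queries (not part of the
+% original source):
+%
+% ?- truncate([the,cat,sat], 2, T).
+% T = [the, cat].
+%
+% ?- truncate([the,cat,sat], all, T).
+% T = [the, cat, sat].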
+ +prep(List,BrDict03,BrDict03t,Filex,Stringx1,M,Brth,BrthDict03,Room,RoomDict03,PartOfRoom,PartOfRoomDict03,Direction,DirectionDict03,ObjectToPrepare,ObjectToPrepareDict03,ObjectToFinish,ObjectToFinishDict03) :- + phrase_from_file_s(string(BrDict0), "../Text-to-Breasonings/brdict1.txt"), + %phrase_from_file_s(string(BrDict0), "brdict1.txt"), + %%Chars="’", + SepandPad="&#@~%`$?-+*^,()|.:;=_/[]<>{}\n\r\s\t\\\"!'0123456789", + %%split_string(BrDict0,SepandPad,SepandPad,BrDict01), +%%writeln([brDict0,BrDict0]), +%%writeln([brdict1]), + splitfurther(BrDict0,BrDict01), +%%writeln([brDict01,BrDict01]), + %%char_code(Escape,27), + %%delete(BrDict01,[Escape,_,_,_,_],BrDict021), +%%writeln([brDict021,BrDict021]), + %%char_code(Apostrophe,8217), + %%delete(BrDict021,[Apostrophe,_,_,_,_],BrDict02), +%%writeln([brDict02,BrDict02]), + sort(BrDict01,BrDict03), +%%writeln([brDict03,BrDict03]), + length(BrDict03,Length0),write("Number of words in dictionary: "), writeln(Length0), + + %%writeln(''), + %%writeln([brdict2]), + phrase_from_file_s(string(BrDict0t), "algdict3.txt"), + %%Chars="’", + %%split_string(BrDict0,SepandPad,SepandPad,BrDict01), +%%writeln([brDict0,BrDict0]), + splitfurther(BrDict0t,BrDict01t), +%%writeln([brDict01,BrDict01]), + %%delete(BrDict01t,[Escape,_,_,_,_],BrDict021t), +%%writeln([brDict021,BrDict021]), + %%delete(BrDict021t,[Apostrophe,_,_,_,_],BrDict02t), +%%writeln([brDict02,BrDict02]), + sort(BrDict01t,BrDict03t), +%%writeln([brDict03,BrDict03]), + length(BrDict03t,Length0t),write("Number of unique algorithms in dictionary: "), writeln(Length0t), + + ((Stringx1=u, + phrase_from_file_s(string(String00), Filex))->true; + String00=Stringx1), + + split_string(String00,SepandPad,SepandPad,List1), + %%split_string_onnonletter(String00,List1), + + truncate(List1,M,List), + + /**replace0(String0,[8221, 8220], 34, SepandPad, M, String1), + replace0(String1,[8216, 8217], 39, SepandPad, M, String2), + replace0(String2,[8259, 8211, 8212], 45, SepandPad, M, 
String3), + replace0(String3,[160], 32, SepandPad, M, List), + **/ + +%%atom_codes(Atom999,String),writeln([atom999,Atom999]), + +%%writeln([list,List]), + %%delete(List,Escape,List11), +%%writeln([list11,List11]), + %%delete(List11,Apostrophe,List1), +%%writeln([list1,List1]), + length(List,Length1),write("Number of words to breason out in file: "), writeln(Length1), + sort(List,List2), +%%writeln([list2,List2]), + length(List2,Length2),write("Number of unique words in file: "), writeln(Length2) + + + +/* ((Brth=true, + phrase_from_file_s(string(BrthDict0), "../Text-to-Breasonings/brthdict.txt"), splitfurther(BrthDict0,BrthDict01), + sort(BrthDict01,BrthDict03), + length(BrthDict03,BrthLength0),write("Number of unique breathsonings in dictionary: "), writeln(BrthLength0))->true;true), + + + ((Room=true, + phrase_from_file_s(string(RoomDict0), "../Text-to-Breasonings/roomdict.txt"), splitfurther(RoomDict0,RoomDict01), + sort(RoomDict01,RoomDict03), + length(RoomDict03,RoomLength0),write("Number of unique rooms in dictionary: "), writeln(RoomLength0))->true;true), + + ((PartOfRoom=true, + phrase_from_file_s(string(PartOfRoomDict0), "../Text-to-Breasonings/partofroomdict.txt"), splitfurther(PartOfRoomDict0,PartOfRoomDict01), + sort(PartOfRoomDict01,PartOfRoomDict03), + length(PartOfRoomDict03,PartOfRoomLength0),write("Number of unique parts of rooms in dictionary: "), writeln(PartOfRoomLength0))->true;true), + + ((Direction=true, + phrase_from_file_s(string(DirectionDict0), "../Text-to-Breasonings/directiondict.txt"), splitfurther(DirectionDict0,DirectionDict01), + sort(DirectionDict01,DirectionDict03), + length(DirectionDict03,DirectionLength0),write("Number of unique directions in dictionary: "), writeln(DirectionLength0))->true;true), + + ((ObjectToPrepare=true, + phrase_from_file_s(string(ObjectToPrepareDict0), "../Text-to-Breasonings/objecttopreparedict.txt"), splitfurther(ObjectToPrepareDict0,ObjectToPrepareDict01), + 
sort(ObjectToPrepareDict01,ObjectToPrepareDict03), + length(ObjectToPrepareDict03,ObjectToPrepareLength0),write("Number of unique objects to prepare in dictionary: "), writeln(ObjectToPrepareLength0))->true;true), + + ((ObjectToFinish=true, + phrase_from_file_s(string(ObjectToFinishDict0), "../Text-to-Breasonings/objecttofinishdict.txt"), splitfurther(ObjectToFinishDict0,ObjectToFinishDict01), + sort(ObjectToFinishDict01,ObjectToFinishDict03), + length(ObjectToFinishDict03,ObjectToFinishLength0),write("Number of unique objects to finish in dictionary: "), writeln(ObjectToFinishLength0))->true;true), +*/ + + +,!. + +br2(_,A,A,B,B,0,_Brth,BrthDict03,BrthDict03,_Room,RoomDict03,RoomDict03,_PartOfRoom,PartOfRoomDict03,PartOfRoomDict03,_Direction,DirectionDict03,DirectionDict03,_ObjectToPrepare,ObjectToPrepareDict03,ObjectToPrepareDict03,_ObjectToFinish,ObjectToFinishDict03,ObjectToFinishDict03) :- !. +br2(List1,BrDict03,BrDict2,BrDict03t,BrDict03t2,N1,Brth,BrthDict03,BrthDict04,Room,RoomDict03,RoomDict04,PartOfRoom,PartOfRoomDict03,PartOfRoomDict04,Direction,DirectionDict03,DirectionDict04,ObjectToPrepare,ObjectToPrepareDict03,ObjectToPrepareDict04,ObjectToFinish,ObjectToFinishDict03,ObjectToFinishDict04) :- + br(List1,BrDict03,BrDict21,BrDict03t,BrDict03t21,Brth,BrthDict03,BrthDict041,Room,RoomDict03,RoomDict041,PartOfRoom,PartOfRoomDict03,PartOfRoomDict041,Direction,DirectionDict03,DirectionDict041,ObjectToPrepare,ObjectToPrepareDict03,ObjectToPrepareDict041,ObjectToFinish,ObjectToFinishDict03,ObjectToFinishDict041), + N2 is N1-1, + br2(List1,BrDict21,BrDict2,BrDict03t21,BrDict03t2,N2,Brth,BrthDict041,BrthDict04,Room,RoomDict041,RoomDict04,PartOfRoom,PartOfRoomDict041,PartOfRoomDict04,Direction,DirectionDict041,DirectionDict04,ObjectToPrepare,ObjectToPrepareDict041,ObjectToPrepareDict04,ObjectToFinish,ObjectToFinishDict041,ObjectToFinishDict04),!. + +towords2([],A,A) :- !. 
+towords2(BrDict03,A,B) :- + BrDict03=[[Word,_,_,_]|Rest], + %%atom_string(Atom,Word), + append(A,[Word],C), + towords2(Rest,C,B). + +towords2a([],A,A) :- !. +towords2a(BrDict03,A,B) :- + BrDict03=[[Word,_]|Rest], + %%atom_string(Atom,Word), + append(A,[Word],C), + towords2a(Rest,C,B). + +towords3([],A,A,C,C,D,D) :- !. +towords3(BrDict03,A,B,D,E,G,H) :- + BrDict03=[[Word1,Word2]|Rest], + (Word2=""->append(G,[Word1],I)->true; + append(G,[Word2],I)), + append(A,[Word1],C), + append(D,[Word2],F), + towords3(Rest,C,B,F,E,I,H). + +string(String) --> list(String). + +list([]) --> []. +list([L|Ls]) --> [L], list(Ls). + +splitfurther(BrDict01,N) :- + phrase(file0(N),BrDict01). + +file0(N) --> "[", file(N), "]", !. +file0([]) --> []. + +%%file([]) --> []. +file([L|Ls]) --> entry(L),",", +%%{writeln(L)}, %%*** +file(Ls), !. %% file(Ls),{M=[Ls]})), !. %%, {writeln(["l",L])},",", file(Ls), {writeln(["ls",Ls])},!. %%, {append(L,Ls,M)}, !. +file([L]) --> entry(L), +%%{writeln(L)}, +!. %%(entry(L),{M=L});{M=[],(writeln("Warning - Entry in incorrect format.") +%%,abort +%%)}, !. + +entry([Word2,Word4]) --> + "[", word(Word), {string_codes(Word2,Word),string(Word2)}, + ",", + word(Word3), {string_codes(Word4,Word3),string(Word4)}, + "]". + +splitfurthert(BrDict01,N) :- + phrase(file0t(N),BrDict01). + +file0t(N) --> "[", filet(N), "]", !. +file0t([]) --> []. + +%%file([]) --> []. +filet([L|Ls]) --> entryt(L),",", +%%{writeln(L)}, %%*** +filet(Ls), !. %% file(Ls),{M=[Ls]})), !. %%, {writeln(["l",L])},",", file(Ls), {writeln(["ls",Ls])},!. %%, {append(L,Ls,M)}, !. +filet([L]) --> entryt(L), +%%{writeln(L)}, +!. %%(entry(L),{M=L});{M=[],(writeln("Warning - Entry in incorrect format.") +%%,abort +%%)}, !. 
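+
+%% Example (a sketch of the assumed brdict2-style entry format, not called elsewhere):
+%% splitfurthert parses dictionary text such as "[[box,30,30,30],[pen,14,1,1]]"
+%% (supplied as a code list) into [["box",30,30,30],["pen",14,1,1]] -
+%% each entryt yields a word string followed by its x, y and z numbers.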
+ +entryt([Word2,X3,Y3,Z3]) --> + "[", word(Word), {string_codes(Word2,Word),string(Word2)}, + ",", + digits(X),",",{atom_codes(X2,X),atom_number(X2,X3),number(X3)}, + digits(Y),",",{atom_codes(Y2,Y),atom_number(Y2,Y3),number(Y3)}, + digits(Z),{atom_codes(Z2,Z),atom_number(Z2,Z3),number(Z3)}, + "]". + +word([X|Xs]) --> [X], {char_type(X,csymf)->true;(X=27->true;X=8217)}, word(Xs), !. +%%word([X]) --> [X], {char_type(X,csymf);(X=27;X=8217)}, !. +word([]) --> []. + +digits([X|Xs]) --> [X], {(char_type(X,digit)->true;(string_codes(Word2,[X]),Word2="."))}, digits(Xs), !. +%%digits([X]) --> [X], {(char_type(X,digit);(string_codes(Word2,[X]),Word2="."))}, !. +digits([]) --> []. + +br([],B,B,C,C,_,D,D,_Room,RoomDict03,RoomDict03,_PartOfRoom,PartOfRoomDict03,PartOfRoomDict03,_Direction,DirectionDict03,DirectionDict03,_ObjectToPrepare,ObjectToPrepareDict03,ObjectToPrepareDict03,_ObjectToFinish,ObjectToFinishDict03,ObjectToFinishDict03) :- + !. +br([Word|Words],BrDict,BrDict2,BrDict4,BrDict5,Brth,BrthDict03,BrthDict04,Room,RoomDict03,RoomDict04,PartOfRoom,PartOfRoomDict03,PartOfRoomDict04,Direction,DirectionDict03,DirectionDict04,ObjectToPrepare,ObjectToPrepareDict03,ObjectToPrepareDict04,ObjectToFinish,ObjectToFinishDict03,ObjectToFinishDict04) :- + downcase_atom(Word, Word2), atom_string(Word2,Word3), + + words_to_read(WR1), + (WR1>0->(writeln(WR1),write(Word), + t2alg3(3),nl,sleep(0.12), + WR2 is WR1-1, + retractall(words_to_read(_)), + assertz(words_to_read(WR2))); + true), + + /**member([Word3,X,Y,Z],BrDict4) -> %% This feature is a bug because words in brdict2 shouldn't necessarily be the words in brdict1 + %%(append(BrDict,[[Word3,""]],BrDict3), BrDict3t=BrDict4, + %%br(Words,BrDict3,BrDict2,BrDict3t,BrDict5)) + %%; + %%(**/ + + %%(member([Word3,X,Y,Z],BrDict4) -> %% This feature is a bug because words in brdict1 should correspond to those in brdict2 + %%(atom_concat("The breasoning for ", Word3, P1), + %%atom_concat(P1, " is defined. 
Enter object name (without spaces), if different for ", Prompt)); + %Prompt="Enter object name (without spaces), if different for "), + + %%writeln([word3,Word3]), + + (member([Word3,String4],BrDict)-> + BrDict3=BrDict; + ((repeat, + write("Enter object name (without spaces), if different for "), writeln(Word3),read_string(user_input, "\n", "\r", _End2, String2),split_string(String2, "", " ", String3),String3=[String4]), + append(BrDict,[[Word3,String4]],BrDict3), + texttobr(1,u,String4,1))), + %%*brth(Word3,_Brth), + +(String4=""->String5=Word3;String5=String4), + + downcase_atom(String5, String52), atom_string(String52,String53), + + (member([String53,_],BrDict4)-> + BrDict3t1=BrDict4; + ((repeat, + write("Enter algorithm word for "), writeln(String53),read_string(user_input, "\n", "\r", _End, String),split_string(String, ",", " ", String6), + String6=[String7]),%Values=[X1,Y1,Z1],number_string(X,X1),number_string(Y,Y1),number_string(Z,Z1)), + append(BrDict4,[[String53,String7]],BrDict3t1))), + %%*brth(String53,_Brth2), + %%write("br(\'"),write(Word3),writeln("\',)."), +%% writeln([Word3,X,Y,Z]), + %%write(' '), + +/* (Brth=true,(member([String53,_Breathsoning],BrthDict03)-> + BrthDict3=BrthDict03; + ((repeat, + write("Enter human judgement (without spaces), if different for "), writeln(String53),read_string(user_input, "\n", "\r", _End2, Stringth2),split_string(Stringth2, "", " ", Stringth3),Stringth3=[Stringth4]),texttobr(1,u,Stringth4,1), + append(BrthDict03,[[String53,Stringth4]],BrthDict3)))->true;true), + + (Room=true,(member([String53,_Room],RoomDict03)-> + RoomDict3=RoomDict03; + ((repeat, + write("Enter room (without spaces), if different for "), writeln(String53),read_string(user_input, "\n", "\r", _RoomEnd2, RoomStringth2),split_string(RoomStringth2, "", " ", RoomStringth3),RoomStringth3=[RoomStringth4]),texttobr(1,u,RoomStringth4,1), + append(RoomDict03,[[String53,RoomStringth4]],RoomDict3)))->true;true), + + + 
(Room=true,(member([RoomStringth4,_X,_Y,_Z],BrDict3t1)-> + BrDict3t2=BrDict3t1; + ((repeat, + write("Enter x, y and z in cm for "), writeln(RoomStringth4),read_string(user_input, "\n", "\r", _End, RoomString),split_string(RoomString, ",", " ", RoomValues),RoomValues=[RoomX1,RoomY1,RoomZ1],number_string(RoomX,RoomX1),number_string(RoomY,RoomY1),number_string(RoomZ,RoomZ1)), + append(BrDict3t1,[[RoomStringth4,RoomX,RoomY,RoomZ]],BrDict3t2)))->true;BrDict3t2=BrDict3t1), + + +(PartOfRoom=true,(member([String53,_PartOfRoom],PartOfRoomDict03)-> + PartOfRoomDict3=PartOfRoomDict03; + ((repeat, + write("Enter part of room (without spaces), if different for "), writeln(String53),read_string(user_input, "\n", "\r", _PartOfRoomEnd2, PartOfRoomStringth2),split_string(PartOfRoomStringth2, "", " ", PartOfRoomStringth3),PartOfRoomStringth3=[PartOfRoomStringth4]),texttobr(1,u,PartOfRoomStringth4,1), + append(PartOfRoomDict03,[[String53,PartOfRoomStringth4]],PartOfRoomDict3)))->true;true), + + + (PartOfRoom=true,(member([PartOfRoomStringth4,_X,_Y,_Z],BrDict3t2)-> + BrDict3t3=BrDict3t2; + ((repeat, + write("Enter x, y and z in cm for "), writeln(PartOfRoomStringth4),read_string(user_input, "\n", "\r", _End, PartOfRoomString),split_string(PartOfRoomString, ",", " ", PartOfRoomValues),PartOfRoomValues=[PartOfRoomX1,PartOfRoomY1,PartOfRoomZ1],number_string(PartOfRoomX,PartOfRoomX1),number_string(PartOfRoomY,PartOfRoomY1),number_string(PartOfRoomZ,PartOfRoomZ1)), + append(BrDict3t2,[[PartOfRoomStringth4,PartOfRoomX,PartOfRoomY,PartOfRoomZ]],BrDict3t3)))->true;BrDict3t3=BrDict3t2), + +(Direction=true,(member([String53,_Direction],DirectionDict03)-> + DirectionDict3=DirectionDict03; + ((repeat, + write("Enter direction (without spaces), if different for "), writeln(String53),read_string(user_input, "\n", "\r", _DirectionEnd2, DirectionStringth2),split_string(DirectionStringth2, "", " ", DirectionStringth3),DirectionStringth3=[DirectionStringth4]),texttobr(1,u,DirectionStringth4,1), + 
append(DirectionDict03,[[String53,DirectionStringth4]],DirectionDict3)))->true;true), + + + (Direction=true,(member([DirectionStringth4,_X,_Y,_Z],BrDict3t3)-> + BrDict3t4=BrDict3t3; + ((repeat, + write("Enter x, y and z in cm for "), writeln(DirectionStringth4),read_string(user_input, "\n", "\r", _End, DirectionString),split_string(DirectionString, ",", " ", DirectionValues),DirectionValues=[DirectionX1,DirectionY1,DirectionZ1],number_string(DirectionX,DirectionX1),number_string(DirectionY,DirectionY1),number_string(DirectionZ,DirectionZ1)), + append(BrDict3t3,[[DirectionStringth4,DirectionX,DirectionY,DirectionZ]],BrDict3t4)))->true;BrDict3t4=BrDict3t3), + + +(ObjectToPrepare=true,(member([String53,_ObjectToPrepare],ObjectToPrepareDict03)-> + ObjectToPrepareDict3=ObjectToPrepareDict03; + ((repeat, + write("Enter object to prepare (without spaces), if different for "), writeln(String53),read_string(user_input, "\n", "\r", _ObjectToPrepareEnd2, ObjectToPrepareStringth2),split_string(ObjectToPrepareStringth2, "", " ", ObjectToPrepareStringth3),ObjectToPrepareStringth3=[ObjectToPrepareStringth4]),texttobr(1,u,ObjectToPrepareStringth4,1), + append(ObjectToPrepareDict03,[[String53,ObjectToPrepareStringth4]],ObjectToPrepareDict3)))->true;true), + + + (ObjectToPrepare=true,(member([ObjectToPrepareStringth4,_X,_Y,_Z],BrDict3t4)-> + BrDict3t5=BrDict3t4; + ((repeat, + write("Enter x, y and z in cm for "), writeln(ObjectToPrepareStringth4),read_string(user_input, "\n", "\r", _End, ObjectToPrepareString),split_string(ObjectToPrepareString, ",", " ", ObjectToPrepareValues),ObjectToPrepareValues=[ObjectToPrepareX1,ObjectToPrepareY1,ObjectToPrepareZ1],number_string(ObjectToPrepareX,ObjectToPrepareX1),number_string(ObjectToPrepareY,ObjectToPrepareY1),number_string(ObjectToPrepareZ,ObjectToPrepareZ1)), + append(BrDict3t4,[[ObjectToPrepareStringth4,ObjectToPrepareX,ObjectToPrepareY,ObjectToPrepareZ]],BrDict3t5)))->true;BrDict3t5=BrDict3t4), + + 
+(ObjectToFinish=true,(member([String53,_ObjectToFinish],ObjectToFinishDict03)-> + ObjectToFinishDict3=ObjectToFinishDict03; + ((repeat, + write("Enter object to finish (without spaces), if different for "), writeln(String53),read_string(user_input, "\n", "\r", _ObjectToFinishEnd2, ObjectToFinishStringth2),split_string(ObjectToFinishStringth2, "", " ", ObjectToFinishStringth3),ObjectToFinishStringth3=[ObjectToFinishStringth4]),texttobr(1,u,ObjectToFinishStringth4,1), + append(ObjectToFinishDict03,[[String53,ObjectToFinishStringth4]],ObjectToFinishDict3)))->true;true), + + + (ObjectToFinish=true,(member([ObjectToFinishStringth4,_X,_Y,_Z],BrDict3t5)-> + BrDict3t6=BrDict3t5; + ((repeat, + write("Enter x, y and z in cm for "), writeln(ObjectToFinishStringth4),read_string(user_input, "\n", "\r", _End, ObjectToFinishString),split_string(ObjectToFinishString, ",", " ", ObjectToFinishValues),ObjectToFinishValues=[ObjectToFinishX1,ObjectToFinishY1,ObjectToFinishZ1],number_string(ObjectToFinishX,ObjectToFinishX1),number_string(ObjectToFinishY,ObjectToFinishY1),number_string(ObjectToFinishZ,ObjectToFinishZ1)), + append(BrDict3t5,[[ObjectToFinishStringth4,ObjectToFinishX,ObjectToFinishY,ObjectToFinishZ]],BrDict3t6)))->true;BrDict3t6=BrDict3t5), + */ + + +br(Words,BrDict3,BrDict2,BrDict3t1,BrDict5,Brth,BrthDict3,BrthDict04,Room,RoomDict3,RoomDict04,PartOfRoom,PartOfRoomDict3,PartOfRoomDict04,Direction,DirectionDict3,DirectionDict04,ObjectToPrepare,ObjectToPrepareDict3,ObjectToPrepareDict04,ObjectToFinish,ObjectToFinishDict3,ObjectToFinishDict04). + %%). +brth(_,sweetinvincibleandprayedfor). 
+ +%% finds unknown words, asks for their br in form "n of m: word", verify, (can go back x) append and sort, save \ No newline at end of file diff --git a/texttoalg/texttoalg.pl b/texttoalg/texttoalg.pl new file mode 100644 index 0000000000000000000000000000000000000000..7b67e525fd7f4c8da5f7fbc6b52c5d68ec3fe933 --- /dev/null +++ b/texttoalg/texttoalg.pl @@ -0,0 +1,346 @@ +%% Load with ['../Text-to-Breasonings/text_to_breasonings.pl']. +%% and ['texttoalg']. +%% Run with texttoalg(u,u,u,u,true,true,true,true,true,true). +%% or texttoalg(u,u,u,u,true,false,false,false,false,false). +%% or texttoalg(8,u,u,2000,true,false,false,false,false,false). for PhD (16k words per algorithm) + +%% Important: See instructions for using texttobr.pl at https://lucianpedia.wikia.com/wiki/Instructions_for_Using_texttobr(2).pl . + +%% use_module(library(pio)). %% In la_strings +%% use_module(library(dcg/basics)). %% In la_strings + +%% texttobr2 - converts file stream to dimensions of objects represented by the words +%% has object name as separate field for new users of texttobr to verify breasonings by hand +%% brdict1.txt contains word and object name, brdict2.txt contains object name and x, y and z + +%% texttobr2(Runs,File,StringtoBreason,BreasoningLimit). +:- include('../Text-to-Breasonings/mergetexttobrdict.pl'). +:- include('../listprologinterpreter/la_strings'). +:- use_module(library(date)). 
+ +%% Brth is true or false +texttoalg(N1,Filex1,Stringx1,M1,Brth,Room,PartOfRoom,Direction,ObjectToPrepare,ObjectToFinish) :- + + ((number(N1),N=N1)->true; + (N1=u,N=1)), + + ((Filex1=u,Filex="file.txt")->true; + Filex=Filex1), + + ((number(M1),M=M1)->true; + M=all), %% If m1 is undefined or all then m=all + + prep(List1,BrDict03,BrDict03t,Filex,Stringx1,M), + term_to_atom(List1,List1A),atom_string(List1A,List1B),string_concat(List1B,"\n\n",List1C), + br2(List1,BrDict03,BrDict2,BrDict03t,BrDict03t2,1,List1C,Shell2,N,M,Brth,Room,PartOfRoom,Direction,ObjectToPrepare,ObjectToFinish), + sort(BrDict2,BrDict3), + (BrDict03=BrDict3->true; + (open_s("algdict1.txt",write,Stream), +%% string_codes(BrDict3), +term_to_atom(BrDict3,BrDict31), + write(Stream,BrDict31), + close(Stream))), + + sort(BrDict03t2,BrDict03t3), + (BrDict03t=BrDict03t3->true; + (open_s("algdict2.txt",write,Stream2), +%% string_codes(BrDict3), + term_to_atom(BrDict03t3,BrDict03t31), + write(Stream2,BrDict03t31), + close(Stream2))), + + get_time(TS),stamp_date_time(TS,date(Year,Month,Day,Hour1,Minute1,Seconda,_A,_TZ,_False),local), + concat_list("file",[Year,Month,Day,Hour1,Minute1,Seconda],File1), + concat_list(File1,[".txt"],File2), + + + (open_s(File2,write,Stream1), +%% string_codes(BrDict3), + write(Stream1,Shell2), + close(Stream1)), + + !. + + +replace0(Input,Find,Replace,SepandPad,M,Output0) :- + replace00(Input,Find,Replace,SepandPad,[],Output1), + truncate(Output1,M,Output0),!. +replace00([],_Find,_Replace,_SepandPad,Input,Input) :- !. +replace00(Input1,Find,Replace,SepandPad,Input2,Input3) :- + Input1=[Input4|Input5], + string_codes(Input4,String1), + replace1(String1,Find,Replace,[],Input7), + string_codes(Output,Input7), + split_string(Output,SepandPad,SepandPad,Input8), + append(Input2,Input8,Input9), + replace00(Input5,Find,Replace,SepandPad,Input9,Input3), !. +replace1([],_Find,_Replace,Input,Input) :- !. 
+replace1(Input1,Find,Replace,Input2,Input3) :- + Input1=[Input4|Input5], + member(Input4,Find), + append(Input2,[Replace],Input6), + replace1(Input5,Find,Replace,Input6,Input3), !. +replace1(Input1,Find,Replace,Input2,Input3) :- + Input1=[Input4|Input5], + not(member(Input4,Find)), + append(Input2,[Input4],Input6), + replace1(Input5,Find,Replace,Input6,Input3), !. + +split_string_onnonletter(String00,List1) :- + string_codes(String00,String1), + split_string_onnonletter(String1,[],List0), + string_codes(List0,List2), + split_string(List2," "," ",List1),!. +split_string_onnonletter([],Input,Input) :- !. +split_string_onnonletter(Input1,Input2,Input3) :- + Input1=[Input4|Input5], + not(char_type(Input4,alpha)), + append(Input2,[32],Input6), + split_string_onnonletter(Input5,Input6,Input3), !. +split_string_onnonletter(Input1,Input2,Input3) :- + Input1=[Input4|Input5], + char_type(Input4,alpha), + append(Input2,[Input4],Input6), + split_string_onnonletter(Input5,Input6,Input3), !. + +%% Truncates the list if m is not undefined and m is greater than or equal to the length of string0 +truncate(List1,M,String0) :- + ((number(M),length(String0,M), + append(String0,_,List1))->true; + String0=List1),!. 
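+
+%% Example (sketch): truncate/3 keeps the first M words when M is a number,
+%% and passes the list through unchanged otherwise (e.g. when M=all):
+%% ?- truncate(["a","b","c"],2,L).   %% L = ["a","b"]
+%% ?- truncate(["a","b","c"],all,L). %% L = ["a","b","c"]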
+ +prep(List,BrDict03,BrDict03t,Filex,Stringx1,M) :- + phrase_from_file_s(string(BrDict0), "algdict1.txt"), + %%Chars="’", + SepandPad="&#@~%`$?-+*^,()|.:;=_/[]<>{}\n\r\s\t\\\"!'0123456789", + %%split_string(BrDict0,SepandPad,SepandPad,BrDict01), +%%writeln([brDict0,BrDict0]), +%%writeln([brdict1]), +string_atom(BrDict0,Atom),atom_to_term(Atom,BrDict01,_), + %%splitfurther(BrDict0,BrDict01), +%%writeln([brDict01,BrDict01]), + %%char_code(Escape,27), + %%delete(BrDict01,[Escape,_,_,_,_],BrDict021), +%%writeln([brDict021,BrDict021]), + %%char_code(Apostrophe,8217), + %%delete(BrDict021,[Apostrophe,_,_,_,_],BrDict02), +%%writeln([brDict02,BrDict02]), + sort(BrDict01,BrDict03), +%%writeln([brDict03,BrDict03]), + length(BrDict03,Length0),write("Number of words in dictionary: "), writeln(Length0), + + %%writeln(''), + %%writeln([brdict2]), + phrase_from_file_s(string(BrDict0t), "algdict2.txt"), + %%Chars="’", + %%split_string(BrDict0,SepandPad,SepandPad,BrDict01), +%%writeln([brDict0,BrDict0]), +%% splitfurthert(BrDict0t,BrDict01t), +string_atom(BrDict0t,Atom2),atom_to_term(Atom2,BrDict01t,_), +%%writeln([brDict01,BrDict01]), + %%delete(BrDict01t,[Escape,_,_,_,_],BrDict021t), +%%writeln([brDict021,BrDict021]), + %%delete(BrDict021t,[Apostrophe,_,_,_,_],BrDict02t), +%%writeln([brDict02,BrDict02]), + sort(BrDict01t,BrDict03t), +%%writeln([brDict03,BrDict03]), + length(BrDict03t,Length0t),write("Number of unique algorithms in dictionary: "), writeln(Length0t), + + ((Stringx1=u, + phrase_from_file_s(string(String00), Filex))->true; + String00=Stringx1), + + split_string(String00,SepandPad,SepandPad,List1), + %%split_string_onnonletter(String00,List1), + + truncate(List1,M,List), + + /**replace0(String0,[8221, 8220], 34, SepandPad, M, String1), + replace0(String1,[8216, 8217], 39, SepandPad, M, String2), + replace0(String2,[8259, 8211, 8212], 45, SepandPad, M, String3), + replace0(String3,[160], 32, SepandPad, M, List), + **/ + 
+%%atom_codes(Atom999,String),writeln([atom999,Atom999]), + +%%writeln([list,List]), + %%delete(List,Escape,List11), +%%writeln([list11,List11]), + %%delete(List11,Apostrophe,List1), +%%writeln([list1,List1]), + length(List,Length1),write("Number of words to breason out algorithms for in file: "), writeln(Length1), + sort(List,List2), +%%writeln([list2,List2]), + length(List2,Length2),write("Number of unique words in file: "), writeln(Length2), + + + ((Stringx1=u, %% Use file, not string as input. + + %%maplist(downcase_atom, List2, List3), + maplist(string_lower, List2, List3), + +%%writeln([list3,List3]), + towords3(BrDict03,[],BrDict04,[],_ObjectNames,[],AllUsedNames), + towords2(BrDict03t,[],BrDict04t), + +%%writeln([brDict04,BrDict04]), + subtract(List3,BrDict04,D1), +%%writeln([list3,brDict04,d1,List3,BrDict04,D1]), +%%writeln(["subtract(BrDict04,List3,D1).",List3,BrDict04,D1]), + length(D1,Length01),Difference is abs(Length01),write("Number of words remaining to define: "), writeln(Difference), + + subtract(AllUsedNames,BrDict04t,D2), + %%delete(D21,'',D2), + length(D2,Length01t),Differencet is abs(Length01t),write("Number of undefined algorithms: "), writeln(Differencet), + %% writeln([undefinedalgorithms,D2]), %% Print undefined algorithms + + %%delete(D31,'',D3), + subtract(BrDict04t,AllUsedNames,D3), + length(D3,Length01t2),Differencet2 is abs(Length01t2),write("Number of orphaned algorithms: "), writeln(Differencet2) + + %%,writeln([orphanedalgorithms,D3]) %% Print orphaned algorithms + + + + +)->true;(string(Filex),writeln("Number of words, unique words, unique breathsonings, words remaining to define, undefined breasonings, orphaned breasonings, undefined breathsonings and orphaned breathsonings skipped for speed when breasoning out a string."))),!. + +br2(_,A,A,B,B,0,Shell,Shell,_N1,_M1,_Brth,_Room,_PartOfRoom,_Direction,_ObjectToPrepare,_ObjectToFinish) :- !. 
+br2(List1,BrDict03,BrDict2,BrDict03t,BrDict03t2,N1,Shell1,Shell2,N11,M1,Brth,Room,PartOfRoom,Direction,ObjectToPrepare,ObjectToFinish) :- + br(List1,BrDict03,BrDict21,BrDict03t,BrDict03t21,Shell1,Shell3,N11,M1,Brth,Room,PartOfRoom,Direction,ObjectToPrepare,ObjectToFinish), + N2 is N1-1, + br2(List1,BrDict21,BrDict2,BrDict03t21,BrDict03t2,N2,Shell3,Shell2,N11,M1,Brth,Room,PartOfRoom,Direction,ObjectToPrepare,ObjectToFinish),!. + +towords2([],A,A) :- !. +towords2(BrDict03,A,B) :- + BrDict03=[[Word,_,_,_]|Rest], + %%atom_string(Atom,Word), + append(A,[Word],C), + towords2(Rest,C,B). + +towords2a([],A,A) :- !. +towords2a(BrDict03,A,B) :- + BrDict03=[[Word,_]|Rest], + %%atom_string(Atom,Word), + append(A,[Word],C), + towords2a(Rest,C,B). + +towords3([],A,A,C,C,D,D) :- !. +towords3(BrDict03,A,B,D,E,G,H) :- + BrDict03=[[Word1,Word2]|Rest], + (Word2=""->append(G,[Word1],I)->true; + append(G,[Word2],I)), + append(A,[Word1],C), + append(D,[Word2],F), + towords3(Rest,C,B,F,E,I,H). + +string(String) --> list(String). + +list([]) --> []. +list([L|Ls]) --> [L], list(Ls). + +splitfurther(BrDict01,N) :- + phrase(file0(N),BrDict01). + +file0(N) --> "[", file(N), "]", !. +file0([]) --> []. + +%%file([]) --> []. +file([L|Ls]) --> entry(L),",", +%%{writeln(L)}, %%*** +file(Ls), !. %% file(Ls),{M=[Ls]})), !. %%, {writeln(["l",L])},",", file(Ls), {writeln(["ls",Ls])},!. %%, {append(L,Ls,M)}, !. +file([L]) --> entry(L), +%%{writeln(L)}, +!. %%(entry(L),{M=L});{M=[],(writeln("Warning - Entry in incorrect format.") +%%,abort +%%)}, !. + +entry([Word2,Word4]) --> + "[", word(Word), {string_codes(Word2,Word),string(Word2)}, + ",", + word(Word3), {string_codes(Word4,Word3),string(Word4)}, + "]". + +splitfurthert(BrDict01,N) :- + phrase(file0t(N),BrDict01). + +file0t(N) --> "[", filet(N), "]", !. +file0t([]) --> []. + +%%file([]) --> []. +filet([L|Ls]) --> entryt(L),",", +%%{writeln(L)}, %%*** +filet(Ls), !. %% file(Ls),{M=[Ls]})), !. 
%%, {writeln(["l",L])},",", file(Ls), {writeln(["ls",Ls])},!. %%, {append(L,Ls,M)}, !. +filet([L]) --> entryt(L), +%%{writeln(L)}, +!. %%(entry(L),{M=L});{M=[],(writeln("Warning - Entry in incorrect format.") +%%,abort +%%)}, !. + +entryt([Word2,X3,Y3,Z3]) --> + "[", word(Word), {string_codes(Word2,Word),string(Word2)}, + ",", + digits(X),",",{atom_codes(X2,X),atom_number(X2,X3),number(X3)}, + digits(Y),",",{atom_codes(Y2,Y),atom_number(Y2,Y3),number(Y3)}, + digits(Z),{atom_codes(Z2,Z),atom_number(Z2,Z3),number(Z3)}, + "]". + +word([X|Xs]) --> [X], {char_type(X,csymf)->true;(X=27->true;X=8217)}, word(Xs), !. +%%word([X]) --> [X], {char_type(X,csymf);(X=27;X=8217)}, !. +word([]) --> []. + +digits([X|Xs]) --> [X], {(char_type(X,digit)->true;(string_codes(Word2,[X]),Word2="."))}, digits(Xs), !. +%%digits([X]) --> [X], {(char_type(X,digit);(string_codes(Word2,[X]),Word2="."))}, !. +digits([]) --> []. + +br([],B,B,C,C,Shell,Shell,_N1,_M1,_Brth,_Room,_PartOfRoom,_Direction,_ObjectToPrepare,_ObjectToFinish) :- + !. +br([Word|Words],BrDict,BrDict2,BrDict4,BrDict5,Shell1,Shell2,N1,M1,Brth,Room,PartOfRoom,Direction,ObjectToPrepare,ObjectToFinish) :- + downcase_atom(Word, Word2), atom_string(Word2,Word3), + + /**member([Word3,X,Y,Z],BrDict4) -> %% This feature is a bug because words in brdict2 shouldn't necessarily be the words in brdict1 + %%(append(BrDict,[[Word3,""]],BrDict3), BrDict3t=BrDict4, + %%br(Words,BrDict3,BrDict2,BrDict3t,BrDict5)) + %%; + %%(**/ + + %%(member([Word3,X,Y,Z],BrDict4) -> %% This feature is a bug because words in brdict1 should correspond to those in brdict2 + %%(atom_concat("The breasoning for ", Word3, P1), + %%atom_concat(P1, " is defined. 
Enter object name (without spaces), if different for ", Prompt)); + %Prompt="Enter object name (without spaces), if different for "), + + %%writeln([word3,Word3]), + + (member([Word3,String4],BrDict)-> + BrDict3=BrDict; + ((repeat, + write("Enter algorithm name (without spaces), if different for "), writeln(Word3),read_string(user_input, "\n", "\r", _End2, String2),split_string(String2, "", " ", String3),String3=[String4]), + append(BrDict,[[Word3,String4]],BrDict3), + texttobr(1,u,String4,1))), + %%*brth(Word3,_Brth), + +(String4=""->String5=Word3;String5=String4), + + downcase_atom(String5, String52), atom_string(String52,String53), + + (member([String53,Query,Algorithm,Result],BrDict4)-> + BrDict3t1=BrDict4; + ((repeat, + write("Enter specification, algorithm and result for "), writeln(String53),read_string(user_input, "\n", "\r", _End, String),term_to_atom(Values,String),%% ",", " ", Values), + Values=(Query,Algorithm,Result),%%number_string(X,X1),number_string(Y,Y1),number_string(Z,Z1)), + append(BrDict4,[[String53,Query,Algorithm,Result]],BrDict3t1)))->true;true), + %%*brth(String53,_Brth2), + %%write("br(\'"),write(Word3),writeln("\',)."), +%% writeln([Word3,X,Y,Z]), + %%write(' '), + +term_to_atom(Query,Query1), +term_to_atom(Algorithm,Algorithm1), +term_to_atom(Result,Result1), +concat_list([Shell1,"swipl -G100g -T20g -L2g\n[listprolog].\nleash(-all),visible(+all),protocol(\"./file1.txt\"),trace,interpret(off,",Query1,",",Algorithm1,",",Result1,"),notrace,noprotocol.\nhalt.\nswipl -G100g -T20g -L2g\n['../Text-to-Breasonings/text_to_breasonings.pl'].\ntime((N = ",N1,",","M = ",M1,", texttobr2(N,\"file1.txt\",u,M,",Brth,",",Room,",",PartOfRoom,",",Direction,",",ObjectToPrepare,",",ObjectToFinish,"),texttobr(N,\"file1.txt\",u,M))).\nhalt.\n\n"],Shell3), + +br(Words,BrDict3,BrDict2,BrDict3t1,BrDict5,Shell3,Shell2,N1,M1,Brth,Room,PartOfRoom,Direction,ObjectToPrepare,ObjectToFinish). + %%). +brth(_,sweetinvincibleandprayedfor). 
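+
+%% Example (sketch): the specification, algorithm and result above are read as a
+%% single Prolog term, e.g. if the user types (q,a,r) then
+%% term_to_atom(Values,"(q,a,r)") gives Values=(q,a,r), which unifies with
+%% (Query,Algorithm,Result), binding Query=q, Algorithm=a and Result=r.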
+ +%% finds unknown words, asks for their br in form "n of m: word", verify, (can go back x) append and sort, save \ No newline at end of file diff --git a/texttoalg2/LICENSE b/texttoalg2/LICENSE new file mode 100644 index 0000000000000000000000000000000000000000..d925e8c4aa63a274f76a5c22de79d0ada99b0da7 --- /dev/null +++ b/texttoalg2/LICENSE @@ -0,0 +1,29 @@ +BSD 3-Clause License + +Copyright (c) 2019, Lucian Green +All rights reserved. + +Redistribution and use in source and binary forms, with or without +modification, are permitted provided that the following conditions are met: + +1. Redistributions of source code must retain the above copyright notice, this + list of conditions and the following disclaimer. + +2. Redistributions in binary form must reproduce the above copyright notice, + this list of conditions and the following disclaimer in the documentation + and/or other materials provided with the distribution. + +3. Neither the name of the copyright holder nor the names of its + contributors may be used to endorse or promote products derived from + this software without specific prior written permission. + +THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" +AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE +IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE +DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE +FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL +DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR +SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER +CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, +OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE +OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. 
diff --git a/texttoalg2/README.md b/texttoalg2/README.md new file mode 100644 index 0000000000000000000000000000000000000000..d75e59baa28e7457e065c9e2e185fb277e52ab0b --- /dev/null +++ b/texttoalg2/README.md @@ -0,0 +1,65 @@ +# texttoalg2 +Text to Algorithm 2 + +Text to Algorithm 2 is a SWI-Prolog program that extends Text to Algorithm (which prepares to breason out file.txt in the Text to Breasonings folder), finding algorithms only for the base forms of verbs and discovering philosophy and algorithms in the process. It produces an algorithm for each word, writing a pastable Shell script, file1a.txt, mainly used for postgraduate Pedagogy musings. Breasoning is thinking of the x, y and z dimensions of objects, necessary for meeting grades. + +To use this program, you need another window open to develop and test each specification, algorithm and result before pasting it into texttoalg. + +# Prerequisites + +* Please download and install SWI-Prolog for your machine at `https://www.swi-prolog.org/build/`. + +# 1. Install manually + +Download this repository, the List Prolog Interpreter Repository, the Text to Breasonings Repository and the Text to Algorithm Repository. + +# 2. Or Install from List Prolog Package Manager (LPPM) + +* Download the LPPM Repository: + +``` +mkdir GitHub +cd GitHub/ +git clone https://github.com/luciangreen/List-Prolog-Package-Manager.git +cd List-Prolog-Package-Manager +swipl +['lppm']. +lppm_install("luciangreen","texttoalg2"). +halt. +``` + +# Running + +* In Shell: +`cd texttoalg` (sic) +``` +swipl +['../Text-to-Breasonings/text_to_breasonings.pl']. +['../texttoalg2/find_and_convert_verbs_to_base.pl']. +``` + +* Enter the following to breason out verbs in Text-to-Breasonings/file.txt: +`find_and_convert_verbs_to_base.` +Note: the program only asks for algorithms for verbs (i.e. words that have the object "right" in brdict1.txt); these are stripped of the prefixes and suffixes listed in texttoalg2/prefixes.txt and texttoalg2/suffixes.txt respectively. 
+ +
+* Before running texttobr, think of two radio buttons put on recordings, put through with prayer, nut and bolt, quantum box prayer 1, 1, 0.5 cm and 1, 1, 0.5 cm. + +* Follow the instructions in https://github.com/luciangreen/Text-to-Breasonings/blob/master/Instructions_for_Using_texttobr(2).pl.txt when using texttoalg, texttobr, or texttobr2 to avoid medical problems. + +* Before breasoning, breason out algdict1.txt and algdict2.txt (to allow breasoning out multiple instances) by dragging them from Finder (Mac) to an empty BBEdit window for file.txt, then enter: +`cd ../Text-to-Breasonings` +``` +swipl +['text_to_breasonings.pl']. +texttobr2(u,u,u,u,true,false,false,false,false,false),texttobr(u,u,u,u). +``` +* Not recommended (due to idiosyncrasies of Shell; breasoning out the dictionaries in the previous step may have to suffice): copy and paste the contents of file1a.txt into a Terminal window (Mac), one to a few lines at a time, to breason out algorithms for all instances of words. + +# Authors + +Lucian Green - Initial programmer - Lucian Academy + +# License + +I licensed this project under the BSD 3-Clause License - see the LICENSE file for details diff --git a/texttoalg2/find_and_convert_verbs_to_base.pl b/texttoalg2/find_and_convert_verbs_to_base.pl new file mode 100644 index 0000000000000000000000000000000000000000..9c6e92b9b5e80c47b17bf68cc9be5de75c1cbcaf --- /dev/null +++ b/texttoalg2/find_and_convert_verbs_to_base.pl @@ -0,0 +1,35 @@ +:-include('../listprologinterpreter/la_strings'). +:-include('../texttoalg/texttoalg.pl'). + +find_and_convert_verbs_to_base :- + find_verbs(Text2a2), + convert_verbs_to_base(Text2a2,Word12a), + term_to_atom(Word12a,Word12), + texttoalg(u,u,Word12,u,true,false,false,false,false,false). 
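+
+%% Example (sketch, using the shipped prefix and suffix lists ["mis"] and
+%% ["d","ed","ing","s","ment"]): convert_verbs_to_base(["misjudged","walking"],L)
+%% strips at most one prefix and one suffix per word, giving L = ["judge","walk"].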
+ +find_verbs(Text2a2) :- +phrase_from_file_s(string(Text1), "../Text-to-Breasonings/file.txt"), +%% phrase_from_file_s(string(Text1), "file.txt"), + + +phrase_from_file_s(string(BrDict0), "../Text-to-Breasonings/brdict1.txt"), +%% phrase_from_file_s(string(BrDict0), "brdict1.txt"), + splitfurther(BrDict0,BrDict01), + sort(BrDict01,BrDict012), + + SepandPad="&#@~%`$?-+*^,()|.:;=_/[]<>{}\n\r\s\t\\\"!'0123456789", + split_string(Text1,SepandPad,SepandPad,Text2a), + + delete(Text2a,"",Text222), +findall(Text2a1,(member(Text2a1,Text222),member([Text2a1,"right"],BrDict012)),Text2a2). + +convert_verbs_to_base(Text1,Word12) :- + read_term1("../texttoalg2/prefixes.txt",Prefixes), + read_term1("../texttoalg2/suffixes.txt",Suffixes), + findall(Word1,(member(Word,Text1),once((member(Prefix,Prefixes),string_concat(Prefix,Word1,Word))->true;Word1=Word)),Word11), + findall(Word_a1,(member(Word_a,Word11),once((member(Suffix,Suffixes),string_concat(Word_a1,Suffix,Word_a))->true;Word_a1=Word_a)),Word12). + +read_term1(File,Prefixes3) :- + phrase_from_file_s(string(Prefixes1),File), + string_codes(Prefixes2,Prefixes1), + term_to_atom(Prefixes3,Prefixes2). \ No newline at end of file diff --git a/texttoalg2/prefixes.txt b/texttoalg2/prefixes.txt new file mode 100644 index 0000000000000000000000000000000000000000..d7d8778ff26c00681462d0d23672a98e11379e8b --- /dev/null +++ b/texttoalg2/prefixes.txt @@ -0,0 +1 @@ +["mis"] \ No newline at end of file diff --git a/texttoalg2/suffixes.txt b/texttoalg2/suffixes.txt new file mode 100644 index 0000000000000000000000000000000000000000..4b210054fd89cbb8dcf3f39fc2f6b08f8d6d65a0 --- /dev/null +++ b/texttoalg2/suffixes.txt @@ -0,0 +1 @@ +["d","ed","ing","s","ment"] \ No newline at end of file